J. Prompts
time limit per test
1 second
memory limit per test
256 megabytes
input
standard input
output
standard output

With the introduction of Large Language Models (LLMs), Alex began to actively use them for various purposes: to have an incomprehensible topic from class explained, to make a training plan, and even to write birthday greetings for his friends. To use an LLM, you need to write a lead-in called a prompt. The aim of the prompt is to explain to the model what must be done and what is expected as a response; thus, the whole question cannot consist of the prompt alone. A prompt usually looks like this:

You work as a proofreader for educational books. Rewrite the following text, correcting grammatical, punctuation, and semantic errors.

Alex's friends also began using LLMs and actively sharing their generation results. Alex decided to build a collection of successful prompts in order to use them more actively later. A prompt is considered successful if most generations with this prompt produce a successful result.

Alex has a set of questions that he and his friends have asked an LLM. He came up with the following algorithm to extract all the prompts from the set of questions:

  1. At the beginning, each question is divided into two non-empty sequences of words, called the prefix and the suffix of the string. If you write out all the words of the prefix and then of the suffix, separated by spaces, you get the original question. Only the prefixes are processed further.
  2. Next, two random prefixes $$$q_i$$$ and $$$q_j$$$ are taken. If these prefixes have a largest common non-empty prefix $$$q^{*}$$$ such that $$$q_i = q^{*} + q_i^{+}$$$ and $$$q_j = q^{*} + q_j^{+}$$$, then $$$q_i$$$ and $$$q_j$$$ are replaced by $$$q^{*}$$$ and are not processed further (by $$$q^{+}$$$ we mean a non-empty subsequence of words from $$$q$$$).
  3. Step 2 is repeated while some pair of prefixes can still be replaced by their largest common non-empty prefix.

At the end of this algorithm, we obtain the set of all prompts for the questions.
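The replacement in step 2 can be sketched as follows (a minimal illustration; the function name is ours, not part of the statement):

```python
def common_word_prefix(a, b):
    """Largest common word prefix of two questions (the q* of step 2)."""
    out = []
    for x, y in zip(a.split(), b.split()):
        if x != y:
            break
        out.append(x)
    return out

# Two prefixes sharing "Create text about my" collapse into it:
merged = common_word_prefix("Create text about my mom",
                            "Create text about my friend")
# merged == ["Create", "text", "about", "my"]
```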

Help Alex: find all the successful prompts from the questions Alex and his friends asked LLM.

Input

The first line of the input contains a single integer $$$N$$$ — the number of questions ($$$1 \le N \le 1000$$$).

The second line contains $$$N$$$ numbers; the $$$i$$$-th number is 0 if the generation result for the $$$i$$$-th question was unsuccessful and 1 if it was successful.

The following $$$N$$$ non-empty lines contain the questions — sequences of words separated by single spaces. A word is a sequence of Latin letters, possibly ending with a punctuation mark. The length of a line does not exceed 10000 characters, and the number of words in a line does not exceed 1000.

Output

Print all the successful prompts that can be obtained from the questions, each on a separate line, in any order.

Example
Input
7
0 1 1 1 1 0 1
Create text about my mom
Summarize Crime and Punishment
Create text about my friend
Create text about my boss
Summarize calculus book
Summarize Harry Potter
Create text about my dog
Output
Create text about my
Summarize
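One plausible reading of the statement, sufficient to reproduce the sample, can be sketched as follows. The grouping by first word, the strict-majority rule, and the assumption that unmerged (singleton) questions yield no prompt are interpretations made by this sketch, not guarantees of the statement:

```python
from itertools import groupby

def find_successful_prompts(questions, results):
    """Sketch: questions sharing their first word form one group; the
    group's prompt is the longest common word prefix, and the prompt is
    successful when a strict majority of the group's results are 1.
    Singletons are skipped, since step 2 of the algorithm needs a pair."""
    prompts = []
    # Sort indices so questions with the same first word become adjacent.
    order = sorted(range(len(questions)),
                   key=lambda i: questions[i].split()[0])
    for _, g in groupby(order, key=lambda i: questions[i].split()[0]):
        idx = list(g)
        if len(idx) < 2:
            continue  # assumed: an unmerged question produces no prompt
        words = [questions[i].split() for i in idx]
        lcp = []
        for column in zip(*words):          # word-by-word across the group
            if len(set(column)) != 1:
                break
            lcp.append(column[0])
        ones = sum(results[i] for i in idx)
        if 2 * ones > len(idx):             # strict majority successful
            prompts.append(" ".join(lcp))
    return prompts
```

On the sample above, the "Create text about my …" group has results 0, 1, 1, 1 and the "Summarize …" group has results 1, 1, 0, so both prompts are successful.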