You are given an array of strings A of size N. You need to pick two strings Ai and Aj such that len(Ai) * len(Aj) is maximized. Is there an optimal solution that runs in less than O(N*N)? Thanks in advance.
Could someone tell me the optimal approach?
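A minimal sketch, assuming the problem is exactly as stated (the two indices only need to be distinct, with no extra constraint such as the strings sharing no common letters): the product is maximized by the two longest strings, so a single O(N) scan that tracks the two largest lengths is enough. Input format below is illustrative.

```cpp
#include <bits/stdc++.h>
using namespace std;

int main() {
    int n;
    cin >> n;
    vector<string> a(n);
    for (auto &s : a) cin >> s;

    // Keep the two largest lengths seen so far (best1 >= best2);
    // their product is the maximum of len(Ai) * len(Aj) over all pairs.
    long long best1 = 0, best2 = 0;
    for (const auto &s : a) {
        long long len = (long long)s.size();
        if (len >= best1) { best2 = best1; best1 = len; }
        else if (len > best2) { best2 = len; }
    }
    cout << best1 * best2 << '\n';
}
```

If the intended problem has an additional restriction on which pairs are allowed, the answer changes, but for the statement as written an O(N) pass (or sorting by length in O(N log N)) already beats O(N*N).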