Suppose we are given two increasing arrays $$$A$$$ and $$$B$$$, and we need to find $$$\max((a[j] - a[i-1]) - (b[j] - b[i-1]))$$$ where $$$j \ge i$$$.
Can we solve it in better than $$$O(n^2)$$$ time? If so, please give me a hint.
You are basically asking for $$$\max((a[j] - b[j]) - (a[i-1] - b[i-1]))$$$ where $$$j \ge i$$$ holds. Let $$$c_i = a_i - b_i$$$. Then the answer becomes $$$\max(c[j] - pref[j - 1])$$$, where $$$pref[i]$$$ is the minimum element of $$$c[0 \dots i]$$$. Since the prefix minimum can be maintained in a single scan, this can be solved in linear time.
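The prefix-minimum scan above can be sketched as follows (a minimal illustration; the function name and 0-based indexing are my own choices, assuming $$$i \ge 1$$$ so that $$$a[i-1]$$$ is well-defined):

```python
def max_interval_diff(a, b):
    """Return max over 1 <= i <= j of (a[j] - a[i-1]) - (b[j] - b[i-1]).

    With c[k] = a[k] - b[k], this equals max over j >= 1 of
    c[j] minus the minimum of c[0..j-1], computed in one linear pass.
    """
    c = [x - y for x, y in zip(a, b)]
    best = float("-inf")
    prefix_min = c[0]            # minimum of c[0..j-1] as j advances
    for j in range(1, len(c)):
        best = max(best, c[j] - prefix_min)
        prefix_min = min(prefix_min, c[j])
    return best
```

For example, with `a = [1, 3, 6, 10]` and `b = [2, 3, 5, 6]` we get `c = [-1, 0, 1, 4]`, and the answer is `c[3] - c[0] = 5`.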