I am currently stuck on this problem: https://mirror.codeforces.com/contest/296/problem/C Any help would be appreciated. PS: I couldn't understand the editorial
Hint: If you know how to handle q (<= 100000) range-increment queries on an array of size n (<= 100000) in O(n + q) total, then you can solve this problem by applying that technique twice. If you don't know the technique: https://www.youtube.com/watch?v=4AFWuZIbJ9g&ab_channel=HimanshuSingal
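For reference, here is a minimal sketch of that prerequisite technique (a difference array: record each update at its endpoints, then take a prefix sum once at the end). The function name and the 1-indexed `(l, r, d)` query format are my own choices for illustration:

```python
def range_increments(n, queries):
    """Apply q range-increment queries (l, r, d) to a zero array
    of size n in O(n + q) total using a difference array."""
    diff = [0] * (n + 1)
    for l, r, d in queries:       # 1-indexed inclusive range [l, r]
        diff[l - 1] += d          # start adding d at position l
        diff[r] -= d              # stop adding d after position r
    # Prefix-sum the difference array to materialize the final values.
    a, run = [], 0
    for i in range(n):
        run += diff[i]
        a.append(run)
    return a

print(range_increments(5, [(1, 3, 2), (2, 5, 1)]))  # → [2, 3, 3, 1, 1]
```

Each query costs O(1); the single prefix-sum pass at the end costs O(n).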
The idea for this problem is that you can merge repeated applications of the same operation into one. For example, applying the operation 1 5 3 five times is the same as applying 1 5 15 once. So first use a range-update technique over the list of operations to count how many times each operation is invoked by the queries, then use it a second time to apply the merged operations to the array. The total complexity is O(n + q) or O((n + q) log n), depending on which range-update structure you use (difference array, BIT, segment tree, or similar).
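The two-pass idea above can be sketched with two difference arrays, one over the operation list and one over the array itself. The function name and the small test instance below are made up for illustration; in the actual problem, operations are `(l, r, d)` and each query `(x, y)` applies operations x..y, all 1-indexed:

```python
def apply_operations(a, ops, queries):
    """CF 296C-style solution sketch: runs in O(n + m + k) total,
    where n = len(a), m = len(ops), k = len(queries)."""
    n, m = len(a), len(ops)

    # Pass 1: difference array over the operation list counts how
    # many times each operation is invoked across all queries.
    cnt = [0] * (m + 2)
    for x, y in queries:
        cnt[x] += 1
        cnt[y + 1] -= 1
    for i in range(1, m + 1):
        cnt[i] += cnt[i - 1]

    # Pass 2: difference array over the array itself; operation i is
    # merged into one update of d * cnt[i] on [l, r].
    diff = [0] * (n + 2)
    for i, (l, r, d) in enumerate(ops, start=1):
        diff[l] += d * cnt[i]
        diff[r + 1] -= d * cnt[i]
    add = 0
    out = list(a)
    for i in range(1, n + 1):
        add += diff[i]
        out[i - 1] += add
    return out

# Made-up small instance, not the sample from the problem statement:
print(apply_operations([1, 2, 3],
                       [(1, 2, 1), (1, 3, 2), (2, 3, 4)],
                       [(1, 2), (1, 3), (2, 3)]))  # → [9, 18, 17]
```

Note that the increments can reach roughly 10^5 * 10^5 * 10^5, so in C++ you would need 64-bit integers for both difference arrays; Python integers handle this automatically.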