The 2022 ICPC Seoul Regional Mirror Contest has ended (link), and since no one made a post for discussion, let's discuss below. How to solve A, B, G and H?
A: Rotate the coordinates by $$$\pi/4$$$, split the cells into even and odd classes, then run an $$$N^3 \cdot M^3$$$ DP to compute the Grundy value of each subrectangle.
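A minimal sketch of the $$$\pi/4$$$ rotation trick (the function name is mine, not from the solution):

```python
def rotate45(x, y):
    """Map (x, y) to (x + y, x - y). Diagonal neighbours in the original
    grid become axis-aligned neighbours in the rotated grid, and both new
    coordinates always share the parity of x + y, so the points split into
    two independent sublattices -- the even/odd split mentioned above."""
    return x + y, x - y
```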
H: DFS over the tree formed by the suffix links of the suffix automaton. Maintain the set of ending positions in a set. Naively compute the answer for each node (i.e. binary search + repeatedly doing lower_bound on endpos). No idea why the last step is fast. Possibly weak tests.
I think your algorithm runs in $$$O((n \log n)^{1.5})$$$ time, if the naive algorithm is written to run in $$$O(\min(|S|, (n/x) \log n))$$$ time. This is a generous estimate, so in practice it should be faster. Unless we exploit some specific structure of suffix trees, I don't think we can claim an $$$O(n^{1.5 - \epsilon})$$$ upper bound.
Hi, does anyone know how to submit for the problems in this contest after it is ended? Thank you!
It doesn't seem to be accepting submissions currently, but some regionals get uploaded on Baekjoon or CF gyms.
Problems are now available for solving here.
H: Build a suffix array and compute the LCP array. Iterate over each length from $$$n$$$ down to $$$1$$$, maintaining all possible substrings of that length using a DSU (we merge $$$a$$$ and $$$b$$$ if the LCP between them equals the length currently being checked). For each segment in the DSU we maintain a set of indices, merging with the small-to-large trick. A segment of size $$$k$$$ represents a $$$k$$$-substring, so now we only need to calculate $$$d$$$. To do that, we repeatedly do lower_bound on the set, starting from our current position + the length being checked. For length $$$x$$$ the sum of $$$d$$$ is (number of components) + $$$n/x$$$, so if we find a way to skip the segments in which $$$d = 1$$$, we get $$$O(n \log n)$$$ jumps in total, giving a final complexity of $$$O(n \log^2 n)$$$.
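The DSU-with-position-sets part of this can be sketched as follows (a hypothetical helper in Python, not the author's code; the class name is mine):

```python
class DSU:
    """Union-find where each root also carries the set of indices in its
    component. Sets are merged small-to-large, so every element moves
    O(log n) times and the total cost of all merges is O(n log n)."""

    def __init__(self, n):
        self.parent = list(range(n))
        self.members = [{i} for i in range(n)]

    def find(self, x):
        # Path halving keeps trees shallow.
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]
            x = self.parent[x]
        return x

    def union(self, a, b):
        a, b = self.find(a), self.find(b)
        if a == b:
            return a
        # Always pour the smaller set into the larger one.
        if len(self.members[a]) < len(self.members[b]):
            a, b = b, a
        self.parent[b] = a
        self.members[a] |= self.members[b]
        self.members[b] = set()
        return a
```

In the solution above, a sorted container (e.g. a C++ `std::set`) would replace the plain Python `set` so that the lower_bound jumps are possible.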
Edit: n/x was a lie
B: The paper. For case $$$t = 1$$$ we can afford $$$n^2$$$ computation so let's do it. We can identify two $$$LL$$$ corners which characterize the leftmost/rightmost end of the polygon. Using these corners we can split the polygon into its upper part and lower part. WLOG assume the length of the upper part is at least the length of the lower part. I claim that in the optimal solution:
With this assumption, you can run an $$$N^2$$$ DP. I actually didn't get AC, but the onsite team had the same solution.
For case $$$t = 2$$$ you just divide the polygon by four $$$LL$$$ corners and compute some value. I think there could be some corner cases.
G: The paper. I actually don't know the model solution, and the algorithm in the paper seems to be very complex. I think in the paper the author is doing some radial sweep, claiming that there are at most $$$O(N \sqrt K)$$$ events, and handling those events with fully-dynamic convex hulls, which results in something like $$$O(N \sqrt K \log^2 N)$$$? idk
For G, someone mentioned that this problem is similar and I'm guessing that it might be solved with the same ideas.
The official solution for the hardest few problems was published yesterday. The solution for G follows the same steps as the paper, but simplifies the details while keeping the solution fast enough. Most significantly, it avoids the use of dynamic convex hulls.
Using the point-line duality, this problem is equivalent to finding the shortest $$$y$$$-distance between the upper $$$i$$$-level and the lower $$$(k+1-i)$$$-level for some $$$i$$$. This requires computing the $$$(\leq k)$$$-level, which is what the paper does.
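For reference, one standard convention for the point-line duality used here (the paper may fix signs differently); it preserves both incidence and vertical distance:

```latex
p = (a, b) \;\longmapsto\; p^{*}\colon y = ax - b,
\qquad
\ell\colon y = mx + c \;\longmapsto\; \ell^{*} = (m, -c).
```

The vertical distance from $$$p$$$ to $$$\ell$$$ is $$$|ma + c - b|$$$, and the vertical distance from $$$\ell^{*}$$$ to $$$p^{*}$$$ is $$$|am - b - (-c)| = |ma + c - b|$$$, i.e. the same, so $$$y$$$-distances between levels in the dual translate directly back to the primal.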
We apply the duality again (i.e. come back to the same set of points), and first compute the convex layers. We're only interested in the first $$$k$$$ layers, so we simply compute the convex hull $$$k$$$ times in $$$O(nk \log n)$$$. Then we compute the $$$k$$$-level (not $$$(\leq k)$$$-level) by rotational sweeping. Every time we try to find the next point, we check all $$$O(k)$$$ candidates one by one (at most 2 per layer). Since the complexity of the $$$k$$$-level is $$$O(nk^{1/3})$$$, the sweep works in $$$O(nk^{4/3})$$$. Finally, we compute the $$$(\leq k)$$$-level using plane sweeping. Since the complexity of the $$$(\leq k)$$$-level is $$$O(nk)$$$ and we maintain $$$k$$$ lines every time, it works in $$$O(nk \log k)$$$. The total time complexity is thus $$$O(nk(\log n + k^{1/3}))$$$.
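The layer-peeling step above can be sketched as follows (a naive $$$O(nk \log n)$$$ version using Andrew's monotone chain; function names are mine):

```python
def cross(o, a, b):
    """Cross product of vectors OA and OB; > 0 means a left turn."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def convex_hull(pts):
    """Andrew's monotone chain, O(m log m); returns hull vertices in CCW
    order, with collinear points excluded."""
    pts = sorted(set(pts))
    if len(pts) <= 2:
        return pts
    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def convex_layers(pts, k):
    """Peel off the first k convex layers by repeatedly taking the hull
    and deleting its vertices -- O(n k log n) in total."""
    layers, pts = [], list(set(pts))
    for _ in range(k):
        if not pts:
            break
        hull = convex_hull(pts)
        layers.append(hull)
        on_hull = set(hull)
        pts = [p for p in pts if p not in on_hull]
    return layers
```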
How to solve L?
The problem reduces to finding two cycles of the same length. Cycle lengths lie between $$$3$$$ and $$$n$$$, and there are at least $$$n - 2$$$ back edges when you consider any normal spanning forest (any DFS forest in particular). If two of the cycles formed by a single back edge have the same length, we're done. Otherwise, all these cycles have distinct lengths, and since there are at least $$$n - 2$$$ of them, for every integer between $$$3$$$ and $$$n$$$ (both inclusive) there exists a cycle of that length with exactly one back edge. Consider the cycles of length $$$n$$$ and $$$n - 1$$$, and XOR them. Since there already exists a $$$3$$$-cycle with exactly one back edge, while this new cycle has exactly two back edges, we are done.
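The back-edge bookkeeping can be sketched as follows (a hypothetical helper, not the official code; in an undirected DFS forest every non-tree edge goes to an ancestor, so it closes exactly one cycle):

```python
import sys

def back_edge_cycle_lengths(n, edges):
    """DFS forest over an undirected multigraph given as a list of (u, v)
    pairs. Each back edge (u, v) with v an ancestor of u closes exactly one
    cycle of length depth[u] - depth[v] + 1; return all these lengths."""
    sys.setrecursionlimit(max(10_000, 2 * n + 100))
    adj = [[] for _ in range(n)]
    for i, (u, v) in enumerate(edges):
        adj[u].append((v, i))
        adj[v].append((u, i))
    depth = [-1] * n
    lengths = []

    def dfs(u, parent_edge):
        for v, i in adj[u]:
            if i == parent_edge:          # don't reuse the tree edge to the parent
                continue
            if depth[v] == -1:            # tree edge
                depth[v] = depth[u] + 1
                dfs(v, i)
            elif depth[v] < depth[u]:     # back edge, counted once from the deeper end
                lengths.append(depth[u] - depth[v] + 1)

    for r in range(n):
        if depth[r] == -1:
            depth[r] = 0
            dfs(r, -1)
    return lengths
```

Two cycles of the same length then show up as a repeated value in the returned list.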