maroonrk's blog

By maroonrk, history, 19 months ago, In English

We will hold AtCoder Regular Contest 160.

The point values will be 400-500-500-700-800-900.

We are looking forward to your participation!

Problem Statement here.

Our website may become unstable due to DDoS attacks; in that case, you can read the statements from the link above (we will upload the PDF when the contest starts). We will make the contest unrated only if an attack causes significant inconvenience or unfairness.

  • +117

»
19 months ago, # |
  +10

Auto comment: topic has been updated by maroonrk (previous revision, new revision, compare).

»
19 months ago, # |
  +74

As a Writer, I hope you enjoy the contest and achieve good results. Good luck!

»
19 months ago, # |
  +13

C++ 20 support when?

»
19 months ago, # |
  +10

Did anyone run into the same situation? Here.

In problem D, I guess that if we apply operation two no more than $$$k-1$$$ times to each position, then we won't count the same array twice.

Why did it get WA on only a few test cases?

  • »
    »
    19 months ago, # ^ |
      +10

    Did you manage to calculate $$$(1+x+\dots+x^{k-1})^{n-k+1}$$$ fast? How?

    • »
      »
      »
      19 months ago, # ^ |
        -10

      NTT

      • »
        »
        »
        »
        19 months ago, # ^ |
        Rev. 3 | +10

        Isn't it $$$O(nk \cdot \log^2(nk))$$$ if done with small to large?

        I also tried to write it as $$$((x^k-1)/(x-1))^{n-k+1}$$$, but it's still $$$O(nk \cdot \log(nk))$$$.

    • »
      »
      »
      19 months ago, # ^ |
      Rev. 2 | 0

      Just as follows:

          for (int i = 0; i < k; i++) f[i] = 1;
          ntt(f, 1);
          for (int i = 0; i < sze; i++) f[i] = qpow(f[i], n - k + 1) % MOD;
          ntt(f, -1);
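      For reference, a minimal self-contained sketch of this approach (illustrative code of mine, not the commenter's submission; ntt, qpow and poly_power are made-up names), assuming the standard modulus 998244353 with primitive root 3:

          #include <bits/stdc++.h>
          using namespace std;

          const long long MOD = 998244353;   // MOD - 1 = 119 * 2^23

          long long qpow(long long b, long long e) {
              long long r = 1;
              for (b %= MOD; e; e >>= 1, b = b * b % MOD)
                  if (e & 1) r = r * b % MOD;
              return r;
          }

          // In-place iterative NTT; dir = +1 for forward, -1 for inverse.
          void ntt(vector<long long>& a, int dir) {
              int n = a.size();
              for (int i = 1, j = 0; i < n; i++) {              // bit-reversal permutation
                  int bit = n >> 1;
                  for (; j & bit; bit >>= 1) j ^= bit;
                  j ^= bit;
                  if (i < j) swap(a[i], a[j]);
              }
              for (int len = 2; len <= n; len <<= 1) {
                  long long w = qpow(3, (MOD - 1) / len);       // primitive len-th root of unity
                  if (dir < 0) w = qpow(w, MOD - 2);
                  for (int i = 0; i < n; i += len) {
                      long long wn = 1;
                      for (int j = 0; j < len / 2; j++, wn = wn * w % MOD) {
                          long long u = a[i + j], v = a[i + j + len / 2] * wn % MOD;
                          a[i + j] = (u + v) % MOD;
                          a[i + j + len / 2] = (u - v + MOD) % MOD;
                      }
                  }
              }
              if (dir < 0) {
                  long long inv_n = qpow(n, MOD - 2);
                  for (auto& x : a) x = x * inv_n % MOD;
              }
          }

          // Coefficients of (1 + x + ... + x^{k-1})^(n-k+1).
          vector<long long> poly_power(long long n, long long k) {
              long long deg = (k - 1) * (n - k + 1);            // degree of the result
              int sze = 1;
              while (sze <= deg) sze <<= 1;                     // must stay <= 2^23 for this modulus
              vector<long long> f(sze, 0);
              for (int i = 0; i < k; i++) f[i] = 1;
              ntt(f, 1);                                        // evaluate at the sze-th roots of unity
              for (auto& x : f) x = qpow(x, n - k + 1);         // pointwise exponentiation
              ntt(f, -1);                                       // interpolate back
              f.resize(deg + 1);
              return f;
          }

          int main() {
              // (1 + x + x^2)^3 = 1 + 3x + 6x^2 + 7x^3 + 6x^4 + 3x^5 + x^6
              for (long long c : poly_power(5, 3)) printf("%lld ", c);
              printf("\n");
          }

      The pointwise exponentiation is valid because the transform size already covers the full degree $$$(k-1)(n-k+1)$$$, so no wrap-around occurs.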

    • »
      »
      »
      19 months ago, # ^ |
        +8

      Brute force dp

      • »
        »
        »
        »
        19 months ago, # ^ |
          +28

        Ok, I'm more and more surprised :D

        I thought you couldn't do NTT if the final polynomial size is $$$> 10^6$$$, and now it turns out that even naive multiplication works?

        • »
          »
          »
          »
          »
          19 months ago, # ^ |
          Rev. 2 | +10

          Why? NTT doesn't have a huge constant factor (350 ms runtime for me), and for the standard modulus, $$$p-1$$$ is divisible by $$$2^{23}$$$, so final polynomials up to that size should be fine.

          For polynomials larger than that, the NTT size limit isn't the main concern anyway, since memory consumption starts to become a problem depending on the implementation.

          • »
            »
            »
            »
            »
            »
            19 months ago, # ^ |
            Rev. 2 | 0

            Ok, good to know.

            Actually I was a bit biased because I had tried to compute $$$1/(x-1)^{n-k+1}$$$ on my machine, and it ran in $$$> 10$$$ seconds. Is calculating the inverse much slower than doing a simple NTT?

  • »
    »
    19 months ago, # ^ |
      0

    Yup, exactly my solution; no idea why this gets WA.

»
19 months ago, # |
  +10

How to solve D? I used NTT to get the number of ways to perform $$$x$$$ operations of the second type such that no index has more than $$$K - 1$$$ operations applied to it, and I distributed the rest using stars and bars. This didn't pass the last 6 cases; is there some kind of corner case?

  • »
    »
    19 months ago, # ^ |
    Rev. 3 | +8

    There might be some division by zero (multiples of 998244353) when doing stars and bars if you compute the combinations by repeatedly dividing and multiplying by big numbers.
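    For illustration, a safe way to evaluate such a binomial coefficient when the upper argument is huge and the lower one is small is to build the numerator product and $$$k!$$$ separately and divide only once at the end. This is a minimal sketch with made-up names (binom_big, power_mod), not anyone's actual submission:

        #include <bits/stdc++.h>
        using namespace std;

        const long long MOD = 998244353;

        long long power_mod(long long b, long long e) {
            long long r = 1;
            for (b %= MOD; e; e >>= 1, b = b * b % MOD)
                if (e & 1) r = r * b % MOD;
            return r;
        }

        // C(n, k) mod 998244353 for huge n and small k (k < MOD).
        // No intermediate division happens, so numerator terms that are
        // multiples of the modulus simply make the result 0, as they should.
        long long binom_big(long long n, long long k) {
            if (k < 0 || k > n) return 0;
            long long num = 1, den = 1;
            for (long long i = 0; i < k; i++) {
                num = num * ((n - i) % MOD) % MOD;   // n * (n-1) * ... * (n-k+1)
                den = den * ((i + 1) % MOD) % MOD;   // k!
            }
            return num * power_mod(den, MOD - 2) % MOD;
        }

        int main() {
            printf("%lld\n", binom_big(998244353, 2));   // C(p, 2) is divisible by p, so this prints 0
        }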

»
19 months ago, # |
  -12

it's too hard

»
19 months ago, # |
  +40

How are there so few solutions to E? Super ez problem idea-wise, a bit more annoying implementation-wise but nothing crazy.

Obviously each leaf needs an extra edge touching it, so we can't have lower cost than the sum of leaves' weights.

Main idea: if there are $$$L$$$ leaves, we find the centroid of leaves — one non-leaf such that there are less than $$$L/2$$$ leaves in each of its subtrees, or an edge such that there are $$$L/2$$$ leaves in each subtree created by removing it. Now we can pair up the leaves. This obviously works: removing a vertex splits the tree into 2 subtrees connected by an extra edge, or 3 subtrees with 2 of them connected by an extra edge and the 3rd connected to one of them by an extra edge. (This is where degree $$$\le 3$$$ comes into play.)

The only catch is an odd number of leaves. The best hypothetical option is pairing one leaf $$$l$$$ with a minimum-weight vertex. If we can do that, the remaining leaves can be paired up in the same way. We only need to watch out for situations where the leaf $$$l$$$ is the only leaf in some subtree created by removing a vertex, since we only need to arbitrarily choose 1 leaf in each subtree for the argument from the previous paragraph to work. That means the removed vertex is part of the path $$$p-l$$$ inclusive, where $$$p$$$ is the lowest parent of $$$l$$$ with degree 3. The sum of lengths of these paths is $$$O(N)$$$, so we can run along them and check whether there's one that doesn't contain the chosen min-weight vertex.

It turns out that the only case where this strategy can fail is a star with the min-weight vertex in the center — we obviously can't choose that one. In that case, we just pair up a leaf with the second-min-weight vertex, and everything works the same as above.
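A minimal sketch of the leaf-centroid search this argument relies on (illustrative code with made-up names adj, leaf_cnt, leaf_centroid; not the actual solution): root the tree, count leaves in every subtree, and take the non-leaf vertex whose worst component has the fewest leaves.

    #include <bits/stdc++.h>
    using namespace std;

    int n, L;                       // number of vertices and number of leaves
    vector<vector<int>> adj;
    vector<int> par, leaf_cnt, deg;

    void dfs(int v, int p) {
        par[v] = p;
        leaf_cnt[v] = (deg[v] == 1);                   // a leaf counts itself
        for (int u : adj[v])
            if (u != p) { dfs(u, v); leaf_cnt[v] += leaf_cnt[u]; }
    }

    // Returns the non-leaf vertex minimizing the largest number of leaves in one
    // component of (tree - v).  If that largest number is < L/2, we are in the
    // "vertex" case above; if it equals L/2 (L even), the edge towards the
    // heaviest component splits the leaves into exactly L/2 and L/2.
    int leaf_centroid() {
        dfs(0, -1);
        int best = -1, best_max = INT_MAX;
        for (int v = 0; v < n; v++) {
            if (deg[v] == 1) continue;                 // we want a non-leaf
            int mx = L - leaf_cnt[v];                  // leaves outside v's subtree
            for (int u : adj[v])
                if (u != par[v]) mx = max(mx, leaf_cnt[u]);
            if (mx < best_max) { best_max = mx; best = v; }
        }
        return best;
    }

    int main() {
        // Tiny example: a star with center 0 and leaves 1, 2, 3, so L = 3.
        n = 4; L = 3;
        adj = {{1, 2, 3}, {0}, {0}, {0}};
        deg.resize(n);
        for (int v = 0; v < n; v++) deg[v] = adj[v].size();
        par.assign(n, -1); leaf_cnt.assign(n, 0);
        printf("%d\n", leaf_centroid());               // prints 0: each component has 1 < L/2 leaves
    }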

  • »
    »
    6 months ago, # ^ |
      0

    Sorry for necro-posting, but can I ask why you would think about the centroid in the first place? To me, the problem "match leaves to make a graph biconnected" and the idea of a centroid seem quite unrelated; can you tell me what the intuition behind this is?

    • »
      »
      »
      6 months ago, # ^ |
        +7

      There's a rather famous problem about a Hamiltonian cycle in the graph metric (imagine a clique where the length of each edge = the distance between its endpoints on a given labeled tree), where it turns out that moving between different subtrees of the centroid is optimal. Then the cost of an edge $$$(x,y)$$$ is $$$dep_x+dep_y$$$, which is the same formula, and pairing vertices instead of moving between them in a Hamiltonian cycle just halves the cost under this formula.

»
19 months ago, # |
  +18

I am confused about problem D. In the solution, why can we just consider sequences where option 2 is used fewer than $$$k$$$ times at the same position, and then consider adding some option 1 operations? If two situations both have option 2 used fewer than $$$k$$$ times at each position, but we can go from one to the other by applying option 1, is it possible that we count twice?

  • »
    »
    19 months ago, # ^ |
      +8

    In fact, that is impossible. Note that for two situations with different ways of applying option 2, there will exist at least one position $$$i$$$ such that $$$A_i\%k$$$ differs between the two situations, which means the two situations will not be counted twice regardless of how option 1 is assigned.

    Suppose that we assign option 2 from $$$i=1$$$ to $$$i=p$$$. If two situations have the same assignment for $$$1\le i\le p$$$ but a different assignment for $$$i=p+1$$$, then $$$A_{p+1}\%k$$$ must be different.

    If two situations have the same option 2 assignment at every position, they are different iff the option 1 assignments differ. So it is impossible to count anything twice.

»
19 months ago, # |
  +8

How to solve B? There's no editorial and also no English video on ARC160.

  • »
    »
    19 months ago, # ^ |
      +18

    Let $$$maxnum$$$ be the biggest element among $$$a$$$, $$$b$$$ and $$$c$$$.

    If $$$maxnum \le \sqrt{n}$$$, each of $$$a$$$, $$$b$$$ and $$$c$$$ can be any number from $$$1$$$ to $$$\lfloor\sqrt{n}\rfloor$$$. For this part, the number of triples is $$$\lfloor\sqrt{n}\rfloor^3$$$.

    If $$$\sqrt{n} < maxnum \le n$$$, exactly one of $$$a$$$, $$$b$$$ and $$$c$$$ is equal to $$$maxnum$$$, while the other two are less than or equal to $$$\lfloor \frac{n}{maxnum}\rfloor$$$. So for every $$$maxnum$$$ in this range, the number of triples is $$$3\times \lfloor \frac{n}{maxnum}\rfloor^2$$$.

    Note that there is a useful trick (often called 'sqrt decomposition') to enumerate all the distinct values of $$$\lfloor \frac{n}{maxnum}\rfloor$$$ for $$$maxnum > \sqrt{n}$$$ in $$$O(\sqrt{n})$$$ time. Using it, the total complexity is $$$O(T\sqrt{n})$$$.
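    A minimal sketch of this counting, assuming 64-bit arithmetic suffices for the answer (count_triples is a made-up name):

        #include <bits/stdc++.h>
        using namespace std;

        long long count_triples(long long n) {
            long long s = sqrtl((long double)n);
            while (s * s > n) s--;                       // s = floor(sqrt(n))
            while ((s + 1) * (s + 1) <= n) s++;
            long long ans = s * s * s;                   // case maxnum <= sqrt(n)
            // Case maxnum > sqrt(n): group the values of maxnum with equal floor(n / maxnum).
            for (long long m = s + 1; m <= n; ) {
                long long q = n / m;                     // floor(n / maxnum) on this block
                long long r = n / q;                     // last maxnum with the same quotient
                ans += 3 * q * q * (r - m + 1);          // 3 positions for maxnum, q^2 for the rest
                m = r + 1;
            }
            return ans;
        }

        int main() {
            int t;
            scanf("%d", &t);
            while (t--) {
                long long n;
                scanf("%lld", &n);
                printf("%lld\n", count_triples(n));
            }
        }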