Vladosiya's blog

By Vladosiya, history, 22 months ago, In Russian

1790A - Polycarp and the Day of Pi

Idea: MikeMirzayanov

Tutorial
Solution

1790B - Taisia and Dice

Idea: Gornak40

Tutorial
Solution

1790C - Premutation

Idea: MikeMirzayanov

Tutorial
Solution

1790D - Matryoshkas

Idea: MikeMirzayanov

Tutorial
Solution

1790E - Vlad and a Pair of Numbers

Idea: Vladosiya

Tutorial
Solution

1790F - Timofey and Black-White Tree

Idea: molney

Tutorial
Solution

1790G - Tokens on Graph

Idea: senjougaharin

Tutorial
Solution

Tutorial of Codeforces Round 847 (Div. 3)

»
22 months ago, # |

I think F is interesting; there are many ways to solve it.

»
22 months ago, # |

For F I would recommend tourist's solution.

  • »
    »
22 months ago, # ^ |

Is its TC $$$O(n \log n)$$$?

Edit: its TC is $$$O(n \log n)$$$ '.'

    • »
      »
      »
22 months ago, # ^ |

It actually is $$$O(n \log n)$$$. Without going into the details of the solution, it is easy to see from the code that its time complexity is $$$O(\text{sum of answers over all operations})$$$, i.e. the sum of all numbers that we are supposed to output.

Claim: after $$$k$$$ operations ($$$k > 0$$$), the minimum length of a path between black nodes is at most $$$2 \cdot (n / k)$$$.

Proof by contradiction. Suppose that the minimum path length is greater than $$$2 \cdot (n / k)$$$. Then for every black vertex $$$v$$$ there are at least $$$n / k$$$ nodes $$$u$$$ such that $$$dist(u, v) \le n / k$$$, and by the assumption all these $$$u$$$'s must be white, because $$$n / k < 2 \cdot (n / k)$$$. Let $$$cnt(v)$$$ denote the exact number of these white nodes for vertex $$$v$$$, and let $$$S$$$ be the sum of $$$cnt(v)$$$ over all black $$$v$$$. No white vertex is counted twice in $$$S$$$: if it were, it would be at distance at most $$$n / k$$$ from two different black nodes, so the distance between those two black nodes would be at most $$$2 \cdot (n / k)$$$, contradicting our original assumption. Thus $$$S$$$ counts each white vertex at most once, and is therefore bounded by $$$n - k$$$. But we also know that each of the $$$k$$$ black vertices contributes at least $$$n / k$$$ to $$$S$$$. Then

$$$S \le n - k \implies k \cdot (n / k) \le n - k \implies k \le 0$$$,

which is a contradiction. This finishes the proof.

Now the TC of the algorithm is the sum of $$$2 \cdot (n / k)$$$ over all $$$k$$$ from $$$1$$$ to $$$n$$$. But this is just a harmonic series, whose value is $$$O(n \log n)$$$.
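The harmonic bound in the last step can be sanity-checked numerically; here is a tiny script of my own (not part of any submission) that compares the sum against $$$2n(\ln n + 1)$$$:

```python
import math

def work_bound(n: int) -> int:
    # Sum of the per-operation bounds 2 * (n / k) for k = 1..n; by the
    # claim above, this sum dominates the total running time.
    return sum(2 * (n // k) for k in range(1, n + 1))

n = 100_000
# H(n) <= ln(n) + 1, so the sum is at most 2 * n * (ln(n) + 1) = O(n log n).
print(work_bound(n), 2 * n * (math.log(n) + 1))
```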

      • »
        »
        »
        »
22 months ago, # ^ |

        Finally understood the proof completely after trying to upsolve this problem for 1 day :)

      • »
        »
        »
        »
22 months ago, # ^ |

Another way to prove your claim is by thinking about the Euler tour of the tree.

Intuition: it is straightforward to calculate the maximum value of the minimum distance between any two elements after some $$$K$$$ operations in the case of a list instead of a tree. So let's try to represent the "tree like a list".

Your claim would have been trivial if we were given a list instead of a tree in the problem. You can "transform" the tree and represent it using the list of nodes visited during its Euler tour. The difference in position between any two elements in the list is always greater than or equal to the actual distance in the tree between the nodes they represent. In the example below, the difference in position between nodes 6 and 1 is 4, whereas the actual distance is 3.

[image: Euler tour of the example tree]

Now we have a list of $$$2N - 1$$$ elements, and we know that even if these elements were independent of each other (no node appeared in the list twice), the answer would be bounded by $$$\lceil\frac{2N - 1}{K}\rceil$$$.

Note: the bound works for the tree as well, because whatever the answer for our list is, the actual answer for the tree is less than or equal to it.

        • »
          »
          »
          »
          »
22 months ago, # ^ |

          This is indeed very simple, thanks a lot for the alternative approach!

»
22 months ago, # |

The round was great, thank you so much!

»
22 months ago, # |

When will the system tests take place?

»
22 months ago, # |

Is this contest unrated?

»
22 months ago, # |

All the problems are interesting! I like them. But a little difficult for a Div. 3 :)

»
22 months ago, # |

In task B, the element values are bounded between $$$1$$$ and $$$6$$$. So even if you set everything to $$$s-r$$$ and then subtract manually, the number of operations in the worst case is $$$5n-5$$$, which is still $$$O(n)$$$.
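This construction can be sketched as follows (my own code, written for illustration; the function name is hypothetical, and it assumes a valid input as guaranteed by the problem):

```python
def taisia_dice(n: int, s: int, r: int) -> list[int]:
    # res[0] is the maximum die that was removed; it must show s - r.
    # Start every die at s - r, then decrement dice 1..n-1 one at a time
    # until the total reaches s. Values stay in [1, 6], so at most
    # 5 * (n - 1) decrements happen: O(n) overall.
    top = s - r
    res = [top] * n
    total = top * n
    i = 1
    while total > s:
        if res[i] > 1:
            res[i] -= 1
            total -= 1
        else:
            i += 1          # this die already shows 1, move on
    return res

# e.g. taisia_dice(3, 12, 6) returns three dice summing to 12 with max 6
```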

  • »
    »
22 months ago, # ^ |

The vector res is sorted as well, so it should be O(n log n).

    • »
      »
      »
22 months ago, # ^ |

Nope, my solution takes an entirely different approach. I just start with $$$s-r$$$ at every index and loop through $$$[1,n)$$$ (0-indexed), subtracting $$$1$$$ each time until the total sum is $$$s$$$. The worst-case time complexity is $$$O(n)$$$ due to the limit on the elements' values.

      • »
        »
        »
        »
22 months ago, # ^ |

Okay, I thought you were talking about the solution given in the tutorial, whose complexity is stated as O(N*N). I went with a similar approach: print s-r until the remaining sum becomes less than s-r, then divide the leftover sum (if any) over the remaining number of elements to be printed. So the TC was O(N) in my approach.

    • »
      »
      »
22 months ago, # ^ |

      My solution for task B: 190928632 complexity: O(n)

»
22 months ago, # |

D is harder than E. E was quite simple.

»
22 months ago, # |

F is a standard problem for centroid decomposition: build the centroid tree and process two types of requests, adding a new black vertex and finding the closest black vertex.

When we insert a new black vertex, we walk over all of its centroid ancestors and update each with the distance from the new vertex to that centroid.

When we query the distance, we iterate over all centroid ancestors and relax the answer with min(ans, d(v, cent) + mn[cent]).

Working code: https://mirror.codeforces.com/contest/1790/submission/190980021
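Sketched below is what such a solution might look like. This is a minimal Python version written purely for illustration (class and method names are mine; the linked C++ submission is the real reference): `anc[v]` holds the centroid ancestors of `v` with their true tree distances, `best[c]` is the closest black vertex seen from centroid `c`.

```python
from collections import deque

INF = float("inf")

class CentroidTree:
    # Compact sketch of the centroid-decomposition approach described
    # above (hypothetical code, not the linked submission). 0-based
    # vertices; adj is the adjacency list of the tree.
    def __init__(self, n, adj):
        self.anc = [[] for _ in range(n)]   # (centroid ancestor, distance)
        self.best = [INF] * n               # min dist from centroid to a black vertex
        removed = [False] * n
        size = [0] * n

        def component(root):
            # Iterative DFS over the current component: parents + sizes.
            par = {root: -1}
            order, stack = [], [root]
            while stack:
                v = stack.pop()
                order.append(v)
                for u in adj[v]:
                    if u != par[v] and not removed[u]:
                        par[u] = v
                        stack.append(u)
            for v in reversed(order):       # subtree sizes, leaves first
                size[v] = 1 + sum(size[u] for u in adj[v] if par.get(u) == v)
            return par

        roots = [0]
        while roots:
            root = roots.pop()
            par = component(root)
            total = size[root]
            c = root                        # walk towards the heavy child
            while True:
                heavy = next((u for u in adj[c]
                              if par.get(u) == c and size[u] * 2 > total), -1)
                if heavy == -1:
                    break
                c = heavy
            # BFS from the centroid c: record its distance to every vertex
            # of the component; c becomes a centroid ancestor of them all.
            dist = {c: 0}
            queue = deque([c])
            while queue:
                v = queue.popleft()
                self.anc[v].append((c, dist[v]))
                for u in adj[v]:
                    if not removed[u] and u not in dist:
                        dist[u] = dist[v] + 1
                        queue.append(u)
            removed[c] = True
            roots.extend(u for u in adj[c] if not removed[u])

    def paint(self, v):                     # make v black
        for c, d in self.anc[v]:
            if d < self.best[c]:
                self.best[c] = d

    def query(self, v):                     # min distance from v to a black vertex
        return min(d + self.best[c] for c, d in self.anc[v])
```

Each vertex has $$$O(\log n)$$$ centroid ancestors, so `paint` and `query` both cost $$$O(\log n)$$$ per call (queries return infinity until something has been painted).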

»
22 months ago, # |

Can someone help me understand why my solution to problem F TLEs?

My idea is to go through each newly painted black node and DFS up to `answer` steps (`answer` is the minimum distance so far) to see whether I can find another black node closer than `answer`. If I find one, I update `answer`, so the next DFS can go less deep. However, this TLEs on test case 18. I thought the time complexity was O(n^1.5); please point out my mistake.

My submission: 190916688

Thank you for your help.

  • »
    »
11 months ago, # ^ |

Here is a counter-case that leads to O(n^2) for this method: a star graph with vertex 1 in the middle and all other nodes connected to 1. Color everything other than 1 black; then every time, we need to traverse all the edges connected to 1, which leads to O(n^2).

So we need to update dist[curr] to be the shortest distance to a black node, and stop if continuing cannot improve the answer. In the case above, we should stop at vertex 1 every time, because going on to 1's neighbors will not make things better.
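The fix described above might be sketched like this (my own hedged illustration, hypothetical names): `dist[u]` caches the best known distance from `u` to a black vertex, and the BFS expands a node only if it improves `dist` and can still beat the current answer.

```python
from collections import deque

def paint_and_query(adj, dist, ans, v):
    # Paint v black, return the new minimum distance between black vertices.
    # dist[u] = best known distance from u to a black vertex; the BFS is
    # pruned as soon as it can no longer improve dist or the answer.
    ans = min(ans, dist[v])
    dist[v] = 0
    queue = deque([(v, 0)])
    while queue:
        u, d = queue.popleft()
        for w in adj[u]:
            if d + 1 < dist[w] and d + 1 < ans:
                dist[w] = d + 1
                queue.append((w, d + 1))
    return ans
```

On the star counter-case, the second and later paints stop right at the center, since going further cannot improve any `dist` value.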

»
22 months ago, # |

Can someone explain tourist's solution for problem F in detail?

»
22 months ago, # |

To anyone who has understood tourist's solution, please correct me if I am wrong.

Whenever we turn a node N1 black, we consider its ancestors in hierarchical order and recompute their best distance as the minimum of their distance from N1 and their previous best distance, until one of three things happens: we reach the root of the tree; we encounter an ancestor whose best distance is already better than its distance from N1, and we break (because the remaining ancestors cannot possibly benefit from N1); or we reach an ancestor whose distance from N1 is greater than the current answer (it will never contribute to the actual answer, as the answer only decreases or stays the same).

I don't know about its TC, though.
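My reading of that scheme, as a hedged sketch (hypothetical names; this is not tourist's actual code): `best[u]` stores the smallest distance to a black vertex pushed up through `u` so far, and both pruning conditions from the description above are applied.

```python
INF = float("inf")

def make_solver(parent, n):
    # parent[v] = parent of v in the rooted tree, -1 for the root.
    best = [INF] * n
    state = {"ans": INF}

    def paint(v):
        # Walk up from v, relaxing ancestors; stop at the root, at an
        # ancestor already closer to some black vertex than to v, or once
        # the distance from v reaches the current answer.
        d, u = 0, v
        while u != -1 and d < best[u] and d < state["ans"]:
            state["ans"] = min(state["ans"], d + best[u])
            best[u] = d
            u = parent[u]
            d += 1
        if u != -1:
            # One last relaxation through the ancestor we stopped at.
            state["ans"] = min(state["ans"], d + best[u])
        return state["ans"]

    return paint
```

On a path 0-1-2-3-4 rooted at 0, painting 4, then 0, then 2, then 3 yields answers inf, 4, 2, 1, matching the true pairwise distances.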

»
22 months ago, # |

There's a different solution for problem C. link

»
22 months ago, # |

https://mirror.codeforces.com/contest/1790/submission/190946802 Can anybody tell me the time complexity of this solution for F, and how to optimize it? Let me describe my solution:

Initially, I found the minimum distance from every node to a black node and stored it in an array.

Then, for each query, I find the minimum distance between the two black nodes, which takes O(1); then I update the nodes' minimum distances accordingly using BFS, updating only while the new distance is smaller than the stored minimum.

»
22 months ago, # |

Please start the system testing. Why does it take so long, when other platforms publish the new ratings 15 minutes after the contest ends?

»
22 months ago, # |

E can be done with binary search too.

  • »
    »
22 months ago, # ^ |

Here is my implementation: 191020389

  • »
    »
14 months ago, # ^ |

For binary search, doesn't f(x) have to be monotonic? Sorry, I'm not understanding how BS won't get stuck somewhere. In this case $$$\frac{X}{2}$$$ is a solution for a, so BS will always try it, but how do you prove that without knowing the solution beforehand?

    • »
      »
      »
14 months ago, # ^ |

We have a fixed number of bits, i.e. those of x. We always assign a <= b such that a + b = 2*x. Now, if we increase a, then b decreases and the XOR of the two also decreases, which means the XOR value is monotonically decreasing with respect to a. Hence we apply binary search on the value of a with l = 1 and r = x.

      • »
        »
        »
        »
14 months ago, # ^ |

If x = 6, and a = 2 and b = 10, then a^b = 8. Now increase a by 1: a = 3, b = 9, and a^b = 10, so increasing a increased a^b. Now increase a by 2 more: 5^7 = 2. Therefore a^b is not monotonic, so it is not a candidate for binary search.
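The counterexample can be checked directly with a tiny script (mine, for illustration only):

```python
x = 6
for a in (2, 3, 5):
    b = 2 * x - a            # keep a + b = 2x fixed
    print(a, b, a ^ b)       # a^b goes 8, then 10, then 2: not monotonic in a
```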

        • »
          »
          »
          »
          »
14 months ago, # ^ |

I am applying binary search on a.

        • »
          »
          »
          »
          »
14 months ago, # ^ |

A pair a, b exists when x = y, where a is mid and y = mid ^ (2*x - mid). If x > y, then we have to increase the XOR, which can only be done by decreasing a: as I said earlier, we are looking for a solution with a <= b, and to increase the XOR of a and b, a should be decreased and b increased; vice versa for y < x.

»
22 months ago, # |

Can someone tell me in detail where the $$$O(n\sqrt{n})$$$ time complexity of F comes from?

»
22 months ago, # |

I was asked to give more details on my solution of 1790E - Vlad and a Pair of Numbers. I will also share it here.

Solution for problem E
»
22 months ago, # |

Is it possible to do F using a BFS? If I understand the tutorial correctly, we skip vertices v that have d[v] > ans. Similarly, we could do a BFS where the number of levels you visit does not exceed ans. Will this be as efficient as the DFS approach? Why or why not?

»
22 months ago, # |

Can someone please explain the solution of F more clearly? I didn't understand from the editorial why it works, nor its time complexity.

»
22 months ago, # |

I am really disappointed, because it's my first contest and I solved only two problems but got only one accepted. I think there is something wrong with problem B (Taisia and Dice), because the statement says (in any order / print any), and I did get it right. Like here: for 3 12 6 my answer was 6 3 3, since 3 + 3 = 6 and 6 + 6 = 12, and that is 3 dice. I am really disappointed, but hey! It's my first one. Thank you :) <3

»
22 months ago, # |

So, I was thinking about when to show my way of solving the problems, and I think it's best if I do it only when I can solve at least three of them. As always, don't focus on my code, because it has less value than what I was thinking during the contest.

A
B
C
D

Hope that my way of thinking can help you in some kind of way to your next contest.

»
22 months ago, # |

Alternate approach for D, divide and conquer: https://mirror.codeforces.com/contest/1790/submission/190924408

Like merge sort, but instead the merge function calculates the number of matryoshka sets that can be combined across the left and right intervals.

»
22 months ago, # |

Could someone tell me why using unordered_map in D TLEs while map does not? unordered_map code: 191060893, map code: 191061543

»
22 months ago, # |

I really don't understand the editorial of problem F! Can anyone explain its solution and the proof of its $$$O(n\sqrt{n})$$$ complexity?

»
22 months ago, # |

I wonder how the bound of $$$O(n \sqrt{n})$$$ with DFS/BFS is proven. Suppose that we only update nodes at distance smaller than $$$2$$$; even in that case, we can visit all $$$n$$$ nodes on certain graphs. Even if we know that $$$dist \le \sqrt{n}$$$, how can we bound the number of visited nodes?

  • »
    »
21 months ago, # ^ |

Exactly, I have the same question. Even if the max distance is √n, for every vertex we may have to visit up to √n levels of edges around it, and I don't know what that sums to overall (so how can you say it is of order n√n?). Also, I think DFS happens to work because it visits vertices in an arbitrary order and keeps shrinking ans, which in turn reduces the total number of iterations. I say this because I remember someone complaining that his code TLEd on test case 7 with BFS but got AC with DFS.

»
21 months ago, # |

With regard to my comment above, I would like to request a thorough check of the given solution to problem 1790F. I am fairly confident that the solution isn't correct (i.e. it may not run in $$$O(n \log n)$$$ or $$$O(n \sqrt{n})$$$ for all trees). Thanks a lot!

»
16 months ago, # |

Can someone explain the solution of E in more detail?

»
14 months ago, # |

I don't understand why this test case in G expects NO:

test

There is a path from 3 to 1 whose vertices all have a bonus, so we can reach 1 following the rules. Can someone explain? Also, sorry for necroposting.

»
11 months ago, # |

Problem C is the worst problem I've seen so far. It ruined my day, thanks.

»
10 months ago, # |

Can we solve F with square root decomposition?

»
10 months ago, # |

241357745

Why am I getting TLE? Can anyone please help me? Thanks in advance.

»
8 months ago, # |

Hi, can anyone please tell me why problem D gives TLE on test case 31 when sorting an array that contains long values?

»
5 months ago, # |

I have an O(1) solution for E instead of the O(log n) one in the editorial: just take int a = x / 2 and int b = x ^ a, and check if (a + b == x * 2); if so, print a and b, else print -1.
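A sketch of this check (my own code, hypothetical names). Note that a ^ b == x holds by construction, since b = x ^ a, so the sum check is the only feasibility test needed; it fails exactly when x is odd or x shares a bit with x // 2.

```python
def vlad_pair(x: int):
    # O(1) construction for E: a = x // 2, b = x ^ a.
    # Valid iff a >= 1 and a + b == 2 * x (i.e. the mean of a, b is x).
    a = x // 2
    b = x ^ a
    if a >= 1 and a + b == 2 * x:
        return (a, b)
    return -1
```

For example, x = 2 gives (1, 3) and x = 4 gives (2, 6), while x = 6 has no solution because 6 and 3 share a bit.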

»
4 months ago, # |

The model solution of F doesn't pass the tests in C++20, but still makes it in C++17. However, it only barely passes, which means either it is not a good solution or the constraints should be nerfed.