Bazoka13's blog

By Bazoka13, 23 months ago, In English

1975A - Bazoka and Mocha's Array

Idea: Bazoka13 Solution: Bazoka13 Prepared by: Bazoka13

Hint 1
Hint 2
Solution
Code

1975B - 378QAQ and Mocha's Array

Idea: Bazoka13 Solution: Bazoka13 Prepared by: Bazoka13

Hint 1
Hint 2
Solution
Code

1975C - Chamo and Mocha's Array

Idea: Bazoka13 Solution: Bazoka13 Prepared by: Bazoka13

Hint 1
Hint 2
Hint 3
Solution
Code

1975D - Paint the Tree

Idea: SanweiTreap Solution: Gellyfish Prepared by: Bazoka13

Hint 1
Hint 2
Solution
Code

1975E - Chain Queries

Idea: Bazoka13 Solution: Serval Prepared by: Bazoka13

Hint 1
Hint 2
Hint 3
Solution
Code

1975F - Set

Idea: Toxel Solution: errorgorn, Toxel Prepared by: Nerovix

Hint 1
Solution
Code

1975G - Zimpha Fan Club

Idea: Gellyfish, zimpha Solution: Gellyfish Prepared by: Gellyfish, errorgorn, MagicalFlower

Hint 1
Hint 2
Hint 3
Solution
Code

1975H - 378QAQ and Core

Idea: Bazoka13 Solution: Toxel Prepared by: Bazoka13

Solution
Bonus
Code

1975I - Mind Bloom

Idea: Gellyfish Solution: Gellyfish Prepared by: Gellyfish, errorgorn

Hint 0
Hint 1
Hint 2
Solution
Code

Solution by Melania

Solution
  • Vote: I like it
  • +111
  • Vote: I do not like it

»
23 months ago, hide # |
 
Vote: I like it +9 Vote: I do not like it

Any guesses on what game Jellyfish will be playing next?

»
23 months ago, hide # |
Rev. 2  
Vote: I like it +18 Vote: I do not like it

Fun facts for problem E: At first, this problem was Div. 2 D and required adding vertices dynamically.

»
23 months ago, hide # |
 
Vote: I like it -34 Vote: I do not like it

E was Nice.

»
23 months ago, hide # |
 
Vote: I like it 0 Vote: I do not like it

I had a different solution for C where I was able to infer hint 1 but could not deduce hint 2, so I instead ended up simulating the entire process, considering each element as a candidate (in descending order) and optimizing the $$$O(n^2)$$$ solution with a data structure (a set) to maintain the minimum distance between adjacent elements.

Overkill, I know, but just wanted to share my thought process during the contest.

  • »
    »
    23 months ago, hide # ^ |
     
    Vote: I like it 0 Vote: I do not like it

    In the contest, I almost implemented the following solution. In decreasing order of elements, let's check whether we have a subarray with median >= X. In the original array, let's replace all the numbers < X by -1 and the numbers >= X by 1. A subarray with median >= X exists iff there exists a subarray with length >= 2 and sum >= 1.

    If we maintain the following in each segment-tree node: the sum, the max prefix sum, the max suffix sum, and the max subarray sum,

    then after each -1 -> 1 update we can check whether there exists a subarray with length >= 2 and sum >= 1.

    262621530 262621561

    Btw, once you realize this idea, we don't need a segment tree; we can do a binary search on the median and check whether a subarray exists with sum >= 1 and length >= 2.
    262622439

    • »
      »
      »
      23 months ago, hide # ^ |
      Rev. 2  
      Vote: I like it 0 Vote: I do not like it

      I have done this in O(N) 262595462. I couldn't formally prove it, but my intuition was: if x <= y <= z, then their median is y. Now take any window of size K whose median is x: at least floor((K+1)/2) elements in that window are <= x. For odd K = 2M-1, at least M of the 2M-1 elements are <= x, so two elements <= x must have indices at most 2 apart. In other words, the size-K window contains a smaller window [a,b,c] in which two of a and b, or b and c, or a and c are <= x.

      • »
        »
        »
        »
        23 months ago, hide # ^ |
         
        Vote: I like it +3 Vote: I do not like it

        That's the solution in the editorial too.

        Look at it like this.

        Say we have an array of 11 elements whose median is 10. I will prove that there exist 3 adjacent elements in this array whose median is >= 10.

        Median = 10 implies that this array has at least 6 elements >= 10.

        Look at the first 2 elements.
        If both of them are >= 10, then we can take the first 3 elements, and the median of these first 3 elements will be >= 10.

        Otherwise, none or only one of them is >= 10. We can delete these two elements; among the remaining 9 elements we still have at least 5 elements >= 10.

        If we keep repeating this process, either we find a position where the first 2 elements are >= 10, in which case we are done, or at last we are left with 3 elements, among which 2 must be >= 10.

        Among 11 elements, at least 6 of them are >= 10.
        Among 9 elements, at least 5 of them are >= 10.
        Among 7 elements, at least 4 of them are >= 10.
        Among 5 elements, at least 3 of them are >= 10.
        Among 3 elements, at least 2 of them are >= 10. If we reach this subarray, we can use it directly as a subarray of size 3 that has at least 2 elements >= 10.

        The proof for the even case is similar.
        Similarly, if we have an array of 12 elements with median 10, there must exist at least 7 elements >= 10.
        We can use similar reasoning to say that there exist at least 2 adjacent elements >= 10.

        This is the reason why we only need to look at subarrays of length 2 and 3 to get the maximum possible median.
        Using a subarray of size 5 (a1,a2,a3,a4,a5) can only make the median worse compared with the size-3 subarrays inside it, namely (a1,a2,a3), (a2,a3,a4), and (a3,a4,a5).

»
23 months ago, hide # |
 
Vote: I like it +1 Vote: I do not like it

Fun facts for problem H: At first, this problem was Div. 2 F.(=・ω・=)

»
23 months ago, hide # |
 
Vote: I like it +33 Vote: I do not like it

Desired solution for E is much simpler than thinking about HLD .~.

  • »
    »
    23 months ago, hide # ^ |
     
    Vote: I like it 0 Vote: I do not like it

    At first I used DSU; after getting WA I thought I needed HLD and almost gave up, but somehow I figured out that DSU works too.

  • »
    »
    23 months ago, hide # ^ |
    Rev. 3  
    Vote: I like it +11 Vote: I do not like it

    At first glance, it's surely an HLD problem. However, I noticed that the graph is a tree and only the degrees of nodes influence the answer, so I wrote something very close to the editorial with a higher complexity of $$$O(n \sqrt{n})$$$, which ran rather quickly. Using a set to implement the editorial's idea gives a much simpler solution than the $$$O(n)$$$ one. And the way the solution figures out the root that has two sons amazed me. Anyway, it's a problem worth thinking about.

  • »
    »
    23 months ago, hide # ^ |
     
    Vote: I like it 0 Vote: I do not like it

    DuongNeverAlone can you explain your solution for E please?

    • »
      »
      »
      23 months ago, hide # ^ |
       
      Vote: I like it +3 Vote: I do not like it

      I also used HLD (and got TLE), but my approach was to keep the set of all minimal elements of the relation "a is an ancestor of b", i.e. for each path to the root I only keep the lowest vertex (max depth) in the set. When adding/removing a black vertex you have to know the next black vertex on the path to the root, which can be done using HLD and segment trees.

      The rest is similar to the editorial: you have a few conditions with which you can check whether the minimal vertices form a chain (there must be at most 2, the path between them (==> euler tour trick + segtree/Fenwick tree) must consist only of black vertices, ...)

      • »
        »
        »
        »
        23 months ago, hide # ^ |
         
        Vote: I like it 0 Vote: I do not like it

        The constraints were not generous. I do not know HLD very well either, so I did not go for that. I was trying something similar to the editorial's approach but quite different in implementation.

        Sorry, I am not very good at tree queries, but I get your idea a bit. By "euler tour trick + segtrees/fenwick tree" do you mean the approach where you flatten the tree and build a segtree/Fenwick tree over the flattened tree?

        • »
          »
          »
          »
          »
          23 months ago, hide # ^ |
           
          Vote: I like it +3 Vote: I do not like it

          With HLD you can answer path sum queries in $$$O(\log^2(n))$$$, and the constants (at least in my implementation) aren't that good. I couldn't come up with another approach though, so I gave it a try.

          I used the euler tour trick to count the number of black vertices on the path between two "minimal" vertices. I could've also used the HLD, but that would be another $$$O(\log^2(n))$$$ factor, so I switched to ETT after the first TLE. You basically have each vertex twice in the flattened tree, once at "dfs_innum" and once at "dfs_outnum", with a "+1" at "innum" and a "-1" at "outnum". Then a path sum from the root to "u" is the sum over [0; dfsin[u]], and to answer such queries you can use a Fenwick tree/segment tree.

          The total complexity is still $$$O(n \log^2(n))$$$.

          • »
            »
            »
            »
            »
            »
            23 months ago, hide # ^ |
             
            Vote: I like it +3 Vote: I do not like it

            Thank you, I understood this part. I will try implementing it on my own. Thanks a lot!

    • »
      »
      »
      23 months ago, hide # ^ |
       
      Vote: I like it 0 Vote: I do not like it

      Do you mean HLD approach?

      • »
        »
        »
        »
        23 months ago, hide # ^ |
         
        Vote: I like it 0 Vote: I do not like it

        No, the one that you submitted here 262667007.

        • »
          »
          »
          »
          »
          23 months ago, hide # ^ |
          Rev. 9  
          Vote: I like it +3 Vote: I do not like it

          Oh, I just implement the approach of the editorial. Looking at the 4 "if" statements at the end of the while loop, you will see the four cases, exactly as the editorial says:

          If the black vertices form a chain:
          - there is no black vertex that has three or more black child vertices, and there is at most one black vertex that has two black child vertices.
          - there is at most one black vertex whose parent vertex is white.
          - if there is one black vertex that has two black child vertices, its parent vertex must be white.

          If you need details of the variables I used: $$$cnt1$$$ and $$$cnt3$$$ are the numbers of black nodes that have $$$1$$$ and $$$3$$$ (or more) black children respectively. $$$st$$$ is the set of black vertices that have $$$2$$$ black children. white_cnt is the number of black vertices that have a white parent. I use a $$$cnt$$$ array to keep track of the number of black children of each vertex $$$i$$$, denoted $$$cnt_i$$$. For each query, I update all those variables before checking the $$$4$$$ conditions mentioned above.

          if(cnt3){
          	cout << "NO" << endl;
          	continue;
          }
          

          This ensures there's no black vertex having 3 or more black children.

          if(st.size() > 1){
          	cout << "NO" << endl;
          	continue;
          }
          

          This ensures there is at most 1 black vertex with 2 black children.

          if(white_cnt != 1){
          	cout << "NO" << endl;
          	continue;
          }
          

          Exactly 1 black vertex has a white parent.

          if(st.size()){
          	if(a[par[*st.begin()]] == 1){
          		cout << "NO" << endl;
          		continue;
          	}
          }
          

          If there exists a black vertex with 2 black children, then its parent must be white.

          PS: I just realized that $$$cnt1$$$ is pretty useless here, never mind it :v.

          • »
            »
            »
            »
            »
            »
            23 months ago, hide # ^ |
            Rev. 3  
            Vote: I like it 0 Vote: I do not like it

            Thank you for the explanation. I had some doubts:

            Exactly 1 black vertex has a white parent.

            How do you handle the case where the root is part of the chain? In that case there won't be any white parent. I put a dummy white node above the root, but I am not sure if that is the best way to go about it.

            Also, could you please explain how the parent information is updated per query?

            if(a[v] == 1){
            	if(cnt[v] == 1) --cnt1;
            	else if(cnt[v] == 2) st.erase(v);
            	else if(cnt[v] >= 3) --cnt3;
            }else{
            	white_cnt -= cnt[v];
            }
            

            I did read the editorial, but the implementation there is going over my head.

            • »
              »
              »
              »
              »
              »
              »
              23 months ago, hide # ^ |
              Rev. 5  
              Vote: I like it +5 Vote: I do not like it

              I've already handled the case when the root node is part of the chain. I use the dummy node $$$0$$$ as a white node and connect it to node $$$1$$$ (the root of the tree). Focus on the part before the DFS, and that's it:

              par[1] = 0;
              cnt[0] = a[1];
              

              For the updating part, I recommend doing it yourself for better understanding, as different people have their own coding styles. Still, I'll explain mine. It is similar to updating the sum of an array. Let's say I have an array $$$a$$$ with sum $$$s$$$. If I need to set $$$a_i = x$$$, then $$$s$$$ is updated like:

              s -= a[i];
              a[i] = x;
              s += a[i];
              

              which is pretty much like my implementation: remove the previous state, then add the new state to the variable.

              PS: If you have any more questions, just DM me, as the comment section is getting pretty crowded right now.

              • »
                »
                »
                »
                »
                »
                »
                »
                23 months ago, hide # ^ |
                 
                Vote: I like it 0 Vote: I do not like it

                Thank you, I will try thinking and coding it again myself. Thanks a lot for all the explanations, really kind of you.

  • »
    »
    23 months ago, hide # ^ |
     
    Vote: I like it +5 Vote: I do not like it

    And it is much simpler than thinking about the number of black chains of three vertices and debugging the counter's update function lol 262610439

»
23 months ago, hide # |
 
Vote: I like it +1 Vote: I do not like it

I liked this contest very much, the best one I did this year I guess. F then E are my favourite problems in the contest.

»
23 months ago, hide # |
Rev. 2  
Vote: I like it +5 Vote: I do not like it

Problem A,B,C Video Editorial : YOUTUBE VIDEO LINK --Click Here

»
23 months ago, hide # |
Rev. 2  
Vote: I like it +29 Vote: I do not like it

Problem G has a linear solution. Or maybe it doesn't.

»
23 months ago, hide # |
 
Vote: I like it 0 Vote: I do not like it

Nice problems. E is very interesting, and the tutorial solution is way easier than mine. Any source to study how to solve problems like F? It took me much time to understand it, and I didn't even get close to a solution.

  • »
    »
    23 months ago, hide # ^ |
    Rev. 2  
    Vote: I like it 0 Vote: I do not like it

    What does xor_two represent in the tutorial solution? Could you please explain the editorial code of problem E?

    • »
      »
      »
      23 months ago, hide # ^ |
       
      Vote: I like it 0 Vote: I do not like it

      There is a case for the chain, when:

      there is 1 vertex with 2 children,

      there are 2 vertices with 0 children,

      and the rest must have 1 child.

      But you also have to check that the vertex having 2 children has a white parent.

      So if you maintain the xor of all vertices having 2 children, then when their count is 1, the xor is exactly the vertex with 2 children, so you can check whether it has a white parent or not.

»
23 months ago, hide # |
 
Vote: I like it +7 Vote: I do not like it

problem D: idk why it works but it does
op1 = 2 * (n - 1) - max_dph_a
op2 = 2 * (n - 1) - max_dph_b + dist_bwn_a_b % 2

ans = min(op1, op2) + dist_bwn_a_b

»
23 months ago, hide # |
 
Vote: I like it +1 Vote: I do not like it

1975C — Chamo and Mocha's Array

Detailed Video Editorial

English Language => https://www.youtube.com/watch?v=J-xsU37pJoI

Hindi Language => https://www.youtube.com/watch?v=g6720vEw8r4

»
23 months ago, hide # |
 
Vote: I like it +8 Vote: I do not like it

Can you explain the editorial of F more clearly? Firstly, what does the array $$$f[n][s]$$$ mean, and why such transitions on it?

  • »
    »
    23 months ago, hide # ^ |
    Rev. 2  
    Vote: I like it +8 Vote: I do not like it

    I don't understand the editorial of F either. It would be nice if someone explained it in more detail.

  • »
    »
    23 months ago, hide # ^ |
     
    Vote: I like it +8 Vote: I do not like it

    It's a sort of SOS dp. Each set bit b of f[n][s] means that, after considering all masks of the first n bits, you have b 1's to spare in the remaining bits.

    Basically when you use a bit 1 as 1, then you consume 1 bit which is the reason why there's the shift right. When you use 1 as 0 or 0 as anything then no bit is consumed.

  • »
    »
    23 months ago, hide # ^ |
     
    Vote: I like it 0 Vote: I do not like it

    I guess they are in fact enumerating $$$i$$$ from $$$n-1$$$ down to $$$0$$$ in the sample implementation.

  • »
    »
    23 months ago, hide # ^ |
    Rev. 4  
    Vote: I like it +8 Vote: I do not like it

    The crux of the idea is:

    We are adding numbers from n-1 down to 0.

    N = 6
    Our current set is ***101, i.e. we have added 3 and 5 to the set, and we are yet to decide for 2, 1, 0.

    Let's say we have:
    1. f(100101) allows set sizes 0,1,3,4
    2. f(100001) allows set sizes 0,3,5

    The idea is that we can merge all f(100***) into just one constraint.

    The intersection of 100101 and ***101 is a set of size 2. So among the remaining bits we are allowed to include only -2,-1,1,2 numbers. Removing the negatives, we must include 1 or 2 numbers.

    The intersection of 100001 and ***101 is a set of size 1. So among the remaining bits we are allowed to include only -1,2,4 numbers. Removing the negative, we must include 2 or 4 numbers.

    In fact we can merge these two constraints and say we must include exactly 2 numbers among 0,1,2.

    This way, when we have chosen X numbers so far, the remaining numbers give $$$2^{N-X}$$$ distinct combinations, so we can reduce the original $$$2^N$$$ constraints to $$$2^{N-X}$$$ distinct constraints.

    Basically, after deciding whether to include n-1 in our set, we can reduce the original $$$2^N$$$ constraints to $$$2^{N-1}$$$ constraints, by merging $$$s_0s_1s_2s_3s_40$$$ with $$$s_0s_1s_2s_3s_41$$$:
    If we include $$$N-1$$$, we subtract 1 from all the allowed sizes in $$$s_0s_1s_2s_3s_41$$$ and then merge it with $$$s_0s_1s_2s_3s_40$$$.
    If we do not include $$$N-1$$$, we directly merge $$$s_0s_1s_2s_3s_41$$$ with $$$s_0s_1s_2s_3s_40$$$.

    Now, if you read the editorial and the model solution again, it should make sense what they are doing.

»
23 months ago, hide # |
 
Vote: I like it 0 Vote: I do not like it

Can someone explain the binary search approach for problem C?

  • »
    »
    23 months ago, hide # ^ |
     
    Vote: I like it 0 Vote: I do not like it

    To determine whether the answer is at least x, you check whether there are two indices i, j (i != j) whose difference is smaller than 3 such that a[i] >= x and a[j] >= x. You can see my submission for my implementation.

  • »
    »
    23 months ago, hide # ^ |
     
    Vote: I like it 0 Vote: I do not like it

    The idea is similar to what is explained in the GFG link below.

    LINK: https://www.geeksforgeeks.org/find-largest-median-of-a-sub-array-with-length-at-least-k/

    IDEA: apply binary search on the set of values given.

    CHECK(_): "check(mid)" validates whether there exists a subarray of length >= 2 having median >= mid.

    LOGIC for CHECK(_): build an auxiliary array,
    pre[i] = 1 if a[i] >= mid, otherwise -1. Then find the maximum subarray sum; if it's positive then the median is possible, otherwise not.

    SUBMISSION LINK : https://mirror.codeforces.com/contest/1975/submission/262866407

    • »
      »
      »
      23 months ago, hide # ^ |
       
      Vote: I like it 0 Vote: I do not like it

      if it's positive then the median is possible, otherwise not

      How do you make sure this always works?

      Is there any proof or logic behind it?

      • »
        »
        »
        »
        23 months ago, hide # ^ |
         
        Vote: I like it 0 Vote: I do not like it

        -> A subarray of a′ (the transformed array) having a non-negative sum means that the number of elements ≥ mid is at least the number of elements < mid in that subarray, so if we sort the subarray, its median comes out to be ≥ mid.

        CONDITION: -> For a subarray to have mid as its median, at least half (or more) of its elements must be ≥ mid.

        -> Hence, if a subarray of a′ has a sum ≥ 0, then the subarray has at least as many 1's as −1's, implying mid can be a median.

        I hope this makes sense.

        • »
          »
          »
          »
          »
          23 months ago, hide # ^ |
           
          Vote: I like it 0 Vote: I do not like it

          Yes, indeed it makes sense. I tried to implement the same idea but couldn't figure out how.

          Thank you!

»
23 months ago, hide # |
 
Vote: I like it +1 Vote: I do not like it

Too unfortunate, I didn't realize that the minimum number of steps needed to visit every vertex of a tree from a given vertex is a classical problem: taking the furthest distance d, the answer is 2 * (n - 1) - d. I tried to derive that from scratch but couldn't make it...😭

»
23 months ago, hide # |
 
Vote: I like it +83 Vote: I do not like it

I was discussing with adamant about some different approaches to do the Wildcard Matching in problem G. I had some ideas to hack him, because his solution was randomized and had a pretty bad probability of failure. The basic idea is the following case:

*a*a*a*a*
b-b-b-

Here, whenever characters a and b accidentally match, the algorithm fails, because afterwards it can match all the a's with the wildcards. So with this you can build testcases that have failure probability $$$1 - (1 - p)^{10^6}$$$, where $$$p$$$ is the probability of failure at one position. Unfortunately, people do not always reset their randomness every time they do a wildcard matching, so this case alone is pretty useless. But by putting multiple different kinds of strings in the places of a and b, for example constructing similar testcases with words of $$$2$$$ and $$$3$$$ characters, you can also hack fishy solutions which don't reset their randomness.

This hack hacked some solutions in Polygon, so it gave an Unexpected Verdict :(.

»
23 months ago, hide # |
 
Vote: I like it 0 Vote: I do not like it
int mx = INT_MIN;
for (int i = 1; i < n; i++) {
    int t = min(a[i], a[i - 1]);
    mx = max(mx, t);
}
cout << mx << "\n";

Can someone explain why this logic does not work for problem C? Why can't we just check the medians of all subarrays of length 2 and return the maximum among them?

»
23 months ago, hide # |
Rev. 2  
Vote: I like it 0 Vote: I do not like it

For G, I am wondering why this submission gets a WA but this one gets an AC.

I tried to use Gellyfish's template because MOD = 2013265921 seemed cool, but is the way I copied it somehow wrong? I copied it just as it is, except that I build p[] and w[] for every 2^i beforehand.

»
23 months ago, hide # |
Rev. 2  
Vote: I like it 0 Vote: I do not like it

The editorial for 1975D - Paint the Tree is not clear to me.

Will anyone explain it to me in more detail?

  • »
    »
    23 months ago, hide # ^ |
     
    Vote: I like it 0 Vote: I do not like it

    Firstly, it's clear that Pb's movements are useless until it steps on a red vertex. Once it reaches a red vertex, it's easy to see that we can make Pa move ahead of Pb (or together with Pb), so after reaching a red vertex we just need to traverse the whole tree to make it all blue. We can traverse the whole tree and return to where we started in 2*(n-1) moves, but once we color the last vertex blue we don't need to go back; so if the distance between the starting vertex and that last-colored vertex is d, we need 2*(n-1)-d moves. Obviously we maximize d, so the last-colored vertex should be the furthest vertex from our starting position. The last thing is that we need to reach a red vertex ASAP, so we calculate the distance between Pa and Pb and divide it by 2, rounded up.

»
23 months ago, hide # |
 
Vote: I like it 0 Vote: I do not like it

In problem B, I retrieve the minimum even and the minimum odd number of array a, then iterate over array a and check whether each ai is divisible by the minimum even or the minimum odd number. So what is the problem with this approach?

»
23 months ago, hide # |
Rev. 2  
Vote: I like it 0 Vote: I do not like it

We can simplify E further by considering another array pc[v] = xor of c[v] and the colors of its children. Now c and pc form a bijection, and a chain of black vertices corresponds to just 4 black vertices in pc that satisfy some conditions. 262582001

»
23 months ago, hide # |
Rev. 3  
Vote: I like it +8 Vote: I do not like it

Please someone tell me what is wrong with this solution for problem D. my submission (It's failing on test 3).

Or please provide a test case where it may fail.

»
23 months ago, hide # |
 
Vote: I like it +1 Vote: I do not like it

Can someone please explain the solution of Problem D in detail, with proof? Thanks in advance.

  • »
    »
    23 months ago, hide # ^ |
     
    Vote: I like it +4 Vote: I do not like it

    The key point is to notice that no matter how the vertices are painted red before they meet for the first time, the minimum cost will be $$$2 \cdot (n-1) - maxdep$$$ if we choose the vertex first painted blue as the root. The proof is that $$$P_A$$$ can choose to go to any subtree of the root before they meet, ensuring $$$2 \cdot (n-1) - maxdep$$$ can be reached (it's obviously the minimum). It is also obvious that delaying the time of the meeting decreases $$$maxdep$$$ by at most 1, as the new root can only be a child of the original root.

»
23 months ago, hide # |
 
Vote: I like it +6 Vote: I do not like it

Probably overkill, but E can be solved using an Euler tour and a segment tree. Code

»
23 months ago, hide # |
 
Vote: I like it 0 Vote: I do not like it

Please explain how we find the first vertex that will be painted blue in Problem D; the editorial is not clear to me.

»
23 months ago, hide # |
Rev. 5  
Vote: I like it +3 Vote: I do not like it

For problem C: why check for a length of 3? Initially I used a length of 2 and noticed that some tests failed, so I then considered a length of 3 as well. Can you help me?

»
23 months ago, hide # |
 
Vote: I like it 0 Vote: I do not like it

262692303 Can anyone tell why this gets MLE, and any possible way to remove it?

»
23 months ago, hide # |
Rev. 2  
Vote: I like it 0 Vote: I do not like it

Problem C was very similar to a past problem, 1349B - Orac and Medians. It is exactly the same operation on a subarray; it only asks a slightly different question, but both problems can easily be reduced to each other and the same observations can be used.

»
23 months ago, hide # |
 
Vote: I like it 0 Vote: I do not like it

What would be the rating of problem D?

»
23 months ago, hide # |
Rev. 2  
Vote: I like it 0 Vote: I do not like it

Help needed. Can anyone explain my mistake in problem A? My approach was as follows: from the beginning, find the first element that breaks the non-decreasing order, and starting from there again find an element that breaks the current non-decreasing run. If there is none (i == n), it's good. Also make sure that the last element is actually less than or equal to the first.

I'm sure the bug is stupid, but can't see it. Also, the submission is here. 262528960

Gist of the code
  • »
    »
    23 months ago, hide # ^ |
     
    Vote: I like it 0 Vote: I do not like it

    The code of your submission fails on cases such as this one.

    1
    4
    4 6 7 5
    

    Your code gives YES, though it should be NO.

    But when I use your snippet 262716265 it works.

    • »
      »
      »
      23 months ago, hide # ^ |
       
      Vote: I like it 0 Vote: I do not like it

      Thanks. I didn't realize that the two codes are different. And I thought the change was impactless, so I didn't bother submitting the updated (and correct) one. My bad.

»
23 months ago, hide # |
 
Vote: I like it 0 Vote: I do not like it

Can anyone explain why a subarray of length 3 always suffices to provide the correct solution for C?

  • »
    »
    23 months ago, hide # ^ |
     
    Vote: I like it +1 Vote: I do not like it

    Think about determining whether any subarray of an array has a median $$$ \gt x$$$. Clearly, the subarray needs to have a majority of $$$ \gt x$$$ values (i.e. 3 values greater than $$$x$$$ out of 5, or 6 out of 9, 51 out of 100, etc.). So, for every $$$ \lt x$$$ value, there has to be an equal (in fact, greater) number of $$$ \gt x$$$ values.

    Now, with that observation in mind, if the array starts to look like $$$x, \lt x, \lt x, \lt x, \ldots$$$ (one $$$x$$$ and 3 values less than $$$x$$$), then you'll need 3 more values $$$x$$$ (or $$$ \gt x$$$) to have a 4-out-of-7 majority. So it's better to ignore the first value (so that we don't have to carry the "burden" of 3 $$$ \lt x$$$ values) and start fresh from a later position. If the answer exists (for the previous example, 3 more $$$ \gt x$$$ values are indeed coming ahead), you'll find it in the future anyway.

    Thus, you can see that the moment we get an $$$x$$$ followed by 2 values $$$ \lt x$$$, we ignore that and start again from the next position. This is the reason every subarray of length $$$3$$$ has a median that is a potential answer (and we take the maximum of them). Hopefully I was clear enough.

»
23 months ago, hide # |
 
Vote: I like it 0 Vote: I do not like it

Thank you for the contest; I really learnt something from E.

»
23 months ago, hide # |
 
Vote: I like it 0 Vote: I do not like it

Can someone explain the solution to Problem F? The tutorial mentions "constraints" and sets T1 and T2 without defining anything.

»
23 months ago, hide # |
 
Vote: I like it +8 Vote: I do not like it

Here's an alternative solution for E which I found easier to understand than the model.

Let's root the tree arbitrarily. If the forest formed by the black vertices has 1 leaf, it consists of a single chain leading upwards from that leaf. If there are 2 leaves, each leaf similarly forms a chain leading upward, except now we need to check if they meet and stop at their LCA.

This is as simple as checking if the highest black vertex has 2 black children. The 2 black children tell us it's the LCA we're looking for, and we know that each of them must lead directly to a leaf, because it would mean there are more than 2 leaves otherwise.

»
23 months ago, hide # |
 
Vote: I like it +10 Vote: I do not like it

Problem G is very beautiful. Both the problem and the solution are.

»
23 months ago, hide # |
 
Vote: I like it 0 Vote: I do not like it

In problem $$$D$$$, you can try brute force all the nodes to determine the best answer.

Let $$$sumTree[i]$$$ be the minimum number of steps needed to traverse the whole tree when considering $$$i$$$ as the root. As we know, $$$sumTree[i] = 2 \cdot (N - 1) - maxHeight(i)$$$ (every edge is traversed twice except those on the path to the farthest node). Now the problem is to find $$$maxHeight$$$ for each node. This value always equals the maximum distance from node $$$i$$$ to one of the endpoints of a tree diameter (since those are the farthest-apart nodes). So now we can calculate $$$sumTree$$$ for any node in just $$$O(1)$$$ after finding a diameter and the distances from its endpoints to every node (which can simply be done with $$$3$$$ DFS).

Now to calculate our $$$answer$$$, we just need to calculate distances from our start nodes $$$P_A$$$, $$$P_B$$$.

For each node $$$i$$$, $$$ans[i] = distFromB[i] + sumTree[i]$$$ (don't forget to check that person $$$A$$$ can reach node $$$i$$$ no later than person $$$B$$$). The $$$answer$$$ is the minimum of $$$ans[i]$$$ over all nodes.

This is the solution link
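
The 3-traversal part can be sketched as follows (an illustrative helper using BFS, not the code in the linked submission): one traversal finds a diameter endpoint $$$u$$$, a second finds the other endpoint $$$w$$$, and then $$$maxHeight(i) = \max(dist_u(i), dist_w(i))$$$.

```cpp
#include <algorithm>
#include <cassert>
#include <queue>
#include <vector>
using namespace std;

// One BFS: distances (in edges) from src to every vertex.
vector<int> bfs(const vector<vector<int>>& adj, int src) {
    vector<int> d(adj.size(), -1);
    queue<int> q;
    d[src] = 0;
    q.push(src);
    while (!q.empty()) {
        int v = q.front(); q.pop();
        for (int u : adj[v])
            if (d[u] == -1) { d[u] = d[v] + 1; q.push(u); }
    }
    return d;
}

// maxHeight[i] = distance from i to the vertex farthest from it, obtained
// from the two diameter endpoints (three traversals in total).
vector<int> maxHeight(const vector<vector<int>>& adj) {
    vector<int> d0 = bfs(adj, 0);
    int u = (int)(max_element(d0.begin(), d0.end()) - d0.begin());
    vector<int> du = bfs(adj, u);  // u is one diameter endpoint
    int w = (int)(max_element(du.begin(), du.end()) - du.begin());
    vector<int> dw = bfs(adj, w);  // w is the other endpoint
    vector<int> h(adj.size());
    for (int i = 0; i < (int)adj.size(); i++) h[i] = max(du[i], dw[i]);
    return h;
}
```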

»
23 months ago, hide # |
Rev. 3  
Vote: I like it +10 Vote: I do not like it

For problem I, it looks like the second solution naturally extends to arbitrary $$$c_i$$$ as long as division by zero does not occur.


Here's an argument that division by zero does not occur when the $$$c_i$$$ are constrained to be nondecreasing:

  • When $$$c_i\neq 1$$$, $$$C=0$$$ (you can never loop back to the same state)
  • When $$$c_i=1$$$, and $$$C$$$ is nonzero, $$$C$$$ must be of the form $$$1/x$$$ for some integer $$$x\in [2,N]$$$.

So in either case $$$1-C\not\equiv 0\mod{10^9+7}$$$.


I wonder if there are other examples of nontrivial constraints on $$$c_i$$$ (other than $$$c_i$$$ nondecreasing) that guarantee that division by zero does not occur.

  • »
    »
    23 months ago, hide # ^ |
    Rev. 2  
    Vote: I like it 0 Vote: I do not like it

    It is interesting that the original problem actually allowed choosing any card to play, maximizing the probability of emptying the draw pile.

    Through a large amount of testing it appeared that this version is equivalent to the final problem. However, until the day before the start of the competition, no one was able to provide a rigorous proof, so the final version was given instead.

»
22 months ago, hide # |
 
Vote: I like it 0 Vote: I do not like it

Why do we need to sort vector b in problem B?

  • »
    »
    19 months ago, hide # ^ |
     
    Vote: I like it 0 Vote: I do not like it

    Otherwise we wouldn't be able to find the smallest element, nor the smallest element that is not divisible by it.
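
To spell that out, here is a hedged sketch of the check (illustrative code, assuming the usual statement of 1975B; `good` is a hypothetical helper name). The smallest element is the only candidate for $$$b_1$$$, and among the elements not divisible by it, only their minimum can divide them all, so it is the only candidate for $$$b_2$$$.

```cpp
#include <algorithm>
#include <cassert>
#include <vector>
using namespace std;

// Can b1 = min(a) together with b2 = min of the elements not divisible
// by b1 cover the whole array?
bool good(const vector<long long>& a) {
    long long b1 = *min_element(a.begin(), a.end());
    vector<long long> rest;
    for (long long x : a)
        if (x % b1 != 0) rest.push_back(x);
    if (rest.empty()) return true;  // b1 alone covers everything
    long long b2 = *min_element(rest.begin(), rest.end());
    for (long long x : rest)
        if (x % b2 != 0) return false;  // covered by neither b1 nor b2
    return true;
}
```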

»
22 months ago, hide # |
Rev. 2  
Vote: I like it +3 Vote: I do not like it

I had some trouble with F and couldn't find a satisfactory explanation, so I decided to write one including motivation.

So first, notice that the structure of the problem isn't friendly. We're given constraints on things that heavily depend on one another in a non-obvious way. We can try merging conditions together without fixing anything, but the relationships are complex. This means that naturally, we should try fixing some elements in $$$S$$$.

Say we have fixed $$$x$$$ elements to definitively be/not be in $$$S, $$$ and denote the set of their indices as $$$X.$$$ Let's try to figure out how we can use the fact that we have fixed $$$x$$$ elements in order to reduce redundancies in the constraints.

We want to try to infer information about the non-decided elements from the decided ones. Consider one of the $$$2^{N-x}$$$ possible subsets of the non-decided elements, and call it $$$R$$$. Consider every $$$T$$$ that is the union of $$$R$$$ and some subset of $$$X.$$$ Notice that since we know $$$V_{T}$$$ and we know which elements of $$$X$$$ are present in $$$S, $$$ we can immediately update $$$V_{R}$$$ to reflect this known information. Iterating over all such subsets, we can keep updating $$$V_{R}$$$ again and again. Now we don't have to consider every subset of $$$X$$$ separately: we can treat $$$V_{R}$$$ as a single constraint. This is what the editorial means when it says that there are $$$2^{N-x}$$$ constraints once we have fixed $$$x$$$ elements: there is one constraint associated with every $$$R.$$$

Now, the problem is that computing the constraint for a fixed $$$R$$$ is expensive: it takes $$$\mathcal{O}(2^x)$$$ operations since we have to merge all sets of the form $$$R + \text{subset of X}.$$$ So instead, as we build up the set $$$X, $$$ we can try repeatedly merging constraints from the constraint set. Say we were trying to fix a new element $$$\text{pos}$$$ to be either in $$$S$$$ or out of $$$S, $$$ and let's say we have the constraint set for every $$$R \in [0, 1, 2, \dots, N-1] - X.$$$ How will the constraint sets change upon declaring $$$\text{pos}$$$ to either be in $$$S$$$ or not be in $$$S$$$?

For every set $$$R' \in [0, 1, 2, \dots, N-1] - [\text{pos}] - X, $$$ we can "merge" $$$V_{R'}$$$ and $$$V_{R' + \text{pos}}$$$ together depending on whether we declare $$$\text{pos}$$$ to be in $$$S$$$ or not. Now, we can simply DFS and find the potential answers by checking for contradictions at the end of the process.
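
If it helps to see the merge concretely: one natural representation (my own sketch, not necessarily the intended implementation) stores each constraint $$$V_{R}$$$ as a bitmask of the still-possible values of $$$|S \cap T|$$$. Declaring $$$\text{pos} \notin S$$$ just intersects the two masks; declaring $$$\text{pos} \in S$$$ first shifts $$$V_{R' + \text{pos}}$$$ down by one bit, since $$$\text{pos}$$$ now contributes $$$1$$$ to every intersection size counted there.

```cpp
#include <cassert>
#include <cstdint>

// Merge constraint V_{R'} with constraint V_{R'+pos} after deciding pos.
// Bit k of a mask means "k is still a possible value of the count".
uint64_t mergeConstraint(uint64_t vR, uint64_t vRpos, bool posInS) {
    // If pos is in S, every count recorded in V_{R'+pos} already includes
    // pos once, so the counts for the remaining elements shift down by 1.
    return vR & (posInS ? (vRpos >> 1) : vRpos);
}
```

A result of 0 signals a contradiction, which is exactly what the DFS checks for at the end.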

Submission: 267928405

»
21 month(s) ago, hide # |
 
Vote: I like it 0 Vote: I do not like it

In problem D, the solution says it is based on a classical problem. Can anyone please provide more information about that?

»
17 months ago, hide # |
 
Vote: I like it 0 Vote: I do not like it

In question C, if I take a subarray of length 2 every time instead of 3, why am I getting a wrong answer? What's wrong there?

»
16 months ago, hide # |
 
Vote: I like it 0 Vote: I do not like it

Can anyone recommend problems which are similar to F?

»
12 months ago, hide # |
 
Vote: I like it 0 Vote: I do not like it

Can someone tell the proof of problem A?

»
7 weeks ago, hide # |
Rev. 2  
Vote: I like it 0 Vote: I do not like it

The editorial for D is quite unclear TBH.

"Considering all subsequent movements of PB, PA can restore these movements one by one after reaching r, then PB will pass through all vertices have been painted red." -> "Considering all subsequent movements of PA, PB can restore these movements one by one after reaching r, then PB will pass through all vertices that have been painted red."

"The two must be close to each other, and then until the first vertex is painted blue." — didn't fully understand the meaning.

Also, I'm struggling to understand the complete proof of "If another vertex is r, although the value of d may increase, every time the value of d increases by 1, the time when PA and PB meet will also increase by at least 1, so the answer will not decrease.". Is the optimal strategy for increasing d always to pick a neighbour of r (max d' = d+1), and is that proven inductively? The max d' within a neighbour's subtree is <= d-1, since otherwise it would already give max d from r (1+d). Are we guaranteed that the shortest path to r' goes through r?

»
12 days ago, hide # |
 
Vote: I like it 0 Vote: I do not like it

67