tacocat's blog

By tacocat, 6 months ago

Given an array A of size n, process these q queries:

1 pos x: assign A[pos] = x;

2 l r k: find k-th smallest element in range [l,r]

Constraints:

n,q <= 1e5

A[i], x <= 1e9

My first approach was to use a wavelet tree with a BST (specifically a treap) in each of its nodes. This gives a time complexity of O(n·log² n) and can answer queries online (I still did it offline: I had to read all the input and compress the values because I didn't want to make the wavelet tree dynamic). However, it has a really high constant factor and takes 7-8 seconds to run with the full constraints.

Here's my code if you're interested (sorry if it's poorly written):

My Code

Answering the queries offline is allowed, and the problem also has a parallel binary search tag.

Can someone please show me the intended solution with parallel binary search (or even optimize my current code)?

UPDATE: Solution found :D https://robert1003.github.io/2020/02/05/parallel-binary-search.html#a-hrefhttpacmhdueducnshowproblemphppid5412hdu5412---crb-and-queriesa
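For future readers, here is a rough sketch of how I understand that offline idea: divide and conquer over the compressed value range with a Fenwick tree over positions, which is one way to phrase parallel binary search for this problem. This is not the code from the linked write-up; the structure and the names (Item, solve, bitAdd, ...) are illustrative, and it assumes 1-based positions and that every query has a valid k. It should run in about O((n + q)·log n·log(n + q)).

```cpp
#include <bits/stdc++.h>
using namespace std;

int n, q;
vector<long long> bitTree;  // Fenwick tree over positions 1..n
void bitAdd(int i, int v) { for (; i <= n; i += i & -i) bitTree[i] += v; }
long long bitSum(int i) { long long s = 0; for (; i > 0; i -= i & -i) s += bitTree[i]; return s; }

struct Item {
    int type;           // 0 = modification, 1 = query
    int pos, val, sgn;  // modification: add sgn (+1/-1) at position pos, compressed value val
    int l, r, k, id;    // query: k-th smallest in [l, r], answer slot id
};

vector<int> answerVal;  // answerVal[id] = compressed value of the answer to query id

// items are in time order; every query in this bundle has its answer in [lo, hi]
void solve(int lo, int hi, vector<Item>& items) {
    if (items.empty()) return;
    if (lo == hi) {
        for (auto& it : items)
            if (it.type == 1) answerVal[it.id] = lo;
        return;
    }
    int mid = (lo + hi) / 2;
    vector<Item> left, right;
    vector<pair<int,int>> applied;  // (pos, sgn) pairs to roll back afterwards
    for (auto& it : items) {
        if (it.type == 0) {
            if (it.val <= mid) {  // this modification is "visible" to the left half
                bitAdd(it.pos, it.sgn);
                applied.push_back({it.pos, it.sgn});
                left.push_back(it);
            } else right.push_back(it);
        } else {
            // elements currently in [l, r] whose value lies in [lo, mid]
            long long cnt = bitSum(it.r) - bitSum(it.l - 1);
            if (it.k <= cnt) left.push_back(it);
            else { it.k -= (int)cnt; right.push_back(it); }
        }
    }
    for (auto& [pos, sgn] : applied) bitAdd(pos, -sgn);  // leave the BIT empty again
    items.clear(); items.shrink_to_fit();                // keep memory roughly linear
    solve(lo, mid, left);
    solve(mid + 1, hi, right);
}

int main() {
    scanf("%d %d", &n, &q);
    vector<int> a(n + 1), vals;
    vector<array<int,4>> raw(q);
    for (int i = 1; i <= n; i++) { scanf("%d", &a[i]); vals.push_back(a[i]); }
    for (auto& r : raw) {
        scanf("%d", &r[0]);
        if (r[0] == 1) { scanf("%d %d", &r[1], &r[2]); vals.push_back(r[2]); }
        else scanf("%d %d %d", &r[1], &r[2], &r[3]);
    }
    sort(vals.begin(), vals.end());
    vals.erase(unique(vals.begin(), vals.end()), vals.end());
    auto comp = [&](int x) { return (int)(lower_bound(vals.begin(), vals.end(), x) - vals.begin()); };

    bitTree.assign(n + 1, 0);
    vector<Item> items;
    vector<int> cur(n + 1);
    int qid = 0;
    for (int i = 1; i <= n; i++) {  // the initial array is just n insertions at time 0
        cur[i] = comp(a[i]);
        items.push_back({0, i, cur[i], +1, 0, 0, 0, 0});
    }
    for (auto& r : raw) {
        if (r[0] == 1) {  // assignment = delete the old value, insert the new one
            int pos = r[1], v = comp(r[2]);
            items.push_back({0, pos, cur[pos], -1, 0, 0, 0, 0});
            items.push_back({0, pos, v, +1, 0, 0, 0, 0});
            cur[pos] = v;
        } else {
            items.push_back({1, 0, 0, 0, r[1], r[2], r[3], qid++});
        }
    }
    answerVal.assign(qid, 0);
    solve(0, (int)vals.size() - 1, items);
    for (int i = 0; i < qid; i++) printf("%d\n", vals[answerVal[i]]);
    return 0;
}
```

Each item moves to exactly one child per level and the BIT is rolled back to empty before recursing, which is what keeps both time and memory near-linear with a small constant factor.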

  Score: +2

»
6 months ago, # |
  Score: +4

You can solve it using a persistent segment tree.

  • »
    »
    6 months ago, # ^ |
      Score: 0

    How would you handle the update queries? I can do it with a persistent segment tree if there are no updates.

»
6 months ago, # |
  Score: -36

Here’s a simpler way to tackle the problem:

Use a Segment Tree: Think of it like a big tree where each node covers a part of the array and keeps a sorted list of the numbers in that part.

For Updates: When you change a number in the array, you update the sorted lists in the parts of the tree that cover that number.

For Queries: To find the k-th smallest number in a range, you collect and merge the sorted lists from the relevant parts of the tree, then pick the k-th smallest from the merged list.

This method is pretty efficient and manages both updates and queries well.

  • »
    »
    6 months ago, # ^ |
    Rev. 2, Score: 0

    I remember doing this for a similar problem without updates and with a small constraint on k, but this problem allows k up to n. I may be mistaken, but it seems your solution has a time complexity of O(n²·log² n).

    • »
      »
      »
      6 months ago, # ^ |
        Score: +18

      It's probably a ChatGPT solution.

  • »
    »
    6 months ago, # ^ |
      Score: +3

    This approach is O(nq), since merging the lists takes O(n) per query.

  • »
    »
    6 months ago, # ^ |
      Score: 0

    How would you merge the lists to find the k-th element quickly?

»
6 months ago, # |
  Score: 0

Could you give the problem link?

»
6 months ago, # |
  Score: +3

You can use a segment tree or a Fenwick tree built over the values, where t[i] is an ordered_set of the indices j such that the value a[j] falls in the i-th segment of the segment/Fenwick tree.
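Roughly, it could look like the sketch below (a minimal sketch, assuming GNU pb_ds is available and the input is read offline only to compress the values; the helper names are illustrative). Queries use the standard "k-th element" binary descent on the Fenwick tree, but every count is restricted to positions [l, r] via order_of_key, so both updates and queries cost O(log(n + q)·log n).

```cpp
#include <bits/stdc++.h>
#include <ext/pb_ds/assoc_container.hpp>
#include <ext/pb_ds/tree_policy.hpp>
using namespace std;
using namespace __gnu_pbds;

typedef tree<int, null_type, less<int>, rb_tree_tag,
             tree_order_statistics_node_update> ordered_set;

int m, LOG;              // m = number of distinct (compressed) values
vector<ordered_set> t;   // t[i] = indices j whose current value lies in BIT node i

// how many stored indices fall inside positions [l, r]
int countInRange(const ordered_set& s, int l, int r) {
    return (int)(s.order_of_key(r + 1) - s.order_of_key(l));
}
void insertVal(int val, int idx) {  // position idx now holds compressed value val
    for (int i = val; i <= m; i += i & -i) t[i].insert(idx);
}
void eraseVal(int val, int idx) {
    for (int i = val; i <= m; i += i & -i) t[i].erase(idx);
}

// k-th smallest compressed value among positions [l, r]: standard BIT descent
int kth(int l, int r, int k) {
    int pos = 0;
    for (int pw = LOG; pw >= 0; pw--) {
        int nxt = pos + (1 << pw);
        if (nxt <= m) {
            int cnt = countInRange(t[nxt], l, r);
            if (cnt < k) { k -= cnt; pos = nxt; }
        }
    }
    return pos + 1;
}

int main() {
    int n, q;
    scanf("%d %d", &n, &q);
    vector<int> a(n + 1), vals;
    vector<array<int,4>> qs(q);
    for (int i = 1; i <= n; i++) { scanf("%d", &a[i]); vals.push_back(a[i]); }
    for (auto& x : qs) {
        scanf("%d", &x[0]);
        if (x[0] == 1) { scanf("%d %d", &x[1], &x[2]); vals.push_back(x[2]); }
        else scanf("%d %d %d", &x[1], &x[2], &x[3]);
    }
    sort(vals.begin(), vals.end());
    vals.erase(unique(vals.begin(), vals.end()), vals.end());
    m = (int)vals.size();
    LOG = 0; while ((2 << LOG) <= m) LOG++;
    auto comp = [&](int x) { return (int)(lower_bound(vals.begin(), vals.end(), x) - vals.begin()) + 1; };

    t.resize(m + 1);
    vector<int> cur(n + 1);
    for (int i = 1; i <= n; i++) { cur[i] = comp(a[i]); insertVal(cur[i], i); }
    for (auto& x : qs) {
        if (x[0] == 1) {  // assignment: remove the old value of this position, add the new one
            int pos = x[1], v = comp(x[2]);
            eraseVal(cur[pos], pos); insertVal(v, pos); cur[pos] = v;
        } else {
            printf("%d\n", vals[kth(x[1], x[2], x[3]) - 1]);
        }
    }
    return 0;
}
```

Each index ends up in O(log m) of the ordered_sets, so memory is on the order of n·log n tree nodes, which is probably where most of the constant factor comes from.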

  • »
    »
    6 months ago, # ^ |
      Score: 0

    I think that it is similar to my solution, but I used treaps instead of ordered sets. This is the first time I've heard about it and it seems pretty powerful. I wonder if it is allowed in competitions, as it is pretty wild for a pre-implemented data structure.

    • »
      »
      »
      6 months ago, # ^ |
        Score: +3

      Tried the ordered_set out and it runs 1s slower than the treap with the full constraints :(

»
6 months ago, # |
Rev. 2, Score: 0

Use a merge sort tree and, on each node, use an ordered set to support updates.

  • »
    »
    6 months ago, # ^ |
      Score: 0

    That was also my first thought, but how are you getting the k-th smallest number in O(log² n)?

    • »
      »
      »
      6 months ago, # ^ |
        Score: 0

      My bad, yeah, the query will take O(log³ n).

»
6 months ago, # |
  Score: 0

You can use a merge sort tree and then binary search, or in this case an ordered_multiset (since it also supports the order_of_key() operation) to handle the update queries. But since the binary search takes an extra O(log N), we can use fractional cascading to precompute it at the cost of extra memory, by storing the iterators of lower_bound in the right and left children of the current node in the segment tree, along with the element values.

Thus, each time we query we get O(log N) complexity. However, building the update function will be trickier here.

I think combining this with this should work.
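Skipping the fractional cascading part, a minimal sketch of the plain merge sort tree + ordered set version is below (assuming GNU pb_ds; it stores (value, position) pairs so that duplicate values are allowed, and the helper names are illustrative). It matches the O(log³ n) query / O(log² n) update bounds discussed above: each count touches O(log n) canonical nodes, each order_of_key costs another log, and the outer binary search over compressed values adds a third.

```cpp
#include <bits/stdc++.h>
#include <ext/pb_ds/assoc_container.hpp>
#include <ext/pb_ds/tree_policy.hpp>
using namespace std;
using namespace __gnu_pbds;

typedef tree<pair<int,int>, null_type, less<pair<int,int>>, rb_tree_tag,
             tree_order_statistics_node_update> ordered_set;

int n, m;                 // array size, number of distinct compressed values
vector<ordered_set> seg;  // seg[node] holds the (value, position) pairs of its range
vector<int> cur;          // current compressed value at each position

void build(int node, int l, int r) {
    for (int i = l; i <= r; i++) seg[node].insert(make_pair(cur[i], i));
    if (l == r) return;
    int mid = (l + r) / 2;
    build(2 * node, l, mid);
    build(2 * node + 1, mid + 1, r);
}

void update(int node, int l, int r, int pos, int val) {
    seg[node].erase(make_pair(cur[pos], pos));  // cur[pos] still holds the old value here
    seg[node].insert(make_pair(val, pos));
    if (l == r) { cur[pos] = val; return; }     // record the new value at the leaf
    int mid = (l + r) / 2;
    if (pos <= mid) update(2 * node, l, mid, pos, val);
    else update(2 * node + 1, mid + 1, r, pos, val);
}

// number of elements with value <= v among positions [ql, qr]
int countLeq(int node, int l, int r, int ql, int qr, int v) {
    if (qr < l || r < ql) return 0;
    if (ql <= l && r <= qr) return (int)seg[node].order_of_key(make_pair(v + 1, 0));
    int mid = (l + r) / 2;
    return countLeq(2 * node, l, mid, ql, qr, v) +
           countLeq(2 * node + 1, mid + 1, r, ql, qr, v);
}

// k-th smallest in [l, r]: binary search over the compressed values
int kth(int l, int r, int k) {
    int lo = 1, hi = m;
    while (lo < hi) {
        int mid = (lo + hi) / 2;
        if (countLeq(1, 1, n, l, r, mid) >= k) hi = mid;
        else lo = mid + 1;
    }
    return lo;
}

int main() {
    int q;
    scanf("%d %d", &n, &q);
    vector<int> a(n + 1), vals;
    vector<array<int,4>> qs(q);
    for (int i = 1; i <= n; i++) { scanf("%d", &a[i]); vals.push_back(a[i]); }
    for (auto& x : qs) {
        scanf("%d", &x[0]);
        if (x[0] == 1) { scanf("%d %d", &x[1], &x[2]); vals.push_back(x[2]); }
        else scanf("%d %d %d", &x[1], &x[2], &x[3]);
    }
    sort(vals.begin(), vals.end());
    vals.erase(unique(vals.begin(), vals.end()), vals.end());
    m = (int)vals.size();
    auto comp = [&](int x) { return (int)(lower_bound(vals.begin(), vals.end(), x) - vals.begin()) + 1; };

    cur.resize(n + 1);
    for (int i = 1; i <= n; i++) cur[i] = comp(a[i]);
    seg.resize(4 * n);
    build(1, 1, n);
    for (auto& x : qs) {
        if (x[0] == 1) update(1, 1, n, x[1], comp(x[2]));
        else printf("%d\n", vals[kth(x[1], x[2], x[3]) - 1]);
    }
    return 0;
}
```

With n, q = 1e5 and log n around 17, the worst case is roughly 1e5 · 17³ ≈ 5·10⁸ ordered_set steps, so even though it is polylogarithmic it may well be too slow here without the fractional cascading speedup.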

»
5 months ago, # |
  Score: +4

Holy username. I couldn't help but laugh at it. Definitely a normal and not sussy at all name.

»
5 months ago, # |
  Score: +13
  • »
    »
    5 months ago, # ^ |
      Score: 0

    Thanks for sharing this :D

    I updated the blog with the solution.


»
5 months ago, # |
  Score: 0

Alternative: we can try using a splay tree. Updates are just a matter of changing the value stored in the node, and for a query we first splay(l-1), then splay(r+1). Then, binary searching on [l,r] gives us the answer in O(N log N), albeit with a pretty high constant.