Final_Track's blog

By Final_Track, 25 hours ago, In English

As part of a research task, I came across a sub-problem as follows:

Given an undirected, unweighted tree on $$$n$$$ nodes and a threshold $$$k$$$, partition the $$$n-1$$$ edges of the tree into simple paths such that every edge belongs to exactly one simple path and the length of every simple path is at most $$$k$$$. What is the minimum number of simple paths in such a partition? Note that every edge belongs to exactly one simple path, but the same vertex may be visited by multiple simple paths.

I was wondering if this is by any chance a "standard" problem with a known polynomial-time solution.


»
24 hours ago, # |

Maybe not this exact problem, but it is similar to matching leaves into pairs to minimize the sum of the path lengths, and then the maximum of them, which is a problem I have written several times for contests.

Root the tree somewhere (you can root it at a leaf for simplicity), and do a DP over subtrees. The edge from a subtree's root to its parent is covered by exactly one path, so exactly one (possibly empty) partial path goes out of each subtree, and we want to cover everything else in the subtree with the minimum number of paths. For a subtree, two things matter at the end: the length of the path going up, and the number of paths we used inside the subtree. It is more important to minimize the number of paths used, because you can always stop the current path, increasing the number of paths used by only 1 while resetting the length of the path going up to 0. So we can define $$$DP[v] = $$$ (number of paths, length of the path going up), minimizing this pair lexicographically.
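
For concreteness, the DP value can be stored as a pair, whose built-in comparison is exactly the lexicographic order we want (a minimal sketch; the type name is my own):

```cpp
#include <utility>

// {paths used in the subtree, length of the partial path going up};
// std::pair compares lexicographically, which matches the DP order.
using State = std::pair<long long, long long>;
```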

Now at a vertex we have a number of partial paths coming up from the children, and we need to match them somehow. To be precise, we need to choose one of the paths to continue up to the parent, stop some of the other paths right here, and split all the rest into pairs such that in every pair the sum of the two lengths is at most $$$k$$$ (two partial paths joined at the current vertex form one complete path).

The first thing we care about is the number of paths used, so we want to maximize the number of pairs matched. Do a binary search: to check whether it is possible to make $$$m$$$ pairs, keep the $$$2m$$$ shortest paths and match the smallest with the largest, the 2nd smallest with the 2nd largest, and so on. Feasibility is monotone in $$$m$$$, so binary search applies.
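
A minimal sketch of this check and the surrounding binary search (assuming the path lengths arriving at the current vertex are already sorted ascending; the function names are my own):

```cpp
#include <vector>

// Can m pairs with sums <= k be formed? Per the argument above, it suffices
// to take the 2m shortest lengths and match smallest with largest,
// 2nd smallest with 2nd largest, and so on. `len` must be sorted ascending.
bool canMakePairs(const std::vector<long long>& len, int m, long long k) {
    for (int i = 0; i < m; i++)
        if (len[i] + len[2 * m - 1 - i] > k) return false;
    return true;
}

// Feasibility is monotone in m, so binary search for the largest feasible m.
int maxPairs(const std::vector<long long>& len, long long k) {
    int lo = 0, hi = (int)len.size() / 2;
    while (lo < hi) {
        int mid = (lo + hi + 1) / 2;
        if (canMakePairs(len, mid, k)) lo = mid;
        else hi = mid - 1;
    }
    return lo;
}
```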

Now that we know how many pairs we have to make, we want to choose the shortest path to leave going up to the parent, subject to the condition that without it we can still make the same number of pairs. You can use another binary search here: an exchange argument shows that if removing some path keeps the pair count, then removing any longer path keeps it too, so the condition is monotone in the sorted order.
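
One possible implementation of this step, reusing `canMakePairs` from the sketch above (again the names are my own); with $$$P$$$ pairs required, each check costs $$$O(d)$$$:

```cpp
#include <vector>

// Can we still make p pairs with sums <= k after deleting len[skip]?
bool pairsSurviveRemoval(const std::vector<long long>& len, int skip,
                         int p, long long k) {
    std::vector<long long> rest;
    rest.reserve(len.size() - 1);
    for (int i = 0; i < (int)len.size(); i++)
        if (i != skip) rest.push_back(len[i]);
    return (int)rest.size() >= 2 * p && canMakePairs(rest, p, k);
}

// Smallest index whose removal keeps P pairs; by the exchange argument the
// predicate is monotone in the index, and removing the largest length always
// works when 2P < d, so a feasible index is guaranteed in that case.
int shortestRemovableIndex(const std::vector<long long>& len, int P, long long k) {
    int lo = 0, hi = (int)len.size() - 1;
    while (lo < hi) {
        int mid = (lo + hi) / 2;
        if (pairsSurviveRemoval(len, mid, P, k)) hi = mid;
        else lo = mid + 1;
    }
    return lo; // len[lo] is the shortest path that can be sent up
}
```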

So, one DP recalculation at vertex $$$v$$$ works in $$$O(d_v \log d_v)$$$, where $$$d_v$$$ is the number of children of $$$v$$$ (we need to sort the lengths of the paths we got from the children and do a couple of binary searches). The total complexity is $$$O(n \log n)$$$.
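
Putting everything together, here is a hypothetical end-to-end implementation, reusing the helpers from the sketches above. The boundary cases (the fresh zero-length path going up, and the handling of the root) are filled in as I understand the comment, and for very deep trees the recursion should be made iterative:

```cpp
#include <bits/stdc++.h>
using namespace std;
// canMakePairs, maxPairs, pairsSurviveRemoval, shortestRemovableIndex as above

long long K;                 // the threshold k
vector<vector<int>> adj;
vector<long long> cnt;       // completed paths inside the subtree
vector<long long> up;        // length of the partial path leaving the subtree

void dfs(int v, int parent) {
    vector<long long> len;   // partial-path lengths arriving at v
    cnt[v] = 0;
    for (int c : adj[v]) {
        if (c == parent) continue;
        dfs(c, v);
        cnt[v] += cnt[c];
        len.push_back(up[c] + 1); // extend the child's path by the edge (c, v)
    }
    sort(len.begin(), len.end());
    int d = (int)len.size();
    int P = maxPairs(len, K);
    // Fallback: pair or terminate everything here (d - P completed paths)
    // and start a fresh zero-length path going up.
    long long bestCnt = cnt[v] + d - P, bestUp = 0;
    if (2 * P < d) {
        int i = shortestRemovableIndex(len, P, K);
        // Send len[i] up instead: one fewer completed path. Legal only if the
        // path can still absorb the edge to the parent (or v is the root).
        if (len[i] + 1 <= K || parent == -1) {
            bestCnt = cnt[v] + (d - 1) - P;
            bestUp = len[i];
        }
    }
    cnt[v] = bestCnt;
    up[v] = bestUp;
}

int main() {
    int n;
    scanf("%d %lld", &n, &K);
    adj.assign(n, {});
    cnt.assign(n, 0);
    up.assign(n, 0);
    for (int e = 0; e < n - 1; e++) {
        int a, b;
        scanf("%d %d", &a, &b);  // 1-indexed endpoints
        adj[a - 1].push_back(b - 1);
        adj[b - 1].push_back(a - 1);
    }
    dfs(0, -1);
    // The root's own upward path, if non-empty, is one more finished path.
    printf("%lld\n", cnt[0] + (up[0] > 0 ? 1 : 0));
    return 0;
}
```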

  • »
    »
    21 hours ago, # ^ |

    Omg

  • »
    »
    12 hours ago, # ^ |

    Wow, this is a very elegant solution compared to anything I had in mind. Thank you!

  • »
    »
    11 hours ago, # ^ |

    Is it possible to remove the binary searches? (Although it won't reduce the complexity.)

    For the first part we can use two pointers: pair the smallest path with the largest path it can form a valid pair with.

    For the second part, try using the smallest path that was not used in the first part.

    • »
      »
      »
      7 hours ago, # ^ |

      Yeah, you should be able to remove both binary searches, but you still need the sorting. I wanted to describe the simplest-to-understand solution (and it doesn't even worsen the complexity).

      For the second part, I don't think just reusing the result of the first part is correct, because you tried to use the smallest path first, and maybe you didn't have to. But you certainly can do the following: match everything except the largest path, then do a scanline over the unused paths; each time you move the pointer you only need to recheck one new pair.
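
      For reference, a sketch of the two-pointer replacement for the first binary search (the scanline for the second step is omitted): after sorting, try to pair the smallest remaining length with the largest one; if even the smallest does not fit with the current largest, that largest length cannot be paired with anything and is left single.

      ```cpp
      #include <algorithm>
      #include <vector>

      // Maximum number of pairs with sum <= k, computed with two pointers;
      // gives the same value as the binary-search version above.
      int maxPairsTwoPointers(std::vector<long long> len, long long k) {
          std::sort(len.begin(), len.end());
          int pairs = 0, l = 0, r = (int)len.size() - 1;
          while (l < r) {
              if (len[l] + len[r] <= k) { pairs++; l++; r--; }
              else r--; // len[r] pairs with nothing remaining: leave it single
          }
          return pairs;
      }
      ```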