Anthony_Smith's blog

By Anthony_Smith, 3 years ago, In English

I recently came across Problem 95E. The problem pretty much asks "Given a set of coins whose values sum to $$$N$$$, find for each number from $$$1$$$ to $$$N$$$ the minimum number of coins needed to create that total sum".

I know of a way to do this with an $$$O(\log N)$$$ factor (if a weight $$$w$$$ occurs $$$o$$$ times, split it up into $$$O(\log o)$$$ weights $$$\{w, 2^1 \cdot w, 2^2 \cdot w, \ldots, 2^{\lfloor \log_2(o) \rfloor} \cdot w\}$$$, then do regular knapsack), but I also want to learn how to do it with an $$$O(\sqrt{N})$$$ factor. The general idea I've learned is that there are at most $$$O(\sqrt{N})$$$ distinct coin values, and that there exists some way to process all coins of one fixed coin value in $$$O(N)$$$. But I don't know how to process all occurrences of a fixed coin value in $$$O(N)$$$. I have read this blog, which uses the approach of finding, for each total sum, the minimum number of the current coin needed to create it. But that blog is for 0/1 knapsack only, while here I want to find the minimum number of coins needed for each total sum.

Can someone provide a detailed explanation/implementation of the $$$O(N\sqrt{N})$$$ knapsack algorithm? Thanks!


»
3 years ago, # |

The idea to process $$$v$$$ items of value $$$k$$$ is to treat the problem as a sliding window minimum: for a state $$$dp_{i,j}$$$ (where $$$i$$$ is the index of the current value $$$k$$$ being processed and $$$j$$$ is the cost), you only care about the states $$$dp_{i-1,j}, dp_{i-1,j-k}, dp_{i-1, j-2k}, \ldots, dp_{i-1,j-vk}$$$.

Then, if you consider the values $$$j$$$ modulo $$$k$$$, you'll notice that for a fixed remainder it becomes a range minimum query on a 'sliding window' interval (namely, the left bound of the interval only moves to the right with each query), which can be computed in amortized constant time using a monotonic deque.

The idea of the monotonic deque is to maintain the 'history' of the values as you move from left to right, kicking out every value $$$x$$$ that is worse (here, larger) than a more recent value $$$y$$$. This way, the front (the 'oldest' part) of the deque always holds the optimum of the current sliding window, and as the left boundary moves to the right, you just need to kick out the values that are no longer in the current window.
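For concreteness, here is a minimal sketch of this approach (my own illustration under the assumptions above, not the commenter's code; the function name and interface are made up). One detail worth spelling out: taking $$$t$$$ extra coins of value $$$k$$$ also adds $$$t$$$ to the coin count, so for a fixed remainder $$$r$$$ and positions $$$j = r + mk$$$ the deque stores the shifted values $$$dp[r + m'k] - m'$$$, which turns the recurrence into a pure sliding window minimum.

```cpp
#include <bits/stdc++.h>
using namespace std;

// dp[j] = minimum number of coins whose values sum to exactly j (INF if impossible).
vector<long long> minCoinsEverySum(const vector<int>& coins, int N) {
    const long long INF = LLONG_MAX / 2;
    map<int, int> cnt;                        // coin value -> multiplicity
    for (int c : coins) cnt[c]++;

    vector<long long> dp(N + 1, INF);
    dp[0] = 0;
    for (auto [k, v] : cnt) {                 // at most O(sqrt(N)) distinct values
        for (int r = 0; r < k; r++) {         // each residue class mod k separately
            // Positions j = r + m*k; the window covers m' in [m - v, m].
            deque<pair<int, long long>> dq;   // (m', dp_old[r + m'*k] - m'), values increasing
            for (int m = 0; r + 1LL * m * k <= N; m++) {
                int j = r + m * k;
                long long cand = dp[j] - m;   // push the pre-pass dp value, shifted
                while (!dq.empty() && dq.back().second >= cand) dq.pop_back();
                dq.emplace_back(m, cand);
                while (dq.front().first < m - v) dq.pop_front();  // drop m' out of window
                dp[j] = dq.front().second + m;  // min over t in [0, v] of dp_old[j - t*k] + t
            }
        }
    }
    return dp;
}
```

Each distinct value $$$k$$$ touches every $$$j \in [0, N]$$$ exactly once across its $$$k$$$ residue classes, and the deque work is amortized constant, so one value costs $$$O(N)$$$ and the whole run is $$$O(N\sqrt{N})$$$.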

»
3 years ago, # |

This video on knapsack is old by now, but in the middle I show how to solve the $$$O(N\sqrt{N})$$$ case on the way to solving a tougher variant.

»
3 years ago, # |

Probably I have misunderstood something, but I want to note that the method you've mentioned (if a weight $$$w$$$ occurs $$$o$$$ times, split it up into $$$\log(o)$$$ weights $$$w, 2^1 \cdot w, 2^2 \cdot w, \ldots, 2^{\lfloor \log_2(o) \rfloor} \cdot w$$$, then do regular knapsack) already provides an $$$O(N\sqrt{N})$$$ solution. The idea with the monotonic deque is fine too, but I find it a bit harder to implement. On the other hand, the proof here is not very easy.

First of all, split all occurrences of $$$w$$$ into $$$O(\log o)$$$ parts. I will call these parts super-coins. To be able to find the answer, you need to maintain pairs $$$\langle cost, count \rangle$$$, where $$$cost$$$ is the total cost of a super-coin and $$$count$$$ is how many initial coins the super-coin contains. You can do this naively: initially there are at most $$$O(\sqrt{N})$$$ different coin values, so you can't get more than $$$O(\sqrt{N}\log{N})$$$ super-coins.

My goal now is to prove that in fact you can't create more than $$$O(\sqrt{N})$$$ super-coins. That means that the complexity of the solution is $$$O(N\sqrt{N})$$$.

Consider all super-coins $$$\langle cost, count \rangle$$$ where $$$cost$$$ is at least $$$\sqrt{N}$$$. Since the sum of all costs doesn't change and equals $$$N$$$, there can't be more than $$$O(\sqrt{N})$$$ of these super-coins.

Now the opposite, harder part: dealing with super-coins $$$\langle cost, count \rangle$$$ where $$$cost$$$ is less than $$$\sqrt{N}$$$. Each of them is created from $$$count$$$ initial coins of equal cost $$$cost/count$$$. That means that for each super-coin one can deduce what it was created from, so, by the algorithm, there are at most 2 equal super-coins. Now let's see what super-coins we could obtain. One can conclude from the algorithm that $$$count$$$ is always a power of 2, while $$$cost$$$ is some multiple of it. Therefore, we can simply list all possible super-coins in the notation $$$\langle cost, count \rangle$$$:

  • $$$\langle 1 \cdot 2^0, 2^0 \rangle$$$, $$$\langle 2 \cdot 2^0, 2^0 \rangle$$$, ..., $$$\langle \lfloor \sqrt{N} \rfloor \cdot 2^0, 2^0 \rangle$$$

  • $$$\langle 1 \cdot 2^1, 2^1 \rangle$$$, $$$\langle 2 \cdot 2^1, 2^1 \rangle$$$, ..., $$$\langle \lfloor \frac{\sqrt{N}}{2} \rfloor \cdot 2^1, 2^1 \rangle$$$

  • ...

  • $$$\langle 1 \cdot 2^{\lfloor \log_2{\sqrt{N}} \rfloor}, 2^{\lfloor \log_2{\sqrt{N}} \rfloor} \rangle$$$

There are only $$$\sqrt{N} + \frac{\sqrt{N}}{2} + \ldots + 1 \le 2\sqrt{N}$$$ of them, q.e.d.
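To make the bound concrete (a small example of my own, not from the original comment): for $$$N = 100$$$ we have $$$\sqrt{N} = 10$$$, so the rows above contain at most $$$10$$$, $$$5$$$, $$$2$$$ and $$$1$$$ super-coins respectively, $$$18 \le 2\sqrt{N} = 20$$$ in total.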

I hope the proof was understandable; here is the code for reference. Feel free to ask questions about the proof or anything else.
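In case the linked code is not accessible, below is a minimal sketch of this splitting approach as I read it (my own illustration, not the commenter's code; all names are made up). Each value $$$w$$$ occurring $$$o$$$ times is split into parts of sizes $$$1, 2, 4, \ldots$$$ plus a remainder, and each resulting super-coin $$$\langle cost, count \rangle$$$ becomes a single 0/1 item that contributes $$$count$$$ coins.

```cpp
#include <bits/stdc++.h>
using namespace std;

// dp[j] = minimum number of coins whose values sum to exactly j (INF if impossible).
vector<long long> minCoinsBySplitting(const vector<int>& coins, int N) {
    const long long INF = LLONG_MAX / 2;
    map<int, int> cnt;                         // coin value -> multiplicity
    for (int c : coins) cnt[c]++;

    // Binary splitting: value w with multiplicity o becomes super-coins
    // of sizes 1, 2, 4, ..., with the last part being the remainder.
    vector<pair<int, int>> super;              // <cost, count>
    for (auto [w, o] : cnt) {
        for (int part = 1; o > 0; part *= 2) {
            int take = min(part, o);
            super.push_back({w * take, take});
            o -= take;
        }
    }

    vector<long long> dp(N + 1, INF);
    dp[0] = 0;
    for (auto [cost, count] : super)           // O(sqrt(N)) super-coins by the proof above
        for (int j = N; j >= cost; j--)        // standard 0/1 knapsack pass
            dp[j] = min(dp[j], dp[j - cost] + count);
    return dp;
}
```

By the argument above there are only $$$O(\sqrt{N})$$$ super-coins and each 0/1 pass costs $$$O(N)$$$, which gives $$$O(N\sqrt{N})$$$ overall.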

»
10 months ago, # |

Is it possible to use the $$$\log N$$$ trick if we change the problem to counting the number of ways to create each total sum? I don't think it's possible, because the $$$\log N$$$ splitting will cause overcounting when $$$o$$$ is not of the form $$$2^x - 1$$$, but please correct me if I'm wrong.