hmehta didn't make an announcement on CF :/
SRM 756 will be held today (in less than 30 minutes). At least the TopCoder site says so.
Time: 16:30 UTC+5:30, April 25, 2019
Let's discuss the problems after the contest.
The round was written by IH19980412. A draft of the editorial can be found here.
Thanks for the nice problemset!
I liked how elegantly the inclusion-exclusion principle works in Div1 Hard. Using a De Bruijn sequence, we can improve the solution to $$$O(2^N + HW)$$$, which is faster than the naive $$$O(2^N \times N + HW)$$$.
Unless I'm mistaken, you mean Gray codes and not a De Bruijn sequence. At least that's what the technique I have in mind uses: you order all subsets in such a way that every two consecutive subsets differ only by adding or removing one item, which allows you to compute the value for the new subset quickly by just updating the value of the old one.
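For illustration, here is a minimal sketch of that ordering (not the actual solution): the per-subset value is just a sum of made-up item weights, standing in for whatever quantity the real solution maintains, and each step updates it in $$$O(1)$$$ instead of $$$O(N)$$$.

```cpp
#include <bits/stdc++.h>
using namespace std;

int main() {
    // Visit all 2^n subsets in Gray-code order, so that consecutive subsets
    // differ in exactly one element and the running value can be updated
    // incrementally instead of being recomputed from scratch.
    int n = 4;
    vector<long long> weight = {3, 1, 4, 1};  // hypothetical per-item values

    long long cur = 0;   // value of the current subset (empty subset to start)
    int prev = 0;        // Gray code of the previous subset
    for (int i = 1; i < (1 << n); ++i) {
        int gray = i ^ (i >> 1);                  // i-th Gray code
        int bit = __builtin_ctz(gray ^ prev);     // the single flipped element
        if (gray >> bit & 1) cur += weight[bit];  // element was added
        else                 cur -= weight[bit];  // element was removed
        // here "cur" equals the value of the subset encoded by "gray"
        prev = gray;
    }
    return 0;
}
```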
Alternative solution for Hard: go through the table from top to bottom and maintain a DP whose state is the vector of degrees of all special cells. It works in $$$nm4^k$$$, but we can optimize it: if we are now in row $$$i$$$, we only need to know about cells in rows $$$i-1, i, i+1$$$. It is still slow if there are a lot of special cells in three adjacent rows, but in that case there cannot be a lot of special cells in any three adjacent columns, so let's just transpose the table.
My solution for Div1 Hard: consider the neighbors of the undetermined cells. Intuitively it appears that there can't be many undetermined cells with more than one special neighbor (I can't give the exact maximum, but the worst test case I came up with had 25 such cells). Thus, we can iterate over all combinations of states for those cells, and count the ways with respect to the other undetermined cells in $$$O(2^K \cdot S)$$$, where $$$K$$$ is the number of undetermined cells with more than one special neighbor, and $$$S$$$ is the number of special cells. The complexity may be worse than the intended solution's, but it avoids the inclusion-exclusion principle.
I had the same idea, but I seem to have a mistake in counting the number of valid ways after picking a state for the undetermined cells with more than one special neighbor. Could you provide code for your solution?
What I do: for each special cell, count the number of active neighbor cells (call it activeCnt) and the number of undetermined neighbor cells (call it undetCnt). If activeCnt > 3, the whole field is invalid. Otherwise I take the sum of $$$C(undetCnt, i)$$$ over $$$i \in [0, \min(undetCnt, 3 - activeCnt)]$$$ and multiply it into the final result, which is initialized to 1.
For each special cell I count the number of active neighbor cells, and the number of undetermined neighbor cells that are connected only to it. Let $$$Q$$$ be the count of undetermined cells that aren't neighboring any special cell, and $$$S$$$ be the count of special cells. The total number of ways to choose a state for the remaining undetermined cells equals $$$2^Q \cdot \prod_{i=1}^{S} ways(i)$$$, where $$$ways(i)$$$ is the number of ways to choose a state for the undetermined neighbor cells that neighbor exclusively the special cell $$$i$$$. Let $$$c_i$$$ be the count of such cells for special cell $$$i$$$. Then $$$ways(i) = 2^{c_i}$$$ if the current count of active neighbors of special cell $$$i$$$ is less than $$$4-c_i$$$, and $$$ways(i) = 2^{c_i}-1$$$ if that count is equal to $$$4-c_i$$$.
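For concreteness, a minimal sketch of that formula, evaluated for one fixed assignment of the cells with several special neighbors; the function name, input arrays, and modulus are my own assumptions, not code from the round.

```cpp
#include <bits/stdc++.h>
using namespace std;

const long long MOD = 1'000'000'007;   // assumed modulus

long long pw(long long b, long long e) {          // b^e mod MOD
    long long r = 1;
    for (b %= MOD; e > 0; e >>= 1, b = b * b % MOD)
        if (e & 1) r = r * b % MOD;
    return r;
}

// active[i]: number of already-active neighbors of special cell i
// excl[i]:   number of undetermined neighbors belonging only to cell i (c_i)
// Q:         number of undetermined cells with no special neighbor
long long countWays(const vector<int>& active, const vector<int>& excl, int Q) {
    long long res = pw(2, Q);                     // unconstrained cells: 2 choices each
    for (size_t i = 0; i < active.size(); ++i) {
        long long ways = pw(2, excl[i]);          // any subset of the exclusive neighbors...
        if (active[i] + excl[i] == 4)             // ...unless all 4 neighbors could end up active:
            ways = (ways - 1 + MOD) % MOD;        // then forbid the "all active" choice
        res = res * ways % MOD;
    }
    return res;
}
```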
Consider a test where 20 special cells are arranged diagonally. Then there would be 38 such cells.
Thanks for pointing out that test case, I hadn't thought of it.
Hi everyone, I am certainly missing something very obvious in problem NewBanknote. My idea is: add the new banknote to the given denominations and sort them. For any given value, I try all the denominations recursively and take the minimum. My solution is failing on some values, off by one, which I believe are tailored for this problem. I am not looking for a solution, but a hint about what is wrong with my thinking. A small test case would be great.
Try newBanknote = 50001, amountsToPay = 200002.
Thank you very much, now I see my mistake. I have another noob question: I assume this problem is related to coin change, which is usually solved with dynamic programming, but it can be solved greedily for some special sets of denominations. I am wondering what kind of relation/structure/constraints/pattern on the coin denominations makes the greedy approach work?
A set of coins is termed "canonical" if the greedy algorithm always produces an optimal solution. There is a paper on verifying whether a set of coins is canonical. Link
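As a small illustration (my own sketch, not from the paper): the greedy algorithm simply keeps taking the largest denomination that still fits, and it is optimal exactly for canonical coin systems.

```cpp
#include <bits/stdc++.h>
using namespace std;

// Greedy coin change: repeatedly take the largest denomination that fits.
// Optimal only for canonical systems; returns -1 if the amount can't be paid.
long long greedyCoins(vector<long long> coins, long long amount) {
    sort(coins.rbegin(), coins.rend());           // largest denomination first
    long long used = 0;
    for (long long c : coins) {
        used += amount / c;
        amount %= c;
    }
    return amount == 0 ? used : -1;
}

int main() {
    // {1, 5, 10, 25} is canonical: greedy pays 63 with 6 coins, the optimum.
    cout << greedyCoins({1, 5, 10, 25}, 63) << "\n";
    // {1, 3, 4} is not canonical: greedy pays 6 with 3 coins (4+1+1),
    // while the optimum is 2 coins (3+3).
    cout << greedyCoins({1, 3, 4}, 6) << "\n";
}
```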
For a while already, I've wondered about the problem which appeared as Div1 Medium in this round: given a string, decide whether it can be split into two disjoint identical subsequences.
In the round, the size was only up to 40, which allowed for multiple exponential solutions. The question is: is it solvable in polynomial time? It certainly is when we know the subsequence to search for. So far, I haven't found anything conclusive.
The problem statement is so short that one would think it had been studied decades ago, with some definite result. I'll be grateful if someone sheds some light on the matter.
Here is a link I found, https://cstheory.stackexchange.com/questions/34/how-hard-is-unshuffling-a-string, which says it is NP-hard.
Yay, many thanks for the find!
Amazing that the answer was found only recently.