akshit5638's blog

By akshit5638, history, 4 days ago, In English

We have an array of size n. Each element of the array is between 1 and 10. We are also given an integer k. We can divide the array into at most k subsets; after dividing, each element should belong to exactly one subset. We have to tell whether it is possible to divide the array so that every subset has sum <= 10.

constraints:

test cases: t <= 100

n <= 1000

1 <= a[i] <= 10

1 <= k <= 100

example

t=1

n=12

2 1 2 5 4 3 6 1 1 9 3 2

k=4

then a valid division is

[9,1],[6,1,3],[5,2,3],[4,2,2,1]


»
4 days ago, # |

Google ?

  • »
    »
    4 days ago, # ^ |

    Goldman Sachs

»
4 days ago, # |

This showed up in my online assessment too. I feel the test cases for this problem were quite weak, because what I did was sort the array in descending order and keep a bags array B. Then for each element A[i] I try to find some bag in B that A[i] can fit into; if some A[i] can't fit anywhere, it's false, else true. This solution is definitely wrong, but it somehow passed all the test cases.

  • »
    »
    4 days ago, # ^ |

    Yes, people have passed with a greedy approach, but it is not correct. I have tried 3-4 different greedy approaches and DP too, but can't find a way.

  • »
    »
    4 days ago, # ^ |

    I am not sure if this is correct either, but I was thinking it could be solved like this: find all pairs with sum 10 and remove them, then find all pairs with sum 9 and remove them, and so on. Now if the total number of pairs is more than the number of bags then false, else yes. If I am not wrong, you can find the number of pairs with a particular sum in O(n), so it shouldn't cost more than 1e4 per test case overall.

    • »
      »
      »
      4 days ago, # ^ |

      But it's not certain that removing pairs is optimal; we could use triplets and larger groups to make the sum.

      • »
        »
        »
        »
        4 days ago, # ^ |

        We could use a while loop with a multiset, taking a lower_bound on it for the required remainder each time, until the lower_bound would make the sum exceed 10.

        • »
          »
          »
          »
          »
          4 days ago, # ^ |

          Can you explain in more detail?

          • »
            »
            »
            »
            »
            »
            4 days ago, # ^ |

            Take the elements as negatives and find the smallest (greatest in absolute value) element for our target, which keeps changing: we start with target = 10, get 9 from the multiset, remove it, change the target to 1, and get 1. Do this while we still have bags left. If the bags run out, return false; if the elements run out first, return true. It may work, I haven't tried it.

            • »
              »
              »
              »
              »
              »
              »
              4 days ago, # ^ |

              5 4 4 3 3 2 and 2 bags

              by multiset approach we get [5,4],[4,3,3],[2]

              but it is [5,3,2],[4,3,3]

              • »
                »
                »
                »
                »
                »
                »
                »
                5 hours ago, # ^ |

                Well, with the two 4's the total is 21, so that instance cannot fit in two bags at all; but yes, the approach is wrong: for [6,3,5,2,2,2] with 2 bags the multiset approach gives [6,3],[5,2,2],[2], while [6,2,2],[5,3,2] works.

»
4 days ago, # |

This question is similar to the bin packing problem, but there is an upper bound on the number of bins to be used.

The original bin packing problem is NP-complete, but I am not sure about this variant. One possible way to find the answer (yes or no) is to run multiple iterations. I don't know if this would work, but we can try and have a high chance of succeeding. Repeat this process multiple times:
1. Shuffle the original array randomly (you can also try the descending order once).
2. Use the best-fit algorithm for bin packing to find a possible arrangement.
3. If the best-fit algorithm uses at most k bins, the answer is yes. Otherwise, if best fit returns b bins, the optimal answer is at least (b-1)/1.7. If this value is greater than k, the answer is no.

If you can't find a sure answer after all these iterations, then answer no.

  • »
    »
    3 days ago, # ^ |

    Thanks. I guess they just wanted a greedy solution, as it seems impossible to solve this in less than exponential time.

»
3 days ago, # |

I think I have a greedy approach, but I'm not sure whether it can go wrong. We can sort the array in descending order; for your example: 9 6 5 4 3 3 2 2 2 1 1 1. Since we don't need to maintain the subsets themselves, we can instead maintain a vector of exactly k elements (the maximum number of subsets) representing the capacity left in each subset after filling (initially 10), filled in the order described below.

Order

The idea is to keep a pointer i = 0 over the subsets, and move a pointer j (initially 0, the jth element of the sorted array) forward while subset i's sum stays <= 10, at each step decreasing sub[i] (the remaining capacity of subset i) by v[j]. If at some step sub[i] - v[j] < 0, we move pointer i to the next subset (and if i exceeds k, we break the loop).

Now for every remaining j up to n-1, we need to put the leftover elements into any subset with remaining capacity >= 1. We can do this by maintaining a map from remaining capacity to count, and for each j up to n-1 checking whether a lower_bound of v[j] exists and decreasing its count. The last part of the process may be inefficient; I didn't put much thought into optimizing it.

»
3 days ago, # |

If I'm not wrong, this problem is analogous to the Elevator Rides problem on the CSES problem set. Assume the numbers in the array are weights of people who have to be transported by a lift. The lift can only take a certain amount of load (which is 10 in your problem). You need to find the minimum number of lift rides required to transport all the people; if that number is <= k, then it's possible, otherwise not. You can look up the DP solution for this, which uses bitmasking. Also take a look at this tutorial.

  • »
    »
    3 days ago, # ^ |

    The problem is the same, but the constraint on n is 1000, so bitmask DP is not possible.

  • »
    »
    3 days ago, # ^ |

    That problem is solved in $$$O(2^{N}*N)$$$, here they are expecting a time complexity of $$$O(N^2)$$$.

    • »
      »
      »
      3 days ago, # ^ |

      Oh, got it, thanks. Then I guess the best approach is the one suggested above about bin packing.

»
3 hours ago, # |

Maybe I am wrong, but still, I came up with this: I sum elements up to make 10 first, then 9, then ... down to 1. So first I calculate the frequency of each element, e.g.:

1 — 30

2 — 10

3 — 4

5 — 6

7 — 3

9 — 8

For 10 I will just add those elements on their own. For the other values, only their sum matters as long as it stays below 10: from 1's I can make sums 1, 2, 3, ..., 10; from 2's: 2, 4, 6, 8, 10; and so on. So I will need to consider taking some number of 1's, some number of 2's, some number of 3's, ..., and remove every combination that sums to 10, then do the same down to sum 1.

For this I could use a bitmask DP where I need to ensure I only take one option for each value from 1 to 10, i.e. I can't take "three 1's" and "two 1's" at the same time.