
[Help] Divide an array into two subsequences such that the sum of their xor sums is maximum

Revision en4, by SuperSheep, 2023-01-24 04:58:37

I'm trying to solve the problem 103708A - Anya's gifts. More specifically, we need to do the following.

Given an array $A$ of length $n$ ($n \le 10^5$, $a_i < 2^{50}$), divide it into two subsequences $X$ and $Y$ such that $(x_1 \oplus x_2 \oplus \dots \oplus x_{|X|}) + (y_1 \oplus y_2 \oplus \dots \oplus y_{|Y|})$ is maximum. Print the maximum $\mathrm{xor\_sum}(X) + \mathrm{xor\_sum}(Y)$.

It is obvious that for each $2^p$, if the number of elements $a_i$ such that $a_i \text{ and } 2^p = 2^p$ is odd, the answer will always have its $p$-th bit on. I don't know how to handle the case where it's even.
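For reference, here is a minimal C++ sketch of that observation only (the input format, $n$ followed by the $n$ values, is my assumption and not taken from the problem statement); it prints just the guaranteed contribution of the odd-count bits, not the full answer:

```cpp
#include <bits/stdc++.h>
using namespace std;

int main() {
    ios::sync_with_stdio(false);
    cin.tie(nullptr);

    // Assumed input format: n, followed by the n values a_i.
    int n;
    cin >> n;

    // XOR of the whole array: bit p is set here exactly when the number of
    // elements with bit p set is odd.
    long long total = 0;
    for (int i = 0; i < n; i++) {
        long long x;
        cin >> x;
        total ^= x;
    }

    // However we split A into X and Y, each odd-count bit ends up in exactly
    // one of xor_sum(X), xor_sum(Y), so it is always contributed to the sum.
    // This is only a lower bound on the answer; handling the bits with an
    // even count is the open question in this post.
    cout << total << '\n';
    return 0;
}
```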

Any help would be appreciated. Thanks!

Tags xor, xorsum
