I am solving this problem: find the XOR sum of all subsets, where $$$n \lt 1000,\ \sum a_i \leq 2000000$$$. Since the sums are bounded, I use a bitset<2000001> to optimize the DP. Then something weird happened: I passed the problem with this code:
#include &lt;bits/stdc++.h&gt;
using namespace std;

int main() {
    ios_base::sync_with_stdio(false);
    cin.tie(NULL); cout.tie(NULL);
    int n; cin >> n;
    bitset<2000001> dp;
    dp[0] = 1;
    for (int i = 0; i < n; i++) {
        int x; cin >> x;
        dp = dp ^ (dp << x);  // dp[s] ^= dp[s - x]
    }
    int ans = 0;
    for (int j = 0; j <= 2000000; j++)
        if (dp[j])
            ans ^= j;
    cout << ans << '\n';
    return 0;
}
But with this code:
----- same ----
    vector<int> w(n);
    bitset<2000001> dp;
    dp[0] = 1;
    for (int i = 0; i < n; i++) {
        cin >> w[i];
        dp = dp ^ (dp << w[i]);
    }
----- same ----
I got WA on some test cases, where my output is always 0. Is there any difference between these two approaches? Why did it fail with the second code?