I have solved the popular XOR maximization problem XMAX (XOR Maximization) using Gaussian elimination. [**Problem statement**: Given a set of integers S = { a1, a2, a3, ..., a|S| }, we define a function X on S as follows: X(S) = a1 ^ a2 ^ a3 ^ ... ^ a|S|. Given a set of N integers, compute the maximum of the X-function over all subsets of the given starting set.]
But when I apply the same code to XOR with Subset from CodeChef, it gets WA.
[**problem statement** : You have an array of integers A1, A2, ..., AN. The function F(P), where P is a subset of A, is defined as the XOR (represented by the symbol ⊕) of all the integers present in the subset. If P is empty, then F(P)=0.
Given an integer K, what is the maximum value of K ⊕ F(P), over all possible subsets P of A? ]
The only change I made was to initialize answer = K; then, after the Gaussian elimination, I greedily checked for each basis element whether taking it increases the answer.
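In outline, my approach looks something like this (a minimal sketch, not my actual submission; the I/O format is assumed, and I'm assuming the values fit in 63 bits):

```cpp
#include <bits/stdc++.h>
using namespace std;

int main() {
    int n;
    long long k;
    cin >> n >> k;
    vector<long long> a(n);
    for (auto &x : a) cin >> x;

    // Gaussian elimination over GF(2): afterwards the first `rows`
    // elements of `a` form a basis of the xor-span in row-echelon form.
    int rows = 0;
    for (int bit = 62; bit >= 0 && rows < n; --bit) {
        int pivot = -1;
        for (int i = rows; i < n; ++i)
            if (a[i] >> bit & 1) { pivot = i; break; }
        if (pivot == -1) continue;
        swap(a[rows], a[pivot]);
        for (int i = 0; i < n; ++i)
            if (i != rows && (a[i] >> bit & 1))
                a[i] ^= a[rows];
        ++rows;
    }

    // The answer starts at K (the empty subset gives K ^ 0 = K); take a
    // basis vector only when it improves the answer.
    long long ans = k;
    for (int i = 0; i < rows; ++i)
        if ((a[i] ^ ans) > ans)   // parentheses matter: > binds tighter than ^
            ans ^= a[i];

    cout << ans << '\n';
    return 0;
}
```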
My code: Link
Can anyone please tell me what's going wrong?
[Deleted]
Thank you, the DP solution is nice.
But I have just come across the blog Gaussian Elimination for XOR Maximization, where the two problems mentioned above are given as examples of Gaussian elimination, so I just wanted to see how Gauss works here.
You can also solve it using the main idea of Gauss plus some linear algebra.
Here is a solution using that idea: Link
Hey, I read your code for about 15 minutes and it seemed perfectly fine to me; then I noticed a tiny, silly little mistake:
On the 34th line you have an operator-precedence problem: instead of x^ans > ans you have to write (x^ans) > ans.
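To see why, here's a tiny demo (the values are just for illustration):

```cpp
#include <iostream>

int main() {
    long long x = 5, ans = 3;
    // `>` binds tighter than `^`, so without parentheses the expression
    // parses as x ^ (ans > ans) == x ^ 0 == x, which is nonzero (truthy)
    // for every nonzero x -- not the comparison you meant.
    std::cout << (x ^ ans > ans) << '\n';   // prints 5
    std::cout << ((x ^ ans) > ans) << '\n'; // prints 1 (6 > 3): the intended check
    return 0;
}
```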
Fixing that will give you AC.
PS: you could code the Gaussian elimination without building the actual matrix, simply by applying bit operators on the array. Good luck!
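Something like this, for instance (just a sketch of that idea; the function names are mine, and I'm assuming the values fit in 63 bits):

```cpp
#include <bits/stdc++.h>
using namespace std;

// basis[b] holds a vector whose highest set bit is b, or 0 if none yet.
long long basis[63];

// Insert x into the basis: reduce it against existing pivots; if it
// survives with some leading bit free, it becomes the pivot for that bit.
void insertVector(long long x) {
    for (int bit = 62; bit >= 0; --bit) {
        if (!(x >> bit & 1)) continue;
        if (!basis[bit]) { basis[bit] = x; return; }
        x ^= basis[bit];   // cancel the leading bit and keep reducing
    }
    // x reduced to 0: it was already representable, nothing to add.
}

// Maximum of k ^ (xor of some subset): greedily flip bits from high to low.
long long maxXorWith(long long k) {
    long long ans = k;
    for (int bit = 62; bit >= 0; --bit)
        if ((ans ^ basis[bit]) > ans)
            ans ^= basis[bit];
    return ans;
}

int main() {
    int n;
    long long k;
    cin >> n >> k;
    for (int i = 0; i < n; ++i) {
        long long x;
        cin >> x;
        insertVector(x);
    }
    cout << maxXorWith(k) << '\n';
    return 0;
}
```

Each number is reduced against the existing pivots as it arrives, so you never materialize a matrix; the array of pivots is the whole state.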
Oh no... thanks a lot, man!