Original problem: 1592E - Bored Bakry
I somehow misread the problem as this: you are required to find a subarray $$$[L, R]$$$ such that the value $$$(a_L \,\&\, a_{L+1} \,\&\, \dots \,\&\, a_R) - (a_L \oplus a_{L+1} \oplus \dots \oplus a_R)$$$ is maximized.
I could only come up with an $$$O(n \log n \log V)$$$ solution, where $$$V$$$ is the maximum value an $$$a_i$$$ can take.
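For reference, here is a hypothetical brute-force baseline just to pin down the objective (the name `brute_force` is my own, not from any actual submission):

```cpp
// Brute-force O(n^2) baseline: for every subarray [L, R], compute
// (AND of a[L..R]) - (XOR of a[L..R]) and take the maximum.
#include <algorithm>
#include <cstdint>
#include <vector>

int64_t brute_force(const std::vector<uint32_t>& a) {
    int n = (int)a.size();
    int64_t best = INT64_MIN;
    for (int L = 0; L < n; ++L) {
        uint32_t band = ~0u, bxor = 0;   // running AND and XOR for fixed L
        for (int R = L; R < n; ++R) {
            band &= a[R];
            bxor ^= a[R];
            best = std::max(best, (int64_t)band - (int64_t)bxor);
        }
    }
    return best;
}
```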
Here's my solution:
First, consider all possible subarrays with left endpoint $$$L$$$.
As $$$r$$$ increases, AND(L, r) is non-increasing, and every time it changes at least one bit is cleared, so it takes at most $$$\log V$$$ distinct values. We can therefore divide all possible $$$r$$$ into $$$O(\log V)$$$ segments, within each of which AND(L, r) is constant.
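A minimal sketch of this enumeration, assuming some `andq(l, r)` that answers range-AND queries in $$$O(1)$$$ (e.g. a sparse table; the name is hypothetical): since AND(L, r) only loses bits as $$$r$$$ grows, each segment's right end can be found by binary search.

```cpp
// Enumerate the O(log V) segments [start, end] on which AND(L, r) is
// constant, for a fixed L (1-indexed). O(log n) per segment.
#include <cstdint>
#include <functional>
#include <utility>
#include <vector>

std::vector<std::pair<int,int>> and_segments(
        int L, int n, const std::function<uint32_t(int,int)>& andq) {
    std::vector<std::pair<int,int>> segs;
    int r = L;
    while (r <= n) {
        uint32_t cur = andq(L, r);
        // find the last r' with AND(L, r') == cur; the predicate is
        // monotone because bits, once cleared, never come back
        int lo = r, hi = n;
        while (lo < hi) {
            int mid = (lo + hi + 1) / 2;
            if (andq(L, mid) == cur) lo = mid; else hi = mid - 1;
        }
        segs.push_back({r, lo});
        r = lo + 1;
    }
    return segs;
}
```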
Now we need to find the least value of XOR(L, r) within each segment. Note that XOR(L, r) = PREFIX_XOR(r) ^ PREFIX_XOR(L-1), so if we have a trie containing PREFIX_XOR(i) for every $$$i$$$ in the segment, walking down it according to the bits of PREFIX_XOR(L-1), preferring the matching child at each level, yields the least XOR.
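A sketch of such a binary trie with a min-XOR query (names and the bit width are my own choices):

```cpp
// Binary trie over B-bit values. min_xor(x) returns the minimum of
// (x ^ v) over all inserted v; the trie must be non-empty when queried.
#include <array>
#include <cstdint>
#include <vector>

struct Trie {
    static const int B = 20;                 // bit width; adjust to log V
    std::vector<std::array<int,2>> nxt;      // child indices, 0 = null
    Trie() : nxt(1, {0, 0}) {}               // node 0 is the root

    void insert(uint32_t v) {
        int u = 0;
        for (int b = B - 1; b >= 0; --b) {
            int d = (v >> b) & 1;
            if (!nxt[u][d]) { nxt[u][d] = (int)nxt.size(); nxt.push_back({0, 0}); }
            u = nxt[u][d];
        }
    }
    uint32_t min_xor(uint32_t x) const {
        int u = 0; uint32_t res = 0;
        for (int b = B - 1; b >= 0; --b) {
            int d = (x >> b) & 1;            // matching child => XOR bit 0
            if (nxt[u][d]) u = nxt[u][d];
            else { u = nxt[u][d ^ 1]; res |= 1u << b; }
        }
        return res;
    }
};
```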
However, we cannot afford to build a trie from scratch for every query. Instead, we can build a segment tree over $$$[1, n]$$$ and store a trie of the prefix XORs at each segment tree node; each query then decomposes into $$$O(\log n)$$$ nodes, each answered in $$$O(\log V)$$$. Since every prefix XOR is stored in $$$O(\log n)$$$ nodes, both the space and time complexity are $$$O(n \log n \log V)$$$.
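A sketch of this segment tree of tries, reusing the `Trie` struct from the previous sketch (all names hypothetical). Building inserts each prefix XOR into $$$O(\log n)$$$ tries, giving the stated bounds:

```cpp
// Segment tree over px[0..n-1] (the prefix XORs); each node holds a trie
// of its range. query(..., x) returns min of (x ^ px[i]) over i in [ql, qr].
#include <algorithm>
#include <cstdint>
#include <vector>

struct SegTrie {
    int n;
    std::vector<Trie> t;
    SegTrie(const std::vector<uint32_t>& px)
        : n((int)px.size()), t(4 * px.size()) { build(1, 0, n - 1, px); }

    void build(int u, int l, int r, const std::vector<uint32_t>& px) {
        for (int i = l; i <= r; ++i) t[u].insert(px[i]);  // O(len) inserts per node
        if (l == r) return;
        int m = (l + r) / 2;
        build(2 * u, l, m, px);
        build(2 * u + 1, m + 1, r, px);
    }
    uint32_t query(int u, int l, int r, int ql, int qr, uint32_t x) const {
        if (qr < l || r < ql) return UINT32_MAX;          // empty intersection
        if (ql <= l && r <= qr) return t[u].min_xor(x);   // node fully covered
        int m = (l + r) / 2;
        return std::min(query(2 * u, l, m, ql, qr, x),
                        query(2 * u + 1, m + 1, r, ql, qr, x));
    }
};
```

For each AND-constant segment of $$$r$$$ values, one call like `query(1, 0, n-1, ql, qr, PREFIX_XOR(L-1))` would then give the minimum XOR over that segment.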
I'm not sure whether algorithms with better complexity exist. Can you guys help me?