This is a general question that came up while solving the Div2 1000-point problem from SRM 735 on TopCoder in practice. I couldn't come up with an efficient solution, so I read the editorial. I understood the approach, but a series of operations in it was implemented using a Fenwick tree.

There were two kinds of operations: counting the number of elements less than a given value at a given instant, and inserting new elements (i.e., updating the counts). I have read about the policy-based data structure (the order-statistic tree), and I thought it might also be a good fit for this sequence of operations, since it supports both insertion and finding the order of a key (the number of elements less than a given element) in O(lg n). Each Fenwick tree operation also costs O(lg n), so the overall complexity with either structure comes out to O(m*n*lg n), where n <= 1e5 and m <= 50.

But the maximum time over all test cases with approach 1 (Fenwick tree) was 139 ms, whereas with approach 2 (order-statistic tree) it was 1.489 s. In fact, with approach 2 several more cases took longer than 1 s.

Here is my code using the fenwick tree, and here is the one with the order-statistic tree.

I haven't solved many problems with the order-statistic tree. Indeed, I don't think it is that popular a data structure, and the problem setter usually has some other data structure or technique in mind when preparing a problem. So my question is: given a problem that I know can be solved within the asymptotic bounds using an order-statistic tree, should I actually use it, or should I look for alternative solutions and abandon the idea of using it entirely?

To put it more clearly: can I trust the O(lg n) bound per operation that is specified for this data structure?

Is anyone familiar with these data structures? Feel free to share your opinions. I am not asking for a strict rule, just opinions.

"Shall I try to use it or try to find alternative solutions to solve the problem and completely abandon the idea of using them"

Well, it depends. I can think of a situation in which you can use an order-statistic tree but not a Fenwick tree. Say we have a "tree in each segment tree node" solution with a large n: we cannot possibly allocate an O(n) BIT in every node, because that would take O(n^2) memory, which can be huge. Using an order-statistic tree per node brings the total memory down to O(n * logn), since each element is stored in only O(logn) nodes.

Well, I actually haven't solved any problem of the kind you mentioned, but it does give me an instance where I might want to use the order-statistic tree to my advantage when solving a problem with this technique. Thanks. :)

Yes, you can. It's just that a huge constant factor hidden behind that bound makes your program slow. Generally, hand-written structures tend to outperform their STL analogues (though there are exceptions as well).

Yes, I also think the poor performance compared to the other data structure comes down to the larger constant factor relative to the hand-written one. I'll keep this point in mind before finalizing my decision to use order-statistic trees. Thanks. :)