I was solving what looked like a fairly standard persistent segment tree problem, but I came across a solution that seems to use some DFS ordering of nodes, and I can't figure out what's going on.
Problem: Given a tree with 100000 nodes, each node holding a value, answer queries of the form: given two nodes u and v and a value V, find the number of nodes on the path from u to v whose value is less than V.
This is my first time on Codeforces. Thanks in advance!
We basically split the query on the path from u to v into the paths root to u, root to v, and root to lca(u, v): if f(x) denotes the count of qualifying values on the path from the root to x (inclusive), the answer is f(u) + f(v) - 2*f(lca), plus 1 if the value at the lca itself is less than V (the lca gets cancelled out by the subtraction). To compute these, we process the queries offline: for each node, we first store the value ranges (l, r) of all the query contributions that must be answered at that node. Notice that if all the values on the path from the root to a node v are present in a BIT indexed by value, then each of those contributions is a single prefix (range) query. So we just perform a DFS, add a node's value to the BIT at the beginning of its DFS call, and remove it from the BIT at the end of the call. This ensures that when we are at a node, exactly its ancestors (and the node itself) are in the BIT, so we can answer all the contributions stored at that node right there.
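Here is a minimal sketch of that offline DFS + BIT idea, assuming the values are already compressed to 1..n. The toy tree, the naive lca() (fine for a small example; use binary lifting for n = 100000), and the names `pending`/`ans` are my own illustration, not the linked solution:

```cpp
#include <bits/stdc++.h>
using namespace std;

int n;
vector<int> fen, val, par, dep;
vector<vector<int>> adj;
vector<vector<array<int, 3>>> pending;   // per node: {query id, V, sign}
vector<long long> ans;

void update(int i, int d) { for (; i <= n; i += i & -i) fen[i] += d; }
int query(int i) { int s = 0; for (; i > 0; i -= i & -i) s += fen[i]; return s; }

void dfs(int u, int p) {
    update(val[u], +1);                  // BIT now holds root..u inclusive
    for (auto [id, V, sign] : pending[u])
        ans[id] += (long long)sign * query(V - 1);  // # of values < V on root..u
    for (int w : adj[u]) if (w != p) dfs(w, u);
    update(val[u], -1);                  // undo before leaving u's subtree
}

void depths(int u, int p, int d) {
    par[u] = p; dep[u] = d;
    for (int w : adj[u]) if (w != p) depths(w, u, d + 1);
}

int lca(int u, int v) {                  // naive O(depth) LCA, for the toy example
    while (dep[u] > dep[v]) u = par[u];
    while (dep[v] > dep[u]) v = par[v];
    while (u != v) { u = par[u]; v = par[v]; }
    return u;
}

int main() {
    // Toy tree: edges 1-2, 1-3, 2-4, 2-5 with values 3, 1, 4, 2, 5.
    n = 5;
    fen.assign(n + 1, 0);
    val = {0, 3, 1, 4, 2, 5};            // 1-indexed
    adj.assign(n + 1, {});
    par.assign(n + 1, 0); dep.assign(n + 1, 0);
    pending.assign(n + 1, {});
    int edges[][2] = {{1, 2}, {1, 3}, {2, 4}, {2, 5}};
    for (auto [a, b] : edges) { adj[a].push_back(b); adj[b].push_back(a); }
    depths(1, 0, 0);

    // One query: path 4..3, count values < 4. Path nodes {4,2,1,3} carry
    // values {2,1,3,4}, so the expected answer is 3.
    int u = 4, v = 3, V = 4, id = 0;
    ans.assign(1, 0);
    int w = lca(u, v);
    pending[u].push_back({id, V, +1});
    pending[v].push_back({id, V, +1});
    pending[w].push_back({id, V, -2});
    ans[id] += (val[w] < V);             // lca is cancelled out; add it back

    dfs(1, 0);
    cout << ans[0] << "\n";              // prints 3
}
```

The +1/+1/-2 signs implement the decomposition: root..u and root..v each count the root..lca prefix once, so subtracting 2*f(lca) removes it entirely, and the lca's own value is re-added by hand.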
Edit: same as above.