Today there was Google's Kickstart Round H and I have a doubt about question 3 of it; here's the link to the problem. I solved it using a doubly linked list data structure for efficient replacements, and one can easily see that the overall time complexity is O(n); here is the link to my code. Can someone please tell me the reason for the TLE on large inputs?
It can't hurt to try fast IO.
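For reference, this is the usual fast-IO setup in C++ (just a generic sketch, not tied to your solution; the input/output here is a placeholder, not the actual problem's format):

```cpp
#include <bits/stdc++.h>
using namespace std;

int main() {
    // Standard fast-IO setup: stop syncing with C stdio and untie cin from cout.
    ios_base::sync_with_stdio(false);
    cin.tie(nullptr);

    long long x;
    cin >> x;           // placeholder input, not the actual problem's format
    cout << x << '\n';  // prefer '\n' over endl to avoid flushing on every line
    return 0;
}
```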
I used fast IO as well, but there was no improvement.
Have you tried replacing unordered_set with set? unordered_set is unreliable and hackable: with adversarial inputs its operations can degrade from O(1) to O(n) each.
UPD. unordered_map too.
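If you want to keep the unordered containers, the usual workaround is a randomized custom hash (a generic sketch based on splitmix64, not taken from your code); otherwise switching to set/map costs a log factor but behaves predictably:

```cpp
#include <bits/stdc++.h>
using namespace std;

// Randomized hash (splitmix64) so unordered containers resist anti-hash tests.
struct custom_hash {
    static uint64_t splitmix64(uint64_t x) {
        x += 0x9e3779b97f4a7c15;
        x = (x ^ (x >> 30)) * 0xbf58476d1ce4e5b9;
        x = (x ^ (x >> 27)) * 0x94d049bb133111eb;
        return x ^ (x >> 31);
    }
    size_t operator()(uint64_t x) const {
        // Per-run random offset so the hash cannot be precomputed against.
        static const uint64_t FIXED_RANDOM =
            chrono::steady_clock::now().time_since_epoch().count();
        return splitmix64(x + FIXED_RANDOM);
    }
};

int main() {
    // Example usage: unordered_map keyed by long long with the custom hash.
    unordered_map<long long, int, custom_hash> freq;
    freq[42]++;
    cout << freq[42] << '\n';
    return 0;
}
```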
Yeah, at first I used the ordered ones, but then I thought the TLE might be because of the extra log n factor, so I switched to the unordered ones.
As a start: you're iterating through the entire list every time you check its length.
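Without having seen your exact code, the usual fix is to maintain the length as a counter that is updated on every insertion and removal, so checking it is O(1). A minimal hypothetical sketch (the names are mine, not from your solution):

```cpp
#include <bits/stdc++.h>
using namespace std;

// Hypothetical doubly linked list that keeps an explicit size counter,
// so querying the length never requires a traversal.
struct Node {
    int val;
    Node *prev = nullptr, *next = nullptr;
    Node(int v) : val(v) {}
};

struct DList {
    Node *head = nullptr, *tail = nullptr;
    int sz = 0;  // maintained on every push_back/erase

    void push_back(int v) {
        Node *node = new Node(v);
        if (!tail) head = tail = node;
        else { tail->next = node; node->prev = tail; tail = node; }
        ++sz;
    }

    void erase(Node *node) {
        // Unlink the node from its neighbours, fixing head/tail if needed.
        if (node->prev) node->prev->next = node->next; else head = node->next;
        if (node->next) node->next->prev = node->prev; else tail = node->prev;
        delete node;
        --sz;
    }

    int size() const { return sz; }  // O(1), no walk over the list
};
```

With the counter maintained this way, each length check drops from O(length) to O(1), which removes the hidden quadratic factor from the solution.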
Yeah, maybe that is the issue. I hadn't considered it a major drawback, but as you said it can cause TLE. I will try to optimize this and update again, thank you.
Hey, it worked, thank you so much! Here is the code.