Hi,
I understand that the rule of thumb for a 1 sec per test case limit is to keep the number of operations up to about 5*10^8. On some platforms like tocoders, I guess it is 10*10^8. However, I feel it really depends on what kind of operations we are doing.
But after doing some digging, I found that a 1 sec/test case limit and a 2 sec/test case limit are practically the same. I have a problem where the input array size is 10^5 and the number of queries can also be up to 10^5. So if my complexity is O(n) per query, then across all the queries the number of operations can be up to 10^10, which should naturally take more than 1 sec, and hence the test case won't run within 1 sec. But I have been given 2 sec.
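Just to put rough numbers on my own estimate (the 5*10^8 operations per second figure is only the rule of thumb from above, not something I measured), a quick back-of-the-envelope sketch looks like this:

```cpp
// Back-of-the-envelope estimate using the rule-of-thumb throughput above.
#include <cstdio>

int main() {
    const double ops_per_sec = 5e8;        // assumed ~5*10^8 simple operations per second (not measured)
    const double total_ops   = 1e5 * 1e5;  // 1e5 queries, O(n) work per query with n = 1e5
    std::printf("estimated time: %.0f s\n", total_ops / ops_per_sec);  // prints about 20 s
    return 0;
}
```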
So my question is: can anyone please explain why it can't run even in 2 sec? I am a beginner, so please excuse me if I am asking something stupid or obvious.
Thanks.
If we can do 10^5 operations in 1 sec, then in y seconds we can do y*(10^5) operations, not 10^(5*y).
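As a small illustration of that linear scaling (the 5*10^8 ops/sec throughput below is just the rule-of-thumb assumption from the question, not a measured number):

```cpp
// The operation budget grows linearly with the time limit:
// y seconds -> y * ops_per_sec operations, not ops_per_sec ^ y.
#include <cstdio>

int main() {
    const double ops_per_sec = 5e8;   // assumed throughput (rule of thumb, not a measurement)
    const double needed      = 1e10;  // 1e5 queries * 1e5 operations per query
    for (int y = 1; y <= 4; ++y) {
        const double budget = y * ops_per_sec;
        std::printf("time limit %d s -> budget ~%.0e ops, needed %.0e ops -> %s\n",
                    y, budget, needed, budget >= needed ? "fits" : "too slow");
    }
    return 0;
}
```

Even with a 2 sec limit the budget is only about 10^9 operations, which is still an order of magnitude short of the 10^10 needed.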