Hey guys! My professor suggested I solve this problem. Given a number $$$p \le 10^7$$$, print the sum of the digits of $$$2^p$$$. For example, $$$2^4 = 16$$$, so we'd print 7. How can I do this properly? I first tried to use Fermat's little theorem in one way or another, but it made no sense here.
Well, since $$$2^p$$$ will have fewer than $$$10^7$$$ digits, you can compute $$$2^p$$$ using FFT-based multiplication with complexity $$$O(N\log^2 N)$$$. Then you simply find the sum of the digits.
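In case a concrete picture helps, here is a minimal sketch of what "compute $$$2^p$$$ with FFT-based multiplication" can look like (my code, not a reference solution; the helper names `fft` and `multiply` are mine). It stores the number as a little-endian vector of decimal digits and uses binary exponentiation. Fair warning: with base-10 digits and plain doubles the floating-point precision gets tight for numbers with close to $$$10^7$$$ digits, so a real submission would likely switch to an NTT or split the coefficients.

```cpp
#include <bits/stdc++.h>
using namespace std;

typedef complex<double> cd;

// in-place iterative Cooley-Tukey FFT; invert = true computes the inverse transform
void fft(vector<cd>& a, bool invert) {
    int n = a.size();
    // bit-reversal permutation
    for (int i = 1, j = 0; i < n; i++) {
        int bit = n >> 1;
        for (; j & bit; bit >>= 1) j ^= bit;
        j ^= bit;
        if (i < j) swap(a[i], a[j]);
    }
    for (int len = 2; len <= n; len <<= 1) {
        double ang = 2 * acos(-1.0) / len * (invert ? -1 : 1);
        cd wlen(cos(ang), sin(ang));
        for (int i = 0; i < n; i += len) {
            cd w(1);
            for (int j = 0; j < len / 2; j++) {
                cd u = a[i + j], v = a[i + j + len / 2] * w;
                a[i + j] = u + v;
                a[i + j + len / 2] = u - v;
                w *= wlen;
            }
        }
    }
    if (invert)
        for (cd& x : a) x /= n;
}

// multiply two big numbers given as little-endian vectors of decimal digits
vector<int> multiply(const vector<int>& a, const vector<int>& b) {
    vector<cd> fa(a.begin(), a.end()), fb(b.begin(), b.end());
    int n = 1;
    while (n < (int)(a.size() + b.size())) n <<= 1;
    fa.resize(n);
    fb.resize(n);
    fft(fa, false);
    fft(fb, false);
    for (int i = 0; i < n; i++) fa[i] *= fb[i];
    fft(fa, true);
    vector<int> res(n);
    long long carry = 0;
    for (int i = 0; i < n; i++) {
        long long cur = llround(fa[i].real()) + carry;
        res[i] = cur % 10;
        carry = cur / 10;
    }
    while (res.size() > 1 && res.back() == 0) res.pop_back();
    return res;
}

int main() {
    long long p;
    cin >> p;
    vector<int> result = {1}, base = {2};  // little-endian decimal digits
    while (p > 0) {                        // binary exponentiation: result = 2^p
        if (p & 1) result = multiply(result, base);
        base = multiply(base, base);
        p >>= 1;
    }
    long long sum = 0;
    for (int d : result) sum += d;
    cout << sum << "\n";
}
```

For $$$p = 4$$$ this prints 7, matching the example in the question.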
So is this the only solution? I was wondering if there is a purely mathematical solution for every base, not only 2.
$$$2^{10^7}$$$ has 3010300 decimal digits, which is greater than $$$10^7$$$, so how can you say that $$$2^p$$$ has fewer than $$$10^7$$$ digits??
3010300 has only 7 digits, how can it be larger than $$$10^7$$$?
I didn't mean that. $$$2^{10^7}$$$ has 3010300 digits in its decimal representation.
$$$10^{10^7}$$$ will have exactly $$$10^7 + 1$$$ digits, and since $$$2 < 10$$$, $$$2^{10^7}$$$ will have fewer than $$$10^7 + 1$$$ digits.
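For reference, the exact count quoted above comes from the standard digit-count formula (not stated explicitly in this thread): a positive integer $$$x$$$ has $$$\lfloor \log_{10} x \rfloor + 1$$$ decimal digits, so $$$2^{10^7}$$$ has $$$\lfloor 10^7 \log_{10} 2 \rfloor + 1 = 3010300$$$ digits.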
Even better. Realize that $$$1\log 1 + 2\log 2 + 4\log 4 + 8\log 8 + \ldots + n\log n$$$ is $$$O(n\log n)$$$, thus your solution is $$$O(n\log n)$$$.
Can you please explain why this formula is correct? Something is very wrong about it.
$$$N + N/2 + N/4 + N/8 + N/16 + \ldots$$$ converges to $$$2N$$$.
Well, yeah, this one is true, but the formula must be $$$\sum_i (n/i)\cdot i \cdot \log i$$$ over all powers of two $$$i \le n$$$, which is $$$O(n\log^2 n)$$$. You can even apply the master theorem here: $$$f(n) = 2f(n/2) + O(n\log n) = O(n\log^2 n)$$$.
No, it's simple. All logs in that sum are at most $$$\log N$$$, so you can estimate that it's $$$\le 1 \log N + 2 \log N + 4\log N + \ldots + (N/2) \log N + N \log N$$$ $$$= \log N (1 + 2 + 4 + \ldots + N/2 + N) < 2N\log N$$$.
But it's not $$$1+2+4+\ldots+n$$$, because our sum is $$$\sum (n/i) \cdot i \cdot \log i = n \sum \log i \geq n \cdot (\log(n)/2)^2$$$, which is $$$\Omega(n\log^2 n)$$$.
What do the terms in that sum correspond to? It clearly can't be computing $$$2^1, 2^2, 2^3, \ldots$$$ step by step, because each term is at least linear in $$$N$$$.
My algorithm is a little different from binary exponentiation. It can multiply any $$$n$$$ short numbers, so it takes $$$O(n\log^2 n)$$$. And yes, binary exponentiation will take only $$$O(n\log n)$$$ time.
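A rough sketch of that "multiply $$$n$$$ short numbers" idea (my reconstruction, so the structure and names are guesses, not the commenter's code): a balanced product tree over the factors. Schoolbook multiplication keeps the sketch short; plugging in an FFT multiply instead is what gives the $$$\sum (n/i)\cdot i\log i = O(n\log^2 n)$$$ cost discussed above.

```cpp
#include <bits/stdc++.h>
using namespace std;

// little-endian decimal digits; plain O(|a|*|b|) schoolbook multiplication
vector<int> mul(const vector<int>& a, const vector<int>& b) {
    vector<int> c(a.size() + b.size(), 0);
    for (size_t i = 0; i < a.size(); i++)
        for (size_t j = 0; j < b.size(); j++)
            c[i + j] += a[i] * b[j];
    for (size_t i = 0; i + 1 < c.size(); i++) {  // propagate carries
        c[i + 1] += c[i] / 10;
        c[i] %= 10;
    }
    while (c.size() > 1 && c.back() == 0) c.pop_back();
    return c;
}

// product of factors[l..r) via a balanced product tree
vector<int> prod(const vector<vector<int>>& factors, int l, int r) {
    if (r - l == 1) return factors[l];
    int m = (l + r) / 2;
    return mul(prod(factors, l, m), prod(factors, m, r));
}

int main() {
    int p;
    cin >> p;
    if (p == 0) { cout << 1 << "\n"; return 0; }  // 2^0 = 1
    vector<vector<int>> twos(p, vector<int>{2});  // p single-digit factors of 2
    vector<int> result = prod(twos, 0, p);
    long long sum = 0;
    for (int d : result) sum += d;
    cout << sum << "\n";
}
```

At the level of the tree where the blocks have length about $$$i$$$ there are about $$$n/i$$$ multiplications, each costing $$$O(i\log i)$$$ with an FFT multiply; those are exactly the terms of the sum being debated.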
Ok, so now you know that multiplying many small numbers isn't a great idea.
You don't need to do two recursive calls to do binary exponentiation.
Using your logic, plain binary exponentiation works in $$$f(n) = 2f(n/2) + O(1) = O(n)$$$.
Wait, that's illegal.
How is it possible to multiply two numbers of length $$$n/2$$$ in $$$O(1)$$$? I think the recurrence must be $$$f(n) = f(n/2) + O(n\log n) = O(n\log n)$$$.
Or just do plain iterative exponentiation.
Both of these have exactly the recurrence you said.
Tell your prof that in base 2 the sum of the digits is just 1
Your answer seems so straightforward that I spent a few minutes thinking about how to compute the sum of the digits mod 2, just to realize that you meant the sum of the digits when the number itself is written in base 2 :facepalm:
Is there any easy way to compute the sum of the digits mod 2?
interesting, i like your prof
You can do the computation on the number stored as a string of decimal digits, then add up the digits of the result at the end.
Hey, why the downvotes? This is a totally valid solution, since the question didn't specify a required complexity or time limit.
Do we need to specify every time that the time limit is smaller than a few hours?
"This a totally valid solution" — excuse me, but where do you see a solution here? Information of how to store the digits doesn't seem to me like a totally valid solution :).
Lol he suggested manually calculating it with biginteger strings, which, while dumb, is a solution.
Actually yes, I tried to do it and it gives TLE.
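For concreteness, the straightforward "strings" version mentioned above looks roughly like this (my sketch, not the code that was actually submitted): keep the number as a vector of decimal digits and double it $$$p$$$ times. Each doubling is linear in the current length, so the whole thing is quadratic in $$$p$$$, which is consistent with getting TLE for $$$p$$$ close to $$$10^7$$$.

```cpp
#include <bits/stdc++.h>
using namespace std;

int main() {
    int p;
    cin >> p;
    vector<int> digits = {1};  // little-endian decimal digits, starts as 1
    for (int step = 0; step < p; step++) {
        int carry = 0;
        for (int& d : digits) {  // multiply the whole number by 2
            int cur = d * 2 + carry;
            d = cur % 10;
            carry = cur / 10;
        }
        if (carry) digits.push_back(carry);
    }
    long long sum = 0;
    for (int d : digits) sum += d;
    cout << sum << "\n";
}
```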