UnexpectedValue's blog

By UnexpectedValue, 12 months ago, In English

I could not find any article or blog on this idea, and felt having one could be helpful to many. There might be unwanted errors, so please feel free to share any suggestions, corrections or concerns. For those who are unaware of Binary Exponentiation, this CP-Algorithms article or this video by Errichto can help get familiar with it.

Introduction

Now that we know the idea behind binary exponentiation, let us try to extend it. Say we break the problem into two parts, $$$L$$$ and $$$R$$$, such that once we have both results, combining them gives us our final answer as $$$ans = L \odot R$$$, where $$$\odot$$$ is an associative binary operator. If we can find $$$R$$$ as a function of $$$L$$$, as in $$$R = f(L)$$$, fast enough, say in $$$O(X)$$$, then our final answer can be found in $$$O(X\;log(n))$$$. The idea is simple enough, so let us now look at some simple examples. For the sake of simplicity we assume the problem size $$$n$$$ is of the form $$$2^k$$$, $$$ k \in \mathbb{Z}_{\geq 0} $$$.

Example 1: Calculating $$$a^n$$$

We start with the simplest example. We know that $$$ans = \underbrace{a \cdot a \dots \cdot a \cdot a }_\text{ n times} $$$. If we take $$$R = f(L) = L$$$, where $$$L = a^{n/2}$$$, we can find $$$R$$$ in $$$O(1)$$$ and build our answer as

$$$\newline$$$

$$$ans = \underbrace{\underbrace{\dots \dots \dots}_{L} \odot \underbrace{\dots \dots \dots}_{R=f(L)}}_\text{L'} \odot \underbrace{\dots \dots \dots \dots \dots \dots \dots}_\text{R' = f(L')} \; \odot \dots \dots$$$

We are able to compute $$$R'$$$ directly from $$$L'$$$, unlike other methods such as RMQ or Binary Lifting, where we build the answer for the bigger range by combining $$$L$$$ and $$$R$$$ that were already calculated independently on the smaller ranges.

Time Complexity: Finding $$$R = f(L)$$$ is $$$O(1)$$$, so total time complexity is $$$O(log(n))$$$.

Implementation
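The implementation is collapsed in the original post; as a stand-in, here is a minimal C++ sketch of iterative binary exponentiation under an assumed modulus $$$p$$$ (the function name and signature are my own):

```cpp
#include <cstdint>

// Computes a^n mod p by repeated squaring: at each step the
// "right half" R equals the already-known "left half" L, so
// combining them is just L * L.
int64_t power(int64_t a, int64_t n, int64_t p) {
    int64_t res = 1 % p;
    a %= p;
    while (n > 0) {
        if (n & 1) res = res * a % p;  // fold the lowest bit into the answer
        a = a * a % p;                 // L -> L * R, with R = f(L) = L
        n >>= 1;
    }
    return res;
}
```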

Example 2: Summation of GP Series

Considering a standard GP series, we find its summation as

$$$\newline$$$

$$$ans = 1 + r + r^{2} + \dots + r^{n-1} = \frac{r^{n}-1}{r-1}$$$

If we have to find the summation under an arbitrary MOD, then it is not guaranteed that $$$(r-1)^{-1}$$$ exists, so we alternatively express our sum as

$$$ans = \underbrace{1 + r + r^{2} + \dots + r^{ \frac{n}{2}-1 } }_\text{L} + \underbrace{r^{ \frac{n}{2} } + \dots + r^{n-1}}_\text{R} $$$

which gives $$$R = f(L) = r^{\frac{n}{2}} \cdot L$$$.

To handle the case of odd $$$n$$$ during implementation, we can break the series as

$$$\newline$$$

$$$ans = 1 + r \cdot \underbrace{(1 + r + r^2 + \dots + r^{n-2})}_\text{say sum is S}$$$

We can find $$$S$$$ since $$$(n - 1)$$$ is even, and the final answer will be $$$1 + r \cdot S$$$.

Time Complexity: Computing $$$R = f(L)$$$ is $$$O(log(n))$$$, so the total time complexity becomes $$$O(log(n)^{2})$$$. But we can calculate the powers of $$$r$$$ along with our series, which restricts the final time complexity to $$$O(log(n))$$$.

Implementation
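The blog's implementation is again collapsed here. A minimal recursive sketch of the same idea, assuming a placeholder modulus MOD (the constant and function names are mine), which carries $$$r^n$$$ along with the sum so the powers of $$$r$$$ are never recomputed:

```cpp
#include <cstdint>
#include <utility>

const int64_t MOD = 998244353;  // assumed modulus for illustration

// Returns {r^n mod MOD, (1 + r + ... + r^{n-1}) mod MOD}.
// The power of r is computed alongside the sum, so the whole
// computation stays O(log n).
std::pair<int64_t, int64_t> gp(int64_t r, int64_t n) {
    if (n == 0) return {1, 0};
    if (n % 2 == 1) {                    // odd case: ans = 1 + r * S(n - 1)
        auto [pw, s] = gp(r, n - 1);
        return {pw * r % MOD, (1 + r * s) % MOD};
    }
    auto [pw, L] = gp(r, n / 2);         // L = 1 + r + ... + r^{n/2 - 1}, pw = r^{n/2}
    int64_t R = pw * L % MOD;            // R = f(L) = r^{n/2} * L
    return {pw * pw % MOD, (L + R) % MOD};
}
```

Here `gp(r, n).second` is the required sum, assuming `r` is already reduced modulo MOD.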

We can use a similar idea for calculating a GP series of matrices $$$ = I + A + A^2 + \dots + A^{n-1}$$$. Just like integers under a modulus, we cannot always guarantee that $$$(A-I)^{-1}$$$ exists.

Contest Example: In AtCoder ABC 293 Task E, the task is simply to compute the sum of a GP series.

Example 3: Summation of AP-GP series

We start by considering the simple series $$$\displaystyle r + 2 \cdot r^2 + 3 \cdot r^3 + \dots + (n-1)\cdot r^{n-1} = \sum\limits_{i=0}^{n-1} i\cdot r^i$$$. For this we express the summation as below

$$$ans = \underbrace{r + 2 \cdot r^2 + \dots + (\frac{n}{2}-1) \cdot r^{ \frac{n}{2}-1 }}_\text{L} + \underbrace{ \frac{n}{2} \cdot r^{\frac{n}{2}} + \dots + (n-1) \cdot r^{n-1}}_\text{R} $$$

Now we solve for $$$R = f(L)$$$.

Solution for R as a function of L
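The derivation is collapsed in the original post; it can be expanded as follows, writing $$$L_0$$$ for the plain GP sum over the first half:

$$$\displaystyle R = \sum\limits_{i=\frac{n}{2}}^{n-1} i \cdot r^i = r^{\frac{n}{2}} \cdot \sum\limits_{i=0}^{\frac{n}{2}-1} \left( \frac{n}{2} + i \right) \cdot r^i = r^{\frac{n}{2}} \cdot \left( \frac{n}{2} \cdot \underbrace{\sum\limits_{i=0}^{\frac{n}{2}-1} r^i}_{L_0} + \underbrace{\sum\limits_{i=0}^{\frac{n}{2}-1} i \cdot r^i}_{L} \right) = r^{\frac{n}{2}} \cdot \left( \frac{n}{2} \cdot L_0 + L \right)$$$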

To handle the odd case during implementation, we express the summation as:

$$$\newline$$$

$$$sum = r + 2\cdot r^2 + \dots + (n-1) \cdot r^{n-1}$$$
$$$= \left(r + r^2 + \dots + r^i + \dots + r^{n-1} \right) + \left( r^2 + 2\cdot r^3 + \dots + (i-1)\cdot r^{i} + \dots + (n-2)\cdot r^{n-1} \right)$$$
$$$= r \cdot \underbrace{\left(1 + r^1 + \dots + r^i + \dots + r^{n-2} \right)}_\text{X} + r \cdot \underbrace{\left( r^1 + 2\cdot r^2 + \dots + i\cdot r^{i} + \dots + (n-2)\cdot r^{n-2} \right)}_\text{Y}$$$

We can compute $$$X$$$ and $$$Y$$$ because they correspond to subproblems of even size $$$(n-1)$$$, and our final answer will be $$$r \cdot X + r \cdot Y$$$.

Time Complexity: To get $$$R$$$ we need to solve for $$$L_{0}$$$, which takes $$$O(log(n))$$$, so the total would be $$$O(log(n)^2)$$$; but we can calculate the powers of $$$r$$$, $$$L_{0}$$$ and $$$L$$$ together, which makes the total time $$$O(log(n))$$$.

Implementation
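The collapsed implementation can be sketched as below (the modulus and all names are my own placeholders). The function returns $$$r^n$$$, the GP sum $$$L_0$$$ and the AP-GP sum $$$L$$$ together, which is exactly what keeps the total cost at a single $$$O(log(n))$$$ recursion:

```cpp
#include <cstdint>
#include <tuple>

const int64_t MOD = 998244353;  // assumed modulus for illustration

// Returns {r^n, L0, L} mod MOD, where
//   L0 = 1 + r + ... + r^{n-1}                   (GP sum)
//   L  = 0 + r + 2*r^2 + ... + (n-1)*r^{n-1}     (AP-GP sum)
// All three values are maintained together, so the total cost is O(log n).
std::tuple<int64_t, int64_t, int64_t> apgp(int64_t r, int64_t n) {
    if (n == 0) return {1, 0, 0};
    if (n % 2 == 1) {
        // Odd case, as in the blog:
        //   L0(n) = 1 + r * L0(n-1),   L(n) = r * (L0(n-1) + L(n-1)).
        auto [pw, l0, l] = apgp(r, n - 1);
        return {pw * r % MOD, (1 + r * l0) % MOD, r * ((l0 + l) % MOD) % MOD};
    }
    auto [pw, l0, l] = apgp(r, n / 2);                 // pw = r^{n/2}
    int64_t half = (n / 2) % MOD;
    int64_t R0 = pw * l0 % MOD;                        // GP:    R0 = r^{n/2} * L0
    int64_t R  = pw * ((half * l0 + l) % MOD) % MOD;   // AP-GP: R  = r^{n/2} * (n/2 * L0 + L)
    return {pw * pw % MOD, (l0 + R0) % MOD, (l + R) % MOD};
}
```

For the generic series $$$a \cdot L_0 + b \cdot L$$$ discussed next, the same single call already returns both pieces.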

For any generic AP-GP series $$$\displaystyle = \sum\limits_{i=0}^{n-1} ( a + i \cdot b ) \cdot r^i$$$ $$$= a \cdot \underbrace{\sum\limits_{i=0}^{n-1} r^i}_\text{ L0 } + b \cdot \underbrace{\sum\limits_{i=0}^{n-1} i \cdot r^i}_\text{L} = a \cdot L_{0} + b \cdot L $$$, where both sums are calculated with a single function call.

AtCoder ABC 129 task F is a good practice example for the ideas in the blog.

Generalising the above series summations

Now, let us consider $$$\displaystyle S(n,m) = \sum\limits_{i=0}^{n-1} i^m \cdot r^i $$$. Then we would have GP $$$= S(n,0)$$$ and AP-GP $$$= S(n,1)$$$. Hence,

$$$\newline$$$

$$$\displaystyle S(n,m) = r + 2^m \cdot r^2 + \dots + i^m \cdot r^i + \dots + (n-1)^m \cdot r^{n-1}$$$
$$$\displaystyle = \underbrace{r + 2^m \cdot r^2 + \dots + (\frac{n}{2}-1)^{m} \cdot r^{\frac{n}{2}-1}}_\text{L(m,n/2)} + \underbrace{\left(\frac{n}{2}\right)^{m} \cdot r^{\frac{n}{2}} + \dots + (n-1)^m \cdot r^{n-1}}_\text{R(m,n/2)}$$$

Here $$$L_{m,\frac{n}{2}} = S(\frac{n}{2},m)$$$, so $$$S(n,m) = L_{m,\frac{n}{2}} + R_{m,\frac{n}{2}}$$$, and we need to compute $$$R_{m,\frac{n}{2}}$$$.

$$$\newline$$$

$$$\displaystyle R_{m,\frac{n}{2}} = r^{\frac{n}{2}} \cdot \left( (\frac{n}{2})^m + (\frac{n}{2}+1)^m \cdot r + \dots + (\frac{n}{2} + i)^m \cdot r^i + \dots + (\frac{n}{2} + \frac{n}{2} - 1)^m \cdot r^{\frac{n}{2}-1} \right)$$$
$$$\displaystyle R_{m,\frac{n}{2}} = r^{\frac{n}{2}} \cdot \left( \sum\limits_{i=0}^{\frac{n}{2}-1} \left( \sum\limits_{j=0}^{m} \binom{m}{j} \left(\frac{n}{2}\right)^{m-j} \cdot i^j \right) \cdot r^i \right) $$$

Now, swapping the summations -

$$$\displaystyle R_{m,\frac{n}{2}} = r^{\frac{n}{2}} \cdot \left( \sum\limits_{j=0}^{m} \binom{m}{j} \left( \frac{n}{2}\right) ^{m-j} \underbrace{\left( \sum\limits_{i=0}^{\frac{n}{2}-1} i^j \cdot r^i \right)}_\text{L(j,n/2)} \right) $$$

Finally,

$$$\displaystyle R_{m,\frac{n}{2}} = r^{\frac{n}{2}} \cdot \left( \sum\limits_{j=0}^{m} \binom{m}{j} \left( \frac{n}{2}\right) ^{m-j} \cdot L_{j,\frac{n}{2}} \right) = r^{\frac{n}{2}} \cdot \left( \sum\limits_{j=0}^{m} \binom{m}{j} \left( \frac{n}{2}\right) ^{m-j} \cdot S_{\frac{n}{2},j} \right) $$$

Since $$$j \leq m$$$, if we calculate everything together just like above, we already know the value of every $$$L_{j,\frac{n}{2}}$$$ without spending any extra time. We can validate our results against the GP and AP-GP series illustrated above.

Validating our results against our previous examples.
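Plugging $$$m=0$$$ and $$$m=1$$$ into the expression for $$$R_{m,\frac{n}{2}}$$$ recovers the earlier examples:

$$$\displaystyle R_{0,\frac{n}{2}} = r^{\frac{n}{2}} \cdot \binom{0}{0} \left(\frac{n}{2}\right)^{0} \cdot S_{\frac{n}{2},0} = r^{\frac{n}{2}} \cdot L \hspace{0.5cm} \text{(Example 2)}$$$
$$$\displaystyle R_{1,\frac{n}{2}} = r^{\frac{n}{2}} \cdot \left( \frac{n}{2} \cdot S_{\frac{n}{2},0} + S_{\frac{n}{2},1} \right) = r^{\frac{n}{2}} \cdot \left( \frac{n}{2} \cdot L_0 + L \right) \hspace{0.5cm} \text{(Example 3)}$$$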

We can now evaluate our original summation as $$$\displaystyle S_{n,m} = S_{\frac{n}{2},m} + r^{\frac{n}{2}} \cdot \left( \sum\limits_{j=0}^{m} \binom{m}{j} \left( \frac{n}{2}\right) ^{m-j} \cdot S_{\frac{n}{2},j} \right)$$$.

Handling Odd Case

We have $$$\displaystyle S_{n,m} = \sum\limits_{i=0}^{n-1} i^m \cdot r^i = r \cdot \left( \sum\limits_{i=0}^{n-2} (i+1)^m \cdot r^i \right)$$$. However, we can only write this when $$$m>0$$$: for $$$m=0$$$ the $$$i=0$$$ term of the original sum is not $$$0$$$ but $$$0^0$$$, which we take as $$$\displaystyle \lim_{x \to 0} x^x = 1$$$. So for $$$m=0$$$ we will add $$$1$$$ separately, keeping the remainder of the expression the same. Thus

$$$\displaystyle S_{n,m} = r \cdot \left( \sum\limits_{i=0}^{n-2} \sum\limits_{j=0}^{m} \binom{m}{j} i^j \cdot r^i \right)$$$

And swapping summations

$$$\displaystyle S_{n,m} = r \cdot \left( \sum\limits_{j=0}^{m} \binom{m}{j} \underbrace{\sum\limits_{i=0}^{n-2} i^j \cdot r^i}_\text{S(n-1,j)} \right) = r \cdot \left( \sum\limits_{j=0}^{m} \binom{m}{j} S_{n-1,j} \right) \hspace{0.25cm} \text{for} \hspace{0.25cm}m > 0$$$

Hence, we have the final result as:

$$$ \displaystyle \begin{equation} S_{n,m}= \begin{cases} 1 + r \cdot S_{n-1,0} & \text{if } m=0\\ r \cdot \left(\displaystyle \sum\limits_{j=0}^{m} \binom{m}{j} S_{n-1,j} \right) & \text{if } m > 0 \end{cases} \end{equation} $$$

We can also construct a clean visualization of the even and odd cases using matrix multiplication instead of summation. These are illustrated below.

Odd Case.
Even Case.

In the above cases, whenever we compute the values of $$$R_{j, \frac{n}{2}} \hspace{0.25cm} \text{for} \hspace{0.25cm} j \in [0, m]$$$, they still maintain our original form $$$R = f(L)$$$, since we can express them as

$$$\vec{R_{\frac{n}{2}}} = f(\vec{L})= r^{\frac{n}{2}} \left( M(n,m) \cdot \vec{L_{\frac{n}{2}}} \right) $$$

Time Complexity: Finding each $$$R_{j,\frac{n}{2}}$$$ requires a summation of $$$O(m)$$$ terms, and we do that for every $$$j \leq m$$$, which takes $$$O(m^2)$$$ per level. So, the total complexity is $$$O(m^2 \cdot log(n)).$$$

Implementation
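The blog's implementation is collapsed here; the following is a minimal C++ sketch of the recurrences above (the modulus, the bound MAXM and all names are my own assumptions). It returns $$$r^n$$$ together with $$$S_{n,j}$$$ for every $$$j \leq m$$$:

```cpp
#include <cstdint>
#include <utility>
#include <vector>

const int64_t MOD = 998244353;  // assumed modulus for illustration
const int MAXM = 64;            // assumed upper bound on m for the binomial table

int64_t C[MAXM + 1][MAXM + 1];  // Pascal's triangle of binomial coefficients mod MOD

void build_binomials() {
    for (int i = 0; i <= MAXM; i++) {
        C[i][0] = 1;
        for (int j = 1; j <= i; j++)
            C[i][j] = (C[i - 1][j - 1] + C[i - 1][j]) % MOD;
    }
}

// Returns {r^n mod MOD, S}, where S[j] = sum_{i=0}^{n-1} i^j * r^i (with 0^0 = 1)
// for every j in [0, m]. Total cost: O(m^2 log n).
std::pair<int64_t, std::vector<int64_t>> solve(int64_t r, int64_t n, int m) {
    if (n == 0) return {1, std::vector<int64_t>(m + 1, 0)};
    if (n % 2 == 1) {
        // Odd case: S(n,0) = 1 + r*S(n-1,0);  S(n,k) = r * sum_j C(k,j) * S(n-1,j) for k > 0.
        auto [pw, s] = solve(r, n - 1, m);
        std::vector<int64_t> res(m + 1);
        res[0] = (1 + r * s[0]) % MOD;
        for (int k = 1; k <= m; k++) {
            int64_t acc = 0;
            for (int j = 0; j <= k; j++) acc = (acc + C[k][j] * s[j]) % MOD;
            res[k] = r * acc % MOD;
        }
        return {pw * r % MOD, res};
    }
    // Even case: S(n,k) = S(n/2,k) + r^{n/2} * sum_j C(k,j) * (n/2)^{k-j} * S(n/2,j).
    auto [pw, s] = solve(r, n / 2, m);
    int64_t half = (n / 2) % MOD;
    std::vector<int64_t> hpow(m + 1, 1);  // hpow[t] = (n/2)^t mod MOD
    for (int t = 1; t <= m; t++) hpow[t] = hpow[t - 1] * half % MOD;
    std::vector<int64_t> res(m + 1);
    for (int k = 0; k <= m; k++) {
        int64_t acc = 0;
        for (int j = 0; j <= k; j++)
            acc = (acc + C[k][j] * hpow[k - j] % MOD * s[j]) % MOD;
        res[k] = (s[k] + pw * acc) % MOD;
    }
    return {pw * pw % MOD, res};
}
```

Here build_binomials() must be called once before solve(); after that, solve(r, n, m).second holds $$$S_{n,j}$$$ for every $$$j \in [0, m]$$$.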
12 months ago, # |

Nice blog!

btw S(n, m) can be calculated in subquadratic time. For more information, see the Forum section of this Library Checker problem.

    12 months ago, # ^ |

    Thanks for sharing this information. This is new to me and I will certainly have a look into it. Seems intriguing.

12 months ago, # |

Example 2 even case can be much easier:

$$$a^0 + a^1 + \cdots + a^{n-1}$$$ = $$$a^0 + a^2 + a^4 + \cdots + a^1 + a^3 + a^5 + \cdots$$$ = $$$(a + 1) \times (a^0 + a^2 + \cdots + a^{n-2})$$$

code
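The commenter's code is collapsed; a small sketch of the same pairing idea (the modulus and function name are my own placeholders) could look like this:

```cpp
#include <cstdint>

const int64_t MOD = 998244353;  // assumed modulus for illustration

// Computes 1 + a + ... + a^{n-1} (mod MOD) using the pairing trick
// from the comment above: for even n the sum equals
// (a + 1) * (1 + a^2 + a^4 + ... + a^{n-2}).
int64_t gp_sum(int64_t a, int64_t n) {
    if (n == 0) return 0;
    if (n % 2 == 1) return (1 + a * gp_sum(a, n - 1)) % MOD;   // peel one term: 1 + a * S(n-1)
    return (a + 1) % MOD * gp_sum(a * a % MOD, n / 2) % MOD;   // even case from the comment
}
```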
    12 months ago, # ^ |

    Thanks for adding this easy to implement idea to the one shared in the blog! Will certainly come in handy.