
Hadamard product and binomial convolution of linear recurrences

Revision en6, by adamant, 2022-05-25 03:21:36

Hi everyone!

Let $$$R$$$ be a ring, $$$d_0, d_1, d_2, \dots \in R$$$ and $$$e_0, e_1, e_2, \dots \in R$$$ be linear recurrence sequences, such that

$$$\begin{gather} d_m = \sum\limits_{i=1}^k a_i d_{m-i}\text{ for }m \geq k, \\ e_m = \sum\limits_{i=1}^l b_i e_{m-i}\text{ for }m \geq l. \end{gather}$$$

In some applications, the following two sequences arise:

$$$\begin{align} f_k &= d_k e_k && \text{(Hadamard product)}, \\ f_k &= \sum\limits_{i+j=k} \binom{k}{i} d_i e_j && \text{(binomial convolution)}. \end{align}$$$

Today I'd like to write about a framework that allows us to prove that both sequences defined above are also linear recurrences. It also allows us to compute their characteristic polynomials in $$$O(kl \log kl)$$$, which is optimal, as their degrees are $$$O(kl)$$$ in both cases.
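
As a concrete example of what the claim means, take $$$d_m = e_m = F_m$$$, the Fibonacci numbers ($$$k = l = 2$$$). Their Hadamard product $$$F_m^2$$$ is again a linear recurrence,

$$$ F_m^2 = 2F_{m-1}^2 + 2F_{m-2}^2 - F_{m-3}^2, $$$

and the framework below both explains why this happens and produces a characteristic polynomial of degree at most $$$kl = 4$$$ for it; the order-$$$3$$$ recurrence above corresponds to a divisor of that polynomial.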

Umbral calculus

Generally, a linear recurrence $$$f_k$$$ can be described and analyzed with the help of the linear functional $$$T : R[f] \to R$$$ such that

$$$ T(f^k) = f_k. $$$

For such a functional, $$$T(P(f)) = 0$$$ whenever $$$P(f)$$$ is a multiple of the characteristic polynomial of $$$f_k$$$. The existence of a characteristic polynomial is the criterion for $$$f_k$$$ being a linear recurrence. So, we need to prove that there is such a polynomial for the $$$f_k$$$ defined above.
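
To spell out why such a polynomial yields a recurrence: if $$$c(f) = f^t - \sum\limits_{i=1}^t c_i f^{t-i}$$$ is a characteristic polynomial, then $$$f^{m-t} c(f)$$$ is a multiple of it for every $$$m \geq t$$$, hence

$$$ 0 = T(f^{m-t} c(f)) = T\left(f^m - \sum\limits_{i=1}^t c_i f^{m-i}\right) = f_m - \sum\limits_{i=1}^t c_i f_{m-i}, $$$

which is exactly a linear recurrence of order $$$t$$$ for $$$f_m$$$.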

Joint umbral calculus

To analyze joint properties of $$$d_k$$$ and $$$e_k$$$, we define a linear functional $$$T: R[d, e] \to R$$$ such that

$$$ T(d^i e^j) = d_i e_j. $$$

Similarly to the case of a single recurrence, $$$T(f(d, e))=0$$$ whenever $$$f(d, e)$$$ is a combination of $$$a(d)$$$ and $$$b(e)$$$ with coefficients in $$$R[d, e]$$$, where

$$$\begin{align} a(d) &= d^k - \sum\limits_{i=1}^k a_i d^{k-i}, \\ b(e) &= e^l - \sum\limits_{j=1}^l b_j e^{l-j} \end{align}$$$

are the characteristic polynomials of $$$d_i$$$ and $$$e_j$$$. In other words, $$$T(f(d, e))=0$$$ whenever $$$f(d, e)$$$ lies in the ideal $$$\langle a(d), b(e) \rangle$$$.
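
To see why $$$T$$$ vanishes on this ideal, note that by linearity, for any monomial $$$d^m e^n$$$,

$$$ T(d^m e^n a(d)) = T\left(d^{m+k} e^n - \sum\limits_{i=1}^k a_i d^{m+k-i} e^n\right) = \left(d_{m+k} - \sum\limits_{i=1}^k a_i d_{m+k-i}\right) e_n = 0, $$$

and similarly $$$T(d^m e^n b(e)) = 0$$$, so $$$T$$$ vanishes on every element of $$$\langle a(d), b(e) \rangle$$$.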

Composed sum

For the binomial convolution let $$$f=d+e$$$, then

$$$ T(f^k)=T((d+e)^k) = T\left(\sum\limits_{i=0}^k \binom{k}{i} d^i e^{k-i}\right) = \sum\limits_{i=0}^k \binom{k}{i} T(d^ie^{k-i}) = f_k $$$

To show that $$$f_k$$$ is a linear recurrence obeying the rule

$$$ f_m = \sum\limits_{i=1}^t c_i f_{m-i}\text{ for }m \geq t, $$$

it is sufficient to show that there is a characteristic polynomial $$$c(f)$$$ such that $$$c(f) \in \langle a(d), b(e) \rangle$$$: then $$$f^m c(f)$$$ lies in the ideal for every $$$m \geq 0$$$ as well, so $$$T$$$ vanishes on all multiples of $$$c(f)$$$.

Assume that $$$R$$$ is an integral domain. Then the polynomial exists and can be defined explicitly as

$$$ c(d+e) = \prod\limits_{i=1}^k \prod\limits_{j=1}^l ((d+e)-(\lambda_i + \mu_j)), $$$

where

$$$\begin{align} a(d) &= \prod\limits_{i=1}^k (d-\lambda_i),\\ b(e) &= \prod\limits_{j=1}^l (e-\mu_j) \end{align}$$$

are the factorizations of $$$a(d)$$$ and $$$b(e)$$$ over the algebraic closure of the fraction field of $$$R$$$. Note that the coefficients of $$$c(f)$$$ are symmetric in the $$$\lambda_i$$$ and in the $$$\mu_j$$$, so they can be expressed through $$$a_1, \dots, a_k$$$ and $$$b_1, \dots, b_l$$$ and still lie in $$$R$$$.

The fact that $$$c(d+e) \in \langle a(d), b(e) \rangle$$$ is proven as follows:

$$$\begin{align} c(d+e) &= \prod\limits_{i=1}^k \prod\limits_{j=1}^l ((e-\mu_j) + (d - \lambda_i )) \\ &= \sum\limits_{r_{ij} \in \{0,1\}} \prod\limits_{i=1}^k \prod\limits_{j=1}^l (e-\mu_j)^{r_{ij}}(d-\lambda_i)^{1-r_{ij}}. \end{align}$$$

In the sum above there are $$$2^{kl}$$$ summands, and each of them is divisible by either $$$a(d)$$$ or $$$b(e)$$$: if a summand were divisible by neither, there would be some $$$i_0$$$ with $$$r_{i_0 j} = 1$$$ for all $$$j$$$ and some $$$j_0$$$ with $$$r_{i j_0} = 0$$$ for all $$$i$$$, which contradict each other at $$$r_{i_0 j_0}$$$. Hence $$$c(d+e) \in \langle a(d), b(e)\rangle$$$.

The polynomial $$$c(f)$$$ defined above is called the composed sum of $$$a(d)$$$ and $$$b(e)$$$.
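
For instance, take the Fibonacci example again: $$$a(d) = d^2 - d - 1$$$ and $$$b(e) = e^2 - e - 1$$$, both with roots $$$\varphi, \psi = \frac{1 \pm \sqrt 5}{2}$$$. The composed sum has roots $$$2\varphi$$$, $$$\varphi + \psi = 1$$$ (counted twice) and $$$2\psi$$$, so

$$$ c(f) = (f - 2\varphi)(f - 2\psi)(f-1)^2 = (f^2 - 2f - 4)(f^2 - 2f + 1) = f^4 - 4f^3 + f^2 + 6f - 4, $$$

and the binomial convolution $$$g_m = \sum\limits_{i} \binom{m}{i} F_i F_{m-i}$$$ satisfies $$$g_m = 4g_{m-1} - g_{m-2} - 6g_{m-3} + 4g_{m-4}$$$.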

Composed product

Now the question is: how do we prove that the Hadamard product $$$f_k = d_k e_k$$$ is a linear recurrence?

Using the same logic as above, one defines $$$f = de$$$, so that $$$T(f^k) = T(d^k e^k) = d_k e_k = f_k$$$, and then looks for $$$c(f) \in \langle a(d), b(e) \rangle$$$. Let

$$$ c(de) = \prod\limits_{i=1}^k \prod\limits_{j=1}^l (de - \lambda_i \mu_j). $$$

This one is a bit trickier to prove. Let's start with $$$k=l=1$$$:

$$$ c(de) = de-\lambda \mu = d(e-\mu) + (d - \lambda)\mu. $$$

Rewriting it in the same way for arbitrary $$$k$$$ and $$$l$$$, we get

$$$\begin{align} c(de) &= \prod\limits_{i=1}^k \prod\limits_{j=1}^l (d(e-\mu_j) + (d - \lambda_i )\mu_j) \\ &= \sum\limits_{r_{ij} \in \{0,1\}} \prod\limits_{i=1}^k \prod\limits_{j=1}^l d^{r_{ij}}\mu_j^{1-r_{ij}}(e-\mu_j)^{r_{ij}}(d-\lambda_i)^{1-r_{ij}}. \end{align}$$$

Then the same argument as for $$$c(d+e)$$$ in the binomial convolution case applies: each summand is divisible by either $$$a(d)$$$ or $$$b(e)$$$, hence $$$c(de) \in \langle a(d), b(e) \rangle$$$.

The polynomial $$$c(f) = c(de)$$$ defined above is called the composed product of $$$a(d)$$$ and $$$b(e)$$$.
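
As a sanity check, here is a small numerical sketch of both constructions (assuming numpy is available; the helper name composed_poly is mine). It builds $$$c(f)$$$ by brute force from the numerical roots of $$$a$$$ and $$$b$$$, rather than via the $$$O(kl \log kl)$$$ method of the next section, and verifies the resulting recurrences on the Fibonacci example.

```python
import numpy as np
from math import comb

# A brute-force sanity check of both constructions: build c(f) directly from
# the (numerical) roots of a and b. This is not the O(kl log kl) algorithm of
# the next section, just a way to see the statements in action. Polynomials
# are numpy-style coefficient lists, highest degree first.
def composed_poly(a, b, op):
    roots = [op(lam, mu) for lam in np.roots(a) for mu in np.roots(b)]
    return np.real(np.poly(roots))  # monic polynomial with exactly these roots

fib_char = [1, -1, -1]  # d^2 - d - 1, characteristic polynomial of Fibonacci

print(np.round(composed_poly(fib_char, fib_char, lambda lam, mu: lam * mu)))
# [ 1. -1. -4. -1.  1.]  i.e. f^4 - f^3 - 4f^2 - f + 1  (composed product)
print(np.round(composed_poly(fib_char, fib_char, lambda lam, mu: lam + mu)))
# [ 1. -4.  1.  6. -4.]  i.e. f^4 - 4f^3 + f^2 + 6f - 4  (composed sum)

# Check the corresponding recurrences on the sequences themselves.
F = [0, 1]
for _ in range(20):
    F.append(F[-1] + F[-2])
had = [x * x for x in F]                                          # Hadamard product
conv = [sum(comb(m, i) * F[i] * F[m - i] for i in range(m + 1))   # binomial convolution
        for m in range(len(F))]
for m in range(4, len(F)):
    assert had[m] == had[m-1] + 4*had[m-2] + had[m-3] - had[m-4]
    assert conv[m] == 4*conv[m-1] - conv[m-2] - 6*conv[m-3] + 4*conv[m-4]
print("both recurrences hold")
```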

Computing composed products and sums

Let $$$s_i$$$ be the sum of $$$i$$$-th powers of all $$$\lambda$$$ and $$$t_j$$$ be the sum of $$$j$$$-th powers of all $$$\mu$$$, that is

$$$\begin{align} s_i = \lambda_1^i + \dots + \lambda_k^i, \\ t_j = \mu_1^j + \dots + \mu_l^j. \end{align}$$$

The roots of the composed sum are $$$\lambda_i + \mu_j$$$ for all $$$i$$$ and $$$j$$$, and those of the composed product are $$$\lambda_i \mu_j$$$, from which we see that

$$$\begin{align} \sum\limits_{i=1}^k \sum\limits_{j=1}^l (\lambda_i \mu_j)^z &= \sum\limits_{i=1}^k \lambda_i^z \sum\limits_{j=1}^l \mu_j^z = s_z t_z, \\ \sum\limits_{i=1}^k \sum\limits_{j=1}^l (\lambda_i + \mu_j)^z &= \sum\limits_{x+y=z} \binom{z}{x} \sum\limits_{i=1}^k \lambda_i^x \sum\limits_{j=1}^l \mu_j^y = \sum\limits_{x+y=z} \binom{z}{x} s_x t_y. \end{align}$$$

So, if we can transform $$$a(d)$$$ into the power sums $$$s_i$$$ and $$$b(e)$$$ into $$$t_j$$$, combine them as above, and then recover the characteristic polynomial from the resulting power sums, the problem is solved.
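
The middle step, combining the power sums, is a direct transcription of the identities above. A small sketch (power sums stored as plain Python lists indexed by $$$z$$$; the function names are mine):

```python
from math import comb

# Power sums of the composed product / composed sum, obtained from the power
# sums s[z] = sum_i lambda_i^z and t[z] = sum_j mu_j^z (plain lists, index z).
def composed_product_power_sums(s, t):
    return [sz * tz for sz, tz in zip(s, t)]

def composed_sum_power_sums(s, t):
    n = min(len(s), len(t))
    return [sum(comb(z, x) * s[x] * t[z - x] for x in range(z + 1))
            for z in range(n)]
```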

The next thing to note is that the generating function of $$$s_i$$$ is

$$$ \sum\limits_{i=0}^\infty s_i x^i = \sum\limits_{i=0}^\infty \sum\limits_{j=1}^k \lambda_j^i x^i = \sum\limits_{j=1}^k \frac{1}{1-\lambda_j x}. $$$

It can be further expanded as

$$$ \sum\limits_{j=1}^k \frac{1}{1-\lambda_j x} = k + \sum\limits_{j=1}^k \frac{\lambda_j x}{1-\lambda_j x} = k - x \frac{A'(x)}{A(x)}, $$$

where

$$$ A(x) = 1 - \sum\limits_{i=1}^k a_i x^i = \prod\limits_{i=1}^k (1-\lambda_i x) $$$

is the reversed characteristic polynomial of $$$d_k$$$. Its log-derivative is indeed

$$$ \frac{A'(x)}{A(x)} = \sum\limits_{i=1}^k \frac{-\lambda_i}{1-\lambda_i x}. $$$

Finally, to invert this transform, we can make use of the fact that $$$\frac{A'}{A} = (\log A)'$$$, hence for

$$$ B(x) = k - x \frac{A'(x)}{A(x)} $$$

it holds that

$$$ A(x) = \exp \int \frac{k-B(x)}{x}dx. $$$

The resulting characteristic polynomial has degree $$$kl$$$, so only the first $$$O(kl)$$$ terms of $$$\frac{A'}{A}$$$ and $$$\exp$$$ are needed, and they can be computed in $$$O(kl \log kl)$$$.
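
To make the whole pipeline concrete, here is a self-contained sketch that follows the steps above verbatim, but implements the power series operations naively over exact rationals, so it runs in $$$O((kl)^2)$$$ rather than $$$O(kl \log kl)$$$; swapping the series inverse and $$$\exp$$$ for FFT/NTT-based versions gives the stated complexity. All names are mine, not from any particular library.

```python
from fractions import Fraction
from math import comb

# Power series are lists of Fractions, coefficient of x^i at index i.
# Everything below is a naive O(n^2) transcription of the steps above.

def series_inv(A, n):
    # 1/A(x) mod x^n, assuming A[0] = 1.
    B = [Fraction(0)] * n
    B[0] = Fraction(1)
    for i in range(1, n):
        B[i] = -sum(A[j] * B[i - j] for j in range(1, min(i, len(A) - 1) + 1))
    return B

def series_mul(A, B, n):
    # A(x) B(x) mod x^n.
    C = [Fraction(0)] * n
    for i, ai in enumerate(A[:n]):
        for j, bj in enumerate(B[:n - i]):
            C[i + j] += ai * bj
    return C

def power_sums(rec, n):
    # rec = [a_1, ..., a_k] encodes d_m = sum a_i d_{m-i}. Returns s_0..s_{n-1}
    # via s(x) = k - x A'(x)/A(x), where A(x) = 1 - sum a_i x^i.
    k = len(rec)
    A = [Fraction(1)] + [Fraction(-a) for a in rec]
    xdA = [Fraction(0)] + [i * A[i] for i in range(1, len(A))]  # x A'(x)
    q = series_mul(xdA, series_inv(A, n), n)                    # x A'/A mod x^n
    s = [-c for c in q]
    s[0] += k
    return s

def recurrence_from_power_sums(s):
    # Inverse transform: A(x) = exp(integral of (s_0 - s(x))/x dx),
    # then read the recurrence coefficients off A(x) = 1 - sum c_i x^i.
    n = len(s)
    H = [Fraction(0)] + [-s[z] / z for z in range(1, n)]  # the integral, i.e. log A
    E = [Fraction(1)] + [Fraction(0)] * (n - 1)           # E = exp(H), via E' = H' E
    for i in range(1, n):
        E[i] = sum(j * H[j] * E[i - j] for j in range(1, i + 1)) / i
    return [-E[i] for i in range(1, n)]

def composed_recurrence(rec_d, rec_e, mode):
    n = len(rec_d) * len(rec_e) + 1
    s, t = power_sums(rec_d, n), power_sums(rec_e, n)
    if mode == 'product':   # Hadamard product
        u = [sz * tz for sz, tz in zip(s, t)]
    else:                   # binomial convolution (composed sum)
        u = [sum(comb(z, x) * s[x] * t[z - x] for x in range(z + 1)) for z in range(n)]
    return recurrence_from_power_sums(u)

fib = [1, 1]  # F_m = F_{m-1} + F_{m-2}
# Coefficients are integers for this example, so the int() cast is safe.
print([int(c) for c in composed_recurrence(fib, fib, 'product')])  # [1, 4, 1, -1]
print([int(c) for c in composed_recurrence(fib, fib, 'sum')])      # [4, -1, -6, 4]
```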

Tags: tutorial, linear recurrence, composed sum, composed product, polynomials, power series
