
A problem collection of ODE and differential technique


This problem might be well-known in some countries, but how do other countries learn about such problems if nobody poses them?

For those who are interested in well-known problems in China.

Thanks to Elegia and djq_cpp for developing this technique, and thanks to Rewinding for reviewing this article.

Chain Reaction in UOJ Round 3 By vfleaking

Statement

You are given a set $$$A$$$; you need to compute $$$g_{i} = \frac{1}{2} \sum_{j,k}{i-1 \choose j}{i-1-j \choose k} g_jg_k$$$, where the sum runs over pairs $$$(j,k)$$$ with $$$i-1-j-k \in A$$$.

Solution

Let the EGF of $$$g$$$ be $$$x(t)$$$ and the EGF of $$$A$$$ be $$$a(t)$$$. Then $$$x'(t)=\frac{1}{2} a(t) x^2(t)+1$$$. We can solve this equation by D&C and FFT in $$$O(n\log^2 n)$$$, but there is a faster solution in $$$O(n\log n)$$$.
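A brute-force reference can help test an implementation. The sketch below (Python, exact rationals) extracts the coefficients of $$$x(t)$$$ from the ODE term by term in $$$O(n^2)$$$; it assumes $$$x(0)=0$$$ and $$$a(t)=\sum_{i\in A} t^i/i!$$$, and the concrete set passed at the end is just a hypothetical example.

```python
# A brute-force reference for the ODE x'(t) = a(t) x(t)^2 / 2 + 1 over exact
# rationals, assuming x(0) = 0 and a(t) = sum_{i in A} t^i / i!.
# It extracts the coefficients term by term in O(n^2); useful only for
# checking a fast implementation on small inputs.
from fractions import Fraction
from math import factorial

def solve_ode_naive(A, n):
    a = [Fraction(1, factorial(i)) if i in A else Fraction(0) for i in range(n)]
    x = [Fraction(0)] * (n + 1)                    # x[i] = [t^i] x(t), x(0) = 0 assumed
    for i in range(n):                             # determine x[i+1] from x[0..i]
        ax2 = sum(a[j] * x[k] * x[i - j - k]
                  for j in range(i + 1) for k in range(i - j + 1))
        rhs = ax2 / 2 + (1 if i == 0 else 0)       # [t^i] (a x^2 / 2 + 1)
        x[i + 1] = rhs / (i + 1)                   # [t^i] x'(t) = (i + 1) x[i + 1]
    return x

coeffs = solve_ode_naive({0, 1}, 8)                # hypothetical small instance
print([c * factorial(i) for i, c in enumerate(coeffs)])   # g_i = i! * [t^i] x(t)
```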

For a polynomial equation $$$f(x(t))=0$$$, we can use Newton's method. Suppose we have found a polynomial $$$x_n$$$ such that $$$x \equiv x_n \pmod {t^n}$$$. Then we can derive the following equation from the Taylor expansion:

$$$0=f(x_{2n}) = f(x_n) + f'(x_n) (x_{2n} -x_n) + \dots$$$

Thus $$$x_{2n} \equiv x_n - \frac{f(x_n)}{f'(x_n)} \pmod {t^{2n}}$$$, since $$$(x_{2n}-x_n)^2 \equiv 0 \pmod {t^{2n}}$$$.

​ For an ODE $$$\frac{d}{dt} x =f(x)$$$, we have

$$$\begin{eqnarray*} \frac{d}{dt} x_{2n} &\equiv & f(x_n) + f'(x_n) (x_{2n}-x_n) \pmod {t^{2n}} \\ & \equiv & f(x_n) + f'(x_n) x_{2n} - f'(x_n) x_n \pmod {t^{2n}} \end{eqnarray*}$$$

Let $$$r= e^{-\int f'(x_n) dt}$$$.

$$$\begin{eqnarray*} \left(\frac{d}{dt} x_{2n}\right)r &\equiv & (f(x_n)-f'(x_n) x_n) r + f'(x_n) x_{2n} r \pmod {t^{2n}} \\ \frac{d}{dt} (x_{2n}r)& \equiv & (f(x_n) - f'(x_n)x_n) r \pmod {t^{2n}} \end{eqnarray*}$$$

Integrating the right-hand side and dividing by $$$r$$$ recovers $$$x_{2n} \pmod {t^{2n}}$$$. In this problem, $$$f(x)=\frac{1}{2} a(t) x^2(t)+1$$$, so $$$f'(x_n)=a(t)x_n(t)$$$. Since exp, multiplication and division all take $$$O(n \log n)$$$ time, we get the recurrence $$$T(n)=T(n/2) + O(n\log n)$$$, which gives $$$O(n\log n)$$$ in total.

Note: I think we can generalize this method to higher-order ODEs or systems of ordinary differential equations, but the constant factor would be much worse.

Gnutella Chessmaster in MIPT contest Ptz camp 2018, Summer

Statement

​ I think the author of this problem is Endagorion.

This is a simplified version of this problem: we define the weight of a sequence $$$a_1, a_2, \dots, a_k$$$ as $$$\prod_{i=1}^k (a_i+i)$$$. You are given a sequence $$$c_1, c_2, \dots, c_n$$$. Compute the sum of the weights of all subsequences of $$$c$$$ of length $$$k$$$, for each $$$k=1, 2, \dots, n$$$.

You can submit this problem here: Little Q's sequence. However, there you only need to compute the sum over all subsequences.

Solution

First, we consider the naive DP: $$$f_{i,j}=f_{i-1,j} + f_{i-1,j-1} \cdot (j+c_i)$$$. Let $$$g_{i,j} = f_{i,i-j}$$$ and $$$b_i=c_i+i$$$; then $$$g_{i,j}=g_{i-1,j-1} + g_{i-1,j} (b_i-j)$$$.

​ Let $$$g_i(x)$$$ be the OGF of $$$g_i$$$, we have $$$g_i(x)=x g_{i-1}(x)+b_i g_{i-1}(x) - x g'_{i-1}(x)=x(g_{i-1}(x)-g'_{i-1}(x))+b_i g_{i-1}(x)$$$.

$$$g_i(x) e^{-x} = x (g_{i-1}(x) e^{-x} - g'_{i-1}(x) e^{-x}) + b_i g_{i-1}(x) e^{-x} = -x (g_{i-1}(x)e^{-x})' + b_i g_{i-1}(x) e^{-x}$$$

Let $$$h_i=g_i(x) e^{-x}$$$; then $$$h_i = -x {h'_{i-1}}+b_i h_{i-1}$$$, so $$$h_{i,j}=-j h_{i-1,j}+b_i h_{i-1,j}=(b_i-j) h_{i-1,j}$$$. Thus $$$h_{n,j}=h_{0,j} \prod_{i=1}^n (b_i-j)$$$, where $$$h_{0,j}=\frac{(-1)^j}{j!}$$$. It's a multipoint evaluation of the polynomial $$$P(x)=\prod_{i=1}^n (b_i-x)$$$ at the points $$$0,1,\dots,n$$$, which can be solved in $$$O(n\log ^2 n)$$$.
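A small consistency check of this closed form against the naive DP, on an arbitrary tiny input and with exact rationals (this is only a sanity check, not the multipoint-evaluation algorithm):

```python
# Compare the naive DP with h_{n,j} = h_{0,j} * prod_i (b_i - j), h_{0,j} = (-1)^j / j!,
# using f_{n,k} = g_{n,n-k} and g_n(x) = h_n(x) e^x.
from fractions import Fraction
from math import factorial

def naive_dp(c):
    n = len(c)
    f = [[0] * (n + 1) for _ in range(n + 1)]
    f[0][0] = 1
    for i in range(1, n + 1):
        for j in range(n + 1):
            f[i][j] = f[i - 1][j] + (f[i - 1][j - 1] * (j + c[i - 1]) if j else 0)
    return f[n]                                      # f[n][k] = answer for length k

def via_evaluation(c):
    n = len(c)
    b = [c[i] + i + 1 for i in range(n)]             # b_i = c_i + i (1-indexed)
    h = [Fraction((-1) ** j, factorial(j)) for j in range(n + 1)]
    for j in range(n + 1):                           # h_{n,j} = h_{0,j} * prod (b_i - j)
        for bi in b:
            h[j] *= bi - j
    # g_n(x) = h_n(x) e^x, so f_{n,k} = g_{n,n-k} = sum_j h_{n,j} / (n-k-j)!
    return [sum(h[j] / factorial(n - k - j) for j in range(n - k + 1))
            for k in range(n + 1)]

c = [3, 1, 4, 1, 5]                                  # hypothetical input
print(naive_dp(c))
print(via_evaluation(c))                             # same values
```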

If we only need to compute the total sum, there is also an alternative way. We have $$$g_i=(x - x \frac{d}{dx} +b_i) g_{i-1}(x)$$$. Let $$$P(x) = \prod_{i=1}^n (x+b_i)=\sum_{k=0}^n p_k x^k$$$. Then $$$g_n=\sum_{k=0}^n p_k (x-x\frac{d}{dx})^k(1)$$$. By induction, $$$(x-x\frac{d}{dx})^k (1)=\sum_{j=0}^k \lbrace ^k_j \rbrace (-1)^{k-j} x^j$$$. Let $$$q_k=\sum_{j=0}^{k} \lbrace ^k_j \rbrace (-1)^{k-j}$$$, so the total sum is $$$g_n(1)=\sum_{k=0}^n p_k q_k$$$. According to OEIS, $$$\sum_{k\geq 0} q_k \frac{x^k}{k!}=e^{1-e^{-x}}$$$ (these are the complementary Bell numbers up to sign). Thus we can compute all $$$q_k$$$ in $$$O(n\log n)$$$ with one power-series exp.
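A quick numeric check of this EGF identity over exact rationals (assuming nothing beyond the definitions above; the bound $$$N$$$ is arbitrary):

```python
# Check q_k = sum_j S(k, j) (-1)^{k-j} against the EGF exp(1 - e^{-x}),
# where S(k, j) are Stirling numbers of the second kind.
from fractions import Fraction
from math import factorial

N = 10
S = [[0] * (N + 1) for _ in range(N + 1)]            # Stirling numbers of the 2nd kind
S[0][0] = 1
for k in range(1, N + 1):
    for j in range(1, k + 1):
        S[k][j] = j * S[k - 1][j] + S[k - 1][j - 1]
q = [sum(S[k][j] * (-1) ** (k - j) for j in range(k + 1)) for k in range(N + 1)]

# EGF coefficients of E = exp(A) with A = 1 - e^{-x}, via E' = A' E.
a = [Fraction(0)] + [-Fraction((-1) ** k, factorial(k)) for k in range(1, N + 1)]
e = [Fraction(1)] + [Fraction(0)] * N
for k in range(1, N + 1):
    e[k] = sum(j * a[j] * e[k - j] for j in range(1, k + 1)) / k

print(q)
print([e[k] * factorial(k) for k in range(N + 1)])   # matches q
```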

​ Note: Due to the special property, the original version can also be solved in $$$O(n \log n)$$$ by a combinatorial method.

Exp in Gennady Korotkevich Contest 5 By tourist

Statement

You are given a polynomial $$$P(x)$$$; you need to find the first $$$m$$$ coefficients of $$$Q(x)=P^n (x)$$$ in $$$O(m|P|)$$$.

Solution

There are many similar problems, such as Lucky Tickets, or computing Catalan numbers and large Schröder numbers. We will also discuss the relation between ODEs and P-recursive sequences further in a later part.

​ We have

$$$(P^{n+1})'=(n+1) P^n P' = (P^n P)'=(P^n)'P + P^n P'$$$

Thus $$$n Q P'= Q'P$$$. Comparing the coefficients of $$$x^i$$$ on both sides, we have

$$$n(q_i p_1 + 2 q_{i-1} p_2 +\dots + k q_{i-k+1} p_k) = (i+1) q_{i+1} p_0 + iq_ip_1 + \dots + (i-k+1) q_{i-k+1} p_k$$$

​ We can compute $$$q_{i+1}$$$ from $$$q_{i-k+1}, q_{i-k+2}, \dots, q_{i}$$$ in $$$O(k)$$$.
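A minimal sketch of this recurrence over exact rationals, assuming $$$p_0\neq 0$$$ and a positive integer $$$n$$$; a real implementation would work modulo a prime:

```python
# Recurrence derived from n Q P' = Q' P:
#   (i+1) p_0 q_{i+1} = sum_{j=1..k} (n*j - (i+1-j)) p_j q_{i+1-j}
from fractions import Fraction

def power_prefix(p, n, m):
    """First m coefficients of P(x)^n, where p lists the coefficients of P."""
    k = len(p) - 1                          # degree of P
    p = [Fraction(c) for c in p]
    q = [Fraction(0)] * m
    q[0] = p[0] ** n
    for i in range(m - 1):
        s = sum((n * j - (i + 1 - j)) * p[j] * q[i + 1 - j]
                for j in range(1, k + 1) if j <= i + 1)
        q[i + 1] = s / ((i + 1) * p[0])
    return q

print(power_prefix([1, 2, 3], 10, 6))       # first 6 coefficients of (1 + 2x + 3x^2)^10
```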

BTW, a more intuitive way to derive this formula is to take $$$\ln$$$ of both sides and then differentiate: $$$\ln Q=n\ln P \Rightarrow Q'/Q=nP'/P$$$.

Note: this method also works when $$$n$$$ is not an integer. For example, you can compute $$$\sqrt{P(x)}$$$ in the same manner. Another example: compute $$$G=\sum_{i=0}^k \frac{F^i}{i!}$$$. Then $$$G'=F'(G-\frac{F^k}{k!})$$$; we can compute $$$F^k$$$ using the above method and then solve this equation in a similar way.

Dirichlet $$$k$$$-th root in EC Final By jqdai0815

Statement

You are given a number-theoretic function $$$g$$$ and an integer $$$k$$$; you need to find a function $$$f$$$ such that $$$g=f^k$$$, where the multiplication is Dirichlet convolution.

Solution

The intended solution is $$$O(n\log^2 n)$$$. The improved solution below is due to _rqy and Elegia.

Let $$$F(s)=\sum_{n\geq 1} \frac{f(n)}{n^s}$$$ be the Dirichlet generating function of $$$f$$$, and denote the Dirichlet generating function of $$$g$$$ by $$$G(s)$$$. Since $$$F'(s)=-\sum_{n\geq 1} \frac{ f(n) \ln n}{n^s}$$$, differentiation acts on coefficients as $$$f'(n)=f(n) \ln n$$$, up to a sign that appears on both sides below and can be ignored.

Using the argument of the previous problem, we have $$$k F' G=F G'$$$. Comparing the $$$n$$$-th coefficients, we get $$$\sum_{d|n} f'(d) g(\frac{n}{d})=\frac{1}{k} \sum_{d|n} f(d) g'(\frac{n}{d})$$$. Since $$$g(1)=1, g'(1)=0$$$, we can compute $$$f'(n)$$$ and then $$$f(n)$$$ for all $$$n$$$ in $$$O(n\log n)$$$ total.

The remaining problem is how to deal with $$$\ln n$$$ in modular arithmetic. Intuitively, we can replace $$$\ln n$$$ with $$$\Omega(n)$$$, the number of prime factors of $$$n$$$ counted with multiplicity, since it is completely additive just like $$$\ln n$$$. A rigorous argument: define a transformation $$$T$$$ of $$$f$$$ by $$$(Tf)(n) = \Omega(n) f(n)$$$; then $$$T (f*g)=(Tf)*g+f*(Tg)$$$, so we can replace the derivative of the DGF in the above argument with this transformation.
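A sketch of the whole procedure over exact rationals, assuming $$$g(1)=1$$$; the divisor loops are written quadratically for brevity (enumerating multiples sieve-style gives the claimed $$$O(n\log n)$$$), and the final check with the all-ones function is just a hypothetical test:

```python
# Dirichlet k-th root via k * f'(n) = sum_{d|n, d<n} ( f(d) g'(n/d) - k f'(d) g(n/d) ),
# where f'(n) = f(n) * Omega(n) plays the role of f(n) * ln n.
from fractions import Fraction

def dirichlet_kth_root(g, k):
    N = len(g) - 1                                   # g[1..N]; g[0] is unused
    omega = [0] * (N + 1)                            # Omega(n), via a sieve
    for p in range(2, N + 1):
        if omega[p] == 0:                            # p is prime
            for q in range(p, N + 1, p):
                m = q
                while m % p == 0:
                    omega[q] += 1
                    m //= p
    f = [Fraction(0)] * (N + 1)
    fd = [Fraction(0)] * (N + 1)                     # fd[n] = f(n) * Omega(n)
    gd = [Fraction(g[n]) * omega[n] for n in range(N + 1)]
    f[1] = Fraction(1)                               # g(1) = 1, take f(1) = 1
    for n in range(2, N + 1):
        s = sum(f[d] * gd[n // d] - k * fd[d] * g[n // d]
                for d in range(1, n) if n % d == 0)
        fd[n] = s / k
        f[n] = fd[n] / omega[n]
    return f

def dirichlet_mul(a, b):                             # (a * b)(n) = sum_{d|n} a(d) b(n/d)
    N = len(a) - 1
    c = [Fraction(0)] * (N + 1)
    for i in range(1, N + 1):
        for j in range(1, N // i + 1):
            c[i * j] += Fraction(a[i]) * b[j]
    return c

# hypothetical check: g = f0^3 for the all-ones function f0; the root should be f0
N, k = 30, 3
f0 = [0] + [1] * N
g = f0
for _ in range(k - 1):
    g = dirichlet_mul(g, f0)
print(dirichlet_kth_root(g, k)[1:11])                # [1, 1, 1, ...]
```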

Rhyme By Elegia

Statement

Compute the sum of $$$\frac{n!}{\prod_{i=1}^k x_i!}$$$ over all tuples with $$$\sum_{i=1}^k x_i=n$$$ and $$$d \mid x_i$$$ for all $$$i$$$ $$$(1\leq i\leq k)$$$.

​ $$$n\leq 10^9, k\leq 2000, d=6$$$

Solution

The EGF of each term is $$$\sum_{j\geq 0} \frac{x^{jd}}{(jd)!}=\frac{1}{d}\sum_{j=0}^{d-1} e^{\omega^j x}$$$, where $$$\omega$$$ is a primitive $$$d$$$-th root of unity. Let $$$f(x)= \left(\frac{1}{d}\sum_{j=0}^{d-1} e^{\omega^j x}\right)^k$$$.

Since $$$\omega^2=\omega-1$$$ (for $$$d=6$$$), we have $$$f(x)=\sum c_{a,b} e^{(a+b\omega)x}$$$ where $$$-k\leq a,b\leq k$$$. Then we can sum $$$c_{a,b} (a+b\omega)^n$$$ to get the answer. The question is how to compute $$$c_{a,b}$$$ efficiently. Since $$$1$$$ and $$$\omega$$$ are linearly independent, we can regard $$$\frac{1}{d}\sum_{j=0}^{d-1} e^{\omega^j x}$$$ as a bivariate polynomial $$$F(u,v)$$$ where $$$u=e^x, v=e^{\omega x}$$$. We need to compute $$$G(u,v)=F(u,v)^k$$$. If we use FFT directly, the time complexity is $$$O(k^2 \log k)$$$, which might not be fast enough. However, we have $$$F \frac{\partial G}{\partial u}=k \frac{\partial F}{\partial u}G$$$, so we can compute $$$G$$$ in $$$O(k^2)$$$ in the same manner. We omit the details here.

Discussion about ODE and P-recursive sequence

It is known that every algebraic generating function is D-finite, and that its coefficients form a P-recursive sequence.

First, there are some known results from Enumerative Combinatorics, Volume 2. If $$$w$$$ is a formal power series over $$$K$$$, let $$$V_w$$$ denote the vector space over $$$K(x)$$$ spanned by $$$w, w', \dots$$$; then $$$w$$$ is D-finite (written $$$w \in \mathcal{D}$$$) exactly when $$$\dim V_w$$$ is finite.

  • If $$$u,v \in \mathcal{D}$$$, then $$$w=au+bv \in \mathcal{D}$$$, $$$\dim V_w \leq \dim V_u+\dim V_v$$$

  • If $$$u,v \in \mathcal{D}$$$, then $$$w=uv \in \mathcal{D}$$$, $$$\dim V_w \leq \dim V_u\dim V_v$$$

  • If $$$u,v \in \mathcal{D}$$$, then $$$w=u * v \in \mathcal{D}$$$, $$$\dim V_w \leq \dim V_u\dim V_v$$$, $$$u* v$$$ is the Hadamard product here.

  • If $$$u \in \mathcal{D}$$$, $$$v$$$ is algebraic and $$$v(0)=0$$$, then $$$w=u(v(x)) \in \mathcal{D}$$$

Thus we can construct ODEs for the generating functions of P-recursive sequences. Here are some examples:

  • $$$g_1(x)=\sum_{i\geq 0} \frac{x^i}{i!} \Rightarrow g_1 = g'_1$$$

  • $$$g_3(x)=\sum_{i\geq 0} x^i i! \Rightarrow g_3=g'_3x^2+g_3x+1$$$

  • $$$g_4(x)=\sum_{i\geq 0} \frac{x^i}{i!(i+k)!} \Rightarrow g_4=g''_4x+(k+1)g'_4$$$

  • $$$g_5(x)=\sum_{n\geq 0} \frac{1}{(k-1)n+1} {kn \choose n} x^{(k-1)n+1} \Rightarrow g_5=\frac{k x g'_5}{1+(k-1) g'_5}$$$

  • $$$g_6(x) = (1+x)^a (1-x)^b \Rightarrow g'_6 = \frac{ag_6}{1+x} - \frac{bg_6}{1-x}$$$

  • $$$g_8(x)=f^n \Rightarrow ng_8 f'= f g'_8$$$

As further examples, we can also construct ODEs for the truncations of P-recursive series (a small verification sketch follows the list below):

  • $$$g_2(x) = \sum_{i=0}^k \frac{x^i}{i!} \Rightarrow g_2=g'_2+\frac{x^k}{k!}$$$

  • $$$g_7(x)=\sum_{i=0}^k {n\choose i} a^i b^{n-i} \Rightarrow ng_7(a'+b')-g'_7(a+b)=n{n-1 \choose k} (a' a^k b^{n-k}-b'a^{k+1}b^{n-k-1})$$$
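A small verification of two of the listed identities ($$$g_3$$$ and $$$g_2$$$) by comparing coefficients over exact rationals; $$$N$$$ and $$$k$$$ are arbitrary:

```python
# Check g_3 = x^2 g_3' + x g_3 + 1 for g_3 = sum_i i! x^i, and
# g_2 = g_2' + x^k / k! for the truncated exponential g_2.
from fractions import Fraction
from math import factorial

N, k = 12, 5

def deriv(a):                                        # coefficients of a'(x), padded
    return [Fraction(i + 1) * a[i + 1] for i in range(len(a) - 1)] + [Fraction(0)]

g3 = [Fraction(factorial(i)) for i in range(N)]
d3 = deriv(g3)
print(all(g3[i] == (d3[i - 2] if i >= 2 else 0)      # x^2 g_3'
                 + (g3[i - 1] if i >= 1 else 0)      # x g_3
                 + (1 if i == 0 else 0)              # + 1
          for i in range(N)))                        # True

g2 = [Fraction(1, factorial(i)) if i <= k else Fraction(0) for i in range(N)]
d2 = deriv(g2)
print(all(g2[i] == d2[i] + (Fraction(1, factorial(k)) if i == k else 0)
          for i in range(N)))                        # True
```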

Universal Judge Aircraft in UOJ Round 19 By ridiculos

Statement

There are $$$n$$$ variables $$$x_1, x_2, \dots, x_n$$$ and two given parameters $$$a,b$$$ $$$(a>b)$$$. Initially, all variables are set to $$$0$$$.

In each step, you choose a variable $$$x_i$$$ uniformly at random among the variables that are less than $$$a$$$, and increase it by $$$1$$$. You terminate the process when all variables are at least $$$b$$$. Compute the expected number of variables that are equal to $$$a$$$ at the end.

​ $$$n, a, b \leq 250$$$

Solution

The intended solution is $$$O(na^2 \log n)$$$. The improved solution below is due to djq_cpp.

We omit the routine generating-function manipulations here. The hardest part of this problem is the following: let $$$P(x)=\sum_{i=0}^{b-1} \frac{x^i}{i!}$$$; you need to compute $$$P(x), P^2(x), \dots, P^{n-1}(x)$$$. If we use FFT directly, the time complexity is $$$O(na^2 \log n)$$$. Since the degree of $$$P$$$ is not small, the method from the problem Exp does not help much here.

Let $$$R=\frac{x^{b-1}}{(b-1)!}, S=P^{k-1}, T=P^k, U=RS$$$. We notice that $$$P'=P-\frac{x^{b-1}}{(b-1)!}=P-R$$$. Thus $$$T'=kP'S=k(P-R)S=kT-kU \Rightarrow (i+1)t_{i+1}=k (t_i-u_i)$$$. Since $$$U$$$ can be computed from $$$S$$$ in linear time, $$$P^k$$$ can be computed from $$$P^{k-1}$$$ in linear time, which removes the $$$O(\log n)$$$ factor.
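A sketch of this linear-time step over exact rationals, checked against naive truncated products; the truncation degree $$$D$$$, the value of $$$b$$$ and the number of powers are arbitrary test values:

```python
# Step from S = P^{k-1} to T = P^k via (i+1) t_{i+1} = k (t_i - u_i),
# where u_i = s_{i-b+1} / (b-1)! and P = sum_{i<b} x^i / i!.
from fractions import Fraction
from math import factorial

def next_power(s, k, b, D):
    """Given the first D+1 coefficients of P^(k-1), return those of P^k."""
    t = [Fraction(0)] * (D + 1)
    t[0] = Fraction(1)                               # P(0) = 1, hence P^k(0) = 1
    for i in range(D):
        u_i = s[i - b + 1] / factorial(b - 1) if i >= b - 1 else Fraction(0)
        t[i + 1] = Fraction(k, i + 1) * (t[i] - u_i)
    return t

def naive_power(p, k, D):                            # reference: repeated truncated products
    t = [Fraction(1)] + [Fraction(0)] * D
    for _ in range(k):
        t = [sum(t[j] * p[i - j] for j in range(i + 1)) for i in range(D + 1)]
    return t

b, D = 4, 12
P = [Fraction(1, factorial(i)) if i < b else Fraction(0) for i in range(D + 1)]
S = [Fraction(1)] + [Fraction(0)] * D                # P^0 = 1
for k in range(1, 6):
    S = next_power(S, k, b, D)
    assert S == naive_power(P, k, D)
print(S[:6])                                         # first coefficients of P^5
```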

Generalization and Discussion

I have been thinking about generalizing this technique to other problems, but I have not found striking results. The following is a discussion with Rewinding.

For example, let $$$P(x)=\sum_{i=0}^n \frac{x^i}{i!i!}$$$; we can only get a similar recurrence for the terms $$$P^{i} (P')^j$$$. If we need to compute $$$P^1,P^2, \dots, P^k$$$, we need all terms $$$P^i (P')^j$$$ with $$$i+j\leq k$$$, so the time complexity is $$$O(k^2 \cdot nk)$$$. It may perform well when $$$k$$$ is extremely small and $$$n$$$ is very large. There is another question: I conjecture that the $$$in$$$-th, $$$(in+1)$$$-th, $$$\dots$$$, $$$i(n+1)$$$-th terms of $$$P^k$$$ form a P-recursive sequence. I do believe it is P-recursive, but I don't know how large the degree and the order can be.

Another direction of generalization is to compute the first $$$n$$$ terms of the $$$k$$$-th power of $$$P(x)=\sum_{i=0}^a \frac{x^i}{i!}$$$ more efficiently. I don't think we can improve it to linear time, but it might be possible to improve it to $$$O(n \log a)$$$, or to $$$O(n\log n)$$$ without $$$\exp$$$ and $$$\ln$$$.

Chinese Elephant Chess By djq_cpp

Statement

Find the number of binary matrices with $$$n$$$ rows and $$$m$$$ columns such that there are at most $$$2$$$ ones in each row and each column.

​ $$$n,m \leq 5\times 10^4$$$.

Solution

You can regard such a matrix as a bipartite graph with $$$n$$$ vertices in one part and $$$m$$$ vertices in the other. The degree of every vertex is at most $$$2$$$, so the graph consists of even cycles, even chains, and odd chains.

The EGFs of even cycles, even chains and odd chains are $$$F=\sum_{i\geq 2} \frac{1}{2i} x^iy^i$$$, $$$G=\sum_{i\geq 1} x^iy^i$$$, and $$$H=\frac{1}{2}\left(\sum_{i>0} x^{i+1} y^i + \sum_{i>0} x^i y^{i+1}\right)+x+y$$$, respectively.

​ And the answer is $$$n!m![x^ny^m] e^{F+G+H}$$$.

It's hard to deal with bivariate EGFs, so we need to transform them into univariate EGFs. Since the degrees of $$$x$$$ and $$$y$$$ are the same in the EGFs of even cycles and even chains, we can transform them directly. For odd chains, we know the number of chains lying in the left part is less than the number lying in the right part by exactly $$$m-n$$$, so we can enumerate the number of odd chains in the left part and transform them into a univariate EGF.

The new EGFs are $$$F=\sum_{i\geq 2} \frac{1}{2i} x^i$$$, $$$G=\sum_{i\geq 1} x^i$$$, and $$$H=x+\frac{1}{2} \sum_{i\geq 2}x^i$$$.

​ The answer is $$$n!m! [x^m] e^{F+G} \sum_{i=0}^n \frac{H^iH^{m-n+i}}{x^i i! (m-n+i)!}$$$. Let $$$P(x)=\sum_{i\geq 0} \frac{x^i}{i! (m-n+i)!}, Q=\frac{H^2}{x}$$$. The formula can be simplified to $$$n!m![x^m] e^{F+G} H^{m-n} P(Q)$$$.

We know the coefficients of $$$P$$$ are P-recursive, so we can construct an ODE for $$$P$$$: $$$P=P''x+(m-n+1)P'$$$.

We also have $$$(P(H))'=P'(H)H'$$$ and $$$(P(H))''=P''(H)H'^2 + P'(H)H''$$$, so $$$P'(H)=\frac{(P(H))'}{H'}$$$ and $$$P''(H)=\frac{(P(H))''H'-(P(H))'H''}{H'^3}$$$.

Then we can get the ODE of $$$P(H)$$$: $$$P(H)H'^3=(P(H))'((m-n+1) H'^2 - H''H) + (P(H))''H'H$$$, which has the form $$$P(H)\cdot A=(P(H))' \cdot B+(P(H))'' \cdot C$$$, where $$$A,B,C$$$ are known series. So we can use D&C and FFT to solve this equation in $$$O(n\log^2 n)$$$.
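A quick sanity check of this ODE over exact rationals, using naive quadratic series arithmetic and small hypothetical values of $$$m$$$ and $$$n$$$:

```python
# Verify P(H) H'^3 = (P(H))' ((m-n+1) H'^2 - H'' H) + (P(H))'' H' H coefficientwise,
# with H = x + (1/2) sum_{i>=2} x^i and P(y) = sum_i y^i / (i! (m-n+i)!).
from fractions import Fraction
from math import factorial

N, m, n = 16, 7, 4
d = m - n

def mul(a, b):                                       # truncated product of length-N series
    c = [Fraction(0)] * N
    for i, ai in enumerate(a):
        for j in range(N - i):
            c[i + j] += ai * b[j]
    return c

def deriv(a):
    return [Fraction(i + 1) * a[i + 1] for i in range(N - 1)] + [Fraction(0)]

H = [Fraction(0), Fraction(1)] + [Fraction(1, 2)] * (N - 2)
PH = [Fraction(0)] * N                               # P(H) by naive composition
power = [Fraction(1)] + [Fraction(0)] * (N - 1)
for i in range(N):
    coef = Fraction(1, factorial(i) * factorial(d + i))
    PH = [PH[j] + coef * power[j] for j in range(N)]
    power = mul(power, H)

H1, H2 = deriv(H), deriv(deriv(H))
P1, P2 = deriv(PH), deriv(deriv(PH))
lhs = mul(PH, mul(H1, mul(H1, H1)))
rhs_b = [Fraction(d + 1) * u - v for u, v in zip(mul(H1, H1), mul(H2, H))]
rhs = [u + v for u, v in zip(mul(P1, rhs_b), mul(P2, mul(H1, H)))]
print(all(lhs[i] == rhs[i] for i in range(N - 3)))   # True (last indices lack precision)
```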

Note that $$$H=\frac{1}{2}\left(x+\frac{x}{1-x}\right)$$$. If we multiply both sides of the ODE of $$$P(H)$$$ by $$$(1-x)^6$$$, we get a polynomial recurrence for the coefficients of $$$P(H)$$$ with small order and degree, so $$$P(H)$$$ can be computed in $$$O(n)$$$ here. It's not hard to see that $$$H^{m-n}$$$ and $$$e^{F+G}$$$ are also P-recursive, so their convolution should also be P-recursive, and the problem can be solved in linear time. But I think the order and degree will be too large if we try to find a recurrence for the answer directly, so a more effective way might be to solve these recurrences separately and combine the results with FFT.

Generalization and Discussion

It seems that polynomial modular composition can be computed in $$$\tilde{O}(n)$$$ in theory, but I don't know whether it's practical. The most popular algorithm runs in $$$O(n^2 + n^{1.5} \log n)$$$. You can also read the discussion and implementation of the Brent–Kung algorithm here; however, it is a bit slower than the $$$O(n^2)$$$ one when $$$n=20000$$$.

It's not hard to see that the above method can be applied to compute the polynomial composition $$$F(G(x))$$$ when $$$F$$$ is D-finite: we construct the ODE of $$$F(G(x))$$$ and solve it using D&C and FFT in $$$O(n\log ^2 n)$$$. However, when the ODE of $$$F$$$ is too complicated, the constant factor becomes a serious issue.

Combined with Newton's method, we can also compute the compositional inverse of a D-finite series. A good application is to compute the number of simple permutations (a.k.a. NEERC18 problem I) of length $$$1,2, \dots, n$$$ in $$$O(n\log^2 n)$$$.

Pearl in CTSC 2019 By laofudasuan

Statement

​ Compute $$$\frac{n!}{2^d} [x^n]\sum_{p=0}^{k} {m \choose p} (e^x-e^{-x})^p (e^x+e^{-x})^{m-p}$$$.

​ $$$n\leq 10^9, m\leq 10^5$$$

Solution

This solution is due to djq_cpp.

Let $$$Q(x)=\sum_{p=0}^k {m \choose p} (x-1)^p (x+1)^{m-p}$$$. By the truncated-binomial identity above (with $$$a=x-1$$$, $$$b=x+1$$$), $$$2mQ-2xQ'=m{m-1 \choose k}\left((x-1)^k(x+1)^{m-k}-(x-1)^{k+1}(x+1)^{m-k-1}\right)$$$, which simplifies to $$$mQ-xQ'=m{m-1 \choose k} (x-1)^k(x+1)^{m-k-1}$$$.

The right-hand side is P-recursive, thus $$$Q$$$ can be computed in $$$O(m)$$$. The answer is then $$$\frac{n!}{2^d} [x^n] Q(e^{2x}) e^{-mx}$$$, so the whole problem can be solved in $$$O(m(1+\log_m n))$$$.
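A quick sanity check of this identity by direct expansion with exact integer arithmetic, over small hypothetical values of $$$m$$$ and $$$k$$$:

```python
# Check m*Q - x*Q' = m*C(m-1, k)*(x-1)^k*(x+1)^(m-k-1) for small m, k.
from math import comb

def mul(a, b):
    c = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            c[i + j] += ai * bj
    return c

def poly_pow(a, e):
    r = [1]
    for _ in range(e):
        r = mul(r, a)
    return r

def check(m, k):
    Q = [0] * (m + 1)
    for p in range(k + 1):
        term = mul(poly_pow([-1, 1], p), poly_pow([1, 1], m - p))   # (x-1)^p (x+1)^(m-p)
        for i, c in enumerate(term):
            Q[i] += comb(m, p) * c
    lhs = [m * Q[i] - i * Q[i] for i in range(m + 1)]               # m*Q - x*Q'
    rhs_poly = mul(poly_pow([-1, 1], k), poly_pow([1, 1], m - k - 1))
    rhs = [m * comb(m - 1, k) * c for c in rhs_poly] + [0] * (m + 1 - len(rhs_poly))
    return lhs == rhs[:m + 1]

print(all(check(m, k) for m in range(2, 9) for k in range(m)))      # True
```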

Note: we need to compute $$$1^n, 2^n, \dots, m^n$$$, but $$$i \mapsto i^n$$$ is completely multiplicative, so with a sieve we only need fast exponentiation at the primes, which takes $$$O(m\log_m n)$$$.
