Introduction
Hello. I would like to share a nice tool called the Sherman-Morrison formula. Knowing this formula makes it easier to understand how the BFGS update works in convex optimization and machine learning, but in this post we will simply focus on the formula itself and learn how to compute the inverse of a rank-1 update of an invertible matrix faster.
The motivation for this post is that a problem of mine using the Sherman-Morrison formula was rejected by multiple coordinators.
Definition
Let $$$A$$$ be an invertible $$$\mathbb{R}^{n \times n}$$$ matrix and let $$$u, v \in \mathbb{R}^{n}$$$ be vectors. Then $$$ (A + uv^{T})^{-1} = A^{-1} - \frac{A^{-1}uv^{T}A^{-1}}{v^{T}A^{-1}u + 1}$$$. If $$$v^{T}A^{-1}u = -1$$$, then $$$A + uv^{T}$$$ is non-invertible.
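A tiny sanity check (my own example with $$$n = 2$$$): take $$$A = I$$$, $$$u = e_{1}$$$, $$$v = e_{2}$$$. Then $$$A + uv^{T} = \begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix}$$$, $$$v^{T}A^{-1}u = 0$$$, and the formula gives $$$I - e_{1}e_{2}^{T} = \begin{bmatrix} 1 & -1 \\ 0 & 1 \end{bmatrix}$$$, which is indeed the inverse.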
Time complexity
You can evaluate the Sherman-Morrison formula in $$$O(n^2)$$$, because $$$A^{-1}u$$$ and $$$v^{T}A^{-1}$$$ are vectors and $$$v^{T}A^{-1}u$$$ is a scalar, so the update only needs matrix-vector products and one outer product.
Normally you need $$$O(n^3)$$$ time to compute an inverse matrix, except for some special matrices (diagonal matrices etc.). But since the Sherman-Morrison formula can be evaluated in $$$O(n^2)$$$, if a matrix can be represented as a rank-$$$1$$$ update of such a special matrix, you can compute its inverse in $$$O(n^2)$$$.
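Here is a minimal sketch of that update in C++ (my own code, not taken from any particular problem): it assumes $$$A^{-1}$$$ is already stored as a dense `vector<vector<double>>`, and the `1e-12` singularity tolerance is an arbitrary choice.

```cpp
#include <bits/stdc++.h>
using namespace std;

// Overwrites Ainv (currently A^{-1}) with (A + u v^T)^{-1} in O(n^2)
// using the Sherman-Morrison formula. Returns false when the updated
// matrix is (numerically) singular, i.e. 1 + v^T A^{-1} u is ~0.
bool shermanMorrison(vector<vector<double>>& Ainv,
                     const vector<double>& u, const vector<double>& v) {
    int n = Ainv.size();
    vector<double> Au(n, 0.0), vA(n, 0.0);      // Au = A^{-1} u, vA = v^T A^{-1}
    for (int i = 0; i < n; i++)
        for (int j = 0; j < n; j++) {
            Au[i] += Ainv[i][j] * u[j];
            vA[j] += v[i] * Ainv[i][j];
        }
    double denom = 1.0;                          // denom = 1 + v^T A^{-1} u
    for (int i = 0; i < n; i++) denom += v[i] * Au[i];
    if (fabs(denom) < 1e-12) return false;
    for (int i = 0; i < n; i++)                  // rank-1 correction of the inverse
        for (int j = 0; j < n; j++)
            Ainv[i][j] -= Au[i] * vA[j] / denom;
    return true;
}
```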
Generalization
There is a generalization of this formula called Woodbury Matrix Identity.
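For reference, with $$$U \in \mathbb{R}^{n \times k}$$$, $$$C \in \mathbb{R}^{k \times k}$$$ and $$$V \in \mathbb{R}^{k \times n}$$$ (all of the inverses below assumed to exist), it states that $$$(A + UCV)^{-1} = A^{-1} - A^{-1}U(C^{-1} + VA^{-1}U)^{-1}VA^{-1}$$$. The Sherman-Morrison formula is the special case $$$k = 1$$$, $$$C = 1$$$, $$$U = u$$$, $$$V = v^{T}$$$.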
Sample problem
This is the problem that was rejected.
You have matrix $$$A = \begin{bmatrix} 1 & 2 & \cdots & n \\ n+1 & n+2 & \cdots & 2n \\ \vdots & \vdots & \ddots & \vdots \\ n^{2}-n+1 & n^{2}-n+2 & \cdots & n^{2} \end{bmatrix} + \operatorname{diag}(c_{1}, c_{2}, \ldots, c_{n})$$$.
In the $$$i$$$-th query, a row or a column of $$$A$$$ is chosen and multiplied by $$$x_{i}$$$. After each query, compute $$$A^{-1}$$$ if $$$A$$$ is invertible.
Constraints: $$$2 \le n \le 200$$$, $$$1 \le q \le 200$$$, $$$0 \lt c_{i}$$$, $$$x_{i} \neq 0$$$.
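One way to attack it with the formula above (just a sketch under my own reading of the statement, not an official editorial): the fixed part of $$$A$$$ has entry $$$(i, j)$$$ equal to $$$(i-1)n + j$$$, so $$$A = \operatorname{diag}(c_{1}, \ldots, c_{n}) + a\mathbf{1}^{T} + \mathbf{1}b^{T}$$$ with $$$a_{i} = (i-1)n$$$, $$$b_{j} = j$$$ and $$$\mathbf{1}$$$ the all-ones vector. So $$$A$$$ is a rank-$$$2$$$ update of a diagonal matrix, and two applications of Sherman-Morrison (or one application of Woodbury with $$$k = 2$$$) give the initial $$$A^{-1}$$$ in $$$O(n^2)$$$. Multiplying row $$$i$$$ by $$$x$$$ replaces $$$A$$$ with $$$A + (x-1)e_{i}(e_{i}^{T}A)$$$, and multiplying column $$$j$$$ by $$$x$$$ replaces it with $$$A + (x-1)(Ae_{j})e_{j}^{T}$$$; both are rank-$$$1$$$ updates, so each query is again one $$$O(n^2)$$$ application of the formula.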