# Joan Lindsay Orr

$$\newcommand{\CC}{\mathbb{C}} \newcommand{\NN}{\mathbb{N}} \newcommand{\RR}{\mathbb{R}} \newcommand{\TT}{\mathbb{T}} \newcommand{\ZZ}{\mathbb{Z}} \newcommand{\H}{\mathcal{H}} \newcommand{\e}{\epsilon} \newcommand{\x}{\mathbf{x}} \newcommand{\y}{\mathbf{y}}$$

$$\newcommand{\ip}[2]{\langle #1, #2 \rangle} \newcommand{\span}{\text{span}}$$

# Cauchy Interlacing Theorem

## Courant-Fischer Minmax Theorem

Let $$A$$ be an $$n\times n$$ Hermitian matrix and let $$\lambda_1 \le \cdots \le \lambda_n$$ be its eigenvalues. Then $\lambda_k = \min_{\dim(U) = k} \max_{\substack{\|x\|=1 \\ x\in U}} \ip{Ax}{x} \tag{1}$ and $\lambda_k = \max_{\dim(U) = n-k+1} \min_{\substack{\|x\|=1 \\ x\in U}} \ip{Ax}{x} \tag{2}$

Note in particular that the minimum and maximum in (1) and (2) are attained (i.e., they are not merely an infimum and supremum).
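A quick numerical sanity check of (1) (a sketch, assuming `numpy` is available; a real symmetric matrix is used for simplicity, as a special case of Hermitian):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 6
X = rng.standard_normal((n, n))
A = (X + X.T) / 2                      # real symmetric (hence Hermitian) matrix
lam, V = np.linalg.eigh(A)             # eigenvalues ascending, orthonormal eigenvectors

for k in range(1, n + 1):
    # Over U_k = span{e_1,...,e_k}, the max of <Ax, x> with ||x|| = 1 is the
    # top eigenvalue of the compression of A to U_k, which should be lambda_k.
    Q = V[:, :k]
    max_rayleigh = np.linalg.eigvalsh(Q.T @ A @ Q)[-1]
    assert np.isclose(max_rayleigh, lam[k - 1])

    # Any other k-dimensional subspace can only give a larger maximum.
    W = np.linalg.qr(rng.standard_normal((n, k)))[0]
    assert np.linalg.eigvalsh(W.T @ A @ W)[-1] >= lam[k - 1] - 1e-12
```

The compression $$Q^*AQ$$ represents $$A$$ restricted to the subspace spanned by the columns of $$Q$$, so its top eigenvalue is exactly the constrained maximum of the Rayleigh quotient.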

Proof. Let $$\{e_i\}$$ be an orthonormal basis of eigenvectors with $$Ae_i = \lambda_i e_i$$. First, let $$U_k := \span\{e_1, \ldots, e_k\}$$. If $$x\in U_k$$ and $$\|x\|=1$$ then $$x=\sum_{i=1}^k \alpha_i e_i$$ with $$\sum_{i=1}^k |\alpha_i|^2 = 1$$. Thus $\ip{Ax}{x} = \sum_{i=1}^k \lambda_i |\alpha_i|^2 \le \lambda_k \sum_{i=1}^k |\alpha_i|^2 = \lambda_k$ Since $$x\in U_k$$ is arbitrary, $\inf_{\dim(U) = k} \max_{\substack{\|x\|=1 \\ x\in U}} \ip{Ax}{x} \le \max_{\substack{\|x\|=1 \\ x\in U_k}} \ip{Ax}{x} \le \lambda_k$ and the maximum is attained by compactness of the unit ball in $$\CC^n$$.

On the other hand, let $$U$$ be an arbitrary subspace of dimension $$k$$ and consider $$U\cap V_k$$ where $$V_k := \span\{e_k, \ldots, e_n\}$$. The intersection cannot be $$\{0\}$$, for if it were, $$\dim(U+V_k) = k + (n-k+1) = n+1$$, which is impossible. Thus there is a unit vector $$v = \sum_{i=k}^n \alpha_i e_i \in U\cap V_k$$. It follows that $\ip{Av}{v} = \sum_{i=k}^n \lambda_i |\alpha_i|^2 \ge \lambda_k \sum_{i=k}^n |\alpha_i|^2 = \lambda_k$ and so, since $$v\in U$$, $\max_{\substack{\|x\|=1 \\ x\in U}} \ip{Ax}{x} \ge \ip{Av}{v} \ge \lambda_k$ Since $$U$$ was arbitrary, $\inf_{\dim(U) = k} \max_{\substack{\|x\|=1 \\ x\in U}} \ip{Ax}{x} \ge \lambda_k \ge \max_{\substack{\|x\|=1 \\ x\in U_k}} \ip{Ax}{x} \ge \inf_{\dim(U) = k} \max_{\substack{\|x\|=1 \\ x\in U}} \ip{Ax}{x}$ Thus equality holds throughout, and the infimum is attained at $$U_k$$, justifying the stated form.

To prove formula (2), consider the Hermitian matrix $$A' := -A$$. Its eigenvalues in ascending order are $$\lambda'_k := -\lambda_{n-k+1}$$ and so, by (1), $-\lambda_k = \lambda'_{n-k+1} = \min_{\dim(U) = n-k+1} \max_{\substack{\|x\|=1 \\ x\in U}} \ip{A'x}{x} = \min_{\dim(U) = n-k+1} \max_{\substack{\|x\|=1 \\ x\in U}} \ip{-Ax}{x}$ and the result follows by bringing the negative sign to the outside.
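The index bookkeeping in this negation argument can be checked numerically (a sketch, assuming `numpy`): the ascending eigenvalues of $$-A$$ are the negatives of the ascending eigenvalues of $$A$$, taken in reverse order.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((5, 5))
A = (X + X.T) / 2                   # real symmetric (Hermitian) matrix
lam = np.linalg.eigvalsh(A)         # lambda_1 <= ... <= lambda_n
lam_neg = np.linalg.eigvalsh(-A)    # lambda'_1 <= ... <= lambda'_n
# lambda'_k = -lambda_{n-k+1}
assert np.allclose(lam_neg, -lam[::-1])
```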

## Cauchy Interlacing Theorem

An $$m\times m$$ matrix $$B$$ is a principal submatrix of the $$n\times n$$ matrix $$A$$ if it is obtained by deleting $$(n-m)$$ columns and their corresponding rows from $$A$$.

Let $$A$$ be an $$n\times n$$ Hermitian matrix with eigenvalues $$\alpha_1 \le \cdots \le \alpha_n$$ and let $$B$$ be an $$m\times m$$ principal submatrix with eigenvalues $$\beta_1 \le \cdots \le \beta_m$$. Then $\alpha_k \le \beta_k \le \alpha_{n-m+k}$ for $$k=1,2,\dots, m$$.
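The interlacing inequalities are easy to verify numerically on a random example (a sketch, assuming `numpy`; real symmetric for simplicity):

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 8, 5
X = rng.standard_normal((n, n))
A = (X + X.T) / 2                    # real symmetric (Hermitian) matrix
keep = np.sort(rng.choice(n, size=m, replace=False))
B = A[np.ix_(keep, keep)]            # m x m principal submatrix of A

alpha = np.linalg.eigvalsh(A)        # alpha_1 <= ... <= alpha_n
beta = np.linalg.eigvalsh(B)         # beta_1 <= ... <= beta_m
for k in range(m):                   # 0-based: alpha_k <= beta_k <= alpha_{n-m+k}
    assert alpha[k] - 1e-12 <= beta[k] <= alpha[n - m + k] + 1e-12
```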

Proof. Let $$P:\CC^n \rightarrow \CC^m$$ be the orthogonal projection that deletes the coordinates corresponding to the rows and columns removed from $$A$$ to form $$B$$. Then $$B = P A P^*$$. (The "*" is necessary because $$P$$ maps between different spaces and so is not self-adjoint.)
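Concretely, $$P$$ can be realized as the rows of the identity matrix indexed by the coordinates that are kept, and the identity $$B = PAP^*$$ checked directly (a sketch, assuming `numpy`; real entries, so $$P^* = P^T$$):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 6
X = rng.standard_normal((n, n))
A = (X + X.T) / 2
keep = np.array([0, 2, 3, 5])        # coordinates not deleted
P = np.eye(n)[keep]                  # P : C^n -> C^m, drops the other coordinates
B = A[np.ix_(keep, keep)]            # the corresponding principal submatrix
assert np.allclose(P @ A @ P.T, B)   # B = P A P^*
```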

Fix $$1\le k \le m$$ and let $$U$$ be a $$k$$-dimensional subspace of $$\CC^m$$ such that $$\beta_k = \max\{\ip{Bx}{x} \mid x\in U, \|x\| = 1 \}$$; such a subspace exists because the minimum in (1) is attained. Then \begin{align} \beta_k &= \max\{\ip{Bx}{x} \mid x\in U, \|x\| = 1 \} \\ &= \max\{\ip{PAP^*x}{x} \mid x\in U, \|x\| = 1 \} \\ &= \max\{\ip{AP^*x}{P^*x} \mid x\in U, \|x\| = 1 \} \\ &= \max\{\ip{Ay}{y} \mid y\in P^*(U), \|y\| = 1 \} \\ &\ge \alpha_k, \end{align} since $$P^*$$ is an isometry, so $$P^*(U)$$ is a $$k$$-dimensional subspace of $$\CC^n$$ and (1) applies.

Next, let $$U$$ be an $$(m-k+1)$$-dimensional subspace of $$\CC^m$$ such that $$\beta_k = \min\{\ip{Bx}{x} \mid x\in U, \|x\| = 1 \}$$. As above, $$P^*(U)$$ is an $$(m-k+1)$$-dimensional subspace of $$\CC^n$$. Note that $$m-k+1 = n-j+1$$ where $$j=n-m+k$$ and so, letting $$V$$ range over all $$(n-j+1)$$-dimensional subspaces of $$\CC^n$$, we get \begin{align} \alpha_{n-m+k} &= \max_{\dim(V) = n-j+1}\min_{\substack{\|y\|=1 \\ y\in V}}\ip{Ay}{y} \\ &\ge \min_{\substack{\|y\|=1 \\ y\in P^*(U)}}\ip{Ay}{y} \\ &= \min_{\substack{\|x\|=1 \\ x\in U}}\ip{AP^*x}{P^*x} \\ &= \min_{\substack{\|x\|=1 \\ x\in U}}\ip{Bx}{x} \\ &= \beta_k \end{align}

## Singular Values

Unlike the previous sections of this page, which have listed eigenvalues in increasing order, the singular values of a matrix are usually listed in decreasing order as $$s_1(T) \ge s_2(T) \ge \cdots \ge s_n(T)$$. Since $$(s_k(T))^2$$ is the $$(n-k+1)$$-th smallest eigenvalue of $$T^*T$$, the Courant-Fischer theorem applied to $$T^*T$$ yields: \begin{align} (s_k(T))^2 &= \min_{\dim(U) = n-k+1} \max_{\substack{\|x\|=1 \\ x\in U}} \ip{T^*Tx}{x} \\ &= \min_{\dim(U) = n-k+1} \max_{\substack{\|x\|=1 \\ x\in U}} \|Tx\|^2 \\ \end{align} and \begin{align} (s_k(T))^2 &= \max_{\dim(U) = k} \min_{\substack{\|x\|=1 \\ x\in U}} \ip{T^*Tx}{x} \\ &= \max_{\dim(U) = k} \min_{\substack{\|x\|=1 \\ x\in U}} \|Tx\|^2 \\ \end{align} Hence $s_k(T) = \min_{\dim(U) = n-k+1} \max_{\substack{\|x\|=1 \\ x\in U}} \|Tx\| = \max_{\dim(U) = k} \min_{\substack{\|x\|=1 \\ x\in U}} \|Tx\| \tag{3}$
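Formula (3) can be checked against a direct SVD (a sketch, assuming `numpy`): $$s_k(T)^2$$ should be the $$k$$-th largest eigenvalue of $$T^*T$$, and for $$U$$ spanned by the top $$k$$ right singular vectors, the minimum of $$\|Tx\|$$ over unit $$x\in U$$ should be $$s_k(T)$$.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 5
T = rng.standard_normal((n, n))
U_, s, Vt = np.linalg.svd(T)             # s descending: s_1 >= ... >= s_n
lam = np.linalg.eigvalsh(T.T @ T)        # ascending eigenvalues of T^*T
assert np.allclose(s**2, lam[::-1])      # s_k^2 = k-th largest eigenvalue of T^*T

for k in range(1, n + 1):
    Q = Vt[:k].T                         # orthonormal basis: top-k right singular vectors
    # min ||Tx|| over unit x in span(Q) = sqrt of least eigenvalue of Q^* T^* T Q
    min_norm = np.sqrt(np.linalg.eigvalsh(Q.T @ T.T @ T @ Q)[0])
    assert np.isclose(min_norm, s[k - 1])
```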

If $$A$$ is an $$n\times n$$ matrix and $$B$$ is obtained by deleting a single column of $$A$$ then $$B^*B$$ is a principal submatrix of $$A^*A$$ and, by the Cauchy Interlacing Theorem (remembering that the order is reversed), $s_{i+1}(A) \le s_i(B) \le s_i(A) \qquad (i=1, 2,\dots, n-1)$ The same holds when a single row is deleted, by applying the same argument to $$BB^*$$ and $$AA^*$$. Thus, by repeated application, if $$B$$ is obtained by deleting a total of $$m$$ rows and/or columns of $$A$$,

$s_{i+m}(A) \le s_i(B) \le s_i(A) \qquad (i=1, 2,\dots, n-m)$
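A numerical spot check of this bound (a sketch, assuming `numpy`), deleting one row and one column so that $$m = 2$$:

```python
import numpy as np

rng = np.random.default_rng(4)
n, m = 7, 2
A = rng.standard_normal((n, n))
B = np.delete(np.delete(A, 1, axis=0), 4, axis=1)   # delete row 1 and column 4
sA = np.linalg.svd(A, compute_uv=False)             # descending singular values
sB = np.linalg.svd(B, compute_uv=False)
for i in range(n - m):        # 0-based: s_{i+m}(A) <= s_i(B) <= s_i(A)
    assert sA[i + m] - 1e-12 <= sB[i] <= sA[i] + 1e-12
```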