Mathursday is back after a very long time. The last year was unusually hectic for all of us and I couldn’t devote enough time to posts. We restart with the study of eigenvalues, which find significant use in many important areas of mathematics and computer science. In this post, we’ll discuss some fundamental results on eigenvalues that are commonly reused. These results are well-known, but it can help to revisit their proofs.

## Preliminary

**Lemma (Eigenvalues of symmetric matrices):** If $A \in \mathbb{R}^{n \times n}$ is a symmetric matrix, then there is an orthonormal basis of $\mathbb{R}^n$ consisting of eigenvectors of $A$.

We’ll assume knowledge of the following:

**Lemma (Dimension Formula):** Given vector subspaces $U$ and $W$ of $\mathbb{R}^n$, we have:

$$\dim(U) + \dim(W) = \dim(U + W) + \dim(U \cap W).$$
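As a quick sanity check (an example of mine, not from the original post), take two distinct planes through the origin in $\mathbb{R}^3$:

```latex
% U, W: two distinct planes through the origin in R^3.
% Then U + W = R^3 and U \cap W is a line, so:
\dim(U) + \dim(W) = 2 + 2 = 4 = 3 + 1 = \dim(U + W) + \dim(U \cap W)
```

In the proofs below, the formula is used in the form $\dim(U \cap W) \ge \dim(U) + \dim(W) - n$, which follows because $\dim(U + W) \le n$.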

**Definition (Rayleigh Quotient):** For a square matrix $A \in \mathbb{R}^{n \times n}$, its Rayleigh quotient is the function $R_A$ from $\mathbb{R}^n \setminus \{0\}$ to $\mathbb{R}$ defined below:

$$R_A(x) = \frac{x^\top A x}{x^\top x}.$$
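To make the definition concrete, here is a minimal pure-Python sketch (the matrix and vectors are arbitrary examples of mine) that evaluates the Rayleigh quotient of a small symmetric matrix and checks its scale invariance, $R_A(cx) = R_A(x)$:

```python
def rayleigh_quotient(A, x):
    """R_A(x) = (x^T A x) / (x^T x) for a square matrix A given as nested lists."""
    Ax = [sum(A[i][j] * x[j] for j in range(len(x))) for i in range(len(x))]
    num = sum(x[i] * Ax[i] for i in range(len(x)))   # x^T A x
    den = sum(xi * xi for xi in x)                    # x^T x
    return num / den

A = [[2.0, 1.0],
     [1.0, 2.0]]   # symmetric, with eigenvalues 1 and 3

print(rayleigh_quotient(A, [1.0, 1.0]))    # eigenvector for eigenvalue 3 -> 3.0
print(rayleigh_quotient(A, [1.0, -1.0]))   # eigenvector for eigenvalue 1 -> 1.0
print(rayleigh_quotient(A, [2.0, 2.0]))    # scaling x leaves the quotient unchanged -> 3.0
```

Note that the quotient depends only on the direction of $x$, which is why one can restrict attention to unit vectors.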

## Rayleigh–Ritz Theorem

**Rayleigh–Ritz Theorem:** Let $A \in \mathbb{R}^{n \times n}$ be a symmetric matrix with eigenvalues $\lambda_1 \le \lambda_2 \le \cdots \le \lambda_n$ listed up to multiplicity and with an orthonormal basis of eigenvectors $v_1, \dots, v_n$, where $v_i$ has eigenvalue $\lambda_i$. Then for all $k \in \{1, \dots, n\}$ we have:

$$\lambda_k = \min_{\substack{x \ne 0 \\ x \perp v_1, \dots, v_{k-1}}} R_A(x),$$

where $x \perp v_i$ means that $x$ is orthogonal to $v_i$. For $k = 1$, the only constraint on $x$ is that $x \ne 0$.

**Proof:** We can express each $x \ne 0$ as $x = \sum_{i=1}^n c_i v_i$ for some $c = (c_1, \dots, c_n) \ne 0$. Then we have:

$$R_A(x) = \frac{x^\top A x}{x^\top x} = \frac{\sum_{i=1}^n \lambda_i c_i^2}{\sum_{i=1}^n c_i^2} = \sum_{i=1}^n \lambda_i p_i,$$

where in the last line we have $p_i = c_i^2 / \sum_{j=1}^n c_j^2$. Observe that $p = (p_1, \dots, p_n)$ is a probability vector for any value of $c \ne 0$, and any probability vector can be written using some $c$ (take $c_i = \sqrt{p_i}$). Hence, optimizing $R_A(x)$ over $x \ne 0$ is the same as optimizing over $c \ne 0$, which itself is the same as optimizing $\sum_i \lambda_i p_i$ over probability vectors $p$.

If $x \perp v_1, \dots, v_{k-1}$, then we have $c_i = \langle x, v_i \rangle = 0$ for all $i < k$. This implies:

$$R_A(x) = \sum_{i=k}^n \lambda_i p_i \ge \lambda_k \sum_{i=k}^n p_i = \lambda_k.$$

It is straightforward to observe that this minimum is achieved when the entire probability mass is on $p_k$, i.e., when $x = v_k$. Hence, proved.
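The decomposition at the heart of the proof is easy to check numerically. The following pure-Python sketch (the matrix and test vector are arbitrary choices of mine) expands $x$ in the eigenbasis of a $2 \times 2$ symmetric matrix and verifies that $R_A(x) = \sum_i \lambda_i p_i$:

```python
import math

# A = [[2, 1], [1, 2]] has eigenvalues 1 and 3 with orthonormal eigenvectors
# v1 = (1, -1)/sqrt(2) and v2 = (1, 1)/sqrt(2).
lam = [1.0, 3.0]
s = 1.0 / math.sqrt(2.0)
v = [[s, -s], [s, s]]

def rayleigh(A, x):
    Ax = [A[0][0]*x[0] + A[0][1]*x[1], A[1][0]*x[0] + A[1][1]*x[1]]
    return (x[0]*Ax[0] + x[1]*Ax[1]) / (x[0]*x[0] + x[1]*x[1])

A = [[2.0, 1.0], [1.0, 2.0]]
x = [0.3, 0.7]                                         # arbitrary non-zero vector

c = [v[i][0]*x[0] + v[i][1]*x[1] for i in range(2)]    # c_i = <x, v_i>
total = sum(ci*ci for ci in c)
p = [ci*ci / total for ci in c]                        # probability vector

assert abs(sum(p) - 1.0) < 1e-12                       # p is a probability vector
assert abs(rayleigh(A, x) - (lam[0]*p[0] + lam[1]*p[1])) < 1e-12
print("R_A(x) =", rayleigh(A, x), "= sum_i lambda_i p_i")
```

Changing `x` moves the mass vector `p` around the probability simplex, which is exactly the reparametrization the proof exploits.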

**Lemma:** Let $A \in \mathbb{R}^{n \times n}$ be a symmetric matrix with eigenvalues $\lambda_1 \le \cdots \le \lambda_n$ and an orthonormal basis of eigenvectors $v_1, \dots, v_n$. Then for all $x \ne 0$ we have:

$$\lambda_1 \le R_A(x) \le \lambda_n.$$
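A quick numerical illustration (a hand-picked $2 \times 2$ matrix and random sampling, not part of the proof): sampling many directions never takes the Rayleigh quotient outside $[\lambda_1, \lambda_n]$, and the eigenvectors attain the endpoints:

```python
import random

A = [[2.0, 1.0], [1.0, 2.0]]   # eigenvalues: lambda_1 = 1, lambda_n = 3

def rayleigh(A, x):
    Ax = [A[0][0]*x[0] + A[0][1]*x[1], A[1][0]*x[0] + A[1][1]*x[1]]
    return (x[0]*Ax[0] + x[1]*Ax[1]) / (x[0]*x[0] + x[1]*x[1])

random.seed(0)
vals = [rayleigh(A, [random.gauss(0, 1), random.gauss(0, 1)]) for _ in range(10000)]
assert all(1.0 - 1e-9 <= r <= 3.0 + 1e-9 for r in vals)   # bounds always hold
assert rayleigh(A, [1.0, -1.0]) == 1.0                    # attained at x = v_1
assert rayleigh(A, [1.0, 1.0]) == 3.0                     # attained at x = v_n
```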

**Proof:** We can adapt the proof of Rayleigh–Ritz to write:

$$R_A(x) = \sum_{i=1}^n \lambda_i p_i.$$

It is easy to see that the minimum is achieved when the entire probability mass is on the smallest eigenvalue, which is $\lambda_1$, and the maximum is achieved when the entire probability mass is on the largest eigenvalue, which is $\lambda_n$. Hence, proved.

## Courant-Fischer Min-Max Theorem

**Courant-Fischer Min-Max Theorem:** Let $A \in \mathbb{R}^{n \times n}$ be a real symmetric matrix with eigenvalues $\lambda_1 \le \cdots \le \lambda_n$ corresponding to an orthonormal set of eigenvectors $v_1, \dots, v_n$. Then for any $k \in \{1, \dots, n\}$ we have:

$$\lambda_k = \min_{S :\, \dim(S) = k} \;\; \max_{x \in S,\; x \ne 0} R_A(x),$$

$$\lambda_k = \max_{S :\, \dim(S) = n - k + 1} \;\; \min_{x \in S,\; x \ne 0} R_A(x).$$
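Before the proof, here is a numerical illustration (the matrix, spectrum, and sample counts are all arbitrary choices of mine, not from the post). We build a $3 \times 3$ symmetric matrix with known eigenvalues $1, 2, 5$, sample random $2$-dimensional subspaces $S$, and check that $\max_{x \in S} R_A(x)$ — the top eigenvalue of $A$ restricted to $S$ — never falls below $\lambda_2$, while $S = \mathrm{span}(v_1, v_2)$ attains it:

```python
import math, random

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def gram_schmidt(vectors):
    """Orthonormalize a list of linearly independent vectors."""
    basis = []
    for v in vectors:
        w = list(v)
        for q in basis:
            c = dot(w, q)
            w = [wi - c * qi for wi, qi in zip(w, q)]
        norm = math.sqrt(dot(w, w))
        basis.append([wi / norm for wi in w])
    return basis

# Symmetric A with known spectrum: A = sum_k lam_k q_k q_k^T,
# where q_1, q_2, q_3 is an (arbitrary) orthonormal basis.
lam = [1.0, 2.0, 5.0]
q = gram_schmidt([[1, 1, 0], [1, 0, 1], [0, 1, 1]])
A = [[sum(lam[k] * q[k][i] * q[k][j] for k in range(3)) for j in range(3)]
     for i in range(3)]

def max_rayleigh_on_span(b1, b2):
    """max_{x in span(b1,b2), x != 0} R_A(x): the top eigenvalue of A restricted
    to the subspace (a 2x2 symmetric eigenproblem, solved in closed form)."""
    e1, e2 = gram_schmidt([b1, b2])
    Ae = lambda u: [dot(A[i], u) for i in range(3)]
    m11, m22, m12 = dot(e1, Ae(e1)), dot(e2, Ae(e2)), dot(e1, Ae(e2))
    return 0.5 * (m11 + m22 + math.sqrt((m11 - m22) ** 2 + 4 * m12 * m12))

random.seed(0)
for _ in range(1000):
    b1 = [random.gauss(0, 1) for _ in range(3)]
    b2 = [random.gauss(0, 1) for _ in range(3)]
    assert max_rayleigh_on_span(b1, b2) >= lam[1] - 1e-9   # every S: max >= lambda_2
# The minimizing subspace span(q_1, q_2) attains lambda_2 exactly:
assert abs(max_rayleigh_on_span(q[0], q[1]) - lam[1]) < 1e-9
```

Restricting $A$ to an orthonormal basis of $S$ is what makes the inner maximization tractable here; the outer minimization is only sampled, so the check is an illustration rather than a proof.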

**Proof:** Let $V_k = \mathrm{span}(v_k, \dots, v_n)$ and let $S$ be any subspace of $\mathbb{R}^n$ of dimension $k$. Then from the dimension formula we have:

$$\dim(S \cap V_k) \ge \dim(S) + \dim(V_k) - \dim(S + V_k) \ge k + (n - k + 1) - n = 1.$$

This allows us to pick a non-zero vector $x$ in $S \cap V_k$. For this $x$, we have $x \perp v_1, \dots, v_{k-1}$, and therefore, from the previous result, we have $R_A(x) \ge \lambda_k$. This gives us:

$$\max_{y \in S,\; y \ne 0} R_A(y) \ge R_A(x) \ge \lambda_k.$$

However, since this result holds for any $S$ of dimension $k$, we can write:

$$\min_{S :\, \dim(S) = k} \;\; \max_{x \in S,\; x \ne 0} R_A(x) \ge \lambda_k.$$

We now need to prove the other direction. Let $S^\star = \mathrm{span}(v_1, \dots, v_k)$; then $\dim(S^\star) = k$. Further, for any non-zero $x \in S^\star$, we have, from the previous result, $R_A(x) \le \lambda_k$, since the probability mass in the proof of Rayleigh–Ritz is then supported on $\lambda_1, \dots, \lambda_k$. This implies:

$$\min_{S :\, \dim(S) = k} \;\; \max_{x \in S,\; x \ne 0} R_A(x) \le \max_{x \in S^\star,\; x \ne 0} R_A(x) \le \lambda_k.$$

Combining these two inequalities proves the first min-max equality. The second equality can be proven similarly.

## Weyl’s Inequality

**Weyl’s Inequality:** Let $A, B \in \mathbb{R}^{n \times n}$ be two real symmetric matrices and let $C = A + B$. For all $i \in \{1, \dots, n\}$, let $\alpha_i$, $\beta_i$, and $\gamma_i$ denote the $i^{th}$ eigenvalues of $A$, $B$, and $C$ respectively, arranged in ascending order. Let $u_i$, $v_i$, and $w_i$ be the unit-norm eigenvectors of $A$, $B$, and $C$ respectively corresponding to the $i^{th}$ eigenvalue. Then for all $i$ and $j$, we have:

$$\gamma_{i+j-n} \le \alpha_i + \beta_j \quad \text{when } i + j \ge n + 1,$$

$$\gamma_{i+j-1} \ge \alpha_i + \beta_j \quad \text{when } i + j \le n + 1.$$

**Proof:** The proof is similar to that of Courant-Fischer, in that we will define a set of subspaces and show that we can pick a point in their intersection. We will prove the first inequality; the second one is proven in a similar fashion. For fixed values of $i$ and $j$ with $i + j \ge n + 1$, we define three subspaces:

$$S_1 = \mathrm{span}(u_1, \dots, u_i), \quad S_2 = \mathrm{span}(v_1, \dots, v_j), \quad S_3 = \mathrm{span}(w_{i+j-n}, \dots, w_n).$$

You can sort of guess why these subspaces were defined in this manner by looking at the inequality we are trying to prove. We have $\gamma_{i+j-n}$ on the left-hand side corresponding to matrix $C$, and $\alpha_i$ and $\beta_j$ on the right-hand side corresponding to matrices $A$ and $B$ respectively. We have $\dim(S_1) = i$, $\dim(S_2) = j$, and $\dim(S_3) = 2n - i - j + 1$. Applying the dimension theorem twice we get:

$$\dim(S_1 \cap S_2 \cap S_3) \ge \dim(S_1) + \dim(S_2) + \dim(S_3) - 2n = 1.$$

Let $x$ be a non-zero vector in $S_1 \cap S_2 \cap S_3$. Then since it is in $S_3$, we have $\gamma_{i+j-n} \le R_C(x) = R_A(x) + R_B(x)$, where the last equality holds from observing that the Rayleigh quotient of a matrix is linear in the matrix. Finally, as $x \in S_1$ and $x \in S_2$, we have $R_A(x) \le \alpha_i$ and $R_B(x) \le \beta_j$. Combining these terms we get $\gamma_{i+j-n} \le \alpha_i + \beta_j$, which is what we wanted to prove.
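As a sanity check (random matrices of my choosing, not from the post), both inequalities can be tested exhaustively on $2 \times 2$ symmetric matrices, whose ascending eigenvalues have a simple closed form:

```python
import math, random

def eigs_2x2(M):
    """Ascending eigenvalues of a symmetric 2x2 matrix [[a, b], [b, c]]."""
    (a, b), (_, c) = M
    m = (a + c) / 2.0
    r = math.sqrt(((a - c) / 2.0) ** 2 + b * b)
    return [m - r, m + r]

def rand_sym():
    a, b, c = (random.gauss(0, 1) for _ in range(3))
    return [[a, b], [b, c]]

random.seed(0)
n, eps = 2, 1e-9
for _ in range(10000):
    A, B = rand_sym(), rand_sym()
    C = [[A[i][j] + B[i][j] for j in range(2)] for i in range(2)]
    alpha, beta, gamma = eigs_2x2(A), eigs_2x2(B), eigs_2x2(C)
    for i in range(1, n + 1):
        for j in range(1, n + 1):
            if i + j >= n + 1:   # gamma_{i+j-n} <= alpha_i + beta_j
                assert gamma[i + j - n - 1] <= alpha[i - 1] + beta[j - 1] + eps
            if i + j <= n + 1:   # gamma_{i+j-1} >= alpha_i + beta_j
                assert gamma[i + j - 2] >= alpha[i - 1] + beta[j - 1] - eps
```

The `eps` tolerance only absorbs floating-point rounding; the inequalities themselves are exact.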

Equalities and inequalities involving eigenvalues are quite useful and constitute a big part of the linear algebra literature. Interested readers can look at *Matrix Algebra & Its Applications to Statistics & Econometrics* by C. R. Rao and M. B. Rao, or *Matrix Analysis* by Rajendra Bhatia. Bhatia also has an extremely interesting article on eigenvalue inequalities which covers their fascinating history: *Linear Algebra to Quantum Cohomology: The Story of Alfred Horn’s Inequalities*.