# 6.6 Orthogonal Projections and the Spectral Theorem

In this section, we rely heavily on Theorems 6.16 (p. 369) and 6.17 (p. 371) to develop an elegant representation of a normal (if $F = C$) or a self-adjoint (if $F = R$) operator T on a finite-dimensional inner product space. We prove that T can be written in the form $\lambda_1 T_1 + \lambda_2 T_2 + \cdots + \lambda_k T_k$, where $\lambda_1, \lambda_2, \ldots, \lambda_k$ are the distinct eigenvalues of T and $T_1, T_2, \ldots, T_k$ are orthogonal projections. We must first develop some results about these special projections.

We assume that the reader is familiar with the results about direct sums developed at the end of Section 5.2. The special case where V is a direct sum of two subspaces is considered in the exercises of Section 1.3.

Recall from the exercises of Section 2.1 that if $V = W_1 \oplus W_2$, then a linear operator T on V is the projection on $W_1$ along $W_2$ if, whenever $x = x_1 + x_2$, with $x_1 \in W_1$ and $x_2 \in W_2$, we have $T(x) = x_1$. By Exercise 27 of Section 2.1, we have

$$R(T) = W_1 = \{x \in V : T(x) = x\} \quad\text{and}\quad N(T) = W_2.$$

So $V = R(T) \oplus N(T)$. Thus there is no ambiguity if we refer to T as a “projection on $W_1$” or simply as a “projection.” In fact, it can be shown (see Exercise 17 of Section 2.3) that T is a projection if and only if $T = T^2$. Because $V = W_1 \oplus W_2 = W_1 \oplus W_3$ does not imply that $W_2 = W_3$, we see that $W_1 = R(T)$ does not uniquely determine T. For an orthogonal projection T, however, T is uniquely determined by its range.

# Definition.

Let V be an inner product space, and let $T: V \to V$ be a projection. We say that T is an orthogonal projection if $R(T)^{\perp} = N(T)$ and $N(T)^{\perp} = R(T)$.

Note that by Exercise 13(c) of Section 6.2, if V is finite-dimensional, we need only assume that one of the equalities in this definition holds. For example, if $R(T)^{\perp} = N(T)$, then $R(T) = R(T)^{\perp\perp} = N(T)^{\perp}$.

An orthogonal projection is not the same as an orthogonal operator. In Figure 6.5, T is an orthogonal projection, but T is clearly not an orthogonal operator because $\|T(v)\| \ne \|v\|$ for any $v \notin W$.

Now assume that W is a finite-dimensional subspace of an inner product space V. In the notation of Theorem 6.6 (p. 347), we can define a function $T: V \to V$ by $T(y) = u$, where $y = u + z$ with $u \in W$ and $z \in W^{\perp}$. It is easy to show that T is an orthogonal projection on W. We can say even more: there exists exactly one orthogonal projection on W. For if T and U are orthogonal projections on W, then $R(T) = W = R(U)$. Hence $N(T) = R(T)^{\perp} = R(U)^{\perp} = N(U)$, and since every projection is uniquely determined by its range and null space, we have $T = U$. We call T the orthogonal projection of V on W.
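When W is spanned by the columns of a real matrix with linearly independent columns, this orthogonal projection can be computed in coordinates. A minimal numerical sketch (numpy and the particular subspace of $R^3$ below are illustrative assumptions, not part of the text):

```python
import numpy as np

# Hypothetical subspace: W = column space of A in R^3.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 1.0]])

# Orthogonal projection of R^3 on W: P = A (A^T A)^{-1} A^T.
P = A @ np.linalg.inv(A.T @ A) @ A.T

# For any y, write y = u + z with u = P y in W and z = y - P y in W-perp;
# then T(y) = u.  Check that z is orthogonal to every column of A:
y = np.array([1.0, 2.0, 3.0])
u, z = P @ y, y - P @ y
assert np.allclose(A.T @ z, 0.0)

# Uniqueness: P is determined by W alone, not by the choice of spanning
# columns -- a rescaled basis of W yields the same matrix.
B = 2.0 * A
P2 = B @ np.linalg.inv(B.T @ B) @ B.T
assert np.allclose(P, P2)
```

The identity $P = A(A^{T}A)^{-1}A^{T}$ is a standard coordinate formula for this projection; it is consistent with the characterization $T^2 = T = T^*$ proved later in Theorem 6.24.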

To understand the geometric difference between an arbitrary projection on W and the orthogonal projection on W, let $V = R^2$ and $W = \mathrm{span}(\{(1, 1)\})$. Define U and T as in Figure 6.5, where T(v) is the foot of a perpendicular from v on the line $y = x$ and $U(a_1, a_2) = (a_1, a_1)$. Then T is the orthogonal projection of V on W, and U is a different projection on W. Note that $N(T) = W^{\perp}$, whereas $N(U) = \mathrm{span}(\{(0, 1)\}) \ne W^{\perp}$. From Figure 6.5, we see that T(v) is the “best approximation in W to v”; that is, if $u \in W$, then $\|u - v\| \ge \|T(v) - v\|$. In fact, this approximation property characterizes T. These results follow immediately from the corollary to Theorem 6.6 (p. 348).
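The contrast between the two projections can be checked numerically. In the sketch below (numpy assumed), the matrices are coordinate representations of an orthogonal projection on $W = \mathrm{span}\{(1,1)\}$ and of a non-orthogonal projection on the same W:

```python
import numpy as np

# T = orthogonal projection on W = span{(1, 1)}: foot of the perpendicular
# from v onto the line y = x.
T = 0.5 * np.array([[1.0, 1.0],
                    [1.0, 1.0]])

# U = projection on W along span{(0, 1)}: U(a1, a2) = (a1, a1).
U = np.array([[1.0, 0.0],
              [1.0, 0.0]])

# Both are projections (square to themselves) with range W, but only T
# is self-adjoint.
assert np.allclose(T @ T, T) and np.allclose(U @ U, U)

# Best-approximation property: T(v) is closer to v than U(v) is.
v = np.array([1.0, 3.0])
dist_T = np.linalg.norm(v - T @ v)
dist_U = np.linalg.norm(v - U @ v)
assert dist_T < dist_U
```

For $v = (1, 3)$, $T(v) = (2, 2)$ lies at distance $\sqrt{2}$ from v, while $U(v) = (1, 1)$ lies at distance 2, illustrating that the orthogonal projection gives the best approximation in W.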

As an application to Fourier analysis, recall the inner product space H and the orthonormal set S in Example 9 of Section 6.1. Define a trigonometric polynomial of degree n to be a function $g \in H$ of the form

$$g(t) = \sum_{j=-n}^{n} a_j f_j(t) = \sum_{j=-n}^{n} a_j e^{ijt},$$

where $a_n$ or $a_{-n}$ is nonzero.

Let $f \in H$. We show that the best approximation to f by a trigonometric polynomial of degree less than or equal to n is the trigonometric polynomial whose coefficients are the Fourier coefficients of f relative to the orthonormal set S. For this result, let $W = \mathrm{span}(\{f_j : |j| \le n\})$, and let T be the orthogonal projection of H on W. The corollary to Theorem 6.6 (p. 348) tells us that the best approximation to f by a function in W is

$$T(f) = \sum_{j=-n}^{n} \langle f, f_j \rangle f_j.$$
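This best-approximation property can be tested numerically. In the sketch below (numpy assumed; the choice $f(t) = t$ and $n = 2$ are illustrative, not from the text), the inner product is approximated by a Riemann sum and the projection is compared against another element of W:

```python
import numpy as np

# Discretize [0, 2*pi]; assumed test function f(t) = t.
t = np.linspace(0.0, 2.0 * np.pi, 20001)
dt = t[1] - t[0]
f = t.astype(complex)
n = 2

def inner(g, h):
    # Riemann-sum approximation of <g, h> = (1/(2*pi)) * integral g conj(h)
    return np.sum(g * np.conj(h)) * dt / (2.0 * np.pi)

# Orthogonal projection of f on W = span{f_j : |j| <= n}, f_j(t) = e^{ijt},
# using the Fourier coefficients <f, f_j>:
proj = sum(inner(f, np.exp(1j * j * t)) * np.exp(1j * j * t)
           for j in range(-n, n + 1))

# Any other trigonometric polynomial in W approximates f less well:
other = proj + 0.3 * np.exp(1j * t)        # still an element of W
err_proj = np.sqrt(inner(f - proj, f - proj).real)
err_other = np.sqrt(inner(f - other, f - other).real)
assert err_proj < err_other
```

Up to discretization error, `err_other`$^2 \approx$ `err_proj`$^2 + 0.09$, exactly as the Pythagorean relation behind the corollary to Theorem 6.6 predicts.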
For an application of this material to electronic music, visit goo.gl/EN5Fai.

An algebraic characterization of orthogonal projections follows in the next theorem.

# Theorem 6.24.

Let V be an inner product space, and let T be a linear operator on V. Then T is an orthogonal projection if and only if T has an adjoint $T^*$ and $T^2 = T = T^*$.

# Proof.

Suppose that T is an orthogonal projection. Since $T^2 = T$ because T is a projection, we need only show that $T^*$ exists and $T = T^*$. Now $V = R(T) \oplus N(T)$ and $R(T)^{\perp} = N(T)$. Let $x, y \in V$. Then we can write $x = x_1 + x_2$ and $y = y_1 + y_2$, where $x_1, y_1 \in R(T)$ and $x_2, y_2 \in N(T)$. Hence

$$\langle x, T(y) \rangle = \langle x_1 + x_2, y_1 \rangle = \langle x_1, y_1 \rangle$$

and

$$\langle T(x), y \rangle = \langle x_1, y_1 + y_2 \rangle = \langle x_1, y_1 \rangle.$$

So $\langle x, T(y) \rangle = \langle T(x), y \rangle$ for all $x, y \in V$; thus $T^*$ exists and $T = T^*$.

Now suppose that $T^2 = T = T^*$. Then T is a projection by Exercise 17 of Section 2.3, and hence we must show that $R(T) = N(T)^{\perp}$ and $R(T)^{\perp} = N(T)$. Let $x \in R(T)$ and $y \in N(T)$. Then $x = T(x) = T^*(x)$, and so

$$\langle x, y \rangle = \langle T^*(x), y \rangle = \langle x, T(y) \rangle = \langle x, 0 \rangle = 0.$$

Therefore $x \in N(T)^{\perp}$, from which it follows that $R(T) \subseteq N(T)^{\perp}$.

Let $y \in N(T)^{\perp}$. We must show that $y \in R(T)$, that is, $T(y) = y$. Now

$$\|y - T(y)\|^2 = \langle y - T(y), y - T(y) \rangle = \langle y, y - T(y) \rangle - \langle T(y), y - T(y) \rangle.$$

Since $T(y - T(y)) = T(y) - T^2(y) = 0$, we have $y - T(y) \in N(T)$; because $y \in N(T)^{\perp}$, the first term must equal zero. But also

$$\langle T(y), y - T(y) \rangle = \langle y, T^*(y - T(y)) \rangle = \langle y, T(y) - T^2(y) \rangle = \langle y, 0 \rangle = 0.$$

Thus $\|y - T(y)\|^2 = 0$; that is, $T(y) = y$. Hence $y \in R(T)$, and therefore $R(T) = N(T)^{\perp}$.

Using the preceding results, we have $R(T) = N(T)^{\perp}$, so $R(T)^{\perp} = N(T)^{\perp\perp} \supseteq N(T)$ by Exercise 13(b) of Section 6.2. Now suppose that $x \in R(T)^{\perp}$. For any $y \in V$, we have $\langle T(x), y \rangle = \langle x, T^*(y) \rangle = \langle x, T(y) \rangle = 0$ because $T(y) \in R(T)$. So $T(x) = 0$, and thus $x \in N(T)$. Hence $R(T)^{\perp} = N(T)$.

Let V be a finite-dimensional inner product space, W be a subspace of V, and T be the orthogonal projection of V on W. We may choose an orthonormal basis $\beta = \{v_1, v_2, \ldots, v_n\}$ for V such that $\{v_1, v_2, \ldots, v_k\}$ is a basis for W. Then $[T]_{\beta}$ is a diagonal matrix with ones as the first k diagonal entries and zeros elsewhere. In fact, $[T]_{\beta}$ has the form

$$\begin{pmatrix} I_k & O_1 \\ O_2 & O_3 \end{pmatrix}.$$
If U is any projection on W, we may choose a basis $\gamma$ for V such that $[U]_{\gamma}$ has the form above; however, $\gamma$ is not necessarily orthonormal.
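A numerical sketch of this matrix form (numpy assumed; the subspace is illustrative): extending an orthonormal basis of W to an orthonormal basis $\beta$ of $R^3$ via a full QR factorization and changing coordinates produces exactly the block matrix above.

```python
import numpy as np

# Hypothetical subspace: W = column space of A in R^3 (k = 2).
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 0.0]])

# Full QR: the columns of Q form an orthonormal basis beta of R^3 whose
# first k columns span W (A has full column rank).
Q, _ = np.linalg.qr(A, mode="complete")

# Orthogonal projection of R^3 on W in standard coordinates.
P = A @ np.linalg.inv(A.T @ A) @ A.T

# Change of coordinates to beta: [T]_beta = Q^{-1} P Q = Q^T P Q.
T_beta = Q.T @ P @ Q
assert np.allclose(T_beta, np.diag([1.0, 1.0, 0.0]))
```

Because Q is an orthogonal matrix, $Q^{-1} = Q^{T}$, so the change of coordinates preserves self-adjointness; this is why the orthonormality of $\beta$ matters.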

We are now ready for the principal theorem of this section.

# Theorem 6.25 (The Spectral Theorem).

Suppose that T is a linear operator on a finite-dimensional inner product space V over F with the distinct eigenvalues $\lambda_1, \lambda_2, \ldots, \lambda_k$. Assume that T is normal if $F = C$ and that T is self-adjoint if $F = R$. For each $i$ $(1 \le i \le k)$, let $W_i$ be the eigenspace of T corresponding to the eigenvalue $\lambda_i$, and let $T_i$ be the orthogonal projection of V on $W_i$. Then the following statements are true.

1. (a) $V = W_1 \oplus W_2 \oplus \cdots \oplus W_k$.

2. (b) If $W_i'$ denotes the direct sum of the subspaces $W_j$ for $j \ne i$, then $W_i^{\perp} = W_i'$.

3. (c) $T_i T_j = \delta_{ij} T_i$ for $1 \le i, j \le k$.

4. (d) $I = T_1 + T_2 + \cdots + T_k$.

5. (e) $T = \lambda_1 T_1 + \lambda_2 T_2 + \cdots + \lambda_k T_k$.
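Statements (c), (d), and (e) can be verified numerically for a concrete operator. The sketch below (numpy assumed; the self-adjoint matrix is illustrative and has a repeated eigenvalue, so one eigenspace is two-dimensional):

```python
import numpy as np

# Assumed self-adjoint matrix over R with eigenvalues 1, 1, 4 (k = 2).
A = np.array([[2.0, 1.0, 1.0],
              [1.0, 2.0, 1.0],
              [1.0, 1.0, 2.0]])
eigvals, V = np.linalg.eigh(A)     # columns of V: orthonormal eigenvectors

# Group eigenvectors by distinct eigenvalue; T_i = B_i B_i^T, where the
# columns of B_i form an orthonormal basis of the eigenspace W_i.
distinct = np.unique(np.round(eigvals, 8))
projections = [V[:, np.isclose(eigvals, lam)] @ V[:, np.isclose(eigvals, lam)].T
               for lam in distinct]

# (c) T_i T_j = delta_ij T_i
for i, Ti in enumerate(projections):
    for j, Tj in enumerate(projections):
        target = Ti if i == j else np.zeros((3, 3))
        assert np.allclose(Ti @ Tj, target)

# (d) resolution of the identity: T_1 + ... + T_k = I
assert np.allclose(sum(projections), np.eye(3))

# (e) spectral decomposition: lambda_1 T_1 + ... + lambda_k T_k = A
assert np.allclose(sum(lam * P for lam, P in zip(distinct, projections)), A)
```

Here `eigh` plays the role of Theorem 6.17: it diagonalizes the self-adjoint matrix with an orthonormal eigenbasis, from which each orthogonal projection $T_i$ is assembled.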

# Proof.

(a) By Theorems 6.16 (p. 369) and 6.17 (p. 371), T is diagonalizable; so

$$V = W_1 \oplus W_2 \oplus \cdots \oplus W_k$$

by Theorem 5.10 (p. 277).

(b) If $x \in W_i$ and $y \in W_j$ for some $i \ne j$, then $\langle x, y \rangle = 0$ by Theorem 6.15(d) (p. 368). It follows easily from this result that $W_i' \subseteq W_i^{\perp}$. From (a), we have

$$\dim(W_i') = \sum_{j \ne i} \dim(W_j) = \dim(V) - \dim(W_i).$$

On the other hand, we have $\dim(W_i^{\perp}) = \dim(V) - \dim(W_i)$ by Theorem 6.7(c) (p. 349). Hence $W_i' = W_i^{\perp}$, proving (b).

(c) The proof of (c) is left as an exercise.

(d) Since $T_i$ is the orthogonal projection of V on $W_i$, it follows from (b) that $N(T_i) = R(T_i)^{\perp} = W_i^{\perp} = W_i'$. Hence, for $x \in V$, we have $x = x_1 + x_2 + \cdots + x_k$, where $T_i(x) = x_i \in W_i$; thus $x = T_1(x) + T_2(x) + \cdots + T_k(x) = (T_1 + T_2 + \cdots + T_k)(x)$, proving (d).

(e) For $x \in V$, write $x = x_1 + x_2 + \cdots + x_k$, where $x_i \in W_i$. Then

$$\begin{aligned}
T(x) &= T(x_1) + T(x_2) + \cdots + T(x_k) \\
     &= \lambda_1 x_1 + \lambda_2 x_2 + \cdots + \lambda_k x_k \\
     &= (\lambda_1 T_1 + \lambda_2 T_2 + \cdots + \lambda_k T_k)(x).
\end{aligned}$$
The set $\{\lambda_1, \lambda_2, \ldots, \lambda_k\}$ of eigenvalues of T is called the spectrum of T, the sum $I = T_1 + T_2 + \cdots + T_k$ in (d) is called the resolution of the identity operator induced by T, and the sum $T = \lambda_1 T_1 + \lambda_2 T_2 + \cdots + \lambda_k T_k$ in (e) is called the spectral decomposition of T. The spectral decomposition of T is unique up to the order of its eigenvalues.

With the preceding notation, let $\beta$ be the union of orthonormal bases of the $W_i$’s and let $m_i = \dim(W_i)$. (Thus $m_i$ is the multiplicity of $\lambda_i$.) Then $[T]_{\beta}$ has the form

$$\begin{pmatrix}
\lambda_1 I_{m_1} & O & \cdots & O \\
O & \lambda_2 I_{m_2} & \cdots & O \\
\vdots & \vdots & \ddots & \vdots \\
O & O & \cdots & \lambda_k I_{m_k}
\end{pmatrix};$$

that is, $[T]_{\beta}$ is a diagonal matrix in which the diagonal entries are the eigenvalues $\lambda_i$ of T, and each $\lambda_i$ is repeated $m_i$ times. If $\lambda_1 T_1 + \lambda_2 T_2 + \cdots + \lambda_k T_k$ is the spectral decomposition of T, then it follows (from Exercise 7) that $g(T) = g(\lambda_1) T_1 + g(\lambda_2) T_2 + \cdots + g(\lambda_k) T_k$ for any polynomial g. This fact is used below.
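A quick numerical check of the fact that a polynomial acts through the eigenvalues (numpy assumed; the matrix and the polynomial $g(t) = t^2 - 3t + 2$ are illustrative):

```python
import numpy as np

# Assumed self-adjoint matrix with distinct eigenvalues 1 and 3.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
eigvals, V = np.linalg.eigh(A)

# g applied to the operator: g(A) = A^2 - 3A + 2I.
g_of_A = A @ A - 3.0 * A + 2.0 * np.eye(2)

# g applied through the spectrum: sum of g(lambda_i) T_i, where
# T_i = v_i v_i^T because each eigenspace here is one-dimensional.
g_via_spectrum = sum((lam**2 - 3.0 * lam + 2.0) * np.outer(v, v)
                     for lam, v in zip(eigvals, V.T))

assert np.allclose(g_of_A, g_via_spectrum)
```

Since $g(1) = 0$ and $g(3) = 2$, both sides equal $2T_2$, the projection on the eigenspace for $\lambda = 3$ scaled by 2.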

We now list several interesting corollaries of the spectral theorem; many more results are found in the exercises. For what follows, we assume that T is a linear operator on a finite-dimensional inner product space V over F.

# Corollary 1.

If $F = C$, then T is normal if and only if $T^* = g(T)$ for some polynomial g.

# Proof.

Suppose first that T is normal. Let $T = \lambda_1 T_1 + \lambda_2 T_2 + \cdots + \lambda_k T_k$ be the spectral decomposition of T. Taking the adjoint of both sides of the preceding equation, we have $T^* = \bar{\lambda}_1 T_1 + \bar{\lambda}_2 T_2 + \cdots + \bar{\lambda}_k T_k$ since each $T_i$ is self-adjoint. Using the Lagrange interpolation formula (see page 53), we may choose a polynomial g such that $g(\lambda_i) = \bar{\lambda}_i$ for $1 \le i \le k$. Then

$$g(T) = g(\lambda_1) T_1 + g(\lambda_2) T_2 + \cdots + g(\lambda_k) T_k = \bar{\lambda}_1 T_1 + \bar{\lambda}_2 T_2 + \cdots + \bar{\lambda}_k T_k = T^*.$$
Conversely, if  for some polynomial g, then T commutes with T* since T commutes with every polynomial in T. So T is normal.
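The Lagrange construction in this proof can be carried out numerically. A sketch (numpy assumed; the normal but non-self-adjoint matrix below, a 90-degree rotation with eigenvalues $\pm i$, is illustrative):

```python
import numpy as np

# Assumed normal matrix: rotation by 90 degrees, eigenvalues i and -i.
A = np.array([[0.0, -1.0],
              [1.0,  0.0]], dtype=complex)
assert np.allclose(A @ A.conj().T, A.conj().T @ A)   # A is normal

l1, l2 = 1j, -1j                                     # distinct eigenvalues

# Lagrange interpolation: pick g with g(l1) = conj(l1) and g(l2) = conj(l2),
# then apply it to A.  (Here g(z) = -z, but the formula is general for k = 2.)
g_of_A = (np.conj(l1) * (A - l2 * np.eye(2)) / (l1 - l2)
          + np.conj(l2) * (A - l1 * np.eye(2)) / (l2 - l1))

assert np.allclose(g_of_A, A.conj().T)               # g(A) = A*
```

The interpolation degenerates to $g(z) = -z$ in this example, so $g(A) = -A = A^*$, matching the corollary.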

# Corollary 2.

If $F = C$, then T is unitary if and only if T is normal and $|\lambda| = 1$ for every eigenvalue $\lambda$ of T.

# Proof.

If T is unitary, then T is normal and every eigenvalue of T has absolute value 1 by Corollary 2 to Theorem 6.18 (p. 379).

Let $T = \lambda_1 T_1 + \lambda_2 T_2 + \cdots + \lambda_k T_k$ be the spectral decomposition of T. If $|\lambda| = 1$ for every eigenvalue $\lambda$ of T, then by (c) of the spectral theorem,

$$TT^* = (\lambda_1 T_1 + \cdots + \lambda_k T_k)(\bar{\lambda}_1 T_1 + \cdots + \bar{\lambda}_k T_k) = |\lambda_1|^2 T_1 + \cdots + |\lambda_k|^2 T_k = T_1 + T_2 + \cdots + T_k = I.$$
Hence T is unitary.

# Corollary 3.

If $F = C$, then T is self-adjoint if and only if T is normal and every eigenvalue of T is real.

# Proof.

Suppose that T is normal and that its eigenvalues are real. Let $T = \lambda_1 T_1 + \lambda_2 T_2 + \cdots + \lambda_k T_k$ be the spectral decomposition of T. Then

$$T^* = \bar{\lambda}_1 T_1 + \bar{\lambda}_2 T_2 + \cdots + \bar{\lambda}_k T_k = \lambda_1 T_1 + \lambda_2 T_2 + \cdots + \lambda_k T_k = T.$$
Now suppose that T is self-adjoint and hence normal. That its eigenvalues are real has been proved in the lemma to Theorem 6.17 (p. 371).

# Corollary 4.

Let T be as in the spectral theorem with spectral decomposition $T = \lambda_1 T_1 + \lambda_2 T_2 + \cdots + \lambda_k T_k$. Then each $T_j$ is a polynomial in T.

# Proof.

Choose a polynomial $g_j$ $(1 \le j \le k)$ such that $g_j(\lambda_i) = \delta_{ij}$. Then

$$g_j(T) = g_j(\lambda_1) T_1 + g_j(\lambda_2) T_2 + \cdots + g_j(\lambda_k) T_k = \delta_{1j} T_1 + \delta_{2j} T_2 + \cdots + \delta_{kj} T_k = T_j.$$
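Corollary 4 can also be checked numerically. A sketch (numpy assumed; the self-adjoint matrix with distinct eigenvalues 2 and 4 is illustrative): applying the Lagrange polynomials $g_j$ with $g_j(\lambda_i) = \delta_{ij}$ to the matrix itself recovers each orthogonal projection.

```python
import numpy as np

# Assumed self-adjoint matrix with distinct eigenvalues 2 and 4.
A = np.array([[3.0, 1.0],
              [1.0, 3.0]])
l1, l2 = 2.0, 4.0

# Lagrange polynomials g_1(z) = (z - l2)/(l1 - l2) and
# g_2(z) = (z - l1)/(l2 - l1), applied to A:
T1 = (A - l2 * np.eye(2)) / (l1 - l2)
T2 = (A - l1 * np.eye(2)) / (l2 - l1)

# Each g_j(A) is a self-adjoint projection, i.e., an orthogonal projection.
for T in (T1, T2):
    assert np.allclose(T @ T, T) and np.allclose(T, T.T)

assert np.allclose(T1 + T2, np.eye(2))       # resolution of the identity
assert np.allclose(l1 * T1 + l2 * T2, A)     # spectral decomposition of A
```

Because each $T_j$ is a polynomial in T, any operator commuting with T automatically commutes with every $T_j$, which is the idea behind Exercise 7(c) and Exercise 10.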

# Exercises

1. Label the following statements as true or false. Assume that the underlying inner product spaces are finite-dimensional.

1. (a) All projections are self-adjoint.

2. (b) An orthogonal projection is uniquely determined by its range.

3. (c) Every self-adjoint operator is a linear combination of orthogonal projections.

4. (d) If T is a projection on W, then T(x) is the vector in W that is closest to x.

5. (e) Every orthogonal projection is a unitary operator.

2. Let $V = R^2$ and $W = \mathrm{span}(\{(1, 2)\})$, and let $\beta$ be the standard ordered basis for V. Compute $[T]_{\beta}$, where T is the orthogonal projection of V on W. Do the same for $V = R^3$ and $W = \mathrm{span}(\{(1, 0, 1)\})$.

3. For each of the matrices A in Exercise 2 of Section 6.5:

1. (1) Verify that $L_A$ possesses a spectral decomposition.

2. (2) For each eigenvalue of $L_A$, explicitly define the orthogonal projection on the corresponding eigenspace.

3. (3) Verify your results using the spectral theorem.

4. Let W be a finite-dimensional subspace of an inner product space V. Show that if T is the orthogonal projection of V on W, then $I - T$ is the orthogonal projection of V on $W^{\perp}$.

5. Let T be a linear operator on a finite-dimensional inner product space V.

1. (a) If T is an orthogonal projection, prove that $\|T(x)\| \le \|x\|$ for all $x \in V$. Give an example of a projection for which this inequality does not hold. What can be concluded about a projection for which the inequality is actually an equality for all $x \in V$?

2. (b) Suppose that T is a projection such that $\|T(x)\| \le \|x\|$ for all $x \in V$. Prove that T is an orthogonal projection.

6. Let T be a normal operator on a finite-dimensional inner product space. Prove that if T is a projection, then T is also an orthogonal projection.

7. Let T be a normal operator on a finite-dimensional complex inner product space V. Use the spectral decomposition $T = \lambda_1 T_1 + \lambda_2 T_2 + \cdots + \lambda_k T_k$ of T to prove the following results.

1. (a) If g is a polynomial, then $g(T) = \sum_{i=1}^{k} g(\lambda_i) T_i$.
2. (b) If $T^n = T_0$ for some n, then $T = T_0$.

3. (c) Let U be a linear operator on V. Then U commutes with T if and only if U commutes with each .

4. (d) There exists a normal operator U on V such that $U^2 = T$.

5. (e) T is invertible if and only if $\lambda_i \ne 0$ for $1 \le i \le k$.

6. (f) T is a projection if and only if every eigenvalue of T is 1 or 0.

7. (g) $T = -T^*$ if and only if every $\lambda_i$ is an imaginary number.

8. Use Corollary 1 of the spectral theorem to show that if T is a normal operator on a complex finite-dimensional inner product space and U is a linear operator that commutes with T, then U commutes with T*.

9. Referring to Exercise 20 of Section 6.5, prove the following facts about a partial isometry U.

1. (a) $U^*U$ is an orthogonal projection on W.

2. (b) $UU^*U = U$.

10. Simultaneous diagonalization. Let U and T be normal operators on a finite-dimensional complex inner product space V such that . Prove that there exists an orthonormal basis for V consisting of vectors that are eigenvectors of both T and U. Hint: Use the hint of Exercise 14 of Section 6.4 along with Exercise 8.

11. Prove (c) of the spectral theorem. Visit goo.gl/utQ9Pb for a solution.