# 6.4 Normal and Self-Adjoint Operators

We have seen the importance of diagonalizable operators in Chapter 5. For an operator on a vector space V to be diagonalizable, it is necessary and sufficient that V contain a basis of eigenvectors for this operator. As V is an inner product space in this chapter, it is reasonable to seek conditions that guarantee that V has an orthonormal basis of eigenvectors. A very important result that helps achieve our goal is Schur's theorem (Theorem 6.14). The formulation that follows is in terms of linear operators; the next section contains the more familiar matrix form. We begin with a lemma.

Lemma. Let T be a linear operator on a finite-dimensional inner product space V. If T has an eigenvector, then so does T*.

# Proof.

Suppose that $v$ is an eigenvector of T with corresponding eigenvalue $\lambda$. Then for any $x \in V$,

$$0 = \langle 0, x\rangle = \langle (T - \lambda I)(v), x\rangle = \langle v, (T - \lambda I)^*(x)\rangle = \langle v, (T^* - \bar{\lambda} I)(x)\rangle,$$

and hence $v$ is orthogonal to the range of $T^* - \bar{\lambda} I$. So $T^* - \bar{\lambda} I$ is not onto and hence is not one-to-one. Thus $T^* - \bar{\lambda} I$ has a nonzero null space, and any nonzero vector in this null space is an eigenvector of T* with corresponding eigenvalue $\bar{\lambda}$.

Recall (see the exercises of Section 2.1 and see Section 5.4) that a subspace W of V is said to be T-invariant if T(W) is contained in W. If W is T-invariant, we may define the restriction $T_W\colon W \to W$ by $T_W(x) = T(x)$ for all $x \in W$. It is clear that $T_W$ is a linear operator on W. Recall from Section 5.2 that a polynomial is said to split if it factors into linear polynomials.

# Theorem 6.14 (Schur).

Let T be a linear operator on a finite-dimensional inner product space V. Suppose that the characteristic polynomial of T splits. Then there exists an orthonormal basis $\beta$ for V such that the matrix $[T]_\beta$ is upper triangular.

# Proof.

By Exercise 12(a) of Section 5.2, there exists an ordered basis $\gamma = \{w_1, w_2, \ldots, w_n\}$ for V such that $[T]_\gamma$ is upper triangular. Now apply the Gram-Schmidt process to $\gamma$ to obtain an orthogonal basis $\gamma' = \{v_1, v_2, \ldots, v_n\}$ for V. For each k, $1 \le k \le n$, let

$$W_k = \mathrm{span}(\{w_1, w_2, \ldots, w_k\}).$$

As in the proof of Theorem 6.4, $\mathrm{span}(\{v_1, v_2, \ldots, v_k\}) = W_k$ for all k. By Exercise 12 of Section 2.2, $T(w_k) \in W_k$ for all k. Hence $T(v_k) \in \mathrm{span}(\{v_1, v_2, \ldots, v_k\})$ for all k, and so $[T]_{\gamma'}$ is upper triangular by the same exercise. Finally, let $u_k = \dfrac{v_k}{\lVert v_k \rVert}$ for all k and $\beta = \{u_1, u_2, \ldots, u_n\}$. Then $\beta$ is an orthonormal basis for V, and $[T]_\beta$ is upper triangular.

We now return to our original goal of finding an orthonormal basis of eigenvectors of a linear operator T on a finite-dimensional inner product space V. Note that if such an orthonormal basis $\beta$ exists, then $[T]_\beta$ is a diagonal matrix, and hence $[T^*]_\beta = ([T]_\beta)^*$ is also a diagonal matrix. Because diagonal matrices commute, we conclude that T and T* commute. Thus if V possesses an orthonormal basis of eigenvectors of T, then $TT^* = T^*T$.

# Definitions.

Let V be an inner product space, and let T be a linear operator on V. We say that T is normal if $TT^* = T^*T$. An $n \times n$ real or complex matrix A is normal if $AA^* = A^*A$.

It follows immediately from Theorem 6.10 (p. 356) that T is normal if and only if $[T]_\beta$ is normal, where $\beta$ is an orthonormal basis.

# Example 1

Let $T\colon \mathsf{R}^2 \to \mathsf{R}^2$ be rotation by $\theta$, where $0 < \theta < \pi$. The matrix representation of T in the standard ordered basis is given by

$$A = \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}.$$

Note that $AA^* = I = A^*A$; so A, and hence T, is normal.
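Readers who wish to confirm this numerically can do so with a few lines of Python. The sketch below is an illustration, not part of the text: it builds the rotation matrix for an arbitrarily chosen angle and checks the defining condition $AA^* = A^*A$ using plain nested lists; since A is real, $A^*$ is just the transpose.

```python
import math

def mat_mul(X, Y):
    # product of two 2x2 matrices stored as nested lists
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

theta = 1.1  # hypothetical angle; any value in (0, pi) works for the check
A = [[math.cos(theta), -math.sin(theta)],
     [math.sin(theta),  math.cos(theta)]]

At = [[A[j][i] for j in range(2)] for i in range(2)]  # A* = A^t for a real matrix
left, right = mat_mul(A, At), mat_mul(At, A)

# AA* and A*A agree (both equal the identity), so A is normal
assert all(abs(left[i][j] - right[i][j]) < 1e-12
           for i in range(2) for j in range(2))
```

In fact both products equal the identity matrix, which is a stronger condition (orthogonality) than normality.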

# Example 2

Suppose that A is a real skew-symmetric matrix; that is, $A^t = -A$. Then A is normal because both $AA^*$ and $A^*A$ are equal to $-A^2$.
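As a quick check of this computation (my own illustration, not the text's), the sketch below takes one concrete skew-symmetric matrix and verifies that both orderings of the product agree with $-A^2$.

```python
def mat_mul(X, Y):
    # product of two 2x2 matrices stored as nested lists
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[0, 2], [-2, 0]]                                 # skew-symmetric: A^t = -A
At = [[A[j][i] for j in range(2)] for i in range(2)]  # transpose (= A* for a real matrix)
neg_A_sq = [[-x for x in row] for row in mat_mul(A, A)]

# both AA^t and A^tA equal -A^2, so A is normal
assert mat_mul(A, At) == neg_A_sq == mat_mul(At, A)
```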

Clearly, the operator T in Example 1 does not even possess one eigenvector. So in the case of a real inner product space, we see that normality is not sufficient to guarantee an orthonormal basis of eigenvectors. All is not lost, however. We show that normality suffices if V is a complex inner product space.

Before we prove the promised result for normal operators, we need some general properties of normal operators.

# Theorem 6.15.

Let V be an inner product space, and let T be a normal operator on V. Then the following statements are true.

1. (a) $\lVert T(x)\rVert = \lVert T^*(x)\rVert$ for all $x \in V$.

2. (b) $T - cI$ is normal for every $c \in F$.

3. (c) If x is an eigenvector of T corresponding to eigenvalue $\lambda$, then x is also an eigenvector of T* corresponding to eigenvalue $\bar{\lambda}$. That is, if $T(x) = \lambda x$, then $T^*(x) = \bar{\lambda} x$.

4. (d) If $\lambda_1$ and $\lambda_2$ are distinct eigenvalues of T with corresponding eigenvectors $x_1$ and $x_2$, then $x_1$ and $x_2$ are orthogonal.

# Proof.

(a) For any $x \in V$, we have

$$\lVert T(x)\rVert^2 = \langle T(x), T(x)\rangle = \langle T^*T(x), x\rangle = \langle TT^*(x), x\rangle = \langle T^*(x), T^*(x)\rangle = \lVert T^*(x)\rVert^2.$$
The proof of (b) is left as an exercise.

(c) Suppose that $T(x) = \lambda x$ for some $x \in V$. Let $U = T - \lambda I$. Then $U(x) = 0$, and U is normal by (b). Thus (a) implies that

$$0 = \lVert U(x)\rVert = \lVert U^*(x)\rVert = \lVert (T^* - \bar{\lambda} I)(x)\rVert = \lVert T^*(x) - \bar{\lambda} x\rVert.$$

Hence $T^*(x) = \bar{\lambda} x$. So x is an eigenvector of T*.

(d) Let $\lambda_1$ and $\lambda_2$ be distinct eigenvalues of T with corresponding eigenvectors $x_1$ and $x_2$. Then, using (c), we have

$$\lambda_1\langle x_1, x_2\rangle = \langle \lambda_1 x_1, x_2\rangle = \langle T(x_1), x_2\rangle = \langle x_1, T^*(x_2)\rangle = \langle x_1, \bar{\lambda}_2 x_2\rangle = \lambda_2\langle x_1, x_2\rangle.$$

Since $\lambda_1 \ne \lambda_2$, we conclude that $\langle x_1, x_2\rangle = 0$.
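Parts (c) and (d) can be seen concretely on a small example (a sketch of mine, not from the text): the rotation-by-$\pi/2$ matrix $A$, viewed as an operator on $\mathsf{C}^2$, is normal with eigenvalues $i$ and $-i$. The check below verifies that an eigenvector of $A$ is also an eigenvector of $A^*$ for the conjugate eigenvalue, and that eigenvectors for the two distinct eigenvalues are orthogonal.

```python
# Rotation by pi/2; A is real, so A* is its transpose.
A = [[0, -1], [1, 0]]

def apply(M, v):
    # matrix-vector product for a 2x2 matrix
    return [sum(M[i][j] * v[j] for j in range(2)) for i in range(2)]

lam = 1j                      # eigenvalue of A over C
x1 = [1, -1j]                 # eigenvector for i
x2 = [1, 1j]                  # eigenvector for -i
assert apply(A, x1) == [lam * c for c in x1]
assert apply(A, x2) == [-lam * c for c in x2]

# (c): x1 is an eigenvector of A* for the conjugate eigenvalue -i
A_star = [[A[j][i] for j in range(2)] for i in range(2)]
assert apply(A_star, x1) == [lam.conjugate() * c for c in x1]

# (d): eigenvectors for the distinct eigenvalues i and -i are orthogonal
inner = sum(a * b.conjugate() for a, b in zip(x1, x2))
assert inner == 0
```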

# Theorem 6.16.

Let T be a linear operator on a finite-dimensional complex inner product space V. Then T is normal if and only if there exists an orthonormal basis for V consisting of eigenvectors of T.

# Proof.

Suppose that T is normal. By the fundamental theorem of algebra (Theorem D.4), the characteristic polynomial of T splits. So we may apply Schur's theorem to obtain an orthonormal basis $\beta = \{v_1, v_2, \ldots, v_n\}$ for V such that $A = [T]_\beta$ is upper triangular. We know that $v_1$ is an eigenvector of T because A is upper triangular. Assume that $v_1, v_2, \ldots, v_{k-1}$ are eigenvectors of T. We claim that $v_k$ is also an eigenvector of T. It then follows by mathematical induction on k that all of the $v_i$'s are eigenvectors of T. Consider any $j < k$, and let $\lambda_j$ denote the eigenvalue of T corresponding to $v_j$. By Theorem 6.15(c), $T^*(v_j) = \bar{\lambda}_j v_j$. Since A is upper triangular,

$$T(v_k) = A_{1k}v_1 + A_{2k}v_2 + \cdots + A_{kk}v_k.$$

Furthermore, by the corollary to Theorem 6.5 (p. 345),

$$A_{jk} = \langle T(v_k), v_j\rangle = \langle v_k, T^*(v_j)\rangle = \langle v_k, \bar{\lambda}_j v_j\rangle = \lambda_j\langle v_k, v_j\rangle = 0.$$

It follows that $T(v_k) = A_{kk}v_k$, and hence $v_k$ is an eigenvector of T. So by induction, all the vectors in $\beta$ are eigenvectors of T.

The converse was already proved on page 367.

Interestingly, as the next example shows, Theorem 6.16 does not extend to infinite-dimensional complex inner product spaces.

# Example 3

Consider the inner product space H with the orthonormal set S from Example 9 in Section 6.1. Let $V = \mathrm{span}(S)$, and let T and U be the linear operators on V defined by $T(f) = f_1 f$ and $U(f) = f_{-1} f$. Then

$$T(f_n) = f_{n+1} \quad\text{and}\quad U(f_n) = f_{n-1}$$

for all integers n. Thus

$$\langle T(f_m), f_n\rangle = \langle f_{m+1}, f_n\rangle = \delta_{(m+1)n} = \delta_{m(n-1)} = \langle f_m, f_{n-1}\rangle = \langle f_m, U(f_n)\rangle.$$

It follows that $U = T^*$. Furthermore, $TT^* = I = T^*T$; so T is normal.

We show that T has no eigenvectors. Suppose that f is an eigenvector of T, say, $T(f) = \lambda f$ for some $\lambda$. Since V equals the span of S, we may write

$$f = \sum_{i=1}^{m} a_i f_{n_i}, \quad\text{where } a_m \ne 0 \text{ and } n_1 < n_2 < \cdots < n_m.$$

Hence

$$\sum_{i=1}^{m} a_i f_{n_i + 1} = T(f) = \lambda f = \sum_{i=1}^{m} \lambda a_i f_{n_i}.$$

Since $a_m \ne 0$, we can write $f_{n_m + 1}$ as a linear combination of $f_{n_1}, f_{n_2}, \ldots, f_{n_m}$. But this is a contradiction because S is linearly independent.
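The shift structure of this example is easy to model on a computer (my own illustration, not part of the text): represent an element of $\mathrm{span}(S)$ as a finite dictionary mapping the index n of $f_n$ to its coefficient. The sketch below checks that $T^*T = I = TT^*$ on such elements, and that T always raises the largest index present, which mirrors the reason $T(f) = \lambda f$ is impossible for nonzero f.

```python
def T(f):
    # multiplication by f_1 shifts every index up by one
    return {n + 1: c for n, c in f.items()}

def T_star(f):
    # the adjoint U shifts every index down by one
    return {n - 1: c for n, c in f.items()}

f = {0: 2.0, 3: 1.5}                       # hypothetical element 2 f_0 + 1.5 f_3
assert T_star(T(f)) == f == T(T_star(f))   # T*T = I = TT*, so T is normal

# T strictly raises the top index, so T(f) can never be a multiple of f
assert max(T(f)) == max(f) + 1
```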

Example 1 illustrates that normality is not sufficient to guarantee the existence of an orthonormal basis of eigenvectors for real inner product spaces. For real inner product spaces, we must replace normality by the stronger condition that $T = T^*$ in order to guarantee such a basis.

# Definitions.

Let T be a linear operator on an inner product space V. We say that T is self-adjoint (or Hermitian) if $T = T^*$. An $n \times n$ real or complex matrix A is self-adjoint (or Hermitian) if $A = A^*$.

It follows immediately that if $\beta$ is an orthonormal basis, then T is self-adjoint if and only if $[T]_\beta$ is self-adjoint. For a real matrix A, the condition $A = A^*$ reduces to the requirement that A be symmetric.

Before we state our main result for self-adjoint operators, we need some preliminary work.

By definition, a linear operator on a real inner product space has only real eigenvalues. The lemma that follows shows that the same can be said for self-adjoint operators on a complex inner product space. Similarly, the characteristic polynomial of every linear operator on a complex inner product space splits, and the same is true for self-adjoint operators on a real inner product space.

Lemma. Let T be a self-adjoint operator on a finite-dimensional inner product space V. Then

1. (a) Every eigenvalue of T is real.

2. (b) Suppose that V is a real inner product space. Then the characteristic polynomial of T splits.

# Proof.

(a) Suppose that $T(x) = \lambda x$ for $x \ne 0$. Because a self-adjoint operator is also normal, we can apply Theorem 6.15(c) to obtain

$$\lambda x = T(x) = T^*(x) = \bar{\lambda} x.$$

So $\lambda = \bar{\lambda}$; that is, $\lambda$ is real.

(b) Let $n = \dim(V)$, let $\beta$ be an orthonormal basis for V, and let $A = [T]_\beta$. Then A is self-adjoint. Let $T_A$ be the linear operator on $\mathsf{C}^n$ defined by $T_A(x) = Ax$ for all $x \in \mathsf{C}^n$. Note that $T_A$ is self-adjoint because $[T_A]_\gamma = A$, where $\gamma$ is the standard ordered (orthonormal) basis for $\mathsf{C}^n$. So, by (a), the eigenvalues of $T_A$ are real. By the fundamental theorem of algebra, the characteristic polynomial of $T_A$ splits into factors of the form $t - \lambda$. Since each $\lambda$ is real, the characteristic polynomial splits over R. But $T_A$ has the same characteristic polynomial as A, which has the same characteristic polynomial as T. Therefore the characteristic polynomial of T splits.

We are now able to establish one of the major results of this chapter.

# Theorem 6.17.

Let T be a linear operator on a finite-dimensional real inner product space V. Then T is self-adjoint if and only if there exists an orthonormal basis $\beta$ for V consisting of eigenvectors of T.

# Proof.

Suppose that T is self-adjoint. By the lemma, the characteristic polynomial of T splits, so we may apply Schur's theorem to obtain an orthonormal basis $\beta$ for V such that the matrix $A = [T]_\beta$ is upper triangular. But

$$A^* = ([T]_\beta)^* = [T^*]_\beta = [T]_\beta = A.$$

So A and A* are both upper triangular, and therefore A is a diagonal matrix. Thus $\beta$ must consist of eigenvectors of T.

The converse is left as an exercise.

We restate this theorem in matrix form in the next section (as Theorem 6.20 on p. 381).
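Theorem 6.17 can be watched in action on a small example (a sketch of mine, not from the text): the real symmetric matrix $A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}$ has eigenvalues 3 and 1, and the unit vectors $(1,1)/\sqrt{2}$ and $(1,-1)/\sqrt{2}$ form an orthonormal basis of eigenvectors.

```python
import math

A = [[2, 1], [1, 2]]          # real symmetric, hence self-adjoint

def apply(M, v):
    # matrix-vector product for a 2x2 matrix
    return [sum(M[i][j] * v[j] for j in range(2)) for i in range(2)]

s = 1 / math.sqrt(2)
u1, u2 = [s, s], [s, -s]      # claimed eigenbasis (eigenvalues 3 and 1)
for u, lam in ((u1, 3), (u2, 1)):
    Au = apply(A, u)
    assert all(abs(Au[i] - lam * u[i]) < 1e-12 for i in range(2))

assert abs(u1[0] * u2[0] + u1[1] * u2[1]) < 1e-12   # orthogonal
assert abs(u1[0] ** 2 + u1[1] ** 2 - 1) < 1e-12     # unit length
```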

# Example 4

As we noted earlier, real symmetric matrices are self-adjoint, and self-adjoint matrices are normal. The following matrix A is complex and symmetric:

$$A = \begin{pmatrix} i & i \\ i & 1 \end{pmatrix}.$$

But A is not normal, because $(AA^*)_{12} = 1 + i$ and $(A^*A)_{12} = 1 - i$. Therefore complex symmetric matrices need not be normal.
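The failure of normality here is a finite computation. The sketch below (my own check, not part of the text) uses the concrete complex symmetric matrix $\begin{pmatrix} i & i \\ i & 1 \end{pmatrix}$ and compares the off-diagonal entries of $AA^*$ and $A^*A$.

```python
A = [[1j, 1j], [1j, 1]]     # complex symmetric: equal to its (unconjugated) transpose

def mat_mul(X, Y):
    # product of two 2x2 complex matrices stored as nested lists
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# conjugate transpose; Python ints also support .conjugate()
A_star = [[A[j][i].conjugate() for j in range(2)] for i in range(2)]

left, right = mat_mul(A, A_star), mat_mul(A_star, A)
assert left[0][1] == 1 + 1j and right[0][1] == 1 - 1j   # the (1,2) entries disagree
assert left != right                                     # hence A is not normal
```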

# Exercises

1. Label the following statements as true or false. Assume that the underlying inner product spaces are finite-dimensional.

1. (a) Every self-adjoint operator is normal.

2. (b) Operators and their adjoints have the same eigenvectors.

3. (c) If T is an operator on an inner product space V, then T is normal if and only if $[T]_\beta$ is normal, where $\beta$ is any ordered basis for V.

4. (d) A real or complex matrix A is normal if and only if $L_A$ is normal.

5. (e) The eigenvalues of a self-adjoint operator must all be real.

6. (f) The identity and zero operators are self-adjoint.

7. (g) Every normal operator is diagonalizable.

8. (h) Every self-adjoint operator is diagonalizable.

2. For each linear operator T on an inner product space V, determine whether T is normal, self-adjoint, or neither. If possible, produce an orthonormal basis of eigenvectors of T for V and list the corresponding eigenvalues.

1. (a)  and T is defined by .

2. (b)  and T is defined by .

3. (c)  and T is defined by .

4. (d)  and T is defined by , where


5. (e)  and T is defined by .

6. (f)  and T is defined by .

3. Give an example of a linear operator T on $\mathsf{R}^2$ and an ordered basis for $\mathsf{R}^2$ that provides a counterexample to the statement in Exercise 1(c).

4. Let T and U be self-adjoint operators on an inner product space V. Prove that TU is self-adjoint if and only if $TU = UT$.

5. Prove (b) of Theorem 6.15.

6. Let V be a complex inner product space, and let T be a linear operator on V. Define

$$T_1 = \frac{1}{2}(T + T^*) \quad\text{and}\quad T_2 = \frac{1}{2i}(T - T^*).$$

1. (a) Prove that $T_1$ and $T_2$ are self-adjoint and that $T = T_1 + iT_2$.

2. (b) Suppose also that $T = U_1 + iU_2$, where $U_1$ and $U_2$ are self-adjoint. Prove that $U_1 = T_1$ and $U_2 = T_2$.

3. (c) Prove that T is normal if and only if $T_1T_2 = T_2T_1$.

7. Let T be a linear operator on an inner product space V, and let W be a T-invariant subspace of V. Prove the following results.

1. (a) If T is self-adjoint, then $T_W$ is self-adjoint.

2. (b) $W^\perp$ is T*-invariant.

3. (c) If W is both T- and T*-invariant, then $(T_W)^* = (T^*)_W$.

4. (d) If W is both T- and T*-invariant and T is normal, then $T_W$ is normal.

8. Let T be a normal operator on a finite-dimensional complex inner product space V, and let W be a subspace of V. Prove that if W is T-invariant, then W is also T*-invariant. Hint: Use Exercise 24 of Section 5.4.

9. Let T be a normal operator on a finite-dimensional inner product space V. Prove that $N(T) = N(T^*)$ and $R(T) = R(T^*)$. Hint: Use Theorem 6.15 and Exercise 12 of Section 6.3.

10. Let T be a self-adjoint operator on a finite-dimensional inner product space V. Prove that for all $x \in V$,

$$\lVert T(x) \pm ix \rVert^2 = \lVert T(x)\rVert^2 + \lVert x \rVert^2.$$

Deduce that $T - iI$ is invertible and that the adjoint of $(T - iI)^{-1}$ is $(T + iI)^{-1}$.

11. Assume that T is a linear operator on a complex (not necessarily finite-dimensional) inner product space V with an adjoint T*. Prove the following results.

1. (a) If T is self-adjoint, then $\langle T(x), x\rangle$ is real for all $x \in V$.

2. (b) If T satisfies $\langle T(x), x\rangle = 0$ for all $x \in V$, then $T = T_0$. Hint: Replace x by $x + y$ and then by $x + iy$, and expand the resulting inner products.

3. (c) If $\langle T(x), x\rangle$ is real for all $x \in V$, then $T = T^*$.

12. Let T be a normal operator on a finite-dimensional real inner product space V whose characteristic polynomial splits. Prove that V has an orthonormal basis of eigenvectors of T. Hence prove that T is self-adjoint.

13. An $n \times n$ real matrix A is said to be a Gramian matrix if there exists a real (square) matrix B such that $A = B^tB$. Prove that A is a Gramian matrix if and only if A is symmetric and all of its eigenvalues are nonnegative. Hint: Apply Theorem 6.17 to $T = L_A$ to obtain an orthonormal basis $\{v_1, v_2, \ldots, v_n\}$ of eigenvectors with the associated eigenvalues $\lambda_1, \lambda_2, \ldots, \lambda_n$. Define the linear operator U by $U(v_i) = \sqrt{\lambda_i}\, v_i$.

14. Simultaneous Diagonalization. Let V be a finite-dimensional real inner product space, and let U and T be self-adjoint linear operators on V such that $TU = UT$. Prove that there exists an orthonormal basis for V consisting of vectors that are eigenvectors of both U and T. (The complex version of this result appears as Exercise 10 of Section 6.6.) Hint: For any eigenspace $W = E_\lambda$ of T, we have that W is both T- and U-invariant. By Exercise 7, we have that $W^\perp$ is both T- and U-invariant. Apply Theorem 6.17 and Theorem 6.6 (p. 347).

15. Let A and B be symmetric $n \times n$ matrices such that $AB = BA$. Use Exercise 14 to prove that there exists an orthogonal matrix P such that $P^tAP$ and $P^tBP$ are both diagonal matrices.

16. Prove the Cayley-Hamilton theorem for a complex $n \times n$ matrix A. That is, if f(t) is the characteristic polynomial of A, prove that $f(A) = O$. Hint: Use Schur's theorem to show that A may be assumed to be upper triangular, in which case

$$f(t) = \prod_{i=1}^{n} (A_{ii} - t).$$

Now if $T = L_A$, we have $(A_{jj}I - T)(e_j) \in \mathrm{span}(\{e_1, e_2, \ldots, e_{j-1}\})$ for $j \ge 2$ and $(A_{11}I - T)(e_1) = 0$, where $\{e_1, e_2, \ldots, e_n\}$ is the standard ordered basis for $\mathsf{C}^n$. (The general case is proved in Section 5.4.)

The following definitions are used in Exercises 17 through 23.

# Definitions.

A linear operator T on a finite-dimensional inner product space is called positive definite [positive semidefinite] if T is self-adjoint and $\langle T(x), x\rangle > 0$ [$\langle T(x), x\rangle \ge 0$] for all $x \ne 0$.

An $n \times n$ matrix A with entries from R or C is called positive definite [positive semidefinite] if $L_A$ is positive definite [positive semidefinite].

17. Let T and U be self-adjoint linear operators on an n-dimensional inner product space V, and let $A = [T]_\beta$, where $\beta$ is an orthonormal basis for V. Prove the following results.

1. (a) T is positive definite [semidefinite] if and only if all of its eigenvalues are positive [nonnegative].

2. (b) T is positive definite if and only if

$$\sum_{i,j} A_{ij} a_j \bar{a}_i > 0 \quad\text{for all nonzero } n\text{-tuples } (a_1, a_2, \ldots, a_n).$$
3. (c) T is positive semidefinite if and only if $A = B^*B$ for some square matrix B.

4. (d) If T and U are positive semidefinite operators such that $T^2 = U^2$, then $T = U$.

5. (e) If T and U are positive definite operators such that $TU = UT$, then TU is positive definite.

6. (f) T is positive definite [semidefinite] if and only if A is positive definite [semidefinite].

Because of (f), results analogous to items (a) through (d) hold for matrices as well as operators.

18. Let $T\colon V \to W$ be a linear transformation, where V and W are finite-dimensional inner product spaces. Prove the following results.

1. (a) T*T and TT* are positive semidefinite. (See Exercise 15 of Section 6.3.)

2. (b) $\mathrm{rank}(T^*T) = \mathrm{rank}(TT^*) = \mathrm{rank}(T)$.

19. Let T and U be positive definite operators on an inner product space V. Prove the following results.

1. (a) $T + U$ is positive definite.

2. (b) If $c > 0$, then $cT$ is positive definite.

3. (c) $T^{-1}$ is positive definite.

Visit goo.gl/cQch7i for a solution.

20. Let V be an inner product space with inner product $\langle\,\cdot\,,\,\cdot\,\rangle$, and let T be a positive definite linear operator on V. Prove that $\langle x, y\rangle' = \langle T(x), y\rangle$ defines another inner product on V.

21. Let V be a finite-dimensional inner product space, and let T and U be self-adjoint operators on V such that T is positive definite. Prove that both TU and UT are diagonalizable linear operators that have only real eigenvalues. Hint: Show that UT is self-adjoint with respect to the inner product $\langle x, y\rangle' = \langle T(x), y\rangle$. To show that TU is self-adjoint, repeat the argument with $T^{-1}$ in place of T.

22. This exercise provides a converse to Exercise 20. Let V be a finite-dimensional inner product space with inner product $\langle\,\cdot\,,\,\cdot\,\rangle$, and let $\langle\,\cdot\,,\,\cdot\,\rangle'$ be any other inner product on V.

1. (a) Prove that there exists a unique linear operator T on V such that $\langle x, y\rangle' = \langle T(x), y\rangle$ for all x and y in V. Hint: Let $\beta = \{v_1, v_2, \ldots, v_n\}$ be an orthonormal basis for V with respect to $\langle\,\cdot\,,\,\cdot\,\rangle$, and define a matrix A by $A_{ij} = \langle v_j, v_i\rangle'$ for all i and j. Let T be the unique linear operator on V such that $[T]_\beta = A$.

2. (b) Prove that the operator T of (a) is positive definite with respect to both inner products.

23. Let U be a diagonalizable linear operator on a finite-dimensional inner product space V such that all of the eigenvalues of U are real. Prove that there exist positive definite linear operators $T_1$ and $T_1'$ and self-adjoint linear operators $T_2$ and $T_2'$ such that $U = T_2T_1 = T_1'T_2'$. Hint: Let $\langle\,\cdot\,,\,\cdot\,\rangle$ be the inner product associated with V, $\beta$ a basis of eigenvectors for U, $\langle\,\cdot\,,\,\cdot\,\rangle'$ the inner product on V with respect to which $\beta$ is orthonormal (see Exercise 22(a) of Section 6.1), and $T_1$ the positive definite operator according to Exercise 22. Show that U is self-adjoint with respect to $\langle\,\cdot\,,\,\cdot\,\rangle'$ and that $T_1UT_1^{-1}$ is self-adjoint with respect to $\langle\,\cdot\,,\,\cdot\,\rangle$ (the adjoint is with respect to $\langle\,\cdot\,,\,\cdot\,\rangle$). Let $T_2 = UT_1^{-1}$.