# 2.4 Invertibility and Isomorphisms

The concept of invertibility is introduced quite early in the study of functions. Fortunately, many of the intrinsic properties of functions are shared by their inverses. For example, in calculus we learn that the properties of being continuous or differentiable are generally retained by the inverse functions. We see in this section (Theorem 2.17) that the inverse of a linear transformation is also linear. This result greatly aids us in the study of inverses of matrices. As one might expect from Section 2.3, the inverse of the left-multiplication transformation $L_A$ (when it exists) can be used to determine properties of the inverse of the matrix A.

In the remainder of this section, we apply many of the results about invertibility to the concept of isomorphism. We will see that finite-dimensional vector spaces (over F) of equal dimension may be identified. These ideas will be made precise shortly.

The facts about inverse functions presented in Appendix B are, of course, true for linear transformations. Nevertheless, we repeat some of the definitions for use in this section.

# Definition.

Let V and W be vector spaces, and let $T: V \to W$ be linear. A function $U: W \to V$ is said to be an inverse of T if $TU = I_W$ and $UT = I_V$. If T has an inverse, then T is said to be invertible. As noted in Appendix B, if T is invertible, then the inverse of T is unique and is denoted by $T^{-1}$.

The following facts hold for invertible functions T and U.

1. $(TU)^{-1} = U^{-1}T^{-1}$.

2. $(T^{-1})^{-1} = T$; in particular, $T^{-1}$ is invertible.

We often use the fact that a function is invertible if and only if it is both one-to-one and onto. We can therefore restate Theorem 2.5 as follows.

3. Let $T: V \to W$ be a linear transformation, where V and W are finite-dimensional spaces of equal dimension. Then T is invertible if and only if $\operatorname{rank}(T) = \dim(V)$.

# Example 1

Let $T: P_1(R) \to R^2$ be the linear transformation defined by $T(a + bx) = (a, a + b)$. The reader can verify directly that $T^{-1}: R^2 \to P_1(R)$ is defined by $T^{-1}(c, d) = c + (d - c)x$. Observe that $T^{-1}$ is also linear. As Theorem 2.17 demonstrates, this is true in general.
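Example 1 can be checked computationally by representing the polynomial $a + bx$ as the pair $(a, b)$ — a convention of this sketch, not of the text. The following Python fragment verifies that each composition is the identity on a few sample inputs:

```python
def T(p):
    """T(a + bx) = (a, a + b), as in Example 1; p is the pair (a, b)."""
    a, b = p
    return (a, a + b)

def T_inv(v):
    """T^{-1}(c, d) = c + (d - c)x, represented as the pair (c, d - c)."""
    c, d = v
    return (c, d - c)

# Check that T_inv(T(p)) = p and T(T_inv(v)) = v on sample inputs.
for p in [(1, 2), (-3, 5), (0, 0)]:
    assert T_inv(T(p)) == p
for v in [(4, 7), (2, -1)]:
    assert T(T_inv(v)) == v
```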

# Theorem 2.17.

Let V and W be vector spaces, and let $T: V \to W$ be linear and invertible. Then $T^{-1}: W \to V$ is linear.

Proof.

Let $y_1, y_2 \in W$ and $c \in F$. Since T is onto and one-to-one, there exist unique vectors $x_1$ and $x_2$ such that $T(x_1) = y_1$ and $T(x_2) = y_2$. Thus $x_1 = T^{-1}(y_1)$ and $x_2 = T^{-1}(y_2)$; so

$$T^{-1}(cy_1 + y_2) = T^{-1}[cT(x_1) + T(x_2)] = T^{-1}[T(cx_1 + x_2)] = cx_1 + x_2 = cT^{-1}(y_1) + T^{-1}(y_2).$$

# Corollary.

Let T be an invertible linear transformation from V to W. Then V is finite-dimensional if and only if W is finite-dimensional. In this case, $\dim(V) = \dim(W)$.

Proof.

Suppose that V is finite-dimensional. Let $\beta = \{x_1, x_2, \dots, x_n\}$ be a basis for V. By Theorem 2.2 (p. 68), $T(\beta)$ spans $R(T) = W$; hence W is finite-dimensional by Theorem 1.9 (p. 45). Conversely, if W is finite-dimensional, then so is V by a similar argument, using $T^{-1}$.

Now suppose that V and W are finite-dimensional. Because T is one-to-one and onto, we have

$$\operatorname{nullity}(T) = 0 \quad\text{and}\quad \operatorname{rank}(T) = \dim(R(T)) = \dim(W).$$

So by the dimension theorem (p. 70), it follows that $\dim(V) = \dim(W)$.

It now follows immediately from Theorem 2.5 (p. 71) that if T is a linear transformation between vector spaces of equal (finite) dimension, then the conditions of being invertible, one-to-one, and onto are all equivalent.

We are now ready to define the inverse of a matrix. The reader should note the analogy with the inverse of a linear transformation.

# Definition.

Let A be an $n \times n$ matrix. Then A is invertible if there exists an $n \times n$ matrix B such that $AB = BA = I$.

If A is invertible, then the matrix B such that $AB = BA = I$ is unique. (If C were another such matrix, then $C = CI = C(AB) = (CA)B = IB = B$.) The matrix B is called the inverse of A and is denoted by $A^{-1}$.

# Example 2

The reader should verify that the inverse of

$$\begin{pmatrix} 5 & 7 \\ 2 & 3 \end{pmatrix} \quad\text{is}\quad \begin{pmatrix} 3 & -7 \\ -2 & 5 \end{pmatrix}.$$

In Section 3.2, we will learn a technique for computing the inverse of a matrix. At this point, we develop a number of results that relate the inverses of matrices to the inverses of linear transformations.
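Verifications of this kind are easy to automate. A Python sketch using numpy, taking the matrices of Example 2 to be $\begin{pmatrix} 5 & 7 \\ 2 & 3 \end{pmatrix}$ and $\begin{pmatrix} 3 & -7 \\ -2 & 5 \end{pmatrix}$:

```python
import numpy as np

A = np.array([[5., 7.],
              [2., 3.]])
B = np.array([[3., -7.],
              [-2., 5.]])   # the claimed inverse of A

# B = A^{-1} exactly when AB = BA = I.
assert np.allclose(A @ B, np.eye(2))
assert np.allclose(B @ A, np.eye(2))
# numpy's general-purpose inverse routine agrees.
assert np.allclose(np.linalg.inv(A), B)
```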

# Theorem 2.18.

Let V and W be finite-dimensional vector spaces with ordered bases $\beta$ and $\gamma$, respectively. Let $T: V \to W$ be linear. Then T is invertible if and only if $[T]_\beta^\gamma$ is invertible. Furthermore, $([T]_\beta^\gamma)^{-1} = [T^{-1}]_\gamma^\beta$.

Proof.

Suppose that T is invertible. By the Corollary to Theorem 2.17, we have $\dim(V) = \dim(W)$. Let $n = \dim(V)$. So $[T]_\beta^\gamma$ is an $n \times n$ matrix. Now $T^{-1}: W \to V$ satisfies $TT^{-1} = I_W$ and $T^{-1}T = I_V$. Thus

$$I_n = [I_V]_\beta = [T^{-1}T]_\beta = [T^{-1}]_\gamma^\beta \, [T]_\beta^\gamma.$$

Similarly, $[T]_\beta^\gamma \, [T^{-1}]_\gamma^\beta = I_n$. So $[T]_\beta^\gamma$ is invertible and $([T]_\beta^\gamma)^{-1} = [T^{-1}]_\gamma^\beta$.

Now suppose that $A = [T]_\beta^\gamma$ is invertible. Then there exists an $n \times n$ matrix B such that $AB = BA = I_n$. By Theorem 2.6 (p. 73), there exists $U \in \mathcal{L}(W, V)$ such that

$$U(w_j) = \sum_{i=1}^{n} B_{ij} v_i \quad\text{for } j = 1, 2, \dots, n,$$

where $\gamma = \{w_1, w_2, \dots, w_n\}$ and $\beta = \{v_1, v_2, \dots, v_n\}$. It follows that $[U]_\gamma^\beta = B$. To show that $UT = I_V$, observe that

$$[UT]_\beta = [U]_\gamma^\beta \, [T]_\beta^\gamma = BA = I_n = [I_V]_\beta$$

by Theorem 2.11 (p. 89). So $UT = I_V$, and similarly, $TU = I_W$. Hence T is invertible with $T^{-1} = U$.

# Example 3

Let $\beta$ and $\gamma$ be the standard ordered bases of $P_1(R)$ and $R^2$, respectively. For T as in Example 1, we have

$$[T]_\beta^\gamma = \begin{pmatrix} 1 & 0 \\ 1 & 1 \end{pmatrix} \quad\text{and}\quad [T^{-1}]_\gamma^\beta = \begin{pmatrix} 1 & 0 \\ -1 & 1 \end{pmatrix}.$$

It can be verified by matrix multiplication that each matrix is the inverse of the other.
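That matrix multiplication can itself be sketched in Python, taking the two matrices of Example 3 to be $\begin{pmatrix} 1 & 0 \\ 1 & 1 \end{pmatrix}$ and $\begin{pmatrix} 1 & 0 \\ -1 & 1 \end{pmatrix}$:

```python
import numpy as np

M = np.array([[1., 0.],
              [1., 1.]])       # the matrix of T in Example 3
M_inv = np.array([[1., 0.],
                  [-1., 1.]])  # the matrix of T^{-1}

# Each matrix is the inverse of the other: both products give I_2.
assert np.allclose(M @ M_inv, np.eye(2))
assert np.allclose(M_inv @ M, np.eye(2))
```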

# Corollary 1.

Let V be a finite-dimensional vector space with an ordered basis $\beta$, and let $T: V \to V$ be linear. Then T is invertible if and only if $[T]_\beta$ is invertible. Furthermore, $([T]_\beta)^{-1} = [T^{-1}]_\beta$.

Proof.

Exercise.

# Corollary 2.

Let A be an $n \times n$ matrix. Then A is invertible if and only if $L_A$ is invertible. Furthermore, $(L_A)^{-1} = L_{A^{-1}}$.

Proof.

Exercise.
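Corollary 2 says that inverting the matrix A and inverting the transformation $L_A$ amount to the same thing. A brief numerical illustration (a Python sketch using numpy; the matrix is borrowed from Example 2):

```python
import numpy as np

A = np.array([[5., 7.],
              [2., 3.]])
A_inv = np.linalg.inv(A)

def L(M):
    """Left-multiplication transformation L_M: x -> Mx."""
    return lambda x: M @ x

L_A, L_A_inv = L(A), L(A_inv)

# L_{A^{-1}} undoes L_A in both orders, so (L_A)^{-1} = L_{A^{-1}}.
x = np.array([1., -2.])
assert np.allclose(L_A_inv(L_A(x)), x)
assert np.allclose(L_A(L_A_inv(x)), x)
```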

The notion of invertibility may be used to formalize what may already have been observed by the reader, that is, that certain vector spaces strongly resemble one another except for the form of their vectors. For example, in the case of $M_{2\times 2}(F)$ and $F^4$, if we associate to each matrix

$$\begin{pmatrix} a & b \\ c & d \end{pmatrix}$$

the 4-tuple $(a, b, c, d)$, we see that sums and scalar products associate in a similar manner; that is, in terms of the vector space structure, these two vector spaces may be considered identical or isomorphic.

# Definitions.

Let V and W be vector spaces. We say that V is isomorphic to W if there exists a linear transformation $T: V \to W$ that is invertible. Such a linear transformation is called an isomorphism from V onto W.

We leave as an exercise (see Exercise 11) the proof that “is isomorphic to” is an equivalence relation. (See Appendix A.) So we need only say that V and W are isomorphic.

# Example 4

Define $T: F^2 \to P_1(F)$ by $T(a_1, a_2) = a_1 + a_2 x$. It is easily checked that T is an isomorphism; so $F^2$ is isomorphic to $P_1(F)$.

# Example 5

Define $T: P_3(R) \to M_{2\times 2}(R)$ by

$$T(f) = \begin{pmatrix} f(1) & f(2) \\ f(3) & f(4) \end{pmatrix}.$$

It is easily verified that T is linear. By use of the Lagrange interpolation formula in Section 1.6, it can be shown (compare with Exercise 20) that $T(f) = O$ only when f is the zero polynomial. Thus T is one-to-one (see Exercise 10). Moreover, because $\dim(P_3(R)) = \dim(M_{2\times 2}(R)) = 4$, it follows that T is invertible by Theorem 2.5 (p. 71). We conclude that $P_3(R)$ is isomorphic to $M_{2\times 2}(R)$.
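The one-to-one claim in Example 5 can be probed numerically. With respect to the basis $\{1, x, x^2, x^3\}$, evaluating a polynomial at the points 1, 2, 3, 4 is multiplication by a Vandermonde matrix, and T is one-to-one exactly when that matrix has full rank. A Python sketch (assuming those evaluation points):

```python
import numpy as np

# Row i is (1, c_i, c_i^2, c_i^3) for the evaluation points c = 1, 2, 3, 4,
# so V sends the coefficient vector of f to (f(1), f(2), f(3), f(4)).
nodes = np.array([1., 2., 3., 4.])
V = np.vander(nodes, 4, increasing=True)

# Full rank: only the zero polynomial evaluates to (0, 0, 0, 0).
assert np.linalg.matrix_rank(V) == 4
```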

In each of Examples 4 and 5, the reader may have observed that isomorphic vector spaces have equal dimensions. As the next theorem shows, this is no coincidence.

# Theorem 2.19.

Let V and W be finite-dimensional vector spaces (over the same field). Then V is isomorphic to W if and only if $\dim(V) = \dim(W)$.

Proof.

Suppose that V is isomorphic to W and that $T: V \to W$ is an isomorphism from V to W. By the corollary to Theorem 2.17, we have that $\dim(V) = \dim(W)$.

Now suppose that $\dim(V) = \dim(W)$, and let $\beta = \{v_1, v_2, \dots, v_n\}$ and $\gamma = \{w_1, w_2, \dots, w_n\}$ be bases for V and W, respectively. By Theorem 2.6 (p. 73), there exists $T: V \to W$ such that T is linear and $T(v_i) = w_i$ for $i = 1, 2, \dots, n$. Using Theorem 2.2 (p. 68), we have

$$R(T) = \operatorname{span}(T(\beta)) = \operatorname{span}(\gamma) = W.$$

So T is onto. From Theorem 2.5 (p. 71), we have that T is also one-to-one. Hence T is an isomorphism.

By the corollary to Theorem 2.17, if V and W are isomorphic, then either both of V and W are finite-dimensional or both are infinite-dimensional.

# Corollary.

Let V be a vector space over F. Then V is isomorphic to $F^n$ if and only if $\dim(V) = n$.

Up to this point, we have associated linear transformations with their matrix representations. We are now in a position to prove that, as a vector space, the collection of all linear transformations between two given vector spaces may be identified with the appropriate vector space of $m \times n$ matrices.

# Theorem 2.20.

Let V and W be finite-dimensional vector spaces over F of dimensions n and m, respectively, and let $\beta$ and $\gamma$ be ordered bases for V and W, respectively. Then the function $\Phi: \mathcal{L}(V, W) \to M_{m\times n}(F)$, defined by $\Phi(T) = [T]_\beta^\gamma$ for $T \in \mathcal{L}(V, W)$, is an isomorphism.

Proof.

By Theorem 2.8 (p. 83), $\Phi$ is linear. Hence we must show that $\Phi$ is one-to-one and onto. This is accomplished if we show that for every $m \times n$ matrix A, there exists a unique linear transformation $T: V \to W$ such that $\Phi(T) = A$. Let $\beta = \{v_1, v_2, \dots, v_n\}$ and $\gamma = \{w_1, w_2, \dots, w_m\}$, and let A be a given $m \times n$ matrix. By Theorem 2.6 (p. 73), there exists a unique linear transformation $T: V \to W$ such that

$$T(v_j) = \sum_{i=1}^{m} A_{ij} w_i \quad\text{for } 1 \le j \le n.$$

But this means that $[T]_\beta^\gamma = A$, or $\Phi(T) = A$. So $\Phi$ is an isomorphism.

# Corollary.

Let V and W be finite-dimensional vector spaces of dimensions n and m, respectively. Then $\mathcal{L}(V, W)$ is finite-dimensional of dimension mn.

Proof.

The proof follows from Theorems 2.20 and 2.19 and the fact that $\dim(M_{m\times n}(F)) = mn$.

We conclude this section with a result that allows us to see more clearly the relationship between linear transformations defined on abstract finite-dimensional vector spaces and linear transformations from $F^n$ to $F^m$.

We begin by naming the transformation $x \mapsto [x]_\beta$ introduced in Section 2.2.

# Definition.

Let $\beta$ be an ordered basis for an n-dimensional vector space V over the field F. The standard representation of V with respect to $\beta$ is the function $\phi_\beta: V \to F^n$ defined by $\phi_\beta(x) = [x]_\beta$ for each $x \in V$.

# Example 6

Let $\beta = \{(1, 0), (0, 1)\}$ and $\gamma = \{(1, 2), (3, 4)\}$. It is easily observed that $\beta$ and $\gamma$ are ordered bases for $R^2$. For $x = (1, -2)$, we have

$$\phi_\beta(x) = [x]_\beta = \begin{pmatrix} 1 \\ -2 \end{pmatrix} \quad\text{and}\quad \phi_\gamma(x) = [x]_\gamma = \begin{pmatrix} -5 \\ 2 \end{pmatrix}.$$
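Coordinate vectors such as $[x]_\gamma$ can be computed by solving a linear system whose coefficient columns are the basis vectors. A Python sketch, assuming the basis $\gamma = \{(1, 2), (3, 4)\}$ and the vector $x = (1, -2)$ of Example 6:

```python
import numpy as np

# Columns of C are the vectors of gamma; solving C a = x yields a = [x]_gamma.
C = np.array([[1., 3.],
              [2., 4.]])
x = np.array([1., -2.])

coords = np.linalg.solve(C, x)
assert np.allclose(coords, [-5., 2.])   # so [x]_gamma = (-5, 2)
```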

We observed earlier that $\phi_\beta$ is a linear transformation. The next theorem tells us much more.

# Theorem 2.21.

For any finite-dimensional vector space V with ordered basis $\beta$, $\phi_\beta$ is an isomorphism.

Proof.

Exercise.

This theorem provides us with an alternate proof that an n-dimensional vector space is isomorphic to $F^n$ (see the corollary to Theorem 2.19).

Let V and W be vector spaces of dimension n and m, respectively, and let $T: V \to W$ be a linear transformation. Define $A = [T]_\beta^\gamma$, where $\beta$ and $\gamma$ are arbitrary ordered bases of V and W, respectively. We are now able to use $\phi_\beta$ and $\phi_\gamma$ to study the relationship between the linear transformations T and $L_A: F^n \to F^m$.

Let us first consider Figure 2.2. Notice that there are two composites of linear transformations that map V into $F^m$:

1. Map V into $F^n$ with $\phi_\beta$ and follow this transformation with $L_A$; this yields the composite $L_A \phi_\beta$.

2. Map V into W with T and follow it by $\phi_\gamma$ to obtain the composite $\phi_\gamma T$.

These two composites are depicted by the dashed arrows in the diagram. By a simple reformulation of Theorem 2.14 (p. 92), we may conclude that

$$L_A \phi_\beta = \phi_\gamma T;$$

that is, the diagram “commutes.” Heuristically, this relationship indicates that after V and W are identified with $F^n$ and $F^m$ via $\phi_\beta$ and $\phi_\gamma$, respectively, we may “identify” T with $L_A$. This diagram allows us to transfer operations on abstract vector spaces to ones on $F^n$ and $F^m$.

# Example 7

Recall the linear transformation $T: P_3(R) \to P_2(R)$ defined in Example 4 of Section 2.2 by $T(f(x)) = f'(x)$. Let $\beta$ and $\gamma$ be the standard ordered bases for $P_3(R)$ and $P_2(R)$, respectively, and let $\phi_\beta$ and $\phi_\gamma$ be the corresponding standard representations of $P_3(R)$ and $P_2(R)$. If $A = [T]_\beta^\gamma$, then

$$A = \begin{pmatrix} 0 & 1 & 0 & 0 \\ 0 & 0 & 2 & 0 \\ 0 & 0 & 0 & 3 \end{pmatrix}.$$

Consider the polynomial $p(x) = 2 + x - 3x^2 + 5x^3$. We show that $L_A \phi_\beta(p(x)) = \phi_\gamma T(p(x))$. Now

$$L_A \phi_\beta(p(x)) = \begin{pmatrix} 0 & 1 & 0 & 0 \\ 0 & 0 & 2 & 0 \\ 0 & 0 & 0 & 3 \end{pmatrix} \begin{pmatrix} 2 \\ 1 \\ -3 \\ 5 \end{pmatrix} = \begin{pmatrix} 1 \\ -6 \\ 15 \end{pmatrix}.$$

But since $T(p(x)) = p'(x) = 1 - 6x + 15x^2$, we have

$$\phi_\gamma T(p(x)) = \begin{pmatrix} 1 \\ -6 \\ 15 \end{pmatrix}.$$

So $L_A \phi_\beta(p(x)) = \phi_\gamma T(p(x))$.

Try repeating Example 7 with different polynomials p(x).
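Such repetitions are easy to automate. A Python sketch (using numpy, with the differentiation matrix in the standard bases and the polynomial $p(x) = 2 + x - 3x^2 + 5x^3$ of Example 7):

```python
import numpy as np

# Matrix of differentiation T: P_3(R) -> P_2(R) in the standard bases.
A = np.array([[0., 1., 0., 0.],
              [0., 0., 2., 0.],
              [0., 0., 0., 3.]])

p = np.array([2., 1., -3., 5.])  # phi_beta of p(x) = 2 + x - 3x^2 + 5x^3
dp = np.array([1., -6., 15.])    # phi_gamma of p'(x) = 1 - 6x + 15x^2

# L_A phi_beta(p) equals phi_gamma T(p): the diagram commutes on p.
assert np.allclose(A @ p, dp)
```

Changing `p` (and recomputing `dp` by hand) repeats the check for other polynomials.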

# Exercises

1. Label the following statements as true or false. In each part, V and W are vector spaces with ordered (finite) bases $\alpha$ and $\beta$, respectively, $T: V \to W$ is linear, and A and B are matrices.

1. (a) $([T]_\alpha^\beta)^{-1} = [T^{-1}]_\alpha^\beta$.

2. (b) T is invertible if and only if T is one-to-one and onto.

3. (c) $T = L_A$, where $A = [T]_\alpha^\beta$.

4. (d) $M_{2\times 3}(F)$ is isomorphic to $F^5$.

5. (e) $P_n(F)$ is isomorphic to $P_m(F)$ if and only if $n = m$.

6. (f) $AB = I$ implies that A and B are invertible.

7. (g) If A is invertible, then $(A^{-1})^{-1} = A$.

8. (h) A is invertible if and only if $L_A$ is invertible.

9. (i) A must be square in order to possess an inverse.

2. For each of the following linear transformations T, determine whether T is invertible and justify your answer.

1. (a) $T: R^2 \to R^3$ defined by $T(a_1, a_2) = (a_1 - 2a_2, a_2, 3a_1 + 4a_2)$.

2. (b) $T: R^2 \to R^3$ defined by $T(a_1, a_2) = (3a_1 - a_2, a_2, 4a_1)$.

3. (c) $T: R^3 \to R^3$ defined by $T(a_1, a_2, a_3) = (3a_1 - 2a_3, a_2, 3a_1 + 4a_2)$.

4. (d) $T: P_3(R) \to P_2(R)$ defined by $T(p(x)) = p'(x)$.

5. (e) $T: M_{2\times 2}(R) \to P_2(R)$ defined by $T\begin{pmatrix} a & b \\ c & d \end{pmatrix} = a + 2bx + (c + d)x^2$.

6. (f) $T: M_{2\times 2}(R) \to M_{2\times 2}(R)$ defined by $T\begin{pmatrix} a & b \\ c & d \end{pmatrix} = \begin{pmatrix} a + b & a \\ c & c + d \end{pmatrix}$.

3. Which of the following pairs of vector spaces are isomorphic? Justify your answers.

1. (a) $F^3$ and $P_3(F)$.

2. (b) $F^4$ and $P_3(F)$.

3. (c) $M_{2\times 2}(R)$ and $P_3(R)$.

4. (d) $V = \{A \in M_{2\times 2}(R): \operatorname{tr}(A) = 0\}$ and $R^4$.

4. Let A and B be $n \times n$ invertible matrices. Prove that AB is invertible and $(AB)^{-1} = B^{-1}A^{-1}$.

5. Let A be invertible. Prove that $A^t$ is invertible and $(A^t)^{-1} = (A^{-1})^t$. Visit goo.gl/suFm6V for a solution.

6. Prove that if A is invertible and $AB = O$, then $B = O$.

7. Let A be an $n \times n$ matrix.

1. (a) Suppose that $A^2 = O$. Prove that A is not invertible.

2. (b) Suppose that $AB = O$ for some nonzero $n \times n$ matrix B. Could A be invertible? Explain.

8. Let A and B be $n \times n$ matrices such that AB is invertible.

1. (a) Prove that A and B are invertible. Hint: See Exercise 12 of Section 2.3.

2. (b) Give an example to show that a product of nonsquare matrices can be invertible even though the factors, by definition, are not.

9. Let A and B be $n \times n$ matrices such that $AB = I_n$.

1. (a) Use Exercise 8 to conclude that A and B are invertible.

2. (b) Prove $A = B^{-1}$ (and hence $B = A^{-1}$). (We are, in effect, saying that for square matrices, a “one-sided” inverse is a “two-sided” inverse.)

3. (c) State and prove analogous results for linear transformations defined on finite-dimensional vector spaces.

10. Verify that the transformation in Example 5 is one-to-one.

11. Let $\cong$ mean “is isomorphic to.” Prove that $\cong$ is an equivalence relation on the class of vector spaces over F.

12. Let



Construct an isomorphism from V to .

13. Let V and W be n-dimensional vector spaces, and let $T: V \to W$ be a linear transformation. Suppose that $\beta$ is a basis for V. Prove that T is an isomorphism if and only if $T(\beta)$ is a basis for W.

14. Let B be an $n \times n$ invertible matrix. Define $\Phi: M_{n\times n}(F) \to M_{n\times n}(F)$ by $\Phi(A) = B^{-1}AB$. Prove that $\Phi$ is an isomorphism.

15. Let V and W be finite-dimensional vector spaces and $T: V \to W$ be an isomorphism. Let $V_0$ be a subspace of V.

1. (a) Prove that $T(V_0)$ is a subspace of W.

2. (b) Prove that $\dim(V_0) = \dim(T(V_0))$.

16. Repeat Example 7 with the polynomial .

17. In Example 5 of Section 2.1, the mapping $T: M_{2\times 2}(R) \to M_{2\times 2}(R)$ defined by $T(A) = A^t$ for each $A \in M_{2\times 2}(R)$ is a linear transformation. Let $\beta = \{E^{11}, E^{12}, E^{21}, E^{22}\}$, which is a basis for $M_{2\times 2}(R)$, as noted in Example 3 of Section 1.6.

1. (a) Compute $[T]_\beta$.

2. (b) Verify that $L_A \phi_\beta(M) = \phi_\beta T(M)$ for $A = [T]_\beta$ and

$$M = \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix}.$$
18. Let $T: V \to W$ be a linear transformation from an n-dimensional vector space V to an m-dimensional vector space W. Let $\beta$ and $\gamma$ be ordered bases for V and W, respectively. Prove that $\operatorname{rank}(T) = \operatorname{rank}(L_A)$ and that $\operatorname{nullity}(T) = \operatorname{nullity}(L_A)$, where $A = [T]_\beta^\gamma$. Hint: Apply Exercise 17 of Section 2.3 to Figure 2.2.

19. Let V and W be finite-dimensional vector spaces with ordered bases $\beta = \{v_1, v_2, \dots, v_n\}$ and $\gamma = \{w_1, w_2, \dots, w_m\}$, respectively. By Theorem 2.6 (p. 73), there exist linear transformations $T_{ij}: V \to W$ such that

$$T_{ij}(v_k) = \begin{cases} w_i & \text{if } k = j \\ 0 & \text{if } k \ne j. \end{cases}$$

First prove that $\{T_{ij}: 1 \le i \le m,\ 1 \le j \le n\}$ is a basis for $\mathcal{L}(V, W)$. Then let $M^{ij}$ be the $m \times n$ matrix with 1 in the ith row and jth column and 0 elsewhere, and prove that $[T_{ij}]_\beta^\gamma = M^{ij}$. Again by Theorem 2.6, there exists a linear transformation $\Phi: \mathcal{L}(V, W) \to M_{m\times n}(F)$ such that $\Phi(T_{ij}) = M^{ij}$. Prove that $\Phi$ is an isomorphism.

20. Let $c_0, c_1, \dots, c_n$ be distinct scalars from an infinite field F. Define $T: P_n(F) \to F^{n+1}$ by $T(f) = (f(c_0), f(c_1), \dots, f(c_n))$. Prove that T is an isomorphism. Hint: Use the Lagrange polynomials associated with $c_0, c_1, \dots, c_n$.

21. Let W denote the vector space of all sequences in F that have only a finite number of nonzero terms (defined in Exercise 18 of Section 1.6), and let $Z = P(F)$. Define

$$T: W \to Z \quad\text{by}\quad T(\sigma) = \sum_{i=0}^{n} \sigma(i)x^i,$$

where n is the largest integer such that $\sigma(n) \ne 0$. Prove that T is an isomorphism.

The following exercise requires familiarity with the concept of quotient space defined in Exercise 31 of Section 1.3 and with Exercise 42 of Section 2.1.

22. Let V and Z be vector spaces and $T: V \to Z$ be a linear transformation that is onto. Define the mapping

$$\bar{T}: V/N(T) \to Z \quad\text{by}\quad \bar{T}(v + N(T)) = T(v)$$

for any coset $v + N(T)$ in $V/N(T)$.

1. (a) Prove that $\bar{T}$ is well-defined; that is, prove that if $v + N(T) = v' + N(T)$, then $T(v) = T(v')$.

2. (b) Prove that $\bar{T}$ is linear.

3. (c) Prove that $\bar{T}$ is an isomorphism.

4. (d) Prove that the diagram shown in Figure 2.3 commutes; that is, prove that $T = \bar{T}\eta$.

23. Let V be a nonzero vector space over a field F, and suppose that S is a basis for V. (By the corollary to Theorem 1.13 (p. 61) in Section 1.7, every vector space has a basis.) Let $C(S, F)$ denote the vector space of all functions $f \in \mathcal{F}(S, F)$ such that $f(s) = 0$ for all but a finite number of vectors in S. (See Exercise 14 of Section 1.3.) Let $\Psi: C(S, F) \to V$ be defined by $\Psi(f) = 0$ if f is the zero function, and

$$\Psi(f) = \sum_{s \in S,\ f(s) \ne 0} f(s)s$$

otherwise. Prove that $\Psi$ is an isomorphism. Thus every nonzero vector space can be viewed as a space of functions.