# 1.5 Linear Dependence and Linear Independence

Suppose that V is a vector space over an infinite field and that W is a subspace of V. Unless W is the zero subspace, W is an infinite set. It is desirable to find a “small” finite subset S of W that generates W because we can then describe each vector in W as a linear combination of the finite number of vectors in S. Indeed, the smaller S is, the fewer the number of computations required to represent vectors in W as such linear combinations. Consider, for example, the subspace W of $\mathbb{R}^3$ generated by $S = \{u_1, u_2, u_3, u_4\}$, where $u_1 = (2, -1, 4)$, $u_2 = (1, -1, 3)$, $u_3 = (1, 1, -1)$, and $u_4 = (1, -2, -1)$. Let us attempt to find a proper subset of S that also generates W. The search for this subset is related to the question of whether or not some vector in S is a linear combination of the other vectors in S. Now $u_4$ is a linear combination of the other vectors in S if and only if there are scalars $a_1$, $a_2$, and $a_3$ such that

$$u_4 = a_1u_1 + a_2u_2 + a_3u_3,$$

that is, if and only if there are scalars $a_1$, $a_2$, and $a_3$ satisfying

$$(1, -2, -1) = (2a_1 + a_2 + a_3,\ -a_1 - a_2 + a_3,\ 4a_1 + 3a_2 - a_3).$$

Thus $u_4$ is a linear combination of $u_1$, $u_2$, and $u_3$ if and only if the system of linear equations

$$\begin{aligned}
2a_1 + a_2 + a_3 &= 1 \\
-a_1 - a_2 + a_3 &= -2 \\
4a_1 + 3a_2 - a_3 &= -1
\end{aligned}$$

has a solution. The reader should verify that no such solution exists. This does not, however, answer our question of whether some vector in S is a linear combination of the other vectors in S. It can be shown, in fact, that $u_3$ is a linear combination of $u_1$, $u_2$, and $u_4$, namely, $u_3 = 2u_1 - 3u_2 + 0u_4$.
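The relation just mentioned is easy to check by direct computation. A minimal sketch in Python, assuming the coordinates $u_1 = (2,-1,4)$, $u_2 = (1,-1,3)$, $u_3 = (1,1,-1)$, $u_4 = (1,-2,-1)$ of the standard version of this example (the helper `lincomb` is our own, not the text's):

```python
# Vectors of the example; the specific coordinates are an assumption
# taken from the standard version of this example.
u1, u2, u3, u4 = (2, -1, 4), (1, -1, 3), (1, 1, -1), (1, -2, -1)

def lincomb(coeffs, vectors):
    """Componentwise linear combination of equal-length tuples."""
    return tuple(sum(c * v[i] for c, v in zip(coeffs, vectors))
                 for i in range(len(vectors[0])))

# Verify the claimed relation u3 = 2*u1 - 3*u2 + 0*u4.
print(lincomb((2, -3, 0), (u1, u2, u4)) == u3)  # True
```

Checking that the system for $u_4$ has no solution still requires solving (or row-reducing) it by hand; the snippet only confirms the one relation the text asserts.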

In the preceding example, checking that some vector in S is a linear combination of the other vectors in S could require that we solve several different systems of linear equations before we determine which, if any, of $u_1$, $u_2$, $u_3$, and $u_4$ is a linear combination of the others. By formulating our question differently, we can save ourselves some work. Note that since $u_3 = 2u_1 - 3u_2 + 0u_4$, we have

$$-2u_1 + 3u_2 + u_3 - 0u_4 = 0.$$

That is, because some vector in S is a linear combination of the others, the zero vector can be expressed as a linear combination of the vectors in S using coefficients that are not all zero. The converse of this statement is also true: If the zero vector can be written as a linear combination of the vectors in S in which not all the coefficients are zero, then some vector in S is a linear combination of the others. For instance, in the example above, the equation $-2u_1 + 3u_2 + u_3 - 0u_4 = 0$ can be solved for any one of $u_1$, $u_2$, or $u_3$ because each of these has a nonzero coefficient. Therefore any one of $u_1$, $u_2$, or $u_3$ can be written as a linear combination of the other three vectors. Thus, rather than asking whether some vector in S is a linear combination of the other vectors in S, it is more efficient to ask whether the zero vector can be expressed as a linear combination of the vectors in S with coefficients that are not all zero. This observation leads us to the following definition.

# Definition.

A subset S of a vector space V is called *linearly dependent* if there exist a finite number of distinct vectors $u_1, u_2, \ldots, u_n$ in S and scalars $a_1, a_2, \ldots, a_n$, not all zero, such that

$$a_1u_1 + a_2u_2 + \cdots + a_nu_n = 0.$$
In this case we also say that the vectors of S are linearly dependent.

For any vectors $u_1, u_2, \ldots, u_n$, we have $a_1u_1 + a_2u_2 + \cdots + a_nu_n = 0$ if $a_1 = a_2 = \cdots = a_n = 0$. We call this the *trivial representation* of 0 as a linear combination of $u_1, u_2, \ldots, u_n$. Thus, for a set to be linearly dependent, there must exist a nontrivial representation of 0 as a linear combination of vectors in the set. Consequently, any subset of a vector space that contains the zero vector is linearly dependent, because $0 = 1 \cdot 0$ is a nontrivial representation of 0 as a linear combination of vectors in the set.

# Example 1

Consider the set

$$S = \{(1, 3, -4, 2),\ (2, 2, -4, 0),\ (1, -3, 2, -4),\ (-1, 0, 1, 0)\}$$

in $\mathbb{R}^4$. We show that S is linearly dependent and then express one of the vectors in S as a linear combination of the other vectors in S. To show that S is linearly dependent, we must find scalars $a_1$, $a_2$, $a_3$, and $a_4$, not all zero, such that

$$a_1(1, 3, -4, 2) + a_2(2, 2, -4, 0) + a_3(1, -3, 2, -4) + a_4(-1, 0, 1, 0) = (0, 0, 0, 0).$$

Finding such scalars amounts to finding a nonzero solution to the system of linear equations

$$\begin{aligned}
a_1 + 2a_2 + a_3 - a_4 &= 0 \\
3a_1 + 2a_2 - 3a_3 &= 0 \\
-4a_1 - 4a_2 + 2a_3 + a_4 &= 0 \\
2a_1 - 4a_3 &= 0.
\end{aligned}$$

One such solution is $a_1 = 4$, $a_2 = -3$, $a_3 = 2$, and $a_4 = 0$. Thus S is a linearly dependent subset of $\mathbb{R}^4$, and

$$4(1, 3, -4, 2) - 3(2, 2, -4, 0) + 2(1, -3, 2, -4) + 0(-1, 0, 1, 0) = (0, 0, 0, 0).$$

Hence

$$(1, 3, -4, 2) = \tfrac{3}{4}(2, 2, -4, 0) - \tfrac{1}{2}(1, -3, 2, -4) + 0(-1, 0, 1, 0).$$

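Both displayed identities of Example 1 can be checked numerically. A short sketch, assuming the four vectors and the coefficients $4, -3, 2, 0$ of the standard version of this example (`lincomb` is our own helper):

```python
from fractions import Fraction as F

# Example 1 vectors (an assumption; coordinates from the standard
# version of this example).
v1, v2, v3, v4 = (1, 3, -4, 2), (2, 2, -4, 0), (1, -3, 2, -4), (-1, 0, 1, 0)

def lincomb(coeffs, vectors):
    """Componentwise linear combination of equal-length tuples."""
    return tuple(sum(c * v[i] for c, v in zip(coeffs, vectors))
                 for i in range(len(vectors[0])))

# The nontrivial representation of 0 with coefficients 4, -3, 2, 0.
print(lincomb((4, -3, 2, 0), (v1, v2, v3, v4)))             # (0, 0, 0, 0)
# Solving that relation for v1 gives v1 = (3/4)v2 - (1/2)v3 + 0*v4.
print(lincomb((F(3, 4), F(-1, 2), 0), (v2, v3, v4)) == v1)  # True
```

Exact `Fraction` arithmetic avoids floating-point round-off, so the equality test is meaningful.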
# Example 2

In $M_{2 \times 3}(\mathbb{R})$, the set

$$\left\{
\begin{pmatrix} 1 & -3 & 2 \\ -4 & 0 & 5 \end{pmatrix},\
\begin{pmatrix} -3 & 7 & 4 \\ 6 & -2 & -7 \end{pmatrix},\
\begin{pmatrix} -2 & 3 & 11 \\ -1 & -3 & 2 \end{pmatrix}
\right\}$$

is linearly dependent because

$$5\begin{pmatrix} 1 & -3 & 2 \\ -4 & 0 & 5 \end{pmatrix}
+ 3\begin{pmatrix} -3 & 7 & 4 \\ 6 & -2 & -7 \end{pmatrix}
- 2\begin{pmatrix} -2 & 3 & 11 \\ -1 & -3 & 2 \end{pmatrix}
= \begin{pmatrix} 0 & 0 & 0 \\ 0 & 0 & 0 \end{pmatrix}.$$

# Definition.

A subset S of a vector space that is not linearly dependent is called linearly independent. As before, we also say that the vectors of S are linearly independent.

The following facts about linearly independent sets are true in any vector space.

1. The empty set is linearly independent, for linearly dependent sets must be nonempty.

2. A set consisting of a single nonzero vector is linearly independent. For if {u} is linearly dependent, then $au = 0$ for some nonzero scalar a. Thus

$$u = a^{-1}(au) = a^{-1}0 = 0,$$

contrary to the assumption that u is nonzero.
3. A set is linearly independent if and only if the only representations of 0 as linear combinations of its vectors are trivial representations.

The condition in item 3 provides a useful method for determining whether a finite set is linearly independent. This technique is illustrated in the examples that follow.
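The test in item 3 can be mechanized for vectors in $F^n$: row-reduce the matrix whose rows are the given vectors; the set is linearly independent exactly when the rank equals the number of vectors, since a rank deficit means some row is a combination of the others. A sketch in Python using exact rational arithmetic (the helpers `rank` and `is_independent` are ours, not the text's):

```python
from fractions import Fraction

def rank(rows):
    """Rank of a matrix (list of rows) via Gaussian elimination over Q."""
    m = [[Fraction(x) for x in row] for row in rows]
    r = 0  # number of pivot rows found so far
    for col in range(len(m[0]) if m else 0):
        # Find a row at or below position r with a nonzero entry here.
        pivot = next((i for i in range(r, len(m)) if m[i][col] != 0), None)
        if pivot is None:
            continue
        m[r], m[pivot] = m[pivot], m[r]
        for i in range(len(m)):
            if i != r and m[i][col] != 0:
                factor = m[i][col] / m[r][col]
                m[i] = [a - factor * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

def is_independent(vectors):
    """A finite list of vectors is independent iff rank equals its size."""
    return rank(vectors) == len(vectors)

print(is_independent([(1, 0, 0, -1), (0, 1, 0, -1),
                      (0, 0, 1, -1), (0, 0, 0, 1)]))   # True
print(is_independent([(2, -1, 4), (1, -1, 3),
                      (1, 1, -1), (1, -2, -1)]))       # False
```

The second call returns `False` immediately because four vectors in a 3-coordinate space can have rank at most 3.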

# Example 3

To prove that the set

$$S = \{(1, 0, 0, -1),\ (0, 1, 0, -1),\ (0, 0, 1, -1),\ (0, 0, 0, 1)\}$$

is linearly independent, we must show that the only linear combination of vectors in S that equals the zero vector is the one in which all the coefficients are zero. Suppose that $a_1$, $a_2$, $a_3$, and $a_4$ are scalars such that

$$a_1(1, 0, 0, -1) + a_2(0, 1, 0, -1) + a_3(0, 0, 1, -1) + a_4(0, 0, 0, 1) = (0, 0, 0, 0).$$

Equating the corresponding coordinates of the vectors on the left and the right sides of this equation, we obtain the following system of linear equations.

$$\begin{aligned}
a_1 &= 0 \\
a_2 &= 0 \\
a_3 &= 0 \\
-a_1 - a_2 - a_3 + a_4 &= 0
\end{aligned}$$

Clearly the only solution to this system is $a_1 = a_2 = a_3 = a_4 = 0$, and so S is linearly independent.
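As a sanity check (not a proof), one can confirm on a small grid of integer coefficients that the only combination yielding the zero vector is the all-zero one. The vectors below are those of the standard version of Example 3 and should be treated as an assumption:

```python
from itertools import product

# Example 3 vectors (an assumption of this sketch).
vecs = [(1, 0, 0, -1), (0, 1, 0, -1), (0, 0, 1, -1), (0, 0, 0, 1)]

# Collect every coefficient tuple in {-2,...,2}^4 whose combination is 0.
zero_combos = [
    c for c in product(range(-2, 3), repeat=4)
    if all(sum(ci * v[k] for ci, v in zip(c, vecs)) == 0 for k in range(4))
]
print(zero_combos)  # [(0, 0, 0, 0)]
```

The grid search only rules out small integer coefficients; the triangular system in the example is what proves independence over all scalars.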

# Example 4

For $k = 0, 1, \ldots, n$ let $p_k(x) = x^k + x^{k+1} + \cdots + x^n$. The set

$$\{p_0(x), p_1(x), \ldots, p_n(x)\}$$

is linearly independent in $P_n(F)$. For if

$$a_0p_0(x) + a_1p_1(x) + \cdots + a_np_n(x) = 0$$

for some scalars $a_0, a_1, \ldots, a_n$, then

$$a_0 + (a_0 + a_1)x + (a_0 + a_1 + a_2)x^2 + \cdots + (a_0 + a_1 + \cdots + a_n)x^n = 0.$$

By equating the coefficients of $x^k$ on both sides of this equation for $k = 0, 1, \ldots, n$, we obtain

$$\begin{aligned}
a_0 &= 0 \\
a_0 + a_1 &= 0 \\
&\ \ \vdots \\
a_0 + a_1 + \cdots + a_n &= 0.
\end{aligned}$$

Clearly the only solution to this system of linear equations is $a_0 = a_1 = \cdots = a_n = 0$.
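The key step above is that $p_k$ contributes $a_k$ to the coefficient of every power $x^j$ with $j \ge k$, so the coefficient of $x^j$ in the sum is the prefix sum $a_0 + a_1 + \cdots + a_j$. A quick numerical check of that claim for one sample choice of scalars (both $n$ and the scalars are our own illustration):

```python
from itertools import accumulate

n = 3
a = [5, -2, 7, 1]  # sample scalars a_0, ..., a_n (our own choice)

# p_k contributes a_k to the coefficient of x^j exactly when j >= k,
# so the coefficient of x^j in the sum is a_0 + a_1 + ... + a_j.
coeffs = [sum(a[: j + 1]) for j in range(n + 1)]
print(coeffs)                            # [5, 3, 10, 11]
print(coeffs == list(accumulate(a)))     # True
```

If every prefix sum is 0, then $a_0 = 0$, and subtracting consecutive equations forces each later $a_k$ to be 0 in turn, which is exactly the back-substitution used in the example.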

The following important results are immediate consequences of the definitions of linear dependence and linear independence.

# Theorem 1.6.

Let V be a vector space, and let $S_1 \subseteq S_2 \subseteq V$. If $S_1$ is linearly dependent, then $S_2$ is linearly dependent.

Proof. Exercise.

# Corollary.

Let V be a vector space, and let $S_1 \subseteq S_2 \subseteq V$. If $S_2$ is linearly independent, then $S_1$ is linearly independent.

Proof. Exercise.

Earlier in this section, we remarked that the issue of whether S is a minimal generating set for its span (that is, one such that no proper subset of S is a generating set) is related to the question of whether some vector in S is a linear combination of the other vectors in S. Thus the issue of whether S is the smallest generating set for its span is related to the question of whether S is linearly dependent. To see why, consider the subset $S = \{u_1, u_2, u_3, u_4\}$ of $\mathbb{R}^3$, where $u_1 = (2, -1, 4)$, $u_2 = (1, -1, 3)$, $u_3 = (1, 1, -1)$, and $u_4 = (1, -2, -1)$. We have previously noted that S is linearly dependent; in fact,

$$-2u_1 + 3u_2 + u_3 - 0u_4 = 0.$$

This equation implies that $u_3$ (or alternatively, $u_1$ or $u_2$) is a linear combination of the other vectors in S. For example, $u_3 = 2u_1 - 3u_2 + 0u_4$. Therefore every linear combination $a_1u_1 + a_2u_2 + a_3u_3 + a_4u_4$ of vectors in S can be written as a linear combination of $u_1$, $u_2$, and $u_4$:

$$a_1u_1 + a_2u_2 + a_3u_3 + a_4u_4 = a_1u_1 + a_2u_2 + a_3(2u_1 - 3u_2 + 0u_4) + a_4u_4 = (a_1 + 2a_3)u_1 + (a_2 - 3a_3)u_2 + a_4u_4.$$

Thus the subset $S' = \{u_1, u_2, u_4\}$ of S has the same span as S!
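The substitution argument can be spot-checked numerically: for randomly chosen coefficients, the combination using all four vectors agrees with the rewritten combination that omits $u_3$. A sketch, assuming the coordinates of the standard version of this running example (`lincomb` is our own helper):

```python
import random

# Coordinates of the running example (an assumption of this sketch);
# they satisfy u3 = 2*u1 - 3*u2.
u1, u2, u3, u4 = (2, -1, 4), (1, -1, 3), (1, 1, -1), (1, -2, -1)

def lincomb(coeffs, vectors):
    """Componentwise linear combination of equal-length tuples."""
    return tuple(sum(c * v[i] for c, v in zip(coeffs, vectors))
                 for i in range(len(vectors[0])))

random.seed(0)
for _ in range(100):
    a1, a2, a3, a4 = (random.randint(-5, 5) for _ in range(4))
    with_u3 = lincomb((a1, a2, a3, a4), (u1, u2, u3, u4))
    without_u3 = lincomb((a1 + 2 * a3, a2 - 3 * a3, a4), (u1, u2, u4))
    assert with_u3 == without_u3  # substituting u3 = 2u1 - 3u2 never fails
print("every sampled combination lies in span({u1, u2, u4})")
```

Random sampling is only evidence, of course; the algebraic substitution above is what proves the two spans are equal.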

More generally, suppose that S is any linearly dependent set containing two or more vectors. Then some vector  can be written as a linear combination of the other vectors in S, and the subset obtained by removing v from S has the same span as S. It follows that if no proper subset of S generates the span of S, then S must be linearly independent. Another way to view the preceding statement is given in Theorem 1.7.

# Theorem 1.7.

Let S be a linearly independent subset of a vector space V, and let v be a vector in V that is not in S. Then $S \cup \{v\}$ is linearly dependent if and only if $v \in \operatorname{span}(S)$.

Proof. If $S \cup \{v\}$ is linearly dependent, then there are vectors $u_1, u_2, \ldots, u_n$ in $S \cup \{v\}$ such that $a_1u_1 + a_2u_2 + \cdots + a_nu_n = 0$ for some nonzero scalars $a_1, a_2, \ldots, a_n$. Since S is linearly independent, one of the $u_i$'s, say $u_1$, equals v. Thus $a_1v + a_2u_2 + \cdots + a_nu_n = 0$, and so

$$v = a_1^{-1}(-a_2u_2 - \cdots - a_nu_n) = -(a_1^{-1}a_2)u_2 - \cdots - (a_1^{-1}a_n)u_n.$$

Because v is a linear combination of $u_2, \ldots, u_n$, which are in S, we have $v \in \operatorname{span}(S)$.

Conversely, let $v \in \operatorname{span}(S)$. Then there exist vectors $v_1, v_2, \ldots, v_m$ in S and scalars $b_1, b_2, \ldots, b_m$ such that $v = b_1v_1 + b_2v_2 + \cdots + b_mv_m$. Therefore

$$0 = b_1v_1 + b_2v_2 + \cdots + b_mv_m + (-1)v.$$

Note that $v \neq v_i$ for $i = 1, 2, \ldots, m$ because $v \notin S$. Hence the coefficient of v in this linear combination is nonzero, and so the set $\{v_1, v_2, \ldots, v_m, v\}$ is linearly dependent. Thus $S \cup \{v\}$ is linearly dependent by Theorem 1.6.
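A concrete instance of Theorem 1.7 (the vectors here are our own illustration, not the text's): take $S = \{e_1, e_2\}$ in $\mathbb{R}^3$; adjoining a vector in span(S) creates a dependence, while adjoining one outside does not.

```python
# S = {e1, e2} is linearly independent in R^3.
e1, e2 = (1, 0, 0), (0, 1, 0)
v = (1, 1, 0)   # v = 1*e1 + 1*e2, so v lies in span(S)
w = (0, 0, 1)   # w has a nonzero third coordinate, so w is not in span(S)

def add(x, y):
    return tuple(a + b for a, b in zip(x, y))

def neg(x):
    return tuple(-a for a in x)

# 1*e1 + 1*e2 + (-1)*v is a nontrivial representation of the zero
# vector, so S ∪ {v} is linearly dependent, as Theorem 1.7 predicts.
print(add(add(e1, e2), neg(v)))  # (0, 0, 0)
# Every combination a*e1 + b*e2 has third coordinate 0, so no such
# dependence exists for S ∪ {w}: it remains linearly independent.
```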

Linearly independent generating sets are investigated in detail in Section 1.6.

# Exercises

1. Label the following statements as true or false.

(a) If S is a linearly dependent set, then each vector in S is a linear combination of other vectors in S.

(b) Any set containing the zero vector is linearly dependent.

(c) The empty set is linearly dependent.

(d) Subsets of linearly dependent sets are linearly dependent.

(e) Subsets of linearly independent sets are linearly independent.

(f) If $a_1x_1 + a_2x_2 + \cdots + a_nx_n = 0$ and $x_1, x_2, \ldots, x_n$ are linearly independent, then all the scalars $a_j$ are zero.

2. Determine whether the following sets are linearly dependent or linearly independent.

(a)  in 

(b)  in 

(c)  in 

(d)  in 

(e)  in 

(f)  in 

(g)  in 

(h)  in 

(i)  in 

(j)  in 

3. In , prove that the set



is linearly dependent.

4. In $F^n$, let $e_j$ denote the vector whose jth coordinate is 1 and whose other coordinates are 0. Prove that $\{e_1, e_2, \ldots, e_n\}$ is linearly independent.

5. Show that the set $\{1, x, x^2, \ldots, x^n\}$ is linearly independent in $P_n(F)$.

6. In $M_{m \times n}(F)$, let $E^{ij}$ denote the matrix whose only nonzero entry is 1 in the ith row and jth column. Prove that $\{E^{ij} : 1 \le i \le m,\ 1 \le j \le n\}$ is linearly independent.

7. Recall from Example 3 in Section 1.3 that the set of diagonal matrices in  is a subspace. Find a linearly independent set that generates this subspace.

8. Let $S = \{(1, 1, 0), (1, 0, 1), (0, 1, 1)\}$ be a subset of the vector space $F^3$.

(a) Prove that if $F = \mathbb{R}$, then S is linearly independent.

(b) Prove that if F has characteristic two, then S is linearly dependent.

9. Let u and v be distinct vectors in a vector space V. Show that {u, v} is linearly dependent if and only if u or v is a multiple of the other.

10. Give an example of three linearly dependent vectors in $\mathbb{R}^3$ such that none of the three is a multiple of another.

11. Let $S = \{u_1, u_2, \ldots, u_n\}$ be a linearly independent subset of a vector space V over the field $Z_2$. How many vectors are there in span(S)? Justify your answer.

12. Prove Theorem 1.6 and its corollary.

13. Let V be a vector space over a field of characteristic not equal to two.

(a) Let u and v be distinct vectors in V. Prove that {u, v} is linearly independent if and only if $\{u + v, u - v\}$ is linearly independent.

(b) Let u, v, and w be distinct vectors in V. Prove that {u, v, w} is linearly independent if and only if $\{u + v, u + w, v + w\}$ is linearly independent.

14. Prove that a set S is linearly dependent if and only if $S = \{0\}$ or there exist distinct vectors $v, u_1, u_2, \ldots, u_n$ in S such that v is a linear combination of $u_1, u_2, \ldots, u_n$.

15. Let $S = \{u_1, u_2, \ldots, u_n\}$ be a finite set of vectors. Prove that S is linearly dependent if and only if $u_1 = 0$ or $u_{k+1} \in \operatorname{span}(\{u_1, u_2, \ldots, u_k\})$ for some k $(1 \le k < n)$.

16. Prove that a set S of vectors is linearly independent if and only if each finite subset of S is linearly independent.

17. Let M be a square upper triangular matrix (as defined on page 19 of Section 1.3) with nonzero diagonal entries. Prove that the columns of M are linearly independent.

18. Let S be a set of nonzero polynomials in P(F) such that no two have the same degree. Prove that S is linearly independent.

19. Prove that if $\{A_1, A_2, \ldots, A_k\}$ is a linearly independent subset of $M_{n \times n}(F)$, then $\{A_1^t, A_2^t, \ldots, A_k^t\}$ is also linearly independent.

20. Let f and g be the functions defined by $f(t) = e^{rt}$ and $g(t) = e^{st}$, where $r \neq s$. Prove that f and g are linearly independent in F(R, R).

21. Let $S_1$ and $S_2$ be disjoint linearly independent subsets of V. Prove that $S_1 \cup S_2$ is linearly dependent if and only if $\operatorname{span}(S_1) \cap \operatorname{span}(S_2) \neq \{0\}$. Visit goo.gl/Fi8Epr for a solution.