# 1.5 Linear Dependence and Linear Independence

Suppose that V is a vector space over an infinite field and that W is a subspace of V. Unless W is the zero subspace, W is an infinite set. It is desirable to find a "small" finite subset `S` of W that generates W, because we can then describe each vector in W as a linear combination of the finitely many vectors in `S`. Indeed, the smaller `S` is, the fewer computations are required to represent vectors in W as such linear combinations. Consider, for example, the subspace W of ${\text{R}}^{3}$ generated by $S=\{{u}_{1},\text{}{u}_{2},\text{}{u}_{3},\text{}{u}_{4}\}$, where ${u}_{1}=(2,-1,\text{}4),\text{}{u}_{2}=(1,-1,\text{}3),\text{}{u}_{3}=(1,\text{}1,-1)$, and ${u}_{4}=(1,-2,-1)$. Let us attempt to find a proper subset of `S` that also generates W. The search for this subset is related to the question of whether some vector in `S` is a linear combination of the other vectors in `S`. Now ${u}_{4}$ is a linear combination of the other vectors in `S` if and only if there are scalars ${a}_{1},\text{}{a}_{2}$, and ${a}_{3}$ such that

$$u_4 = a_1u_1 + a_2u_2 + a_3u_3,$$

that is, if and only if there are scalars ${a}_{1},\text{}{a}_{2}$, and ${a}_{3}$ satisfying

$$(1, -2, -1) = a_1(2, -1, 4) + a_2(1, -1, 3) + a_3(1, 1, -1).$$

Thus ${u}_{4}$ is a linear combination of ${u}_{1},\text{}{u}_{2}$, and ${u}_{3}$ if and only if the system of linear equations

$$\begin{aligned} 2a_1 + a_2 + a_3 &= 1 \\ -a_1 - a_2 + a_3 &= -2 \\ 4a_1 + 3a_2 - a_3 &= -1 \end{aligned}$$
has a solution. The reader should verify that no such solution exists. This does not, however, answer our question of whether *some* vector in `S` is a linear combination of the other vectors in `S`. It can be shown, in fact, that ${u}_{3}$ is a linear combination of ${u}_{1},\text{}{u}_{2}$, and ${u}_{4}$, namely, ${u}_{3}=2{u}_{1}-3{u}_{2}+0{u}_{4}$.
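Both claims can be verified numerically. A linear system $Aa = b$ is consistent exactly when the coefficient matrix and the augmented matrix have the same rank; a minimal sketch, assuming NumPy:

```python
import numpy as np

u1 = np.array([2., -1., 4.])
u2 = np.array([1., -1., 3.])
u3 = np.array([1., 1., -1.])
u4 = np.array([1., -2., -1.])

# Is u4 a combination of u1, u2, u3?  The system A a = u4 (columns of A
# are u1, u2, u3) is consistent iff rank(A) == rank([A | u4]).
A = np.column_stack([u1, u2, u3])
rank_A = np.linalg.matrix_rank(A)
rank_aug = np.linalg.matrix_rank(np.column_stack([A, u4]))
print(rank_A < rank_aug)        # True: the system has no solution

# But u3 IS a combination of the other vectors: u3 = 2u1 - 3u2 + 0u4.
print(np.allclose(2*u1 - 3*u2 + 0*u4, u3))   # True
```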

In the preceding example, checking that some vector in `S` is a linear combination of the other vectors in `S` could require that we solve several different systems of linear equations before we determine which, if any, of ${u}_{1},\text{}{u}_{2},\text{}{u}_{3}$, and ${u}_{4}$ is a linear combination of the others. By formulating our question differently, we can save ourselves some work. Note that since ${u}_{3}=2{u}_{1}-3{u}_{2}+0{u}_{4}$, we have

$$-2u_1 + 3u_2 + u_3 - 0u_4 = 0.$$
That is, because some vector in `S` is a linear combination of the others, the zero vector can be expressed as a linear combination of the vectors in `S` using coefficients that are not all zero. The converse of this statement is also true: If the zero vector can be written as a linear combination of the vectors in `S` in which not all the coefficients are zero, then some vector in `S` is a linear combination of the others. For instance, in the example above, the equation $-2{u}_{1}+3{u}_{2}+{u}_{3}-0{u}_{4}=0$ can be solved for any one of ${u}_{1},\text{}{u}_{2}$, or ${u}_{3}$ because each of these has a nonzero coefficient. Therefore any one of ${u}_{1},\text{}{u}_{2}$, or ${u}_{3}$ can be written as a linear combination of the other three vectors. Thus, rather than asking whether some vector in `S` is a linear combination of the other vectors in `S`, it is more efficient to ask whether the zero vector can be expressed as a linear combination of the vectors in `S` with coefficients that are not all zero. This observation leads us to the following definition.

# Definition.

*A subset* `S` *of a vector space* V *is called* **linearly dependent** *if there exist a finite number of distinct vectors* ${u}_{1},\text{}{u}_{2},\text{}\dots ,\text{}{u}_{n}$ *in* `S` *and scalars* ${a}_{1},\text{}{a}_{2},\text{}\dots ,\text{}{a}_{n}$, *not all zero, such that*

$$a_1u_1 + a_2u_2 + \cdots + a_nu_n = 0.$$
*In this case we also say that the vectors of* `S` *are linearly dependent.*

For any vectors ${u}_{1},\text{}{u}_{2},\text{}\dots ,\text{}{u}_{n}$, we have ${a}_{1}{u}_{1}+{a}_{2}{u}_{2}+\cdots +{a}_{n}{u}_{n}=0$ if ${a}_{1}={a}_{2}=\cdots ={a}_{n}=0$. We call this the trivial representation of *0* as a linear combination of ${u}_{1},\text{}{u}_{2},\text{}\dots ,\text{}{u}_{n}$. Thus, for a set to be linearly dependent, there must exist a nontrivial representation of *0* as a linear combination of vectors in the set. Consequently, any subset of a vector space that contains the zero vector is linearly dependent, because $0=1\cdot 0$ is a nontrivial representation of *0* as a linear combination of vectors in the set.
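The zero-vector observation is easy to confirm computationally: any matrix whose columns include the zero vector has rank strictly less than its number of columns. A small sketch, assuming NumPy; the two nonzero vectors are illustrative choices of ours:

```python
import numpy as np

# 1*0 + 0*v1 + 0*v2 = 0 is a nontrivial representation of 0, so any set
# containing the zero vector is linearly dependent.  Rank test: the
# matrix with these vectors as columns has rank < number of columns.
vectors = [np.zeros(3), np.array([1., 2., 3.]), np.array([0., 1., 0.])]
A = np.column_stack(vectors)
print(np.linalg.matrix_rank(A) < A.shape[1])   # True: dependent
```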

# Example 1

Consider the set

$$S = \{(1, 3, -4, 2),\ (2, 2, -4, 0),\ (1, -3, 2, -4),\ (-1, 0, 1, 0)\}$$

in ${\text{R}}^{4}$. We show that `S` is linearly dependent and then express one of the vectors in `S` as a linear combination of the other vectors in `S`. To show that `S` is linearly dependent, we must find scalars ${a}_{1},\text{}{a}_{2},\text{}{a}_{3}$, and ${a}_{4}$, not all zero, such that

$$a_1(1, 3, -4, 2) + a_2(2, 2, -4, 0) + a_3(1, -3, 2, -4) + a_4(-1, 0, 1, 0) = (0, 0, 0, 0).$$

Finding such scalars amounts to finding a nonzero solution to the system of linear equations

$$\begin{aligned} a_1 + 2a_2 + a_3 - a_4 &= 0 \\ 3a_1 + 2a_2 - 3a_3 &= 0 \\ -4a_1 - 4a_2 + 2a_3 + a_4 &= 0 \\ 2a_1 - 4a_3 &= 0. \end{aligned}$$

One such solution is ${a}_{1}=4,\text{}{a}_{2}=-3,\text{}{a}_{3}=2$, and ${a}_{4}=0$. Thus `S` is a linearly dependent subset of ${\text{R}}^{4}$, and

$$4(1, 3, -4, 2) - 3(2, 2, -4, 0) + 2(1, -3, 2, -4) + 0(-1, 0, 1, 0) = (0, 0, 0, 0).$$

Hence

$$(1, 3, -4, 2) = \tfrac{3}{4}(2, 2, -4, 0) - \tfrac{1}{2}(1, -3, 2, -4).$$

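A dependence relation of this shape is easy to double-check numerically. The vectors below are an illustrative assumption chosen to match the stated solution $a_1=4$, $a_2=-3$, $a_3=2$, $a_4=0$; a sketch assuming NumPy:

```python
import numpy as np

# Assumed vectors consistent with the solution a1=4, a2=-3, a3=2, a4=0.
u1 = np.array([1., 3., -4., 2.])
u2 = np.array([2., 2., -4., 0.])
u3 = np.array([1., -3., 2., -4.])
u4 = np.array([-1., 0., 1., 0.])

# The nontrivial combination equals the zero vector ...
assert np.allclose(4*u1 - 3*u2 + 2*u3 + 0*u4, np.zeros(4))
# ... and solving the relation for u1 expresses it in terms of u2, u3.
assert np.allclose(u1, 0.75*u2 - 0.5*u3)
```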
# Example 2

In ${\text{M}}_{2\times 3}(R)$, the set

is linearly dependent because

# Definition.

*A subset* `S` *of a vector space that is not linearly dependent is called* **linearly independent**. *As before, we also say that the vectors of* `S` *are linearly independent.*

The following facts about linearly independent sets are true in any vector space.

1. The empty set is linearly independent, for linearly dependent sets must be nonempty.

2. A set consisting of a single nonzero vector is linearly independent. For if $\left\{u\right\}$ is linearly dependent, then $au=0$ for some nonzero scalar `a`. Thus
$$u={a}^{-1}(au)={a}^{-1}0=0.$$

3. A set is linearly independent if and only if the only representations of *0* as linear combinations of its vectors are trivial representations.

The condition in item 3 provides a useful method for determining whether a finite set is linearly independent. This technique is illustrated in the examples that follow.
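As a computational aside, the trivial-representation criterion can also be tested numerically: a finite list of vectors in ${\text{R}}^{n}$ is linearly independent exactly when the matrix having those vectors as columns has rank equal to the number of vectors, i.e. the homogeneous system has only the trivial solution. A minimal sketch, assuming NumPy (the helper name is ours, not from the text):

```python
import numpy as np

def is_linearly_independent(vectors):
    """True iff the only representation of 0 as a linear combination of
    the given vectors is the trivial one.  Equivalent rank test: the
    matrix with the vectors as columns has full column rank."""
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(A) == A.shape[1]

# The set from the opening example is dependent, since u3 = 2u1 - 3u2:
u1, u2, u3 = np.array([2., -1., 4.]), np.array([1., -1., 3.]), np.array([1., 1., -1.])
print(is_linearly_independent([u1, u2, u3]))   # False
print(is_linearly_independent([u1, u2]))       # True
```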

# Example 3

To prove that the set

$$S = \{(1, 0, 0, -1),\ (0, 1, 0, -1),\ (0, 0, 1, -1),\ (0, 0, 0, 1)\}$$

is linearly independent, we must show that the only linear combination of vectors in `S` that equals the zero vector is the one in which all the coefficients are zero. Suppose that ${a}_{1},\text{}{a}_{2},\text{}{a}_{3}$, and ${a}_{4}$ are scalars such that

$$a_1(1, 0, 0, -1) + a_2(0, 1, 0, -1) + a_3(0, 0, 1, -1) + a_4(0, 0, 0, 1) = (0, 0, 0, 0).$$

Equating the corresponding coordinates of the vectors on the left and the right sides of this equation, we obtain the following system of linear equations.

$$\begin{aligned} a_1 &= 0 \\ a_2 &= 0 \\ a_3 &= 0 \\ -a_1 - a_2 - a_3 + a_4 &= 0 \end{aligned}$$
Clearly the only solution to this system is ${a}_{1}={a}_{2}={a}_{3}={a}_{4}=0$, and so `S` is linearly independent.

# Example 4

For $k=0,\text{}1,\text{}\dots ,\text{}n$ let ${p}_{k}(x)={x}^{k}+{x}^{k+1}+\cdots +{x}^{n}$. The set

$$\{p_0(x),\ p_1(x),\ \dots,\ p_n(x)\}$$

is linearly independent in ${\text{P}}_{n}(F)$. For if

$$a_0p_0(x) + a_1p_1(x) + \cdots + a_np_n(x) = 0$$

for some scalars ${a}_{0},\text{}{a}_{1},\text{}\dots ,\text{}{a}_{n}$, then

$$a_0 + (a_0 + a_1)x + (a_0 + a_1 + a_2)x^2 + \cdots + (a_0 + a_1 + \cdots + a_n)x^n = 0.$$

By equating the coefficients of ${x}^{k}$ on both sides of this equation for $k=0,\text{}1,\text{}\dots ,\text{}n$, we obtain

$$\begin{aligned} a_0 &= 0 \\ a_0 + a_1 &= 0 \\ &\ \ \vdots \\ a_0 + a_1 + \cdots + a_n &= 0. \end{aligned}$$

Clearly the only solution to this system of linear equations is ${a}_{0}={a}_{1}=\cdots ={a}_{n}=0$.
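The triangular structure of this system is visible numerically: writing each $p_k$ as its coefficient vector with respect to $1, x, \dots, x^n$ yields a triangular matrix of ones with full rank. A sketch for $n = 4$, assuming NumPy:

```python
import numpy as np

n = 4
# Row k holds the coefficients of p_k(x) = x^k + x^(k+1) + ... + x^n
# with respect to 1, x, ..., x^n: entry j is 1 when j >= k, else 0.
P = np.array([[1.0 if j >= k else 0.0 for j in range(n + 1)]
              for k in range(n + 1)])
# Independence of the polynomials is independence of their coefficient
# vectors; the matrix is triangular with ones on the diagonal.
print(np.linalg.matrix_rank(P) == n + 1)   # True: linearly independent
```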

The following important results are immediate consequences of the definitions of linear dependence and linear independence.

# Theorem 1.6.

*Let* V *be a vector space, and let* ${S}_{1}\subseteq {S}_{2}\subseteq \text{V}$. *If* ${S}_{1}$ *is linearly dependent, then* ${S}_{2}$ *is linearly dependent.*

Proof. Exercise.

# Corollary.

*Let* V *be a vector space, and let* ${S}_{1}\subseteq {S}_{2}\subseteq \text{V}$. *If* ${S}_{2}$ *is linearly independent, then* ${S}_{1}$ *is linearly independent.*

Proof. Exercise.

Earlier in this section, we remarked that the issue of whether `S` is a minimal generating set for its span (that is, one such that no proper subset of `S` is a generating set) is related to the question of whether some vector in `S` is a linear combination of the other vectors in `S`. Thus the issue of whether `S` is the smallest generating set for its span is related to the question of whether `S` is linearly dependent. To see why, consider the subset $S=\{{u}_{1},\text{}{u}_{2},\text{}{u}_{3},\text{}{u}_{4}\}$ of ${\text{R}}^{3}$, where ${u}_{1}=(2,-1,\text{}4),\text{}{u}_{2}=(1,-1,\text{}3),\text{}{u}_{3}=(1,\text{}1,-1)$, and ${u}_{4}=(1,-2,-1)$. We have previously noted that `S` is linearly dependent; in fact,

$$-2u_1 + 3u_2 + u_3 - 0u_4 = 0.$$
This equation implies that ${u}_{3}$ (or alternatively, ${u}_{1}$ or ${u}_{2}$) is a linear combination of the other vectors in `S`. For example, ${u}_{3}=2{u}_{1}-3{u}_{2}+0{u}_{4}$. Therefore every linear combination ${a}_{1}{u}_{1}+{a}_{2}{u}_{2}+{a}_{3}{u}_{3}+{a}_{4}{u}_{4}$ of vectors in `S` can be written as a linear combination of ${u}_{1},\text{}{u}_{2}$, and ${u}_{4}$:

$$a_1u_1 + a_2u_2 + a_3u_3 + a_4u_4 = a_1u_1 + a_2u_2 + a_3(2u_1 - 3u_2) + a_4u_4 = (a_1 + 2a_3)u_1 + (a_2 - 3a_3)u_2 + a_4u_4.$$
Thus the subset ${S}^{\prime}=\left\{{u}_{1},\text{}{u}_{2},\text{}{u}_{4}\right\}$ of `S` has the same span as `S`!
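A rank computation confirms that removing ${u}_{3}$ does not shrink the span: both column spaces, and their combination, have dimension 3. A sketch assuming NumPy:

```python
import numpy as np

u1 = np.array([2., -1., 4.]); u2 = np.array([1., -1., 3.])
u3 = np.array([1., 1., -1.]); u4 = np.array([1., -2., -1.])

S_full = np.column_stack([u1, u2, u3, u4])
S_prime = np.column_stack([u1, u2, u4])        # S' = S with u3 removed

# span(S') = span(S): the two column spaces and their combination all
# share the same rank, so each is contained in the other.
r = np.linalg.matrix_rank
print(r(S_full), r(S_prime), r(np.column_stack([S_full, S_prime])))
```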

More generally, suppose that `S` is any linearly dependent set containing two or more vectors. Then some vector $v\in S$ can be written as a linear combination of the other vectors in `S`, and the subset obtained by removing $v$ from `S` has the same span as `S`. It follows that *if no proper subset of* `S` *generates the span of* `S`, *then* `S` *must be linearly independent.* Another way to view the preceding statement is given in Theorem 1.7.

# Theorem 1.7.

*Let* `S` *be a linearly independent subset of a vector space* V, *and let* `v` *be a vector in* V *that is not in* `S`. *Then* $S\cup \left\{v\right\}$ *is linearly dependent if and only if* $v\in \text{span}(S)$.

Proof. If $S\cup \left\{v\right\}$ is linearly dependent, then there are vectors ${u}_{1},\text{}{u}_{2},\text{}\dots ,\text{}{u}_{n}$ in $S\cup \left\{v\right\}$ such that ${a}_{1}{u}_{1}+{a}_{2}{u}_{2}+\cdots +{a}_{n}{u}_{n}=0$ for some nonzero scalars ${a}_{1},\text{}{a}_{2},\text{}\dots ,\text{}{a}_{n}$. Because `S` is linearly independent, one of the ${u}_{i}$'s, say ${u}_{1}$, equals `v`. Thus ${a}_{1}v+{a}_{2}{u}_{2}+\cdots +{a}_{n}{u}_{n}=0$, and so

$$v = -a_1^{-1}(a_2u_2 + \cdots + a_nu_n).$$
Because `v` is a linear combination of ${u}_{2},\dots ,{u}_{n}$, which are in `S`, we have $v\in \text{span}(S)$.

Conversely, let $v\in \text{span}(S)$. Then there exist vectors ${v}_{1},\text{}{v}_{2},\text{}\dots ,\text{}{v}_{m}$ in `S` and scalars ${b}_{1},\text{}{b}_{2},\text{}\dots ,\text{}{b}_{m}$ such that $v={b}_{1}{v}_{1}+{b}_{2}{v}_{2}+\cdots +{b}_{m}{v}_{m}$. Therefore

$$0 = b_1v_1 + b_2v_2 + \cdots + b_mv_m + (-1)v.$$
Note that $v\ne {v}_{i}$ for $i=1,\text{}2,\text{}\dots ,\text{}m$ because $v\notin S$. Hence the coefficient of `v` in this linear combination is nonzero, and so the set $\{{v}_{1},\text{}{v}_{2},\text{}\dots ,\text{}{v}_{m},\text{}v\}$ is linearly dependent. Thus $S\cup \left\{v\right\}$ is linearly dependent by Theorem 1.6.
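Theorem 1.7 can be illustrated numerically with the standard vectors $e_1, e_2, e_3$ of ${\text{R}}^{3}$ (an example of ours, not from the text): adjoining a vector already in the span destroys independence, while adjoining one outside the span preserves it. A sketch assuming NumPy:

```python
import numpy as np

def independent(vectors):
    # Rank test: full column rank means only the trivial combination is 0.
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(A) == A.shape[1]

e1, e2, e3 = np.eye(3)          # e1, e2, e3 are the rows of the identity
S = [e1, e2]                    # a linearly independent set
print(independent(S + [e1 + e2]))   # False: e1 + e2 lies in span(S)
print(independent(S + [e3]))        # True:  e3 does not lie in span(S)
```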

Linearly independent generating sets are investigated in detail in Section 1.6.

# Exercises

Label the following statements as true or false.

(a) If `S` is a linearly dependent set, then each vector in `S` is a linear combination of other vectors in `S`.

(b) Any set containing the zero vector is linearly dependent.

(c) The empty set is linearly dependent.

(d) Subsets of linearly dependent sets are linearly dependent.

(e) Subsets of linearly independent sets are linearly independent.

(f) If ${a}_{1}{x}_{1}+{a}_{2}{x}_{2}+\cdots +{a}_{n}{x}_{n}=0$ and ${x}_{1},\text{}{x}_{2},\text{}\dots ,\text{}{x}_{n}$ are linearly independent, then all the scalars ${a}_{j}$ are zero.

Determine whether the following sets are linearly dependent or linearly independent.

(a) $\left\{\left(\begin{array}{rr}1& -3\\ -2& 4\end{array}\right),\text{}\left(\begin{array}{rr}-2& 6\\ 4& -8\end{array}\right)\right\}$ in ${\text{M}}_{2\times 2}(R)$

(b) $\left\{\left(\begin{array}{rr}1& -2\\ -1& 4\end{array}\right),\text{}\left(\begin{array}{rr}-1& 1\\ 2& -4\end{array}\right)\right\}$ in ${\text{M}}_{2\times 2}(R)$

(c) $\{{x}^{3}+2{x}^{2},-{x}^{2}+3x+1,\text{}{x}^{3}-{x}^{2}+2x-1\}$ in ${\text{P}}_{3}(R)$

(d) $\{{x}^{3}-x,\text{}2{x}^{2}+4,-2{x}^{3}+3{x}^{2}+2x+6\}$ in ${\text{P}}_{3}(R)$

(e) $\{(1,-1,\text{}2),\text{}(1,-2,\text{}1),\text{}(1,\text{}1,\text{}4)\}$ in ${\text{R}}^{3}$

(f) $\{(1,-1,\text{}2),\text{}(2,\text{}0,\text{}1),\text{}(-1,\text{}2,-1)\}$ in ${\text{R}}^{3}$

(g) $\left\{\left(\begin{array}{rr}1& 0\\ -2& 1\end{array}\right),\text{}\left(\begin{array}{rr}0& -1\\ 1& 1\end{array}\right),\text{}\left(\begin{array}{rr}-1& 2\\ 1& 0\end{array}\right),\text{}\left(\begin{array}{rr}2& 1\\ -4& 4\end{array}\right)\right\}$ in ${\text{M}}_{2\times 2}(R)$

(h) $\left\{\left(\begin{array}{rr}1& 0\\ -2& 1\end{array}\right),\text{}\left(\begin{array}{rr}0& -1\\ 1& 1\end{array}\right),\text{}\left(\begin{array}{rr}-1& 2\\ 1& 0\end{array}\right),\text{}\left(\begin{array}{rr}2& 1\\ 2& -2\end{array}\right)\right\}$ in ${\text{M}}_{2\times 2}(R)$

(i) $\begin{array}{l}\{{x}^{4}-{x}^{3}+5{x}^{2}-8x+6,-{x}^{4}+{x}^{3}-5{x}^{2}+5x-3,\text{}\\ \text{}{x}^{4}+3{x}^{2}-3x+5,\text{}2{x}^{4}+3{x}^{3}+4{x}^{2}-x+1,\text{}{x}^{3}-x+2\}\end{array}$ in ${\text{P}}_{4}(R)$

(j) $\begin{array}{l}\{{x}^{4}-{x}^{3}+5{x}^{2}-8x+6,-{x}^{4}+{x}^{3}-5{x}^{2}+5x-3,\text{}\\ \text{}{x}^{4}+3{x}^{2}-3x+5,\text{}2{x}^{4}+{x}^{3}+4{x}^{2}+8x\}\end{array}$ in ${\text{P}}_{4}(R)$

In ${\text{M}}_{3\times 2}(F)$, prove that the set

$$\left\{\left(\begin{array}{rr}1& 1\\ 0& 0\\ 0& 0\end{array}\right),\text{}\left(\begin{array}{rr}0& 0\\ 1& 1\\ 0& 0\end{array}\right),\text{}\left(\begin{array}{rr}0& 0\\ 0& 0\\ 1& 1\end{array}\right),\text{}\left(\begin{array}{rr}1& 0\\ 1& 0\\ 1& 0\end{array}\right),\text{}\left(\begin{array}{rr}0& 1\\ 0& 1\\ 0& 1\end{array}\right)\right\}$$is linearly dependent.

In ${\text{F}}^{n}$, let ${e}_{j}$ denote the vector whose `j`th coordinate is 1 and whose other coordinates are 0. Prove that $\{{e}_{1},\text{}{e}_{2},\text{}\dots ,\text{}{e}_{n}\}$ is linearly independent.

Show that the set $\{1,\text{}x,\text{}{x}^{2},\text{}\dots ,\text{}{x}^{n}\}$ is linearly independent in ${\text{P}}_{n}(F)$.

In ${\text{M}}_{m\times n}(F)$, let ${E}^{ij}$ denote the matrix whose only nonzero entry is 1 in the `i`th row and `j`th column. Prove that $\{{E}^{ij}:\text{}1\le i\le m,\text{}1\le j\le n\}$ is linearly independent.

Recall from Example 3 in Section 1.3 that the set of diagonal matrices in ${\text{M}}_{2\times 2}(F)$ is a subspace. Find a linearly independent set that generates this subspace.

Let $S=\{(1,\text{}1,\text{}0),\text{}(1,\text{}0,\text{}1),\text{}(0,\text{}1,\text{}1)\}$ be a subset of the vector space ${\text{F}}^{3}$.

(a) Prove that if $F=R$, then `S` is linearly independent.

(b) Prove that if `F` has characteristic two, then `S` is linearly dependent.

^{†} Let `u` and `v` be distinct vectors in a vector space V. Show that $\{u,\ v\}$ is linearly dependent if and only if `u` or `v` is a multiple of the other.

Give an example of three linearly dependent vectors in ${\text{R}}^{3}$ such that none of the three is a multiple of another.

Let $S=\{{u}_{1},{u}_{2},\dots ,{u}_{n}\}$ be a linearly independent subset of a vector space V over the field ${Z}_{2}$. How many vectors are there in $\text{span}(S)$? Justify your answer.

Prove Theorem 1.6 and its corollary.

Let V be a vector space over a field of characteristic not equal to two.

(a) Let `u` and `v` be distinct vectors in V. Prove that $\{u,\ v\}$ is linearly independent if and only if $\{u+v,\ u-v\}$ is linearly independent.

(b) Let `u`, `v`, and `w` be distinct vectors in V. Prove that $\{u,\ v,\ w\}$ is linearly independent if and only if $\{u+v,\text{}u+w,\text{}v+w\}$ is linearly independent.

Prove that a set `S` is linearly dependent if and only if $S=\left\{0\right\}$ or there exist distinct vectors $v,\text{}{u}_{1},\text{}{u}_{2},\text{}\dots ,\text{}{u}_{n}$ in `S` such that $v$ is a linear combination of ${u}_{1},\text{}{u}_{2},\text{}\dots ,\text{}{u}_{n}$.

Let $S=\{{u}_{1},\text{}{u}_{2},\text{}\dots ,\text{}{u}_{n}\}$ be a finite set of vectors. Prove that `S` is linearly dependent if and only if ${u}_{1}=0$ or ${u}_{k+1}\in \text{span}(\{{u}_{1},\text{}{u}_{2},\text{}\dots ,\text{}{u}_{k}\})$ for some $k$ $(1\le k< n)$.

Prove that a set `S` of vectors is linearly independent if and only if each finite subset of `S` is linearly independent.

Let *M* be a square upper triangular matrix (as defined on page 19 of Section 1.3) with nonzero diagonal entries. Prove that the columns of *M* are linearly independent.

Let `S` be a set of nonzero polynomials in `P`(`F`) such that no two have the same degree. Prove that `S` is linearly independent.

Prove that if $\{{A}_{1},\text{}{A}_{2},\text{}\dots ,\text{}{A}_{k}\}$ is a linearly independent subset of ${\text{M}}_{n\times n}(F)$, then $\left\{{A}_{1}^{t},\text{}{A}_{2}^{t},\text{}\dots ,\text{}{A}_{k}^{t}\right\}$ is also linearly independent.

Let $f,\text{}g\in \mathcal{F}(R,\text{}R)$ be the functions defined by $f(t)={e}^{rt}$ and $g(t)={e}^{st}$, where $r\ne s$. Prove that `f` and `g` are linearly independent in $\mathcal{F}(R,\text{}R)$.

Let ${S}_{1}$ and ${S}_{2}$ be disjoint linearly independent subsets of V. Prove that ${S}_{1}\cup {S}_{2}$ is linearly dependent if and only if $\text{span}({S}_{1})\cap \text{span}({S}_{2})\ne \left\{0\right\}$. Visit goo.gl/Fi8Epr for a solution.