
Section 3.2 Linear independence

To motivate the material of this section we begin by revisiting Example 3.1.6. In that example we showed that \(\SpanS\left(\begin{bmatrix}1\\1\end{bmatrix}, \begin{bmatrix}1\\-1\end{bmatrix}, \begin{bmatrix}3\\2\end{bmatrix}\right) = \mathbb{R}^2\text{,}\) but our calculations actually showed us more. For later convenience, let \(S = \SpanS\left(\begin{bmatrix}1\\1\end{bmatrix}, \begin{bmatrix}1\\-1\end{bmatrix}, \begin{bmatrix}3\\2\end{bmatrix}\right)\text{.}\)

In trying to determine whether or not a vector \(\begin{bmatrix}x\\y\end{bmatrix}\) is in \(S\) we are looking for scalars \(r, s, t\) such that

\begin{equation*} \begin{bmatrix}x\\y\end{bmatrix} = r\begin{bmatrix}1\\1\end{bmatrix}+s\begin{bmatrix}1\\-1\end{bmatrix}+t\begin{bmatrix}3\\2\end{bmatrix}\text{.} \end{equation*}

After row reduction we ended up with the following system of equations:

\begin{gather*} r+\frac{5}{2}t = \frac{x+y}{2} \\ s+\frac{1}{2}t = \frac{x-y}{2} \end{gather*}

We see that \(t\) is a free variable, so (no matter what \(x\) and \(y\) are) we can choose \(t\) arbitrarily and still get a solution to our system. In particular, we can always choose \(t=0\text{,}\) and get the solution \(r = \frac{x+y}{2}, s = \frac{x-y}{2}\text{.}\) Looking back at where the variables \(r, s, t\) appeared, choosing \(t=0\) amounts to not using the vector \(\begin{bmatrix}3\\2\end{bmatrix}\) in our linear combination. That is, we have shown that in this example any linear combination of \(\begin{bmatrix}1\\1\end{bmatrix}\text{,}\) \(\begin{bmatrix}1\\-1\end{bmatrix}\text{,}\) and \(\begin{bmatrix}3\\2\end{bmatrix}\) can actually be written as a linear combination of just \(\begin{bmatrix}1\\1\end{bmatrix}\) and \(\begin{bmatrix}1\\-1\end{bmatrix}\text{.}\) In other words,

\begin{equation*} \SpanS\left(\begin{bmatrix}1\\1\end{bmatrix}, \begin{bmatrix}1\\-1\end{bmatrix}, \begin{bmatrix}3\\2\end{bmatrix}\right) = \SpanS\left(\begin{bmatrix}1\\1\end{bmatrix}, \begin{bmatrix}1\\-1\end{bmatrix}\right)\text{.} \end{equation*}
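This redundancy is easy to verify directly. The following short Python sketch (my own check, not part of the original example) computes \(r\) and \(s\) from the formulas above for the vector \(\begin{bmatrix}3\\2\end{bmatrix}\) itself, taking \(t = 0\text{,}\) and confirms that the third vector really is a linear combination of the first two:

```python
from fractions import Fraction

# Target the third vector itself: [x, y] = [3, 2], choosing t = 0.
x, y = 3, 2
r = Fraction(x + y, 2)  # r = 5/2
s = Fraction(x - y, 2)  # s = 1/2

# r*[1, 1] + s*[1, -1] should reproduce [3, 2].
combo = [r + s, r - s]
assert combo == [3, 2]
```

Exact rational arithmetic (`Fraction`) is used so the check is not clouded by floating-point rounding.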

In this example the vector \(\begin{bmatrix}3\\2\end{bmatrix}\) was redundant, in that removing it did not change the vectors we could get as linear combinations. The collection \(S\) can be described as either \(\SpanS\left(\begin{bmatrix}1\\1\end{bmatrix}, \begin{bmatrix}1\\-1\end{bmatrix}, \begin{bmatrix}3\\2\end{bmatrix}\right)\) or as \(\SpanS\left(\begin{bmatrix}1\\1\end{bmatrix}, \begin{bmatrix}1\\-1\end{bmatrix}\right)\text{,}\) but the latter description is more parsimonious. Moreover, we cannot remove any more vectors from our list and still have the span be \(S\text{,}\) because the span of a single vector in \(\mathbb{R}^2\) is a line, while we already know that \(S = \mathbb{R}^2\text{.}\) We therefore have found a collection of vectors with no redundancies whose span is \(S\text{.}\)

The remainder of this section is all about making the idea of "a set of vectors with no redundancies" precise.

Definition 3.2.1.

Let \(\vec{v_1}, \ldots, \vec{v_k}\) be vectors in \(\mathbb{R}^n\text{.}\) We say that \(\vec{v_1},\ldots,\vec{v_k}\) are linearly independent if the only way to write \(a_1\vec{v_1} + \cdots + a_k\vec{v_k} = \vec{0}\) is by using \(a_1 = a_2 = \cdots = a_k = 0\text{.}\) If \(\vec{v_1}, \ldots, \vec{v_k}\) are not linearly independent then they are called linearly dependent.

Note 3.2.2.

Given any vectors \(\vec{v_1}, \ldots, \vec{v_k}\) if we let \(a_1 = a_2 = \cdots = a_k = 0\) then we certainly do get \(a_1\vec{v_1}+\cdots+a_k\vec{v_k} = \vec{0}\text{.}\) We sometimes say that the expression \(0\vec{v_1} + \cdots + 0\vec{v_k}\) is the trivial way of obtaining \(\vec{0}\) as a linear combination of \(\vec{v_1}, \ldots, \vec{v_k}\text{.}\) Saying that \(\vec{v_1}, \ldots, \vec{v_k}\) are linearly independent is saying that there is no non-trivial choice of scalars that will allow us to make a linear combination of \(\vec{v_1}, \ldots, \vec{v_k}\) be \(\vec{0}\text{.}\)

Example 3.2.3.

Any set of vectors containing \(\vec{0}\) is linearly dependent. Indeed, if we have the vectors \(\vec{0}, \vec{v_2}, \ldots, \vec{v_k}\) then we can write \(5\vec{0} + 0\vec{v_2} + \cdots + 0\vec{v_k} = \vec{0}\text{,}\) and not all of the coefficients on the left side are \(0\text{.}\)
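This observation can be checked mechanically. A tiny Python sketch (the vector \([4,5,6]\) is my own arbitrary choice for illustration):

```python
# The zero vector in R^3 together with an arbitrary second vector.
zero = [0, 0, 0]
v2 = [4, 5, 6]

# 5*zero + 0*v2 uses a non-zero coefficient (the 5) yet sums to the
# zero vector, so the relation is non-trivial.
combo = [5 * a + 0 * b for a, b in zip(zero, v2)]
assert combo == [0, 0, 0]
```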

Example 3.2.4.

Let \(\vec{v_1} = \begin{bmatrix}1\\2\\3\end{bmatrix}\text{,}\) \(\vec{v_2} = \begin{bmatrix}2\\1\\0\end{bmatrix}\text{,}\) and \(\vec{v_3} = \begin{bmatrix}0\\1\\2\end{bmatrix}\text{.}\) To determine whether these vectors are linearly independent or linearly dependent we need to find all possible scalars \(a_1, a_2, a_3\) such that \(a_1\vec{v_1} + a_2\vec{v_2}+a_3\vec{v_3}=\vec{0}\text{.}\) That is, we need to find solutions to:

\begin{equation*} a_1\begin{bmatrix}1\\2\\3\end{bmatrix} + a_2\begin{bmatrix}2\\1\\0\end{bmatrix}+a_3\begin{bmatrix}0\\1\\2\end{bmatrix} = \begin{bmatrix}0\\0\\0\end{bmatrix}\text{.} \end{equation*}

If we expand this vector equation into a system of linear equations we obtain:

\begin{gather*} a_1 + 2a_2 + 0a_3 = 0 \\ 2a_1 + a_2 + a_3 = 0\\ 3a_1 + 0a_2 + 2a_3 = 0 \end{gather*}

We turn this into an augmented matrix and row-reduce:

\begin{equation*} \matr{ccc|c}{1 \amp 2 \amp 0 \amp 0 \\ 2 \amp 1 \amp 1 \amp 0 \\ 3 \amp 0 \amp 2 \amp 0} \to \matr{ccc|c}{1 \amp 0 \amp 2/3 \amp 0 \\ 0 \amp 1 \amp -1/3 \amp 0 \\ 0 \amp 0 \amp 0 \amp 0}\text{.} \end{equation*}

From the reduced row echelon form we see that our system of equations has \(a_3\) as a free variable, and hence has infinitely many solutions. In particular, it has solutions other than \(a_1=a_2=a_3=0\text{,}\) so \(\vec{v_1}, \vec{v_2}, \vec{v_3}\) are linearly dependent.
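The row reduction in this example can be reproduced with exact rational arithmetic. Below is a small self-contained Python sketch (my own helper, not part of the text); the zero column of the augmented matrix is omitted, since it is unchanged by row operations. Only two of the three columns turn out to be pivot columns, which is exactly the free variable \(a_3\text{:}\)

```python
from fractions import Fraction

def rref(rows):
    """Reduce a matrix (list of rows) to reduced row echelon form,
    returning the reduced matrix and the list of pivot columns."""
    m = [[Fraction(x) for x in row] for row in rows]
    nrows, ncols = len(m), len(m[0])
    pivot_cols = []
    r = 0
    for c in range(ncols):
        # Find a row at or below r with a non-zero entry in column c.
        pr = next((i for i in range(r, nrows) if m[i][c] != 0), None)
        if pr is None:
            continue
        m[r], m[pr] = m[pr], m[r]      # swap the pivot row into place
        piv = m[r][c]
        m[r] = [x / piv for x in m[r]]  # scale the pivot to 1
        for i in range(nrows):
            if i != r and m[i][c] != 0:
                factor = m[i][c]        # clear the rest of column c
                m[i] = [a - factor * b for a, b in zip(m[i], m[r])]
        pivot_cols.append(c)
        r += 1
        if r == nrows:
            break
    return m, pivot_cols

# Coefficient matrix of the homogeneous system a1*v1 + a2*v2 + a3*v3 = 0.
A = [[1, 2, 0], [2, 1, 1], [3, 0, 2]]
R, pivots = rref(A)
# pivots == [0, 1]: two pivots for three unknowns, so a3 is free.
```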

The first part of the solution to the previous example occurs frequently enough to be worth recording as a theorem.

Theorem 3.2.5.

Let \(\vec{v_1}, \ldots, \vec{v_k}\) be vectors in \(\mathbb{R}^n\text{,}\) and let \(A\) be the matrix whose columns are \(\vec{v_1}, \ldots, \vec{v_k}\text{.}\) Then \(\vec{v_1}, \ldots, \vec{v_k}\) are linearly independent if and only if the homogeneous system represented by \([A|\vec{0}]\) has only the trivial solution.

Proof.

The system represented by \([A|\vec{0}]\) is exactly the same as the system written in vector form as \(a_1\vec{v_1} + \cdots + a_k\vec{v_k} = \vec{0}\text{,}\) so one has a unique solution (which must be the trivial solution) if and only if the other does.

Example 3.2.6.

Are the vectors \(\begin{bmatrix}1\\0\\-1\\2\end{bmatrix}, \begin{bmatrix}1\\1\\1\\1\end{bmatrix}, \begin{bmatrix}2\\4\\3\\-1\end{bmatrix}\) linearly dependent or linearly independent?

Solution.

We set up the appropriate augmented matrix and row reduce:

\begin{equation*} \matr{ccc|c}{1 \amp 1 \amp 2 \amp 0 \\ 0 \amp 1 \amp 4 \amp 0 \\ -1 \amp 1 \amp 3 \amp 0 \\ 2 \amp 1 \amp -1 \amp 0} \to \matr{ccc|c}{1 \amp 0 \amp 0 \amp 0 \\ 0 \amp 1 \amp 0 \amp 0 \\ 0 \amp 0 \amp 1 \amp 0 \\ 0 \amp 0 \amp 0 \amp 0}\text{.} \end{equation*}

We see that our system has a unique solution, so the vectors are linearly independent.
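The uniqueness of the solution can be double-checked with exact arithmetic. The following self-contained Python sketch (my own helper, not from the text) row-reduces the coefficient matrix over the rationals; every column being a pivot column means the only solution is the trivial one:

```python
from fractions import Fraction

def rref(rows):
    """Reduce a matrix (list of rows) to reduced row echelon form,
    returning the reduced matrix and the list of pivot columns."""
    m = [[Fraction(x) for x in row] for row in rows]
    nrows, ncols = len(m), len(m[0])
    pivot_cols = []
    r = 0
    for c in range(ncols):
        pr = next((i for i in range(r, nrows) if m[i][c] != 0), None)
        if pr is None:
            continue
        m[r], m[pr] = m[pr], m[r]
        piv = m[r][c]
        m[r] = [x / piv for x in m[r]]
        for i in range(nrows):
            if i != r and m[i][c] != 0:
                factor = m[i][c]
                m[i] = [a - factor * b for a, b in zip(m[i], m[r])]
        pivot_cols.append(c)
        r += 1
        if r == nrows:
            break
    return m, pivot_cols

# Coefficient matrix; the augmented zero column is omitted since row
# operations never change it.
A = [[1, 1, 2], [0, 1, 4], [-1, 1, 3], [2, 1, -1]]
R, pivots = rref(A)
# pivots == [0, 1, 2]: all three columns are pivot columns.
```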

Our next theorem gives a reason that our definition of linear independence is a good notion of having no redundancies amongst the vectors.

Theorem 3.2.7.

Let \(\vec{v_1}, \ldots, \vec{v_k}\) be vectors in \(\mathbb{R}^n\text{.}\) Then \(\vec{v_1}, \ldots, \vec{v_k}\) are linearly dependent if and only if there is an index \(i\) such that \(\vec{v_i}\) is in \(\SpanS(\vec{v_1}, \ldots, \vec{v_{i-1}}, \vec{v_{i+1}}, \ldots, \vec{v_k})\text{.}\)

Proof.

First suppose that \(\vec{v_1}, \ldots, \vec{v_k}\) are linearly dependent. Then we can find scalars \(a_1, \ldots, a_k\text{,}\) not all zero, such that \(a_1\vec{v_1} + \cdots + a_k\vec{v_k} = \vec{0}\text{.}\) Suppose that \(i\) is one of the indices such that \(a_i \neq 0\text{.}\) Then we can rearrange the linear dependence relation to obtain

\begin{equation*} \vec{v_i} = -\frac{a_1}{a_i}\vec{v_1} - \cdots - \frac{a_{i-1}}{a_i}\vec{v_{i-1}} - \frac{a_{i+1}}{a_i}\vec{v_{i+1}} - \cdots - \frac{a_k}{a_i}\vec{v_k}\text{,} \end{equation*}

which shows that \(\vec{v_i}\) is in \(\SpanS(\vec{v_1}, \ldots, \vec{v_{i-1}}, \vec{v_{i+1}}, \ldots, \vec{v_k})\text{.}\)

For the converse, suppose that \(\vec{v_i}\) is in \(\SpanS(\vec{v_1}, \ldots, \vec{v_{i-1}}, \vec{v_{i+1}}, \ldots, \vec{v_k})\text{.}\) Then there are scalars \(b_1, \ldots, b_{i-1}, b_{i+1}, \ldots, b_k\) such that

\begin{equation*} \vec{v_i} = b_1\vec{v_1} + \cdots + b_{i-1}\vec{v_{i-1}} + b_{i+1}\vec{v_{i+1}} + \cdots + b_k\vec{v_k}\text{,} \end{equation*}

which we can rearrange to

\begin{equation*} b_1\vec{v_1} + \cdots + b_{i-1}\vec{v_{i-1}}+(-1)\vec{v_i}+b_{i+1}\vec{v_{i+1}} + \cdots + b_k\vec{v_k}=\vec{0}\text{.} \end{equation*}

The coefficient of \(\vec{v_i}\) is non-zero, so this equation shows that the vectors \(\vec{v_1}, \ldots, \vec{v_k}\) are linearly dependent.

Example 3.2.8.

Consider again the vectors from Example 3.2.4, \(\vec{v_1} = \begin{bmatrix}1\\2\\3\end{bmatrix}\text{,}\) \(\vec{v_2} = \begin{bmatrix}2\\1\\0\end{bmatrix}\text{,}\) and \(\vec{v_3} = \begin{bmatrix}0\\1\\2\end{bmatrix}\text{.}\) We found that the equation \(a_1\vec{v_1}+a_2\vec{v_2}+a_3\vec{v_3}=\vec{0}\) is true whenever \(a_1 = -\frac{2}{3}a_3\) and \(a_2 = \frac{1}{3}a_3\text{.}\) Let us choose a specific value of \(a_3\text{;}\) to make the fractions go away we can choose \(a_3 = 3\) (but any choice would be fine). Then we get \(a_1 = -2\) and \(a_2 = 1\text{,}\) which gives us the equation

\begin{equation*} -2\begin{bmatrix}1\\2\\3\end{bmatrix} + \begin{bmatrix}2\\1\\0\end{bmatrix}+3\begin{bmatrix}0\\1\\2\end{bmatrix} = \begin{bmatrix}0\\0\\0\end{bmatrix}\text{.} \end{equation*}

We could re-write this equation as

\begin{equation*} \begin{bmatrix}2\\1\\0\end{bmatrix} = 2\begin{bmatrix}1\\2\\3\end{bmatrix} - 3\begin{bmatrix}0\\1\\2\end{bmatrix}\text{,} \end{equation*}

thus showing that \(\SpanS(\vec{v_1}, \vec{v_2}, \vec{v_3}) = \SpanS(\vec{v_1}, \vec{v_3})\text{.}\)
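Both identities are easy to verify componentwise; here is a short Python sketch (my own check, not part of the original example):

```python
# The three vectors from the example.
v1 = [1, 2, 3]
v2 = [2, 1, 0]
v3 = [0, 1, 2]

# The dependence relation: -2*v1 + 1*v2 + 3*v3 should be the zero vector.
relation = [-2 * a + b + 3 * c for a, b, c in zip(v1, v2, v3)]
assert relation == [0, 0, 0]

# The rearranged form: v2 = 2*v1 - 3*v3.
rebuilt_v2 = [2 * a - 3 * c for a, c in zip(v1, v3)]
assert rebuilt_v2 == v2
```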

Note 3.2.9.

Caution! In the previous example we could have rearranged our linear dependence relation differently, and so we could have chosen to remove \(\vec{v_1}\) or \(\vec{v_3}\) instead of \(\vec{v_2}\text{,}\) but it is not always true that we can eliminate any vector we wish.

Theorem 3.2.7 says that if you have a linearly dependent set of vectors then there is at least one vector that is in the span of the others, but it does not say that every vector is in the span of the others. Combined with Theorem 3.1.7, this means that if you have a linearly dependent set then there is at least one vector you can remove without changing the span, but there may also be vectors whose removal does change the span.
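To make the caution concrete, here is a small illustration with vectors of my own choosing (not from the text). The set \(\left\{\begin{bmatrix}1\\0\end{bmatrix}, \begin{bmatrix}2\\0\end{bmatrix}, \begin{bmatrix}0\\1\end{bmatrix}\right\}\) is linearly dependent, and either of the first two vectors can be removed without changing the span, but removing the third does change it:

```python
# A dependent set in R^2: u2 = 2*u1, but u3 is not a combination
# of u1 and u2.
u1, u2, u3 = [1, 0], [2, 0], [0, 1]

# Non-trivial relation: 2*u1 - u2 + 0*u3 = 0.
relation = [2 * a - b + 0 * c for a, b, c in zip(u1, u2, u3)]
assert relation == [0, 0]

# Removing u2 keeps the span, since u2 = 2*u1 + 0*u3.
assert u2 == [2 * a + 0 * c for a, c in zip(u1, u3)]

# But removing u3 changes the span: every combination r*u1 + s*u2
# has second coordinate 0, so it can never equal u3 = [0, 1].
```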

Exercises

1.

Which of the following sets of vectors are linearly independent? Support your answer.
  1. \(\left\{\begin{bmatrix}1\\-1\\0\end{bmatrix}, \begin{bmatrix}3\\2\\-1\end{bmatrix}, \begin{bmatrix}3\\5\\-2\end{bmatrix}\right\}\text{.}\)

  2. \(\left\{\begin{bmatrix}1\\1\\1\end{bmatrix}, \begin{bmatrix}1\\-1\\1\end{bmatrix}, \begin{bmatrix}0\\0\\1\end{bmatrix}\right\}\text{.}\)

  3. \(\displaystyle \left\{\begin{bmatrix}1\\-1\\1\\-1\end{bmatrix}, \begin{bmatrix}2\\0\\1\\0\end{bmatrix}, \begin{bmatrix}0\\-2\\1\\-2\end{bmatrix}\right\}\)

  4. \(\left\{\begin{bmatrix}1\\1\\0\\0\end{bmatrix}, \begin{bmatrix}1\\0\\1\\0\end{bmatrix}, \begin{bmatrix}0\\0\\1\\1\end{bmatrix}, \begin{bmatrix}0\\1\\0\\1\end{bmatrix}\right\}\text{.}\)

2.

Suppose that \(\{\vec{x}, \vec{y}, \vec{z}, \vec{w}\}\) is a linearly independent set in \(\mathbb{R}^n\text{.}\) Which of the following sets must also be independent? Support your answer.
  1. \(\{\vec{x}-\vec{y}, \vec{y}-\vec{z}, \vec{z}-\vec{x}\}\text{.}\)

  2. \(\{\vec{x}+\vec{y}, \vec{y}+\vec{z}, \vec{z}+\vec{x}\}\text{.}\)

  3. \(\{\vec{x}-\vec{y}, \vec{y}-\vec{z}, \vec{z}-\vec{w}, \vec{w}-\vec{x}\}\text{.}\)

  4. \(\{\vec{x}+\vec{y}, \vec{y}+\vec{z}, \vec{z}+\vec{w}, \vec{w}+\vec{x}\}\text{.}\)

3.

Here are some vectors in \(\mathbb{R}^4\text{.}\) Explain why they cannot possibly be linearly independent, then find a non-trivial linear combination of these vectors that is equal to \(\vec{0}\text{.}\)
  1. \(\begin{bmatrix}1\\1\\-1\\1\end{bmatrix}, \begin{bmatrix}1\\2\\-1\\1\end{bmatrix}, \begin{bmatrix}1\\-2\\-1\\1\end{bmatrix}, \begin{bmatrix}1\\2\\0\\1\end{bmatrix}, \begin{bmatrix}1\\-1\\-1\\1\end{bmatrix}\text{.}\)

  2. \(\begin{bmatrix}1\\-1\\-1\\1\end{bmatrix}, \begin{bmatrix}-3\\3\\3\\-3\end{bmatrix}, \begin{bmatrix}1\\0\\-1\\1\end{bmatrix}, \begin{bmatrix}2\\-9\\-2\\2\end{bmatrix}, \begin{bmatrix}1\\0\\0\\1\end{bmatrix}\text{.}\)

4.

Find a linearly independent set of vectors that has the same span as the given vectors.
\begin{equation*} \begin{bmatrix}1\\3\\3\\1\end{bmatrix}, \begin{bmatrix}2\\6\\6\\2\end{bmatrix}, \begin{bmatrix}1\\0\\-3\\1\end{bmatrix}, \begin{bmatrix}1\\2\\1\\1\end{bmatrix}\text{.} \end{equation*}

5.

If \(\{\vec{x_1}, \vec{x_2}, \vec{x_3}, \vec{x_4}, \vec{x_5}, \vec{x_6}\}\) is a linearly independent set of vectors, show that the subset \(\{\vec{x_2}, \vec{x_3}, \vec{x_5}\}\) is also independent.