
Section 3.2 Linear Independence

Objectives
  1. Understand the concept of linear independence.
  2. Learn two criteria for linear independence.
  3. Understand the relationship between linear independence and pivot columns / free variables.
  4. Recipe: test if a set of vectors is linearly independent / find an equation of linear dependence.
  5. Picture: whether a set of vectors in R^2 or R^3 is linearly independent or not.
  6. Vocabulary: linear dependence relation / equation of linear dependence.
  7. Essential Vocabulary: linearly independent, linearly dependent.

Sometimes the span of a set of vectors is “smaller” than you expect from the number of vectors, as in the picture below. This means that (at least) one of the vectors is redundant: it can be removed without affecting the span. In the present section, we formalize this idea in the notion of linear independence.

Figure 1 Pictures of sets of vectors that are linearly dependent: Span{v, w} and Span{u, v, w}. Note that in each case, one vector is in the span of the others, so it doesn't make the span bigger.

Subsection 3.2.1 The Definition of Linear Independence

Definition

A set of vectors {v_1, v_2, ..., v_k} is linearly independent if the vector equation

x_1 v_1 + x_2 v_2 + ··· + x_k v_k = 0

has only the trivial solution x_1 = x_2 = ··· = x_k = 0. The set {v_1, v_2, ..., v_k} is linearly dependent otherwise.

In other words, {v_1, v_2, ..., v_k} is linearly dependent if there exist numbers x_1, x_2, ..., x_k, not all equal to zero, such that

x_1 v_1 + x_2 v_2 + ··· + x_k v_k = 0.

This is called a linear dependence relation or equation of linear dependence.

Note that linear dependence and linear independence are notions that apply to a collection of vectors. It does not make sense to say things like “this vector is linearly dependent on these other vectors,” or “this matrix is linearly independent.”

The definition above leads to the following recipe.

Recipe: Checking linear independence

A set of vectors {v_1, v_2, ..., v_k} is linearly independent if and only if the vector equation

x_1 v_1 + x_2 v_2 + ··· + x_k v_k = 0

has only the trivial solution, if and only if the matrix equation Ax = 0 has only the trivial solution, where A is the matrix with columns v_1, v_2, ..., v_k:

    A = [  |    |         |  ]
        [ v_1  v_2  ···  v_k ]
        [  |    |         |  ].

This is true if and only if A has a pivot position in every column.

Solving the matrix equation Ax = 0 will either verify that the columns v_1, v_2, ..., v_k are linearly independent, or will produce a linear dependence relation by substituting any nonzero values for the free variables.

(Recall that Ax = 0 has a nontrivial solution if and only if A has a column without a pivot: see this observation in Section 3.1.)
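The recipe can be carried out with any exact row reducer. Here is a sketch using SymPy; the library choice and the example vectors are my own, not the book's (the third column was chosen to equal -2 times the first plus the second):

```python
from sympy import Matrix

# Example vectors v_1, v_2, v_3 in R^3 as columns (hypothetical values;
# v_3 = -2*v_1 + v_2, so the set is linearly dependent):
A = Matrix([[1, 2, 0],
            [0, 1, 1],
            [1, 3, 1]])

rref_form, pivots = A.rref()    # row reduce; pivots = indices of pivot columns
print(len(pivots) == A.cols)    # False: not a pivot in every column, so dependent

# A nonzero vector in the null space gives a linear dependence relation:
relation = A.nullspace()[0]     # here (2, -1, 1), i.e. 2*v_1 - v_2 + v_3 = 0
print(A * relation)             # the zero vector, confirming the relation
```

Setting the free variable to any other nonzero value just rescales the relation.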

Suppose that A has more columns than rows. Then A cannot have a pivot in every column (it has at most one pivot per row), so its columns are automatically linearly dependent.

A wide matrix (a matrix with more columns than rows) has linearly dependent columns.

For example, four vectors in R^3 are automatically linearly dependent. Note that a tall matrix may or may not have linearly independent columns.
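To illustrate the wide-matrix fact, this sketch (example values mine) packs four vectors in R^3 into a 3×4 matrix; with at most one pivot per row, a pivot in every column is impossible:

```python
from sympy import Matrix

# Four vectors in R^3 as the columns of a 3x4 matrix (hypothetical values):
A = Matrix([[1, 0, 2, 1],
            [0, 1, 1, 1],
            [1, 1, 0, 3]])

_, pivots = A.rref()
# At most one pivot per row, so at most 3 pivots for the 4 columns:
print(len(pivots) < A.cols)   # True: the columns are linearly dependent
```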

With regard to the first fact, note that the zero vector is a multiple of any vector, so it is collinear with any other vector. Hence facts 1 and 2 are consistent with each other.

Subsection 3.2.2 Criteria for Linear Independence

In this subsection we give two criteria for a set of vectors to be linearly independent. Keep in mind, however, that the actual definition is above.

Warning

In a linearly dependent set {v_1, v_2, ..., v_k}, it is not generally true that every vector v_j is in the span of the others, only that at least one of them is.

For example, the set {(1, 0), (2, 0), (0, 1)} in R^2 is linearly dependent, but (0, 1) is not in the span of the other two vectors. Also see the figure below.
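Both halves of this warning can be checked directly; a sketch (SymPy is my choice of tool here):

```python
from sympy import Matrix

v1, v2, v3 = Matrix([1, 0]), Matrix([2, 0]), Matrix([0, 1])

# The set is dependent: 2*v1 - v2 + 0*v3 = 0 is a dependence relation.
print(2*v1 - v2 + 0*v3)        # the zero vector

# But v3 is NOT in Span{v1, v2}: row reducing the augmented matrix
# (v1 v2 | v3) leaves a pivot in the augmented column, so no solution.
aug = Matrix.hstack(v1, v2, v3)
_, pivots = aug.rref()
print(2 in pivots)             # True: the system is inconsistent
```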

The previous theorem makes precise in what sense a set of linearly dependent vectors is redundant.


We can rephrase this as follows:

If you make a set of vectors by adding one vector at a time, and if the span got bigger every time you added a vector, then your set is linearly independent.
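This increasing-span check can be phrased with ranks, since the rank of a matrix equals the dimension of the span of its columns. A sketch with hypothetical example vectors:

```python
from sympy import Matrix

# Add one vector at a time and watch the dimension of the span (the rank):
vectors = [Matrix([1, 0, 0]), Matrix([1, 1, 0]), Matrix([0, 0, 1])]

ranks = [Matrix.hstack(*vectors[:i + 1]).rank() for i in range(len(vectors))]
print(ranks)   # [1, 2, 3]: the span grew at every step, so the set is independent
```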

Subsection 3.2.3 Pictures of Linear Independence

A set containing one vector {v} is linearly independent when v ≠ 0, since xv = 0 implies x = 0.

[Figure: a nonzero vector v and its span Span{v}]

A set of two noncollinear vectors { v , w } is linearly independent:

[Figure: noncollinear vectors v and w, with Span{v} and Span{w}]

The set of three vectors { v , w , u } below is linearly dependent:

In the picture below, note that v is in Span { u , w } , and w is in Span { u , v } , so we can remove any of the three vectors without shrinking the span.

[Figure: v, w, u with Span{v}, Span{w}, and Span{v, w}]

Two collinear vectors are always linearly dependent:

[Figure: collinear vectors v and w on the line Span{v}]

These three vectors { v , w , u } are linearly dependent: indeed, { v , w } is already linearly dependent, so we can use the third fact.

[Figure: v, w, u with v and w on the line Span{v}]

The two vectors { v , w } below are linearly independent because they are not collinear.

[Figure: noncollinear vectors v and w, with Span{v} and Span{w}]

The three vectors { v , w , u } below are linearly independent: the span got bigger when we added w , then again when we added u , so we can apply the increasing span criterion.

[Figure: v, w, u with Span{v}, Span{w}, and Span{v, w}]

The three coplanar vectors { v , w , u } below are linearly dependent:

[Figure: coplanar vectors v, w, u with Span{v}, Span{w}, and Span{v, w}]

Note that three vectors are linearly dependent if and only if they are coplanar. Indeed, { v , w , u } is linearly dependent if and only if one vector is in the span of the other two, which is a plane (or a line) (or { 0 } ).
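The coplanarity claim can also be stated with ranks: three vectors in R^3 are coplanar exactly when the 3×3 matrix they form has rank at most 2. A sketch with hypothetical example vectors, chosen so that u = v + w:

```python
from sympy import Matrix

v = Matrix([1, 0, 0])
w = Matrix([0, 1, 0])
u = v + w                    # u lies in the plane Span{v, w}

A = Matrix.hstack(v, w, u)
print(A.rank() < 3)          # True: rank <= 2, so the vectors are coplanar
print(A.nullspace() != [])   # True: equivalently, they are linearly dependent
```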

The four vectors {v, w, u, x} below are linearly dependent: they are the columns of a wide matrix. Note however that u is not contained in Span{v, w, x}. See the warning above.

Figure 20 The vectors {v, w, u, x} are linearly dependent, but u is not contained in Span{v, w, x}.

Subsection 3.2.4 Linear Dependence and Free Variables

In light of this important note and this criterion, it is natural to ask which columns of a matrix are redundant, i.e., which we can remove without affecting the column span.


Note that it is necessary to row reduce A to find its pivot columns. However, the span of the columns of the row reduced matrix is generally not equal to the span of the columns of A: one must use the pivot columns of the original matrix. See the theorem in Section 3.4 for a restatement of the above theorem.
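This caveat is easy to see in a small sketch (example matrix mine): row reduction preserves which columns are pivot columns, but not the span of the columns themselves.

```python
from sympy import Matrix

# A rank-1 matrix (hypothetical example): column 2 is twice column 1.
A = Matrix([[1, 2],
            [1, 2],
            [0, 0]])

R, pivots = A.rref()
print(pivots)                 # (0,): column 1 of A is the only pivot column

# The pivot column of the ORIGINAL matrix spans the column space of A,
# but the corresponding column of the row reduced matrix R does not:
print(A.col(0))               # (1, 1, 0): spans the column space of A
print(R.col(0))               # (1, 0, 0): not in that span
print(Matrix.hstack(A.col(0), R.col(0)).rank())   # 2: not collinear
```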

Pivot Columns and Dimension

Let d be the number of pivot columns in the matrix

    A = [  |    |         |  ]
        [ v_1  v_2  ···  v_k ]
        [  |    |         |  ].
  • If d = 1 then Span { v 1 , v 2 ,..., v k } is a line.
  • If d = 2 then Span { v 1 , v 2 ,..., v k } is a plane.
  • If d = 3 then Span { v 1 , v 2 ,..., v k } is a 3-space.
  • Et cetera.

The number d is called the dimension of the span. We have already met this notion informally in two important notes in Section 3.1. We will define this concept rigorously in Section 3.4.
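In computational terms, d is just the rank of A. A sketch (example vectors mine, with the third column equal to the sum of the first two):

```python
from sympy import Matrix

# Three vectors in R^3 as columns (hypothetical values; column 3 = column 1 + column 2):
A = Matrix([[1, 2, 3],
            [1, 0, 1],
            [0, 1, 1]])

_, pivots = A.rref()
d = len(pivots)            # the number of pivot columns
print(d, d == A.rank())    # 2 True: the span of the columns is a plane
```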

There are several different spans associated with any given matrix A. For instance, there is the span of the columns, and there is also the solution set of Ax = 0. Each of these spans has a dimension, and these dimensions can be different. Therefore we do not speak of the dimension of a matrix, since it would be ambiguous which dimension is meant.