
Section 4.3 Linear Transformations

Objectives
  1. Learn how to verify that a transformation is linear, or prove that a transformation is not linear.
  2. Understand the relationship between linear transformations and matrix transformations.
  3. Recipe: compute the matrix of a linear transformation.
  4. Theorem: linear transformations and matrix transformations.
  5. Notation: the standard coordinate vectors $e_1, e_2, \ldots$.
  6. Vocabulary: linear transformation, standard matrix, identity matrix.

In Section 4.1, we studied the geometry of matrices by regarding them as functions, i.e., by considering the associated matrix transformations. We defined some vocabulary (domain, codomain, range), and asked a number of natural questions about a transformation. For a matrix transformation, these translate into questions about matrices, which we have many tools to answer.

In this section, we make a change in perspective. Suppose that we are given a transformation that we would like to study. If we can prove that our transformation is a matrix transformation, then we can use linear algebra to study it. This raises two important questions:

  1. How can we tell if a transformation is a matrix transformation?
  2. If our transformation is a matrix transformation, how do we find its matrix?

For example, we saw in this example in Section 4.1 that the matrix transformation

$$T\colon\mathbb{R}^2 \longrightarrow \mathbb{R}^2 \qquad T(x) = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix} x$$

is a counterclockwise rotation of the plane by $90^\circ$. However, we could have defined $T$ in this way:

$$T\colon\mathbb{R}^2 \longrightarrow \mathbb{R}^2 \qquad T(x) = \text{the counterclockwise rotation of } x \text{ by } 90^\circ.$$

Given this definition, it is not at all obvious that T is a matrix transformation, or what matrix it is associated to.

Subsection 4.3.1 Linear Transformations: Definition

In this section, we introduce the class of transformations that come from matrices.

Definition

A linear transformation is a transformation $T\colon\mathbb{R}^n \to \mathbb{R}^m$ satisfying

$$T(u + v) = T(u) + T(v) \qquad T(cu) = cT(u)$$

for all vectors $u, v$ in $\mathbb{R}^n$ and all scalars $c$.

Let $T\colon\mathbb{R}^n \to \mathbb{R}^m$ be a matrix transformation: $T(x) = Ax$ for an $m \times n$ matrix $A$. By this proposition in Section 2.4, we have

$$T(u + v) = A(u + v) = Au + Av = T(u) + T(v) \qquad T(cu) = A(cu) = cAu = cT(u)$$

for all vectors $u, v$ in $\mathbb{R}^n$ and all scalars $c$. Since a matrix transformation satisfies the two defining properties, it is a linear transformation.
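
As a concrete illustration, here is a minimal NumPy sketch (not part of the text's argument; the matrix and test vectors are arbitrary choices) that spot-checks the two defining properties for a matrix transformation:

```python
import numpy as np

# An arbitrary 2x3 matrix defining T(x) = Ax from R^3 to R^2.
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])

def T(x):
    return A @ x

u = np.array([1.0, -2.0, 0.5])
v = np.array([3.0, 0.0, -1.0])
c = 7.0

# T(u + v) = T(u) + T(v): additivity.
assert np.allclose(T(u + v), T(u) + T(v))
# T(cu) = c T(u): homogeneity.
assert np.allclose(T(c * u), c * T(u))
print("Both defining properties hold for these inputs.")
```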

We will see in the next subsection that the converse is true: every linear transformation is a matrix transformation; we just haven't computed its matrix yet.

These defining properties have two immediate consequences: first, $T(0) = 0$; second, $T(cu + dv) = cT(u) + dT(v)$ for any vectors $u, v$ and any scalars $c, d$. In engineering, the second fact is called the superposition principle; it should remind you of the distributive property. To restate the first fact:

A linear transformation necessarily takes the zero vector to the zero vector.

One can show that, if a transformation is defined by formulas in the coordinates as in the above example, then the transformation is linear if and only if each coordinate is a linear expression in the variables with no constant term.

When deciding whether a transformation $T$ is linear, generally the first thing to do is to check whether $T(0) = 0$; if not, $T$ is automatically not linear. Note however that the non-linear transformations $T_1$ and $T_2$ of the above example do take the zero vector to the zero vector.
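
Here is a minimal NumPy sketch of this check. The transformations `S` and `P` below are illustrative stand-ins of our own (not the text's $T_1$ and $T_2$, which are not reproduced here):

```python
import numpy as np

# A hypothetical transformation with a constant term in one coordinate:
# S(x, y) = (x + y + 1, 2y). The "+1" term makes it non-linear.
def S(x):
    return np.array([x[0] + x[1] + 1.0, 2.0 * x[1]])

# First check: a linear transformation must send 0 to 0.
print(S(np.zeros(2)))  # [1. 0.] != [0. 0.], so S cannot be linear.

# A transformation can pass this first check and still fail to be linear,
# as with the text's T_1 and T_2. For example, P(x, y) = (xy, 0):
def P(x):
    return np.array([x[0] * x[1], 0.0])

u, v = np.array([1.0, 0.0]), np.array([0.0, 1.0])
print(P(np.zeros(2)))          # [0. 0.] -- passes the zero check...
print(P(u + v), P(u) + P(v))   # [1. 0.] vs. [0. 0.] -- ...but additivity fails.
```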

Subsection 4.3.2 The Standard Coordinate Vectors

In the next subsection, we will present the relationship between linear transformations and matrix transformations. Before doing so, we need the following important notation.

Standard coordinate vectors

The standard coordinate vectors in $\mathbb{R}^n$ are the $n$ vectors

$$e_1 = \begin{pmatrix} 1 \\ 0 \\ \vdots \\ 0 \\ 0 \end{pmatrix},\quad e_2 = \begin{pmatrix} 0 \\ 1 \\ \vdots \\ 0 \\ 0 \end{pmatrix},\quad \ldots,\quad e_{n-1} = \begin{pmatrix} 0 \\ 0 \\ \vdots \\ 1 \\ 0 \end{pmatrix},\quad e_n = \begin{pmatrix} 0 \\ 0 \\ \vdots \\ 0 \\ 1 \end{pmatrix}.$$

The $i$th entry of $e_i$ is equal to 1, and the other entries are zero.

From now on, for the rest of the book, we will use the symbols $e_1, e_2, \ldots$ to denote the standard coordinate vectors.

There is an ambiguity in this notation: one has to know from context that $e_1$ is meant to have $n$ entries. That is, the vectors

$$\begin{pmatrix} 1 \\ 0 \end{pmatrix} \quad\text{and}\quad \begin{pmatrix} 1 \\ 0 \\ 0 \end{pmatrix}$$

may both be denoted $e_1$, depending on whether we are discussing vectors in $\mathbb{R}^2$ or in $\mathbb{R}^3$.

The standard coordinate vectors in $\mathbb{R}^2$ and $\mathbb{R}^3$ are pictured below.

[Figure: the standard coordinate vectors $e_1, e_2$ in $\mathbb{R}^2$, and $e_1, e_2, e_3$ in $\mathbb{R}^3$.]

These are the vectors of length 1 that point in the positive directions of each of the axes.
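
In NumPy, a convenient way to produce the standard coordinate vectors is as the columns of the identity matrix (a minimal sketch; the dimension $n = 4$ is an arbitrary choice):

```python
import numpy as np

n = 4
# The columns (equivalently, the rows) of the n x n identity matrix
# are the standard coordinate vectors e_1, ..., e_n.
E = np.eye(n)
for i in range(n):
    print(f"e_{i + 1} =", E[:, i])
```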

For example,

$$\begin{pmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \\ 7 & 8 & 9 \end{pmatrix} \begin{pmatrix} 1 \\ 0 \\ 0 \end{pmatrix} = \begin{pmatrix} 1 \\ 4 \\ 7 \end{pmatrix} \qquad \begin{pmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \\ 7 & 8 & 9 \end{pmatrix} \begin{pmatrix} 0 \\ 1 \\ 0 \end{pmatrix} = \begin{pmatrix} 2 \\ 5 \\ 8 \end{pmatrix} \qquad \begin{pmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \\ 7 & 8 & 9 \end{pmatrix} \begin{pmatrix} 0 \\ 0 \\ 1 \end{pmatrix} = \begin{pmatrix} 3 \\ 6 \\ 9 \end{pmatrix}.$$
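
These computations illustrate a useful pattern: multiplying a matrix by $e_i$ selects its $i$th column. A minimal NumPy sketch of the same computation:

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6],
              [7, 8, 9]])

for i in range(3):
    e_i = np.eye(3, dtype=int)[:, i]
    # Multiplying A by e_i selects the i-th column of A.
    assert np.array_equal(A @ e_i, A[:, i])
    print(f"A e_{i + 1} =", A @ e_i)
```
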
Definition

The $n \times n$ identity matrix is the matrix $I_n$ whose columns are the $n$ standard coordinate vectors in $\mathbb{R}^n$:

$$I_n = \begin{pmatrix} 1 & 0 & \cdots & 0 & 0 \\ 0 & 1 & \cdots & 0 & 0 \\ \vdots & \vdots & \ddots & \vdots & \vdots \\ 0 & 0 & \cdots & 1 & 0 \\ 0 & 0 & \cdots & 0 & 1 \end{pmatrix}.$$

We will see in this example below that the identity matrix is the matrix of the identity transformation.

Subsection 4.3.3 The Matrix of a Linear Transformation

Now we can prove that every linear transformation is a matrix transformation, and we will show how to compute the matrix.

Theorem

Let $T\colon\mathbb{R}^n \to \mathbb{R}^m$ be a linear transformation. Let $A$ be the $m \times n$ matrix

$$A = \begin{pmatrix} | & | & & | \\ T(e_1) & T(e_2) & \cdots & T(e_n) \\ | & | & & | \end{pmatrix}.$$

Then $T$ is the matrix transformation associated with $A$: that is, $T(x) = Ax$.

Proof

Any vector $x$ in $\mathbb{R}^n$ can be written as $x = x_1 e_1 + x_2 e_2 + \cdots + x_n e_n$. Since $T$ is linear,

$$T(x) = x_1 T(e_1) + x_2 T(e_2) + \cdots + x_n T(e_n) = Ax,$$

because multiplying $A$ by $x$ produces exactly this linear combination of the columns of $A$.

The matrix $A$ in the above theorem is called the standard matrix for $T$. The columns of $A$ are the vectors obtained by evaluating $T$ on the $n$ standard coordinate vectors in $\mathbb{R}^n$. To summarize part of the theorem:

Matrix transformations are the same as linear transformations.

Dictionary

Linear transformations are the same as matrix transformations, which come from matrices. The correspondence can be summarized in the following dictionary.

$$\begin{aligned} \text{linear transformation } T\colon\mathbb{R}^n \to \mathbb{R}^m \quad &\longrightarrow\quad m \times n \text{ matrix } A = \begin{pmatrix} | & | & & | \\ T(e_1) & T(e_2) & \cdots & T(e_n) \\ | & | & & | \end{pmatrix} \\ T\colon\mathbb{R}^n \to \mathbb{R}^m,\ T(x) = Ax \quad &\longleftarrow\quad m \times n \text{ matrix } A \end{aligned}$$
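
Here is a minimal NumPy sketch of this recipe, assuming $T$ is available only as a black-box function (the particular transformation below is a hypothetical example): evaluate $T$ on each standard coordinate vector and assemble the results as the columns of $A$.

```python
import numpy as np

# A hypothetical linear transformation R^3 -> R^2, given only as a function:
# T(x, y, z) = (x + 2z, 3y - z).
def T(x):
    return np.array([x[0] + 2 * x[2], 3 * x[1] - x[2]])

n = 3
# Evaluate T on each standard coordinate vector; the results are the columns
# of the standard matrix A. (The rows of np.eye(n) are e_1, ..., e_n.)
A = np.column_stack([T(e) for e in np.eye(n)])
print(A)
# [[ 1.  0.  2.]
#  [ 0.  3. -1.]]

# Sanity check: T(x) == A x for an arbitrary x.
x = np.array([1.0, -1.0, 2.0])
assert np.allclose(T(x), A @ x)
```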

We saw in the above example that the matrix for counterclockwise rotation of the plane by an angle of θ is

$$A = \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}.$$
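
As a quick sanity check (ours, not the text's), evaluating this formula at $\theta = \pi/2$ should recover the $90^\circ$ rotation matrix from the beginning of this section:

```python
import numpy as np

theta = np.pi / 2
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# Up to floating-point roundoff, this is [[0, -1], [1, 0]].
assert np.allclose(A, [[0, -1], [1, 0]])
print(np.round(A).astype(int))
```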

Recall from this definition in Section 4.1 that the identity transformation is the transformation $\mathrm{Id}_{\mathbb{R}^n}\colon\mathbb{R}^n \to \mathbb{R}^n$ defined by $\mathrm{Id}_{\mathbb{R}^n}(x) = x$ for every vector $x$.

We computed in this example that the matrix of the identity transformation is the identity matrix: for every $x$ in $\mathbb{R}^n$,

$$x = \mathrm{Id}_{\mathbb{R}^n}(x) = I_n x.$$

Therefore, $I_n x = x$ for all vectors $x$: the product of the identity matrix and a vector is the same vector.
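
A quick numerical confirmation of this fact, with an arbitrary vector:

```python
import numpy as np

# I_3 x = x for any vector x in R^3.
x = np.array([3.0, -1.0, 4.0])
assert np.array_equal(np.eye(3) @ x, x)
print(np.eye(3) @ x)
```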