Primary Goal
Approximately solve the matrix equation Ax = b.
Let us recall one last time the structure of this book:

1. Solve the matrix equation Ax = b.
2. Solve the matrix equation Ax = λx, where λ is a number.
3. Approximately solve the matrix equation Ax = b.

We have now come to the third part.
Finding approximate solutions of equations generally requires computing the closest vector on a subspace to a given vector. This becomes an orthogonality problem: one needs to know which vectors are perpendicular to the subspace.
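For a concrete preview (the subspace and vector here are chosen purely for illustration), take W = Span{(1, 0, 0), (0, 1, 0)} in R^3 and b = (1, 2, 3). The closest vector on W to b is its orthogonal projection,

\[
\operatorname{proj}_W(b) = \begin{pmatrix} 1 \\ 2 \\ 0 \end{pmatrix},
\qquad
b - \operatorname{proj}_W(b) = \begin{pmatrix} 0 \\ 0 \\ 3 \end{pmatrix},
\]

and the difference b - proj_W(b) is perpendicular to every vector in W; its length, 3, is the distance from b to W. Computing such projections for a general subspace is the subject of Section 7.3.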
First we will define orthogonality and learn to find orthogonal complements of subspaces in Section 7.1 and Section 7.2. The core of this chapter is Section 7.3, in which we discuss the orthogonal projection of a vector onto a subspace; this is a method of calculating the closest vector on a subspace to a given vector. These calculations become easier in the presence of an orthogonal set, as we will see in Section 7.4. In Section 7.5 we will present the least-squares method of approximately solving systems of equations, and we will give applications to data modeling.
In data modeling, one often asks: “What line is my data supposed to lie on?” This question can be answered by a simple application of the least-squares method.
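For instance (the data points below are made up purely for illustration), suppose we ask for the best-fit line y = Mx + B through the three points (0, 6), (1, 0), and (2, 0). A line passing through all three points would be an exact solution of

\[
\begin{pmatrix} 1 & 0 \\ 1 & 1 \\ 1 & 2 \end{pmatrix}
\begin{pmatrix} B \\ M \end{pmatrix}
=
\begin{pmatrix} 6 \\ 0 \\ 0 \end{pmatrix},
\]

and no such solution exists. The least-squares method, which we will see in Section 7.5, instead produces the approximate solution B = 5, M = -3, that is, the best-fit line y = -3x + 5: the line minimizing the sum of the squares of the vertical distances from the data points to the line.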
Gauss invented the method of least squares to find a best-fit ellipse: he correctly predicted the (elliptical) orbit of the asteroid Ceres as it passed behind the sun in 1801.