Unit 3
Vector Space Structure

Introduction

Previous units of this course presented evidence that deep connections exist between systems of linear equations and geometrical structures, such as subspaces of a vector space. In exploring these connections, you will find that the vector space R^n is very important, since it is the standard model for an n-dimensional vector space. That is, any n-dimensional vector space is isomorphic to R^n.

Reviewing briefly, if Ax = b is a system of m equations in n unknowns, then each of the n unknowns can be taken as a Cartesian co-ordinate in R^n; hence, each of the m equations defines an (n − 1)-dimensional hyperplane in R^n. The solution set for the entire system of equations, if it is non-empty, is the intersection of these m hyperplanes. We saw in Unit 2 that the solution set is not a subspace unless the system of equations is homogeneous, and the i-th hyperplane, defined by a_{i1}x_1 + … + a_{in}x_n = b_i, is not a subspace unless b_i = 0.
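
For example, in R^3 the single equation x_1 + x_2 + x_3 = 1 defines a plane (a 2-dimensional hyperplane) that does not pass through the origin, so it is not a subspace; the corresponding homogeneous equation x_1 + x_2 + x_3 = 0 defines a plane through the origin, which is a subspace. Adding a second equation, say x_1 − x_2 = 0, cuts each plane down to a line, namely the intersection of the two hyperplanes.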

On the other hand, if a set S of vectors in R^n is given, then the set lin(S) of all possible linear combinations of elements of S is a subspace, called the subspace spanned by S. Furthermore, if the vectors in S are linearly independent, then S is called a basis for the subspace lin(S).
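
For instance, if S = {(1, 0, 0), (0, 1, 0)} in R^3, then lin(S) is the x_1x_2-plane, the set of all vectors of the form a(1, 0, 0) + b(0, 1, 0) = (a, b, 0). Since the two vectors in S are linearly independent, S is a basis for this 2-dimensional subspace.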

We begin this unit by discussing the fact that any system of m linear equations in n unknowns defines a subspace of R^n, and a related subspace of R^m, both of which are significant for a deeper understanding of these equations. We then study a method for constructing a basis for any vector space, and finally, we consider how a basis of a vector space may be used to introduce co-ordinates in that space, and how these co-ordinates change if the basis is changed. [It is the fact that n independent co-ordinates can be introduced in any n-dimensional vector space that gives the isomorphism to R^n. That is, the co-ordinates of a vector v in an n-dimensional vector space V, with respect to a given basis of V, are just an n-tuple of real numbers, hence a vector in R^n.]
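
For example, B = {(1, 1), (1, −1)} is a basis for R^2, and the vector v = (3, 1) can be written as 2(1, 1) + 1(1, −1), so the co-ordinates of v with respect to B are (2, 1). With respect to the standard basis {(1, 0), (0, 1)}, the same vector has co-ordinates (3, 1); how such co-ordinates are related when the basis is changed is exactly the change of basis problem studied later in this unit.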

The textbook introduces the concept of an inner product space. Just as the general vector spaces you studied in Unit 2 are generalizations of Euclidean n-space, so an inner product space is a vector space equipped with a product of vectors that generalizes the Euclidean dot product. We will not study more general inner product spaces in this course, but will restrict our attention to vector spaces with the Euclidean inner (dot) product. The inner product notation will be used, however, so that if u and v are vectors in a vector space V, we will write <u, v> as well as u · v to indicate the dot product.
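
For example, if u = (1, 2, 3) and v = (4, −1, 2) in R^3, then <u, v> = u · v = (1)(4) + (2)(−1) + (3)(2) = 8, and <u, u> = 1 + 4 + 9 = 14 is the square of the length of u.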

Objectives

After completing this unit you should be able to

  1. define and compute the “orthogonal complement” of a subspace of a vector space.
  2. compute the row space and null space of a matrix, and describe the relationship between these two spaces.
  3. define the terms “orthogonal” and “orthonormal,” and determine whether a given basis or a given set of vectors is orthonormal.
  4. use the Gram-Schmidt process to construct an orthonormal basis in any finite-dimensional vector space (see the brief preview after this list).
  5. find the “least squares” solution for inconsistent systems of linear equations.
  6. define the “change of basis problem,” and find the transition matrix linking two different bases.
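
As a brief preview of objectives 4 and 5: the Gram-Schmidt process turns a basis v_1, …, v_k into an orthogonal set by subtracting projections, starting with u_1 = v_1 and u_2 = v_2 − (<v_2, u_1>/<u_1, u_1>) u_1, and then normalizing each u_i to unit length; and a least squares solution of an inconsistent system Ax = b is a vector that minimizes the length of Ax − b, found by solving the (always consistent) normal equations A^T Ax = A^T b.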