Chapter 4: Vector Spaces
Definition of a Subspace
A subspace of a vector space V is a subset H of V with these three properties:
1. The zero vector of V is in H.
2. H is closed under vector addition: for each u and v in H, the sum u + v is in H.
3. H is closed under scalar multiplication: for each scalar c and each vector v in H, the product cv is also in H.
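The three properties can be spot-checked numerically. A small sketch (assuming NumPy is available) for the example subspace H = {(x, y, 0)} of R³, with a hypothetical membership test `in_H`:

```python
import numpy as np

def in_H(v):
    """Membership test for H = {(x, y, 0) : x, y real}, a plane in R^3."""
    return v.shape == (3,) and v[2] == 0

u = np.array([1.0, 2.0, 0.0])
v = np.array([-3.0, 5.0, 0.0])

# 1. The zero vector of R^3 lies in H.
assert in_H(np.zeros(3))
# 2. Closed under addition (spot check on u, v).
assert in_H(u + v)
# 3. Closed under scalar multiplication (spot check).
assert in_H(4.5 * u)
```

Of course, a finite spot check is not a proof; the properties must hold for all vectors in H.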
Definition of a Vector Space
A vector space is a nonempty set V of objects, called vectors, on which two operations are defined: addition and multiplication by scalars. These operations must satisfy ten axioms that hold for all vectors u, v, w in V and all scalars c and d.
Theorem: Linear Dependence in a Vector Space
An indexed set {v₁,..., vp} of two or more vectors, with v₁ ≠ 0, is linearly dependent if and only if some vector v_k (with k > 1) is a linear combination of the preceding vectors v₁,..., v_{k−1}.
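A numerical sketch of the theorem (assuming NumPy is available): the vectors below are constructed so that v3 is a combination of the preceding vectors, and a least-squares solve recovers the coefficients, confirming dependence:

```python
import numpy as np

v1 = np.array([1., 0., 0.])
v2 = np.array([0., 1., 0.])
v3 = 2*v1 + 3*v2   # v3 is a linear combination of the preceding vectors

# Solve [v1 v2] c = v3; an exact solution means v3 lies in Span{v1, v2},
# so the set {v1, v2, v3} is linearly dependent.
A = np.column_stack([v1, v2])
coeffs, *_ = np.linalg.lstsq(A, v3, rcond=None)
assert np.allclose(A @ coeffs, v3)
```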
Definition of a Basis
Given a subspace H of a vector space V, an indexed set of vectors B = {b₁,..., bp} in V is a basis for H if it meets the following criteria:
1. The vectors of B are linearly independent.
2. The subspace H is the span of B; in other words, H = Span {b₁,..., bp}.
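Both criteria can be checked for a concrete candidate basis. A minimal sketch (assuming NumPy is available) for a plane H in R³ spanned by two hypothetical vectors b1 and b2:

```python
import numpy as np

# Candidate basis for H = Span{b1, b2}, a plane in R^3
b1 = np.array([1., 0., 2.])
b2 = np.array([0., 1., 3.])
B = np.column_stack([b1, b2])

# Criterion 1: linear independence -- the matrix [b1 b2] has full column rank.
assert np.linalg.matrix_rank(B) == 2
# Criterion 2: H = Span{b1, b2} holds here by construction,
# so B is a basis for H.
```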
The Basis Theorem
Given a vector space V with dimension p ≥ 1, any linearly independent set of exactly p vectors is automatically a basis for V. Likewise, any set of exactly p vectors that spans V is automatically a basis for V. Also, any linearly independent set with fewer than p vectors can be expanded to a basis of p vectors.
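The first claim can be illustrated numerically (assuming NumPy is available): in R³, which has dimension 3, three vectors are independent exactly when the matrix holding them as columns has rank 3, and the theorem then says they form a basis with no further spanning check needed:

```python
import numpy as np

# Three vectors in R^3 (dim R^3 = 3), stored as columns.
V = np.array([[1., 0., 1.],
              [1., 1., 0.],
              [0., 1., 1.]])

# Full rank => the three vectors are linearly independent,
# so by the Basis Theorem they automatically form a basis for R^3.
assert np.linalg.matrix_rank(V) == 3
```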
Theorem: Row Space of A and B
If two matrices A and B are row equivalent, then their row spaces are the same. If B is in echelon form, its nonzero rows are linearly independent (each contains a pivot), so the nonzero rows of B form a basis for the row space of B as well as for that of A. Proof sketch: If B is produced from A by a series of row operations, then each row of B is a linear combination of the rows of A, so Row(B) ⊆ Row(A). Because row operations are reversible, each row of A is likewise a linear combination of the rows of B, so Row(A) ⊆ Row(B). Therefore Row(A) = Row(B), and a basis for one is a basis for the other.
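A small sketch of the procedure (assuming SymPy is available, which computes exact echelon forms): row-reduce A, keep the nonzero rows, and the count of those rows matches the rank of A, i.e. the dimension of Row(A):

```python
import sympy as sp

A = sp.Matrix([[1, 2, 3],
               [2, 4, 6],    # 2 * row 1, so Row(A) has dimension 2
               [1, 0, 1]])

B = A.echelon_form()          # B is row equivalent to A
nonzero_rows = [B.row(i) for i in range(B.rows) if any(B.row(i))]

# The nonzero rows of the echelon form are a basis for Row(A) = Row(B).
assert len(nonzero_rows) == A.rank() == 2
```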
The Spanning Set Theorem
Let S = {v₁,..., vp} be a set in V, and let H = Span {v₁,..., vp}. Then:
1. If one vector in S is a linear combination of the other vectors, then that vector can be deleted from S and the remaining vectors will still span H.
2. If H ≠ {0}, then some subset of S is a basis for H.
By this theorem, a basis can be viewed as a linearly independent set that is as large as possible, or equivalently as a spanning set that is as small as possible.
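Part 1 can be illustrated numerically (assuming NumPy is available): deleting a vector that depends on the others leaves the span, and hence its dimension, unchanged:

```python
import numpy as np

v1 = np.array([1., 0.])
v2 = np.array([0., 1.])
v3 = v1 - 2*v2        # v3 is a linear combination of v1 and v2

S       = np.column_stack([v1, v2, v3])
S_minus = np.column_stack([v1, v2])    # v3 deleted from the set

# The span is unchanged: both column spaces have the same dimension.
assert np.linalg.matrix_rank(S) == np.linalg.matrix_rank(S_minus) == 2
```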
Theorem: Basis for Col (A)
The pivot columns of a matrix A form a basis for Col(A), because the columns of A have the same linear dependence relations as the columns of B, the echelon form of A. Note that the pivot positions must be identified from B, but the basis consists of the corresponding columns of A itself, not the columns of B.
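A sketch of this procedure (assuming SymPy is available): `rref` returns the pivot positions found from the reduced form, and the basis is then read off from the columns of the original A:

```python
import sympy as sp

A = sp.Matrix([[1, 2, 0, 1],
               [0, 0, 1, 1],
               [1, 2, 1, 2]])   # row 3 = row 1 + row 2

# Pivot positions are identified from the reduced echelon form...
_, pivot_cols = A.rref()
assert list(pivot_cols) == [0, 2]

# ...but the basis for Col(A) uses the columns of A itself, not of the rref.
basis = [A.col(j) for j in pivot_cols]
assert len(basis) == A.rank()
```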