Linear Algebra Test #3
Coordinate mapping (determined by β)
*x* → [*x*]β
Vector space - add'l facts
0*u* = *0*; c*0* = *0*; -*u* = (-1)*u* (for every *u* in V and every scalar c).
Linear dependence facts
1. A set containing a single vector *v* is linearly independent if and only if *v* ≠ *0*.
2. Any set containing the zero vector is linearly dependent.
3. A set of two vectors is linearly dependent if and only if one of the vectors is a multiple of the other.
Linear Transformation
A linear transformation T from a vector space V into a vector space W is a rule that assigns to each vector *x* in V a unique vector T(*x*) in W, such that (i) T(*u*+*v*) = T(*u*) + T(*v*) for all *u,v* in V, and (ii) T(c*u*) = cT(*u*) for all *u* in V and all scalars c.
Vector space
A nonempty set V of objects, called vectors, on which are defined two operations, called addition and multiplication by scalars (real numbers), subject to the ten axioms (or rules) listed below. The axioms must hold for all vectors *u*, *v*, and *w* in V and for all scalars c and d.
1. The sum of *u* and *v*, denoted by *u*+*v*, is in V.
2. *u*+*v* = *v*+*u*.
3. (*u*+*v*)+*w* = *u*+(*v*+*w*).
4. There is a *zero* vector *0* in V such that *u*+*0* = *u*.
5. For each *u* in V, there is a vector -*u* in V such that *u*+(-*u*) = *0*.
6. The scalar multiple of *u* by c, denoted by c*u*, is in V.
7. c(*u*+*v*) = c*u*+c*v*.
8. (c+d)*u* = c*u*+d*u*.
9. c(d*u*) = (cd)*u*.
10. 1*u* = *u*.
Characteristic Equation
A scalar λ is an eigenvalue of an nxn matrix A if and only if λ satisfies the characteristic equation det (A-λI) = 0
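A worked numeric check (the 2×2 matrix here is a made-up example; NumPy assumed):

```python
import numpy as np

# Hypothetical example: A = [[2, 1], [1, 2]].
# det(A - λI) = (2 - λ)² - 1 = λ² - 4λ + 3 = (λ - 1)(λ - 3),
# so the characteristic equation det(A - λI) = 0 gives λ = 1 and λ = 3.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
print(np.linalg.eigvals(A))  # ≈ 1 and 3 (order may vary)
```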
Diagonalization
A square matrix A is said to be diagonalizable if A is similar to a diagonal matrix, that is, if A = PDP⁻¹ for some invertible matrix P and some diagonal matrix D.
Subspace
A subspace of a vector space V is a subset H of V that has three properties: a. The zero vector of V is in H. b. H is closed under vector addition. That is, for each *u* and *v* in H, the sum *u*+*v* is in H. c. H is closed under multiplication by scalars. That is, for each *u* in H and each scalar c, the vector c*u* is in H.
Eigenvector and eigenvalue
An *eigenvector* of an nxn matrix A is a nonzero vector *x* such that A*x*=λ*x* for some scalar λ. A scalar λ is called an *eigenvalue* of A if there is a nontrivial solution *x* of A*x*=λ*x*; such an *x* is called an eigenvector corresponding to λ.
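A quick check that a given vector satisfies the definition (same hypothetical A as above; NumPy assumed):

```python
import numpy as np

# x = [1, 1] is an eigenvector of A = [[2, 1], [1, 2]] with eigenvalue λ = 3,
# because A x = [3, 3] = 3 x and x is nonzero.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
x = np.array([1.0, 1.0])
print(np.allclose(A @ x, 3.0 * x))  # True
```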
Linear independence
An indexed set of vectors {*v*₁,...,*v*p} in V is said to be linearly independent if the vector equation c₁*v*₁+c₂*v*₂+...+cp*v*p = *0* has only the trivial solution, c₁=0,...,cp=0.
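One way to test the definition numerically: the equation c₁*v*₁+...+cp*v*p = *0* has only the trivial solution exactly when the matrix with the vectors as columns has rank p. A sketch with a made-up set in R³ (NumPy assumed):

```python
import numpy as np

# Columns are v1, v2, v3 -- a hypothetical set in R^3.
V = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0]])
# Independent exactly when V c = 0 has only the trivial solution c = 0,
# i.e. when rank(V) equals the number of vectors.
print(np.linalg.matrix_rank(V) == V.shape[1])  # True: linearly independent
```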
Theorem 4.4
An indexed set {*v*₁,...,*v*p} of two or more vectors, with *v₁≠0*, is linearly dependent if and only if some *v*j (with j>1) is a linear combination of the preceding vectors, *v*₁,...,*v*(j-1).
Theorem 5.5 - The Diagonalization Theorem
An nxn matrix A is diagonalizable if and only if A has n linearly independent eigenvectors. In fact, A = PDP⁻¹, with D a diagonal matrix, if and only if the columns of P are n linearly independent eigenvectors of A. In this case, the diagonal entries of D are eigenvalues of A that correspond, respectively, to the eigenvectors in P. A is diagonalizable if and only if there are enough eigenvectors to form a basis of Rⁿ. We call such a basis an eigenvector basis of Rⁿ.
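A numeric illustration of A = PDP⁻¹ (same hypothetical 2×2 A; NumPy assumed, whose eig routine returns eigenvectors as the columns of its second output):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
eigvals, P = np.linalg.eig(A)      # columns of P: linearly independent eigenvectors
D = np.diag(eigvals)               # diagonal entries: the corresponding eigenvalues
print(np.allclose(A, P @ D @ np.linalg.inv(P)))  # True: A = P D P⁻¹
```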
Theorem 5.6
An nxn matrix with n distinct eigenvalues is diagonalizable. This condition is *sufficient* but not necessary: a matrix with fewer than n distinct eigenvalues may still be diagonalizable.
Basis for Nul A
Finding a basis for Nul A is the same as finding a spanning set for Nul A: the vectors produced by writing the general solution of A*x* = *0* in parametric vector form are automatically linearly independent, so they form a basis.
Theorem 4.1
If *v*₁,...,*v*p are in a vector space V, then Span{*v*₁,...,*v*p} is a subspace of V. *Add'l* We call Span{*v*₁,...,*v*p} *the subspace spanned* (or *generated*) by {*v*₁,...,*v*p}. Given any subspace H of V, a *spanning* (or *generating*) *set* for H is a set {*v*₁,...,*v*p} in H such that H = Span{*v*₁,...,*v*p}.
Theorem 5.2
If *v*₁,...,*v*r are eigenvectors that correspond to distinct eigenvalues *λ*₁,...,*λ*r of an nxn matrix A, then the set {*v*₁,...,*v*r} is linearly independent.
Similarity
If A and B are nxn matrices, then A is similar to B if there is an invertible matrix P such that P⁻¹AP = B, or, equivalently, A = PBP⁻¹.
Row Space
If A is an mxn matrix, each row of A has n entries and thus can be identified with a vector in Rⁿ. The set of all linear combinations of the row vectors is called the row space of A and is denoted by Row A. Row A = Col Aᵀ.
Characteristic Polynomial
If A is an nxn matrix, then det (A-λI) is a polynomial of degree n called the characteristic polynomial of A.
Finite-dimensional and infinite-dimensional
If V is spanned by a finite set, then V is said to be *finite-dimensional*, and the *dimension* of V, written as dim V, is the number of vectors in a basis for V. The dimension of the zero vector space {*0*} is defined to be zero. If V is not spanned by a finite set, then V is said to be *infinite-dimensional*.
Theorem 4.10
If a vector space V has a basis of n vectors, then every basis of V must consist of exactly n vectors.
Theorem 4.9
If a vector space V has a basis β = {*b*₁,...,*b*n}, then any set in V containing more than n vectors must be linearly dependent.
Theorem 5.4
If nxn matrices A and B are similar, then they have the same characteristic polynomial and hence the same eigenvalues (with the same multiplicities).
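A sketch of the theorem on a made-up pair of similar matrices (NumPy assumed; the random P is almost surely invertible):

```python
import numpy as np

rng = np.random.default_rng(0)
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
P = rng.standard_normal((2, 2))       # almost surely invertible
B = np.linalg.inv(P) @ A @ P          # B is similar to A
# Similar matrices share eigenvalues (with the same multiplicities).
print(np.sort(np.linalg.eigvals(A)), np.sort(np.linalg.eigvals(B)))  # both ≈ [2. 3.]
```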
Theorem 4.13
If two matrices A and B are row equivalent, then their row spaces are the same. If B is in echelon form, the nonzero rows of B form a basis for the row space of A as well as for that of B.
Imaginary part of a complex vector *x*
Im *x* = the vector in Rⁿ whose entries are the imaginary parts of the entries of *x*.
Theorem 5.3 - Properties of Determinants
Let A and B be nxn matrices.
a. A is invertible if and only if det A ≠ 0.
b. det AB = (det A)(det B).
c. det Aᵀ = det A.
d. If A is triangular, then det A is the product of the entries on the main diagonal of A.
e. A row replacement operation on A does not change the determinant. A row interchange changes the sign of the determinant. A row scaling also scales the determinant by the same scalar factor.
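A spot-check of properties (b), (c), and the row-interchange rule on random matrices (a sketch; NumPy assumed):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))
print(np.isclose(np.linalg.det(A @ B), np.linalg.det(A) * np.linalg.det(B)))  # (b)
print(np.isclose(np.linalg.det(A.T), np.linalg.det(A)))                       # (c)
A_swapped = A[[1, 0, 2]]              # interchange rows 1 and 2
print(np.isclose(np.linalg.det(A_swapped), -np.linalg.det(A)))                # sign flips
```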
Theorem 5.7
Let A be an nxn matrix whose distinct eigenvalues are λ₁,...,λp.
a. For 1 ≤ k ≤ p, the dimension of the eigenspace for λk is less than or equal to the multiplicity of the eigenvalue λk.
b. The matrix A is diagonalizable if and only if the sum of the dimensions of the eigenspaces equals n, and this happens if and only if (i) the characteristic polynomial factors completely into linear factors and (ii) the dimension of the eigenspace for each λk equals the multiplicity of λk.
c. If A is diagonalizable and βk is a basis for the eigenspace corresponding to λk for each k, then the total collection of vectors in the sets β₁,...,βp forms an eigenvector basis for Rⁿ.
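A classic example where part (b) fails (a sketch; NumPy and SciPy assumed): A = [[1, 1], [0, 1]] has eigenvalue 1 with multiplicity 2, but its eigenspace is only 1-dimensional, so A is not diagonalizable.

```python
import numpy as np
from scipy.linalg import null_space

A = np.array([[1.0, 1.0],
              [0.0, 1.0]])
E = null_space(A - 1.0 * np.eye(2))   # eigenspace for λ = 1 is Nul(A - I)
print(E.shape[1])                     # 1 < 2: eigenspace dimension < multiplicity
```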
Theorem IMT (cont'd, part 2)
Let A be an nxn matrix. Then A is invertible if and only if:
s. The number 0 is not an eigenvalue of A.
t. The determinant of A is not zero.
Theorem IMT (cont'd, part 1)
Let A be an nxn matrix. Then the following statements are each equivalent to the statement that A is an invertible matrix.
m. The columns of A form a basis of Rⁿ.
n. Col A = Rⁿ.
o. dim Col A = n.
p. rank A = n.
q. Nul A = {*0*}.
r. dim Nul A = 0.
Theorem 4.11
Let H be a subspace of a finite-dimensional vector space V. Any linearly independent set in H can be expanded, if necessary, to a basis for H. Also, H is finite-dimensional and dim H≤dim V
Basis
Let H be a subspace of a vector space V. An indexed set of vectors β = {*b*₁,...,*b*p} in V is a basis for H if: (i) β is a linearly independent set, and (ii) the subspace spanned by β coincides with H; that is, H = Span{*b*₁,...,*b*p}.
Theorem 4.5 - Spanning Set Theorem
Let S = {*v*₁,...,*v*p} be a set in V, and let H = Span{*v*₁,...,*v*p}. a. If one of the vectors in S - say, *v*k - is a linear combination of the remaining vectors in S, then the set formed from S by removing *v*k still spans H. b. If H≠{*0*}, some subset of S is a basis for H.
Theorem 4.12 - The Basis Theorem
Let V be a p-dimensional vector space, p≥1. Any linearly independent set of exactly p elements in V is automatically a basis for V. Any set of exactly p elements that spans V is automatically a basis for V.
Theorem 4.7 - The Unique Representation Theorem
Let β = {*b*₁,...,*b*n} be a basis for a vector space V. Then for each *x* in V, there exists a unique set of scalars c₁,...,cn such that *x* = c₁*b*₁+...+cn*b*n.
Theorem 4.8
Let β = {*b*₁,...,*b*n} be a basis for the vector space V. Then the coordinate mapping *x* → [*x*]β is a one-to-one linear transformation from V onto Rⁿ. Such a mapping is an isomorphism.
Implicit Definition
Nul A is defined implicitly, because it is defined by a condition that must be checked.
Change of coordinates matrix
Pβ = [*b*₁ *b*₂ ... *b*n], where *x* = Pβ[*x*]β. Pβ is the change-of-coordinates matrix from β to the standard basis in Rⁿ. Since the columns of Pβ form a basis for Rⁿ, Pβ is invertible by the IMT.
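Since *x* = Pβ[*x*]β, the β-coordinates of *x* come from solving a linear system. A sketch with a made-up basis for R² (NumPy assumed):

```python
import numpy as np

# Hypothetical basis: b1 = [1, 0], b2 = [1, 1].
P_beta = np.array([[1.0, 1.0],
                   [0.0, 1.0]])       # P_beta = [b1 b2]
x = np.array([3.0, 2.0])
x_beta = np.linalg.solve(P_beta, x)   # solve P_beta [x]_beta = x
print(x_beta)                         # [1. 2.], i.e. x = 1*b1 + 2*b2
```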
Real part of a complex vector *x*
Re *x* = the vector in Rⁿ whose entries are the real parts of the entries of *x*.
Standard Basis for Pn
S = {1,t,t²,...,tⁿ}
Explicit Description
Solving the equation A*x*=*0* amounts to producing an explicit description of Nul A.
β-coordinates of x
Suppose β = {*b*₁,...,*b*n} is a basis for V and *x* is in V. The *coordinates of x relative to the basis β* (or the *β-coordinates of x*) are the weights c₁,...,cn such that *x* = c₁*b*₁+...+cn*b*n.
Theorem 4.3
The column space of an mxn matrix A is a subspace of Rᵐ.
Column Space fact
The column space of an mxn matrix A is all of Rᵐ if and only if the equation A*x* = *b* has a solution for each *b* in Rᵐ.
Column Space
The column space of an mxn matrix A, written as Col A, is the set of all linear combinations of the columns of A. If A=[*a₁,...,an*], then Col A = Span{*a₁,...,an*}
Dimensions of Nul A and Col A
The dimension of Nul A is the number of free variables in the equation A*x*=*0*, and the dimension of Col A is the number of pivot columns in A.
Theorem 4.14 - The Rank Theorem
The dimensions of the column space and the row space of an mxn matrix A are equal. This common dimension, the rank of A, also equals the number of pivot positions in A and satisfies the equation rank A + dim Nul A = n.
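A numeric check of rank A + dim Nul A = n on a made-up 3×4 matrix (NumPy and SciPy assumed):

```python
import numpy as np
from scipy.linalg import null_space

A = np.array([[1.0, 2.0, 0.0, 1.0],
              [0.0, 0.0, 1.0, 1.0],
              [1.0, 2.0, 1.0, 2.0]])  # row 3 = row 1 + row 2
rank = np.linalg.matrix_rank(A)       # dim Col A = dim Row A
nullity = null_space(A).shape[1]      # dim Nul A = number of free variables
print(rank, nullity, rank + nullity == A.shape[1])  # 2 2 True
```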
Theorem 5.1
The eigenvalues of a triangular matrix are the entries on its main diagonal.
Kernel of a transformation
The kernel (or null space) of a linear transformation T is the set of all *u* in V such that T(*u*)=*0* (the zero vector in W).
Multiplicity
The multiplicity of an eigenvalue λ is its multiplicity as a root of the characteristic equation.
Theorem 4.2
The null space of an mxn matrix A is a subspace of Rⁿ. Equivalently, the set of all solutions to a system A*x*=*0* of m homogeneous linear equations in n unknowns is a subspace of Rⁿ.
Null Space
The null space of an mxn matrix A, written as Nul A, is the set of all solutions of the homogeneous equation A*x* = *0*. In set notation, Nul A = {*x* : *x* is in Rⁿ and A*x* = *0*}.
Theorem 4.6
The pivot columns of a matrix A form a basis for Col A.
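A sketch of using row reduction to pick out the pivot columns (SymPy assumed; note the basis vectors are columns of A itself, not of the echelon form):

```python
import sympy as sp

A = sp.Matrix([[1, 2, 0, 1],
               [0, 0, 1, 1],
               [1, 2, 1, 2]])
_, pivot_cols = A.rref()                # rref() also returns the pivot column indices
print(pivot_cols)                       # (0, 2)
basis = [A.col(j) for j in pivot_cols]  # pivot columns of A: a basis for Col A
```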
Rank
The rank of A is the dimension of the column space of A.
Zero subspace
The set consisting of only the zero vector in a vector space V is a subspace of V and is written {*0*}.
Eigenspace
The set of all solutions of (A-λI)*x* = *0*, which is the null space of A-λI, is a subspace of Rⁿ and is called the eigenspace of A corresponding to λ.
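Computing an eigenspace as the null space of A - λI (same hypothetical A as earlier; NumPy and SciPy assumed):

```python
import numpy as np
from scipy.linalg import null_space

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
E = null_space(A - 3.0 * np.eye(2))   # eigenspace for λ = 3
print(E / E[0, 0])                    # proportional to [[1.], [1.]]
```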
Standard Basis for Rⁿ
The set {*e*₁,...,*e*n}, where *e*₁ = [1 0 0 ... 0]ᵀ, ..., *e*n = [0 0 ... 0 1]ᵀ.
Linear dependence
The set {*v*₁,...,*v*p} is said to be linearly dependent if the vector equation c₁*v*₁+c₂*v*₂+...+cp*v*p = *0* has a nontrivial solution, that is, if there are some weights, c₁,...,cp, not all zero, such that the equation holds.
β-coordinate vector of x
[*x*]β = [c₁ ... cn]ᵀ.
Subspaces of R³ and their dimensions
~ 0-dimensional subspaces: only the zero subspace.
~ 1-dimensional subspaces: any subspace spanned by a single nonzero vector. Such subspaces are lines through the origin.
~ 2-dimensional subspaces: any subspace spanned by two linearly independent vectors. Such subspaces are planes through the origin.
~ 3-dimensional subspaces: only R³ itself. Any three linearly independent vectors in R³ span all of R³, by the IMT.