Chapter 4: Vector Spaces
Linearly Independent
Indexed set {v1, ..., vp} in V is linearly independent if c1v1 + c2v2 + ... + cpvp = 0 has only the trivial solution, c1 = 0, ..., cp = 0
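A quick way to test a specific set is to place the vectors as columns of a matrix A and check whether Ac = 0 has only the trivial solution. A minimal SymPy sketch (the vectors below are made up for illustration):

from sympy import Matrix

v1, v2, v3 = Matrix([1, 0, 2]), Matrix([0, 1, 1]), Matrix([1, 1, 3])
A = Matrix.hstack(v1, v2, v3)
# An empty null space means only the trivial solution, i.e. the set is independent.
print(A.nullspace() == [])   # False here, since v3 = v1 + v2 (dependent set)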
Set Notation for the Null Space
Nul A = {x : x is in R^n and Ax = 0}
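For a concrete matrix, a basis for this set can be computed with SymPy's nullspace(); a minimal sketch (matrix chosen for illustration, not from the notes):

from sympy import Matrix

A = Matrix([[1, 2, 3],
            [2, 4, 6]])     # 2x3, so Nul A is a subspace of R^3
basis = A.nullspace()       # basis vectors for {x : Ax = 0}
print(len(basis))           # 2, matching the two free variables of Ax = 0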
Column Space
Of an mxn matrix A, is the set of all linear combinations of the columns of A. If A = [a1 ... an], then Col A = Span{a1, ..., an}
Linear Combination
Any sum of scalar multiples of vectors
Dimension of Nul A
Number of free variables in Ax = 0
Column Space of an mxn Matrix A is all of R^m IFF
Ax = b has a solution for each b in R^m
Set Notation for the Column Space
Col A = {b : b = Ax for some x in R^n}
Column Space of a Matrix (Theorem 3)
Column space of an mxn matrix A is a subspace of R^m
Standard Basis
Columns of an nxn identity matrix (In) for R^n
Zero Vector Space Dimension
Defined to be zero
Dimension
Dimension of V is the number of vectors in a basis for V
Nullity
Dimension of the null space of A
Rank Theorem (Theorem 14)
Dimensions of the column space and row space of an mxn matrix A are equal. Rank of A also equals the number of pivot positions in A and satisfies rank A + dim Nul A = n (# of columns)
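The theorem is easy to verify numerically on any concrete matrix; a sketch with an arbitrary 3x4 example:

from sympy import Matrix

A = Matrix([[1, 2, 0, 1],
            [0, 0, 1, 1],
            [1, 2, 1, 2]])           # illustrative 3x4 matrix (row3 = row1 + row2)
rank = A.rank()                      # dim Col A = dim Row A = number of pivots
nullity = len(A.nullspace())         # dim Nul A = number of free variables
print(rank, nullity, rank + nullity) # 2 2 4, and 4 is the number of columns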
Row Space Theorem (Theorem 13)
If 2 matrices A and B are row equivalent then their row spaces are the same. If B is in echelon form, the nonzero rows of B form a basis for the row space of A and of B
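In particular, the nonzero rows of the reduced echelon form of A give a basis for the row space of A. A sketch (illustrative matrix):

from sympy import Matrix

A = Matrix([[1, 2, 0],
            [2, 4, 1],
            [3, 6, 1]])              # row3 = row1 + row2, so dim Row A = 2
R, _ = A.rref()                      # rref(A) is one particular echelon form of A
basis = [R.row(i) for i in range(R.rows) if any(R.row(i))]
print(basis)                         # two nonzero rows: (1, 2, 0) and (0, 0, 1)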
Steady-State Vector (Equilibrium Vector)
If P is a stochastic matrix, then a steady-state vector for P is a probability vector q such that Pq = q
Steady-State Vector & Convergence (Theorem 18)
If P is an nxn regular stochastic matrix, then P has a unique steady-state vector q. If x0 is any initial state and x_(k+1) = Px_k for k = 0, 1, 2, ..., then the Markov chain {xk} converges to q as k --> infinity
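A numerical illustration of this theorem (the matrix below is an arbitrary regular stochastic matrix, not from the notes):

import numpy as np

P = np.array([[0.9, 0.2],
              [0.1, 0.8]])       # columns are probability vectors; P is regular
x = np.array([1.0, 0.0])         # any initial probability vector x0
for _ in range(50):
    x = P @ x                    # iterate x_(k+1) = P x_k
print(x)                         # approx. [0.6667, 0.3333]

# q can also be found directly: solve Pq = q, i.e. (P - I)q = 0, and scale the
# eigenvector so its entries sum to 1.
w, v = np.linalg.eig(P)
q = v[:, np.argmax(np.isclose(w, 1.0))].real
print(q / q.sum())               # [2/3, 1/3], the unique steady-state vector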
Basis of a Vector Space (Theorem 9)
If a vector space V has a basis B = {b1, ..., bn}, then any set in V containing more than n vectors must be linearly dependent
Basis of a Vector Space Pt. 2 (Theorem 10)
If a vector space V has a basis of n vectors, then every basis of V must consist of exactly n vectors
B-Coordinate Vector of x
If c1, ..., cn are the B-coordinates of x, then the vector [x]_B = [c1 ... cn] in R^n
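Computing [x]_B amounts to solving the linear system P_B [x]_B = x, where P_B has the basis vectors as columns. A sketch with made-up basis vectors:

import numpy as np

b1, b2 = np.array([2.0, 1.0]), np.array([-1.0, 1.0])   # illustrative basis B of R^2
P_B = np.column_stack([b1, b2])
x = np.array([4.0, 5.0])
coords = np.linalg.solve(P_B, x)   # weights c1, c2 with x = c1*b1 + c2*b2
print(coords)                      # [x]_B = [3, 2]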
Spanned Subspace (Theorem 1)
If v1, ..., vp are in a vector space V, then Span{v1, ..., vp} is a subspace of V
Linearly Dependent Sets (Theorem 4)
Indexed set {v1, ..., vp} of 2 or more vectors, with v1 ≠ 0, is linearly dependent IFF some vj (with j > 1) is a linear combination of the preceding vectors v1, ..., v(j-1)
Invertible Matrix Theorem (Continued)
Let A be an nxn matrix. Following statements are each equivalent to the statement that A is an invertible matrix: 1) Columns of A form a basis of R^n 2) Col A = R^n 3) dim Col A = n 4) rank A = n 5) Nul A = {0} 6) dim Nul A = 0
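These conditions stand or fall together, so checking any one of them on a concrete matrix settles invertibility. A sketch (illustrative 3x3 matrix):

from sympy import Matrix

A = Matrix([[2, 1, 0],
            [1, 1, 1],
            [0, 1, 3]])
print(A.rank() == 3)           # rank A = n
print(A.nullspace() == [])     # Nul A = {0}, so dim Nul A = 0
print(A.det() != 0)            # all True: A is invertible and Col A = R^3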
Change of Coordinates Matrix (Theorem 15)
Let B = {b1, ..., bn} and C = {c1, ..., cn} be bases of a vector space V. Then there is a unique nxn matrix P_(C<-B) such that [x]_C = P_(C<-B) [x]_B. The columns of P_(C<-B) are the C-coordinate vectors of the vectors in the basis B. That is, P_(C<-B) = [[b1]_C [b2]_C ... [bn]_C]
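A small sketch of this theorem with made-up bases of R^2: build P_(C<-B) column by column from [b1]_C and [b2]_C, then check [x]_C = P_(C<-B)[x]_B for one vector.

import numpy as np

b1, b2 = np.array([-9.0, 1.0]), np.array([-5.0, -1.0])    # basis B (illustrative)
c1, c2 = np.array([1.0, -4.0]), np.array([3.0, -5.0])     # basis C (illustrative)
C = np.column_stack([c1, c2])
P = np.column_stack([np.linalg.solve(C, b1),               # [b1]_C
                     np.linalg.solve(C, b2)])              # [b2]_C
x_B = np.array([2.0, 3.0])                                 # [x]_B for x = 2b1 + 3b2
x = 2*b1 + 3*b2
print(np.allclose(np.linalg.solve(C, x), P @ x_B))         # True: [x]_C = P [x]_B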
Unique Representation Theorem (Theorem 7)
Let B = {b1, ..., bn} be a basis for a vector space V. For each x in V, there exists a unique set of scalars c1, ..., cn such that x = c1b1 + ... + cnbn
Basis
Let H be a subspace of a vector space V. Indexed set of vectors B = {b1, ..., bp} in V is a basis for H if: 1) B is a linearly independent set 2) Subspace spanned by B coincides with H; that is, H = Span{b1, ..., bp}
Spanning Set Theorem (Theorem 5)
Let S = {v1, ..., vp} be a set in V, and let H = Span{v1, ..., vp} 1) If one of the vectors in S, say vk, is a linear combination of the remaining vectors in S, then the set formed from S by removing vk still spans H 2) If H ≠ {0}, some subset of S is a basis for H
Basis Theorem (Theorem 12)
Let V be a p-dimensional vector space, p >= 1. Any linearly independent set of exactly p elements in V is automatically a basis for V. Any set of exactly p elements that spans V is automatically a basis for V
Subspace of a Finite-Dimensional Space (Theorem 11)
Let H be a subspace of a finite-dimensional vector space V. Any linearly independent set in H can be expanded to a basis for H. H is finite dimensional and dim H <= dim V
Linear Transformation
Linear transformation T from a vector space V into a vector space W is a rule that assigns to each vector x in V a unique vector T(x) in W, such that: 1) T(u+v) = T(u)+T(v) for all u, v in V 2) T(cu) = cT(u) for all u in V and all scalars c
Vector Space
Nonempty set V of objects (vectors) on which are defined 2 operations (addition and multiplication by scalars) subject to the 10 axioms (rules) listed: 1) Sum of u and v, denoted by u+v, is in V 2) u+v = v+u 3) (u+v)+w = u+(v+w) 4) There is a zero vector 0 in V such that u+0 = u 5) For each u in V, there is a vector -u in V such that u+(-u) = 0 6) Scalar multiple of u by c, denoted by cu, is in V 7) c(u+v) = cu+cv 8) (c+d)u = cu+du 9) c(du) = (cd)u 10) 1u = u
Null Space of a Matrix (Theorem 2)
Null space of an mxn matrix A is a subspace of R^n. Equivalently, the set of all solutions to a system Ax = 0 of m homogeneous linear equations in n unknowns is a subspace of R^n
Dimension of Col A
Number of pivot columns in A
Rank
Of A, is the dimension of the column space of A
Row Space
Of A, is the set of all linear combinations of the row vectors of A. If A is an mxn matrix, each row has n entries, so Row A is a subspace of R^n
Kernel (Null Space)
Of T, is the set of all u in V such that T(u) = 0 (zero vector in W)
Range
Of T, is the set of all vectors in W of the form T(x) for some x in V.
Subspace
Of a vector space V, is a subset H of V that has 3 properties: 1) Zero vector of V is in H 2) H is closed under vector addition. For each u and v in H, sum u+v is in H 3) H is closed under multiplication by scalars. For each u in H and each scalar c, vector cu is in H
Null Space
Of an mxn matrix A, is the set of all solutions of Ax = 0
(P_(C<-B))^-1 =
P_(B<-C)
Basis for Col A (Theorem 6)
Pivot columns of a matrix A
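SymPy's rref() reports the pivot column indices, so a basis for Col A can be read straight from A itself (the pivot columns of A, not of its echelon form). A sketch with an illustrative matrix:

from sympy import Matrix

A = Matrix([[ 1,  3, 3, 2],
            [ 2,  6, 9, 7],
            [-1, -3, 3, 4]])
_, pivots = A.rref()                  # indices of the pivot columns
print(pivots)                         # (0, 2)
print([A.col(j) for j in pivots])     # these columns of A form a basis for Col A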
Markov Chain
Sequence of probability vectors x0, x1, x2, ..., together with a stochastic matrix P, such that x1 = Px0, x2 = Px1, x3 = Px2, ... Described by the 1st-order difference equation x_(k+1) = Px_k for k = 0, 1, 2, ...
Zero Subspace
Set consisting of only the zero vector {0} in a vector space V
Linearly Dependent
Set {v1, ..., vp} in V is linearly dependent if c1v1 + c2v2 + ... + cpvp = 0 has a nontrivial solution, that is, if there are some weights c1, ..., cp (not all zero) such that the vector equation holds
Stochastic Matrix
Square matrix whose columns are probability vectors
B-Coordinates of x
Suppose B = {b1, ..., bn} is a basis for V and x is in V. Coordinates of x relative to basis B are the weights c1, ..., cn such that x = c1b1 + ... + cnbn
State Vector
A vector xk in a Markov chain; it describes the state of a system or of a sequence of experiments at step k, and the entries of xk list, respectively, the probabilities of each possible state
Finite Dimensional
V is finite dimensional if V is spanned by a finite set
Infinite Dimensional
V is infinite dimensional if V is not spanned by a finite set
Probability Vector
Vector with nonnegative entries that add up to 1
[c1 c2 | b1 b2] ~
[I | P_(C<-B)]
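A sketch of this row-reduction recipe with SymPy (same illustrative bases as in the change-of-coordinates example above): reduce [c1 c2 | b1 b2] and read P_(C<-B) off the right half.

from sympy import Matrix

c1, c2 = Matrix([1, -4]), Matrix([3, -5])
b1, b2 = Matrix([-9, 1]), Matrix([-5, -1])
M = Matrix.hstack(c1, c2, b1, b2)      # the augmented matrix [c1 c2 | b1 b2]
R, _ = M.rref()                        # R = [ I | P_(C<-B) ]
print(R[:, 2:])                        # Matrix([[6, 4], [-5, -3]]) = P_(C<-B)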