M340L
Dim(Nul(A))
# of free variables in Ax=0
# of vectors in a minimal spanning set (basis) for Col(A) = ?
# of pivot columns in A
Dim(Col(A))
# of pivot columns in A
Dimension
# of vectors in a basis
(AB)^-1 = ?
(B^-1)(A^-1)
(A^T)^-1 = ?
(A^-1)^T
det(AB) = ?
(Det(A))(Det(B))
Does (A-λI)x=0 have a non-trivial solution?
-Iff det(A-λI) = 0
Given matrix A and λ, find an eigenvector for λ
-Row reduce [A-λI | 0] -any nonzero vector in the general solution is an eigenvector for that λ
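A quick numeric check of this recipe (hypothetical 2x2 matrix, not one from the cards):

```python
import numpy as np

# Hypothetical example: lam = 5 is an eigenvalue of A, since
# det(A - 5I) = (-1)(-2) - (1)(2) = 0.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
lam = 5.0

# [A - 5I | 0] = [[-1, 1, 0], [2, -2, 0]] row-reduces to [[1, -1, 0], [0, 0, 0]],
# so x1 = x2 and the general solution is x2 * (1, 1); take v = (1, 1).
v = np.array([1.0, 1.0])
```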
Telling if a vector v is in Nul(A)
-compute Av -if equal to zero vector, yes
Given matrix A & vector x, is x an eigenvector of A?
-multiply Ax -if Ax is a multiple of x, yes with eigenvalue of the multiple
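This check is easy to script; `eigen_check` is a hypothetical helper name, not from the cards:

```python
import numpy as np

def eigen_check(A, x):
    """Return the eigenvalue if Ax is a multiple of nonzero x, else None."""
    if not np.any(x):
        return None                   # the zero vector is never an eigenvector
    Ax = A @ x
    i = int(np.argmax(np.abs(x)))     # read the multiple off a nonzero entry
    lam = Ax[i] / x[i]
    return lam if np.allclose(Ax, lam * x) else None

A = np.array([[2.0, 0.0],
              [0.0, 3.0]])
```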
How to create a spanning set for the Null space of A
-row reduce the augmented matrix [A 0] -write the resulting equations and solve for the basic variables -write the parametric solution in terms of the free variables (one spanning vector per free variable)
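A numeric stand-in for this procedure, assuming NumPy: the SVD replaces hand row-reduction but yields the same count of spanning vectors, one per free variable.

```python
import numpy as np

def null_basis(A, tol=1e-10):
    """Spanning set for Nul(A) via the SVD (numeric stand-in for the
    row-reduction procedure; one basis vector per free variable)."""
    _, s, Vt = np.linalg.svd(A)
    rank = int(np.sum(s > tol))
    return Vt[rank:].T                # columns span Nul(A)

# Hypothetical rank-1 matrix: 2 free variables, so 2 spanning vectors
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])
N = null_basis(A)
```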
How to find the inverse of a nxn matrix
1.)Create augmented matrix [A I_n] 2.)Perform row operations to turn A into the identity matrix 3.)What is left is A^-1; if A doesn't reduce to I_n, A is not invertible
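The three steps, sketched in NumPy (hypothetical helper; partial pivoting added for numeric safety):

```python
import numpy as np

def inverse_gauss_jordan(A):
    """Row-reduce [A | I]; what remains on the right is A^-1."""
    n = A.shape[0]
    M = np.hstack([A.astype(float), np.eye(n)])
    for col in range(n):
        pivot = col + int(np.argmax(np.abs(M[col:, col])))
        if np.isclose(M[pivot, col], 0.0):
            raise ValueError("A is not invertible")
        M[[col, pivot]] = M[[pivot, col]]     # swap the pivot row up
        M[col] /= M[col, col]                 # scale the pivot to 1
        for r in range(n):
            if r != col:
                M[r] -= M[r, col] * M[col]    # clear the rest of the column
    return M[:, n:]

A = np.array([[2.0, 1.0],
              [1.0, 1.0]])
```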
Requirements of a subspace
1.Contains zero vector 2.Closed under addition 3.Closed under scalar multiplication
Given matrix A, find the eigenvalues
1.Work out determinant of (A-λI) 2.Set it equal to 0 and solve for all λ
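For a 2x2 matrix the two steps reduce to a quadratic, since det(A - λI) = λ^2 - tr(A)λ + det(A). A hypothetical check:

```python
import numpy as np

# Hypothetical 2x2 example: det(A - lam*I) = lam^2 - tr(A)*lam + det(A)
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
char_poly = [1.0, -np.trace(A), np.linalg.det(A)]   # lam^2 - 7*lam + 10
eigs = np.sort(np.roots(char_poly))                 # set it to 0 and solve
```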
(A^-1)^-1 = ?
A
det(A) =/= 0
A is invertible
eigenvalues of a triangular matrix
Along the main diagonal
Rank(A)
Dim(Col(A))
T or F: A row replacement operation on A does not change the eigenvalues
FALSE
T or F: For a square matrix A, vectors in Col A are orthogonal to vectors in Nul A.
FALSE Counterexample: A = [1 1; 0 0]. Col A contains (1, 0) and Nul A contains (1, -1), which are not orthogonal.
T or F: If λ + 5 is a factor of the characteristic polynomial of A, then 5 is an eigenvalue of A.
FALSE -5 is an eigenvalue. (The zeros of the characteristic polynomial are the eigenvalues.)
T or F: A is diagonalizable if A = PDP^−1 for some matrix D and some invertible matrix P
FALSE D must be a diagonal matrix
T or F: The sum of the dimensions of the row space and the null space of A equals the number of rows in A
FALSE Equals the number of columns, by the Rank Theorem. Also dimension of row space = number of pivot columns and dimension of null space = number of non-pivot columns (free variables), so these add to the total number of columns.
T or F: Row operations preserve the linear dependence relations among the rows of A
FALSE For example, Row interchanges mess things up
T or F: A is diagonalizable if and only if A has n eigenvalues, counting multiplicity
FALSE It always has n eigenvalues, counting multiplicity
T or F: If A is diagonalizable, then A has n distinct eigenvalues
FALSE It could have repeated eigenvalues as long as the dimension of each eigenspace equals the multiplicity of that eigenvalue. The converse is true, however.
T or F: If A is diagonalizable, then A is invertible
FALSE It's invertible iff it doesn't have zero as an eigenvalue, but this doesn't affect diagonalizability.
T or F: If B is any echelon form of A, then the pivot columns of B form a basis for the column space of A.
FALSE It's the corresponding columns in A.
T or F: To find the eigenvalues of A, reduce A to echelon form
FALSE Row reducing changes the eigenvectors and eigenvalues
T or F: If v1 and v2 are linearly independent eigenvectors, then they correspond to different eigenvalues.
FALSE The converse is true, however.
T or F: A is diagonalizable if A has n eigenvectors
FALSE The n eigenvectors must be linearly independent
T or F: If Ax = λx for some scalar λ, then x is an eigenvector of A.
FALSE The vector must be nonzero
T or F: The eigenvalues of a matrix are on its main diagonal
FALSE This is only true for triangular matrices.
T or F: If Ax=λx for some vector x, then λ is an eigenvalue of A
FALSE This is true as long as the vector is not the zero vector
T or F: det A^T=(-1)det A
FALSE det(A^T) = det(A)
T or F: The determinant of A is the product of the diagonal entries in A.
FALSE in general. True if A is triangular.
T or F: An elementary row operation on A does not change the determinant
FALSE interchanging rows or multiplying a row by a constant changes the determinant
T or F: If A is 3 × 3, with columns a1, a2, a3, then det A equals the volume of the parallelepiped determined by a1, a2, a3.
FALSE it's the absolute value of the determinant. We can see this by viewing the columns as the images of the unit cube's edges under a linear transformation: applying the transformation multiplies volumes by |det A|.
T or F: For any scalar c, ||cv|| = c||v||
FALSE need absolute value of c.
T or F: If A is invertible, then A is diagonalizable.
FALSE these are not directly related
T or F: If B is an echelon form of A, and if B has three nonzero rows, then the first three rows of A form a basis of Row A.
FALSE, The nonzero rows of B form a basis. The first three rows of A may be linearly dependent
T or F: The columns of the change-of-coordinates matrix P_(C<-B) are the B-coordinate vectors of the vectors in C
FALSE. They are the C-coordinate vectors of the vectors in basis B
Rank Theorem
For an mxn matrix A, rank(A) + dim(nul(A)) = n and rank(A) <= m
AA^-1 = ?
Identity matrix
linear independence
A set is linearly independent if the homogeneous equation Ax = 0 has only the trivial solution (no free variables)
Conditions for a linear transformation being invertible
If for transformation S, there exists a transformation T such that S(T(x)) = x and T(S(x)) = x, then T is the inverse of S
How does adding rows or multiples of rows affect the determinant?
It does not
How does multiplying a row by a scalar affect determinant?
Multiplies the determinant by the scalar
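Both facts above, checked numerically on a hypothetical matrix:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
d = np.linalg.det(A)

B = A.copy()
B[1] += 5 * B[0]          # row replacement: R2 <- R2 + 5*R1, det unchanged
C = A.copy()
C[0] *= 3.0               # scaling a row by 3 multiplies det by 3
```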
Column space
Set of all linear combinations of the columns of A
Row space
Set of all linear combinations of the row vectors
Eigenspace
Set of zero vector and all eigenvectors for one λ (set of all solutions x for (A-λI)x=0)
Homogeneous
System of linear equations that can be written in the form Ax = 0 (augmented matrix [A 0])
T or F: (det A)*(det B)=det(AB)
TRUE
T or F: A matrix A is not invertible if and only if 0 is an eigenvalue of A.
TRUE
T or F: For any scalar c, u · (cv) = c(u · v)
TRUE
T or F: If V = R^4 and C is the standard basis for V, then P_(C<-B) is the same as the change of coordinate matrix P_B
TRUE
T or F: If the distance from u to v equals the distance from u to −v, then u and v are orthogonal.
TRUE
T or F: u · v − v · u = 0
TRUE
T or F: v · v = ||v||^2
TRUE
T or F: A steady-state vector for a stochastic matrix is actually an eigenvector
TRUE A steady-state vector has the property that Ax = x. In this case λ is 1
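A hypothetical stochastic matrix illustrating the λ = 1 claim:

```python
import numpy as np

# Hypothetical stochastic matrix (columns sum to 1): the steady-state
# vector is the eigenvector for lambda = 1, rescaled to sum to 1.
P = np.array([[0.9, 0.5],
              [0.1, 0.5]])
vals, vecs = np.linalg.eig(P)
k = int(np.argmin(np.abs(vals - 1.0)))    # locate the eigenvalue 1
q = vecs[:, k] / vecs[:, k].sum()
```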
T or F: If ||u||^2 + ||v||^2 = ||u + v||^2, then u and v are orthogonal
TRUE By Pythagorean Theorem
T or F: The row space of A^T is the same as the column space of A.
TRUE Columns of A go to rows of A^T
T or F: On a computer, row operations can change the apparent rank of a matrix
TRUE Due to rounding error
T or F: If R^n has a basis of eigenvectors of A, then A is diagonalizable
TRUE In this case we can construct an invertible P from the eigenvectors and a diagonal D from the eigenvalues
T or F: Finding an eigenvector of A may be difficult, but checking whether a given vector is in fact an eigenvector is easy
TRUE Just see if Ax is a scalar multiple of x
T or F: The multiplicity of a root r of a characteristic equation of A is called the algebraic multiplicity of r as an eigenvalue of A
TRUE That's the definition
T or F: An eigenspace of A is a null space of a certain matrix
TRUE The eigenspace is the nullspace of A − λI
T or F: The dimension of null space of A is the number of columns of A that are not pivot columns
TRUE These correspond with the free variables.
T or F: A number c is an eigenvalue of A if and only if the equation (A − cI)x = 0 has a nontrivial solution
TRUE This is a rearrangement of the equation Ax = λx
T or F: If x is orthogonal to every vector in a subspace W , then x is in W ^⊥
TRUE by definition of W ^⊥
T or F: The dimensions of the row space and the column space of A are the same, even if A is not square
TRUE by the Rank Theorem. Also since dimension of row space = number of nonzero rows in echelon form = number pivot columns = dimension of column space
T or F: If vectors v1, . . . , vp span a subspace W and if x is orthogonal to each vj for j = 1, . . . , p then x is in W ^⊥
TRUE since any vector in W can be written as a linear combination of the spanning vectors, and the dot product splits up nicely over sums and scalar multiples.
T or F: If AP = PD, with D diagonal then the nonzero columns of P must be the eigenvectors of A
TRUE. Each column of AP is A times the corresponding column of P, and it equals the corresponding diagonal entry of D times that column of P. This satisfies the eigenvector definition as long as the column is nonzero.
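A hypothetical check that the columns returned by an eigensolver satisfy AP = PD:

```python
import numpy as np

# Hypothetical diagonalizable matrix: eig returns the eigenvalues and a
# matrix P whose columns are eigenvectors, so AP = PD and A = P D P^-1.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
vals, P = np.linalg.eig(A)
D = np.diag(vals)
```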
T or F: For an m × n matrix A, vectors in the null space of A are orthogonal to vectors in the row space of A
TRUE. (Row A)^⊥ = Nul A, and (Col A)^⊥ = Nul A^T
T or F: If A and B are row equivalent, then their row spaces are the same
TRUE. This allows us to find row space of A by finding the row space of its echelon form
T or F: the columns of P_(C<-B) are linearly independent
TRUE. the columns of P_(C<-B) are coordinate vectors of the linearly independent set B.
Condition for a non-trivial solution
The homogeneous equation has at least one free variable
A Similar to B
There exists an invertible matrix P such that A = PBP^-1
T or F: The row space of A is the same as the column space of A^T.
True
inverse of 2x2 matrix
(1/(ad-bc)) * [d -b; -c a]
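The 2x2 inverse formula as a sketch (hypothetical helper `inv2`; assumes ad - bc != 0):

```python
import numpy as np

def inv2(a, b, c, d):
    """Inverse of [[a, b], [c, d]]: swap a and d, negate b and c,
    divide by the determinant ad - bc (assumed nonzero)."""
    det = a * d - b * c
    return np.array([[d, -b],
                     [-c, a]]) / det

M = np.array([[2.0, 1.0],
              [1.0, 1.0]])
```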
determinant of 2x2 matrix
ad-bc
nonsingular matrix
invertible matrix
linearly dependent or independent? Set with more vectors than entries in each vector (matrix with more columns than rows)
linearly dependent
linearly dependent or independent? Set that contains the zero vector
linearly dependent
basis
linearly independent set of vectors that spans R^n
elementary matrix
matrix obtained by performing a single elementary row operation on the identity matrix; it is invertible
How does interchanging rows affect determinant?
negates the determinant
Eigenvector
nonzero vector x such that Ax=λx, for eigenvalue λ
Singular matrix
not invertible matrix
Kernel
null space of a linear transformation
matrix linear dependence
one of the vectors in the indexed set is a linear combination of the others
determinant of a triangular matrix
product of the main diagonal entries
Row space theorem
row equivalent matrices have the same row space
det(s *A) = ?
s^n * det(A), for A an n×n matrix (each of the n rows contributes a factor of s)
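A hypothetical 2x2 check that the scaling factor is s^n, not s:

```python
import numpy as np

# det(sA) picks up one factor of s per row, hence s^n for an n x n matrix.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
s = 3.0
```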
range
set of all possible results of a linear transformation
Null space of A
set of all solutions to Ax=0
vector magnitude
sqrt(x^2 + y^2 +... + z^2)
trivial solution
x = 0 (or the zero vector); the solution that every homogeneous linear system has
Ax=b solution of an invertible matrix
x = A^-1 b
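A hypothetical system; `np.linalg.solve` computes the same x without forming A^-1 explicitly:

```python
import numpy as np

# When A is invertible, x = A^-1 b; numerically, solve is preferred
# to computing the inverse.
A = np.array([[2.0, 1.0],
              [1.0, 1.0]])
b = np.array([3.0, 2.0])
x = np.linalg.solve(A, b)
```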