Exam2 T/F
If A and B are 5x5 matrices and dim(Nul(B)) =4, then Col(BA) is either a point or a line in R^5
Always
If A is invertible and x is an eigenvector for A, then x is also an eigenvector for A^-1
Always
If A^2 = A, then 2 is not an eigenvalue of A
Always
If {a1...ap} is a linearly dependent set of vectors, then some subset of these vectors forms a basis for Span{a1....ap}
Always
Suppose H is a four-dimensional subspace of R7. Then any set of four linearly independent vectors in H will span H.
Always
If A and B are nxn matrices, then det(AB) = det(BA)
Always. det(AB) = det(A)det(B) = det(B)det(A) = det(BA)
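A quick numerical spot-check of this identity (NumPy, with two arbitrarily chosen 3x3 matrices):

```python
import numpy as np

# Two arbitrary 3x3 matrices; any square pair of the same size works
A = np.array([[1., 2., 0.],
              [0., 1., 3.],
              [4., 0., 1.]])
B = np.array([[2., 1., 1.],
              [0., 3., 0.],
              [1., 0., 2.]])

# det(AB) = det(A)det(B) = det(B)det(A) = det(BA)
det_AB = np.linalg.det(A @ B)
det_BA = np.linalg.det(B @ A)
```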
A plane in R3 is a two-dimensional space
False
Any system of n linear equations in n variables can be solved by Cramer's Rule
False
Each eigenvalue of A is also an eigenvalue of A^2
False
If A is mxn and rankA = m, then the linear transformation x -> Ax is one-to-one.
False
If B = {b1,...,bn} and C = {c1,...,cn} are bases for a vector space V, then the jth column of the change of coordinates matrix P is the coordinate vector [ci]B
False
If A and B are nxn matrices, with detA = 2 and detB=3, then det(A+B) = 5
False
If A is a 2x2 real matrix with complex eigenvalues L = 1+-i then A^4 = I2
False
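To see why, take one concrete matrix with eigenvalues 1 +- i (an assumed example, not the only one): since (1 +- i)^4 = -4, raising it to the fourth power gives -4*I2, not I2.

```python
import numpy as np

# A has characteristic polynomial (1-L)^2 + 1, so eigenvalues 1 +- i
A = np.array([[1., -1.],
              [1.,  1.]])
eigvals = np.linalg.eigvals(A)

# A^4 equals -4*I2, because (1 +- i)^4 = ((1 +- i)^2)^2 = (+-2i)^2 = -4
A4 = np.linalg.matrix_power(A, 4)
```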
If A is a 3x3 matrix, then det5A = 5detA
False
If A is an nxn matrix and detA = 2, then detA^3 = 6
False
If A is invertible, then detA^-1 = detA
False
If B is produced by interchanging two rows of A, then detB = detA
False
If {v1, ..., vp-1} is linearly independent, then so is S = {v1, ..., vp}
False
If S is linearly independent, then S is a basis for V
False
If an mxn matrix A is row equivalent to an echelon matrix U and if U has k nonzero rows, then the dimension of the solution space of Ax=0 is m-k
False
If u and v are in R2 and det[u v] = 10, then the area of the triangle in the plane with vertices at 0, u and v is 10.
False
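The determinant gives the area of the parallelogram spanned by u and v; the triangle is half that. A sketch with hypothetical u, v chosen so that det[u v] = 10:

```python
import numpy as np

# u, v chosen (arbitrarily) so that det[u v] = 10
u = np.array([5., 0.])
v = np.array([0., 2.])
det_uv = np.linalg.det(np.column_stack([u, v]))

# The triangle with vertices 0, u, v has half the parallelogram area
triangle_area = abs(det_uv) / 2
```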
Row operations on a matrix can change the null space
False
Similar matrices always have exactly the same eigenvectors
False
The nonpivot columns of a matrix are always linearly dependent.
False
The nonzero rows of a matrix A form a basis for RowA
False
The rank of a matrix equals the number of nonzero rows
False
The sum of two eigenvectors of a matrix A is also an eigenvector of A
False
Two eigenvectors corresponding to the same eigenvalue are always linearly dependent
False
det(-A) = -det(A)
False
detA^T = -detA
False
If a 5x5 matrix A has fewer than 5 distinct eigenvalues, then A is not diagonalizable
False, Let A be the 5x5 identity matrix
If A is diagonalizable, then the columns of A are linearly independent.
False, if A is a diagonal matrix with 0 on the diagonal, then the columns of A are not linearly independent
The eigenvalues of an upper triangular matrix A are exactly the nonzero entries on the diagonal of A.
False. All the diagonal entries of an upper triangular matrix are the eigenvalues of the matrix (Theorem 1 in Section 5.1). A diagonal entry may be zero.
If L is an eigenvalue of A and M is an eigenvalue of B then the product LM must be an eigenvalue of AB.
False. For example, let A = diag(1,0) and B = diag(0,1). Then L = 1 is an eigenvalue of A and M = 1 is an eigenvalue of B, but AB = 0, whose only eigenvalue is 0, so LM = 1 is not an eigenvalue of AB.
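A diagonal counterexample (an assumed concrete choice) is easy to verify numerically: 1 is an eigenvalue of each factor, yet the product AB is the zero matrix, whose only eigenvalue is 0.

```python
import numpy as np

# 1 is an eigenvalue of A and of B, but AB = 0 has only the eigenvalue 0
A = np.diag([1., 0.])
B = np.diag([0., 1.])
AB = A @ B
eigs_AB = np.linalg.eigvals(AB)
```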
An nxn matrix with n linearly independent eigenvectors is invertible
False. Having n linearly independent eigenvectors makes an nxn matrix diagonalizable (by the Diagonalization Theorem 5 in Section 5.3), but not necessarily invertible. One of the eigenvalues of the matrix could be zero.
If A is row equivalent to the identity matrix I, then A is diagonalizable
False. If A is row equivalent to the identity matrix, then A is invertible. The matrix in Example 4 of Section 5.3 shows that an invertible matrix need not be diagonalizable.
A (square) matrix A is invertible iff there is a coordinate system in which the transformation x --> Ax is represented by a diagonal matrix
False. Let A be a singular matrix that is diagonalizable. (For instance, let A be a diagonal matrix with 0 on the diagonal.) Then, by Theorem 8 in Section 5.4, the transformation x -> Ax is represented by a diagonal matrix relative to a coordinate system determined by eigenvectors of A, even though A is not invertible.
Every 2x2 matrix has at least one real eigenvalue
False. A 2x2 matrix with characteristic equation L^2 + 1 = 0 (e.g., rotation through pi/2) has eigenvalues L = +-i, neither of which is real
If X is the set of vectors in R2 with only integer entries, X = {[m,n] : m, n integers}, then X is a subspace of R2
False. X is not closed under scalar multiplication
Eigenvalues must be nonzero scalars
False. Zero is an eigenvalue of each singular square matrix
If A and B are nonzero matrices but AB=0, then 0 must be an eigenvalue of both A and B
True (for nxn matrices). If A were invertible, then B = A^-1(AB) = 0, contradicting B nonzero, so A is singular and 0 is an eigenvalue of A; the same argument shows B is singular as well.
Let A (6x4) and B (4x6) be matrices; show that AB (6x6) cannot be invertible.
As maps, R6 --B--> R4 --A--> R6. Since rank(AB) <= rank(B) <= 4 < 6, the 6x6 matrix AB is singular. Intuitively, B compresses R6 into the lower-dimensional R4, losing information, so the composition cannot be one-to-one.
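This can be illustrated numerically: for matrices of these shapes (random entries assumed here), rank(AB) <= rank(B) <= 4, so the 6x6 product is singular.

```python
import numpy as np

# Random 6x4 and 4x6 matrices; AB is 6x6 but has rank at most 4
rng = np.random.default_rng(0)
A = rng.standard_normal((6, 4))
B = rng.standard_normal((4, 6))
AB = A @ B
rank_AB = np.linalg.matrix_rank(AB)  # <= 4 < 6, so AB is not invertible
```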
If Ax = Lx, then x is an eigenvector of A
Sometimes
If A and B are both nxn and det(A) = det(B) then A and B are similar.
Sometimes
If A and B are both nxn, then det(A+B) = det(A) + det(B)
Sometimes
If A and B are similar matrices, and x is an eigenvector for A, then x is also an eigenvector for B
Sometimes
A change of coordinates matrix is always invertible
True
A nonzero vector cannot correspond to two different eigenvalues of A
True
Each eigenvector of A is also an eigenvector of A^2
True
Each eigenvector of an invertible matrix A is also an eigenvector of A^-1
True
For all square matrices A, det(A) = det(A^T)
True
If A is an nxn matrix for which the equation Ax=b has at least one solution for every b in Rn, then the equation A^3x=0 has only the trivial solution
True
If A is mxn and the linear transformation x -> Ax is onto, then rankA = m
True
If A and B are invertible nxn matrices, then AB is similar to BA
True
If B is produced by multiplying row 3 of A by 5, then detB = 5detA
True
If L is an eigenvalue, then the dimension of the corresponding eigenspace E_L must be less than or equal to the algebraic multiplicity of L.
True
If dimV = p and S = {v1, ..., vp} spans V, then S cannot be linearly dependent
True
If two rows of a 3x3 matrix A are the same, then detA = 0
True
If A is a 2x2 matrix with a zero determinant, then one column of A is a multiple of the other
True
If A is invertible and 1 is an eigenvalue for A, then 1 is also an eigenvalue of A^-1
True
If A is invertible, then (detA)(detA^-1) = 1
True
If A is similar to a diagonalizable matrix B, then A is also diagonalizable.
True
If A^3 = 0, then detA = 0
True
If B is formed by adding to one row of A a linear combination of the other rows, then detB = detA
True
If B is obtained from a matrix A by several elementary row operations, then rank B = rankA
True
If H is a subspace of R3, then there is a 3x3 matrix A such that H = ColA
True
If {v1, ..., vp-1} spans V, then S = {v1, ..., vp} spans V
True
If SpanS=V, then some subset of S is a basis for V
True
If each vector ej in the standard basis for Rn is an eigenvector of A, then A is a diagonal matrix
True
If matrices A and B have the same reduced echelon form, then RowA = RowB
True
Row operations on a matrix A can change the linear dependence relations among the rows of A
True
Similar matrices always have exactly the same eigenvalues
True
The identity matrix is not similar to any other matrix except itself
True
The set of all linear combinations of v1,...vp is a vector space
True
detA^TA >= 0
True
If A is a 5x5 matrix whose columns span R5, then the columns of A are also linearly independent
True - Invertible Matrix Theorem
If A is a 7x7 matrix and the columns of A^10 form a basis for R7, then the columns of A itself also form a basis for R^7
True - Invertible Matrix Theorem
Eigenvectors must be nonzero vectors
True, by definition, an eigenvector must be nonzero
Every symmetric (that is, A = A^T) 2x2 matrix always has two real eigenvalues, counting multiplicities.
True. For A = [(a,b),(b,d)], the discriminant of the characteristic polynomial is (a+d)^2 - 4(ad - b^2) = (a-d)^2 + 4b^2 >= 0, so both roots are real.
If A contains a row or column of zeros, then 0 is an eigenvalue of A
True. If A contains a row or column of zeros, then A is not row equivalent to the identity matrix and thus is not invertible. By the Invertible Matrix Theorem (as stated in Section 5.2), 0 is an eigenvalue of A.
If A is an nxn diagonalizable matrix, then each vector in Rn can be written as a linear combination of eigenvectors of A
True. If A is diagonalizable, then A has n linearly independent eigenvectors in R^n. By the Basis Theorem, the set of these eigenvectors spans R^n. This means that each vector in R^n can be written as a linear combination of the eigenvectors of A.
There exists a 2x2 matrix that has no eigenvectors in R^2
True. Let A be the matrix that rotates vectors through pi/2 radians about the origin, then Ax is not a multiple of x when x is nonzero
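The rotation example can be checked directly: the eigenvalues of the pi/2 rotation matrix are +-i, so no nonzero real vector x satisfies Ax = Lx with L real.

```python
import numpy as np

# Rotation through pi/2 radians about the origin
A = np.array([[0., -1.],
              [1.,  0.]])
eigvals = np.linalg.eigvals(A)  # purely imaginary: +-i
```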
The matrices A and A^T have the same eigenvalues, counting multiplicities
True. Matrices A and A^T have the same characteristic polynomial
If A^n = 0 and L is an eigenvalue of A, then L = 0
True. If Ax = Lx with x nonzero, then 0 = A^n x = L^n x, so L^n = 0 and hence L = 0.
If A is a 3x3 matrix with eigenvalues 1,2,3 then det(A) = 6.
True. det(A) equals the product of the eigenvalues, counted with multiplicity, so det(A) = 1*2*3 = 6. (Equivalently, A is diagonalizable since its three eigenvalues are distinct, and det(A) = det(PDP^-1) = det(D) = 6.)
If A is the following matrix, then the columns of A^25 are linearly independent. A = [(1,2,3,4),(0,1,0,1),(2,3,1,5),(1,1,1,1)]
True. Since A^25 is a 4x4 matrix, the IMT says that its columns are linearly independent if and only if it is invertible. Since det(A^25) = (det(A))^25, A^25 is invertible if and only if A is invertible. Row reducing A gives a pivot in every row, so A is invertible.
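A quick check of the invertibility claim for this specific A (NumPy used as a calculator):

```python
import numpy as np

# The matrix from the question
A = np.array([[1., 2., 3., 4.],
              [0., 1., 0., 1.],
              [2., 3., 1., 5.],
              [1., 1., 1., 1.]])
rank_A = np.linalg.matrix_rank(A)  # 4: a pivot in every row
det_A = np.linalg.det(A)           # nonzero (it works out to 6)
```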
There exists a 2x3 matrix A and a 3x2 matrix B such that the product AB is invertible.
True. For example, A = [(1,0,0),(0,1,0)] and B = [(1,0),(0,1),(0,0)] give AB = I2, which is invertible.
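One concrete pair (a standard choice: A projects R3 onto the first two coordinates, B embeds R2 into R3) multiplies out to the identity:

```python
import numpy as np

A = np.array([[1., 0., 0.],
              [0., 1., 0.]])   # 2x3
B = np.array([[1., 0.],
              [0., 1.],
              [0., 0.]])       # 3x2
AB = A @ B                     # = I2, which is invertible
```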
Every 3x3 matrix with real entries will have at least one real eigenvalue
True. The characteristic polynomial of a 3x3 real matrix has odd degree 3, so it has at least one real root. (The analogous statement for 2x2 matrices is false.)
For all nxn matrices A, det(AA^T) >= 0
True. det(AA^T) = det(A)*det(A^T) = det(A)*det(A) = det(A)^2 >=0
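A spot-check of det(AA^T) = det(A)^2 >= 0 on an arbitrary matrix (random entries assumed):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))
d = np.linalg.det(A @ A.T)   # equals det(A)^2, hence nonnegative
```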
If A is similar to LI for some scalar L, then A = LI
True. If A is similar to LIn, then there is some invertible matrix P such that A = P(LIn)P^-1. Since scalars can move in and out of a matrix product, A = L(P In P^-1) = LIn.
If A and B are 5x5 matrices, then rank(AB) = rank(A)*rank(B)
Sometimes
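Both cases can be exhibited (5x5 examples chosen for illustration): with A = B = I the formula fails (rank(AB) = 5 but rank(A)*rank(B) = 25), while with A = 0 it holds (0 = 0*5).

```python
import numpy as np

I5 = np.eye(5)
rank_II = np.linalg.matrix_rank(I5 @ I5)  # 5, but rank(I5)*rank(I5) = 25
Z = np.zeros((5, 5))
rank_ZI = np.linalg.matrix_rank(Z @ I5)   # 0 = rank(Z)*rank(I5)
```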