EXAM 2 TRUE/FALSE
det(A^T) = (-1)detA
False, det(A^T) = det(A) when A is nxn.
An eigenspace of A is a null space of a certain matrix
True, the eigenspace of A corresponding to an eigenvalue λ is the null space of A-λI.
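As a quick numerical sketch (the matrix below is made up for illustration), a vector in the null space of A-λI is exactly an eigenvector for λ:

```python
import numpy as np

# Made-up triangular matrix: its eigenvalues (2 and 3) sit on the diagonal.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
lam = 2.0
M = A - lam * np.eye(2)        # the matrix whose null space is the eigenspace

x = np.array([1.0, 0.0])       # x lies in Nul(A - 2I) ...
print(M @ x)                   # zero vector
print(A @ x)                   # ... so Ax = 2x, i.e. x is an eigenvector
```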
An elementary row operation on A does not change the determinant.
False, interchanging two rows or multiplying a row by a constant changes the determinant; only row replacement leaves it unchanged.
The determinant of a triangular matrix is the sum of the entries on the main diagonal.
False, it is the product of the entries on the main diagonal.
If v1 and v2 are linearly independent eigenvectors, then they correspond to distinct eigenvalues.
False, not true in general; two linearly independent eigenvectors can correspond to the same eigenvalue when that eigenvalue's eigenspace has dimension greater than 1.
A matrix A is not invertible if and only if 0 is an eigenvalue of A.
True
A number c is an eigenvalue of A if and only if the equation (A-cI)x=0 has a nontrivial solution.
True
A row replacement operation does not affect the determinant of a matrix.
True
An nxn determinant is defined by determinants of (n-1) x (n-1) submatrices.
True
Finding an eigenvector of A may be difficult, but checking whether a given vector is in fact an eigenvector is easy.
True
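The check really is easy: multiply A by the candidate vector and see whether the result is a scalar multiple of it. A minimal NumPy sketch with made-up numbers:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
x = np.array([1.0, 1.0])   # candidate eigenvector

Ax = A @ x                 # [5, 5] = 5 * x, so x is an eigenvector for 5
is_eigenvector = np.allclose(Ax, 5.0 * x)
print(is_eigenvector)
```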
For any scalar c, u•(cv)=c(u•v)
True
To find the eigenvalue of A, reduce A to echelon form.
False, row reduction changes the eigenvalues of a matrix. Eigenvalues are found as the roots of the characteristic equation det(A-λI)=0.
A subspace of Rn is any set H such that: 1. The zero vector is in H. 2. U, V, and U+V are in H. 3. C is a scalar and CU is in H.
False, the critical phrase "for each" is missing from the definition. The conditions should read: 2. For each u and v in H, the sum u+v is in H. 3. For each u in H and each scalar c, the vector cu is in H.
If λ+5 is a factor of the characteristic polynomial of A, then 5 is an eigenvalue of A.
False, the eigenvalue would be -5.
For any scalar c, ║cv║=c║v║
False, ║cv║= |c|║v║
det(A+B)=detA+detB
False, determinants are not additive in general; the multiplicative property det(AB)=det(A)det(B) is what holds.
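Both halves of this answer are easy to verify numerically. With two made-up matrices, the product rule holds while additivity fails:

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])   # det = -2
B = np.array([[0.0, 1.0], [1.0, 0.0]])   # det = -1

# det(AB) = det(A)det(B) holds:
multiplicative = np.isclose(np.linalg.det(A @ B),
                            np.linalg.det(A) * np.linalg.det(B))
# det(A+B) = det(A) + det(B) generally fails:
additive = np.isclose(np.linalg.det(A + B),
                      np.linalg.det(A) + np.linalg.det(B))
print(multiplicative, additive)
```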
If B is an echelon form of a matrix A, then the pivot columns of B form a basis for Col A.
False, the pivot columns of B indicate which columns of A form a basis for Col A; the pivot columns of B itself need not even lie in Col A.
The column space of a matrix A is the set of solutions of Ax=b.
False. The column space of A is the set Col A of all linear combinations of the columns of A. If A = [a1 ... an], then Col A = Span{a1,...,an}.
If B is a basis for a subspace H, then each vector in H can be written in only one way as a linear combination of the vectors in B.
True
If B={v1,...,vp} is a basis for a subspace H and if x=c1v1+...+cpvp, then c1,...,cp are the coordinates of x relative to the basis B.
True
If Rn has a basis of eigenvectors of A, then A is diagonalizable.
True
If a set of p vectors spans a p-dimensional subspace H of Rn, then these vectors form a basis for H.
True
If the columns of A are linearly dependent, then det A = 0.
True
If the distance from u to v equals the distance from u to -v, then u and v are orthogonal.
True
If two row interchanges are made in succession, then the new determinant equals the old determinant.
True
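Each interchange multiplies the determinant by -1, so two in succession cancel. A small check with an arbitrary made-up matrix:

```python
import numpy as np

A = np.array([[1.0, 2.0, 0.0],
              [0.0, 3.0, 1.0],
              [2.0, 0.0, 1.0]])
B = A[[1, 0, 2]]      # swap rows 0 and 1: determinant flips sign
C = B[[0, 2, 1]]      # swap rows 1 and 2: sign flips back

same_det = np.isclose(np.linalg.det(C), np.linalg.det(A))
print(same_det)
```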
If v1, ... , vp are in Rn, then Span {v1,...,vp} is the same as the column space of the matrix [v1 ... vp]
True
If vectors v1,...,vp span a subspace W and if x is orthogonal to each vj for j=1,...,p, then x is in Wperp.
True
If x is orthogonal to every vector in a subspace W, then x is in Wperp.
True
Row operations do not affect linear dependence relations among the columns of a matrix.
True
If A is diagonalizable then A has n distinct eigenvalues.
False, if A has n distinct eigenvalues, then A is diagonalizable, not the other way around.
The dimension of Nul A is the number of variables in the equation Ax=0.
False, dim(Nul A) = number of free variables in the equation Ax=0.
The eigenvalues of a matrix are on its main diagonal.
False, only true for triangular matrices.
The determinant of A is the product of the diagonal entries in A.
False, only true when A is a triangular matrix
(det A)(detB)=det AB
True
If B={v1,...,vp} is a basis for a subspace H of Rn, then the correspondence x->[x]B makes H look and act the same as Rp.
True
If H is a p-dimensional subspace of Rn, then a linearly independent set of p vectors in H is a basis for H.
True
The columns of an invertible nxn matrix form a basis for Rn.
True, (Invertible matrix theorem)
The dimensions of Col A and Nul A add up to the number of columns of A.
True, by the rank theorem
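The rank theorem is easy to see numerically: dim Col A is the rank, and dim Nul A is the number of columns minus the rank. A sketch on a made-up 3x5 matrix:

```python
import numpy as np

# Third row = first row + second row, so the rank is 2.
A = np.array([[1.0, 2.0, 0.0, 1.0, 3.0],
              [0.0, 0.0, 1.0, 4.0, 1.0],
              [1.0, 2.0, 1.0, 5.0, 4.0]])

rank = np.linalg.matrix_rank(A)       # dim Col A
nullity = A.shape[1] - rank           # dim Nul A, by the rank theorem
print(rank, nullity, rank + nullity)  # 2 3 5
```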
If AP=PD, with D diagonal, then the nonzero columns of P must be eigenvectors of A.
True, since AP=PD means Ap_j = d_j p_j for each column p_j of P (where d_j is the jth diagonal entry of D), so every nonzero column of P is an eigenvector of A. Note that P is not assumed invertible here, so the argument must go column by column rather than through A=PDP^-1.
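NumPy's eigendecomposition illustrates this relationship directly: `np.linalg.eig` returns the eigenvalues and a matrix P whose columns are (unit-norm) eigenvectors, and AP = PD holds with D the diagonal matrix of eigenvalues. The matrix below is made up for illustration:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
eigvals, P = np.linalg.eig(A)   # columns of P are eigenvectors of A
D = np.diag(eigvals)

ok = np.allclose(A @ P, P @ D)  # AP = PD column by column
print(ok)
```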
u•v - v•u = 0
True, since the dot product is commutative, u•v = v•u, so u•v - v•u = 0.
Given vectors v1,...,vp in Rn, the set of all linear combinations of these vectors is a subspace of Rn.
True, span{v1,...,vp} is a subspace of Rn.
If ║u║^2+║v║^2=║u+v║^2, then u and v are orthogonal.
True, this is the Pythagorean theorem.
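A quick numerical check of the Pythagorean theorem, with two made-up orthogonal vectors:

```python
import numpy as np

u = np.array([3.0, 0.0, 4.0])
v = np.array([0.0, 2.0, 0.0])
print(np.dot(u, v))               # 0.0, so u and v are orthogonal

lhs = np.dot(u, u) + np.dot(v, v) # ||u||^2 + ||v||^2 = 25 + 4
rhs = np.dot(u + v, u + v)        # ||u + v||^2 = 29
print(np.isclose(lhs, rhs))
```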
The cofactor expansion of det A down a column is the negative of the cofactor expansion along a row.
False, the cofactor expansion down any column equals the cofactor expansion along any row; both give det A.
The (i,j)-cofactor of a matrix A is the matrix Aij obtained by deleting from A its ith row and jth column.
False, the (i,j)-cofactor is the number Cij = (-1)^(i+j) det(Aij), where Aij is the submatrix obtained by deleting row i and column j of A.
If Ax=λx for some vector x, then λ is an eigenvalue of A.
False, the equation Ax=λx must have a nontrivial solution; that is, x must be nonzero.
A is diagonalizable if A=PDP^-1 for some matrix D and some invertible matrix P.
False, D must be a diagonal matrix, this was not stated.
A is diagonalizable if and only if A has n eigenvalues, counting multiplicities.
False, a 3x3 matrix could have 3 eigenvalues, counting multiplicities, and yet not be diagonalizable.
If A is invertible, then A is diagonalizable.
False, a matrix can be invertible and not be diagonalizable.
A row replacement operation on A does not change the eigenvalues
False, row replacement preserves the determinant (unlike the other 2 row operations) but can still change the eigenvalues; row operations do not preserve the characteristic polynomial.
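A small demonstration with a made-up triangular matrix: after one row replacement the determinant is unchanged, but the eigenvalues are no longer 2 and 3.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])   # triangular: eigenvalues 2 and 3
B = A.copy()
B[1] = B[1] + B[0]           # row replacement: R2 <- R2 + R1

# Determinant is preserved ...
print(np.isclose(np.linalg.det(B), np.linalg.det(A)))
# ... but the eigenvalues change (to 3 +/- sqrt(3), roughly 1.27 and 4.73).
print(np.linalg.eigvals(B))
```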
The set of all solutions of a system of m homogeneous equations in n unknowns is a subspace of Rm.
False, should be a subspace of Rn. (Theorem 12)
If A is diagonalizable, then A is invertible.
False, invertibility depends on 0 not being an eigenvalue of the matrix. It is possible for a matrix to be diagonalizable but not invertible (for example, when 0 is an eigenvalue).
Each line in Rn is a one-dimensional subspace of Rn.
False, only lines through the origin are subspaces of Rn; a line that does not pass through the origin fails to contain the zero vector.
A subset H of Rn is a subspace if the zero vector is in H.
False, this is a condition to be a subspace, but two more are needed.
A is diagonalizable if A has n eigenvectors
False, true only if A has n linearly independent eigenvectors.
If det A is zero, then two rows or two columns are the same, or a row or a column is zero.
False, these situations do make the determinant zero, but they are not the only ones; det A = 0 whenever the columns are linearly dependent.
If Ax=λx for some scalar λ, then x is an eigenvector of A.
False, x must be nonzero.
Let H be a subspace of Rn. If x is in H, and y is in Rn, then x+y is in H.
False, y has to be in H for x+y to be in H.
The determinant of A is the product of the pivots in any echelon form U of A, multiplied by (-1)^r where r is the number of row interchanges made during row reduction from A to U.
True
The dimension of Col A is the number of pivot columns in A.
True
The dimension of the column space of A is rank A.
True
The multiplicity of a root r of the characteristic equation of A is called the algebraic multiplicity of r as an eigenvalue of A.
True
v • v = ║v║^2
True