Linear Algebra True/False
dim Nul A
number of free variables = the number of columns without pivots
(AB)^-1 = A^-1 B^-1
False. The inverse of a product reverses the order: (AB)^-1 = B^-1 A^-1
det(A+B)=det(A)+det(B)
False. The determinant is not additive; for example, with A = B = I2, det(A+B) = 4 but det A + det B = 2
Basis for Nul A
The vectors in the parametric vector form of the general solution to Ax = 0 always form a basis for Nul A
A^7 is invertible
True. If A is invertible, then so is A^7, with (A^7)^-1 = (A^-1)^7
If x is not in a subspace W, then x−projW(x) is not zero.
True. If x is not in W, then proj_W(x) ≠ x, since proj_W(x) is in W. Thus x − proj_W(x) ≠ 0
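As a concrete check, here is a minimal numpy sketch (the basis matrix B and the vector x are made-up examples): it projects x onto W = Col(B) via the standard formula proj_W(x) = B(B^T B)^-1 B^T x and confirms the residual is nonzero when x is not in W.

```python
import numpy as np

B = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])          # W = the xy-plane in R^3
x = np.array([1.0, 2.0, 3.0])       # not in W (nonzero z-component)

P = B @ np.linalg.inv(B.T @ B) @ B.T  # projection matrix onto W
residual = x - P @ x
print(residual)                       # [0. 0. 3.] -- nonzero, as claimed
```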
If A is diagonalizable, then A is invertible.
False. For example, the zero matrix is diagonalizable (it is already diagonal) but not invertible
For every x in R3, there is a y in R3 such that T(x)=y
T is a function from R3 to R3
The solution set of a consistent inhomogeneous system Ax=b is obtained by translating the solution set of Ax=0
True
clockwise rotation by pi/2 radians
[ 0  1 ]
[ -1 0 ]
Reflection about the line y=x
[ 0 1 ]
[ 1 0 ]
Matrices with the same eigenvalues are similar matrices
False. For example, [[1, 1], [0, 1]] and the 2×2 identity both have eigenvalues 1, 1, but only the identity is similar to the identity
Row operations on a matrix do not change its eigenvalues.
False. For example, [[0, 1], [1, 0]] has eigenvalues ±1, but swapping its rows gives the identity, whose only eigenvalue is 1
There is a vector [b1, b2] so that the solution set of
[ 1 0 1 ]   [x1]   [b1]
[ 0 1 0 ] × [x2] = [b2]
            [x3]
is the z-axis.
False. Solutions satisfy x2 = b2 and x1 = b1 − x3 with x3 free, so x1 varies along the solution line; on the z-axis x1 = 0 at every point, which no choice of b1, b2 achieves
λ is an eigenvalue of a matrix A if A−λI has linearly independent columns.
False. λ is an eigenvalue of A if and only if A − λI is not invertible, i.e., its columns are linearly dependent
A 5×5 real matrix has an even number of real eigenvalues.
False. It has an odd number of real eigenvalues, counted with multiplicity: the characteristic polynomial has degree 5, and its non-real roots come in conjugate pairs
Invertible Matrix Theorem
1. A is invertible
2. T is invertible
3. A is row equivalent to In
4. A has n pivots
5. Ax = 0 has only the trivial solution
6. The columns of A are linearly independent
7. T is one-to-one
8. Ax = b is consistent for every b in Rn
9. The columns of A span Rn
10. T is onto
11. A has a left inverse
12. A has a right inverse
13. A^T is invertible
A least-squares solution of Ax = b is a vector x̂ such that ∥b − Ax∥ ≤ ∥b − Ax̂∥ for all x in Rn
False. The inequality should be the other way around: ∥b − Ax̂∥ ≤ ∥b − Ax∥ for all x; this means that Ax̂ is the best approximate solution to Ax = b
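A short numpy sketch of the corrected inequality (the matrix A and vector b are illustrative choices, not from the cards): np.linalg.lstsq returns a least-squares solution x̂, and its residual ∥b − Ax̂∥ is no larger than the residual of any other candidate x.

```python
import numpy as np

A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
b = np.array([1.0, 2.0, 0.0])

xhat, *_ = np.linalg.lstsq(A, b, rcond=None)  # least-squares solution
x_other = np.array([1.0, 1.0])                # any other candidate

# The least-squares residual is the smallest possible.
print(np.linalg.norm(b - A @ xhat) <= np.linalg.norm(b - A @ x_other))  # True
```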
Basis for Column A
The pivot columns of A always form a basis for Col A (take the pivot columns of the original matrix, not of its echelon form)
A matrix that is similar to the identity matrix is equal to the identity matrix.
True. If A = P In P^-1 for some invertible P, then A = P P^-1 = In
The Gram-Schmidt process produces from a linearly independent set {x1,...,xp} an orthogonal set {v1,...,vp} with the property that for each k, the vectors v1,...,vk span the same subspace as that spanned by x1,...,xk
True
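A bare-bones Python sketch of the process (assuming the input vectors are linearly independent): each v_k is x_k minus its projections onto the earlier v's, which is why Span{v1,...,vk} = Span{x1,...,xk} at every step.

```python
import numpy as np

def gram_schmidt(X):
    """X: sequence of linearly independent vectors. Returns an orthogonal list."""
    vs = []
    for x in np.asarray(X, dtype=float):
        # Subtract the projection of x onto each previously built v.
        v = x - sum((x @ u) / (u @ u) * u for u in vs)
        vs.append(v)
    return vs

print(gram_schmidt([[1.0, 1.0, 0.0], [1.0, 0.0, 1.0]]))
```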
The general least-squares problem is to find an x that makes Ax as close as possible to b
True
If the transpose of A is not invertible, then A is also not invertible.
Yes. By the Invertible Matrix Theorem, A^T being invertible is equivalent to A being invertible. Therefore, if A^T is not invertible, then A cannot be invertible.
If b is in the column space of A, then every solution of Ax = b is a least-squares solution
True. If the equation Ax=b can be solved exactly, then any solution is the best possible approximate solution. In other words, Ax̂ cannot be made any closer to b
If W=Span{x1,x2,x3} with {x1,x2,x3} linearly independent, and if {v1,v2,v3} is an orthogonal set of nonzero vectors in W, then {v1,v2,v3} is a basis for W
True. The dimension of W is equal to 3, so any set of 3 linearly independent vectors in W is a basis. Since {v1,v2,v3} is orthogonal, it is linearly independent.
If x̂ is the least-squares solution of Ax=b, then Ax̂ is the point in the column space of A closest to b
True. This is essentially the definition of a least squares solution
rank A
rank A = dim Col A = the number of pivot columns
Find a 3×3 matrix with exactly one (real) eigenvalue -4, such that the -4-eigenspace is a line.
[ 0  1  0 ]
[ -1 0  0 ]
[ 0  0 -4 ]
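A quick numpy check of this example: the spectrum is {i, −i, −4}, so −4 is the only real eigenvalue, and its eigenspace is the z-axis (a line).

```python
import numpy as np

A = np.array([[0.0, 1.0, 0.0],
              [-1.0, 0.0, 0.0],
              [0.0, 0.0, -4.0]])
print(np.linalg.eigvals(A))  # [0.+1.j, 0.-1.j, -4.+0.j]: one real eigenvalue
```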
Invertible
A matrix is invertible if and only if its determinant is nonzero
All of the linear transformations from R3 to R3 that are invertible
C. Reflection in the origin; D. Rotation about the y-axis by π; E. Identity transformation: T(v) = v for all v; F. Dilation by a factor of 3
A is diagonalizable if and only if A has n eigenvalues, counting multiplicity.
False. Every n×n matrix has n eigenvalues counting multiplicity (over the complex numbers); A is diagonalizable if and only if it has n linearly independent eigenvectors
If A is a 5×4 matrix, and B is a 4×3 matrix, then the entry of AB in the 3rd row / 2nd column is obtained by multiplying the 3rd column of A by the 2nd row of B
False. The (3,2) entry of AB is obtained by multiplying the 3rd row of A by the 2nd column of B
If an n×n matrix A has fewer than n distinct eigenvalues, then A is not diagonalizable.
False. For example, the identity matrix has only one distinct eigenvalue but is already diagonal; A is diagonalizable whenever each eigenvalue's geometric multiplicity equals its algebraic multiplicity
If det A is zero, then two columns of A must be the same, or all of the elements in a row or column of A are zero.
False. For example, [[1, 2], [2, 4]] has determinant 0, yet no two columns are the same and no row or column is all zeros
If the number of rows of an augmented matrix in reduced row echelon form is greater than the number of columns (to the left of the vertical bar), then the corresponding linear system has infinitely many solutions
False. The extra rows may simply be rows of zeros; the system could equally have a unique solution or be inconsistent
A + B is invertible
False. Each matrix may be invertible on its own, yet the sum need not be: if A is invertible, so is −A, but A + (−A) = 0 is not invertible
The null space of an m×n matrix is a subspace of Rm
False. The null space of an m×n matrix A is the space of all solutions to the matrix equation Ax=0. A solution of this equation must be a vector in Rn, so the null space is a subspace of Rn
If x is a nontrivial solution of Ax=0 ,then every entry of x is nonzero
False. Only one entry of x need be nonzero.
If the equation Ax=0 has the trivial solution, then the columns of A span Rn
Maybe. The matrix equation Ax = 0 always has the trivial solution, but the question gives no further information on whether it is the only solution. If there are nontrivial solutions, then the columns of A are linearly dependent, and (for an n×n matrix) linearly dependent columns cannot span Rn
If A is diagonalizable, then A^2 is also diagonalizable
True. If A = CDC^-1 with D diagonal, then A^2 = CD^2C^-1, and D^2 is diagonal
The absolute value of the determinant of A equals the volume of the parallelepiped determined by the columns of A
True
A homogeneous system is always consistent
True. The zero vector is always a solution of a homogeneous system
rank A is...
the # of pivot columns
The product of any two invertible matrices is invertible
Yes. If A and B are invertible, then AB is invertible with (AB)^-1 = B^-1A^-1
reflection about the y axis
[ -1 0 ]
[ 0  1 ]
counter-clockwise rotation by pi/2 radians
[ 0 -1 ]
[ 1  0 ]
The projection onto the x-axis given by T(x,y) = (x, 0)
[ 1 0 ]
[ 0 0 ]
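A small numpy sketch applying the three matrices above to the test vector e1 = (1, 0) (the test vector is just an illustrative choice):

```python
import numpy as np

e1 = np.array([1.0, 0.0])
reflect_y = np.array([[-1.0, 0.0], [0.0, 1.0]])   # reflection about the y-axis
rotate_ccw = np.array([[0.0, -1.0], [1.0, 0.0]])  # ccw rotation by pi/2
project_x = np.array([[1.0, 0.0], [0.0, 0.0]])    # projection onto the x-axis

print(reflect_y @ e1)   # [-1.  0.]
print(rotate_ccw @ e1)  # [0. 1.]
print(project_x @ e1)   # [1. 0.]
```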
Give an example of a matrix A such that (1) Ax=b has a solution for infinitely many b∈R3, but (2) Ax=b does not have a solution for all b∈R3
[ 1 0 0 0 ]
[ 0 1 0 0 ]
[ 0 0 0 0 ]
Find a 3 by 3 matrix A which is not invertible, but where no two columns are scalar multiples of each other, and no two rows are scalar multiples of each other
[ 1 0 1 ]
[ 0 1 1 ]
[ 1 1 2 ]
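Verifying this example with numpy: the determinant is 0 and the rank is 2; here the third column is the sum of the first two, which makes A singular without any two columns or rows being scalar multiples.

```python
import numpy as np

A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 2.0]])
print(np.linalg.det(A))          # 0.0 (up to rounding): not invertible
print(np.linalg.matrix_rank(A))  # 2
```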
Span{a1, a2} contains only the line through a1 and the origin, and the line through a2 and the origin.
False. Span{a1, a2} contains all linear combinations of a1 and a2 (for example, a1 + a2), not just the two lines
If a linear system has four equations and seven variables, then it must have infinitely many solutions
False. It could be inconsistent and therefore have no solutions
If a set S of vectors contains fewer vectors than there are entries in the vectors, then the set must be linearly independent.
False. For example, S = {(1, 0, 0), (2, 0, 0)} has fewer vectors than entries, but is linearly dependent since the second vector is twice the first
(A + B)^2 = A^2 + B^2 + 2AB
False. The order of multiplication must be respected: (A + B)^2 = A^2 + AB + BA + B^2, and AB + BA ≠ 2AB unless A and B commute
There are exactly three vectors in span {a1,a2, a3}
False. There are infinitely many vectors in the span: it contains every linear combination of a1, a2, a3
If B is an echelon form of a matrix A, then the pivot columns of B form a basis for the column space of A
False. A basis of the column space of A consists of the columns of A that correspond to the pivot columns in B
For any matrices A and B, if the product AB is defined, then BA is also defined
False. For instance, A could be a 5×4 matrix and B a 4×3 matrix: then AB is defined (5×3), but BA is not
There exists a real 2×2 matrix with the eigenvalues i and 2i
False. Complex eigenvalues come in conjugate pairs, so if a 2×2 matrix has i as an eigenvalue, its other eigenvalue must be −i
If S is a set of linearly dependent vectors, then every vector in S can be written as a linear combination of the other vectors in S
False: in order for S to be linearly dependent, only one vector in S need be expressible as a linear combination of the others
The columns of matrix A are linearly independent if the equation Ax=0 has the trivial solution
False: the equation Ax=0 always admits the trivial solution, whether or not the columns of A are linearly independent.
The eigenvalues of the matrix of an orthogonal projection are −1 and 1
False: −1 is never an eigenvalue of a projection matrix, since there is no nonzero vector y such that proj_W(y) = −y. The possible eigenvalues of a projection are 0 and 1
For a 5×5 matrix A: if A has three pivots, then Col A is a (two-dimensional) plane
No. The dimension of the column space is the same as the number of pivot columns, so it must be 3
For a 5×5 matrix A: if A has two pivots, then the dimension of Nul A is 2
No. A having two pivots means the dimension of the column space of A is 2. By the Rank Theorem, the dimension of the null space is 5 − 2 = 3
A square matrix with two identical columns can be invertible
No. If A is invertible, then by the Invertible Matrix Theorem the linear transformation T(x) = Ax is one-to-one. But if two of the columns of A are equal, say the ith column and the jth column, then T(ei) = T(ej) = that column, even though ei ≠ ej. Hence if two of the columns are equal, then A is not invertible.
If the linear transformation T(x)=Ax is one-to-one, then the columns of A form a linearly dependent set
No. If the linear transformation T(x) = Ax is one-to-one, then the only vector mapped to the zero vector is the zero vector, so the columns of A satisfy the definition of linear independence
For a 5×5 matrix A: if rank A = 4, then the columns of A form a basis of R5
No. The rank is by definition the dimension of the column space of A, and since the rank is 4, the columns of A only span a subspace of dimension 4. But R5 has dimension 5, so the columns of A cannot span R5
If the first and second rows of an augmented matrix are (1,1,0) and (0,1,0) respectively, then the matrix is not in reduced row echelon form.
True. The leading 1 in the second row has a nonzero entry above it in the first row
The solution set of the linear system whose augmented matrix is [a1 a2 a3 b] is the same as the solution set of the equation x1a1 + x2a2 + x3a3 = b
True. The augmented matrix and the vector equation encode the same linear system
(In − A)(In + A) = In − A^2
True. Expanding gives In + A − A − A^2 = In − A^2
If v is an eigenvector of A, then cv is also an eigenvector of A for any number c≠0
True. If Av = λv, then A(cv) = c(Av) = c(λv) = λ(cv), with λ being the eigenvalue corresponding to v. Alternatively, the λ-eigenspace is a subspace, so any nonzero multiple of a nonzero vector in the λ-eigenspace is again an eigenvector with eigenvalue λ
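A one-line numpy illustration (the diagonal matrix A, the eigenvector v, and the scalar c are made-up examples): scaling an eigenvector by c ≠ 0 preserves the eigenvalue.

```python
import numpy as np

A = np.array([[2.0, 0.0], [0.0, 3.0]])
v = np.array([1.0, 0.0])   # eigenvector of A with eigenvalue 2
c = 5.0
print(np.allclose(A @ (c * v), 2 * (c * v)))  # True: cv is also an eigenvector
```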
For any matrix A, there exists a matrix B so that A+B=0
True: B=(−1)⋅A=−A
The columns of a matrix with dimensions m×n, where m<n, must be linearly dependent
True: it is impossible for such a matrix to have a pivot in each column. Alternatively, such a matrix must give rise to at least one free variable
If A is an m×n matrix, then A^T A and A A^T are both defined
True: A^T is an n×m matrix, so A^T A is n×n and A A^T is m×m
If y is in a subspace W as well as its orthogonal complement W⊥, then y must be the zero vector.
True: since y is in W⊥, we have y⋅w=0 for every w in W. But y is in W, so y⋅y=0, which implies y=0
If AT is row equivalent to the n×n identity matrix, then the columns of A span Rn
Yes. If A^T is row equivalent to the identity matrix, then A^T is invertible by the Invertible Matrix Theorem. Again by the theorem, since A^T is invertible, A is also invertible, and the columns of an invertible n×n matrix span Rn.
If the linear transformation T(x)=Ax is onto, then it is also one-to-one.
Yes. If T is onto, then every row of A has a pivot. Since A is square, every column also has a pivot, so T is one-to-one.
If A is invertible, then the equation Ax=b has exactly one solution for all b in Rn
Yes. The unique solution is x = A^-1 b
If the equation Ax=0 has a nontrivial solution, then A has fewer than n pivots
Yes. When Ax = 0 has a nontrivial solution, there is at least one free variable. This means some column has no pivot, so there are fewer than n pivots.
For a 5×5 matrix A: if Ax = 0 has only the trivial solution, then Col A = R5
Yes. Ax=0 has only the trivial solution means that all the columns of A are linearly independent vectors in R5, and since there are 5 of them, they form a basis for R5
Dimension of Nul A
dim Nul A = (# of columns) − rank A = (# of columns) − dim Col A
Reflection about the x axis
[ 1  0 ]
[ 0 -1 ]
Give an example of a matrix A and a vector b such that the solution set of Ax=b is a line in R3 that does not contain the origin.
A =
[ 1 0 0 ]
[ 0 1 0 ]
[ 0 0 0 ]
b =
[ 1 ]
[ 0 ]
[ 0 ]
If the columns of A are linearly independent, then det A=0
False. If the columns of a matrix are linearly independent, then by the invertible matrix theorem, this means the matrix is invertible, which in turn implies that its determinant is non-zero.
The determinant of a triangular matrix is the sum of the entries of the main diagonal.
False. The determinant of a triangular matrix is the product, not the sum, of the entries on the main diagonal.
The homogeneous system Ax=0 has the trivial solution if and only if the system has at least one free variable.
False. The homogeneous system Ax=0 always has the trivial solution 0, whether or not it has a free variable
If {v1,v2,v3} is an orthonormal basis for W, then multiplying v3 by a scalar c gives a new orthonormal basis {v1,v2,cv3}.
False. The vector cv3 will not be a unit vector unless c=±1.
The eigenvalues of A are the entries on its main diagonal.
False. This is only the case if A is a triangular or diagonal matrix.
For every y in R3, there is at most one x in R3 such that T(x)=y
T is a one-to-one function from R3 to R3
For every y in R3, there is a x in R3 such that T(x)=y
T is an onto function from R3 to R3
If a matrix is in reduced row echelon form, then the first nonzero entry in each row is a 1 and has 0s below it.
True
If the characteristic polynomial of a 2×2 matrix is λ^2 − 5λ + 6, then the determinant is 6
True. For a 2×2 matrix the characteristic polynomial is λ^2 − (tr A)λ + det A, so det A = 6
The columns of an invertible n×n matrix form a basis for Rn
True. An invertible n×n matrix has linearly independent columns by the Invertible Matrix Theorem. Therefore, since any set of n linearly independent vectors in Rn is a basis of Rn, the columns of the invertible matrix form a basis of Rn
A number c is an eigenvalue of A if and only if (A−cI)v=0 has a nontrivial solution.
True. By definition, c is an eigenvalue if and only if there exists some nonzero vector v such that Av=cv⟺Av−cv=0⟺(A−cI)v=0
The cofactor expansion of det A along the first row of A is equal to the cofactor expansion of det A along any other row
True. Cofactor expansion along any row or column always gives the same number.
If there is a basis of Rn consisting of eigenvectors of A, then A is diagonalizable
True. If C is an n×n matrix whose columns are linearly independent eigenvectors, then A = CDC^-1, where D is the diagonal matrix whose diagonal entries are the corresponding eigenvalues, in the same order.
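A numpy sketch of this factorization (the matrix A is an arbitrary diagonalizable example): np.linalg.eig returns the eigenvalues and a matrix C whose columns are eigenvectors, and C D C^-1 reconstructs A.

```python
import numpy as np

A = np.array([[4.0, 1.0], [2.0, 3.0]])  # eigenvalues 5 and 2
eigenvalues, C = np.linalg.eig(A)       # columns of C are eigenvectors
D = np.diag(eigenvalues)                # diagonal matrix of eigenvalues
print(np.allclose(A, C @ D @ np.linalg.inv(C)))  # True: A = C D C^-1
```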
The equation Ax=b is homogeneous if the zero vector is a solution.
True. If x=0 is a solution, then b=Ax=A0=0
The (i,j) minor of a matrix A is the matrix Aij obtained by deleting row i and column j from A
True. Note that if this question were about the (i,j) cofactor of a matrix A, then the answer would have been false, since the (i,j) cofactor is the determinant of the matrix Aij times (−1)^(i+j)
If A = QR, where Q has orthonormal columns, then R = Q^T A
True. A matrix Q has orthonormal columns if and only if Q^T Q = I, the identity matrix. Hence Q^T A = Q^T QR = R
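A quick numpy check (the matrix A is an arbitrary example with independent columns): np.linalg.qr returns a Q with orthonormal columns, and R agrees with Q^T A.

```python
import numpy as np

A = np.array([[1.0, 1.0], [1.0, 0.0], [0.0, 1.0]])
Q, R = np.linalg.qr(A)          # reduced QR factorization
print(np.allclose(R, Q.T @ A))  # True: R = Q^T A since Q^T Q = I
```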
If A is n×n and A has n distinct eigenvalues, then the corresponding eigenvectors of A are linearly independent.
True. Eigenvectors with distinct eigenvalues are linearly independent.
If two columns of A are the same, then the determinant of that matrix is zero
True. If a matrix has two columns which are the same, then its columns are linearly dependent. By the invertible matrix theorem, the matrix is not invertible, and thus its determinant must be 0.
Every real 3×3 matrix must have a real eigenvalue.
True. The characteristic polynomial has degree 3, and any degree 3 real polynomial has at least one real root.
A determinant of an n×n matrix can be defined as a sum of multiples of determinants of (n−1)×(n−1) submatrices.
True. This is referring to the calculation of the determinant using cofactor expansion: choose one row or column of the matrix, and multiply each entry of that row/column (with its appropriate sign from the checkerboard pattern) by the determinant of the corresponding (n−1)×(n−1) submatrix.
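A recursive Python sketch of exactly this definition, expanding along the first row (inefficient at O(n!), but it mirrors the card):

```python
def det(M):
    """Determinant of a square list-of-lists matrix by cofactor expansion."""
    n = len(M)
    if n == 1:
        return M[0][0]
    total = 0
    for j in range(n):
        # Submatrix with row 0 and column j deleted.
        minor = [row[:j] + row[j+1:] for row in M[1:]]
        total += (-1) ** j * M[0][j] * det(minor)  # checkerboard sign
    return total

print(det([[1, 2], [3, 4]]))  # -2
```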
The set of all solutions of a system of m homogeneous equations in n unknowns is a subspace of Rn
True. To solve a system of m equations in n unknowns, we can take the equations as the rows of an m×n matrix A and solve the matrix equation Ax = 0. The solution set is the null space of A, which is a subspace of Rn
A is invertible if and only if 0 is not an eigenvalue of A
True. Zero is an eigenvalue if and only if Ax = 0x has a nontrivial solution, i.e., Ax = 0 has a nontrivial solution. By the Invertible Matrix Theorem, this is equivalent to A not being invertible.
Real eigenvalues of a real matrix always correspond to real eigenvectors
True. If λ is a real eigenvalue of A, then the equation (A−λI)x=0 has a nonzero real solution
Two vectors are linearly dependent if and only if they are collinear
True: if ax + by = 0 with a ≠ 0 (for instance), then x = −(b/a)y, which says that x and y lie on the same line. Conversely, if x and y lie on the same line, then there exists a scalar a such that x = ay (unless y = 0, in which case swap the roles of x and y); then x − ay = 0 is an equation of linear dependence
Dimension of the column space
The dimension of the column space is the number of pivots in the row reduced echelon form of the matrix, because the columns in the original matrix that correspond to these pivots form a basis for the column space.
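A sympy sketch of this (the matrix is a made-up example of rank 2): Matrix.rref() returns the reduced row echelon form together with the pivot column indices, and the number of pivots is dim Col A.

```python
from sympy import Matrix

A = Matrix([[1, 2, 3],
            [2, 4, 6],
            [1, 0, 1]])
rref_form, pivot_cols = A.rref()  # RREF and indices of pivot columns
print(pivot_cols)                 # (0, 1) -> dim Col A = 2
```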
A row replacement operation does not affect the determinant of a matrix
True
If A is the matrix of an orthogonal projection, then A^2 = A
True
The column space of an m×n matrix is a subspace of Rm
True. By definition, the column space is the span of the columns, and any span is a subspace
For any matrix A, we have the equality 2A+3A=5A
True: 2A+3A=(2+3)A=5A
If y is in a subspace W, then the orthogonal projection of y onto W is y.
True: the closest vector in W to y is y itself.