Test 2 TF
An m×n upper triangular matrix is one whose entries below the main diagonal are all zero. When is a square upper triangular matrix invertible?
A square upper triangular matrix is invertible when all of the entries on its main diagonal are nonzero, because then the n×n matrix has n pivot positions.
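For instance (an illustrative matrix, not one from the original exercise), a 3×3 upper triangular matrix with nonzero diagonal entries has nonzero determinant and is therefore invertible:
    \det \begin{pmatrix} 2 & 1 & 4 \\ 0 & 3 & 5 \\ 0 & 0 & 7 \end{pmatrix} = 2 \cdot 3 \cdot 7 = 42 \neq 0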
A is diagonalizable if A=PDP^−1 for some matrix D and some invertible matrix P
False
A is diagonalizable if and only if A has n eigenvalues, counting multiplicities
False
A matrix A is diagonalizable if A has n eigenvectors
False
A product of invertible n×n matrices is invertible, and the inverse of the product is the product of their inverses in the same order
False
A subset H of ℝn is a subspace if the zero vector is in H.
False
A subspace of ℝn is any set H such that (i) the zero vector is in H, (ii) u, v, and u+v are in H, and (iii) c is a scalar and cu is in H.
False
An elementary row operation on A does not change the determinant.
False
Determine whether the statement "det(A^T) = (−1)det(A)" is true or false
False
Determine whether the statement "A row replacement operation on A does not change the eigenvalues" is true or false
False
Determine whether the statement "If A is 3×3, with columns a1, a2, a3, then det(A) equals the volume of the parallelepiped determined by a1, a2, a3" is true or false.
False
If A is diagonalizable, then A has n distinct eigenvalues.
False
If A is diagonalizable, then A is invertible.
False
If A is invertible, then A is diagonalizable
False
If A is invertible, then elementary row operations that reduce A to the identity In also reduce A^−1 to In
False
If Ax = λx for some vector x, then λ is an eigenvalue of A.
False
If B is an echelon form of a matrix A, then the pivot columns of B form a basis for Col(A)
False
If det(A) is zero, then two rows or two columns are the same, or a row or a column is zero
False
If three row interchanges are made in succession, then the new determinant equals the old determinant.
False
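As a quick worked check (illustrative reasoning, not part of the original answer key): each row interchange multiplies the determinant by −1, so three interchanges in succession give
    \det(B) = (-1)^3 \det(A) = -\det(A),
the negative of the old determinant rather than the same value.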
If v1 and v2 are linearly independent eigenvectors, then they correspond to distinct eigenvalues
False
If λ+5 is a factor of the characteristic polynomial of A, then 5 is an eigenvalue of A
False
The cofactor expansion of det A down a column is the negative of the cofactor expansion along a row
False
The column space of a matrix A is the set of solutions of Ax = b.
False
The determinant of A is the product of the diagonal entries in A
False
The determinant of A is the product of the pivots in any echelon form U of A, multiplied by (−1)^r, where r is the number of row interchanges made during row reduction from A to U
False
The determinant of a triangular matrix is the sum of the entries on the main diagonal.
False
The eigenvalues of a matrix are on its main diagonal.
False
The set of all solutions of a system of m homogeneous equations in n unknowns is a subspace of ℝm.
False
To find the eigenvalues of A, reduce A to echelon form
False
det(A+B) = det(A) + det(B)
False
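A small counterexample (illustrative, not from the original) shows why the determinant is not additive:
    A = \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix}, \quad B = \begin{pmatrix} 0 & 0 \\ 0 & 1 \end{pmatrix}, \quad \det(A) + \det(B) = 0 + 0 = 0, \quad \text{but} \quad \det(A+B) = \det(I_2) = 1.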
det(A^−1) = (−1)det(A)
False
Suppose F is a 5×5 matrix whose column space is not equal to ℝ5. What can you say about Nul(F)?
If Col(F) ≠ ℝ5, then the columns of F do not span ℝ5. Since F is square, the Invertible Matrix Theorem shows that F is not invertible and the equation Fx = 0 has a nontrivial solution. Therefore, Nul(F) contains a nonzero vector.
If Q is a 4×4 matrix and Col(Q) = ℝ4, what can you say about solutions of equations of the form Qx = b for b in ℝ4?
If Col(Q) = ℝ4, then the columns of Q span ℝ4. Since Q is square, the Invertible Matrix Theorem shows that Q is invertible and the equation Qx = b has a solution for each b in ℝ4.
If R is a 6×6 matrix and Nul(R) is not the zero subspace, what can you say about Col(R)?
If Nul(R) contains nonzero vectors, then the equation Rx = 0 has nontrivial solutions. Since R is square, the Invertible Matrix Theorem shows that R is not invertible and the columns of R do not span ℝ6. Therefore, Col(R) is a subspace of ℝ6, but Col(R) ≠ ℝ6.
Suppose A is n×n and the equation Ax = b has a solution for each b in ℝn. Explain why A must be invertible.
If the equation Ax = b has a solution for each b in ℝn, then A has a pivot position in each row. Since A is square, the pivots must be on the diagonal of A. It follows that A is row equivalent to In. Therefore, A is invertible.
If A is invertible, then the columns of A^−1 are linearly independent.
If A is invertible, then A^−1 is also invertible. According to the Invertible Matrix Theorem, if a matrix is invertible, its columns form a linearly independent set. Therefore, the columns of A^−1 are linearly independent.
If C is 6×6 and the equation Cx = v is consistent for every v in ℝ6, is it possible that for some v, the equation Cx = v has more than one solution?
It is not possible. Since Cx = v is consistent for every v in ℝ6, the Invertible Matrix Theorem shows that the 6×6 matrix C is invertible. Since C is invertible, Cx = v has a unique solution for each v.
Is it possible for a 5×5 matrix to be invertible when its columns do not span ℝ5? Why or why not?
It is not possible; according to the Invertible Matrix Theorem, an n×n matrix cannot be invertible when its columns do not span ℝn.
How many rows does B have if BC is a 5×2 matrix?
Matrix B has 5 rows
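This follows from the general size rule for matrix products (a standard fact, restated here for reference): if B is m×p and C is p×n, then BC is m×n, so BC and B always have the same number of rows:
    B_{m \times p}\, C_{p \times n} = (BC)_{m \times n}, \quad \text{here } (BC) \text{ is } 5 \times 2, \text{ so } B \text{ has } 5 \text{ rows}.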
A is a 3×3 matrix with two eigenvalues. Each eigenspace is one-dimensional. Is A diagonalizable?
No. The sum of the dimensions of the eigenspaces is 2, but the matrix has 3 columns. For A to be diagonalizable, the sum of the dimensions of the eigenspaces must equal the number of columns.
Explain why the columns of an n×n matrix A span ℝn when A is invertible
Since A is invertible, for each b in ℝn the equation Ax = b has a unique solution. Since the equation Ax = b has a solution for all b in ℝn, the columns of A span ℝn.
Suppose A is n×n and the equation Ax = 0 has only the trivial solution. Explain why A has n pivot columns and A is row equivalent to In.
Suppose A is n×n and the equation Ax = 0 has only the trivial solution. Then there are no free variables in this equation, thus A has n pivot columns. Since A is square and the n pivot positions must be in different rows, the pivots in an echelon form of A must be on the main diagonal. Hence A is row equivalent to the n×n identity matrix, In.
If the equation Gx = y has more than one solution for some y in ℝn, can the columns of G span ℝn?
The columns of G cannot span ℝn. According to the Invertible Matrix Theorem, if Gx = y has more than one solution for some y in ℝn, then G is not invertible, and the columns of a non-invertible square matrix do not span ℝn.
Can a square matrix with two identical columns be invertible? Why or why not?
The matrix is not invertible. If a matrix has two identical columns then its columns are linearly dependent. According to the Invertible Matrix Theorem this makes the matrix not invertible.
If a matrix A is 2×9 and the product AB is 2×7, what is the size of B?
The size of B is 9×7
A matrix A is not invertible if and only if 0 is an eigenvalue of A
True
A number c is an eigenvalue of A if and only if the equation (A−cI)x = 0 has a nontrivial solution
True
A row replacement operation does not affect the determinant of a matrix
True
An eigenspace of A is a null space of a certain matrix.
True
A steady-state vector for a stochastic matrix is actually an eigenvector
True
Determine whether the statement "The multiplicity of a root r of the characteristic equation of A is called the algebraic multiplicity of r as an eigenvalue of A" is true or false
True
Finding an eigenvector of A may be difficult, but checking whether a given vector u is in fact an eigenvector is easy
True
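A minimal sketch of such a check, using NumPy with an illustrative matrix and vector (the names A and u and their values are assumptions for the example, not from the original):
    import numpy as np

    # Illustrative 2×2 example: verify that u is an eigenvector of A by checking
    # that A @ u is a scalar multiple of u (u must be a nonzero vector).
    A = np.array([[4.0, 1.0],
                  [2.0, 3.0]])
    u = np.array([1.0, 1.0])

    Au = A @ u                       # A u = [5, 5]
    lam = Au[0] / u[0]               # candidate eigenvalue (assumes u[0] != 0)
    print(np.allclose(Au, lam * u))  # True: u is an eigenvector with eigenvalue 5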
Given vectors v1, ..., vp in ℝn, the set of all linear combinations of these vectors is a subspace of ℝn.
True
If A can be row reduced to the identity matrix, then A must be invertible
True
If A is invertible, then the inverse of A^−1 is A itself.
True
If A is the 2×2 matrix [a b; c d] and ad = bc, then A is not invertible
True
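As a worked equation (the standard 2×2 determinant formula), ad = bc forces the determinant to vanish, so A cannot be invertible:
    \det \begin{pmatrix} a & b \\ c & d \end{pmatrix} = ad - bc = 0 \quad \text{when } ad = bc.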
If AP = PD, with D diagonal, then the nonzero columns of P must be eigenvectors of A.
True
If Ax = λx for some scalar λ, then x is an eigenvector of A
True
If ℝn has a basis of eigenvectors of A, then A is diagonalizable
True
If the columns of A are linearly dependent, then det(A) = 0.
True
If v1 ,..., vp are in ℝn, then S = Span{v1 ,..., vp} is the same as the column space of the matrix A = [v1 ... vp].
True
Row operations do not affect linear dependence relations among the columns of a matrix.
True
The columns of an invertible n×n matrix form a basis for ℝn
True
The null space of an m×n matrix is a subspace of ℝn.
True
[det(A)][det(B)] = det(AB)
True
If the subspace of all solutions of Ax = 0 has a basis consisting of four vectors and if A is a 6×9 matrix, what is the rank of A?
rank A = 5
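This uses the Rank Theorem, rank A + dim Nul A = n, where n is the number of columns. Here n = 9 and dim Nul A = 4, so
    \operatorname{rank} A = 9 - 4 = 5.
The same computation gives the next two answers below: 9 − 3 = 6 and 7 − 2 = 5.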
What is the rank of a 7×9 matrix whose null space is three-dimensional?
rank A = 6
If the rank of a 4×7 matrix A is 2, what is the dimension of the solution space of Ax = 0?
the dimension of the solution space is 5