TRUE/FALSE
T
For any scalar c, [u]*(c[v]) = c([u]*[v]).
F
If A is an mxn matrix, then the range of the transformation [x] -> A[x] is Rm.
T
If A is an nxn diagonalizable matrix, then each vector in Rn can be written as a linear combination of eigenvectors of A.
T
If A is invertible, then (detA)(det(A^-1)) = 1.
F
If A is invertible, then det(A^-1) = detA.
F
If A is mxn and rankA = m, then the linear transformation [x] -> A[x] is one-to-one.
F
If A is nxn and detA = 2, then det(A^3) = 6.
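A quick numerical check of the two determinant cards above, sketched with numpy (the matrix is an illustrative choice, not from the card set):
    import numpy as np
    A = np.array([[2.0, 0.0],
                  [0.0, 1.0]])                                 # detA = 2
    print(np.linalg.det(np.linalg.matrix_power(A, 3)))         # 8.0 = (detA)^3, not 6
    print(np.linalg.det(A) * np.linalg.det(np.linalg.inv(A)))  # 1.0, as the earlier card states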
F
If BC = BD, then C = D.
F
If L is a line through [0] and if y-hat is the orthogonal projection of [y] onto L, then ||y-hat|| gives the distance from [y] to L.
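A worked check with numpy (vectors chosen for illustration): the distance from [y] to L is ||[y] - y-hat||, not ||y-hat||.
    import numpy as np
    u = np.array([1.0, 0.0])              # direction vector of the line L
    y = np.array([3.0, 4.0])
    y_hat = (y @ u) / (u @ u) * u         # orthogonal projection of y onto L
    print(np.linalg.norm(y_hat))          # 3.0 -- length of the projection, not the distance
    print(np.linalg.norm(y - y_hat))      # 4.0 -- the distance from y to L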
F
If Pb is the change-of-coordinates matrix, then [[x]]b=Pb[x], for [x] in V.
F
If S is a linearly dependent set, then each vector is a linear combination of the other vectors in S.
F
If T: Rn -> Rm is a linear transformation and if [c] is in Rm, then a uniqueness question is "Is [c] in the range of T?"
F
If V=R2, B={[b]1, [b]2}, and C={[c]1, [c]2}, then row reduction of [ [c]1 [c]2 [b]1 [b]2 ] to [ I P ] produces a matrix P that satisfies [[x]]b = P[[x]]c for all [x] in V.
T
If V=Rn and C is the standard basis for V, then P(C<-B) is the same as the change-of-coordinates matrix Pb introduced in Section 4.4.
F
If [x] is a nontrivial solution of A[x] = [0], then every entry in [x] is nonzero.
T
If [y] = [z]1 + [z]2, where [z]1 is in a subspace W and [z]2 is in W perp, then [z]1 must be the orthogonal projection of [y] onto W.
T
If [y] is a linear combination of nonzero vectors from an orthogonal set, then the weights in the linear combination can be computed without row operations on a matrix.
T
If [y] is in a subspace W, then the orthogonal projection of [y] onto W is [y] itself.
T
If [z] is orthogonal to [u]1 and to [u]2 and if W = Span {[u]1, [u]2}, then [z] must be in W perp.
F
If a 5x5 matrix A has fewer than 5 distinct eigenvalues, then A is not diagonalizable.
T
If a finite set S of nonzero vectors spans a vector space V, then some subset of S is a basis for V.
T
The dimension of the null space of A is the number of columns of A that are not pivot columns.
F
The dimension of the vector space P4 is 4.
T
The dimensions of the row space and the column space of A are the same, even if A is not square.
F
The echelon form of a matrix is unique.
T
The effect of adding [p] to a vector is to move the vector in a direction parallel to [p].
F
The eigenvalues of an upper triangular matrix A are exactly the nonzero entries on the diagonal of A.
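A quick eigenvalue check with numpy (illustrative upper triangular matrix): the eigenvalues are all the diagonal entries, zeros included.
    import numpy as np
    A = np.array([[3.0, 1.0, 5.0],
                  [0.0, 0.0, 2.0],        # note the zero on the diagonal
                  [0.0, 0.0, 7.0]])
    print(np.linalg.eigvals(A))           # 3, 0, 7 (in some order)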
F
The equation A[x] = [0] gives an explicit description of its solution set.
T
The general least-squares problem is to find an [x] that makes A[x] as close as possible to [b].
F
The homogeneous equation A[x] = [0] has the trivial solution if and only if the equation has at least one free variable.
T
The kernel of a linear transformation is a vector space.
F
The least-squares solution of A[x] = [b] is the point in the column space of A closest to [b].
T
The null space of A is the solution set of the equation A[x] = [0].
F
The null space of an mxn matrix is in Rm.
T
The number of pivot columns of a matrix equals the dimension of its column space.
F
The number of variables in the equation A[x]=[0] equals the dimension of NulA.
T
The only three-dimensional subspace of R3 is R3 itself.
T
The orthogonal projection of [y] onto [v] is the same as the orthogonal projection of [y] onto c[v] whenever c does not equal 0.
F
The orthogonal projection y-hat of [y] onto a subspace W can sometimes depend on the orthogonal basis for W used to compute y-hat.
F
The pivot positions in a matrix depend on whether row interchanges are used in the row reduction process.
F
The points in the plane corresponding to (-2, 5) and (-5, 2) lie on a line through the origin.
T
The range of a linear transformation is a vector space.
F
The solution set of A[x] = [b] is the set of all vectors of the form [w] = [p] + [v]h, where [v]h is any solution of the equation A[x] = [0].
F
The solution set of a linear system involving variables x1, ... , xn is a list of numbers (s1, ... , sn) that makes each equation in the system a true statement when the values s1, ... , sn are substituted for x1, ... , xn, respectively.
T
The solution set of a linear system whose augmented matrix is [ [a]1 [a]2 [a]3 [b] ] is the same as the solution set of A[x] = [b], if A = [ [a]1 [a]2 [a]3 ].
T
Two vectors are linearly dependent if and only if they lie on a line through the origin.
T
When [u] and [v] are nonzero vectors, Span {[u], [v]} contains the line through [u] and the origin.
F
When two linear transformations are performed one after another, the combined effect may not always be a linear transformation.
F
Whenever a system has free variables, the solution set contains many solutions.
T
[u]*[v] - [v]*[u] = 0.
T
[v]*[v] = ||[v]||^2.
F
Two eigenvectors corresponding to the same eigenvalue are always linearly dependent.
T
Two fundamental questions about a linear system involve existence and uniqueness.
T
Two linear systems are equivalent if they have the same solution set.
F
Two matrices are row equivalent if they have the same number of rows.
F
A (square) matrix A is invertible if and only if there is a coordinate system in which the transformation [x] -> A[x] is represented by a diagonal matrix.
F
A 5x6 matrix has 6 rows.
T
A basic variable in a linear system is a variable that corresponds to a pivot column in the coefficient matrix.
T
A basis is a linearly independent set that is as large as possible.
F
A basis is a spanning set that is as large as possible.
T
A change-of-coordinates matrix is always invertible.
T
A general solution of a system is an explicit description of all solutions of the system.
T
A homogeneous equation is always consistent.
F
A least squares solution of A[x] = [b] is a vector x-hat such that ||[b] - A[x]|| is less than or equal to ||[b] - A(x-hat)|| for all [x] in Rn.
T
A least-squares solution of A[x] = [b] is a list of weights that, when applied to the columns of A, produces the orthogonal projection of [b] onto ColA.
T
A least-squares solution of A[x] = [b] is a vector x-hat that satisfies A(x-hat) = b-hat, where b-hat is the orthogonal projection of [b] onto ColA.
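A short numpy sketch tying the three least-squares cards above together (A and b are illustrative choices):
    import numpy as np
    A = np.array([[1.0, 0.0],
                  [1.0, 1.0],
                  [1.0, 2.0]])
    b = np.array([1.0, 0.0, 2.0])
    x_hat, *_ = np.linalg.lstsq(A, b, rcond=None)   # a least-squares solution
    b_hat = A @ x_hat                               # orthogonal projection of b onto ColA
    print(np.allclose(A.T @ (b - b_hat), 0))        # True: the residual is orthogonal to ColA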
T
A linear transformation T: Rn -> Rm is completely determined by its effect on the columns of the nxn identity matrix.
T
A linear transformation is a special type of function.
T
A linear transformation preserves the operations of vector addition and scalar multiplication.
F
A linearly independent set in a subspace H is a basis for H.
F
A mapping T: Rn -> Rm is one-to-one if each vector in Rn maps onto a unique vector in Rm.
F
A mapping T: Rn -> Rm is onto Rm if every vector [x] in Rn maps onto some vector in Rm.
F
A matrix with orthonormal columns is an orthogonal matrix.
T
A nonzero vector cannot correspond to two different eigenvalues of A.
T
A null space is a vector space.
F
A plane in R3 is a two-dimensional subspace of R3.
F
A plane in R3 is a two-dimensional subspace.
F
A product of invertible nxn matrices is invertible, and the inverse of the product is the product of their inverses in the same order.
T
A row replacement operation does not affect the determinant of a matrix.
F
A single vector by itself is linearly dependent.
F
A subset H of a vector space V is a subspace of V if the zero vector is in H.
T
A transformation T is linear if and only if T(c1[v]1 + c2[v]2) = c1T([v]1) + c2T([v]2) for all [v]1 and [v]2 in the domain of T and for all scalars c1 and c2.
T
A vector is any element of a vector space.
T
A vector space is also a subspace.
F
A vector space is infinite-dimensional if it is spanned by an infinite set.
T
A^t + B^t = (A+B)^t.
T
An elementary matrix must be square.
T
An elementary nxn matrix has either n or n+1 nonzero entries.
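A count check with numpy, using one 3x3 example of each kind of elementary matrix (the matrices are illustrative):
    import numpy as np
    E_swap = np.eye(3)[[1, 0, 2]]          # row interchange: n nonzero entries
    E_scale = np.diag([1.0, 5.0, 1.0])     # row scaling: n nonzero entries
    E_repl = np.eye(3); E_repl[2, 0] = 4   # row replacement: n + 1 nonzero entries
    print([np.count_nonzero(E) for E in (E_swap, E_scale, E_repl)])  # [3, 3, 4]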
T
An example of a linear combination of vectors [v]1 and [v]2 is the vector (1/2)[v]1.
F
An inconsistent system has more than one solution.
T
An nxn determinant is defined by determinants of (n-1)x(n-1) submatrices.
F
An nxn matrix with n linearly independent eigenvectors is invertible.
T
An orthogonal matrix is invertible.
F
Analogue signals are used in the major control systems for the space shuttle.
T
Any linear combination of vectors can always be written in the form A[x] for a suitable matrix A and vector [x].
T
Any list of five real numbers is a vector in R5.
T
Any solution of (A^t)A[x] = (A^t)[b] is a least-squares solution of A[x] = [b].
F
Any system of n linear equations in n variables can be solved by Cramer's rule.
T
Asking whether the linear system corresponding to an augmented matrix [ [a]1 [a]2 [a]3 [b] ] has a solution amounts to asking whether [b] is in Span {[a]1, [a]2, [a]3}.
F
ColA is the set of all solutions of A[x] = [b].
T
ColA is the set of all vectors that can be written as A[x] for some [x].
F
Each column of AB is a linear combination of the columns of B using weights from the corresponding column of A.
F
Each eigenvalue of A is also an eigenvalue of A^2.
T
Each eigenvector of A is also an eigenvector of A^2.
T
Each eigenvector of an invertible matrix A is also an eigenvector of A^-1.
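A numpy check of the two eigenvector cards above (the matrix is an illustrative invertible choice):
    import numpy as np
    A = np.array([[2.0, 1.0],
                  [0.0, 3.0]])
    w, V = np.linalg.eig(A)
    v, lam = V[:, 0], w[0]                              # an eigenpair of A
    print(np.allclose(A @ A @ v, lam**2 * v))           # True: v is an eigenvector of A^2
    print(np.allclose(np.linalg.inv(A) @ v, v / lam))   # True: v is an eigenvector of A^-1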
T
Each elementary matrix is invertible.
F
Eigenvalues must be nonzero scalars.
T
Eigenvectors must be nonzero vectors.
T
Elementary row operations on an augmented matrix never change the solution set of the associated linear system.
T
Every elementary row operation is reversible.
F
Every linear transformation is a matrix transformation.
T
Every matrix equation A[x] = [b] corresponds to a vector equation with the same solution set.
T
Every matrix transformation is a linear transformation.
F
Every square matrix is a product of elementary matrices.
T
Finding a parametric description of the solution set of a linear system is the same as solving the system.
F
For a square matrix A, vectors in Col A are orthogonal to vectors in NulA.
T
For an mxn matrix A, vectors in the null space of A are orthogonal to vectors in the row space of A.
F
For any scalar c, ||c[v]|| = c||[v]||.
T
For each [y] and each subspace W, the vector [y] - the projection of [y] onto W is orthogonal to W.
T
If A = QR, where Q has orthonormal columns, then R = (Q^t)A.
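A numpy verification (illustrative A with linearly independent columns): since (Q^t)Q = I, left-multiplying A = QR by Q^t isolates R.
    import numpy as np
    A = np.array([[1.0, 1.0],
                  [1.0, 0.0],
                  [0.0, 1.0]])
    Q, R = np.linalg.qr(A)            # reduced QR; the columns of Q are orthonormal
    print(np.allclose(Q.T @ A, R))    # True: R = (Q^t)A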
T
If A = [ A1 A2 ] and B = [ B1 B2 ], with A1 and A2 the same sizes as B1 and B2, respectively, then A+B = [ A1+B1 A2+B2 ].
F
If A and B are 2x2 with columns a1, a2, and b1, b2, respectively, then AB = [ a1b1 a2b2 ].
F
If A and B are 3x3 and B = [ b1 b2 b3 ], then AB = [ Ab1 + Ab2 + Ab3 ].
T
If A and B are invertible nxn matrices, then AB is similar to BA.
T
If A and B are mxn, then both A(B^t) and (A^t)B are defined.
F
If A and B are nxn and invertible, then (A^-1)(B^-1) is the inverse of AB.
F
If A and B are nxn matrices, with detA = 2 and detB = 3, then det(A+B) = 5.
F
If A and B are nxn, then (A+B)(A-B) = A^2 - B^2.
T
If A and B are row equivalent, then their row spaces are the same.
F
If A and B are square and invertible, then AB is invertible, and (AB)^-1 = (A^-1)(B^-1).
T
If A can be row reduced to the identity matrix, then A must be invertible.
T
If A contains a row or column of zeros, then 0 is an eigenvalue of A.
F
If A has a QR factorization, say A = QR, then the best way to find the least-squares solution of A[x] = [b] is to compute x-hat = (R^-1)(Q^t)[b].
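The preferred route is to solve R(x-hat) = (Q^t)[b] by back-substitution rather than form R^-1; a sketch of that idea (assumes scipy is available; the data are illustrative):
    import numpy as np
    from scipy.linalg import solve_triangular
    A = np.array([[1.0, 0.0],
                  [1.0, 1.0],
                  [1.0, 2.0]])
    b = np.array([6.0, 0.0, 0.0])
    Q, R = np.linalg.qr(A)
    x_hat = solve_triangular(R, Q.T @ b)  # back-substitution; no explicit inverse formed
    print(x_hat)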
T
If A is a 2x2 matrix with a zero determinant, then one column of A is a multiple of the other.
F
If A is a 3x2 matrix, then the transformation [x] -> A[x] cannot be one-to-one.
T
If A is a 3x2 matrix, then the transformation [x] -> A[x] cannot map R2 onto R3.
T
If A is a 3x3 matrix and the equation A[x] = (1 0 0) has a unique solution, then A is invertible.
T
If A is a 3x3 matrix with three pivot positions, there exist elementary matrices E1, ... , Ep such that Ep ... E1A = I.
F
If A is a 3x3 matrix, then det(5A) = 5detA.
F
If A is a 3x5 matrix and T is a transformation defined by T([x]) = A[x], then the domain of T is R3.
T
If A is an invertible nxn matrix, then the equation A[x] = [b] is consistent for each [b] in Rn.
T
If A is an mxn matrix and if the equation A[x] = [b] is inconsistent for some [b] in Rm, then A cannot have a pivot position in every row.
T
If A is an mxn matrix whose columns do not span Rm, then the equation A[x] = [b] is inconsistent for some [b] in Rm.
F
If A is an nxn matrix, then the equation A[x] = [b] has at least one solution for each [b] in Rn.
F
If A is diagonalizable, then the columns of A are linearly independent.
T
If A is invertible and 1 is an eigenvalue for A, then 1 is also an eigenvalue for A^-1.
F
If A is invertible and if r does not equal zero, then (rA)^-1 = r(A^-1).
F
If A is invertible, then elementary row operations that reduce A to the identity In also reduce A^-1 to In.
T
If A is invertible, then the inverse of A^-1 is A itself.
T
If A is mxn and the linear transformation [x] -> A[x] is onto, then rankA = m.
F
If A is row equivalent to the identity matrix I, then A is diagonalizable.
T
If A is similar to a diagonalizable matrix B, then A is also diagonalizable.
T
If AB = BA and if A is invertible, then (A^-1)B = B(A^-1).
F
If AB = C and C has 2 columns, then A has 2 columns.
F
If AB = I, then A is invertible.
F
If AC = 0, then either A=0 or C=0.
T
If A^3 = 0, then detA = 0.
T
If A^t is not invertible, then A is not invertible.
F
If B = {[b]1, ... , [b]n} and C = {[c]1, ... , [c]n} are bases for a vector space V, then the jth column of the change-of-coordinates matrix P(C<-B) is the coordinate vector [[c]j]b.
F
If B is an echelon form of a matrix A, then the pivot columns of B form a basis for ColA.
F
If B is any echelon form of A, and if B has three nonzero rows, then the first three rows of A form a basis for RowA.
F
If B is any echelon form of A, then the pivot columns of B form a basis for the column space of A.
T
If B is formed by adding to one row of A a linear combination of the other rows, then detB = detA.
T
If B is obtained from a matrix A by several elementary row operations, then rankB=rankA.
F
If B is produced by interchanging two rows of A, then detB = detA.
T
If B is produced by multiplying row 3 of A by 5, then detB = 5detA.
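A numpy demonstration of the three row-operation cards above, with an illustrative matrix:
    import numpy as np
    A = np.array([[1.0, 2.0, 0.0],
                  [3.0, 1.0, 4.0],
                  [0.0, 2.0, 5.0]])
    d = np.linalg.det(A)
    B = A.copy(); B[1] += 2 * A[0]              # row replacement
    C = A[[1, 0, 2]]                            # interchange of two rows
    D = A.copy(); D[2] *= 5                     # row 3 multiplied by 5
    print(np.isclose(np.linalg.det(B), d))      # True: determinant unchanged
    print(np.isclose(np.linalg.det(C), -d))     # True: sign flips, so detB = detA is false
    print(np.isclose(np.linalg.det(D), 5 * d))  # True: determinant scaled by 5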
T
If B is the standard basis for Rn, then the B-coordinate vector of an [x] in Rn is [x] itself.
F
If H = Span { [b]1, ... , [b]p }, then { [b]1, ... , [b]p } is a basis for H.
T
If H is a subspace of R3, then there is a 3x3 matrix A such that H = ColA.
F
If S is linearly independent, then S is a basis for V.
T
If SpanS=V, then some subset of S is a basis for V.
T
If T: R2 -> R2 rotates vectors about the origin through an angle, then T is a linear transformation.
F
If W = Span {[x]1, [x]2, [x]3} with {[x]1, [x]2, [x]3} linearly independent, and if {[v]1, [v]2, [v]3} is an orthogonal set in W, then {[v]1, [v]2, [v]3} is a basis for W.
T
If W is a subspace of Rn and if [v] is in both W and W perp, then [v] must be the zero vector.
T
If [b] is in the column space of A, then every solution of A[x] = [b] is a least-squares solution.
F
If [u] and [v] are in R2 and det[ [u] [v] ] = 10, then the area of the triangle in the plane with vertices at [0], [u], and [v] is 10.
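A quick numpy check (illustrative vectors): the triangle has half the area of the parallelogram, so the correct area is 5.
    import numpy as np
    u = np.array([5.0, 0.0])
    v = np.array([0.0, 2.0])
    d = np.linalg.det(np.column_stack([u, v]))   # 10.0
    print(abs(d) / 2)                            # 5.0 -- area of the triangle with vertices 0, u, v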
T
If [u] is a vector in a vector space V, then (-1)[u] is the same as the negative of [u].
T
If [x] and [y] are linearly independent, and if [z] is in Span {[x], [y]}, then {[x], [y], [z]} is linearly dependent.
T
If [x] and [y] are linearly independent, and if {[x], [y], [z]} is linearly dependent, then [z] is in Span {[x], [y]}.
T
If [x] is in V and if B contains n vectors, then the B-coordinate vector of [x] is in Rn.
T
If [x] is not in a subspace W, then [x] - the projection of [x] onto W is not zero.
T
If [x] is orthogonal to every vector in a subspace W, then [x] is in W perp.
F
If a set S = {[u]1, ... , [u]p} has the property that [u]i*[u]j = 0 whenever i does not equal j, then S is an orthonormal set.
F
If a set contains fewer vectors than there are entries in the vectors, then the set is linearly independent.
F
If a set in Rn is linearly dependent, then the set contains more vectors than there are entries in each vector.
T
If a set {[v]1, ... , [v]p} spans a finite-dimensional vector space V and if T is a set of more than p vectors in V, then T is linearly dependent.
F
If an mxn matrix A is row equivalent to an echelon matrix U and if U has k nonzero rows, then the dimension of the solution space of A[x]=[0] is m-k.
F
If an nxp matrix U has orthonormal columns, then U(U^t)[x] = [x] for all [x] in Rn.
F
If detA is zero, then two rows or two columns are the same, or a row or a column is zero.
F
If dimV = n and S is a linearly independent set in V, then S is a basis for V.
F
If dimV = n and if S spans V, then S is a basis for V.
T
If dimV = p, then there exists a spanning set of p+1 vectors in V.
F
If dimV=p and SpanS=V, then S cannot be linearly dependent.
T
If each vector [e]j in the standard basis for Rn is an eigenvector of A, then A is a diagonal matrix.
T
If every set of p elements in V fails to span V, then dimV is greater than p.
F
If f is a function in the vector space V of all real-valued functions on R and if f(t) = 0 for some t, then f is the zero vector in V.
T
If matrices A and B have the same reduced echelon form, then RowA = RowB.
F
If one row in an echelon form of an augmented matrix is [0 0 0 5 0], then the associated linear system is inconsistent.
F
If p is greater than or equal to 2 and dimV = p, then every set of p-1 nonzero vectors is linearly independent.
F
If the augmented matrix [ A [b] ] has a pivot position in every row, then the equation A[x] = [b] is inconsistent.
T
If the columns of A are linearly dependent, then detA = 0.
T
If the columns of A are linearly independent, then the columns of A span Rn.
T
If the columns of A are linearly independent, then the equation A[x] = [b] has exactly one least-squares solution.
T
If the columns of A span Rn, then the columns are linearly independent.
T
If the columns of an mxn matrix A are orthonormal, then the linear mapping [x] |-> A[x] preserves length.
T
If the columns of an mxn matrix A span Rm, then the equation A[x] = [b] is consistent for each [b] in Rm.
T
If the columns of an nxp matrix U are orthonormal, then U(U^t)[y] is the orthogonal projection of [y] onto the column space of U.
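A numpy sketch contrasting this card with the earlier false card claiming U(U^t)[x] = [x] (U is an illustrative 3x2 matrix with orthonormal columns):
    import numpy as np
    U = np.array([[1.0, 0.0],
                  [0.0, 1.0],
                  [0.0, 0.0]])       # orthonormal columns, p = 2 < n = 3
    y = np.array([1.0, 2.0, 3.0])
    print(U @ (U.T @ y))             # [1. 2. 0.] -- the projection of y onto ColU, not y itself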
T
If the distance from [u] to [v] equals the distance from [u] to -[v], then [u] and [v] are orthogonal.
T
If the equation A[x] = [0] has a nontrivial solution, then A has fewer than n pivot positions.
T
If the equation A[x] = [0] has only the trivial solution, then A is row equivalent to the nxn identity matrix.
T
If the equation A[x] = [b] has at least one solution for each [b] in Rn, then the solution is unique for each [b].
F
If the equation A[x] = [b] is consistent, then ColA is Rm.
T
If the equation A[x] = [b] is inconsistent, then [b] is not in the set spanned by the columns of A.
F
If the linear transformation [x] -> A[x] maps Rn into Rn, then A has n pivot positions.
F
If the vectors in an orthogonal set of nonzero vectors are normalized, then some of the new vectors may not be orthogonal.
F
If there exists a linearly dependent set {[v]1, ... , [v]p} in V, then dim V is less than or equal to p.
T
If there exists a linearly independent set {[v]1, ... , [v]p} in V, then dimV is greater than or equal to p.
T
If there exists a set {[v]1, ... , [v]p} that spans V, then dimV is less than or equal to p.
T
If there is a [b] in Rn such that the equation A[x] = [b] is inconsistent, then the transformation [x] -> A[x] is not one-to-one.
T
If there is an nxn matrix D such that AD = I, then there is also an nxn matrix C such that CA = I.
T
If two row interchanges are made in succession, then the new determinant equals the old determinant.
T
If two rows of a 3x3 matrix A are the same, then detA = 0.
T
If vectors [v]1, ... , [v]p span a subspace W and if [x] is orthogonal to each [v]j for j = 1, ... , p, then [x] is in W perp.
F
If x-hat is a least-squares solution of A[x] = [b], then x-hat = ((A^t)A)^-1(A^t)[b].
F
If {[v]1, ... , [v](p-1)} is linearly independent, then so is {[v]1, ... , [v]p}.
T
If {[v]1, ... , [v](p-1)} spans V, then {[v]1, ... , [v]p} spans V.
F
If {[v]1, [v]2, [v]3} is an orthogonal basis for W, then multiplying [v]3 by a scalar c gives a new orthogonal basis {[v]1, [v]2, c[v]3}.
T
If ||[u]||^2 + ||[v]||^2 = ||[u]+[v]||^2, then [u] and [v] are orthogonal.
T
In a QR factorization, say A = QR (when A has linearly independent columns), the columns of Q form an orthonormal basis for the column space of A.
T
In order for a matrix B to be the inverse of A, both equations AB = I and BA = I must be true.
F
In some cases, a matrix may be row reduced to more than one matrix in reduced echelon form, using different sequences of row operations.
T
In some cases, a plane in R3 can be isomorphic to R2.
F
In some cases, the linear dependence relations among the columns of a matrix can be affected by certain elementary row operations on the matrix.
T
In the Orthogonal Decomposition Theorem, each term in formula (2) for y-hat is itself an orthogonal projection of [y] onto a subspace of W.
T
Left-multiplying a matrix B by a diagonal matrix A, with nonzero entries on the diagonal, scales the rows of B.
F
Not every linear transformation from Rn to Rm is a matrix transformation.
T
Not every linearly independent set in Rn is an orthogonal set.
T
Not every orthogonal set in Rn is linearly independent.
T
NulA is the kernel of the mapping [x] -> A[x].
T
On a computer, row operations can change the apparent rank of a matrix.
F
R2 is a subspace of R3.
F
R2 is a two-dimensional subspace of R3.
T
Reducing a matrix to echelon form is called the forward phase of the row reduction process.
T
Row operations on a matrix A can change the linear dependence relations among the rows of A.
F
Row operations on a matrix can change the null space.
F
Row operations preserve the linear dependence relations among the rows of A.
T
Similar matrices always have exactly the same eigenvalues.
F
Similar matrices always have exactly the same eigenvectors.
F
The (i,j)-cofactor of a matrix A is the matrix Aij obtained by deleting from A its ith row and jth column.
T
The Gram-Schmidt process produces from a linearly independent set {[x]1, ... [x]p} an orthogonal set {[v]1, ... , [v]p} with the property that for each k, the vectors [v]1, ... , [v]k span the same subspace as that spanned by [x]1, ... , [x]k.
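A minimal Gram-Schmidt sketch in numpy (assumes the input columns are linearly independent; the function name and data are mine, for illustration):
    import numpy as np

    def gram_schmidt(X):
        # Orthogonalize the columns of X; the first k outputs span the
        # same subspace as the first k inputs.
        V = []
        for x in X.T:
            v = x - sum(((x @ w) / (w @ w)) * w for w in V)
            V.append(v)
        return np.column_stack(V)

    X = np.array([[1.0, 1.0],
                  [1.0, 0.0],
                  [0.0, 1.0]])
    V = gram_schmidt(X)
    print(np.isclose(V[:, 0] @ V[:, 1], 0))   # True: the output set is orthogonal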
F
The best approximation to the [y] by elements of a subspace W is given by the vector [y] - the projection of [y] onto W.
F
The codomain of the transformation [x] -> A[x] is the set of all linear combinations of the columns of A.
F
The cofactor expansion of detA down a column is the negative of the cofactor expansion along a row.
T
The column space of A is the range of the mapping [x] -> A[x].
T
The column space of an mxn matrix is in Rm.
T
The columns of P(C<-B) are linearly independent.
F
The columns of a matrix A are linearly independent if the equation A[x] = [0] has the trivial solution.
T
The columns of an invertible nxn matrix form a basis for Rn.
T
The columns of any 4x5 matrix are linearly dependent.
F
The columns of the change-of-coordinates matrix P(C<-B) are B-coordinate vectors of the vectors in C.
T
The columns of the standard matrix for a linear transformation from Rn to Rm are the images of the columns of the nxn identity matrix.
F
The correspondence [[x]]b -> [x] is called the coordinate mapping.
T
The definition of the matrix-vector product A[x] is a special case of block multiplication.
F
The determinant of A is the product of the diagonal entries in A.
T
The determinant of A is the product of the pivots in any echelon form U of A, multiplied by (-1)^r, where r is the number of row interchanges made during row reduction from A to U.
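A sketch of this pivot formula via an LU factorization (assumes scipy; A is illustrative): det(P) supplies the (-1)^r sign and the diagonal of U holds the pivots.
    import numpy as np
    from scipy.linalg import lu
    A = np.array([[0.0, 2.0, 1.0],
                  [1.0, 0.0, 3.0],
                  [2.0, 1.0, 0.0]])
    P, L, U = lu(A)                                # A = P L U, with det(L) = 1
    print(np.linalg.det(P) * np.prod(np.diag(U)))  # 13.0 (up to roundoff)
    print(np.linalg.det(A))                        # 13.0, matching the formula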
F
The determinant of a triangular matrix is the sum of the entries on the main diagonal.
F
The equation A[x] = [b] is consistent if the augmented matrix [ A [b] ] has a pivot position in every row.
T
The equation A[x] = [b] is homogeneous if the zero vector is a solution.
F
The equation A[x] = [b] is referred to as a vector equation.
F
The equation [x] = [p] + t[v] describes a line through [v] parallel to [p].
F
A vector is an arrow in three-dimensional space.
T
AB + AC = A(B+C).
T
The equation [x] = x2[u] + x3[v], with x2 and x3 free (and neither [u] nor [v] a multiple of the other), describes a plane through the origin.
T
The first entry in the product A[x] is a sum of products.
T
The matrices A and A^t have the same eigenvalues, counting multiplicities.
F
The nonpivot columns of a matrix are always linearly dependent.
F
The nonzero rows of a matrix A form a basis for RowA.
F
The normal equations always provide a reliable method for computing least-squares solutions.
F
The rank of a matrix equals the number of nonzero rows.
F
The row reduction algorithm applies only to augmented matrices for a linear system.
T
The row space of A is the same as the column space of A^t.
T
The row space of A^t is the same as the column space of A.
T
The second row of AB is the second row of A multiplied on the right by B.
F
The set Span {[u], [v]} is always visualized as a plane through the origin.
T
The set of all linear combinations of [v]1, ... , [v]p is a vector space.
T
The set of all solutions of a homogeneous linear differential equation is the kernel of a linear transformation.
F
The solution set of A[x] = [b] is obtained by translating the solution set of A[x] = [0].
T
The solution set of the linear system whose augmented matrix is [ [a]1 [a]2 [a]3 [b] ] is the same as the solution set of the equation x1[a]1 + x2[a]2 + x3[a]3 = [b].
F
The standard method for producing a spanning set for NulA, described in Section 4.2, sometimes fails to produce a basis for NulA.
F
The sum of the dimensions of the row space and the null space of A equals the number of rows in A.
F
The sum of two eigenvectors of a matrix A is also an eigenvector of A.
T
The superposition principle is a physical description of a linear transformation.
F
The transpose of a product of matrices equals the product of their transposes in the same order.
T
The transpose of a sum of matrices equals the sum of their transposes.
T
The transpose of an elementary matrix is an elementary matrix.
T
The vector [b] is a linear combination of the columns of a matrix A if and only if the equation A[x] = [b] has at least one solution.
T
The vector [u] results when a vector [u] - [v] is added to the vector [v].
F
The vector spaces P3 and R3 are isomorphic.
F
The weights c1, ... , cp in a linear combination c1[v]1 + ... + cp[v]p cannot all be zero.
T
There exists a 2x2 matrix that has no eigenvectors in R2.
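A concrete instance, checked with numpy: rotation of R2 by 90 degrees has no real eigenvalues, hence no eigenvectors in R2.
    import numpy as np
    A = np.array([[0.0, -1.0],
                  [1.0,  0.0]])        # rotation by 90 degrees
    print(np.linalg.eigvals(A))        # approximately +i and -i: no real eigenvalues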
T
A subspace is also a vector space.
F
(AB)C = (AC)B.
F
(AB)^t = (A^t)(B^t).
F
A subset H of a vector space V is a subspace of V if the following conditions are satisfied: (i) the zero vector of V is in H, (ii) [u], [v], and [u] + [v] are in H, and (iii) c is a scalar and c[u] is in H.
T
det((A^t)A) is greater than or equal to 0.
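A randomized numpy spot-check (the shape and seed are arbitrary): (A^t)A is positive semidefinite, so its determinant cannot be negative.
    import numpy as np
    rng = np.random.default_rng(0)
    A = rng.standard_normal((4, 3))
    print(np.linalg.det(A.T @ A) >= 0)   # True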
F
det(-A) = -detA.
F
det(A+B) = detA + detB.
F
det(A^t) = (-1)detA.