Linear Algebra True/False Final Questions
The equation Ax=0 gives an explicit description of its solution set
False- Ax=0 only describes the solution set implicitly; solving the equation and writing the solution in parametric vector form gives the explicit description
A is diagonalizable if and only if A has n eigenvalues, counting multiplicities
False
Each line in R^n is a one-dimensional subspace of R^n
False
If A is diagonalizable, then A is invertible
False
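(Quick counterexample: A = [1 0; 0 0] is already diagonal, hence diagonalizable, but det A = 0, so A is not invertible.)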
If the vectors in an orthogonal set of nonzero vectors are normalized, then some of the new vectors may not be orthogonal.
False
If {v1, v2, v3} is an orthogonal basis for W , then multiplying v3 by a scalar c gives a new orthogonal basis {v1, v2, cv3}
False
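(Reason: the statement fails for c = 0, since {v1, v2, 0} is still orthogonal but linearly dependent, hence not a basis; for c not equal to 0 the new set is indeed an orthogonal basis.)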
The (i,j)-cofactor of a matrix A is the matrix A_ij obtained by deleting from A its ith row and jth column
False
The orthogonal projection yhat of y onto a subspace W can sometimes depend on the orthogonal basis for W used to compute yhat
False
When two linear transformations are performed one after another, the combined effect may not always be a linear transformation
False
If lambda+5 is a factor of the characteristic polynomial of A, then 5 is an eigenvalue of A
False- -5 is an eigenvalue
A matrix with orthonormal columns is an orthogonal matrix.
False- Pg. 346 (the matrix must also be square)
The set of all solutions of a system of m homogeneous equations in n unknowns is a subspace of R^m
False- Pg 150
The equation Ax=b is consistent if the augmented matrix [A b] has a pivot position in every row
False- Pg 38
The transpose of a product of matrices equals the product of their transposes in the same order
False- Pg. 101
If A = [a b; c d] and ab-cd does not equal zero, then A is invertible
False- Pg. 105
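(Quick counterexample: A = [1 2; 2 4] has ab-cd = 1·2 - 2·4 = -6, which is nonzero, yet ad-bc = 1·4 - 2·2 = 0, so A is not invertible; the correct condition is ad-bc not equal to zero.)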
If A and B are nxn and invertible, then A^(-1)B^(-1) is the inverse of AB
False- Pg. 107
The row reduction algorithm applies only to augmented matrices for a linear system
False- Pg. 12
If A = [A11 A12; A21 A22] and B = [B1; B2], then the partitions of A and B are conformable for block multiplication.
False- Pg. 120
In some cases, a matrix may be row reduced to more than one matrix in reduced echelon form, using different sequences of row operations.
False- Pg. 13
det(A+B) = det A + det B
False- Pg. 175
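(Quick counterexample: with A = B = I (2x2), det(A+B) = det(2I) = 4, but det A + det B = 1 + 1 = 2.)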
Another notation for the column vector [-4; 3] is [-4 3]
False- Pg. 25
The points in the plane corresponding to (-2,5) and (-5,2) lie on a line through the origin
False- Pg. 26
If Ax=(lambda)x for some vector x, then lambda is an eigenvalue of A
False- Pg. 269
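(Reason: x must be nonzero; x = 0 satisfies Ax = (lambda)x for every lambda.)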
To find the eigenvalues of A, reduce A to echelon form
False- Pg. 270
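(Quick counterexample: A = [1 1; 1 1] has eigenvalues 2 and 0, but its echelon form [1 1; 0 0] has eigenvalues 1 and 0; row reduction changes eigenvalues, so solve det(A - (lambda)I) = 0 instead.)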
An elementary row operation on A does not change the determinant
False- Pg. 278
The determinant of A is the product of the diagonal entries in A
False- Pg. 278
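(Quick counterexample: A = [1 2; 3 4] has diagonal product 4 but det A = 1·4 - 2·3 = -2; the statement holds only when A is triangular.)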
A is diagonalizable if A=PDP^(-1) for some matrix D and some invertible matrix P
False- Pg. 284
The solution set of a linear system involving variables x1,...,xn is a list of numbers (s1,...,sn) that makes each equation in the system a true statement when the values s1,...,sn are substituted for x1,...,xn, respectively
False- Pg. 3
For a square matrix A, vectors in Col A are orthogonal to vectors in Nul A.
False- Pg. 337
If L is a line through 0 and if yhat is the orthogonal projection of y onto L, then ||yhat|| gives the distance from y to L
False- Pg. 343
The equation Ax=b is referred to as a vector equation.
False- Pg. 36
A 5x6 matrix has six rows
False- Pg. 4
The homogeneous equation Ax=0 has the trivial solution if and only if the equation has at least one free variable
False- Pg. 44
The equation x=p+tv describes a line through v parallel to p
False- Pg. 46
The solution set of Ax=b is the set of all vectors of the form w=p+vh, where vh is any solution of the equation Ax=0
False- Pg. 47
The columns of a matrix A are linearly independent if the equation Ax=0 has the trivial solution
False- Pg. 58
If S is a linearly dependent set, then each vector is a linear combination of the other vectors in S
False- Pg. 59
If A is a 3x5 matrix and T is a transformation defined by T(x)=Ax, then the domain of T is R^3
False- Pg. 64-5
If A is an mxn matrix, then the range of the transformation x --> Ax is R^m
False- Pg. 65
Every linear transformation is a matrix transformation
False- Pg. 66
A mapping T: R^n --> R^m is onto R^m if every vector x in R^n maps onto some vector in R^m
False- Pg. 76
If A is a 3x2 matrix, then the transformation x->Ax cannot be one-to-one.
False- Pg. 78
Each column of AB is a linear combination of the columns of B using weights from the corresponding column of A
False- Pg. 97
If A and B are 2x2 with columns a1, a2 and b1, b2 respectively, then AB = [a1b1 a2b2]
False- Pg. 97
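(Correct version: AB = A[b1 b2] = [Ab1 Ab2], i.e., each column of AB is A times the corresponding column of B.)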
A subspace of R^n is any set H such that (i) the zero vector is in H, (ii) u, v, and u+v are in H, and (iii) c is a scalar and cu is in H
False- Pg.148
The set Span{u,v} is always visualized as a plane through the origin
False- Pg.30
If A is an nxn matrix, then the equation Ax=b has at least one solution for each b in R^n
False- by IMT
If one row in an echelon form of an augmented matrix is [0 0 0 5 0], then the associated linear system is inconsistent
False- x4=0
Every elementary row operation is reversible.
TRUE - Pg. 6
A basic variable in a linear system is a variable that corresponds to a pivot column in the coefficient matrix
True
Finding an eigenvector of A may be difficult, but checking whether a given vector is in fact an eigenvector is easy
True
Row operations do not affect linear dependence relations among the columns of a matrix
True
An example of a linear combination of vectors v1 and v2 is the vector (1/2)v1
True- (1/2)v1=(1/2)v1+0v2
The columns of an invertible nxn matrix form a basis for R^n
True- Pg 150-1
A row replacement operation does not affect the determinant of a matrix
True- Pg 171
A vector b is a linear combination of the columns of a matrix A if and only if the equation Ax=b has at least one solution
True- Pg 37
A^T+B^T=(A+B)^T
True- Pg. 101
In order for a matrix B to be the inverse of A, both equations AB=I and BA=I must be true
True- Pg. 105
If A is an invertible nxn matrix, then the equation Ax=b is consistent for each b in R^n
True- Pg. 106
Each elementary matrix is invertible
True- Pg. 109
If A=[A1 A2] and B=[B1 B2], with A1 and A2 the same sizes as B1 and B2, respectively, then A+B=[A1+B1 A2+B2].
True- Pg. 119-120
If v1, ..., vp are in R^n, then Span{v1, ..., vp} is the same as the column space of the matrix [v1 ... vp]
True- Pg. 149
The dimension of Col A is the number of pivot columns of A
True- Pg. 152
If B={v1, ..., vp} is a basis for a subspace H and if x=c1v1+...+cpvp, then c1, ..., cp are the coordinates of x relative to the basis B
True- Pg. 156
If a set of p vectors spans a p-dimensional subspace H of R^n, then these vectors form a basis for H
True- Pg. 158
The dimensions of Col A and Nul A add up to the number of columns of A
True- Pg. 158
An nxn determinant is defined by determinants of (n-1)x(n-1) submatrices
True- Pg. 167
If the columns of A are linearly dependent, then det A = 0.
True- Pg. 173
The determinant of A is the product of the pivots in any echelon form U of A, multiplied by (-1)^r, where r is the number of row interchanges made during row reduction from A to U
True- Pg. 173
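(Quick check: A = [0 1; 1 0] takes one row interchange to reach I, whose pivots are 1 and 1, so det A = (-1)^1 · 1 · 1 = -1, which matches.)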
Finding a parametric description of the solution set of a linear system is the same as solving the system
True- Pg. 19
A number c is an eigenvalue of A if and only if the equation (A-cI)x=0 has a nontrivial solution
True- Pg. 269
A matrix A is not invertible if and only if 0 is an eigenvalue of A
True- Pg. 272
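(Reason: 0 is an eigenvalue of A if and only if Ax = 0x = 0 has a nontrivial solution, which by the IMT happens exactly when A is not invertible.)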
(detA)(detB)=det(AB)
True- Pg. 278
If R^n has a basis of eigenvectors of A, then A is diagonalizable
True- Pg. 284
The solution set of the linear system whose augmented matrix is [a1 a2 a3 b] is the same as the solution set of the equation x1a1 + x2a2 + x3a3 = b
True- Pg. 30
For any scalar c, u·(cv) = c(u·v)
True- Pg. 333
v·v = ||v||^2
True- Pg. 333
If the distance from u to v equals the distance from u to -v, then u and v are orthogonal
True- Pg. 335
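(Quick check: ||u - v||^2 = ||u||^2 - 2u·v + ||v||^2 and ||u - (-v)||^2 = ||u||^2 + 2u·v + ||v||^2, so the two distances are equal exactly when u·v = 0.)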
If vectors v1,...,vp span a subspace W and if x is orthogonal to each vj for j = 1,...,p, then x is in W^perp
True- Pg. 336
Not every linearly independent set in R^n is an orthogonal set
True- Pg. 340
If y is a linear combination of nonzero vectors from an orthogonal set, then the weights in the linear combination can be computed without row operations on a matrix.
True- Pg. 341
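(The weights come directly from inner products: if y = c1v1 + ... + cpvp with {v1, ..., vp} orthogonal and nonzero, then cj = (y·vj)/(vj·vj).)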
For each y and each subspace W, the vector y - proj(w)y is orthogonal to W.
True- Pg. 350
If z is orthogonal to u1 and to u2 and if W = Span{u1, u2}, then z must be in W⊥.
True- Pg. 350
If y is in a subspace W, then the orthogonal projection of y onto W is y itself.
True- Pg. 352
If A is an mxn matrix and if the equation Ax=b is inconsistent for some b in R^m, then A cannot have a pivot position in every row.
True- Pg. 37
If the columns of an mxn matrix A span R^m, then the equation Ax=b is consistent for each b in R^m
True- Pg. 37
The first entry in the product Ax is a sum of products
True- Pg. 38
A homogeneous equation is always consistent
True- Pg. 43-4
If x and y are linearly independent, and if {x,y,z} is linearly dependent, then z is in Span{x,y}
True- Pg. 59
The columns of any 4x5 matrix are linearly dependent
True- Pg. 60
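(Reason: the 5 columns are vectors in R^4, and any set of more than 4 vectors in R^4 is linearly dependent.)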
A linear transformation is a special type of function
True- Pg. 64
A transformation T is linear if and only if T(c1v1+c2v2)=c1T(v1)+c2T(v2) for all v1 and v2 in the domain of T and for all scalars c1 and c2
True- Pg. 66
Two fundamental questions about a linear system involve existence and uniqueness
True- Pg. 7
A linear transformation T:R^n--> R^m is completely determined by its effect on the columns of the nxn identity matrix
True- Pg. 71
If T: R^2 --> R^2 rotates vectors about the origin through an angle, then T is a linear transformation.
True- Pg. 73
AB+AC=A(B+C)
True- Pg. 99
If A^T is not invertible, then A is not invertible
True- by IMT
If the columns of A span R^n, then the columns are linearly independent
True- by IMT
If the equation Ax=0 has a nontrivial solution, then A has fewer than n pivot positions
True- by IMT
If the equation Ax=0 has only the trivial solution, then A is row equivalent to the nxn identity matrix
True- by IMT