Math 415 Worksheet T/F
11.7.2 If two vectors are linearly independent, then they are orthogonal.
False
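A counterexample in R^2 (linearly independent but not orthogonal):

```latex
v_1 = \begin{pmatrix} 1 \\ 0 \end{pmatrix}, \quad
v_2 = \begin{pmatrix} 1 \\ 1 \end{pmatrix}, \qquad
v_1 \cdot v_2 = 1 \neq 0 .
```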
11.7.3 All projection matrices are invertible.
False
12.9.2 Let Q = [q1 . . . qn] be a square matrix with orthogonal columns. Then Q^-1 = [(q1/|q1|) ... (qn/|qn|)].
False
12.9.5 If IBC is a change of basis matrix from C to B (where B, C are two bases of R^n, for some n), then the change of basis matrix ICB is just the transpose of IBC.
False
12.9.6 In a MIMO system described by y = Hx (x the vector of signals transmitted, y the vector of signals received), the nullspace of H consists of the vectors received for all possible transmission vectors.
False
12.9.7 In a MIMO system described by y = Hx (x the vector of signals transmitted, y the vector of signals received), the column space of H consists of those transmission vectors that give the zero signal at the receiving end.
False
12.9.8 If A = QR (as in the QR decomposition), then Col(A)= Col (R).
False
12.9.9 If a linear transformation T has the identity matrix I as a matrix representation with respect to two bases B, C, then T is the identity transformation.
False
13.7.1 det(A + B) = det(A) + det(B).
False
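A counterexample with A = B = I_2:

```latex
\det(I_2 + I_2) = \det(2I_2) = 4, \qquad \det(I_2) + \det(I_2) = 1 + 1 = 2 .
```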
13.7.2 Suppose that A is a 3 × 3 matrix with det(A) = 9. Then det(2A) = 18.
False
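Scaling a 3 × 3 matrix by 2 multiplies each of its three rows by 2, so the determinant picks up a factor of 2^3:

```latex
\det(2A) = 2^3 \det(A) = 8 \cdot 9 = 72 \neq 18 .
```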
13.7.4 Suppose that A is a 2 × 3 matrix. Then det(AA^T) = 0.
False
13.7.5 The determinant of A is the product of the pivots in any echelon form U of A, multiplied by (−1)^r, where r is the number of row interchanges made during row reduction from A to U.
False
13.7.7 The eigenvalues of a square matrix are the numbers on its main diagonal.
False
13.7.8 If v1 and v2 are linearly independent eigenvectors of a square matrix A, then they correspond to different eigenvalues.
False
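A counterexample: A = I_2 has the linearly independent eigenvectors e_1, e_2, both with the same eigenvalue 1:

```latex
I_2 e_1 = 1 \cdot e_1, \qquad I_2 e_2 = 1 \cdot e_2 .
```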
13.7.9 The sum of two eigenvectors of a matrix A is also an eigenvector of A.
False
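One way to see this: if Av_1 = v_1 and Av_2 = 2v_2 with v_1, v_2 linearly independent, then

```latex
A(v_1 + v_2) = v_1 + 2v_2 \neq \lambda (v_1 + v_2) \quad \text{for every scalar } \lambda,
```

so v_1 + v_2 is not an eigenvector.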
2.4.3 There cannot be more free variables than pivot variables.
False
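A counterexample: a single equation in three unknowns already has more free variables than pivot variables:

```latex
x_1 + x_2 + x_3 = 0 : \qquad \text{one pivot variable } (x_1), \quad \text{two free variables } (x_2,\, x_3).
```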
2.4.4 There is a linear system that has exactly two solutions.
False
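If x_1 ≠ x_2 are both solutions of Ax = b, then so is every point on the line through them:

```latex
A\bigl(x_1 + t(x_2 - x_1)\bigr) = Ax_1 + t(Ax_2 - Ax_1) = b + t(b - b) = b \quad \text{for all } t \in \mathbb{R},
```

so a consistent linear system has either exactly one solution or infinitely many.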
2.4.5 If the augmented matrix of a linear system has two identical rows, the linear system is necessarily inconsistent.
False
3.9.1 The weights c1, ..., cp ∈ R in a linear combination c1v1 + ... + cpvp cannot all be zero.
False
3.9.5 If A is a 2×2 matrix and x ∈ R^2 such that Ax = 0, then either A = 0 (the zero matrix) or x = 0 (zero vector).
False
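A counterexample:

```latex
A = \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix}, \quad
x = \begin{pmatrix} 0 \\ 1 \end{pmatrix}, \qquad Ax = 0,
```

yet neither A nor x is zero.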
4.11.3 If an m × n matrix A has a pivot position in every row, then the equation Ax = b has a unique solution for each b in R^m.
False
4.11.5 Suppose A is an l × m matrix and B is an m × n matrix. Each column of AB is a linear combination of the columns of B using weights from the corresponding column of A.
False
4.11.6 The transpose of a product of matrices equals the product of their transposes in the same order.
False
4.11.7 If A and B are 3 × 3 and B = [b1 b2 b3], then AB = [Ab1 + Ab2 + Ab3].
False
4.11.9 If A and B are 3 × 3 such that A = [a1 a2 a3] and B = [2a1 2a2 2a3] , then B = 8A.
False
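Doubling every column doubles the whole matrix; it does not cube the factor:

```latex
B = [\,2a_1 \; 2a_2 \; 2a_3\,] = 2\,[\,a_1 \; a_2 \; a_3\,] = 2A,
```

which equals 8A only when A = 0.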
5.4.10 A is an arbitrary n×n square matrix. If A cannot be brought to an echelon form without using row interchange, then A does not have an LU-decomposition.
False
5.4.2 A and B are arbitrary n×n square matrices. If A is invertible, then AB is also invertible.
False
5.4.3 A and B are arbitrary n×n square matrices. If A and B are invertible, then A + B is also invertible.
False
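A counterexample:

```latex
A = I_n, \quad B = -I_n : \qquad A + B = 0 \text{ is not invertible.}
```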
5.4.4 A is an arbitrary n×n square matrix. If A is invertible, then A has an LU-factorization.
False
5.4.5 A is an arbitrary n×n square matrix. If A is not invertible, then A does not have an LU-factorization.
False
5.4.8 A is an arbitrary n×n square matrix. If A is not invertible and has an LU decomposition A = LU, then neither L nor U is invertible.
False
5.4.9 A is an arbitrary n×n square matrix. If A has an LU factorization, then this factorization is unique.
False
6.8.2 Let A be an m × n-matrix and B an n × l-matrix. The null space of AB is contained in the null space of A.
False
6.8.3 Let A be an m × n-matrix. The set of all vectors b ∈ R^m that are not in the column space of A, is a subspace of R^m.
False
6.8.6 Let A be an m × n-matrix. The column space of A − I is equal to the column space of A.
False
7.5.2 If S is a linearly dependent set, then each vector is a linear combination of the other vectors in S.
False
7.5.6 If x and y are linearly independent, and if z is in span{x, y}, then {x, y, z} is linearly independent.
False
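If z = c_1 x + c_2 y is in span{x, y}, then

```latex
c_1 x + c_2 y + (-1)\,z = 0
```

is a nontrivial dependence relation, so {x, y, z} is linearly dependent.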
7.5.7 If a set in R^n is linearly dependent, then it contains at least n + 1 vectors.
False
8.5.10 A is an arbitrary m × n matrix. If B is an echelon form of A, then the pivot columns of B form a basis for the column space of A.
False
8.5.2 A plane in R^3 is a two-dimensional subspace of R^3.
False
8.5.3 The dimension of the vector space P4 (the vector space of polynomials of degree at most 4) is 4.
False
8.5.4 V is an arbitrary vector space. If dim V = n and S is a linearly independent set in V , then S is a basis for V .
False
8.5.6 V is an arbitrary vector space. If dim V = n and if S spans V , then S is a basis of V .
False
8.5.7 A is an arbitrary m × n matrix. If B is an echelon form of A, and if B has three nonzero rows, then the first three rows of A form a basis for the row space Col(A^T) of A.
False
8.5.9 A is an arbitrary m × n matrix. The sum of the dimensions of the row space and the null space of A equals the number of rows in A.
False
8.7.2 Let T : V → W be a linear transformation and v1, v2, . . . , vn be vectors in V . If v1, v2, . . . , vn are linearly independent then T(v1), T(v2), . . . , T(vn) are also linearly independent.
False
8.7.3 Let v1, v2, . . . , vn span V and let T : V → W be a linear transformation. Do the images Tv1, Tv2, . . . , Tvn span W?
False
9.7.1 If two vectors are orthogonal, then they are linearly independent.
False
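The zero vector is the obstruction: it is orthogonal to every vector, but any set containing it is dependent:

```latex
v \neq 0, \; w = 0 : \qquad v \cdot w = 0, \quad \text{yet } \{v, w\} \text{ is linearly dependent.}
```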
9.7.10 If a square matrix has orthogonal columns, then it is invertible.
False
9.7.13 |cv| = c|v| for all scalars c and vectors v in R^n.
False
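The correct identity uses |c|, which matters for negative scalars:

```latex
|cv| = |c|\,|v|; \qquad c = -1 : \; |{-v}| = |v| \neq -|v| \text{ whenever } v \neq 0 .
```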
9.7.5 If W is a subspace of R^n, then W and W⊥ have no vectors in common.
False
11.7.1 All vectors are in R^n for some n. Let W be a subspace of R^n and a, b, c in R^n such that a = b + c and b is in W and c is in W⊥. Then b is the projection of a onto W.
True
11.7.4 All vectors are in R^n for some n. If w is a vector in W and T denotes the projection onto the subspace W, then T(w) = w.
True
11.7.5 All vectors are in R^n for some n. If v is a vector in W⊥ and T denotes the projection onto the subspace W, then T(v) = 0.
True
11.7.6 If P is the projection matrix of a projection onto a subspace W, then Nul(P) = W⊥.
True
11.7.7 If W = R^n, that is, we consider W to be the whole vector space R^n, then the projection matrix (in the standard basis) is the identity matrix.
True
12.9.1 The QR decomposition of a matrix is unique.
True
12.9.3 If A = QR, where Q has orthonormal columns, then R = Q^T A.
True
12.9.4 In a QR factorization, say A = QR (when A has linearly independent columns), the columns of Q form an orthonormal basis for the column space of A.
True
13.7.3 Suppose that A is a 2 × 3 matrix. Then det(A^T A) = 0.
True
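A rank argument: A^T A is 3 × 3, but its rank is bounded by the rank of A:

```latex
\operatorname{rank}(A^T A) \le \operatorname{rank}(A) \le 2 < 3,
```

so A^T A is singular and its determinant is 0.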
13.7.6 A square matrix A is not invertible iff 0 is an eigenvalue of A.
True
2.4.1 There is no more than one pivot position in any row.
True
2.4.2 There is no more than one pivot position in any column.
True
3.9.2 Given nonzero vectors u, v in R^n, span{u, v} contains the line through u and the origin.
True
3.9.3 Asking whether the linear system corresponding to [a1 a2 a3 | b] is consistent, is the same as asking whether b is a linear combination of a1, a2, a3.
True
3.9.4 If A and x are real numbers such that Ax = 0, then either A = 0 or x = 0.
True
3.9.6 If A is a 2 × 2 matrix with two pivots, then Ax = 0 implies that x = 0.
True
4.11.1 If an augmented matrix [A | b] is transformed into [C | d] by elementary row operations, then the equations Ax = b and Cx = d have exactly the same solution sets.
True
4.11.10 If A is an n × n-matrix, then (A^2)^T = (A^T)^2.
True
4.11.2 If 3 × 3 matrices A and B each have three pivot positions, then A can be transformed into B by elementary row operations.
True
4.11.4 If an n × n matrix A has n pivot positions, then the Reduced Row Echelon Form of A is the n × n identity matrix.
True
4.11.8 Suppose A is an l × m matrix and B is an m × n matrix with l ≥ 2. The second row of AB is the second row of A multiplied on the right by B.
True
5.4.1 A is an arbitrary n×n square matrix. If A is invertible then Ax = 0 has exactly one solution, x = 0.
True
5.4.6 A is an arbitrary n×n square matrix. If A is invertible, then the reduced echelon form of A is equal to I.
True
5.4.7 A is an arbitrary n×n square matrix. If A is invertible and has an LU decomposition A = LU, then both L and U are invertible.
True
6.8.1 Let A be an m × n-matrix and B an n × l-matrix. The column space of AB is contained in the column space of A.
True
6.8.4 Let A be an m × n-matrix. If the column space of A only contains the zero vector, then A is the zero matrix.
True
6.8.5 Let A be an m × n-matrix. The column space of A is equal to the column space of 2A.
True
7.5.1 The columns of a matrix A are linearly independent if the equation Ax = 0 has only the trivial solution.
True
7.5.3 The columns of any 4 × 5 matrix are linearly dependent.
True
7.5.4 If x and y are linearly independent, and if {x, y, z} is linearly dependent, then z is in span{x, y}.
True
7.5.5 Two vectors are linearly dependent if and only if they lie on a line through the origin.
True
7.5.8 If u and v are linearly independent, then u + v and u − v are linearly independent.
True
8.5.1 The number of pivot columns of a matrix equals the dimension of its column space.
True
8.5.11 A is an arbitrary m × n matrix. The dimension of the null space of A is the number of columns of A that are not pivot columns.
True
8.5.12 A is an arbitrary m × n matrix. If two matrices A and B are row equivalent, then the vector spaces Col(A) and Col(B) have the same dimension.
True
8.5.5 V is an arbitrary vector space. If a set {v1, . . . , vp} spans a vector space V and if T is a set of more than p vectors in V , then T is linearly dependent.
True
8.5.8 A is an arbitrary m × n matrix. The dimensions of the row space and the column space of A are always the same, even if A is not a square matrix.
True
8.7.1 Let T : V → W be a linear transformation and v1, v2, . . . , vn be vectors in V . If T(v1), T(v2), . . . , T(vn) are linearly independent then v1, v2, . . . , vn are also linearly independent.
True
8.7.4 Let V and W be vector spaces, and let T : V → W be a linear transformation. Given a subspace U of V , let T(U) denote the set of all images of the form T(x), where x is in U. Is T(U) a subspace of W?
True
9.7.11 If a square matrix has orthonormal columns, then it also has orthonormal rows.
True
9.7.12 If z is orthogonal to v1 and v2 and if W = span(v1, v2), then z must be in W⊥.
True
9.7.14 If S is a subspace of R^n, then S⊥ is a subspace of R^n.
True
9.7.2 All vectors are in R^n for some n. If x is orthogonal to both u and v, then x must be orthogonal to u − v.
True
9.7.3 All vectors are in R^n for some n. If |u|^2 + |v|^2 = |u + v|^2, then u and v are orthogonal.
True
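Expanding the right-hand side shows why:

```latex
|u + v|^2 = (u + v) \cdot (u + v) = |u|^2 + 2\,u \cdot v + |v|^2,
```

so the hypothesis forces u · v = 0.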
9.7.4 The set of all vectors in R^n orthogonal to one fixed vector is a subspace of R^n.
True
9.7.6 If W is a subspace of R^n, then dim W + dim W⊥ = n.
True
9.7.7 All vectors are in R^n for some n. If {v1, v2, v3} is an orthogonal set and if c1, c2 and c3 are scalars, then {c1v1, c2v2, c3v3} is an orthogonal set.
True
9.7.8 If a matrix U has orthonormal columns, then U^T U = I.
True
9.7.9 If an n × n matrix U has orthonormal columns, then |Uv| = |v| for every v in R^n.
True