Linear Algebra

A consistent system of linear equations has exactly one solution.

False

A subset H of a vector space V is a subspace of V if the zero vector is in H

False

When A and B are n × n, invertible matrices, then AB is invertible and (AB)^−1 = (A^−1)(B^−1)

False
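
Why this is False: the inverse of a product reverses the order, (AB)^−1 = B^−1 A^−1, which generally differs from A^−1 B^−1. A minimal NumPy sketch, with A and B chosen here purely as an illustration:

```python
import numpy as np

# Illustrative invertible matrices (chosen for this sketch, not from the card).
A = np.array([[1.0, 2.0], [3.0, 5.0]])
B = np.array([[2.0, 1.0], [1.0, 1.0]])

AB_inv = np.linalg.inv(A @ B)
same_order = np.linalg.inv(A) @ np.linalg.inv(B)      # the order the card claims
reversed_order = np.linalg.inv(B) @ np.linalg.inv(A)  # the correct rule

print(np.allclose(AB_inv, same_order))      # False in general
print(np.allclose(AB_inv, reversed_order))  # True
```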

When A has a diagonalization P DP^−1 this is a Singular Value Decomposition of A

False

When A is an n×n matrix and the equation Ax = b is consistent, then the solution is unique

False

When u, v are nonzero vectors, then Span{u, v} contains only the line through u and the origin, and the line through v and the origin

False

If A is an n × n matrix, does the equation Ax = b have at least one solution for each b in Rn?

Sometimes

If T : R2 → R2 rotates vectors counterclockwise about the origin through an angle θ, then T is a linear transformation

True
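
A quick linearity check using the standard 2×2 rotation matrix; the specific θ, u, v, and scalar c below are arbitrary illustrative choices:

```python
import numpy as np

# theta, u, v, and the scalar c are arbitrary illustrative choices.
theta = 0.7
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])  # standard rotation matrix

u = np.array([1.0, 2.0])
v = np.array([-3.0, 0.5])
c = 2.5

# Linearity: rotating c*u + v equals c*(rotated u) + (rotated v).
print(np.allclose(R @ (c * u + v), c * (R @ u) + R @ v))  # True
```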

The subset V = {[a b]^T : ab ≥ 0} of R2 is not closed under vector addition

True

If u, v, and w are nonzero vectors in R2 and u is not a multiple of v, is w a linear combination of u and v?

Always

A product of invertible n × n matrices is invertible, and the inverse of the product is the product of their inverses in the same order.

False

A set B = {v1, v2, . . . , vp} of vectors in Rn is always linearly independent when p < n.

False

An n × n symmetric matrix A always has n distinct real eigenvalues

False
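
A one-line counterexample (chosen here, not from the card): the identity matrix is symmetric but has a single repeated eigenvalue.

```python
import numpy as np

# The 3x3 identity is symmetric, yet its only eigenvalue is 1 (repeated).
A = np.eye(3)
print(np.linalg.eigvalsh(A))  # [1. 1. 1.] -- not 3 distinct values
```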

Every matrix is row equivalent to a unique matrix in echelon form.

False

Every orthogonal matrix is orthogonally diagonalizable

False

An n × n matrix A is said to be diagonalizable when A = P DP^−1 for some matrix D and invertible matrix P

False

Four linearly independent vectors in R5 span R5

False

If A and B are 3 × 3 matrices and B = [b1 b2 b3] , then the product AB is given by AB = [Ab1 + Ab2 + Ab3] .

False
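
The correct rule is AB = [Ab1 Ab2 Ab3], a matrix whose columns are the vectors Ab_j, not the single column Ab1 + Ab2 + Ab3. A small NumPy sketch with matrices chosen for illustration:

```python
import numpy as np

# Illustrative 3x3 matrices chosen for this sketch.
A = np.arange(9.0).reshape(3, 3)
B = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 0.0]])

# AB places the vectors A @ b_j side by side as columns.
cols = np.column_stack([A @ B[:, j] for j in range(3)])
print(np.allclose(A @ B, cols))  # True: AB = [Ab1 Ab2 Ab3]

# The card's formula would collapse AB to a single 3-vector instead.
print((A @ B).shape, (A @ B[:, 0] + A @ B[:, 1] + A @ B[:, 2]).shape)  # (3, 3) (3,)
```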

If A is a 2×2 symmetric matrix, then the set of x such that x^T Ax = c, for some constant c, corresponds to either a circle, an ellipse, or a hyperbola

False

If A is an m × n matrix and the equation Ax = b is consistent for some b in Rm, then the columns of A span Rm

False

If A is an m × n matrix, then the range of the transformation T_A : x → Ax is Rm

False

If A is an n × n matrix and Ax = λx for some scalar λ, then x is an eigenvector of A

False

If A is symmetric, then the change of variable x = P y transforms Q(x) = x^T Ax into a quadratic form with no cross-product term for any orthogonal matrix P

False

If B is an echelon form of a matrix A, then the pivot columns of B form a basis for Col A

False

If L is a line through the origin and if proj_L y is the orthogonal projection of y onto L, then ‖proj_L y‖ is the distance from y to L

False

If a set S = {u1, ..., up} has the property that ui · uj = 0 whenever i ≠ j, then S is an orthonormal set

False

If a system of linear equations has no free variables, then it has a unique solution.

False

If an n × n matrix A is diagonalizable, then A has n distinct eigenvalues

False
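
A counterexample sketch (matrix chosen for illustration): a diagonal matrix is trivially diagonalizable yet can repeat eigenvalues.

```python
import numpy as np

# A diagonal matrix is already diagonal, hence diagonalizable, yet the
# eigenvalue 2 repeats: only two distinct eigenvalues for n = 3.
A = np.diag([2.0, 2.0, 3.0])
print(np.linalg.eigvals(A))  # [2. 2. 3.]
```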

If e1, e2, and e3 are the standard basis vectors for R3 then B = {e1, e2, e1 − e2, e3} is a basis for R3

False

If f is a function in the vector space V of all real-valued functions on R and if f(t) = 0 for some t, then f is the zero vector in V

False

If the equation Ax = b is consistent, then b is in the set spanned by the rows of A

False

If the vectors in an orthogonal set of nonzero vectors are normalized, then some of the new vectors may not be orthogonal.

False
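
Why this is False: normalizing rescales each vector by a positive constant, which cannot change a zero dot product. A sketch with an orthogonal pair chosen for illustration:

```python
import numpy as np

# An illustrative orthogonal pair (dot product zero).
v1 = np.array([3.0, 1.0, 1.0])
v2 = np.array([-1.0, 2.0, 1.0])
print(v1 @ v2)  # 0.0

# Normalizing only rescales lengths; orthogonality survives.
u1 = v1 / np.linalg.norm(v1)
u2 = v2 / np.linalg.norm(v2)
print(np.isclose(u1 @ u2, 0.0))  # True
```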

If v1 and v2 are linearly independent eigenvectors of an n × n matrix A, then they correspond to distinct eigenvalues of A

False

If {v1, v2, v3} is an orthogonal basis for W, then multiplying v3 by a scalar c gives a new orthogonal basis {v1, v2, c v3} for W

False

Let V be a vector space. If dim V = n and if S spans V , then S is a basis for V

False

The best approximation to y by elements of a subspace W is given by the vector y − projW y

False

The columns of a matrix A are linearly independent when the equation Ax = 0 has the solution x = 0.

False

The composition of two linear transformations need not be a linear transformation.

False

The determinant of a triangular matrix is always the sum of the entries on the main diagonal

False
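
The determinant of a triangular matrix is the product of the diagonal entries, not the sum. A sketch contrasting the two on an arbitrary upper triangular matrix:

```python
import numpy as np

# An arbitrary upper triangular matrix with diagonal 2, 3, 4.
A = np.array([[2.0, 5.0, 1.0],
              [0.0, 3.0, 7.0],
              [0.0, 0.0, 4.0]])

print(np.linalg.det(A))  # 24.0 (up to rounding): the PRODUCT 2*3*4
print(np.trace(A))       # 9.0: the sum, which is not the determinant
```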

The dimension of Nul A is the number of variables in the equation Ax = 0.

False

The dimension of the vector space P4 of all polynomials of degree at most four is 4

False

The eigenvalues of an n × n matrix A are the entries on the main diagonal of A

False

The equality (ABC)^T = C^T A^T B^T holds for all n × n matrices A, B, and C.

False

The homogeneous equation Ax = 0 has the trivial solution x = 0 if and only if the equation has at least one free variable.

False

The orthogonal projection, projW y, of y onto a subspace W depends on the orthogonal basis for W used to compute it

False

The pivot columns of rref(A) form a basis for Col A.

False

The set H of all polynomials p(x) = a + x^4, a in R, is a subspace of the vector space P6 of all polynomials of degree at most 6

False

The singular values of a matrix A are all positive

False
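
Singular values are nonnegative but can be zero when A is not full rank. A rank-1 counterexample chosen for illustration:

```python
import numpy as np

# Rank-1 counterexample: the second row is twice the first.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])
print(np.linalg.svd(A, compute_uv=False))  # [5. 0.] -- one singular value is 0
```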

The subset V = {[a b]^T : ab ≥ 0} of R2 is closed under vector addition.

False

The sum of the vector u − v and the vector v is the vector v.

False

Three vectors in R5 always span R5

False

When u = [−2 5] and v = [−5 2], then the vectors in Span{u, v} lie on a line through the origin.

False

A basic variable in a linear system is a variable that corresponds to a pivot column in the coefficient matrix

True

A basis {v1, v2, . . . , vp} for a vector space V is a set such that Span{v1, v2, . . . , vp} = V for which p is as small as possible

True

A linear transformation T : Rn → Rm is completely determined by its effect on the columns e1, e2, . . . , en of the n × n identity matrix In.

True

A linear transformation is a function T : Rn → Rm such that T(c1a + c2b) = c1T(a) + c2T(b) for all vectors a, b in Rn and all scalars c1, c2.

True

A quadratic form can always be written as Q(x) = x^T Ax with A a symmetric matrix

True
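
Replacing M by its symmetric part (M + M^T)/2 leaves x^T M x unchanged, which is why the matrix of a quadratic form can always be taken symmetric. A sketch with an illustrative M and x:

```python
import numpy as np

# M and x are illustrative choices; A is the symmetric part of M.
M = np.array([[1.0, 4.0],
              [0.0, 3.0]])
A = (M + M.T) / 2          # symmetric
x = np.array([2.0, -1.0])

# Both matrices produce the same quadratic form value.
print(x @ M @ x, x @ A @ x)  # -1.0 -1.0
```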

A quadratic form has no cross-product terms if and only if the matrix of the quadratic form is a diagonal matrix

True

A subset H of a vector space V is a subspace of V if the following conditions are satisfied: i) the zero vector of V is in H, ii) the sum u + v is in H for all u, v in H, iii) the scalar multiple cu is in H for all scalars c and u in H

True

A subspace H of a vector space V is a vector space by itself

True

A transformation T : Rn → Rm is linear if and only if T(c1v1 + c2v2) = c1T(v1) + c2T(v2) for all vectors v1, v2 in Rn and all scalars c1, c2.

True

A vector b is a linear combination of the columns of a matrix A if and only if the equation Ax = b has at least one solution.

True

An example of a linear combination of vectors v1 and v2 is the vector −2v1.

True

An n × n matrix A is not invertible if 0 is an eigenvalue of A

True

An n × n matrix that is orthogonally diagonalizable must be symmetric

True

Any list of five real numbers is a vector in R5

True

Every linear transformation T : Rn → Rm is a matrix transformation.

True

Every n × n matrix A having linearly independent eigenvectors v1, v2, . . . , vn can be diagonalized

True

Every symmetric matrix is orthogonally diagonalizable

True
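
This is the spectral theorem. A numerical spot-check on one illustrative symmetric matrix, using NumPy's eigh:

```python
import numpy as np

# One illustrative symmetric matrix; eigh returns real eigenvalues w and an
# orthogonal matrix P of eigenvectors with A = P diag(w) P^T.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
w, P = np.linalg.eigh(A)

print(np.allclose(P @ np.diag(w) @ P.T, A))  # True: orthogonal diagonalization
print(np.allclose(P.T @ P, np.eye(2)))       # True: P is orthogonal
```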

For each fixed 3 × 2 matrix B, the corresponding set H of all 2 × 4 matrices A such that BA = 0 is a subspace of R^{2×4}

True

For each y in Rn and each subspace W of Rn the vector y − projW y is in W⊥

True

For every matrix equation Ax = b there corresponds a vector equation having the same solution set.

True

Four linearly independent vectors in R4 span R4

True

If A can be row reduced to the identity matrix, then A must be invertible.

True

If A is a 7 × 7 matrix having eigenvalues λ1, λ2, and λ3 such that (i) the eigenspace corresponding to λ1 is two-dimensional, (ii) the eigenspace corresponding to λ2 is three-dimensional, then A need not be diagonalizable.

True

If A is an invertible n × n matrix, then the equation Ax = b is consistent for every b in Rn

True

If A is an m × n matrix, then A^TA is orthogonally diagonalizable.

True

If A is an m × n matrix, then the range of the transformation T_A : Rn → Rm, x → Ax, is the set of all linear combinations of the columns of A

True

If A is an n × n invertible matrix with singular value σ, then A^−1 has singular value σ^−1

True
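
The singular values of A^−1 are the reciprocals of those of A. A spot-check on an invertible matrix chosen here for illustration:

```python
import numpy as np

# An arbitrary invertible matrix for the spot-check.
A = np.array([[3.0, 1.0],
              [0.0, 2.0]])

s = np.linalg.svd(A, compute_uv=False)
s_inv = np.linalg.svd(np.linalg.inv(A), compute_uv=False)

# The singular values of A^-1 are the reciprocals of those of A.
print(np.allclose(np.sort(1.0 / s), np.sort(s_inv)))  # True
```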

If A is an n × n matrix and its columns are linearly independent, then the columns of A span Rn

True

If A is an n × n matrix, then (A^2)^T = (A^T)^2

True

If A is invertible, then the inverse of A^−1 is A itself.

True

If H is a p-dimensional subspace of Rn then a linearly independent set of p vectors in H is a basis for H.

True

If W is a subspace of Rn and y is a vector in Rn such that y = z1 + z2 with z1 in W and z2 in W⊥, then z1 is the orthogonal projection, projW y, of y onto W

True

If a set of p vectors spans a p-dimensional subspace H of Rn then these vectors form a basis for H

True

If the columns of an n × n matrix A are linearly dependent, then det A = 0.

True
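
A sketch (the dependence is chosen here for illustration): the third column equals the sum of the first two, and the determinant vanishes.

```python
import numpy as np

# Third column = first column + second column (dependence chosen here).
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 9.0],
              [7.0, 8.0, 15.0]])
print(np.isclose(np.linalg.det(A), 0.0))  # True
```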

If the columns of an n×n matrix A span Rn then the columns are linearly independent

True

If u and v are linearly independent and w is in Span{u, v}, then the set {u, v, w} is linearly dependent.

True

If u1, u2, u3 and u4 are vectors in R7 and u2 = 0, then the set S = {u1, u2, u3, u4} is linearly dependent.

True

Not every orthogonal set in Rn is linearly independent

True

The Algebraic Multiplicity of an eigenvalue λ of a square matrix A is the multiplicity of λ as a root of the characteristic equation of A

True

The determinant of an n × n matrix A can be defined recursively in terms of the determinants of (n − 1) × (n − 1) submatrices of A

True

The equation Ax = b is homogeneous if the zero vector is a solution.

True

The expression ‖x‖^2 is a quadratic form

True

The four points (0, 0, 0), (2, −3, −12), (−1, −2, −1), (−3, −1, 7), in 3-space are co-planar, i.e., lie on a plane.

True
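
A numerical check of the card itself: taking the first point as the origin, the other three position vectors lie in a common plane exactly when their determinant is zero.

```python
import numpy as np

# Position vectors of the other three points relative to (0, 0, 0).
P = np.array([[ 2.0, -3.0, -12.0],
              [-1.0, -2.0,  -1.0],
              [-3.0, -1.0,   7.0]])

# Zero determinant means the three vectors are linearly dependent,
# so all four points lie on a plane through the origin.
print(np.isclose(np.linalg.det(P), 0.0))  # True
```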

The only three-dimensional subspace of R3 is R3 itself

True

The product of the m × n matrix A = [a1 a2 . . . an] and the vector x = [x1 x2 . . . xn] in Rn is the vector x1a1 + x2a2 + . . . + xnan in Rm

True

There are 2 × 2 matrices A, B, and C for which AB = AC and A ≠ 0, but B ≠ C

True
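
A standard counterexample pattern (the specific matrices below are an illustrative choice): A zeroes out the second row of any product, so B and C can differ there without affecting AB.

```python
import numpy as np

# A annihilates second rows, so products cannot see the second
# rows of B and C.
A = np.array([[1.0, 0.0], [0.0, 0.0]])
B = np.array([[1.0, 2.0], [3.0, 4.0]])
C = np.array([[1.0, 2.0], [5.0, 6.0]])

print(np.allclose(A @ B, A @ C))  # True: AB = AC
print(np.array_equal(B, C))       # False: yet B != C
```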

When A is symmetric, then an orthogonal diagonalization of A need not be a Singular Value Decomposition of A

True

When u, v are vectors in Rn such that dist(u, v) = dist(u, −v), then u, v are orthogonal

True
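
The identity ‖u − v‖^2 − ‖u + v‖^2 = −4 u·v explains this: equal distances force u·v = 0, since dist(u, −v) = ‖u + v‖. A sketch with an illustrative pair:

```python
import numpy as np

# An illustrative orthogonal pair.
u = np.array([1.0, 2.0])
v = np.array([-2.0, 1.0])

print(np.linalg.norm(u - v), np.linalg.norm(u + v))  # equal distances
print(u @ v)                                          # 0.0: orthogonal
```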

is a vector space under the usual addition and scalar multiplication of vectors in R3

True

