Test 3 True/False

If B is any echelon form of A, then the pivot columns of B form a basis for the column space of A.

False; the pivot columns of B only locate the pivot columns of A, and it is the corresponding columns of A that form a basis for Col A.

If A is diagonalizable, then A is invertible.

False; A is invertible exactly when 0 is not an eigenvalue of A, and that has no bearing on diagonalizability. The zero matrix is diagonalizable but not invertible.

A vector space is infinite-dimensional if it is spanned by an infinite set.

False; a finite-dimensional space can also be spanned by an infinite set. For example, R^2 is spanned by the (infinite) set of all vectors in R^2.

The nonzero rows of a matrix A form a basis for Row A

False; the nonzero rows span Row A but need not be linearly independent, so they may fail to be a basis. (The nonzero rows of an echelon form of A do form a basis.)

Similar matrices always have exactly the same eigenvectors

False; similar matrices share eigenvalues but generally not eigenvectors: if B = P^-1AP and Ax = λx, the corresponding eigenvector of B is P^-1x, not x.
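
A quick numerical illustration of this with numpy (a sketch, not part of the original deck): B = P^-1AP shares A's eigenvalues, but an eigenvector of A is generally not an eigenvector of B.

import numpy as np

A = np.array([[2.0, 0.0],
              [0.0, 3.0]])
P = np.array([[1.0, 1.0],
              [0.0, 1.0]])
B = np.linalg.inv(P) @ A @ P      # B = [[2, -1], [0, 3]], similar to A

print(np.linalg.eigvals(A))       # 2 and 3
print(np.linalg.eigvals(B))       # the same eigenvalues
x = np.array([0.0, 1.0])          # eigenvector of A for eigenvalue 3
print(A @ x)                      # [0. 3.] = 3x
print(B @ x)                      # [-1. 3.], not a multiple of x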

The sum of the dimensions of the row space and the null space of A equals the number of rows in A

False; by the Rank Theorem it equals the number of columns of A: dim Row A + dim Nul A = n.

The number of variables in the equation Ax=0 equals the dimension of NulA

False; dim Nul A equals the number of free variables in Ax = 0, not the total number of variables.

A plane in R^3 is a two dimensional subspace of R^3

False; a plane is a subspace only if it passes through the origin.

The sum of two eigenvectors of a matrix A is also an eigenvector of A

False; this fails in general. If v1 and v2 correspond to different eigenvalues λ1 ≠ λ2, then A(v1 + v2) = λ1v1 + λ2v2, which is not a scalar multiple of v1 + v2.

The correspondence [x]B -> x is called the coordinate mapping

False; the coordinate mapping is x -> [x]B, a map from V to R^n. The map given in the statement is its inverse, from R^n to V.

Each eigenvector of A is also an eigenvector of A^2

True; if Ax = λx with x ≠ 0, then A^2x = A(λx) = λ^2x, so x is also an eigenvector of A^2.
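
A quick numeric check of this fact with numpy (illustrative only):

import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
x = np.array([1.0, 1.0])   # eigenvector of A: Ax = 3x
print(A @ x)               # [3. 3.]
print(A @ (A @ x))         # [9. 9.] = 3^2 x, so x is an eigenvector of A^2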

A matrix A is not invertible if and only if 0 is an eigenvalue of A.

True, by the Invertible Matrix Theorem: A is not invertible exactly when Ax = 0 has a nontrivial solution, i.e. when 0 is an eigenvalue.

(detA)(detB) = det AB

True; det(AB) = (det A)(det B) is the multiplicative property of determinants.
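
A one-line numeric sanity check with numpy (illustrative; the matrices are random):

import numpy as np

rng = np.random.default_rng(1)
A, B = rng.random((3, 3)), rng.random((3, 3))
# det(AB) agrees with (det A)(det B) up to roundoff:
print(np.isclose(np.linalg.det(A) * np.linalg.det(B), np.linalg.det(A @ B)))   # True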

A nonzero vector cannot correspond to two different eigenvalues of A

True; if Ax = λ1x and Ax = λ2x with x ≠ 0, subtracting gives (λ1 − λ2)x = 0, so λ1 = λ2.

If a 5x5 matrix A has fewer than 5 distinct eigenvalues, then A is not diagonalizable

False; A can still be diagonalizable provided it has 5 linearly independent eigenvectors. The 5x5 identity matrix has only one distinct eigenvalue yet is diagonal.
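
The 5x5 identity matrix makes this concrete; a numpy sketch (illustrative only):

import numpy as np

A = np.eye(5)                          # one distinct eigenvalue: 1
vals, vecs = np.linalg.eig(A)
print(vals)                            # [1. 1. 1. 1. 1.]
print(np.linalg.matrix_rank(vecs))     # 5 independent eigenvectors, so A is diagonalizable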

If dimV = n and S is a linearly independent set in V, then S is a basis for V

False; S must also contain exactly n vectors (equivalently, span V). A linearly independent set with fewer than n vectors is not a basis.

Row operations preserve the linear dependence relations among the rows of A.

False; row operations preserve the row space but can change the linear dependence relations among the rows.

A is diagonalizable if A = PDP^-1 for some matrix D and some invertible matrix P

False; D must be a diagonal matrix. With D arbitrary the factorization says nothing (take P = I and D = A).

If B = {b1,...,bn} and C = {c1,...,cn} are bases for a vector space V, then the jth column of the change-of-coordinates matrix P(C<-B) is the coordinate vector [cj]B

False; the jth column of P(C<-B) is [bj]C, the C-coordinate vector of bj.

The vector spaces P3 and R3 are isomorphic.

False; dim P3 = 4, so P3 is isomorphic to R^4, not R^3.

If dimV = n and if S spans V, then S is a basis of V.

False; S could contain more than n vectors and fail to be linearly independent. (By the Basis Theorem, a spanning set of exactly n vectors is a basis.)

If V = R^2, B = {b1,b2}, and C = {c1,c2}, then row reduction of [c1 c2 b1 b2] to [I P] produces a matrix P that satisfies [x]B = P[x]C for all x in V.

False; this row reduction produces P = P(C<-B), which satisfies [x]C = P[x]B, not [x]B = P[x]C.

Two eigenvectors corresponding to the same eigenvalue are always linearly dependent.

False; an eigenspace can have dimension greater than 1. For the 2x2 identity matrix, e1 and e2 are linearly independent eigenvectors for the same eigenvalue 1.

A (square) matrix A is invertible if and only if there is a coordinate system in which the transformation x-> Ax is represented by a diagonal matrix

False; invertibility and diagonalizability are independent: a diagonal matrix with a 0 on the diagonal is represented by a diagonal matrix yet is not invertible, and an invertible matrix need not be diagonalizable.

If A is diagonalizable, then A has n distinct eigenvalues.

False; a diagonalizable matrix can have repeated eigenvalues; what is required is n linearly independent eigenvectors. The identity matrix is diagonal with a single distinct eigenvalue.

det A^T = (-1)detA

False, detA^T = det A

The dimension of the vector space P4 is 4.

False; dim P4 = 5 (a basis is {1, t, t^2, t^3, t^4}).

If A is mxn and rankA = m, then the linear transformation x -> Ax is one-to-one

False; rank A = m means the transformation is onto R^m. It is one-to-one only when rank A = n; if n > m there are free variables, so the map is not one-to-one.

The eigenvalues of an upper triangular matrix A are exactly the nonzero entries of the diagonal of A

False; the eigenvalues are all the diagonal entries, including any zeros.

If an mxn matrix A is row equivalent to an echelon matrix U and if U has k nonzero rows, then the dimension of the solution space of Ax = 0 is m-k

False; it is n − k, the number of free variables in Ax = 0.

The rank of a matrix equals the number of nonzero rows

False; it is the number of nonzero rows in an echelon form of the matrix, not in the original matrix.

A is diagonalizable if and only if A has n eigenvalues, counting multiplicities.

False; A must have n linearly independent eigenvectors. Every characteristic polynomial has n roots counting multiplicities, yet not every matrix is diagonalizable.

An nxn matrix with n linearly independent eigenvectors is invertible

False; one of the eigenvalues could be 0, making A singular.

If v1 and v2 are linearly independent eigenvectors, then they correspond to distinct eigenvalues.

False; two linearly independent eigenvectors can share an eigenvalue when its eigenspace has dimension at least 2.

If B is any echelon form of A, and if B has three nonzero rows, then the first three rows of A form a basis for RowA

False; row operations do not preserve the dependence relations among rows, so the first three rows of A need not be independent. (The three nonzero rows of B do form a basis for Row A.)

To find the eigenvalues of A, reduce A to echelon form.

False; row reduction does not preserve eigenvalues. Solve the characteristic equation det(A − λI) = 0 instead.
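
A numpy sketch of why row reduction fails here (illustrative only): the pivots of an echelon form are not the eigenvalues of A.

import numpy as np

A = np.array([[2.0, 1.0],
              [4.0, 5.0]])
print(np.linalg.eigvals(A))    # 1 and 6, the roots of det(A - λI) = λ^2 - 7λ + 6
U = np.array([[2.0, 1.0],
              [0.0, 3.0]])     # echelon form of A (R2 <- R2 - 2*R1)
print(np.linalg.eigvals(U))    # 2 and 3: the pivots, not the eigenvalues of A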

If A is 3x3, with columns of a1, a2, and a3, then detA equals the volume of the parallelepiped determined by a1, a2, and a3

False; the volume is |det A|; the determinant itself can be negative.

R^2 is a two dimensional subspace of R^3

False; R^2 is not even a subset of R^3 (its vectors have two entries, not three), so it cannot be a subspace of R^3.

Eigenvalues must be nonzero scalars.

False; 0 is allowed as an eigenvalue (exactly when A is not invertible). It is eigenvectors that must be nonzero.

If λ+5 is a factor of the characteristic polynomial of A, then 5 is an eigenvalue of A.

False; λ + 5 = 0 gives λ = −5, so −5 is the eigenvalue.

A is diagonalizable if A has n eigenvectors

False; the n eigenvectors must be linearly independent.

If A is invertible, then A is diagonalizable.

False; the two properties are not directly related: an invertible matrix need not be diagonalizable, and the zero matrix is diagonalizable but not invertible.

The columns of the change-of-coordinates matrix P(C<-B) are B-coordinate vectors of the vectors in C.

False; they are the C-coordinate vectors of the vectors in B.

Row operations on a matrix can change the null space.

False; row operations preserve the solution set of Ax = 0, so the null space is unchanged (it is the row dependence relations that can change).

A row replacement operation on A does not change the eigenvalues

False; row replacement leaves the determinant unchanged, but it generally changes the characteristic polynomial and hence the eigenvalues.
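
A numpy sketch (illustrative only): one row replacement keeps det A but moves the eigenvalues.

import numpy as np

A = np.array([[0.0, 1.0],
              [1.0, 0.0]])
B = A.copy()
B[1] = B[1] + B[0]                            # row replacement: R2 <- R2 + R1
print(np.linalg.det(A), np.linalg.det(B))     # both -1 (up to roundoff)
print(np.linalg.eigvals(A))                   # 1 and -1
print(np.linalg.eigvals(B))                   # (1 ± sqrt(5))/2: the eigenvalues changed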

The eigenvalues of a matrix are on its main diagonal.

False; that holds only for triangular matrices.

The determinant of A is the product of the diagonal entries in A

False; that formula holds for triangular matrices of any size, not for matrices in general.

Each eigenvalue of A is also an eigenvalue of A^2

False; if λ is an eigenvalue of A then λ^2 is an eigenvalue of A^2, and λ^2 = λ only when λ = 0 or λ = 1.

An elementary row operation on A does not change the determinant.

False; a row replacement leaves det A unchanged, but a row interchange multiplies it by −1 and scaling a row by c multiplies it by c.

If A is diagonalizable, then the columns of A are linearly independent.

False; a diagonalizable matrix can be singular (the zero matrix, for instance), and then its columns are linearly dependent.

If A is row equivalent to the identity matrix I, then A is diagonalizable

False; row equivalence to I makes A invertible, not diagonalizable. For example, [[1,1],[0,1]] is invertible, but its single eigenvalue 1 has a one-dimensional eigenspace, so its eigenvectors cannot form a basis for R^2.
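
A sympy check of this counterexample (illustrative only):

from sympy import Matrix, eye

A = Matrix([[1, 1],
            [0, 1]])
print(A.rref()[0] == eye(2))     # True: A is row equivalent to I, hence invertible
print(A.eigenvects())            # eigenvalue 1 has a one-dimensional eigenspace
print(A.is_diagonalizable())     # False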

If PB is the change-of-coordinates matrix, then [x]B = PBx, for x in V.

False; PB converts B-coordinates to standard coordinates: x = PB[x]B.

If Ax = λx for some vector x, then λ is an eigenvalue of A.

False; the equation must hold for some nonzero vector x.

If Ax = λx for some scalar λ, then x is an eigenvector of A.

False; x must be a nonzero vector satisfying Ax = λx.

A steady-state vector for a stochastic matrix is actually an eigenvector.

True; a steady-state vector q satisfies Pq = q, so it is an eigenvector for the eigenvalue 1.

If V = R^n and C is the standard basis for V, then P(C<-B) is the same as the change-of-coordinates matrix PB introduced in Section 4.4

True; when C is the standard basis, the C-coordinate vector of each bj is bj itself, so P(C<-B) = [b1 ... bn] = PB.

In some cases, a plane in R3 can be isomorphic to R2.

True; a plane through the origin is a two-dimensional subspace of R^3, and every two-dimensional vector space is isomorphic to R^2.

On a computer, row operations can change the apparent rank of a matrix.

True; roundoff error can turn entries that should be exactly zero into tiny nonzero numbers (or hide genuine dependence), changing the rank the computation appears to find.
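
A numpy illustration (the "noise" below is a stand-in for the roundoff that real row operations would accumulate):

import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0]])                  # exact rank 1
print(np.linalg.matrix_rank(A))             # 1
noise = 1e-13 * np.ones((2, 2))             # simulated floating-point error
print(np.linalg.matrix_rank(A + noise))     # 2: tiny errors raise the apparent rank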

The only three-dimensional subspace of R^3 is R^3 itself.

True; three linearly independent vectors in R^3 span R^3, so such a subspace must be all of R^3.

If B is the standard basis for Rn, then the B-coordinate vector of an x in Rn is x itself

True, [x]B = x

If dimV = p and SpanS = V, then S cannot be linearly dependent.

False; a set that spans V can still be linearly dependent: {e1, e2, e1 + e2} spans R^2 but is dependent. (By the Basis Theorem, a spanning set of exactly p vectors is a basis, hence independent.)

If A is an nxn diagonalizable matrix, then each vector in R^n can be written as a linear combination of eigenvectors of A

True; a diagonalizable nxn matrix has n linearly independent eigenvectors, which form a basis for R^n.

If A contains a row or column of zeros, then 0 is an eigenvalue of A

True; such a matrix is not invertible, so 0 is an eigenvalue by the Invertible Matrix Theorem.

If A and B are row equivalent, then their row spaces are the same

True; each matrix's rows are linear combinations of the other's, so the row spaces coincide.

If R^n has a basis of eigenvectors of A, then A is diagonalizable.

True, by the Diagonalization Theorem.

The dimension of the null space of A is the number of columns of A that are not pivot columns

True; each non-pivot column corresponds to a free variable in Ax = 0, and dim Nul A equals the number of free variables.

The multiplicity of a root r of the characteristic equation of A is called the algebraic multiplicity of r as an eigenvalue of A

True, by definition

The number of pivot columns of a matrix equals the dimension of its column space.

True; the pivot columns of A form a basis for Col A.

The row space of A is the same as the column space of A^T

True, by definition

The row space of A^T is the same as the column space of A

True, by definition

Row operations on a matrix A can change the linear dependence relations among the rows of A.

True; row operations preserve the row space but not the dependence relations among the rows.

If each vector ej in the standard basis for R^n is an eigenvector of A, then A is a diagonal matrix

True; Aej is the jth column of A, so Aej = λjej forces every off-diagonal entry of that column to be 0.

Each eigenvector of an invertible matrix A is also an eigenvector of A^-1

True; if Ax = λx with x ≠ 0, then λ ≠ 0 because A is invertible, and A^-1x = (1/λ)x.

If A is invertible and 1 is an eigenvalue for A, then 1 is also an eigenvalue of A^-1

True; Ax = x with x ≠ 0 gives A^-1x = x, so 1 is an eigenvalue of A^-1.

If a set {v1, ..., vp} spans a finite-dimensional vector space V and if T is a set of more than p vectors in V, then T is linearly dependent.

True; in a space spanned by p vectors, any set of more than p vectors is linearly dependent.

If matrices A and B have the same reduced echelon form, then Row A = Row B

True; both matrices share the row space of their common reduced echelon form, so Row A = Row B.

If A is mxn and the linear transformation x-> Ax is onto, then rankA = m

True; onto means Col A = R^m, so rank A = dim Col A = m.

If H is a subspace of R^3, then there is a 3x3 matrix A such that H = ColA

True; take a spanning set for H as the columns of A, padding with zero columns to fill out a 3x3 matrix; then Col A = H.

If AP = PD, with D diagonal, then the nonzero columns of P must be eigenvectors of A.

True; comparing columns in AP = PD gives Apj = djjpj for each j, so every nonzero column of P is an eigenvector of A (no invertibility of P is needed).

A change-of-coordinates matrix is always invertible

True; its columns are the coordinate vectors of a basis, hence linearly independent, so this square matrix is invertible.

If A and B are invertible nxn matrices, then AB is similar to BA

True; BA = A^-1(AB)A, so AB and BA are similar.
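
A numpy check of the similarity BA = A^-1(AB)A (illustrative; the matrices are random, shifted to keep A well-conditioned):

import numpy as np

rng = np.random.default_rng(0)
A = rng.random((3, 3)) + 3 * np.eye(3)
B = rng.random((3, 3)) + 3 * np.eye(3)
print(np.allclose(np.linalg.inv(A) @ (A @ B) @ A, B @ A))   # True: AB is similar to BA
print(np.allclose(np.sort(np.linalg.eigvals(A @ B)),
                  np.sort(np.linalg.eigvals(B @ A))))       # True: same eigenvalues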

If A is similar to a diagonalizable matrix B, then A is also diagonalizable.

True; if B = QDQ^-1 with D diagonal and A = PBP^-1, then A = (PQ)D(PQ)^-1.

If B is obtained from a matrix A by several elementary row operations, then rank B = rank A

True; row operations change neither the row space nor the solution set of Ax = 0, so the rank is unchanged.

If x is in V and if B contains n vectors, then the B coordinate vector of x is in R^n

True; the B-coordinates of x are the n weights in x = c1b1 + ... + cnbn, so [x]B lies in R^n.

The dimensions of the row space and the column space of A are the same, even if A is not square.

True; both equal rank A.

The columns of P(C<-B) are linearly independent.

True; they are the coordinate vectors of the linearly independent set B, and the coordinate mapping preserves linear independence.

The matrices A and A^T have the same eigenvalues, counting multiplicities

True; (A − λI)^T = A^T − λI and det M^T = det M, so A and A^T have the same characteristic polynomial.

Eigenvectors must be nonzero vectors

True; by definition an eigenvector must be nonzero: the zero vector satisfies Ax = λx for every λ, so it is excluded.

Similar matrices always have exactly the same eigenvalues

True; similar matrices have the same characteristic polynomial, hence the same eigenvalues.

There exists a 2x2 matrix that has no eigenvectors in R^2

True; a rotation of R^2 through an angle other than 0 or π sends no nonzero vector to a multiple of itself, so it has no (real) eigenvectors.
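
A numpy check with a 90-degree rotation (illustrative only):

import numpy as np

theta = np.pi / 2
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # rotation of R^2 by 90 degrees
print(np.linalg.eigvals(R))   # approximately ±i: no real eigenvalues, so no eigenvectors in R^2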

Finding an eigenvector of A may be difficult, but checking whether a given vector is in fact an eigenvector is easy.

True; compute Ax and check whether it is a scalar multiple of x.

A number c is an eigenvalue of A if and only if the equation (A-cI)x=0 has a nontrivial solution.

True; a nontrivial solution of (A − cI)x = 0 is precisely an eigenvector for the eigenvalue c, and such a solution exists exactly when det(A − cI) = 0.

An eigenspace of A is a null space of a certain matrix.

True; the eigenspace for λ is the null space of A − λI.
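
A short scipy/numpy sketch (illustrative only): computing an eigenspace as a null space.

import numpy as np
from scipy.linalg import null_space

A = np.array([[2.0, 0.0, 0.0],
              [1.0, 2.0, 0.0],
              [0.0, 0.0, 3.0]])
lam = 2.0
E = null_space(A - lam * np.eye(3))   # orthonormal basis for the eigenspace of lam = 2
print(E)                              # spans {e2}
print(np.allclose(A @ E, lam * E))    # True: null vectors of A - 2I are eigenvectors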

