Matrices

AB=BA

FALSE

Any six vectors in R4 must span R4

FALSE

Every vector space has a unique basis.

FALSE

If A is an nxm matrix and b is an arbitrary vector in Rn, then the set of vectors x in Rm such that Ax=b is a subspace of Rm.

FALSE. Unless b=0, the solution set does not contain the zero vector (A*0=0, not b), so it is not a subspace.

There exists a vector space consisting of exactly 100 vectors.

FALSE

det(5A)=5det(A)

FALSE

If A, B are nxn symmetric matrices, then AB must be a symmetric matrix.

FALSE. transpose(A)=A and transpose(B)=B, so transpose(AB)=transpose(B)*transpose(A)=BA, which is not necessarily equal to AB.
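
A quick numerical check of this counterexample pattern (a minimal numpy sketch; the particular symmetric matrices are illustrative choices, not from the original card):

    import numpy as np

    A = np.array([[1, 2],
                  [2, 3]])          # symmetric
    B = np.array([[0, 1],
                  [1, 0]])          # symmetric
    AB = A @ B
    print(AB)                       # [[2 1], [3 2]]
    print(np.allclose(AB, AB.T))    # False: the product is not symmetric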

If k is a scalar and u is a vector, then ||ku||=k||u||

FALSE. The correct identity is ||ku||=|k|*||u||; it fails when k is negative.

There exists a 3x3 matrix A such that kerA=imageA

FALSE. By rank-nullity, dim(kerA)+dim(imageA)=3, so kerA=imageA would force 2*dim(kerA)=3, which is impossible because 3 is odd.

The columns of the matrix A are linearly independent if the equation Ax=0 has the trivial solution.

FALSE, but it would be true if it said Ax=0 has ONLY the trivial solution; Ax=0 always has the trivial solution.

For a square matrix A, vectors in ColA are orthogonal to vectors in NulA

FALSE. Vectors in RowA, not ColA, are orthogonal to vectors in NulA; ColA is orthogonal to Nul(transpose(A)).

If AB=0, then A=0 or B=0

FALSE. Nonzero matrices can have a zero product, e.g. A=B=[0 1; 0 0] gives AB=0.

If A and B are nxn matrices, then (A+B)(A-B)=A^2-B^2

FALSE. (A+B)(A-B)=A^2-AB+BA-B^2, and AB does NOT necessarily equal BA.
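
One way to see the failure concretely (a numpy sketch; the matrices are arbitrary illustrative choices):

    import numpy as np

    A = np.array([[1, 2], [3, 4]])
    B = np.array([[0, 1], [1, 0]])
    lhs = (A + B) @ (A - B)
    rhs = A @ A - B @ B
    print(np.allclose(lhs, rhs))    # False, because AB != BA for these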

If A and B are both 3x3 matrices, (AB)^T=A^TB^T

FALSE. (AB)^T=B^TA^T; the order reverses.

For matrix A,B,C, if AB=C and C has 2 columns, then A has 2 columns.

FALSE. If C has 2 columns, then B has 2 columns; A has the same number of rows as C.

If x is a nontrivial solution of Ax=0, then every entry in x is nonzero.

FALSE. A nontrivial solution needs only at least 1 nonzero entry; the other entries may be zero.

If A is row equivalent to I, then A is diagonalizable.

FALSE. A=[1 1; 0 1] is row equivalent to I but is not diagonalizable, because its eigenvectors do not span R^2.
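
The card's own counterexample can be checked numerically (a numpy sketch; the rank test is one way, under floating-point tolerance, to see that the eigenvectors span only a line):

    import numpy as np

    A = np.array([[1.0, 1.0],
                  [0.0, 1.0]])
    vals, vecs = np.linalg.eig(A)
    print(vals)                         # [1. 1.]: a repeated eigenvalue
    print(np.linalg.matrix_rank(vecs))  # 1: eigenvectors do not span R^2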

If A and B are 2x2 matrices such that AB=0, then BA=0

FALSE. AB does not necessarily equal BA.

If A is diagonalizable, then the columns of A are linearly independent.

FALSE. Any diagonalizable matrix that has an eigenvalue equal to 0 is not invertible, and thus has linearly dependent columns.

An nxn matrix with linearly independent eigenvectors is invertible.

FALSE. A diagonalizable matrix with an eigenvalue of 0 is not invertible, yet it still has n linearly independent eigenvectors because it is diagonalizable.
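
A minimal sketch of such a matrix (using numpy; diag(0, 1) is an illustrative choice):

    import numpy as np

    A = np.diag([0.0, 1.0])          # diagonal, hence diagonalizable
    print(np.linalg.matrix_rank(A))  # 1: A is not invertible
    # yet e1 and e2 are linearly independent eigenvectors
    # (for lambda = 0 and lambda = 1 respectively)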

A square matrix is invertible IFF there is a coordinate system in which the transformation x->Ax is represented by a diagonal matrix.

FALSE. Diagonal matrices are not necessarily invertible, so such a coordinate system can exist for a noninvertible matrix.

Each eigenvalue of A is also an eigenvalue of A^2.

FALSE. The eigenvalues of A=[-1 0; 0 -1] are -1, but A^2=I and its eigenvalues are 1; in general the eigenvalues of A^2 are the squares of the eigenvalues of A.
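
The corrected counterexample, checked numerically (a numpy sketch):

    import numpy as np

    A = np.array([[-1.0,  0.0],
                  [ 0.0, -1.0]])
    print(np.linalg.eigvals(A))      # [-1. -1.]
    print(np.linalg.eigvals(A @ A))  # [1. 1.]: the squares, not the originals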

If W is a subspace of Rn, then W and W(orthogonal) have no vectors in common.

FALSE. Every subspace contains the zero vector, so W and W(orthogonal) share at least the zero vector.

Two eigenvectors corresponding to the same eigenvalue are always linearly dependent.

FALSE. For the identity matrix I, every nonzero vector is an eigenvector for lambda=1, so two linearly independent vectors can correspond to the same eigenvalue.

Similar matrices always have the exact same eigenvectors.

FALSE. If A=PB(inv(P)) and Ax=lambda*x, then PB(inv(P))x=lambda*x; multiplying both sides by inv(P) gives B(inv(P)x)=lambda*(inv(P)x). Thus inv(P)x, not necessarily x itself, is an eigenvector of B.
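
A concrete check of how the eigenvector transforms (a numpy sketch; B, P, and the eigenvalue 5 are illustrative choices):

    import numpy as np

    B = np.diag([2.0, 5.0])
    P = np.array([[1.0, 1.0],
                  [0.0, 1.0]])
    Pinv = np.linalg.inv(P)
    A = P @ B @ Pinv                    # A is similar to B

    x = np.array([1.0, 1.0])            # eigenvector of A for lambda = 5
    print(np.allclose(A @ x, 5 * x))    # True
    print(np.allclose(B @ x, 5 * x))    # False: x is not an eigenvector of B
    y = Pinv @ x
    print(np.allclose(B @ y, 5 * y))    # True: inv(P)x is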

For matrix A,B, if AB=I, then A is invertible.

FALSE. A need not be square (a 2x3 matrix A can satisfy AB=I for some 3x2 B); A is invertible when A and B are square and AB=BA=I.

The sum of two eigenvectors of a matrix A is also an eigenvector of a matrix A.

FALSE. If the eigenvectors correspond to different eigenvalues, then their sum is NOT an eigenvector of A.
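
A minimal counterexample in code (a numpy sketch; diag(1, 2) is an illustrative choice):

    import numpy as np

    A = np.diag([1.0, 2.0])
    u = np.array([1.0, 0.0])   # eigenvector for lambda = 1
    v = np.array([0.0, 1.0])   # eigenvector for lambda = 2
    w = u + v
    print(A @ w)               # [1. 2.], not a scalar multiple of w = [1. 1.]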

The homogeneous equation Ax=0 has only the trivial solution if and only if detA=0

FALSE. Ax=0 has only the trivial solution if and only if detA does NOT equal zero.

The orthogonal projection of y onto u is a scalar multiple of y.

FALSE. It is a scalar multiple of u, not y.

Let V be a vector space. Every subset of V is a subspace of V.

FALSE. A subset must satisfy the 3 subspace conditions (contains the zero vector, closed under addition, closed under scalar multiplication); every subspace is a subset, but not every subset is a subspace.

Eigenvalues must be nonzero scalars.

FALSE. Noninvertible matrices have 0 as an eigenvalue.

det(AB)=detA*detB if AB is well defined

FALSE. det(AB)=detA*detB holds only when A and B are square; determinants are not defined for non-square matrices.

If two vectors are orthogonal, they are linearly independent.

FALSE. Orthogonal NONZERO vectors are linearly independent.

A square matrix with orthogonal columns is an orthogonal matrix.

FALSE. Orthogonal matrix is square with orthonormal columns.

If A is an nxn matrix, then det(A)^(-1)=1/det(A)

FALSE. Remember det(A) can equal zero.

The eigenvalues of an upper triangular matrix A are exactly the nonzero entries on the diagonal of A.

FALSE. The eigenvalues of an upper triangular matrix are exactly ALL the entries on its diagonal.

The homogeneous equation Ax=0 has the trivial solution iff the equation has at least 1 free variable

FALSE. The homogeneous equation Ax=0 ALWAYS has the trivial solution x=0; it has NONtrivial solutions iff there is at least 1 free variable.

If a 5x5 matrix A has fewer than 5 distinct eigenvalues, then A is not diagonalizable.

FALSE. The identity matrix I of order 5 has only one eigenvalue, 1, but it is diagonalizable.

The length of every vector is a positive number.

FALSE. The length of the zero vector is 0.

If a matrix U has orthonormal columns, then U(transpose(U))=I

FALSE. Orthonormal columns give transpose(U)*U=I; U*transpose(U)=I holds only when U is square.
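
A quick non-square counterexample (a numpy sketch; the 3x2 matrix below is an illustrative choice):

    import numpy as np

    U = np.array([[1.0, 0.0],
                  [0.0, 1.0],
                  [0.0, 0.0]])              # orthonormal columns
    print(np.allclose(U.T @ U, np.eye(2)))  # True
    print(np.allclose(U @ U.T, np.eye(3)))  # False: U is not square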

The set of integers is a subspace of R.

FALSE. The integers are not closed under scalar multiplication; the scalar k does not have to be an integer, e.g. (1/2)*1=1/2.

If A is a 4x3 matrix, then Lv=Av is a linear transformation from R4 to R3.

FALSE. v must be in R3 and Av is in R4, so L is a linear transformation from R3 to R4.

If r is any scalar, ||rv||=r||v||.

FALSE. ||rv||=|r|*||v||.

Every linear transformation has a kernel space

TRUE

If A is a 4x4 matrix with detA=4, then rankA=4

TRUE

If A is invertible, then inv(inv(A))=A

TRUE

If A, B are similar matrices, detA=detB

TRUE

If ||u+v||^2=||u-v||^2, then u and v are orthogonal.

TRUE

If the product of two matrices A and B is invertible, then A must be invertible as well

TRUE. det(AB)=det(A)det(B); since det(AB) is nonzero, det(A) is nonzero, so A is invertible.

Similar matrices always have the exact same eigenvalues.

TRUE, because they have the same characteristic polynomial: if B=QA(inv(Q)), then det(B-lambda*I)=det(Q(A-lambda*I)inv(Q))=det(A-lambda*I).

If A,B are similar matrices, then they have the same trace

TRUE because they have the same eigenvalues.

Eigenvectors must be nonzero vectors.

TRUE by definition.

If W is a subspace, then ||projWv||^2+||v-projWv||^2=||v||^2

TRUE by orthogonal decomposition theorem, these vectors are orthogonal so the stated equality follows from the Pythagorean Theorem.

There exists a 2x2 matrix that has no eigenvectors in R2

TRUE. Some real matrices have only complex eigenvalues and eigenvectors; a rotation of the plane is the standard example.
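
The rotation example, checked numerically (a numpy sketch):

    import numpy as np

    A = np.array([[0.0, -1.0],
                  [1.0,  0.0]])     # 90-degree rotation of the plane
    print(np.linalg.eigvals(A))     # a purely imaginary pair (+i, -i):
                                    # no real eigenvectors exist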

The matrices A and transpose(A) have the same eigenvalues, counting multiplicities.

TRUE, because they have the same characteristic polynomial.

A vector v and its negative -v have equal lengths.

TRUE.

For an mxn matrix A, vectors in the null space of A are orthogonal to vectors in the row space of A.

TRUE.

If a 10x10 matrix A has 6 distinct eigenvalues, then the rankA must be at least 5.

TRUE. At least 5 of the eigenvalues are nonzero; their eigenvectors are linearly independent and lie in ColA (v=A(v/lambda)), so rankA is at least 5.

If a vector y coincides with its orthogonal projection onto a subspace W, then y is in W.

TRUE.

If a vector space has a basis B={b1,b2,b3,b4,b5}, then the number of vectors in every basis is 5.

TRUE.

Suppose that A is a square matrix such that det(A^100)=0. Then A cannot be invertible.

TRUE. det(A^100)=(detA)^100=0 forces detA=0, so A is not invertible.

Suppose that A is a square matrix such that det(A^4)=0, then A cannot be invertible.

TRUE, by the same argument: det(A^4)=(detA)^4=0 forces detA=0.

The null space of matrix A is the set of all solutions to Ax=0

TRUE.

The set of all vectors in Rn orthogonal to one fixed vector is a subspace of Rn.

TRUE.

The equation Ax=b is homogeneous if the zero vector is a solution.

TRUE. If the zero vector is a solution, then b=A*0=0, so the equation is Ax=0, which is homogeneous.

If A and B are invertible nxn matrices, then AB is similar to BA

TRUE. AB=A*B*(A*inv(A))=A(BA)inv(A), so AB is similar to BA via P=A.
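
The similarity identity can be verified numerically (a numpy sketch; random Gaussian matrices are generically invertible, which mirrors the card's hypothesis):

    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.standard_normal((3, 3))
    B = rng.standard_normal((3, 3))
    sim = A @ (B @ A) @ np.linalg.inv(A)
    print(np.allclose(sim, A @ B))   # True: AB = A(BA)inv(A)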

If A is invertible and 1 is an eigenvalue for A, then 1 is also an eigenvalue of inv(A).

TRUE. The eigenvalues of inv(A) are the reciprocals 1/lambda of the eigenvalues of A, and 1/1=1.

If A is similar to a diagonalizable matrix B, then A is also diagonalizable.

TRUE. B=PD(inv(P)) with D diagonal; if A=QB(inv(Q)), then A=(QP)D(inv(QP)), so A is diagonalizable.

The distance between u and v is ||u-v||

TRUE. Definition of distance.

A nonzero vector cannot correspond to 2 different eigenvalues of A.

TRUE. Different eigenvalues have linearly independent eigenvectors.

If each vector ej in the standard basis for Rn is an eigenvector of A, then A is a diagonal matrix.

TRUE. Ae_j=lambda_j*e_j is the jth column of A, so every off-diagonal entry is zero and the eigenvalues are the diagonal entries of the matrix.

Let A be a square matrix such that transpose(A)*A=I, then detA=1 or detA=-1

TRUE. det(transpose(A)*A)=det(transpose(A))*det(A)=det(A)^2=det(I)=1, so detA=1 or detA=-1.
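
A reflection gives the detA=-1 case (a numpy sketch; the matrix is an illustrative choice):

    import numpy as np

    A = np.array([[0.0, 1.0],
                  [1.0, 0.0]])              # a reflection
    print(np.allclose(A.T @ A, np.eye(2)))  # True
    print(round(np.linalg.det(A)))          # -1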

If A is an nxn diagonalizable matrix, then each vector in Rn can be written as a linear combo of the eigenvectors of A.

TRUE. If A is diagonalizable, then it has n linearly independent eigenvectors, which form a basis for Rn, so any vector in Rn is a linear combo of these eigenvectors.
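
Expanding a vector in the eigenbasis (a numpy sketch; the symmetric matrix and the vector are illustrative choices):

    import numpy as np

    A = np.array([[2.0, 1.0],
                  [1.0, 2.0]])     # diagonalizable (symmetric)
    vals, V = np.linalg.eig(A)     # columns of V are eigenvectors
    v = np.array([3.0, -1.0])      # an arbitrary vector in R^2
    c = np.linalg.solve(V, v)      # its coordinates in the eigenbasis
    print(np.allclose(V @ c, v))   # True: v = c1*v1 + c2*v2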

If x and y are linearly independent and if z is in the Span{x,y} then {x,y,z} is linearly dependent.

TRUE. z in Span{x,y} means z is a linear combination of x and y, so {x,y,z} is linearly dependent.

Each eigenvector of A is also an eigenvector of A^2.

TRUE. If x is an eigenvector of A, then Ax=lambda*x, so (A^2)x=A(lambda*x)=lambda(Ax)=(lambda^2)x.

Each eigenvector of an invertible matrix A is also an eigenvector of inv(A).

TRUE. If x is an eigenvector of an invertible matrix A, then Ax=lambda*x, so x=inv(A)*lambda*x and inv(A)x=(1/lambda)x; note lambda is nonzero because A is invertible.

If ||u-v||^2=||u||^2+||v||^2, then u and v are orthogonal.

TRUE. Expanding gives ||u-v||^2=||u||^2-2(u.v)+||v||^2, so the stated equality forces u.v=0.

If ||u+v||^2=||u||^2+||v||^2, then u and v are orthogonal.

TRUE. This is the pythagorean theorem.

If a square matrix has orthonormal columns, then it also has orthonormal rows.

TRUE. transpose(U)=inv(U), so U*transpose(U)=I as well, which says the rows are orthonormal.

If A, S are orthogonal matrices, then (inv(S))AS is also an orthogonal matrix.

TRUE. For an orthogonal matrix U, transpose(U)*U=I and inv(U)=transpose(U); inv(S)=transpose(S) is orthogonal, and a product of orthogonal matrices is orthogonal.

If {v1,v2,v3} is an orthogonal set and if c1,c2,c3 are scalars, then {c1v1,c2v2,c3v3} is an orthogonal set.

TRUE. (c_i*v_i).(c_j*v_j)=c_i*c_j*(v_i.v_j)=0 for i not equal to j, so the set stays orthogonal.

If A contains a row or column of zeros, then 0 is an eigenvalue of A.

TRUE. det(A)=0 with a row or column of zeros (singular). Thus, det(A-0*I)=0, which satisfies the characteristic equation det(A-lambda*I)=0.
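
A quick check (a numpy sketch; the matrix is an illustrative choice):

    import numpy as np

    A = np.array([[1.0, 2.0],
                  [0.0, 0.0]])    # row of zeros
    print(np.linalg.eigvals(A))   # one eigenvalue is 0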

Let B be an invertible matrix. det(BA*inv(B))=det(A) for any square matrix A of the same size.

TRUE. det(BA*inv(B))=det(B)*det(A)*det(inv(B))=det(A), since det(inv(B))=1/det(B).

If x is orthogonal to both u and v, then x must be orthogonal to u-v.

TRUE. x.u=0 and x.v=0, so x.(u-v)=x.u-x.v=0-0=0.

