Linear Algebra True or False

The length of every vector is a positive number

False. The zero vector has length zero, so not every vector's length is positive; lengths are nonnegative, not necessarily positive.

if A is nxn, then A and A^TA have the same singular values

false. The singular values of A are the square roots of the eigenvalues of A^TA. Since A^TA is symmetric positive semidefinite, its own singular values equal its eigenvalues, i.e. the SQUARES of the singular values of A. So the two sets agree only when every singular value of A is 0 or 1.
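A quick numeric sanity check (an illustrative sketch; A = diag(2, 3) is chosen so the singular values can be read off the diagonal directly):

```python
# Illustrative check with A = diag(2, 3): the singular values of a diagonal
# matrix with nonnegative entries are the diagonal entries themselves, so
# sigma(A) = {2, 3} while sigma(A^T A) = {4, 9}.
a_diag = [2.0, 3.0]                  # A = diag(2, 3)
ata_diag = [d * d for d in a_diag]   # A^T A = diag(4, 9)

sv_A = sorted(a_diag)                # singular values of A
sv_ATA = sorted(ata_diag)            # singular values of A^T A
same = sv_A == sv_ATA                # False: the two sets differ
```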

a square matrix with orthogonal columns is an orthogonal matrix

false. The statement would be true if the square matrix had ORTHONORMAL columns; orthogonal columns alone need not have unit length.

The sum of two eigenvectors of a matrix A is also an eigenvector of A.

false. If Av1 = λ1v1 and Av2 = λ2v2 with λ1 ≠ λ2, then A(v1 + v2) = λ1v1 + λ2v2, which is not a scalar multiple of v1 + v2, so the sum is not necessarily an eigenvector of A.

if W is a subspace of R^n, then W and W^⟂ have no vectors in common

false. The zero vector is in both W and W^⟂, since every subspace contains the zero vector.

If A is row equivalent to the identity matrix I, then A is diagonalizable

false. Row equivalence to I means A is invertible, but invertible matrices need not be diagonalizable. For example, A = [1 1; 0 1] is invertible, yet its only eigenvalue is 1 with a one-dimensional eigenspace, so its eigenvectors cannot form a basis for R^n.
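The counterexample above can be checked by hand; this pure-Python sketch (variable names are ours) verifies that A = [1 1; 0 1] is invertible yet has too small an eigenspace:

```python
# A = [[1, 1], [0, 1]] is invertible (det = 1, so row equivalent to I),
# but its only eigenvalue is 1 and the eigenspace for 1 is one-dimensional,
# so A is not diagonalizable.
A = [[1.0, 1.0],
     [0.0, 1.0]]

det_A = A[0][0] * A[1][1] - A[0][1] * A[1][0]   # 1.0 != 0, so A is invertible

# A - 1*I is already in echelon form here, so its rank is the number of
# nonzero rows; the eigenspace for lambda = 1 has dimension 2 - rank.
A_minus_I = [[A[0][0] - 1.0, A[0][1]],
             [A[1][0], A[1][1] - 1.0]]          # [[0, 1], [0, 0]]
rank = sum(1 for row in A_minus_I if any(abs(e) > 1e-12 for e in row))
eigenspace_dim = 2 - rank                        # 1, fewer than n = 2
```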

If a matrix U has orthonormal columns, then U^T*U = I.

true. If the columns u1, ..., un of U are orthonormal, then the (i, j) entry of U^TU is ui • uj, which equals 1 when i = j and 0 otherwise, so U^TU = I regardless of whether U is square. (It is UU^T = I that requires U to be square.)

if every coefficient in a quadratic form is positive, then the quadratic form is positive definite

false. All the coefficients of a quadratic form can be positive while the eigenvalues of its matrix are not all positive. For example, Q(x1, x2) = x1^2 + x2^2 + 4x1x2 has positive coefficients, but its matrix has eigenvalues 3 and -1, so the form is indefinite, not positive definite.
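A concrete check of this counterexample (the form Q below is an illustrative example chosen for this card):

```python
# Q(x1, x2) = x1^2 + x2^2 + 4*x1*x2 has all positive coefficients,
# yet it takes both positive and negative values, so it is indefinite.
def Q(x1, x2):
    return x1**2 + x2**2 + 4 * x1 * x2

positive_value = Q(1, 1)    # 6  > 0
negative_value = Q(1, -1)   # -2 < 0
```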

eigenvalues must be non-zero scalars

false. Zero is an eigenvalue of every singular square matrix, since Ax = 0x has a nontrivial solution exactly when A is not invertible.

if P is an nxn orthogonal matrix, then the change of variable x = Pu transforms x^TAx into a quadratic form whose matrix is P^-1AP

true. If x = Pu, then x^TAx = (Pu)^T A (Pu) = u^T P^T A P u = u^T (P^-1AP) u, since P^T = P^-1 for an orthogonal matrix P.

if A is orthogonally diagonalizable, then A is symmetric

true. If A is orthogonally diagonalizable, then A = PDP^T for some orthogonal P and diagonal D, and A^T = (PDP^T)^T = PD^TP^T = PDP^T = A, so A is symmetric.

The matrices A and A^T have the same eigenvalues, counting multiplicities

true. A and A^T have the same characteristic polynomial, because det(A^T - λI) = det((A - λI)^T) = det(A - λI), so they have the same eigenvalues with the same multiplicities.

if x is orthogonal to both u and v, then x must be orthogonal to u-v

true. Orthogonality gives x • u = 0 and x • v = 0, so x • (u - v) = x • u - x • v = 0, which shows that x and u - v are orthogonal.
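A quick check with arbitrarily chosen vectors in R^3 (the specific vectors are ours, picked for illustration):

```python
# x = (1, 1, 1) is orthogonal to u = (1, -1, 0) and v = (0, 1, -1);
# it is then orthogonal to u - v as well.
def dot(a, b):
    return sum(ai * bi for ai, bi in zip(a, b))

x = (1, 1, 1)
u = (1, -1, 0)
v = (0, 1, -1)

u_minus_v = tuple(ui - vi for ui, vi in zip(u, v))   # (1, -2, 1)
check = dot(x, u_minus_v)                            # 0
```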

a vector v and its negative, -v, have equal lengths

true. By the property ||cv|| = |c| ||v||, taking c = -1 gives ||-v|| = |-1| ||v|| = ||v||.

the distance between u and v is ||u-v||

true. The distance between u and v is defined as the length (norm) of their difference, ||u - v||.

Each eigenvector of A is also an eigenvector of A^2

true. If Ax = λx, then A^2x = A(λx) = λAx = λ^2x, so x is still an eigenvector of A^2; the eigenvector stays the same while the eigenvalue becomes λ^2.

If a vector y coincides with its orthogonal projection onto a subspace W, then y is in W

true. The orthogonal projection of any vector y onto W is always a vector in W, so if y equals its projection, y is in W.

the principal axes of a quadratic form x^TAx can be the columns of any matrix P that diagonalizes A

false. The principal axes of x^TAx are the columns of an ORTHOGONAL matrix P that diagonalizes A; a matrix that diagonalizes A need not have orthonormal columns, and then its columns are not principal axes.

If {v1, v2, v3} is an orthogonal set and if c1, c2 and c3 are scalars, then {c1v1, c2v2, c3v3} is an orthogonal set.

true. Since v1, v2, v3 form an orthogonal set, vi • vj = 0 for i ≠ j, and therefore (ci vi) • (cj vj) = ci cj (vi • vj) = 0, so the scaled vectors are still mutually orthogonal.

if ||u-v||^2 = ||u||^2 + ||v||^2, then u and v must be orthogonal

true. Expanding gives ||u-v||^2 = ||u||^2 + ||v||^2 - 2u • v, so the hypothesis forces 2u • v = 0, i.e. u • v = 0, which means u and v are orthogonal.

if ||u+v||^2 = ||u||^2 + ||v||^2, then u and v must be orthogonal

true. Expanding gives ||u+v||^2 = ||u||^2 + ||v||^2 + 2u • v, so the hypothesis forces 2u • v = 0, i.e. u • v = 0, which means u and v are orthogonal.
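A numeric check of the expansion with an arbitrarily chosen orthogonal pair u = (3, 4), v = (4, -3):

```python
# Verify ||u+v||^2 = ||u||^2 + ||v||^2 for an orthogonal pair (u . v = 0).
def dot(a, b):
    return sum(ai * bi for ai, bi in zip(a, b))

u = (3, 4)
v = (4, -3)
u_plus_v = tuple(ui + vi for ui, vi in zip(u, v))   # (7, 1)

lhs = dot(u_plus_v, u_plus_v)   # ||u+v||^2 = 50
rhs = dot(u, u) + dot(v, v)     # 25 + 25 = 50
```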

by a suitable change of variable, any quadratic form can be changed into one with no cross-product term

true. Any quadratic form can be written as x^TAx for some symmetric matrix A, and by the Principal Axes Theorem the orthogonal change of variable x = Py (with P orthogonally diagonalizing A) turns it into y^TDy, which has no cross-product term.

The set of all vectors in R^n orthogonal to one fixed vector is a subspace of R^n

true. Let v be the fixed vector; the set in question is {x : x • v = 0} = (Span{v})^⟂. It contains the zero vector and is closed under addition and scalar multiplication, since x • v = 0 and y • v = 0 imply (x + y) • v = 0 and (cx) • v = 0, so it is a subspace of R^n.

A nonzero vector cannot correspond to two different eigenvalues of A

true. If Ax = λ1x and Ax = λ2x for the same nonzero x, then (λ1 - λ2)x = 0, and since x ≠ 0 this forces λ1 = λ2.

The eigenvalues of an upper triangular matrix A are exactly the nonzero entries on the diagonal of A

false. The eigenvalues of an upper triangular matrix are exactly ALL the entries on its diagonal, including any zero entries.

similar matrices always have exactly the same eigenvectors

false. The eigenvectors of similar matrices are not necessarily the same: if B = P^-1AP and Ax = λx, then B(P^-1x) = λ(P^-1x), so the corresponding eigenvector of B is P^-1x, not x.

if r is any scalar, then ||rv|| = r||v||

false. The correct identity is ||rv|| = |r| ||v|| for any scalar r; without the absolute value it fails for negative r.

the largest value of a quadratic form x^TAx for ||x|| = 1, is the largest entry of the diagonal of A

false. For symmetric A, the largest value of the quadratic form x^TAx subject to ||x|| = 1 is the largest EIGENVALUE of A, not the largest diagonal entry.

If a 5x5 matrix A has fewer than 5 distinct eigenvalues, then A is not diagonalizable

false. A matrix can be diagonalizable with repeated eigenvalues; for example, the 5x5 identity matrix has only one distinct eigenvalue yet is already diagonal.

the maximum value of a positive definite quadratic form x^TAx is the greatest eigenvalue of A

false. A positive definite quadratic form is unbounded above; its maximum equals the greatest eigenvalue of A only subject to the constraint x^Tx = 1.

the orthogonal projection of y onto u is a scalar multiple of y

false. The orthogonal projection of y onto u is ((y • u)/(u • u)) u, a scalar multiple of u, NOT of y.

Two eigenvectors corresponding to the same eigenvalue are always linearly dependent.

false. An eigenvalue can have an eigenspace of dimension greater than one; for the identity matrix, e1 and e2 are both eigenvectors for the eigenvalue 1 and are linearly independent.

An indefinite quadratic form is one whose eigenvalues are not definite

false. An indefinite quadratic form is one whose matrix has both positive and negative eigenvalues, so the form takes both positive and negative values.

a positive definite quadratic form can be changed into a negative definite form by a suitable change of variable x = PU, for some orthogonal matrix P

false. Any orthogonal change of variable x = Pu changes a positive definite quadratic form into another positive definite quadratic form, since P^TAP has the same eigenvalues as A.

if A is an orthogonal matrix, then A is symmetric

false. For example, A = [0 -1; 1 0] is an orthogonal matrix, since its columns are orthonormal, but A is not symmetric (A^T ≠ A).
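The counterexample can be verified directly (pure-Python sketch; helper names are ours):

```python
# A = [[0, -1], [1, 0]] (rotation by 90 degrees): A^T A = I, so A is
# orthogonal, but A^T != A, so A is not symmetric.
A = [[0, -1],
     [1, 0]]

def transpose(M):
    return [list(col) for col in zip(*M)]

def matmul(M, N):
    return [[sum(M[i][k] * N[k][j] for k in range(len(N)))
             for j in range(len(N[0]))] for i in range(len(M))]

At = transpose(A)
AtA = matmul(At, A)                      # [[1, 0], [0, 1]]
is_orthogonal = AtA == [[1, 0], [0, 1]]  # True
is_symmetric = At == A                   # False
```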

An nxn matrix with n linearly independent eigenvectors is invertible

false. Having n linearly independent eigenvectors makes an nxn matrix diagonalizable, but not necessarily invertible, because one of its eigenvalues could be zero.
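A minimal check, using the already diagonal (hence diagonalizable) matrix A = diag(1, 0):

```python
# A = diag(1, 0) has two independent eigenvectors e1, e2 (it is diagonal,
# hence diagonalizable), but det A = 0, so it is not invertible.
A = [[1, 0],
     [0, 0]]
det_A = A[0][0] * A[1][1] - A[0][1] * A[1][0]   # 0
```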

if A is diagonalizable then the cols of A are linearly independent

false. If A is a diagonal matrix with a zero on the diagonal, then A is diagonalizable (it is already diagonal), but its columns are not linearly independent.

if two vectors are orthogonal, they are linearly independent

false. The zero vector is orthogonal to every vector, and any set containing it is linearly dependent. However, NONZERO orthogonal vectors are linearly independent.

if x^TAx > 0 for some x, then the quadratic form x^TAx is positive definite

false. The quadratic form is positive definite only if x^TAx > 0 for ALL x ≠ 0; a single positive value does not guarantee this.

if P is an n x n matrix with orthogonal columns, then P^T = P^-1

false. The columns of P must be orthonormal, not merely orthogonal, for P^T = P^-1 to hold.

if U is mxn with orthogonal columns, then UU^Tx is the orthogonal projection of x onto Col U

false. The columns of U must be orthonormal for UU^Tx to be the orthogonal projection of x onto Col U.

Each eigenvalue of A is also an eigenvalue of A^2

false. If λ is an eigenvalue of A, then λ^2 (not λ itself) is an eigenvalue of A^2. For example, A = 2I has eigenvalue 2, while A^2 = 4I has eigenvalue 4.
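A quick check with A = diag(2, 3), whose eigenvalues can be read off the diagonal:

```python
# Eigenvalues of A = diag(2, 3) are 2 and 3; eigenvalues of A^2 = diag(4, 9)
# are 4 and 9. Neither eigenvalue of A is an eigenvalue of A^2 here.
eig_A = [2, 3]
eig_A2 = [x * x for x in eig_A]              # [4, 9]
shared = [x for x in eig_A if x in eig_A2]   # []
```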

If A is an nxn diagonalizable matrix then each vector in R^n can be written as a linear combination of eigenvectors of A

true. If A is diagonalizable, then by the Diagonalization Theorem A has n linearly independent eigenvectors, and by the Basis Theorem those eigenvectors form a basis for R^n. Therefore each vector in R^n can be written as a linear combination of eigenvectors of A.

if a square matrix has orthonormal columns, then it also has orthonormal rows

true. If U has orthonormal columns, then U^TU = I. If U is also square, the Invertible Matrix Theorem shows U is invertible with U^-1 = U^T, so UU^T = I as well. Then the columns of U^T, which are the rows of U, are orthonormal.

If A contains a row or column of zeros, then 0 is an eigenvalue of A

true. If A contains a row or column of zeros, then A is not row equivalent to the identity matrix and therefore not invertible. By the Invertible Matrix Theorem, 0 is then an eigenvalue of A.

Eigenvectors must be nonzero vectors

true. By definition an eigenvector is a NONZERO vector x satisfying Ax = λx; if x = 0 were allowed, every scalar would count as an eigenvalue.

If A is similar to a diagonalizable matrix B, then A is also diagonalizable.

true. If A = P^-1BP and B = QDQ^-1 with D diagonal, then A = (Q^-1P)^-1 D (Q^-1P), so A is diagonalizable as well.

If each vector e in the standard basis for R^n is an eigenvector of A, then A is a diagonal matrix

true. If Aei = λiei for each standard basis vector ei, then the ith column of A is Aei = λiei, so every off-diagonal entry of A is zero and A is diagonal.

There exists a 2x2 matrix that has no eigenvectors in R2

true. The rotation matrix A = [0 -1; 1 0] has characteristic polynomial λ^2 + 1, which has no real roots, so A has no real eigenvalues and hence no eigenvectors in R^2.
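A check of this example via the discriminant of the characteristic polynomial:

```python
# For A = [[0, -1], [1, 0]], the characteristic polynomial is
# lambda^2 - trace*lambda + det = lambda^2 + 1, with negative discriminant,
# so A has no real eigenvalues and no eigenvectors in R^2.
A = [[0, -1],
     [1, 0]]
trace = A[0][0] + A[1][1]                      # 0
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]    # 1
discriminant = trace**2 - 4 * det              # -4 < 0
```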

if A is an orthogonal matrix, then ||Ax|| = ||x|| for all x in R^n

true. If A is orthogonal, then A^TA = I, so ||Ax||^2 = (Ax)^T(Ax) = x^TA^TAx = x^Tx = ||x||^2 for every x in R^n.
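A numeric spot check with the rotation matrix A = [0 -1; 1 0] (an arbitrarily chosen orthogonal matrix) and x = (3, 4):

```python
# An orthogonal matrix preserves lengths: here ||Ax|| = ||x|| = 5.
import math

A = [[0, -1],
     [1, 0]]
x = (3, 4)
Ax = (A[0][0] * x[0] + A[0][1] * x[1],
      A[1][0] * x[0] + A[1][1] * x[1])   # (-4, 3)

norm_x = math.hypot(*x)     # 5.0
norm_Ax = math.hypot(*Ax)   # 5.0
```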

Each eigenvector of an invertible matrix A is also an eigenvector of A^-1

true. If A is invertible and Ax = λx, then λ ≠ 0, and multiplying by A^-1 gives A^-1x = (1/λ)x, so the same x is an eigenvector of A^-1 (with eigenvalue 1/λ).

If A and B are invertible n × n matrices, then AB is similar to BA

true. Since B is invertible, B(AB)B^-1 = BA, so AB is similar to BA.
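A numeric spot check with arbitrarily chosen invertible 2x2 matrices (helper functions are ours):

```python
# Verify B (AB) B^{-1} = BA for a concrete pair of invertible matrices.
def matmul(M, N):
    return [[sum(M[i][k] * N[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def inverse2x2(M):
    d = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    return [[M[1][1] / d, -M[0][1] / d],
            [-M[1][0] / d, M[0][0] / d]]

A = [[1, 2], [3, 4]]
B = [[2, 0], [1, 1]]

AB = matmul(A, B)
BA = matmul(B, A)
similar_form = matmul(matmul(B, AB), inverse2x2(B))   # should equal BA

match = all(abs(similar_form[i][j] - BA[i][j]) < 1e-12
            for i in range(2) for j in range(2))
```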

if A is invertible and 1 is an eigenvalue for A, then 1 is also an eigenvalue of A^-1

true. By the definition of eigenvalue, Ax = 1x = x for some nonzero x. Multiplying by A^-1 gives x = A^-1x, i.e. A^-1x = 1x, and since x is nonzero, 1 is an eigenvalue of A^-1.

Similar matrices always have exactly the same eigenvalues

true. Similar matrices have the same characteristic polynomial, since det(P^-1AP - λI) = det(P^-1(A - λI)P) = det(A - λI), and therefore exactly the same eigenvalues.

if W is a subspace, then ||proj_W v||^2 + ||v - proj_W v||^2 = ||v||^2

true. By the Orthogonal Decomposition Theorem, proj_W v and v - proj_W v are orthogonal, so the Pythagorean Theorem gives ||proj_W v||^2 + ||v - proj_W v||^2 = ||v||^2.
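A concrete check with W = span{(1, 0)} in R^2 and v = (3, 4), where the projection can be read off directly:

```python
# proj_W v = (3, 0) and v - proj_W v = (0, 4); then 9 + 16 = 25 = ||v||^2.
v = (3, 4)
proj = (3, 0)   # orthogonal projection of v onto span{(1, 0)}
perp = (0, 4)   # v - proj

lhs = (proj[0]**2 + proj[1]**2) + (perp[0]**2 + perp[1]**2)   # 25
rhs = v[0]**2 + v[1]**2                                       # 25
```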

