Linear Algebra Test 3

Facts about W⊥ (the orthogonal complement of W)

1. A vector x is in W⊥ if and only if x is orthogonal to every vector in a set that spans W.
2. W⊥ is a subspace of R^n.

Theorem 6

An m x n matrix U has orthonormal columns if and only if U^TU = I
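
A quick numerical check of Theorem 6 (a minimal NumPy sketch; the matrix U below is my own example, not from the card):

    import numpy as np

    # A 3x2 matrix whose columns are unit length and mutually orthogonal.
    U = np.array([[1/np.sqrt(2),  1/np.sqrt(3)],
                  [1/np.sqrt(2), -1/np.sqrt(3)],
                  [0.0,           1/np.sqrt(3)]])

    # Theorem 6: orthonormal columns <=> U^T U = I (the 2x2 identity here).
    print(np.allclose(U.T @ U, np.eye(2)))   # True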

The Diagonalization Theorem

An n x n matrix A is diagonalizable if and only if A has n linearly independent eigenvectors. In fact, A = PDP^-1, with D a diagonal matrix, if and only if the columns of P are n linearly independent eigenvectors of A. In this case, the diagonal entries of D are eigenvalues of A that correspond, respectively, to the eigenvectors in P.

Theorem

An n x n matrix with n distinct eigenvalues is diagonalizable.

For any scalar c, ||cv|| = c||v||

FALSE. The absolute value of c is needed: ||cv|| = |c| ||v||.

If A is invertible, then A is diagonalizable.

FALSE. Invertibility and diagonalizability are not directly related; for example, the shear matrix with rows [1 1] and [0 1] is invertible but not diagonalizable.

If A is diagonalizable, then A is invertible.

FALSE. A is invertible exactly when zero is not an eigenvalue, but this has no bearing on diagonalizability; the zero matrix is diagonalizable yet not invertible.

A is diagonalizable if A has n eigenvectors

FALSE. The eigenvectors must be linearly independent.

A is diagonalizable if and only if A has n eigenvalues, counting multiplicities.

False. Having n eigenvalues counting multiplicities means only that the characteristic polynomial factors completely; diagonalizability also requires the dimension of each eigenspace to equal the multiplicity of its eigenvalue.

For a square matrix A, vectors in Col A are orthogonal to vectors in Nul A.

False. It is the row space, not the column space, whose vectors are orthogonal to Nul A: (Row A)⊥ = Nul A.

If A is diagonalizable, then A has n distinct eigenvalues.

False. A diagonalizable matrix can have repeated eigenvalues; the identity matrix is diagonal (hence diagonalizable) but has only one distinct eigenvalue.

A matrix with orthonormal columns is an orthogonal matrix

False. An orthogonal matrix is a square invertible matrix U such that U^-1 = U^T; a non-square matrix can have orthonormal columns without being an orthogonal matrix.

The best approximation to y by elements of a subspace W is given by the vector y − projW y.

False. By the Best Approximation Theorem, the orthogonal projection projW y itself is the closest point in W to y; y - projW y is the component of y orthogonal to W.

If {v1, v2, v3} is an orthogonal basis for W , then multiplying v3 by a scalar c gives a new orthogonal basis {v1, v2, cv3}

False. If c = 0, then cv3 = 0 and the set is no longer a basis; the statement holds only for nonzero scalars c.

The orthogonal projection y^ of y onto a subspace W can sometimes depend on the orthogonal basis for W used to compute y^

False. The uniqueness of the orthogonal decomposition y = projW y + z shows that the projection projW y depends only on W, not on the particular orthogonal basis used to compute it.

If W=Span{x1,x2,x3} with {x1,x2,x3} linearly independent, and if {v1,v2,v3} is an orthogonal set in W, then {v1,v2,v3} is a basis for W.

False. An orthogonal set in W may contain the zero vector, and such a set cannot be a basis; only an orthogonal set of three nonzero vectors in W would be linearly independent and hence a basis for the three-dimensional space W.

Diagonalizing matrices

Given:

    A = [ 1  3  3]
        [-3 -5 -3]
        [ 3  3  1]

STEP 1: Find the eigenvalues of A.
STEP 2: Find three linearly independent eigenvectors of A.
STEP 3: Construct P from the vectors in Step 2.
STEP 4: Construct D from the corresponding eigenvalues. (A numerical sketch of these steps follows.)
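
A minimal NumPy sketch of the four steps for this A; np.linalg.eig finds the eigenvalues and eigenvectors numerically rather than by hand:

    import numpy as np

    A = np.array([[ 1.,  3.,  3.],
                  [-3., -5., -3.],
                  [ 3.,  3.,  1.]])

    # Steps 1-2: eigenvalues and eigenvectors, computed numerically.
    eigvals, eigvecs = np.linalg.eig(A)

    # Steps 3-4: P has the eigenvectors as columns; D has the eigenvalues
    # on its diagonal, in the matching order.
    P = eigvecs
    D = np.diag(eigvals)

    # Check the factorization A = P D P^{-1}.
    print(np.allclose(A, P @ D @ np.linalg.inv(P)))   # True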

Theorem 4

If S = {u1, ..., up} is an orthogonal set of nonzero vectors in R^n, then S is linearly independent and hence is a basis for the subspace spanned by S.

Theorem 9

Let A be a real 2x2 matrix with a complex eigenvalue λ = a - bi (b != 0) and an associated eigenvector v in C^2. Then A = PCP^-1, where

    P = [Re v  Im v]   and   C = [a -b]
                                 [b  a]
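
A sketch of Theorem 9 in NumPy (the matrix A below, with eigenvalues 0.8 ± 0.6i, is my own example):

    import numpy as np

    A = np.array([[0.5, -0.6],
                  [0.75, 1.1]])

    eigvals, eigvecs = np.linalg.eig(A)

    # Pick the eigenvalue lambda = a - bi with b > 0 and its eigenvector v.
    idx = np.argmin(eigvals.imag)        # the eigenvalue with negative imaginary part
    lam, v = eigvals[idx], eigvecs[:, idx]
    a, b = lam.real, -lam.imag

    # Theorem 9: P = [Re v  Im v], C = [[a, -b], [b, a]], and A = P C P^{-1}.
    P = np.column_stack([v.real, v.imag])
    C = np.array([[a, -b],
                  [b,  a]])
    print(np.allclose(A, P @ C @ np.linalg.inv(P)))   # True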

Theorem 3

Let A be an m x n matrix. The orthogonal complement of the row space of A is the null space of A, and the orthogonal complement of the column space of A is the null space of A transpose: (Row A)⊥ = Nul A and (Col A)⊥ = Nul A^T.
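
A numerical illustration of (Row A)⊥ = Nul A (a sketch; the random 2x4 matrix and the SVD-based null-space construction are my own choices):

    import numpy as np

    rng = np.random.default_rng(4)
    A = rng.standard_normal((2, 4))      # rank 2 almost surely, so Nul A is 2-D

    # A basis for Nul A from the SVD: the right singular vectors beyond the rank.
    _, _, Vt = np.linalg.svd(A)
    N = Vt[2:].T                         # 4x2; its columns span Nul A

    print(np.allclose(A @ N, 0))         # each row of A is orthogonal to each
                                         # null-space vector: (Row A)⊥ = Nul A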

Theorem 7

Let A be an n x n matrix whose distinct eigenvalues are λ1, ..., λp.
a. For 1 <= k <= p, the dimension of the eigenspace for λk is less than or equal to the multiplicity of the eigenvalue λk.
b. The matrix A is diagonalizable if and only if the sum of the dimensions of the eigenspaces equals n; this happens if and only if (i) the characteristic polynomial factors completely into linear factors and (ii) the dimension of the eigenspace for each λk equals the multiplicity of λk.
c. If A is diagonalizable and Bk is a basis for the eigenspace corresponding to λk for each k, then the total collection of vectors in the sets B1, ..., Bp forms an eigenvector basis for R^n.

Theorem 7

Let U be an m x n matrix with orthonormal columns, and let x and y be in R^n. Then
a. ||Ux|| = ||x||
b. (Ux)*(Uy) = x*y
c. (Ux)*(Uy) = 0 if and only if x*y = 0
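
A quick check of parts a and b (a sketch; I build a U with orthonormal columns from a QR factorization of a random matrix):

    import numpy as np

    rng = np.random.default_rng(0)

    # A 5x3 matrix U with orthonormal columns, via a QR factorization.
    U, _ = np.linalg.qr(rng.standard_normal((5, 3)))

    x = rng.standard_normal(3)
    y = rng.standard_normal(3)

    print(np.isclose(np.linalg.norm(U @ x), np.linalg.norm(x)))   # a. ||Ux|| = ||x||
    print(np.isclose((U @ x) @ (U @ y), x @ y))                   # b. (Ux)*(Uy) = x*y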

Best Approximation Theorem

Let W be a subspace of R^n, let y be any vector in R^n, and let y^ (y-hat) be the orthogonal projection of y onto W. Then y^ is the closest point in W to y, in the sense that ||y - y^|| < ||y - v|| for all v in W distinct from y^.
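
A numerical sanity check of the theorem (a sketch under my own setup: W = Col U for a random U with orthonormal columns, and a handful of random competitors v in W):

    import numpy as np

    rng = np.random.default_rng(1)

    # W = Col U for a 4x2 matrix U with orthonormal columns.
    U, _ = np.linalg.qr(rng.standard_normal((4, 2)))
    y = rng.standard_normal(4)

    y_hat = U @ (U.T @ y)                # orthogonal projection of y onto W

    # Every other point of W tested is farther from y than y_hat is.
    for _ in range(5):
        v = U @ rng.standard_normal(2)   # a random vector in W
        assert np.linalg.norm(y - y_hat) <= np.linalg.norm(y - v)
    print("y_hat is the closest point among those sampled")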

Theorem

Let u, v, and w be vectors in R^n, and let c be a scalar. Then
a. u*v = v*u
b. (u+v)*w = u*w + v*w
c. (cu)*v = c(u*v) = u*(cv)
d. u*u >= 0, and u*u = 0 if and only if u = 0

Theorem 5

Let {u1, ..., up} be an orthogonal basis for a subspace W of R^n. For each y in W, the weights in the linear combination y = c1u1 + ... + cpup are given by cj = (y*uj)/(uj*uj).
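
A small worked check of the weight formula (a sketch; the orthogonal basis {u1, u2} and the vector y are my own example):

    import numpy as np

    # An orthogonal (not orthonormal) basis for a plane W in R^3.
    u1 = np.array([1.0, 1.0, 0.0])
    u2 = np.array([1.0, -1.0, 1.0])
    assert u1 @ u2 == 0                  # the basis really is orthogonal

    y = 2 * u1 - 3 * u2                  # a vector known to lie in W

    # Theorem 5: the weight on each uj is (y*uj)/(uj*uj).
    c1 = (y @ u1) / (u1 @ u1)
    c2 = (y @ u2) / (u2 @ u2)
    print(c1, c2)                        # 2.0 -3.0, recovering the weights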

For an mxn matrix A, vectors in the null space of A are orthogonal to vectors in the row space of A

TRUE. (Row A)⊥ = Nul A.

In a QR factorization, say A = QR (when A has linearly independent columns), the columns of Q form an orthonormal basis for the column space of A

TRUE. By the QR Factorization Theorem, the columns of Q form an orthonormal basis for the column space of A.
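
A sketch using NumPy's built-in QR factorization on a random matrix of my choosing:

    import numpy as np

    rng = np.random.default_rng(2)
    A = rng.standard_normal((4, 3))      # columns almost surely independent

    Q, R = np.linalg.qr(A)               # reduced QR: Q is 4x3, R is 3x3

    print(np.allclose(Q.T @ Q, np.eye(3)))   # columns of Q are orthonormal
    print(np.allclose(Q @ R, A))             # A = QR
    # Note: NumPy may choose signs so that R has negative diagonal entries;
    # the textbook convention flips signs to make the diagonal positive.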

If y is in a subspace W, then the orthogonal projection of y onto W is y itself

TRUE. For a vector y already in W = Span{u1, u2, ..., up}, the projection formula gives projW y = y.

If x is orthogonal to every vector in a subspace W, then x is in W perp

TRUE. W⊥ is by definition the set of all vectors that are orthogonal to every vector in W.

If y is a linear combination of nonzero vectors from an orthogonal set, then the weights in the linear combination can be computed without row operations on a matrix.

TRUE. The weights in the linear combination y = c1u1 + ... + cpup can be computed as cj = (y*uj)/(uj*uj), with no row operations on a matrix.

In the Orthogonal Decomposition Theorem, each term in formula (2) for y^ is itself an orthogonal projection of y onto a subspace of W.

TRUE. Each term (y*uj)/(uj*uj) uj is the orthogonal projection of y onto the line spanned by uj, a one-dimensional subspace of W.

If ||u||^2 + ||v||^2 = ||u+v||^2 then u and v are orthogonal.

TRUE. This is the Pythagorean Theorem.

If an n x p matrix U has orthonormal columns, then UU^T x = x for all x in R^n.

False. UU^T x is the orthogonal projection of x onto Col U; it equals x for every x in R^n only when U is square (p = n). It is U^T U = I that always holds.

If the columns of an nxp matrix U are orthonormal, then UU^Ty is the orthogonal projection of y onto the column space of U.

True. When the columns of U are orthonormal, projCol U y = UU^T y.
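
A sketch verifying the two properties that characterize this projection (the random 5x2 setup is my own example):

    import numpy as np

    rng = np.random.default_rng(3)

    # U: 5x2 with orthonormal columns, so Col U is a plane in R^5.
    U, _ = np.linalg.qr(rng.standard_normal((5, 2)))
    y = rng.standard_normal(5)

    p = U @ U.T @ y                      # claimed projection of y onto Col U

    # p lies in Col U by construction, and y - p is orthogonal to Col U:
    # exactly the two properties of the orthogonal projection.
    print(np.allclose(U.T @ (y - p), 0))     # True
    # Note p != y in general, matching the corrected card about UU^T x = x above.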

If x is not in a subspace W, then x - proj(W)x is not zero.

True. If x - projW x were zero, then x would equal projW x, which lies in W.

u*v-v*u=0

True. The dot product is commutative: u*v = v*u, so u*v - v*u = 0.

For any scalar c, u*(cv) = c(u*v)

True. (cu)*v = c(u*v) = u*(cv).

In a QR factorization, say A=QR (when A has linearly independent columns), the columns of Q form an orthonormal basis for the column space of A

True. From A = QR: Q^T A = Q^T QR = IR = R, so R = Q^T A, and the columns of Q form an orthonormal basis for Col A.

If y = z1 + z2, where z1 is in a subspace W and z2 is in W⊥, then z1 must be the orthogonal projection of y onto W

True. By the Orthogonal Decomposition Theorem, each y in R^n can be written uniquely as y = y^ + z with y^ in W and z in W⊥; uniqueness forces z1 = y^ = projW y.

If AP=PD, with D diagonal, then the nonzero columns of P must be eigenvectors of A.

True. AP = PD says that A pj = djj pj for each column pj of P, so each nonzero column of P is an eigenvector of A (P need not be invertible here).

If R^n has a basis of eigenvectors of A, then A is diagonalizable.

True The matrix A is diagonalizable if and only if there are enough eigenvectors to form a basis of R^n

The Gram-Schmidt process produces from a linearly independent set {x1, ..., xp} an orthogonal set {v1, ..., vp} with the property that for each k, the vectors v1, ..., vk span the same subspace as that spanned by x1, ..., xk

True. Given a basis {x1, x2, ..., xp} for a nonzero subspace W of R^n, the Gram-Schmidt process produces an orthogonal basis for W, with the spans of the partial sets preserved as stated. (A sketch of the process follows.)
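
A minimal implementation of the classical Gram-Schmidt process (a sketch; the function name and the 3x3 example are my own choices):

    import numpy as np

    def gram_schmidt(X):
        """Orthogonalize the columns of X (assumed linearly independent)."""
        V = []
        for x in X.T:                    # walk through the columns of X
            v = x.astype(float)
            for u in V:                  # subtract the projection of x onto
                v = v - (x @ u) / (u @ u) * u   # each previously built vector
            V.append(v)
        return np.column_stack(V)

    X = np.array([[1.0, 0.0, 0.0],
                  [1.0, 1.0, 0.0],
                  [1.0, 1.0, 1.0]])
    V = gram_schmidt(X)

    # V^T V is diagonal, so the new columns are mutually orthogonal.
    print(np.allclose(V.T @ V, np.diag(np.diag(V.T @ V))))   # True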

If vectors v1,...,vp span a subspace W and if x is orthogonal to each vj for j = 1,...,p, then x is in W^perp

True. By the definition of W⊥: if x*vj = 0 for each vj, then x is orthogonal to every linear combination of v1, ..., vp, hence to every vector in W.

For each y and each subspace W , the vector y − projw y is orthogonal to W

True. By the Orthogonal Decomposition Theorem, each y in R^n can be written as y = projW y + z, where z = y - projW y is in W⊥ and hence orthogonal to W.

If W is a subspace of R^n and if v is in both W and W perp then v must be the zero vector

True. If v is in both W and W⊥, then v is orthogonal to itself, so v*v = 0 and therefore v = 0.

If z is orthogonal to u1 and to u2 and if W = Span{u1, u2}, then z must be in W⊥.

True. If a vector z is orthogonal to u1 and u2, it is orthogonal to every linear combination of them, that is, to every vector in W = Span{u1, u2}, so z is in W⊥.

If the distance from u to v equals the distance from u to -v, then u and v are orthogonal

True. Setting ||u - v|| = ||u + v|| and squaring both sides gives -2(u*v) = 2(u*v), so u*v = 0.

v*v=||v||^2

True. ||v|| = sqrt(v*v), so ||v||^2 = v*v.

The Pythagorean theorem

Two vectors u and v are orthogonal if and only if ||u+v||^2=||u||^2+||v||^2

An orthogonal basis for a subspace W of R^n is

a basis for W that is also an orthogonal set

For u and v in R^n, the distance between u and v, written as dist(u, v), is the length of the vector u - v

dist(u,v)=||u-v||

A is diagonalizable if A = PDP^-1 for some matrix D and some invertible matrix P.

False. "For some matrix D" does not require D to be diagonal; the definition of diagonalizable requires A = PDP^-1 with D diagonal.

If L is a line through 0 and if y^ is the orthogonal projection of y onto L, then ||y^|| gives the distance from y to L.

False. The distance from y to the line L through the origin is the length of the perpendicular segment from y to its orthogonal projection y^, that is, ||y - y^||.

If the vectors in an orthogonal set of nonzero vectors are normalized, then some of the new vectors may not be orthogonal.

False. When the vectors in an orthogonal set of nonzero vectors are normalized to unit length, the new vectors are still orthogonal, so the new set is orthonormal.

Not every linearly independent set in R^n is an orthogonal set

True. A linearly independent set need not be orthogonal; for example, {(1, 0), (1, 1)} in R^2 is linearly independent but not orthogonal.

Two vectors u and v in R^n are orthogonal (to each other) if

u*v=0

orthogonal projection of y onto u

y^ = proju y = ((y*u)/(u*u)) u
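
A tiny sketch of this formula (the function name and the vectors y = (7, 6), u = (4, 2) are my own example):

    import numpy as np

    def proj(y, u):
        """Orthogonal projection of y onto the line spanned by a nonzero u."""
        return (y @ u) / (u @ u) * u

    y = np.array([7.0, 6.0])
    u = np.array([4.0, 2.0])

    y_hat = proj(y, u)
    print(y_hat)                         # [8. 4.]
    print((y - y_hat) @ u)               # 0.0: the residual is orthogonal to u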

For any scalar c, the length cv is |c| times the length of v

||cv|| = |c| ||v||

u*v=

||u|| ||v|| cos(θ), where θ is the angle between u and v

The length or NORM of v is the nonnegative scalar ||v|| defined by

||v|| = sqrt(v*v) = sqrt(v1^2 + v2^2 + ... + vn^2)

