linear algebra final (5.1-5.3, 6.1-6.3, 6.5, 6.6)


orthogonality

two vectors u and v in Rⁿ are orthogonal (to each other) if u∙v = 0
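
A minimal numpy sketch of this definition (the example vectors are my own illustration, not from the deck):

```python
import numpy as np

u = np.array([2.0, 1.0, -2.0])
v = np.array([1.0, 2.0, 2.0])

# u and v are orthogonal exactly when their dot product is zero
print(np.dot(u, v))  # 2*1 + 1*2 + (-2)*2 = 0.0 -> orthogonal
```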

for a square matrix A, vectors in Col A are orthogonal to vectors in Nul A

FALSE; it is Row A, not Col A, that is orthogonal to Nul A (Col A is orthogonal to Nul Aᵀ)

the determinant of A is the product of the diagonal entries in A

FALSE in general; TRUE if A is triangular

If λ+5 is a factor of the characteristic polynomial of A, then 5 is an eigenvalue of A

FALSE; -5 is an eigenvalue: λ+5 = 0 -> λ = -5

A is diagonalizable if A=PDP⁻¹ for some matrix D and some invertible matrix P

FALSE; D must be a diagonal matrix

det Aᵀ = (-1) det A

FALSE; det Aᵀ = det A (Theorem 3: Properties of Determinants)

an elementary row operation on A does not change the determinant

FALSE; only row replacement leaves the determinant unchanged: a row interchange changes its sign, and scaling a row scales it by the same factor

A is diagonalizable if and only if A has n eigenvalues, counting multiplicities

FALSE; EVERY nxn matrix has n eigenvalues counting multiplicity (over the complex numbers), so this condition does not distinguish diagonalizable matrices

if A is diagonalizable, then A has n distinct eigenvalues

FALSE; it can have repeated eigenvalues, as long as the dimension of each eigenspace equals the multiplicity of its eigenvalue

the orthogonal projection ŷ of y onto a subspace W can sometimes depend on the orthogonal basis for W used to compute ŷ

FALSE; ŷ is ALWAYS independent of the choice of orthogonal basis for W

if A is diagonalizable, then A is invertible

FALSE; A is invertible if and only if it does not have a zero eigenvalue, but this is unrelated to diagonalizability

the best approximation to y by elements of a subspace W is given by the vector y - proj𝓌y

FALSE; the best approx. is proj𝓌y

A is diagonalizable if A has n eigenvectors

FALSE; A must have n linearly independent eigenvectors

If A is invertible, then A is diagonalizable

FALSE; the two properties are not directly related; e.g., the zero matrix is diagonal (hence diagonalizable) but not invertible

If Ax=λx for some vector x, then λ is an eigenvalue of A

FALSE; this is true only if x is nonzero (x = 0 satisfies Ax = λx for every λ)

if an nxp matrix U has orthonormal columns, then UUᵀx=x for all x in Rⁿ

FALSE; this holds only if U is square (then UUᵀ = I); in general UUᵀx is the orthogonal projection of x onto Col U

thm 3 (275) [props of determinants]

Let A and B be nxn matrices:
1. A is invertible if and only if det A != 0
2. det AB = (det A)(det B)
3. det Aᵀ = det A
4. if A is triangular, then det A is the product of the entries on the main diagonal of A
5. a row replacement operation on A does NOT change the determinant; a row interchange changes the sign of the determinant; a row scaling also scales the determinant by the same scalar factor
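
A quick numpy sanity check of properties 2, 3, and 5, using random matrices of my own choosing (a sketch, not part of the theorem):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))

# property 2: det(AB) = (det A)(det B)
print(np.isclose(np.linalg.det(A @ B), np.linalg.det(A) * np.linalg.det(B)))

# property 3: det(A^T) = det(A)
print(np.isclose(np.linalg.det(A.T), np.linalg.det(A)))

# property 5: interchanging two rows flips the sign of the determinant
A_swapped = A[[1, 0, 2], :]
print(np.isclose(np.linalg.det(A_swapped), -np.linalg.det(A)))
```

All three checks print True.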

A is a 3x3 matrix with 2 eigenvalues. each eigenspace is 1D. is A diagonalizable; why?

NO; the sum of the dimensions of the eigenspaces does NOT equal 3: 1+1 != 3
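
A concrete sketch of such a matrix (my own example): the eigenvalue 2 has algebraic multiplicity 2 but only a 1D eigenspace, so the eigenspace dimensions sum to 2, not 3:

```python
import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 3.0]])   # triangular: eigenvalues 2, 2, 3

for lam in (2.0, 3.0):
    # dim of eigenspace = n - rank(A - lam*I)
    dim = 3 - np.linalg.matrix_rank(A - lam * np.eye(3))
    print(lam, dim)  # 2.0 -> 1, 3.0 -> 1; total 2 != 3, so not diagonalizable
```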

(det A)(det B)= detAB

TRUE

If A is 3x3 with columns a₁, a₂, a₃, then detA equals the volume of the parallelepiped determined by a₁, a₂, a₃

FALSE; the volume is |det A|, and det A may be negative

a row replacement operation on A does not change the eigenvalues

FALSE; row replacement preserves det A but not det(A-λI), so the eigenvalues can change (adding one row to another generally changes them)

for any scalar c, u∙(cv) = c(u∙v)

TRUE

for each y and each subspace W, the vector y - proj𝓌y is orthogonal to W

TRUE

if W is a subspace of Rⁿ and if v is in both W and W⟂, then v must be the zero vector

TRUE

if the columns of an nxp matrix U are orthonormal, then UUᵀy is the orthogonal projection of y onto the column space of U

TRUE
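
A small numpy illustration (U and y are my own example values); with a non-square U, UUᵀy reproduces only the part of y that lies in Col U:

```python
import numpy as np

U = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])       # 3x2 with orthonormal columns
y = np.array([1.0, 2.0, 3.0])

y_hat = U @ (U.T @ y)            # orthogonal projection of y onto Col U
print(y_hat)                     # [1. 2. 0.]
print(U.T @ (y - y_hat))         # [0. 0.] -> residual is orthogonal to Col U
```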

if the distance from u to v equals the distance from u to -v, then u and v are orthogonal

TRUE; ||u-v||² = ||u||² - 2u∙v + ||v||² and ||u+v||² = ||u||² + 2u∙v + ||v||², so the two distances are equal exactly when u∙v = 0

if y = z₁ + z₂, where z₁ is in a subspace W and z₂ is in W⟂, then z₁ must be the orthogonal projection of y onto W

TRUE; this follows from the uniqueness of the decomposition in the Orthogonal Decomposition Theorem

if y is in a subspace W, then the orthogonal projection of y onto W is y itself

TRUE

in the Orthogonal Decomp Thm, each term in formula (2) for ŷ is itself an orthogonal projection of y onto a subspace of W

TRUE

the multiplicity of a root r of the characteristic equation of A is called the algebraic multiplicity of r as an eigenvalue of A

TRUE

v∙v = ||v||²

TRUE

if Rⁿ has a basis of eigenvectors of A, then A is diagonalizable

TRUE

a matrix A is not invertible if and only if 0 is an eigenvalue of A

TRUE; Invertible Matrix Theorem

if vectors v₁,...vₚ span a subspace W and if x is orthogonal to each vⱼ for j = 1,...p, then x is in W⟂

TRUE; any vector in W is a linear combination of v₁,...vₚ, and x is orthogonal to each term, hence to the whole combination

If AP=PD, with D diagonal, then the nonzero columns of P must be eigenvectors of A

TRUE; equating the jth columns of AP and PD gives Apⱼ = dⱼⱼpⱼ, which is exactly the eigenvector equation whenever the column pⱼ is nonzero

Finding an eigenvector of A may be difficult, but checking whether a given vector is in fact an eigenvector is easy

TRUE; just see if Ax is a scalar multiple of x
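
A minimal sketch of that check in numpy; the matrix and vector are my own example:

```python
import numpy as np

A = np.array([[4.0, -2.0],
              [1.0,  1.0]])
x = np.array([2.0, 1.0])

Ax = A @ x
print(Ax)                        # [6. 3.], which is 3*x
print(np.allclose(Ax, 3.0 * x))  # True -> x is an eigenvector, eigenvalue 3
```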

a number c is an eigenvalue of A if and only if the equation (A-cI)x = 0 has a nontrivial solution

TRUE; this is a rearrangement of the equation Ax = cx

If z is orthogonal to u₁ and to u₂ and if W=Span{u₁, u₂} then z must be in W⟂

TRUE; z will be orthogonal to any linear combo of u₁ and u₂

A is a 5x5 matrix with 2 eigenvalues. 1 eigenspace is 3D, and the other is 2D. is A diagonalizable; why?

YES; the sum of the dimensions of the eigenspaces equals 5: 3+2 = 5

inner product

the matrix product uᵀv, also written u∙v, where u and v are vectors in Rⁿ; if u∙v = 0, u and v are orthogonal

eigenvector; basis

a nonzero vector x such that Ax = λx for some scalar λ; an eigenvector basis is a basis consisting entirely of eigenvectors of a given matrix

eigenvalue

a scalar λ such that the equation Ax = λx has a solution for some nonzero (nontrivial) vector x

diagonalization thm (282)

an nxn matrix A is diagonalizable if and only if A has n linearly independent eigenvectors. in fact, A = PDP⁻¹, with D a diagonal matrix, if and only if the columns of P are n linearly independent eigenvectors of A. in this case, the diagonal entries of D are eigenvalues of A that correspond, respectively, to the eigenvectors in P
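
A numpy sketch of the theorem on an example 2x2 matrix of my own (np.linalg.eig returns the eigenvalues and a matrix whose columns are eigenvectors):

```python
import numpy as np

A = np.array([[4.0, -2.0],
              [1.0,  1.0]])

eigvals, P = np.linalg.eig(A)    # columns of P are eigenvectors of A
D = np.diag(eigvals)

# the two eigenvalues (3 and 2) are distinct, so the eigenvectors are
# linearly independent and A = P D P^(-1)
print(np.allclose(A, P @ D @ np.linalg.inv(P)))  # True
```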

characteristic equation

det(A-λI)=0
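
A small numpy illustration with an example 2x2 matrix of my own: the eigenvalues are exactly the roots of the characteristic polynomial det(A-λI):

```python
import numpy as np

A = np.array([[4.0, -2.0],
              [1.0,  1.0]])

coeffs = np.poly(A)              # characteristic polynomial: lam**2 - 5*lam + 6
print(coeffs)                    # [ 1. -5.  6.]
print(np.roots(coeffs))          # [3. 2.] -- the eigenvalues of A

# each eigenvalue satisfies det(A - lam*I) = 0
for lam in np.linalg.eigvals(A):
    print(np.isclose(np.linalg.det(A - lam * np.eye(2)), 0.0))  # True
```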

explain why a 2x2 matrix can have at most 2 distinct eigenvalues explain why an nxn matrix can have at most n distinct eigenvalues

eigenvectors corresponding to distinct eigenvalues are linearly independent (Theorem 2); eigenvectors of a 2x2 matrix live in R², which holds at most 2 linearly independent vectors, so there are at most 2 distinct eigenvalues. the same argument in Rⁿ gives at most n distinct eigenvalues

distance; dist(u,v)

for u and v in Rⁿ, the length of the vector u-v: dist(u, v) = ||u-v||

thm 4 (338)

if S = {u₁,...uₚ} is an orthogonal set of nonzero vectors in Rⁿ, then S is linearly independent and hence is a basis for the subspace spanned by S

A is a 4x4 matrix with 3 eigenvalues. 1 eigenspace is 1D and 1 other is 2D. Is it possible that A is not diagonalizable; why?

NO; the remaining eigenspace is at least 1D, so the dimensions of the eigenspaces sum to at least 1+2+1 = 4; since the sum cannot exceed 4, it equals exactly 4 and A must be diagonalizable

thm 2 (270)

if v₁,...vᵣ are eigenvectors that correspond to distinct eigenvalues λ₁,...λᵣ of an nxn matrix A, then the set {v₁,...vᵣ} is linearly independent

invertible matrix theorem (275)

let A be an nxn matrix. Then A is invertible if and only if:
1. the number 0 is NOT an eigenvalue of A
2. the determinant of A is NOT 0

thm 5 (339)

let {u₁,...uₚ} be an orthogonal basis for a subspace W of Rⁿ. for each y in W, the weights in the linear combo y = c₁u₁ + ... + cₚuₚ are given by cⱼ = (y∙uⱼ)/(uⱼ∙uⱼ) for j = 1,...p
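
A numpy sketch of Theorem 5 with a hand-picked orthogonal basis (my own example, not from the text):

```python
import numpy as np

u1 = np.array([1.0,  1.0, 0.0])   # u1 . u2 = 0: an orthogonal basis for W
u2 = np.array([1.0, -1.0, 0.0])
y  = np.array([3.0,  1.0, 0.0])   # y lies in W = Span{u1, u2}

c1 = np.dot(y, u1) / np.dot(u1, u1)   # weights from Theorem 5
c2 = np.dot(y, u2) / np.dot(u2, u2)
print(c1, c2)                         # 2.0 1.0
print(np.allclose(y, c1 * u1 + c2 * u2))  # True
```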

let λ be an eigenvalue of an invertible matrix A. Show that λ⁻¹ is an eigenvalue of A⁻¹

since A is invertible, λ != 0, and there is a nonzero vector x such that Ax = λx. multiplying both sides by A⁻¹ gives A⁻¹(Ax) = A⁻¹(λx) -> x = λ(A⁻¹x); multiplying by λ⁻¹ gives A⁻¹x = λ⁻¹x, so λ⁻¹ is an eigenvalue of A⁻¹ (with the same eigenvector x)

length (or norm)

the scalar ||v|| = √(v∙v), so that ||v||² = v∙v

formula 2 (340)

ŷ = projₗy = [(y∙u)/(u∙u)]u, the orthogonal projection of y onto the line L = Span{u}
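
A numpy sketch of formula (2); the vectors y = (7, 6) and u = (4, 2) are illustrative choices:

```python
import numpy as np

u = np.array([4.0, 2.0])   # direction of the line L = Span{u}
y = np.array([7.0, 6.0])

y_hat = (np.dot(y, u) / np.dot(u, u)) * u   # formula (2)
print(y_hat)                 # [8. 4.]
print(np.dot(y - y_hat, u))  # 0.0 -> the residual y - y_hat is orthogonal to L
```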

