Final Review


6 steps of finding the reduced SVD:

1. Find A^T x A.
2. Find the eigenvalues of A^T x A.
3. Put the orthonormal eigenvectors you found into the matrix V+ (the eigenvectors from step 2 with the hat operation applied; there should be a square root of something under each value).
4. Put the singular values on the diagonal of the matrix Σ+ in the proper order (the example put them in descending order).
5. Find (1/corresponding singular value) x A x corresponding orthonormal eigenvector, then combine the resulting vectors in the proper order to form U+.
6. Done. Write A in the form U+ x Σ+ x V+^T (yes, that last one is step 3's result transposed).
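Below is a minimal NumPy sketch of the six steps (NumPy, the example matrix, and the names V_r, Sigma_r, U_r standing in for V+, Σ+, U+ are my own choices, not part of the notes):

```python
import numpy as np

# Example matrix (an assumption, chosen for illustration).
A = np.array([[3.0, 0.0],
              [4.0, 5.0]])

# Steps 1-2: form A^T A and get its eigenvalues/eigenvectors.
AtA = A.T @ A
eigvals, eigvecs = np.linalg.eigh(AtA)   # eigh: symmetric input, ascending order

# Steps 3-4: sort descending; singular values are square roots of the eigenvalues.
order = np.argsort(eigvals)[::-1]
sigma = np.sqrt(eigvals[order])          # singular values, descending
V = eigvecs[:, order]                    # orthonormal eigenvectors as columns

# Keep only the nonzero singular values (the "reduced" part).
r = int(np.sum(sigma > 1e-10))
sigma_r, V_r = sigma[:r], V[:, :r]
Sigma_r = np.diag(sigma_r)

# Step 5: u_i = (1/sigma_i) * A * v_i, collected as the columns of U_r.
U_r = A @ V_r / sigma_r

# Step 6: A should equal U_r * Sigma_r * V_r^T.
print(np.allclose(A, U_r @ Sigma_r @ V_r.T))   # True
```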

What is the eigenvalue of A^-1 corresponding to the eigenvector x, given that A is invertible and x is an eigenvector of A with corresponding eigenvalue λ?

1/λ

Positive definiteness of a matrix

A symmetric matrix is said to be positive definite if all its eigenvalues are positive. Equivalent tests: all pivots are positive, or all upper-left determinants are positive.

If A and B share the same set of distinct eigenvectors, then AB = ...

AB = BA

Real symmetric matrix meaning

A matrix with real entries that equals its own transpose: A = A^T. (Real symmetric matrices always have real eigenvalues and orthonormal eigenvectors, which is what makes the spectral decomposition below work.)

What eigenvalues do A and A transposed share?

All. A matrix and its transpose have the same eigenvalues. BUT EIGENVECTORS ARE NOT NECESSARILY THE SAME.

Orthonormality

An orthonormal set S is an orthogonal set in which every vector is a unit vector; the unit-vector requirement is what distinguishes an orthonormal set from a merely orthogonal one.

Finding eigenvalues steps

Apply the characteristic equation to A: compute det( ( λ * I ) − A ) to get the characteristic polynomial, then find the zeroes of that polynomial. Those zeroes are the eigenvalues.
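A small SymPy sketch of these steps (SymPy and the example matrix are assumptions):

```python
import sympy as sp

lam = sp.symbols('lambda')
A = sp.Matrix([[2, 1],
               [1, 2]])

# Characteristic polynomial p(lambda) = det(lambda*I - A).
p = (lam * sp.eye(2) - A).det()
print(sp.expand(p))        # lambda**2 - 4*lambda + 3

# The eigenvalues are the zeroes of p.
print(sp.solve(p, lam))    # [1, 3]
```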

A nonzero vector x is an eigenvector of A if:

Ax = λx, for some scalar λ

Characteristic polynomial and its meaning

Class notes version: p(λ) = det[ ( λ * I ) − A ]. Meaning: compute the determinant of the matrix ( λ * I ) − A, whose entries contain the unknown λ.

How to find the basis of the column space

Convert to RREF and find the pivots. The columns of the ORIGINAL matrix (not the row-reduced one) that contain those pivots form a basis for the column space.

How to find the basis of the row space

Convert to RREF and find the pivots. The rows of the ROW REDUCED matrix (NOT the original) that contain those pivots form a basis for the row space of the original matrix. No, no terms are switched up here: the RREF matrix yields the row space for the original matrix.
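A SymPy sketch covering this card and the previous one (the library and the example matrix are my own choices):

```python
import sympy as sp

A = sp.Matrix([[1, 2, 3],
               [2, 4, 6],
               [1, 1, 1]])

R, pivot_cols = A.rref()   # R is the RREF; pivot_cols holds the pivot column indices

# Column space basis: the pivot columns of the ORIGINAL matrix A.
col_basis = [A.col(j) for j in pivot_cols]

# Row space basis: the pivot (nonzero) rows of the ROW REDUCED matrix R.
row_basis = [R.row(i) for i in range(len(pivot_cols))]

print(pivot_cols)   # (0, 1)
print(col_basis)    # columns 0 and 1 of A
print(row_basis)    # first two rows of R
```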

What does the dimension of a space represent

Dimension is the number of vectors in any basis for the space in question.

How to calculate dimension of the row space (which is also the dimension of the column space)

Dimension of the row space (and also the column space) is the rank (r) of the matrix

How to calculate the rank of the matrix and what does it represent

How to find: transform to RREF; the rank r is the number of pivots. The rank of a matrix is defined as (a) the maximum number of linearly independent column vectors in the matrix, or (b) the maximum number of linearly independent row vectors in the matrix. Both definitions are equivalent.

How would you go about the Gram-Schmidt process with 3 vectors a1, a2, and a3, and what does the Gram-Schmidt process do?

It produces an orthonormal basis from another basis. You'd find q1 = the hat operation on a1, then u2 = a2 − (a2^T * q1) * q1 and q2 = the hat operation on u2, then u3 = a3 − (a3^T * q1) * q1 − (a3^T * q2) * q2 and q3 = the hat operation on u3. Keep this pattern, subtracting the projection onto every q found so far, until you're out of vectors (in this case, when you obtain q3). See the sketch below.
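Here is the sketch, a minimal NumPy version of the pattern above (the helper names hat and gram_schmidt and the example vectors are hypothetical, not from the notes):

```python
import numpy as np

def hat(u):
    # The hat equation: u divided by the norm of u.
    return u / np.linalg.norm(u)

def gram_schmidt(vectors):
    # Turn a list of independent vectors into an orthonormal list q1, q2, ...
    qs = []
    for a in vectors:
        u = a.astype(float)
        for q in qs:
            u = u - (a @ q) * q   # subtract the projection onto each earlier q
        qs.append(hat(u))
    return qs

a1 = np.array([1.0, 1.0, 0.0])
a2 = np.array([1.0, 0.0, 1.0])
a3 = np.array([0.0, 1.0, 1.0])

q1, q2, q3 = gram_schmidt([a1, a2, a3])
Q = np.column_stack([q1, q2, q3])
print(np.allclose(Q.T @ Q, np.eye(3)))   # True: the q's are orthonormal
```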

What is Cramer's Rule, what is its application, and how would you go about using it?

It solves a square system Ax = b as an alternative to elimination with back substitution. Find the determinant of the original matrix for reference later. Then replace each COLUMN (not row) in turn with the result vector b and find the determinant each time; each x/y/z/whatever value is the corresponding determinant divided by the original matrix's determinant. See the example below:
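A worked sketch in NumPy (the example system is my own):

```python
import numpy as np

# Solve Ax = b by Cramer's Rule: x_i = det(A_i) / det(A),
# where A_i is A with its i-th COLUMN replaced by b.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([3.0, 5.0])

d = np.linalg.det(A)              # determinant of the original matrix
x = np.empty(len(b))
for i in range(len(b)):
    Ai = A.copy()
    Ai[:, i] = b                  # replace column i with the result vector
    x[i] = np.linalg.det(Ai) / d

print(x)                          # [0.8 1.4]
print(np.linalg.solve(A, b))      # same answer, for comparison
```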

Reduced SVD

Just the SVD, but with the zero singular values dropped: remove any zero rows/columns of the Σ+ matrix along with the corresponding columns of U+ and V+, keeping only the r nonzero singular values.

Are eigenvectors corresponding to distinct eigenvalues LI or LD?

Linearly independent (LI)

Can you find the determinant of a non-square matrix (i.e. is a non-square matrix invertible)?

No. The determinant of a non-square matrix does not exist, and as a consequence a non-square matrix cannot be inverted.

⊥ meaning and how to find it

Orthogonal complement: the set of all vectors orthogonal to every vector in a given subspace. To find the orthogonal complement of the row space of A, find the null space of A (solve Ax = 0); that null space is your orthogonal complement.

What are the four fundamental subspaces?

Row space, column space, null space, left null space

Cholesky factorization

S = C^T x C, where C = sqrt(D) x U (with D and U coming from the LDU factorization of S).
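A NumPy sketch connecting this to np.linalg.cholesky, which returns a lower-triangular L with S = L x L^T, so the card's C is L^T (the example matrix and the hand-computed LDU factors are my own):

```python
import numpy as np

S = np.array([[4.0, 2.0],
              [2.0, 3.0]])

# NumPy's Cholesky returns lower-triangular L with S = L @ L.T,
# so the card's upper-triangular factor is C = L.T.
L = np.linalg.cholesky(S)
C = L.T
print(np.allclose(S, C.T @ C))   # True: S = C^T C

# By hand, the LDU factorization of S has L_ldu = [[1, 0], [0.5, 1]],
# D = diag(4, 2), and U = L_ldu^T, so C = sqrt(D) @ U gives the same factor.
C_manual = np.diag([2.0, np.sqrt(2.0)]) @ np.array([[1.0, 0.5],
                                                    [0.0, 1.0]])
print(np.allclose(C, C_manual))  # True
```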

Spectral decomposition of S

S = Q x Λ x Q^T. That is, Q is the matrix whose columns are the orthonormal eigenvectors of S, Λ is the diagonal matrix with the corresponding eigenvalues on its diagonal (in descending order, matching the column order of Q), and the last factor is the eigenvector matrix Q transposed.
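A NumPy check of the decomposition (the example matrix is an assumption):

```python
import numpy as np

S = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eigh returns ascending eigenvalues with orthonormal eigenvectors as columns.
eigvals, Q = np.linalg.eigh(S)

# Reorder to descending so Lambda matches the card's convention.
order = np.argsort(eigvals)[::-1]
Lam = np.diag(eigvals[order])
Q = Q[:, order]

print(np.allclose(S, Q @ Lam @ Q.T))   # True: S = Q Lambda Q^T
```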

The eigenvalues of a triangular matrix (upper triangular, lower triangular or diagonal) are...

The entries in the main diagonal.

Cofactor:

The minor of position (i, j) times the sign (−1)^(i+j). That is, C_ij = (−1)^(i+j) * M_ij, where the minor M_ij is the determinant of the submatrix left after deleting row i and column j.

Eigenspace of matrix A

The nullspace N(λI − A)

The null space is the orthogonal complement of the ___ space and vice versa, and the left null space is the orthogonal complement of the ___ space and vice versa.

The nullspace is the orthogonal complement of the row space, and vice versa. Similarly, the left nullspace is the orthogonal complement of the column space, and vice versa.

What is nullity?

The dimension of the null space of a matrix, i.e. the number of vectors in a basis for the null space (not the number of vectors in the null space itself, which is infinite whenever the null space is nontrivial).

det(A), when described in terms of A's eigenvalues, is equal to...

The product of all eigenvalues of A.

What is an eigenvalue?

The scalar λ in the equation Ax = λx (def of eigenvector).

The trace of square matrix A is...

The sum of its diagonal elements (which also equals the sum of its eigenvalues).
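A quick NumPy check of this card together with the det card above (the example matrix is my own):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
eigvals = np.linalg.eigvals(A)   # eigenvalues: 5 and 2

print(np.isclose(np.prod(eigvals), np.linalg.det(A)))  # True: det = product of eigenvalues
print(np.isclose(np.sum(eigvals), np.trace(A)))        # True: trace = sum of eigenvalues
```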

T/F: If x is an eigenvector of A with corresponding eigenvalue λ, then x is also an eigenvector of 2A with corresponding eigenvalue 2λ.

True

T/F: If x is an eigenvector of A with corresponding eigenvalue λ, then x is also an eigenvector of A^2 with corresponding eigenvalue λ^2

True
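A numeric check of these two T/F cards, plus the 1/λ card near the top (the example matrix and eigenvector are assumptions):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
x = np.array([1.0, 1.0])     # eigenvector of A with eigenvalue 3
lam = 3.0

print(np.allclose(A @ x, lam * x))                        # Ax = lambda x
print(np.allclose((2 * A) @ x, (2 * lam) * x))            # 2A has eigenvalue 2*lambda
print(np.allclose((A @ A) @ x, lam**2 * x))               # A^2 has eigenvalue lambda^2
print(np.allclose(np.linalg.inv(A) @ x, (1 / lam) * x))   # A^-1 has eigenvalue 1/lambda
```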

Orthogonality: How can you tell if two vectors are orthogonal? A matrix?

Two vectors are orthogonal if their dot product is zero. A matrix is orthogonal if you obtain the identity matrix when you multiply the matrix by its transpose, i.e. A^T * A = I.
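A small NumPy check of both tests (the example vectors and the rotation matrix are my own):

```python
import numpy as np

# Two vectors are orthogonal when their dot product is zero.
u = np.array([1.0, 2.0])
v = np.array([-2.0, 1.0])
print(np.isclose(u @ v, 0.0))            # True

# A matrix is orthogonal when Q^T Q = I; rotation matrices are a classic example.
theta = np.pi / 4
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
print(np.allclose(Q.T @ Q, np.eye(2)))   # True
```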

Characteristic equation and its meaning

det[ ( λ * I ) − A ] = 0. (Dahal says the difference between the class notes version, det(λI − A), and the WolframAlpha version, det(A − λI), doesn't matter and everything still works out: the two differ only by a sign, so they have the same zeroes.) Meaning: let A be an n × n matrix; then λ is an eigenvalue of A if and only if det( (λ * I ) − A ) = 0.

Hat equation (i.e. u^ except the ^ is directly on top of the u)

u^ = (u vector) / || u vector ||. That is, u-hat is the u vector divided by the norm of the u vector.

