MA 265 Exam 2
dim Nul A
# of free variables
What is the inverse of A if A = PDP^-1 is diagonalizable?
A^-1 = PD^-1P^-1 (provided A is invertible, so D has no zero diagonal entries)
What if eigenvalues are repeated?
Ideal case: dimension of eigenspace (# of eigenvectors found the normal way) = algebraic multiplicity (# of times the eigenvalue appears as a root). Bad case: dimension of some eigenspace < its algebraic multiplicity --> matrix is NOT diagonalizable
What does it mean if A is diagonalizable?
If A is diagonalizable, then A= PDP^-1 for some invertible P and diagonal D
How to find eigenvalues?
Solve det(A - λI) = 0 for λ. If A is triangular, then the main diagonal elements are the eigenvalues.
Spanning Set Theorem. Let S = {v1, ..., vp} be a set in V and let H = Span {v1, ..., vp}
If one of the vectors in S, for ex. vk, is a linear combination of the remaining vectors in S, then the set formed by removing vk from S still spans H. If H does not equal {0}, some subset of S is a basis for H
Suppose a 5x7 matrix has 4 pivot columns. Is Col A = R^4?
No, because Col A is a subspace of R^5 (it is 4-dimensional, but its vectors have 5 entries)
n
columns
Recall that for a matrix A, a real number λ is an eigenvalue if and only if
det(A - λI) = 0
λ is an eigenvalue of A if and only if
det(A - λI) = 0
(A - λI)x = 0 must have nontrivial solutions
det(A - λI) = 0; solve for λ
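Optional sanity check (not part of the course material; the 2x2 matrix below is a made-up example): SymPy can set up det(A - λI) = 0 and solve it for λ.

```python
import sympy as sp

# Made-up example matrix
A = sp.Matrix([[4, 1],
               [2, 3]])
lam = sp.symbols('lambda')

# Characteristic equation det(A - lambda*I) = 0
char_eq = (A - lam * sp.eye(2)).det()
print(sp.factor(char_eq))      # (lambda - 2)*(lambda - 5)
print(sp.solve(char_eq, lam))  # [2, 5] -- the eigenvalues
```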
If we have m vectors in IR^n, where m > n, then the matrix A = [b1 ... bm] has n rows and m columns. A has at most n pivots but there are m columns, so there are
free variables in Ax = 0, which means it is a dependent set
Two vector spaces are isomorphic if and only if they
have the same dimension.
Vector Space
is a collection of objects called vectors such that a) any vector can be scaled by a scalar b) there is a zero vector such that v + 0 = v for every vector c) any two vectors v and w in V can be added
n x n
at most n eigenvalue/eigenvector pairs
If a vector space V has dimension n, then any set of vectors in V of size larger than n is
necessarily linearly dependent
If λ = 0, Ax = λx becomes Ax = 0, and (A - λI)x = 0 has
nontrivial solutions
Since an eigenvector does not equal 0, (A - λI)x = 0 must have
nontrivial solutions
Subspace (H)
of a vector space is a collection of vectors that contains the zero vector and is closed under addition and scalar multiplication (the usual properties)
If there exists a one-one and onto linear transformation T: V --> W, then there exists a
one-one and onto linear transformation S: W --> V
The characteristic polynomial of a matrix A is
p(t) = det(A - tI), where t is a variable
Complex number
z = a + bi, where a = Re(z) is the real part of z and b = Im(z) is the imaginary part of z
Complex conjugate
z = a + bi, z̄ = a - bi ***switch the sign of the imaginary part***
Eigenvalue can be
zero
dim Col A
# of basic variables, i.e., # of pivots
rank(A) + dim(Null(A)) =
# of columns in A
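A small check of the rank theorem (the matrix below is an arbitrary example, not from the exam):

```python
import sympy as sp

# Arbitrary 3x4 example matrix
A = sp.Matrix([[1, 2, 0, 3],
               [2, 4, 1, 7],
               [0, 0, 1, 1]])

rank = A.rank()               # # of pivot positions
nullity = len(A.nullspace())  # dim Nul A = # of free variables
print(rank, nullity, A.cols)  # 2 2 4  -> rank + nullity = # of columns
```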
Largest rank is equal to the
maximum possible # of pivot positions, i.e., min(# of rows, # of columns)
Homogeneous equation
(A - λI)x = 0
A * A =
(entries of A)^2 only when A is a diagonal matrix
Eigenspace
***all eigenvectors for a given eigenvalue together with the zero vector*** Recall that solutions of homogeneous equations form a subspace. The eigenspace of A w.r.t. an eigenvalue λ is the null space of A - λI; we can find a basis for the eigenspace
Eigenvalue
***scaling factors after transformation*** A scalar λ such that Ax = λx has a solution for some nonzero vector x
Finding the Null Space
- Augment with a column of 0s - Row reduce - Find which columns are not pivot columns (these give the free variables) - Write the basic variables in terms of the free variables - Write the general solution as a combination of vectors, one per free variable - Those vectors are your null space basis vectors
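If you want to check a hand computation, SymPy follows the same recipe (the matrix below is a made-up example); nullspace() returns one basis vector per free variable.

```python
import sympy as sp

# Made-up example; find a basis for Nul A
A = sp.Matrix([[1, 3, 0, 2],
               [0, 0, 1, 4],
               [1, 3, 1, 6]])

# rref shows the pivot columns; the other columns give the free variables
R, pivots = A.rref()
print(pivots)              # (0, 2) -> x2 and x4 are free

# One basis vector per free variable, exactly as in the hand method
for v in A.nullspace():
    print(v.T)             # [-3, 1, 0, 0] and [-2, 0, -4, 1]
```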
Finding the Column Space
- Row reduce - See which columns are the pivot columns - Look at the original matrix: those columns are the basis for your column space
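Same made-up example matrix, now for Col A; note the basis comes from the ORIGINAL columns, not the reduced ones (columnspace() is a SymPy shortcut for the same recipe).

```python
import sympy as sp

A = sp.Matrix([[1, 3, 0, 2],
               [0, 0, 1, 4],
               [1, 3, 1, 6]])   # same made-up example as above

# Row reduce only to find WHICH columns are pivot columns...
_, pivots = A.rref()
# ...then take those columns of the original A as the basis
basis = [A.col(j) for j in pivots]
print([v.T for v in basis])

# Shortcut that does the same thing
print([v.T for v in A.columnspace()])
```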
There are two ways to look at a basis:
1. A minimal or most efficient spanning set 2. A largest possible linearly independent set for that subspace
Diagonalize the matrix if possible
1. Find the characteristic polynomial of A: p(t) = det(A - tI). 2. Find a basis for the eigenspace of each eigenvalue. 3. If the eigenvectors together form a basis of IR^n, let P have the eigenvectors as columns and D the corresponding eigenvalues on its diagonal; then A = PDP^-1. Otherwise A is not diagonalizable.
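A sketch of the whole procedure in SymPy (the matrix is an arbitrary example); diagonalize() returns the P and D from step 3.

```python
import sympy as sp

# Arbitrary example matrix
A = sp.Matrix([[4, 1],
               [2, 3]])

# Step 1: characteristic polynomial p(t)
t = sp.symbols('t')
print(A.charpoly(t).as_expr())   # t**2 - 7*t + 10

# Step 2: a basis for each eigenspace
for val, mult, vecs in A.eigenvects():
    print(val, mult, [v.T for v in vecs])

# Step 3: P = eigenvectors as columns, D = eigenvalues on the diagonal
P, D = A.diagonalize()
print(P * D * P.inv() == A)      # True -> A = PDP^-1
```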
Examples of Vector Spaces
1. IR^n with usual addition and scalar multiplication 2. Let V be the set of functions IR --> IR. Define f + g to be the function whose value at x is f(x) + g(x), and cf to be the function whose value at x is cf(x); 0 is the zero function. 3. The polynomials of degree less than or equal to n.
For any linear transformation T we have two important invariants:
1. Kernel of T --> collection of vectors v in V such that Tv = 0; this is a subspace of V. 2. Range of T --> collection of vectors w in W such that w = Tv for some vector v in V; this is a subspace of W.
3x3
at most 3 eigenvalue/eigenvector pairs
Suppose a 5x7 matrix has 4 pivot columns. What is the nullity of A?
7 - 4 = 3, so nullity A = 3
If A is a square matrix, then it is diagonalizable if it is similar to a diagonal matrix D. This means there exists an invertible matrix P such that
A = PDP^-1
0 is an eigenvalue if and only if
A is singular, i.e., if and only if det(A) = 0
The basis theorem means that
ANY set of n linearly independent vectors in V is a basis, and any set of n vectors spanning V is a basis.
Diagonalizing a matrix has the same advantage. Since
A^k = (PDP^-1)(PDP^-1)...(PDP^-1) = PD^kP^-1
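A quick check of this power formula (same made-up 2x2 example as before):

```python
import sympy as sp

A = sp.Matrix([[4, 1],
               [2, 3]])            # made-up example
P, D = A.diagonalize()

k = 6
# D**k just raises the diagonal entries to the k-th power,
# so P * D**k * P.inv() is a cheap way to get A**k
print(P * D**k * P.inv() == A**k)  # True
```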
Let A be an n x n matrix. A non-zero vector v in IR^n is said to be an eigenvector of A with eigenvalue λ if
Av = λv, or equivalently (A - λI)v = 0. Thus v is a nonzero solution to (A - λI)x = 0
Column Space
The column space of A is the span of its column vectors; it is the set of all b for which Ax = b is consistent
Basis for Col A are
the pivot columns of the original matrix A, not of the reduced echelon form
This means that x is an eigenvector of A with corresponding eigenvalue λ when
Ax = λx
If A is a 5x5 matrix and the system Ax = 0 has at least two different solutions, then the rank of A is at most 3
FALSE. Two different solutions only force nullity(A) ≥ 1, so the rank can be as large as 4.
The matrix A can have more than n eigenvalues.
FALSE. det(A - λI_n) = 0 is a polynomial equation of degree n, so we get at most n roots, hence at most n eigenvalues
If 0 is an eigenvalue of A, A is invertible.
False, because if 0 is an eigenvalue A can't be invertible
The matrix A and its transpose A^t have different sets of eigenvalues.
False, they have the same set of eigenvalues because det(A^t - λI) = det((A - λI)^t) = det(A - λI)
Basis Theorem
Let H be a p-dimensional subspace of Rn. Any linearly independent set of exactly p elements in H is automatically a basis for H. Also, any set of p elements of H that spans H is automatically a basis for H.
Spanning Set Theorem
Let S = {v1,...,vp} be a set in V, and let H = Span{v1,...,vp}. a) If one of the vectors in S--say, vk--is a linear combination of the remaining vectors in S, then the set formed from S by removing vk still spans H. b) If H doesn't equal {0}, some subset of S is a basis for H.
The Spanning Set Theorem allows us to make a basis by throwing out linearly dependent vectors.
Likewise, we can add linearly independent vectors to make a basis for a subspace.
Suppose a 4x7 matrix A has 4 pivot columns. Is Nul A = R^3?
No, Nul A is not equal to R^3 because Nul A is a subspace of R^7 (dim Nul A = 3, but its vectors have 7 entries)
A is a 3x3 matrix with two eigenvalues. Each eigenspace is one dimensional. Is A diagonalizable?
No; the sum of the dimensions of the eigenspaces equals 2, but the matrix has 3 columns. For A to be diagonalizable, the sum of the dimensions of the eigenspaces must equal the number of columns.
What does it mean for a set to be a basis of some subspace H of a vector space?
The set is linearly independent, and the subspace spanned by the set must coincide with H: H = Span{b1, b2, ..., bp}
Null Space
Set of all solutions to Ax = 0
Let V and W be two (abstract) vector spaces. V and W are said to be isomorphic if there exists a linear transformation
T: V --> W which is both one-one and onto
Linear Transformation
T: V --> W is a function which assigns to every vector v in V a vector Tv in W such that a) T(v + w) = Tv + Tw b) T(cv) = cTv
If A is a 12x10 matrix what is the largest possible rank of A? If A is a 10x12 matrix what is the largest possible rank of A?
The rank of A is equal to the # of pivot positions in A. There are only 10 columns in a 12x10 matrix and only 10 rows in a 10x12 matrix, so there can be at most 10 pivot positions for either matrix. So the largest rank is 10.
Why is the characteristic polynomial useful?
The roots of the characteristic polynomial are the eigenvalues of the matrix. The degree of the characteristic polynomial is n, hence there are at most n eigenvalues. It is not necessary that there are n distinct eigenvalues; for example, the identity matrix [1 0; 0 1] has 1 as its only eigenvalue
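To see the repeated-eigenvalue example concretely (note SymPy's charpoly uses det(tI - A), which agrees with det(A - tI) up to sign):

```python
import sympy as sp

I2 = sp.eye(2)                    # the identity matrix from the card
t = sp.symbols('t')
print(I2.charpoly(t).as_expr())   # t**2 - 2*t + 1 = (t - 1)**2, degree 2
print(I2.eigenvals())             # {1: 2} -- one eigenvalue, multiplicity 2
```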
Finding a basis for Nul A is easy
The standard way of solving Ax = 0 always gives us a basis for Nul A
If A is a 4x4 matrix and the system Ax = 0 has at least two different solutions, then the nullity of A is at least 1
True
A homogeneous linear system with fewer equations than unknowns must always have infinitely many solutions.
True! Since rank(A) ≤ # of equations < # of unknowns, dim(Nul(A)) > 0 by the rank theorem
If 0 is an eigenvalue of A, then nullity(A) > 0.
True. Since 0 is an eigenvalue, there exists a nonzero vector x such that Ax = 0, and thus nullity(A) > 0.
How to determine a basis for the row space?
If two matrices A and B are row equivalent, they have the same row space. If B is an echelon form of A, then a basis for the row space of B (hence of A) is given by the non-zero rows of B.
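A short check with the same made-up matrix used earlier: the nonzero rows of the rref give a basis for Row A, and Col(A^T) gives the same answer.

```python
import sympy as sp

A = sp.Matrix([[1, 3, 0, 2],
               [0, 0, 1, 4],
               [1, 3, 1, 6]])    # same made-up example as before

# The nonzero rows of an echelon form of A are a basis for Row A;
# after rref these are exactly the first rank(A) rows
R, _ = A.rref()
print([R.row(i) for i in range(A.rank())])

# Equivalently, Row A = Col(A^T)
print([v.T for v in A.T.columnspace()])
```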
Why is diagonalization useful?
When we multiply matrices, for example A*A, the entries are not simply powers of the entries of A. But this is true when A is a diagonal matrix.
Suppose a 4x7 matrix A has 4 pivot columns. Is Col A = R^4?
Yes; Col A is a subspace of R^4 (A has 4 rows), and dim Col A = 4 because A has 4 pivot columns, so Col A = R^4
Eigenvector CANNOT be a
ZERO VECTOR
What does it mean if A is invertible?
Zero is not an eigenvalue of A, so (when A = PDP^-1) the diagonal entries in D are not zero, so D is invertible
Identify a nonzero 2x2 matrix that is invertible but not diagonalizable.
[1 1; 0 1] (invertible since det = 1; its only eigenvalue is 1 with algebraic multiplicity 2, but the eigenspace is one dimensional, so it is not diagonalizable)
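SymPy agrees with this card (just a verification, not an exam technique):

```python
import sympy as sp

N = sp.Matrix([[1, 1],
               [0, 1]])

print(N.det())                # 1 -> invertible
print(N.eigenvects())         # eigenvalue 1, multiplicity 2, but only ONE eigenvector
print(N.is_diagonalizable())  # False
```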
Diagonalization
a matrix A is said to be diagonalizable if A can be written as PDP^-1, where D --> diagonal matrix and P --> invertible matrix
An indexed set of vectors in V is said to be linearly independent if
c1v1 + c2v2 + ... + cpvp = 0 has only the trivial solution.
{v1, v2, ..., vp} in V is linearly independent if
c1v1 + c2v2 + ... + cpvp = 0 has only the trivial solution c1 = c2 = ... = cp = 0
If the nullity of a 7x10 matrix is 5, what are the dimensions of the column and row space of A?
dim Col A = 5 dim Row A = 5
If H is a subspace of V then
dim(H) ≤ dim(V)
The # of basis vectors is called the
dimension of the vector space.
A is diagonalizable if it has
n distinct eigenvalues (for an n x n matrix A); this is sufficient but not necessary
If a vector space V has a basis of n vectors, then
every basis of V must consist of exactly n vectors
A linear system with three equations and ten unknowns is always consistent
FALSE. Fewer equations than unknowns (m < n) does not guarantee consistency; a non-homogeneous system can still be inconsistent.
Vector spaces can be
finite dimensional or infinite dimensional.
For a vector space V its dimension is the # of vectors in one (and hence every) basis of V. If the number of vectors in the basis is infinite,
it is said to be infinite dimensional.
D(P^-1x)
the diagonal matrix D lengthens or shortens whatever follows; it doesn't rotate
If a vector space has a basis B = {b1, ..., bn}, then any set in V containing more than n vectors must be
linearly dependent
All eigenvectors corresponding to distinct eigenvalues are
linearly independent
If a vector space V has dimension n, then every collection of n vectors in V is
linearly independent if and only if they span V
The homogeneous equation Ax = 0 has only the trivial solution if and only if the equation has no free variables, i.e., the columns of A are
linearly independent
If eigenvalues are distinct, then the matrix is always diagonalizable, but
it might still be diagonalizable if eigenvalues are repeated
Row Space
The row space of a matrix A is defined to be Col(A^t). If A is an m x n matrix, the row space is a subspace of IR^n
m
rows
λ
scalar that lengthens or shortens the eigenvector
ℂ
set of all complex numbers
IR
set of all real #
An indexed set of 2 or more vectors w/ v1 ≠ 0 is linearly dependent if and only if
some vj (j > 1) is a linear combination of the preceding vectors
Existence and Uniqueness Theorem
states that a linear system is consistent (has a solution) if and only if the rightmost column of the augmented matrix is not a pivot column. That is, a linear system is consistent if and only if an echelon form of the augmented matrix has no row of the form [0 ... 0 b] where b is nonzero
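One way to check this on a machine (the augmented matrix below is a made-up inconsistent example): row reduce [A | b] and see whether the last column is a pivot column.

```python
import sympy as sp

# Made-up augmented matrix [A | b] for an inconsistent system
Ab = sp.Matrix([[1, 2, 3],
                [2, 4, 7]])

R, pivots = Ab.rref()
print(R)                        # last row is [0, 0, 1]
# Consistent iff the rightmost column is NOT a pivot column
print((Ab.cols - 1) in pivots)  # True here -> inconsistent
```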
A is diagonalizable if and only if eigenvectors of A form a basis for IR^n if and only if the
sum of the dimensions of the eigenspaces is n. In this case, P has the eigenvectors of A as its columns, and the diagonal entries of D are the eigenvalues of A
Scalar multiplication (by a nonzero scalar) does not change
the degree of the polynomial
trace(ABC) =trace(BCA) =
trace(CAB)
A homogeneous linear system with five equations and five unknowns is always consistent
TRUE. Ax = 0 always has the trivial solution x = 0, so it is consistent