MTH 220 - Linear Algebra
(A^t)^-1 is equivalent to...
(A^-1)^t
(AB)^-1 is also equivalent to...
(B^-1)(A^-1)
det(A^4) is equivalent to...
(det(A))^4
Ways to find the determinant
- cofactor expansion
- LU/LDU factorization (then take the product of the diagonal)
- row reduction
- use your calculator
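A quick sketch of two of these routes in Python (NumPy and SciPy assumed to be available; the 3x3 matrix below is just an arbitrary example):

```python
import numpy as np
from scipy.linalg import lu

# Arbitrary example matrix
A = np.array([[2., 1., 1.],
              [4., -6., 0.],
              [-2., 7., 2.]])

# Route 1: let the library compute the determinant directly
print(np.linalg.det(A))                        # -16.0 (up to rounding)

# Route 2: LU factorization, then product of the diagonal of U
# (scipy returns A = P @ L @ U, so multiply by det(P) = +/-1 to fix the sign)
P, L, U = lu(A)
print(np.linalg.det(P) * np.prod(np.diag(U)))  # -16.0 (up to rounding)
```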
Ways to check if vectors are Linearly Independent:
- the only solution to the system AX = 0 is the trivial solution
- det(A) ≠ 0
- the vectors are *not* parallel (this test only applies to two vectors)
- A is invertible
If the determinant of a 4x4 matrix A is det(A)=-3, and the matrix C is obtained from A by swapping the second and fourth columns, then det(C)=
det(C) = -det(A) = -(-3) = 3
A is invertible if...
1.) AX = b has a unique solution for any vector b
2.) AX = 0 has only the trivial solution
3.) det(A) ≠ 0
How to find the inverse of 2x2 matrices
1.) Find the determinant: ad - bc
2.) Swap the positions of a & d
3.) Leave b & c where they are but *flip* their signs
4.) Multiply the modified matrix by 1/(ad - bc)
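A minimal sketch of this recipe in Python/NumPy (the function name and example matrix are just for illustration):

```python
import numpy as np

def inverse_2x2(M):
    """Invert a 2x2 matrix [[a, b], [c, d]] by the ad - bc recipe."""
    a, b = M[0]
    c, d = M[1]
    det = a * d - b * c
    if det == 0:
        raise ValueError("matrix is singular")
    # swap a and d, flip the signs of b and c, then scale by 1/det
    return (1 / det) * np.array([[d, -b],
                                 [-c, a]])

M = np.array([[4., 7.],
              [2., 6.]])
print(inverse_2x2(M))     # [[ 0.6 -0.7] [-0.2  0.4]]
print(np.linalg.inv(M))   # same result
```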
How to find the inverse of A
Option 1:
~ Row-reduce matrix A to RREF, while doing the same row operations on the identity matrix I
~ When matrix A reaches its RREF (the identity), I has been transformed into the inverse of A
~ If the RREF of A is not the identity (A cannot be reduced all the way to I), then A does not have an inverse!
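A small SymPy sketch of this option (the matrix here is an arbitrary invertible example):

```python
from sympy import Matrix, eye

A = Matrix([[1, 2],
            [3, 4]])

# Augment A with the identity and row-reduce; the right block becomes A^-1
augmented = Matrix.hstack(A, eye(A.rows))
rref_form, pivots = augmented.rref()

left, right = rref_form[:, :A.cols], rref_form[:, A.cols:]
if left == eye(A.rows):
    print(right)                    # Matrix([[-2, 1], [3/2, -1/2]])
else:
    print("A is not invertible")    # A could not be reduced all the way to I
print(A.inv())                      # same answer, for comparison
```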
How do we find the ker(A)
1.) RREF the matrix
2.) Determine the pivots and free variables
3.) (Write in parametric form, then) write in vector form: ker(A) = span {vectors}
- The kernel is the set of solutions of the corresponding homogeneous linear system AX = 0
- The kernel is also called the "null space"
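A SymPy sketch of these steps (the example matrix is arbitrary and chosen to have a nontrivial kernel):

```python
from sympy import Matrix

A = Matrix([[1, 2, 3],
            [2, 4, 6],
            [1, 1, 1]])

rref_form, pivot_cols = A.rref()
print(rref_form)       # RREF of A
print(pivot_cols)      # (0, 1) -> x1, x2 are pivot variables, x3 is free
print(A.nullspace())   # [Matrix([[1], [-2], [1]])] -> ker(A) = span{(1, -2, 1)}
```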
Let M be an invertible 3x3 matrix. The following sequence of row operations turns M into row-reduced echelon form.......
1.) Perform the same row operations on I; the result will be M^-1
2.) Now find the inverse of M^-1; the result will be M
3.) You can then compute the determinant of M
Find Im(A)
1.) Take the transpose of A
2.) RREF
3.) Take the nonzero rows as your vectors: Im(A) = span {those row vectors}
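The same example matrix as above, run through this transpose-and-RREF recipe in SymPy (columnspace() is shown only as a cross-check):

```python
from sympy import Matrix

A = Matrix([[1, 2, 3],
            [2, 4, 6],
            [1, 1, 1]])

# RREF the transpose; the nonzero rows span the column space (image) of A
rref_T, _ = A.T.rref()
basis = [row for row in rref_T.tolist() if any(x != 0 for x in row)]
print(basis)            # [[1, 2, 0], [0, 0, 1]] -> Im(A) = span of these vectors in R^3

# Cross-check: a basis made of actual columns of A
print(A.columnspace())  # [Matrix([[1], [2], [1]]), Matrix([[2], [4], [1]])]
```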
det(A^-1) =
1/det(A)
If the determinant of a 3x3 matrix A is det(A)=3, and the matrix B is obtained from A by multiplying the first column by 9, then det(B)=
9 · det(A) = 9(3) = 27
(1/det(A)) times (Adj(A)) =
A^-1
True or false: (A+(A^-1))^9 = A^9 + (A^-9)
False
True or false: (A+B)^2 = (A^2)+(B^2)+2AB
False
True or false: A+B is invertible
False
True or false: AB(A^-1)=B
False
True or false: If a subspace V of Rn contains none of the standard vectors E1, E2, ..., En, then V consists of the zero vector only.
False
True or false: The columns of a 4x5 matrix are linearly independent.
False! If m < n (more columns than rows), then the columns must be linearly dependent.
True or false: The image of a 3x4 matrix is a subspace of R4?
False, it is a subspace of R3
True or false: If vectors V1, V2, ..., Vn span R4 then n must be equal to 4.
False. It could be 4 or larger than 4 (they are allowed to be linearly dependent)
True or false: The column vectors of a 5x4 matrix must be linearly dependent.
False: 5 rows means the columns live in R^5. We are given 4 vectors in R^5, so they do not have to be linearly dependent.
How to solve a system with LU factorization (matrix A has already been broken up into LU)
L(UX) = b, where we will say UX = Y.
First solve LY = b: write the rows out as equations and plug in each previous y term to determine the next one (forward substitution); this gives us the vector Y.
Since UX = Y, we now solve for X the same way, working from the bottom row up (back substitution).
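A sketch of the two substitution passes in Python/NumPy (L, U, and b here are made-up values, with L unit lower triangular and U upper triangular):

```python
import numpy as np

L = np.array([[1., 0., 0.],
              [2., 1., 0.],
              [-1., 3., 1.]])
U = np.array([[2., 1., 1.],
              [0., -8., -2.],
              [0., 0., 9.]])
b = np.array([4., 10., -5.])
n = len(b)

# Step 1: solve L Y = b from the top down (forward substitution)
y = np.zeros(n)
for i in range(n):
    y[i] = b[i] - L[i, :i] @ y[:i]

# Step 2: solve U X = Y from the bottom up (back substitution)
x = np.zeros(n)
for i in range(n - 1, -1, -1):
    x[i] = (y[i] - U[i, i+1:] @ x[i+1:]) / U[i, i]

print(x)
print(np.linalg.solve(L @ U, b))   # same answer
```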
Size of LU factorization
Let's say matrix A is size mxn (m rows by n columns):
L: mxm (square)
U: mxn (same size as A)
Minors and Cofactors
Minor: for the entry in position (i, j), delete row i and column j and take the determinant of what is left; that number is the minor.
Cofactor: take the minor you found and attach the sign (-1)^(i+j), determined by the position (i, j).
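A short Python/NumPy sketch of both definitions, reusing the 3x3 matrix from the kernel example below (the function names are just made up):

```python
import numpy as np

def minor(A, i, j):
    """Determinant of A with row i and column j deleted: the (i, j) minor."""
    sub = np.delete(np.delete(A, i, axis=0), j, axis=1)
    return np.linalg.det(sub)

def cofactor(A, i, j):
    """Attach the checkerboard sign (-1)^(i+j) to the minor."""
    return (-1) ** (i + j) * minor(A, i, j)

A = np.array([[1., 3., 5.],
              [1., 2., 2.],
              [1., 1., -1.]])
print(minor(A, 0, 1))     # about -3.0
print(cofactor(A, 0, 1))  # about  3.0
```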
Solve for X: AX = b
Multiply both sides by the inverse of A:
(A^-1)AX = (A^-1)b
X = (A^-1)b
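In Python/NumPy this is a one-liner (the 2x2 system is an arbitrary example; np.linalg.solve is the usual way to do it in practice without forming the inverse):

```python
import numpy as np

A = np.array([[2., 1.],
              [1., 3.]])
b = np.array([5., 10.])

x = np.linalg.inv(A) @ b       # X = (A^-1) b, exactly as above
print(x)                       # [1. 3.]
print(np.linalg.solve(A, b))   # same answer
```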
Example of finding ker(A) using cofactor expansion.
A =
[ 1  3  5 ]
[ 1  2  2 ]
[ 1  1 -1 ]
No two rows are proportional, so we have to compute the determinant. We can expand along any row. Using the first row, we get det(A) = 1(−4) − 3(−3) + 5(−1) = 0. So ker(A) is the line through the vector of first-row cofactors (−4, −(−3), −1) = (−4, 3, −1).
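A quick numerical check of this example (NumPy assumed):

```python
import numpy as np

A = np.array([[1., 3., 5.],
              [1., 2., 2.],
              [1., 1., -1.]])
v = np.array([-4., 3., -1.])   # the vector of first-row cofactors

print(np.linalg.det(A))        # about 0
print(A @ v)                   # [0. 0. 0.] -> v really is in ker(A)
```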
The rank-nullity theorem
Null(A) + Rank(A) = # of columns of A
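A SymPy check of the theorem on the same example matrix used above (an arbitrary choice):

```python
from sympy import Matrix

A = Matrix([[1, 2, 3],
            [2, 4, 6],
            [1, 1, 1]])             # 3 columns

rank = A.rank()                     # dim(Im(A)) = 2
nullity = len(A.nullspace())        # dim(ker(A)) = 1
print(rank + nullity == A.cols)     # True: 2 + 1 = 3
```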
Rules of row reduction:
Our goal is to get an upper triangular matrix.
1.) Row exchange --> changes the sign of the determinant
2.) cRi + Rj (adding a multiple of one row to another) --> no change in the determinant
3.) Multiplying a single row or a single column of A by a number c --> multiplies the determinant by c
4.) If all n rows (or all n columns) of A are multiplied by c --> multiplies the determinant by c^n
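A SymPy sketch illustrating rules 1-3 on an arbitrary 3x3 example:

```python
from sympy import Matrix

A = Matrix([[2, 1, 1],
            [4, -6, 0],
            [-2, 7, 2]])
print(A.det())       # -16

# 1.) swapping two rows flips the sign of the determinant
B = A.copy()
B.row_swap(0, 1)
print(B.det())       # 16

# 2.) adding a multiple of one row to another row changes nothing
C = A.copy()
C.row_op(2, lambda entry, j: entry + 3 * A[0, j])
print(C.det())       # -16

# 3.) scaling a single row by a number c multiplies the determinant by c
D = A.copy()
D.row_op(0, lambda entry, j: 5 * entry)
print(D.det())       # -80
```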
The inverse of (A^-1)
The original matrix A
If we are given three separate vectors, how do we find out if the vectors are linearly independent?
The vectors become the columns of a matrix; then check the determinant. If det(A) ≠ 0, then the vectors/columns are linearly independent.
Property of Inverse: If A is a diagonal matrix then A is invertible if and only if...
all of the entries on the diagonal are nonzero
Find k such that the following matrix M is singular.
This means we want det(M) = 0. So perform the normal steps to find the determinant (such as cofactor expansion), and just solve for k.
True or false: (AB(A^-1))^4 = AB^4 (A^-1)
True
True or false: A^7 is invertible
True (assuming A is invertible; then (A^-1)^7 is its inverse)
True or false: If vectors v1, v2, v3, and v4 and Linearly independent then v1, v2, and v3 are linearly independent
True
True or false: if three vectors in R3 lie in the *same plane* in R3 then they are linearly dependent
True
True or false: If A and B are nxn matrices, and vector V is in the kernel of both A and B, then V must be in the kernel of matrix AB as well.
True! Since BV = 0, we have (AB)V = A(BV) = A(0) = 0, so V is in the kernel of AB as well.
True or false: if x and y are linearly independent, and if {x, y, z} is linearly dependent, then z is in span {x, y}
True!! Since {x, y, z} is linearly dependent, some nontrivial combination ax + by + cz = 0. Here c ≠ 0 (otherwise x and y would be dependent), so z = -(a/c)x - (b/c)y is a linear combination of x and y, i.e. z is in span{x, y}.
True or false: If 2 U + 3 V + 4 W = 5U + 6 V + 7 W, then vectors U, V, W must be linearly *dependent*.
True. Subtracting one side from the other gives 3U + 3V + 3W = 0, a nontrivial relation among U, V, W.
True or false: If V1, V2, ..., Vn and W1, W2, ..., Wm are any two bases of a subspace V of R10, then n must equal m.
True. Any two bases of the *same vector space* have the *same number of vectors*.
How to get LU factorization
Reduce A to an upper triangular matrix U, *BUT* you can't swap rows or multiply a row by a scalar; you can only do cRi + Rj. Whenever you zero out an entry this way, put -c in the corresponding location of your "reference matrix" (this, with ones on the diagonal, gives us L).
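A sketch of this procedure in Python/NumPy (no pivoting, so it assumes no zero ever shows up on the diagonal; the function name and example matrix are made up):

```python
import numpy as np

def lu_no_pivot(A):
    """Reduce A to upper triangular U using cRi + Rj only,
    recording each multiplier in L as the entry gets zeroed out."""
    n = A.shape[0]
    U = A.astype(float).copy()
    L = np.eye(n)
    for col in range(n - 1):
        for row in range(col + 1, n):
            m = U[row, col] / U[col, col]   # multiplier used to clear U[row, col]
            U[row, :] -= m * U[col, :]      # the row operation (-m)Rcol + Rrow
            L[row, col] = m                 # store it in the "reference matrix" L
    return L, U

A = np.array([[2., 1., 1.],
              [4., -6., 0.],
              [-2., 7., 2.]])
L, U = lu_no_pivot(A)
print(L)                        # unit lower triangular
print(U)                        # upper triangular
print(np.allclose(L @ U, A))    # True
```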
the general idea of LU factorization
We want to write matrix A as the product of a lower and an upper triangular matrix (A = LU), where the *lower* matrix has ones along the diagonal.
L:              U:
[ 1  0  0 ]     [ d  e  f ]
[ a  1  0 ]     [ 0  g  h ]
[ b  c  1 ]     [ 0  0  k ]
The rank plus the nullity is the number of...
columns! (for an mxn matrix, columns = n)
det(A^t) =
det(A)
det(2A) =
2^n · det(A) (where A is nxn)
ker(A) = 0 exactly when
det(A) ≠ 0
det (AB) =
det(A)det(B)
Rank of A or Rank(A)
dim(Im(A)); it measures the size of the image (range) of A
Nullity of A or Null(A)
dim(ker(A)); it measures the size of the kernel, which lives in the domain
Linearly Independent
if none of the vectors is a linear combination of the others (equivalently, the only solution to c1V1 + ... + cnVn = 0 is the trivial one)
A square matrix is called a permutation matrix if...
it contains the entry 1 exactly once in each row and in each column, with all other entries being 0 (Think of it kind of like the Identity, but it can be all scrambled)
Something is a free variable if ....
it corresponds to a column *without* a pivot
Ex. after row reduction:
[ 1 -2  2  3 -1 ]
[ 0  0  1  2 -2 ]
Our pivot variables = x1, x3
So our free variables = x2, x4, and x5
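A SymPy check of this example (rref() reports the pivot columns directly):

```python
from sympy import Matrix

A = Matrix([[1, -2, 2, 3, -1],
            [0, 0, 1, 2, -2]])

rref_form, pivot_cols = A.rref()
print(pivot_cols)                          # (0, 2) -> pivot variables x1, x3
free_cols = [j for j in range(A.cols) if j not in pivot_cols]
print([f"x{j + 1}" for j in free_cols])    # ['x2', 'x4', 'x5']
```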
Size of a matrix mxn
m = rows n = columns mxn = rows by columns
Nontrivial solutions are also called...
non-zero solutions
We know that the vectors are linearly dependent if and only if the matrix has ...
a column without a pivot
Transpose of the cofactor matrix of A= C^t =
the adjoint Adj(A)
What allows us to determine the dimension of the kernel of A? dim(ker(A))
the number of free variables
What allows us to determine the dimension of the Image of A? dim(Im(A))
the number of pivots
Definition of kernel of A
the set of all vectors that are mapped to zero
Determinant: If two rows or two columns of A are identical or if A has a row or a column of zeroes
then det(A) = 0
to find the Image of a matrix, Im(A), then....
we only need to find the *linearly independent* *column vectors* of the matrix of the transformation.
quadratic formula
x = [ -b ± sqrt(b^2 - 4ac) ] / (2a)
The trivial solution has __________ as an answer
zero (every variable equals zero)