Linear Algebra Exam 1
What is Codomain
"Space" (i.e. R2, R3...) where output vectors exist in
Given matrices A and B, what must be true for AB to exist?
# of columns in A must = # of rows in B
Given T: Rm -> Rn defined by T(x) = Ax, Then T is One to One if:
1) Check if m > n; if so, T cannot be one to one 2) Otherwise, T is one to one if the columns of A are linearly independent (i.e. there is a pivot in every column after reduction)
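A minimal numpy sketch of this check (the function name is mine, for illustration): a pivot in every column is equivalent to rank(A) equaling the number of columns.

```python
import numpy as np

def is_one_to_one(A):
    """T(x) = Ax is one-to-one iff A has a pivot in every column,
    i.e. rank(A) == number of columns (m)."""
    n_rows, n_cols = A.shape
    if n_cols > n_rows:      # m > n: cannot be one-to-one
        return False
    return np.linalg.matrix_rank(A) == n_cols

A = np.array([[1.0, 0.0], [0.0, 1.0], [2.0, 3.0]])  # 3x2
print(is_one_to_one(A))  # True: pivot in both columns
```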
Triangular System Definition
1) Each variable is the pivot variable of exactly one equation 2) Same # of equations as variables 3) Exactly one solution
What makes matrix A invertible
1) It must be square 2) It's invertible if there exists another matrix B, of the same dimensions, such that AB = the identity matrix. Note: A is invertible precisely when the transformation T(x) = Ax is invertible. If AB = identity, then BA = identity as well
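A quick numeric illustration of both halves of the note (a numpy sketch; the example matrix is mine): computing an inverse and confirming that AB = I forces BA = I for square matrices.

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 1.0]])  # square, determinant = 1 != 0
B = np.linalg.inv(A)                    # the matrix B with AB = I

print(np.allclose(A @ B, np.eye(2)))    # True: AB is the identity
print(np.allclose(B @ A, np.eye(2)))    # True: BA is the identity too
```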
Given T: Rm -> Rn defined by T(x) = Ax, Then T is Onto if:
1) Check if m < n; if so, T cannot be onto 2) Otherwise, T is onto if the columns of A span Rn (i.e. once reduced, A has a pivot in every row)
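The mirror-image check, again as a numpy sketch with an illustrative function name: a pivot in every row is equivalent to rank(A) equaling the number of rows.

```python
import numpy as np

def is_onto(A):
    """T(x) = Ax is onto R^n iff A has a pivot in every row,
    i.e. rank(A) == number of rows (n)."""
    n_rows, n_cols = A.shape
    if n_cols < n_rows:      # m < n: cannot be onto
        return False
    return np.linalg.matrix_rank(A) == n_rows

A = np.array([[1.0, 0.0, 2.0], [0.0, 1.0, 3.0]])  # 2x3
print(is_onto(A))  # True: pivot in both rows
```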
Given Ax = b, if A is 3x4 matrix what will b look like
3x1 vector.
Powers of Matrix: If A is a square matrix, then A^k =
A * A * ... * A (k times)
Singular/nonsingular matrix
A square matrix that is invertible is nonsingular. If it doesn't have an inverse, it's singular
Shortcut to find inverse of matrix: Given 2x2 matrix A = [a b; c d]
A^-1 = 1/(ad - bc) * [d -b; -c a] (valid only when ad - bc ≠ 0)
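The shortcut translated directly into code (a sketch; the example matrix is mine):

```python
import numpy as np

def inverse_2x2(a, b, c, d):
    """Inverse of [a b; c d] via the 1/(ad - bc) shortcut."""
    det = a * d - b * c
    if det == 0:
        raise ValueError("ad - bc = 0: matrix is singular, no inverse")
    return (1.0 / det) * np.array([[d, -b], [-c, a]])

print(inverse_2x2(1, 2, 3, 4))
# [[-2.   1. ]
#  [ 1.5 -0.5]]
```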
How to determine if set of vectors is linearly independent?
Create an augmented matrix whose columns are the vectors, with the farthest right column set to all 0s. Row reduce: if the only solution is the trivial one (a pivot in every column of the coefficient part), the vectors are linearly independent.
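The same test carried out with sympy's rref (a sketch; the example vectors are mine, chosen so that v3 = v1 + v2):

```python
from sympy import Matrix

# Columns of the matrix are the vectors being tested; a pivot in every
# column means only the trivial solution exists, i.e. independence.
vectors = Matrix([[1, 0, 1],
                  [0, 1, 1],
                  [0, 0, 0]])
_, pivot_cols = vectors.rref()
print(len(pivot_cols) == vectors.cols)  # False: v3 = v1 + v2, dependent
```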
Two types of matrix who's form are unchanged when taken to a power
Diagonal matrix and triangular matrix. For diagonal: raise each diagonal entry to the power individually. So if A is diagonal and its first entry is 2, the first entry of A^3 is 8, and the same pattern holds for the other entries. For triangular: no matter what k is, A^k remains the same type of triangular (upper stays upper, lower stays lower)
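A two-line numpy demonstration of the diagonal case (example values mine):

```python
import numpy as np

D = np.diag([2.0, 3.0])
print(np.linalg.matrix_power(D, 3))  # diag(8, 27): each entry cubed
print(np.diag(np.diag(D) ** 3))      # same result, computed entrywise
```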
Echelon form
Each pivot has only zeros below it, and each pivot sits to the right of the pivot in the row above
What do triangular matrices look like
For Upper: all entries below the diagonal are 0; entries on and above the diagonal may be nonzero. For Lower: all entries above the diagonal are 0; entries on and below the diagonal may be nonzero
One to one / Onto?
Given a linear transformation T: Rm -> Rn, T is: One to one if for every output vector there exists at most one input vector such that T(input) = output. Onto if for every output vector there exists at least one input vector such that T(input) = output
Homogeneous Equation
Has form: a1x1 + a2x2 + ... = 0. Always consistent, since the trivial solution (each variable = 0) works
Gaussian Elimination
The systematic procedure of using row operations to create zeros below each pivot, column by column, converting a matrix to echelon form (see the sketch below)
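A sketch of forward elimination in numpy (with partial pivoting added for numerical stability; the example matrix is my own):

```python
import numpy as np

def to_echelon(A):
    """Gaussian elimination: returns an echelon form of A
    (zeros below every pivot)."""
    A = A.astype(float).copy()
    rows, cols = A.shape
    pivot_row = 0
    for col in range(cols):
        if pivot_row >= rows:
            break
        # choose the largest entry in the column as pivot (stability)
        best = pivot_row + np.argmax(np.abs(A[pivot_row:, col]))
        if np.isclose(A[best, col], 0.0):
            continue                                 # no pivot here
        A[[pivot_row, best]] = A[[best, pivot_row]]  # row swap
        for r in range(pivot_row + 1, rows):
            A[r] -= (A[r, col] / A[pivot_row, col]) * A[pivot_row]
        pivot_row += 1
    return A

print(to_echelon(np.array([[2, 1, -1], [4, 3, 1], [-2, 1, 7]])))
```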
Proving linear transformation through matrix:
If A is an n x m matrix, and T(x) = Ax, then T: Rm -> Rn is a linear transformation
What makes a transformation Linear: Given T: Rm -> Rn
If for all possible vectors u & v and all scalars r: T(u + v) = T(u) + T(v) AND T(r*u) = r * T(u)
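The two conditions can be spot-checked numerically for T(x) = Ax (a sanity check on random vectors, not a proof; the setup is mine):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 2))
T = lambda x: A @ x

u, v = rng.standard_normal(2), rng.standard_normal(2)
r = 2.5
print(np.allclose(T(u + v), T(u) + T(v)))  # additivity holds
print(np.allclose(T(r * u), r * T(u)))     # homogeneity holds
```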
What makes a linear transformation invertible?
If it is both 1 to 1 and onto.
Reduced Row Echelon Form (RREF)
In echelon form, and: 1) Every pivot is a 1 2) 0s above and below each pivot
Inverse Linear Transformation
The inverse of an invertible (one-to-one and onto) linear transformation T takes the output of T as its input and returns the input of T as its output
Jacobi Iteration v Gauss Seidel Iteration
Jacobi is the more basic method; Gauss-Seidel is a refinement of it.
Jacobi:
Step 1) Solve each row of the system for a single variable, so it looks like:
x1 = 1 + x2
x2 = 3 + x3
x3 = 5 + x1
Step 2) For the initial guess, set all the variables on the right-hand side to 0:
x1 = 1 + 0 = 1, x2 = 3 + 0 = 3, x3 = 5 + 0 = 5
Step 3) Plug the values just solved for back into the right-hand sides:
x1 = 1 + 3 = 4, x2 = 3 + 5 = 8, x3 = 5 + 1 = 6
Step 4) Plug these new values back into the original equations and keep iterating. If the matrix is diagonally dominant, the iterates converge to the solution.
Gauss-Seidel: same setup, but each newly computed value is used immediately within the same pass (e.g. the updated x1 is plugged into the equation for x2 right away) instead of waiting for the full pass to finish. See the sketch below.
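A compact numpy sketch of both iterations (the 2x2 system is my own diagonally dominant example, not the one from the card):

```python
import numpy as np

def jacobi(A, b, iterations=25):
    """Every component of the new guess comes from the *previous*
    guess; converges when A is diagonally dominant."""
    D = np.diag(A)              # diagonal coefficients
    R = A - np.diagflat(D)      # off-diagonal part
    x = np.zeros_like(b)
    for _ in range(iterations):
        x = (b - R @ x) / D     # whole vector updated at once
    return x

def gauss_seidel(A, b, iterations=25):
    """Same idea, but each new component is used immediately."""
    n = len(b)
    x = np.zeros(n)
    for _ in range(iterations):
        for i in range(n):
            s = A[i] @ x - A[i, i] * x[i]  # uses already-updated entries
            x[i] = (b[i] - s) / A[i, i]
    return x

A = np.array([[4.0, 1.0], [2.0, 5.0]])  # diagonally dominant
b = np.array([9.0, 13.0])
print(jacobi(A, b))        # ~[1.778, 1.889]
print(gauss_seidel(A, b))  # same solution, typically in fewer passes
```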
Diagonally Dominant
A matrix is diagonally dominant if, in every row, the absolute value of the entry on the diagonal is >= the sum of the absolute values of the off-diagonal entries in that row
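The row-by-row check written out in numpy (a sketch; the function name and test matrices are mine):

```python
import numpy as np

def is_diagonally_dominant(A):
    """|diagonal entry| >= sum of |off-diagonal entries| in every row."""
    diag = np.abs(np.diag(A))
    off_diag = np.sum(np.abs(A), axis=1) - diag
    return bool(np.all(diag >= off_diag))

print(is_diagonally_dominant(np.array([[4, 1], [2, 5]])))  # True
print(is_diagonally_dominant(np.array([[1, 3], [2, 5]])))  # False
```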
If the bottom row is all 0s and all the rows above have pivots, is the matrix onto / does it span Rn?
No. The bottom row being all zeros means there is no pivot in that row, so the columns do not span Rn
What is domain?
Possible inputs of a linear transformation (i.e. R2, R3...)
What is range (in context of linear transformation)
Possible outputs of the linear transformation; a subset of the codomain. It is the set of all images T(u) of vectors u in Rm under T
Key theorem: Given a matrix of m vectors in echelon form, how to determine whether the matrix (and the vectors that make it up) spans R^n? And whether the vectors are linearly independent? Note that n here is the number of rows of the matrix
Span: if m >= n and every row has a pivot, the vectors span Rn. Independence: if m <= n and every column has a pivot, the vectors are linearly independent. NOTE: this means that for the vectors to both span and be independent, m must = n (rows must = columns); in that square case, if either property holds then both must hold
Identity Matrix
Square matrix (m x m) with 1s along the diagonal and 0s everywhere else. Any matrix A with m columns multiplied by the identity matrix returns A (A * I = A)
What is a transformation (not necessarily linear)
T: Rm -> Rn. A function which takes vectors in Rm as input and produces vectors in Rn as output
Given two matrices, multiply them: A = [3 1; -2 0], B = [-1 0 2; 4 -3 -1]
Take the first column of A and multiply it by the first value in the first column of B; then multiply the second column of A by the second value in that column. The sum becomes the first column of AB. Do the same with the second and third columns of B to get the remaining columns of AB. Answer: AB = [1 -3 5; 2 0 -4]
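The column-by-column procedure from the card, expressed in numpy and checked against the built-in product:

```python
import numpy as np

A = np.array([[3, 1], [-2, 0]])
B = np.array([[-1, 0, 2], [4, -3, -1]])

# Column j of AB is A times column j of B (a combination of A's
# columns weighted by the entries of B's j-th column).
AB = np.column_stack([A @ B[:, j] for j in range(B.shape[1])])
print(AB)                         # [[ 1 -3  5], [ 2  0 -4]]
print(np.array_equal(AB, A @ B))  # True: matches A @ B
```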
Span (what does it mean / how to test if a set of m vectors spans Rn)
Meaning: every vector in Rn can be written as a linear combination of the m vectors. Test: 1) If m < n, immediately reject; otherwise proceed 2) Form a matrix out of the m vectors and transform it to echelon form 3) If every row has a pivot, the vectors span Rn
What is image?
The image of u under T is the output vector T(u) produced from the input vector u, where T is a linear transformation
Diagonal Matrix:
A matrix whose nonzero entries all lie on the main diagonal; everything off the diagonal is 0. (Note: not automatically in RREF; that additionally requires every pivot to be 1)
T/F Given a set of vectors, the set is linearly dependent if and only if one of the vectors in the set is in the span of the others?
True
T/F Having the zero vector in a set of vectors guarantees linear dependence?
True
T/F If A^-1 is the inverse of A, then A is the inverse of A^-1
True
Given matrix [1 2 3; 4 5 6], return the transpose
[1 4; 2 5; 3 6]
linear independence conceptual definition
A set of vectors is linearly independent if the only choice of scalars (i.e. x1, x2...) whose linear combination of the vectors equals 0 is the trivial solution (all the scalars = 0)
For Ax = b to be valid, x must:
x (i.e. the vector of variables) must have the same number of rows as A has columns