Linear Algebra & its Applications
The dimension of a nonzero subspace H =
# of vectors in any basis for H
(AB)_ij
(AB)_ij = a_i1*b_1j + a_i2*b_2j + ... + a_in*b_nj
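A quick sanity check of this row-column rule in sympy (a sketch; the 2x3 and 3x2 matrices below are made up for illustration):

from sympy import Matrix

A = Matrix([[1, 2, 3],
            [4, 5, 6]])          # 2x3
B = Matrix([[7, 8],
            [9, 10],
            [11, 12]])           # 3x2

# entry in row 1, column 2 of AB (0-indexed as [0, 1]) by the rule above
entry = sum(A[0, k] * B[k, 1] for k in range(A.cols))
assert entry == (A * B)[0, 1]    # 1*8 + 2*10 + 3*12 = 64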
(A^T)^-1 =
(A^-1)^T
Image of x
- For x in R^n, the vector T(x) in R^m is called the image of x
Subspace of R^3
- a line through the origin - a plane through the origin - the zero subspace {0} (just the origin) - all of R^3
Subspace of R^2
- a line through the origin - the zero subspace {0} (just the origin) - all of R^2
Consistent linear system
- exactly one solution - infinitely many solutions
Identity Matrix
- a square matrix with 1's on the main diagonal and 0's elsewhere - multiplying any matrix by it (where the product is defined) gives back that same matrix
Linearly Independent
- a set {v1, ..., vp} is linearly independent if the vector equation c1v1 + c2v2 + ... + cpvp = 0 has only the trivial solution - for a set of two vectors: neither vector is a scalar multiple of the other
Nul A is a subspace of R^n if A is a mxn matrix...
1. A * 0 = 0, so the zero vector is in Nul A 2. If u and v are in Nul A, then Au = 0 and Av = 0, so A(u + v) = Au + Av = 0 + 0 = 0 3. If u is in Nul A and c is a scalar, then Au = 0, so A(cu) = c(Au) = c * 0 = 0
Theorem: If A is an m x n matrix, u and v are vectors in R^n, and c is a scalar, then...
1. A(u+v) = Au + Av 2. A(cu) = c(Au)
Echelon Form
1. All nonzero rows are above any rows of all zeros 2. Each leading entry of a row is in a column to the right of the leading entry of the row above it 3. All entries in a column below a leading entry are zeros
Find the Null Space (A)
1. Form the augmented matrix [A 0] 2. Row reduce to RREF 3. Write the general solution of Ax = 0 in terms of the free variables 4. Nul(A) = Span of the vectors that the free variables multiply in the general solution
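The same steps can be checked by machine; a minimal sketch in sympy (the matrix A is made up for illustration):

from sympy import Matrix

A = Matrix([[1, 2, -1],
            [2, 4, -2]])            # second row is twice the first, so Nul A is nontrivial

R, pivots = A.rref()                # step 2: RREF (also reports the pivot columns)

basis = A.nullspace()               # steps 3-4: a basis for Nul A, i.e. the vectors
for v in basis:                     # that multiply the free variables in the general solution
    assert A * v == Matrix([0, 0])  # every basis vector satisfies Ax = 0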
Theorem 4: Let A be an m x n coefficient matrix
The following statements are logically equivalent: 1. For each b in R^m, the equation Ax = b has a solution 2. Each b in R^m is a linear combination of the columns of A 3. The columns of A span R^m 4. A has a pivot position in every row
Two Fundamental Questions About a Linear System
1. Is the system consistent; that is, does at least one solution exist? 2. If a solution exists, is it the only one; that is, is the solution unique?
Finding a basis for Span(S)
1. Put the vectors in a matrix A as columns 2. Row reduce A to echelon form to locate the pivot columns 3. Basis = the pivot columns of the original matrix A
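A short sketch of these steps in sympy (the vectors are chosen arbitrarily for illustration):

from sympy import Matrix

v1 = Matrix([1, 0, 1])
v2 = Matrix([2, 0, 2])                   # 2*v1, so it will not be a pivot column
v3 = Matrix([0, 1, 1])

A = Matrix.hstack(v1, v2, v3)            # step 1: vectors as columns
R, pivot_cols = A.rref()                 # step 2: row reduce
basis = [A[:, j] for j in pivot_cols]    # step 3: pivot columns of the ORIGINAL A

assert basis == [v1, v3]                 # columnspace() does the same thing in one call
assert basis == A.columnspace()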
Elementary Row Operations
1. Replacement 2. Interchange 3. Scaling
Writing a Solution Set (of a consistent system) in Parametric Vector Form
1. Row reduce the augmented matrix to RREF 2. Express each basic variable in terms of any free variables appearing in an equation 3. Write a typical solution x as a vector whose entries depend on the free variables, if any 4. Decompose x into a linear combination of vectors (w/ numeric entries) using the free variables as parameters
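A sketch of those four steps in sympy (the consistent system below is invented for illustration):

from sympy import Matrix, symbols, linsolve

x1, x2, x3 = symbols('x1 x2 x3')

# system:  x1 + 2*x2 - x3 = 4,   x2 + x3 = 1
A = Matrix([[1, 2, -1],
            [0, 1,  1]])
b = Matrix([4, 1])

R, pivots = Matrix.hstack(A, b).rref()   # steps 1-2: RREF of the augmented matrix
# R = [[1, 0, -3, 2], [0, 1, 1, 1]]  ->  x1 = 2 + 3*x3,  x2 = 1 - x3,  x3 free

sol = linsolve((A, b), x1, x2, x3)       # steps 3-4 in one call
# sol == {(3*x3 + 2, 1 - x3, x3)}, i.e. x = [2, 1, 0] + x3*[3, -1, 1] in parametric vector form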
A transformation / mapping T is linear if:
1. T(u+v) = T(u) + T(v) for all u, v in the domain of T 2. T(cu) = cT(u) for all scalars c and all u in the domain of T
A subspace of R^n is any set H in R^n that has three properties:
1. The zero vector is in H 2. For each u and v in H, the sum u + v is in H 3. For each u in H and each scalar c, the vector cu is in H
For v1, v2, v3 in R^m, write the linear combination 3v1 - 5v2 + 7v3 as a matrix times a vector
3v1 - 5v2 + 7v3 = [v1 v2 v3] [3 -5 7] = Ax
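A one-line check of this identity in sympy (the vectors are made up for illustration):

from sympy import Matrix

v1 = Matrix([1, 0]); v2 = Matrix([0, 1]); v3 = Matrix([1, 1])

A = Matrix.hstack(v1, v2, v3)            # columns of A are v1, v2, v3
x = Matrix([3, -5, 7])                   # the weights
assert A * x == 3*v1 - 5*v2 + 7*v3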
(A^-1)^-1 =
A
(A^T)^T =
A
Column Vector
A matrix with only one column, e.g., u = [3 -1]
Suppose T: R^5 -> R^2 and T(x) = Ax for some matrix A and for each x in R^5. How many rows and columns does A have?
A must have 5 columns for Ax to be defined. A must have 2 rows for the codomain of T to be R^2.
Linear Combination
A sum of scalar multiples of vectors. The scalars are called the weights.
(A+B)^T =
A^T + B^T
Matrix Equation
An equation in which the variable is a matrix. Ax = b
Theorem: Characterization of Linearly Dependent Sets
An indexed set S = {v1, ..., vp} of two or more vectors is linearly dependent iff at least one of the vectors in S is a linear combination of the others. In fact, if S is linearly dependent and v1 does not equal zero, then some v_j (with j > 1) is a linear combination of the preceding vectors v1, ..., v_(j-1)
Theorem: If a set contains more vectors than there are entries in each vector, then the set is linearly dependent.
Any set {v1, ..., vp} in R^n is linearly dependent if p > n.
Homogeneous Linear System
Ax = 0 has a NON-trivial solution if and only if the equation has at least one free variable
If A is an m x n matrix, with columns a1... an, and if x is in R^n, then the product of A and x, denoted by Ax, is the linear combination of the columns of A using the corresponding entries in x as weights
Ax = [a1 a2 ... an] [x1 ... xn] = x1a1 + x2a2 + ... + xnan
(AB)^-1 =
B^-1 * A^-1
(AB)^T =
B^T * A^T
An nxn matrix A is invertible if there is an nxn matrix C such that...
CA = I and AC = I
How to check if Nul A or Col A?
To check whether b is in Col A: since Col A = Span {a_1, ..., a_n}, check whether Ax = b has a solution. To check whether u is in Nul A: check whether Au = 0.
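A minimal sketch of both checks in sympy (the matrix and test vectors are made up for illustration):

from sympy import Matrix

A = Matrix([[1, 0],
            [2, 1],
            [3, 1]])
b = Matrix([1, 3, 4])                      # candidate for Col A
u = Matrix([1, -1])                        # candidate for Nul A

# b in Col A  <=>  Ax = b is consistent  <=>  the rightmost column of [A b] is NOT a pivot column
R, pivots = Matrix.hstack(A, b).rref()
in_col_A = A.cols not in pivots            # True here: x = (1, 1) works

# u in Nul A  <=>  Au = 0 (no row reduction needed)
in_nul_A = A * u == Matrix([0, 0, 0])      # False here: Au = (1, 1, 2) != 0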
Is the system consistent?
Does at least one solution exist?
Uniqueness of the Reduced Echelon Form Theorem
Each matrix is row equivalent to one and only one reduced echelon matrix
Reduced Echelon Form
Follows all the rules of echelon form and, 1. The leading entry in each nonzero row is 1 2. Each leading 1 is the only nonzero entry in its column
To be linearly independent, there will be no ________________
Free variables
Linearly Dependent
A set {v1, ..., vp} is linearly dependent if there exist weights c1, ..., cp, not all zero, such that c1v1 + ... + cpvp = 0. In particular, if m > n, then a set of m vectors in R^n is automatically linearly dependent.
Is b in the Span {a1, ..., an}?
Is Ax = b consistent?
If a solution exists, is it the only one?
Is the solution unique?
The Invertible Matrix Theorem
Let A be a square n x n matrix. Then the following statements are equivalent. That is, for a given A, the statements are either all true or all false. a) A is an invertible matrix b) A is row equivalent to the nxn identity matrix c) A has n pivot positions d) The equation Ax=0 has only the trivial solution e) The columns of A form a linearly independent set f) The linear transformation x|--> Ax is one-to-one g) The equation Ax=b has at least one solution for each b in R^n h) The columns of A span R^n i) The linear transformation x|--> Ax maps R^n onto R^n j) There is an nxn matrix C such that CA = I k) There is an nxn matrix D such that AD = I l) A^T is an invertible matrix m) The columns of A form a basis of R^n n) Col(A) = R^n o) dim[Col(A)] = n p) rank(A) = n q) Nul(A) = {0} r) dim[Nul(A)] = 0
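Several of these equivalences are easy to spot-check numerically; a minimal sketch in sympy, with an arbitrary 3x3 matrix chosen for illustration:

from sympy import Matrix, eye

A = Matrix([[2, 0, 1],
            [1, 1, 0],
            [0, 3, 1]])
n = A.rows

if A.det() != 0:                      # (a) A is invertible
    assert A.rank() == n              # (c)/(p) n pivot positions, rank n
    assert A.nullspace() == []        # (q) Nul(A) = {0}
    assert A.rref()[0] == eye(n)      # (b) row equivalent to I_n
    C = A.inv()
    assert C * A == eye(n) and A * C == eye(n)   # (j)/(k) two-sided inverse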
Basis
A basis for a subspace H of R^n is a linearly independent set in H that spans H
One-to-One
A mapping T: R^n -> R^m is said to be one-to-one if each b in R^m is the image of at most one x in R^n
Onto
A mapping T: R^n -> R^m is said to be onto R^m if each b in R^m is the image of at least one x in R^n
Let T: R^n -> R^n be a linear transformation and let A be the standard matrix for T. Then T is invertible if and only if A is invertible.
Theorem: If Ax = b has a solution, then the solution set is obtained by translating the solution set of Ax = 0, using any particular solution p of Ax = b for the translation.
Theorem: If a set S = {v1, ..., vp} in R^n contains the zero vector, then the set is linearly dependent
Theorem: Let T: R^n -> R^m be a linear transformation. Then T is one-to-one if and only if the equation T(x) = 0 has only the trivial solution
Asking whether a vector b is in Span {v1, ..., vp} amounts to asking whether the vector equation x1v1 + x2v2 + ... + xpvp = b has a solution, or equivalently, whether the linear system with augmented matrix [v1 ... vp b] is consistent.
Does AB = BA?
NO, not in general; matrix multiplication is not commutative (although AB = BA does hold in special cases, e.g., when one factor is an identity matrix)
Is 4x1 - 5x2 = x1x2 a linear equation?
No; the product x1x2 of two variables makes it nonlinear
To span R^2 need a pivot in every _______
Row
A linear transformation T: R^n -> R^n is said to be invertible if there exists a function S: R^n -> R^n such that...
S(T(x)) = x for all x in R^n, and T(S(x)) = x for all x in R^n
Transformation / Mapping / Function
T from R^n to R^m is a rule that assigns to each vector x in R^n a vector T(x) in R^m
T/F: If an 3x5 Matrix A has 3 pivot positions, then A is invertible.
FALSE. A 3x5 matrix is not square, so it cannot be invertible (its 3 pivot positions do mean its columns span R^3, though)
Column Space
The column space of a matrix A is the set Col A of all linear combinations of the columns of A
Null Space
The null space of a matrix A is the set Nul A of all solutions of the homogeneous equation Ax = 0
Rank
The rank of matrix A, denoted by rank A, is the dimension of the column space of A
R^n
The set of all vectors with n real entries, usually written as n x 1 column matrices
Theorem: Let T: R^n -> R^m be a linear transformation.
Then there exists a unique matrix A such that T(x) = Ax for all x in R^n. In fact, A is the m x n matrix whose jth column is T(e_j), where e_j is the jth column of the identity matrix in R^n.
Theorem: Let T : R^n -> R^m be a linear transformation, and let A be the standard matrix for T.
Then: 1. T maps R^n onto R^m iff the columns of A span R^m 2. T is one-to-one iff the columns of A are linearly independent
Row Equivalent
Two matrices are row equivalent if there is a sequence of elementary row operations that transforms one matrix into the other.
Is the pair of equations 2x1 - x2 + 5x3 = 9 and x1 - x3 = -7 a linear system?
Yes
Will b be in Col A if the echelon form of the augmented matrix [A b] shows the system is consistent?
Yes
How to find A^-1 for a matrix larger than 2x2
Row reduce the augmented matrix [A I]; if A is invertible, the result is [I A^-1]
An nxn matrix A is invertible if and only if A is row equivalent to I_n, and in that case any sequence of elementary row operations that reduces A to I_n also transforms I_n into A^-1
[A I] ~ [I A^-1]
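A sketch of the [A I] -> [I A^-1] algorithm in sympy (the 3x3 matrix is made up for illustration):

from sympy import Matrix, eye

A = Matrix([[1, 2, 0],
            [0, 1, 1],
            [1, 0, 1]])
n = A.rows

M = Matrix.hstack(A, eye(n))      # form [A I]
R, pivots = M.rref()              # row reduce

assert R[:, :n] == eye(n)         # left block is I_n, so A is invertible
A_inv = R[:, n:]                  # right block is A^-1
assert A_inv == A.inv() and A * A_inv == eye(n)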
Suppose the set B = {b_1, ..., b_p} is a basis for a subspace H. For each x in H, the coordinates of x relative to the basis B are the weights c_1, ..., c_p such that x = c_1*b_1 + ... + c_p*b_p, and the vector [x]_B in R^p is called the coordinate vector of x (relative to B)
[x]_B = [c_1 ... c_p]^T (the column vector of the weights)
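Finding [x]_B amounts to solving [b_1 ... b_p] c = x for the weights; a small sketch in sympy with a made-up basis and vector:

from sympy import Matrix, symbols, linsolve

c1, c2 = symbols('c1 c2')

b1 = Matrix([1, 0, 1])
b2 = Matrix([0, 1, 1])
x  = Matrix([2, 3, 5])                     # x = 2*b1 + 3*b2, so x lies in H = Span{b1, b2}

P = Matrix.hstack(b1, b2)                  # columns are the basis vectors
(weights,) = linsolve((P, x), c1, c2)      # unique solution because B is a basis for H

assert tuple(weights) == (2, 3)            # so [x]_B = [2 3]^T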
A System of Linear Equations (Linear System)
a collection of one or more linear equations involving the same variables
Existence and Uniqueness Theorem
a linear system is consistent if and only if the rightmost column of the augmented matrix is NOT a pivot column - that is, if and only if an echelon form of the augmented matrix has NO row of the form [0 ... 0 b] with b nonzero. If a linear system is consistent, then the solution set contains either (i) a unique solution, when there are no free variables, or (ii) infinitely many solutions, when there is at least one free variable.
Pivot Column
a pivot column is a column of A that contains a pivot position
Pivot Position
a pivot position in a matrix A is a location in A that corresponds to the leading 1 in the reduced echelon form of A
Basic Variable
a variable in a linear system that corresponds to a pivot column in the coefficient matrix
det A for a 2x2 matrix A = [a b; c d] =
ad - bc
Free Variable
any variable in a linear system that is not a basic variable
A set of two vectors is linearly dependent if...
at least one of the vectors is a multiple of the other
The equation Ax = b has a solution if and only if...
b is a linear combination of the columns of A.
If a product AB is the zero matrix, you _________ conclude in general that either A = 0 or B = 0
cannot
The pivot columns of a matrix A form a basis for _________
Col A
Span
collection of all vectors that can be written in the form c1v1 + c2v2 + ... + cpvp with c1, ..., cp scalars
Two linear systems are called _____________ if they have the same solution set.
equivalent
Linear Equation
in the variables x1, ..., xn is an equation that can be written in the form a1x1 + a2x2 + ... + anxn = b, where b and the coefficients a1, ..., an are real numbers
m x n matrix
m rows n columns
A set of two vectors is linearly independent iff...
neither of the vectors is a multiple of the other
Inconsistent linear system
no solution
The cancellation laws do not hold true for matrix multiplication. If AB = AC, then it is _______ true in general that B = C
not
The dimension of Nul(A) =
nullity(A) = # of free variables in the equation Ax = 0
The columns of matrix A are linearly independent iff the equation Ax = 0 has...
only the trivial solution
The dimension of Col(A) =
rank(A) = # of pivot columns
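Because every column of A is either a pivot column or corresponds to a free variable, rank(A) + nullity(A) = # of columns of A. A quick numerical sketch in sympy (the matrix is made up for illustration):

from sympy import Matrix

A = Matrix([[1, 2, 0, 1],
            [0, 0, 1, 1],
            [1, 2, 1, 2]])          # third row = first + second, so rank < 3

rank    = A.rank()                  # # of pivot columns -> 2
nullity = len(A.nullspace())        # # of free variables -> 2
assert rank + nullity == A.cols     # 2 + 2 = 4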
Leading Entry
refers to the leftmost nonzero entry (in a nonzero row)
The transpose of a product of matrices equals the product of their transposes in ____________ order
reverse
If A = [ a_1 .... a_n ], with the columns in R^m, then...
then Col A = Span {a_1, ..., a_n}, which is a subspace of R^m
If A is an invertible n x n matrix, then for each b in R^n, the equation Ax=b has the unique solution...
x = A^-1 * b
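A small check of the formula in sympy (2x2 example chosen arbitrarily); in practice Ax = b is usually solved by row reduction rather than by computing A^-1, but the formula is easy to verify:

from sympy import Matrix

A = Matrix([[2, 1],
            [1, 1]])          # invertible: det = 1
b = Matrix([3, 2])

x = A.inv() * b               # x = A^-1 * b
assert A * x == b             # x = (1, 1) really solves Ax = b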
Is 3x1 - 5x2 = -2 a linear equation?
yes