Linear Algebra Ch 4, 5, 6, 7 Theorems
Theorem 5.6
"A" is an n x n matrix so A is invertible if det(A) does NOT equal 0
T/F: If A is an n x n matrix with all entries equal to 1, then det(A) = n
False for all n > 1. In these cases the columns are all equal, hence linearly dependent, so det(A) = 0
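A quick numerical check (a NumPy sketch; choosing n = 3 and using numpy.linalg are assumptions for illustration):

    import numpy as np

    A = np.ones((3, 3))          # 3 x 3 matrix with every entry equal to 1
    print(np.linalg.det(A))      # ~0.0 (up to floating-point error), not n = 3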
T/F: If the cofactors of an n x n matrix A are all nonzero, then det(A) does NOT equal 0
False. Consider the 2 x 2 matrix A = [ 1 1 ; 1 1 ] (every entry 1); then C11 = 1, C12 = -1, C21 = -1, and C22 = 1, but det(A) = 0
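The same counterexample can be checked numerically (a NumPy sketch of the 2 x 2 all-ones matrix above):

    import numpy as np

    A = np.array([[1.0, 1.0],
                  [1.0, 1.0]])
    # For [[a, b], [c, d]] the cofactors are C11 = d, C12 = -c, C21 = -b, C22 = a: all nonzero here
    print(np.linalg.det(A))      # 0.0, so nonzero cofactors do not force a nonzero determinant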
T/F: If A is a diagonal matrix, then Mij is also a diagonal for all i and j
False. Consider A = I3, the 3 x 3 identity matrix; then M31 = [ 0 0 ; 1 0 ] is not a diagonal matrix
T/F: If A is an upper triangular n x n matrix, then det(A) does NOT equal 0
False. Consider A = [ 0 1 ; 0 0 ], which is upper triangular with det(A) = 0
T/F: If E is an elementary matrix, then det(E) = 1
False. For example, E = [ 0 1 ; 1 0 ] is elementary with det(E) = -1
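A numerical check of two kinds of elementary matrices (a NumPy sketch; the scale factor 3 is an assumed example):

    import numpy as np

    E_swap  = np.array([[0.0, 1.0], [1.0, 0.0]])   # interchange rows 1 and 2
    E_scale = np.array([[3.0, 0.0], [0.0, 1.0]])   # multiply row 1 by 3
    print(np.linalg.det(E_swap))                   # -1.0
    print(np.linalg.det(E_scale))                  #  3.0, so det(E) = 1 fails in general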
Theorem 5.5
For all n >= 1, we have det(In) = 1, where In is the n x n identity matrix
Theorem 5.11
If A has a row or column of zeros, det(A) = 0
Theorem 5.11.2
If A has two identical rows or columns, then det(A) = 0
Theorem 5.10
If A is a square matrix, then det(A) = det(A^T), the determinant of its transpose
Theorem 5.9
If A is a triangular matrix, then det(A) is the product of its diagonal entries
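A quick check of Theorem 5.9 (a NumPy sketch; the matrix entries are assumed example values):

    import numpy as np

    A = np.array([[2.0, 5.0, 1.0],
                  [0.0, 3.0, 4.0],
                  [0.0, 0.0, 7.0]])       # upper triangular
    print(np.linalg.det(A))               # ~42.0
    print(np.prod(np.diag(A)))            # 42.0 = 2 * 3 * 7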
Theorem 4.3
If A is an n x m matrix, then the set of solutions to the homogeneous linear system Ax = 0 forms a subspace of Rm
Theorem 4.12
If S is a subspace of Rn, then every basis of S has the same number of vectors
Theorem 5.13.2
If B is obtained from A by multiplying a row of A by a nonzero number c, then det(B) = c det(A), so det(A) = (1/c) det(B)
Theorem 5.13
If B is obtained from A by interchanging two rows of A, then det(B) = -det(A)
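A numerical check of Theorems 5.13 and 5.13.2 together (a NumPy sketch; the matrix and the scale factor 5 are assumed examples):

    import numpy as np

    A = np.array([[1.0, 2.0], [3.0, 4.0]])   # det(A) = -2
    B_swap = A[[1, 0], :]                    # interchange the two rows
    B_scale = A.copy()
    B_scale[0, :] *= 5                       # multiply row 1 by 5
    print(np.linalg.det(A))                  # -2.0
    print(np.linalg.det(B_swap))             #  2.0  (sign flips)
    print(np.linalg.det(B_scale))            # -10.0 (scaled by 5)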
Theorem 4.10
Let A and B be row equivalent matrices. Then the subspace spanned by the rows of A is the same as the subspace spanned by the rows of B
Theorem 6.6
Let A be a square matrix with eigenvalue lambda. Then the dimension of the associated eigenspace is less than or equal to the multiplicity of lambda
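A small check of Theorem 6.6 (a NumPy sketch; the 2 x 2 example with a repeated eigenvalue is an assumption):

    import numpy as np

    A = np.array([[1.0, 1.0], [0.0, 1.0]])   # lambda = 1 has multiplicity 2
    lam = 1.0
    # dimension of the eigenspace = dimension of the null space of A - lambda*I
    print(2 - np.linalg.matrix_rank(A - lam * np.eye(2)))   # 1, which is <= 2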
Theorem 6.2
Let A be a square matrix, and suppose that "u" is an eigenvector of A associated with eigenvalue lambda. Then for any scalar c NOT equal to 0, cu is also an eigenvector of A associated with lambda
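A quick check of Theorem 6.2 (a NumPy sketch; the diagonal matrix, the eigenvector, and c = 5 are assumed examples):

    import numpy as np

    A = np.array([[2.0, 0.0], [0.0, 3.0]])
    u = np.array([1.0, 0.0])          # eigenvector of A with lambda = 2
    c = 5.0
    print(A @ (c * u))                # [10.  0.]
    print(2 * (c * u))                # [10.  0.], so c*u is also an eigenvector for lambda = 2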
Theorem 6.3
Let A be an n x n matrix with eigenvalue lambda. Let S denote the set of all eigenvectors associated with lambda, together with the zero vector 0. Then S is a subspace of Rn
Theorem 6.5
Let A be an n x n matrix. Then lambda is an eigenvalue of A if and only if det(A - lambda In) = 0
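A numerical check of Theorem 6.5 (a NumPy sketch; the symmetric 2 x 2 matrix is an assumed example):

    import numpy as np

    A = np.array([[2.0, 1.0], [1.0, 2.0]])               # eigenvalues 1 and 3
    for lam in np.linalg.eigvals(A):
        print(lam, np.linalg.det(A - lam * np.eye(2)))   # each determinant is ~0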
Theorem 4.9
Let B = {u1, u2, ..., um} be a basis for a subspace S. For every vector s in S there exists a unique set of scalars s1, s2, ..., sm such that s = s1 u1 + s2 u2 + ... + sm um
Theorem 4.2
Let S = span{ u1, u2, ..., um }, where u1, ..., um are vectors in Rn. Then S is a subspace of Rn
Theorem 4.6
Let T : Rm —> Rn be a linear transformation. Then T is one-to-one if and only if ker(T) = {0}
Theorem 4.5
Let T : Rm —> Rn be a linear transformation. Then the kernel of T is a subspace of the domain Rm and the range of T is a subspace of the codomain Rn
Theorem 4.17
Let U = {u1, u2, ..., um} be a set of vectors in a subspace S of dimension k. If m < k, then U does not span S; if m > k, then U is not linearly independent
Theorem 4.15
Let U = {u1, u2, ..., um} be a set of m vectors in a subspace S of dimension m. If U is either linearly independent or spans S, then U is a basis for S
Theorem 4.16
Suppose S1 and S2 are both subspaces of Rn and that S1 is a subset of S2. Then dim(S1) <= dim(S2)
Theorem 4.11
Suppose U = [u1, u2, ..., um] and V = [v1, v2, ..., vm] are two row equivalent matrices. Then any linear dependence that exists among the vectors u1, u2, ..., um also exists among the vectors v1, v2, ..., vm
Theorem 6.13
Suppose that A is a real matrix with eigenvalue lambda and associated eigenvector "u". Then the conjugate lambda-bar is also an eigenvalue of A, with associated eigenvector u-bar (the entrywise complex conjugate of u)
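A small check of Theorem 6.13 (a NumPy sketch; the rotation-type matrix is an assumed example):

    import numpy as np

    A = np.array([[0.0, -1.0], [1.0, 0.0]])   # real matrix whose eigenvalues are the conjugate pair i and -i
    vals, vecs = np.linalg.eig(A)
    lam0, v0 = vals[0], vecs[:, 0]
    print(vals)                                            # a conjugate pair
    print(A @ np.conj(v0) - np.conj(lam0) * np.conj(v0))   # ~[0, 0]: conj(v0) is an eigenvector for conj(lam0)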
T/F: Suppose A, B, and S are n x n matrices, and that S is invertible. If B = S^-1AS, then det(A) = det(B)
True, since det(B) = det(S^-1 A S) = det(S^-1) det(A) det(S) = (1/det(S)) det(A) det(S) = det(A)
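The same fact can be checked numerically (a NumPy sketch; the random matrices and the seed are assumptions, and a random S is invertible with probability 1):

    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.standard_normal((3, 3))
    S = rng.standard_normal((3, 3))            # assumed invertible (true with probability 1)
    B = np.linalg.inv(S) @ A @ S
    print(np.linalg.det(A), np.linalg.det(B))  # equal up to floating-point error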
T/F: If A is an invertible matrix, then at least one of the submatrices Mij of A is also invertible
True. If no Mij were invertible, then every Mij would have determinant zero, so det(A) = 0 by any row or column cofactor expansion. But this contradicts the assumption that A is invertible. Thus at least one Mij is invertible