Linear Algebra Final


Let 𝐴 be the 𝑚×𝑛 coefficient matrix corresponding to a homogeneous system of equations, and suppose 𝐴 has rank 𝑟. How many parameters does the solution have?

# of parameters = 𝑛−𝑟

(m x n means rows x columns)

(m x n with n > m: the homogeneous system always has a nontrivial solution; in fact, infinitely many solutions)
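
Not from the card set: a minimal computational sketch of the n − r count (assuming SymPy is installed; the matrix is a made-up example):

```python
from sympy import Matrix

# Hypothetical 2x3 coefficient matrix of rank 1 (second row = 2 * first).
A = Matrix([[1, 2, 3],
            [2, 4, 6]])
n = A.cols                 # number of variables
r = A.rank()               # rank of the coefficient matrix
print(n - r)               # 2 parameters in the general solution
print(len(A.nullspace()))  # 2 basic solutions, matching n - r
```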

𝑧=𝑎+𝑏𝑖 is a complex number. What is the conjugate?

z̄ = a − bi (conjugation negates the imaginary part)
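
A quick check with Python's built-in complex type (an illustration, not part of the card):

```python
# Conjugation negates the imaginary part: conj(a + bi) = a - bi.
z = 3 + 4j
print(z.conjugate())   # (3-4j)
```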

the properties of determinants are... (hint: 8)

(a) If B is the result of switching two rows of A, then det(B) = −det(A). (b) If B is the result of multiplying a row of A by a constant k, then det(B) = k det(A). (c) If B is the result of adding a multiple of a row of A to another, then det(B) = det(A). (d) det(AB) = det(A) det(B). (e) det(A^T) = det(A). (f) If A is invertible, then det(A) ≠ 0 and det(A^−1) = det(A)^−1. (g) If A has a row/column of 0's, then det(A) = 0. (h) If det(A) ≠ 0, then the rows/columns of A are linearly independent.
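
A small numerical sanity check of properties (a), (d), and (e), assuming NumPy; the matrices are made-up examples:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = A[[1, 0], :]   # switch the two rows of A

# (a) a row swap flips the sign of the determinant
print(np.isclose(np.linalg.det(B), -np.linalg.det(A)))       # True

C = np.array([[2.0, 0.0],
              [1.0, 3.0]])
# (d) det(AC) = det(A) det(C)
print(np.isclose(np.linalg.det(A @ C),
                 np.linalg.det(A) * np.linalg.det(C)))       # True
# (e) det(A^T) = det(A)
print(np.isclose(np.linalg.det(A.T), np.linalg.det(A)))      # True
```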

two similar matrices share... (hint: 6)

(a) det(A) = det(B) (b) rank(A) = rank(B) (c) trace(A) = trace(B) (d) A and B have the same characteristic polynomial (e) A and B have the same eigenvalues (f) A is diagonalizable if and only if B is diagonalizable
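
A sketch verifying the shared invariants for one concrete similar pair B = P⁻¹AP (assuming SymPy; A and P are made-up):

```python
from sympy import Matrix

A = Matrix([[2, 1],
            [0, 3]])
P = Matrix([[1, 1],
            [0, 1]])          # invertible, so B below is similar to A
B = P.inv() * A * P

print(A.det() == B.det())             # True
print(A.trace() == B.trace())         # True
print(A.charpoly() == B.charpoly())   # True: same characteristic polynomial
print(A.eigenvals() == B.eigenvals()) # True: same eigenvalues
```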

T : Rk --> Rn and S : Rn --> Rm Let A be the matrix such that T(x) = Ax and let B be the matrix such that S(x) = Bx. What is the size of the matrix for the composition S following T?

(m x n)(n x k) = m x k

T : Rk --> Rn and S : Rn --> Rm Let A be the matrix such that T(x) = Ax and let B be the matrix such that S(x) = Bx. Is the composition T following S defined?

(n x k)(m x n): defined only if k = m
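
A shape-only sketch of both compositions with NumPy (k, n, m are made-up values):

```python
import numpy as np

k, n, m = 2, 3, 4
A = np.ones((n, k))    # matrix of T : R^k -> R^n
B = np.ones((m, n))    # matrix of S : R^n -> R^m

print((B @ A).shape)   # (4, 2): S following T is the m x k matrix BA
# A @ B (T following S) would raise a ValueError here, since k != m.
```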

What do the diagonals on a skew-symmetric matrix have to be?

0

invertible

1. A matrix A of dimension n x n is called ____ if and only if there exists another matrix B of the same dimension such that AB = BA = I, where I is the identity matrix of the same order. 2. A square matrix is ___ if and only if its determinant is non-zero. 3. Suppose A is an 𝑛×𝑛 matrix. To find 𝐴^−1 if it exists, form the augmented 𝑛×2𝑛 matrix [𝐴 | 𝐼]. If possible, do row operations until you obtain an 𝑛×2𝑛 matrix of the form [𝐼 | 𝐵]. When this has been done, 𝐵=𝐴^−1. In this case, we say that 𝐴 is ______. If it is impossible to row reduce to a matrix of the form [𝐼 | 𝐵], then 𝐴 has no inverse. 4. Let 𝐴 be an ____ 𝑛×𝑛 matrix. Then the columns of 𝐴 are independent and span ℝ𝑛. Similarly, the rows of 𝐴 are independent and span the set of all 1×𝑛 vectors.
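
A sketch of the [A | I] → [I | B] procedure from item 3, using SymPy's rref on a made-up invertible matrix:

```python
from sympy import Matrix, eye

A = Matrix([[1, 2],
            [3, 5]])
aug = A.row_join(eye(2))   # the n x 2n augmented matrix [A | I]
R, _ = aug.rref()          # row reduce to [I | B]
B = R[:, 2:]               # right half
print(B == A.inv())        # True: B is A^-1
```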

row-echelon form

1. All nonzero rows are above any rows of zeros. 2. Each leading entry of a row is in a column to the right of the leading entries of any row above it. 3. Each leading entry of a row is equal to 1.

reduced row-echelon form

1. All nonzero rows are above any rows of zeros. 2. Each leading entry of a row is in a column to the right of the leading entries of any rows above it. 3. Each leading entry of a row is equal to 1. 4. All entries in a column above and below a leading entry are zero.
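
A quick RREF demo (assuming SymPy; the matrix is a made-up example):

```python
from sympy import Matrix

A = Matrix([[1, 2, 1],
            [2, 4, 0],
            [0, 0, 1]])
R, pivots = A.rref()
print(R)        # satisfies all four conditions above
print(pivots)   # (0, 2): the columns containing leading 1's
```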

Elementary operations are...

1. Interchanging the order in which the equations are listed. 2. Multiplying any equation by a nonzero number. 3. Replacing any equation with itself added to a multiple of another equation.

Let {𝑢⃗ 1,⋯,𝑢⃗ 𝑘} be a collection of vectors in ℝ𝑛. Then the following are equivalent

1. The set is linearly independent: whenever 𝑎1𝑢⃗1+⋯+𝑎𝑘𝑢⃗𝑘 = 0⃗, it follows that each coefficient 𝑎𝑖=0. 2. No vector is in the span of the others. 3. The system of linear equations 𝐴𝑋=0 has only the trivial solution, where 𝐴 is the 𝑛×𝑘 matrix having these vectors as columns.

Make a list of the different ways someone can check if a square matrix has linearly independent columns

1. check the determinant (if it equals 0, the columns are linearly dependent; if it is nonzero, the columns are linearly independent) 2. check the pivot columns of the RREF (if every column is a pivot column, the columns are linearly independent; any column that is not a pivot column is a linear combination of the pivot columns, so the set is linearly dependent)
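
Both checks on one made-up square matrix (assuming SymPy):

```python
from sympy import Matrix

# Third column = first column + second column, so the columns are dependent.
A = Matrix([[1, 0, 1],
            [0, 1, 1],
            [1, 1, 2]])

print(A.det())        # 0  => columns are linearly dependent (check 1)
_, pivots = A.rref()
print(pivots)         # (0, 1): column 2 is not a pivot column (check 2)
```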

Let 𝐴 be an 𝑚×𝑛 matrix. The following are equivalent (hint: 6...again)

1. rank(𝐴)=𝑚 2. col(𝐴)=ℝ𝑚, i.e., the columns of 𝐴 span ℝ𝑚. 3. The rows of 𝐴 are independent in ℝ𝑛. 4. The 𝑚×𝑚 matrix 𝐴𝐴^𝑇 is invertible. 5. There exists an 𝑛×𝑚 matrix 𝐶 so that 𝐴𝐶=𝐼𝑚. 6. The system 𝐴𝑥⃗ =𝑏⃗ is consistent for every 𝑏⃗ ∈ ℝ𝑚.

Let 𝐴 be an 𝑚×𝑛 matrix. The following are equivalent (hint: 6)

1. rank(𝐴)=𝑛 2. row(𝐴)=ℝ𝑛, i.e., the rows of 𝐴 span ℝ𝑛. 3. The columns of 𝐴 are independent in ℝ𝑚. 4. The 𝑛×𝑛 matrix 𝐴^𝑇𝐴 is invertible. 5. There exists an 𝑛×𝑚 matrix 𝐶 so that 𝐶𝐴=𝐼𝑛. 6. If 𝐴𝑥⃗ =0⃗𝑚 for some 𝑥⃗ ∈ℝ𝑛, then 𝑥⃗ =0⃗𝑛.

Let 𝐴 be the 𝑚×(𝑛+1) augmented matrix corresponding to a consistent system of equations in 𝑛 variables, and suppose 𝐴 has rank 𝑟. Then...

1. the system has a unique solution if 𝑟 = 𝑛 2. the system has infinitely many solutions if 𝑟 < 𝑛 KEEP IN MIND: 1. when the rank of the augmented matrix exceeds the rank of the coefficient matrix: no solution 2. when 𝑟 = 𝑛: unique solution 3. when 𝑟 < 𝑛: infinitely many solutions
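
A sketch of the rank test on the made-up system x + y = 1, 2x + 2y = 2 (assuming SymPy):

```python
from sympy import Matrix

A  = Matrix([[1, 1], [2, 2]])         # coefficient matrix, n = 2 variables
Ab = Matrix([[1, 1, 1], [2, 2, 2]])   # augmented matrix [A | b]

n, rA, rAb = 2, A.rank(), Ab.rank()
if rAb > rA:
    print("no solution")
elif rA == n:
    print("unique solution")
else:
    print("infinitely many solutions")   # printed here: rA = 1 < n = 2
```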

A subset 𝑉 of ℝ𝑛 is a subspace of ℝ𝑛 if

1. the zero vector of ℝ𝑛 is in 𝑉 2. 𝑉 is closed under addition 3. 𝑉 is closed under scalar multiplication

Given a transformation that T : R3 --> R4, what is the size of A?

4x3

Suppose A is an n × n matrix such that the sum of the geometric multiplicities of all the eigenvalues of A equals n. That is, if λ1, . . . , λk are the k eigenvalues of A and gi is the geometric multiplicity of λi, then suppose that g1 + · · · + gk = n. Is A diagonalizable? Explain why or why not.

A is diagonalizable because the geometric multiplicity of an eigenvalue is the number of linearly independent eigenvectors for that eigenvalue. g1 + · · · + gk = n means that there are n linearly independent eigenvectors, enough to form an invertible P such that A = PDP^−1.
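
A sketch of the multiplicity criterion (assuming SymPy; both matrices are made-up):

```python
from sympy import Matrix

# Eigenvalue 3 has algebraic multiplicity 2 but geometric multiplicity 1,
# so the geometric multiplicities sum to 2 < 3 = n: not diagonalizable.
A = Matrix([[2, 0, 0],
            [0, 3, 1],
            [0, 0, 3]])
print(A.is_diagonalizable())   # False

B = Matrix([[2, 0],
            [0, 3]])
P, D = B.diagonalize()         # succeeds: B = P D P^-1
print(D)                       # diag(2, 3)
```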

linearly dependent

A set of non-zero vectors {𝑢⃗ 1,⋯,𝑢⃗ 𝑘} in ℝ𝑛 is said to be _____ if some linear combination of these vectors, with not all coefficients equal to zero, yields the zero vector.

linearly independent

A set of non-zero vectors {𝑢⃗ 1,⋯,𝑢⃗ 𝑘} in ℝ𝑛 is said to be ______ if whenever 𝑎1𝑢⃗1+⋯+𝑎𝑘𝑢⃗𝑘 = 0⃗, it follows that each 𝑎𝑖 = 0.

A square matrix A is said to have an inverse A−1 if and only if...

AA^−1 = A^−1A = I

The following properties hold in ℝ𝑛: Suppose {𝑢⃗ 1,⋯,𝑢⃗ 𝑛} is linearly independent. Then {𝑢⃗ 1,⋯,𝑢⃗ 𝑛} is a basis for ℝ𝑛. Suppose {𝑢⃗ 1,⋯,𝑢⃗ 𝑚} spans ℝ𝑛. Then 𝑚≥𝑛. If {𝑢⃗ 1,⋯,𝑢⃗ 𝑛} spans ℝ𝑛, then {𝑢⃗ 1,⋯,𝑢⃗ 𝑛} is linearly independent.

Assume first that {𝑢⃗ 1,⋯,𝑢⃗ 𝑛} is linearly independent; we need to show that this set spans ℝ𝑛. To do so, let 𝑣⃗ be a vector of ℝ𝑛; we need to write 𝑣⃗ as a linear combination of the 𝑢⃗𝑖's. Consider the matrix 𝐴 having the vectors 𝑢⃗𝑖 as columns: 𝐴=[𝑢⃗ 1⋯𝑢⃗ 𝑛]. By linear independence of the 𝑢⃗𝑖's, the reduced row-echelon form of 𝐴 is the identity matrix. Therefore the system 𝐴𝑥⃗ =𝑣⃗ has a (unique) solution, so 𝑣⃗ is a linear combination of the 𝑢⃗𝑖's. To establish the second claim, suppose that 𝑚<𝑛. Then letting 𝑢⃗ 𝑖1,⋯,𝑢⃗ 𝑖𝑘 be the pivot columns of the matrix [𝑢⃗ 1⋯𝑢⃗ 𝑚], it follows that 𝑘 ≤ 𝑚 < 𝑛, and these 𝑘 pivot columns would be a basis for ℝ𝑛 having fewer than 𝑛 vectors, which is impossible since every basis of ℝ𝑛 has 𝑛 vectors. Finally, consider the third claim. If {𝑢⃗ 1,⋯,𝑢⃗ 𝑛} is not linearly independent, then replace this list with {𝑢⃗ 𝑖1,⋯,𝑢⃗ 𝑖𝑘}, where these are the pivot columns of the matrix [𝑢⃗ 1⋯𝑢⃗ 𝑛]. Then {𝑢⃗ 𝑖1,⋯,𝑢⃗ 𝑖𝑘} spans ℝ𝑛 and is linearly independent, so it is a basis having fewer than 𝑛 vectors, again a contradiction.

True or False: Suppose W is a subspace of dimension 2 in R^3 . Then every basis for R 3 can be reduced to a basis for W by removing one vector.

False. Think of the standard basis {e1, e2, e3}. The span of any two of those vectors is the xy-plane, the yz-plane, or the xz-plane. The span of {[1 1 1]^T, [1 1 −1]^T} is not any of the three planes mentioned above, since those two vectors are not contained in them; yet any plane in R^3 that passes through the origin is a subspace of dimension 2. So removing one vector from {e1, e2, e3} can never produce a basis for this W.

True or False: If {v1, v2, v3} is a linearly dependent set, then {v1 + v2, v1 + v3, v2 + v3} is linearly independent.

False. Take v1, v2, v3 from R^2. Then {v1 + v2, v1 + v3, v2 + v3} is a set of three vectors in R^2, and no set of three vectors in R^2 can be linearly independent.

How can we find a basis for the column space of an (m x n) matrix?

Find the RREF of the given matrix, then determine which columns are pivot columns. The columns of the original matrix in those corresponding spots form the basis.
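
The same recipe in SymPy (the matrix is a made-up example):

```python
from sympy import Matrix

A = Matrix([[1, 2, 3],
            [2, 4, 7]])
_, pivots = A.rref()
print(pivots)                        # (0, 2): pivot columns of the RREF
basis = [A.col(j) for j in pivots]   # take those columns from the ORIGINAL A
print(basis)                         # agrees with A.columnspace()
```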

How can we find a basis for the row space of an (m x n) matrix?

Find the RREF of the given matrix; the rows with leading 1's form a basis.

Let {𝑢⃗ 1,⋯,𝑢⃗ 𝑘} be a set of vectors in ℝ𝑛. If 𝑘>𝑛, then the set is linearly dependent (i.e. NOT linearly independent).

Form the 𝑛×𝑘 matrix 𝐴 having the vectors {𝑢⃗ 1,⋯,𝑢⃗ 𝑘} as its columns and suppose 𝑘>𝑛. Then 𝐴 has rank 𝑟≤𝑛<𝑘, so the system 𝐴𝑋=0 has a nontrivial solution, and thus the set is not linearly independent.

Subspaces are Spans of Independent Vectors

If 𝑉 is a subspace of ℝ𝑛, then there exist linearly independent vectors {𝑢⃗ 1,⋯,𝑢⃗ 𝑘} in 𝑉 such that 𝑉=span{𝑢⃗ 1,⋯,𝑢⃗ 𝑘}.

Suppose A is an invertible n × n matrix and {v1, . . . , vn} is a basis for Rn. Show that {Av1,...,Avn} is a basis for Rn.

In order to show that {Av1, . . . , Avn} is a basis for Rn, we need to check two things: that the set of vectors is linearly independent and that the set spans Rn. Suppose that c1Av1 + · · · + cnAvn = 0. By properties of matrix multiplication, it follows that A(c1v1 + · · · + cnvn) = 0. Since A is invertible, we have c1v1 + · · · + cnvn = 0. Since {v1, . . . , vn} is a basis for Rn, those vectors are linearly independent, so c1 = · · · = cn = 0. Therefore {Av1, . . . , Avn} is linearly independent. To prove that {Av1, . . . , Avn} spans Rn, we have to show that any vector w ∈ Rn can be expressed as a linear combination of the vectors in the set. Let w ∈ Rn. Since A is invertible, there exists A^−1, and the vector z = A^−1w is in Rn. Because {v1, . . . , vn} is a basis of Rn, there exist c1, . . . , cn in R such that c1v1 + · · · + cnvn = z. Multiplying both sides of the equation by A, we have c1Av1 + · · · + cnAvn = Az = w. Since w is a linear combination of {Av1, . . . , Avn}, w is in the span of the set. Hence {Av1, . . . , Avn} spans Rn. Thus {Av1, . . . , Avn} is a basis for Rn.

What is the column space of an (m x n) matrix? (Also, what notation do we use to represent the column space?)

It is the span of the column vectors of the given matrix. It is a subspace of Rm. Notation: col(A)

What is the row space of an (m x n) matrix? (Also, what notation do we use to represent the row space?)

It is the span of the rows of the given matrix. Notation: row(A)

Results of the Rank Theorem

Let 𝐴 be a matrix. Then the following are true: 1. rank(𝐴)=rank(𝐴^𝑇). 2. For 𝐴 of size 𝑚×𝑛, rank(𝐴) ≤ 𝑚 and rank(𝐴) ≤ 𝑛. 3. For 𝐴 of size 𝑛×𝑛, 𝐴 is invertible if and only if rank(𝐴)=𝑛. 4. For invertible matrices 𝐵 and 𝐶 of appropriate size, rank(𝐴) = rank(𝐵𝐴) = rank(𝐴𝐶).

Row Space of a reduced row-echelon form Matrix

Let 𝐴 be an 𝑚×𝑛 matrix and let 𝑅 be its reduced row-echelon form. Then the nonzero rows of 𝑅 form a basis of row(𝑅), and consequently of row(𝐴).

Basis of null(A)

Let 𝐴 be an 𝑚×𝑛 matrix such that rank(𝐴)=𝑟. Then the system 𝐴𝑥⃗ =0⃗𝑚 has 𝑛−𝑟 basic solutions, providing a basis of null(𝐴) with dim(null(𝐴)) = 𝑛−𝑟
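
A sketch of the n − r count (assuming SymPy; the matrix is a made-up rank-1 example):

```python
from sympy import Matrix

A = Matrix([[1, 2, 3],
            [2, 4, 6]])        # m = 2, n = 3, rank r = 1
basis = A.nullspace()          # the basic solutions of Ax = 0
print(len(basis))              # 2 = n - r
print(A.rank() + len(basis))   # 3 = n (the Rank and Nullity card below)
```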

column space

Let 𝐴 be an 𝑚×𝑛 matrix. The ___ of 𝐴, written col(𝐴), is the span of the columns. hint: look at the RREF for leading ones; the ____ is the span of the columns in the original matrix whose RREF equivalents have leading ones

row space

Let 𝐴 be an 𝑚×𝑛 matrix. The ___ of 𝐴, written row(𝐴), is the span of the rows. hint: look at the RREF for leading ones; the ___ is the span of the rows in the RREF that contain leading ones

Rank Theorem

Let 𝐴 be an 𝑚×𝑛 matrix. Then dim(col(𝐴)), the dimension of the column space, is equal to the dimension of the row space, dim(row(𝐴)). The following statements all follow from the Rank Theorem.

Rank and Nullity

Let 𝐴 be an 𝑚×𝑛 matrix. Then rank(𝐴) + dim(null(𝐴)) = 𝑛

Let 𝑊 be a subspace. Also suppose that 𝑊=𝑠𝑝𝑎𝑛{𝑤⃗ 1,⋯,𝑤⃗ 𝑚} Then there exists a subset of {𝑤⃗ 1,⋯,𝑤⃗ 𝑚} which is a basis for 𝑊.

Let 𝑆 denote the set of positive integers such that for 𝑘∈𝑆 there exists a subset of {𝑤⃗ 1,⋯,𝑤⃗ 𝑚} consisting of exactly 𝑘 vectors which is a spanning set for 𝑊. Thus 𝑚∈𝑆. Pick the smallest positive integer in 𝑆. Call it 𝑘. Then there exists {𝑢⃗ 1,⋯,𝑢⃗ 𝑘}⊆{𝑤⃗ 1,⋯,𝑤⃗ 𝑚} such that span{𝑢⃗ 1,⋯,𝑢⃗ 𝑘}=𝑊. If ∑𝑐𝑖𝑢⃗𝑖 = 0⃗ and not all of the 𝑐𝑖=0, then you could pick 𝑐𝑗≠0, divide by it, and solve for 𝑢⃗𝑗 in terms of the others: 𝑢⃗𝑗=∑𝑖≠𝑗(−𝑐𝑖/𝑐𝑗)𝑢⃗𝑖. Then you could delete 𝑢⃗𝑗 from the list and have the same span: any linear combination involving 𝑢⃗𝑗 would equal one in which 𝑢⃗𝑗 is replaced with the above sum, showing that it could have been obtained as a linear combination of the 𝑢⃗𝑖 for 𝑖≠𝑗. Thus 𝑘−1∈𝑆, contrary to the choice of 𝑘. Hence each 𝑐𝑖=0, and so {𝑢⃗ 1,⋯,𝑢⃗ 𝑘} is a linearly independent spanning set, i.e. a basis for 𝑊 consisting of vectors of {𝑤⃗ 1,⋯,𝑤⃗ 𝑚}.

subset

Let 𝑈 and 𝑊 be sets of vectors in ℝ𝑛. If all vectors in 𝑈 are also in 𝑊, we say that 𝑈 is a ______ of 𝑊, denoted 𝑈⊆𝑊

subset of a subspace

Let 𝑉 and 𝑊 be subspaces of ℝ𝑛, and suppose that 𝑊⊆𝑉. Then dim(𝑊)≤dim(𝑉), with equality exactly when 𝑊=𝑉.

subspace

Let 𝑉 be a nonempty collection of vectors in ℝ𝑛. Then 𝑉 is called a ______ if whenever 𝑎 and 𝑏 are scalars and 𝑢⃗ and 𝑣⃗ are vectors in 𝑉, the linear combination 𝑎𝑢⃗ +𝑏𝑣⃗ is also in 𝑉

Existence of a basis

Let 𝑉 be a subspace of ℝ𝑛. Then there exists a basis of 𝑉 with dim(𝑉) ≤ 𝑛.

Basis of a Subspace

Let 𝑉 be a subspace of ℝ𝑛. Then {𝑢⃗ 1,⋯,𝑢⃗ 𝑘} is a basis for 𝑉 if the following two conditions hold: 1. span{𝑢⃗ 1,⋯,𝑢⃗ 𝑘}= 𝑉 2. {𝑢⃗ 1,⋯,𝑢⃗ 𝑘} is linearly independent

extending a basis

Let 𝑊 be any non-zero subspace of ℝ𝑛 and let 𝑊⊆𝑉, where 𝑉 is also a subspace of ℝ𝑛. Then every basis of 𝑊 can be extended to a basis for 𝑉.

Standard Basis of ℝ𝑛

Let 𝑒⃗𝑖 be the vector in ℝ𝑛 which has a 1 in the 𝑖th entry and zeros elsewhere, that is, the 𝑖th column of the identity matrix. Then the collection {𝑒⃗1, 𝑒⃗2,⋯, 𝑒⃗𝑛} is a basis for ℝ𝑛 and is called the standard basis of ℝ𝑛.

Properties of Addition of Complex Numbers

Let 𝑧, 𝑤 and 𝑣 be complex numbers. Then the following properties hold. 1. Commutative Law for Addition: 𝑧+𝑤=𝑤+𝑧 2. Additive Identity: 𝑧+0=𝑧 3. Existence of Additive Inverse: For each 𝑧∈ℂ, there exists −𝑧∈ℂ such that 𝑧+(−𝑧)=0. In fact, if 𝑧=𝑎+𝑏𝑖, then −𝑧=−𝑎−𝑏𝑖. 4. Associative Law for Addition: (𝑧+𝑤)+𝑣=𝑧+(𝑤+𝑣)

Properties of Multiplication of Complex Numbers

Let 𝑧, 𝑤 and 𝑣 be complex numbers. Then the following properties of multiplication hold. 1. Commutative Law for Multiplication: 𝑧𝑤 = 𝑤𝑧 2. Associative Law for Multiplication: (𝑧𝑤)𝑣 = 𝑧(𝑤𝑣) 3. Multiplicative Identity: 1𝑧 = 𝑧 4. Existence of Multiplicative Inverse: For each 𝑧 ≠ 0, there exists 𝑧^−1 such that 𝑧𝑧^−1=1 5. Distributive Law: 𝑧(𝑤+𝑣) = 𝑧𝑤+𝑧𝑣

Let 𝑉 be a subspace of ℝ𝑛 with two bases 𝐵1 and 𝐵2. Suppose 𝐵1 contains 𝑠 vectors and 𝐵2 contains 𝑟 vectors. Then 𝑠=𝑟.

Observe that 𝐵1={𝑢⃗ 1,⋯,𝑢⃗ 𝑠} is a spanning set for 𝑉 while 𝐵2={𝑣⃗ 1,⋯,𝑣⃗ 𝑟} is linearly independent, so 𝑠≥𝑟. Similarly, 𝐵2={𝑣⃗ 1,⋯,𝑣⃗ 𝑟} is a spanning set for 𝑉 while 𝐵1={𝑢⃗ 1,⋯,𝑢⃗ 𝑠} is linearly independent, so 𝑟≥𝑠. Hence 𝑠=𝑟.

RREF Matrix w/ INFINITELY Many Solutions

RREF matrix contains parameters (i.e., there are columns of the coefficient matrix that are not pivot columns)

RREF Matrix w/ ONE Solution

RREF Matrix where every column of the coefficient matrix is a pivot column

RREF Matrix w/ NO Solution (inconsistent)

RREF matrix containing a row of the form [ 0 0 ⋯ 0 | 1 ]

Is the column space of an (m x n) matrix a subspace of Rn or Rm?

Rm, because each column has m entries

Is the row space of an (m x n) matrix a subspace of Rn or Rm?

Rn, because each row has n entries

A matrix is a scalar matrix if it is of the form cI for some scalar c. Suppose that A is diagonalizable and has only one eigenvalue. Show that A is a scalar matrix.

Since A is diagonalizable, A = PDP^−1 where D is a diagonal matrix with the eigenvalues of A as its diagonal entries. Suppose the eigenvalue of A is λ. Then it follows that D = λI, since λ is the only eigenvalue. Then, by the result of part (a), it follows that A = λI.

Suppose {𝑢⃗ 1,⋯,𝑢⃗𝑟} is a linearly independent set of vectors in ℝ𝑛, and each 𝑢⃗𝑘 is contained in span{𝑣⃗1,⋯,𝑣⃗𝑠}. Then 𝑠 ≥ 𝑟. In words, spanning sets have at least as many vectors as linearly independent sets.

Since each 𝑢⃗𝑗 is in span{𝑣⃗ 1,⋯,𝑣⃗ 𝑠}, there exist scalars 𝑎𝑖𝑗 such that 𝑢⃗𝑗 = 𝑎1𝑗𝑣⃗1 + ⋯ + 𝑎𝑠𝑗𝑣⃗𝑠. Suppose for a contradiction that 𝑠<𝑟. Then the 𝑠×𝑟 matrix 𝐴=[𝑎𝑖𝑗] has fewer rows (𝑠) than columns (𝑟). Then the system 𝐴𝑋=0 has a nontrivial solution 𝑑⃗; that is, there is a 𝑑⃗ ≠0⃗ such that 𝐴𝑑⃗ =0⃗. In other words, ∑𝑗 𝑎𝑖𝑗𝑑𝑗=0 for 𝑖=1,2,⋯,𝑠. Therefore, ∑𝑗 𝑑𝑗𝑢⃗𝑗 = ∑𝑗 𝑑𝑗(∑𝑖 𝑎𝑖𝑗𝑣⃗𝑖) = ∑𝑖(∑𝑗 𝑎𝑖𝑗𝑑𝑗)𝑣⃗𝑖 = ∑𝑖 0𝑣⃗𝑖 = 0⃗, which contradicts the assumption that {𝑢⃗ 1,⋯,𝑢⃗ 𝑟} is linearly independent, because not all the 𝑑𝑗 are zero. This contradiction shows that 𝑠≥𝑟.

A matrix is a scalar matrix if it is of the form cI for some scalar c. Suppose A is a square matrix that is similar to a scalar matrix cI. Show that A = cI.

Suppose that A and cI are similar for some c ∈ R. Then there exists an invertible matrix P such that A = P(cI)P^−1. By the properties of matrix multiplication, it follows that P(cI)P^−1 = c(PIP^−1) = c(PP^−1) = cI. Therefore, we have that A = cI.

Null Space, or Kernel, of 𝐴

The ____ of a matrix 𝐴, also referred to as the kernel of 𝐴, is defined as follows. null(𝐴) = {𝑥⃗ :𝐴𝑥⃗ =0⃗ }

Image of 𝐴

The ____ of 𝐴, written im(𝐴) is given by im(𝐴) = {𝐴𝑥⃗ :𝑥⃗ ∈ℝ𝑛}

Nullity

The dimension of the null space of a matrix is called the ____, denoted dim(null(𝐴)).

A matrix is a scalar matrix if it is of the form cI for some scalar c. Use part (b) to show that the matrix A with columns [1,0]^T and [1,1]^T is not diagonalizable.

The only eigenvalue of A is 1, so it follows that if A were diagonalizable, it would have to be a scalar matrix. A is not a scalar matrix. Therefore A is not diagonalizable.

Let 𝑈 ⊆ ℝ𝑛 be an independent set. Then any vector 𝑥⃗ ∈ span(𝑈) can be written uniquely as a linear combination of vectors of 𝑈.

To prove this theorem, we will show that two linear combinations of vectors in 𝑈 that equal 𝑥⃗ must be the same. Let 𝑈={𝑢⃗ 1,𝑢⃗2,...,𝑢⃗𝑘}. Suppose that there is a vector 𝑥⃗ ∈ span(𝑈) such that 𝑥⃗ =𝑠1𝑢⃗1 + 𝑠2𝑢⃗2+⋯+ 𝑠𝑘𝑢⃗𝑘, for some 𝑠1,𝑠2,...,𝑠𝑘∈ℝ, and 𝑥⃗ =𝑡1𝑢⃗1 + 𝑡2𝑢⃗2+⋯+ 𝑡𝑘𝑢⃗𝑘, for some 𝑡1,𝑡2,...,𝑡𝑘∈ℝ. Then 0⃗ = 𝑥⃗ − 𝑥⃗ = (𝑠1−𝑡1)𝑢⃗1 + (𝑠2−𝑡2)𝑢⃗2 +⋯+ (𝑠𝑘−𝑡𝑘)𝑢⃗𝑘. Since 𝑈 is independent, the only linear combination that vanishes is the trivial one, so 𝑠𝑖−𝑡𝑖 =0 for all 𝑖, 1 ≤ 𝑖 ≤ 𝑘. Therefore, 𝑠𝑖 = 𝑡𝑖 for all 𝑖, 1 ≤ 𝑖 ≤ 𝑘, and the representation is unique.

Let 𝑉 be a nonempty collection of vectors in ℝ𝑛. Then 𝑉 is a subspace of ℝ𝑛 if and only if there exist vectors {𝑢⃗1,⋯,𝑢⃗𝑘} in 𝑉 such that 𝑉=span{𝑢⃗ 1,⋯,𝑢⃗ 𝑘}. Furthermore, if 𝑊 is another subspace of ℝ𝑛 and {𝑢⃗ 1,⋯,𝑢⃗ 𝑘} ⊆ 𝑊, then 𝑉 ⊆ 𝑊; that is, any other subspace of ℝ𝑛 that contains these vectors will also contain 𝑉.

We first show that if 𝑉 is a subspace, then it can be written as 𝑉=span{𝑢⃗ 1,⋯,𝑢⃗ 𝑘}. Pick a vector 𝑢⃗1 in 𝑉. If 𝑉=span{𝑢⃗1}, then you have found your list of vectors and are done. If 𝑉≠span{𝑢⃗ 1}, then there exists a vector 𝑢⃗2 of 𝑉 which is not in span{𝑢⃗ 1}. Consider span{𝑢⃗ 1,𝑢⃗ 2}. If 𝑉=span{𝑢⃗ 1,𝑢⃗ 2}, we are done. Otherwise, pick 𝑢⃗3 not in span{𝑢⃗ 1,𝑢⃗ 2}. Continue this way. Note that since 𝑉 is a subspace, these spans are each contained in 𝑉. The process must stop with 𝑢⃗𝑘 for some 𝑘≤𝑛, and thus 𝑉=span{𝑢⃗ 1,⋯,𝑢⃗ 𝑘}. Now suppose 𝑉=span{𝑢⃗ 1,⋯,𝑢⃗ 𝑘}; we must show this is a subspace. Take two of its vectors, ∑𝑐𝑖𝑢⃗𝑖 and ∑𝑑𝑖𝑢⃗𝑖, and scalars 𝑎 and 𝑏. Then 𝑎(∑𝑐𝑖𝑢⃗𝑖) + 𝑏(∑𝑑𝑖𝑢⃗𝑖) = ∑(𝑎𝑐𝑖+𝑏𝑑𝑖)𝑢⃗𝑖, which is one of the vectors in span{𝑢⃗ 1,⋯,𝑢⃗ 𝑘} and is therefore contained in 𝑉. This shows that span{𝑢⃗ 1,⋯,𝑢⃗ 𝑘} has the properties of a subspace. To prove that 𝑉⊆𝑊, we prove that if 𝑢⃗ ∈𝑉, then 𝑢⃗ ∈𝑊. Suppose 𝑢⃗ ∈𝑉. Then 𝑢⃗ =𝑎1𝑢⃗ 1+𝑎2𝑢⃗ 2+⋯+𝑎𝑘𝑢⃗ 𝑘 for some 𝑎𝑖∈ℝ, 1≤ 𝑖 ≤𝑘. Since 𝑊 contains each 𝑢⃗𝑖 and 𝑊 is a vector space, it follows that 𝑎1𝑢⃗1 +𝑎2𝑢⃗2+⋯+𝑎𝑘𝑢⃗𝑘∈𝑊.

We say a matrix has distinct eigenvalues if the algebraic multiplicity of each eigenvalue is exactly 1. Show that a matrix with distinct eigenvalues is diagonalizable

We know that a matrix is diagonalizable if for every eigenvalue, algebraic multiplicity = geometric multiplicity. For each eigenvalue there is always at least one eigenvector (because det(A − λI) = 0), so the geometric multiplicity is at least one; it is also never larger than the algebraic multiplicity. Since we are given that the algebraic multiplicity of every eigenvalue is 1, we have algebraic multiplicity = geometric multiplicity = 1 for each eigenvalue. Hence the matrix is diagonalizable.

Let 𝐴 and 𝐵 be 𝑚×𝑛 matrices such that 𝐴 can be carried to 𝐵 by elementary row/column operations. Then row(𝐴)=row(𝐵) / col(𝐴)=col(𝐵).

We will prove that the above is true for row operations, and the argument applies equally to column operations. Let 𝑟⃗ 1,𝑟⃗ 2,...,𝑟⃗ 𝑚 denote the rows of 𝐴. If 𝐵 is obtained from 𝐴 by interchanging two rows of 𝐴, then 𝐴 and 𝐵 have exactly the same rows, so row(𝐵)=row(𝐴). Suppose 𝑝≠0, and suppose that for some 𝑗, 1≤ 𝑗 ≤ 𝑚, 𝐵 is obtained from 𝐴 by multiplying row 𝑗 by 𝑝. Then row(𝐵)=span{𝑟⃗ 1,...,𝑝𝑟⃗ 𝑗,...,𝑟⃗ 𝑚}. Since {𝑟⃗ 1,...,𝑝𝑟⃗ 𝑗,...,𝑟⃗ 𝑚} ⊆ row(𝐴), it follows that row(𝐵) ⊆ row(𝐴). Conversely, since {𝑟⃗ 1,...,𝑟⃗ 𝑚} ⊆ row(𝐵), it follows that row(𝐴) ⊆ row(𝐵). Therefore, row(𝐵)=row(𝐴). Suppose 𝑝≠0, and suppose that for some 𝑖 and 𝑗, 1 ≤ 𝑖, 𝑗 ≤ 𝑚, 𝐵 is obtained from 𝐴 by adding 𝑝 times row 𝑗 to row 𝑖. Without loss of generality, we may assume 𝑖<𝑗. Then row(𝐵)=span{𝑟⃗ 1,...,𝑟⃗ 𝑖−1,𝑟⃗ 𝑖+𝑝𝑟⃗ 𝑗,...,𝑟⃗ 𝑗,...,𝑟⃗ 𝑚}. Since {𝑟⃗ 1,...,𝑟⃗ 𝑖−1,𝑟⃗ 𝑖+𝑝𝑟⃗ 𝑗,...,𝑟⃗ 𝑚}⊆row(𝐴), it follows that row(𝐵)⊆row(𝐴). Conversely, since {𝑟⃗1,...,𝑟⃗𝑚} ⊆ row(𝐵), it follows that row(𝐴) ⊆ row(𝐵). Therefore, row(𝐵)=row(𝐴).

Suppose B is a 5 × 5 matrix with real entries. Must B have at least one real eigenvalue?

Yes; complex eigenvalues of matrices with real entries come in conjugate pairs, so there is an even number of non-real eigenvalues. Since B is 5 × 5, its characteristic polynomial has odd degree 5, so at least one of the five eigenvalues must be real.
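
A numerical illustration with NumPy (a made-up random 5x5 matrix; odd degree forces at least one real root of the characteristic polynomial):

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((5, 5))       # random real 5x5 matrix
eigvals = np.linalg.eigvals(B)
# Real eigenvalues of a real matrix come back with zero imaginary part.
print(np.any(np.abs(eigvals.imag) < 1e-9))   # True: at least one real eigenvalue
```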

dimension of ℝ𝑛 is 𝑛

You only need to exhibit a basis for ℝ𝑛 which has 𝑛 vectors. Such a basis is the standard basis {𝑒⃗ 1,⋯,𝑒⃗ 𝑛}

system of linear equations

a list of equations 𝑎11𝑥1+𝑎12𝑥2+⋯+𝑎1𝑛𝑥𝑛=𝑏1, 𝑎21𝑥1+𝑎22𝑥2+⋯+𝑎2𝑛𝑥𝑛=𝑏2, ⋮ , 𝑎𝑚1𝑥1+𝑎𝑚2𝑥2+⋯+𝑎𝑚𝑛𝑥𝑛=𝑏𝑚, where the 𝑎𝑖𝑗 and 𝑏𝑗 are real numbers

homogeneous

each equation in the system has right-hand side equal to 0 (every 𝑏𝑖 = 0)

If the product AB is defined for two matrices A and B, then (AB)^2 = A^2B^2

false; (AB)^2 = ABAB, which need not equal A^2B^2 unless A and B commute

True or False: projection is invertible

false; a projection onto a proper subspace sends nonzero vectors to 0⃗, so it is not one-to-one and hence not invertible

linear combination

if there exist scalars 𝑎1,⋯,𝑎𝑛 such that 𝑉 = 𝑎1𝑋1 +⋯+ 𝑎𝑛𝑋𝑛

T : Rk --> Rn and S : Rn --> Rm Let A be the matrix such that T(x) = Ax and let B be the matrix such that S(x) = Bx. What is the size of B?

m x n

T : Rk --> Rn and S : Rn --> Rm Let A be the matrix such that T(x) = Ax and let B be the matrix such that S(x) = Bx. What is the size of A?

n x k

Do row operations change the row space?

no

If w is an eigenvector for a square matrix A, can w be the zero vector?

no

Do column operations change the column space?

No, but row operations can change the column space.

inconsistent

no solution

dimension

number of vectors in a basis

rank

the number 𝑟 of leading entries in the row-echelon form of 𝐴

Rank of a Matrix

rank(𝐴)=dim(row(𝐴))

matrices are equal when

they have the same number of rows and the same number of columns, and their corresponding entries are equal

Is the identity matrix symmetric., skew-symmetric, or neither?

symmetric

nontrivial

the system has a solution in which not all of 𝑥1,⋯,𝑥𝑛 are equal to zero

span

the collection of all linear combinations of a set of vectors {𝑢⃗ 1,⋯,𝑢⃗ 𝑘} in ℝ𝑛 is known as the ____ of these vectors and is written as ___ {𝑢⃗ 1,⋯,𝑢⃗ 𝑘}.

T : Rk --> Rn and S : Rn --> Rm Let A be the matrix such that T(x) = Ax and let B be the matrix such that S(x) = Bx. How can we find the matrix for the composition S following T?

the matrix for S following T is BA

What makes the product of matrices defined?

the number of columns in the first matrix is equal to the number of rows in the second matrix

trace

the sum of elements on the main diagonal of a matrix

consistent

there exists at least one solution

For any invertible square matrix A, (A^T )^−1 = (A^−1 )^ T .

true
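
A quick numerical check (assuming NumPy; A is a made-up invertible matrix):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 5.0]])
print(np.allclose(np.linalg.inv(A.T), np.linalg.inv(A).T))   # True
```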

If T is ANY linear transformation which maps Rn to Rm, there is always an (m x n) matrix A with the property that T(𝑥⃗) = A𝑥⃗.

true

If the rank of an n × n matrix is n, then its RREF is In.

true; every invertible n × n matrix has rank n. If the rank of an n × n matrix is less than n, then there is at least one free variable.

True or False: If A is an n × n matrix of rank n, then every linear system with coefficient matrix A has a unique solution.

true; if the rank of an n × n matrix is n, then its RREF is In. Every n × n matrix of rank n is invertible, so the system has the unique solution 𝑥⃗ = A^−1 𝑏⃗.

True or False: a matrix can be both symmetric and skew symmetric

true; the zero square matrix is both symmetric and skew-symmetric (and it is the only matrix that is both, since A = A^T and A = −A^T force A = 0)

Can an eigenvalue λ of a square matrix A be equal to zero?

yes; λ = 0 is an eigenvalue exactly when A is not invertible

T : Rk --> Rn and S : Rn --> Rm Let A be the matrix such that T(x) = Ax and let B be the matrix such that S(x) = Bx. Is the composition S following T defined?

yes

equivalent

𝐵 is _________ to the matrix 𝐴 provided that 𝐵 can be obtained from 𝐴 by performing a sequence of elementary row operations beginning with 𝐴

trivial solution

𝑥1=0, 𝑥2=0,⋯,𝑥𝑛=0; the zero vector is always a solution to a homogeneous system

