Linear Algebra Test 2

Corollaries 5.4.1

1) Rank A = Rank Aᵀ 2) If A is mxn, rank A <= m & rank A <= n 3) Rank A = Rank UA = Rank AV for invertible U & V 4) If A is mxn & B is nxm, rank AB <= rank A & rank AB <= rank B
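
The rank corollaries above can be spot-checked numerically; a minimal NumPy sketch with hypothetical matrices (a check, not a proof):

```python
import numpy as np

# Hypothetical example matrices to illustrate Corollaries 5.4.1.
A = np.array([[1., 2., 3.],
              [4., 5., 6.]])          # 2x3, so rank A <= 2 and rank A <= 3
U = np.array([[2., 1.],
              [1., 1.]])              # invertible 2x2 (det = 1)
B = np.array([[1., 0.],
              [0., 1.],
              [1., 1.]])              # 3x2

rank = np.linalg.matrix_rank
assert rank(A) == rank(A.T)                    # 1) rank A = rank A^T
assert rank(A) <= min(A.shape)                 # 2) rank bounded by both dimensions
assert rank(U @ A) == rank(A)                  # 3) invertible factor preserves rank
assert rank(A @ B) <= min(rank(A), rank(B))    # 4) rank AB <= rank A, rank B
```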

Thm (6.2.1) If W is a subset of a vector space V, then W is a subspace if and only if the following hold

1) 0 ∈ W 2) If u, v ∈ W then u + v ∈ W 3) If u ∈ W then au ∈ W for all a ∈ R

Thm (5.5.6) TFAE for a square matrix A whose characteristic polynomial c_A(x) factors completely

1) A is diagonalizable 2) dim[Eλ(A)] = multiplicity of λ for each eigenvalue λ

Thm (5.2.3) A is nxn TFAE

1) A is invertible 2) The columns of A are linearly independent 3) The columns of A span Rn 4) The rows of A are linearly independent 5) The rows of A span Rn

Thm 5.2.8 Let U ⊆ W be subspaces of Rn

1) dim U <= dim W 2) If dim U = dim W then U = W

Thm 5.2.6 Let U not equal {0} be a subspace of Rn

1) U has a basis & dim U <= n 2) Any independent set in U can be enlarged to a basis of U 3) Any spanning set of U can be cut down to a basis of U

Thm (6.4.2) Let U & W be subspaces of a finite dimensional space V

1) If U is a subset of W then dim U <= dim W 2) If U is a subset of W & dim U = dim W then U = W

Thm (5.4.2) Let A be an mxn matrix & have rank r

1) The n-r basic solutions to Ax=0 are a basis of the null space of A, so dim[null A] = n-r 2) By Thm 5.4.1(2), the pivot columns of A are a basis of im A = col A, so dim[im A] = r
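
A quick numerical check of dim[null A] = n - r, using a hypothetical rank-2 matrix and the SVD (the right singular vectors for the zero singular values form an orthonormal basis of the null space):

```python
import numpy as np

# Hypothetical 3x4 matrix: row 3 = row 1 + row 2, so r = 2 and n = 4.
A = np.array([[1., 2., 0., 1.],
              [0., 1., 1., 1.],
              [1., 3., 1., 2.]])

r = np.linalg.matrix_rank(A)
_, s, Vt = np.linalg.svd(A)
null_basis = Vt[r:]                     # (n - r) orthonormal null-space vectors
assert null_basis.shape[0] == A.shape[1] - r   # dim(null A) = n - r = 2
assert np.allclose(A @ null_basis.T, 0)        # each basis vector solves Ax = 0
```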

Thm 5.4.1 Let A be an mxn matrix of rank r; then dim(col A) = dim(row A) = r. Furthermore, if A -> R by elementary row operations (R in row-echelon form)

1) The r nonzero rows of R are a basis of row A 2) The columns of A corresponding to the leading 1s of R are a basis of col A

Thm (6.4.1) Let V be a vector space spanned by m vectors

1) V has a finite basis & dim V <= m 2) Every independent set of vectors in V can be extended to a basis (by adding vectors from some known basis) 3) If U is a subspace of V then a) U is finite dimensional & dim U <= dim V b) If dim U = dim V then U = V

Defn 6.3.2 {e1, e2, ..., en} is a basis of the vector space V if

1) it spans V 2) it is linearly independent

Thm 5.2.2 If A = [a1, a2, ... ,an] is mxn

1) {a1, a2, ..., an} is independent in Rm IFF Ax=0 implies x=0 2) Rm = span{a1, a2, ..., an} IFF Ax = b has a solution for every b in Rm

Defn 5.2.2 If U is a subspace of Rn, a set {x1,x2,...,xm} is called a basis of U if

1) {x1,x2,...,xm} is linearly independent 2) U = span {x1,x2,...,xm}

Defn 5.1.1 A set U of vectors in Rn is called a subspace of Rn if it satisfies:

1) The zero vector 0 ∈ U. 2) If x ∈ U and y ∈ U, then x + y ∈ U. 3) If x ∈ U, then ax ∈ U for every real number a.

Theorem (4.2.4): Let u and d ≠ 0 be vectors

1. The projection of u on d is given by proj_d u = ((u · d)/||d||^2) d. 2. The vector u − proj_d u is orthogonal to d.
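
The projection formula is easy to verify numerically; a sketch with hypothetical vectors u and d:

```python
import numpy as np

# Hypothetical vectors for Theorem 4.2.4.
u = np.array([3., 4., 0.])
d = np.array([1., 1., 1.])

proj = (np.dot(u, d) / np.dot(d, d)) * d   # proj_d(u) = ((u.d)/||d||^2) d
perp = u - proj                             # component of u orthogonal to d
assert np.isclose(np.dot(perp, d), 0.0)     # 2) u - proj_d(u) is orthogonal to d
assert np.allclose(proj + perp, u)          # the two pieces reassemble u
```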

Theorem (5.1.1): Let U = span {x1,x2,...,xk} in Rn, then

1. U is a subspace of Rn containing each xi. 2. If W is a subspace of Rn and each xi ∈W, then U ⊆W.

Theorem (4.3.2): Let u, v, and w denote vectors in R3.

1. u × v is a vector. 2. u × v is orthogonal to both u and v. 3. u × 0 = 0 = 0 × u 4. u × u = 0 5. u × v = −(v × u) 6. (ku) × v = k(u × v) = u × (kv) for any scalar k. 7. u × (v + w) = (u × v) + (u × w) 8. (v + w) × u = (v × u) + (w × u)
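
A few of these identities spot-checked with np.cross on hypothetical vectors (a check on one example, not a proof):

```python
import numpy as np

u = np.array([1., 2., 3.])
v = np.array([4., 0., -1.])
w = np.array([2., 2., 2.])
k = 5.0

c = np.cross(u, v)
assert np.isclose(np.dot(c, u), 0) and np.isclose(np.dot(c, v), 0)  # 2) orthogonal
assert np.allclose(np.cross(u, v), -np.cross(v, u))                 # 5) anticommutative
assert np.allclose(np.cross(k * u, v), k * np.cross(u, v))          # 6) scalar pull-out
assert np.allclose(np.cross(u, v + w),
                   np.cross(u, v) + np.cross(u, w))                 # 7) distributive
```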

Theorem (4.2.1): Let u, v, and w denote vectors in R3

1. v · w is a real number 2. v · w = w · v 3. v · 0 = 0 = 0 · v 4. v · v = ||v||^2 5. (kv) · w = k(v · w) = v · (kw) for all scalars k 6. u · (v ± w) = u · v ± u · w

Theorem (4.2.5): Let v and w be vectors in R3.

1. v × w is a vector orthogonal to both v and w. 2. If v and w are nonzero, then v × w = 0 if and only if v and w are parallel.

Theorem 4.3.4 If u and v are two nonzero vectors and θ is the angle between u and v, then

1. ||u × v|| = ||u|| ||v|| sin θ = the area of the parallelogram determined by u and v. 2. u and v are parallel if and only if u × v = 0.
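
The area formula on a hypothetical pair of vectors in the xy-plane, where the answer is easy to see by hand (base 2, height 3):

```python
import numpy as np

u = np.array([2., 0., 0.])
v = np.array([1., 3., 0.])

cos_t = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
sin_t = np.sqrt(1 - cos_t**2)          # theta in [0, pi], so sin(theta) >= 0
area = np.linalg.norm(np.cross(u, v))  # ||u x v||
assert np.isclose(area, np.linalg.norm(u) * np.linalg.norm(v) * sin_t)
assert np.isclose(area, 6.0)           # parallelogram with base 2, height 3
```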

Thm (5.5.1) If A & B are similar nxn matrices

A & B have the same determinant, rank, trace, characteristic polynomial, and eigenvalues
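
A sketch of these invariants: build B = P⁻¹AP from a hypothetical A and invertible P, then compare the quantities similarity preserves:

```python
import numpy as np

A = np.array([[2., 1.],
              [0., 3.]])
P = np.array([[1., 1.],
              [1., 2.]])               # invertible (det = 1)
B = np.linalg.inv(P) @ A @ P           # B is similar to A

assert np.isclose(np.linalg.det(A), np.linalg.det(B))          # same determinant
assert np.linalg.matrix_rank(A) == np.linalg.matrix_rank(B)    # same rank
assert np.isclose(np.trace(A), np.trace(B))                    # same trace
assert np.allclose(np.sort(np.linalg.eigvals(A)),
                   np.sort(np.linalg.eigvals(B)))              # same eigenvalues
```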

Defn 5.5.3

A is diagonalizable if there is an invertible P & diagonal D such that P^-1 * A * P = D. Writing P = [x1, x2, ..., xn], the columns xi are eigenvectors of A; since P must be invertible, the collection of xis is linearly independent
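
Diagonalization in practice, using np.linalg.eig on a hypothetical matrix with distinct eigenvalues (so it is guaranteed diagonalizable by Thm 5.5.5):

```python
import numpy as np

A = np.array([[4., 1.],
              [2., 3.]])               # eigenvalues 5 and 2 (distinct)
evals, P = np.linalg.eig(A)            # columns of P are eigenvectors x1, x2
D = np.diag(evals)

assert np.linalg.matrix_rank(P) == 2              # independent eigenvectors => P invertible
assert np.allclose(np.linalg.inv(P) @ A @ P, D)   # P^-1 A P = D
assert np.allclose(A, P @ D @ np.linalg.inv(P))   # equivalently A = P D P^-1
```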

Defn (4.2.2) Normal Vector

A nonzero vector n is called a normal vector for a plane if it is orthogonal to every vector in the plane.

Defn 5.2.1

A set of vectors {x1,x2,...,xk} is called linearly independent if t1x1 + t2x2 + ... + tkxk = 0 implies t1 = t2 = ... = tk = 0. If the set is not linearly independent, some vector can be written as a linear combination of the others

Defn 6.1.1 A vector space consists of a set V (called vectors) with an addition operation defined, and a set of scalars (usually R) with scalar multiplication defined, satisfying

A1) u, v ∈ V -> u + v ∈ V A2) u + v = v + u A3) u + (v + w) = (u + v) + w A4) 0 ∈ V where 0 + v = v + 0 = v A5) v ∈ V -> -v ∈ V, v + (-v) = 0 S1) v ∈ V -> av ∈ V for all a ∈ R S2) a(v + w) = av + aw S3) (a + b)v = av + bv S4) a(bv) = (ab)v S5) 1v = v

Thm (6.4.3) Let V be a finite dimensional vector space

Any spanning set of V can be cut down to a basis of V

Thm 5.3.5 If B is not linearly independent

B = {x1,x2,...,xn} is not an orthogonal set (contrapositive: every orthogonal set of nonzero vectors is linearly independent)

Thm 5.3.1

The dot product in Rn obeys the same algebraic rules as in R3: it is commutative, distributes over addition, scalars factor out, and x · x = ||x||^2

Thm 5.2.1 If {x1,x2,...,xk} are an independent set of vectors in Rn

Every vector in span {x1,x2,...,xk} has a unique representation as a linear combination of xis

Defn 5.5.4 If λ is an eigenvalue of an nxn matrix A, the eigenspace of A corresponding to λ is

Eλ(A) = { x e Rn | Ax = λx}
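
Since Ax = λx is equivalent to (A − λI)x = 0, the eigenspace is the null space of A − λI; a sketch computing it via the SVD for a hypothetical matrix where λ = 2 has multiplicity 2:

```python
import numpy as np

A = np.array([[2., 0., 0.],
              [0., 2., 0.],
              [0., 0., 5.]])
lam = 2.0
M = A - lam * np.eye(3)                # E_lam(A) = null(M)

r = np.linalg.matrix_rank(M)
_, _, Vt = np.linalg.svd(M)
E = Vt[r:]                             # orthonormal basis of the eigenspace
assert E.shape[0] == 2                 # dim E_2(A) = multiplicity of lambda = 2
assert np.allclose(A @ E.T, lam * E.T) # every basis vector satisfies Ax = lam x
```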

Thm 5.2.4 Let U be a subspace of Rn.

If U is spanned by a set of m vectors & there is a linearly independent set of k vectors in U, then k <= m. In other words, a linearly independent set cannot be bigger than a spanning set

Thm (5.2.5)

If {x1,x2,...,xm} & {y1,y2,...,yk} are both bases of a subspace U then m=k

Defn 5.2.3

If {x1,x2,...,xm} is a basis for U, the dimension of U is the number of vectors in the basis

Thm 5.2.7

Let U be a subspace of Rn where dim U = m & let B = {x1,x2,...,xm} be a set of m vectors. B is linearly independent IFF B spans U

Thm (6.4.4) Let V be a vector space of dim V = n & suppose S is a set of n vectors in V

S is linearly independent IFF S spans V

Defn 5.4.1

span{columns of A} = col A, which is a subspace of Rm; span{rows of A} = row A, which is a subspace of Rn

Null Space

The null space of an m x n matrix A is the set of all solutions to Ax = 0 null A = { x e R^n | Ax = 0}

Span

The set of all linear combinations of a set of vectors: span{x1,x2,...,xk} = {t1x1 + t2x2 + ... + tkxk | ti ∈ R}

Defn 5.5.2 Given an nxn matrix A

The trace of A (tr A) is the sum of the entries on the main diagonal

Theorem 4.3.5

The volume of the parallelepiped determined by three vectors w, u, and v is given by |w ·(u×v)|.
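
The triple product also equals the determinant of the matrix with rows w, u, v; a check with hypothetical axis-aligned vectors, where the volume is just the box 2 × 3 × 4:

```python
import numpy as np

u = np.array([2., 0., 0.])
v = np.array([0., 3., 0.])
w = np.array([0., 0., 4.])

vol = abs(np.dot(w, np.cross(u, v)))                  # |w . (u x v)|
assert np.isclose(vol, 24.0)                          # 2 * 3 * 4 box
assert np.isclose(vol, abs(np.linalg.det(np.array([w, u, v]))))
```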

Thm (5.5.5) If A is an nxn matrix with n distinct eigenvalues

Then A is diagonalizable

Defn 6.4.1 Let V be a vector space that can be spanned by a finite collection of vectors

Then V is called a finite dimensional space. Otherwise V is called an infinite dimensional space

Thm (6.3.1) If {v1,v2,...,vn} is a linearly independent set of vectors in vector space V

Then every v ∈ V can be expressed as a linear combination of the vis in at most one way. Sketch: if v = a1v1 + a2v2 + ... + anvn and v = b1v1 + b2v2 + ... + bnvn, then 0 = (a1-b1)v1 + (a2-b2)v2 + ... + (an-bn)vn; by independence every coefficient is 0, so ai = bi

Theorem (4.2.2) Let v and w be nonzero vectors

Then v · w = ||v|| * ||w|| * cos θ, where θ is the angle between v and w
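
Rearranged, the formula recovers the angle: θ = arccos((v · w)/(||v|| ||w||)). A check with hypothetical vectors 45° apart:

```python
import numpy as np

v = np.array([1., 0., 0.])             # x-axis
w = np.array([1., 1., 0.])             # 45 degrees from the x-axis

cos_t = np.dot(v, w) / (np.linalg.norm(v) * np.linalg.norm(w))
theta = np.arccos(cos_t)
assert np.isclose(theta, np.pi / 4)    # 45 degrees
```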

Thm (5.5.4) Let x1, x2, ..., xk be eigenvectors corresponding to distinct eigenvalues λ1, λ2, ..., λk of an nxn matrix A (k <= n).

Then {x1,x2,...,xk} is a linearly independent set

Definition 4.1

Two vectors u & v are parallel if there is some non-zero constant c such that u = cv

Defn (4.2.1)

Two vectors v and w are orthogonal if v ·w = 0.

Defn 6.2.1 If W is a subset of vector space V

We say W is a subspace if it is also a vector space

Defn 5.5.1 If A & B are nxn matrices

We say they are similar, A~B, if B = P^-1 * A * P for some invertible P. Similarity is an equivalence relation: 1) A~A (reflexive) 2) If A~B then B~A (symmetric) 3) If A~B & B~C then A~C (transitive)

Defn 6.3.1 A collection of vectors {v1, v2, ..., vn} of a vector space V is called linearly independent if

c1v1 + c2v2 + ... + cnvn = 0 implies c1 = c2 = ... = cn = 0

Def 5.4.2 A is an mxn matrix

dim(null A) + dim(im A) = n. Why? If A has rank r, then dim(null A) = n - r and dim(im A) = r (Thm 5.4.2), so the sum is n

Image space

im A = { Ax | x e R^n }

Thm (6.3.2) Suppose a vector space V can be spanned by n vectors. If a set of m vectors in v is linearly independent then

m <= n. Why? An independent set can be no larger than a spanning set, just as in Rn (Thm 5.2.4)

Thm (6.3.2) If {e1, e2, ..., en} & {f1, f2, ..., fm} are basis of a vector space V then

m = n

Theorem (4.3.1): u·(v×w) =

u·(v×w) equals the determinant of the 3x3 matrix whose rows are u, v, w (i.e., compute the usual v×w determinant with i, j, k replaced by the components of u)

Defn 5.3.1

x, y ∈ Rn are orthogonal if x · y = 0. A collection of vectors {x1,x2,...,xn} is called an orthogonal set if xi · xj = 0 for i ≠ j and xi ≠ 0 for all i. An orthogonal set is called orthonormal if ||xi|| = 1 for all i
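
These conditions can be checked all at once through the Gram matrix XᵀX, whose (i, j) entry is xi · xj; a sketch with a hypothetical orthogonal set:

```python
import numpy as np

# Columns of X are the hypothetical vectors x1, x2, x3.
X = np.array([[1., 1., 0.],
              [1., -1., 0.],
              [0., 0., 2.]]).T
G = X.T @ X                                  # Gram matrix: G[i, j] = xi . xj
assert np.allclose(G, np.diag(np.diag(G)))   # off-diagonal zero => orthogonal set
Xn = X / np.linalg.norm(X, axis=0)           # normalize each column
assert np.allclose(Xn.T @ Xn, np.eye(3))     # orthonormal: Gram matrix = identity
```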

Thm 5.3.2 x, y e Rn

|x · y| <= ||x|| * ||y||, with equality |x · y| = ||x|| * ||y|| IFF x = ay for some scalar a

Theorem (4.3.3):If u and v are vectors in R3, then ||u ×v||^2 =

||u||^2 *||v||^2 −(u ·v)^2.
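
The Lagrange identity checked on hypothetical vectors (u×v = (8, 2, −6), so both sides equal 104):

```python
import numpy as np

u = np.array([1., 2., 2.])
v = np.array([3., 0., 4.])

c = np.cross(u, v)
lhs = np.dot(c, c)                                    # ||u x v||^2
rhs = np.dot(u, u) * np.dot(v, v) - np.dot(u, v)**2   # ||u||^2 ||v||^2 - (u.v)^2
assert np.isclose(lhs, rhs)
assert np.isclose(lhs, 104.0)
```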

