Linear Algebra XM1

A 2x2 matrix A = [a b; c d] has an inverse iff δ = ad - bc ≠ 0, so the inverse of A is

(1/δ)[d -b; -c a]
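
A quick numpy sketch of this formula (the example matrix and helper name are illustrative, not from the card):

```python
import numpy as np

# 2x2 inverse: A = [[a, b], [c, d]], delta = ad - bc
def inverse_2x2(A):
    a, b = A[0]
    c, d = A[1]
    delta = a * d - b * c
    if delta == 0:
        raise ValueError("matrix is singular (delta = 0)")
    return (1 / delta) * np.array([[d, -b], [-c, a]])

A = np.array([[4.0, 7.0], [2.0, 6.0]])
print(inverse_2x2(A))   # matches np.linalg.inv(A)
```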

matrix multiplication the fancy formal way for (AB)ij

(AB)ij = (k=1 to n)∑Aik*Bkj, so if A∈Mp,n and B∈Mn,q then AB∈Mp,q; the (i,j) entry is (ith row of A)*(jth column of B)
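
A minimal Python/numpy sketch of this entry formula (the example matrices are arbitrary):

```python
import numpy as np

# (AB)_ij = sum over k of A_ik * B_kj, spelled out with a loop
def matmul_entry(A, B, i, j):
    n = A.shape[1]                        # must equal B.shape[0]
    return sum(A[i, k] * B[k, j] for k in range(n))

A = np.array([[1, 2, 3], [4, 5, 6]])          # 2x3
B = np.array([[7, 8], [9, 10], [11, 12]])     # 3x2
print(matmul_entry(A, B, 0, 1), (A @ B)[0, 1])  # both 64
```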

properties of transpose

(A^T)^T=A (A±B)^T=A^T±B^T (cA)^T=c(A^T)

transpose of a matrix

(A^T)ij=Aji if A is mxn, then A^T is nxm

(AB)^-1=

(B^-1)(A^-1)
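
A small numerical spot-check of this identity in numpy (the matrices are arbitrary invertible examples):

```python
import numpy as np

# Check that (AB)^-1 = B^-1 A^-1 on a concrete pair of matrices
A = np.array([[2.0, 1.0], [1.0, 3.0]])
B = np.array([[1.0, 4.0], [0.0, 2.0]])
lhs = np.linalg.inv(A @ B)
rhs = np.linalg.inv(B) @ np.linalg.inv(A)
print(np.allclose(lhs, rhs))  # True
```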

If A is a matrix and c,d are constants, d(cA)=(cd)A

(d(cA))ij = d((cA)ij) = d(c(Aij)) = (dc)Aij = (cd)Aij = ((cd)A)ij

if a=cb and c>0 then theta is

0°

properties of nonsingular matrices

1) (A⁻¹)⁻¹=A
2) (Aⁿ)⁻¹=(A⁻¹)ⁿ=A⁻ⁿ
3) (AB)⁻¹=B⁻¹A⁻¹
4) (A^T)⁻¹=(A⁻¹)^T
5) rank(A)=n
prove 1)-4) by just multiplying both sides of the equation together

properties of matrix multiplication

1) 0A=0∈Mr,q if A∈Mp,q and 0∈Mr,p
2) IA=A
3) (B+C)A=BA+CA
4) (AB)C=A(BC)
5) (AB)^T=B^T*A^T

method for finding the inverse of a matrix that has one. So if A is square

1) augment A to an nx2n matrix, whose first n columns form A itself and whose remaining n columns form In
2) convert this [A|In] to rref
3) the A side has to reduce to In; if it can't, stop (A is singular)
4) if it does, the result resembles [In|A⁻¹]
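
A rough numpy sketch of the [A|In] → [In|A⁻¹] procedure, with partial pivoting added for numerical stability (function name and example matrix are illustrative):

```python
import numpy as np

# Row-reduce [A | I] to [I | A^-1]
def gauss_jordan_inverse(A):
    n = A.shape[0]
    M = np.hstack([A.astype(float), np.eye(n)])   # augment A with I_n
    for col in range(n):
        pivot = np.argmax(np.abs(M[col:, col])) + col
        if np.isclose(M[pivot, col], 0):
            raise ValueError("A is singular; left side cannot reach I")
        M[[col, pivot]] = M[[pivot, col]]          # swap rows
        M[col] /= M[col, col]                      # scale pivot row to 1
        for r in range(n):
            if r != col:
                M[r] -= M[r, col] * M[col]         # clear the rest of the column
    return M[:, n:]                                # right half is A^-1

A = np.array([[2.0, 1.0], [5.0, 3.0]])
print(gauss_jordan_inverse(A))   # compare with np.linalg.inv(A)
```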

a matrix is in reduced row echelon form iff

1) the first nonzero entry in each row is 1
2) each successive row has its first nonzero entry in a later column
3) all entries above and below the leading entry of each row are 0
4) all rows of zeroes are the final rows of the matrix

if a=cb and c<0 then theta is

180°

the solution set of a system of equations is unchanged if

1) the equations are listed in a different order
2) one equation is multiplied by a nonzero constant
3) one equation is replaced by the sum of it and any multiple of another equation

if a*b=0 then theta is

90° (orthogonal)

effect of row operations on matrix multiplication

A,B are matrices s.t. their product is defined 1) if R is any row operation, then R(AB)=(R(A))B 2) if R1,...,Rn are row operations, then Rn(...(R2(R1(AB)))...) = (Rn...(R2(R1(A)))...)B

AB=AC doesn't necessarily mean B=C. However, if A is invertible, then

A-¹(AB)=A-¹(AC) → B=C

dimension requirements for multiplying two matrices AB

AB is defined iff A is m x n and B is n x p; the resulting dimension is m x p

If A is a square matrix, then B is an inverse to A iff

AB=I

definition of matrix inverse (invertible)

AB=I and BA=I B=A^-1 and A=B^-1

proof that if A,B are inverses of each other and A,C are inverses of each other, then C=B

AB=I, BA=I, AC=I, CA=I; then BAC=(BA)C=IC=C and BAC=B(AC)=BI=B, so C=B

homogeneous system and properties

AX=0; the trivial solution is the one with all zeroes, nontrivial otherwise
in rref, if Amn has fewer pivot entries than n, the system has a nontrivial solution (rank<n)
if A has exactly as many pivot entries as n, then the system has only the trivial solution (rank=n)
if there are fewer equations than variables, then there's at least one nonpivot column (with at least one independent variable taking on any value), so there are infinitely many solutions
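
A minimal numpy illustration of the rank test above (the matrix is an arbitrary example with fewer pivots than columns):

```python
import numpy as np

# For AX = 0: rank < number of columns  <=>  nontrivial solutions exist
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])        # 2 equations, 3 unknowns
rank = np.linalg.matrix_rank(A)
print(rank, A.shape[1])                # 1 < 3, so nontrivial solutions exist
```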

diagonal matrices

Aij=0 whenever i≠j; has to be square

proof that (B+C)A=BA+CA

A∈Mp,q and B,C∈Mr,p; ((B+C)A)ij = (k=1 to p)∑(B+C)ik*Akj = ∑(Bik+Cik)*Akj = ∑(Bik*Akj+Cik*Akj) = ∑Bik*Akj + ∑Cik*Akj = (BA)ij+(CA)ij = (BA+CA)ij //

proof that IA=A

A∈Mp,q and I∈Mp,p, then IA=A∈Mp,q; the (i,j) entry of IA is (k=1 to p)∑(I)ik*(A)kj = (A)ij, since Iik=1 when k=i and 0 otherwise

prove that if AC=B and [A|B]~[D|G] then DC=G

En...E1[A|B]=[D|G], so En...E1*A=D and En...E1*B=G; then DC=(En...E1*A)C=En...E1*(AC)=En...E1*B=G. Corollary: if X1 is a solution to AX=B, then X1 is also a solution to DX=G, and the converse is true

normalizing a vector

If x∈Rⁿ and x≠0, then u=(1/‖x‖)*x is a unit vector in the same direction as x
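
A tiny numpy sketch of normalization (the example vector is arbitrary):

```python
import numpy as np

# u = (1/||x||) * x is a unit vector in the direction of x (x must be nonzero)
x = np.array([3.0, 4.0])
u = x / np.linalg.norm(x)
print(u, np.linalg.norm(u))   # [0.6 0.8] 1.0
```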

scalar multiplication of vector magnitude

Let v∈Rⁿ and c∈R; then |c|*‖v‖=‖cv‖

singular matrix

a matrix that has no inverse

finding angle btwn two vectors using dot product

a*b=‖a‖‖b‖cosθ, where θ is the angle btwn a and b
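
A small numpy sketch that solves this formula for θ (the example vectors are arbitrary):

```python
import numpy as np

# theta = arccos( (a . b) / (||a|| ||b||) ), for nonzero a, b
a = np.array([1.0, 0.0])
b = np.array([1.0, 1.0])
cos_theta = a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
print(np.degrees(np.arccos(cos_theta)))   # ~45.0
```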

length of vector

also called norm or magnitude; if v=[v1,v2,...] then ‖v‖=√((v1)²+(v2)²+...); ‖v‖²=v*v, i.e. ‖v‖=√(v*v)

unit vector

any vector of length 1

proof that xy=‖x‖‖y‖ iff y=cx, c>0

assume that y=cx with c>0; then xy = x(cx) = c‖x‖² = c‖x‖‖x‖ = ‖x‖‖cx‖ = ‖x‖‖y‖, using ‖cx‖=c‖x‖ since c>0

proof by induction

base step: prove that the desired statement is true for the initial value i∈Z; inductive step: assume the statement is true for an arbitrary integer k≥i and prove it is then true for the next integer value k+1 as well

If A is a matrix and c is a constant, (cA)ij=?

c(Aij)

scalar matrices

c*I (scalar multiplication of identity matrix)

matrix reflexivity

every matrix A is row equivalent to itself; proof: IA=A

prove that if A,B are both square, of the same size, and invertible, then AB is also invertible

find an inverse for AB: need (inverse)(AB)=(AB)(inverse)=I; try B^-1A^-1: (B^-1A^-1)(AB)=B^-1((A^-1*A)B)=B^-1(IB)=B^-1B=I; similarly (AB)(B^-1A^-1)=I

proof that row operations do not change solution sets

given M and B, let N=[M|B]; if v is a solution, i.e. Mv=B, then E(Mv)=EB → (EM)v=EB, so v also solves E[M|B]=[EM|EB]; since E is invertible, the argument reverses, so the two systems have the same solution set

matrix transitive property

if A∼B and B∼C, then A∼C. Proof: En*En-1*...*E1*A=B and Fm...F1*B=C, so Fm...F1((En...E1)A)=C → (Fm...F1*En...E1)A=C

matrix symmetry

if A∼B, then B∼A; ∼ is an equivalence relation (think of E). Proof: En...E1A=B, then En^-1[En...E1A]=En^-1B, so En-1...E1A=En^-1B; now iterate until A=E1^-1...En^-1B

geometric interpretation of vectors

if v∈R², imagine coordinates; v is the movement from one point to another, e.g. from (3,2) to (1,5) gives v=[-2,3]
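
The card's example as a two-line numpy sketch:

```python
import numpy as np

# Vector from point (3, 2) to point (1, 5): subtract start from end
start = np.array([3, 2])
end = np.array([1, 5])
print(end - start)   # [-2  3]
```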

finding individual solutions to a system that has infinite solutions

let nonpivot variables take on any real value and derive the other pivot variables from these choices

if A is a not a square matrix, then it has ___ inverse

no

rank of matrix

number of nonzero rows in reduced row echelon form
two row equivalent matrices have the same rank
if the equations are consistent, the solution set has (# of variables) - (rank) degrees of freedom = dimension of the solution space
rank is less than or equal to the number of rows
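
A short numpy illustration of rank and degrees of freedom (the example matrix has one dependent row; names and values are arbitrary):

```python
import numpy as np

# rank(A) and degrees of freedom = (# variables) - rank, for a consistent system
A = np.array([[1.0, 2.0, 1.0],
              [0.0, 1.0, 3.0],
              [1.0, 3.0, 4.0]])              # third row = first + second
rank = np.linalg.matrix_rank(A)
print(rank)                                  # 2
print(A.shape[1] - rank)                     # 1 degree of freedom
```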

AX=B has an infinite number of solutions if

the system is consistent and at least one of the columns left of the augmentation bar has no pivot entry in rref. The nonpivot columns correspond to (independent) variables that can take on any value, and the values of the remaining (dependent) variables are determined from those. Equivalently, the system has at least two solutions

elementary matrices

result of performing one row operation on the identity matrix; every elementary matrix has an inverse that is also elementary

if A~B then they have the _____ solution set

same

linear combination

sum of scalar multiples of a list of vectors

symmetric and skew-symmetric matrices

symmetric: A=A^T (for any square A, A+A^T is symmetric); skew-symmetric: A=-A^T (for any square A, A-A^T is skew-symmetric)
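
A small numpy sketch of the related standard decomposition A = (A+A^T)/2 + (A-A^T)/2, which is not spelled out on the card but uses exactly these two forms (the example matrix is arbitrary):

```python
import numpy as np

# Any square A splits into a symmetric part plus a skew-symmetric part
A = np.array([[1.0, 2.0], [5.0, 3.0]])
S = (A + A.T) / 2      # symmetric part:      S == S.T
K = (A - A.T) / 2      # skew-symmetric part: K == -K.T
print(np.allclose(S + K, A), np.allclose(S, S.T), np.allclose(K, -K.T))
```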

inconsistent system

system's solution set is empty; no solutions

find only a particular row or column of a matrix product

the kth row of AB is the product (kth row of A)*B; the jth column of AB is the product A*(jth column of B)
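
A quick numpy check of both statements (the indices and matrices are arbitrary examples):

```python
import numpy as np

# kth row of AB equals (kth row of A) @ B; jth column equals A @ (jth column of B)
A = np.array([[1, 2], [3, 4], [5, 6]])
B = np.array([[7, 8, 9], [10, 11, 12]])
k, j = 1, 2
print(np.allclose((A @ B)[k, :], A[k, :] @ B))   # True
print(np.allclose((A @ B)[:, j], A @ B[:, j]))   # True
```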

row space

the subset of Rⁿ consisting of all vectors that are linear combinations of the rows of A; a typical question is to determine whether a given vector is in the row space of A (dimensions matching), e.g. [5,2]=c1[1,2]+c2[2,1]+..., which gives the system 1c1+2c2=5, 2c1+1c2=2

row equivalent

two matrices C,D are row equiv if there's a sequence of elementary matrices E1,E2,...,En s.t. En...E2E1C=D every matrix is row equivalent to a unique matrix in reduced row echelon form

parallel vectors

two vectors are parallel if they point in the same or opposite direction (x=cy for some nonzero c); then xy=±‖x‖‖y‖ and cosθ=±1 (θ=0° or 180°)

upper/lower triangular matrix

upper: Aij=0 if i>j lower: Aij=0 if i<j

what's a dot product (applications?)

vector multiplication (v*w=v1w1+v2w2+... ∈ R); applications: the angle between two vectors; √(v*v)=‖v‖ (the length/magnitude of vector v)

proof that if a,b are unit vectors then -1 ≤ a*b ≤ 1

where does a*b show up in a dot product? in (a+b)(a+b) or (a-b)(a-b). (a+b)(a+b) = ‖a+b‖² ≥ 0, and expanding gives (a*a)+2ab+(b*b) = ‖a‖²+2ab+‖b‖² = 1+2ab+1 ≥ 0, so ab≥-1. Likewise (a-b)(a-b) = ‖a-b‖² ≥ 0 expands to 1-2ab+1 ≥ 0, so ab≤1 //

orthogonal vectors

xy=0 x and y are perpendicular to each other

prove that if xy=±‖x‖ ‖y‖ (cosθ=±1) then x and y are parallel (y=cx)

if y=cx, then xy = x(cx) = c‖x‖², so the only candidate is c=xy/‖x‖²; xy≠0 because |xy|=‖x‖‖y‖ and ‖x‖,‖y‖ are nonzero. Show that y-cx=0, i.e. y-(xy/‖x‖²)x=0: using the property that z*z=0 iff z=0, expand ‖y-(xy/‖x‖²)x‖²; the hypothesis xy=±‖x‖‖y‖ makes it 0, and we're good

Cauchy-Schwarz Inequality

|xy|≤‖x‖‖y‖

proof of reverse triangle inequality

|y|=|x+y-x|≤|y-x|+|x| and |x|=|y+x-y|≤|x-y|+|y|, so |y|-|x|≤|y-x| and |x|-|y|≤|x-y|; since |y-x|=|x-y| and |x|-|y|=-(|y|-|x|), we get |(|x|-|y|)|≤|x-y|

triangle inequality

‖x+y‖≤‖x‖+‖y‖

properties of matrix addition

∀ A, B, C ∈ Mm,n:
1) A+B=B+A
2) (A+B)+C=A+(B+C)
3) A+0=A
4) A+(-A)=(-A)+A=0, i.e. ∀A ∃B s.t. A+B=0
5) c(A+B)=cA+cB
6) (c+d)A=cA+dA
7) (cd)A=c(dA)
8) A+B=C+B → A=C

properties of scalar multiplication of matrices

∀ A, B, C ∈ Mm,n and c∈R 1) c(dA)=(cd)A 2) 1A=A 3) c(A+B)=cA+cB 4) (c+d)A=cA+dA

properties of vector multiplication

∀u,v,w∈Rⁿ and r∈R
Commutative: u*v=v*u
Distributive: u*(v+w)=u*v+u*w
Positivity: v*v≥0, and v*v=0 → v=zero vector
Scalars pull out: (ru)*v=r(u*v)

Properties of vector addition

∀u,v,w∈Rⁿ and ∀r,s∈R
Commutative: u+v=v+u
Associative: (u+v)+w=u+(v+w)
Addition identity: v+zero vector=v
Cancellation: u+w=v+w → u=v
Distributive: (r+s)v=rv+sv AND ru+rv=r(u+v)
Associative (scalar multiplication): (rs)v=r(sv)
Multiplication identity: 1*v=v

