Linear Algebra

A transformation (or mapping) T is linear if

(i) T(u + v) = T(u) + T(v) for all u, v in the domain of T; (ii) T(cu) = cT(u) for all u in the domain of T and all scalars c

row reduction algorithm

1) Locate the leftmost nonzero column; this is the first pivot column, and the pivot position is at its top.
2) Choose a nonzero entry in the pivot column to be the pivot. If necessary, interchange rows to move this entry into the pivot position, and scale the row to make the pivot equal to 1.
3) Use row replacement to create zeros in all entries below the pivot.
4) Cover the row containing the pivot position (and any rows above it) and apply steps 1-3 to the submatrix that remains; repeat until there are no more nonzero rows to modify.
5) Beginning with the rightmost pivot and working upward and to the left, make each pivot equal to 1 by row scaling if it is not already, and use row replacement to create zeros above each pivot.
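
A minimal sketch of this algorithm in Python with NumPy (the function name rref and the tolerance are my own choices; for brevity, the backward phase is folded into the main loop by clearing the entries above each pivot as soon as it is found):

```python
import numpy as np

def rref(A, tol=1e-12):
    """Row reduce A to reduced echelon form (forward and backward phases combined)."""
    A = A.astype(float).copy()
    m, n = A.shape
    pivot_row = 0
    for col in range(n):                       # step 1: scan columns left to right
        if pivot_row >= m:
            break
        # step 2: pick a nonzero entry in this column, at or below pivot_row, as the pivot
        nonzero = np.where(np.abs(A[pivot_row:, col]) > tol)[0]
        if nonzero.size == 0:
            continue                           # no pivot in this column
        swap = pivot_row + nonzero[0]
        A[[pivot_row, swap]] = A[[swap, pivot_row]]      # interchange
        A[pivot_row] /= A[pivot_row, col]                # scale the pivot to 1
        # steps 3 and 5: use row replacement to zero out the rest of the column
        for r in range(m):
            if r != pivot_row:
                A[r] -= A[r, col] * A[pivot_row]
        pivot_row += 1                         # step 4: move on to the submatrix below
    return A

# Example: the augmented matrix of a small consistent system.
M = np.array([[1, 2, 3],
              [2, 5, 8]])
print(rref(M))   # [[ 1.  0. -1.]
                 #  [ 0.  1.  2.]]
```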

Properties of Matrix Multiplication

1. A(BC) = (AB)C
2. A(B + C) = AB + AC and (A + B)C = AC + BC
3. r(AB) = (rA)B = A(rB)
4. If A is m x n, then Im A = A = A In
*Not commutative! *No canceling! *If AB = 0, neither A nor B necessarily equals 0
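
A quick numerical sanity check of these properties (a sketch, not a proof; the random 3 x 3 matrices are my own choice so that every product is defined):

```python
import numpy as np

rng = np.random.default_rng(0)
A, B, C = (rng.random((3, 3)) for _ in range(3))
r, I = 2.5, np.eye(3)

print(np.allclose(A @ (B @ C), (A @ B) @ C))            # 1. associativity
print(np.allclose(A @ (B + C), A @ B + A @ C))          # 2. left distributivity
print(np.allclose((A + B) @ C, A @ C + B @ C))          #    right distributivity
print(np.allclose(r * (A @ B), (r * A) @ B))            # 3. scalars factor out
print(np.allclose(I @ A, A) and np.allclose(A @ I, A))  # 4. identity
print(np.allclose(A @ B, B @ A))                        # almost always False: not commutative
```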

echelon form

1. All nonzero rows are above any rows of all zeros; 2. Each leading entry of a row is in a column to the right of the leading entry of the row above it; 3. All entries in a column below a leading entry are zeros

vectors in R3

3 x 1 column matrices with three entries; geometrically, they correspond to points in 3D space

A transpose transpose

A

Let A, B, and C be matrices of the same size, and let r and s be scalars.

A + B = B + A; (A + B) + C = A + (B + C); A + 0 = A; r(A + B) = rA + rB; (r + s)A = rA + sA; r(sA) = (rs)A

pivot column

A column that contains a pivot position

general solution

A family of solutions that contains all possible solutions of a linear system.

each column of AB

A linear combination of the columns of A using weights from the corresponding column of B
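
A small NumPy illustration of this fact with a made-up 2 x 2 example: column j of AB equals A times column j of B, i.e., a linear combination of the columns of A whose weights are the entries of column j of B.

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
B = np.array([[5, 6],
              [7, 8]])
j = 1
print((A @ B)[:, j])                          # column j of AB           -> [22 50]
print(A @ B[:, j])                            # A times column j of B    -> [22 50]
print(B[0, j] * A[:, 0] + B[1, j] * A[:, 1])  # same weighted column sum -> [22 50]
```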

identity matrix

A square matrix with ones (1s) along the main diagonal, from the upper left element to the lower right element, and zeros (0s) everywhere else; multiplying any matrix of compatible size by it leaves that matrix unchanged.

linear combination

A sum of scalar multiples of vectors. The scalars are called the weights.

A+B transpose

A transpose + B transpose

if A is an m x n matrix, u and v are vectors in Rn, and c is a scalar

A(u + v) = Au + Av and A(cu) = c(Au)

Characterization of Linearly Dependent Sets

An indexed set S = {v1,...,vp} of two or more vectors is linearly dependent if and only if at least one of the vectors in S is a linear combination of the others.

Homogeneous Linear System

Ax = 0

matrix equation

Ax=b

AB Transpose

B transpose x A transpose (reverse order!)
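
A quick NumPy spot check of the reverse-order rule (a sketch; the 2 x 3 and 3 x 4 shapes are arbitrary choices of mine):

```python
import numpy as np

rng = np.random.default_rng(1)
A, B = rng.random((2, 3)), rng.random((3, 4))
print(np.allclose((A @ B).T, B.T @ A.T))  # True: the transpose of a product reverses the order
```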

matrix transformation domain and codomain

Domain of T is Rn when A has n columns and codomain of T is Rm when each column of A has m entries.

A set of one vector is linearly independent

IFF the vector v is not the zero vector

Row-Column Rule for Computing AB

If the product AB is defined, then the entry in row i and column j of AB is the sum of the products of corresponding entries from row i of A and column j of B.
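
The rule translated directly into Python (a sketch; the helper name matmul_entry is mine), checked against NumPy's built-in product:

```python
import numpy as np

def matmul_entry(A, B, i, j):
    """Entry (i, j) of AB: sum of products of row i of A with column j of B."""
    return sum(A[i, k] * B[k, j] for k in range(A.shape[1]))

A = np.array([[1, 2, 3],
              [4, 5, 6]])
B = np.array([[7, 8],
              [9, 10],
              [11, 12]])
print(matmul_entry(A, B, 1, 0))  # 4*7 + 5*9 + 6*11 = 139
print((A @ B)[1, 0])             # 139, agrees with the rule
```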

existence question

Is the system consistent?

uniqueness question

Is there only one solution?

The set of all vectors with two entries is denoted by

R^2 (R stands for real numbers and 2 indicates each vector contains two entries)

matrix transformation range

Range of T is set of all linear combinations of Columns of A because each image T(x) is of the form Ax

Elementary Row Operations

Replacement, Interchange, Scaling

Reduced Echelon Form

Same as echelon form, except: all leading entries are 1; each leading 1 is the only nonzero entry in its column; every matrix is row equivalent to exactly one reduced echelon form

geometric description of Span {v}

Span{v} is the set of all scalar multiples of v, which (for v in R3) is the set of points on the line in R3 through v and 0

Transpose of a Matrix

Switch the rows and columns: the columns of A transpose are formed from the corresponding rows of A

Let T:Rn-->Rm be a linear transformation and A is standard matrix for T, then

T maps Rn onto Rm IFF the columns of A span Rm; T is one-to-one IFF the columns of A are linearly independent
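
A sketch of how to test both conditions with SymPy (the matrix A is my own example of a standard matrix for some T: R2 --> R3):

```python
from sympy import Matrix

A = Matrix([[1, 0],
            [0, 1],
            [2, 3]])   # standard matrix of a hypothetical T: R^2 -> R^3

m, n = A.shape
print("one-to-one:", len(A.nullspace()) == 0)  # columns linearly independent
print("onto R^m:  ", A.rank() == m)            # columns span R^m (pivot in every row)
```

Here the output is one-to-one: True and onto R^m: False, which matches the fact that a linear map out of R2 cannot cover all of R3.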

If T is a linear transformation, then

T(0) = 0 and T(cu + dv) = cT(u) + dT(v) for all vectors u, v in the domain of T and all scalars c, d.

Th10: Let T: Rn --> Rm be a linear transformation. Then there exists a unique matrix A such that

T(x) = Ax for all x in Rn. A is the m x n matrix whose jth column is the vector T(ej), where ej is the jth column of the identity matrix in Rn
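
A short sketch of building the standard matrix column by column; the rotation-by-90-degrees transformation used here is my own example, not one from these cards:

```python
import numpy as np

def T(x):
    """Rotate a vector in R^2 counterclockwise by 90 degrees (a linear transformation)."""
    return np.array([-x[1], x[0]])

# The jth column of the standard matrix A is T(ej), where ej is the jth
# column of the identity matrix.
e = np.eye(2, dtype=int)
A = np.column_stack([T(e[:, j]) for j in range(2)])
print(A)            # [[ 0 -1]
                    #  [ 1  0]]

x = np.array([3, 4])
print(T(x), A @ x)  # both give [-4  3], so T(x) = Ax
```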

Linear independence of matrix columns

The columns of a matrix A are linearly independent IFF the equation Ax = 0 has only the trivial solution.

standard matrix

The matrix A such that T(x) = Ax for all x in the domain of T.

Codomain of T

The set R^m

domain of T

The set R^n

if a set contains more vectors than there are entries in each vector

Then the set is linearly dependent (the number of vectors p is greater than the number of entries n in each vector).

augmented matrix

a coefficient matrix with an extra column containing the constant terms

linear dependence relation

a homogeneous vector equation where the weights are all specified and at least one weight is nonzero

Existence and Uniqueness Theorem

a linear system is consistent if and only if the rightmost column of the augmented matrix is NOT a pivot column - that is, if and only if an echelon form of the augmented matrix has NO row of the form [0 ... 0 b] with b nonzero. If a linear system is consistent, then the solution set contains either (i) a unique solution, when there are no free variables, or (ii) infinitely many solutions, when there is at least one free variable.

consistent linear system

a linear system with at least one solution

pivot position

a location in matrix A that corresponds to a leading 1 in the reduced echelon form of A

onto

a mapping T: R^n-->R^m is said to be onto R^m if each b in R^m is the image of at least one x in R^n

matrix transformation

a mapping x |-> Ax where A is an m x n matrix and x represents any vector in Rn.

coefficient matrix

a matrix that contains only the coefficients of a system of equations

column vector (vector)

a matrix with only one column

nontrivial solution

a nonzero vector x that satisfies Ax = 0; the homogeneous equation Ax = 0 has a nontrivial solution IFF it has at least one free variable

nonzero row or column

a row or column that contains at least one nonzero entry

asking whether vector b is in span

amounts to asking whether the vector equation x1v1 + x2v2 + ... + xpvp = b has a solution, which in turn asks whether the linear system with augmented matrix [v1 ... vp b] has a solution
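
A sketch of this check with SymPy (the vectors v1, v2, b are my own example): row reduce the augmented matrix and see whether the rightmost column is a pivot column.

```python
from sympy import Matrix

v1 = Matrix([1, 0, 2])
v2 = Matrix([0, 1, 1])
b  = Matrix([3, 4, 10])

augmented = Matrix.hstack(v1, v2, b)     # the augmented matrix [v1 v2 b]
R, pivots = augmented.rref()
print(R)                                 # reduced echelon form
# b is in Span{v1, v2} exactly when the rightmost column is NOT a pivot column.
print("b in span:", augmented.cols - 1 not in pivots)   # True, since b = 3*v1 + 4*v2
```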

linearly independent

an indexed set {v1, ..., vp} with the property that the vector equation x1v1 + x2v2 + ... + xpvp = 0 has only the trivial solution

free variable

any variable in a linear system that is not a basic variable

Ax=b has same soln set

as the vector equation x1a1 + ... + xnan = b, which has the same solution set as the linear system with augmented matrix [a1 ... an b]

A set of two vectors is linearly dependent if

at least one of the vectors is a multiple of the other

the equation Ax=b has solution IFF

b is a linear combination of the columns of A; the equation has a solution for every b in Rm IFF the columns of A span Rm, which holds IFF A has a pivot position in every row

a mapping T: Rn to Rm is said to be 1-1 if

each b in Rm is the image of at most one x in Rn.

linear equation in variables x1...xn

equation that can be written as a1x1+...+anxn = b

Is T 1-1? vs Does T map Rn onto Rm

uniqueness question vs. existence question

Geometric Description of R2

identify a geometric point (a, b) with column vector [a over b]. Regard R2 as the set of all points in the plane.

vectors in Rn

if n is a positive integer, Rn denotes the collection of all lists of n real numbers, usually written as n x 1 column matrices

Row-Vector Rule for Computing Ax

if Ax is defined, then the i-th entry of Ax is the sum of the products of corresponding entries from row i of A and from the vector x
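
The rule written out directly in Python (a sketch; the helper name ax_entry is mine), compared with A @ x:

```python
import numpy as np

def ax_entry(A, x, i):
    """The i-th entry of Ax: sum of products of row i of A with the entries of x."""
    return sum(A[i, k] * x[k] for k in range(len(x)))

A = np.array([[1, 2, 3],
              [4, 5, 6]])
x = np.array([7, 8, 9])
print([ax_entry(A, x, i) for i in range(A.shape[0])])  # the entries 50 and 122
print(A @ x)                                           # [ 50 122]
```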

row equivalent

if there is a sequence of elementary row operations that transforms one matrix into the other. If the augmented matrices of two linear systems are row equivalent, then the two systems have the same solution set

geometric description of span {u,v}

if v is not a multiple of u, then Span{u, v} is the plane in R3 that contains u, v, and 0.

for x in Rn, the vector T(x) in R^m is the

image of x

Every matrix transformation

is a linear transformation

Transformation (function or mapping) T from Rn to Rm

is a rule that assigns to each vector x in Rn, a vector T(x) in Rm.

if A is an m x n matrix with columns a1...an and if x is in Rn, then the product of A and x (Ax)

is linear combination of the columns of A using corresponding entries in x as weights.

leading entry

leftmost nonzero entry in a nonzero row

equivalent linear systems

linear systems that have the same solution set

size of matrix

m rows x n columns

AB has the same...

number of rows as A and same number of columns as B

T is onto Rm when the range

of T is all of the codomain Rm (T maps onto Rm if for each b in the codomain Rm there exists at least one solution of T(x) = b).

types of solutions in linear system

one solution (the lines intersect in a single point), infinitely many solutions (same line), no solution (parallel lines)

vector

ordered list of numbers

Set of all images T(x)

range of T

span {v1...vp}

set of all linear combinations of v1...vp; that is, the set of all vectors that can be written in the form c1v1 + c2v2 + ... + cpvp

solution set of linear system

set of all possible solutions

Parametric vector equation

suppose Ax = b is consistent for some b, and let p be a solution. Then the solution set of Ax = b is the set of all vectors of the form w = p + vh, where vh is any solution of the homogeneous equation Ax = 0
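
A sketch of this description with SymPy (the matrix A and vector b are my own example): find one particular solution p by row reduction, get vh from the null space, and check that p plus a multiple of vh still solves Ax = b.

```python
from sympy import Matrix

A = Matrix([[1, 2, 3],
            [0, 1, 4]])
b = Matrix([7, 8])

# Particular solution p: row reduce [A | b] and set the free variable x3 = 0.
R, pivots = Matrix.hstack(A, b).rref()
print(R)                    # Matrix([[1, 0, -5, -9], [0, 1, 4, 8]])
p = Matrix([-9, 8, 0])      # read off from R with x3 = 0

# A solution of the homogeneous equation Ax = 0.
vh = A.nullspace()[0]
print(vh.T)                 # Matrix([[5, -4, 1]])

# Every solution has the form w = p + t*vh; spot-check one of them.
w = p + 3 * vh
print(A * w == b)           # True
```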

Let T:Rn-->Rm be a linear transformation. Then T is one-to-one IFF

the equation T(x) = 0 has only trivial solution

Ax is only defined if

the number of columns in A equals the number of entries in x

linearly dependent set

the set {v1, ..., vp} with the property that the vector equation c1v1 + c2v2 + ... + cpvp = 0 has weights c1, ..., cp that are not all zero.

trivial solution

the solution x=0 of a homogeneous equation Ax=0

shear transformation

the transformation T: R^2 --> R^2 defined by T(x) = Ax, where A is the identity matrix with one off-diagonal entry replaced by a nonzero number k; it slides each point parallel to one axis by an amount proportional to the point's other coordinate

scalar multiple of u by c is

the vector cu obtained by multiplying each entry in u by c

zero vector

the vector whose entries are all zero

If a set contains the zero vector

then the set is linearly dependent

intro to linear transformation

think of the matrix A as an object that acts on a vector x by multiplication to produce a new vector Ax; the correspondence from x to Ax is a function from one set of vectors to another

basic variables

variables corresponding to pivot columns in a matrix

vector equation and matrix

the vector equation x1a1 + ... + xnan = b has the same solution set as the linear system whose augmented matrix is [a1 a2 ... an b]

linear combination and matrix

the vectors a1, a2, and b are the columns of the augmented matrix. To determine whether b can be written as a linear combination of a1 and a2, determine whether weights x1, x2 exist; this can be done using row reduction on the augmented matrix that corresponds to the vector equation

contraction

when T: R2->R2 by T(x) = rx, and 0 <= r <= 1

dilation

when T: R2->R2 by T(x)=rx, and r is greater than 1

when is a mapping T not 1-1

when some b in Rm is image of more than one vector in Rn

when does T not map Rn onto Rm

when there is some b in Rm for which the equation T(x) = b has no solution

parametric description of a solution set

write basic variables in terms of free variables. free variables act as parameters

vector equation

x1a1 + x2a2 + ... + xnan = b

