Prelim 2


When A is invertible, what is the determinant?

det A = (-1)^r * (product of pivots in U), where U is an echelon form of A obtained by row replacements and r row interchanges.
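
A minimal numerical sketch of this formula (my own illustration; the matrix A and the partial-pivoting strategy are arbitrary choices, not from the card): reduce A to echelon form while counting row interchanges, then compare against numpy's determinant.

```python
import numpy as np

def det_via_pivots(A):
    """Row reduce A to echelon form U, tracking the number r of
    row interchanges; return (-1)**r times the product of the pivots."""
    U = np.array(A, dtype=float)
    n = U.shape[0]
    r = 0                                    # number of row interchanges
    for k in range(n):
        p = k + np.argmax(np.abs(U[k:, k]))  # partial pivoting
        if np.isclose(U[p, k], 0.0):
            return 0.0                       # no pivot in this column
        if p != k:
            U[[k, p]] = U[[p, k]]            # interchange flips the sign
            r += 1
        for i in range(k + 1, n):            # row replacements: no change
            U[i, k:] -= (U[i, k] / U[k, k]) * U[k, k:]
    return (-1) ** r * np.prod(np.diag(U))

A = [[2., 1., 1.], [4., -6., 0.], [-2., 7., 2.]]
print(det_via_pivots(A), np.linalg.det(np.array(A)))   # both -16.0
```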

Show that if A is invertible, then det A^-1 = 1/det A.

Since A A^-1 = I, (det A)(det A^-1) = det(A A^-1) = det I = 1. Therefore det A^-1 = 1/det A.

Suppose that A is a square matrix with det A^4 = 0. Explain why A cannot be invertible.

det(A^4) = (det A)^4, so (det A)^4 = 0 and det A must be 0. If det A = 0, then A is not invertible.

Let A and P be square matrices, with P invertible. Show that det(PAP^-1)=detA

det(PAP^-1) = (det P)(det A)(det P^-1) = (det P)(det P^-1)(det A) = (det P)(1/det P)(det A) = det A
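
A quick numerical sanity check of this identity (an illustration I'm adding; the random 4x4 matrices are arbitrary, and P is almost surely invertible):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))
P = rng.standard_normal((4, 4))           # almost surely invertible
lhs = np.linalg.det(P @ A @ np.linalg.inv(P))
print(np.isclose(lhs, np.linalg.det(A)))  # True: det(PAP^-1) = det A
```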

Dimension of V

(dim V) The number of vectors in a basis for V. (Remember that a basis is both a largest possible linearly independent set and a smallest possible spanning set, so by the Spanning Set Theorem you delete dependent vectors from a spanning set accordingly.)

Show that {f in C[a,b]: f(a) = f(b)} is a subspace of C[a,b]

1. Let g(t) = 0 for all t. Then g(a) = g(b) = 0, so g is in H. 2. Let g and h be in H. Then g(a) = g(b) and h(a) = h(b), and (g + h)(a) = g(a) + h(a) = g(b) + h(b) = (g + h)(b), so g + h is in H. 3. Let g be in H and c be a scalar. Then g(a) = g(b) and (cg)(a) = c·g(a) = c·g(b) = (cg)(b), so cg is in H.

What facts about continuous functions should be proved in order to demonstrate that C[a,b] (the set of all continuous real-valued functions defined on a closed interval [a,b] in R) is indeed a subspace?

1. The constant function f(t) = 0 is continuous. 2. The sum of two continuous functions is continuous. 3. A constant multiple of a continuous function is continuous.

Two views of a basis

A basis is a spanning set that is as small as possible: if any vector is deleted, the set will no longer span. It is also a linearly independent set that is as large as possible: if it is enlarged by even one vector, it can no longer be linearly independent.

What is a linear transformation?

A linear transformation T from a vector space V into a vector space W is a rule that assigns to each vector x in V a unique vector T(x) in W, such that: (i) T(u + v) = T(u) + T(v) (ii) T(cu) = cT(u)

What is a coordinate system on a set?

A one-to-one mapping of the points in the set into R^n; the coordinates give the location of x relative to the chosen basis.

Eigenvalue

A scalar λ is called an eigenvalue of A if there is a nontrivial solution x of Ax = λx

Subspace

A subspace of a vector space V is a subset H of V that has three properties: a. The zero vector of V is in H. b. H is closed under vector addition. That is, for each u and v in H, the sum u + v is in H. c. H is closed under multiplication by scalars. That is, for each u in H and each scalar c, the vector cu is in H.

What is a vector space? What are the 10 axioms that hold for all vectors.

A vector space is a nonempty set V of objects, called vectors, on which are defined two operations, called addition and multiplication by scalars, subject to the ten axioms listed below.
1. The sum of u and v, denoted by u + v, is in V.
2. u + v = v + u
3. (u + v) + w = u + (v + w)
4. There is a zero vector 0 in V such that u + 0 = u.
5. For each u in V, there is a vector -u in V such that u + (-u) = 0.
6. The scalar multiple of u by c, denoted by cu, is in V.
7. c(u + v) = cu + cv
8. (c + d)u = cu + du
9. c(du) = (cd)u
10. 1u = u

Eigenvector

An eigenvector of an nxn matrix A is a nonzero vector x such that Ax=λx for some scalar λ

The Diagonalization Theorem

An nxn matrix A is diagonalizable if and only if A has n linearly independent eigenvectors. In fact, A = PDP^-1, with D a diagonal matrix, if and only if the columns of P are n linearly independent eigenvectors of A. In this case, the diagonal entries of D are eigenvalues of A that correspond, respectively, to the eigenvectors in P.
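
A small sketch of the factorization (my own 2x2 example; numpy's eig happens to return eigenvectors as the columns of P, matching the P in the theorem):

```python
import numpy as np

A = np.array([[7., 2.],
              [-4., 1.]])                 # eigenvalues 3 and 5
lam, P = np.linalg.eig(A)                 # columns of P: eigenvectors
D = np.diag(lam)                          # diagonal of D: eigenvalues
print(np.allclose(A, P @ D @ np.linalg.inv(P)))   # True: A = PDP^-1
```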

What kind of matrix is diagonalizable?

An nxn matrix with n distinct eigenvalues is diagonalizable; that is, it is similar to a diagonal matrix.

How do you find a nonzero vector in Col A? In Nul A?

Any column of A will do for a vector in Col A. To find a nonzero vector in Nul A, row reduce [A 0]. Assigning a nonzero value to a free variable will give you a nonzero vector.

Let u and v be vectors in a vector space V, and let H be any subspace of V that contains both u and v. Explain why H also contains Span{u, v}. This shows that Span{u, v} is the smallest subspace of V that contains both u and v.

Any subspace that contains u and v must also contain all scalar multiples of u and v, and hence must also contain all sums of scalar multiples of u and v. Thus H must contain all linear combinations of u and v, that is, Span{u, v}.

Linear combination

Any sum of scalar multiples of vectors

Let B = {b1,....,bn} be a basis for a vector space V. Explain why the B-coordinate vectors of b1,...,bn are the columns e1,...,en of the nxn identity matrix.

The B-coordinate vector of x (the scalars of the linear combination that creates x) is uniquely determined. For each k, bk = 0·b1 + ... + 1·bk + ... + 0·bn, SO [bk]B = (0,...,1,...,0) = ek, with the 1 in the kth entry.

Find a formula for det(rA) when A is an nxn matrix.

By factoring an r out of each of the n rows, det(rA) = r^n · det A.
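
A one-line numerical check (my own illustration; the random 3x3 matrix and the scalar r = 2 are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))             # n = 3
r = 2.0
print(np.isclose(np.linalg.det(r * A),
                 r**3 * np.linalg.det(A)))  # True: det(rA) = r^n det A
```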

Consider the polynomials p1(t) = 1 + t, p2(t) = 1 - t, and p3(t) = 2 (for all t). By inspection, write a linear dependence relation among p1, p2, and p3. Then find a basis for Span {p1,p2,p3}

By inspection, p3 = p1 + p2. By the Spanning Set Theorem, Span{p1, p2, p3} = Span{p1, p2}. Since neither p1 nor p2 is a multiple of the other, they are linearly independent, and hence {p1, p2} is a basis for Span{p1, p2, p3}.

What is Col A in relation to linear transformations?

Col A is the range of the linear transformation from x to Ax.

Why is the operation of differentiation a linear transformation?

D: V -> W is the transformation that changes f in V into its derivative f'. Two simple differentiation rules from calculus give linearity: 1. D(f + g) = D(f) + D(g) 2. D(cf) = cD(f). The kernel of D is the set of constant functions on [a,b]. The range of D is the set W of all continuous functions on [a,b].

Adjugate of A

Denoted by adj A; means the transpose of the matrix of cofactors.

Way to prove H = Span{v1,...,vp} is a subspace of V

Example: v1 and v2 are in a vector space V, and H = Span{v1, v2}. Take u = s1v1 + s2v2 and w = t1v1 + t2v2 in H. The zero vector is in H because 0 = 0v1 + 0v2. u + w = (s1v1 + s2v2) + (t1v1 + t2v2) = (s1 + t1)v1 + (s2 + t2)v2, which is in H. cu = c(s1v1 + s2v2) = (cs1)v1 + (cs2)v2, which is also in H.

If you ask whether or not a vector is in Col A,

First reduce [A v] to echelon form. If the equation Ax = v is consistent, v is in Col A.

Show that the coordinate mapping is onto R^n. That is, given any y in R^n, with entries y1,...,yn, produce u in V such that [u]B = y.

Given y = (y1,...,yn) in R^n, let u = y1b1 + ... + ynbn. By definition, [u]B = y. Since y was arbitrary, the coordinate mapping is onto R^n.

What does the B-coordinate vector of x tell us/ B-coordinates of x?

How to build x from the vectors in B; it's like a road map! Consider a basis B = {b1, b2} for R^2 where b1 = [1, 0] and b2 = [1, 2]. Suppose an x in R^2 has the coordinate vector [x]B = [-2, 3]. Find x. x = (-2)b1 + 3b2 = -2[1, 0] + 3[1, 2] = [1, 6]
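
The same arithmetic done with the change-of-coordinates matrix P_B = [b1 b2] (a sketch I'm adding; see the change-of-coordinates card later in this set):

```python
import numpy as np

P_B = np.column_stack([[1., 0.], [1., 2.]])   # columns are b1 and b2
x_B = np.array([-2., 3.])                     # [x]_B
print(P_B @ x_B)                              # x = [1. 6.]
```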

Use the concept of volume to explain why the determinant of a 3x3 matrix A is zero if and only if A is not invertible.

By the IMT, a 3x3 matrix is not invertible if and only if its columns are linearly dependent. This happens if and only if one of the columns is a linear combination of the others; that is, if one of the vectors is in the plane spanned by the other two vectors. This is equivalent to the condition that the parallelepiped determined by the three vectors has zero volume, which is in turn equivalent to the condition that det A = 0.

Multiplicative property of determinants

If A and B are nxn matrices, then det AB = (det A)(det B)

Area of parallelogram

If A is a 2x2 matrix, the area of the parallelogram determined by the columns of A is |detA|

Volume of parallelepiped

If A is a 3x3 matrix, the volume of the parallelepiped determined by the columns of A is |detA|

Triangular Matrices and Determinants

If A is a triangular matrix, then det A is the product of the entries on the main diagonal.

Relationship between basis and invertibility

If A is an invertible n x n matrix, then the columns of A form a basis for R^n because they are linearly independent and span R^n, by the Invertible Matrix Theorem. (To show that A is invertible, you can do things like check whether there are n pivot positions.)

Relationship between transposes and determinants

If A is an nxn matrix, then det A^T = det A.

Linear transformations of parallelepiped

If T is determined by a 3x3 matrix A, and if S is a parallelepiped in R^3, then {volume of T(S)} = |det A| * {volume of S}

Infinite-dimensional

If V is not spanned by a finite set

Finite-dimensional

If V is spanned by a finite set, then V is finite-dimensional

Rank Theorem

If a matrix A has n columns, then rank A + dim Nul A = n

Linear dependence and number of vectors

If a vector space V has a basis B = {b1,...,bn}, then any set in V containing more than n vectors must be linearly dependent. In other words, a set cannot contain more vectors than a basis does and still be linearly independent. Implication: if a vector space has a basis B = {b1,...,bn}, then each linearly independent set in V has no more than n vectors.

Basis of n vectors

If a vector space V has a basis of n vectors, then every basis of V must consist of exactly n vectors

When is an indexed set {v1...vp} of two or more vectors (with v1 not equal to 0) linearly dependent?

If and only if some vj (with j > 1) is a linear combination of the preceding vectors, v1...vj-1

When is Nul A = {0}? (2 conditions)

If and only if the equation Ax = 0 has only the trivial solution. Nul A = {0} also holds if and only if the linear transformation x -> Ax is one-to-one.

When do the columns of A span R^m?

If and only if the equation Ax=b has a solution for each b.

When is Col A = R^m? (2 conditions)

If and only if the equation Ax=b has a solution for every b in R^m AND if and only if the linear transformation maps R^n ONTO R^m.

Let H be an n-dimensional subspace of an n-dimensional vector space V. Show that H = V.

If dim V = dim H = 0, then V = {0} and H = {0}, so H = V. If dim V = dim H > 0, then H contains a basis S consisting of n vectors. S is then also a basis for V, by the Basis Theorem, so H = V = Span S.

Similar matrices

If nxn matrices A and B are similar, then they have the same characteristic polynomial and hence the same eigenvalues (with the same multiplicities)

Row spaces and matrices

If two matrices A and B are row equivalent, then their row spaces are the same. If B is in echelon form, the nonzero rows of B form a basis for the row space of A as well as for that of B.

Prove that Nul A is a subspace of R^n.

The zero vector is in Nul A since A0 = 0. If u and v represent any two vectors in Nul A, then Au = 0 and Av = 0, so A(u + v) = Au + Av = 0 + 0 = 0 and A(cu) = c(Au) = c(0) = 0. Thus Nul A is closed under addition and scalar multiplication.

Linear independence and eigenvectors

If v1,...,vr are eigenvectors that correspond to distinct eigenvalues λ1,....,λr of an nxn matrix A, then the set {v1,....,vr} is linearly independent

When is a subspace spanned by {v1...vp}?

If v1,...,vp are in a vector space V, then Span{v1,...,vp} is a subspace of V. If you're given a set in a vector space -> show its elements are exactly the linear combinations of some vectors v1,...,vp -> the set is therefore a span -> according to Theorem 1, it is a subspace of V.

What can you determine if a square matrix's determinant is not equal to 0?

It is invertible. (A square matrix A is invertible if and only if det A not equal to 0)

Suppose R^4 = Span{v1...v4}. Explain why {v1...v4} is a basis for R^4.

Let A = [v1 v2 v3 v4]. Then A is square and its columns span R^4 since R^4 = Span {v1, v2, v3, v4}. So its columns are linearly independent by the Invertible Matrix Theorem and {v1, v2, v3, v4} is a basis for R^4.

Properties of Determinants

Let A and B be nxn matrices.
a. A is invertible if and only if det A is not equal to 0.
b. det AB = (det A)(det B)
c. det A^T = det A
d. If A is triangular, then det A is the product of the entries on the main diagonal of A.
e. A row replacement operation on A does not change the determinant. A row interchange changes the sign of the determinant. A row scaling also scales the determinant by the same scalar factor.

Row Operations on Determinants (Part 1)

Let A be a square matrix. a. If a multiple of one row of A is added to another row to produce a matrix B then det B = det A.

Row Operations on Determinants (Part 2)

Let A be a square matrix. b. If two rows of A are interchanged to produce B, then det B = -detA.

Row Operations on Determinants (Part 3)

Let A be a square matrix. c. If one row of A is multiplied by k to produce B, then det B = k * det A.

Cramer's Rule

Let A be an invertible nxn matrix. For any b in R^n, the unique solution x of Ax = b has entries given by xi = det Ai(b) / det A, i = 1, 2, ..., n, where Ai(b) is the matrix obtained from A by replacing column i with b.
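
A short sketch of the rule in code (my own example system; a direct implementation of xi = det Ai(b)/det A, not an efficient solver):

```python
import numpy as np

def cramer_solve(A, b):
    """Solve Ax = b via Cramer's rule: x_i = det(A_i(b)) / det(A),
    where A_i(b) is A with column i replaced by b."""
    A = np.asarray(A, dtype=float)
    b = np.asarray(b, dtype=float)
    d = np.linalg.det(A)
    x = np.empty(len(b))
    for i in range(len(b)):
        Ai = A.copy()
        Ai[:, i] = b                 # replace column i by b
        x[i] = np.linalg.det(Ai) / d
    return x

A = [[3., -2.], [-5., 4.]]
b = [6., 8.]
print(cramer_solve(A, b))            # [20. 27.]
```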

An inverse formula using determinants

Let A be an invertible nxn matrix. Then A^-1 = (1/detA)(adjA)
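
A sketch of adj A built directly from cofactors (my own helper; in the 2x2 example det A = 1, so adj A equals A^-1 here):

```python
import numpy as np

def adjugate(A):
    """Transpose of the matrix of cofactors: adj(A) = C^T,
    where C[i, j] = (-1)**(i+j) * det(minor_ij)."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    C = np.empty_like(A)
    for i in range(n):
        for j in range(n):
            minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
            C[i, j] = (-1) ** (i + j) * np.linalg.det(minor)
    return C.T

A = np.array([[2., 1.], [5., 3.]])
A_inv = adjugate(A) / np.linalg.det(A)        # (1/det A) adj A
print(np.allclose(A_inv, np.linalg.inv(A)))   # True
```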

Matrices whose eigenvalues aren't distinct

Let A be an nxn matrix whose distinct eigenvalues are λ1,...,λp. a. For 1 <= k <= p, the dimension of the eigenspace for λk is less than or equal to the multiplicity of the eigenvalue λk. b. The matrix A is diagonalizable if and only if the sum of the dimensions of the eigenspaces equals n, and this happens if and only if (i) the characteristic polynomial factors completely into linear factors and (ii) the dimension of the eigenspace for each λk equals the multiplicity of λk. c. If A is diagonalizable and Bk is a basis for the eigenspace corresponding to λk for each k, then the total collection of vectors in the sets B1,...,Bp forms an eigenvector basis for R^n.

Invertible Matrix Theorem (continuation about eigenvalues and determinants)

Let A be an nxn matrix. Then A is invertible if and only if: s. The number 0 is not an eigenvalue of A. t. The determinant of A is not zero.

Let S = {v1...vk} be a set of k vectors in R^n with k < n. Use a theorem from Section 1.4 to explain why S cannot be a basis for R^n.

Let A be the n x k matrix [v1 ... vk]. Since A has fewer columns than rows, there cannot be a pivot position in each row of A. By Theorem 4 in Section 1.4, the columns of A do not span R^n and thus are not a basis for R^n.

Let S = {v1....vk} be a set of vectors in R^n with k>n. Use a theorem from Chapter 1 to explain why S cannot be a basis for R^n.

Let A be the n x k matrix [v1 ... vk]. Since A has fewer rows than columns, there cannot be a pivot position in each column of A. By Theorem 8 in Section 1.7, the columns are linearly dependent and thus aren't a basis for R^n.

The Unique Representation Theorem

Let B = {b1,...,bn} be a basis for a vector space V. Then for each x in V, there exists a unique set of scalars c1,...,cn such that x = c1b1 + ... + cnbn

Coordinate mapping

Let B = {b1,...,bn} be a basis for a vector space V. Then the coordinate mapping x -> [x]B is a one-to-one linear transformation from V onto R^n.

Subspaces of a Finite-Dimensional Space

Let H be a subspace of a finite-dimensional vector space V. Any linearly independent set in H can be expanded, if necessary, to a basis for H. Also, H is finite-dimensional and dim H <= dim V.

What is a basis?

Let H be a subspace of a vector space V. An indexed set of vectors B = {b1,...,bp} in V is a basis for H if (i) B is a linearly independent set, and (ii) the subspace spanned by B coincides with H; that is, H = Span{b1,...,bp}. "An efficient spanning set that contains no unnecessary vectors; constructed from a spanning set by discarding unneeded vectors."

Spanning Set Theorem

Let S = {v1...vp} be a set in V, and let H= Span{v1...vp} a. If one of the vectors in S - say, vk - is a linear combination of the remaining vectors in S, then the set formed from S by removing vk still spans H. b. If H is not equal to {0}, some subset of S is a basis for H. Explanation for b: If the original spanning set is linearly independent, then it is already a basis for H. So long as there are two or more vectors in the spanning set, we can delete any dependent vectors until the spanning set is eventually reduced to a basis for H. If it is reduced to one vector, the vector will be nonzero and thus linearly independent.

Linear Transformations of parallelogram

Let T: R^2 -> R^2 be the linear transformation determined by a 2x2 matrix A. If S is a parallelogram in R^2, then {area of T(S)} = |detA| * {area of S}

The Basis Theorem

Let V be a p-dimensional vector space, p >= 1. Any linearly independent set of exactly p elements in V is automatically a basis for V. Any set of exactly p elements that spans V is automatically a basis for V. Aka: if a set has exactly the right number of elements, one only has to show either that the set is linearly independent or that it spans the space.

Area of parallelogram with scalars

Let a1 and a2 be nonzero vectors. Then for any scalar c, the area of the parallelogram determined by a1 and a2 equals the area of the parallelogram determined by a1 and a2 + ca1.

Let A and B be square matrices. Show that even though AB and BA may not be equal, it is always true that detAB = detBA.

By the multiplicative property: det AB = (det A)(det B) = (det B)(det A) = det BA

If you're given a matrix A and a vector u, determine if u belongs to the null space.

Multiply A*u. If you get the zero vector, u is in Nul A.

How to tell if a given vector is an eigenvector of a matrix

Multiply the given matrix by the vector -> if the result is a multiple of the vector, it is an eigenvector corresponding to the eigenvalue (the scalar it is multiplied by).
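
A tiny sketch of the check (my own example; assumes the first entry of u is nonzero so the ratio recovers the candidate eigenvalue):

```python
import numpy as np

A = np.array([[1., 6.],
              [5., 2.]])
u = np.array([6., -5.])
Au = A @ u                             # [-24., 20.]
lam = Au[0] / u[0]                     # candidate eigenvalue (u[0] != 0)
print(np.allclose(Au, lam * u), lam)   # True -4.0: u is an eigenvector
```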

Is det(A+B) = det A + det B?

NO. For example, with 2x2 matrices A = I and B = -I, det(A + B) = det 0 = 0, but det A + det B = 1 + 1 = 2.

Consider the polynomials p1(t) = 1 + t^2 and p2(t) = 1 - t^2. Is {p1, p2} a linearly independent set in P3? Why or why not?

Neither polynomial is a multiple of the other. So {p1, p2} is a linearly independent set in P3. (It is also a linearly independent set in P2 since p1 and p2 both happen to be in P2)

Is a line through the origin a subspace of R^2? Is a plane in R^3 not through the origin a subspace of R^3?

A line through the origin is a subspace of R^2: it contains the zero vector and is closed under addition and scalar multiplication. A plane in R^3 not through the origin is not a subspace of R^3, because it does not contain the zero vector.

Is the given set a subspace of Pn? All polynomials of the form p(t) = a+t^2, where a is in R.

No. The zero vector is not in the set.

Is the given set a subspace of Pn? All polynomials of degree at most 3, with integers as coefficients.

No. The set is not closed under multiplication by scalars which are not integers. (Also, the zero vector is not necessarily included)

Dimension of Col A

Number of pivot columns in A

Find a formula for the area of the triangle whose vertices are 0, v1, v2 in R^2.

One half the area of the parallelogram determined by v1 and v2: (1/2)|det A|, where A = [v1 v2].

Isomorphism

A one-to-one linear transformation from a vector space V onto a vector space W is called an isomorphism from V onto W. Though the notation for V and W may differ, the two spaces are indistinguishable as vector spaces; each vector space calculation in V is accurately reproduced in W, and vice versa.

Show that a subset {u1,...,up} in V is linearly independent if and only if the set of coordinate vectors {[u1]b,...,[up]b} is linearly independent in R^n.

The coordinate mapping is one-to-one, so these two equations have the same solutions: c1u1 + ... + cpup = 0 (the zero vector in V) (4) and [c1u1 + ... + cpup]B = [0]B (the zero vector in R^n) (5). Since the mapping is linear, (5) is equivalent to c1[u1]B + ... + cp[up]B = 0 (6). Thus (4) has only the trivial solution if and only if (6) has only the trivial solution, so {u1,...,up} is linearly independent if and only if {[u1]B,...,[up]B} is linearly independent.

Facts about coordinate mapping

PB (the change-of-coordinates matrix from B to the standard basis in R^n) is an invertible matrix, so the coordinate mapping x -> [x]B = PB^-1 x is a one-to-one linear transformation from R^n onto R^n, by the Invertible Matrix Theorem.

Relationship between row space and column space

Row A = Col (A^T)

Let S be a subset of n-dimensional vector space V, and suppose S contains fewer than n vectors. Explain why S cannot span V.

Since dim V = n >= 1, V is not the zero space, and every basis of V has exactly n vectors. If S spanned V, then by the Spanning Set Theorem some subset of S would be a basis for V, and that basis would contain fewer than n vectors. That is impossible because dim V = n. Thus S cannot span V.

Characteristic equation

The scalar equation det(A - λI) = 0; a scalar λ is an eigenvalue of an nxn matrix A if and only if λ satisfies the characteristic equation.
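
A short symbolic sketch (my own example matrix, using sympy to form and solve det(A - λI) = 0):

```python
from sympy import Matrix, eye, symbols, solve

lam = symbols('lam')
A = Matrix([[2, 3],
            [3, -6]])
p = (A - lam * eye(2)).det()    # characteristic polynomial
print(p.factor())               # (lam - 3)*(lam + 7)
print(solve(p, lam))            # eigenvalues: [-7, 3]
```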

Determine if the set H of all 2x2 matrices of the form [a b; 0 d] (upper triangular) is a subspace of M2x2.

Set H is a subspace. Zero matrix is in H. Sum of two upper triangular matrices is upper triangular. Scalar multiple of an upper triangular matrix is upper triangular.

Zero subspace

Set consisting of only the zero vector in a vector space; subspace of V

How to explicitly define Nul A

Solving the equation Ax = 0 amounts to producing an explicit description of Nul A. If you're told to find a spanning set for the null space:
1. Find the general solution of Ax = 0 in terms of the free variables.
2. Reduce the matrix to reduced echelon form and write the basic variables in terms of the free variables.
3. Decompose the vector giving the general solution into a linear combination of vectors in which the free variables are the weights (e.g., the term x2·v makes x2 the weight on the vector v).
Every linear combination of those vectors is an element of Nul A. Thus those vectors form a spanning set for Nul A.
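
A sketch of the procedure with sympy (the matrix is a textbook-style example I'm supplying; nullspace() returns one spanning vector per free variable, exactly as the steps above produce):

```python
from sympy import Matrix

A = Matrix([[-3,  6, -1,  1, -7],
            [ 1, -2,  2,  3, -1],
            [ 2, -4,  5,  8, -4]])
# Free variables here are x2, x4, x5, so the spanning set has 3 vectors.
for v in A.nullspace():
    print(v.T)
```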

Range of T

Span of the columns of A

The coordinates of x relative to the basis B (the B-coordinates of x)

Suppose B = {b1,...,bn} is a basis for V and x is in V. The coordinates of x relative to the basis B (or the B-coordinates of x) are the weights c1,..,cn such that x = c1b1 + ... + cnbn.

Explain why the space P of all polynomials is an infinite-dimensional space.

Suppose that dim P = k < infinity. Now Pk-1 (the polynomials of degree at most k-1) is a subspace of P, and dim Pk-1 = k, so dim Pk-1 = dim P. Pk-1 would then have to equal P, but that's untrue because p(t) = t^k is in P but not in Pk-1. So P cannot be finite-dimensional.

Suppose that T is a one-to-one transformation, so that an equation T(u) = T(v) always implies u = v. Show that if the set of images {T(v1),...,T(vp)} is linearly dependent, then {v1,...,vp} is linearly dependent. This fact shows that a one-to-one linear transformation maps a linearly independent set onto a linearly independent set (because in this case the set of images can't be linearly dependent).

Suppose that {T(v1),...,T(vp)} is linearly dependent. Then there exist scalars c1,...,cp, not all zero, with c1T(v1) + ... + cpT(vp) = 0. Since T is linear, T(c1v1 + ... + cpvp) = c1T(v1) + ... + cpT(vp) = 0 = T(0). Since T is one-to-one, this implies c1v1 + ... + cpvp = 0. Since not all the ci are zero, {v1,...,vp} is linearly dependent.

Show that if {v1...vp} is linearly dependent in V, then the set of images, {T(v1)...T(vp)} is linearly dependent in W. This fact shows that if a linear transformation maps a set {v1...vp} onto a linearly independent set, then the original set is linearly independent, too (because it cannot be linearly dependent) #31 (4.3)

Suppose that {v1,...,vp} is linearly dependent. Then there exist scalars c1,...,cp, not all zero, with c1v1 + ... + cpvp = 0. Since T is linear, T(c1v1 + ... + cpvp) = c1T(v1) + ... + cpT(vp), and T(c1v1 + ... + cpvp) = T(0) = 0. Thus c1T(v1) + ... + cpT(vp) = 0, and since not all the ci are zero, {T(v1),...,T(vp)} is linearly dependent.

When the basis for R^n is fixed, how do you find the B-coordinate vector of a specified x?

The B-coordinates c1, c2 of x satisfy the equation c1b1 + c2b2 = x, i.e., [b1 b2][c1; c2] = x. Either invert the matrix [b1 b2] and multiply it on the left of both sides, or row reduce the augmented matrix [b1 b2 | x]; either way you obtain the coordinate vector [x]B = (c1, c2).
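
A small sketch of both routes (my own basis and x; np.linalg.solve plays the role of the row reduction):

```python
import numpy as np

b1, b2 = np.array([2., 1.]), np.array([-1., 1.])
x = np.array([4., 5.])
P = np.column_stack([b1, b2])       # the matrix [b1 b2]
x_B = np.linalg.solve(P, x)         # same answer as row reducing [b1 b2 | x]
print(x_B)                          # [3. 2.]
print(np.linalg.inv(P) @ x)         # inverse route: also [3. 2.]
```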

What is the column space a subspace of?

The column space of an m x n matrix A is a subspace of R^m.

Column space

The column space of an m x n matrix A, written as Col A, is the set of all linear combinations of the columns of A. If A = [a1...an], then Col A = Span {a1...an}

When is the column space of an mxn matrix A all of R^m?

The column space of an mxn matrix A is all of R^m if and only if the equation Ax=b has a solution for each b in R^m.

Rank A

The dimension of the column space of A

Main diagonal and eigenvalues

The eigenvalues of a triangular matrix are the entries on its main diagonal

What is the kernel?

The kernel is also known as the null space. It is the set of all u in V such that T(u) = 0 (the vectors that the transformation sends to 0).

Null space

The null space of an m x n matrix A, written as Nul A, is the set of all solutions of the homogeneous equation Ax = 0. In set notation, Nul A = {x : x is in R^n and Ax = 0}

Dimension of NulA

The number of free variables in the equation Ax = 0

How many vectors are there in the spanning set for Nul A?

The number of free variables in the equation Ax = 0.

Multiplicity

The number of times the corresponding factor (λ - λk) appears in the characteristic polynomial; the (algebraic) multiplicity of the eigenvalue λk.

Finding a basis for a column space:

Row reduce A to ECHELON form to locate the pivot columns; the pivot columns of the ORIGINAL matrix A form a basis for Col A.
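
A sketch with sympy (my own matrix; rref() reports the pivot column indices, and the basis columns are then taken from the original A, not from the echelon form):

```python
from sympy import Matrix

A = Matrix([[ 1,  3,  3,  2],
            [ 2,  6,  9,  7],
            [-1, -3,  3,  4]])
_, pivots = A.rref()                 # pivot column indices: (0, 2)
basis = [A[:, j] for j in pivots]    # columns of the ORIGINAL A
print(pivots, basis)
```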

What is the range?

The set of all vectors in W of the form T(x) for some x in V.

Span{v1...vp}

The set of all vectors that can be written as linear combinations of v1,...,vp

Let U be a square matrix such that U^T U = I. Show that det U = +/-1.

1 = det I = det(U^T U) = (det U^T)(det U) = (det U)(det U), since det U^T = det U. Thus (det U)^2 = 1, so det U = +/-1.

Let S be a finite set in a vector space V with the property that every x in V has a unique representation as a linear combination of elements of S. Show that S is a basis of V.

Two things prove something is a basis: 1. linear independence 2. spanning. The set S spans V because every x in V has a representation as a linear combination of elements of S. To show linear independence: suppose S = {v1,...,vn} and c1v1 + ... + cnvn = 0 for some scalars c1,...,cn. Taking c1 = ... = cn = 0 certainly gives the zero vector, and by hypothesis the representation of every vector, including 0, is unique. So the trivial solution is the ONLY possible representation of the zero vector as a linear combination of the elements of S (linear independence means only the trivial solution). Hence S is linearly independent and thus a basis for V.

Difference between linear dependence in R^n and in a general vector space

When vectors are not n-tuples, the homogeneous equation can usually not be written as a system of n linear equations; vectors can't be made into columns of a matrix A in order to study the equation Ax = 0

Is the given set a subspace of Pn? All polynomials of the form p(t) = at^2.

Yes. The set is Span{t^2}, so the set is a subspace by Theorem 1.

Is the given set a subspace of Pn? All polynomials in Pn such that p(0) = 0.

Yes. The zero vector is in the set. If p and q are in H, then (p + q)(0) = p(0) + q(0) = 0 and (cp)(0) = c·p(0) = c·0 = 0.

Is the set {sint, cost} linearly independent in C[0,1]?

Yes. sin t and cos t are not multiples of one another as vectors in C[0,1]; there is no scalar c such that cos t = c·sin t for all t in [0,1].

Is the determinant function linear?

Yes. If all columns except one are held fixed, then det A is a linear function of that one variable.

Is the null space a vector space?

Yes. The null space of an mxn matrix A is a subspace of R^n. Equivalently, the set of all solutions to a system Ax = 0 of m homogeneous linear equations in n unknowns is a subspace of R^n.

Is the spanning set produced from the explicit definition of Nul A linearly independent?

Yes. The spanning set is automatically linearly independent because the free variables are the weights on the spanning vectors.

Dimension of zero vector space {0}

Zero

Show that the coordinate mapping is one-to-one.

Suppose [u]B = [w]B = (c1,...,cn). Then u = c1b1 + ... + cnbn = w, so vectors with equal coordinate vectors are equal; since u and w were arbitrary elements of V, the coordinate mapping is one-to-one. (B being a basis means the representation by scalars is unique, and that uniqueness is what makes the mapping one-to-one.)

The coordinate vector of x (relative to B) or the B coordinate vector of x

[x]B = (c1, ..., cn), written as a column vector.

What can you determine if the determinant is NOT 0?

a. The matrix is invertible. b. The columns form a linearly independent set. c. Unique solution

If coordinate mapping is one-to-one, what two equations must have the same solutions in c1,..., cp?

c1u1 + ... + cpup = 0 (the zero vector in V) and [c1u1 + ... + cpup]B = [0]B (the zero vector in R^n). One-to-one means the two equations have exactly the same solutions when mapping from V to R^n.

Suppose {v1,...,v4} is a linearly dependent spanning set for a vector space V. Show that each w in V can be expressed in more than one way as a linear combination of v1,...,v4.

w = k1v1 + k2v2 + k3v3 + k4v4 for some scalars ki, because {v1, v2, v3, v4} spans V. Since the set is linearly dependent, there exist c1,...,c4, NOT all zero, such that 0 = c1v1 + c2v2 + c3v3 + c4v4. Then w = w + 0 = (k1 + c1)v1 + (k2 + c2)v2 + (k3 + c3)v3 + (k4 + c4)v4. Because some ci is nonzero, at least one of the new weights differs from the corresponding ki, so w is expressed in more than one way as a linear combination of v1, v2, v3, v4.

Given vectors u1,...,up, and w in V, show that w is a linear combination of u1,...,up if and only if [w]b is a linear combination of the coordinate vectors [u1]b,...,[up]b

w is a linear combination of u1,...,up if and only if there exist scalars c1,...,cp such that w = c1u1 + ... + cpup (7). Because the coordinate mapping is linear, (7) implies [w]B = c1[u1]B + ... + cp[up]B (8). Conversely, because the coordinate mapping is one-to-one, (8) implies (7): the vector whose coordinate vector is c1[u1]B + ... + cp[up]B is exactly c1u1 + ... + cpup, and equal coordinate vectors come from equal vectors. So w is a linear combination of u1,...,up if and only if [w]B is the corresponding linear combination of [u1]B,...,[up]B.

The coordinate mapping (determined by B)

x -> [x]b

The change of coordinates matrix

x = PB[x]B, where PB = [b1 b2 ... bn] is the change-of-coordinates matrix from B to the standard basis in R^n. Left multiplication by PB^-1 transforms x into the coordinate vector [x]B. So the n x n matrix that implements the coordinate mapping x -> [x]B (a linear transformation from R^n to R^n determined by B) is the inverse of PB.

