math <3

Suppose a 4×7 matrix A has three pivot columns. What is dim Nul A? Is Col A = ℝ3? Why or why not?

dim Nul A = 7 − 3 = 4. No: Col A is a subspace of ℝ4 (its vectors have four entries), so even though dim Col A = 3, Col A ≠ ℝ3.

If the null space of a 5×8 matrix A is 4​-dimensional, what is the dimension of the column space of​ A?

rank A = 8 − 4 = 4, so dim Col A = 4.

If a 3×9 matrix A has rank 3, find dim Nul A, dim Row A, and rank Aᵀ.

dim Nul A = 9 − 3 = 6, dim Row A = 3, and rank Aᵀ = 3.
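
A quick NumPy sketch of the rank arithmetic in the three items above, using a hypothetical 3×9 matrix of rank 3 (an identity block padded with dependent columns):

```python
import numpy as np

# Hypothetical 3x9 matrix of rank 3: an identity block plus repeated columns.
A = np.hstack([np.eye(3), np.ones((3, 6))])

rank = np.linalg.matrix_rank(A)        # rank A = dim Col A = dim Row A
print(rank)                            # 3
print(A.shape[1] - rank)               # dim Nul A = 9 - 3 = 6 (rank-nullity)
print(np.linalg.matrix_rank(A.T))      # rank A^T = rank A = 3
```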

Let A = {a1, a2, a3} and D = {d1, d2, d3} be bases for V, and let P = [ [d1]A [d2]A [d3]A ]. Which of the following equations is satisfied by P for all x in V? (i) [x]A = P[x]D (ii) [x]D = P[x]A

Equation​ (i) is satisfied by P for all x in V.

Let U = {u1, u2} and W = {w1, w2} be bases for V, and let P be the matrix whose columns are [u1]W and [u2]W. Which of the following equations is satisfied by P for all x in V? (i) [x]U = P[x]W (ii) [x]W = P[x]U

Equation​ (ii) is satisfied by P for all x in V.

A matrix with orthonormal columns is an orthogonal matrix.

False. A matrix with orthonormal columns is an orthogonal matrix if the matrix is also square.

If {v1, v2, v3} is an orthogonal basis for W, then multiplying v3 by a scalar c gives a new orthogonal basis {v1, v2, cv3}.

False. If the scalar c is zero, then cv3 = 0 and {v1, v2, cv3} is not a basis; the statement holds only for nonzero c.

If the vectors in an orthogonal set of nonzero vectors are​ normalized, then some of the new vectors may not be orthogonal.

False. Normalization changes all nonzero vectors to have unit​ length, but does not change their relative angles.​ Therefore, orthogonal vectors will always remain orthogonal after they are normalized.

If L is a line through 0 and if ŷ is the orthogonal projection of y onto L, then ||ŷ|| gives the distance from y to L.

False. The distance from y to L is given by ||y − ŷ||.
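
A minimal NumPy sketch of the corrected formula, with a hypothetical line L = Span{w} and point y: the projection is ŷ = ((y•w)/(w•w))w, and the distance from y to L is ||y − ŷ||, not ||ŷ||.

```python
import numpy as np

w = np.array([3.0, 4.0])            # hypothetical direction vector spanning L
y = np.array([1.0, 7.0])            # hypothetical point

y_hat = (y @ w) / (w @ w) * w       # orthogonal projection of y onto L
print(np.linalg.norm(y - y_hat))    # the distance from y to L
print((y - y_hat) @ w)              # 0.0: the residual is orthogonal to L
```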

How can vectors be shown to be linearly dependent?

Form a matrix using the vectors as columns and determine the number of pivot columns. The vectors are linearly dependent exactly when the matrix has a non-pivot column, i.e., fewer pivots than columns.
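
A minimal sketch of this pivot-count test in NumPy (hypothetical vectors; the rank of the matrix equals its number of pivot columns):

```python
import numpy as np

# Hypothetical vectors in R^3; the third is the sum of the first two.
v1 = np.array([1.0, 0.0, 2.0])
v2 = np.array([0.0, 1.0, 1.0])
v3 = v1 + v2

A = np.column_stack([v1, v2, v3])
rank = np.linalg.matrix_rank(A)     # number of pivot columns of A
print(rank < A.shape[1])            # True: a non-pivot column, so dependent
```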

Let B = {b1, ..., bn} and C = {c1, ..., cn} be bases of a vector space V. Then there is a unique n×n matrix PC ← B such that [x]C = PC ← B [x]B. The columns of PC ← B are the C-coordinate vectors of the vectors in the basis B. That is, PC ← B = [ [b1]C [b2]C ... [bn]C ].

Given v in V, there exist scalars x1, ..., xn such that v = x1b1 + x2b2 + ... + xnbn, because B is a basis for V. Apply the coordinate mapping determined by the basis C and obtain [v]C = x1[b1]C + x2[b2]C + ... + xn[bn]C, because the coordinate mapping is a linear transformation. This equation may be written in the form [v]C = [ [b1]C [b2]C ... [bn]C ](x1, ..., xn) by the definition of the product of a matrix and a vector. This shows that the matrix PC ← B = [ [b1]C [b2]C ... [bn]C ] satisfies [v]C = PC ← B [v]B for each v in V, because the vector (x1, ..., xn) is the coordinate vector of v relative to B.
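
A numerical sketch of this theorem in NumPy, with hypothetical bases of ℝ2: the columns of P are the C-coordinates of the b's, and [x]C = PC ← B [x]B holds for any x.

```python
import numpy as np

# Hypothetical bases of R^2, stored as matrix columns.
B = np.array([[1.0, 1.0],
              [0.0, 2.0]])          # columns b1, b2
C = np.array([[1.0, 0.0],
              [1.0, 1.0]])          # columns c1, c2

# [v]_C solves C @ coords = v, so solving C X = B column-by-column
# gives P whose columns are [b1]_C and [b2]_C.
P = np.linalg.solve(C, B)

x_B = np.array([3.0, -2.0])         # B-coordinates of some x
x = B @ x_B                         # x itself, in standard coordinates
x_C = np.linalg.solve(C, x)         # its C-coordinates

print(np.allclose(P @ x_B, x_C))    # True: [x]_C = P_{C<-B} [x]_B
```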

Which theorem could help prove one of these criteria from​ another?

If S = {u1, ..., up} is an orthogonal set of nonzero vectors in ℝn, then S is linearly independent and hence is a basis for the subspace spanned by S.

What is satire's purpose?

It is a technique for making individuals aware of the corruption and foolishness in society through exaggerated humor.

A is a 3×3 matrix with two eigenvalues. Each eigenspace is​ one-dimensional. Is A​ diagonalizable? Why?

No. The sum of the dimensions of the eigenspaces equals 2, but the matrix has 3 columns. For A to be diagonalizable, the sum of the dimensions of the eigenspaces must equal the number of columns.
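
A NumPy sketch with a hypothetical matrix of exactly this kind (eigenvalues 2 and 3, each eigenspace one-dimensional), checking that the geometric multiplicities sum to 2 < 3:

```python
import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 3.0]])     # hypothetical: eigenvalues 2 and 3 only

def eigenspace_dim(A, lam):
    # dim Nul(A - lam I) = n - rank(A - lam I)
    n = A.shape[0]
    return n - np.linalg.matrix_rank(A - lam * np.eye(n))

dims = [eigenspace_dim(A, lam) for lam in (2.0, 3.0)]
print(dims, sum(dims))              # [1, 1] 2 -> 2 < 3, so not diagonalizable
```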

Without calculation, find one eigenvalue and two linearly independent eigenvectors of A = [2 2 2; 2 2 2; 2 2 2] (the 3×3 matrix with every entry equal to 2). Justify your answer.

One eigenvalue of A is λ = 0 because the columns of A are linearly dependent. Two linearly independent eigenvectors of A are, for example, (1, −1, 0) and (1, 0, −1), because the entries of each vector sum to 0, so Av = 0 = 0v.
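
A quick NumPy check of this answer (the two eigenvectors are one valid choice; any two independent vectors whose entries sum to 0 work):

```python
import numpy as np

A = 2 * np.ones((3, 3))             # the all-2s matrix above

v1 = np.array([1.0, -1.0, 0.0])     # entries sum to 0
v2 = np.array([1.0, 0.0, -1.0])     # entries sum to 0, independent of v1

print(A @ v1, A @ v2)               # both are the zero vector, so Av = 0v
```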

How do these calculations show that {u1, u2, u3} is an orthogonal basis for ℝ3?

Since each inner product is 0, the vectors form an orthogonal set. By the theorem above, an orthogonal set of nonzero vectors is linearly independent, and three linearly independent vectors form a basis for ℝ3. Then express x as a linear combination of the u's.

What is the dimension of the vector space ℙ3​?

The dimension of ℙ3 is 4; for example, {1, t, t2, t3} is a basis. Since the dimension of ℙ3 is equal to the number of elements in the linearly independent set formed by the given polynomials, the given set of polynomials forms a basis for ℙ3.

When are polynomials linearly dependent?

Polynomials are linearly dependent when the matrix whose columns are their coordinate vectors (relative to a basis such as {1, t, t2, ...}) has a non-pivot column. Equivalently, if that matrix has a pivot in each column, its columns (and thus the given polynomials) are linearly independent.
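
A sketch of this coefficient-matrix test in NumPy, with hypothetical polynomials in ℙ2 written in the basis {1, t, t2}; the third polynomial is the sum of the first two, so the matrix has a non-pivot column:

```python
import numpy as np

# Coordinate vectors of 1 + t, t + t^2, and 1 + 2t + t^2
# relative to the basis {1, t, t^2} (hypothetical example).
p1 = np.array([1.0, 1.0, 0.0])
p2 = np.array([0.0, 1.0, 1.0])
p3 = p1 + p2

M = np.column_stack([p1, p2, p3])
print(np.linalg.matrix_rank(M) < M.shape[1])   # True: linearly dependent
```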

If A is a 12×9 matrix, what is the largest possible dimension of the row space of​ A? If A is a 9×12 matrix, what is the largest possible dimension of the row space of​ A? Explain.

The dimension of the row space of A is equal to the number of pivot positions in A. Since there are only 9 columns in a 12×9 matrix, and there are only 9 rows in a 9×12 matrix, there can be at most 9 pivot positions for either matrix.​ Therefore, the largest possible dimension of the row space of either matrix is 9.

If the distance from u to v equals the distance from u to −​v, then u and v are orthogonal.

The given statement is true. Since ||u − v||^2 = ||u||^2 − 2u•v + ||v||^2 and ||u − (−v)||^2 = ||u + v||^2 = ||u||^2 + 2u•v + ||v||^2, the two distances are equal if and only if u•v = 0, that is, if and only if u and v are orthogonal.

For any scalar​ c, u•​(cv​)=​c(u•v​).

The given statement is true because this is a valid property of the inner product.

v•v = ||v||^2

The given statement is true. By the definition of the length of a vector v, ||v|| = sqrt(v•v), so v•v = ||v||^2.

If vectors v1​,...,vp span a subspace W and if x is orthogonal to each vj for j=​1,...,p, then x is in W⊥.

The given statement is true. If x is orthogonal to each vj​, then x is also orthogonal to any linear combination of those vj. Since any vector in W can be described as a linear combination of vj​, x is orthogonal to all vectors in W.

Row operations preserve the linear dependence relations among the rows of A. Is this statement true or​ false?

The statement is false. Row operations may change the linear dependence relations among the rows of A.

The columns of the​ change-of-coordinates matrix PC ← B are B​-coordinate vectors of the vectors in C.

The statement is false because the columns of the matrix PC ← B are the C​-coordinate vectors of the vectors in B.

If B is any echelon form of​ A, then the pivot columns of B form a basis for the column space of A. Is this statement true or​ false?

The statement is false. The columns of an echelon form B of A are often not in the column space of A.

If V=ℝn and C is the standard basis for​ V, then PC ← B is the same as the​ change-of-coordinates matrix PB that satisfies x=PB[x]B for all x in V.

The statement is true because if C is the standard basis for ℝn, then [bi]C = bi for 1 ≤ i ≤ n, and PB = [b1 ... bn].

The row space of Aᵀ is the same as the column space of A. Is this statement true or false?

The statement is true because the rows of Aᵀ are the columns of (Aᵀ)ᵀ = A.

If A and B are row​ equivalent, then their row spaces are the same. Is this statement true or​ false?

The statement is true. If B is obtained from A by row operations, the rows of B are linear combinations of the rows of A, and vice versa.

The dimension of the null space of A is the number of columns of A that are not pivot columns. Is this statement true or​ false?

The statement is true. The dimension of Nul A equals the number of free variables in the equation Ax=0.

Which of the following criteria are necessary for a set of vectors to be an orthogonal basis for a subspace W of ℝn​? Select all that apply.

The vectors must span W. The vectors must form an orthogonal set.

The​ Gram-Schmidt process produces from a linearly independent set x1, ... , xp an orthogonal set v1, ... , vp with the property that for each​ k, the vectors v1, ... , vk span the same subspace as that spanned by x1, ... , xk.

True. Let Wk = Span{x1, ... , xk} and set v1 = x1. Given an orthogonal basis {v1, ... , vk} for Wk, the vector vk+1 = xk+1 − projWk xk+1 is orthogonal to Wk and lies in Wk+1. Also, vk+1 ≠ 0 because xk+1 is not in Wk. Hence {v1, ... , vk+1} is an orthogonal basis for Wk+1.
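
A minimal NumPy implementation of the process as described (assuming the input columns are linearly independent; the demo vectors are hypothetical):

```python
import numpy as np

def gram_schmidt(X):
    # Columns of X are linearly independent x1, ..., xp. Returns a matrix
    # whose columns v1, ..., vp are orthogonal, with
    # Span{v1, ..., vk} = Span{x1, ..., xk} for each k.
    V = []
    for x in X.T:
        # subtract from x its projection onto Span{v1, ..., vk}
        v = x - sum(((x @ u) / (u @ u)) * u for u in V)
        V.append(v)
    return np.column_stack(V)

X = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [0.0, 1.0]])          # hypothetical independent columns in R^3
V = gram_schmidt(X)
print(np.round(V.T @ V, 10))        # diagonal matrix: the columns are orthogonal
```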

If y is a linear combination of nonzero vectors from an orthogonal​ set, then the weights in the linear combination can be computed without row operations on a matrix.

True. For each y in W, the weights in the linear combination y = c1u1 + ••• + cpup can be computed as cj = (y•uj)/(uj•uj), where j = 1, . . . , p.
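
A short NumPy sketch of this weight formula with a hypothetical orthogonal set: the weights come straight from dot products, with no row reduction.

```python
import numpy as np

u1 = np.array([1.0, 1.0, 0.0])      # hypothetical orthogonal vectors
u2 = np.array([1.0, -1.0, 2.0])     # u1 . u2 = 0

y = 3 * u1 - 2 * u2                 # y lies in Span{u1, u2}

c = [(y @ u) / (u @ u) for u in (u1, u2)]
print(c)                            # [3.0, -2.0]: the original weights
```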

Not every linearly independent set in ℝn is an orthogonal set.

True. For example, the vectors (0, 1) and (1, 1) are linearly independent but not orthogonal.

If A = QR, where Q has orthonormal columns, then R = QᵀA.

True. Since Q has orthonormal columns, QᵀQ = I. So QᵀA = Qᵀ(QR) = IR = R.
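
A NumPy check of this identity on a hypothetical matrix, using numpy.linalg.qr (its reduced factorization returns a Q with orthonormal columns):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [0.0, 1.0],
              [1.0, 0.0]])          # hypothetical 3x2 matrix, independent columns

Q, R = np.linalg.qr(A)              # reduced QR: Q is 3x2, R is 2x2
print(np.allclose(Q.T @ Q, np.eye(2)))   # True: Q^T Q = I
print(np.allclose(Q.T @ A, R))           # True: R = Q^T A
```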

Is λ = 2 an eigenvalue of A = [−3 6; −5 8]? Why or why not?

Yes, λ = 2 is an eigenvalue of A because the equation Ax = 2x, equivalently (A − 2I)x = 0, has a nontrivial solution.
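
A quick numerical confirmation: A − 2I is singular, so (A − 2I)x = 0 has a nontrivial solution.

```python
import numpy as np

A = np.array([[-3.0, 6.0],
              [-5.0, 8.0]])

print(np.linalg.det(A - 2 * np.eye(2)))   # 0.0: A - 2I is singular
print(np.linalg.eigvals(A))               # [2., 3.]: 2 is an eigenvalue
```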

