Math 1554 Midterm 3 True/False

Identify a nonzero 2×2 matrix that is invertible but not diagonalizable.

One example is the matrix with rows (1, 1) and (0, 1).
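A quick check that this example works:

$$A = \begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix}, \qquad \det A = 1 \neq 0, \qquad \det(A - \lambda I) = (1 - \lambda)^2.$$

So A is invertible, its only eigenvalue is λ = 1, and Nul(A − I) = Span{(1, 0)} is one-dimensional. With only one linearly independent eigenvector, A cannot be diagonalized.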

Not every linearly independent set in ℝn is an orthogonal set.

True. For example, the vectors (0, 1) and (1, 1) are linearly independent but not orthogonal.

A matrix with orthonormal columns is an orthogonal matrix.

False. A matrix with orthonormal columns is an orthogonal matrix if the matrix is also square.

If {v1, v2, v3} is an orthogonal basis for W, then multiplying v3 by a scalar c gives a new orthogonal basis {v1, v2, cv3}.

False. If the scalar c is zero, then cv3 is the zero vector and {v1, v2, cv3} is not a basis.

If the vectors in an orthogonal set of nonzero vectors are​ normalized, then some of the new vectors may not be orthogonal.

False. Normalization changes all nonzero vectors to have unit​ length, but does not change their relative angles.​ Therefore, orthogonal vectors will always remain orthogonal after they are normalized.

If L is a line through 0 and if ŷ is the orthogonal projection of y onto L, then ||ŷ|| gives the distance from y to L.

False. The distance from y to L is given by ||y − ŷ||.

If a set S = {u1, . . . , up} has the property that ui•uj = 0 whenever i ≠ j, then S is an orthonormal set.

False. To be​ orthonormal, the vectors in S must be unit vectors as well as being orthogonal to each other.

An orthogonal matrix is invertible.

True. The columns are linearly independent since they are orthogonal and nonzero, so the matrix is invertible by the Invertible Matrix Theorem.

Suppose W is a subspace of ℝn spanned by n nonzero orthogonal vectors. Explain why W=ℝn.

An orthogonal set of nonzero vectors is linearly independent, and the column vectors v1, v2, ..., vn span ℝn if and only if {v1, v2, ..., vn} is linearly independent. Hence the n given vectors span ℝn, so W = ℝn.

For a square matrix ​A, vectors in Col A are orthogonal to vectors in Nul A.

The given statement is false. By the theorem of orthogonal complements, vectors in Col A are orthogonal to vectors in Nul Aᵀ. Using the definition of orthogonal complements, vectors in Col A are orthogonal to vectors in Nul A only when Col A and Row A coincide, which is not necessarily true.
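For instance, one can check a small counterexample:

$$A = \begin{bmatrix} 1 & 1 \\ 0 & 0 \end{bmatrix}, \qquad \operatorname{Col} A = \operatorname{Span}\{(1, 0)\}, \qquad \operatorname{Nul} A = \operatorname{Span}\{(1, -1)\},$$

and (1, 0)•(1, −1) = 1 ≠ 0, so Col A is not orthogonal to Nul A here.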

For any scalar c, ||cv|| = c||v||.

The given statement is false. Length is always nonnegative, so ||cv|| is always nonnegative, but when c is negative, c||v|| is negative. The correct identity is ||cv|| = |c| ||v||.
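The corrected identity, with a quick numerical check:

$$\|c\mathbf{v}\| = |c|\,\|\mathbf{v}\|; \qquad c = -2,\ \mathbf{v} = (3, 4): \quad \|(-6, -8)\| = \sqrt{36 + 64} = 10 = |-2| \cdot \|(3, 4)\|.$$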

For any scalar​ c, u•​(cv​)=​c(u•v​).

The given statement is true because this is a valid property of the inner product.

If ||u||² + ||v||² = ||u + v||², then u and v are orthogonal.

The given statement is true. By the Pythagorean Theorem, two vectors u and v are orthogonal if and only if ||u + v||² = ||u||² + ||v||².
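The one-step derivation behind this:

$$\|\mathbf{u} + \mathbf{v}\|^2 = (\mathbf{u} + \mathbf{v})\cdot(\mathbf{u} + \mathbf{v}) = \|\mathbf{u}\|^2 + 2\,\mathbf{u}\cdot\mathbf{v} + \|\mathbf{v}\|^2,$$

which equals ||u||² + ||v||² exactly when u•v = 0.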

v•v = ||v||²

The given statement is true. By the definition of the length of a vector v, ||v|| = √(v•v), so v•v = ||v||².

For an m×n matrix​ A, vectors in the null space of A are orthogonal to vectors in the row space of A.

The given statement is true. By the theorem of orthogonal​ complements, (Row A)⊥=Nul A. It​ follows, by the definition of orthogonal​ complements, that vectors in the null space of A are orthogonal to vectors in the row space of A.
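To see why, note that each entry of Ax is a row of A dotted with x:

$$A\mathbf{x} = \begin{bmatrix} \mathbf{r}_1 \cdot \mathbf{x} \\ \vdots \\ \mathbf{r}_m \cdot \mathbf{x} \end{bmatrix},$$

so Ax = 0 means x is orthogonal to every row of A, and hence to every linear combination of the rows.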

If x is orthogonal to every vector in a subspace​ W, then x is in W⊥.

The given statement is true. If x is orthogonal to every vector in​ W, then x is said to be orthogonal to W. The set of all vectors x that are orthogonal to W is denoted W⊥.

A least-squares solution of Ax=b is a vector x̂ such that ||b − Ax|| ≤ ||b − Ax̂|| for all x in ℝn.

The statement is false because a least-squares solution of Ax=b is a vector x̂ such that ||b − Ax̂|| ≤ ||b − Ax|| for all x in ℝn.

A is diagonalizable if and only if A has n eigenvalues, counting multiplicities.

The statement is false because the eigenvalues of A may not produce enough eigenvectors to form a basis of ℝn.

A is diagonalizable if A = PDP⁻¹ for some matrix D and some invertible matrix P.

The statement is false because the symbol D does not automatically denote a diagonal matrix.

The orthogonal projection ŷ of y onto a subspace W can sometimes depend on the orthogonal basis for W used to compute ŷ.

The statement is false because the uniqueness property of the orthogonal decomposition y = ŷ + z indicates that, no matter the basis used to compute it, the projection ŷ is always the same.

If A is​ diagonalizable, then A has n distinct eigenvalues.

The statement is false. A diagonalizable matrix can have fewer than n distinct eigenvalues and still have n linearly independent eigenvectors; for example, the n×n identity matrix is diagonal but has only the eigenvalue 1.

A matrix A is diagonalizable if A has n eigenvectors.

The statement is false. A diagonalizable matrix must have n linearly independent eigenvectors.

If A is​ invertible, then A is diagonalizable.

The statement is false. An invertible matrix may have fewer than n linearly independent​ eigenvectors, making it not diagonalizable.

A least-squares solution of Ax=b is a vector x̂ that satisfies Ax̂ = b̂, where b̂ is the orthogonal projection of b onto Col A.

The statement is true because b̂ is the closest point in Col A to b. So Ax = b̂ is consistent, and any x̂ such that Ax̂ = b̂ is a least-squares solution of Ax=b.

For each y and each subspace​ W, the vector y−projWy is orthogonal to W.

The statement is true because y can be written uniquely in the form y=projWy+z where projWy is in W and z is in W⊥ and it follows that z=y−projWy.

If ℝn has a basis of eigenvectors of A, then A is diagonalizable.

The statement is true because A is diagonalizable if and only if there are enough eigenvectors to form a basis of ℝn.

If y is in a subspace W, then the orthogonal projection of y onto W is y itself.

The statement is true because for an orthogonal basis of​ W, B=u1,...,up​, y and projWy can be written as linear combinations of vectors in B with equal weights.

If the columns of A are linearly​ independent, then the equation Ax=b has exactly one​ least-squares solution.

The statement is true because if the columns of A are linearly independent, then AᵀA is invertible and x̂ = (AᵀA)⁻¹Aᵀb is the unique least-squares solution to Ax=b.
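A minimal NumPy sketch of this formula; the particular A and b below are just an illustrative choice:

```python
import numpy as np

# Example matrix with linearly independent columns (an illustrative choice)
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([6.0, 0.0, 0.0])

# Normal equations: (A^T A) xhat = A^T b; solving is preferable to inverting
xhat = np.linalg.solve(A.T @ A, A.T @ b)
print(xhat)                                  # [ 5. -3.]

# Agrees with NumPy's built-in least-squares routine
print(np.linalg.lstsq(A, b, rcond=None)[0])  # [ 5. -3.]
```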

The general​ least-squares problem is to find an x that makes Ax as close as possible to b.

The statement is true because the general least-squares problem attempts to find an x that minimizes ||b − Ax||.

Any solution of AᵀAx = Aᵀb is a least-squares solution of Ax=b.

The statement is true because the set of least-squares solutions of Ax=b coincides with the nonempty set of solutions of the normal equations AᵀAx = Aᵀb.

If AP=​PD, with D​ diagonal, then the nonzero columns of P must be eigenvectors of A.

The statement is true. Let v be a nonzero column in P and let λ be the corresponding diagonal element in D. Then AP=PD implies that Av=λv​, which means that v is an eigenvector of A.
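Reading AP = PD column by column makes this explicit:

$$AP = [\,A\mathbf{v}_1\ \cdots\ A\mathbf{v}_n\,], \qquad PD = [\,d_{11}\mathbf{v}_1\ \cdots\ d_{nn}\mathbf{v}_n\,],$$

so equating the j-th columns gives Avj = djj vj whenever vj ≠ 0.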

If z is orthogonal to u1 and u2 and if W = Span{u1, u2}, then z must be in W⊥.

The statement is true because, since z is orthogonal to u1 and u2, it is orthogonal to every linear combination of u1 and u2, and these combinations make up all of W.

If A has n linearly independent eigenvectors, then so does Aᵀ.

True. If A = PDP⁻¹ with D diagonal, then Aᵀ = (P⁻¹)ᵀDPᵀ = (Pᵀ)⁻¹DPᵀ, so Aᵀ is also diagonalizable and its n linearly independent eigenvectors are the columns of (Pᵀ)⁻¹.

If A is both diagonalizable and​ invertible, then so is A−1.

True. If A = PDP⁻¹ with D diagonal and A is invertible, then D has no zero diagonal entries and A⁻¹ = PD⁻¹P⁻¹, so A⁻¹ is both diagonalizable and invertible.

If W is a subspace of ℝn and if v is in both W and W⊥​, then v must be the zero vector.

True. If v is in W, then projW v = v. Since the W⊥ component of v equals v − projW v, the W⊥ component of v must be 0. A similar argument can be made for the W component of v using the orthogonal projection of v onto W⊥. Thus, v must be 0.
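Equivalently, a one-line argument: since v is in W and also in W⊥, v is orthogonal to itself, so

$$\mathbf{v}\cdot\mathbf{v} = \|\mathbf{v}\|^2 = 0 \implies \mathbf{v} = \mathbf{0}.$$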

If the columns of an m×n matrix A are​ orthonormal, then the linear mapping x↦Ax preserves lengths.

True. ||Ax|| = ||x||.
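Why the map preserves lengths:

$$\|A\mathbf{x}\|^2 = (A\mathbf{x})^{\mathsf{T}}(A\mathbf{x}) = \mathbf{x}^{\mathsf{T}}(A^{\mathsf{T}}A)\mathbf{x} = \mathbf{x}^{\mathsf{T}}\mathbf{x} = \|\mathbf{x}\|^2,$$

since orthonormal columns give AᵀA = I.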

Not every orthogonal set in ℝn is linearly independent.

True. Every orthogonal set of nonzero vectors is linearly independent, but an orthogonal set containing the zero vector is linearly dependent.

The​ Gram-Schmidt process produces from a linearly independent set x1, ... , xp an orthogonal set v1, ... , vp with the property that for each​ k, the vectors v1, ... , vk span the same subspace as that spanned by x1, ... , xk.

True. Let Wk = Span{x1, ... , xk}. Take v1 = x1, and given an orthogonal basis {v1, ... , vk} for Wk, set vk+1 = xk+1 − projWk xk+1, which is orthogonal to Wk and lies in Wk+1. Also, vk+1 ≠ 0 because xk+1 is not in Wk. Hence {v1, ... , vk+1} is an orthogonal basis for Wk+1.
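A minimal sketch of the process in NumPy (the input columns are assumed linearly independent; the test matrix is just an example):

```python
import numpy as np

def gram_schmidt(X):
    """Return an orthogonal basis for Col X, one column per input column.

    Classical Gram-Schmidt: v_{k+1} = x_{k+1} - proj_{W_k} x_{k+1},
    assuming the columns of X are linearly independent.
    """
    V = []
    for x in X.T:                            # iterate over columns of X
        v = x.copy()
        for u in V:                          # subtract projection onto each earlier v
            v = v - ((x @ u) / (u @ u)) * u
        V.append(v)
    return np.column_stack(V)

X = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [1.0, 0.0]])
V = gram_schmidt(X)
print(V.T @ V)   # off-diagonal entries are 0, so the columns are orthogonal
```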

If y is a linear combination of nonzero vectors from an orthogonal​ set, then the weights in the linear combination can be computed without row operations on a matrix.

True. For each y in W, the weights in the linear combination y = c1u1 + ... + cpup can be computed by cj = (y•uj)/(uj•uj), for j = 1, . . . , p.
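A small worked example: take the orthogonal set u1 = (1, 1), u2 = (1, −1) and y = (3, 1). Then

$$c_1 = \frac{\mathbf{y}\cdot\mathbf{u}_1}{\mathbf{u}_1\cdot\mathbf{u}_1} = \frac{4}{2} = 2, \qquad c_2 = \frac{\mathbf{y}\cdot\mathbf{u}_2}{\mathbf{u}_2\cdot\mathbf{u}_2} = \frac{2}{2} = 1,$$

and indeed 2u1 + 1u2 = (3, 1), with no row reduction needed.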

The orthogonal projection of y onto v is the same as the orthogonal projection of y onto cv whenever c≠0.

True. If c is any nonzero scalar and if v is replaced by cv in the definition of the orthogonal projection of y onto v​, then the orthogonal projection of y onto cv is exactly the same as the orthogonal projection of y onto v.

If A = QR, where Q has orthonormal columns, then R = QᵀA.

True. Since Q has orthonormal columns, QᵀQ = I. So QᵀA = Qᵀ(QR) = IR = R.
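A quick NumPy confirmation (the matrix here is an arbitrary example):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [0.0, 1.0],
              [1.0, 0.0]])
Q, R = np.linalg.qr(A)   # Q has orthonormal columns, R is upper triangular

print(np.allclose(Q.T @ Q, np.eye(2)))  # True: Q^T Q = I
print(np.allclose(Q.T @ A, R))          # True: R = Q^T A
```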

The​ least-squares solution of Ax=b is the point in the column space of A closest to b.

False. A least-squares solution x̂ is a vector of weights in ℝn; the point in Col A closest to b is Ax̂, not x̂ itself.

If A is diagonalizable, then A is invertible.

False. Diagonalizability does not require invertibility; for example, the zero matrix is diagonal (hence diagonalizable) but not invertible, and an invertible matrix may fail to be diagonalizable.

If x̂ is a least-squares solution of Ax=b, then x̂ = (AᵀA)⁻¹Aᵀb.

False. The formula x̂ = (AᵀA)⁻¹Aᵀb requires AᵀA to be invertible, which holds only when the columns of A are linearly independent, and we do not know that here.

If vectors v1,...,vp span a subspace W and if x is orthogonal to each vj for j=1,...,p, then x is in W⊥.

True. Any w in W is a linear combination of v1,...,vp, so x•w is a sum of multiples of the x•vj, each of which is 0; hence x is orthogonal to every vector in W.

||Ux|| = ||x||

True.

A(Re v)=a Re v+b Im v and A(Im v)=−b Re v+a Im v.

True.

A = QR where R is an invertible matrix. Do A and Q have the same column space?

True. Since A = QR, each column of A is in Col Q; since R is invertible, Q = AR⁻¹, so each column of Q is in Col A. Hence Col A = Col Q.

If b is in the column space of​ A, then every solution of Ax=b is a​ least-squares solution.

True. If b is in Col A, then the orthogonal projection of b onto Col A is b itself, so every solution of Ax=b makes ||b − Ax|| = 0, the smallest possible value.

If the distance from u to v equals the distance from u to −​v, then u and v are orthogonal.

True. If ||u − v|| = ||u + v||, then squaring both sides and expanding gives −2u•v = 2u•v, so u•v = 0.

Re(Ax​)=​A(Re x​) and ​Im(Ax​)=​A(Im x​).

True.

If Ax=λx for some nonzero vector x in ℂn, then, in fact, λ is real and the real part of x is an eigenvector of A.

True.

​(Ux​)•​(Uy​)=0 if and only if x•y=0

True, since (Ux)•(Uy) = x•y (see the next card), so one side is zero exactly when the other is.

​(Ux​)•​(Uy​)=x•y

True. (Ux)•(Uy) = (Ux)ᵀ(Uy) = xᵀUᵀUy = xᵀy = x•y, since UᵀU = I when U has orthonormal columns.

In the Orthogonal Decomposition Theorem, each term in y = ((y•u1)/(u1•u1))u1 + ... + ((y•up)/(up•up))up is itself an orthogonal projection of y onto a subspace of W.

True. The span of each ui is a one-dimensional subspace of W, so each term is the orthogonal projection of y onto the subspace spanned by that ui.

A​ least-squares solution of Ax=b is a list of weights​ that, when applied to the columns of​ A, produces the orthogonal projection of b onto Col A.

True. The orthogonal projection of b onto Col A is the closest point in Col A to b; therefore a least-squares solution is a list of weights that produces that projection from the columns of A.

If y = z1 + z2, where z1 is in a subspace W of ℝn and z2 is in W⊥, then z1 must be the orthogonal projection of y onto W.

True. The orthogonal decomposition of y into components in W and W⊥ is unique, so z1 equals ŷ.

