Linear Algebra Quiz 424


If the distance from u to v equals the distance from u to −v, then u and v are orthogonal.

By the definition of orthogonal, u and v are orthogonal if and only if u • v = 0. Since ‖u − v‖² = ‖u‖² − 2u • v + ‖v‖² and ‖u − (−v)‖² = ‖u‖² + 2u • v + ‖v‖², the squared distance from u to v equals the squared distance from u to −v if and only if 2u • v = −2u • v, that is, if and only if u • v = 0. Requiring the squared distances to be equal is the same as requiring the distances to be equal, so the given statement is true.
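
A minimal NumPy sketch of this equivalence, using hypothetical vectors chosen so that u • v = 0; the two distances come out equal:

```python
import numpy as np

# Hypothetical vectors chosen so that u . v = 0.
u = np.array([3.0, 1.0, 2.0])
v = np.array([-2.0, 4.0, 1.0])

print(np.dot(u, v))                                  # 0.0, so u and v are orthogonal
print(np.linalg.norm(u - v), np.linalg.norm(u + v))  # dist(u, v) equals dist(u, -v)
```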

A matrix with orthonormal columns is an orthogonal matrix.

False. A matrix with orthonormal columns is an orthogonal matrix if the matrix is also square.
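
A quick sketch of why squareness matters, using a hypothetical 3×2 matrix Q with orthonormal columns: QᵀQ = I holds, but QQᵀ ≠ I, so Q is not an orthogonal matrix:

```python
import numpy as np

# Hypothetical 3x2 matrix with orthonormal columns; it is not square.
Q = np.array([[0.6, 0.0],
              [0.8, 0.0],
              [0.0, 1.0]])

print(np.allclose(Q.T @ Q, np.eye(2)))   # True: the columns are orthonormal
print(np.allclose(Q @ Q.T, np.eye(3)))   # False: Q is not square, hence not orthogonal
```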

If the vectors in an orthogonal set of nonzero vectors are​ normalized, then some of the new vectors may not be orthogonal.

False. Normalization changes all nonzero vectors to have unit​ length, but does not change their relative angles.​ Therefore, orthogonal vectors will always remain orthogonal after they are normalized.
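
A small numeric illustration with two hypothetical orthogonal vectors: the dot product stays zero after each vector is scaled to unit length:

```python
import numpy as np

# Hypothetical orthogonal (but not orthonormal) vectors.
v1 = np.array([2.0, 2.0, 0.0])
v2 = np.array([1.0, -1.0, 3.0])

u1 = v1 / np.linalg.norm(v1)   # normalize to unit length
u2 = v2 / np.linalg.norm(v2)

print(np.dot(v1, v2))          # 0.0 before normalization
print(np.dot(u1, u2))          # still 0.0 after normalization
```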

If L is a line through 0 and if ŷ is the orthogonal projection of y onto L, then ‖ŷ‖ gives the distance from y to L.

False. The distance from y to L is given by ‖y − ŷ‖.
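
A short sketch with a hypothetical line L = Span{u} and point y, showing that ‖y − ŷ‖ is the distance while ‖ŷ‖ generally is not:

```python
import numpy as np

# Hypothetical data: line L spanned by u, point y.
u = np.array([1.0, 2.0])
y = np.array([4.0, 1.0])

y_hat = (np.dot(y, u) / np.dot(u, u)) * u   # orthogonal projection of y onto L
print(np.linalg.norm(y - y_hat))            # distance from y to L
print(np.linalg.norm(y_hat))                # generally NOT that distance
```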

If a set S = {u1, . . . , up} has the property that ui • uj = 0 whenever i ≠ j, then S is an orthonormal set.

False. To be​ orthonormal, the vectors in S must be unit vectors as well as being orthogonal to each other.

How can this inverse be expressed using​ transposes? Select the correct choice below​ and, if​ necessary, fill in the answer boxes to complete your choice.

Since U and V are orthogonal matrices, U⁻¹ = Uᵀ and V⁻¹ = Vᵀ. By substitution, (UV)⁻¹ = V⁻¹U⁻¹ = VᵀUᵀ.

Why is UV​ invertible?

Since U and V are​ orthogonal, each is invertible by the definition of orthogonal matrices. The product of two invertible matrices is also invertible.
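
A numeric check of (UV)⁻¹ = VᵀUᵀ, using two hypothetical 2×2 rotation matrices as the orthogonal matrices:

```python
import numpy as np

def rotation(theta):
    """2x2 rotation matrix, which is orthogonal."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

U, V = rotation(0.3), rotation(1.1)

print(np.allclose(np.linalg.inv(U @ V), V.T @ U.T))  # True: (UV)^-1 = V^T U^T
```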

For an m×n matrix A, vectors in the null space of A are orthogonal to vectors in the row space of A.

The given statement is true. By the theorem of orthogonal complements, (Row A)⊥ = Nul A. It follows, by the definition of orthogonal complements, that vectors in the null space of A are orthogonal to vectors in the row space of A.
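
A concrete (hypothetical) 2×3 example: x is in Nul A, and its dot product with each row of A is zero, so x is orthogonal to Row A:

```python
import numpy as np

# Hypothetical 2x3 matrix; x = (1, -2, 1) is in Nul A since A @ x = 0.
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])
x = np.array([1.0, -2.0, 1.0])

print(A @ x)      # [0. 0.], so x is in Nul A
print(A[0] @ x)   # 0.0: x is orthogonal to each row of A,
print(A[1] @ x)   # 0.0  hence to every vector in Row A
```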

For a square matrix A, vectors in Col A are orthogonal to vectors in Nul A.

The given statement is false. By the theorem of orthogonal complements, vectors in Col A are orthogonal to vectors in Nul Aᵀ. Using the definition of orthogonal complements, vectors in Col A are orthogonal to vectors in Nul A if and only if Row A and Col A are the same subspace, which is not necessarily true.

For any scalar c, ‖cv‖ = c‖v‖.

The given statement is false. The norm ‖cv‖ is always nonnegative, but when c is negative (and v ≠ 0), the value c‖v‖ is negative, so the two sides need not be equal. The correct identity is ‖cv‖ = |c|‖v‖.
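
A one-line numeric illustration with a hypothetical v and a negative scalar c, including the corrected identity ‖cv‖ = |c|‖v‖:

```python
import numpy as np

v = np.array([3.0, 4.0])
c = -2.0

print(np.linalg.norm(c * v))        # 10.0
print(c * np.linalg.norm(v))        # -10.0: not equal in general
print(abs(c) * np.linalg.norm(v))   # 10.0: |c| * ||v|| is the correct identity
```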

For any scalar c, u • (cv) = c(u • v).

The given statement is true because this is a valid property of the inner product.

If ‖u‖² + ‖v‖² = ‖u + v‖², then u and v are orthogonal.

The given statement is true. By the Pythagorean Theorem, two vectors u and v are orthogonal if and only if ‖u + v‖² = ‖u‖² + ‖v‖².
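
A sketch of the Pythagorean check with two hypothetical orthogonal vectors:

```python
import numpy as np

u = np.array([3.0, 0.0])
v = np.array([0.0, 4.0])   # orthogonal to u

lhs = np.linalg.norm(u)**2 + np.linalg.norm(v)**2
rhs = np.linalg.norm(u + v)**2
print(np.isclose(lhs, rhs))   # True exactly when u . v = 0
```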

v • v = ‖v‖²

The given statement is true. By the definition of the length of a vector v, ‖v‖ = √(v • v).

If vectors v1, . . . , vp span a subspace W and if x is orthogonal to each vj for j = 1, . . . , p, then x is in W⊥.

The given statement is true. If x is orthogonal to each vj, then x is also orthogonal to any linear combination of the vj. Since any vector in W can be written as a linear combination of the vj, x is orthogonal to all vectors in W.

If x is orthogonal to every vector in a subspace W, then x is in W⊥.

The given statement is true. If x is orthogonal to every vector in W, then x is said to be orthogonal to W. The set of all vectors x that are orthogonal to W is denoted W⊥.

u • v − v • u = 0

The given statement is true. Since the inner product is commutative, u • v = v • u. Subtracting v • u from each side of this equation gives u • v − v • u = 0.

If the columns of an m×n matrix A are orthonormal, then the linear mapping x ↦ Ax preserves lengths.

True. ‖Ax‖ = ‖x‖.
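
A sketch with a hypothetical 3×2 matrix whose columns are orthonormal; ‖Ax‖ and ‖x‖ agree:

```python
import numpy as np

# Hypothetical 3x2 matrix with orthonormal columns.
A = np.array([[1.0, 0.0],
              [0.0, 0.6],
              [0.0, 0.8]])
x = np.array([2.0, -1.0])

print(np.allclose(A.T @ A, np.eye(2)))            # True: orthonormal columns
print(np.linalg.norm(A @ x), np.linalg.norm(x))   # equal lengths
```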

An orthogonal matrix is invertible.

True. An orthogonal matrix is a square invertible matrix U such that U⁻¹ = Uᵀ.

Not every orthogonal set in ℝⁿ is linearly independent.

True. Every orthogonal set of nonzero vectors is linearly independent, but an orthogonal set that contains the zero vector is linearly dependent.

If y is a linear combination of nonzero vectors from an orthogonal​ set, then the weights in the linear combination can be computed without row operations on a matrix.

True. For each y in W, the weights in the linear combination y = c1u1 + ⋯ + cpup can be computed by cj = (y • uj)/(uj • uj), where j = 1, . . . , p.
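
A minimal sketch of the weight formula cj = (y • uj)/(uj • uj), using a hypothetical orthogonal set {u1, u2} and a y built from known weights:

```python
import numpy as np

# Hypothetical orthogonal basis of a plane in R^3 and a vector y in that plane.
u1 = np.array([1.0, 1.0, 0.0])
u2 = np.array([1.0, -1.0, 0.0])
y  = 3.0 * u1 - 2.0 * u2              # so the true weights are 3 and -2

c1 = np.dot(y, u1) / np.dot(u1, u1)   # weight formula, no row reduction needed
c2 = np.dot(y, u2) / np.dot(u2, u2)
print(c1, c2)                         # 3.0 -2.0
```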

Not every linearly independent set in ℝⁿ is an orthogonal set.

True. For example, the vectors (0, 1) and (1, 1) are linearly independent but not orthogonal.

The orthogonal projection of y onto v is the same as the orthogonal projection of y onto cv whenever c ≠ 0.

True. If c is any nonzero scalar and v is replaced by cv in the definition of the orthogonal projection of y onto v, the factors of c cancel: (y • cv)/(cv • cv) · cv = (y • v)/(v • v) · v. So the orthogonal projection of y onto cv is exactly the same as the orthogonal projection of y onto v.
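
A short numeric check, using a hypothetical y, v, and nonzero scalar c, and a small helper proj (not part of the quiz) implementing the projection formula:

```python
import numpy as np

def proj(y, v):
    """Orthogonal projection of y onto the line spanned by v."""
    return (np.dot(y, v) / np.dot(v, v)) * v

y = np.array([4.0, 1.0, 2.0])
v = np.array([1.0, 2.0, 2.0])
c = -3.5                                         # any nonzero scalar

print(np.allclose(proj(y, v), proj(y, c * v)))   # True: the factors of c cancel
```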

For any two n×n invertible matrices U and V, the inverse of UV is (UV)⁻¹ =

V⁻¹U⁻¹

To show that (UV)⁻¹ = (UV)ᵀ, apply the property that states that for matrices U and V with sizes appropriate for multiplication and addition, (UV)ᵀ =

VᵀUᵀ

