Ch 6&7 True/False

(6.1) For a square matrix A, vectors in Col A are orthogonal to vectors in Nul A.

False. Vectors in Row A are orthogonal to vectors in Nul A; Col A is orthogonal to Nul A^T, not to Nul A in general.

(6.5) If A has a QR factorization, say A = QR, then the best way to find the least-squares solution of Ax=b is to compute x^ = (R^-1)(Q^T)b.

False. It is faster and more numerically reliable to solve Rx^ = (Q^T)b by back-substitution than to compute R^-1 explicitly.
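
A quick numerical sketch of the point (assuming numpy and scipy are available; the matrix A and vector b below are made up for illustration): solving Rx^ = (Q^T)b by back-substitution avoids forming R^-1.

import numpy as np
from scipy.linalg import solve_triangular

A = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]])
b = np.array([6.0, 0.0, 0.0])

Q, R = np.linalg.qr(A)                  # Q is 3x2 with orthonormal columns, R is 2x2 upper triangular
x_inv = np.linalg.inv(R) @ Q.T @ b      # the formula in the statement: forms R^-1 explicitly
x_back = solve_triangular(R, Q.T @ b)   # preferred: back-substitution on the triangular system

print(x_inv, x_back)                    # same x^ here; back-substitution is cheaper and more stable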

(6.5) The normal equations always provide a reliable method for computing least-squares solutions.

False. When A^T A is ill-conditioned, small errors in the computation can produce large errors in x^, so the normal equations are not always reliable.
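
One reason (a sketch assuming numpy; the matrix below is made up to have nearly dependent columns) is that forming A^T A roughly squares the condition number of A:

import numpy as np

eps = 1e-6
A = np.array([[1.0, 1.0],
              [1.0, 1.0 + eps],
              [1.0, 1.0 - eps]])   # nearly dependent columns

print(np.linalg.cond(A))           # large (on the order of 1e6)
print(np.linalg.cond(A.T @ A))     # roughly the square of that, so the normal equations lose accuracy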

(7.1) An orthogonal matrix is orthogonally diagonalizable.

False. Only symmetric matrices are orthogonally diagonalizable, and an orthogonal matrix need not be symmetric (e.g., a rotation matrix).

(7.1) The dimension of an eigenspace of a symmetric matrix is sometimes less than the multiplicity of the corresponding eigenvalue.

False. For a symmetric matrix, the dimension of each eigenspace always equals the multiplicity of the corresponding eigenvalue.

(7.1) There are symmetric matrices that are not orthogonally diagonalizable.

False. By the Spectral Theorem, every symmetric matrix is orthogonally diagonalizable.

(6.5) The least-squares solution of Ax=b is the point in the column space of A closest to b.

False. If x^ is the least-squares solution, then Ax^ is the point in the column space of A closest to b.
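
A numerical check (a sketch assuming numpy; A and b are made up): x^ is a vector of weights, while Ax^ is the closest point to b in Col A.

import numpy as np

A = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]])
b = np.array([6.0, 0.0, 0.0])

x_hat, *_ = np.linalg.lstsq(A, b, rcond=None)   # least-squares solution (a vector in R^2)
b_hat = A @ x_hat                               # point in Col A closest to b (a vector in R^3)

print(A.T @ (b - b_hat))                        # approximately zero: the residual is orthogonal to Col A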

(6.3) The orthogonal projection y^ of y onto a subspace W can sometimes depend on the orthogonal basis for W used to compute y^.

False. The projection y^ is the same no matter which orthogonal basis for W is used.
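
A small check of this (a sketch assuming numpy; the plane and the two orthogonal bases are chosen for illustration):

import numpy as np

y = np.array([1.0, 2.0, 3.0])

# Two different orthogonal bases for the same plane W (the xy-plane in R^3).
basis1 = [np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])]
basis2 = [np.array([1.0, 1.0, 0.0]), np.array([1.0, -1.0, 0.0])]

def proj(y, basis):
    # y^ = sum of (y.v / v.v) v over the orthogonal basis
    return sum((y @ v) / (v @ v) * v for v in basis)

print(proj(y, basis1))   # [1. 2. 0.]
print(proj(y, basis2))   # [1. 2. 0.] -- same y^ from a different orthogonal basis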

(6.2) A matrix with orthonormal columns is an orthogonal matrix.

False. An orthogonal matrix must also be square.

(6.2) If a set S = {u1, . . . , un} has the property that ui·uj = 0 whenever i ≠ j, then S is an orthonormal set.

False. The vectors need not have length 1, so S is an orthogonal set but not necessarily an orthonormal set.

(6.1) For any scalar c, ||cv|| = c||v||.

False. ||cv|| = |c| ||v||; the stated formula fails when c is negative.

(6.2) If the vectors in an orthogonal set of nonzero vectors are normalized, then some of the new vectors may not be orthogonal

False. Normalizing just changes the magnitude of the vectors, it doesn't affect orthogonality.

(6.2) Not every orthogonal set in R^n is linearly independent.

True. An orthogonal set may contain the zero vector (for example, {0}), and such a set is linearly dependent; only an orthogonal set of nonzero vectors is guaranteed to be linearly independent.

(6.3) The best approximation to y by elements of a subspace W is given by the vector y − projW(y).

False. The best approximation is projW y.

(6.2) If L is a line through 0 and if y^ is the orthogonal projection of y onto L, then ||y^|| gives the distance from y to L

False. The distance is ||y − y^||
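
A small example of the difference (a sketch assuming numpy; the vectors are made up):

import numpy as np

y = np.array([3.0, 4.0])
v = np.array([1.0, 0.0])           # L is the line through 0 spanned by v

y_hat = (y @ v) / (v @ v) * v      # orthogonal projection of y onto L
print(np.linalg.norm(y_hat))       # 3.0 -- the length of y^, NOT the distance to L
print(np.linalg.norm(y - y_hat))   # 4.0 -- the distance from y to L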

(6.5) If x^ is a least squares solution of Ax=b, then x^=(A^TA)^-1(A^T)b.

False. The formula applies only when the columns of A are linearly independent.
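
When the columns of A are linearly independent, the formula does work; a quick check (a sketch assuming numpy; A and b are made up):

import numpy as np

A = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]])   # linearly independent columns
b = np.array([6.0, 0.0, 0.0])

x_formula = np.linalg.inv(A.T @ A) @ A.T @ b         # the formula in the statement
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)      # reference least-squares solution
print(x_formula, x_lstsq)                            # agree, because A^T A is invertible here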

(6.5) A least-squares solution of Ax = b is a vector x^ such that ||b − Ax|| ≤ ||b − Ax^|| for all x in R^n

False. The inequality is reversed; a least-squares solution x^ satisfies ||b − Ax^|| ≤ ||b − Ax|| for all x in R^n.

(7.1) An n × n symmetric matrix has n distinct real eigenvalues.

False. The eigenvalues are all real and there are n of them counting multiplicities, but they need not be distinct (e.g., the identity matrix).

(6.3) If an n × p matrix U has orthonormal columns, then UU^T x = x for all x in R^n.

False. UU^T x is the orthogonal projection of x onto Col U; it equals x for every x only when U is square, so that Col U = R^n.

(6.4) If {v1, v2, v3} is an orthogonal basis for W, then multiplying v3 by a scalar c gives a new orthogonal basis {v1, v2, cv3}

False. The scalar must be nonzero; if c = 0, then cv3 is the zero vector and {v1, v2, cv3} is no longer a basis.

(6.4) If W = Span {x1, x2, x3} with {x1, x2, x3} linearly independent, and if {v1, v2, v3} is an orthogonal set in W, then {v1, v2, v3} is a basis for W.

False. The orthogonal set could contain the zero vector, so it need not be linearly independent. An orthogonal set of three nonzero vectors in W would be an (orthogonal) basis for W.

(6.1) If the distance from u to v equals the distance from u to −v, then u and v are orthogonal

True

(6.1) For an m × n matrix A, vectors in the null space of A are orthogonal to vectors in the row space of A.

True.

(6.1) For any scalar c, u · (cv) = c(u · v)

True.

(6.1) If vectors v1, . . . , vp span a subspace W and if x is orthogonal to each vj for j = 1, . . . , p then x is in W^⊥

True.

(6.1) If x is orthogonal to every vector in a subspace W , then x is in W^⊥

True.

(6.1) u · v − v · u = 0

True.

(6.1) v · v = ||v||^2

True.

(6.2) If the columns of an m × n matrix A are orthonormal, then the linear mapping x |→ Ax preserves lengths

True.
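
A quick numerical check (a sketch assuming numpy; U is taken as the QR factor of a made-up matrix so that its columns are orthonormal):

import numpy as np

rng = np.random.default_rng(0)
U, _ = np.linalg.qr(rng.normal(size=(5, 3)))      # 5x3 matrix with orthonormal columns
x = np.array([1.0, -2.0, 3.0])

print(np.linalg.norm(U @ x), np.linalg.norm(x))   # equal: ||Ux|| = ||x||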

(6.2) If y is a linear combination of nonzero vectors from an orthogonal set, then the weights in the linear combination can be computed without row operations on a matrix

True.
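
Each weight is just a ratio of dot products, cj = (y·uj)/(uj·uj). A small sketch (assuming numpy; the orthogonal set and the weights are made up):

import numpy as np

u1 = np.array([1.0, 1.0, 0.0])
u2 = np.array([1.0, -1.0, 0.0])
u3 = np.array([0.0, 0.0, 2.0])     # {u1, u2, u3} is an orthogonal set
y = 2 * u1 - 3 * u2 + 0.5 * u3     # y built with known weights 2, -3, 0.5

weights = [(y @ u) / (u @ u) for u in (u1, u2, u3)]   # no row reduction needed
print(weights)                     # [2.0, -3.0, 0.5]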

(6.2) Not every linearly independent set in R^n is an orthogonal set.

True.

(6.2) The orthogonal projection of y onto v is the same as the orthogonal projection of y onto cv whenever c ≠ 0

True.

(6.3) For each y and each subspace W, the vector y − projw(y) is orthogonal to W

True.

(6.3) If W is a subspace of R^n and if v is in both W and W^⊥, then v must be the zero vector.

True.

(6.3) If the columns of an n × p matrix U are orthonormal, then UU^T y is the orthogonal projection of y onto the column space of U.

True.
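
A numerical check (a sketch assuming numpy; U is the QR factor of a made-up matrix, so its columns are orthonormal):

import numpy as np

A = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]])
U, _ = np.linalg.qr(A)               # 3x2 with orthonormal columns; Col U = Col A
y = np.array([6.0, 0.0, 0.0])

p = U @ U.T @ y                      # claimed orthogonal projection of y onto Col U
print(U.T @ (y - p))                 # approximately zero: y - p is orthogonal to Col U
print(np.allclose(U @ U.T @ p, p))   # True: projecting a second time changes nothing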

(6.3) If y = z1 + z2, where z1 is in a subspace W and z2 is in W^⊥, then z1 must be the orthogonal projection of y onto W

True.

(6.3) If y is in a subspace W , then the orthogonal projection of y onto W is y itself.

True.

(6.3) In the Orthogonal Decomposition Theorem, each term in formula (2) for y^ is itself an orthogonal projection of y onto a subspace of W

True.

(6.4) If A = QR, where Q has orthonormal columns, then R = (Q^T)(A)

True.

(6.4) If x is not in a subspace W, then x - projw(x) is not zero.

True.

(6.4) In a QR factorization, say A = QR (when A has linearly independent columns), the columns of Q form an orthonormal basis for the column space of A.

True.

(6.4) The Gram-Schmidt process produces from a linearly independent set {x1, . . . , xp} an orthogonal set {v1, . . . , vp} with the property that for each k, the vectors v1, . . . , vk span the same subspace as that spanned by x1, . . . , xk.

True.
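
A bare-bones Gram-Schmidt sketch (assuming numpy; no normalization, and the vectors x1, x2, x3 are made up) that also checks the span property for k = 2:

import numpy as np

def gram_schmidt(xs):
    # Subtract from each x its projections onto the v's built so far.
    vs = []
    for x in xs:
        v = x - sum((x @ u) / (u @ u) * u for u in vs)
        vs.append(v)
    return vs

x1 = np.array([1.0, 1.0, 1.0, 1.0])
x2 = np.array([0.0, 1.0, 1.0, 1.0])
x3 = np.array([0.0, 0.0, 1.0, 1.0])

v1, v2, v3 = gram_schmidt([x1, x2, x3])
print(v1 @ v2, v1 @ v3, v2 @ v3)     # all approximately zero
print(np.linalg.matrix_rank(np.column_stack([x1, x2, v1, v2])))   # 2: Span{v1, v2} = Span{x1, x2}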

(6.5) A least squares solution of Ax=b is a list of weights that, when applied to the columns of A, produces the orthogonal projection of b onto Col A.

True.

(6.5) Any solution of A^T Ax = A^T b is a least squares solution of Ax = b.

True.

(6.5) If b is in the column space of A, then every solution of Ax=b is a least-squares solution.

True.

(6.5) The general least squares problem is to find an x that makes Ax as close as possible to b.

True.

(7.1) An n × n orthogonally diagonalizable matrix must be symmetric.

True.

(7.1) For a nonzero v in R^n, the matrix vv^T is called a projection matrix.

False. vv^T is a projection matrix only when v is a unit vector; for a general nonzero v, the matrix that projects onto Span{v} is (1/(v^T v)) vv^T.

(7.1) If A^T = A and if vectors u and v satisfy Au = 3u and Av = 4v, then u · v = 0.

True. For a symmetric matrix, eigenvectors corresponding to distinct eigenvalues (here 3 and 4) are orthogonal.
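
A quick check of the underlying fact (a sketch assuming numpy; the symmetric matrix is made up):

import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])       # symmetric, eigenvalues 1 and 3
vals, vecs = np.linalg.eigh(A)   # eigh is numpy's symmetric eigensolver
print(vals)                      # [1. 3.]
print(vecs[:, 0] @ vecs[:, 1])   # approximately 0: eigenvectors for distinct eigenvalues are orthogonal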

(7.1) If B = PDP^T, where P^T = P^-1 and D is a diagonal matrix, then B is a symmetric matrix.

True. B^T = (PDP^T)^T = PD^T P^T = PDP^T = B.

(6.1) If ||u||^2 + ||v||^2 = ||u + v||^2, then u and v are orthogonal.

True. By Pythagorean Theorem.

(6.5) A least-squares solution of Ax = b is a vector x^ that satisfies Ax^ = b^, where b^ is the orthogonal projection of b onto Col A.

True. Remember that the projection b^ gives the best approximation to b by elements of Col A.

(6.2) An orthogonal matrix is invertible.

True. The columns are linearly independent because they form an orthonormal set, so the matrix is invertible by the Invertible Matrix Theorem.

(6.5) If the columns of A are linearly independent, then the equation Ax = b has exactly one least squares solution

True. Then A^T A is invertible so we can solve A^T Ax = A^T b for x by taking the inverse.

(6.3) If z is orthogonal to u1 and to u2 and if W = Span{u1, u2} then z must be in W^⊥.

True. z will be orthogonal to any linear combination of u1 and u2.

