Chapter 6 True/False
If A has a QR factorization, say A=QR, then the best way to find the least-squares solution of Ax=b is to compute x̂ = R^(-1)Q^T b.
False. It is faster and more numerically stable to solve the triangular system Rx̂ = Q^T b by back-substitution than to compute R^(-1).
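A minimal NumPy sketch of the point (A and b are made-up data, and scipy's solve_triangular does the back-substitution):

```python
import numpy as np
from scipy.linalg import solve_triangular

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 3))   # overdetermined: 6 equations, 3 unknowns
b = rng.standard_normal(6)

Q, R = np.linalg.qr(A)            # reduced QR: Q is 6x3, R is 3x3 upper triangular

# Preferred: back-substitution on the triangular system R x = Q^T b.
x_hat = solve_triangular(R, Q.T @ b, lower=False)

# Same result as forming R^{-1} explicitly, but cheaper and more stable.
x_inv = np.linalg.inv(R) @ (Q.T @ b)
print(np.allclose(x_hat, x_inv))                                 # True
print(np.allclose(x_hat, np.linalg.lstsq(A, b, rcond=None)[0]))  # True
```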
If the vectors in an orthogonal set of nonzero vectors are normalized, then some of the new vectors may not be orthogonal.
False. Normalizing only scales each vector, and scaling preserves orthogonality: dot products that were zero remain zero.
The normal equations always provide a reliable method for computing least-squares solutions.
False. Forming A^TA roughly squares the condition number of A, so the normal equations can give inaccurate results when A is ill-conditioned.
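One way to see this numerically (a sketch; the Vandermonde matrix is just an illustrative ill-conditioned example):

```python
import numpy as np

t = np.linspace(0, 1, 20)
A = np.vander(t, 10, increasing=True)  # polynomial-fit matrix, badly conditioned

print(np.linalg.cond(A))        # already large
print(np.linalg.cond(A.T @ A))  # roughly the square of cond(A)
```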
The orthogonal projection ŷ of y onto a subspace W can sometimes depend on the orthogonal basis for W used to compute ŷ.
False. The orthogonal projection of y onto W is uniquely determined by y and W; every orthogonal basis of W yields the same ŷ.
The best approximation to y by elements of a subspace W is given by the vector y - proj_W y.
False. The Best Approximation Theorem says the best approximation to y is proj_W y itself; y - proj_W y is the component of y orthogonal to W.
If x̂ is a least-squares solution of Ax=b, then x̂ = (A^TA)^(-1)A^T b.
False. The formula applies only when the columns of A are linearly independent, so that A^TA is invertible.
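A small NumPy check of both halves of this answer (random illustrative data): with independent columns the formula matches np.linalg.lstsq; with a repeated column, A^TA is singular and the formula breaks down even though least-squares solutions still exist.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((5, 3))   # generic random columns are independent
b = rng.standard_normal(5)

# Columns independent => A^T A invertible, so the formula gives x-hat.
x_hat = np.linalg.inv(A.T @ A) @ A.T @ b
print(np.allclose(x_hat, np.linalg.lstsq(A, b, rcond=None)[0]))  # True

# Repeat a column: A^T A becomes singular, so the formula no longer applies,
# although least-squares solutions still exist (they are just not unique).
A_dep = np.column_stack([A[:, 0], A[:, 0], A[:, 1]])
print(np.linalg.matrix_rank(A_dep.T @ A_dep))  # 2 < 3: not invertible
```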
The least-squares solution of Ax=b is the point in the column space of A closest to b.
False. If x̂ is a least-squares solution, then Ax̂ is the point in the column space of A closest to b.
A matrix with orthonormal columns is an orthogonal matrix.
False. It must also be square.
A least-squares solution of Ax=b is a vector x̂ such that ||b-Ax|| <= ||b-Ax̂|| for all x in R^n.
False. The inequality should read ||b-Ax̂|| <= ||b-Ax||.
If L is a line through 0 and if ŷ is the orthogonal projection of y onto L, then ||ŷ|| gives the distance from y to L.
False. The distance is ||y - ŷ||.
If an n x p matrix U has orthonormal columns, then UU^T x = x for all x in R^n.
False. The statement is true only when x is in the column space of U. If n > p, the column space of U is not all of R^n, so the statement cannot hold for all x in R^n.
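A concrete sketch with n = 3 and p = 2 (this particular U and these vectors are illustrative): UU^T x returns x only when x lies in Col U; otherwise it returns the orthogonal projection of x onto Col U, which is the fact recorded two items below.

```python
import numpy as np

U = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])        # orthonormal columns, n = 3 > p = 2

x_in  = np.array([2.0, -1.0, 0.0])   # lies in Col U
x_out = np.array([2.0, -1.0, 5.0])   # does not lie in Col U

print(np.allclose(U @ U.T @ x_in, x_in))    # True: x_in is fixed
print(np.allclose(U @ U.T @ x_out, x_out))  # False: third component is projected away
```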
If a set S = {u1,...,up} has the property that u_i · u_j = 0 whenever i != j, then S is an orthonormal set.
False. To be orthonormal, the vectors in S must be unit vectors as well as orthogonal to each other.
If the columns of an n x p matrix U are orthonormal, then UU^T y is the orthogonal projection of y onto the column space of U.
True
If y = z1 + z2, where z1 is in a subspace W and z2 is in W⊥, then z1 must be the orthogonal projection of y onto W.
True
If y is a linear combination of nonzero vectors from an orthogonal set, then the weights in the linear combination can be computed without row operations on a matrix.
True
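A quick illustration (the vectors are chosen arbitrarily): for an orthogonal set, each weight is a single dot-product quotient, c_i = (y · u_i)/(u_i · u_i), with no row reduction needed.

```python
import numpy as np

# An orthogonal (not orthonormal) set in R^3.
u1 = np.array([1.0,  1.0, 0.0])
u2 = np.array([1.0, -1.0, 0.0])
u3 = np.array([0.0,  0.0, 2.0])

y = 3 * u1 - 2 * u2 + 0.5 * u3     # a known linear combination

# Recover each weight from a single dot-product formula.
for u in (u1, u2, u3):
    print(y @ u / (u @ u))         # prints 3.0, -2.0, 0.5
```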
If y is in a subspace W, then the orthogonal projection of y onto W is y itself.
True
If z is orthogonal to u1 and to u2, and if W = Span{u1, u2}, then z must be in W⊥.
True
In the Orthogonal Decomposition Theorem, each term in formula (2) for ŷ is itself an orthogonal projection of y onto a subspace of W.
True
Not every linearly independent set in R^n is an orthogonal set.
True
The general least-squares problem is to find an x that makes Ax as close as possible to b.
True
The orthogonal projection of y onto v is the same as the orthogonal projection of y onto cv whenever c != 0.
True
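A two-line check (y, v, and the scalar are arbitrary): scaling v by a nonzero c changes neither the line Span{v} nor the value of the projection formula.

```python
import numpy as np

def proj(y, v):
    # Orthogonal projection of y onto the line through 0 and v.
    return (y @ v) / (v @ v) * v

y = np.array([3.0, 4.0])
v = np.array([1.0, 2.0])
print(np.allclose(proj(y, v), proj(y, -7.5 * v)))  # True: same projection
```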
Not every orthogonal set in R^n is linearly independent.
True. For example, an orthogonal set containing the zero vector is orthogonal but linearly dependent. Every orthogonal set of nonzero vectors, however, is linearly independent.
If W is a subspace of R^n and if v is in both W and W⊥, then v must be the zero vector.
True
If b is in the column space of A, then every solution of Ax=b is a least-squares solution.
True
If the columns of A are linearly independent, then the equation Ax=b has exactly one least-squares solution.
True
If the columns of an m x n matrix A are orthonormal, then the linear mapping x -> Ax preserves lengths.
True
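A short check (Q below comes from the QR factorization of made-up data, so its columns are orthonormal): ||Qx|| = ||x|| because Q^TQ = I.

```python
import numpy as np

rng = np.random.default_rng(2)
Q, _ = np.linalg.qr(rng.standard_normal((5, 3)))  # Q has orthonormal columns

x = np.array([1.0, -2.0, 3.0])
print(np.isclose(np.linalg.norm(Q @ x), np.linalg.norm(x)))  # True
```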
A least-squares solution of Ax=b is a list of weights that, when applied to the columns of A, produces the orthogonal projection of b onto Col A.
True
A least-squares solution of Ax=b is a vector x̂ that satisfies Ax̂ = b̂, where b̂ is the orthogonal projection of b onto Col A.
True
An orthogonal matrix is invertible.
True
Any solution of A^TAx=A^Tb is a least-squares solution of Ax=b.
True
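A sketch with a rank-deficient A (illustrative data): A^TA is singular, the normal equations have infinitely many solutions, and every one of them minimizes ||b - Ax||.

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 1.0],
              [0.0, 0.0]])        # repeated column: rank 1
b = np.array([1.0, 3.0, 2.0])

# One particular solution of A^T A x = A^T b, via the pseudoinverse.
x1 = np.linalg.pinv(A.T @ A) @ A.T @ b
# Add anything in Nul(A^T A); here (1, -1) spans that null space.
x2 = x1 + np.array([1.0, -1.0])

print(np.allclose(A.T @ A @ x2, A.T @ b))   # True: x2 also solves them
print(np.linalg.norm(b - A @ x1))           # same residual...
print(np.linalg.norm(b - A @ x2))           # ...for both solutions
```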
For each y and each subspace W, the vector y - proj_W y is orthogonal to W.
True