Linear Algebra Midterm 3
Determine if the set of vectors is orthonormal.
Check that the inner product of each pair of vectors equals 0, and check that each vector is a unit vector, i.e., u • u = 1 and v • v = 1.
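A quick numerical version of this check, as a sketch (the vectors below are made-up examples, not from any particular problem):

```python
# Sketch: check whether a set of vectors is orthonormal.
import numpy as np

vectors = [np.array([1.0, 0.0, 0.0]),
           np.array([0.0, 1.0, 0.0])]

# Each pair must have inner product 0, and each vector must satisfy u . u = 1.
is_orthonormal = all(
    np.isclose(u @ v, 1.0 if i == j else 0.0)
    for i, u in enumerate(vectors)
    for j, v in enumerate(vectors)
)
print(is_orthonormal)  # True for this example set
```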
Let y, u1 and u2 be vectors in ℝ^3. Find the distance from y to the plane in ℝ^3 spanned by u1 and u2.
Find ŷ = (y•u1/u1•u1)u1 + (y•u2/u2•u2)u2, then find y - ŷ. Square the entries of y - ŷ, sum them, and take the square root; this is the distance ||y - ŷ||.
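A small NumPy sketch of the same steps, using made-up vectors and assuming {u1, u2} is already an orthogonal set (otherwise Gram-Schmidt would be needed first):

```python
# Sketch: distance from y to the plane in R^3 spanned by u1 and u2.
import numpy as np

y  = np.array([1.0, 2.0, 3.0])
u1 = np.array([1.0, 1.0, 0.0])
u2 = np.array([1.0, -1.0, 0.0])

# y_hat = (y.u1/u1.u1)u1 + (y.u2/u2.u2)u2, the projection of y onto the plane
y_hat = (y @ u1) / (u1 @ u1) * u1 + (y @ u2) / (u2 @ u2) * u2
distance = np.linalg.norm(y - y_hat)   # square the entries of y - y_hat, sum, square root
print(distance)                        # 3.0 here, since the plane is the xy-plane
```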
How can the characteristic polynomial of a matrix A be determined?
Find det(A-λI), where λ is a scalar.
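For a concrete case, the polynomial can be computed symbolically; this sketch uses SymPy on a made-up 2x2 matrix:

```python
# Sketch: compute det(A - lambda*I) for a small example matrix.
import sympy as sp

lam = sp.symbols('lambda')
A = sp.Matrix([[2, 1],
               [1, 2]])
char_poly = sp.expand((A - lam * sp.eye(2)).det())
print(char_poly)  # lambda**2 - 4*lambda + 3
```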
For a system of linear equations Xβ=y, what is the design matrix X?
The first column is all 1s; the second column contains the x-coordinates of the data points.
What does it mean if A is diagonalizable?
If A is diagonalizable, then A = PDP^-1 for some invertible matrix P and diagonal matrix D.
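A numerical sketch of this factorization, using a made-up symmetric matrix (which is guaranteed to be diagonalizable):

```python
# Sketch: diagonalize an example matrix and confirm A = P D P^(-1).
import numpy as np

A = np.array([[4.0, 1.0],
              [1.0, 4.0]])
eigenvalues, P = np.linalg.eig(A)   # columns of P are eigenvectors of A
D = np.diag(eigenvalues)            # diagonal matrix of the eigenvalues
print(np.allclose(A, P @ D @ np.linalg.inv(P)))  # True when A is diagonalizable
```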
Which theorem could help prove that one of the criteria necessary for a set of vectors to be an orthogonal basis for a subspace W of ℝ^n follows from another?
If S={u1, . . . up} is an orthogonal set of nonzero vectors in ℝ^n, then S is linearly independent and hence is a basis for the subspace spanned by S.
Determine whether the given matrix is regular. Explain your answer.
P = [ 2/7  0    4/5  0   ]
    [ 0    2/3  0    1/2 ]
    [ 5/7  0    1/5  0   ]
    [ 0    1/3  0    1/2 ]
It is not regular. There is no power of P that contains only strictly positive entries.
Determine whether the given matrix is regular. Explain your answer.
P = [ 1/3  0    1/2 ]
    [ 1/3  1/2  1/2 ]
    [ 1/3  1/2  0   ]
It is regular. The first power of P that contains only strictly positive entries is P^2.
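A sketch of how this kind of check can be done numerically, using the 3x3 matrix above; the cutoff of 10 powers is an arbitrary choice:

```python
# Sketch: a stochastic matrix P is regular if some power of P has only strictly positive entries.
import numpy as np

P = np.array([[1/3, 0,   1/2],
              [1/3, 1/2, 1/2],
              [1/3, 1/2, 0  ]])

power = P.copy()
for k in range(1, 10):               # check the first several powers
    if np.all(power > 0):
        print(f"P^{k} has only strictly positive entries, so P is regular")
        break
    power = power @ P
```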
Since X^TX is ________, any equation X^TXx = b, such as the normal equation __________ has a unique solution according to the __________. This solution is the least-squares solution to the system Xβ=y. So, the normal equations have a unique solution if and only if the data include at least two data points with different x-coordinates.
Since X^TX is invertible, any equation X^TXx = b, such as the normal equation X^TXβ = X^Ty has a unique solution according to the Invertible Matrix Theorem. This solution is the least-squares solution to the system Xβ=y. So, the normal equations have a unique solution if and only if the data include at least two data points with different x-coordinates.
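A sketch of the whole least-squares setup with made-up data points; X is the design matrix described above and β̂ solves the normal equations:

```python
# Sketch: build the design matrix and solve X^T X beta = X^T y for the line y = b0 + b1*x.
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 2.0, 2.0, 4.0])

X = np.column_stack([np.ones_like(x), x])     # first column all 1s, second column the x-values
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)  # unique since at least two x-values differ
print(beta_hat)                               # [intercept, slope]
```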
If A is diagonalizable, then A has n distinct eigenvalues.
The statement is false. A diagonalizable matrix can have fewer than n distinct eigenvalues and still have n linearly independent eigenvectors.
A matrix A is diagonalizable if A has n eigenvectors.
The statement is false. A diagonalizable matrix must have n linearly independent eigenvectors.
If A is invertible, then A is diagonalizable.
The statement is false. An invertible matrix may have fewer than n linearly independent eigenvectors, making it not diagonalizable.
If z is orthogonal to u1 and to u2 and if W = Span{u1, u2}, then z must be in W⊥.
The statement is true. Since z is orthogonal to u1 and to u2, it is orthogonal to every vector in Span{u1, u2}, which is W; therefore z is in W⊥.
The least-squares solution, β̂, to Xβ=y is the unique solution to . . .
X^TXβ = X^Ty
Why does this show that a square matrix U with orthonormal columns is invertible?
A square matrix is invertible if and only if its columns are linearly independent.
What is the inverse of A?
A^-1=PD^-1P^-1
If the distance from u to v equals the distance from u to -v, then u and v are orthogonal.
By the definition of orthogonal, u and v are orthogonal if and only if u • v = 0. Since ||u - v||^2 = ||u||^2 + ||v||^2 - 2u•v and ||u + v||^2 = ||u||^2 + ||v||^2 + 2u•v, this happens if and only if 2u•v = -2u•v, which happens if and only if the squared distance from u to v equals the squared distance from u to -v. Requiring the squared distances to be equal is the same as requiring the distances to be equal, so the given statement is true.
Compute the distance from y to the line through u and the origin, where y and u are vectors in ℝ^2.
Calculate ŷ = (y•u/u•u)u, then calculate y - ŷ. Square the entries of y - ŷ, sum them, and take the square root to find the distance.
A least-squares solution is a solution to Ax=b.
False
b̂ is the least-squares solution of Ax=b
False
A matrix with orthonormal columns is an orthogonal matrix.
False. A matrix with orthonormal columns is an orthogonal matrix if the matrix is also square.
If {v1, v2, v3} is an orthogonal basis for W,then multiplying v3 by a scalar c gives a new orthogonal basis {v1, v2, cv3}.
False. If the scalar c is zero, then cv3 = 0 and {v1, v2, 0} is not an orthogonal basis.
If the vectors in an orthogonal set of nonzero vectors are normalized, then some of the new vectors may not be orthogonal.
False. Normalization changes all nonzero vectors to have unit length, but does not change their relative angles. Therefore, orthogonal vectors will always remain orthogonal after they are normalized.
If L is a line through 0 and if ŷ is the orthogonal projection of y onto L, then ||ŷ|| gives the distance from y to L.
False. The distance from y to L is given by ||y - ŷ||.
A is a 3x3 matrix with two eigenvalues. Each eigenspace is one-dimensional. Is A diagonalizable? Why?
No. The sum of the dimensions of the eigenspaces is 2, but the matrix has 3 columns. For A to be diagonalizable, the sum of the dimensions of the eigenspaces must equal the number of columns.
Why is it true that the columns of U are linearly independent?
Orthonormal sets are linearly independent sets.
Verify that {u1, u2} is an orthogonal set, and then find the orthogonal projection of y onto {u1, u2}.
Check that u1 • u2 = 0. Then compute ŷ = (y•u1/u1•u1)u1 + (y•u2/u2•u2)u2.
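The same procedure as a NumPy sketch, with made-up vectors:

```python
# Sketch: verify {u1, u2} is orthogonal, then project y onto Span{u1, u2}.
import numpy as np

y  = np.array([3.0, 1.0, 2.0])
u1 = np.array([1.0, 0.0, 1.0])
u2 = np.array([1.0, 0.0, -1.0])

print(np.isclose(u1 @ u2, 0.0))    # True, so the set is orthogonal
y_hat = (y @ u1) / (u1 @ u1) * u1 + (y @ u2) / (u2 @ u2) * u2
print(y_hat)                       # [3. 0. 2.], the orthogonal projection of y
```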
How do these calculations show that {u1, u2, u3} is an orthogonal basis for ℝ^3?
Since each inner product is 0, the vectors form an orthogonal set of three nonzero vectors in ℝ^3. By the theorem above, they are therefore linearly independent, and three linearly independent vectors in ℝ^3 form a basis for ℝ^3.
Given the property from the previous step, which of the following are equivalent properties to that property?
The equation Xβ=y has a unique least-squares solution for each y in ℝ^m, and the matrix X^TX is invertible.
For a square matrix A, vectors in ColA are orthogonal to vectors in NulA.
The given statement is false. By the theorem of orthogonal complements, it is known that vectors in ColA are orthogonal to vectors in NulA^T. Using the definition of orthogonal complements, vectors in ColA are orthogonal to vectors in NulA if and only if the row space and column space of A are the same, which is not necessarily true.
For any scalar c, ||cv|| = c||v||
The given statement is false. By definition, ||cv|| = |c| ||v||, which is always nonnegative; when c is negative, the value of c||v|| is negative, so the equality fails.
For any scalar c, u • (cv)=c (u • v)
The given statement is true because this is a valid property of the inner product.
If ||u||^2 + ||v||^2 = ||u+v||^2, then u and v are orthogonal.
The given statement is true. By the Pythagorean Theorem, two vectors u and v are orthogonal if and only if ||u||^2 + ||v||^2 = ||u+v||^2.
v • v = ||v||^2
The given statement is true. By the definition of the length of a vector v, ||v|| = √(v • v), so ||v||^2 = v • v.
For an mxn matrix A, vectors in the null space of A are orthogonal to vectors in the row space of A.
The given statement is true. By the theorem of orthogonal complements, (RowA)^⊥ = NulA. It follows, by the definition of orthogonal complements, that vectors in the null space of A are orthogonal to vectors in the row space of A.
If vectors v1,...,vp span a subspace W and if x is orthogonal to each vj for j=1,...,p, then x is in W^⊥.
The given statement is true. If x is orthogonal to each vj, then x is also orthogonal to any linear combination of those vj. Since any vector in W can be described as a linear combination of vj, x is orthogonal to all vectors in W.
If x is orthogonal to every vector in a subspace W, then x is in W^⊥.
The given statement is true. If x is orthogonal to every vector in W, then x is said to be orthogonal to W. The set of all vectors x that are orthogonal to W is, by definition, W^⊥, so x is in W^⊥.
u • v - v • u = 0
The given statement is true. Since the inner product is commutative, u • v = v • u. Subtracting v•u from each side of this equation gives u•v-v•u=0.
Given a 2×2 matrix A, where A = PCP^-1 and C is of the form
[ a  -b ]
[ b   a ],
what is the form of P?
The matrix [ Re v  Im v ], where A has a complex eigenvalue λ = a - bi (b ≠ 0) and an associated eigenvector v in ℂ^2.
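A numerical sketch of this factorization on a made-up matrix with complex eigenvalues; the index choice just picks the eigenvalue with negative imaginary part so that λ = a - bi with b > 0:

```python
# Sketch: build P = [Re v  Im v] and C = [[a, -b], [b, a]] and confirm A = P C P^(-1).
import numpy as np

A = np.array([[0.0, -2.0],
              [1.0,  2.0]])                 # example matrix with eigenvalues 1 +/- i

eigenvalues, eigenvectors = np.linalg.eig(A)
idx = np.argmin(eigenvalues.imag)           # eigenvalue lambda = a - b*i with b > 0
lam, v = eigenvalues[idx], eigenvectors[:, idx]

a, b = lam.real, -lam.imag
C = np.array([[a, -b],
              [b,  a]])
P = np.column_stack([v.real, v.imag])       # P = [Re v  Im v]
print(np.allclose(A, P @ C @ np.linalg.inv(P)))  # True
```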
Note that the distance from a point y in ℝ^3 to a subspace W is defined as the distance from y to the closest point in W. What is the closest point in W to y?
The orthogonal projection of y onto W
A least-squares solution of Ax=b is a vector x̂ such that ||b - Ax|| ≤ ||b - Ax̂|| for all x in ℝ^n.
The statement is false because a least-squares solution of Ax=b is a vector x̂ such that ||b - Ax̂|| ≤ ||b - Ax|| for all x in ℝ^n.
If A is diagonalizable, then A is invertible.
The statement is false because invertibility depends on 0 not being an eigenvalue. A diagonalizable matrix may or may not have 0 as an eigenvalue.
A is diagonalizable if and only if A has n eigenvalues, counting multiplicities.
The statement is false because the eigenvalues of A may not produce enough eigenvectors to form a basis of ℝ^n.
A is diagonalizable if A=PDP^-1 for some matrix D and some invertible matrix P.
The statement is false because the symbol D does not automatically denote a diagonal matrix.
The orthogonal projection ŷ of y onto a subspace W can sometimes depend on the orthogonal basis for W used to compute ŷ.
The statement is false because the uniqueness property of the orthogonal decomposition y = ŷ+z indicates that, no matter which orthogonal basis of W is used to compute it, ŷ is always the same.
If ℝ^n has a basis of eigenvectors of A, then A is diagonalizable.
The statement is true because A is diagonalizable if and only if there are enough eigenvectors to form a basis of ℝ^n.
A least-squares solution of Ax=b is a vector x̂ that satisfies Ax̂=b̂, where b̂ is the orthogonal projection of b onto Col A.
The statement is true because b̂ is the closest point in Col A to b. So Ax=b̂ is consistent, and an x̂ such that Ax̂=b̂ is a least-squares solution of Ax=b.
If y is in a subspace W, then the orthogonal projection of y onto W is y itself.
The statement is true because, for an orthogonal basis B = {u1, . . ., up} of W, both y and proj(W)y can be written as linear combinations of the vectors in B with equal weights.
If the columns of A are linearly independent, then the equation Ax=b has exactly one least-squares solution.
The statement is true because if the columns of A are linearly independent, then A^TA is invertible and x̂ = (A^TA)^-1 A^Tb is the least-squares solution to Ax=b.
If the columns of an nxp matrix U are orthonormal, then UU^Ty is the orthogonal projection of y onto the column space of U.
The statement is true because the columns of U are linearly independent and thus form an orthonormal basis for Col U. So, proj(ColU)y = UU^Ty for all y in ℝ^n.
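A small sketch of this fact with a made-up U whose columns are orthonormal:

```python
# Sketch: U U^T y is the orthogonal projection of y onto Col U when U has orthonormal columns.
import numpy as np

U = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])          # 3x2 example with orthonormal columns
y = np.array([2.0, 3.0, 5.0])

print(U @ U.T @ y)                  # [2. 3. 0.], the projection of y onto Col U
```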
The general least-squares problem is to find an x that makes Ax as close as possible to b.
The statement is true because the general least-squares problem attempts to find an x that minimizes ||b −Ax||.
Any solution of A^TAx = A^Tb is a least-squares solution of Ax=b.
The statement is true because the set of least-squares solutions of Ax=b coincides with the nonempty set of solutions of the normal equations, defined as A^TAx = A^T b
For each y and each subspace W , the vector y − proj(w) y is orthogonal to W.
The statement is true because y can be uniquely written in the form y = proj(W)y + z, where proj(W)y is in W and z is in W⊥, and it follows that z = y - proj(W)y.
If AP=PD, with D diagonal, then the nonzero columns of P must be eigenvectors of A.
The statement is true. Let v be a nonzero column of P and let λ be the corresponding diagonal entry of D. Then AP=PD implies that Av=λv, which means that v is an eigenvector of A.
Write y as the sum of a vector in Span {u} and a vector orthogonal to u.
The vector y can be written as the sum of two orthogonal vectors, one in Span {u}, ŷ, and one orthogonal to u, y - ŷ.
Find the best approximation to z by vectors of the form c1v1+c2v2. Let W be a subspace of ℝ^n, let y be any vector in ℝ^n,and let ŷ be the orthogonal projection of y onto W.
Then ŷ is the closest point in W to y in the sense that ||y - ŷ|| < ||y-v|| for all v distinct from ŷ.
Let U be a square matrix with orthonormal columns. Explain why U is invertible. Which of the following is true of the columns of U?
They are linearly independent. The inner product of each pair of vectors is 0. Each column vector has unit length of 1.
b̂ is the closest point in Col A to b
True
b̂ is in the column space of A
True
The least squares solution of Ax = b is a vector, x, that minimizes the distance between b and Ax.
True
The vector Ax is in the column space of A.
True
Thus, the set of least-squares solutions of Xβ=y coincides with the nonempty set of solutions of the normal equations X^TXβ = X^Ty.
True
The Gram-Schmidt process produces from a linearly independent set {x1, . . ., xp} an orthogonal set {v1, . . ., vp} with the property that for each k, the vectors v1, . . ., vk span the same subspace as that spanned by x1, . . ., xk.
True. For Wk = Span{x1, . . ., xk}, set v1 = x1. If {v1, . . ., vk} is an orthogonal basis for Wk, then v_(k+1) = x_(k+1) - proj(Wk) x_(k+1) is orthogonal to Wk and lies in W_(k+1). Also v_(k+1) ≠ 0. Hence, {v1, . . ., v_(k+1)} is an orthogonal basis for W_(k+1).
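A compact sketch of the Gram-Schmidt process itself, on a made-up linearly independent set; the function name gram_schmidt is just illustrative:

```python
# Sketch: produce an orthogonal set {v1, ..., vp} from a linearly independent set {x1, ..., xp}.
import numpy as np

def gram_schmidt(xs):
    vs = []
    for x in xs:
        v = x.astype(float)
        for u in vs:
            v = v - (x @ u) / (u @ u) * u   # subtract the projection of x onto each earlier v
        vs.append(v)
    return vs

xs = [np.array([1.0, 1.0, 0.0]), np.array([1.0, 0.0, 1.0])]
v1, v2 = gram_schmidt(xs)
print(np.isclose(v1 @ v2, 0.0))             # True: the resulting set is orthogonal
```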
If y is a linear combination of nonzero vectors from an orthogonal set, then the weights in the linear combination can be computed without row operations on a matrix.
True. For each y in W, the weights in the linear combination y = c1u1 + . . . + cpup can be computed by cj = (y•uj)/(uj•uj), where j = 1, . . ., p.
Not every linearly independent set in ℝ^n is an orthogonal set.
True. For example, the vectors
[ 0 ]     [ 1 ]
[ 1 ] and [ 1 ]
are linearly independent but not orthogonal.
If A=QR, where Q has orthonormal columns, then R=Q^TA.
True. Since Q has orthonormal columns, Q^TQ = I. So Q^TA = Q^T(QR) = IR = R.
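A quick numerical check of this identity on a made-up matrix, using NumPy's QR factorization:

```python
# Sketch: verify R = Q^T A for a QR factorization.
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [0.0, 1.0]])
Q, R = np.linalg.qr(A)            # Q has orthonormal columns, R is upper triangular
print(np.allclose(R, Q.T @ A))    # True, since Q^T Q = I
```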
Which of the following criteria are necessary for a set of vectors to be an orthogonal basis for a subspace W of ℝ^n? Select all that apply.
Vectors must span W. Vectors must form an orthogonal set.
What are the normal equations that should be used to solve for β?
X^TXβ = X^Ty
What does it mean if A is invertible?
Zero is not an eigenvalue of A, so the diagonal entries of D are nonzero and D is invertible.
Identify a nonzero 2×2 matrix that is invertible but not diagonalizable.
The matrix
[ 1  1 ]
[ 0  1 ]
is invertible but not diagonalizable (the identity matrix
[ 1  0 ]
[ 0  1 ]
is not an example, since it is already diagonal).
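A sketch that checks both properties of this example matrix symbolically (is_diagonalizable is a SymPy Matrix method):

```python
# Sketch: confirm [[1, 1], [0, 1]] is invertible but not diagonalizable.
import sympy as sp

A = sp.Matrix([[1, 1],
               [0, 1]])
print(A.det() != 0)            # True: the determinant is 1, so A is invertible
print(A.is_diagonalizable())   # False: the eigenvalue 1 has only a one-dimensional eigenspace
```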
If at least two data points have different x-coordinates, the columns of the design matrix . . .
are linearly independent
Vectors u and v are orthogonal if . . .
u • v = 0