5.3, 6.1 - True/False
A is diagonalizable if A has n eigenvalues, counting multiplicities.
False - Every n x n matrix has n eigenvalues counting multiplicities (allowing complex eigenvalues), whether or not it is diagonalizable, so this condition gives no information; A also needs n linearly independent eigenvectors.
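A standard counterexample, worked out briefly:
\[
A = \begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix}, \qquad
\det(A - \lambda I) = (1 - \lambda)^2,
\]
so A has the eigenvalue 1 with multiplicity 2 (that is, n = 2 eigenvalues counting multiplicities), but the eigenspace for lambda = 1 is only one-dimensional (spanned by (1, 0)^T), so A is not diagonalizable.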
u . v - v . u = 0
True - The dot product is commutative: u . v = v . u, so the difference is 0.
If A is diagonalizable, then A has n distinct eigenvalues.
False - A diagonalizable matrix may have repeated eigenvalues; it only needs n linearly independent eigenvectors.
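A quick example: the 2 x 2 identity matrix is diagonalizable (it is already diagonal) yet has only one distinct eigenvalue:
\[
I_2 = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix} = P D P^{-1} \quad \text{with } P = D = I_2, \qquad
\det(I_2 - \lambda I) = (1 - \lambda)^2 .
\]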
A is diagonalizable if A has n eigenvectors.
False - A must have n LINEARLY INDEPENDENT eigenvectors to be diagonalizable (by the Diagonalization Theorem).
A is diagonalizable if A = PDP^-1 for some matrix D and some invertible matrix P.
False - D must be a diagonal matrix.
If A is invertible, then A is diagonalizable.
False - Invertibility and diagonalizability are independent properties; neither implies the other. For example, the matrix [ 1 1 ] [ 0 1 ] is invertible but not diagonalizable.
If A is diagonalizable, then A is invertible.
False - Diagonalizability does not force invertibility. A matrix is invertible if and only if 0 is not an eigenvalue, and a diagonalizable matrix may or may not have 0 as an eigenvalue; for example, the diagonal matrix [ 1 0 ] [ 0 0 ] is diagonalizable but not invertible.
For a square matrix A, vectors in Col A are orthogonal to vectors in Nul A.
False - The statement fails for the matrix [ 1 1 ] [ 0 0 ]; its column space and null space are not orthogonal.
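Worked out, writing a spanning vector of each space explicitly:
\[
A = \begin{pmatrix} 1 & 1 \\ 0 & 0 \end{pmatrix}, \qquad
\operatorname{Col} A = \operatorname{span}\left\{ \begin{pmatrix} 1 \\ 0 \end{pmatrix} \right\}, \qquad
\operatorname{Nul} A = \operatorname{span}\left\{ \begin{pmatrix} 1 \\ -1 \end{pmatrix} \right\}, \qquad
\begin{pmatrix} 1 \\ 0 \end{pmatrix} \cdot \begin{pmatrix} 1 \\ -1 \end{pmatrix} = 1 \neq 0 .
\]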
For any scalar c, ||cv|| = c||v||.
False - ||cv|| = |c| ||v||, where |c| is the absolute value of c.
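One way to see where the absolute value comes from:
\[
\|c\mathbf{v}\|^2 = (c\mathbf{v}) \cdot (c\mathbf{v}) = c^2 (\mathbf{v} \cdot \mathbf{v}) = c^2 \|\mathbf{v}\|^2
\quad\Longrightarrow\quad
\|c\mathbf{v}\| = \sqrt{c^2}\,\|\mathbf{v}\| = |c|\,\|\mathbf{v}\| .
\]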
If vectors v1, ..., vp span a subspace W and if x is orthogonal to each vj for j = 1, ..., p, then x is in the orthogonal complement of W.
True - Any vector w in W is a linear combination of v1, ..., vp, so x is orthogonal to w as well, which puts x in the orthogonal complement of W.
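Sketch of that step, writing a vector w in W with weights c_1, ..., c_p:
\[
\mathbf{x} \cdot \mathbf{w}
= \mathbf{x} \cdot (c_1 \mathbf{v}_1 + \cdots + c_p \mathbf{v}_p)
= c_1 (\mathbf{x} \cdot \mathbf{v}_1) + \cdots + c_p (\mathbf{x} \cdot \mathbf{v}_p) = 0 .
\]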
If AP = PD, with D diagonal, then the nonzero columns of P must be eigenvectors of A.
True - Column j of AP is A times column j of P, while column j of PD is the j-th diagonal entry of D times column j of P. Equating them shows that each nonzero column of P is an eigenvector of A.
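Written out, with p_j the j-th column of P and d_j the j-th diagonal entry of D:
\[
AP = \begin{pmatrix} A\mathbf{p}_1 & \cdots & A\mathbf{p}_n \end{pmatrix}, \qquad
PD = \begin{pmatrix} d_1 \mathbf{p}_1 & \cdots & d_n \mathbf{p}_n \end{pmatrix}, \qquad
AP = PD \ \Longrightarrow\ A\mathbf{p}_j = d_j \mathbf{p}_j \ \text{for each } j .
\]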
If Rn has a basis of eigenvectors of A, then A is diagonalizable.
True - Let P be the matrix whose columns are the basis eigenvectors (P is invertible because its columns are linearly independent) and let D be the diagonal matrix of the corresponding eigenvalues; then A = PDP^-1.
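A short sketch of that construction, assuming the basis vectors satisfy A v_j = lambda_j v_j:
\[
P = \begin{pmatrix} \mathbf{v}_1 & \cdots & \mathbf{v}_n \end{pmatrix}, \qquad
D = \operatorname{diag}(\lambda_1, \ldots, \lambda_n), \qquad
AP = PD \ \Longrightarrow\ A = P D P^{-1},
\]
where P is invertible because its columns form a basis of R^n.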
If ||u||^2 + ||v||^2 = ||u + v||^2, then u and v are orthogonal.
True - by the Pythagorean Theorem; the equality holds exactly when u and v are orthogonal.
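The expansion behind it:
\[
\|\mathbf{u} + \mathbf{v}\|^2 = (\mathbf{u} + \mathbf{v}) \cdot (\mathbf{u} + \mathbf{v})
= \|\mathbf{u}\|^2 + 2\,\mathbf{u} \cdot \mathbf{v} + \|\mathbf{v}\|^2,
\]
so the given equality holds exactly when u . v = 0.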
For an m x n matrix A, vectors in the null space of A are orthogonal to vectors in the row space of A.
True - by Theorem 3: the orthogonal complement of the row space is the null space, (Row A)-perp = Nul A, for any m x n matrix A.
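The underlying computation: if Ax = 0, each row r_i of A satisfies r_i . x = 0, and every vector in Row A is a linear combination of the rows, so
\[
(c_1 \mathbf{r}_1 + \cdots + c_m \mathbf{r}_m) \cdot \mathbf{x}
= c_1 (\mathbf{r}_1 \cdot \mathbf{x}) + \cdots + c_m (\mathbf{r}_m \cdot \mathbf{x}) = 0 .
\]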
For any scalar c, u.(cv) = c(u . v)
True - by definition
If the distance from u to v equals the distance from u to -v, then u and v are orthogonal.
True - Squaring both distances and expanding the dot products shows the equality forces u . v = 0.
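The worked check:
\begin{align*}
\|\mathbf{u} - \mathbf{v}\|^2 &= \|\mathbf{u}\|^2 - 2\,\mathbf{u} \cdot \mathbf{v} + \|\mathbf{v}\|^2, \\
\|\mathbf{u} - (-\mathbf{v})\|^2 &= \|\mathbf{u}\|^2 + 2\,\mathbf{u} \cdot \mathbf{v} + \|\mathbf{v}\|^2,
\end{align*}
so equal distances force 4 u . v = 0, that is, u . v = 0.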
If x is orthogonal to every vector in a subspace W, then x is in W's orthogonal complement.
True - by definition of the orthogonal complement of W.
v . v = ||v||^2
True - by definition