1554 T/F


1.1) The n × n zero matrix can be diagonalized.

0) True; the zero matrix is already diagonal, hence diagonalizable

1.2) Matrix A is a 3 × 3 matrix with two eigenvalues, λ1 and λ2. The geometric multiplicity of λ1 is 1, and the geometric multiplicity of λ2 is 2. A is diagonalizable.

1) True; the geometric multiplicities sum to 1 + 2 = 3 = n, so A has 3 linearly independent eigenvectors and is diagonalizable
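As a quick numerical check (a made-up 3 × 3 example, not from the original set), numpy can confirm diagonalizability by counting independent eigenvectors:

import numpy as np

D = np.diag([2.0, 5.0, 5.0])             # eigenvalue 2 (geo. mult. 1), eigenvalue 5 (geo. mult. 2)
P = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0]])          # an invertible change of basis
A = P @ D @ np.linalg.inv(P)             # 3 x 3 matrix with those multiplicities

evals, evecs = np.linalg.eig(A)
# Diagonalizable iff there are 3 linearly independent eigenvectors,
# i.e. the matrix of eigenvectors has full rank.
print(np.linalg.matrix_rank(evecs))      # 3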

1.1) If A and B are invertible n × n matrices, then the sum A + B is also invertible.

1.1) False; for example, A = I and B = −I are both invertible, but A + B is the zero matrix, which is singular

1.10) If A is invertible, then the columns of A^−1 span R^n

1.10) True; A^−1 is itself invertible, so its columns span R^n

1.11) A 4 × 6 matrix could have rank as large as 6.

1.11) False; rank is the number of pivot columns, and you can't have 6 pivots with only 4 rows: rank ≤ min(4, 6) = 4

1.12) H = {[ x1 x2 ] ∈ R^2 | x_1 = 0} is a subspace.

1.12) True; this is the x2-axis (x2 is free), which contains the zero vector and is closed under addition and scalar multiplication

1.13) V = {x ∈ R^4| x1 − x2 = 0, x4 = 1} is a subspace.

1.13) False; the condition x4 = 1 excludes the zero vector, and every subspace must contain the zero vector

1.14) S = {x = [ x1 x2 x3 ] ∈ R^3 | x1 + 2x2 = 0} is a subspace.

1.14) True; S = Nul([1 2 0]), and the null space of any matrix is a subspace

1.15) L = {x = [ x1 x2 x3 ] ∈ R^3 | x1 ≤ 0} is a subspace

1.15) False; not closed under scalar multiplication: try scaling by a negative number, e.g. (−1)·[−1, 0, 0] = [1, 0, 0], which is not in L

1.16) P = {x = [ x1 x2 x3 ] ∈ R^3 | x1² + x2² + x3² ≤ 1} is a subspace.

1.16) False; not closed under scalar multiplication: [1, 0, 0] is in P, but 2·[1, 0, 0] = [2, 0, 0] is not

1.17) An n × n matrix with a zero row must be singular.

1.17) True; a zero row makes the rows linearly dependent, so det(A) = 0 and A is singular

1.18) All elementary matrices are triangular.

1.18) False; a row-swap elementary matrix is not triangular

1.19) If matrix A is 4 × 3, then the set of solutions to Ax = b is a subspace of R^3

1.19) False; the solution set is a subspace (namely Nul(A)) only when b = 0; for b ≠ 0 it doesn't contain the zero vector

1.2) The set of all possible solutions to Ax = 0 is a subspace.

1.2) True, Nul(A) is always a subspace

1.20) If matrix A is row reduced to echelon form to produce matrix E, then the null spaces of A and E are the same.

1.20) True; row operations don't change the solution set of Ax = 0, so Nul(A) = Nul(E)

1.21) If matrix A is row reduced to echelon form to produce matrix E, then the column spaces of A and E are the same.

1.21) False; row operations generally change the column space (though they preserve which columns are pivot columns)

1.22) If u and v are in subspace S, then u + v is also in S.

1.22) True, subspaces are closed under addition

1.23) If u is in subspace S, then any vector in Span(u) is also in S.

1.23) True; subspaces are closed under addition and scalar multiplication, so every scalar multiple c·u stays in S

1.24) The column space of a matrix is a subspace, pivotal columns are vectors, and the rank of a matrix is an integer.

1.24) True

1.25) Swapping the columns of matrix A does not change the value of det(A).

1.25) False; det(A) = det(A^T), so swapping two columns has the same effect as swapping two rows: it flips the sign of the determinant

1.26) Swapping the rows of matrix A does not change the value of det(A).

1.26) False; swapping two rows flips the sign of det(A)
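A one-line numpy check (toy 2 × 2 example of my own) shows the sign flip:

import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = A[[1, 0], :]                 # swap the two rows of A
print(np.linalg.det(A))          # -2.0
print(np.linalg.det(B))          #  2.0  (same magnitude, opposite sign)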

1.27) The rank of matrix A and the dimension of Col(A) are always equal to each other.

1.27) True, dim(Col(A)) = rank(A)

1.28) Any matrix can be reduced to echelon form by multiplying the matrix by a set of elementary matrices.

1.28) True; each elementary row operation is left-multiplication by an elementary matrix, and a sequence of row operations reduces any matrix to echelon form

1.29) If a matrix is in upper triangular form then it is also in echelon form.

1.29) False; an upper triangular matrix can have a zero row above a non-zero row, which violates echelon form

1.3) The set of all possible solutions to Ax = b is a subspace.

1.3) False; when b ≠ 0 the solution set doesn't include the zero vector

1.30) If a matrix is in echelon form then it is also in upper triangular form.

1.30) True; in echelon form the pivot of row i lies in column i or later, so every entry a_ij with i > j is zero, which is exactly the definition of upper triangular

1.31) If a matrix is in upper triangular form then all of the elements on the main diagonal must be non-zero.

1.31) False; upper triangular only requires the entries below the main diagonal to be zero (a_ij = 0 for i > j, not i ≥ j). The diagonal entries themselves can be anything, including zero.

1.32) Suppose that E1 and E2 are any 2 × 2 elementary matrices. Then E1E2 = E2E1.

1.32) False; matrix multiplication is not commutative, and elementary matrices are no exception: a row swap followed by a row scaling generally differs from the scaling followed by the swap

1.33) If A is square, and Ax = Ay for some x ≠ y, then det(A) = 0.

1.33) True; A(x − y) = 0 with x − y ≠ 0, so the columns of A are linearly dependent, A is singular, and det(A) = 0

1.34) If A, B, and C are square, A is invertible, and A(B − C) is equal to a zero matrix, then B = C.

1.34) True; multiply on the left by A^−1 to get B − C = 0, so B = C

1.35) If A, B, and C are square, A is invertible, and A(B − A^TC^T) is equal to a zero matrix, then B = (AC)^T

1.35) False; multiplying by A^−1 gives B = A^T C^T = (CA)^T, which in general differs from (AC)^T

1.36) Every elementary matrix is invertible.

1.36) True; every elementary matrix is one row operation away from the identity, every row operation is reversible, and no row operation sends the determinant to 0

1.37) If A is invertible, then Ax = b has a solution for every b.

1.37) True; if A is invertible then x = A^−1 b solves the system for every b (A is onto)

1.38) The product of invertible matrices is also invertible.

1.38) True; det(A) ≠ 0 and det(B) ≠ 0, so det(AB) = det(A)det(B) ≠ 0
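A small numpy sketch (the example matrices are made up) confirming det(AB) = det(A)det(B) ≠ 0:

import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])                  # det(A) = 6
B = np.array([[1.0, 4.0],
              [2.0, 9.0]])                  # det(B) = 1
print(np.linalg.det(A @ B))                 # 6.0
print(np.linalg.det(A) * np.linalg.det(B))  # 6.0 -- the product is invertible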

1.39) Every matrix can be expressed as a product of elementary matrices.

1.39) False; only invertible matrices can be written as a product of elementary matrices (every elementary matrix is invertible, and so is any product of them)

1.4) If A^n is invertible for some integer n, then A is also invertible.

1.4) True; det(A^n) = det(A)^n ≠ 0 forces det(A) ≠ 0

1.40) If A is a 3 × 3 matrix and the equation Ax = [1 \\ 0 \\ 0] has a unique solution, then A is invertible.

1.40) True; the keyword is unique solution. That means no free variables, so A has a pivot in every column. Since A is square, that is a pivot in every row as well, and A is invertible by the Invertible Matrix Theorem. (For a square matrix, a unique solution for one b does guarantee a unique solution for every b.)

1.41) If B, C, D are square matrices and BC = BD, then C = D.

1.41) False, B could be singular

1.42) If B, C, D are square invertible matrices and BC = BD, then C = D.

1.42) True; B is invertible, so B^−1BC = B^−1BD gives C = D

1.43) The transpose of an elementary matrix is an elementary matrix.

1.43) True; the transpose of a row swap or a row scaling is itself, and the transpose of a row replacement is another row replacement

1.44) If A has N columns then rankA + dim(Nul(A)) = N.

1.44) True; this is the Rank–Nullity Theorem (Wade's favorite)
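A minimal check of rank + nullity = N on an arbitrary example matrix, assuming scipy is available for the null-space basis:

import numpy as np
from scipy.linalg import null_space

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])          # second row is twice the first
rank = np.linalg.matrix_rank(A)          # 1
nullity = null_space(A).shape[1]         # dim Nul(A) = 2
print(rank + nullity == A.shape[1])      # True: rank + nullity = N = 3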

1.45) The set of all probability vectors in R^n forms a subspace of R^n

1.45) False; the zero vector is not a probability vector (its entries don't sum to 1), and every subspace must contain the zero vector

1.46) The set of eigenvectors of an n × n matrix, that are associated with an eigenvalue, λ, span a subspace of R^n

1.46) True; the eigenvectors associated with λ, together with the zero vector, form the eigenspace Nul(A − λI), which is a subspace of R^n. (The zero vector is never called an eigenvector, but it is always in the eigenspace, since the span of any set of vectors contains 0.)

1.47) An eigenspace is a subspace spanned by a single eigenvector.

1.47) False; an eigenspace can be spanned by several linearly independent eigenvectors, whenever the geometric multiplicity is greater than 1

1.48) If λ ∈ R is a non-zero eigenvalue of A with corresponding eigenvector v, then Av is parallel to v.

1.48) True; Av = λv lies on the same line as v, and vectors on the same line count as parallel (confirmed on Piazza)

1.49) If A is n × n and A has n distinct eigenvalues, then the eigenvectors of A span R^n

1.49) True; eigenvectors corresponding to distinct eigenvalues are linearly independent, so the n of them form a basis for R^n

1.5) Any three vectors in R^2 will form a basis for R^2

1.5) False; any three vectors in R^2 are linearly dependent, and a basis for R^2 has exactly two vectors. (They don't form a basis for R^3 either: vectors in R^2 can't span the higher-dimensional R^3.)

1.50) An eigenvalue of a matrix could be associated with two linearly independent eigenvectors.

1.50) True: since independent vectors could correspond to the same eigenvalue (see 1.59)

1.51) Row operations on a matrix do not change its eigenvalues.

1.51) False; row reduction changes the eigenvalues (the reduced matrix has a different characteristic polynomial)
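A toy demonstration (my own example) that a single row operation changes the eigenvalues:

import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
print(np.linalg.eigvals(A))      # [3. 1.]

B = A.copy()
B[1] = B[1] - 0.5 * B[0]         # row replacement: R2 <- R2 - (1/2) R1
print(np.linalg.eigvals(B))      # [2. 1.5] -- different eigenvalues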

1.52) If A is singular, then non-zero vectors in the null space of A are also eigenvectors of A.

1.52) True; A singular implies that 0 is an eigenvalue, so A − 0I = A and the non-zero vectors in Nul(A) are exactly the eigenvectors for λ = 0

1.53) An example of a 2 × 2 matrix, that only has eigenvalue zero, is the 2 × 2 zero matrix.

1.53) True- It says "an example" and the zero matrix is "an" example

1.54) A steady-state vector for a stochastic matrix is an eigenvector.

1.54) True; a steady-state vector q satisfies Pq = q, so q is an eigenvector of P with eigenvalue 1
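A short numpy sketch (made-up 2 × 2 stochastic matrix) extracting the steady state as the eigenvector for λ = 1:

import numpy as np

P = np.array([[0.7, 0.4],
              [0.3, 0.6]])               # columns sum to 1 (stochastic)
evals, evecs = np.linalg.eig(P)
k = np.argmin(np.abs(evals - 1.0))       # locate the eigenvalue closest to 1
q = np.real(evecs[:, k])
q = q / q.sum()                          # rescale so the entries sum to 1
print(q)                                 # [0.571... 0.428...] = (4/7, 3/7)
print(np.allclose(P @ q, q))             # True: Pq = q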

1.55) If v1 and v2 are linearly independent eigenvectors, then they correspond to distinct eigenvalues.

1.55) False; one eigenvalue can correspond to an eigenspace spanned by multiple independent vectors (when the geometric multiplicity is greater than one)

1.56) A number c is an eigenvalue of A if and only if the system (A−cI)x = 0 has a nontrivial solution.

1.56) True; c is an eigenvalue exactly when A − cI is singular, i.e. when (A − cI)x = 0 has a nontrivial solution (a non-zero null space)

1.57) If (λ − r)^k is a factor of the characteristic polynomial of A, then r is an eigenvalue of A with geometric multiplicity k.

1.57) False; that makes k the algebraic multiplicity of r. The geometric multiplicity can be smaller.

1.58) If A is a square matrix, and v and w are eigenvectors of A, then v + w is also an eigenvector of A.

1.58) False in general; it holds when v and w share the same eigenvalue (and v + w ≠ 0), but fails when they correspond to distinct eigenvalues

1.59) An eigenvalue of a matrix could be associated with two linearly independent eigenvectors.

1.59) True; e.g. the identity matrix, where every non-zero vector is an eigenvector for λ = 1

1.6) If U is an echelon form of matrix A, then rank(U) = rank(A).

1.6) True; row reduction preserves the linear dependence relations among the columns, so the number of pivots is unchanged, and U and A have the same rank

1.7) If the nullspace of a square matrix contains only the zero vector, then the matrix is invertible.

1.7) True; if the zero vector is the only vector in Nul(A), then Ax = 0 has only the trivial solution, there are no free variables, A has a pivot in every column, and (being square) A is invertible

1.8) Any four linearly independent vectors in R^4 forms a basis for R^4

1.8) True; four linearly independent vectors in R^4 automatically span R^4, so they form a basis

1.9) The rank of an invertible n × n matrix is always n.

1.9) True; invertible means a pivot in every row and every column, so rank(A) = n

1.11) If A is a 9 × 9 matrix with 3 distinct eigenvalues, and the eigenspace corresponding to one of the eigenvalues has dimension 7, then A is diagonalizable.

10) True; geometric multiplicity is at least 1 for every eigenvalue, and the algebraic multiplicities sum to 9. With one eigenspace of dimension 7, the other two eigenvalues each have algebraic (hence geometric) multiplicity exactly 1, so the geometric multiplicities sum to 9 and A is diagonalizable.

1.12) Any stochastic matrix with a zero entry can not be regular.

11) False; a zero entry does not prevent regularity: some power of the matrix (squaring, cubing, etc.) can still have all positive entries

1.13) If a stochastic matrix is not regular then it cannot have a steady state.

12) False; every stochastic matrix has a steady-state vector, since 1 is always an eigenvalue. Regularity only guarantees a unique steady state that the chain converges to.

1.14) A diagonalization of a matrix is unique.

13) False; can reorder eigenvalues and their corresponding eigenvectors

1.15) If v ∈ R^n is an eigenvector of a square matrix, then all of the elements of v cannot be equal to zero.

14) True; we don't regard the zero vector as an e-vector

1.16) The eigenvalues of a triangular matrix are the elements on the diagonal of the matrix.

15) True; a consequence of cofactor expansion: for triangular A, det(A − λI) = (a11 − λ)···(ann − λ)

1.17) To find the eigenvalues of square matrix A, we can row reduce A to echelon form, and then read off the pivots.

16) False; you must solve det(A − λI) = 0. Row operations change eigenvalues, so the pivots of an echelon form are not the eigenvalues of A.

1.18) If A is n × n, and the rank of A − λI is n, then λ is an eigenvalue of A.

17) False; λ is an eigenvalue exactly when A − λI is singular, i.e. rank(A − λI) < n, so that Nul(A − λI) contains a non-zero vector. Full rank means no eigenvector exists for λ.

1.19) If 2 is an eigenvalue of A, and A is invertible, then .5 is an eigenvalue of A^−1.

18) True; if Av = 2v then A^−1 v = (1/2)v, so 0.5 is an eigenvalue of A^−1
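Numerically (a toy triangular example of my own), the eigenvalues of A^−1 are the reciprocals:

import numpy as np

A = np.array([[2.0, 0.0],
              [1.0, 4.0]])                  # triangular: eigenvalues 2 and 4
print(np.linalg.eigvals(A))                 # [2. 4.]
print(np.linalg.eigvals(np.linalg.inv(A)))  # [0.5 0.25] (order may differ)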

1.20) All eigenvalues are non-zero scalars.

19) False. 0 can be an e-value!

1.3) If A ∈ R^2×2 and λ = −1 is an eigenvalue of A, then A reflects eigenvectors that are associated with λ through a line that passes through the origin.

2) True; if Av = −v, then v is flipped to −v, which is the reflection of v across the line through the origin perpendicular to v

1.21) All eigenvectors are non-zero vectors.

20) True by definition of eigenvectors

1.22) An example of a regular stochastic matrix is P = [ 1 0.2 \\ 0 0.8 ]

21) False, bottom left is always 0 no matter what power you raise P to

1.23) An example of a regular stochastic matrix is [ 1/3 1/2 1/3 \\ 1/3 0 1/3 \\ 1/3 1/2 1/3 ]

22) True; P itself has a zero entry, but P^2 has all positive entries, so P is regular. From any state you can reach every other state: no sinks!
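A quick regularity check in numpy for this matrix: P has a zero entry, but P^2 is strictly positive:

import numpy as np

P = np.array([[1/3, 1/2, 1/3],
              [1/3, 0.0, 1/3],
              [1/3, 1/2, 1/3]])
print(np.all(P > 0))             # False: P itself has a zero entry
print(np.all(P @ P > 0))         # True: P^2 is strictly positive, so P is regular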

1.24) A and B are square n × n matrices. If 0 is an eigenvalue of the product AB, then 0 is an eigenvalue of BA.

23) True; det(AB) = det(A)det(B) = det(BA), so if 0 is an eigenvalue of AB then det(AB) = 0, hence det(BA) = 0, and 0 is an eigenvalue of BA

1.25) If A is square n × n and has n linearly independent eigenvectors, then so does A^T

24) True; if A = PDP^−1 then A^T = (P^T)^−1 D P^T, so A^T is also diagonalizable and therefore has n linearly independent eigenvectors

1.26) If the determinant of a matrix is zero, then zero is a factor of the characteristic polynomial of the matrix.

25) True; det(A) = 0 means A is singular, so 0 is an eigenvalue and λ is a factor of the characteristic polynomial

1.27) If A is diagonalizable, then so is A^k for k = 2, 3, 4, . . .

26) True; if A = PDP^−1, then A^k = PD^kP^−1, and D^k is still diagonal
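A small numpy sketch (example of my own) verifying A^k = PD^kP^−1:

import numpy as np

D = np.diag([2.0, 3.0])
P = np.array([[1.0, 1.0],
              [0.0, 1.0]])
A = P @ D @ np.linalg.inv(P)                # diagonalizable by construction

k = 3
lhs = np.linalg.matrix_power(A, k)          # A^3 computed directly
rhs = P @ np.diag([2.0**k, 3.0**k]) @ np.linalg.inv(P)   # P D^3 P^{-1}
print(np.allclose(lhs, rhs))                # True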

1.4) If A ∈ R^2×2 is singular, then at least one of the eigenvalues of A can have a non-zero imaginary component.

3) False; singular means det(A) = 0, so 0 is an eigenvalue. Complex eigenvalues of a real matrix come in conjugate pairs, so the remaining eigenvalue of a 2 × 2 matrix must also be real (it equals the trace).

1.5) An eigenvector of a matrix could be associated with two distinct eigenvalues.

4) False; it's the other way around: Av = λ1 v = λ2 v with v ≠ 0 forces λ1 = λ2, though two eigenvectors can share the same eigenvalue

1.6) If A ∈ R^8×8, has 4 eigenvalues, each eigenvalue has algebraic multiplicity 1, and the geometric multiplicity of each eigenvalue is 2, then A can be diagonalized.

5) False; the situation is impossible, since 1 ≤ geometric multiplicity ≤ algebraic multiplicity for every eigenvalue, and here the geometric multiplicity (2) would exceed the algebraic multiplicity (1). Geometric multiplicity is dim Nul(A − λI); algebraic multiplicity is the number of times the eigenvalue is repeated as a root of the characteristic polynomial. See https://youtu.be/Xcln3xG8QGQ for a refresher.

1.7) If A has eigenvalue λ, then A^T has eigenvalue λ.

6) True; det(A − λI) = det((A − λI)^T) = det(A^T − λI), so A and A^T have the same characteristic polynomial and the same eigenvalues
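A one-off numpy check (arbitrary example) that A and A^T share eigenvalues:

import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
print(np.sort(np.linalg.eigvals(A)))    # [-0.372...  5.372...]
print(np.sort(np.linalg.eigvals(A.T)))  # same eigenvalues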

1.8) The 3 × 3 zero matrix is diagonalizable.

7) True; the zero matrix is itself diagonal, so it is (trivially) similar to a diagonal matrix

1.9) If A is diagonal, then A can be diagonalized.

8) True; every diagonal matrix A satisfies A = IAI^−1, so it is similar to a diagonal matrix (itself)

1.10) If A is triangular, then A can be diagonalized.

9) False; for example A = [ 0 1 \\ 0 0 ] is triangular with the single eigenvalue 0, but its eigenspace has dimension 1, so the geometric multiplicities don't add up to 2 and A cannot be diagonalized
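numpy makes the defect visible for this matrix:

import numpy as np

A = np.array([[0.0, 1.0],
              [0.0, 0.0]])             # eigenvalue 0 with algebraic multiplicity 2
evals, evecs = np.linalg.eig(A)
print(evals)                           # [0. 0.]
print(np.linalg.matrix_rank(evecs))    # 1: only one independent eigenvector,
                                       # so A cannot be diagonalized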

1.2) If a linear system has more unknowns than equations, then the system can have a unique solution.

False; with more unknowns than equations, a consistent system has at least one free variable and therefore infinitely many solutions, and an inconsistent system has none. A unique solution is impossible.

1.9) If A ∈ R 2×2 and A has linearly dependent columns, the span of the columns of A is a line that passes through the origin.

False; A could be the zero matrix, in which case the span of the columns is {0}, a point rather than a line

1.5) If AB = AC, then B = C.

False; A could be the zero matrix (or any singular matrix), in which case AB = AC does not force B = C

1.13) If A is 2 × 3, then the transformation x → Ax cannot be one-to-one.

True; A has 3 columns but at most 2 pivots, so there is always a free variable and x → Ax cannot be one-to-one. (A wide matrix can be onto, but never one-to-one.)

1.12) If x_1 is a solution to the inhomogeneous system Ax = b, then any vector in Span{x_1} is also a solution to Ax = b.

False; A(c·x_1) = c·b, which equals b only when c = 1 (for b ≠ 0). In particular, the zero vector is in Span{x_1} but is not a solution.

1.1) If a linear system has more equations than unknowns, then the system can have a unique solution.

True; a system with more equations than unknowns can still have a pivot in every column and be consistent, which gives exactly one solution. For example, x_1 = 1, x_2 = 2, 0 = 0 has the unique solution (1, 2).

1.10) If there are some vectors b ∈ R^m that are not in the range of T(x) = Ax, then there cannot be a pivot in every column of A.

False; "some b not in the range" means A has no pivot in some row (T is not onto). A can still have a pivot in every column, e.g. a tall matrix with more rows than columns.

1.7) If A ∈ R^m×n has linearly dependent columns, then the columns of A cannot span R^m

False; spanning R^m only requires a pivot in every row. A can have linearly dependent columns (free variables) and still have a pivot in every row, as in a wide matrix.

1.4) If Ax = Ay for some x ≠ y, then A could have a pivot in every row, but A cannot have a pivot in every column.

True; Ax = Ay with x ≠ y means A(x − y) = 0 has a nontrivial solution, so there is a free variable and A cannot have a pivot in every column. A pivot in every row is still possible, since the system can be consistent for every b. (Remember: a linear system has zero, one, or infinitely many solutions.)
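A concrete wide-matrix example (made up) with a pivot in every row but a free variable:

import numpy as np

A = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 3.0]])         # pivots in both rows, x3 is free
x = np.zeros(3)
y = np.array([-2.0, -3.0, 1.0])         # x != y
print(A @ x, A @ y)                     # both are [0. 0.]: Ax = Ay with x != y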

1.6) If matrix B has two columns, the columns of B are b_1 and b_2, so that B = [b_1 b_2], then AB = [Ab_1 Ab_2]

True Definition of matrix multiplication

1.3) If for every b, Ax = b has at least one solution, there must be a pivot in every row of A.

True; a pivot in every row is the only way for Ax = b to be consistent for every b

1.11) If transform T is linear, then T(u + v) = T(u) + T(v) for all u and v in the domain of T.

True; additivity is one of the two defining properties of a linear transformation (the other being T(cu) = cT(u))

1.8) If A and B are 2 × 2 matrices, the columns of B are b_1 and b_2, and b_1 = b_2, then the columns of AB are linearly dependent

True; AB = [Ab_1 Ab_2] with b_1 = b_2 gives two identical columns, which are linearly dependent

1.14) If a linear system is consistent, then the solution set either contains a unique solution when there are no free variables, or infinitely many solutions when there is at least one free variable

True; this is the dichotomy for consistent systems: no free variables gives exactly one solution, and at least one free variable gives infinitely many

