Linear Algebra Assignments True/False

HW 8: A vector v is in Col(A) if and only if Ax = v is consistent.

TRUE. By definition, Col(A) is the set of all linear combinations of the columns of A, that is, the set of all vectors of the form Ax. So v is in Col(A) exactly when the equation Ax = v has a solution.

HW 8: Let A1, A2, A3, A4 be 2 × 2 matrices such that M2×2(R) = Span{A1, A2, A3, A4}. The set {A1, A2, A3, A4} is a basis for M2×2(R).

TRUE. Since dim M2×2(R) = 4, a spanning set consisting of exactly 4 matrices must also be linearly independent, hence a basis.

HW 8: The column space of a matrix A is the set of solutions of Ax = b.

FALSE. The column space of A is the set of all vectors b for which Ax = b is consistent, not the set of solutions x of that equation.

Exam 1: If a linear system has infinitely many solutions, then the REF of the augmented matrix has a row of zeros.

FALSE

Exam 1: If the REF of the augmented matrix of a linear system has a row of zeros, then the system has infinitely many solutions.

FALSE

Exam 1: Let T be a linear transformation. If T(x) = 0, then x = 0.

FALSE

Exam1 PP: If the equation Ax = 0 only has the trivial solution, then the equation Ax = b has a unique solution for any b.

FALSE

Exam1 PP: Let T be a linear transformation. If T(x) = T(y), then x = y.

FALSE

HW 6: Let A be an n × n matrix, and let b be a given vector in R^n. If the system Ax = b is consistent, then detA does not equal 0.

FALSE. Consider the matrix A with rows (1, -1) and (0, 0), and b = (2, 0). The system Ax = b is consistent (for example, x = (2, 0) is a solution), but detA = 0.
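
A quick numerical check of this counterexample (a hedged sketch using NumPy, with the matrix and vector taken from the answer above):

```python
import numpy as np

# Counterexample above: detA = 0, yet Ax = b is consistent.
A = np.array([[1.0, -1.0],
              [0.0,  0.0]])
b = np.array([2.0, 0.0])

print(np.linalg.det(A))  # 0.0, so A is singular

# Ax = b is consistent iff rank([A | b]) equals rank(A).
rank_A = np.linalg.matrix_rank(A)
rank_Ab = np.linalg.matrix_rank(np.column_stack([A, b]))
print(rank_A == rank_Ab)  # True, so the system is consistent
```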

HW 4: If a set of vectors is linearly dependent then at least one of the vectors is a scalar multiple of another one.

FALSE. Consider the set {(1, 0), (0, 1), (2, 3)}. This set is linearly dependent because the last vector is a linear combination of the first two, but no vector in the set is a scalar multiple of another one.
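
The same counterexample can be checked numerically (a hedged NumPy sketch; the nonzero 2 × 2 determinants below confirm that no pair of the vectors is proportional):

```python
import numpy as np

# The set {(1, 0), (0, 1), (2, 3)} from the answer above, as the columns of V.
V = np.column_stack([[1, 0], [0, 1], [2, 3]])

# Linearly dependent: only 2 pivots are possible for 3 vectors in R^2.
print(np.linalg.matrix_rank(V))  # 2

# Yet no vector is a scalar multiple of another: every pair is independent.
for i in range(3):
    for j in range(i + 1, 3):
        print(i, j, np.linalg.det(V[:, [i, j]]))  # all nonzero
```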

Exam 2 PP: Cramer's rule can be applied to any kind of linear system.

FALSE. Cramer's rule can only be applied to linear systems where the associated matrix is square and invertible.

HW 3: A consistent equation Ax = b where A has more columns than rows can have a unique solution.

FALSE. If Ax = b is consistent, then it either has a unique solution or it has infinitely many solutions. Since A has more columns than rows, not every column can have a pivot. Therefore, there must be at least one free variable, so there are infinitely many solutions.
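
To illustrate, here is a hedged NumPy sketch with a hypothetical 2 × 3 system: it is consistent, but the third variable is free, so the solution is not unique:

```python
import numpy as np

# Hypothetical example: more columns than rows, so at most 2 pivots and at least 1 free variable.
A = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 3.0]])
b = np.array([5.0, 7.0])

# Two different solutions of Ax = b, obtained by choosing the free variable x3 = 0 and x3 = 1.
x0 = np.array([5.0, 7.0, 0.0])
x1 = np.array([3.0, 4.0, 1.0])
print(np.allclose(A @ x0, b), np.allclose(A @ x1, b))  # True True
```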

Exam1 PP: If x is in Span{v1, v2, v3}, then x is in Span{v1, v2, tv3}, where t is any real number.

FALSE. If t = 0, then Span{v1, v2, tv3} = Span{v1, v2}, so x would have to be a linear combination of v1 and v2 only, which is not necessarily true. For example, if v1, v2, v3 are the standard basis vectors of R^3 and x = v3, then x is in Span{v1, v2, v3} but not in Span{v1, v2}.

HW 3: If the REF of the augmented matrix of a consistent equation Ax = b has a row of zeros, then the equation has infinitely many solutions.

FALSE. If the matrix A has more rows than columns, then all the variables can be basic, so the solution is unique even though the REF of the augmented matrix has a row of zeros (for example, a consistent system whose 3 × 2 coefficient matrix has a pivot in each column).

HW 5: If A and B are square matrices such that AB = BA, then A^−1 = B.

FALSE. Let A be the zero matrix. Then AB = BA = 0, but A is not invertible.

Exam1 PP: Let v1, v2, v3 be vectors in R^n. If {v1, v2, v3} is linearly dependent, then each of v1, v2, v3 can be expressed as a linear combination of the other two.

FALSE. One of the weights in the linear dependence relation might be 0. For instance, in R^2 take v1 = (1, 0), v2 = (2, 0), v3 = (0, 1): the set is linearly dependent since 2v1 − v2 + 0v3 = 0, but v3 cannot be expressed as a linear combination of v1 and v2.
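
A hedged NumPy check of this counterexample (the vectors are the ones named in the answer above):

```python
import numpy as np

# v1 and v2 lie on the x-axis; v3 does not.
v1, v2, v3 = np.array([1.0, 0.0]), np.array([2.0, 0.0]), np.array([0.0, 1.0])

# The set is linearly dependent: 2*v1 - v2 + 0*v3 = 0.
print(np.allclose(2 * v1 - v2 + 0 * v3, 0))  # True

# But v3 is not in Span{v1, v2}: [v1 v2] c = v3 is inconsistent (the ranks differ).
M = np.column_stack([v1, v2])
print(np.linalg.matrix_rank(M), np.linalg.matrix_rank(np.column_stack([M, v3])))  # 1 2
```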

HW 7: R^2 is a subspace of R^3.

FALSE. R^2 is not a subset of R^3, so it cannot be a subspace of R^3.

Exam 2 PP: If a square matrix A is not invertible, then the system Ax = b is inconsistent for all b.

FALSE. The linear system Ax = 0 is always consistent regardless of the matrix A.

HW 7: A set containing a finite number of vectors in R^n cannot be a subspace of R^n.

FALSE. The set {0} is a finite set and is a subspace of R^n. (It is the only one: any other finite set contains a nonzero vector v, and some scalar multiple of v is not in the set, so the set is not closed under scalar multiplication.)

HW 4: If the set {v1, v2, v3, v4} is linearly independent, then {v1, v2, v3, v4, v1+v4} is linearly independent.

FALSE. The vector v1+v4 is a linear combination of v1 and v4. Thus, {v1, v2, v3, v4, v1+ v4} is linearly dependent.

HW 7: The set of invertible 2 × 2 matrices is a subspace of M2×2(R).

FALSE. The zero 2 × 2 matrix is not invertible, so the set does not contain the zero vector of M2×2(R) and therefore fails the first property of subspaces.

HW 6: If A is a square matrix whose diagonal entries are all zero, then detA = 0.

FALSE. The matrix with rows (0, 1) and (−2, 0) has all diagonal entries equal to zero, yet its determinant is 0 · 0 − 1 · (−2) = 2.

HW 6: If the determinant of a square matrix is zero, then the matrix has either one row or column of zeros.

FALSE. The matrix with rows (1, −2) and (2, −4) has determinant 1 · (−4) − (−2) · 2 = −4 + 4 = 0, but it does not have any row or column of zeros.
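
Both determinant counterexamples above are easy to verify numerically (a hedged NumPy sketch):

```python
import numpy as np

# All diagonal entries zero, yet the determinant is nonzero.
print(np.linalg.det(np.array([[0.0, 1.0], [-2.0, 0.0]])))   # 2.0

# Determinant zero, yet no row or column of zeros.
print(np.linalg.det(np.array([[1.0, -2.0], [2.0, -4.0]])))  # 0.0 (up to round-off)
```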

HW 5: Let A be an n × n matrix, and b be a vector in R^n. If the system Ax = b has infinitely many solutions, then A is not invertible.

TRUE. If A were invertible, then Ax = b would have the unique solution x = A^−1 b. Since the system has infinitely many solutions, A cannot be invertible.

HW 5: Let A be an invertible square matrix. If AB = AC, then B = C.

TRUE. AB = AC --> A^−1(AB) = A^−1(AC) --> I_n B = I_n C --> B = C
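
A small numerical illustration of the key cancellation step (a hedged sketch; the random 3 × 3 matrix A below is an assumption, chosen because random matrices are invertible with probability 1):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))  # almost surely invertible
B = rng.standard_normal((3, 3))

# The step used in the argument: multiplying AB on the left by A^-1 recovers B.
print(np.allclose(np.linalg.inv(A) @ (A @ B), B))  # True
```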

Exam 2 PP: If detA does not equal 0 for some n × n matrix A, then the columns of A are linearly independent.

TRUE. From the Invertible Matrix Theorem, we know that if detA does not equal 0, then A is invertible, which in turn implies that its columns are linearly independent.

HW 3: If a matrix has m rows and n columns, with m > n, the columns of the matrix cannot span R^m.

TRUE. If A has more rows than columns, then there cannot be a pivot in every row, so the columns of A cannot span R^m.

HW 5: Let T be an invertible linear transformation. If T(x) = 0, then x = 0.

TRUE. If T is an invertible linear transformation, then the matrix A associated with T is invertible, meaning that Ax = b has a unique solution for any b. In particular, the only solution to T(x) = Ax = 0 is the trivial solution x = 0.

Exam 1: If a set of vectors is linearly dependent, then at least one of the vectors can be written as a linear combination of all the other ones.

TRUE. If a set of vectors {v1, v2, . . . , vp} is linearly dependent, then there are scalars c1, c2, . . . , cp, not all zero, such that c1v1 + c2v2 + · · · + cpvp = 0. If ci does not equal 0, we can solve for vi: vi = −(c1/ci)v1 − · · · − (cp/ci)vp (with the vi term omitted), so vi is a linear combination of all the other vectors.

HW 3: If a consistent equation Ax = b, where A is a square matrix, has infinitely many solutions, then the REF of the augmented matrix has a row of zeros.

TRUE. If the equation has infinitely many solutions, then at least one of the variables is free, meaning that at least one of the columns does not have a pivot. Since there are the same number of rows as columns, this means that there will be a row of zeros as well.

HW 4: If {x, y} is linearly independent, and if z is in Span{x, y}, then {x, y, z} is linearly dependent.

TRUE. Since z is in Span{x, y}, then z is a linear combination of x, y, thus {x, y, z} is linearly dependent.

Exam 1: The columns of any 4 × 5 matrix are linearly dependent.

TRUE. The columns of any matrix that contains more columns than rows are linearly dependent.

HW 6: If the columns of a square matrix A are linearly dependent, then detA = 0.

TRUE. This is a result of the invertible matrix theorem.

Exam 2 PP: If a square matrix has two identical columns, then its determinant is zero.

TRUE. We could subtract one column from the other which does not change the determinant, and this would produce a column of zeros. Doing a cofactor expansion along the column of zeros reveals that the determinant is zero.
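
A quick hedged NumPy check with a hypothetical 3 × 3 matrix whose first and third columns are identical:

```python
import numpy as np

# Hypothetical matrix: columns 0 and 2 are identical, so the columns are dependent.
A = np.array([[1.0, 4.0, 1.0],
              [2.0, 5.0, 2.0],
              [3.0, 6.0, 3.0]])
print(np.linalg.det(A))  # 0.0 (up to floating-point round-off)
```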

HW 4: A set of 3 vectors in R^2 is always linearly dependent.

TRUE. When row reducing A = [v1 v2 v3], where v1, v2, v3 are in R^2, there will be at least one column without a pivot.
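
A hedged NumPy sketch that illustrates the claim with three randomly chosen vectors in R^2 (any choice works, since the rank of a 2 × 3 matrix is at most 2):

```python
import numpy as np

rng = np.random.default_rng(1)

# Three arbitrary vectors in R^2, as the columns of a 2 x 3 matrix.
V = rng.standard_normal((2, 3))

# The rank is at most 2, which is less than the 3 columns, so the columns are dependent.
print(np.linalg.matrix_rank(V) < 3)  # True
```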

