[MATH 70.1] T/F (Book)

False

A 4 × 7 matrix has four columns.

True

A 6 × 3 matrix has six rows.

True.

A Markov chain that is not regular can have a unique steady state matrix.
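
A quick numerical sketch (numpy, with an illustrative column-stochastic matrix, not from the book): the chain below is absorbing, hence not regular, yet its powers converge to a single steady state.

    import numpy as np

    # State 1 is absorbing (column [1, 0]), so no power of P is
    # entirely positive and P is not regular.
    P = np.array([[1.0, 0.5],
                  [0.0, 0.5]])
    # Powers of P still converge: every column approaches the unique
    # steady state [1, 0].
    print(np.linalg.matrix_power(P, 50))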

True

A consistent system of linear equations can have infinitely many solutions.

True. Theorem 1.1 says that all homogeneous systems are consistent.

A homogeneous system of four linear equations in four variables is always consistent.

True. Theorem 1.1 states that a homogeneous system with fewer equations than variables has infinitely many solutions.

A homogeneous system of four linear equations in six variables has infinitely many solutions.

True. A homogeneous system of linear equations must have at least one solution, and that is the trivial solution.

A homogeneous system of linear equations must have at least one solution.

False

A linear system can have exactly two solutions

True

A regular stochastic matrix can have entries of 0.
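
A quick check (numpy sketch; the matrix is an illustrative example): this stochastic matrix has a zero entry, but its square is entirely positive, so it is regular.

    import numpy as np

    # Column-stochastic matrix (each column sums to 1) with a zero entry.
    P = np.array([[0.0, 0.5],
                  [1.0, 0.5]])
    # P squared has strictly positive entries, which makes P regular.
    print(np.linalg.matrix_power(P, 2))   # [[0.5, 0.25], [0.5, 0.75]]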

True. It is stated in Theorem 2.14 on Page 77 that a square matrix A is invertible if and only if it can be written as the product of elementary matrices, and an invertible matrix is a nonsingular matrix.

A square matrix is nonsingular when it can be written as the product of elementary matrices.

False. A stochastic matrix can never have negative entries

A stochastic matrix can have negative entries.

False. For example, x + y + z = 1 and x + y + z = 2 form a system of two equations in three variables with no solution. (A homogeneous system with fewer equations than variables, by contrast, always has infinitely many solutions.)

A system of linear equations with fewer equations than variables always has at least one solution.
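
The counterexample above can be checked numerically (a numpy sketch; the system is illustrative): the coefficient matrix has rank 1 while the augmented matrix has rank 2, so no solution exists.

    import numpy as np

    # Two equations, three unknowns: x + y + z = 1 and x + y + z = 2.
    A = np.array([[1.0, 1.0, 1.0],
                  [1.0, 1.0, 1.0]])
    b = np.array([1.0, 2.0])
    # rank(A) < rank([A | b]) means the system is inconsistent.
    Ab = np.column_stack([A, b])
    print(np.linalg.matrix_rank(A), np.linalg.matrix_rank(Ab))   # 1 2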

True. A single linear equation in two variables, ax + by = c with a and b not both zero, has infinitely many solutions, so the system is always consistent.

A system of one linear equation in two variables is always consistent.

False. The three lines may all intersect at a single point, or may coincide; in either case the system is consistent.

A system of three linear equations in two variables is always inconsistent.

False. The two equations represent planes, which may be parallel and therefore have no common solution.

A system of two linear equations in three variables is always consistent

False. By Theorem 3.3, adding a multiple of one row or column of a matrix to another does not change the determinant at all.

Adding a multiple of one column of a square matrix to another column changes only the sign of the determinant.

False

Addition of matrices is not commutative.

True. It is stated in Theorem 2.15 on page 78.

Ax = O has only the trivial solution if and only if Ax = b has a unique solution for every n × 1 column matrix b.

True. This one is tricky. Every matrix has a unique reduced row-echelon form: different sequences of elementary row operations may be used, but they all lead to the same reduced row-echelon matrix. (Two different matrices can share the same reduced row-echelon form, such as the identity matrix, but that does not make the reduced row-echelon form of any one matrix non-unique.)

Every matrix has a unique reduced row-echelon form.

True. Matrices are row-equivalent when one can be obtained from the other through a sequence of elementary row operations, and Gaussian elimination reduces every matrix to some matrix in row-echelon form.

Every matrix is row-equivalent to a matrix in row-echelon form.

True. If C is an m x n matrix, then C^T is n x m, so C(C^T) is a square m x m matrix. Applying the transpose rules, [C(C^T)]^T = (C^T)^T(C^T) = C(C^T), so C(C^T) equals its own transpose and is therefore symmetric.

For any matrix C the matrix C(C^T) is symmetric.
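
A quick check (numpy sketch; C is an arbitrary illustrative matrix):

    import numpy as np

    # Any rectangular C works; the values are arbitrary.
    C = np.array([[1.0, 2.0, 3.0],
                  [4.0, 5.0, 6.0]])
    S = C @ C.T
    # S equals its own transpose, so C(C^T) is symmetric.
    print(np.array_equal(S, S.T))   # True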

True

For the product of two matrices to be defined, the number of columns of the first matrix must equal the number of rows of the second matrix.

False. Consider A = I and B = -I, where I is the identity matrix. A and B are nonsingular, but A + B is the zero matrix, which is singular.

If A and B are nonsingular n x n matrices, then A + B is a nonsingular matrix.
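
The counterexample A = I, B = -I can be verified directly (numpy sketch):

    import numpy as np

    A = np.eye(2)
    B = -np.eye(2)
    print(np.linalg.det(A), np.linalg.det(B))   # 1.0 1.0, both nonsingular
    print(np.linalg.det(A + B))                 # 0.0, so A + B is singular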

True. Since det(AB) = det(A)det(B) = -1 is nonzero, neither det(A) nor det(B) can be 0, so both A and B are nonsingular.

If A and B are square matrices of order n such that det(AB) = − 1, then both A and B are nonsingular.
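
The product rule det(AB) = det(A)det(B) behind this answer is easy to spot-check (numpy sketch; the matrices are arbitrary examples):

    import numpy as np

    A = np.array([[2.0, 1.0],
                  [0.0, 1.0]])
    B = np.array([[1.0, 0.0],
                  [3.0, 4.0]])
    # det(AB) agrees with det(A) * det(B) up to rounding.
    print(np.isclose(np.linalg.det(A @ B),
                     np.linalg.det(A) * np.linalg.det(B)))   # True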

True

If A and B are square matrices of order n, and det(A) = det(B), then det(AB) = det(A^2).

False

If A and B are square matrices of order n, then det(A + B) = det(A) + det(B).

True. Mentioned on page 64

If A can be row reduced to the identity matrix, then A is nonsingular.

False. det(cA) = c^n det(A), so det(2A) = (2^3)(5) = 40.

If A is a 3 × 3 matrix with det(A) = 5, then det(2A) = 10.
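
A quick check (numpy sketch; the matrix is one illustrative choice with det(A) = 5):

    import numpy as np

    # Diagonal 3 x 3 matrix with determinant 5 * 1 * 1 = 5.
    A = np.diag([5.0, 1.0, 1.0])
    # det(2A) = 2^3 * 5 = 40, not 10 (approximately, up to rounding).
    print(np.linalg.det(2 * A))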

False. By Theorem 3.11 (page 130), det(A) = det(A^T).

If A is a square matrix of order n, then det(A) = − det(A^T).

False. Not necessarily: the solution is unique only when A is nonsingular. If A is singular, Ax = b may have no solution or infinitely many.

If A is a square matrix, then the system of linear equations Ax=b has a unique solution.

True. By Theorem 3.8

If A is an invertible matrix, then the determinant of A^-1 is equal to the reciprocal of the determinant of A

True. By the equivalent conditions for a nonsingular matrix A.

If A is an invertible n × n matrix, then Ax = b has a unique solution for every b.

True

If A is an m × n matrix and B is an n × r matrix, then the product AB is an m × r matrix.

False. By Theorem 3.6, det(cA) = c^n det(A), not nc ∙ det(A).

If A is an n × n matrix and c is a nonzero scalar, then the determinant of the matrix cA is nc ∙ det(A).

False. An elementary matrix is obtained from the identity matrix by a single elementary row operation, and 2E, which scales every row of E, cannot in general be obtained that way.

If E is an elementary matrix, then 2E is an elementary matrix.

False. A system with just one solution is consistent too.

If a linear system is consistent, then it has infinitely many solutions.

False. By Theorem 3.3, part 1 (page 119), det(B) = -det(A).

If a square matrix B is obtained from A by interchanging two rows, then det(B) = det(A).
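
A quick check (numpy sketch; the matrix is an arbitrary example):

    import numpy as np

    A = np.array([[1.0, 2.0],
                  [3.0, 4.0]])   # det(A) is approximately -2
    B = A[[1, 0], :]             # interchange the two rows
    # The sign flips: det(B) is approximately +2.
    print(np.linalg.det(A), np.linalg.det(B))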

False. [(A^T)A]^T = (A^T)((A^T)^T) = (A^T)A. Therefore (A^T)A is symmetric for any square matrix A, whether or not A itself is symmetric.

If an n × n matrix A is not symmetric, then (A^T)A is not symmetric.

True. By Theorem 3.4, part 3, page 121

If one column of a square matrix is a multiple of another column, then the determinant is 0.

True. By Theorem 3.4

If one row of a square matrix is a multiple of another row, then the determinant is 0.

True. By the equivalent conditions for a nonsingular matrix A: det(A) is nonzero, so A is nonsingular and Ax = O has only the trivial solution.

If the determinant of an n × n matrix A is nonzero, then Ax = O has only the trivial solution.

False. The easiest counterexample: let A be the zero matrix. Then AB = AC holds for any B and C, even when B is not equal to C.

If the matrices A, B, and C satisfy AB = AC, then B = C.
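
The zero-matrix counterexample can be verified directly (numpy sketch):

    import numpy as np

    A = np.zeros((2, 2))                   # the zero matrix
    B = np.eye(2)
    C = 2 * np.eye(2)
    print(np.array_equal(A @ B, A @ C))    # True: AB = AC
    print(np.array_equal(B, C))            # False: yet B != C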

True. By Theorem 2.10: since A is invertible, multiplying both sides of BA = CA on the right by A^-1 gives B = C.

If the matrices A, B, and C satisfy BA=CA and A is invertible, then B=C.

False. The row [1 0 0 0 0] corresponds to the equation x1 = 0, which is perfectly consistent. A row of the form [0 0 0 0 1], corresponding to 0 = 1, is what would make the system inconsistent.

If the row-echelon form of the augmented matrix of a system of linear equations contains the row [1 0 0 0 0], then the original system is inconsistent.

True. By Theorem 3.4

If two rows of a square matrix are equal, then its determinant is 0.

True. By Theorem 3.3.

Interchanging two rows of a square matrix changes the sign of its determinant.

True. Theorem 2.1

Matrix addition is commutative.

False

Matrix multiplication is commutative.

True. By Theorem 3.3

Multiplying a column of a square matrix by a nonzero constant results in the determinant being multiplied by the same nonzero constant.

False. This one is tricky too. The elementary row operation is multiplication by a nonzero constant; an arbitrary constant is not allowed, because multiplying a row by 0 is not an elementary row operation.

Multiplying a row of a matrix by a constant is one of the elementary row operations.

False. Not always: C_22 = (-1)^(2+2) M_22 = M_22, and the minor M_22 can be negative or zero.

The cofactor C_22 of a matrix is always a positive number.

False. The determinant of a 2 x 2 matrix A is a11a22 - a21a12.

The determinant of a 2 x 2 matrix A is a21a12 - a11a22

True. By the remark on page 112.

The determinant of a matrix of order 1 is the entry of the matrix.

False

The determinant of the sum of two matrices equals the sum of the determinants of the matrices.

True. The identity matrix can be obtained from itself by multiplying any one of its rows by 1, a single elementary row operation, so it is an elementary matrix.

The identity matrix is an elementary matrix.

False. The ij-cofactor is a number, not a matrix. The minor M_ij is the determinant of the matrix obtained by deleting the ith row and jth column of A, and the cofactor is C_ij = (-1)^(i+j) M_ij.

The ij-cofactor of a square matrix A is the matrix obtained by deleting the ith row and jth column of A.
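
A worked cofactor computation (numpy sketch; the matrix and the choice of i = 1, j = 2 are illustrative):

    import numpy as np

    A = np.array([[1.0, 2.0, 3.0],
                  [4.0, 5.0, 6.0],
                  [7.0, 8.0, 10.0]])
    # Minor M_12: determinant of A with row 1 and column 2 deleted.
    M12 = np.linalg.det(np.delete(np.delete(A, 0, axis=0), 1, axis=1))
    # The cofactor is a signed number, not a matrix.
    C12 = (-1) ** (1 + 2) * M12
    print(C12)   # approximately 2.0, since -(4*10 - 6*7) = 2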

True. It is stated in Theorem 2.13 on page 77.

The inverse of an elementary matrix is an elementary matrix.

True

The inverse of the inverse of a nonsingular matrix A is equal to A itself.

False. By Theorem 2.9, (AB)^-1 = B^-1(A^-1); the order of the factors reverses.

The inverse of the product of two matrices is the product of their inverses; that is (AB)^-1=A^-1(B^-1).

False. The matrix is invertible when ad - bc is not equal to 0, not ab - dc.

The matrix [a b] [c d] is invertible when (ab-dc) is not equal to 0.

True

The matrix equation Ax=b, where A is the coefficient matrix and x and b are column matrices, can be used to represent a system of linear equations.

False. If the chain has only one absorbing state, every initial state matrix leads to the same steady state matrix.

The steady state matrix of an absorbing Markov chain always depends on the initial state matrix.

True

The system Ax=b is consistent if and only if b can be expressed as a linear combination of the columns of A, where the coefficients of the linear combination are a solution of the system.

False. (AB)^T=B^T(A^T). Matrix multiplication is not commutative.

The transpose of the product of two matrices equals the product of their transposes; that is, (AB)^T = A^T(B^T).
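
A quick check of the reverse-order rule (numpy sketch; A and B are arbitrary examples):

    import numpy as np

    A = np.array([[1.0, 2.0],
                  [3.0, 4.0]])
    B = np.array([[0.0, 1.0],
                  [5.0, 6.0]])
    print(np.array_equal((A @ B).T, B.T @ A.T))   # True: (AB)^T = B^T A^T
    print(np.array_equal((A @ B).T, A.T @ B.T))   # False in general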

True. Theorem 2.6 on page 57

The transpose of the sum of matrices is equal to the sum of the transposes of the matrices.

True

The transpose of the sum of two matrices equals the sum of their transposes.

False. An elementary matrix results from a single elementary row operation on the identity matrix, and no elementary row operation can turn the identity matrix into the zero matrix (multiplying a row by 0 is not elementary).

The zero matrix is an elementary matrix.

False. Different choices of free variables give different, equally valid parametric representations of the same solution set.

There is only one way to parametrically represent the solution set of a linear equation.

True. By the discussion on page 112.

To find the determinant of a matrix, expand by cofactors in any row or column.

False. The determinant of a triangular matrix is the product of the entries on the main diagonal.

To find the determinant of a triangular matrix, add the entries on the main diagonal.
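
A quick check (numpy sketch; the triangular matrix is an arbitrary example):

    import numpy as np

    # Upper triangular matrix; its determinant is the product of the
    # diagonal entries, 2 * 3 * 4 = 24, not their sum.
    T = np.array([[2.0, 7.0, 1.0],
                  [0.0, 3.0, 5.0],
                  [0.0, 0.0, 4.0]])
    print(np.linalg.det(T))   # approximately 24.0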

True. By definition of column-equivalent matrices

Two matrices are column-equivalent when one matrix can be obtained by performing elementary column operations on the other.

True

Two systems of linear equations are equivalent when they have the same solution set.

True. When expanding by cofactors, you do not need to evaluate the cofactors of zero entries, because zero times its cofactor is zero.

When expanding by cofactors, you need not evaluate the cofactors of zero entries.

