Jankowski MATH 1553 True False Questions Exam 2


(A+B)^2=A^2+B^2+2AB (A and B are invertible)

False
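A quick expansion shows why: matrix multiplication distributes but does not commute, so
\[(A+B)^2 = (A+B)(A+B) = A^2 + AB + BA + B^2,\]
which equals A^2 + 2AB + B^2 only when AB = BA.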

(AB)^-1=A^-1 * B^-1 (A and B are invertible)

False
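The correct formula reverses the order of the factors, which can be checked directly:
\[(AB)(B^{-1}A^{-1}) = A(BB^{-1})A^{-1} = AA^{-1} = I,\]
so (AB)^-1 = B^-1 * A^-1, not A^-1 * B^-1 in general.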

A matrix with dimensions m×n, where m > n, has fewer rows than columns.

False

A square matrix with two identical columns can be invertible. (A is an nxn matrix.)

False

A+B is invertible. (A and B are invertible)

False

If A and B are invertible nxn matrices, then A+B is invertible and (A+B)^-1=A^-1+B^-1

False
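A standard counterexample for both statements above: take A = I and B = -I, which are both invertible, yet
\[A + B = I + (-I) = 0,\]
and the zero matrix is not invertible.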

If A has dimensions 5 ×4 and B has dimensions 4 ×3, then the 3rd row, 4th column entry of AB is obtained by multiplying the 3rd column of A by the 4th row of B.

False

If A has dimensions m ×r and B has dimensions r ×n , then AB has dimensions r × n.

False

If A is a square matrix such that A⋅A equals the 0 matrix, then A must equal the 0 matrix.

False
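For instance, a nonzero nilpotent matrix is a counterexample:
\[A = \begin{pmatrix}0 & 1\\ 0 & 0\end{pmatrix}, \qquad A \cdot A = \begin{pmatrix}0 & 0\\ 0 & 0\end{pmatrix}.\]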

If A is a square matrix, then there exists a matrix B such that AB equals the identity matrix.

False

If AB has dimensions k×p, then the number of rows of A is p.

False

If AB is defined, then BA is also defined.

False

If S is a set of linearly dependent vectors, then every vector in S can be written as a linear combination of the other vectors in S.

False

If T: R^3 --> R^2 is a linear transformation, then T is one-to-one.

False

If a set S of vectors contains fewer vectors than there are entries in the vectors, then the set must be linearly independent.

False

If a set contains fewer vectors than there are entries in the vectors, then the set is linearly independent.

False
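A small counterexample covering both versions of this statement: in R^3 the set
\[\left\{ \begin{pmatrix}1\\2\\3\end{pmatrix}, \begin{pmatrix}2\\4\\6\end{pmatrix} \right\}\]
has fewer vectors than entries, yet the second vector is twice the first, so the set is linearly dependent.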

If matrices A and B have the same dimension, then A + B = B + A is known as the Associative Property for Addition of Matrices.

False

If the linear transformation T(x)=Ax is one-to-one, then the columns of A form a linearly dependent set. (A is an nxn matrix.)

False

The ith row, jth column entry of a square matrix, where i > j, is called a diagonal entry.

False

If AB=BC and B is invertible, then A=C.

False (ABB^-1=BCB^-1 gives A=BCB^-1, which need not equal C.)
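A concrete counterexample, with one specific choice of 2x2 matrices:
\[B = \begin{pmatrix}0 & 1\\ 1 & 0\end{pmatrix}, \quad C = \begin{pmatrix}1 & 0\\ 0 & 2\end{pmatrix}, \quad A = BCB^{-1} = \begin{pmatrix}2 & 0\\ 0 & 1\end{pmatrix}.\]
Here AB = BC (both equal the matrix with rows (0, 2) and (1, 0)) and B is invertible, but A is not equal to C.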

If A is a 3 × 5 matrix and T is a transformation defined by T(x) = Ax, then the domain of T is R^3.

False (Domain is R^5)

If a set in R^n is linearly dependent, then the set contains more vectors than there are entries in each vector.

False (Ex. in R^3, [1, 2, 3] and [3, 6, 9] are linearly dependent.)

If S is a linearly dependent set, then each vector is a linear combination of the other vectors in S

False (For example, [1, 1], [2, 2], and [5, 4] are linearly dependent, but the last is not a linear combination of the first two.)

The codomain of the transformation x → Ax is the set of all linear combinations of the columns of A.

False (If A is m × n, the codomain is R^m. The original statement is describing the range.)

If A is an m × n matrix, then the range of the transformation x → Ax is R^m.

False (R^m is the codomain; the range is where we actually land.)

Every linear transformation is a matrix transformation.

False (The converse (every matrix transformation is a linear transformation) is true, however.)

If A is an mxn matrix and B is an nxp matrix, then each column of AB is a linear combination of the columns of A.

True (Each column of AB is A times the corresponding column of B, so it is a linear combination of the columns of A with weights taken from that column of B.)

The columns of matrix A are linearly independent if the equation Ax=0 has the trivial solution.

False (The trivial solution is always a solution.)

If the equation Ax=0 has the trivial solution, then the columns of A span R^n. (A is an nxn matrix.)

False (every equation Ax=0 has the trivial solution; that does not mean it has ONLY the trivial solution, so the columns of A need not span R^n.)

(AB)^-1=B^-1 * A^-1 (A and B are invertible)

True

(I-A)(I+A)=I-A^2 (I= identity matrix) (A is invertible)

True
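The expansion works even though matrix multiplication is not commutative, because the identity commutes with every matrix (invertibility of A is not actually needed):
\[(I-A)(I+A) = I + A - A - A^2 = I - A^2.\]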

A linear transformation preserves the operations of vector addition and scalar multiplication.

True

A^7 is invertible. (A is invertible)

True
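The inverse can be written down explicitly; cancelling the innermost pair A A^{-1} seven times gives
\[A^7 (A^{-1})^7 = I, \qquad \text{so } (A^7)^{-1} = (A^{-1})^7.\]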

Every matrix transformation is a linear transformation.

True

For any matrix A, 2A + 3A = 5A.

True

For any matrix A, there exists a matrix B so that A + B = 0.

True

If A and B are both square matrices such that AB equals BA equals the identity matrix, then B is the inverse matrix of A.

True

If A has dimensions m ×n and B has dimensions n × r, then AB has dimensions m × r .

True

If A is a matrix with linearly independent columns, then the equation Ax=b has a solution for all b precisely when it is a square matrix.

True
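A brief pivot count explains the "precisely when": linearly independent columns mean a pivot in every column, and a solution for every b means a pivot in every row, so
\[\text{number of pivots} = n \quad \text{and} \quad \text{number of pivots} = m \;\Longrightarrow\; m = n.\]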

If A is a matrix with more rows than columns (m>n), then the columns of A could be either linearly dependent or linearly independent.

True

If A is an nxn matrix and every vector in R^n can be written as a linear combination of the columns of A, then A is invertible.

True

If A is an nxn matrix and the equation Ax=b has at least one solution for each b in R^n, then the solution is unique for each b in R^n.

True

If A is invertible, then the equation Ax=b has exactly one solution for all b in R^n. (A is an nxn matrix.)

True

If AX = B represents a system of linear equations and A^(-1) exists, then the product A^(-1)B gives the solution to the system.

True
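The worked step, multiplying both sides on the left by A^{-1}:
\[AX = B \;\Longrightarrow\; A^{-1}AX = A^{-1}B \;\Longrightarrow\; X = A^{-1}B.\]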

If A^T is row equivalent to the nxn identity matrix, then the columns of A span R^n. (A is an nxn matrix.)

True
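The chain of implications, using the Invertible Matrix Theorem:
\[A^T \sim I_n \;\Rightarrow\; A^T \text{ invertible} \;\Rightarrow\; A = (A^T)^T \text{ invertible} \;\Rightarrow\; \text{the columns of } A \text{ span } \mathbb{R}^n.\]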

If T is the linear transformation whose standard matrix has rows (1, 0, 2) and (-3, 1, 5), then the domain of T is R^3.

True

If the corresponding entries of two matrices A and B with the same dimensions are equal, then A = B.

True

If the equation Ax=0 has a nontrivial solution, then A has fewer than n pivot positions. (A is an nxn matrix.)

True

If the linear transformation T(x)=Ax is onto, then it is also one-to-one. (A is an nxn matrix.)

True

If the transpose of A is not invertible, then A is also not invertible. (A is an nxn matrix.)

True
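A direct check of the contrapositive: if A were invertible, its transpose would be too, because
\[A^T (A^{-1})^T = (A^{-1} A)^T = I^T = I.\]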

The 3rd row, 4th column entry of a matrix is below and to the right of the 2nd row, 3rd column entry.

True

The columns of a matrix with dimensions mxn, where m<n, must be linearly dependent.

True
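A short pivot-counting argument for this card: an m×n matrix has at most one pivot per row, so
\[\text{number of pivots} \le m < n = \text{number of columns},\]
and any column without a pivot gives a free variable, hence a nontrivial solution of Ax = 0 and linearly dependent columns.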

The dimension of the column space of a matrix is the number of pivot columns in the row reduced form of the matrix.

True

The dimension of the null space of a matrix is the number of free variables in the row reduced form of the matrix.

True

The product of any two invertible matrices is invertible. (A is an nxn matrix.)

True

Two vectors are linearly dependent if and only if they are collinear.

True

Two vectors are linearly dependent if and only if they lie on a line through the origin.

True (If they lie on a line through the origin, then one is a scalar multiple of the other (or one is the zero vector), so they are linearly dependent; conversely, linearly dependent vectors are scalar multiples of one another and lie on a common line through the origin.)

If x and y are linearly independent, and if z is in the Span{x, y} then {x, y, z} is linearly dependent.

True (If z is in the Span{x, y} then z is a linear combination of the other two, which can be rearranged to show linear dependence.)

A linear transformation is a special type of function.

True (Linearity Properties: (i) T(u + v) = T(u) + T(v) and (ii) T(cu) = cT(u).)

If x and y are linearly independent, and if {x, y, z} is linearly dependent, then z is in Span{x, y}.

True (Since x and y are linearly independent and {x, y, z} is linearly dependent, z must be a linear combination of x and y, and thus z is in their span.)

A transformation T is linear if and only if T(c1v1 + c2v2) = c1T(v1) + c2T(v2) for all v1 and v2 in the domain of T and for all scalars c1 and c2.

True (linearity properties)

