MATH 1554 - Linear Algebra True or False Questions

(T/F) A matrix that is not invertible is also called a nonsingular matrix.

False; a matrix that is not invertible is also called a singular matrix. A nonsingular matrix refers to an invertible matrix.

(T/F) Every vector in a linearly dependent set is a linear combination of the preceding vectors.

False; a vector in a linearly dependent set may fail to be a linear combination of the preceding vectors. The set is linearly dependent because at least ONE vector is a linear combination of the others, not necessarily every vector.

(T/F) The columns of a matrix A are linearly independent if the equation Ax=0 has the trivial solution.

False; Ax=0 has the trivial solution for every matrix A. The columns of A are linearly independent if the equation has no solution other than the trivial solution.

(T/F) An n x m matrix A is always linearly independent if n > m.

False; although the columns of such a matrix can be linearly independent, they need not be: even with fewer vectors than entries (n > m), the columns can be linearly dependent (e.g. if one column is a multiple of another).
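
This is easy to check numerically; a small sketch using NumPy (the 3 x 2 matrix here is an arbitrary illustrative example, not from the course):

```python
import numpy as np

# A 3x2 matrix (n = 3 rows > m = 2 columns) whose second column is a
# multiple of the first: fewer vectors than entries, yet the columns
# are linearly dependent.
A = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [3.0, 6.0]])

rank = np.linalg.matrix_rank(A)
independent = rank == A.shape[1]  # independent iff rank == number of columns
print(rank, independent)  # 1 False
```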

(T/F) Two equivalent linear systems can have different solution sets.

False; by definition, two systems are equivalent if they have the same solution set.

(T/F) If A is a n x n matrix, then the equation Ax=b has at least one solution for each b in R^n.

False; by the IMT Ax=b has at least one solution for each b in R^n if the matrix A is invertible. Otherwise, we cannot assume this is true for any A.

(T/F) If the transpose of an n x n matrix A is not invertible, A may or may not be invertible.

False; by the IMT if the transpose of A is not invertible, then none of the statements in the IMT can be true for A. So, A cannot be invertible either.

(T/F) If you have matrices A, B and C, and if AB = AC, then B = C.

False; cancellation laws do not hold for matrix multiplication. So, AB = AC does not generally imply that B = C.

(T/F) In some cases, a matrix may be row reduced to more than one matrix in reduced echelon form, using different sequences of row operations.

False; each matrix is row equivalent to one and only one reduced row echelon matrix. However, it can have many different row echelon forms.
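
The uniqueness of the reduced echelon form can be illustrated computationally (a sketch assuming SymPy is available; the matrix is an arbitrary example). Starting from the same matrix with its rows in a different order, i.e. after a different first row operation, the RREF comes out identical:

```python
from sympy import Matrix

# Two row-equivalent matrices: B is A with its rows reordered,
# which amounts to applying row interchanges first.
A = Matrix([[1, 2, 3],
            [2, 4, 7],
            [1, 1, 1]])
B = A.extract([2, 1, 0], [0, 1, 2])  # same rows, different order

rref_A, _ = A.rref()
rref_B, _ = B.rref()
print(rref_A == rref_B)  # True -- one and only one RREF
```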

(T/F) A line or plane that does not pass through the origin is a subspace.

False; for it to be a subspace, the line or plane must pass through the origin (as it must contain the zero vector).

(T/F) An n x m matrix A is always linearly independent if n < m.

False; if a matrix contains more vectors than entries in each vector, then the set is linearly dependent.
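
A quick numerical sketch using NumPy (the 2 x 3 matrix is an arbitrary example): three vectors in R^2 cannot have rank 3, so they are forced to be dependent.

```python
import numpy as np

# Three column vectors in R^2 (a 2x3 matrix, n = 2 < m = 3): the rank
# is at most 2, so the three columns must be linearly dependent.
A = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 3.0]])

rank = np.linalg.matrix_rank(A)
print(rank < A.shape[1])  # True -> dependent columns
```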

(T/F) If a set of vectors contains the zero vector, it can be either linearly independent or linearly dependent depending on what vectors make it up.

False; a set containing the zero vector is always linearly dependent: give the zero vector weight 1 and every other vector weight 0, and you have a nontrivial dependence relation equal to the zero vector.

(T/F) The matrix of a system with four equations and four variables with pivots in each column will be consistent with infinitely many solutions.

False; if a square matrix has a pivot in every column, it will be consistent with a unique solution. Having infinitely many solutions would mean that there is one or more free variables, or one or more columns without pivots.
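
The unique-solution case can be sketched with NumPy (the 4 x 4 triangular matrix is an arbitrary example with a pivot in every column):

```python
import numpy as np

# A 4x4 upper triangular matrix with nonzero diagonal entries:
# a pivot in every column, so Ax = b has exactly one solution.
A = np.array([[2.0, 1.0, 0.0, 0.0],
              [0.0, 3.0, 1.0, 0.0],
              [0.0, 0.0, 4.0, 1.0],
              [0.0, 0.0, 0.0, 5.0]])
b = np.array([1.0, 2.0, 3.0, 4.0])

x = np.linalg.solve(A, b)     # the unique solution
print(np.allclose(A @ x, b))  # True
```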

(T/F) If the columns of a n x n matrix A span R^n, then the columns are linearly dependent.

False; if the columns of A span R^n, then A is invertible. This means that A must have linearly independent columns.

(T/F) Any square matrix that is not the zero matrix is invertible.

False; if the determinant of a square matrix is 0, it is not invertible.

(T/F) If A and B are n x n and invertible, then A^-1*B^-1 is the inverse of AB.

False; the inverse of AB is the product of the inverses in reverse order, (AB)^-1 = B^-1 * A^-1 (the "socks and shoes" rule).
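
The reverse-order rule can be verified numerically (a sketch using NumPy on two arbitrary invertible 2 x 2 matrices):

```python
import numpy as np

# "Socks and shoes": (AB)^-1 equals B^-1 A^-1, not A^-1 B^-1.
A = np.array([[1.0, 2.0], [3.0, 5.0]])
B = np.array([[2.0, 1.0], [1.0, 1.0]])

inv_AB = np.linalg.inv(A @ B)
right = np.linalg.inv(B) @ np.linalg.inv(A)  # reverse order
wrong = np.linalg.inv(A) @ np.linalg.inv(B)  # same order

print(np.allclose(inv_AB, right))  # True
print(np.allclose(inv_AB, wrong))  # False
```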

(T/F) Col A is the set of all solutions of Ax=b.

False; Col A is the set of all b for which the equation Ax=b has a solution.

(T/F) If the equation Ax=b is consistent, then Col A is R^m.

False; consistency for one particular b only shows that that b is in Col A. Col A equals R^m only if Ax=b is consistent for every b in R^m.

(T/F) In general, multiplying two matrices A and B, AB is the same for BA (i.e. AB = BA).

False; matrix multiplication heavily relies on order and placement. Changing the order in which you multiply the rows and columns of two matrices will generally yield different resulting matrices.

(T/F) Two matrices are row equivalent if they have the same number of rows.

False; row equivalence means that there exists a sequence of row operations that transforms one matrix to the other.

(T/F) A system with fewer equations than unknowns can have a unique solution.

False; such a system will have more variables than equations, meaning it has at least one free variable. So, if it is consistent, it will have infinitely many solutions. Otherwise, it will be inconsistent.

(T/F) After row reducing a matrix A, the resulting row reduced columns that contain pivots form a basis for Col A.

False; the columns that contain pivots after row reducing may or may not form a basis for Col A, as this generally only stands true for the corresponding pivot columns of the original matrix A, not the row reduced form.

(T/F) The determinant of the transpose of n x n matrix A is equal to (-1) det A.

False; the determinant of A transpose is equal to the determinant of the original matrix A.
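
A quick numerical check of det(A^T) = det(A), sketched with NumPy on an arbitrary 3 x 3 example:

```python
import numpy as np

# det(A^T) = det(A) for any square matrix.
A = np.array([[1.0, 4.0, 2.0],
              [0.0, 3.0, 5.0],
              [2.0, 1.0, 6.0]])

print(np.isclose(np.linalg.det(A.T), np.linalg.det(A)))  # True
```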

(T/F) The dimension of the zero subspace {0} is equal to 1.

False; the dimension of the zero subspace is defined to be zero.

(T/F) The domain of a linear transformation T of a matrix A describes the number of rows the matrix A has.

False; for an n x m matrix A, the domain of T is R^m, where m is the number of columns of A, while the codomain is R^n, where n is the number of rows (the number of entries in each column).

(T/F) If A is invertible, then elementary row operations that reduce A to the identity also reduce A^-1 to the identity.

False; the elementary row operations that reduce A to the identity also reduce the identity to A^-1.

(T/F) R^2 is a subspace of R^3.

False; vectors in R^2 have only two entries and so are not even elements of R^3; thus R^2 cannot be a subspace of R^3.

(T/F) The homogeneous equation Ax=0 has a nontrivial solution if and only if the equation has no free variables.

False; the equation Ax=0 has a nontrivial solution if and only if it has at least one free variable.

(T/F) If the linear transformation (x) -> Ax maps R^n into R^n, then A has n pivot positions.

False; the linear transformation (x) -> Ax will always map R^n into R^n for any n x n matrix. A would only have n pivots if the transformation mapped R^n ONTO R^n.

(T/F) A n x m matrix A must have m pivot columns if its columns span R^n.

False; the matrix must have a pivot position in every row, i.e. n pivots. "A has a pivot position in every row" and "the columns of A span R^n" are equivalent statements.

(T/F) The null space of an m x n matrix is in R^m.

False; the null space is in R^n.

(T/F) Finding a parametric vector form of the solution set to a linear system is the same as solving the system.

False; the solution set of a linear system can only be expressed in parametric vector form if it has at least one solution.

(T/F) For two matrices A and B of compatible sizes, the transpose of their product, (AB)^T, is equal to the product of their transposes in the same order, (A^T)(B^T).

False; the transpose of a product is the product of the transposes in reverse order: (AB)^T = (B^T)(A^T).
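
Rectangular matrices make the order requirement obvious, since the same-order product of transposes is not even defined; a sketch using NumPy with arbitrary 2 x 3 and 3 x 4 examples:

```python
import numpy as np

# (AB)^T = B^T A^T: with A 2x3 and B 3x4, A^T B^T (3x2 times 4x3)
# is not even a valid product, but B^T A^T (4x3 times 3x2) is.
A = np.arange(6.0).reshape(2, 3)   # 2x3
B = np.arange(12.0).reshape(3, 4)  # 3x4

lhs = (A @ B).T   # 4x2
rhs = B.T @ A.T   # 4x2
print(np.allclose(lhs, rhs))  # True
```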

(T/F) The Row Reduction Algorithm only applies to augmented matrices for a linear system.

False; this algorithm applies to any matrix, whether or not it is augmented for a linear system.

(T/F) A set of vectors is linearly dependent if the vector equation v1x1 + v2x2 + ... + vnxn = 0 has only the trivial solution.

False; this defines a linearly independent set of vectors, where the vector equation set equal to zero has only the trivial solution.

(T/F) A linear transformation is considered one-to-one if its corresponding matrix A has a pivot in every row.

False; this describes a transformation that is onto. For it to be one-to-one, A will have a pivot in every column.

(T/F) Two vectors are linearly independent if and only if they lie on a line through the origin.

False; this describes linearly dependent vectors, where the vectors will lie on the same line through the origin because one is a multiple of the other.

(T/F) The null space of a matrix A is equivalent to the span of the columns of A.

False; this describes the column space of A. The null space is the set of all solutions to Ax=0.

(T/F) The solution set of a linear system involving variables (x1, ..., xn) is a list of numbers (s1, ..., sn) that makes each equation in the system a true statement when the list of numbers is subbed for the variables, respectively.

False; this describes the definition of a single solution. The solution set of a linear system may have infinitely many solutions, and it must include all possible solutions, not just one.

(T/F) A vector is an arrow in three-dimensional space.

False; this is one example of a vector, but not all vectors take this form; a vector is any element of a vector space.

(T/F) A subset H of a vector space V, is a subspace of V if the zero vector is in H.

False; this is just one of the three properties of a subspace. It must also be closed under addition and scalar multiplication.

(T/F) The determinant of a n x n matrix A is equal to the product of the diagonal entries in A.

False; this is only true if A is a triangular matrix.

(T/F) The basis for a subspace H is any set in H that spans H.

False; a set in H that spans H is a basis for H only if it is also linearly independent.

(T/F) If a set is made up of only a single vector, it is linearly independent.

False; a set consisting of a single vector is linearly independent if and only if that vector is not the zero vector. If the set consists of only the zero vector, it is linearly dependent.

(T/F) det(A+B) = det A + det B.

False; the determinant is multiplicative, not additive: det(AB) = (det A)(det B), but det(A+B) generally does not equal det A + det B.

(T/F) If det A is zero, then two rows/columns are the same, or a row or a column is zero.

False; this is the converse of what holds: if A has a row or column of zeros, or two identical rows/columns, then det A = 0. But det A = 0 can occur without either condition (e.g. when one row is the sum of two others).

(T/F) A linear transformation is onto when there is some b in R^m for which T(x)=b has no solution.

False; this would make it NOT onto, since each b would have to be the image of at least one x.

(T/F) In order for matrix B to be the inverse of A, both AB = I and BA = I must be true.

True; by definition, B is the inverse of A only when both AB = I and BA = I hold. (For square matrices, AB = I in fact forces BA = I, but the definition requires both.)

(T/F) If A is invertible, then the inverse of A^-1 is A itself.

True; (A^-1)^-1 = A by properties of invertible matrices.

(T/F) Col A is the set of vectors that can be written Ax for some x.

True; Ax gives a linear combination of the columns of A with weights x.

(T/F) For an n x n matrix A, it is possible for rank A (or dim Col A) to be equal to n.

True; according to the Rank Theorem, rank A can be at most n. If rank A = n, then dim Nul A = 0.

(T/F) A square matrix A is invertible if and only if A is row equivalent to I.

True; any sequence of elementary row operations that reduces A to I will also transform I to the inverse of A.

(T/F) A vector space is also a subspace.

True; at the very least, a vector space is a subspace of itself.

(T/F) Nul A is the kernel of the mapping x -> Ax.

True; both are the solution sets to the homogeneous equation.

(T/F) A free variable can take on any value and still yield a consistent system.

True; by definition, a free variable can be set equal to any value and provide a solution to the set. This is why consistent systems with free variables have infinitely many solutions.

(T/F) A subspace is also a vector space.

True; by definition, a subspace is a subset that satisfies the vector space properties.

(T/F) If the equation Ax=0 has a nontrivial solution, then n x n matrix A has fewer than n pivot positions.

True; by the IMT, if Ax=0 has a nontrivial solution, then A cannot be invertible. Thus, it must have fewer than n pivots.

(T/F) If for a n x n matrix A, Ax=b has at least one solution for each b in R^n, then the solution is unique for each b.

True; by the IMT if Ax=b has at least one solution for each b in R^n, then A is invertible. This also means that the solution is unique for each b.

(T/F) If there is a b in R^n such that Ax=b is inconsistent for n x n matrix A, then the associated linear transformation is not one-to-one.

True; by the IMT, if there is a b in R^n such that Ax=b is inconsistent, then Ax=b does not have a solution for each b, so A is not invertible and the transformation x -> Ax is not one-to-one.

(T/F) Each elementary matrix is invertible.

True; each elementary matrix is invertible with its inverse being the elementary matrix of the same type that transforms it back to the identity matrix.

(T/F) If two row interchanges are made in succession, then the new determinant is equal to the old one.

True; each row interchange multiplies the determinant by a factor of -1. Doing this twice multiplies it by (-1)(-1) = 1, returning the original determinant.
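
The sign-flip bookkeeping can be sketched with NumPy (the 3 x 3 matrix is an arbitrary example):

```python
import numpy as np

# Each row interchange multiplies the determinant by -1,
# so two swaps in succession restore the original value.
A = np.array([[1.0, 2.0, 0.0],
              [3.0, 1.0, 4.0],
              [0.0, 2.0, 5.0]])

B = A[[1, 0, 2], :]  # swap rows 0 and 1
C = B[[0, 2, 1], :]  # then swap rows 1 and 2

d0, d1, d2 = np.linalg.det(A), np.linalg.det(B), np.linalg.det(C)
print(np.isclose(d1, -d0))  # True: one swap flips the sign
print(np.isclose(d2, d0))   # True: two swaps restore it
```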

(T/F) A consistent linear system has one or more solutions.

True; for a system to be consistent, it must have at least one solution.

(T/F) An invertible matrix A times the inverse of A is equal to the identity matrix.

True; for an invertible matrix A, A times the inverse of A is equal to I, as well as the inverse of A times A (in reverse order).
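
Both products can be checked at once (a NumPy sketch on an arbitrary invertible 2 x 2 matrix):

```python
import numpy as np

# For invertible A, both A @ A^-1 and A^-1 @ A equal the identity.
A = np.array([[2.0, 1.0],
              [1.0, 1.0]])
I = np.eye(2)

print(np.allclose(A @ np.linalg.inv(A), I))  # True
print(np.allclose(np.linalg.inv(A) @ A, I))  # True
```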

(T/F) The null space is a subspace.

True; for an m x n matrix A, the null space is a subspace of R^n, as it satisfies all three conditions of a subspace.

(T/F) A linear transformation can only be both onto and one-to-one if its corresponding matrix A has the same number of rows as it does columns.

True; for it to be both, it would have to have pivots in every row and in every column.

(T/F) It is possible for the null space of a square n x n matrix A to be empty.

False; the null space always contains the zero vector, so it can never be empty. The smallest it can be is Nul A = {0}, which happens exactly when A is invertible (and then Col A = R^n).

(T/F) If a matrix has a pivot position in every row, the corresponding system is consistent.

True; if a column is augmented to the right to find solutions, the augmented column will not be a pivot column, meaning the system will have to be consistent.

(T/F) Every matrix transformation is a linear transformation.

True; if a transformation follows: 1) T(u+v) = T(u) + T(v); 2) T(cu) = cT(u) for all scalars c, and all u in domain of T, then it is a linear transformation.

(T/F) If Ax=b has a solution, then the solution set can be found by translating the solution set of Ax=0, using any particular solution p of Ax=b for the translation.

True; if p is any particular solution of Ax=b, then the full solution set is the set of vectors p + v, where v ranges over the solution set of Ax=0. Geometrically, it is the translate by p of the line/plane through the origin described by Ax=0.

(T/F) If none of the vectors in the set are multiples of any others in the set, we can say that it is linearly independent.

False; pairwise checks suffice only for sets of two vectors. For three or more, a set can be dependent even when no vector is a multiple of another: e.g. v3 = v1 + v2 is not a multiple of v1 or of v2, yet {v1, v2, v3} is linearly dependent. The correct criterion is that no vector is a linear combination of the others.

(T/F) A set of two vectors is linearly dependent if at least one of the vectors is a multiple of the other.

True; if one or more vectors in a set is some linear combination of the other vectors in the set, then it is linearly dependent.

(T/F) If A can be row reduced to the identity matrix, then A must be invertible.

True; if we augment [A|I] and row reduce to a form [I|B], then B is the inverse of A such that AB = I.

(T/F) For two matrices A and B, it is possible to have AB equal to the zero matrix while neither A nor B is a zero matrix.

True; in general, AB = 0 does not imply that A = 0 or B = 0, since cancellation laws fail for matrix multiplication.

(T/F) A linear transformation is one-to-one if the equation T(x)=0 has only trivial solution.

True; it is one-to-one if the columns of A are linearly independent (i.e. A has pivots in every column) because each b is the image of at most one x.

(T/F) If A is a matrix and k is a positive integer, then A^k denotes the product of k copies of A (A x A x ... A, k times).

True; raising a matrix to the power k means multiplying A by itself k times, yielding A^k.
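
NumPy exposes this directly as matrix_power; a sketch comparing it against the explicit product on an arbitrary 2 x 2 example:

```python
import numpy as np

# A^3 as the product of three copies of A, compared with
# numpy's matrix_power helper.
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])

A3 = A @ A @ A
print(np.allclose(A3, np.linalg.matrix_power(A, 3)))  # True
```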

(T/F) Every elementary row operation is reversible.

True; replacement, interchanging, and scaling are all reversible.

(T/F) If A is an invertible n x n matrix, then the equation Ax=b is consistent for each b in R^n.

True; since A is invertible, we know that x = A^-1*b. Or, we can conclude this is true using any equivalences from the IMT.

(T/F) For an n x n invertible matrix A, there exists an n x n matrix C such that CA = I and AC = I.

True; since A is square, this is the definition of an invertible matrix.

(T/F) The set {0} containing only the zero vector is a subspace.

True; since it 1) contains the zero vector; 2) is closed under addition; and 3) is closed under scalar multiplication, it is a subspace.

(T/F) The column space of an m x n matrix A is R^m.

False; the column space of an m x n matrix is a subspace of R^m, spanned by the columns of A. It equals all of R^m only when A has a pivot position in every row.

(T/F) The column space of A is the range of the mapping x -> Ax.

True; the column space of A is the span of the columns of A, which is exactly the set of all vectors of the form Ax, i.e. the range of the mapping.

(T/F) The pivot columns for a matrix A form a basis for the column space of A.

True; the columns of the original matrix A corresponding to pivot positions (found by row reducing) form a basis for Col A.

(T/F) If A = (a b;c d) and ad = bc, then A is not invertible.

True; the determinant of A = ad-bc. For any invertible matrix, the determinant cannot equal 0. If ad = bc, then ad - bc = 0, making A not invertible.

(T/F) Elementary row operations on an augmented matrix never change the solution set of the associated linear system.

True; the elementary row operations replace a system with an equivalent one. Two row equivalent matrices will have the same solution set.

(T/F) The kernel of a linear transformation is a subspace.

True; the kernel of the linear transformation x -> Ax is the set of all solutions to Ax=0, i.e. Nul A, which is a subspace.

(T/F) The null space of A is the solution set of the equation Ax=0.

True; the null space is the set of all solutions to the homogeneous equation.
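
A computational sketch (assuming SymPy is available; the 2 x 3 matrix is an arbitrary example): nullspace() returns a basis for Nul A, and every basis vector satisfies Av = 0.

```python
from sympy import Matrix

# Nul A as the solution set of Ax = 0: the second row is a multiple of
# the first, so rank 1 and two free variables.
A = Matrix([[1, 2, 3],
            [2, 4, 6]])

basis = A.nullspace()
print(len(basis))  # 2 free variables -> 2-dimensional null space
print(all(A * v == Matrix.zeros(2, 1) for v in basis))  # True
```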

(T/F) The range of a linear transformation is a subspace.

True; the range is the set of all images {T(x) | x in R^n}, which is a span. All spans are subspaces.

(T/F) The span of the columns of a matrix A is the same as the range of the linear transformation x -> Ax.

True; both consist of all linear combinations of the columns of A.

(T/F) If a vector is added to a linearly independent set of vectors and creates a new set that is linearly dependent, that the new vector is in the span of the original set of vectors.

True; the span of a set of vectors contains all possible linear combinations of those vectors. Since adding the vector made the set linearly dependent, it must be a linear combination of the others, hence in their span.

(T/F) Two fundamental questions about a linear system involve existence and uniqueness.

True; the two questions address whether the solution exists (system is consistent) and whether there is only one solution or many (unique solution).

(T/F) For Ax=b, saying that "the columns of A span R^m" means that the equation Ax=b has a solution for every b in R^m.

True; every b in R^m is then in the span of the columns of A, i.e. a linear combination of them, so Ax=b has a solution for each b.

(T/F) It is possible for a 3 x 5 coefficient matrix for a system with three pivot columns to be consistent.

True; there would be a pivot in each row of the matrix. The augmented matrix would have 6 columns and will not have a pivot in the last column, yielding a consistent system.

(T/F) If the equation Ax=0 for a n x n matrix A has only the trivial solution, then A is row equivalent to the n x n identity matrix.

True; this describes an invertible matrix. Since we know that Ax=0 has only the trivial solution, then it must also be row equivalent to I.

(T/F) The product of two invertible n x n matrices is invertible, and the inverse is the product of their inverses in reverse order.

True; this is a property for two invertible matrices of the same size.

(T/F) The equation Ax=b has a solution if and only if b is a linear combination of the columns of A.

True; Ax is by definition a linear combination of the columns of A, with the entries of x as weights, so Ax=b has a solution exactly when b is such a combination.

(T/F) A basic variable in a linear system is a variable that corresponds to a pivot column in the coefficient matrix.

True; this is the definition of a basic variable.

(T/F) A vector is any element of a vector space.

True; this is the definition of a vector.

(T/F) For a n-dimensional subspace H, any linearly independent set of n elements in H is automatically a basis for H.

True; this is the premise of the Basis Theorem for any linearly independent set of exactly p elements in a p-dimensional subspace of R^n.

(T/F) A row replacement operation does not affect the determinant of a matrix.

True; this only holds for row replacement. Swapping or scaling rows does however affect the determinant.

(T/F) If the columns of A are linearly dependent, then det A = 0.

True; a square matrix with linearly dependent columns has rank less than n, so row reduction produces at least one row of zeros. In echelon form the matrix then has a 0 on the main diagonal, making det A = 0.
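
A NumPy sketch (the 3 x 3 matrix is an arbitrary example whose third column is the sum of the first two):

```python
import numpy as np

# Dependent columns force det A = 0: here col3 = col1 + col2.
A = np.array([[1.0, 2.0, 3.0],
              [0.0, 1.0, 1.0],
              [4.0, 0.0, 4.0]])

print(np.isclose(np.linalg.det(A), 0.0))  # True
print(np.linalg.matrix_rank(A))           # 2 -- fewer pivots than columns
```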

