3A Midterm: True or False

How many rows and columns must a matrix A have in order to define a mapping from R⁴ to R⁵?

A must have 5 rows & 4 columns
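
A quick NumPy sketch of the shape requirement (the matrix of ones is purely illustrative):

  import numpy as np
  A = np.ones((5, 4))       # 5 rows, 4 columns
  x = np.ones(4)            # a vector in R⁴
  print((A @ x).shape)      # (5,) -- the image lies in R⁵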

Another notation for the column vector [-4, 3] is the row matrix [-4 3]

False: A column vector in R² and a 1x2 row matrix are different objects

For a square matrix A, vectors in Col A are orthogonal to vectors in Nul A

False: It is Row A, not Col A, whose vectors are orthogonal to vectors in Nul A

Points in the plane corresponding to [-2, 5] and [-5, 2] lie on a line through the origin

False

If S is a LD set, then each vector in S is a linear combination of the other vectors in S

False: Only at least one vector in S need be a linear combination of the others

If a set contains fewer vectors than there are entries in the vectors, then the set is LI

False

If a set in Rⁿ is LD, then the set contains more vectors than there are entries in each vector

False

Set span {u, v} is always visualized as a plane through the origin

False

When two linear transformations are performed one after another, the combined effect may not always be a linear transformation

False

(AB)^T=A^T*B^T

False: (AB)^T = B^T*A^T
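
A hedged NumPy check of the transpose rule (random 3x3 matrices chosen only for illustration):

  import numpy as np
  rng = np.random.default_rng(0)
  A, B = rng.random((3, 3)), rng.random((3, 3))
  print(np.allclose((A @ B).T, B.T @ A.T))   # True
  print(np.allclose((A @ B).T, A.T @ B.T))   # False in general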

If A & B are square and invertible then A⁻¹B⁻¹ is the inverse of AB

False: (AB)⁻¹ = B⁻¹A⁻¹
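
A similar NumPy sketch for the inverse of a product (random matrices are invertible with probability 1):

  import numpy as np
  rng = np.random.default_rng(1)
  A, B = rng.random((3, 3)), rng.random((3, 3))
  inv_AB = np.linalg.inv(A @ B)
  print(np.allclose(inv_AB, np.linalg.inv(B) @ np.linalg.inv(A)))   # True
  print(np.allclose(inv_AB, np.linalg.inv(A) @ np.linalg.inv(B)))   # False in general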

If A is a 3x2 matrix, then the transformation T(x)=Ax cannot be one-to-one

False: Can be one-to-one, but not onto

A is diagonalizable if A=PDP⁻¹ for some matrix D and some invertible matrix P

False: D should be a diagonal matrix
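
A minimal NumPy sketch of diagonalization, assuming a 2x2 example with distinct eigenvalues (so it is certainly diagonalizable):

  import numpy as np
  A = np.array([[4.0, 1.0], [0.0, 2.0]])           # distinct eigenvalues 4 and 2
  eigvals, P = np.linalg.eig(A)                    # columns of P are eigenvectors
  D = np.diag(eigvals)                             # D is the diagonal matrix of eigenvalues
  print(np.allclose(A, P @ D @ np.linalg.inv(P)))  # True: A = PDP⁻¹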

Each line in Rⁿ is a one-dimensional subspace of Rⁿ

False: A line that does not pass through the origin is not a subspace of Rⁿ

A is a 3x5 matrix and T is a transformation defined by T(x)=Ax, then domain of T is R³

False: Domain corresponds to number of columns (R⁵)

The determinant of a triangular matrix is the sum of the entries on the main diagonal

False: If A is a triangular matrix, then det A is the product of the entries on the main diagonal of A

If A is diagonalizable, A is invertible

False: Invertibility depends on 0 not being an eigenvalue; a diagonalizable matrix may or may not have 0 as an eigenvalue
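
For instance (an illustrative example, not from the original card), a diagonal matrix with a zero on its diagonal is diagonalizable but not invertible:

  import numpy as np
  A = np.diag([1.0, 0.0])          # already diagonal, hence diagonalizable
  print(np.linalg.matrix_rank(A))  # 1 < 2, so A is not invertible (0 is an eigenvalue)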

A mapping T: Rⁿ→R^m is onto if every vector in Rⁿ maps onto some vector in R^m

False: T is onto if every vector in R^m is the image of at least one vector in Rⁿ (codomain equals range)

(AB)C=(AC)B

False: Matrix multiplication not commutative

Every linear transformation is a matrix transformation

False: Matrix transformations are linear transformations

If a set S has the property that uᵢ·uⱼ=0 whenever i≠j then S is an orthonormal set

False: The set is orthogonal, but the vectors might not be unit vectors

Subspace of Rⁿ is any set H that (i) zero vector is in H, (ii) u, v, and u+v are in H, and (iii) c is a scalar and cu is in H

False: The conditions must hold for EVERY u, v in H and every scalar c

For any scalar c, magnitude of c*v = c*magnitude of v

False: Need the absolute value of c: ||cv|| = |c|*||v||

Not every orthogonal set in R^n is LI

True: An orthogonal set containing the zero vector is LD; only an orthogonal set of nonzero vectors is guaranteed to be LI

If A is an mxn matrix, range of transformation is R^m

False: R^m is only the codomain

To find eigenvalues of A, reduce A to echelon form

False: Row reduction changes eigenvalues and eigenvectors

If A and B are 3x3 and B=[b₁ b₂ b₃] then AB=[Ab₁+Ab₂+Ab₃]

False: AB = [Ab₁ Ab₂ Ab₃]; the columns are not added

Set of all solutions of a system of m homogeneous equations in n unknowns is a subspace of R^m

False: Subspace of Rⁿ

Determinant of A is the product of the pivots in any echelon form U of A, multiplied by (-1)^r where r is the number of row interchanges made during row reduction from A to U

False: This holds only if U is obtained using row replacements and interchanges alone; scaling a row changes the determinant

det(A+B)=det A + det B

False: This is true for the product

Columns of matrix A are LI if equation Ax=0 has the trivial solution

False: Ax=0 always has the trivial solution; the columns are LI only when it is the ONLY solution

If the linear transformation maps Rⁿ into Rⁿ then A has n pivots

False: "Maps Rⁿ into Rⁿ" only specifies the domain and codomain; it does not force A to have n pivots

If A is an nxn matrix then equation Ax=b has at least one solution for each b in Rⁿ

False: This requires A to be invertible (equivalently, a pivot in every row)

A is diagonalizable if and only if A has n eigenvalues, counting multiplicity

False: A must have n linearly independent eigenvectors

An nxn upper triangular matrix is invertible when every entry on its main diagonal is nonzero (each diagonal position is a pivot); equivalently, it is row equivalent to I

True

Any list of 5 real #s is a vector in R⁵

True

Columns of an invertible square matrix form a basis for Rⁿ

True

Each elementary matrix is invertible

True

Example of a linear combination of vectors v₁ and v₂ is vector 1/2(v₁)

True

Finding an eigenvector of A may be difficult, but checking it is easy

True

If A^T is not invertible then A is not invertible

True

If T: R²→R² rotates vectors about the origin through an angle, then T is a linear transformation

True

If a set B= {v₁, v₂...} is a basis for subspace H and if x=c₁v₁+c₂v₂... then c₁, c₂... are the coordinates of x relative to basis B

True

If the columns of an nxn matrix A span Rⁿ then the columns are LI

True

If det A≠0 then A is invertible

True

If distance from u to v equals distance from u to -v, u and v are orthogonal

True: ||u-v||² = ||u||² - 2(u·v) + ||v||² and ||u-(-v)||² = ||u||² + 2(u·v) + ||v||², so the two distances are equal exactly when u·v = 0

If equation Ax=0 has a nontrivial solution then A has fewer than n pivots

True

If equation Ax=0 has only the trivial solution then A is row equivalent to the nxn identity matrix

True

If the columns of an nxn matrix A are LI then the columns of A span Rⁿ

True

If the equation Ax=b has at least one solution for each b in Rⁿ (A an nxn matrix) then the solution is unique for each b

True

If there is a b in Rⁿ such that the equation Ax=b is inconsistent (A an nxn matrix) then the transformation T(x)=Ax is not one-to-one

True

If there is an nxn matrix D such that AD=I, then there is also an nxn matrix C such that CA=I

True

If x is orthogonal to every vector in subspace W, then x is in the set of all vectors orthogonal to W AKA orthogonal complement of W

True

In order for a matrix B to be inverse of A, both equations AB=I and BA=I must be true

True

Linear transformation is completely determined by its effect on the columns of the nxn identity matrix

True

Matrix A is not invertible if and only if 0 is an eigenvalue of A

True

A number c is an eigenvalue of A if and only if the equation (A-cI)x=0 has a nontrivial solution

True

Row operations don't affect linear dependence relations among columns of a matrix

True

Row replacement operation doesn't affect the determinant of a matrix

True

Second row of AB is the second row of A multiplied on the right by B

True
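
A NumPy spot-check of the row rule (illustrative random matrices; row indices are 0-based in the code):

  import numpy as np
  rng = np.random.default_rng(2)
  A, B = rng.random((3, 3)), rng.random((3, 4))
  print(np.allclose((A @ B)[1], A[1] @ B))   # True: second row of AB = (second row of A) * B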

The orthogonal projection of y onto v is the same as the orthogonal projection of y onto cv whenever c≠0

True
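
A small sketch, assuming a helper proj() written just for this check (not a NumPy built-in):

  import numpy as np
  def proj(y, v):
      # orthogonal projection of y onto the line spanned by v
      return (y @ v) / (v @ v) * v
  y = np.array([3.0, 1.0])
  v = np.array([2.0, 0.0])
  print(proj(y, v), proj(y, -5.0 * v))   # both [3. 0.]: scaling v does not change the projection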

Transpose of a sum of matrices equals the sum of their transposes

True

If u and v are nonzero vectors, then span {u, v} contains the line through u and the origin

True

Vector u results when vector u-v is added to vector v

True

Dimension of col(A) is the # of pivot columns of A

True: # pivot columns = rank(A)
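
A NumPy illustration (the matrix is constructed so that its third column is the sum of the first two):

  import numpy as np
  A = np.array([[1.0, 2.0, 3.0],
                [2.0, 4.0, 6.0],
                [1.0, 0.0, 1.0]])
  print(np.linalg.matrix_rank(A))   # 2 = number of pivot columns = dim Col(A)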

If A is an invertible square matrix then equation Ax=b is consistent for each b in Rⁿ

True: A⁻¹b=x

If v₁, v₂... are in Rⁿ then their span is the same as the column space of the matrix [v₁, v₂...]

True: Column space of an mxn matrix is a subspace of R^m

An orthogonal matrix is invertible

True: Columns LI so invertible

The cofactor expansion of det A down a column is equal to the cofactor expansion along a row

True: Determinant of a square matrix can be computed by a cofactor expansion across any row or down any column
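
A worked 3x3 check (the matrix is illustrative; cofactor expansion down the first column is compared against np.linalg.det):

  import numpy as np
  A = np.array([[2.0, 1.0, 3.0],
                [0.0, 4.0, 5.0],
                [1.0, 0.0, 6.0]])
  # expand down column 1: +a11*M11 - a21*M21 + a31*M31 (the a21 term drops out since a21 = 0)
  det_col1 = 2.0 * (4*6 - 5*0) - 0.0 * (1*6 - 3*0) + 1.0 * (1*5 - 3*4)
  print(det_col1, np.linalg.det(A))   # both 41 (up to rounding)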

Dot product of u & v - dot product of v & u equals 0

True: Dot product is commutative

Columns of any 4x5 matrix are LD

True: Five columns w/ four entries means LD
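
A quick NumPy confirmation (any 4x5 matrix works; this one is random):

  import numpy as np
  rng = np.random.default_rng(3)
  A = rng.random((4, 5))
  print(np.linalg.matrix_rank(A))   # at most 4 < 5 columns, so the columns are LD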

If columns of A are LD then det A=0

True: LD columns mean A is not invertible, and a noninvertible matrix has det A = 0

Two vectors are LD if and only if they lie on a line through the origin

True: Two vectors are LD exactly when one is a scalar multiple of the other, i.e., both lie on the same line through the origin

If R^n has a basis of eigenvectors of A then A is diagonalizable

True: Take P with the eigenvector basis as its columns and D diagonal with the corresponding eigenvalues; then A=PDP⁻¹

For an mxn matrix A, vectors in the null space of A are orthogonal to vectors in row space of A

True: Orthogonal complement of row space of A is null space of A
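
A small numeric sketch (the matrix and null-space vector are chosen by hand for illustration):

  import numpy as np
  A = np.array([[1.0, 2.0, 3.0],
                [4.0, 5.0, 6.0]])
  x = np.array([1.0, -2.0, 1.0])     # Ax = 0, so x is in Nul A
  print(A @ x)                       # [0. 0.]
  print(A[0] @ x, A[1] @ x)          # 0.0 0.0 -- each row of A is orthogonal to x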

Transformation T is linear if & only if T(c₁v₁+c₂v₂...)=c₁T(v₁)+c₂T(v₂)... for all v₁, v₂... in the domain of T and for all scalars c₁, c₂...

True: Property of transformations

Dimensions of col(A) and nul(A) add up to # of columns of A

True: Rank Theorem
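
A NumPy sketch of the Rank Theorem on a hand-built 3x4 matrix (its second row is twice the first):

  import numpy as np
  A = np.array([[1.0, 2.0, 3.0, 4.0],
                [2.0, 4.0, 6.0, 8.0],
                [0.0, 1.0, 1.0, 1.0]])
  rank = np.linalg.matrix_rank(A)          # dim Col(A)
  nullity = A.shape[1] - rank              # dim Nul(A) by the Rank Theorem
  print(rank, nullity, rank + nullity)     # 2 2 4 -- they add up to the number of columns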

If a set of p vectors spans a p-dimensional subspace H of Rⁿ, then these vectors form a basis for H

True: Basis Theorem

Linear transformation is a special type of function

True: Special properties such as T(cu+dv)=cT(u)+dT(v)

If x & y are LI & if {x, y, z} is LD, then z is in span {x,y}

True: z can be written as a linear combination of x and y, so it is in their span

If x & y are LI, and if z is in their span, then {x,y,z} is LD

True: z is a linear combination of x and y, so the set {x, y, z} is LD

For any scalar c, dot product of u & (cv)=c*dot product of u & v

True: u·(cv) = c(u·v); the dot product is linear in each argument

If the columns of an mxn matrix A are orthonormal then linear mapping preserves lengths

True: ||Ux||=||x||
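
A NumPy check, using a QR factorization only as a convenient way to get a matrix with orthonormal columns:

  import numpy as np
  rng = np.random.default_rng(4)
  U, _ = np.linalg.qr(rng.random((5, 3)))   # U is 5x3 with orthonormal columns
  x = rng.random(3)
  print(np.isclose(np.linalg.norm(U @ x), np.linalg.norm(x)))   # True: ||Ux|| = ||x||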

