Linear 1 T/F
([T]^β_α)^−1 = [T^−1]^β_α
False
A vector space cannot have more than one basis.
False
A vector space may have more than one zero vector.
False
AB=I implies that A and B are invertible.
False
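Why false: a quick numeric counterexample, sketched with numpy (the specific matrices are illustrative choices, not from the original set). With non-square matrices, AB can be an identity matrix even though neither factor is invertible.

```python
import numpy as np

A = np.array([[1, 0]])       # 1x2, cannot be invertible
B = np.array([[1], [0]])     # 2x1, cannot be invertible
print(A @ B)                 # [[1]]  -- the 1x1 identity
print(B @ A)                 # [[1 0], [0 0]] -- not the 2x2 identity
```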
A^2=I implies that A=I or A=−I.
False
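Why false: a reflection is a square matrix other than ±I whose square is I (a minimal numpy sketch; the matrix is just one illustrative choice).

```python
import numpy as np

A = np.array([[1, 0], [0, -1]])   # reflection across the x-axis; neither I nor -I
print(A @ A)                      # [[1 0], [0 1]]  -- A^2 = I anyway
```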
A^2=O implies that A=O, where O denotes the zero matrix.
False
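Why false: a nonzero nilpotent matrix squares to the zero matrix (illustrative numpy sketch).

```python
import numpy as np

A = np.array([[0, 1], [0, 0]])    # nonzero
print(A @ A)                      # [[0 0], [0 0]]  -- A^2 = O although A != O
```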
An m×n matrix has m columns and n rows.
False
Every system of linear equations has a solution.
False
Every vector space has a finite basis.
False
Given x1, x2∈V and y1, y2∈W, there exists a linear transformation T:V→W such that T(x1)=y1 and T(x2)=y2.
False
If S generates the vector space V, then every vector in V can be written as a linear combination of vectors in S in only one way.
False
If S is a linearly dependent set, then each vector in S is a linear combination of other vectors in S.
False
If T is linear, then T carries linearly independent subsets of V onto linearly independent subsets of W.
False
If T is linear, then nullity(T)+rank(T)=dim(W).
False
If T(x+y)=T(x)+T(y), then T is linear.
False
If V is a vector space and W is a subset of V that is a vector space, then W is a subspace of V.
False
If f and g are polynomials of degree n, then f+g is a polynomial of degree n.
False
If m=dim(V) and n=dim(W), then [T]^γ_β is an m×n matrix.
False
In P(F), only polynomials of the same degree may be added.
False
In any vector space, ax = ay implies that x = y.
False
In any vector space, ax = bx implies that a = b.
False
In solving a system of linear equations, it is permissible to multiply an equation by any constant.
False
L(V,W)=L(W,V).
False
Let W be the xy-plane in R^3; that is, W = {(a1, a2, 0) : a1, a2 ∈ R}. Then W = R^2.
False
M_{2×3}(F) is isomorphic to F^5.
False
Subsets of linearly dependent sets are linearly dependent.
False
Suppose that β = {x1, x2, ..., xn} and β′ = {x′1, x′2, ..., x′n} are ordered bases for a vector space and Q is the change of coordinate matrix that changes β′-coordinates into β-coordinates. Then the jth column of Q is [xj]_{β′}.
False
T is one-to-one if and only if the only vector x such that T(x)=0 is x=0.
False
T = L_A for some matrix A.
False
T = L_A, where A = [T]^β_α.
False
The dimension of M_{m×n}(F) is m+n.
False
The dimension of P_n(F) is n.
False
The empty set is a subspace of every vector space.
False
The empty set is linearly dependent.
False
The intersection of any two subsets of V is a subspace of V.
False
The matrices A, B ∈ M_{n×n}(F) are called similar if B = Q^t A Q for some Q ∈ M_{n×n}(F).
False
The span of ∅ is ∅.
False
The trace of a square matrix is the product of its diagonal entries.
False
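Why false: the trace is the sum of the diagonal entries, not their product (quick numpy check on an illustrative example).

```python
import numpy as np

A = np.array([[2, 5], [7, 3]])
print(np.trace(A))          # 5  -- the sum of the diagonal entries
print(A[0, 0] * A[1, 1])    # 6  -- the product of the diagonal entries; not the trace
```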
The zero vector space has no basis.
False
[T^2]^β_α = ([T]^β_α)^2
False
[U(w)]_β = [U]^β_α [w]_β for all w ∈ W.
False
[UT]^γ_α = [T]^β_α [U]^γ_β
False
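Why false: the factors are in the wrong order; the matrix of the composite is [UT]^γ_α = [U]^γ_β [T]^β_α. A numpy sketch with rectangular matrices (illustrative shapes, assuming T: R^2 → R^3 and U: R^3 → R^2 with standard bases) shows the stated product does not even have the right size.

```python
import numpy as np

A = np.array([[1, 2], [3, 4], [5, 6]])   # [T]^beta_alpha, 3x2
B = np.array([[1, 0, 1], [0, 1, 1]])     # [U]^gamma_beta, 2x3

print((B @ A).shape)   # (2, 2) -- [U]^gamma_beta [T]^beta_alpha, the matrix of UT
print((A @ B).shape)   # (3, 3) -- the order in the statement; wrong size, so it cannot equal [UT]^gamma_alpha
```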
A is invertible if and only if L_A is invertible.
True
A must be square in order to possess an inverse.
True
A nonzero scalar of F may be considered to be a polynomial in P(F) having degree zero.
True
A vector in F^n may be regarded as a matrix in M_{n×1}(F).
True
An n×n diagonal matrix can never have more than n nonzero entries.
True
Any set containing the zero vector is linearly dependent.
True
Every change of coordinate matrix is invertible.
True
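Why true: the columns of a change of coordinate matrix are the vectors of one basis written in the coordinates of the other, so they are linearly independent. A minimal numpy sketch with an illustrative basis of R^2:

```python
import numpy as np

# beta = standard basis of R^2; beta' = {(1, 1), (1, -1)} (an illustrative choice).
Q = np.array([[1.0, 1.0], [1.0, -1.0]])   # columns: the beta'-vectors in beta-coordinates

print(np.linalg.det(Q))    # -2.0, nonzero, so Q is invertible
print(np.linalg.inv(Q))    # Q^-1 changes beta-coordinates back into beta'-coordinates
```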
Every subspace of a finite-dimensional space is finite-dimensional.
True
Every vector space contains a zero vector.
True
Every vector space that is generated by a finite set has a basis.
True
For any scalar a, aT+U is a linear transformation from V to W.
True
If A is invertible, then (A^−1)^−1=A.
True
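A quick numpy sanity check on an illustrative invertible matrix (floating-point arithmetic makes the recovery approximate).

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 1.0]])       # invertible (determinant 1)
print(np.linalg.inv(np.linalg.inv(A)))       # recovers A up to rounding error
```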
If A is square and A_ij = δ_ij for all i and j, then A = I.
True
If S is a subset of a vector space V, then span(S) equals the intersection of all subspaces of V that contain S.
True
If T is linear, then T preserves sums and scalar products.
True
If T is linear, then T(0_V) = 0_W.
True
If T, U: V → W are both linear and agree on a basis for V, then T = U.
True
If V is a vector space having dimension n, and if S is a subset of V with n vectors, then S is linearly independent if and only if S spans V.
True
If V is a vector space having dimension n, then V has exactly one subspace with dimension 0 and exactly one subspace with dimension n.
True
If V is a vector space other than the zero vector space, then V contains a subspace W such that W not=V.
True
If a vector space has a finite basis, then the number of vectors in every basis is the same.
True
If a1x1+a2x2+···+anxn=0 and x1,x2,...,xn are linearly independent, then all the scalars ai are zero.
True
If f is a polynomial of degree n and c is a nonzero scalar, then cf is a polynomial of degree n.
True
In solving a system of linear equations, it is permissible to add any multiple of one equation to another.
True
L(V,W) is a vector space.
True
L_{A+B} = L_A + L_B
True
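Why true: (A + B)x = Ax + Bx for every x, so the maps L_{A+B} and L_A + L_B agree everywhere. A one-vector numpy spot check (illustrative values):

```python
import numpy as np

A = np.array([[1, 2], [3, 4]])
B = np.array([[0, 1], [1, 0]])
x = np.array([5, -2])

print((A + B) @ x)      # [-1 12]
print(A @ x + B @ x)    # [-1 12] -- L_{A+B}(x) = L_A(x) + L_B(x)
```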
Let T be a linear operator on a finite-dimensional vector space V, let β and β′ be ordered bases for V, and let Q be the change of coordinate matrix that changes β′-coordinates into β-coordinates. Then [T]_β = Q[T]_{β′}Q^−1.
True
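A minimal numpy sketch of the change-of-basis formula, assuming V = R^2 with β the standard basis and β′ = {(1, 1), (1, −1)} (illustrative choices): [T]_{β′} is computed as Q^−1[T]_β Q, and conjugating back recovers [T]_β.

```python
import numpy as np

A = np.array([[2.0, 1.0], [0.0, 3.0]])     # [T]_beta, beta = standard basis of R^2
Q = np.array([[1.0, 1.0], [1.0, -1.0]])    # columns: beta' = {(1,1), (1,-1)} in beta-coordinates

T_bp = np.linalg.inv(Q) @ A @ Q            # [T]_{beta'}
print(np.allclose(A, Q @ T_bp @ np.linalg.inv(Q)))   # True -- [T]_beta = Q [T]_{beta'} Q^-1
```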
Let T be a linear operator on a finite-dimensional vector space V. Then for any ordered bases β and γ for V, [T]_β is similar to [T]_γ.
True
P_n(F) is isomorphic to P_m(F) if and only if n = m.
True
Subsets of linearly independent sets are linearly independent.
True
Suppose that V is a finite-dimensional vector space, that S1 is a linearly independent subset of V, and that S2 is a subset of V that generates V. Then S1 cannot contain more vectors than S2.
True
T is invertible if and only if T is one-to-one and onto.
True
The zero vector is a linear combination of any nonempty set of vectors.
True
Two functions in F(S, F) are equal if and only if they have the same value at each element of S.
True
[I_V]_α = I.
True
[T(v)]_β = [T]^β_α [v]_α for all v ∈ V.
True
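A worked numpy check of the coordinate formula, assuming T: R^2 → R^3 given by T(x, y) = (x + y, x − y, 2y), α = {(1, 1), (1, −1)}, and β the standard basis of R^3 (all illustrative choices).

```python
import numpy as np

def T(v):                                  # T(x, y) = (x + y, x - y, 2y)
    x, y = v
    return np.array([x + y, x - y, 2 * y])

alpha = np.array([[1.0, 1.0], [1.0, -1.0]])               # columns: the alpha basis {(1,1), (1,-1)}
M = np.column_stack([T(alpha[:, j]) for j in range(2)])   # [T]^beta_alpha: jth column is T of the jth alpha vector

v = np.array([3.0, 1.0])
v_alpha = np.linalg.solve(alpha, v)   # [v]_alpha, the coordinates of v relative to alpha
print(M @ v_alpha)                    # [4. 2. 2.]
print(T(v))                           # [4. 2. 2.] -- equals [T(v)]_beta
```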
[T+U]^γ_β = [T]^γ_β + [U]^γ_β.
True
[T]^γ_β=[U]^γ_β implies that T=U.
True