Linear Algebra Exam 3 T/F and Practice Questions
a set B={v1,v2,...,vp} of vectors is said to be a BASIS for a subspace H of R^n when...
(1) H contains {v1,v2,...,vp}, (2) H = Span{v1,v2,...,vp}, (3) {v1,v2,...,vp} is linearly independent.
Some other simple criteria for linear independence we met earlier will also be useful:
(1) a set {v} containing a single non-zero vector is linearly independent, because x1v=0 only if x1=0; (2) a set {v1,v2} of two vectors is linearly dependent if and only if one vector is a scalar multiple of the other, because x1v1+x2v2=0 with (x1,x2)≠(0,0) gives v1= −(x2/x1)v2 when x1≠0, or v2= −(x1/x2)v1 when x2≠0; (3) any set {v1,...,vk−1,0,vk+1,...,vp} containing the zero vector is linearly dependent, because c1v1+...+ck−1vk−1+ck·0+ck+1vk+1+⋯+cpvp = 0 when cj=0 for j≠k but ck=1.
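The two-vector criterion is easy to test in R^2, where {v1, v2} is dependent exactly when the determinant of [v1 v2] vanishes. A minimal sketch (the helper names and test vectors are assumptions for illustration):

```python
def is_zero(v):
    """A set containing the zero vector is automatically dependent."""
    return all(x == 0 for x in v)

def two_vectors_dependent(v1, v2):
    """In R^2, {v1, v2} is dependent iff det([v1 v2]) = 0,
    i.e. one vector is a scalar multiple of the other."""
    return v1[0] * v2[1] - v1[1] * v2[0] == 0

print(two_vectors_dependent([1, 2], [2, 4]))  # multiples -> True
print(two_vectors_dependent([1, 2], [3, 1]))  # not multiples -> False
print(is_zero([0, 0]))                        # True
```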
basic theorem: for an m×n matrix A:
1) the columns of A corresponding to the pivot columns of rref([A 0]) form a basis for Col(A), 2) the dimension of Nul(A) is the number of free variables in the equation Ax=0, 3) the dimension of Col(A) is the number of basic variables in the equation Ax=0.
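A sketch of how this theorem is used in practice, assuming a small hand-rolled rref helper and an illustrative matrix A (neither is from the notes):

```python
from fractions import Fraction

def rref(M):
    """Reduced row echelon form, plus the list of pivot columns."""
    M = [[Fraction(x) for x in row] for row in M]
    rows, cols = len(M), len(M[0])
    pivots, r = [], 0
    for c in range(cols):
        pr = next((i for i in range(r, rows) if M[i][c] != 0), None)
        if pr is None:
            continue              # no pivot in this column -> free variable
        M[r], M[pr] = M[pr], M[r]
        piv = M[r][c]
        M[r] = [x / piv for x in M[r]]
        for i in range(rows):
            if i != r and M[i][c] != 0:
                f = M[i][c]
                M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        pivots.append(c)
        r += 1
    return M, pivots

A = [[1, 2, 3],
     [2, 4, 6],
     [1, 1, 1]]
_, pivots = rref(A)
col_basis = [[row[c] for row in A] for c in pivots]  # original pivot columns of A
dim_nul = len(A[0]) - len(pivots)                    # number of free variables
print(pivots, col_basis, dim_nul)  # [0, 1] [[1, 2, 1], [2, 4, 1]] 1
```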
Basis
A basis for a vector space is a sequence of vectors that form a set that is linearly independent and that spans the space.
A matrix is invertible when
A square matrix is singular if and only if its determinant is 0. So a matrix is invertible when its determinant is nonzero.
Definition of Eigenvalue and Eigenvector
Definition: a scalar λ is said to be an EIGENVALUE of an n×n matrix A if there exists a non-trivial solution x in R^n of the equation Ax=λx; such an x is said to be an EIGENVECTOR corresponding to λ.
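The definition can be checked directly; the matrix A and vector x below are an assumed toy example:

```python
def matvec(A, x):
    """Multiply a matrix (list of rows) by a vector."""
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

A = [[2, 1], [1, 2]]
x = [1, 1]
lam = 3
# Ax = (3, 3) = 3x, so lam = 3 is an eigenvalue with eigenvector x
print(matvec(A, x) == [lam * xi for xi in x])  # True
```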
v1 and v2 are linearly independent eigenvectors of an n×n matrix A then they correspond to distinct eigenvalues of A. True or False?
FALSE! For example, every nonzero vector is an eigenvector of the identity matrix In for the single eigenvalue λ=1, so e1 and e2 are linearly independent eigenvectors sharing the same eigenvalue.
Three vectors, one of which is the zero vector, make a basis for R^3. True or false?
False! No set of vectors containing the zero vector can be linearly independent. Thus no set of three vectors containing the zero vector can be a basis for R^3.
To find the eigenvalues of A, reduce A to echelon form. True or False?
False. Row operations change the eigenvalues; eigenvalues are found by solving the characteristic equation det(A−λI)=0, not by row reducing A itself.
find a basis for Col(A) and Nul(A)
Row reduce the augmented matrix [A 0] of Ax=0. The columns of A corresponding to the pivot columns of rref([A 0]) form a basis for Col(A); writing the general solution of Ax=0 in terms of the free variables gives vectors that form a basis for Nul(A).
When b1, b2, . . . , bp are vectors in R^n and H = Span{b1, b2, . . . , bp}, then {b1, b2, . . . , bp} is a basis for H. True or False?
For the set {b1, b2, . . . , bp} to be a basis for H, BOTH of the conditions (i) H = Span{b1, b2, . . . , bp}, (ii) the set is linearly independent, must be met. So the statement is false.
Find a basis for the column space of a matrix A
Find rref([A 0]); its pivot columns reveal which columns of A form a basis for Col(A), say {a1, ...}.
The dimension of the column space of A is equal to the rank of A. True or False?
TRUE!
Col (A) is ...?
The column space of an m×n matrix A, written Col(A), is the set of all linear combinations of the columns of A. If A = [a1 ⋯ an], then Col(A) = Span{a1, ..., an}.
Why is the A=PDP^−1 decomposition not unique?
The decomposition is not unique for several reasons: (1) the order of the eigenvalues λ=2,1 in D and the corresponding eigenvectors v1,v2 in P could be reversed; (2) the eigenvectors v1,v2 in P are not unique since scalar multiples of v1,v2 are again eigenvectors.
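This non-uniqueness can be checked numerically. The matrix A = [[3, 1], [0, 2]] and the 2×2 helpers below are assumptions for illustration: swapping the eigenvalue order (and the matching eigenvector columns) gives a different P and D with the same PDP^−1:

```python
from fractions import Fraction

def matmul(X, Y):
    """2x2 matrix product."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def inv2(P):
    """Inverse of a 2x2 matrix via the adjugate formula."""
    a, b = P[0]; c, d = P[1]
    det = Fraction(a * d - b * c)
    return [[d / det, -b / det], [-c / det, a / det]]

def reconstruct(P, D):
    return matmul(matmul(P, D), inv2(P))

A = [[3, 1], [0, 2]]
P1, D1 = [[1, 1], [0, -1]], [[3, 0], [0, 2]]  # eigenpairs (3,(1,0)), (2,(1,-1))
P2, D2 = [[1, 1], [-1, 0]], [[2, 0], [0, 3]]  # same pairs, opposite order
print(reconstruct(P1, D1) == A, reconstruct(P2, D2) == A)  # True True
```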
A set B = {v1, v2, . . . , vp} of vectors in R^n is always linearly dependent when p > n. True or False?
True; the n×p matrix [v1 ⋯ vp] cannot have a pivot in every column when p > n, so Ax=0 has free variables and the set is dependent.
An n×n matrix A is diagonalizable if A has n distinct eigenvalues. True or False?
True
An n×n matrix is diagonalizable if and only if geo multA(λ) = alg multA(λ) for each eigenvalue λ of A. True or False?
True
If the eigenvectors of an n×n matrix A form a basis for R^n, then A is diagonalizable. True or False?
True
The only three-dimensional subspace of R^3 is R^3 itself. True or False?
True
For each y in R^n and each subspace W of R^n, the vector y − projection of y on W is in W⊥. True or False?
True. By the Orthogonal Decomposition Theorem, y = ŷ + z where ŷ = proj_W(y) lies in W and z = y − proj_W(y) lies in W⊥.
An n×n matrix A is diagonalizable if and only if A has n linearly independent eigenvectors. True or False?
True!
An n×n matrix A is invertible if and only if detA≠0. True or False
True!
The dimension of Nul A is the number of free variables in the equation Ax = 0. True or False?
True! It would be false to say: The dimension of Nul A is the number of variables in the equation Ax = 0.
If H is a p-dimensional subspace of R^n, then a linearly independent set of p vectors in H is a basis for H. True or False?
True; Because H is a p-dimensional subspace of R^n, any linearly independent set of exactly p elements in H is automatically a basis for H, hence also spans H.
If a set of p vectors spans a p-dimensional subspace H of R^n, then these vectors form a basis for H. True or False?
True; Because H is a p-dimensional subspace of R^n, any set of p elements of H that spans H is automatically linearly independent. Also, any linearly independent set of exactly p elements in H is automatically a basis for H.
If A is a 4 × 5 matrix, then dim(Col(A)) + dim(Nul(A)) = 5 . True or False?
True; By Fundamental Theorem of Linear Algebra, for an m × n matrix A, dim(Col(A)) + dim(Nul(A)) = n.
standard basis vectors are linearly independent?
Yes; the matrix A = [e1 e2 ⋯ en] is the identity matrix In, and In x = 0 has only the trivial solution x = 0, so the standard vectors are linearly independent. They form the standard basis of R^n.
Find a basis for the eigenspace of the matrix A corresponding to given λ.
the eigenspace is Nul(A−λI), the set of all solutions of (A−λI)x=0; solve this system and write the general solution in terms of the free variables to obtain a basis.
A set of vectors is said to be linearly dependent when
a nontrivial solution of the vector equation x1v1 + x2v2 + ... + xpvp = 0 exists
consistent system
a system which has at least one solution is said to be consistent
Because two vectors v1,v2 in R^n are linearly dependent if and only if one is a scalar multiple of the other, diagonalizability is usually easy to check for 2×2 matrices:
after finding each λ, plug it back into (A−λI)x=0 to obtain eigenvectors v1 and v2, then check whether v1 and v2 are multiples of each other.
a pivot position is
a position in a matrix that, after row reduction, contains a leading 1
Diagonalization:
an n×n matrix A is said to be DIAGONALIZABLE when there exist a diagonal matrix D and an invertible matrix P such that A=PDP^−1. When the linearly independent eigenvectors {v1,v2,...,vn} of A correspond to eigenvalues λ1,λ2,...,λn (not necessarily distinct), then A=PDP^−1 with P = [v1 v2 ⋯ vn] and D = diag(λ1,λ2,...,λn), the diagonal matrix with the λi on its diagonal.
A set B={v1,v2,...,vn} of vectors is said to be an EIGENBASIS for R^n when there is ...?
an n×n matrix A such that B is a set of n linearly independent eigenvectors of A.
If B is an n×n matrix, then the homogeneous equation Bx=0 has non-trivial solutions if and only if...?
detB=0.
for a square matrix B the homogeneous equation Bx=0 has a non-trivial solution if and only if...?
detB=0. Thus a scalar λ is an eigenvalue of A if and only if λ is a solution of the scalar equation det(A−λI)=0, an eigenvector for A corresponding to λ is any non-trivial solution of the homogeneous matrix equation (A−λI)x=0.
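For a 2×2 matrix the characteristic equation det(A−λI)=0 is just the quadratic λ² − tr(A)·λ + det(A) = 0, so its roots are the eigenvalues. A sketch under the assumptions that the matrix is 2×2 and its eigenvalues are real (the example matrix is assumed):

```python
import math

def eig2(A):
    """Eigenvalues of a 2x2 matrix with real eigenvalues,
    via the characteristic quadratic."""
    tr = A[0][0] + A[1][1]
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    disc = tr * tr - 4 * det      # assumed >= 0 (real eigenvalues)
    s = math.sqrt(disc)
    return sorted([(tr - s) / 2, (tr + s) / 2])

print(eig2([[2, 1], [1, 2]]))  # [1.0, 3.0]
```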
det[AB]=...? Where A and B are arbitrary n×n matrices
det[A]det[B].
det[A^−1]=...? when A is an invertible n×n matrix
det[A^−1] = (det[A])^−1 = 1/det[A].
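A quick numeric check of the two determinant facts above, on assumed 2×2 examples:

```python
from fractions import Fraction

def det2(M):
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

def matmul2(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def inv2(M):
    d = Fraction(det2(M))
    return [[M[1][1] / d, -M[0][1] / d], [-M[1][0] / d, M[0][0] / d]]

A = [[1, 2], [3, 5]]   # det = -1
B = [[2, 0], [1, 4]]   # det = 8
print(det2(matmul2(A, B)) == det2(A) * det2(B))  # det(AB) = det(A)det(B)
print(det2(inv2(A)) == Fraction(1, det2(A)))     # det(A^-1) = 1/det(A)
```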
Fundamental Theorem of Linear Algebra: Part I.
for an m×n matrix A: dim(Col(A))+dim(Nul(A)) = n.
inconsistent system
a system which has no solutions is said to be inconsistent
non-trivial solutions exists means...
for the homogeneous equation Ax=0, a non-trivial solution exists exactly when there is at least one free variable (a column of A without a pivot); in that case there are infinitely many solutions and the columns of A are linearly dependent.
subspace of R^n is...
a subset H of R^n that contains the zero vector and is closed under vector addition and scalar multiplication; it is itself a vector space inside R^n.
row reduction is ...?
the process of using elementary row operations to reduce a matrix to reduced row echelon form (rref).
The null space of an m x n matrix A, written as Nul A, is the...?
set of all solutions to the homogeneous equation Ax = 0. Writing the general solution in terms of any free variables (say s and t) expresses the null space as a span, e.g. Nul(A) = Span{v1, v2}.
the Algebraic Multiplicity of λ is ...?
the Algebraic Multiplicity of λ, denoted by alg multA(λ), is the number of times λ is repeated as a root of det(A−λI)=0: factor det(A−λI); the algebraic multiplicity of an eigenvalue α is the number of times the factor (λ−α) appears.
the geometric multiplicity of λ is ...?
the Geometric Multiplicity of λ, denoted by geo multA(λ), is the dimension of Nul(A−λI). compute rref(A−λI), then the Geometric Multiplicity is the number of free variables.
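A minimal sketch comparing the two multiplicities on an assumed example, the shear matrix [[2, 1], [0, 2]]: its characteristic polynomial is (λ−2)², so alg multA(2)=2, but Nul(A−2I) is one-dimensional, so geo multA(2)=1 and A is not diagonalizable:

```python
def rank2(M):
    """Rank of a 2x2 matrix, case by case."""
    a, b = M[0]; c, d = M[1]
    if a * d - b * c != 0:
        return 2
    return 0 if (a, b, c, d) == (0, 0, 0, 0) else 1

A = [[2, 1], [0, 2]]
lam = 2
B = [[A[0][0] - lam, A[0][1]],
     [A[1][0], A[1][1] - lam]]   # A - λI = [[0, 1], [0, 0]]
geo_mult = 2 - rank2(B)          # dim Nul(A - λI) = n - rank
print(geo_mult)                  # 1, which is < alg mult 2
```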
Span
the span of vectors v1, v2, ... , vn is the set of all linear combinations c1v1 + c2v2 + ... + cnvn; it is a subspace, and hence a vector space.
A set of vectors: {v1, v2, v3, ... vn} is said to be linearly independent when ...
the vector equation x1v1 + x2v2 + ... + xnvn = 0 (where the x's are scalars and the v's are vectors) has only the trivial solution; equivalently, the matrix equation Ax = 0 with A = [v1 ⋯ vn] has only the trivial solution.
the trivial solution means...
the zero vector x = 0 is a solution (it always is, for the homogeneous equation Ax = 0)
the solution of the initial value problem u′ = Au is...?
u(t) = c1·e^(λ1·t)·v1 + c2·e^(λ2·t)·v2, where (λ1, v1) and (λ2, v2) are eigenpairs of A and the constants c1, c2 are determined by the initial condition u(0).
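A sanity check of this formula, with an assumed matrix A = [[2, 1], [1, 2]] and its eigenpairs (3, (1, 1)) and (1, (1, −1)): u′(t) equals A·u(t) because each term differentiates to ci·λi·e^(λi·t)·vi and A·vi = λi·vi:

```python
import math

A = [[2, 1], [1, 2]]
pairs = [(3, [1, 1]), (1, [1, -1])]   # (eigenvalue, eigenvector)
c = [2, -1]                            # constants fixed by u(0)

def u(t):
    return [sum(ci * math.exp(l * t) * v[k] for ci, (l, v) in zip(c, pairs))
            for k in range(2)]

def du(t):
    """Exact derivative of the formula for u(t)."""
    return [sum(ci * l * math.exp(l * t) * v[k] for ci, (l, v) in zip(c, pairs))
            for k in range(2)]

t = 0.7
Au = [sum(A[i][j] * u(t)[j] for j in range(2)) for i in range(2)]
print(all(abs(a - b) < 1e-9 for a, b in zip(Au, du(t))))  # True: u' = Au
```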
Basis Property:
when a set B={v1,v2,...,vp} of vectors is a BASIS for a subspace H of R^n, then each x in H has a UNIQUE representation: x = c1v1+c2v2+...+cpvp in terms of the basis vectors.