Linear Algebra Lecture Questions
Let A=(a_{i,j}) be an m×n matrix and B=(b_{i,j}) be an n×p matrix. By rows ("third way"), AB is defined by
row_i(AB) = row_i(A) B
Given an invertible n×n matrix A, which of the following computations will reveal A^-1 ? (recall: "rref" means the result of performing all of Gauss-Jordan elimination, not merely Gaussian elimination)
rref([A | I_n]); the right half of the result is A^-1
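As a sketch of why this works, here is a minimal Gauss-Jordan routine (assuming numpy; the example matrix and partial pivoting are illustrative choices, not part of the card):

```python
import numpy as np

def inverse_via_rref(A):
    """Run Gauss-Jordan on the augmented matrix [A | I]; the right
    half of the rref is A^-1.  Minimal sketch with partial pivoting."""
    n = A.shape[0]
    M = np.hstack([A.astype(float), np.eye(n)])
    for j in range(n):
        p = j + np.argmax(np.abs(M[j:, j]))   # partial pivoting
        M[[j, p]] = M[[p, j]]
        M[j] /= M[j, j]                        # scale pivot row to 1
        for i in range(n):
            if i != j:
                M[i] -= M[i, j] * M[j]         # clear the rest of column j
    return M[:, n:]

A = np.array([[2.0, 1.0], [5.0, 3.0]])
print(inverse_via_rref(A))   # [[ 3. -1.], [-5.  2.]]
```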
True or False: For any matrix A, we always have that A^T A is symmetric
True
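A quick numeric spot-check of this fact (numpy assumed; the random 4×3 matrix is just an example, and note A need not be square):

```python
import numpy as np

# For any m×n matrix A, A^T A is n×n and symmetric,
# since (A^T A)^T = A^T (A^T)^T = A^T A.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))
S = A.T @ A
print(np.allclose(S, S.T))   # True
```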
True or False: We cannot perform the LU-factorization on a matrix A if elimination on A requires row swaps
True
Suppose v and w are not collinear and u=cv+dw is a linear combination of v and w. Then
u is on the same plane as v and w
When performing elimination on [A|b], what is the next (2nd) row operation?
Row3 := Row3 - 2*Row2
If S is a symmetric matrix, then
S is square
A matrix S is symmetric if
S^T = S
For the next three questions, suppose we are trying to solve Ax=b. Select the correct order of the algorithm. Step 1:
Solve for c in Lc = b
For the next three questions, suppose we are trying to solve Ax=b. Select the correct order of the algorithm. Step 2:
Solve for x in Ux = c
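The three steps (factor, forward-substitute, back-substitute) can be sketched as follows; numpy is assumed, and the L, U, b values are a small hand-made example, with A = LU taken as already computed for Step 0:

```python
import numpy as np

def forward_sub(L, b):
    """Step 1: solve Lc = b, L lower triangular."""
    n = len(b)
    c = np.zeros(n)
    for i in range(n):
        c[i] = (b[i] - L[i, :i] @ c[:i]) / L[i, i]
    return c

def back_sub(U, c):
    """Step 2: solve Ux = c, U upper triangular."""
    n = len(c)
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        x[i] = (c[i] - U[i, i+1:] @ x[i+1:]) / U[i, i]
    return x

# Step 0 (precomputed by hand for this example): A = LU
L = np.array([[1.0, 0.0], [3.0, 1.0]])
U = np.array([[2.0, 1.0], [0.0, 4.0]])
b = np.array([4.0, 20.0])

x = back_sub(U, forward_sub(L, b))
print(x)   # [1. 2.]
```

Changing b only reruns the two cheap triangular solves, which is the point of keeping L and U around.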
Suppose we have a system of three linear equations with three unknowns. Such a system will look like
a_{1,1}x_1 + a_{1,2}x_2 + a_{1,3}x_3 = b_1
a_{2,1}x_1 + a_{2,2}x_2 + a_{2,3}x_3 = b_2
a_{3,1}x_1 + a_{3,2}x_2 + a_{3,3}x_3 = b_3
This system has a solution if these planes intersect. If we instead view Ax=b as ∑_{i=1}^{3} x_i a_i = b, where a_i = col_i(A), then we have a solution if
b can be written as a linear combination of the columns of A
Suppose the columns of a 3×3 matrix M are not coplanar. This means Mx=b will have a solution if
b is any vector in R^3 (i.e. all right-hand side vectors yield a solution)
Geometrically, the columns of A are coplanar. This means Ax=b will only have a solution if
b is on the same plane as the columns of A
Suppose vectors u, v, and w are coplanar. Let b=cv+dw+eu be some linear combination of these vectors. Then
b is on the same plane as u, v, and w
Let A=(a_{i,j}) be an m×n matrix and B=(b_{i,j}) be an n×p matrix. By columns ("second way"), AB is defined by
col_j(AB) = A col_j(B)
Let A=(a_{i,j}) and B=(b_{i,j}) be m×n matrices. Then A+B is defined by
(A+B)_{i,j} = a_{i,j} + b_{i,j}
Suppose A and B are n×n invertible matrices. Then
(AB)^-1 = B^-1 A^-1
Given matrices Am×n and Bn×p, we have that
(AB)^T = B^T A^T
Let A=(a_{i,j}) be an m×n matrix and B=(b_{i,j}) be an n×p matrix. By entries ("first way"), AB is defined by
(AB)_{i,j} = row_i(A) · col_j(B)
Suppose A is invertible. Then
(A^T)^-1 = (A^-1)^T
Let A=(a_{i,j}) be an m×n matrix and c a scalar. Then cA is defined by
(cA)_{i,j} = c · a_{i,j}
Why would we solve Ax=b using the LU-factorization?
1) If we do not know the right-hand side ahead of time, this still allows us to do most of the computation
2) If we change the right-hand side, then we do not have to redo most of the computation
3) It is at least as efficient as Gaussian elimination
For a 2×2 system, elimination fails, i.e. results in a pivot of 0, if
1) The column vectors of the coefficient matrix are collinear
2) The corresponding lines are parallel
Given a 2×2 system for which elimination fails, there will still be a solution if
1) The original right-hand side vector is on the same line as the column vectors of the coefficient matrix
2) The corresponding lines are equal
Given a matrix A, which of the following can be used to determine A^T ?
1) row_i(A^T) = col_i(A)
2) (A^T)_{i,j} = A_{j,i}
3) Mirror the entries of A over its main diagonal (i.e. do the hand-flipping dance move)
4) col_j(A^T) = row_j(A)
Which of the following are valid definitions of Ax?
1) The linear combination of the columns of A using the entries of x as the coefficients
2) The vector whose entries are the dot products of the rows of A with the vector x
Let A be an invertible matrix and A−1 its inverse. Then ...
1) A^-1 A = I
2) A A^-1 = I
Consider the 4×4 matrix P_{3,4}. Multiplication by this matrix performs the row operation "swap row_3(A) and row_4(A)". This matrix can be determined by
Apply the row operation to I_4; the resulting matrix is P_{3,4}:
P_{3,4} =
[1 0 0 0
 0 1 0 0
 0 0 0 1
 0 0 1 0]
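The "apply the operation to I_4" recipe is easy to check numerically (numpy assumed; the 4×4 test matrix is arbitrary):

```python
import numpy as np

# Build P_{3,4} by applying "swap rows 3 and 4" to I_4
# (rows 2 and 3 in numpy's 0-indexing).
P34 = np.eye(4)
P34[[2, 3]] = P34[[3, 2]]

A = np.arange(16.0).reshape(4, 4)
# Left-multiplying by P34 swaps rows 3 and 4 of A.
print(P34 @ A)
```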
Let c_1, ..., c_n be scalars. Then ∑_{i=1}^{n} c_i v_i will be a nontrivial linear combination of the vectors v_i if
At least one coefficient c_k is not 0
For the next three questions, suppose we are trying to solve Ax=b. Select the correct order of the algorithm. Step 0:
Compute A = LU
For now, we will consider a matrix to be
A grid of numbers
Suppose vectors u, v, and w are not coplanar, and suppose they are vectors in R^5. Then their set of linear combinations is
A hyperplane
If A is an invertible matrix, then
A is square
For now, we will consider a vector to be
A list of numbers
Suppose v and w are vectors. Then the set of all linear combinations of v and w is
A plane if they are not collinear
For now, we will consider a scalar to be
A single number
Let A be a 3×3 matrix and x ∈ R^3. Then Ax is
A vector in R^3
Let A=(a_{i,j}) be an m×n matrix and B=(b_{i,j}) be an n×p matrix. By outer products ("fourth way"), AB is defined by
AB = ∑_{k=1}^{n} col_k(A) row_k(B)
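All four definitions of AB from these cards produce the same matrix; a sketch verifying that (numpy assumed, random sizes chosen for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 4))   # m×n
B = rng.standard_normal((4, 2))   # n×p
m, n = A.shape
p = B.shape[1]

# First way, by entries: (AB)_{i,j} = row_i(A) · col_j(B)
entries = np.array([[A[i] @ B[:, j] for j in range(p)] for i in range(m)])

# Second way, by columns: col_j(AB) = A col_j(B)
columns = np.column_stack([A @ B[:, j] for j in range(p)])

# Third way, by rows: row_i(AB) = row_i(A) B
rows = np.vstack([A[i] @ B for i in range(m)])

# Fourth way, by outer products: AB = sum_k col_k(A) row_k(B)
outers = sum(np.outer(A[:, k], B[k]) for k in range(n))

print(np.allclose(entries, A @ B), np.allclose(columns, A @ B),
      np.allclose(rows, A @ B), np.allclose(outers, A @ B))
```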
Suppose vectors u, v, and w are not coplanar. Suppose these vectors are vectors in R3. Then their set of linear combinations is
All of R^3
The trivial linear combination of the vectors v1,...,vn is
Always equal to the zero vector
When we perform elimination on a system, we produce
An easier-to-solve system with the same solutions as the original system
Geometrically, a solution to a system of linear equations is
An intersection point of the lines/planes/etc represented by the linear equations
Consider the 4×4 matrix D_3. Multiplication by this matrix performs the row operation "scale row_3(A) by c". This matrix can be determined by
Apply the row operation to I_4; the resulting matrix is D_3:
D_3 =
[1 0 0 0
 0 1 0 0
 0 0 c 0
 0 0 0 1]
Consider the 4×4 matrix E_{3,1}. Multiplication by this matrix performs the row operation row_3(A) := row_3(A) − ℓ·row_1(A). This matrix can be determined by
Apply the row operation to I_4; the resulting matrix is E_{3,1}:
E_{3,1} =
[ 1 0 0 0
  0 1 0 0
 −ℓ 0 1 0
  0 0 0 1]
Suppose we perform three row operations on a matrix A. Suppose the first operation is given by the elimination matrix E1. Suppose the second operation is given by the elimination matrix E2. Suppose the third operation is given by the elimination matrix E3. Which matrix multiplication produces the result of applying these row operations to A?
E_3 E_2 E_1 A
Suppose E is an elimination matrix. Then to perform the corresponding row operations to a matrix A, we can compute
EA
Suppose we perform Gaussian elimination (not Gauss-Jordan) on a matrix A to produce the upper triangular matrix U. Those row operations can be represented with a matrix E. Then
EA = U
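A small numeric sketch of EA = U, with E built as a product of elimination matrices (numpy assumed; the 3×3 matrix and multipliers are illustrative values, not from the cards):

```python
import numpy as np

A = np.array([[2.0, 1.0, 1.0],
              [4.0, 3.0, 3.0],
              [8.0, 7.0, 9.0]])

def elim(i, j, ell, n=3):
    """Elimination matrix for row_i := row_i - ell * row_j (0-indexed)."""
    E = np.eye(n)
    E[i, j] = -ell
    return E

E21 = elim(1, 0, 2.0)   # multiplier l_{2,1} = 2
E31 = elim(2, 0, 4.0)   # multiplier l_{3,1} = 4
E32 = elim(2, 1, 3.0)   # multiplier l_{3,2} = 3

# The first operation applied sits rightmost in the product.
U = E32 @ E31 @ E21 @ A
print(U)   # upper triangular
```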
If there does not exist a nontrivial linear combination of u, v, and w that equals the zero vector, then
Every b ∈ R^3 can be written as a linear combination of u, v, and w
If defined, we have that AB=BA, i.e. matrix multiplication is commutative
False
In this class, we will consider a line to be 1-dimensional and a plane to be 2-dimensional. We will call the 3-dimensional analogue of these a
Hyperplane
A system is singular if
It does not have a unique solution
A system is nonsingular if
It has a unique solution
Suppose w, x, y, and z are vectors in R^4. Suppose w, x, y, and z are not coplanar. Then
It is possible that two of them are on the same line
A square matrix is invertible iff
Its columns are linearly independent
Suppose we perform Gaussian elimination (not Gauss-Jordan) on a matrix A and we have kept note of the multipliers ℓ_{i,j}. When calculating the factorization A=LU, we can compute L by
L is lower triangular, with 1's on the diagonal and the multipliers ℓ_{i,j} below the diagonal
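A minimal sketch of this, assuming numpy and no row swaps are needed (the 2×2 example values are for illustration):

```python
import numpy as np

def lu_no_swaps(A):
    """LU without row swaps: U is the result of elimination; L holds
    1's on the diagonal and the multipliers l_{i,j} below it."""
    n = A.shape[0]
    U = A.astype(float).copy()
    L = np.eye(n)
    for j in range(n - 1):
        for i in range(j + 1, n):
            L[i, j] = U[i, j] / U[j, j]   # record the multiplier
            U[i] -= L[i, j] * U[j]        # row_i := row_i - l * row_j
    return L, U

A = np.array([[2.0, 1.0], [6.0, 8.0]])
L, U = lu_no_swaps(A)
print(L)   # [[1. 0.], [3. 1.]]
print(U)   # [[2. 1.], [0. 5.]]
```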
Let v and w be vectors. Let c and d be scalars. Then cv+dw is a
Linear combination of v and w; it is a vector
Suppose u, v, and w are vectors. If u, v, and w are not coplanar, then
No two of them are on the same line
A nontrivial linear combination of the vectors v1,...,vn is
Not always equal to the zero vector
If there exists a nontrivial linear combination of u, v, and w that equals the zero vector, then (3)
Not every b ∈ R^3 can be written as a linear combination of u, v, and w
When computing the factorization PA=LU, we perform the LU-factorization to the matrix
PA
For a 3×3 system, elimination fails if
The column vectors of the coefficient matrix are coplanar
Let a_1 = [1, 2, 2], a_2 = [−2, −1, 1], a_3 = [−1, 1, 3]. Let c_1, c_2, c_3 ∈ R. Then the linear combination ∑_{i=1}^{3} c_i a_i looks like
The left-hand side of a system of linear equations
When computing the factorization PA=LU
The matrix PA is the result of matrix A having row swaps performed; the matrix P is the permutation matrix that represents those row swaps
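A tiny numeric sketch of the PA = LU idea (numpy assumed; the matrix is chosen so one swap fixes the zero pivot, after which the LU step is trivial):

```python
import numpy as np

# A needs a row swap before elimination can start (pivot 0 in position 1,1).
A = np.array([[0.0, 1.0], [2.0, 3.0]])
P = np.array([[0.0, 1.0], [1.0, 0.0]])   # permutation: swap rows 1 and 2

PA = P @ A   # [[2, 3], [0, 1]] -- no more zero pivots
# LU-factor PA; here PA is already upper triangular, so L = I and U = PA.
L = np.eye(2)
U = PA
print(np.allclose(P @ A, L @ U))   # True
```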
Let E be an elimination matrix. Then the product EA produces
The matrix that results from performing the corresponding row operations to A
Given a 3×3 system for which elimination fails, there will still be a solution if
The original right-hand side vector is on the same plane as the column vectors of the coefficient matrix
If there does not exist a nontrivial linear combination of u, v, and w that equals the zero vector, then (3)
The set of all linear combinations of u, v, and w is all of R^3
If there exists a nontrivial linear combination of u, v, and w that equals the zero vector, then (2)
The set of all linear combinations of u, v, and w is not all of R^3
To say "the system has a unique solution" means
The system has exactly one solution
To say "the system does not have a unique solution" means
The system has more than one solution OR no solutions
0v_1 + 0v_2 + ⋯ + 0v_n is
The trivial linear combination of the vectors v_i
Let A be an n×n invertible matrix and b ∈ R^n. Consider the equation Ax=b. Then
There is only one solution, x = A^-1 b, and every such b produces a solution
Suppose u, v, and w are vectors. If u, v, and w are coplanar, then
They are all on the same plane
Suppose v and w are vectors. If v and w are collinear, then
They are on the same line
Suppose we have a system of three linear equations with three unknowns. Geometrically, such a system represents
Three planes in R^3, possibly intersecting
If defined, we have that A(BC)=(AB)C, i.e. matrix multiplication is associative
True
Suppose we solve Ax=b by applying elimination to [A|b]. This produces [U|c]. True or False: the "c" in [U|c] is equal to the "c" from the algorithm described in the previous three questions.
True
Suppose we have a system of two linear equations with two unknowns. Geometrically, such a system represents
Two lines in R^2, possibly intersecting
Suppose we perform Gaussian elimination (not Gauss-Jordan) on a matrix A and we have kept note of the multipliers ℓ_{i,j}. When calculating the factorization A=LU, we can compute U by
U is the result of performing elimination on A
Consider a 2×2 matrix [a b; c d]. Then
[a b; c d]^-1 = (1/(ad − bc)) · [d −b; −c a]
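The ad − bc formula is easy to sanity-check against a library inverse (numpy assumed; the 2×2 values are arbitrary):

```python
import numpy as np

def inv2x2(a, b, c, d):
    """Inverse of [[a, b], [c, d]] via the (1/(ad-bc)) [d -b; -c a] formula."""
    det = a * d - b * c
    if det == 0:
        raise ValueError("matrix is singular")
    return (1.0 / det) * np.array([[d, -b], [-c, a]])

M = np.array([[1.0, 2.0], [3.0, 4.0]])
print(inv2x2(1, 2, 3, 4))                               # [[-2. 1.], [1.5 -0.5]]
print(np.allclose(inv2x2(1, 2, 3, 4) @ M, np.eye(2)))   # True
```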
Write an identity matrix
[1 0 0
 0 1 0
 0 0 1]
Let A be an m×n matrix and c a scalar. Then cA is defined iff
Always
We can think of "multiply by A" as a function f(x)=Ax where
f: R^3 → R^3 (assuming A is a 3×3 matrix)
Let A be an m×n matrix and B be a p×q matrix. Then A+B is defined iff
m = p and n = q
Let A be an m×n matrix and B be a p×q matrix. Then assuming it is defined, the size of A+B is
m × n
Let A be an m×n matrix and c a scalar. Then the size of cA is
m × n
Let A=(a_{i,j}) be an m×n matrix and B=(b_{i,j}) be an n×p matrix. The size of AB is
m × p
Let A be an m×n matrix and B be a p×q matrix. Then AB is defined iff
n = p
Suppose v and w are not collinear and u is on the same plane as v and w. Then
u is some linear combination of v and w, i.e. u = cv + dw for some scalars c and d
If there exists a nontrivial linear combination of u, v, and w that equals the zero vector, then
u, v, and w are coplanar
If there does not exist a nontrivial linear combination of u, v, and w that equals the zero vector, then (2)
u, v, and w are not coplanar
Let r_1 = [1, −2, −1], r_2 = [2, −1, 1], r_3 = [2, 1, 3]. Let c = [c_1, c_2, c_3] ∈ R^3 be a vector. Define v_1 = r_1·c, v_2 = r_2·c, v_3 = r_3·c. Let v = [v_1, v_2, v_3]. Compute v.
v = [c_1 − 2c_2 − c_3,  2c_1 − c_2 + c_3,  2c_1 + c_2 + 3c_3]
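This "dot product per row" computation is exactly Ax for the matrix whose rows are r_1, r_2, r_3; a numeric check (numpy assumed, sample values chosen for c):

```python
import numpy as np

# Rows r1, r2, r3 from the question stacked into a matrix.
A = np.array([[1.0, -2.0, -1.0],
              [2.0, -1.0,  1.0],
              [2.0,  1.0,  3.0]])
c = np.array([1.0, 2.0, 3.0])   # sample values for c1, c2, c3

v = A @ c                        # v_i = r_i · c, row by row
formula = np.array([c[0] - 2*c[1] - c[2],
                    2*c[0] - c[1] + c[2],
                    2*c[0] + c[1] + 3*c[2]])
print(np.allclose(v, formula))   # True
```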