Math 415 Final

suppose r is in R and v is in R3. If A is a 5x3 matrix and B is a 3x5 matrix, which of the following expressions are defined?

Defined: B(Av), rA, rv NOT defined: vB, A(Bv), A+B

L11 thm 3

Let A be an mxn matrix, let b be in Rm, and let xp be in Rn such that Axp = b. Then the set of solutions {x in Rn : Ax = b} is exactly xp + Nul(A), so every solution of Ax = b is of the form xp + xn where xn is some vector in Nul(A)
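A minimal numpy sketch of this structure, using a made-up 2x3 matrix A, particular solution xp, and null-space vector xn (none of these come from the course material):

```python
import numpy as np

# Hypothetical example: A is 2x3, so Nul(A) is nontrivial.
A = np.array([[1.0, 2.0, 1.0],
              [0.0, 1.0, 1.0]])
b = np.array([3.0, 1.0])

xp = np.array([1.0, 1.0, 0.0])   # one particular solution: A @ xp == b
xn = np.array([1.0, -1.0, 1.0])  # a vector in Nul(A): A @ xn == 0

print(np.allclose(A @ xp, b))    # True
print(np.allclose(A @ xn, 0))    # True

# Every xp + t*xn is again a solution of Ax = b.
for t in (0.0, 1.0, -2.5):
    print(np.allclose(A @ (xp + t * xn), b))   # True each time
```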

Existence and Uniqueness Theorem

a linear system is *consistent* (has a solution) if and only if an echelon form of the augmented matrix has NO ROW OF THE FORM [0 ... 0 | b] where b is nonzero

an elementary row operation is one of:

replacement (add a multiple of one row to another), interchange (swap two rows), scaling (multiply a row by a nonzero scalar)

L10 Subspace Def

a subspace of a vector space V is a subset H of V that has 3 properties: 1. the zero vector of V is in H 2. for each u and v in H, u + v is in H 3. for each u in H and each scalar c, cu is in H

pivot variable

a variable that corresponds to a pivot column in the coefficient matrix of a system

free variable

a variable that is not a pivot variable

REDUCED ROW ECHELON FORM (RREF)

all three echelon form conditions, and also 4. the leading entry in each nonzero row is 1 5. each leading 1 is the only nonzero entry in its column

Let V be a vector space of dimension at least 3. Let basis B = (b1, b2, ..., bn) and basis C = (c1, c2, ..., cn) be two different ordered bases of V. Which is always true? (1) if the second entry of v_B is zero, then at least one entry of v_C is zero (2) if v_B is not the zero vector, then v_C is not the zero vector (3) if c1 = b1 + b2 + b3, then {c1, b2, b3, ..., bn} is a basis for V. That is, if c1 = b1 + b2 + b3, then substituting c1 for b1 in B creates a basis for V

(1) FALSE (not always true) (2) TRUE (3) TRUE

Let V be a vector space, and let A,B be two m x n matrices. Which of these is *always* true? (1) If B is an echelon form of A, then the pivot columns of B form a basis of the column space of A (2) If V = span{v1 ...... vp} and S is a set of more than p vectors in V, then S is linearly dependent

(1) FALSE (not always true): if A = [1; 1] and its RREF is B = [1; 0], then clearly span{[1; 1]} =/= span{[1; 0]} (2) TRUE: since V = span{v1 ... vp}, we have dim V <= p. Since S contains more than p vectors of V, S has to be linearly dependent

Let A be an m x n matrix with echelon form U. Which of the following statements is true for all such A? (1) Col(A) = Col(U) (2) Nul(A) = Nul(U)

(1) False (2) True: the solutions of Ax = 0 and Ux = 0 are the same, so (2) holds. The column space is not preserved by elementary row operations in general, so (1) does not hold
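A small numpy sketch of why (2) holds and (1) fails, reusing the [1; 1] counterexample from the earlier card (rank standing in for "same solutions of Ax = 0"):

```python
import numpy as np

# The null space survives row reduction, but the column space generally does not.
A = np.array([[1.0], [1.0]])   # Col(A) = span{(1, 1)}
U = np.array([[1.0], [0.0]])   # an echelon form of A; Col(U) = span{(1, 0)}

# Same null space: both have rank 1, so Ax = 0 and Ux = 0 only have x = 0.
print(np.linalg.matrix_rank(A), np.linalg.matrix_rank(U))   # 1 1

# Different column spaces: (1, 1) and (1, 0) are not parallel (nonzero determinant).
print(np.linalg.det(np.column_stack([A[:, 0], U[:, 0]])) != 0)   # True
```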

Which of the following statements are true? (1) Every linearly independent set of vectors in a vector space V forms a basis of V (2) If v1............vn are linearly independent vectors in a vector space V, then dim(V)>= n

(1) False: e.g. {[1; 0]} is linearly independent but is not a basis of R2 (2) True: every set of linearly independent vectors can be extended to a basis of V

Consider the following 2 statements: (1) there exists a subspace V of R7 such that dim V = dim V^perp (2) if V is a subspace of R7, then the zero vector is the only vector which is in V as well as in V^perp

(1) False: because dim V + dim V^perp = 7 and there is no integer x such that x + x = 7, since 7 is odd (2) True: since the zero vector is the only vector orthogonal to itself

let A,B be nxn matrices and r is in R. which of the following are true?

TRUE: (A+B)^t = A^t + B^t, (rA)^t = rA^t NOT: (AB)^t = A^t B^t (the correct rule is (AB)^t = B^t A^t)

Let lambda be an eigenvalue of an n x n non-zero matrix A. Which of the following statements is FALSE for all such A and lambda?

FALSE (C): at least one eigenvalue of A is non-zero (e.g. [0 1; 0 0] is non-zero but all its eigenvalues are zero) TRUE: - lambda is an eigenvalue of A^T - lambda^-1 is an eigenvalue of A^-1, if A is invertible - 2*lambda is an eigenvalue of 2A - lambda^2 + 1 is an eigenvalue of A^2 + I, where I is the identity matrix

let A,B be nxn matrices and r is in R. Which of the following are true?

r(AB) = (rA)B (AB)C = A(BC) (A+B)C = AC+BC NOT: AB = BA

if A is diagonalizable

1. if a 3x3 matrix A has only 2 eigenvalues, there can be more than 2 linearly independent eigenvectors of A 2. 5A is diagonalizable 3. A has an eigenbasis 4. there is an eigenvalue of A for which the corresponding eigenspace is spanned by 2 linearly independent eigenvectors

row echelon form

1. All nonzero rows are above any rows of all zeros. 2. Each leading entry of a row is in a column to the right of the leading entry of the row above it. 3. All entries in a column below a leading entry are zeros.

1. If a system of linear equations has no free variables, then it has a unique solution 2. Every matrix is row equivalent to a unique matrix in RREF 3. if a linear system has more equations than variables, then there can never be more than one solution 4. if a linear system has more variables than equations, then there must be infinitely-many solutions

1. FALSE 2. TRUE 3. FALSE 4. FALSE

if a linear system is consistent then the solution contains either...

1. a unique solution (where there are no free variables) or 2. infinitely many solutions (where there is at least 1 free variable)

ECHELON FORM (EF)

1. all rows full of zeros should be at the bottom 2. the leading entry of a nonzero row is always STRICTLY to the right of the leading entry of the row above it 3. all entries in a column below a leading entry are zero

if invertible:

1. det(A) =/= 0 2. the reduced row echelon form of A is the identity matrix 3. Nul(A) = {0} 4. A does not necessarily have an LU decomposition 5. If A is not invertible, then the equation Ax = b is either inconsistent or has non-unique solutions (related to Nul(A)) 6. every elementary matrix is invertible 7. if A is invertible, then A^2 is also invertible

Let A be a matrix and let U be a row-echelon form of A...

1. the null space of A is equal to the null space of U: ALWAYS TRUE 2. the column space of A is equal to the column space of U: SOMETIMES TRUE 3. the row space of A is equal to the row space of U: ALWAYS TRUE

Suppose a 4 x 5 matrix A has 3 pivots. What is the dimension of Col(A)?

3 dim of Col(A) is the number of pivots

Let H be a subspace of R6 such that {v1,v2,v3,v4} is a basis of H. What is the dimension of H?

4 the dimension of a vector space H is the number of elements in a basis of H

Let A be a 7x5 matrix and B be a 5x6 matrix such that the rank of AB is 5. What is the rank of A

5. Recall that each column of AB is a linear combination of the columns of A. Hence Col(AB) is contained in Col(A). Now the rank of AB is 5, which implies that the dimension of Col(AB) is 5. So the previous inclusion implies that the dimension of Col(A) is at least 5, so the rank of A is at least 5. On the other hand, since A is a 7x5 matrix, the rank of A can be at most 5

what is the smallest possible dimension of Nul(A) for a 9x14 matrix A

5 because A only has 9 rows, it can have at most 9 pivot columns. Thus A has at least 5 non-pivot columns and thus dim Nul(A) >= 5

suppose the coefficient matrix of a linear system has 5 rows and 7 columns. if there are 3 pivots, the sum of the number of pivots and free variables is

7 --> free variables are indexed by the columns not containing pivots

L15 def transformations

A map T: V -> W is a linear transformation if T(c1v1 + c2v2) = c1 T(v1) + c2 T(v2) also... T(x+y) = T(x) + T(y) T(cx) = cT(x) T(0) = 0

L13 Def basis

A set of vectors {v1...vp} in V is a basis of V if - V = span{v1...vp} AND - the vectors v1...vp are linearly independent

suppose A, B, C are mxn matrices and r, s are in R. Which of the following are true...

A+B = B+A rA+rB = r(A+B) r(sA)=(rs)A (A+B)+C = A+(B+C)

suppose A,B are invertible nxn matrices. Which of the following are also invertible?

Invertible: AB, A^-1, A^t, A^2 NOT necessarily: A+B note: if A is not the zero matrix, A^2 can still be the zero matrix note: if A is invertible and B is not invertible, then AB is not invertible

Let W be a subspace of Rn, let P be the projection matrix of the orthogonal projection onto W, and let Q be the projection matrix of the orthogonal projection onto W_orthog. Which is always true?

ALWAYS TRUE: - P + Q = I(identity matrix) - PQ = 0 (zero-matrix) - P^2 = P
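A quick numpy check of these identities, using a hypothetical W = Col(A) in R3 and the standard projection formula P = A(A^T A)^-1 A^T (assumed here, not quoted from the notes):

```python
import numpy as np

# Hypothetical subspace W = Col(A) of R^3, with A having independent columns.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 1.0]])

# Projection matrix onto W, and onto its orthogonal complement.
P = A @ np.linalg.inv(A.T @ A) @ A.T
Q = np.eye(3) - P

print(np.allclose(P + Q, np.eye(3)))          # P + Q = I
print(np.allclose(P @ Q, np.zeros((3, 3))))   # PQ = 0
print(np.allclose(P @ P, P))                  # P^2 = P
```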

let A be an n x n matrix and let B be an m x n matrix. Which is ALWAYS TRUE?

ALWAYS TRUE: - if A^T is an orthogonal matrix, then A is an orthogonal matrix NOT NECESSARILY: - if B has orthonormal columns, then B^T has orthonormal columns

let A be an m x n matrix and b be in Rm. Which is always TRUE?

ALWAYS TRUE: - if x_ is a least-squares solution to Ax=b and y is in Nul(A), then x_ + y is a least-squares solution to Ax=b NOT NECESSARILY: - if x_1 and x_2 are least-squares solutions to Ax=b, then x_1 + x_2 is a least-squares solution to Ax=b

let A be an m x n matrix and b be in Rm. Which is always true?

ALWAYS TRUE: - let y be in Rn such that A^T A y = A^T b; then Ay is in Col(A) - The linear system Ax = b always has a least-squares solution
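A short numpy sketch of the normal equations on a made-up inconsistent 3x2 system; np.linalg.lstsq is used only as a cross-check:

```python
import numpy as np

# Made-up inconsistent system: more equations than unknowns.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([1.0, 0.0, 2.0])

# Any y solving the normal equations A^T A y = A^T b is a least-squares solution,
# and A y lies in Col(A) (it is the projection of b onto Col(A)).
y = np.linalg.solve(A.T @ A, A.T @ b)

# Cross-check against numpy's built-in least-squares solver.
y_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.allclose(y, y_lstsq))              # True

# The residual b - A y is orthogonal to Col(A).
print(np.allclose(A.T @ (b - A @ y), 0))    # True
```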

Let A be an m x n matrix, let Q be an m x n matrix with orthonormal columns and let R be an upper triangular invertible n x n matrix such that A = QR Which is ALWAYS TRUE

ALWAYS TRUE: - the columns of Q form an orthonormal basis of the column space of A - Nul(A) is equal to Nul(R) NOT NECESSARILY TRUE: - the column space of A is equal to Col(R)

Let A be an n x n matrix. Which is ALWAYS TRUE?

ALWAYS TRUE: - the matrix 3A has the same eigenvectors as A not necessarily: - the matrix 3A has the same eigenvalues as A

let A, B be two m x n matrices that are row equivalent. Which of the following is always true? (1) dim(Nul(A)) = dim(Nul(B)) (2) dim(Col(A)) = dim(Col(B))

BOTH ARE TRUE: since A and B are row equivalent, the linear systems Ax = 0 and Bx = 0 have the same solution set, i.e. Nul(A) = Nul(B). In particular, dim(Nul(A)) = dim(Nul(B)). Then we also have dim(Col(A)) = n - dim(Nul(A)) = n - dim(Nul(B)) = dim(Col(B))

Let P2 be the vector space of polynomials of degree at most 2. Which of the following subsets of P2 is linearly *dependent*?

DEPENDENT: {t^2 + t , 1 + t , t^2 + 2t + 1}, since (t^2 + t) + (1 + t) - (t^2 + 2t + 1) = 0 INDEPENDENT: (a) {t^2 + t , 1 + t , 1} (b) {1 , t , t^2} (c) {1 + t , 1 - t , t^2}

Let A be an mxn matrix and let b be a vector in Rm. 3 of the following 4 statements are equivalent to each other. which one is not

EQUIVALENT: - the system Ax=b has a solution - b can be expressed as a linear combination of the columns of A - b is in the span of the columns of A NOT EQUIVALENT: - the rref of [A|b] has no zero row

[0, ..., 0] is a vector in every vector space

FALSE

let a, b, c, d be in R and A = [a b; c d]. Then A^3 = [a^3 b^3; c^3 d^3]

FALSE

The solution set to a linear system of m equations in n variables is a subspace of Rn

FALSE Recall any subspace of a vector space must contain the zero element of the vector space

Let P2 be the vector space of polynomials of degree at most 2. The coordinate vector of 2 + t + t^2 with respect to the ordered basis (t^2 - 2t , t + 1 , t - 1) is (1, 3, 1)

FALSE The polynomial represented by the given coordinate vector is 1(t^2 - 2t) + 3(t + 1) + 1(t - 1) = t^2 + 2t + 2

if A is a n x n matrix, then Col(A) and Nul(A) have no vectors in common

FALSE both are subspaces of Rn and thus both contain the zero vector in this vector space

Let a,b,c be in Rn. If a,b are orthogonal and b,c, are orthogonal, then a,c are orthogonal

FALSE consider for example, non-zero a = c

If V has dimension n then any subset of V having fewer than n non-zero vectors is linearly independent

FALSE consider the case of parallel vectors

if m < n then any set of m vectors in Rn is linearly independent

FALSE consider, for example, the case of parallel vectors

Rn is a subspace of Rn+1

FALSE elements of Rn are not elements of Rn+1

Each vector space has exactly one basis

FALSE for example, in R2 besides the standard basis we can choose: (1, 0), (1, 1)

Any line or plane in R3 is a subspace of R3

FALSE if the line or plane does not contain the origin, then it cannot be a subspace of R3

(0 / ... / 0) is a vector in every vector space

FALSE not all vector spaces have tuples of numbers as their elements

Any orthonormal collection of vectors in Rn is a basis

FALSE recall a basis must be linearly independent and have the spanning property

Let A be a m x n matrix with m =/= n. The column space Col(A) is a subspace of Rn

FALSE review definition of column space

Any orthogonal collection of vectors in Rn is linearly independent

FALSE review final theorem in lecture notes (18) on orthogonal collection of vectors

suppose m x n matrices A,B are row equivalent. Then Col(A) = Col(B)

FALSE review remark at end of lecture notes (11) on column spaces

Suppose A,B are row equivalent and B is in echelon form. Then, a basis for Nul(A) is given by the columns of B not containing pivots

FALSE review the algorithm outlined in the lecture notes (14) for finding a basis for the nullspace

Suppose A,B are row equivalent and B is in echelon form. Then, a basis for Col(A) is given by the columns of B containing pivots

FALSE review theorem in lecture notes (14) on finding a basis for the column space

if A is a mxn matrix and B is a nxm matrix then AB is a nxn matrix

FALSE it would be an mxm matrix

Let x_ be a least-squares solution of the system Ax = b. Which is FALSE?

FALSE: - Ax_ - b = 0 TRUE: - if b is orthogonal to Col(A), then x_ is in Nul(A) - the error vector Ax_ - b is orthogonal to Col(A) - if Nul(A) = {0}, then x_ is the UNIQUE least-squares solution

Let A be a 3 x 3 matrix with columns a1, a2, a3. Which of the following statements is FALSE?

FALSE: - det([a2 (a2+6a1) a3]) = -det(A) TRUE: -if a2 = 0, then the det(A) = 0 -if B is obtained from A by adding the third row of A to the first row of A, then det(A) = det(B) - if a1 + 3a2 + a3 = 0, then det(A)=0 - det(-A) = -det(A)

Let A be an n x n matrix, let lambda1 and lambda2 be eigenvalues of A, and let v1 and v2 be corresponding eigenvectors...

FALSE: - lambda cannot be 0 TRUE: - lambda^2 is an eigenvalue of A^2 - lambda + 1 is an eigenvalue of A + I - if the lambdas are distinct, then the v's are linearly independent - if the lambdas are distinct and A is symmetric, then the v's are orthogonal to each other

Let W be a subspace of Rn with dim(W) < n. Let P be the projection matrix of the orthogonal projection onto W with respect to the standard basis. Which is FALSE?

FALSE: - the columns of P form a basis of W TRUE: - All eigenvalues of P are either 0 or 1 - Nul(P) = W_orthog - the rank of P is equal to the dim of W

Let A be a 10x4 matrix with dim Nul(A^T)=6. Which of the following statements is FALSE?

FALSE: The nullspace of A has dimension 6 TRUE: (1) The matrix A has rank 4 (2) The equation Ax = 0 has a unique solution (3) The columns of A are linearly independent (4) The row space of A has dimension 4 explanation: the matrix A^T has rank 10 - 6 = 4. This is also the rank of A. By the Rank-Nullity Theorem, Nul(A) has dimension n - r = 4 - 4 = 0

if the augmented matrix of a linear system has two identical rows, the linear system has infinitely many solutions

FALSE: it would be true if there were two identical columns, not rows

Let Q be an orthogonal n x n matrix. Which is not true?

FALSE: - det(Q) = 1 (det(Q) could also be -1) TRUE: - Q^T is the inverse of Q - The columns of Q are orthonormal - if Q is upper triangular, then Q is diagonal

Let Basis B = {v1, ......vd} be a set of d vectors in the vector space V. If the vectors in B span V, then which of these statements is FALSE?

FALSE: dim(V) > d TRUE: 1. B is a basis of V if the dimension of V is d 2. dim(V) <= d 3. the vectors in B are linearly independent if the dimension of V is d 4. A subset of the vectors in B is a basis of a subspace of V

Let W be a subspace of Rn with dimW < n. Let P be the projection matrix of the orthogonal projection onto W with respect to the standard basis. Which is false?

FALSE: The columns of P form a basis of W TRUE: - the null space of P = W orthogonal - the rank of P is equal to the dimension of W - all eigenvalues of P are either 0 or 1

every square matrix has an LU decomposition

FALSE: the matrix [0 1; 1 0] does not

Let B be a m x m matrix such that B^T = B. Which of the following statements is false?

FALSE: The dimension of the null space of B is equal to the dimension of the column space of B TRUE: 1. The dimension of the column space of B is equal to the dimension of the row space of B 2. The dimension of the null space of B is equal to the dimension of the left null space of B 3. The null space of B is orthogonal to the column space of B

any line or plane in R^3 is a subspace of R^3

False: if the line or plane does not contain the origin, then it cannot be a subspace of R^3

if A and B are bases for Rn, E is the standard basis for Rn, and v is in Rn, then which is always equal to v_B?

(I_AB)^-1 * I_AE * v

L12 Thm 1

Let A be an mxn matrix. The columns of A are linearly independent <--> Ax = 0 has only the solution x = 0 <--> Nul(A) = {0} <--> A has n pivots / there are no free variables for Ax=0 special cases: 1. a single nonzero vector v1 is always linearly independent 2. two vectors v1, v2 are linearly independent iff neither of the vectors is a multiple of the other 3. vectors v1...vp containing the zero vector are linearly dependent 4. if you take more vectors than there are entries in each vector, then these vectors are linearly dependent (if n > m)
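A small numpy sketch of the pivot-count criterion, using matrix rank as a stand-in for counting pivots (the matrices are made-up examples):

```python
import numpy as np
from numpy.linalg import matrix_rank

# Columns are linearly independent exactly when Ax = 0 has only x = 0,
# i.e. when A has a pivot in every column (rank equals the number of columns).
def columns_independent(A: np.ndarray) -> bool:
    return matrix_rank(A) == A.shape[1]

A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])    # independent columns
B = np.array([[1.0, 2.0],
              [2.0, 4.0]])    # second column is 2x the first

print(columns_independent(A))  # True
print(columns_independent(B))  # False
```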

L11 thm 2

Let A be an mxn matrix. b is in Col(A) iff there is an x = [x1; ...; xn] in Rn such that Ax = b

L16 thm 2

Let T: Rn -> Rm be a linear transformation. Let B := (v1, ..., vn) be a basis of Rn and let C := (w1, ..., wm) be a basis of Rm. Then there is a matrix T_C,B such that [T(x)]_C = T_C,B [x]_B for all x in Rn. Explicitly, T_C,B = [[T(v1)]_C ... [T(vn)]_C]

L16 thm 1

Let T: Rn -> Rm be a linear transformation. Then there is a matrix A such that T(x) = Ax for all x in Rn. Explicitly, A = [T(e1) T(e2) ... T(en)] where e1, e2, ... is the standard basis of Rn
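A minimal numpy illustration of building the standard matrix column by column; the rotation map T is just a hypothetical example of a linear transformation:

```python
import numpy as np

# Hypothetical example: T is rotation of R^2 by 90 degrees, a linear map.
def T(x: np.ndarray) -> np.ndarray:
    return np.array([-x[1], x[0]])

# The standard matrix has columns T(e1), ..., T(en).
n = 2
A = np.column_stack([T(e) for e in np.eye(n)])
print(A)                          # [[0, -1], [1, 0]]

# Then T(x) = A x for every x.
x = np.array([3.0, 5.0])
print(np.allclose(T(x), A @ x))   # True
```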

Let V be a subspace of Rn, n > 0, and let P be the projection matrix of the orthogonal projection onto V. Which is NOT ALWAYS TRUE?

NOT ALWAYS TRUE: - Col(P^T) = V_orthog ALWAYS TRUE: - Col(P) = V - Nul(P) = V_orthog - rank(P) = dim(V)

Let A be a m x n matrix. Then, nullspace of *A* and column space of *A^T* are subspaces of Rn while nullspace of *A^T* and column space of *A* are subspaces of Rm

Review the Fundamental Theorem of Linear Algebra in the lecture notes (15)

Let A be an mxn matrix, and let b be a nonzero vector in Rm. Suppose that u, v are both solutions to Ax=b. Which of the following must be a solution to Ax=b?

SOLUTION: -> (1/2)u + (1/2)v -> 5u - 4v NOT SOLUTIONS: -> 6v -> 7u -> 5u + 4v

Let Pn be the vector space of polynomials of degree at most n. The subset of polynomials having 0 as a root is a subspace of Pn

TRUE review criterion for checking subset is a subspace of a given vector space

Let V be a vector space. A subset S in V is a basis for V if span(S) = V and S is a linearly independent set

TRUE review definition of basis for vector space

L13 thm 1

Suppose that V has dimension d: 1. a set of d vectors in V is a basis if they span V 2. a set of d vectors in V is a basis if they are linearly independent

The set of m x n matrices with real entries under matrix addition and scalar multiplication is an example of a vector space

TRUE

executing a row operation of matrix A is equivalent to multiplying A on the left by the elementary matrix E corresponding to the operation

TRUE

if a consistent system of linear equations has no free variables, then it has a unique solution

TRUE

suppose A has a LU-decomposition and we want to solve Ax=b for many different bi, (say i = 1 thru 10^4). Then, first finding the LU-decomposition of A may be computationally beneficial

TRUE

If V has dimension n then any subset of V having fewer than n vectors cannot span V

TRUE Else Theorem 1 in the lecture notes (13) implies the collection is a basis contradicting the dimension hypothesis

Suppose A,B are row equivalent and B is in echelon form. Then, Col(A), Col(B) have the same dimension

TRUE Recall row equivalent matrices have the same number of pivots

Suppose A,B are row equivalent matrices. Then A,B have the same rank

TRUE Recall row equivalent matrices have the same number of pivots

Let basis B be an ordered basis for vector space V. If x, y are in V then (x+y)_B = x_B + y_B

TRUE Review definition of coordinate vector

Let Pn be the vector space of polynomials of degree at most n. The dimension of Pn is n+1

TRUE Review examples in lecture notes (13) on basis and dimension

Let v be in Rn. If v · v = 0, then v = 0

TRUE Review theorem on properties of dot product

if S = {v1 ... vn} is linearly independent in vector space V and y is in span(S), then there is only one linear combination of v1, ... , vn which equals y

TRUE if a1v1 + ... + anvn = y = b1v1 + ... + bnvn, then 0 = (a1 - b1)v1 + ... + (an - bn)vn. Now use the hypothesis on linear independence of the vectors v1, ..., vn

The null space of A and the row space of A are orthogonal complements of each other. Therefore, their intersection is the zero vector, and x is assumed to be a non-zero vector. Vectors in the null space of A are not generally in the same space as vectors in the column space of A. Similarly vectors in the null space of A and A^T, and for vectors in the row space and column space.

TRUE; it is an explanation for the above

suppose m x n matrix A has d free variables. Then, there is a set S of d vectors in Rn such that span(S) = Nul(A)

TRUE review examples in lecture notes (10) on solving for nullspace. Each pivot variable is expressed as a linear combination of free variables

Let v be in Rn. The set of vectors orthogonal to v is a subspace of Rn

TRUE review final example (18) in section on orthogonal vectors

If S is a non-empty subset of vector space V such that any linear combination of vectors in S is again a vector in S, then S is a subspace of V

TRUE review subspace criterion in lecture notes (10) on vector spaces

Let S be a set of vectors from a vector space V. If S contains the zero vector then S is linearly dependent

TRUE review the special cases given in the lecture notes (12) on linear independence

The difference of any two particular solutions Ax = b is a vector in the nullspace Nul(A)

TRUE review the theorem on structure of solutions to linear systems

If S is a non-empty subset of vector space V, then span(S) is a subspace of V

TRUE review theorem in lecture notes (10) regarding span of vectors and subspaces

The linear system Ax = b is consistent iff b is in Col(A)

TRUE review theorem in lecture notes (11) on relation between column space and consistency of linear systems

A collection of vectors {a1 ... ak} in Rn is linearly independent if and only if Nul(A) = {0}, where A = [a1 ... ak] is the n x k matrix with columns a1, ..., ak

TRUE review theorem in lecture notes (12) giving criterion for linear independence of vectors in Rn in terms of nullspace of a matrix

if x1v1 + ... + xnvn = 0 is a linear relation in vector space V and x1 =/= 0, then v1 is a linear combination of the vectors v2 ... vn

TRUE solve for v1

Let A be a matrix. Any linear combination of vectors in Nul(A) is again a vector in Nul(A)

TRUE use the linearity property of matrix-vector multiplication

every triangular matrix, either upper or lower, has a LU-decomposition

TRUE! let I be the identity matrix of appropriate size. if A is lower triangular, then A=AI and if A is upper triangular then A=IA

Let T: Rn -> Rn be a linear transformation. Let x, y in Rn be such that T(x) = 0 and there exists a vector z in Rn such that T(z) = y. Let A be the matrix representing T with respect to the standard basis of Rn (that is, A = T_E,E). Which of these statements is always true?

TRUE: (1) x is in Nul(A) (2) For every vector v in Rn, Av = T(v) (3) y is in Col(A) explanation: because A is the matrix of T with respect to the standard basis, for any v in Rn, T(v) = Av. In particular, T(x) = 0 gives Ax = 0, so x is in Nul(A). Also, Az = T(z) = y, so y is in Col(A)

Let A = QR be the QR decomposition of A, and let a1...an be the columns of A. What is true?

TRUE: - A^T A = R^T R - If a1, ..., an are orthogonal then R = diag(|a1|, |a2|, ..., |an|), the diagonal matrix with the column norms on the diagonal
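A numpy sketch of both claims, using made-up matrices; note that numpy's qr may flip signs in R, so the diagonal claim is checked up to absolute value:

```python
import numpy as np

# Made-up matrix with linearly independent columns.
A = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [0.0, 2.0]])

Q, R = np.linalg.qr(A)   # reduced QR: Q has orthonormal columns, R upper triangular

print(np.allclose(Q.T @ Q, np.eye(2)))   # columns of Q are orthonormal
print(np.allclose(A.T @ A, R.T @ R))     # A^T A = R^T Q^T Q R = R^T R

# If the columns are already orthogonal, R is diagonal with the column norms
# on the diagonal (up to sign, hence the absolute value).
B = np.array([[1.0, -2.0],
              [2.0,  1.0]])              # orthogonal (not orthonormal) columns
_, RB = np.linalg.qr(B)
print(np.allclose(np.abs(RB), np.diag(np.linalg.norm(B, axis=0))))   # True
```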

Let P denote the projection matrix of the orthogonal projection onto Col(A), where A is an n x n matrix with linearly independent columns, and let b be in Rn. Which is always TRUE?

TRUE: - P^2 = P - Pb = b if and only if b is a linear combination of the columns of A - Pb = 0 if and only if b is in Nul(A^T)

Let Q be an orthogonal matrix... what is true?

TRUE: - The columns of Q are orthonormal - Q^T Q = I (identity matrix) - abs(det(Q)) = 1 - Q^T is the inverse of Q - the rows of Q are orthonormal
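A quick numpy check of these properties on a concrete orthogonal matrix (a rotation, chosen here purely as an example):

```python
import numpy as np

# A concrete orthogonal matrix: rotation by 30 degrees (columns are orthonormal).
t = np.pi / 6
Q = np.array([[np.cos(t), -np.sin(t)],
              [np.sin(t),  np.cos(t)]])

print(np.allclose(Q.T @ Q, np.eye(2)))          # Q^T Q = I, so Q^T is the inverse of Q
print(np.allclose(Q @ Q.T, np.eye(2)))          # the rows of Q are orthonormal too
print(np.isclose(abs(np.linalg.det(Q)), 1.0))   # |det(Q)| = 1 (det itself is +1 or -1)
```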

let C be an n x n matrix such that C^T = C^-1. Then...

TRUE: - det(C) = 1 or det(C) = -1 FALSE: - det(C) >= 0 - det(C) can be any real number - det(C) <= 0

Which of the following is true if A,B are n x n matrices

TRUE: - if every row of A adds up to 0, then det(A) = 0 FALSE: - the det of A is the product of the diagonal entries of A - if every row of A adds up to 1, then det(A) = 1

Let A be a 6x6 matrix and let x be a vector in R6. Which of the following statements is true?

TRUE: The vectors x, Ax, A^2x, A^3x, A^4x, A^5x, A^6x are linearly dependent FALSE: (1) the vectors x, Ax, A^2x, A^3x, A^4x, A^5x, A^6x are linearly independent (2) the linear independence/dependence of *those vectors* cannot be determined from the given data Explanation: Seven vectors in R6 are always linearly dependent
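A numpy sketch with a random 6x6 matrix A and vector x (arbitrary example data): stacking the seven vectors as columns gives a 6x7 matrix, whose rank can be at most 6.

```python
import numpy as np
from numpy.linalg import matrix_rank

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 6))    # arbitrary 6x6 matrix
x = rng.standard_normal(6)

# Stack x, Ax, A^2 x, ..., A^6 x as the columns of a 6x7 matrix.
K = np.column_stack([np.linalg.matrix_power(A, k) @ x for k in range(7)])

# Seven vectors in R^6 can never be linearly independent: rank is at most 6 < 7.
print(K.shape, matrix_rank(K) < K.shape[1])   # (6, 7) True
```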

If the dimension of V is d and we have a set of d vectors which span it, then this set is a basis, and its vectors are linearly independent. If they are not linearly independent, then a subset of B is linearly independent and spans V. This subset has d' < d vectors, and the dimension of V is d' which is less than d in this case

TRUE; it is an explanation for the above

Let V be a vector space that is spanned by three linearly independent vectors v1, v2, v3. Which of the following vectors form a basis of V?

TRUE: v1, v2 - v1, v3 False: -- v1, v2, v2-v1 -- v1, v1+v2+v3, v2+v3 We already know from the statement of the question that {v1, v2, v3} is a basis of V and so dim(V) = 3. Thus, we are looking for a set of three linearly independent vectors spanning V. Among the given choices only the one above satisfies these properties

Let A,B,C be nxn matrices. what is true

TRUE: A + B = B + A TRUE: A(BC) = (AB)C TRUE: (AT)T = A FALSE: if AB=0 then A=0 or B=0 FALSE: AB = BA

The space Pn of polynomials of degree at most n has dimension n+1

TRUE: because {1, t, ..., t^n} form a basis of Pn

Suppose that V has dimension n. Then any set in V containing more than n vectors must be linearly dependent

TRUE: any linearly independent set in V contains at most n = dim(V) vectors, so a set with more than n vectors must be linearly dependent

the vector space of functions f: R -> R is infinite-dimensional

TRUE: it contains the vector space of all polynomials (which is infinite dimensional)

consider V = span{v1...vp} if one of the vectors, say vk, in the spanning set is a linear combination of the remaining ones, then the remaining vectors still span V

TRUE: span{v1...vp} = span{v1...v_k-1 , v_k+1 , ... vp}

L10 Thm2

The null space of an m x n matrix A is a subspace of Rn. equivalently, the set of all solutions to the system Ax = 0 of m homogeneous linear equations in n unknowns is a subspace of Rn

Let A be an m x n matrix. Which of the following statements is always true?

True: If x is a non-zero vector in the null space of A, then x is not in the row space of A False: 1. if x is in the column space of A and y is in the row space of A, then x is orthogonal to y 2. if x is in Nul(A) and y is in Nul(A^T) then x*y=0 3. if x is in Nul(A) and y is in Col(A) then x*y=0

Let A be an m x n matrix with rank r. Which of the following statements is always true? ... The maximal number of linearly independent vectors orthogonal to the row space of A is equal to...

True: ... the number of free variables of A False: ... r ... m - r ... the dimension of the column space of A^T ... the dimension of the left null space of A

Let A be a 3 x 4 matrix. Which of the following statements is correct for all such matrices?

TRUE: the columns of A are linearly dependent FALSE: 1. Any 3 columns of A form a basis of R3 2. The columns of A span R3 3. One of the columns is a multiple of one of the other columns 4. The first three columns of A are linearly independent

L12 Thm 4

Two vectors are linearly dependent if and only if one of them is a scalar multiple of the other

L13 def dimension

V is said to have dimension p if it has a basis consisting of p vectors

L12 def lin independence / dependence

Vectors v1...vp are said to be linearly independent if the equation x1v1 + ... + xpvp = 0 has only the trivial solution and v1, ..., vp are said to be linearly dependent if there are scalars c1...cp not all zeros such that c1v1 + ... + cpvp = 0

Let A be a 4x3 matrix such that there is a vector b in R4 for which Ax = b has exactly one solution. What is the RREF of A?

[1 0 0; 0 1 0; 0 0 1; 0 0 0]

consider the elementary matrix A = [1 0 0; 0 1 0; 1 0 1] what is A^81

[1 0 0; 0 1 0; 81 0 1]
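A one-line numpy check of this answer (A adds row 1 to row 3, so 81 applications add 81 copies of row 1 to row 3):

```python
import numpy as np

A = np.array([[1, 0, 0],
              [0, 1, 0],
              [1, 0, 1]])

print(np.linalg.matrix_power(A, 81))
# [[ 1  0  0]
#  [ 0  1  0]
#  [81  0  1]]
```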

L14 thm1

a basis for Col(A) is given by the pivot columns of A

if r is the rank...

dim Col(A) = r dim Col(AT) = r (rowspace) dim Nul(A) = n-r dim Nul(AT) = m-r (left null space)

Suppose A is a m x n matrix of rank r. For each of the subspaces below, choose the correct dimension from among the choices:

dim Col(A) = r dim Nul(A) = n-r dim Col(A^T) = r dim Nul(A^T) = m-r
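A small numpy sketch of these four dimension formulas on a made-up 4x5 matrix (rank 2 by construction):

```python
import numpy as np
from numpy.linalg import matrix_rank

# Made-up 4x5 matrix; the formulas hold for any m x n matrix of rank r.
A = np.array([[1., 2., 0., 1., 3.],
              [0., 0., 1., 1., 1.],
              [1., 2., 1., 2., 4.],
              [0., 0., 0., 0., 0.]])
m, n = A.shape
r = matrix_rank(A)                            # r = 2 here

print("dim Col(A)   =", r)                    # column space
print("dim Col(A^T) =", matrix_rank(A.T))     # row space, also r
print("dim Nul(A)   =", n - r)                # n - r free variables
print("dim Nul(A^T) =", m - r)                # left null space
```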

Uniqueness of the Reduced Echelon Form Theorem

each matrix is row-equivalent to one and only one reduced echelon matrix

R^n is a subspace of R^(n+1)

false

if a linear system has more equations than variables, then the system must be inconsistent

false--> consider the possibility of a duplicated equation. in general consistency cannot be determined by just counting equations and variables, thus the benefit of echelon form

if the coefficient matrix of an augmented matrix has free variables, then the associated linear system has infinitely-many solutions

false--> the system may not be consistent, so it may have no solutions at all

the span of two non-zero vectors in R3 is geometrically a plane in R3

false: consider the possibility of parallel vectors

multiplying all entries in a row by a constant is an example of an elementary row operation

false: not true if the constant multiplier is zero

L10 Thm 1

if v1,v2,...vp are in a vector space V, then span{v1,v2...vp} is a subspace of V

L15 Def coordinates

if w is in V and B = (v1, v2, ..., vp) is an (ordered) basis for V, the coordinate vector of w with respect to the basis B is w_B = [c1; ...; cp] s.t. w = c1v1 + ... + cpvp

L16 motto

if you know T on a basis, you know T everywhere: - Let x1...xn be an input basis, a basis for V. A linear map T: V -> W is determined by the values T(x1), ..., T(xn)

given a linear system, the number of solutions may be:

infinite, exactly one, or none

how many solutions will you get with any number of free variables

infinitely many solutions (assuming the system is consistent)

which of the following CAN geometrically represent the span of a collection of vectors in R3

a line, the origin, a plane, or all of R3

for m x n matrix A with r pivots, the sum of the dimensions of Nul(A) and Col(A) is

n; the sum will always be the number of columns

a linear system only has 3 options

one unique solution, no solution, or infinitely many solutions

Let A be an m x n matrix with m < n. Consider the following two statements. (1) dim Col(A) + dim Nul(A) = n (2) dim Col(A^T) + dim Nul(A^T) = n

only (1) is always true: For any matrix A, dim Col(A) + dim Nul(A) = r + (n - r) = n and dim Col(A^T) + dim Nul(A^T) = r + (m - r) = m, where r is the rank of matrix A. Since m < n, only statement 1 is true

pivot position

position of a leading entry in an echelon form of a matrix

the span of a set of vectors in R3 is a

subspace of R3

L11 Thm 1

the column space of an m x n matrix A is a subspace of Rm

L11 def column space

the column space, written as Col(A), of an m x n matrix A is Col(A) = span{a1, ..., an}, where a1, ..., an are the columns of A

pivot column

the column that contains a pivot position

L13 thm 2

the columns of an nxn matrix A form a basis of Rn iff the matrix A has exactly n pivots, which is equivalent to saying that Nul(A) = {0}

L10 Def Null Space

the nullspace of an m x n matrix A, written as Nul(A), is {x in Rn : Ax = 0}. It is a subspace of Rn: 1. the zero vector is in Nul(A) 2. for each u and v in Nul(A), u + v is in Nul(A) 3. for each u in Nul(A) and each scalar c, cu is in Nul(A)

L15 Def Rank

the rank of a matrix A is the number of pivots it has

L10 Def span

the span{v1, v2, ..., vp} is the collection of all vectors that can be written as x1v1 + x2v2 + ... + xpvp where x1, x2, ..., xp are scalars

Let P(n) be the vector space of polynomials of degree at most n. The subset of polynomials having 0 as a root is a subspace of P(n)

true

a linear system with m equations in n variables has a coefficient matrix with m rows and n columns

true

executing an elementary row operation on linear system leaves the set of solutions unchanged

true

if A,B are 3x3 matrices and x is the second column of B, then the second column of AB is Ax

true

if A,B are nxn matrices such that AB=I(n) then B=A^-1 and BA=I(n) also, where I(n) is the nxn identity matrix

true

let A be a mxn matrix and x,y are in R^n. if z is in span{x,y} then Az is in Span{Ax,Ay}

true

suppose A is an mxn matrix and x = [x1, ..., xn] is in R^n. Then Ax is a linear combination of the columns of A with the corresponding weights x1, ..., xn

true

the augmented matrix: [a1.....an | b] represents a consistent linear system if and only if b is expressible as a linear combination of the vectors a1.....an

true

the following expression is a linear combination in R2: -ln(5)·[-1; 2] + e^3·[pi; 1] - [sin(pi/50); sqrt(2)]

true

the following is a linear system in the variables x1, x2: pi·x1 + (sin 1)·x2 = -1, e^2·x1 + (ln pi)·x2 = cos 3

true

suppose A is an nxn matrix with LU-decomposition A=LU. if c in R^n satisfies Lc = b and Uy = c, then y solves the equation Ax = b

true!
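A short sketch of the two-triangular-solve pattern, using scipy's solve_triangular and hypothetical L, U factors (so A = LU by construction); once L and U are known, each new right-hand side costs only these two cheap solves:

```python
import numpy as np
from scipy.linalg import solve_triangular

# Hypothetical LU factors; A = L U with L lower- and U upper-triangular.
L = np.array([[1., 0., 0.],
              [2., 1., 0.],
              [1., 3., 1.]])
U = np.array([[2., 1., 1.],
              [0., 1., 2.],
              [0., 0., 3.]])
A = L @ U
b = np.array([1., 2., 3.])

# Two triangular solves replace one general solve:
c = solve_triangular(L, b, lower=True)    # forward substitution: L c = b
y = solve_triangular(U, c, lower=False)   # back substitution:    U y = c

print(np.allclose(A @ y, b))              # y solves A x = b
```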

the product of lower triangular matrices is lower triangular

true!

the set of mxn matrices with real entries under matrix addition and scalar multiplication is an example of a vector space

true!

suppose L is lower-triangular. the system Ly=b may be solved by forward substitution

true! for lower-triangular coefficient matrix, the linear system is solved by forward substitution

if the coefficient matrix of an augmented matrix has pivots in every row, then the associated linear system must be consistent

true--> review the existence portion of the Existence and Uniqueness Theorem in the lecture on echelon forms

the product of permutation matrices is again a permutation matrix

true: the effect of doing successive permutations is again a permutation

suppose A is an mxn matrix and x is in R^n with m =/= n. Then Ax is a ...

vector in R^m; review the definition of matrix-vector multiplication

