Optimization Pre-Mid Term
What are the 8 conditions for V to be a vector space?
1. Commutativity: X+Y=Y+X. 2. Associativity of vector addition: (X+Y)+Z=X+(Y+Z). 3. Additive identity: There exists a zero vector 0 such that 0+X=X+0=X for all X. 4. Existence of additive inverse: For any X, there exists a -X such that X+(-X)=0. 5. Associativity of scalar multiplication: r(sX)=(rs)X. 6. Distributivity of scalar sums: (r+s)X=rX+sX. 7. Distributivity of vector sums: r(X+Y)=rX+rY. 8. Scalar multiplication identity: 1X=X.
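The eight conditions can be spot-checked numerically for R^2, a concrete vector space. This is only an illustrative sketch with numpy (a finite check on random vectors does not prove the axioms in general); the names `checks` and `all_hold` are ad hoc.

```python
import numpy as np

# Spot-check the eight vector-space axioms on random vectors in R^2.
rng = np.random.default_rng(0)
X, Y, Z = rng.standard_normal((3, 2))
r, s = 2.0, -3.0

checks = {
    "commutativity":      np.allclose(X + Y, Y + X),
    "associativity":      np.allclose((X + Y) + Z, X + (Y + Z)),
    "additive_identity":  np.allclose(np.zeros(2) + X, X),
    "additive_inverse":   np.allclose(X + (-X), np.zeros(2)),
    "scalar_assoc":       np.allclose(r * (s * X), (r * s) * X),
    "distrib_scalar_sum": np.allclose((r + s) * X, r * X + s * X),
    "distrib_vector_sum": np.allclose(r * (X + Y), r * X + r * Y),
    "scalar_identity":    np.allclose(1.0 * X, X),
}
all_hold = all(checks.values())
```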
Two Conditions of Linear Transformation
A linear transformation between two vector spaces V and W is a map T:V->W such that the following hold: 1. T(v_1+v_2)=T(v_1)+T(v_2) for any vectors v_1 and v_2 in V, and 2. T(alpha v)=alpha T(v) for any vector v in V and any scalar alpha.
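Every matrix A defines a linear map T(v) = Av, so both conditions can be checked numerically. A minimal sketch, with an arbitrarily chosen matrix and test vectors:

```python
import numpy as np

# Any matrix A gives a linear map T(v) = A @ v; verify both conditions.
A = np.array([[1.0, 2.0],
              [0.0, 3.0]])
T = lambda v: A @ v

v1 = np.array([1.0, -1.0])
v2 = np.array([2.0, 5.0])
alpha = 4.0

additive    = np.allclose(T(v1 + v2), T(v1) + T(v2))   # condition 1
homogeneous = np.allclose(T(alpha * v1), alpha * T(v1))  # condition 2
```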
Basis Vectors
In more detail, suppose that B = { v1, ..., vn } is a finite subset of a vector space V over a field F (such as the real or complex numbers R or C). Then B is a basis if it satisfies the following conditions: the linear independence property, for all a1, ..., an ∈ F, if a1v1 + ... + anvn = 0, then necessarily a1 = ... = an = 0; and the spanning property, for every x in V it is possible to choose a1, ..., an ∈ F such that x = a1v1 + ... + anvn.
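For n vectors in R^n, both basis conditions can be tested at once: the vectors form a basis iff the matrix with those vectors as columns has rank n. A sketch with numpy (the specific vectors are just an example):

```python
import numpy as np

# Three vectors in R^3, stacked as the columns of B.
B = np.column_stack([[1.0, 0.0, 1.0],
                     [0.0, 1.0, 1.0],
                     [1.0, 1.0, 0.0]])
# Full rank <=> linearly independent and spanning <=> a basis of R^3.
is_basis = np.linalg.matrix_rank(B) == B.shape[0]

# Spanning in action: the coordinates a of any x solve B a = x.
x = np.array([2.0, 3.0, 5.0])
a = np.linalg.solve(B, x)
```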
Linear Transformation Kernel
The kernel of a linear transformation T:V-->W between vector spaces is its null space: the set of all vectors v in V such that T(v)=0.
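For T(v) = Av, a basis of the kernel can be read off from the SVD: the right singular vectors whose singular values are (numerically) zero span N(A). A sketch using numpy; scipy.linalg.null_space packages the same idea, and the tolerance formula here is a common convention, not the only choice:

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])        # rank 1, so the kernel is 2-dimensional
U, svals, Vt = np.linalg.svd(A)
tol = max(A.shape) * np.finfo(float).eps * svals.max()
rank = int(np.sum(svals > tol))
null_basis = Vt[rank:].T               # columns span N(A)
```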
Vector Space Span
The span, or subspace generated by, vectors v_1 and v_2 in V is: Span(v_1,v_2)={rv_1+sv_2 : r,s in R}.
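Membership in a span has a simple rank test: x lies in Span(v_1, v_2) iff appending x as an extra column does not increase the rank of [v_1 v_2]. A numerical sketch (the helper `in_span` and the example vectors are ad hoc):

```python
import numpy as np

v1 = np.array([1.0, 0.0, 2.0])
v2 = np.array([0.0, 1.0, 1.0])
x = 3.0 * v1 - 2.0 * v2          # in the span by construction
y = np.array([0.0, 0.0, 1.0])    # not a combination of v1 and v2

M = np.column_stack([v1, v2])
def in_span(w):
    # Appending w leaves the rank unchanged iff w is already in the span.
    return np.linalg.matrix_rank(np.column_stack([M, w])) == np.linalg.matrix_rank(M)
```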
8. Distributivity of scalar multiplication with respect to field addition
u, v, w vectors in V and a, b scalars in F. (a + b)v = av + bv
6. Identity element of scalar multiplication
u, v, w vectors in V and a, b scalars in F. 1v = v, where 1 denotes the multiplicative identity in F.
4. Inverse elements of addition
u, v, w vectors in V and a, b scalars in F. For every v ∈ V, there exists an element −v ∈ V, called the additive inverse of v, such that v + (−v) = 0.
3. Identity element of addition
u, v, w vectors in V and a, b scalars in F. There exists an element 0 ∈ V, called the zero vector, such that v + 0 = v for all v ∈ V.
5. Compatibility of scalar multiplication with field multiplication
u, v, w vectors in V and a, b scalars in F. a(bv) = (ab)v
7. Distributivity of scalar multiplication with respect to vector addition
u, v, w vectors in V and a, b scalars in F. a(u + v) = au + av
1. Associativity of addition
u, v, w vectors in V and a, b scalars in F. u + (v + w) = (u + v) + w
2. Commutativity of addition
u, v, w vectors in V and a, b scalars in F. u + v = v + u
Scalar of Vector Spaces
A vector space is defined as a set of vectors, a set of scalars, and a scalar multiplication operation that takes a scalar k and a vector v to another vector kv. For example, in a coordinate space, the scalar multiplication k(v_{1},v_{2},...,v_{n}) yields (kv_{1},kv_{2},...,kv_{n}). In a (linear) function space, kƒ is the function x ↦ k(ƒ(x)). The scalars can be taken from any field, including the rational, algebraic, real, and complex numbers, as well as finite fields.
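Both examples of scalar multiplication can be shown in a few lines. An illustrative sketch (the names `kv` and `kf` are ad hoc):

```python
import numpy as np

k = 3.0

# Coordinate space: componentwise scaling.
v = np.array([1.0, 2.0, 4.0])
kv = k * v                      # (k v_1, k v_2, ..., k v_n)

# Function space: (kf)(x) = k * f(x).
f = lambda x: x ** 2
kf = lambda x: k * f(x)
```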
Fundamental Theorem of Linear Algebra
Given an m×n matrix A, the fundamental theorem of linear algebra is a collection of results relating various properties of the four fundamental matrix subspaces of A. In particular: 1. dimR(A)=dimR(A^(T)) and dimR(A)+dimN(A)=n, where R(A) denotes the range or column space of A, A^(T) denotes its transpose, and N(A) denotes its null space. 2. The null space N(A) is orthogonal to the row space R(A^(T)). 3. There exist orthonormal bases for both the column space R(A) and the row space R(A^(T)) of A. (Gram-Schmidt) 4. With respect to the orthonormal bases of R(A) and R(A^(T)), A is diagonal. (SVD)
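Parts 1 and 2 can be illustrated numerically on a small matrix: rank + nullity equals n, and every null-space vector is orthogonal to every row of A. A sketch with numpy (the matrix is an arbitrary rank-2 example):

```python
import numpy as np

A = np.array([[1.0, 2.0, 0.0],
              [2.0, 4.0, 1.0]])
m, n = A.shape
rank = np.linalg.matrix_rank(A)        # dim R(A) = dim R(A^T)

# Part 1: rank-nullity via the SVD.
U, svals, Vt = np.linalg.svd(A)
tol = max(m, n) * np.finfo(float).eps * svals.max()
dim_null = n - int(np.sum(svals > tol))

# Part 2: N(A) is orthogonal to the row space, so A @ (null vectors) = 0.
null_basis = Vt[int(np.sum(svals > tol)):].T
row_dot_null = A @ null_basis
```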
Linearly Independent
Two or more functions, equations, or vectors f_1, f_2, ... are said to be linearly independent if they are not linearly dependent, i.e., if they cannot be expressed in the form a_1f_1+a_2f_2+...+a_nf_n=0 with constants a_1, a_2, ..., a_n that are not all zero. A set of n vectors v_1, v_2, ..., v_n in R^n is linearly independent iff the matrix rank of the matrix m=(v_1 v_2 ... v_n) is n, in which case m is invertible (nonsingular).
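The rank criterion translates directly to code: stack the vectors as columns and compare the rank to the number of vectors. A sketch with numpy (the three vectors are an arbitrary independent example):

```python
import numpy as np

v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([1.0, 1.0, 0.0])
v3 = np.array([1.0, 1.0, 1.0])

# Independent iff the rank equals the number of vectors.
M = np.column_stack([v1, v2, v3])
independent = np.linalg.matrix_rank(M) == M.shape[1]
```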
Linearly Dependent Vectors
n vectors X_1, X_2, ..., X_n are linearly dependent iff there exist scalars c_1, c_2, ..., c_n, not all zero, such that sum_(i=1)^nc_iX_i=0.
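For dependent vectors, the scalars c_i can be produced explicitly: any null-space vector of the matrix with columns X_i gives a nontrivial combination summing to zero. A sketch via the SVD (the right singular vector for the smallest singular value):

```python
import numpy as np

X1 = np.array([1.0, 2.0])
X2 = np.array([2.0, 4.0])     # X2 = 2 * X1, so the set is dependent
M = np.column_stack([X1, X2])

_, svals, Vt = np.linalg.svd(M)
c = Vt[-1]                    # nontrivial (unit-norm) coefficient vector
combo = M @ c                 # c_1 X_1 + c_2 X_2, numerically zero
```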