The following review sheet contains some of the more important theorems, definitions, and lemmas to know for the final for this class. Anything not covered here is assumed to be known, since we've seen it repeatedly throughout the course (this includes things like linear combinations, span, ...).
Chapter 1: Vector Spaces
1.B: Vector Space Definition
Vector Space
A vector space is a set $V$ along with an addition on $V$ and a scalar multiplication on $V$ such that the following properties hold:
Commutativity: $u + v = v + u$ for all $u, v \in V$.
Associativity: $(u + v) + w = u + (v + w)$ and $(ab)v = a(bv)$ for all $u, v, w \in V$ and all $a, b \in \mathbb{F}$.
Additive Identity: There exists an element $0 \in V$ such that $v + 0 = v$ for all $v \in V$.
Additive Inverse: For every $v \in V$ there exists $w \in V$ such that $v + w = 0$.
Multiplicative Identity: $1v = v$ for all $v \in V$.
Distributive Properties: $a(u + v) = au + av$ and $(a + b)v = av + bv$ for all $a, b \in \mathbb{F}$ and all $u, v \in V$.
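For example, $\mathbb{F}^n$ with componentwise addition and scalar multiplication satisfies all of these properties. A quick check of commutativity in $\mathbb{R}^2$:
$$
(1, 2) + (3, 4) = (4, 6) = (3, 4) + (1, 2)
$$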
1.C: Subspaces and Direct Sums
Conditions for a subspace
A subset $U$ of $V$ is a subspace of $V$ iff $U$ satisfies the following three conditions:
Additive Identity: $0 \in U$
Closed Under Addition: $u, w \in U$ implies that $u + w \in U$
Closed Under Scalar Multiplication: $a \in \mathbb{F}$ and $u \in U$ implies that $au \in U$
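For example, $U = \{(x, 2x) : x \in \mathbb{R}\}$ is a subspace of $\mathbb{R}^2$: it contains $(0, 0)$, and
$$
(x, 2x) + (y, 2y) = (x + y,\, 2(x + y)) \in U, \qquad a(x, 2x) = (ax,\, 2(ax)) \in U
$$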
Sum of Subsets
Suppose $U_1, \dots, U_m$ are subsets of $V$. The sum of $U_1, \dots, U_m$, denoted $U_1 + \dots + U_m$, is the set of all possible sums of elements of $U_1, \dots, U_m$. More precisely:
$$
U_1 + \dots + U_m = \{u_1 + \dots + u_m : u_1 \in U_1, \dots, u_m \in U_m\}
$$
Condition for a direct sum
Suppose $U_1, \dots, U_m$ are subspaces of $V$. Then $U_1 + \dots + U_m$ is a direct sum iff the only way to write $0$ as a sum $u_1 + \dots + u_m$, where each $u_j \in U_j$, is by taking each $u_j$ equal to $0$.
Direct sum of two subspaces
Suppose $U$ and $W$ are subspaces of $V$. Then $U + W$ is a direct sum iff $U \cap W = \{0\}$.
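For example, in $\mathbb{R}^2$, take $U = \{(x, 0) : x \in \mathbb{R}\}$ and $W = \{(0, y) : y \in \mathbb{R}\}$. Then $U \cap W = \{0\}$, so the sum is direct:
$$
\mathbb{R}^2 = U \oplus W
$$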
Chapter 2: Finite-Dimensional Vector Spaces
2.A: Span and LI
Linearly Independent
A list $v_1, \dots, v_m$ of vectors in $V$ is called linearly independent if the only choice of $a_1, \dots, a_m \in \mathbb{F}$ that makes $a_1v_1 + \dots + a_mv_m$ equal $0$ is $a_1 = \dots = a_m = 0$.
The empty list is also declared to be linearly independent.
Span
The set of all linear combinations of a list of vectors $v_1, \dots, v_m$ in $V$ is called the span of $v_1, \dots, v_m$, denoted $\operatorname{span}(v_1, \dots, v_m)$. In other words:
$$
\operatorname{span}(v_1, \dots, v_m) = \{a_1v_1 + \dots + a_mv_m : a_1, \dots, a_m \in \mathbb{F}\}
$$
The span of the empty list $()$ is defined to be $\{0\}$.
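For example, in $\mathbb{R}^2$:
$$
\operatorname{span}((1, 0), (0, 1)) = \mathbb{R}^2, \quad \text{since } (x, y) = x(1, 0) + y(0, 1)
$$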
2.B: Bases
Linearly Independent List extends to a basis
Every LI list of vectors in a finite-dimensional vector space can be extended to a basis of the vector space.
Every subspace of $V$ is part of a direct sum equal to $V$
Suppose $V$ is finite-dimensional and $U$ is a subspace of $V$. Then there is a subspace $W$ of $V$ such that $V = U \oplus W$.
2.C: Dimension
LI list of the right length is a basis.
Suppose $V$ is finite-dimensional. Then every LI list of vectors in $V$ with length $\dim V$ is a basis of $V$.
Spanning list of the right length is a basis
Suppose $V$ is finite-dimensional. Then every spanning list of vectors in $V$ with length $\dim V$ is a basis of $V$.
Essentially, if you have the right number of vectors ($\dim V$ of them) plus one of either LI or spanning, you have a basis.
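For example, the list $(1, 1), (1, -1)$ in $\mathbb{R}^2$ has length $2 = \dim \mathbb{R}^2$ and is LI, so it is a basis with no separate spanning check needed:
$$
a(1, 1) + b(1, -1) = (a + b,\, a - b) = (0, 0) \implies a = b = 0
$$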
Chapter 3: Linear Maps
3.A: The Vector Space of Linear Maps
Linear Map
A linear map from $V$ to $W$ is a function $T : V \to W$ such that:
$T(u + v) = Tu + Tv$ for all $u, v \in V$ (additivity)
$T(\lambda v) = \lambda (Tv)$ for all $\lambda \in \mathbb{F}$ and all $v \in V$. (homogeneity)
Linear maps take $0$ to $0$
Suppose $T$ is a linear map from $V$ to $W$. Then $T(0) = 0$.
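The one-line proof is worth remembering: by additivity,
$$
T(0) = T(0 + 0) = T(0) + T(0),
$$
and adding $-T(0)$ to both sides gives $T(0) = 0$.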
For the algebraic properties of linear maps (sums, scalar multiples, products/composition), treat $T$ like a matrix, because the matrix $\mathcal{M}(T)$ of a linear map exists later on (see 3.C).
3.B: Null Spaces and Ranges
Nullspace
null space, $\operatorname{null} T$
For $T \in \mathcal{L}(V, W)$, the null space of $T$, denoted $\operatorname{null} T$, is the subset of $V$ consisting of those vectors that $T$ maps to $0$. So $\operatorname{null} T = \{v \in V : Tv = 0\}$.
The null space has the following properties:
The nullspace is a subspace
Suppose $T \in \mathcal{L}(V, W)$. Then $\operatorname{null} T$ is a subspace of $V$.
injective
A function $T : V \to W$ is injective if $Tu = Tv$ implies that $u = v$.
Injectivity is equivalent to null space equals $\{0\}$
Let $T \in \mathcal{L}(V, W)$. Then $T$ is injective iff $\operatorname{null} T = \{0\}$.
Range
range, $\operatorname{range} T$
For $T \in \mathcal{L}(V, W)$, the range of $T$ is the subset of $W$ consisting of those vectors that are of the form $Tv$ for some $v \in V$:
$$
\operatorname{range} T = \{Tv : v \in V\}
$$
The range has the following properties:
The range is a subspace
If $T \in \mathcal{L}(V, W)$, then $\operatorname{range} T$ is a subspace of $W$.
surjective
A function $T : V \to W$ is surjective if its range equals $W$.
Fundamental Theorem of Linear Maps
Fundamental Theorem of Linear Maps
Suppose $V$ is finite-dimensional and $T \in \mathcal{L}(V, W)$. Then $\operatorname{range} T$ is finite-dimensional and:
$$
\dim V = \dim \operatorname{null} T + \dim \operatorname{range} T
$$
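For example, let $D \in \mathcal{L}(\mathcal{P}_3(\mathbb{R}))$ be differentiation, $Dp = p'$. Then $\operatorname{null} D$ is the constant polynomials (dimension 1) and $\operatorname{range} D = \mathcal{P}_2(\mathbb{R})$ (dimension 3), so:
$$
\dim \mathcal{P}_3(\mathbb{R}) = 4 = 1 + 3 = \dim \operatorname{null} D + \dim \operatorname{range} D
$$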
3.C: Matrices
matrix of a linear map, $\mathcal{M}(T)$
Suppose $T \in \mathcal{L}(V, W)$ and $v_1, \dots, v_n$ is a basis of $V$ and $w_1, \dots, w_m$ is a basis of $W$. The matrix of $T$ with respect to these bases is the $m$-by-$n$ matrix $\mathcal{M}(T)$ whose entries $A_{j,k}$ are defined by:
$$
Tv_k = A_{1,k}w_1 + \dots + A_{m,k}w_m
$$
If the bases are not clear from the context, then the notation $\mathcal{M}(T, (v_1, \dots, v_n), (w_1, \dots, w_m))$ is used.
You can use the following to help construct the matrix: the $k$th column of $\mathcal{M}(T)$ consists of the scalars needed to write $Tv_k$ as a linear combination of $w_1, \dots, w_m$.
Matrix Multiplication
Suppose $A$ is an $m$-by-$n$ matrix and $C$ is an $n$-by-$p$ matrix. Then $AC$ is defined to be the $m$-by-$p$ matrix whose entry in row $j$ and column $k$ is given by:
$$
(AC)_{j,k} = \sum_{r=1}^{n} A_{j,r} C_{r,k}
$$
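A quick numerical check of the formula:
$$
\begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix}
\begin{pmatrix} 5 & 6 \\ 7 & 8 \end{pmatrix}
=
\begin{pmatrix} 1 \cdot 5 + 2 \cdot 7 & 1 \cdot 6 + 2 \cdot 8 \\ 3 \cdot 5 + 4 \cdot 7 & 3 \cdot 6 + 4 \cdot 8 \end{pmatrix}
=
\begin{pmatrix} 19 & 22 \\ 43 & 50 \end{pmatrix}
$$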
Notation
Suppose $A$ is an $m$-by-$n$ matrix. Then:
If $1 \le j \le m$, then $A_{j,\cdot}$ denotes the $1$-by-$n$ matrix consisting of row $j$ of $A$.
If $1 \le k \le n$, then $A_{\cdot,k}$ denotes the $m$-by-$1$ matrix consisting of column $k$ of $A$.
3.D: Invertible Maps and Isomorphic Vector Spaces
invertible, inverse
A linear map $T \in \mathcal{L}(V, W)$ is called invertible if there exists a linear map $S \in \mathcal{L}(W, V)$ such that $ST$ equals the identity map on $V$ and $TS$ equals the identity map on $W$.
A linear map $S \in \mathcal{L}(W, V)$ satisfying $ST = I$ and $TS = I$ is called the inverse of $T$, denoted $T^{-1}$.
Such a $T$ is an isomorphism (when $T$ is invertible), and it makes $V$ and $W$ isomorphic.
Dimension shows whether vector spaces are isomorphic
Two finite-dimensional vector spaces over $\mathbb{F}$ are isomorphic iff they have the same dimension.
matrix of a vector, $\mathcal{M}(v)$
Suppose $v \in V$ and $v_1, \dots, v_n$ is a basis for $V$. The matrix of $v$ with respect to this basis is the $n$-by-1 matrix:
$$
\mathcal{M}(v) = \begin{pmatrix} c_1 \\ \vdots \\ c_n \end{pmatrix}
$$
where $c_1, \dots, c_n$ are the scalars such that:
$$
v = c_1v_1 + \dots + c_nv_n
$$
Linear Maps act like matrix multiplication
Suppose $T \in \mathcal{L}(V, W)$ and $v \in V$. Suppose $v_1, \dots, v_n$ is a basis of $V$ and $w_1, \dots, w_m$ is a basis of $W$. Then:
$$
\mathcal{M}(Tv) = \mathcal{M}(T)\mathcal{M}(v)
$$
operator, $\mathcal{L}(V)$
A linear map from a vector space to itself is called an operator. The notation $\mathcal{L}(V)$ denotes the set of all operators on $V$. In other words, $\mathcal{L}(V) = \mathcal{L}(V, V)$.
Injectivity is equivalent to surjectivity in finite dimensions
Suppose $V$ is finite-dimensional and $T \in \mathcal{L}(V)$. Then the following are equivalent:
$T$ is invertible
$T$ is injective
$T$ is surjective
Chapter 4: Polynomials (short)
Division Algorithm for Polynomials
Suppose that $p, s \in \mathcal{P}(\mathbb{F})$ with $s \ne 0$. Then there exist unique polynomials $q, r \in \mathcal{P}(\mathbb{F})$ such that:
$$
p = sq + r
$$
where $\deg r < \deg s$.
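For example, dividing $p(z) = z^3 + z + 1$ by $s(z) = z^2 + 1$:
$$
z^3 + z + 1 = z(z^2 + 1) + 1,
$$
so $q(z) = z$ and $r(z) = 1$, with $\deg r = 0 < 2 = \deg s$.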
Zeroes of a polynomial correspond to degree-1 factors
Suppose $p \in \mathcal{P}(\mathbb{F})$ and $\lambda \in \mathbb{F}$. Then $p(\lambda) = 0$ iff there is a polynomial $q \in \mathcal{P}(\mathbb{F})$ such that:
$$
p(z) = (z - \lambda)q(z)
$$
for every $z \in \mathbb{F}$.
zero of a polynomial, factor
A number $\lambda \in \mathbb{F}$ is called a zero (or root) of a polynomial $p \in \mathcal{P}(\mathbb{F})$ if:
$$
p(\lambda) = 0
$$
On the other hand, a polynomial $s \in \mathcal{P}(\mathbb{F})$ is called a factor of $p$ if there exists a polynomial $q \in \mathcal{P}(\mathbb{F})$ such that $p = sq$.
To note, all complex-coefficient polynomials can be factored into complex degree-1 factors, while in the reals that's not necessarily the case (up to possibly degree-2 terms). See Chapter 4 - Polynomials (short)#^86f79a for more on this.
Chapter 5: Eigenvalues, Eigenvectors, and Invariant Subspaces
5.A: Invariant Subspaces/Definitions of Eigenvalues and Vectors
invariant subspace
Suppose $T \in \mathcal{L}(V)$. A subspace $U$ of $V$ is called invariant under $T$ if $u \in U$ implies that $Tu \in U$.
eigenvalue
Suppose $T \in \mathcal{L}(V)$. A number $\lambda \in \mathbb{F}$ is called an eigenvalue of $T$ if there exists $v \in V$ such that $v \ne 0$ and $Tv = \lambda v$.
Lemma
Suppose $V$ is finite-dimensional, $T \in \mathcal{L}(V)$, and $\lambda \in \mathbb{F}$. Then the following are equivalent.
$\lambda$ is an eigenvalue of $T$;
$T - \lambda I$ is not injective;
$T - \lambda I$ is not surjective;
$T - \lambda I$ is not invertible.
Notice if $Tv = \lambda v$ and $v \ne 0$, then $v$ is an eigenvector of $T$ corresponding to $\lambda$.
Linearly Independent Eigenvectors
Let $T \in \mathcal{L}(V)$. Suppose $\lambda_1, \dots, \lambda_m$ are distinct eigenvalues of $T$ and $v_1, \dots, v_m$ are corresponding eigenvectors. Then $v_1, \dots, v_m$ is LI.
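For example, let $T \in \mathcal{L}(\mathbb{R}^2)$ be defined by $T(x, y) = (y, x)$. Then:
$$
T(1, 1) = (1, 1), \qquad T(1, -1) = -(1, -1),
$$
so $1$ and $-1$ are distinct eigenvalues with corresponding eigenvectors $(1, 1)$ and $(1, -1)$, and those two eigenvectors are indeed LI.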
5.B: Eigenvectors in UT Matrices
Suppose $T \in \mathcal{L}(V)$ and $p \in \mathcal{P}(\mathbb{F})$ is a polynomial given by:
$$
p(z) = a_0 + a_1z + a_2z^2 + \dots + a_mz^m
$$
for $z \in \mathbb{F}$. Then $p(T)$ is the operator defined by:
$$
p(T) = a_0I + a_1T + a_2T^2 + \dots + a_mT^m
$$
Operators on complex vector spaces have an eigenvalue
Every operator on a finite-dimensional, nonzero, complex vector space has an eigenvalue.
Conditions for upper-triangular matrix
Suppose $T \in \mathcal{L}(V)$ and $v_1, \dots, v_n$ is a basis of $V$. Then the following are equivalent:
The matrix of $T$ ($\mathcal{M}(T)$) with respect to $v_1, \dots, v_n$ is upper-triangular
$Tv_j \in \operatorname{span}(v_1, \dots, v_j)$ for each $j = 1, \dots, n$
$\operatorname{span}(v_1, \dots, v_j)$ is invariant under $T$ for each $j = 1, \dots, n$
Determination of invertibility from upper-triangular matrix
Suppose $T \in \mathcal{L}(V)$ has an upper-triangular matrix with respect to some basis of $V$. Then $T$ is invertible iff all the entries on the diagonal of that upper-triangular matrix are non-zero.
Determination of eigenvalues from upper-triangular matrix
Suppose $T \in \mathcal{L}(V)$ has an upper-triangular matrix with respect to some basis of $V$. Then the eigenvalues of $T$ are precisely the entries on the diagonal of that upper-triangular matrix.
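For example, for the operator on $\mathbb{F}^3$ whose matrix with respect to some basis is:
$$
\begin{pmatrix} 2 & 1 & 3 \\ 0 & 5 & 4 \\ 0 & 0 & 7 \end{pmatrix}
$$
the eigenvalues are exactly $2, 5, 7$, and the operator is invertible since all three diagonal entries are non-zero.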
5.C: Eigenspaces and Diagonal Matrices
eigenspace, $E(\lambda, T)$
Suppose $T \in \mathcal{L}(V)$ and $\lambda \in \mathbb{F}$. The eigenspace of $T$ corresponding to $\lambda$, denoted $E(\lambda, T)$, is defined by:
$$
E(\lambda, T) = \operatorname{null}(T - \lambda I)
$$
In other words, $E(\lambda, T)$ is the set of all eigenvectors of $T$ corresponding to $\lambda$, along with the $0$ vector.
Sum of eigenspaces is a direct sum
Suppose $V$ is finite-dimensional and $T \in \mathcal{L}(V)$. Suppose also that $\lambda_1, \dots, \lambda_m$ are distinct eigenvalues of $T$. Then:
$$
E(\lambda_1, T) + \dots + E(\lambda_m, T)
$$
is a direct sum, and furthermore,
$$
\dim E(\lambda_1, T) + \dots + \dim E(\lambda_m, T) \le \dim V
$$
Enough eigenvalues implies diagonalizability
If $T \in \mathcal{L}(V)$ has $\dim V$ distinct eigenvalues, then $T$ is diagonalizable.
Chapter 6: Inner Product Spaces
6.A: Inner Products and Norms
inner product
An inner product on $V$ is a function that takes each ordered pair $(u, v)$ of elements of $V$ to a number $\langle u, v \rangle \in \mathbb{F}$ and has the following properties:
(positivity): $\langle v, v \rangle \ge 0$ for all $v \in V$
(definiteness): $\langle v, v \rangle = 0$ iff $v = 0$
(additivity in the first slot): $\langle u + v, w \rangle = \langle u, w \rangle + \langle v, w \rangle$ for all $u, v, w \in V$
(homogeneity in the first slot): $\langle \lambda u, v \rangle = \lambda \langle u, v \rangle$ for all $\lambda \in \mathbb{F}$ and all $u, v \in V$
(conjugate symmetry): $\langle u, v \rangle = \overline{\langle v, u \rangle}$ for all $u, v \in V$
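The standard example is the Euclidean inner product on $\mathbb{F}^n$:
$$
\langle (w_1, \dots, w_n), (z_1, \dots, z_n) \rangle = w_1\overline{z_1} + \dots + w_n\overline{z_n}
$$
(the conjugates disappear when $\mathbb{F} = \mathbb{R}$).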
Basic Properties of an Inner Product
(a) For each fixed $u \in V$, the function that takes $v$ to $\langle v, u \rangle$ is linear from $V$ to $\mathbb{F}$
(b) $\langle 0, u \rangle = 0$ for every $u \in V$
(c) $\langle u, 0 \rangle = 0$ for every $u \in V$
(d) $\langle u, v + w \rangle = \langle u, v \rangle + \langle u, w \rangle$ for all $u, v, w \in V$
(e) $\langle u, \lambda v \rangle = \overline{\lambda} \langle u, v \rangle$ for all $\lambda \in \mathbb{F}$ and $u, v \in V$
norm, $\Vert v \Vert$
For $v \in V$, the norm of $v$, denoted $\Vert v \Vert$, is defined by:
$$
\Vert v \Vert = \sqrt{\langle v, v \rangle}
$$
Basic Properties of the norm
Suppose $v \in V$:
(a) $\Vert v \Vert = 0$ iff $v = 0$
(b) $\Vert \lambda v \Vert = |\lambda| \, \Vert v \Vert$ for all $\lambda \in \mathbb{F}$
orthogonal
Two vectors $u, v \in V$ are called orthogonal if $\langle u, v \rangle = 0$.
Pythagorean Theorem
Suppose $u$ and $v$ are orthogonal vectors in $V$. Then:
$$
\Vert u + v \Vert^2 = \Vert u \Vert^2 + \Vert v \Vert^2
$$
Cauchy-Schwarz Inequality
Suppose $u, v \in V$. Then:
$$
|\langle u, v \rangle| \le \Vert u \Vert \, \Vert v \Vert
$$
This inequality is an equality iff one of $u, v$ is a scalar multiple of the other.
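A quick check in $\mathbb{R}^2$ with $u = (1, 0)$ and $v = (1, 1)$:
$$
|\langle u, v \rangle| = 1 \le \sqrt{2} = \Vert u \Vert \, \Vert v \Vert,
$$
with strict inequality since neither vector is a scalar multiple of the other.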
Triangle Inequality
Suppose $u, v \in V$. Then:
$$
\Vert u + v \Vert \le \Vert u \Vert + \Vert v \Vert
$$
where we get equality iff one of $u, v$ is a non-negative multiple of the other.
Parallelogram Equality
Suppose $u, v \in V$. Then:
$$
\Vert u + v \Vert^2 + \Vert u - v \Vert^2 = 2\left(\Vert u \Vert^2 + \Vert v \Vert^2\right)
$$
6.B: Orthonormal Bases
The norm of an orthonormal linear combination
If $e_1, \dots, e_m$ is an orthonormal list of vectors in $V$, then:
$$
\Vert a_1e_1 + \dots + a_me_m \Vert^2 = |a_1|^2 + \dots + |a_m|^2
$$
for all $a_1, \dots, a_m \in \mathbb{F}$.
This list is LI.
Writing a vector as a linear combination of orthonormal basis
Suppose $e_1, \dots, e_n$ is an orthonormal basis of $V$ and $v \in V$. Then:
$$
v = \langle v, e_1 \rangle e_1 + \dots + \langle v, e_n \rangle e_n
$$
and:
$$
\Vert v \Vert^2 = |\langle v, e_1 \rangle|^2 + \dots + |\langle v, e_n \rangle|^2
$$
Gram-Schmidt Procedure
Suppose $v_1, \dots, v_m$ is a LI list of vectors in $V$. Let $e_1 = v_1 / \Vert v_1 \Vert$. For $j = 2, \dots, m$, define $e_j$ inductively by:
$$
e_j = \frac{v_j - \langle v_j, e_1 \rangle e_1 - \dots - \langle v_j, e_{j-1} \rangle e_{j-1}}{\Vert v_j - \langle v_j, e_1 \rangle e_1 - \dots - \langle v_j, e_{j-1} \rangle e_{j-1} \Vert}
$$
Then $e_1, \dots, e_m$ is an orthonormal list of vectors in $V$ such that:
$$
\operatorname{span}(v_1, \dots, v_j) = \operatorname{span}(e_1, \dots, e_j)
$$
for $j = 1, \dots, m$.
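A worked example in $\mathbb{R}^2$ with $v_1 = (1, 1)$ and $v_2 = (0, 1)$: first $e_1 = \frac{1}{\sqrt{2}}(1, 1)$, and since $\langle v_2, e_1 \rangle = \frac{1}{\sqrt{2}}$:
$$
v_2 - \langle v_2, e_1 \rangle e_1 = (0, 1) - \left(\tfrac{1}{2}, \tfrac{1}{2}\right) = \left(-\tfrac{1}{2}, \tfrac{1}{2}\right), \qquad e_2 = \frac{1}{\sqrt{2}}(-1, 1),
$$
which gives an orthonormal list with $\operatorname{span}(e_1, e_2) = \operatorname{span}(v_1, v_2) = \mathbb{R}^2$.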
Orthonormal list extends to orthonormal basis
Suppose $V$ is finite-dimensional. Then every orthonormal list of vectors in $V$ can be extended to an orthonormal basis of $V$.
Upper-triangular matrix with respect to orthonormal basis
Suppose $T \in \mathcal{L}(V)$. If $T$ has an upper-triangular matrix with respect to some basis of $V$, then $T$ has an upper-triangular matrix with respect to some orthonormal basis of $V$.
Schur's Theorem
Suppose $V$ is a finite-dimensional complex vector space and $T \in \mathcal{L}(V)$. Then $T$ has an upper-triangular matrix with respect to some orthonormal basis of $V$.