Final Review Sheet of Theorems

The following review sheet contains some of the more important theorems, definitions, and lemmas to know for the final for this class. Anything not covered here is assumed known, since we saw it repeatedly throughout the course (this includes things like linear combinations, span, ...).

Chapter 1: Vector Spaces

1.B: Vector Space Definition

Vector Space

A vector space is a set V along with an addition on V and a scalar multiplication on V such that the following properties hold:

  • Commutativity: u+v=v+u for all u,v ∈ V.
  • Associativity: (u+v)+w=u+(v+w) and (ab)v=a(bv) for all u,v,w ∈ V and all a,b ∈ F.
  • Additive Identity: There exists an element 0 ∈ V such that v+0=v for all v ∈ V.
  • Additive Inverse: For every v ∈ V there exists w ∈ V such that v+w=0.
  • Multiplicative Identity: 1v=v for all v ∈ V.
  • Distributive Properties: a(u+v)=au+av and (a+b)v=av+bv for all a,b ∈ F and all u,v ∈ V.
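The axioms above can be spot-checked numerically. The following is a minimal sketch for R² (function names `add` and `smul` are my own); a finite check like this only sanity-tests the axioms, since they quantify over all vectors and scalars.

```python
# Spot-check of the vector space axioms for R^2 with a few sample vectors.
u, v, w = (1.0, 2.0), (3.0, -1.0), (0.5, 4.0)
a, b = 2.0, -3.0

def add(x, y):          # vector addition in R^2
    return (x[0] + y[0], x[1] + y[1])

def smul(c, x):         # scalar multiplication in R^2
    return (c * x[0], c * x[1])

assert add(u, v) == add(v, u)                              # commutativity
assert add(add(u, v), w) == add(u, add(v, w))              # associativity of +
assert smul(a * b, v) == smul(a, smul(b, v))               # (ab)v = a(bv)
assert add(v, (0.0, 0.0)) == v                             # additive identity
assert add(v, smul(-1.0, v)) == (0.0, 0.0)                 # additive inverse (w = -v)
assert smul(1.0, v) == v                                   # multiplicative identity
assert smul(a, add(u, v)) == add(smul(a, u), smul(a, v))   # a(u+v) = au+av
assert smul(a + b, v) == add(smul(a, v), smul(b, v))       # (a+b)v = av+bv
```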

1.C: Subspaces and Direct Sums

Conditions for a subspace

A subset U ⊆ V is a subspace of V iff U satisfies the following three conditions:

  1. Additive Identity: 0 ∈ U
  2. Closed Under Addition: u,w ∈ U implies that u+w ∈ U
  3. Closed Under Scalar Multiplication: a ∈ F and u ∈ U implies that au ∈ U

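As a sketch, the three conditions can be spot-checked for the line U = {(x, 2x) : x ∈ R} in R² (the membership test `in_U` is my own helper; again, finite checks only illustrate the conditions, they do not prove closure for all elements).

```python
# Checking the three subspace conditions for U = {(x, 2x)} in R^2.
def in_U(v):
    return v[1] == 2 * v[0]

assert in_U((0.0, 0.0))                    # additive identity: 0 is in U
u, w = (1.0, 2.0), (-3.0, -6.0)
assert in_U(u) and in_U(w)
assert in_U((u[0] + w[0], u[1] + w[1]))    # closed under addition
a = 5.0
assert in_U((a * u[0], a * u[1]))          # closed under scalar multiplication
```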
Sum of Subsets

Suppose U1,...,Um are subsets of V. The sum of U1,...,Um, denoted U1 + ⋯ + Um, is the set of all possible sums of elements of U1,...,Um. More precisely:
U1 + ⋯ + Um = {u1 + ⋯ + um : u1 ∈ U1, ..., um ∈ Um}

Condition for a direct sum

Suppose U1,...,Um are subspaces of V. Then U1 + ⋯ + Um is a direct sum iff the only way to write 0 as a sum u1 + ⋯ + um, where each uj ∈ Uj, is by taking each uj = 0.

Direct sum of two subspaces

Suppose U and W are subspaces of V. Then U+W is a direct sum iff U ∩ W = {0}.

Chapter 2: Finite-Dimensional Vector Spaces

2.A: Span and LI

Linearly Independent

A list v1,...,vm of vectors in V is called linearly independent if the only choice of a1,...,am ∈ F that makes a1v1 + ⋯ + amvm equal 0 is a1 = ⋯ = am = 0.
The empty list () is also declared to be linearly independent.

Span

The set of all linear combinations of a list of vectors v1,...,vm in V is called the span of v1,...,vm, denoted span(v1,...,vm). In other words:
span(v1,...,vm) = {a1v1 + ⋯ + amvm : a1,...,am ∈ F}
The span of the empty list () is defined to be {0}.
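Linear independence can be tested computationally: a list v1,...,vm in Rⁿ is LI iff Gaussian elimination on the vectors (as rows) leaves m nonzero rows. The following is a sketch with my own helper names (`rank`, `linearly_independent`), not part of the course material.

```python
# Testing linear independence via row reduction.
def rank(rows, eps=1e-12):
    rows = [list(r) for r in rows]
    r = 0
    for c in range(len(rows[0])):
        # find a pivot in column c at or below row r
        piv = next((i for i in range(r, len(rows)) if abs(rows[i][c]) > eps), None)
        if piv is None:
            continue
        rows[r], rows[piv] = rows[piv], rows[r]
        for i in range(len(rows)):
            if i != r and abs(rows[i][c]) > eps:
                f = rows[i][c] / rows[r][c]
                rows[i] = [x - f * y for x, y in zip(rows[i], rows[r])]
        r += 1
    return r

def linearly_independent(vectors):
    return rank(vectors) == len(vectors)

assert linearly_independent([(1, 0, 0), (0, 1, 0), (1, 1, 1)])
assert not linearly_independent([(1, 2, 3), (2, 4, 6)])   # v2 = 2*v1
```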

2.B: Bases

Linearly Independent List extends to a basis

Every LI list of vectors in a finite-dimensional vector space can be extended to a basis of the vector space.

Every subspace of V is part of a direct sum equal to V

Suppose V is finite-dimensional and U is a subspace of V. Then there is a subspace W of V such that V = U ⊕ W.

2.C: Dimension

LI list of the right length is a basis

Suppose V is finite-dimensional. Then every LI list of vectors in V with length dim(V) is a basis of V.

Spanning list of the right length is a basis

Suppose V is finite-dimensional. Then every spanning list of vectors in V with length dim(V) is a basis of V.

Essentially, if a list has the right number of vectors (length dim(V)) and satisfies either LI or spanning, then it is a basis.

Chapter 3: Linear Maps

3.A: The Vector Space of Linear Maps

Linear Map

A linear map from V to W is a function T : V → W such that:

  1. T(u+v)=Tu+Tv for all u,v ∈ V (additivity)
  2. T(λv)=λ(Tv) for all λ ∈ F and v ∈ V (homogeneity)

Linear maps take 0 to 0

Suppose T is a linear map from V to W. Then T(0)=0.

When working out properties of T, it often helps to think of T as a matrix, since every linear map gets a matrix M(T) in 3.C.

3.B: Null Spaces and Ranges

Nullspace

null space, null(T)

For T ∈ L(V,W), the null space of T, denoted null(T), is the subset of V consisting of those vectors that T maps to 0. So null(T) = {v ∈ V : Tv = 0}.

The nullspace has the following properties:

The nullspace is a subspace

Suppose T ∈ L(V,W). Then null(T) is a subspace of V.

injective

A function T : V → W is injective if Tu = Tv implies that u = v.

Injectivity is equivalent to null space equals {0}

Let T ∈ L(V,W). Then T is injective iff null(T) = {0}.

Range

Range

For T : V → W, the range of T is the subset of W consisting of those vectors that are of the form Tv for some v ∈ V:
range(T) = {Tv : v ∈ V}

The following are properties:

The range is a subspace

If T ∈ L(V,W), then range(T) is a subspace of W.

surjective

A function T : V → W is surjective if its range equals W.

Fundamental Theorem of Linear Maps

Fundamental Theorem of Linear maps

Suppose V is finite-dimensional and T ∈ L(V,W). Then range(T) is finite-dimensional and:
dim(V)=dim(null(T))+dim(range(T))
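A numeric sketch of the theorem for the map T(x, y, z) = (x + z, y + z) from R³ to R² (the `rank` helper and the example map are my own choices): dim range(T) is the rank of its matrix, and (−1, −1, 1) spans null(T), so 3 = 1 + 2 as the theorem claims.

```python
# Verifying dim(V) = dim(null(T)) + dim(range(T)) for one example.
def rank(rows, eps=1e-12):
    rows = [list(r) for r in rows]
    r = 0
    for c in range(len(rows[0])):
        piv = next((i for i in range(r, len(rows)) if abs(rows[i][c]) > eps), None)
        if piv is None:
            continue
        rows[r], rows[piv] = rows[piv], rows[r]
        for i in range(len(rows)):
            if i != r and abs(rows[i][c]) > eps:
                f = rows[i][c] / rows[r][c]
                rows[i] = [x - f * y for x, y in zip(rows[i], rows[r])]
        r += 1
    return r

def T(v):
    return (v[0] + v[2], v[1] + v[2])

A = [[1, 0, 1],
     [0, 1, 1]]                   # matrix of T in the standard bases
dim_V, dim_range, dim_null = 3, rank(A), 1
assert T((-1, -1, 1)) == (0, 0)   # (-1, -1, 1) spans null(T), so dim null(T) = 1
assert dim_V == dim_null + dim_range
```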

3.C: Matrices

matrix of a linear map, M(T)

Suppose T ∈ L(V,W), v1,...,vn is a basis of V, and w1,...,wm is a basis of W. The matrix of T with respect to these bases is the m×n matrix M(T) whose entries A_{j,k} are defined by:
Tv_k = A_{1,k}w_1 + ⋯ + A_{m,k}w_m
If the bases are not clear from the context, then the notation M(T,(v1,...,vn),(w1,...,wm)) is used.

To help construct the matrix: the k-th column of M(T) consists of the scalars needed to write Tv_k as a linear combination of the basis w1,...,wm.
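As a sketch of this construction (the coefficient-tuple representation and helper names are my own), here is M(T) for the differentiation map T = d/dx from P₂(R) to P₁(R), with bases (1, x, x²) for V and (1, x) for W. Each column of M(T) is the coordinate vector of Tv_k.

```python
# Building M(T) column by column for T = d/dx : P_2 -> P_1.
def deriv(p):                     # p = (a0, a1, a2, ...) represents a0 + a1 x + ...
    return tuple(j * p[j] for j in range(1, len(p)))

V_basis = [(1, 0, 0), (0, 1, 0), (0, 0, 1)]   # 1, x, x^2 in P_2
m = 2                                          # dim P_1, basis (1, x)
cols = []
for vk in V_basis:
    Tvk = deriv(vk)
    # pad with zeros so each column has one entry per basis vector of W
    cols.append(tuple(Tvk[i] if i < len(Tvk) else 0 for i in range(m)))
M_T = [[col[j] for col in cols] for j in range(m)]   # rows of M(T)
assert M_T == [[0, 1, 0],
               [0, 0, 2]]        # T(1)=0, T(x)=1, T(x^2)=2x
```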

Matrix Multiplication

Suppose A is an m×n matrix and C is an n×p matrix. Then AC is defined to be the m×p matrix whose entry in row j and column k is given by:
$$
(AC)_{j,k} = \sum_{r=1}^{n} A_{j,r} C_{r,k}
$$
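The entry formula transcribes directly into code; this sketch (my own `matmul` helper) computes each (AC)_{j,k} exactly as the sum above.

```python
# Matrix multiplication straight from the entry formula.
def matmul(A, C):
    m, n, p = len(A), len(C), len(C[0])
    assert all(len(row) == n for row in A), "inner dimensions must agree"
    return [[sum(A[j][r] * C[r][k] for r in range(n)) for k in range(p)]
            for j in range(m)]

A = [[1, 2],
     [3, 4]]          # 2x2
C = [[5, 6, 7],
     [8, 9, 10]]      # 2x3
assert matmul(A, C) == [[21, 24, 27],
                        [47, 54, 61]]   # a 2x3 result, as expected
```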

Notation

Suppose A is an m×n matrix. Then:

  • If 1 ≤ j ≤ m, then A_{j,·} denotes the 1×n matrix consisting of row j of A.
  • If 1 ≤ k ≤ n, then A_{·,k} denotes the m×1 matrix consisting of column k of A.

3.D: Invertible Maps and Isomorphic Vector Spaces

invertible, inverse

  • A linear map T ∈ L(V,W) is called invertible if there exists a linear map S ∈ L(W,V) such that ST equals the identity map on V and TS equals the identity map on W.
  • A linear map S ∈ L(W,V) satisfying ST = I (on V) and TS = I (on W) is called the inverse of T.

When T is invertible, T is an isomorphism, and its existence makes V and W isomorphic.

Dimension shows whether vector spaces are isomorphic

Two finite-dimensional vector spaces over F are isomorphic iff they have the same dimension.

matrix of a vector, M(v)

Suppose v ∈ V and v1,...,vn is a basis of V. The matrix of v with respect to this basis is the n-by-1 matrix:
$$
M(v) = \begin{pmatrix} c_1 \\ \vdots \\ c_n \end{pmatrix}
$$
where c1,...,cn are the scalars such that:
v = c1v1 + ⋯ + cnvn

Linear Maps act like matrix multiplication

Suppose T ∈ L(V,W) and v ∈ V. Suppose v1,...,vn is a basis of V and w1,...,wm is a basis of W. Then:
M(Tv)=M(T)M(v)
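A quick sketch verifying M(Tv) = M(T)M(v) for T(x, y) = (2x + y, 3y) on R², with the standard basis on both sides (the example map and helper names are mine).

```python
# Checking M(Tv) = M(T) M(v) for one concrete operator.
def matmul(A, C):
    return [[sum(A[j][r] * C[r][k] for r in range(len(C)))
             for k in range(len(C[0]))] for j in range(len(A))]

M_T = [[2, 1],
       [0, 3]]                    # columns are T(e1), T(e2) in the standard basis
v = (1, 1)
M_v = [[v[0]], [v[1]]]            # coordinates of v as an n-by-1 matrix
Tv = (2 * v[0] + v[1], 3 * v[1])  # apply T directly
M_Tv = [[Tv[0]], [Tv[1]]]
assert matmul(M_T, M_v) == M_Tv   # both sides give [[3], [3]]
```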

operator, L(V)

A linear map from a vector space to itself is called an operator. The notation L(V) denotes the set of all operators on V. In other words, L(V)=L(V,V).

Injectivity is equivalent to surjectivity in finite dimensions

Suppose V is finite-dimensional and T ∈ L(V). Then the following are equivalent:

  • T is invertible
  • T is injective
  • T is surjective

Chapter 4: Polynomials (short)

Division Algorithm for Polynomials

Suppose that p,s ∈ P(F) with s ≠ 0. Then there exist unique polynomials q,r ∈ P(F) such that:
p=sq+r
where deg(r)<deg(s).
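The division algorithm can be carried out on coefficient lists. This is a sketch with my own representation (coefficients stored lowest degree first) and helper name, not a prescribed implementation.

```python
# Division algorithm for polynomials: p = s q + r with deg(r) < deg(s).
def poly_divmod(p, s):
    p, s = list(p), list(s)
    while s and s[-1] == 0:       # drop trailing zero coefficients of the divisor
        s.pop()
    assert s, "divisor must be nonzero"
    q = [0.0] * max(len(p) - len(s) + 1, 1)
    r = list(p)
    for k in range(len(p) - len(s), -1, -1):
        q[k] = r[k + len(s) - 1] / s[-1]      # match the current leading term
        for i, c in enumerate(s):
            r[k + i] -= q[k] * c              # subtract q_k * z^k * s(z)
    return q, r[:len(s) - 1]                  # quotient, remainder (deg r < deg s)

# z^2 - 1 = (z - 1)(z + 1) + 0
q, r = poly_divmod([-1, 0, 1], [-1, 1])
assert q == [1.0, 1.0] and all(abs(c) < 1e-12 for c in r)
```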

Zeroes of a polynomial correspond to degree-1 factors

Suppose p ∈ P(F) and λ ∈ F. Then p(λ)=0 iff there is a polynomial q ∈ P(F) such that:
p(z) = (z − λ)q(z)
for every z ∈ F.
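This factoring can be seen via synthetic division (Horner's scheme): dividing p by (z − λ) leaves remainder p(λ), so a zero remainder exactly yields the factorization above. A sketch with my own helper (coefficients highest degree first here):

```python
# Synthetic division of p(z) by (z - lam); the remainder equals p(lam).
def synth_div(p, lam):
    q = [p[0]]
    for c in p[1:]:
        q.append(c + lam * q[-1])
    return q[:-1], q[-1]          # quotient coefficients, remainder p(lam)

p = [1, -6, 11, -6]               # z^3 - 6z^2 + 11z - 6 = (z-1)(z-2)(z-3)
q, rem = synth_div(p, 2)          # divide out the zero lambda = 2
assert rem == 0                   # p(2) = 0, so (z - 2) is a factor
assert q == [1, -4, 3]            # q(z) = z^2 - 4z + 3 = (z-1)(z-3)
```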

zero of a polynomial, factor

A number λ ∈ F is called a zero (or root) of a polynomial p ∈ P(F) if:
p(λ)=0
On the other hand, a polynomial s ∈ P(F) is called a factor of p if there exists a polynomial q ∈ P(F) such that p=sq.

To note, all complex-coefficient polynomials can be factored into complex degree-1 factors, while over the reals that's not necessarily the case (irreducible degree-2 factors may remain). See Chapter 4 - Polynomials (short)#^86f79a for more on this.

Chapter 5: Eigenvalues, Eigenvectors, and Invariant Subspaces

5.A: Invariant Subspaces/Definitions of Eigenvalues and Vectors

invariant subspace

Suppose T ∈ L(V). A subspace U of V is invariant under T if u ∈ U implies that Tu ∈ U.

eigenvalue

Suppose T ∈ L(V). A number λ ∈ F is called an eigenvalue of T if there exists v ∈ V such that v ≠ 0 and Tv = λv.

Lemma

Suppose V is finite-dimensional, T ∈ L(V), and λ ∈ F. Then the following are equivalent:

  • λ is an eigenvalue of T;
  • T − λI is not injective;
  • T − λI is not surjective;
  • T − λI is not invertible.

Notice that if v ∈ null(T − λI) and v ≠ 0, then v is an eigenvector of T corresponding to λ.
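A 2×2 sketch of this observation (the example operator and helper names are mine): for the upper-triangular T below, λ = 3 is an eigenvalue, and any nonzero v with (T − 3I)v = 0 satisfies Tv = 3v.

```python
# Eigenvector as a nonzero element of null(T - lambda I), checked for a 2x2 example.
T = [[2, 1],
     [0, 3]]

def apply(A, v):
    return tuple(sum(A[j][k] * v[k] for k in range(len(v))) for j in range(len(A)))

lam = 3
v = (1, 1)                                # nonzero solution of (T - 3I)v = 0
shifted = [[T[0][0] - lam, T[0][1]],
           [T[1][0], T[1][1] - lam]]      # the matrix of T - lambda I
assert apply(shifted, v) == (0, 0)        # v is in null(T - lambda I)
assert apply(T, v) == (lam * v[0], lam * v[1])   # hence Tv = lambda v
```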

Linearly Independent Eigenvectors

Let T ∈ L(V). Suppose λ1,...,λm are distinct eigenvalues of T and v1,...,vm are corresponding eigenvectors. Then v1,...,vm is LI.

5.B: Eigenvectors and Upper-Triangular Matrices

p(T)

Suppose T ∈ L(V) and p ∈ P(F) is a polynomial given by:
$$
p(z) = a_0 + a_1z + a_2z^2 + \dots + a_mz^m
$$
for z ∈ F. Then p(T) is the operator defined by:
$$
p(T) = a_0I + a_1T + a_2T^2 + \dots + a_mT^m
$$
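A sketch of evaluating a polynomial at an operator (helper names are mine): for T(x, y) = (2x + y, 3y) and p(z) = z² − 5z + 6 = (z − 2)(z − 3), the computation below shows p(T) comes out to the zero operator, since 2 and 3 are the eigenvalues of this particular T.

```python
# Computing p(T) = 6I - 5T + T^2 for a 2x2 operator.
def matmul(A, C):
    return [[sum(A[j][r] * C[r][k] for r in range(len(C)))
             for k in range(len(C[0]))] for j in range(len(A))]

def madd(A, B):
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def smul(c, A):
    return [[c * x for x in row] for row in A]

T = [[2, 1],
     [0, 3]]
I = [[1, 0], [0, 1]]
p_T = madd(madd(smul(6, I), smul(-5, T)), matmul(T, T))   # 6I - 5T + T^2
assert p_T == [[0, 0], [0, 0]]
```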

Operators on complex vector spaces have an eigenvalue

Every operator on a finite-dimensional, nonzero, complex vector space has an eigenvalue.

Conditions for upper-triangular matrix

Suppose T ∈ L(V) and v1,...,vn is a basis of V. Then the following are equivalent:

  1. The matrix of T, M(T), with respect to v1,...,vn is upper-triangular
  2. Tv_j ∈ span(v1,...,vj) for each j=1,...,n
  3. span(v1,...,vj) is invariant under T for each j=1,...,n

Determination of invertibility from upper-triangular matrix

Suppose T ∈ L(V) has an upper-triangular matrix with respect to some basis of V. Then T is invertible iff all the entries on the diagonal of that upper-triangular matrix are nonzero.

Determination of eigenvalues from upper-triangular matrix

Suppose T ∈ L(V) has an upper-triangular matrix with respect to some basis of V. Then the eigenvalues of T are precisely the entries on the diagonal of that upper-triangular matrix.

5.C: Eigenspaces and Diagonal Matrices

eigenspace, E(λ,T)

Suppose T ∈ L(V) and λ ∈ F. The eigenspace of T corresponding to λ, denoted E(λ,T), is defined by:
E(λ,T) = null(T − λI)
In other words, E(λ,T) is the set of all eigenvectors of T corresponding to λ, along with the 0 vector.

Sum of eigenspaces is a direct sum

Suppose V is finite-dimensional and T ∈ L(V). Suppose also that λ1,...,λm are distinct eigenvalues of T. Then:
E(λ1,T) + ⋯ + E(λm,T)
is a direct sum, and furthermore,
dim E(λ1,T) + ⋯ + dim E(λm,T) ≤ dim(V)

Enough eigenvalues implies diagonalizability

If T ∈ L(V) has dim(V) distinct eigenvalues, then T is diagonalizable.

Chapter 6: Inner Product Spaces

6.A: Inner Products and Norms

inner product

An inner product on V is a function that takes each ordered pair (u,v) of elements of V to a number ⟨u,v⟩ ∈ F and has the following properties:

  • (positivity): ⟨v,v⟩ ≥ 0 for all v ∈ V
  • (definiteness): ⟨v,v⟩ = 0 iff v = 0
  • (additivity in the first slot): ⟨u+v,w⟩ = ⟨u,w⟩ + ⟨v,w⟩ for all u,v,w ∈ V
  • (homogeneity in the first slot): ⟨λu,v⟩ = λ⟨u,v⟩ for all λ ∈ F and all u,v ∈ V
  • (conjugate symmetry): ⟨u,v⟩ equals the complex conjugate of ⟨v,u⟩ for all u,v ∈ V

Basic Properties of an Inner Product

  • (a) For each fixed u ∈ V, the function that takes v to ⟨v,u⟩ is a linear map from V to F
  • (b) ⟨0,u⟩ = 0 for every u ∈ V
  • (c) ⟨u,0⟩ = 0 for every u ∈ V
  • (d) ⟨u,v+w⟩ = ⟨u,v⟩ + ⟨u,w⟩ for all u,v,w ∈ V
  • (e) ⟨u,λv⟩ = λ̄⟨u,v⟩ for all λ ∈ F and u,v ∈ V, where λ̄ is the complex conjugate of λ

norm, ||v||

For v ∈ V, the norm of v, denoted ||v||, is defined by:
||v|| = √⟨v,v⟩

Basic Properties of the norm

Suppose v ∈ V:

  • (a) ||v|| = 0 iff v = 0
  • (b) ||λv|| = |λ| ||v|| for all λ ∈ F

orthogonal

Two vectors u,v ∈ V are called orthogonal if ⟨u,v⟩ = 0.

Pythagorean Theorem

Suppose u,v are orthogonal vectors in V. Then:
$$
||u + v||^2 = ||u||^2 + ||v||^2
$$

Cauchy-Schwarz Inequality

Suppose u,v ∈ V. Then:
|⟨u,v⟩| ≤ ||u|| ||v||
This inequality is an equality iff one of u,v is a scalar multiple of the other.

Triangle Inequality

Suppose u,v ∈ V. Then:
||u + v|| ≤ ||u|| + ||v||
with equality iff one of u,v is a non-negative multiple of the other.
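A numeric spot-check of Cauchy-Schwarz and the triangle inequality in R³ with the standard dot product (helper names are mine; the small epsilons only guard against floating-point round-off).

```python
# Spot-checking Cauchy-Schwarz and the triangle inequality.
from math import sqrt

def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

def norm(v):
    return sqrt(dot(v, v))        # ||v|| = sqrt(<v, v>)

u, v = (1.0, 2.0, 3.0), (-2.0, 0.0, 1.0)
assert abs(dot(u, v)) <= norm(u) * norm(v) + 1e-12        # Cauchy-Schwarz
s = tuple(x + y for x, y in zip(u, v))
assert norm(s) <= norm(u) + norm(v) + 1e-12               # triangle inequality
# equality case of Cauchy-Schwarz: w is a scalar multiple of u
w = tuple(2.0 * x for x in u)
assert abs(abs(dot(u, w)) - norm(u) * norm(w)) < 1e-9
```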

Parallelogram Equality

Suppose u,v ∈ V. Then:
||u + v||^2 + ||u − v||^2 = 2(||u||^2 + ||v||^2)

6.B: Orthonormal Bases

The norm of an orthonormal linear combination

If e1,...,em is an orthonormal list of vectors in V, then:
$$
\Vert a_1e_1 + \dots + a_me_m \Vert^2 = |a_1|^2 + \dots + |a_m|^2
$$
for all a_1,...,a_m ∈ F.

In particular, any orthonormal list e1,...,em is LI.

Writing a vector as a linear combination of orthonormal basis

Suppose e1,...,en is an orthonormal basis of V and v ∈ V. Then:
v = ⟨v,e1⟩e1 + ⋯ + ⟨v,en⟩en
and:
$$
\Vert v \Vert^2 = |\langle v, e_1 \rangle|^2 + \dots + |\langle v, e_n \rangle|^2
$$

Gram-Schmidt Procedure

Suppose v1,...,vm is a LI list of vectors in V. Let e1 = v1/||v1||. For j=2,...,m, define ej inductively by:
$$
e_j = \frac{v_j - \langle v_j, e_1 \rangle e_1 - \dots - \langle v_j, e_{j-1} \rangle e_{j-1}}{\Vert v_j - \langle v_j, e_1 \rangle e_1 - \dots - \langle v_j, e_{j-1} \rangle e_{j-1} \Vert}
$$
Then e1,...,em is an orthonormal list of vectors in V such that:
span(v1,...,vj) = span(e1,...,ej)
for j=1,...,m.
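The procedure translates directly into code. This sketch works in Rⁿ with the standard dot product (helper names are mine): subtract off the components along the e_i already produced, then normalize.

```python
# Gram-Schmidt in R^n, following the formula term by term.
from math import sqrt

def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

def gram_schmidt(vs, eps=1e-12):
    es = []
    for v in vs:
        w = list(v)
        for e in es:
            c = dot(v, e)                         # <v_j, e_i>
            w = [wi - c * ei for wi, ei in zip(w, e)]
        n = sqrt(dot(w, w))
        assert n > eps, "input list must be linearly independent"
        es.append([wi / n for wi in w])           # normalize
    return es

e1, e2 = gram_schmidt([(3.0, 0.0), (1.0, 2.0)])
assert e1 == [1.0, 0.0]
assert abs(dot(e1, e2)) < 1e-12                   # orthogonal
assert abs(dot(e2, e2) - 1.0) < 1e-12             # unit norm
```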

Orthonormal list extends to orthonormal basis

Suppose V is finite-dimensional. Then every orthonormal list of vectors in V can be extended to an orthonormal basis of V.

Upper-triangular matrix with respect to orthonormal basis

Suppose T ∈ L(V). If T has an upper-triangular matrix with respect to some basis of V, then T has an upper-triangular matrix with respect to some orthonormal basis of V.

Schur's Theorem

Suppose V is a finite-dimensional complex vector space and T ∈ L(V). Then T has an upper-triangular matrix with respect to some orthonormal basis of V.