-
Let S = {v_1,...,v_n} and T = {w_1,...,w_n}
be bases for the n-dimensional vector space V. Let P_{S<-T} be
the transition matrix from the T-basis to the S-basis. Then P_{S<-T}
is nonsingular and P_{S<-T}^{-1} is the transition matrix from
the S-basis to the T-basis.
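As a small sanity check, the fact above can be verified numerically. The bases below are chosen purely for illustration (they are not from the notes): S = {(1,0), (1,1)} and T = {(2,1), (0,1)} in R^2.

```python
def mat_mul(X, Y):
    # 2x2 matrix product
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def inv2(M):
    # inverse of a 2x2 matrix via the adjugate formula
    (a, b), (c, d) = M
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

# Columns of P are the T-basis vectors written in S-coordinates:
# (2,1) =  1*(1,0) + 1*(1,1)  -> coords (1, 1)
# (0,1) = -1*(1,0) + 1*(1,1)  -> coords (-1, 1)
P = [[1, -1],
     [1,  1]]

P_inv = inv2(P)
# The columns of P_inv are the S-basis vectors in T-coordinates:
# (1,0) = 0.5*(2,1) - 0.5*(0,1) -> coords (0.5, -0.5)
assert P_inv == [[0.5, 0.5], [-0.5, 0.5]]
# P is nonsingular and P * P_inv is the identity:
assert mat_mul(P, P_inv) == [[1.0, 0.0], [0.0, 1.0]]
```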
-
S = {u_1,...,u_n} is called orthogonal
if the dot product of any two distinct vectors from the set is 0. S is
called orthonormal if it is orthogonal and each
of its vectors has length one.
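A quick check of both conditions, using a sample orthonormal set in R^2 chosen for illustration:

```python
import math

# u1 and u2 are unit vectors along the diagonals of R^2.
u1 = (1 / math.sqrt(2),  1 / math.sqrt(2))
u2 = (1 / math.sqrt(2), -1 / math.sqrt(2))

def dot(x, y):
    return sum(a * b for a, b in zip(x, y))

assert abs(dot(u1, u2)) < 1e-12        # orthogonal: distinct vectors have dot product 0
assert abs(dot(u1, u1) - 1) < 1e-12    # orthonormal: each vector has length 1
assert abs(dot(u2, u2) - 1) < 1e-12
```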
-
Any orthonormal set of vectors is linearly independent.
-
If W is a subspace of V, then a vector u is called
orthogonal to W if it is orthogonal to every vector in W. The set of all
such u is called the orthogonal
complement of W, written W^⊥.
-
If W is a subspace, then its orthogonal complement W^⊥ is
also a subspace, and the intersection of W and W^⊥ is {0}.
-
Every vector in R^n can be written uniquely as the
sum of a vector in W and a vector in the orthogonal complement
of W.
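The decomposition can be computed by orthogonal projection. In this illustrative example (not from the notes), W = span{(1, 2)} in R^2 and v = (3, 1):

```python
def dot(x, y):
    return sum(a * b for a, b in zip(x, y))

w_basis = (1, 2)   # spans the subspace W
v = (3, 1)

# The W-component of v is its orthogonal projection onto W:
c = dot(v, w_basis) / dot(w_basis, w_basis)
w_part = (c * w_basis[0], c * w_basis[1])        # component in W
u_part = (v[0] - w_part[0], v[1] - w_part[1])    # component in W-perp

assert dot(u_part, w_basis) == 0                 # u_part is orthogonal to W
assert (w_part[0] + u_part[0], w_part[1] + u_part[1]) == (3.0, 1.0)
```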
-
The orthogonal complement of the orthogonal complement of W
is W.
-
The null space of a matrix A is the orthogonal complement of
the row space of A.
-
The null space of A^T is the orthogonal complement
of the column space of A.
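A small numerical illustration of the last fact, with a rank-1 matrix chosen so that A^T has a nontrivial null space (the matrix is an assumption, not from the notes):

```python
def dot(x, y):
    return sum(a * b for a, b in zip(x, y))

A = [[1, 3],
     [2, 6]]
At = [[A[j][i] for j in range(2)] for i in range(2)]   # transpose of A
cols = [(1, 2), (3, 6)]                                # columns of A

# v solves A^T v = 0, so v lies in the null space of A^T:
v = (-2, 1)
assert all(dot(row, v) == 0 for row in At)
# ...and v is orthogonal to every column, hence to the whole column space:
assert all(dot(c, v) == 0 for c in cols)
```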
-
If Av = λv for some nonzero vector v, then λ
is called an eigenvalue of A and v is
a corresponding eigenvector of A.
-
det(A - λI) is called the
characteristic polynomial of A, and its roots are the eigenvalues.
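For a concrete (illustrative) 2x2 example, take A = [[2, 1], [1, 2]]. Its characteristic polynomial expands to λ^2 - 4λ + 3, with roots 1 and 3, and each root does have an eigenvector:

```python
A = [[2, 1],
     [1, 2]]

def char_poly(lam):
    # det(A - lam*I) for a 2x2 matrix
    return (A[0][0] - lam) * (A[1][1] - lam) - A[0][1] * A[1][0]

# (2 - lam)^2 - 1 = lam^2 - 4*lam + 3 has roots 1 and 3:
assert char_poly(1) == 0 and char_poly(3) == 0

def matvec(M, v):
    return tuple(sum(M[i][j] * v[j] for j in range(2)) for i in range(2))

# Each root lam has an eigenvector v with A v = lam v:
assert matvec(A, (1, 1)) == (3, 3)      # eigenvalue 3, eigenvector (1, 1)
assert matvec(A, (1, -1)) == (1, -1)    # eigenvalue 1, eigenvector (1, -1)
```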
-
A is nonsingular if and only if 0 is not an eigenvalue of A.
-
A is similar to B if
there is a nonsingular matrix P with B = P^{-1}AP.
-
A is called diagonalizable
if it is similar to a diagonal matrix.
-
If all the roots of the characteristic polynomial of a matrix
are real and distinct, then the matrix is diagonalizable. (The converse
is false: the identity matrix is diagonalizable even though its
characteristic polynomial has a repeated root. In general, an n x n
matrix is diagonalizable if and only if it has n linearly independent
eigenvectors.)
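The diagonalization itself can be checked directly. In this illustrative example, A = [[2, 1], [1, 2]] has distinct real eigenvalues 3 and 1, and P is built from the corresponding eigenvectors:

```python
def mat_mul(X, Y):
    # 2x2 matrix product
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def inv2(M):
    # inverse of a 2x2 matrix via the adjugate formula
    (a, b), (c, d) = M
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

A = [[2, 1], [1, 2]]
P = [[1,  1],
     [1, -1]]          # columns are the eigenvectors (1, 1) and (1, -1)

D = mat_mul(inv2(P), mat_mul(A, P))
# P^{-1} A P is diagonal, with the eigenvalues on the diagonal:
assert D == [[3.0, 0.0], [0.0, 1.0]]
```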
-
The eigenvalues of a symmetric matrix are all real.
-
If A is symmetric, then the eigenvectors that belong to
distinct eigenvalues are orthogonal.
-
A matrix A is called orthogonal
if A^{-1} = A^T.
-
A matrix is orthogonal if and only if its rows (columns)
form an orthonormal set of vectors.
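A rotation matrix is a standard example of an orthogonal matrix; the angle below is chosen arbitrarily for illustration:

```python
import math

t = math.pi / 6
Q = [[math.cos(t), -math.sin(t)],
     [math.sin(t),  math.cos(t)]]

def dot(x, y):
    return sum(a * b for a, b in zip(x, y))

rows = Q
cols = [(Q[0][0], Q[1][0]), (Q[0][1], Q[1][1])]

# Rows and columns each form an orthonormal set (equivalent to Q^{-1} = Q^T):
assert abs(dot(rows[0], rows[1])) < 1e-12
assert abs(dot(rows[0], rows[0]) - 1) < 1e-12
assert abs(dot(cols[0], cols[1])) < 1e-12
assert abs(dot(cols[1], cols[1]) - 1) < 1e-12
```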
-
A is symmetric if and only if it is orthogonally
diagonalizable.
-
If L is a linear transformation, then L(c_1 v_1
+ ... + c_n v_n) = c_1 L(v_1) + ... +
c_n L(v_n).
-
If L is a linear transformation, then L(0) = 0 and L(u - v)
= L(u) - L(v).
-
A linear transformation is completely determined by where it
takes a basis.
-
A linear transformation is called 1-1
if L(u) = L(v) implies that u = v.
-
The kernel of a linear
transformation is the set of v with L(v) = 0.
-
The kernel of a linear transformation is a subspace of its
domain.
-
A linear transformation is 1-1 if and only if its kernel is
{0}.
-
The set of all L(v), for v in the domain of L, is called the range
of L.
-
The range of a linear transformation is a subspace.
-
If the domain of a linear transformation L is V, then
dim(ker L) + dim(range L) = dim(V).
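A small worked instance of this dimension formula, using an illustrative rank-1 map L(x) = Ax on R^2 (the matrix is an assumption, not from the notes):

```python
A = [[1, 2],
     [2, 4]]

def matvec(M, v):
    return tuple(sum(M[i][j] * v[j] for j in range(2)) for i in range(2))

# ker L is spanned by (2, -1), so dim(ker L) = 1:
assert matvec(A, (2, -1)) == (0, 0)

# range L is spanned by the single column direction (1, 2), so dim(range L) = 1:
assert matvec(A, (1, 0)) == (1, 2)
assert matvec(A, (0, 1)) == (2, 4)   # = 2 * (1, 2), on the same line

# dim(ker L) + dim(range L) = 1 + 1 = 2 = dim(R^2)
```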
-
If L: V -> W is a linear transformation between vector spaces
of the same finite dimension, then L is
1-1 if and only if L is onto.
-
If L(x) = Ax, then A is diagonalizable, with n linearly
independent eigenvectors, if and only if the matrix of L with respect to
a basis of those eigenvectors is diagonal.