Halmos Sections 37 and 38: Matrices and Matrices of Linear Transformations (Pt. III)
Point of post: This is a continuation of this post.
Problem: Prove that if and are the complex matrices
respectively, and if then .
Proof: This is just plain old computation; we leave it to the reader.
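Since the matrices in the statement are not reproduced above, the following is only a generic numpy sketch of how such a "plain old computation" can be checked mechanically; the matrices A and B below are hypothetical stand-ins, not the ones from Halmos's exercise.

```python
import numpy as np

# Hypothetical stand-in matrices -- NOT the ones from the exercise --
# used only to illustrate verifying a matrix identity by direct computation.
A = np.array([[0, 1],
              [1, 0]], dtype=complex)
B = np.array([[1, 0],
              [0, -1]], dtype=complex)

# Example identities one might be asked to verify: A and B anticommute,
# and each squares to the identity.
assert np.allclose(A @ B, -(B @ A))
assert np.allclose(A @ A, np.eye(2))
assert np.allclose(B @ B, np.eye(2))
```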
Problem: If for some -dimensional -space , and if does it follow that ?
Proof: No, this is definitely not true. Take to be the unique members of such that
We see then that
so that . That said, so that .
Problem: What happens to the matrix of a linear transformation on a -dimensional -space with respect to the ordered basis when the matrix is computed with respect to for ?
Proof: Both the rows and the columns are permuted via . Put more explicitly,
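A small numpy sketch (my own illustration, not from the text) makes this concrete: if P is the permutation matrix of the reordering, the matrix computed with respect to the permuted basis is P^{-1} M P, i.e. each entry of the new matrix is the entry of the old one in the permuted row and column positions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
M = rng.integers(-5, 5, size=(n, n))   # matrix of the transformation w.r.t. x_1, ..., x_n

perm = [2, 0, 3, 1]                    # a sample reordering of the basis
P = np.eye(n, dtype=int)[:, perm]      # permutation matrix: P e_j = e_perm[j]

# Matrix with respect to the permuted basis: rows and columns permuted together.
N = P.T @ M @ P                        # P.T = P^{-1} for a permutation matrix
assert np.array_equal(N, M[np.ix_(perm, perm)])
```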
a) Suppose that is an -dimensional -space with basis . Suppose that are distinct. If is such that and is such that , prove that there exist scalars such that .
b) Prove that if and for every then for some .
a) We note that for each there exist scalars such that
But, this must be equal to
or, upon subtraction
But since is linearly independent, this implies that for . Since for , we may conclude that for . Looking back, we see that this implies that
Since was arbitrary, the conclusion follows.
b) See here.
Problem: If and are linearly independent sets of vectors in an -dimensional -space then there exists some such that for
Proof: Extend to a basis and extend to a basis . Then, let be the unique element of such that . Since this is a bijection between bases, our previous characterization of isomorphisms lets us conclude that .
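As a concrete illustration of the argument, here is a numpy sketch over F = R: extend each independent set to a basis by appending standard basis vectors, and take T to be the map carrying the first basis onto the second. The helper and the specific vectors below are my own, not Halmos's.

```python
import numpy as np

# Two linearly independent sets {x_1, x_2} and {y_1, y_2} in R^3 (m = 2, n = 3).
X = np.array([[1., 0., 2.],
              [0., 1., 1.]]).T          # columns are x_1, x_2
Y = np.array([[1., 1., 0.],
              [0., 2., 1.]]).T          # columns are y_1, y_2

def extend_to_basis(cols, n):
    """Append standard basis vectors to the given independent columns
    until they span R^n (a concrete version of 'extend to a basis')."""
    basis = cols
    for e in np.eye(n):
        candidate = np.column_stack([basis, e])
        if np.linalg.matrix_rank(candidate) > np.linalg.matrix_rank(basis):
            basis = candidate
        if basis.shape[1] == n:
            break
    return basis

Bx = extend_to_basis(X, 3)              # basis extending the x_i
By = extend_to_basis(Y, 3)              # basis extending the y_i

# T is the unique linear map sending the first basis onto the second.
T = By @ np.linalg.inv(Bx)
assert np.allclose(T @ X, Y)            # T x_i = y_i
assert abs(np.linalg.det(T)) > 1e-12    # T is invertible
```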
Problem: If a matrix is such that then there exists such that .
Proof: We may clearly assume that has infinitely many distinct elements. Thus, choosing some countable subset we may define (where is [as usual] the Kronecker delta symbol) and
We note then that the general term of is
and the general term of is
thus, the general term of is
from which the conclusion follows.
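Reading the missing formulas as the standard version of this exercise (a square matrix whose diagonal entries all vanish can be written as a commutator BC - CB) — and that reading is an assumption on my part — the construction in the proof can be checked numerically:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4

# A matrix whose diagonal entries vanish (the assumed hypothesis of the exercise).
A = rng.standard_normal((n, n))
np.fill_diagonal(A, 0.0)

# The construction from the proof: B diagonal with distinct entries b_1, ..., b_n,
# and C with off-diagonal entries a_ij / (b_i - b_j).
b = np.arange(1, n + 1, dtype=float)    # any n distinct scalars work
B = np.diag(b)

C = np.zeros_like(A)
for i in range(n):
    for j in range(n):
        if i != j:
            C[i, j] = A[i, j] / (b[i] - b[j])

# (BC - CB)_{ij} = (b_i - b_j) c_ij = a_ij for i != j, and 0 on the diagonal.
assert np.allclose(B @ C - C @ B, A)
```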
1. Halmos, Paul R. Finite-Dimensional Vector Spaces. New York: Springer-Verlag, 1974. Print.