Halmos Sections 37 and 38: Matrices and Matrices of Linear Transformations (Pt. II)
Point of post: This is a continuation of the previous post.
a) Prove that if $A$ and $B$ are linear transformations on a two-dimensional vector space $V$, then $(AB-BA)^2$ commutes with every linear transformation $C$ on $V$.
b) Is the conclusion of a) true for higher dimensions?
a) Maybe there is some clever way to do this, but right now I’m missing it.
Let’s first prove a lemma:
Lemma: Let $T$ be a linear transformation on a two-dimensional vector space $V$ with $\operatorname{tr}(T)=0$; then $T^2=\delta I$ for some scalar $\delta$.
A messy but quick calculation, writing the matrix of $T$ with respect to some ordered basis as $\begin{pmatrix}a & b\\ c & d\end{pmatrix}$, shows that
$T^2-(a+d)T+(ad-bc)I=0$
where $a+d=\operatorname{tr}(T)$ and $ad-bc=\det(T)$ are two constants. We see then that when $\operatorname{tr}(T)=0$,
$T^2=-\det(T)\,I=\delta I$
from where the conclusion follows.
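The "messy but quick" calculation is just the $2\times 2$ Cayley–Hamilton identity $T^2-(a+d)T+(ad-bc)I=0$, which can be spot-checked numerically; the sample matrix below is my own choice, not one from the original post.

```python
# Sketch: verify T^2 - (a+d)T + (ad-bc)I = 0 for a 2x2 matrix T.

def mat_mul(X, Y):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def mat_add(X, Y):
    return [[X[i][j] + Y[i][j] for j in range(2)] for i in range(2)]

def scalar_mul(c, X):
    return [[c * X[i][j] for j in range(2)] for i in range(2)]

def cayley_hamilton_2x2(T):
    """Return T^2 - (a+d)T + (ad-bc)I, which should be the zero matrix."""
    a, b = T[0]
    c, d = T[1]
    I = [[1, 0], [0, 1]]
    return mat_add(mat_add(mat_mul(T, T), scalar_mul(-(a + d), T)),
                   scalar_mul(a * d - b * c, I))

T = [[3, -1], [4, 2]]          # arbitrary sample matrix
print(cayley_hamilton_2x2(T))  # -> [[0, 0], [0, 0]]
```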
Remark: The only thing that makes this problem manageable is that the trace condition in the lemma holds regardless of what $A$ and $B$ are. Put more directly,
$\operatorname{tr}(AB-BA)=0$
for any $A$ and $B$.
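In fact, the trace identity behind this remark can be checked entrywise, writing $a_{ij}$ and $b_{ij}$ for the entries of the matrices of $A$ and $B$:

```latex
\operatorname{tr}(AB)
  = \sum_{i}\sum_{j} a_{ij}\,b_{ji}
  = \sum_{j}\sum_{i} b_{ji}\,a_{ij}
  = \operatorname{tr}(BA),
\qquad\text{so}\qquad
\operatorname{tr}(AB-BA)
  = \operatorname{tr}(AB)-\operatorname{tr}(BA) = 0 .
```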
Note then that by this lemma, for any $A$ and $B$ we have that
$(AB-BA)^2=\delta I$ for some scalar $\delta$.
Thus, given any fixed ordered basis $\mathcal{B}$ for $V$ we see that
$[(AB-BA)^2]_{\mathcal{B}}\,[C]_{\mathcal{B}}=\delta\,[C]_{\mathcal{B}}=[C]_{\mathcal{B}}\,[(AB-BA)^2]_{\mathcal{B}}$ for every linear transformation $C$.
From where it follows by a previous theorem that $(AB-BA)^2C=C(AB-BA)^2$.
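The full claim of part a) can also be spot-checked numerically: for $2\times 2$ matrices, $(AB-BA)^2$ commutes with every $C$. The random integer matrices below are my own test data.

```python
# Sketch: check that (AB - BA)^2 commutes with C for random 2x2 matrices.
import random

def mul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def sub(X, Y):
    n = len(X)
    return [[X[i][j] - Y[i][j] for j in range(n)] for i in range(n)]

def commutes_after_squaring(A, B, C):
    comm = sub(mul(A, B), mul(B, A))   # AB - BA
    S = mul(comm, comm)                # (AB - BA)^2
    return mul(S, C) == mul(C, S)

random.seed(0)
rand2 = lambda: [[random.randint(-5, 5) for _ in range(2)] for _ in range(2)]
print(all(commutes_after_squaring(rand2(), rand2(), rand2())
          for _ in range(100)))        # -> True
```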
b) The answer is no. Recall from a previous theorem that the center of the algebra of linear transformations on $V$ is the set of all multiples of the identity $I$. In particular, we see that if $(AB-BA)^2$ were to satisfy the desired conclusion for all $A$ and $B$, then $(AB-BA)^2$ would be central (in the center). But this would clearly imply that its matrix representation, under some arbitrary but fixed ordered basis $\mathcal{B}$, would have the property that $\left([A]_{\mathcal{B}}[B]_{\mathcal{B}}-[B]_{\mathcal{B}}[A]_{\mathcal{B}}\right)^2$ is central, and thus a multiple of the identity matrix. But taking, for instance, $A$ and $B$ with matrices $E_{1,2}$ and $E_{2,1}$, so that
$\left([A]_{\mathcal{B}}[B]_{\mathcal{B}}-[B]_{\mathcal{B}}[A]_{\mathcal{B}}\right)^2=(E_{1,1}-E_{2,2})^2=E_{1,1}+E_{2,2}$,
can easily be shown to dissatisfy this condition if $n\geqslant 3$. Thus, the answer is no.
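A counterexample of this kind can be checked directly in dimension $3$; the matrices $A=E_{1,2}$ and $B=E_{2,1}$ below are my own illustration, not necessarily the ones from the original post.

```python
# Sketch: with A = E_{12}, B = E_{21} in dimension 3,
# (AB - BA)^2 = E_{11} + E_{22}, which is not a multiple of I_3
# and fails to commute with E_{13}.

def mul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def sub(X, Y):
    n = len(X)
    return [[X[i][j] - Y[i][j] for j in range(n)] for i in range(n)]

A = [[0, 1, 0], [0, 0, 0], [0, 0, 0]]   # E_{12}
B = [[0, 0, 0], [1, 0, 0], [0, 0, 0]]   # E_{21}
comm = sub(mul(A, B), mul(B, A))        # AB - BA = E_{11} - E_{22}
S = mul(comm, comm)                     # (AB - BA)^2 = E_{11} + E_{22}
print(S)                                # -> [[1, 0, 0], [0, 1, 0], [0, 0, 0]]

C = [[0, 0, 1], [0, 0, 0], [0, 0, 0]]   # E_{13}
print(mul(S, C) == mul(C, S))           # -> False
```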
Problem: Let $D:\mathcal{P}_n\to\mathcal{P}_n$ be defined by $D(p)=p'$. Prove that if a linear transformation $T$ commutes with $D$, then there exists a polynomial $q$ such that $T=q(D)$.
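One way to see why this holds: $t^n$ is a cyclic vector for $D$, so the vectors $t^n, D(t^n), \ldots, D^n(t^n)$ form a basis, and expanding $T(t^n)$ in that basis yields the coefficients of a polynomial $q$ with $T=q(D)$. Below is a sketch of this recovery in $\mathcal{P}_3$; the sample commuting $T$ (built as $I+2D+D^2$) is my own illustration.

```python
# Sketch: recover q with T = q(D) on P_3 (degree <= 3), basis 1, t, t^2, t^3.
from fractions import Fraction

def mul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def apply(X, v):
    return [sum(X[i][j] * v[j] for j in range(len(v))) for i in range(len(v))]

D = [[0, 1, 0, 0], [0, 0, 2, 0], [0, 0, 0, 3], [0, 0, 0, 0]]  # differentiation

# A sample T that commutes with D (here T = I + 2D + D^2, written out).
T = [[1, 2, 2, 0], [0, 1, 4, 6], [0, 0, 1, 6], [0, 0, 0, 1]]
assert mul(T, D) == mul(D, T)

# Expand T(t^3) in the basis D^0(t^3), D^1(t^3), D^2(t^3), D^3(t^3).
t3 = [0, 0, 0, 1]
powers = [t3]
for _ in range(3):
    powers.append(apply(D, powers[-1]))
w = apply(T, t3)
# Each powers[k] has its single nonzero entry at index 3 - k, so the
# coefficients c_k of q are read off directly.
c = [Fraction(w[3 - k], powers[k][3 - k]) for k in range(4)]
print([int(x) for x in c])  # -> [1, 2, 1, 0], i.e. q(x) = 1 + 2x + x^2

# Verify T = c0*I + c1*D + c2*D^2 + c3*D^3 as matrices.
Q = [[Fraction(0)] * 4 for _ in range(4)]
P = [[Fraction(int(i == j)) for j in range(4)] for i in range(4)]  # D^0 = I
for k in range(4):
    for i in range(4):
        for j in range(4):
            Q[i][j] += c[k] * P[i][j]
    P = mul(D, P)
print(Q == T)  # -> True
```

Since $T$ commutes with $D$, agreement on $t^n$ propagates to the whole basis: $T D^k(t^n) = D^k T(t^n) = q(D) D^k(t^n)$.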
Problem: For which of the following polynomials $p$ and matrices $A$ is it true that $p(A)=0$?
I’m not sure exactly how to ‘show’ the calculations, so I’ll just state the results and leave the rest to the reader.
a) This does satisfy $p(A)=0$.
b) This does satisfy $p(A)=0$.
c) This does not, since a direct computation shows $p(A)\ne 0$.
d) This does satisfy $p(A)=0$.
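Checks of this kind are mechanical. A minimal sketch with a hypothetical pair of my own (not the problem's actual data): $p(x)=x^2-5x-2$ is the characteristic polynomial of $A=\begin{pmatrix}1 & 2\\ 3 & 4\end{pmatrix}$, so Cayley–Hamilton forces $p(A)=0$.

```python
# Sketch: evaluate p(A) for p(x) = x^2 - 5x - 2 and A = [[1, 2], [3, 4]].
# p is the characteristic polynomial of A, so p(A) should be the zero matrix.

def mul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def poly_apply(coeffs, A):
    """Evaluate p(A), with p given by coeffs [c0, c1, ...] = c0 + c1 x + ..."""
    n = len(A)
    result = [[0] * n for _ in range(n)]
    power = [[int(i == j) for j in range(n)] for i in range(n)]  # A^0 = I
    for c in coeffs:
        for i in range(n):
            for j in range(n):
                result[i][j] += c * power[i][j]
        power = mul(power, A)
    return result

A = [[1, 2], [3, 4]]
print(poly_apply([-2, -5, 1], A))  # -> [[0, 0], [0, 0]]
```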