Abstract Nonsense

Crushing one theorem at a time

Halmos Sections 37 and 38: Matrices and Matrices of Linear Transformations (Pt. II)


Point of post: This is a continuation of this post.

7.

Problem:

a) Prove that if T_1,T_2 and T_3 are linear transformations on a two-dimensional vector space \mathscr{V}, then \left(T_1T_2-T_2T_1\right)^2 commutes with T_3.

b) Is the conclusion of a) true for higher dimensions?

Proof:

a) Maybe there is some clever way to do this, but right now I’m missing it.

Let’s first prove a lemma:

Lemma: Let A,B\in\text{Mat}_2\left(F\right). Then \left(AB-BA\right)^2=\lambda I_2 for some \lambda\in F.

Proof: Let

A=\begin{pmatrix}a & b\\ c & d\end{pmatrix}

and

B=\begin{pmatrix}e & f\\ g & h\end{pmatrix}

A messy but quick calculation shows that

AB-BA=\begin{pmatrix}bg-cf & (a-d)f-(e-h)b\\ (e-h)c-(a-d)g & cf-bg\end{pmatrix}

In particular, AB-BA has trace zero, so it is of the form \begin{pmatrix}x & y\\ z & -x\end{pmatrix}. Squaring such a matrix gives

\left(AB-BA\right)^2=\begin{pmatrix}x & y\\ z & -x\end{pmatrix}^2=\left(x^2+yz\right)I_2

from where the conclusion follows. \blacksquare

Remark: The only thing that makes this problem manageable is that a trace-zero 2\times 2 matrix always squares to a scalar multiple of the identity. Put more directly,

\begin{pmatrix}x & y\\ z & -x\end{pmatrix}^2=\left(x^2+yz\right)I_2

for any x,y,z\in F.
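As an aside, the lemma is easy to double-check with a computer algebra system. The following minimal sympy sketch squares the commutator of two generic 2\times 2 matrices and confirms that the off-diagonal entries of the square vanish and the diagonal entries agree, i.e. that the square is a scalar multiple of I_2.

```python
# Symbolic sanity check of the lemma: for generic 2x2 matrices A and B,
# (AB - BA)^2 is a scalar multiple of the identity.
import sympy as sp

a, b, c, d, e, f, g, h = sp.symbols('a b c d e f g h')
A = sp.Matrix([[a, b], [c, d]])
B = sp.Matrix([[e, f], [g, h]])

M = A * B - B * A                                    # the commutator; note its trace is zero
S = sp.expand(M * M)                                 # its square

print(sp.simplify(M.trace()))                        # 0
print(sp.simplify(S[0, 1]), sp.simplify(S[1, 0]))    # 0 0
print(sp.simplify(S[0, 0] - S[1, 1]))                # 0, so S = S[0, 0] * I_2
```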

Note then that by this lemma, for any A,B,C\in\text{Mat}_2\left(F\right) we have that

\left(AB-BA\right)^2C=\lambda C=C\lambda=C\left(AB-BA\right)^2

Thus, given any fixed ordered basis \mathcal{B} for \mathscr{V} we see that

\begin{aligned}\left[\left(T_1T_2-T_2T_1\right)^2\right]_{\mathcal{B}}\left[T_3\right]_{\mathcal{B}} &= \left(\left[T_1\right]_{\mathcal{B}}\left[T_2\right]_{\mathcal{B}}-\left[T_2\right]_{\mathcal{B}}\left[T_1\right]_{\mathcal{B}}\right)^2\left[T_3\right]_{\mathcal{B}}\\ &= \left[T_3\right]_{\mathcal{B}}\left(\left[T_1\right]_{\mathcal{B}}\left[T_2\right]_{\mathcal{B}}-\left[T_2\right]_{\mathcal{B}}\left[T_1\right]_{\mathcal{B}}\right)^2\\ &= \left[T_3\right]_{\mathcal{B}}\left[\left(T_1T_2-T_2T_1\right)^2\right]_{\mathcal{B}}\end{aligned}

From where it follows by a previous theorem that T_3\left(T_1T_2-T_2T_1\right)^2=\left(T_1T_2-T_2T_1\right)^2 T_3.

b) The answer is no. Recall from a previous theorem that the center of \text{Mat}_n\left(F\right) is the set of all multiples of I_n. In particular, if the conclusion of a) held, then \left(T_1T_2-T_2T_1\right)^2 would commute with every linear transformation T_3 and so would be central. This would in turn imply that the induced matrix representations, under some arbitrary but fixed ordered basis \mathcal{B}, have the property that \left(\left[T_1\right]_{\mathcal{B}}\left[T_2\right]_{\mathcal{B}}-\left[T_2\right]_{\mathcal{B}}\left[T_1\right]_{\mathcal{B}}\right)^2 is central, and thus a multiple of I_n. But taking T_1 and T_2 and an ordered basis \mathcal{B} such that

\left[T_1\right]_{\mathcal{B}}=\begin{pmatrix}1 & 1 & \cdots & 1\\ 1 & 1 & \cdots & 1\\ \vdots & \vdots & \ddots & \vdots\\ 1 & 1 & \cdots & 1\end{pmatrix}

and

\left[T_2\right]_{\mathcal{B}}=\begin{pmatrix}1 & 1 & \cdots & 1\\ 1 & 1 & \cdots & 1\\ \vdots & \vdots & \ddots & \vdots\\ 1 & 1 & \cdots & 0\end{pmatrix}

one can check directly that \left(\left[T_1\right]_{\mathcal{B}}\left[T_2\right]_{\mathcal{B}}-\left[T_2\right]_{\mathcal{B}}\left[T_1\right]_{\mathcal{B}}\right)^2 is not a multiple of I_n when n\geqslant 3. Thus, the answer is no.
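As a sanity check of the n=3 case, here is a minimal sympy sketch: it builds the two matrices displayed above, squares their commutator, and confirms the result is not a scalar multiple of I_3.

```python
# Check of the n = 3 counterexample above: (AB - BA)^2 is not a
# scalar multiple of the identity.
import sympy as sp

n = 3
A = sp.ones(n, n)                # [T_1]_B: the all-ones matrix
B = sp.ones(n, n)
B[n - 1, n - 1] = 0              # [T_2]_B: all ones except a 0 in the corner

S = (A * B - B * A) ** 2
print(S)                         # Matrix([[-1, -1, 0], [-1, -1, 0], [0, 0, -2]])
print(S == S[0, 0] * sp.eye(n))  # False, so S is not a multiple of I_3
```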

8.

Problem: Let A\in\text{End}\left(\mathbb{C}^2\right) be defined by A\left((\zeta_1,\zeta_2)\right)=(\zeta_1+\zeta_2,\zeta_2). Prove that if a linear transformation B commutes with A, then there exists a polynomial p such that B=p(A).
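For concreteness (this is only a computation, not a proof), here is a small sympy sketch illustrating the claim for this particular A: it solves AB=BA for a generic 2\times 2 matrix B and checks that every solution equals (w-x)I+xA, i.e. p(A) for the degree-one polynomial p(t)=(w-x)+xt, where w,x,y,z are just labels for the entries of B.

```python
# Illustration of Problem 8: every B commuting with A = [[1, 1], [0, 1]]
# is a polynomial in A.
import sympy as sp

w, x, y, z = sp.symbols('w x y z')
A = sp.Matrix([[1, 1], [0, 1]])   # matrix of A in the standard basis of C^2
B = sp.Matrix([[w, x], [y, z]])   # a generic 2x2 matrix

# Impose AB = BA and solve for the entries of B.
sol = sp.solve(list(A * B - B * A), [w, x, y, z], dict=True)[0]
print(sol)                        # {y: 0, z: w}

Bc = B.subs(sol)                  # the general commuting matrix [[w, x], [0, w]]
pA = (w - x) * sp.eye(2) + x * A  # p(A) with p(t) = (w - x) + x*t
print(sp.simplify(Bc - pA))       # the zero matrix, so B = p(A)
```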

9.

Problem: For which of the following polynomials p and matrices A is it true that p(A)=[0]?

a) p(t)=t^3-3t^2+3t-1, A=\begin{pmatrix}1 & 1 & 1\\ 0 & 1 & 1\\ 0 & 0 & 1\end{pmatrix}

b) p(t)=t^2-3t, A=\begin{pmatrix}1 & 1 & 1\\ 1 & 1 & 1\\ 1 & 1 & 1\end{pmatrix}

c) p(t)=t^3+t^2+t+1, A=\begin{pmatrix}1 & 1 & 0\\ 1 & 1 & 1\\ 0 & 1 & 1\end{pmatrix}

d) p(t)=t^3-2t, A=\begin{pmatrix}0 & 1 & 0\\ 1 & 0 & 1\\ 0 & 1 & 0\end{pmatrix}

Proof:

I’m not sure exactly how to ‘show’ the calculations, so I’ll just state the results and leave the details to the reader; a quick symbolic check of all four answers is sketched after d).

a) This does satisfy p(A)=[0].

b) This does satisfy p(A)=[0].

c) This does not, since p(A)=4\begin{pmatrix}2 & 2 & 1\\ 2 & 3 & 2\\ 1 & 2 & 2\end{pmatrix}

d) This does satisfy p(A)=[0].
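Here is the promised symbolic check: a short sympy snippet that evaluates each polynomial at its matrix (poly_at_matrix is just Horner’s rule applied to matrices, my own helper rather than anything from Halmos) and confirms the four answers above.

```python
# Verify the four claims: p(A) = [0] in cases (a), (b), (d) but not (c).
import sympy as sp

t = sp.symbols('t')
cases = {
    'a': (t**3 - 3*t**2 + 3*t - 1, sp.Matrix([[1, 1, 1], [0, 1, 1], [0, 0, 1]])),
    'b': (t**2 - 3*t,              sp.Matrix([[1, 1, 1], [1, 1, 1], [1, 1, 1]])),
    'c': (t**3 + t**2 + t + 1,     sp.Matrix([[1, 1, 0], [1, 1, 1], [0, 1, 1]])),
    'd': (t**3 - 2*t,              sp.Matrix([[0, 1, 0], [1, 0, 1], [0, 1, 0]])),
}

def poly_at_matrix(p, A):
    """Evaluate p(t) at the square matrix A (the constant term becomes c * I)."""
    result = sp.zeros(*A.shape)
    for coeff in sp.Poly(p, t).all_coeffs():       # leading coefficient first (Horner)
        result = result * A + coeff * sp.eye(A.shape[0])
    return result

for label, (p, A) in cases.items():
    print(label, poly_at_matrix(p, A) == sp.zeros(*A.shape))
    # prints: a True, b True, c False, d True
```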


December 19, 2010 | Fun Problems, Halmos, Linear Algebra
