Abstract Nonsense

Crushing one theorem at a time

Halmos Sections 34 and 35: Products and Polynomials


Point of post: In this post we work through the problems at the end of Sections 34 and 35 of Halmos’s book.

1.

Problem: Define

\displaystyle S:\mathbb{C}[x]\to\mathbb{C}[x]:\sum_{j=0}^{n}a_j x^j\mapsto \sum_{j=0}^{n}\frac{a_jx^{j+1}}{j+1}

and

\displaystyle D:\mathbb{C}[x]\to\mathbb{C}[x]:\sum_{j=0}^{n}a_j x^j\mapsto \sum_{j=1}^{n}ja_j x^{j-1}

Compute \left(S^nD^n\right)(p(x)) and \left(D^nS^n\right)(p(x)) for an arbitrary p\in\mathbb{C}[x].

Proof: Note that

\displaystyle \left(SD\right)\left(\sum_{j=0}^{m}a_jx^j\right)=S\left(\sum_{j=1}^{m}j a_j x^{j-1}\right)=\sum_{j=1}^{m}a_j x^j

and

\displaystyle \left(DS\right)\left(\sum_{j=0}^{m}a_j x^j\right)=D\left(\sum_{j=0}^{m}\frac{a_j x^{j+1}}{j+1}\right)=\sum_{j=0}^{m}a_j x^j

and so, since D^n\left(x^j\right)=\frac{j!}{(j-n)!}x^{j-n} for j\geq n (and D^n\left(x^j\right)=0 for j<n) while S^n\left(x^{j-n}\right)=\frac{(j-n)!}{j!}x^j, in general

\displaystyle \left(S^n D^n\right)\left(\sum_{j=0}^{m}a_j x^j\right)=\sum_{j=n}^{m}a_j x^j

where the sum is understood to be the zero polynomial if n>m. Similarly, since DS is the identity map, induction gives

\displaystyle \left(D^n S^n\right)\left(\sum_{j=0}^{m}a_j x^j\right)=\sum_{j=0}^{m}a_j x^j
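This is easy to sanity-check with a computer algebra system. Below is a minimal sympy sketch (the helpers D, S, and power are my own names, not notation from Halmos); note that sympy’s antiderivative has zero constant term, which is exactly the map S.

```python
import sympy as sp

x = sp.symbols('x')

def D(p):
    """The differentiation operator on C[x]."""
    return sp.diff(p, x)

def S(p):
    """Term-by-term integration: a_j x^j -> a_j x^(j+1)/(j+1).
    sympy's antiderivative has zero constant term, which is exactly S."""
    return sp.integrate(p, x)

def power(op, n, p):
    """Apply the operator op to p a total of n times."""
    for _ in range(n):
        p = op(p)
    return p

p, n = 7 + 5*x + 3*x**2 + 2*x**4, 2
print(sp.expand(power(S, n, power(D, n, p))))  # 2*x**4 + 3*x**2: only terms with j >= n survive
print(sp.expand(power(D, n, power(S, n, p))))  # the original p: D^n S^n is the identity
```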

 

2.

Problem: If A and B are linear transformations such that AB-BA commutes with A, then

A^kB-BA^k=kA^{k-1}\left(AB-BA\right)

for every k\in\mathbb{N}.

Proof: We first note that, since AB-BA commutes with A, an easy induction gives

A^k\left(AB-BA\right)=\left(AB-BA\right)A^k

which, upon expansion, gives

A^{k+1}B-A^kBA=ABA^k-BA^{k+1}

and, rearranging terms, for our particular purposes

A^{k+1}B-ABA^k=A^kBA-BA^{k+1}

 

Thus, with this in hand, we may proceed by induction. Note that A^kB-BA^k=kA^{k-1}\left(AB-BA\right) holds trivially for k=1, and if we assume it holds for k, then we see that

\begin{aligned}(k+1)A^k\left(AB-BA\right) &= kA^k\left(AB-BA\right)+A^k\left(AB-BA\right)\\ &= A kA^{k-1}\left(AB-BA\right)+A^k\left(AB-BA\right)\\ &= A\left(A^kB-BA^k\right)+A^k\left(AB-BA\right)\\ &= A^{k+1}B-ABA^k+A^{k+1}B-A^kBA\\ &=A^kBA-BA^{k+1}+A^{k+1}B-A^kBA\\ &= A^{k+1}B-BA^{k+1}\end{aligned}

from which the conclusion follows.
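As a quick numerical illustration (my own construction, not part of the problem), here is a numpy check on one concrete pair satisfying the hypothesis: taking A to be the lower shift matrix and B a diagonal matrix whose entries decrease by 1 gives AB-BA=A, which certainly commutes with A.

```python
import numpy as np

n = 5
A = np.diag(np.ones(n - 1), k=-1)             # lower shift matrix: A e_j = e_{j+1}
B = np.diag([-float(j) for j in range(n)])    # diag(0, -1, -2, ...)

C = A @ B - B @ A                             # the commutator AB - BA; here C = A
assert np.allclose(A @ C, C @ A)              # so C commutes with A, as required

for k in range(1, n + 1):
    lhs = np.linalg.matrix_power(A, k) @ B - B @ np.linalg.matrix_power(A, k)
    rhs = k * np.linalg.matrix_power(A, k - 1) @ C
    assert np.allclose(lhs, rhs), f"identity fails at k = {k}"
print("A^k B - B A^k = k A^(k-1) (AB - BA) verified for k = 1, ...,", n)
```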

 

3.

Problem: Suppose that A:\mathbb{C}_n[x]\to\mathbb{C}_n[x] is given by \left(A(p)\right)(x)=p(x+1). Prove that

\displaystyle A=\sum_{k=0}^{n-1}\frac{D^k}{k!}

Proof: Recall that \mathbb{C}_n[x] has basis \{x^0,\cdots,x^{n-1}\}, and since a linear transformation is determined entirely by its values on a basis, it suffices to check that the equality holds on this particular basis. But,

\displaystyle \sum_{k=0}^{n-1}\frac{D^k\left(x^{\ell}\right)}{k!}=\sum_{k=0}^{\ell}\frac{\ell(\ell-1)\cdots(\ell-k+1)x^{\ell-k}}{k!}=\sum_{k=0}^{\ell}{\ell \choose k}x^{\ell-k}=\left(x+1\right)^{\ell}

where we’ve used that D^k\left(x^{\ell}\right)=0 for k>\ell to truncate the sum, and the Binomial Theorem in the last step.

Since \left(A\left(x^{\ell}\right)\right)(x)=(x+1)^{\ell} as well, the two transformations agree on a basis, and the problem follows.
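For a computational double-check (a sketch; taylor_shift is a hypothetical helper name of my own), one can verify with sympy that applying \sum_{k=0}^{n-1}D^k/k! to a polynomial of degree less than n really does return p(x+1):

```python
import sympy as sp

x = sp.symbols('x')

def taylor_shift(p, n):
    """Apply sum_{k=0}^{n-1} D^k / k! to a polynomial p of degree < n."""
    return sp.expand(sum(sp.diff(p, x, k) / sp.factorial(k) for k in range(n)))

n = 5
p = 1 - 4*x + 3*x**3 + 2*x**4                 # degree < n
print(taylor_shift(p, n))                     # matches the expansion of p(x+1)
print(sp.expand(p.subs(x, x + 1)))            # p(x+1), for comparison
```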

 

4.

Problem:

a) If A is an endomorphism on an n-dimensional F-space \mathscr{V}, then there exists a non-zero polynomial of degree less than or equal to n^2 which annihilates A.

Proof: This is a fairly crude estimate (since, as we shall see, there is a polynomial of degree n which annihilates A, namely the characteristic polynomial). Regardless, we merely note that if there did not exist a non-zero polynomial p with \deg p\leq n^2 which annihilates A, then for any \alpha_0,\cdots,\alpha_{n^2}\in F we’d have that

\alpha_0\mathbf{1}+\alpha_1 A+\cdots+\alpha_{n^2}A^{n^2}=\mathbf{0}\implies \alpha_0=\alpha_1=\cdots=\alpha_{n^2}=0

which would imply that \text{End}\left(\mathscr{V}\right) contains n^2+1 linearly independent vectors; this is impossible, since \dim\text{End}\left(\mathscr{V}\right)=n^2.
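The dimension count above can be mirrored in code: flattening \mathbf{1},A,\cdots,A^{n^2} into vectors of length n^2 gives n^2+1 vectors in an n^2-dimensional space, and any vector in the null space of the resulting matrix supplies the coefficients of an annihilating polynomial. A minimal numpy sketch (my own construction, not from the text):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3
A = rng.standard_normal((n, n))

# Columns of M are the flattened powers 1, A, A^2, ..., A^(n^2).
powers = [np.linalg.matrix_power(A, k) for k in range(n**2 + 1)]
M = np.stack([P.flatten() for P in powers], axis=1)   # shape (n^2, n^2 + 1)

# n^2 + 1 columns in an n^2-dimensional space must be linearly dependent;
# the last right-singular vector spans (part of) the null space of M.
_, _, Vt = np.linalg.svd(M)
alpha = Vt[-1]                                # coefficients alpha_0, ..., alpha_{n^2}

residual = sum(a * P for a, P in zip(alpha, powers))
print(np.max(np.abs(residual)))               # ~ 0 up to floating-point error
```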

 

5.

Problem: The product of linear transformations is defined only if they “match” in the following sense. Suppose that \mathscr{U}, \mathscr{V}, and \mathscr{W} are all F-spaces and suppose that A\in\text{Hom}\left(\mathscr{U},\mathscr{V}\right) and B\in\text{Hom}\left(\mathscr{V},\mathscr{W}\right). The product BA is defined to be the linear transformation from \mathscr{U} to \mathscr{W} given by B\circ A. Interpret and prove as many of the five axioms for an associative unital algebra as you can.

Proof: The first axiom can be thought of as follows. If A\in\text{Hom}\left(\mathscr{U},\mathscr{V}\right), B\in\text{Hom}\left(\mathscr{V},\mathscr{W}\right), and C\in\text{Hom}\left(\mathscr{W},\mathscr{X}\right), then C(BA)=(CB)A since function composition is associative. If B,C\in\text{Hom}\left(\mathscr{U},\mathscr{V}\right) and A\in\text{Hom}\left(\mathscr{V},\mathscr{W}\right), then we can say that A\left(B+C\right)=AB+AC since A(B(x)+C(x))=A(B(x))+A(C(x)) for every x\in\mathscr{U}. Similarly, the third axiom (distributivity on the other side) holds, and the fourth (compatibility with scalar multiplication) holds trivially. The fifth axiom, however, fails. Namely, for A\in\text{Hom}\left(\mathscr{U},\mathscr{V}\right) there are left and right identity elements, namely \text{id}_\mathscr{V} and \text{id}_{\mathscr{U}} respectively, but there is no single identity element unless \mathscr{U}=\mathscr{V}.
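Concretely, one can model these “matching” products with rectangular matrices, where BA corresponds to the matrix product. A small numpy sketch (dimensions chosen arbitrarily) illustrating associativity and the two one-sided identities:

```python
import numpy as np

rng = np.random.default_rng(1)
dim_U, dim_V, dim_W, dim_X = 2, 3, 4, 5
A = rng.standard_normal((dim_V, dim_U))       # A in Hom(U, V)
B = rng.standard_normal((dim_W, dim_V))       # B in Hom(V, W)
C = rng.standard_normal((dim_X, dim_W))       # C in Hom(W, X)

assert np.allclose(C @ (B @ A), (C @ B) @ A)  # first axiom: associativity
assert np.allclose(np.eye(dim_V) @ A, A)      # id_V is a left identity for A
assert np.allclose(A @ np.eye(dim_U), A)      # id_U is a right identity for A
# No single two-sided identity exists here: id_V and id_U are different sizes.
print("associativity and the one-sided identities check out")
```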

 

Reference:

1. Halmos, Paul R. Finite-Dimensional Vector Spaces. New York: Springer-Verlag, 1974. Print.

 


November 23, 2010 - Posted by | Fun Problems, Halmos, Linear Algebra

