Abstract Nonsense

Crushing one theorem at a time

Halmos Sections 37 and 38: Matrices and Matrices of Linear Transformations (Pt. I)


Point of post: In this post I will complete the problems listed at the end of sections 37 and 38 of Halmos.

Remark: For those who are interested only in the solutions to Halmos and haven’t read my side-along postings: for notation, see the series of posts for which this and this are the first posts.

1.

Problem: Let A\in\text{End}\left(\mathbb{C}_n[x]\right) be defined by \left(A(p)\right)(x)=p(x+1), and let \mathcal{B} be the canonical ordered basis (1,x,\cdots,x^{n-1}). Find \left[A\right]_{\mathcal{B}}.

Proof: We note that for each j\in\{0,1,\cdots,n-1\} we have that

\displaystyle A\left(x^j\right)=(x+1)^j=\sum_{i=0}^{j}{j\choose i} x^{i}=\sum_{i=0}^{n-1}{j\choose i}x^i

where we’ve appealed to the convention that \displaystyle {k\choose \ell}=0 if \ell>k. Thus

\displaystyle \left[A\right]_{\mathcal{B}}=\begin{pmatrix}{0 \choose 0} & {1\choose 0} & {2 \choose 0} & \cdots & {n-1\choose 0}\\ {0 \choose 1} & {1 \choose 1} & {2\choose 1} & \cdots & {n-1\choose 1}\\ \vdots & \vdots & \vdots & \ddots & \vdots\\ {0 \choose n-2} & {1 \choose n-2} & {2 \choose n-2} & \cdots & {n-1 \choose n-2}\\ {0 \choose n-1} & {1\choose n-1} & {2 \choose n-1} & \cdots & {n-1\choose n-1}\end{pmatrix}
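Since the entry pattern here is easy to get off by one, a small numerical sanity check may be welcome (a Python sketch, not part of Halmos; the helper names are mine). It builds the matrix with (i,j) entry \binom{j}{i} for n=4 and compares applying it to a coefficient vector against expanding p(x+1) directly:

```python
from math import comb

def shift_matrix(n):
    """Matrix of A(p)(x) = p(x+1) in the basis (1, x, ..., x^{n-1}).

    Column j holds the coefficients of (x+1)^j, so entry (i, j) is C(j, i).
    """
    return [[comb(j, i) for j in range(n)] for i in range(n)]

def apply_shift(coeffs):
    """Coefficient vector of p(x+1), computed directly for comparison."""
    n = len(coeffs)
    out = [0] * n
    for j, c in enumerate(coeffs):      # c*(x+1)^j = c * sum_i C(j,i) x^i
        for i in range(j + 1):
            out[i] += c * comb(j, i)
    return out

n = 4
M = shift_matrix(n)
p = [2, 0, 1, 3]                        # p(x) = 2 + x^2 + 3x^3
via_matrix = [sum(M[i][j] * p[j] for j in range(n)) for i in range(n)]
print(M)
print(via_matrix, apply_shift(p))       # both give p(x+1) = 6 + 11x + 10x^2 + 3x^3
```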

 

2.

Problem: Find the matrix of the operation of conjugation on \mathbb{C}, considered as a real vector space, with respect to the canonical ordered basis \mathcal{B}=(1,i).

Proof: Let T be the conjugation operator. We see then that

T(1)=1+0i

and

T(i)=0\cdot 1+(-1)i

so that

\left[T\right]_{\mathcal{B}}=\begin{pmatrix}1 & 0\\ 0 & -1\end{pmatrix}
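A two-line check of this matrix (a Python sketch, not part of Halmos; the function name is mine):

```python
def conj_matrix_apply(a, b):
    """Apply [T]_B = [[1, 0], [0, -1]] to the coordinates (a, b) of a + bi."""
    M = [[1, 0], [0, -1]]
    return (M[0][0] * a + M[0][1] * b, M[1][0] * a + M[1][1] * b)

z = 3 + 4j
a2, b2 = conj_matrix_apply(z.real, z.imag)
print(complex(a2, b2), z.conjugate())   # the matrix reproduces conjugation
```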

 

3.

Problem: Let \pi\in S_n and compute \left[A\right]_{\mathcal{B}} where \mathcal{B}=(e_1,\cdots,e_n) (where \{e_1,\cdots,e_n\} is the canonical basis for \mathbb{C}^n) and A((\zeta_1,\cdots,\zeta_n))=(\zeta_{\pi(1)},\cdots,\zeta_{\pi(n)}).

Proof: For each k\in[n] the i^{\text{th}} coordinate of A(e_k) is \left(e_k\right)_{\pi(i)}, which equals 1 precisely when \pi(i)=k, i.e. when i=\pi^{-1}(k). Thus A(e_k)=e_{\pi^{-1}(k)}, and so

\left[A\right]_{\mathcal{B}}=\left(\begin{array}{c|c|c} & & \\ e_{\pi^{-1}(1)} & \cdots & e_{\pi^{-1}(n)}\\ & & \end{array}\right)
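The matrix can be checked numerically. The sketch below (Python, 0-indexed; helper names are mine) builds the matrix directly from the definition — entry (i,k) is 1 exactly when \pi(i)=k, so column k is e_{\pi^{-1}(k)} — and compares it with the operator:

```python
def perm_operator(pi, v):
    """A(v)_i = v_{pi(i)}, with pi given 0-indexed as a list."""
    return [v[pi[i]] for i in range(len(v))]

def perm_matrix(pi):
    """Matrix of A: M[i][k] = 1 iff pi(i) = k, i.e. column k is e_{pi^{-1}(k)}."""
    n = len(pi)
    return [[1 if pi[i] == k else 0 for k in range(n)] for i in range(n)]

pi = [2, 0, 1]                          # pi(0)=2, pi(1)=0, pi(2)=1
v = [10, 20, 30]
M = perm_matrix(pi)
via_matrix = [sum(M[i][k] * v[k] for k in range(3)) for i in range(3)]
print(via_matrix, perm_operator(pi, v))  # matrix and operator agree
```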

4.

Problem: Let P=\begin{pmatrix}1 & 1\\ 1 & 1\end{pmatrix} and consider T\in\text{End}\left(\text{Mat}_2(\mathbb{R})\right) given by T(X)=PX. Find \left[T\right]_{\mathcal{B}} where

\mathcal{B}=\left(\begin{pmatrix}1 & 0\\ 0 & 0\end{pmatrix},\begin{pmatrix}0 & 1\\ 0 & 0\end{pmatrix},\begin{pmatrix}0 & 0\\ 1 & 0\end{pmatrix},\begin{pmatrix}0 & 0\\ 0 & 1\end{pmatrix}\right)

Proof: We note that

T\left(\begin{pmatrix}1 &0\\ 0 &0\end{pmatrix}\right)=\begin{pmatrix}1 & 0\\1 & 0\end{pmatrix}

T\left(\begin{pmatrix}0 &1\\ 0 & 0\end{pmatrix}\right)=\begin{pmatrix}0 &1\\ 0 & 1\end{pmatrix}

T\left(\begin{pmatrix}0 & 0\\ 1 & 0\end{pmatrix}\right)=\begin{pmatrix}1 & 0\\ 1 & 0\end{pmatrix}

and

T\left(\begin{pmatrix}0 & 0\\ 0 & 1\end{pmatrix}\right)=\begin{pmatrix}0 & 1\\ 0 & 1\end{pmatrix}

So that

\left[T\right]_{\mathcal{B}}=\begin{pmatrix}1 & 0 & 1 & 0\\ 0 & 1 & 0 & 1\\ 1 & 0 & 1 & 0\\ 0 & 1 & 0 & 1\end{pmatrix}
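One way to verify the 4\times 4 matrix is to compute each column as the coordinate vector of PE_j for the j^{\text{th}} basis matrix E_j (a Python sketch; helper names are mine):

```python
def matmul(A, B):
    """Multiply two matrices given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def vec(X):
    """Coordinates of a 2x2 matrix X in the basis (E11, E12, E21, E22)."""
    return [X[0][0], X[0][1], X[1][0], X[1][1]]

P = [[1, 1], [1, 1]]
basis = [[[1, 0], [0, 0]], [[0, 1], [0, 0]],
         [[0, 0], [1, 0]], [[0, 0], [0, 1]]]
# Column j of [T]_B is vec(P * E_j).
cols = [vec(matmul(P, E)) for E in basis]
T = [[cols[j][i] for j in range(4)] for i in range(4)]
print(T)
```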

 

5.

Problem: Let \mathscr{V} be an n-dimensional F-space and P\in\text{End}\left(\mathscr{V}\right). Let then

T:\text{End}\left(\mathscr{V}\right)\to\text{End}\left(\mathscr{V}\right):X\mapsto PX

Under what conditions is T invertible?

Proof: The claim is that T is an isomorphism if and only if P is an isomorphism. To see that this condition is sufficient we note that if P is an isomorphism and T(X)=T(Y) then PX=PY, and thus since P is an isomorphism we may conclude that X=Y. Thus, T is a monomorphism, and by a prior theorem (an injective endomorphism of a finite-dimensional space is an isomorphism) we may then conclude that T is an isomorphism.

To see that P\in\text{GL}\left(\mathscr{V}\right) is a necessary condition we note that if T is an isomorphism then T is, in particular, an epimorphism. Thus, there exists some X_0\in\text{End}\left(\mathscr{V}\right) such that \mathbf{1}=T(X_0)=PX_0. But, the existence of a right inverse for P implies that P is an epimorphism, and thus since \mathscr{V} is finite dimensional we may conclude that P is an isomorphism.
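For a concrete illustration of the contrapositive of the necessity direction (a Python sketch, not from Halmos): if P is singular then T(X)=PX sends some nonzero X to zero, so T is not injective.

```python
def matmul(A, B):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(2))
             for j in range(2)] for i in range(2)]

P = [[1, 0], [0, 0]]                    # rank 1, not invertible
X = [[0, 0], [1, 0]]                    # nonzero, yet PX = 0 = P*0
print(matmul(P, X))                     # T(X) = T(0), so T is not injective
```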

6.

Problem: Let I,J and K be the complex matrices

\begin{pmatrix}0 & 1\\ -1 & 0\end{pmatrix},\quad\begin{pmatrix}0 & i\\ i & 0\end{pmatrix},\quad\begin{pmatrix}i & 0\\ 0 & -i\end{pmatrix}

respectively. Show that I^2=J^2=K^2=-I_2, IJ=-JI=K, JK=-KJ=I, and KI=-IK=J.

Proof: This is purely computational, and not even cleverly so. That said, I would like to remark that the set \{\pm I_2,\pm I,\pm J,\pm K\} is a group under matrix multiplication which is isomorphic to the quaternion group Q_8.
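For the skeptical, the computation can be delegated (a Python sketch; here J is the matrix with i in both off-diagonal entries, the form forced by J^2=-I_2 and IJ=K):

```python
def matmul(A, B):
    """Multiply two 2x2 complex matrices given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(2))
             for j in range(2)] for i in range(2)]

I2    = [[1, 0], [0, 1]]
negI2 = [[-1, 0], [0, -1]]
I = [[0, 1], [-1, 0]]
J = [[0, 1j], [1j, 0]]
K = [[1j, 0], [0, -1j]]

checks = {
    "I^2": matmul(I, I), "J^2": matmul(J, J), "K^2": matmul(K, K),
    "IJ": matmul(I, J), "JK": matmul(J, K), "KI": matmul(K, I),
}
print(checks)    # each square is -I2; IJ = K, JK = I, KI = J
```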

 


December 19, 2010 - Posted by | Fun Problems, Halmos, Linear Algebra, Uncategorized

1 Comment »

  1. […] Point of post: This is a continuation of this post. […]

    Pingback by Halmos Sections 37 and 38: Matrices and Matrices of Linear Transformations(Pt. II) « Abstract Nonsense | December 19, 2010 | Reply

