Abstract Nonsense

Crushing one theorem at a time

Halmos Sections 37 and 38: Matrices and Matrices of Linear Transformations (Pt. V)


Point of post: This is a continuation of the previous post in this series.

18.

Problem: For which values of \alpha are the following matrices invertible? Find the inverses whenever possible.

a) \begin{pmatrix}1 & \alpha & 0\\ \alpha & 1 & \alpha\\ 0 & \alpha & 1\end{pmatrix}

b) \begin{pmatrix}\alpha & 1 & 0\\ 1 & \alpha & 1\\ 0 & 1 & \alpha\end{pmatrix}

c) \begin{pmatrix}0 & 1 & \alpha\\ 1 & \alpha & 0\\ \alpha & 0 & 1\end{pmatrix}

d) \begin{pmatrix}1 & 1 & 1\\ 1 & 1 & \alpha\\ 1 & \alpha & 1\end{pmatrix}

Proof:

a) This is invertible precisely when \displaystyle \alpha\ne\pm\frac{\sqrt{2}}{2}. Indeed, if \displaystyle \alpha=\pm\frac{\sqrt{2}}{2} then \begin{pmatrix}1 & \alpha & 0\\ \alpha & 1 & \alpha\\ 0 & \alpha & 1\end{pmatrix}\begin{pmatrix}1\\ \mp\sqrt{2}\\ 1\end{pmatrix}=\begin{pmatrix}0\\ 0\\ 0\end{pmatrix}. And, if \displaystyle \alpha\ne\pm\frac{\sqrt{2}}{2} then \displaystyle \frac{1}{1-2\alpha^2}\begin{pmatrix}1-\alpha^2 & -\alpha & \alpha^2\\ -\alpha & 1 & -\alpha\\ \alpha^2 & -\alpha & 1-\alpha^2\end{pmatrix} serves as an inverse.
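As a quick sanity check (a sympy sketch, not part of the original solution), one can recompute the determinant and inverse of the matrix in a) symbolically:

```python
import sympy as sp

a = sp.symbols('alpha')
M = sp.Matrix([[1, a, 0], [a, 1, a], [0, a, 1]])

# Invertibility is governed by the determinant: det M = 1 - 2*alpha**2,
# which vanishes exactly at alpha = ±sqrt(2)/2.
det = sp.expand(M.det())

# sympy's symbolic inverse agrees with the closed form above
# wherever 1 - 2*alpha**2 != 0.
Minv = sp.simplify(M.inv())
```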

b) This is invertible precisely when \alpha\ne\pm\sqrt{2},0. Indeed, if \alpha=\pm\sqrt{2} then \begin{pmatrix}\alpha & 1 & 0\\ 1 & \alpha & 1\\ 0 & 1 & \alpha\end{pmatrix}\begin{pmatrix}1\\ \mp\sqrt{2}\\ 1\end{pmatrix}=\begin{pmatrix}0\\ 0\\ 0\end{pmatrix}. And, if \alpha=0 then \begin{pmatrix}\alpha & 1 & 0\\ 1 & \alpha & 1\\ 0 & 1 & \alpha\end{pmatrix}\begin{pmatrix}-1\\ 0\\ 1\end{pmatrix}=\begin{pmatrix}0\\ 0\\ 0\end{pmatrix}. If \alpha\ne\pm\sqrt{2},0 then \displaystyle \frac{1}{\alpha^2-2}\begin{pmatrix}\frac{\alpha^2-1}{\alpha} & -1 & \frac{1}{\alpha}\\ -1 & \alpha & -1\\ \frac{1}{\alpha} & -1 & \frac{\alpha^2-1}{\alpha}\end{pmatrix} serves as an inverse.

c) Assuming we’re discussing real matrices, this is invertible precisely when \alpha \ne -1. Indeed, if \alpha=-1 then \begin{pmatrix}0 & 1 & \alpha\\ 1 & \alpha & 0\\ \alpha & 0 & 1\end{pmatrix}\begin{pmatrix}1\\ 1\\ 1\end{pmatrix}=\begin{pmatrix}0\\ 0\\ 0\end{pmatrix}. If \alpha\ne -1 then \displaystyle \frac{1}{\alpha^3+1}\begin{pmatrix}-\alpha & 1 & \alpha^2\\ 1 & \alpha^2 & -\alpha\\ \alpha^2 & -\alpha & 1\end{pmatrix} serves as an inverse.

d) This is invertible precisely when \alpha\ne 1. Indeed, if \alpha=1 then \begin{pmatrix}1 & 1 & 1\\ 1 & 1 & \alpha\\ 1 & \alpha & 1\end{pmatrix}\begin{pmatrix}-1\\ 1\\ 0\end{pmatrix}=\begin{pmatrix}0\\ 0\\ 0\end{pmatrix}. Otherwise \begin{pmatrix}\frac{\alpha+1}{\alpha-1} & \frac{1}{1-\alpha} & \frac{1}{1-\alpha}\\ \frac{1}{1-\alpha} & 0 & \frac{1}{\alpha-1}\\ \frac{1}{1-\alpha} & \frac{1}{\alpha-1} & 0\end{pmatrix} serves as an inverse.
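The invertibility conditions in parts b) through d) can likewise be confirmed by computing determinants symbolically (again a sympy sketch, not part of the original solution):

```python
import sympy as sp

a = sp.symbols('alpha')

# The matrices from parts b), c), d); each is invertible exactly
# where its determinant is nonzero.
Mb = sp.Matrix([[a, 1, 0], [1, a, 1], [0, 1, a]])
Mc = sp.Matrix([[0, 1, a], [1, a, 0], [a, 0, 1]])
Md = sp.Matrix([[1, 1, 1], [1, 1, a], [1, a, 1]])

det_b = sp.factor(Mb.det())  # alpha*(alpha**2 - 2): zero at 0 and ±sqrt(2)
det_c = sp.factor(Mc.det())  # -(alpha**3 + 1): only real zero is alpha = -1
det_d = sp.factor(Md.det())  # -(alpha - 1)**2: zero only at alpha = 1
```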

19.

Problem:

a) It is easy to extend matrix theory to linear transformations between different vector spaces. Suppose that \mathscr{U} and \mathscr{V} are vector spaces over the same field F. Let \{x_1,\cdots,x_n\} and \{y_1,\cdots,y_m\} be bases for \mathscr{U} and \mathscr{V} respectively, and A\in\text{Hom}\left(\mathscr{U},\mathscr{V}\right). The matrix of A is, by definition, the rectangular m\times n array of scalars defined by

\displaystyle A(x_j)=\sum_{i=1}^{m}\alpha_{i,j}y_i

Define addition and multiplication of rectangular matrices so as to generalize as many as possible of the results of section 38.

b) Suppose that A and B are multipliable matrices. Partition A into four rectangular blocks (top left, top right, bottom left, bottom right) and then partition B similarly so that the number of columns in the top left part of A is the same as the number of rows in the top left part of B. If, in an obvious shorthand, these partitioned matrices are indicated by

A=\begin{pmatrix}A_{1,1} & A_{1,2}\\ A_{2,1} & A_{2,2}\end{pmatrix}

and

B=\begin{pmatrix}B_{1,1} & B_{1,2}\\ B_{2,1} & B_{2,2}\end{pmatrix}

then

AB=\begin{pmatrix}A_{1,1}B_{1,1}+A_{1,2}B_{2,1} & A_{1,1}B_{1,2}+A_{1,2}B_{2,2}\\ A_{2,1}B_{1,1}+A_{2,2}B_{2,1} & A_{2,1}B_{1,2}+A_{2,2}B_{2,2}\end{pmatrix}
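This identity is easy to sanity-check numerically; the following sketch (arbitrary sizes and split points, using numpy) verifies it for one random pair of matrices:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 7))
B = rng.standard_normal((7, 4))

# Partition A after row 2 / column 3, and B after row 3 / column 2,
# so the column split of A matches the row split of B.
A11, A12, A21, A22 = A[:2, :3], A[:2, 3:], A[2:, :3], A[2:, 3:]
B11, B12, B21, B22 = B[:3, :2], B[:3, 2:], B[3:, :2], B[3:, 2:]

# Assemble the block product and compare against ordinary multiplication.
top = np.hstack([A11 @ B11 + A12 @ B21, A11 @ B12 + A12 @ B22])
bottom = np.hstack([A21 @ B11 + A22 @ B21, A21 @ B12 + A22 @ B22])
blockproduct = np.vstack([top, bottom])

assert np.allclose(blockproduct, A @ B)
```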

c) Use subspaces and complements to express the result of b) in terms of linear transformations (instead of matrices).

Proof:

a) We may define addition of rectangular matrices exactly as before. For multiplication, we may only (in any natural sense) define the product of rectangular matrices A=[\alpha_{i,j}] and B=[\beta_{i,j}] when their sizes are m\times p and p\times n respectively. From there we define the product AB to be the m\times n matrix whose (i,j) entry is, unsurprisingly,

\displaystyle \sum_{r=1}^{p}\alpha_{i,r}\beta_{r,j}

One can check that with these definitions addition and multiplication satisfy the expected identities of section 38 (associativity, distributivity, and compatibility with scalar multiplication) whenever the relevant sums and products are defined.
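A direct transcription of this definition (a sketch; the name matmul is mine, and numpy is used only for array storage) looks like:

```python
import numpy as np

def matmul(A, B):
    """Entrywise definition: (AB)_{i,j} = sum_r A_{i,r} * B_{r,j},
    defined only when A is m x p and B is p x n."""
    m, p = A.shape
    p2, n = B.shape
    assert p == p2, "inner dimensions must agree"
    C = np.zeros((m, n))
    for i in range(m):
        for j in range(n):
            C[i, j] = sum(A[i, r] * B[r, j] for r in range(p))
    return C
```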

b) This is just a tedious computation; if anyone has a burning desire for me to write it out in full, let me know.

c) Suppose that \mathscr{V}=\text{span}\{x_1,\cdots,x_k\}\oplus\text{span}\{x_{k+1},\cdots,x_n\}. Then, for T,T'\in\text{End}\left(\mathscr{V}\right) we can compute [T'T]_{\mathcal{B}}, where \mathcal{B}=(x_1,\cdots,x_k,x_{k+1},\cdots,x_{n}), by writing \displaystyle T(x_j)=\sum_{i=1}^{k}\alpha_{i,j}x_i+\sum_{i=k+1}^{n}\alpha_{i,j}x_i and \displaystyle T'(x_i)=\sum_{r=1}^{k}\beta_{r,i}x_r+\sum_{r=k+1}^{n}\beta_{r,i}x_r. We see then that

\displaystyle \begin{aligned}T'\left(T(x_j)\right) &= T'\left(\sum_{i=1}^{k}\alpha_{i,j}x_i+\sum_{i=k+1}^{n}\alpha_{i,j}x_i\right)\\ &= \sum_{i=1}^{k}\alpha_{i,j}\left(\sum_{r=1}^{k}\beta_{r,i}x_r+\sum_{r=k+1}^{n}\beta_{r,i}x_r\right)+\sum_{i=k+1}^{n}\alpha_{i,j}\left(\sum_{r=1}^{k}\beta_{r,i}x_r+\sum_{r=k+1}^{n}\beta_{r,i}x_r\right)\\ &=\sum_{r=1}^{k}\sum_{i=1}^{k}\beta_{r,i}\alpha_{i,j}x_r+\sum_{r=1}^{k}\sum_{i=k+1}^{n}\beta_{r,i}\alpha_{i,j}x_r+\sum_{r=k+1}^{n}\sum_{i=1}^{k}\beta_{r,i}\alpha_{i,j}x_r+\sum_{r=k+1}^{n}\sum_{i=k+1}^{n}\beta_{r,i}\alpha_{i,j}x_r\end{aligned}

from which, with the proper definitions of A_{i,j},B_{i,j},\text{ }i,j\in\{1,2\}, the result follows.
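Explicitly, one natural choice (the block labels below are mine, not Halmos's) is to let A_{1,1}=[\alpha_{i,j}]_{1\le i,j\le k}, A_{1,2}=[\alpha_{i,j}]_{1\le i\le k,\,k+1\le j\le n}, and so on, with the B_{i,j} built analogously from the \beta's. The last line of the computation above then says precisely that

```latex
[T'T]_{\mathcal{B}}
=\begin{pmatrix}B_{1,1} & B_{1,2}\\ B_{2,1} & B_{2,2}\end{pmatrix}
 \begin{pmatrix}A_{1,1} & A_{1,2}\\ A_{2,1} & A_{2,2}\end{pmatrix}
=\begin{pmatrix}B_{1,1}A_{1,1}+B_{1,2}A_{2,1} & B_{1,1}A_{1,2}+B_{1,2}A_{2,2}\\ B_{2,1}A_{1,1}+B_{2,2}A_{2,1} & B_{2,1}A_{1,2}+B_{2,2}A_{2,2}\end{pmatrix}
```

which is the matrix identity of part b), with the factors ordered as [T'][T].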

References:

1. Halmos, Paul R. Finite-Dimensional Vector Spaces. New York: Springer-Verlag, 1974. Print.


December 19, 2010 - Posted by | Fun Problems, Halmos, Linear Algebra
