# Abstract Nonsense

## Direct Sum of Linear Transformations and Direct Sum of Matrices (Pt. III)

Point of post: This is a literal continuation of this post. Treat the two posts as one contiguous object.

**Direct Sum of Matrices**

Now that we have defined the direct sum of linear transformations, we define the direct sum of matrices and show, as we would all hope is true, that the matrix representation of a direct sum of linear transformations is the direct sum of the matrix representations of the direct summands; we state this rigorously in what follows. We first define the direct sum of matrices mechanically, reverting once again to thinking of matrices as elements of the algebra of square matrices.

Let $F$ be a field and let $M_1=[{^1\alpha_{i_1,j_1}}],\cdots,M_r=[{^r\alpha_{i_r,j_r}}]$ (where the ‘left superscripts’ are meant to keep track of which matrix a particular $\alpha$ is an entry of) be matrices over $F$ of sizes $m_1\times n_1,\cdots,m_r\times n_r$ respectively. We then define the direct sum of $M_1,\cdots,M_r$ to be the $(m_1+\cdots+m_r)\times(n_1+\cdots+n_r)$ matrix

$\displaystyle \bigoplus_{k=1}^{r}M_k=\begin{pmatrix}M_1 & 0 & \cdots & 0\\ 0 & M_2 & \cdots & 0\\ \vdots & \vdots & \ddots & \vdots\\ 0 & \cdots & 0 & M_r\end{pmatrix}$
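For readers who like to compute, the definition is easy to realize mechanically. Below is a minimal sketch in Python with NumPy; the function name `direct_sum` and the sample matrices are our own, chosen for illustration (SciPy's `scipy.linalg.block_diag` performs the same construction).

```python
import numpy as np

def direct_sum(*matrices):
    """Place the given matrices along the diagonal of a zero matrix
    of size (sum of row counts) x (sum of column counts)."""
    rows = sum(M.shape[0] for M in matrices)
    cols = sum(M.shape[1] for M in matrices)
    out = np.zeros((rows, cols))
    r = c = 0
    for M in matrices:
        m, n = M.shape
        out[r:r + m, c:c + n] = M   # copy M into its diagonal block
        r += m
        c += n
    return out

# The summands need not be square: a 1x2 and a 2x1 give a 3x3 result.
A = np.array([[1.0, 2.0]])
B = np.array([[3.0], [4.0]])
S = direct_sum(A, B)
```

Note that the direct sum of non-square matrices is generally not obtained by just stacking blocks side by side: the off-diagonal zero blocks are what make the row and column counts add.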

We now show how the two concepts of direct sums of linear transformations and direct sums of matrices interact. Namely:

Theorem: Let $\mathscr{V}$, $\mathscr{W}_k,\mathscr{U}_k, \phi_k, T_k$ and $T$ be as in the definition of direct sum of linear transformations, except now assume that $\dim_F\mathscr{V}<\infty$. Also, let $\mathcal{B}_1=\{x_{1,1},\cdots,x_{1,n_1}\},\cdots,\mathcal{B}_m=\{x_{m,1},\cdots,x_{m,n_m}\}$ be bases for $\mathscr{U}_1,\cdots,\mathscr{U}_m$ and suppose that for each $r\in[m]$ and $s\in[n_r]$ we have that

$\displaystyle T_r(x_{r,s})=\sum_{j=1}^{n_r} {^r \alpha_{s,j}}\,x_{r,j}$

[where the left superscript $r$ records whether (say, for example) $\alpha_{1,5}$ is the coefficient of $x_{2,5}$ or of $x_{3,5}$. In essence, it is another index that keeps straight which of the spaces $\mathscr{U}_1,\cdots,\mathscr{U}_m$ we are working in]. Then,

$\displaystyle \mathcal{B}=(\phi_1(x_{1,1}),\cdots,\phi_1(x_{1,n_1}),\cdots,\phi_m(x_{m,1}),\cdots,\phi_m(x_{m,n_m}))$

is an ordered basis for $\mathscr{V}$ and

$\displaystyle \left[T\right]_{\mathcal{B}}=\bigoplus_{k=1}^{m}\left[T_k\right]_{\mathcal{B}_k}\quad\quad\mathbf{(*)}$

Proof: The fact that $\mathcal{B}$ is an ordered basis for $\mathscr{V}$ is clear, since $\{\phi_k(x_{k,1}),\cdots,\phi_k(x_{k,n_k})\}$ is a basis for $\mathscr{W}_k$ for each $k\in[m]$ ($\phi_k$ being an isomorphism) and $\mathscr{V}$ is the direct sum of the $\mathscr{W}_k$. Now, to prove that $\mathbf{(*)}$ is true we begin by noticing that for each $r\in[m]$ and $s\in[n_r]$ we have $\phi_r(x_{r,s})\in\mathscr{W}_r$, and so by definition

$\displaystyle T\left(\phi_r\left(x_{r,s}\right)\right)=\phi_r\left(T_r\left(\phi^{-1}_r\left(\phi_r\left(x_{r,s}\right)\right)\right)\right)=\phi_r\left(T_r\left(x_{r,s}\right)\right)=\sum_{j=1}^{n_r} {^r \alpha_{s,j}}\,\phi_r\left(x_{r,j}\right)\quad\mathbf{(1)}$
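A step worth spelling out: relative to the ordered basis $\mathcal{B}$, equation $\mathbf{(1)}$ says that the coordinate vector of $T(\phi_r(x_{r,s}))$ vanishes outside the $r$-th block of positions, i.e.

$\displaystyle \left[T\left(\phi_r\left(x_{r,s}\right)\right)\right]_{\mathcal{B}}=\left(0,\cdots,0,\ {^r\alpha_{s,1}},\cdots,{^r\alpha_{s,n_r}},\ 0,\cdots,0\right)$

where the possibly nonzero entries occupy precisely the slots corresponding to $\phi_r(x_{r,1}),\cdots,\phi_r(x_{r,n_r})$. This is exactly what produces the block-diagonal shape below.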

from which it follows that

$\displaystyle \left[T\right]_{\mathcal{B}}=\left( \begin{array}{c|c|c|c|c|c|c}& & & & & &\\ T\left(\phi_1\left(x_{1,1}\right)\right) & \cdots & T\left(\phi_1\left(x_{1,n_1}\right)\right) & \cdots & T\left(\phi_m\left(x_{m,1}\right)\right) & \cdots & T\left(\phi_m\left(x_{m,n_m}\right)\right)\\ & & & & & & \end{array}\right)$

But, it’s fairly easy to see from $\mathbf{(1)}$ that the above can be rewritten as

$\displaystyle \begin{pmatrix}{^1 \alpha_{1,1}} & \cdots & {^1\alpha_{1,n_1}}\\ \vdots & \ddots & \vdots\\ {^1\alpha_{n_1,1}} & \cdots & {^1\alpha_{n_1,n_1}}\end{pmatrix}\oplus\cdots\oplus\begin{pmatrix}{^m\alpha_{1,1}} & \cdots & {^m\alpha_{1,n_m}}\\ \vdots & \ddots & \vdots\\ {^m\alpha_{n_m,1}} & \cdots & {^m\alpha_{n_m,n_m}}\end{pmatrix}$

But, upon inspection this is equal to

$\displaystyle \bigoplus_{k=1}^{m}\left[T_k\right]_{\mathcal{B}_k}$

as desired. $\blacksquare$
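The theorem can be sanity-checked numerically in a concrete instance of our own devising: take $\mathscr{U}_1=\mathbb{R}^2$, $\mathscr{U}_2=\mathbb{R}^3$, $\mathscr{V}=\mathbb{R}^5$, with each $\phi_k$ the inclusion of $\mathscr{U}_k$ onto the corresponding block of coordinates. Assembling the matrix of $T=T_1\oplus T_2$ entry by entry from its action on the basis $\mathcal{B}$ recovers the direct sum of the blocks (a sketch using the column-of-coordinates convention, not the post's notation verbatim):

```python
import numpy as np

# Hypothetical concrete instance: [T_1]_{B_1} is 2x2, [T_2]_{B_2} is 3x3.
A1 = np.array([[1.0, 2.0],
               [3.0, 4.0]])
A2 = np.array([[0.0, 1.0, 0.0],
               [0.0, 0.0, 1.0],
               [1.0, 0.0, 0.0]])

def T(v):
    """The direct sum T_1 (+) T_2 acting blockwise on R^5."""
    return np.concatenate([A1 @ v[:2], A2 @ v[2:]])

# Matrix of T in the basis B = (phi_1(e_1), phi_1(e_2), phi_2(e_1), ...):
# column j holds the coordinates of T applied to the j-th basis vector.
M = np.column_stack([T(e) for e in np.eye(5)])

# Direct sum of the block matrices, per the mechanical definition.
expected = np.zeros((5, 5))
expected[:2, :2] = A1
expected[2:, 2:] = A2

assert np.array_equal(M, expected)
```

The check is deliberately simple: the content of the theorem is that nothing off the diagonal blocks ever appears, no matter which summand a basis vector comes from.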
