Abstract Nonsense

Crushing one theorem at a time

Algebra of Square Matrices (Pt. I)


Point of post: In this post we discuss the algebra of square matrices of size n over a field F, approaching the relation between matrices and linear transformations in the direction opposite to the one Halmos takes in sections 36 and 37 of his book. Namely, we start out by defining matrices as algebraic objects in their own right, and then canonically connect the endomorphism algebra of a vector space with a particular matrix algebra.

Motivation

In this post we show how, given a field F, there is a natural way to produce an associative unital algebra of dimension n^2 over F: the algebra of square matrices of size n\times n (to be defined below). Not only will this algebra serve as another interesting addition to our menagerie of vector spaces (algebras), but it will give us a fascinating and useful means to look at the endomorphism algebra of a vector space \mathscr{V} in a whole new way.

Definition of Matrices

Let F be a field and n\in\mathbb{N}. For \alpha_{i,j}\in F,\quad i,j\in[n] we call the square array

\begin{pmatrix}\alpha_{1,1} & \cdots & \alpha_{1,n}\\ \vdots & \ddots & \vdots\\ \alpha_{n,1} & \cdots & \alpha_{n,n}\end{pmatrix}

a square matrix of size n over F. We denote the set of all such square matrices by \text{Mat}_n\left(F\right). For convenience we denote a square matrix either by an upper case letter, such as M, or by the notation [\alpha_{i,j}] (or both). We call the vector (\alpha_{i,1},\cdots,\alpha_{i,n})\in F^n the i^{\text{th}} row of M and the vector (\alpha_{1,j},\cdots,\alpha_{n,j})\in F^n the j^{\text{th}} column of M. We call \alpha_{i,j} the (i,j)^{\text{th}} entry of [\alpha_{i,j}], and we say two matrices are equal if they have the same entries. In essence, a matrix [\alpha_{i,j}] can really be thought of as an n^2-tuple of elements of F.
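As a concrete aside (my own illustration, not anything from Halmos or Golan), here is a minimal Python sketch of this bookkeeping, modeling the field F by the rationals via fractions.Fraction; the helper names row and col are hypothetical:

```python
from fractions import Fraction  # model the field F by the rationals Q

# A square matrix of size n over F, stored row by row:
# M[i][j] holds the (i+1, j+1) entry, since Python indexes from 0.
M = [[Fraction(1), Fraction(2)],
     [Fraction(3), Fraction(4)]]

def row(M, i):
    """The i-th row of M as a vector in F^n (1-indexed, as in the post)."""
    return tuple(M[i - 1])

def col(M, j):
    """The j-th column of M as a vector in F^n (1-indexed)."""
    return tuple(M[i][j - 1] for i in range(len(M)))

print(row(M, 1))  # (Fraction(1, 1), Fraction(2, 1))
print(col(M, 2))  # (Fraction(2, 1), Fraction(4, 1))
```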

Vector Space Structure

We now endow \text{Mat}_n\left(F\right) with a vector space structure over F. Namely, if M,N\in\text{Mat}_n\left(F\right) with M=[\alpha_{i,j}] and N=[\beta_{i,j}], and \alpha,\beta\in F are scalars, we define the linear combination \alpha M+\beta N to be the matrix S=[\gamma_{i,j}] where \gamma_{i,j}=\alpha\alpha_{i,j}+\beta\beta_{i,j}. Put more visually,

\alpha \begin{pmatrix}\alpha_{1,1} & \cdots & \alpha_{1,n}\\ \vdots & \ddots & \vdots\\ \alpha_{n,1} & \cdots & \alpha_{n,n}\end{pmatrix}+\beta\begin{pmatrix}\beta_{1,1} & \cdots & \beta_{1,n}\\ \vdots & \ddots & \vdots\\ \beta_{n,1} & \cdots & \beta_{n,n}\end{pmatrix}=\begin{pmatrix}\alpha \alpha_{1,1} +\beta \beta_{1,1} & \cdots & \alpha \alpha_{1,n}+\beta \beta_{1,n}\\ \vdots & \ddots & \vdots\\ \alpha \alpha_{n,1}+\beta \beta_{n,1} & \cdots & \alpha\alpha_{n,n}+\beta\beta_{n,n}\end{pmatrix}

We then define the zero element \bold{0} to be the zero matrix, denoted [0], namely the matrix

\begin{pmatrix}0 & \cdots & 0\\ \vdots & \ddots & \vdots\\ 0 & \cdots & 0\end{pmatrix}

and, as a special case of such linear combinations, for a matrix [\alpha_{i,j}] we define -[\alpha_{i,j}] to be [-\alpha_{i,j}]; in other words, scalars distribute entry-wise. We leave it to the reader to verify that this does, in fact, define a vector space structure on \text{Mat}_n\left(F\right).
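Continuing the Python sketch above, the entry-wise operations might be rendered as follows (again, lin_comb, zero, and neg are names of my own choosing):

```python
def lin_comb(a, M, b, N):
    """The matrix S = a*M + b*N, with gamma_{i,j} = a*alpha_{i,j} + b*beta_{i,j}."""
    n = len(M)
    return [[a * M[i][j] + b * N[i][j] for j in range(n)] for i in range(n)]

def zero(n):
    """The zero matrix [0]."""
    return [[Fraction(0)] * n for _ in range(n)]

def neg(M):
    """-[alpha_{i,j}] = [-alpha_{i,j}], i.e. scalars distributed entry-wise."""
    return [[-x for x in r] for r in M]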

We now prove what was hinted at in the definition of matrices, namely that \text{Mat}_n\left(F\right) is just a guise for the familiar space F^{n^2}. Put more directly:

Theorem: Let F be a field, then:

\begin{aligned}&(1)\quad \text{Mat}_n\left(F\right)\textit{ is }n^2\textit{ dimensional}\\ &(2)\quad f:\text{Mat}_n\left(F\right)\to F^{n^2}:[\alpha_{i,j}]\mapsto (\alpha_{1,1},\alpha_{1,2},\cdots,\alpha_{n,n-1},\alpha_{n,n})\textit{ is an isomorphism}\end{aligned}

Proof:

(1): To do this we merely show that \left\{M_{k,\ell}:k,\ell\in[n]\right\} is a basis for \text{Mat}_n\left(F\right), where M_{k,\ell}=[\delta_{(k,\ell),(i,j)}] and

\delta_{(k,\ell),(i,j)}=\begin{cases}1 & \mbox{if}\quad (k,\ell)=(i,j)\\ 0 & \mbox{if}\quad (k,\ell)\ne(i,j)\end{cases}

But this follows directly from the observation that if \alpha_{k,\ell}\in F,\text{ }k,\ell\in[n] then

\displaystyle \sum_{k=1}^{n}\sum_{\ell=1}^{n}\alpha_{k,\ell}M_{k,\ell}=[\alpha_{i,j}]

so the M_{k,\ell} span \text{Mat}_n\left(F\right); moreover, taking [\alpha_{i,j}]=[0] forces every \alpha_{k,\ell}=0, which gives linear independence.

(2): This follows directly from our earlier characterization of isomorphisms: evidently f is linear, and letting \{e_1,\cdots,e_{n^2}\} be the usual basis for F^{n^2} we see that f(M_{k,\ell})=f([\delta_{(k,\ell),(i,j)}])=e_{(k-1)n+\ell}, so f maps a basis bijectively onto a basis.

\blacksquare
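Continuing the Python sketch, we can spot-check part (2)'s index formula e_{(k-1)n+\ell}; basis_matrix and flatten are hypothetical stand-ins for M_{k,\ell} and f (remember Python indexes from 0 while the post indexes from 1):

```python
def basis_matrix(n, k, l):
    """M_{k,l}: 1 in position (k, l) and 0 elsewhere (1-indexed, as in the proof)."""
    return [[Fraction(1) if (i, j) == (k, l) else Fraction(0)
             for j in range(1, n + 1)] for i in range(1, n + 1)]

def flatten(M):
    """The isomorphism f: Mat_n(F) -> F^{n^2}, reading the entries row by row."""
    return tuple(entry for r in M for entry in r)

# f sends M_{k,l} to the standard basis vector e_{(k-1)n + l}; with Python's
# 0-indexing that is position (k-1)*n + (l-1) of the flattened tuple.
n = 3
for k in range(1, n + 1):
    for l in range(1, n + 1):
        image = flatten(basis_matrix(n, k, l))
        assert image.index(Fraction(1)) == (k - 1) * n + (l - 1)
```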

Algebra Structure of Square Matrices

We now discuss the much more interesting, and much less intuitive, structure on \text{Mat}_n\left(F\right): the multiplicative structure. If M=[\alpha_{i,j}],N=[\beta_{i,j}]\in\text{Mat}_n\left(F\right), we define the product of M and N, denoted by the juxtaposition MN, to be the matrix [\gamma_{i,j}] where

\displaystyle \gamma_{i,j}=\sum_{r=1}^{n}\alpha_{i,r}\beta_{r,j}

Once again, more visually

\displaystyle \begin{pmatrix}\alpha_{1,1} & \cdots & \alpha_{1,n}\\ \vdots & \ddots & \vdots\\ \alpha_{n,1} & \cdots & \alpha_{n,n}\end{pmatrix}\begin{pmatrix}\beta_{1,1} & \cdots & \beta_{1,n}\\ \vdots & \ddots & \vdots\\ \beta_{n,1} & \cdots & \beta_{n,n}\end{pmatrix}=\begin{pmatrix}\sum_{r=1}^{n}\alpha_{1,r}\beta_{r,1} & \cdots & \sum_{r=1}^{n}\alpha_{1,r}\beta_{r,n}\\ \vdots & \ddots & \vdots\\ \sum_{r=1}^{n}\alpha_{n,r}\beta_{r,1} & \cdots & \sum_{r=1}^{n}\alpha_{n,r}\beta_{r,n}\end{pmatrix}
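In the running Python sketch, the product (and, looking ahead to the theorem below, the identity matrix I_n) might look like:

```python
def mat_mul(M, N):
    """The product MN: gamma_{i,j} = sum over r of alpha_{i,r} * beta_{r,j}."""
    n = len(M)
    return [[sum((M[i][r] * N[r][j] for r in range(n)), Fraction(0))
             for j in range(n)] for i in range(n)]

def identity(n):
    """I_n = [delta_{i,j}]: the Kronecker delta, i.e. 1's on the diagonal."""
    return [[Fraction(1) if i == j else Fraction(0) for j in range(n)]
            for i in range(n)]
```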

It is perhaps not obvious that the above multiplication, together with the sum of two matrices defined in the previous section, turns \text{Mat}_n\left(F\right) into an associative unital algebra, so we'll prove this:

Theorem: Let F be a field and let \text{Mat}_n\left(F\right) have the vector space structure given in the first section and the multiplicative structure given in the second section. Then \text{Mat}_n\left(F\right) is an associative unital algebra with identity element I_n=[\delta_{i,j}] (where \delta_{i,j} is, as usual, the Kronecker delta).

Proof: We first prove that the multiplication described above is, in fact, associative. To do this we let M=[\alpha_{i,j}],N=[\beta_{i,j}],P=[\gamma_{i,j}]\in\text{Mat}_n\left(F\right) and note that the general term of MN is

\displaystyle \omega_{i,j}=\sum_{r=1}^{n}\alpha_{i,r}\beta_{r,j}

and the general term of NP is

\displaystyle \nu_{i,j}=\sum_{s=1}^{n}\beta_{i,s}\gamma_{s,j}

Thus, the general term of M(NP) is

\displaystyle \begin{aligned}\sum_{r=1}^{n}\alpha_{i,r}\nu_{r,j} &=\sum_{r=1}^{n}\alpha_{i,r}\sum_{s=1}^{n}\beta_{r,s}\gamma_{s,j}\\ &=\sum_{s=1}^{n}\left(\sum_{r=1}^{n} \alpha_{i,r}\beta_{r,s}\right)\gamma_{s,j}\\ &= \sum_{s=1}^{n}\omega_{i,s}\gamma_{s,j}\end{aligned}

which is the general term for (MN)P, from which associativity follows. (Proof continued in the next post.)
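Continuing the Python sketch one last time, here is a quick numerical spot check of associativity (and of I_n acting as a unit) on random rational matrices; this is illustrative only, and no substitute for the proof:

```python
import random

def rand_mat(n):
    """A random matrix over Q, just for a sanity check."""
    return [[Fraction(random.randint(-5, 5)) for _ in range(n)] for _ in range(n)]

M, N, P = rand_mat(3), rand_mat(3), rand_mat(3)
assert mat_mul(M, mat_mul(N, P)) == mat_mul(mat_mul(M, N), P)  # M(NP) == (MN)P
assert mat_mul(M, identity(3)) == M == mat_mul(identity(3), M)  # I_3 is a unit
```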

References:

1. Golan, Jonathan S. The Linear Algebra a Beginning Graduate Student Ought to Know. Dordrecht: Springer, 2007. Print.

2. Halmos, Paul R. Finite-Dimensional Vector Spaces. New York: Springer-Verlag, 1974. Print.


December 13, 2010 - Posted in Algebra, Halmos, Linear Algebra
