Abstract Nonsense

Crushing one theorem at a time

Algebra Isomorphism Between Algebra of Square Matrices and the Endomorphism Algebra (Pt. I)

Point of post: In this series of posts we’ll cover a lot of ground. We’ll discuss how one can canonically associate matrices in \text{Mat}_n\left(F\right) with endomorphisms on F^n; we’ll then discuss the ideas of ordered bases and associative unital algebra isomorphisms; and we’ll end the series with our main theorem, which connects matrices and endomorphisms on general n-dimensional F-spaces in an interesting and instructive way.


In our last post we saw how to endow \text{Mat}_n\left(F\right) (F^{n^2} in disguise) with the structure of an associative unital algebra. We hinted that, besides being another example to add to our list of associative unital algebras, it plays an important and enlightening role in the study of the endomorphism algebra of an n-dimensional F-space. To see this we’ll first show how one can canonically interpret a matrix as a linear transformation on F^n (in all formality, what we really mean is that the square array of numbers defined as a matrix gives rise to a linear transformation which is represented and computed using the same square array of numbers). We’ll then show that this process goes in reverse: given an endomorphism on some n-dimensional F-space \mathscr{V}, there is a canonical way to produce a matrix. Moreover, this correspondence between linear transformations and matrices turns out to be something called an associative unital algebra isomorphism, which is just a highfalutin way of saying that the correspondence respects the operations of the domain and codomain algebras.

Square Matrices as Transformations on F^n

We begin by showing how, given a field F and some matrix M\in\text{Mat}_n\left(F\right), there is a natural way to associate with M an element of \text{End}\left(F^n\right). We’ll see that the association is so natural that we can’t help but indulge in a formal inaccuracy and use the same symbol for both the matrix and the linear transformation. So, let us examine this idea in more detail.

So, as stated before, we begin by considering some M=[\alpha_{i,j}]\in\text{Mat}_n\left(F\right). From M we’d like to produce some endomorphism on F^n. To start this process we make a convention which is purely for notational convenience. Namely, in the past we’ve written n-tuples in F^n ‘horizontally’, in the sense that a general element of F^n looks like (\beta_1,\cdots,\beta_n). But for our purposes (it will make things look ‘smoother’) we should like to think of n-tuples in F^n as being written vertically. Thus, with this convention, a general element of F^n will look like

\begin{pmatrix}\beta_1\\ \vdots\\ \beta_n\end{pmatrix}

From there we can define the endomorphism M:F^n\to F^n induced by M to be the one which acts on column vectors as follows:

\begin{pmatrix}\alpha_{1,1} & \cdots & \alpha_{1,n}\\ \vdots & \ddots & \vdots\\ \alpha_{n,1} & \cdots & \alpha_{n,n}\end{pmatrix}\begin{pmatrix}\beta_1\\ \vdots\\ \beta_n\end{pmatrix}=\begin{pmatrix}\sum_{j=1}^{n}\alpha_{1,j}\beta_j\\ \vdots\\ \sum_{j=1}^{n}\alpha_{n,j}\beta_j\end{pmatrix}
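For readers who like to compute, the coordinate formula above can be sketched in a few lines of Python (the helper name `mat_vec` is mine, not from the text; a matrix is a list of rows):

```python
def mat_vec(M, v):
    """Apply the n-by-n matrix M (list of rows) to the column vector v:
    the i-th entry of the result is sum_j M[i][j] * v[j]."""
    n = len(v)
    return [sum(M[i][j] * v[j] for j in range(n)) for i in range(n)]

M = [[1, 2],
     [3, 4]]
v = [5, 6]
print(mat_vec(M, v))  # [1*5 + 2*6, 3*5 + 4*6] = [17, 39]
```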

This mapping is indeed linear since

\begin{aligned}\begin{pmatrix}\alpha_{1,1} & \cdots & \alpha_{1,n}\\ \vdots & \ddots & \vdots\\ \alpha_{n,1} & \cdots & \alpha_{n,n}\end{pmatrix}\left(\begin{pmatrix}\beta_1\\ \vdots\\ \beta_n\end{pmatrix}+\begin{pmatrix}\gamma_1\\ \vdots\\ \gamma_n\end{pmatrix}\right) &= \begin{pmatrix}\sum_{j=1}^{n}\alpha_{1,j}(\beta_j+\gamma_j)\\ \vdots\\ \sum_{j=1}^{n}\alpha_{n,j}(\beta_j+\gamma_j)\end{pmatrix}\\ &=\begin{pmatrix}\sum_{j=1}^{n}\alpha_{1,j}\beta_j\\ \vdots\\ \sum_{j=1}^{n}\alpha_{n,j}\beta_j\end{pmatrix}+\begin{pmatrix}\sum_{j=1}^{n}\alpha_{1,j}\gamma_j\\ \vdots\\ \sum_{j=1}^{n}\alpha_{n,j}\gamma_j\end{pmatrix}\\ &= \begin{pmatrix}\alpha_{1,1} & \cdots & \alpha_{1,n}\\ \vdots & \ddots & \vdots\\ \alpha_{n,1} & \cdots & \alpha_{n,n}\end{pmatrix}\begin{pmatrix}\beta_1\\ \vdots\\ \beta_n\end{pmatrix}+\begin{pmatrix}\alpha_{1,1} & \cdots & \alpha_{1,n}\\ \vdots & \ddots & \vdots\\ \alpha_{n,1} & \cdots & \alpha_{n,n}\end{pmatrix}\begin{pmatrix}\gamma_1\\ \vdots\\ \gamma_n\end{pmatrix}\end{aligned}

and

\begin{aligned}\alpha\left(\begin{pmatrix}\alpha_{1,1} & \cdots & \alpha_{1,n}\\ \vdots & \ddots & \vdots\\ \alpha_{n,1} & \cdots & \alpha_{n,n}\end{pmatrix}\begin{pmatrix}\beta_1\\ \vdots\\ \beta_n\end{pmatrix}\right) &= \alpha\begin{pmatrix}\sum_{j=1}^{n}\alpha_{1,j}\beta_j\\ \vdots\\ \sum_{j=1}^{n}\alpha_{n,j}\beta_j\end{pmatrix}\\ &= \begin{pmatrix}\alpha\sum_{j=1}^{n}\alpha_{1,j}\beta_j\\ \vdots\\ \alpha\sum_{j=1}^{n}\alpha_{n,j}\beta_j\end{pmatrix}\\ &= \begin{pmatrix}\sum_{j=1}^{n}\alpha_{1,j}(\alpha \beta_j)\\ \vdots\\ \sum_{j=1}^{n}\alpha_{n,j}(\alpha \beta_j)\end{pmatrix}\\ &= \begin{pmatrix}\alpha_{1,1} & \cdots & \alpha_{1,n}\\ \vdots & \ddots & \vdots\\ \alpha_{n,1} & \cdots & \alpha_{n,n}\end{pmatrix}\begin{pmatrix}\alpha \beta_1\\ \vdots\\ \alpha\beta_n\end{pmatrix}\end{aligned}
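As a quick numerical sanity check of the two displays above (additivity and homogeneity), here is a sketch; `mat_vec` is my own helper implementing the coordinate sums, not notation from the text:

```python
def mat_vec(M, v):
    """i-th entry of Mv is sum_j M[i][j] * v[j]."""
    n = len(v)
    return [sum(M[i][j] * v[j] for j in range(n)) for i in range(n)]

M = [[2, 0], [1, 3]]
u, w = [1, 4], [5, -2]
alpha = 7

# Additivity: M(u + w) == M(u) + M(w)
lhs = mat_vec(M, [u[i] + w[i] for i in range(2)])
rhs = [mat_vec(M, u)[i] + mat_vec(M, w)[i] for i in range(2)]
assert lhs == rhs

# Homogeneity: M(alpha * u) == alpha * M(u)
assert mat_vec(M, [alpha * x for x in u]) == [alpha * y for y in mat_vec(M, u)]
```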

The interesting thing (though, as we shall soon see, not so surprising) is the following theorem:

Theorem: Let F be a field, A,B\in\text{Mat}_n\left(F\right), and \begin{pmatrix}\gamma_1\\ \vdots\\ \gamma_n\end{pmatrix}\in F^n. Then,

A\left(B\begin{pmatrix}\gamma_1\\ \vdots\\ \gamma_n\end{pmatrix}\right)=\left(AB\right)\begin{pmatrix}\gamma_1\\ \vdots\\ \gamma_n\end{pmatrix}

In words: applying B to \begin{pmatrix}\gamma_1\\ \vdots\\ \gamma_n\end{pmatrix} and then applying A gives the same result as forming the matrix product AB and then applying it to \begin{pmatrix}\gamma_1\\ \vdots\\ \gamma_n\end{pmatrix}.

Proof: Writing A=[\alpha_{i,j}] and B=[\beta_{i,j}], this follows from direct computation:

\displaystyle \begin{aligned}A\left(B\begin{pmatrix}\gamma_1\\ \vdots\\ \gamma_n\end{pmatrix}\right) &= A\left(\begin{pmatrix}\sum_{r=1}^{n}\beta_{1,r}\gamma_r\\ \vdots\\ \sum_{r=1}^{n}\beta_{n,r}\gamma_r\end{pmatrix}\right)\\ &= \begin{pmatrix}\sum_{s=1}^{n}\alpha_{1,s}\sum_{r=1}^{n}\beta_{s,r}\gamma_r\\ \vdots\\ \sum_{s=1}^{n}\alpha_{n,s}\sum_{r=1}^{n}\beta_{s,r}\gamma_r\end{pmatrix}\\ &= \begin{pmatrix}\sum_{r=1}^{n}\left(\sum_{s=1}^{n}\alpha_{1,s}\beta_{s,r}\right)\gamma_r\\ \vdots\\ \sum_{r=1}^{n}\left(\sum_{s=1}^{n}\alpha_{n,s}\beta_{s,r}\right)\gamma_r\end{pmatrix}\\ &= \left(AB\right)\begin{pmatrix}\gamma_1\\ \vdots\\ \gamma_n\end{pmatrix}\end{aligned}\quad\blacksquare
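The theorem is easy to check numerically. Below is a sketch (helper names `mat_vec` and `mat_mul` are mine); note that `mat_mul` computes exactly the double sum \sum_s \alpha_{i,s}\beta_{s,r} appearing in the proof:

```python
def mat_vec(M, v):
    """i-th entry of Mv is sum_j M[i][j] * v[j]."""
    n = len(v)
    return [sum(M[i][j] * v[j] for j in range(n)) for i in range(n)]

def mat_mul(A, B):
    """(AB)[i][r] = sum_s A[i][s] * B[s][r], the inner sum in the proof."""
    n = len(A)
    return [[sum(A[i][s] * B[s][r] for s in range(n)) for r in range(n)]
            for i in range(n)]

A = [[1, 2], [0, 1]]
B = [[3, 0], [4, 5]]
g = [6, 7]

# A(Bg) == (AB)g
assert mat_vec(A, mat_vec(B, g)) == mat_vec(mat_mul(A, B), g)
```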


The question then arises as to whether all of the endomorphisms on F^n are of the above form. We will find the answer as a corollary of the next section.

Associative Unital Algebra Isomorphisms

In this section we discuss the concept of associative unital algebra isomorphisms which, as the name suggests, are just the bijective maps between two algebras which preserve structure ‘both ways’ (for those familiar with category theory, this of course is just saying that associative unital algebra isomorphisms are the isomorphisms in the category of associative unital algebras).

So, let \mathscr{A} and \mathscr{A}' be associative unital algebras over F with multiplicative identities \mathbf{1} and \mathbf{1}'. Then, a mapping f:\mathscr{A}\to\mathscr{A}' is called an associative unital algebra isomorphism if f is bijective and for all x,y\in\mathscr{A} and \alpha,\beta\in F

\begin{aligned}&(1)\quad f(\alpha x+\beta y)=\alpha f(x)+\beta f(y)\\ &(2)\quad f(xy)=f(x)f(y)\\ &(3)\quad f(\mathbf{1})=\mathbf{1}'\end{aligned}
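A concrete example of such a map (not taken from the text, but classical) is conjugation f(X)=PXP^{-1} by a fixed invertible matrix P, viewed as an isomorphism of \text{Mat}_2\left(F\right) with itself. The sketch below checks axioms (1)–(3) for a particular integer P whose inverse also has integer entries:

```python
def mat_mul(A, B):
    n = len(A)
    return [[sum(A[i][s] * B[s][r] for s in range(n)) for r in range(n)]
            for i in range(n)]

P     = [[1, 1], [0, 1]]
P_inv = [[1, -1], [0, 1]]   # P * P_inv is the identity

def f(X):
    """Conjugation by P: a standard algebra isomorphism Mat_2 -> Mat_2."""
    return mat_mul(mat_mul(P, X), P_inv)

def lin_comb(a, X, b, Y):
    """The matrix aX + bY, entrywise."""
    return [[a * X[i][j] + b * Y[i][j] for j in range(2)] for i in range(2)]

I2 = [[1, 0], [0, 1]]
X = [[2, 3], [5, 7]]
Y = [[1, 4], [0, 2]]

assert f(lin_comb(2, X, 3, Y)) == lin_comb(2, f(X), 3, f(Y))  # (1) linearity
assert f(mat_mul(X, Y)) == mat_mul(f(X), f(Y))                # (2) products
assert f(I2) == I2                                            # (3) unit to unit
```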

We first make the observation that, since f is a linear transformation, all the theorems about linear transformations apply, and so in particular f(\mathbf{0})=\mathbf{0}'.

Our next theorem shows that if f is an associative unital algebra isomorphism then so is its inverse:

Theorem: Let \mathscr{A} and \mathscr{A}' be associative unital algebras with identities \mathbf{1} and \mathbf{1}'. Then, if f:\mathscr{A}\to\mathscr{A}' is an associative unital algebra isomorphism, so is f^{-1}.

Proof: Evidently f^{-1} is bijective and f^{-1}(\mathbf{1}')=\mathbf{1}. The fact that the inverse is linear follows from a previous theorem. To see that the inverse preserves products, note that for any x',y'\in\mathscr{A}' we have, by surjectivity of f, that x'=f(x) and y'=f(y) for some x,y\in\mathscr{A}. We see then that

x'y'=f(x)f(y)=f(xy)

and thus applying f^{-1} to both sides finishes the argument. \blacksquare
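Continuing the conjugation example from above (helper names are mine, not the text’s): if f(X)=PXP^{-1}, its inverse is f^{-1}(Y)=P^{-1}YP, and we can check numerically that f^{-1} undoes f and preserves products, just as the theorem asserts:

```python
def mat_mul(A, B):
    n = len(A)
    return [[sum(A[i][s] * B[s][r] for s in range(n)) for r in range(n)]
            for i in range(n)]

P, P_inv = [[1, 1], [0, 1]], [[1, -1], [0, 1]]

def f(X):     return mat_mul(mat_mul(P, X), P_inv)      # conjugation by P
def f_inv(Y): return mat_mul(mat_mul(P_inv, Y), P)      # its inverse

X = [[2, 3], [5, 7]]
Y = [[1, 4], [0, 2]]

# f_inv really inverts f, and it preserves products:
assert f_inv(f(X)) == X
assert f_inv(mat_mul(f(X), f(Y))) == mat_mul(f_inv(f(X)), f_inv(f(Y)))
```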



1. Golan, Jonathan S. The Linear Algebra a Beginning Graduate Student Ought to Know. Dordrecht: Springer, 2007. Print.

2. Halmos, Paul R. Finite-Dimensional Vector Spaces. New York: Springer-Verlag, 1974. Print.


December 14, 2010 - Posted by | Algebra, Halmos, Linear Algebra

