## Algebra Isomorphism Between Algebra of Square Matrices and the Endomorphism Algebra (Pt. III)

**Point of post:** This is a continuation of this post.

*First Indications of Usefulness*

We have now seen how, given an $n$-dimensional $F$-space $V$ and a fixed ordered basis $B$, there is an associative unital algebra isomorphism between $\text{End}(V)$ and $\text{Mat}_n(F)$. So, the question now remains: "who cares?" Why does knowing the matrix associated with a linear transformation under some fixed ordered basis help us? For right now it's not apparent why matrices are 'such a big deal'. It turns out that, like many things in mathematics, the use of matrices over the less explicit function notation has its pros and cons.

The pros are that when using matrices things often make 'more sense' visually. As we shall see, the concepts of the trace and determinant of a linear transformation just 'look more sensible' when phrased in terms of matrices, where you can literally point to what is happening. Similarly, the concepts of 'rank' and 'nullity' (the dimensions of the image and kernel of a linear transformation, respectively) are sometimes easier to 'see' when using matrices. Moreover, it's often more convenient to do computations with matrices than it is with linear transformations. In particular, for those of you with an interest in applied mathematics, writing a program to compute a matrix product is much easier than writing an explicit function for the linear transformation.

The con of using matrices is that one is liable (if not careful) to lose sight of the deep vector-space-theoretic concepts underlying what one is doing. To use a slightly pejorative term (not meant that way), one can get caught 'playing with matrices' for the rest of one's studies. This is something, in my opinion, to avoid. With all this said, I'd like to prove one last theorem regarding matrices and linear transformations, which was remarked on above. Namely:
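To make the computational point concrete, here is a minimal sketch (plain Python; the function names and the rotation example are mine, not from the post) of how little code matrix arithmetic takes — the same two routines apply no matter what the underlying linear transformation 'does':

```python
def mat_vec(M, v):
    """Multiply a square matrix (list of rows) by a column vector:
    the i-th entry of the result is sum_j M[i][j] * v[j]."""
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

def mat_mul(A, B):
    """Multiply two square matrices: entry (i, j) is sum_k A[i][k] * B[k][j]."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# Example: rotation of the plane by 90 degrees, as a matrix.
R = [[0, -1],
     [1, 0]]
print(mat_vec(R, [1, 0]))  # the first basis vector rotates to (0, 1)
```

Composing two transformations is then just `mat_mul` — no symbolic manipulation of the functions themselves is needed.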

**Theorem:** *Let $V$ be an $n$-dimensional $F$-space and $B=(x_1,\cdots,x_n)$ a fixed ordered basis for $V$. Suppose $[\cdot]_B:V\to F^n$ is the coordinate map, sending each $v\in V$ to its coordinate (column) vector with respect to $B$.*

*Then, for any $T\in\text{End}(V)$ and any $v\in V$ we have that $[T(v)]_B=[T]_B\,[v]_B$.*

**Proof:** Let $v\in V$ be arbitrary. Then, there exist scalars $\alpha_1,\cdots,\alpha_n\in F$ such that

$$v=\sum_{j=1}^{n}\alpha_j x_j$$

Also, there exist scalars $\beta_{i,j}\in F$ such that

$$T(x_j)=\sum_{i=1}^{n}\beta_{i,j}x_i,\qquad j=1,\cdots,n$$

(so that $[T]_B=[\beta_{i,j}]$). Thus,

$$T(v)=\sum_{j=1}^{n}\alpha_j T(x_j)=\sum_{j=1}^{n}\alpha_j\sum_{i=1}^{n}\beta_{i,j}x_i=\sum_{i=1}^{n}\left(\sum_{j=1}^{n}\beta_{i,j}\alpha_j\right)x_i$$

so that the $i$-th entry of $[T(v)]_B$ is $\sum_{j=1}^{n}\beta_{i,j}\alpha_j$. But, with equal validity we see that the $i$-th entry of the product $[T]_B[v]_B$ is

$$\sum_{j=1}^{n}\beta_{i,j}\alpha_j$$

so that

$$[T(v)]_B=[T]_B[v]_B$$

From where the conclusion follows. $\blacksquare$
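A quick numerical sanity check of the theorem (a sketch in plain Python; the choice of example — the derivative map on polynomials of degree at most two, with ordered basis $(1,x,x^2)$ — is mine):

```python
def mat_vec(M, v):
    """Matrix-vector product: the i-th entry is sum_j M[i][j] * v[j]."""
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

# T = d/dx on polynomials a0 + a1*x + a2*x^2, ordered basis B = (1, x, x^2).
# Column j holds the B-coordinates of T applied to the j-th basis vector:
# T(1) = 0, T(x) = 1, T(x^2) = 2x.
T_B = [[0, 1, 0],
       [0, 0, 2],
       [0, 0, 0]]

v_B  = [5, 3, 7]   # [v]_B for v = 5 + 3x + 7x^2
Tv_B = [3, 14, 0]  # [T(v)]_B computed by hand: T(v) = 3 + 14x

assert mat_vec(T_B, v_B) == Tv_B  # [T(v)]_B = [T]_B [v]_B, as the theorem says
```

Evaluating the transformation and multiplying by its matrix give the same coordinate vector, which is exactly the content of the theorem.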

From this we have the following neat corollary:

**Corollary:** *Let $V$ be an $n$-dimensional $F$-space with ordered basis $B$ and $T,S\in\text{End}(V)$. Then, $T$ and $S$ commute if and only if $[T]_B$ and $[S]_B$ commute.*

**Proof:** Let $v\in V$ be arbitrary and assume first that $[T]_B$ and $[S]_B$ commute. Then,

$$[(T\circ S)(v)]_B=[T]_B[S]_B[v]_B=[S]_B[T]_B[v]_B=[(S\circ T)(v)]_B$$

and since $v\in V$ was arbitrary (and the coordinate map is injective) it follows that $T\circ S=S\circ T$.

Conversely, suppose that $T\circ S=S\circ T$ and note then that

$$[T]_B[S]_B=[T\circ S]_B=[S\circ T]_B=[S]_B[T]_B$$

From where the conclusion follows. $\blacksquare$
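The corollary can likewise be checked on concrete matrices (a sketch in plain Python; the rotation/reflection example is mine): two plane rotations commute, while a rotation and a reflection generally do not — so, by the corollary, the same is true of the underlying transformations.

```python
def mat_mul(A, B):
    """Square-matrix product: entry (i, j) is sum_k A[i][k] * B[k][j]."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def commute(A, B):
    """Do A and B commute as matrices, i.e. AB = BA?"""
    return mat_mul(A, B) == mat_mul(B, A)

rot90  = [[0, -1], [1, 0]]    # rotation by 90 degrees
rot180 = [[-1, 0], [0, -1]]   # rotation by 180 degrees
flip   = [[1, 0], [0, -1]]    # reflection across the x-axis

assert commute(rot90, rot180)    # rotations of the plane commute
assert not commute(rot90, flip)  # a rotation and a reflection need not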

