## Halmos Sections 37 and 38: Matrices and Matrices of Linear Transformations (Pt. IV)

**Point of post:** This is a continuation of this post.

*Remark:* For some strange reason the fourth post (this one) and the fifth (the previous one) got mixed up in the order of posting. The numbering is correct: this is the fourth post in this sequence, and the one preceding it is the fifth.

## Halmos Sections 37 and 38: Matrices and Matrices of Linear Transformations (Pt. V)

**Point of post:** This is a continuation of this post.

## Halmos Sections 37 and 38: Matrices and Matrices of Linear Transformations (Pt. III)

**Point of post:** This is a continuation of this post.

## Halmos Sections 37 and 38: Matrices and Matrices of Linear Transformations (Pt. II)

**Point of post:** This is a continuation of this post.

## Halmos Sections 37 and 38: Matrices and Matrices of Linear Transformations (Pt. I)

**Point of post:** In this post I will complete the problems listed at the end of sections 37 and 38 of Halmos.

*Remark:* For those who are just interested in the solutions to Halmos and haven’t read my side-along postings: for the notation, you will probably need to see the series of posts for which this and this are the first posts.

## Algebra Isomorphism Between Algebra of Square Matrices and the Endomorphism Algebra (Pt. III)

**Point of post:** This is a continuation of this post.

*First Indications of Usefulness*

We have now seen how, given an $n$-dimensional $F$-space $V$ and a fixed ordered basis, there is an associative unital algebra isomorphism between $\text{End}(V)$ and $\text{Mat}_n(F)$. So, the question now remains: “who cares?” Why does knowing the matrix associated with a linear transformation under some fixed ordered basis help us? For right now it’s not apparent why matrices are ‘such a big deal’. It turns out that, like many things in mathematics, the use of matrices over the less explicit function notation has its pros and cons.

The pros are that when using matrices things often make ‘more sense’ visually. As we shall see, the concepts of the trace and determinant of a linear transformation just ‘look more sensible’ when phrased in terms of matrices; there you can literally point to what is happening. Similarly, the concepts of ‘rank’ and ‘nullity’ (the dimensions of the image and kernel of a linear transformation, respectively) are sometimes easier to ‘see’ when using matrices. Moreover, it’s often more convenient to do computations with matrices than it is with linear transformations. In particular, for those of you with an interest in applied mathematics, writing a program to calculate a matrix product is much easier than writing an explicit function for the linear transformation.

The con of using matrices is that one is liable (if not careful) to lose sight of the deep vector-space-theoretic concepts underlying what one is doing. To use a slightly pejorative term (not meant that way), one can get caught ‘playing with matrices’ for the rest of one’s studies. This is something, in my opinion, to avoid. With all this said, I’d like to prove one last theorem regarding matrices and linear transformations, which was remarked on above. Namely:
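As a computational illustration of the point above, here is a minimal sketch (assuming $F=\mathbb{R}$ and $n=2$; the maps `rotate90` and `scale` are illustrative examples, not from the post) showing that composing two linear transformations written as explicit functions agrees with multiplying their matrices with respect to the standard ordered basis:

```python
import numpy as np

# Two linear maps on R^2, written once as explicit functions
# and once as matrices, to illustrate that composing the functions
# corresponds to multiplying the matrices.

def rotate90(v):
    # Rotation by 90 degrees: (x, y) -> (-y, x)
    x, y = v
    return np.array([-y, x])

def scale(v):
    # Diagonal scaling: (x, y) -> (2x, 3y)
    x, y = v
    return np.array([2 * x, 3 * y])

# The same maps as matrices with respect to the standard ordered basis.
R = np.array([[0, -1],
              [1,  0]])
S = np.array([[2, 0],
              [0, 3]])

v = np.array([1, 1])

composed = scale(rotate90(v))   # composition of the functions, applied to v
by_matrix = (S @ R) @ v         # matrix of the composition acting on v
assert np.array_equal(composed, by_matrix)
```

This is exactly the sense in which matrix multiplication is ‘easy to program’: the composite map never has to be written out by hand, since its matrix is just the product $SR$.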

## Algebra Isomorphism Between Algebra of Square Matrices and the Endomorphism Algebra (Pt. I)

**Point of post:** In this series of posts we’ll cover a lot of ground. We’ll discuss how one can canonically associate matrices in $\text{Mat}_n(F)$ with endomorphisms on $F^n$; we’ll then discuss the ideas of ordered bases and associative unital algebra isomorphisms; and we’ll end the sequence with our main theorem, which connects matrices and endomorphisms on general $n$-dimensional $F$-spaces in an interesting and instructive way.

*Motivation*

In our last post we saw how to endow $\text{Mat}_n(F)$ ($F^{n^2}$ in disguise) with the structure of an associative unital algebra. We hinted that, besides being another example to add to our list of associative unital algebras, it plays an important and enlightening role in the study of the endomorphism algebra of an $n$-dimensional $F$-space. To see this, we’ll first show how one can canonically interpret a matrix as a linear transformation on $F^n$ (in all formality, what we really mean is that the square array of numbers defined as a matrix gives rise to a linear transformation which is represented and computed using that same square array of numbers). We’ll then show that this process goes in reverse: given an endomorphism on some $n$-dimensional $F$-space, there is a canonical way to produce a matrix. Moreover, this correspondence between linear transformations and matrices turns out to be something called an associative unital algebra isomorphism, which is just a highfalutin way of saying that the correspondence respects the operations of the domain and codomain algebras.
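The two directions of this correspondence can be sketched concretely (assuming $F=\mathbb{R}$, $n=2$; the helper `matrix_of` and the sample map `T` are hypothetical names, not from the post). The standard construction puts the coordinates of $T(b_j)$, in the ordered basis $(b_1,\dots,b_n)$, into the $j$-th column:

```python
import numpy as np

def matrix_of(T, basis):
    """Matrix of the endomorphism T with respect to the ordered basis:
    the j-th column holds the coordinates of T(b_j) in that basis."""
    B = np.column_stack(basis)                        # basis vectors as columns
    cols = [np.linalg.solve(B, T(b)) for b in basis]  # coordinates of T(b_j)
    return np.column_stack(cols)

# A sample endomorphism of R^2.
def T(v):
    x, y = v
    return np.array([x + 2 * y, 3 * y])

# Endomorphism -> matrix, using the standard ordered basis.
std = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
A = matrix_of(T, std)
assert np.allclose(A, [[1, 2], [0, 3]])

# Matrix -> endomorphism: A acts as the linear map v -> A v, and in the
# standard basis this agrees with T itself.
v = np.array([5.0, -1.0])
assert np.allclose(A @ v, T(v))
```

In the standard basis the round trip is the identity, which is the concrete content of the claim that the two processes are inverse to one another.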

## Algebra of Square Matrices (Pt. I)

**Point of post:** In this post we discuss the concept of the *square matrix algebra of size $n$ over a field $F$*, in a quest to approach the relation between matrices and linear transformations in the opposite way from how Halmos views them in sections 36 and 37 of his book. Namely, we start out by defining matrices as algebraic objects in their own right, and then canonically connect the endomorphism algebra of a vector space with a particular matrix algebra.

*Motivation*

In this post we show how, given a field $F$, there is a natural way to produce an associative unital algebra of dimension $n^2$ over $F$. This algebra will be the algebra of square matrices (to be defined below) of size $n$. Not only will this algebra serve as another interesting addition to our menagerie of vector spaces (algebras), but it will also give us a fascinating and useful means to look at the endomorphism algebra of a vector space in a whole new way.
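A quick numerical sanity check of the dimension claim (a sketch assuming $F=\mathbb{R}$ and $n=2$): the matrix units $E_{ij}$, with a $1$ in entry $(i,j)$ and $0$ elsewhere, form a basis of the space of $n\times n$ matrices, so its dimension over $F$ is $n^2$, and they multiply by the rule $E_{ij}E_{kl} = \delta_{jk}E_{il}$:

```python
import numpy as np
from itertools import product

n = 2

def E(i, j):
    # Matrix unit: 1 in entry (i, j), 0 elsewhere.
    M = np.zeros((n, n))
    M[i, j] = 1.0
    return M

# The n^2 matrix units, flattened into vectors, are linearly independent,
# so the space of n x n matrices has dimension n^2.
units = [E(i, j) for i, j in product(range(n), repeat=2)]
flat = np.array([M.flatten() for M in units])
assert np.linalg.matrix_rank(flat) == n * n

# Multiplication rule E_ij E_kl = delta_jk E_il: nonzero only when the
# inner indices match.
assert np.array_equal(E(0, 1) @ E(1, 0), E(0, 0))
assert np.array_equal(E(0, 1) @ E(0, 1), np.zeros((n, n)))
```

The same check works for any $n$ by changing the single constant, which is one way to see that nothing about the dimension count is special to $n=2$.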