## The Endomorphism Algebra

**Point of post: **In this post we discuss the equivalent of sections 34 and 35 of Halmos’s book.

*Motivation*

In our last post we discussed the concept of linear transformations from an $F$-space $\mathscr{V}$ to itself and how we can form the vector space $\text{End}(\mathscr{V})$ of all such linear transformations. It turns out, though, that this set of all linear transformations can be endowed with much more structure than that of a vector space. To motivate how special this property is, consider the following question: given a vector space $\mathscr{V}$, what does $xy$ mean for vectors $x,y\in\mathscr{V}$? In general, it’s meaningless, because a vector space need not have a “multiplication”. It turns out, though, that some vector spaces do have a canonical way to define multiplication in a way that “makes sense”; such vector spaces are invariably called *associative algebras*. In particular, it turns out that there is a canonical way to turn $\text{End}(\mathscr{V})$ into an associative algebra. Namely, there is a natural way to define the *product *of two endomorphisms.

*Algebras*

Let $\mathscr{A}$ be an $F$-space. Then, $\mathscr{A}$ is called an *associative algebra *if there exists a map

$$\cdot:\mathscr{A}\times\mathscr{A}\to\mathscr{A}:(a,b)\mapsto a\cdot b$$

such that for all $a,b,c\in\mathscr{A}$

1. $a\cdot(b\cdot c)=(a\cdot b)\cdot c$

2. $a\cdot(b+c)=a\cdot b+a\cdot c$

3. $(a+b)\cdot c=a\cdot c+b\cdot c$

4. $\lambda(a\cdot b)=(\lambda a)\cdot b=a\cdot(\lambda b)$

for all $\lambda\in F$. If $\mathscr{A}$ also possesses a non-zero element, denoted $\mathbf{1}$, such that

5. $\mathbf{1}\cdot a=a\cdot\mathbf{1}=a$

for all $a\in\mathscr{A}$, then $\mathscr{A}$ is called an *associative algebra with identity/unital algebra*. In practice one doesn’t explicitly write out $a\cdot b$ but denotes the product by the concatenation $ab$.

We now proceed to give some examples:

**Ex(1): **Consider the space $C(\mathbb{R})$ of all continuous functions from $\mathbb{R}$ to itself. With the usual (pointwise) definitions of addition and multiplication this is an associative algebra (a Banach algebra, if one restricts to a compact domain and gives it the sup norm and the induced topology) with identity the constant function $x\mapsto 1$.

**Ex(2): **$\mathbb{R}^3$ with $a\cdot b=a\times b$, where $\times$ is the usual cross product, is a *non-associative* algebra.
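The failure of associativity here is easy to check by hand; the following is a minimal pure-Python sketch (the helper `cross` and the choice of test vectors are ours) comparing $(e_1\times e_2)\times e_2$ with $e_1\times(e_2\times e_2)$:

```python
# Verify that the cross product on R^3 is not associative.
def cross(a, b):
    """Standard cross product of two 3-vectors (given as tuples)."""
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

e1, e2 = (1, 0, 0), (0, 1, 0)

left  = cross(cross(e1, e2), e2)   # (e1 x e2) x e2 = e3 x e2
right = cross(e1, cross(e2, e2))   # e1 x (e2 x e2) = e1 x 0

print(left)    # (-1, 0, 0)
print(right)   # (0, 0, 0)
```

Since the two parenthesizations already disagree on basis vectors, no rebracketing convention can rescue associativity.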

**Ex(3): **The set $\text{Mat}_n(F)$ of all $n\times n$ matrices over some field $F$, with the usual definitions of matrix multiplication and addition, is an associative algebra with identity the identity matrix $I_n$.

*Remark: *Note that the last example gives an example of an associative unital algebra that isn’t commutative, in the sense that, in general, for two matrices $A,B\in\text{Mat}_n(F)$ it’s not true that $AB=BA$. Also, it shows that in general elements of an associative unital algebra need not have inverses, in the multiplicative sense.
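Both failures in the remark can be checked directly. Below is a small pure-Python sketch (representing $2\times 2$ matrices as tuples of rows; the helper name `mmul` is ours) exhibiting a non-commuting pair and a non-zero element with no multiplicative inverse:

```python
# The 2x2 matrix algebra: associative and unital, but not commutative,
# and nonzero elements need not be invertible.
def mmul(A, B):
    """Multiply two 2x2 matrices given as tuples of rows."""
    return tuple(tuple(sum(A[i][k]*B[k][j] for k in range(2))
                       for j in range(2)) for i in range(2))

I = ((1, 0), (0, 1))
A = ((1, 1), (0, 1))
B = ((1, 0), (1, 1))
N = ((0, 1), (0, 0))   # nonzero but singular (determinant 0)

assert mmul(A, I) == mmul(I, A) == A        # axiom 5: identity element
assert mmul(A, B) != mmul(B, A)             # multiplication not commutative
assert mmul(N, N) == ((0, 0), (0, 0))       # N*N = 0, so N cannot be invertible
```

The last line also foreshadows the zero divisors we will meet in $\text{End}(\mathscr{V})$: $N\neq 0$ yet $N\cdot N=0$.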

While there are many theorems relating to general associative unital algebras, we shall have need for only one, and it’s merely an observation. Namely, observe that for any $a\in\mathscr{A}$ we have that $0\cdot a=(0+0)\cdot a=0\cdot a+0\cdot a$, and so upon subtraction $0\cdot a=0$. Similarly, $a\cdot 0=0$.

*Polynomials*

Given an associative algebra $\mathscr{A}$ with identity $\mathbf{1}$ over $F$, there is a canonical way to define polynomials on $\mathscr{A}$. Namely, we may start by inductively defining $a^n$ for $n\in\mathbb{N}$ by the relations $a^0=\mathbf{1}$ and $a^{n+1}=a\cdot a^n$. Note that by associativity this definition is well-defined. From there we can extend the notion of a scalar-valued polynomial to a *polynomial on $\mathscr{A}$* by

$$p(a)=\lambda_0\mathbf{1}+\lambda_1 a+\cdots+\lambda_n a^n$$

for $\lambda_0,\ldots,\lambda_n\in F$.
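As a concrete instance, the sketch below (pure Python, $2\times 2$ matrices as tuples of rows; all helper names are ours) evaluates $p(x)=x^2-3x+2$ at the diagonal matrix $A=\mathrm{diag}(2,1)$. Since the eigenvalues of $A$ are exactly the roots of $p$, the result is the zero matrix:

```python
# Evaluating a polynomial p(x) = c0 + c1*x + ... + cn*x^n at an element
# of the 2x2 matrix algebra, with the convention A^0 = I.
def mmul(A, B):
    return tuple(tuple(sum(A[i][k]*B[k][j] for k in range(2))
                       for j in range(2)) for i in range(2))

def madd(A, B):
    return tuple(tuple(A[i][j] + B[i][j] for j in range(2)) for i in range(2))

def mscale(c, A):
    return tuple(tuple(c * A[i][j] for j in range(2)) for i in range(2))

I = ((1, 0), (0, 1))
ZERO = ((0, 0), (0, 0))

def poly_eval(coeffs, A):
    """Evaluate c0*I + c1*A + ... + cn*A^n; coeffs in ascending order."""
    result, power = ZERO, I
    for c in coeffs:
        result = madd(result, mscale(c, power))
        power = mmul(power, A)
    return result

A = ((2, 0), (0, 1))
print(poly_eval([2, -3, 1], A))   # p(x) = 2 - 3x + x^2 -> ((0, 0), (0, 0))
```

Accumulating `power` by repeated multiplication is exactly the inductive definition $a^{n+1}=a\cdot a^n$ above.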

*Endomorphism Algebra*

The rest of this post will be devoted to showing how one can canonically impose a multiplication on $\text{End}(\mathscr{V})$ which transforms it into an associative unital algebra. The surprising thing is the identity of this multiplication, namely, function composition. In essence we will show that we can consider

$$\circ:\text{End}(\mathscr{V})\times\text{End}(\mathscr{V})\to\text{End}(\mathscr{V}):(T,S)\mapsto T\circ S$$

as a multiplication map, with identity $\text{id}_\mathscr{V}$. But, this is just grunt work, so let’s get to it:

**Theorem: ***$\text{End}(\mathscr{V})$ with usual addition and function composition is an associative unital algebra with identity $\text{id}_\mathscr{V}$.*

**Proof: **We have already established that $\text{End}(\mathscr{V})$ is an $F$-space, and so it suffices to check axioms one through five as listed in the algebra section. We do this axiom by axiom.

1. This follows since function composition, in general, is associative: $T\circ(S\circ R)=(T\circ S)\circ R$.

2. This follows since

$$\left(T\circ(S+R)\right)(x)=T\left(S(x)+R(x)\right)=T(S(x))+T(R(x))=\left(T\circ S+T\circ R\right)(x)$$

for all $x\in\mathscr{V}$ (using the linearity of $T$), and thus $T\circ(S+R)=T\circ S+T\circ R$.

3. This is done using the exact same method (in fact, $(T+S)\circ R=T\circ R+S\circ R$ needs only the pointwise definition of addition, not linearity).

4. This follows since

$$\left(\lambda(T\circ S)\right)(x)=\lambda T(S(x))=\left((\lambda T)\circ S\right)(x)=T\left(\lambda S(x)\right)=\left(T\circ(\lambda S)\right)(x)$$

for all $x\in\mathscr{V}$, where the third equality uses the linearity of $T$.

5. This follows directly from the definition of the identity function: $\text{id}_\mathscr{V}\circ T=T\circ\text{id}_\mathscr{V}=T$.

Thus, since all five axioms are satisfied, the conclusion follows. $\blacksquare$

Note that while $\text{End}(\mathscr{V})$ is an associative unital algebra, it is far, far from being as *nice *as a field. Explicitly, the multiplication (for spaces of dimension greater than one) admits zero divisors, as can be seen (say, for $\mathscr{V}=F^2$) by considering the projections $S(x,y)=(x,0)$ and $T(x,y)=(0,y)$: both are non-zero, yet $T\circ S=0$. Moreover, the multiplication isn’t even commutative (for spaces of dimension greater than one), as can be seen by considering $R(x,y)=(y,x)$, for which $(S\circ R)(x,y)=(y,0)$ but $(R\circ S)(x,y)=(0,x)$.
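These failures can be replayed numerically. A hedged sketch, assuming $\mathscr{V}=\mathbb{R}^2$, with endomorphisms written as plain Python functions and multiplication given by composition:

```python
# Zero divisors and non-commutativity in End(R^2), with multiplication
# given by function composition.
def compose(T, S):
    """The product T*S in End(V), i.e. the map v -> T(S(v))."""
    return lambda v: T(S(v))

S = lambda v: (v[0], 0)      # projection onto the first coordinate
T = lambda v: (0, v[1])      # projection onto the second coordinate
R = lambda v: (v[1], v[0])   # swap the coordinates

v = (3, 5)
# Zero divisors: S and T are both nonzero, yet T∘S is the zero map.
print(compose(T, S)(v))      # (0, 0)
# Non-commutativity: S∘R and R∘S disagree.
print(compose(S, R)(v))      # (5, 0)
print(compose(R, S)(v))      # (0, 3)
```

The projections here are exactly the maps $S$, $T$, $R$ from the paragraph above, so the printed values realize $T\circ S=0$ and $S\circ R\neq R\circ S$ on a sample vector.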

**References:**

1. Golan, Jonathan S. *The Linear Algebra a Beginning Graduate Student Ought to Know*. Dordrecht: Springer, 2007. Print.

2. Halmos, Paul R. *Finite-Dimensional Vector Spaces*. New York: Springer-Verlag, 1974. Print.

3. Simmons, George Finlay. *Introduction to Topology and Modern Analysis*. Malabar, FL: Krieger Pub., 2003. Print.
