## Some Natural Identifications (Pt. IV)

**Point of Post:** This is a continuation of an earlier post in this series.

## Some Natural Identifications (Pt. II)

**Point of Post:** This is a continuation of an earlier post in this series.

## Dual Modules

**Point of Post:** In this post we discuss the notion of dual modules, and give a special treatment showing that for vector spaces, the dual space is isomorphic to the original space if and only if the space is finite dimensional.

*Motivation*

Just as in the case of vector spaces, one is naturally inclined in module theory to look at the set of linear functionals from the module to the ground ring, or, using the notation of homomorphism groups, $\operatorname{Hom}_R(M,R)$. The idea of a dual module is vital in many areas of mathematics, including such non-algebraic subjects as analysis (where the dual space is central to defining generalizations of integration, etc.). For us though, we are interested in the purely algebraic traits of dual modules and what they can tell us about the module itself. We shall be particularly interested in reminding ourselves of the following very interesting fact from linear algebra, which says that for a vector space $V$ the dual space $V^\ast$ is isomorphic to $V$ precisely when $V$ is finite dimensional.
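The definition alluded to above can be recorded concretely; the following is a sketch for a module over a commutative ring $R$ (over a non-commutative ring one must be careful about which side the module structure lives on):

```latex
% The dual module of an R-module M (R commutative):
M^\ast \;=\; \operatorname{Hom}_R(M,\,R)
% made into an R-module with the pointwise operations
(\varphi + \psi)(m) \;=\; \varphi(m) + \psi(m),
\qquad
(r\varphi)(m) \;=\; r\,\varphi(m)
```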

## Canonical Isomorphism Between a Finite Dimensional Inner Product Space and its Dual

**Point of Post:** In this post we prove that every finite dimensional inner product space is isomorphic to its dual space.

*Motivation*

We have seen in the past the proof that every finite dimensional vector space is isomorphic to its double dual. We know of course that such a space is also isomorphic to its dual, since dimension is preserved under taking duals for finite dimensional vector spaces (this is, in fact, a characterization of finite dimensionality), but there was no canonical (basis-free) way of defining the mapping. In this post we show that the scene is different if the vector space is equipped with an inner product (or, more generally, a non-degenerate bilinear form).
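The canonical map in question can be sketched as follows for a real inner product space (over $\mathbb{C}$ the same formula gives a conjugate-linear map):

```latex
% The basis-free map from an inner product space V to its dual V*,
% sending each vector to "pairing against it":
\Phi : V \to V^\ast,
\qquad
\Phi(v) \;=\; \langle\,\cdot\,,\, v \rangle,
\quad\text{i.e.}\quad
\Phi(v)(w) \;=\; \langle w, v \rangle
% Non-degeneracy of the form makes \Phi injective;
% finite dimensionality (dim V = dim V*) then makes it an isomorphism.
```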

## Free Vector Spaces

**Point of post:** In this post we discuss the notion of free vector spaces. In particular we discuss their construction and where they naturally come up in the ‘real world’ of mathematics.

*Motivation*

In essence, linear algebra (in the sense of finite dimensional vector spaces) is a very simple subject. Namely, it’s trivial that for any finite dimensional $F$-space $V$ one has $V \cong F^n$ where $n = \dim V$. But really, thinking about vector spaces as just coordinate spaces of the form $F^n$ doesn’t catch the full feel of what we’re really studying, since this adds a geometric aspect to the mix (in the case of $\mathbb{R}^n$). Probably the most formal, sterile way of thinking about vector spaces is as formal linear combinations of placeholders. Something like $\mathcal{P}_n$ (the set of all polynomials of degree less than or equal to $n$) is an excellent example of the idea–except we might get caught up in thinking of the polynomials as functions instead of just placeholders. This is where free vector spaces come into play. Namely, we’ll do precisely what we said before–make a vector space out of formal symbols.

This comes up more often than one may think. Often one has a ‘linear transformation’ which should act on an $n$-dimensional vector space, but one doesn’t want to fuss with the notation of making it act on $F^n$, etc.–so one just defines a free vector space. Most often this comes up when what really happens is that one has a set of indices and a function on the indices; one defines the free vector space over the indices and then, bam, suddenly the function on the indices naturally becomes a linear transformation.

It is interesting to compare this intuitive idea to the formal definitions really going on in the background, namely the idea of free modules.
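The construction of formal linear combinations sketched above is concrete enough to program. The following is an illustrative sketch (the class and function names are mine, not from the post): elements of the free vector space over a set $S$ are stored as finite maps from symbols of $S$ to nonzero coefficients, and any function on the symbols extends to a linear map, exactly as described.

```python
# Sketch of the free vector space F(S) over the rationals on an
# arbitrary set S of symbols. All names here are illustrative.
from fractions import Fraction

class FreeVector:
    """A formal linear combination of symbols with rational coefficients."""
    def __init__(self, coeffs=None):
        # keep only nonzero coefficients so that equality is well defined
        self.coeffs = {s: Fraction(c) for s, c in (coeffs or {}).items() if c != 0}

    def __add__(self, other):
        total = dict(self.coeffs)
        for s, c in other.coeffs.items():
            total[s] = total.get(s, Fraction(0)) + c
        return FreeVector(total)

    def scale(self, c):
        return FreeVector({s: c * v for s, v in self.coeffs.items()})

    def __eq__(self, other):
        return self.coeffs == other.coeffs

def basis(s):
    # the canonical injection S -> F(S): a symbol becomes a basis vector
    return FreeVector({s: 1})

def linear_extension(f):
    # any function f : S -> F(T) extends uniquely to a linear map F(S) -> F(T),
    # which is the "function on indices becomes a linear transformation" step
    def F(v):
        out = FreeVector()
        for s, c in v.coeffs.items():
            out = out + f(s).scale(c)
        return out
    return F
```

For instance, `basis('x') + basis('y').scale(2)` represents the formal combination $x + 2y$, and `linear_extension` applied to any map of symbols acts on it linearly.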

## Direct Sum of Algebras

**Point of post:** In this post we shall discuss a natural way to build new algebras out of a collection of algebras. This is the direct sum of algebras which, unsurprisingly, mimics the construction of the direct sum of groups or the direct sum of vector spaces.
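As a sketch of what "mimics the direct sum of vector spaces" means, the operations on a direct sum of two algebras $A$ and $B$ over a field are all defined componentwise:

```latex
% Operations on A \oplus B, all componentwise:
(a_1, b_1) + (a_2, b_2) \;=\; (a_1 + a_2,\; b_1 + b_2)
\qquad
\lambda (a, b) \;=\; (\lambda a,\; \lambda b)
\qquad
(a_1, b_1)(a_2, b_2) \;=\; (a_1 a_2,\; b_1 b_2)
```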

## The Hilbert-Schmidt Inner Product on Complex Matrix Algebras

**Point of post:** In this post we discuss a natural way to define an inner product on a complex matrix algebra of the form $\mathrm{Mat}_n(\mathbb{C})$ and describe some of its properties.

*Motivation*

Of course we know that the space of $n\times n$ matrices over $\mathbb{C}$, denoted $\mathrm{Mat}_n(\mathbb{C})$, is an associative unital algebra with the usual definitions of scalar multiplication, matrix addition, and matrix multiplication. It then seems fruitful to ask what kinds of inner products one can put on it. In this post we shall define a natural one, called the Hilbert-Schmidt inner product, and derive some interesting results about it.