## A Weird Condition on Tableaux

**Point of Post:** In this post we discuss an interesting relation between two tableaux which will ultimately help us construct the irreps of $S_n$ associated to each $n$-frame.

*Motivation*

So, enough being cryptic. I promised that we would create a bijection between frames and irreps; it’s about time I explained roughly how. In our last post we constructed an interesting function on the group algebra. Our main goal in the construction is to show that, up to normalization, this function is a minimal projection, from which we shall get our corresponding irrep. In the journey to prove this we will need a strange, unmotivated concept which has to do with the relationship between the rows of one tableau and those of another. Luckily, the motivation and usefulness will become apparent shortly. That said, we can at least give a glimpse of why anyone would even care about this condition. In particular, we shall use this condition to prove that the irreps associated to two different $n$-frames are inequivalent.

## Free Vector Spaces

**Point of post:** In this post we discuss the notion of free vector spaces. In particular, we discuss their construction and where they naturally come up in the ‘real world’ of mathematics.

*Motivation*

In essence, linear algebra (in the sense of finite dimensional vector spaces) is a very simple subject. Namely, it’s trivial that given any finite dimensional $F$-space $V$ one has that $V\cong F^{\dim V}$. But really, even thinking about vector spaces as just coordinate spaces of the form $F^n$ doesn’t capture the full feel of what we’re studying, since this adds a geometric aspect to the mix (in the case of $\mathbb{R}^n$). Probably the most formal, sterile way of thinking about vector spaces is as formal linear combinations of placeholders with coefficients in $F$. The set of all polynomials of degree less than or equal to $n$ is an excellent example of the idea, except that we might get caught up in thinking of the polynomials as functions instead of just placeholders. This is where free vector spaces come into play. Namely, we’ll do precisely what we said before: make a vector space out of formal symbols. This comes up more often than one may think. Often one has a ‘linear transformation’ which should act on an $n$-dimensional vector space, but one doesn’t want to fuss with the notation of making it act on $F^n$, etc., so one just defines a free vector space. Most often this comes up when what really happens is that one has a set of indices and a function on the indices; one defines the free vector space over the indices and then, bam, suddenly your function on the indices naturally becomes a linear transformation.

It is interesting to compare this intuitive idea to the formal definitions really going on in the background, namely the idea of free modules.
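To make the last idea concrete, here is a minimal Python sketch (all names here are illustrative, not from the post) of a free vector space over an index set: vectors are finite formal sums stored as dictionaries from basis symbols to coefficients, and any function on the indices lifts to a linear transformation by acting on the basis symbols.

```python
from collections import defaultdict

def free_vector(pairs):
    """A vector in the free vector space over a set S: a finite formal sum,
    stored as a map from basis symbols to complex coefficients."""
    v = defaultdict(complex)
    for coeff, symbol in pairs:
        v[symbol] += coeff
    return dict(v)

def add(v, w):
    """Vector addition: combine coefficients symbol by symbol."""
    out = defaultdict(complex, v)
    for s, c in w.items():
        out[s] += c
    return dict(out)

def scale(a, v):
    """Scalar multiplication."""
    return {s: a * c for s, c in v.items()}

def lift(f):
    """Extend a function f on the indices to a linear map on the free vector
    space by sending each basis symbol e_s to e_{f(s)}."""
    def T(v):
        out = defaultdict(complex)
        for s, c in v.items():
            out[f(s)] += c
        return dict(out)
    return T
```

For instance, `lift(lambda s: 'a')` collapses every basis symbol onto `e_a`, and linearity is automatic from the construction.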

## Projections Into the Group Algebra (Pt. I)

**Point of post:** In this post we discuss the notion of the group algebra, in preparation for our eventual discussion of the representations of symmetric groups.

*Motivation*

We’ve seen in past posts that the group algebra is isomorphic, in all the important ways, to a direct sum of matrix algebras. We’ll use this fact to study projections in the group algebra, which generalize the notion of projections in an endomorphism algebra. Namely, projections are elements of the group algebra which are idempotent under convolution. These shall prove to be very important when we attempt, at a later date, to classify the representations of the symmetric group.
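As a quick illustration (a sketch, not anything from the post itself), here convolution is implemented for the group algebra of the cyclic group $\mathbb{Z}_3$, identifying an element with its list of coefficients; the averaging element is then checked to be idempotent under convolution, i.e. a projection:

```python
from fractions import Fraction

n = 3  # the group algebra of the cyclic group Z_3, as coefficient lists

def convolve(f, g):
    """Convolution product: (f * g)(k) = sum over j of f(j) g(k - j)."""
    return [sum(f[j] * g[(k - j) % n] for j in range(n)) for k in range(n)]

delta = [1, 0, 0]          # the identity element delta_e
e = [Fraction(1, n)] * n   # the averaging element (1/|G|) * sum of all g

assert convolve(delta, e) == e   # delta_e is the unit for convolution
assert convolve(e, e) == e       # e is idempotent: a projection
```

Exact rational arithmetic (`Fraction`) makes the idempotency check an equality rather than a floating-point approximation.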

## Consequence of the Decomposition of the Group Algebra Into Matrix Algebras

**Point of post:** In this post we shall use our previous results concerning the decomposition of the group algebra to prove that if one writes an element of the group algebra in a particular way, then that form is conducive to computing polynomials in that element.

*Motivation*

We saw in our last post that the group algebra is isomorphic to a direct sum of matrix algebras. We shall use this fact to derive an interesting fact about the group algebra. Namely, we know that for every choice of matrix entry functions the group algebra is a direct sum of the subalgebras spanned by the matrix entry functions of each irrep. Thus, every element $f$ has a decomposition of the form $f = f_1 + \cdots + f_m$ where $f_k$ lies in the $k$-th summand. We shall show that with this decomposition it is much simpler to calculate $p(f)$ for a polynomial $p$. Namely, we’ll show the awesome result that $p(f)$ is actually equal to $p(f_1) + \cdots + p(f_m)$, each term computed inside its own summand. In fact, this isn’t surprising, since each summand acts analogously to a block sitting inside a block-diagonal matrix algebra.
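The block-diagonal picture can be sketched numerically (plain Python, illustrative names): a polynomial applied to a direct sum of matrices is just the direct sum of the polynomial applied to each block.

```python
def identity(n):
    return [[1 if i == j else 0 for j in range(n)] for i in range(n)]

def mat_mul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def mat_add(A, B):
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def scal(c, A):
    return [[c * x for x in row] for row in A]

def poly_eval(coeffs, A):
    """Evaluate p(A) for p given by coefficients [c0, c1, ...], lowest degree first."""
    n = len(A)
    result = scal(0, identity(n))
    power = identity(n)
    for c in coeffs:
        result = mat_add(result, scal(c, power))
        power = mat_mul(power, A)
    return result

def direct_sum(A, B):
    """Block-diagonal matrix with blocks A and B."""
    n, m = len(A), len(B)
    return [row + [0] * m for row in A] + [[0] * n + row for row in B]

p = [2, 0, 1]            # p(x) = 2 + x^2
A = [[1, 1], [0, 1]]
B = [[3]]
# p applied to the direct sum equals the direct sum of p applied to each block.
assert poly_eval(p, direct_sum(A, B)) == direct_sum(poly_eval(p, A), poly_eval(p, B))
```

Powers of a block-diagonal matrix never mix the blocks, which is exactly the componentwise claim above.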

## Decomposing the Group Algebra Into the Direct Sum of Matrix Algebras

**Point of post:** In this post we show how the group algebra of a finite group can be decomposed into a direct sum of matrix algebras such that the isomorphism between them is unitary when the group algebra is given the usual inner product and the direct sum of matrix algebras the usual direct sum inner product, where each summand is given the Hilbert-Schmidt inner product. Moreover, the isomorphism shall in fact also be an associative unital algebra isomorphism.

*Motivation*

We have seen that the matrix entry functions form an orthonormal basis for the group algebra. From this we got the important result that $|G| = \sum_k d_k^2$, where the $d_k$ are the dimensions of the irreps. It thus follows from basic linear algebra that the group algebra is isomorphic to $\bigoplus_k \mathrm{Mat}_{d_k}(\mathbb{C})$ as complex vector spaces. In this post we’ll show much more. Indeed, we’ll show that this isomorphism is also an associative unital algebra isomorphism. Moreover, we’ll even show that if one gives each summand an inner product which is a multiple of the Hilbert-Schmidt inner product, and the direct sum the usual inner product on direct sums, then the isomorphism is also a unitary map! Thus, in essence, the group algebra becomes isomorphic, in all ways important, to an object we know much, much about.
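For a taste of this in the simplest case (a sketch under the assumption that the group is cyclic, so every irrep is one-dimensional), the decomposition of the group algebra of $\mathbb{Z}_n$ into matrix algebras is just the discrete Fourier transform: convolution becomes pointwise multiplication, and the map is unitary up to normalization.

```python
import cmath

n = 4  # the group Z_4; its group algebra is the space of length-4 coefficient lists

def dft(f):
    """Discrete Fourier transform: evaluation at the n one-dimensional irreps."""
    return [sum(f[j] * cmath.exp(-2j * cmath.pi * j * k / n) for j in range(n))
            for k in range(n)]

def convolve(f, g):
    return [sum(f[j] * g[(k - j) % n] for j in range(n)) for k in range(n)]

f = [1, 2, 0, 1]
g = [0, 1, 1, 3]

# Algebra isomorphism: convolution goes to pointwise multiplication.
lhs = dft(convolve(f, g))
rhs = [a * b for a, b in zip(dft(f), dft(g))]
assert all(abs(a - b) < 1e-9 for a, b in zip(lhs, rhs))

# Unitary up to normalization (Parseval): ||dft(f)||^2 = n * ||f||^2.
assert abs(sum(abs(z) ** 2 for z in dft(f)) - n * sum(abs(x) ** 2 for x in f)) < 1e-9
```

For a non-abelian group the one-dimensional characters above get replaced by honest matrix blocks, but the two displayed identities are the same phenomena the post proves in general.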

## Class Functions

**Point of post:** In this post we derive results about the set of class functions on a finite group $G$, in particular finding its dimension as a subspace of the group algebra and characterizing it as the center of the group algebra.

*Motivation*

In our last series of posts we saw an interesting technique: if we want to prove that the cardinality of a set is equal to $n$, it suffices to construct an $n$-dimensional vector space for which a set of vectors indexed by our set forms a basis. In particular, we saw this by considering the group algebra, of dimension $|G|$, and then showing that the matrix entry functions (as defined in the last post) form a basis for it. We now wish to get our mileage out of this technique by applying it again in a different context (a different set and a different cardinality). We don’t want to ruin the surprise of what precisely this will be, but we shall construct the vector space with the ‘proper dimension’ in this post. In particular, we will consider and study the set of *class functions* on a finite group $G$. Intuitively, these are functions which satisfy (of course we mean this loosely, since we haven’t defined the domain, range, etc.) the common ‘trace identity’ $f(ghg^{-1}) = f(h)$.
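To preview the ‘center’ characterization concretely, here is a small sketch in $S_3$ (names illustrative): a function constant on conjugacy classes commutes, under convolution, with an arbitrary element of the group algebra.

```python
from itertools import permutations

# The symmetric group S_3: permutations of (0, 1, 2) composed as functions.
G = list(permutations(range(3)))

def mul(p, q):
    return tuple(p[q[i]] for i in range(3))

def inv(p):
    r = [0, 0, 0]
    for i, pi in enumerate(p):
        r[pi] = i
    return tuple(r)

def convolve(f, g):
    """(f * g)(x) = sum over y in G of f(y) g(y^{-1} x)."""
    return {x: sum(f[y] * g[mul(inv(y), x)] for y in G) for x in G}

# In S_3 the number of fixed points determines the conjugacy class, so the
# indicator of the transpositions (exactly one fixed point) is a class function.
c = {g: (1 if sum(1 for i in range(3) if g[i] == i) == 1 else 0) for g in G}

# An arbitrary (non-class) element of the group algebra.
f = {g: i for i, g in enumerate(G)}

assert convolve(c, f) == convolve(f, c)   # class functions lie in the center
```

Since everything here is integer arithmetic, the centrality check is an exact equality, not an approximation.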

## Matrix Functions Form an (almost) Orthonormal Basis (Pt. II)

**Point of post:** This is a continuation of the previous post.

## Left Regular Representation

**Point of post:** In this post we shall discuss the notion of the left and right regular representations on the group algebra.

*Motivation*

We saw in our last post that $\ast$-representations of the group algebra on a pre-Hilbert space are, in a sense, in one-to-one correspondence with representations of the group. In this post we give a specific example of a $\ast$-representation of the group algebra on itself, which will then induce a representation of the group on the group algebra. This representation will have a canonical form once we parse it out. This representation (and one very similar to it) will be called the *left regular representation*.
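To parse out what this looks like in coordinates, here is a sketch (illustrative, using $S_3$) of the left regular representation: each group element acts on the group algebra by permuting the basis $\{e_h\}$ via $e_h \mapsto e_{gh}$, and the resulting permutation matrices multiply the way the group does.

```python
from itertools import permutations

G = list(permutations(range(3)))  # S_3 as permutations of (0, 1, 2)
idx = {g: i for i, g in enumerate(G)}

def mul(p, q):
    return tuple(p[q[i]] for i in range(3))

def L(g):
    """Matrix of the left regular representation: L(g) sends e_h to e_{gh}."""
    n = len(G)
    M = [[0] * n for _ in range(n)]
    for h in G:
        M[idx[mul(g, h)]][idx[h]] = 1
    return M

def mat_mul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# L is a homomorphism: L(g) L(h) = L(gh) for every pair of group elements.
assert all(mat_mul(L(g), L(h)) == L(mul(g, h)) for g in G for h in G)
```

Each $L(g)$ is a permutation matrix, so in particular it is unitary with respect to the usual inner product on the group algebra, which is the canonical form the post refers to.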