# Abstract Nonsense

## A Weird Condition on Tableaux

Point of post: In this post we discuss an interesting condition relating two tableaux which will ultimately help us construct the irreps of $S_n$ associated to each $n$-frame.


Motivation


So, enough being cryptic. I promised that we will create a bijection $\text{Frame}_n\to\widehat{S_n}$ in such a way that $\deg\rho^{(\mathcal{F})}=f_{\text{st}}\left(\mathcal{F}\right)$; it's about time I explained roughly how. In our last post we created an interesting function $E:\text{Tab}\left(\mathcal{F}\right)\to\mathbb{C}\left[S_n\right]$. Our main goal in the construction is to show that, up to normalization, $E\left(\mathcal{T}\right)$ is a minimal projection, from which we shall get our corresponding irrep. In the journey to prove this we will need a strange, unmotivated-looking concept which has to do with the relationship between the rows of one tableau $\mathcal{T}$ and another tableau $\mathcal{T}'$. Luckily, the motivation and usefulness will become apparent shortly. That said, we can at least give a glimpse of why anyone would even care about this condition: in particular, we shall use it to prove that the irreps associated to two different $n$-frames are distinct.


May 22, 2011

## Free Vector Spaces

Point of post: In this post we discuss the notion of free vector spaces. In particular, we discuss their construction and where they naturally come up in the 'real world' of mathematics.


Motivation

In essence, linear algebra (in the sense of finite dimensional vector spaces) is a very simple subject. Namely, it's trivial that any finite dimensional $F$-space $\mathscr{V}$ satisfies $\mathscr{V}\cong F^{\dim \mathscr{V}}$. But even thinking about vector spaces as just coordinate spaces of the form $F^n$ doesn't capture the full feel of what we're really studying, since this adds a geometric aspect to the mix (in the case of $\mathbb{R}$). Probably the most formal, sterile way of thinking about vector spaces is as elements of $F$ attached to placeholders. Something like $\mathcal{P}_n$ (the set of all polynomials of degree less than or equal to $n$) is an excellent example of the idea, except we might get caught up in thinking of the polynomials as functions instead of just placeholders. This is where free vector spaces come into play: we'll do precisely what we said before and make a vector space out of formal symbols. This comes up more often than one might think. Often one has a 'linear transformation' which should act on an $n$-dimensional vector space, but one doesn't want to fuss with the notation of making it act on $F^n$, so one just defines a free vector space. Most often this happens when one has a set of indices and a function on those indices: one defines the free vector space over the indices, and then, bam, suddenly the function on the indices naturally becomes a linear transformation.
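This last idea admits a minimal computational sketch (the names `add`, `scale`, and `lift` are ours, not the post's): an element of the free vector space over an index set is a finite formal sum, stored as a dict from indices to coefficients, and any function on the indices extends by linearity to a linear map.

```python
# A minimal sketch: elements of the free vector space over a set of
# indices are finite formal sums, stored as dicts {index: coefficient}.

def add(v, w):
    """Sum of two formal linear combinations."""
    out = dict(v)
    for idx, c in w.items():
        out[idx] = out.get(idx, 0) + c
    return out

def scale(c, v):
    """Scalar multiple of a formal linear combination."""
    return {idx: c * coeff for idx, coeff in v.items()}

def lift(f):
    """Lift a function on indices to a linear map on the free space.

    Here f sends an index to a formal sum; lift extends it by linearity.
    """
    def F(v):
        out = {}
        for idx, c in v.items():
            out = add(out, scale(c, f(idx)))
        return out
    return F

# Hypothetical example: a 'shift' function on the index set {0, 1, 2}.
shift = lambda i: {(i + 1) % 3: 1}   # index i maps to basis vector e_{i+1 mod 3}
F = lift(shift)
v = {0: 2, 1: -1}                    # the formal sum 2*e_0 - e_1
print(F(v))                          # {1: 2, 2: -1}
```

The point is exactly the one in the paragraph above: `shift` was merely a function on indices, and after passing to the free vector space it becomes a linear transformation for free.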


It is interesting to compare this intuitive idea to the formal definitions really going on in the background, namely the idea of free modules.


April 19, 2011

## Projections Into the Group Algebra (Pt. I)

Point of post: In this post we discuss the notion of the group algebra, in preparation for our eventual discussion about the representation of symmetric groups.

Motivation

We've seen in past posts that the group algebra is isomorphic, in all the important ways, to a direct sum of matrix algebras. We'll use this fact to study projections in the group algebra, which are functions generalizing the notion of projections in an endomorphism algebra. Namely, projections are elements of the group algebra which are idempotent under convolution. These shall prove very important when we attempt, at a later date, to classify the representations of the symmetric group.
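To make "idempotent under convolution" concrete, here is a minimal sketch for the small stand-in group $G=\mathbb{Z}/3$ (written additively; the choice of group and the function names are ours). The averaging element $\frac{1}{|G|}\sum_{g\in G}g$ is the classic example of such a projection.

```python
from fractions import Fraction

# Sketch for G = Z/3: an element of the group algebra is a list of 3
# coefficients, and convolution is (f * h)(g) = sum_k f(k) h(g - k).

N = 3

def convolve(f, h):
    return [sum(f[k] * h[(g - k) % N] for k in range(N)) for g in range(N)]

# The averaging element (1/|G|) * sum_g g: each coefficient is 1/3.
e = [Fraction(1, N)] * N

# e is idempotent under convolution: e * e = e.
print(convolve(e, e) == e)   # True
```

Each coefficient of $e\ast e$ is $\sum_{k}\frac{1}{3}\cdot\frac{1}{3}=\frac{1}{3}$, which is why the check succeeds; using `Fraction` keeps the arithmetic exact.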

April 9, 2011

## Consequence of the Decomposition of the Group Algebra Into Matrix Algebras

Point of post: In this post we shall use our previous results concerning the decomposition of the group algebra to prove that if one writes an element of $\mathcal{A}(G)$ in a particular way, then that form is well suited to computing polynomials in that element.

Motivation

We saw in our last post that the group algebra is isomorphic to a direct sum of matrix algebras. We shall use this fact to derive an interesting consequence. Namely, we know that for every choice of matrix entry functions the group algebra is the direct sum of the subalgebras $\Lambda^{(\alpha)}$, where $\Lambda^{(\alpha)}=\text{span}_{\mathbb{C}}\left\{D^{(\alpha)}_{i,j}:i,j\in[d_\alpha]\right\}$ for each $\alpha\in\widehat{G}$. Thus, every element $f\in\mathcal{A}(G)$ has a decomposition of the form $\displaystyle f=\sum_{\alpha\in\widehat{G}}f^{(\alpha)}$ where $f^{(\alpha)}\in\Lambda^{(\alpha)}$. We shall show that with this decomposition it is much simpler to calculate $p(f)$ for a polynomial $p\in\mathbb{C}[x]$. Namely, we'll show the awesome result that $\displaystyle p(f)$ is actually equal to $\displaystyle \sum_{\alpha\in\widehat{G}}p\left(f^{(\alpha)}\right)$. In fact, this isn't surprising, as we shall see that each $\Lambda^{(\alpha)}$ acts analogously to $\widetilde{\text{Mat}_{d_\alpha}\left(\mathbb{C}\right)}$ sitting inside $\displaystyle \bigoplus_{\alpha\in\widehat{G}}\text{Mat}_{d_\alpha}\left(\mathbb{C}\right)$.
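The matrix-algebra picture makes this easy to illustrate numerically: under the isomorphism, $f$ becomes a block-diagonal matrix with one block per $\alpha$, and applying a polynomial blockwise agrees with applying it to the whole matrix. The specific blocks and the polynomial below are ours, chosen only for illustration.

```python
import numpy as np

# Numerical stand-in for the decomposition: f corresponds to a
# block-diagonal matrix, one block per irrep alpha.
A1 = np.array([[2.0]])                      # a 1x1 block
A2 = np.array([[0.0, 1.0], [1.0, 0.0]])    # a 2x2 block

def block_diag(*blocks):
    """Assemble blocks into one block-diagonal matrix."""
    n = sum(b.shape[0] for b in blocks)
    out = np.zeros((n, n))
    i = 0
    for b in blocks:
        k = b.shape[0]
        out[i:i+k, i:i+k] = b
        i += k
    return out

def p(X):
    """The polynomial p(x) = x^2 + 2x + 3, applied to a square matrix."""
    return X @ X + 2 * X + 3 * np.eye(X.shape[0])

F = block_diag(A1, A2)

# p applied to the whole matrix equals p applied block by block.
assert np.allclose(p(F), block_diag(p(A1), p(A2)))
```

Note the one subtlety mirrored in the code: the constant term $3$ means $3$ times the identity, and the identity of the big algebra restricts to the identity of each block, which is exactly why the blockwise computation works.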

April 9, 2011

## Decomposing the Group Algebra Into the Direct Sum of Matrix Algebras

Point of post: In this post we show how the group algebra of a finite group can be decomposed into a direct sum of matrix algebras, such that the isomorphism between them is unitary when the group algebra is given the usual inner product and the direct sum of matrix algebras the usual direct sum inner product, where each summand is given the Hilbert-Schmidt inner product. Moreover, the isomorphism shall in fact also be an associative unital algebra isomorphism.

Motivation

We have seen that the matrix entry functions form an orthonormal basis for the group algebra. From this we got the important result that $\displaystyle |G|=\sum_{\alpha\in\widehat{G}}d_\alpha^2$. It thus follows from basic linear algebra that, as complex vector spaces, $\displaystyle \mathcal{A}(G)\cong\bigoplus_{\alpha\in\widehat{G}}\text{Mat}_{d_\alpha}\left(\mathbb{C}\right)$. In this post we'll show much more. Indeed, we'll show that this isomorphism is also an associative unital algebra isomorphism. Moreover, we'll even show that if one gives each $\text{Mat}_{d_\alpha}\left(\mathbb{C}\right)$ an inner product which is a multiple of the Hilbert-Schmidt inner product, and the direct sum the usual inner product on direct sums, then the isomorphism is also a unitary map! Thus, in essence, the group algebra becomes isomorphic in all important ways to an object we know much, much about.
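For an abelian stand-in such as $G=\mathbb{Z}/3$ (our choice, for illustration) every $d_\alpha=1$, so the decomposition $\mathcal{A}(G)\cong\mathbb{C}\oplus\mathbb{C}\oplus\mathbb{C}$ is just the discrete Fourier transform, and the algebra-isomorphism claim says convolution on one side becomes componentwise multiplication on the other. A quick numerical check:

```python
import cmath

# For G = Z/3 every irrep is one-dimensional, so the decomposition of
# A(G) into matrix algebras is the discrete Fourier transform.

N = 3

def dft(f):
    """Evaluate f against the three characters of Z/3."""
    w = cmath.exp(-2j * cmath.pi / N)
    return [sum(f[k] * w ** (a * k) for k in range(N)) for a in range(N)]

def convolve(f, h):
    """(f * h)(g) = sum_k f(k) h(g - k), the group algebra product."""
    return [sum(f[k] * h[(g - k) % N] for k in range(N)) for g in range(N)]

f = [1, 2, 0]
h = [0, 1, 1]

# Convolution maps to componentwise multiplication under the isomorphism.
lhs = dft(convolve(f, h))
rhs = [x * y for x, y in zip(dft(f), dft(h))]
assert all(abs(a - b) < 1e-9 for a, b in zip(lhs, rhs))
```

This is the $d_\alpha=1$ shadow of the general statement; for nonabelian $G$ the summands are honest matrix algebras rather than copies of $\mathbb{C}$.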

April 6, 2011

## Class Functions

Point of post: In this post we derive results about the set of class functions on a finite group $G$, in particular finding its dimension as a subspace of the group algebra and characterizing it as the center of the group algebra.

Motivation

In our last series of posts we saw an interesting technique: if we want to prove that the cardinality of a set $X$ is equal to $\kappa$, it suffices to construct a vector space $\mathscr{V}$ of dimension $\kappa$ such that $X$ is a basis for $\mathscr{V}$. In particular, we saw that $\displaystyle \sum_{\alpha\in\widehat{G}}d_\alpha^2=|G|$ by considering the group algebra $\mathcal{A}(G)$ of dimension $|G|$ and then showing that $\left\{D^{(\alpha)}_{i,j}:\alpha\in\widehat{G}\text{ and }i,j\in[d_\alpha]\right\}$ is a basis for $\mathcal{A}(G)$ (where, as in the last post, the $D^{(\alpha)}_{i,j}$ are the matrix entry functions). We now wish to get our mileage out of this technique by applying it again in a different context (a different set and a different cardinality). We don't want to ruin the surprise of what precisely this will be, but we shall construct the vector space with the 'proper dimension' in this post. In particular, we will consider and study the set of class functions on a finite group $G$. Intuitively, these are functions which satisfy (of course we mean this loosely, since we haven't defined the domain, range, etc.) the common 'trace identity' $\text{tr}\left(ABA^{-1}\right)=\text{tr}(B)$.
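Concretely, a class function is one constant on each conjugacy class. A small sketch for $G=S_3$ (our choice of example group), computing the conjugacy classes and checking that the fixed-point count, which is the trace of the permutation matrix, is a class function:

```python
from itertools import permutations

# Permutations of {0,1,2} as tuples p, where p[i] is the image of i.
G = list(permutations(range(3)))

def compose(p, q):
    """The permutation p after q."""
    return tuple(p[q[i]] for i in range(3))

def inverse(p):
    inv = [0] * 3
    for i, j in enumerate(p):
        inv[j] = i
    return tuple(inv)

def conjugacy_class(p):
    """The set of all conjugates g p g^{-1} of p."""
    return frozenset(compose(compose(g, p), inverse(g)) for g in G)

classes = {conjugacy_class(p) for p in G}
print(len(classes))   # 3: the identity, the transpositions, the 3-cycles

# A class function: the number of fixed points (the trace of the
# permutation matrix).  It is constant on each conjugacy class.
fix = lambda p: sum(p[i] == i for i in range(3))
assert all(len({fix(p) for p in c}) == 1 for c in classes)
```

The count of classes (here $3$) is exactly the dimension of the space of class functions inside $\mathcal{A}(S_3)$, foreshadowing the dimension count this post is after.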

February 24, 2011

## Matrix Functions Form an (almost) Orthonormal Basis (Pt. II)

Point of post: This is a continuation of this post.

February 23, 2011

## Left Regular Representation (Pt. II)

Point of post: This post is a continuation of this one.

January 23, 2011

## Left Regular Representation

Point of post: In this post we shall discuss the notion of the left and right regular representations on the group algebra $\mathcal{A}\left(G\right)$.

Motivation

We saw in our last post that $\ast$-representations of the group algebra $\mathcal{A}\left(G\right)$ on a pre-Hilbert space $\mathscr{V}$ are, in a sense, in one-to-one correspondence with representations of $G$. In this post we give a specific example of a $\ast$-representation of $\mathcal{A}\left(G\right)$ on itself, which will then induce a representation of $G$ on $\mathcal{A}\left(G\right)$. This representation will have a canonical form once we parse it out. This representation (and one very similar to it) will be called the left regular representation of $G$ on $\mathcal{A}\left(G\right)$.
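The induced representation of $G$ on $\mathcal{A}(G)$ has a very concrete form: $g$ acts on the basis $\{e_h : h\in G\}$ by $e_h\mapsto e_{gh}$, i.e. by a permutation matrix. A sketch for $G=S_3$ (our choice of example group; `rho` is our name), checking the homomorphism property:

```python
from itertools import permutations

# G = S_3, with permutations of {0,1,2} as tuples p (p[i] = image of i).
G = list(permutations(range(3)))
index = {g: i for i, g in enumerate(G)}

def compose(p, q):
    """The permutation p after q."""
    return tuple(p[q[i]] for i in range(3))

def rho(g):
    """Permutation matrix of left multiplication h -> g h on the basis
    {e_h : h in G} of A(G), as nested lists."""
    M = [[0] * len(G) for _ in G]
    for h in G:
        M[index[compose(g, h)]][index[h]] = 1
    return M

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# rho is a homomorphism: rho(g1 g2) = rho(g1) rho(g2).
g1, g2 = G[1], G[4]
assert rho(compose(g1, g2)) == matmul(rho(g1), rho(g2))
```

Since each $\rho(g)$ is a permutation matrix, it is unitary, which is the down-to-earth reason this construction yields a $\ast$-representation.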

January 23, 2011

## *-representations of the Group Algebra

Point of post: This is a continuation of this post.

January 21, 2011