# Abstract Nonsense

## Permutations (Pt.V Transpositions and the Sign of a Permutation)

Point of post: This post (which will undoubtedly spill over into continuations) will discuss the last topic in this sequence of posts on permutations. Namely, we will be discussing transpositions and, the real goal for our linear algebraic purposes, the sign of a permutation.

### Motivation

Up until this point we’ve found a relatively simple class of permutations, cycles, in terms of which every permutation can be written as a product. We now take this one step further and show that every cycle can be written as a product of cycles of length two, or transpositions. Mathematically, the sign of a permutation can be thought of as the answer to the hypothetical question “suppose $f(x_1,\cdots,x_n)$ is a function such that interchanging any two of its arguments negates the original value (e.g. $f(x_2,x_1,x_3,\cdots,x_n)=-f(x_1,x_2,x_3,\cdots,x_n)$); then, for $\pi\in S_n$, is $f(x_{\pi(1)},\cdots,x_{\pi(n)})=f(x_1,\cdots,x_n)$ or $f(x_{\pi(1)},\cdots,x_{\pi(n)})=-f(x_1,\cdots,x_n)$?”

Another way to think about it is this. Suppose that we had a regular $n$-gon $P$ in the plane whose “front” is colored red and whose “back” is colored green, and suppose that the $n$ vertices are labeled $v_1,\cdots,v_n$. We could imagine switching two of the vertices by “flipping” $P$ over the perpendicular bisector of the segment connecting those two vertices. But, we see that in the process we’ve now exposed the green side of $P$. Switching two more vertices results in another flip, so that the red side is exposed again. If we permute the vertices $v_1,\cdots,v_n$ into the configuration $v_{\pi(1)},\cdots,v_{\pi(n)}$, the sign of the permutation will tell us whether the red or the green side of $P$ ends up showing.

### Transpositions

A cycle $\tau\in S_n$ is called a transposition if $\tau$ has length two, in other words if $\tau=(a_1,a_2)$. Intuitively, a transposition is the process of switching just two objects and leaving the rest fixed. The first interesting thing we notice about transpositions is that every non-trivial cycle can be written as a product of them. Indeed:

Theorem: Let $(a_1,\cdots,a_k)\in S_n,\text{ }k\geqslant2$, then

$\displaystyle (a_1,\cdots,a_k)=(a_1,a_k)\cdots(a_1,a_3)(a_1,a_2)$

Proof: Evidently it suffices to show that the two permutations agree on $\{a_1,\cdots,a_k\}$ (since they both leave the elements, if there are any, not in $\{a_1,\cdots,a_k\}$ unchanged). So, we merely note that if $j<k$ then

$\displaystyle \begin{aligned}\left((a_1,a_k)\cdots(a_1,a_3)(a_1,a_2)\right)(a_j) &= \left((a_1,a_k)\cdots(a_1,a_{j+1})(a_1,a_j)\right)(a_j)\\ &= \left((a_1,a_k)\cdots(a_1,a_{j+1})\right)(a_1)\\ &= \left((a_1,a_k)\cdots(a_1,a_{j+2})\right)(a_{j+1})\\ &= a_{j+1}\\ &= (a_1,\cdots,a_k)(a_j)\end{aligned}$

and if $j=k$ then

$\left((a_1,a_k)\cdots(a_1,a_3)(a_1,a_2)\right)(a_k)=(a_1,a_k)(a_k)=a_1=(a_1,\cdots,a_k)(a_k)$

The conclusion follows. $\blacksquare$

Corollary: Every non-trivial permutation may be written as the product of transpositions.

Remark: This lends credence to the idea that every permutation can be produced by making successive “switches”.
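As a quick machine check of the theorem, here is a minimal Python sketch (conventions and names are my own) that applies the transpositions $(a_1,a_k)\cdots(a_1,a_2)$ right-to-left and confirms that the result agrees with the cycle on every point:

```python
def apply_transposition(t, x):
    """Apply the transposition t = (a, b) to a single point x."""
    a, b = t
    return b if x == a else a if x == b else x

def cycle_to_transpositions(cycle):
    """Decompose the cycle (a_1, ..., a_k) as (a_1,a_k)...(a_1,a_3)(a_1,a_2),
    listed here in the order the factors are applied (rightmost first)."""
    a1 = cycle[0]
    return [(a1, cycle[j]) for j in range(1, len(cycle))]

def apply_cycle(cycle, x):
    """Apply the cycle (a_1, ..., a_k) to a single point x."""
    if x in cycle:
        return cycle[(cycle.index(x) + 1) % len(cycle)]
    return x

# Check agreement on every point of {0, ..., 6} for the cycle (2 5 1 6):
cycle = (2, 5, 1, 6)
for x in range(7):
    y = x
    for t in cycle_to_transpositions(cycle):  # rightmost factor acts first
        y = apply_transposition(t, y)
    assert y == apply_cycle(cycle, x)
```

Note that the list is traversed in application order, which matches reading the product $(a_1,a_k)\cdots(a_1,a_2)$ from right to left.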

We next note that every transposition is its own inverse. Indeed, if $\tau=(a_1,a_2)$ then

$\tau^2(a_1)=\tau(\tau(a_1))=\tau(a_2)=a_1$

and

$\tau^2(a_2)=\tau(\tau(a_2))=\tau(a_1)=a_2$

Thus, we are left with the appealing result that if $\pi\in S_n$ is given by $\pi=\tau_1\cdots\tau_k$ then

$\pi^{-1}=\left(\tau_1\cdots\tau_k\right)^{-1}=\tau_k^{-1}\cdots\tau_1^{-1}=\tau_k\cdots\tau_1$

Remark: Intuitively this says that you can undo a permutation that you did in successive switches by undoing the switches in the reverse order in which you performed them.
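This too is easy to verify computationally. The sketch below (my own conventions: a permutation of $\{0,\cdots,n-1\}$ is a tuple `p` with `p[i]` $=\pi(i)$) builds $\pi=\tau_1\tau_2\tau_3$ from three transpositions and checks that composing the same transpositions in reverse order gives $\pi^{-1}$:

```python
def transposition_perm(n, a, b):
    """The transposition (a, b) in S_n as a tuple p with p[i] = tau(i)."""
    p = list(range(n))
    p[a], p[b] = b, a
    return tuple(p)

def compose(p, q):
    """(p q)(i) = p(q(i)): q acts first, matching right-to-left products."""
    return tuple(p[q[i]] for i in range(len(p)))

n = 5
taus = [transposition_perm(n, 0, 3), transposition_perm(n, 1, 4), transposition_perm(n, 2, 3)]

# pi = tau_1 tau_2 tau_3 (the rightmost factor acts first)
pi = taus[0]
for t in taus[1:]:
    pi = compose(pi, t)

# pi_inv = tau_3 tau_2 tau_1: the same switches, undone in reverse order
pi_inv = taus[-1]
for t in reversed(taus[:-1]):
    pi_inv = compose(pi_inv, t)

identity = tuple(range(n))
assert compose(pi, pi_inv) == identity
assert compose(pi_inv, pi) == identity
```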

So, just as with cycles, we wonder what the corresponding permutation matrices look like and, in particular (as we shall see why), what their determinant is.

Theorem: Let $\tau=(a_1,a_2)\in S_n$ be a transposition. Then, if $P_\tau$ is the associated permutation matrix, $P_\tau$ is the identity matrix with columns $a_1$ and $a_2$ switched.

Proof: We can describe the matrix $S$ resulting from switching the $a_1^{\text{th}}$ and $a_2^{\text{th}}$ columns of $I$ as $S=[s_{i,j}]$ where

$s_{i,j}=\begin{cases}1 & \mbox{if}\quad \left(i=j\text{ and }i,j\ne a_1,a_2\right)\text{ or }\left(i=a_1,j=a_2\text{ or }i=a_2,j=a_1\right)\\ 0 & \mbox{otherwise}\end{cases}$

But noting that $\tau(i)=i$ if $i\ne a_1,a_2$, and that $\tau(a_1)=a_2,\ \tau(a_2)=a_1$, the above may equally well be written as

$s_{i,j}=\begin{cases}1 & \mbox{if}\quad i=\tau(j)\\ 0 & \mbox{if}\quad i\ne\tau(j)\end{cases}$

from where it immediately follows that $S=P_\tau$. $\blacksquare$

Corollary: If $\tau$ is a transposition then $\det P_\tau=-1$.

Proof: This follows from elementary matrix analysis, where it’s proven that interchanging two columns (or rows) of a matrix changes the sign of its determinant. Thus, $\det P_\tau=-\det I=-1$. $\blacksquare$
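Both results are easy to check by machine. The sketch below (conventions mine: indices run from $0$, and $P_\tau$ has entry $1$ at $(i,j)$ exactly when $i=\tau(j)$, as in the proof above) builds $P_\tau$, compares it against the identity with two columns swapped, and evaluates its determinant exactly by cofactor expansion:

```python
def transposition(n, a, b):
    """The transposition (a, b) as a list t with t[i] = tau(i)."""
    t = list(range(n))
    t[a], t[b] = b, a
    return t

def perm_matrix(p):
    """P_p with entry 1 at (i, j) exactly when i = p(j)."""
    n = len(p)
    return [[1 if i == p[j] else 0 for j in range(n)] for i in range(n)]

def det(m):
    """Exact integer determinant by cofactor expansion along the first row."""
    if len(m) == 1:
        return m[0][0]
    total = 0
    for j, entry in enumerate(m[0]):
        if entry:
            minor = [row[:j] + row[j + 1:] for row in m[1:]]
            total += (-1) ** j * entry * det(minor)
    return total

n, a, b = 4, 1, 3
P = perm_matrix(transposition(n, a, b))

# P is the identity with columns a and b switched ...
I = [[1 if i == j else 0 for j in range(n)] for i in range(n)]
S = [[row[b] if j == a else row[a] if j == b else row[j] for j in range(n)] for row in I]
assert P == S

# ... and its determinant is -1.
assert det(P) == -1
```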

We are now prepared to prove the theorem toward which all of these posts have been building, namely that a permutation cannot be written as both an even and an odd number of transpositions. In other words, if $\pi\in S_n$ then you can’t have both $\pi=\tau_1\cdots\tau_{2m}$ and $\pi=\tau'_1\cdots\tau'_{2k+1}$.

Theorem: Let $\pi\in S_n$. Then $\pi$ cannot be written as both an even and an odd number of transpositions.

Proof: Suppose that

$\pi=\tau_1\cdots\tau_{2m}$

and

$\pi=\tau'_1\cdots\tau'_{2k+1}$

Then,

$P_{\tau_1}\cdots P_{\tau_{2m}}=P_{\pi}=P_{\tau'_1}\cdots P_{\tau'_{2k+1}}$

and thus remembering that the determinant is a multiplicative function

$1=(-1)^{2m}=\det\left(P_{\tau_1}\cdots P_{\tau_{2m}}\right)=\det\left(P_{\tau'_1}\cdots P_{\tau'_{2k+1}}\right)=(-1)^{2k+1}=-1$

which is a contradiction. Thus, it follows that $\pi$ may not be written as both an even and an odd number of transpositions. $\blacksquare$

With this theorem we may unambiguously define the sign of a permutation $\pi$, denoted $\text{sgn }\pi$, to be $1$ if $\pi$ may be written as a product of an even number of transpositions and $-1$ if it may be written as a product of an odd number. Unsurprisingly, if $\text{sgn }\pi=1$ we call the permutation even, and if $\text{sgn }\pi=-1$ we call it odd. In fact, the above shows the following corollary:

Corollary: $\text{sgn}(\pi)=\det P_{\pi}$
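The corollary can be verified exhaustively on a small symmetric group. In the sketch below (names my own, zero-indexed tuples as before), the sign is computed by decomposing $\pi$ into cycles, turning each cycle into transpositions via the earlier theorem, and taking the parity; this is then compared against $\det P_\pi$ for every $\pi\in S_4$:

```python
from itertools import permutations

def to_transpositions(p):
    """Write p (a tuple with p[i] = pi(i)) as a list of transpositions,
    via its cycle decomposition and (a_1,...,a_k) = (a_1,a_k)...(a_1,a_2)."""
    seen, result = set(), []
    for start in range(len(p)):
        if start in seen or p[start] == start:
            continue
        cycle, x = [], start
        while x not in seen:
            seen.add(x)
            cycle.append(x)
            x = p[x]
        result += [(cycle[0], cycle[j]) for j in range(1, len(cycle))]
    return result

def det(m):
    """Exact integer determinant by cofactor expansion along the first row."""
    if len(m) == 1:
        return m[0][0]
    return sum((-1) ** j * e * det([r[:j] + r[j + 1:] for r in m[1:]])
               for j, e in enumerate(m[0]) if e)

def sgn(p):
    """Sign as the parity of the number of transpositions in a decomposition."""
    return 1 if len(to_transpositions(p)) % 2 == 0 else -1

# sgn(pi) agrees with det(P_pi) for every pi in S_4:
for p in permutations(range(4)):
    P = [[1 if i == p[j] else 0 for j in range(4)] for i in range(4)]
    assert sgn(p) == det(P)
```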

Believe it or not, the main purpose of considering permutation matrices was to make the above proof nice. The usual way to do this uses an ugly induction or a sketchy counting technique; the above is clean-cut and intuitive. Also, the above corollary now allows us to give a very quick proof of the following:

Theorem: Let $\pi,\pi'\in S_n$, then $\text{sgn}(\pi\pi')=\text{sgn}(\pi) \text{sgn}(\pi')$

Proof: This follows from the corollary, since $\text{sgn}(\pi\pi')=\det\left(P_{\pi\pi'}\right)=\det\left(P_{\pi}P_{\pi'}\right)=\det\left(P_{\pi}\right)\det\left(P_{\pi'}\right)=\text{sgn}(\pi)\text{sgn}(\pi')$. $\blacksquare$
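Multiplicativity can likewise be checked exhaustively on a small group. This sketch (conventions mine, as before) computes the sign directly from the cycle structure, using the fact from the theorem above that a $k$-cycle decomposes into $k-1$ transpositions, and verifies $\text{sgn}(\pi\pi')=\text{sgn}(\pi)\text{sgn}(\pi')$ over all of $S_3\times S_3$:

```python
from itertools import permutations

def sgn(p):
    """Sign via parity of the number of transpositions in a decomposition:
    each k-cycle of p contributes k - 1 transpositions."""
    seen, count = set(), 0
    for start in range(len(p)):
        x, length = start, 0
        while x not in seen:
            seen.add(x)
            length += 1
            x = p[x]
        if length:
            count += length - 1
    return 1 if count % 2 == 0 else -1

def compose(p, q):
    """(p q)(i) = p(q(i))."""
    return tuple(p[q[i]] for i in range(len(p)))

# sgn is multiplicative on all of S_3 x S_3:
for p in permutations(range(3)):
    for q in permutations(range(3)):
        assert sgn(compose(p, q)) == sgn(p) * sgn(q)
```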


November 7, 2010
