# Abstract Nonsense

## Permutations (Pt. VI Alternate Definitions of the Sign of a Permutation)

Point of post: In this post we point out two different, but commonly used, definitions of the sign of a permutation and show that they’re equivalent to the one given here. Namely, the definition given by the polynomial commonly denoted $\Delta(x_1,\cdots,x_n)$ in Lang (viz. ref. 1, pp. 63-64) and the definition by inversions as in Shilov (viz. ref. 2, pp. 5-6).

Motivation

Often in abstract and linear algebra books the definition of the sign of a permutation is not the same as the one I gave in the last post. This can be patently disturbing, since one will undoubtedly wonder “Are these the same definition, just said differently?” and, maybe more to the point, “Why did they use this definition? Is it ‘better’, and if so, in what way?” I hope to answer these questions (to within the limitations of my time and the size of these posts) for two of the most commonly found definitions of the sign of a permutation.

The Polynomial $\Delta(x_1,\cdots,x_n)$

To motivate this definition let us go back to the intuition behind the definition of $\text{sgn }\pi$. Recall that we thought of some hypothetical function $f(x_1,\cdots,x_n)$ with the property that if we “switch” any two of its arguments the sign of the function changes. And we wanted to answer the question: “if $\pi\in S_n$ clearly either $f(x_1,\cdots,x_n)=f(x_{\pi(1)},\cdots,x_{\pi(n)})$ or $f(x_1,\cdots,x_n)=-f(x_{\pi(1)},\cdots,x_{\pi(n)})$, but which is it?” Intuitively the first case will happen if we perform “switches” of two arguments an even number of times, since each pair of switches cancels: $-(-f(x_1,\cdots,x_n))=f(x_1,\cdots,x_n)$. And the second will occur when there is an odd number of “switches”, since we can think of this as performing an even number of “switches” and then lastly performing one more switch. But, if we think about this, the even number of “switches” will give us (by the prior hand-waving) $f(x_1,\cdots,x_n)$ and thus the last “switch” performed after these will give us $-f(x_1,\cdots,x_n)$.

But, we’ve gained a considerable amount of knowledge since then and can state this much more clearly. What the above is really saying is that $f(x_1,\cdots,x_n)$ is a function such that if $\tau\in S_n$ is a transposition then $f(x_1,\cdots,x_n)=-f(x_{\tau(1)},\cdots,x_{\tau(n)})$. Thus, we conclude from the above that intuitively $f(x_1,\cdots,x_n)=f(x_{\pi(1)},\cdots,x_{\pi(n)})$ if and only if $\pi$ can be written as the product of an even number of transpositions, and $f(x_1,\cdots,x_n)=-f(x_{\pi(1)},\cdots,x_{\pi(n)})$ if and only if $\pi$ can be written as the product of an odd number of transpositions. Thus, putting this together we see that

$\displaystyle \frac{f(x_{\pi(1)},\cdots,x_{\pi(n)})}{f(x_1,\cdots,x_n)}=\begin{cases}1 & \mbox{if}\quad \pi\text{ is even}\\-1 & \mbox{if}\quad \pi\text{ is odd}\end{cases}\quad\quad(1)$

which is precisely what we defined the sign of a permutation to be.
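As a quick sanity check of this parity description (a minimal Python sketch, not from the original posts; the function name is mine), we can compute the sign directly from a decomposition into transpositions: a cycle of length $k$ is a product of $k-1$ transpositions, so summing $k-1$ over the disjoint cycles of $\pi$ gives the parity.

```python
def sign_by_transpositions(perm):
    """Sign of a permutation given as a tuple, e.g. (2, 0, 1) means
    pi(0)=2, pi(1)=0, pi(2)=1 (0-indexed).

    Decompose pi into disjoint cycles; a k-cycle is a product of k-1
    transpositions, so sgn(pi) = (-1)^(sum over cycles of (k - 1))."""
    n = len(perm)
    seen = [False] * n
    transpositions = 0
    for start in range(n):
        if seen[start]:
            continue
        length = 0
        j = start
        # walk the cycle containing `start`
        while not seen[j]:
            seen[j] = True
            j = perm[j]
            length += 1
        transpositions += length - 1
    return -1 if transpositions % 2 else 1
```

For example, a single transposition gives $-1$, while a 3-cycle (two transpositions) gives $+1$.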

With this in mind, it’s more or less obvious why some authors, such as Serge Lang, define the sign of a permutation to be

$\displaystyle \text{sgn}(\pi)=\frac{\Delta(x_{\pi(1)},\cdots,x_{\pi(n)})}{\Delta(x_1,\cdots,x_n)}$

where

$\displaystyle \Delta(x_1,\cdots,x_n)=\prod_{1\leqslant i<j\leqslant n}(x_i-x_j)$
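Since $\Delta(x_{\pi(1)},\cdots,x_{\pi(n)})=\text{sgn}(\pi)\,\Delta(x_1,\cdots,x_n)$ as polynomials, the ratio can be evaluated at any distinct points, say $x_i=i$. A small Python sketch of this (taking $\Delta(x_1,\cdots,x_n)=\prod_{i<j}(x_i-x_j)$; the function names are mine):

```python
from fractions import Fraction

def delta(xs):
    """Delta(x_1,...,x_n) = product over i < j of (x_i - x_j)."""
    out = Fraction(1)
    for i in range(len(xs)):
        for j in range(i + 1, len(xs)):
            out *= xs[i] - xs[j]
    return out

def sign_by_delta(perm):
    """sgn(pi) = Delta(x_{pi(1)},...,x_{pi(n)}) / Delta(x_1,...,x_n),
    evaluated exactly at the distinct points x_i = i (0-indexed perm)."""
    xs = [Fraction(i) for i in range(len(perm))]
    permuted = [xs[perm[i]] for i in range(len(perm))]
    return int(delta(permuted) / delta(xs))
```

Any choice of distinct values works; exact rational arithmetic just avoids floating-point noise in the ratio.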

That said, we won’t leave this to hand-waving and prove that the two are equivalent. But first, we prove the following lemma

Lemma: Let $\displaystyle \Delta(x_1,\cdots,x_n)=\prod_{1\leqslant i<j\leqslant n}(x_i-x_j)$. Then,

$\displaystyle \Delta(x_1,\cdots,x_n)=(-1)^{{n\choose 2}}\det\begin{bmatrix}1 & \cdots & x_1^{n-1}\\ \vdots & \ddots & \vdots\\ 1 & \cdots & x_n^{n-1}\end{bmatrix}$

Proof: Note that the number of factors $x_i-x_j,\text{ }1\leqslant i<j\leqslant n$, is precisely the number of ways of picking two distinct objects from a group of $n$ objects, i.e. $\displaystyle {n\choose 2}$. Thus, since each factor satisfies $x_i-x_j=-(x_j-x_i)$, it suffices to show that

$\displaystyle \prod_{1\leqslant i<j\leqslant n}(x_j-x_i)=\det\begin{bmatrix}1 & \cdots & x_1^{n-1}\\ \vdots & \ddots & \vdots\\ 1 & \cdots & x_n^{n-1}\end{bmatrix}$

but this identity is well known. The conclusion follows. $\blacksquare$
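The determinant identity just invoked is easy to check numerically. Here is a self-contained Python sketch (helper names are mine) that verifies, for one choice of distinct points, both the product formula for the determinant and the lemma’s $(-1)^{\binom{n}{2}}$ relation:

```python
from fractions import Fraction
from math import comb

def det(m):
    """Determinant via exact Gaussian elimination over the rationals."""
    m = [[Fraction(v) for v in row] for row in m]
    n = len(m)
    d = Fraction(1)
    for c in range(n):
        piv = next((r for r in range(c, n) if m[r][c] != 0), None)
        if piv is None:
            return Fraction(0)
        if piv != c:                      # row swap flips the sign
            m[c], m[piv] = m[piv], m[c]
            d = -d
        d *= m[c][c]
        for r in range(c + 1, n):
            f = m[r][c] / m[c][c]
            for k in range(c, n):
                m[r][k] -= f * m[c][k]
    return d

xs = [2, 5, 7, 11]                        # any distinct points
n = len(xs)
# rows of the Vandermonde matrix: 1, x_i, x_i^2, ..., x_i^(n-1)
vandermonde = [[Fraction(x) ** k for k in range(n)] for x in xs]

prod_ji = Fraction(1)                     # prod over i<j of (x_j - x_i)
delta_val = Fraction(1)                   # prod over i<j of (x_i - x_j)
for i in range(n):
    for j in range(i + 1, n):
        prod_ji *= xs[j] - xs[i]
        delta_val *= xs[i] - xs[j]

assert det(vandermonde) == prod_ji
assert delta_val == (-1) ** comb(n, 2) * det(vandermonde)
```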

Remark: The matrix whose determinant I so readily stated is “well known” is called the Vandermonde matrix. I would love to prove that identity, but time just doesn’t permit. So, for those who haven’t seen it before, the classic proof may be found here and a combinatorial proof here.

So, with this in hand, the following theorem will now seem relevant.

Theorem: Let $\pi\in S_n$. Then, for any $v_1,\cdots,v_n\in \mathbb{C}^n$

$\left[\begin{array}{ccc} & v_{\pi(1)} & \\ \hline & \vdots &\\ \hline & v_{\pi(n)} &\end{array}\right]=P_{\pi}^{T}\left[\begin{array}{ccc} & v_1 &\\ \hline & \vdots & \\ \hline & v_n & \end{array}\right]$

Proof: Let

$P_{\pi}^T=\left[\begin{array}{ccc} & p_1 &\\ \hline & \vdots &\\ \hline & p_n &\end{array}\right]$

Thus, by the general rules for multiplying matrices

$P_{\pi}^T\left[\begin{array}{ccc} & v_1 &\\ \hline & \vdots &\\ \hline & v_n &\end{array}\right]=\left[\begin{array}{ccc} & p_1\left[\begin{array}{c}v_1\\ \hline \vdots\\ \hline v_n\end{array}\right] &\\ \hline & \vdots & \\ \hline & p_n\left[\begin{array}{c}v_1\\ \hline \vdots\\ \hline v_n\end{array}\right] &\end{array}\right]$

Notice, though, that checking the definitions gives $p_j=[a_{j,k}]$ where

$a_{j,k}=\begin{cases}1 & \mbox{if}\quad k=\pi(j)\\ 0 & \mbox{if}\quad k\ne\pi(j)\end{cases}$

Thus, by definition

$p_j\left[\begin{array}{ccc} & v_1 &\\ \hline & \vdots &\\ \hline & v_n &\end{array}\right]=[c_{j,s}]$

where

$\displaystyle c_{j,s}=\sum_{k=1}^{n}a_{j,k}v_{k,s}=\sum_{k\ne \pi(j)}0+a_{j,\pi(j)}v_{\pi(j),s}=v_{\pi(j),s}$

from which it easily follows that

$p_j\left[\begin{array}{ccc} & v_1 &\\ \hline & \vdots &\\ \hline & v_n &\end{array}\right]=v_{\pi(j)}$

and so the conclusion follows. $\blacksquare$
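The theorem is easy to see in action numerically. A small Python sketch (0-indexed, helper names are mine), building the rows of $P_{\pi}^T$ exactly as in the proof — row $j$ has a $1$ in column $\pi(j)$ and $0$ elsewhere — and checking that multiplication reorders the stacked rows:

```python
def perm_matrix_T(perm):
    """P_pi^T as in the proof: entry a_{j,k} = 1 if k = pi(j), else 0."""
    n = len(perm)
    return [[1 if k == perm[j] else 0 for k in range(n)] for j in range(n)]

def matmul(a, b):
    """Plain matrix product, by the general rules for multiplying matrices."""
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

perm = (2, 0, 1)                          # pi(0)=2, pi(1)=0, pi(2)=1
V = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]     # rows v_0, v_1, v_2 stacked

# row j of the product is v_{pi(j)}, just as the theorem asserts
assert matmul(perm_matrix_T(perm), V) == [V[2], V[0], V[1]]
```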

References:

1.  Lang, Serge. Undergraduate Algebra. New York: Springer, 1998. Print.

2.  Shilov, G. E. Linear Algebra. Englewood Cliffs, NJ: Prentice-Hall, 1971. Print.