# Abstract Nonsense

## Permutations (Pt. II Permutation Matrices)

Point of post: This post is a continuation of the previous one in a continuing effort to define permutations fully and clearly, something someone reading Halmos's book is bound not to get (though admittedly this isn't his goal). In this particular post we discuss the interesting concept of "permutation matrices" in an effort to intuitively simplify proofs covered in the next few posts.

### Motivation

Often in mathematics the simple act of "thinking of something differently" can have a profound impact on a subject, and the study of permutations is no different. It is thus clearly fruitful to search for as many different ways of viewing permutations as possible. In this post we think about them in terms of something most of us know all about: matrices.

### Definition

Let $\text{Mat}_{n}\left(\mathbb{R}\right)$ be the set of all $n\times n$ matrices with real entries. We call $P\in\text{Mat}_n\left(\mathbb{R}\right)$ a permutation matrix if $P$ can be transformed into the identity matrix $I$ by interchanging rows of $P$. Said differently, $P$ is a permutation matrix if there exist row-exchanging matrices $R_1,\cdots,R_m$ such that

$R_1\cdots R_m=P$

It is clear, though, that this is equivalent to saying that $P=[a_{i,j}]$ is such that for each fixed $i_0$ there is a unique $j\in[n]$ such that $a_{i_0,j}=1$ and $a_{i_0,j'}=0$ for $j'\ne j$, and conversely, for each fixed $j_0$ there exists a unique $i$ such that $a_{i,j_0}=1$ and $a_{i',j_0}=0$ for $i'\ne i$.
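This "exactly one $1$ in every row and every column" criterion is easy to check mechanically. Here is a minimal Python sketch (the function name `is_permutation_matrix` and the sample matrices are my own, purely for illustration):

```python
def is_permutation_matrix(P):
    """Check the doubly-unique-1 criterion: every row and every
    column contains exactly one 1 and zeros elsewhere."""
    n = len(P)
    rows_ok = all(sorted(row) == [0] * (n - 1) + [1] for row in P)
    cols_ok = all(sorted(P[i][j] for i in range(n)) == [0] * (n - 1) + [1]
                  for j in range(n))
    return rows_ok and cols_ok

I3 = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]   # the identity is a permutation matrix
P  = [[0, 1, 0], [0, 0, 1], [1, 0, 0]]   # rows of I3 interchanged
Q  = [[1, 1, 0], [0, 0, 1], [0, 0, 0]]   # fails: two 1s in row 1, none in row 3
```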

Theorem: Let $\text{PermMat}_n$ be the set of all $n\times n$ permutation matrices. Then, $\text{PermMat}_n$ is a group under matrix multiplication.

Proof: Evidently $I\in\text{PermMat}_n$ and $IP=PI=P$ for all $P\in\text{PermMat}_n$. The associativity of the binary operation follows from the fact that matrix multiplication is, in general, associative. Thus, the only real question is whether the products and inverses of permutation matrices are themselves permutation matrices. This follows almost immediately, though, after noticing that for a row-exchanging matrix $R$ we have $R^2=I$, so that $R=R^{-1}$. Thus, if $P\in\text{PermMat}_n$ then

$P=R_1\cdots R_m$

for some row-exchanging matrices $R_1,\cdots,R_m$ and thus $P^{-1}=R_m^{-1}\cdots R_1^{-1}=R_m\cdots R_1$, and thus $P^{-1}\in\text{PermMat}_n$. Similarly, if $P,P'\in\text{PermMat}_n$ then $P=R_1\cdots R_m$ and $P'=R'_1\cdots R'_{m'}$ so that

$PP'=R_1\cdots R_mR'_{1}\cdots R'_{m'}$

so that by definition $PP'\in\text{PermMat}_n$. $\blacksquare$
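The group axioms above can be sanity-checked on a small example. The sketch below (with a hand-rolled `matmul` helper and two example row-exchange matrices of my own choosing) builds a permutation matrix as a product of row-exchange matrices and confirms that the reversed product is its inverse:

```python
def matmul(A, B):
    """Multiply two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

R1 = [[0, 1, 0], [1, 0, 0], [0, 0, 1]]   # exchanges rows 1 and 2
R2 = [[1, 0, 0], [0, 0, 1], [0, 1, 0]]   # exchanges rows 2 and 3

P    = matmul(R1, R2)                    # P = R1 R2, a permutation matrix
Pinv = matmul(R2, R1)                    # reversing the factors gives P^{-1}

I3 = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
```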

We can also see the following theorem

Theorem: $\text{card }\text{PermMat}_n=n!$.

Proof: Note that $P\in\text{PermMat}_n$ is completely determined by the position in each row at which $1$ appears. So, having chosen the position for $1$ in row $1$ (for which there are $n$ choices) we are left with $n-1$ choices for the position of $1$ in row $2$ (because we can't put it in the same position as was chosen in row $1$, otherwise that column would have two $1$s in it). Continuing this logic we see that the $k^{\text{th}}$ row admits $n-k+1$ choices, and so the total number of choices is $n\cdot(n-1)\cdots 2\cdot1=n!$. $\blacksquare$
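The proof's row-by-row choice of where to place each $1$ is exactly an enumeration of permutations, which makes the count easy to verify by brute force (the generator name `permutation_matrices` is my own):

```python
from itertools import permutations
from math import factorial

def permutation_matrices(n):
    """Enumerate all n-by-n permutation matrices by choosing, row by
    row, the column in which the single 1 appears."""
    for cols in permutations(range(n)):
        yield [[1 if j == cols[i] else 0 for j in range(n)]
               for i in range(n)]

mats = list(permutation_matrices(3))     # should have 3! = 6 members
```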

### Why Permutation Matrices Are Interesting

So, as the heading suggests, there's seemingly no reason (except the suggestive name) why we, in our study of permutations, should care one bit about these matrices. Well, think again. Their true power comes from the following theorem.

Theorem: Let $S_n$ denote, as always, the symmetric group on $n$ letters. Then, $S_n\cong\text{PermMat}_n$, where $\cong$ denotes group isomorphism.

Proof: Define

$\theta:S_n\to \text{PermMat}_n:\pi\mapsto P_\pi$

where $P_{\pi}=[a_{i,j}]$ is given by

$\displaystyle a_{i,j}=\begin{cases} 1 & \mbox{if }\quad i=\pi(j)\\ 0 & \mbox{if}\quad i\ne \pi(j)\end{cases}$

(note that this really is a well-defined map since each column and row will have precisely one $1$ in it and zeros elsewhere). Since $\text{card }S_n=\text{card }\text{PermMat}_n=n!$, to prove that $\theta$ is a bijection it suffices, by an earlier post, to prove it's an injection. To do this suppose that $P_{\pi}=[a_{i,j}]$ and $P_{\pi'}=[a'_{i,j}]$ with $P_{\pi}=P_{\pi'}$. Then, every entry of these two matrices must be equal. In particular, since $1=a_{\pi(m),m}$ we must have that $1=a'_{\pi(m),m}$. But, notice that this is true if and only if $\pi(m)=\pi'(m)$. It follows that $\pi(m)=\pi'(m)$ for all $m\in[n]$ and thus $\pi=\pi'$.
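The map $\theta$ and its injectivity can be checked concretely on $S_3$. In the sketch below permutations are $0$-indexed tuples with `pi[j]` playing the role of $\pi(j)$, and the function name `P_of` is my own:

```python
from itertools import permutations

def P_of(pi):
    """theta: the matrix [a_{i,j}] with a[i][j] = 1 exactly when i = pi(j).
    Here pi is a 0-indexed permutation given as a tuple: pi[j] = pi(j)."""
    n = len(pi)
    return [[1 if i == pi[j] else 0 for j in range(n)] for i in range(n)]

# distinct permutations yield distinct matrices, so theta is injective on S_3
images = [P_of(p) for p in permutations(range(3))]
distinct = len({tuple(map(tuple, M)) for M in images})
```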

Now, to see it’s a homomorphism we suppose that $\pi,\pi'\in S_n$ and show that

$P_{\pi\circ\pi'}=P_{\pi}P_{\pi'}$

To do this we note that by definition

$\displaystyle P_{\pi}P_{\pi'}=[c_{i,j}],\quad\text{where }c_{i,j}=\sum_{k=1}^{n}a_{i,k}a'_{k,j}$

and

$\displaystyle P_{\pi\circ\pi'}=[d_{i,j}],\quad\text{where }d_{i,j}=\begin{cases}1 & \mbox{if} \quad i=\pi\left(\pi'\left(j\right)\right)\\ 0 & \mbox{if}\quad i\ne\pi\left(\pi'\left(j\right)\right)\end{cases}$

But, by definition $P_{\pi}P_{\pi'}=P_{\pi\circ\pi'}$ if and only if $c_{i,j}=d_{i,j}$ for each $i,j\in[n]$. So, to prove this we note that

$\displaystyle a_{i,k}a'_{k,j}=\begin{cases} 1 & \mbox{if}\quad i=\pi(k)\text{ and }k=\pi'(j)\\ 0 & \mbox{if}\quad i\ne\pi(k)\text{ or }k\ne\pi'(j)\end{cases}$

Notice then that $i=\pi(k)\text{ and }k=\pi'(j)\implies i=\pi(k)=\pi(\pi'(j))$. Thus, if $i\ne \pi(\pi'(j))$ then $a_{i,k}a'_{k,j}=0,\text{ }k=1,\cdots,n$. Thus, $c_{i,j}=0$ if $i\ne\pi(\pi'(j))$. Now, if $i=\pi(\pi'(j))$ we see that $a_{i,k}a'_{k,j}=0,\text{ }k\in[n]-\{\pi'(j)\}$ and when $k=\pi'(j)$ we have that $a_{i,k}a'_{k,j}=a_{\pi(\pi'(j)),\pi'(j)}a'_{\pi'(j),j}=1\cdot 1=1$, thus $c_{i,j}=1$. Putting this together we see that

$c_{i,j}=\begin{cases}1 &\mbox{if}\quad i=\pi(\pi'(j))\\ 0 & \mbox{if} \quad i\ne\pi(\pi'(j))\end{cases}$

and thus $c_{i,j}=d_{i,j}$ as required. It follows that

$\theta\left(\pi\circ\pi'\right)=P_{\pi\circ\pi'}=P_{\pi}P_{\pi'}=\theta(\pi)\theta(\pi')$

from where it follows that $\theta$ is an isomorphism and so the theorem follows. $\blacksquare$
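As a sanity check on the homomorphism property just proved, one can verify $P_{\pi\circ\pi'}=P_{\pi}P_{\pi'}$ exhaustively over $S_3$. This sketch uses $0$-indexed tuples for permutations and helper names (`P_of`, `matmul`, `compose`) of my own:

```python
from itertools import permutations

def P_of(pi):
    """The matrix of theta: a[i][j] = 1 exactly when i = pi(j)."""
    n = len(pi)
    return [[1 if i == pi[j] else 0 for j in range(n)] for i in range(n)]

def matmul(A, B):
    """Multiply two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def compose(pi, pip):
    """(pi o pip)(j) = pi(pip(j)), permutations as 0-indexed tuples."""
    return tuple(pi[pip[j]] for j in range(len(pi)))

# theta(pi o pi') = theta(pi) theta(pi') for every pair in S_3
homomorphism_ok = all(
    P_of(compose(p, q)) == matmul(P_of(p), P_of(q))
    for p in permutations(range(3)) for q in permutations(range(3)))
```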

Remark: The counterintuitive appearance of $i=\pi(j)$ as opposed to $j=\pi(i)$ is an unfortunate but unavoidable "mistake" of how we compose functions. One can check that if we had defined the above with $j=\pi(i)$ we'd get that $P_{\pi\circ\pi'}=P_{\pi'}P_{\pi}$.
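The remark's claim that the alternative convention reverses the order of the product can also be checked over $S_3$. Below, `Q_of` (my own name) encodes the $j=\pi(i)$ convention, with the same $0$-indexed tuple representation and helpers as before:

```python
from itertools import permutations

def Q_of(pi):
    """The remark's alternative convention: entry 1 exactly when j = pi(i)."""
    n = len(pi)
    return [[1 if j == pi[i] else 0 for j in range(n)] for i in range(n)]

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def compose(pi, pip):
    """(pi o pip)(j) = pi(pip(j))."""
    return tuple(pi[pip[j]] for j in range(len(pi)))

# with this convention the factors appear in the opposite order:
# Q_{pi o pi'} = Q_{pi'} Q_{pi}
reversed_ok = all(
    Q_of(compose(p, q)) == matmul(Q_of(q), Q_of(p))
    for p in permutations(range(3)) for q in permutations(range(3)))
```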

The usefulness of these matrices in the study of permutations will show itself soon enough.

Remark: There is no need to state and prove Cayley's Theorem in these posts since, although entirely related to $S_n$, it focuses more on group theory in general than on $S_n$ itself. That said, one can see as an immediate corollary of the above that every group $G$ of order $n$ admits an embedding $\phi:G\to\text{PermMat}_n$, which, as some of you may see, has implications in the theory of representations of finite groups.

References: For information on permutation matrices, I’m not really sure where to turn. There’s mention of them in

1. Horn, Roger A., and Charles R. Johnson. *Matrix Analysis*. Cambridge: Cambridge University Press, 2009. Print.

but not anywhere else really, except left as an exercise in a few books.

November 6, 2010