## Permutations (Pt. II Permutation Matrices)

**Point of post:** This post is a continuation of the previous post in a continuing effort to define permutations fully and clearly, something someone reading Halmos's book is bound not to get (though admittedly this isn't his goal). In this particular post we discuss the interesting concept of "permutation matrices" in an effort to intuitively simplify proofs covered in the next few posts.

*Motivation*

Often in mathematics the simple act of "thinking of something differently" can have a profound impact on a study. The study of permutations is no different. Thus, it is clearly fruitful to search for as many different ways of thinking about them as possible. In this post we take permutations and think about them in terms of something most of us know all about: matrices.

*Definition*

Let $M_n\left(\mathbb{R}\right)$ be the set of all $n\times n$ matrices with real entries. We call $P\in M_n\left(\mathbb{R}\right)$ a *permutation matrix* if $P$ can be transformed into the identity matrix $I_n$ by interchanging the rows of $P$. Said differently, $P$ is a permutation matrix if there exist row-exchanging matrices $E_1,\cdots,E_m$ such that

$$E_1\cdots E_m P=I_n$$

It is clear though that this is equivalent to saying that $P=\left[p_{i,j}\right]$ is such that if one fixes $i$ then there is a unique $j_0$ such that $p_{i,j_0}=1$ and $p_{i,j}=0$ for $j\ne j_0$, and conversely, for each fixed $j$ there exists a unique $i_0$ such that $p_{i_0,j}=1$ and $p_{i,j}=0$ for $i\ne i_0$. In other words, every row and every column of $P$ contains exactly one $1$ and zeros elsewhere.
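The entrywise characterization lends itself to a direct test. Below is a minimal illustrative sketch in Python (the helper name `is_permutation_matrix` is mine, not from the post): it checks that every entry is $0$ or $1$ and that every row and every column sums to exactly $1$.

```python
# Illustrative check of the entrywise characterization (helper name is mine):
# a 0/1 square matrix is a permutation matrix iff every row and every column
# contains exactly one 1.

def is_permutation_matrix(P):
    n = len(P)
    if any(len(row) != n for row in P):               # must be square
        return False
    if any(e not in (0, 1) for row in P for e in row):
        return False
    rows_ok = all(sum(row) == 1 for row in P)
    cols_ok = all(sum(P[i][j] for i in range(n)) == 1 for j in range(n))
    return rows_ok and cols_ok

print(is_permutation_matrix([[0, 1], [1, 0]]))   # True
print(is_permutation_matrix([[1, 1], [0, 0]]))   # False
```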

**Theorem:** *Let $\mathcal{P}_n$ be the set of all $n\times n$ permutation matrices. Then, $\mathcal{P}_n$ is a group under matrix multiplication.*

**Proof:** Evidently $I_n\in\mathcal{P}_n$ and $I_nP=PI_n=P$ for all $P\in\mathcal{P}_n$. The associativity of the binary operation follows from the fact that matrix multiplication is, in general, associative. Thus, the only real question is whether the products and inverses of permutation matrices are permutation matrices. This follows almost immediately though after noticing that for a row-exchanging matrix $E$ we have that $E^2=I_n$ so that $E^{-1}=E$. Thus, if $P\in\mathcal{P}_n$ then

$$E_1\cdots E_m P=I_n$$

for some row-exchanging matrices $E_1,\cdots,E_m$, and thus $P=E_m\cdots E_1$, and thus $P^{-1}=E_1\cdots E_m$, which satisfies $E_m\cdots E_1 P^{-1}=I_n$ and so $P^{-1}\in\mathcal{P}_n$. Similarly, if $Q\in\mathcal{P}_n$ then $F_1\cdots F_k Q=I_n$ for some row-exchanging matrices $F_1,\cdots,F_k$ and $Q=F_k\cdots F_1$, so that

$$E_1\cdots E_m F_1\cdots F_k\left(QP\right)=E_1\cdots E_m F_1\cdots F_k\left(F_k\cdots F_1\right)\left(E_m\cdots E_1\right)=I_n$$

so that by definition $QP\in\mathcal{P}_n$. $\blacksquare$
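Both closure facts can be sanity-checked numerically. The following is a small sketch in Python (not from the post); `P` and `Q` are example $3\times 3$ permutation matrices, and the last line uses the standard extra fact, not needed above, that the inverse of a permutation matrix is its transpose.

```python
# Numerical sanity check (not from the post): the product of two permutation
# matrices is a permutation matrix, and -- a standard extra fact -- the
# inverse of a permutation matrix is its transpose.

def mat_mul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def transpose(A):
    n = len(A)
    return [[A[j][i] for j in range(n)] for i in range(n)]

I3 = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
P = [[0, 1, 0], [0, 0, 1], [1, 0, 0]]   # example permutation matrices
Q = [[0, 0, 1], [1, 0, 0], [0, 1, 0]]

PQ = mat_mul(P, Q)
# PQ again has exactly one 1 in each row and each column
print(all(sum(row) == 1 for row in PQ) and
      all(sum(PQ[i][j] for i in range(3)) == 1 for j in range(3)))  # True
print(mat_mul(P, transpose(P)) == I3)                               # True
```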

We can also prove the following theorem:

**Theorem:** *$\#\left(\mathcal{P}_n\right)=n!$.*

**Proof:** Note that $P\in\mathcal{P}_n$ is completely determined by the entry in each row in which the $1$ appears. So, having chosen one of the $n$ positions for the $1$ in row $1$ we are given $n-1$ choices for the position of the $1$ in row $2$ (because we can't put it in the same position as was chosen in row $1$, otherwise that column would have two $1$s in it). Continuing this logic we see that the $k^{\text{th}}$ row has $n-k+1$ choices and so the total number of choices is $n(n-1)\cdots 1=n!$. $\blacksquare$
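The count can be confirmed by brute force for small $n$. A short illustrative script (names are mine) that mirrors the proof by choosing, row by row, the column in which the $1$ appears:

```python
# Brute-force confirmation of the count for small n (script is mine, not
# from the post): build every permutation matrix by choosing, row by row,
# the column in which the single 1 appears.
from itertools import permutations
from math import factorial

def all_permutation_matrices(n):
    # cols[i] is the column of the 1 in row i; valid choices with no
    # repeated column are exactly the permutations of {0, ..., n-1}
    return [[[1 if j == cols[i] else 0 for j in range(n)] for i in range(n)]
            for cols in permutations(range(n))]

for n in range(1, 6):
    assert len(all_permutation_matrices(n)) == factorial(n)
print("counts agree with n! for n = 1, ..., 5")
```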

*Why Permutation Matrices Are Interesting*

So, as the heading suggests, there is seemingly no reason (except the suggestive name) why we, in our study of permutations, should care one bit about these matrices. Well, think again. Their true power comes from the following theorem:

**Theorem:** *Let $S_n$ denote, as always, the symmetric group on $n$ letters. Then, $S_n\cong\mathcal{P}_n$, where $\cong$ stands for group isomorphic.*

**Proof:** Define

$$\theta:S_n\to\mathcal{P}_n:\pi\mapsto P_\pi$$

where $P_\pi=\left[p_{i,j}\right]$ is given by

$$p_{i,j}=\begin{cases}1 & \text{if }i=\pi(j)\\ 0 & \text{if }i\ne\pi(j)\end{cases}$$

(note that this really is a well-defined map since each column and row of $P_\pi$ will have precisely one $1$ in it and zeros elsewhere). Since $\#\left(S_n\right)=n!=\#\left(\mathcal{P}_n\right)$, to prove that $\theta$ is a bijection it suffices, by a previous post, to prove it's an injection. To do this suppose that $\theta(\pi)=\theta(\sigma)$, say $\theta(\pi)=\left[p_{i,j}\right]$ and $\theta(\sigma)=\left[q_{i,j}\right]$. Then, every entry of these two matrices must be equal. In particular, since $p_{\pi(j),j}=1$ we must have that $q_{\pi(j),j}=1$. But, notice that this is true if and only if $\pi(j)=\sigma(j)$. It follows that $\pi(j)=\sigma(j)$ for all $j$ and thus $\pi=\sigma$.

Now, to see it’s a homomorphism we suppose that and show that

To do this we note that by definition

and

But, by definition if and only if for each . So, to prove this we note that

Notice then that . Thus, if then . Thus, if . Now, if we see that and when we have that thus . Putting this together we see that

and thus as required. It follows that

from where it follows that is an isomorphism and so the theorem follows.
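The map $\theta$ is easy to experiment with. Here is a small sketch (helper names are mine) using $0$-indexed permutations stored as tuples, with the convention that $\theta(\pi)$ has a $1$ in position $(i,j)$ exactly when $i=\pi(j)$; it spot-checks $\theta(\pi\sigma)=\theta(\pi)\theta(\sigma)$ on a pair of elements of $S_3$:

```python
# Sketch of theta (names are mine): permutations are 0-indexed tuples with
# pi(j) = pi[j], and theta(pi)[i][j] = 1 exactly when i == pi[j].

def theta(pi):
    n = len(pi)
    return [[1 if i == pi[j] else 0 for j in range(n)] for i in range(n)]

def compose(pi, sigma):
    # (pi sigma)(j) = pi(sigma(j))
    return tuple(pi[sigma[j]] for j in range(len(pi)))

def mat_mul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

pi = (1, 2, 0)     # the 3-cycle 0 -> 1 -> 2 -> 0
sigma = (1, 0, 2)  # the transposition swapping 0 and 1

print(theta(compose(pi, sigma)) == mat_mul(theta(pi), theta(sigma)))  # True
```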

*Remark:* The choice $p_{i,j}=1$ when $i=\pi(j)$, as opposed to the perhaps more natural $j=\pi(i)$, is an unfortunate, but unavoidable "mistake" of how we compose functions. One can check that if we had defined the above with $p_{i,j}=1$ when $j=\pi(i)$ we'd get that $\theta(\pi\sigma)=\theta(\sigma)\theta(\pi)$.
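The order reversal under the transposed convention can be verified exhaustively over $S_3$. A sketch (helper names mine), where $\theta_t(\pi)$ has a $1$ in position $(i,j)$ exactly when $j=\pi(i)$:

```python
# Exhaustive check over S_3 (names are mine): with the transposed convention
# theta_t(pi)[i][j] = 1 exactly when j == pi[i], products come out reversed:
# theta_t(pi sigma) = theta_t(sigma) theta_t(pi).
from itertools import permutations

def theta_t(pi):
    n = len(pi)
    return [[1 if j == pi[i] else 0 for j in range(n)] for i in range(n)]

def compose(pi, sigma):
    # (pi sigma)(j) = pi(sigma(j))
    return tuple(pi[sigma[j]] for j in range(len(pi)))

def mat_mul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

S3 = list(permutations(range(3)))
print(all(theta_t(compose(p, s)) == mat_mul(theta_t(s), theta_t(p))
          for p in S3 for s in S3))  # True
```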

The usefulness of these matrices in the study of permutations will show itself soon enough.

*Remark:* There is no need to state and prove Cayley's Theorem in these posts since, although entirely related to $S_n$, it focuses more on group theory in general than on $S_n$ itself. That said, one can see as an immediate corollary of the above that every group $G$ of order $n$ has some embedding $G\hookrightarrow\mathcal{P}_n$, which, as some of you may see, has some implications in the theory of representations of finite groups.

**References:** For information on permutation matrices, I'm not really sure where to turn. There's mention of them in

1. Horn, Roger A., and Charles R. Johnson. *Matrix Analysis*. Cambridge: Cambridge University Press, 2009. Print.

but not anywhere else really, except left as an exercise in a few books.
