## Alternating Forms

**Point of post:** In this post we discuss the concept of alternating forms, as discussed in section 30 of Halmos.

*Motivation*

In our last post we restricted our study of multilinear algebra to $k$-linear forms on spaces of the form $V^k$ (every argument drawn from the same space $V$), in an attempt to better approximate the multilinear forms which are near and dear to us (e.g. the determinant). We continue on this path by studying alternating forms, the abstraction of the observation that if two columns (or rows) of a matrix are equal, then the determinant is zero. This is really where things start to get interesting.

*Alternating Forms*

Let $V$ be a finite dimensional $F$-space. We call a $k$-linear form $f:V^k\to F$ *alternating* if

$$f\left(x_1,\cdots,x_k\right)=0$$

for any $x_1,\cdots,x_k\in V$ with $x_i=x_j$ for some $i\neq j$. In other words, $f$ is zero whenever any two of its arguments are equal. The first result that we can prove about such $k$-forms is that they're skew-symmetric. Indeed:
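As a quick sanity check of the definition, here is a small Python sketch (the names `perm_sign` and `det_form` are my own) of the canonical alternating $n$-form, the determinant, written via its Leibniz expansion; plugging in a repeated argument really does give $0$.

```python
from itertools import permutations
from math import prod

def perm_sign(p):
    """Sign of a permutation, given as a tuple of the indices 0..n-1."""
    p, s = list(p), 1
    for i in range(len(p)):
        while p[i] != i:           # sort by transpositions, flipping the sign each time
            j = p[i]
            p[i], p[j] = p[j], p[i]
            s = -s
    return s

def det_form(*vectors):
    """The determinant of the rows: the canonical alternating n-linear form."""
    n = len(vectors)
    return sum(perm_sign(p) * prod(vectors[i][p[i]] for i in range(n))
               for p in permutations(range(n)))

x, y = (1, 2, 0), (0, 1, 3)
print(det_form(x, y, x))  # repeated argument: the form vanishes, prints 0
```

Any two equal rows kill every product in the expansion in pairs of opposite sign, which is exactly the classical fact the definition abstracts.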

**Theorem:** *Let $V$ be a finite dimensional $F$-space and let $f:V^k\to F$ be alternating, then $f$ is skew-symmetric (i.e. $f\left(x_{\pi(1)},\cdots,x_{\pi(k)}\right)=\operatorname{sgn}(\pi)\,f\left(x_1,\cdots,x_k\right)$ for every $\pi\in S_k$).*

**Proof:** We first note that for any $x,y\in V$ placed in any two fixed slots we have by multilinearity that

$$f\left(\cdots,x+y,\cdots,x+y,\cdots\right)=f\left(\cdots,x,\cdots,x,\cdots\right)+f\left(\cdots,x,\cdots,y,\cdots\right)+f\left(\cdots,y,\cdots,x,\cdots\right)+f\left(\cdots,y,\cdots,y,\cdots\right)$$

But, since $f$ is alternating we may conclude that the left hand side and the first and last terms of the right hand side vanish, giving us that

$$f\left(\cdots,x,\cdots,y,\cdots\right)=-f\left(\cdots,y,\cdots,x,\cdots\right)$$

from where it follows by the arbitrariness of $x,y$ and of the two slots that for any transposition $\tau\in S_k$ we have that

$$f\left(x_{\tau(1)},\cdots,x_{\tau(k)}\right)=-f\left(x_1,\cdots,x_k\right)$$

and thus if $\pi=\tau_1\cdots\tau_m$ is a product of $m$ transpositions we may recall by the first theorem in the last post that

$$f\left(x_{\pi(1)},\cdots,x_{\pi(k)}\right)=(-1)^m f\left(x_1,\cdots,x_k\right)=\operatorname{sgn}(\pi)\,f\left(x_1,\cdots,x_k\right)$$

(more properly by induction) from where it clearly follows that $f$ is skew-symmetric. $\blacksquare$
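To see the theorem in action, here is a minimal sketch using the $2\times 2$ determinant as a concrete alternating bilinear form (the function name `f` is just illustrative):

```python
def f(x, y):
    """f(x, y) = x_1*y_2 - x_2*y_1: the 2x2 determinant, an alternating bilinear form."""
    return x[0]*y[1] - x[1]*y[0]

x, y = (2, 5), (7, 3)
assert f(x, x) == 0 and f(y, y) == 0   # alternating: repeated arguments give 0
assert f(x, y) == -f(y, x)             # hence skew-symmetric, as the theorem says
print(f(x, y), f(y, x))                # prints -29 29
```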

Now, intuitively it seems as though the converse of the above is true since, for example, taking bilinear forms we see that if $f$ is skew-symmetric then

$$f(x,x)=-f(x,x)$$

from where we would like to conclude that $f(x,x)=0$. Unfortunately though, there are fields in which $a=-a$ for some $a\neq 0$; take $\mathbb{F}_2$ for example. In particular, this is true precisely when the characteristic of the field is equal to $2$. Thus, with this in mind we are able to make sense of the restriction in the following theorem:
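One standard instance of this failure (not necessarily the homework example alluded to below) is the bilinear form $f(x,y)=xy$ on the one-dimensional $\mathbb{F}_2$-space $\mathbb{F}_2$ itself: since $a=-a$ for every $a\in\mathbb{F}_2$, $f$ is skew-symmetric, yet $f(1,1)=1\neq 0$. A quick check with arithmetic mod $2$:

```python
P = 2  # the field F_2: integers mod 2, characteristic 2

def f(x, y):
    """The bilinear form f(x, y) = x*y on the 1-dimensional F_2-space F_2 itself."""
    return (x * y) % P

# f is skew-symmetric: over F_2 every element is its own negative, so
# f(x, y) = f(y, x) = -f(y, x) for all x, y.
assert all(f(x, y) == (-f(y, x)) % P for x in range(P) for y in range(P))

# ...but f is NOT alternating: a repeated argument need not give 0.
print(f(1, 1))  # prints 1
```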

**Theorem:** *Let $V$ be a finite dimensional $F$-space with $\operatorname{char}(F)\neq 2$, then if $f:V^k\to F$ is skew-symmetric then $f$ is alternating.*

**Proof:** We merely note that for any $x_1,\cdots,x_k\in V$ with $x_i=x_j$ for some $i\neq j$ we have, by applying skew-symmetry with the transposition swapping the $i$-th and $j$-th slots, that

$$f\left(x_1,\cdots,x_k\right)=-f\left(x_1,\cdots,x_k\right)$$

and thus adding $f\left(x_1,\cdots,x_k\right)$ to both sides we see that

$$2f\left(x_1,\cdots,x_k\right)=0$$

and since $2$ is a unit in $F$ we may conclude that

$$f\left(x_1,\cdots,x_k\right)=0$$

and since $x_1,\cdots,x_k$ and $i,j$ were arbitrary it follows that $f$ is alternating. $\blacksquare$

We will see later (as one of the homework problems) an example where the above can fail for fields of characteristic two. Next is a theorem which is patently obvious, yet very important. It basically says that if $\{x_1,\cdots,x_k\}$ is a linearly dependent set of vectors in $V$ and $f:V^k\to F$ is alternating then $f\left(x_1,\cdots,x_k\right)=0$. This is clear, as I said, because one of the vectors in $\{x_1,\cdots,x_k\}$ is a linear combination of the others. Thus, when one uses multilinearity to expand, one gets a sum of terms each of which has two identical entries among its arguments (see below).

**Theorem:** *Let $V$ be an $n$-dimensional $F$-space and $\{x_1,\cdots,x_k\}$ a linearly dependent set of vectors. Then, for any alternating $f:V^k\to F$ it's true that $f\left(x_1,\cdots,x_k\right)=0$.*

**Proof:** Since $\{x_1,\cdots,x_k\}$ is linearly dependent we may assume without loss of generality that there exist scalars $\alpha_2,\cdots,\alpha_k$ such that

$$x_1=\alpha_2 x_2+\cdots+\alpha_k x_k$$

(we may assume this since one of the vectors must be a linear combination of the others, and thus with the possibility of reordering we may assume that vector is the "first" one). Thus

$$f\left(x_1,\cdots,x_k\right)=f\left(\sum_{j=2}^{k}\alpha_j x_j,x_2,\cdots,x_k\right)=\sum_{j=2}^{k}\alpha_j f\left(x_j,x_2,\cdots,x_k\right)=0$$

since each term on the right has the repeated argument $x_j$, from where the conclusion follows. $\blacksquare$
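A numerical illustration of the theorem, using the $3\times 3$ determinant (an alternating $3$-form) and an explicitly dependent triple (`det3` is my own name for the cofactor expansion):

```python
def det3(x, y, z):
    """3x3 determinant of the rows: an alternating 3-linear form."""
    return (x[0]*(y[1]*z[2] - y[2]*z[1])
          - x[1]*(y[0]*z[2] - y[2]*z[0])
          + x[2]*(y[0]*z[1] - y[1]*z[0]))

x, y = (1, 2, 3), (4, 5, 6)
z = tuple(2*a + 7*b for a, b in zip(x, y))  # z = 2x + 7y, so {x, y, z} is dependent
print(det3(x, y, z))  # prints 0, as the theorem predicts
```

Expanding the third slot by multilinearity gives $2\,\text{det3}(x,y,x)+7\,\text{det3}(x,y,y)$, and both terms vanish by the alternating property.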

The obvious question is: does the converse hold? Namely, if one has a non-zero $k$-linear alternating form $f$ and $f\left(x_1,\cdots,x_k\right)=0$, does this imply that $\{x_1,\cdots,x_k\}$ is linearly dependent? The answer is yes in the case when the number of vectors equals $\dim V$.

**Theorem:** *Let $V$ be an $n$-dimensional $F$-space and $f:V^n\to F$ be non-zero and alternating. Then, if $x_1,\cdots,x_n$ are linearly independent then $f\left(x_1,\cdots,x_n\right)\neq 0$.*

**Proof:** Clearly since the dimension coincides with the number of vectors we know that $\{x_1,\cdots,x_n\}$ forms a basis for $V$. So, let $y_1,\cdots,y_n\in V$ be arbitrary. Then, we know that

$$y_i=\sum_{j=1}^{n}\alpha_{i,j}x_j,\qquad i=1,\cdots,n$$

so that

$$f\left(y_1,\cdots,y_n\right)=\sum_{j_1=1}^{n}\cdots\sum_{j_n=1}^{n}\alpha_{1,j_1}\cdots\alpha_{n,j_n}\,f\left(x_{j_1},\cdots,x_{j_n}\right)$$

now, if at any point $j_r=j_s$ for some $r\neq s$ then $f\left(x_{j_1},\cdots,x_{j_n}\right)=0$, and if not then $\left(j_1,\cdots,j_n\right)=\left(\pi(1),\cdots,\pi(n)\right)$ for some $\pi\in S_n$, and by our first theorem this implies that $f\left(x_{j_1},\cdots,x_{j_n}\right)=\operatorname{sgn}(\pi)\,f\left(x_1,\cdots,x_n\right)$. Thus, if we assume that $f\left(x_1,\cdots,x_n\right)=0$ then the two cases above imply that every term in the sum is zero, and thus $f\left(y_1,\cdots,y_n\right)=0$. But, since $y_1,\cdots,y_n$ were arbitrary it follows that $f=0$, contradictory to our assumption that $f\neq 0$. It follows that $f\left(x_1,\cdots,x_n\right)\neq 0$. $\blacksquare$
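And the same phenomenon in the concrete case, with the $2\times 2$ determinant as the non-zero alternating form (`det2` is an illustrative name):

```python
def det2(x, y):
    """2x2 determinant: a non-zero alternating bilinear form on F^2."""
    return x[0]*y[1] - x[1]*y[0]

# An independent pair (here the standard basis) gives a non-zero value...
print(det2((1, 0), (0, 1)))  # prints 1
# ...while a dependent pair gives 0.
print(det2((1, 2), (2, 4)))  # prints 0
```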

**References:**

1. Halmos, Paul R. *Finite-Dimensional Vector Spaces*. New York: Springer-Verlag, 1974. Print.
