## Halmos Sections 18, 19, and 20: Direct Sums, Dimension of Direct Sums, and Duals of Direct Sums

**Point of Post:** In this post we do the 18th, 19th, and 20th sections in Halmos. In it we discuss necessary and sufficient conditions for a space to be the direct sum of subspaces, etc. The last problem gives a particularly interesting way of viewing what it means for a space to be a direct sum of subspaces.

1. **Problem:** Suppose that $x,y$ and $u,v$ are vectors in $\mathbb{C}^4$; let $\mathcal{M}$ and $\mathcal{N}$ be the subspaces of $\mathbb{C}^4$ spanned by $\{x,y\}$ and $\{u,v\}$ respectively. In which of the following cases is it true that $\mathbb{C}^4=\mathcal{M}\oplus\mathcal{N}$?

**a)** $x=(1,1,0,0)$, $y=(1,0,1,0)$, $u=(0,1,0,1)$, and $v=(0,0,1,1)$

**b)** $x=(-1,1,1,0)$, $y=(0,1,-1,1)$, $u=(1,0,0,0)$, and $v=(0,0,0,1)$

**c)** $x=(1,0,0,1)$, $y=(0,1,1,0)$, $u=(1,0,1,0)$, and $v=(0,1,0,1)$

**Proof:** We do this by considering the following lemma:

**Sufficiency Lemma:** Let $\mathcal{M}$ and $\mathcal{N}$ be subspaces of $\mathcal{V}$ (all finite-dimensional, as always). Then, if $\{x_1,\dots,x_m\}$ and $\{y_1,\dots,y_n\}$ are bases of $\mathcal{M}$ and $\mathcal{N}$ respectively, then $\mathcal{V}=\mathcal{M}\oplus\mathcal{N}$ if and only if $\{x_1,\dots,x_m,y_1,\dots,y_n\}$ is a basis for $\mathcal{V}$.

**Proof:** We have already proven the necessity of this statement, so it remains to show the sufficiency portion. Suppose that $\{x_1,\dots,x_m,y_1,\dots,y_n\}$ is a basis for $\mathcal{V}$. First, suppose that $v\in\mathcal{M}\cap\mathcal{N}$; then

$$v=\alpha_1x_1+\cdots+\alpha_mx_m=\beta_1y_1+\cdots+\beta_ny_n$$

and so upon subtraction

$$\alpha_1x_1+\cdots+\alpha_mx_m-\beta_1y_1-\cdots-\beta_ny_n=0$$

and so by the linear independence of $\{x_1,\dots,x_m,y_1,\dots,y_n\}$ we may conclude that

$$\alpha_1=\cdots=\alpha_m=\beta_1=\cdots=\beta_n=0$$

and thus it follows that $\mathcal{M}\cap\mathcal{N}=\{0\}$. Also, to prove that $\mathcal{V}=\mathcal{M}+\mathcal{N}$ we notice that given $v\in\mathcal{V}$ we know that there exist scalars $\alpha_1,\dots,\alpha_m,\beta_1,\dots,\beta_n$ such that

$$v=\underbrace{\alpha_1x_1+\cdots+\alpha_mx_m}_{\in\,\mathcal{M}}+\underbrace{\beta_1y_1+\cdots+\beta_ny_n}_{\in\,\mathcal{N}}$$

from where the conclusion is obvious.

And, another lemma to make at least some of these easier.

**Lemma:** Let $x_1,\dots,x_n\in\mathbb{C}^n$. Define

$$A=\begin{pmatrix}x_1&\cdots&x_n\end{pmatrix}$$

(i.e. take the columns of the matrix to be the vectors). Then, $\det A\neq0$ implies that $\{x_1,\dots,x_n\}$ is a basis.

**Proof:** Suppose that

$$\alpha_1x_1+\cdots+\alpha_nx_n=0;$$

then we may rephrase this as

$$A\begin{pmatrix}\alpha_1\\\vdots\\\alpha_n\end{pmatrix}=0.$$

And so if $\det A\neq0$ we know that $A$ is invertible and so

$$\begin{pmatrix}\alpha_1\\\vdots\\\alpha_n\end{pmatrix}=A^{-1}0=0,$$

from where it follows that the set of vectors is linearly independent. But, this is a set of $n$ linearly independent vectors in an $n$-dimensional space, and so it must be a basis.

*Remark:* This is actually an "if and only if," which I will eventually prove in both directions, but this direction suffices for now.

**a)** We note that $x+v=y+u$, so $\{x,y,u,v\}$ is linearly dependent and thus isn't a basis, and the sufficiency lemma takes care of the rest.

**b)** We note that if

$$A=\begin{pmatrix}-1&0&1&0\\1&1&0&0\\1&-1&0&0\\0&1&0&1\end{pmatrix}$$

(the columns being $x,y,u,v$) then a simple calculation tells us that $\det A=-2\neq0$ and so by the second lemma it's linearly independent, hence a basis. The rest follows from the sufficiency lemma.

**c)** Define $A$ as in **b)** and note that this time $\det A=0$; indeed $x+y=u+v$, so $\{x,y,u,v\}$ isn't a basis and, as in **a)**, we conclude that $\mathbb{C}^4\neq\mathcal{M}\oplus\mathcal{N}$.
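Each case reduces to the determinant computation from the second lemma, so the three answers are easy to machine-check. A minimal sketch in plain Python (exact integer arithmetic; the vectors are those of Halmos's exercise as quoted in cases **a)**-**c)**):

```python
def det(m):
    """Determinant by cofactor expansion along the first row (exact, integer)."""
    n = len(m)
    if n == 1:
        return m[0][0]
    return sum(
        (-1) ** j * m[0][j] * det([row[:j] + row[j + 1:] for row in m[1:]])
        for j in range(n)
    )

def columns_matrix(vectors):
    """The matrix A whose columns are the given vectors, as in the second lemma."""
    return [list(col) for col in zip(*vectors)]

cases = {  # the vectors x, y, u, v of cases a), b), c)
    "a": [(1, 1, 0, 0), (1, 0, 1, 0), (0, 1, 0, 1), (0, 0, 1, 1)],
    "b": [(-1, 1, 1, 0), (0, 1, -1, 1), (1, 0, 0, 0), (0, 0, 0, 1)],
    "c": [(1, 0, 0, 1), (0, 1, 1, 0), (1, 0, 1, 0), (0, 1, 0, 1)],
}

for name, vecs in cases.items():
    d = det(columns_matrix(vecs))
    print(name, d, "direct sum" if d != 0 else "not a direct sum")
```

Only case **b)** yields a non-zero determinant.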

2.

**Problem:** If $\mathcal{M}$ is the subspace of those vectors $x=(\xi_1,\dots,\xi_{2n})\in\mathbb{C}^{2n}$ for which $\xi_{n+1}=\cdots=\xi_{2n}=0$, and if $\mathcal{N}$ is the subspace of all those vectors for which $\xi_j=\xi_{n+j}$ ($j=1,\dots,n$), then $\mathbb{C}^{2n}=\mathcal{M}\oplus\mathcal{N}$.

**Proof:** Clearly if $e_1,\dots,e_{2n}$ are the standard basis elements for $\mathbb{C}^{2n}$ then $\{e_1,\dots,e_n\}$ is a basis for $\mathcal{M}$. Similarly $\{e_1+e_{n+1},\dots,e_n+e_{2n}\}$ is a basis for $\mathcal{N}$. Thus, it suffices to prove that $\{e_1,\dots,e_n,e_1+e_{n+1},\dots,e_n+e_{2n}\}$ is a basis for $\mathbb{C}^{2n}$. But consider the matrix $A$ whose columns are these $2n$ vectors;

stare at it long enough and you'll notice that, in block form,

$$A=\begin{pmatrix}I_n & I_n\\ 0_n & I_n\end{pmatrix}$$

But, noticing that block matrix multiplication works the same as regular multiplication, we can see that the above matrix "acts" like

$$\begin{pmatrix}1&1\\0&1\end{pmatrix}$$

in terms of multiplication. But, a quick check gives that the inverse of the above matrix is

$$\begin{pmatrix}1&-1\\0&1\end{pmatrix}$$

so one may hypothesize that

$$A^{-1}=\begin{pmatrix}I_n&-I_n\\0_n&I_n\end{pmatrix}$$

And, in fact, it is easily verifiable that

$$\begin{pmatrix}I_n&I_n\\0_n&I_n\end{pmatrix}\begin{pmatrix}I_n&-I_n\\0_n&I_n\end{pmatrix}=\begin{pmatrix}I_n&0_n\\0_n&I_n\end{pmatrix}=I_{2n}$$

from where it follows that $A$ is invertible, hence $\det A\neq0$, and so by our second lemma and the sufficiency lemma we're done.

*Remark:* The intuitive "acts the same" quality that led us to our solution really is more than just intuition. Really what we've noticed is that in general if $R$ is a ring with $0$ and $1$ its additive and multiplicative identities respectively, then the matrix

$$\begin{pmatrix}1&1\\0&1\end{pmatrix}$$

is invertible with inverse

$$\begin{pmatrix}1&-1\\0&1\end{pmatrix}.$$

But, $I_n$ and $0_n$ (where $0_n$ is the zero matrix) are the multiplicative and additive identities of the ring of all $n\times n$ matrices over $\mathbb{C}$.
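The block-matrix claim in the remark can be verified mechanically. A minimal sketch in plain Python (with `n = 3` an arbitrary choice, and matrices stored as lists of rows):

```python
def matmul(a, b):
    """Product of two square matrices given as lists of rows."""
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def block(tl, tr, bl, br):
    """Assemble a 2n x 2n matrix from four n x n blocks."""
    return [l + r for l, r in zip(tl, tr)] + [l + r for l, r in zip(bl, br)]

n = 3
I = [[int(i == j) for j in range(n)] for i in range(n)]      # identity block
Z = [[0] * n for _ in range(n)]                              # zero block
negI = [[-x for x in row] for row in I]

A = block(I, I, Z, I)       # the matrix ((I, I), (0, I))
B = block(I, negI, Z, I)    # the hypothesized inverse ((I, -I), (0, I))
I2n = [[int(i == j) for j in range(2 * n)] for i in range(2 * n)]
print(matmul(A, B) == I2n)  # True: the product is the 2n x 2n identity
```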

3.

**Problem:** Construct three subspaces $\mathcal{M}$, $\mathcal{N}_1$ and $\mathcal{N}_2$ of a vector space $\mathcal{V}$ so that $\mathcal{M}\oplus\mathcal{N}_1=\mathcal{M}\oplus\mathcal{N}_2=\mathcal{V}$ but $\mathcal{N}_1\neq\mathcal{N}_2$.

**Proof:** We need look no further than our last example. Let $\mathcal{V}=\mathbb{C}^{2n}$, let $\mathcal{M}$ and $\mathcal{N}_1=\mathcal{N}$ be as in the previous problem, and let $\mathcal{N}_2=\operatorname{span}\{e_{n+1},\dots,e_{2n}\}$. Noticing that $\{e_1,\dots,e_n\}\cup\{e_1+e_{n+1},\dots,e_n+e_{2n}\}$ and $\{e_1,\dots,e_n\}\cup\{e_{n+1},\dots,e_{2n}\}$ are both bases for $\mathbb{C}^{2n}$ we can see that

$$\mathcal{M}\oplus\mathcal{N}_1=\mathcal{M}\oplus\mathcal{N}_2=\mathbb{C}^{2n}$$

and $\mathcal{N}_1\neq\mathcal{N}_2$ (for instance $e_1+e_{n+1}\in\mathcal{N}_1$ but $e_1+e_{n+1}\notin\mathcal{N}_2$), as required.
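For a concrete check, take $n=2$, so the ambient space is $\mathbb{C}^4$ (rational coordinates suffice for the computation). The `rank` helper below is illustrative scaffolding of my own, not anything from Halmos:

```python
from fractions import Fraction

def rank(rows):
    """Rank via Gauss-Jordan elimination in exact rational arithmetic."""
    m = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for c in range(len(m[0])):
        p = next((i for i in range(r, len(m)) if m[i][c] != 0), None)
        if p is None:
            continue
        m[r], m[p] = m[p], m[r]
        for i in range(len(m)):
            if i != r and m[i][c] != 0:
                f = m[i][c] / m[r][c]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

e1, e2, e3, e4 = (1, 0, 0, 0), (0, 1, 0, 0), (0, 0, 1, 0), (0, 0, 0, 1)
M  = [e1, e2]                      # spanning set of M
N1 = [(1, 0, 1, 0), (0, 1, 0, 1)]  # spanning set {e1 + e3, e2 + e4}
N2 = [e3, e4]                      # spanning set {e3, e4}

print(rank(M + N1), rank(M + N2))  # 4 and 4: both sums are all of C^4
print(rank(N1 + N2))               # 4 > 2, so N1 and N2 are different subspaces
```

Since the dimensions of the summands add up to the rank of the combined spanning set in each case, both sums are in fact direct.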

4.

**Problem:**

**a)** If $\mathcal{M}$ and $\mathcal{N}$ are vector spaces, what is the relation between $\mathcal{M}\oplus\mathcal{N}$ and $\mathcal{N}\oplus\mathcal{M}$?

**b)** In what sense is the formation of direct sums commutative?

**Proof:** If we look at the direct sum as being the internal sum, the obvious answer to both is that they are the same thing. If we view it as the external direct sum then the answer to **a)** and **b)** is that they are isomorphic by the canonical isomorphisms

$$\mathcal{M}\oplus\mathcal{N}\to\mathcal{N}\oplus\mathcal{M}:(m,n)\mapsto(n,m)$$

and

$$\mathcal{N}\oplus\mathcal{M}\to\mathcal{M}\oplus\mathcal{N}:(n,m)\mapsto(m,n),$$

which are undoubtedly isomorphisms (each is linear, and they are mutually inverse).

5.

**Problem:**

**a)** Three subspaces $\mathcal{L}$, $\mathcal{M}$, and $\mathcal{N}$ of a vector space $\mathcal{V}$ are called *independent* if each one is disjoint from the sum of the other two (here "disjoint" means their intersection is trivial). Prove that a necessary and sufficient condition for

$$\mathcal{V}=\mathcal{L}\oplus\mathcal{M}\oplus\mathcal{N}$$

is that $\mathcal{L}$, $\mathcal{M}$ and $\mathcal{N}$ are independent and that $\mathcal{L}+\mathcal{M}+\mathcal{N}=\mathcal{V}$.

**b)** Suppose that $x$, $y$ and $z$ are elements of a vector space and that $\mathcal{L}$, $\mathcal{M}$ and $\mathcal{N}$ are the subspaces spanned by $x$, $y$ and $z$ respectively. Prove that the vectors $x$, $y$ and $z$ are linearly independent if and only if the subspaces $\mathcal{L}$, $\mathcal{M}$ and $\mathcal{N}$ are independent.

**c)** Prove that three finite-dimensional subspaces are independent if and only if the sum of their dimensions is equal to the dimension of their sum.

**d)** Generalize the results of **a)-c)** from three subspaces to any finite number.

**Proof:** Instead of doing all the results and then **d)** let's just generalize now. Let $\mathcal{V}_1,\dots,\mathcal{V}_n$ be subspaces of $\mathcal{V}$ and call them *independent* if

$$\mathcal{V}_j\cap\sum_{i\neq j}\mathcal{V}_i=\{0\}\qquad\text{for each }j=1,\dots,n.$$
With this we do **a)-c)** in full generality:

**a) **

**Restated Problem:** Let $\mathcal{V}_1,\dots,\mathcal{V}_n$ be subspaces of $\mathcal{V}$. Then,

$$\mathcal{V}=\mathcal{V}_1\oplus\cdots\oplus\mathcal{V}_n$$

if and only if $\mathcal{V}_1,\dots,\mathcal{V}_n$ are independent and

$$\mathcal{V}_1+\cdots+\mathcal{V}_n=\mathcal{V}.$$

**Proof:** We first prove the following lemma:

**Lemma:** Let $\mathcal{V}_1,\dots,\mathcal{V}_n$ be subspaces of $\mathcal{V}$. Then, $\mathcal{V}_1,\dots,\mathcal{V}_n$ are independent if and only if every element of $\mathcal{V}_1+\cdots+\mathcal{V}_n$ may be written uniquely as $v_1+\cdots+v_n$ with $v_i\in\mathcal{V}_i$.

**Proof:** First suppose that $\mathcal{V}_1,\dots,\mathcal{V}_n$ are independent and let $v\in\mathcal{V}_1+\cdots+\mathcal{V}_n$. Suppose then that $v=v_1+\cdots+v_n=w_1+\cdots+w_n$ where $v_i,w_i\in\mathcal{V}_i$. Then, for any $j$ we can see that

$$v_j-w_j=\sum_{i\neq j}(w_i-v_i)$$

so that $v_j-w_j\in\mathcal{V}_j\cap\sum_{i\neq j}\mathcal{V}_i$, from where it follows by assumption that $v_j-w_j=0$ and so $v_j=w_j$.

Conversely, suppose that every $v\in\mathcal{V}_1+\cdots+\mathcal{V}_n$ can be written uniquely as $v_1+\cdots+v_n$ with $v_i\in\mathcal{V}_i$. Then, if $v\in\mathcal{V}_j\cap\sum_{i\neq j}\mathcal{V}_i$ we have that $v=v_j$ for some $v_j\in\mathcal{V}_j$ (namely $v$ itself) and $v=\sum_{i\neq j}v_i$ with $v_i\in\mathcal{V}_i$, and so upon subtraction we get that

$$0=v_j-\sum_{i\neq j}v_i$$

but, since $0=0+\cdots+0$ and since

$$0=v_j+\sum_{i\neq j}(-v_i),$$

it follows by the assumption of unique representation that $v_j=0$, from where it follows that $v=0$.

Then, with the assumptions on the subspaces described in the original problem, we can see the above lemma says that $\mathcal{V}_1,\dots,\mathcal{V}_n$ are independent if and only if every element of $\mathcal{V}_1+\cdots+\mathcal{V}_n$ may be written as a unique sum $v_1+\cdots+v_n$ with $v_i\in\mathcal{V}_i$; but this, together with the assumption that $\mathcal{V}_1+\cdots+\mathcal{V}_n=\mathcal{V}$, is equivalent to saying that $\mathcal{V}$ is the direct sum of the subspaces.
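Specializing the lemma to $n=3$ recovers the language of the original problem:

```latex
\mathcal{L}\cap(\mathcal{M}+\mathcal{N})
  =\mathcal{M}\cap(\mathcal{L}+\mathcal{N})
  =\mathcal{N}\cap(\mathcal{L}+\mathcal{M})=\{0\}
\iff
\text{every } v\in\mathcal{L}+\mathcal{M}+\mathcal{N}
  \text{ has a unique representation } v=\ell+m+n
```

with $\ell\in\mathcal{L}$, $m\in\mathcal{M}$ and $n\in\mathcal{N}$.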

**b) **

**Restated problem:** A set $\{x_1,\dots,x_n\}$ of non-zero vectors is linearly independent if and only if the subspaces $\mathcal{V}_1=\operatorname{span}(x_1),\dots,\mathcal{V}_n=\operatorname{span}(x_n)$ are independent.

**Proof:** Suppose that $\{x_1,\dots,x_n\}$ is a linearly independent set and suppose that

$$v\in\mathcal{V}_j\cap\sum_{i\neq j}\mathcal{V}_i.$$

Then, that says that

$$v\in\mathcal{V}_j\quad\text{and}\quad v\in\sum_{i\neq j}\mathcal{V}_i.$$

But, $\mathcal{V}_j=\operatorname{span}(x_j)$ and $\sum_{i\neq j}\mathcal{V}_i=\operatorname{span}\{x_i:i\neq j\}$, and so this may be rewritten as

$$\alpha_jx_j=v=\sum_{i\neq j}\alpha_ix_i$$

from where it follows by linear independence that $\alpha_1=\cdots=\alpha_n=0$ and so

$$v=0.$$

Conversely, suppose that $\mathcal{V}_1,\dots,\mathcal{V}_n$ are independent and that

$$\alpha_1x_1+\cdots+\alpha_nx_n=0;$$

but this says that $\alpha_jx_j=-\sum_{i\neq j}\alpha_ix_i$ and so $\alpha_jx_j\in\mathcal{V}_j\cap\sum_{i\neq j}\mathcal{V}_i=\{0\}$. From this (using that $x_j\neq0$) we may conclude that $\alpha_j=0$. But, we may do this for each $j$ so that $\alpha_1=\cdots=\alpha_n=0$, from where linear independence follows.

**c)**

**Restated problem:** Let $\mathcal{V}_1,\dots,\mathcal{V}_n$ be finite-dimensional subspaces of $\mathcal{V}$. Then, they are independent if and only if

$$\dim\mathcal{V}_1+\cdots+\dim\mathcal{V}_n=\dim\left(\mathcal{V}_1+\cdots+\mathcal{V}_n\right).$$

**Proof:** For the forward direction we prove the contrapositive. First assume that $\mathcal{V}_j\cap\sum_{i\neq j}\mathcal{V}_i$ is non-trivial for some $j$. Then, we see that

$$\dim\left(\sum_{i}\mathcal{V}_i\right)=\dim\mathcal{V}_j+\dim\left(\sum_{i\neq j}\mathcal{V}_i\right)-\dim\left(\mathcal{V}_j\cap\sum_{i\neq j}\mathcal{V}_i\right)$$

but, we assumed that $\mathcal{V}_j\cap\sum_{i\neq j}\mathcal{V}_i$ was non-trivial, and so its dimension is at least one. So, combining this with the fact that in general

$$\dim\left(\sum_{i\neq j}\mathcal{V}_i\right)\leqslant\sum_{i\neq j}\dim\mathcal{V}_i$$

we may conclude that

$$\dim\left(\sum_i\mathcal{V}_i\right)<\sum_i\dim\mathcal{V}_i.$$

Conversely, suppose that $\mathcal{V}_1,\dots,\mathcal{V}_n$ are independent; then

$$\dim\left(\sum_{i=1}^n\mathcal{V}_i\right)=\dim\mathcal{V}_1+\dim\left(\sum_{i=2}^n\mathcal{V}_i\right)-\dim\left(\mathcal{V}_1\cap\sum_{i=2}^n\mathcal{V}_i\right)$$

but by the assumption of independence the intersection in the last term is trivial, so the last term is zero. Since $\mathcal{V}_2,\dots,\mathcal{V}_n$ are still independent, continuing in this way successively leads to the sum of the dimensions.
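As a sanity check of the dimension criterion, consider in $\mathbb{C}^3$ the subspaces $\mathcal{V}_1=\operatorname{span}(e_1)$, $\mathcal{V}_2=\operatorname{span}(e_2)$ and $\mathcal{V}_3=\operatorname{span}(e_1+e_2)$ (my example, not Halmos's). They fail to be independent, and the dimensions detect it:

```latex
\dim\mathcal{V}_1+\dim\mathcal{V}_2+\dim\mathcal{V}_3 = 3 > 2 =
\dim\left(\mathcal{V}_1+\mathcal{V}_2+\mathcal{V}_3\right),
\qquad
\mathcal{V}_3\cap\left(\mathcal{V}_1+\mathcal{V}_2\right)=\mathcal{V}_3\neq\{0\}.
```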
