## Invariant Subspaces

**Point of post:** In this post we’ll discuss the equivalent of section 39 in Halmos, but in more detail.

*Motivation*

Often in mathematics it’s fruitful to take a look at a class of commonly occurring, easily handled examples to enrich the overall tapestry of one’s knowledge. Linear transformations are no different. We’d now like to study the idea of *invariant subspaces*. The idea is that if a linear homomorphism $T$ has a non-trivial invariant subspace, then $T$ is ‘nicer’ in some sense. We make this more rigorous below.
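Concretely, the notion we have in mind can be previewed as follows (the notation $T$ and $W$ is mine, ahead of the formal treatment in the post):

```latex
% A subspace W of V is said to be invariant under a linear map T : V -> V
% if T carries W back into itself:
T(W) \subseteq W,
\qquad\text{i.e.}\qquad
w \in W \implies T(w) \in W.
% The trivial subspaces \{0\} and V are always invariant; the interesting
% question is whether a *proper, nonzero* invariant subspace exists.
```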

## Characterization of Linear Homomorphisms In Terms of Bases

**Point of post:** In this post we give a characterization of linear homomorphisms that depends entirely on how the homomorphism acts on a basis.

*Motivation*

Often it is laborious to check that a linear transformation is injective or surjective. It turns out that there is a nice characterization of injectivity and surjectivity for finite-dimensional spaces which depends entirely on how the transformation acts on a basis of the domain space.
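The characterization in question can be stated informally as follows (the symbols $T$, $V$, $W$, and the basis $\{e_1,\dots,e_n\}$ are my notation, not fixed by the post):

```latex
% Let T : V -> W be linear and let \{e_1, \dots, e_n\} be a basis for V.
\begin{aligned}
T \text{ is injective } &\iff T(e_1), \dots, T(e_n) \text{ are linearly independent in } W,\\
T \text{ is surjective } &\iff T(e_1), \dots, T(e_n) \text{ span } W.
\end{aligned}
% So injectivity/surjectivity can be read off from the images of the
% basis vectors alone -- no need to test arbitrary vectors of V.
```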

## Invertible Linear Transformations (Pt. I)

**Point of post:** In this post we talk about the equivalent of Halmos’s section 36, but in a more general setting.

*Motivation*

Recall that in our last post we discussed how to turn the space of linear transformations on a vector space into an associative unital algebra by defining a ‘multiplication map’, namely composition: $ST = S \circ T$.

We saw, though, that this multiplication has some downsides (it had zero divisors and wasn’t, in general, commutative). There is, however, a nice property of this algebra which isn’t enjoyed by all algebras. In particular, it is not ‘uncommon’ for elements of this algebra to have multiplicative inverses, in the usual sense. Now, at first glance this doesn’t seem that ‘great’ a property; after all, not *all* of the elements have multiplicative inverses, just *some* do. One might expect that, in general, this is a common occurrence among algebras, and even more common among associative unital algebras. In fact, this isn’t the case. To see how badly the existence of multiplicative inverses can get screwed up, consider for example the polynomial ring $F[x]$. It’s fairly plain to see that $F[x]$ is, in fact, an associative unital algebra over $F$ with the usual polynomial addition and multiplication. That said, a little thought shows that a polynomial $p$ has a multiplicative inverse *only if* $p$ is a nonzero constant.
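The claim about polynomials follows from a one-line degree count (sketched here with $F[x]$ over a field $F$ standing in for the notation lost from the original):

```latex
% Suppose p q = 1 in F[x]. Taking degrees (valid since F has no zero
% divisors, so \deg(pq) = \deg p + \deg q) gives
\deg p + \deg q \;=\; \deg(pq) \;=\; \deg 1 \;=\; 0,
% which forces \deg p = \deg q = 0. Hence the only invertible elements
% of F[x] are the nonzero constants, i.e. the units of F itself.
```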

Thus, this post will explore this ‘nice’ quality of the multiplication map on the algebra of linear transformations.

## Halmos Sections 32 and 33: Linear Transformations and Transformations as Vectors (Pt. II)

**Point of post:** This is a continuation of this post, in an effort to answer the questions at the end of sections 32 and 33 in Halmos’s book.

## Halmos Sections 32 and 33: Linear Transformations and Transformations as Vectors (Pt. I)

**Point of post:** In this post we complete the problems that appear at the end of Halmos, sections 32 and 33.

## Linear Transformations

**Point of post:** In this post I discuss the material of sections 31 and 32 of Halmos.

*Motivation*

We finally begin studying arguably the most important aspect of linear algebra: *linear transformations*. These are the structure-preserving maps for vector spaces. They correspond naturally to coordinate changes, linear shifts, and contractions/dilations. Moreover, they are theoretically interesting because, as it turns out, many maps which aren’t usually thought of as linear transformations are. For example, the differential operator $D$ on the space of polynomials.
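For instance, the linearity of the differential operator (written here as $D$, my notation) is nothing more than the sum and constant-multiple rules from calculus:

```latex
% For polynomials f, g and scalars \alpha, \beta:
D(\alpha f + \beta g) \;=\; (\alpha f + \beta g)'
\;=\; \alpha f' + \beta g'
\;=\; \alpha\, D f + \beta\, D g,
% which is exactly the defining property of a linear transformation.
```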