# Abstract Nonsense

## Linear Transformations

Point of post: In this post I discuss the material in sections 31 and 32 of Halmos.

### Motivation

We finally begin studying arguably the most important aspect of linear algebra: linear transformations. These are the structure-preserving maps for vector spaces. They correspond naturally to coordinate changes, linear shifts, and contractions/dilations. Moreover, they are theoretically interesting because, as it turns out, many maps which aren't usually thought of as linear transformations are in fact linear. One example is the differential operator on the space $C^{\infty}[\mathbb{R},\mathbb{R}]$.

### Linear Transformations

If $\mathscr{V}$ and $\mathscr{W}$ are $F$-spaces, we call a mapping $T:\mathscr{V}\to\mathscr{W}$ a linear transformation (or linear homomorphism) if

$T\left(\alpha x+\beta y\right)=\alpha T\left(x\right)+\beta T\left(y\right)$

for all $x,y\in \mathscr{V}$ and $\alpha,\beta\in F$. If $\mathscr{V}=\mathscr{W}$ we call $T$ a linear transformation on $\mathscr{V}$ in which case $T$ is usually called a linear operator or endomorphism.
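The defining identity is easy to check numerically. The sketch below (an illustrative example, not from the post: the matrix $A$ and the vectors are arbitrary choices) verifies that the map $x\mapsto Ax$ satisfies $T(\alpha x+\beta y)=\alpha T(x)+\beta T(y)$.

```python
import numpy as np

# An arbitrary fixed matrix A defines a linear transformation
# T(x) = A x from R^3 to R^2.
A = np.array([[1.0, 2.0, 0.0],
              [0.0, -1.0, 3.0]])

def T(x):
    return A @ x

x = np.array([1.0, 0.0, 2.0])
y = np.array([0.0, 1.0, -1.0])
alpha, beta = 2.0, -3.0

# Check T(alpha*x + beta*y) == alpha*T(x) + beta*T(y)
lhs = T(alpha * x + beta * y)
rhs = alpha * T(x) + beta * T(y)
assert np.allclose(lhs, rhs)
```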

As stated earlier, examples of linear transformations abound. We give three examples:

Ex.(1): Let $\mathscr{V}$ be an $F$-space. If we consider $F$ as a vector space over itself, then every linear functional $\varphi\in\text{Hom}\left(\mathscr{V},F\right)$ is a linear transformation from $\mathscr{V}$ to $F$.

Ex.(2): If $\mathscr{V}$ is a vector space then the maps $\bold{0}:\mathscr{V}\to\mathscr{V}:x\mapsto \bold{0}$ and $\text{id}:\mathscr{V}\to\mathscr{V}:x\mapsto x$ are both linear transformations.

Remark: In this context it is customary to denote $\text{id}:\mathscr{V}\to\mathscr{V}$ by $\bold{1}$, the reasons for this will be clear soon enough.

Ex.(3): If $C^{\infty}\left[\mathbb{R},\mathbb{R}\right]$ denotes the space of all smooth functions from $\mathbb{R}$ to $\mathbb{R}$ then the differential operator $D:C^{\infty}\left[\mathbb{R},\mathbb{R}\right]\to C^{\infty}\left[\mathbb{R},\mathbb{R}\right]:f\mapsto f'$ is a linear operator. More specifically, if $\mathbb{C}_n[x]$ is the space of all polynomials of degree at most $n-1$ with complex coefficients, then

$\displaystyle D:\mathbb{C}_n[x]\to\mathbb{C}_n[x]:\sum_{j=0}^{n-1}c_j x^j\mapsto \sum_{j=1}^{n-1}jc_j x^{j-1}$

is a linear operator.

Remark: Note that the last map was defined algebraically, in a manner independent of the derivative. Thus, we can make sense of the "differential operator" on arbitrary polynomial rings.
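The algebraic definition translates directly into code. The sketch below (my own illustration; the function name `D` and the coefficient-list representation are conveniences, not from the post) represents a polynomial by its coefficient list $(c_0,\cdots,c_{n-1})$ and applies the purely formal rule $c_j x^j\mapsto jc_j x^{j-1}$, with no limits or derivatives involved.

```python
# Represent p(x) = c_0 + c_1 x + ... + c_{n-1} x^{n-1} by its
# coefficient list [c_0, ..., c_{n-1}].  The "differential operator"
# is the purely algebraic map c_j x^j -> j c_j x^{j-1}; we pad with a
# trailing 0 so D maps the space of length-n lists to itself.
def D(coeffs):
    n = len(coeffs)
    return [j * coeffs[j] for j in range(1, n)] + [0]

# p(x) = 3 + 2x + 5x^2, so D(p) = 2 + 10x
assert D([3, 2, 5]) == [2, 10, 0]
```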

Some aspects of linear transformations become apparent as soon as the definitions are laid out. For example, it's evident that for any linear transformation $T$ we have $T(\bold{0})=T\left(0\bold{0}\right)=0T\left(\bold{0}\right)=\bold{0}$. Slightly less obvious is the following theorem:

Theorem: Let $T$ be a linear transformation from $\mathscr{V}$ to $\mathscr{W}$ which is bijective. Then, $T^{-1}$ is a linear transformation.

Proof: We merely note that if $x,y\in\mathscr{W}$ then, since $T$ is bijective, $x=T(x')$ and $y=T(y')$ for some $x',y'\in\mathscr{V}$, and thus

\begin{aligned}T^{-1}\left(\alpha x+\beta y\right) &=T^{-1}\left(\alpha T(x')+\beta T(y')\right)\\ &=T^{-1}\left(T\left(\alpha x'+\beta y'\right)\right)\\ &=\alpha x'+\beta y'\\ &=\alpha T^{-1}(x)+\beta T^{-1}(y)\end{aligned}

and since $x,y\in \mathscr{W}$ and $\alpha,\beta\in F$ were arbitrary the conclusion follows. $\blacksquare$
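For concreteness, the theorem can be observed numerically: if $T(x)=Ax$ with $A$ invertible, then the set-theoretic inverse is $x\mapsto A^{-1}x$, which is again linear. The check below is an illustrative sketch with an arbitrarily chosen invertible matrix.

```python
import numpy as np

# T(x) = A x with A invertible; its set-theoretic inverse is
# x -> A^{-1} x, which we check satisfies the linearity identity.
A = np.array([[2.0, 1.0],
              [1.0, 1.0]])
A_inv = np.linalg.inv(A)

x = np.array([1.0, -2.0])
y = np.array([3.0, 0.5])
alpha, beta = 1.5, -0.25

lhs = A_inv @ (alpha * x + beta * y)
rhs = alpha * (A_inv @ x) + beta * (A_inv @ y)
assert np.allclose(lhs, rhs)
```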

It’s also fairly clear that we may add linear transformations and multiply them by scalars in the obvious way. Namely, if $T,T':\mathscr{V}\to\mathscr{W}$ are linear transformations and $\alpha,\beta\in F$, then we define $\alpha T+\beta T'$ to be the map

$\alpha T+\beta T':\mathscr{V}\to\mathscr{W}:x\mapsto \alpha T(x)+\beta T'(x)\quad\mathbf{(1)}$

Moreover, with this definition of addition it’s clear that the zero function $\bold{0}$ acts as the additive identity and the map $-T$ given by $(-T)(x)=-T(x)$ is an additive inverse. Thus, it’s not a quantum leap to realize that with the operations of addition and scalar multiplication given by $\mathbf{(1)}$ the set of all linear transformations from $\mathscr{V}$ to $\mathscr{W}$, denoted $\text{Hom}\left(\mathscr{V},\mathscr{W}\right)$, is a vector space. If we only consider endomorphisms on $\mathscr{V}$ we shorten $\text{Hom}\left(\mathscr{V},\mathscr{V}\right)$ to $\text{End}\left(\mathscr{V}\right)$.
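When the maps are represented by matrices, the pointwise definition $\mathbf{(1)}$ agrees with the usual matrix combination $\alpha A+\beta B$. The sketch below (my own example; the particular matrices are arbitrary) checks this agreement on a sample vector.

```python
import numpy as np

# Two linear maps R^2 -> R^2, represented by matrices.
A = np.array([[1.0, 0.0], [0.0, 2.0]])   # T
B = np.array([[0.0, 1.0], [1.0, 0.0]])   # T'
alpha, beta = 3.0, -1.0

x = np.array([2.0, 5.0])
# (alpha*T + beta*T')(x) defined pointwise ...
pointwise = alpha * (A @ x) + beta * (B @ x)
# ... matches applying the single matrix alpha*A + beta*B.
combined = (alpha * A + beta * B) @ x
assert np.allclose(pointwise, combined)
```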

### Dimension of $\text{Hom}\left(\mathscr{V},\mathscr{W}\right)$

So, the obvious question is: what is the dimension of $\text{Hom}\left(\mathscr{V},\mathscr{W}\right)$? The answer turns out to be $\dim_F\text{Hom}\left(\mathscr{V},\mathscr{W}\right)=\dim_F\left(\mathscr{V}\right)\cdot\dim_F\left(\mathscr{W}\right)$, although it may not be immediately clear why this is so. But first, we prove a small lemma:

Lemma: Let $\mathscr{V},\mathscr{W}$ be finite dimensional $F$-spaces. Then, if $\{v_1,\cdots,v_m\}$ is a basis for $\mathscr{V}$, then for any $\{w_1,\cdots,w_m\}\subseteq\mathscr{W}$ there is a unique linear transformation $T:\mathscr{V}\to\mathscr{W}$ such that $T(v_k)=w_k$ for each $k\in[m]$.

Proof: Let $T:\mathscr{V}\to\mathscr{W}$ be the map given by

$\displaystyle T\left(\sum_{j=1}^{m}\alpha_j v_j\right)=\sum_{j=1}^{m}\alpha_j w_j$

This is well-defined since every element of $\mathscr{V}$ has a unique representation in terms of the basis, and clearly $T$ is a linear map with $T\left(v_k\right)=w_k,\text{ }k=1,\cdots,m$. Moreover, if $T'$ is also a linear transformation such that $T'(v_k)=w_k$ we see then that

$\displaystyle T'\left(\sum_{j=1}^{m}\alpha_j v_j\right)=\sum_{j=1}^{m}\alpha_j T'\left(v_j\right)=\sum_{j=1}^{m}\alpha_j w_j=T\left(\sum_{j=1}^{m}\alpha_j v_j\right)$

and since every element of $\mathscr{V}$ may be written in the form $\displaystyle \sum_{j=1}^{m}\alpha_j v_j$  it follows that $T'(v)=T(v)$ for all $v\in\mathscr{V}$, from where the conclusion follows. $\blacksquare$
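In coordinates the lemma is very concrete: taking $v_1,\cdots,v_m$ to be the standard basis of $F^m$, prescribing $T(v_k)=w_k$ amounts to building the matrix whose $k$-th column is $w_k$. The sketch below (an illustration of mine, with arbitrarily chosen target vectors) checks that the resulting map sends $\alpha_1 v_1+\alpha_2 v_2$ to $\alpha_1 w_1+\alpha_2 w_2$.

```python
import numpy as np

# Take v_1, v_2 to be the standard basis of R^2 (m = 2); prescribe
# T(v_k) = w_k by stacking the w_k as the columns of a matrix.
w = [np.array([1.0, 0.0, 1.0]),
     np.array([0.0, 2.0, 0.0])]          # targets in R^3
A = np.column_stack(w)                   # 3 x 2 matrix

# T(alpha_1 v_1 + alpha_2 v_2) = alpha_1 w_1 + alpha_2 w_2
alpha = np.array([3.0, -1.0])
assert np.allclose(A @ alpha, 3.0 * w[0] - 1.0 * w[1])
```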

Theorem: Let $\mathscr{V},\mathscr{W}$ be $F$-spaces of dimension $m$ and $n$ respectively; then the dimension of $\text{Hom}\left(\mathscr{V},\mathscr{W}\right)$ is $mn$.

Proof: Let $\{v_1,\cdots,v_m\}$ be a basis for $\mathscr{V}$ and $\{w_1,\cdots,w_n\}$ a basis for $\mathscr{W}$. Then, for each $i\in[m]$ and $j\in[n]$ the above lemma implies that there exists a unique $T_{i,j}\in\text{Hom}\left(\mathscr{V},\mathscr{W}\right)$ such that $T_{i,j}(v_k)=\delta_{i,k}w_j$. We claim that $\mathscr{B}\overset{\text{def.}}{=}\left\{T_{i,j}:i\in[m]\text{ and }j\in[n]\right\}$ forms a basis for $\text{Hom}\left(\mathscr{V},\mathscr{W}\right)$. Firstly, to see that $\mathscr{B}$ is linearly independent we suppose that $\alpha_{i,j}\in F,\text{ }i\in[m],j\in[n]$ are such that

$\displaystyle \sum_{i=1}^{m}\sum_{j=1}^{n}\alpha_{i,j}T_{i,j}=\bold{0}$

then, for a fixed $i_0\in[m]$ we see that the above implies that

$\displaystyle \sum_{j=1}^{n}\alpha_{i_0,j}w_j=\sum_{i=1}^{m}\sum_{j=1}^{n}\alpha_{i,j}T_{i,j}(v_{i_0})=\bold{0}$

and since $\{w_1,\cdots,w_n\}$ is linearly independent it follows that $\alpha_{i_0,1}=\cdots=\alpha_{i_0,n}=0$. But, since $i_0$ was arbitrary it follows that $\alpha_{i,j}=0$ for all $i\in[m],j\in[n]$. Now, to prove that $\text{span }\mathscr{B}=\text{Hom}\left(\mathscr{V},\mathscr{W}\right)$ let $T\in\text{Hom}\left(\mathscr{V},\mathscr{W}\right)$. Then, for each $i_0\in[m]$ we have that

$\displaystyle T(v_{i_0})=\sum_{j=1}^{n}\beta_{i_0,j}w_j$ for some scalars $\beta_{i_0,j}\in F$.

We claim that

$\displaystyle T=\sum_{i=1}^{m}\sum_{j=1}^{n}\beta_{i,j}T_{i,j}$

To prove this it suffices to show that the two sides agree on $v_1,\cdots,v_m$, since these span $\mathscr{V}$. To that end, we merely note that for each $i_0\in[m]$

$\displaystyle \sum_{i=1}^{m}\sum_{j=1}^{n}\beta_{i,j}T_{i,j}(v_{i_0})=\sum_{j=1}^{n}\beta_{i_0,j}T_{i_0,j}(v_{i_0})=\sum_{j=1}^{n}\beta_{i_0,j}w_j=T(v_{i_0})$

from where it follows that $\mathscr{B}$ is a basis. Noting that $\text{card }\mathscr{B}=mn$ finishes the argument. $\blacksquare$
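Under a choice of bases the maps $T_{i,j}$ correspond to the "matrix units" (a single $1$ in one entry, zeros elsewhere), and the theorem says these $mn$ matrices form a basis of the matrix space. The sketch below (my own illustration, using a rank computation as a linear-independence check) confirms this for $m=3$, $n=2$.

```python
import numpy as np

# Hom(V, W) with dim V = m, dim W = n corresponds, after fixing bases,
# to the space of n x m matrices.  The maps T_{i,j} correspond to the
# matrix units E with a single 1 in row j, column i.
m, n = 3, 2
units = []
for i in range(m):
    for j in range(n):
        E = np.zeros((n, m))
        E[j, i] = 1.0
        units.append(E.ravel())         # flatten to a vector of length n*m

# The m*n flattened matrix units are linearly independent, so they form
# a basis of an mn-dimensional space.
rank = np.linalg.matrix_rank(np.stack(units))
assert rank == m * n
```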

Corollary: If $\mathscr{V}$ is a finite dimensional $F$-space then $\dim_F\text{End}\left(\mathscr{V}\right)=\left(\dim_F\mathscr{V}\right)^2$.

References:

1. Halmos, Paul R. Finite-Dimensional Vector Spaces. New York: Springer-Verlag, 1974. Print.

November 19, 2010
