# Abstract Nonsense

## Problem For Putnam Class

Point of post: In this post I’ll solve a fairly easy (though it took me a bit to see exactly how to do it) linear algebra problem. This is for a Putnam class I’m taking, where we’re encouraged to do as many problems as we can out of the six or so posted each week. This was one of the “easier” ones (there are two categories: easier problems and past Putnam problems).

Problem: Let $\mathscr{V}$ be an $n$-dimensional vector space and $A:\mathscr{V}\to\mathscr{V}$ be a linear transformation. Prove that if $A$ has $n+1$ eigenvectors, every $n$ of which are linearly independent, then $A=\lambda I$ for some $\lambda$.

Proof: Denote the $n+1$ eigenvectors by $x_1,\cdots,x_{n+1}$ and their corresponding eigenvalues by $\lambda_1,\cdots,\lambda_{n+1}$. Let $\lambda_i$ and $\lambda_j$ with $i\ne j$ be arbitrary. By relabeling if necessary, we may assume that $i=n$ and $j=n+1$. But we know that $\{x_1,\cdots,x_{n-1},x_n\}$ is linearly independent, and so (Horn, 46)

$A\sim\text{ diag}(\lambda_1,\cdots,\lambda_{n-1},\lambda_n)$
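The cited fact (a matrix with $n$ linearly independent eigenvectors is similar to the diagonal matrix of those eigenvalues, via the matrix whose columns are the eigenvectors) can be sketched numerically; the matrix below is just a hypothetical example:

```python
import numpy as np

# A hypothetical 2x2 matrix with distinct eigenvalues 2 and 3.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

eigvals, S = np.linalg.eig(A)   # columns of S are eigenvectors of A
D = np.linalg.inv(S) @ A @ S    # conjugating by S diagonalizes A

# D is (up to floating-point error) diag of the eigenvalues,
# i.e. A ~ diag(lambda_1, lambda_2).
assert np.allclose(D, np.diag(eigvals))
```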

But since $\{x_1,\cdots,x_{n-1},x_{n+1}\}$ is also linearly independent, we may appeal to the same idea and see that

$A\sim\text{ diag}(\lambda_1,\cdots,\lambda_{n-1},\lambda_{n+1})$

from where it follows by the transitivity of the similarity relation that

$\text{diag}(\lambda_1,\cdots,\lambda_{n-1},\lambda_n)\sim\text{ diag}(\lambda_1,\cdots,\lambda_{n-1},\lambda_{n+1})$
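The key tool now is that similar matrices have equal trace. A quick numerical sketch of this invariance, with a hypothetical $3\times 3$ example:

```python
import numpy as np

# tr(S M S^{-1}) = tr(M): the trace is a similarity invariant.
rng = np.random.default_rng(0)
M = np.diag([1.0, 2.0, 3.0])
S = rng.standard_normal((3, 3))   # generic, hence invertible
N = S @ M @ np.linalg.inv(S)      # N is similar to M

assert np.isclose(np.trace(N), np.trace(M))   # both traces equal 6
```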

Thus, since the trace of a matrix is invariant under similarity we may conclude that

$\displaystyle \sum_{k=1}^{n-1}\lambda_k+\lambda_n =\sum_{k=1}^{n-1}\lambda_k+\lambda_{n+1}$

and so by cancellation we get that $\lambda_n=\lambda_{n+1}$. Since (by relabeling) the pair $i,j$ was arbitrary, we may conclude that $\lambda_1=\cdots=\lambda_n=\lambda_{n+1}$; call this common value $\lambda$. Therefore, we can see that since

$A\sim\text{ diag}(\lambda_1,\cdots,\lambda_n)=\text{ diag}(\lambda,\cdots,\lambda)=\lambda I$

we may conclude that

$A=S(\lambda I)S^{-1}=\lambda SS^{-1}=\lambda I$

from where the conclusion follows. $\blacksquare$
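As a sanity check, here is a minimal numerical sketch of the statement for $n=2$ (the vectors below are hypothetical choices):

```python
import numpy as np
from itertools import combinations

# For A = lam * I, every nonzero vector is an eigenvector, so the three
# vectors below give n + 1 = 3 eigenvectors, every n = 2 of which are
# linearly independent -- exactly the hypothesis of the problem.
lam = 5.0
A = lam * np.eye(2)

vectors = [np.array([1.0, 0.0]),
           np.array([0.0, 1.0]),
           np.array([1.0, 1.0])]

# each vector is an eigenvector of A with eigenvalue lam ...
for x in vectors:
    assert np.allclose(A @ x, lam * x)

# ... and every pair of them is linearly independent (full-rank 2x2 check)
for x, y in combinations(vectors, 2):
    assert np.linalg.matrix_rank(np.column_stack([x, y])) == 2

# By contrast, a non-scalar matrix such as diag(1, 2) cannot satisfy the
# hypothesis: its eigenvectors are the multiples of e1 or of e2, so among
# any three of them two are parallel, hence linearly dependent.
e1 = np.array([1.0, 0.0])
assert np.linalg.matrix_rank(np.column_stack([e1, 2 * e1])) == 1
```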

References:

1. Horn, Roger A., and Charles R. Johnson. *Matrix Analysis*. Cambridge: Cambridge University Press, 2009. Print.