I'm doing a self-study of Axler's Linear Algebra Done Right, and am looking for some help understanding a step in the proof of Proposition 5.21, appearing on page 89 of the second edition. An abbreviated version of 5.21 and its proof is shown below, with the step I don't understand highlighted in bold:

Suppose $T \in \mathcal{L}(V)$. Let $\lambda_{1}, \ldots, \lambda_{m}$ denote the distinct eigenvalues of $T$. Then the following are equivalent:

  1. $V$ has a basis consisting of eigenvectors of $T$;
  2. $\dim V = \dim \operatorname{null}(T - \lambda_{1}I) + \cdots + \dim \operatorname{null}(T - \lambda_{m}I)$

Suppose that (2) holds. For each $j$, choose a basis of $\operatorname{null}(T - \lambda_{j}I)$; put all these bases together to form a list $(\vec{v_{1}}, \ldots, \vec{v_{n}})$ of eigenvectors of $T$, where $n = \dim V$. To show this list is linearly independent, suppose

$a_{1}\vec{v_{1}} + \cdots +a_{n}\vec{v_{n}} = 0$ (1)

where $a_{1}, \ldots, a_{n} \in \mathbf{F}$. For each $j = 1, \ldots, m$, let $u_{j}$ denote the sum of all the terms $a_{k}\vec{v_{k}}$ such that $\vec{v_{k}} \in \operatorname{null}(T - \lambda_{j}I)$. Thus each $u_{j}$ is an eigenvector of $T$ with eigenvalue $\lambda_{j}$, and

$u_{1} + \cdots + u_{m} = 0$

My question is this: what does it mean to "let $u_{j}$ denote the sum of all the terms of $a_{k} \vec{v_{k}}$ such that $\vec{v_{k}} \in$ null($T - \lambda_{j}I$)?" Aren't those terms already defined in equation (1), and isn't their sum already supposed to be zero? And how is it that "thus" each $u_{j}$ becomes an eigenvector of $T$ with a corresponding eigenvalue of $\lambda_{j}$?

Thanks in advance for any and all help!

  • Axler is defining something called $u_j$, so when he tells you to get it by summing over all those terms, I think he means for you to keep $j$ fixed while varying $k$. In other words, he wants you to get $u_j$ by adding up just the terms associated with the null space of $T - \lambda_j I$, with $j$ fixed. – Commented Dec 19, 2014 at 19:21

1 Answer


We are assuming that there is a linear combination $a_1 \vec{v_1} + \dotsb + a_n \vec{v_n} = 0$, where these $n$ vectors $\vec{v_1}, \dotsc, \vec{v_n}$ are all the basis elements we chose for the null-spaces $\operatorname{null}(T - \lambda_i I)$.

So, we collect the terms that belong to each of these $m$ sets. For instance, suppose $n = 5$ and $T$ has three eigenvalues; the first, $\lambda_1$, has an eigenspace of dimension 2 (i.e. 2 independent eigenvectors); $\lambda_2$ also has $\operatorname{dim}\operatorname{null}(T - \lambda_2 I) = 2$; and $\lambda_3$ only has a one-dimensional eigenspace. Then the layout is like $$ \underbrace{a_1 \vec{v_1} + a_2 \vec{v_2}}_{u_1} + \underbrace{a_3 \vec{v_3} + a_4 \vec{v_4}}_{u_2} + \underbrace{a_5 \vec{v_5}}_{u_3} = 0 $$ Because $u_i$ is a linear combination of eigenvectors for $\lambda_i$, it is also an eigenvector for $\lambda_i$ (or the zero vector, if the coefficients happen to cancel). And it's clear that $u_1 + u_2 + u_3 = 0$.
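If it helps to see this grouping concretely, here is a small numerical sketch (not from Axler's book; the matrix and coefficients are made up). It uses a diagonal operator on $\mathbf{R}^5$ with eigenvalues 2, 2, 3, 3, 5, takes the standard basis vectors as the eigenvector basis, and checks that each grouped sum $u_j$ is again an eigenvector for its eigenvalue:

```python
import numpy as np

# Hypothetical operator on R^5: diagonal, with eigenvalue 2 (multiplicity 2),
# eigenvalue 3 (multiplicity 2), and eigenvalue 5 (multiplicity 1).
T = np.diag([2.0, 2.0, 3.0, 3.0, 5.0])

# The standard basis vectors e_1, ..., e_5 serve as the eigenvector list
# (v_1, ..., v_n) obtained by concatenating bases of the eigenspaces.
e = np.eye(5)

# Arbitrary coefficients a_1, ..., a_5 as in equation (1).
a = np.array([1.5, -2.0, 0.7, 3.1, -1.2])

# Group the terms by eigenvalue, as in the proof: u_j is the sum of the
# terms a_k v_k with v_k in null(T - lambda_j I).
u1 = a[0] * e[:, 0] + a[1] * e[:, 1]   # terms for eigenvalue 2
u2 = a[2] * e[:, 2] + a[3] * e[:, 3]   # terms for eigenvalue 3
u3 = a[4] * e[:, 4]                    # term for eigenvalue 5

# Each u_j satisfies T u_j = lambda_j u_j, i.e. it is an eigenvector
# for lambda_j (or the zero vector).
print(np.allclose(T @ u1, 2 * u1))  # True
print(np.allclose(T @ u2, 3 * u2))  # True
print(np.allclose(T @ u3, 5 * u3))  # True
```

The point of the check is exactly the "thus" in Axler's proof: summing terms within a single eigenspace keeps you inside that eigenspace, because each $\operatorname{null}(T - \lambda_j I)$ is a subspace.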

  • Terrific explication, @Kundor. Very much appreciated. One last request for clarification. You say that "because $u_{i}$ is a linear combination of eigenvectors for $\lambda_{i}$, it is also an eigenvector for $\lambda_{i}$." This wasn't obvious to me at all. Is this because $\operatorname{null}(T - \lambda_{i}I)$ is a subspace of $V$? – Commented Dec 19, 2014 at 23:18
  • @ScentlessApprentice: Suppose $w$ and $v$ are eigenvectors for the eigenvalue $\lambda$, so $T w = \lambda w$ and $T v = \lambda v$. Then $T(w + v) = T w + T v = \lambda w + \lambda v = \lambda(w + v)$. – Commented Dec 20, 2014 at 2:19
  • That's what I thought. Thanks! – Commented Dec 20, 2014 at 4:41
