I finished a whole chapter about eigenvalues, eigenvectors, and invariant subspaces (invariant with respect to a linear operator, i.e. a map from $V$ to $V$).
I went through lots of theorems and propositions and I understand their proofs well. But a question came up which I don't think is covered there. Here it is.
Let's say $V$ is an $n$-dimensional vector space over $\mathbb{R}$ ($n$ a natural number). Also, let's say $\phi : V \to V$ is a linear map, and that the characteristic polynomial of $\phi$ has real roots $\lambda_1, \lambda_2, \dots, \lambda_k$ with multiplicities $m_1, m_2, \dots, m_k$.
Of course this implies
$\sum_{i=1}^{k} m_i \le n$
Then (for any $i$), is it true that in $V$ we can find $m_i$ linearly independent eigenvectors corresponding to $\lambda_i$, but we cannot find $m_i+1$ linearly independent eigenvectors corresponding to $\lambda_i$?
I solved a few problems (with actual matrices containing numbers), and that always seems to be the case.
E.g. if the characteristic polynomial is divisible by $(x-5)^2$, then I find exactly 2 linearly independent eigenvectors corresponding to the eigenvalue 5 (matching the multiplicity 2). So in that case there is a 2-dimensional $\phi$-invariant subspace of $V$ corresponding to the eigenvalue 5:
$U = \{u : \phi(u) = 5u\}$
$\dim\ U = 2$
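Here is the kind of numerical check I've been doing, as a sketch in NumPy. The matrix below is one I made up for illustration: it is built to have characteristic polynomial divisible by $(x-5)^2$, and I then count how many linearly independent eigenvectors NumPy returns for the eigenvalue 5.

```python
import numpy as np

# A made-up 3x3 example: start from diag(5, 5, 2), so the characteristic
# polynomial is (x - 5)^2 (x - 2), i.e. eigenvalue 5 has multiplicity 2.
A = np.diag([5.0, 5.0, 2.0])

# Conjugate by an invertible P so the matrix is no longer diagonal;
# similarity preserves the characteristic polynomial and eigenspace dimensions.
P = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0]])
B = P @ A @ np.linalg.inv(P)

eigvals, eigvecs = np.linalg.eig(B)

# Collect the eigenvector columns belonging to the eigenvalue 5
# and count how many of them are linearly independent.
mask = np.isclose(eigvals.real, 5.0)
U_basis = eigvecs[:, mask]
print(np.linalg.matrix_rank(U_basis))  # prints 2 for this example
```

For this particular matrix the count equals the multiplicity, which is exactly the pattern I keep observing; the question is whether that is forced in general.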
But is this always the case? And can $U$ ever have dimension greater than $2$? (Here $2$ is just my example value: the multiplicity of the eigenvalue $5$ as a root of the characteristic polynomial of $\phi$.)
I don't think any of the theorems in that chapter covered these two questions.