Suppose we want to find a basis for the eigenspace of each eigenvalue $\lambda$ of some matrix $A$.
While working through this problem, I noticed that a basis for the eigenspace of an eigenvalue seems closely connected to the eigenvectors of that eigenvalue. I'm not sure whether they are actually the same thing, because I run into trouble when an eigenvalue has geometric multiplicity two or more.
Take the following example:
$$\begin{pmatrix} 0 & -1 & 0 \\ 4 & 4 & 0 \\ 2 & 1 & 2 \end{pmatrix} $$
This matrix has characteristic polynomial $- \lambda ^3 + 6 \lambda ^2 - 12 \lambda + 8 = -(\lambda - 2)^3$, so its only root is $\lambda = 2$, with algebraic multiplicity $3$.
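As a sanity check (assuming SymPy is available), the characteristic polynomial and its root can be verified symbolically:

```python
import sympy as sp

A = sp.Matrix([[0, -1, 0],
               [4, 4, 0],
               [2, 1, 2]])
lam = sp.symbols('lambda')

# det(A - lambda*I), expanded; should equal -lambda**3 + 6*lambda**2 - 12*lambda + 8
p = sp.expand((A - lam * sp.eye(3)).det())
print(p)

# Roots with multiplicities; lambda = 2 should appear with algebraic multiplicity 3
print(sp.roots(p))
```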
When I try to find a basis for the eigenspace of the eigenvalue $\lambda = 2$, I get confused. When I solve $(A - 2I)\mathbf{v} = \mathbf{0}$, I only get $(0,0,1)$ (more precisely, the multiples $(0,0,n)$ with $n \in \mathbb{R}$) as a basis, even though this eigenvalue has two linearly independent eigenvectors, namely $(0,0,1)$ and $(1,-2,0)$. This leaves me with the following questions:
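To check that the eigenspace really is two-dimensional (again assuming SymPy), one can compute the null space of $A - 2I$ directly:

```python
import sympy as sp

A = sp.Matrix([[0, -1, 0],
               [4, 4, 0],
               [2, 1, 2]])

# Basis vectors of the null space of (A - 2I) are eigenvectors for lambda = 2
ns = (A - 2 * sp.eye(3)).nullspace()
for v in ns:
    print(v.T)
# Two basis vectors, so the geometric multiplicity is 2;
# up to scaling they match (1, -2, 0) and (0, 0, 1)
```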
Is it true that a "basis of the eigenspace of an eigenvalue" is simply a maximal set of linearly independent eigenvectors for that eigenvalue (so in our example, the basis would be $(0,0,1), (1,-2,0)$)?
If so, why does my method not produce both eigenvectors, and how can I find them both?