
I came across this problem when studying systems of differential equations. I'm trying to find an $n\times n$ matrix whose eigenvalues are all the same but which still has $n$ linearly independent eigenvectors. I think the only type of matrix with this property is a scalar multiple of the identity matrix, but that case seems to occur rarely in systems of differential equations.

Let $A$ be an $n\times n$ matrix.

Prove or disprove: if $A$ has an eigenvalue of multiplicity $n$ and $n$ linearly independent eigenvectors, then $A$ is a scalar multiple of the identity matrix, that is, $$A = kI.$$

Comment: What do you know about diagonalization? (Feb 19, 2018 at 10:45)

3 Answers


Any $n$ linearly independent vectors form a basis, so every vector $x$ can be written as a linear combination of the eigenvectors. Applying the transformation $A$ scales each basis component by the unique eigenvalue $\lambda$, so $Ax=\lambda x$ for every $x$, which means $A=\lambda I$. For $n=3$, with eigenvectors $e_0,e_1,e_2$:

$$Ax=A(x_0 e_0+x_1 e_1+x_2 e_2)=x_0 \lambda e_0+x_1 \lambda e_1+x_2\lambda e_2=\lambda x.$$
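This argument can be checked numerically. A minimal sketch in plain Python, where the basis vectors $e_0,e_1$, the eigenvalue $\lambda=5$, and the coordinates $x_0,x_1$ are all hypothetical choices of mine, not from the answer:

```python
from fractions import Fraction as F

lam = F(5)                           # the unique eigenvalue (hypothetical value)
e0, e1 = [F(1), F(2)], [F(3), F(4)]  # a basis of eigenvectors (hypothetical)
x0, x1 = F(2), F(-1)                 # coordinates of an arbitrary vector x in that basis

# x = x0*e0 + x1*e1
x = [x0 * e0[i] + x1 * e1[i] for i in range(2)]

# Ax = x0*lam*e0 + x1*lam*e1: each eigenvector component is scaled by the same lam
Ax = [x0 * lam * e0[i] + x1 * lam * e1[i] for i in range(2)]

# which equals lam*x, i.e. A acts as lam*I on every vector
assert Ax == [lam * x[i] for i in range(2)]
```

The assertion is exactly the displayed equation above, evaluated with exact rational arithmetic.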


Yes, it is true. The assertion that the matrix has $n$ linearly independent eigenvectors is equivalent to the assertion that it is diagonalizable. And, since it has a single eigenvalue $k$, that means it is similar to $k\operatorname{Id}_n$. But the only matrix similar to $k\operatorname{Id}_n$ is $k\operatorname{Id}_n$ itself: if $A = P(k\operatorname{Id}_n)P^{-1}$ for some invertible $P$, then $A = kPP^{-1} = k\operatorname{Id}_n$.
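The similarity step can also be verified numerically. A sketch in plain Python, assuming a hypothetical invertible $2\times 2$ matrix $P$ and eigenvalue $k=5$: it builds $A = P(kI)P^{-1}$ and confirms that the product collapses back to $kI$.

```python
from fractions import Fraction as F

def matmul(X, Y):
    # multiply two 2x2 matrices entry by entry
    return [[sum(X[i][m] * Y[m][j] for m in range(2)) for j in range(2)]
            for i in range(2)]

k = F(5)                          # the single eigenvalue (hypothetical value)
P = [[F(1), F(3)], [F(2), F(4)]]  # an invertible change-of-basis matrix (hypothetical)
kI = [[k, F(0)], [F(0), k]]       # the diagonal matrix k * Id_2

# invert P via the 2x2 cofactor formula; det = 1*4 - 3*2 = -2 != 0
det = P[0][0] * P[1][1] - P[0][1] * P[1][0]
Pinv = [[ P[1][1] / det, -P[0][1] / det],
        [-P[1][0] / det,  P[0][0] / det]]

# A = P (k Id) P^{-1}: the only matrix similar to k*Id is k*Id itself
A = matmul(matmul(P, kI), Pinv)
assert A == kI
```

Exact `Fraction` arithmetic avoids the floating-point round-off that would make the equality check fragile.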


Since the matrix $A$ has $n$ linearly independent eigenvectors, the matrix $P$ whose columns are those eigenvectors has full rank. For example, if the eigenvectors are $\begin{bmatrix}1\\2\end{bmatrix}$ and $\begin{bmatrix}3\\4\end{bmatrix}$, then $$P=\begin{bmatrix}1&3\\2&4\end{bmatrix},$$ which has full rank and is therefore invertible.

By definition, if $v$ is an eigenvector of $A$ corresponding to eigenvalue $\lambda$, then $$Av=\lambda v.$$ Since the columns of $P$ are all eigenvectors, we have $$AP=A\begin{bmatrix}v_1&v_2&\dots&v_n\end{bmatrix}=\begin{bmatrix}Av_1&Av_2&\dots&Av_n\end{bmatrix}=\begin{bmatrix}\lambda_1v_1&\lambda_2v_2&\dots&\lambda_nv_n\end{bmatrix},$$ where the $v_i$ are the eigenvectors and the $\lambda_i$ the corresponding eigenvalues. Here $\lambda_1=\lambda_2=\dots=\lambda_n=\lambda$, so $$AP=\begin{bmatrix}\lambda_1v_1&\lambda_2v_2&\dots&\lambda_nv_n\end{bmatrix}=\begin{bmatrix}\lambda v_1&\lambda v_2&\dots&\lambda v_n\end{bmatrix}=\lambda\begin{bmatrix}v_1&v_2&\dots&v_n\end{bmatrix}=\lambda P.$$ Since $P$ has full rank, it is invertible, and $$AP=\lambda P\implies APP^{-1}=\lambda PP^{-1}\implies A=\lambda I,$$ which completes the proof.
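This answer's chain $AP=\lambda P\implies A=(\lambda P)P^{-1}=\lambda I$ can be traced step by step in plain Python, using the answer's example eigenvectors as the columns of $P$; the eigenvalue $\lambda=7$ is a hypothetical value of mine.

```python
from fractions import Fraction as F

def matmul(X, Y):
    # multiply two 2x2 matrices entry by entry
    return [[sum(X[i][m] * Y[m][j] for m in range(2)) for j in range(2)]
            for i in range(2)]

lam = F(7)                        # the repeated eigenvalue (hypothetical value)
P = [[F(1), F(3)], [F(2), F(4)]]  # columns are the answer's example eigenvectors

# AP = lam * P, as derived in the answer
AP = [[lam * P[i][j] for j in range(2)] for i in range(2)]

# invert P via the 2x2 cofactor formula; det = 1*4 - 3*2 = -2 != 0
det = P[0][0] * P[1][1] - P[0][1] * P[1][0]
Pinv = [[ P[1][1] / det, -P[0][1] / det],
        [-P[1][0] / det,  P[0][0] / det]]

# A = (lam P) P^{-1} = lam * I
A = matmul(AP, Pinv)
assert A == [[lam, F(0)], [F(0), lam]]
```

The final assertion is the answer's conclusion $A=\lambda I$ computed explicitly for this $2\times 2$ case.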
