22
$\begingroup$

Here is a vector

$$\begin{pmatrix}i\\7i\\-2\end{pmatrix}$$

Here is a matrix

$$\begin{pmatrix}2& i&0\\-i&1&1\\0 &1&0\end{pmatrix}$$

Is there a simple way to determine whether the vector is an eigenvector of this matrix?

Here is some code for your convenience.

h = {{2, I, 0}, {-I, 1, 1}, {0, 1, 0}};
y = {I, 7 I, -2};
$\endgroup$
8
  • 3
    $\begingroup$ Could also do h.y/y. Dividing by y divides element-wise, and so if it's an eigenvector, each element of the resulting vector should be the same (which is the eigenvalue). $\endgroup$ Commented Nov 27, 2017 at 19:24
  • 2
    $\begingroup$ @march: Division by zero (not in this case, but in general)...? $\endgroup$ Commented Nov 27, 2017 at 19:32
  • $\begingroup$ @HenrikSchumacher. Sure, it's not completely general, which is partly why I didn't write it as an answer. $\endgroup$ Commented Nov 27, 2017 at 19:33
  • $\begingroup$ @march It's still a good idea. $\endgroup$ Commented Nov 27, 2017 at 19:35
  • 8
    $\begingroup$ Solve the eigenvalue/vector equation: Solve[h.y == lambda*y, lambda]. It is an eigenvector iff the solution set is nonempty. $\endgroup$ Commented Nov 27, 2017 at 20:33

8 Answers

37
$\begingroup$

You could use MatrixRank: a vector v is an eigenvector of m exactly when m.v and v are linearly dependent, i.e. when the matrix {m.v, v} has rank 1. Here is a function that does this:

eigenvectorQ[matrix_, vector_] := MatrixRank[{matrix . vector, vector}] == 1 

For your example:

eigenvectorQ[h, y] 

False
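For readers outside Mathematica, the same rank-1 idea can be sketched in plain Python (a hypothetical cross-check; `matvec` and `eigenvector_q` are illustrative names, not library functions):

```python
# y is an eigenvector of h exactly when h.y is a scalar multiple of y,
# i.e. when the matrix with rows h.y and y has rank 1.

def matvec(m, v):
    """Matrix-vector product over complex numbers."""
    return [sum(mij * vj for mij, vj in zip(row, v)) for row in m]

def eigenvector_q(m, v, tol=1e-12):
    """True iff m.v is a scalar multiple of v (v assumed nonzero)."""
    mv = matvec(m, v)
    # Use the largest-magnitude component of v to avoid dividing by zero.
    k = max(range(len(v)), key=lambda i: abs(v[i]))
    lam = mv[k] / v[k]  # candidate eigenvalue
    return all(abs(mvi - lam * vi) < tol for mvi, vi in zip(mv, v))

h = [[2, 1j, 0], [-1j, 1, 1], [0, 1, 0]]
y = [1j, 7j, -2]
print(eigenvector_q(h, y))  # False, matching the result above
```

The tolerance parameter also covers the floating-point case, where exact equality cannot be expected.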

$\endgroup$
2
  • $\begingroup$ Slick... (and an upvote) $\endgroup$ Commented Nov 28, 2017 at 16:31
  • $\begingroup$ That is the right way $\endgroup$ Commented Nov 29, 2017 at 7:26
8
$\begingroup$

For problems with exact coordinates, one could code up the definition of eigenvector. The function eigV finds the eigenvalue for a given vector in the form L == value, or returns False if there is none; the function eigQ returns True if such an eigenvalue exists and False otherwise.

ClearAll[eigQ, eigV];
eigV[m_, v_] := Reduce@Thread[(m - SparseArray[{i_, i_} :> L, Dimensions[m]]).v == 0];
eigV[m_][v_] := eigV[m, v]; (* operator form *)
eigQ[m_, v_] := Resolve@Exists[L, eigV[m, v]];
eigQ[m_][v_] := eigQ[m, v]; (* operator form *)

Examples:

eigQ[h] /@ {y, {-I (-2 + Sqrt[3]), 1 - Sqrt[3], 1}}
(* {False, True} *)

eigV[h] /@ {y, {-I (-2 + Sqrt[3]), 1 - Sqrt[3], 1}}
(* {False, L == 1 - Sqrt[3]} *)

Or simply

eigQ[h, y] (* False *) 

For approximate problems, one would have to account for rounding error.

$\endgroup$
3
  • $\begingroup$ Note: It's simpler to use Daniel Lichtblau's form for the equation: eigV[m_, v_] := Reduce[m.v == L*v] (brain spasm sent me on the roundabout way, I guess). $\endgroup$ Commented Nov 27, 2017 at 22:20
  • 1
    $\begingroup$ Why not just Resolve@Exists[α, #1.#2 == α #2] & @@ {h, y}? $\endgroup$ Commented Nov 28, 2017 at 3:21
  • $\begingroup$ @aardvark2012 Fair enough. I had something like that at one point. But I was enamored with eigV; for some reason, it appealed to me. Maybe the eigenvalue as a certificate of proof, I suppose. $\endgroup$ Commented Nov 28, 2017 at 3:46
6
$\begingroup$

Yes! We just check whether $h.y = (u + I v)\, y$ holds for some $u, v \in \mathbb{R}$.

h = {{2, I, 0}, {-I, 1, 1}, {0, 1, 0}};
y = {I, 7 I, -2};
expr = Norm[h.y - (u + I v) y, 2]^2 // ComplexExpand;
Minimize[expr, {u, v}]

{623/6, {u -> 17/18, v -> 0}}

Answer: Nope, it's not. The minimum of the squared norm is $623/6 > 0$, so $h.y$ is not a scalar multiple of $y$ for any choice of $u$ and $v$.

$\endgroup$
5
$\begingroup$

You already have several good answers. An alternative is to use a Rayleigh quotient,

r = First[y.h.ConjugateTranspose[{y}]]/Norm[y]^2; 

The vector y is an eigenvector of h if and only if the matrix $$ h - r\,1_{3\times3} $$ is singular:

MatrixRank[h - r IdentityMatrix[Length[y]]] < Length[y]
(* False *)

or

Det[h - r IdentityMatrix[Length[y]]] == 0
(* False *)

If you are using floating-point numbers, you should relax these conditions to

MatrixRank[h - r IdentityMatrix[Length[y]], Tolerance -> epsilon] < Length[y]

or

Abs[Det[h - r IdentityMatrix[Length[y]]]] < epsilon

where epsilon is some small number.

As a matter of fact, the MatrixRank method is slightly faster than the Det one. It seems to me that it is also faster than the methods suggested by other users, but confirming this would require a more thorough analysis.
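The Rayleigh-quotient/singularity test translates directly into other languages; here is a hypothetical pure-Python sketch (the helper names `rayleigh` and `det3` are made up for illustration):

```python
# y is an eigenvector of h iff h - r*I is singular, where
# r = (y* . h . y) / (y* . y) is the Rayleigh quotient of y.

def matvec(m, v):
    return [sum(mij * vj for mij, vj in zip(row, v)) for row in m]

def rayleigh(m, v):
    mv = matvec(m, v)
    num = sum(vi.conjugate() * mvi for vi, mvi in zip(v, mv))
    den = sum(abs(vi) ** 2 for vi in v)
    return num / den

def det3(m):
    """Determinant of a 3x3 matrix by cofactor expansion along the first row."""
    (a, b, c), (d, e, f), (g, h_, i) = m
    return a * (e * i - f * h_) - b * (d * i - f * g) + c * (d * h_ - e * g)

h = [[2, 1j, 0], [-1j, 1, 1], [0, 1, 0]]
y = [1j, 7j, -2]
r = rayleigh(h, y)  # 17/18 for this h and y
shifted = [[h[i][j] - (r if i == j else 0) for j in range(3)] for i in range(3)]
print(abs(det3(shifted)) < 1e-12)  # False: h - r*I is nonsingular
```

Since the shifted matrix has nonzero determinant, y is not an eigenvector, in agreement with the exact computation.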

$\endgroup$
1
  • 2
    $\begingroup$ For full masochistic effect you should have him calculate the determinant by cofactor expansion. ;) $\endgroup$ Commented Nov 29, 2017 at 5:32
3
$\begingroup$

Either

Reduce[h . y == x * y, x] 

or

Reduce[(h - IdentityMatrix[Length[h]] x) . y == 0, x] 

depending on whether you would rather type $y$ once or twice.

$\endgroup$
3
$\begingroup$

Carl Woll's answer seems to be broken in the newest Mathematica. Here is a slight modification that makes it work

EigenvectorQ[matrix_, vector_] := MatrixRank[Join[Transpose[{matrix.vector}], Transpose[{vector}], 2]] == 1 

MatrixRank gives the number of linearly independent rows (equivalently, columns). Join[m1, m2, 2] joins the second matrix to the first on the right, so here the two vectors become the columns of a single matrix.

$\endgroup$
6
  • 1
    $\begingroup$ If you think about it, the rank of a matrix ought to be the same whether transposed or not. Can you give an example of a matrix-vector pair where Carl's version fails, but yours works? $\endgroup$ Commented Apr 17, 2020 at 3:58
  • $\begingroup$ @J.M. For whatever reason, I had trouble with it a week ago, but not today. Even with the same matrices. I'll delete this post in a bit. $\endgroup$ Commented Apr 22, 2020 at 16:42
  • $\begingroup$ @J.M., Actually, with some more experimenting (and more diffeq homework), Carl Woll's answer doesn't work for the matrix {{4, 5}, {-2, 6}} with the eigenvector {{1 - 3 I}, {2}}. It doesn't seem to work with any matrix that has complex eigenvectors $\endgroup$ Commented Apr 23, 2020 at 18:27
  • $\begingroup$ With[{m = {{4, 5}, {-2, 6}}, v = {1 - 3 I, 2}}, MatrixRank[{m.v, v}] == 1] gives True. Also, why are you inputting eigenvectors in a format like {{1 - 3 I}, {2}}? Have you already seen this? $\endgroup$ Commented Apr 23, 2020 at 18:41
  • $\begingroup$ I'm using ctr+Enter and ctr+, to make a matrix within Mathematica, and passing the values that way. I haven't seen the link you sent, thank you for the info. $\endgroup$ Commented Apr 23, 2020 at 23:45
2
$\begingroup$
MemberQ[myeigens = Normalize /@ Eigenvectors[h], Normalize[y]] || MemberQ[myeigens, -Normalize[y]] 

(* False *)

$\endgroup$
8
  • 2
    $\begingroup$ And what if you have eigenvalues with multiplicities? $\endgroup$ Commented Nov 27, 2017 at 19:30
  • $\begingroup$ Won't you find the match in that case? $\endgroup$ Commented Nov 27, 2017 at 19:45
  • $\begingroup$ Not necessarily. The eigenspace to an eigenvalue with higher multiplicity may have dimension greater than 1 and Eigenvectors just picks a basis. So, it will likely happen that neither Normalize[y] nor -Normalize[y] coincides with one of the basis vectors. $\endgroup$ Commented Nov 27, 2017 at 19:50
  • 1
    $\begingroup$ Here, a counter example: MemberQ[Eigenvectors[DiagonalMatrix[{1, 1}]], Normalize[{1, -1}] || -Normalize[{1, -1}]] $\endgroup$ Commented Nov 27, 2017 at 19:53
  • $\begingroup$ @HenrikSchumacher: But the eigenvectors of DiagonalMatrix[{1,1}] are {1,0} and {0,1}, so of course your code will yield False. $\endgroup$ Commented Nov 27, 2017 at 22:22
2
$\begingroup$

How about this:

eigenVectorQ[mat_, vec_] := Abs[Dot[#1\[Conjugate], #2]] == Norm[#1] Norm[#2] &[mat.vec, vec] 

Then eigenVectorQ[h, y] returns False.
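The idea here is equality in the Cauchy-Schwarz inequality: $|\langle m.v, v\rangle| = \|m.v\|\,\|v\|$ holds exactly when the two vectors are parallel. A hypothetical pure-Python rendering of the same test (illustrative names only):

```python
import math

# Equality in Cauchy-Schwarz, |<m.v, v>| == ||m.v|| ||v||, holds iff
# m.v and v are parallel, i.e. iff v is an eigenvector of m.

def matvec(m, v):
    return [sum(mij * vj for mij, vj in zip(row, v)) for row in m]

def eigenvector_q(m, v, tol=1e-9):
    mv = matvec(m, v)
    inner = abs(sum(a.conjugate() * b for a, b in zip(mv, v)))
    norms = math.sqrt(sum(abs(a) ** 2 for a in mv)) * \
            math.sqrt(sum(abs(b) ** 2 for b in v))
    return abs(inner - norms) <= tol * max(norms, 1.0)

h = [[2, 1j, 0], [-1j, 1, 1], [0, 1, 0]]
y = [1j, 7j, -2]
print(eigenvector_q(h, y))  # False
```

Note that with floating-point input a relative tolerance is needed, since exact equality of the two sides cannot be expected.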

$\endgroup$
