
Consider the matrix $A \in \mathbb{R}^{n \times n}$ of all ones. Because it has only one linearly independent column, $A$ has rank one, so there are $n-1$ zero eigenvalues and one nonzero eigenvalue, which is $n$ (the eigenvalues sum to the trace of $A$, which is $n$).

So one eigenvector, $u_1$, can be determined by inspection from the definition of an eigenvector: \begin{align*} Au_1 &= nu_1 \\ \therefore\qquad u_1 &= 1_n, \end{align*} where $1_n$ denotes the vector of all ones.

Since $A$ is symmetric, eigenvectors corresponding to distinct eigenvalues are orthogonal. So all $n-1$ eigenvectors corresponding to the eigenvalue zero are orthogonal to $1_n$ (equivalently, they must all have zero mean).

My question is: how do we succinctly represent these $n-1$ orthogonal vectors? I know that these $n-1$ eigenvectors are linearly independent; I just don't know how to represent them properly and succinctly.

I think we can pick

\begin{align*} u_2 &= \begin{bmatrix}1 & -1 & 0 & \dots & 0\end{bmatrix}^T\\ u_3 &= \begin{bmatrix}1 & 0 & -1 & 0 & \dots & 0\end{bmatrix}^T\\ &\ \ \vdots\\ u_n &= \begin{bmatrix}1 & 0 & \dots & 0 & -1\end{bmatrix}^T \end{align*}

All of these are linearly independent; I just don't know how to represent them in a formal way.
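As a quick numerical check of the claims so far (an editor's sketch in numpy; the size $n = 5$ is an arbitrary choice), these vectors are indeed null vectors of $A$ and linearly independent, though not mutually orthogonal:

```python
import numpy as np

n = 5
A = np.ones((n, n))

# u_1 = 1_n is an eigenvector with eigenvalue n
u1 = np.ones(n)
assert np.allclose(A @ u1, n * u1)

# Candidate basis u_i = e_1 - e_i for i = 2, ..., n (as columns of U)
I = np.eye(n)
U = np.column_stack([I[:, 0] - I[:, i] for i in range(1, n)])

assert np.allclose(A @ U, 0)              # each column lies in the nullspace of A
assert np.linalg.matrix_rank(U) == n - 1  # the columns are linearly independent
print(U.T @ U)  # Gram matrix: 2 on the diagonal, 1 off it -- not orthogonal
```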

  • I mean... yes, that is a valid choice for the eigenbasis. If you insist on a cleaner way to write these, $u_i = e_1 - e_i$ for each $i \in \{2,\dots,n\}$ should be fine, where $e_i$ is the canonical "standard basis" vector with $0$'s everywhere except in the $i$'th position, where it has a $1$. Commented Aug 13, 2020 at 16:07
  • Note that your set $u_2,\dots, u_n$ fails to be orthogonal. Commented Aug 13, 2020 at 16:30
  • This post is related, but doesn't quite answer your question. Note that if we can use complex numbers, the columns of the DFT matrix give us a nice orthonormal eigenbasis. Commented Aug 13, 2020 at 16:34
  • @BenGrossmann Is it possible to choose $u_2, \ldots, u_n$ so that they are orthogonal? Commented Aug 13, 2020 at 16:51
  • @dd22205 Yes, I explain one way to do so in my answer. An alternative (more typical) approach is to apply the Gram–Schmidt process to the basis that you came up with. Commented Aug 13, 2020 at 16:52

2 Answers


First, the case where $n$ is odd, with $n = 2k + 1$: let $\theta = 2 \pi /n$. Taking the real and imaginary parts of the columns of the DFT matrix gives us the following nice orthogonal basis for $u_1^\perp$: \begin{align*} c_1 &= [1\ \ \cos \theta \ \ \cdots \ \ \cos ((n-1)\theta)]\\ c_2 &= [1\ \ \cos (2\theta) \ \ \cdots \ \ \cos (2(n-1)\theta)]\\ &\ \ \vdots \\ c_{k} &= [1\ \ \cos (k\theta) \ \ \cdots \ \ \cos (k(n-1)\theta)]\\ s_1 &= [0\ \ \sin \theta \ \ \cdots \ \ \sin ((n-1)\theta)]\\ s_2 &= [0\ \ \sin (2\theta) \ \ \cdots \ \ \sin (2(n-1)\theta)]\\ &\ \ \vdots \\ s_{k} &= [0\ \ \sin (k\theta) \ \ \cdots \ \ \sin (k(n-1)\theta)] \end{align*} (note that each $s_m$ begins with $\sin 0 = 0$). In the case that $n$ is even, with $n = 2k$, we do essentially the same thing with $c_1,\dots,c_{k-1}$ and $s_1,\dots,s_{k-1}$, but also include the alternating vector $[-1,1,-1,\dots,1]$.
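As a numerical sanity check of the odd case (an editor's sketch in numpy; $n = 7$, i.e. $k = 3$, is an arbitrary choice), the vectors above are pairwise orthogonal, orthogonal to $1_n$, and lie in the $0$-eigenspace of $A$:

```python
import numpy as np

n = 7                     # odd case: n = 2k + 1
k = (n - 1) // 2
theta = 2 * np.pi / n
j = np.arange(n)          # coordinate indices 0, ..., n-1

# Real and imaginary parts of DFT columns 1 through k
cols = [np.cos(m * j * theta) for m in range(1, k + 1)] + \
       [np.sin(m * j * theta) for m in range(1, k + 1)]
B = np.column_stack(cols)  # columns are c_1, ..., c_k, s_1, ..., s_k

A = np.ones((n, n))
assert np.allclose(A @ B, 0)              # all lie in the 0-eigenspace of A
assert np.allclose(B.T @ np.ones(n), 0)   # all orthogonal to 1_n
G = B.T @ B
assert np.allclose(G, np.diag(np.diagonal(G)))  # mutually orthogonal
```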

  • Ah, interesting. Is there some implicit requirement that the eigenvectors be represented as an orthogonal/orthonormal basis? What is the motivation for doing that? Commented Aug 13, 2020 at 16:53
  • I suggested this as an alternative to the very simple Hadamard basis, which, alas, is possible only for rather specific values of the dimension $n$. Commented Aug 13, 2020 at 16:55
  • Well, doing this is always possible as a consequence of the spectral theorem. The end result of finding this orthonormal eigenbasis is that we end up with an "orthogonal diagonalization" $A = UDU^T$, where $U$ is the orthogonal matrix whose columns are the orthonormal eigenbasis. Just as diagonalizing $A$ allows us to easily find $p(A)$ for polynomials and analytic functions $p$, an orthogonal diagonalization allows us to do the same thing with functions of $A$ and $A^T$. Commented Aug 13, 2020 at 16:56

The other eigenvectors are found from $$Ax = 0.$$

Solving this equation, we see that the solutions form the hyperplane through the origin

$$x_1 + x_2 + \cdots + x_n = 0.$$

Here is one example of a set of eigenvectors:

$$\left\{ \pmatrix{\phantom{-}1\\-1\\\phantom{-}0 \\\phantom{-}0\\ \phantom{-}\vdots\\ \phantom{-}0 \\ \phantom{-}0\\\phantom{-}0}, \pmatrix{\phantom{-}0 \\ \phantom{-}1\\-1\\\phantom{-}0 \\ \phantom{-}\vdots\\ \phantom{-}0 \\ \phantom{-}0\\\phantom{-}0}, \pmatrix{\phantom{-}0 \\\phantom{-}0 \\ \phantom{-}1\\-1\\ \phantom{-}\vdots\\ \phantom{-}0 \\\phantom{-}0\\ \phantom{-}0}, \cdots, \pmatrix{\phantom{-}0 \\\phantom{-}0 \\ \phantom{-}0\\\phantom{-}0\\ \phantom{-}\vdots\\ \phantom{-}1 \\- 1\\\phantom{-}0}, \pmatrix{\phantom{-}0 \\ \phantom{-}0 \\ \phantom{-}0\\ \phantom{-}0\\ \phantom{-}\vdots\\ \phantom{-}0\\ \phantom{-}1 \\ -1} \right\}$$

These vectors are linearly independent but not orthogonal; still, it is not difficult to find an orthonormal set. One way is to apply the Gram–Schmidt process, as in the sketch below.
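Here is a minimal numerical sketch of that step (assuming numpy; $n = 6$ is an arbitrary size, and the reduced QR factorization is used as a numerical stand-in for explicit Gram–Schmidt, since its $Q$ factor orthonormalizes the columns):

```python
import numpy as np

n = 6
A = np.ones((n, n))

# The difference vectors above: column i has a 1 in row i and a -1 in row i+1
V = np.zeros((n, n - 1))
for i in range(n - 1):
    V[i, i] = 1.0
    V[i + 1, i] = -1.0

# Reduced QR factorization orthonormalizes the columns
# (equivalent in effect to running Gram-Schmidt on them)
Q, _ = np.linalg.qr(V)

assert np.allclose(A @ Q, 0)                # columns still solve Ax = 0
assert np.allclose(Q.T @ Q, np.eye(n - 1))  # columns are orthonormal
assert np.allclose(Q.T @ np.ones(n), 0)     # and orthogonal to 1_n
```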

  • Yes, but they do not yet constitute an orthonormal basis... Commented Aug 13, 2020 at 16:50
  • @JeanMarie Do they need to constitute an orthonormal basis? Commented Aug 13, 2020 at 16:52
  • It appears to be one of the requirements of the question (though not in its final sentences). Commented Aug 13, 2020 at 16:53
  • Okay, I missed that in the problem statement. Nice point! ... One way is to take these vectors and apply the Gram–Schmidt process. Actually, I will think about it some more; it is probably not difficult for us to come up with something: $(1,-1,0,0,\dots,0)^T$, $(1,-1,1,-1,0,\dots,0)^T$, etc. Commented Aug 13, 2020 at 16:54
  • Even the questioner didn't realize we were asked for an orthonormal set of vectors! Commented Aug 13, 2020 at 16:55
