
I asked this question on MathOverflow, but it was deemed too simple, so I'm posting it here instead --

Is there a nice way to characterize an orthonormal basis of eigenvectors of the following $d\times d$ matrix?

$$\mathbf{I}-\frac{1}{d} \mathbf{v}\mathbf{v}'$$

where $\mathbf{v}$ is a $d\times 1$ vector of ones. This is similar to a Householder matrix, except that $\mathbf{v}$ is not normalized. One eigenvector is $\mathbf{v}$ itself, with eigenvalue $0$; the remaining eigenvalues are all $1$. I'm looking for an expression in terms of the unknown $d$.

Motivation: this is (up to scale) the covariance matrix of the uniform multinomial distribution, so an expression for an orthonormal basis gives a linear transformation that makes the variables uncorrelated for large $n$.
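To make the motivation concrete, here is a small NumPy check (my addition, not part of the original question) that the standard multinomial covariance $n(\operatorname{diag}(p) - pp^t)$ reduces to a multiple of the matrix above when $p$ is uniform:

```python
import numpy as np

d, n = 5, 1000
p = np.full(d, 1.0 / d)  # uniform multinomial probabilities

# covariance of Multinomial(n, p) is n (diag(p) - p p^t)
cov = n * (np.diag(p) - np.outer(p, p))

# for uniform p this equals (n / d) (I - (1/d) v v^t)
v = np.ones(d)
P = np.eye(d) - np.outer(v, v) / d
assert np.allclose(cov, (n / d) * P)
```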

Example: below are the 5 orthonormal eigenvectors I get from Gram-Schmidt for $d=5$... what is the expression for general $d$? An even bigger example: the columns of this form an orthonormal basis for $d=20$.

$$-\frac{1}{\sqrt{2}},0,0,0,\frac{1}{\sqrt{2}}$$

$$-\frac{1}{\sqrt{6}},0,0,\sqrt{\frac{2}{3}},-\frac{1}{\sqrt{6}}$$

$$-\frac{1}{2 \sqrt{3}},0,\frac{\sqrt{3}}{2},-\frac{1}{2 \sqrt{3}},-\frac{1}{2 \sqrt{3}}$$

$$-\frac{1}{2 \sqrt{5}},\frac{2}{\sqrt{5}},-\frac{1}{2 \sqrt{5}},-\frac{1}{2 \sqrt{5}},-\frac{1}{2 \sqrt{5}}$$

$$\frac{1}{\sqrt{5}},\frac{1}{\sqrt{5}},\frac{1}{\sqrt{5}},\frac{1}{\sqrt{5}},\frac{1}{\sqrt{5}}$$

Update 09/08: I came across another interesting characterization. When $d=2^k$ for some $k$, the Walsh functions form an orthogonal basis for this matrix. In particular, let $\{\mathbf{x_i}\}$ be the list of $k$-bit binary expansions of the integers $0$ to $d-1$, i.e. $\{(0,0,0),(0,0,1),(0,1,0),\dots\}$. Then the rows (and columns) of $M$ form an orthogonal basis of eigenvectors of the matrix in question (orthonormal after dividing by $\sqrt{d}$), where

$$M_{ij}=(-1)^{x_i \cdot x_j}$$
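A quick numerical sanity check of this claim for $d=8$ (a NumPy sketch; the variable names are mine):

```python
import numpy as np

d = 8  # must be a power of two for this construction
k = d.bit_length() - 1  # d = 2^k

# binary expansions of 0..d-1 as k-bit vectors
x = np.array([[(i >> b) & 1 for b in range(k)] for i in range(d)])

# M[i, j] = (-1)^(x_i . x_j) -- a Sylvester/Hadamard-type matrix
M = (-1.0) ** (x @ x.T)

# P = I - (1/d) v v^t with v = (1, ..., 1)
v = np.ones(d)
P = np.eye(d) - np.outer(v, v) / d

# columns of M are mutually orthogonal: M^t M = d I
assert np.allclose(M.T @ M, d * np.eye(d))

# each normalized column is an eigenvector of P; the all-ones column
# (from x_0 = 0) has eigenvalue 0, the rest have eigenvalue 1
Q = M / np.sqrt(d)
assert np.allclose(P @ Q, Q @ np.diag(np.append([0.0], np.ones(d - 1))))
```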

  • hm... formatting seems broken for arrays... is there a standard way to do matrices? (I tried \begin{matrix} and \begin{array}) (2010-09-07)
  • You need to terminate each row with four backslashes instead of the usual two. (2010-09-07)

4 Answers


If you write $P= I - \frac{1}{d}vv^t$ (I assume your $v'$ means the transpose of $v$), then you certainly have a symmetric matrix

$$ P^t = \left( I- \frac{1}{d}vv^t\right)^t = I -\frac{1}{d}v^{tt}v^t = I - \frac{1}{d}vv^t = P \ . $$

So $P$ is diagonalizable and has an orthonormal basis of eigenvectors. That is, there exists an orthogonal matrix $S$ (with $S^{-1} = S^t$) and a diagonal matrix $D$ such that $S^tPS = D$.

Moreover, $P$ is idempotent:

\begin{align} P^2 &= \left( I- \frac{1}{d}vv^t\right) \left( I- \frac{1}{d}vv^t\right) \\ &= I - 2\frac{1}{d}vv^t + \frac{1}{d^2}vv^tvv^t \\ &= I - 2\frac{1}{d}vv^t + \frac{1}{d}vv^t \\ &=I- \frac{1}{d}vv^t = P \ , \end{align}

using $v^tv = d$. Since $P^2 = P$ and $P \neq 0,I$, the minimal polynomial of $P$ is $x^2-x = x(x-1)$, so the only eigenvalues of $P$ are $0$ and $1$.
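Both facts are easy to confirm numerically; a short NumPy sketch (my addition) checks symmetry, idempotence, and the spectrum for one choice of $d$:

```python
import numpy as np

d = 7  # any dimension works here
v = np.ones(d)
P = np.eye(d) - np.outer(v, v) / d

# P is symmetric and idempotent, so it is an orthogonal projection
assert np.allclose(P, P.T)
assert np.allclose(P @ P, P)

# its eigenvalues are therefore 0 (once) and 1 (d - 1 times)
eigvals = np.sort(np.linalg.eigvalsh(P))
assert np.allclose(eigvals, [0.0] + [1.0] * (d - 1))
```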

As for eigenvectors, as you say, $v$ is one of them, of $0$ eigenvalue:

$$ Pv = \left( I- \frac{1}{d}vv^t\right)v = v - \frac{1}{d}vv^tv = v - v = 0 \ . $$

Hence, you can take $\frac{v}{\|v\|} = \frac{1}{\sqrt{d}} (1, \dots ,1)$ as the first vector of your orthonormal basis. The other vectors are an orthonormal basis of the orthogonal complement of $v$, $[v]^\bot$:

\begin{align} \left( I- \frac{1}{d}vv^t\right)w = w &\quad\Longleftrightarrow \quad w - \frac{1}{d}vv^tw = w \\ &\quad\Longleftrightarrow \quad vv^tw = 0 \\ &\quad\Longleftrightarrow \quad v^tw = 0 \end{align}

So, you only need to compute a basis for $[v]^\bot$ and orthonormalize it. In coordinates you must find the solutions of the linear equation

$$ x_1 + \dots + x_d = 0 \ . $$

For instance,

$$ (-1, 1, 0, \dots ,0), (-1, 0, 1, 0, \dots ,0), \dots , (-1, 0, \dots , 0,1) \ . $$

And now you apply the Gram-Schmidt process http://en.wikipedia.org/wiki/Gram_schmidt to these vectors.
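In practice the Gram-Schmidt step can be delegated to a QR factorization, which orthonormalizes the columns (up to signs, which may differ from a hand computation). A NumPy sketch of this, with my naming, applied to the spanning set above:

```python
import numpy as np

d = 5
# the spanning set (-1, 1, 0, ..., 0), ..., (-1, 0, ..., 0, 1) of [v]^perp
B = np.column_stack([-np.eye(d)[0] + np.eye(d)[k] for k in range(1, d)])

# QR factorization performs Gram-Schmidt on the columns of B
Q, _ = np.linalg.qr(B)  # Q is d x (d-1) with orthonormal columns

v = np.ones(d)
assert np.allclose(Q.T @ Q, np.eye(d - 1))  # orthonormal
assert np.allclose(v @ Q, 0)                # all orthogonal to v
```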


EDIT. I think unkz's intuition is right. Before normalizing, the general pattern looks as follows. You have $d-1$ vectors of the form

$$ (-1, 0,\dots,0, i, -1,\dots, -1), \qquad \text{for} \quad i=1,\dots,d-1 \ , $$

where $i$ is in the $(d-i+1)$-th coordinate (so there are $i-1$ coordinates equal to $-1$ after it), and one more last vector

$$ (d,\dots, d) \ . $$

Let's check that:

(a) Those first $d-1$ vectors belong to $\ker (P-I) = [(1,\dots, 1)]^\bot$ (so they are eigenvectors corresponding to the eigenvalue $1$):

$$ (1,\dots, 1)\cdot (-1, 0,\dots,0, i, -1,\dots, -1) = -1 + i + (i-1)(-1)= 0 \ . $$

(b) The first $d-1$ vectors are mutually orthogonal. We may assume $i>j$; then

$$ (-1, 0,\dots,0, i, -1,\dots, -1) \cdot (-1, 0,\dots,0, j, -1,\dots, -1) = 1 - j + (j-1)(-1)^2 = 0 \ , $$

since the entry $j$ meets a $-1$ of the first vector, the entry $i$ meets a $0$ of the second, and the $j-1$ trailing $-1$'s overlap.

Now, you divide them by their norms

$$ \sqrt{1 + i^2 + (i-1)} = \sqrt{i^2 + i} = \sqrt{i (i+1)} \qquad \text{for} \quad i= 1, \dots, d-1 $$

and

$$ \sqrt{d^2 \cdot d} = d\sqrt{d} \ , $$

and you are done.
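Putting the pattern together, here is a NumPy sketch (the function name is mine) that builds the normalized basis for an arbitrary $d$ and verifies it against $P$:

```python
import numpy as np

def helmert_like_basis(d):
    """Orthonormal eigenvector basis of P = I - (1/d) v v^t, following the
    pattern above: u_i = (-1, 0, ..., 0, i, -1, ..., -1) / sqrt(i (i + 1))."""
    cols = []
    for i in range(1, d):
        u = np.zeros(d)
        u[0] = -1.0
        u[d - i] = i          # coordinate d - i + 1 in 1-based indexing
        u[d - i + 1:] = -1.0  # the i - 1 trailing -1's
        cols.append(u / np.sqrt(i * (i + 1)))
    cols.append(np.full(d, d) / (d * np.sqrt(d)))  # (d, ..., d) / sqrt(d^2 d)
    return np.column_stack(cols)

d = 9
S = helmert_like_basis(d)
P = np.eye(d) - np.ones((d, d)) / d

assert np.allclose(S.T @ S, np.eye(d))  # orthonormal for general d
# the first d - 1 columns have eigenvalue 1, the last has eigenvalue 0
assert np.allclose(P @ S, S @ np.diag([1.0] * (d - 1) + [0.0]))
```

For $d=5$ the first column comes out as $(-1,0,0,0,1)/\sqrt{2}$, matching the first vector listed in the question.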

  • Well... the last part seems to be the hardest, since $d$ is not known. (2010-09-07)

While I haven't taken the time to prove it, if you look at the numbers I think you'll see a very nice pattern if you multiply out all your irrational denominators.

(-1,0,0,0,1) norm=$\sqrt{1 \cdot 2}$

(-1,0,0,2,-1) norm=$\sqrt{2 \cdot 3}$

(-1,0,3,-1,-1) norm=$\sqrt{3 \cdot 4}$

(-1,4,-1,-1,-1) norm=$\sqrt{4 \cdot 5}$

(5,5,5,5,5) norm=$\sqrt{5^2 \cdot 5}$

This appears to work for other choices of $d$ as well: $-1$ down the first column, $1, \dots, d$ down the backwards diagonal, $-1$ below the backwards diagonal, and a constant bottom row (proportional to $(1,1,\dots,1)$).


One way is to find the Householder matrix $Q$ that maps $v$ to a multiple of $e_1$ (the first coordinate basis vector). Then (since $Q$ is symmetric and orthogonal) $w_2 = Qe_2,\ \dots,\ w_d = Qe_d$ will be an orthonormal basis of the orthogonal complement of $v$. Explicitly, I get $w_k = e_k - u$, where $u = (v + \sqrt{d}\,e_1)/(d + \sqrt{d})$.
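A NumPy sketch of this construction (variable names are mine), checking that the resulting vectors are indeed an orthonormal basis of $[v]^\bot$:

```python
import numpy as np

d = 6
v = np.ones(d)
e1 = np.eye(d)[0]

# u as given: u = (v + sqrt(d) e_1) / (d + sqrt(d))
u = (v + np.sqrt(d) * e1) / (d + np.sqrt(d))

# w_k = e_k - u for k = 2, ..., d, stacked as columns
W = np.column_stack([np.eye(d)[k] - u for k in range(1, d)])

assert np.allclose(W.T @ W, np.eye(d - 1))  # orthonormal columns
assert np.allclose(v @ W, 0)                # all orthogonal to v
```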


a slight retouching of unkz's idea is the following orthogonal basis for $v^\perp$:

$$u_1 = (1,-1,0,\dots,0),\ u_2 = (1,1,-2,0,\dots,0),\ \dots, \ u_{d-1} = (1,\dots,1,-d+1)$$

in this form it is pretty easy to see that the $\{u_i\}$ are in $v^\perp$ and are mutually orthogonal.
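a quick NumPy check of this basis (my addition, with my variable names): the norms come out as $\|u_i\| = \sqrt{i(i+1)}$, so normalizing is itself a closed-form expression:

```python
import numpy as np

d = 6
# u_i = (1, ..., 1, -i, 0, ..., 0) with i leading ones, then -i
U = np.column_stack([
    np.concatenate([np.ones(i), [-i], np.zeros(d - i - 1)])
    for i in range(1, d)
])

# ||u_i|| = sqrt(i (i + 1)): i ones plus (-i)^2 gives i + i^2
norms = np.sqrt([i * (i + 1) for i in range(1, d)])
Q = U / norms

v = np.ones(d)
assert np.allclose(Q.T @ Q, np.eye(d - 1))  # orthonormal
assert np.allclose(v @ Q, 0)                # all in v^perp
```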

i don't quite understand the remark that normalizing the $\{u_i\}$ to get an ONB is the hardest part - since $d$ is unknown. what sort of expression "in terms of [the] unknown $d$" would be suitable?

ps: i would post this as a comment - if this @!*&*%#! site would let me.