8

Let's say I have a $2 \times 2$ matrix (actually the structure tensor of a discrete image $I$): $$ \begin{bmatrix} \frac{\partial I}{\partial x}\frac{\partial I}{\partial x} & \frac{\partial I}{\partial x}\frac{\partial I}{\partial y} \\ \frac{\partial I}{\partial y}\frac{\partial I}{\partial x} & \frac{\partial I}{\partial y}\frac{\partial I}{\partial y} \end{bmatrix}$$

It has 2 properties:

  1. Symmetric.
  2. Positive Semidefinite.

Given those properties, what would be the easiest method to numerically compute its eigenvalues and (orthogonal) eigenvectors?

  • 0
    You sometimes need extra backslashes, which has something to do with Markdown. I put in the third backslash to fix the matrix.2010-11-02
  • 0
Jonas, Thank You. Noted to myself.2010-11-02
  • 0
    @Jonas, Since I can't edit the message could you fix a mistake I made? It shouldn't be the second derivative. It should be a multiplication of the first derivatives: Ix * Ix, Ix * Iy, Iy * Ix, Iy * Iy. Thanks.2010-11-03
  • 0
    You should be able to edit all of your own questions and answers. Look to the bottom left of your answer, and click where it says "edit" (if you're logged in).2010-11-03
  • 0
    I assumed it is like comments. Fixed it. Thank you.2010-11-03

3 Answers

9

Since the matrix is two-by-two, applying Jacobi's method in fact gives the answer at once!

It is known that given a two-by-two symmetric matrix $\mathbf A$, one can construct an orthogonal matrix $\mathbf V=\bigl(\begin{smallmatrix}c&s\\-s&c\end{smallmatrix}\bigr)$ such that $\mathbf V^\top \mathbf A\mathbf V$ is diagonal, where the two numbers $c$ and $s$ satisfy $c^2+s^2=1$.

If the off-diagonal elements are not zero (why?), computing $c$ and $s$ can be done like so:

$$\begin{align*} \tau&=\frac{a_{22}-a_{11}}{2a_{12}}\\ t&=\frac{\mathrm{sgn}(\tau)}{|\tau|+\sqrt{1+\tau^2}}\\ c&=\frac1{\sqrt{1+t^2}}\\ s&=ct \end{align*}$$

from which the eigendecomposition easily follows.

(See Golub and Van Loan's excellent book for further details.)


Here is a Mathematica demonstration of Jacobi's method:

a = N[{{3, -1}, {-1, 2}}, 20];

tau = (a[[2, 2]] - a[[1, 1]])/a[[1, 2]]/2;

t = Sign[tau]/(Abs[tau] + Sqrt[1 + tau^2]);

{c, s} = {1, t}/Sqrt[1 + t^2]
{0.85065080835203993218, 0.5257311121191336060}

{l1, l2} = {a[[1, 1]] - t a[[1, 2]], a[[2, 2]] + t a[[1, 2]]}
{3.6180339887498948482, 1.3819660112501051518}

Eigenvalues[a] == {l1, l2}
True

Eigenvectors[a] == {{c, -s}, {s, c}}
True

a.{c, -s} == l1 {c, -s}
True

a.{s, c} == l2 {s, c}
True
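For readers outside Mathematica (the comments mention MATLAB and GPU implementations), here is a rough Python transcription of the same two-by-two Jacobi step; the function name and the zero-off-diagonal guard are my own additions:

```python
import math

def jacobi_2x2(a11, a12, a22):
    """Diagonalize the symmetric matrix [[a11, a12], [a12, a22]].

    Returns (l1, l2, c, s): the rotation V = [[c, s], [-s, c]] satisfies
    V^T A V = diag(l1, l2), so (c, -s) and (s, c) are unit eigenvectors.
    """
    if a12 == 0.0:
        # Already diagonal: the identity rotation does the job.
        return a11, a22, 1.0, 0.0
    tau = (a22 - a11) / (2.0 * a12)
    # Smaller-magnitude root of t^2 + 2*tau*t - 1 = 0; this form avoids
    # cancellation between |tau| and sqrt(1 + tau^2).
    t = math.copysign(1.0, tau) / (abs(tau) + math.sqrt(1.0 + tau * tau))
    c = 1.0 / math.sqrt(1.0 + t * t)
    s = c * t
    # Diagonal entries of V^T A V.
    return a11 - t * a12, a22 + t * a12, c, s
```

On the matrix from the demonstration above, `jacobi_2x2(3.0, -1.0, 2.0)` reproduces the eigenvalues $3.6180\ldots$ and $1.3819\ldots$ and the rotation pair $(c, s)$ shown in the Mathematica output.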
  • 0
    I leave as an exercise the derivation of the following formulae for the diagonal elements of $\mathbf{V}^T\mathbf{A}\mathbf{V}$: $a_{11}-ta_{12}$ and $a_{22}+ta_{12}$2010-11-02
  • 0
    @ J.M., It works! Thank You.2010-11-05
  • 0
@J.M., In the book you referred me to, it says to choose between 2 solutions of `t`. You used a trick there; could you explain it? Because my MATLAB code gets one of the eigenvectors in the wrong direction (multiplied by $-1$). Thank You.2014-10-12
  • 0
@Drazick, "gets one of the eigenvectors in the wrong direction (multiplied by $-1$)." - you are aware that if $\mathbf x$ is an eigenvector of $\mathbf A$, then any nonzero multiple of $\mathbf x$ is also an eigenvector?2015-05-01
6

Since you work with a $2\times2$ matrix, the corresponding characteristic polynomial is quadratic, so the eigenvalues can be expressed in closed form in terms of the matrix elements. Once you have the eigenvalues, all you have to do is solve two systems of linear equations in two unknowns (the coordinates of the corresponding eigenvector in the standard basis). Again, everything is nice and explicit; no need to apply a fancy numerical method.
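A minimal Python sketch of this recipe (the function name is mine): the eigenvalues come from the quadratic $\lambda^2 - \operatorname{tr}(A)\,\lambda + \det(A) = 0$, written in the cancellation-free form $\lambda = m \pm d$ with $m$ the mean of the eigenvalues, and the eigenvector direction is read off one row of $A - \lambda_1 I$:

```python
import math

def symeig_2x2(a, b, c):
    """Eigendecomposition of the symmetric matrix [[a, b], [b, c]].

    Returns (l1, l2, v1, v2) with l1 >= l2 and unit-length,
    mutually orthogonal eigenvectors v1, v2.
    """
    m = 0.5 * (a + c)                 # mean of the two eigenvalues
    d = math.hypot(0.5 * (a - c), b)  # half the eigenvalue gap, >= 0
    l1, l2 = m + d, m - d
    if b != 0.0:
        # Second row of (A - l1*I) v = 0 gives the direction directly.
        x, y = l1 - c, b
    elif a >= c:
        x, y = 1.0, 0.0               # already diagonal: axis vectors
    else:
        x, y = 0.0, 1.0
    n = math.hypot(x, y)
    v1 = (x / n, y / n)
    v2 = (-v1[1], v1[0])              # rotate v1 by 90 degrees
    return l1, l2, v1, v2
```

For the structure tensor of the question, pass $a = I_x I_x$, $b = I_x I_y$, $c = I_y I_y$; the branch on $b = 0$ handles the already-diagonal case where the generic formula would produce a zero vector.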

  • 0
    But beware that the formula for the roots of a quadratic equation needs to be implemented with a little thought in order to avoid cancellation! See http://en.wikipedia.org/wiki/Quadratic_equation#Floating_point_implementation.2010-11-02
  • 0
I want to implement it on a GPU. Hence simplicity is very important, both for performance and because of the lack of libraries. Even solving the linear system isn't "straightforward" (I'm spoiled by Matlab).2010-11-02
  • 3
    @Drazick: solving a two-by-two homogeneous linear system *is* straightforward.2010-11-02
4

Despite the other answers, I thought the impatient might benefit from seeing the explicit answer below.

Let $$M = \left( \begin{array}{cc} a & b \\ b & c \end{array} \right)$$ be the input matrix.

Define the discriminant $\Delta = \sqrt{a^2+4 b^2-2 a c+c^2} = \sqrt{(a-c)^2+4b^2} \ge 0$.

Then, the eigenvalues of $M$ are given by:

$\lambda_1 = 0.5(a+c-\Delta)$ and $\lambda_2 = 0.5(a+c+\Delta)$

Now, you can find a matrix $V$ such that $$M = V^{-1} \begin{pmatrix} \lambda_1 & 0\\ 0 & \lambda_2 \end{pmatrix}V.$$

Mathematica says that the matrix $V$ is given by $$ V = \begin{pmatrix} \frac{a-c-\Delta}{2b} & 1\\ \frac{a-c+\Delta}{2b} & 1 \end{pmatrix} $$

If you are looking for an orthogonal $V$, the above calculations need some changes: the rows of $V$ are already orthogonal (they are eigenvectors of a symmetric matrix for distinct eigenvalues), so it suffices to normalize each row to unit length.
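Concretely, one possible Python version of those changes, assuming $b \neq 0$ (so that $\Delta > 0$ and the two rows really are orthogonal); the function name is mine:

```python
import math

def orthogonal_v(a, b, c):
    """Orthonormal variant of the V above, for [[a, b], [b, c]] with b != 0.

    Returns the two rows of V as unit vectors; since the rows are
    orthonormal, V^{-1} is simply the transpose of V.
    """
    delta = math.sqrt((a - c) ** 2 + 4.0 * b * b)
    rows = []
    for sign in (-1.0, 1.0):  # -delta pairs with lambda_1, +delta with lambda_2
        x = (a - c + sign * delta) / (2.0 * b)
        n = math.hypot(x, 1.0)  # length of the unnormalized row (x, 1)
        rows.append((x / n, 1.0 / n))
    return rows
```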

  • 0
    "Explicit" is great :-). Could you give the explicit solution for Orthogonal V? I don't have access to Mathematica. Thank You!2010-11-03
  • 0
    @Drazick: The solution I gave returns orthogonal eigenvectors.2010-11-04
  • 0
    @Drazick: You can use Maxima instead of Mathematica for symbolic solving.2011-04-01