
I know this is a trivial problem, but I am stuck on it.

Let $A \in \mathbb{R}^{m \times m}$ be a symmetric positive definite matrix and $C \in \mathbb{R}^{m \times p}$.

How do I show that the matrix $$B = \begin{bmatrix} A & C \\ C^T & 0 \end{bmatrix}$$ is indefinite?

I can show that there exists a vector $v$ such that $v^TBv > 0$. (This is trivial. For instance, $v = [x, 0]^T$ where $x \in \mathbb{R}^{m}$ is nonzero and $0 \in \mathbb{R}^{p}$.)

Now, how do I find a vector $v$ such that $v^TBv < 0$?

I do not need the answer. A clue/hint is welcome.

You can assume that $p \leq m$ and that the matrix $C$ has full column rank, that is to say, all the constraints are linearly independent.
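For intuition only (not a proof), a quick numerical check along these lines is consistent with the claim; the dimensions and the random $A$ and $C$ below are just illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
m, p = 5, 3

# A random symmetric positive definite A and a (generically) full-rank C.
M = rng.standard_normal((m, m))
A = M @ M.T + m * np.eye(m)
C = rng.standard_normal((m, p))

# Saddle-point matrix B = [[A, C], [C^T, 0]].
B = np.block([[A, C], [C.T, np.zeros((p, p))]])

# B is symmetric, so its eigenvalues are real; with A positive definite and
# C of full column rank, one observes m positive and p negative eigenvalues.
print(np.linalg.eigvalsh(B))
```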

Thanks

3 Answers


Consider $$z = \begin{bmatrix}x\\ y\end{bmatrix}$$ with $x \in \mathbb{R}^m$ and $y \in \mathbb{R}^p$.

Then $$f(z) = z^TBz = x^TAx + 2x^TCy,$$ which contains a bilinear term that is linear as a function of $y$. Hence, given an $x$, an appropriate $y$ can be chosen such that $f(z) < 0$.
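A minimal numerical sketch of this choice (assuming the generic case $C^Tx \neq 0$; the random matrices and the particular scaling $t$ are only illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
m, p = 5, 3
M = rng.standard_normal((m, m))
A = M @ M.T + m * np.eye(m)              # symmetric positive definite
C = rng.standard_normal((m, p))          # nonzero, full column rank (generically)
B = np.block([[A, C], [C.T, np.zeros((p, p))]])

x = rng.standard_normal(m)
g = C.T @ x                              # generically nonzero when C != 0
t = (x @ A @ x) / (g @ g)                # any t > (x^T A x) / (2 ||C^T x||^2) works
y = -t * g                               # play y against the bilinear term

z = np.concatenate([x, y])
print(z @ B @ z)                         # x^T A x - 2 t ||C^T x||^2 = -x^T A x < 0
```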

  • This does not work with, e.g., $C=0$. You still have to prove that there is such a $y$. (2015-11-12)

Hint for a less convoluted argument: Unless $C$ is zero (in which case the statement is false), there are vectors $x$ and $y$ such that $x^T C y$ is a nonzero $1\times 1$ matrix. Now consider $v=[x,ky]^T$ for large $k$ (positive or negative, depending on the sign of $x^T C y$).
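A small numerical sketch of this scaling argument (the random matrices are illustrative, and $x$, $y$ are assumed to give a nonzero $x^TCy$):

```python
import numpy as np

rng = np.random.default_rng(2)
m, p = 4, 2
M = rng.standard_normal((m, m))
A = M @ M.T + m * np.eye(m)                      # symmetric positive definite
C = rng.standard_normal((m, p))                  # nonzero
B = np.block([[A, C], [C.T, np.zeros((p, p))]])

x = rng.standard_normal(m)
y = rng.standard_normal(p)
s = x @ C @ y                                    # generically nonzero

# v = [x, k y]^T: v^T B v = x^T A x + 2 k (x^T C y) is linear in k,
# so a large k with sign opposite to x^T C y drives it negative.
for k in (1.0, 10.0, 100.0, 1000.0, 10000.0):
    v = np.concatenate([x, -np.sign(s) * k * y])
    print(k, v @ B @ v)                          # decreases in k; negative once k is large enough
```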


OK, so I figured it out, but I think my argument is really convoluted. I would appreciate it if someone could come up with a simpler argument.

As assumed in the question, the matrix $C$ has full rank $p$.

First let $B = \begin{bmatrix} A & C \\ C^T & 0 \end{bmatrix}$, where $A \in \mathbb{R}^{m \times m}$ and $C \in \mathbb{R}^{m \times p}$.

The first observation is that all the eigenvalues of $B$ are real, since $B$ is symmetric. Moreover, $B$ has an (orthogonal) eigenvalue decomposition.

Clearly, $\exists x$ such that $x^TBx > 0$. For instance, take $x = [x_1, 0]^T$ where $x_1 \in \mathbb{R}^{m}$, $x_1 \neq 0$, and $0 \in \mathbb{R}^{p}$. Then $x^TBx = x_1^TAx_1 > 0$ since $A$ is positive definite.

Now, if we consider $x=[0,x_2]^T$ where $x_2 \in \mathbb{R}^{p}$, $x_2 \neq 0$, and $0 \in \mathbb{R}^{m}$, then $x^TBx = 0$.

So $B$ is not positive definite. Since $B$ is symmetric, it has an eigenvalue decomposition, and hence $\exists \lambda$ such that $\lambda \leq 0$.
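A quick numerical illustration of these two test vectors (the random $A$ and $C$ are illustrative only):

```python
import numpy as np

rng = np.random.default_rng(3)
m, p = 4, 2
M = rng.standard_normal((m, m))
A = M @ M.T + m * np.eye(m)                  # symmetric positive definite
C = rng.standard_normal((m, p))
B = np.block([[A, C], [C.T, np.zeros((p, p))]])

x1 = rng.standard_normal(m)                  # any nonzero x1
x2 = rng.standard_normal(p)                  # any nonzero x2
v_pos  = np.concatenate([x1, np.zeros(p)])   # [x1, 0]: quadratic form x1^T A x1 > 0
v_zero = np.concatenate([np.zeros(m), x2])   # [0, x2]: quadratic form exactly 0
print(v_pos @ B @ v_pos, v_zero @ B @ v_zero)
```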

Now, we need to prove that $\exists \lambda$ such that $\lambda < 0$.

Now, we recognize that the matrix $B$ is full-rank.

This can be seen from the fact that the null space of the matrix $B$ is trivial.

Let $ z = \begin{bmatrix} x \\ y \end{bmatrix}$ such that $Bz = 0$.

Then we have $Ax + Cy = 0$ and $C^Tx = 0$. So $x = -A^{-1}Cy$, and substituting this into the second equation gives $C^TA^{-1}Cy = 0$.

The matrix $C^TA^{-1}C$ is positive definite. This is because $C$ is a full-rank, tall (skinny) matrix: whenever $w \neq 0$, $w^* = Cw \neq 0$, and hence $w^TC^TA^{-1}Cw = {w^*}^TA^{-1}w^* > 0$ (since $A^{-1}$ is also positive definite).

Hence, $y=0$ is the only possible solution which implies $x=0$ and hence $z=0$ is the only vector in the null space.

So the matrix $B$ is of full-rank.
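A small numerical check of this step (with illustrative random $A$ and $C$, where $C$ has full column rank):

```python
import numpy as np

rng = np.random.default_rng(4)
m, p = 5, 3
M = rng.standard_normal((m, m))
A = M @ M.T + m * np.eye(m)                  # symmetric positive definite
C = rng.standard_normal((m, p))              # full column rank (generically)
B = np.block([[A, C], [C.T, np.zeros((p, p))]])

S = C.T @ np.linalg.solve(A, C)              # C^T A^{-1} C
print(np.linalg.eigvalsh(S))                 # all positive: C^T A^{-1} C is positive definite
print(np.linalg.matrix_rank(B), m + p)       # rank(B) = m + p, so no zero eigenvalue
```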

Hence, none of the eigenvalues are zero i.e. $\forall \lambda$, $\lambda \neq 0$.

Also, from previous argument, we have that $\exists \lambda$ such that $\lambda \leq 0$.

Hence, we can conclude that $\exists \lambda$ such that $\lambda < 0$. So if I choose an eigenvector $v$ corresponding to this $\lambda$, then I have $v^TBv = \lambda ||v||_2^2 < 0$.

Hence, the matrix $B$ is indefinite.
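Finally, a numerical sanity check of this conclusion (again with illustrative random data):

```python
import numpy as np

rng = np.random.default_rng(5)
m, p = 5, 3
M = rng.standard_normal((m, m))
A = M @ M.T + m * np.eye(m)                  # symmetric positive definite
C = rng.standard_normal((m, p))              # full column rank (generically)
B = np.block([[A, C], [C.T, np.zeros((p, p))]])

lam, V = np.linalg.eigh(B)                   # ascending eigenvalues, orthonormal eigenvectors
v = V[:, 0]                                  # eigenvector for the smallest (negative) eigenvalue
print(lam[0], v @ B @ v)                     # v^T B v = lambda * ||v||^2 = lambda < 0
```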

  • You should specify that $x_1\neq \mathbf{0}$. Also, you should change the name of your *second* $x_1$, lest it be confused with the first one! Immediately after, you say "$B$ is not positive definite. Since $B$ is symmetric positive definite..." but you just said it was *not* positive definite. (2010-11-28)
  • @Arturo: Sorry, this is what happens when I write a long answer. I meant that the matrix $B$ is just symmetric. I have changed the variables $x_1$ and $x_2$ and have mentioned that both have to be non-zero. (2010-11-28)
  • I was just providing some feedback, not criticising. I don't see anything wrong with this argument. The only reason you may see it as "convoluted" is that you end up having to do a bit of a lemma in the middle where you argue that $B$ is of full rank. But I think this is easy given the fact that $A$ and $C$ are both of full rank: if some column were a linear combination of the previous ones, then it would be a column of $C$ with $0$s underneath; since $C$ is full rank, the expression involves columns "of $A$"; but that would imply that some *rows* of $C$ are linearly dependent. (2010-11-28)
  • @Arturo: I am taking your comments in the right spirit. In fact, I am actually happy that a Prof is taking the time to find and correct my mistakes. Sorry if my previous comment sounded rude or gave the impression that I felt you were criticizing. Your comments and feedback are always welcome. (2010-11-28)