
Let $A,B \in \mathbb{R}^{n \times n}$ be two positive semi-definite matrices and let $a > 0$ be a constant.

Consider the following maximization problem $$ \max_{x \in \mathbb{R}^n,\ \gamma}\ x^T \left(\frac{\gamma}{a + \gamma} A - \gamma B \right)x \qquad \text{subject to } \gamma > 0,\ \|x\| = 1 $$ and suppose that $\hat x$ and $\hat \gamma$ maximize the expression. In this case, $\hat x$ is an eigenvector corresponding to the largest eigenvalue of the matrix $\frac{\hat \gamma}{a + \hat \gamma} A - \hat \gamma B$.

Now suppose that we perturb the matrix $A$ by another matrix $E$, which is small in some norm (say the operator norm). Consider the perturbed optimization problem $$ \max_{x \in \mathbb{R}^n,\ \gamma}\ x^T \left(\frac{\gamma}{a + \gamma} (A+E) - \gamma B \right)x \qquad \text{subject to } \gamma > 0,\ \|x\| = 1 $$ and denote by $\tilde x$ and $\tilde \gamma$ its maximizers.

The question is: how can the difference $|\hat \gamma - \tilde \gamma|$ be bounded as a function of $E$? Intuitively, if $E$ is small, then $\hat \gamma$ and $\tilde \gamma$ should also be close. Any pointers to the relevant literature would also be appreciated.
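For concreteness, here is a minimal numerical sketch (Python with NumPy/SciPy; the specific matrices, the function names `f` and `best_gamma`, and the search interval for $\gamma$ are only illustrative assumptions, not part of the original problem). It uses the reduction that, for fixed $\gamma$, the inner maximum over $\|x\| = 1$ equals the largest eigenvalue of $M(\gamma) = \frac{\gamma}{a+\gamma}A - \gamma B$, so the problem becomes a one-dimensional maximization over $\gamma$, and the maximizers for $A$ and $A + E$ can be compared directly.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)
n, a = 5, 1.0

def random_psd(n):
    # Gram matrix: symmetric positive semi-definite (almost surely positive definite)
    G = rng.standard_normal((n, n))
    return G @ G.T

A = random_psd(n)
B = random_psd(n)
E = 1e-3 * random_psd(n)          # small perturbation of A

def f(gamma, A):
    """Inner maximum over ||x|| = 1: largest eigenvalue of M(gamma)."""
    M = gamma / (a + gamma) * A - gamma * B
    return np.linalg.eigvalsh(M)[-1]   # eigenvalues in ascending order

def best_gamma(A):
    # Maximize f over gamma > 0; with B positive definite the objective tends
    # to -infinity as gamma grows, so a bounded one-dimensional search suffices.
    res = minimize_scalar(lambda g: -f(g, A), bounds=(1e-8, 1e3), method="bounded")
    return res.x

g_hat = best_gamma(A)
g_tilde = best_gamma(A + E)
print("|gamma_hat - gamma_tilde| =", abs(g_hat - g_tilde))
print("||E||_2 =", np.linalg.norm(E, 2))
```

In this sketch one can vary the scale of $E$ and observe how $|\hat \gamma - \tilde \gamma|$ behaves relative to $\|E\|$; the question asks for a rigorous bound of this kind.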

  • See "sensitivity analysis" in any good optimization book. The answer depends on the Lagrange multipliers associated with $\hat{x}$ and $\hat{\gamma}$. (2012-05-25)

0 Answers