3

Let $A$ and $B$ be two symmetric matrices of the same size, and let $\alpha_k$ and $\beta_k$ be the $k$th largest eigenvalues of $A$ and $B$, respectively. What can we say about $|\beta_k - \alpha_k|$ in terms of $\|B - A\|_2$? To be clear, I want the tightest possible upper bound on $|\beta_k - \alpha_k|$ that uses only the stated assumptions and is expressed as a function of $\|B - A\|_2$.
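For concreteness, here is a minimal numerical sketch of the quantities in question (assuming Python with NumPy; the dimension and perturbation size are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 6

# A random symmetric matrix A and a nearby symmetric matrix B = A + small perturbation.
A = rng.standard_normal((n, n)); A = (A + A.T) / 2
E = rng.standard_normal((n, n)); E = (E + E.T) / 2
B = A + 0.01 * E

# k-th largest eigenvalues (eigvalsh returns them in ascending order).
alpha = np.linalg.eigvalsh(A)[::-1]
beta = np.linalg.eigvalsh(B)[::-1]

print("max_k |beta_k - alpha_k| =", np.abs(beta - alpha).max())
print("||B - A||_2 (spectral)   =", np.linalg.norm(B - A, 2))
```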

  • 0
    Can you then at least say why you want this? Maybe there is a different approach. (2010-12-06)
  • 0
    Jonas T: I want this because I am doing math research and the question is too noob for MathOverflow. I've been looking at theorems like Weyl's and Bauer–Fike, and I'm wondering if there is a theorem that addresses my question specifically. If not, then I can piece it together from other theorems I can find on Google. (2010-12-06)
  • 0
    If you are subtly asking whether this is homework, then the answer is no. (2010-12-06)
  • 2
    No, I'm not wondering whether it is homework; your question just is not very "well-posed" in a mathematical sense. (2010-12-06)
  • 0
    I have edited the question in response to the comment about it not being very "well-posed" in a mathematical sense. (2010-12-06)
  • 0
    If you take into account that two different matrices related only by a similarity transformation can have the same eigenvalues... I'm not convinced of the usefulness of such a bound. Certainly, one can use a transformation matrix with a large condition number, and that might worsen any upper bound expression anyone cares to construct. (2010-12-06)
  • 1
    Which matrix norm do you mean by $\|B-A\|_2$? (2010-12-07)
  • 0
    @Willie: is there a different interpretation of that apart from it being the Hölder 2-norm? (I'm genuinely curious and feeling especially dense today...) (2010-12-07)
  • 0
    @Willie Wong: In this case, because the matrix is symmetric, I mean the largest absolute value of the eigenvalues of $B-A$. This is the induced Euclidean norm, which is the spectral norm for square matrices. (2010-12-07)
  • 0
    @J.M. Well, it could also technically be the Frobenius 2-norm (the entry-wise 2-norm), which is different from the spectral norm; they are all roughly equivalent, but since the OP asked about the "tightest possible bounds", it is useful to know which norm he particularly has in mind. (2010-12-07)
  • 0
    @Willie: FWIW, $\|\cdot\|_F$ is reserved for the Frobenius norm in the stuff I've seen (i.e., I've never seen $\|\cdot\|_2$ used for anything other than the spectral norm in the matrix context). (2010-12-07)
  • 0
    @J.M. $\|\cdot\|_p$ is often used in the context of operator theory/functional analysis for the Schatten $p$-norm, where the $p = 2$ case is the Frobenius norm and the $p = \infty$ case is the spectral norm, hence the confusion. http://en.wikipedia.org/wiki/Schatten_norm (2010-12-07)
  • 0
    @Willie: Aha, so we really are in different fields of study. :D Sorry about that; I can only say that I was assuming the conventions they use in numerical linear algebra, which is why I was wondering how it could have been anything other than the spectral norm. Thanks for the enlightenment! (2010-12-07)
  • 0
    The differences of eigenvalues $|\beta_k - \alpha_k|$ can be bounded by the 2-norm of $A-B$. See (13) in https://terrytao.wordpress.com/2010/01/12/254a-notes-3a-eigenvalues-and-sums-of-hermitian-matrices/#more-3341 (a randomized check of this bound appears below). (2017-08-10)
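A quick randomized test of that spectral-norm bound (a sketch assuming Python with NumPy; the trial count and dimensions are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
for _ in range(1000):
    n = int(rng.integers(2, 8))
    A = rng.standard_normal((n, n)); A = (A + A.T) / 2
    B = rng.standard_normal((n, n)); B = (B + B.T) / 2
    alpha = np.linalg.eigvalsh(A)[::-1]  # k-th largest eigenvalues of A
    beta = np.linalg.eigvalsh(B)[::-1]   # k-th largest eigenvalues of B
    # Weyl-type bound: every eigenvalue moves by at most ||B - A||_2.
    assert np.abs(beta - alpha).max() <= np.linalg.norm(B - A, 2) + 1e-10
print("bound held in all trials")
```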

2 Answers

3

I am slightly unsure about the following approach, so you should double-check that it is correct. I'll use $\|\cdot\|_2$ to denote the spectral norm as above, and $\|\cdot\|_F$ for the Frobenius norm.

It is well known that for $n\times n$ matrices the inequality $\|\cdot\|_F \leq \sqrt{n}\, \|\cdot\|_2$ holds and is sharp, directly from the definitions of the norms. Now, I claim that

$$ |\beta_k - \alpha_k| \leq \|B - A\|_F $$

is also sharp. Fix the eigenvalues $\beta_k$ and $\alpha_k$. Consider conjugating the symmetric matrix $B$ by orthogonal matrices $Q$. It suffices to compute the infimum

$$ \inf_{Q\in O(n)} \| QBQ^T - A\|_F $$

But using the definition of the Frobenius norm as an inner product, together with the orthogonal invariance $\|QBQ^T\|_F = \|B\|_F$, you see that

$$ \| QBQ^T - A\|_F^2 = \mathop{Tr}\left[ (QBQ^T - A)(QBQ^T - A) \right] = \|B\|_F^2 + \|A\|_F^2 - 2 A\cdot_F (QBQ^T)$$
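As a numerical sanity check of this expansion (a sketch assuming NumPy; $A \cdot_F M$ here denotes the Frobenius inner product $\mathop{Tr}[A^T M]$, and the matrices are random):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5
A = rng.standard_normal((n, n)); A = (A + A.T) / 2
B = rng.standard_normal((n, n)); B = (B + B.T) / 2
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))  # a random orthogonal matrix

lhs = np.linalg.norm(Q @ B @ Q.T - A, "fro") ** 2
# ||B||_F^2 + ||A||_F^2 - 2 * Tr[A^T (Q B Q^T)], using ||Q B Q^T||_F = ||B||_F.
rhs = (np.linalg.norm(B, "fro") ** 2 + np.linalg.norm(A, "fro") ** 2
       - 2 * np.trace(A.T @ Q @ B @ Q.T))
print(np.isclose(lhs, rhs))  # True
```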

It's an exercise to see that $A\cdot_F (QBQ^T)$ is maximized when, for each $k$, the $k$th eigenspace of $A$ lines up with that of $QBQ^T$ (some combination of Cauchy–Schwarz and the rearrangement inequality). When the eigenspaces line up, you have that

$$ \| QBQ^T - A\|_F^2 = \sum_{k = 1}^n |\beta_k - \alpha_k|^2 $$

and so the desired inequality holds; it is attained when the eigenvalues differ in only one position.
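The equality case can be seen concretely by taking $A$ and $B$ diagonal with sorted diagonals, so that their eigenspaces already line up (a sketch assuming NumPy):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 5
# Diagonal matrices with eigenvalues sorted in decreasing order share
# eigenvectors, i.e. their eigenspaces are aligned.
alpha = np.sort(rng.standard_normal(n))[::-1]
beta = np.sort(rng.standard_normal(n))[::-1]
A, B = np.diag(alpha), np.diag(beta)

lhs = np.linalg.norm(B - A, "fro") ** 2
rhs = np.sum((beta - alpha) ** 2)
print(np.isclose(lhs, rhs))  # True: ||B - A||_F^2 = sum_k |beta_k - alpha_k|^2
```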


Unfortunately, the composition of these two inequalities is not automatically sharp, since the bound on $|\beta_k-\alpha_k|$ by $\|B-A\|_F$ is only sharp when $B-A$ has rank 1, while the first bound can be sharpened to $\|C\|_F \leq \sqrt{\mathop{rank}(C)}\, \|C\|_2$. But it is clear that the most general sharp bound should be

$$ |\beta_k-\alpha_k| \leq s\|B-A\|_2$$

with $1 \leq s \leq \sqrt{n}$.
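A randomized check of this composite bound with the worst-case constant $s = \sqrt{n}$ (a sketch assuming NumPy; note that in these trials the observed ratio never exceeds $1$, consistent with the spectral-norm bound cited in the question's comments):

```python
import numpy as np

rng = np.random.default_rng(4)
worst = 0.0
for _ in range(2000):
    n = int(rng.integers(2, 10))
    A = rng.standard_normal((n, n)); A = (A + A.T) / 2
    B = rng.standard_normal((n, n)); B = (B + B.T) / 2
    alpha = np.linalg.eigvalsh(A)[::-1]
    beta = np.linalg.eigvalsh(B)[::-1]
    ratio = np.abs(beta - alpha).max() / np.linalg.norm(B - A, 2)
    assert ratio <= np.sqrt(n) + 1e-10  # the bound with s = sqrt(n)
    worst = max(worst, ratio)
print("largest ratio observed:", worst)
```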

  • 0
    So far, it looks sound, but I'm too tired to try and "break" your bounds... maybe after a good night's rest. P.S. Usually $Q$ is used to denote orthogonal matrices, since $O$ is too easily confused with zero. A mere quibble, of course. :) (2010-12-07)
  • 0
    @J.M. Fixed to your liking. :) (2010-12-07)
  • 0
    I upvoted it before commenting, just so you know. ;) (2010-12-07)
  • 0
    No, seriously, that was a valid point. I have by habit written $O$ for orthogonal matrices, but have quite often seen textbooks where $Q$ is used. I never really gave thought to why they used $Q$ instead of $O$, and now I know. (2010-12-07)
0

I don't yet have the reputation to add a comment, so I'm posting this as an answer instead.

@J.M.: I think you are considering the converse of the question. If you have two very different-looking matrices, it is possible for them to have the same spectrum. But if you have two similar-looking matrices, can their spectra be very different?

  • 1
    The theorem I remember is "tiny perturbations in matrix entries correspond to tiny perturbations in eigenvalues" for *symmetric* matrices (for unsymmetric matrices, even a tiny perturbation can cause a matrix to become defective); but since I'm far away from my books, I can't give the rigorous formulation. (A small numerical illustration of this contrast appears at the end of this thread.) (2010-12-07)
  • 0
    @minel: Instead of spamming the site with answers that should be comments, please earn the reputation needed for commenting by providing useful answers. (2010-12-07)
  • 2
    @Rasmus: That's a little harsh. (2010-12-07)
  • 1
    @Noah Snyder: In my defense, please notice that this is my third comment on an answer of this kind by minel. The others were at http://math.stackexchange.com/questions/13318/i-am-so-confused/13320#13320 and http://math.stackexchange.com/questions/13308/analytical-expression-to-find-the-shortest-distance-between-two-ellipses/13310#13310 . That's why the comment above lacks something like a kind first sentence. I don't think the workaround of posting answers instead of comments should be accepted just because the poster doesn't have enough rep to post comments. (2010-12-07)
  • 0
    @minel: Please excuse me if my comment above appeared harsh to you. I really meant only to convey the sober information in it. (2010-12-07)
  • 0
    Fair enough; three comments in, you'd hope someone would have gotten an upvote or two if they were adding value. (2010-12-07)
  • 0
    @Noah: I'm afraid I cannot parse your latest comment. (2010-12-08)
  • 0
    I meant that I now understand your impatience: if the first two comments had added value, they would have gotten upvoted and we wouldn't be in this situation with the third one. (2010-12-08)
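To illustrate the symmetric vs. non-symmetric contrast mentioned in the comment above, here is a minimal sketch (assuming NumPy): an $\epsilon$-perturbation of a defective $2\times 2$ Jordan block moves its eigenvalues by $\sqrt{\epsilon}$, while an $\epsilon$-perturbation of a symmetric matrix moves them by at most $\epsilon$.

```python
import numpy as np

eps = 1e-8

# Non-symmetric (defective): a 2x2 Jordan block. Perturbing the (2,1) entry
# by eps splits the double eigenvalue 0 into +/- sqrt(eps) = +/- 1e-4.
J = np.array([[0.0, 1.0], [0.0, 0.0]])
Jp = J.copy(); Jp[1, 0] = eps
print(np.abs(np.linalg.eigvals(Jp)))   # ~ [1e-4, 1e-4]

# Symmetric: a perturbation of the same norm moves eigenvalues by at most eps.
S = np.zeros((2, 2))
Sp = S + eps * np.eye(2)
print(np.abs(np.linalg.eigvalsh(Sp)))  # ~ [1e-8, 1e-8]
```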