Let $A$ and $B$ be two symmetric matrices of the same size, and let $\alpha_k$ and $\beta_k$ be the $k$th largest eigenvalues of $A$ and $B$ respectively. What can we say about $|\beta_k - \alpha_k|$ in terms of $\|B - A\|_2$? To be clearer, I want the tightest possible upper bound on $|\beta_k - \alpha_k|$ using only the stated assumptions, computed as a function of $\|B - A\|_2$.
eigenvalue error
linear-algebra
-
Can you then at least say why you want this? Maybe there is a different approach. – 2010-12-06
-
Jonas T: I want this because I am doing math research and the question is too elementary for MathOverflow. I've been looking at theorems like Weyl's inequality and Bauer–Fike, and I'm wondering if there is a theorem that addresses my question specifically. If not, I can piece it together from other theorems I can find on Google. – 2010-12-06
-
If you are subtly asking whether this is homework, then the answer is no. – 2010-12-06
-
No, I'm not wondering whether it is homework; your question just is not very "well-posed" in a mathematical sense. – 2010-12-06
-
I have edited the question in response to the comment about it not being very "well-posed" in a mathematical sense. – 2010-12-06
-
If you take into account that two different matrices related only by a similarity transformation can have the same eigenvalues, I'm not convinced of the usefulness of such a bound. Certainly, one can use a transformation matrix with a large condition number, and that might worsen any upper bound expression anyone cares to construct. – 2010-12-06
-
Which matrix norm do you mean by $\|B-A\|_2$? – 2010-12-07
-
@Willie: is there a different interpretation of that apart from it being the Hölder 2-norm? (I'm genuinely curious and feeling especially dense today...) – 2010-12-07
-
@Willie Wong: In this case, because the matrix is symmetric, I mean the largest absolute value of the eigenvalues of $B-A$. This is the induced Euclidean norm, i.e. the spectral norm for square matrices. – 2010-12-07
-
@J.M. Well, it could also technically be the Frobenius norm (the entry-wise 2-norm), which is different from the spectral norm; they are all equivalent up to constants, but since the OP asked about "tightest possible bounds", it is useful to know which norm he particularly has in mind. – 2010-12-07
-
@Willie: FWIW, $\|\cdot\|_F$ is reserved for the Frobenius norm in the stuff I've seen (i.e., I've never seen $\|\cdot\|_2$ used for anything other than the spectral norm in the matrix context). – 2010-12-07
-
@J.M. $\|\cdot\|_p$ is often used in the context of operator theory/functional analysis for the Schatten $p$-norm, where the $p = 2$ case is the Frobenius norm and the $p = \infty$ case is the spectral norm, hence the confusion. http://en.wikipedia.org/wiki/Schatten_norm – 2010-12-07
-
@Willie: Aha, so we really are in different fields of study. :D Sorry about that, I can only say that I was assuming the conventions used in numerical linear algebra, which is why I was wondering how it could have been anything other than the spectral norm. Thanks for the enlightenment! – 2010-12-07
-
The absolute difference of the $k$th eigenvalues is bounded by the spectral norm of $A-B$, i.e. $|\beta_k - \alpha_k| \le \|B - A\|_2$; this is Weyl's inequality. See (13) in https://terrytao.wordpress.com/2010/01/12/254a-notes-3a-eigenvalues-and-sums-of-hermitian-matrices/#more-3341 – 2017-08-10
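Weyl's inequality cited above answers the question, and the bound is tight: taking $B = A + \epsilon I$ shifts every eigenvalue by exactly $\epsilon = \|B - A\|_2$. A minimal numerical sketch checking the inequality on $2\times 2$ symmetric matrices, using only the closed-form eigenvalues of $\begin{pmatrix} a & b \\ b & c \end{pmatrix}$ (the example matrices are made up for illustration):

```python
import math

def sym2x2_eigs(a, b, c):
    """Eigenvalues of [[a, b], [b, c]], sorted in decreasing order."""
    mean = (a + c) / 2.0
    radius = math.hypot((a - c) / 2.0, b)
    return [mean + radius, mean - radius]

def spectral_norm_sym2x2(a, b, c):
    """For a symmetric matrix, the spectral norm is max |eigenvalue|."""
    return max(abs(x) for x in sym2x2_eigs(a, b, c))

# Example matrices (hypothetical): A = [[2, 1], [1, 0]], B = [[2.1, 0.9], [0.9, 0.2]],
# each stored as the triple (a, b, c) of its distinct entries.
A = (2.0, 1.0, 0.0)
B = (2.1, 0.9, 0.2)

alphas = sym2x2_eigs(*A)                              # alpha_1 >= alpha_2
betas = sym2x2_eigs(*B)                               # beta_1 >= beta_2
E = tuple(bi - ai for bi, ai in zip(B, A))            # entries of B - A
bound = spectral_norm_sym2x2(*E)                      # ||B - A||_2
gaps = [abs(b - a) for a, b in zip(alphas, betas)]    # |beta_k - alpha_k|

# Weyl's inequality: every eigenvalue gap is within the spectral norm of the perturbation.
assert all(g <= bound + 1e-12 for g in gaps)
```

This only illustrates the inequality for one pair, of course; the point is that the same check passes for every pair of symmetric matrices, and equality is approached by perturbations of the form $\epsilon I$.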