
I want to find all the roots of a polynomial and decided to compute the eigenvalues of its companion matrix. How do I do that?

For example, if I have this polynomial: $4x^3 - 3x^2 + 9x - 1$, I compute the companion matrix:

$$\begin{bmatrix} 0&0&\frac{3}{4} \\ 1&0&-\frac{9}{4} \\ 0&1&\frac{1}{4} \end{bmatrix}$$

Now how can I find the eigenvalues of this matrix?
Thanks in advance,
rubik

  • 0
    But why doesn't it format my matrix in the right way?? (2010-11-03)
  • 1
    you should probably type 4 backslashes instead of two when using the bmatrix environment (because each backslash needs to be escaped) (2010-11-03)
  • 0
    Ok, I see J.M. has already done that, thanks. (2010-11-03)
  • 1
    BTW, you have the wrong companion matrix. One first constructs the corresponding monic polynomial (coefficient of highest power is 1), then puts in the negatives of those coefficients either along the top or down the right side of the matrix, and 1's on the subdiagonal; a small sketch of this construction appears just after these comments. (2010-11-03)
  • 0
    to get the roots of your polynomial, you could also try using http://www.sagemath.org/ (2010-11-03)
  • 0
    @J.M. Ah ok, I need to construct the monic polynomial! (2010-11-03)
  • 1
    Your polynomial is cubic and should only have *three* roots; your example matrix, on the other hand, has *four* eigenvalues. See my earlier comment on how to construct the true matrix. (2010-11-03)
  • 0
    Hmm, I missed something in the earlier comment: all you need are the coefficients of the lower powers; do not include the leading coefficient of the monic polynomial! (2010-11-03)
  • 0
    The coefficients are reversed in the matrix: the order-0 coefficient 1/4 (which corresponds to the product of the roots, and thus to the product of the eigenvalues and the determinant of the matrix) should be at the top, and the order-2 coefficient 3/4 (the sum of the roots, and thus the sum of the eigenvalues and of the diagonal) should be at the bottom. (2014-05-17)
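
To make the construction described in these comments concrete, here is a minimal sketch; it assumes Python with numpy, which is not part of the original discussion. It builds the companion matrix of $4x^3 - 3x^2 + 9x - 1$ with 1's on the subdiagonal and the negated coefficients of the monic polynomial in the last column (constant term on top), then reads off the roots as eigenvalues:

```python
# Minimal sketch (assuming numpy): companion matrix as described in the
# comments above, for 4x^3 - 3x^2 + 9x - 1.
import numpy as np

coeffs = [4, -3, 9, -1]                            # highest power first
monic = np.array(coeffs, dtype=float) / coeffs[0]  # x^3 - 3/4 x^2 + 9/4 x - 1/4

n = len(monic) - 1                                 # degree of the polynomial
C = np.zeros((n, n))
C[1:, :-1] = np.eye(n - 1)                         # 1's on the subdiagonal
C[:, -1] = -monic[:0:-1]                           # negated coefficients, constant term on top

print(C)                       # [[0, 0, 1/4], [1, 0, -9/4], [0, 1, 3/4]]
print(np.linalg.eigvals(C))    # eigenvalues of C = roots of the polynomial
```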

3 Answers

2

You may want to take a look at Companion matrix to be sure that you have the right companion matrix.

In order to find the eigenvalues of a square matrix, you need to solve

$\det(A-\lambda I)=0$

where $A$ is the matrix you are interested in and $I$ is the identity matrix.

You can find a couple of examples here.
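
To make the $\det(A-\lambda I)=0$ step concrete for the matrix given in the question, here is a small symbolic sketch; it assumes Python with sympy, which is only an illustration and not part of this answer:

```python
# Sketch (assuming sympy): form det(A - lambda*I) for the matrix in the
# question and solve the resulting characteristic polynomial numerically.
import sympy as sp

lam = sp.symbols('lambda')
A = sp.Matrix([[0, 0, sp.Rational(3, 4)],
               [1, 0, sp.Rational(-9, 4)],
               [0, 1, sp.Rational(1, 4)]])

char_poly = sp.expand((A - lam * sp.eye(3)).det())
print(char_poly)             # -lambda**3 + lambda**2/4 - 9*lambda/4 + 3/4
print(sp.nroots(char_poly))  # the numerical roots, i.e. the eigenvalues of A
```

As the comments below note, for a companion matrix this step simply reproduces the polynomial the matrix was built from, so in practice one goes the other way and hands the matrix to a numerical eigensolver.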

  • 3
    If you take the determinant of the Frobenius companion matrix, you of course just go back to the original polynomial, so it's a bit circular here. How then do you find the roots of the polynomial? (Certainly the cubic formula applies in the OP's example, but remember that explicit solutions can be difficult to construct for polynomials of degree 5 or higher.) (2010-11-03)
  • 1
    Yes, I agree it is circular. However, @rubik wanted to calculate the eigenvalues of that matrix. In fact, I think the current matrix is wrong. Shouldn't it be http://www.wolframalpha.com/input/?i=Eigenvalues[{{0,+0,1/4},+{1,+0,-9/4},+{0,+1,+3/4}}] ? (2010-11-03)
  • 0
    Yes, he did put in the coefficients backwards. That's why I personally prefer the version that has the coefficients at the top of the matrix; less confusing for me! (2010-11-03)
  • 0
    Or at the bottom: http://www.jstor.org/pss/2312322 (2010-11-03)
  • 0
    If you do it at the bottom, you either need an eigensolver designed for lower Hessenberg matrices, or will have to do a similarity transformation to upper Hessenberg form. The thing is that software like LAPACK was written for upper Hessenberg matrices (of course, one could rewrite the LAPACK eigenroutine so it works with lower Hessenberg matrices, but...) (2010-11-03)
  • 0
    Point taken. By the way, you seem very aware of the computational feasibility for companion matrices. Do you work with them often? (2010-11-04)
  • 0
    @Robert: "Often" is a bit strong, but I am more or less acquainted with most of the numerics behind numerical linear algebra and polynomial rootfinding.2010-11-04
3

Typically, one uses the QR algorithm on a matrix with a preliminary reduction to upper Hessenberg form (all zeroes below the subdiagonal) for determining the eigenvalues. However, since the Frobenius companion matrix is already Hessenberg to begin with, it can easily be fed into an eigenroutine without preliminary transformation (though a preliminary balancing might be necessary if the coefficients of the polynomial vary widely in magnitude).

It is, however, a rather wasteful algorithm: the Frobenius companion matrix is sparse, and subjecting it to the QR algorithm quickly fills in the zeroes above the subdiagonal. There are now newer methods that exploit the sparsity of the Frobenius companion matrix; if you're sure you do not want to investigate other methods, you should look into this new development.
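
As a brief illustration of feeding the (already upper Hessenberg) companion matrix straight into a general QR-based eigensolver, here is a sketch; it assumes Python with numpy and scipy, neither of which is mentioned in this answer. `scipy.linalg.companion` builds the variant with the coefficients along the top row, and `numpy.roots` is documented to use the same companion-matrix eigenvalue approach internally:

```python
# Sketch (assuming numpy and scipy): roots of 4x^3 - 3x^2 + 9x - 1 via the
# eigenvalues of its companion matrix.
import numpy as np
from scipy.linalg import companion

p = [4, -3, 9, -1]             # coefficients, highest power first
C = companion(p)               # upper Hessenberg, coefficients in the first row

print(C)
print(np.linalg.eigvals(C))    # QR-based eigensolver applied to C
print(np.roots(p))             # same roots; numpy.roots works this way internally
```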

  • 1
    Constructing a Frobenius companion matrix in *Mathematica* is pretty easy: `(Prepend[PadRight[IdentityMatrix[Exponent[#, x] - 1], Exponent[#, x] - {1, 0}], -Reverse[Most[CoefficientList[#, x]]]/Coefficient[#, x, Exponent[#, x]]]) &[4x^3 - 3x^2 + 9x - 1]` (2010-11-03)
3

Hey there. If I am understanding your case correctly, you want to find the eigenvalues of this matrix, which amounts to solving for the roots of the characteristic polynomial obtained from the determinant. Following Robert's idea, we use the equation

$\det(A-\lambda I) = 0$ $~~~$(into which we can plug the given matrix).

$\det(A-\lambda I) = \det\left[\begin{array}{ccc} 0-\lambda & 0 & \dfrac{3}{4} \\ 1 & 0-\lambda & -\dfrac{9}{4} \\ 0 & 1 & \dfrac{1}{4}-\lambda \end{array} \right] = 0$, where $A$ is your coefficient matrix and $I$ is the identity matrix.

I = $\left[\begin{array}{ccr} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{array} \right]$

From here we can now find out the eigenvalues of the matrix A as follows:

$\underline{\text{Evaluating the determinant by cofactor expansion along the first column:}}$

$= ~ -\lambda \begin{vmatrix} -\lambda & -\dfrac{9}{4} \\ 1 & \dfrac{1}{4}-\lambda \end{vmatrix} - 1 \begin{vmatrix} 0 & \dfrac{3}{4} \\ 1 & \dfrac{1}{4}-\lambda \end{vmatrix} + 0 \begin{vmatrix} 0 & \dfrac{3}{4} \\ -\lambda & -\dfrac{9}{4} \end{vmatrix}$

$\Rightarrow ~ -\lambda \left( \lambda^{2} - \dfrac{1}{4}\lambda + \dfrac{9}{4} \right) - 1 \left( 0 - \dfrac{3}{4} \right) + 0 \left( 0 + \dfrac{3}{4}\lambda \right)$

$\Rightarrow ~$ $-\lambda^3+\dfrac{1}{4}\lambda^2-\dfrac{9}{4}\lambda+\dfrac{3}{4}$, $~$ Hence our characteristic polynomial is now obtained.

$$P(\lambda)=-\lambda^3+\dfrac{1}{4}\lambda^2-\dfrac{9}{4}\lambda+\dfrac{3}{4}$$

If you need assistance on how to find the characteristic polynomial by evaluating the determinant, here is a reference: Computing Determinants

After solving this polynomial for its roots (eigenvalues) we get the following:

$\lambda \approx 0.329, \quad \lambda \approx -0.040 - 1.508i, \quad \lambda \approx -0.040 + 1.508i$

I believe the two roots other than $\lambda \approx 0.329$ form a complex-conjugate pair. Can someone else please verify that these are all of the roots of this polynomial and that the ones I provided are correct? Thanks. I hope this helps, if this explanation is what you were looking for.
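
For anyone who wants to check these numbers, here is a quick verification sketch; it assumes Python with numpy and is not part of the original answer:

```python
# Sketch (assuming numpy): numerically solve
# P(lambda) = -lambda^3 + (1/4)lambda^2 - (9/4)lambda + 3/4 = 0.
import numpy as np

roots = np.roots([-1, 1/4, -9/4, 3/4])   # coefficients, highest power first
print(np.sort_complex(roots))
# approximately [-0.040 - 1.508j, -0.040 + 1.508j, 0.329 + 0j]
```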

  • 0
    WoW! Very very good explanation, thank you very much! The roots are correct: `http://www.wolframalpha.com/input/?i=roots+-l^3+%2B+1%2F4%28l%29^2+-+9%2F4%28l%29+%2B+3%2F4` (2011-03-20)
  • 0
    @rubik: You're welcome, anytime. (2011-04-03)
  • 2
    I'm missing the point; one constructs a Frobenius companion matrix from a given polynomial, and you proceed to determine the characteristic polynomial of the companion matrix? A bit "chicken-or-egg", no? (2011-05-02)
  • 0
    @J.M.: I think the OP was only looking to obtain the eigenvalues, which were of interest. But I could be wrong; the OP was happy, so I am as well. (2011-05-05)