A matrix determinant can be computed naively in $O(n!)$ steps, or via a proper LU decomposition in $O(n^3)$ steps. This assumes that arithmetic on the matrix elements takes constant time. If, however, the matrix elements are polynomials (say, univariate of maximum degree $p$), then at each step of the LU decomposition one element is multiplied by another, producing (on average) ever larger polynomials. Each step therefore takes longer and longer. Is the cost of performing the decomposition still polynomial?
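To make the degree growth concrete, here is a small sketch (names and setup are mine, not from any library): it runs *division-free* Gaussian elimination on an $n \times n$ matrix of random degree-$p$ integer polynomials, represented as plain coefficient lists, and records the maximum entry degree after each elimination round.

```python
import random

def poly_mul(a, b):
    # schoolbook product of coefficient lists (lowest degree first)
    out = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] += ai * bj
    return out

def poly_sub(a, b):
    n = max(len(a), len(b))
    a = a + [0] * (n - len(a))
    b = b + [0] * (n - len(b))
    out = [ai - bi for ai, bi in zip(a, b)]
    while len(out) > 1 and out[-1] == 0:  # trim leading zeros
        out.pop()
    return out

def degree_growth(n, p, seed=0):
    """Division-free elimination on an n x n matrix of random degree-p
    polynomials; returns the max entry degree after each round."""
    rng = random.Random(seed)
    A = [[[rng.randint(1, 9) for _ in range(p + 1)] for _ in range(n)]
         for _ in range(n)]
    max_degrees = []
    for k in range(n - 1):
        for i in range(k + 1, n):
            for j in range(k + 1, n):
                # cross-multiplication keeps entries polynomial (no division),
                # at the price of roughly doubling their degree each round
                A[i][j] = poly_sub(poly_mul(A[i][j], A[k][k]),
                                   poly_mul(A[i][k], A[k][j]))
        max_degrees.append(max(len(A[i][j]) - 1
                               for i in range(k + 1, n)
                               for j in range(k + 1, n)))
    return max_degrees

print(degree_growth(4, 2))  # degrees roughly double each round
```

For $n = 4$, $p = 2$ the recorded degrees come out near $4, 8, 16$: each round roughly doubles the degree, which is exactly the growth I'm worried about.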
EDIT: To clarify my reasoning a bit: if polynomial multiplication (using the FFT, as J.M. suggests in the comments) takes $O(m \log m)$ for degree-$m$ operands, and we must perform $O(n^3)$ operations to get the LU decomposition, then the degree of the entries could effectively double at each multiplication*. The running time would look something like
$$ \approx O \left( \sum_{k=1}^{n^3} 2^k p \log\!\left(2^k p\right) \right), $$
with $2^k p$ the hypothetical degree after $k$ doublings.
* (the degree doesn't quite double every time, and this is where I'm stuck)