
Looking at a 2nd-order Taylor series approximation of the function $f$, I have this:

$$f(t_1) = f(t_0) + hf'(t_0) + {h^2\over 2}f''(t_0) + O(h^3)$$

Now say I approximate $f''(t_0)$ with a 2nd-order central difference method:

$$f''(t) = {1\over 2}{f'(t+h) - f'(t-h)\over h} + O(h^2)$$

What's the resulting error of this method?

The naive approach would be to substitute the central difference equation into the Taylor series, giving something like this:

$$f(t_1) = f(t_0) + hf'(t_0) + {h\over 4}(f'(t_0+h)-f'(t_0-h)) + {1\over 2}O(h^4) + O(h^3)$$

Is that plausible? Would the error actually decrease (go from 2nd-order to 4th-order)?

  • 0
    It looks like $h$ is being used differently in your Taylor polynomial and your approximation of the second derivative (and perhaps should be $t_1$ instead of $h$ in the Taylor polynomial).2010-09-28
  • 1
    Since you still have the $O(h^3)$ term, the order of the error is the same as in the first formula. (Remember that $O(h^4)+O(h^3)=O(h^3)$ as $h \to 0$.)2010-09-29

1 Answer


If we start with the Taylor expansion (I'll change variables here, too many subscripts confuse me):

$$f(x+h)=f(x)+h f^{\prime}(x)+\frac{h^2}{2}f^{\prime\prime}(x)+\frac{h^3}{3!}f^{\prime\prime\prime}(x)+O(h^4)$$

and the derivative of this w.r.t. $h$ (which is just the Taylor expansion of $f^{\prime}$):

$$f^{\prime}(x+h)=f^{\prime}(x)+h f^{\prime\prime}(x)+\frac{h^2}{2}f^{\prime\prime\prime}(x)+O(h^3)$$

and the version of this with $h$ replaced by negative $h$:

$$f^{\prime}(x-h)=f^{\prime}(x)-h f^{\prime\prime}(x)+\frac{h^2}{2}f^{\prime\prime\prime}(x)+O(h^3)$$

subtracting the third expression from the second expression gives

$$f^{\prime}(x+h)-f^{\prime}(x-h)=2h f^{\prime\prime}(x)+O(h^3)$$

and we see that the even powers drop out of this error expansion.
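As a quick numerical sanity check (my own illustration, not part of the original derivation, taking $f=\sin$ so that $f'=\cos$ and $f''=-\sin$), the cancellation above means $\frac{f'(x+h)-f'(x-h)}{2h}$ approximates $f''(x)$ with $O(h^2)$ error, so halving $h$ should cut the error by roughly $2^2=4$:

```python
import math

def fpp_central(fprime, x, h):
    # Central difference for f''(x) built from values of f'
    return (fprime(x + h) - fprime(x - h)) / (2 * h)

x = 1.0
exact = -math.sin(x)  # f = sin, so f'' = -sin
prev = None
for h in [0.1, 0.05, 0.025]:
    err = abs(fpp_central(math.cos, x, h) - exact)
    if prev is not None:
        print(f"h={h:.3f}  error={err:.3e}  ratio={prev / err:.2f}")
    else:
        print(f"h={h:.3f}  error={err:.3e}")
    prev = err
```

The printed ratios sit very close to 4, consistent with second-order convergence.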

If we solve for $f^{\prime\prime}(x)$ like so:

$$f^{\prime\prime}(x)=\frac{f^{\prime}(x+h)-f^{\prime}(x-h)}{2h}+O(h^2)$$

and substitute in the first expression,

$$f(x+h)=f(x)+h f^{\prime}(x)+\frac{h^2}{2}\left(\frac{f^{\prime}(x+h)-f^{\prime}(x-h)}{2h}+O(h^2)\right)+\frac{h^3}{3!}f^{\prime\prime\prime}(x)+O(h^4)$$

we can take the $O(h^2)$ within the parentheses out as an $O(h^4)$ term (since $\frac{h^2}{2}O(h^2)=O(h^4)$), which is absorbed into the existing $O(h^4)$:

$$f(x+h)=f(x)+h f^{\prime}(x)+\frac{h}{2}\left(\frac{f^{\prime}(x+h)-f^{\prime}(x-h)}{2}\right)+\frac{h^3}{3!}f^{\prime\prime\prime}(x)+O(h^4)$$

the leading term after the replaced portion is $O(h^3)$, thus simplifying to

$$f(x+h)=f(x)+h f^{\prime}(x)+\frac{h}{2}\left(\frac{f^{\prime}(x+h)-f^{\prime}(x-h)}{2}\right)+O(h^3)$$

and we see that the formula has $O(h^3)$ error: cutting $h$ in half decreases the error by a factor of $2^3=8$.
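To back this up numerically (my own check, again assuming $f=\sin$ for concreteness), the step formula with the central-difference replacement should show its error shrinking by about $2^3=8$ each time $h$ is halved:

```python
import math

def taylor_cd(f, fprime, x, h):
    # 2nd-order Taylor step with f''(x) replaced by the
    # central difference of f', i.e. the formula derived above:
    # f(x) + h f'(x) + (h/4) (f'(x+h) - f'(x-h))
    return f(x) + h * fprime(x) + (h / 4) * (fprime(x + h) - fprime(x - h))

x = 1.0
prev = None
for h in [0.1, 0.05, 0.025]:
    err = abs(math.sin(x + h) - taylor_cd(math.sin, math.cos, x, h))
    if prev is not None:
        print(f"h={h:.3f}  error={err:.3e}  ratio={prev / err:.2f}")
    else:
        print(f"h={h:.3f}  error={err:.3e}")
    prev = err
```

The observed ratios hover around 8, matching the $O(h^3)$ error derived above (the same order as the exact 2nd-order Taylor step).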

  • 0
    I don't see what this answer has to do with the question. The question wasn't about what the central difference approximation for $f''(x)$ is in terms of $f$, it was what the order of approximation is in the expression $f(t_0+h) \approx f(t_0) + hf'(t_0) + {h\over 4}(f'(t_0+h)-f'(t_0-h))$.2010-09-29
  • 0
    ...and the answer is $O(h^3)$.2010-09-29
  • 0
    @Hans: there, fixed. I seem to have been thinking of something else while I wrote the first version of this answer. :) Thanks for the heads-up!2010-09-29
  • 0
    In your formula for $f''(x)$, you've forgotten to divide the remainder term by $h$; it should be $O(h^2)$ instead of $O(h^3)$. Also, in the second to last formula $f(x+h)=\ldots+O(h^3)+O(h^4)$, I don't see where the $O(h^3)$ comes from except that $(h^3/3!) f'''(x)=O(x^3)$; I guess you mean the term that arises from $(h^2/2) O(h^3) = O(h^5)$ (although according to my previous sentence that should in fact be $(h^2/2) O(h^2) = O(h^4)$).2010-09-29
  • 0
    All fixed, thanks for the corrections @Hans.2010-09-29
  • 0
    Thanks for the in-depth answer, J. M. The answer for my purposes seems to be "you can approximate the second derivative in a 2nd-order Taylor series with a 2nd-order central difference method with no loss of accuracy". Is this the correct takeaway?2010-09-30
  • 0
    @jjkparker: I wouldn't say "no loss of accuracy" (with that you're assuming that there would be no cancellation in the subtraction, which isn't the case if $h$ is tiny enough); much safer to say that the approximate version *can be* just as accurate as the exact one.2010-09-30