Wikipedia says the midpoint rule for numerical integration has an error of order $h^3 f''(\xi)$, where $h = b - a$. I am trying to replicate this result.
I'm guessing that I want to use the Lagrange form of the remainder in Taylor's theorem. Let $x_0=\frac{a+b}{2}$ (i.e. the midpoint).
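Concretely, the expansion I have in mind is the zeroth-order Taylor polynomial with its Lagrange remainder,
$$f(x) = f(x_0) + f'(\xi)(x - x_0)$$
for some $\xi$ between $x_0$ and $x$.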
The midpoint rule says $\int_a^b f(x)\,dx \approx (b-a)f\!\left(\frac{a+b}{2}\right)$, so to get the error I compute $(b-a) f\!\left(\frac{a+b}{2}\right) - \int_a^b f(x)\,dx$. Expanding $f$ with Taylor's theorem as above, I get:
$\begin{aligned} \text{error} & = (b-a) f(x_0) - \int_a^b \left[f(x_0)+f'(\xi)(x-x_0)\right] dx \\ & = -f'(\xi)\int_a^b (x-x_0)\,dx \\ & = -f'(\xi)\int_a^b \left(x-\tfrac{a+b}{2}\right) dx \\ & = 0 \end{aligned}$
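To convince myself that something must be wrong, I ran a quick numerical check (a minimal Python sketch; the `midpoint` helper and the choice of $\sin$ on $[1, 1+w]$ are just my own illustration):

```python
import math

def midpoint(f, a, b):
    """Single-interval midpoint rule: (b - a) * f((a + b) / 2)."""
    return (b - a) * f((a + b) / 2)

# Test on f = sin over [a, a+w], where the integral is known exactly:
# \int_a^b sin(x) dx = cos(a) - cos(b)
a = 1.0
for w in (1.0, 0.5, 0.25, 0.125):
    b = a + w
    exact = math.cos(a) - math.cos(b)
    err = midpoint(math.sin, a, b) - exact
    print(f"b-a = {w:6.3f}   error = {err: .3e}   error/(b-a)^3 = {err / w**3:.4f}")
```

The last column settles near $\sin(1)/24 \approx 0.035$ instead of collapsing to zero, so the actual error really does behave like $\frac{(b-a)^3}{24} f''(\xi)$, and the vanishing result above must be an artifact of my derivation.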
So apparently I have just "proven" that the method has zero error, even though the numbers say otherwise? Any hints as to what I did wrong? (I realize that since Wikipedia gives the error in terms of $f''$ I probably want to take the expansion one level further to match their result, but I don't understand why the argument above fails.)
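(For reference, by "one level further" I mean the first-order expansion with Lagrange remainder: $f(x) = f(x_0) + f'(x_0)(x - x_0) + \frac{f''(\xi)}{2}(x - x_0)^2$ for some $\xi$ between $x_0$ and $x$.)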