5

Almost everyone is familiar with the famous Taylor Series:

$ f(x) = \sum_{n=0}^{\infty} \frac{f^{(n)}(a)}{n!} (x-a)^n $

which, if it converges at more than one point, will converge in some interval about $a$. Has anyone considered the "Reverse" Taylor Series:

$ g(x) = \sum_{n=0}^\infty \frac{f^{(n)}(x-a)}{n!} a^n $

I call it "reverse" because it's what you get by symbolically taking $a \rightarrow (x-a)$ (which simultaneously sends $(x-a)^n$ to $a^n$). You might be saying that there is absolutely no reason to believe that this series should converge to $f(x)$, but there are two big examples where it does.

For $e^x$:

$g(x) = \sum_{n=0}^\infty \frac{e^{(x-a)}}{n!} a^n = \frac{e^x}{e^a} \sum_{n=0}^\infty \frac{a^n}{n!} = e^x $
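Numerically, the partial sums of $g$ do track $e^x$ for various choices of $a$ (a minimal Python sketch; the helper name `reverse_taylor_exp` is mine):

```python
import math

def reverse_taylor_exp(x, a, terms=30):
    # Partial sum of g(x) = sum_n f^(n)(x - a) * a^n / n! with f = exp,
    # where every derivative f^(n) is again exp.
    return sum(math.exp(x - a) * a**n / math.factorial(n) for n in range(terms))

# The partial sums agree with e^x for several choices of a.
for a in (0.5, 1.0, -2.0):
    print(a, reverse_taylor_exp(2.0, a), math.exp(2.0))
```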

For $x^k$ (the series terminates, since every derivative of order greater than $k$ vanishes):

$ g(x) = \sum_{n=0}^k \frac{k(k-1)\cdots(k-n+1)(x-a)^{k-n} a^n}{n!} = \sum_{n=0}^k {k \choose n} (x-a)^{k-n} a^n = \big((x-a) + a\big)^k = x^k $
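The same computation can be checked symbolically (a quick sympy sketch, fixing a concrete degree such as $k = 5$):

```python
import sympy as sp

x, a = sp.symbols('x a')
k = 5
f = x**k

# Reverse series: sum_{n=0}^k f^(n)(x - a) * a^n / n!.
# The sum terminates at n = k since all higher derivatives vanish.
g = sum(sp.diff(f, x, n).subs(x, x - a) * a**n / sp.factorial(n)
        for n in range(k + 1))

print(sp.simplify(g - f))  # prints 0, i.e. g(x) = x^k
```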

I have yet to find an analytic function for which the reverse Taylor series fails to give back the original function, and I also have yet to think of a way to prove that the reverse Taylor series should converge for a given function. Does anyone have any ideas?

  • If it works for $x^k$, then if you express your real analytic function as a power series, just arguing term by term should give you the convergence. (2010-12-23)
  • http://en.wikipedia.org/wiki/Lagrange_reversion_theorem (2010-12-23)

2 Answers

7

Your two series are the same thing pointwise when they converge: $g(x)$ is the Taylor series for $f$ expanded around the point $x-a$, evaluated at $x$. So it converges as long as the Taylor series for $f$ at $x-a$ has radius of convergence at least $|a|$.

However, the two series can fail to converge for different reasons. In $f(x)$, moving $x$ around may make the Taylor series fail to converge because $x$ leaves the radius of convergence about $a$. In $g(x)$, moving $x$ around moves the expansion point $x-a$, and the series fails wherever the radius of convergence at $x-a$ is smaller than $|a|$. That is, once you fix $a$, the radius of convergence of the Taylor series around $a$ is fixed, independent of $x$; but the convergence of $g(x)$ depends on how the radius of convergence at $x-a$ (which depends on $x$) compares with $|a|$. So the fact that $g(x)$ converges at more than one point is insufficient to guarantee that it converges in a small interval about $a$.
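For a concrete example, take $f(x) = \frac{1}{1-x}$, so that $f^{(n)}(y) = \frac{n!}{(1-y)^{n+1}}$. Then

$ g(x) = \sum_{n=0}^\infty \frac{a^n}{(1-(x-a))^{n+1}} $

is a geometric series: it converges exactly when $|a| < |1-(x-a)|$, that is, when the expansion point $x-a$ is farther from the singularity at $1$ than $|a|$, and in that case it sums to $\frac{1}{1-x}$. For fixed $a \neq 0$ this condition holds for some $x$ and fails for others.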

4

The general idea:

Put $y=x-a$. Then $$ g(x) = \sum_{n = 0}^\infty \frac{f^{(n)}(y)}{n!}(x - y)^n = f(x), $$ which is just the Taylor series of $f$ about the point $y$, evaluated at $x$ (so the equality holds wherever that series converges to $f$).
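As a sanity check, this identity can be verified symbolically for $f = \exp$ (a minimal sympy sketch; it relies on sympy recognizing the exponential series):

```python
import sympy as sp

x, a, n = sp.symbols('x a n')

# Reverse series for f = exp: every derivative of exp is exp itself, so
#   g(x) = exp(x - a) * sum_{n>=0} a^n / n!  = exp(x - a) * exp(a)
g = sp.exp(x - a) * sp.summation(a**n / sp.factorial(n), (n, 0, sp.oo))

print(sp.simplify(g))  # exp(x)
```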