This comes up in a paper by Beskos and Roberts on exact simulation of diffusions. I'm sure it's easy, but I can't work out what's going on.
Given a differentiable function $\alpha$ satisfying
$\alpha(u) \geq k_1$
and
$\alpha'(u) \leq k_2$,
for all $u$ and some constants $k_1$ and $k_2$, show that there exist constants $c_1$ and $c_2$ such that
$c_1 \leq \alpha^2(u) + \alpha'(u) \leq c_2$.
Clearly the first condition implies $\alpha$ is bounded below and the second implies $\alpha$ has at most linear growth, but I don't see how that is enough for the conclusion.
Edit: I'm a fool. They wrote $k_1 \leq \alpha(u), \alpha'(u) \leq k_2$, but there was a line break after the comma, so I was reading it as two separate conditions on $\alpha$ and $\alpha'$. Sorry everyone, and thanks for the help. If it weren't for the counterexamples it would have taken a lot longer to realise that.
Thanks.
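
Further edit, for anyone who finds this later: here is the resolution spelled out (my own working, so please check it). Read the way I originally did, the claim actually fails: take $\alpha(u) = e^{-u}$, which satisfies $\alpha(u) \geq 0$ and $\alpha'(u) = -e^{-u} \leq 0$ (so $k_1 = k_2 = 0$ works), yet
$$\alpha^2(u) + \alpha'(u) = e^{-2u} - e^{-u} \to \infty \quad \text{as } u \to -\infty,$$
so no upper bound $c_2$ can exist. Read the intended way, $k_1 \leq \alpha(u) \leq k_2$ and $k_1 \leq \alpha'(u) \leq k_2$ for all $u$, and the conclusion is immediate: since $0 \leq \alpha^2(u) \leq \max(k_1^2, k_2^2)$,
$$k_1 \leq \alpha^2(u) + \alpha'(u) \leq \max(k_1^2, k_2^2) + k_2,$$
so one can take $c_1 = k_1$ and $c_2 = \max(k_1^2, k_2^2) + k_2$.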