Consider the Chernoff bound: it assumes $X_1, \ldots, X_n$ are independent Bernoulli random variables, each with success probability $p > 1/2$, and bounds the probability that more than $n/2$ of them occur.
I once heard that, in the proof of the Chernoff bound, parameters $p$ and $n$ must be constant.
Is this true? For instance, can we let $p = 1/2 + n^{-3}$?
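(For reference, the form of the bound I have in mind: with $S = X_1 + \cdots + X_n$ and $\varepsilon = p - 1/2 > 0$,

$$\Pr\!\left[S \le \tfrac{n}{2}\right] = \Pr\!\left[S \le (p - \varepsilon)n\right] \le e^{-2\varepsilon^2 n}.$$

If I substitute $p = 1/2 + n^{-3}$, the right-hand side becomes $e^{-2n^{-5}}$, which tends to $1$ as $n \to \infty$, so even if the substitution is allowed the bound seems vacuous in that regime.)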
Edit (As per moron's request)
I'll elaborate on how I want to use it. The complexity class BPP is defined as follows:
A language L is in BPP if and only if there exists a probabilistic Turing machine M, such that
- M runs for polynomial time on all inputs;
- Completeness: for all $x \in L$, M outputs 0 with probability at most 1/3;
- Soundness: for all $x \not\in L$, M outputs 1 with probability at most 1/3.
Again, from Wikipedia:
In practice, an error probability of 1/3 might not be acceptable, ... but if the algorithm is run many times, the chance that the majority of the runs are wrong drops off exponentially as a consequence of the Chernoff bound.
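To see the amplification concretely, here is a small simulation (my own sketch, not from the quoted sources; the function names and parameters are mine): it majority-votes over independent runs of an algorithm that is correct with probability $2/3$, where the Chernoff/Hoeffding bound caps the majority's error at $e^{-2t\varepsilon^2}$ for $t$ runs and advantage $\varepsilon = 1/6$.

```python
import random

def majority_vote(correct_prob, runs, rng):
    """One amplified execution: take the majority over `runs`
    independent runs, each correct with probability `correct_prob`."""
    correct = sum(rng.random() < correct_prob for _ in range(runs))
    return correct > runs / 2  # ties (possible for even `runs`) count as wrong

def estimate_error(correct_prob, runs, trials=2000, seed=0):
    """Empirical probability that the majority vote is wrong."""
    rng = random.Random(seed)
    wrong = sum(not majority_vote(correct_prob, runs, rng)
                for _ in range(trials))
    return wrong / trials

# A single run errs with probability 1/3, i.e. its advantage over a
# fair coin is eps = 1/6.  Chernoff/Hoeffding bounds the majority's
# error by exp(-2 * runs * eps**2), exponentially small in `runs`.
for runs in (1, 11, 101):
    print(runs, estimate_error(2/3, runs))
```

The empirical error drops from roughly $1/3$ at one run to nearly zero at 101 runs, matching the exponential decay the quote describes, at least while the advantage $\varepsilon$ is a constant.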
This seems to be true as long as the error probability is a constant. However:
It does not even have to be constant: the same class of problems is defined by allowing error as high as $1/2 - n^{-c}$ on the one hand, or requiring error as small as $2^{-n^c}$ on the other hand, where $c$ is any positive constant, and $n$ is the length of input.
How is the Chernoff bound applied in such cases?
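My attempted calculation (please correct me if this is the wrong way to apply the bound): suppose a single run is correct with probability $1/2 + n^{-c}$, and we take the majority of $t$ independent runs. Applying the Chernoff–Hoeffding bound with advantage $\varepsilon = n^{-c}$,

$$\Pr[\text{majority wrong}] \le e^{-2t\varepsilon^2} = e^{-2t\,n^{-2c}},$$

so choosing $t = n^{2c + c'}$, which is still polynomially many runs, gives error at most

$$e^{-2n^{c'}} \le 2^{-n^{c'}}.$$

Is it legitimate to apply the bound this way even though $\varepsilon$ depends on $n$?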