1

Consider the Chernoff bound: it assumes $X_1, \ldots, X_n$ are independent Bernoulli random variables, each with success probability $p > 1/2$, and bounds the probability that more than $n/2$ of the events occur.
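For concreteness, one common additive form of the bound in this setting (a sketch; the question does not pin down a particular version, and this variant is often attributed to Hoeffding) is:

$$\Pr\left[\sum_{i=1}^{n} X_i \le \frac{n}{2}\right] \le e^{-2n\left(p - \frac{1}{2}\right)^2}.$$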

I once heard that, in the proof of the Chernoff bound, parameters $p$ and $n$ must be constant.

Is this true? For instance, can we let $p = 1/2 + n^{-3}$?


Edit (As per moron's request)

I'll elaborate on how I want to use it. The complexity class BPP is defined as follows:

A language L is in BPP if and only if there exists a probabilistic Turing machine M, such that

  • M runs for polynomial time on all inputs;
  • Completeness: For all $x \in L$, M outputs 0 with probability less than or equal to 1/3;
  • Soundness: For all $x \not \in L$, M outputs 1 with probability less than or equal to 1/3.

Again, from Wikipedia:

In practice, an error probability of 1/3 might not be acceptable, ... but if the algorithm is run many times, the chance that the majority of the runs are wrong drops off exponentially as a consequence of the Chernoff bound.

This seems to be true as long as the error probability is a constant. However:

It does not even have to be constant: the same class of problems is defined by allowing error as high as $1/2 − n^{−c}$ on the one hand, or requiring error as small as $2^{−n^c}$ on the other hand, where $c$ is any positive constant, and $n$ is the length of input.

How is Chernoff bound applied in such cases?

  • 0
    I would say yes. Once $n$ is fixed... Perhaps you can elaborate on how you are planning to use it? (2010-12-16)
  • 0
    @Moron: Done. Please see the edited question. (2010-12-16)
  • 0
    I believe Qiaochu answered it. Don't you agree? (2010-12-17)
  • 0
    Every time someone refers to Moron (which is often, because he's one of the best users here) I have to do a double-take. I'm still not used to it... (2010-12-17)

2 Answers

2

The Wikipedia link you give has an inequality that allows the success probabilities for the different Bernoulli trials to be different, and the bound does not depend explicitly on the number of trials $n$. Rather, $n$ is replaced by $\mu = \sum_{i=1}^n p_i$, where $p_i$ is the success probability for the $i^{\text{th}}$ trial. Look at the section "Theorem for multiplicative form of Chernoff bound (relative error)". In this form it is clear, as Qiaochu states, that the $\{p_i\}$ can depend on $n$, and can all be the same, or not.
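As a quick numerical sketch of this point (the particular $p_i$ below are made up for illustration, and nothing here depends on them being equal):

```python
import math

# Hypothetical per-trial success probabilities: they need not be equal,
# and they may depend on n. Here p_i = 1/2 + 1/(i + 2) as an example.
n = 100
p = [0.5 + 1.0 / (i + 2) for i in range(n)]

# The multiplicative Chernoff bound uses mu = sum of the p_i, not n itself.
mu = sum(p)

# Lower-tail multiplicative form:
#   P(X <= (1 - delta) * mu) <= exp(-delta**2 * mu / 2)
delta = 0.5
bound = math.exp(-delta ** 2 * mu / 2)

print(mu)     # the quantity that replaces n in the bound
print(bound)  # strictly between 0 and 1
```

The point is that only $\mu$ enters the exponent, so heterogeneous (or $n$-dependent) probabilities pose no problem.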

3

The Chernoff bound states that a certain tail probability $f(p, n)$ is bounded above by a certain other function $g(p, n)$ of $p$ and $n$, subject to the condition that $p > \frac{1}{2}$. I see no reason why you can't pick $p$ and $n$ to be whatever you want; in particular, I see no reason why you can't let $p$ depend on $n$.
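To illustrate with the question's own example $p = 1/2 + n^{-3}$ (a sketch using the additive form $e^{-2t(p-1/2)^2}$ for $t$ repetitions; the function name and constants are mine):

```python
import math

def chernoff_majority_error(p, t):
    """Additive Chernoff-Hoeffding bound on the probability that the
    majority vote of t independent trials, each succeeding with
    probability p > 1/2, is wrong:
        P(majority wrong) <= exp(-2 * t * (p - 0.5)**2)."""
    return math.exp(-2 * t * (p - 0.5) ** 2)

# Let p depend on n, e.g. p = 1/2 + n**-3 as in the question.
n = 10
p = 0.5 + n ** -3

# With a constant number of repetitions the bound is nearly vacuous...
print(chernoff_majority_error(p, t=100))        # very close to 1

# ...but taking t polynomial in n (here t ~ n**7, still polynomial
# time overall) drives the error down exponentially.
print(chernoff_majority_error(p, t=10 * n ** 7))
```

This is exactly why the BPP amplification works even for error $1/2 - n^{-c}$: the bound never requires $p$ to be constant, it only makes the required number of repetitions depend on how close $p$ is to $1/2$.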