Lower Bounds for the Cooperative Noisy Broadcast Problem
McGill University, Montreal, Canada.
We prove the first non-trivial (superlinear) lower bound in the noisy broadcast model, defined by El Gamal in 1984. In this model there are n+1 processors P_0,P_1,...,P_n, each of which is initially given a private input bit x_i. The goal is for P_0 to learn the value of f(x_1,...,x_n), for some specified function f, using a series of noisy broadcasts. At each step a designated processor broadcasts one bit to all of the other processors, and the bit received by each processor is flipped with some fixed probability, independently for each recipient.
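The broadcast step described above can be sketched as follows; this is an illustrative simulation only, with the flip probability `eps` and the helper name `noisy_broadcast` chosen here for exposition, not taken from the paper.

```python
import random

def noisy_broadcast(bit, n, eps, rng):
    """Simulate one noisy broadcast: a designated processor sends `bit`
    to n recipients; each recipient independently receives the bit
    flipped with probability `eps`."""
    return [bit ^ (rng.random() < eps) for _ in range(n)]

# Example: processor P_1 broadcasts a 1 to the other n = 5 processors.
rng = random.Random(0)
received = noisy_broadcast(1, 5, 0.1, rng)
```

Each entry of `received` models what one recipient hears; a protocol in this model is a sequence of such broadcasts, after which P_0 must output f(x_1,...,x_n) with constant probability.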
In 1988, Gallager gave a noise-resistant protocol that allows P_0 to learn the entire input with constant probability in O(n log log n) broadcasts. We prove that Gallager's protocol is optimal, up to a constant factor. Our lower bound follows by reduction from a lower bound for generalized noisy decision trees, a new model that may be of independent interest. For this new model we show a lower bound of Omega(n log n) on the depth of any tree that learns the entire input.