11.00 am, Seminar Hall
Derandomizing Isolation in the Space-Bounded Setting
University of Wisconsin-Madison.
Isolation is the process of singling out a solution to a problem that may have many solutions. It plays an important role in the design of parallel algorithms, where it allows us to ensure that multiple parallel processes all work towards the same global solution. For example, the best parallel algorithms for finding perfect matchings in graphs hinge on isolation for this reason. Isolation is also an ingredient in sequential algorithms for NP-complete problems like Hamiltonicity. In the algebraic setting, isolation helps avoid cancellations and allows us to reduce multivariate polynomial identity testing to univariate polynomial identity testing. All of these algorithms are randomized, and the only reason is their use of the Isolation Lemma: for any set system over a finite universe, a random assignment of small integer weights to the elements of the universe has a high probability of yielding a unique set of minimum weight in the system.
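The Isolation Lemma stated above can be observed empirically. The sketch below (a hypothetical illustration, not part of the talk) draws weights uniformly from {1, ..., 2n} for a universe of n elements, as in the standard formulation of the lemma, and counts how often the set system ends up with a unique minimum-weight set:

```python
import random
from itertools import combinations

def has_unique_min(sets, weights):
    """Return True if exactly one set attains the minimum total weight."""
    totals = [sum(weights[e] for e in s) for s in sets]
    return totals.count(min(totals)) == 1

def isolation_trial(n, sets, rng):
    # Isolation Lemma: weights drawn uniformly from {1, ..., 2n}
    # isolate a unique minimum-weight set with probability >= 1/2.
    weights = {e: rng.randint(1, 2 * n) for e in range(n)}
    return has_unique_min(sets, weights)

rng = random.Random(0)  # fixed seed so the experiment is reproducible
n = 8
# An example set system: all 4-element subsets of {0, ..., 7}.
# Every set has the same size, so without weights all are "tied".
sets = [frozenset(c) for c in combinations(range(n), 4)]

trials = 1000
successes = sum(isolation_trial(n, sets, rng) for _ in range(trials))
print(f"unique minimum-weight set in {successes}/{trials} trials")
```

In practice the empirical isolation rate is well above the 1/2 guaranteed by the lemma; the point of the talk is whether such isolation can be achieved deterministically rather than by random weights.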
This talk is about the possibility of deterministic isolation in the space-bounded setting. The question is: Can one always make the accepting computation paths of nondeterministic space-bounded machines unique without changing the underlying language and without blowing up the space by more than a constant factor? I will present some partial results towards the resolution of this question, obtained jointly with Dieter van Melkebeek.