Chennai Mathematical Institute

Seminars




2.00 pm, NKN Hall
Theoretical Connections between Convex Optimization and Active Learning

Aaditya Ramdas
Carnegie Mellon University
07-01-14


Abstract

Stochastic convex optimization (SCO) under the first-order oracle model deals with approximately optimizing a convex function over a convex set, given access to noisy function and gradient values at any queried point, using as few queries as possible. Active learning of threshold (ALT) classifiers is a classification problem in which every point has a binary label; it deals with approximately locating a "threshold" on a subinterval of the real line (a point to the left of which labels are more likely to be negative, and to the right of which labels are more likely to be positive), given access to an oracle that returns noisy labels, using as few queries as possible.
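
To make the two query models concrete, here is a minimal Python sketch of the two oracles; the quadratic objective, noise level, threshold location, and label flip rate below are illustrative assumptions, not details from the talk.

    import random

    # Noisy first-order oracle for SCO: returns the function value and
    # gradient of an illustrative f(x) = (x - 0.3)^2, each corrupted by
    # additive Gaussian noise.
    def first_order_oracle(x, sigma=0.1):
        f = (x - 0.3) ** 2
        g = 2.0 * (x - 0.3)
        return f + random.gauss(0, sigma), g + random.gauss(0, sigma)

    # Noisy label oracle for ALT: labels are more likely negative to the
    # left of the threshold and positive to its right.
    def label_oracle(x, threshold=0.3, flip_prob=0.2):
        label = -1 if x < threshold else +1
        return -label if random.random() < flip_prob else label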

In this talk, we will discuss four things:

1) The key insight: Exploiting the sequential nature of both problems, we establish a concrete similarity between "Tsybakov noise" from ALT theory and "strong/uniform convexity" from SCO theory (formalized just after this list).

2) Lower bounds: We will show how information-theoretic lower-bound techniques from ALT can be used to obtain very similar lower bounds in SCO (and show that these are tight, with matching upper bounds).

3) Upper bounds: Going the other way, we will show how upper-bound techniques from SCO can be used to obtain a very similar algorithm for ALT that is adaptive to unknown Tsybakov noise parameters.

4) Combination: We will combine some of these techniques to yield a coordinate descent algorithm that uses active learning for line search and is adaptive to unknown strong/uniform convexity parameters (sketched in code after this list).
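
One standard way to formalize the parallel in point 1 (the generic constants c, \lambda and the exponent \kappa follow textbook definitions and are assumptions here, not the talk's exact conditions):

    \[
      |\eta(x) - \tfrac{1}{2}| \;\ge\; c\,|x - t|^{\kappa - 1}
      \quad \text{(Tsybakov noise around the threshold } t\text{)}
    \]
    \[
      f(y) \;\ge\; f(x) + \langle \nabla f(x),\, y - x \rangle + \tfrac{\lambda}{2}\,\|y - x\|^{\kappa}
      \quad \text{(uniform convexity of degree } \kappa\text{)}
    \]

Here \eta(x) is the probability of a positive label at x. The bridge is that, in one dimension, the sign of a noisy derivative behaves like a noisy binary label: f'(x) changes sign at the minimizer just as \eta(x) crosses 1/2 at the threshold, and in both conditions the exponent \kappa - 1 governs how quickly the signal grows as one moves away from that point.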
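
As a minimal sketch of how points 3 and 4 fit together, the Python below runs coordinate descent in which each one-dimensional line search is an active-learning problem: it bisects on the sign of the noisy partial derivative, denoised by majority vote. The objective, noise model, and fixed repetition budget are illustrative assumptions; the talk's algorithm adapts to the unknown noise/convexity parameters rather than using a fixed budget.

    import random

    def noisy_partial(grad, x, i, sigma=0.1):
        # Query the first-order oracle: the i-th partial derivative plus noise.
        return grad(x)[i] + random.gauss(0, sigma)

    def majority_sign(grad, x, i, repeats=25):
        # The sign of the noisy derivative is a noisy binary label;
        # repeated queries and a majority vote reduce the flip probability.
        votes = sum(1 if noisy_partial(grad, x, i) > 0 else -1
                    for _ in range(repeats))
        return 1 if votes > 0 else -1

    def active_line_search(grad, x, i, lo, hi, tol=1e-3):
        # Bisection on the derivative's sign: locating its zero crossing
        # along coordinate i is exactly a threshold-learning problem.
        while hi - lo > tol:
            x[i] = (lo + hi) / 2.0
            if majority_sign(grad, x, i) > 0:
                hi = x[i]  # positive slope: minimizer lies to the left
            else:
                lo = x[i]
        x[i] = (lo + hi) / 2.0

    def coordinate_descent(grad, x0, box, sweeps=10):
        x = list(x0)
        for _ in range(sweeps):
            for i in range(len(x)):
                active_line_search(grad, x, i, box[i][0], box[i][1])
        return x

    # Illustrative use: minimize (x0 - 1)^2 + 2*(x1 + 0.5)^2 over a box.
    grad = lambda x: [2.0 * (x[0] - 1.0), 4.0 * (x[1] + 0.5)]
    print(coordinate_descent(grad, [0.0, 0.0], [(-2.0, 2.0), (-2.0, 2.0)]))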

This is joint work with Aarti Singh.