Chennai Mathematical Institute

Silver Jubilee Events


CMI Silver Jubilee Lecture

Jaikumar Radhakrishnan, Tata Institute of Fundamental Research, Mumbai

Mutual Information in One-Shot

Wednesday, November 12, 2014

Abstract:

Using two examples, we will describe information-theoretic quantities such as Shannon Entropy, Relative Entropy and Mutual Information.
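
For concreteness, here is a minimal sketch (in Python) of how these quantities are computed; the toy joint distribution is purely illustrative and not taken from the talk.

    import numpy as np

    def entropy(p):
        """Shannon entropy H(p) in bits; zero-probability entries contribute 0."""
        p = np.asarray(p, dtype=float)
        nz = p > 0
        return -np.sum(p[nz] * np.log2(p[nz]))

    def relative_entropy(p, q):
        """Relative entropy (KL divergence) D(p || q) in bits; assumes q > 0 wherever p > 0."""
        p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
        nz = p > 0
        return np.sum(p[nz] * np.log2(p[nz] / q[nz]))

    # Toy joint distribution P(X,Y): rows index X, columns index Y (illustrative choice).
    pxy = np.array([[0.4, 0.1],
                    [0.1, 0.4]])
    px = pxy.sum(axis=1)   # marginal of X
    py = pxy.sum(axis=0)   # marginal of Y

    # Mutual information: I(X;Y) = D(P(X,Y) || P(X)P(Y)) = H(X) + H(Y) - H(X,Y).
    mi_as_kl = relative_entropy(pxy.ravel(), np.outer(px, py).ravel())
    mi_as_entropies = entropy(px) + entropy(py) - entropy(pxy.ravel())
    print(f"H(X) = {entropy(px):.3f} bits, H(Y) = {entropy(py):.3f} bits")
    print(f"I(X;Y) = {mi_as_kl:.3f} bits (agrees with {mi_as_entropies:.3f})")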

Example 1: We have a pair of random variables (X,Y) taking values in S x T. One party (the Sender) is given a value x in S chosen according to the distribution of X and needs to ensure that the other party (the Receiver) gets a value y in T so that (x,y) has the same distribution as (X,Y). How many bits, on average, must the Sender send to the Receiver?
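
As an illustration of the task only (not of the talk's protocol): if the Sender could ship y itself, sampling y from the conditional distribution of Y given X = x makes (x,y) jointly distributed as (X,Y); the question is how few transmitted bits achieve the same effect on average, and the abstract indicates the answer is governed by the mutual information I(X;Y). The sketch below, with a toy joint distribution of our own choosing, only demonstrates the sampling step.

    import numpy as np

    rng = np.random.default_rng(0)
    pxy = np.array([[0.4, 0.1],     # toy P(X,Y): rows index x, columns index y
                    [0.1, 0.4]])
    px = pxy.sum(axis=1)
    py_given_x = pxy / px[:, None]  # conditional distribution P(Y | X = x)

    n = 100_000
    xs = rng.choice(2, size=n, p=px)                              # nature picks x ~ P(X)
    ys = np.array([rng.choice(2, p=py_given_x[x]) for x in xs])   # Sender samples y ~ P(Y | X = x)

    empirical = np.zeros((2, 2))
    np.add.at(empirical, (xs, ys), 1.0 / n)   # empirical joint distribution of (x, y)
    print("target joint:\n", pxy)
    print("empirical joint:\n", empirical.round(3))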

Example 2: There is one Sender and there are two Receivers communicating over a noisy channel: the Sender feeds a letter into the channel and each Receiver receives something in response. How efficiently can the Sender communicate with the two Receivers?
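
For background (with the caveat that the talk concerns a one-shot refinement): Marton's inner bound for a two-receiver broadcast channel, in its standard form, says a rate pair (R1, R2) is achievable whenever R1 <= I(U;Y1), R2 <= I(V;Y2) and R1 + R2 <= I(U;Y1) + I(V;Y2) - I(U;V) for some auxiliary pair (U,V) and a map x = f(u,v). The sketch below evaluates these terms for one toy choice of auxiliaries and channel, chosen by us for illustration.

    import numpy as np

    def mutual_information(pab):
        """I(A;B) in bits, from a joint distribution given as a 2-D array."""
        pa, pb = pab.sum(axis=1), pab.sum(axis=0)
        nz = pab > 0
        return np.sum(pab[nz] * np.log2(pab[nz] / np.outer(pa, pb)[nz]))

    # Toy setup: U, V independent uniform bits, X = U AND V; Y1 and Y2 are X sent
    # through binary symmetric channels with crossover probabilities 0.1 and 0.2.
    puv = np.full((2, 2), 0.25)
    bsc = lambda eps: np.array([[1 - eps, eps], [eps, 1 - eps]])
    W1, W2 = bsc(0.1), bsc(0.2)

    puy1 = np.zeros((2, 2))   # joint distribution P(U, Y1)
    pvy2 = np.zeros((2, 2))   # joint distribution P(V, Y2)
    for u in range(2):
        for v in range(2):
            x = u & v                          # the map x = f(u, v)
            puy1[u] += puv[u, v] * W1[x]
            pvy2[v] += puv[u, v] * W2[x]

    i_uy1 = mutual_information(puy1)
    i_vy2 = mutual_information(pvy2)
    i_uv = mutual_information(puv)
    print(f"I(U;Y1) = {i_uy1:.3f}, I(V;Y2) = {i_vy2:.3f}, I(U;V) = {i_uv:.3f}")
    print(f"Marton sum-rate bound: {i_uy1 + i_vy2 - i_uv:.3f} bits per channel use")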

In Example 1, we will obtain an operational one-shot meaning for Shannon's Mutual Information. In Example 2, we will obtain a one-shot version of Marton's bound.

We will assume no prior background in information theory.

(Based on joint work with David McAllester, Prahladh Harsha, Rahul Jain, Pranab Sen and Naqueeb Warsi.)




