3:30 pm, Seminar Hall
CMI Silver Jubilee Lecture

Mutual Information in One Shot
Jaikumar Radhakrishnan (TIFR, Mumbai)
12.11.14

Abstract: Using two examples, we will describe information-theoretic quantities such as Shannon Entropy, Relative Entropy and Mutual Information.

Example 1: We have a pair of random variables (X,Y) taking values in S x T. One party (the Sender) is given a value x in S chosen according to the distribution of X and needs to ensure that the other party (the Receiver) gets a value y in T so that (x,y) has the same distribution as (X,Y). How many bits, on average, must the Sender send the Receiver?

Example 2: There is one Sender and there are two Receivers talking over a noisy channel: the Sender feeds a letter into the channel and the two Receivers each receive something in response. How efficiently can the Sender communicate with the Receivers?

In Example 1, we will obtain an operational one-shot meaning for Shannon's Mutual Information. In Example 2, we will obtain a one-shot version of Marton's bound. We will assume no prior background in information theory.

(Based on joint work with David McAllester, Prahladh Harsha, Rahul Jain, Pranab Sen and Naqueeb Warsi.)
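As a small illustration of the quantities the abstract mentions, the following sketch (not part of the talk; the function names and the toy joint distribution are our own) computes Shannon entropy and mutual information, using the standard identity I(X;Y) = H(X) + H(Y) - H(X,Y):

```python
import math

def entropy(probs):
    # Shannon entropy in bits of a probability vector (zero terms skipped)
    return -sum(p * math.log2(p) for p in probs if p > 0)

def mutual_information(joint):
    # joint: dict mapping (x, y) -> Pr[X=x, Y=y]
    # I(X;Y) = H(X) + H(Y) - H(X,Y)
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return entropy(px.values()) + entropy(py.values()) - entropy(joint.values())

# Toy example: X and Y are perfectly correlated fair bits,
# so learning Y reveals X completely and I(X;Y) = 1 bit.
correlated = {(0, 0): 0.5, (1, 1): 0.5}
print(mutual_information(correlated))  # 1.0
```

For independent fair bits (all four pairs with probability 0.25) the same function returns 0 bits, matching the intuition that Y then tells the Receiver nothing about X.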
