Over the past two decades, information theory has reemerged within computational complexity theory as a mathematical tool for obtaining tight unconditional bounds in a number of models, including streaming algorithms, data structures, and communication complexity. Many of these applications can be systematized and extended via the study of information complexity, which treats the information revealed or transmitted as the resource to be conserved.

In this talk we will discuss two-party information complexity and its properties, as well as the interactive analogues of classical source coding theorems. We will then discuss applications to exact bounds in a number of models, along with current challenges and extensions.


Mark Braverman is a Professor of Computer Science at Princeton University, having joined the department in 2011 from the University of Toronto, where he was an assistant professor in the mathematics and computer science departments. He earned his Ph.D. in 2008 from Toronto and did postdoctoral research at Microsoft Research New England in Cambridge, MA. Professor Braverman’s interests center on the connections between theoretical computer science and other disciplines, including information theory, mathematics, and economics. Most recently, he has been building new connections between information theory and complexity theory, studying the effects of noise in a variety of computational settings, and investigating how better algorithms can lead to better mechanism design, particularly in the context of healthcare. He was awarded the 2016 EMS Prize from the European Mathematical Society, the 2016 Presburger Award, and the 2019 Alan T. Waterman Award from the NSF.