Two key challenges in fitting Big Data problems into a lossy compression framework are (i) selecting an appropriate distortion measure, and (ii) characterizing the performance of distributed systems. Inspired by real systems, like Google, which return a list of likely data entries indexed by likelihood, we study the "logarithmic loss" distortion function in a multiterminal setting, thus addressing both challenges. In particular, we characterize the rate-distortion region for two (generally open) multiterminal source coding problems when distortion is measured under logarithmic loss. In addition to the main results, we will discuss applications to machine learning, estimation, and combinatorics.
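To make the distortion measure concrete, here is a minimal sketch of logarithmic loss under its standard definition (an assumption; the talk's exact setup may differ): the decoder reproduces a source symbol with a probability distribution over the source alphabet, and the distortion is the negative log-probability the distribution assigns to the true symbol. This rewards confident, correct "soft" reproductions, matching the list-with-likelihoods intuition above.

```python
import math

def log_loss(x, q):
    """Logarithmic-loss distortion when source symbol x is
    reproduced by the probability distribution q (a dict mapping
    symbols to probabilities). Lower is better."""
    return -math.log(q[x])

# A confident, correct reproduction incurs low distortion;
# a diffuse (uncertain) reproduction incurs higher distortion.
confident = {"a": 0.9, "b": 0.05, "c": 0.05}
diffuse = {"a": 1 / 3, "b": 1 / 3, "c": 1 / 3}
```

With the true symbol "a", `log_loss("a", confident)` is smaller than `log_loss("a", diffuse)`, and a distribution placing all mass on the true symbol incurs zero distortion.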
Thomas Courtade received the B.S. degree in Electrical Engineering from Michigan Technological University in 2007, and his M.S. and Ph.D. degrees in Electrical Engineering from UCLA in 2008 and 2012, respectively. In 2012, he was awarded a Postdoctoral Research Fellowship at the Center for Science of Information. He currently holds this position and splits his time between Stanford and Princeton Universities.
While at UCLA, he was the recipient of several fellowships, including the UCLA Dissertation Year Fellowship. In recognition of his teaching efforts, he was awarded an Excellence in Teaching Award from the Department of Electrical Engineering in 2011. Recently, his thesis, "Two Problems in Multiterminal Information Theory", was recognized with the Distinguished Ph.D. Dissertation Award, and his paper "Multiterminal Source Coding under Logarithmic Loss" received a Best Student Paper Award at the 2012 International Symposium on Information Theory.