CSL Communications Group Calendar

Event Detail Information

Nonasymptotic lossy compression

Speaker Victoria Kostina, Princeton University
Date Apr 15, 2013
Time 4:00 pm - 5:00 pm  
Location 141 Coordinated Science Lab
Sponsor CSL Communications Seminar
Contact Peggy Wells
Phone 217-244-2646
ABSTRACT: The fundamental problem of lossy compression is to represent an object with the best compression ratio (rate) possible while satisfying a constraint on the fidelity of reproduction. Traditional (asymptotic) information theory describes the optimum tradeoff between rate and fidelity that is achievable in the limit of infinite length of the source block to be compressed. Since limited delay is a key design requirement in many modern compression and transmission applications, we drop the assumption of unbounded blocklength and analyze the optimum tradeoff among rate, fidelity, and blocklength that is achievable regardless of the complexity of the code. We have found that the key random variable governing the nonasymptotic fundamental limit of lossy compression is the so-called d-tilted information in x, which quantifies the number of bits required to represent source outcome x within distortion d. If we let the blocklength increase indefinitely, then by the law of large numbers the d-tilted information in most source outcomes is near its mean, which equals Shannon's rate-distortion function. At finite blocklength, however, the whole distribution of d-tilted information matters. Interestingly, in the seemingly unrelated problem of channel coding under input cost constraints, the maximum nonasymptotically achievable channel coding rate can be bounded in terms of the distribution of a random variable termed the b-tilted information in channel input x, which parallels the notion of d-tilted information in lossy compression. Finally, we have analyzed the nonasymptotic fundamental limits of lossy joint source-channel coding, showing that separate design of source and channel codes, known to be optimal asymptotically, fails to achieve the nonasymptotic fundamental limit. Nonasymptotic information theory thus forces us to unlearn the lessons instilled by traditional asymptotic thinking.

BIO: Victoria Kostina has been a PhD student at Princeton University since 2008, where she works with Professor Sergio Verdú on information-theoretic limits of lossy data compression in the finite blocklength regime. Previously, she received a Bachelor's degree from the Moscow Institute of Physics and Technology and a Master's degree from the University of Ottawa.
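
For reference, the d-tilted information mentioned in the abstract is commonly written as follows; this is a standard formulation from the finite-blocklength compression literature, and the notation here is illustrative rather than taken from the talk announcement itself:

  \jmath_X(x, d) = \log \frac{1}{\mathbb{E}\left[\exp\left(\lambda^\star d - \lambda^\star\, \mathsf{d}(x, Y^\star)\right)\right]}, \qquad \lambda^\star = -R_X'(d),

where Y^\star is distributed as the output of the rate-distortion-achieving conditional distribution. Its mean recovers Shannon's rate-distortion function, \mathbb{E}\left[\jmath_X(X, d)\right] = R_X(d), which is why only the average is visible in the asymptotic limit, while the full distribution governs the finite-blocklength behavior discussed in the talk.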