Poster

Importance Matching Lemma for Lossy Compression with Side Information

Truong Buu Phan · Ashish Khisti · Christos Louizos

Multipurpose Room 2 - Number 104

Abstract:

We propose two extensions to existing importance sampling based methods for lossy compression. First, we introduce an importance sampling based compression scheme that is a variant of ordered random coding (Theis and Ahmed, 2022) and is amenable to direct evaluation of the achievable compression rate for a finite number of samples. Our second and major contribution is the importance matching lemma, which is a finite proposal counterpart of the recently introduced Poisson matching lemma (Li and Anantharam, 2021). By integrating with deep learning, we provide a new coding scheme for distributed lossy compression with side information at the decoder. We demonstrate the effectiveness of the proposed scheme through experiments involving synthetic Gaussian sources, distributed image compression with MNIST and vertical federated learning with CIFAR-10.
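To give a feel for the importance sampling compression paradigm the abstract builds on, here is a minimal, hypothetical sketch (not the authors' ordered-random-coding variant or the importance matching lemma itself): encoder and decoder share a random seed, the encoder draws a finite number of candidates from a fixed proposal, selects one index with probability proportional to the importance weights p(z|x)/q(z), and transmits only that index. A one-dimensional Gaussian target and proposal are assumed purely for illustration.

```python
# Minimal sketch of importance-sampling-based compression (generic
# "reverse channel coding" idea), assuming a 1-D Gaussian target p(z|x)
# and proposal q(z). This is an illustrative example, not the paper's scheme.
import numpy as np

def encode(x, num_samples, seed, proposal_std=1.0, target_std=0.5):
    """Encoder: draw shared-seed candidates from q(z) = N(0, proposal_std^2),
    then pick one index with probability proportional to the importance
    weights w_i = p(z_i | x) / q(z_i). Only the index is transmitted."""
    rng = np.random.default_rng(seed)           # randomness shared with decoder
    z = rng.normal(0.0, proposal_std, num_samples)
    # Log importance weights for target p(z|x) = N(x, target_std^2).
    log_p = -0.5 * ((z - x) / target_std) ** 2 - np.log(target_std)
    log_q = -0.5 * (z / proposal_std) ** 2 - np.log(proposal_std)
    log_w = log_p - log_q
    probs = np.exp(log_w - log_w.max())
    probs /= probs.sum()
    index = rng.choice(num_samples, p=probs)    # costs about log2(num_samples) bits
    return index

def decode(index, num_samples, seed, proposal_std=1.0):
    """Decoder: regenerate the identical candidate list from the shared seed
    and return the candidate selected by the transmitted index."""
    rng = np.random.default_rng(seed)
    z = rng.normal(0.0, proposal_std, num_samples)
    return z[index]

if __name__ == "__main__":
    x, num_samples, seed = 0.7, 1024, 42
    idx = encode(x, num_samples, seed)
    z_hat = decode(idx, num_samples, seed)
    print(f"source x={x:.3f}, reconstructed z={z_hat:.3f}, index={idx}")
```

In this toy setup the rate is fixed by the number of candidates; the paper's contribution concerns finite-sample analysis of such schemes and a matching guarantee (the importance matching lemma) that enables distributed coding with decoder-side information.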
