

Poster

Online Distribution Learning with Local Privacy Constraints

Jin Sima · Changlong Wu · Olgica Milenkovic · Wojciech Szpankowski

MR1 & MR2 - Number 76

Abstract: We study the problem of online conditional distribution estimation with \emph{unbounded} label sets under local differential privacy. The problem may be succinctly stated as follows. Let $\mathcal{F}$ be a distribution-valued function class with an unbounded label set. Our aim is to estimate an \emph{unknown} function $f \in \mathcal{F}$ in an online fashion. More precisely, at time $t$, given a sample $x_t$, we generate an estimate of $f(x_t)$ using only a \emph{privatized} version of the true \emph{labels} sampled from $f(x_t)$. The objective is to minimize the cumulative KL-risk over a finite horizon $T$. We show that under $(\epsilon,0)$-local differential privacy for the labels, the KL-risk equals $\tilde{\Theta}(\frac{1}{\epsilon}\sqrt{KT})$, up to poly-logarithmic factors, where $K=|\mathcal{F}|$. This result differs significantly from the $\tilde{\Theta}(\sqrt{T\log K})$ bound derived in Wu et al. (2023a) for \emph{bounded} label sets. As a side result, our approach recovers a nearly tight upper bound for the hypothesis selection problem of Gopi et al. (2020), which had previously been established only for the \emph{batch} setting.
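The abstract does not specify which local randomizer is used; as a minimal illustration of the $(\epsilon,0)$-LDP constraint on the labels, the sketch below implements classic $k$-ary randomized response over a \emph{finite} label set (the unbounded-label case studied in the paper requires a more careful mechanism). The function names and the truncation to a finite set are assumptions for illustration only.

```python
import math
import random

def randomized_response(label, label_set, epsilon):
    """Classic k-ary randomized response: report the true label with
    probability e^eps / (e^eps + k - 1), otherwise a uniformly random
    other label. This satisfies (eps, 0)-local differential privacy."""
    k = len(label_set)
    p_true = math.exp(epsilon) / (math.exp(epsilon) + k - 1)
    if random.random() < p_true:
        return label
    return random.choice([y for y in label_set if y != label])

def ldp_ratio(k, epsilon):
    """Worst-case likelihood ratio, over any report, between two different
    true labels; for randomized response it equals e^eps exactly."""
    p_true = math.exp(epsilon) / (math.exp(epsilon) + k - 1)
    p_other = (1.0 - p_true) / (k - 1)
    return p_true / p_other
```

In the online protocol of the abstract, the learner at each time $t$ would observe $x_t$, output an estimate of $f(x_t)$, and then receive only `randomized_response(y_t, ...)` rather than the true label $y_t$ drawn from $f(x_t)$.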
