

Poster

Federated Experiment Design under Distributed Differential Privacy

Wei-Ning Chen · Graham Cormode · Akash Bharadwaj · Peter Romov · Ayfer Ozgur

MR1 & MR2 - Number 60
Sat 4 May 6 a.m. PDT — 8:30 a.m. PDT

Abstract:

Experiment design has a rich history dating back over a century and has since found critical applications across many fields. Experiments often collect and use sensitive personal information, so additional measures are required to protect individual privacy during data collection, storage, and usage. In this work, we focus on rigorously protecting users' privacy (under the notion of differential privacy (DP)) while minimizing trust in service providers. Specifically, we consider estimating the average treatment effect (ATE) under DP, while allowing the analyst to collect only population-level statistics via secure aggregation, a distributed protocol that enables a service provider to aggregate information without accessing individual data. Although a vital component of modern A/B testing workflows, private distributed experimentation has not previously been studied. To achieve DP, we design local privatization mechanisms that are compatible with secure aggregation and analyze their utility, in terms of the width of confidence intervals, both asymptotically and non-asymptotically. We show how these mechanisms can be scaled up to handle the very large numbers of participants commonly found in practice. In addition, when introducing DP noise, it is imperative to carefully split the privacy budget between estimating the mean and the variance of the outcomes, and to calibrate the confidence intervals according to the DP noise. Finally, we present comprehensive experimental evaluations of our proposed schemes and demonstrate the privacy-utility trade-offs in experiment design.
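To make the setup concrete, here is a minimal sketch of the overall pipeline the abstract describes: clipped per-user outcomes are aggregated per arm, the privacy budget is split between a noisy sum (mean) and a noisy sum of squares (variance), and the confidence interval is widened to account for the injected noise. This is not the paper's actual mechanism: the paper designs discrete mechanisms compatible with finite-field secure aggregation, whereas this sketch idealizes distributed noise as per-client Gaussian shares whose sum matches a central Gaussian mechanism. The function names (`dp_ate_ci`, `dp_arm_stats`), the 80/20 budget split, the classical (non-tight) Gaussian calibration, and the naive composition across the two queries are all illustrative assumptions.

```python
import numpy as np


def gaussian_sigma(eps, delta, sensitivity=1.0):
    # Classical (not tight) Gaussian mechanism calibration:
    # sigma >= sensitivity * sqrt(2 ln(1.25/delta)) / eps.
    return sensitivity * np.sqrt(2.0 * np.log(1.25 / delta)) / eps


def dp_arm_stats(y, eps_mean, eps_var, delta, rng):
    """Aggregate one arm's clipped outcomes under an idealized
    distributed-Gaussian model: each of the n clients adds
    N(0, sigma^2 / n) noise to its contribution, so the securely
    aggregated sum carries total noise N(0, sigma^2)."""
    n = len(y)
    y = np.clip(y, 0.0, 1.0)                  # bound per-user sensitivity
    sig_s = gaussian_sigma(eps_mean, delta)   # noise scale for the sum
    sig_q = gaussian_sigma(eps_var, delta)    # noise scale for sum of squares
    noisy_sum = y.sum() + rng.normal(0.0, sig_s / np.sqrt(n), size=n).sum()
    noisy_sq = (y ** 2).sum() + rng.normal(0.0, sig_q / np.sqrt(n), size=n).sum()
    mean = noisy_sum / n
    # Plug-in variance estimate; the DP noise on the mean is added
    # to the CI width below rather than to this point estimate.
    var = max(noisy_sq / n - mean ** 2, 0.0)
    return mean, var, sig_s, n


def dp_ate_ci(y_treat, y_ctrl, eps=1.0, delta=1e-6, split=0.8, seed=0):
    """ATE point estimate and a 95% CI that is inflated by the
    variance the DP noise adds to each arm's estimated mean."""
    rng = np.random.default_rng(seed)
    eps_mean, eps_var = split * eps, (1.0 - split) * eps
    arms = [dp_arm_stats(y, eps_mean, eps_var, delta, rng)
            for y in (y_treat, y_ctrl)]
    ate = arms[0][0] - arms[1][0]
    # Per-arm sampling variance of the mean plus DP noise variance on it.
    se2 = sum(v / n + (s / n) ** 2 for (_, v, s, n) in arms)
    half = 1.96 * np.sqrt(se2)
    return ate, (ate - half, ate + half)
```

A quick usage example on synthetic binary outcomes: with 50,000 users per arm and true means 0.55 vs. 0.50, the returned interval should comfortably contain the true ATE of 0.05, and shrinking `eps` visibly widens it, illustrating the privacy-utility trade-off.

```python
rng = np.random.default_rng(1)
y_t = rng.binomial(1, 0.55, size=50_000).astype(float)
y_c = rng.binomial(1, 0.50, size=50_000).astype(float)
print(dp_ate_ci(y_t, y_c, eps=1.0))
```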
