

Poster

Generalized PTR: User-Friendly Recipes for Data-Adaptive Algorithms with Differential Privacy

Rachel Redberg · Yuqing Zhu · Yu-Xiang Wang

Auditorium 1 Foyer 158

Abstract:

The "Propose-Test-Release" (PTR) framework is a classic recipe for designing differentially private (DP) algorithms that are data-adaptive, i.e. those that add less noise when the input dataset is "nice". We extend PTR to a more general setting by privately testing data-dependent privacy losses rather than local sensitivity, hence making it applicable beyond the standard noise-adding mechanisms, e.g. to queries with unbounded or undefined sensitivity. We demonstrate the versatility of generalized PTR using private linear regression as a case study. Additionally, we apply our algorithm to solve an open problem from “Private Aggregation of Teacher Ensembles (PATE)” --- privately releasing the entire model with a delicate data-dependent analysis.
