

Poster

Deep Classifier Mimicry without Data Access

Steven Braun · Martin Mundt · Kristian Kersting

Multipurpose Room 2 - Number 117
Award: Student Paper Highlight
Oral presentation: Deep Learning
Sat 4 May 1:30 a.m. PDT — 2:30 a.m. PDT

Abstract:

Access to pre-trained models has recently emerged as a standard across numerous machine learning domains. Unfortunately, access to the original data the models were trained on is often not granted in equal measure. This makes it tremendously challenging to fine-tune or compress models, to adapt them continually, or to perform any other kind of data-driven update. We posit, however, that original data access may not be required. Specifically, we propose Contrastive Abductive Knowledge Extraction (CAKE), a model-agnostic knowledge distillation procedure that mimics deep classifiers without access to the original data. To this end, CAKE generates pairs of noisy synthetic samples and diffuses them contrastively toward a model’s decision boundary. We empirically corroborate CAKE's effectiveness using several benchmark datasets and various architectural choices, paving the way for broad application.
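The abstract's core idea — synthesizing pairs of noisy samples near the teacher's decision boundary and then distilling from the teacher's responses on them — can be illustrated with a minimal PyTorch sketch. This is not the paper's actual algorithm: the contrastive objective, the noise-injection schedule, and all hyperparameters below are placeholder assumptions chosen only to convey the two-stage structure (boundary-seeking sample synthesis, then standard response-based distillation).

```python
# Illustrative sketch only: the losses and schedules here are simplified
# placeholders, not the exact CAKE procedure from the paper.
import torch
import torch.nn.functional as F


def synthesize_boundary_pairs(teacher, num_pairs, input_shape,
                              steps=200, lr=0.1, noise_scale=0.05):
    """Generate pairs of noisy synthetic samples and push each pair toward the
    teacher's decision boundary by maximizing the disagreement between the two
    members of a pair (a stand-in for the paper's contrastive diffusion)."""
    teacher.eval()
    x_a = torch.randn(num_pairs, *input_shape, requires_grad=True)
    x_b = torch.randn(num_pairs, *input_shape, requires_grad=True)
    opt = torch.optim.Adam([x_a, x_b], lr=lr)

    for _ in range(steps):
        opt.zero_grad()
        log_p_a = F.log_softmax(teacher(x_a), dim=1)
        p_b = F.softmax(teacher(x_b), dim=1)
        # Contrastive surrogate: members of a pair should receive different
        # teacher predictions, pulling them to opposite sides of a boundary.
        disagreement = F.kl_div(log_p_a, p_b, reduction="batchmean")
        (-disagreement).backward()
        opt.step()
        # Inject fresh noise each step, loosely mimicking a diffusion-style update.
        with torch.no_grad():
            x_a.add_(noise_scale * torch.randn_like(x_a))
            x_b.add_(noise_scale * torch.randn_like(x_b))

    return torch.cat([x_a, x_b]).detach()


def distill(teacher, student, synthetic_x, epochs=10, temperature=2.0):
    """Standard response-based distillation of the teacher into the student,
    using only the synthetic samples (no original training data)."""
    opt = torch.optim.Adam(student.parameters(), lr=1e-3)
    teacher.eval()
    for _ in range(epochs):
        opt.zero_grad()
        with torch.no_grad():
            t_logits = teacher(synthetic_x) / temperature
        s_logits = student(synthetic_x) / temperature
        loss = F.kl_div(F.log_softmax(s_logits, dim=1),
                        F.softmax(t_logits, dim=1),
                        reduction="batchmean") * temperature ** 2
        loss.backward()
        opt.step()
    return student
```

Usage would look like `student = distill(teacher, student, synthesize_boundary_pairs(teacher, 256, (3, 32, 32)))`, where `teacher` and `student` are any `torch.nn.Module` classifiers with matching output dimensions; the procedure is model-agnostic in that it only queries the teacher's outputs.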
