

Poster

RankDistil: Knowledge Distillation for Ranking

Sashank Reddi · Rama Kumar Pasumarthi · Aditya Menon · Ankit Singh Rawat · Felix Yu · Seungyeon Kim · Andreas Veit · Sanjiv Kumar

Keywords: [ Applications ] [ Neuroscience and Cognitive Science ] [ Neuroscience ] [ Neuroscience and Cognitive Science -> Human or Animal Learning; Probabilistic Methods ] [ Belief Propagation; Probabilistic Methods ] [ Information Retrieval ]


Abstract:

Knowledge distillation is an approach to improving the performance of a student model by using the knowledge of a more complex teacher model. Despite its success in several deep learning applications, the study of distillation has been mostly confined to classification settings. In particular, the use of distillation in top-k ranking settings, where the goal is to rank the k most relevant items correctly, remains largely unexplored. In this paper, we study such ranking problems through the lens of distillation. We present a distillation framework for top-k ranking and draw connections with existing ranking methods. The core idea of this framework is to preserve the ranking at the top by matching the order of the student's and teacher's items, while penalizing large student scores on items the teacher ranks low. Building on this, we develop a novel distillation approach, RankDistil, specifically tailored to ranking problems with a large number of items to rank, and establish a statistical basis for the method. Finally, we conduct experiments that demonstrate that RankDistil yields benefits over commonly used baselines for ranking problems.
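The core idea in the abstract can be made concrete with a short sketch. The following minimal PyTorch example assumes student and teacher score tensors of shape [batch, num_items]; the function name topk_distillation_loss, the ListMLE-style ordering term over the teacher's top-k items, and the hinge penalty on low-ranked items are illustrative assumptions, not the exact RankDistil objective from the paper.

    # Sketch of a top-k ranking distillation loss in the spirit described above.
    # Assumptions: a ListMLE-style term matches the teacher's top-k ordering,
    # and a hinge penalty discourages large student scores on items the teacher
    # ranks low. Not the authors' implementation.
    import torch

    def topk_distillation_loss(student_scores, teacher_scores, k=10, margin=0.0, alpha=1.0):
        """student_scores, teacher_scores: [batch, num_items] score tensors."""
        # Indices of the teacher's top-k items, ordered by decreasing teacher score.
        topk_idx = teacher_scores.topk(k, dim=-1).indices          # [batch, k]
        s_topk = student_scores.gather(-1, topk_idx)               # student scores at those items

        # (a) Order-matching term: negative log-likelihood that the student
        # reproduces the teacher's ordering of its top-k items (ListMLE style):
        # -sum_i [ s_(i) - logsumexp(s_(i), ..., s_(k)) ].
        rev_cumlse = torch.flip(torch.logcumsumexp(torch.flip(s_topk, [-1]), dim=-1), [-1])
        order_loss = (rev_cumlse - s_topk).sum(-1).mean()

        # (b) Penalty on items outside the teacher's top-k: hinge on how far the
        # student scores them above its own lowest top-k score.
        mask = torch.ones_like(student_scores)
        mask.scatter_(-1, topk_idx, 0.0)                           # 1 for non-top-k items
        threshold = s_topk.min(dim=-1, keepdim=True).values
        low_penalty = (torch.relu(student_scores - threshold + margin) * mask).sum(-1).mean()

        return order_loss + alpha * low_penalty

In practice such a distillation term would typically be combined with the usual supervised ranking loss on ground-truth labels, with alpha trading off the ordering term against the penalty on low-ranked items.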
