Poster

Think Global, Adapt Local: Learning Locally Adaptive K-Nearest Neighbor Kernel Density Estimators

Kenny Olsen · Rasmus Høegh Lindrup · Morten Mørup

MR1 & MR2 - Number 168

Abstract:

Kernel density estimation (KDE) is a powerful technique for non-parametric density estimation, yet the practical use of KDE-based methods remains limited by insufficient representational flexibility, especially for higher-dimensional data. Contrary to KDE, K-nearest neighbor (KNN) density estimation procedures locally adapt the density based on the K-nearest neighborhood, but unfortunately only provide asymptotically correct density estimates. We present KNN-KDE, a method that introduces observation-specific kernels for KDE that are locally adapted through priors defined by the covariance of the K-nearest neighborhood, forming a fully Bayesian model with exact density estimates. We further derive a scalable inference procedure that infers parameters through variational inference by optimizing the predictive likelihood, exploiting sparsity, batched optimization, and parallel computation for massive inference speedups. We find that KNN-KDE provides valid density estimates superior to conventional KDE and KNN density estimation on both synthetic and real data sets. We further observe that the Bayesian KNN-KDE even outperforms recent neural density estimation procedures on two of the five considered real data sets. KNN-KDE unifies conventional kernel and KNN density estimation, providing a scalable, generic, and accurate framework for density estimation.
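To make the core idea concrete, below is a minimal NumPy/SciPy sketch of a locally adaptive KNN-KDE in the spirit of the abstract: each training point carries its own Gaussian kernel whose covariance is estimated from that point's K-nearest neighborhood. This is an illustrative sketch only, not the authors' fully Bayesian model or their variational inference procedure; the function name `knn_adaptive_kde` and the parameters `k` and `reg` are hypothetical.

```python
import numpy as np
from scipy.spatial import cKDTree

def knn_adaptive_kde(train, query, k=10, reg=1e-6):
    """Evaluate a locally adaptive Gaussian KDE at `query` points.

    Each training point x_i gets its own full-covariance Gaussian
    kernel, estimated from the covariance of its k-nearest
    neighborhood (plus a small ridge `reg` for numerical stability).
    Illustrative sketch: not the Bayesian KNN-KDE of the paper.
    """
    n, d = train.shape
    tree = cKDTree(train)
    # k+1 because the query point itself is returned as its own neighbor
    _, idx = tree.query(train, k=k + 1)

    # Per-point kernel covariances estimated from the local neighborhoods
    covs = np.empty((n, d, d))
    for i in range(n):
        nbrs = train[idx[i]]
        covs[i] = np.cov(nbrs, rowvar=False) + reg * np.eye(d)

    # Precompute inverses and Gaussian log-normalizing constants
    inv_covs = np.linalg.inv(covs)
    log_norm = -0.5 * (d * np.log(2 * np.pi) + np.linalg.slogdet(covs)[1])

    # Density at each query point: average over the n local kernels
    dens = np.zeros(len(query))
    for i in range(n):
        diff = query - train[i]                               # (m, d)
        maha = np.einsum('md,de,me->m', diff, inv_covs[i], diff)
        dens += np.exp(log_norm[i] - 0.5 * maha)
    return dens / n
```

Using a full neighborhood covariance per point, rather than a single global bandwidth as in conventional KDE, is what lets the estimator adapt to locally anisotropic structure; the ridge term `reg` merely keeps each covariance invertible for small neighborhoods.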
