

Improved Bound on Generalization Error of Compressed KNN Estimator

Hang Zhang · Ping Li

Auditorium 1 Foyer 102

Abstract: This paper studies the generalization capability of the compressed \emph{$k$-nearest neighbor} (KNN) estimator, where randomly projected low-dimensional data are fed into the KNN estimator instead of the high-dimensional raw data. Considering both regression and classification, we give improved bounds on its generalization errors, more specifically, the $\ell_2$ error for regression and the misclassification rate for classification. As a byproduct of our analysis, we prove that the ordering of distances is almost preserved under random projection, which we believe is the first result of this kind. In addition, we provide numerical experiments on various public datasets to verify our theorems.
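The estimator the abstract describes can be illustrated with a short sketch: project the data with a Gaussian random matrix, run plain KNN in the compressed space, and compare the neighbor set against the one found in the raw space. This is a minimal illustration with synthetic data, not the paper's experimental setup; the dimensions, the projection scaling $1/\sqrt{k}$, and the `knn_predict` helper are all assumptions for the sake of the example.

```python
import numpy as np

rng = np.random.default_rng(0)

n, d, k_dim, k_nn = 200, 500, 20, 5  # illustrative sizes, not from the paper

# Synthetic regression data: y depends on the first two coordinates plus noise
X = rng.standard_normal((n, d))
y = X[:, 0] + 0.5 * X[:, 1] + 0.1 * rng.standard_normal(n)

# Random projection (Johnson-Lindenstrauss style): i.i.d. N(0, 1/k_dim) entries
R = rng.standard_normal((d, k_dim)) / np.sqrt(k_dim)
Z = X @ R  # compressed low-dimensional data

def knn_predict(train_X, train_y, query, k):
    """Plain k-NN regression: average the labels of the k nearest points."""
    dists = np.linalg.norm(train_X - query, axis=1)
    idx = np.argsort(dists)[:k]
    return train_y[idx].mean(), set(idx.tolist())

# Compare neighbor sets found in raw vs. compressed space for one query point
q = rng.standard_normal(d)
pred_raw, nbrs_raw = knn_predict(X, y, q, k_nn)
pred_cmp, nbrs_cmp = knn_predict(Z, y, q @ R, k_nn)

overlap = len(nbrs_raw & nbrs_cmp) / k_nn
print(f"raw pred={pred_raw:.3f}  compressed pred={pred_cmp:.3f}  "
      f"neighbor overlap={overlap:.0%}")
```

The neighbor-set overlap gives a crude empirical handle on the abstract's ordered-distance claim: if random projection approximately preserves the ordering of pairwise distances, the $k$ nearest neighbors in the compressed space should largely coincide with those in the raw space as the projection dimension grows.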
