Poster
Predictive Power of Nearest Neighbors Algorithm under Random Perturbation
Yue Xing · Qifan Song · Guang Cheng
Keywords: [ Learning Theory and Statistics ] [ Statistical Learning Theory ]
Abstract:
This work investigates the predictive performance of the classical $k$ Nearest Neighbors ($k$-NN) algorithm when the testing data are corrupted by random perturbation. The impact of the corruption level on the asymptotic regret is carefully characterized, and we reveal a phase-transition phenomenon: when the corruption level $\omega$ of the random perturbation is below a critical order (i.e., the small-$\omega$ regime), the asymptotic regret remains the same; when it is beyond that order (i.e., the large-$\omega$ regime), the asymptotic regret deteriorates polynomially. More importantly, the regret of the $k$-NN classifier heuristically matches the rate of the minimax regret for randomly perturbed testing data, which implies the strong robustness of $k$-NN against random perturbation of the testing data. In fact, we show that the classical $k$-NN can achieve no worse predictive performance than NN classifiers trained via the popular noise-injection strategy. Our numerical experiments also illustrate that combining a $k$-NN component with modern learning algorithms will inherit the strong robustness of $k$-NN. As a technical by-product, we prove that under different model assumptions, the pre-processed 1-NN proposed in Xue and Kpotufe (2017) will at most achieve a sub-optimal rate when the data dimension $d > 4$, even if $k$ is chosen optimally in the pre-processing step.
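The following is a minimal sketch (not code from the paper) of the corruption scenario the abstract describes: a $k$-NN classifier trained on clean data is evaluated on randomly perturbed testing data and compared against a $k$-NN classifier trained via noise injection. The toy data model, the uniform noise distribution, and the values of n, k, and omega are illustrative assumptions, not the paper's experimental setup.

```python
# Minimal sketch (illustrative assumptions, not the paper's experiment):
# compare a clean-trained k-NN against a noise-injection-trained k-NN,
# both evaluated on randomly perturbed testing data.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
n, d, k, omega = 2000, 2, 25, 0.3  # sample size, dimension, neighbors, corruption level

def sample(n):
    # Toy model: X uniform on [0,1]^d, P(Y=1|X) is a smooth sigmoid score.
    X = rng.uniform(size=(n, d))
    p = 1.0 / (1.0 + np.exp(-4.0 * (X.sum(axis=1) - d / 2.0)))
    return X, (rng.uniform(size=n) < p).astype(int)

X_train, y_train = sample(n)
X_test, y_test = sample(n)

# Random perturbation of the testing data: additive noise uniform on [-omega, omega]^d.
X_test_corrupt = X_test + rng.uniform(-omega, omega, size=X_test.shape)

# Classical k-NN trained on clean data.
clean_knn = KNeighborsClassifier(n_neighbors=k).fit(X_train, y_train)

# Noise-injection strategy: train on a perturbed copy of the training data.
X_train_noisy = X_train + rng.uniform(-omega, omega, size=X_train.shape)
noisy_knn = KNeighborsClassifier(n_neighbors=k).fit(X_train_noisy, y_train)

print("clean-trained k-NN accuracy:  ", clean_knn.score(X_test_corrupt, y_test))
print("noise-injected k-NN accuracy: ", noisy_knn.score(X_test_corrupt, y_test))
```

On synthetic data of this kind, the two accuracies are typically comparable, consistent with the claim that the classical $k$-NN is no worse than noise-injection training under random test-time perturbation.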