

Poster

Minimax Optimal Regression over Sobolev Spaces via Laplacian Regularization on Neighborhood Graphs

Alden Green · Sivaraman Balakrishnan · Ryan Tibshirani

Keywords: [ Models and Methods ] [ Learning on Graphs ]


Abstract: In this paper we study the statistical properties of Laplacian smoothing, a graph-based approach to nonparametric regression. Under standard regularity conditions, we establish upper bounds on the error of the Laplacian smoothing estimator $\hat{f}$, and a goodness-of-fit test also based on $\hat{f}$. These upper bounds match the minimax optimal estimation and testing rates of convergence over the first-order Sobolev class $H^1(\mathcal{X})$, for $\mathcal{X} \subseteq \mathbb{R}^d$ and $1 \le d < 4$; in the estimation problem, for $d = 4$, they are optimal modulo a $\log n$ factor. Additionally, we prove that Laplacian smoothing is manifold-adaptive: if $\mathcal{X} \subseteq \mathbb{R}^d$ is an $m$-dimensional manifold with $m < d$, then the error rate of Laplacian smoothing (in either estimation or testing) depends only on $m$, in the same way it would if $\mathcal{X}$ were a full-dimensional set in $\mathbb{R}^m$.
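The abstract gives no implementation details, but as a rough illustration of the general technique, the sketch below shows one common form of Laplacian smoothing on a neighborhood graph: connect points within distance $\varepsilon$, form the unnormalized graph Laplacian $L = D - W$, and take the penalized least-squares fit $\hat{f} = (I + \lambda L)^{-1} y$. The function name `laplacian_smoothing`, the $\varepsilon$-graph construction, and the parameters `eps` and `lam` are illustrative assumptions and are not taken from the paper.

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform

def laplacian_smoothing(X, y, eps, lam):
    """Minimal sketch (not the paper's exact estimator): fit
    f_hat = argmin_f ||y - f||_2^2 + lam * f^T L f
    where L is the unnormalized Laplacian of an eps-neighborhood graph on X.
    """
    D = squareform(pdist(X))            # pairwise Euclidean distances
    W = (D <= eps).astype(float)        # eps-neighborhood adjacency matrix
    np.fill_diagonal(W, 0.0)            # drop self-loops
    L = np.diag(W.sum(axis=1)) - W      # unnormalized graph Laplacian L = D - W
    n = len(y)
    # (I + lam * L) is symmetric positive definite, so the solve is well posed.
    return np.linalg.solve(np.eye(n) + lam * L, y)

# Example usage on synthetic data: noisy observations of a smooth signal on [0, 1]^2.
rng = np.random.default_rng(0)
X = rng.uniform(size=(500, 2))
y = np.sin(2 * np.pi * X[:, 0]) + 0.1 * rng.standard_normal(500)
f_hat = laplacian_smoothing(X, y, eps=0.1, lam=1.0)
```

In practice the neighborhood radius (or a k-nearest-neighbor rule) and the penalty level $\lambda$ are tuning choices; the rates in the abstract concern how such choices can be made to achieve minimax optimal error over $H^1(\mathcal{X})$.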
