## Uniform Consistency of Cross-Validation Estimators for High-Dimensional Ridge Regression

### Pratik Patil · Yuting Wei · Alessandro Rinaldo · Ryan Tibshirani

Keywords: Learning Theory and Statistics · High-dimensional Statistics

Wed 14 Apr 12:45 p.m. PDT — 2:45 p.m. PDT

Oral presentation: Deep Learning / High-dimensionality
Thu 15 Apr 2:15 p.m. PDT — 3:15 p.m. PDT

Abstract: We examine generalized and leave-one-out cross-validation for ridge regression in a proportional asymptotic framework where the dimension of the feature space grows proportionally with the number of observations. Given i.i.d. samples from a linear model with an arbitrary feature covariance and a signal vector that is bounded in $\ell_2$ norm, we show that generalized cross-validation for ridge regression converges almost surely to the expected out-of-sample prediction error, uniformly over a range of ridge regularization parameters that includes zero (and even negative values). We prove the analogous result for leave-one-out cross-validation. As a consequence, we show that ridge tuning via minimization of generalized or leave-one-out cross-validation asymptotically almost surely delivers the optimal level of regularization for predictive accuracy, whether it be positive, negative, or zero.
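The tuning procedure the abstract refers to can be written down directly using the standard GCV and shortcut LOOCV formulas for linear smoothers. Below is a minimal numerical sketch of that procedure; the $n$-scaled ridge penalty convention, the simulated Gaussian design, the signal scaling, and the grid of $\lambda$ values (including mildly negative ones) are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def ridge_smoother(X, lam):
    """Ridge smoother matrix S_lam = X (X^T X + n*lam*I)^{-1} X^T.

    Assumes the ridge penalty is scaled by n, one common convention in
    proportional-asymptotics analyses; other conventions omit the n.
    """
    n, p = X.shape
    A = X.T @ X + n * lam * np.eye(p)
    return X @ np.linalg.solve(A, X.T)

def gcv(y, X, lam):
    """Generalized cross-validation criterion at regularization lam."""
    n = len(y)
    S = ridge_smoother(X, lam)
    resid = y - S @ y
    return (resid @ resid / n) / (1.0 - np.trace(S) / n) ** 2

def loocv(y, X, lam):
    """Shortcut leave-one-out CV for linear smoothers (no refitting)."""
    S = ridge_smoother(X, lam)
    resid = y - S @ y
    return np.mean((resid / (1.0 - np.diag(S))) ** 2)

# Illustrative simulation with p/n fixed, per the proportional regime.
rng = np.random.default_rng(0)
n, p = 200, 100
X = rng.standard_normal((n, p))
beta = rng.standard_normal(p) / np.sqrt(p)  # signal bounded in l2 norm
y = X @ beta + rng.standard_normal(n)

# A negative lam keeps the ridge problem well posed only while
# X^T X / n + lam*I stays positive definite, i.e. lam exceeds minus the
# smallest eigenvalue of X^T X / n; stay safely inside that range.
lam_min = -0.9 * np.linalg.eigvalsh(X.T @ X / n)[0]
grid = np.linspace(lam_min, 2.0, 50)
lam_gcv = grid[np.argmin([gcv(y, X, l) for l in grid])]
lam_loo = grid[np.argmin([loocv(y, X, l) for l in grid])]
print(f"GCV-tuned lambda: {lam_gcv:.3f}, LOOCV-tuned lambda: {lam_loo:.3f}")
```

In this sketch both criteria are minimized over the same grid; the paper's uniform consistency result is what licenses reading the argmin of either curve as an asymptotically optimal choice of regularization, even when that argmin lands at zero or at a negative value.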
