Oral

Data-splitting improves statistical performance in overparameterized regimes

Nicole Mücke · Enrico Reiss · Jonas Rungenhagen · Markus Klein

Abstract:

While large training datasets generally improve model performance, the training process becomes computationally expensive and time consuming. Distributed learning is a common strategy to reduce the overall training time by exploiting multiple computing devices. Recently, it has been observed in the single-machine setting that overparameterization is essential for benign overfitting in ridgeless regression in Hilbert spaces. We show that in this regime, data splitting has a regularizing effect, improving statistical performance and computational complexity at the same time. We further provide a unified framework that allows us to analyze both the finite- and infinite-dimensional settings. We numerically demonstrate the effect of different model parameters.
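
As a rough illustration of the data-splitting scheme the abstract describes, the minimal Python sketch below fits a ridgeless (minimum-norm) least-squares estimator on each of several disjoint subsets of an overparameterized toy problem and averages the resulting coefficients. The setup (Gaussian design, uniform averaging over splits) and all function names are illustrative assumptions, not the authors' implementation.

    import numpy as np

    def minnorm_fit(X, y):
        # Ridgeless estimator: minimum-norm least-squares solution
        # computed via the Moore-Penrose pseudoinverse.
        return np.linalg.pinv(X) @ y

    def split_and_average(X, y, n_splits):
        # Partition the data into disjoint subsets, fit a local
        # ridgeless estimator on each, and average the coefficients.
        coefs = [minnorm_fit(Xs, ys)
                 for Xs, ys in zip(np.array_split(X, n_splits),
                                   np.array_split(y, n_splits))]
        return np.mean(coefs, axis=0)

    # Overparameterized toy problem: more features than samples,
    # so each local split is ridgeless / interpolating.
    rng = np.random.default_rng(0)
    n, d = 200, 500
    X = rng.standard_normal((n, d))
    w_true = rng.standard_normal(d) / np.sqrt(d)
    y = X @ w_true + 0.1 * rng.standard_normal(n)

    for m in (1, 2, 5, 10):
        w_hat = split_and_average(X, y, m)
        print(f"splits={m:2d}  estimation error={np.linalg.norm(w_hat - w_true):.4f}")

Varying the number of splits trades local sample size against the averaging over machines; under the assumptions above, this is one way to observe numerically the regularizing effect of data splitting that the paper analyzes.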
