Double-Descent

Implicit Regularization of Random Feature Models

We discuss how the effective ridge reveals the implicit regularization effect of sampling only finitely many random features. The derivative of the effective ridge tracks the variance of the optimal predictor over the random draw of the features, which explains the variance explosion at the interpolation threshold for arbitrary datasets.
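
To make this concrete, here is a minimal numerical sketch (illustrative only, not the paper's code: the exponential kernel, the sample and feature counts, and names such as rf_predict are my own assumptions). It takes one natural reading of a Gaussian RF model, drawing each feature as an i.i.d. centered Gaussian process with a fixed kernel, fits the features by ridge regression with a tiny explicit ridge, and measures how much the resulting predictor varies across feature draws as the number of features P sweeps through the number of training points N.

    import numpy as np

    rng = np.random.default_rng(0)
    N, lam = 40, 1e-8                       # training points, tiny explicit ridge
    x_train = np.linspace(-1.0, 1.0, N)
    x_test = np.linspace(-0.97, 0.97, 100)
    y_train = np.sin(3 * x_train) + 0.3 * rng.standard_normal(N)   # fixed, arbitrary noisy dataset

    def kernel(a, b, ell=1.0):
        # exponential kernel; any reasonably well-conditioned kernel works for the demo
        return np.exp(-np.abs(a[:, None] - b[None, :]) / ell)

    x_all = np.concatenate([x_train, x_test])
    L = np.linalg.cholesky(kernel(x_all, x_all) + 1e-8 * np.eye(len(x_all)))

    def rf_predict(P):
        # one draw of a Gaussian RF model: P i.i.d. GP(0, K) features, ridge-fitted weights
        feats = L @ rng.standard_normal((len(x_all), P)) / np.sqrt(P)
        F_tr, F_te = feats[:N], feats[N:]
        theta = np.linalg.solve(F_tr.T @ F_tr + lam * np.eye(P), F_tr.T @ y_train)
        return F_te @ theta

    for P in [5, 10, 20, 35, 40, 45, 60, 100, 400]:
        preds = np.stack([rf_predict(P) for _ in range(50)])   # 50 independent feature draws
        var = preds.var(axis=0).mean()                         # variance over the feature randomness
        print(f"P = {P:3d}   P/N = {P / N:5.2f}   variance across draws ~ {var:.3g}")
    # With lam this small the variance is largest around P = N, i.e. at the
    # interpolation threshold of the double-descent curve.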

Random Feature (RF) models are used as efficient parametric approximations of kernel methods. We investigate, by means of random matrix theory, the connection between Gaussian RF models and Kernel Ridge Regression (KRR). For a Gaussian RF model with …
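
As a rough illustration of that connection, the sketch below (again illustrative, not the paper's code; the kernel, the sizes, and the helpers rf_predictor and krr are assumptions) averages many Gaussian RF predictors trained on the same data with the same explicit ridge lam, then scans a grid of ridges for the KRR predictor closest to this average. The best-matching ridge plays the role of the effective ridge and tends to come out larger than lam, i.e. sampling finitely many features behaves like additional ridge regularization.

    import numpy as np

    rng = np.random.default_rng(1)
    N, P, lam = 30, 40, 5e-2                # training points, features, explicit ridge
    x_train = np.sort(rng.uniform(-1.0, 1.0, N))
    x_test = np.linspace(-0.97, 0.97, 100)
    y_train = np.sin(3 * x_train)

    def kernel(a, b, ell=1.0):
        # exponential kernel (an arbitrary choice for the demo)
        return np.exp(-np.abs(a[:, None] - b[None, :]) / ell)

    x_all = np.concatenate([x_train, x_test])
    L = np.linalg.cholesky(kernel(x_all, x_all) + 1e-8 * np.eye(len(x_all)))

    def rf_predictor():
        # one draw of a Gaussian RF model: P i.i.d. GP(0, K) features, ridge-fitted weights
        feats = L @ rng.standard_normal((len(x_all), P)) / np.sqrt(P)
        F_tr, F_te = feats[:N], feats[N:]
        theta = np.linalg.solve(F_tr.T @ F_tr + lam * np.eye(P), F_tr.T @ y_train)
        return F_te @ theta

    avg_rf = np.mean([rf_predictor() for _ in range(4000)], axis=0)   # average over feature draws

    def krr(ridge):
        # kernel ridge regression with the same kernel and the given ridge
        alpha = np.linalg.solve(kernel(x_train, x_train) + ridge * np.eye(N), y_train)
        return kernel(x_test, x_train) @ alpha

    # the ridge whose KRR predictor best matches the averaged RF predictor
    ridges = np.geomspace(lam / 2, 10 * lam, 60)
    mse = [np.mean((krr(r) - avg_rf) ** 2) for r in ridges]
    print(f"explicit ridge {lam:.3g}   best-matching ridge ~ {ridges[int(np.argmin(mse))]:.3g}")
    # The best match typically sits at a ridge larger than lam: finitely many
    # features act like extra, implicit ridge regularization.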