A Comprehensive Analysis of the Learning Curve in Kernel Ridge Regression
Abstract
This paper conducts a comprehensive study of the learning curves of kernel
ridge regression (KRR) under minimal assumptions. Our contributions are
threefold: 1) we analyze the role of key properties of the kernel, such as its
spectral eigen-decay, the characteristics of the eigenfunctions, and the
smoothness of the kernel; 2) we demonstrate the validity of the Gaussian
Equivalent Property (GEP), which states that the generalization performance of
KRR remains the same when the whitened features are replaced by standard
Gaussian vectors (a numerical sketch follows the abstract), thereby shedding
light on the success of previous analyses under the Gaussian Design
Assumption; 3) we derive novel bounds that improve on existing results across
a broad range of settings, such as (in)dependent
feature vectors and various combinations of eigen-decay rates in the
over/underparameterized regimes.
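
To make the GEP statement concrete, the following is a minimal numerical
sketch, not the paper's experimental setup: it assumes a finite-dimensional
linear-kernel feature model with polynomial eigen-decay mu_k = k^{-2}, uses
Rademacher coordinates as a stand-in for non-Gaussian whitened features, and
picks illustrative values for the sample sizes and ridge parameter. It fits
KRR once on the non-Gaussian features and once on standard Gaussian features
with the same spectrum; under the GEP, the two test errors should be close.

import numpy as np

rng = np.random.default_rng(0)

def krr_test_error(X_train, y_train, X_test, y_test, lam):
    # Linear-kernel ridge regression: solve (K + lam*n*I) alpha = y_train,
    # then predict on the test features and return mean-squared error.
    n = X_train.shape[0]
    K = X_train @ X_train.T
    alpha = np.linalg.solve(K + lam * n * np.eye(n), y_train)
    preds = X_test @ (X_train.T @ alpha)
    return np.mean((preds - y_test) ** 2)

# Illustrative (hypothetical) problem sizes and polynomial eigen-decay.
n, n_test, p, lam = 200, 2000, 400, 1e-3
mu = np.arange(1, p + 1) ** -2.0           # kernel eigenvalues mu_k = k^{-2}
theta = rng.normal(size=p)                 # target coefficients

def sample_features(m, gaussian):
    # Whitened coordinates have zero mean and unit variance; the GEP swaps
    # a non-Gaussian choice (here Rademacher +/-1) for standard Gaussians.
    if gaussian:
        Z = rng.normal(size=(m, p))
    else:
        Z = rng.choice([-1.0, 1.0], size=(m, p))
    return Z * np.sqrt(mu)                 # scale by sqrt of eigenvalues

for name, g in [("Rademacher features", False), ("Gaussian features", True)]:
    X_tr, X_te = sample_features(n, g), sample_features(n_test, g)
    y_tr = X_tr @ theta + 0.1 * rng.normal(size=n)    # noisy training labels
    y_te = X_te @ theta                               # noiseless test labels
    print(name, krr_test_error(X_tr, y_tr, X_te, y_te, lam))

Increasing n and p should shrink the gap between the two printed errors,
consistent with the asymptotic nature of the equivalence.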