A Theoretical Analysis of the Test Error of Finite-Rank Kernel Ridge Regression

Published in NeurIPS, 2023

Recommended citation: Tin Sum Cheng, Aurelien Lucchi, Ivan Dokmanic, Anastasis Kratsios, and David Belius. A Theoretical Analysis of the Test Error of Finite-Rank Kernel Ridge Regression. Advances in Neural Information Processing Systems, 36, 2023. http://tscheng516.github.io/personal_page/files/paper_neurips2023.pdf

Abstract

Existing statistical learning guarantees for general kernel regressors often yield loose bounds when applied to finite-rank kernels. Yet, finite-rank kernels arise naturally in several machine learning problems, e.g. when fine-tuning a pre-trained deep neural network's last layer to adapt it to a novel task in transfer learning. We address this gap by deriving sharp non-asymptotic upper and lower bounds on the test error of any finite-rank kernel ridge regressor (KRR). Our bounds are tighter than previously derived bounds on finite-rank KRR and, unlike comparable results, they remain valid for any choice of the regularization parameter.
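For concreteness, here is a minimal sketch (illustrative, not the paper's code) of the finite-rank KRR setting described above: the kernel is induced by a rank-M feature map, as when only a pre-trained network's last layer is retrained. The feature map `phi`, the regularization parameter `lam`, and the toy data are all assumed placeholders.

```python
# Minimal sketch of finite-rank KRR: the kernel k(x, x') = phi(x)^T phi(x')
# is induced by a rank-M feature map phi (e.g. frozen last-layer features
# of a pre-trained network). Names and values here are illustrative only.
import numpy as np

def finite_rank_krr(X_train, y_train, X_test, phi, lam):
    """Fit KRR with the finite-rank Gram matrix and predict on X_test."""
    Phi_train = phi(X_train)                      # shape (n, M), rank <= M
    K = Phi_train @ Phi_train.T                   # finite-rank Gram matrix
    n = K.shape[0]
    alpha = np.linalg.solve(K + lam * np.eye(n), y_train)  # (K + lam*I)^{-1} y
    K_test = phi(X_test) @ Phi_train.T            # test/train cross-kernel
    return K_test @ alpha

# Toy usage: random tanh features of rank M = 5 play the role of phi.
rng = np.random.default_rng(0)
W = rng.normal(size=(3, 5))
phi = lambda X: np.tanh(X @ W)

X_tr, X_te = rng.normal(size=(50, 3)), rng.normal(size=(20, 3))
y_tr = np.sin(X_tr[:, 0])
y_hat = finite_rank_krr(X_tr, y_tr, X_te, phi, lam=1e-2)
test_error = np.mean((y_hat - np.sin(X_te[:, 0])) ** 2)  # empirical test error
```

The paper's bounds concern exactly this quantity: the test error of the ridge solution as a function of the sample size, the kernel's (finite) spectrum, and the regularization `lam`.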

Download paper here