Local convergence rates of the nonparametric least squares estimator with applications to transfer learning

Johannes Schmidt-Hieber, Petr Zamolodtchikov

Research output: Contribution to journal › Article › Academic › peer-review


Abstract

Convergence properties of empirical risk minimizers can be conveniently expressed in terms of the associated population risk. To derive bounds for the performance of the estimator under covariate shift, however, pointwise convergence rates are required. Under weak assumptions on the design distribution, it is shown that least squares estimators (LSE) over 1-Lipschitz functions are also minimax rate optimal with respect to a weighted uniform norm, where the weighting accounts in a natural way for the non-uniformity of the design distribution. This implies that although least squares is a global criterion, the LSE adapts locally to the size of the design density. We develop a new indirect proof technique that establishes the local convergence behavior based on a carefully chosen local perturbation of the LSE. The obtained local rates are then applied to analyze the LSE for transfer learning under covariate shift.
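For orientation, a minimal sketch of the setting described in the abstract, under the standard nonparametric regression model with random design (the paper's exact conditions, and the precise form of the weight function, are given in the article): with observations (X_1, Y_1), ..., (X_n, Y_n) from
\[
Y_i = f_0(X_i) + \varepsilon_i, \qquad i = 1, \dots, n,
\]
the least squares estimator over 1-Lipschitz functions is
\[
\widehat{f}_n \in \operatorname*{arg\,min}_{f:\, \|f\|_{\mathrm{Lip}} \le 1} \; \sum_{i=1}^{n} \bigl(Y_i - f(X_i)\bigr)^2,
\]
and the local guarantees concern a weighted uniform loss of the form \(\sup_{x} w(x)\,\bigl|\widehat{f}_n(x) - f_0(x)\bigr|\), where the weight \(w\) reflects the size of the design density near \(x\).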
Original language: English
Pages (from-to): 1845-1877
Number of pages: 33
Journal: Bernoulli
Volume: 30
Issue number: 3
DOIs
Publication status: Published - Aug 2024
