On lower bounds for the bias-variance trade-off

Research output: Contribution to journal › Article › Academic › peer-review

2 Citations (Scopus)
108 Downloads (Pure)

Abstract

It is a common phenomenon that for high-dimensional and nonparametric
statistical models, rate-optimal estimators balance squared bias and variance.
Although this balancing is widely observed, little is known about whether
methods exist that can avoid the trade-off between bias and variance. We
propose a general strategy to obtain lower bounds on the variance of any estimator
with bias smaller than a prespecified bound. This shows to what
extent the bias-variance trade-off is unavoidable and allows us to quantify the
loss of performance for methods that do not obey it. The approach is based
on a number of abstract lower bounds for the variance involving the change
of expectation with respect to different probability measures as well as information
measures such as the Kullback–Leibler or χ2-divergence. Some of
these inequalities rely on a new concept of information matrices. In a second
part of the article, the abstract lower bounds are applied to several statistical
models including the Gaussian white noise model, a boundary estimation
problem, the Gaussian sequence model and the high-dimensional linear regression
model. For these specific statistical applications, different types of
bias-variance trade-offs occur that vary considerably in their strength. For
the trade-off between integrated squared bias and integrated variance in the
Gaussian white noise model, we propose to combine the general strategy for
lower bounds with a reduction technique. This allows us to reduce the original
problem to a lower bound on the bias-variance trade-off for estimators with
additional symmetry properties in a simpler statistical model. In the Gaussian
sequence model, different phase transitions of the bias-variance trade-off occur.
Although there is a non-trivial interplay between bias and variance, the
rates of the squared bias and of the variance need not be balanced in order
to achieve the minimax estimation rate.
Original language: English
Pages (from-to): 1510–1533
Number of pages: 24
Journal: Annals of the Institute of Statistical Mathematics
Volume: 51
Issue number: 4
Publication status: Published - Aug 2023

Keywords

  • Cramér–Rao inequality
  • high-dimensional statistics
  • minimax estimation
  • nonparametric estimation
  • Bias-variance decomposition
