On the inability of Gaussian process regression to optimally learn compositional functions

Matteo Giordano, Kolyan Ray, Johannes Schmidt-Hieber

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review


Abstract

We rigorously prove that deep Gaussian process priors can outperform Gaussian process priors if the target function has a compositional structure. To this end, we study information-theoretic lower bounds for posterior contraction rates for Gaussian process regression in a continuous regression model. We show that if the true function is a generalized additive function, then the posterior based on any mean-zero Gaussian process can only recover the truth at a rate that is strictly slower than the minimax rate by a factor that is polynomially suboptimal in the sample size n.
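For orientation, the following schematic sketches the shape of the result. The white-noise formulation of the continuous regression model and the specific exponents below are illustrative assumptions on our part, not statements quoted from the paper: a generalized additive truth with β-smooth components admits a dimension-free, one-dimensional minimax rate, while the lower bound says every mean-zero Gaussian process posterior misses that rate by a polynomial factor.

% Schematic only; notation and exponents are illustrative assumptions.
\[
  dY(t) = f_0(t)\,dt + \frac{1}{\sqrt{n}}\,dW(t), \qquad t \in [0,1]^d,
\]
\[
  f_0(x) = h\Big(\sum_{j=1}^{d} g_j(x_j)\Big)
  \quad \text{(generalized additive, with $\beta$-smooth components)},
\]
\[
  \text{minimax rate} \asymp n^{-\beta/(2\beta+1)},
  \qquad
  \text{any mean-zero GP posterior:}\ \ \text{rate} \gtrsim n^{-\beta/(2\beta+1)+\rho}
  \ \text{for some fixed } \rho > 0.
\]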
Original language: English
Title of host publication: NIPS'22
Subtitle of host publication: Proceedings of the 36th International Conference on Neural Information Processing Systems
Editors: S. Koyejo
Place of publication: Red Hook, NY
Publisher: Curran Associates Inc.
Pages: 22341-22353
ISBN (Print): 978-1-7138-7108-8
DOIs
Publication status: Published - 3 Apr 2024
Event: 36th International Conference on Neural Information Processing Systems, NIPS 2022 - New Orleans, United States
Duration: 2 Nov 2022 – 9 Dec 2022
Conference number: 36

Conference

Conference: 36th International Conference on Neural Information Processing Systems, NIPS 2022
Abbreviated title: NIPS 2022
Country/Territory: United States
City: New Orleans
Period: 2/11/22 – 9/12/22
