Abstract
We study posterior contraction rates for a class of deep Gaussian process priors in the nonparametric regression setting, under a general composition assumption on the regression function. It is shown that the posterior contracts at the minimax convergence rate (up to log n factors), while adapting to the underlying compositional structure and smoothness of the target function. The proposed framework extends the Bayesian nonparametric theory for Gaussian process priors.
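For context, a standard way to formalize such a composition assumption in the related literature (the notation below is an assumption for illustration, not quoted from this abstract) is to write the regression function as a chain of component maps, each with its own smoothness and effective input dimension, and to express the minimax rate in terms of these quantities:

```latex
% Sketch of a typical composition assumption (notation assumed):
% the regression function is a composition
%     f = g_q \circ g_{q-1} \circ \cdots \circ g_0,
% where each coordinate of g_i is beta_i-Hoelder smooth and depends
% on at most t_i of its arguments. The minimax estimation rate over
% such a class (up to log n factors) is then commonly of the form
%     max_i n^{-beta_i^* / (2 beta_i^* + t_i)},
% with an "effective smoothness" beta_i^* that accounts for the
% layers composed on top of g_i.
\[
  f = g_q \circ g_{q-1} \circ \cdots \circ g_0,
  \qquad
  \varepsilon_n \asymp \max_{0 \le i \le q}
    n^{-\beta_i^\ast/(2\beta_i^\ast + t_i)},
  \qquad
  \beta_i^\ast = \beta_i \prod_{\ell=i+1}^{q} (\beta_\ell \wedge 1).
\]
```

Adaptivity here means the prior is not told the depth q, the smoothness indices beta_i, or the effective dimensions t_i, yet the posterior still contracts at this rate.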
| Original language | English |
|---|---|
| Pages (from-to) | 1-49 |
| Number of pages | 49 |
| Journal | Journal of Machine Learning Research |
| Volume | 24 |
| Issue number | 66 |
| Publication status | Published - Feb 2023 |
Keywords
- Bayesian nonparametric regression
- contraction rates
- deep Gaussian processes
- uncertainty quantification
- neural networks