Duality for Neural Networks through Reproducing Kernel Banach Spaces

Len Spek*, Tjeerd Jan Heeringa, Felix Leopold Schwenninger, Christoph Brune

*Corresponding author for this work

Research output: Working paper › Preprint › Academic


Abstract

Reproducing Kernel Hilbert spaces (RKHS) have been a very successful tool in various areas of machine learning. Recently, Barron spaces have been used to prove bounds on the generalisation error for neural networks. Unfortunately, Barron spaces cannot be understood in terms of RKHS due to the strong nonlinear coupling of the weights. We show that this can be solved by using the more general Reproducing Kernel Banach spaces (RKBS). The resulting class of integral RKBS can be understood as an infinite union of RKHS. As an RKBS is not a Hilbert space, it is not its own dual space. However, we show that its dual space is again an RKBS in which the roles of the data and the parameters are interchanged, so that the two spaces form an adjoint pair of RKBS with a reproducing property in the dual space. This allows us to construct the saddle point problem for neural networks, which can be used in the whole field of primal-dual optimisation.
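The integral RKBS mentioned in the abstract can be sketched as follows (standard notation from the Barron-space literature; the symbols and the specific form are illustrative, not quoted from the paper itself):

```latex
% Sketch, assuming the common integral representation of Barron-type
% spaces: a function is represented by a signed measure \mu over the
% parameter space \Omega of a single neuron,
f_\mu(x) = \int_{\Omega} \sigma\bigl(\langle w, x \rangle + b\bigr)\, \mathrm{d}\mu(w, b),
% and the norm is the smallest total variation over all measures
% representing the same function,
\| f \|_{\mathcal{B}} = \inf \bigl\{ \|\mu\|_{\mathrm{TV}} \; : \; f = f_\mu \bigr\}.
```

Fixing a single probability measure on the parameters recovers an RKHS; letting the measure vary is what yields the "infinite union of RKHS" description, and interchanging the roles of the data point $x$ and the parameters $(w, b)$ gives the dual construction.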
Original language: English
Pages: 1-19
Number of pages: 19
DOIs
Publication status: Published - 9 Nov 2022

Keywords

  • math.FA
  • cs.LG
