BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//CERN//INDICO//EN
BEGIN:VEVENT
SUMMARY:PAC-Bayes training for neural networks: sparsity and uncertainty q
uantification
DTSTART;VALUE=DATE-TIME:20220803T150000Z
DTEND;VALUE=DATE-TIME:20220803T152000Z
DTSTAMP;VALUE=DATE-TIME:20231210T042933Z
UID:indico-contribution-1238@conference2.aau.at
DESCRIPTION:Increasing computational power and storage capacity have mad
 e high-dimensional datasets accessible to many areas of research\, suc
 h as medicine and the natural and social sciences. While classical sta
 tistical methods often break down for high-dimensional data\, in parti
 cular due to the curse of dimensionality\, machine learning methods ha
 ve been applied successfully to regression problems in practice. On th
 e theoretical level\, a popular way to circumvent the curse of dimensi
 onality is the concept of sparsity. We study the Gibbs posterior distr
 ibution from PAC-Bayes theory for sparse deep neural networks in a non
 parametric regression setting. To sample from the posterior distributi
 on\, we construct an efficient MCMC algorithm based on backpropagation
 . The training yields a Bayesian neural network with a joint distribut
 ion on the network parameters. Using a mixture of uniform priors on sp
 arse sets of network weights\, we prove an oracle inequality showing t
 hat the method adapts to the unknown regularity and hierarchical struc
 ture of the regression function. Studying the Gibbs posterior distribu
 tion from a frequentist Bayesian perspective\, we analyze the diamete
 r of the resulting credible sets and show that they have high coverag
 e probability. The method is illustrated with an animation in a simul
 ation example.\n\nThis talk is based on joint work with Mathias Trabs.
 \n\nhttps://conference2.aau.at/event/131/contributions/1238/
LOCATION:Universität Klagenfurt HS 4
URL:https://conference2.aau.at/event/131/contributions/1238/
END:VEVENT
END:VCALENDAR