22-24 September 2021
Alpen-Adria-Universität Klagenfurt
Europe/Vienna timezone

A Bregman Learning Framework for Sparse Neural Networks

22 Sep 2021, 15:10
HS 1 (Alpen-Adria-Universität Klagenfurt)


Tim Roith (Friedrich-Alexander-Universität Erlangen-Nürnberg)


I will present a novel learning framework based on stochastic Bregman iterations. It makes it possible to train sparse neural networks with an inverse scale space approach, starting from a very sparse network and gradually adding significant parameters. Apart from a baseline algorithm called LinBreg, I will also speak about an accelerated version using momentum, and about AdaBreg, a Bregmanized generalization of the Adam algorithm. I will present a statistically sound sparse parameter initialization strategy, a stochastic convergence analysis of the loss decay, and additional convergence proofs in the convex regime. The Bregman learning framework can also be applied to Neural Architecture Search and can, for instance, uncover autoencoder architectures for denoising or deblurring tasks.
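The inverse scale space idea can be illustrated with a linearized Bregman iteration on a toy sparse least-squares problem: a subgradient variable accumulates gradient steps, and a parameter re-enters the model only once its subgradient exceeds a threshold, so training starts from an all-zero (fully sparse) iterate and gradually activates parameters. This is a minimal sketch under my own assumptions (toy data, step size, and regularization strength chosen by me), not the implementation presented in the talk:

```python
import numpy as np

rng = np.random.default_rng(0)

def shrink(v, lam):
    # soft-thresholding: proximal map of lam * ||.||_1
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

# toy sparse regression: recover a 5-sparse theta_true from y = A @ theta_true
n, d = 50, 100
A = rng.standard_normal((n, d))
theta_true = np.zeros(d)
theta_true[:5] = 1.0
y = A @ theta_true

lam = 0.5                                 # sparsity threshold (assumed value)
tau = 1.0 / np.linalg.norm(A, 2) ** 2     # step size below 1/L, L = ||A||^2

v = np.zeros(d)                           # subgradient variable; theta starts fully sparse
theta = shrink(v, lam)

for _ in range(2000):
    grad = A.T @ (A @ theta - y)          # gradient of 0.5 * ||A theta - y||^2
    v = v - tau * grad                    # linearized Bregman step on the subgradient
    theta = shrink(v, lam)                # parameter i is active only once |v_i| > lam
```

In a neural-network setting the least-squares gradient would be replaced by a stochastic gradient of the loss, which is the stochastic variant the talk refers to; the momentum and Adam-style (AdaBreg) versions modify how `v` is updated.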

Primary author

Tim Roith (Friedrich-Alexander-Universität Erlangen-Nürnberg)


Co-authors

Leon Bungert (University of Bonn)
Prof. Martin Burger (Friedrich-Alexander-Universität Erlangen-Nürnberg)
Dr Daniel Tenbrinck (Friedrich-Alexander-Universität Erlangen-Nürnberg)
