Publication:
Variational closed-form deep neural net inference

Type

article

Access

restrictedAccess

Publication Status

Published

Abstract

We introduce a Bayesian construction for deep neural networks that is amenable to mean-field variational inference operating solely by closed-form update rules. Hence, it does not require any learning rate to be tuned manually. We show that, by virtue of this property, our model performs effective deep learning in three setups where conventional neural nets are known to perform suboptimally: i) online learning, ii) learning from small data, and iii) active learning. We compare our approach to earlier Bayesian neural network inference techniques ranging from expectation propagation to gradient-based variational Bayes, as well as to deterministic neural nets with various activation functions. We observe that our approach improves on all of these alternatives on two mainstream vision benchmarks and two medical data sets: diabetic retinopathy screening and exudate detection from eye fundus images.
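
To give a rough sense of what a closed-form (learning-rate-free) variational update looks like, the sketch below performs the conjugate posterior update for a single Bayesian linear layer with an isotropic Gaussian prior and known noise precision. This is a generic textbook illustration under assumed settings (the function name and the fixed precisions alpha and beta are hypothetical), not the deep-network construction or the update rules proposed in the paper.

    import numpy as np

    # Illustrative only: posterior for a Bayesian linear layer with prior
    # w ~ N(0, alpha^{-1} I) and likelihood y ~ N(X w, beta^{-1} I).
    # alpha (prior precision) and beta (noise precision) are assumed known.
    def closed_form_posterior(X, y, alpha=1.0, beta=25.0):
        d = X.shape[1]
        # Posterior covariance and mean follow in closed form; no gradient
        # steps and hence no learning rate are involved.
        S = np.linalg.inv(alpha * np.eye(d) + beta * X.T @ X)
        m = beta * S @ (X.T @ y)
        return m, S

In this conjugate case the mean-field fixed point coincides with the exact posterior; the paper's stated contribution is a construction that keeps such closed-form updates available throughout a deep network.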

Date

2018-09

Publisher

Elsevier
