Author: Kandemir, Melih
Date accessioned: 2019-04-02
Date available: 2019-04-02
Date issued: 2018-09
ISSN: 0167-8655
Handle: http://hdl.handle.net/10679/6250
DOI: https://doi.org/10.1016/j.patrec.2018.07.001
Abstract: We introduce a Bayesian construction for deep neural networks that is amenable to mean-field variational inference operating solely by closed-form update rules; hence, no learning rate needs to be manually tuned. We show that, by virtue of this property, our model can perform effective deep learning in three settings where conventional neural nets are known to perform suboptimally: i) online learning, ii) learning from small data, and iii) active learning. We compare our approach to earlier Bayesian neural network inference techniques, spanning expectation propagation to gradient-based variational Bayes, as well as to deterministic neural nets with various activation functions. Our approach improves on all these alternatives in two mainstream vision benchmarks and two medical data sets: diabetic retinopathy screening and exudate detection from eye fundus images.
Language: eng
Access: restrictedAccess
Title: Variational closed-form deep neural net inference
Type: article
Volume: 112
Pages: 145-151
WOS ID: 000443950800021
Keywords: Bayesian neural networks; variational Bayes; online learning; active learning
Scopus ID: 2-s2.0-85049560836
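The abstract's key selling point is that every variational update is available in closed form, so no learning rate is ever tuned. As a toy illustration of that property (not the paper's deep model), the sketch below runs closed-form mean-field coordinate-ascent VI on conjugate Bayesian linear regression; the function name `cavi_linear` and the precision parameters `alpha`, `beta` are illustrative choices, not from the source.

```python
import numpy as np

def cavi_linear(X, y, alpha=1.0, beta=100.0, n_iters=200):
    """Mean-field CAVI for y = X w + noise, noise precision beta,
    prior w_j ~ N(0, 1/alpha). Every update is analytic: no learning rate."""
    n, d = X.shape
    m = np.zeros(d)                        # variational means of q(w_j)
    col_sq = (X ** 2).sum(axis=0)
    s = 1.0 / (alpha + beta * col_sq)      # variational variances, fixed in closed form
    for _ in range(n_iters):
        for j in range(d):
            # residual with coordinate j's current contribution removed
            r = y - X @ m + X[:, j] * m[j]
            m[j] = s[j] * beta * (X[:, j] @ r)   # exact coordinate update
    return m, s

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.1 * rng.normal(size=50)

m, s = cavi_linear(X, y)
# For a Gaussian posterior, the mean-field fixed point recovers the exact
# posterior mean, i.e. the ridge solution (X^T X + (alpha/beta) I)^{-1} X^T y.
ridge = np.linalg.solve(X.T @ X + 0.01 * np.eye(3), X.T @ y)
```

Because each coordinate update solves its subproblem exactly, the iteration monotonically increases the ELBO with no step-size hyperparameter, which is the behavior the abstract claims to carry over to deep networks.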