
dc.contributor.author: Haußmann, M.
dc.contributor.author: Hamprecht, F. A.
dc.contributor.author: Kandemir, Melih
dc.date.accessioned: 2020-10-22T10:38:18Z
dc.date.available: 2020-10-22T10:38:18Z
dc.date.issued: 2019
dc.identifier.uri: http://hdl.handle.net/10679/7039
dc.identifier.uri: https://auai.org/uai2019/accepted.php
dc.description.abstract: We propose a new Bayesian Neural Net formulation that affords variational inference for which the evidence lower bound is analytically tractable subject to a tight approximation. We achieve this tractability by (i) decomposing ReLU nonlinearities into the product of an identity and a Heaviside step function, and (ii) introducing a separate path that decouples the neural net expectation from its variance. We show formally that assigning separate latent binary variables to the activations allows the neural network likelihood to be represented as a chain of linear operations. Performing variational inference on this construction enables a sampling-free computation of the evidence lower bound, which is a more effective approximation than the widely applied Monte Carlo sampling and CLT-related techniques. We evaluate the model on a range of regression and classification tasks against BNN inference alternatives, showing competitive or improved performance over the current state of the art. [en_US]
dc.language.iso: eng [en_US]
dc.publisher: Association for Uncertainty in Artificial Intelligence (AUAI) [en_US]
dc.relation.ispartof: 35th Conference on Uncertainty in Artificial Intelligence, UAI 2019
dc.rights: restrictedAccess
dc.title: Sampling-free variational inference of Bayesian neural networks by variance backpropagation [en_US]
dc.type: Conference paper [en_US]
dc.publicationstatus: Published [en_US]
dc.contributor.department: Özyeğin University
dc.contributor.authorID: (ORCID 0000-0001-6293-3656 & YÖK ID 258737) Kandemir, Melih
dc.identifier.scopus: SCOPUS:2-s2.0-85084012503
dc.relation.publicationcategory: Conference Paper - International - Institutional Academic Staff
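
The abstract above describes decomposing each ReLU into the product of an identity and a Heaviside step function so that a mean and a variance can be propagated through the network in closed form. Below is a minimal illustrative sketch of that idea in NumPy/SciPy, assuming each pre-activation is approximated as an independent Gaussian and treating the weights as fixed point estimates; the closed-form moments used are standard Gaussian identities for the rectified variable, not expressions taken from the paper itself.

import numpy as np
from scipy.stats import norm

def relu_moments(mu, var):
    # Mean and variance of relu(x) = x * heaviside(x) for x ~ N(mu, var),
    # computed elementwise from standard Gaussian moment identities.
    std = np.sqrt(var)
    z = mu / std
    p = norm.cdf(z)            # E[heaviside(x)]
    phi = norm.pdf(z)
    mean = mu * p + std * phi                       # E[x * heaviside(x)]
    second = (mu ** 2 + var) * p + mu * std * phi   # E[(x * heaviside(x))^2]
    return mean, second - mean ** 2

def linear_moments(mean, var, W, b):
    # Propagate a mean/variance pair through y = W x + b, assuming the
    # coordinates of x are independent (cross-covariances are ignored).
    return W @ mean + b, (W ** 2) @ var

# Hypothetical two-layer example: the input already carries uncertainty,
# and the variance travels along its own path next to the mean.
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(4, 3)), np.zeros(4)
W2, b2 = rng.normal(size=(2, 4)), np.zeros(2)
x_mean = rng.normal(size=3)
x_var = 0.1 * np.ones(3)

m, v = linear_moments(x_mean, x_var, W1, b1)
m, v = relu_moments(m, v)
m, v = linear_moments(m, v, W2, b2)
print("output mean:", m)
print("output variance:", v)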


Files in this item

There are no files associated with this item.

This item appears in the following Collection(s)
