Publication:
Dropout regularization in hierarchical mixture of experts

Authors

Alpaydın, Ahmet İbrahim Ethem

Type

article

Access

restrictedAccess

Abstract

Dropout is a very effective method for preventing overfitting and has become the go-to regularizer for multi-layer neural networks in recent years. The hierarchical mixture of experts is a hierarchically gated model that defines a soft decision tree: leaves correspond to experts, and decision nodes correspond to gating models that softly choose between their children, so the model defines a soft hierarchical partitioning of the input space. In this work, we propose a variant of dropout for the hierarchical mixture of experts that is faithful to the tree hierarchy defined by the model, as opposed to the flat, unitwise independent application of dropout used with multi-layer perceptrons. We show that on synthetic regression data and on the MNIST, CIFAR-10, and SSTB datasets, our proposed dropout mechanism prevents overfitting on trees with many levels, improving generalization and providing smoother fits.
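To make the abstract's idea concrete, the following is a minimal NumPy sketch of a soft decision tree with a tree-structured dropout, written from the abstract's description alone (the class names, the gate parameterization, and the exact dropout rule are illustrative assumptions, not the authors' implementation). At each gating node, with some probability the soft gate is replaced by a hard 0/1 choice, so an entire subtree is dropped and all mass is routed to its sibling, respecting the hierarchy instead of dropping units independently.

```python
import numpy as np

rng = np.random.default_rng(0)

class Leaf:
    """Expert at a leaf: a simple linear model w·x + b (illustrative choice)."""
    def __init__(self, dim):
        self.w = rng.normal(size=dim)
        self.b = 0.0

    def forward(self, x, train=False):
        return self.w @ x + self.b

class Node:
    """Gating node that softly mixes its two children."""
    def __init__(self, dim, left, right, drop_p=0.2):
        self.v = rng.normal(size=dim)  # gating parameters (assumed sigmoid gate)
        self.left, self.right = left, right
        self.drop_p = drop_p

    def forward(self, x, train=False):
        g = 1.0 / (1.0 + np.exp(-self.v @ x))  # soft gate in (0, 1)
        if train and rng.random() < self.drop_p:
            # Hierarchical dropout (assumed rule): harden the gate so one
            # whole subtree is dropped and the sibling receives all mass,
            # unlike flat unitwise dropout in an MLP.
            g = 1.0 if rng.random() < 0.5 else 0.0
        return g * self.left.forward(x, train) + (1.0 - g) * self.right.forward(x, train)

# A depth-2 soft tree over 3-dimensional inputs: 3 gating nodes, 4 experts.
dim = 3
tree = Node(dim,
            Node(dim, Leaf(dim), Leaf(dim)),
            Node(dim, Leaf(dim), Leaf(dim)))

x = rng.normal(size=dim)
y_eval = tree.forward(x)              # inference: full soft mixture of experts
y_train = tree.forward(x, train=True) # training: subtrees may be dropped
```

At inference the tree always computes the full soft mixture; the stochastic hardening of gates happens only during training, which is the sense in which this dropout variant follows the tree structure rather than treating gates as independent units.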

Date

2021-01-02

Publisher

Elsevier
