Publication: Dropout regularization in hierarchical mixture of experts
dc.contributor.author | Alpaydın, Ahmet İbrahim Ethem | |
dc.contributor.department | Computer Science | |
dc.contributor.ozuauthor | ALPAYDIN, Ahmet Ibrahim Ethem | |
dc.date.accessioned | 2022-09-07T12:24:00Z | |
dc.date.available | 2022-09-07T12:24:00Z | |
dc.date.issued | 2021-01-02 | |
dc.description.abstract | Dropout is a very effective method for preventing overfitting and has become the go-to regularizer for multi-layer neural networks in recent years. A hierarchical mixture of experts is a hierarchically gated model that defines a soft decision tree where leaves correspond to experts and decision nodes correspond to gating models that softly choose between their children; as such, the model defines a soft hierarchical partitioning of the input space. In this work, we propose a variant of dropout for hierarchical mixtures of experts that is faithful to the tree hierarchy defined by the model, as opposed to the flat, unitwise independent application of dropout used with multi-layer perceptrons. We show on synthetic regression data and on the MNIST, CIFAR-10, and SSTB datasets that our proposed dropout mechanism prevents overfitting on trees with many levels, improving generalization and providing smoother fits. | en_US
dc.identifier.doi | 10.1016/j.neucom.2020.08.052 | en_US |
dc.identifier.endpage | 156 | en_US |
dc.identifier.issn | 0925-2312 | en_US |
dc.identifier.scopus | 2-s2.0-85091258086 | |
dc.identifier.startpage | 148 | en_US |
dc.identifier.uri | http://hdl.handle.net/10679/7840 | |
dc.identifier.uri | https://doi.org/10.1016/j.neucom.2020.08.052 | |
dc.identifier.volume | 419 | en_US |
dc.identifier.wos | 000590175500013 | |
dc.language.iso | eng | en_US |
dc.peerreviewed | yes | en_US |
dc.publisher | Elsevier | en_US |
dc.relation.ispartof | Neurocomputing | |
dc.relation.publicationcategory | International Refereed Journal | |
dc.rights | restrictedAccess | |
dc.subject.keywords | Dropout | en_US |
dc.subject.keywords | Hierarchical models | en_US |
dc.subject.keywords | Mixture of experts | en_US |
dc.subject.keywords | Regularization | en_US |
dc.title | Dropout regularization in hierarchical mixture of experts | en_US |
dc.type | article | en_US |
dspace.entity.type | Publication | |
relation.isOrgUnitOfPublication | 85662e71-2a61-492a-b407-df4d38ab90d7 | |
relation.isOrgUnitOfPublication.latestForDiscovery | 85662e71-2a61-492a-b407-df4d38ab90d7 |
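Illustrative sketch (not part of the record): the abstract describes a hierarchical mixture of experts as a soft decision tree whose internal nodes are gating models and whose leaves are experts. The Python/NumPy sketch below is a minimal, generic illustration of that structure with a simple leaf-level dropout rule added for the example; the class name SoftTreeHMoE, the depth-2 binary topology, and the renormalized leaf-dropout rule are assumptions made here and do not reproduce the paper's proposed mechanism.

# Minimal sketch of a depth-2 soft-tree hierarchical mixture of experts
# with a hypothetical leaf-level dropout rule (illustrative only; not the
# mechanism proposed in the paper).
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class SoftTreeHMoE:
    """Depth-2 binary soft tree: 3 gating nodes, 4 linear experts (leaves)."""
    def __init__(self, d_in, d_out, drop_p=0.2):
        # One gating vector per internal node: root, left child, right child.
        self.gates = [rng.normal(scale=0.1, size=(d_in,)) for _ in range(3)]
        # One linear expert per leaf.
        self.experts = [rng.normal(scale=0.1, size=(d_in, d_out)) for _ in range(4)]
        self.drop_p = drop_p

    def forward(self, x, train=True):
        # Gating probabilities: root chooses between subtrees, each child
        # chooses between its two leaf experts.
        g_root = sigmoid(x @ self.gates[0])
        g_left = sigmoid(x @ self.gates[1])
        g_right = sigmoid(x @ self.gates[2])

        # Path probabilities to the four leaves (soft partition of the input space).
        path = np.stack([
            g_root * g_left,
            g_root * (1.0 - g_left),
            (1.0 - g_root) * g_right,
            (1.0 - g_root) * (1.0 - g_right),
        ], axis=1)  # shape (n, 4)

        if train and self.drop_p > 0:
            # Hypothetical tree-structured dropout: drop whole leaves, keep at
            # least one alive, then renormalize the surviving path weights so
            # each sample's leaf weights still sum to one.
            keep = rng.random(4) >= self.drop_p
            if not keep.any():
                keep[rng.integers(4)] = True
            path = path * keep
            path = path / np.clip(path.sum(axis=1, keepdims=True), 1e-12, None)

        leaf_out = np.stack([x @ W for W in self.experts], axis=1)  # (n, 4, d_out)
        return np.einsum('nl,nlo->no', path, leaf_out)              # weighted mixture

# Usage on toy regression data: dropout is applied only in training mode.
x = rng.normal(size=(8, 5))
model = SoftTreeHMoE(d_in=5, d_out=1)
y_train = model.forward(x, train=True)   # stochastic, leaf dropout active
y_eval = model.forward(x, train=False)   # deterministic soft mixture
print(y_train.shape, y_eval.shape)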
Files
License bundle
1 - 1 of 1
- Name: license.txt
- Size: 1.45 KB
- Format:
- Description: Item-specific license agreed upon to submission