Publication: Hierarchical mixtures of generators for adversarial learning
dc.contributor.author | Ahmetoğlu, A. | |
dc.contributor.author | Alpaydın, Ahmet İbrahim Ethem | |
dc.contributor.department | Computer Science | |
dc.contributor.ozuauthor | ALPAYDIN, Ahmet Ibrahim Ethem | |
dc.date.accessioned | 2022-11-16T09:54:50Z | |
dc.date.available | 2022-11-16T09:54:50Z | |
dc.date.issued | 2019 | |
dc.description.abstract | Generative adversarial networks (GANs) are deep neural networks that allow us to sample from an arbitrary probability distribution without explicitly estimating the distribution. There is a generator that takes a latent vector as input and transforms it into a valid sample from the distribution. There is also a discriminator that is trained to discriminate such fake samples from true samples of the distribution; at the same time, the generator is trained to generate fakes that the discriminator cannot tell apart from the true samples. Instead of learning a global generator, a recent approach involves training multiple generators, each responsible for one part of the distribution. In this work, we review such approaches and propose the hierarchical mixture of generators, inspired by the hierarchical mixture of experts model, which learns a tree structure implementing a hierarchical clustering with soft splits in the decision nodes and local generators in the leaves. Since the generators are combined softly, the whole model is continuous and can be trained using gradient-based optimization, just like the original GAN model. Our experiments on five image data sets, namely, MNIST, FashionMNIST, UTZap50K, Oxford Flowers, and CelebA, show that our proposed model generates samples of high quality and diversity in terms of popular GAN evaluation metrics. The learned hierarchical structure also leads to knowledge extraction. | en_US |
dc.description.sponsorship | Bogazici University ; TÜBİTAK ULAKBİM | |
dc.identifier.doi | 10.1109/ICPR48806.2021.9413249 | |
dc.identifier.scopus | 2-s2.0-85110411379 | |
dc.identifier.uri | http://hdl.handle.net/10679/7964 | |
dc.identifier.uri | https://doi.org/10.1109/ICPR48806.2021.9413249 | |
dc.identifier.wos | 000678409200043 | |
dc.language.iso | eng | en_US |
dc.publicationstatus | Published | en_US |
dc.publisher | IEEE | en_US |
dc.relation.ispartof | 2020 25th International Conference on Pattern Recognition (ICPR) | |
dc.relation.publicationcategory | International | |
dc.rights | restrictedAccess | |
dc.title | Hierarchical mixtures of generators for adversarial learning | en_US |
dc.type | conferenceObject | en_US |
dspace.entity.type | Publication | |
relation.isOrgUnitOfPublication | 85662e71-2a61-492a-b407-df4d38ab90d7 | |
relation.isOrgUnitOfPublication.latestForDiscovery | 85662e71-2a61-492a-b407-df4d38ab90d7 |
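The abstract describes a tree whose decision nodes apply soft splits and whose leaves hold local generators, with leaf outputs blended by the product of gate probabilities along each root-to-leaf path so the whole model stays differentiable. A minimal NumPy sketch of that forward pass (not the authors' implementation; the `HMoG` class, its heap-style node indexing, and the toy linear "generators" are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class HMoG:
    """Toy hierarchical mixture of generators (forward pass only, no GAN training):
    a depth-d binary tree with a sigmoid gate at every internal node and a small
    linear 'generator' at every leaf. Each leaf's output is weighted by the
    product of gate probabilities along its root-to-leaf path, so the blend is
    continuous in all parameters and amenable to gradient-based optimization."""

    def __init__(self, latent_dim, out_dim, depth):
        self.depth = depth
        n_leaves = 2 ** depth
        # one linear gating scorer per internal node (implicit heap layout:
        # node 0 is the root, children of node i are 2i+1 and 2i+2)
        self.gate_w = [rng.standard_normal(latent_dim) for _ in range(n_leaves - 1)]
        # one local generator per leaf (linear map here, stand-in for a network)
        self.leaf_w = [rng.standard_normal((latent_dim, out_dim)) for _ in range(n_leaves)]

    def leaf_probs(self, z):
        """Path probability of reaching each leaf, for a latent batch z of shape
        (B, latent_dim). Rows sum to 1 because sibling gates are complementary."""
        n_leaves = 2 ** self.depth
        probs = np.empty((z.shape[0], n_leaves))
        for leaf in range(n_leaves):
            p = np.ones(z.shape[0])
            node = 0
            for level in range(self.depth):
                g = sigmoid(z @ self.gate_w[node])            # P(go right | z)
                bit = (leaf >> (self.depth - 1 - level)) & 1  # path bit at this level
                p *= g if bit else 1.0 - g
                node = 2 * node + 1 + bit
            probs[:, leaf] = p
        return probs

    def sample(self, z):
        probs = self.leaf_probs(z)                              # (B, n_leaves)
        outs = np.stack([z @ w for w in self.leaf_w], axis=1)   # (B, n_leaves, out_dim)
        return (probs[:, :, None] * outs).sum(axis=1)           # soft blend of leaves

# Usage: sample a batch of 5 latents and pass them down a depth-2 tree (4 leaves).
model = HMoG(latent_dim=8, out_dim=4, depth=2)
z = rng.standard_normal((5, 8))
x = model.sample(z)   # shape (5, 4)
```

Because the gate products over any subtree sum to one, the model can also be read as a hierarchical soft clustering of the latent space, which is what the abstract refers to as knowledge extraction from the learned tree.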
Files
License bundle
- Name:
- license.txt
- Size:
- 1.45 KB
- Format:
- Item-specific license agreed to upon submission
- Description: