Publication:
Hierarchical mixtures of generators for adversarial learning

dc.contributor.author: Ahmetoğlu, A.
dc.contributor.author: Alpaydın, Ahmet İbrahim Ethem
dc.contributor.department: Computer Science
dc.contributor.ozuauthor: ALPAYDIN, Ahmet Ibrahim Ethem
dc.date.accessioned: 2022-11-16T09:54:50Z
dc.date.available: 2022-11-16T09:54:50Z
dc.date.issued: 2019
dc.description.abstract [en_US]: Generative adversarial networks (GANs) are deep neural networks that allow us to sample from an arbitrary probability distribution without explicitly estimating the distribution. A generator takes a latent vector as input and transforms it into a valid sample from the distribution. A discriminator is trained to distinguish such fake samples from true samples of the distribution; at the same time, the generator is trained to generate fakes that the discriminator cannot tell apart from the true samples. Instead of learning a single global generator, a recent line of work trains multiple generators, each responsible for one part of the distribution. In this work, we review such approaches and propose the hierarchical mixture of generators, inspired by the hierarchical mixture of experts model, which learns a tree structure implementing hierarchical clustering with soft splits in the decision nodes and local generators in the leaves. Because the generators are combined softly, the whole model is continuous and can be trained with gradient-based optimization, just like the original GAN model. Our experiments on five image data sets, namely MNIST, FashionMNIST, UTZap50K, Oxford Flowers, and CelebA, show that the proposed model generates samples of high quality and diversity in terms of popular GAN evaluation metrics. The learned hierarchical structure also allows for knowledge extraction.
dc.description.sponsorship: Bogazici University; TÜBİTAK ULAKBİM
dc.identifier.doi: 10.1109/ICPR48806.2021.9413249
dc.identifier.scopus: 2-s2.0-85110411379
dc.identifier.uri: http://hdl.handle.net/10679/7964
dc.identifier.uri: https://doi.org/10.1109/ICPR48806.2021.9413249
dc.identifier.wos: 000678409200043
dc.language.iso [en_US]: eng
dc.publicationstatus [en_US]: Published
dc.publisher [en_US]: IEEE
dc.relation.ispartof: 2020 25th International Conference on Pattern Recognition (ICPR)
dc.relation.publicationcategory: International
dc.rights: restrictedAccess
dc.title [en_US]: Hierarchical mixtures of generators for adversarial learning
dc.type [en_US]: conferenceObject
dspace.entity.type: Publication
relation.isOrgUnitOfPublication: 85662e71-2a61-492a-b407-df4d38ab90d7
relation.isOrgUnitOfPublication.latestForDiscovery: 85662e71-2a61-492a-b407-df4d38ab90d7
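
The abstract describes a tree of generators with soft splits at the decision nodes and local generators at the leaves, blended so that the whole model stays differentiable. Below is a minimal PyTorch sketch of that idea, not the authors' implementation: the class name, layer sizes, the single-linear-sigmoid gate per node, and the fully connected leaf generators are assumptions made for illustration only.

import torch
import torch.nn as nn

class HierarchicalMixtureOfGenerators(nn.Module):
    """Illustrative sketch (names and sizes are assumptions): a soft binary
    tree with a sigmoid gate at each internal node and a small generator at
    each leaf. Leaf outputs are mixed with the product of gate probabilities
    along each root-to-leaf path, so the model is trainable end to end."""

    def __init__(self, latent_dim=100, out_dim=784, depth=2, hidden=256):
        super().__init__()
        self.depth = depth
        n_leaves = 2 ** depth
        n_internal = n_leaves - 1
        # One linear gate per internal node: g(z) = sigmoid(w^T z + b)
        self.gates = nn.ModuleList(
            nn.Linear(latent_dim, 1) for _ in range(n_internal))
        # One local generator per leaf (fully connected here for brevity)
        self.generators = nn.ModuleList(
            nn.Sequential(
                nn.Linear(latent_dim, hidden), nn.ReLU(),
                nn.Linear(hidden, out_dim), nn.Tanh())
            for _ in range(n_leaves))

    def leaf_weights(self, z):
        # Start with all probability mass at the root, then split it at each
        # level with the node's gate (left child gets g, right gets 1 - g).
        weights = [torch.ones(z.size(0), 1, device=z.device)]
        node = 0
        for _ in range(self.depth):
            next_weights = []
            for w in weights:
                g = torch.sigmoid(self.gates[node](z))
                next_weights.extend([w * g, w * (1.0 - g)])
                node += 1
            weights = next_weights
        return torch.cat(weights, dim=1)          # (batch, n_leaves)

    def forward(self, z):
        p = self.leaf_weights(z)                  # (batch, n_leaves)
        outs = torch.stack([G(z) for G in self.generators], dim=1)
        return (p.unsqueeze(-1) * outs).sum(dim=1)  # soft mixture of leaves

# Usage: draw z ~ N(0, I); the mixed output can be fed to an ordinary GAN
# discriminator and trained with the usual adversarial loss.
hmog = HierarchicalMixtureOfGenerators(latent_dim=100, out_dim=28 * 28, depth=2)
z = torch.randn(16, 100)
fake = hmog(z)                                    # (16, 784)
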

Files

License bundle

Name: license.txt
Size: 1.45 KB
Format: Item-specific license agreed upon to submission
Description:
