Publication:
Supervising topic models with Gaussian processes

dc.contributor.author: Kandemir, Melih
dc.contributor.author: Kekeç, T.
dc.contributor.author: Yeniterzi, Reyyan
dc.contributor.department: Computer Science
dc.contributor.ozuauthor: KANDEMİR, Melih
dc.contributor.ozuauthor: YENİTERZİ, Reyyan
dc.date.accessioned: 2018-09-04T08:24:05Z
dc.date.available: 2018-09-04T08:24:05Z
dc.date.issued: 2018-05
dc.description.abstract: Topic modeling is a powerful approach for modeling data represented as high-dimensional histograms. While the high dimensionality of such input data is extremely beneficial in unsupervised applications including language modeling and text data exploration, it introduces difficulties in cases where class information is available to boost prediction performance. Feeding such input directly to a classifier suffers from the curse of dimensionality. Performing dimensionality reduction and classification disjointly, on the other hand, cannot achieve optimal performance, since information is lost in the gap between these two steps, each unaware of the other. Existing supervised topic models, introduced as a remedy to such scenarios, have thus far incorporated only linear classifiers in order to keep inference tractable, at a dramatic cost in expressive power. In this paper, we propose the first Bayesian construction to perform topic modeling and non-linear classification jointly. We use the well-known Latent Dirichlet Allocation (LDA) for topic modeling and sparse Gaussian processes for non-linear classification. We combine these two components by a latent variable encoding the empirical topic distribution of each document in the corpus. We achieve a novel variational inference scheme by adapting ideas from the newly emerging deep Gaussian processes into the realm of topic modeling. We demonstrate that our model outperforms other existing approaches such as: (i) disjoint LDA and non-linear classification, (ii) joint LDA and linear classification, (iii) joint non-LDA linear subspace modeling and linear classification, and (iv) non-linear classification without topic modeling, on three benchmark data sets from two real-world applications: text categorization and image tagging.
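The abstract contrasts the proposed joint model with baseline (i), disjoint LDA followed by non-linear classification. The sketch below illustrates only that disjoint baseline, not the paper's joint variational model: LDA reduces word-count histograms to topic distributions, and a separately fitted Gaussian-process classifier then predicts labels from them. The toy data, scikit-learn components, and all hyperparameters are illustrative assumptions, not the authors' setup.

```python
# Hypothetical sketch of baseline (i) from the abstract: a two-step pipeline
# of LDA dimensionality reduction followed by a Gaussian-process classifier,
# fitted disjointly. The topics are learned without seeing the labels, which
# is exactly the information loss the joint model is designed to avoid.
import numpy as np
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.gaussian_process import GaussianProcessClassifier

rng = np.random.default_rng(0)

# Toy corpus: 60 documents as word-count histograms over a 50-term vocabulary.
X = rng.poisson(lam=1.0, size=(60, 50))
y = rng.integers(0, 2, size=60)  # binary class labels (synthetic)

# Step 1: reduce each high-dimensional histogram to a topic distribution.
lda = LatentDirichletAllocation(n_components=5, random_state=0)
theta = lda.fit_transform(X)  # shape (60, 5); each row sums to 1

# Step 2: non-linear classification on the topic features, fitted separately
# from the topic model.
gpc = GaussianProcessClassifier(random_state=0).fit(theta, y)
preds = gpc.predict(theta[:5])
```

The paper replaces this pipeline with a single Bayesian model in which the empirical topic distribution is a shared latent variable, so topic inference and the sparse GP classifier inform each other during variational inference.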
dc.description.sponsorship: Netherlands Organization for Scientific Research (NWO)
dc.identifier.doi: 10.1016/j.patcog.2017.12.019
dc.identifier.endpage: 236
dc.identifier.issn: 0031-3203
dc.identifier.scopus: 2-s2.0-85044649648
dc.identifier.startpage: 226
dc.identifier.uri: http://hdl.handle.net/10679/5936
dc.identifier.uri: https://doi.org/10.1016/j.patcog.2017.12.019
dc.identifier.volume: 77
dc.identifier.wos: 000426222800019
dc.language.iso: eng
dc.peerreviewed: yes
dc.publicationstatus: Published
dc.publisher: Elsevier
dc.relation.ispartof: Pattern Recognition
dc.relation.publicationcategory: International Refereed Journal
dc.rights: restrictedAccess
dc.subject.keywords: Latent Dirichlet allocation
dc.subject.keywords: Nonparametric Bayesian inference
dc.subject.keywords: Gaussian processes
dc.subject.keywords: Variational inference
dc.subject.keywords: Supervised topic models
dc.title: Supervising topic models with Gaussian processes
dc.type: article
dspace.entity.type: Publication
relation.isOrgUnitOfPublication: 85662e71-2a61-492a-b407-df4d38ab90d7
relation.isOrgUnitOfPublication.latestForDiscovery: 85662e71-2a61-492a-b407-df4d38ab90d7
