Scalable Inference for Logistic-Normal Topic Models

Authors: Jianfei Chen, Jun Zhu, Zi Wang, Xun Zheng, Bo Zhang
Where published: NeurIPS 2013
Abstract URL: http://papers.nips.cc/paper/4981-scalable-inference-for-logistic-normal-topic-models


Logistic-normal topic models can effectively discover correlation structures among latent topics. However, their inference remains a challenge because of the non-conjugacy between the logistic-normal prior and the multinomial topic mixing proportions. Existing algorithms either make restrictive mean-field assumptions or do not scale to large applications. This paper presents a partially collapsed Gibbs sampling algorithm that converges to the provably correct distribution by exploiting ideas from data augmentation. To improve time efficiency, we further present a parallel implementation that can handle large-scale applications and learn the correlation structures of thousands of topics from millions of documents. Extensive empirical results demonstrate the promise of the approach.
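For context (not part of the paper's abstract): in the logistic-normal topic model, as in the standard correlated topic model, each document's topic proportions are obtained by pushing a Gaussian vector through the softmax. A minimal sketch of the generative process, in standard notation:

    \eta_d \sim \mathcal{N}(\mu, \Sigma),
    \theta_{dk} = \frac{\exp(\eta_{dk})}{\sum_{j=1}^{K} \exp(\eta_{dj})},
    z_{dn} \sim \mathrm{Mult}(\theta_d),
    w_{dn} \sim \mathrm{Mult}(\Phi_{z_{dn}}).

The full covariance \Sigma is what captures correlations among topics, and the softmax link is the source of the non-conjugacy the abstract refers to: the Gaussian prior on \eta_d does not combine analytically with the multinomial likelihood over the z_{dn}, which is why the paper resorts to auxiliary-variable data augmentation inside a partially collapsed Gibbs sampler.

The short Python sketch below simulates this generative process to make the logistic-normal link concrete. It is illustrative only; the dimensions, covariance, and seed are hypothetical choices, not values from the paper.

    import numpy as np

    rng = np.random.default_rng(0)
    K, V, N = 5, 1000, 50                    # topics, vocabulary size, document length
    mu = np.zeros(K)                         # prior mean of the log-odds vector
    Sigma = 0.5 * np.eye(K) + 0.5 * np.ones((K, K))  # equicorrelated covariance, for illustration
    Phi = rng.dirichlet(np.ones(V), size=K)  # K x V per-topic word distributions

    eta = rng.multivariate_normal(mu, Sigma)  # Gaussian draw: correlated log-odds
    theta = np.exp(eta - eta.max())           # softmax (stabilized) gives topic proportions
    theta /= theta.sum()
    z = rng.choice(K, size=N, p=theta)        # per-token topic assignments
    w = np.array([rng.choice(V, p=Phi[k]) for k in z])  # observed words

Contrast this with LDA's Dirichlet prior, whose neutrality structure cannot express correlations between topic proportions; that expressiveness is the modeling gain of the logistic-normal prior, bought at the cost of conjugacy.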
