Authors: Jianfei Chen, Jun Zhu, Zi Wang, Xun Zheng, Bo Zhang
Where published: NeurIPS 2013
Abstract URL: http://papers.nips.cc/paper/4981-scalable-inference-for-logistic-normal-topic-models
Logistic-normal topic models can effectively discover correlation structures among latent topics. However, their inference remains a challenge because of the non-conjugacy between the logistic-normal prior and the multinomial topic mixing proportions. Existing algorithms either make restrictive mean-field assumptions or do not scale to large applications. This paper presents a partially collapsed Gibbs sampling algorithm that converges to the provably correct distribution by exploiting data augmentation. To improve time efficiency, we further present a parallel implementation that can handle large-scale applications and learn the correlation structures of thousands of topics from millions of documents. Extensive empirical results demonstrate the promise of our approach.
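As background (this sketch is not part of the paper's abstract), the non-conjugacy arises because the topic mixing proportions are the softmax of a Gaussian vector, so the multinomial likelihood admits no closed-form conjugate update under the logistic-normal prior. A minimal illustration in standard notation follows; the symbols $\eta_d$, $\theta_d$, $\mu$, $\Sigma$ and the Polya-Gamma identity of Polson, Scott & Windle (2013) are assumptions here, shown only as representative of the kind of data augmentation the abstract alludes to:

$$\eta_d \sim \mathcal{N}(\mu, \Sigma), \qquad \theta_{dk} = \frac{\exp(\eta_{dk})}{\sum_{j=1}^{K} \exp(\eta_{dj})},$$

$$\frac{(e^{\psi})^{a}}{(1+e^{\psi})^{b}} = 2^{-b}\, e^{\kappa\psi} \int_{0}^{\infty} e^{-\omega\psi^{2}/2}\, p_{\mathrm{PG}}(\omega \mid b, 0)\, d\omega, \qquad \kappa = a - \tfrac{b}{2}.$$

Conditioned on the auxiliary variable $\omega \sim \mathrm{PG}(b, 0)$, the logistic likelihood term becomes Gaussian in $\psi$, so each $\eta_{dk}$ acquires a closed-form Gaussian full conditional inside a Gibbs sweep, which is what makes a correct (rather than mean-field approximate) sampler tractable.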