LDA*: a robust and large-scale topic modeling system

Resource type
Journal Article
Authors/contributors
Yu, L., Zhang, C., Shao, Y., & Cui, B.
Title
LDA*: a robust and large-scale topic modeling system
Abstract
We present LDA*, a system that has been deployed in one of the largest Internet companies to fulfil their requirement of "topic modeling as an internal service": relying on thousands of machines, engineers in different sectors submit their data, some as large as 1.8 TB, to LDA* and get results back in hours. LDA* is motivated by the observation that none of the existing topic modeling systems is robust enough: each is designed for a specific point in the tradeoff space and can be sub-optimal, sometimes by up to 10×, across workloads. Our first contribution is a systematic study of all recently proposed samplers: AliasLDA, F+LDA, LightLDA, and WarpLDA. We discovered a novel system tradeoff among these samplers: each has a different sampling complexity and performs differently, sometimes by 5×, on documents of different lengths. Based on this tradeoff, we developed a hybrid sampler that uses different samplers for different types of documents. This hybrid approach works across a wide range of workloads and outperforms the fastest single sampler by up to 2×. We then focused on distributed environments in which thousands of workers, each with different performance (due to virtualization and resource sharing), coordinate to train a topic model. Our second contribution is an asymmetric parameter server architecture that pushes some computation to the parameter server side. This architecture is motivated by the skew of the word frequency distribution and a novel tradeoff we discovered between communication and computation. With this architecture, we outperform the traditional, symmetric architecture by up to 2×. With these two contributions, together with a carefully engineered implementation, our system outperforms existing systems by up to 10× and has been providing topic modeling services in production for more than six months.
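To make the abstract's hybrid-sampler idea concrete, below is a minimal illustrative Python sketch, not the authors' code: documents are routed to one of two samplers by a length threshold. The sampler bodies, the threshold value, and the routing rule are hypothetical placeholders; the paper determines the sampler-to-length-class assignment empirically per workload.

import random
from typing import List

Doc = List[int]  # a document as a list of word ids

def sparsity_aware_sampler(doc: Doc, num_topics: int) -> List[int]:
    # Stand-in for a sampler whose per-token cost grows with the number
    # of distinct topics in the document (in the spirit of AliasLDA/F+LDA),
    # so it is intuitively cheaper on short documents. Placeholder body.
    return [random.randrange(num_topics) for _ in doc]

def mh_sampler(doc: Doc, num_topics: int) -> List[int]:
    # Stand-in for an O(1)-per-token Metropolis-Hastings sampler (in the
    # spirit of LightLDA/WarpLDA), which amortizes better on long
    # documents. Placeholder body.
    return [random.randrange(num_topics) for _ in doc]

def hybrid_assign(docs: List[Doc], num_topics: int,
                  length_threshold: int = 300) -> List[List[int]]:
    # Route each document to the sampler assumed cheaper for its length.
    # The threshold value is made up for illustration only.
    assignments = []
    for doc in docs:
        sampler = (sparsity_aware_sampler if len(doc) < length_threshold
                   else mh_sampler)
        assignments.append(sampler(doc, num_topics))
    return assignments

The point of the sketch is only the dispatch structure: because no single sampler dominates across document lengths (the abstract reports gaps of up to 5×), choosing per document recovers near-best performance across mixed workloads.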
Publication
Proceedings of the VLDB Endowment
Volume
10
Issue
11
Pages
1406–1417
Date
08/2017
Journal Abbr
Proc. VLDB Endow.
Language
en
ISSN
2150-8097
Short Title
LDA*
Accessed
23/02/2024, 17:09
Library Catalogue
DOI.org (Crossref)
Citation
Yu, L., Zhang, C., Shao, Y., & Cui, B. (2017). LDA*: a robust and large-scale topic modeling system. Proceedings of the VLDB Endowment, 10(11), 1406–1417. https://doi.org/10.14778/3137628.3137649