A Comparison of Topic Modeling and Classification Machine Learning Algorithms on Luganda Data
Abstract
Extracting functional themes and topics from a large text corpus manually is usually infeasible. There is a need for text mining techniques such as topic modeling, which provide a mechanism for inferring topics from a corpus of text automatically. This paper discusses topic modeling and topic classification models on Luganda text data. For topic modeling, we considered Non-negative Matrix Factorization (NMF), an unsupervised machine learning algorithm that extracts hidden patterns from unlabeled text data to form latent topics; for topic classification, we considered classic approaches, neural networks, and pretrained models. Bidirectional Encoder Representations from Transformers (BERT), a pretrained model whose attention mechanism learns contextual relations between words (or sub-words) in a text, and the Support Vector Machine (SVM), a classic algorithm that analyzes particular properties of learning within text data, record the best results for topic classification. Our results indicate that topic modeling and topic classification algorithms produce relatively similar results when the topic classification algorithms are trained on a balanced dataset.
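As a minimal sketch (not the authors' pipeline), the snippet below pairs the two kinds of methods the abstract names: unsupervised NMF topic extraction and a supervised linear SVM topic classifier, both via scikit-learn. The toy documents, labels, and hyperparameters are illustrative placeholders standing in for a labeled Luganda corpus.

# Sketch only: NMF topic modeling plus an SVM topic classifier.
# Documents and labels are placeholders, not the paper's data.
from sklearn.decomposition import NMF
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

docs = [
    "farmers discuss crop prices at the market",
    "the market opens early and prices rise",
    "the clinic treats malaria patients daily",
    "patients queue at the clinic for malaria tests",
]
labels = ["agriculture", "agriculture", "health", "health"]

# Topic modeling: factorize TF-IDF features into 2 latent topics.
vectorizer = TfidfVectorizer()
tfidf = vectorizer.fit_transform(docs)
nmf = NMF(n_components=2, random_state=0)
doc_topics = nmf.fit_transform(tfidf)      # per-document topic weights
terms = vectorizer.get_feature_names_out()
for i, component in enumerate(nmf.components_):
    top = component.argsort()[-3:][::-1]   # 3 highest-weighted terms
    print(f"topic {i}:", [terms[t] for t in top])

# Topic classification: a linear SVM over the same TF-IDF representation.
clf = make_pipeline(TfidfVectorizer(), LinearSVC())
clf.fit(docs, labels)
print(clf.predict(["new malaria drugs reach the clinic"]))

The contrast mirrors the paper's setup: NMF needs no labels and yields topics to be interpreted after the fact, while the SVM requires a labeled (and, per the abstract, balanced) training set but predicts topics directly.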
Cite
Text
Tobius et al. "A Comparison of Topic Modeling and Classification Machine Learning Algorithms on Luganda Data." ICLR 2022 Workshops: AfricaNLP, 2022.
Markdown
[Tobius et al. "A Comparison of Topic Modeling and Classification Machine Learning Algorithms on Luganda Data." ICLR 2022 Workshops: AfricaNLP, 2022.](https://mlanthology.org/iclrw/2022/tobius2022iclrw-comparison/)
BibTeX
@inproceedings{tobius2022iclrw-comparison,
title = {{A Comparison of Topic Modeling and Classification Machine Learning Algorithms on Luganda Data}},
author = {Tobius, Bateesa Saul and Babirye, Claire and Nakatumba-Nabende, Joyce and Katumba, Andrew},
booktitle = {ICLR 2022 Workshops: AfricaNLP},
year = {2022},
url = {https://mlanthology.org/iclrw/2022/tobius2022iclrw-comparison/}
}