JOURNAL ARTICLE

Dirichlet-Smoothed Word Embeddings for Low-Resource Settings

Jakob Jungmaier, Nora Kassner, Benjamin Roth

Year: 2020 · Published in: arXiv (Cornell University) · Pages: 3560–3565 · Publisher: Cornell University

Abstract

Nowadays, classical count-based word embeddings using positive pointwise mutual information (PPMI) weighted co-occurrence matrices have been widely superseded by machine-learning-based methods like word2vec and GloVe. However, these methods are usually trained on very large amounts of text data; in many cases, far less is available, for example for specific domains or low-resource languages. This paper revisits PPMI by adding Dirichlet smoothing to correct its bias towards rare words. We evaluate on standard word similarity data sets and compare against word2vec and the recent state of the art for low-resource settings, Positive and Unlabeled (PU) Learning for word embeddings. The proposed method outperforms PU-Learning in low-resource settings and obtains competitive results for Maltese and Luxembourgish.
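The smoothing step the abstract describes is easy to make concrete. Below is a minimal Python sketch (not the authors' released code), assuming the common additive formulation: a pseudo-count alpha, playing the role of a symmetric Dirichlet prior, is added to every cell of the word-context co-occurrence matrix before computing PPMI, which dampens the inflated PMI scores that rare words otherwise receive. The function name, the default alpha, and the dense-matrix representation are illustrative assumptions; real corpora would call for sparse matrices.

    import numpy as np

    def dirichlet_smoothed_ppmi(counts, alpha=0.1):
        # Illustrative sketch of add-alpha (Dirichlet) smoothing + PPMI.
        # counts: (V x V) word-context co-occurrence count matrix.
        # alpha:  pseudo-count per cell, acting as a symmetric Dirichlet prior.
        c = np.asarray(counts, dtype=np.float64) + alpha   # smooth every cell
        total = c.sum()
        p_joint = c / total                                # P(w, c)
        p_word = c.sum(axis=1, keepdims=True) / total      # P(w)
        p_ctx = c.sum(axis=0, keepdims=True) / total       # P(c)
        pmi = np.log(p_joint / (p_word * p_ctx))           # pointwise mutual information
        return np.maximum(pmi, 0.0)                        # clip negatives -> PPMI

Low-dimensional word vectors are then typically obtained by factorizing the resulting PPMI matrix, for instance with truncated SVD, and compared by cosine similarity on word similarity benchmarks such as those used in the evaluation.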

Keywords:
word embeddings, word2vec, pointwise mutual information (PPMI), Dirichlet smoothing, word similarity, low-resource languages, Maltese, natural language processing, artificial intelligence

Metrics

Cited by: 5
FWCI (Field-Weighted Citation Impact): 0.00
References: 7

Topics

Natural Language Processing Techniques
Topic Modeling
Text and Document Classification Technologies
(all within Physical Sciences → Computer Science → Artificial Intelligence)