JOURNAL ARTICLE

Word Embedding for Cross-lingual Natural Language Analysis

Yukun Hu

Year: 2023 | Journal: Highlights in Science Engineering and Technology | Vol: 68 | Pages: 320-326

Abstract

Word embedding, a distributed representation of natural language learned with deep neural networks, has driven significant breakthroughs in many natural language processing tasks and has gradually become a central subject of research and application. Word embedding methods capture richer and more valuable semantic information than earlier feature-based representations. However, they often rely on large-scale annotated resources, which are difficult to obtain, especially for low-resource languages. In response, researchers have explored several routes: unsupervised learning from unlabeled data, semi-supervised learning that combines labeled and unlabeled data, and crowdsourcing. In parallel, many scholars have proposed improving accuracy on a target task by integrating annotation resources across languages, so that knowledge from other languages can be transferred to or merged into the model. This paper discusses the development and prospects of word embedding.
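As a brief illustration of the cross-lingual transfer idea the abstract mentions, one widely used technique (a general method, not necessarily the specific approach surveyed in this paper) aligns two independently trained monolingual embedding spaces with an orthogonal mapping, solved in closed form via the Procrustes problem. The sketch below uses synthetic random vectors in place of real embeddings; all variable names are illustrative.

```python
import numpy as np

# Toy stand-ins for embeddings of translation word pairs in two languages.
# In practice X and Y would come from monolingual embedding models and a
# small bilingual dictionary.
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 4))                         # source-language vectors
R_true = np.linalg.qr(rng.normal(size=(4, 4)))[0]   # hidden rotation between spaces
Y = X @ R_true                                      # target-language vectors

# Orthogonal Procrustes: W = U V^T from the SVD of X^T Y minimizes
# ||XW - Y||_F over orthogonal matrices W.
U, _, Vt = np.linalg.svd(X.T @ Y)
W = U @ Vt

# The learned mapping carries source vectors into the target space,
# so target-language annotations can be reused for the source language.
print(np.allclose(X @ W, Y))  # True: the rotation is recovered exactly
```

Because the optimal map has a closed-form solution, this alignment needs only a small seed dictionary rather than large-scale annotated corpora, which is why it is attractive for the low-resource setting described above.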

Keywords:
Computer science; Word embedding; Natural language processing; Artificial intelligence; Crowdsourcing; Annotation; Natural language; Representation; Deep learning; World Wide Web; Linguistics

Metrics

Cited by: 2
FWCI (Field-Weighted Citation Impact): 0.51
References: 13
Citation Normalized Percentile: 0.67

Topics

Topic Modeling (Physical Sciences → Computer Science → Artificial Intelligence)
Natural Language Processing Techniques (Physical Sciences → Computer Science → Artificial Intelligence)
Sentiment Analysis and Opinion Mining (Physical Sciences → Computer Science → Artificial Intelligence)