Word sense disambiguation (WSD) is a subfield of natural language processing that deals with words that have several possible meanings; such polysemous terms are sometimes also referred to as confusing phrases. The performance of WSD depends on how effectively the machine recognizes the ambiguous word. The discussed word embedding model maps ambiguous words from the document space to a vector space with no loss of information. The main challenge in representing ambiguous words is the features: selecting and representing an ambiguous word with respect to its features is the most tedious part of word embedding. The model uses countable features of the available context for disambiguation, so the proposed model is implemented for ambiguous words that carry context information; the unavailability of context remains the main limitation of this model. A Recurrent Neural Network with Long Short-Term Memory (RNN-LSTM) is used for classification. The output of the RNN-LSTM is a set of sense values, which are then mapped to the freely available lexical resource WordNet to retrieve the correct sense (meaning).
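The pipeline described above (context features → classifier → sense value → lexical lookup) can be sketched in a minimal, self-contained form. The toy sense inventory below is a hypothetical stand-in for WordNet, and the simple context-overlap score stands in for the RNN-LSTM classifier; both are illustrative assumptions, not the paper's actual implementation.

```python
# Sketch of context-based word sense disambiguation.
# SENSE_INVENTORY is a toy stand-in for WordNet sense glosses;
# the overlap score stands in for the RNN-LSTM sense classifier.

SENSE_INVENTORY = {
    "bank": {
        "bank.n.01": {"money", "deposit", "loan", "account", "cash"},
        "bank.n.02": {"river", "shore", "water", "slope", "edge"},
    }
}

def disambiguate(word, context):
    """Return the sense whose feature words overlap most with the context."""
    tokens = set(context.lower().split())
    senses = SENSE_INVENTORY[word]
    # Score each candidate sense by counting shared context features.
    return max(senses, key=lambda s: len(senses[s] & tokens))

print(disambiguate("bank", "she opened a deposit account at the bank"))
# → bank.n.01
print(disambiguate("bank", "they walked along the river bank"))
# → bank.n.02
```

In the full model, the returned sense value would be mapped to a WordNet synset to retrieve the gloss; here the inventory keys mimic WordNet's `word.pos.nn` naming only for readability.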
K R Kavitha, S Pranav, A Anagh Anil