Most NLP research fields (translation, classification, dialogue systems, …) have been revolutionized by the rise of deep learning methods, which rely on dense, low-dimensional feature representations. In this article we present the basic training techniques for word embeddings, as well as recent work on abstractive neural summarizers. We also introduce our trained French word embeddings, which we use as the embedding layer of our baseline French neural summarizer for the headline generation task, built on the RNN (Recurrent Neural Network) Encoder-Decoder architecture.
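To make the pipeline described above concrete, the following is a minimal NumPy sketch of an RNN encoder-decoder with a pretrained-style embedding layer. Everything here is illustrative and hypothetical: the toy French vocabulary, the randomly initialized embedding matrix standing in for trained French word embeddings, and the shared plain (Elman) RNN weights are all assumptions, not the paper's actual model or parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy vocabulary; a real system would build this from a French corpus.
vocab = ["<pad>", "<s>", "</s>", "le", "chat", "dort"]
word2id = {w: i for i, w in enumerate(vocab)}

V, D, H = len(vocab), 8, 16  # vocab size, embedding dim, hidden dim

# Stand-in for pretrained French word embeddings (random here for illustration).
E = rng.normal(0.0, 0.1, (V, D))

# Untrained Elman-RNN weights shared by encoder and decoder (a simplification;
# the actual summarizer may use separate parameters or gated units such as GRUs).
Wx = rng.normal(0.0, 0.1, (D, H))
Wh = rng.normal(0.0, 0.1, (H, H))
Wo = rng.normal(0.0, 0.1, (H, V))

def rnn_step(token_id, h):
    """One recurrence step: embed the token, then update the hidden state."""
    return np.tanh(E[token_id] @ Wx + h @ Wh)

def encode(token_ids):
    """Run the encoder; the final hidden state summarizes the source text."""
    h = np.zeros(H)
    for t in token_ids:
        h = rnn_step(t, h)
    return h

def greedy_decode(h, max_len=5):
    """Generate a headline token by token, picking the argmax at each step."""
    out, x = [], word2id["<s>"]
    for _ in range(max_len):
        h = rnn_step(x, h)
        x = int(np.argmax(h @ Wo))
        if x == word2id["</s>"]:
            break
        out.append(vocab[x])
    return out

src = [word2id[w] for w in ["le", "chat", "dort"]]
headline = greedy_decode(encode(src))
print(headline)
```

With random, untrained weights the generated tokens are arbitrary; the sketch only shows how the embedding layer feeds the encoder and how the decoder conditions on the encoder's final state.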