Xueyi Hao, Bohan Li, Ernan Li, Cunbin Li, Dalin Qin
Due to information overload, automatic text summarization (ATS) is becoming increasingly important. However, the deep neural networks (DNNs) and pre-trained language models widely used in ATS today cannot fully mimic the operating mechanism of neurons in the human brain. In this paper, we introduce spiking neural networks (SNNs) into an ATS system, motivated by the high biological plausibility, low power consumption, and high robustness of SNNs. We propose a new extractive summarization model, BERT (Bidirectional Encoder Representations from Transformers) + LSNN (Long short-term memory Spiking Neural Network), and evaluate it in a set of experiments on the CNN/Daily Mail dataset. Compared with the existing BERT + LSTM (Long Short-Term Memory) model, BERT + LSNN not only improves performance but also confirms the SNN's advantages of low power consumption and high robustness. This work is promising for expanding related research in text summarization.
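To illustrate the spiking mechanism that distinguishes an SNN layer from a conventional LSTM, the sketch below simulates a single leaky integrate-and-fire (LIF) neuron, the basic unit underlying SNN models such as the LSNN. This is a minimal illustrative example, not the paper's implementation; the function name, time constant, and threshold values are assumptions chosen for clarity.

```python
# Minimal sketch of a leaky integrate-and-fire (LIF) spiking neuron.
# Parameter names and values (tau, threshold) are illustrative assumptions,
# not taken from the BERT + LSNN paper.

def lif_neuron(inputs, tau=0.9, threshold=1.0):
    """Simulate one LIF neuron over a sequence of input currents.

    The membrane potential leaks by factor `tau` each step, accumulates
    the incoming current, and emits a spike (1) once it crosses
    `threshold`, after which it resets to zero.
    """
    v = 0.0          # membrane potential
    spikes = []
    for x in inputs:
        v = tau * v + x          # leaky integration of input current
        if v >= threshold:       # fire when threshold is crossed
            spikes.append(1)
            v = 0.0              # reset after the spike
        else:
            spikes.append(0)
    return spikes

# A constant sub-threshold drive produces spikes only after the
# potential has accumulated over several steps.
print(lif_neuron([0.4] * 6))  # → [0, 0, 1, 0, 0, 1]
```

Because the neuron communicates through sparse binary spikes rather than dense floating-point activations, computation is event-driven, which is the basis of the low-power claim made for SNNs above.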