Shovan Bhowmik, Rifat Sadik, Wahiduzzaman Akanda, Juboraj Roy Pavel
Recent research has focused on opinion mining from public sentiment using natural language processing (NLP) and machine learning (ML) techniques. Transformer-based models, such as Bidirectional Encoder Representations from Transformers (BERT), excel at extracting semantic information but are resource-intensive. Google's FNet ("mixing tokens with Fourier transforms") replaces BERT's self-attention mechanism with a non-parameterized Fourier transform, aiming to reduce training time without compromising performance. This study fine-tuned the FNet model on a publicly available Kaggle hotel review dataset and compared its performance on that dataset against BERT as well as conventional models such as long short-term memory (LSTM) networks and support vector machines (SVMs). Results revealed that FNet reduces training time by almost 20% and memory utilization by nearly 60% compared to BERT. The highest test accuracy achieved by FNet in this experiment was 80.27%, which is nearly 97.85% of BERT's accuracy under identical parameters.
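To illustrate the core idea behind FNet's token mixing, the following is a minimal NumPy sketch (not the authors' implementation): the learned self-attention sublayer is replaced by a parameter-free 2D discrete Fourier transform over the sequence and hidden dimensions, keeping only the real part, as described in the FNet paper.

```python
import numpy as np

def fourier_mixing(x):
    """FNet-style token mixing for a (seq_len, hidden_dim) array.

    Applies a DFT along the hidden dimension and then along the
    sequence dimension (np.fft.fft2 does both), and keeps only the
    real part. Unlike self-attention, this sublayer has no learned
    parameters, which is the source of FNet's speed/memory savings.
    """
    return np.fft.fft2(x).real

# Toy example with illustrative sizes: sequence length 4, hidden size 8.
x = np.random.randn(4, 8)
mixed = fourier_mixing(x)
assert mixed.shape == x.shape  # mixing preserves the tensor shape
```

In the full FNet encoder this mixing sublayer is followed by the usual residual connection, layer norm, and feed-forward sublayer; only the attention portion is swapped out.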