In this paper, we propose an adaptive learning rate algorithm for the gradient descent method used in artificial neural networks (ANNs). The proposed algorithm starts from an initially assigned learning rate and, after a fixed number of backpropagation rounds, revises it to the minimum learning rate plus the average of the minimum and maximum learning rates (these bounds are updated on each recursive call). On the next recursive call, the revised learning rate is adopted if the average total error has decreased; otherwise, the previous learning rate is retained. This process repeats until the desired learning rate is reached, yielding optimal weight parameters with high classification accuracy. We compared the proposed algorithm with existing heuristic approaches and found that it converges faster and returns optimized weights with higher classification accuracy. We also evaluated the algorithm on spam email classification using multilayer neural networks and found that it outperforms the existing approach and is less prone to overfitting. The proposed algorithm achieves 99.12% accuracy on email classification.
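The abstract's update rule can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names, the acceptance test on average total error, and the way the min/max bounds are supplied are assumptions based only on the description above.

```python
def revised_learning_rate(lr_min, lr_max):
    """Candidate rate described in the abstract:
    minimum learning rate plus the average of the minimum and maximum."""
    return lr_min + (lr_min + lr_max) / 2.0

def adapt_learning_rate(lr_prev, lr_min, lr_max, prev_avg_error, new_avg_error):
    """Adopt the revised rate only if the average total error decreased;
    otherwise keep the previous learning rate (hypothetical helper)."""
    candidate = revised_learning_rate(lr_min, lr_max)
    return candidate if new_avg_error < prev_avg_error else lr_prev

# Example: with bounds [0.01, 0.1] the candidate rate is 0.065.
# It replaces the previous rate (0.05) only when the error improved.
lr_improved = adapt_learning_rate(0.05, 0.01, 0.1, prev_avg_error=1.2, new_avg_error=1.0)
lr_worsened = adapt_learning_rate(0.05, 0.01, 0.1, prev_avg_error=1.0, new_avg_error=1.2)
print(lr_improved, lr_worsened)
```

In the full algorithm the bounds `lr_min` and `lr_max` would themselves be updated on each recursive call; how they are narrowed is not specified in the abstract, so it is omitted here.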
Md. Tofael Ahmed, Mariam Akter, M. Saifur Rahman, Maqsudur Rahman, Pintu Chandra Paul, Miss. Nargis Parvin, Almas Hossain Antar