Archive-based coronavirus herd immunity algorithm for optimizing weights in neural networks
Document Type
Article
Publication Title
Neural Computing and Applications
Abstract
The success of supervised learning in feedforward neural networks, especially the multilayer perceptron (MLP), depends on a suitable configuration of the network's controlling parameters (i.e., weights and biases). Normally, gradient descent is used to find optimal values of the weights and biases, but it suffers from the local-optimum trap and slow convergence. Therefore, stochastic approximation methods such as metaheuristics are appealing alternatives. The coronavirus herd immunity optimizer (CHIO) is a recent human-based metaheuristic stemming from the herd immunity mechanism as a way to treat the spread of the coronavirus pandemic. In this paper, an external archive strategy is proposed and applied to direct the population toward more promising search regions. The external archive is maintained during the algorithm's evolution and saves the best solutions for later use. This enhanced version of CHIO is called ACHIO. The algorithm is utilized in the training process of the MLP to find its optimal controlling parameters, thereby improving classification accuracy. The proposed approach is evaluated on 15 classification datasets with between 2 and 10 classes. The performance of ACHIO is compared against six well-known swarm intelligence algorithms and the original CHIO in terms of classification accuracy. Interestingly, ACHIO produces accurate results that exceed those of the other comparative methods on ten of the fifteen classification datasets, and very competitive results on the rest.
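The abstract's core idea (a population-based optimizer that keeps an external archive of elite solutions and reinjects them to steer the search while training MLP weights) can be sketched as follows. This is a minimal illustration, not the authors' ACHIO implementation: the mutation operator, the elite-blending step, and the parameter names (`archive_size`, `inject_rate`) are illustrative assumptions, and the fitness is the squared error of a tiny 2-2-1 MLP on XOR rather than a benchmark dataset.

```python
# Hedged sketch of an archive-assisted population optimizer for MLP weights.
# Not the paper's ACHIO: operators and parameter names are illustrative.
import math
import random

random.seed(0)

XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
DIM = 9  # 2x2 input->hidden weights + 2 hidden biases + 2 hidden->output + 1 bias

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def forward(w, x):
    """Tiny fixed-topology 2-2-1 MLP; w is a flat weight vector."""
    h1 = sigmoid(w[0] * x[0] + w[1] * x[1] + w[2])
    h2 = sigmoid(w[3] * x[0] + w[4] * x[1] + w[5])
    return sigmoid(w[6] * h1 + w[7] * h2 + w[8])

def loss(w):
    """Sum of squared errors over the XOR patterns (lower is fitter)."""
    return sum((forward(w, x) - y) ** 2 for x, y in XOR)

def optimize(pop_size=20, iters=300, archive_size=5, inject_rate=0.2):
    pop = [[random.uniform(-3, 3) for _ in range(DIM)] for _ in range(pop_size)]
    archive = []  # external archive of best-so-far weight vectors
    for _ in range(iters):
        for i, w in enumerate(pop):
            # Gaussian perturbation of the current solution.
            cand = [g + random.gauss(0, 0.3) for g in w]
            # Occasionally blend with an archived elite to pull the
            # population toward promising regions.
            if archive and random.random() < inject_rate:
                elite = random.choice(archive)
                cand = [(c + e) / 2 for c, e in zip(cand, elite)]
            if loss(cand) < loss(w):  # greedy replacement
                pop[i] = cand
        # Update the archive with the current population's best solution.
        archive = sorted(archive + [min(pop, key=loss)], key=loss)[:archive_size]
    return archive[0]

best = optimize()
print("final loss:", round(loss(best), 3))
```

In the paper's setting the flat weight vector would encode all weights and biases of the MLP under training, and the fitness would be its classification error on the training set.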
DOI
10.1007/s00521-023-08577-y
Publication Date
4-19-2023
Keywords
Archive technique, CHIO, Coronavirus herd immunity optimizer, Feedforward neural networks, MLP, Optimization
Recommended Citation
Abu Doush, I., et al., "Archive-based coronavirus herd immunity algorithm for optimizing weights in neural networks", Neural Computing and Applications, Apr 2023, doi:10.1007/s00521-023-08577-y
Comments
IR Deposit conditions:
OA version (pathway b) Accepted version
12 months embargo
License: Publisher's Bespoke License
Published source must be acknowledged with citation
Must link to publisher version with DOI
Post-prints are subject to Springer Nature re-use terms