TY - JOUR
T1 - Training multi-layer perceptron with enhanced brain storm optimization metaheuristics
AU - Bacanin, Nebojsa
AU - Alhazmi, Khaled
AU - Zivkovic, Miodrag
AU - Venkatachalam, K.
AU - Bezdan, Timea
AU - Nebhen, Jamel
N1 - Publisher Copyright:
© 2022 Tech Science Press. All rights reserved.
PY - 2022
Y1 - 2022
N2 - In the domain of artificial neural networks, the learning process represents one of the most challenging tasks. Since the classification accuracy highly depends on the weights and biases, it is crucial to find their optimal or suboptimal values for the problem at hand. However, due to the very large search space, it is very difficult to find the proper values of connection weights and biases. Employing traditional optimization algorithms for this issue leads to slow convergence and is prone to getting stuck in local optima. Most commonly, back-propagation is used for multi-layer perceptron training, and it can lead to the vanishing gradient issue. As an alternative approach, stochastic optimization algorithms, such as nature-inspired metaheuristics, are more reliable for complex optimization tasks, such as finding the proper values of weights and biases for neural network training. In this work, we propose an enhanced brain storm optimization-based algorithm for training neural networks. In the simulations, ten binary classification benchmark datasets with different difficulty levels are used to evaluate the efficiency of the proposed enhanced brain storm optimization algorithm. The results show that the proposed approach is very promising in this domain and that it achieved better results than other state-of-the-art approaches on the majority of datasets in terms of classification accuracy and convergence speed, owing to its capability of balancing intensification and diversification and avoiding local minima. The proposed approach obtained the best accuracy on eight out of ten observed datasets, outperforming all other algorithms by 1-2% on average. When mean accuracy is observed, the proposed algorithm dominated on nine out of ten datasets.
AB - In the domain of artificial neural networks, the learning process represents one of the most challenging tasks. Since the classification accuracy highly depends on the weights and biases, it is crucial to find their optimal or suboptimal values for the problem at hand. However, due to the very large search space, it is very difficult to find the proper values of connection weights and biases. Employing traditional optimization algorithms for this issue leads to slow convergence and is prone to getting stuck in local optima. Most commonly, back-propagation is used for multi-layer perceptron training, and it can lead to the vanishing gradient issue. As an alternative approach, stochastic optimization algorithms, such as nature-inspired metaheuristics, are more reliable for complex optimization tasks, such as finding the proper values of weights and biases for neural network training. In this work, we propose an enhanced brain storm optimization-based algorithm for training neural networks. In the simulations, ten binary classification benchmark datasets with different difficulty levels are used to evaluate the efficiency of the proposed enhanced brain storm optimization algorithm. The results show that the proposed approach is very promising in this domain and that it achieved better results than other state-of-the-art approaches on the majority of datasets in terms of classification accuracy and convergence speed, owing to its capability of balancing intensification and diversification and avoiding local minima. The proposed approach obtained the best accuracy on eight out of ten observed datasets, outperforming all other algorithms by 1-2% on average. When mean accuracy is observed, the proposed algorithm dominated on nine out of ten datasets.
KW - Algorithm hybridization
KW - Artificial neural network
KW - Brain storm optimization
KW - Metaheuristics
KW - Optimization
UR - https://www.scopus.com/pages/publications/85115988217
U2 - 10.32604/cmc.2022.020449
DO - 10.32604/cmc.2022.020449
M3 - Article
AN - SCOPUS:85115988217
SN - 1546-2218
VL - 70
SP - 4199
EP - 4215
JO - Computers, Materials and Continua
JF - Computers, Materials and Continua
IS - 2
ER -