TY - JOUR
T1 - Image classification using convolutional neural network tree ensembles
AU - Hafiz, A. M.
AU - Bhat, R. A.
AU - Hassaballah, M.
N1 - Publisher Copyright:
© 2022, The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature.
PY - 2023/2
Y1 - 2023/2
N2 - Conventional machine learning techniques may perform poorly when dealing with complex data. To address this issue, it is important to build data mining frameworks coupled with robust knowledge discovery mechanisms. One such framework is ensemble learning, which fuses data, builds models, and mines data within a single framework. Despite the work done on ensemble learning, issues remain, such as how to manage complexity, how to optimize the model, and how to fine-tune the model. Natural data processing schemes use parallel processing and are robust and efficient, and hence successful. Taking a cue from natural data processing architectures, we propose a parallelized CNN tree ensemble approach. The proposed approach is compared against the baseline, which is the deep network used in the ensemble. The ResNet50 architecture is utilized for initial experimentation. The datasets used for this task are the ImageNet and natural images datasets. The proposed approach outperforms the baseline in all experiments on the ImageNet dataset. Further, the proposed approach is benchmarked against different types of CNNs on various datasets, including CIFAR-10, CIFAR-100, Fashion-MNIST, FEI face recognition, and MNIST digits. Since our approach is adaptable to CNNs, it outperforms the baseline CNNs as well as state-of-the-art techniques on these datasets. The CNN architectures used for benchmarking are ResNet-50, DenseNet, WRN-28-10, and NSGANetV1. The code for the paper is available at https://github.com/mueedhafiz1982/CNNTreeEnsemble.git.
AB - Conventional machine learning techniques may perform poorly when dealing with complex data. To address this issue, it is important to build data mining frameworks coupled with robust knowledge discovery mechanisms. One such framework is ensemble learning, which fuses data, builds models, and mines data within a single framework. Despite the work done on ensemble learning, issues remain, such as how to manage complexity, how to optimize the model, and how to fine-tune the model. Natural data processing schemes use parallel processing and are robust and efficient, and hence successful. Taking a cue from natural data processing architectures, we propose a parallelized CNN tree ensemble approach. The proposed approach is compared against the baseline, which is the deep network used in the ensemble. The ResNet50 architecture is utilized for initial experimentation. The datasets used for this task are the ImageNet and natural images datasets. The proposed approach outperforms the baseline in all experiments on the ImageNet dataset. Further, the proposed approach is benchmarked against different types of CNNs on various datasets, including CIFAR-10, CIFAR-100, Fashion-MNIST, FEI face recognition, and MNIST digits. Since our approach is adaptable to CNNs, it outperforms the baseline CNNs as well as state-of-the-art techniques on these datasets. The CNN architectures used for benchmarking are ResNet-50, DenseNet, WRN-28-10, and NSGANetV1. The code for the paper is available at https://github.com/mueedhafiz1982/CNNTreeEnsemble.git.
KW - CNN
KW - Deep learning
KW - Ensembles
KW - Image classification
KW - ImageNet
KW - Parallel processing
UR - http://www.scopus.com/inward/record.url?scp=85135776199&partnerID=8YFLogxK
U2 - 10.1007/s11042-022-13604-6
DO - 10.1007/s11042-022-13604-6
M3 - Article
AN - SCOPUS:85135776199
SN - 1380-7501
VL - 82
SP - 6867
EP - 6884
JO - Multimedia Tools and Applications
JF - Multimedia Tools and Applications
IS - 5
ER -