ABNet - Leveraging the AdaBoost to Boost Neural Network Performance
Keywords:
AdaBoost, Ensemble Algorithm, Neural Network, Backpropagation, Boosting Algorithm

Abstract
Within the healthcare sector, Artificial Intelligence applications have substantially improved diagnosis through early problem detection, accurate diagnoses, and tailored treatment plans. Beyond enabling effective medical action, these advantages translate into improved patient outcomes. Ensemble methods, which integrate the strengths of multiple models, amplify these benefits further, offering even higher accuracy and dependability. In this paper we propose an ensemble technique, the ABNet algorithm, that combines the resilience of the AdaBoost algorithm with the flexibility of neural networks to increase classification accuracy. The algorithm adopts the traditional neural network architecture, training multiple layers of differentiable modules through backpropagation. In this method, the weight allocated to each individual neural network depends on its accuracy, with more accurate networks acquiring higher weights. This adaptability enables AdaBoost to emphasize previously misclassified instances, effectively directing the neural networks to learn from their errors. Our strategy, validated on three datasets, demonstrates robustness and generalization, highlighting the model's adaptability across varied diagnostic disease datasets.
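The boosting scheme described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes binary classification, a one-hidden-layer base network trained by weighted backpropagation, and standard AdaBoost weight updates; the names `TinyNet`, `abnet_fit`, and `abnet_predict` are illustrative.

```python
import numpy as np

class TinyNet:
    """Illustrative one-hidden-layer network trained by weighted backpropagation."""
    def __init__(self, n_in, n_hidden=8, lr=0.5, epochs=300, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.normal(0, 0.5, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0, 0.5, n_hidden)
        self.b2 = 0.0
        self.lr, self.epochs = lr, epochs

    def _forward(self, X):
        h = np.tanh(X @ self.W1 + self.b1)          # hidden layer
        p = 1.0 / (1.0 + np.exp(-(h @ self.W2 + self.b2)))  # sigmoid output
        return h, p

    def fit(self, X, y, w):
        # y in {0,1}; w are AdaBoost sample weights (sum to 1)
        for _ in range(self.epochs):
            h, p = self._forward(X)
            g = w * (p - y)                          # weighted cross-entropy gradient
            self.W2 -= self.lr * (h.T @ g)
            self.b2 -= self.lr * g.sum()
            gh = np.outer(g, self.W2) * (1.0 - h**2) # backpropagate to hidden layer
            self.W1 -= self.lr * (X.T @ gh)
            self.b1 -= self.lr * gh.sum(axis=0)
        return self

    def predict(self, X):
        return (self._forward(X)[1] >= 0.5).astype(int)

def abnet_fit(X, y, rounds=3):
    """AdaBoost loop with small neural networks as base learners."""
    n = len(y)
    w = np.full(n, 1.0 / n)
    nets, alphas = [], []
    for t in range(rounds):
        net = TinyNet(X.shape[1], seed=t).fit(X, y, w)
        pred = net.predict(X)
        err = float(np.clip(w[pred != y].sum(), 1e-10, 1 - 1e-10))
        alpha = 0.5 * np.log((1 - err) / err)        # more accurate nets get higher weight
        # Up-weight misclassified samples so the next network focuses on them
        w *= np.exp(alpha * np.where(pred != y, 1.0, -1.0))
        w /= w.sum()
        nets.append(net)
        alphas.append(alpha)
    return nets, alphas

def abnet_predict(nets, alphas, X):
    # Weighted vote of the ensemble in the {-1, +1} convention
    score = sum(a * (2 * net.predict(X) - 1) for net, a in zip(nets, alphas))
    return (score >= 0).astype(int)
```

The key mechanism is the exponential reweighting step: samples the current network misclassifies gain weight, so subsequent networks are steered toward the harder cases, while each network's vote `alpha` grows with its weighted accuracy.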
License
Copyright (c) 2024 Journal of Soft Computing and Data Mining

This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.