ABNet - Leveraging the AdaBoost to Boost Neural Network Performance

Authors

  • Ifra Altaf, PG Department of Computer Sciences, University of Kashmir
  • Manzoor Ahmed Chachoo, University of Kashmir, Srinagar

Keywords:

AdaBoost, Ensemble Algorithm, Neural Network, Backpropagation, Boosting Algorithm

Abstract

Within the healthcare sector, Artificial Intelligence applications have improved diagnosis through easier early problem detection, more accurate diagnoses, and tailored treatment plans. Alongside more effective medical intervention, these advantages translate into improved patient outcomes. Ensemble methods, which integrate the strengths of multiple models, amplify these benefits further, offering even higher accuracy and dependability. In this paper we propose an ensemble technique, the ABNet algorithm, that combines the resilience of the AdaBoost algorithm with the flexibility of neural networks to increase classification accuracy. The algorithm implements the traditional neural network architecture, training multiple layers of differentiable modules via backpropagation. In this method, the weight allocated to each individual neural network depends on its accuracy, with more accurate networks receiving higher weights. This adaptability enables AdaBoost to emphasize previously misclassified instances, effectively directing the neural networks to learn from their errors. Our strategy, validated on three diagnostic disease datasets, demonstrates robustness and generalization, highlighting the model's adaptability across varied disease-diagnosis settings.
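The weighting scheme the abstract describes — training each network on reweighted data, then giving more accurate networks a larger vote while boosting the weight of misclassified samples — can be sketched as a standard AdaBoost loop over small neural-network weak learners. This is a minimal illustration under stated assumptions, not the authors' implementation: the `TinyNet` architecture, hyperparameters, and the function names `abnet_fit`/`abnet_predict` are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

class TinyNet:
    """One-hidden-layer network trained with sample-weighted backpropagation.
    (Illustrative weak learner; architecture and hyperparameters are assumptions.)"""
    def __init__(self, n_in, n_hidden=8, lr=0.5, epochs=200):
        self.W1 = rng.normal(scale=0.5, size=(n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(scale=0.5, size=n_hidden)
        self.b2 = 0.0
        self.lr, self.epochs = lr, epochs

    def _forward(self, X):
        self.h = np.tanh(X @ self.W1 + self.b1)
        z = np.clip(self.h @ self.W2 + self.b2, -30, 30)  # avoid exp overflow
        return 1.0 / (1.0 + np.exp(-z))

    def fit(self, X, y01, w):
        # w: AdaBoost sample weights, folded into the cross-entropy gradient
        for _ in range(self.epochs):
            p = self._forward(X)
            g = w * (p - y01)                       # weighted output-layer error
            gh = np.outer(g, self.W2) * (1 - self.h**2)  # backprop through tanh
            self.W2 -= self.lr * (self.h.T @ g)
            self.b2 -= self.lr * g.sum()
            self.W1 -= self.lr * (X.T @ gh)
            self.b1 -= self.lr * gh.sum(axis=0)
        return self

    def predict(self, X):
        return np.where(self._forward(X) >= 0.5, 1, -1)

def abnet_fit(X, y, rounds=5):
    """AdaBoost loop with neural-network weak learners (labels y in {-1, +1})."""
    n = len(y)
    w = np.full(n, 1.0 / n)                         # uniform initial sample weights
    nets, alphas = [], []
    for _ in range(rounds):
        net = TinyNet(X.shape[1]).fit(X, (y + 1) / 2, w)
        pred = net.predict(X)
        err = np.clip(w[pred != y].sum(), 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)       # accurate nets get higher weight
        w *= np.exp(-alpha * y * pred)              # boost misclassified samples
        w /= w.sum()
        nets.append(net)
        alphas.append(alpha)
    return nets, alphas

def abnet_predict(nets, alphas, X):
    # Weighted majority vote across the ensemble
    scores = sum(a * net.predict(X) for net, a in zip(nets, alphas))
    return np.sign(scores)
```

The key coupling is in `abnet_fit`: each network's vote `alpha` grows with its weighted accuracy, and the weight update `w *= exp(-alpha * y * pred)` increases the influence of samples the current network got wrong, so later networks concentrate on the earlier networks' mistakes.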

Published

18-12-2024

Section

Articles

How to Cite

Ifra Altaf, & Manzoor Ahmed Chachoo. (2024). ABNet - Leveraging the AdaBoost to Boost Neural Network Performance. Journal of Soft Computing and Data Mining, 5(2), 264-273. https://penerbit.uthm.edu.my/ojs/index.php/jscdm/article/view/19085