Evaluation of Hyperparameter Optimization Techniques in Deep Learning considering Accuracy, Runtime and Computational Efficiency Metrics
Keywords:
Hyperparameter Optimization, Deep Learning, Approximation Algorithms, FNN, MNIST Dataset

Abstract
Hyperparameter optimization is one of the most important steps in training deep learning models, since key performance metrics such as accuracy, generalizability, and computational efficiency depend directly on the chosen hyperparameters. This work evaluates five hyperparameter optimization techniques: Grid Search, Random Search, Genetic Algorithm, Particle Swarm Optimization, and Simulated Annealing, applied to a feedforward neural network (FFNN) trained on the MNIST dataset. Two main training configurations of 20 and 50 epochs are considered, with a focus on three key metrics: accuracy, runtime, and computational efficiency. Results show that approximation algorithms such as the Genetic Algorithm and Simulated Annealing achieve a favorable trade-off between accuracy and runtime, making them considerably more practical computationally than classical methods such as Grid Search and Random Search. For example, the Genetic Algorithm reached the highest accuracy of 98.60% at 50 epochs, while Simulated Annealing recorded the fastest run at 357.52 seconds. These results demonstrate the flexibility and efficiency of approximation algorithms when searching high-dimensional hyperparameter spaces under scarce resources. The work also presents a trade-off analysis between exhaustive classical techniques and adaptive approximation techniques. The modular Python implementation provides a basic structure that can be extended to more complex datasets and architectures. By bridging computational efficiency with practical efficacy, this work offers actionable guidance for both practitioners and researchers in deep learning on choosing hyperparameter optimization methods appropriate to their constraints and objectives.
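One of the approximation techniques studied, Simulated Annealing, can be sketched in a few lines. The sketch below is illustrative only: the search space, grid values, and the `surrogate_accuracy` function are assumptions standing in for the expensive step of training the FFNN on MNIST and returning its validation accuracy; they are not taken from the paper's implementation.

```python
import math
import random

# Hypothetical search space, loosely mirroring an FFNN setup.
SPACE = {
    "lr": [1e-4, 1e-3, 1e-2, 1e-1],
    "hidden": [32, 64, 128, 256],
}

def surrogate_accuracy(cfg):
    """Toy stand-in for 'train FFNN on MNIST, return validation
    accuracy'; peaks at lr=1e-3, hidden=128 purely for illustration."""
    lr_penalty = abs(math.log10(cfg["lr"]) + 3)      # 0 at lr = 1e-3
    hid_penalty = abs(math.log2(cfg["hidden"]) - 7)  # 0 at hidden = 128
    return 0.99 - 0.02 * lr_penalty - 0.01 * hid_penalty

def neighbour(cfg):
    """Perturb one hyperparameter to an adjacent grid value."""
    key = random.choice(list(SPACE))
    vals = SPACE[key]
    i = vals.index(cfg[key])
    j = min(max(i + random.choice([-1, 1]), 0), len(vals) - 1)
    new = dict(cfg)
    new[key] = vals[j]
    return new

def simulated_annealing(steps=200, t0=1.0, cooling=0.97, seed=0):
    random.seed(seed)
    cfg = {k: random.choice(v) for k, v in SPACE.items()}
    acc = surrogate_accuracy(cfg)
    best, best_acc, t = cfg, acc, t0
    for _ in range(steps):
        cand = neighbour(cfg)
        cand_acc = surrogate_accuracy(cand)
        # Always accept improvements; accept worse configs with
        # probability exp(delta / T), which shrinks as T cools.
        if cand_acc > acc or random.random() < math.exp((cand_acc - acc) / t):
            cfg, acc = cand, cand_acc
        if acc > best_acc:
            best, best_acc = cfg, acc
        t *= cooling
    return best, best_acc
```

The same loop applies to the real objective by replacing `surrogate_accuracy` with a function that trains the network and returns its validation accuracy; the annealing schedule (`t0`, `cooling`, `steps`) then controls the accuracy/runtime trade-off the paper measures.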
License
Copyright (c) 2025 Journal of Soft Computing and Data Mining

This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.