CNN Hyperparameter Optimization for MNIST Dataset with Metaheuristic Algorithms
DOI: https://doi.org/10.30855/ais.2025.08.02.04

Keywords: Hyperparameter Optimization, Convolutional Neural Networks, Metaheuristic Algorithms, MNIST Dataset

Abstract
In this study, we used nature-inspired metaheuristic algorithms for hyperparameter optimization, a key problem in deep learning. Specifically, the Gray Wolf Optimization (GWO) and Harris Hawk Optimization (HHO) algorithms were comparatively evaluated on the MNIST dataset, which is widely used for handwritten digit classification. The main objective of the study was to achieve high classification accuracy while keeping the network structure as simple and the computational cost as low as possible. Critical hyperparameters such as the number of layers, number of neurons, learning rate, dropout rate, and batch size were optimized. The findings show that both algorithms achieve high accuracy rates, but HHO, with a test accuracy of 98.1%, surpasses GWO's 97.94%. Importantly, HHO achieved this result with fewer layers, a lower epoch count, and minimal regularization. This demonstrates the advantage of HHO, especially under limited hardware resources and time constraints. In conclusion, the study highlights that the GWO and HHO algorithms provide effective solutions for hyperparameter optimization; moreover, HHO stands out for its low computational cost and high generalization ability.
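To illustrate how such a metaheuristic search is typically wired up, the sketch below shows a candidate-decoding and fitness-evaluation routine of the kind GWO or HHO would call when updating its population. This is a minimal sketch assuming a TensorFlow/Keras backend; the search-space bounds, the decode() helper, and the two-epoch training budget are illustrative assumptions, not the exact configuration reported in the study.

# Hypothetical fitness evaluation for metaheuristic CNN hyperparameter search on MNIST.
# ASSUMPTION: bounds, layer widths, and the short training budget are illustrative only.
import numpy as np
import tensorflow as tf

(x_train, y_train), _ = tf.keras.datasets.mnist.load_data()
x_train = x_train[..., None].astype("float32") / 255.0  # shape (60000, 28, 28, 1)

def decode(position):
    """Map a continuous search-space position to concrete hyperparameters."""
    n_layers   = int(round(np.clip(position[0], 1, 3)))      # number of conv blocks
    n_neurons  = int(round(np.clip(position[1], 32, 256)))   # dense-layer units
    lr         = float(np.clip(position[2], 1e-4, 1e-2))     # learning rate
    dropout    = float(np.clip(position[3], 0.0, 0.5))       # dropout rate
    batch_size = int(round(np.clip(position[4], 32, 256)))   # mini-batch size
    return n_layers, n_neurons, lr, dropout, batch_size

def fitness(position, epochs=2):
    """Return validation error (lower is better) for one candidate hyperparameter set."""
    n_layers, n_neurons, lr, dropout, batch_size = decode(position)
    model = tf.keras.Sequential()
    for _ in range(n_layers):
        model.add(tf.keras.layers.Conv2D(32, 3, padding="same", activation="relu"))
        model.add(tf.keras.layers.MaxPooling2D())
    model.add(tf.keras.layers.Flatten())
    model.add(tf.keras.layers.Dense(n_neurons, activation="relu"))
    model.add(tf.keras.layers.Dropout(dropout))
    model.add(tf.keras.layers.Dense(10, activation="softmax"))
    model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=lr),
                  loss="sparse_categorical_crossentropy", metrics=["accuracy"])
    hist = model.fit(x_train, y_train, validation_split=0.1,
                     epochs=epochs, batch_size=batch_size, verbose=0)
    return 1.0 - hist.history["val_accuracy"][-1]

A GWO or HHO driver would repeatedly perturb the position vectors within the stated bounds and rank candidates by this fitness value; the best position found is then decoded into the final CNN configuration and retrained in full.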