CNN Hyperparameter Optimization for MNIST Dataset with Metaheuristic Algorithms

Authors

  • Gülistan Arslan, Kütahya Dumlupınar University
  • Hasan Temurtaş, Kütahya Dumlupınar University

DOI:

https://doi.org/10.30855/ais.2025.08.02.04

Keywords:

Hyperparameter Optimization, Convolutional Neural Networks, Metaheuristic Algorithms, MNIST Dataset

Abstract

In this study, we used nature-inspired metaheuristic algorithms for hyperparameter optimization, a key problem in deep learning. Specifically, the Grey Wolf Optimizer (GWO) and Harris Hawks Optimization (HHO) algorithms were comparatively evaluated on the MNIST dataset, which is widely used for handwritten digit classification. The main objective of the study was to achieve high classification accuracy while keeping the network structure as simple, and the computational cost as low, as possible. Critical hyperparameters such as the number of layers, number of neurons, learning rate, dropout rate, and batch size were optimized. The findings show that both algorithms achieve high accuracy rates, but HHO, with a test accuracy of 98.1%, surpasses GWO's performance of 97.94%. Importantly, HHO achieved this result with fewer layers, a lower epoch count, and minimal regularization. This demonstrates the advantage of HHO, especially under limited hardware resources and time constraints. In conclusion, our study highlights that GWO and HHO provide effective solutions for hyperparameter optimization; moreover, HHO stands out for its low computational cost and high generalization ability.
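As a rough illustration of the kind of search the abstract describes, the sketch below runs a generic population-based metaheuristic over the listed hyperparameter space (layers, neurons, learning rate, dropout, batch size). This is not the authors' GWO or HHO implementation: the search space values, the placeholder fitness function, and the leader-following update rule are all hypothetical stand-ins, and in the actual study the fitness would be the validation accuracy of a CNN trained on MNIST.

```python
import random

# Hypothetical discrete search space over the hyperparameters named in the
# abstract; the candidate values here are illustrative, not from the paper.
SPACE = {
    "num_layers":    [1, 2, 3, 4],
    "num_neurons":   [32, 64, 128, 256],
    "learning_rate": [1e-4, 1e-3, 1e-2],
    "dropout_rate":  [0.0, 0.25, 0.5],
    "batch_size":    [32, 64, 128],
}

def random_candidate():
    # One candidate = one full hyperparameter configuration.
    return {k: random.choice(v) for k, v in SPACE.items()}

def fitness(cand):
    # Placeholder objective (higher is better). In the study, this step would
    # train a CNN on MNIST with `cand` and return its validation accuracy.
    return -abs(cand["num_layers"] - 2) - abs(cand["dropout_rate"] - 0.25)

def search(pop_size=10, iterations=20, seed=0):
    random.seed(seed)
    pop = [random_candidate() for _ in range(pop_size)]
    best = max(pop, key=fitness)
    for _ in range(iterations):
        for i, cand in enumerate(pop):
            # Perturb one hyperparameter, either copying the current best
            # (a crude analogue of the leader-following steps in GWO/HHO)
            # or exploring a random value; keep the move only if it helps.
            new = dict(cand)
            key = random.choice(list(SPACE))
            new[key] = best[key] if random.random() < 0.5 else random.choice(SPACE[key])
            if fitness(new) > fitness(cand):
                pop[i] = new
        best = max(pop + [best], key=fitness)
    return best

if __name__ == "__main__":
    print(search())
```

Swapping the placeholder `fitness` for an actual train-and-evaluate routine is what makes such a loop expensive, which is why the abstract emphasizes configurations that stay accurate with fewer layers and epochs.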

Published

31.12.2025

How to Cite

Arslan, G., & Temurtaş, H. (2025). CNN Hyperparameter Optimization for MNIST Dataset with Metaheuristic Algorithms. Artificial Intelligence Studies, 8(2), 138–147. https://doi.org/10.30855/ais.2025.08.02.04