Performance Evaluation of Swish-Based Activation Functions for Multi-Layer Networks

Keywords: ANN, activation function, exp-swish AF, Swish AF, multi-layer perceptron


Artificial Neural Networks (ANNs) are computational modelling tools that are widely used across many disciplines to model complex real-world problems such as regression, classification, function approximation, identification, control, pattern recognition, and forecasting. The strength of ANNs stems from information-processing properties of biological systems, such as nonlinearity, fault tolerance, parallelism, and learning. The activation function (AF) is a crucial component of an ANN since it introduces nonlinearity into the network. One of the best-known AFs in the literature is the swish AF, which is commonly used in deep networks. In this study, in addition to the swish AF, we evaluate three AFs based on the swish AF, namely mish, e-swish, and exponential swish (exp-swish), for the multi-layer perceptron. To compare ANN models using these different AFs for multi-layer networks, we use four benchmark datasets from the University of California Irvine (UCI) Machine Learning Repository and find that the exp-swish AF yields the best performance.
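For reference, the standard swish-family AFs mentioned above can be sketched as below. The formulas for swish, mish, and e-swish follow their original definitions in the literature; the exp-swish AF is defined in the paper itself, so it is omitted here. The parameter defaults (`beta=1.0` for swish, `beta=1.25` for e-swish) are common choices, not values taken from this study.

```python
import math

def sigmoid(x):
    # Logistic sigmoid, the building block of swish and e-swish.
    return 1.0 / (1.0 + math.exp(-x))

def swish(x, beta=1.0):
    # Swish: x * sigmoid(beta * x); with beta=1 this is also known as SiLU.
    return x * sigmoid(beta * x)

def mish(x):
    # Mish: x * tanh(softplus(x)), where softplus(x) = ln(1 + exp(x)).
    return x * math.tanh(math.log1p(math.exp(x)))

def e_swish(x, beta=1.25):
    # E-swish: beta * x * sigmoid(x); beta=1.25 is a commonly used value.
    return beta * x * sigmoid(x)
```

All three are smooth and non-monotonic, which is the shared property motivating their use as alternatives to ReLU in multi-layer networks.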

How to Cite
KOÇAK, Y., & ŞİRAY, G. (2022). Performance Evaluation of Swish-Based Activation Functions for Multi-Layer Networks. Artificial Intelligence Studies, 5(1), 1-13.