Gradient Descent Analysis with Activation Function Combinations in the ANN Algorithm for Finding the Best Accuracy

Abstract

There are many gradient-descent training functions and activation (transfer) functions that can be used in artificial neural network (ANN) algorithms, particularly the backpropagation algorithm. The aim of this paper is therefore to identify the best gradient-descent variant to serve as a reference for the ANN backpropagation algorithm in data prediction, classification, and pattern recognition problems. The gradient-descent methods analyzed are gradient descent backpropagation (traingd), gradient descent with momentum backpropagation (traingdm), gradient descent with adaptive learning rate backpropagation (traingda), and gradient descent with momentum and adaptive learning rate backpropagation (traingdx). Each training function is combined with the bipolar sigmoid (tansig), linear transfer (purelin), and binary sigmoid (logsig) activation functions. The sample data used for the analysis is time-series data of the Human Development Index in Indonesia, obtained from the Central Bureau of Statistics (BPS). The architectural models used for the gradient-descent analysis are 6-10-15-1, 6-15-20-1, 6-20-25-1, and 6-25-30-1. Based on the analysis results, the best training function is traingda with the 6-15-20-1 architectural model, which produces an accuracy rate of 91% and a testing MSE of 0.000731529, lower than the other methods.
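To illustrate the ingredients compared in the abstract, the following is a minimal sketch, not MATLAB's actual implementation: it defines the three transfer functions named above (tansig, logsig, purelin) and a scalar version of the traingdx-style update rule, i.e. gradient descent with momentum plus a learning rate that grows on improving steps and shrinks when a step degrades performance too much. The constants (`lr_inc`, `lr_dec`, `max_perf_inc`, `mc`) and the toy objective are illustrative assumptions.

```python
import math

# Transfer functions as used in the paper (standard definitions):
def tansig(x):   # bipolar sigmoid: maps to (-1, 1)
    return math.tanh(x)

def logsig(x):   # binary sigmoid: maps to (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

def purelin(x):  # linear transfer: identity
    return x

def traingdx_sketch(grad, loss, w0, lr=0.01, mc=0.9,
                    lr_inc=1.05, lr_dec=0.7, max_perf_inc=1.04,
                    epochs=500):
    """Gradient descent with momentum and an adaptive learning rate,
    in the spirit of traingdx (one scalar weight for clarity).
    All parameter values here are illustrative assumptions."""
    w, v = w0, 0.0
    prev = loss(w)
    for _ in range(epochs):
        v = mc * v - lr * grad(w)        # momentum-smoothed step
        cand = w + v
        perf = loss(cand)
        if perf > prev * max_perf_inc:   # step hurt performance too much:
            lr *= lr_dec                 # shrink the rate and reject it
            v = 0.0
        else:
            if perf < prev:              # improvement: grow the rate
                lr *= lr_inc
            w, prev = cand, perf
    return w

# Toy objective: minimise (w - 3)^2 starting from w = 0.
w_star = traingdx_sketch(grad=lambda w: 2.0 * (w - 3.0),
                         loss=lambda w: (w - 3.0) ** 2,
                         w0=0.0)
```

Dropping the momentum term (`mc = 0`) recovers a traingda-style rule, while fixing `lr` and keeping the momentum term corresponds to traingdm; this is why the four training functions form a natural comparison grid.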