Optimizing echo state network through a novel fisher maximization based stochastic gradient descent


Ozturk M. M., Cankaya İ. A., Ipekci D.

NEUROCOMPUTING, vol.415, pp.215-224, 2020 (Journal Indexed in SCI)

  • Publication Type: Article
  • Volume: 415
  • Publication Date: 2020
  • Doi Number: 10.1016/j.neucom.2020.07.034
  • Title of Journal: NEUROCOMPUTING
  • Page Numbers: pp.215-224
  • Keywords: Echo state network, Hyperparameter optimization, Stochastic gradient descent, OPTIMIZATION, INFORMATION, PROPERTY, SIGNALS

Abstract

Hyperparameter optimization is a challenging process that has the potential to improve machine learning algorithms. Because it imposes a considerable computational burden on machine learning tasks, few works have addressed tuning strategies for a specific algorithm. In this paper, an improved Stochastic Gradient Descent (SGD) method based on Fisher Maximization is developed for tuning the hyperparameters of an Echo State Network (ESN), which has a wide range of applications. The results of the method are then compared with those of traditional Gradient Descent and Grid Search. According to the obtained results: 1) the scale of the data sets greatly affects the reliability of hyperparameter optimization results; 2) feature selection is critical to the mean training error when hyperparameter optimization is applied to methods such as ESN; 3) SGD settles into a good local minimum when Fisher Maximization is used to find a good starting point. (C) 2020 Elsevier B.V. All rights reserved.
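The abstract only outlines the approach, so the following is a minimal Python sketch of the idea, assuming a toy ESN whose spectral radius and leaking rate are the tuned hyperparameters. The finite-difference gradients, the empirical-Fisher score (squared gradient norm), and all function names below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def esn_train_error(spectral_radius, leak_rate, X, y, n_reservoir=50, seed=0):
    """Toy ESN (assumed architecture): random reservoir, leaky-tanh update,
    ridge-regression readout. Returns mean-squared training error."""
    rng = np.random.default_rng(seed)
    W_in = rng.uniform(-0.5, 0.5, (n_reservoir, X.shape[1]))
    W = rng.uniform(-0.5, 0.5, (n_reservoir, n_reservoir))
    W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))  # rescale reservoir
    states = np.zeros((len(X), n_reservoir))
    h = np.zeros(n_reservoir)
    for t, x in enumerate(X):
        h = (1 - leak_rate) * h + leak_rate * np.tanh(W_in @ x + W @ h)
        states[t] = h
    # Ridge readout with a small regularizer
    W_out = np.linalg.solve(states.T @ states + 1e-6 * np.eye(n_reservoir),
                            states.T @ y)
    return float(np.mean((states @ W_out - y) ** 2))

def grad_fd(f, theta, eps=1e-3):
    """Central finite-difference gradient of the loss w.r.t. hyperparameters."""
    g = np.zeros_like(theta)
    for i in range(len(theta)):
        d = np.zeros_like(theta)
        d[i] = eps
        g[i] = (f(theta + d) - f(theta - d)) / (2 * eps)
    return g

def fisher_max_start(f, candidates):
    """Pick the candidate point with the largest empirical Fisher score
    ||grad||^2 (outer-product approximation of Fisher information)."""
    def score(theta):
        g = grad_fd(f, theta)
        return float(g @ g)
    return max(candidates, key=score)

def sgd_tune(f, theta0, lr=0.05, steps=50):
    """Plain SGD on (spectral_radius, leak_rate), started from theta0."""
    theta = theta0.copy()
    for _ in range(steps):
        theta -= lr * grad_fd(f, theta)
        theta = np.clip(theta, [0.05, 0.05], [1.5, 1.0])  # keep values valid
    return theta
```

A usage example under the same assumptions: score a small grid of candidate starting points, launch SGD from the Fisher-maximizing one, and read off the tuned hyperparameters.

```python
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))
y = np.sin(X.sum(axis=1))
f = lambda th: esn_train_error(th[0], th[1], X, y)
candidates = [np.array([r, a]) for r in (0.3, 0.7, 1.1) for a in (0.2, 0.5, 0.9)]
theta = sgd_tune(f, fisher_max_start(f, candidates))
print("spectral radius, leak rate:", theta)
```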