Optimizing echo state network through a novel fisher maximization based stochastic gradient descent

Ozturk M. M., Cankaya İ. A., Ipekci D.

NEUROCOMPUTING, vol.415, pp.215-224, 2020 (SCI-Expanded)

  • Publication Type: Article
  • Volume: 415
  • Publication Date: 2020
  • Doi Number: 10.1016/j.neucom.2020.07.034
  • Journal Name: NEUROCOMPUTING
  • Journal Indexes: Science Citation Index Expanded (SCI-EXPANDED), Scopus, Academic Search Premier, PASCAL, Applied Science & Technology Source, Biotechnology Research Abstracts, Compendex, Computer & Applied Sciences, EMBASE, INSPEC, zbMATH
  • Page Numbers: pp.215-224
  • Keywords: Echo state network, Hyperparameter optimization, Stochastic gradient descent, OPTIMIZATION, INFORMATION, PROPERTY, SIGNALS
  • Süleyman Demirel University Affiliated: Yes


Hyperparameter optimization is a challenging process with the potential to improve machine learning algorithms. Because it imposes a considerable computational burden on machine learning tasks, few works have addressed tuning strategies for a specific algorithm. In this paper, an improved Stochastic Gradient Descent (SGD) based on Fisher Maximization is developed for tuning the hyperparameters of an Echo State Network (ESN), which has a wide range of applications. The results of the method are then compared with those of traditional Gradient Descent and Grid Search. According to the obtained results: 1) the scale of the data sets greatly affects the reliability of hyperparameter optimization results; 2) feature selection is critical to the mean training error when hyperparameter optimization is applied to methods such as ESN; 3) SGD converges to a good local minimum when Fisher Maximization is used to find a good starting point. (C) 2020 Elsevier B.V. All rights reserved.
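To make the setting concrete, the sketch below shows a minimal ESN of the kind whose hyperparameters (reservoir size, spectral radius, input scaling, ridge penalty) the abstract refers to. This is an illustrative implementation only, assuming a standard tanh reservoir with a ridge-regression readout; it does not reproduce the paper's Fisher-maximization-based SGD, and all function and parameter names here are hypothetical.

```python
import numpy as np

def esn_fit_predict(u_train, y_train, u_test,
                    n_reservoir=100, spectral_radius=0.9,
                    input_scale=0.5, ridge=1e-6, seed=0):
    """Minimal Echo State Network: fixed random reservoir, trained linear readout.

    n_reservoir, spectral_radius, input_scale, and ridge are the kind of
    hyperparameters the paper tunes (illustrative sketch, not the paper's method).
    """
    rng = np.random.default_rng(seed)
    W_in = input_scale * (2 * rng.random((n_reservoir,)) - 1)   # input weights
    W = 2 * rng.random((n_reservoir, n_reservoir)) - 1          # reservoir weights
    # Rescale so the largest eigenvalue modulus equals spectral_radius
    # (a common sufficient condition aimed at the echo state property).
    W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))

    def run(u):
        # Drive the reservoir with the input sequence and collect states.
        x = np.zeros(n_reservoir)
        states = np.empty((len(u), n_reservoir))
        for t, u_t in enumerate(u):
            x = np.tanh(W_in * u_t + W @ x)
            states[t] = x
        return states

    X = run(u_train)
    # Ridge-regression readout: solve (X^T X + ridge*I) w = X^T y.
    w_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_reservoir), X.T @ y_train)
    return run(u_test) @ w_out

# Toy task: one-step-ahead prediction of a sine wave.
t = np.linspace(0, 20 * np.pi, 2000)
u, y = np.sin(t[:-1]), np.sin(t[1:])
pred = esn_fit_predict(u[:1500], y[:1500], u[1500:])
# Discard a washout of 100 steps before scoring (test run starts from a zero state).
mse = np.mean((pred[100:] - y[1600:]) ** 2)
```

A grid search over `spectral_radius` and `input_scale` (the baseline the paper compares against) would simply call `esn_fit_predict` over a Cartesian product of candidate values and keep the setting with the lowest validation error.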