An Ensemble Learning Approach to the Predictive Stability of Echo State Networks

Qiuyi Wu, Ernest Fokoue, Dhireesha Kudithipudi

Abstract


This paper demonstrates the inherent capacity of the ensemble learning approach to stabilize echo state network (ESN) learning machines, which are notoriously difficult to tune. ESNs belong to a class of recurrent neural networks made popular by their strong predictive performance on time series data. However, for any given task, building the optimal ESN is hard because the tuning process is unstable. In this paper, we harness this high predictive variance by building ensembles of lightly tuned ESNs and then combining those base learners, rather than undertaking the daunting task of selecting a single one. Throughout this paper, our examples consistently demonstrate substantial reductions in average prediction error when the aggregate learning machine is used in place of a single one.
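The aggregation the abstract describes can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: it assumes a bare-bones ESN (random reservoir with tanh units and a ridge-regression readout) and plain averaging of the base learners' predictions; the `ESN` class, its hyperparameter values, and the toy sine-wave task are all illustrative choices.

```python
import numpy as np

class ESN:
    """A deliberately simple ESN: random reservoir + ridge readout."""

    def __init__(self, reservoir_size=50, spectral_radius=0.9, ridge=1e-6, seed=0):
        rng = np.random.default_rng(seed)
        self.W_in = rng.uniform(-0.5, 0.5, reservoir_size)            # input weights
        W = rng.uniform(-0.5, 0.5, (reservoir_size, reservoir_size))  # reservoir weights
        W *= spectral_radius / max(abs(np.linalg.eigvals(W)))         # rescale spectral radius
        self.W, self.ridge, self.n = W, ridge, reservoir_size

    def _states(self, u):
        """Drive the reservoir with the input sequence and collect its states."""
        x, states = np.zeros(self.n), []
        for u_t in u:
            x = np.tanh(self.W_in * u_t + self.W @ x)
            states.append(x.copy())
        return np.array(states)

    def fit(self, u, y):
        X = self._states(u)
        # Ridge-regression readout: W_out = (X'X + lambda*I)^{-1} X'y
        self.W_out = np.linalg.solve(X.T @ X + self.ridge * np.eye(self.n), X.T @ y)
        return self

    def predict(self, u):
        return self._states(u) @ self.W_out

# Toy one-step-ahead forecasting task on a noisy sine wave.
t = np.linspace(0, 20 * np.pi, 2000)
series = np.sin(t) + 0.05 * np.random.default_rng(42).normal(size=t.size)
u_train, y_train = series[:1500], series[1:1501]
u_test, y_test = series[1500:-1], series[1501:]

# Ensemble of lightly tuned ESNs that differ only in their random reservoirs.
base_learners = [ESN(seed=s).fit(u_train, y_train) for s in range(10)]
preds = np.array([m.predict(u_test) for m in base_learners])
ensemble_pred = preds.mean(axis=0)                                    # aggregate by averaging

mse_single = np.mean((preds[0] - y_test) ** 2)
mse_ensemble = np.mean((ensemble_pred - y_test) ** 2)
print(f"single ESN MSE: {mse_single:.5f}")
print(f"ensemble MSE:   {mse_ensemble:.5f}")
```

Averaging over reservoirs smooths out the run-to-run variability that makes selecting a single "best" ESN unstable, which is the effect the paper exploits.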

Keywords


Time Series; Stochastic Process; Learning Machine; Echo State Network; Ensemble Learning; Prediction Error


References


R.J. Hyndman, Time Series Data Library, http://data.is/TSDLdemo, July 2010.

J.A. Aslam, R.A. Popa and R.L. Rivest, On estimating the size and confidence of a statistical audit, Proceedings of the USENIX Workshop on Accurate Electronic Voting Technology, p. 8, August 6, 2007, Boston, MA.

B. Bacic, Echo state network ensemble for human motion data temporal phasing: A case study on tennis forehands, in Neural Information Processing, Lecture Notes in Computer Science, pp. 11 – 18 (2016).

L. Breiman, Bagging Predictors, Technical report 421, September 1994, Department of Statistics, University of California Berkeley, CA.

L. Breiman, Bagging predictors, Machine Learning 24 (2) (1996), 123 – 140.

L. Breiman, M. Last and J. Rice, Random forests, Statistical Challenges in Astronomy 9 (2003), 243 – 254.

P.J. Brockwell and R.A. Davis, Model building and forecasting with ARIMA processes, in Time Series: Theory and Methods, Springer Series in Statistics, pp. 273 – 329 (1991).

L. Büsing, B. Schrauwen and R. Legenstein, Connectivity, dynamics, and memory in reservoir computing with binary and analog neurons, Neural Computation 22 (5) (2010), 1272 – 1311.

P. Buteneers, B. Schrauwen, D. Verstraeten and D. Stroobandt, Real-time epileptic seizure detection on intra-cranial rat data using reservoir computing, in Advances in Neuro-Information Processing, Lecture Notes in Computer Science, pp. 56 – 63 (2009).

M.J. Campbell and A.M. Walker, A survey of statistical work on the Mackenzie River series of annual Canadian lynx trappings for the years 1821 – 1934 and a new analysis, Journal of the Royal Statistical Society – Series A (General) 140 (4) (1977), 411.

M. Hlavac, Stargazer: Well-formatted regression and summary statistics tables, R Package Version 5.2.1, https://CRAN.R-project.org/package=stargazer (2018).

H. Jaeger, Echo state network, Scholarpedia 2 (9) (2007), 2330.

A. Jalalvand, G. Van Wallendael and R. Van de Walle, Real-time reservoir computing network-based systems for detection tasks on visual contents, 2015 7th International Conference on Computational Intelligence, Communication Systems and Networks (2015).

D. Kudithipudi, Q. Saleh, C. Merkel, J. Thesing and B. Wysocki, Design and analysis of a neuromemristive reservoir computing architecture for biosignal processing, Frontiers in Neuroscience 9 (2016), 502.

M. Lukoševičius, A practical guide to applying echo state networks, in Neural Networks: Tricks of the Trade, Lecture Notes in Computer Science, pp. 659 – 686 (2012).

Q. Ma, L. Shen, W. Chen, J. Wang, J. Wei and Z. Yu, Functional echo state network for time series classification, Information Sciences 373 (2016), 1 – 20.

P.L. Mcdermott and C.K. Wikle, An ensemble quadratic echo state network for non-linear spatiotemporal forecasting, Stat. 6 (1) (2017), 315 – 330.

A.A. Prater, Comparison of echo state network output layer classification methods on noisy data, 2017 International Joint Conference on Neural Networks (IJCNN) (2017).

N. Schaetti, M. Salomon and R. Couturier, Echo state networks-based reservoir computing for MNIST handwritten digits recognition, in: 2016 IEEE International Conference on Computational Science and Engineering (CSE) and IEEE International Conference on Embedded and Ubiquitous Computing (EUC) and 15th International Symposium on Distributed Computing and Applications for Business Engineering (DCABES), pp. 484 – 491 (2016).

J. Schmee, D.F. Andrews and A.M. Herzberg, Data: A collection of problems from many fields for the student and research worker, Technometrics 29 (1) (1987), 120.

I. Sutskever and G. Hinton, Temporal-kernel recurrent neural networks, Neural Networks 23 (2) (2010), 239 – 243.

J.W. Taylor, Short-term electricity demand forecasting using double seasonal exponential smoothing, Journal of the Operational Research Society 54 (8) (2003), 799 – 805.

H. Wang and X. Yan, Optimizing the echo state network with a binary particle swarm optimization algorithm, Knowledge-Based Systems 86 (2015), 182 – 193.

Q. Wu, E. Fokoue and D. Kudithipudi, On the statistical challenges of echo state networks and some potential remedies, arXiv:1802.07369 [stat.ML] (February 2018).




DOI: http://dx.doi.org/10.26713/jims.v10i1-2.827

eISSN 0975-5748; pISSN 0974-875X