Time Series Forecasting using Recurrent Neural Networks modified by Bayesian Inference in the Learning Process
"Typically, time series forecasting relies on models built directly from past observations of the same sequence. Such models implicitly assume that, during learning, an unlimited quantity of noiseless data and unbounded computational resources are available. In practice, one must deal with finite, noisy datasets, which leads to uncertainty about how appropriate the model is. For this reason, models based on Bayesian inference are preferable: probabilities are treated as a representation of a rational agent's subjective uncertainty, and approximate inference is performed by maximizing a lower bound on the marginal likelihood. A modified algorithm using long short-term memory (LSTM) recurrent neural networks for time series forecasting is presented. This approach was chosen so as to stay as close as possible to the original series in the sense of minimizing the associated Kullback-Leibler information criterion. A simulation study was conducted to evaluate and illustrate the results, comparing this approach with Bayesian-neural-network-based algorithms on artificial chaotic time series. © 2019 IEEE."
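The variational objective the abstract refers to (maximizing a lower bound on the marginal likelihood, whose gap is a Kullback-Leibler divergence) can be illustrated on a toy model. The sketch below is an illustrative assumption, not the paper's LSTM algorithm: the linear model, Gaussian variational posterior, and all function names are hypothetical, chosen only to show the ELBO = expected log-likelihood minus KL-to-prior structure.

```python
import math
import random

def kl_to_standard_normal(mu, sigma):
    """Closed-form KL( N(mu, sigma^2) || N(0, 1) )."""
    return 0.5 * (sigma ** 2 + mu ** 2 - 1.0) - math.log(sigma)

def elbo_estimate(xs, ys, mu, sigma, noise_std=0.1, n_samples=200, seed=0):
    """Monte Carlo estimate of the evidence lower bound (ELBO) for a toy
    model y = w * x + eps, with variational posterior q(w) = N(mu, sigma^2)
    and prior p(w) = N(0, 1).  Maximizing this bound over (mu, sigma) is
    the approximate-inference strategy described in the abstract."""
    rng = random.Random(seed)
    total_log_lik = 0.0
    for _ in range(n_samples):
        # Reparameterization trick: sample w from q(w) via a standard normal.
        w = mu + sigma * rng.gauss(0.0, 1.0)
        for x, y in zip(xs, ys):
            resid = y - w * x
            total_log_lik += (-0.5 * (resid / noise_std) ** 2
                              - math.log(noise_std * math.sqrt(2.0 * math.pi)))
    expected_log_lik = total_log_lik / n_samples
    # ELBO = E_q[log p(data | w)] - KL( q(w) || p(w) )
    return expected_log_lik - kl_to_standard_normal(mu, sigma)
```

As a sanity check, data generated with a true weight of 0.5 yields a higher ELBO for a variational mean near 0.5 than for one far from it, which is what drives the optimization of the bound.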
Bayesian networks ; Bayesian neural networks ; Bayesian approximation ; Computational resources ; Forecasting ; Inference engines ; Kullback-Leibler divergence ; Kullback-Leibler information ; Learning systems ; Marginal likelihood ; Recurrent neural networks ; Subjective uncertainty ; Time series ; Time series forecasting ;
- Articles