A parametric statistical framework is used to understand the effect of input representation on performance in nonlinear prediction of time series. In particular, considerations of input representation lead directly to choices between feedforward and recurrent neural networks. It is shown that feedforward networks are nonlinear autoregressive models and that recurrent networks can model a larger class of processes, including nonlinear autoregressive moving average models. For some processes, feedback allows recurrent networks to achieve better predictions than can be made with a feedforward network with a finite number of inputs. The results are confirmed on a problem in power system regional load forecasting.
Jerome T. Connor, R. Douglas Martin, Les Atlas