Please use this identifier to cite or link to this item: https://doi.org/10.1016/S0305-0483(01)00022-6
DC Field: Value
dc.title: Insights into neural-network forecasting of time series corresponding to ARMA(p,q) structures
dc.contributor.author: Hwarng, H.B.
dc.date.accessioned: 2013-10-10T04:39:20Z
dc.date.available: 2013-10-10T04:39:20Z
dc.date.issued: 2001
dc.identifier.citation: Hwarng, H.B. (2001). Insights into neural-network forecasting of time series corresponding to ARMA(p,q) structures. Omega, 29(3), 273-289. ScholarBank@NUS Repository. https://doi.org/10.1016/S0305-0483(01)00022-6
dc.identifier.issn: 0305-0483
dc.identifier.uri: http://scholarbank.nus.edu.sg/handle/10635/44983
dc.description.abstract: Motivated by the lack of evidence supporting the conjecture that the back-propagation neural network (BPNN) is a universal approximator and thus can perform at least comparably to linear models on linear data, this study is designed to answer two primary research questions, namely, "how does the BPNN perform with respect to various underlying ARMA(p,q) structures?" and "how does the level of noise in the training time series affect the BPNN's performance?" The goal is to better understand the modelling and forecasting ability of BPNNs on a special class of time series and to suggest proper training strategies to improve performance. Using the performance of Box-Jenkins models as a benchmark, it is concluded that BPNNs generally performed well and consistently for time series corresponding to ARMA(p,q) structures. The BPNN's ability to model and forecast is affected not by the number of parameters but by the magnitude of the coefficients of the underlying structure. Overall, BPNNs perform significantly better for most of the structures when a particular noise level is considered during network training. Therefore, a proper strategy is to train networks at a noise level consistent in magnitude with the time series' sample standard deviation. © 2001 Elsevier Science Ltd.
dc.description.uri: http://libproxy1.nus.edu.sg/login?url=http://dx.doi.org/10.1016/S0305-0483(01)00022-6
dc.source: Scopus
dc.subject: Back-propagation
dc.subject: Box-Jenkins
dc.subject: Experimental design
dc.subject: Forecasting
dc.subject: Neural networks
dc.subject: Simulation
dc.subject: Time series
dc.type: Article
dc.contributor.department: DECISION SCIENCES
dc.description.doi: 10.1016/S0305-0483(01)00022-6
dc.description.sourcetitle: Omega
dc.description.volume: 29
dc.description.issue: 3
dc.description.page: 273-289
dc.description.coden: OMEGA
dc.identifier.isiut: 000169206500005
Appears in Collections: Staff Publications
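The abstract's training strategy rests on simulated ARMA(p,q) series generated at a chosen noise level. A minimal sketch of such a simulator is below; it is not taken from the paper, and the coefficient values, function name, and seed are illustrative assumptions only.

```python
# Hypothetical sketch (not the paper's code): generate an ARMA(p, q)
# series at a chosen innovation noise level, the kind of data used to
# train and benchmark a forecasting network.
import numpy as np

def simulate_arma(phi, theta, n, noise_sd, burn_in=200, seed=0):
    """Draw n observations from
    x_t = sum_i phi[i]*x_{t-1-i} + e_t + sum_j theta[j]*e_{t-1-j},
    with e_t ~ N(0, noise_sd^2). burn_in samples are discarded so the
    retained series is close to the stationary distribution."""
    rng = np.random.default_rng(seed)
    p, q = len(phi), len(theta)
    total = n + burn_in
    e = rng.normal(0.0, noise_sd, total)   # innovations
    x = np.zeros(total)
    for t in range(total):
        ar = sum(phi[i] * x[t - 1 - i] for i in range(p) if t - 1 - i >= 0)
        ma = sum(theta[j] * e[t - 1 - j] for j in range(q) if t - 1 - j >= 0)
        x[t] = ar + e[t] + ma
    return x[burn_in:]

# Illustrative ARMA(1,1) with moderate coefficients; per the abstract,
# training noise would be matched to the series' sample standard deviation.
series = simulate_arma(phi=[0.6], theta=[0.3], n=500, noise_sd=1.0)
print(len(series), round(float(series.std()), 2))
```

The sample standard deviation of the generated series (here governed by both the noise level and the coefficient magnitudes) is the quantity the abstract suggests matching when choosing a training noise level.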

Files in This Item: There are no files associated with this item.


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.