Stochastic neural networks with applications to nonlinear time series
Document type:
Article
Authors:
Lai, TL; Wong, SPS
Affiliations:
Stanford University; Hong Kong University of Science & Technology
Journal:
JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION
ISSN/ISBN:
0162-1459
DOI:
10.1198/016214501753208636
Publication year:
2001
Pages:
968-981
Keywords:
Monte Carlo methods
mixtures
Abstract:
We consider a variant of the conventional neural network model, called the stochastic neural network, that can be used to approximate complex nonlinear stochastic systems. We show how the expectation-maximization (EM) algorithm can be used to develop efficient estimation schemes that have much lower computational complexity than those for conventional neural networks. This enables us to carry out model selection procedures, such as the Bayesian information criterion, to choose the number of hidden units and the input variables for each hidden unit. Stochastic neural networks are shown to have the universal approximation property of neural networks. Other important properties of the proposed model are given, and model-based multistep-ahead forecasts are provided. We fit stochastic neural network models to several real and simulated time series. Results show that the fitted models improve post-sample forecasts over conventional neural networks and other nonlinear and nonparametric models.
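To illustrate the EM-plus-BIC workflow the abstract describes, the sketch below fits a mixture of linear autoregressions to a simulated nonlinear time series and selects the number of components by BIC. This is not the authors' stochastic neural network or their estimation scheme; it is a simplified mixture-model stand-in (the `em_mixture_ar` and `bic` helpers, the simulated tanh-AR(1) series, and the parameter count are all illustrative assumptions), intended only to show how EM estimation makes likelihood-based model selection over "hidden unit" counts feasible.

```python
# A minimal sketch (not the paper's implementation): EM for a mixture of
# linear autoregressions, with BIC used to pick the number of components.
import numpy as np

def em_mixture_ar(y, X, K, n_iter=200, seed=0):
    """Fit a K-component mixture of linear regressions of y on X via EM."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    pi = np.full(K, 1.0 / K)                      # mixing weights
    beta = rng.normal(scale=0.1, size=(K, d))     # per-component coefficients
    sigma2 = np.full(K, np.var(y))                # per-component noise variances
    loglik = -np.inf
    for _ in range(n_iter):
        # E-step: responsibilities of each component for each observation.
        resid = y[:, None] - X @ beta.T                                  # (n, K)
        logphi = -0.5 * (np.log(2 * np.pi * sigma2) + resid**2 / sigma2)
        logw = np.log(pi) + logphi
        m = logw.max(axis=1, keepdims=True)
        log_mix = m[:, 0] + np.log(np.exp(logw - m).sum(axis=1))        # log-sum-exp
        r = np.exp(logw - log_mix[:, None])                             # (n, K)
        # M-step: weighted least squares and weighted variances per component.
        pi = r.mean(axis=0)
        for k in range(K):
            w = r[:, k]
            XtWX = X.T @ (w[:, None] * X)
            beta[k] = np.linalg.solve(XtWX + 1e-8 * np.eye(d), X.T @ (w * y))
            sigma2[k] = (w * (y - X @ beta[k])**2).sum() / max(w.sum(), 1e-12)
        new_loglik = log_mix.sum()
        if new_loglik - loglik < 1e-8:
            break
        loglik = new_loglik
    return pi, beta, sigma2, loglik

def bic(loglik, n_params, n_obs):
    """Bayesian information criterion: smaller is better."""
    return -2.0 * loglik + n_params * np.log(n_obs)

# Usage: simulate a nonlinear AR(1) series, build lagged regressors,
# and choose the number of components K by BIC.
rng = np.random.default_rng(1)
T = 500
y = np.zeros(T)
for t in range(1, T):
    y[t] = 0.8 * np.tanh(2 * y[t - 1]) + 0.3 * rng.normal()
Y, X = y[1:], np.column_stack([np.ones(T - 1), y[:-1]])
for K in (1, 2, 3):
    _, _, _, ll = em_mixture_ar(Y, X, K)
    n_params = K * (X.shape[1] + 2) - 1   # coefficients + variances + free weights
    print(K, round(bic(ll, n_params, len(Y)), 1))
```

Because the latent component labels play a role loosely analogous to stochastic hidden units, the complete-data likelihood factorizes and each EM iteration reduces to weighted least squares, which is what makes an exhaustive BIC search over model sizes computationally cheap in this simplified setting.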