Nonlinear Time Series (nonlinear + time_series)

Selected Abstracts


Measuring Conditional Persistence in Nonlinear Time Series

OXFORD BULLETIN OF ECONOMICS & STATISTICS, Issue 3 2007
George Kapetanios
Abstract The persistence properties of economic time series have been a primary object of investigation in a variety of guises since the early days of econometrics. Recently, work on nonlinear modelling for time series has introduced the idea that persistence of a shock at a point in time may vary depending on the state of the process at that point in time. This article suggests investigating the persistence of processes conditioning on their history as a tool that may aid parametric nonlinear modelling. In particular, we suggest that examining the nonparametrically estimated derivatives of the conditional expectation of a variable with respect to its lag(s) may be a useful indicator of the variation in persistence with respect to its past history. We discuss in detail the implementation of the measure and present a Monte Carlo investigation. We further apply the persistence analysis to real exchange rates. [source]


Near-Term Travel Speed Prediction Utilizing Hilbert–Huang Transform

COMPUTER-AIDED CIVIL AND INFRASTRUCTURE ENGINEERING, Issue 8 2009
Khaled Hamad
In this study, we propose an innovative methodology for such prediction. Because travel time is derived directly from speed data, the study was limited to speed as the single predictor. The proposed method is a hybrid one that combines the use of the empirical mode decomposition (EMD) and a multilayer feedforward neural network with backpropagation. The EMD is the key part of the Hilbert–Huang transform, a method recently developed at NASA for the analysis of nonstationary, nonlinear time series. The rationale for using the EMD is that, given the highly nonlinear and nonstationary nature of link speed series, decomposing the time series into its basic components should yield more accurate forecasts. We demonstrated the effectiveness of the proposed method by applying it to real-life loop detector data obtained from I-66 in Fairfax, Virginia. The prediction performance of the proposed method was found to be superior to previous forecasting techniques. Rigorous testing of the distribution of prediction errors revealed that the model produced unbiased predictions of speeds. The superiority of the proposed model was also verified during peak periods, midday, and night. In general, the method was accurate, computationally efficient, easy to implement in a field environment, and applicable to forecasting other traffic parameters. [source]
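The heart of the EMD is the sifting step: subtract the mean of the cubic-spline envelopes through the local maxima and minima, and repeat until an intrinsic mode function (IMF) emerges. A minimal single-pass sketch (not the authors' implementation, and without the stopping criteria a full EMD needs) might look like this:

```python
import numpy as np
from scipy.interpolate import CubicSpline
from scipy.signal import argrelextrema

def sift_once(x):
    """One sifting pass of empirical mode decomposition: subtract the mean
    of the upper and lower cubic-spline envelopes from the signal."""
    t = np.arange(len(x))
    maxima = argrelextrema(x, np.greater)[0]
    minima = argrelextrema(x, np.less)[0]
    upper = CubicSpline(maxima, x[maxima])(t)   # envelope through the maxima
    lower = CubicSpline(minima, x[minima])(t)   # envelope through the minima
    return x - (upper + lower) / 2.0

# toy nonstationary signal: a fast oscillation riding on a slow one
t = np.linspace(0.0, 1.0, 400)
fast = np.sin(2 * np.pi * 25 * t)
x = fast + 0.5 * np.sin(2 * np.pi * 2 * t)
h = sift_once(x)   # approximates the fast component (a candidate IMF)
```

Repeating the pass on `h` until it satisfies the IMF conditions, then restarting on the residue `x - h`, yields the full decomposition whose components feed the neural network stage.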


Evolving modular networks with genetic algorithms: application to nonlinear time series

EXPERT SYSTEMS, Issue 4 2004
A.S. Cofiño
Abstract: A key problem of modular neural networks is finding the optimal aggregation of the different subtasks (or modules) of the problem at hand. Functional networks provide a partial solution to this problem, since the inter-module topology is obtained from domain knowledge (functional relationships and symmetries). However, the learning process may be too restrictive in some situations, since the resulting modules (functional units) are assumed to be linear combinations of selected families of functions. In this paper, we present a non-parametric learning approach for functional networks using feedforward neural networks for approximating the functional modules of the resulting architecture; we also introduce a genetic algorithm for finding the optimal intra-module topology (the appropriate balance of neurons for the different modules according to the complexity of their respective tasks). Some benchmark examples from nonlinear time-series prediction are used to illustrate the performance of the algorithm for finding optimal modular network architectures for specific problems. [source]
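The genetic search over intra-module topologies — how many neurons to give each module — can be sketched as a GA over integer allocation vectors with a fixed total. The sketch below is schematic: in the paper the fitness would be a trained network's validation error, whereas here a stand-in quadratic fitness (with an assumed target split) keeps the example self-contained.

```python
import random

def evolve_allocation(n_modules, total, fitness, generations=60, pop_size=30):
    """Toy GA over integer vectors (neurons per module) summing to `total`;
    maximizes the supplied fitness function."""
    def random_ind():   # random composition of `total` into n_modules parts >= 1
        cuts = sorted(random.sample(range(1, total), n_modules - 1))
        return [b - a for a, b in zip([0] + cuts, cuts + [total])]

    def mutate(ind):    # move one neuron from one module to another
        ind = ind[:]
        i, j = random.sample(range(n_modules), 2)
        if ind[i] > 1:
            ind[i] -= 1
            ind[j] += 1
        return ind

    pop = [random_ind() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]          # truncation selection
        pop = survivors + [mutate(random.choice(survivors)) for _ in survivors]
    return max(pop, key=fitness)

# illustrative fitness: pretend module complexities call for a 5:10:15 split
random.seed(1)
target = [5, 10, 15]
fit = lambda a: -sum((x - t) ** 2 for x, t in zip(a, target))
best = evolve_allocation(3, 30, fit)
```

The mutation operator preserves the total neuron budget, so the search stays on the constraint surface; crossover is omitted here for brevity.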


A forecasting procedure for nonlinear autoregressive time series models

JOURNAL OF FORECASTING, Issue 5 2005
Yuzhi Cai. Article first published online: 2 AUG 200
Abstract Forecasting for nonlinear time series is an important topic in time series analysis. Existing numerical algorithms for multi-step-ahead forecasting ignore accuracy checking, while alternative Monte Carlo methods are computationally very demanding and their accuracy is also difficult to control. In this paper a numerical forecasting procedure for nonlinear autoregressive time series models is proposed. The forecasting procedure can be used to obtain approximate m-step-ahead predictive probability density functions, predictive distribution functions, predictive mean and variance, etc. for a range of nonlinear autoregressive time series models. Examples in the paper show that the forecasting procedure works very well, both in the accuracy of its results and in its ability to deal with different nonlinear autoregressive time series models. Copyright © 2005 John Wiley & Sons, Ltd. [source]
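A numerical m-step-ahead predictive density of the kind the abstract describes can be sketched via the Chapman–Kolmogorov recursion on a grid: propagate the one-step transition density forward m-1 times by numerical integration. This is a generic sketch under assumed model ingredients (a nonlinear AR(1) with Gaussian noise and a SETAR-type mean function chosen here for illustration), not the paper's specific procedure.

```python
import numpy as np

def predictive_density(g, sigma, y0, m, grid):
    """Approximate the m-step-ahead predictive density of the nonlinear AR(1)
    model y_t = g(y_{t-1}) + N(0, sigma^2) on a fixed grid, using the
    Chapman-Kolmogorov recursion with simple Riemann sums."""
    dy = grid[1] - grid[0]
    def trans(y_next, y_prev):                    # Gaussian transition density
        z = (y_next - g(y_prev)) / sigma
        return np.exp(-0.5 * z ** 2) / (sigma * np.sqrt(2 * np.pi))
    dens = trans(grid, y0)                        # 1-step density is exact
    K = trans(grid[:, None], grid[None, :])       # K[i, j] = p(grid[i] | grid[j])
    for _ in range(m - 1):
        dens = (K * dens[None, :]).sum(axis=1) * dy
    return dens

# SETAR-type mean function: persistence differs above and below the threshold 0
g = lambda y: np.where(y > 0, 0.3 * y, 0.8 * y)
grid = np.linspace(-4.0, 4.0, 401)
dens3 = predictive_density(g, sigma=0.5, y0=1.0, m=3, grid=grid)
mean3 = (grid * dens3).sum() * (grid[1] - grid[0])   # predictive mean from the density
```

Once the density is tabulated, the predictive distribution function, mean, and variance all follow by further numerical integration, which is what makes the density-based route attractive.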


A Dependence Metric for Possibly Nonlinear Processes

JOURNAL OF TIME SERIES ANALYSIS, Issue 5 2004
C. W. Granger
Abstract. A transformed metric entropy measure of dependence is studied which satisfies many desirable properties, including being a proper measure of distance. It is capable of good performance in identifying dependence even in possibly nonlinear time series, and is applicable for both continuous and discrete variables. A nonparametric kernel density implementation is considered here for many stylized models including linear and nonlinear MA, AR, GARCH, integrated series and chaotic dynamics. A related permutation test of independence is proposed and compared with several alternatives. [source]
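The measure in question is, up to transformation, half the squared Hellinger distance between the joint density and the product of the marginals, which is zero exactly under independence. A crude histogram-based sketch (the paper uses kernel density estimates; the bin count and sample sizes here are arbitrary choices) and the accompanying permutation test might look like this:

```python
import numpy as np

def hellinger_dependence(x, y, bins=10):
    """Histogram version of the metric entropy dependence measure: half the
    squared Hellinger distance between the joint cell probabilities and the
    product of the marginals. Zero (in population) iff x and y are independent."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)           # marginal of x
    py = pxy.sum(axis=0, keepdims=True)           # marginal of y
    return 0.5 * np.sum((np.sqrt(pxy) - np.sqrt(px * py)) ** 2)

rng = np.random.default_rng(42)
x = rng.standard_normal(2000)
indep = hellinger_dependence(x, rng.standard_normal(2000))      # near zero
y = x ** 2 + 0.1 * rng.standard_normal(2000)                    # nonlinear link
dep = hellinger_dependence(x, y)                                # clearly larger

# permutation test: shuffling y destroys any dependence on x
obs = hellinger_dependence(x, y)
null = [hellinger_dependence(x, rng.permutation(y)) for _ in range(200)]
p_value = np.mean([s >= obs for s in null])
```

Note that y = x² is uncorrelated with x for symmetric x, so a linear correlation test would miss this dependence entirely; the Hellinger-type statistic picks it up, which is the point of the measure.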


Using Image and Curve Registration for Measuring the Goodness of Fit of Spatial and Temporal Predictions

BIOMETRICS, Issue 4 2004
Cavan Reilly
Summary Conventional measures of model fit for indexed data (e.g., time series or spatial data) summarize errors in y, for instance by integrating (or summing) the squared difference between predicted and measured values over a range of x. We propose an approach which recognizes that errors can occur in the x-direction as well. Instead of just measuring the difference between the predictions and observations at each site (or time), we first "deform" the predictions, stretching or compressing along the x-direction or directions, so as to improve the agreement between the observations and the deformed predictions. Error is then summarized by (a) the amount of deformation in x, and (b) the remaining difference in y between the data and the deformed predictions (i.e., the residual error in y after the deformation). A parameter, λ, controls the tradeoff between (a) and (b), so that as λ → ∞ no deformation is allowed, whereas for λ = 0 the deformation minimizes the errors in y. In some applications, the deformation itself is of interest because it characterizes the (temporal or spatial) structure of the errors. The optimal deformation can be computed by solving a system of nonlinear partial differential equations, or, for a unidimensional index, by using a dynamic programming algorithm. We illustrate the procedure with examples from nonlinear time series and fluid dynamics. [source]
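For a unidimensional index, the dynamic programming route is closely related to dynamic time warping with a penalty on warping moves. The sketch below is a simplification in that spirit, not the authors' algorithm: the move set, the squared-error cost, and the way λ penalizes off-diagonal (stretch/compress) steps are all choices made here for illustration.

```python
import numpy as np

def register(obs, pred, lam):
    """Dynamic-programming curve registration: align pred to obs along the
    x (index) direction, trading residual y-error against deformation.
    Off-diagonal moves stretch or compress x and each costs lam, so
    lam -> infinity forbids warping and lam = 0 makes it free."""
    n, m = len(obs), len(pred)
    INF = float("inf")
    cost = np.full((n, m), INF)
    cost[0, 0] = (obs[0] - pred[0]) ** 2
    for i in range(n):
        for j in range(m):
            if i == 0 and j == 0:
                continue
            best = INF
            for di, dj in ((1, 0), (0, 1), (1, 1)):
                pi, pj = i - di, j - dj
                if pi >= 0 and pj >= 0:
                    warp = lam if di != dj else 0.0   # penalize x-deformation
                    best = min(best, cost[pi, pj] + warp)
            cost[i, j] = best + (obs[i] - pred[j]) ** 2
    return cost[-1, -1]

t = np.linspace(0.0, 2.0 * np.pi, 60)
obs = np.sin(t)
pred = np.sin(t - 0.4)     # right shape, but shifted in the x-direction
small_lam = register(obs, pred, lam=0.01)   # warping absorbs the phase error
large_lam = register(obs, pred, lam=1e6)    # reduces to plain squared y-error
```

With a small λ the deformation soaks up the phase shift, leaving little residual y-error; with a huge λ the alignment is forced onto the diagonal and the cost reverts to the conventional sum of squared differences, mirroring the two limits described in the abstract.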