Forecasting
Selected Abstracts

FORECASTING AND MANAGEMENT OF MIGRATORY PESTS IN AUSTRALIA
INSECT SCIENCE, Issue 4 2002, David Hunter
Abstract: The Decision Support System (DSS) used by the Australian Plague Locust Commission for the management of several important migratory insect pests in Australia is described. The DSS is based on a Geographic Information System (GIS) that integrates data on weather and habitat condition with the migration, development and distribution of the pest to prepare forecasts and aid control decisions. The GIS is module based, with the number and nature of the modules easily modified depending on the detail of data required to manage the pest concerned. [source]

APPLICATION OF GREY MODEL AND ARTIFICIAL NEURAL NETWORKS TO FLOOD FORECASTING
JOURNAL OF THE AMERICAN WATER RESOURCES ASSOCIATION, Issue 2 2006, Moon Seong Rang
Abstract: The main focus of this study was to compare the Grey model and several artificial neural network (ANN) models for real-time flood forecasting, including a comparison of the models for various lead times (ranging from one to six hours). For hydrological applications, the Grey model has the advantage that it can easily be used in forecasting without assuming that forecast storm events exhibit the same stochastic characteristics as previously observed storm events. The major advantage of an ANN in rainfall-runoff modeling is that no prior assumptions are required regarding the processes involved. The Grey model and three ANN models were applied to a 2,509 km² watershed in the Republic of Korea to compare the results for real-time flood forecasting with one to six hours of lead time.
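The Grey model's appeal, as the abstract notes, is that it forecasts from very short series without strong stochastic assumptions. A minimal sketch of the standard first-order GM(1,1) variant is given below (the study itself used a fifth-order model on real flood data; the series and horizon here are invented for illustration):

```python
import math

def gm11_forecast(x0, steps=1):
    """Multi-step forecast with a GM(1,1) Grey model.

    x0: list of positive observations (must not be constant, so a != 0);
    steps: forecast horizon. Returns `steps` forecasts beyond the sample.
    """
    n = len(x0)
    # Accumulated generating operation (AGO)
    x1, s = [], 0.0
    for v in x0:
        s += v
        x1.append(s)
    # Background values: means of consecutive AGO points
    z = [0.5 * (x1[k] + x1[k - 1]) for k in range(1, n)]
    y = x0[1:]
    # Least squares for the development coefficient a and grey input b
    # in the grey differential equation x0[k] + a*z[k] = b
    sz = sum(z); szz = sum(v * v for v in z)
    sy = sum(y); szy = sum(zi * yi for zi, yi in zip(z, y))
    m = n - 1
    det = m * szz - sz * sz
    a = (sz * sy - m * szy) / det
    b = (szz * sy - sz * szy) / det
    # Time-response function of the whitened equation
    def x1_hat(k):
        return (x0[0] - b / a) * math.exp(-a * k) + b / a
    # Forecasts in the original scale: first differences of x1_hat
    return [x1_hat(n + j) - x1_hat(n + j - 1) for j in range(steps)]
```

On a smoothly growing series the fitted exponential tracks the data closely, which is why the method is popular when only a handful of observations are available.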
The fifth-order Grey model and the ANN models with the optimal network architectures, represented by ANN1004 (34 input nodes, 21 hidden nodes, and 1 output node), ANN1010 (40 input nodes, 25 hidden nodes, and 1 output node), and ANN1004T (14 input nodes, 21 hidden nodes, and 1 output node), were adopted to evaluate the effects of time lags and of differences between area-mean and point rainfall. The Grey model and the ANN models, which provided reliable forecasts with one to six hours of lead time, were calibrated and validated on separate datasets. The results showed that the Grey model and the ANN1010 model achieved the highest level of performance in forecasting runoff for one to six lead hours. The ANN architectures that used point rainfall data (ANN1004 and ANN1010) performed better in real-time forecasting than the model that used area-mean rainfall data (ANN1004T). The selected models thus appear to be a useful tool for flood forecasting in Korea. [source]

FLOOD STAGE FORECASTING WITH SUPPORT VECTOR MACHINES
JOURNAL OF THE AMERICAN WATER RESOURCES ASSOCIATION, Issue 1 2002, Shie-Yui Liong
Abstract: Machine learning techniques are finding more and more applications in the field of forecasting. A novel regression technique, called the Support Vector Machine (SVM) and based on statistical learning theory, is explored in this study. SVM rests on the principle of Structural Risk Minimization, as opposed to the principle of Empirical Risk Minimization espoused by conventional regression techniques. Flood data at Dhaka, Bangladesh, are used to demonstrate the forecasting capabilities of SVM. The results are compared with those of an Artificial Neural Network (ANN) based model for one-lead-day to seven-lead-day forecasting. The improvements in maximum predicted water level errors by SVM over ANN for four-lead-day to seven-lead-day forecasts are 9.6 cm, 22.6 cm, 4.9 cm and 15.7 cm, respectively.
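Both the SVM and ANN forecasters described above are trained on the same supervised setup: lagged past values as inputs and the value several lead days ahead as the target. A sketch of that setup follows, with an ordinary-least-squares learner standing in for the SVM (the study used an ε-insensitive SVM regressor on real stage data; the function names and toy series here are illustrative, not from the paper):

```python
import numpy as np

def make_lagged(series, n_lags, lead):
    """Turn a series into (X, y) pairs for direct `lead`-step-ahead forecasting.

    X[i] holds n_lags consecutive past values; y[i] is the value `lead`
    steps after the last lag. Lag-based SVM or ANN forecasters train on
    exactly this kind of supervised pairing.
    """
    series = np.asarray(series, dtype=float)
    rows = len(series) - n_lags - lead + 1
    X = np.stack([series[i:i + n_lags] for i in range(rows)])
    y = series[n_lags + lead - 1 : n_lags + lead - 1 + rows]
    return X, y

def fit_linear_forecaster(X, y):
    """Ordinary least squares with an intercept, as a simple stand-in learner."""
    A = np.hstack([X, np.ones((len(X), 1))])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

def predict(coef, x_recent):
    """Forecast from the most recent n_lags observations."""
    return float(np.dot(coef[:-1], x_recent) + coef[-1])
```

Swapping the least-squares step for any regressor with a fit/predict interface gives the multi-lead-day comparison framework the abstract describes.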
The results show that the prediction accuracy of SVM is at least as good as, and in some cases (particularly at longer lead days) better than, that of ANN, and SVM avoids many of the limitations of ANN, for example the difficulties of arriving at an optimal network architecture and of choosing a useful training set. Thus, SVM appears to be a very promising prediction tool. [source]

CHAOTIC FORECASTING OF DISCHARGE TIME SERIES: A CASE STUDY
JOURNAL OF THE AMERICAN WATER RESOURCES ASSOCIATION, Issue 2 2001, Francesco Lisi
Abstract: This paper considers the problem of forecasting the discharge time series of a river by means of a chaotic approach. To this aim, we first check for evidence of chaotic behavior in the dynamics by considering a set of different procedures, namely the phase portrait of the attractor, the correlation dimension, and the largest Lyapunov exponent. Their joint application seems to confirm the presence of a nonlinear deterministic dynamic of chaotic type. Second, we consider the so-called nearest-neighbors predictor and compare it with a classical linear model. The comparison suggests that nonlinear river flow modeling, and in particular chaotic modeling, is an effective way to improve predictions. [source]

GENETIC PROGRAMMING AND ITS APPLICATION IN REAL-TIME RUNOFF FORECASTING
JOURNAL OF THE AMERICAN WATER RESOURCES ASSOCIATION, Issue 2 2001, Soon Thiam Khu
Abstract: Genetic programming (GP), a relatively new evolutionary technique, is demonstrated in this study to evolve code for the solution of problems. First, a simple example in the area of symbolic regression is considered. GP is then applied to real-time runoff forecasting for the Orgeval catchment in France. Here, GP functions as an error-updating scheme to complement a rainfall-runoff model, MIKE11/NAM. Hourly runoff forecasts at different updating intervals are performed for forecast horizons of up to nine hours.
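The nearest-neighbors predictor used in the chaotic-forecasting study above can be sketched directly: embed the series with a delay-coordinate map, find the historical states closest to the current state, and average their one-step successors. The embedding dimension and neighbor count below are illustrative choices, not those of the paper:

```python
def nn_forecast(series, emb_dim, n_neighbors=3):
    """Nearest-neighbors predictor on a delay embedding (unit delay).

    Embed the series in emb_dim dimensions, find the historical states
    closest to the current state (Euclidean distance), and average their
    one-step successors.
    """
    # Historical embedded states that have a known successor
    states = []
    for i in range(len(series) - emb_dim):
        states.append((series[i:i + emb_dim], series[i + emb_dim]))
    current = series[-emb_dim:]
    def dist(v):
        return sum((a - b) ** 2 for a, b in zip(v, current)) ** 0.5
    states.sort(key=lambda s: dist(s[0]))
    nearest = states[:n_neighbors]
    return sum(succ for _, succ in nearest) / len(nearest)
```

On a genuinely low-dimensional deterministic series, close states have close futures, which is exactly the property the correlation-dimension and Lyapunov-exponent checks in the abstract are meant to establish before this predictor is trusted.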
The results show that the proposed updating scheme predicts the runoff quite accurately for all updating intervals considered, and particularly for updating intervals not exceeding the time of concentration of the catchment. The results are also compared with those of an earlier study by the World Meteorological Organization, in which autoregression and Kalman filtering were used as the updating methods. The comparisons show that GP is a better updating tool for real-time flow forecasting. Another important finding from this study is that nondimensionalizing the variables enhances the symbolic regression process significantly. [source]

A Probabilistic Framework for Bayesian Adaptive Forecasting of Project Progress
COMPUTER-AIDED CIVIL AND INFRASTRUCTURE ENGINEERING, Issue 3 2007, Paolo Gardoni
An adaptive Bayesian updating method is used to assess the unknown model parameters based on recorded data and pertinent prior information. Recorded data can include equality, upper-bound, and lower-bound data. The proposed approach properly accounts for all the prevailing uncertainties, including model errors arising from an inaccurate model form or missing variables, measurement errors, statistical uncertainty, and volitional uncertainty. As an illustration of the proposed approach, the project progress and final time-to-completion of an example project are forecast. For this illustration, the construction of civilian nuclear power plants in the United States is considered. The application considers two cases: (1) no information is available prior to observing the actual progress data of a specified plant, and (2) the construction progress of eight other nuclear power plants is available. The example shows that an informative prior is important for making accurate predictions when only a few records are available. This is also the time when forecasts are most valuable to the project manager.
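The Bayesian updating scheme just described distinguishes equality data from upper- and lower-bound data; in likelihood terms these correspond to a density, a CDF, and a survival function respectively. A one-parameter grid-posterior sketch under a normal measurement model follows (the paper's actual model and data are far richer; the grid, prior, and observations here are all illustrative):

```python
import math

def normal_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def normal_cdf(x, mu, sigma):
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2))))

def grid_posterior(grid, prior, observations, sigma=1.0):
    """Posterior over a scalar parameter mu on a fixed grid.

    observations: list of (kind, value) with kind in
      'eq' -- exact measurement: likelihood is the normal density,
      'ub' -- value is an upper bound: likelihood is P(X <= value),
      'lb' -- value is a lower bound: likelihood is P(X >= value).
    """
    post = list(prior)
    for kind, v in observations:
        for i, mu in enumerate(grid):
            if kind == 'eq':
                like = normal_pdf(v, mu, sigma)
            elif kind == 'ub':
                like = normal_cdf(v, mu, sigma)
            else:  # 'lb'
                like = 1.0 - normal_cdf(v, mu, sigma)
            post[i] *= like
        total = sum(post)
        post = [p / total for p in post]  # renormalize after each record
    return post
```

Feeding in a lower-bound record shifts posterior mass upward without claiming an exact value, which is precisely what makes bound data usable alongside exact measurements.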
Having or not having prior information has no practical effect on the forecast once progress on a significant portion of the project has been recorded. [source]

Technology Forecasting: From Emotional to Empirical
CREATIVITY AND INNOVATION MANAGEMENT, Issue 2 2001, Michael S. Slocum
Technology forecasting has evolved from a methodology based on emotional responses to one predicated on data collection. The Theory of Inventive Problem Solving (TRIZ) is a theory based on empirical data that relates technological evolution to the same stages as biological macro-evolution. This paper explores the major emotional forecasting methods and discusses the part of TRIZ technology forecasting called Maturity Mapping. The reader is briefly introduced to eight evolutionary trends based on TRIZ. [source]

Marketing Category Forecasting: An Alternative of BVAR-Artificial Neural Networks
DECISION SCIENCES, Issue 4 2000, James J. Jiang
Abstract: Analyzing scanner data in brand management activities presents unique difficulties due to the vast quantity of the data. Time series methods that can handle the volume effectively are often inappropriate because many statistical assumptions are violated by the data characteristics. We examine scanner data sets for three brand categories and examine properties associated with many time series forecasting methods. Many violations are found with respect to linearity, normality, autocorrelation, and heteroscedasticity. With this in mind, we compare the forecasting ability of neural networks, which require no such assumptions, to two of the more robust time series techniques. Neural networks provide forecasts similar to Bayesian vector autoregression (BVAR), and both outperform generalized autoregressive conditional heteroscedasticity (GARCH) models. [source]

Classroom Integration of Statistics and Management Science Via Forecasting
DECISION SCIENCES JOURNAL OF INNOVATIVE EDUCATION, Issue 2 2006, M. David Albritton
First page of article. [source]

Forecasting the Direction of Policy Rate Changes: The Importance of ECB Words
ECONOMIC NOTES, Issue 1-2 2009, Carlo Rosa
This paper evaluates the predictive power of different information sets for the European Central Bank (ECB) interest-rate-setting behaviour. We employ an ordered probit model, i.e. a limited dependent variable framework, to take into account the discreteness displayed by policy rate changes. The results show that the forecasting ability of standard Taylor-type variables, such as inflation and the output gap, is fairly low both in-sample and out-of-sample, and is comparable to the performance of the random walk model. By using broader information sets that include measures of core inflation, exchange rates, monetary aggregates and financial conditions, however, the accuracy of forecasts of future ECB actions substantially improves. Moreover, ECB rhetoric contributes considerably to a better understanding of its policy reaction function. Finally, we find that the ECB has been fairly successful in educating the public to anticipate the overall future direction of its monetary policy, but less successful in signalling the exact timing of rate changes. [source]

Forecasting the Adoption of Genetically Modified Oilseed Rape
EUROCHOICES, Issue 2 2009, Gunnar Breustedt
Summary: We explore farmers' willingness to adopt genetically modified oilseed rape prior to its commercial release and estimate the 'demand' for the new technology. The analysis is based upon experiments with arable farmers in Germany who were asked to choose among conventional and GM rapeseed varieties with different characteristics.
Our analysis has shown that ex ante GM adoption decisions are driven by profit expectations and by personal as well as farm characteristics. Monetary and technological determinants, such as the gross margin advantage of GM oilseed rape varieties, expected liability from cross pollination, and restricted flexibility in returning to conventional oilseed rape growing, affect the willingness to adopt GM rape in the expected directions. The results further indicate that neighbourhood effects and public attitudes matter a great deal, such that individual farmers may not feel entirely free in their technology choice. Our demand simulations suggest that monopolistic seed prices would be set at between €50 and €100 per hectare, leaving farmers with a small share of the GM rent. This raises the question as to whether the farmers surveyed would actually benefit from the approval of GM rape varieties if the technology were to be provided by a single firm.
[source]

Fiscal Forecasting: Lessons from the Literature and Challenges
FISCAL STUDIES, Issue 3 2008, Teresa Leal
JEL classification: H6; E62; C53
Abstract: While fiscal forecasting and monitoring has its roots in the accountability of governments for the use of public funds in democracies, the Stability and Growth Pact has significantly increased interest in budgetary forecasts in Europe, where they play a key role in EU multilateral budgetary surveillance. In view of the increased prominence and sensitivity of budgetary forecasts, which may lead to them being influenced by strategic and political factors, this paper discusses the main issues and challenges in the field of fiscal forecasting from a practitioner's perspective and places them in the context of the related literature. [source]

Long-term Hydrological Forecasting in Cold Regions: Retrospect, Current Status and Prospect
GEOGRAPHY COMPASS (ELECTRONIC), Issue 5 2009, Alexander N. Gelfan
The influence of long-term snow accumulation on runoff conditions several months afterwards is a distinct hydrological characteristic of cold regions, and it creates opportunities for long-term (seasonal and subseasonal) hydrological forecasting in these regions. We consider the evolution of long-term forecasting approaches from deterministic data-based index methods to hydrological model-based ensemble approaches.
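The model-based ensemble approaches mentioned above address weather uncertainty over the lead time by running a hydrological model from the current basin state under many candidate weather sequences, typically taken from history, and reporting quantiles of the resulting runoff. A toy sketch of that idea follows; the degree-day melt model, its parameters, and the data are all invented for illustration:

```python
def toy_melt_runoff(snowpack, temps, melt_rate=2.0):
    """Toy degree-day model: daily melt proportional to temperature above
    0 °C, limited by the remaining snowpack. Returns total seasonal runoff."""
    runoff = 0.0
    for t in temps:
        melt = min(snowpack, melt_rate * max(t, 0.0))
        snowpack -= melt
        runoff += melt
    return runoff

def ensemble_forecast(snowpack, historical_seasons, quantiles=(0.1, 0.5, 0.9)):
    """Ensemble forecast: one member per historical weather season.

    Each member runs the model from today's snowpack under one past
    season's temperatures; quantiles of the sorted members summarize
    the forecast distribution (simple empirical quantile rule).
    """
    members = sorted(toy_melt_runoff(snowpack, season) for season in historical_seasons)
    n = len(members)
    return [members[min(int(q * n), n - 1)] for q in quantiles]
```

The spread between the low and high quantiles is the forecast's honest statement of weather uncertainty, which is exactly what a deterministic index method cannot provide.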
Of key interest in this review are the methods developed and used in operational practice in Russia and in the USA, with emphasis placed on the methods used in Russia, which may be less familiar to the international hydrological community. Following a description of the historical context, we review recent developments that emphasize problems relating to the uncertainty of the weather conditions over the lead time of the forecast. We conclude with a personal view of the prospects for the future development of long-term hydrological forecasting techniques. [source]

Severe Deep Moist Convective Storms: Forecasting and Mitigation
GEOGRAPHY COMPASS (ELECTRONIC), Issue 1 2008, David L. Arnold
Small-scale (2 to 20 km) circulations, termed 'severe deep moist convective storms', account for a disproportionate share of the world's insured weather-related losses. Spatial frequency maximums of severe convective events occur in South Africa, India, Mexico, the Caucasus, and the Great Plains/Prairies region of North America, where the maximum tornado frequency occurs east of the Rocky Mountains. Interest in forecasting severe deep moist convective systems, especially those that produce tornadoes, dates to 1884, when tornado alerts were first provided in the central United States. Modern thunderstorm and tornado forecasting relies on technology and theory, but in the post-World War II era interest in forecasting has also been driven by public pressure. The forecasting process begins with a diagnostic analysis, in which the forecaster considers the potential of the atmospheric environment to produce severe convective storms (which requires knowledge of the evolving kinematic and thermodynamic fields, and of the character of the land surface over which the storms will pass), and the likely character of the storms that may develop.
Improvements in forecasting will likely depend on technological advancements, such as the development of phased-array radar systems and finer-resolution numerical weather prediction models. Once initiated, the evolution of deep convective storms is monitored by satellite and radar. Mitigation of the hazards posed by severe deep moist convective storms is a three-step process involving preparedness, response, and recovery. Preparedness implies that risks have been identified and that organizations and individuals are familiar with a response plan. Response necessitates that potential events are identified before they occur and that the developing threat is communicated to the public. Recovery is a function of the awareness of local, regional, and even national governments of the character and magnitude of potential events in specific locations, and of whether or not long-term operational plans are in place at the time of disasters. [source]

Predicting population consequences of ocean climate change for an ecosystem sentinel, the seabird Cassin's auklet
GLOBAL CHANGE BIOLOGY, Issue 7 2010, Shaye G. Wolf
Abstract: Forecasting the ecological effects of climate change on marine species is critical for informing greenhouse gas mitigation targets and developing marine conservation strategies that remain effective and increase species' resilience under changing climate conditions. Highly productive coastal upwelling systems are predicted to experience substantial effects from climate change, making them priorities for ecological forecasting. We used a population modeling approach to examine the consequences of ocean climate change in the California Current upwelling ecosystem for the population growth rate of the planktivorous seabird Cassin's auklet (Ptychoramphus aleuticus), a demographically sensitive indicator of marine climate change.
We use future climate projections for sea surface temperature and upwelling intensity from a regional climate model to forecast changes in the growth rate of the auklet population at the important Farallon Island colony in central California. Our study projected that the auklet population growth rate will experience an absolute decline of 11 to 45% by the end of the century, placing this population on a trajectory toward extinction. In addition, future changes in upwelling intensity and in the timing of peak upwelling are likely to vary across auklet foraging regions in the California Current Ecosystem (CCE), producing a mosaic of climate conditions and ecological impacts across the auklet range. Overall, the Farallon Island Cassin's auklet population has been declining during recent decades, and ocean climate change in this century under a mid-level emissions scenario is projected to accelerate this decline, leading toward population extinction. Because our study species has proven to be a sensitive indicator of oceanographic conditions in the CCE and a powerful predictor of the abundance of other important predators (i.e. salmon), the significant impacts we predicted for the Cassin's auklet provide insights into the consequences that ocean climate change may have for other plankton predators in this system. [source]

Forecasting the recurrence of ulcerative colitis: Can U.C. the future?
INFLAMMATORY BOWEL DISEASES, Issue 3 2008, Daniel Leffler, MD
No abstract is available for this article. [source]

Forecasting and Finite Sample Performance of Short Rate Models: International Evidence
INTERNATIONAL REVIEW OF FINANCE, Issue 3-4 2005, Sirimon Treepongkaruna
Abstract: This paper evaluates the forecasting and finite sample performance of short-term interest rate models in a number of countries.
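The population-projection logic behind the auklet study can be sketched with a stage-structured matrix whose dominant eigenvalue is the asymptotic population growth rate λ (λ < 1 means decline toward extinction). The matrix entries below are invented for illustration and are not the auklet vital rates used in the paper:

```python
def growth_rate(matrix, iters=200):
    """Dominant eigenvalue (asymptotic growth rate lambda) of a
    nonnegative projection matrix, computed by power iteration."""
    n = len(matrix)
    v = [1.0] * n
    lam = 1.0
    for _ in range(iters):
        w = [sum(matrix[i][j] * v[j] for j in range(n)) for i in range(n)]
        lam = max(abs(x) for x in w)   # growth factor this step
        v = [x / lam for x in w]       # renormalize the stage vector
    return lam

def leslie(f, s_juv, s_ad):
    """Two-stage projection matrix: fecundity f on the top row,
    juvenile and adult survival probabilities below."""
    return [[0.0, f], [s_juv, s_ad]]
```

Climate scenarios enter such a model by making the survival and fecundity entries functions of forecast ocean conditions, so the projected λ tracks the projected environment.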
Specifically, we run a series of in-sample and out-of-sample tests for both the conditional mean and the volatility of one-factor short rate models, and compare the results to the random walk model. Overall, we find that the out-of-sample forecasting performance of one-factor short rate models is poor, stemming from the inability of the models to accommodate jumps and discontinuities in the time series data. In addition, we perform a series of Monte Carlo analyses similar to Chapman and Pearson to document the finite sample performance of the short rate models when γ is not restricted to equal one. Our results indicate the potential dangers of over-parameterization and highlight the limitations of short-term interest rate models. [source]

Forecasting with Exponential Smoothing: The State Space Approach, by Rob J. Hyndman, Anne B. Koehler, J. Keith Ord and Ralph D. Snyder
INTERNATIONAL STATISTICAL REVIEW, Issue 2 2009, David J. Hand
No abstract is available for this article. [source]

Surge, Escalate, Withdraw and Shinseki: Forecasting and Retro-casting American Force Strategies and Insurgency in Iraq
INTERNATIONAL STUDIES PERSPECTIVES, Issue 3 2007, Andrew J. Enterline
Central to the contemporary American foreign policy debate is the issue of reducing insurgency and promoting stability in Iraq, and the role of American military forces in achieving these outcomes. Military force-related proposals range from complete withdrawal to a moderate 'surge' in troops to a massive escalation of the force commitment. Here, we draw upon an analysis of domestic political stability in 60 imposed political systems during the period 1816 to 1994 to forecast the effectiveness of these force-related proposals.
The analysis underscores, in part, that (i) a policy of surging American troops is unlikely to succeed, and (ii) a policy of belated massive escalation reduces insurgency, but much less so than an initial policy of massive invasion coupled with massive occupation, a strategy that preempts the development of a robust insurgency. [source]

Forecasting the Adoption of GM Oilseed Rape: Evidence from a Discrete Choice Experiment in Germany
JOURNAL OF AGRICULTURAL ECONOMICS, Issue 2 2008, Gunnar Breustedt
JEL classification: C42; C81; Q12; Q16
Abstract: This paper explores farmers' willingness to adopt genetically modified (GM) oilseed rape prior to its commercial release and estimates the 'demand' for the new technology. The analysis is based upon choice experiments with 202 German arable farmers. A multinomial probit estimation reveals that GM attributes such as gross margin, expected liability from cross pollination, and flexibility in returning to conventional oilseed rape significantly affect the likelihood of adoption. Neighbouring farmers' attitudes towards GM cropping and a number of farmer and farm characteristics were also found to be significant determinants of prospective adoption. Demand simulations suggest that adoption rates are very sensitive to the profit difference between GM and non-GM rape varieties. A monopolistic seed price would substantially reduce demand for the new technology. A monopolistic seed supplier would reap between 45% and 80% of the GM rent, and the deadweight loss of the monopoly would range between 15% and 30% of that rent. The remaining rent for farmers may be too small to outweigh possible producer price discounts resulting from the costs of segregating GM and non-GM oilseed rape along the supply chain. [source]

Forecasting realized volatility: a Bayesian model-averaging approach
JOURNAL OF APPLIED ECONOMETRICS, Issue 5 2009, Chun Liu
How to measure and model volatility is an important issue in finance.
Recent research uses high-frequency intraday data to construct ex post measures of daily volatility. This paper uses a Bayesian model-averaging approach to forecast realized volatility. Candidate models include autoregressive and heterogeneous autoregressive specifications based on the logarithm of realized volatility, realized power variation, realized bipower variation, a jump and an asymmetric term. Applied to equity and exchange rate volatility over several forecast horizons, Bayesian model averaging provides very competitive density forecasts and modest improvements in point forecasts compared to benchmark models. We discuss the reasons for this, including the importance of using realized power variation as a predictor. Bayesian model averaging provides further improvements to density forecasts when we move away from linear models and average over specifications that allow for GARCH effects in the innovations to log-volatility. Copyright © 2009 John Wiley & Sons, Ltd. [source]

Cointegration, Efficiency and Forecasting in the Currency Market
JOURNAL OF BUSINESS FINANCE & ACCOUNTING, Issue 1-2 2001, Wilson H. S. Tong
The existing literature on using the cointegration approach to examine the efficiency of the foreign exchange market gives mixed results. Arguments typically focus on econometric testing techniques, with fractional cointegration being the most current one. This paper looks at the issue from an economic perspective. It shows that the cointegrating relationship, whether cointegrated or fractionally cointegrated, is found mainly among the currencies of the European Monetary System, which are set to fluctuate within a given range. Hence, there is no inconsistency with the notion of market efficiency. Yet exploiting such a cointegrating relationship is helpful in currency forecasting. There is some evidence that restricting the forecasting model to consist of only cointegrated currencies improves forecasting efficiency.
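Exploiting a cointegrating relationship for forecasting, as in the currency-market abstract above, is commonly done with the Engle-Granger two-step procedure followed by an error-correction forecast: estimate the long-run relation by OLS, then regress changes on the lagged disequilibrium. The sketch below uses a synthetic cointegrated pair and a deliberately minimal ECM with no short-run Δx terms; neither the data nor the specification comes from the paper:

```python
def ols(x, y):
    """Simple-regression slope and intercept by least squares."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    beta = sum((a - mx) * (b - my) for a, b in zip(x, y)) / \
           sum((a - mx) ** 2 for a in x)
    return beta, my - beta * mx

def ecm_forecast(y, x):
    """One-step error-correction forecast of y.

    Step 1 (Engle-Granger): regress y on x to estimate the long-run relation.
    Step 2: regress the changes of y on the lagged disequilibrium z, so the
    forecast pulls y back toward its long-run level.
    """
    beta, alpha = ols(x, y)
    z = [yi - alpha - beta * xi for yi, xi in zip(y, x)]   # disequilibrium
    dy = [y[t] - y[t - 1] for t in range(1, len(y))]
    gamma, c = ols(z[:-1], dy)                             # adjustment speed
    return y[-1] + c + gamma * z[-1]
```

When the pair really is cointegrated, gamma comes out negative: deviations from the long-run relation predict offsetting moves, which is the forecasting content the abstract refers to.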
[source]

Forecasting for the LCD monitor market
JOURNAL OF FORECASTING, Issue 4 2008, Shin-Lian Lo
Abstract: The TFT-LCD (thin-film transistor liquid crystal display) industry is one of the key global industries whose products have high clock speed. In this research, the LCD monitor market is considered for an empirical study of hierarchical forecasting (HF). The proposed HF methodology consists of five steps. First, the three hierarchical levels of the LCD monitor market are identified. Second, several exogenously driven factors that significantly affect the demand for LCD monitors are identified at each level of the product hierarchy. Third, three forecasting techniques (regression analysis, transfer function models, and a simultaneous equations model) are combined to forecast future demand at each hierarchical level. Fourth, various forecasting approaches and disaggregating proportion methods are adopted to obtain consistent demand forecasts at each hierarchical level. Finally, the forecast errors of the different forecasting approaches are assessed in order to determine the best forecasting level and the best forecasting approach. The findings show that the best forecast results are obtained with the middle-out forecasting approach. These results could guide LCD manufacturers and brand owners on ways to forecast future market demand. Copyright © 2008 John Wiley & Sons, Ltd. [source]

Forecasting with panel data
JOURNAL OF FORECASTING, Issue 2 2008, Badi H. Baltagi
Abstract: This paper gives a brief survey of forecasting with panel data. It begins with a simple error component regression model and surveys best linear unbiased prediction under various assumptions about the disturbance term. This includes various ARMA models as well as spatial autoregressive models. The paper also surveys how these forecasts have been used in panel data applications, running horse races between heterogeneous and homogeneous panel data models using out-of-sample forecasts.
Copyright © 2008 John Wiley & Sons, Ltd. [source]

Forecasting the price of crude oil via convenience yield predictions
JOURNAL OF FORECASTING, Issue 7 2007, Thomas A. Knetsch (article first published online: 14 NOV 200)
Abstract: The paper develops an oil price forecasting technique which is based on the present value model of rational commodity pricing. The approach suggests shifting the forecasting problem to the marginal convenience yield, which can be derived from the cost-of-carry relationship. In a recursive out-of-sample analysis, forecast accuracy at horizons within one year is checked by the root mean squared error as well as the mean error and the frequency of a correct direction-of-change prediction. For all criteria employed, the proposed forecasting tool outperforms the approach of using futures prices as direct predictors of future spot prices. Vis-à-vis the random-walk model, it does not significantly improve forecast accuracy but provides valuable statements on the direction of change. Copyright © 2007 John Wiley & Sons, Ltd. [source]

Forecasting the recent behavior of US business fixed investment spending: an analysis of competing models
JOURNAL OF FORECASTING, Issue 1 2007, David E. Rapach
Abstract: We evaluate forecasting models of US business fixed investment spending growth over the recent 1995:1 to 2004:2 out-of-sample period. The forecasting models are based on the conventional Accelerator, Neoclassical, Average Q, and Cash-Flow models of investment spending, as well as real stock prices and excess stock return predictors. The real stock price model typically generates the most accurate forecasts, and forecast-encompassing tests indicate that this model contains most of the information useful for forecasting investment spending growth relative to the other models at longer horizons. In a robustness check, we also evaluate the forecasting performance of the models over two alternative out-of-sample periods: 1975:1 to 1984:4 and 1985:1 to 1994:4.
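The three evaluation criteria named in the crude-oil study above (root mean squared error, mean error, and the frequency of a correct direction-of-change prediction) recur throughout these out-of-sample comparisons and are easy to compute in one pass; a small sketch with invented numbers:

```python
def forecast_metrics(actual, forecast, previous):
    """Out-of-sample evaluation criteria: root mean squared error,
    mean error (bias), and the frequency of a correct direction-of-change
    prediction relative to the previous observed value."""
    n = len(actual)
    errors = [f - a for f, a in zip(forecast, actual)]
    rmse = (sum(e * e for e in errors) / n) ** 0.5
    mean_error = sum(errors) / n
    # A hit: forecast and outcome moved the same way from the previous value
    hits = sum(
        1 for a, f, p in zip(actual, forecast, previous)
        if (a - p) * (f - p) > 0
    )
    return rmse, mean_error, hits / n
```

The direction-of-change rate is the criterion on which the convenience-yield model beats the random walk in that abstract, even though its RMSE does not significantly improve.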
A number of different models produce the most accurate forecasts over these alternative out-of-sample periods, indicating that while the real stock price model appears particularly useful for forecasting the recent behavior of investment spending growth, it may not continue to perform well in future periods. Copyright © 2007 John Wiley & Sons, Ltd.

Forecasting the conditional covariance matrix of a portfolio under long-run temporal dependence
JOURNAL OF FORECASTING, Issue 6 2006. Trino-Manuel Ñíguez

Abstract: Long-range persistence in volatility is widely modelled and forecast in terms of the so-called fractionally integrated models. These models are mostly applied in the univariate framework, since the extension to the multivariate context of asset portfolios, while relevant, is not straightforward. We discuss and apply a procedure which is able to forecast the multivariate volatility of a portfolio including assets with long memory. The main advantage of this model is that it is feasible enough to be applied to large-scale portfolios, solving the problem of dealing with the extremely complex likelihood functions which typically arise in this context. An application of this procedure to a portfolio of five daily exchange rate series shows that the out-of-sample forecasts for the multivariate volatility are improved under several loss functions when the long-range dependence property of the portfolio assets is explicitly accounted for. Copyright © 2006 John Wiley & Sons, Ltd.

A non-Gaussian generalization of the Airline model for robust seasonal adjustment
JOURNAL OF FORECASTING, Issue 5 2006. John A. D. Aston

Abstract: In their seminal book Time Series Analysis: Forecasting and Control, Box and Jenkins (1976) introduce the Airline model, which is still routinely used for the modelling of economic seasonal time series.
The Airline model is for a differenced time series (in levels and seasons) and constitutes a linear moving average of lagged Gaussian disturbances which depends on two coefficients and a fixed variance. In this paper a novel approach to seasonal adjustment is developed that is based on the Airline model and that accounts for outliers and breaks in time series. For this purpose we consider the canonical representation of the Airline model. It takes the model as a sum of trend, seasonal and irregular (unobserved) components which are uniquely identified as a result of the canonical decomposition. The resulting unobserved components time series model is extended by components that allow for outliers and breaks. When all components depend on Gaussian disturbances, the model can be cast in state space form and the Kalman filter can compute the exact log-likelihood function. Related filtering and smoothing algorithms can be used to compute minimum mean squared error estimates of the unobserved components. However, the outlier and break components typically rely on heavy-tailed densities such as the t or the mixture of normals. For this class of non-Gaussian models, Monte Carlo simulation techniques are used for estimation, signal extraction and seasonal adjustment. This robust approach to seasonal adjustment allows outliers to be accounted for, while keeping the underlying structures that are currently used to aid reporting of economic time series data. Copyright © 2006 John Wiley & Sons, Ltd.

A forecasting procedure for nonlinear autoregressive time series models
JOURNAL OF FORECASTING, Issue 5 2005. Yuzhi Cai

Abstract: Forecasting for nonlinear time series is an important topic in time series analysis. Existing numerical algorithms for multi-step-ahead forecasting ignore accuracy checking, while alternative Monte Carlo methods are computationally very demanding and their accuracy is difficult to control.
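The Monte Carlo approach to multi-step-ahead forecasting mentioned above is easy to sketch for a concrete model: simulate many sample paths m steps forward and summarise the endpoints as an approximation of the predictive distribution. The toy nonlinear AR(1), its parameters, and the noise level below are all invented for illustration and are not the models studied in the paper:

```python
import random
import statistics

# Monte Carlo m-step-ahead forecasting for a toy nonlinear AR(1):
#   y_t = 0.9 * y_{t-1} - 0.3 * y_{t-1}^2 / (1 + y_{t-1}^2) + eps_t,
# with Gaussian eps_t. Each simulated path propagates the nonlinearity
# exactly, so the endpoint sample approximates the m-step predictive density.

def step(y_prev, rng):
    eps = rng.gauss(0.0, 0.5)
    return 0.9 * y_prev - 0.3 * y_prev**2 / (1.0 + y_prev**2) + eps

def mc_forecast(y_last, m, n_paths=20000, seed=1):
    rng = random.Random(seed)
    endpoints = []
    for _ in range(n_paths):
        y = y_last
        for _ in range(m):
            y = step(y, rng)
        endpoints.append(y)
    # Predictive mean and standard deviation; quantiles of `endpoints`
    # would approximate the predictive distribution function.
    return statistics.mean(endpoints), statistics.pstdev(endpoints)

mean3, sd3 = mc_forecast(y_last=1.0, m=3)
print(round(mean3, 2), round(sd3, 2))
```

The cost grows linearly in both the horizon m and the number of paths, which illustrates the computational burden the paper's numerical procedure is designed to avoid.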
In this paper a numerical forecasting procedure for nonlinear autoregressive time series models is proposed. The forecasting procedure can be used to obtain approximate m-step-ahead predictive probability density functions, predictive distribution functions, predictive means and variances, etc. for a range of nonlinear autoregressive time series models. Examples in the paper show that the forecasting procedure works very well, both in terms of the accuracy of the results and in its ability to deal with different nonlinear autoregressive time series models. Copyright © 2005 John Wiley & Sons, Ltd.

Forecasting the Treasury's balance at the Fed
JOURNAL OF FORECASTING, Issue 5 2004. Daniel L. Thornton

Abstract: As part of the Fed's daily operating procedure, the Federal Reserve Bank of New York, the Board of Governors and the Treasury each make a forecast of that day's Treasury balance at the Fed. These forecasts are an integral part of the Fed's daily operating procedure. Errors in these forecasts can generate variation in reserve supply and, consequently, in the federal funds rate. This paper evaluates the accuracy of these forecasts. The evidence suggests that each agency's forecast contributes to the optimal (i.e., minimum variance) forecast, and that the Trading Desk of the Federal Reserve Bank of New York incorporates information from all three of the agency forecasts in conducting daily open market operations. Moreover, these forecasts encompass the forecast of an economic model. Copyright © 2004 John Wiley & Sons, Ltd.
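The minimum-variance combination of several forecasts discussed above has a textbook closed form in the simplest case of unbiased forecasts with uncorrelated errors: weight each forecast by the inverse of its error variance. The sketch below uses hypothetical agency forecasts and error variances; the actual figures are not given in the abstract:

```python
# Minimum-variance combination of unbiased forecasts under the (assumed)
# simplification of uncorrelated errors: weights proportional to
# 1 / error variance, so more accurate forecasters get more weight.

def combine(forecasts, error_vars):
    inv = [1.0 / v for v in error_vars]
    total = sum(inv)
    weights = [w / total for w in inv]
    combined = sum(w * f for w, f in zip(weights, forecasts))
    combined_var = 1.0 / total   # variance of the optimal combination
    return combined, combined_var, weights

forecasts = [5.2, 4.8, 5.5]        # hypothetical agency forecasts ($bn)
error_vars = [0.40, 0.25, 0.90]    # hypothetical forecast error variances
combined, combined_var, weights = combine(forecasts, error_vars)
print(round(combined, 3), round(combined_var, 3))
```

Note that the combined variance is below the variance of even the best single forecast, which is the statistical rationale for pooling the three agencies' forecasts rather than picking one.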