Series Data (series + data)

Distribution by Scientific Domains
Distribution within Business, Economics, Finance and Accounting

Kinds of Series Data

  • time series data


  • Selected Abstracts


    Homicide in Chicago from 1890 to 1930: prohibition and its impact on alcohol- and non-alcohol-related homicides

    ADDICTION, Issue 3 2009
    Mark Asbridge
    ABSTRACT Aim The aim of the current paper is to examine the impact of the enactment of constitutional prohibition in the United States in 1920 on total homicides, alcohol-related homicides and non-alcohol-related homicides in Chicago. Design Data are drawn from the Chicago Historical Homicide Project, a data set chronicling 11 018 homicides in Chicago between 1870 and 1930. Interrupted time-series and autoregressive integrated moving average (ARIMA) models are employed to examine the impact of prohibition on three separate population-adjusted homicide series. All models control for potential confounding from World War I demobilization and from trend data drawn from Wesley Skogan's Time-Series Data from Chicago. Findings Total and non-alcohol-related homicide rates increased during prohibition by 21% and 11%, respectively, while alcohol-related homicides remained unchanged. For other covariates, alcohol-related homicides were related negatively to the size of the Chicago police force and positively to police expenditures and to the proportion of the Chicago population aged 21 years and younger. Non-alcohol-related homicides were related positively to police expenditures and negatively to the size of the Chicago police force. Conclusions While total and non-alcohol-related homicides in the United States continued to rise during prohibition, a finding consistent with other studies, the rate of alcohol-related homicides remained unchanged. The divergent impact of prohibition on alcohol- and non-alcohol-related homicides is discussed in relation to previous studies of homicide in this era. [source]
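
    As a hedged illustration of the interrupted time-series/ARIMA design described above, the sketch below fits an ARIMA model with a step-change dummy as an exogenous regressor. The synthetic series, the (1, 0, 0) order, and the intervention date are illustrative assumptions, not the paper's specification.

    ```python
    # Interrupted time-series sketch: ARIMA with a step ("intervention") dummy.
    import numpy as np
    from statsmodels.tsa.arima.model import ARIMA

    rng = np.random.default_rng(0)
    n = 61                                        # e.g. annual rates, 1870-1930
    step = (np.arange(n) >= 50).astype(float)     # intervention switches on in year 50
    rate = 5 + 1.2 * step + rng.normal(0, 0.5, n).cumsum() * 0.1  # synthetic rate

    model = ARIMA(rate, exog=step, order=(1, 0, 0))
    res = model.fit()
    print(res.params)     # the exog coefficient estimates the level shift
    ```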


    Adaptive Fourier Series and the Analysis of Periodicities in Time Series Data

    JOURNAL OF TIME SERIES ANALYSIS, Issue 6 2000
    Robert V. Foutz
    A Fourier series decomposes a function x(t) into a sum of periodic components that have sinusoidal shapes. This paper describes an adaptive Fourier series where the periodic components of x(t) may have a variety of differing shapes. The periodic shapes are adaptive since they depend on the function x(t) and the period. The results, which extend both Fourier analysis and Walsh–Fourier analysis, are applied to investigate the shapes of periodic components in time series data sets. [source]
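
    To ground the terminology, here is a minimal classical Fourier decomposition of a series into sinusoidal components via the FFT; the paper's adaptive, non-sinusoidal extension is not reproduced, and the synthetic signal is an assumption.

    ```python
    # Baseline Fourier decomposition: recover the dominant sinusoidal periods.
    import numpy as np

    t = np.arange(0, 200)
    x = 2.0 * np.sin(2 * np.pi * t / 25) + 0.5 * np.sin(2 * np.pi * t / 7) \
        + np.random.default_rng(1).normal(0, 0.3, t.size)

    coeffs = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(t.size, d=1.0)
    amplitude = 2 * np.abs(coeffs) / t.size

    for k in np.argsort(amplitude)[-3:][::-1]:    # three strongest components
        if freqs[k] > 0:
            print(f"period {1 / freqs[k]:6.1f}  amplitude {amplitude[k]:.2f}")
    ```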


    Implicit Surface Modelling with a Globally Regularised Basis of Compact Support

    COMPUTER GRAPHICS FORUM, Issue 3 2006
    C. Walder
    We consider the problem of constructing a globally smooth analytic function that represents a surface implicitly by way of its zero set, given sample points with surface normal vectors. The contributions of the paper include a novel means of regularising multi-scale compactly supported basis functions that leads to the desirable interpolation properties previously only associated with fully supported bases. We also provide a regularisation framework for simpler and more direct treatment of surface normals, along with a corresponding generalisation of the representer theorem lying at the core of kernel-based machine learning methods. We demonstrate the techniques on 3D problems of up to 14 million data points, as well as 4D time series data and four-dimensional interpolation between three-dimensional shapes. Categories and Subject Descriptors (according to ACM CCS): I.3.5 [Computer Graphics]: Curve, surface, solid, and object representations [source]


    STRATEGIC BEHAVIORS TOWARD ENVIRONMENTAL REGULATION: A CASE OF TRUCKING INDUSTRY

    CONTEMPORARY ECONOMIC POLICY, Issue 1 2007
    TERENCE LAM
    We used the trucking industry's response to the U.S. Environmental Protection Agency's acceleration of 2004 diesel emissions standards as a case study to examine the importance of accounting for regulatees' strategic behaviors in the drafting of environmental regulations. Our analysis of time series data on aggregate U.S. and Canadian heavy-duty truck production from 1992 through 2003 found that heavy-duty truck production increased by 20%–23% in the 6 mo prior to the date of compliance. The increases might be due to truck operators pre-buying trucks with less expensive but noncompliant engines and behaving strategically in anticipation of other uncertainties. (JEL L51, Q25) [source]


    Measurement error and estimates of population extinction risk

    ECOLOGY LETTERS, Issue 1 2004
    John M. McNamara
    Abstract It is common to estimate the extinction probability for a vulnerable population using methods based on the mean and variance of the long-term population growth rate. The numerical values of these two parameters are estimated from time series of population censuses. However, the proportion of a population that is registered at each census is typically not constant but will vary among years because of stochastic factors such as weather conditions at the time of sampling. Here, we analyse how such sampling errors influence estimates of extinction risk and find that sampling errors produce two opposite effects. Measurement errors lead to an exaggerated overall variance, but also introduce negative autocorrelations in the time series (which means that estimates of annual growth rates tend to alternate in size). If time series data are treated properly, these two effects exactly counterbalance. We advocate routinely incorporating a measure of among-year correlations in estimating population extinction risk. [source]
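
    The two opposing effects described here (inflated variance, induced negative lag-1 autocorrelation) can be reproduced with a few lines of simulation; the random-walk population model and the size of the census error below are illustrative assumptions.

    ```python
    # Sampling error inflates growth-rate variance and induces negative lag-1 correlation.
    import numpy as np

    rng = np.random.default_rng(2)
    T, mu, sigma = 200, 0.00, 0.05
    logN = np.cumsum(rng.normal(mu, sigma, T))      # true log population
    obs = logN + rng.normal(0, 0.10, T)             # census with sampling error

    def growth_stats(x):
        r = np.diff(x)                              # annual log growth rates
        return r.var(), np.corrcoef(r[:-1], r[1:])[0, 1]

    print("true     var, lag-1 corr:", growth_stats(logN))
    print("observed var, lag-1 corr:", growth_stats(obs))  # larger var, negative corr
    ```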


    A Three-step Method for Choosing the Number of Bootstrap Repetitions

    ECONOMETRICA, Issue 1 2000
    Donald W. K. Andrews
    This paper considers the problem of choosing the number of bootstrap repetitions B for bootstrap standard errors, confidence intervals, confidence regions, hypothesis tests, p-values, and bias correction. For each of these problems, the paper provides a three-step method for choosing B to achieve a desired level of accuracy. Accuracy is measured by the percentage deviation of the bootstrap standard error estimate, confidence interval length, test's critical value, test's p-value, or bias-corrected estimate based on B bootstrap simulations from the corresponding ideal bootstrap quantities for which B = ∞. The results apply quite generally to parametric, semiparametric, and nonparametric models with independent and dependent data. The results apply to the standard nonparametric iid bootstrap, moving block bootstraps for time series data, parametric and semiparametric bootstraps, and bootstraps for regression models based on bootstrapping residuals. Monte Carlo simulations show that the proposed methods work very well. [source]
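
    As a rough companion to the accuracy criterion above, the sketch below measures how a bootstrap standard error computed from B repetitions deviates, in percentage terms, from a large-B stand-in for the B = ∞ ideal; it illustrates the criterion only and is not Andrews's three-step rule.

    ```python
    # Percentage deviation of the bootstrap SE from a large-B reference, as B grows.
    import numpy as np

    rng = np.random.default_rng(3)
    data = rng.standard_t(df=5, size=100)

    def boot_se(B):
        means = [rng.choice(data, size=data.size, replace=True).mean() for _ in range(B)]
        return np.std(means, ddof=1)

    reference = boot_se(50_000)            # stand-in for the B = infinity ideal
    for B in (50, 200, 1000, 5000):
        dev = 100 * abs(boot_se(B) - reference) / reference
        print(f"B={B:5d}  % deviation from ideal ~ {dev:.1f}")
    ```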


    Can hepatitis C virus prevalence be used as a measure of injection-related human immunodeficiency virus risk in populations of injecting drug users?

    ADDICTION, Issue 2 2010
    An ecological analysis
    ABSTRACT Background Human immunodeficiency virus (HIV) outbreaks occur among injecting drug users (IDUs), but where HIV is low insight is required into the future risk of increased transmission. The relationship between hepatitis C virus (HCV) and HIV prevalence among IDUs is explored to determine whether HCV prevalence could indicate HIV risk. Methods Systematic review of IDU HIV/HCV prevalence data and regression analysis using weighted prevalence estimates and time-series data. Results HIV/HCV prevalence estimates were obtained for 343 regions. In regions other than South America/sub-Saharan Africa (SAm/SSA), mean IDU HIV prevalence is likely to be negligible if HCV prevalence is <30% (95% confidence interval 22–38%) but increases progressively with HCV prevalence thereafter [linearly (β = 0.39 and R² = 0.67) or in proportion to cubed HCV prevalence (β = 0.40 and R² = 0.67)]. In SAm/SSA, limited data suggest that mean HIV prevalence is proportional to HCV prevalence (β = 0.84, R² = 0.99), but will be much greater than in non-SAm/SSA settings with no threshold HCV prevalence that corresponds to low HIV risk. At low HCV prevalences (<50%), time-series data suggest that any change in HIV prevalence over time is likely to be much smaller (<25%) than the change in HCV prevalence over the same time-period, but that this difference diminishes at higher HCV prevalences. Conclusions HCV prevalence could be an indicator of HIV risk among IDUs. In most settings, reducing HCV prevalence below a threshold (30%) would substantially reduce any HIV risk, and could provide a target for HIV prevention. [source]


    Market Shares, Financial Constraints and Pricing Behaviour in the Export Market

    ECONOMICA, Issue 276 2002
    Nils Gottfries
    A structural dynamic model of price and quantity adjustment is estimated on time series data for exports and export prices. Two sources of dynamics are considered: customer markets and preset prices. As predicted by the customer market model, the market share adjusts slowly after a change in the relative price and financial conditions affect prices. Prices are found to be sticky in the sense that they do not reflect the most recent information about costs and exchange rates. A parsimoniously parameterized structural model explains about 90% of the variation in market share and the relative price. [source]


    Nonparametric harmonic regression for estuarine water quality data

    ENVIRONMETRICS, Issue 6 2010
    Melanie A. Autin
    Abstract Periodicity is omnipresent in environmental time series data. For modeling estuarine water quality variables, harmonic regression analysis has long been the standard for dealing with periodicity. Generalized additive models (GAMs) allow more flexibility in the response function. They permit parametric, semiparametric, and nonparametric regression functions of the predictor variables. We compare harmonic regression, GAMs with cubic regression splines, and GAMs with cyclic regression splines in simulations and using water quality data collected from the National Estuarine Research Reserve System (NERRS). While the classical harmonic regression model works well for clean, near-sinusoidal data, the GAMs are competitive and are very promising for more complex data. The generalized additive models are also more adaptive and require less intervention. Copyright © 2009 John Wiley & Sons, Ltd. [source]
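
    For concreteness, a minimal harmonic regression baseline of the kind the paper compares against can be fitted by ordinary least squares on sine/cosine terms; the single annual harmonic and the synthetic data below are assumptions.

    ```python
    # Harmonic regression baseline: annual sine/cosine terms fitted by OLS.
    import numpy as np

    rng = np.random.default_rng(4)
    day = np.arange(3 * 365)
    y = 10 + 3 * np.sin(2 * np.pi * day / 365 + 0.6) + rng.normal(0, 1, day.size)

    X = np.column_stack([np.ones_like(day, dtype=float),
                         np.sin(2 * np.pi * day / 365),
                         np.cos(2 * np.pi * day / 365)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    fitted = X @ beta
    print("coefficients:", beta)          # intercept, sine and cosine amplitudes
    ```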


    A missing values imputation method for time series data: an efficient method to investigate the health effects of sulphur dioxide levels

    ENVIRONMETRICS, Issue 2 2010
    Swarna Weerasinghe
    Abstract Environmental data often contain lengthy records of sequential missing values. A practical problem arose in the analysis of the adverse health effects of sulphur dioxide (SO2) levels on asthma hospital admissions for Sydney, Nova Scotia, Canada. Reliable missing-value imputation techniques are required to obtain valid estimates of associations with sparse health outcomes such as asthma hospital admissions. In this paper, a new method that incorporates prediction errors to impute missing values is described, using mean daily average sulphur dioxide levels following a stationary time series with a random error. Existing imputation methods failed to incorporate the prediction errors. An optimal method is developed by extending a between-forecast method to include prediction errors. Validity and efficacy are demonstrated by comparing the performance with values that do not include prediction errors. The performance of the optimal method is demonstrated by the increased validity and accuracy of the β coefficient of the Poisson regression model for the association with asthma hospital admissions. Visual inspection of the imputed sulphur dioxide levels with prediction errors demonstrated that the variation is better captured. The method is computationally simple and can be incorporated into existing statistical software. Copyright © 2009 John Wiley & Sons, Ltd. [source]
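
    The key point, imputing with the prediction error included rather than the bare conditional-mean forecast, can be sketched with an AR(1) stand-in for the SO2 series; the AR coefficient, error scale, and gap location below are illustrative assumptions.

    ```python
    # Forecast-based imputation that keeps the prediction-error term.
    import numpy as np

    rng = np.random.default_rng(5)
    phi, sigma = 0.7, 1.0
    x = np.empty(300)
    x[0] = rng.normal()
    for t in range(1, x.size):
        x[t] = phi * x[t - 1] + rng.normal(0, sigma)

    # suppose observations 150-159 form a lengthy sequential gap
    imputed = x.copy()
    for t in range(150, 160):
        mean_forecast = phi * imputed[t - 1]
        imputed[t] = mean_forecast + rng.normal(0, sigma)   # keep the error term

    # Imputing only mean_forecast would shrink the gap toward the mean and
    # understate the variance that downstream Poisson regressions rely on.
    ```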


    On case-crossover methods for environmental time series data

    ENVIRONMETRICS, Issue 2 2007
    Heather J. Whitaker
    Abstract Case-crossover methods are widely used for analysing data on the association between health events and environmental exposures. In recent years, several approaches to choosing referent periods have been suggested, with much discussion of two types of bias: bias due to temporal trends, and overlap bias. In the present paper, we revisit the case-crossover method, focusing on its origin in the case-control paradigm, in order to throw new light on these biases. We emphasise the distinction between methods based on case-control logic (such as the symmetric bi-directional (SBI) method), for which overlap bias is a consequence of non-exchangeability of the exposure series, and methods based on cohort logic (such as the time-stratified (TS) method), for which overlap bias does not arise. We show by example that the TS method may suffer severe bias from residual seasonality. This method can be extended to control for seasonality. However, time series regression is more flexible than case-crossover methods for the analysis of data on shared environmental exposures. We conclude that time series regression ought to be adopted as the method of choice in such applications. Copyright © 2006 John Wiley & Sons, Ltd. [source]


    Extreme value predictions based on nonstationary time series of wave data

    ENVIRONMETRICS, Issue 1 2006
    Christos N. Stefanakos
    Abstract A new method for calculating return periods of various level values from nonstationary time series data is presented. The key idea of the method is a new definition of the return period, based on the MEan Number of Upcrossings of the level x* (MENU method). In the present article, the case of Gaussian periodically correlated time series is studied in detail. The whole procedure is numerically implemented and applied to synthetic wave data in order to test the stability of the method. Results obtained by using several variants of traditional methods (Gumbel's approach and the POT method) are also presented for comparison purposes. The results of the MENU method showed extraordinary stability, in contrast to the wide variability of the traditional methods. The predictions obtained by means of the MENU method are lower than the traditional predictions. This is in accordance with the results of other methods that also take into account the dependence structure of the examined time series. Copyright © 2005 John Wiley & Sons, Ltd. [source]
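
    The counting step behind the MENU definition is easy to make concrete: estimate the mean number of upcrossings of a level x* per unit time and invert it to a return period. The synthetic hourly series and the levels below are assumptions.

    ```python
    # Return period as the reciprocal of the mean rate of upcrossings of x*.
    import numpy as np

    rng = np.random.default_rng(6)
    t = np.arange(20_000)                      # e.g. hourly records
    x = np.sin(2 * np.pi * t / (24 * 365)) + rng.normal(0, 0.5, t.size)

    def return_period(series, level):
        up = np.sum((series[:-1] < level) & (series[1:] >= level))
        return np.inf if up == 0 else series.size / up   # time units per upcrossing

    for level in (1.0, 1.5, 2.0):
        print(f"x* = {level:.1f}  return period ~ {return_period(x, level):8.0f} hours")
    ```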


    Alcohol and mortality: methodological and analytical issues in aggregate analyses

    ADDICTION, Issue 1s1 2001
    Thor Norström
    This supplement includes a collection of papers that aim at estimating the relationship between per capita alcohol consumption and various forms of mortality, including mortality from liver cirrhosis, accidents, suicide, homicide, ischaemic heart disease, and total mortality. The papers apply a uniform methodological protocol, and they are all based on time series data covering the post-war period in the present EU countries and Norway. In this paper we discuss various methodological and analytical issues that are common to these papers. We argue that analysis of time series data is the most feasible approach for assessing the aggregate health consequences of changes in population drinking. We further discuss how aggregate data may also be useful for judging the plausibility of individual-level relationships, particularly those prone to be confounded by selection effects. The aggregation of linear and curvilinear risk curves is treated, as are various methods for dealing with the time-lag problem. With regard to estimation techniques, we find country-specific analyses preferable to pooled cross-sectional/time series models, since the latter incorporate the dubious element of geographical co-variation and conceal potentially interesting variations in alcohol effects. The approach taken in the papers at hand is instead to pool the country-specific results into three groups of countries that represent different drinking cultures: traditional wine countries of southern Europe, beer countries of central Europe and the British Isles, and spirits countries of northern Europe. The findings of the papers reinforce the central tenet of the public health perspective that overall consumption is an important determinant of alcohol-related harm rates. However, there is a variation across country groups in alcohol effects, particularly those on violent deaths, that indicates the potential importance of drinking patterns. There is no support for the notion that increases in per capita consumption have any cardioprotective effects at the population level. [source]


    A semi-parametric gap-filling model for eddy covariance CO2 flux time series data

    GLOBAL CHANGE BIOLOGY, Issue 9 2006
    VANESSA J. STAUCH
    Abstract This paper introduces a method for modelling the deterministic component of eddy covariance CO2 flux time series in order to supplement missing data in these important data sets. The method is based on combining multidimensional semi-parametric spline interpolation with an assumed but unstated dependence of net CO2 flux on light, temperature and time. We test the model using a range of synthetic canopy data sets generated using several canopy simulation models realized for different micrometeorological and vegetation conditions. The method appears promising for filling large systematic gaps, provided the associated missing data do not unduly erode the critical information content in the conditioning data used for the model optimization. [source]


    The effect of respiration variations on independent component analysis results of resting state functional connectivity

    HUMAN BRAIN MAPPING, Issue 7 2008
    Rasmus M. Birn
    Abstract The analysis of functional connectivity in fMRI can be severely affected by cardiac and respiratory fluctuations. While some of these artifactual signal changes can be reduced by physiological noise correction routines, signal fluctuations induced by slower breath-to-breath changes in the depth and rate of breathing are typically not removed. These slower respiration-induced signal changes occur at low frequencies and spatial locations similar to the fluctuations used to infer functional connectivity, and have been shown to significantly affect seed-ROI or seed-voxel based functional connectivity analysis, particularly in the default mode network. In this study, we investigate the effect of respiration variations on functional connectivity maps derived from independent component analysis (ICA) of resting-state data. Regions of the default mode network were identified by deactivations during a lexical decision task. Variations in respiration were measured independently and correlated with the MRI time series data. ICA appears to separate the default mode network and the respiration-related changes in most cases. In some cases, however, the component automatically identified as the default mode network was the same as the component identified as respiration-related. Furthermore, in most cases the time series associated with the default mode network component was still significantly correlated with changes in respiration volume per time, suggesting that current methods of ICA may not completely separate respiration from the default mode network. An independent measure of the respiration provides valuable information to help distinguish the default mode network from respiration-related signal changes, and to assess the degree of residual respiration related effects. Hum Brain Mapp 2008. © 2008 Wiley-Liss, Inc. [source]


    Amygdala–prefrontal dissociation of subliminal and supraliminal fear

    HUMAN BRAIN MAPPING, Issue 8 2006
    Leanne M. Williams
    Abstract Facial expressions of fear are universally recognized signals of potential threat. Humans may have evolved specialized neural systems for responding to fear in the absence of conscious stimulus detection. We used functional neuroimaging to establish whether the amygdala and the medial prefrontal regions to which it projects are engaged by subliminal fearful faces and whether responses to subliminal fear are distinguished from those to supraliminal fear. We also examined the time course of amygdala-medial prefrontal responses to supraliminal and subliminal fear. Stimuli were fearful and neutral baseline faces, presented under subliminal (16.7 ms and masked) or supraliminal (500 ms) conditions. Skin conductance responses (SCRs) were recorded simultaneously as an objective index of fear perception. SPM2 was used to undertake search region-of-interest (ROI) analyses for the amygdala and medial prefrontal (including anterior cingulate) cortex, and complementary whole-brain analyses. Time series data were extracted from ROIs to examine activity across early versus late phases of the experiment. SCRs and amygdala activity were enhanced in response to both subliminal and supraliminal fear perception. Time series analysis showed a trend toward greater right amygdala responses to subliminal fear, but left-sided responses to supraliminal fear. Cortically, subliminal fear was distinguished by right ventral anterior cingulate activity and supraliminal fear by dorsal anterior cingulate and medial prefrontal activity. Although subcortical amygdala activity was relatively persistent for subliminal fear, supraliminal fear showed more sustained cortical activity. The findings suggest that preverbal processing of fear may occur via a direct rostral–ventral amygdala pathway without the need for conscious surveillance, whereas elaboration of consciously attended signals of fear may rely on higher-order processing within a dorsal cortico–amygdala pathway. Hum Brain Mapp, 2005. © 2005 Wiley-Liss, Inc. [source]


    Spectral decomposition of periodic ground water fluctuation in a coastal aquifer

    HYDROLOGICAL PROCESSES, Issue 12 2008
    David Ching-Fang Shih
    Abstract This research applies descriptive statistics and spectral analysis to six kinds of time series data to give a complete assessment of periodic fluctuation in significant constituents at the Huakang Shan earthquake monitoring site. Spectral analysis and bandpass filtering techniques are demonstrated to accurately analyse the significant component. Variation in relative ground water heads with a period of 12·6 h is found to be highly related to seawater level fluctuation. The time lag is estimated at about 3·78 h. Based on these phenomena, the coastal aquifer formed in an unconsolidated formation can be affected by the nearby seawater body for the semi-diurnal component. Fluctuation in piezometric heads is found to correspond at a rate of 1000 m h−1. Atmospheric pressure presents significant components at periods of 10·8 h and 7·2 h in a quite different pattern compared to relative ground water head and seawater level. Copyright © 2008 John Wiley & Sons, Ltd. [source]
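
    The periodogram-plus-bandpass workflow described here can be sketched as follows, isolating a semidiurnal (~12·6 h) component from an hourly record with a Butterworth bandpass; the sampling rate, band edges, and filter order are illustrative assumptions.

    ```python
    # Periodogram to find the dominant period, then a zero-phase bandpass around it.
    import numpy as np
    from scipy.signal import butter, filtfilt, periodogram

    rng = np.random.default_rng(7)
    hours = np.arange(24 * 120)                     # 120 days, hourly
    head = 0.4 * np.sin(2 * np.pi * hours / 12.6) + rng.normal(0, 0.2, hours.size)

    freqs, power = periodogram(head, fs=1.0)        # cycles per hour
    print("dominant period ~", 1 / freqs[np.argmax(power[1:]) + 1], "h")

    low, high = 1 / 14.0, 1 / 11.0                  # band around the 12.6 h peak
    b, a = butter(4, [low, high], btype="bandpass", fs=1.0)
    semidiurnal = filtfilt(b, a, head)              # zero-phase, so no lag is added
    ```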


    Time series forecasting by combining the radial basis function network and the self-organizing map

    HYDROLOGICAL PROCESSES, Issue 10 2005
    Gwo-Fong Lin
    Abstract Based on a combination of a radial basis function network (RBFN) and a self-organizing map (SOM), a time-series forecasting model is proposed. Traditionally, the positioning of the radial basis centres is a crucial problem for the RBFN. In the proposed model, an SOM is used to construct the two-dimensional feature map from which the number of clusters (i.e. the number of hidden units in the RBFN) can be figured out directly by eye, and then the radial basis centres can be determined easily. The proposed model is examined using simulated time series data. The results demonstrate that the proposed RBFN is more competent in modelling and forecasting time series than an autoregressive integrated moving average (ARIMA) model. Finally, the proposed model is applied to actual groundwater head data. It is found that the proposed model can forecast more precisely than the ARIMA model. For time series forecasting, the proposed model is recommended as an alternative to the existing method, because it has a simple structure and can produce reasonable forecasts. Copyright © 2005 John Wiley & Sons, Ltd. [source]
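
    A stripped-down analogue of the proposal is sketched below: choose radial basis centres by clustering lag vectors of the series, then fit the RBF output weights by least squares. K-means stands in for the SOM purely for brevity, and the window of three lags and the bandwidth rule are assumptions.

    ```python
    # RBF network forecast with clustering-chosen centres (k-means as a SOM stand-in).
    import numpy as np
    from scipy.cluster.vq import kmeans2

    rng = np.random.default_rng(8)
    series = np.sin(np.arange(500) * 0.1) + rng.normal(0, 0.1, 500)
    X = np.column_stack([series[:-3], series[1:-2], series[2:-1]])  # lag vectors
    y = series[3:]

    centres, _ = kmeans2(X, 8, minit="++", seed=0)
    width = np.median(np.linalg.norm(X[:, None] - centres[None], axis=2))

    def design(Z):
        d = np.linalg.norm(Z[:, None] - centres[None], axis=2)
        return np.exp(-(d / width) ** 2)

    w, *_ = np.linalg.lstsq(design(X), y, rcond=None)
    print("one-step-ahead forecast:", (design(X[-1:]) @ w)[0])
    ```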


    Error analysis for the evaluation of model performance: rainfall,runoff event time series data

    HYDROLOGICAL PROCESSES, Issue 8 2005
    Edzer J. Pebesma
    Abstract This paper provides a procedure for evaluating model performance where model predictions and observations are given as time series data. The procedure focuses on the analysis of error time series by graphing them, summarizing them, and predicting their variability through available information (recalibration). We analysed two rainfall–runoff events from the R-5 data set, and evaluated 12 distinct model simulation scenarios for these events, of which 10 were conducted with the quasi-physically-based rainfall–runoff model (QPBRRM) and two with the integrated hydrology model (InHM). The QPBRRM simulation scenarios differ in their representation of saturated hydraulic conductivity. Two InHM simulation scenarios differ with respect to the inclusion of the roads at R-5. The two models, QPBRRM and InHM, differ strongly in the complexity and number of processes included. For all model simulations we found that errors could be predicted fairly well to very well, based on model output, or based on smooth functions of lagged rainfall data. The errors remaining after recalibration are much more alike in terms of variability than those without recalibration. In this paper, recalibration is not meant to fix models, but merely as a diagnostic tool that exhibits the magnitude and direction of model errors and indicates whether these model errors are related to model inputs such as rainfall. Copyright © 2004 John Wiley & Sons, Ltd. [source]


    Remote Monitoring Integrated State Variables for AR Model Prediction of Daily Total Building Air-Conditioning Power Consumption

    IEEJ TRANSACTIONS ON ELECTRICAL AND ELECTRONIC ENGINEERING, Issue 5 2010
    Chuzo Ninagawa Member
    Abstract It is extremely difficult to predict the daily accumulated power consumption of an entire building's air-conditioning facilities because of the huge number of variables involved. We propose new integrated state variables, i.e. the daily operation amount and the daily operation-capacity-weighted average set temperature. Taking advantage of a remote monitoring technology, time series data of the integrated state variables were collected and an autoregressive (AR) model prediction for the daily total power consumption has been tried. © 2010 Institute of Electrical Engineers of Japan. Published by John Wiley & Sons, Inc. [source]
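
    A minimal AR-model forecast of a synthetic daily series, in the spirit of the abstract, is sketched below; the fabricated "daily total power consumption" and the lag order are assumptions, and the paper's integrated state variables are not reconstructed.

    ```python
    # AR model forecast of a daily series with a weekly cycle.
    import numpy as np
    from statsmodels.tsa.ar_model import AutoReg

    rng = np.random.default_rng(9)
    days = 200
    weekly = 50 + 10 * np.sin(2 * np.pi * np.arange(days) / 7)
    consumption = weekly + rng.normal(0, 3, days)       # synthetic daily totals

    res = AutoReg(consumption, lags=7).fit()
    forecast = res.predict(start=days, end=days + 6)    # next seven days
    print(forecast)
    ```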


    Assessing the long-run economic impact of labour law systems: a theoretical reappraisal and analysis of new time series data

    INDUSTRIAL RELATIONS JOURNAL, Issue 6 2008
    Simon Deakin
    ABSTRACT Standard economic theory sees labour law as an exogenous interference with market relations and predicts mostly negative impacts on employment and productivity. We argue for a more nuanced theoretical position: labour law is, at least in part, endogenous, with both the production and the application of labour law norms influenced by national and sectoral contexts, and by complementarities between the institutions of the labour market and those of corporate governance and financial markets. Legal origin may also operate as a force shaping the content of the law and its economic impact. Time-series analysis using a new data set on legal change from the 1970s to the mid-2000s shows evidence of positive correlations between regulation and growth in employment and productivity, at least for France and Germany. No relationship, either positive or negative, is found for the UK and, although the United States shows a weak negative relationship between regulation and employment growth, this is offset by productivity gains. [source]


    Case studies in Bayesian segmentation applied to CD control

    INTERNATIONAL JOURNAL OF ADAPTIVE CONTROL AND SIGNAL PROCESSING, Issue 5 2003
    A.R. Taylor
    Identifying step changes in historical process and controller output variables can lead to improved process understanding and fault resolution in control system performance analysis. This paper describes an application of Bayesian methods in the search for statistically significant temporal segmentations in the data collected by a cross directional (CD) control system in an industrial web forming process. CD control systems give rise to vector observations which are often transformed through orthogonal bases for control and performance analysis. In this paper two models which exploit basis function representations of vector time series data are segmented. The first of these is a power spectrum model based on the asymptotic Chi-squared approximation, which allows large data sets to be processed. The second approach, which is more capable of detecting small changes but as a result more computationally demanding, is a special case of the multivariate linear model. Given the statistical model of the data, inference regarding the number and location of the change-points is based on numerical Bayesian methods known as Markov chain Monte Carlo (MCMC). The methods are applied to real data and the resulting segmentation relates to real process events. Copyright © 2003 John Wiley & Sons, Ltd. [source]
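
    A toy version of Bayesian segmentation makes the inference concrete: a single step change in the mean of a Gaussian series, with the posterior over the change location evaluated on a grid rather than by MCMC; the data and the known noise variance are assumptions.

    ```python
    # Grid posterior over a single change-point in the mean of a Gaussian series.
    import numpy as np

    rng = np.random.default_rng(10)
    y = np.concatenate([rng.normal(0, 1, 120), rng.normal(1.5, 1, 80)])
    n, s2 = y.size, 1.0                               # known noise variance

    log_post = np.full(n, -np.inf)
    for k in range(5, n - 5):                         # candidate change locations
        left, right = y[:k], y[k:]
        # profile log-likelihood: each segment mean replaced by its MLE
        log_post[k] = -0.5 * (np.sum((left - left.mean()) ** 2)
                              + np.sum((right - right.mean()) ** 2)) / s2

    post = np.exp(log_post - log_post.max())
    post /= post.sum()
    print("posterior mode for change-point:", int(np.argmax(post)))
    ```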


    An approach to the linguistic summarization of time series using a fuzzy quantifier driven aggregation

    INTERNATIONAL JOURNAL OF INTELLIGENT SYSTEMS, Issue 5 2010
    Janusz Kacprzyk
    We extend our previous work on the linguistic summarization of time series data meant as the linguistic summarization of trends, i.e. consecutive parts of the time series which may be viewed as exhibiting a uniform behavior under an assumed (degree of) granulation, and identified with straight line segments of a piecewise linear approximation of the time series. We characterize the trends by the dynamics of change, duration, and variability. A linguistic summary of a time series is then viewed as related to a linguistic quantifier driven aggregation of trends. For this purpose we primarily employ Zadeh's classic calculus of linguistically quantified propositions, which is presumably the most straightforward and intuitively appealing, using the classic minimum operation and mentioning other t-norms. We also outline the use of the Sugeno and Choquet integrals proposed in our previous papers. We show an application to the absolute performance type analysis of time series data on daily quotations of an investment fund over an 8-year period, by presenting first an analysis of characteristic features of the quotations, under various (degrees of) granulations assumed, and then by listing some of the more interesting and useful summaries obtained. We propose a convenient presentation of linguistic summaries focused on some characteristic feature, exemplified by what happens "almost always," "very often," "quite often," "almost never," etc. All these analyses are meant to provide means to support a human user in making decisions. © 2010 Wiley Periodicals, Inc. [source]
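
    A tiny worked instance of Zadeh's calculus as used for summaries such as "most trends are increasing": the truth value is the quantifier's membership function applied to the mean per-trend membership degree. The membership values and the piecewise-linear "most" below are illustrative.

    ```python
    # Truth value of a linguistically quantified proposition, Zadeh-style.
    import numpy as np

    def mu_most(p):
        # a standard piecewise-linear "most": 0 below 0.3, 1 above 0.8
        return np.clip(2 * p - 0.6, 0.0, 1.0)

    # degrees to which each extracted trend is "increasing" (illustrative)
    increasing = np.array([1.0, 0.8, 0.9, 0.2, 1.0, 0.7, 0.6, 0.95])

    truth = mu_most(increasing.mean())
    print(f'T("most trends are increasing") = {truth:.2f}')
    ```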


    Fuzzy information granules in time series data

    INTERNATIONAL JOURNAL OF INTELLIGENT SYSTEMS, Issue 7 2004
    Michael R. Berthold
    Often, it is desirable to represent a set of time series through typical shapes in order to detect common patterns. The algorithm presented here compares pieces of different time series in order to find such similar shapes. The use of a fuzzy clustering technique based on fuzzy c-means allows us to detect shapes that belong to a certain group of typical shapes with a degree of membership. Modifications to the original algorithm also allow this matching to be invariant with respect to a scaling of the time series. The algorithm is demonstrated on a widely known set of data taken from the electrocardiogram (ECG) rhythm analysis experiments performed at the Massachusetts Institute of Technology (MIT) laboratories and on data from protein mass spectrography. © 2004 Wiley Periodicals, Inc. [source]
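
    A compact fuzzy c-means loop of the kind the algorithm builds on, applied to fixed-length windows cut from a series, is sketched below; the window length, number of clusters, and fuzzifier m are assumptions, and the paper's scaling-invariance modification is not included.

    ```python
    # Fuzzy c-means over fixed-length subsequences of a time series.
    import numpy as np

    rng = np.random.default_rng(11)
    series = np.concatenate([np.sin(np.linspace(0, 8 * np.pi, 400)),
                             np.sign(np.sin(np.linspace(0, 8 * np.pi, 400)))])
    W = 20
    windows = np.array([series[i:i + W] for i in range(0, series.size - W, W)])

    c, m = 2, 2.0
    U = rng.random((windows.shape[0], c))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(50):
        um = U ** m
        centres = (um.T @ windows) / um.sum(axis=0)[:, None]   # weighted means
        d = np.linalg.norm(windows[:, None] - centres[None], axis=2) + 1e-12
        ratio = (d[:, :, None] / d[:, None, :]) ** (2 / (m - 1))
        U = 1.0 / ratio.sum(axis=2)                            # membership update

    print("membership of first window in each shape cluster:", U[0].round(2))
    ```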


    Forecasting and Finite Sample Performance of Short Rate Models: International Evidence

    INTERNATIONAL REVIEW OF FINANCE, Issue 3-4 2005
    SIRIMON TREEPONGKARUNA
    ABSTRACT This paper evaluates the forecasting and finite sample performance of short-term interest rate models in a number of countries. Specifically, we run a series of in-sample and out-of-sample tests for both the conditional mean and volatility of one-factor short rate models, and compare the results to the random walk model. Overall, we find that the out-of-sample forecasting performance of one-factor short rate models is poor, stemming from the inability of the models to accommodate jumps and discontinuities in the time series data. In addition, we perform a series of Monte Carlo analyses similar to Chapman and Pearson to document the finite sample performance of the short rate models when γ is not restricted to be equal to one. Our results indicate the potential dangers of over-parameterization and highlight the limitations of short-term interest rate models. [source]


    A small monetary system for the euro area based on German data

    JOURNAL OF APPLIED ECONOMETRICS, Issue 6 2006
    Ralf Brüggemann
    Previous euro area money demand studies have used aggregated national time series data from the countries participating in the European Monetary Union (EMU). However, aggregation may be problematic because macroeconomic convergence processes have taken place in the countries of interest. Therefore, in this study, quarterly German data until 1998 are combined with data from the euro area from 1999 until 2002 and these series are used for fitting a small vector error correction model for the monetary sector of the EMU. A stable long-run money demand relation is found for the full sample period. Moreover, impulse responses do not change much when the sample period is extended by the EMU period provided the break in the extended data series is captured by a simple dummy variable. Copyright © 2006 John Wiley & Sons, Ltd. [source]


    Real-time forecasting of photosmog episodes: the Naples case study

    JOURNAL OF CHEMOMETRICS, Issue 7 2001
    A. Riccio
    Abstract In this paper we analysed the ozone time series data collected by the local monitoring network in the Naples urban area (southern Italy) during the spring/summer period of 1996. Our aim was to identify a reliable and effective model that could be used for the real-time forecasting of photosmog episodes. We studied the applicability of seasonal autoregressive integrated moving average models with some exogenous variables (ARIMAX) to our case study. The choice of exogenous variables (temperature, [NO2]/[NO] ratio and wind speed) was based on physical reasoning. The forecasting performance of all models was evaluated with data not used in model development, by means of an array of statistical indices: the comparison between observed and forecast means and standard deviations; intercept and slope of a least squares regression of the forecast variable on the observed variable; mean absolute and root mean square errors; and 95% confidence limits of the forecast variable. The assessment of all models was also based on their tendency to forecast critical episodes. It was found that the model using information from the temperature data set to predict peak ozone levels gives satisfactory results, with about 70% of critical episodes being correctly predicted by the 24-h-ahead forecast function. Copyright © 2001 John Wiley & Sons, Ltd. [source]
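
    An ARIMAX-style setup mirroring the use of exogenous predictors such as temperature can be written with statsmodels' SARIMAX; the synthetic data, model order, and single exogenous variable below are assumptions, not the paper's fitted specification.

    ```python
    # ARIMAX-style model: SARIMAX with an exogenous temperature regressor.
    import numpy as np
    from statsmodels.tsa.statespace.sarimax import SARIMAX

    rng = np.random.default_rng(12)
    n = 180
    temperature = 25 + 5 * np.sin(2 * np.pi * np.arange(n) / 365) + rng.normal(0, 1, n)
    ozone = 40 + 1.5 * temperature + rng.normal(0, 5, n)    # synthetic daily peaks

    res = SARIMAX(ozone, exog=temperature, order=(1, 0, 1)).fit(disp=False)
    # a 24 h ahead forecast needs tomorrow's (forecast) temperature as exog
    next_temp = np.array([[temperature[-1]]])
    print(res.forecast(steps=1, exog=next_temp))
    ```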


    Information theoretical measures to analyze trajectories in rational molecular design

    JOURNAL OF COMPUTATIONAL CHEMISTRY, Issue 16 2007
    K. Hamacher
    Abstract We develop a new methodology to analyze molecular dynamics trajectories and other time series data from simulation runs. This methodology is based on an information measure of the difference between distributions of various data extracted from such simulations. The method is fast, as it only involves the numerical integration/summation of the distributions in one dimension, while avoiding sampling issues at the same time. The method is most suitable for applications in which different scenarios are to be compared, e.g. to guide rational molecular design. We show the power of the proposed method in an application of rational drug design by reduced model computations on the BH3 motif in the apoptosis-inducing BCL2 protein family. © 2007 Wiley Periodicals, Inc. J Comput Chem, 2007 [source]


    Testing for market power in the Australian grains and oilseeds industries

    AGRIBUSINESS : AN INTERNATIONAL JOURNAL, Issue 3 2007
    Christopher J. O'Donnell
    We formally assess competitive buying and selling behavior in the Australian grains and oilseeds industries using a more realistic empirical model and a less aggregated data set than previously available. We specify a duality model of profit maximization that allows for imperfect competition in both input and output markets and for variable-proportions technologies. Aggregate input-output data are used to define the structure of the relevant industries, and time series data are then used to implement the model for 13 grains and oilseeds products handled by seven groups of agents. The model is estimated in a Bayesian econometrics framework. We find evidence of flour and cereal food product manufacturers exerting market power when purchasing wheat, barley, oats and triticale; beer and malt manufacturers exerting market power when purchasing wheat and barley; and other food product manufacturers exerting market power when purchasing wheat, barley, oats and triticale. [EconLit citations: C11, L66, Q11]. © 2007 Wiley Periodicals, Inc. Agribusiness 23: 349–376, 2007. [source]


    Statistical simulation of flood variables: incorporating short-term sequencing

    JOURNAL OF FLOOD RISK MANAGEMENT, Issue 1 2008
    Y. Cai
    Abstract The pluvial and fluvial flooding in the United Kingdom over the summer of 2007 arose as a result of anomalous climatic conditions that persisted for over a month. Gaining an understanding of the sequencing of storm events and representing their characteristics within flood risk analysis is therefore of importance. This paper provides a general method for simulating univariate time series data, with a given marginal extreme value distribution and required autocorrelation structure, together with a demonstration of the method with synthetic data. The method is then extended to the multivariate case, where cross-variable correlations are also represented. The multivariate method is shown to work well for a two-variable simulation of wave heights and sea surges at Lerwick. This work was prompted by an engineering need for long time series data for use in continuous simulation studies where gradual deterioration is a contributory factor to flood risk and potential structural failure. [source]
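
    The univariate core of the method, generating a Gaussian AR(1) with the desired autocorrelation and mapping it through the normal CDF onto a target extreme-value marginal, can be sketched as follows; the AR coefficient and GEV parameters are illustrative assumptions.

    ```python
    # Gaussian AR(1) dependence transformed to a GEV marginal distribution.
    import numpy as np
    from scipy.stats import norm, genextreme

    rng = np.random.default_rng(13)
    phi, n = 0.6, 5000
    z = np.empty(n)
    z[0] = rng.normal()
    for t in range(1, n):
        z[t] = phi * z[t - 1] + rng.normal(0, np.sqrt(1 - phi ** 2))

    u = norm.cdf(z)                               # uniform marginals, AR dependence
    x = genextreme.ppf(u, c=-0.1, loc=2.0, scale=0.5)   # e.g. a synthetic surge series

    print("lag-1 autocorrelation:", np.corrcoef(x[:-1], x[1:])[0, 1].round(2))
    ```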