Modelling Strategies

Selected Abstracts


Predicting the impacts of climate change on the distribution of species: are bioclimate envelope models useful?

GLOBAL ECOLOGY, Issue 5 2003
Richard G. Pearson
ABSTRACT Modelling strategies for predicting the potential impacts of climate change on the natural distribution of species have often focused on the characterization of a species' bioclimate envelope. A number of recent critiques have questioned the validity of this approach by pointing to the many factors other than climate that play an important part in determining species distributions and the dynamics of distribution changes. Such factors include biotic interactions, evolutionary change and dispersal ability. This paper reviews and evaluates criticisms of bioclimate envelope models and discusses the implications of these criticisms for the different modelling strategies employed. It is proposed that, although the complexity of the natural system presents fundamental limits to predictive modelling, the bioclimate envelope approach can provide a useful first approximation as to the potentially dramatic impact of climate change on biodiversity. However, it is stressed that the spatial scale at which these models are applied is of fundamental importance, and that model results should not be interpreted without due consideration of the limitations involved. A hierarchical modelling framework is proposed through which some of these limitations can be addressed within a broader, scale-dependent context. [source]
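
The envelope idea can be made concrete in a few lines. The sketch below implements a rectilinear ("BIOCLIM-style") envelope, which is only one simple member of the family Pearson reviews; the climate data, percentile trim and warming scenario are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated climate at 200 known occurrence sites
# (columns: mean temperature in deg C, annual precipitation in mm)
occurrences = rng.normal(loc=[12.0, 800.0], scale=[2.0, 150.0], size=(200, 2))

# Rectilinear envelope: per-variable limits at occupied sites,
# with a percentile trim to guard against outliers
lo = np.percentile(occurrences, 5, axis=0)
hi = np.percentile(occurrences, 95, axis=0)

def in_envelope(climate):
    """Suitable if every climate variable lies within the envelope limits."""
    climate = np.atleast_2d(climate)
    return np.all((climate >= lo) & (climate <= hi), axis=1)

# Score 1000 candidate sites now and under a crude scenario
# (+2 deg C, 10% less precipitation)
sites = rng.normal(loc=[12.0, 800.0], scale=[3.0, 200.0], size=(1000, 2))
future = sites.copy()
future[:, 0] += 2.0
future[:, 1] *= 0.9

print("suitable now:   ", in_envelope(sites).mean())
print("suitable future:", in_envelope(future).mean())
```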


A fibre flexure–shear model for seismic analysis of RC-framed structures

EARTHQUAKE ENGINEERING AND STRUCTURAL DYNAMICS, Issue 5 2009
P. Ceresa
Abstract While currently existing modelling approaches of reinforced concrete (RC) behaviour allow a reasonably accurate prediction of flexural response, the determination of its shear counterpart needs further development. Various modelling strategies in the literature are able to predict the shear response and the shear–flexure coupling under monotonic loading conditions; however, very few of the reported models have demonstrated successful results under cyclic loading, as in the seismic load case. These considerations led to this research work, focused on the development of a flexure–shear model for RC beam–column elements. A reliable constitutive model for cracked RC subjected to cyclic loading was implemented as a bi-axial fibre constitutive model in a two-dimensional Timoshenko beam–column element. The aim of this research work is to arrive at the definition of a numerical model that is sufficiently accurate and, at the same time, computationally efficient, enabling implementation within a finite element package for nonlinear dynamic analysis of existing non-seismically designed RC structures that are prone to shear-induced damage and collapse. Copyright © 2009 John Wiley & Sons, Ltd. [source]
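
For readers unfamiliar with fibre modelling, the following sketch shows the idea at section level: slice the cross-section into fibres, impose a linear strain profile, locate the neutral axis from axial equilibrium, and integrate fibre stresses into a moment. It is a monotonic, flexure-only toy with deliberately simple material laws, nothing like the bi-axial cyclic constitutive model of the paper; all dimensions and strengths are assumed values.

```python
import numpy as np
from scipy.optimize import brentq

# Rectangular RC section, units N and mm; all values assumed for illustration
b, h = 300.0, 500.0                 # width, depth
fc, eps_c0 = 30.0, 0.002            # concrete strength, strain at peak
fy, Es = 500.0, 200e3               # steel yield stress and modulus
rebar = [(50.0, 1500.0), (450.0, 1500.0)]   # (depth from top, area) per layer

n_fib = 200
y_fib = (np.arange(n_fib) + 0.5) * h / n_fib   # fibre mid-depths from top
A_fib = b * h / n_fib

def conc_stress(eps):
    """Parabolic in compression (plateau past peak), no tension capacity."""
    e = np.clip(-eps / eps_c0, 0.0, 1.0)
    return -fc * (2.0 * e - e**2)

def steel_stress(eps):
    """Elastic-perfectly-plastic."""
    return np.clip(Es * eps, -fy, fy)

def section_forces(phi, y_na):
    """Axial force and moment for curvature phi and neutral-axis depth y_na.
    Plane sections: strain is phi * (y - y_na), tension positive."""
    eps = phi * (y_fib - y_na)
    sig = conc_stress(eps)
    N = np.sum(sig * A_fib)
    M = np.sum(sig * A_fib * (y_fib - h / 2.0))
    for ys, As in rebar:
        ss = steel_stress(phi * (ys - y_na))
        N += ss * As
        M += ss * As * (ys - h / 2.0)
    return N, M

# Moment-curvature points: find y_na from axial equilibrium (N = 0), then M
for phi in [1e-6, 5e-6, 1e-5, 2e-5]:
    y_na = brentq(lambda y: section_forces(phi, y)[0], 1e-3, h - 1e-3)
    M = section_forces(phi, y_na)[1]
    print(f"phi = {phi:.0e} 1/mm   M = {M / 1e6:7.1f} kN*m")
```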


Indirect tax reform and the role of exemptions

FISCAL STUDIES, Issue 4 2001
John Creedy
Abstract This paper examines the question of whether indirect tax rates should be uniform, using four different modelling strategies. First, marginal tax reform is examined. This is concerned with the optimal direction of small changes in effective indirect tax rates and requires considerably less information than the calculation of optimal rates. Second, the welfare effects of a partial shift from the current indirect tax system in Australia towards a goods and services tax (GST) are considered, with particular emphasis on differences between household types and the role of exemptions. Third, in view of the stress on a distributional role for exemptions of certain goods from a GST, the potential limits to such redistribution are considered. The fourth approach examines the extent of horizontal inequity and reranking that can arise when there are non-uniform tax rates. These inequities arise essentially because of preference heterogeneity. [source]
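
The first of these strategies has a compact computational core. The sketch below computes Feldstein-style distributional characteristics (welfare-weighted expenditure shares) on simulated household data, abstracting entirely from demand responses; the goods, parameters and inequality-aversion weight are invented. A marginal reform would, other things equal, shift taxation away from goods with a high distributional characteristic.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated household budget data: income and spending on three goods
n = 500
income = rng.lognormal(mean=10.0, sigma=0.5, size=n)
spending = {
    "food":   0.30 * income**0.7 * rng.lognormal(0.0, 0.1, n),  # necessity
    "fuel":   0.10 * income**0.9 * rng.lognormal(0.0, 0.1, n),
    "luxury": 0.02 * income**1.4 * rng.lognormal(0.0, 0.1, n),  # luxury
}

# Welfare weights proportional to income^(-eps); eps = inequality aversion
eps = 1.0
w = income**(-eps)
w = w / w.mean()

# Distributional characteristic of a good: welfare-weighted share of its
# total spending. A high value means a marginal tax increase on the good
# falls relatively heavily on poor households per dollar raised.
for name, x in spending.items():
    d = np.sum(w * x) / np.sum(x)
    print(f"{name:7s} distributional characteristic = {d:.3f}")
```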


Modelling blowing snow redistribution to prairie wetlands

HYDROLOGICAL PROCESSES, Issue 18 2009
X. Fang
Abstract Blowing snow transports and sublimates a substantial portion of the seasonal snowfall in the prairies of western Canada. Snow redistribution is an important feature of prairie hydrology, as deep snowdrifts provide a source of meltwater to replenish ponds and generate streamflow in this dry region. The spatial distribution of snow water equivalent in the spring is therefore of great interest. A test of the distributed and aggregated modelling strategies for blowing snow transport and sublimation was conducted at the St. Denis National Wildlife Area in the rolling, internally drained prairie pothole region east of Saskatoon, Saskatchewan, Canada. A LiDAR-based DEM and an aerial photograph-based vegetation cover map were available for this region. A coupled complex windflow and blowing snow model was run with 262,144 6 m × 6 m grid cells to produce spatially distributed estimates of seasonal blowing snow transport and sublimation. The calculation was then aggregated to seven landscape units that represented the major influences of surface roughness, topography and fetch on blowing snow transport and sublimation. Both the distributed and aggregated simulations predicted similar end-of-winter snow water equivalent, with substantial redistribution of blowing snow from exposed, sparsely vegetated sites across topographic drainage divides to the densely vegetated pothole wetlands. Both simulations also agreed well with snow survey observations. While the distributed calculations provide a fascinating and detailed visual image of the interaction of complex landscapes with blowing snow redistribution and sublimation, it is clear that blowing snow transport and sublimation calculations can be successfully aggregated to the spatial scale of the major landscape units in this environment. This means that meso- and macro-scale hydrological models can represent blowing snow redistribution successfully in the prairies. Copyright © 2009 John Wiley & Sons, Ltd. [source]
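
The distributed-versus-aggregated question is at heart about applying a nonlinear point-scale model to landscape-unit averages. The toy sketch below uses an invented transport curve (none of the paper's windflow or blowing-snow physics) to show the comparison being made: model every grid cell and average, versus average the forcing within each landscape unit and model once.

```python
import numpy as np

rng = np.random.default_rng(2)

def transport(exposure):
    """Invented nonlinear point-scale seasonal transport curve (kg per m)."""
    return 120.0 * np.maximum(exposure, 0.0) ** 1.5

# Fine grid: 512 x 512 cells, each belonging to one of 7 landscape units
n, n_units = 512, 7
unit = rng.integers(0, n_units, size=(n, n))
unit_mean = np.linspace(0.2, 1.4, n_units)       # unit-specific mean exposure
exposure = unit_mean[unit] + rng.normal(scale=0.1, size=(n, n))

for u in range(n_units):
    cells = exposure[unit == u]
    distributed = transport(cells).mean()   # model every cell, then average
    aggregated = transport(cells.mean())    # average forcing, model once
    print(f"unit {u}: distributed {distributed:7.1f}   aggregated {aggregated:7.1f}")
```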


Do community-level models describe community variation effectively?

JOURNAL OF BIOGEOGRAPHY, Issue 10 2010
Andrés Baselga
Abstract Aim: The aim of community-level modelling is to improve the performance of species distributional models by taking patterns of co-occurrence among species into account. Here, we test this expectation by examining how well three community-level modelling strategies ('assemble first, predict later', 'predict first, assemble later', and 'assemble and predict together') spatially project the observed composition of species assemblages. Location: Europe. Methods: Variation in the composition of European tree assemblages and its spatial and environmental correlates were examined with cluster analysis and constrained analysis of principal coordinates. Results were used to benchmark spatial projections from three community-based strategies: (1) assemble first, predict later (cluster analysis first, then generalized linear models, GLMs); (2) predict first, assemble later (GLMs first, then cluster analysis); and (3) assemble and predict together (constrained quadratic ordination). Results: None of the community-level modelling strategies was able to accurately model the observed distribution of tree assemblages in Europe. Uncertainty was particularly high in southern Europe, where modelled assemblages were markedly different from observed ones. Assembling first and predicting later led to distribution models with the simultaneous occurrence of several types of assemblages in southern Europe that do not co-occur, and the remaining strategies yielded models with the presence of non-analogue assemblages that presently do not exist and that are much more strongly correlated with environmental gradients than with the real assemblages. Main conclusions: Community-level models were unable to characterize the distribution of European tree assemblages effectively. Models accounting for co-occurrence patterns along environmental gradients did not outperform methods that assume individual responses of species to climate. Unrealistic assemblages were generated because of the models' inability to capture fundamental processes causing patterns of covariation among species. The usefulness of these forms of community-based models thus remains uncertain and further research is required to demonstrate their utility. [source]
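
A minimal sketch of the first two strategies on simulated data follows ('assemble and predict together', i.e. constrained quadratic ordination, is omitted). 'Assemble first, predict later' clusters sites by composition and models cluster membership from the environment; 'predict first, assemble later' fits one GLM per species and clusters the stacked predictions. All data and settings are illustrative; scikit-learn is used.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import adjusted_rand_score

rng = np.random.default_rng(3)

# Simulated data: 300 sites, 2 environmental gradients, 10 species whose
# occurrence probabilities respond logistically to the environment
n_sites, n_spp = 300, 10
env = rng.normal(size=(n_sites, 2))
beta = rng.normal(scale=1.5, size=(2, n_spp))
logit = env @ beta + rng.normal(scale=0.5, size=n_spp)
occ = (rng.random((n_sites, n_spp)) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

k = 4  # number of assemblage types

# Strategy 1: assemble first, predict later
types_obs = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(occ)
glm = LogisticRegression(max_iter=1000).fit(env, types_obs)  # multinomial GLM
types_s1 = glm.predict(env)

# Strategy 2: predict first, assemble later
pred = np.column_stack([
    LogisticRegression(max_iter=1000).fit(env, occ[:, j]).predict_proba(env)[:, 1]
    for j in range(n_spp)
])
types_s2 = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(pred)

# How well does each strategy recover the observed assemblage types?
print("S1 vs observed:", adjusted_rand_score(types_obs, types_s1))
print("S2 vs observed:", adjusted_rand_score(types_obs, types_s2))
```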


Evaluating effectiveness of preoperative testing procedure: some notes on modelling strategies in multi-centre surveys

JOURNAL OF EVALUATION IN CLINICAL PRACTICE, Issue 1 2008
Dario Gregori PhD
Abstract Rationale: In technology assessment in health-related fields, the construction of a model for interpreting the economic implications of the introduction of a technology is only part of the problem. The most important part is often the formulation of a model that can be used for selecting patients to submit to the new cost-saving procedure or medical strategy. The model is usually complicated by the fact that data are often non-homogeneous with respect to some uncontrolled variables and are correlated. The most typical example is the so-called hospital effect in multi-centre studies. Aims and objectives: We show the implications of different choices of modelling strategy when evaluating the usefulness of preoperative chest radiography (POCR), an exam performed before surgery, usually with the aim of detecting unsuspected abnormalities that could influence the anaesthetic management and/or surgical plan. Method: We analyse the data from a multi-centre study including more than 7000 patients. We use about 6000 patients to fit regression models using both a population-averaged and a subject-specific approach. We explore the limitations of these models when used for predictive purposes on a validation set of more than 1000 patients. Results: We show the importance of taking into account the heterogeneity among observations and the correlation structure of the data, and propose an approach for integrating the population-averaged and subject-specific approaches into a single modelling strategy. We find that the hospital represents an important variable causing heterogeneity that influences the probability of a useful POCR. Conclusions: We find that starting with a marginal model, evaluating the shrinkage effect and eventually moving to a more detailed model for the heterogeneity is preferable. This kind of flexible approach seems to be more informative at various phases of the model-building strategy. [source]
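
The population-averaged versus subject-specific contrast can be sketched as follows on simulated multi-centre binary data (not the study's data; all variable names are invented). A GEE with exchangeable within-hospital correlation gives the marginal model; a logistic model with a hospital random intercept, fitted here by variational Bayes in statsmodels, gives the subject-specific one.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf
from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

rng = np.random.default_rng(4)

# Simulated multi-centre data: hospitals differ in their baseline rate of a
# "useful" preoperative chest radiograph (the hospital effect)
n_hosp, n_per = 20, 100
hospital = np.repeat(np.arange(n_hosp), n_per)
u = rng.normal(scale=1.0, size=n_hosp)             # hospital random effects
age = rng.normal(60.0, 10.0, size=n_hosp * n_per)
lin = -2.0 + 0.04 * (age - 60.0) + u[hospital]
useful = (rng.random(lin.size) < 1.0 / (1.0 + np.exp(-lin))).astype(int)
df = pd.DataFrame({"useful": useful, "age": age, "hospital": hospital})

# Population-averaged (marginal) model: GEE, exchangeable within-hospital
gee = smf.gee("useful ~ age", groups="hospital", data=df,
              family=sm.families.Binomial(),
              cov_struct=sm.cov_struct.Exchangeable()).fit()
print(gee.params)

# Subject-specific model: hospital random intercept, variational Bayes fit
mixed = BinomialBayesMixedGLM.from_formula(
    "useful ~ age", {"hosp": "0 + C(hospital)"}, df).fit_vb()
print(mixed.summary())
```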


On the Asymmetric Volatility of Employment Outflows

LABOUR, Issue 4 2002
Gareth Leeves
Recent research into job flow dynamics highlights the asymmetry in aggregate employment adjustment. This has implications for patterns of worker flow adjustment. This paper draws upon modelling strategies developed in the applied finance literature to characterize the asymmetry of aggregate employment outflow volatility. It is found that higher employment outflow volatility is associated with negative shocks, when the outflow is lower than expected. This, it is suggested, could be associated with the dynamic processes linking the hiring and turnover of workers. [source]
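
The asymmetric-volatility machinery borrowed from applied finance is typified by the EGARCH family. The sketch below fits an EGARCH(1,1) with a sign (asymmetry) term to a simulated outflow-growth series using the arch package; this is a generic illustration, not the paper's specification. A significant asymmetry coefficient of the appropriate sign indicates that negative shocks raise volatility more than positive ones.

```python
import numpy as np
import pandas as pd
from arch import arch_model

rng = np.random.default_rng(5)

# Simulated outflow-growth series with asymmetric volatility: negative
# shocks feed next period's variance more strongly than positive ones
n = 600
e = np.zeros(n)
s2 = np.full(n, 2.0)
for t in range(1, n):
    alpha = 0.25 if e[t - 1] < 0 else 0.05   # asymmetric ARCH loading
    s2[t] = 0.1 + 0.80 * s2[t - 1] + alpha * e[t - 1] ** 2
    e[t] = np.sqrt(s2[t]) * rng.standard_normal()
y = pd.Series(e)

# EGARCH(1,1) with an asymmetry (sign) term: o=1
am = arch_model(y, vol="EGARCH", p=1, o=1, q=1, mean="Constant")
res = am.fit(disp="off")
print(res.summary())  # gamma[1] < 0 => negative shocks raise volatility more
```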


Mean Reversion of Interest Rates in the Eurocurrency Market

OXFORD BULLETIN OF ECONOMICS & STATISTICS, Issue 4 2001
Jhy-Lin Wu
One stylised fact to emerge from the empirical analysis of interest rates is that the unit-root hypothesis in nominal interest rates cannot be rejected. However, using the panel data unit-root test of Im, Pesaran and Shin (1997), we find support for the mean-reverting property of Eurocurrency rates. Thus, neither a vector error-correction model nor a vector autoregressive model in differences is appropriate for modelling Eurocurrency rates. Instead, conventional modelling strategies with level data are appropriate. Furthermore, the finding of stationary interest rates supports uncovered interest parity, and hence the convergence hypothesis of interest rates. This in turn suggests a limited role for a monetary authority in affecting domestic interest rates. [source]
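
The Im, Pesaran and Shin statistic is in essence the cross-section average of individual ADF t-statistics. The sketch below computes that t-bar on a simulated panel of mean-reverting series; proper inference additionally requires the standardisation moments or critical values tabulated by Im, Pesaran and Shin, which are omitted here.

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(6)

# Simulated panel of N mean-reverting AR(1) "interest rate" series
N, T, rho, mu = 12, 300, 0.9, 5.0
panel = np.empty((N, T))
for i in range(N):
    y = mu
    for t in range(T):
        y = mu + rho * (y - mu) + rng.normal(scale=0.3)
        panel[i, t] = y

# Individual ADF regressions with intercept; t-bar = average t-statistic
t_stats = [adfuller(panel[i], regression="c", autolag="AIC")[0]
           for i in range(N)]
print(f"t-bar = {np.mean(t_stats):.3f}")
# Strongly negative values (judged against the IPS critical values, not
# computed here) reject the unit root in favour of mean reversion.
```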


Broad Beam Ion Sources for Electrostatic Space Propulsion and Surface Modification Processes: From Roots to Present Applications

CONTRIBUTIONS TO PLASMA PHYSICS, Issue 7 2007
H. Neumann
Abstract Ion thrusters or broad beam ion sources are widely used in electrostatic space propulsion and in high-end surface modification processes. A short historical review of the roots of electric space propulsion is given. In the following, we introduce the electrostatic ion thrusters and broad beam ion sources based on different plasma excitation principles and briefly describe the similarities as well as the differences. Furthermore, an overview of source plasma and ion beam characterisation methods is presented. Apart from that, a beam profile modelling strategy using numerical trajectory codes, as a basis for a special grid system design, is outlined. This modelling represents the basis for adapting a grid system to the required technological demands. Examples of model validation demonstrate their reliability. One of the main challenges in improving ion beam technologies is the customisation of the ion beam properties, e.g. the ion current density profile, for specific demands. Methods of ex-situ and in-situ beam profile control are demonstrated. Examples of the use of ion beam technologies in space and on earth, namely the RIT-10 rescue mission of ESA's satellite Artemis, the RIT-22 for the BepiColombo mission and the deposition of multilayer stacks for EUVL (Extreme Ultra Violet Lithography) mask blank application, are provided in order to illustrate the potential of plasma-based ion beam sources. (© 2007 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim) [source]


A Hertz contact model with non-linear damping for pounding simulation

EARTHQUAKE ENGINEERING AND STRUCTURAL DYNAMICS, Issue 7 2006
Susendar Muthukumar
Abstract This paper investigates the cogency of various impact models in capturing the seismic pounding response of adjacent structures. The analytical models considered include the contact force-based linear spring, Kelvin and Hertz models, and the restitution-based stereomechanical approach. In addition, a contact model based on the Hertz law and using a non-linear hysteresis damper (Hertzdamp model) is also introduced for pounding simulation. Simple analytical approaches are presented to determine the impact stiffness parameters of the various contact models. Parameter studies are performed using two degree-of-freedom linear oscillators to determine the effects of impact modelling strategy, system period ratio, peak ground acceleration (PGA) and energy loss during impact on the system responses. A suite of 27 ground motion records from 13 different earthquakes is used in the analysis. The results indicate that the system displacements from the stereomechanical, Kelvin and Hertzdamp models are similar for a given coefficient of restitution, despite using different impact methodologies. Pounding increases the responses of the stiffer system, especially for highly out-of-phase systems. Energy loss during impact is more significant at higher levels of PGA. Based on the findings, the Hertz model provides adequate results at low PGA levels, and the Hertzdamp model is recommended at moderate and high PGA levels. Copyright © 2006 John Wiley & Sons, Ltd. [source]
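
The contact law at the heart of the Hertzdamp idea can be sketched directly: during contact the force is a Hertz spring with a non-linear hysteresis damper, F = k δ^(3/2) (1 + c δ̇), applied here to two elastic SDOF oscillators separated by a gap. Masses, stiffnesses, the impact stiffness and the pulse-like excitation are all assumed values, and the damping constant is tied to a target coefficient of restitution only in the spirit of the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Two adjacent SDOF oscillators separated by a gap; pounding modelled with a
# Hertz spring plus non-linear hysteresis damper during contact
m1, m2 = 1.0e5, 2.0e5            # masses (kg)
k1, k2 = 4.0e7, 2.5e8            # lateral stiffnesses (N/m), different periods
c1 = 2 * 0.05 * m1 * np.sqrt(k1 / m1)   # 5% viscous damping, each structure
c2 = 2 * 0.05 * m2 * np.sqrt(k2 / m2)
gap = 0.02                        # separation gap (m)
kh = 8.0e9                        # Hertz impact stiffness (N/m^1.5, assumed)
e_rest, v_ref = 0.6, 0.5          # target restitution, reference speed (assumed)
c_h = 3.0 * (1.0 - e_rest**2) / (4.0 * v_ref)   # hysteresis damping constant

def ground_acc(t):
    return 6.0 * np.sin(2 * np.pi * 3.0 * t) * np.exp(-0.2 * t)  # toy pulse

def rhs(t, y):
    x1, v1, x2, v2 = y
    delta, ddot = x1 - x2 - gap, v1 - v2        # interpenetration and its rate
    F = max(kh * delta**1.5 * (1 + c_h * ddot), 0.0) if delta > 0 else 0.0
    a1 = (-k1 * x1 - c1 * v1 - F) / m1 - ground_acc(t)
    a2 = (-k2 * x2 - c2 * v2 + F) / m2 - ground_acc(t)
    return [v1, a1, v2, a2]

sol = solve_ivp(rhs, (0.0, 10.0), [0.0, 0.0, 0.0, 0.0],
                max_step=1e-3, rtol=1e-8)
pen = sol.y[0] - sol.y[2] - gap
print("peak displacement of flexible block:", np.abs(sol.y[0]).max(), "m")
print("number of impacts:", int(np.sum(np.diff((pen > 0).astype(int)) == 1)))
```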


A modelling strategy for the analysis of clinical trials with partly missing longitudinal data

INTERNATIONAL JOURNAL OF METHODS IN PSYCHIATRIC RESEARCH, Issue 3 2003
Ian R. White
Abstract Standard statistical analyses of randomized controlled trials with partially missing outcome data often exclude valuable information from individuals with incomplete follow-up. This may lead to biased estimates of the intervention effect and loss of precision. We consider a randomized trial with a repeatedly measured outcome, in which the value of the outcome on the final occasion is of primary interest. We propose a modelling strategy in which the model is successively extended to include baseline values of the outcome, then intermediate values of the outcome, and finally values of other outcome variables. Likelihood-based estimation of random effects models is used, allowing the incorporation of data from individuals with some missing outcomes. Each estimated intervention effect is free of non-response bias under a different missing-at-random assumption. These assumptions become more plausible as the more complex models are fitted, so we propose using the trend in estimated intervention effects to assess the nature of any non-response bias. The methods are applied to data from a trial comparing intensive case management with standard case management for severely psychotic patients. All models give similar estimates of the intervention effect and we conclude that non-response bias is likely to be small. Copyright © 2003 Whurr Publishers Ltd. [source]
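
The successive-extension idea can be sketched on a simulated two-arm trial with three occasions and missing-at-random dropout (Gaussian outcomes, statsmodels; none of the trial's actual models or data). Each successive model uses more of the observed information, and it is the drift, or lack of it, in the estimated intervention effect across models that the strategy asks us to inspect.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)

# Simulated trial: outcome at occasions t = 0 (baseline), 1, 2; the
# intervention effect grows with time; dropout before the final visit
# depends on the previously observed outcome (missing at random)
n = 400
treat = rng.integers(0, 2, size=n)
b = rng.normal(size=n)                               # subject effects
y = np.empty((n, 3))
for t in range(3):
    y[:, t] = b + 0.5 * t * treat + rng.normal(scale=1.0, size=n)
drop = rng.random(n) < 1.0 / (1.0 + np.exp(2.0 + 0.8 * y[:, 1]))
y[drop, 2] = np.nan

wide = pd.DataFrame({"id": np.arange(n), "treat": treat,
                     "y0": y[:, 0], "y1": y[:, 1], "y2": y[:, 2]})

# Model 1: final outcome only (complete cases)
m1 = smf.ols("y2 ~ treat", data=wide).fit()
# Model 2: adjust for baseline (still complete cases)
m2 = smf.ols("y2 ~ treat + y0", data=wide).fit()
# Model 3: random-effects model on all occasions, using partial follow-up
long = wide.melt(id_vars=["id", "treat"], value_vars=["y0", "y1", "y2"],
                 var_name="occ", value_name="y").dropna()
long["t"] = long["occ"].str[1].astype(int)
long["tx"] = long["treat"] * long["t"]               # treatment-by-time
m3 = smf.mixedlm("y ~ t + tx", data=long, groups="id").fit()

print("final only      :", round(m1.params["treat"], 3))
print("plus baseline   :", round(m2.params["treat"], 3))
print("mixed, all data :", round(2 * m3.params["tx"], 3))  # effect at t = 2
```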


Bananas and petrol: further evidence on the forecasting accuracy of the ABS ,headline' and ,underlying' rates of inflation

JOURNAL OF FORECASTING, Issue 6 2010
Liam J. A. Lenten
Abstract In the light of the still topical nature of 'bananas and petrol' being blamed for driving much of the inflationary pressures in Australia in recent times, the 'headline' and 'underlying' rates of inflation are scrutinised in terms of forecasting accuracy. A general structural time-series modelling strategy is applied to estimate models for alternative types of Consumer Price Index (CPI) measures. From this, out-of-sample forecasts are generated from the various models. The underlying forecasts are subsequently adjusted to facilitate comparison. The Ashley, Granger and Schmalensee (1980) test is then performed to determine whether there is a statistically significant difference between the root mean square errors of the models. The results lend weight to the recent findings of Song (2005) that forecasting models using underlying rates are not systematically inferior to those based on the headline rate. In fact, strong evidence is found that underlying measures produce superior forecasts. Copyright © 2009 John Wiley & Sons, Ltd. [source]
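
A sketch of the general structural (unobserved-components) approach, on simulated headline and underlying inflation series rather than ABS data: fit a local-linear-trend model to each measure, forecast out of sample, and compare accuracy against the headline target. The paper's adjustment of the underlying forecasts and the Ashley, Granger and Schmalensee test are omitted.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(8)

# Simulated quarterly inflation: a common smooth trend; "headline" adds
# large transitory shocks (the bananas and petrol), "underlying" is quieter
T = 120
trend = 0.7 + np.cumsum(rng.normal(scale=0.05, size=T))
underlying = trend + rng.normal(scale=0.15, size=T)
headline = trend + rng.normal(scale=0.45, size=T)

split = 100  # estimation sample; last 20 quarters held out
for name, series in [("headline", headline), ("underlying", underlying)]:
    mod = sm.tsa.UnobservedComponents(series[:split], level="local linear trend")
    res = mod.fit(disp=False)
    fcast = res.forecast(T - split)
    # both models are judged on forecasting the same headline target
    rmse = np.sqrt(np.mean((fcast - headline[split:]) ** 2))
    print(f"{name:10s} model, RMSE against headline target: {rmse:.3f}")
```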


Building neural network models for time series: a statistical approach

JOURNAL OF FORECASTING, Issue 1 2006
Marcelo C. Medeiros
Abstract This paper is concerned with modelling time series by single hidden layer feedforward neural network models. A coherent modelling strategy based on statistical inference is presented. Variable selection is carried out using simple existing techniques. The problem of selecting the number of hidden units is solved by sequentially applying Lagrange multiplier type tests, with the aim of avoiding the estimation of unidentified models. Misspecification tests are derived for evaluating an estimated neural network model. All the tests are entirely based on auxiliary regressions and are easily implemented. A small-sample simulation experiment is carried out to show how the proposed modelling strategy works and how the misspecification tests behave in small samples. Two applications to real time series, one univariate and the other multivariate, are considered as well. Sets of one-step-ahead forecasts are constructed and forecast accuracy is compared with that of other nonlinear models applied to the same series. Copyright © 2006 John Wiley & Sons, Ltd. [source]
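
The model class is easy to exhibit: a single-hidden-layer feedforward network fed with lagged values. The sketch below fits such networks to a simulated nonlinear AR(2) series with scikit-learn, selecting the number of hidden units by simple hold-out validation; the paper's Lagrange-multiplier-based unit selection and auxiliary-regression misspecification tests are not reproduced.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(9)

# Simulated nonlinear AR(2) series with a smooth-transition flavour
T = 1200
y = np.zeros(T)
for t in range(2, T):
    g = 1.0 / (1.0 + np.exp(-5.0 * y[t - 1]))        # regime weight
    y[t] = (0.6 * g - 0.4 * (1 - g)) * y[t - 1] + 0.2 * y[t - 2] \
        + rng.normal(scale=0.3)

# Embed: predict y[t] from (y[t-1], y[t-2])
X = np.column_stack([y[1:-1], y[:-2]])
z = y[2:]
X_tr, X_va, z_tr, z_va = X[:800], X[800:], z[:800], z[800:]

# Choose the number of hidden units on a hold-out sample (the paper adds
# units sequentially, testing each addition with an LM-type statistic)
for h in [1, 2, 4, 8]:
    net = MLPRegressor(hidden_layer_sizes=(h,), activation="logistic",
                       max_iter=5000, random_state=0).fit(X_tr, z_tr)
    mse = np.mean((net.predict(X_va) - z_va) ** 2)
    print(f"hidden units = {h}: validation MSE = {mse:.4f}")
```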


Combining evidence on air pollution and daily mortality from the 20 largest US cities: a hierarchical modelling strategy

JOURNAL OF THE ROYAL STATISTICAL SOCIETY: SERIES A (STATISTICS IN SOCIETY), Issue 3 2000
Francesca Dominici
Reports over the last decade of association between levels of particles in outdoor air and daily mortality counts have raised concern that air pollution shortens life, even at concentrations within current regulatory limits. Criticisms of these reports have focused on the statistical techniques that are used to estimate the pollution–mortality relationship and the inconsistency in findings between cities. We have developed analytical methods that address these concerns and combine evidence from multiple locations to gain a unified analysis of the data. The paper presents log-linear regression analyses of daily time series data from the largest 20 US cities and introduces hierarchical regression models for combining estimates of the pollution–mortality relationship across cities. We illustrate this method by focusing on mortality effects of PM10 (particulate matter less than 10 µm in aerodynamic diameter) and by performing univariate and bivariate analyses with PM10 and ozone (O3) level. In the first stage of the hierarchical model, we estimate the relative mortality rate associated with PM10 for each of the 20 cities by using semiparametric log-linear models. The second stage of the model describes between-city variation in the true relative rates as a function of selected city-specific covariates. We also fit two variations of a spatial model with the goal of exploring the spatial correlation of the pollutant-specific coefficients among cities. Finally, to explore the results of considering the two pollutants jointly, we fit and compare univariate and bivariate models. All posterior distributions from the second stage are estimated by using Markov chain Monte Carlo techniques. In univariate analyses using concurrent day pollution values to predict mortality, we find that an increase of 10 µg m⁻³ in PM10 on average in the USA is associated with a 0.48% increase in mortality (95% interval: 0.05, 0.92). With adjustment for the O3 level the PM10 coefficient is slightly higher. The results are largely insensitive to the specific choice of vague but proper prior distribution. The models and estimation methods are general and can be used for any number of locations and pollutant measurements and have potential applications to other environmental agents. [source]
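
The two-stage logic can be sketched compactly: stage one yields a city-specific log relative rate and standard error from a Poisson log-linear regression (simulated daily data below), and stage two shrinks the estimates toward a common mean under a normal between-city model, estimated here by method of moments rather than the paper's MCMC. The simulation constants are chosen so the pooled effect is of the order reported (about 0.48% per 10 µg m⁻³).

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(10)

# Stage 1: per-city log relative rate of mortality per unit PM10, from a
# Poisson log-linear regression on simulated daily series
n_cities, n_days = 20, 1000
true_mean, true_sd = 0.00048, 0.0002    # heterogeneity across cities
betas, ses = [], []
for c in range(n_cities):
    beta_c = rng.normal(true_mean, true_sd)
    pm10 = rng.gamma(4.0, 8.0, n_days)              # daily PM10 (ug/m^3)
    lam = np.exp(np.log(30.0) + beta_c * pm10)      # ~30 deaths/day baseline
    deaths = rng.poisson(lam)
    X = sm.add_constant(pm10)
    fit = sm.GLM(deaths, X, family=sm.families.Poisson()).fit()
    betas.append(fit.params[1])
    ses.append(fit.bse[1])
betas, ses = np.array(betas), np.array(ses)

# Stage 2: normal between-city model; method-of-moments (DerSimonian-Laird)
w = 1.0 / ses**2
mu_f = np.sum(w * betas) / np.sum(w)
Q = np.sum(w * (betas - mu_f) ** 2)
tau2 = max(0.0, (Q - (n_cities - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
w_s = 1.0 / (ses**2 + tau2)
mu = np.sum(w_s * betas) / np.sum(w_s)
se = np.sqrt(1.0 / np.sum(w_s))
lo, hi = mu - 1.96 * se, mu + 1.96 * se
print(f"pooled: {1000 * mu:.2f}% per 10 ug/m^3 "
      f"(95% CI {1000 * lo:.2f}, {1000 * hi:.2f})")
```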


High dimensional multivariate mixed models for binary questionnaire data

JOURNAL OF THE ROYAL STATISTICAL SOCIETY: SERIES C (APPLIED STATISTICS), Issue 4 2006
Steffen Fieuws
Summary: Questionnaires that are used to measure the effect of an intervention often consist of different sets of items, each set possibly measuring another concept. Mixed models with set-specific random effects are a flexible tool to model the different sets of items jointly. However, computational problems typically arise as the number of sets increases. This is especially true when the random-effects distribution cannot be integrated out analytically, as with mixed models for binary data. A pairwise modelling strategy, in which all possible bivariate mixed models are fitted and where inference follows from pseudolikelihood theory, has been proposed as a solution. This approach has been applied to assess the effect of physical activity on psychocognitive functioning, the latter measured by a battery of questionnaires. [source]
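
The pairwise strategy can be sketched in a Gaussian setting, which keeps each bivariate fit simple while preserving the logic (the paper's binary items make each pair fit harder, not different in kind): simulate correlated set-specific effects, fit every bivariate mixed model, and average each parameter over the pairs in which it appears. Uses statsmodels MixedLM; all data are simulated.

```python
import itertools
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(11)

# Simulated data: n subjects, K item sets, one Gaussian score per set, with
# correlated set-specific random effects across subjects
n, K = 300, 4
mu = np.array([0.0, 0.5, 1.0, 1.5])                  # set-level means
R = 0.6 + 0.4 * np.eye(K)                            # random-effect correlations
b = rng.multivariate_normal(np.zeros(K), R, size=n)
scores = mu + b + rng.normal(scale=0.7, size=(n, K))

long = pd.DataFrame({"id": np.repeat(np.arange(n), K),
                     "set": np.tile(np.arange(K), n),
                     "y": scores.ravel()})

# Pairwise strategy: fit all K(K-1)/2 bivariate mixed models (set-specific
# random effects per subject), then average each set's fixed effect over
# the pairs in which that set appears
est = {j: [] for j in range(K)}
for j, k in itertools.combinations(range(K), 2):
    pair = long[long["set"].isin([j, k])]
    fit = smf.mixedlm("y ~ 0 + C(set)", data=pair, groups="id",
                      re_formula="0 + C(set)").fit()
    for s in (j, k):
        est[s].append(fit.params[f"C(set)[{s}]"])

for j in range(K):
    print(f"set {j}: pairwise estimate = {np.mean(est[j]):.3f} (true {mu[j]:.1f})")
```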


Modelling Multivariate Outcomes in Hierarchical Data, with Application to Cluster Randomised Trials

BIOMETRICAL JOURNAL, Issue 3 2006
Rebecca M. Turner
Abstract In the cluster randomised study design, the data collected have a hierarchical structure and often include multivariate outcomes. We present a flexible modelling strategy that permits several normally distributed outcomes to be analysed simultaneously, in which intervention effects as well as individual-level and cluster-level between-outcome correlations are estimated. This is implemented in a Bayesian framework which has several advantages over a classical approach, for example in providing credible intervals for functions of model parameters and in allowing informative priors for the intracluster correlation coefficients. In order to declare such informative prior distributions, and fit models in which the between-outcome covariance matrices are constrained, priors on parameters within the covariance matrices are required. Careful specification is necessary however, in order to maintain non-negative definiteness and symmetry between the different outcomes. We propose a novel solution in the case of three multivariate outcomes, and present a modified existing approach and novel alternative for four or more outcomes. The methods are applied to an example of a cluster randomised trial in the prevention of coronary heart disease. The modelling strategy presented would also be useful in other situations involving hierarchical multivariate outcomes. (© 2006 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim) [source]