Deterministic Models

Selected Abstracts


Bayesian uncertainty assessment in multicompartment deterministic simulation models for environmental risk assessment

ENVIRONMETRICS, Issue 4 2003
Samantha C. Bates
Abstract We use a special case of Bayesian melding to make inference from deterministic models while accounting for uncertainty in the inputs to the model. The method uses all available information, based on both data and expert knowledge, and extends current methods of 'uncertainty analysis' by updating models using available data. We extend the methodology for use with sequential multicompartment models. We present an application of these methods to deterministic models for concentration of polychlorinated biphenyl (PCB) in soil and vegetables. The results are posterior distributions of concentration in soil and vegetables which account for all available evidence and uncertainty. Model uncertainty is not considered. Copyright © 2003 John Wiley & Sons, Ltd. [source]
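
A minimal sketch of the general idea (not the authors' implementation): draw model inputs from their priors, push them through a deterministic transfer model, and reweight the draws by the likelihood of measurements (sampling-importance resampling), yielding an updated distribution for the model output. The transfer model, parameter values, observations, and units below are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def soil_to_vegetable_model(c_soil, uptake):
    """Toy deterministic compartment model: vegetable PCB concentration
    as a simple linear transfer from soil (purely illustrative)."""
    return uptake * c_soil

# Priors on model inputs (hypothetical values, for illustration only)
n = 50_000
c_soil = rng.lognormal(mean=np.log(0.5), sigma=0.4, size=n)   # mg/kg soil
uptake = rng.uniform(0.01, 0.10, size=n)                      # soil-to-plant transfer factor

c_veg = soil_to_vegetable_model(c_soil, uptake)               # deterministic model output

# Hypothetical measurements of vegetable concentration with known error
obs = np.array([0.020, 0.028, 0.024])
sigma_obs = 0.005
loglik = -0.5 * ((obs[:, None] - c_veg[None, :]) / sigma_obs) ** 2
w = np.exp(loglik.sum(axis=0))
w /= w.sum()

# Sampling-importance resampling gives draws from the updated output distribution
idx = rng.choice(n, size=10_000, replace=True, p=w)
posterior_c_veg = c_veg[idx]
print(posterior_c_veg.mean(), np.quantile(posterior_c_veg, [0.025, 0.975]))
```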


Deterministic and Stochastic Modeling of Aquifer Stratigraphy, South Carolina

GROUND WATER, Issue 2 2000
Russell B. Miller
Deterministic and stochastic methods of three-dimensional hydrogeologic modeling are applied to characterization of contaminated Eocene aquifers at the Savannah River Site, South Carolina. The results address several important issues, including the use of multiple types of data in creating high-resolution aquifer models and the application of sequence-stratigraphic constraints. Specific procedures used include defining grid architecture stratigraphically, upscaling, modeling lithologic properties, and creating multiple equiprobable realizations of aquifer stratigraphy. An important question answered by the study is how to incorporate gamma-ray borehole-geophysical data in areas of anomalous log response, which occurs commonly in aquifers and confining units of the Atlantic Coastal Plain and other areas. To overcome this problem, gamma-ray models were conditioned to grain-size and lithofacies realizations. The investigation contributes to identifying potential pathways for downward migration of contaminants, which have been detected in confined aquifers at the modeling site. The approach followed in this investigation produces quantitative, stratigraphically constrained, geocellular models that incorporate multiple types of data from borehole-geophysical logs and continuous cores. The use of core-based stochastic realizations in conditioning deterministic models provides the advantage of incorporating lithologic information based on direct observations of cores rather than using only indirect measurements from geophysical logs. The high resolution of the models is demonstrated by the representation of thin, discontinuous clay beds that act as local barriers to flow. The models are effective in depicting the contrasts in geometry and heterogeneity between sheet-like nearshore-transgressive sands and laterally discontinuous sands of complex shoreline environments. [source]
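
One small step mentioned above, upscaling core observations onto a stratigraphically defined grid, can be sketched as follows. The depths, facies codes, and layer boundaries are hypothetical, and the rule used (most frequent facies per layer) is just one common choice, not necessarily the one applied in the study.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical core descriptions for one borehole: depth (m) and facies code
# (0 = sand, 1 = clay)
depths = np.sort(rng.uniform(0.0, 30.0, size=200))
facies = (rng.random(200) < 0.3).astype(int)

# Stratigraphically defined layer boundaries for the grid column (m)
layer_tops = np.array([0.0, 5.0, 12.0, 20.0, 30.0])

# Upscale: each model layer takes the most frequent facies among its samples
upscaled = []
for top, base in zip(layer_tops[:-1], layer_tops[1:]):
    in_layer = facies[(depths >= top) & (depths < base)]
    upscaled.append(np.bincount(in_layer, minlength=2).argmax() if in_layer.size else -1)

print(upscaled)   # one facies code per layer, -1 where no samples fall
```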


Maximum likelihood fitting using ordinary least squares algorithms

JOURNAL OF CHEMOMETRICS, Issue 8-10 2002
Rasmus Bro
Abstract In this paper a general algorithm is provided for maximum likelihood fitting of deterministic models subject to Gaussian-distributed residual variation (including any type of non-singular covariance). By deterministic models we mean models in which no distributional assumptions are made about the parameters. The algorithm may also, more generally, be used for weighted least squares (WLS) fitting in situations where distributional assumptions are unavailable or where considerations other than statistical assumptions guide the choice of loss function. The algorithm to solve the associated problem is called MILES (Maximum likelihood via Iterative Least squares EStimation). It is shown that the sought parameters can be estimated using simple least squares (LS) algorithms in an iterative fashion. The algorithm is based on iterative majorization and extends earlier work for WLS fitting of models with heteroscedastic uncorrelated residual variation. The algorithm is shown to include several current algorithms as special cases. For example, maximum likelihood principal component analysis models with and without offsets can be easily fitted with MILES. The MILES algorithm is simple and can be implemented as an outer loop in any least squares algorithm, e.g. for analysis of variance, regression, response surface modeling, etc. Several examples are provided on the use of MILES. Copyright © 2002 John Wiley & Sons, Ltd. [source]
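
A rough sketch of the iterative-majorization idea for one special case, a linear model with correlated Gaussian residuals: each iteration forms a modified response and solves a plain ordinary least squares problem, and the weighted loss decreases monotonically. This is a generic illustration of the principle, not the published MILES code.

```python
import numpy as np

def wls_by_iterated_ols(X, y, W, n_iter=500, tol=1e-12):
    """Minimize (y - Xb)' W (y - Xb) for symmetric positive-definite W
    using repeated ordinary least squares steps (iterative majorization)."""
    lam = np.linalg.eigvalsh(W)[-1]            # largest eigenvalue of W
    b = np.linalg.lstsq(X, y, rcond=None)[0]   # OLS start
    prev = np.inf
    for _ in range(n_iter):
        z = X @ b + (W @ (y - X @ b)) / lam    # majorization: modified response
        b = np.linalg.lstsq(X, z, rcond=None)[0]
        loss = float((y - X @ b) @ W @ (y - X @ b))
        if prev - loss < tol:
            break
        prev = loss
    return b

# Small check against the closed-form WLS solution
rng = np.random.default_rng(0)
X = rng.normal(size=(30, 3))
b_true = np.array([1.0, -2.0, 0.5])
A = rng.normal(size=(30, 30))
W = np.linalg.inv(A @ A.T + 30 * np.eye(30))   # arbitrary SPD weight (inverse covariance)
y = X @ b_true + rng.normal(size=30)
b_direct = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
print(np.allclose(wls_by_iterated_ols(X, y, W), b_direct, atol=1e-4))
```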


Modelling the establishment and spread of autotetraploid plants in a spatially heterogeneous environment

JOURNAL OF EVOLUTIONARY BIOLOGY, Issue 3 2004
B.-H. Li
Abstract The establishment and spread of autotetraploids from an original diploid population in a heterogeneous environment were studied using a stochastic simulation model. Specifically, we investigated the effects of heterogeneous habitats and nonrandom pollen/seed dispersal on the critical level of unreduced 2n gamete production necessary for the establishment of autotetraploids as predicted by deterministic models. Introduction of a heterogeneous environment with random pollen/seed dispersal had little effect on this critical value. In contrast, incorporating nonrandom pollen/seed dispersal into a homogeneous environment considerably reduced it. Incorporating both heterogeneous habitats and nonrandom pollen/seed dispersal may lead either to an increase or to a decrease in the critical value compared to that with random dispersal, indicating that the two factors interact in a complex way. [source]
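
The deterministic baseline referred to here can be illustrated with a minimal non-spatial recursion (random mating, complete triploid block), in which tetraploids take over only when unreduced 2n gamete production exceeds a critical level, roughly 0.17 in this formulation. The gamete bookkeeping and parameter values below are assumptions for illustration and are not the authors' spatial, stochastic simulation.

```python
def tetraploid_recursion(x, u):
    """One generation of a minimal diploid/tetraploid model: diploids produce
    a fraction u of unreduced (2n) gametes, triploids are inviable."""
    h = (1 - x) * (1 - u)       # haploid (n) gametes from diploids
    d2 = (1 - x) * u + x        # unreduced (2n) gametes plus tetraploid gametes
    # viable offspring: n x n -> diploid, 2n x 2n -> tetraploid; n x 2n triploids die
    return d2 ** 2 / (h ** 2 + d2 ** 2)

def establishes(u, x0=0.05, generations=2000):
    """True if tetraploids effectively replace diploids from a small start."""
    x = x0
    for _ in range(generations):
        x = tetraploid_recursion(x, u)
    return x > 0.99

# Scan unreduced-gamete production rates to locate the critical threshold
for u in (0.10, 0.15, 0.17, 0.18, 0.20):
    print(u, establishes(u))
```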


Characterizing, Propagating, and Analyzing Uncertainty in Life-Cycle Assessment: A Survey of Quantitative Approaches

JOURNAL OF INDUSTRIAL ECOLOGY, Issue 1 2007
Shannon M. Lloyd
Summary Life-cycle assessment (LCA) practitioners build models to quantify resource consumption, environmental releases, and potential environmental and human health impacts of product systems. Most often, practitioners define a model structure, assign a single value to each parameter, and build deterministic models to approximate environmental outcomes. This approach fails to capture the variability and uncertainty inherent in LCA. To make good decisions, decision makers need to understand the uncertainty in and divergence between LCA outcomes for different product systems. Several approaches for conducting LCA under uncertainty have been proposed and implemented. For example, Monte Carlo simulation and fuzzy set theory have been applied in a limited number of LCA studies. These approaches are well understood and are generally accepted in quantitative decision analysis. But they do not guarantee reliable outcomes. A survey of approaches used to incorporate quantitative uncertainty analysis into LCA is presented. The suitability of each approach for providing reliable outcomes and enabling better decisions is discussed. Approaches that may lead to overconfident or unreliable results are discussed and guidance for improving uncertainty analysis in LCA is provided. [source]
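
A toy example of the Monte Carlo approach mentioned above: propagate parameter distributions for two hypothetical product systems through a trivially small inventory model and compare the resulting global-warming scores, including the probability that one system outperforms the other. All parameter values and distributions are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical unit-process parameters (illustrative values, not real LCA data)
# Product A: electricity use (kWh/unit) and grid emission factor (kg CO2e/kWh)
elec_A = rng.triangular(1.8, 2.0, 2.4, size=n)
ef_grid = rng.lognormal(np.log(0.50), 0.10, size=n)
# Product B: material mass (kg/unit) and material emission factor (kg CO2e/kg)
mass_B = rng.normal(0.9, 0.05, size=n)
ef_mat = rng.lognormal(np.log(1.10), 0.15, size=n)

gwp_A = elec_A * ef_grid          # kg CO2e per functional unit
gwp_B = mass_B * ef_mat

print("P(A < B) =", np.mean(gwp_A < gwp_B))
print("A 95% interval:", np.quantile(gwp_A, [0.025, 0.975]))
print("B 95% interval:", np.quantile(gwp_B, [0.025, 0.975]))
```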


A backoff strategy for model-based experiment design under parametric uncertainty

AICHE JOURNAL, Issue 8 2010
Federico Galvanin
Abstract Model-based experiment design techniques are an effective tool for the rapid development and assessment of dynamic deterministic models, yielding the most informative process data to be used for the estimation of the process model parameters. A particular advantage of the model-based approach is that it permits the definition of a set of constraints on the experiment design variables and on the predicted responses. However, uncertainty in the model parameters can lead the constrained design procedure to predict experiments that turn out to be, in practice, suboptimal, thus decreasing the effectiveness of the experiment design session. Additionally, in the presence of parametric mismatch, the feasibility constraints may well turn out to be violated when that optimally designed experiment is performed, leading in the best case to less informative data sets or, in the worst case, to an infeasible or unsafe experiment. In this article, a general methodology is proposed to formulate and solve the experiment design problem by explicitly taking into account the presence of parametric uncertainty, so as to ensure both feasibility and optimality of the planned experiment. A prediction of the system responses for the given parameter distribution is used to evaluate and update suitable backoffs from the nominal constraints, which are used in the design session to keep the system within a feasible region with specified probability. This approach is particularly useful when designing optimal experiments starting from limited preliminary knowledge of the parameter set, with great improvement in terms of design efficiency and flexibility of the overall iterative model development scheme. The effectiveness of the proposed methodology is demonstrated and discussed by simulation through two illustrative case studies concerning the parameter identification of physiological models related to diabetes and cancer care. © 2009 American Institute of Chemical Engineers AIChE J, 2010 [source]
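
A schematic of the backoff idea under stated assumptions: sample parameters from their (assumed) uncertainty distribution, evaluate the constrained response for a candidate design, and shrink the nominal constraint by a quantile-based margin so that it holds with the required probability. The response model, parameter distribution, and constraint below are hypothetical, not the physiological models of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def response(design, theta):
    """Toy model response at the end of an experiment (illustrative only)."""
    u = design                              # single design variable, e.g. an input dose
    return theta[0] * u / (1.0 + theta[1] * u)

# Parametric uncertainty from a preliminary estimation step (hypothetical)
theta_nom = np.array([2.0, 0.5])
theta_samples = rng.multivariate_normal(theta_nom, [[0.04, 0.0], [0.0, 0.01]], size=5000)

y_max_nominal = 1.8                         # nominal upper constraint on the response
alpha = 0.95                                # required probability of feasibility

def backoff(design):
    """Margin such that P(y <= y_nominal + margin) >= alpha under uncertainty."""
    y = np.array([response(design, th) for th in theta_samples])
    return np.quantile(y - response(design, theta_nom), alpha)

# Apply the backed-off constraint when screening candidate designs
for u in np.linspace(0.5, 4.0, 8):
    feasible = response(u, theta_nom) + backoff(u) <= y_max_nominal
    print(f"u = {u:.2f}  feasible with prob >= {alpha}: {feasible}")
```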


Students' levels of explanations, models, and misconceptions in basic quantum chemistry: A phenomenographic study

JOURNAL OF RESEARCH IN SCIENCE TEACHING, Issue 5 2009
Christina Stefani
We investigated students' knowledge constructions of basic quantum chemistry concepts, namely atomic orbitals, the Schrödinger equation, molecular orbitals, hybridization, and chemical bonding. Ausubel's theory of meaningful learning provided the theoretical framework and phenomenography the method of analysis. Semi-structured interviews with 19 second-year chemistry students supplied the data. We identified four levels of explanations in the students' answers. In addition, the scientific knowledge claims reflected three main levels of models. By combining levels of explanations with levels of models, we derived four categories. Two of the categories are shades of variation in the rote-learning part of a continuum, while the other two categories are in the meaningful-learning part. All students possessed alternative conceptions, some of which occurred within certain categories, while others spanned several categories. Insistence on deterministic models of the atom, misinterpretation of models, and poor understanding of current quantum concepts are the main problems in learning basic quantum chemistry concepts. © 2009 Wiley Periodicals, Inc. J Res Sci Teach 46: 520-536, 2009 [source]


A Bayesian hierarchical approach to ensemble weather forecasting

JOURNAL OF THE ROYAL STATISTICAL SOCIETY: SERIES C (APPLIED STATISTICS), Issue 3 2010
A. F. Di Narzo
Summary In meteorology, the traditional approach to forecasting employs deterministic models mimicking atmospheric dynamics. Forecast uncertainty due to partial knowledge of the initial conditions is tackled by ensemble prediction systems. Probabilistic forecasting is a relatively new approach which may properly account for all sources of uncertainty. We propose a hierarchical Bayesian model which develops this idea and makes it possible to deal with ensemble prediction systems with non-identifiable members by using a suitable definition of the second level of the model. An application to Italian small-scale temperature data is shown. [source]
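
A drastically simplified sketch of a two-level normal model with exchangeable (non-identifiable) ensemble members, with variances treated as known to keep the update conjugate; the numbers are invented and the model in the paper is considerably richer.

```python
import numpy as np

# Hypothetical 2 m temperature ensemble for one site and lead time (degrees C)
ensemble = np.array([14.2, 15.1, 13.8, 14.9, 15.4, 14.6])

# Two-level normal model with exchangeable members:
#   member_j | mu ~ N(mu, tau^2),  y | mu ~ N(mu, sigma^2),  mu ~ N(m0, s0^2)
tau, sigma = 1.0, 0.8          # ensemble spread and observation error (assumed known)
m0, s0 = 15.0, 3.0             # climatological prior for mu (assumed)

J = ensemble.size
post_prec = 1.0 / s0**2 + J / tau**2
post_var = 1.0 / post_prec
post_mean = post_var * (m0 / s0**2 + ensemble.sum() / tau**2)

# Probabilistic forecast: predictive distribution of the verifying temperature
pred_sd = np.sqrt(post_var + sigma**2)
print(f"forecast: {post_mean:.2f} +/- {1.96 * pred_sd:.2f} C (95% interval)")
```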


The genetic architecture of disease resistance in plants and the maintenance of recombination by parasites

MOLECULAR ECOLOGY, Issue 1 2001
Paula X. Kover
Abstract Parasites represent strong selection on host populations because they are ubiquitous and can drastically reduce host fitness. It has been hypothesized that parasite selection could explain the widespread occurrence of recombination because it is a coevolving force that favours new genetic combinations in the host. A review of deterministic models for the maintenance of recombination reveals that for recombination to be favoured, multiple genes that interact with each other must be under selection. To evaluate whether parasite selection can explain the maintenance of recombination, we review 85 studies that investigated the genetic architecture of plant disease resistance and discuss whether they conform to the requirements that emerge from theoretical models. General characteristics of disease resistance in plants and problems in evaluating resistance experimentally are also discussed. We found strong evidence that disease resistance in plants is determined by multiple loci. Furthermore, in most cases where loci were tested for interactions, epistasis between loci that affect resistance was found. However, we found weak support for the idea that specific allelic combinations determine resistance to different parasite genotypes, and there was little data on whether epistasis between resistance genes is negative or positive. Thus, the current data indicate that it is possible that parasite selection can favour recombination, but more studies in natural populations that specifically address the nature of the interactions between resistance genes are necessary. The data summarized here suggest that disease resistance is a complex trait and that environmental effects and fitness trade-offs should be considered in future models of the coevolutionary dynamics of host and parasites. [source]
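
A small two-locus haploid recursion illustrates why interaction between loci is the key ingredient in these models: with epistatic fitnesses, the fate of a favourable allele combination depends strongly on the recombination rate. The fitness values and starting frequencies below are illustrative, and this static-selection sketch is not a host-parasite coevolutionary model, only the basic machinery such models build on.

```python
import numpy as np

def next_gen(x, w, r):
    """One generation of a standard two-locus haploid model:
    selection on haplotypes AB, Ab, aB, ab, then recombination at rate r."""
    xs = x * w / np.dot(x, w)                       # selection
    D = xs[0] * xs[3] - xs[1] * xs[2]               # linkage disequilibrium
    return xs + r * D * np.array([-1.0, 1.0, 1.0, -1.0])

# Epistatic fitnesses (illustrative): A and B are only beneficial together
w = np.array([1.10, 0.95, 0.95, 1.00])              # AB, Ab, aB, ab
x0 = np.array([0.04, 0.16, 0.16, 0.64])             # starting haplotype frequencies

for r in (0.0, 0.5):
    x = x0.copy()
    for _ in range(200):
        x = next_gen(x, w, r)
    print(f"r = {r}: haplotype frequencies after 200 generations: {np.round(x, 3)}")
```

With tight linkage (r = 0) the favoured AB combination spreads, whereas free recombination (r = 0.5) continually breaks it apart, which is why the deterministic models require epistatically interacting loci (and, for recombination to be favoured, selection that fluctuates over time) rather than independent resistance genes.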


Extension of ideal free resource use to breeding populations and metapopulations

OIKOS, Issue 1 2000
C. Patrick Doncaster
The concept of an ideal and free use of limiting resources is commonly invoked in behavioural ecology as a null model for predicting the distribution of foraging consumers across heterogeneous habitat. In its original conception, however, its predictions were applied to the longer timescales of habitat selection by breeding birds. Here I present a general model of ideal free resource use, which encompasses classical deterministic models for the dynamics in continuous time of feeding aggregations, breeding populations and metapopulations. I illustrate its key predictions using the consumer functional response given by Holling's disc equation. The predictions are all consistent with classical population dynamics, but at least two of them are not usually recognised as pertaining across all scales. At the fine scale of feeding aggregations, the steady state of an equal intake for all ideal free consumers may be intrinsically unstable, if patches are efficiently exploited by individuals with a non-negligible handling time of resources. At coarser scales, classical models of population and metapopulation dynamics assume exploitation of a homogeneous environment, yet they can yield testable predictions for heterogeneous environments too under the assumption of ideal free resource use. [source]
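
A minimal numerical version of the fine-scale prediction, assuming a continuous-input habitat in which each consumer's resource share enters Holling's disc equation: consumers added one at a time to the currently best patch end up matching patch input rates while equalizing per-capita intake. Parameter values are illustrative assumptions.

```python
import numpy as np

def intake(share, a=1.0, h=0.2):
    """Per-capita intake from Holling's disc equation (a = attack rate,
    h = handling time), applied to each consumer's resource share."""
    return a * share / (1.0 + a * h * share)

def ideal_free_distribution(inputs, n_consumers):
    """Sequentially add consumers, each joining the patch that currently
    offers the highest per-capita intake (a simple IFD construction)."""
    counts = np.zeros_like(inputs, dtype=int)
    for _ in range(n_consumers):
        gains = intake(inputs / (counts + 1))       # intake if one more joins
        counts[np.argmax(gains)] += 1
    return counts

inputs = np.array([12.0, 6.0, 3.0])                 # resource input rates per patch
counts = ideal_free_distribution(inputs, 42)
print("consumers per patch:", counts)
print("per-capita intakes:", np.round(intake(inputs / counts), 3))
```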


Modelling the impact of vaccination on the epidemiology of varicella zoster virus in Australia

AUSTRALIAN AND NEW ZEALAND JOURNAL OF PUBLIC HEALTH, Issue 6 2005
Heather F. Gidding
Objective: To model the impact of universal varicella vaccination in Australia. Methods: The results of an Australia-wide serosurvey for varicella zoster virus (VZV) immunity were used to parameterise realistic, age-structured deterministic models (RAS) developed by Brisson and colleagues. We examined the impact of a vaccination program for one-year-olds alone, and with a catch-up campaign for 11-year-olds, on the incidence of varicella and zoster, using Australia's population structure. Morbidity was then determined by calculating the number of hospital in-patient days. Results: Infant vaccination is predicted to reduce the incidence of varicella. However, zoster incidence is expected to increase initially, assuming exposure to varicella boosts immunity to zoster. Accumulated morbidity from both varicella and zoster is predicted to remain above that expected without vaccination for the first 70 years of an infant program (assuming 90% coverage with boosting for 20 years). However, after 70 years the net health savings from vaccination are predicted to increase substantially. Conclusions and Implications: Infant vaccination is expected to be a successful long-term commitment to reducing morbidity associated with VZV infection in Australia. [source]
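
A deliberately crude, non-age-structured compartmental sketch of how infant vaccination coverage enters such a model; it ignores zoster, exogenous boosting, and age structure, all of which are central to the RAS models actually used, and every rate and initial condition below is an assumption for illustration.

```python
# Simplified SIR sketch of varicella transmission with vaccination of newborns
beta = 0.55                # transmission rate (per day), illustrative
gamma = 1.0 / 7.0          # recovery rate (per day)
mu = 1.0 / (80 * 365.0)    # birth and death rate (per day)

def mean_annual_incidence(coverage, years=50):
    S, I, R = 0.26, 0.0002, 0.7398          # roughly endemic starting state (assumed)
    dt = 0.5                                # time step (days)
    incidence = 0.0
    for _ in range(int(years * 365 / dt)):
        new_inf = beta * S * I * dt
        dS = (mu * (1.0 - coverage) - mu * S) * dt - new_inf
        dI = new_inf - (gamma + mu) * I * dt
        dR = (mu * coverage + gamma * I - mu * R) * dt
        S, I, R = S + dS, I + dI, R + dR
        incidence += new_inf
    return incidence / years                # mean annual infections (fraction of population)

for c in (0.0, 0.90):
    print(f"coverage {c:.0%}: mean annual varicella incidence ~ {mean_annual_incidence(c):.4f}")
```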


The frequency of the sickle allele in Jamaica has not declined over the last 22 years

BRITISH JOURNAL OF HAEMATOLOGY, Issue 6 2005
N. A. Hanchard
Summary The 'malaria hypothesis' predicts that the frequency of the sickle allele, which is high in malaria-endemic African populations, should decline with each generation in populations of African descent living in areas where malaria is no longer endemic. In order to determine whether this has been the case in Jamaica, we compared haemoglobin electrophoresis results from two hospital-based screening programmes separated by more than 20 years (i.e. approximately one generation). The first comprised 100 000 neonates screened between 1973 and 1981, the second, 104 183 neonates screened between 1995 and 2003. The difference in frequency of the sickle allele was small (5·47% in the first cohort and 5·38% in the second screening cohort) and not significant (Z = 1·23, P = 0·22). The same was true of the sickle trait frequency (10·05% in the first cohort and 9·85% in the second, Z = 1·45, P = 0·15). These differences were smaller than predicted under simple deterministic models based on the malaria hypothesis, and suggest that these models may not capture important determinants of allele and trait frequency decline (or persistence) in contemporary populations. Refining the expectations for allele and trait frequency change for Jamaica and other similar populations is an area for future study. [source]
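
The kind of simple deterministic prediction alluded to can be written down directly: with malaria absent, assume no heterozygote advantage and selection only against SS homozygotes, then iterate the standard recursion one generation from the first cohort's allele frequency. The selection coefficient below is an assumption for illustration, not a value from the paper.

```python
# One-generation prediction from a simple deterministic model of selection
# against SS homozygotes (no heterozygote advantage once malaria is absent).
q0 = 0.0547          # sickle allele frequency in the 1973-1981 cohort
s = 0.8              # assumed fitness reduction of the SS genotype (illustrative)

q1 = q0 * (1 - s * q0) / (1 - s * q0 ** 2)     # standard recursion for selection
                                               # against a recessive genotype
trait1 = 2 * q1 * (1 - q1)                     # expected sickle trait (AS) frequency

print(f"predicted allele frequency after one generation: {q1:.4f} (observed 0.0538)")
print(f"predicted trait frequency: {trait1:.4f} (observed 0.0985)")
```

Under these assumptions the model predicts a larger one-generation decline than the screening programmes observed, which is the discrepancy the abstract highlights.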