Stochastic Component (stochastic + component)

Selected Abstracts


The individual tolerance concept is not the sole explanation for the probit dose-effect model

ENVIRONMENTAL TOXICOLOGY & CHEMISTRY, Issue 2 2000
Michael C. Newman
Abstract Predominant methods for analyzing dose- or concentration-effect data (i.e., probit analysis) are based on the concept of individual tolerance or individual effective dose (IED, the smallest characteristic dose needed to kill an individual). An alternative explanation (the stochasticity hypothesis) is that individuals do not have unique tolerances: death results from stochastic processes occurring similarly in all individuals. These opposing hypotheses were tested with two types of experiments. First, time to stupefaction (TTS) was measured for zebra fish (Brachydanio rerio) exposed to benzocaine. The same 40 fish were exposed during five trials to test whether the same order of TTS was maintained among trials. The IED hypothesis was supported, with a minor stochastic component present. Second, eastern mosquitofish (Gambusia holbrooki) were exposed to sublethal or lethal NaCl concentrations until a large proportion of the lethally exposed fish died. After sufficient time for recovery, the sublethally exposed fish and the fish surviving lethal exposure were exposed simultaneously to lethal NaCl concentrations. No statistically significant effect of previous exposure on survival time was found, but a large stochastic component in the survival dynamics was obvious. Repetition of this second type of test with pentachlorophenol also provided no support for the IED hypothesis. We conclude that neither hypothesis alone was the sole or dominant explanation for the lognormal (probit) model. Determining the correct explanation (IED or stochastic), or the relative contribution of each, is crucial to predicting consequences to populations after repeated or chronic exposures to any particular toxicant. [source]
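The distinction at issue is easy to see in simulation. The sketch below (not from the paper; all parameter values are illustrative assumptions) shows why single-exposure mortality data cannot separate the two hypotheses: a population with fixed lognormal tolerances (IED) and a population of identical individuals that each die with probability given by the same lognormal CDF (stochastic) produce the same dose-mortality curve, which is why repeated exposure of the same individuals, as in the experiments described above, is needed to tell them apart.

```python
# Minimal simulation sketch (not from the paper): the IED mechanism and a
# purely stochastic mechanism generate the same lognormal (probit)
# dose-mortality curve.  mu, sigma, the doses and n are illustrative.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
mu, sigma = np.log(10.0), 0.5       # assumed lognormal tolerance parameters
doses = np.array([2.0, 5.0, 10.0, 20.0, 50.0])
n = 10_000                          # individuals per dose group

def mortality_ied(d):
    """IED: each individual has a fixed tolerance; it dies iff dose >= tolerance."""
    tolerances = rng.lognormal(mean=mu, sigma=sigma, size=n)
    return np.mean(d >= tolerances)

def mortality_stochastic(d):
    """Stochastic: identical individuals, each dies independently with
    probability p(d) = Phi((ln d - mu) / sigma), the same lognormal CDF."""
    p = norm.cdf((np.log(d) - mu) / sigma)
    return np.mean(rng.random(n) < p)

for d in doses:
    print(f"dose {d:5.1f}:  IED {mortality_ied(d):.3f}   "
          f"stochastic {mortality_stochastic(d):.3f}   "
          f"probit curve {norm.cdf((np.log(d) - mu) / sigma):.3f}")
```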


Point process methodology for on-line spatio-temporal disease surveillance

ENVIRONMETRICS, Issue 5 2005
Peter Diggle
Abstract We formulate the problem of on-line spatio-temporal disease surveillance in terms of predicting spatially and temporally localised excursions over a pre-specified threshold value for the spatially and temporally varying intensity of a point process in which each point represents an individual case of the disease in question. Our point process model is a non-stationary log-Gaussian Cox process in which the spatio-temporal intensity, λ(x,t), has a multiplicative decomposition into two deterministic components, one describing purely spatial and the other purely temporal variation in the normal disease incidence pattern, and an unobserved stochastic component representing spatially and temporally localised departures from the normal pattern. We give methods for estimating the parameters of the model, and for making probabilistic predictions of the current intensity. We describe an application to on-line spatio-temporal surveillance of non-specific gastroenteric disease in the county of Hampshire, UK. The results are presented as maps of exceedance probabilities, P{R(x,t) > c | data}, where R(x,t) is the current realisation of the unobserved stochastic component of λ(x,t) and c is a pre-specified threshold. These maps are updated automatically in response to each day's incident data using a web-based reporting system. Copyright © 2005 John Wiley & Sons, Ltd. [source]
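As a rough illustration of how such an exceedance map is computed, the sketch below stands in for the paper's predictive machinery with direct Monte Carlo draws of a Gaussian field S on a small grid; the covariance model, its parameters, the grid and the threshold c are illustrative assumptions, not values from the Hampshire application. Each grid cell's exceedance probability is simply the fraction of draws for which R = exp(S) exceeds c.

```python
# Minimal sketch of the exceedance-probability idea: lambda(x,t) is
# decomposed as lambda0(x) * mu0(t) * exp(S(x,t)), and the map reports
# P{R(x,t) > c | data} with R = exp(S).  Draws of S below replace the
# paper's MCMC predictive samples; all parameters are assumed.
import numpy as np

rng = np.random.default_rng(7)
nx = ny = 20
xs, ys = np.meshgrid(np.linspace(0, 1, nx), np.linspace(0, 1, ny))
coords = np.column_stack([xs.ravel(), ys.ravel()])

sigma2, phi = 0.5, 0.15            # assumed variance and correlation range
d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
cov = sigma2 * np.exp(-d / phi)    # exponential spatial covariance
L = np.linalg.cholesky(cov + 1e-8 * np.eye(len(coords)))

c, M = 2.0, 500                    # threshold and number of predictive draws
S = L @ rng.standard_normal((len(coords), M))   # columns are draws of S(x, t)
R = np.exp(S)                                   # relative-risk surface
exceed_prob = (R > c).mean(axis=1).reshape(ny, nx)

print("max exceedance probability on the grid:", exceed_prob.max().round(3))
```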


Stochastic modelling of global solar radiation measured in the state of Kuwait

ENVIRONMETRICS, Issue 7 2002
S. A. Al-Awadhi
Abstract Two stochastic models that capture the main features of daily global solar radiation in Kuwait are proposed. The development of these models is based on removing the annual periodicity and seasonal variation of solar radiation, so the daily radiation is decomposed as the sum of a trend component and a stochastic component. In many situations there are dramatic changes in the radiation series through the year due to weather conditions, as is the case for the data from Kuwait. This would affect the accuracy of the model, and therefore the series is divided into two regimes: one corresponds to clear days, where the value of the global radiation is normal, and the other to non-clear days, where the value of global radiation is very low. The trend component is then expressed as a Fourier series taking into account such apparent breaks in the series. The stochastic component is first tested for linearity and Gaussianity and is found not to satisfy these assumptions. A linear time series (ARMA) model may therefore not be adequate and, to overcome this problem, a bilinear time series model is used for the stochastic component of daily global radiation in Kuwait. The method proposed first fits an AR model to the data and then examines whether a further reduction in the mean sum of squares can be achieved by introducing extra bilinear terms. The Akaike Information Criterion (AIC) is used to select the best model. Copyright © 2002 John Wiley & Sons, Ltd. [source]
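The AR-versus-bilinear comparison described above can be sketched in a few lines. The example below uses a synthetic series in place of the Kuwait data, an assumed AR order, and a single bilinear term estimated in two stages (AR residuals standing in for the unobserved innovations); AIC is computed from the residual variance.

```python
# Minimal sketch of the model-selection step: fit AR(p) to the detrended
# stochastic component by least squares, then check whether adding a
# bilinear term x_{t-1} * e_{t-1} lowers the AIC.  The series, the order
# p = 2 and the single bilinear term are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n = 1000
x = np.zeros(n)
e = rng.standard_normal(n)
for t in range(2, n):                          # synthetic stand-in series
    x[t] = 0.6 * x[t-1] - 0.2 * x[t-2] + 0.3 * x[t-1] * e[t-1] + e[t]

def aic(resid, k):
    """Gaussian AIC: n * log(residual variance) + 2 * (number of parameters)."""
    return len(resid) * np.log(np.mean(resid**2)) + 2 * k

p = 2
y = x[p:]
X_ar = np.column_stack([x[p-i:n-i] for i in range(1, p+1)])   # lagged regressors
beta_ar, *_ = np.linalg.lstsq(X_ar, y, rcond=None)
resid_ar = y - X_ar @ beta_ar

# Two-stage bilinear extension: use the AR residuals as proxy innovations
# e_{t-1} and add the interaction x_{t-1} * e_{t-1} as an extra regressor.
e_hat = np.concatenate([np.zeros(p), resid_ar])               # align with x
X_bl = np.column_stack([X_ar, x[p-1:n-1] * e_hat[p-1:n-1]])
beta_bl, *_ = np.linalg.lstsq(X_bl, y, rcond=None)
resid_bl = y - X_bl @ beta_bl

print("AIC AR(2):         ", round(aic(resid_ar, p), 1))
print("AIC AR(2)+bilinear:", round(aic(resid_bl, p + 1), 1))
```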


Sensitivity analysis for incomplete contingency tables: the Slovenian plebiscite case

JOURNAL OF THE ROYAL STATISTICAL SOCIETY: SERIES C (APPLIED STATISTICS), Issue 1 2001
Geert Molenberghs
Classical inferential procedures induce conclusions from a set of data to a population of interest, accounting for the imprecision resulting from the stochastic component of the model. Less attention is devoted to the uncertainty arising from (unplanned) incompleteness in the data. Through the choice of an identifiable model for non-ignorable non-response, one narrows the possible data-generating mechanisms to the point where inference only suffers from imprecision. Some proposals have been made for assessing the sensitivity to these modelling assumptions; many are based on fitting several plausible but competing models. For example, we could assume that the missing data are missing at random in one model, and then fit an additional model where non-random missingness is assumed. On the basis of data from a Slovenian plebiscite, conducted in 1991 to prepare for independence, it is shown that such an ad hoc procedure may be misleading. We propose an approach which identifies and incorporates both sources of uncertainty in inference: imprecision due to finite sampling and ignorance due to incompleteness. A simple sensitivity analysis considers a finite set of plausible models. We take this idea one step further by considering more degrees of freedom than the data support. This produces sets of estimates (regions of ignorance) and sets of confidence regions (combined into regions of uncertainty). [source]
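A stripped-down numerical illustration of the ignorance/uncertainty distinction, using hypothetical counts rather than the plebiscite data: sweeping the assumed probability that a nonrespondent would answer "yes" from 0 to 1 traces out a range of point estimates (a region of ignorance), and attaching an ordinary confidence interval to each of those estimates yields a wider region of uncertainty. The paper's models are considerably richer; this sketch shows only the basic mechanics.

```python
# Minimal sketch of "region of ignorance" vs "region of uncertainty" for a
# binary outcome with nonresponse.  The counts are hypothetical, NOT the
# Slovenian plebiscite data, and each per-model interval is a plain Wald CI.
import numpy as np

n_yes, n_no, n_miss = 1200, 300, 500          # hypothetical observed counts
n_total = n_yes + n_no + n_miss

estimates, lowers, uppers = [], [], []
for q in np.linspace(0.0, 1.0, 101):          # q = assumed P(yes | nonresponse)
    p_hat = (n_yes + q * n_miss) / n_total    # point estimate under this model
    se = np.sqrt(p_hat * (1 - p_hat) / n_total)
    estimates.append(p_hat)
    lowers.append(p_hat - 1.96 * se)
    uppers.append(p_hat + 1.96 * se)

print(f"region of ignorance  : [{min(estimates):.3f}, {max(estimates):.3f}]")
print(f"region of uncertainty: [{min(lowers):.3f}, {max(uppers):.3f}]")
```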


Temporal changes in replicated experimental stream fish assemblages: predictable or not?

FRESHWATER BIOLOGY, Issue 9 2006
WILLIAM J.
Summary 1. Natural aquatic communities or habitats cannot be fully replicated in the wild, so little is known about how initially identical communities might change over time, or the extent to which observed changes in community structure are caused by internal factors (such as interspecific interactions or traits of individual species) versus factors external to the local community (such as abiotic disturbances or invasions of new species). 2. We quantified changes in seven initially identical fish assemblages, in habitats that were as similar as possible, in seminatural artificial streams in a 388-day trial (May 1998 to May 1999), and compared the change to that in fish assemblages in small pools of a natural stream during a year. The experimental design excluded floods, droughts, immigration or emigration. The experimental fish communities diverged significantly in composition and exhibited dissimilar trajectories in multivariate species space. Divergence among the assemblages increased from May through August, but not thereafter. 3. Differences among the experimental assemblages were influenced by differences that developed during the year in algae cover and in potential predation (due to differential survival of sunfish among units). 4. In the natural stream, fish assemblages in small pools changed more than those in the experimental units, suggesting that in natural assemblages external factors exacerbated temporal variation. 5. Our finding that initially identical assemblages, isolated from most external factors, would diverge in structure over time suggests a lack of strong internal, deterministic controls in the assemblages, and that idiosyncratic or stochastic components (chance encounters among species; vagaries in changes in the local habitat), even within habitat patches, can play an important role in assemblage structure in natural systems. [source]


Reworking the NAFTA: Departures from Traditional Frameworks

CANADIAN JOURNAL OF AGRICULTURAL ECONOMICS, Issue 4 2001
Garth J. Holloway
This paper reviews the treatment of intellectual property rights in the North American Free Trade Agreement (NAFTA) and considers the welfare-theoretic bases for innovation transfer between member and nonmember states. Specifically, we consider the effects of new technology development from within the union and question whether it is efficient (in a welfare sense) to transfer that new technology to nonmember states. When the new technology contains stochastic components, the important issue of information exchange arises, and we consider this question in a simple oligopoly model with Bayesian updating. In this context, it is natural to ask the optimal price at which such information should be transferred. Some simple natural-conjugate examples are used to motivate the key parameters upon which the answer depends.

This article analyses how the North American Free Trade Agreement (NAFTA) treats the protection of intellectual property and examines the welfare-theoretic principles underlying the transfer of innovation between member and nonmember states. More specifically, the author examines the consequences of developing a new technology within the economic union and questions the efficiency (from a social-welfare standpoint) of transferring that technology to nonmember states. The important question of information sharing arises as soon as the new technology contains stochastic elements. The author studies this question using a simple oligopoly model updated by Bayesian methods. In such a context, it is natural to ask the optimal price at which the information should be shared. A few simple natural-conjugate examples serve to bring out the main parameters on which the answer to the question depends. [source]
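A minimal sketch of the pricing logic, under simplifications that are not the paper's model: a single risk-neutral firm (monopoly rather than oligopoly) faces linear demand with an uncertain intercept, holds a normal prior, and can buy a noisy signal, updating by the natural-conjugate normal-normal rule. The most the firm should pay for the signal is the expected profit gain from acting on the posterior mean rather than the prior mean; all numbers below are illustrative assumptions.

```python
# Minimal value-of-information sketch (NOT the paper's oligopoly model):
# demand p = a - b*q, marginal cost c, prior a ~ N(mu0, s0^2), and a signal
# y ~ N(a, tau^2).  The optimal price for the information is the expected
# profit gain from choosing q on the posterior mean instead of the prior mean.
import numpy as np

rng = np.random.default_rng(42)
mu0, s0 = 100.0, 15.0          # prior mean and sd of the demand intercept (assumed)
tau = 10.0                     # signal noise sd (assumed)
b, c = 1.0, 20.0               # demand slope and marginal cost (assumed)
M = 200_000                    # Monte Carlo replications

a = rng.normal(mu0, s0, M)                 # true intercept in each replication
y = a + rng.normal(0.0, tau, M)            # observed signal

# Natural-conjugate (normal-normal) update of the mean.
post_var = 1.0 / (1.0 / s0**2 + 1.0 / tau**2)
post_mean = post_var * (mu0 / s0**2 + y / tau**2)

def profit(a_true, q):
    return (a_true - b * q - c) * q

q_prior = (mu0 - c) / (2 * b)              # optimal quantity without the signal
q_post = (post_mean - c) / (2 * b)         # optimal quantity after updating

value_of_information = np.mean(profit(a, q_post) - profit(a, q_prior))
print(f"maximum price worth paying for the signal: {value_of_information:.2f}")
```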