Stochastic Processes
Selected Abstracts

Reconstructing Macroeconomics: A Perspective from Statistical Physics and Combinatorial Stochastic Processes. ECONOMICA, Issue 301 2009. By Masanao Aoki, Hiroshi Yoshikawa.
No abstract is available for this article. [source]

Handbook of Statistics, Volume 21, Stochastic Processes: Modeling and Simulation. JOURNAL OF THE ROYAL STATISTICAL SOCIETY: SERIES A (STATISTICS IN SOCIETY), Issue 1 2004. Freda Kemp.
First page of article [source]

Parameter Estimation of Stochastic Processes with Long-range Dependence and Intermittency. JOURNAL OF TIME SERIES ANALYSIS, Issue 5 2001. Jiti Gao.
This paper considers the case where a stochastic process may display both long-range dependence and second-order intermittency. The existence of such a process is established in Anh, Angulo and Ruiz-Medina (1999). We systematically study the estimation of parameters involved in the spectral density function of a process with long-range dependence and second-order intermittency. An estimation procedure for the parameters is given. Numerical results are presented to support the estimation procedure proposed in this paper. [source]

Minimum α-divergence estimation for ARCH models. JOURNAL OF TIME SERIES ANALYSIS, Issue 1 2006. S. Ajay Chandra.
Abstract. This paper considers a minimum α-divergence estimation for a class of ARCH(p) models. For these models with unknown volatility parameters, the exact form of the innovation density is supposed to be unknown in detail but is thought to be close to members of some parametric family. To approximate such a density, we first construct an estimator for the unknown volatility parameters using the conditional least squares estimator given by Tjøstheim [Stochastic Processes and their Applications (1986) Vol. 21, pp. 251–273]. Then, a nonparametric kernel density estimator is constructed for the innovation density based on the estimated residuals.
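As a rough illustration of the two-step scheme just described (conditional least squares for the volatility parameters, then a kernel density estimate of the innovation density), the following sketch is not the authors' procedure: the model is a bare ARCH(1) and every parameter value is invented.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate an ARCH(1) process: x_t = sigma_t * e_t with sigma_t^2 = a0 + a1 * x_{t-1}^2
a0_true, a1_true, n = 0.5, 0.3, 2000
x = np.zeros(n)
for t in range(1, n):
    x[t] = np.sqrt(a0_true + a1_true * x[t - 1] ** 2) * rng.standard_normal()

# Step 1 - conditional least squares: regress x_t^2 on (1, x_{t-1}^2)
y = x[1:] ** 2
X = np.column_stack([np.ones(n - 1), x[:-1] ** 2])
a0_hat, a1_hat = np.linalg.lstsq(X, y, rcond=None)[0]

# Step 2 - Gaussian kernel density estimate of the standardized residuals
resid = x[1:] / np.sqrt(a0_hat + a1_hat * x[:-1] ** 2)
h = 1.06 * resid.std() * resid.size ** (-1 / 5)   # rule-of-thumb bandwidth
grid = np.linspace(-3.0, 3.0, 7)
kde = np.array([np.mean(np.exp(-0.5 * ((g - resid) / h) ** 2)) / (h * np.sqrt(2 * np.pi))
                for g in grid])

print(a0_hat, a1_hat)   # volatility estimates, roughly near the true 0.5 and 0.3
print(kde[3])           # estimated innovation density at zero
```

The estimated density would then be plugged into the divergence criterion in place of the unknown innovation density.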
Using techniques of the minimum Hellinger distance estimation for stochastic models and residual empirical process from an ARCH(p) model given by Beran [Annals of Statistics (1977) Vol. 5, pp. 445–463] and Lee and Taniguchi [Statistica Sinica (2005) Vol. 15, pp. 215–234], respectively, it is shown that the proposed estimator is consistent and asymptotically normal. Moreover, a robustness measure for the score of the estimator is introduced. The asymptotic efficiency and robustness of the estimator are illustrated by simulations. The proposed estimator is also applied to daily stock returns of Dell Corporation. [source]

Postglacial topographic evolution of glaciated valleys: a stochastic landscape evolution model. EARTH SURFACE PROCESSES AND LANDFORMS, Issue 11 2005. Simon J. Dadson.
Abstract. The retreat of valley glaciers has a dramatic effect on the stability of glaciated valleys and exerts a prolonged influence on the subsequent fluvial sediment transport regime. We have studied the evolution of an idealized glaciated valley during the period following retreat of ice using a numerical model. The model incorporates a stochastic process to represent deep-seated landsliding, non-linear diffusion to represent shallow landsliding and an approximation of the Bagnold relation to represent fluvial sediment transport. It was calibrated using field data from several recent surveys within British Columbia, Canada. We present ensemble model results and compare them with results from a deterministic linear-diffusion model to show that explicit representation of large landslides is necessary to reproduce the morphology and channel network structure of a typical postglacial valley. Our model predicts a rapid rate of fluvial sediment transport following deglaciation with a subsequent gradual decline, similar to that inferred for Holocene time. We also describe how changes in the model parameters affect the estimated magnitude and duration of the paraglacial sediment pulse.
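A one-dimensional caricature of such a model (not the paper's calibrated model; the diffusivity, landslide rate and failure threshold below are all invented) combines linear diffusion of a valley-side elevation profile with randomly triggered landslides that instantly relax the steepest over-threshold segment:

```python
import numpy as np

rng = np.random.default_rng(1)

nx, dx, dt = 50, 10.0, 1.0            # grid cells, spacing (m), time step (yr)
z = np.linspace(0.0, 400.0, nx)       # oversteepened postglacial valley side (m)
kappa = 0.05                          # hillslope diffusivity (m^2/yr)
critical_slope = 0.6                  # deep-seated landsliding threshold

for step in range(5000):
    # Shallow processes as linear diffusion (interior cells; boundaries held fixed)
    z[1:-1] += kappa * dt * (z[2:] - 2.0 * z[1:-1] + z[:-2]) / dx**2
    # Deep-seated landsliding: Poisson-like trigger, fails steepest interior segment
    if rng.random() < 0.01:
        slope = np.diff(z) / dx
        candidates = np.where(slope[1:-1] > critical_slope)[0] + 1
        if candidates.size:
            i = candidates[np.argmax(slope[candidates])]
            transfer = 0.5 * (z[i + 1] - z[i])    # equalize the failing pair of cells,
            z[i + 1] -= transfer                  # moving material downslope while
            z[i] += transfer                      # conserving mass locally

relief = z[-1] - z[0]
print(relief)   # boundary elevations are fixed, so total relief is preserved
```

The point of the stochastic term is visible even in this toy: individual large events, not just gradual diffusion, reshape the profile.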
Copyright © 2005 John Wiley & Sons, Ltd. [source]

Efficiency of base isolation systems in structural seismic protection and energetic assessment. EARTHQUAKE ENGINEERING AND STRUCTURAL DYNAMICS, Issue 10 2003. Giuseppe Carlo Marano.
Abstract. This paper concerns the seismic response of structures isolated at the base by means of High Damping Rubber Bearings (HDRB). The analysis is performed using a stochastic approach, and a Gaussian zero-mean filtered non-stationary stochastic process is used to model the seismic acceleration acting at the base of the structure. More precisely, the generalized Kanai–Tajimi model is adopted to describe the non-stationary amplitude and frequency characteristics of the seismic motion. The hysteretic differential Bouc–Wen model (BWM) is adopted to take into account the non-linear constitutive behaviour both of the base isolation device and of the structure. Moreover, the stochastic linearization method in the time domain is adopted to estimate the statistical moments of the non-linear system response in the state space. The non-linear differential equation of the response covariance matrix is then solved by using an iterative procedure which updates the coefficients of the equivalent linear system at each step and searches for the solution of the response covariance matrix equation. After the system response variance is estimated, a sensitivity analysis is carried out. The final aim of the research is to assess the real capacity of base isolation devices to protect structures from seismic actions by avoiding a non-linear response, with its associated large plastic displacements, and therefore by limiting related damage phenomena in structural and non-structural elements.
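As an aside for readers unfamiliar with the Bouc–Wen model, the sketch below integrates a single-degree-of-freedom oscillator with Bouc–Wen hysteresis under a crude white-noise base excitation (explicit Euler stepping; all parameter values are illustrative, not the paper's, and the Kanai–Tajimi filtering and stochastic linearization steps are omitted):

```python
import numpy as np

rng = np.random.default_rng(2)

# SDOF oscillator with a Bouc-Wen hysteretic restoring force:
#   x'' + 2*zeta*w*x' + w^2 * (alpha*x + (1 - alpha)*z) = -a_g(t)
#   z'  = A*x' - beta*|x'|*|z|**(n-1)*z - gamma*x'*|z|**n
w, zeta, alpha = 2 * np.pi, 0.05, 0.4
A, beta, gamma, n_exp = 1.0, 0.5, 0.5, 1.0
dt, n_steps = 0.005, 20000

x = v = z = 0.0
xs = np.empty(n_steps)
for k in range(n_steps):
    a_g = rng.standard_normal() / np.sqrt(dt)   # discretized white-noise excitation
    acc = -2 * zeta * w * v - w**2 * (alpha * x + (1 - alpha) * z) - a_g
    zdot = A * v - beta * abs(v) * abs(z) ** (n_exp - 1) * z - gamma * v * abs(z) ** n_exp
    x, v, z = x + v * dt, v + acc * dt, z + zdot * dt
    xs[k] = x

print(np.std(xs))   # displacement dispersion of the hysteretic system
```

With beta + gamma > 0 the hysteretic variable z stays bounded (here by roughly (A/(beta + gamma))^(1/n) = 1), which is what makes the hysteresis loop dissipate energy. The abstract continues below.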
In order to attain this objective the stochastic response of a non-linear n-dof shear-type base-isolated building is analysed; the constitutive law both of the structure and of the base devices is described, as previously reported, by adopting the BWM and by using appropriate parameters for this model, able to suitably characterize an ordinary building and the base isolators considered in the study. The protection level offered to the structure by the base isolators is then assessed by evaluating the reduction both of the displacement response and of the hysteretic dissipated energy. Copyright © 2003 John Wiley & Sons, Ltd. [source]

Income Variance Dynamics and Heterogeneity. ECONOMETRICA, Issue 1 2004. Costas Meghir.
Recent theoretical work has shown the importance of measuring microeconomic uncertainty for models of both general and partial equilibrium under imperfect insurance. In this paper the assumption of i.i.d. income innovations used in previous empirical studies is removed and the focus of the analysis is placed on models for the conditional variance of income shocks, which is related to the measure of risk emphasized by the theory. We first discriminate amongst various models of earnings determination that separate income shocks into idiosyncratic transitory and permanent components. We allow for education- and time-specific differences in the stochastic process for earnings and for measurement error. The conditional variance of the income shocks is modelled as a parsimonious ARCH process with both observable and unobserved heterogeneity. The empirical analysis is conducted on data drawn from the 1967–1992 Panel Study of Income Dynamics. We find strong evidence of sizeable ARCH effects as well as evidence of unobserved heterogeneity in the variances.
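A stylized simulation of such an earnings process (a sketch only: the permanent/transitory decomposition and the ARCH variance follow the abstract, but every number below is invented and the heterogeneity terms are omitted) shows how ARCH effects surface as volatility clustering in income growth:

```python
import numpy as np

rng = np.random.default_rng(3)

n_people, n_years = 500, 26      # a 1967-1992-style panel, purely illustrative
a0, a1 = 0.02, 0.4               # ARCH(1) coefficients for the transitory variance

perm = np.zeros(n_people)        # permanent (random-walk) component of log income
eps_prev = np.zeros(n_people)
log_income = np.empty((n_people, n_years))
for t in range(n_years):
    h = a0 + a1 * eps_prev**2                 # conditional variance of this year's shock
    eps = rng.normal(0.0, np.sqrt(h))         # transitory shock with ARCH variance
    perm += rng.normal(0.0, 0.05, n_people)   # permanent innovation
    log_income[:, t] = perm + eps
    eps_prev = eps

# ARCH effects appear as positive autocorrelation in squared income growth
growth = np.diff(log_income, axis=1)
g2 = growth**2
corr = np.corrcoef(g2[:, :-1].ravel(), g2[:, 1:].ravel())[0, 1]
print(corr > 0)   # True: shock variances cluster over time
```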
[source]

Modeling and predicting complex space–time structures and patterns of coastal wind fields. ENVIRONMETRICS, Issue 5 2005. Montserrat Fuentes.
Abstract. A statistical technique is developed for wind field mapping that can be used to improve either the assimilation of surface wind observations into a model initial field or the accuracy of post-processing algorithms run on meteorological model output. The observed wind field at any particular location is treated as a function of the true (but unknown) wind and measurement error. The wind field from numerical weather prediction models is treated as a function of a linear and multiplicative bias and a term which represents random deviations with respect to the true wind process. A Bayesian approach is taken to provide information about the true underlying wind field, which is modeled as a stochastic process with a non-stationary and non-separable covariance. The method is applied to forecast wind fields from a widely used mesoscale numerical weather prediction (NWP) model (MM5). The statistical model tests are carried out for the wind speed over the Chesapeake Bay and the surrounding region for 21 July 2002. Coastal wind observations that have not been used in the MM5 initial conditions or forecasts are used in conjunction with the MM5 forecast wind field (valid at the same time that the observations were available) in a post-processing technique that combines these two sources of information to predict the true wind field. Based on the mean square error, this procedure provides a substantial correction to the MM5 wind field forecast over the Chesapeake Bay region. Copyright © 2005 John Wiley & Sons, Ltd. [source]

A Framework for Valuing Derivative Securities. FINANCIAL MARKETS, INSTITUTIONS & INSTRUMENTS, Issue 5 2001. Philip Gray.
This paper develops a general framework for valuing a wide range of derivative securities.
Rather than focusing on the stochastic process of the underlying security and developing an instantaneously riskless hedge portfolio, we focus on the terminal distribution of the underlying security. This enables the derivative security to be valued as the weighted sum of a number of component pieces. The component pieces are simply the different payoffs that the security generates in different states of the world, and they are weighted by the probability of the particular state of the world occurring. A full set of derivations is provided. To illustrate its use, the valuation framework is applied to plain-vanilla call and put options, as well as a range of derivatives including caps, floors, collars, supershares, and digital options. [source]

On open-set lattices and some of their applications in semantics. INTERNATIONAL JOURNAL OF INTELLIGENT SYSTEMS, Issue 12 2003. Mouw-Ching Tjiok.
In this article, we present the theory of Kripke semantics, along with the mathematical framework and applications of Kripke semantics. We take the Kripke-Sato approach to define the knowledge operator in relation to Hintikka's possible worlds model, which is an application of the semantics of intuitionistic logic and modal logic. The applications are interesting from the viewpoint of agent interactives and process interaction. We propose (i) an application of possible worlds semantics, which enables the evaluation of the truth value of a conditional sentence without explicitly defining the implication operator, through clustering on the space of events (worlds) using the notion of neighborhood; and (ii) a semantical approach to treat discrete dynamic processes using Kripke-Beth semantics. Starting from the topological approach, we define the measure-theoretical machinery; in particular, we adopt the methods developed in stochastic processes, mainly the martingale, in our semantics; this involves some Boolean algebraic (BA) manipulations.
The clustering on the space of events (worlds), using the notion of neighborhood, enables us to define an accessibility relation that is necessary for the evaluation of the conditional sentence. Our approach is to take the neighborhood as an open set and to look at topological properties using metric space, in particular, the so-called ε-ball; then, we can perform the implication by computing Euclidean distance, whenever we introduce a certain enumerative scheme to transform the semantic objects into mathematical objects. Thus, this method provides an approach to quantify semantic notions. Combined with modal operators Ki operating on the set E, it provides a more computable way to recognize "indistinguishability" in some applications, e.g., an electronic catalogue. Because semantics used in this context is a local matter, we also propose the application of sheaf theory for passing local information to global information. By looking at the Kripke interpretation as a function with values in an open-set lattice, which is formed by a stepwise verification process, we obtain a topological space structure. Now, using the measure-theoretical approach by taking the Borel set and Borel function in defining measurable functions, this can be extended to treat the dynamical aspect of processes; from the stochastic process, considered as a family of random variables over a measure space (the probability space triple), we draw two strong parallels between Kripke semantics and stochastic processes (mainly martingales): first, the strong affinity of Kripke-Beth path semantics and the time path of the process; and second, the treatment of time as parametrization of the dynamic process using the techniques of filtration, adapted processes, and progressive processes. The technique provides very effective manipulation of BA in the form of random variables and σ-subalgebras under the cover of measurable functions.
This enables us to adopt the computational algorithms obtained for stochastic processes to path semantics. Besides, using the technique of measurable functions, we indeed obtain an intrinsic way to introduce the notion of time sequence. © 2003 Wiley Periodicals, Inc. [source]

Linkage Disequilibrium Mapping of Disease Susceptibility Genes in Human Populations. INTERNATIONAL STATISTICAL REVIEW, Issue 1 2000. David Clayton.
Summary. The paper reviews recent work on statistical methods for using linkage disequilibrium to locate disease susceptibility genes, given a set of marker genes at known positions in the genome. The paper starts by considering a simple deterministic model for linkage disequilibrium and discusses recent attempts to elaborate it to include the effects of stochastic influences, of "drift", by the use either of Wright–Fisher models or of approaches based on the coalescence of the genealogy of the sample of disease chromosomes. Most of this first part of the paper concerns a series of diallelic markers and, in this case, the models so far proposed are hierarchical probability models for multivariate binary data. Likelihoods are intractable and most approaches to linkage disequilibrium mapping amount to marginal models for pairwise associations between individual markers and the disease susceptibility locus. Approaches to evaluation of a full likelihood require Monte Carlo methods in order to integrate over the large number of unknowns. The fact that the initial state of the stochastic process which has led to present-day allele frequencies is unknown is noted, and its implications for the hierarchical probability model are discussed. Difficulties and opportunities arising as a result of more polymorphic markers and extended marker haplotypes are indicated. Connections between the hierarchical modelling approach and methods based upon identity by descent and haplotype sharing by seemingly unrelated cases are explored.
Finally, problems resulting from unknown modes of inheritance, incomplete penetrance, and "phenocopies" are briefly reviewed.
[source]

Reactive Oxygen Species as Mediators of Cellular Senescence. IUBMB LIFE, Issue 4-5 2005. Renata Colavitti.
Abstract. Aging has often been viewed as a random process arising from the accumulation of both genetic and epigenetic changes. Increasingly, the notion that aging is a stochastic process is being supplanted by the concept that the maximum lifespan of an organism is tightly regulated. This knowledge has led to a growing overlap between classical signal transduction paradigms and the biology of aging. We review certain specific examples where these seemingly disparate disciplines intersect. In particular, we review the concept that intracellular reactive oxygen species function as signalling molecules and that oxidants play a central role as mediators of cellular senescence. IUBMB Life, 57: 277-281, 2005 [source]

Dynamic heterogeneity and life history variability in the kittiwake. JOURNAL OF ANIMAL ECOLOGY, Issue 2 2010. Ulrich K. Steiner.
Summary. 1. Understanding the evolution of life histories requires an assessment of the process that generates variation in life histories. Within-population heterogeneity of life histories can be dynamically generated by stochastic variation of reproduction and survival or be generated by individual differences that are fixed at birth. 2. We show for the kittiwake that dynamic heterogeneity is a sufficient explanation of observed variation of life histories. 3. The total heterogeneity in life histories has a small contribution from reproductive stage dynamics and a large contribution from survival differences. We quantify the diversity in life histories by metrics computed from the generating stochastic process. 4. We show how dynamic heterogeneity can be used as a null model and also how it can lead to positive associations between reproduction and survival across the life span. 5.
We believe our approach to identifying the nature of among-individual heterogeneity yields important insights into the forces that generate within-population variation of life-history traits. It provides an alternative to claims that fixed individual differences are a major determinant of heterogeneity in life histories. [source]

Ex post and ex ante prediction of unobserved multivariate time series: a structural-model based approach. JOURNAL OF FORECASTING, Issue 1 2007. Fabio H. Nieto.
Abstract. A methodology for estimating high-frequency values of an unobserved multivariate time series from low-frequency values of it and related information is presented in this paper. This is an optimal solution, in the multivariate setting, to the problem of ex post prediction, disaggregation, benchmarking or signal extraction of an unobservable stochastic process. Also, the problem of extrapolation or ex ante prediction is optimally solved and, in this context, statistical tests are developed for checking online the occurrence of extreme values of the unobserved time series and the consistency of future benchmarks with the present and past observed information. The procedure is based on structural or unobserved component models, whose assumptions and specification are validated with the data alone. Copyright © 2007 John Wiley & Sons, Ltd. [source]

Time series modelling of two millennia of northern hemisphere temperatures: long memory or shifting trends? JOURNAL OF THE ROYAL STATISTICAL SOCIETY: SERIES A (STATISTICS IN SOCIETY), Issue 1 2007. Terence C. Mills.
Summary. The time series properties of the temperature reconstruction of Moberg and co-workers are analysed. It is found that the record appears to exhibit long memory characteristics that can be modelled by an autoregressive fractionally integrated moving average process that is both stationary and mean reverting, so that forecasts will eventually return to a constant underlying level.
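To see what a stationary, mean-reverting long-memory series looks like (a generic ARFIMA(0, d, 0) sketch, not the authors' fitted model; d and the truncation length are invented), one can simulate fractional integration by truncating the moving-average expansion of (1 - B)^(-d):

```python
import numpy as np

rng = np.random.default_rng(4)

d, n, trunc = 0.3, 5000, 500   # 0 < d < 0.5 gives stationary long memory

# MA(inf) weights of (1 - B)^(-d): psi_0 = 1, psi_k = psi_{k-1} * (k - 1 + d) / k
psi = np.ones(trunc)
for k in range(1, trunc):
    psi[k] = psi[k - 1] * (k - 1 + d) / k

eps = rng.standard_normal(n + trunc)
x = np.array([psi @ eps[t:t + trunc][::-1] for t in range(n)])

# Long memory: sample autocorrelations decay hyperbolically, not geometrically
def acf(series, lag):
    s = series - series.mean()
    return (s[:-lag] * s[lag:]).mean() / s.var()

print(acf(x, 1), acf(x, 50))   # lag-1 near d/(1-d) = 0.43; lag-50 still noticeable
```

The slow hyperbolic decay of the autocorrelations is precisely what a slowly shifting trend can mimic, which is why the two models are hard to tell apart.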
Recent research has suggested that long memory and shifts in level and trend may be confused with each other, and fitting models with slowly changing trends is found to remove the evidence of long memory. Discriminating between the two models is difficult, however, and the strikingly different forecasts that are implied by the two models point towards some intriguing research questions concerning the stochastic process driving this temperature reconstruction. [source]

A simple monotone process with application to radiocarbon-dated depth chronologies. JOURNAL OF THE ROYAL STATISTICAL SOCIETY: SERIES C (APPLIED STATISTICS), Issue 4 2008. John Haslett.
Summary. We propose a new and simple continuous Markov monotone stochastic process and use it to make inference on a partially observed monotone stochastic process. The process is piecewise linear, based on additive independent gamma increments arriving in a Poisson fashion. An independent increments variation allows very simple conditional simulation of sample paths given known values of the process. We take advantage of a reparameterization involving the Tweedie distribution to provide efficient computation. The motivating problem is the establishment of a chronology for samples taken from lake sediment cores, i.e. the attribution of a set of dates to samples of the core given their depths, knowing that the age–depth relationship is monotone. The chronological information arises from radiocarbon (14C) dating at a subset of depths. We use the process to model the stochastically varying rate of sedimentation. [source]

Bayesian incidence analysis of animal tumorigenicity data. JOURNAL OF THE ROYAL STATISTICAL SOCIETY: SERIES C (APPLIED STATISTICS), Issue 2 2001. D. B. Dunson.
Statistical inference about tumorigenesis should focus on the tumour incidence rate. Unfortunately, in most animal carcinogenicity experiments, tumours are not observable in live animals and censoring of the tumour onset times is informative.
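Returning to the monotone age–depth process summarized above: a minimal sketch of a piecewise-linear monotone path built from gamma increments arriving in a Poisson fashion (the rate and gamma parameters are invented, and simple linear interpolation between arrivals stands in for the paper's construction) is:

```python
import numpy as np

rng = np.random.default_rng(5)

# Gamma-distributed age increments arriving at Poisson-distributed depths,
# linearly interpolated in between to give a piecewise-linear monotone path.
rate, shape, scale, depth_max = 2.0, 1.5, 0.8, 10.0

n_jumps = rng.poisson(rate * depth_max)
depths = np.sort(rng.uniform(0.0, depth_max, n_jumps))
increments = rng.gamma(shape, scale, n_jumps)

knot_depths = np.concatenate([[0.0], depths, [depth_max]])
knot_ages = np.concatenate([[0.0], np.cumsum(increments)])
knot_ages = np.append(knot_ages, knot_ages[-1])   # flat after the last arrival

# Age at an arbitrary depth by linear interpolation (monotone by construction)
def age_at(depth):
    return np.interp(depth, knot_depths, knot_ages)

grid = np.linspace(0.0, depth_max, 101)
ages = age_at(grid)
print(bool(np.all(np.diff(ages) >= 0)))   # True: the chronology is monotone in depth
```

Because the increments are non-negative, any such path respects the physical constraint that deeper sediment is older, which is the property the radiocarbon chronology needs.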
In this paper, we propose a Bayesian method for analysing data from such studies. Our approach focuses on the incidence of tumours and accommodates occult tumours and censored onset times without restricting tumour lethality, relying on cause-of-death data, or requiring interim sacrifices. We represent the underlying state of nature by a multistate stochastic process and assume general probit models for the time-specific transition rates. These models allow the incorporation of covariates, historical control data and subjective prior information. The inherent flexibility of this approach facilitates the interpretation of results, particularly when the sample size is small or the data are sparse. We use a Gibbs sampler to estimate the relevant posterior distributions. The methods proposed are applied to data from a US National Toxicology Program carcinogenicity study. [source]

Joint venture evolution: extending the real options approach. MANAGERIAL AND DECISION ECONOMICS, Issue 4 2008. Jing Li.
Real options theory has emerged as a promising avenue to study joint venture (JV) evolution as a strategic response to managing uncertainty. We extend the real options approach by integrating it with game theory.
Such a combined method enriches the valuation functions of each partnering firm and helps to identify the optimal decisions for exercising options in a JV. In our model, each firm's synergy from the joint operation and its knowledge acquisition capability (KAC) can significantly influence the competitive dynamics between partners, potentially affecting how each firm decides to acquire, divest, or dissolve. We employ a new solution technique in real options theory to capture the stochastic process of three factors, and use computer simulation to test the model under varying conditions. The results are stated in five testable propositions, providing a better understanding of the dynamics of a JV. We find that symmetries between partners in synergy or KAC contribute to stability or dissolution of the JV, whereas asymmetries in synergy or KAC make acquisition of the JV assets by one partner desirable. Copyright © 2008 John Wiley & Sons, Ltd. [source]

Investment with an arithmetic process and lags. MANAGERIAL AND DECISION ECONOMICS, Issue 5 2000. Avner Bar-Ilan.
This paper presents an explicit solution of a simple investment problem with entry lags and when the underlying stochastic process is arithmetic. It is shown that, without abandonment, the optimal investment plan is independent of the length of the lag. Copyright © 2000 John Wiley & Sons, Ltd. [source]

The Fundamental Theorem of Asset Pricing under Proportional Transaction Costs in Finite Discrete Time. MATHEMATICAL FINANCE, Issue 1 2004. Walter Schachermayer. Article first published online: 24 DEC 200
We prove a version of the Fundamental Theorem of Asset Pricing, which applies to Kabanov's modeling of foreign exchange markets under transaction costs. The financial market is described by a d×d matrix-valued stochastic process (Π_t)_{t=0}^T specifying the mutual bid and ask prices between d assets.
We introduce the notion of "robust no arbitrage", which is a version of the no-arbitrage concept, robust with respect to small changes of the bid-ask spreads of (Π_t)_{t=0}^T. The main theorem states that the bid-ask process (Π_t)_{t=0}^T satisfies the robust no-arbitrage condition iff it admits a strictly consistent pricing system. This result extends the theorems of Harrison-Pliska and Kabanov-Stricker pertaining to the case of finite Ω, as well as the theorem of Dalang, Morton, and Willinger and Kabanov, Rásonyi, and Stricker, pertaining to the case of general Ω. An example of a 5×5-dimensional process (Π_t)_{t=0}^2 shows that, in this theorem, the robust no-arbitrage condition cannot be replaced by the so-called strict no-arbitrage condition, thus answering negatively a question raised by Kabanov, Rásonyi, and Stricker. [source]

Wealth Dynamics and the Endogenous Design of Firm Organization. THE JAPANESE ECONOMIC REVIEW, Issue 3 2003. Hiroshi Osano.
The purpose of this paper is to explore a dynamic interaction between wealth distribution and firm organization design using a model of growth in altruism in which a consideration of moral hazard on the part of agents with risk-averse preferences prevents complete insurance and generates inequality. I show (i) that there exists an ergodic invariant distribution of wealth to which the stochastic process of lineage wealth converges globally, and (ii) that the firm form with direct evaluation of the agent's effort is more likely to be chosen as wealth is distributed more equally. [source]

A multicommodity model of futures prices: Using futures prices of one commodity to estimate the stochastic process of another. THE JOURNAL OF FUTURES MARKETS, Issue 6 2008. Gonzalo Cortazar.
This article proposes a multicommodity model of futures prices that allows long-maturity futures prices available for one commodity to be used to estimate futures prices for another.
The model considers that commodity prices have common and commodity-specific factors. A procedure for choosing the number of both types of unobservable Gaussian factors is presented. Also, it is shown how commodities with and without seasonality may be jointly modeled and how to estimate the model using the Kalman filter. Results for the West Texas Intermediate–Brent and the West Texas Intermediate–unleaded gasoline models show strong improvements over the traditional individual-commodity models, with much lower out-of-sample errors and better volatility estimates, even when using fewer factors. © 2008 Wiley Periodicals, Inc. Jrl Fut Mark 28:537–560, 2008 [source]

Plastid genomes in a regenerating tobacco shoot derive from a small number of copies selected through a stochastic process. THE PLANT JOURNAL, Issue 6 2008. Kerry Ann Lutz.
Summary. The plastid genome (ptDNA) of higher plants is highly polyploid, and the 1000–10 000 copies are compartmentalized with up to approximately 100 plastids per cell. The problem we address here is whether or not a newly arising genome can be established in a developing tobacco shoot, and be transmitted to the seed progeny. We tested this by generating two unequal ptDNA populations in a cultured tobacco cell. The parental tobacco plants in this study have an aurea (yellowish-golden) leaf color caused by the presence of a barau gene in the ptDNA. In addition, the ptDNA carries an aadA gene flanked with the phiC31 phage site-specific recombinase (Int) attP/attB target sites. The genetically distinct ptDNA copies were obtained by Int, which either excised only the aadA marker gene (i.e. did not affect the aurea phenotype) or triggered the deletion of both the aadA and barau transgenes, thereby restoring the green color.
The ptDNA determining green plastids represented only a small fraction of the population and was not seen in a transient excision assay, and yet three out of the 53 regenerated shoots carried green plastids in all developmental layers. The remaining 49 Int-expressing plants had either exclusively aurea (24) or variegated (25) leaves with aurea and green sectors. The formation of homoplastomic green shoots with the minor green ptDNA in all developmental layers suggests that the ptDNA population in a regenerating shoot apical meristem derives from a small number of copies selected through a stochastic process. [source]

Ln3M1-δTX7 quasi-isostructural compounds: stereochemistry and silver-ion motion in the Ln3Ag1-δGeS7 (Ln = La–Nd, Sm, Gd–Er and Y; δ = 0.11–0.50) compounds. ACTA CRYSTALLOGRAPHICA SECTION B, Issue 2 2009. Marek Daszkiewicz.
The crystal structures of the Ln3Ag1-δGeS7 (Ln = La–Nd, Sm, Gd–Er, Y; δ = 0.11–0.50; space group P63) compounds were determined by means of X-ray single-crystal diffraction, and the similarities among the crystal structures of all Ln3M1-δTX7 (space group P63; Ln = lanthanide element; M = monovalent element; T = tetravalent element; X = S, Se) compounds deposited in the Inorganic Crystal Structure Database (ICSD) are discussed. Substitution of each element in Ln3M1-δTX7 results in a different structural effect. On the basis of the data deposited in the ICSD, the large family of Ln3M1-δTX7 compounds was divided into three groups depending on the position of the monovalent element in the lattice. This position determines which stereoisomer is present in the structure, either the (+ +) enantiomer or the (+ −) diastereoisomer. Since the silver ions can occupy different positions and the energy barriers between positions are low, the ions can move through the channel. It was shown that this movement is not a stochastic process but a correlated one.
[source] Albatrosses, eagles and newts, Oh My!: exceptions to the prevailing paradigm concerning genetic diversity and population viability?ANIMAL CONSERVATION, Issue 5 2010D. H. Reed Abstract Numerous recent papers have demonstrated a central role for genetic factors in the extinction process or have documented the importance of gene flow in reversing population declines. This prompted one recent publication to declare that a revolution in conservation genetics has occurred. Contemporaneous with this revolution is a series of papers demonstrating long-term population persistence for several species despite their having little or no detectable genetic variation. In a couple of notable cases, populations have been shown to have survived for centuries at small population size and with depleted levels of genetic variation. These contradictory results demand an explanation. In this review, I will show that these results do not necessarily fly in the face of theory, as is sometimes stated. The reconciliation of these two sets of observations relies on the incorporation of two major concepts. (1) Genetic factors do not act in a vacuum, and it is their interaction with the environment, the strength and type of selection imposed, and the life history of the organism that determine the relative importance of genetic factors to extinction risk. (2) The relationship between molecular estimates of genetic variation and evolutionary potential, the relevance of genetic bottlenecks to adaptive genetic variation, and the nature of the stochastic process of extinction must be better integrated into expectations of population viability. Reports of populations persisting for hundreds of generations with very little detectable genetic variation provide us not only with valuable information but also with hope. However, recent studies suggest that we should not be sanguine about the importance of genetic diversity in the conservation of biodiversity. 
[source] Application in stochastic volatility models of nonlinear regression with stochastic designAPPLIED STOCHASTIC MODELS IN BUSINESS AND INDUSTRY, Issue 2 2010Ping Chen Abstract In regression models with stochastic design, the observations have primarily been treated as a simple random sample from a bivariate distribution. It is of enormous practical significance to generalize this situation to stochastic processes. In this paper, estimation and hypothesis-testing problems in a stochastic volatility model are considered, where the volatility depends on a nonlinear function of the state variable of another stochastic process, but the correlation coefficient satisfies |ρ| ≠ 1. The methods are applied to estimate the volatility of stock returns from the Shanghai stock exchange. Copyright © 2009 John Wiley & Sons, Ltd. [source] On a discrimination problem for a class of stochastic processes with ordered first-passage timesAPPLIED STOCHASTIC MODELS IN BUSINESS AND INDUSTRY, Issue 2 2001Antonio Di Crescenzo Abstract Consider the Bayes problem in which one has to discriminate whether the random unknown initial state of a stochastic process is distributed according to either of two preassigned distributions, on the basis of the observation of the first-passage time of the process through 0. For processes whose first-passage times to state 0 are increasing in the initial state according to the likelihood ratio order, this problem is solved by determining the Bayes decision function and the corresponding Bayes error. The special case of fixed initial values, including a family of first-passage times with proportional reversed hazard functions, is then studied. Finally, various applications to birth-and-death and to diffusion processes are discussed. Copyright © 2001 John Wiley & Sons, Ltd. [source] Power spectra from 'spotted' accretion discsASTRONOMISCHE NACHRICHTEN, Issue 10 2006T. 
Pechá Abstract We are carrying out a project to calculate power spectra of variability, assuming a model of a 'spotted' accretion disc near a black hole. We consider relativistic effects that change photon energy and produce light-bending and time-delays acting on the X-ray signal received by an observer. We assume that the lifetime and the intrinsic emissivity of individual flaring events are described in terms of a simple stochastic process. This allows us to give approximate analytical formulae and compare them with numerical computations. (© 2006 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim) [source] A NOTE ON SAMPLING DESIGNS FOR RANDOM PROCESSES WITH NO QUADRATIC MEAN DERIVATIVEAUSTRALIAN & NEW ZEALAND JOURNAL OF STATISTICS, Issue 3 2006Bhramar Mukherjee Summary Several authors have previously discussed the problem of obtaining asymptotically optimal design sequences for estimating the path of a stochastic process using intricate analytical techniques. In this note, an alternative treatment is provided for obtaining asymptotically optimal sampling designs for estimating the path of a second-order stochastic process with known covariance function. A simple estimator is proposed which is asymptotically equivalent to the full-fledged best linear unbiased estimator, and the entire asymptotics are carried out by studying this estimator. The current approach lends an intuitive statistical perspective to the entire estimation problem. [source]
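The setting of the last abstract, estimating the path of a second-order process with known covariance from finitely many sampled points, can be illustrated with a small simulation. The sketch below is not the estimator from that paper: as an assumption for illustration it takes standard Brownian motion (covariance min(s, t)), for which the best linear unbiased path estimator given equispaced samples is simply piecewise-linear interpolation, with integrated mean squared error 1/(6n) for n equispaced intervals on [0, 1].

```python
import numpy as np

def brownian_path(grid, rng):
    """Exact simulation of standard Brownian motion at the given grid points."""
    dt = np.diff(grid, prepend=grid[0])  # first increment is 0, so W(0) = 0
    return np.cumsum(rng.standard_normal(grid.size) * np.sqrt(dt))

def interpolation_imse(n_design, n_paths=4000, m=1000, seed=0):
    """Monte Carlo integrated mean squared error of the piecewise-linear path
    estimator built from n_design + 1 equispaced samples on [0, 1]."""
    rng = np.random.default_rng(seed)
    grid = np.linspace(0.0, 1.0, m + 1)
    idx = np.linspace(0, m, n_design + 1, dtype=int)  # design points lie on the grid
    total = 0.0
    for _ in range(n_paths):
        w = brownian_path(grid, rng)
        # Conditional mean of Brownian motion between samples is linear
        # interpolation, so this is the BLUE of the path given the samples.
        w_hat = np.interp(grid, grid[idx], w[idx])
        total += np.mean((w - w_hat) ** 2)  # ≈ ∫ (W(t) - Ŵ(t))² dt
    return total / n_paths

# Between consecutive samples the error is a Brownian bridge, whose variance
# integrates to Δ²/6 per interval; with n intervals of length 1/n the
# theoretical IMSE is n · (1/n)²/6 = 1/(6n).
```

Doubling the number of design points halves the IMSE, and it is exactly this kind of rate comparison that drives the choice between candidate design sequences in the asymptotic-optimality literature.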