Limit Theorem
Selected Abstracts

Average case analysis of the Boyer-Moore algorithm
RANDOM STRUCTURES AND ALGORITHMS, Issue 4 2006. Tsung-Hsi Tsai
Limit theorems (including a Berry–Esseen bound) are derived for the number of comparisons taken by the Boyer-Moore algorithm for finding the occurrences of a given pattern in a random text. Previously, only special variants of this algorithm had been analyzed. We also propose a means of computing the limiting constants for the mean and the variance. © 2005 Wiley Periodicals, Inc. Random Struct. Alg., 2006. [source]

NUMBERS OF OBSERVATIONS NEAR ORDER STATISTICS
AUSTRALIAN & NEW ZEALAND JOURNAL OF STATISTICS, Issue 4 2009. Anthony G. Pakes
Summary: Limit theorems are obtained for the numbers of observations in a random sample that fall within a left-hand or right-hand neighbourhood of the kth order statistic. The index k can be fixed, or tend to infinity as the sample size increases unboundedly. In essence, the proofs are applications of the classical Poisson and de Moivre–Laplace theorems. [source]

Solute transport in sand and chalk: a probabilistic approach
HYDROLOGICAL PROCESSES, Issue 5 2006. E. Carlier
Abstract: A probabilistic approach is used to simulate particle tracking for two types of porous medium. The first is sand grains with a single intergranular porosity; particle tracking is carried out by advection and dispersion. The second is chalk granulates with intergranular and matrix porosities; sorption can occur alongside advection and dispersion during particle tracking. In the sand medium, particle tracking is modelled as a sum of elementary steps with independent random variables. An exponential distribution is obtained for each elementary step, which shows that the whole process is Markovian, and a Gamma probability density function is then deduced. The relationship between dispersivity and the elementary step is given using the central limit theorem. Particle tracking in the chalky medium is a non-Markovian process, and its probability density function depends on a power of the distance. Experimental simulations by dye tracer tests on a column have been performed for different distances and discharges, and the probabilistic computations are in good agreement with the experimental data. The probabilistic computation seems an interesting and complementary approach to simulating transfer phenomena in porous media, compared with traditional numerical methods. Copyright © 2006 John Wiley & Sons, Ltd. [source]

The Early History of the Cumulants and the Gram-Charlier Series
INTERNATIONAL STATISTICAL REVIEW, Issue 2 2000. Anders Hald
Summary: The early history of the Gram-Charlier series is discussed from three points of view: (1) a generalization of Laplace's central limit theorem, (2) a least squares approximation to a continuous function by means of Chebyshev-Hermite polynomials, (3) a generalization of Gauss's normal distribution to a system of skew distributions. Thiele defined the cumulants in terms of the moments, first by a recursion formula and later by an expansion of the logarithm of the moment generating function. He devised a differential operator which adjusts any cumulant to a desired value. His little-known 1899 paper in Danish on the properties of the cumulants is translated into English in the Appendix. [source]
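As background for the two definitions attributed to Thiele in the abstract above (standard identities in conventional notation, not material from Hald's paper), the cumulants κ_r can be read off the expansion of the logarithm of the moment generating function, and they are linked to the raw moments m_n by a recursion:

\[
\log M(t) = \log \mathbb{E}\,e^{tX} = \sum_{r=1}^{\infty} \kappa_r \frac{t^r}{r!},
\qquad
m_n = \sum_{k=1}^{n} \binom{n-1}{k-1} \kappa_k\, m_{n-k}, \quad m_0 = 1,
\]

so that, for example, κ_1 = m_1, κ_2 = m_2 − m_1^2 (the variance) and κ_3 = m_3 − 3 m_1 m_2 + 2 m_1^3.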
The lognormal distribution is not an appropriate null hypothesis for the species–abundance distribution
JOURNAL OF ANIMAL ECOLOGY, Issue 3 2005. MARK WILLIAMSON
Summary:
1. Of the many models for species–abundance distributions (SADs), the lognormal has been the most popular and has been put forward as an appropriate null model for testing against theoretical SADs. In this paper we explore a number of reasons why the lognormal is not an appropriate null model, or indeed an appropriate model of any sort, for a SAD.
2. We use three empirical examples, based on published data sets, to illustrate features of SADs in general and of the lognormal in particular: the abundance of British breeding birds, the number of trees > 1 cm diameter at breast height (d.b.h.) on a 50 ha Panamanian plot, and the abundance of certain butterflies trapped at Jatun Sacha, Ecuador. The first two are complete enumerations and show left skew under logarithmic transformation; the third is an incomplete enumeration and shows right skew.
3. Fitting SADs by χ2 test is less efficient and less informative than fitting probability plots. The left skewness of complete enumerations seems to arise from a lack of extremely abundant species rather than from a surplus of rare ones. One consequence is that the logit-normal, which stretches the right-hand end of the distribution, consistently gives a slightly better fit.
4. The central limit theorem predicts lognormality of abundances within species but not between them, and so is not a basis for the lognormal SAD. Niche breakage and population dynamical models can predict a lognormal SAD but equally can predict many other SADs.
5. The lognormal sits uncomfortably between distributions with infinite variance and the log-binomial. The latter removes the absurdity of the invisible, highly abundant half of the individuals abundance curve predicted by the lognormal SAD. The veil line is a misunderstanding of the sampling properties of the SAD, and fitting the Poisson lognormal is not satisfactory. A satisfactory SAD should have a thinner right-hand tail than the lognormal, as is observed empirically.
6. The SAD for logarithmic abundance cannot be Gaussian. [source]

Proportional hazards estimate of the conditional survival function
JOURNAL OF THE ROYAL STATISTICAL SOCIETY: SERIES B (STATISTICAL METHODOLOGY), Issue 4 2000. Ronghui Xu
We introduce a new estimator of the conditional survival function given some subset of the covariate values under a proportional hazards regression. The new estimate does not require estimating the base-line cumulative hazard function. An estimate of the variance is given and is easy to compute, involving only those quantities that are routinely calculated in a Cox model analysis. The asymptotic normality of the new estimate is shown by using a central limit theorem for Kaplan–Meier integrals. We indicate the straightforward extension of the estimation procedure under models with multiplicative relative risks, including non-proportional hazards, and to stratified and frailty models. The estimator is applied to a gastric cancer study where it is of interest to predict patients' survival based only on measurements obtained before surgery, the time at which the most important prognostic variable, stage, becomes known. [source]
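For orientation, the quantity being estimated can be written out explicitly (standard Cox-model notation, not the paper's new estimator): with covariate vector Z, regression coefficient β, baseline hazard λ_0 and baseline cumulative hazard Λ_0, the proportional hazards assumption gives

\[
\lambda(t \mid Z) = \lambda_0(t)\, e^{\beta' Z}
\quad\Longrightarrow\quad
S(t \mid Z) = \exp\!\bigl(-\Lambda_0(t)\, e^{\beta' Z}\bigr) = S_0(t)^{\exp(\beta' Z)} .
\]

The classical plug-in estimate of S(t | Z) therefore requires an estimate of Λ_0 (for example Breslow's estimator); the point of the abstract is that the new estimator and its variance avoid this step and use only quantities already produced by a routine Cox analysis.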
Identification of Persistent Cycles in Non-Gaussian Long-Memory Time Series
JOURNAL OF TIME SERIES ANALYSIS, Issue 4 2008. Mohamed Boutahar
Abstract: The asymptotic distribution is derived for the least squares estimates (LSE) in the unstable AR(p) process driven by a non-Gaussian long-memory disturbance. The characteristic polynomial of the autoregressive process is assumed to have pairs of complex roots on the unit circle. In order to describe the limiting distribution of the LSE, two limit theorems involving long-memory processes are established in this article. The first theorem gives the limiting distribution of the weighted sum Σ_{k=1}^{n} c_{n,k} ε_k, where (ε_k) is a non-Gaussian long-memory moving-average process and (c_{n,k}, 1 ≤ k ≤ n) is a given sequence of weights; the second theorem is a functional central limit theorem for the sine and cosine Fourier transforms of (ε_k). [source]

Nonlinear functionals of the periodogram
JOURNAL OF TIME SERIES ANALYSIS, Issue 5 2002. GILLES FAY
A central limit theorem is stated for a wide class of triangular arrays of nonlinear functionals of the periodogram of a stationary linear sequence. Those functionals may be singular and unbounded. The proof of this result is based on the Bartlett decomposition and an existing counterpart result for the periodogram of an independent and identically distributed sequence, here taken to be the driving noise. The main contribution of this paper is to prove the asymptotic negligibility of the remainder term from the Bartlett decomposition, which is feasible under a short-dependence assumption. As is highlighted by applications (to estimation of nonlinear functionals of the spectral density, robust spectral estimation, local polynomial approximation and log-periodogram regression), this extends many results previously tied to the Gaussian assumption. [source]

Spectral Regression for Cointegrated Time Series with Long-Memory Innovations
JOURNAL OF TIME SERIES ANALYSIS, Issue 6 2000. D. Marinucci
Spectral regression is considered for cointegrated time series with long-memory innovations. The estimates we advocate are shown to be consistent when cointegrating relationships among stationary variables are investigated, while ordinary least squares are inconsistent due to correlation between the regressors and the cointegrating residuals; in the presence of unit roots, these estimates share the same asymptotic distribution as ordinary least squares. As a corollary of the main result, we provide a functional central limit theorem for quadratic forms in non-stationary fractionally integrated processes. [source]

A limit theorem for randomly stopped independent increment processes on separable metrizable groups
MATHEMATISCHE NACHRICHTEN, Issue 15 2007. Peter Becker-Kern
Abstract: In the spirit of the classical random central limit theorem, a general limit theorem for random stopping in the scheme of infinitesimal triangular arrays on a separable metrizable group is presented. The approach incorporates and generalizes earlier results for normalized sequences of independent random variables on both separable Banach spaces and simply connected nilpotent Lie groups, originated by Siegel and Hazod, respectively. (© 2007 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim) [source]

Variation mode and effect analysis: an application to fatigue life prediction
QUALITY AND RELIABILITY ENGINEERING INTERNATIONAL, Issue 2 2009. Pär Johannesson
Abstract: We present an application of the probabilistic branch of variation mode and effect analysis (VMEA), implemented as a first-order, second-moment reliability method. First order means that the failure function is approximated as linear around the nominal values with respect to the main influencing variables, while second moment means that only means and variances are taken into account in the statistical procedure. We study the fatigue life of a jet engine component and aim at a safety margin that takes all sources of prediction uncertainty into account. Scatter is defined as random variation due to natural causes, such as non-homogeneous material, geometry variation within tolerances, load variation in usage, and other uncontrolled variations. Other uncertainties are unknown systematic errors, such as model errors in the numerical calculation of fatigue life, statistical errors in estimates of parameters, and unknown usage profile. By treating systematic errors as random variables as well, the whole safety margin problem is put into a common framework of second-order statistics. The final estimated prediction variance of the logarithmic life is obtained by summing the variance contributions of all sources of scatter and other uncertainties, and it represents the total uncertainty in the life prediction. Motivated by the central limit theorem, this logarithmic life random variable may be regarded as normally distributed, which makes it possible to calculate relevant safety margins. Copyright © 2008 John Wiley & Sons, Ltd. [source]
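A minimal numerical sketch of the first-order, second-moment bookkeeping described above, in Python. The uncertainty sources, sensitivity coefficients and standard deviations are hypothetical placeholders, not values from the jet-engine study; the sketch only shows how scatter and systematic-error contributions are summed on the log-life scale and turned into a safety margin under an assumed normal distribution.

```python
import numpy as np
from scipy.stats import norm

# Hypothetical uncertainty sources for the logarithmic fatigue life.
# Each entry: (sensitivity coefficient c_i = d(log N)/d(x_i), std dev of x_i).
sources = {
    "material scatter":            (1.0, 0.15),
    "geometry within tolerances":  (0.8, 0.05),
    "load variation in usage":     (1.2, 0.10),
    "model error (fatigue calc.)": (1.0, 0.20),
    "parameter estimation error":  (1.0, 0.08),
}

# First order: linearize log life in the influencing variables.
# Second moment: keep only means and variances, so contributions add.
var_log_life = sum((c * s) ** 2 for c, s in sources.values())
std_log_life = np.sqrt(var_log_life)

# Treating log life as approximately normal (motivated by the CLT),
# a one-sided margin covering 99.9% of the prediction uncertainty:
z = norm.ppf(0.999)
safety_factor = np.exp(z * std_log_life)

print(f"std of log life   : {std_log_life:.3f}")
print(f"life safety factor: {safety_factor:.2f}")
```

In practice the coefficients c_i would come from a sensitivity analysis of the fatigue-life model around the nominal design point, which is exactly the linearization step the abstract calls "first order".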
A system of grabbing particles related to Galton-Watson trees
RANDOM STRUCTURES AND ALGORITHMS, Issue 4 2010. Jean Bertoin
Abstract: We consider a system of particles with arms that are activated randomly to grab other particles, as a toy model for polymerization. We assume that the following two rules are fulfilled: once a particle has been grabbed it cannot be grabbed again, and an arm cannot grab a particle that belongs to its own cluster. We are interested in the shape of a typical polymer in the situation where the initial number of monomers is large and the numbers of arms of the monomers are given by i.i.d. random variables. Our main result is a limit theorem for the empirical distribution of polymers, where the limit is expressed in terms of a Galton-Watson tree. © 2010 Wiley Periodicals, Inc. Random Struct. Alg., 2010 [source]

Some bounds on the coupon collector problem
RANDOM STRUCTURES AND ALGORITHMS, Issue 2 2004. Servet Martinez
Abstract: This work addresses the coupon collector problem and its generalization introduced by Flajolet, Gardy, and Thimonier. In our main results, we show a ratio limit theorem for the random time of the generalized coupon collector problem and, further, give the leading term and the geometric rate for the distribution of this random time when the number of throws is large. For the classical coupon collector problem, we give a bound on the conditional second moment for the number of visits to the coupons, relying strongly on a result of Holst on extremal distributions. © 2004 Wiley Periodicals, Inc. Random Struct. Alg. 2004 [source]
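As classical background for the bounds above (standard results, not taken from the paper): if T_n denotes the number of throws needed to collect all of n equally likely coupons, then

\[
\mathbb{E}[T_n] = n \sum_{k=1}^{n} \frac{1}{k} = n \ln n + \gamma n + O(1),
\qquad
\mathbb{P}\bigl(T_n \le n \ln n + c\, n\bigr) \;\longrightarrow\; e^{-e^{-c}} \quad (n \to \infty),
\]

the second statement being the classical Erdős–Rényi limit theorem, which is the kind of limit behaviour that the ratio limit theorem above refines in a more general setting.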
Reversible coagulation–fragmentation processes and random combinatorial structures: Asymptotics for the number of groups
RANDOM STRUCTURES AND ALGORITHMS, Issue 2 2004. Michael M. Erlihson
Abstract: The equilibrium distribution of a reversible coagulation–fragmentation process (CFP) and the joint distribution of components of a random combinatorial structure (RCS) are given by the same probability measure on the set of partitions. We establish a central limit theorem for the number of groups (= components) in the case a(k) = q k^(p−1), k ≥ 1, q, p > 0, where a(k), k ≥ 1, is the parameter function that induces the invariant measure. The result obtained is compared with the ones for logarithmic RCSs and for RCSs corresponding to the case p < 0. © 2004 Wiley Periodicals, Inc. Random Struct. Alg. 2004 [source]

Decay of correlations and the central limit theorem for meromorphic maps
COMMUNICATIONS ON PURE & APPLIED MATHEMATICS, Issue 5 2006. Tien-Cuong Dinh
Let f be a dominant meromorphic self-map of large topological degree on a compact Kähler manifold. We give a new construction of the equilibrium measure μ of f and prove that μ is exponentially mixing. As a consequence, we get the central limit theorem, in particular for Hölder-continuous observables, but also for noncontinuous observables. © 2005 Wiley Periodicals, Inc. [source]

Central limit theorems for nonparametric estimators with real-time random variables
JOURNAL OF TIME SERIES ANALYSIS, Issue 5 2010. Tae Yoon Kim
In this article, asymptotic theories for nonparametric methods are studied when they are applied to real-time data. In particular, we derive central limit theorems for nonparametric density and regression estimators. For this we formally introduce a sequence of real-time random variables indexed by a parameter related to fine gridding of the time domain (or fine discretization). Our results show that the impact of fine gridding is greater in the density estimation case, in the sense that strong dependence due to fine gridding severely affects the major strength of the nonparametric density estimator (its data-adaptive property). In addition, we discuss some issues concerning the nonparametric regression model with fine gridding of the time domain. [source]
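As a reference point for the density-estimation results described above (the standard i.i.d. kernel-estimator benchmark in conventional notation, not the real-time, fine-gridding setting of the paper):

\[
\hat f_h(x) = \frac{1}{n h} \sum_{i=1}^{n} K\!\Bigl(\frac{x - X_i}{h}\Bigr),
\qquad
\sqrt{n h}\,\bigl(\hat f_h(x) - \mathbb{E}\,\hat f_h(x)\bigr) \;\xrightarrow{\;d\;}\; \mathcal{N}\Bigl(0,\; f(x) \int K^2(u)\, du\Bigr)
\]

under the usual conditions h → 0 and nh → ∞. The abstract's point is that the strong dependence created by fine gridding of the time domain can break precisely this kind of local, data-adaptive behaviour for density estimators, while regression estimators are less affected.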
André Dabrowski's work on limit theorems and weak dependence
THE CANADIAN JOURNAL OF STATISTICS, Issue 3 2009. Herold Dehling
Abstract: André Robert Dabrowski, Professor of Mathematics and Dean of the Faculty of Sciences at the University of Ottawa, died October 7, 2006, after a short battle with cancer. The author of the present paper, a long-term friend and collaborator of André Dabrowski, gives a survey of André's work on weak dependence and limit theorems in probability theory. The Canadian Journal of Statistics 37: 307–326; 2009. © 2009 Statistical Society of Canada [source]