Central Limit Theorem

Selected Abstracts


Central limit theorems for nonparametric estimators with real-time random variables

JOURNAL OF TIME SERIES ANALYSIS, Issue 5 2010
Tae Yoon Kim
Classifications: Primary 62G07, 62F12; Secondary 62M05; C13, C14. In this article, asymptotic theories for nonparametric methods are studied when they are applied to real-time data. In particular, we derive central limit theorems for nonparametric density and regression estimators. For this we formally introduce a sequence of real-time random variables indexed by a parameter related to fine gridding of the time domain (or fine discretization). Our results show that the impact of fine gridding is greater in the density estimation case, in the sense that the strong dependence due to fine gridding severely affects the major strength of the nonparametric density estimator (its data-adaptive property). In addition, we discuss some issues concerning the nonparametric regression model with fine gridding of the time domain. [source]
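The following is a minimal sketch of the setting this abstract describes: a kernel density estimate computed on finely gridded, strongly dependent data, together with a naive CLT-based confidence interval. The AR(1) data-generating process, the Gaussian kernel and the rule-of-thumb bandwidth are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative real-time sample: an AR(1) path observed on a fine time grid.
# Finer gridding (more observations over a fixed horizon) means stronger
# dependence between neighbouring observations -- the effect studied above.
n, phi = 5000, 0.9
x = np.empty(n)
x[0] = rng.normal()
for t in range(1, n):
    x[t] = phi * x[t - 1] + rng.normal()

def kde_at(x0, data, h):
    """Gaussian-kernel density estimate at a single point x0."""
    u = (x0 - data) / h
    return np.exp(-0.5 * u**2).sum() / (len(data) * h * np.sqrt(2 * np.pi))

h = 1.06 * x.std() * n ** (-1 / 5)   # rule-of-thumb bandwidth (i.i.d. heuristic)
x0 = 0.0
f_hat = kde_at(x0, x, h)

# Naive CLT variance approximation Var ~ f(x0) * R(K) / (n h), with
# R(K) = 1 / (2*sqrt(pi)) for the Gaussian kernel; under the strong dependence
# induced by fine gridding this i.i.d. formula can be badly optimistic.
se = np.sqrt(f_hat / (2 * np.sqrt(np.pi)) / (n * h))
print(f"f_hat({x0}) = {f_hat:.4f}  +/- {1.96 * se:.4f}  (naive 95% CI)")
```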


Solute transport in sand and chalk: a probabilistic approach

HYDROLOGICAL PROCESSES, Issue 5 2006
E. Carlier
Abstract A probabilistic approach is used to simulate particle tracking for two types of porous medium. The first is sand grains with a single intergranular porosity. Particle tracking is carried out by advection and dispersion. The second is chalk granulates with intergranular and matrix porosities. Sorption can occur with advection and dispersion during particle tracking. Particle tracking is modelled as the sum of elementary steps with independent random variables in the sand medium. An exponential distribution is obtained for each elementary step and shows that the whole process is Markovian. A Gamma distribution or probability density function is then deduced. The relationships between dispersivity and the elementary step are given using the central limit theorem. Particle tracking in the chalky medium is a non-Markovian process. The probability density function depends on a power of the distance. Experimental simulations by dye tracer tests on a column have been performed for different distances and discharges. The probabilistic approach computations are in good agreement with the experimental data. The probabilistic computation seems an interesting approach, complementary to traditional numerical methods, for simulating transfer phenomena in porous media. Copyright © 2006 John Wiley & Sons, Ltd. [source]
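A small simulation sketch of the Markovian case described above: the travel time is a sum of independent exponential elementary steps, so its exact distribution is Gamma and its large-step approximation, by the central limit theorem, is normal. The step mean, step count and particle count below are made-up illustration values.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical Markovian particle tracking: the time spent in each elementary
# step is exponential with mean tau, and the travel time over n_steps steps is
# the sum of independent exponentials.
tau, n_steps, n_particles = 2.0, 50, 100_000
steps = rng.exponential(tau, size=(n_particles, n_steps))
arrival = steps.sum(axis=1)

# Exact distribution of the sum: Gamma(shape=n_steps, scale=tau);
# CLT approximation: Normal(mean=n_steps*tau, std=sqrt(n_steps)*tau).
gamma_exact = stats.gamma(a=n_steps, scale=tau)
normal_clt = stats.norm(loc=n_steps * tau, scale=tau * np.sqrt(n_steps))

probs = [0.05, 0.5, 0.95]
print("empirical quantiles :", np.round(np.quantile(arrival, probs), 1))
print("Gamma quantiles     :", np.round(gamma_exact.ppf(probs), 1))
print("normal CLT quantiles:", np.round(normal_clt.ppf(probs), 1))
```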


The Early History of the Cumulants and the Gram-Charlier Series

INTERNATIONAL STATISTICAL REVIEW, Issue 2 2000
Anders Hald
Summary: The early history of the Gram-Charlier series is discussed from three points of view: (1) a generalization of Laplace's central limit theorem, (2) a least squares approximation to a continuous function by means of Chebyshev-Hermite polynomials, (3) a generalization of Gauss's normal distribution to a system of skew distributions. Thiele defined the cumulants in terms of the moments, first by a recursion formula and later by an expansion of the logarithm of the moment generating function. He devised a differential operator which adjusts any cumulant to a desired value. His little-known 1899 paper in Danish on the properties of the cumulants is translated into English in the Appendix. [source]
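As a concrete companion to Thiele's definitions mentioned above, the sketch below converts raw moments to cumulants using the standard recursion linking them (equivalent to expanding the logarithm of the moment-generating function). The function name and the normal-distribution example are our own illustration, not from Hald's paper.

```python
from math import comb

def cumulants_from_moments(m):
    """Raw moments m = [m1, m2, ...] -> cumulants [k1, k2, ...] using the
    standard recursion  m_n = sum_{k=1}^{n} C(n-1, k-1) * kappa_k * m_{n-k},
    with m_0 = 1, solved for kappa_n at each step."""
    m = [1.0] + list(m)                 # prepend m_0 = 1
    kappa = []
    for n in range(1, len(m)):
        s = sum(comb(n - 1, k - 1) * kappa[k - 1] * m[n - k]
                for k in range(1, n))
        kappa.append(m[n] - s)
    return kappa

# Example: the standard normal has raw moments 0, 1, 0, 3, so its cumulants
# beyond the variance vanish.
print(cumulants_from_moments([0, 1, 0, 3]))   # -> [0.0, 1.0, 0.0, 0.0]
```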


The lognormal distribution is not an appropriate null hypothesis for the species–abundance distribution

JOURNAL OF ANIMAL ECOLOGY, Issue 3 2005
MARK WILLIAMSON
Summary:
1. Of the many models for species–abundance distributions (SADs), the lognormal has been the most popular and has been put forward as an appropriate null model for testing against theoretical SADs. In this paper we explore a number of reasons why the lognormal is not an appropriate null model, or indeed an appropriate model of any sort, for a SAD.
2. We use three empirical examples, based on published data sets, to illustrate features of SADs in general and of the lognormal in particular: the abundance of British breeding birds, the number of trees > 1 cm diameter at breast height (d.b.h.) on a 50 ha Panamanian plot, and the abundance of certain butterflies trapped at Jatun Sacha, Ecuador. The first two are complete enumerations and show left skew under logarithmic transformation; the third is an incomplete enumeration and shows right skew.
3. Fitting SADs by a χ2 test is less efficient and less informative than fitting probability plots. The left skewness of complete enumerations seems to arise from a lack of extremely abundant species rather than from a surplus of rare ones. One consequence is that the logit-normal, which stretches the right-hand end of the distribution, consistently gives a slightly better fit.
4. The central limit theorem predicts lognormality of abundances within species but not between them, and so is not a basis for the lognormal SAD. Niche breakage and population dynamical models can predict a lognormal SAD but equally can predict many other SADs.
5. The lognormal sits uncomfortably between distributions with infinite variance and the log-binomial. The latter removes the absurdity of the invisible, highly abundant half of the individuals' abundance curve predicted by the lognormal SAD. The veil line is a misunderstanding of the sampling properties of the SAD, and fitting the Poisson lognormal is not satisfactory. A satisfactory SAD should have a thinner right-hand tail than the lognormal, as is observed empirically.
6. The SAD for logarithmic abundance cannot be Gaussian. [source]
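Point 4 can be illustrated with a quick simulation: multiplicative population dynamics make log abundance within one species approximately normal by the CLT, while a between-species abundance distribution generated by a different mechanism need not be lognormal at all. The multiplicative-growth model and the log-series draw below are illustrative assumptions, not the paper's data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Within one species: multiplicative growth N_T = N_0 * exp(sum of shocks).
# By the CLT, log N_T is approximately normal, i.e. N_T is lognormal.
T, reps = 200, 5000
log_n = np.log(100) + rng.normal(0, 0.1, size=(reps, T)).sum(axis=1)
print("skew of within-species log abundance :", round(stats.skew(log_n), 3))

# Between species: abundances drawn from a log-series-like mechanism; the SAD
# on a log scale can be strongly skewed rather than Gaussian.
abund = rng.logseries(0.995, size=5000)
print("skew of between-species log abundance:", round(stats.skew(np.log(abund)), 3))
```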


Proportional hazards estimate of the conditional survival function

JOURNAL OF THE ROYAL STATISTICAL SOCIETY: SERIES B (STATISTICAL METHODOLOGY), Issue 4 2000
Ronghui Xu
We introduce a new estimator of the conditional survival function given some subset of the covariate values under a proportional hazards regression. The new estimate does not require estimating the base-line cumulative hazard function. An estimate of the variance is given and is easy to compute, involving only those quantities that are routinely calculated in a Cox model analysis. The asymptotic normality of the new estimate is shown by using a central limit theorem for Kaplan–Meier integrals. We indicate the straightforward extension of the estimation procedure under models with multiplicative relative risks, including non-proportional hazards, and to stratified and frailty models. The estimator is applied to a gastric cancer study where it is of interest to predict patients' survival based only on measurements obtained before surgery, the time at which the most important prognostic variable, stage, becomes known. [source]
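For orientation, here is the conventional route to a conditional survival curve under a Cox model, using the lifelines library and its baseline-hazard estimate. This is explicitly not the paper's estimator (which avoids estimating the baseline cumulative hazard); the Rossi recidivism data and the covariate values are purely illustrative.

```python
import pandas as pd
from lifelines import CoxPHFitter
from lifelines.datasets import load_rossi

# Conventional Cox-model prediction of a conditional survival curve, relying
# on an estimated baseline hazard -- the step the paper's estimator avoids.
df = load_rossi()
cph = CoxPHFitter()
cph.fit(df, duration_col="week", event_col="arrest")

# Survival curve for a hypothetical new subject, given covariate values only.
new_subject = pd.DataFrame([{"fin": 0, "age": 30, "race": 1, "wexp": 1,
                             "mar": 0, "paro": 1, "prio": 2}])
surv = cph.predict_survival_function(new_subject)
print(surv.head())
```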


Identification of Persistent Cycles in Non-Gaussian Long-Memory Time Series

JOURNAL OF TIME SERIES ANALYSIS, Issue 4 2008
Mohamed Boutahar
Abstract. The asymptotic distribution is derived for the least squares estimates (LSE) in the unstable AR(p) process driven by a non-Gaussian long-memory disturbance. The characteristic polynomial of the autoregressive process is assumed to have pairs of complex roots on the unit circle. In order to describe the limiting distribution of the LSE, two limit theorems involving long-memory processes are established in this article. The first theorem gives the limiting distribution of the weighted sum Σ_{k=1}^{n} c_{n,k} z_k, where (z_k) is a non-Gaussian long-memory moving-average process and (c_{n,k}, 1 ≤ k ≤ n) is a given sequence of weights; the second theorem is a functional central limit theorem for the sine and cosine Fourier transforms of that process. [source]
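To make the objects concrete, the sketch below simulates a hypothetical non-Gaussian long-memory moving average via a truncated ARFIMA(0, d, 0) representation and evaluates normalized sine and cosine Fourier transforms of the sample at one frequency. The parameter values, the exponential innovations and the truncation length are our own choices, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical non-Gaussian long-memory moving average: truncated MA(inf)
# representation of ARFIMA(0, d, 0) driven by centred exponential innovations.
d, n, trunc = 0.3, 4000, 2000
psi = np.empty(trunc)
psi[0] = 1.0
for k in range(1, trunc):                 # psi_k = psi_{k-1} * (k - 1 + d) / k
    psi[k] = psi[k - 1] * (k - 1 + d) / k
eps = rng.exponential(1.0, size=n + trunc - 1) - 1.0
x = np.convolve(eps, psi, mode="valid")   # length-n long-memory sample

# Sine and cosine Fourier transforms at one fixed frequency, normalised by
# sqrt(n); the second theorem above concerns the joint limit of such terms.
lam = 0.5
t = np.arange(1, n + 1)
print("cosine transform:", (x * np.cos(lam * t)).sum() / np.sqrt(n))
print("sine transform  :", (x * np.sin(lam * t)).sum() / np.sqrt(n))
```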


Nonlinear functionals of the periodogram

JOURNAL OF TIME SERIES ANALYSIS, Issue 5 2002
GILLES FAY
A central limit theorem is stated for a wide class of triangular arrays of nonlinear functionals of the periodogram of a stationary linear sequence. These functionals may be singular and unbounded. The proof of this result is based on the Bartlett decomposition and an existing counterpart result for the periodogram of an independent and identically distributed sequence, here taken to be the driving noise. The main contribution of this paper is to prove the asymptotic negligibility of the remainder term from the Bartlett decomposition, which is feasible under a short-range dependence assumption. As highlighted by the applications (estimation of nonlinear functionals of the spectral density, robust spectral estimation, local polynomial approximation and log-periodogram regression), this extends many results that were until then tied to the Gaussian assumption. [source]
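A small sketch of the kind of object covered: the periodogram of a stationary linear (here AR(1)) sequence driven by non-Gaussian noise, and an unbounded nonlinear functional of it, the mean log-periodogram with the usual Euler constant bias correction. All concrete choices (the AR(1) model, Student-t noise, sample size) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)

# Stationary linear sequence: AR(1) driven by non-Gaussian (Student-t) noise.
n, phi = 2048, 0.5
e = rng.standard_t(df=5, size=n)
x = np.empty(n)
x[0] = e[0]
for t in range(1, n):
    x[t] = phi * x[t - 1] + e[t]

# Periodogram at the nonzero Fourier frequencies.
I = np.abs(np.fft.fft(x)[1:n // 2]) ** 2 / (2 * np.pi * n)

# A nonlinear, unbounded functional of the periodogram: the mean of
# log I(lambda_j); adding Euler's constant corrects the asymptotic bias of
# log-periodogram ordinates (as used in log-periodogram regression).
euler_gamma = 0.5772156649015329
print("bias-corrected mean log-periodogram:", np.mean(np.log(I)) + euler_gamma)
```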


Spectral Regression For Cointegrated Time Series With Long-Memory Innovations

JOURNAL OF TIME SERIES ANALYSIS, Issue 6 2000
D. Marinucci
Spectral regression is considered for cointegrated time series with long-memory innovations. The estimates we advocate are shown to be consistent when cointegrating relationships among stationary variables are investigated, while ordinary least squares are inconsistent due to correlation between the regressors and the cointegrating residuals; in the presence of unit roots, these estimates share the same asymptotic distribution as ordinary least squares. As a corollary of the main result, we provide a functional central limit theorem for quadratic forms in non-stationary fractionally integrated processes. [source]
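A minimal sketch, under simplified assumptions, of a band-spectrum (averaged-periodogram) regression near frequency zero for a cointegrated pair. It is not the paper's exact estimator, and the data-generating process (random-walk regressor, AR(1) cointegrating residual) and bandwidth are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(5)

# Cointegrated pair: x_t is a random walk and y_t = beta * x_t + u_t with a
# stationary AR(1) cointegrating residual u_t.
n, beta = 2000, 1.5
x = np.cumsum(rng.normal(size=n))
u = np.empty(n)
u[0] = rng.normal()
for t in range(1, n):
    u[t] = 0.6 * u[t - 1] + rng.normal()
y = beta * x + u

# Band-spectrum regression: ratio of averaged cross- and auto-periodogram
# ordinates over a band of m low Fourier frequencies.
m = 40
wx = np.fft.fft(x)[1:m + 1]
wy = np.fft.fft(y)[1:m + 1]
beta_hat = np.real(np.sum(np.conj(wx) * wy)) / np.sum(np.abs(wx) ** 2)
print("band-spectrum estimate of beta:", round(beta_hat, 3))
```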


A limit theorem for randomly stopped independent increment processes on separable metrizable groups

MATHEMATISCHE NACHRICHTEN, Issue 15 2007
Peter Becker-Kern
Abstract In the spirit of the classical random central limit theorem, a general limit theorem for random stopping in the scheme of infinitesimal triangular arrays on a separable metrizable group is presented. The approach incorporates and generalizes earlier results for normalized sequences of independent random variables on both separable Banach spaces and simply connected nilpotent Lie groups, originated by Siegel and Hazod, respectively. (© 2007 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim) [source]
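On the real line, the classical random central limit theorem that this result generalizes can be checked with a short simulation: a centred i.i.d. sum stopped at a random index N with N/n → 1 in probability is still asymptotically standard normal after the usual normalization. The choice of increments and stopping rule below is arbitrary, for illustration only.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)

# Random CLT check: stop a centred i.i.d. sum at a random index N whose ratio
# to n tends to 1, normalise, and compare the result to N(0, 1).
n, reps = 2000, 4000
vals = np.empty(reps)
for r in range(reps):
    N = rng.binomial(2 * n, 0.5)                # random index with N / n -> 1
    s = rng.uniform(-1.0, 1.0, size=N).sum()    # centred increments, Var = 1/3
    vals[r] = s / np.sqrt(N / 3.0)
print("KS distance to N(0,1):", round(stats.kstest(vals, "norm").statistic, 4))
```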


Variation mode and effect analysis: an application to fatigue life prediction

QUALITY AND RELIABILITY ENGINEERING INTERNATIONAL, Issue 2 2009
Pär Johannesson
Abstract We present an application of the probabilistic branch of variation mode and effect analysis (VMEA) implemented as a first-order, second-moment reliability method. First order means that the failure function is approximated as linear around the nominal values with respect to the main influencing variables, while second moment means that only means and variances are taken into account in the statistical procedure. We study the fatigue life of a jet engine component and aim at a safety margin that takes all sources of prediction uncertainties into account. Scatter is defined as random variation due to natural causes, such as non-homogeneous material, geometry variation within tolerances, load variation in usage, and other uncontrolled variations. Other uncertainties are unknown systematic errors, such as model errors in the numerical calculation of fatigue life, statistical errors in estimates of parameters, and unknown usage profile. By also treating systematic errors as random variables, the whole safety-margin problem is put into a common framework of second-order statistics. The final estimated prediction variance of the logarithmic life is obtained by summing the variance contributions of all sources of scatter and other uncertainties, and it represents the total uncertainty in the life prediction. Motivated by the central limit theorem, this logarithmic life random variable may be regarded as normally distributed, which gives possibilities to calculate relevant safety margins. Copyright © 2008 John Wiley & Sons, Ltd. [source]
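The first-order, second-moment computation described above amounts to summing squared sensitivity-times-standard-deviation contributions to obtain the prediction variance of log life. A minimal sketch follows, with entirely made-up sensitivities and standard deviations (not values from the jet-engine study).

```python
import numpy as np

# First-order, second-moment sketch: variance of predicted log life as the sum
# of squared (sensitivity x standard deviation) contributions from each source
# of scatter or systematic uncertainty.  All numbers are illustration values.
sources = {
    #                          sensitivity c_i, std dev s_i
    "material scatter":        (1.00, 0.12),
    "geometry within tol.":    (0.80, 0.05),
    "load variation in usage": (1.20, 0.10),
    "fatigue-model error":     (1.00, 0.15),
    "parameter estimation":    (1.00, 0.08),
}
var_log_life = sum((c * s) ** 2 for c, s in sources.values())
sigma = np.sqrt(var_log_life)

# Treating log life as normal (motivated by the CLT), a one-sided ~99% safety
# margin corresponds to about 2.33 standard deviations of log life.
print(f"std of log life       : {sigma:.3f}")
print(f"99% margin (log units): {2.33 * sigma:.3f}")
```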


Reversible coagulation–fragmentation processes and random combinatorial structures: Asymptotics for the number of groups

RANDOM STRUCTURES AND ALGORITHMS, Issue 2 2004
Michael M. Erlihson
Abstract The equilibrium distribution of a reversible coagulation-fragmentation process (CFP) and the joint distribution of components of a random combinatorial structure (RCS) are given by the same probability measure on the set of partitions. We establish a central limit theorem for the number of groups (= components) in the case a(k) = q k^{p-1}, k ≥ 1, q, p > 0, where a(k), k ≥ 1, is the parameter function that induces the invariant measure. The result obtained is compared with the ones for logarithmic RCSs and for RCSs corresponding to the case p < 0. © 2004 Wiley Periodicals, Inc. Random Struct. Alg. 2004 [source]


Decay of correlations and the central limit theorem for meromorphic maps

COMMUNICATIONS ON PURE & APPLIED MATHEMATICS, Issue 5 2006
Tien-Cuong Dinh
Let f be a dominant meromorphic self-map of large topological degree on a compact Kähler manifold. We give a new construction of the equilibrium measure μ of f and prove that μ is exponentially mixing. As a consequence, we get the central limit theorem, in particular for Hölder-continuous observables, but also for noncontinuous observables. © 2005 Wiley Periodicals, Inc. [source]

