Probability Function (probability + function)
Selected Abstracts

Temporal Extrapolation of PVA Results in Relation to the IUCN Red List Criterion E
CONSERVATION BIOLOGY, Issue 1, 2003
Oskar Kindvall
For a population viability analysis (PVA) to be useful for assessing a species' threat category, the results must be expressed as the full extinction probability function over time, or at least for the three specified time frames. Often this is not the case, and extrapolations from different kinds of PVA results (e.g., mean time to extinction) are often necessary. By means of analytic models, we investigated the possibilities of extrapolation. Based on our results, we suggest that extrapolation is not advisable because of the huge errors that can occur. The extinction probability function is the best kind of summary statistic to use when applying criterion E, but even then the threat categorization may be ambiguous. If the extinction risk is low in the near future but increases rapidly later on, a species may be classified as vulnerable even though it is expected to become extinct within 100 years. To avoid this, we suggest that the guidelines to the IUCN Red List criteria include three reference lines that allow for interpretation of the PVA results in the context of the three threat categories within the entire period of 100 years. If the estimated extinction probability function overshoots one of these functions, the species should be classified accordingly.
Resumen (translated from the Spanish): The IUCN Red List threat categories (critically endangered, endangered, and vulnerable) are defined by a set of criteria (A-E). Criterion E is defined quantitatively by three specified extinction-risk limits (50%, 20%, and 10%), each of which is associated with a particular time frame. For a population viability analysis (PVA) to be usable in assessing a species' threat category, the results must be expressed as the full extinction probability function over time, or at least for the three specified time frames. Frequently this is not the case, and extrapolations from different kinds of PVA results (e.g., mean time to extinction) are usually necessary. By means of analytic models, we investigated the possibilities of extrapolation. Based on our results, we suggest that extrapolation is not advisable because of the enormous errors that can occur. The extinction probability function is the best kind of summary statistic to use when applying criterion E, but even so, the threat categorization may be ambiguous. If the extinction risk is low in the immediate future but increases very rapidly afterwards, a species may be classified as vulnerable even though it is expected to go extinct within the next 100 years. To avoid this, we suggest that the guidelines to the IUCN Red List criteria include three reference lines that allow interpretation of the PVA results in the context of the three categories over the full 100-year period. If the estimated extinction probability function overshoots one of these functions, the species should be classified accordingly. [source]
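A short sketch may make the ambiguity discussed above concrete: an estimated extinction probability function is checked against criterion E's thresholds and time frames. The threshold/horizon pairs below follow criterion E as commonly stated (50% within 10 years, 20% within 20 years, 10% within 100 years, ignoring the generation-based alternatives); the extinction curve itself is hypothetical, not data from the paper.

```python
# Sketch: classify a species under IUCN criterion E from an estimated
# extinction probability function P_ext(t). Thresholds and time frames
# follow criterion E as summarized above; the curve is hypothetical.
import numpy as np

CRITERION_E = [                      # (category, risk threshold, horizon in years)
    ("Critically Endangered", 0.50, 10),
    ("Endangered",            0.20, 20),
    ("Vulnerable",            0.10, 100),
]

def classify(times, p_ext):
    """Most severe category whose threshold is reached within its horizon."""
    for category, threshold, horizon in CRITERION_E:
        if np.any(p_ext[times <= horizon] >= threshold):
            return category
    return "Not listed under criterion E"

# Hypothetical extinction probability function: low risk early, steep rise later
# (the ambiguous case the abstract warns about).
t = np.linspace(0, 100, 101)
p = 1.0 - np.exp(-(t / 80.0) ** 4)

print(classify(t, p))   # 'Vulnerable', even though p is roughly 0.9 at year 100
```

The reference lines proposed in the abstract would, in effect, extend each threshold over the whole 100-year period, so a curve like this one could trigger a higher category than the plain threshold check above.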
Vectorial summation of probabilistic current harmonics in power systems: From a bivariate distribution model towards a univariate probability function
EUROPEAN TRANSACTIONS ON ELECTRICAL POWER, Issue 1, 2000
Y. J. Wang
This paper extends the investigation into the bivariate normal distribution (BND) model, which has been widely used to study the asymptotic behaviour of the sum of a sufficiently large number of randomly varying harmonic phasors (of the same frequency). Although the BND model is effective and applicable to most problems involving harmonic summation, its main drawback resides in the computation time required to extract the probability density function of the harmonic magnitude from the two-dimensional BND model. This paper proposes a novel approach to the problem by assimilating the generalized Gamma distribution (GGD) model to the marginal distribution (the magnitude) of the BND using the method of moments. The proposed method can accurately estimate the parameters of the GGD model without time-consuming calculation. A power system containing ten harmonic sources is taken as an example, for which the Monte Carlo simulation, the BND model and the GGD model are compared and discussed. The comparison shows that the GGD model approximates the BND model very well. [source]

On Estimation in M/G/c/c Queues
INTERNATIONAL TRANSACTIONS IN OPERATIONAL RESEARCH, Issue 6, 2001
Mei Ling Huang
We derive the minimum variance unbiased estimator (MVUE) and the maximum likelihood estimator (MLE) of the stationary probability function (pf) of the number of customers in a collection of independent M/G/c/c subsystems. It is assumed that the offered load and number of servers in each subsystem are unknown. We assume that observations of the total number of customers in the system are utilized, because it may be impractical or impossible to observe individual server occupancies. Both estimators depend on the R distribution (the distribution of the sum of independent right-truncated Poisson random variables) and R numbers. [source]
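As background to why sums of right-truncated Poisson variables appear here: by the insensitivity property of the Erlang loss system, the stationary pf of a single M/G/c/c subsystem with offered load a is the right-truncated Poisson p(k) proportional to a^k / k! on {0, ..., c}, and the pf of the total count over independent subsystems is the convolution of these pfs. A minimal sketch with made-up loads and server counts follows; the paper's estimation procedure itself is not reproduced.

```python
# Sketch: stationary pf of an M/G/c/c subsystem (right-truncated Poisson with
# the offered load as parameter) and the pf of the total number of customers
# in several independent subsystems, obtained by convolution.
# Offered loads and server counts are invented for illustration.
import numpy as np
from math import factorial

def mgcc_pf(offered_load, c):
    """Right-truncated Poisson pf on {0, ..., c}."""
    weights = np.array([offered_load ** k / factorial(k) for k in range(c + 1)])
    return weights / weights.sum()

def total_pf(subsystems):
    """pf of the total customer count across independent subsystems."""
    total = np.array([1.0])
    for load, c in subsystems:
        total = np.convolve(total, mgcc_pf(load, c))
    return total

pf = total_pf([(2.0, 3), (0.8, 2), (4.5, 5)])   # hypothetical subsystems
print(pf.sum())        # ~1.0
print(pf.argmax())     # most likely total number of customers in the system
```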
Micro- and macroscopic characteristics to stage gonadal maturation of female Baltic cod
JOURNAL OF FISH BIOLOGY, Issue 2, 2003
J. Tomkiewicz
A set of histological characteristics to judge ovarian development was established and used to elaborate morphological criteria of 10 maturity stages of Baltic cod Gadus morhua sampled throughout the annual cycle to represent different macroscopic maturity stages. The applied characteristics confirmed most stages of the macroscopic scale, but the separation of late immature and resting mature females remained imprecise. Atretic vitellogenic oocytes or encapsulated residual eggs identified the resting condition morphologically, but not all ovaries with visible signs of previous spawning showed such features. One ovarian stage that was previously classified as 'ripening' was changed to 'spawning', owing to the prevalence of hydrated eggs and empty follicles. Ovaries with malfunctions were defined by a separate stage. Macroscopic criteria were revised by comparing the gross anatomy of ovaries with their histology. Female length and gonado-somatic index supported stage definitions, but substantial variation in Fulton's condition factor and the hepato-somatic index rendered these of little use for this purpose. The time of sampling influenced staging accuracy. A female spawner probability function based on the proportion of ripening and ripe specimens in early spring seems to be the most appropriate method to estimate spawner biomass and reproductive potential. [source]

RANS-simulation of premixed turbulent combustion using the level set approach
PROCEEDINGS IN APPLIED MATHEMATICS & MECHANICS, Issue 1, 2005
A. Kurenkov
A model for premixed turbulent combustion is investigated using a RANS approach. The evolution of the flame front is described with the help of the level set approach [1], which is used for tracking propagating interfaces in free-surface flows, geodesics, grid generation and combustion. The fluid properties are conditioned on the flame front position using a burnt/unburnt probability function across the flame front. Computations are performed using the code FASTEST-3D, a flow solver for non-orthogonal, block-structured grids. (© 2005 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim) [source]

Counts with an endogenous binary regressor: A series expansion approach
THE ECONOMETRICS JOURNAL, Issue 1, 2005
Andrés Romeu
Summary: We propose an estimator for count data regression models where a binary regressor is endogenously determined. This estimator departs from previous approaches by using a flexible form for the conditional probability function of the counts. Using a Monte Carlo experiment, we show that our estimator improves the fit and provides a more reliable estimate of the impact of regressors on the count when compared to alternatives that restrict the mean to be linear-exponential. In an application to the number of trips by households in the United States, we find that the estimate of the treatment effect obtained is considerably different from the one obtained under a linear-exponential mean specification. [source]

Domestic dogs as an edge effect in the Brasília National Park, Brazil: interactions with native mammals
ANIMAL CONSERVATION, Issue 5, 2009
A. C. R. Lacerda
Abstract: Edge effects are a well-known result of habitat fragmentation. However, little has been published on fragmentation, isolation and the intrusive influence from the surrounding matrix at the landscape level. The objectives of the present study are to evaluate the presence of dogs in the Brasília National Park (BNP) in relation to habitat type and the influence from the surrounding matrix. In addition, this study examines the response of the native mammal fauna to the presence of dogs. Track stations were built along dirt roads in the BNP and subsequently examined for the presence or absence of tracks. We used a stepwise logistic regression to model the occurrence of five mammal species relative to habitat variables, with an α = 0.05 criterion to determine whether to enter and retain a variable in the model. A simulation of each species' occurrence probability was conducted using a combination of selected habitat variables in a resource selection probability function. Results indicate a negative relationship between distance from the BNP edge and the probability of dog occurrence. From an ecological perspective, the presence of dogs inside the BNP indicates an edge effect. The occurrence of the maned wolf was positively associated with distance from a garbage dump site and negatively associated with the presence of dog tracks. The maned wolf and giant anteater seem to avoid areas near the garbage dump as well as areas with dog tracks. There is no support for the possible existence of a feral dog population inside the BNP, but the effects of free-ranging dogs on the wildlife population in such an isolated protected area must not be neglected. Domestic dog Canis familiaris populations and disease control programs should be established in the urban, suburban and rural areas surrounding the BNP, along with the complete removal of the garbage dump from the BNP surroundings. [source]
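The occurrence model described above is, in essence, a logistic (resource selection probability) regression of presence/absence at track stations on habitat variables. A minimal sketch of that kind of fit is given below; the predictor names and simulated data are hypothetical, and the paper's actual variables and stepwise-selection procedure are not reproduced (the summary's p-values would guide entry and retention at the 0.05 level).

```python
# Sketch: logistic model of species occurrence at track stations, in the
# spirit of the resource selection analysis described above.
# All variable names and data are hypothetical.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
dist_edge = rng.uniform(0, 8, n)      # km to park edge (hypothetical)
dist_dump = rng.uniform(0, 10, n)     # km to garbage dump (hypothetical)
dog_tracks = rng.integers(0, 2, n)    # dog tracks present at the station

# Simulated occurrence of a native species: more likely far from the dump
# and where no dog tracks were recorded.
logit = -1.0 + 0.3 * dist_dump - 1.2 * dog_tracks
occurrence = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

X = sm.add_constant(np.column_stack([dist_edge, dist_dump, dog_tracks]))
fit = sm.Logit(occurrence, X).fit(disp=False)
print(fit.summary())   # keep a variable if its p-value is below 0.05
```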
Fisher Information Matrix of the Dirichlet-multinomial Distribution
BIOMETRICAL JOURNAL, Issue 2, 2005
Sudhir R. Paul
Abstract: In this paper we derive explicit expressions for the elements of the exact Fisher information matrix of the Dirichlet-multinomial distribution. We show that the exact calculation is based on the beta-binomial probability function rather than that of the Dirichlet-multinomial, and this makes the exact calculation quite easy. The exact results are expected to be useful for the calculation of standard errors of the maximum likelihood estimates of the beta-binomial parameters and those of the Dirichlet-multinomial parameters for data that arise in practice in toxicology and other similar fields. Standard errors of the maximum likelihood estimates of the beta-binomial and Dirichlet-multinomial parameters, based on the exact Fisher information matrix and on the asymptotic Fisher information matrix derived from the Dirichlet distribution, are obtained for a set of data from Haseman and Soares (1976), a dataset from Mosimann (1962) and a more recent dataset from Chen, Kodell, Howe and Gaylor (1991). There is a substantial difference between the standard errors of the estimates based on the exact Fisher information matrix and those based on the asymptotic Fisher information matrix. (© 2005 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim) [source]

Test of Marginal Compatibility and Smoothing Methods for Exchangeable Binary Data with Unequal Cluster Sizes
BIOMETRICS, Issue 1, 2007
Zhen Pang
Summary: Exchangeable binary data are often collected in developmental toxicity and other studies, and a whole host of parametric distributions for fitting this kind of data have been proposed in the literature. While these distributions can be matched to have the same marginal probability and intra-cluster correlation, they can be quite different in terms of shape and higher-order quantities of interest, such as the litter-level risk of having at least one malformed fetus. A sensible alternative is to fit a saturated model (Bowman and George, 1995, Journal of the American Statistical Association 90, 871–879) using the expectation-maximization (EM) algorithm proposed by Stefanescu and Turnbull (2003, Biometrics 59, 18–24). The assumption of compatibility of marginal distributions is often made to link up the distributions for different cluster sizes so that estimation can be based on the combined data. Stefanescu and Turnbull proposed a modified trend test to test this assumption. Their test, however, fails to take into account the variability of an estimated null expectation and, as a result, leads to inaccurate p-values. This drawback is rectified in this article. When the data are sparse, the probability function estimated using a saturated model can be very jagged and some kind of smoothing is needed. We extend the penalized likelihood method (Simonoff, 1983, Annals of Statistics 11, 208–218) to the present case of unequal cluster sizes and implement the method using an EM-type algorithm. In the presence of covariates, we propose a penalized kernel method that performs smoothing in both the covariate and response space. The proposed methods are illustrated using several data sets, and the sampling and robustness properties of the resulting estimators are evaluated by simulations. [source]
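In the usual formulation (the one underlying the test discussed above), marginal compatibility links the probability functions across cluster sizes: if p_n(y) is the probability that a cluster of size n contains y positive responses, the size-(n-1) pf must equal what results from dropping one member of a size-n cluster at random, p_{n-1}(y) = ((n-y)/n) p_n(y) + ((y+1)/n) p_n(y+1). A small sketch of that marginalization step, using an arbitrary example pf rather than data from the paper:

```python
# Sketch: marginalizing the pf of an exchangeable binary cluster of size n
# down to size n-1 by dropping one member at random. Under marginal
# compatibility this reproduces the size-(n-1) pf. Example values are arbitrary.
import numpy as np

def drop_one(pf_n):
    """pf_n[y] = P(Y = y) for a cluster of size n = len(pf_n) - 1.
    Returns the implied pf for cluster size n - 1."""
    n = len(pf_n) - 1
    pf_prev = np.zeros(n)
    for y in range(n):
        pf_prev[y] = pf_n[y] * (n - y) / n + pf_n[y + 1] * (y + 1) / n
    return pf_prev

p4 = np.array([0.35, 0.20, 0.15, 0.10, 0.20])   # example pf for clusters of size 4
p3 = drop_one(p4)
print(p3, p3.sum())   # a valid pf for clusters of size 3 (sums to 1)
```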
Identification of the effective distribution function for determination of the distributed activation energy models using the maximum likelihood method: Isothermal thermogravimetric data
INTERNATIONAL JOURNAL OF CHEMICAL KINETICS, Issue 1, 2009
Bojan Janković
A new procedure for identifying the effective distribution function in distributed activation energy models, based on the maximum likelihood method (MLM), was established. Five different continuous probability functions (exponential, logistic, normal, gamma, and Weibull; the extended set of distributions) were used to search for the best reactivity model for two heterogeneous processes: (a) the isothermal reduction of nickel oxide under a hydrogen atmosphere and (b) the isothermal degradation of bisphenol-A polycarbonate (Lexan) under a nitrogen atmosphere. The MLM showed that, for both processes, the most suitable reactivity model is the Weibull distribution model. It was concluded that the values of the Arrhenius parameters (ln A and Ea), evaluated from the Weibull distribution model, represent the effective kinetic values for both processes. The procedure enables identification of the suitable distribution model for a given process from the experimental data alone (based on the shapes of the obtained integral kinetic curves), which is the advantage of the established analysis. The established mathematical procedure, based on the MLM, can be applied as a preliminary analysis for evaluating the distribution of activation energies of complex heterogeneous processes. © 2008 Wiley Periodicals, Inc. Int J Chem Kinet 41: 27–44, 2009 [source]

On convexity of MQAM's and MPAM's bit error probability functions
INTERNATIONAL JOURNAL OF COMMUNICATION SYSTEMS, Issue 11, 2009
M. Naeem
Abstract: For MQAM and MPAM with practical values of M and Gray mapping, we provide a rigorous proof that the associated bit error probability (BEP) functions are convex in the signal-to-noise ratio per symbol. The proof employs Taylor series expansions of the BEP functions' second derivatives and term-by-term comparisons between positive and negative terms. Convexity results are useful for optimizing communication systems, as in optimizing adaptive transmission policies. Copyright © 2009 John Wiley & Sons, Ltd. [source]
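To get a feel for the property being proved, the Gray-mapped square-MQAM BEP is commonly approximated by P_b(γ) ≈ (4/log2 M)(1 − 1/√M) Q(√(3γ/(M−1))), with γ the SNR per symbol. The sketch below checks convexity of this standard approximation numerically; it is not the paper's proof, which concerns the exact Gray-mapped BEP expressions.

```python
# Sketch: numerical convexity check of a standard Gray-mapped square-MQAM
# BEP approximation as a function of the (linear) SNR per symbol.
# This uses the common approximation, not the exact expressions of the paper.
import numpy as np
from scipy.special import erfc

def qfunc(x):
    return 0.5 * erfc(x / np.sqrt(2))

def bep_mqam(snr, M):
    """Approximate Gray-mapped square-MQAM bit error probability."""
    return (4 / np.log2(M)) * (1 - 1 / np.sqrt(M)) * qfunc(np.sqrt(3 * snr / (M - 1)))

snr = np.linspace(0.01, 100, 20000)          # uniform grid of symbol SNRs
for M in (4, 16, 64, 256):
    p = bep_mqam(snr, M)
    second_diff = p[:-2] - 2 * p[1:-1] + p[2:]
    print(M, bool((second_diff >= -1e-15).all()))   # True -> convex on this grid
```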
Reliability of power stations: Stochastic versus derated power approach
INTERNATIONAL JOURNAL OF ENERGY RESEARCH, Issue 2, 2004
Kris R. Voorspools
Abstract: Consideration of the possible forced outage of individual power stations leads to a large number of possible states of the power-generating system, each with its own probability. It is possible to design a stochastic method that properly takes all of these possibilities into account and weighs them accordingly. In broader energy models, instead of these stochastic techniques, which require a considerable amount of calculation time, mostly approximative static simplified methods are applied. Up till now, these simplified techniques have not been validated; the scope of this paper is to check their validity. Therefore, two approaches are compared: a complete stochastic approach and a method based on the derated power (the nominal power multiplied by the average availability) of the individual plants. The conclusion of this comparison is that derated power may be used in energy modelling instead of the complicated stochastic approach. The error made is very small, and the correlation between the unserved-load probability functions obtained by both methods is excellent. Copyright © 2004 John Wiley & Sons, Ltd. [source]
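The two approaches compared in this paper can be sketched in a few lines: the stochastic approach builds the full probability function of available capacity by convolving the two-state (available / forced-out) distributions of the individual units, while the derated shortcut replaces each unit by its nominal power times its availability. The unit sizes and forced outage rates below are invented for illustration.

```python
# Sketch: stochastic capacity-outage distribution of a set of power stations
# (convolution of independent two-state units) versus the derated-power
# shortcut. Unit data are invented for illustration.
import numpy as np

units = [  # (nominal power in MW, forced outage rate)
    (400, 0.08), (400, 0.08), (250, 0.05), (150, 0.10),
]
STEP = 50  # MW grid resolution; all unit sizes here are multiples of STEP

def available_capacity_pf(units, step=STEP):
    """Probability function of total available capacity on a MW grid."""
    pf = np.zeros(sum(p for p, _ in units) // step + 1)
    pf[0] = 1.0
    for power, outage_rate in units:
        k = power // step
        shifted = np.zeros_like(pf)
        shifted[k:] = pf[:-k] if k else pf          # unit available: add its power
        pf = outage_rate * pf + (1 - outage_rate) * shifted
    return pf

pf = available_capacity_pf(units)
capacity = np.arange(len(pf)) * STEP

expected_capacity = float((pf * capacity).sum())
derated_capacity = sum(p * (1 - q) for p, q in units)
print(expected_capacity, derated_capacity)    # identical in expectation
print(float(pf[capacity < 900].sum()))        # P(available capacity < 900 MW),
                                              # which the derated view cannot give
```

The expectations agree by construction; the point of the stochastic approach is the full distribution (and hence quantities such as unserved-load probabilities), which the derated shortcut collapses into a single number.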