Sampling Algorithm

Selected Abstracts


Deterministic Importance Sampling with Error Diffusion

COMPUTER GRAPHICS FORUM, Issue 4 2009
László Szirmay-Kalos
This paper proposes a deterministic importance sampling algorithm that is based on the recognition that delta-sigma modulation is equivalent to importance sampling. We propose a generalization for delta-sigma modulation in arbitrary dimensions, taking care of the curse of dimensionality as well. Unlike previous sampling techniques that transform low-discrepancy and highly stratified samples in the unit cube to the integration domain, our error diffusion sampler ensures the proper distribution and stratification directly in the integration domain. We also present applications, including environment mapping and global illumination rendering with virtual point sources. [source]
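The one-dimensional intuition behind an error-diffusion sampler can be sketched in a few lines (an illustrative toy, not the paper's multi-dimensional algorithm): sweep the domain, accumulate probability mass cell by cell, and deterministically emit a sample each time a full unit of mass has built up, carrying the residual error forward — exactly the mechanism of delta-sigma modulation.

```python
import numpy as np

def error_diffusion_sample_1d(pdf_vals, n_samples):
    """Deterministically place n_samples on a 1-D grid so that their local
    density follows pdf_vals, by diffusing the quantization error forward."""
    w = pdf_vals / pdf_vals.sum() * n_samples  # desired sample mass per cell
    samples, err = [], 0.0
    for i, wi in enumerate(w):
        err += wi
        while err >= 1.0:        # emit a sample whenever a unit of mass accumulates
            samples.append(i)
            err -= 1.0
    return np.array(samples)

x = np.linspace(0, 1, 1000)
pdf = np.exp(-((x - 0.3) / 0.1) ** 2)   # unnormalized Gaussian bump at x = 0.3
idx = error_diffusion_sample_1d(pdf, 64)
print(len(idx))  # about 64 samples, clustered near x = 0.3, with no randomness
```

Because no random numbers are used, the output is perfectly stratified: sample spacing is inversely proportional to the local density.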


Estimating the number of ozone peaks in Mexico City using a non-homogeneous Poisson model

ENVIRONMETRICS, Issue 5 2008
Jorge A. Achcar
Abstract In this paper, we consider the problem of estimating the number of times an air quality standard is exceeded in a given period of time. A non-homogeneous Poisson model is proposed to analyse this issue. The rate at which the Poisson events occur is given by a rate function λ(t), t ≥ 0. This rate function also depends on some parameters that need to be estimated. Two forms of λ(t), t ≥ 0 are considered: one of the Weibull form and the other of the exponentiated-Weibull form. Parameter estimation is carried out using a Bayesian formulation based on the Gibbs sampling algorithm. The prior distributions for the parameters are assigned in two stages. In the first stage, non-informative prior distributions are considered; using the information provided by the first stage, more informative prior distributions are used in the second. The theoretical development is applied to data provided by the monitoring network of Mexico City. The rate function that best fits the data varies according to the region of the city and/or the threshold considered. In some cases the best fit is the Weibull form and in others the exponentiated-Weibull form. Copyright © 2007 John Wiley & Sons, Ltd. [source]
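A non-homogeneous Poisson process with a Weibull rate function can be simulated by thinning a homogeneous process; the sketch below uses invented parameter values, not the paper's fitted ones, and exploits the fact that the Weibull rate is increasing for shape β > 1, so its value at the end of the window bounds it.

```python
import numpy as np

rng = np.random.default_rng(0)

def weibull_rate(t, beta, eta):
    # lambda(t) = (beta/eta) * (t/eta)**(beta - 1), increasing for beta > 1
    return (beta / eta) * (t / eta) ** (beta - 1)

def nhpp_thinning(T, beta, eta):
    """Sample event times of a non-homogeneous Poisson process on [0, T]
    with Weibull rate, by thinning a homogeneous process of rate lambda(T)."""
    lam_max = weibull_rate(T, beta, eta)
    n = rng.poisson(lam_max * T)                 # candidate event count
    cand = rng.uniform(0, T, n)                  # candidate event times
    keep = rng.uniform(0, lam_max, n) < weibull_rate(cand, beta, eta)
    return np.sort(cand[keep])

times = nhpp_thinning(T=100.0, beta=2.0, eta=20.0)
# Expected number of events is the integrated rate (T/eta)**beta = 25
print(len(times))
```

In the exceedance-counting setting of the abstract, each simulated event time would correspond to one violation of the air quality standard.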


Online end-to-end quality of service monitoring for service level agreement management

INTERNATIONAL JOURNAL OF COMMUNICATION SYSTEMS, Issue 4 2008
Xiaoyuan Ta
Abstract A major challenge in network and service level agreement (SLA) management is to provide the Quality of Service (QoS) demanded by heterogeneous network applications. Online QoS monitoring plays an important role in this process by providing objective measurements that can be used for improving network design, troubleshooting and management. Online QoS monitoring becomes increasingly difficult and complex due to the rapid expansion of the Internet and the dramatic increase in network speeds. Sampling techniques have been explored as a means to reduce the difficulty and complexity of measurement. In this paper, we investigate several major sampling techniques, i.e. systematic sampling, simple random sampling and stratified sampling, and conduct a performance analysis of each. It is shown that stratified sampling with optimum allocation has the best performance; however, it requires additional statistics that are usually not available for real-time applications. An adaptive stratified sampling algorithm is proposed to solve this problem. Both theoretical analysis and simulation show that the proposed adaptive stratified sampling algorithm outperforms the other sampling techniques and achieves performance comparable to stratified sampling with optimum allocation. QoS monitoring software using the aforementioned sampling techniques is designed and tested in various real networks. Copyright © 2007 John Wiley & Sons, Ltd. [source]
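The advantage of stratified sampling with optimum (Neyman) allocation over simple random sampling can be illustrated on synthetic data; the three "delay" strata below are invented for the demonstration and stand in for traffic classes with very different variability.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic packet-delay population: three strata of very different variance
strata = [rng.normal(10, 1, 50_000),
          rng.normal(50, 5, 30_000),
          rng.normal(200, 40, 20_000)]
pop = np.concatenate(strata)
n = 300  # total measurement budget

def srs_mean():
    return rng.choice(pop, n, replace=False).mean()

def neyman_mean():
    # Optimum (Neyman) allocation: n_h proportional to N_h * sigma_h
    Ns = np.array([len(s) for s in strata])
    sds = np.array([s.std() for s in strata])
    alloc = np.maximum(1, np.round(n * Ns * sds / (Ns * sds).sum()).astype(int))
    means = [rng.choice(s, nh, replace=False).mean() for s, nh in zip(strata, alloc)]
    return np.average(means, weights=Ns)   # population-weighted estimate

srs = np.array([srs_mean() for _ in range(400)])
strat = np.array([neyman_mean() for _ in range(400)])
print(srs.std(), strat.std())  # the stratified estimator has far lower variance
```

The catch, as the abstract notes, is that Neyman allocation needs the per-stratum standard deviations, which an adaptive scheme must estimate on the fly.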


Adaptive sampling applied to multivariate, multiple output rational interpolation models with application to microwave circuits

INTERNATIONAL JOURNAL OF RF AND MICROWAVE COMPUTER-AIDED ENGINEERING, Issue 4 2002
Robert Lehmensiek
Abstract A fast and efficient adaptive sampling algorithm for multivariate, multiple-output rational interpolation models is presented, based on convergents of Thiele-type branched continued fractions. The multiple-output interpolation model consists of a set of rational interpolants, each of which models one of the output parameters. A single global error function is defined that incorporates all the output parameters, and it is used to select the same set of support points for all the interpolants. The technique is evaluated on several passive microwave structures and compared to previously published results. © 2002 Wiley Periodicals, Inc. Int J RF and Microwave CAE 12: 332–340, 2002. Published online in Wiley InterScience (www.interscience.wiley.com). DOI 10.1002/mmce10032 [source]
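The greedy refinement loop at the heart of adaptive sampling can be sketched with an ordinary polynomial interpolant standing in for the paper's Thiele-type rational model (a simplification, and a one-dimensional, single-output one at that): at each step, add the support point where the current interpolant disagrees most with the full model.

```python
import numpy as np

def adaptive_sample(f, a, b, n_points, n_dense=400):
    """Greedy adaptive sampling: repeatedly add the support point where the
    current interpolant's error against the full model f peaks."""
    xs = [a, (a + b) / 2, b]                  # initial support points
    dense = np.linspace(a, b, n_dense)        # candidate locations
    while len(xs) < n_points:
        coef = np.polyfit(xs, [f(x) for x in xs], len(xs) - 1)
        err = np.abs(np.polyval(coef, dense) - f(dense))
        xs.append(dense[np.argmax(err)])      # refine where the error peaks
        xs.sort()
    return np.array(xs)

f = lambda x: 1.0 / (1.0 + 25 * x**2)         # smooth but sharply peaked response
pts = adaptive_sample(f, -1.0, 1.0, 12)
print(np.round(pts, 3))  # support points concentrate where the model is hardest to fit
```

In the microwave-CAE setting, `f` would be an expensive full-wave simulation, so placing each new support point where it buys the most accuracy is the whole point of the scheme.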


Darwin, Galton and the Statistical Enlightenment

JOURNAL OF THE ROYAL STATISTICAL SOCIETY: SERIES A (STATISTICS IN SOCIETY), Issue 3 2010
Stephen M. Stigler
Summary. On September 10th, 1885, Francis Galton ushered in a new era of Statistical Enlightenment with an address to the British Association for the Advancement of Science in Aberdeen. In the process of solving a puzzle that had lain dormant in Darwin's Origin of Species, Galton introduced multivariate analysis and paved the way towards modern Bayesian statistics. The background to this work is recounted, including the recognition of a failed attempt by Galton in 1877 as providing the first use of a rejection sampling algorithm for the simulation of a posterior distribution, and the first appearance of a proper Bayesian analysis for the normal distribution. [source]
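Rejection sampling of a posterior distribution, the idea the paper traces back to Galton's 1877 mechanical device, can be sketched in a few lines: propose from the prior and accept each draw with probability proportional to its likelihood. The conjugate normal example below is purely illustrative, not Galton's setting.

```python
import numpy as np

rng = np.random.default_rng(2)

# Prior: theta ~ N(0, 1).  One observation y = 1.5 with y | theta ~ N(theta, 1).
y = 1.5

def rejection_sample_posterior(n):
    """Draw from the posterior by proposing from the prior and accepting
    with probability equal to the (bounded-by-one) likelihood."""
    out = []
    while len(out) < n:
        theta = rng.normal(0.0, 1.0, 1000)        # proposals from the prior
        like = np.exp(-0.5 * (y - theta) ** 2)    # likelihood, maximum value 1
        out.extend(theta[rng.uniform(0, 1, 1000) < like])
    return np.array(out[:n])

draws = rejection_sample_posterior(20_000)
# Conjugacy gives the exact answer: posterior is N(0.75, 1/2), sd ~0.707
print(draws.mean(), draws.std())
```

The accepted draws match the analytic posterior, which is what makes the 1877 construction recognizable, in hindsight, as posterior simulation.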


Advanced Statistical Analysis as a Novel Tool to Pneumatic Conveying Monitoring and Control Strategy Development

PARTICLE & PARTICLE SYSTEMS CHARACTERIZATION, Issue 3-4 2006
Andrzej Romanowski
Abstract The behaviour of powder flow in pneumatic conveying has been investigated for many years, yet it remains challenging both practically and theoretically, especially where monitoring and control are concerned. A better understanding of gas-solids flow structures can benefit the design and operation of pneumatic transport installations. This paper covers a novel approach to providing a quantitative description, in terms of parameter values useful for monitoring and control of this process, using Electrical Capacitance Tomography (ECT). The use of Bayesian statistics for the analysis of ECT data allows direct estimation of control parameters. The paper shows how these characteristic parameters can be estimated without image reconstruction and post-processing, which has classically been required whenever tomography is applied. This is achieved using 'high-level' statistical Bayesian modelling combined with a Markov chain Monte Carlo (MCMC) sampling algorithm. The approach is applied to measurements of the phenomena occurring in the horizontal section of a pneumatic conveyor during slug formation. [source]


Application of a Bayesian Approach to the Tomographic Analysis of Hopper Flow

PARTICLE & PARTICLE SYSTEMS CHARACTERIZATION, Issue 4 2005
Krzysztof Grudzien
Abstract This paper presents a new approach to the analysis of data on powder flow from electrical capacitance tomography (ECT) using probability modelling and Bayesian statistics. The methodology is illustrated for powder flow in a hopper. The special feature of this approach is that 'high-level' statistical Bayesian modelling, combined with a Markov chain Monte Carlo (MCMC) sampling algorithm, allows direct estimation of control parameters of industrial processes, in contrast to the usually applied 'low-level', pixel-based methods of data analysis. This enables reliable recognition of key process features in a quantitative manner. The main difficulty when investigating hopper flow with ECT is the need to measure small differences in particle packing density. The MCMC protocol enables more robust identification of the responses of such complex systems. This paper demonstrates the feasibility of the approach for a simple case of particulate material flow during the discharging of a hopper. It is concluded that these approaches can offer significant advantages for the analysis and control of some industrial powder and other multi-phase flow processes. [source]
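The two ECT abstracts above pair a Bayesian model with MCMC so that process parameters are estimated directly from the raw measurements, with no image-reconstruction step in between. A minimal random-walk Metropolis sketch of that idea follows, using an invented linear forward model in place of a real ECT sensitivity map; everything here (the packing fraction phi, the factor 2.0, the noise level) is illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy forward model standing in for ECT: measured capacitance rises linearly
# with the solids packing fraction phi (true value 0.3), plus sensor noise.
true_phi, sigma = 0.3, 0.05
data = 2.0 * true_phi + rng.normal(0, sigma, 50)   # 50 sensor readings

def log_post(phi):
    if not 0.0 < phi < 1.0:                        # uniform prior on (0, 1)
        return -np.inf
    return -0.5 * np.sum((data - 2.0 * phi) ** 2) / sigma**2

def metropolis(n_steps, step=0.02):
    """Random-walk Metropolis: sample the posterior of phi directly,
    bypassing reconstruction and image post-processing."""
    phi, lp = 0.5, log_post(0.5)
    chain = []
    for _ in range(n_steps):
        prop = phi + rng.normal(0, step)
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:   # accept/reject step
            phi, lp = prop, lp_prop
        chain.append(phi)
    return np.array(chain)

chain = metropolis(20_000)[5_000:]                 # drop burn-in
print(chain.mean())                                # posterior mean, near 0.3
```

The posterior mean and spread of `phi` are exactly the kind of quantitative, control-ready parameter estimates the abstracts contrast with pixel-based image analysis.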


Speeding up the FMMR perfect sampling algorithm: A case study revisited

RANDOM STRUCTURES AND ALGORITHMS, Issue 4 2003
Robert P. Dobrow
Abstract In a previous paper by the second author, two Markov chain Monte Carlo perfect sampling algorithms, one called coupling from the past (CFTP) and the other (FMMR) based on rejection sampling, are compared using as a case study the move-to-front (MTF) self-organizing list chain. Here we revisit that case study and, in particular, exploit the dependence of FMMR on the user-chosen initial state. We give a stochastic monotonicity result for the running time of FMMR applied to MTF and thus identify the initial state that gives the stochastically smallest running time; by contrast, the initial state used in the previous study gives the stochastically largest running time. By changing from the worst choice to the best choice of initial state we achieve a remarkable speedup of FMMR for MTF; for example, we reduce the running time (as measured in Markov chain steps) from exponential in the length n of the list nearly down to n when the items in the list are requested according to a geometric distribution. For this same example, the running time for CFTP grows exponentially in n. © 2003 Wiley Periodicals, Inc. Random Struct. Alg., 2003 [source]
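Coupling from the past, the CFTP algorithm the study compares against, can be sketched for a small monotone chain: run coupled top and bottom chains from ever-earlier start times with shared randomness, and when they have coalesced by time 0, the common state is an exact draw from the stationary law. The lazy reflected random walk below is a toy stand-in for the move-to-front chain; its stationary distribution is uniform on {0, ..., 10}.

```python
import numpy as np

rng = np.random.default_rng(4)
N = 10  # state space {0, ..., N}

def update(x, u):
    # Monotone update rule for a lazy random walk with reflecting boundaries
    if u < 0.4:
        return min(x + 1, N)
    elif u < 0.8:
        return max(x - 1, 0)
    return x

def cftp():
    """Coupling from the past: double the look-back horizon until the
    extreme chains, driven by the same randomness, coalesce at time 0."""
    T, us = 1, []
    while True:
        us = list(rng.uniform(size=T - len(us))) + us  # prepend fresh randomness
        lo, hi = 0, N                                  # bottom and top chains
        for u in us:                                   # run from time -T to 0
            lo, hi = update(lo, u), update(hi, u)
        if lo == hi:
            return lo                                  # exact stationary draw
        T *= 2

draws = np.array([cftp() for _ in range(1_000)])
print(draws.mean())  # stationary law is uniform on {0,...,10}, so mean near 5
```

Reusing (not regenerating) the randomness for the suffix of the run is what makes the output an exact, rather than approximate, stationary sample.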


An Importance Sampling Method to Evaluate Value-at-Risk for Assets with Jump Risk

ASIA-PACIFIC JOURNAL OF FINANCIAL STUDIES, Issue 5 2009
Ren-Her Wang
Abstract Risk management is an important issue when a catastrophic event affects asset prices in the market, such as a sub-prime or other financial crisis. By adding a jump term to the geometric Brownian motion, the jump diffusion model can be used to describe abnormal changes in asset prices when there is a serious event in the market. In this paper, we propose an importance sampling algorithm to compute the Value-at-Risk for linear and nonlinear assets under a multivariate jump diffusion model. More precisely, an efficient computational procedure is developed for estimating the portfolio loss probability for linear and nonlinear assets with jump risks, and under the assumption of independence the tilting measure can be separated into its diffusion and jump parts. The simulation results show that the efficiency of importance sampling improves over naive Monte Carlo simulation by factors ranging from 7 to 285 under various situations. We also show the robustness of the importance sampling algorithm by comparing it with the EVT-Copula method proposed by Oh and Moon (2006). [source]
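The variance-reduction mechanism behind importance sampling for rare losses can be illustrated with exponential tilting on a plain Gaussian, a far simpler setting than the paper's multivariate jump diffusion: sample from a shifted law that makes the tail event common, then reweight each draw by the likelihood ratio.

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(5)
a, n = 4.0, 100_000   # estimate the tail probability P(X > 4) for X ~ N(0, 1)

# Naive Monte Carlo: almost no samples land in the tail
naive = (rng.normal(size=n) > a).mean()

# Exponential tilting: sample from N(a, 1) so the tail event is typical,
# then reweight by the likelihood ratio phi(x) / phi(x - a) = exp(-a*x + a^2/2)
x = rng.normal(a, 1.0, n)
weights = np.exp(-a * x + a * a / 2)
is_est = (weights * (x > a)).mean()

true_p = 0.5 * (1 - erf(a / sqrt(2)))   # exact value, about 3.17e-5
print(naive, is_est, true_p)
```

The tilted estimator is within a percent or so of the true value, while the naive estimate is built from only a handful of tail hits; the paper's reported 7x-285x efficiency gains come from the same reweighting idea applied to the diffusion and jump parts separately.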


Bayesian Analysis for Generalized Linear Models with Nonignorably Missing Covariates

BIOMETRICS, Issue 3 2005
Lan Huang
Summary We propose Bayesian methods for estimating parameters in generalized linear models (GLMs) with nonignorably missing covariate data. We show that when improper uniform priors are used for the regression coefficients of the multinomial selection model for the missing data mechanism, the resulting joint posterior will always be improper if (i) all missing covariates are discrete and an intercept is included in the selection model for the missing data mechanism, or (ii) at least one of the covariates is continuous and unbounded. This impropriety results regardless of whether proper or improper priors are specified for the regression parameters of the GLM or for the parameters of the covariate distribution. To overcome this problem, we propose a novel class of proper priors for the regression coefficients in the selection model for the missing data mechanism. These priors are robust and computationally attractive in the sense that inferences about the selection-model coefficients are not sensitive to the choice of the hyperparameters of their prior, and they facilitate a Gibbs sampling scheme that leads to accelerated convergence. In addition, we extend the model assessment criterion of Chen, Dey, and Ibrahim (2004a, Biometrika 91, 45–63), called the weighted L measure, to GLMs and missing data problems, and extend the deviance information criterion (DIC) of Spiegelhalter et al. (2002, Journal of the Royal Statistical Society, Series B 64, 583–639) to assess whether the missing data mechanism is ignorable or nonignorable. A novel Markov chain Monte Carlo sampling algorithm is also developed for carrying out posterior computation. Several simulations investigate the performance of the proposed Bayesian criteria as well as sensitivity to the prior specification. Real datasets from a melanoma clinical trial and a liver cancer study further illustrate the proposed methods. [source]
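The Gibbs sampling scheme mentioned above alternates draws from full conditional distributions. A minimal sketch on a bivariate normal target, which is illustrative only and far simpler than the paper's GLM-with-missing-covariates posterior:

```python
import numpy as np

rng = np.random.default_rng(6)
rho = 0.8  # target: bivariate normal, zero means, unit variances, correlation rho

def gibbs(n_iter):
    """Gibbs sampling: alternately draw each coordinate from its full
    conditional, here x | y ~ N(rho*y, 1 - rho^2) and symmetrically."""
    x = y = 0.0
    out = np.empty((n_iter, 2))
    s = np.sqrt(1 - rho**2)          # conditional standard deviation
    for i in range(n_iter):
        x = rng.normal(rho * y, s)   # draw x | y
        y = rng.normal(rho * x, s)   # draw y | x
        out[i] = x, y
    return out

chain = gibbs(50_000)[1_000:]        # drop burn-in
print(np.corrcoef(chain.T)[0, 1])    # close to the target correlation 0.8
```

The stronger the dependence between the blocks, the slower such a sampler mixes, which is exactly why priors that "facilitate a Gibbs sampling scheme that leads to accelerated convergence" matter in the paper's setting.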