Monte Carlo Sampling


Terms modified by Monte Carlo Sampling

  • monte carlo sampling scheme

Selected Abstracts


    Model population analysis for variable selection

    JOURNAL OF CHEMOMETRICS, Issue 7-8 2010
    Hong-Dong Li
    Abstract To build a credible model for given chemical, biological, or clinical data, it may be helpful first to gain better insight into the data before modeling, and then to present statistically stable results derived from a large number of sub-models established on a single dataset with the aid of Monte Carlo Sampling (MCS). In the present work, a concept called model population analysis (MPA) is developed. Briefly, MPA can be considered a general framework for developing new methods by statistically analyzing interesting parameters (regression coefficients, prediction errors, etc.) of a number of sub-models. New methods are expected to be developed by making full use of these parameters in novel ways. In this work, the elements of MPA are first described. Then, its applications to variable selection and model assessment are emphasized. Copyright © 2010 John Wiley & Sons, Ltd. [source]
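
    As a rough illustration of the MPA idea, the sketch below draws many Monte Carlo subsets of a synthetic dataset, fits an ordinary least-squares sub-model on each, and inspects the resulting population of regression coefficients; the data, the 70% subset fraction, and the choice of plain least squares are illustrative assumptions, not the authors' exact procedure.

```python
# A minimal sketch of model population analysis: build many sub-models on
# Monte Carlo subsets of one dataset, then statistically analyse the
# population of regression coefficients for variable selection.
import numpy as np

rng = np.random.default_rng(0)

n_samples, n_vars = 100, 8
X = rng.normal(size=(n_samples, n_vars))
true_coef = np.array([2.0, -1.5, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0])
y = X @ true_coef + rng.normal(scale=0.5, size=n_samples)

n_sub_models = 1000               # number of Monte Carlo sub-models
sub_size = int(0.7 * n_samples)   # samples drawn (without replacement) each time

coefs = np.empty((n_sub_models, n_vars))
for i in range(n_sub_models):
    idx = rng.choice(n_samples, size=sub_size, replace=False)  # Monte Carlo Sampling
    beta, *_ = np.linalg.lstsq(X[idx], y[idx], rcond=None)     # one sub-model
    coefs[i] = beta

# Variables whose coefficient distribution is centred away from zero
# are candidates for selection.
mean, std = coefs.mean(axis=0), coefs.std(axis=0)
for j in range(n_vars):
    print(f"var {j}: coef = {mean[j]:+.3f} +/- {std[j]:.3f}")
```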


    An Adaptive Method for Indirect Illumination Using Light Vectors

    COMPUTER GRAPHICS FORUM, Issue 3 2001
    Xavier Serpaggi
    In computer graphics, several phenomena need to be taken into account in the pursuit of photo-realism. One of the most relevant is the notion of global, and more precisely indirect, illumination. In "classical" ray tracing, if you are not under the light, then you are in shadow. A great deal of work has proposed ray-tracing-based solutions that account for the fact that "there is a certain amount of light in shadows". All of these methods share the same weaknesses: high computation time and many parameters that must be tuned to get anything out of the method. This paper proposes a generic method for computing indirect illumination, based on Monte Carlo sampling and on sequential analysis theory, which is faster and more automatic than classical methods. [source]
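
    The sketch below combines the two ingredients named above, cosine-weighted Monte Carlo hemisphere sampling and a sequential-analysis-style stopping rule, to estimate indirect irradiance at a point; the incoming-radiance function and the tolerance are invented stand-ins, and a real renderer would replace that function with a recursive ray trace.

```python
# Estimate diffuse indirect irradiance E = integral of L_in * cos(theta) over
# the hemisphere, stopping adaptively once the running estimate is stable.
import math, random

def incoming_radiance(theta, phi):
    # Hypothetical smooth environment, used only for this demo.
    return 1.0 + 0.5 * math.cos(phi) * math.sin(theta)

def indirect_irradiance(tol=0.01, min_samples=32, max_samples=100_000):
    n, mean, m2 = 0, 0.0, 0.0
    while n < max_samples:
        # Cosine-weighted hemisphere sample: pdf = cos(theta)/pi, so the
        # single-sample estimator of the irradiance integral is pi * L_in.
        u1, u2 = random.random(), random.random()
        theta = math.asin(math.sqrt(u1))
        phi = 2.0 * math.pi * u2
        x = math.pi * incoming_radiance(theta, phi)

        n += 1                           # Welford running mean/variance
        delta = x - mean
        mean += delta / n
        m2 += delta * (x - mean)

        # Sequential stopping rule: halt once the standard error of the
        # running mean drops below the requested relative tolerance.
        if n >= min_samples:
            std_err = math.sqrt(m2 / (n * (n - 1)))
            if std_err < tol * abs(mean):
                break
    return mean, n

estimate, samples_used = indirect_irradiance()
print(f"irradiance ~ {estimate:.4f} after {samples_used} samples")
```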


    Inversion of terrestrial ecosystem model parameter values against eddy covariance measurements by Monte Carlo sampling

    GLOBAL CHANGE BIOLOGY, Issue 8 2005
    Wolfgang Knorr
    Abstract Effective measures to counter the rising levels of carbon dioxide in the Earth's atmosphere require that we better understand the functioning of the global carbon cycle. Uncertainties remain high, in particular regarding the terrestrial carbon cycle's response to climate change. We use a well-known stochastic inversion technique originally developed in nuclear physics, the Metropolis algorithm, to determine the full probability density functions (PDFs) of the parameters of a terrestrial ecosystem model. By assimilating half-hourly eddy covariance measurements of CO2 and water fluxes in this way, we can substantially reduce the uncertainty of approximately five model parameters, depending on prior uncertainties. Further analysis of the posterior PDF shows that almost all parameters are nearly Gaussian distributed, and reveals some distinct groups of parameters that are constrained together. We show that after assimilating only 7 days of measurements, uncertainties in net carbon uptake over 2 years for the forest site can be substantially reduced, with the median estimate in excellent agreement with measurements. [source]
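
    A minimal sketch of the Metropolis algorithm in this inversion role follows, assuming a toy one-parameter model and synthetic "observations" in place of the authors' ecosystem model and eddy covariance data; the output is a sample from the full posterior PDF, from which medians and uncertainty intervals follow directly.

```python
# Metropolis sampling of the posterior of a single model parameter
# given noisy observations and a flat prior.
import numpy as np

rng = np.random.default_rng(1)

def model(p, t):
    return p * np.exp(-t / 5.0)          # hypothetical flux model

t_obs = np.linspace(0.0, 10.0, 50)
y_obs = model(2.3, t_obs) + rng.normal(scale=0.1, size=t_obs.size)
sigma = 0.1                              # assumed observation error

def log_posterior(p):
    if not (0.0 < p < 10.0):             # flat prior on (0, 10)
        return -np.inf
    resid = y_obs - model(p, t_obs)
    return -0.5 * np.sum((resid / sigma) ** 2)

n_steps, step = 20_000, 0.05
chain = np.empty(n_steps)
p, lp = 1.0, log_posterior(1.0)
for i in range(n_steps):
    p_new = p + rng.normal(scale=step)        # symmetric random-walk proposal
    lp_new = log_posterior(p_new)
    if np.log(rng.random()) < lp_new - lp:    # Metropolis acceptance rule
        p, lp = p_new, lp_new
    chain[i] = p

burn = chain[n_steps // 2:]                   # discard burn-in
print(f"posterior median {np.median(burn):.3f}, "
      f"95% CI [{np.percentile(burn, 2.5):.3f}, {np.percentile(burn, 97.5):.3f}]")
```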


    Spatial variations in throughfall in a Moso bamboo forest: sampling design for the estimates of stand-scale throughfall

    HYDROLOGICAL PROCESSES, Issue 3 2010
    Yoshinori Shinohara
    Abstract We investigated the spatial and seasonal variations in throughfall (Tf) in relation to spatial and seasonal variations in canopy structure and gross rainfall (Rf) and assessed the impacts of these variations on stand-scale Tf estimates. We observed the canopy structure, expressed as the leaf area index (LAI), once a month and Tf once a week in 25 grids placed in a Moso bamboo (Phyllostachys pubescens) forest for 1 year. The mean LAI and the spatial variation in LAI showed some seasonal variation. The spatial variation in Tf decreased with increasing Rf, and the relationship between the spatial variation and Rf held throughout the year. These results indicate that the seasonal change in LAI had little impact on spatial variations in Tf, and that Rf is a critical factor determining the spatial variations in Tf at the study site. We evaluated potential errors in stand-scale Tf estimates on the basis of measured Tf data using Monte Carlo sampling. The results showed that the error decreased greatly with increasing sample size when the sample size was less than about eight, whereas it was nearly stable when the sample size was eight or more, regardless of Rf. A sample size of eight results in less than 10% error for Tf estimates based on Student's t-value analysis and would be satisfactory for interception loss estimates when considering the errors included in Rf data. Copyright © 2009 John Wiley & Sons, Ltd. [source]
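
    The sample-size experiment can be sketched as follows, assuming a synthetic stand-in for the 25 measured Tf values: repeatedly draw n collectors at random and record how far the subsample mean strays from the all-grid mean.

```python
# Monte Carlo resampling of a throughfall grid to relate sample size
# to the error of the stand-scale mean.
import numpy as np

rng = np.random.default_rng(2)
tf_grid = rng.gamma(shape=8.0, scale=2.0, size=25)  # stand-in for 25 measured Tf values (mm)
true_mean = tf_grid.mean()

n_draws = 10_000
for n in range(2, 16):
    # Monte Carlo sampling: n collectors drawn without replacement each time.
    errs = np.empty(n_draws)
    for k in range(n_draws):
        sub = rng.choice(tf_grid, size=n, replace=False)
        errs[k] = abs(sub.mean() - true_mean) / true_mean
    print(f"n = {n:2d}: 95th-percentile relative error = "
          f"{100 * np.percentile(errs, 95):.1f}%")
```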


    Role of the one-body Jastrow factor in the transcorrelated self-consistent field equation

    INTERNATIONAL JOURNAL OF QUANTUM CHEMISTRY, Issue 7 2006
    Naoto Umezawa
    Abstract The one-body Jastrow factor has been introduced into the transcorrelated variational Monte Carlo (TC-VMC) method. The principal role of the one-body Jastrow factor in a Jastrow–Slater-type wave function is to counteract an unfavorable effect of the two-body Jastrow factor, which alters the charge density. In the TC-VMC method, since the one-body orbitals are optimized by the transcorrelated self-consistent field (TC-SCF) equations, which take into account the electron–electron correlation interactions originating from the two-body Jastrow factor, this unfavorable alteration of the charge density can be avoided without introducing the one-body Jastrow factor. However, it is found that it is still better to incorporate a one-body Jastrow factor into the TC-VMC method, for the practical reason that it reduces the numerical errors caused by Monte Carlo sampling and by the re-weighting calculations in solving the TC-SCF equations. Moreover, since the one-body Jastrow function adopted in the present work is constructed from the two-body Jastrow factor without adding any variational parameters, the computational cost is not significantly increased. The beneficial effect of the one-body Jastrow factor in TC-VMC calculations is demonstrated for atoms. © 2006 Wiley Periodicals, Inc. Int J Quantum Chem, 2006 [source]
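
    TC-VMC itself is elaborate, but the underlying machinery, Metropolis sampling of |psi|^2 and averaging of a local energy, can be sketched for the hydrogen atom with the trial wave function psi = exp(-alpha * r); this simplified example is a stand-in under stated assumptions, not the authors' method. For alpha = 1 the exact ground-state energy of -0.5 hartree is recovered.

```python
# Plain variational Monte Carlo for the hydrogen atom:
# sample |psi|^2 by Metropolis moves, average the local energy.
import numpy as np

rng = np.random.default_rng(3)
alpha = 0.9                                  # variational parameter

def local_energy(r):
    # E_L = -alpha^2/2 + (alpha - 1)/r for psi = exp(-alpha r).
    return -0.5 * alpha**2 + (alpha - 1.0) / r

pos = np.array([0.0, 0.0, 1.0])
r = np.linalg.norm(pos)
energies = []
for step in range(100_000):
    trial = pos + rng.normal(scale=0.5, size=3)      # Metropolis move
    r_trial = np.linalg.norm(trial)
    # Acceptance ratio |psi(trial)/psi(pos)|^2 = exp(-2 alpha (r' - r)).
    if rng.random() < np.exp(-2.0 * alpha * (r_trial - r)):
        pos, r = trial, r_trial
    if step > 10_000:                                # after equilibration
        energies.append(local_energy(r))

print(f"E(alpha={alpha}) ~ {np.mean(energies):.4f} hartree "
      f"(exact minimum -0.5 at alpha = 1)")
```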


    Approximate Bayesian inference for latent Gaussian models by using integrated nested Laplace approximations

    JOURNAL OF THE ROYAL STATISTICAL SOCIETY: SERIES B (STATISTICAL METHODOLOGY), Issue 2 2009
    Håvard Rue
    Summary. Structured additive regression models are perhaps the most commonly used class of models in statistical applications. This class includes, among others, (generalized) linear models, (generalized) additive models, smoothing spline models, state space models, semiparametric regression, spatial and spatiotemporal models, log-Gaussian Cox processes and geostatistical and geoadditive models. We consider approximate Bayesian inference in a popular subset of structured additive regression models, latent Gaussian models, where the latent field is Gaussian, controlled by a few hyperparameters and with non-Gaussian response variables. The posterior marginals are not available in closed form owing to the non-Gaussian response variables. For such models, Markov chain Monte Carlo methods can be implemented, but they are not without problems, in terms of both convergence and computational time. In some practical applications, the extent of these problems is such that Markov chain Monte Carlo sampling is simply not an appropriate tool for routine analysis. We show that, by using an integrated nested Laplace approximation and its simplified version, we can directly compute very accurate approximations to the posterior marginals. The main benefit of these approximations is computational: where Markov chain Monte Carlo algorithms need hours or days to run, our approximations provide more precise estimates in seconds or minutes. Another advantage of our approach is its generality, which makes it possible to perform Bayesian analysis in an automatic, streamlined way, and to compute model comparison criteria and various predictive measures so that models can be compared and the model under study can be challenged. [source]
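
    The core trick, replacing sampling with a Gaussian fitted at the posterior mode, can be sketched for a single latent variable; the Poisson counts and prior below are invented, and the full integrated nested scheme of the paper does considerably more.

```python
# Laplace approximation to a posterior: a Gaussian centred at the mode,
# with variance from the inverse curvature of the negative log-posterior.
import numpy as np
from scipy.optimize import minimize_scalar

y = np.array([3, 5, 4, 6, 2])                 # hypothetical Poisson counts
prior_mean, prior_sd = 1.0, 1.0               # Gaussian prior on x = log(rate)

def neg_log_post(x):
    rate = np.exp(x)
    log_lik = np.sum(y * x - rate)            # Poisson log-likelihood (up to const)
    log_prior = -0.5 * ((x - prior_mean) / prior_sd) ** 2
    return -(log_lik + log_prior)

mode = minimize_scalar(neg_log_post).x        # posterior mode
h = 1e-4                                      # finite-difference curvature
curv = (neg_log_post(mode + h) - 2 * neg_log_post(mode)
        + neg_log_post(mode - h)) / h**2
sd = 1.0 / np.sqrt(curv)
print(f"posterior of log-rate ~ N({mode:.3f}, {sd:.3f}^2)  -- no MCMC required")
```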


    Segmenting bacterial and viral DNA sequence alignments with a trans-dimensional phylogenetic factorial hidden Markov model

    JOURNAL OF THE ROYAL STATISTICAL SOCIETY: SERIES C (APPLIED STATISTICS), Issue 3 2009
    Wolfgang P. Lehrach
    Summary. The traditional approach to phylogenetic inference assumes that a single phylogenetic tree can represent the relationships and divergence between the taxa. However, taxa sequences exhibit varying levels of conservation, e.g. because of regulatory elements and active binding sites. Also, certain bacteria and viruses undergo interspecific recombination, where different strains exchange or transfer DNA subsequences, leading to a tree topology change. We propose a phylogenetic factorial hidden Markov model to detect recombination and rate variation simultaneously. This is applied to two DNA sequence alignments: one bacterial (Neisseria) and another of type 1 human immunodeficiency virus. Inference is carried out in the Bayesian framework, using reversible jump Markov chain Monte Carlo sampling. [source]
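
    The paper's factorial HMM with reversible-jump inference does not reduce to a few lines, but the segmentation mechanic can be sketched with a plain two-state HMM run over synthetic per-column scores, using the forward-backward algorithm to get the posterior probability that each alignment column is recombinant; all scores and parameters below are invented for the demo.

```python
# Two-state HMM segmentation of an alignment via forward-backward.
import numpy as np

rng = np.random.default_rng(4)
# Per-column log-likelihood ratios (e.g. tree A vs tree B); synthetic here,
# with columns 40-60 favouring the alternative topology.
scores = rng.normal(0.0, 1.0, 100)
scores[40:60] += 2.5

# State 0 = "clonal", state 1 = "recombinant"; Gaussian emissions.
means, sds = np.array([0.0, 2.5]), np.array([1.0, 1.0])
trans = np.array([[0.98, 0.02],
                  [0.02, 0.98]])
init = np.array([0.5, 0.5])

def emis(t):
    # Gaussian emission densities (constant factor cancels on normalisation).
    return np.exp(-0.5 * ((scores[t] - means) / sds) ** 2) / sds

T = len(scores)
fwd = np.zeros((T, 2)); bwd = np.ones((T, 2))
fwd[0] = init * emis(0); fwd[0] /= fwd[0].sum()
for t in range(1, T):                       # forward pass (normalised)
    fwd[t] = emis(t) * (fwd[t - 1] @ trans)
    fwd[t] /= fwd[t].sum()
for t in range(T - 2, -1, -1):              # backward pass (normalised)
    bwd[t] = trans @ (emis(t + 1) * bwd[t + 1])
    bwd[t] /= bwd[t].sum()

posterior = fwd * bwd
posterior /= posterior.sum(axis=1, keepdims=True)
recomb = np.where(posterior[:, 1] > 0.5)[0]
print("columns flagged as recombinant:", recomb.min(), "-", recomb.max())
```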


    Origin of Bends in Unperturbed Vinyl Polymers: An Illustration with Polystyrene

    MACROMOLECULAR THEORY AND SIMULATIONS, Issue 7 2007
    Yergou B. Tatek
    Abstract Previous experimental work has shown that dendronized vinyl polymers exhibit bends when adsorbed onto a surface. Two different mechanisms are believed to be responsible for the formation of these bends: temperature-dependent random fluctuations of torsional bond states on the one hand, and intramolecular interactions due to randomness in the stereochemical sequence of side chains on the other. The amplitude and scope of these mechanisms were investigated by studying the conformational space of PS chains via RIS-based Monte Carlo sampling. It was found that bend formation is driven by tacticity at low temperature, whereas it is thermally driven at high temperature. The existence of a transition temperature between these two bend-formation modes was demonstrated. It was also shown that, for atactic chains, the maximum of bend formation occurs at Pm ≈ 0.7. [source]
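
    A loose sketch of RIS-style Monte Carlo sampling follows, assuming independent bonds with hypothetical state energies and a Bernoulli meso/racemo sequence with probability Pm; the real RIS model couples neighbouring bonds through statistical-weight matrices, so this is an illustration only.

```python
# Sample a vinyl-chain conformation: each backbone bond takes one of the
# rotational isomeric states (trans / gauche+ / gauche-) with Boltzmann
# weights, and the stereochemical sequence is drawn with meso probability Pm.
import numpy as np

rng = np.random.default_rng(5)
kB = 0.0019872                   # Boltzmann constant, kcal/(mol K)

def sample_chain(n_bonds, T, Pm):
    E = np.array([0.0, 0.5, 0.5])            # hypothetical state energies (kcal/mol)
    w = np.exp(-E / (kB * T))                 # Boltzmann weights
    p = w / w.sum()
    torsions = rng.choice(3, size=n_bonds, p=p)    # Monte Carlo sampling of states
    tacticity = rng.random(n_bonds // 2) < Pm      # meso/racemo dyad sequence
    return torsions, tacticity

torsions, tacticity = sample_chain(n_bonds=200, T=300.0, Pm=0.7)
print("trans fraction:", np.mean(torsions == 0).round(3),
      "meso fraction:", np.mean(tacticity).round(3))
```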


    Modelling detrital cooling-age populations: insights from two Himalayan catchments

    BASIN RESEARCH, Issue 3 2003
    I. D. Brewer
    The distribution of detrital mineral cooling ages in river sediment provides a proxy record for the erosional history of mountain ranges. We have developed a numerical model that predicts detrital mineral age distributions for individual catchments in which particle paths move vertically toward the surface. Despite a restrictive set of assumptions, the model permits theoretical exploration of the effects of thermal structure, erosion rate, and topography on cooling ages. Hypsometry of the source-area catchment is shown to exert a fundamental control on the frequency distribution of bedrock and detrital ages. We illustrate this approach by generating synthetic 40Ar/39Ar muscovite age distributions for two catchments with contrasting erosion rates in central Nepal and then by comparing actual measured cooling-age distributions with the synthetic ones. Monte Carlo sampling is used to assess the mismatch between observed and synthetic age distributions and to explore the dependence of that mismatch on the complexity of the synthetic age signal and on the number of grains analysed. Observed detrital cooling ages are well matched by predicted ages for a more slowly eroding Himalayan catchment. A poorer match for a rapidly eroding catchment may result from some combination of large analytical uncertainties in the detrital ages and inhomogeneous erosion rates within the basin. Such mismatches emphasize the need for more accurate thermal and kinematic models and for sampling strategies that are adapted to catchment-specific geologic and geomorphic conditions. [source]
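
    The grain-number experiment can be sketched by drawing n grain ages at random from a synthetic age distribution and measuring the mismatch with a Kolmogorov-Smirnov statistic; the two-component age mixture below is a stand-in for a model-predicted distribution.

```python
# Monte Carlo assessment of how the mismatch between a detrital age sample
# and the full distribution shrinks as more grains are dated.
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(6)
# Synthetic catchment: a young, rapidly eroding component mixed with an
# older one (ages in Ma).
synthetic_ages = np.concatenate([rng.normal(4.0, 1.0, 8000),
                                 rng.normal(15.0, 3.0, 2000)])

n_trials = 500
for n_grains in (20, 40, 80, 160):
    d = np.empty(n_trials)
    for k in range(n_trials):
        sample = rng.choice(synthetic_ages, size=n_grains, replace=False)
        d[k] = ks_2samp(sample, synthetic_ages).statistic
    print(f"{n_grains:3d} grains: mean K-S mismatch = {d.mean():.3f}")
```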