Proposal Distributions


Selected Abstracts


Gamma-SLAM: Visual SLAM in unstructured environments using variance grid maps

JOURNAL OF FIELD ROBOTICS (FORMERLY JOURNAL OF ROBOTIC SYSTEMS), Issue 1 2009
Tim K. Marks
This paper describes an online stereo visual simultaneous localization and mapping (SLAM) algorithm developed for the Learning Applied to Ground Robotics (LAGR) program. The Gamma-SLAM algorithm uses a Rao-Blackwellized particle filter to obtain a joint posterior over poses and maps: the pose distribution is estimated using a particle filter, and each particle has its own map that is obtained through exact filtering conditioned on the particle's pose. Visual odometry is used to provide good proposal distributions for the particle filter, and maps are represented using a Cartesian grid. Unlike previous grid-based SLAM algorithms, however, the Gamma-SLAM map maintains a posterior distribution over the elevation variance in each cell. This variance grid map can capture rocks, vegetation, and other objects that are typically found in unstructured environments but are not well modeled by traditional occupancy or elevation grid maps. The algorithm runs in real time on conventional processors and has been evaluated for both qualitative and quantitative accuracy in three outdoor environments over trajectories totaling 1,600 m in length. © 2008 Wiley Periodicals, Inc.
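The core idea of a variance grid map can be illustrated with a standard conjugate Gamma update over each cell's elevation precision. This is a generic sketch under simplifying assumptions (a running-mean estimate of cell elevation, a Gamma(alpha, beta) prior on precision); the paper's exact model may be parameterized differently.

```python
import numpy as np

class VarianceGridCell:
    """Maintains a Gamma posterior over the precision (inverse variance)
    of elevation measurements falling in one map cell.

    This is a generic conjugate-update sketch, not the paper's exact
    formulation; the prior hyperparameters here are illustrative.
    """

    def __init__(self, alpha0=1.0, beta0=1.0):
        self.alpha = alpha0   # shape of the Gamma prior on precision
        self.beta = beta0     # rate of the Gamma prior on precision
        self.n = 0            # number of elevation samples seen
        self.mean = 0.0       # running mean elevation (Welford update)

    def update(self, z):
        """Fold one elevation measurement z into the posterior."""
        self.n += 1
        delta = z - self.mean
        self.mean += delta / self.n
        # Conjugate Gamma update for Gaussian data with unknown precision:
        # alpha grows by 1/2 per sample, beta by half the squared deviation.
        self.alpha += 0.5
        self.beta += 0.5 * delta * (z - self.mean)

    def expected_variance(self):
        """Posterior mean of the variance (valid for alpha > 1)."""
        return self.beta / (self.alpha - 1.0)

# Flat ground vs. a rocky cell: the variance estimate separates them
# even when the mean elevations are similar.
rng = np.random.default_rng(0)
flat, rocky = VarianceGridCell(), VarianceGridCell()
for z in rng.normal(0.0, 0.02, 200):   # flat terrain, ~2 cm noise
    flat.update(z)
for z in rng.normal(0.0, 0.30, 200):   # vegetation/rock, ~30 cm spread
    rocky.update(z)
```

A cell over vegetation accumulates a much larger expected variance than a flat-ground cell, which is the signal the abstract says occupancy and elevation grids miss.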


Particle Markov chain Monte Carlo methods

JOURNAL OF THE ROYAL STATISTICAL SOCIETY: SERIES B (STATISTICAL METHODOLOGY), Issue 3 2010
Christophe Andrieu
Summary. Markov chain Monte Carlo and sequential Monte Carlo methods have emerged as the two main tools to sample from high dimensional probability distributions. Although asymptotic convergence of Markov chain Monte Carlo algorithms is ensured under weak assumptions, the performance of these algorithms is unreliable when the proposal distributions that are used to explore the space are poorly chosen and/or if highly correlated variables are updated independently. We show here how it is possible to build efficient high dimensional proposal distributions by using sequential Monte Carlo methods. This allows us not only to improve over standard Markov chain Monte Carlo schemes but also to make Bayesian inference feasible for a large class of statistical models where this was not previously so. We demonstrate these algorithms on a non-linear state space model and a Lévy-driven stochastic volatility model.
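One member of this family, particle marginal Metropolis-Hastings, can be sketched on a toy linear-Gaussian state-space model: a bootstrap particle filter supplies an unbiased likelihood estimate, which is plugged into an ordinary Metropolis-Hastings acceptance ratio. All model constants, particle counts, and step sizes below are illustrative, and the flat prior and random-walk proposal are simplifying assumptions, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy model: x_t = phi * x_{t-1} + N(0, Q), y_t = x_t + N(0, R).
PHI_TRUE, Q, R, T = 0.8, 0.5, 1.0, 100
x, ys = 0.0, []
for _ in range(T):
    x = PHI_TRUE * x + rng.normal(0, np.sqrt(Q))
    ys.append(x + rng.normal(0, np.sqrt(R)))

def smc_log_likelihood(phi, ys, n_particles=200):
    """Bootstrap particle filter estimate of log p(y_{1:T} | phi)."""
    particles = np.zeros(n_particles)
    loglik = 0.0
    for y in ys:
        # Propose from the transition; weight by the observation density.
        particles = phi * particles + rng.normal(0, np.sqrt(Q), n_particles)
        logw = -0.5 * (y - particles) ** 2 / R - 0.5 * np.log(2 * np.pi * R)
        m = logw.max()
        w = np.exp(logw - m)
        loglik += m + np.log(w.mean())
        # Multinomial resampling with the normalized weights.
        particles = particles[rng.choice(n_particles, n_particles,
                                         p=w / w.sum())]
    return loglik

# Particle marginal Metropolis-Hastings over phi: the noisy SMC
# likelihood estimate replaces the intractable exact likelihood.
phi, ll, samples = 0.0, smc_log_likelihood(0.0, ys), []
for _ in range(300):
    phi_prop = phi + rng.normal(0, 0.1)          # random-walk proposal
    ll_prop = smc_log_likelihood(phi_prop, ys)
    if np.log(rng.uniform()) < ll_prop - ll:     # flat prior assumed
        phi, ll = phi_prop, ll_prop
    samples.append(phi)

posterior_mean = float(np.mean(samples[100:]))   # discard burn-in
```

The key property (established in the paper) is that using an unbiased likelihood estimate inside the acceptance ratio leaves the exact posterior invariant, so the chain targets the same distribution as if the likelihood were computed exactly.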


Monte Carlo Inference for State-Space Models of Wild Animal Populations

BIOMETRICS, Issue 2 2009
Ken B. Newman
Summary. We compare two Monte Carlo (MC) procedures, sequential importance sampling (SIS) and Markov chain Monte Carlo (MCMC), for making Bayesian inferences about the unknown states and parameters of state-space models for animal populations. The procedures were applied to both simulated and real pup count data for the British grey seal metapopulation, as well as to simulated data for a Chinook salmon population. The MCMC implementation was based on tailor-made proposal distributions combined with analytical integration of some of the states and parameters. SIS was implemented in a more generic fashion. For the same computing time, MCMC tended to yield posterior distributions with less MC variation across different runs of the algorithm than the SIS implementation, with the exception, in the seal model, of some states and one parameter that mixed quite slowly. The efficiency of the SIS sampler was greatly increased by analytically integrating out unknown parameters in the observation model. We consider that a careful implementation of MCMC for cases where data are informative relative to the priors sets the gold standard, but that SIS samplers are a viable alternative that can be programmed more quickly. Our SIS implementation is particularly competitive in situations where the data are relatively uninformative; in other cases, SIS may require substantially more computer power than an efficient implementation of MCMC to achieve the same level of MC error.
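The importance of the proposal distribution in SIS can be shown on a toy linear-Gaussian state-space model, comparing the generic "prior" proposal (sample from the transition, weight by the likelihood) with the locally optimal proposal p(x_t | x_{t-1}, y_t), which happens to be available in closed form here. This is a generic illustration of the trade-off discussed above, not the authors' tailored implementation; the model and all constants are assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

# Random-walk state with precise observations:
# x_t = x_{t-1} + N(0, Q), y_t = x_t + N(0, R), with R << Q.
Q, R, T, N = 1.0, 0.1, 50, 500
x, ys = 0.0, []
for _ in range(T):
    x += rng.normal(0, np.sqrt(Q))
    ys.append(x + rng.normal(0, np.sqrt(R)))

def ess(logw):
    """Effective sample size of a set of log importance weights."""
    w = np.exp(logw - logw.max())
    w /= w.sum()
    return 1.0 / np.sum(w ** 2)

def sis_step(particles, y, proposal):
    if proposal == "prior":
        # Generic choice: propose from the transition, weight by likelihood.
        new = particles + rng.normal(0, np.sqrt(Q), N)
        logw = -0.5 * (y - new) ** 2 / R
    else:
        # Locally optimal proposal p(x_t | x_{t-1}, y_t), closed form for
        # this linear-Gaussian model; incremental weight is
        # p(y_t | x_{t-1}) = N(y_t; x_{t-1}, Q + R).
        s = 1.0 / (1.0 / Q + 1.0 / R)
        mean = s * (particles / Q + y / R)
        new = mean + rng.normal(0, np.sqrt(s), N)
        logw = -0.5 * (y - particles) ** 2 / (Q + R)
    return new, logw

results = {}
for proposal in ("prior", "optimal"):
    particles, ess_vals = np.zeros(N), []
    for y in ys:
        particles, logw = sis_step(particles, y, proposal)
        ess_vals.append(ess(logw))
        # Resample every step to keep the two runs comparable.
        w = np.exp(logw - logw.max())
        particles = particles[rng.choice(N, N, p=w / w.sum())]
    results[proposal] = float(np.mean(ess_vals))
```

Because the observations are much more precise than the transitions, the prior proposal scatters particles widely and its weights degenerate, while the tailored proposal keeps the effective sample size high, which mirrors why tailor-made proposals paid off in the MCMC comparison above.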