Bayesian Approach: Selected Abstracts
DETERMINATION OF MANATEE POPULATION TRENDS ALONG THE ATLANTIC COAST OF FLORIDA USING A BAYESIAN APPROACH WITH TEMPERATURE-ADJUSTED AERIAL SURVEY DATA
MARINE MAMMAL SCIENCE, Issue 3 2004. Bruce A. Craig.
Abstract: In many animal population survey studies, the construction of a stochastic model provides an effective way to capture underlying biological characteristics that contribute to the overall variation in the data. In this paper we develop a stochastic model to assess the population trend and abundance of the Florida manatee, Trichechus manatus latirostris, along the Atlantic coast of the state, using aerial survey data collected at winter aggregation sites between 1982 and 2001. This model accounts for the method by which the manatees were counted, their movements between surveys, and the behavior of the total population over time. The data suggest an overall increase in the population from 1982 to 1989 of around 5%–7%, a reduction in growth or a leveling off (0%–4% annual growth) from 1990 to 1993, and then an increase again of around 3%–6% since 1994. In winter 2001–2002 (the most recent survey season for which analyses were done), we estimated the adult manatee population along the east coast of Florida to be 1,607 individuals (range = 1,353–1,972; 95% credible interval). Our estimate of manatee abundance corresponds well with maximum counts (approximately 1,600 manatees) produced during synoptic aerial surveys under optimal conditions. Our calculations of trends correspond well with mark-recapture analyses of trends in survival of adult manatees along the east coast through the early 1990s. Our population trend estimates since that time are more optimistic than those generated by mark-recapture models. [source]

A SEMIPARAMETRIC BAYESIAN APPROACH TO MULTIVARIATE LONGITUDINAL DATA
AUSTRALIAN & NEW ZEALAND JOURNAL OF STATISTICS, Issue 3 2010. Pulak Ghosh.
Summary: We extend the standard multivariate mixed model by incorporating a smooth time effect and relaxing distributional assumptions. We propose a semiparametric Bayesian approach to multivariate longitudinal data using a mixture of Polya trees prior distribution. Usually, the distribution of random effects in a longitudinal data model is assumed to be Gaussian. However, the normality assumption may be suspect, particularly if the estimated longitudinal trajectory parameters exhibit multi-modality and skewness. In this paper we propose a mixture of Polya trees prior density to address the limitations of the parametric random effects distribution. We illustrate the methodology by analysing data from a recent HIV-AIDS study. [source]

A SEMIPARAMETRIC BAYESIAN APPROACH TO NETWORK MODELLING USING DIRICHLET PROCESS PRIOR DISTRIBUTIONS
AUSTRALIAN & NEW ZEALAND JOURNAL OF STATISTICS, Issue 3 2010. Pulak Ghosh.
Summary: This paper considers the use of Dirichlet process prior distributions in the statistical analysis of network data. Dirichlet process prior distributions have the advantages of avoiding the parametric specifications for distributions, which are rarely known, and of facilitating a clustering effect, which is often applicable to network nodes. The approach is highlighted for two network models and is conveniently implemented using WinBUGS software. [source]
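The Dirichlet process abstracts above lean on two properties: no fixed parametric form and a built-in clustering effect. As a rough, self-contained illustration of where that clustering comes from (a generic truncated stick-breaking simulation, not the WinBUGS models used in the papers), consider:

```python
import numpy as np

rng = np.random.default_rng(7)

def stick_breaking_weights(alpha, n_atoms):
    """Truncated stick-breaking construction of Dirichlet process weights."""
    fractions = rng.beta(1.0, alpha, size=n_atoms)            # stick fractions
    remaining = np.concatenate(([1.0], np.cumprod(1.0 - fractions[:-1])))
    return fractions * remaining                              # sums to ~1

alpha = 1.0                            # small concentration -> few clusters
weights = stick_breaking_weights(alpha, n_atoms=50)
atoms = rng.normal(0.0, 2.0, size=50)  # cluster locations from the base measure

# Drawing many observations from the random measure reuses the same atoms;
# that reuse is the clustering effect the network paper exploits.
z = rng.choice(50, size=200, p=weights / weights.sum())
samples = atoms[z] + rng.normal(0.0, 0.1, size=200)  # noisy draws per cluster
print("distinct clusters among 200 draws:", np.unique(z).size)
```

With alpha = 1, a handful of atoms absorb most of the 200 draws, mimicking how network nodes fall into a small number of groups.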
ESTIMATION IN RICKER'S TWO-RELEASE METHOD: A BAYESIAN APPROACH
AUSTRALIAN & NEW ZEALAND JOURNAL OF STATISTICS, Issue 2 2006. Shen-Ming Lee.
Summary: Ricker's two-release method is a simplified version of the Jolly-Seber method (described in Seber's Estimation of Animal Abundance, 1982) used to estimate survival rate and abundance in animal populations. It assumes there is only a single recapture sample and no immigration, emigration or recruitment. In this paper, we propose a Bayesian analysis for this method to estimate the survival rate and the capture probability, employing Markov chain Monte Carlo methods and a latent variable analysis. The performance of the proposed method is illustrated with a simulation study as well as a real data set. The results show that the proposed method provides favourable inference for the survival rate when compared with the modified maximum likelihood method. [source]

Analyzing the Relationship Between Smoking and Coronary Heart Disease at the Small Area Level: A Bayesian Approach to Spatial Modeling
GEOGRAPHICAL ANALYSIS, Issue 2 2006. Jane Law.
We model the relationship between coronary heart disease and smoking prevalence and deprivation at the small area level using the Poisson log-linear model with and without random effects. Extra-Poisson variability (overdispersion) is handled through the addition of spatially structured and unstructured random effects in a Bayesian framework. In addition, four different measures of smoking prevalence are assessed because the smoking data are obtained from a survey that resulted in quite large differences in the size of the sample across the census tracts. Two of the methods use Bayes adjustments of standardized smoking ratios (local and global adjustments), and one uses a nonparametric spatial averaging technique. A preferred model is identified based on the deviance information criterion. Both smoking and deprivation are found to be statistically significant risk factors, but the effect of the smoking variable is reduced once the confounding effects of deprivation are taken into account. Maps of the spatial variability in relative risk, and of the importance of the underlying covariates and random effects terms, are produced. We also identify areas with excess relative risk. [source]

A Bayesian Approach to Prediction Using the Gravity Model, with an Application to Patient Flow Modeling
GEOGRAPHICAL ANALYSIS, Issue 3 2000. Peter Congdon.
This paper investigates the potential for estimation and prediction by Bayesian methods of hospitalization flows classified by place of residence and hospital site. The focus is especially on emergency (unplanned) admissions to hospitals. The need for strategic modeling and forecasting arises because the structure of U.K. emergency service provision is subject to changes involving site closures or changes in bed numbers. The gravity model, reflecting patient demand, hospital supply, and distance effects, has been applied to patient flows, but generally in a situation of unchanged destination states. It may be modified, however, in accordance with major changes in hospital service structure, to include access effects (the interplay of supply and distance) and temporal variation in its parameters. Prediction may therefore be applied to a "new" situation defined, for example, by closures of entire hospital sites. The modeling approach used may be adapted to other flow models where destinations may be added or eliminated (for example, trade-area models). A case study involves a sector of London subject to such a restructuring following the U.K. government's 1997–98 review of London's emergency services. [source]
Reconstruction of the Water Table from Self-Potential Data: A Bayesian Approach
GROUND WATER, Issue 2 2009. A. Jardani.
Ground water flow associated with pumping and injection tests generates self-potential signals that can be measured at the ground surface and used to estimate the pattern of ground water flow at depth. We propose an inversion of the self-potential signals that accounts for the heterogeneous nature of the aquifer and a relationship between the electrical resistivity and the streaming current coupling coefficient. We recast the inversion of the self-potential data into a Bayesian framework. Synthetic tests are performed, showing the advantage of using self-potential signals in addition to in situ measurements of the potentiometric levels to reconstruct the shape of the water table. This methodology is applied to a new data set from a series of coordinated hydraulic tomography, self-potential, and electrical resistivity tomography experiments performed at the Boise Hydrogeophysical Research Site, Idaho. In particular, we examine one of the dipole hydraulic tests and its reciprocal to show the sensitivity of the self-potential signals to variations of the potentiometric levels under steady-state conditions. However, because of the high pumping rate, the response was also influenced by the Reynolds number, especially near the pumping well for a given test. Ground water flow in the inertial laminar flow regime is responsible for a nonlinearity that is not yet accounted for in self-potential tomography. Numerical modeling addresses the sensitivity of the self-potential response to this problem. [source]

A Bayesian Approach to Age Estimation in Modern Americans from the Clavicle
JOURNAL OF FORENSIC SCIENCES, Issue 3 2010. Natalie Langley-Shirley, Ph.D.
Abstract: Clavicles from 1289 individuals from cohorts spanning the 20th century were scored with two scoring systems. Transition analysis and Bayesian statistics were used to obtain robust age ranges that are less sensitive to the effects of age mimicry and developmental outliers than age ranges obtained using a percentile approach. Observer error tests showed that a simple three-phase scoring system proved the least subjective while retaining accuracy. Additionally, significant sexual dimorphism was detected in the onset of fusion, with women commencing fusion at least a year earlier than men (women transition to fusion at approximately 15 years of age and men at 16 years). Significant secular trends were apparent in the onset of skeletal maturation, with modern Americans transitioning to fusion approximately 4 years earlier than early 20th century Americans and 3.5 years earlier than Korean War era Americans. These results underscore the importance of using modern standards to estimate age in modern individuals. [source]
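To make the transition-analysis idea in the clavicle abstract concrete: each clavicle is assigned a fusion stage, the probability of having reached a stage is modeled as a smooth function of age, and Bayes' rule turns an observed stage into an age distribution. The sketch below is a minimal cumulative-logit version; the 15-year onset echoes the abstract, but the completion age, slope, and flat prior are illustrative assumptions, not the paper's fitted values:

```python
import numpy as np

ages = np.arange(8.0, 31.0, 0.1)          # candidate ages (years)
prior = np.ones_like(ages) / ages.size    # flat prior over the age range

# Hypothetical 3-stage scoring: 0 = unfused, 1 = fusing, 2 = fused.
# P(stage >= k | age) follows a logistic curve; transition ages are
# illustrative (about 15 y to start fusing, about 22 y to complete).
def p_stage_ge(k, age, cut=(15.0, 22.0), slope=1.0):
    return 1.0 / (1.0 + np.exp(-slope * (age - cut[k - 1])))

def stage_likelihood(stage, age):
    if stage == 0:
        return 1.0 - p_stage_ge(1, age)
    if stage == 1:
        return p_stage_ge(1, age) - p_stage_ge(2, age)
    return p_stage_ge(2, age)

observed_stage = 1                        # clavicle scored as "fusing"
post = prior * stage_likelihood(observed_stage, ages)
post /= post.sum()

mean_age = np.sum(ages * post)
cdf = np.cumsum(post)
lo, hi = ages[np.searchsorted(cdf, 0.025)], ages[np.searchsorted(cdf, 0.975)]
print(f"posterior mean age {mean_age:.1f} y, 95% interval ({lo:.1f}, {hi:.1f})")
```

Because the whole posterior is carried along, the interval reflects both observation noise and the flatness of the likelihood, which is what makes the approach less sensitive to age mimicry than a percentile table.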
Application of a Bayesian Approach to the Tomographic Analysis of Hopper Flow
PARTICLE & PARTICLE SYSTEMS CHARACTERIZATION, Issue 4 2005. Krzysztof Grudzien.
Abstract: This paper presents a new approach to the analysis of data on powder flow from electrical capacitance tomography (ECT) using probability modelling and Bayesian statistics. The methodology is illustrated for powder flow in a hopper. The special feature of this approach is that 'high-level' statistical Bayesian modelling, combined with a Markov chain Monte Carlo (MCMC) sampling algorithm, allows direct estimation of the control parameters of industrial processes, in contrast to the usually applied 'low-level', pixel-based methods of data analysis. This enables reliable recognition of key process features in a quantitative manner. The main difficulty when investigating hopper flow with ECT is the need to measure small differences in particle packing density. The MCMC protocol enables more robust identification of the responses of such complex systems. This paper demonstrates the feasibility of the approach for a simple case of particulate material flow during discharging of a hopper. It is concluded that these approaches can offer significant advantages for the analysis and control of some industrial powder and other multi-phase flow processes. [source]

Modelling Operational Losses: A Bayesian Approach
QUALITY AND RELIABILITY ENGINEERING INTERNATIONAL, Issue 5 2004. Paolo Giudici.
Abstract: The exposure of banks to operational risk has increased in recent years. The Basel Committee on Banking Supervision (known as Basel II) has established a capital charge to cover operational risks other than credit and market risk. Following the advanced methods defined in 'The New Basel Capital Accord' to quantify the capital charge, in this paper we present an advanced measurement approach based on a Bayesian network model that estimates an internal measure of risk for the bank. One of the main problems faced when measuring operational risk is the scarcity of loss data. The proposed methodology solves this critical point because it allows a coherent integration, via Bayes' theorem, of different sources of information, such as internal and external data and the opinions of 'experts' (process owners) about the frequency and the severity of each loss event. Furthermore, the model corrects the loss distribution by considering the eventual relations between different nodes of the network, which represent the losses of each combination of business line/event type/bank/process, and the effectiveness of the corresponding internal and external controls. The operational risk capital charge is quantified by multiplying the value at risk (VaR) per event, a percentile of the loss distribution, by an estimate of the number of losses that may occur in a given period. Furthermore, it becomes possible to monitor the effectiveness of the internal and external system controls in place at the bank. The methodology has been tested in a pilot project at one of the most important Italian banking groups, Monte dei Paschi di Siena. Copyright © 2004 John Wiley & Sons, Ltd. [source]
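The capital-charge arithmetic in the operational-loss abstract (a severity percentile scaled by an expected event count) is easy to sketch by simulation. The frequency and severity parameters below are invented stand-ins for what the paper's Bayesian network would supply after integrating internal data, external data, and expert opinion:

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative assumptions standing in for the Bayesian network's output:
lam = 25.0              # expected number of loss events per year
mu, sigma = 10.0, 1.2   # lognormal severity of a single loss (log scale)

# Simulate years of aggregate losses: Poisson count, lognormal severities.
n_years = 20_000
counts = rng.poisson(lam, size=n_years)
annual = np.array([rng.lognormal(mu, sigma, size=n).sum() for n in counts])

# 99.9th percentile of the annual aggregate loss (a Basel-style quantile).
print(f"aggregate 99.9% VaR: {np.percentile(annual, 99.9):,.0f}")

# The abstract's recipe: per-event VaR times expected number of events.
per_event_var = np.percentile(rng.lognormal(mu, sigma, size=200_000), 99.9)
print(f"per-event 99.9% VaR x E[count]: {per_event_var * lam:,.0f}")
```

Comparing the two printed figures shows why the choice between an aggregate-loss quantile and the per-event shortcut matters for the size of the capital charge.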
A Bayesian Approach to Real Options: The Case of Distinguishing between Temporary and Permanent Shocks
THE JOURNAL OF FINANCE, Issue 5 2010. STEVEN R. GRENADIER.
Abstract: Traditional real options models demonstrate the importance of the "option to wait" due to uncertainty over future shocks to project cash flows. However, there is often another important source of uncertainty: uncertainty over the permanence of past shocks. Adding Bayesian uncertainty over the permanence of past shocks augments the traditional option to wait with an additional "option to learn." The implied investment behavior differs significantly from that in standard models. For example, investment may occur at a time of stable or decreasing cash flows, respond sluggishly to cash flow shocks, and depend on the timing of project cash flows. [source]

Inference in Long-Horizon Event Studies: A Bayesian Approach with Application to Initial Public Offerings
THE JOURNAL OF FINANCE, Issue 5 2000. Alon Brav.
Statistical inference in long-horizon event studies has been hampered by the fact that abnormal returns are neither normally distributed nor independent. This study presents a new approach to inference that overcomes these difficulties and dominates other popular testing methods. I illustrate the use of the methodology by examining the long-horizon returns of initial public offerings (IPOs). I find that the Fama and French (1993) three-factor model is inconsistent with the observed long-horizon price performance of these IPOs, whereas a characteristic-based model cannot be rejected. [source]

A Bayesian Approach to Surrogacy Assessment Using Principal Stratification in Clinical Trials
BIOMETRICS, Issue 2 2010. Yun Li.
Summary: A surrogate marker (S) is a variable that can be measured earlier and often more easily than the true endpoint (T) in a clinical trial. Most previous research has been devoted to developing surrogacy measures to quantify how well S can replace T, or to examining the use of S in predicting the effect of a treatment (Z). However, this research often requires one to fit models for the distribution of T given S and Z. It is well known that such models do not have causal interpretations because they condition on a postrandomization variable S. In this article, we directly model the relationship among T, S, and Z using a potential outcomes framework introduced by Frangakis and Rubin (2002, Biometrics 58, 21–29). We propose a Bayesian estimation method to evaluate the causal probabilities associated with the cross-classification of the potential outcomes of S and T when S and T are both binary. We use a log-linear model to directly model the association between the potential outcomes of S and T through the odds ratios. The quantities derived from this approach always have causal interpretations. However, this causal model is not identifiable from the data without additional assumptions. To reduce the nonidentifiability problem and increase the precision of statistical inferences, we assume monotonicity and incorporate prior belief that is plausible in the surrogate context by using prior distributions. We also explore the relationship between the surrogacy measures based on traditional models and this counterfactual model. The method is applied to data from a glaucoma treatment study. [source]
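The surrogacy abstract's log-linear device, tying the joint distribution of two binary outcomes to their margins through an odds ratio, can be shown in miniature. Given margins P(S=1) and P(T=1) and an assumed odds ratio, the 2x2 joint table is recovered by solving a quadratic (the classical Plackett construction); the margins and odds ratio below are invented for illustration, and the paper's actual method estimates such quantities for potential outcomes under monotonicity:

```python
import numpy as np

def joint_from_margins(pa, pb, psi):
    """2x2 cell probabilities with margins pa, pb and odds ratio psi."""
    if abs(psi - 1.0) < 1e-12:
        p11 = pa * pb                     # independence
    else:
        s = 1.0 + (pa + pb) * (psi - 1.0)
        disc = s * s - 4.0 * psi * (psi - 1.0) * pa * pb
        p11 = (s - np.sqrt(disc)) / (2.0 * (psi - 1.0))
    p10, p01 = pa - p11, pb - p11
    p00 = 1.0 - pa - pb + p11
    return np.array([[p00, p01], [p10, p11]])

# Hypothetical margins for two binary outcomes, with odds ratio 6:
table = joint_from_margins(pa=0.4, pb=0.3, psi=6.0)
print(table, "cells sum to", table.sum())
# Check that the implied odds ratio matches the input:
print("odds ratio:", table[1, 1] * table[0, 0] / (table[1, 0] * table[0, 1]))
```

The point of the parameterization is visible here: the margins are identified by the trial data, so all the unidentified causal structure is concentrated in the single odds-ratio parameter that the prior must inform.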
A Bayesian Approach to Modeling Associations Between Pulsatile Hormones
BIOMETRICS, Issue 2 2009. Nichole E. Carlson.
Summary: Many hormones are secreted in pulses. The pulsatile relationship between hormones regulates many biological processes. To understand endocrine system regulation, time series of hormone concentrations are collected. The goal is to characterize pulsatile patterns and associations between hormones. Currently each hormone on each subject is fitted univariately. This leads to estimates of the number of pulses and the amount of hormone secreted; however, when the signal-to-noise ratio is small, pulse detection and parameter estimation remain difficult with existing approaches. In this article, we present a bivariate deconvolution model of pulsatile hormone data focusing on incorporating pulsatile associations. Through simulation, we show that using the underlying pulsatile association between two hormones improves the estimation of the number of pulses and of the other parameters defining each hormone. We develop the one-to-one, driver-response case and show how birth-death Markov chain Monte Carlo can be used for estimation. We demonstrate these features through a simulation study and apply the method to luteinizing and follicle stimulating hormones. [source]

Structural Equation Modeling: A Bayesian Approach by S.-Y.
BIOMETRICS, Issue 4 2007.
No abstract is available for this article. [source]

A spatial model of bird abundance as adjusted for detection probability
ECOGRAPHY, Issue 2 2009. P. Marcos Gorresen.
Modeling the spatial distribution of animals can be complicated by spatial and temporal effects (i.e. spatial autocorrelation and trends in abundance over time) and by other factors such as imperfect detection probabilities and observation-related nuisance variables. Recent advances in modeling have demonstrated various approaches that handle most of these factors, but they require a degree of sampling effort (e.g. replication) not available to many field studies. We present a two-step approach that addresses these challenges to spatially model species abundance. Habitat, spatial and temporal variables were handled with a Bayesian approach, which facilitated modeling hierarchically structured data. Predicted abundance was subsequently adjusted to account for imperfect detection and for the area effectively sampled for each species. We provide examples of our modeling approach for two endemic Hawaiian nectarivorous honeycreepers: 'i'iwi Vestiaria coccinea and 'apapane Himatione sanguinea. [source]

Data cloning: easy maximum likelihood estimation for complex ecological models using Bayesian Markov chain Monte Carlo methods
ECOLOGY LETTERS, Issue 7 2007. Subhash R. Lele.
Abstract: We introduce a new statistical computing method, called data cloning, to calculate maximum likelihood estimates and their standard errors for complex ecological models. Although the method uses the Bayesian framework and exploits the computational simplicity of the Markov chain Monte Carlo (MCMC) algorithms, it provides valid frequentist inferences such as the maximum likelihood estimates and their standard errors. The inferences are completely invariant to the choice of the prior distributions and therefore avoid the inherent subjectivity of the Bayesian approach. The data cloning method is easily implemented using standard MCMC software. Data cloning is particularly useful for analysing ecological situations in which hierarchical statistical models, such as state-space models and mixed effects models, are appropriate. We illustrate the method by fitting two nonlinear population dynamics models to data in the presence of process and observation noise. [source]
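Data cloning is easy to demonstrate in a conjugate toy model where the exact MLE is known: copy the data K times, do the Bayesian update on the cloned set, and the posterior mean converges to the MLE while K times the posterior variance approaches the asymptotic variance. A minimal sketch with invented Poisson counts and a gamma prior (the paper uses the same trick inside general MCMC software):

```python
import numpy as np

counts = np.array([3, 0, 2, 5, 1, 2, 4, 0, 1, 2])   # toy ecological counts
a0, b0 = 2.0, 1.0                                   # gamma(shape, rate) prior

for k in (1, 10, 100):
    cloned_sum, cloned_n = k * counts.sum(), k * counts.size
    # Gamma posterior for a Poisson rate: shape a0 + sum(y), rate b0 + n.
    a, b = a0 + cloned_sum, b0 + cloned_n
    post_mean, post_var = a / b, a / b**2
    print(f"K={k:4d}  posterior mean={post_mean:.4f}  K*var={k * post_var:.4f}")

print("MLE:", counts.mean(), " MLE asymptotic var:", counts.mean() / counts.size)
```

As K grows, the influence of the prior (a0, b0) washes out, which is exactly the prior-invariance property the abstract advertises.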
Modelling the effects of air pollution on health using Bayesian dynamic generalised linear models
ENVIRONMETRICS, Issue 8 2008. Duncan Lee.
Abstract: The relationship between short-term exposure to air pollution and mortality or morbidity has been the subject of much recent research, in which the standard method of analysis uses Poisson linear or additive models. In this paper, we use a Bayesian dynamic generalised linear model (DGLM) to estimate this relationship, which allows the standard linear or additive model to be extended in two ways: (i) the long-term trend and temporal correlation present in the health data can be modelled by an autoregressive process rather than a smooth function of calendar time; (ii) the effects of air pollution are allowed to evolve over time. The efficacy of these two extensions is investigated by applying a series of dynamic and non-dynamic models to air pollution and mortality data from Greater London. A Bayesian approach is taken throughout, and a Markov chain Monte Carlo simulation algorithm is presented for inference. An alternative likelihood-based analysis is also presented, to allow a direct comparison with the only previous analysis of air pollution and health data using a DGLM. Copyright © 2008 John Wiley & Sons, Ltd. [source]
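The defining feature of the DGLM above, a regression coefficient that evolves over time, is easy to visualize by forward simulation. This is a generative sketch with invented parameters, not the Greater London model or its MCMC sampler:

```python
import numpy as np

rng = np.random.default_rng(0)
T = 365                                   # days of daily mortality counts

pollution = rng.gamma(4.0, 5.0, size=T)  # synthetic daily PM10-like series
trend = 0.3 * np.sin(2 * np.pi * np.arange(T) / 365)  # seasonal log-trend

# (ii) a pollution effect that evolves as a random walk, not a constant:
beta = 0.01 + np.cumsum(rng.normal(0.0, 0.002, size=T))

# (i) autoregressive temporal correlation in the log relative risk:
eta = np.zeros(T)
for t in range(1, T):
    eta[t] = 0.8 * eta[t - 1] + rng.normal(0.0, 0.05)

log_mu = np.log(15.0) + trend + beta * pollution + eta
deaths = rng.poisson(np.exp(log_mu))
print("mean daily deaths:", round(deaths.mean(), 1))
```

Fitting such a model reverses this simulation: the MCMC sampler infers the latent beta and eta paths from the observed counts, which is what lets the pollution effect be tracked over time.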
Assessing the impacts of grazing levels on bird density in woodland habitat: a Bayesian approach using expert opinion
ENVIRONMETRICS, Issue 7 2005. Petra M. Kuhnert.
Abstract: Many studies on birds focus on the collection of data through an experimental design suitable for investigation in a classical analysis of variance (ANOVA) framework. Although many findings are confirmed by one or more experts, expert information is rarely used in conjunction with the survey data to enhance the explanatory and predictive power of the model. We explore this neglected aspect of ecological modelling through a study on Australian woodland birds, focusing on the potential impact of different intensities of commercial cattle grazing on bird density in woodland habitat. Using WinBUGS, we examine a number of Bayesian hierarchical random effects models, which cater for overdispersion and a high frequency of zeros in the data, and explore the variation between and within different grazing regimes and species. The impact and value of expert information is investigated through the inclusion of priors that reflect the experience of 20 experts in the field of bird responses to disturbance. Results indicate that expert information moderates the survey data, especially in situations where there are little or no data. When experts agreed, credible intervals for predictions were tightened considerably. When experts failed to agree, results were similar to those evaluated in the absence of expert information. Overall, we found that without expert opinion our knowledge was quite weak. The fact that the survey data are, in general, quite consistent with expert opinion shows that we do know something about birds and grazing, and that we could learn a lot faster if we used this approach more in ecology, where data are scarce. Copyright © 2005 John Wiley & Sons, Ltd. [source]

Modeling and predicting complex space-time structures and patterns of coastal wind fields
ENVIRONMETRICS, Issue 5 2005. Montserrat Fuentes.
Abstract: A statistical technique is developed for wind field mapping that can be used to improve either the assimilation of surface wind observations into a model initial field or the accuracy of post-processing algorithms run on meteorological model output. The observed wind field at any particular location is treated as a function of the true (but unknown) wind and measurement error. The wind field from numerical weather prediction models is treated as a function of a linear and multiplicative bias and a term which represents random deviations with respect to the true wind process. A Bayesian approach is taken to provide information about the true underlying wind field, which is modeled as a stochastic process with a non-stationary and non-separable covariance. The method is applied to forecast wind fields from a widely used mesoscale numerical weather prediction (NWP) model (MM5). The statistical model tests are carried out for the wind speed over the Chesapeake Bay and the surrounding region for 21 July 2002. Coastal wind observations that had not been used in the MM5 initial conditions or forecasts are used in conjunction with the MM5 forecast wind field (valid at the same time that the observations were available) in a post-processing technique that combines these two sources of information to predict the true wind field. Based on the mean square error, this procedure provides a substantial correction to the MM5 wind field forecast over the Chesapeake Bay region. Copyright © 2005 John Wiley & Sons, Ltd. [source]

Space varying coefficient models for small area data
ENVIRONMETRICS, Issue 5 2003. Renato M. Assunção.
Abstract: Many spatial regression problems using area data require more flexible forms than the usual linear predictor for modelling the dependence of responses on covariates. One direction for doing this is to allow the coefficients to vary as smooth functions of the area's geographical location. After presenting examples from the scientific literature where these spatially varying coefficients are justified, we briefly review some of the available alternatives for this kind of modelling. We concentrate on a Bayesian approach for generalized linear models proposed by the author, which uses a Markov random field to model the coefficients' spatial dependency. We show that, for normally distributed data, Gibbs sampling can be used to sample from the posterior, and we prove a result showing the equivalence between our model and other usual spatial regression models. We illustrate our approach with a number of rather complex applied problems, showing that the method is computationally feasible and provides useful insights in substantive problems. Copyright © 2003 John Wiley & Sons, Ltd. [source]
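The mechanics behind the space varying coefficient model can be sketched at toy scale: give each area its own slope, put a Gaussian Markov random field prior on the vector of slopes so neighboring areas shrink toward each other, and exploit conjugacy for normal data. Everything below (a chain of 12 areas, known variances) is an illustrative simplification of the paper's setup, not its Gibbs sampler:

```python
import numpy as np

rng = np.random.default_rng(1)
m, n_per = 12, 8                         # areas on a chain, obs per area

# True coefficient varies smoothly over space (what the model should recover).
true_beta = 1.0 + 0.8 * np.sin(np.linspace(0, np.pi, m))
area = np.repeat(np.arange(m), n_per)
x = rng.normal(1.0, 0.5, size=m * n_per)
y = true_beta[area] * x + rng.normal(0.0, 0.3, size=m * n_per)

# Gaussian MRF prior on the coefficients: penalize neighbor differences.
Q = np.zeros((m, m))
for i in range(m - 1):                   # chain-graph Laplacian
    Q[i, i] += 1.0
    Q[i + 1, i + 1] += 1.0
    Q[i, i + 1] -= 1.0
    Q[i + 1, i] -= 1.0
Q += 1e-6 * np.eye(m)                    # tiny ridge so the prior is proper

sigma2, tau2 = 0.3**2, 0.5**2            # variances assumed known for the sketch
XtX, Xty = np.zeros((m, m)), np.zeros(m)
for i in range(m):
    xi, yi = x[area == i], y[area == i]
    XtX[i, i] = xi @ xi                  # block-diagonal: one slope per area
    Xty[i] = xi @ yi

# Conjugate Gaussian posterior mean: (X'X/sigma2 + Q/tau2)^{-1} X'y/sigma2
post_mean = np.linalg.solve(XtX / sigma2 + Q / tau2, Xty / sigma2)
print("posterior:", np.round(post_mean, 2))
print("truth:    ", np.round(true_beta, 2))
```

In a full analysis sigma2 and tau2 would get priors and be sampled too; fixing them here isolates the MRF smoothing, which is the part that distinguishes the model from independent area-level regressions.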
Contending with space-time interaction in the spatial prediction of pollution: Vancouver's hourly ambient PM10 field
ENVIRONMETRICS, Issue 5-6 2002. Jim Zidek.
Abstract: In this article we describe an approach for predicting average hourly concentrations of ambient PM10 in Vancouver. We know our solution also applies to hourly ozone fields and believe it may be quite generally applicable. We use a hierarchical Bayesian approach. At the primary level we model the logarithmic field as a trend model plus a Gaussian stochastic residual. The trend model depends on hourly meteorological predictors and is common to all sites. The stochastic component consists of a 24-hour vector response that we model as a multivariate AR(3) temporal process with common spatial parameters. Removing the trend and AR structure leaves 'whitened' time series of vector responses. With this approach (as opposed to using 24 separate univariate time series models), there is little loss of spatial correlation in these residuals compared with that in the detrended residuals alone (prior to removing the AR component). Moreover, our multivariate approach enables predictions for any given hour to 'borrow strength' through its correlation with adjoining hours. On this basis we develop a spatial predictive distribution for these residuals at unmonitored sites. By transforming the predicted residuals back to the original data scales, we can impute Vancouver's hourly PM10 field. Copyright © 2002 John Wiley & Sons, Ltd. [source]

Environmental power analysis: a new perspective
ENVIRONMETRICS, Issue 5 2001. David R. Fox.
Abstract: Power analysis and sample-size determination are related tools that have recently gained popularity in the environmental sciences. Their indiscriminate application, however, can lead to wildly misleading results. This is particularly true in environmental monitoring and assessment, where the quality and nature of the data are such that the implicit assumptions underpinning power and sample-size calculations are difficult to justify. When the assumptions are reasonably met, these statistical techniques provide researchers with an important capability for allocating scarce and expensive resources to detect putative impact or change. Conventional analyses are predicated on a general linear model and normal distribution theory, with statistical tests of environmental impact couched in terms of changes in a population mean. While these are 'optimal' statistical tests (uniformly most powerful), they nevertheless pose considerable practical difficulties for the researcher. Compounding this difficulty is the subsequent analysis of the data and the imposition of a decision framework that commences with an assumption of 'no effect'. This assumption is discarded only when the sample data indicate demonstrable evidence to the contrary. The alternative ('green') view is that any anthropogenic activity has an impact on the environment, and therefore a more realistic initial position is to assume that the environment is already impacted. In this article we examine these issues and provide a reformulation of conventional mean-based hypotheses in terms of population percentiles. Prior information or belief concerning the probability of exceeding a criterion is incorporated into the power analysis using a Bayesian approach. Finally, a new statistic is introduced which attempts to balance the overall power regardless of the decision framework adopted. Copyright © 2001 John Wiley & Sons, Ltd. [source]
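The power analysis abstract replaces mean-based hypotheses with population percentiles, and a Monte Carlo version of that idea takes only a few lines: fix a criterion at a baseline percentile, test whether the probability of exceeding it has risen, and estimate power by simulation. All distributions, sample sizes, and the hypothesized shift below are invented for illustration:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

criterion = stats.norm.ppf(0.80, loc=10, scale=2)  # baseline 80th percentile
p0 = 0.20                      # baseline exceedance probability
n, alpha = 40, 0.05
shift = 1.0                    # hypothesized impact: the distribution moves up

def rejects(sample):
    """One-sided exact binomial test that exceedance prob exceeds p0."""
    k = int((sample > criterion).sum())
    return stats.binomtest(k, n, p0, alternative="greater").pvalue < alpha

power = np.mean([
    rejects(rng.normal(10 + shift, 2, size=n)) for _ in range(2000)
])
print(f"estimated power to detect a {shift}-unit shift: {power:.2f}")
```

A prior belief about the exceedance probability, as the abstract proposes, would replace the fixed p0 with a distribution and average the power over it.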
A BAYESIAN FRAMEWORK FOR THE ANALYSIS OF COSPECIATION
EVOLUTION, Issue 2 2000. John P. Huelsenbeck.
Abstract: Information on the history of cospeciation and host switching for a group of host and parasite species is contained in the DNA sequences sampled from each. Here, we develop a Bayesian framework for the analysis of cospeciation. We suggest a simple model of host switching by a parasite on a host phylogeny in which host switching events are assumed to occur at a constant rate over the entire evolutionary history of associated hosts and parasites. The posterior probability density of the parameters of the model of host switching is evaluated numerically using Markov chain Monte Carlo. In particular, the method generates the probability density of the number of host switches and of the host switching rate. Moreover, the method provides information on the probability that an event of host switching is associated with a particular pair of branches. A Bayesian approach has several advantages over other methods for the analysis of cospeciation. In particular, it does not assume that the host or parasite phylogenies are known without error; many alternative phylogenies are sampled in proportion to their probability of being correct. [source]

A Bayesian approach to the transmission/disequilibrium test for binary traits
GENETIC EPIDEMIOLOGY, Issue 1 2002. Varghese George.
Abstract: The transmission/disequilibrium test (TDT) for binary traits is a powerful method for detecting linkage between a marker locus and a trait locus in the presence of allelic association. The TDT uses information on the parent-to-offspring transmission status of the associated allele at the marker locus to assess linkage or association in the presence of the other, using one affected offspring from each set of parents. For testing for linkage in the presence of association, more than one offspring per family can be used. However, without incorporating the correlation structure among offspring, it is not possible to correctly assess the association in the presence of linkage. In this presentation, we propose a Bayesian TDT method as a complementary alternative to the classical approach. In the hypothesis testing setup, given two competing hypotheses, the Bayes factor can be used to weigh the evidence in favor of one of them, thus allowing us to decide between the two hypotheses using established criteria. We compare the proposed Bayesian TDT with a competing frequentist testing method with respect to power and type I error validity. If we know the mode of inheritance of the disease, then the joint and marginal posterior distributions for the recombination fraction (θ) and disequilibrium coefficient (δ) can be obtained via standard MCMC methods, which lead naturally to Bayesian credible intervals for both parameters. Genet. Epidemiol. 22:41–51, 2002. © 2002 Wiley-Liss, Inc. [source]
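At its core the TDT counts how often heterozygous parents transmit the associated allele, so a minimal Bayesian version puts a Beta prior on the transmission probability and asks how far the posterior sits from the null value of 1/2. The counts below are invented, and this sketch deliberately collapses the paper's joint (θ, δ) analysis into a single transmission parameter:

```python
import numpy as np
from scipy import stats

b, c = 62, 38                 # hypothetical transmissions / non-transmissions
prior_a = prior_b = 1.0       # uniform Beta prior on transmission prob tau

post = stats.beta(prior_a + b, prior_b + c)
print(f"P(tau > 0.5 | data) = {1 - post.cdf(0.5):.4f}")

# Savage-Dickey ratio: the Bayes factor for the point null tau = 0.5 against
# the uniform alternative is the posterior density at 0.5 over the prior
# density at 0.5.
bf01 = post.pdf(0.5) / stats.beta(prior_a, prior_b).pdf(0.5)
print(f"Bayes factor in favour of no linkage/association: {bf01:.3f}")
```

A small Bayes factor here plays the role the abstract assigns to it: a direct weight of evidence between the two hypotheses rather than a p-value.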
A Bayesian approach to estimating tectonic stress from seismological data
GEOPHYSICAL JOURNAL INTERNATIONAL, Issue 3 2007. Richard Arnold.
Summary: Earthquakes are conspicuous manifestations of tectonic stress, but the non-linear relationships between the stresses acting on a fault plane, its frictional slip, and the ensuing seismic radiation are such that a single earthquake by itself provides little information about the ambient state of stress. Moreover, observational uncertainties and inherent ambiguities in the nodal planes of earthquake focal mechanisms preclude straightforward inferences about stress being drawn on the basis of individual focal mechanism observations. However, by assuming that each earthquake in a small volume of the crust represents a single, uniform state of stress, the combined constraints imposed on that stress by a suite of focal mechanism observations can be estimated. Here, we outline a probabilistic (Bayesian) technique for estimating tectonic stress directions from primary seismological observations. The Bayesian formulation combines a geologically motivated prior model of the state of stress with an observation model that implements the physical relationship between the stresses acting on a fault and the resultant seismological observation. We show our Bayesian formulation to be equivalent to a well-known analytical solution for a single, errorless focal mechanism observation. The new approach has the distinct advantage, however, of including (1) multiple earthquakes, (2) fault plane ambiguities, (3) observational errors and (4) any prior knowledge of the stress field. Our approach, while computationally demanding in some cases, is intended to yield reliable tectonic stress estimates that can be confidently compared with other tectonic parameters, such as seismic anisotropy and geodetic strain rate observations, and used to investigate spatial and temporal variations in stress associated with major faults and coseismic stress perturbations. [source]

Enabling regional management in a changing climate through Bayesian meta-analysis of a large-scale disturbance
GLOBAL ECOLOGY, Issue 3 2010. M. Aaron MacNeil.
Aim: Quantifying and predicting change in large ecosystems is an important research objective for applied ecologists as human disturbance effects become increasingly evident at regional and global scales. However, studies used to make inferences about large-scale change are frequently of uneven quality and few in number, having been undertaken to study local, rather than global, change. Our aim is to improve the quality of inferences that can be made in meta-analyses of large-scale disturbance by integrating studies of varying quality in a unified modelling framework that is informative for both local and regional management.
Innovation: Here we improve conventionally structured meta-analysis methods by including imputation of unknown study variances and the use of Bayesian factor potentials. The approach is a coherent framework for integrating data of varying quality across multiple studies while facilitating belief statements about the uncertainty in parameter estimates and the probable outcome of future events. The approach is applied to a regional meta-analysis of the effects of loss of coral cover on species richness and the abundance of coral-dependent fishes in the western Indian Ocean (WIO) before and after a mass bleaching event in 1998.
Main conclusions: Our Bayesian approach to meta-analysis provided greater precision of parameter estimates than conventional weighted linear regression meta-analytical techniques, allowing us to integrate all available data from 66 study locations in the WIO across multiple scales. The approach thereby: (1) estimated uncertainty in site-level estimates of change, (2) provided a regional estimate for future change at any given site in the WIO, and (3) provided a probabilistic belief framework for future management of reef resources at both local and regional scales. [source]
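A stripped-down version of the Bayesian meta-analytic machinery in the abstract above: study-level effects with known sampling variances, a normal between-study model, and a grid approximation to the posterior of the overall effect. The six effect sizes and variances are invented, and the paper's full method additionally imputes unknown study variances and links sites through factor potentials:

```python
import numpy as np

# Hypothetical study-level effects (e.g., change in fish species richness)
# and their sampling variances:
y = np.array([-0.8, -0.3, -1.1, -0.5, 0.1, -0.7])
v = np.array([0.10, 0.05, 0.20, 0.08, 0.15, 0.12])

mu_grid = np.linspace(-2.0, 1.0, 301)    # overall effect
tau_grid = np.linspace(0.01, 1.5, 150)   # between-study standard deviation

log_post = np.zeros((mu_grid.size, tau_grid.size))
for j, tau in enumerate(tau_grid):
    s2 = v + tau**2                      # marginal variance of each study
    # Normal likelihood at each (mu, tau), flat priors over the grid:
    log_post[:, j] = np.sum(
        -0.5 * np.log(2 * np.pi * s2)
        - 0.5 * (y[None, :] - mu_grid[:, None]) ** 2 / s2,
        axis=1,
    )

post = np.exp(log_post - log_post.max())
post /= post.sum()
mu_marginal = post.sum(axis=1)
print("posterior mean overall effect:", np.sum(mu_grid * mu_marginal).round(3))
print("P(overall effect < 0):", mu_marginal[mu_grid < 0].sum().round(3))
```

The final line is the kind of belief statement the abstract emphasizes: a direct posterior probability that the disturbance reduced the outcome, usable for management decisions.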
A New Method for Estimating Race/Ethnicity and Associated Disparities Where Administrative Records Lack Self-Reported Race/Ethnicity
HEALTH SERVICES RESEARCH, Issue 5p1 2008. Marc N. Elliott.
Objective: To efficiently estimate race/ethnicity using administrative records to facilitate health care organizations' efforts to address disparities when self-reported race/ethnicity data are unavailable.
Data Source: Surname, geocoded residential address, and self-reported race/ethnicity from 1,973,362 enrollees of a national health plan.
Study Design: We compare the accuracy of a Bayesian approach to combining surname and geocoded information to estimate race/ethnicity to two other indirect methods: a non-Bayesian method that combines surname and geocoded information, and geocoded information alone. We assess accuracy with respect to estimating (1) individual race/ethnicity and (2) overall racial/ethnic prevalence in a population.
Principal Findings: The Bayesian approach was 74 percent more efficient than geocoding alone in estimating individual race/ethnicity and 56 percent more efficient in estimating the prevalence of racial/ethnic groups, outperforming the non-Bayesian hybrid on both measures. The non-Bayesian hybrid was more efficient than geocoding alone in estimating individual race/ethnicity but less efficient with respect to prevalence (p < .05 for all differences).
Conclusions: The Bayesian Surname and Geocoding (BSG) method presented here efficiently integrates administrative data, substantially improving upon what is possible with a single source or with other hybrid methods; it offers a powerful tool that can help health care organizations address disparities until self-reported race/ethnicity data are available. [source]
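The BSG calculation is a direct application of Bayes' rule: a geocoded prior P(race | block group) is updated by a surname likelihood P(surname | race), under the assumption that surname and neighborhood are conditionally independent given race. The two small tables below are fabricated stand-ins for the census surname list and block-group composition the method actually uses:

```python
import numpy as np

groups = ["Hispanic", "Black", "White", "Asian"]

# Fabricated P(race | block group) from the enrollee's geocoded address:
prior_geo = np.array([0.15, 0.10, 0.70, 0.05])

# Fabricated P(surname | race) for the enrollee's surname, census-list style.
# Conditional independence of surname and neighborhood given race is assumed.
lik_surname = np.array([0.90, 0.01, 0.05, 0.04])

posterior = prior_geo * lik_surname
posterior /= posterior.sum()

for g, prior, post in zip(groups, prior_geo, posterior):
    print(f"{g:9s} prior {prior:.2f} -> posterior {post:.3f}")
```

Even with a mostly non-Hispanic neighborhood prior, a strongly Hispanic-associated surname dominates the posterior, which is why the hybrid outperforms geocoding alone.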
Geographical variations of inflammatory bowel disease in France: A study based on national health insurance data
INFLAMMATORY BOWEL DISEASES, Issue 3 2006. Virginie Nerich.
Background and Aim: A north-south gradient in inflammatory bowel disease (IBD) incidence has been found in Europe and the United States. Its existence is inferred from comparisons of registries that cover only small portions of territories. Several studies suggest that IBD incidence in the north has reached a plateau, whereas in the south it has risen sharply. This evolution tends to reduce the north-south gradient, and it is uncertain whether it still exists. In France, patients with IBD are fully reimbursed for their health expenses by the national health insurance system, which is a potential source of data concerning the incidence of IBD at the national level. The aim of this study was to assess the geographical distribution of Crohn's disease (CD) and ulcerative colitis (UC) in France and to test the north-south gradient hypothesis.
Methods: This study was conducted in metropolitan France and included patients to whom IBD reimbursement was newly attributed between January 1, 2000 and December 31, 2002. The data provided relate to age, sex, postcode area of residence, and IBD type. The mapping of the geographical distribution of smoothed relative risks (RR) of CD and UC was carried out using a Bayesian approach, taking into account autocorrelation and population size in each département.
Results: In the overall population, incidence rates were 8.2 for CD and 7.2 for UC per 100,000 inhabitants. A clear north-south gradient was shown for CD. Départements with the highest smoothed RR were located in the northern third of France. By contrast, the geographical distribution of smoothed RR of UC was homogeneous.
Conclusions: This study shows a north-south gradient in France for CD but not for UC. [source]

Generalized probabilistic approach of uncertainties in computational dynamics using random matrices and polynomial chaos decompositions
INTERNATIONAL JOURNAL FOR NUMERICAL METHODS IN ENGINEERING, Issue 8 2010. Christian Soize.
Abstract: A new generalized probabilistic approach to uncertainties is proposed for computational models in structural linear dynamics; it can be extended without difficulty to computational linear vibroacoustics and to computational non-linear structural dynamics. This method allows the prior probability model of each type of uncertainty (model-parameter uncertainties and modeling errors) to be constructed and identified separately. The modeling errors are not taken into account with the usual output-prediction-error method, but with the nonparametric probabilistic approach of modeling errors recently introduced and based on the use of random matrix theory. The theory, an identification procedure and a numerical validation are presented. A chaos decomposition with random coefficients is then proposed to represent the prior probabilistic model of the random responses. The random germ is related to the prior probability model of the model-parameter uncertainties. The random coefficients are related to the prior probability model of the modeling errors and therefore depend on the random matrices introduced by the nonparametric probabilistic approach. A validation is presented. Finally, a future perspective is outlined for the case in which experimental data are available: the prior probability model of the random coefficients can then be improved by constructing a posterior probability model using the Bayesian approach. Copyright © 2009 John Wiley & Sons, Ltd. [source]
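As a minimal illustration of a polynomial chaos decomposition (the generic Hermite-chaos idea, not Soize's random-matrix construction): expand a nonlinear response of a Gaussian germ on Hermite polynomials and estimate the coefficients by regression on samples. The response function and truncation order are invented for the sketch:

```python
import numpy as np
from numpy.polynomial.hermite_e import hermevander

rng = np.random.default_rng(5)

def response(xi):
    """Toy stochastic response driven by a standard Gaussian germ xi."""
    return np.exp(0.3 * xi) + 0.1 * xi**2

# Sample the germ and build the design matrix of probabilists' Hermite
# polynomials He_0..He_4 (orthogonal under the standard normal weight).
xi = rng.standard_normal(5000)
V = hermevander(xi, deg=4)                  # shape (5000, 5)

coeffs, *_ = np.linalg.lstsq(V, response(xi), rcond=None)
print("chaos coefficients:", np.round(coeffs, 4))

# The truncated expansion reproduces the response statistics cheaply:
xi_new = rng.standard_normal(200_000)
approx = hermevander(xi_new, deg=4) @ coeffs
print("mean  exact vs chaos:", response(xi_new).mean().round(4),
      approx.mean().round(4))
```

In the paper's setting the deterministic coefficients above become random, carrying the modeling-error uncertainty, while the Gaussian germ carries the model-parameter uncertainty; the sketch shows only the underlying expansion machinery.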