Uncertainty Information
Selected Abstracts

Clustering composition vectors using uncertainty information
ENVIRONMETRICS, Issue 8 2007
William F. Christensen

Abstract: In the biological and environmental sciences, interest often lies in using multivariate observations to discover natural clusters of objects. In this manuscript, the incorporation of measurement uncertainty information into a cluster analysis is discussed. This study is motivated by a problem involving the clustering of composition vectors associated with each of several chemical species. The observed abundance of each component is available along with its estimated uncertainty (measurement error standard deviation). An approach is proposed for converting the abundance vectors into composition (relative abundance) vectors, obtaining the covariance matrix associated with each composition vector, and defining a Mahalanobis distance between composition vectors that is suitable for cluster analysis. The approach is illustrated using particle size distributions obtained near Houston, Texas, in 2000. Computer simulation is used to compare the performance of Mahalanobis-distance-based and Euclidean-distance-based clustering approaches. A modified Mahalanobis distance combined with Ward's method is recommended. Copyright © 2007 John Wiley & Sons, Ltd. [source]

Stochastic league tables: an application to diabetes interventions in the Netherlands
HEALTH ECONOMICS, Issue 5 2005
Raymond C. W. Hutubessy

Abstract: The aim of this paper is to discuss the use of the stochastic league tables approach in cost-effectiveness analysis of diabetes interventions. It addresses the common ground and differences with other methods of presenting uncertainty to decision-makers. The comparison uses the cost-effectiveness results of medical guidelines for Dutch type 2 diabetes patients in primary and secondary care.
Stochastic league tables define the optimum expansion pathway relative to baseline, starting with the least costly and most cost-effective intervention mix. Multi-intervention cost-effectiveness acceptability curves are used to represent uncertainty information on the cost-effectiveness of single interventions compared with a single alternative. The stochastic league table for diabetes interventions shows that at low budget levels, treating secondary care patients is the most likely optimal choice. Current care for diabetes complications is shown to be inefficient compared with guideline treatment. With more resources available, one may implement all guidelines and improve efficiency. The stochastic league table approach and multi-intervention cost-effectiveness acceptability curves lead to similar results in uncertainty analysis. In addition, the stochastic league table approach provides policy makers with information on affordability by budget level, and more adequately fulfils the information requirements for choosing between interventions on the efficiency criterion. Copyright © 2004 John Wiley & Sons, Ltd. [source]

A comparative study of linear regression methods in noisy environments
JOURNAL OF CHEMOMETRICS, Issue 12 2004
Marco S. Reis

Abstract: With the development of measurement instrumentation methods and metrology, one is often able to specify rigorously the uncertainty associated with each measured value (e.g. concentrations, spectra, process sensors). Using this information along with the corresponding raw measurements should, in principle, lead to sounder data analysis, since the quality of the data can be taken into account explicitly. This should be true in particular when noise is heteroscedastic and of large magnitude. In this paper we focus on alternative multivariate linear regression methods conceived to take data uncertainties into account.
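The simplest baseline for regression with known per-measurement uncertainties, of the kind compared in the chemometrics study above, is weighted least squares with weights 1/sd². A minimal straight-line sketch, assuming errors only in y (an assumption the more elaborate methods in the study relax):

```python
def weighted_linear_fit(x, y, sd):
    """Fit y = b0 + b1*x by weighted least squares, weighting each point
    by the inverse of its measurement error variance (1/sd_i**2)."""
    w = [1.0 / s ** 2 for s in sd]
    sw = sum(w)
    swx = sum(wi * xi for wi, xi in zip(w, x))
    swy = sum(wi * yi for wi, yi in zip(w, y))
    swxx = sum(wi * xi * xi for wi, xi in zip(w, x))
    swxy = sum(wi * xi * yi for wi, xi, yi in zip(w, x, y))
    denom = sw * swxx - swx ** 2  # positive whenever the x values are not all equal
    b1 = (sw * swxy - swx * swy) / denom
    b0 = (swy - b1 * swx) / sw
    return b0, b1
```

With homoscedastic noise the weights cancel and this reduces to ordinary least squares; the benefit appears precisely in the heteroscedastic, large-noise settings the abstract highlights.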
We critically investigate their prediction and parameter estimation capabilities and suggest some modifications of well-established approaches. All alternatives are tested under simulation scenarios that cover different noise and data structures. The results provide guidelines on which methods to use, and when. Interestingly, some of the methods that explicitly incorporate uncertainty information in their formulations performed relatively poorly in the examples studied, whereas others that do not performed well overall. Copyright © 2005 John Wiley & Sons, Ltd. [source]

Parameter Estimation in the Error-in-Variables Models Using the Gibbs Sampler
THE CANADIAN JOURNAL OF CHEMICAL ENGINEERING, Issue 1 2006
Jessada J. Jitjareonchai

Abstract: Least squares and maximum likelihood techniques have long been used in parameter estimation problems. However, those techniques provide only point estimates, with unknown or approximate uncertainty information. Bayesian inference coupled with the Gibbs Sampler is an approach to parameter estimation that exploits modern computing technology; the estimation results are complete with exact uncertainty information. The Error-in-Variables model (EVM) approach is investigated in this study. In it, both dependent and independent variables contain measurement errors, and the true values and uncertainties of all measurements are estimated. This EVM set-up leads to unusually large dimensionality in the estimation problem, which makes parameter estimation very difficult with classical techniques. In this paper, an innovative way of performing parameter estimation is introduced to chemical engineers. The paper shows that the method is simple and efficient, and that complete and accurate uncertainty information about parameter estimates is readily available. Two real-world EVM examples are demonstrated: a large-scale linear model and an epidemiological model.
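The Bayesian machinery behind the EVM study above can be illustrated with a deliberately tiny Gibbs sampler: estimating the mean and variance of normal data by alternately drawing each parameter from its full conditional distribution. The flat/vague conjugate priors here are toy assumptions, far simpler than the paper's EVM set-up, but the mechanism (and the free posterior uncertainty information) is the same.

```python
import random

def gibbs_normal(data, iters=2000, burn=500, seed=0):
    """Toy Gibbs sampler for the mean and variance of normal data.
    Alternates draws of mu | tau and the precision tau | mu, then
    summarizes the post-burn-in samples."""
    rng = random.Random(seed)
    n = len(data)
    xbar = sum(data) / n
    mu, tau = xbar, 1.0
    mus, taus = [], []
    for t in range(iters):
        # mu | tau, data ~ Normal(xbar, 1/(n*tau))  (flat prior on mu)
        mu = rng.gauss(xbar, (1.0 / (n * tau)) ** 0.5)
        # tau | mu, data ~ Gamma(shape=n/2, scale=2/ss)  (vague prior on tau)
        ss = sum((x - mu) ** 2 for x in data)
        tau = rng.gammavariate(n / 2.0, 2.0 / ss)
        if t >= burn:
            mus.append(mu)
            taus.append(tau)
    post_mean = sum(mus) / len(mus)
    post_var = sum(1.0 / t for t in taus) / len(taus)  # E[variance] = E[1/tau]
    return post_mean, post_var
```

Unlike a point estimate, the retained samples (`mus`, `taus`) characterize the full posterior, so credible intervals come at no extra cost.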
The former is simple enough for most readers to understand the new concepts without difficulty. The latter has very interesting features in that a Poisson distribution is assumed, and a parameter with a known distribution is retained while the other unknown parameters are estimated. The Gibbs Sampler results are compared with those of least squares. [source]

Simultaneous state estimation and attenuation correction for thunderstorms with radar data using an ensemble Kalman filter: tests with simulated data
THE QUARTERLY JOURNAL OF THE ROYAL METEOROLOGICAL SOCIETY, Issue 643 2009
Ming Xue

Abstract: A new approach to dealing with attenuated radar reflectivity data in the data assimilation process is proposed and tested with simulated data using the ensemble square-root Kalman filter. This approach differs from the traditional method, in which attenuation is corrected in observation space before observations are assimilated into numerical models. We build attenuation correction into the data assimilation system by calculating the expected attenuation within the forward observation operators using the estimated atmospheric state. This procedure requires no prior assumption about the types of hydrometeor species along the radar beams, and allows us to take advantage of knowledge about the hydrometeors obtained through data assimilation and state estimation. Because the method is based on optimal estimation theory, error and uncertainty information on the observations and the prior estimate can be used effectively, and additional observed parameters, such as those from polarimetric radar, can potentially be incorporated into the system. Tests with simulated reflectivity data of an X-band (3 cm wavelength) radar for a supercell storm show that the attenuation correction procedure is very effective: the analyses obtained using attenuated data are almost as good as those obtained using unattenuated data.
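The idea of building attenuation into the forward observation operator can be sketched as follows. The names and units here are illustrative assumptions (linear-unit reflectivity per range gate, a per-gate one-way specific attenuation, and two-way path attenuation), not the paper's actual operator:

```python
import math

def attenuated_reflectivity(z_true, k, dr):
    """Apply cumulative two-way attenuation along a radar beam, gate by gate.

    z_true -- model-derived reflectivity per range gate (linear units)
    k      -- one-way specific attenuation per gate (per km, linear units),
              which in practice would be diagnosed from the estimated
              hydrometeor state
    dr     -- gate length (km)
    Returns the expected (attenuated) observation at each gate.
    """
    z_obs = []
    path = 0.0  # accumulated one-way attenuation integral along the beam
    for z, ki in zip(z_true, k):
        path += ki * dr
        z_obs.append(z * math.exp(-2.0 * path))  # two-way: out and back
    return z_obs
```

Because the operator maps the estimated state directly to attenuated observations, the filter compares like with like, and no separate observation-space correction step is needed.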
The procedure is also robust in the presence of moderate observation operator error related to drop size distributions, and when systematic radar calibration error exists. The analysis errors are very large if no attenuation correction is applied. The effect of attenuation and its correction when radial velocity data are also assimilated is discussed as well; in general, attenuation correction is equally important when quality radial velocity data are assimilated. Copyright © 2009 Royal Meteorological Society [source]

The effects of wording on the understanding and use of uncertainty information in a threshold forecasting decision
APPLIED COGNITIVE PSYCHOLOGY, Issue 1 2009
Susan L. Joslyn

Abstract: Many believe that information about small chances of severe weather would be useful to the general public for precautionary action. What is the best way to explain this kind of information to a non-expert audience? The studies reported here investigated the effects of framing (negative vs. positive), format (frequency vs. probability), likelihood (low vs. high) and compatibility (task match) on the interpretation of verbal expressions of forecast uncertainty and on subsequent forecasting decisions. The crucial factor was the match between the verbal expression and the overall task goal. Errors increased when there was a mismatch between the expression (e.g. winds less than 20 knots) and the task (e.g. post an advisory when winds will exceed 20 knots), whereas framing and format had little impact. We conclude that consideration of user expectations arising from the overall task goal is crucial in explaining uncertainty information to a naïve audience; global expectations overpower other potential effects. Copyright © 2008 John Wiley & Sons, Ltd. [source]