Correction Procedure (correction + procedure)


Selected Abstracts


Testing Features of Graphical DIF: Application of a Regression Correction to Three Nonparametric Statistical Tests

JOURNAL OF EDUCATIONAL MEASUREMENT, Issue 4 2006
Daniel M. Bolt
Inspection of differential item functioning (DIF) in translated test items can be informed by graphical comparisons of item response functions (IRFs) across translated forms. Due to the many forms of DIF that can emerge in such analyses, it is important to develop statistical tests that can confirm various characteristics of DIF when present. Traditional nonparametric tests of DIF (Mantel-Haenszel, SIBTEST) are not designed to test for the presence of nonuniform or local DIF, while common probability difference (P-DIF) tests (e.g., SIBTEST) do not optimize power in testing for uniform DIF, and thus may be less useful in the context of graphical DIF analyses. In this article, modifications of three alternative nonparametric statistical tests for DIF, Fisher's χ2 test, Cochran's Z test, and Goodman's U test (Marascuilo & Slaughter, 1981), are investigated for these purposes. A simulation study demonstrates the effectiveness of a regression correction procedure in improving the statistical performance of the tests when using an internal test score as the matching criterion. Simulation power and real data analyses demonstrate the unique information provided by these alternative methods compared to SIBTEST and Mantel-Haenszel in confirming various forms of DIF in translated tests. [source]
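A regression correction of this kind typically shrinks the internal matching score toward its group mean in proportion to test reliability before examinees are matched. The sketch below illustrates only that general idea, not Bolt's exact modification; the function name, the single reliability value and the toy data are assumptions.

```python
import numpy as np

def regression_corrected_scores(scores, group, reliability):
    """Shrink observed matching scores toward group means (illustrative).

    Each examinee's observed total score is regressed toward the mean of
    their own group in proportion to the test's reliability, so that
    reference- and focal-group examinees matched on the corrected score
    have comparable expected true scores before DIF statistics are formed.
    `reliability` would normally be estimated (e.g. KR-20); here it is given.
    """
    corrected = np.empty_like(scores, dtype=float)
    for g in np.unique(group):
        mask = group == g
        mean_g = scores[mask].mean()
        corrected[mask] = mean_g + reliability * (scores[mask] - mean_g)
    return corrected

# Toy usage with hypothetical data: 0 = reference group, 1 = focal group.
rng = np.random.default_rng(0)
scores = rng.integers(10, 41, size=200).astype(float)
group = rng.integers(0, 2, size=200)
matched_scores = regression_corrected_scores(scores, group, reliability=0.85)
```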


Standardized T2* map of normal human heart in vivo to correct T2* segmental artefacts

NMR IN BIOMEDICINE, Issue 6 2007
Vincenzo Positano
Abstract A segmental, multislice, multi-echo T2* MRI approach could be useful in heart iron-overloaded patients to account for heterogeneous iron distribution, demonstrated by histological studies. However, segmental T2* assessment in heart can be affected by the presence of geometrical and susceptibility artefacts, which can act on different segments in different ways. The aim of this study was to assess T2* value distribution in the left ventricle and to develop a correction procedure to compensate for artefactual variations in segmental analysis. MRI was performed in four groups of 22 subjects each: healthy subjects (I), controls (II) (thalassemia intermedia patients without iron overload), and thalassemia major patients with mild (III) and heavy (IV) iron overload. Three short-axis views (basal, median, and apical) of the left ventricle were obtained and analyzed using custom-written, previously validated software. The myocardium was automatically segmented into a 16-segment standardized heart model, and the mean T2* value for each segment was calculated. The point-by-point distribution of T2* over the myocardium was assessed, and T2* inhomogeneity maps for the three slices were obtained. In group I, no significant variation in the mean T2* among slices was found. T2* showed a characteristic circumferential variation in all three slices. The effect of susceptibility differences induced by cardiac veins was evident, together with low-scale variations induced by geometrical artefacts. Using the mean segmental deviations as correction factors, an artefact correction map was developed and used to normalize segmental data. The correction procedure was validated on group II. Group IV showed no significant presence of segmental artefacts, confirming the hypothesis that susceptibility artefacts are additive in nature and become negligible for high levels of iron overload. Group III showed greater variability than normal subjects, which the correction map failed to compensate for, whether additive or percentage-based corrections were applied. This may reinforce the hypothesis that true inhomogeneity in iron deposition exists. Copyright © 2007 John Wiley & Sons, Ltd. [source]
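A minimal sketch of how such a segmental correction map could be derived from healthy-subject data and applied to a patient, assuming a multiplicative (percentage-based) correction over the 16-segment model; the function names and array layout are illustrative and do not reproduce the authors' validated software.

```python
import numpy as np

def segmental_correction_map(healthy_t2star):
    """Per-segment correction factors from healthy-subject T2* values.

    healthy_t2star: array of shape (n_subjects, 16), one column per segment
    of the 16-segment standardized heart model. Each factor is the ratio of
    a segment's mean T2* to the pooled myocardial mean, capturing the
    systematic circumferential (artefactual) pattern seen in normals.
    """
    segment_means = healthy_t2star.mean(axis=0)   # mean T2* per segment
    global_mean = healthy_t2star.mean()           # pooled mean over all segments
    return segment_means / global_mean            # shape (16,)

def correct_segments(patient_t2star, correction_map):
    """Normalize a patient's 16 segmental T2* values by the correction map."""
    return np.asarray(patient_t2star) / correction_map
```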


On the influence of trigger level in distribution vibration surveys

PACKAGING TECHNOLOGY AND SCIENCE, Issue 4 2009
Vincent Rouillard
Abstract This paper follows on from recently published research that examined the effects of recording parameters on the outcomes of distribution vibration surveys. Whereas the previous research focused on the effects of the sampling period at which sub-records of the process are captured, this paper deals with another often-used recording parameter, namely the vibration level trigger. The paper describes the development of a software tool that was designed specifically to study the influence of various sampling parameters on continuously recorded vibration data. This software tool was used to undertake a thorough statistical analysis of a vibration record set consisting of continuously sampled data measured from a wide variety of vehicle types and routes. The paper shows that the outcomes of vibration surveys are very sensitive to the trigger level and can produce highly distorted results by introducing a bias that nearly always overestimates the overall vibration levels. This is reflected in estimates of common descriptors of random vibration processes such as the average power spectral density (PSD), the peak-hold PSD and the overall root-mean-square values. The main outcome of this analysis is the formulation of a correction method based on relationships between the true mean and peak-hold PSDs and estimates from sampled data. The effect and significance of the proposed correction procedure are demonstrated, especially in the context of laboratory simulation of distribution vibrations. The paper concludes by making specific recommendations for configuring high-capacity field data recorders and applying correction strategies to ensure that vibration surveys yield statistically sound results. Copyright © 2009 John Wiley & Sons, Ltd. [source]
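The triggered-acquisition step that produces the bias can be sketched as follows, under assumed parameters (block length, trigger threshold); only the estimation of the average and peak-hold PSDs from retained sub-records is shown, not the paper's correction relationships.

```python
import numpy as np
from scipy.signal import welch

def triggered_psd_estimates(signal, fs, trigger_rms, block_len):
    """Average and peak-hold PSDs from level-triggered sub-records.

    The continuous record is cut into fixed-length blocks and only blocks
    whose RMS exceeds `trigger_rms` are retained, mimicking a field data
    recorder's level trigger. Comparing the returned estimates with the PSD
    of the full record exposes the overestimation bias discussed above.
    """
    psds, f = [], None
    for i in range(len(signal) // block_len):
        block = signal[i * block_len:(i + 1) * block_len]
        if np.sqrt(np.mean(block ** 2)) >= trigger_rms:
            f, pxx = welch(block, fs=fs, nperseg=min(1024, block_len))
            psds.append(pxx)
    if not psds:
        raise ValueError("no sub-record exceeded the trigger level")
    psds = np.asarray(psds)
    return f, psds.mean(axis=0), psds.max(axis=0)  # frequencies, mean PSD, peak-hold PSD
```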


An automated method for 'clumped-isotope' measurements on small carbonate samples

RAPID COMMUNICATIONS IN MASS SPECTROMETRY, Issue 14 2010
Thomas W. Schmid
Clumped-isotope geochemistry deals with the state of ordering of rare isotopes in molecules, in particular with their tendency to form bonds with other rare isotopes rather than with the most abundant ones. Among its possible applications, carbonate clumped-isotope thermometry is the one that has gained most attention because of the wide potential of applications in many disciplines of earth sciences. Clumped-isotope thermometry allows reconstructing the temperature of formation of carbonate minerals without knowing the isotopic composition of the water from which they were formed. This feature enables new approaches in paleothermometry. The currently published method is, however, limited by sample weight requirements of 10–15 mg and because measurements are performed manually. In this paper we present a new method using an automated sample preparation device coupled to an isotope ratio mass spectrometer. The method is based on the repeated analysis (n = 6–8) of 200 µg aliquots of sample material and completely automated measurements. In addition, we propose to use precisely calibrated carbonates spanning a wide range in Δ47 instead of heated gases to correct for isotope effects caused by the source of the mass spectrometer, following the principle of equal treatment of the samples and standards. We present data for international standards (NBS 19 and LSVEC) and different carbonates formed at temperatures exceeding 600°C to show that precisions in the range of 10 to 15 ppm (1 SE) can be reached for repeated analyses of a single sample. Finally, we discuss and validate the correction procedure based on high-temperature carbonates instead of heated gases. Copyright © 2010 John Wiley & Sons, Ltd. [source]
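The standards-based correction can be pictured as an empirical transfer function: measured Δ47 values of the calibrated carbonates are regressed against their accepted values and the fit is applied to unknowns. The sketch below is illustrative only; the numbers are placeholders, not data from the paper, and the session-level bookkeeping of the published procedure is omitted.

```python
import numpy as np

def carbonate_transfer_function(measured_std_d47, accepted_std_d47):
    """Linear transfer function from carbonate standards (illustrative).

    Fits a straight line mapping measured Δ47 of calibrated carbonate
    standards, which span a wide Δ47 range, onto their accepted values,
    standing in for the heated-gas correction of source-induced effects.
    """
    slope, intercept = np.polyfit(measured_std_d47, accepted_std_d47, deg=1)
    return slope, intercept

# Placeholder standard values (hypothetical, for illustration only).
measured = np.array([0.300, 0.450, 0.620, 0.910])
accepted = np.array([0.325, 0.480, 0.660, 0.960])
slope, intercept = carbonate_transfer_function(measured, accepted)
d47_corrected = slope * 0.550 + intercept  # correct a measured sample value
```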


Absolute measurement of lattice parameter in single crystals and epitaxic layers on a double-crystal X-ray diffractometer

ACTA CRYSTALLOGRAPHICA SECTION A, Issue 3 2005
M. Fatemi
Details of the recently developed 'zone technique' for the absolute measurement of lattice parameter and strain in single-crystal solids and thin films are presented. The method is based on measuring X-ray rocking curves from a few equatorial planes within a suitable zone and correcting their peak positions at once with a single zero offset. In contrast to the comparative method, which usually requires measurements in two opposite azimuthal directions, measurements in the zone technique can often be completed in a single azimuthal setting. A typical strained layer in the cubic system can be fully and rapidly characterized with only three rocking curves. The technique is suitable for routine applications under typical laboratory conditions, and for high-precision measurements of nearly perfect crystals in a controlled environment, with a potential accuracy of parts in 10 million. This degree of accuracy is a direct consequence of the zero-offset correction procedure, which effectively cancels a large portion of the misalignment errors in the diffractometer. The use of the (n, −n) geometry substantially reduces the errors of eccentricity compared with the Bond technique, and its stronger reflections enable the measurement of small samples about 0.05 mm in length with relative ease. The technique is illustrated with examples, and its extension to triple-axis (θ, 2θ) instruments is discussed. [source]
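The single zero-offset idea can be illustrated by jointly fitting a cubic lattice parameter and one common angular offset to the measured peak positions of several reflections via Bragg's law. The sketch below assumes Cu Kα1 radiation and a cubic crystal, and omits the refraction and other corrections discussed in the paper.

```python
import numpy as np
from scipy.optimize import least_squares

WAVELENGTH = 1.540562  # Cu K-alpha1 wavelength in angstroms (assumed)

def fit_lattice_and_offset(two_theta_deg, hkl_list, a_guess=5.43):
    """Fit a cubic lattice parameter and a single zero offset (illustrative).

    two_theta_deg: measured peak positions (2-theta, degrees) of reflections
    in one zone; hkl_list: their Miller indices. The fit enforces
    lambda = 2 * d_hkl * sin(theta - delta) for all reflections with one
    common offset delta, mirroring the idea of correcting all peak positions
    at once with a single zero offset.
    """
    theta = np.radians(np.asarray(two_theta_deg) / 2.0)
    s = np.array([np.sqrt(h * h + k * k + l * l) for h, k, l in hkl_list])

    def residuals(params):
        a, delta = params
        return WAVELENGTH - 2.0 * (a / s) * np.sin(theta - delta)

    fit = least_squares(residuals, x0=[a_guess, 0.0])
    a_fit, delta_rad = fit.x
    return a_fit, np.degrees(delta_rad)
```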


Simultaneous state estimation and attenuation correction for thunderstorms with radar data using an ensemble Kalman filter: tests with simulated data

THE QUARTERLY JOURNAL OF THE ROYAL METEOROLOGICAL SOCIETY, Issue 643 2009
Ming Xue
Abstract A new approach to dealing with attenuated radar reflectivity data in the data assimilation process is proposed and tested with simulated data using the ensemble square-root Kalman filter. This approach differs from the traditional method, in which attenuation is corrected in observation space before observations are assimilated into numerical models. We build attenuation correction into the data assimilation system by calculating the expected attenuation within the forward observation operators using the estimated atmospheric state. Such a procedure does not require prior assumptions about the types of hydrometeor species along the radar beams, and allows us to take advantage of knowledge about the hydrometeors obtained through data assimilation and state estimation. Because the approach is based on optimal estimation theory, error and uncertainty information on the observations and the prior estimate can be effectively utilized, and additional observed parameters, such as those from polarimetric radar, can potentially be incorporated into the system. Tests with simulated reflectivity data of an X-band 3 cm wavelength radar for a supercell storm show that the attenuation correction procedure is very effective: the analyses obtained using attenuated data are almost as good as those obtained using unattenuated data. The procedure is also robust in the presence of moderate dropsize-distribution-related observation operator error and when systematic radar calibration error exists. The analysis errors are very large if no attenuation correction is applied. The effect of attenuation and its correction when radial velocity data are also assimilated is discussed as well. In general, attenuation correction is equally important when quality radial velocity data are also assimilated. Copyright © 2009 Royal Meteorological Society [source]
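A minimal sketch of building attenuation into the forward observation operator: along each beam, the model-derived reflectivity is reduced by the two-way path-integrated attenuation computed from the model's specific attenuation, so attenuated observations can be compared with it directly. Gate spacing and variable names are assumptions, and dropsize-distribution and calibration effects are ignored.

```python
import numpy as np

def attenuated_reflectivity(dbz_model, k_specific, gate_len_km):
    """Apply two-way path-integrated attenuation along one radar beam.

    dbz_model  : model-equivalent reflectivity (dBZ) at successive range gates,
                 computed from the (ensemble) model state.
    k_specific : specific attenuation (dB/km, one-way) at the same gates,
                 derived from the model hydrometeor fields.
    gate_len_km: range-gate spacing in km.
    Returns the reflectivity the radar would observe after attenuation.
    """
    # Two-way attenuation accumulated over all gates *before* each gate.
    pia = 2.0 * gate_len_km * np.concatenate(([0.0], np.cumsum(k_specific[:-1])))
    return np.asarray(dbz_model) - pia
```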


The performance of sample selection estimators to control for attrition bias

HEALTH ECONOMICS, Issue 5 2001
Astrid Grasdal
Abstract Sample attrition is a potential source of selection bias in experimental, as well as non-experimental, programme evaluation. For labour market outcomes, such as employment status and earnings, missing data problems caused by attrition can be circumvented by the collection of follow-up data from administrative registers. For most non-labour market outcomes, however, investigators must rely on participants' willingness to co-operate in keeping detailed follow-up records, and on statistical correction procedures to identify and adjust for attrition bias. This paper combines survey and register data from a Norwegian randomized field trial to evaluate the performance of parametric and semi-parametric sample selection estimators commonly used to correct for attrition bias. The estimators considered work well in terms of producing point estimates of treatment effects close to the experimental benchmark estimates. Results are sensitive to exclusion restrictions. The analysis also demonstrates an inherent paradox in the 'common support' approach, which prescribes exclusion from the analysis of observations outside the common support for the selection probability. The more important treatment status is as a determinant of attrition, the larger the proportion of treated participants whose selection probability falls outside the range for which comparison with untreated counterparts is possible. Copyright © 2001 John Wiley & Sons, Ltd. [source]
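For orientation, the classic parametric two-step (Heckman-type) sample selection estimator, one member of the family evaluated here, can be sketched as follows; the variable names and data layout are assumptions, and the semi-parametric variants considered in the paper are not shown.

```python
import numpy as np
import statsmodels.api as sm
from scipy.stats import norm

def heckman_two_step(y, X, Z, observed):
    """Two-step sample selection correction for attrition (illustrative).

    observed: boolean array, True where the follow-up outcome y was obtained.
    Z       : selection-equation regressors; should contain an exclusion
              restriction (a variable affecting response but not the outcome).
    X       : outcome-equation regressors.
    Step 1 fits a probit for being observed; step 2 adds the inverse Mills
    ratio as an extra regressor in an OLS fit on the observed subsample.
    """
    Zc = sm.add_constant(Z)
    probit = sm.Probit(observed.astype(float), Zc).fit(disp=0)
    xb = Zc @ probit.params                  # linear index of the probit
    mills = norm.pdf(xb) / norm.cdf(xb)      # inverse Mills ratio
    X_step2 = sm.add_constant(np.column_stack([X[observed], mills[observed]]))
    outcome = sm.OLS(y[observed], X_step2).fit()
    return probit, outcome
```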


The potential of variational retrieval of temperature and humidity profiles from Meteosat Second Generation observations

THE QUARTERLY JOURNAL OF THE ROYAL METEOROLOGICAL SOCIETY, Issue 638 2009
F. Di Giuseppe
Abstract The quality of temperature and humidity retrievals from the infrared Spinning Enhanced Visible and Infrared Imager (SEVIRI) sensors on the geostationary Meteosat Second Generation (MSG) satellites is assessed by means of a one-dimensional variational (1D-Var) algorithm. The study is performed with the aim of improving the spatial and temporal resolution of available observations to feed analysis systems designed for high-resolution regional-scale numerical weather prediction (NWP) models. The non-hydrostatic forecast model COSMO, in the ARPA-SIMC operational configuration, is used to provide background fields. Only clear-sky observations over sea are processed. An optimized one-dimensional variational set-up comprising two water-vapour and three window channels is selected. It maximizes the reduction of errors in the model backgrounds while ensuring ease of operational implementation through accurate bias correction procedures and correct radiative transfer simulations. The 1D-Var retrieval quality is first quantified in relative terms, employing statistics to estimate the reduction in the background model errors. Additionally, the absolute retrieval accuracy is assessed by comparing the analysis with independent radiosonde observations. The inclusion of satellite data brings a substantial reduction in the warm and dry biases present in the forecast model. Moreover, it is shown that the use of the retrieved profiles generated by the 1D-Var in the COSMO nudging scheme can locally reduce forecast errors. Copyright © 2009 Royal Meteorological Society [source]
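The retrieval step can be summarised by the standard variational cost function; the sketch below minimises it numerically with the observation operator left abstract (an operational system would use a radiative-transfer model for the SEVIRI channels), so it illustrates the algorithm class rather than the paper's configuration.

```python
import numpy as np
from scipy.optimize import minimize

def one_d_var(x_b, B, y_obs, R, forward):
    """Minimal 1D-Var retrieval (illustrative).

    Minimises J(x) = (x - x_b)^T B^-1 (x - x_b) + (y - H(x))^T R^-1 (y - H(x)),
    where x_b is the background temperature/humidity profile, B and R are the
    background- and observation-error covariances, y_obs the observed
    brightness temperatures and `forward` the observation operator H.
    """
    B_inv, R_inv = np.linalg.inv(B), np.linalg.inv(R)

    def cost(x):
        dx = x - x_b
        dy = y_obs - forward(x)
        return dx @ B_inv @ dx + dy @ R_inv @ dy

    return minimize(cost, x_b, method="L-BFGS-B").x
```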