Data Quality

Selected Abstracts


Dependence of Effective Marine Park Zoning on Survey Design, Data Quality, and Community Acceptance: Response to Lynch

CONSERVATION BIOLOGY, Issue 2 2008
ALDO S. STEFFE
First page of article.


Data quality in thermal summation development models for forensically important blowflies

MEDICAL AND VETERINARY ENTOMOLOGY, Issue 3 2009
C. S. RICHARDS
Abstract. To highlight some issues regarding data quality that are significant in estimating post-mortem intervals (PMI) from maggots, the developmental constants of thermal summation models for development of Chrysomya megacephala Fabricius (Diptera: Calliphoridae) were calculated from incidental data gathered from 12 published studies, and from data generated specifically for the purpose in a single experiment. The focused experiment involved measuring the timing of five developmental landmarks at nine constant temperatures with a sampling resolution of 6–12 h, which is characteristic of other published studies. Combining data from different studies produced inconsistent results because of statistical noise introduced by (at least) disparities in temporal precision, descriptive statistics, geographical location and rearing diets. A robust experimental design to estimate a developmental model should involve at least six constant temperatures, starting at about 7°C above the relevant developmental zero (D0) and going almost to the upper critical temperature, and a temporal sampling interval with a relative precision of about 10%, which requires sampling about every 2 h until hatching, about every 3 h until first ecdysis and about every 6 h until second ecdysis.
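A thermal summation model treats development as complete once the accumulated thermal units K = t·(T − D0) reach a constant value, so D0 and K can be estimated by regressing development rate (1/t) on rearing temperature. A minimal sketch, using hypothetical rearing data rather than the values from this study:

```python
import numpy as np

# Hypothetical constant-temperature rearing data (not taken from the study):
# temperature (°C) and time to a developmental landmark (hours).
temps = np.array([15.0, 17.5, 20.0, 22.5, 25.0, 27.5, 30.0])
dev_hours = np.array([420.0, 330.0, 270.0, 228.0, 198.0, 175.0, 157.0])

# Thermal summation model: 1/t = (T - D0) / K, i.e. development rate is linear in T.
rate = 1.0 / dev_hours
slope, intercept = np.polyfit(temps, rate, 1)

K = 1.0 / slope            # thermal summation constant (degree-hours)
D0 = -intercept / slope    # developmental zero (°C)
print(f"D0 ≈ {D0:.1f} °C, K ≈ {K:.0f} degree-hours")

# Predicted development time at another constant temperature:
T = 23.0
print(f"Predicted time at {T} °C: {K / (T - D0):.0f} h")
```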


Validity of registry data: Agreement between cancer records in an end-stage kidney disease registry (voluntary reporting) and a cancer register (statutory reporting)

NEPHROLOGY, Issue 4 2010
ANGELA C WEBSTER
ABSTRACT: Aims: End-stage kidney disease registries inform outcomes and policy. Data quality is crucial but difficult to measure objectively. We assessed agreement between incident cancer reported to the Australian and New Zealand Dialysis and Transplant Registry (ANZDATA) and to the Central Cancer Registry (CCR) in New South Wales. Methods: ANZDATA records were linked to CCR using probabilistic matching. We calculated agreement between registries for patients with ≥1 cancers, all cancers and site-specific cancer using the kappa statistic (κ). We investigated cases where records disagreed and compared estimates of cancer risk based either on ANZDATA or on CCR using standardized incidence ratios (indirect standardization by age, sex and calendar year). Results: From 1980 to 2001, 9453 residents had dialysis or transplantation. ANZDATA recorded 867 cancers in 779 (8.2%) registrants; CCR 867 cancers in 788 (8.3%). ANZDATA recorded 170 patients with cancer that CCR did not; CCR recorded 179 patients that ANZDATA did not (κ = 0.76). ANZDATA had sensitivity 77.3% (confidence interval (CI) 74.2–80.2), specificity 98.1% (CI 97.7–98.3) if CCR records were regarded as the reference standard. Agreement was similar for diagnoses while receiving dialysis (κ = 0.78) or after transplantation (κ = 0.79), but varied by cancer type. Agreement was poorest for melanoma (κ = 0.61) and myeloma (κ = 0.47) and highest for lymphoma (κ = 0.80), leukaemia (κ = 0.86) and breast cancer (κ = 0.85). Artefact accounted for 20.8% of the non-concordance but error and misclassification did occur in both registries. Estimates of cancer risk based on ANZDATA or CCR records did not differ in any important way. Conclusion: Agreement of cancer records between both registries was high and differences largely explicable. It is likely that both ANZDATA and CCR have some inaccuracies, for reasons that are now more explicit, with themes similar to those likely to be experienced by other registries.
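The kappa statistic used here corrects raw agreement between the two registries for the agreement expected by chance from each registry's marginal rates. The sketch below reconstructs the 2×2 patient-level table from the counts quoted above (609 recorded by both registries, 170 by ANZDATA only, 179 by CCR only, 8495 by neither) and reproduces the reported κ ≈ 0.76:

```python
def cohens_kappa(both_yes, anzdata_only, ccr_only, both_no):
    """Cohen's kappa for two registries classifying patients as cancer / no cancer.
    Inputs are the four cells of the 2x2 cross-tabulation."""
    n = both_yes + anzdata_only + ccr_only + both_no
    observed = (both_yes + both_no) / n            # raw proportion agreeing
    # Chance agreement from the marginal "yes" rates of each registry.
    p_yes_1 = (both_yes + anzdata_only) / n        # ANZDATA cancer rate
    p_yes_2 = (both_yes + ccr_only) / n            # CCR cancer rate
    expected = p_yes_1 * p_yes_2 + (1 - p_yes_1) * (1 - p_yes_2)
    return (observed - expected) / (1 - expected)

# Counts reconstructed from the figures quoted in the abstract:
print(round(cohens_kappa(both_yes=609, anzdata_only=170, ccr_only=179, both_no=8495), 2))
# -> 0.76
```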


Phototherapy in the management of atopic dermatitis: a systematic review

PHOTODERMATOLOGY, PHOTOIMMUNOLOGY & PHOTOMEDICINE, Issue 4 2007
N. Bhavani Meduri
Background/purpose: Atopic dermatitis (AD) is a common and extremely burdensome skin disorder with limited therapeutic options. Ultraviolet (UV) phototherapy is a well tolerated, efficacious treatment for AD, but its use is limited by a lack of guidelines in the optimal choice of modality and dosing. Given this deficit, we aim to develop suggestions for the treatment of AD with phototherapy by systematically reviewing the current medical literature. Methods: Data sources: All data sources were identified through searches of MEDLINE via the Ovid interface, the Cochrane Central Register of Controlled Trials, and a complementary manual literature search. Study selection: Studies selected for review met these inclusion criteria, as applied by multiple reviewers: controlled clinical trials of UV phototherapy in the management of AD in human subjects as reported in the English-language literature. Studies limited to hand dermatitis and studies in which subjects were allowed unmonitored use of topical corticosteroids or immunomodulators were excluded. Data extraction: Included studies were assessed by multiple independent observers who extracted and compiled the following data: number of patients, duration of treatment, cumulative doses of UV radiation, adverse effects, and study results. Data quality was assessed by comparing data sets and rechecking source materials if a discrepancy occurred. Results: Nine trials that met the inclusion criteria were identified. Three studies demonstrated that UVA1 is both faster and more efficacious than combined UVAB for treating acute AD. Two trials disclosed the advantages of medium dose (50 J/cm²) UVA1 for treating acute AD. Two trials revealed the superiority of combined UVAB in the management of chronic AD. Two additional studies demonstrated that narrow-band UVB is more effective than either broad-band UVA or UVA1 for managing chronic AD. Conclusion: On the basis of available evidence, the following suggestions can be made: phototherapy with medium-dose (50 J/cm²) UVA1, if available, should be used to control acute flares of AD while UVB modalities, specifically narrow-band UVB, should be used for the management of chronic AD.


Extending a model of precarious employment: A qualitative study of immigrant workers in Spain

AMERICAN JOURNAL OF INDUSTRIAL MEDICINE, Issue 4 2010
Victoria Porthé PhD
Abstract Background Since the 1980s, changes in the labor market have modified power relations between capital and labor, leading to greater levels of precarious employment among workers. Globalization has led to a growth in migration, as people leave their countries in search of work. We aimed to describe the dimensions of precarious employment for immigrant workers in Spain. Methods Qualitative study using analytic induction. Criterion sampling was used to recruit 129 immigrant workers in Spain with documented and undocumented administrative status. Data quality was ensured by triangulation. Results Immigrant workers reported that precarious employment is characterized by high job instability, a lack of power for negotiating employment conditions, and defenselessness against high labor demands. They described insufficient wages, long working hours, limited social benefits, and difficulty in exercising their rights. Undocumented workers reported greater defenselessness and worse employment conditions. Conclusions This study allowed us to describe the dimensions of precarious employment in immigrant workers. Am. J. Ind. Med. 53:417–424, 2010. © 2010 Wiley-Liss, Inc.


Cognitive burden of survey questions and response times: A psycholinguistic experiment

APPLIED COGNITIVE PSYCHOLOGY, Issue 7 2010
Timo Lenzner
An important objective in survey question design is to write clear questions that respondents find easy to understand and to answer. This contribution identifies the factors that influence question clarity. Theoretical and empirical evidence from psycholinguistics suggests that specific text features (e.g., low-frequency words (LFRWs), left-embedded syntax) cause comprehension difficulties and impose a high cognitive burden on respondents. To examine the effect of seven different text features on question clarity, an online experiment was conducted in which well-formulated questions were compared to suboptimal counterparts. The cognitive burden of the questions was assessed with response times. Data quality was compared in terms of drop-out rates and survey satisficing behaviour. The results show that at least six of the text features are relevant for the clarity of a question. We provide a detailed explanation of these text features and advise survey designers to avoid them when crafting questions. Copyright © 2009 John Wiley & Sons, Ltd.


Social Protection in Vietnam and Obstacles to Progressivity

ASIAN SOCIAL WORK AND POLICY REVIEW, Issue 1 2008
Martin Evans
The present paper analyzes the incidence and progressivity of Vietnamese state income transfers using survey data from the Vietnamese Household Living Standards Survey 2004. Data quality and sample selection issues are highlighted, especially in the coverage of rural-urban migrants. Simple income-based profiles of incidence are matched to several influences that confound and complicate the measurement of progressivity. First, the issue of the informal economy is highlighted through analysis both of the extent of private inter-household transfers and remittances and their relationship with state transfers, and of the informal charges that accompany uptake of state services and other petty corruption. Second, the issue of user-charges for health and education services is considered, as a considerable portion of state transfers are related to the take up of schooling and health care. Third, the issue of behavioral effects is also considered, concentrating on private inter-household transfers. The paper concludes by drawing together the evidence and the obstacles to measurement and progressivity to argue a range of data collection, methodological and policy recommendations.


Data reliability and structure in the Swedish National Cataract Register

ACTA OPHTHALMOLOGICA, Issue 5 2001
Ingemar Håkansson
ABSTRACT. Purpose: A Swedish National Cataract Register was instituted in 1992, monitoring nearly all cataract operations in Sweden, and since its inception comprising about 95% of all operations. Data from a total of approximately 290 000 operations have been collected during 1992–1998. Data quality is an important factor, and we have therefore assessed the various types and frequencies of data entry errors in the material. Methods: The medical records for all operations in five selected participating clinics were retrieved for a set month. Each data transfer step from the record to the final data base was monitored for a total of 574 operations. A total of 10 variables were recorded for each operation. Results: Significant sources of error were absent in most variables. However, possibly important errors appeared in three: "date entering waiting list", "preoperative best corrected visual acuity in the operated eye", or "visual acuity in the fellow eye". There were also noteworthy variations between the five clinics, different for different parameters. Errors appeared predominantly at the very first step of registration. In most cases this was due to deviations from the data collection instructions. Conclusions: The reliability is good for most values entered into the register, but it is important to ensure that data definitions are exact and adhered to. Repeated information to the involved persons on how to fill in the forms appears to be a requisite for maintaining good input quality.


Data quality and clinical decision-making: do we trust machines blindly?

CLINICAL AND EXPERIMENTAL OPTOMETRY, Issue 3 2009
Konrad Pesudovs PhD
No abstract is available for this article.


Using pulsed gradient spin echo NMR for chemical mixture analysis: How to obtain optimum results

CONCEPTS IN MAGNETIC RESONANCE, Issue 4 2002
Brian Antalek
Abstract Pulsed gradient spin echo NMR is a powerful technique for measuring diffusion coefficients. When coupled with appropriate data processing schemes, the technique becomes an exceptionally valuable tool for mixture analysis, the separation of which is based on the molecular size. Extremely fine differentiation may be possible in the diffusion dimension but only with high-quality data. For fully resolved resonances, components with diffusion coefficients that differ by less than 2% may be distinguished in mixtures. For highly overlapped resonances, the resolved spectra of pure components with diffusion coefficients that differ by less than 30% may be obtained. In order to achieve the best possible data quality one must be aware of the primary sources of artifacts and incorporate the necessary means to alleviate them. The origins of these artifacts are described, along with the methods necessary to observe them. Practical solutions are presented. Examples are shown that demonstrate the effects of the artifacts on the acquired data set. Many mixture analysis problems may be addressed with conventional high resolution pulsed field gradient probe technology delivering less than 0.5 T m⁻¹ (50 G cm⁻¹). © 2002 Wiley Periodicals, Inc. Concepts Magn Reson 14: 225–258, 2002.
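In PGSE experiments the diffusion coefficient of each resolved resonance is obtained from the Stejskal–Tanner attenuation S(g) = S0·exp(−γ²g²δ²(Δ − δ/3)·D). A minimal fitting sketch, with assumed acquisition parameters (gradient strengths up to the 0.5 T m⁻¹ mentioned above; δ, Δ and the noise level are illustrative, not taken from the article):

```python
import numpy as np

GAMMA_H = 2.675e8          # 1H gyromagnetic ratio (rad s^-1 T^-1)

def stejskal_tanner_b(g, delta, Delta):
    """b-value (s m^-2) for a rectangular PGSE gradient pair."""
    return (GAMMA_H * g * delta) ** 2 * (Delta - delta / 3.0)

# Illustrative experiment: 12 gradient steps up to 0.5 T/m (50 G/cm),
# delta = 2 ms, Delta = 50 ms (assumed values).
g = np.linspace(0.02, 0.5, 12)                 # T m^-1
b = stejskal_tanner_b(g, delta=2e-3, Delta=50e-3)

D_true = 5.0e-10                               # m^2 s^-1, a typical small molecule
signal = np.exp(-D_true * b) * (1 + 0.005 * np.random.randn(b.size))

# D follows from the slope of ln(S/S0) versus b.
D_fit = -np.polyfit(b, np.log(signal), 1)[0]
print(f"fitted D = {D_fit:.2e} m^2/s")
```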


Gait characteristics of diabetic patients: a systematic review

DIABETES/METABOLISM: RESEARCH AND REVIEWS, Issue 3 2008
L. Allet
Abstract Patients with diabetes are at higher risk of experiencing fall-related injuries when walking than healthy controls. The underlying mechanism responsible for this is not yet clear. Thus we intend to summarize diabetic patients' gait characteristics and emphasize those which could be the possible underlying mechanisms for increased fall risk. This systematic review aims, in particular, to: (1) evaluate the quality of existing studies which investigate the gait characteristics of diabetic patients, (2) highlight areas of agreement and contradiction in study results, (3) discuss and emphasize parameters associated with fall risk, and (4) propose new orientations and further domains for research needed for their fall risk prevention. We conducted an electronic search of Pedro, PubMed, Ovid and Cochrane. Two authors independently assessed all abstracts. Quality of the selected articles was scored, and the study results summarized and discussed. We considered 236 abstracts of which 28 entered our full text review. Agreement on data quality between two reviewers was high (kappa: 0.90). Authors investigating gait parameters in a diabetic population evaluated in particular parameters either associated with fall risk (speed, step length or step-time variability) or with ulcers (pressure). There is agreement that diabetic patients walk slower, with greater step variability, and present higher plantar pressure than healthy controls. We concluded that diabetic patients present gait abnormalities, some of which can lead to heightened fall risk. To understand the underlying mechanisms, and to promote efficient prevention, further studies should analyse gait under 'real-life' conditions. Copyright © 2008 John Wiley & Sons, Ltd.


Global biogeographical data bases on marine fishes: caveat emptor

DIVERSITY AND DISTRIBUTIONS, Issue 6 2008
D. Ross Robertson
ABSTRACT A review of georeferenced collection-site records for Caribbean shore-fishes served by major online distributors of aggregated biodiversity data found large-scale errors in over a third of the species and genera, and in nearly two-thirds of the families. To avoid compromising the value of their services to the global science community, online providers must actively address the question of data quality.


Historical review of sample preparation for chromatographic bioanalysis: pros and cons

DRUG DEVELOPMENT RESEARCH, Issue 3 2007
Min S. Chang
Abstract Sample preparation is a major task in a regulated bioanalytical laboratory. The sample preparation procedure significantly impacts assay throughput, data quality, analysis cost, and employee satisfaction. Therefore, selecting and optimizing an appropriate sample preparation method is essential for successful method development. Because of our recent expertise, this article is focused on sample preparation for high-performance liquid chromatography with mass spectrometric detection. Liquid chromatography with mass spectrometric detection (LC-MS) is the most common detection technique for small molecules used in regulated bioanalytical laboratories. The sample preparation technologies discussed are pre-extraction and post-extraction sample processing, protein precipitation (PPT), liquid–liquid extraction (LLE), offline solid-phase extraction (SPE), and online solid-phase extraction. Since all these techniques were in use for more than two decades, numerous applications and variations exist for each technique. We will not attempt to categorize each variation. Rather, the development history, a brief theoretical background, and selected references are presented. The strengths and the limitations of each method are discussed, including the throughput improvement potential. If available, illustrations from presentations at various meetings by our laboratory are used to clarify our opinion. Drug Dev Res 68:107–133, 2007. © 2007 Wiley-Liss, Inc.


Enhanced process monitoring for wastewater treatment systems

ENVIRONMETRICS, Issue 6 2008
Chang Kyoo Yoo
Abstract Wastewater treatment plants (WWTPs) remain notorious for poor data quality and sensor reliability problems, owing to the hostile environment and missing data. Many sensors in WWTPs are prone to malfunction in harsh environments. If a WWTP contains any redundancy between sensors, monitoring methods that include sensor reconstruction, such as the one proposed here, can yield better monitoring efficiency than methods without a reconstruction scheme. An enhanced robust process monitoring method combined with a sensor reconstruction scheme to tackle sensor failure problems is proposed for biological wastewater treatment systems. The proposed method is applied to a single reactor for high activity ammonia removal over nitrite (SHARON) process. It shows robust monitoring performance in the presence of sensor faults and produces few false alarms. Moreover, it enables the monitoring system to keep running in the case of sensor failures. This guaranteed continuity of the monitoring scheme is a necessary development in view of real-time applications in full-scale WWTPs. Copyright © 2007 John Wiley & Sons, Ltd.
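The abstract does not name the reconstruction method; a widely used option for this kind of scheme (an assumption here, not necessarily the authors' choice) is PCA-based reconstruction, in which a faulty sensor's reading is replaced by the value most consistent with a principal-component model of normal operation. A minimal sketch on simulated, correlated "sensor" data:

```python
import numpy as np

def fit_pca(X, n_components):
    """Fit a PCA model on normal operating data X (rows = samples)."""
    mean, std = X.mean(axis=0), X.std(axis=0)
    Xs = (X - mean) / std
    _, _, Vt = np.linalg.svd(Xs, full_matrices=False)
    return mean, std, Vt[:n_components].T          # loading matrix P

def reconstruct_sensor(x, faulty_idx, mean, std, P):
    """Replace the faulty reading in x by the value most consistent with the
    PCA model (iterative projection onto the model plane)."""
    xs = (x - mean) / std
    C = P @ P.T
    for _ in range(50):                            # iterate until it settles
        xs[faulty_idx] = (C @ xs)[faulty_idx]
    return xs * std + mean

# Illustrative use with synthetic data (not WWTP measurements):
rng = np.random.default_rng(0)
latent = rng.normal(size=(500, 2))
X = latent @ rng.normal(size=(2, 5)) + 0.1 * rng.normal(size=(500, 5))
mean, std, P = fit_pca(X, n_components=2)

x = X[0].copy()
x[3] = 99.0                                        # simulate a stuck/failed sensor
print(reconstruct_sensor(x, faulty_idx=3, mean=mean, std=std, P=P)[3], X[0][3])
```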


Organisation of proficiency testing for plant health diagnostic tests: the experience of FAPAS®

EPPO BULLETIN, Issue 1 2010
A. Reynolds
Proficiency testing (PT) is an established quality assurance measure and is based on the comparison of laboratories' results in an inter-laboratory trial. It highlights problems in laboratory analysis and is an educational tool to help improve data quality. This article describes how PT is organised by FAPAS®. FAPAS® is an international PT provider (external quality assessments) for food chemistry, food microbiology, genetically modified material and water/environmental samples. Since 2007, FAPAS® has organised plant health proficiency tests in conjunction with the Plant Pest and Disease Identification Programme at the Food and Environment Research Agency (Fera). Up until 2009, FAPAS® had organised seven plant health proficiency tests that covered the identification of lyophilised bacteria, viruses in leaves and fungi in agar plugs. In 2009, FAPAS® organised over 10 plant health proficiency tests under the banner of 'PhytoPAS', including Potato spindle tuber viroid, Phytophthora ramorum, Thrips palmi, Erwinia amylovora, Clavibacter michiganensis subsp. sepedonicus, etc. DNA extracts, cyst nematodes (Globodera pallida) and slides/immunofluorescence (IF) slides have been added to the programme. The organisation of the plant health proficiency tests follows a similar pattern. Suitable test materials are prepared and tested for quality before distribution to requesting participants. Laboratories usually have 1–2 months to analyse their samples and return their results. A report is then compiled for issue to laboratories; these reports contain all results in an anonymous form, so that laboratories can compare their results with those of other participants. If a laboratory's performance is unsatisfactory then it is up to them to investigate the situation. Thus, the primary purpose of PT is the detection of inaccuracy in a laboratory's results, so that they can investigate the problems and initiate corrective procedures.


New procedures to decompose geomagnetic field variations and application to volcanic activity

GEOPHYSICAL JOURNAL INTERNATIONAL, Issue 1 2008
Ikuko Fujii
SUMMARY We report the development of numerical procedures for extracting long-term geomagnetic field variations caused by volcanic activity from an observed geomagnetic field by using statistical methods. The newly developed procedures estimate the trend in the observed data, as well as variations of non-volcanic origin such as periodic components, components related to external geomagnetic variations and observational noise. We also aim at referring to data obtained at a remote standard geomagnetic observatory rather than using a temporarily installed reference site, for reasons of data quality. Two different approaches, a Bayesian statistical method and a Kalman filter method, are applied to decompose the geomagnetic field data into four components for comparison. The number of filter coefficients and the degree of condition realizations are optimized on the basis of minimization of the information criteria. The two procedures were evaluated by using a synthetic data set. Generally, the results of both methods are equally sufficient. Subtle differences are seen at the first several data points, due to arbitrarily selected initial values in the case of the Kalman filter method, and in the smaller residual for the Bayesian statistical method. The largest differences are in computation time and memory size. The Kalman filter method runs a thousand times faster on a testing workstation and requires less memory than the Bayesian method. The Kalman filter method was applied to the total intensity data at Kuchi-erabu-jima volcano. The result suggests that the procedure works reasonably well.
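A heavily simplified sketch of the Kalman-filter side of such a decomposition, separating a random-walk trend from a single harmonic component; the external-field regression and the authors' model-selection step are omitted, and all parameter values and the synthetic series are illustrative:

```python
import numpy as np

def kalman_decompose(y, period, q_trend=1e-4, q_seas=1e-5, r=1.0):
    """Split a series into a random-walk trend and one harmonic component
    with a Kalman filter. Returns the filtered trend and seasonal parts."""
    w = 2 * np.pi / period
    # State: [trend, harmonic_cos, harmonic_sin]; the harmonic pair rotates by w.
    F = np.array([[1, 0, 0],
                  [0, np.cos(w), np.sin(w)],
                  [0, -np.sin(w), np.cos(w)]])
    H = np.array([[1.0, 1.0, 0.0]])
    Q = np.diag([q_trend, q_seas, q_seas])
    x, P = np.zeros(3), np.eye(3) * 10.0
    trend, seasonal = [], []
    for obs in y:
        x = F @ x                          # predict
        P = F @ P @ F.T + Q
        S = H @ P @ H.T + r                # update
        K = (P @ H.T) / S
        x = x + (K * (obs - H @ x)).ravel()
        P = (np.eye(3) - K @ H) @ P
        trend.append(x[0])
        seasonal.append(x[1])
    return np.array(trend), np.array(seasonal)

# Illustrative synthetic series: slow drift + daily cycle + noise (10-min sampling).
t = np.arange(2000)
y = 0.002 * t + 3.0 * np.sin(2 * np.pi * t / 144) + np.random.randn(t.size)
trend, seasonal = kalman_decompose(y, period=144)
print(trend[-5:].round(2))
```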


The Use of Reference Materials: A Tutorial

GEOSTANDARDS & GEOANALYTICAL RESEARCH, Issue 1 2001
Jean S. Kane
Keywords: reference materials; certified reference materials; method validation; traceability of measurement; geochemical analysis. Any review of the analytical literature shows that, while reference materials are routinely used in laboratories world-wide, not all uses follow ISO Guide 33 (1989), which outlines best practices. Analytical data quality can suffer as a result. This paper reviews the various uses that the geoanalytical community has made of reference materials from a historical perspective, and suggests improvements in practice that would more closely follow ISO Guide 33 recommendations.


A Case Study of Soil-Gas Sampling in Silt and Clay-Rich (Low-Permeability) Soils

GROUND WATER MONITORING & REMEDIATION, Issue 1 2009
Todd A. McAlary
Soil-gas sampling and analysis is a common tool used in vapor intrusion assessments; however, sample collection becomes more difficult in fine-grained, low-permeability soils because of limitations on the flow rate that can be sustained during purging and sampling. This affects the time required to extract sufficient volume to satisfy purging and sampling requirements. The soil-gas probe tubing or pipe and sandpack around the probe screen should generally be purged prior to sampling. After purging, additional soil gas must be extracted for chemical analysis, which may include field screening, laboratory analysis, occasional duplicate samples, or analysis for more than one analytical method (e.g., volatile organic compounds and semivolatile organic compounds). At present, most regulatory guidance documents do not distinguish between soil-gas sampling methods that are appropriate for high- or low-permeability soils. This paper discusses permeability influences on soil-gas sample collection and reports data from a case study involving soil-gas sampling from silt and clay-rich soils with moderate to extremely low gas permeability to identify a sampling approach that yields reproducible samples with data quality appropriate for vapor intrusion investigations for a wide range of gas-permeability conditions.


Application of Six Sigma Methods for Improving the Analytical Data Management Process in the Environmental Industry

GROUND WATER MONITORING & REMEDIATION, Issue 2 2006
Christopher M. French
Honeywell applied the rigorous and well-documented Six Sigma quality-improvement approach to the complex, highly heterogeneous, and mission-critical process of remedial site environmental data management to achieve a sea change in terms of data quality, environmental risk reduction, and overall process cost reduction. The primary focus was to apply both qualitative and quantitative Six Sigma methods to improve electronic management of analytical laboratory data generated for environmental remediation and long-term monitoring programs. The process includes electronic data delivery, data QA/QC checking, data verification, data validation, database administration, regulatory agency reporting and linkage to spatial information, and real-time geographical information systems. The analysis identified that automated, centralized, web-based software tools delivered through a Software as a Service (SaaS) model are optimal for improving the process, resulting in cost reductions while simultaneously improving data quality and long-term data usability and preservation. A pilot project was completed that quantified cycle time and cost improvements of 50% and 65%, respectively.


Assessing the reliability of biodiversity databases: identifying evenly inventoried island parasitoid faunas (Hymenoptera: Ichneumonoidea) worldwide

INSECT CONSERVATION AND DIVERSITY, Issue 2 2010
ANA M. C. SANTOS
Abstract. 1. Taxonomic and geographic biases are common in biodiversity inventories, especially in hyperdiverse taxa, such as the Ichneumonoidea. Despite these problems, biodiversity databases could be a valuable source of information if their reliability is carefully assessed. 2. One major problem of using these data for large-scale analyses is the unevenness of data quality from different areas, which makes them difficult to compare. One way of overcoming such a problem would be to identify sets of areas that are evenly inventoried. 3. Here, we propose a scoring protocol for the identification of sets of evenly inventoried areas from taxonomic databases, based on three criteria: (i) completeness at high taxonomic levels, (ii) congruence with well-established ecological relationships (such as the species–area relationship), and (iii) publication effort received. We apply this protocol to the selection of a set of evenly inventoried islands worldwide for two Ichneumonoidea families (Braconidae and Ichneumonidae) from the data gathered in the Taxapad database. 4. From the 118 islands included in Taxapad, 53 and 70 can be considered sufficiently inventoried for Braconidae and Ichneumonidae, respectively. The publication effort criterion was more restrictive than the other two criteria. The Indomalayan, Nearctic and Palearctic regions had more than half of their islands identified as evenly inventoried, for both families. 5. We discuss the generality of the biases and incompleteness of most biodiversity data, and also how the basic principles of the protocol proposed here can be applied to taxonomic databases devoted to other taxa. Also, the islands identified here can serve as the basis for large-scale analyses of the poorly known biogeography of the Ichneumonoidea.


Estimating temperature normals for USCRN stations

INTERNATIONAL JOURNAL OF CLIMATOLOGY, Issue 14 2005
Bomin Sun
Abstract Temperature normals have been estimated for stations of the newly developed US Climate Reference Network (USCRN) by using USCRN temperatures and temperature anomalies interpolated from neighboring stations of the National Weather Service Cooperative Station Network (COOP). To seek the best normal estimation approach, several variations on estimation techniques were considered: the sensitivity of error of estimated normals to COOP data quality; the number of neighboring COOP stations used; a spatial interpolation scheme; and the number of years of data used in normal estimation. The best estimation method we found is the one in which temperature anomalies are spatially interpolated from COOP stations within approximately 117 km of the target station using a weighting scheme involving the inverse of the squared difference in temperature (between the neighboring and target station). Using this approach, normals of USCRN stations were generated. Spatial and temporal characteristics of errors are presented, and the applicability of estimated normals in climate monitoring is discussed. Copyright © 2005 Royal Meteorological Society
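The abstract specifies the neighbour weighting (the inverse of the squared temperature difference between neighbour and target) but not the full estimation formula. The sketch below assumes a common construction in which the target station's normal is its short-record mean minus an anomaly interpolated from the neighbours; the field names and values are illustrative, not the paper's data format:

```python
def estimate_normal(target_obs_mean, neighbors):
    """Estimate a temperature normal for a USCRN station from neighbouring COOP
    stations, weighting each neighbour's anomaly by 1 / (temperature difference)^2.

    neighbors: list of dicts with illustrative keys
        'normal'    - neighbour's long-term normal (°C)
        'obs_mean'  - neighbour's mean over the USCRN overlap period (°C)
        'temp_diff' - mean temperature difference, neighbour minus target (°C)
    """
    num = den = 0.0
    for nb in neighbors:
        anomaly = nb['obs_mean'] - nb['normal']        # neighbour's anomaly
        w = 1.0 / max(nb['temp_diff'] ** 2, 1e-6)      # inverse squared difference
        num += w * anomaly
        den += w
    interpolated_anomaly = num / den
    # Normal = short-record USCRN mean minus the anomaly of that period.
    return target_obs_mean - interpolated_anomaly

# Illustrative call (made-up values):
print(estimate_normal(11.2, [
    {'normal': 10.8, 'obs_mean': 11.1, 'temp_diff': 0.4},
    {'normal': 12.0, 'obs_mean': 12.5, 'temp_diff': 1.1},
]))
```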


Application of possibility theory in the life-cycle inventory assessment of biofuels

INTERNATIONAL JOURNAL OF ENERGY RESEARCH, Issue 8 2002
Raymond R. Tan
Abstract Data uncertainty issues have constrained the widespread acceptance of life-cycle analysis (LCA) and related methods. This is particularly important in the LCA of fuels due to the wide range of available feedstocks and processing options. Despite recent attempts at standardization, there remain persistent doubts about the general validity of LCA results, often due to uncertainties about data quality. This paper demonstrates the application of possibility theory as a tool for handling life-cycle inventory data imprecision for the case of the net energy balance of coconut methyl ester (CME) as a biodiesel transport fuel. Results derived using a possibilistic computation are contrasted with those arrived at by probabilistic (Monte Carlo) simulation. The two approaches yield comparable results but possibilistic modelling offers significant advantages with respect to computational efficiency. The net energy balance of CME is estimated to be approximately 36 MJ kg⁻¹, significantly higher than the 28 MJ kg⁻¹ net energy typical of rapeseed oil methyl ester (RME) relevant to the U.K. Copyright © 2002 John Wiley & Sons, Ltd.
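In possibility theory each uncertain inventory item is represented as a fuzzy number (for example a triangular possibility distribution) and propagated through the energy balance by interval arithmetic on α-cuts, whereas the probabilistic alternative samples distributions by Monte Carlo. A minimal sketch contrasting the two on a toy net-energy balance; all figures are illustrative and are not the CME inventory:

```python
import numpy as np

def alpha_cut(tri, alpha):
    """Interval of a triangular fuzzy number (low, mode, high) at level alpha."""
    low, mode, high = tri
    return low + alpha * (mode - low), high - alpha * (high - mode)

# Illustrative fuzzy inventory items, MJ per kg of fuel: (low, mode, high)
energy_out = (37.0, 40.0, 43.0)
process_in = (2.0, 3.0, 4.5)
farming_in = (0.5, 1.0, 2.0)

# Possibilistic propagation: interval arithmetic at each alpha level.
for alpha in (0.0, 0.5, 1.0):
    lo_o, hi_o = alpha_cut(energy_out, alpha)
    lo_p, hi_p = alpha_cut(process_in, alpha)
    lo_f, hi_f = alpha_cut(farming_in, alpha)
    net = (lo_o - hi_p - hi_f, hi_o - lo_p - lo_f)   # worst and best cases
    print(f"alpha={alpha}: net energy in [{net[0]:.1f}, {net[1]:.1f}] MJ/kg")

# Monte Carlo comparison with triangular probability densities.
rng = np.random.default_rng(1)
n = 100_000
sample = (rng.triangular(*energy_out, n)
          - rng.triangular(*process_in, n)
          - rng.triangular(*farming_in, n))
print("MC 5th-95th percentile:", np.percentile(sample, [5, 95]).round(1))
```

The α-cut loop needs only a handful of interval operations per level, which illustrates the computational advantage over repeated Monte Carlo sampling that the abstract refers to.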


A comparison of a microfocus X-ray source and a conventional sealed tube for crystal structure determination

JOURNAL OF APPLIED CRYSTALLOGRAPHY, Issue 5 2009
Thomas Schulz
Experiments are described in which a direct comparison was made between a conventional 2 kW water-cooled sealed-tube X-ray source and a 30 W air-cooled microfocus source with focusing multilayer optics, using the same goniometer, detector, radiation (Mo Kα), crystals and software. The beam characteristics of the two sources were analyzed and the quality of the resulting data sets compared. The Incoatec Microfocus Source (IµS) gave a narrow approximately Gaussian-shaped primary beam profile, whereas the Bruker AXS sealed-tube source, equipped with a graphite monochromator and a monocapillary collimator, had a broader beam with an approximate intensity plateau. Both sources were mounted on the same Bruker D8 goniometer with a SMART APEX II CCD detector and Bruker Kryoflex low-temperature device. Switching between sources simply required changing the software zero setting of the 2θ circle and could be performed in a few minutes, so it was possible to use the same crystal for both sources without changing its temperature or orientation. A representative cross section of compounds (organic, organometallic and salt) with and without heavy atoms was investigated. For each compound, two data sets, one from a small and one from a large crystal, were collected using each source. In another experiment, the data quality was compared for crystals of the same compound that had been chosen so that they had dimensions similar to the width of the beam. The data were processed and the structures refined using standard Bruker and SHELX software. The experiments show that the IµS gives superior data for small crystals whereas the diffracted intensities were comparable for the large crystals. Appropriate scaling is particularly important for the IµS data.


The influence of spatial errors in species occurrence data used in distribution models

JOURNAL OF APPLIED ECOLOGY, Issue 1 2008
Catherine H Graham
Summary 1. Species distribution modelling is used increasingly in both applied and theoretical research to predict how species are distributed and to understand attributes of species' environmental requirements. In species distribution modelling, various statistical methods are used that combine species occurrence data with environmental spatial data layers to predict the suitability of any site for that species. While the number of data sharing initiatives involving species' occurrences in the scientific community has increased dramatically over the past few years, various data quality and methodological concerns related to using these data for species distribution modelling have not been addressed adequately. 2. We evaluated how uncertainty in georeferences and associated locational error in occurrences influence species distribution modelling using two treatments: (1) a control treatment where models were calibrated with original, accurate data and (2) an error treatment where data were first degraded spatially to simulate locational error. To incorporate error into the coordinates, we moved each coordinate with a random number drawn from the normal distribution with a mean of zero and a standard deviation of 5 km. We evaluated the influence of error on the performance of 10 commonly used distributional modelling techniques applied to 40 species in four distinct geographical regions. 3. Locational error in occurrences reduced model performance in three of these regions; relatively accurate predictions of species distributions were possible for most species, even with degraded occurrences. Two species distribution modelling techniques, boosted regression trees and maximum entropy, were the best performing models in the face of locational errors. The results obtained with boosted regression trees were only slightly degraded by errors in location, and the results obtained with the maximum entropy approach were not affected by such errors. 4. Synthesis and applications. To use the vast array of occurrence data that exists currently for research and management relating to the geographical ranges of species, modellers need to know the influence of locational error on model quality and whether some modelling techniques are particularly robust to error. We show that certain modelling techniques are particularly robust to a moderate level of locational error and that useful predictions of species distributions can be made even when occurrence data include some error.
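The error treatment described in point 2 is straightforward to reproduce: each occurrence coordinate is shifted by independent Gaussian noise with zero mean and a 5 km standard deviation before model calibration. A minimal sketch, assuming projected coordinates in metres (the coordinate values and function name are illustrative):

```python
import numpy as np

def degrade_coordinates(xy_m, sd_km=5.0, seed=0):
    """Simulate locational error: shift each occurrence by Gaussian noise with
    mean 0 and standard deviation sd_km (coordinates in metres, e.g. an
    equal-area projection)."""
    rng = np.random.default_rng(seed)
    return xy_m + rng.normal(loc=0.0, scale=sd_km * 1000.0, size=xy_m.shape)

# Illustrative occurrence records (projected metres, made-up values):
occurrences = np.array([[512_300.0, 4_201_750.0],
                        [498_120.0, 4_187_400.0]])
print(degrade_coordinates(occurrences))
```

The control treatment then calibrates each model on the original coordinates and the error treatment on the degraded copy, and performance is compared across modelling techniques.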


Collation, assessment and analysis of literature in vitro data on hERG receptor blocking potency for subsequent modeling of drugs' cardiotoxic properties

JOURNAL OF APPLIED TOXICOLOGY, Issue 3 2009
Sebastian Polak
Abstract The assessment of the torsadogenic potency of a new chemical entity is a crucial issue during lead optimization and the drug development process. It is required by the regulatory agencies during the registration process. In recent years, there has been a considerable interest in developing in silico models, which allow prediction of drug–hERG channel interaction at the early stage of a drug development process. The main mechanism underlying an acquired QT syndrome and a potentially fatal arrhythmia called torsades de pointes is the inhibition of potassium channel encoded by hERG (the human ether-à-go-go-related gene). The concentration producing half-maximal block of the hERG potassium current (IC50) is a surrogate marker for proarrhythmic properties of compounds and is considered a test for cardiac safety of drugs or drug candidates. The IC50 values, obtained from data collected during electrophysiological studies, are highly dependent on experimental conditions (i.e. model, temperature, voltage protocol). For the in silico models' quality and performance, the data quality and consistency is a crucial issue. Therefore the main objective of our work was to collect and assess the hERG IC50 data available in accessible scientific literature to provide a high-quality data set for further studies. Copyright © 2008 John Wiley & Sons, Ltd.
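The IC50 values being collated summarise concentration–response curves for hERG current block; under the commonly used Hill model, the fraction of current blocked at concentration C is Cⁿ / (Cⁿ + IC50ⁿ). A minimal sketch with illustrative parameters (not drawn from the collated data set):

```python
def fraction_blocked(conc_uM, ic50_uM, hill=1.0):
    """Fraction of hERG current blocked at a given drug concentration,
    assuming a simple Hill (logistic) concentration-response relationship."""
    return conc_uM**hill / (conc_uM**hill + ic50_uM**hill)

# Illustrative values only: a hypothetical potent hERG blocker with IC50 = 0.15 µM.
ic50 = 0.15
for c in (0.01, 0.05, 0.15, 0.5, 2.0):
    print(f"{c:>5} µM -> {100 * fraction_blocked(c, ic50):.0f}% block")
```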


Global patterns of plant diversity and floristic knowledge

JOURNAL OF BIOGEOGRAPHY, Issue 7 2005
Gerold Kier
Abstract Aims: We present the first global map of vascular plant species richness by ecoregion and compare these results with the published literature on global priorities for plant conservation. In so doing, we assess the state of floristic knowledge across ecoregions as described in floras, checklists, and other published documents and pinpoint geographical gaps in our understanding of the global vascular plant flora. Finally, we explore the relationships between plant species richness by ecoregion and our knowledge of the flora, and between plant richness and the human footprint, a spatially explicit measure of the loss and degradation of natural habitats and ecosystems as a result of human activities. Location: Global. Methods: Richness estimates for the 867 terrestrial ecoregions of the world were derived from published richness data of c. 1800 geographical units. We applied one of four methods to assess richness, depending on data quality. These included collation and interpretation of published data, use of species–area curves to extrapolate richness, use of taxon-based data, and estimates derived from other ecoregions within the same biome. Results: The highest estimate of plant species richness is in the Borneo lowlands ecoregion (10,000 species) followed by nine ecoregions located in Central and South America with ≥ 8000 species; all are found within the Tropical and Subtropical Moist Broadleaf Forests biome. Among the 51 ecoregions with ≥ 5000 species, only five are located in temperate regions. For 43% of the 867 ecoregions, data quality was considered good or moderate. Among biomes, adequate data are especially lacking for flooded grasslands and flooded savannas. We found a significant correlation between species richness and data quality for only a few biomes, and, in all of these cases, our results indicated that species-rich ecoregions are better studied than those poor in vascular plants. Similarly, only in a few biomes did we find significant correlations between species richness and the human footprint, all of which were positive. Main conclusions: The work presented here sets the stage for comparisons of degree of concordance of plant species richness with plant endemism and vertebrate species richness: important analyses for a comprehensive global biodiversity strategy. We suggest: (1) that current global plant conservation strategies be reviewed to check if they cover the most outstanding examples of regions from each of the world's major biomes, even if these examples are species-poor compared with other biomes; (2) that flooded grasslands and flooded savannas should become a global priority in collecting and compiling richness data for vascular plants; and (3) that future studies which rely upon species–area calculations do not use a uniform parameter value but instead use values derived separately for subregions.
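One of the four estimation methods mentioned is extrapolation from species–area curves, conventionally the power law S = c·Aᶻ fitted in log–log space. A minimal sketch with hypothetical sub-regional counts; note that the paper itself recommends deriving the exponent separately for subregions rather than using a single value as done here for brevity:

```python
import numpy as np

# Hypothetical species counts from nested or sub-regional inventories:
area_km2 = np.array([100.0, 500.0, 2_000.0, 10_000.0])
species = np.array([210, 430, 720, 1_250])

# Arrhenius power law S = c * A**z, fitted as a straight line in log-log space.
z, log_c = np.polyfit(np.log(area_km2), np.log(species), 1)
c = np.exp(log_c)

ecoregion_area = 60_000.0     # km², illustrative target ecoregion
print(f"z = {z:.2f}, extrapolated richness ≈ {c * ecoregion_area**z:.0f} species")
```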


ASA scores in the preoperative patient: feedback to clinicians can improve data quality

JOURNAL OF EVALUATION IN CLINICAL PRACTICE, Issue 2 2007
Michael P. W. Grocott BSc MRCP FRCA


Technologies, Security, and Privacy in the Post-9/11 European Information Society

JOURNAL OF LAW AND SOCIETY, Issue 2 2004
Michael Levi
Since 11 September 2001, many 'hard' and 'soft' security strategies have been introduced to enable more intensive surveillance and control of the movement of 'suspect populations'. Suicide bombings have since generated a step-change in asymmetric threat analysis and public perceptions of risk. This article reviews how post-9/11 'security' issues intersect with existing and emerging technologies, particularly those relating to identity, location, home, and work that will form the backbone of the European Information Society. The article explores the complexities generated by the way that these technologies work, sites of nationalist resistance, and formal bureaucratic roles. Many of the planned surveillance methods and technologies are convergence technologies aiming to bring together new and existing data sources, but are unable to do so because of poor data quality and the difficulty of using the integrated data to reduce serious crime risks. The delay may enable legal compliance models to be developed in order to protect the principles of privacy that are set out in the ECHR and the EC Data Protection Directive. Though (moral) panics produce changes in law, the article emphasizes the constraining effects of law.


1H spectroscopic imaging of human brain at 3 Tesla: Comparison of fast three-dimensional magnetic resonance spectroscopic imaging techniques

JOURNAL OF MAGNETIC RESONANCE IMAGING, Issue 3 2009
Matthew L. Zierhut PhD
Abstract Purpose To investigate the signal-to-noise ratio (SNR) and data quality of time-reduced three-dimensional (3D) proton magnetic resonance spectroscopic imaging (1H MRSI) techniques in the human brain at 3 Tesla. Materials and Methods Techniques that were investigated included ellipsoidal k-space sampling, parallel imaging, and echo-planar spectroscopic imaging (EPSI). The SNR values for N-acetyl aspartate, choline, creatine, and lactate or lipid peaks were compared after correcting for effective spatial resolution and acquisition time in a phantom and in the brains of human volunteers. Other factors considered were linewidths, metabolite ratios, partial volume effects, and subcutaneous lipid contamination. Results In volunteers, the median normalized SNR for parallel imaging data decreased by 34–42%, but could be significantly improved using regularization. The normalized signal to noise loss in flyback EPSI data was 11–18%. The effective spatial resolutions of the traditional, ellipsoidal, sensitivity encoding (SENSE) sampling scheme, and EPSI data were 1.02, 2.43, 1.03, and 1.01 cm3, respectively. As expected, lipid contamination was variable between subjects but was highest for the SENSE data. Patient data obtained using the flyback EPSI method were of excellent quality. Conclusion Data from all 1H 3D-MRSI techniques were qualitatively acceptable, based upon SNR, linewidths, and metabolite ratios. The larger field of view obtained with the EPSI methods showed negligible lipid aliasing with acceptable SNR values in less than 9.5 min without compromising the point-spread function. J. Magn. Reson. Imaging 2009;30:473–480. © 2009 Wiley-Liss, Inc.


A quality assurance audit: Phase III trial of maximal androgen deprivation in prostate cancer (TROG 96.01)

JOURNAL OF MEDICAL IMAGING AND RADIATION ONCOLOGY, Issue 1 2000
A Steigler
SUMMARY In 1997 the Trans-Tasman Radiation Oncology Group (TROG) performed a quality assurance (QA) audit of its phase III randomized clinical trial investigating the effectiveness of different durations of maximal androgen deprivation prior to and during definitive radiation therapy for locally advanced carcinoma of the prostate (TROG 96.01). The audit reviewed a total of 60 cases from 15 centres across Australia and New Zealand. In addition to verification of technical adherence to the protocol, the audit also incorporated a survey of centre planning techniques and a QA time/cost analysis. The present report builds on TROG's first technical audit conducted in 1996 for the phase III accelerated head and neck trial (TROG 91.01) and highlights the significant progress TROG has made in the interim period. The audit provides a strong validation of the results of the 96.01 trial, as well as valuable budgeting and treatment planning information for future trials. Overall improvements were detected in data quality and quantity, and in protocol compliance, with a reduction in the rate of unacceptable protocol violations from 10 to 4%. Audit design, staff education and increased data management resources were identified as the main contributing factors to these improvements. In addition, a budget estimate of $100 per patient has been proposed for conducting similar technical audits. The next major QA project to be undertaken by TROG during the period 1998,1999 is an intercentre dosimetry study. Trial funding and staff education have been targeted as the key major issues essential to the continued success and expansion of TROG's QA programme. [source]