Quality Data (quality + data)


Kinds of Quality Data

  • high quality data
  • water quality data


Selected Abstracts


    Targeting Chemical and Biological Warfare Agents at the Molecular Level

    ELECTROANALYSIS, Issue 14 2003
    Omowunmi
    Abstract After the September 11 tragedies of 2001, scientists and law-enforcement agencies have shown increasing concern that terrorist organizations and their "rogue" foreign government backers may resort to the use of chemical and/or biological agents against U.S. military or civilian targets. In addition to the right mix of policies, including security measures, intelligence gathering and training for medical personnel on how to recognize symptoms of biochemical warfare agents, the major success in combating terrorism lies in how best to respond to an attack using reliable analytical sensors. The public and regulatory agencies expect sensing methodologies and devices for homeland security to be very reliable. Quality data can only be generated by using analytical sensors that are validated and produced under strict design, development, and manufacturing controls. Electrochemical devices are ideally suited for obtaining the desired analytical information in a faster, simpler, and cheaper manner than traditional (lab-based) assays and hence for meeting the requirements of decentralized biodefense applications. This article presents a review of the major trends in monitoring technologies for chemical and biological warfare (CBW) agents. It focuses on research and development of sensors (particularly electrochemical ones) and discusses how advances in molecular recognition might be used to design new multimission networked sensors (MULNETS) for homeland security. Decision flow-charts for choosing particular analytical techniques for CBW agents are presented. Finally, the paths to designing sensors to meet the needs of today's measurement criteria are analyzed. [source]


    Sediment quality in near coastal waters of the Gulf of Mexico: Influence of Hurricane Katrina

    ENVIRONMENTAL TOXICOLOGY & CHEMISTRY, Issue 7 2010
    John M. Macauley
    Abstract The results of the present study represent a synoptic analysis of sediment quality in coastal waters of Lake Pontchartrain and Mississippi Sound two months after the landfall of Hurricane Katrina. Posthurricane conditions were compared to prehurricane (2000–2004) conditions for sediment quality data. There were no exceedances of effects range median (ERM) sediment quality guideline values for chemical contaminants in any of the sediment samples collected from the Lake Pontchartrain or the Mississippi Sound study areas following the hurricane. Lower threshold effects range low (ERL) values were exceeded for As, Cd, and Ni at several stations in both survey areas, similar to levels of contamination observed prior to the hurricane. The comparison of sediment quality indicators before and after the hurricane suggests considerable stability of these systems with respect to short-term ecological impacts. Although other studies have shown storm-related changes could be detected (e.g., effects on benthic communities associated with shifts in salinity), there were no indications of widespread sediment contamination. Environ. Toxicol. Chem. 2010;29:1403–1408. © 2010 SETAC [source]
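    The screening described above amounts to comparing each measured concentration with its guideline thresholds. A minimal sketch in Python, assuming the commonly cited Long et al. (1995) ERL/ERM values for the three metals named in the abstract (the study's exact guideline tables may differ), with a hypothetical station sample:

    ```python
    # ERL/ERM screening sketch; guideline values (mg/kg dry wt) are assumptions.
    ERL = {"As": 8.2, "Cd": 1.2, "Ni": 20.9}   # effects range low
    ERM = {"As": 70.0, "Cd": 9.6, "Ni": 51.6}  # effects range median

    def screen_station(concentrations):
        """Flag each analyte as below ERL, an ERL exceedance, or an ERM exceedance."""
        flags = {}
        for analyte, c in concentrations.items():
            if c >= ERM[analyte]:
                flags[analyte] = "ERM exceedance"
            elif c >= ERL[analyte]:
                flags[analyte] = "ERL exceedance"
            else:
                flags[analyte] = "below ERL"
        return flags

    # Hypothetical post-hurricane station: As and Ni above ERL, nothing above ERM
    print(screen_station({"As": 9.4, "Cd": 0.3, "Ni": 24.0}))
    ```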


    Nonparametric harmonic regression for estuarine water quality data

    ENVIRONMETRICS, Issue 6 2010
    Melanie A. Autin
    Abstract Periodicity is omnipresent in environmental time series data. For modeling estuarine water quality variables, harmonic regression analysis has long been the standard for dealing with periodicity. Generalized additive models (GAMs) allow more flexibility in the response function. They permit parametric, semiparametric, and nonparametric regression functions of the predictor variables. We compare harmonic regression, GAMs with cubic regression splines, and GAMs with cyclic regression splines in simulations and using water quality data collected from the National Estuarine Research Reserve System (NERRS). While the classical harmonic regression model works well for clean, near-sinusoidal data, the GAMs are competitive and are very promising for more complex data. The generalized additive models are also more adaptive and require less intervention. Copyright © 2009 John Wiley & Sons, Ltd. [source]
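    As a concrete illustration, the classical harmonic regression the paper takes as its baseline can be fit by ordinary least squares on sine and cosine terms. A minimal sketch with synthetic daily data (the annual period and simulated series are assumptions, not NERRS data):

    ```python
    import numpy as np

    # Harmonic regression sketch: seasonal signal + noise, fit by least squares.
    rng = np.random.default_rng(0)
    t = np.arange(730.0)                              # two years, daily
    y = 5 + 2*np.sin(2*np.pi*t/365.25) + rng.normal(0, 0.5, t.size)

    # Design matrix: intercept plus one annual harmonic (sine and cosine)
    X = np.column_stack([np.ones_like(t),
                         np.sin(2*np.pi*t/365.25),
                         np.cos(2*np.pi*t/365.25)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    print("intercept, sin, cos:", np.round(beta, 2))  # ~[5, 2, 0]
    ```

    A GAM with cyclic splines replaces the fixed sinusoid with a flexible periodic smooth, which is what gives it the edge on non-sinusoidal series.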


    A simulation tool for designing nutrient monitoring programmes for eutrophication assessments,

    ENVIRONMETRICS, Issue 1 2010
    Janet Heffernan
    Abstract This paper describes a simulation tool to aid the design of nutrient monitoring programmes in coastal waters. The tool is developed by using time series of water quality data from a Smart Buoy, an in situ monitoring device. The tool models the seasonality and temporal dependence in the data and then filters out these features to leave a white noise series. New data sets are then simulated by sampling from the white noise series and re-introducing the modelled seasonality and temporal dependence. Simulating many independent realisations allows us to study the performance of different monitoring designs and assessment methods. We illustrate the approach using total oxidised nitrogen (TOxN) and chlorophyll data from Liverpool Bay, U.K. We consider assessments of whether the underlying mean concentrations of these water quality variables are sufficiently low; i.e. below specified assessment concentrations. We show that for TOxN, even when mean concentrations are at background, daily data from a Smart Buoy or multi-annual sampling from a research vessel would be needed to obtain adequate power. Copyright © 2009 Crown Copyright [source]
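    The pipeline the tool implements can be caricatured in a few lines: fit the seasonality, fit the temporal dependence, keep the white noise residuals, then rebuild simulated series from resampled noise. A sketch assuming an annual harmonic for seasonality and an AR(1) model for dependence (the paper's actual models may be richer):

    ```python
    import numpy as np

    def simulate_like(y, t, period=365.25, n_sims=100, seed=0):
        """Simulate series sharing the seasonality and AR(1) dependence of y."""
        rng = np.random.default_rng(seed)
        X = np.column_stack([np.ones_like(t),
                             np.sin(2*np.pi*t/period),
                             np.cos(2*np.pi*t/period)])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid = y - X @ beta
        phi = np.corrcoef(resid[:-1], resid[1:])[0, 1]  # lag-1 autocorrelation
        white = resid[1:] - phi*resid[:-1]              # filtered white noise
        sims = []
        for _ in range(n_sims):
            shocks = rng.choice(white, size=y.size)     # resample the noise
            e = np.zeros(y.size)
            for i in range(1, y.size):                  # re-introduce dependence
                e[i] = phi*e[i-1] + shocks[i]
            sims.append(X @ beta + e)                   # add seasonality back
        return sims
    ```

    Running many such realisations through a candidate assessment procedure gives an empirical estimate of its power, which is how the daily-buoy versus research-vessel comparison above is made.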


    Analysis of rain quality data from the South African interior

    ENVIRONMETRICS, Issue 4 2002
    Jacky Galpin
    Abstract Rain acidity may be ascribed to emissions from power station stacks, as well as emissions from other industry, biomass burning, maritime influences, agricultural influences, etc. Rain quality data are available for 30 sites in the South African interior, some from as early as 1985 for up to 14 rainfall seasons, while others only have relatively short records. The article examines trends over time in the raw and volume-weighted concentrations of the parameters measured, separately for each of the sites for which sufficient data are available. The main thrust, however, is to examine the inter-relationship structure between the concentrations within each rain event (unweighted data), separately for each site, and to examine whether these inter-relationships have changed over time. The rain events at individual sites can be characterized by approximately eight combinations of rainfall parameters (or rain composition signatures), and these are common to all sites. Some sites will have more events from one signature than another, but there appear to be no signatures unique to a single site. Analysis via factor and cluster analysis, with a correspondence analysis of the results, also aids interpretation of the patterns. This spatio-temporal analysis, performed by pooling all rain event data, irrespective of site or time period, results in nine combinations of rainfall parameters being sufficient to characterize the rain events. The sites and rainfall seasons show patterns in these combinations of parameters, with some combinations appearing more frequently during certain rainfall seasons. In particular, the presence of the combination of low acetate and formate with high magnesium appears to be increasing in the later rainfall seasons, as does this combination together with calcium, sodium, chloride, potassium and fluoride. As expected, sites close together exhibit similar signatures. Copyright © 2002 John Wiley & Sons, Ltd. [source]
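    The signature extraction can be mimicked by clustering rain events on their (log) ion concentrations and counting events per cluster. A minimal sketch with synthetic data, assuming scikit-learn's k-means as a stand-in for the factor/cluster analysis actually used:

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)
    # Synthetic events x ions matrix (e.g. acetate, formate, Mg, Ca, Na, Cl, K)
    events = rng.lognormal(mean=0.0, sigma=1.0, size=(500, 7))

    # Eight composition signatures, as in the per-site analysis above
    km = KMeans(n_clusters=8, n_init=10, random_state=0).fit(np.log(events))
    print(np.bincount(km.labels_))   # number of rain events per signature
    ```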


    UNEP-GEMS/Water Programme – water quality data, GEMStat and open web services – and Japanese cooperation

    HYDROLOGICAL PROCESSES, Issue 9 2007
    Sabrina Barker
    Abstract The purpose of this paper is threefold. First, it demonstrates how monitoring stations that collect water quality data can be situated globally via satellite data from Google Earth. Important technical issues such as interoperability and Open Web Services are discussed in this context. Second, it illustrates how researchers at local levels can benefit from this global technology. The discussion draws from the online water quality database, GEMStat, which contains water quality data and sediment load calculations from around the world. These types of data, collected locally, can be shown to bear global implications through Internet technology. GEMStat has been expanded to include Open Web Services to enable interoperability with other online databases. Third, it illustrates an international framework of cooperation through GEMS/Water Japan, introducing on-site monitoring activities as well as management of international river basins (Mekong/La Plata). Considerations for a future application framework are presented in conclusion. Copyright © 2007 John Wiley & Sons, Ltd. [source]


    Heavy metal concentrations during storm events in a rehabilitated industrialized catchment

    HYDROLOGICAL PROCESSES, Issue 10 2003
    W. H. Blake
    Abstract Water quality data collected on a fortnightly or monthly basis are inadequate for assessment and modelling of many water quality problems as storm event samples are underrepresented or missed. This paper examines the stormflow dynamics of heavy metals (Pb, Cu, Cd and Zn) in the Nant-y-Fendrod stream, South Wales, which has been affected by 250 years of metal smelting, followed by 35 years of landscape rehabilitation measures. For storm events of contrasting (very dry and very wet) antecedent conditions in May 2000 and February 2001, respectively, temporal changes in streamwater heavy metal concentrations above and below an in-line flood detention lake are analysed. At the upstream site, peaks in total metal concentration were recorded on the rising limb for Pb (0·150 mg l−1) and Cu (0·038 mg l−1) but on the falling limb for Zn (1·660 mg l−1) and Cd (0·006 mg l−1) in the summer 2000 storm event, yielding clockwise and anticlockwise hysteretic loops respectively. In contrast, metal concentrations, although high throughout the winter storm event, were diluted somewhat during the storm peak itself. The Pb and Cu appear to be supplied by quickflow processes and transported in close association with fine sediment, whereas Zn and Cd are delivered to the channel and lake by slower subsurface seepage in dissolved form. In the winter 2001 event, antecedent soil moisture and shallow groundwater levels were anomalously high and seepage sources of dissolved metals dominated. Downstream of the lake, Pb and Cu levels and suspended sediment were high in the summer storm, but low in the winter storm, suggesting retention with deposition of fine sediment in the lake during the latter. In the winter storm, Zn and Cd levels were higher downstream than upstream of the lake, perhaps because of additional seepage inputs from the surrounding slopes, which failed to have an impact during summer. An understanding of the complex interplay of antecedent soil moisture and the dynamics of subsurface seepage pathways in relation to the three-dimensional distribution of sources is important in modelling heavy metal fluxes and levels in contaminated urban catchments. Copyright © 2003 John Wiley & Sons, Ltd. [source]
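    The clockwise/anticlockwise labels come from comparing concentrations on the rising versus falling limb of the hydrograph. A toy sketch with hypothetical numbers (not the Nant-y-Fendrod data):

    ```python
    import numpy as np

    def hysteresis_direction(discharge, concentration):
        """Clockwise if the rising limb carries the higher concentrations."""
        peak = int(np.argmax(discharge))
        rising = np.mean(concentration[:peak + 1])
        falling = np.mean(concentration[peak:])
        return "clockwise" if rising > falling else "anticlockwise"

    q = np.array([1.0, 3.0, 6.0, 4.0, 2.0, 1.0])        # storm hydrograph
    pb = np.array([0.02, 0.15, 0.10, 0.06, 0.04, 0.03]) # Pb-like, rising-limb peak
    print(hysteresis_direction(q, pb))                  # -> clockwise
    ```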


    Processes governing river water quality identified by principal component analysis

    HYDROLOGICAL PROCESSES, Issue 16 2002
    I. Haag
    Abstract The present study demonstrates the usefulness of principal component analysis in condensing and interpreting multivariate time-series of water quality data. In a case study the water quality system of the lock-regulated part of the River Neckar (Germany) was analysed, with special emphasis on the oxygen budget. Pooled data of ten water quality parameters and discharge, which had been determined at six stations along a 200 km reach of the river between the years 1993 and 1998, were subjected to principal component analysis. The analysis yielded four stable principal components, explaining 72% of the total variance of the 11 parameters. The four components could be interpreted confidently in terms of underlying processes: biological activity, dilution by high discharge, seasonal effects and the influence of wastewater. From analysing the data of single stations separately, these processes were found to be active throughout the complete reach. Considering the oxygen budget of the river, the variance of biological activity, representing the counteracting processes of primary production and microbial degradation, was found to be most important. This principal component explained 79% of the observed variance of oxygen saturation. In contrast, the analysis of a reduced data set from the 1970s showed that oxygen saturation was then dominated by discharge and temperature variations. The findings indicate that the oxygen budget used to be governed directly by the emission of degradable matter, whereas nowadays eutrophication is most important for extreme oxygen concentrations. Therefore, controlling eutrophication has to be the primary goal, in order to mitigate the rare episodes of pronounced oxygen over- and undersaturation in the future. Copyright © 2002 John Wiley & Sons, Ltd. [source]
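    The core computation is standard: standardize the pooled parameters, diagonalize their correlation matrix, and read off the variance explained by the leading components. A minimal sketch with synthetic data standing in for the 11 measured parameters:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    data = rng.normal(size=(300, 11))                  # observations x parameters
    z = (data - data.mean(axis=0)) / data.std(axis=0)  # standardize
    eigvals = np.linalg.eigvalsh(np.cov(z, rowvar=False))[::-1]  # descending
    explained = eigvals / eigvals.sum()
    print("variance explained by first four PCs:", explained[:4].sum())
    ```

    On the river data the first four components captured 72% of the variance; on unstructured noise like this synthetic input they capture noticeably less, which is the contrast that makes the interpretation meaningful.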


    AUTOMATICALLY OPERATING RADARS FOR MONITORING INSECT PEST MIGRATIONS

    INSECT SCIENCE, Issue 4 2002
    Alistair Drake
    Abstract Over the last three decades, special-purpose "entomological" radars have contributed much to the development of our understanding of insect migration, especially of the nocturnal migrations at altitudes of up to ~1 km that are regularly undertaken by many important pest species. One of the limitations of early radar studies, the difficulty of maintaining observations over long periods, has recently been overcome by the development of automated units that operate autonomously and transmit summaries of their observations to a base laboratory over the public telephone network. These relatively low-cost Insect Monitoring Radars (IMRs) employ a novel "ZLC" configuration that allows high quality data on the migrants' flight parameters and identity to be acquired. Two IMRs are currently operating in the semi-arid inland of eastern Australia, in a region where populations of migrant moths (Lepidoptera) and Australian plague locusts Chortoicetes terminifera (Orthoptera) commonly originate, and some examples of outputs from one of these units are presented. IMRs are able to provide the data needed to characterize a migration system, i.e. to estimate the probabilities of migration events occurring in particular directions at particular seasons and in response to particular environmental conditions and cues. They also appear capable of fulfilling a "sentinel" role for pest-management organisations, alerting forecasters to major migration events and thus to the likely new locations of potential target populations. Finally, they may be suitable for a more general ecological monitoring role, perhaps especially for quantifying year-to-year variations in biological productivity. [source]


    Forty years of numerical climate modelling

    INTERNATIONAL JOURNAL OF CLIMATOLOGY, Issue 9 2001
    K. McGuffie
    Abstract Climate modelling is now a mature discipline approaching its fortieth birthday. The need for valid climate forecasts has been underlined by the recognition that human activities are now modifying the climate. The complex nature of the climate system has resulted in the development of a surprisingly large array of modelling tools. Some are relatively simple, such as earth-system and energy balance models (EBMs), while others are highly sophisticated models which challenge the fastest speeds of the most powerful supercomputers. Indeed, this discipline of the latter half of the twentieth century is so critically dependent on the availability of a means of undertaking powerful calculations that its evolution has matched that of the digital computer. The multi-faceted nature of the climate system demands high quality, global observations and innovative parameterizations through which processes which cannot be described or calculated explicitly are captured to the extent deemed necessary. Interestingly, results from extremely simple, as well as highly complex and many intermediate model types are drawn upon today for effective formulation and evaluation of climate policies. This paper discusses some of the important developments during the first 40 years of climate modelling, from the first models of the global atmosphere to today's models, which typically consist of integrated multi-component representations of the full climate system. The pressures of policy-relevant questions underline the tension between the need for evaluation against quality data and the unending pressure to improve spatial and temporal resolutions of climate models more clearly than at any time since the inception of climate modelling. Copyright © 2001 Royal Meteorological Society [source]
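    Of the model hierarchy sketched above, the zero-dimensional energy balance model is simple enough to fit in a few lines: absorbed shortwave balances emitted longwave. A sketch with standard textbook constants (not tied to any specific model discussed in the paper):

    ```python
    # Zero-dimensional EBM: S0 * (1 - albedo) / 4 = sigma * T**4
    SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
    S0 = 1361.0       # solar constant, W m^-2
    ALBEDO = 0.3      # planetary albedo

    T_eff = (S0 * (1 - ALBEDO) / (4 * SIGMA)) ** 0.25
    print(f"effective emission temperature: {T_eff:.1f} K")   # ~255 K
    ```

    Everything else in the hierarchy, up to coupled general circulation models, adds structure (dimensions, dynamics, parameterized sub-grid processes) to this same budget.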


    Data management and quality assurance for an international project: the Indo-US Cross-National Dementia Epidemiology Study

    INTERNATIONAL JOURNAL OF GERIATRIC PSYCHIATRY, Issue 6 2002
    Rajesh Pandav
    Abstract Background Data management and quality assurance play a vital but often neglected role in ensuring high quality research, particularly in collaborative and international studies. Objective A data management and quality assurance program was set up for a cross-national epidemiological study of Alzheimer's disease, with centers in India and the United States. Methods The study involved (a) the development of instruments for the assessment of elderly illiterate Hindi-speaking individuals; and (b) the use of those instruments to carry out an epidemiological study in a population-based cohort of over 5000 persons. Responsibility for data management and quality assurance was shared between the two sites. A cooperative system was instituted for forms and edit development, data entry, checking, transmission, and further checking to ensure that quality data were available for timely analysis. A quality control software program (CHECKS) was written expressly for this project to ensure the highest possible level of data integrity. Conclusions This report addresses issues particularly relevant to data management and quality assurance at developing country sites, and to collaborations between sites in developed and developing countries. Copyright © 2002 John Wiley & Sons, Ltd. [source]


    Methodological issues in online data collection

    JOURNAL OF ADVANCED NURSING, Issue 5 2007
    Mary Ann Cantrell
    Abstract Aim. This paper is a report of a study to evaluate the use of an online data collection method to survey early survivors of childhood cancer about their physical and psychosocial characteristics and health-related quality of life. Background. A major advantage in conducting web-based nursing research is the ability to involve participants who are challenging to study because of their small numbers or inaccessibility because of geographic location. As paediatric oncology patients and early survivors of childhood cancer are often not easily accessible because of their small numbers at single institutions, web-based research methods have been proposed as a potentially effective approach to collect data in studies involving these clinical populations. Method. Guided by published literature on using the Internet for data collection, an online protocol was developed; this included construction of a website, development of a homepage and interactive HyperText Markup Language pages and the posting of the study link on various websites. Data collection occurred over a 6-month period between December 2005 and May 2006. Findings. Despite using strategies in conducting online research cited in published literature, the recruitment of subjects was very prolonged and the volume of missing data among many respondents excluded them from the study and created bias within the study's results. Conclusion. Web-based, online data collection methods create opportunities to conduct research globally, especially among difficult-to-access populations. However, web-based research requires careful consideration of how the study will be advertised and how data will be collected to ensure high quality data and validity of the findings. [source]


    Studying complex caring interfaces: key issues arising from a study of multi-agency rehabilitative care for people who have suffered a stroke

    JOURNAL OF CLINICAL NURSING, Issue 3 2002
    DAVINA ALLEN BA
    • Ensuring 'seamless' health and social services provision has been a concern of policy makers for many years, but our understanding of this complex system of work remains underdeveloped.
    • This article reports selected findings from a series of ethnographic case studies of health and social services provision to adults recovering from a first acute stroke.
    • Flexible working, the need for a lead professional and the transition from hospital to home are themes considered.
    • The need for high quality data in order to develop our existing understanding of complex caring interfaces is underlined. [source]


    How biased are estimates of extinction probability in revisitation studies?

    JOURNAL OF ECOLOGY, Issue 5 2006
    MARC KÉRY
    Summary 1. Extinction is a fundamental topic for population ecology and especially for conservation and metapopulation biology. Most empirical studies on extinction resurvey historically occupied sites and estimate extinction probability as the proportion of sites where a species is no longer detected. Possible non-detection of surviving populations is usually not accounted for, which may result in extinction probabilities that are overestimated. 2. As part of a large revisitation study in north-east Switzerland, 376 sites with historically known occurrences of a total of 11 plant species 80–100 years ago were visited by two independent observers. Based on typical population size, ramet size and plant architecture, we judged six species as easy to find and five species as hard to find. Using capture–recapture methods to separate non-detection from true extinction, we gauged the bias of extinction probability estimates that do not account for non-detection. 3. When non-detection was not accounted for, a single visit resulted in an average estimate of population extinction probability of 0.49 (range 0.27–0.67). However, the mean detection probability of a surviving population during a single visit had an estimated average of only 0.81 (range 0.57–1). Consequently, accounting for non-detection resulted in extinction probability estimates ranging between 0.09 and 0.61 (mean 0.36). Based on a single survey, our revisitation study would have overestimated the extinction rate on average by 11 percentage points (range 5–22%) or by 59% (range 0–250%) relative to the estimated true value. 4. A simple binomial argument enables the calculation of the minimum required number of visits to detect a surviving population with high probability (e.g. 95%). For the easy to find species, approximately two visits would be required to find most of the surviving populations, whereas up to four visits would be required for the hard to find species. 5. In revisitation studies, only repeated revisits allow the separation of extinction from simple non-detection. Unless corrected for possible non-detection, extinction probability may be strongly overestimated, and hence some control for non-detection is desirable at least in a subset of species/sites in revisitation studies. These issues are also relevant to the estimation of extinction in metapopulation studies and to the collection of quality data for habitat and distribution models. [source]
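    Point 4's binomial argument is worth making explicit: if a single visit detects a surviving population with probability p, then n independent visits all miss it with probability (1 − p)^n, so the minimum n satisfies (1 − p)^n ≤ 0.05 for 95% detection. Using the abstract's own detection probabilities reproduces its two- and four-visit recommendations:

    ```python
    import math

    def min_visits(p, target=0.95):
        """Smallest n with (1 - p)**n <= 1 - target."""
        return math.ceil(math.log(1 - target) / math.log(1 - p))

    print(min_visits(0.81))  # easy-to-find species -> 2 visits
    print(min_visits(0.57))  # hard-to-find species -> 4 visits
    ```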


    Firm reputation and applicant pool characteristics

    JOURNAL OF ORGANIZATIONAL BEHAVIOR, Issue 6 2003
    Daniel B. Turban
    Scholars have suggested that a firm's reputation can provide it with a competitive advantage by attracting more, and possibly higher-caliber, applicants. No research has actually investigated this relationship, however, in large part because researchers have not assessed applicant pool characteristics but instead have measured applicants' intentions. Therefore, we conducted two studies to investigate whether organizational reputation influenced the number and the quality of applicants actually seeking positions with firms. Company reputation was operationalized using two different published reputation measures, and applicant quality data were obtained from career services offices at business schools at two universities. Results from both studies supported the previously untested belief that firms with better reputations attract more applicants. Furthermore, some evidence suggested that firms with better reputations could select higher-quality applicants. Copyright © 2003 John Wiley & Sons, Ltd. [source]


    Case study: a maintenance practice used with real-time telecommunications software

    JOURNAL OF SOFTWARE MAINTENANCE AND EVOLUTION: RESEARCH AND PRACTICE, Issue 2 2001
    Miroslav Popović
    Abstract In this paper we present a case study of the software maintenance practice that has been successfully applied to real-time distributed systems, which are installed and fully operational in Moscow, St. Petersburg, and other cities across Russia. We concentrate on the software maintenance process, including customer request servicing, in-field error logging, the role of the information system, software deployment, and software quality policy, and especially the software quality prediction process. In this case study, the prediction process is shown to be integral and one of the most important parts of the software maintenance process. We include a software quality prediction procedure overview and an example of the actual practice. The quality of the new software update is predicted on the basis of the current update's quantity metrics data and quality data, and the new update's quantity metrics data. For management, this forecast aids software maintenance efficiency and cost reduction. For practitioners, the most useful result presented is the process for determining the value for the break point. We end this case study with five lessons learned. Copyright © 2001 John Wiley & Sons, Ltd. [source]


    Upland Controls on the Hydrological Functioning of Riparian Zones in Glacial Till Valleys of the Midwest

    JOURNAL OF THE AMERICAN WATER RESOURCES ASSOCIATION, Issue 6 2007
    P. Vidon
    Abstract: Identifying relationships between landscape hydrogeological setting, riparian hydrological functioning and riparian zone sensitivity to climate and water quality changes is critical in order to best use riparian zones as best management practices in the future. In this study, we investigate water table dynamics, water flow path and the relative importance of precipitation, deep ground water (DG) and seep water as sources of water to a riparian zone in a deeply incised glacial till valley of the Midwest. Data indicate that water table fluctuations are strongly influenced by soil texture and to a lesser extent by upland sediment stratigraphy producing seeps near the slope bottom. The occurrence of till in the upland and at 1.7-2 m in the riparian zone contributes to maintaining flow parallel to the ground surface at this site. Lateral ground-water fluxes at this site, with a steep topography in the upland (16%) and loam soil near the slope bottom, are small (<10 l/d/m stream length) and intermittent. A shift in flow path from a lateral direction to a down-valley direction is observed in the summer despite the steep concave topography and the occurrence of seeps at the slope bottom. Principal component and discriminant analysis indicate that riparian water is most similar to seep water throughout the year and that DG originating from imbedded sand and gravel layers in the lower till unit is not a major source of water to riparian zones in this setting. Water quality data and the riparian zone's dependence on seep water for recharge suggest that sites in this setting may be highly sensitive to changes in precipitation and water quality in the upland in the future. A conceptual framework describing the hydrological functioning of riparian zones in this setting is presented to generalize the findings of this study. [source]


    Oyster Crassostrea virginica Spat Settlement as it Relates to the Restoration of Fish River Reef in Mobile Bay, Alabama

    JOURNAL OF THE WORLD AQUACULTURE SOCIETY, Issue 4 2000
    Imad G. Saoud
    Spat collectors at the reefs were replaced every 2 wk and spat-set was estimated as the number of oysters per square meter per day. Water quality data at Fish River Reef were monitored using remote sensors. Spat-set data revealed significant variation between the four sites and between the 2 yr. Spat settlement was 5 to 10 times greater at the other three reefs than at Fish River Reef. Dates and intensity of oyster settlement at Fish River Reef were different from dates and intensity of oyster settlement at Shell Bank Reef, both on the eastern side of the bay. However, settlement was similar between Cedar Point Reef and White House Reef, both on the western side of the bay. Spat set appears to occur 3 wk after a rapid decline in water temperature, provided adequate oxygen concentrations are present at the time of settlement. Data collected suggest that intensity of settlement at Fish River Reef is considerably less than at other reefs in this study but could be adequate to reestablish the reef, if cultch and environmental conditions are suitable. The data also suggest that the source of larval oysters at Fish River Reef is different from the source of larval oysters at the other sites tested in the present study. [source]


    Modelling the energy dependencies of X-ray quasi-periodic oscillations in accreting compact objects

    MONTHLY NOTICES OF THE ROYAL ASTRONOMICAL SOCIETY, Issue 3 2005
    Piotr T.
    ABSTRACT We have constructed models of the quasi-periodic variability of X-ray emission from accreting compact objects. Assuming a general scenario of a propagation model of variability, with inverse Compton upscattering as the emission mechanism, we have considered a number of cases for the periodic modulation: modulation of the plasma heating rate, of the cooling rate by external soft photons and of the amplitude of the reprocessed component. We have computed various observational characteristics which can be compared to good quality data. These include Fourier-frequency resolved spectra and the results of cross-correlation analysis between light curves at different energies. Each model of modulation predicts specific observational signatures, which help in identifying the physical processes driving quasi-periodic oscillation emission in accreting sources. [source]


    Cross-spectral analysis of the X-ray variability of Markarian 421

    MONTHLY NOTICES OF THE ROYAL ASTRONOMICAL SOCIETY, Issue 2 2002
    Y. H. Zhang
    ABSTRACT Using the cross-spectral method, we confirm the existence of the X-ray hard lags discovered with the cross-correlation function technique during a large flare of Mrk 421 observed with BeppoSAX. For the 0.1–2 versus 2–10 keV light curves, both methods suggest sub-hour hard lags. In the time domain, the degree of hard lag, i.e. the amplitude of the 3.2–10 keV photons lagging the lower energy ones, tends to increase with the decreasing energy. In the Fourier frequency domain, by investigating the cross-spectra of the 0.1–2/2–10 keV and the 2–3.2/3.2–10 keV pairs of light curves, the flare also shows hard lags at the lowest frequencies. However, with the present data, it is impossible to constrain the dependence of the lags on frequencies even though the detailed simulations demonstrate that the hard lags at the lowest frequencies probed by the flare are not an artefact of sparse sampling, Poisson and red noise. As a possible interpretation, the implication of the hard lags is discussed in the context of the interplay between the (diffusive) acceleration and synchrotron cooling of relativistic electrons responsible for the observed X-ray emission. The energy-dependent hard lags are in agreement with the expectation of an energy-dependent acceleration time-scale. The inferred magnetic field (B ≈ 0.11 G) is consistent with the value inferred from the spectral energy distributions of the source. Future investigations with higher quality data that show whether or not the time lags are energy-/frequency-dependent will provide a new constraint on the current models of the TeV blazars. [source]
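    The cross-spectral method itself is compact: the phase of the cross-spectrum between two light curves, divided by 2πf, gives the time lag at each Fourier frequency. A synthetic sketch with an injected hard lag (the bin size and lag are arbitrary assumptions, not the Mrk 421 values):

    ```python
    import numpy as np

    dt, n = 100.0, 4096                       # bin size (s) and number of bins
    rng = np.random.default_rng(0)
    soft = rng.normal(size=n)
    hard = np.roll(soft, 5) + 0.1*rng.normal(size=n)   # hard band lags by 5 bins

    f = np.fft.rfftfreq(n, d=dt)[1:]                   # drop the zero frequency
    cross = np.fft.rfft(soft)[1:] * np.conj(np.fft.rfft(hard)[1:])
    lag = np.angle(cross) / (2*np.pi*f)                # positive: hard lags soft
    print(lag[:5] / dt)                                # ~5 bins at low frequencies
    ```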


    Interventions for preventing obesity in childhood: a systematic review

    OBESITY REVIEWS, Issue 3 2001
    Abstract Background The prevalence of obesity and overweight is increasing worldwide. Obesity in children impacts on their health in both the short and long term. Obesity prevention strategies are poorly understood. Objective To assess the effectiveness of interventions designed to prevent obesity in childhood. Search strategy Electronic databases were searched from January 1985 to October 1999. Selection criteria Data from randomized control trials and non-randomized trials with a concurrent control group were included. A priori, studies with a follow-up of 1 year minimum were selected; however, this was subsequently amended to include studies with a minimum follow-up of three months. Data collection & analysis Two reviewers independently extracted data and assessed study quality. Main results Seven studies were included, three long-term (>1 year) and four short-term (>3 months and <1 year). The studies included were diverse in terms of study design and quality, target population, theoretical underpinning of intervention approach, and outcome measures. As such, it was not appropriate to combine study findings using statistical methods. Conclusions Two of the long-term studies (one focused on dietary education and physical activity vs. control, and the other only on dietary education vs. control) resulted in a reduction in the prevalence of obesity, but the third, which focused on dietary education and physical activity, found no effect. Of the four short-term studies, three focused simply on physical activity/reduction of sedentary behaviours vs. control. Two of these studies resulted in a reduction in the prevalence of obesity in intervention groups compared with control groups, and another study found a non-significant reduction. The fourth study focused on dietary education and physical activity, and did not find an effect on obesity, but did report a reduction in fat intake. Overall, the findings of the review suggest that there is currently limited quality data on the effectiveness of obesity prevention programmes and as such no generalizable conclusions can be drawn. The need for well-designed studies that examine a range of interventions remains a priority. [source]


    Improving the precision of cotton performance trials conducted on highly variable soils of the southeastern USA coastal plain

    PLANT BREEDING, Issue 6 2007
    B. T. Campbell
    Abstract Reliable agronomic and fibre quality data generated in Upland cotton (Gossypium hirsutum L.) cultivar performance trials are highly valuable. The most common strategy used to generate reliable performance trial data uses experimental design to minimize experimental error resulting from spatial variability. However, an alternative strategy uses a posteriori statistical procedures to account for spatial variability. In this study, the efficiency of the randomized complete block (RCB) design and nearest neighbour adjustment (NNA) were compared in a series of cotton performance trials conducted in the southeastern USA to identify the efficiency of each in minimizing experimental error for yield, yield components and fibre quality. In comparison to the RCB, relative efficiency of the NNA procedure varied amongst traits and trials. Results show that experimental analyses, depending on the trait and selection intensity employed, can affect cultivar or experimental line selections. Based on this study, we recommend researchers conducting cotton performance trials on variable soils consider using NNA or other spatial methods to improve trial precision. [source]
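    One simple member of the nearest-neighbour-adjustment family corrects each plot by the mean residual of its immediate neighbours, on the logic that adjacent plots share soil effects. A hedged sketch for a single row of plots (the data and adjustment coefficient are hypothetical, and the trials' actual spatial analysis was likely more elaborate):

    ```python
    import numpy as np

    def nna_adjust(yields, entry_means, beta=0.5):
        """Subtract a fraction of the mean neighbour residual from each plot."""
        resid = yields - entry_means                    # plot-level residuals
        neighbour = np.empty_like(resid)
        neighbour[1:-1] = (resid[:-2] + resid[2:]) / 2  # interior plots
        neighbour[0], neighbour[-1] = resid[1], resid[-2]  # edge plots
        return yields - beta * neighbour                # remove local soil trend

    yields = np.array([2.1, 2.4, 3.0, 3.3, 2.8])       # observed plot yields
    entry_means = np.array([2.5, 2.5, 3.0, 3.0, 2.7])  # cultivar means
    print(nna_adjust(yields, entry_means))
    ```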


    Outdoor exposure to airborne polycyclic organic matter and adverse reproductive outcomes: A pilot study

    AMERICAN JOURNAL OF INDUSTRIAL MEDICINE, Issue 3 2001
    Zdravko P. Vassilev MD
    Abstract Background To investigate the association between outdoor airborne polycyclic organic matter (POM) and adverse reproductive outcomes in New Jersey, we used a cross-sectional design combining air quality data from the U.S. EPA Cumulative Exposure Project and individual data on pregnancy outcomes from birth and fetal death certificates at the census tract level. Methods After excluding plural births and chromosomal anomalies, 221,406 live births and 1,591 fetal deaths registered in New Jersey during the years 1990 and 1991 were included. The exposure estimates were derived from modeled average POM concentrations for each census tract in the state. Results After adjustment for potential confounders, the odds ratio (OR) for very low birth weight for the highest exposure group compared to the lowest exposure group was 1.31 (95% CI 1.15–1.51); among term births, high POM exposure was associated with low birth weight (OR = 1.31, 95% CI 1.21–1.43), with fetal death (OR = 1.19, 95% CI 1.02–1.39) and with premature birth (OR = 1.25, 95% CI 1.19–1.31). The univariate stratified analyses suggested effect modification of all observed associations by maternal alcohol consumption. Conclusions This study found associations between outdoor exposure to modeled average airborne POM and several adverse pregnancy outcomes. The data and methods utilized in this pilot study may be useful for identifying hazardous air pollutants requiring in-depth investigation. Am. J. Ind. Med. 40:255–262, 2001. © 2001 Wiley-Liss, Inc. [source]
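    For readers unused to the reporting format, an odds ratio and its 95% confidence interval come straight from a 2×2 exposure-outcome table. A sketch with hypothetical counts (not the study's data):

    ```python
    import math

    a, b = 120, 900   # high POM exposure: cases, non-cases
    c, d = 100, 980   # low POM exposure: cases, non-cases

    or_hat = (a*d) / (b*c)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)          # SE of log(OR)
    lo, hi = (math.exp(math.log(or_hat) + s*1.96*se_log) for s in (-1, 1))
    print(f"OR = {or_hat:.2f} (95% CI {lo:.2f}-{hi:.2f})")
    ```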


    Popitam: Towards new heuristic strategies to improve protein identification from tandem mass spectrometry data

    PROTEINS: STRUCTURE, FUNCTION AND BIOINFORMATICS, Issue 6 2003
    Patricia Hernandez
    Abstract In recent years, proteomics research has gained importance due to increasingly powerful techniques in protein purification, mass spectrometry and identification, and due to the development of extensive protein and DNA databases from various organisms. Nevertheless, current identification methods from spectrometric data have difficulties in handling modifications or mutations in the source peptide. Moreover, they have low performance when run on large databases (such as genomic databases), or with low quality data, for example due to bad calibration or low fragmentation of the source peptide. We present a new algorithm dedicated to automated protein identification from tandem mass spectrometry (MS/MS) data by searching a peptide sequence database. Our identification approach shows promising properties for solving the specific difficulties enumerated above. It consists of matching theoretical peptide sequences issued from a database with a structured representation of the source MS/MS spectrum. The representation is similar to the spectrum graphs commonly used by de novo sequencing software. The identification process involves the parsing of the graph in order to emphasize relevant sections for each theoretical sequence, and leads to a list of peptides ranked by a correlation score. The parsing of the graph, which can be a highly combinatorial task, is performed by a bio-inspired algorithm called the Ant Colony Optimization algorithm. [source]
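    The spectrum-graph representation is easy to picture: each fragment peak becomes a node, and two nodes are joined when their mass difference matches an amino acid residue mass. A toy sketch with standard monoisotopic residue masses and hypothetical peaks:

    ```python
    # Spectrum-graph edges: peak pairs whose mass difference matches a residue.
    RESIDUES = {"G": 57.02146, "A": 71.03711, "S": 87.03203, "V": 99.06841}
    TOL = 0.02  # mass tolerance, Da

    peaks = [175.119, 246.156, 333.188, 404.225]   # hypothetical fragment masses
    edges = [(m1, m2, aa)
             for i, m1 in enumerate(peaks)
             for m2 in peaks[i+1:]
             for aa, mass in RESIDUES.items()
             if abs((m2 - m1) - mass) <= TOL]
    print(edges)   # consecutive peaks linked by A, S, A
    ```

    Parsing such a graph for the best path against each candidate sequence is the combinatorial task the Ant Colony Optimization algorithm is brought in to tame.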


    A Review of the Cluster Survey Sampling Method in Humanitarian Emergencies

    PUBLIC HEALTH NURSING, Issue 4 2008
    Shaun K. Morris
    ABSTRACT Obtaining quality data in a timely manner from humanitarian emergencies is inherently difficult. Conditions of war, famine, population displacement, and other humanitarian disasters limit the ability to survey widely. These limitations hold the potential to introduce fatal biases into study results. The cluster sample method is the most frequently used technique to draw a representative sample in these types of scenarios. A recent study utilizing the cluster sample method to estimate the number of excess deaths due to the invasion of Iraq has generated much controversy and confusion about this sampling technique. Although subject to certain intrinsic limitations, cluster sampling allows researchers to utilize statistical methods to draw inferences regarding entire populations when data gathering would otherwise be impossible. [source]
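    The standard design in this setting is two-stage: clusters (e.g. villages) drawn with probability proportional to size, then a fixed number of households within each. A sketch of the classic 30 × 7 layout with synthetic population data (cluster and household counts vary by survey):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    village_sizes = rng.integers(200, 5000, size=120)    # households per village

    # Stage 1: 30 clusters, probability proportional to size (PPS)
    p = village_sizes / village_sizes.sum()
    clusters = rng.choice(village_sizes.size, size=30, replace=False, p=p)

    # Stage 2: simple random sample of 7 households within each cluster
    sample = {int(v): rng.choice(village_sizes[v], size=7, replace=False)
              for v in clusters}
    print(len(sample), "clusters x 7 households =", 7*len(sample), "interviews")
    ```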


    Isobaric metabolite interferences and the requirement for close examination of raw data in addition to stringent chromatographic separations in liquid chromatography/tandem mass spectrometric analysis of drugs in biological matrix

    RAPID COMMUNICATIONS IN MASS SPECTROMETRY, Issue 13 2008
    Zhengyin Yan
    In addition to matrix effects, common interferences observed in liquid chromatography/tandem mass spectrometry (LC/MS/MS) analyses can be caused by the response of drug-related metabolites to the multiple reaction monitoring (MRM) channel of a given drug, as a result of in-source reactions or decomposition of either phase I or II metabolites. However, it has been largely ignored that, for some drugs, metabolism can lead to the formation of isobaric or isomeric metabolites that exhibit the same MRM transitions as parent drugs. The present study describes two examples demonstrating that interference caused by isobaric or isomeric metabolites is a practical issue in analyzing biological samples by LC/MS/MS. In the first case, two sequential metabolic reactions, demethylation followed by oxidation of a primary alcohol moiety to a carboxylic acid, produced an isobaric metabolite that exhibits an MRM transition identical to the parent drug. Because the drug compound was rapidly metabolized in rats and completely disappeared in plasma samples, the isobaric metabolite appeared as a single peak in the total ion current (TIC) trace and could easily be quantified as the drug, since it eluted at a retention time very close to that of the drug in a 12-min LC run. In the second example, metabolism via the ring-opening of a substituted isoxazole moiety led to the formation of an isomeric product that showed an almost identical collision-induced dissociation (CID) MS spectrum to that of the original drug. Because the two components were co-eluted, the isomeric product could be mistakenly quantified and reported by data processing software as the parent drug if the TIC trace was not carefully inspected. Nowadays, all LC/MS data are processed by computer software in a highly automated fashion, and some analysts may spend much less time visually examining raw TIC traces than they used to. The two examples described in this article remind us that quality data require both adequate chromatographic separations and close examination of raw data in LC/MS/MS analyses of drugs in biological matrices. Copyright © 2008 John Wiley & Sons, Ltd. [source]
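    The first example's arithmetic is worth checking explicitly: losing CH2 and then oxidizing CH2OH to COOH shifts the mass by amounts that nearly cancel, leaving a metabolite at the same nominal mass as the parent. A quick check with monoisotopic masses:

    ```python
    H, C, O = 1.007825, 12.000000, 15.994915   # monoisotopic masses, Da

    demethylation = -(C + 2*H)    # lose CH2: -14.0157 Da
    oxidation = O - 2*H           # primary alcohol -> carboxylic acid: +13.9793 Da
    net = demethylation + oxidation
    print(f"net shift: {net:+.4f} Da")   # ~ -0.036 Da: same nominal (unit) mass,
                                         # hence the shared MRM precursor mass
    ```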


    Field analyses of RDX and TCE in groundwater during a GCW pilot study

    REMEDIATION, Issue 1 2002
    Andrew C. Elmore
    Contaminant concentrations in groundwater are typically analyzed using traditional laboratory analytical procedures approved by the Environmental Protection Agency (EPA) or state regulatory agencies. The use of off-site laboratories provides very high-quality water quality data at a relatively high cost in terms of time and money. Yet there are many instances when it is desirable to have water quality data measured in the field. The field methods for measuring water quality typically cost much less than the corresponding laboratory methods. However, the usability of the field data may be uncertain when the results are qualitatively compared to duplicate laboratory results. Groundwater samples collected during a groundwater circulation well pilot study were analyzed using field kits to measure concentrations of trichloroethylene (TCE) and the explosive compound known as RDX. A subset of the samples was split for duplicate laboratory analysis. Linear regression analysis and relative percent difference analysis were performed on the duplicate results to evaluate the comparability of the field and laboratory data. The data analyses were also used to evaluate the concept that the field kits were more accurate for specific concentration ranges, as well as the concept the field kit results would improve as field personnel gained experience with the field analysis procedures. © 2002 Wiley Periodicals, Inc. [source]
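    Both comparability checks used here are one-liners: a relative percent difference (RPD) per duplicate pair, and a regression of field on laboratory results. A sketch with hypothetical concentration pairs (not the pilot-study data):

    ```python
    import numpy as np
    from scipy.stats import linregress

    lab = np.array([12.0, 30.5, 55.0, 80.2, 150.0])    # e.g. TCE, ug/L
    field = np.array([10.8, 33.0, 50.1, 86.0, 141.0])  # field-kit duplicates

    rpd = 100 * np.abs(field - lab) / ((field + lab) / 2)   # per-pair RPD, %
    fit = linregress(lab, field)                            # field vs lab
    print("RPD (%):", np.round(rpd, 1))
    print(f"slope = {fit.slope:.2f}, r^2 = {fit.rvalue**2:.3f}")
    ```

    A slope near 1 with high r² and RPDs inside the project's acceptance window would support using the field kits for screening decisions.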


    Telemetric Intracavernosal and Intraspongiosal Pressure Monitoring

    THE JOURNAL OF SEXUAL MEDICINE, Issue 10 2008
    Rany Shamloul MD
    ABSTRACT Introduction. Despite the major breakthroughs basic research in erectile physiology has experienced in the last decade, most of the methods used for quantitative assessment of erectile function in longitudinal studies suffer many drawbacks. Objective. This review will focus on radiotelemetric assessment of intracavernosal pressure (ICP) and intraspongiosal pressure (ISP) regarding the technique, data collection, interpretation, and overall benefits. Results. Telemetric recording of ICP and ISP allows for qualitative and quantitative assessment of erectile responses in experimental animals, a characteristic that is not possible using other techniques. This technique has many advantages that can collectively lead to production of high quality data regarding erection. The system suffers two drawbacks: its high cost and the need for surgical implantation of the transmitter. Conclusion. The use of telemetric monitoring of ICP and ISP carries many advantages that will, hopefully, establish this technique as the gold standard method for assessment of erectile responses in the near future. Shamloul R. Telemetric intracavernosal and intraspongiosal pressure monitoring. J Sex Med 2008;5:2246–2252. [source]


    Improving the Quality of Long-Term Care with Better Information

    THE MILBANK QUARTERLY, Issue 3 2005
    VINCENT MOR
    Publicly reporting information stimulates providers' efforts to improve the quality of health care. The availability of mandated, uniform clinical data in all nursing homes and home health agencies has facilitated the public reporting of comparative quality data. This article reviews the conceptual and technical challenges of applying information about the quality of long-term care providers and the evidence for the impact of information-based quality improvement. Quality "tools" have been used despite questions about the validity of the measures and their use in selecting providers or offering them bonus payments. Although the industry now realizes the importance of quality, research still is needed on how consumers use this information to select providers and monitor their performance and whether these efforts actually improve the outcomes of care. [source]


    Investigation of the temporal effects of spawning season and maternal and paternal differences on egg quality in Atlantic cod Gadus morhua L. broodstock

    AQUACULTURE RESEARCH, Issue 14 2009
    Dounia Hamoutene
    Abstract A better understanding of the parameters affecting egg quality and larval survival is of importance for continued development of cod broodstock and efficient husbandry practices. Decision tree analysis (DTA) was applied to analyse 3 years of egg quality data in an effort to extract the most important variables (i.e. predictors) in explaining differences in egg quality. The effect of three predictors (spawning time and maternal and paternal differences) on early cleavage pattern parameters, egg diameters, and fertilization and hatching rates was studied, showing that females are the dominant variable and that time has a limited and inconsistent impact on the data. When using maternal differences, paternal differences and batch number (instead of spawning time) as predictors, the results confirm that no particular relationship is found between batch order (i.e. order in time) and egg quality. Moreover, batches with a higher egg quality show consistency in the parameters assessed (i.e. batches with higher rates of normality in any parameter tend to be normal for other parameters). This is confirmed by the significant correlations found between cleavage parameters. Our results highlight that spawning time is of less importance than female parent contribution in ensuring high rates of fertilization and larval hatch, and maximizing general egg quality. [source]
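    In spirit, the DTA reduces to fitting a tree on the three predictors and reading off their importances. A sketch with synthetic data built so that the female parent dominates, mirroring the reported result (scikit-learn's tree is a stand-in for whatever DTA software the study used):

    ```python
    import numpy as np
    from sklearn.tree import DecisionTreeRegressor

    rng = np.random.default_rng(0)
    n = 600
    female = rng.integers(0, 10, n)    # female parent ID
    male = rng.integers(0, 10, n)      # male parent ID
    week = rng.integers(0, 20, n)      # spawning week
    hatch = 0.5 + 0.04*female + rng.normal(0, 0.05, n)   # female-driven quality

    X = np.column_stack([female, male, week])
    tree = DecisionTreeRegressor(max_depth=4, random_state=0).fit(X, hatch)
    print(dict(zip(["female", "male", "week"],
                   tree.feature_importances_.round(2))))   # female dominates
    ```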