Quality Standards

Selected Abstracts


    The use of technical knowledge in European water policy-making

    ENVIRONMENTAL POLICY AND GOVERNANCE, Issue 5 2010
    Perry J. M. van Overveld
Abstract Environmental policy-making often involves a mix of technical knowledge, normative choice and uncertainty. Numerous actors, each with their own distinct objectives, are involved in these policy-making processes. One question these actors face is how they can effectively communicate their technical knowledge and represent their interests in policy-making. The objective of this paper is to identify the factors that influence the use of technical knowledge and its impact on decision-making in the European Union. This is done for the case of water policy-making for organic micropollutants, such as pesticides and pharmaceuticals. These pollutants enter the surface water in many ways and, although concentrations are low, adverse effects cannot be ruled out. Via the EU Water Framework Directive, legislation has been developed to reduce the emissions of pollutants that pose a risk to ecology or public health. Using the advocacy coalition framework, the formal EU decision-making processes are analyzed for the identification of priority pollutants (Priority Substances) and the derivation of maximum allowable concentrations (Environmental Quality Standards). To enable a detailed analysis, the focus is on three specific micropollutants that pose health risks via drinking water supply. The findings show the extent to which actors can influence the decision-making process with technical knowledge. Early involvement in the drafting process that is led by the European Commission is important to influence decision-making outcomes. For this, organizational capacity in coalitions to mobilize and coordinate the required targeted contribution of technical knowledge is crucial. Copyright © 2010 John Wiley & Sons, Ltd and ERP Environment. [source]


    Evidence-Based Medicine and Clinical Trials in Pain Practice and Orthopedics

    PAIN PRACTICE, Issue 4 2005
    Ludger Gerdesmeyer MD
    Abstract: Medical practices should be based on scientific findings pursuant to the rules of evidence-based medicine. Quality standards for interventional pain therapy and orthopedic clinical studies have been lacking. As a result, the efficacy of many forms of therapy is insufficiently documented, making the level of evidence low. This article identifies common deficiencies in the conduct of clinical trials, as well as limitations in conducting randomized controlled studies. Recommendations for improvement are provided. The discussion provides the clinically active physician with interpretation aids for the evaluation of meta-analyses, supports personal evidence-based decisions, and reviews the most important principles for planning and conducting of experimental clinical studies. Current examples in the literature verify the implementation of these principles and present current findings in accordance with evidence-based medicine (EBM) criteria. In spite of an increasing emergence of EBM-based studies, we conclude that the number of well-designed, high quality, controlled studies conducted in accordance with the guidelines of Good Clinical Practice examining interventional pain therapy and orthopedic clinical studies remains unacceptably low. [source]


    A novel audit model for assessing quality in non-regulated research

    QUALITY ASSURANCE JOURNAL, Issue 2 2009
    S. G. Volsen
Abstract The need for quality standards in non-regulated research is a matter of considerable current debate. Whilst a number of such guidelines have been developed over recent years, their successful implementation remains a challenge to all. In order to assess whether research standards are indeed improving on the bench following the instigation of such a quality system, a question posed by senior management and scientists alike, an independent compliance programme is required. However, given the lack of predicate rules, naivety to audit process and general sensitivity to external scrutiny within the scientific ranks, work in this 'Grey Area' generates high exposure for the conventional GLP, GCP or GMP auditor. We have developed, tried and tested a highly effective, novel audit model for assessing the quality of non-regulated research. This simple system can be applied successfully irrespective of scientific discipline or field. Whilst common principles will always apply during any quality system audit, the refinements and idiosyncrasies we describe here will, as we have found, help underpin success. Our intentional assumption is that this is a first-time endeavour for the quality professional. Copyright © 2009 John Wiley & Sons, Ltd. [source]


    Factors influencing the challenges of modelling and treating fecal indicator bacteria in surface waters

    ECOHYDROLOGY, Issue 4 2009
    Cristiane Q. Surbeck
Abstract In the United States, thousands of creeks, rivers, and coastal zones are listed as impaired in the Clean Water Act's 303(d) list. The number one general cause of impairments is denoted as 'pathogens', which can include known pathogenic organisms or, more commonly, fecal indicator bacteria (FIB), such as fecal coliform bacteria, Escherichia coli, and enterococci bacteria. Despite efforts by water quality managers to reduce FIB in surface waters via treatment, successful and significant reduction of FIB has been difficult to achieve to meet water quality standards. In addition, current efforts to numerically model FIB concentrations in surface waters do not consider many complexities associated with FIB as a pollutant. Reasons for the challenge of treating and modelling FIB are their varied sources and mechanisms of survival and decay in the environment. This technical note addresses this challenge by discussing the nature of FIB, their sources, and their fate and transport mechanisms. Sources of FIB to surface waters include wastewater, stormwater and dry-weather runoff, and animals. Mechanisms of pathogen indicator occurrence in surface waters are transport in stormwater, ecological proliferation, and interaction with sediments. Copyright © 2009 John Wiley & Sons, Ltd. [source]


    Global Standards, Local Realities: Private Agrifood Governance and the Restructuring of the Kenyan Horticulture Industry

    ECONOMIC GEOGRAPHY, Issue 2 2010
    Stefan Ouma
Abstract Over the past decade, private food safety and quality standards have become focal points in the supply chain management of large retailers, reshaping governance patterns in global agrifood chains. In this article, I analyze the relationship between private collective standards and the governance of agrifood markets, using the EUREPGAP/GLOBALGAP standard as a vantage point. I discuss the impact of this standard on the organization of supply chains of fresh vegetables in the Kenyan horticulture industry, focusing on the supply chain relationships and practices among exporters and smallholder farmers. In so doing, I seek to highlight the often-contested nature of the implementation of standards in social fields that are marked by different and distributed principles of evaluating quality, production processes, and legitimate actions in the marketplace. I also reconstruct the challenges and opportunities that exporters and farmers are facing with regard to the implementation of and compliance with standards. Finally, I elaborate on the scope for action that producers and policymakers have under these structures to retain sectoral competitiveness in a global economy of qualities. [source]


    Copper toxicity in relation to surface water-dissolved organic matter: Biological effects to Daphnia magna

    ENVIRONMENTAL TOXICOLOGY & CHEMISTRY, Issue 12 2004
    Kees J.M. Kramer
Abstract Water quality standards for copper are usually stated in total element concentrations. It is known, however, that a major part of the copper can be bound in complexes that are biologically not available. Natural organic matter, such as humic and fulvic acids, are strong complexing agents that may affect the bioavailable copper (Cu2+) concentration. The aim of this study was to quantify the relation between the concentration of dissolved natural organic matter and free Cu2+ in surface waters, and the biological effect, as measured in a standardized ecotoxicological test (48-h median effective concentration [EC50], Daphnia magna, mobility). Six typical Dutch surface waters and an artificial water, ranging from 0.1 to 22 mg/L dissolved organic carbon (DOC), were collected and analyzed quarterly. Chemical speciation modeling was used as supporting evidence to assess bioavailability. The results show clear evidence of a linear relation between the concentration of dissolved organic carbon (in milligrams DOC/L) and the ecotoxicological effect (as effect concentration, EC50, expressed as micrograms Cu/L): 48-h EC50 (Daphnia, mobility) = 17.2 × DOC + 30.2 (r2 = 0.80, n = 22). Except for a brook with atypical water quality characteristics, no differences were observed among water type or season. When ultraviolet (UV) absorption (380 nm) was used to characterize the dissolved organic carbon, a linear correlation was found as well. The importance of the free copper concentration was demonstrated by speciation calculations: In humic-rich waters the free Cu2+ concentration was estimated at ~10^-11 M, whereas in medium to low dissolved organic carbon waters the [Cu2+] was ~10^-10 M. Speciation calculations performed for copper concentrations at the effective concentration level (where the biological effect is considered the same) resulted in very similar free copper concentrations (~10^-8 M Cu) in these surface waters with different characteristics. 
These observations consistently show that the presence of organic matter decreases the bioavailability, uptake, and ecotoxicity of copper in the aquatic environment. It demonstrates that the DOC content must be included in site-specific environmental risk assessment for trace metals (at least for copper). It is the quantification of the effects described that allows policy makers to review the criteria for copper in surface waters. [source]
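The linear relation reported in this abstract can be sketched in a few lines of Python. The coefficients (17.2 and 30.2) come directly from the abstract; the function name and the example DOC values are illustrative assumptions, not part of the original study.

```python
def predicted_ec50_cu(doc_mg_per_l: float) -> float:
    """Predicted 48-h EC50 for Daphnia magna mobility (micrograms Cu/L)
    as a linear function of dissolved organic carbon (mg DOC/L), using
    the regression reported in the abstract (r2 = 0.80, n = 22)."""
    return 17.2 * doc_mg_per_l + 30.2

# Illustrative values spanning the study's 0.1-22 mg/L DOC range:
for doc in (0.1, 5.0, 22.0):
    print(f"DOC = {doc:5.1f} mg/L  ->  EC50 ~ {predicted_ec50_cu(doc):6.1f} ug Cu/L")
```

The upward slope mirrors the paper's conclusion: the more dissolved organic carbon a water contains, the more total copper is needed before a toxic effect appears, because organic matter binds Cu2+ into biologically unavailable complexes.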


    Evaluation of water quality using acceptance sampling by variables

    ENVIRONMETRICS, Issue 4 2003
    Eric P. Smith
Abstract Under section 303(d) of the Clean Water Act, states must identify water segments where loads of pollutants are violating numeric water quality standards. Consequences of misidentification are quite important. A decision that water quality is impaired initiates the total maximum daily load or TMDL planning requirement. Falsely concluding that a water segment is impaired results in unnecessary TMDL planning and pollution control implementation costs. On the other hand, falsely concluding that a segment is not impaired may pose a risk to human health or to the services of the aquatic environment. Because of the consequences, a method is desired that minimizes or controls the error rates. The most commonly applied approach is the Environmental Protection Agency (EPA)'s raw score approach, in which a stream segment is listed as impaired when greater than 10 per cent of the measurements of water quality conditions exceed a numeric criterion. An alternative to the EPA approach is the binomial test that the proportion exceeding the standard is 0.10 or less. This approach uses the number of samples exceeding the criterion as a test statistic, along with the binomial distribution for evaluation and estimation of error rates. Both approaches treat measurements as binary; the values either exceed or do not exceed the standard. An alternative approach is to use the actual numerical values to evaluate the standard. This method is referred to as variables acceptance sampling in the quality control literature. The methods are compared on the basis of error rates. If certain assumptions are met, then the variables acceptance method is superior in the sense that it requires smaller sample sizes to achieve the same error rates as the raw score method or the binomial method. Issues associated with potential problems with environmental measurements and adjustments for their effects are discussed. Copyright © 2003 John Wiley & Sons, Ltd. [source]
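The contrast between the raw-score rule and the binomial test described in this abstract can be illustrated with a small, stdlib-only Python sketch. The 10 per cent listing threshold and the null proportion of 0.10 come from the abstract; the sample counts and function names are hypothetical, chosen only to show how the two rules can disagree.

```python
from math import comb

def raw_score_impaired(n_exceed: int, n_samples: int, threshold: float = 0.10) -> bool:
    """EPA raw-score rule: list the segment as impaired when more than
    10 per cent of samples exceed the numeric criterion."""
    return n_exceed / n_samples > threshold

def binomial_p_value(n_exceed: int, n_samples: int, p0: float = 0.10) -> float:
    """Exact one-sided binomial test: P(X >= n_exceed) when the true
    exceedance proportion is p0, i.e. the segment just meets the standard."""
    return sum(comb(n_samples, k) * p0**k * (1 - p0)**(n_samples - k)
               for k in range(n_exceed, n_samples + 1))

# Hypothetical segment: 4 of 20 samples exceed the numeric criterion.
print(raw_score_impaired(4, 20))          # raw-score rule lists the segment (20% > 10%)
print(round(binomial_p_value(4, 20), 3))  # exceedance count is not significant at 0.05
```

With small samples the raw-score rule lists the segment on the observed proportion alone, while the binomial test asks how likely that count would be if the segment were actually compliant; this is the error-rate difference the paper quantifies, and variables acceptance sampling sharpens it further by using the measurements themselves rather than a binary exceed/not-exceed coding.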


    Trends, challenges and opportunities in power quality research

    EUROPEAN TRANSACTIONS ON ELECTRICAL POWER, Issue 1 2010
    Math H. J. Bollen
Abstract This paper outlines a number of possible research directions in power quality. The introduction of new sources of generation will introduce the need for new research on voltage-magnitude variations, harmonic emission and harmonic resonance. Statistical performance indicators are expected to play an important role in addressing the hosting capacity of the power system for these new sources. The quickly growing amounts of power-quality data call for automatic analysis methods. Advanced signal-processing tools need to be developed and applied to address this challenge. Equipment with an active power-electronic interface generates waveform distortion at higher frequencies than existing equipment. The emission, spread, consequences and mitigation of this distortion require more research emphasis. The growing complexity of the power system calls for remote identification of system events and load transitions. Future DC networks, at different voltage levels, require research on DC power quality alongside AC power quality. Research on methods to describe and analyse time-varying harmonics has applications in a number of the above-mentioned issues. So does the use of hardware-in-the-loop (HIL) and real-time digital simulation. Existing power quality standards should not form a barrier against future research; instead, research should result in improved standards as well as completely new concepts. Examples are: voltage dips in three-phase systems, flicker due to non-incandescent lamps, and voltage variations on the timescale between 1 second and 10 minutes. Altogether, it is concluded in this paper that many important and interesting research challenges and opportunities remain in the power quality area. Copyright © 2009 John Wiley & Sons, Ltd. [source]


    Manufacturing Reputations in Late Eighteenth-Century Birmingham

    HISTORICAL RESEARCH, Issue 181 2000
    Nigel Stirk
    This article examines the importance of local reputation and collaborative commercial politics for the business practices of individuals in industrializing Birmingham. It is suggested that shared ideas about quality standards, free trade and the national interest were instrumental in encouraging businessmen to work together to establish local representative institutions. Furthermore, these normative conceptions of how trade should be conducted reflected particular interpretations of the history of Birmingham and of individual enterprise. It is concluded that the particular geography of a provincial town was central to the application of principles and abstract ideas. [source]


    Flexibility-based competition: Skills and competencies in the new Europe

    HUMAN FACTORS AND ERGONOMICS IN MANUFACTURING & SERVICE INDUSTRIES, Issue 1 2005
    Andrzej K. Kozminski
In this paper the competitive strength and weaknesses of unifying and enlarging Europe in the global economy are examined. The focus is on people at work, their skills, and competencies. The idea of flexibility-based competition is developed implicating product and services portfolios, technologies, volumes, quality standards, distribution networks, and development cycles. Flexibility calls for speed maximizing management and special work force and labor markets characteristics. A new employment policy should change European labor markets making them more flexible and enabling "high-speed management." People able to adjust to flexible labor markets are described as "niche finders." Those who are equipped to excel in such markets and to win the competition game are presented in this paper as "top performers." Educational systems and particularly management education and development have to undergo deep restructuring to meet the challenge. An outline of new management education is provided. © 2005 Wiley Periodicals, Inc. Hum Factors Man 15: 35-47, 2005. [source]


    Effects of urbanization on stream water quality in the city of Atlanta, Georgia, USA,

    HYDROLOGICAL PROCESSES, Issue 20 2009
    Norman E. Peters
Abstract A long-term stream water quality monitoring network was established in the city of Atlanta, Georgia during 2003 to assess baseline water quality conditions and the effects of urbanization on stream water quality. Routine hydrologically based manual stream sampling, including several concurrent manual point and equal width increment sampling, was conducted ~12 times annually at 21 stations, with drainage areas ranging from 3.7 to 232 km2. Eleven of the stations are real-time (RT) stations having continuous measures of stream stage/discharge, pH, dissolved oxygen, specific conductance, water temperature and turbidity, and automatic samplers for stormwater collection. Samples were analyzed for field parameters, and a broad suite of water quality and sediment-related constituents. Field parameters and concentrations of major ions, metals, nutrient species and coliform bacteria among stations were evaluated with respect to watershed characteristics and plausible sources from 2003 through September 2007. Most constituent concentrations are much higher than nearby reference streams. Concentrations are statistically different among stations for several constituents, despite high variability both within and among stations. Routine manual sampling, automatic sampling during stormflows and RT water quality monitoring provided sufficient information about urban stream water quality variability to evaluate causes of water quality differences among streams. Fecal coliform bacteria concentrations of most samples exceeded Georgia's water quality standard for any water-usage class. High chloride concentrations occur at three stations and are hypothesized to be associated with discharges of chlorinated combined sewer overflows, drainage of swimming pool(s) and dissolution and transport during rainstorms of CaCl2, a deicing salt applied to roads during winter storms. 
One stream was affected by dissolution and transport of ammonium alum [NH4Al(SO4)2] from an alum-manufacturing plant; streamwater has low pH (<5), low alkalinity and high metals concentrations. Several trace metals exceed acute and chronic water quality standards and high concentrations are attributed to washoff from impervious surfaces. Published in 2009 by John Wiley & Sons, Ltd. [source]


    Governance in Global Value Chains

    IDS BULLETIN, Issue 3 2001
    John Humphrey
Summaries The concept of 'governance' is central to the global value chain approach. This article explains what it means and why it matters for development research and policy. The concept is used to refer to the inter-firm relationships and institutional mechanisms through which non-market co-ordination of activities in the chain takes place. This co-ordination is achieved through the setting and enforcement of product and process parameters to be met by actors in the chain. In global value chains in which developing country producers typically operate, buyers play an important role in setting and enforcing these parameters. They set these parameters because of the (perceived) risk of producer failure. Product and process parameters are also set by government agencies and international organisations concerned with quality standards or labour and environmental standards. To the extent that external parameter setting and enforcement develop and gain credibility, the need for governance by buyers within the chain will decline. [source]


    Assessment of Internet voice transport with TCP

    INTERNATIONAL JOURNAL OF COMMUNICATION SYSTEMS, Issue 4 2006
    Panagiotis Papadimitriou
Abstract We investigate whether the current best-effort technology of the Internet stands up to the high quality standards of real-time voice communication. More precisely, we assess VoIP quality within the context of transport protocol support and efficiency. Initially, we focus on the supportive role of TCP and UDP from the perspective of VoIP performance. Applying our metric for real-time performance, we reach the outcome that UDP is not always the protocol of choice for VoIP, since it occasionally exhibits inadequate performance. Beyond that, we evaluate a solution-framework based on TCP protocols, which favour real-time traffic. Based on our measurements, we assess the efficiency of the associated congestion control and congestion avoidance mechanisms in terms of VoIP performance. We also study the effect of packet size on protocol behaviour and VoIP quality. Furthermore, we investigate VoIP traffic friendliness, as well as potential tradeoffs between protocol performance and fairness. Copyright © 2006 John Wiley & Sons, Ltd. [source]


    Levels of quality management of blood transfusion services in Europe

    ISBT SCIENCE SERIES: THE INTERNATIONAL JOURNAL OF INTRACELLULAR TRANSPORT, Issue 1 2008
    C. Seidl
The European blood legislation has defined several key quality elements to achieve Good Manufacturing Practice (GMP) in the field of blood transfusion. In recent years, the blood legislation has been in the process of implementation throughout the member states. Following Directive 2002/98/EC, Directive 2005/62/EC has set out further requirements for quality-management systems to be fulfilled by blood establishments. In addition, GMP/Good Laboratory Practice (GLP) and ISO standards are used inter alia by blood establishments. In order to support the implementation of the blood legislation, the European Public Health Work Plan (2005/2007) has cofunded two projects, led by the German Red Cross and supported by the European Blood Alliance, delivering a common European Standard Operating Procedure (SOP) methodology (EU-Q-Blood-SOP) and criteria and standards for the inspection of blood establishments (EUBIS). The EU-SOP manual will assist blood establishments in preparing for the inspection of their services related to the implementation of quality-relevant elements required by the EU Directive 2002/98/EC and its technical annexes. The standards and criteria for inspection of blood establishments will cross-reference existing quality standards to the directive requirements and define requirements for the structure of quality-management systems based on Directive 2002/98/EC and its technical annexes. Based on these requirements, inspection standards and criteria are developed to assist in the independent assessment of quality systems established by individual blood establishments. These assessments are done in relation to the requirements defined by the European Union legislation on blood, in order to safeguard the quality of blood and to achieve continuous improvement of its quality throughout Europe. [source]


    Productive Restructuring and ,Standardization' in Mexican Horticulture: Consequences for Labour

    JOURNAL OF AGRARIAN CHANGE, Issue 2 2010
    HUBERT CARTON DE GRAMMONT
    In this paper we discuss how the establishment of strict quality and food safety norms for horticulture to satisfy the current consumer demands has forced enterprises to invest in modifying their productive processes. In the light of the unavoidable trend in favour of consumers, we analyze the precarious situation of farm workers, a situation that is not in tune with the concept of decent work promoted by the International Labour Organization or with the Social Accountability Standard promoted by the United Nations. We conclude that the enterprises have achieved major progress in productive restructuring to comply with quality standards, but at the expense of their workers' salaries and living and working conditions. This contradiction between the well-being of the consumer and the misery of the worker is a fundamental characteristic explaining the current success of globalized agro-food systems. [source]


    Review of aquaculture, its regulation and monitoring in Scotland

    JOURNAL OF APPLIED ICHTHYOLOGY, Issue 4-5 2000
    A. R. Henderson
Summary The aquaculture industry in Scotland is primarily located on the western and northern coasts of Scotland where geographical and hydrographic conditions suit the species cultured. The regulation and monitoring of the industry has adapted and grown with the industry. Over 10 years, production has increased 10-fold and the efficiency of the industry has improved along with husbandry and management techniques, although major disease problems have occurred. Planning and siting controls have recently been reviewed incorporating new EC legislation on environmental impact assessment. Environmental protection and end product quality are achieved through complex legislation demanding licences to discharge waste products and the application of strict quality standards and targets to both the product and its growing and receiving environment. Monitoring programmes are well established to ensure compliance with the legislation. The complexity of issues the industry now poses for regulation and monitoring has challenged traditional views and required new techniques to be developed, for example, mathematical modelling to set environmental targets for some medicines. A national approach has been needed which will benefit the industry and the regulators and allow focus to be brought to wider issues requiring research and development. [source]


    Covalently linked immunomagnetic separation/adenosine triphosphate technique (Cov-IMS/ATP) enables rapid, in-field detection and quantification of Escherichia coli and Enterococcus spp. in freshwater and marine environments

    JOURNAL OF APPLIED MICROBIOLOGY, Issue 1 2010
    C.M. Lee
Abstract Aims: Developing a rapid method for detection of faecal pollution is among the critical goals set forth by the Environmental Protection Agency in its revision of water quality criteria. The purpose of this study is to devise and test covalently linked antibody-bead complexes for faecal indicator bacteria (FIB), specifically Escherichia coli or Enterococcus spp., in measuring water quality in freshwater and marine systems. Methods and Results: Covalently linked complexes were 58-89% more robust than antibody-bead complexes used in previous studies. Freshwater and marine water samples analysed using the covalently linked immunomagnetic separation/adenosine triphosphate quantification technique (Cov-IMS/ATP) and culture-based methods yielded good correlations for E. coli (R = 0.87) and Enterococcus spp. (R = 0.94), with method detection limits below EPA recreational water quality health standards for single standard exceedances (E. coli, 38 cells per 100 ml; Enterococcus spp., 25 cells per 100 ml). Cov-IMS/ATP correctly classified 87% of E. coli and 94% of Enterococcus spp. samples based on these water quality standards. Cov-IMS/ATP was also used as a field method to rapidly distinguish differential loading of E. coli between two stream channels to their confluence. Conclusions: Cov-IMS/ATP is a robust, in-field detection method for determining water quality of both fresh and marine water systems as well as differential loading of FIB from two converging channels. Significance and Impact of the Study: To our knowledge, this is the first work to present a viable rapid, in-field assay for measuring FIB concentrations in marine water environments. Cov-IMS/ATP is a potential alternative detection method, particularly in areas with limited laboratory support and resources, because of its increased economy and portability. [source]


    Antimalarial drug quality in Africa

    JOURNAL OF CLINICAL PHARMACY & THERAPEUTICS, Issue 5 2007
    A. A. Amin PhD
Abstract Background and objective: There are several reports of sub-standard and counterfeit antimalarial drugs circulating in the markets of developing countries; we aimed to review the literature for the African continent. Methods: A search was conducted in PubMed in English using the medical subject headings (MeSH) terms 'Antimalarials/analysis'[MeSH] OR 'Antimalarials/standards'[MeSH] AND 'Africa'[MeSH], to include articles published up to and including 26 February 2007. Data were augmented with reports on the quality of antimalarial drugs in Africa obtained from colleagues in the World Health Organization. We summarized the data under the following themes: content and dissolution; relative bioavailability of antimalarial products; antimalarial stability and shelf life; general tests on pharmaceutical dosage forms; and the presence of degradation or unidentifiable impurities in formulations. Results and discussion: The search yielded 21 relevant peer-reviewed articles and three reports on the quality of antimalarial drugs in Africa. The literature was varied in the quality and breadth of data presented, with most bioavailability studies poorly designed and executed. The review highlights the common findings in drug quality studies that (i) most antimalarial products pass the basic tests for pharmaceutical dosage forms, such as the uniformity of weight for tablets, (ii) most antimalarial drugs pass the content test and (iii) in vitro product dissolution is the main problem area where most drugs fail to meet required pharmacopoeial specifications, especially with regard to sulfadoxine-pyrimethamine products. In addition, there are worryingly high quality failure rates for artemisinin monotherapies such as dihydroartemisinin (DHA); for instance, all five DHA products sampled in one study in Nairobi, Kenya, were reported to have failed the requisite tests. 
Conclusions: There is an urgent need to strengthen pharmaceutical management systems, such as post-marketing surveillance, and the broader health systems in Africa, to ensure that populations on the continent have access to antimalarial drugs that are safe, meet the highest quality standards and retain their integrity throughout the distribution chain. This will require adequate enforcement of existing legislation, enactment of new legislation where necessary, and provision of the resources needed for drug quality assurance. [source]


    Examination of the analytic quality of behavioral health randomized clinical trials

    JOURNAL OF CLINICAL PSYCHOLOGY, Issue 1 2007
    Bonnie Spring
Adoption of evidence-based practice (EBP) policy has implications for clinicians and researchers alike. In fields that have already adopted EBP, evidence-based practice guidelines derive from systematic reviews of research evidence. Ultimately, such guidelines serve as tools used by practitioners. Systematic reviews of treatment efficacy and effectiveness reserve their strongest endorsements for treatments that are supported by high-quality randomized clinical trials (RCTs). It is unknown how well RCTs reported in behavioral science journals fare compared to quality standards set forth in fields that pioneered the evidence-based movement. We compared analytic quality features of all behavioral health RCTs (n = 73) published in three leading behavioral journals and two leading medical journals between January 2000 and July 2003. A behavioral health trial was operationalized as one employing a behavioral treatment modality to prevent or treat an acute or chronic physical disease or condition. Findings revealed areas of weakness in analytic aspects of the behavioral health RCTs reported in both sets of journals. Weaknesses were more pronounced in behavioral journals. The authors offer recommendations for improving the analytic quality of behavioral health RCTs to ensure that evidence about behavioral treatments is highly weighted in systematic reviews. © 2006 Wiley Periodicals, Inc. J Clin Psychol 63: 53-71, 2007. [source]


    Developing a model for quality evaluation in residential care for people with intellectual disability

    JOURNAL OF INTELLECTUAL DISABILITY RESEARCH, Issue 5 2000
    B. Maes
    Abstract The present article describes the development of a general model for the evaluation, enhancement and assurance of quality of care processes in residential facilities for children and adults with intellectual disability. The framework is based on current theories regarding quality of life and quality evaluation, on a consensus reached between several participants in Delphi discussion rounds, and on a questionnaire for care providers and clients in all Flemish residential facilities. The model describes 13 quality standards and a list of indicators concerning organization and support interventions. Facilities may use this set of criteria and indicators in several ways within a continuous and dynamic system of internal quality assurance. Finally, the prospects of and conditions for the implementation of this model are discussed. [source]


    Driving less for better air: Impacts of a public information campaign

    JOURNAL OF POLICY ANALYSIS AND MANAGEMENT, Issue 1 2003
    Gary T. Henry
    In the wake of the 1990 amendments to the Clean Air Act, localities across the United States initiated public information campaigns both to raise awareness of threats to air quality and to change behavior related to air pollution by recommending specific behavioral changes in the campaign messages. These campaigns are designed to reduce the health hazards associated with poor air quality and to avoid federal sanctions resulting from the failure to meet air quality standards. As in many other communities across the country, a coalition of government agencies and businesses initiated a public information campaign in the Atlanta metropolitan region to reduce certain targeted behaviors, mainly driving. A two-stage model used to analyze data from a rolling sample survey shows that the centerpiece of the information campaign (air quality alerts) was effective in raising awareness and reducing driving in a segment of the population. When the overall information campaign was moderated by employers' participation in programs to improve air quality, drivers significantly reduced the number of miles they drove and the number of trips they took by car on days when air quality alerts were sounded. Public information campaigns can be successful in increasing awareness, but changing well-established behaviors, such as driving, is likely to require institutional mediation to provide social contexts that support the behavioral change, as well. © 2003 by the Association for Public Policy Analysis and Management. [source]


    Improving the quality of processing gastric cancer specimens: The pathologist's perspective,

    JOURNAL OF SURGICAL ONCOLOGY, Issue 3 2010
    Alyson L. Mahar BSc
    Abstract Background and Objectives Research into surgeon and pathologist knowledge of guidelines for lymph node (LN) assessment in gastric cancer demonstrated a knowledge deficit. To understand factors affecting optimal assessment, we surveyed pathologists to identify external barriers. Methods Pathologists were identified using two Ontario physician databases and surveyed online or by mail, with a 40% response rate. Results The majority (56%) of pathologists stated that assessing an additional five LNs would not be a burden. Most (80%) pathologists disagreed with pay for performance for achieving quality standards. Qualitative analysis determined that the majority of pathologists believed achieving quality standards was inherent to their profession and should not require incentives. Poor surgical specimen quality was identified as a barrier, underscoring the importance of aiming quality improvement initiatives at the multidisciplinary team. Conclusion In addition to education, tailoring an intervention to address all barriers, including laboratory constraints, may be an effective means of improving gastric cancer care. J. Surg. Oncol. 2010; 101:195–199. © 2010 Wiley-Liss, Inc. [source]


    DECISION SUPPORT FOR ALLOCATION OF WATERSHED POLLUTION LOAD USING GREY FUZZY MULTIOBJECTIVE PROGRAMMING,

    JOURNAL OF THE AMERICAN WATER RESOURCES ASSOCIATION, Issue 3 2006
    Ho-Wen Chen
    ABSTRACT: This paper uses grey fuzzy multiobjective programming to aid in decision making for the allocation of waste load in a river system under versatile uncertainties and risks. It differs from previous studies by considering a multicriteria objective function with combined grey and fuzzy messages under a cost benefit analysis framework. Such analysis technically integrates the prior information of water quality models, water quality standards, wastewater treatment costs, and potential benefits gained via in-stream water quality improvement. While fuzzy sets are characterized based on semantic and cognitive vagueness in decision making, grey numbers can delineate measurement errors in data collection. By employing three distinct set theoretic fuzzy operators, the synergy of grey and fuzzy implications may smoothly characterize the prescribed management complexity. With the aid of a genetic algorithm in the solution procedure, the modeling outputs contribute to the development of an effective waste load allocation and reduction scheme for tributaries in a subwatershed located in the lower Tseng-Wen River Basin, South Taiwan. Research findings indicate that the inclusion of three fuzzy set theoretic operators in decision analysis may delineate different tradeoffs in decision making due to varying changes, transformations, and movements of waste load in association with land use pattern within the watershed. [source]


    MODELING METALS TRANSPORT AND SEDIMENT/WATER INTERACTIONS IN A MINING IMPACTED MOUNTAIN STREAM,

    JOURNAL OF THE AMERICAN WATER RESOURCES ASSOCIATION, Issue 6 2004
    Brian S. Caruso
    ABSTRACT: The U.S. Environmental Protection Agency (USEPA) Water Quality Analysis Simulation Program (WASP5) was used to model the transport and sediment/water interactions of metals under low flow, steady state conditions in Tenmile Creek, a mountain stream supplying drinking water to the City of Helena, Montana, impacted by numerous abandoned hard rock mines. The model was calibrated for base flow using data collected by USEPA and validated using data from the U.S. Geological Survey (USGS) for higher flows. It was used to assess metals loadings and losses, exceedances of Montana State water quality standards, metals interactions in stream water and bed sediment, uncertainty in fate and transport processes and model parameters, and effectiveness of remedial alternatives that include leaving contaminated sediment in the stream. Results indicated that during base flow, adits and point sources contribute significant metals loadings to the stream, but that shallow ground water and bed sediment also contribute metals in some key locations. Losses from the water column occur in some areas, primarily due to adsorption and precipitation onto bed sediments. Some uncertainty exists in the metal partition coefficients associated with sediment, significance of precipitation reactions, and in the specific locations of unidentified sources and losses of metals. Standards exceedances are widespread throughout the stream, but the model showed that remediation of point sources and mine waste near water courses can help improve water quality. Model results also indicate, however, that alteration of the water supply scheme and increasing base flow will probably be required to meet all water quality standards. [source]


    REEXAMINING BEST MANAGEMENT PRACTICES FOR IMPROVING WATER QUALITY IN URBAN WATERSHEDS,

    JOURNAL OF THE AMERICAN WATER RESOURCES ASSOCIATION, Issue 5 2003
    Stephen R. Pennington
    ABSTRACT: Municipalities will be implementing structural best management practices at increasing rates in their effort to comply with Phase II of the National Pollutant Discharge Elimination System (NPDES). However, there is evidence that structural best management practices (BMPs) by themselves may be insufficient to attain desired water quality standards. This paper reports on an analysis of the median removal efficiencies of structural BMPs and compares them to removal efficiencies estimated as being necessary to attain water quality standards in the Rouge River in Detroit, Michigan. Eight water quality parameters are reviewed using data collected from 1994 to 1999 in the Rouge River. Currently, five of the eight parameters in the Rouge River including bacteria, biochemical oxygen demand, and total suspended solids (TSS) exceed the required water quality standards. The reported analysis of structural BMP efficiencies reveals that structural BMPs appear capable of reducing only some of the pollutants of concern to acceptable levels. [source]
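The comparison at the heart of this abstract, a BMP's median removal efficiency versus the removal efficiency needed to reach a standard, reduces to simple arithmetic. The sketch below illustrates the idea only; the concentrations and the median BMP efficiency are invented numbers, not the Rouge River data.

```python
def required_removal_efficiency(observed_mg_l: float, standard_mg_l: float) -> float:
    """Fraction of the pollutant load that must be removed so that the
    observed concentration drops to the standard (0 if already met)."""
    return max(0.0, 1.0 - standard_mg_l / observed_mg_l)

# Hypothetical values for one parameter, e.g. total suspended solids (TSS)
observed_tss = 80.0        # mg/L, illustrative in-stream concentration
tss_standard = 25.0        # mg/L, illustrative water quality standard
median_bmp_removal = 0.70  # illustrative median structural BMP efficiency

needed = required_removal_efficiency(observed_tss, tss_standard)  # 0.6875
bmp_sufficient = median_bmp_removal >= needed
```

Running this comparison per parameter is how one identifies which pollutants of concern a structural BMP alone can plausibly bring to acceptable levels.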


    STOCHASTIC WATER QUALITY ANALYSIS USING RELIABILITY METHOD,

    JOURNAL OF THE AMERICAN WATER RESOURCES ASSOCIATION, Issue 3 2001
    Kun-Yeun Han
    ABSTRACT: This study developed a QUAL2E-Reliability Analysis (QUAL2E-RA) model for the stochastic water quality analysis of the downstream reach of the main Han River in Korea. The proposed model is based on the QUAL2E model and incorporates the Advanced First-Order Second-Moment (AFOSM) and Mean-Value First-Order Second-Moment (MFOSM) methods. After the hydraulic characteristics are identified using the standard step method, the optimal reaction coefficients are estimated using the Broyden-Fletcher-Goldfarb-Shanno (BFGS) method. Considering variations in river discharges, pollutant loads from tributaries, and reaction coefficients, the violation probabilities of existing water quality standards at several locations in the river were computed using the AFOSM and MFOSM methods, and the results were compared with those from the Monte Carlo method. The statistics of the three uncertainty analysis methods show that the outputs from the AFOSM and MFOSM methods are similar to those from the Monte Carlo method. From a practical model selection perspective, the MFOSM method is more attractive in terms of its computational simplicity and execution time. [source]
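The MFOSM idea the abstract describes can be sketched in a few lines: propagate input variances through a first-order Taylor expansion of a performance function, form the reliability index, and read off the violation probability from the standard normal distribution. This is a minimal sketch assuming independent inputs; the performance function, sensitivities and variances below are illustrative, not taken from the QUAL2E-RA study.

```python
import math

def mfosm_violation_probability(g_mean, grad, variances):
    """Mean-Value First-Order Second-Moment estimate of the probability
    that a water quality standard is violated.

    g_mean    : performance function g (e.g. simulated DO minus the DO
                standard) evaluated at the mean of the uncertain inputs
    grad      : partial derivatives of g w.r.t. each input, at the mean
    variances : variances of the inputs (assumed independent)
    """
    # First-order variance propagation: Var[g] ~= sum (dg/dx_i)^2 * Var[x_i]
    g_var = sum(d * d * v for d, v in zip(grad, variances))
    # Reliability index beta = E[g] / sd(g); violation probability = Phi(-beta)
    beta = g_mean / math.sqrt(g_var)
    return 0.5 * math.erfc(beta / math.sqrt(2.0))

# Illustrative numbers: DO standard 5.0 mg/L, mean simulated DO 6.2 mg/L,
# with sensitivities to two uncertain inputs (discharge, decay coefficient)
p_violation = mfosm_violation_probability(
    g_mean=6.2 - 5.0,
    grad=[1.0, -0.8],
    variances=[0.25, 0.36],
)
```

AFOSM refines this by linearizing at the most probable failure point rather than at the means, and a Monte Carlo check, as in the study, samples the inputs directly; the appeal of MFOSM noted in the abstract is that the computation above is essentially free.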


    EPA'S BASINS MODEL: GOOD SCIENCE OR SERENDIPITOUS MODELING?,

    JOURNAL OF THE AMERICAN WATER RESOURCES ASSOCIATION, Issue 3 2000
    Ray C. Whittemore
    ABSTRACT: Better Assessment Science Integrating Point and Non-point Sources (BASINS) is a geographic-based watershed assessment tool developed by EPA's Office of Water to help states more efficiently target and evaluate water bodies that are not meeting water quality standards. BASINS (EPA, 1996a, 1998) brings together data on water quality and quantity, land uses, point source loadings, and other related spatial data with supporting nonpoint source and water quality models, allowing quicker and more effective assessments. EPA developed BASINS to better integrate point and nonpoint source water quality assessments for the Nation's 2100+ watersheds. In its zeal to achieve this endpoint, EPA has initiated a simplistic approach that was expected to grow through scientific enhancements as TMDL developers became more familiar with modeling requirements. BASINS builds upon federal databases of water quality conditions and point source loadings for numerous parameters, where quality assurance is suspect in some cases. Its design allows comprehensive assessments and modeling in typical Total Maximum Daily Load (TMDL) computations. While the TMDL utility is the primary reason BASINS was developed, other longer-range water quality assessments will become possible as the Agency expands the suite of assessment models and databases in future releases. The simplistic approach to modeling and the user-friendly tools give rise, however, to technical and philosophical concerns related to default data usage. Seamless generation of model input files and the failure of some utilities to work properly suggest to NCASI that serious problems may still exist and prompt the need for more rigorous peer review. Furthermore, sustainable training becomes paramount, as some older modelers will be unfamiliar with Geographic Information System (GIS) technology and associated computer skills. Overall, however, BASINS was judged to be an excellent beginning tool for meeting the complex environmental modeling needs of the 21st century. [source]


    Model choice in time series studies of air pollution and mortality

    JOURNAL OF THE ROYAL STATISTICAL SOCIETY: SERIES A (STATISTICS IN SOCIETY), Issue 2 2006
    Roger D. Peng
    Summary. Multicity time series studies of particulate matter and mortality and morbidity have provided evidence that daily variation in air pollution levels is associated with daily variation in mortality counts. These findings served as key epidemiological evidence for the recent review of the US national ambient air quality standards for particulate matter. As a result, methodological issues concerning time series analysis of the relationship between air pollution and health have attracted the attention of the scientific community and critics have raised concerns about the adequacy of current model formulations. Time series data on pollution and mortality are generally analysed by using log-linear, Poisson regression models for overdispersed counts with the daily number of deaths as outcome, the (possibly lagged) daily level of pollution as a linear predictor and smooth functions of weather variables and calendar time used to adjust for time-varying confounders. Investigators around the world have used different approaches to adjust for confounding, making it difficult to compare results across studies. To date, the statistical properties of these different approaches have not been comprehensively compared. To address these issues, we quantify and characterize model uncertainty and model choice in adjusting for seasonal and long-term trends in time series models of air pollution and mortality. First, we conduct a simulation study to compare and describe the properties of statistical methods that are commonly used for confounding adjustment. We generate data under several confounding scenarios and systematically compare the performance of the various methods with respect to the mean-squared error of the estimated air pollution coefficient. We find that the bias in the estimates generally decreases with more aggressive smoothing and that model selection methods which optimize prediction may not be suitable for obtaining an estimate with small bias. 
    Second, we apply and compare the modelling approaches with the National Morbidity, Mortality, and Air Pollution Study database, which comprises daily time series of several pollutants, weather variables and mortality counts covering the period 1987–2000 for the 100 largest cities in the USA. When applying these approaches to adjusting for seasonal and long-term trends we find that the study's estimates for the national average effect of PM10 at lag 1 on mortality vary over approximately a twofold range, with 95% posterior intervals always excluding zero risk. [source]
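The log-linear Poisson formulation described above can be sketched as a small simulation. Everything below is illustrative, not the NMMAPS analysis: the data are simulated, harmonic terms stand in for the smooth functions of calendar time, and the coefficient values are invented.

```python
import numpy as np

rng = np.random.default_rng(0)
n_days = 2000
t = np.arange(n_days)

# Simulated daily PM10 series; the lag-1 value is used as the predictor
pm10 = 30 + 10 * np.sin(2 * np.pi * t / 365) + rng.normal(0, 5, n_days)
pm10_lag1 = np.concatenate(([pm10[0]], pm10[:-1]))

# "True" model: log E[deaths] = intercept + seasonal term + beta * PM10(lag 1)
beta_true = 0.005
season = 0.1 * np.cos(2 * np.pi * t / 365)
deaths = rng.poisson(np.exp(3.0 + season + beta_true * pm10_lag1))

# Design matrix: intercept, lag-1 PM10, and harmonic terms standing in for
# the smooth functions of calendar time used to adjust for confounding
X = np.column_stack([
    np.ones(n_days),
    pm10_lag1,
    np.sin(2 * np.pi * t / 365),
    np.cos(2 * np.pi * t / 365),
])

# Fit the log-linear Poisson regression by iteratively reweighted least squares
coef = np.zeros(X.shape[1])
coef[0] = np.log(deaths.mean())          # start near the right intercept
for _ in range(25):
    mu = np.exp(X @ coef)                # current fitted means
    z = X @ coef + (deaths - mu) / mu    # working response
    coef = np.linalg.solve(X.T @ (mu[:, None] * X), X.T @ (mu * z))

beta_hat = coef[1]                       # estimated air pollution coefficient
```

The model-choice question the paper studies enters through the confounder columns: replacing the two harmonics with more or fewer smooth terms changes `beta_hat`, which is exactly the sensitivity the simulation study quantifies.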


    Light as a Controlling Tool

    LASER TECHNIK JOURNAL, Issue 1 2010
    White Light Interferometry in Quality Assurance of Photovoltaic Samples
    The photovoltaic industry has seen sustained, substantial growth in recent years. Today, improving efficiency and reducing the manufacturing cost of solar cells are essential for success in a competitive market. Reducing manufacturing costs requires high-volume production of solar cells while maintaining high quality standards and tight tolerances. Measurement of the topography of solar cells is now starting to play an important role in quality assurance of the manufacturing process. White light interferometry allows three-dimensional mapping of a complete area with subsequent parameter extraction. The efficiency of a solar cell depends on the wafer structure: perfectly smooth surfaces absorb fewer photons than surfaces with a certain, optimized roughness, whereas protective layers should be as smooth and flat as possible. As with all microsystems, the structures can be investigated and compared with target values: examples are layer thickness, the widths and depths of structured lines, the volume determination of hollows, defects and pores, and abrasion/deposition rates. The technique also encompasses the 3D profile of printed circuit board tracks and of special structures for sophisticated high-efficiency photovoltaic elements. [source]


    Past and future sustainability of water policies in Europe

    NATURAL RESOURCES FORUM, Issue 3 2003
    Bernard Barraqué
    The article contributes to a discussion on two global issues on water: water resources management, and water supply and sanitation. Focusing on Europe, it traces the legal roots of current systems in history: as a resource, water is considered as a common property, rather than a market good; while as a public service it is usually a commodity. Public water supply and sanitation technologies and engineering have developed under three main paradigms: quantitative and civil engineering; qualitative and chemical/sanitary engineering (both on the supply side); and the most recent one, environmental engineering and integrated management (on the demand side). The cost of public drinking water is due to rise sharply in view of the two-fold financial challenge of replacing an ageing infrastructure and keeping up with ever-rising environmental and sanitary quality standards. Who will pay? Government subsidies, or water users? The author suggests that apparent successes with privatisation may have relied heavily on hidden government subsidies and/or the healthy state of previously installed water infrastructure: past government subsidies are still felt for as long as the lifetime of the infrastructure. The article stresses the importance of public participation and decentralized local management of water and sanitation services. Informing and involving users in water management decisions is seen as an integral part of the 'ethics' side of the crucial three E's (economics, environment, ethics). The article strongly argues for municipal provision of water services, and hopes that lessons learnt and solutions found in the European experience may serve water services management efforts in other regions of the world. [source]