Testing Laboratories (testing + laboratory)


Selected Abstracts


Recent developments on high current measurement using current shunt

IEEJ TRANSACTIONS ON ELECTRICAL AND ELECTRONIC ENGINEERING, Issue 5 2007
Tatsuo Kawamura, Honorary Member
Abstract The Short-circuit Testing Liaison (STL) is an organization of high-power testing laboratories from around the world. Member laboratories perform short-circuit tests under uniform interpretations of the IEC standards, agreed through technical discussions and information exchange among them. One recent STL project aims to establish the uncertainty and traceability of high current measurement through international comparison tests with reference shunts. In concert with this project, an IEC Working Group is preparing a new standard for high current measurement. Copyright © 2007 Institute of Electrical Engineers of Japan. Published by John Wiley & Sons, Inc. [source]
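At its core, the shunt-based measurement referred to above reduces to Ohm's law applied to a calibrated reference shunt, with the measurement uncertainty propagated from the voltage reading and the shunt calibration. The following is a minimal, illustrative sketch of that calculation, not the STL or IEC procedure; all numerical values are placeholders.

```python
import math

def shunt_current(v_measured, r_shunt, u_v_rel, u_r_rel):
    """Current through a reference shunt and its combined relative
    standard uncertainty (simple uncorrelated propagation)."""
    i = v_measured / r_shunt                      # Ohm's law: I = V / R
    u_i_rel = math.sqrt(u_v_rel**2 + u_r_rel**2)  # quadrature sum for a quotient
    return i, u_i_rel

# Illustrative values only: 50 mV drop across a 1 micro-ohm shunt -> 50 kA
i, u_rel = shunt_current(v_measured=0.050, r_shunt=1e-6,
                         u_v_rel=0.002, u_r_rel=0.001)
print(f"I = {i/1e3:.1f} kA, expanded uncertainty (k = 2) = {2 * 100 * u_rel:.2f} %")
```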


Risk modelling in blood safety – review of methods, strengths and limitations

ISBT SCIENCE SERIES: THE INTERNATIONAL JOURNAL OF INTRACELLULAR TRANSPORT, Issue n1 2010
B. Custer
Risk modelling studies in blood safety play an important but occasionally misunderstood role. These studies are intended to quantify and contrast risks and benefits. This information is critical for policy development and intervention decision-making. The limitations of risk modelling should be considered alongside the results obtained. The goal of this manuscript and presentation is to review current risk modelling techniques used in blood safety and to discuss the pros and cons of using this information in the decision-making process. The types of questions that can be answered include the extent of a risk or threat; implications of action or inaction; identification of effective strategies for risk management; or whether to adopt specific interventions. These analyses can be focused on a risk alone but are often combined with economic information to gain an understanding of feasible risk interventions given budgetary or other monetary considerations. Thus, analyses that include risk modelling provide insights along multiple lines. As important, the analyses also provide information on what is not known or uncertain about a potential hazard and how much that uncertainty may influence the decision-making process. Specific examples of the range of risk analyses in which the author has participated will be reviewed and will include ongoing process improvement in testing laboratories such as error identification/eradication, estimation of the risk of malaria exposure based on the specific locations of travel, evaluation of blood supply and demand during an influenza pandemic, cost-utility analyses of screening interventions for infectious diseases in countries with different human development indices, and insurance against emerging pathogen risk. Each of these analyses has a different purpose and seeks to answer different questions, but all rely on similar methods. The tool kit for risk analysis is broad and varied but does have limitations. The chief limitation of risk modelling is that risk analyses are not scientific experiments or otherwise controlled studies. Consequently, the analyses are more apt to be influenced by assumptions. These assumptions may be necessary to structure a problem in a way that will allow the question of interest to be answered or may result from incomplete or missing information. Another potential limitation is that commissioners of such studies, those who undertake them, and the intended audience, such as regulatory agencies, may have distinct and differing interpretations of the results. Risk modelling is a set of techniques that can be used to inform and support decision-making at all levels in transfusion medicine. Advances in risk modelling techniques allow for continued expansion in the scope of possible questions that can be analysed. Expanded use also improves the acceptance of the utility of these studies in blood safety and transfusion medicine. [source]
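One standard model in this field, the incidence/window-period model, estimates the residual risk of a transfusion-transmitted infection as the donor incidence rate multiplied by the length of the diagnostic window period, with Monte Carlo sampling used to express parameter uncertainty. The sketch below illustrates only that general approach; the distributions and parameter values are invented for illustration and are not taken from the abstract.

```python
import math
import random

def residual_risk_samples(n_draws=100_000, seed=1):
    """Monte Carlo sketch of the incidence / window-period residual-risk model.

    Residual risk per donation ~ incidence (infections per person-year)
                                 * window period (in years).
    The parameter distributions below are illustrative assumptions only.
    """
    rng = random.Random(seed)
    risks = []
    for _ in range(n_draws):
        incidence = rng.lognormvariate(math.log(3e-5), 0.3)  # infections / person-year
        window_days = rng.triangular(7, 22, 12)              # low, high, mode (days)
        risks.append(incidence * window_days / 365.25)
    return sorted(risks)

risks = residual_risk_samples()
mean = sum(risks) / len(risks)
lo, hi = risks[int(0.025 * len(risks))], risks[int(0.975 * len(risks))]
print(f"mean residual risk ~ 1 in {1 / mean:,.0f} donations "
      f"(95% interval: 1 in {1 / hi:,.0f} to 1 in {1 / lo:,.0f})")
```

Reporting the result as "1 in N donations" alongside an uncertainty interval is what allows such analyses to show not only the estimated risk but how much the unknowns could shift a decision.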


Critical analysis of potential body temperature confounders on neurochemical endpoints caused by direct dosing and maternal separation in neonatal mice: a study of bioallethrin and deltamethrin interactions with temperature on brain muscarinic receptors

JOURNAL OF APPLIED TOXICOLOGY, Issue 1 2003
Jürgen Pauluhn
Abstract The present investigation was conducted to better understand possible confounding factors caused by direct dosing of neonatal mice during the pre-weaning developmental period. By direct dosing, pups might encounter thermal challenges when temporarily removed from their 'natural habitat'. Typically, this leads to a cold environment and food deprivation (impaired lactation) and modulation of the toxic potency of the substance administered. Growth retardation as a consequence of such behavioural changes in pups makes it increasingly difficult to differentiate specific from non-specific mechanisms. Neonatal NMRI mice were dosed daily by gavage (0.7 mg/kg body wt.) from postnatal day (PND) 10–16 with S-bioallethrin, deltamethrin or the vehicle. Then the pups, including their non-treated foster dams, were subjected temporarily for 6 h per day to a hypo-, normo- or hyperthermic environment, which was followed by normal housing. The measured temperatures in the environmental chambers were ca. 21, 25 and 30°C, respectively. Thus, temperatures in the hypo- and normothermic groups are comparable to the temperatures commonly present in testing laboratories, whereas the hyperthermic condition is the temperature typically present in the 'natural habitat' of pups. A deviation from the normal behaviour of both pups and dams was observed in the hypo- and normothermic groups. In these groups the rectal temperatures of pups were markedly decreased, especially in the early phase of the study (PND 10–12). Neonates that received either test substance displayed changes in body weights and brain weights at terminal sacrifice (PND 17) when subjected temporarily to a non-physiological environment. A pronounced influence of environmental temperature on the density of muscarinic receptors in the crude synaptosomal fraction of the cerebral cortex was observed. In summary, these results demonstrate that the direct dosing of thermolabile neonatal mice by gavage is subject to significant artefacts that render the interpretation of findings from such studies difficult. It appears that if direct dosing of neonatal pups is mandated, and inhalation is a relevant route of exposure, the combined inhalation exposure of dams with their litters is an alternative procedure that does not cause disruption of the 'natural habitat' of pups. However, owing to their higher ventilation, under such conditions the pups may receive dosages at least double those of the dams. Copyright © 2003 John Wiley & Sons, Ltd. [source]


PATHOGEN DETECTION IN FOOD MICROBIOLOGY LABORATORIES: AN ANALYSIS OF QUALITATIVE PROFICIENCY TEST DATA, 1999–2007

JOURNAL OF FOOD SAFETY, Issue 4 2009
DANIEL C. EDSON
ABSTRACT The objective of this study was to assess laboratories' ability to detect or rule out the presence of four common food pathogens: Escherichia coli O157:H7, Salmonella spp., Listeria monocytogenes and Campylobacter spp. To do this, qualitative proficiency test data provided by one proficiency test provider from 1999 to 2007 were examined. The annual and cumulative 9-year percentages of false-negative and false-positive responses were calculated. The cumulative 9-year false-negative rates were 7.8% for E. coli O157:H7, 5.9% for Salmonella spp., 7.2% for L. monocytogenes and 13.6% for Campylobacter spp. Atypical strains and low concentrations of bacteria were more likely to be missed, and the data showed no trend of improving performance over time. Percentages of false-positive results were below 5.0% for all four pathogens. PRACTICAL APPLICATIONS The results imply that food testing laboratories often fail to detect the presence of these four food pathogens in real food specimens. To improve pathogen detection, supervisors should ensure that testing personnel are adequately trained, that recommended procedures are followed correctly, that samples are properly prepared, that proper conditions (temperature, atmosphere and incubation time) are maintained for good bacterial growth and that recommended quality control procedures are followed. Supervisors should also always investigate reasons for unsatisfactory proficiency test results and take corrective action. Finally, more research is needed into testing practices and proficiency test performance in food testing laboratories. [source]
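The headline figures here are pooled proportions of false-negative responses. A minimal sketch of how such a cumulative rate, together with a Wilson score confidence interval, might be computed from proficiency-test counts is shown below; the counts in the example are hypothetical (only the resulting percentage mirrors a rate reported in the abstract, the denominator is invented).

```python
import math

def false_negative_rate(false_negatives, positive_challenges, z=1.96):
    """Pooled false-negative rate with a 95% Wilson score interval.

    false_negatives     : positive challenge samples reported as negative
    positive_challenges : total positive challenge samples distributed
    """
    n = positive_challenges
    p = false_negatives / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return p, centre - half, centre + half

# Hypothetical counts: 39 misses out of 500 positive E. coli O157:H7 challenges
rate, lo, hi = false_negative_rate(39, 500)
print(f"false-negative rate = {100 * rate:.1f}% (95% CI {100 * lo:.1f}-{100 * hi:.1f}%)")
```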


Simultaneous determination of mono-, di- and tributyltin in environmental samples using isotope dilution gas chromatography mass spectrometry

JOURNAL OF MASS SPECTROMETRY (INCORPORATING BIOLOGICAL MASS SPECTROMETRY), Issue 5 2004
Giuseppe Centineo
Abstract The development of a rapid, precise and accurate speciation method for the simultaneous determination of mono-, di- and tributyltin in environmental samples is described. The method is based on isotope dilution gas chromatography/mass spectrometry (GC/MS) with electron ionization, a technique widely used in routine testing laboratories. A mixed spike containing 119Sn-enriched monobutyltin (MBT), dibutyltin (DBT) and tributyltin (TBT) was used for the isotope dilution of the samples. Five molecular ions were monitored for each analyte, corresponding to the 116Sn, 117Sn, 118Sn, 119Sn and 120Sn isotopes. The signals detected at the masses corresponding to 116Sn and 117Sn were used to correct, with simple mathematical equations, for the m + 1 and m + 2 contributions of 13C from the organic groups attached to the tin atom on the 118Sn, 119Sn and 120Sn masses, and the concentrations of the butyltin compounds were calculated from the corrected 118Sn/119Sn and 120Sn/119Sn isotope ratios. The 119Sn-enriched multispecies spike was applied with satisfactory results to the simultaneous determination of MBT, DBT and TBT in three certified reference materials: two sediments, PACS-2 and BCR 646, and the mussel tissue CRM 477. The method was compared with a previously published GC/inductively coupled plasma MS isotope dilution procedure, developed in our laboratory, by injecting the same samples into both instruments. Comparable analytical results in terms of precision and accuracy are demonstrated for both atomic and molecular mass spectrometric detectors. Thus, reliable quantitative organotin speciation analysis can be achieved using the more widespread and inexpensive GC/MS instrument. Copyright © 2004 John Wiley & Sons, Ltd. [source]
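The quantification step described above rests on the classical single isotope dilution equation: the amount of analyte follows from the amount of spike added, the isotopic abundances of sample and spike, and the isotope ratio measured in the blend. The sketch below shows that generic calculation; the natural tin abundances are approximate, the spike abundances and measured ratio are placeholders, and the 13C corrections described in the abstract are assumed to have been applied to the measured ratio beforehand.

```python
def idms_moles_in_sample(n_spike, x_ref_sample, x_spk_sample,
                         x_ref_spike, x_spk_spike, r_blend):
    """Single isotope dilution: moles of analyte element in the sample.

    n_spike      : moles of spike element added to the sample
    x_ref_*      : abundance of the reference isotope (here 120Sn) in sample / spike
    x_spk_*      : abundance of the spike isotope (here 119Sn) in sample / spike
    r_blend      : measured 120Sn/119Sn ratio in the blend (after 13C correction)
    """
    return n_spike * (x_ref_spike - r_blend * x_spk_spike) / \
                     (r_blend * x_spk_sample - x_ref_sample)

# Approximate natural Sn abundances: 120Sn ~ 0.3258, 119Sn ~ 0.0859.
# Spike abundances and the measured blend ratio below are placeholders.
n_tbt = idms_moles_in_sample(n_spike=1.0e-9,
                             x_ref_sample=0.3258, x_spk_sample=0.0859,
                             x_ref_spike=0.035, x_spk_spike=0.82,
                             r_blend=0.50)
print(f"tributyltin in sample: {n_tbt:.2e} mol of Sn")
```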


Testing high SPF sunscreens: a demonstration of the accuracy and reproducibility of the results of testing high SPF formulations by two methods and at different testing sites

PHOTODERMATOLOGY, PHOTOIMMUNOLOGY & PHOTOMEDICINE, Issue 4 2002
Patricia Poh Agin
Background/Purpose: The goals of this study were (i) to demonstrate that existing and widely used sun protection factor (SPF) test methodologies can produce accurate and reproducible results for high SPF formulations and (ii) to provide data on the number of test subjects needed, the variability of the data, and the appropriate exposure increments needed for testing high SPF formulations. Methods: Three high SPF formulations were tested, according to the Food and Drug Administration's (FDA) 1993 tentative final monograph (TFM) 'very water resistant' test method and/or the 1978 proposed monograph 'waterproof' test method, within one laboratory. A fourth high SPF formulation was tested at four independent SPF testing laboratories, using the 1978 'waterproof' SPF test method. All laboratories utilized xenon arc solar simulators. Results: The data illustrate that the testing conducted within one laboratory, following either the 1978 proposed or the 1993 TFM SPF test method, was able to reproducibly determine the SPFs of the formulations tested, using either the statistical analysis method in the proposed monograph or the statistical method described in the TFM. When one formulation was tested at four different laboratories, the anticipated variation in the data owing to the equipment and other operational differences was minimized through the use of the statistical method described in the 1993 monograph. Conclusions: The data illustrate that either the 1978 proposed monograph SPF test method or the 1993 TFM SPF test method can provide accurate and reproducible results for high SPF formulations. Further, these results can be achieved with panels of 20–25 subjects with an acceptable level of variability. Utilization of the statistical controls from the 1993 sunscreen monograph can help to minimize lab-to-lab variability for well-formulated products. [source]
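Although the exact statistical procedures are defined in the 1978 and 1993 monographs themselves, the panel statistics they rely on are of a familiar kind: a mean SPF across subjects and a lower bound that accounts for between-subject variability. The sketch below is a generic illustration with hypothetical panel data and a Student's t based one-sided lower bound, not the monograph formula verbatim.

```python
import math
import statistics

def spf_panel_summary(individual_spfs, t_value):
    """Panel mean SPF and a one-sided 95% lower confidence bound.

    individual_spfs : per-subject SPF values from the panel
    t_value         : one-sided 95% Student's t critical value for n-1
                      degrees of freedom (e.g. ~1.73 for a panel of 20)
    """
    n = len(individual_spfs)
    mean = statistics.mean(individual_spfs)
    se = statistics.stdev(individual_spfs) / math.sqrt(n)  # standard error of the mean
    return mean, mean - t_value * se

# Hypothetical panel of 20 subjects testing a nominal SPF 30 product
panel = [28.1, 31.4, 29.9, 33.0, 27.5, 30.8, 32.2, 29.1, 31.0, 28.8,
         30.5, 29.7, 32.8, 27.9, 31.6, 30.1, 28.4, 33.5, 29.3, 30.9]
mean_spf, lower_bound = spf_panel_summary(panel, t_value=1.729)  # t(0.95, df=19)
print(f"mean SPF = {mean_spf:.1f}, 95% one-sided lower bound = {lower_bound:.1f}")
```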


External quality assessment of rapid prenatal detection of numerical chromosomal aberrations using molecular genetic techniques: 3 years' experience

PRENATAL DIAGNOSIS, Issue 5 2007
S. C. Ramsden
Abstract Objectives Prenatal diagnosis using rapid molecular genetic techniques is now a widely used method for detecting the most prevalent chromosomal aneuploidies. The object of this work was to develop a methodology for delivering external quality assessment (EQA) appropriate to the needs of routine diagnostic testing laboratories. Methods We have provided three rounds of EQA using 15 different samples over 3 years. The scheme has evolved to assess both the genotyping accuracy of the results and the appropriateness of the clinical reports issued to the referring clinician. Results Participation in the EQA scheme has increased from 9 to 27 laboratories from across Europe over the three sample distributions. All laboratories have used quantitative fluorescence-PCR (QF-PCR) to analyse these samples, except for a sole participant in 2006 who used multiplex ligation-dependent probe amplification (MLPA). In total, 265 samples have been distributed, of which four (1.5%) were not reported due to technical failures and one (0.4%) was reported incorrectly and must be regarded as a genotyping error. Conclusions We have demonstrated a significant and increasing demand for EQA in the rapid detection of aneuploidies in the UK and other European laboratories. Using the methodologies described, we have had a very low rate of technical failures and demonstrated a high level of genotyping accuracy. However, the quality of the clinical reports was variable and suggestions are made for improvement. Copyright © 2007 John Wiley & Sons, Ltd. [source]


The role of ASTM E27 methods in hazard assessment part II: Flammability and ignitability

PROCESS SAFETY PROGRESS, Issue 1 2005
Laurence G. Britton
Accurate flammability and ignitability data for chemicals form the cornerstone of procedures used to assess the hazards associated with commercial chemical production and use. Since 1967 the ASTM E27 Committee on the Hazard Potential of Chemicals has issued numerous, widely used consensus standards dealing with diverse testing and predictive procedures used to obtain relevant chemical hazard properties. The decision to issue a standard rests solely with the membership, which consists of representatives from industry, testing laboratories, consulting firms, government, academia, and instrument suppliers. Consequently, the procedures are automatically relevant, timely, and widely applicable. The purpose of this paper is to highlight some of the widely used standards, complemented with hypothetical but relevant examples describing the testing strategy, interpretation, and application of the results. A further goal of this paper is to encourage participation in the consensus standards development process. The paper is published in two parts. The first part (in the preceding issue of Process Safety Progress) dealt with the E27 standards pertaining to thermodynamics, thermal stability, and chemical compatibility. The second part, published here, focuses on the flammability, ignitability, and explosibility of fuel and air mixtures. © 2005 American Institute of Chemical Engineers Process Saf Prog, 2005 [source]


Comprehensive plasma-screening for known and unknown substances in doping controls

RAPID COMMUNICATIONS IN MASS SPECTROMETRY, Issue 8 2010
Andreas Thomas
Occasionally, doping analysis has been recognized as a competitive challenge between cheating sportsmen and the analytical capabilities of testing laboratories. Both have made immense progress during the last decades, but obviously the athletes have the questionable benefit of frequently being able to switch to new, unknown and untested compounds to enhance their performance. Thus, as analytical counteraction and for effective drug testing, a complementary approach to classical targeted methods is required in order to implement a comprehensive screening procedure for known and unknown xenobiotics. The present study provides a new analytical strategy to circumvent the targeted character of classical doping controls without losing the required sensitivity and specificity. Using 50 µL of plasma only, the method potentially identifies illicit drugs in low ng/mL concentrations. Plasma provides the biological fluid with the circulating, unmodified xenobiotics; thus the identification of unknown compounds is facilitated. After a simple protein precipitation, liquid chromatographic separation and subsequent detection by means of high resolution/high accuracy orbitrap mass spectrometry, the procedure enables the determination of numerous compounds from different classes prohibited by the World Anti-Doping Agency (WADA). A new hyphenated mass spectrometry technology was employed without precursor ion selection for higher collision energy dissociation (HCD) fragmentation experiments. Thus the mass spectra contained all the desired information to identify unknown substances retrospectively. The method was validated for 32 selected model compounds for qualitative purposes considering the parameters specificity, selectivity, limit of detection (<0.1–10 ng/mL), precision (9–28%), robustness, linearity, ion suppression and recovery (80–112%). In addition to the identification of unknown compounds, the plasma samples were simultaneously screened for known prohibited targets. Copyright © 2010 John Wiley & Sons, Ltd. [source]
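Retrospective identification in such a non-targeted, full-scan workflow ultimately comes down to matching measured accurate masses against theoretical masses of candidate compounds within a tight mass tolerance. The toy sketch below illustrates only that matching step; the compound names and masses are placeholders, not entries from the WADA Prohibited List.

```python
# Toy accurate-mass screening: match measured m/z values (assumed [M+H]+ ions)
# against a candidate list within a ppm tolerance. All entries are placeholders.
PROTON_MASS = 1.007276  # added to a neutral monoisotopic mass to get [M+H]+

candidates = {
    "compound_A": 285.136159,  # illustrative neutral monoisotopic masses
    "compound_B": 312.183778,
    "compound_C": 441.199794,
}

def screen(measured_mz, tolerance_ppm=5.0):
    """Return candidates whose [M+H]+ lies within tolerance_ppm of measured_mz."""
    hits = []
    for name, neutral_mass in candidates.items():
        theoretical_mz = neutral_mass + PROTON_MASS
        error_ppm = (measured_mz - theoretical_mz) / theoretical_mz * 1e6
        if abs(error_ppm) <= tolerance_ppm:
            hits.append((name, round(error_ppm, 2)))
    return hits

print(screen(286.1433))  # well within tolerance of compound_A's [M+H]+
```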