Confidence Limits (confidence + limit)
Kinds of Confidence Limits: Selected Abstracts

Periodontal microbiota and clinical periodontal status in a rural sample in southern Thailand
EUROPEAN JOURNAL OF ORAL SCIENCES, Issue 5 2002. P. N. Papapanou
We sought to determine (i) the association of subgingival bacterial profiles with clinical periodontal status in a population with limited access to dental care in Thailand, and (ii) the external validity of our earlier findings from a similar study in rural China. We examined 356 subjects, 30–39 yr old and 50–59 yr old, with respect to clinical periodontal status and subgingival plaque at a maximum of 14 sites per subject. Checkerboard hybridizations were used to analyse a total of 4,343 samples. The prevalence of the 27 species investigated ranged between 87.2% and 100%. Discriminant analysis based on microbial profiles correctly classified 67.5% of all deep (≥5 mm) and 64.2% of all shallow sites, and 67.4% of all subjects with and 69.3% of all subjects without ≥3 deep pockets. High colonization by 'red complex' bacteria was four times as likely (95% Confidence Limits (CL) 2.5–6.6) in subjects with ≥10 sites with attachment loss of ≥5 mm, and 4.3 times as likely (95% CL 2.6–7.1) in subjects with ≥30 such sites. The data confirmed (i) the ubiquitous prevalence of the bacteria investigated in subjects with no regular access to dental care, and (ii) the high odds for periodontal pathology conferred by increased levels of specific periodontal bacteria. [source]

Quantitative repeated open application testing with a rinse-off product in methyldibromo glutaronitrile-sensitive patients: results of the IVDK
CONTACT DERMATITIS, Issue 6 2010. Annice Heratizadeh
Background: While the use of methyldibromo glutaronitrile (MDBGN) in leave-on products is clearly associated with a high sensitization or elicitation risk, such a clear-cut relation could be questioned with regard to rinse-off products. Objective: The objective of this study was to find a maximum non-eliciting concentration for rinse-off products in MDBGN patch-test-positive patients. Patients and methods: We performed a use-related test [repeated open application test (ROAT)] in patients sensitized to MDBGN with a liquid soap containing three concentrations of MDBGN (50, 200, and 400 p.p.m.). The soap at 50 p.p.m. was used twice daily for 4 weeks. If no reaction of the skin was observed, the product with the next higher concentration was used for another 4 weeks, and so on. Results: In total, 32/37 evaluated cases [86.5%; lower exact one-sided 95% confidence limit (CL): 73.7%] did not react to any of the preparations. The remainder reacted as follows: 1/37 to 50 p.p.m., 3/37 to 200 p.p.m., and 1/37 to 400 p.p.m. The cumulative non-response to 50 p.p.m. was 97.3% (lower CL: 87.8%). Conclusions: The majority of subjects sensitized to MDBGN tolerated rinse-off products containing a maximum concentration of 400 p.p.m. A concentration in rinse-off products in the range of 50 p.p.m. could be regarded as safe for most individuals already sensitized. These concentrations will presumably prevent induction (sensitization) as well. [source]
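The "lower exact one-sided 95% confidence limit" quoted in the ROAT abstract above is the Clopper-Pearson bound, which can be computed directly from the beta distribution. A minimal sketch in Python (SciPy), using the abstract's own counts; it reproduces the reported 73.7% (32/37) and 87.8% (36/37):

from scipy.stats import beta

def exact_lower_cl(x, n, alpha=0.05):
    # One-sided Clopper-Pearson lower bound for a binomial proportion:
    # the alpha quantile of Beta(x, n - x + 1)
    if x == 0:
        return 0.0
    return beta.ppf(alpha, x, n - x + 1)

print(exact_lower_cl(32, 37))  # ~0.737, the reported 73.7%
print(exact_lower_cl(36, 37))  # ~0.878, the reported 87.8%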
A test system to evaluate the susceptibility of Oregon, USA, native stream invertebrates to triclopyr and carbaryl
ENVIRONMENTAL TOXICOLOGY & CHEMISTRY, Issue 10 2001. Jennifer L. Peterson
Abstract The susceptibility of six indigenous macroinvertebrate species representative of U.S. Pacific Northwest streams (Ameletus sp., Brachycentrus americanus, Calineuria californica, Cinygma sp., Lepidostoma unicolor, and Psychoglypha sp. early and late instar) to formulated triclopyr ester (herbicide) and carbaryl (insecticide) was determined using laboratory bioassays. Acute toxicity was expressed as the lethal concentration to 50% (LC50) and 1% (LC1) of the test population, based on a 96-h exposure duration. Carbaryl was found to be 1,000 times more toxic than triclopyr for all the organisms tested. The LC1 values (7.5, 28.8, 9.0, 3.0, 9.5, 14.8, and 33.8 µg/L, respectively, for carbaryl, and 1.8, 3.9, 4.0, 4.2, 29.0, and 16.1 mg/L, respectively, for triclopyr) were used in the calculation of the hazardous concentration to 5% of the stream macroinvertebrate community (HC5), based on the lower 95% confidence limit (HC5/95). The hazardous concentration (HC5/95) for triclopyr was 0.11 mg/L, and for carbaryl it ranged from 0.43 to 0.66 µg/L. Triclopyr and carbaryl symptomology were analyzed for two organisms, C. californica and Cinygma sp. Carbaryl symptomology included knockdown and moribund states, with severity and time of appearance being a function of dose. In triclopyr poisoning, death occurred suddenly with little or no symptomology. Time to 50% mortality (LT50) values were consistently higher for C. californica than for Cinygma sp. exposed to both chemicals at similar concentrations. [source]
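An HC5 such as the one above is the 5th percentile of a species-sensitivity distribution (SSD) fitted to the per-species toxicity values, and the HC5/95 is its lower 95% confidence bound. A rough illustrative sketch, assuming a log-normal SSD and a bootstrap lower bound; the paper's exact extrapolation procedure may differ, and six data points make any such bound very approximate:

import numpy as np

lc1_triclopyr = np.array([1.8, 3.9, 4.0, 4.2, 29.0, 16.1])  # mg/L, from the abstract

def hc5(values):
    # 5th percentile of a log-normal SSD fitted by moments of the log-values
    logs = np.log(values)
    return np.exp(logs.mean() - 1.645 * logs.std(ddof=1))

rng = np.random.default_rng(1)
boot = [hc5(rng.choice(lc1_triclopyr, size=lc1_triclopyr.size, replace=True))
        for _ in range(5000)]
print(f"HC5 = {hc5(lc1_triclopyr):.2f} mg/L; "
      f"bootstrap lower 95% bound = {np.percentile(boot, 5):.3f} mg/L")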
Language-Related Potentials in Temporal Lobe Epilepsy Before and After Surgical Treatment
EPILEPSIA, Issue 2000. Toshihiko Ito
Purpose: Temporal lobectomy has contributed to the treatment of medically intractable epilepsies. However, the influence of the surgical treatment on cognitive function is still not clear, especially from the electrophysiological viewpoint. N400, an event-related potential (ERP) named for its negative polarity and peak latency of 400 ms, is reported to be an electrophysiological sign of neural activities associated with semantic priming in language perception. In the present study, ERPs were applied to evaluate the cognitive function of temporal lobe epilepsy patients before and after temporal lobectomy. Methods: Two patients with intractable temporal lobe epilepsy participated in this study. Fifteen normal subjects served as controls. The incongruous sentence task (Kutas and Hillyard, 1980) was used to record N400 components in an auditory modality. Two types of sentences (40 Japanese sentences for each type) were prepared, in which the terminal words were either semantically congruent or incongruent. The sentences were randomly presented at approximately 65 dB SPL peak intensity. ERPs were recorded according to the international 10–20 system, with a balanced non-cephalic electrode reference and 21 channels. The band-pass filter was set from 0.5 to 30 Hz, and the ERPs were sampled at 500 Hz from 200 ms before the onset of the terminal words to 824 ms post-stimulus. Difference waves were calculated by subtracting ERPs in the congruent condition from those in the incongruent condition. N400 was scored as the most negative point between 250 and 450 ms in the subtraction waves. Amplitudes were measured from the baseline of 100 ms before the terminal words. Motor responses were also measured with the right index finger to estimate the accuracy of understanding of the sentences. Results: Case 1 was a 22-year-old male who had had intractable epilepsy for 7 years. Magnetic resonance imaging (MRI) showed high-intensity signals in the right amygdalo-hippocampal region. The epileptic seizures were confirmed to originate from that region by electroencephalography/closed-circuit television monitoring and single-photon-emission computed tomography. ERPs were recorded 1 month before and after the right anterior temporal lobectomy. Before the surgery, the rate of correct responses showed no difference between the patient (96%) and the controls (96%). The amplitudes of N400 for the patient were reduced in the right frontal and central areas (F4, C4) compared to the 99% confidence limit for control subjects. After the surgery, the rate of correct responses was 97%, and the amplitudes were reduced in the right central, parietal, and posterior temporal areas (C4, P4, O2, T6). Case 2 (a 37-year-old female) had had intractable epilepsy for 30 years. MRI showed brain atrophy in the right hippocampal region. The epileptic seizures were confirmed to originate from that region. N400 was recorded 3 months after the resection. The rate of correct responses was 95%. The amplitudes of N400 were lower in the right frontal, parietal, and temporal areas (electrodes Fp2, F4, P4, T6, Pz) compared to the 99% confidence limit of controls. Conclusions: Before the lobectomy, the reduction of N400 amplitudes indicated that the pathogenesis of intractable temporal lobe epilepsy may influence the process of semantic priming in language perception. After the resection, it was suggested that the right temporal lobectomy might affect cognitive function in the brain from electrophysiological aspects. We could benefit from further study, including analysis of the discrepancy between the amplitudes of N400 and behavioral responses. [source]

Effects of Circadian Regulation and Rest–Activity State on Spontaneous Seizures in a Rat Model of Limbic Epilepsy
EPILEPSIA, Issue 5 2000. Mark Quigg
Summary: Purpose: Circadian regulation via the suprachiasmatic nuclei and rest–activity state may influence the expression of limbic seizures. Methods: Male rats (n = 14) were made epileptic by electrical stimulation of the hippocampus, causing limbic status epilepticus and subsequent seizures. We monitored seizures with intrahippocampal electrodes in 12:12-h light/dark (LD) cycles and in continuous dark (DD). We used radiotelemetry monitoring of activity to measure state, and of body temperature to determine circadian phase. Cosinor analysis and χ2 tests determined whether seizures occurred rhythmically when plotted by phase. State was defined as inactive or active in 10-min epochs, based on whether the activity count was below or above a cut-off value validated from video observation. Results: In LD, the peak seizure occurrence was 14:59 h after the circadian temperature peak (95% confidence limits, 13:37–16:19). Phasic seizure occurrence persisted in DD at 14:05 (12:31–15:38), p < 0.0001, against a uniform mean distribution. In LD, 14,787 epochs contained 1,268 seizures; seizures preferentially occurred during inactive epochs (965 observed vs. 878 expected in proportion to the overall distribution of inactive versus active epochs; p < 0.001). In DD, 20,664 epochs contained 1,609 seizures; seizures had no preferential occurrence by state (999 observed vs. 1,025 expected; p = 0.16). Conclusions: Limbic seizures occurred with an endogenous circadian rhythm. Seizures preferentially struck during inactivity during entrainment to the light–dark cycle. [source]
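Cosinor analysis, used above to test for rhythmicity, fits a cosine of fixed period by ordinary least squares on cosine and sine regressors; the acrophase (peak time) follows from the two fitted coefficients. A minimal sketch on invented hourly seizure counts (illustrative data, not the study's):

import numpy as np

counts = np.array([30, 25, 22, 18, 15, 14, 16, 20, 26, 35, 44, 52,
                   58, 62, 60, 55, 50, 46, 42, 40, 38, 36, 34, 32])
t = np.arange(24)
w = 2 * np.pi / 24.0

# Model: y = M + A*cos(w*t - phi) = M + (A*cos phi)*cos(w*t) + (A*sin phi)*sin(w*t)
X = np.column_stack([np.ones_like(t), np.cos(w * t), np.sin(w * t)])
m, b_cos, b_sin = np.linalg.lstsq(X, counts, rcond=None)[0]

amplitude = np.hypot(b_cos, b_sin)
acrophase_h = (np.arctan2(b_sin, b_cos) / w) % 24  # peak time, in hours
print(f"mesor = {m:.1f}, amplitude = {amplitude:.1f}, peak at {acrophase_h:.1f} h")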
Safety and Efficacy of Arterial Switch Operation in Previously Inoperable Patients
JOURNAL OF CARDIAC SURGERY, Issue 4 2010. Liu Ying-long, M.D.
This study aimed to evaluate the safety and efficacy of the arterial switch operation (ASO) in this selected subset of patients. Methods: The records of 86 patients older than six months with complete transposition of the great arteries and ventricular septal defect, or Taussig-Bing anomaly, and severe pulmonary arterial hypertension (PAH) who underwent ASO at our institution from May 2000 to October 2008 were reviewed retrospectively. Eighty survivors were followed up. Results: There were six hospital deaths (7.0%, 95% confidence limits 1.6 to 12.4%). From January 2006 to October 2008, 46 consecutive ASOs were performed with no deaths. Operative mortality and morbidity decreased significantly (p = 0.008 and p = 0.046, respectively). The median duration of follow-up was 42.1 ± 28.8 months (range, 2.0 to 99.5). Two late deaths occurred. The latest follow-up data showed that 2.8% of survivors were in New York Heart Association (NYHA) class II and 97.2% were in NYHA class I. Conclusions: Excellent early and mid-term results of ASO have been obtained in patients older than six months with complete transposition of the great arteries and ventricular septal defect, or Taussig-Bing anomaly, and severe PAH in the current era, and ASO is safe and effective in this selected subset of patients. (J Card Surg 2010;25:400-405) [source]
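The quoted limits of 1.6 to 12.4% for the 7.0% hospital mortality are what a simple normal-approximation (Wald) interval for 6 deaths in 86 operations gives. A quick check in Python:

import math

deaths, n = 6, 86
p = deaths / n
half = 1.96 * math.sqrt(p * (1 - p) / n)
print(f"p = {p:.3f}, 95% CL: {p - half:.3f} to {p + half:.3f}")  # 0.070, 0.016 to 0.124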
Safety and efficacy of sonographic-guided random real-time core needle biopsy of the liver
JOURNAL OF CLINICAL ULTRASOUND, Issue 3 2009. Siddharth A. Padia, MD
Purpose: To determine the safety and efficacy of real-time, sonographic-guided, random percutaneous needle biopsy of the liver in a tertiary medical center. Method: From an IRB-approved biopsy database, all patients who had random liver biopsy performed over a 24-month period were selected. In 350 patients, 539 random percutaneous needle biopsies of the liver were performed under real-time sonographic visualization. The following were recorded from the electronic medical record: patient demographics; indication for the biopsy procedure; radiologist's name; needle type, gauge, and number of passes; use and amount of IV sedation or anesthesia; adequacy of the specimen; and complications following the procedure. Result: Of the 539 biopsies, 378 (70%) were performed on liver transplant recipients. Among the biopsy procedures in nontransplant patients, 81/161 (50%) concurrently underwent biopsy of a focal liver mass. An 18-gauge automated core biopsy needle was used in 536/539 (99%). The median number of passes per biopsy procedure was 1 (mean, 1.7; range, 1–6). Sedation using midazolam and fentanyl was used in 483/539 (90%). There were only 8 inadequate specimens (1.5%; upper 95% confidence limit, 2.3%). Complications were identified in 11/539 biopsy procedures (2.0%; upper 95% confidence limit, 2.6%): 5 with severe postprocedural pain, 3 with symptomatic hemorrhage, 2 with infection, and 1 with a rash. There were no sedation-related complications and no deaths related to the procedure. Conclusion: Real-time, sonographic-guided, random core-needle liver biopsy is a safe and highly effective procedure. © 2009 Wiley Periodicals, Inc. J Clin Ultrasound 2009 [source]

A Computer-Based Method for Determination of the Cell-Free Layer Width in Microcirculation
MICROCIRCULATION, Issue 3 2006. SANGHO KIM
Objectives: The cell-free layer between the erythrocyte column and the vessel wall is an important determinant of hydrodynamic resistance in microcirculatory vessels. The authors report a method for continuous measurement of the width of this layer. Methods: The light intensity of a linear array of pixels perpendicular to the vessel axis is continuously determined from a video image of a microcirculatory vessel. A threshold level based on Otsu's method is used to establish the interface between the cell-free layer and the erythrocyte column. To test the method, video images at 750–4,500 frames/s were obtained from venules and arterioles in rat spinotrapezius muscle at normal and reduced arterial pressures, before and after induction of erythrocyte aggregation with Dextran 500. The new measurements were compared to manual measurements of the same images. Results: Values obtained by the manual and the new methods were in agreement within the 95% confidence limits by Bland-Altman analysis and within the 90–95% range by the correlation coefficient (R2). The more frequent measurements reveal substantial, rapid variations in cell-free layer width, and changes in mean values with alteration of arterial pressure and red cell aggregability. Conclusions: A new, computer-based technique has been developed that provides measurements of rapid, time-dependent variations in the width of the cell-free layer in the microcirculation. [source]
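The Bland-Altman analysis used above compares two measurement methods through the differences of paired readings: the bias is the mean difference, and the 95% limits of agreement are bias ± 1.96 SD, within which about 95% of differences should fall. A minimal sketch on hypothetical paired widths (values invented):

import numpy as np

manual = np.array([10.2, 11.5, 9.8, 12.1, 10.9, 11.2])  # hypothetical widths (µm)
auto = np.array([10.0, 11.9, 9.5, 12.4, 11.1, 11.0])

diff = auto - manual
bias = diff.mean()
loa = 1.96 * diff.std(ddof=1)  # half-width of the 95% limits of agreement
print(f"bias = {bias:.2f}, limits of agreement: {bias - loa:.2f} to {bias + loa:.2f}")
print(f"fraction inside limits: {np.mean(np.abs(diff - bias) <= loa):.0%}")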
Extended X-ray emission in the high-redshift quasar GB 1508+5714 at z = 4.3
MONTHLY NOTICES OF THE ROYAL ASTRONOMICAL SOCIETY, Issue 1 2003. W. Yuan
ABSTRACT We report the discovery of extended X-ray emission around the powerful high-redshift quasar GB 1508+5714 at z = 4.3, revealed in a long Chandra ACIS observation. The emission feature is 3–4 arcsec away from the quasar core, which corresponds to a projected distance of about 25 kpc. The X-ray spectrum is best fitted with a power law of photon index 1.92 ± 0.35 (90 per cent confidence limits). The X-ray flux and luminosity reach 9.2 × 10^-15 erg cm^-2 s^-1 (0.5–8 keV) and 1.6 × 10^45 erg s^-1 (2.7–42.4 keV rest frame; ΩΛ = 0.73, Ωm = 0.27, H0 = 71 km s^-1 Mpc^-1), which is about 2 per cent of the total X-ray emission of the quasar. We interpret the X-ray emission as inverse Compton scattering of cosmic microwave background photons. The scattering relativistic electron population could be either a quasi-static diffuse cloud fed by the jet, or an outer extension of the jet with a high bulk Lorentz factor. We argue that the lack of an obvious detection of radio emission from the extended component could be a consequence of Compton losses on the electron population, or of a low magnetic field. Extended X-ray emission produced by inverse Compton scattering may be common around high-redshift radio galaxies and quasars, demonstrating that significant power is injected into their surroundings by powerful jets. [source]

Constraints on the cosmic neutrino background
MONTHLY NOTICES OF THE ROYAL ASTRONOMICAL SOCIETY, Issue 4 2003. Elena Pierpaoli
ABSTRACT The radiative component of the Universe has a characteristic impact on both large-scale structure (LSS) and the cosmic microwave background radiation (CMB). We use the recent WMAP data, together with previous Cosmic Background Imager (CBI) data and the 2dF matter power spectrum, to constrain the effective number of neutrino species Neff in a general cosmology. We find that Neff = 4.31, with 95 per cent confidence limits 1.6 ≤ Neff ≤ 7.1. If we include the H0 prior from the HST project, we find a best fit Neff = 4.08 and 1.90 ≤ Neff ≤ 6.62 at the 95 per cent confidence limit. The curvature we derive is still consistent with flat, but assuming a flat Universe from the beginning implies a bias toward lower Neff, as well as artificially smaller error bars. Adding the supernova constraint does not improve the result. We analyse and discuss the degeneracies with other parameters, and point out that probes of the matter power spectrum on smaller scales and accurate independent σ8 measurements, together with better independent measurements of H0, would help in breaking the degeneracies. [source]

Upper susceptibility threshold limits with confidence intervals: a method to identify normal and abnormal population values for laboratory toxicological parameters, based on acetylcholinesterase activities in sea lice
PEST MANAGEMENT SCIENCE (FORMERLY: PESTICIDE SCIENCE), Issue 3 2006. Anders Fallang
Abstract The interpretation and importance of comparing field values of susceptibility to pesticides with a laboratory reference strain that might bear little resemblance to the actual situation in the field are problematic and a continuing subject of debate. In this paper a procedure for defining a 'normal sensitive' population from a field study of 383 individuals, to provide a basis for analysing and interpreting in vitro results, is described and examined. Instead of using only the 95th percentile, the upper and lower confidence limits for the 95th percentile were also compared in order to select the best estimate of the limit for the normal material. A field population constrained by the upper confidence limit for the 95th percentile provided an appropriate description of the normal material in this study. This approach should prove useful in studies of pesticide resistance in field populations. Copyright © 2006 Society of Chemical Industry [source]

Estimation of the number of working hours critical for the development of mental and physical fatigue symptoms in Japanese male workers: application of the benchmark dose method
AMERICAN JOURNAL OF INDUSTRIAL MEDICINE, Issue 3 2007. Yasushi Suwazono, PhD
Background: To clarify the influence of working hours on subjective fatigue symptoms and to obtain the critical dose (number of hours) for determining the number of permissible working hours, we calculated the benchmark dose (BMD) and the 95% lower confidence limit on the BMD (BMDL) of working hours for subjective mental and physical fatigue symptoms using multivariate logistic regression. Methods: Self-administered questionnaires were distributed to all 843 male daytime workers aged ≤60 years in a single chemical factory, and 715 provided complete replies. The odds ratios for daily working hours were determined using positive findings on the Self-rating Depression Scale and 8 subscales of the Cumulative Fatigue Symptom Index as dependent variables, and other potential covariates as independent variables. Using the significant parameters for working hours and those for the other covariates, the BMD and BMDL (BMD/BMDL) values were calculated for the corresponding dependent variables. The benchmark response (BMR) was set at 5% or 10%. Results: The BMDL with a BMR of 5% was 9.6–11.6 hr per day, which corresponds to 48–58 working hours per week and 36–78 overtime hours per month. Conclusions: These results suggest that special attention should be paid to workers whose working hours exceed these BMD/BMDL values. Am. J. Ind. Med. 50:173–182, 2007. © 2007 Wiley-Liss, Inc. [source]
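A BMD/BMDL pair like the one above comes from the fitted dose-response model: the BMD is the dose at which the modelled additional risk over background equals the benchmark response, and the BMDL repeats the calculation at the lower confidence bound of the fit. A sketch with a logistic model and made-up coefficients (b0 and b1 are placeholders, not the study's estimates):

import math
from scipy.optimize import brentq

b0, b1 = -3.0, 0.25   # hypothetical logistic coefficients for daily working hours
BMR = 0.05            # benchmark response: 5% additional risk

def p(d):
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * d)))

# BMD solves P(d) - P(0) = BMR; a BMDL would re-solve with b0, b1 at their 95% bound
bmd = brentq(lambda d: p(d) - p(0.0) - BMR, 0.0, 24.0)
print(f"BMD = {bmd:.2f} hours/day")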
Increased incidence of malignancies in Sweden after the Chernobyl accident: a promoting effect?
AMERICAN JOURNAL OF INDUSTRIAL MEDICINE, Issue 3 2006. Martin Tondel
Background: After the Chernobyl accident in 1986, as much as 5% of the released caesium-137 was deposited in Sweden, owing to heavy rainfall two days after the event. A study of the increased incidence of malignancies was initiated after the accident. Methods: The cohort included 1,137,106 inhabitants who were 0–60 years old in 1986 and lived in the 8 counties of Sweden with the highest fallout of caesium-137. Using the dwelling coordinates, GIS techniques, and a digital map of caesium-137 deposition, each individual was matched to an exposure level. Adjustments were made for several potential confounding factors. During the follow-up, 33,851 malignancies were recorded in 1988–1999. Results: The exposure categories were 0–8 (reference), 9–23, 24–43, 44–66, 67–84, and ≥85 nGy/hr. The corresponding adjusted Mantel-Haenszel incidence rate ratios for total malignancies during follow-up were 1.000, 0.997, 1.072, 1.114, 1.068, and 1.125, respectively. The excess relative risk per 100 nGy/hr, with the same adjustments and time period, was 0.042 (95% confidence limits 0.001–0.084). An excess of thyroid cancer or leukemia could not be ruled out. Conclusion: An increased incidence of total malignancies, possibly related to the fallout from the Chernobyl accident, is seen. Am. J. Ind. Med. © 2006 Wiley-Liss, Inc. [source]

A New Method for Constructing Confidence Intervals for the Index Cpm
QUALITY AND RELIABILITY ENGINEERING INTERNATIONAL, Issue 7 2004. Michael Perakis
Abstract In the statistical literature on the study of the capability of processes through the use of indices, Cpm appears to have been one of the most widely used capability indices, and its estimation has attracted much interest. In this article, a new method for constructing approximate confidence intervals or lower confidence limits for this index is suggested. The method is based on an approximation of the non-central chi-square distribution that was proposed by Pearson. Its coverage appears to be more satisfactory than that achieved by either of the two most widely used methods, which were proposed by Boyles, in situations where one is interested in assessing a lower confidence limit for Cpm. This is supported by the results of an extensive simulation study. Copyright © 2004 John Wiley & Sons, Ltd. [source]
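For reference, the index is Cpm = (USL - LSL) / (6 * sqrt(s^2 + (xbar - T)^2)), where T is the target value. One of the widely used Boyles-type lower confidence limits, shown here as an illustrative stand-in for the paper's new method, scales the estimate by a chi-square quantile with an effective degrees-of-freedom correction:

import numpy as np
from scipy.stats import chi2

def cpm_lower_cl(x, usl, lsl, target, alpha=0.05):
    n = len(x)
    xbar, s2 = np.mean(x), np.var(x, ddof=1)
    cpm = (usl - lsl) / (6.0 * np.sqrt(s2 + (xbar - target) ** 2))
    lam = (xbar - target) / np.sqrt(s2)
    nu = n * (1 + lam ** 2) ** 2 / (1 + 2 * lam ** 2)  # effective d.o.f.
    return cpm, cpm * np.sqrt(chi2.ppf(alpha, nu) / nu)

rng = np.random.default_rng(0)
x = rng.normal(10.1, 0.5, size=50)  # simulated process measurements
print(cpm_lower_cl(x, usl=12.0, lsl=8.0, target=10.0))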
New method to census primate groups: Estimating group density of Japanese macaques by point census
AMERICAN JOURNAL OF PRIMATOLOGY, Issue 2 2003. Goro Hanya
Abstract We devised a new method to estimate the density of primate groups in habitats that preclude the use of a line-transect census because the ground is too steep. We combined point censuses and group follows. From the number of groups counted at a fixed point for an hour, n, group density D was calculated as D = nλ/π, where λ, the detectability constant, is obtained when distance-dependent detectability g(y) is regressed on a half-normal model, g(y) = e^(-λy^2); λ can be estimated by combining the information from the group follows and the point census. Using this method, we estimated the group density of Japanese macaques in Yakushima. A census area of 7 km2 was divided into 28 grid squares (500 m × 500 m). One observer was positioned at a point in each grid square, and those points were censused simultaneously for 4–6 days, from 0600–0700 to 1500–1600 hr. Four troops were followed for 144 hr during the point census. Distance-dependent detectability closely followed the half-normal model. The detectability constant varied with the time of day, but it was not influenced by troop identity or topography. Group density was calculated to be 1.48 ± 0.61 and 0.701 ± 0.432 groups/km2 in the disturbed and undisturbed areas, respectively (95% confidence limits). "True" group density estimated from home range data was within the confidence limits calculated by point census in the home ranges of the troops for two troops, suggesting that this method is valid. This method is applicable to other species as long as at least one group can be followed, because it satisfies the fundamental assumptions of point census, and the detectability does not seem to be biased by troop or topography. Am. J. Primatol. 60:43–56, 2003. [source]
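A minimal sketch of the reconstructed estimator, assuming the half-normal detectability g(y) = exp(-lambda*y^2): detected distances then have density 2*lambda*y*exp(-lambda*y^2), so the maximum-likelihood estimate is lambda = n / sum(y^2), and density follows as D = n*lambda/pi for a single count. The distances below are invented:

import numpy as np

y = np.array([0.10, 0.22, 0.15, 0.31])  # radial detection distances (km), invented
n = y.size

lam = n / np.sum(y ** 2)   # MLE of the detectability constant
D = n * lam / np.pi        # groups per km^2 implied by one census point
print(f"lambda = {lam:.1f} km^-2, D = {D:.2f} groups/km^2")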
Modelling the impact of an influenza A/H1N1 pandemic on critical care demand from early pathogenicity data: the case for sentinel reporting
ANAESTHESIA, Issue 9 2009. A. Ercole
Summary Projected critical care demand for pandemic influenza A/H1N1 in England was estimated in this study. The effect of varying hospital admission rates under statistical uncertainty was examined. Early in a pandemic, uncertainty in epidemiological parameters leads to a wide range of credible scenarios, with projected demand ranging from insignificant to overwhelming. However, even small changes to the input assumptions make the major-incident scenario increasingly likely. Before any cases had been admitted to hospital, the 95% confidence limits on admission rates led to a range in predicted peak critical care bed occupancy of between 0% and 37% of total critical care bed capacity, with half of these cases requiring ventilatory support. For hospital admission rates above 0.25%, critical care bed availability would be exceeded. Further, only 10% of critical care beds in England are in specialist paediatric units, but best estimates suggest that 30% of patients requiring critical care will be children. Paediatric intensive care facilities are likely to be quickly exhausted, suggesting that older children should be managed in adult critical care units to allow resource optimisation. Crucially, this study highlights the need for sentinel reporting and real-time modelling to guide rational decision making. [source]

Anti-JC virus antibodies: Implications for PML Risk Stratification
ANNALS OF NEUROLOGY, Issue 3 2010. Leonid Gorelik, PhD
Objective: A study was undertaken to establish an enzyme-linked immunosorbent assay (ELISA) to detect JC virus (JCV)-specific antibodies in multiple sclerosis (MS) patients, and to evaluate its potential utility for identifying patients at higher or lower risk (i.e., risk stratification) of developing progressive multifocal leukoencephalopathy (PML). Methods: A 2-step assay for detecting and confirming the presence of anti-JCV antibodies in human serum and plasma was developed and demonstrated to be both sensitive and specific. ELISA cutpoints were statistically established using sera from >800 MS patients from natalizumab clinical studies. Subsequently, this assay was used to determine the presence of anti-JCV antibodies in natalizumab-treated PML patients for whom serum samples had been collected 16–180 months prior to the diagnosis of PML. Results: In our evaluation of natalizumab-treated MS patients, 53.6% tested positive for anti-JCV antibodies, with a 95% confidence interval of 49.9 to 57.3%. The false-negative rate of the ELISA was calculated to be approximately 2.5%, with an upper 1-sided confidence limit of 4.4%. Notably, we observed anti-JCV antibodies in all 17 available pre-PML serum samples, which is significantly different from the 53.6% seropositivity observed in the overall MS study population (p < 0.0001). Interpretation: This 2-step assay provides a means to classify MS patients as having detectable or not detectable levels of anti-JCV antibodies. The finding that all 17 of the available pre-PML samples tested seropositive, and none tested seronegative, warrants further research on the clinical utility of the anti-JCV antibody assay as a potential tool for stratifying MS patients by their risk of developing PML. Ann Neurol 2010 [source]

Risk Assessment for Quantitative Responses Using a Mixture Model
BIOMETRICS, Issue 2 2000. Mehdi Razzaghi
Summary A problem that frequently occurs in biological experiments with laboratory animals is that some subjects are less susceptible to the treatment than others. A mixture model has traditionally been proposed to describe the distribution of responses in treatment groups for such experiments. Using a mixture dose-response model, we derive an upper confidence limit on additional risk, defined as the excess risk over the background risk due to an added dose. Our focus is on experiments with continuous responses, for which risk is the probability of an adverse effect defined as an event that is extremely rare in controls. The asymptotic distribution of the likelihood ratio statistic is used to obtain the upper confidence limit on additional risk. The method can also be used to derive a benchmark dose corresponding to a specified level of increased risk. The EM algorithm is utilized to find the maximum likelihood estimates of the model parameters, and an extension of the algorithm is proposed to derive the estimates when the model is subject to a specified level of added risk. An example is used to demonstrate the results, and it is shown that by using the mixture model a more accurate measure of added risk is obtained. [source]
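The EM algorithm mentioned above alternates an E-step, which computes each observation's posterior probability of belonging to one mixture component, with an M-step, which re-estimates the means, variances, and mixing weight from those probabilities. A bare-bones two-component Gaussian sketch on simulated data (not the authors' dose-response parameterization):

import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)
x = np.concatenate([rng.normal(0, 1, 300), rng.normal(3, 1, 100)])  # toy responses

w, mu1, mu2, s1, s2 = 0.5, x.min(), x.max(), x.std(), x.std()
for _ in range(200):
    # E-step: responsibility of component 2 for each observation
    p1 = (1 - w) * norm.pdf(x, mu1, s1)
    p2 = w * norm.pdf(x, mu2, s2)
    r = p2 / (p1 + p2)
    # M-step: weighted updates of the mixture parameters
    w = r.mean()
    mu1, mu2 = np.average(x, weights=1 - r), np.average(x, weights=r)
    s1 = np.sqrt(np.average((x - mu1) ** 2, weights=1 - r))
    s2 = np.sqrt(np.average((x - mu2) ** 2, weights=r))

print(f"weight = {w:.2f}, means = {mu1:.2f}, {mu2:.2f}")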
Interobserver Agreement in Assessment of Clinical Variables in Children with Blunt Head Trauma
ACADEMIC EMERGENCY MEDICINE, Issue 9 2008. Marc H. Gorelick, MD
Objectives: To be useful in the development of clinical decision rules, clinical variables must demonstrate acceptable agreement when assessed by different observers. The objective was to determine the interobserver agreement in the assessment of historical and physical examination findings of children undergoing emergency department (ED) evaluation for blunt head trauma. Methods: This was a prospective cohort study of children younger than 18 years evaluated for blunt head trauma at one of 25 EDs in the Pediatric Emergency Care Applied Research Network (PECARN). Patients were excluded if the injury had occurred more than 24 hours prior to evaluation, if neuroimaging had been obtained at another hospital prior to evaluation, or if the patient had a clinically trivial mechanism of injury. Two clinicians independently completed a standardized clinical assessment on a templated data form. Assessments were performed within 60 minutes of each other and prior to clinician review of any neuroimaging (if obtained). Agreement between the two observers beyond that expected by chance was calculated for each clinical variable, using the kappa (κ) statistic for categorical variables and weighted kappa for ordinal variables. Variables with a lower 95% confidence limit (LCL) of κ > 0.4 were considered to have acceptable agreement. Results: Fifteen hundred pairs of observations were obtained. Acceptable agreement was achieved for 27 of the 32 variables studied (84%). Mechanism of injury (low, medium, or high risk) had κ = 0.83. For subjective symptoms, kappa ranged from 0.47 (dizziness) to 0.93 (frequency of vomiting); all had a 95% LCL > 0.4. For the physical examination findings, kappa ranged from 0.22 (agitated) to 0.89 (Glasgow Coma Scale [GCS] score). The 95% LCL for kappa was <0.4 for four individual signs of altered mental status and for the quality (i.e., boggy or firm) of scalp hematoma, if present. Conclusions: Both subjective and objective clinical variables in children with blunt head trauma can be assessed by different observers with acceptable agreement, making these variables suitable candidates for clinical decision rules. [source]
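Cohen's kappa compares observed agreement po with chance agreement pe as kappa = (po - pe) / (1 - pe); pairing it with a large-sample standard error gives the lower confidence limit used as the acceptance criterion above. A sketch on a hypothetical 2x2 agreement table, using a common simplified SE formula (the exact variance expression is longer):

import numpy as np

# Hypothetical agreement table: rows = observer A, columns = observer B
table = np.array([[70.0, 10.0],
                  [8.0, 62.0]])
n = table.sum()
po = np.trace(table) / n
pe = (table.sum(axis=0) * table.sum(axis=1)).sum() / n ** 2
kappa = (po - pe) / (1 - pe)

se = np.sqrt(po * (1 - po) / n) / (1 - pe)  # approximate large-sample SE
print(f"kappa = {kappa:.2f}, lower 95% confidence limit = {kappa - 1.96 * se:.2f}")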
Measurement of memory of color
COLOR RESEARCH & APPLICATION, Issue 4 2002. H. H. Seliger
Abstract Colors produced by monochromatic wavelengths of light viewed in isolation have been used as the only visual variables in short-term delayed matching (DM) and long-term recall (LTR) protocols to quantify three types of color memory in individuals with normal color vision. Measurements were normally distributed, so that the color memories of individuals could be compared in terms of means and standard deviations. The variance of LTR of the colors of familiar objects is shown to be separable into two portions, one due to "preferred colors" and the other due to individuals' precisions of matching. The wavelength dependence of DM exhibited minima of standard deviations at the same wavelengths as those reported for color discrimination measured by bipartite wavelength matching, and these wavelengths were shown to occur at the wavelengths of the intersections of cone spectral sensitivities. In an intermediate "green" region of relatively constant color discrimination, it was possible to combine DM measurements for different wavelengths for statistical analysis. The standard deviations of DM for individuals of a healthy population were normally distributed, providing a 95% upper confidence limit for identifying individuals with possible short-term memory impairment. Preliminary measurements of standard deviations of DM for delay times of ≤1 s were consistent with a proposed rapidly decaying color imagery memory. © 2002 Wiley Periodicals, Inc. Col Res Appl, 27, 233–242, 2002; Published online in Wiley InterScience (www.interscience.wiley.com). DOI 10.1002/col.10067 [source]

Using Population Count Data to Assess the Effects of Changing River Flow on an Endangered Riparian Plant
CONSERVATION BIOLOGY, Issue 4 2006. DIANE M. THOMSON
Keywords: population viability analysis; riparian management; diffusion method; dams; extinction risk
Abstract: Methods for using simple population count data to project extinction risk have been the focus of much recent theoretical work, but few researchers have used these approaches to address management questions. We analyzed 15 years of census data on the federally endangered endemic riparian plant Pityopsis ruthii (Small) with the diffusion approximation (DA). Our goals were to evaluate relative extinction risk among populations in two different watersheds (in Tennessee, U.S.A.) and the potential effects of variation in managed river flow on population dynamics. Populations in both watersheds had high projected risks of extinction within 50 years, but the causes of this risk differed. Populations of P. ruthii on the Hiwassee River had higher initial population sizes but significantly lower average growth rates than those on the Ocoee River. The only populations with low predicted short-term extinction risk were on the Ocoee. Growth rates for populations on both rivers were significantly reduced during periods of lower river flow. We found only marginal evidence of a quadratic relationship between population performance and flow. These patterns are consistent with the idea that low flows affect P. ruthii through the growth of competing vegetation, but the degree to which very high flows may reduce population growth is still unclear. Simulations indicated that populations were most sensitive to growth rates in low-flow years, but small changes in the frequency of these periods did not strongly increase risk for most populations. Consistent with the results of other studies, DA estimates of extinction risk had wide confidence limits. Still, our results yielded several valuable insights, including the need for greater monitoring of populations on the Hiwassee and the importance of low-flow years to population growth. Our work illustrates the potential value of simple methods for analyzing count data, despite the challenges posed by uncertainty in estimates of extinction risk. [source]
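The diffusion approximation used above needs only the series of annual counts: the mean and variance of the log growth rates, and from them the probability of ever falling to a quasi-extinction threshold. A minimal Dennis-style sketch (the counts and the threshold of 10 plants are invented):

import numpy as np

counts = np.array([120, 110, 95, 100, 80, 85, 70, 65, 60, 55])  # annual censuses
log_lam = np.diff(np.log(counts))
mu, sigma2 = log_lam.mean(), log_lam.var(ddof=1)

d = np.log(counts[-1] / 10.0)  # log distance to the quasi-extinction threshold
# Probability of ever hitting the threshold under the diffusion approximation
p_ext = 1.0 if mu <= 0 else float(np.exp(-2.0 * mu * d / sigma2))
print(f"mu = {mu:.3f}, sigma^2 = {sigma2:.3f}, P(quasi-extinction) = {p_ext:.2f}")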
Evaluation of statistical methods for left-censored environmental data with nonuniform detection limits
ENVIRONMENTAL TOXICOLOGY & CHEMISTRY, Issue 9 2006. Parikhit Sinha
Abstract Monte Carlo simulations were used to evaluate statistical methods for estimating the 95% upper confidence limits of mean constituent concentrations for left-censored data with nonuniform detection limits. Two primary scenarios were evaluated: data sets with 15 to 50% nondetected samples, and data sets with 51 to 80% nondetected samples. Sample size and the percentage of nondetected samples were allowed to vary randomly to generate a variety of left-censored data sets. All statistical methods were evaluated for efficacy by comparing the 95% upper confidence limits for the left-censored data with the 95% upper confidence limits for the noncensored data and by determining the percent coverage of the true mean (µ). For data sets with 15 to 50% nondetected samples, the trimmed mean, Winsorization, Aitchison's, and log-probit regression methods were evaluated. Log-probit regression was the only method that yielded sufficient coverage (99–100%) of µ, as well as a high correlation coefficient (r2 = 0.99) and small average percent residuals (≤0.1%) between the upper confidence limits for censored versus noncensored data sets. For data sets with 51 to 80% nondetected samples, a bounding method was effective (r2 = 0.96–0.99, average residual = −5% to −7%, 95–98% coverage of µ), except when applied to distributions with low coefficients of variation (standard deviation/µ < 0.5). Thus, the following recommendations are supported by this research: for data sets with 15 to 50% nondetected samples, the log-probit regression method with use of the Chebyshev theorem to estimate 95% upper confidence limits; for data sets with 51 to 80% nondetected samples, the bounding method with use of the Chebyshev theorem to estimate 95% upper confidence limits. [source]
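The Chebyshev UCL recommended above is distribution-free: UCL = xbar + sqrt(1/alpha - 1) * s / sqrt(n). At alpha = 0.05 the multiplier is sqrt(19), about 4.36, versus roughly 1.65 for a normal-theory UCL, which is why it is a conservative choice for awkward censored data. In Python:

import math

def chebyshev_ucl(xbar, s, n, alpha=0.05):
    # Distribution-free upper confidence limit for a mean
    return xbar + math.sqrt(1.0 / alpha - 1.0) * s / math.sqrt(n)

print(chebyshev_ucl(xbar=12.4, s=8.1, n=30))  # hypothetical summary statistics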
Spatial point-process statistics: concepts and application to the analysis of lead contamination in urban soil
ENVIRONMETRICS, Issue 4 2005. Christian Walter
Abstract This article explores the use of spatial point-process analysis as an aid to describing topsoil lead distribution in urban environments. The data used were collected in Glebe, an inner suburb of Sydney. The approach focuses on the locations of punctual events defining a point pattern, which can be statistically described through local intensity estimates and between-point distance functions. F-, G- and K-surfaces of a marked spatial point pattern were described and used to estimate nearest-distance functions over a sliding band of quantiles of the marking variable. This provided a continuous view of the point-pattern properties as a function of the marking variable. Several random fields were simulated by selecting points from random, clustered or regular point processes and diffusing them. Recognition of the underlying point process using variograms derived from dense sampling was difficult because, structurally, the variograms were very similar. Point-event distance functions were useful complementary tools that, in most cases, enabled clear recognition of the clustered processes. Spatial sampling quantile point pattern analysis was defined and applied to the Glebe data set. The analysis showed that the highest lead concentrations were strongly clustered. Comparison of this data set with the simulation confidence limits of a Poisson process, a short-radius clustered point process and a geostatistical simulation showed a random process for the third quartile of lead concentrations but strong clustering for the data in the upper quartile. Thus the distribution of topsoil lead concentrations over Glebe may have resulted from several contamination processes, mainly regular or random processes with large diffusion ranges and short-range clustered processes for the hot spots. Point patterns with the same characteristics as the Glebe experimental pattern could be generated by separate additive geostatistical simulation. Spatial sampling quantile point pattern statistics can, in an easy and accurate way, be used complementarily with geostatistical methods. Copyright © 2005 John Wiley & Sons, Ltd. [source]

Confidence intervals for the calibration estimator with environmental applications
ENVIRONMETRICS, Issue 1 2002. I. Müller
Abstract The article investigates different estimation techniques in the simple linear controlled calibration model and provides different types of confidence limits for the calibration estimator. In particular, M-estimation and bootstrapping techniques are implemented to obtain estimates of the regression parameters during the training stage. Moreover, the bootstrap is used to construct several types of confidence intervals, which are compared to the classical approach based on the assumption of normality. For some of these intervals, second-order asymptotic properties can be established by means of Edgeworth expansions. Two data sets, one on space debris and the other on bacteriological counts in water samples, are used to illustrate the method's environmental applications. Copyright © 2002 John Wiley & Sons, Ltd. [source]
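Bootstrap intervals like those used for the calibration estimator resample the data with replacement, recompute the estimator on each resample, and read the percentile interval off the empirical distribution. A generic sketch for an arbitrary statistic (data values are placeholders):

import numpy as np

def boot_percentile_ci(x, stat, n_boot=10_000, alpha=0.05, seed=0):
    rng = np.random.default_rng(seed)
    reps = np.array([stat(rng.choice(x, size=x.size, replace=True))
                     for _ in range(n_boot)])
    return np.percentile(reps, [100 * alpha / 2, 100 * (1 - alpha / 2)])

x = np.array([4.1, 5.0, 3.8, 6.2, 5.5, 4.9, 5.1])
print(boot_percentile_ci(x, np.mean))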
PERSPECTIVE: GENE DIVERGENCE, POPULATION DIVERGENCE, AND THE VARIANCE IN COALESCENCE TIME IN PHYLOGEOGRAPHIC STUDIES
EVOLUTION, Issue 6 2000. Scott V.
Abstract Molecular methods as applied to the biogeography of single species (phylogeography) or multiple codistributed species (comparative phylogeography) have been productively and extensively used to elucidate common historical features in the diversification of the Earth's biota. However, only recently have methods for estimating population divergence times or their confidence limits, while taking into account the critical effects of genetic polymorphism in ancestral species, become available, and earlier methods for doing so are underutilized. We review models that address the crucial distinction between the gene divergence, the parameter that is typically recovered in molecular phylogeographic studies, and the population divergence, which is in most cases the parameter of interest and will almost always postdate the gene divergence. Assuming that population sizes of ancestral species are distributed similarly to those of extant species, we show that phylogeographic studies in vertebrates suggest that divergence of alleles in ancestral species can comprise from less than 10% to over 50% of the total divergence between sister species, suggesting that the problem of ancestral polymorphism in dating population divergence can be substantial. The variance in the number of substitutions (among loci for a given species or among species for a given gene) resulting from the stochastic nature of DNA change is generally smaller than the variance due to substitutions along allelic lines whose coalescence times vary due to genetic drift in the ancestral population. Whereas the former variance can be reduced by further DNA sequencing at a single locus, the latter cannot. Contrary to phylogeographic intuition, dating population divergence times when allelic lines have achieved reciprocal monophyly is in some ways more challenging than when allelic lines have not achieved monophyly, because in the former case the critical data on ancestral population size provided by residual ancestral polymorphism are lost. In the former case, differences in coalescence time between species pairs can in principle be explained entirely by differences in ancestral population size, without resorting to explanations involving differences in divergence time. Furthermore, the confidence limits on population divergence times are severely underestimated when those for the number of substitutions per site in the DNA sequences examined are used as a proxy. This uncertainty highlights the importance of multilocus data in estimating population divergence times; multilocus data can in principle distinguish differences in coalescence time (T) resulting from differences in population divergence time from differences in T due to differences in ancestral population sizes, and will reduce the confidence limits on the estimates. We analyze the contribution of ancestral population size (θ) to T, and the effect of uncertainty in θ on estimates of population divergence (τ), for single loci under reciprocal monophyly, using a simple Bayesian extension of Takahata and Satta's and Yang's recent coalescent methods. The confidence limits on τ decrease when the range over which the ancestral population size θ is assumed to be distributed decreases and when τ increases; they generally exclude zero when τ/(4Ne) > 1. We also apply a maximum-likelihood method to several single- and multilocus data sets. With multilocus data, the criterion for excluding τ = 0 is roughly that lτ/(4Ne) > 1, where l is the number of loci. Our analyses corroborate recent suggestions that increasing the number of loci is critical to decreasing the uncertainty in estimates of population divergence time. [source]
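The gene-versus-population divergence gap above is easy to quantify: for two alleles sampled from sister species that split t generations ago, the expected coalescence time is t + 2Ne generations, where Ne is the ancestral effective population size, so the ancestral share of the gene divergence is 2Ne / (t + 2Ne). A worked example with round (assumed) numbers:

t = 1_000_000   # population divergence time in generations (assumed)
Ne = 250_000    # ancestral effective population size (assumed)

expected_T = t + 2 * Ne  # expected coalescence time for a pair of alleles
ancestral_share = 2 * Ne / expected_T
print(f"E[T] = {expected_T:,} generations; "
      f"{ancestral_share:.0%} of the divergence predates the population split")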
Are stock assessment methods too complicated?
FISH AND FISHERIES, Issue 3 2004. A. J. R. Cotter
Abstract This critical review argues that several methods for the estimation and prediction of numbers-at-age, fishing mortality coefficients F, and recruitment for a stock of fish are too hard to explain to customers (the fishing industry, managers, etc.) and do not pay enough attention to weaknesses in the supporting data, assumptions, and theory. The review is linked to North Sea demersal stocks. First, weaknesses in the various types of data used in North Sea assessments are summarized, i.e. total landings, discards, commercial and research-vessel abundance indices, age-length keys, and natural mortality (M). A list of features that an ideal assessment should have is put forward as a basis for comparing different methods. The importance of independence and weighting when combining different types of data in an assessment is stressed. The assessment methods considered are virtual population analysis, ad hoc tuning, extended survivors analysis (XSA), year-class curves, catch-at-age modelling, and state-space models fitted by the Kalman filter or Bayesian methods. Year-class curves (not to be confused with 'catch curves') are the favoured method because of their applicability to data sets separately, their visual appeal, their simple statistical basis, their minimal assumptions, the availability of confidence limits, and the ease with which estimates can be combined from different data sets after separate analyses. They do not estimate absolute stock numbers or F, but neither do other methods unless M is accurately known, which is seldom true. [source]

Evaluation of large-scale stocking of early stages of brown trout, Salmo trutta, to angler catches in the French–Swiss part of the River Doubs
FISHERIES MANAGEMENT & ECOLOGY, Issue 2 2003. A. Champigneulle
Abstract Around 500,000 brown trout, Salmo trutta L., alevins are stocked annually in the 24-km section of the River Doubs under study. All the alevins stocked in the period 1994–1996 were identifiable through fluoromarking of their otoliths with tetracycline hydrochloride. Anglers' catches between June 1997 and September 1998 comprised trout aged 1+ to 7+, but most (90%+) were 2+, 3+, or 4+, with the majority at 2+ and 3+. There was no significant difference in size for a given age between marked and unmarked angled trout. The contribution of stocked fish to anglers' catches was around 22% for the 1995 cohort. The contribution of stocking (cohorts 1994 to 1995–1996) to the 1998 catches was around 23% (95% confidence limits: 19–27%). The estimated recapture rate was three to four trout per 1,000 alevins stocked for the 1995 cohort. The major contribution (78%) of natural recruitment to anglers' catches suggests that fishery management based on natural recruitment is still realistic in this part of the River Doubs. [source]

Calculating census efficiency for river birds: a case study with the White-throated Dipper Cinclus cinclus in the Pyrénées
IBIS, Issue 1 2003. Frank D'Amico
Using the binomial law, we modelled field data to estimate the probability (p̂) of detecting pairs of breeding White-throated Dippers and the population size (N̂, with confidence limits). The model was divided into two parts according to whether the actual size of the population under study was known or not; in the latter case the truncated binomial model was used. Dipper abundance data were collected from three 4-km-long river tracts in the Pyrénées (France) during the breeding seasons of different years. Goodness-of-fit tests indicated that the binomial model fitted the data well. For a given visit during the survey, the estimated probability of detecting any pair of Dippers, if present, was always high (0.63–0.94) and constant from year to year, but not between sites. Estimates (N̂) of population size provided by the binomial model were very close to those derived from mapping techniques. This study provides the first quantification of the number of visits required to detect birds on linear territories: three visits were necessary to detect the whole breeding population. [source]
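The closing claim is a direct consequence of the binomial detection model: if a pair is detected on one visit with probability p, the chance of at least one detection in k independent visits is 1 - (1 - p)^k. With the lowest per-visit detectability reported above, three visits already clear 95%:

p = 0.63  # lowest per-visit detection probability reported
for k in range(1, 5):
    print(k, round(1 - (1 - p) ** k, 3))
# k = 3 gives 1 - 0.37**3 = 0.949, i.e. about 95% of pairs detected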
Animal use replacement, reduction, and refinement: Development of an integrated testing strategy for bioconcentration of chemicals in fish
INTEGRATED ENVIRONMENTAL ASSESSMENT AND MANAGEMENT, Issue 1 2007. Watze de Wolf
Abstract When addressing the use of fish for assessing the environmental safety of chemicals and effluents, there are many opportunities for applying the principles of the 3Rs: Reduce, Refine, and Replace. The current environmental regulatory testing strategy for bioconcentration and secondary poisoning has been reviewed, and alternative approaches that provide useful information are described. Several approaches can be used to reduce the number of fish used in Organization for Economic Cooperation and Development (OECD) Test Guideline 305, including alternative in vivo test methods such as the dietary accumulation test and the static exposure approach. The best replacement approach would seem to be the use of read-across, chemical grouping, and quantitative structure-activity relationships with an assessment of the key processes in bioconcentration: absorption, distribution, metabolism, and excretion. Biomimetic extraction is particularly useful for addressing bioavailable chemicals and is in some circumstances capable of predicting uptake. The use of alternative organisms such as invertebrates should also be considered. A single cut-off value for molecular weight and size beyond which no absorption will take place cannot be identified. Recommendations for their use in bioaccumulative (B) categorization schemes are provided. Assessment of biotransformation with in vitro assays and in silico approaches holds significant promise. Further research is needed to identify their variability and confidence limits, and the ways to use these as a basis for estimating bioconcentration factors. A tiered bioconcentration testing strategy has been developed, taking account of the alternatives discussed. [source]

A case-control study of the association of the polymorphisms and haplotypes of DNA ligase I with lung and upper-aerodigestive-tract cancers
INTERNATIONAL JOURNAL OF CANCER, Issue 7 2008. Yuan-Chin Amy Lee
Abstract Tobacco smoking is a major risk factor for lung and upper-aerodigestive-tract (UADT) cancers. One possible mechanism for these associations may be through DNA damage pathways. DNA ligase I (LIG1) is a DNA repair gene involved in both the nucleotide excision repair (NER) and the base excision repair (BER) pathways. We examined the association of 4 LIG1 polymorphisms with lung and UADT cancers, and their potential interactions with smoking, in a population-based case-control study in Los Angeles County. We performed genotyping using the SNPlex method from Applied Biosystems. Logistic regression analyses of 551 lung cancer cases, 489 UADT cancer cases, and 948 controls showed the expected associations of tobacco smoking with lung and UADT cancers, and new associations between LIG1 haplotypes and these cancers. For lung cancer, when compared with the most common haplotype (rs20581-rs20580-rs20579-rs439132 = T-C-C-A), the adjusted odds ratio (OR) was 1.2 (95% confidence limits (CL) = 0.95, 1.5) for the CACA haplotype, 1.4 (1.0, 1.9) for the CATA haplotype, and 1.8 (1.1, 2.8) for the CCCG haplotype, after controlling for age, gender, race/ethnicity, education, and tobacco smoking. We observed weaker associations between LIG1 haplotypes and UADT cancers. Our findings suggest that LIG1 haplotypes may affect the risk of lung and UADT cancers. © 2007 Wiley-Liss, Inc. [source]
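Odds ratios and their confidence limits from a logistic regression, as reported for the LIG1 haplotypes, are the exponentiated coefficient and its Wald interval: OR = exp(beta), CL = exp(beta ± 1.96 * SE). Working backwards from the reported OR of 1.8 with CL (1.1, 2.8) approximately recovers beta and SE (a rough consistency check on the abstract's numbers):

import math

or_hat, lo, hi = 1.8, 1.1, 2.8
beta = math.log(or_hat)
se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # CI width on the log scale
print(f"beta = {beta:.2f}, SE = {se:.2f}")
# Reconstructed CL: exp(beta -/+ 1.96*SE), close to the reported (1.1, 2.8)
print(tuple(round(math.exp(beta + s * 1.96 * se), 2) for s in (-1, 1)))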