Sample Data (sample + data)

Distribution by Scientific Domains


Selected Abstracts


Antenatal use of enoxaparin for prevention and treatment of thromboembolism in pregnancy

BJOG : AN INTERNATIONAL JOURNAL OF OBSTETRICS & GYNAECOLOGY, Issue 9 2000
Joanne Ellison Clinical Research Fellow
Objective: To assess the safety and efficacy of enoxaparin use for thromboprophylaxis or treatment of venous thromboembolism during pregnancy. Design: Retrospective review of casenotes of women who received enoxaparin during pregnancy. Setting: Obstetric Medicine Unit at Glasgow Royal Maternity Hospital. Sample: Data were obtained on 57 pregnancies in 50 women over six years. Methods: Information was obtained from case records in relation to outcome measures, the presence of underlying thrombophilia and indication for anticoagulation. Main outcome measures: Incidences of venous thromboembolism, haemorrhage, thrombocytopenia, peak plasma anti-factor Xa levels and symptomatic osteoporosis. Results: There were no thromboembolic events in the thromboprophylaxis group. There were no incidences of heparin-induced thrombocytopenia. Twenty-two women had spinal or epidural anaesthesia and no complications were encountered. There was one instance of antepartum haemorrhage following attempted amniotomy in a woman with previously unknown vasa praevia. Two women sustained postpartum haemorrhage, both secondary to vaginal lacerations, resulting in blood loss > 1000 mL. Blood loss following caesarean section was not excessive. No instances of vertebral or hip fracture were encountered. The median peak plasma anti-factor Xa level on a dose of 40 mg once daily was 0.235 U/mL; peak plasma anti-factor Xa levels were not affected by gestational age. Conclusions: The use of enoxaparin in pregnancy is associated with a low incidence of complications and a dose of 40 mg once daily throughout pregnancy provides satisfactory anti-factor Xa levels and appears effective in preventing venous thromboembolism. [source]


Dizziness Presentations in U.S. Emergency Departments, 1995–2004

ACADEMIC EMERGENCY MEDICINE, Issue 8 2008
Kevin A. Kerber MD
Abstract Objectives: The objectives were to describe presentation characteristics and health care utilization information pertaining to dizziness presentations in U.S. emergency departments (EDs) from 1995 through 2004. Methods: From the National Hospital Ambulatory Medical Care Survey (NHAMCS), patient visits to EDs for "vertigo-dizziness" were identified. Sample data were weighted to produce nationally representative estimates. Patient characteristics, diagnoses, and health care utilization information were obtained. Trends over time were assessed using weighted least squares regression analysis. Multivariable logistic regression analysis was used to control for the influence of age on the probability of a vertigo-dizziness visit during the study time period. Results: Vertigo-dizziness presentations accounted for 2.5% (95% confidence interval [CI] = 2.4% to 2.6%) of all ED presentations during this 10-year period. From 1995 to 2004, the rate of visits for vertigo-dizziness increased by 37% and demonstrated a significant linear trend (p < 0.001). Even after adjusting for age (and other covariates), every increase in year was associated with increased odds of a vertigo-dizziness visit. At each visit, a median of 3.6 diagnostic or screening tests (95% CI = 3.2 to 4.1) were performed. Utilization of many tests increased over time (p < 0.01). The utilization of computerized tomography and magnetic resonance imaging (CT/MRI) increased 169% from 1995 to 2004, which was more than any other test. The rate of central nervous system diagnoses (e.g., cerebrovascular disease or brain tumor) did not increase over time. Conclusions: In terms of number of visits and important utilization measures, the impact of dizziness presentations on EDs is substantial and increasing. CT/MRI utilization rates have increased more than any other test. [source]
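As a rough illustration of the kind of trend analysis described in this abstract, the sketch below fits a weighted least squares line to annual visit-rate estimates. The data, weights, and variable names are entirely hypothetical, and this is not the authors' NHAMCS code.

```python
# Illustrative sketch (not the authors' analysis): test a linear time trend in
# annual vertigo-dizziness visit rates with weighted least squares.
# All numbers below are hypothetical stand-ins for survey-weighted estimates.
import numpy as np
import statsmodels.api as sm

years = np.arange(1995, 2005)

# Hypothetical national visit-rate estimates and their estimated variances
# (in practice these would come from applying the NHAMCS patient-visit weights).
visits_per_1000 = np.array([5.1, 5.3, 5.6, 5.5, 5.9, 6.1, 6.3, 6.5, 6.8, 7.0])
est_variance = np.full_like(visits_per_1000, 0.04)

# Weight each year by the inverse variance of its estimate.
X = sm.add_constant(years - years.min())
wls = sm.WLS(visits_per_1000, X, weights=1.0 / est_variance).fit()
print(wls.params)      # intercept and annual slope
print(wls.pvalues[1])  # p-value for the linear trend
```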


Auditor Quality and the Accuracy of Management Earnings Forecasts

CONTEMPORARY ACCOUNTING RESEARCH, Issue 4 2000
PETER M. CLARKSON
Abstract In this study, we appeal to insights and results from Davidson and Neu 1993 and McConomy 1998 to motivate empirical analyses designed to gain a better understanding of the relationship between auditor quality and forecast accuracy. We extend and refine Davidson and Neu's analysis of this relationship by introducing additional controls for business risk and by considering data from two distinct time periods: one in which the audit firm's responsibility respecting the earnings forecast was to provide review-level assurance, and one in which its responsibility was to provide audit-level assurance. Our sample data consist of Toronto Stock Exchange (TSE) initial public offerings (IPOs). The earnings forecast we consider is the one-year-ahead management earnings forecast included in the IPO offering prospectus. The results suggest that after the additional controls for business risk are introduced, the relationship between forecast accuracy and auditor quality for the review-level assurance period is no longer significant. The results also indicate that the shift in regimes alters the fundamental nature of the relationship. Using data from the audit-level assurance regime, we find a negative and significant relationship between forecast accuracy and auditor quality (i.e., we find Big 6 auditors to be associated with smaller absolute forecast errors than non-Big 6 auditors), and further, that the difference in the relationship between the two regimes is statistically significant. [source]
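To make the empirical design concrete, here is a hedged sketch of the kind of cross-sectional regression the abstract describes: absolute forecast error regressed on a Big 6 indicator plus business-risk controls, fit separately by assurance regime. All variables and data are invented for illustration.

```python
# Hypothetical sketch of the cross-sectional test described above. Variable
# names, controls, and data are invented; the paper's actual specification
# and controls differ.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 300
ipos = pd.DataFrame({
    "abs_forecast_error": rng.gamma(shape=2.0, scale=0.05, size=n),
    "big6": rng.integers(0, 2, size=n),          # 1 = Big 6 auditor
    "firm_size": rng.normal(10, 2, size=n),      # e.g., log market value (control)
    "leverage": rng.uniform(0, 1, size=n),       # business-risk control
    "regime": rng.choice(["review", "audit"], size=n),
})

# Fit the same model under each assurance regime and inspect the Big 6 coefficient.
for regime, grp in ipos.groupby("regime"):
    fit = smf.ols("abs_forecast_error ~ big6 + firm_size + leverage", data=grp).fit()
    print(regime, fit.params["big6"], fit.pvalues["big6"])
```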


A Geographic Information Systems-based, weights-of-evidence approach for diagnosing aquatic ecosystem impairment

ENVIRONMENTAL TOXICOLOGY & CHEMISTRY, Issue 8 2006
Katherine E. Kapo
Abstract A Geographic Information Systems-based, watershed-level assessment using Bayesian weights of evidence (WOE) and weighted logistic regression (WLR) provides a method to determine and compare potential environmental stressors in lotic ecosystems and to create predictive models of general or species-specific biological impairment across numerous spatial scales based on limited existing sample data. The WOE/WLR technique used in the present study is a data-driven, probabilistic approach conceptualized in epidemiological research and both developed for and currently used in minerals exploration. Extrapolation of this methodology to a case-study watershed assessment of the Great and Little Miami watersheds (OH, USA) using archival data yielded baseline results consistent with previous assessments. The method additionally produced a quantitative determination of physical and chemical watershed stressor associations with biological impairment and a predicted comparative probability (i.e., favorability) of biological impairment at a spatial resolution of 0.5 km² over the watershed study region. Habitat stressors showed the greatest spatial association with biological impairment in low-order streams (on average, 56% of total spatial association), whereas water chemistry, particularly that of wastewater effluent, was associated most strongly with biological impairment in high-order reaches (on average, 79% of total spatial association, 28% of which was attributed to effluent). Significant potential stressors varied by land-use and stream order as well as by species. This WOE/WLR method provides a highly useful "tier 1" watershed risk assessment product through the integration of various existing data sources, and it produces a clear visual communication of areas favorable for biological impairment and a quantitative ranking of candidate stressors and associated uncertainty. [source]
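The core arithmetic of Bayesian weights of evidence for a single binary evidence layer is compact enough to show directly. The sketch below computes the positive and negative weights and their contrast on simulated grid-cell data; it is only the elementary calculation, not the authors' GIS workflow or their weighted logistic regression step.

```python
# Minimal weights-of-evidence sketch for one binary evidence layer (e.g., a
# habitat-stressor flag) against binary biological impairment, on hypothetical
# grid-cell data.
import numpy as np

rng = np.random.default_rng(2)
impaired = rng.integers(0, 2, size=5000)                        # 1 = impaired cell
evidence = (rng.random(5000) < np.where(impaired, 0.6, 0.3)).astype(int)

def weights_of_evidence(evidence, outcome):
    """Return (W_plus, W_minus, contrast) for a binary evidence layer."""
    p_e_given_d = evidence[outcome == 1].mean()    # P(evidence | impaired)
    p_e_given_nd = evidence[outcome == 0].mean()   # P(evidence | not impaired)
    w_plus = np.log(p_e_given_d / p_e_given_nd)
    w_minus = np.log((1 - p_e_given_d) / (1 - p_e_given_nd))
    return w_plus, w_minus, w_plus - w_minus       # contrast C = W+ - W-

print(weights_of_evidence(evidence, impaired))
```

A large positive contrast indicates that the evidence layer is spatially associated with impairment, which is the quantity the stressor ranking in the abstract is built from.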


Environmental power analysis – a new perspective

ENVIRONMETRICS, Issue 5 2001
David R. Fox
Abstract Power analysis and sample-size determination are related tools that have recently gained popularity in the environmental sciences. Their indiscriminate application, however, can lead to wildly misleading results. This is particularly true in environmental monitoring and assessment, where the quality and nature of the data are such that the implicit assumptions underpinning power and sample-size calculations are difficult to justify. When the assumptions are reasonably met, these statistical techniques provide researchers with an important capability for the allocation of scarce and expensive resources to detect putative impact or change. Conventional analyses are predicated on a general linear model and normal distribution theory, with statistical tests of environmental impact couched in terms of changes in a population mean. While these are 'optimal' statistical tests (uniformly most powerful), they nevertheless pose considerable practical difficulties for the researcher. Compounding this difficulty is the subsequent analysis of the data and the imposition of a decision framework that commences with an assumption of 'no effect'. This assumption is only discarded when the sample data provide demonstrable evidence to the contrary. The alternative ('green') view is that any anthropogenic activity has an impact on the environment, and therefore a more realistic initial position is to assume that the environment is already impacted. In this article we examine these issues and provide a re-formulation of conventional mean-based hypotheses in terms of population percentiles. Prior information or belief concerning the probability of exceeding a criterion is incorporated into the power analysis using a Bayesian approach. Finally, a new statistic is introduced which attempts to balance the overall power regardless of the decision framework adopted. Copyright © 2001 John Wiley & Sons, Ltd. [source]
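As a small companion to the percentile reframing discussed here, the sketch below estimates, by simulation, the power of an exact binomial test of whether the probability of exceeding a criterion is above a nominal value. The criterion, effect size, and sample size are hypothetical, and the Bayesian treatment of prior belief proposed in the article is not reproduced.

```python
# Simulation sketch of power for a percentile-style hypothesis: is the
# probability of exceeding a water-quality criterion greater than 0.05?
# Effect size, sample size, and alpha are hypothetical.
import numpy as np
from scipy.stats import binomtest

rng = np.random.default_rng(3)
n, p_null, p_true, n_sims = 50, 0.05, 0.15, 2000

rejections = 0
for _ in range(n_sims):
    exceedances = rng.binomial(n, p_true)                  # simulated exceedance count
    test = binomtest(exceedances, n, p_null, alternative="greater")
    rejections += test.pvalue < 0.05

print("estimated power:", rejections / n_sims)
```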


Methods for detecting interactions between genetic polymorphisms and prenatal environment exposure with a mother-child design

GENETIC EPIDEMIOLOGY, Issue 2 2010
Shuang Wang
Abstract Prenatal exposures such as polycyclic aromatic hydrocarbons and early postnatal environmental exposures are of particular concern because of the heightened susceptibility of the fetus and infant to diverse environmental pollutants. Marked inter-individual variation in response to the same level of exposure was observed in both mothers and their newborns, indicating that susceptibility might be due to genetic factors. With the mother-child pair design, existing methods developed for parent-child trio data or random sample data are either not applicable or not designed to optimally use the information. To take full advantage of this unique design, which provides partial information on genetic transmission and has both maternal and newborn outcome status collected, we developed a likelihood-based method that uses both the maternal and the newborn information together and jointly models gene-environment interactions on maternal and newborn outcomes. Through intensive simulation studies, the proposed method has demonstrated much improved power in detecting gene-environment interactions. Application of the method to real mother-child pair data from a study conducted in Krakow, Poland, suggested four significant gene-environment interactions after adjustment for multiple comparisons. Genet. Epidemiol. 34: 125–132, 2010. © 2009 Wiley-Liss, Inc. [source]
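For orientation, a conventional single-outcome gene-environment interaction test looks like the sketch below: a logistic regression with a G x E product term on simulated data. The paper's contribution is a joint maternal-newborn likelihood that goes beyond this simplification; the sketch only shows the baseline idea.

```python
# Hypothetical sketch of a standard G x E interaction test on a single (newborn)
# outcome. This is a simplification, not the joint mother-child likelihood
# developed in the paper; all data and effect sizes are invented.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 1000
df = pd.DataFrame({
    "genotype": rng.integers(0, 3, size=n),                   # 0/1/2 copies of a risk allele
    "exposure": rng.lognormal(mean=0.0, sigma=0.5, size=n),   # e.g., a PAH exposure level
})
# Simulate an outcome with a true interaction effect.
logit = -1.5 + 0.2 * df.genotype + 0.3 * df.exposure + 0.4 * df.genotype * df.exposure
df["outcome"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

fit = smf.logit("outcome ~ genotype * exposure", data=df).fit(disp=0)
print(fit.params["genotype:exposure"], fit.pvalues["genotype:exposure"])
```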


Neurofuzzy Modeling of Context-Contingent Proximity Relations

GEOGRAPHICAL ANALYSIS, Issue 2 2007
Xiaobai Yao
The notion of proximity is one of the foundational elements in humans' understanding of, and reasoning about, geographical environments. The perception and cognition of distances play a significant role in many daily human activities. Yet few studies have thus far provided context-contingent translation mechanisms between linguistic proximity descriptors (e.g., "near," "far") and metric distance measures. One problem with previous fuzzy logic proximity modeling studies is that they presume the form of the fuzzy membership functions of proximity relations. Another problem is that previous studies have fundamental weaknesses in their treatment of context factors in proximity models. We argue that statistical approaches are ill suited to proximity modeling because of the inherently fuzzy nature of the relations between linguistic and metric distance measures. In this study, we propose a neurofuzzy system approach to solve this problem. The approach allows for the dynamic construction of context-contingent proximity models based on sample data. An empirical case study with human subject survey data is carried out to test the validity of the approach and to compare it with the previous statistical approach. Interpretation and prediction accuracy of the empirical study are discussed. [source]
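A minimal version of the idea of learning, rather than presuming, a membership function could look like the sketch below, which fits a decreasing sigmoid for "near" to simulated (distance, response) pairs. It is far simpler than the neurofuzzy system proposed in the study, and the data are invented.

```python
# Simplified sketch: estimate a membership function for "near" from sample data
# instead of fixing its shape in advance. The survey responses are simulated;
# the paper's neurofuzzy system learns richer, context-dependent rules than this.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(5)
distance_km = rng.uniform(0, 20, size=400)
# Hypothetical responses: the chance of judging a place "near" falls with distance.
judged_near = (rng.random(400) < 1 / (1 + np.exp((distance_km - 6) / 2))).astype(float)

def membership(d, midpoint, slope):
    """Sigmoid membership of 'near' as a function of metric distance."""
    return 1.0 / (1.0 + np.exp((d - midpoint) / slope))

params, _ = curve_fit(membership, distance_km, judged_near, p0=[5.0, 1.0])
print("fitted midpoint and slope:", params)
```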


Color invariant object recognition using entropic graphs

INTERNATIONAL JOURNAL OF IMAGING SYSTEMS AND TECHNOLOGY, Issue 5 2006
Jan C. van Gemert
Abstract We present an object recognition approach using higher-order color invariant features with an entropy-based similarity measure. Entropic graphs offer an unparameterized alternative to common entropy estimation techniques, such as a histogram or an assumed probability distribution. An entropic graph estimates entropy from a spanning graph structure of sample data. We extract color invariant features from object images that are robust to changes in illumination intensity, viewpoint, and shading. The Henze–Penrose similarity measure is used to estimate the similarity of two images. Our method is evaluated on the ALOI collection, a large collection of object images. This object image collection consists of 1000 objects recorded under various imaging circumstances. The proposed method is shown to be effective under a wide variety of imaging conditions. © 2007 Wiley Periodicals, Inc. Int J Imaging Syst Technol, 16, 146–153, 2006 [source]
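The entropic-graph idea can be sketched with a minimum spanning tree over the pooled feature vectors of two images: the fraction of tree edges joining points from different images estimates the Henze-Penrose affinity (the Friedman-Rafsky construction). The features below are random stand-ins, not the paper's color invariants, and the normalization shown follows the usual convergence result rather than anything stated in this abstract.

```python
# MST-based (Friedman-Rafsky) sketch of a Henze-Penrose affinity estimate between
# two sets of feature vectors. Features are random stand-ins for illustration.
import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.sparse.csgraph import minimum_spanning_tree

def henze_penrose_affinity(x, y):
    """Fraction of MST edges of the pooled sample joining points from different sets."""
    pooled = np.vstack([x, y])
    labels = np.r_[np.zeros(len(x)), np.ones(len(y))]
    dist = squareform(pdist(pooled))                 # pairwise Euclidean distances
    mst = minimum_spanning_tree(dist).tocoo()
    cross_edges = np.sum(labels[mst.row] != labels[mst.col])
    return cross_edges / len(pooled)                 # assumed normalization

rng = np.random.default_rng(6)
a = rng.normal(0.0, 1.0, size=(200, 8))   # hypothetical features of image A
b = rng.normal(0.5, 1.0, size=(200, 8))   # hypothetical features of image B
print(henze_penrose_affinity(a, b))       # higher -> more similar feature distributions
```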


The Reliability of Difference Scores in Populations and Samples

JOURNAL OF EDUCATIONAL MEASUREMENT, Issue 1 2009
Donald W. Zimmerman
This study was an investigation of the relation between the reliability of difference scores, considered as a parameter characterizing a population of examinees, and the reliability estimates obtained from random samples from the population. The parameters in familiar equations for the reliability of difference scores were redefined in such a way that determinants of reliability in both populations and samples become more transparent. Computer simulation was used to find sample values and to plot frequency distributions of various correlations and variance ratios relevant to the reliability of differences. The shape of frequency distributions resulting from the simulations and the means and standard deviations of these distributions reveal the extent to which reliability estimates based on sample data can be expected to meaningfully represent population reliability. [source]
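For reference, the classical formula for the reliability of a difference score, which the article's redefined parameters build on, can be written and evaluated directly; the parameter values below are hypothetical.

```python
# Worked sketch of the classical reliability-of-differences formula from
# classical test theory. Parameter values are invented for illustration.
def reliability_of_difference(sd_x, sd_y, rel_x, rel_y, r_xy):
    """rho_DD' = (sx^2*rxx + sy^2*ryy - 2*rxy*sx*sy) / (sx^2 + sy^2 - 2*rxy*sx*sy)."""
    num = sd_x**2 * rel_x + sd_y**2 * rel_y - 2 * r_xy * sd_x * sd_y
    den = sd_x**2 + sd_y**2 - 2 * r_xy * sd_x * sd_y
    return num / den

# Two equally reliable tests that correlate moderately with each other.
print(reliability_of_difference(sd_x=10, sd_y=10, rel_x=0.85, rel_y=0.85, r_xy=0.6))
```

With these values the difference-score reliability falls to 0.625 even though each component test has reliability 0.85; this is the kind of population value whose sample-based estimates the article's simulations examine.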


Amino-acid geochronology and the British Pleistocene: secure stratigraphical framework or a case of circular reasoning?

JOURNAL OF QUATERNARY SCIENCE, Issue 7 2002
Danny McCarroll
Abstract Aminostratigraphy is central to the recently revised correlation of Quaternary deposits in the British Isles, providing a link between terrestrial deposits and marine Oxygen Isotope Stages. The central tenet of British aminostratigraphy, however, that shells from the same interglacial yield very similar ratios, so that the characteristic ratios from different interglacials are distinct, remains uncertain. The data available suggest that amino-acid ratios from different interglacials do not fall into discrete groups, but overlap considerably. It is therefore not valid to assign individual shells to Oxygen Isotope Stages simply on the basis of their amino-acid ratios, which means that filtering data to remove high or low values, on the assumption that they represent reworked shells, is unacceptable. The range of 'characteristic ratios' assigned to British warm stages may have been underestimated and the degree of separation between them overestimated. Amino-acid ratios should be treated as sample data that are naturally variable. Copyright © 2002 John Wiley & Sons, Ltd. [source]


A THURSTONE-COOMBS MODEL OF CONCURRENT RATINGS WITH SENSORY AND LIKING DIMENSIONS

JOURNAL OF SENSORY STUDIES, Issue 1 2002
F. GREGORY ASHBY
ABSTRACT A popular product testing procedure is to obtain sensory intensity and liking ratings from the same consumers. Consumers are instructed to attend to the sensory attribute, such as sweetness, when generating their liking response. We propose a new model of this concurrent ratings task that conjoins a unidimensional Thurstonian model of the ratings on the sensory dimension with a probabilistic version of Coombs' (1964) unfolding model for the liking dimension. The model assumes that the sensory characteristic of the product has a normal distribution over consumers. An individual consumer selects a sensory rating by comparing the perceived value on the sensory dimension to a set of criteria that partitions the axis into intervals. Each value on the rating scale is associated with a unique interval. To rate liking, the consumer imagines an ideal product, then computes the discrepancy or distance between the product as perceived by the consumer and this imagined ideal. A set of criteria are constructed on this discrepancy dimension that partition the axis into intervals. Each interval is associated with a unique liking rating. The ideal product is assumed to have a univariate normal distribution over consumers on the sensory attribute evaluated. The model is shown to account for 94.2% of the variance in a set of sample data and to fit this data significantly better than a bivariate normal model of the data (concurrent ratings, Thurstonian scaling, Coombs' unfolding model, sensory and liking ratings). [source]
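The generative model described here is easy to simulate: normal percepts cut at fixed criteria give the sensory rating, and the distance to a normally distributed ideal point, cut at discrepancy criteria, gives the liking rating. The parameter values below are hypothetical, and the sketch omits the model-fitting step reported in the abstract.

```python
# Simulation sketch of the concurrent-ratings generative model described above.
# All parameter values (means, SDs, criteria) are hypothetical.
import numpy as np

rng = np.random.default_rng(7)
n_consumers = 500

perceived_sweetness = rng.normal(5.0, 1.5, size=n_consumers)   # percept of the product
ideal_sweetness = rng.normal(6.0, 1.0, size=n_consumers)       # each consumer's imagined ideal

sensory_criteria = np.array([2.0, 4.0, 6.0, 8.0])   # criteria cutting the sensory axis into 5 intervals
liking_criteria = np.array([0.5, 1.5, 3.0, 5.0])    # criteria on the |percept - ideal| discrepancy axis

sensory_rating = np.digitize(perceived_sweetness, sensory_criteria) + 1   # ratings 1..5
discrepancy = np.abs(perceived_sweetness - ideal_sweetness)
liking_rating = 5 - np.digitize(discrepancy, liking_criteria)             # 5 = closest to ideal

print(np.corrcoef(sensory_rating, liking_rating)[0, 1])
```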


A measure of disclosure risk for microdata

JOURNAL OF THE ROYAL STATISTICAL SOCIETY: SERIES B (STATISTICAL METHODOLOGY), Issue 4 2002
C. J. Skinner
Summary. Protection against disclosure is important for statistical agencies releasing microdata files from sample surveys. Simple measures of disclosure risk can provide useful evidence to support decisions about release. We propose a new measure of disclosure risk: the probability that a unique match between a microdata record and a population unit is correct. We argue that this measure has at least two advantages. First, we suggest that it may be a more realistic measure of risk than two measures that are currently used with census data. Second, we show that consistent inference (in a specified sense) may be made about this measure from sample data without strong modelling assumptions. This is a surprising finding, in its contrast with the properties of the two 'similar' established measures. As a result, this measure has potentially useful applications to sample surveys. In addition to obtaining a simple consistent predictor of the measure, we propose a simple variance estimator and show that it is consistent. We also consider the extension of inference to allow for certain complex sampling schemes. We present a numerical study based on 1991 census data for about 450 000 enumerated individuals in one area of Great Britain. We show that the theoretical results on the properties of the point predictor of the measure of risk and its variance estimator hold to a good approximation for these data. [source]
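The per-record intuition behind the proposed measure can be illustrated when the population is known: for a record that is unique in the sample on its key variables, the probability that a unique match is correct is 1/F_k, where F_k is the population count in that key cell. The sketch below simulates both sample and population so that F_k is observable; the paper's contribution is consistent inference about the aggregate measure from the sample alone, and the simple averaging shown here is only illustrative, not the paper's exact definition or estimator.

```python
# Illustrative sketch of per-record disclosure risk for sample uniques, on a
# simulated population where the cell counts F_k are known.
import numpy as np
import pandas as pd

rng = np.random.default_rng(8)
population = pd.DataFrame({
    "age_band": rng.integers(0, 18, size=100_000),
    "sex": rng.integers(0, 2, size=100_000),
    "region": rng.integers(0, 400, size=100_000),
})
sample = population.sample(n=2_000, random_state=8)

keys = ["age_band", "sex", "region"]
sample_counts = sample.groupby(keys).size().rename("f_k")      # sample cell counts
pop_counts = population.groupby(keys).size().rename("F_k")     # population cell counts

uniques = sample_counts[sample_counts == 1].to_frame().join(pop_counts)
uniques["risk"] = 1.0 / uniques["F_k"]                         # P(unique match is correct)
print(uniques["risk"].mean(), (uniques["F_k"] == 1).mean())    # average risk; share of population uniques
```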


Linking Amazonian secondary succession forest growth to soil properties

LAND DEGRADATION AND DEVELOPMENT, Issue 4 2002
D. Lu
Abstract The Amazon Basin has suffered extensive deforestation in the past 30 years. Deforestation typically leads to changes in climate, biodiversity, and the hydrological cycle, and to soil degradation. Vegetation succession plays an important role in soil restoration through accumulation of vegetation biomass and improved soil/plant interaction. However, relationships between succession and soil properties are not well known. For example, how does vegetation succession affect nutrient accumulation? Which soil factors are important in influencing vegetation growth? What is the best way to evaluate soil fertility in the Amazon basin? This paper focuses on the interrelationships between secondary succession and soil properties. Field soil sample data and vegetation inventory data were collected in two regions of Brazilian Amazonia (Altamira and Bragantina). Soil nutrients and texture were analyzed at successional forest sites. Multiple regression models were used to identify the important soil properties affecting vegetation growth, and a soil evaluation factor (SEF) was developed for evaluating soil fertility in Alfisols, Ultisols, and Oxisols, which differ in the ways they affect vegetation growth. For example, the upper 40 cm of soil is most important for vegetation growth in Alfisols, but in Ultisols and Oxisols deeper horizons significantly influence vegetation growth rates. Accumulation of vegetation biomass increased soil fertility and improved soil physical structure in Alfisols but did not completely compensate for the nutrient losses in Ultisols and Oxisols; however, it significantly reduced the rate of nutrient loss. Copyright © 2002 John Wiley & Sons, Ltd. [source]
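A bare-bones version of the kind of multiple regression described, relating a growth measure to soil properties, is sketched below. The soil variables, their values, and the response are invented, and the soil evaluation factor (SEF) itself is not reproduced.

```python
# Hypothetical sketch of a multiple regression linking vegetation growth to soil
# properties. Variable names and data are invented for illustration only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(9)
n = 120
soils = pd.DataFrame({
    "ph": rng.normal(5.0, 0.6, size=n),
    "organic_carbon": rng.normal(1.5, 0.4, size=n),    # percent, upper horizons
    "clay_fraction": rng.uniform(0.1, 0.6, size=n),
    "cation_exchange": rng.normal(8.0, 2.0, size=n),   # cmol/kg
})
# Simulated growth response with a few "important" soil properties.
soils["biomass_growth"] = (
    2.0 + 1.2 * soils.organic_carbon + 0.8 * soils.ph
    + 0.5 * soils.cation_exchange + rng.normal(0, 1.5, size=n)
)

fit = smf.ols("biomass_growth ~ ph + organic_carbon + clay_fraction + cation_exchange",
              data=soils).fit()
print(fit.params)
print(fit.pvalues)
```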