Completeness


Selected Abstracts


The Accuracy and Completeness of Data Collected by Prospective and Retrospective Methods

ACADEMIC EMERGENCY MEDICINE, Issue 9 2005
J. Tobias Nagurney MD
Abstract Objectives: To describe and test a model that compares the accuracy of data gathered prospectively versus retrospectively among adult emergency department patients admitted with chest pain. Methods: The authors developed a model of information flow from subject to medical record to the clinical study case report form, based on a literature review. To test this model, a bidirectional (prospective and retrospective) study was conducted, enrolling all eligible adult patients who were admitted with a chief complaint of chest pain. The authors interviewed patients in the emergency department to determine their chest pain history and established a prospective database; this was considered the criterion standard. Then, patient medical records were reviewed to determine the accuracy and completeness of the information available through a retrospective medical record review. Results: The model described applies the concepts of reliability and validity to information passed on by the study subject, the clinician, and the medical record abstractor. This study comprised 104 subjects, of whom 63% were men; the median age was 63 years. Subjects were uncertain of responses for 0–8% of questions and responded differently upon reinterview for subsets of questions 0–30% of the time. The sensitivity of the medical record for risk factors for coronary artery disease was 0.77 to 0.93. Among the 88 subjects (85%) who indicated that their chest pain was substernal or left chest, the medical record described this location in 44%. Timing of the chest pain was the most difficult item to accurately capture from the medical record. Conclusions: Information obtained retrospectively from the abstraction of medical records is measurably less accurate than information obtained prospectively from research subjects. For certain items, more than half of the information is not available. This loss of information is related to the data types included in the study and to the assumptions that a researcher performing a retrospective study makes about implied versus explicitly stated responses. A model of information flow that incorporates the concepts of reliability and validity can be used to measure some of the loss of information that occurs at each step along the way from subject to clinician to medical record abstractor. [source]
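The sensitivity figures reported here come from comparing chart abstraction against the prospective interview used as the criterion standard. A minimal sketch of that calculation for one binary item, using hypothetical responses rather than the study's data:

```python
# Sensitivity of retrospective chart abstraction against a prospective
# criterion standard, for one binary item (e.g. "history of hypertension").
# All responses below are hypothetical illustrations, not study data.

def sensitivity(prospective, retrospective):
    """Fraction of items positive on prospective interview that the medical
    record also captured: TP / (TP + FN)."""
    true_pos = sum(1 for p, r in zip(prospective, retrospective) if p and r)
    false_neg = sum(1 for p, r in zip(prospective, retrospective) if p and not r)
    return true_pos / (true_pos + false_neg)

# Ten hypothetical subjects: risk factor present per interview vs. per chart.
interview = [True, True, True, False, True, False, True, True, False, True]
chart     = [True, False, True, False, True, False, True, True, False, False]

print(f"Sensitivity of the medical record: {sensitivity(interview, chart):.2f}")
```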


cag Pathogenicity Island of Helicobacter pylori in Korean Children

HELICOBACTER, Issue 4 2002
Jae Sung Ko
Abstract Background. cag pathogenicity island is reported to be a major virulence factor of Helicobacter pylori. The aim of this study was to investigate the status of cag pathogenicity island genes and gastric histology in Korean children with H. pylori gastritis. Methods. Helicobacter pylori DNA was extracted from antral biopsy specimens from 25 children with H. pylori gastritis. Specific polymerase chain reaction assays were used for four genes of cag pathogenicity island. The features of gastritis were scored in accordance with the updated Sydney System. Results. cagA was present in 23 (92%) of 25 children, and cagE in 24 (96%). Twenty-two (88%) children were cagT positive and 19 (76%) virD4 positive. All of the selected genes of the cag pathogenicity island were present in 17 (68%) children and completely deleted in one child. There were no differences in neutrophil activity and chronic inflammation between children infected with intact cag pathogenicity island strains and those with partially or totally deleted cag pathogenicity island strains. Conclusion. cag pathogenicity island is not a uniform, conserved entity in Korea. Completeness of cag pathogenicity island may not be the major factor to determine the severity of H. pylori gastritis in children. [source]


Association between pacifier use and breast-feeding, sudden infant death syndrome, infection and dental malocclusion

INTERNATIONAL JOURNAL OF EVIDENCE BASED HEALTHCARE, Issue 6 2005
Ann Callaghan RN RM BNurs(Hons)
Executive summary Objective: To critically review all literature related to pacifier use for full-term healthy infants and young children. The specific review questions addressed are: What is the evidence of adverse and/or positive outcomes of pacifier use in infancy and childhood in relation to each of the following subtopics: breast-feeding; sudden infant death syndrome; infection; dental malocclusion. Inclusion criteria: Specific criteria were used to determine which studies would be included in the review: (i) the types of participants; (ii) the types of research design; and (iii) the types of outcome measures. To be included a study has to meet all criteria. Types of participants: The participants included in the review were healthy term infants and healthy children up to the age of 16 years. Studies that focused on preterm infants, and infants and young children with serious illness or congenital malformations were excluded. However, some total population studies did include these children. Types of research design: It became evident early in the review process that very few randomised controlled trials had been conducted. A decision was made to include observational epidemiological designs, specifically prospective cohort studies and, in the case of sudden infant death syndrome research, case–control studies. Purely descriptive and cross-sectional studies were excluded, as were qualitative studies and all other forms of evidence. A number of criteria have been proposed to establish causation in the scientific and medical literature. These key criteria were applied in the review process and are described as follows: (i) consistency and unbiasedness of findings; (ii) strength of association; (iii) temporal sequence; (iv) dose–response relationship; (v) specificity; (vi) coherence with biological background and previous knowledge; (vii) biological plausibility; and (viii) experimental evidence. Studies that did not meet the requirement of appropriate temporal sequencing of events and studies that did not present an estimate of the strength of association were not included in the final review. Types of outcome measures: Our specific interest was pacifier use related to: breast-feeding; sudden infant death syndrome; infection; dental malocclusion. Studies that examined pacifier use related to procedural pain relief were excluded. Studies that examined the relationship between pacifier use and gastro-oesophageal reflux were also excluded as this information has been recently presented as a systematic review. Search strategy: The review comprised published and unpublished research literature. The search was restricted to reports published in English, Spanish and German. The time period covered research published from January 1960 to October 2003. A protocol developed by New Zealand Health Technology Assessment was used to guide the search process. The search comprised bibliographic databases, citation searching, other evidence-based and guidelines sites, government documents, books and reports, professional websites, national associations, hand search, contacting national/international experts and general internet searching. Assessment of quality: All studies identified during the database search were assessed for relevance to the review based on the information provided in the title, abstract and descriptor/MeSH terms, and a full report was retrieved for all studies that met the inclusion criteria.
Studies identified from reference list searches were assessed for relevance based on the study title. Keywords included: dummy, dummies, pacifier(s), soother(s), comforter(s), non-nutritive sucking, infant, child, infant care. Initially, studies were reviewed for inclusion by pairs of principal investigators. Authorship of articles was not concealed from the reviewers. Next, the methodological quality of included articles was assessed independently by groups of three or more principal investigators and clinicians using a checklist. All 20 studies that were accepted met minimum set criteria, but few passed without some methodological concern. Data extraction: To meet the requirements of the Joanna Briggs Institute, reasons for acceptance and non-acceptance at each phase were clearly documented. An assessment protocol and report form was developed for each of the three phases of review. The first form was created to record investigators' evaluations of studies included in the initial review. Those studies that failed to meet strict inclusion criteria were excluded at this point. A second form was designed to facilitate an in-depth critique of epidemiological study methodology. The checklist was pilot tested and adjustments were made before reviewers were trained in its use. When reviewers could not agree on an assessment, it was passed to additional reviewers and discussed until a consensus was reached. At this stage, studies other than cohort, case–control and randomised controlled trials were excluded. Issues of clarification were also addressed at this point. The final phase was that of integration. This phase, undertaken by the principal investigators, was assisted by the production of data extraction tables. Through a process of trial and error, a framework was formulated that adequately summarised the key elements of the studies. This information was tabulated under the following headings: authors/setting, design, exposure/outcome, confounders controlled, analysis and main findings. Results: With regard to the breast-feeding outcome, 10 studies met the inclusion criteria, comprising two randomised controlled trials and eight cohort studies. The research was conducted between 1995 and 2003 in a wide variety of settings involving research participants from diverse socioeconomic and cultural backgrounds. Information regarding exposure and outcome status, and potential confounding factors was obtained from: antenatal and postnatal records; interviews before discharge from obstetric/midwifery care; post-discharge interviews; and post-discharge postal and telephone surveys. Both the level of contact and the frequency of contact with the informant, the child's mother, differed widely. Pacifier use was defined and measured inconsistently, possibly because few studies were initiated expressly to investigate its relationship with breast-feeding. Completeness of follow-up was addressed, but missing data were not uniformly identified and explained. When comparisons were made between participants and non-participants there was some evidence of differential loss and a bias towards families in higher socioeconomic groups. Multivariate analysis was undertaken in the majority of studies, with some including a large number of sociodemographic, obstetric and infant covariates and others including just maternal age and education.
As might be expected given the inconsistency of definition and measurement, the relationship between pacifier use and breast-feeding was expressed in many different ways and a meta-analysis was not appropriate. In summary, only one study did not report a negative association between pacifier use and breast-feeding duration or exclusivity. Results indicate an increase in risk for a reduced overall duration of breast-feeding from 20% to almost threefold. The data suggest that very infrequent use may not have any overall negative impact on breast-feeding outcomes. Six sudden infant death syndrome case–control studies met the criteria for inclusion. The research was conducted with information gathered between 1984 and 1999 in Norway, UK, New Zealand, the Netherlands and USA. Exposure information was obtained from a variety of sources including: hospital and antenatal records, death scene investigation, and interview and questionnaire. Information for cases was sought within 2 days after death, within 2–4 weeks after death and in one study between 3 and 11 years after death. Information for controls was sought from as early as 4 days of a nominated sudden infant death syndrome case, to between 1 and 7 weeks from the case date, and again in one study some 3–11 years later. In the majority of the studies case ascertainment was determined by post-mortem. Pacifier use was again defined and measured somewhat inconsistently. All studies controlled for confounding factors by matching and/or using multivariate analysis. Generally, antenatal and postnatal factors, as well as infant care practices, and maternal, family and socioeconomic issues were considered. All five studies reporting multivariate results found significantly fewer sudden infant death syndrome cases used a pacifier compared with controls. That is, pacifier use was associated with a reduced incidence of sudden infant death syndrome. These results indicate that the risk of sudden infant death syndrome for infants who did not use a pacifier in the last or reference sleep was at least twice, and possibly five times, that of infants who did use a pacifier. Three studies reported a moderately sized positive association between pacifier use and a variety of infections. Conversely, one study found no positive association between pacifier use at 15 months of age and a range of infections experienced between the ages of 6 and 18 months. Given the limited number of studies available and the variability of results, no meaningful conclusions could be drawn. Five cohort studies and one case–control study focused on the relationship between pacifier use and dental malocclusion. Not one of these studies reported a measure of association, such as an estimate of relative risk. It was therefore not possible to include these studies in the final review. Implications for practice: It is intended that this review be used as the basis of a 'best practice guideline', to make health professionals aware of the research evidence concerning these health and developmental consequences of pacifier use, because parents need clear information on which they can base child care decisions. With regard to the association between pacifier use and infection and dental malocclusion it was found that, due to the paucity of epidemiological studies, no meaningful conclusion can be drawn. There is clearly a need for more epidemiological research with regard to these two outcomes.
The evidence for a relationship between pacifier use and sudden infant death syndrome is consistent, while the exact mechanism of the effect is not well understood. As to breast-feeding, research evidence shows that pacifier use in infancy is associated with a shorter duration and non-exclusivity. It is plausible that pacifier use causes babies to breast-feed less, but a causal relationship has not been irrefutably proven. Because breast-feeding confers an important advantage on all children and the incidence of sudden infant death syndrome is very low, it is recommended that health professionals generally advise parents against pacifier use, while taking into account individual circumstances. [source]


Assessment of registration quality of trials sponsored by China

JOURNAL OF EVIDENCE BASED MEDICINE, Issue 1 2009
Xuemei Liu
Abstract Objective To evaluate the quality of the registration information for trials sponsored by China registered in the WHO primary registries or other registries that meet the requirements of the International Committee of Medical Journal Editors (ICMJE). Methods We assessed the registration information for trials registered in the 9 WHO primary registries and one other registry that met the requirements of ICMJE as of 15 October 2008. We analyzed the trial registration data set in each registry and assessed the registration quality against the WHO Trial Registration Data Set (TRDS). We also evaluated the quality of the information in the Source(s) of Monetary or Material Support section, using a specially prepared scale. Results The entries in four registries met the 20 items of the WHO TRDS. These were the Chinese Clinical Trial Registration Center (ChiCTR), Australian New Zealand Clinical Trials Registry (NZCTR), Clinical Trials Registry – India (CTRI), and Sri Lanka Clinical Trials Registry (SLCTR). Registration quality varied among the different registries. For example, using the Scale of TRDS, the NZCTR scored a median of 19 points, followed by ChiCTR (median = 18 points), ISRCTN.org (median = 17 points), and ClinicalTrials.gov (median = 12 points). The data on monetary or material support for ChiCTR and ISRCTN.org were relatively complete and the score on our Scale for the Completeness of Funding Registration Quality ranged from ChiCTR (median = 7 points), ISRCTN.org (median = 6 points), NZCTR (median = 3 points) to ClinicalTrials.gov (median = 2 points). Conclusion Further improvements are needed in both the quantity and quality of trial registration. This could be achieved by full completion of the 20 items of the WHO TRDS. Future research should assess ways to ensure the quality and scope of research registration and the role of mandatory registration of funded research. [source]


Positive predictive value of ICD-9 codes 410 and 411 in the identification of cases of acute coronary syndromes in the Saskatchewan Hospital automated database

PHARMACOEPIDEMIOLOGY AND DRUG SAFETY, Issue 8 2008
Cristina Varas-Lorenzo MD
Abstract Background Case definitions are essential to epidemiological research. Objectives To evaluate ICD-9 codes 410 and 411 to identify cases of acute coronary syndromes (ACS), and the clinical information availability in the administrative and hospital discharge records of Saskatchewan, Canada. Methods In the context of a safety cohort study, we identified hospitalisations with primary discharge codes 410 (n = 2260) and 411 (n = 799). We selected all records with code 411, and a random sample (n = 200) with code 410. Based on information obtained by trained abstractors from hospital records, events were classified by two cardiologists as definite or possible according to adapted AHA/ESC criteria. The validity of 410 and 411 codes was assessed by calculating the positive predictive value (PPV). Completeness of the recorded information on risk factors and use of aspirin was explored. Results The PPVs of the codes 410 and 411 for ACS were 0.96 (95%CI: 0.92–0.98) and 0.86 (95%CI: 0.83–0.88), respectively. The PPV of 410 for acute myocardial infarction (AMI) was 0.95 (95%CI: 0.91–0.98). The PPV of 411 was 0.73 (95%CI: 0.70–0.77) for primary unstable angina (UA) and 0.09 (95%CI: 0.07–0.11) for AMI. Hospital chart review revealed key information for clinical variables, smoking, obesity and use of aspirin at admission. Conclusions The ICD-9 code 410 has a high PPV for AMI cases, as does 411 for UA cases. Case validation remains important in epidemiological studies with administrative health databases. Given the pathophysiology of ACS, both AMI and UA might be used as study end points. In addition to code 410, we recommend the use of 411 plus validation. Copyright © 2008 John Wiley & Sons, Ltd. [source]
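A PPV and confidence interval of the kind reported above can be reproduced from the adjudicated counts; the sketch below uses a normal-approximation interval and hypothetical counts (the function and numbers are illustrative, not taken from the study):

```python
from math import sqrt

def ppv_with_ci(confirmed, reviewed, z=1.96):
    """Positive predictive value of a discharge code with a normal-approximation
    95% CI: confirmed = records adjudicated as true cases, reviewed = records
    carrying the code that were abstracted and adjudicated."""
    p = confirmed / reviewed
    se = sqrt(p * (1.0 - p) / reviewed)
    return p, max(0.0, p - z * se), min(1.0, p + z * se)

# Hypothetical: 190 of 200 sampled code-410 records confirmed as AMI.
ppv, lo, hi = ppv_with_ci(confirmed=190, reviewed=200)
print(f"PPV = {ppv:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```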


Completeness of state administrative databases for surveillance of congenital heart disease

BIRTH DEFECTS RESEARCH, Issue 9 2003
Christine E. Cronk
Abstract BACKGROUND Tracking birth prevalence of cardiac defects is essential to determining time and space clusters, and identifying potential associated factors. Resource limitations on state birth defects surveillance programs sometimes require that databases already available be used for ascertaining such defects. This study evaluated the data quality of state administrative databases for ascertaining congenital heart defects (CHD) and specific diagnoses of CHD. METHODS Children's Hospital of Wisconsin (CHW) medical records for infants born 1997–1999 and treated for CHD (n = 373) were abstracted and each case assigned CHD diagnoses based on definitive diagnostic reports (echocardiograms, catheterizations, surgical or autopsy reports). These data were linked to state birth and death records, and birth and postnatal (<1 year of age) hospital discharge summaries at the Wisconsin Bureau of Health Information (WBHI). Presence of any code/checkbox indicating CHD (generic CHD) and exact matches to abstracted diagnoses were evaluated. RESULTS Fifty-eight percent of cases with generic CHD were identified by state databases. Postnatal hospital discharge summaries identified 48%, birth hospital discharge summaries 27%, birth certificates 9% and death records 4% of these cases. Exact matches were found for 52% of 633 specific diagnoses. Postnatal hospital discharge summaries provided most matches. CONCLUSION State databases identified 60% of generic CHD and exactly matched about half of specific CHD diagnoses. The postnatal hospital discharge summaries performed best both in identifying generic CHD and in matching specific CHD diagnoses. Vital records had limited value in ascertaining CHD. Birth Defects Research (Part A) 67:597–603, 2003. © 2003 Wiley-Liss, Inc. [source]


Histopathology reporting in colorectal cancer: a proforma improves quality

COLORECTAL DISEASE, Issue 8 2009
P. N. Siriwardana
Abstract Aim: The histopathology report is vital to determine the need for adjuvant therapy and prognosis in colorectal cancer (CRC). Completeness of reports in text format is inadequate. This study evaluated the improvement of quality of histopathology reports following the introduction of a template proforma, based on standards set by the Royal College of Pathologists (RCP), UK. Method: Sixty-eight consecutive histopathology reports based on 19 items for rectal cancer (RC) and 15 items for colon cancer (CC) using the proforma were prospectively analysed and compared with results of a previous audit of 82 consecutive histopathology reports in text format. The percentage of reports containing a statement for each data item for both series was compared using the Normal test for difference between two proportions. Completeness of each report was assessed and a percentage score (percentage completeness) was given. Mean percentage completeness was calculated for each format and compared using the two-sample t-test. Results: Except for comments on the presence of 'histologically confirmed liver metastases' in CC and RC, 'distance from dentate line' and 'distance to circumferential margin' in RC, all other items were commented on in more than 90% of reports, and 71% of the items based on the minimum data set were present in all reports. Compared to prose format, the mean percentage completeness (SD) improved from 74% (8) to 91% (4) (P < 0.0001) and from 81% (5) to 99% (1) (P < 0.0001) for RC and CC respectively in template proforma format. Conclusion: A template proforma and the surgeon's contribution in relation to operative findings improve the quality of the histopathology report in CRC. [source]
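The "Normal test for difference between two proportions" used to compare item-level reporting rates between the two formats can be written in a few lines; the counts below are hypothetical, not the audit's figures:

```python
from math import erf, sqrt

def two_proportion_z(x1, n1, x2, n2):
    """Normal (z) test for the difference between two independent proportions,
    e.g. the share of proforma vs. free-text reports stating a given item."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = sqrt(pooled * (1.0 - pooled) * (1.0 / n1 + 1.0 / n2))
    z = (p1 - p2) / se
    p_value = 2.0 * (1.0 - 0.5 * (1.0 + erf(abs(z) / sqrt(2.0))))  # two-sided
    return z, p_value

# Hypothetical: item stated in 66/68 proforma reports vs. 60/82 text reports.
z, p = two_proportion_z(66, 68, 60, 82)
print(f"z = {z:.2f}, p = {p:.4f}")
```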


LEARNING PRECONDITIONS FOR PLANNING FROM PLAN TRACES AND HTN STRUCTURE

COMPUTATIONAL INTELLIGENCE, Issue 4 2005
Okhtay Ilghami
A great challenge in developing planning systems for practical applications is the difficulty of acquiring the domain information needed to guide such systems. This paper describes a way to learn some of that knowledge. More specifically, the following points are discussed. (1) We introduce a theoretical basis for formally defining algorithms that learn preconditions for Hierarchical Task Network (HTN) methods. (2) We describe Candidate Elimination Method Learner (CaMeL), a supervised, eager, and incremental learning process for preconditions of HTN methods. We state and prove theorems about CaMeL's soundness, completeness, and convergence properties. (3) We present empirical results about CaMeL's convergence under various conditions. Among other things, CaMeL converges the fastest on the preconditions of the HTN methods that are needed the most often. Thus CaMeL's output can be useful even before it has fully converged. [source]
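CaMeL belongs to the candidate-elimination (version-space) family of learners. The sketch below shows only the specific-boundary update such learners apply to positive examples; the full algorithm also maintains a general boundary, and CaMeL itself operates on HTN method preconditions rather than the hypothetical delivery-domain features used here:

```python
# Specific-boundary (S) update of a candidate-elimination style learner over
# conjunctive hypotheses. '?' means "any value"; None means "no example yet".
# Attribute names and plan traces below are hypothetical.

def generalize(hypothesis, example):
    """Minimally generalize the conjunctive hypothesis so it covers the example."""
    if hypothesis is None:
        return dict(example)  # first positive example is copied verbatim
    return {a: (v if example.get(a) == v else "?") for a, v in hypothesis.items()}

# States observed whenever a hypothetical 'deliver-package' method was applied.
positive_traces = [
    {"truck_loaded": True, "at_depot": False, "fuel_ok": True},
    {"truck_loaded": True, "at_depot": True,  "fuel_ok": True},
]

S = None
for trace in positive_traces:
    S = generalize(S, trace)

print(S)  # {'truck_loaded': True, 'at_depot': '?', 'fuel_ok': True}
```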


Exact and Robust (Self-)Intersections for Polygonal Meshes

COMPUTER GRAPHICS FORUM, Issue 2 2010
Marcel Campen
Abstract We present a new technique to implement operators that modify the topology of polygonal meshes at intersections and self-intersections. Depending on the modification strategy, this effectively results in operators for Boolean combinations or for the construction of outer hulls that are suited for mesh repair tasks and accurate mesh-based front tracking of deformable materials that split and merge. By combining an adaptive octree with nested binary space partitions (BSP), we can guarantee exactness (= correctness) and robustness (= completeness) of the algorithm while still achieving higher performance and less memory consumption than previous approaches. The efficiency and scalability in terms of runtime and memory are obtained by an operation localization scheme. We restrict the essential computations to those cells in the adaptive octree where intersections actually occur. Within those critical cells, we convert the input geometry into a plane-based BSP-representation, which allows us to perform all computations exactly even with fixed-precision arithmetic. We carefully analyze the precision requirements of the involved geometric data and predicates in order to guarantee correctness and show how minimal input mesh quantization can be used to safely rely on computations with standard floating point numbers. We properly evaluate our method with respect to precision, robustness, and efficiency. [source]


Evaluation of Uncertainties Associated with Geocoding Techniques

COMPUTER-AIDED CIVIL AND INFRASTRUCTURE ENGINEERING, Issue 3 2004
Hassan A. Karimi
Geocoded data play a major role in numerous engineering applications such as transportation and environmental studies where geospatial information systems (GIS) are used for spatial modeling and analysis as they contain spatial information (e.g., latitude and longitude) about objects. The information that a GIS produces is impacted by the quality of the geocoded data (e.g., coordinates) stored in its database. To make appropriate and reasonable decisions using geocoded data, it is important to understand the sources of uncertainty in geocoding. There are two major sources of uncertainty in geocoding, one related to the database that is used as a reference data set to geocode objects and one related to the interpolation technique used. Factors such as completeness, correctness, consistency, currency, and accuracy of the data in the reference database contribute to the uncertainty of the former whereas the specific logic and assumptions used in an interpolation technique contribute to the latter. The primary purpose of this article is to understand uncertainties associated with interpolation techniques used for geocoding. In doing so, three geocoding algorithms were used and tested and the results were compared with the data collected by the Global Positioning System (GPS). The result of the overall comparison indicated no significant differences between the three algorithms. [source]
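Much of the interpolation uncertainty discussed here arises from address-range interpolation, which places a house number proportionally along a street segment. A minimal sketch (the segment coordinates and number range are hypothetical):

```python
def interpolate_address(house_number, range_from, range_to, start_xy, end_xy):
    """Linear address-range interpolation: position a house number
    proportionally along a street segment with a known number range."""
    frac = (house_number - range_from) / (range_to - range_from)
    x = start_xy[0] + frac * (end_xy[0] - start_xy[0])
    y = start_xy[1] + frac * (end_xy[1] - start_xy[1])
    return x, y

# Hypothetical segment: numbers 100-198, endpoints in projected coordinates.
print(interpolate_address(150, 100, 198, (512000.0, 4760000.0), (512300.0, 4760120.0)))
```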


A Modeling Framework for Supply Chain Simulation: Opportunities for Improved Decision Making,

DECISION SCIENCES, Issue 1 2005
D. J. Van Der Zee
ABSTRACT Owing to its inherent modeling flexibility, simulation is often regarded as the proper means for supporting decision making on supply chain design. The ultimate success of supply chain simulation, however, is determined by a combination of the analyst's skills, the chain members' involvement, and the modeling capabilities of the simulation tool. This combination should provide the basis for a realistic simulation model, which is both transparent and complete. The need for transparency is especially strong for supply chains as they involve (semi)autonomous parties each having their own objectives. Mutual trust and model effectiveness are strongly influenced by the degree of completeness of each party's insight into the key decision variables. Ideally, visual interactive simulation models present an important communicative means for realizing the required overview and insight. Unfortunately, most models strongly focus on physical transactions, leaving key decision variables implicit for some or all of the parties involved. This especially applies to control structures, that is, the managers or systems responsible for control, their activities and their mutual attuning of these activities. Control elements are, for example, dispersed over the model, are not visualized, or form part of the time-indexed scheduling of events. In this article, we propose an alternative approach that explicitly addresses the modeling of control structures. First, we will conduct a literature survey with the aim of listing simulation model qualities essential for supporting successful decision making on supply chain design. Next, we use this insight to define an object-oriented modeling framework that facilitates supply chain simulation in a more realistic manner. This framework is meant to contribute to improved decision making in terms of recognizing and understanding opportunities for improved supply chain design. Finally, the use of the framework is illustrated by a case example concerning a supply chain for chilled salads. [source]


Review of bupropion for smoking cessation

DRUG AND ALCOHOL REVIEW, Issue 2 2003
ROBYN RICHMOND
Abstract The advent of bupropion hydrochloride sustained release (Zyban) has heralded a major change in the options available for smoking cessation pharmacotherapy. Bupropion is a selective re-uptake inhibitor of dopamine and noradrenalin, which prevents or reduces cravings and other features of nicotine withdrawal. Bupropion is a useful oral and non-nicotine form of pharmacotherapy for smoking cessation. For this review, a total of 221 papers, plus poster presentations, were reviewed. This review examines in detail original clinical trials on efficacy, categorised according to whether they were acute treatment trials in healthy smokers; studies in specific populations such as people with depression, chronic obstructive pulmonary disease (COPD) or cardiovascular disease; or relapse prevention studies. Overall, these studies in varying populations, comprising over four thousand subjects, showed that bupropion consistently produces a positive effect on smoking cessation outcomes. The evidence highlights the major public health role that bupropion has in smoking cessation. The methodological issues of published clinical trials reporting one-year outcomes were examined in detail, including: completeness of follow-up; loss to follow-up; intention to treat analysis; blindness of assessment; and validation of smoking status. The review discusses contraindications, adverse effects, dose and overdose, addictive potential, and the role of bupropion in reducing cessation-related weight gain. Bupropion combined with or compared to other pharmacotherapies (nicotine patch; nortriptyline) is considered. Impressive evidence exists for the use of bupropion in smoking cessation among difficult patients who are hard-core smokers such as those with cardiovascular disease, chronic obstructive pulmonary disease (COPD) and depression. Bupropion reduces withdrawal symptoms as well as weight gain and is effective for smoking cessation for people with and without a history of depression or alcoholism. Serious side effects of bupropion use are rare. The major safety issue with bupropion is risk of seizures (estimated at approximately 0.1%) and it should not be prescribed to patients with a current seizure disorder or any history of seizures. In clinical trials of bupropion for smoking cessation no seizures were reported. Allergic reactions occur at a rate of approximately 3% and minor adverse effects are common including dry mouth and insomnia. [source]


Integrating highly diverse invertebrates into broad-scale analyses of cross-taxon congruence across the Palaearctic

ECOGRAPHY, Issue 6 2009
Andreas Schuldt
Our knowledge on broad-scale patterns of biodiversity, as a basis for biogeographical models and conservation planning, largely rests upon studies on the spatial distribution of vertebrates and plants, neglecting large parts of the world's biodiversity. To reassess the generality of these patterns and better understand spatial diversity distributions of invertebrates, we analyzed patterns of species richness and endemism of a hyperdiverse insect taxon, carabid beetles (ca 11 000 Palaearctic species known), and its cross-taxon congruence with well-studied vertebrates (amphibians, reptiles) and plants across 107 units of the Palaearctic. Based on species accumulation curves, we accounted for completeness of the carabid data by separately examining the western (well-sampled) and eastern (partly less well-sampled) Palaearctic and China (deficient data). For the western Palaearctic, we highlight overall centers of invertebrate, vertebrate and plant diversity. Species richness and endemism of carabids were highly correlated with patterns of especially plant and amphibian diversity across large parts of the Palaearctic. For the well-sampled western Palaearctic, hotspots of diversity integrating invertebrates were located in Italy, Spain and Greece. Only analysis of Chinese provinces yielded low congruence between carabids and plants/vertebrates. However, Chinese carabid diversity is only insufficiently known and China features the highest numbers of annual new descriptions of carabids in the Palaearctic. Even based on the incomplete data, China harbors at least 25% of all Palaearctic carabid species. Our study shows that richness and endemism patterns of highly diverse insects can exhibit high congruence with general large scale patterns of diversity inferred from plants/vertebrates and that hotspots derived from the latter can also include a high diversity of invertebrates. In this regard, China qualifies as an outstanding multi-taxon hotspot of diversity, requiring intense biodiversity research and conservation effort. Our findings extend the limited knowledge on broad-scale invertebrate distributions and allow for a better understanding of diversity patterns across a larger range of the world's biodiversity than usually considered. [source]


Assessing completeness of biodiversity databases at different spatial scales

ECOGRAPHY, Issue 1 2007
Jorge Soberón
First page of article [source]


Quantitative tools for perfecting species lists

ENVIRONMETRICS, Issue 2 2002
Michael W. Palmer
Abstract A substantial body of literature has accumulated on the topic of the estimation of species richness by extrapolation. However, most of these methods rely on an objective sampling of nature. This condition is difficult to meet and seldom achieved for large regions. Furthermore, scientists conducting biological surveys often already have preliminary but subjectively gathered species lists, and would like to assess the completeness of such lists, and/or to find a way to perfect them. We propose several strategies for utilizing external data (such as might be obtained using GIS) to aid in the completion of species lists. These include: (i) using existing species lists to develop predictive models; (ii) using the uniqueness of the environment as a guide to find underrepresented species; (iii) using spectral heterogeneity to locate environmentally heterogeneous regions; (iv) combining surveys with statistical model-building in an iterative manner. We demonstrate the potential of these approaches using simulation and case studies from Oklahoma. Copyright © 2002 John Wiley & Sons, Ltd. [source]
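The extrapolation estimators referred to at the start of this abstract infer the unseen portion of the species pool from the abundance of rare species. A minimal sketch of one widely used example, the Chao1 estimator (the abundance counts are hypothetical):

```python
def chao1(abundances):
    """Chao1 estimate of total species richness from abundance data:
    S_obs + F1^2 / (2 * F2), with the bias-corrected form when F2 == 0."""
    s_obs = len(abundances)
    f1 = sum(1 for n in abundances if n == 1)  # singleton species
    f2 = sum(1 for n in abundances if n == 2)  # doubleton species
    if f2 > 0:
        return s_obs + f1 * f1 / (2.0 * f2)
    return s_obs + f1 * (f1 - 1) / 2.0

# Hypothetical abundances for 8 observed species.
print(chao1([12, 7, 5, 3, 2, 1, 1, 1]))  # estimated richness > 8
```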


Focal Ictal β Discharge on Scalp EEG Predicts Excellent Outcome of Frontal Lobe Epilepsy Surgery

EPILEPSIA, Issue 3 2002
Gregory A. Worrell
Summary: Purpose: To determine whether a focal β-frequency discharge at seizure onset on scalp EEG predicts outcome of frontal lobe epilepsy (FLE) surgery. Methods: We identified 54 consecutive patients with intractable FLE who underwent epilepsy surgery between December 1987 and December 1996. A blind review of EEGs and magnetic resonance images (MRIs) was performed. Lesional epilepsy is defined as presence of an underlying structural abnormality on MRI. Results: Overall, 28 (52%) patients were seizure free, with a mean follow-up of 46.5 months. Presence of a focal β-frequency discharge at seizure onset on scalp EEG predicted seizure-free outcome in lesional (p = 0.02) and nonlesional (p = 0.01) epilepsy patients. At least 90% of patients who had either lesional or nonlesional epilepsy were seizure free if scalp EEG revealed a focal β discharge at ictal onset. Moreover, logistic regression analysis showed that focal ictal β pattern and completeness of lesion resection were independently predictive of seizure-free outcome. Ictal onset with lateralized EEG activity of any kind and postresection electrocorticographic spikes did not predict surgical outcome (p > 0.05). Conclusions: Only about 25% of FLE surgical patients have a focal β-frequency discharge at seizure onset on scalp EEG. However, its presence is highly predictive of excellent postsurgical seizure control in either lesional or nonlesional FLE surgical patients. [source]


Time asymmetric quantum theory – II.

FORTSCHRITTE DER PHYSIK/PROGRESS OF PHYSICS, Issue 6 2003
Relativistic resonances from S-matrix poles
Abstract Relativistic resonances and decaying states are described by representations of Poincaré transformations, similar to Wigner's definition of stable particles. To associate decaying state vectors to resonance poles of the S-matrix, the conventional Hilbert space assumption (or asymptotic completeness) is replaced by a new hypothesis that associates different dense Hardy subspaces to the in- and out-scattering states. Then one can separate the scattering amplitude into a background amplitude and one or several "relativistic Breit-Wigner" amplitudes, which represent the resonances per se. These Breit-Wigner amplitudes have a precisely defined lineshape and are associated to exponentially decaying Gamow vectors which furnish the irreducible representation spaces of causal Poincaré transformations into the forward light cone. [source]
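The "relativistic Breit-Wigner" amplitudes referred to here have the standard pole form sketched below; the notation is generic and not necessarily the authors' own:

```latex
% Generic decomposition of a partial-wave amplitude into a background term and
% a relativistic Breit-Wigner pole term with pole position s_R.
a_j(s) = B_j(s) + \frac{R_j}{s - s_R},
\qquad s_R = \left(M_R - \tfrac{i}{2}\,\Gamma_R\right)^{2}
```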


Simplified algorithms for calculating double-couple rotation

GEOPHYSICAL JOURNAL INTERNATIONAL, Issue 1 2007
Yan Y. Kagan
SUMMARY We derive new, simplified formulae for evaluating the 3-D angle of earthquake double-couple (DC) rotation. The complexity of the derived equations depends on both accuracy requirements for angle evaluation and the completeness of desired solutions. The solutions are simpler than my previously proposed algorithm based on the quaternion representation designed in 1991. We discuss advantages and disadvantages of both approaches. These new expressions can be written in a few lines of computer code and used to compare both DC solutions obtained by different methods and variations of earthquake focal mechanisms in space and time. [source]
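The simplification behind quaternion-based comparison of double couples is that a double couple is invariant under 180° rotations about its principal axes, so the minimum rotation angle is 2 arccos of the largest absolute component of the relative quaternion. A minimal sketch (the orientations are hypothetical and the Hamilton quaternion convention is an assumption):

```python
from math import acos, cos, degrees, radians, sin

def qmul(a, b):
    """Hamilton product of quaternions given as (w, x, y, z)."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw * bw - ax * bx - ay * by - az * bz,
            aw * bx + ax * bw + ay * bz - az * by,
            aw * by - ax * bz + ay * bw + az * bx,
            aw * bz + ax * by - ay * bx + az * bw)

def qconj(a):
    w, x, y, z = a
    return (w, -x, -y, -z)

def dc_rotation_angle(q1, q2):
    """Minimum 3-D rotation angle (degrees) between two double couples whose
    principal-axis orientations are the unit quaternions q1 and q2. The DC
    symmetry (180-degree rotations about the principal axes) makes the minimal
    angle 2*arccos of the largest absolute component of the relative quaternion."""
    rel = qmul(qconj(q1), q2)
    return degrees(2.0 * acos(min(1.0, max(abs(c) for c in rel))))

# Hypothetical pair: identical mechanisms, then one rotated 30 degrees about
# the vertical axis (unit quaternion (cos 15, 0, 0, sin 15)).
q_a = (1.0, 0.0, 0.0, 0.0)
q_b = (cos(radians(15.0)), 0.0, 0.0, sin(radians(15.0)))
print(dc_rotation_angle(q_a, q_a))  # 0.0
print(dc_rotation_angle(q_a, q_b))  # 30.0
```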


Elders' assessment of an evolving model of oral health

GERODONTOLOGY, Issue 4 2007
Mario A Brondani
Objectives: To evaluate qualitatively a model of oral health through focus groups among elders. Methods: The participants (30 women and 12 men; mean age: 75 years) attended one of six focus groups to discuss the relevance of the model to their oral health-related beliefs and experiences, and transcripts of the narratives were analysed systematically for the components, associations and recommendations emerging from the discussions. Results: The groups confirmed the relevance of the original components of the model with minor modifications, but felt that for completeness it required four additional components: diet; economic priorities; personal expectations; and health values and beliefs. They recommended that the negative connotations of limited activity, impairment and restricted participation be modified with the positive terms activity and participation, and they suggested that ellipses rather than concentric circles more aptly illustrate the dynamic and overlapping importance of the various components in the model. Conclusion: The original model required additional components and graphic representation to accommodate all of the experiences and beliefs relating to the oral health of the elders who participated in this qualitative study. [source]


Subcutaneous Sumatriptan Pharmacokinetics: Delimiting the Monoamine Oxidase Inhibitor Effect

HEADACHE, Issue 2 2010
Anthony W. Fox
(Headache 2010;50:249-255) Background. The absolute bioavailability of subcutaneous (s.c.) sumatriptan is 96-100%. The decay curve for plasma concentration after 6 mg s.c. sumatriptan (ie, after Tmax = about 0.2 hours) includes a large distribution component. Metabolism by monoamine oxidase-A (MAO-A) leads to about 40% of the s.c. dose appearing in the urine as the inactive indole acetic acid. Product labeling states that co-administration of an inhibitor of MAO-A (a MAOI-A) causes a 2-fold increase in sumatriptan plasma concentrations, and a 40% increase in elimination half-life. Objective. The objective of this study is to determine whether MAOI-A therapy should deter the use of 6 mg s.c. sumatriptan on pharmacokinetic grounds. Methods. Summary pharmacokinetic data were taken from the literature and from GlaxoSmithKline (GSK) study C92-050. Half-times were converted into rate constants, which were then used in a parsimonious compartmental model (needing only 3 simultaneous differential equations). Acceptance criteria for the model included observed plasma sumatriptan concentrations at Tmax, 1, 2, and 10 hours post-dose. A set of 1000 concentration measurements at a resolution of 36 seconds was generated. The model was then perturbed with elimination constants observed during concomitant moclobemide administration, creating a second set of concentration measurements. The 2 sets were then plotted, examined for their differences, and integrated for a second time to obtain and compare areas under the curve (AUCs). Results. The greatest absolute difference between the 2 sets of measurements was 2.85 ng/mL at t = 2.95 hours. A 2-fold difference between the 2 sets occurred only after t = 5.96 hours, when the concentration in the presence of the MAOI-A was 3.72 ng/mL (or <4% of Cmax). At t = 10 hours, the concentrations in both sets were <1 ng/mL (ie, below the lower limit of assay quantitation), and AUC0-10h was 97.4 and 117 ng.hour/mL in the absence and presence of the MAOI-A. Conclusions. There are no pharmacokinetic grounds to deter co-administration of an MAOI-A and subcutaneous sumatriptan. The dominance of the distribution phase and completeness of absorption of a 6 mg dose of s.c. sumatriptan explain the trivial effect size of the MAOI-A on plasma sumatriptan concentrations. Importantly, these findings should not be extrapolated to other routes of administration for sumatriptan. [source]
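A parsimonious compartmental model of the kind described, three coupled first-order differential equations for a subcutaneous depot and two disposition compartments, can be integrated and then re-integrated for AUC in a few lines. The rate constants below are illustrative placeholders, not the values fitted in the study:

```python
# Two-compartment disposition with first-order absorption from a subcutaneous
# depot: three coupled linear ODEs in the spirit of the "parsimonious
# compartmental model" described above. All rate constants are hypothetical.
import numpy as np
from scipy.integrate import odeint

ka, k12, k21, ke = 5.0, 1.2, 0.8, 1.6  # 1/hour, illustrative only

def model(y, t):
    depot, central, peripheral = y
    return [
        -ka * depot,                                            # s.c. depot emptying
        ka * depot - (k12 + ke) * central + k21 * peripheral,   # central compartment
        k12 * central - k21 * peripheral,                       # peripheral compartment
    ]

t = np.linspace(0.0, 10.0, 1001)        # 10 hours at 36-second resolution
y = odeint(model, [6.0, 0.0, 0.0], t)   # 6 mg placed in the s.c. depot
central = y[:, 1]
auc = float(((central[:-1] + central[1:]) / 2.0 * (t[1] - t[0])).sum())  # trapezoidal AUC
print(f"peak central amount = {central.max():.2f} mg, AUC(0-10 h) = {auc:.2f} mg*h")
```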


Factors affecting participation in Sure Start programmes: a qualitative investigation of parents' views

HEALTH & SOCIAL CARE IN THE COMMUNITY, Issue 3 2007
Mark Avis BA(Hons) MSc RN RNT Cert Ed
Abstract The objectives of the present study were to examine the factors that parents identify as promoting or hindering participation in Sure Start programmes, and to identify methods for enhancing parents' engagement with Sure Start. A qualitative, in-depth interview study was conducted with parents registered with two local Sure Start programmes based in the East Midlands, UK, and located in inner city areas with a range of health and social problems associated with social exclusion and disadvantage. Sixty parents, guardians or carers of children living in both Sure Start areas were recruited during autumn of 2004 on the basis of whether they were identified as a ,frequent user' or ,non-frequent user' of Sure Start services. The data were analysed using a thematic approach supported by NVivo computer software, and explanatory themes were subsequently tested for completeness and adequacy. The results of the study indicated that parents who used Sure Start services were positive about the benefits that they obtained for themselves and their children, in particular in overcoming a sense of isolation. Parents who were non-frequent users identified a number of practical reasons that prevented them using Sure Start services, although parents also recognised a loss of confidence and trust in the local communities summarised in the phrase ,keeping myself to myself'. Parents' awareness of the targeted nature of Sure Start can also lead to stigma and reluctance to use services. It is concluded that continued investment of time and effort in maintaining communication networks between Sure Start staff and local parents is vital if parents and children are to make the best use of Sure Start services. [source]


Quality of histopathological reporting on melanoma and influence of use of a synoptic template

HISTOPATHOLOGY, Issue 6 2010
Lauren E Haydu
Haydu L E, Holt P E, Karim R Z, Madronio C M, Thompson J F, Armstrong B K & Scolyer R A (2010) Histopathology 56, 768–774 Quality of histopathological reporting on melanoma and influence of use of a synoptic template Aims: To evaluate the quality of histopathological reporting for melanoma in a whole population, to assess the influence on quality of the use of a synoptic template and thus to provide an evidence base to guide improvement in reporting melanoma pathology. Methods and results: Histopathology reports of all primary invasive melanomas notified to the New South Wales Central Cancer Registry between October 2006 and October 2007 (n = 3784) were reviewed. A detailed audit of histopathology reports for consecutively diagnosed primary invasive melanoma over 6 months (n = 2082) was performed to assess the quality of each report based on compliance with the 2008 Clinical Practice Guidelines for the Management of Melanoma in Australia and New Zealand. Only half of the initial excision specimen reports included the essential components necessary to stage a melanoma patient according to the 2002 American Joint Committee on Cancer/International Union Against Cancer melanoma staging system. Report format was strongly correlated with completeness and validity of reporting: reports in a synoptic format, with or without a descriptive component, achieved the highest quality levels. Conclusions: Even in a population with a high incidence of melanoma, concordance of pathology reports with current guidelines was comparatively low. Wider adoption of synoptic reporting is likely to increase report quality. [source]


Assessing the reliability of biodiversity databases: identifying evenly inventoried island parasitoid faunas (Hymenoptera: Ichneumonoidea) worldwide

INSECT CONSERVATION AND DIVERSITY, Issue 2 2010
ANA M. C. SANTOS
Abstract. 1. Taxonomic and geographic biases are common in biodiversity inventories, especially in hyperdiverse taxa, such as the Ichneumonoidea. Despite these problems, biodiversity databases could be a valuable source of information if their reliability is carefully assessed. 2. One major problem of using these data for large-scale analyses is the unevenness of data quality from different areas, which makes them difficult to compare. One way of surpassing such problem would be to identify sets of areas that are evenly inventoried. 3. Here, we propose a scoring protocol for the identification of sets of evenly inventoried areas from taxonomic databases, based on three criteria: (i) completeness at high taxonomic levels, (ii) congruence with well-established ecological relationships (such as species–area relationship), and (iii) publication effort received. We apply this protocol to the selection of a set of evenly inventoried islands worldwide for two Ichneumonoidea families (Braconidae and Ichneumonidae) from the data gathered in Taxapad database. 4. From the 118 islands included in Taxapad, 53 and 70 can be considered sufficiently inventoried for Braconidae and Ichneumonidae, respectively. The publication effort criterion was more restrictive than the other two criteria. The Indomalayan, Nearctic and Palearctic regions had more than half of their islands identified as evenly inventoried, for both families. 5. We discuss the generality of the biases and incompleteness of most biodiversity data, and also how the basic principles of the protocol proposed here can be applied to taxonomic databases devoted to other taxa. Also, the islands identified here can serve as the basis for large-scale analyses of the poorly known biogeography of the Ichneumonoidea. [source]


Decision making using time-dependent knowledge: knowledge augmentation using qualitative reasoning

INTELLIGENT SYSTEMS IN ACCOUNTING, FINANCE & MANAGEMENT, Issue 1 2001
Song Jin Yu
In this paper we propose a method to enhance the performance of knowledge-based decision-support systems, knowledge of which is volatile and incomplete by nature in a dynamically changing situation, by providing meta-knowledge augmented by the Qualitative Reasoning (QR) approach. The proposed system is intended to overcome the potential problem of completeness of the knowledge base. Using the deep meta-knowledge incorporated into the QR module, along with the knowledge we gain from applying inductive learning, we then identify the ongoing process and amplify the effects of each pending process on the attribute values. In doing so, we apply the QR models to enhance or reveal the patterns which are otherwise less obvious. The enhanced patterns can eventually be used to improve the classification of the data samples. The success factor hinges on the completeness of the QR process knowledge base. With enough processes taking place, the influence of each process will lead the prediction in a direction that reflects more of the current trend. The preliminary results are successful and shed light on the smooth introduction of Qualitative Reasoning to the business domain from the physical laboratory application. © 2001 John Wiley & Sons, Ltd. [source]


A unified approach to the implicit integration of standard, non-standard and viscous plasticity models

INTERNATIONAL JOURNAL FOR NUMERICAL AND ANALYTICAL METHODS IN GEOMECHANICS, Issue 11 2002
René de Borst
Abstract It is shown how modern concepts to integrate the elasto-plastic rate equations of standard plasticity via an implicit algorithm can be generalized to plasticity without an explicitly defined yield surface and to overstress-type models of viscoplasticity, where the stress point can be located outside the loading surface. For completeness, a tangent operator is derived that is consistent with the update algorithm. Copyright © 2002 John Wiley & Sons, Ltd. [source]
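The elastic predictor/return-mapping structure that the paper generalizes is easiest to see in its simplest setting, 1-D rate-independent von Mises plasticity with linear isotropic hardening, where both the return mapping and the consistent tangent have closed forms. A minimal sketch with hypothetical material parameters, not the paper's generalized algorithm:

```python
def return_mapping_1d(strain_inc, stress_n, alpha_n, E=200e3, H=10e3, sigma_y=250.0):
    """One elastic predictor / return-mapping step for 1-D rate-independent
    plasticity with linear isotropic hardening. Returns the updated stress,
    accumulated plastic strain and the consistent (algorithmic) tangent."""
    stress_trial = stress_n + E * strain_inc                 # elastic predictor
    f_trial = abs(stress_trial) - (sigma_y + H * alpha_n)    # trial yield function
    if f_trial <= 0.0:
        return stress_trial, alpha_n, E                      # elastic step
    dgamma = f_trial / (E + H)                               # closed-form return mapping
    sign = 1.0 if stress_trial >= 0.0 else -1.0
    stress = stress_trial - E * dgamma * sign
    alpha = alpha_n + dgamma
    return stress, alpha, E * H / (E + H)                    # consistent tangent

# Hypothetical strain increment driving the point beyond first yield.
print(return_mapping_1d(strain_inc=0.002, stress_n=0.0, alpha_n=0.0))
```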


A fast, one-equation integration algorithm for the Lemaitre ductile damage model

INTERNATIONAL JOURNAL FOR NUMERICAL METHODS IN BIOMEDICAL ENGINEERING, Issue 8 2002
E. A. de Souza Neto
Abstract This paper introduces an elastic predictor/return mapping integration algorithm for a simplified version of the Lemaitre ductile damage model, whose return mapping stage requires the solution of only one scalar non-linear equation. The simplified damage model differs from its original counterpart only in that it excludes kinematic hardening. It can be used to predict ductile damage growth whenever load reversal is absent or negligible – a condition met in a vast number of practical engineering applications. The one-equation integration scheme proves highly efficient in the finite element solution of typical boundary value problems, requiring computation times comparable to those observed in classical von Mises implementations. This is in sharp contrast to the previously proposed implementations of the original model, whose return mapping may require, in the most general case, the solution of a system of 14 coupled algebraic equations. For completeness, a closed formula for the corresponding consistent tangent operator is presented. The performance of the algorithm is illustrated by means of a numerical example. Copyright © 2002 John Wiley & Sons, Ltd. [source]
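The practical point of the one-equation scheme is that the whole return mapping collapses to a single scalar non-linear equation solved by Newton iteration. The sketch below shows only that solution structure; the residual is a stand-in for, not a reproduction of, the actual Lemaitre residual, and every parameter value is hypothetical:

```python
def newton_scalar(residual, d_residual, x0=0.0, tol=1e-10, max_iter=50):
    """Newton iteration for a single scalar return-mapping equation in the
    incremental plastic multiplier."""
    x = x0
    for _ in range(max_iter):
        r = residual(x)
        if abs(r) < tol:
            return x
        x -= r / d_residual(x)
    raise RuntimeError("return mapping did not converge")

# Stand-in residual: a stiff scalar equation of the general shape produced by a
# damage-coupled return mapping (the real Lemaitre residual is not reproduced here).
G, H, q_trial, q_yield = 80e3, 5e3, 400.0, 250.0
res = lambda dg: q_trial - 3.0 * G * dg - (q_yield + H * dg) * (1.0 + dg) ** 0.5
dres = lambda dg: -3.0 * G - H * (1.0 + dg) ** 0.5 - 0.5 * (q_yield + H * dg) / (1.0 + dg) ** 0.5

dgamma = newton_scalar(res, dres)
print(f"incremental plastic multiplier = {dgamma:.6e}")
```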


Analysis of shear locking in Timoshenko beam elements using the function space approach

INTERNATIONAL JOURNAL FOR NUMERICAL METHODS IN BIOMEDICAL ENGINEERING, Issue 6 2001
Somenath Mukherjee
Abstract Elements based purely on completeness and continuity requirements perform erroneously in a certain class of problems. These are called the locking situations, and a variety of phenomena like shear locking, membrane locking, volumetric locking, etc., have been identified. Locking has been eliminated by many techniques, e.g. reduced integration, addition of bubble functions, use of assumed strain approaches, mixed and hybrid approaches, etc. In this paper, we review the field consistency paradigm using a function space model, and propose a method to identify field-inconsistent spaces for projections that show locking behaviour. The case of the Timoshenko beam serves as an illustrative example. Copyright © 2001 John Wiley & Sons, Ltd. [source]


A natural neighbour Galerkin method with quadtree structure

INTERNATIONAL JOURNAL FOR NUMERICAL METHODS IN ENGINEERING, Issue 6 2005
J. J. Laguardia
Abstract We describe in this paper a highly structured numerical method that allows for an important speedup in the calculations. The method is implemented in a bi-dimensional binary tree (quadtree or octree) structure in a partition of unity framework. The partition of unity is constructed by using natural neighbour interpolation. Data can be easily obtained from voxel or pixel-based images, as well as STL files or other CAD descriptions. The method described here possesses at least linear completeness, and essential boundary conditions are implemented through the characteristic function method, by employing a special class of functions called R-functions. After the theoretical description of the method, some examples of its performance are presented and analysed. Copyright © 2005 John Wiley & Sons, Ltd. [source]


Simulation of high velocity concrete fragmentation using SPH/MLSPH

INTERNATIONAL JOURNAL FOR NUMERICAL METHODS IN ENGINEERING, Issue 10 2003
T. Rabczuk
Abstract The simulation of concrete fragmentation under explosive loading by a meshfree Lagrangian method, the smooth particle hydrodynamics method (SPH) is described. Two improvements regarding the completeness of the SPH-method are examined, first a normalization developed by Johnson and Beissel (NSPH) and second a moving least square (MLS) approach as modified by Scheffer (MLSPH). The SPH-Code is implemented in FORTRAN 90 and parallelized with MPI. A macroscopic constitutive law with isotropic damage for fracture and fragmentation for concrete is implemented in the SPH-Code. It is shown that the SPH-method is able to simulate the fracture and fragmentation of concrete slabs under contact detonation. The numerical results from the different SPH-methods are compared with the data from tests. The good agreement between calculation and experiment suggests that the SPH-program can predict the correct maximum pressure as well as the damage of the concrete slabs. Finally the fragment distributions of the tests and the numerical calculations are compared. Copyright © 2003 John Wiley & Sons, Ltd. [source]