Incorrect


Terms modified by Incorrect

  • incorrect answer
  • incorrect assumption
  • incorrect conclusion
  • incorrect diagnosis
  • incorrect dose
  • incorrect inference
  • incorrect interpretation
  • incorrect response
  • incorrect result
  • incorrect use

Selected Abstracts


    Incorrect and incomplete coding and classification of diabetes: a systematic review

    DIABETIC MEDICINE, Issue 5 2010
    M. A. Stone
    Diabet. Med. 27, 491–497 (2010) Abstract Aims: To conduct a systematic review to identify types and implications of incorrect or incomplete coding or classification within diabetes or between diabetes and other conditions; also to determine the availability of evidence regarding frequency of occurrence. Methods: Medical Subject Headings (MeSH) and free-text terms were used to search relevant electronic databases for papers published to the end of August 2008. Two researchers independently reviewed titles and abstracts and, subsequently, the full text of potential papers. Reference lists of selected papers were also reviewed and authors consulted. Three reviewers independently extracted data. Results: Seventeen eligible studies were identified, including five concerned with distinguishing between Type 1 and Type 2 diabetes. Evidence was also identified regarding: the distinction between diabetes and no-diabetes, failure to specify type of diabetes, and diagnostic errors or difficulties involving maturity-onset diabetes of the young, latent autoimmune diabetes in adults, pancreatic diabetes, persistence of foetal haemoglobin and acquired immune deficiency syndrome (AIDS). The sample was too heterogeneous to derive accurate information about frequency, but our findings suggested that misclassification occurs most commonly in young people. Implications relating to treatment options and risk management were highlighted, in addition to psychological and financial implications and the potential impact on the validity of quality of care evaluations and research. Conclusions: This review draws attention to the occurrence and implications of incorrect or incomplete coding or classification of diabetes, particularly in young people. A pragmatic and clinically relevant approach to classification is needed to assist those involved in making decisions about types of diabetes. [source]


    Initialization Strategies in Simulation-Based SFE Eigenvalue Analysis

    COMPUTER-AIDED CIVIL AND INFRASTRUCTURE ENGINEERING, Issue 5 2005
    Song Du
    Poor initializations often result in slow convergence, and in certain instances may lead to an incorrect or irrelevant answer. The problem of selecting an appropriate starting vector becomes even more complicated when the structure involved is characterized by properties that are random in nature. Here, a good initialization for one sample could be poor for another sample. Thus, the proper eigenvector initialization for uncertainty analysis involving Monte Carlo simulations is essential for efficient random eigenvalue analysis. Most simulation procedures to date have been sequential in nature, that is, a random vector to describe the structural system is simulated, an FE analysis is conducted, the response quantities are identified by post-processing, and the process is repeated until the standard error in the response of interest is within desired limits. A different approach is to generate all the sample (random) structures prior to performing any FE analysis, sequentially rank order them according to some appropriate measure of distance between the realizations, and perform the FE analyses in similar rank order, using the results from the previous analysis as the initialization for the current analysis. The sample structures may also be ordered into a tree-type data structure, where each node represents a random sample; the tree is then traversed from the root until every node is visited exactly once. This approach differs from the sequential ordering approach in that it uses the solution of the "closest" node to initialize the iterative solver. The computational efficiencies that result from such orderings (at a modest expense of additional data storage) are demonstrated through a stability analysis of a system with closely spaced buckling loads and the modal analysis of a simply supported beam. [source]
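The reuse-the-previous-solution idea in this abstract can be illustrated with a toy experiment. This is only a sketch under assumed conditions: plain power iteration for the dominant eigenpair of small dense matrices, and a single random parameter per realization, standing in for the paper's actual FE/SFE solvers and distance measures.

```python
import numpy as np

def power_iteration(A, v0, tol=1e-10, max_iter=10_000):
    """Power iteration for the dominant eigenpair; returns (eigenvalue, eigenvector, iterations)."""
    v = v0 / np.linalg.norm(v0)
    lam = v @ A @ v
    for k in range(1, max_iter + 1):
        w = A @ v
        v = w / np.linalg.norm(w)
        lam_new = v @ A @ v
        if abs(lam_new - lam) < tol:
            return lam_new, v, k
        lam = lam_new
    return lam, v, max_iter

rng = np.random.default_rng(0)
n, n_samples = 40, 50

# A fixed "structure" with a well-separated dominant eigenvalue, plus a random
# one-parameter perturbation for each Monte Carlo realization
d = np.concatenate([np.linspace(1.0, 2.0, n - 1), [4.0]])
Q, _ = np.linalg.qr(rng.normal(size=(n, n)))
A0 = Q @ np.diag(d) @ Q.T
Bp = rng.normal(size=(n, n))
B = 0.01 * (Bp + Bp.T)                      # small symmetric perturbation matrix
thetas = rng.normal(size=n_samples)         # one random parameter per realization
samples = [A0 + t * B for t in thetas]

# Sequential (unordered) analysis: every solve starts from the same arbitrary vector
v_arb = np.ones(n)
iters_unordered = sum(power_iteration(A, v_arb)[2] for A in samples)

# Ordered analysis: rank the realizations by their parameter "distance" and
# warm-start each solve with the eigenvector of the previous (closest) realization
order = np.argsort(thetas)
v = v_arb
iters_ordered = 0
for idx in order:
    lam, v, k = power_iteration(samples[idx], v)
    iters_ordered += k

print(iters_ordered, iters_unordered)  # the ordered sweep needs fewer total iterations
```

Because neighbouring realizations (after ordering) have nearly identical eigenvectors, each warm-started solve converges in a handful of iterations, which is the efficiency gain the abstract describes.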


    An Alternate Multiple-Choice Scoring Procedure in a Macroeconomics Course

    DECISION SCIENCES JOURNAL OF INNOVATIVE EDUCATION, Issue 1 2004
    David A. Bradbard
    ABSTRACT In the standard scoring procedure for multiple-choice exams, students must choose exactly one response as correct. Often students may be unable to identify the correct response, but can determine that some of the options are incorrect. This partial knowledge is not captured in the standard scoring format. The Coombs elimination procedure is an alternate scoring procedure designed to capture partial knowledge. This paper presents the results of a semester-long experiment where both scoring procedures were compared on four exams in an undergraduate macroeconomics course. Statistical analysis suggests that the Coombs procedure is a viable alternative to the standard scoring procedure. Implications for classroom instruction and future research are also presented. [source]
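The elimination scoring described above is easy to compute. A minimal sketch follows; the particular weights (+1 per correctly eliminated distractor, a penalty of −(k−1) for eliminating the keyed answer) are one common formulation of the Coombs procedure, assumed here rather than taken from the paper.

```python
def coombs_score(eliminated, correct, n_options=4):
    """Score one multiple-choice item under a Coombs-style elimination procedure.

    Assumed weights: +1 for each distractor correctly eliminated, and
    -(n_options - 1) if the keyed answer is eliminated. With 4 options,
    full knowledge (eliminating all 3 distractors) earns 3 points.
    """
    if correct in eliminated:
        return -(n_options - 1)
    return len(eliminated)

# Partial knowledge: the student is sure 'b' and 'd' are wrong but cannot
# decide between 'a' and 'c' (the key is 'a')
print(coombs_score({'b', 'd'}, 'a'))       # 2: partial knowledge earns partial credit
print(coombs_score({'b', 'c', 'd'}, 'a'))  # 3: full knowledge
print(coombs_score({'a', 'b'}, 'a'))       # -3: the keyed answer was eliminated
```

The middle case is exactly the partial knowledge that the standard one-answer format fails to capture: under standard scoring this student must guess and scores 1 or 0, whereas elimination scoring credits the two options correctly ruled out.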


    Soft tissue augmentation 2006: filler fantasy

    DERMATOLOGIC THERAPY, Issue 3 2006
    Arnold William Klein
    ABSTRACT: As an increasing number of patients seek esthetic improvement through minimally invasive procedures, interest in soft tissue augmentation and filling agents is at an all-time high. One reason for this interest is the availability of botulinum toxin type A, which works superbly in the upper face. The rejuvenation of the upper face has created much interest in injectable filling agents and implant techniques that work equally well in the restoration of the lower face. One of the central tenets of soft tissue augmentation is the concept of the three-dimensional face. The youthful face has a soft, full appearance, as opposed to the flat, pulled, two-dimensional look often achieved by more traditional surgical approaches. Injectable filling agents can augment and, at times, even replace pulling. Additionally, with the lip as the focal center of the lower face, subtle lip enhancement is here to stay and is, in fact, the number one indication for injectable fillers. Moreover, minimally invasive soft tissue augmentation offers cosmetic enhancement without the cost and recovery time associated with more invasive procedures. As more and more physicians take interest in minimally invasive surgery, courses in cosmetic surgery techniques are becoming increasingly popular at the medical meetings of many specialties. Today, physicians have a much larger armamentarium of techniques and materials with which to improve facial contours, ameliorate wrinkles, and provide esthetic rejuvenation to the face. For a substance or device to be amenable for soft tissue augmentation in the medical community, it must meet certain criteria. It must have both a high "use" potential, producing cosmetically pleasing results with a minimum of undesirable reactions, and a low abuse potential, in that widespread, incorrect, or indiscriminate use would not result in significant morbidity. It must be nonteratogenic, noncarcinogenic, and nonmigratory. In addition, the agent must provide predictable, persistent correction through reproducible implantation techniques. Finally, the substance, agent or device must be approved by the U.S. Food and Drug Administration, which assures purity, safety, and accessibility, as well as much-needed information regarding use. Having a thorough understanding of the filling agents available, their indications and contraindications, as well as having thorough knowledge of implant technique are vital in providing the patient with an esthetically pleasing result. [source]


    Differentiation and integration: guiding principles for analyzing cognitive change

    DEVELOPMENTAL SCIENCE, Issue 4 2008
    Robert S. Siegler
    Differentiation and integration played large roles within classic developmental theories but have been relegated to obscurity within contemporary theories. However, they may have a useful role to play in modern theories as well, if conceptualized as guiding principles for analyzing change rather than as real-time mechanisms. In the present study, we used this perspective to examine which rules children use, the order in which the rules emerge, and the effectiveness of instruction on water displacement problems. We found that children used systematic rules to solve such problems, and that the rules progress from undifferentiated to differentiated forms and toward increasingly accurate integration of the differentiated variables. Asking children to explain both why correct answers were correct and why incorrect answers were incorrect proved more effective than only requesting explanations of correct answers, which was more effective than just receiving feedback on the correctness of answers. Requests for explanations appeared to operate through helping children notice potential explanatory variables, formulate more advanced rules, and generalize the rules to novel problems. [source]


    What are the policy lessons of National Alcohol Prohibition in the United States, 1920–1933?

    ADDICTION, Issue 7 2010
    Wayne Hall
    ABSTRACT National alcohol prohibition in the United States between 1920 and 1933 is believed widely to have been a misguided and failed social experiment that made alcohol problems worse by encouraging drinkers to switch to spirits and created a large black market for alcohol supplied by organized crime. The standard view of alcohol prohibition provides policy lessons that are invoked routinely in policy debates about alcohol and other drugs. The alcohol industry invokes it routinely when resisting proposals to reduce the availability of alcohol, increase its price or regulate alcohol advertising and promotion. Advocates of cannabis law reform invoke it frequently in support of their cause. This paper aims: (i) to provide an account of alcohol prohibition that is more accurate than the standard account because it is informed by historical and econometric analyses; (ii) to describe the policy debates in the 1920s and 1930s about the effectiveness of national prohibition; and (iii) to reflect on any relevance that the US experience with alcohol prohibition has for contemporary policies towards alcohol. It is incorrect to claim that the US experience of National Prohibition indicates that prohibition as a means of regulating alcohol is always doomed to failure. Subsequent experience shows that partial prohibitions can produce substantial public health benefits at an acceptable social cost, in the absence of substantial enforcement. [source]


    The carbon–nutrient balance hypothesis: its rise and fall

    ECOLOGY LETTERS, Issue 1 2001
    J.G. Hamilton
    The idea that the concentration of secondary metabolites in plant tissues is controlled by the availability of carbon and nitrogen in the environment has been termed the carbon–nutrient balance hypothesis (CNB). This hypothesis has been invoked both for prediction and for post hoc explanation of the results of hundreds of studies. Although it successfully predicts outcomes in some cases, it fails to such an extent that it cannot any longer be considered useful as a predictive tool. As information from studies has accumulated, many attempts have been made to save CNB, but these have been largely unsuccessful and have managed only to limit its utility. The failure of CNB is rooted in assumptions that are now known to be incorrect and it is time to abandon CNB because continued use of the hypothesis is now hindering understanding of plant–consumer interactions. In its place we propose development of theory with a firm evolutionary basis that is mechanistically sophisticated in terms of plant and herbivore physiology and genetics. [source]


    ECONOMIC EFFECTS OF SMOKING BANS ON RESTAURANTS AND PUBS

    ECONOMIC AFFAIRS, Issue 4 2008
    Barrie Craven
    The United Kingdom has recently enacted smoking bans in public places such as restaurants and pubs. Public health advocates argue that bans are necessary because non-smokers need protection from second-hand smoke. Advocates also claim that bans do not exert harm on owners because of a vast empirical literature showing that restaurants and bars in the United States never suffer harm following bans. This paper examines whether these claims are true by developing a model within the Coasian framework whereby owners of businesses have incentives to deal with smoking disputes between smokers and non-smokers. Our model demonstrates that it is incorrect to argue that smoking bans are necessary because the private market has no method of attempting to solve smoking problems. It also predicts that bans exert different effects on different businesses: some will be unaffected while others will experience losses or gains. Our literature review reveals that predictions of differential effects are consistent with the empirical evidence. [source]


    THE REVIVAL OF CULTURAL EXPLANATION IN ECONOMICS

    ECONOMIC AFFAIRS, Issue 4 2003
    E. L. Jones
    Cultural explanations of economic change were largely dropped for a generation, as economists rejected their inconclusiveness and other social scientists labelled them as politically incorrect. Peter Bauer, however, expressed disquiet at the way deep influences like culture were being ignored in economic analysis. This paper discusses why high-profile attention has now turned back to culture. It does not find the expositions offered to be very persuasive but nevertheless agrees that Bauer's unease was understandable and describes other recent academic studies that are more promising. [source]


    Endogenous growth theory: a critique

    ECONOMIC AFFAIRS, Issue 3 2000
    Omar Al-Ubaydli
    Endogenous growth theory is based on a misperception of how science and technology are acquired and diffused. In particular, it is incorrect to assume that knowledge is freely available. Any knowledge which has economic value has to be accessed via the brains of experts who are members of the relevant 'invisible college' and are rivalrous. It therefore has the characteristics of a private good which can be left to conventional economic incentives to supply. [source]


    Growing Up in Guerrilla Camp: The Long-Term Impact of Being a Child Soldier in El Salvador's Civil War

    ETHOS, Issue 4 2002
    Julia Dickson-Gómez
    Many recent wars are characterized by high levels of civilian casualties, a large proportion of whom are women and children. Furthermore, an estimated 300,000 children are actively participating in 36 ongoing or recently ended conflicts around the world. However, there is a dearth of research on the long-term effects of war trauma experienced in childhood or children's active participation in armed conflicts. This article explores the long-term effects of children's active participation in the war in El Salvador by examining four young adults who fought with the guerrilla army as children and adolescents. Comparing these four cases with members of the community who joined and fought with the guerrilla as adults, it will be argued that traumatic experiences were even more devastating when they occurred in early childhood as they destroyed the ability to establish basic trust in competent and nurturing caretakers. Becoming a soldier created additional conflicts as these adolescent soldiers behaved in ways they felt were morally incorrect. Adolescent soldiers were also not given the opportunity to develop autonomy and learn adult peace-time roles. Both the psychological trauma suffered as children as well as continued economic scarcity and violence contribute to these campesinos' difficulties in creating meaningful lives as adults. [source]


    The relationship between baseline value and its change: problems in categorization and the proposal of a new method

    EUROPEAN JOURNAL OF ORAL SCIENCES, Issue 4 2005
    Yu-Kang Tu
    Oral health researchers have shown great interest in the relationship between the initial status of diseases and subsequent changes following treatment. Two main approaches have been adopted to provide evidence of a positive association between baseline values and their changes following treatment. One approach is to use correlation or regression to test the relationship between baseline measurements and subsequent change (correlation/regression approach). The second approach is to categorize the lesions into subgroups, according to threshold values, and subsequently compare the treatment effects across the two (or more) subgroups (categorization approach). However, the correlation/regression approach suffers from a methodological weakness known as mathematical coupling. Consequently, the statistical procedure of testing the null hypothesis becomes inappropriate. Categorization seems to avoid the problem of mathematical coupling, although it still suffers from regression to the mean. We show, first, how the appropriate null hypothesis may be established to analyze the relationship between baseline values and change in the correlation approach and, second, we use computer simulations to investigate the impact of regression to the mean on the significance testing of the differences in the average treatment effects (or average baseline values) in the categorization approach. Data available from previous literature are reanalyzed by testing the appropriate null hypotheses and the results are compared to those from testing the usual (incorrect) null hypothesis. The results indicate that both the correlation and categorization approaches can give rise to misleading conclusions and that more appropriate methods, such as Oldham's method and our new approach of deriving the correct null hypothesis, should be adopted. [source]
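The mathematical-coupling problem and Oldham's correction can be illustrated with a small simulation. This is a sketch of the textbook null scenario (two equally noisy measurements of a stable quantity, no treatment effect at all), not a reproduction of the paper's reanalysis.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10_000
true_level = rng.normal(100.0, 10.0, n)             # each subject's stable true value
baseline = true_level + rng.normal(0.0, 10.0, n)    # two noisy measurements of the
follow_up = true_level + rng.normal(0.0, 10.0, n)   # same quantity: no real effect
change = follow_up - baseline

# Naive approach: baseline and change are mathematically coupled (change
# contains -baseline), so a strong negative correlation appears under the null
r_naive = np.corrcoef(baseline, change)[0, 1]

# Oldham's method: correlate the within-subject mean with the change; when the
# variances at the two occasions are equal, this correlation is close to zero
r_oldham = np.corrcoef((baseline + follow_up) / 2.0, change)[0, 1]

print(round(r_naive, 2), round(r_oldham, 2))  # strongly negative vs. near zero
```

With equal true and error variances as assumed here, the coupled correlation converges to about −0.5 even though nothing is going on, while Oldham's statistic correctly hovers near zero, which is why testing the usual null hypothesis on baseline-vs-change correlations is inappropriate.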


    Self-respect and the Respect of Others

    EUROPEAN JOURNAL OF PHILOSOPHY, Issue 1 2010
    Colin Bird
    It concentrates on a particular version of the claim defended by Avishai Margalit. The paper argues that Margalit's arguments fail to explain why the rival stoic view, that agents ultimately retain responsibility for their own self-respect, is incorrect. [source]


    WHY DOES A METHOD THAT FAILS CONTINUE TO BE USED? THE ANSWER

    EVOLUTION, Issue 4 2009
    It has been claimed that hundreds of researchers use nested clade phylogeographic analysis (NCPA) based on what the method promises rather than requiring objective validation of the method. The supposed failure of NCPA is based upon the argument that validating it by using positive controls ignored type I error, and that computer simulations have shown a high type I error. The first argument is factually incorrect: the previously published validation analysis fully accounted for both type I and type II errors. The simulations that indicate a 75% type I error rate have serious flaws and only evaluate outdated versions of NCPA. These outdated type I error rates fall precipitously when the 2003 version of single-locus NCPA is used or when the 2002 multilocus version of NCPA is used. It is shown that the tree-wise type I errors in single-locus NCPA can be corrected to the desired nominal level by a simple statistical procedure, and that multilocus NCPA reconstructs a simulated scenario used to discredit NCPA with 100% accuracy. Hence, NCPA is not a failed method at all, but rather has been validated both by actual data and by simulated data in a manner that satisfies the published criteria given by its critics. The critics have come to different conclusions because they have focused on the pre-2002 versions of NCPA and have failed to take into account the extensive developments in NCPA since 2002. Hence, researchers can choose to use NCPA based upon objective critical validation that shows that NCPA delivers what it promises. [source]


    The hairs of Plantago reniformis Beck, section Eremopsyllium Pilg. (Plantaginaceae)

    FEDDES REPERTORIUM, Issue 3-4 2003
    E. Andrzejewska-Golec Dr. habil
    The paper is a continuation of the investigations of the hairs in representatives of the family Plantaginaceae. It concerns the monotypic section Eremopsyllium Pilg., which Pilger separated from the section Lamprosantha Decne.; according to Rahn and Rønsted, this separation is incorrect. In Plantago reniformis, only headed hairs typical of the family Plantaginaceae are present. These plesiomorphic hairs have a one-celled stalk and a head divided vertically into two cells. The species also bears headless hairs similar to those of the taxa of subgenus Plantago sensu Rahn. [source]


    MCMC-based linkage analysis for complex traits on general pedigrees: multipoint analysis with a two-locus model and a polygenic component

    GENETIC EPIDEMIOLOGY, Issue 2 2007
    Yun Ju Sung
    Abstract We describe a new program lm_twoqtl, part of the MORGAN package, for parametric linkage analysis with a quantitative trait locus (QTL) model having one or two QTLs and a polygenic component, which models additional familial correlation from other unlinked QTLs. The program has no restriction on number of markers or complexity of pedigrees, facilitating use of more complex models with general pedigrees. This is the first available program that can handle a model with both two QTLs and a polygenic component. Competing programs use only simpler models: one QTL, one QTL plus a polygenic component, or variance components (VC). Use of simple models when they are incorrect, as for complex traits that are influenced by multiple genes, can bias estimates of QTL location or reduce power to detect linkage. We compute the likelihood with Markov Chain Monte Carlo (MCMC) realization of segregation indicators at the hypothesized QTL locations conditional on marker data, summation over phased multilocus genotypes of founders, and peeling of the polygenic component. Simulated examples, with various sized pedigrees, show that two-QTL analysis correctly identifies the location of both QTLs, even when they are closely linked, whereas other analyses, including the VC approach, fail to identify the location of QTLs with modest contribution. Our examples illustrate the advantage of parametric linkage analysis with two QTLs, which provides higher power for linkage detection and better localization than use of simpler models. Genet. Epidemiol. © 2006 Wiley-Liss, Inc. [source]


    A brief study of applications of the generalized reciprocal method and of some limitations of the method

    GEOPHYSICAL PROSPECTING, Issue 5 2000
    Bengt Sjögren
    An analysis of the generalized reciprocal method (GRM), developed by Palmer for the interpretation of seismic refraction investigations, has been carried out. The aim of the present study is to evaluate the usefulness of the method for geotechnical investigations in connection with engineering projects. Practical application of the GRM is the main object of this study rather than the theoretical/mathematical aspects of the method. The studies are partly based on the models and field examples presented by Palmer. For comparison, some other refraction interpretation methods and techniques have been employed, namely the ABC method, the ABEM correction method, the mean-minus-T method and Hales' method. The comparisons showed that the results, i.e. the depths and velocities determined by Palmer, are partly incorrect due to some errors and misinterpretations when analysing the data from field examples. Due to the limitations of the GRM, some of which are mentioned here, stated by Palmer in his various publications, and other shortcomings of the method (e.g. the erasing of valuable information), the GRM must be regarded as being of limited use for detailed and accurate interpretations of refraction seismics for engineering purposes. [source]


    Creative ways to empower action to change the organization: Cases in point

    GLOBAL BUSINESS AND ORGANIZATIONAL EXCELLENCE, Issue 2 2003
    John P. Kotter
    In the Winter 2002 issue, the affiliation of Dan S. Cohen, co-author of the article, "Creative Ways To Empower Action To Change the Organization: Cases in Point" was incorrect. He is a Principal with Deloitte Consulting LLC. [source]


    In the Wake of Mantzikert: The First Crusade and the Alexian Reconquest of Western Anatolia

    HISTORY, Issue 314 2009
    JASON T. ROCHE
    The main aims of this article are threefold. It initially seeks to address two popular misconceptions frequently found in crusade histories and general histories of the Byzantine empire concerning the Turkish invasion and settlement of western Anatolia after the battle of Mantzikert in 1071. The article maintains that blurring the distinctions between the Seljuk Turks of Rûm and the tribes of pastoral nomads or rather transhumants who came to be known as Türkmens or Turcomans is incorrect. The oft-repeated assumption that the Seljuk Turks of Baghdad oversaw the Turkish conquest of Anatolia is addressed when tracing the unstructured nature of the Turkish migration and the subsequent lack of unity amongst the invaders. After providing the context of the Turkish settlement in western Anatolia, the article throws new light on the relative ease with which the armies of the First Crusade traversed the Anatolian plateau and Byzantine forces compelled the speedy capitulation of Turkish towns and territories along the western coastal plains and river valleys of Anatolia in 1097 and 1098 respectively. [source]


    Diagnostic evaluation of cystic pancreatic lesions

    HPB, Issue 1 2008
    B. C. VISSER
    Abstract Background. Cystic pancreatic neoplasms (CPNs) present a unique challenge in preoperative diagnosis. We investigated the accuracy of diagnostic methods for CPN. Material and methods. This retrospective case series includes 70 patients who underwent surgery at a university hospital for presumed CPNs between 1997 and 2003, and for whom a definitive diagnosis was established. Variables examined included symptoms, preoperative work-up (including endoscopic retrograde cholangiopancreatography (ERCP) in 22 cases and endoscopic ultrasound (EUS) in 12), and operative and pathological findings. Preoperative computed tomography (CT) and magnetic resonance imaging (MRI) scans (n=50 patients; CT=48; MRI=13) were independently reviewed by two blinded GI radiologists. Results. The final histopathologic diagnoses were mucinous cystic neoplasm (n=13), mucinous cystadenocarcinoma (10), serous cystadenoma (11), IPMN (14), simple cyst (3), cystic neuroendocrine tumor (5), pseudocyst (4), and other (10). Overall, 25 of 70 were malignant (37%), 21 premalignant (30%), and 24 benign (34%). The attending surgeon's preoperative diagnosis was correct in 31% of cases, incorrect in 29%, non-specific "cystic tumor" in 27%, and "pseudocyst vs. neoplasm" in 11%. Eight had been previously managed as pseudocysts, and 3 pseudocysts were excised as presumed CPN. In review of the CT and MRI, a multivariate analysis of the morphologic features did not identify predictors of specific pathologic diagnoses. Both radiologists were accurate with their preferred (no. 1) diagnosis in <50% of cases. MRI demonstrated no additional utility beyond CT. Conclusions. The diagnosis of CPN remains challenging. Cross-sectional imaging methods do not reliably give an accurate preoperative diagnosis. Surgeons should continue to err on the side of resection. [source]


    Interference control in preschoolers: factors influencing performance on the day–night task

    INFANT AND CHILD DEVELOPMENT, Issue 5 2008
    Derek E. Montgomery
    Abstract Two experiments investigated preschoolers' interference control in variants of the day–night task. The day–night task involves instructing children across 16 trials to say the word 'day' when viewing a card depicting a nighttime sky and to say 'night' when shown a picture of the daytime sky. The purpose of the experiments was to investigate whether the depiction on each card distracts children because it is semantically associated with the instructed response or because the depicted item cues the alternative (incorrect) response within the response set. The results in the first study (N=23, M=52.65 months) and second study (N=54, M=50.81 months) indicate that a close semantic relation between the picture and the target response does not pose substantial interference for preschoolers. In contrast, the pictured item poses a significant challenge for preschoolers when it depicts the interfering alternative in the response set. Theoretical implications of these results for the development of interference control are discussed. Copyright © 2008 John Wiley & Sons, Ltd. [source]


    A Decision-Making Framework for Sediment Contamination

    INTEGRATED ENVIRONMENTAL ASSESSMENT AND MANAGEMENT, Issue 3 2005
    Peter M. Chapman
    Abstract A decision-making framework for determining whether or not contaminated sediments are polluted is described. This framework is intended to be sufficiently prescriptive to standardize the decision-making process but without using "cook book" assessments. It emphasizes 4 guidance "rules": (1) sediment chemistry data are only to be used alone for remediation decisions when the costs of further investigation outweigh the costs of remediation and there is agreement among all stakeholders to act; (2) remediation decisions are based primarily on biology; (3) lines of evidence (LOE), such as laboratory toxicity tests and models that contradict the results of properly conducted field surveys, are assumed incorrect; and (4) if the impacts of a remedial alternative will cause more environmental harm than good, then it should not be implemented. Sediments with contaminant concentrations below sediment quality guidelines (SQGs) that predict toxicity to less than 5% of sediment-dwelling infauna and that contain no quantifiable concentrations of substances capable of biomagnifying are excluded from further consideration, as are sediments that do not meet these criteria but have contaminant concentrations equal to or below reference concentrations. Biomagnification potential is initially addressed by conservative (worst case) modeling based on benthos and sediments and, subsequently, by additional food chain data and more realistic assumptions. Toxicity (acute and chronic) and alterations to resident communities are addressed by, respectively, laboratory studies and field observations. The integrative decision point for sediments is a weight of evidence (WOE) matrix combining up to 4 main LOE: chemistry, toxicity, community alteration, and biomagnification potential. Of 16 possible WOE scenarios, 6 result in definite decisions, and 10 require additional assessment. Typically, this framework will be applied to surficial sediments. 
The possibility that deeper sediments may be uncovered as a result of natural or other processes must also be investigated and may require similar assessment. [source]
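The weight-of-evidence matrix described above can be illustrated with a short sketch. The decision rule below is purely hypothetical (the paper's actual matrix yields 6 definite decisions out of 16; the toy rule here does not reproduce it), but it shows how 4 binary lines of evidence generate 16 scenarios, only some of which resolve without further assessment:

```python
from itertools import product

# The 4 main lines of evidence (LOE) named in the framework.
LOE = ("chemistry", "toxicity", "community_alteration", "biomagnification")

def decide(hits):
    # Hypothetical illustration only -- NOT Chapman's actual matrix:
    # all LOE negative -> no remediation needed;
    # both biological LOE positive -> remediate (rule 2: decisions based on biology);
    # anything else -> additional assessment required.
    if not any(hits.values()):
        return "no remediation"
    if hits["toxicity"] and hits["community_alteration"]:
        return "remediate"
    return "further assessment"

# Each LOE is scored as a hit (True) or no-hit (False): 2^4 = 16 scenarios.
scenarios = [dict(zip(LOE, combo)) for combo in product([False, True], repeat=4)]
decisions = [decide(s) for s in scenarios]
```

Under this toy rule, most combinations (mixed signals) still require additional assessment, mirroring the framework's 10-of-16 outcome.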


    Effects of Rare-Earth Dopants on the Ferroelectric and Pyroelectric Properties of Strontium Barium Niobate Ceramics

    INTERNATIONAL JOURNAL OF APPLIED CERAMIC TECHNOLOGY, Issue 6 2009
    Yingbang Yao
Effects of various rare-earth (RE) dopants (Y3+, La3+, Ce3+, Pr3+, Nd3+, Sm3+, Eu3+, Gd3+, Tm3+, Dy3+, Er3+, and Yb3+) on the dielectric, ferroelectric, and pyroelectric properties of Sr0.5Ba0.5Nb2O6 (SBN50) ceramics were investigated. In the present studies, the doping concentration of each RE dopant was fixed at 1 mol%. Their potential usefulness in pyroelectric applications was assessed on the basis of the measured pyroelectric detectivity figure of merit (FOM). On the basis of our studies, for RE dopants with atomic numbers smaller than that of Nd, the dielectric constants were greatly increased, while for RE dopants with atomic numbers larger than that of Sm, the dielectric constants as well as the dielectric losses became smaller. Among the various dopants, Eu-doped SBN showed the most improved ferroelectric properties: its remnant polarization (Pr) was increased to 4.86 µC/cm2, compared with the 3.23 µC/cm2 obtained in undoped SBN50. On the other hand, Gd-doped SBN exhibited the largest pyroelectric coefficient, 168 µC/m2 K, more than three times that of the undoped sample (49 µC/m2 K). The work shows that Gd-doped SBN has the greatest potential for pyroelectric applications because it bears the largest FOM, 0.45 × 10−5 Pa−0.5 [Correction: After online publication on 11/05/2008, an error was found in this article. The original value, 1.35 × 10−5 Pa−0.5, was incorrect and has been replaced with the correct value.]. [source]
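The figures quoted in the abstract are internally consistent, as a quick arithmetic check shows (values taken directly from the abstract, including the corrected and erroneous FOM values):

```python
# Pyroelectric coefficients (µC/m^2 K), as reported in the abstract.
p_gd, p_undoped = 168.0, 49.0
assert p_gd / p_undoped > 3  # Gd-doped SBN: "more than three times" the undoped value

# Detectivity figure of merit (Pa^-0.5): corrected vs. originally published value.
fom_corrected, fom_original = 0.45e-5, 1.35e-5
ratio = fom_original / fom_corrected  # the erroneous value was exactly 3x too large
```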


    The recent Sahel drought is real

    INTERNATIONAL JOURNAL OF CLIMATOLOGY, Issue 11 2004
    Aiguo Dai
Abstract Using station rainfall data extracted from two comprehensive data sets, we show that large decreasing rainfall trends were widespread in the Sahel (10–20°N and 18°W–20°E) from the late 1950s to the late 1980s. Thereafter, Sahel rainfall has recovered somewhat through 2003, although the drought conditions have not ended in the region. These results confirm the findings of many previous studies. We also found that large multi-year oscillations appear to be more frequent and extreme after the late 1980s than previously. Analyses of Sahel regional rainfall time series derived from a fixed subset of stations and from all available stations show that the decreasing trend in Sahel rainfall is not an artifact of changing station networks. The rainfall model used by Chappell and Agnew (2004, International Journal of Climatology 24: 547–554) is incorrect, and their modelled rainfall time series is totally unrepresentative of Sahel average rainfall. Their conclusion that the Sahel rainfall trends are an artifact of changing station locations is emphatically wrong, and their speculative statements about the implications of their results for other studies and other regions of the world are completely unfounded. Copyright © 2004 Royal Meteorological Society [source]


    Evaluating screening questionnaires using Receiver Operating Characteristic (ROC) curves from two-phase (double) samples

    INTERNATIONAL JOURNAL OF METHODS IN PSYCHIATRIC RESEARCH, Issue 3 2000
    Giulia Bisoffi
Abstract The characteristics of psychiatric screening tests (for example, sensitivity, specificity, and the AUC, the area under an ROC curve) are frequently assessed using data arising from two-phase samples. Too often, however, the statistical methods used are incorrect: they do not appropriately account for the sampling design. Valid methods for estimating sensitivity, specificity, and, in particular, the AUC, together with its standard error, are discussed in detail, and a Stata macro implementing these methods is provided. Simple weighting procedures are used to correct for the verification bias arising from the two-phase design, together with bootstrap or jackknife sampling for the calculation of valid standard errors. Copyright © 2000 Whurr Publishers Ltd. [source]
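A minimal sketch of the inverse-probability-weighting idea behind such corrections (not the paper's Stata macro; the design, sampling fractions, and cut-off below are invented for illustration): screen everyone in phase 1, verify a known fraction against the gold standard in phase 2, then weight each verified subject by the inverse of its verification probability so that screen-negatives, who are verified rarely, are not under-represented.

```python
import numpy as np

rng = np.random.default_rng(0)

# Phase 1: everyone gets a screening score; diseased subjects score higher.
n = 5000
disease = rng.random(n) < 0.2
score = rng.normal(loc=disease.astype(float), scale=1.0)
screen_pos = score > 0.5

# Phase 2: verify 90% of screen-positives but only 10% of screen-negatives.
# These sampling fractions are fixed by design, hence known.
p_verify = np.where(screen_pos, 0.9, 0.1)
verified = rng.random(n) < p_verify

# Inverse-probability weights correct the verification bias.
w = 1.0 / p_verify
v = verified
sens = (w[v] * (screen_pos & disease)[v]).sum() / (w[v] * disease[v]).sum()
spec = (w[v] * (~screen_pos & ~disease)[v]).sum() / (w[v] * (~disease)[v]).sum()
```

Computing sensitivity and specificity from the verified subjects alone, without the weights, would overstate sensitivity, since screen-positives dominate the verified sample.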


    Anomalies in the Foundations of Ridge Regression: Some Clarifications

    INTERNATIONAL STATISTICAL REVIEW, Issue 2 2010
    Prasenjit Kapat
    Summary Several anomalies in the foundations of ridge regression, viewed from the perspective of constrained least-squares (LS) problems, were pointed out in Jensen & Ramirez (2008). Some of these so-called anomalies, attributed to the non-monotonic behaviour of the norm of unconstrained ridge estimators and the consequent lack of sufficiency of Lagrange's principle, are shown to be incorrect. It is noted in this paper that, for a fixed Y, the norms of unconstrained ridge estimators corresponding to the given basis are indeed strictly monotone. Furthermore, the conditions for sufficiency of Lagrange's principle are valid for a suitable range of the constraint parameter. The discrepancy arose, in the context of one data set, from confusion between estimates of the parameter vector β corresponding to different parametrizations (choices of bases) and/or constraint norms. In order to avoid such confusion, it is suggested that the parameter β corresponding to each basis be labelled appropriately. Résumé Several anomalies in the theoretical foundations of ridge regression, considered from a constrained least-squares perspective, were recently pointed out by Jensen and Ramirez (2008). Some of these anomalies were attributed to the non-monotonic behaviour of the norm of the unconstrained ridge estimators, and to the insufficiency of Lagrange's principle. We indicate in this article that, for a fixed value of Y, the norms of the ridge estimators corresponding to a given basis are strictly monotone. Moreover, the conditions ensuring the sufficiency of Lagrange's principle are satisfied for a suitable set of values of the constraint parameter. The origin of the reported anomalies therefore lies elsewhere. This apparent contradiction originates, in the context of the study of one particular data set, in confusion between estimators of the parameter vector β corresponding to different parametrizations (associated with different choices of basis) and/or to different constraint norms. To avoid this type of confusion, it is suggested that the parameter be indexed appropriately by the chosen basis. [source]
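The monotonicity claim is easy to check numerically: for fixed Y and a fixed basis, the norm of the unconstrained ridge estimator (XᵀX + kI)⁻¹XᵀY decreases strictly as the ridge parameter k grows. A small sketch with synthetic data (the dimensions and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 4))   # design matrix in a fixed basis
Y = rng.normal(size=50)        # fixed response vector

def ridge_norm(k):
    # Unconstrained ridge estimator: (X'X + kI)^{-1} X'Y
    beta = np.linalg.solve(X.T @ X + k * np.eye(4), X.T @ Y)
    return np.linalg.norm(beta)

ks = np.linspace(0.0, 10.0, 101)
norms = [ridge_norm(k) for k in ks]
```

In the singular-value basis each component of the estimator has magnitude |σᵢdᵢ|/(σᵢ² + k), which is strictly decreasing in k, so the norm decreases monotonically, in line with the paper's claim. Re-expressing the same data in a different basis rescales β, which is exactly the confusion the paper identifies.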


    Thucydides and Modern Realism

    INTERNATIONAL STUDIES QUARTERLY, Issue 1 2006
    JONATHAN MONTEN
    This paper makes two main arguments about the relationship between Thucydides, modern realism, and the key conceptual ideas they introduce to situate and explain international politics. First, Thucydides refutes the central claim underlying modern realist scholarship: that the sources of state behavior can be located not in the character of the primary political units but in the decentralized system or structure created by their interaction. Second, however, analyses that discuss Thucydides exclusively with respect to this "third-image" realism do not take into account the most important emendation made to political realism in the last half of the twentieth century, Kenneth Waltz's Theory of International Politics. Waltz reformulates the theory of how anarchic political structures affect the behavior of their constituent units and suggests that the question posed by realism, and to be asked of Thucydides, is not whether states behave according to the Athenian thesis or consistently observe the power-political laws of nature, but whether they suffer "costs" in terms of political autonomy, security, and cultural integrity if they do not. Many scholars are therefore incorrect to assume that demonstrating the importance of non-structural factors in The Peloponnesian War severs the connection between Thucydides and structural realism. Thucydides may in fact be a realist, but not for reasons conventionally assumed. [source]


    Molecular systematics of the speciose Indo-Pacific soft coral genus, Sinularia (Anthozoa: Octocorallia)

    INVERTEBRATE BIOLOGY, Issue 4 2009
    Catherine S. McFadden
    Abstract. The speciose tropical soft coral genus Sinularia traditionally has been divided into five intrageneric taxonomic groups based on variation in a single morphological character: the shape of the club sclerites (calcite skeletal elements) embedded in the surface tissues of the colony. To test the phylogenetic utility of this system of classification, we used a 735-bp fragment of the octocoral-specific mitochondrial msh1 gene to construct a molecular phylogeny that included 80 of the ~150 recognized morphospecies of Sinularia. The msh1 phylogeny recovered five well-supported clades, but they were not congruent with the traditional intrageneric taxonomic groups. Mapping of characters onto the tree suggested that the five major clades plus several additional sub-clades of Sinularia can be distinguished based on a suite of four morphological characters; these include the presence of sclerites in the tentacle, collaret, and point regions of the polyps, in addition to the shape of the club sclerites in the surface tissues. The overall growth form of the colony also distinguishes some clades. Polyp sclerites have for the most part been overlooked taxonomically in Sinularia, and as a result information on these characters is lacking or is incorrect in many species descriptions. As has been the case in other recent studies of lower metazoan groups, construction of a molecular phylogeny has led us to recognize the phylogenetic and taxonomic importance of previously overlooked morphological characters. A revised taxonomic key that includes these characters is already improving our ability to discriminate species boundaries, and facilitating description of new Sinularia species. [source]


    Hunting for large carnivore conservation

    JOURNAL OF APPLIED ECOLOGY, Issue 6 2009
    Adrian Treves
    Summary 1. Carnivores are difficult to conserve because of direct and indirect competition with people. Public hunts are increasingly proposed to support carnivore conservation. This article reviews scientific evidence for the effectiveness of public hunts of large carnivores in attaining three common policy goals: stable carnivore populations, preventing conflict with carnivores (property damage and competition over game) and building public support for carnivore conservation. 2. Sustainable exploitation of stable wildlife populations has a solid, scientific foundation, but the theory and its predictions must be adapted to complex patterns of carnivore behavioural ecology and population dynamics that demand years of landscape-level monitoring to understand fully. 3. A review of the evidence that hunting prevents property damage or reduces competition for game reveals large gaps in our understanding. Reducing the number of large carnivores to protect hunters' quarry species seems straightforward, but we still know little about the behavioural and ecological responses of the contested prey and of sympatric meso-predators. For reducing property damage, the direct effect (numerical reduction of problematic individual carnivores) presents numerous obstacles, whereas the indirect effect (behavioural avoidance of humans by hunted carnivores) holds more promise. 4. Scientific measures of public support for carnivore-hunting policies are almost completely lacking, particularly measures of attitudes among hunters before and after controversial wildlife is designated as a legal game species. Moreover, illegal killing of carnivores does not appear to diminish when they are designated as game. 5. Synthesis and applications. Sustainable hunting to maintain stable populations is well understood in theory, but the complex life histories of carnivores, and the behavioural changes of hunters and the carnivores they stalk, may result in unsustainable mortality for carnivores.
The direct impact of hunting on carnivore damage to property is unclear and even doubtful given the inability or unwillingness of hunters to remove specific individuals selectively. However, hunters may indirectly deter carnivores from people and their property. The assumption that hunters will steward carnivores simply because they have in the past helped conserve other game species requires more study as preliminary results suggest it is incorrect. Policy-makers may achieve support for policy if they mesh utilitarian and preservationist values held by the general public. A number of opposed hypotheses should be disentangled before researchers confidently inform policy on sustainable hunting to prevent conflicts and build public support for carnivore conservation. [source]