Same Data (same + data)

Terms modified by Same Data

  • same data set

  • Selected Abstracts


    Activist Macroeconomic Policy, Election Effects and the Formation of Expectations: Evidence from OECD Economies

    ECONOMICS & POLITICS, Issue 2 2000
    David Kiefer
    We examine the explanatory power of a political business-cycle theory in which governments practice short-run policy to lessen the impact of exogenous shocks. Governments have ideological objectives with respect to macroeconomic performance, but are constrained by an augmented Phillips curve. The most prominent version, the rational partisan model, incorporates forward-looking expectations. This model can be compared to a competing model based on backward-looking expectations. Alesina and Roubini's recent advocacy of the rational model uses OECD data. Our reconsideration of the same data, updated to 1995, suggests that the adaptive expectations version offers a better explanation than the rational one. [source]


    Major and minor depression in Parkinson's disease: a neuropsychological investigation

    EUROPEAN JOURNAL OF NEUROLOGY, Issue 9 2006
    A. Costa
    Previous studies have failed to distinguish the differential contribution of major and minor depression to cognitive impairment in patients with idiopathic Parkinson's disease (PD). This study was aimed at investigating the relationships among major depression (MD), minor depression (MiD) and neuropsychological deficits in PD. Eighty-three patients suffering from PD participated in the study. MD and MiD were diagnosed by means of a structured interview (SCID-I) based on the DSM-IV criteria, and severity of depression was evaluated by the Beck Depression Inventory. For the neuropsychological assessment, we used standardized scales that measure verbal and visual episodic memory, working memory, executive functions, abstract reasoning and visual-spatial and language abilities. MD patients performed worse than PD patients without depression on two long-term verbal episodic memory tasks, on an abstract reasoning task and on three measures of executive functioning. The MiD patients' performances on the same tests fell between those of the other two groups of PD patients but did not show significant differences. Our results indicate that MD in PD is associated with a qualitatively specific neuropsychological profile that may be related to an alteration of prefrontal and limbic cortical areas. Moreover, the same data suggest that in these patients MiD and MD may represent a gradual continuum associated with increasing cognitive deficits. [source]


    A two-step procedure for constructing confidence intervals of trait loci with application to a rheumatoid arthritis dataset

    GENETIC EPIDEMIOLOGY, Issue 1 2006
    Charalampos Papachristou
    Abstract Preliminary genome screens are usually succeeded by fine mapping analyses focusing on the regions that signal linkage. It is advantageous to reduce the size of the regions where follow-up studies are performed, since this will help better tackle, among other things, the multiplicity adjustment issue associated with them. We describe a two-step approach that uses a confidence set inference procedure as a tool for intermediate mapping (between preliminary genome screening and fine mapping) to further localize disease loci. Apart from the usual Hardy-Weinberg and linkage equilibrium assumptions, the only other assumption of the proposed approach is that each region of interest houses at most one of the disease-contributing loci. Through a simulation study with several two-locus disease models, we demonstrate that our method can isolate the position of trait loci with high accuracy. Application of this two-step procedure to the data from the Arthritis Research Campaign National Repository also led to highly encouraging results. The method not only successfully localized a well-characterized trait-contributing locus on chromosome 6, but also placed its position in narrower regions when compared to their LOD support interval counterparts based on the same data. Genet. Epidemiol. 30:18-29, 2006. © 2005 Wiley-Liss, Inc. [source]


    Searching for phylogenetic pattern in biological invasions

    GLOBAL ECOLOGY, Issue 1 2008
    Şerban Procheş
    Abstract It has been suggested that alien species with close indigenous relatives in the introduced range may have reduced chances of successful establishment and invasion (Darwin's naturalization hypothesis). Studies trying to test this have in fact been addressing four different hypotheses, and the same data can support some while rejecting others. In this paper, we argue that the phylogenetic pattern will change depending on the spatial and phylogenetic scales considered. Expectations and observations from invasion biology and the study of natural communities are that at the spatial scale relevant to competitive interactions, closely related species will be spatially separated, whereas at the regional scale, species in the same genera or families will tend to co-occur more often than by chance. We also argue that patterns in the relatedness of indigenous and naturalized plants are dependent on the continental/island setting, spatial occupancy levels, and on the group of organisms under scrutiny. Understanding how these factors create a phylogenetic pattern in invasions will help us predict which groups are more likely to invade where, and should contribute to general ecological theory. [source]


    Are Alaskan trees found in locally more favourable sites in marginal areas?

    GLOBAL ECOLOGY, Issue 2 2002
    Jack J. Lennon
    Abstract Aim: Species generally become rarer and more patchily distributed as the margins of their ranges are approached. We predicted that in such marginal sites, tree species would tend to occur where some key environmental factors are at particularly favourable levels, compensating in part for the low overall suitability of marginal sites. Location: The article considers the spatial distributions of trees in Southeast Alaska (the Alaskan 'panhandle'). Methods: We quantified range marginality using spatial distributions of eight tree species across more than one thousand surveyed sites in Southeast Alaska. For each species we derived a site core/margin index using a three-dimensional trend surface generated from logistic regression on site coordinates. For each species, the relationships between the environmental factors slope, aspect and site marginality were then compared for occupied and unoccupied sets of sites. Results: We found that site slope is important for more Alaskan tree species than aspect. Three out of eight had a significant core/margin by occupied/unoccupied interaction, tending to be present in significantly shallower-sloped (more favourable) sites in the marginal areas than the simple core/margin trend predicted. For site aspect, one species had a significant interaction, selecting potentially more favourable northerly aspects in marginal areas. A finer-scale analysis based on the same data came to the same overall conclusions. Conclusions: There is evidence that several tree species in Alaska tend to occur in especially favourable sites in marginal areas. In these marginal areas, these species amplify habitat preferences shown in core areas. [source]


    Distinguishing between heterogeneity and inefficiency: stochastic frontier analysis of the World Health Organization's panel data on national health care systems

    HEALTH ECONOMICS, Issue 10 2004
    William Greene
    Abstract The most commonly used approaches to parametric (stochastic frontier) analysis of efficiency in panel data, notably the fixed and random effects models, fail to distinguish between cross-individual heterogeneity and inefficiency. This blending of effects is particularly problematic in the World Health Organization's (WHO) panel data set on health care delivery, which is a 191-country, 5-year panel. The wide variation in cultural and economic characteristics of the worldwide sample produces a large amount of unmeasured heterogeneity in the data. This study examines several alternative approaches to stochastic frontier analysis with panel data, and applies some of them to the WHO data. A more general, flexible model and several measured indicators of cross-country heterogeneity are added to the analysis done by previous researchers. Results suggest that there is considerable heterogeneity that has masqueraded as inefficiency in other studies using the same data. Copyright © 2004 John Wiley & Sons, Ltd. [source]


    Active microwave remote sensing for soil moisture measurement: a field evaluation using ERS-2

    HYDROLOGICAL PROCESSES, Issue 11 2004
    Jeffrey P. Walker
    Abstract Active microwave remote sensing observations of backscattering, such as C-band vertically polarized synthetic aperture radar (SAR) observations from the second European remote sensing (ERS-2) satellite, have the potential to measure moisture content in a near-surface layer of soil. However, SAR backscattering observations are highly dependent on topography, soil texture, surface roughness and soil moisture, meaning that soil moisture inversion from single frequency and polarization SAR observations is difficult. In this paper, the potential for measuring near-surface soil moisture with the ERS-2 satellite is explored by comparing model estimates of backscattering with ERS-2 SAR observations. This comparison was made for two ERS-2 overpasses coincident with near-surface soil moisture measurements in a 6 ha catchment using 15-cm time domain reflectometry probes on a 20 m grid. In addition, 1-cm soil moisture data were obtained from a calibrated soil moisture model. Using state-of-the-art theoretical, semi-empirical and empirical backscattering models, it was found that using measured soil moisture and roughness data there were root mean square (RMS) errors from 3·5 to 8·5 dB and r2 values from 0·00 to 0·25, depending on the backscattering model and degree of filtering. Using model soil moisture in place of measured soil moisture reduced RMS errors slightly (0·5 to 2 dB) but did not improve r2 values. Likewise, using the first day of ERS-2 backscattering and soil moisture data to solve for RMS surface roughness reduced RMS errors in backscattering for the second day to between 0·9 and 2·8 dB, but did not improve r2 values. Moreover, RMS differences were as large as 3·7 dB and r2 values as low as 0·53 between the various backscattering models, even when using the same data as input. 
These results suggest that more research is required to improve the agreement between backscattering models, and that ERS-2 SAR data may be useful for estimating field-scale average soil moisture but not variations at the hillslope scale. Copyright © 2004 John Wiley & Sons, Ltd. [source]


    A comparison of tropical temperature trends with model predictions

    INTERNATIONAL JOURNAL OF CLIMATOLOGY, Issue 13 2008
    David H. Douglass
    Abstract We examine tropospheric temperature trends of 67 runs from 22 'Climate of the 20th Century' model simulations and try to reconcile them with the best available updated observations (in the tropics during the satellite era). Model results and observed temperature trends are in disagreement in most of the tropical troposphere, being separated by more than twice the uncertainty of the model mean. In layers near 5 km, the modelled trend is 100 to 300% higher than observed, and, above 8 km, modelled and observed trends have opposite signs. These conclusions contrast strongly with those of recent publications based on essentially the same data. Copyright © 2007 Royal Meteorological Society [source]


    Social marketing in action: geodemographics, alcoholic liver disease and heavy episodic drinking in Great Britain

    INTERNATIONAL JOURNAL OF NONPROFIT & VOLUNTARY SECTOR MARKETING, Issue 3 2007
    Jane Powell
    This paper explores the use of geodemographic population classifications to identify and predict 'hotspots' of Great Britain (England, Scotland and Wales) prone to greater than expected alcoholic liver disease. MOSAIC geodemographic codes were overlaid onto Hospital Episode Statistics (HES) for Great Britain. The HES data included gender, MOSAIC Type, MOSAIC Code, postal and local authority district, month and year of birth, ethnic origin, Primary Care Trust and GP code. Analysis demonstrated that some geodemographic classifications of the population were over-represented for alcoholic liver disease episodes. These groups had low socio-economic and socio-cultural status and lived in areas of high deprivation and disadvantage. Manchester, followed by Liverpool and Hull, had the highest estimated patient group size in England, and Hart, Surrey Heath and Wokingham the three lowest (indicating low expected levels of alcoholic liver disease compared with average). Analysis of the same data was also carried out at postcode level for Manchester, indicating 'hotspots' for alcoholic liver disease at street level. This analysis exemplifies the ways in which geodemographic data might be usefully applied to routine health service data to enhance service planning, delivery and improved targeting of information in harder-to-reach populations. Copyright © 2007 John Wiley & Sons, Ltd. [source]


    Measurement of body size and abundance in tests of macroecological and food web theory

    JOURNAL OF ANIMAL ECOLOGY, Issue 1 2007
    SIMON JENNINGS
    Summary
    1. Mean body mass (W) and mean numerical (N) or biomass (B) abundance are frequently used as variables to describe populations and species in macroecological and food web studies.
    2. We investigate how the use of mean W and mean N or B, rather than other measures of W and/or accounting for the properties of all individuals, can affect the outcome of tests of macroecological and food web theory.
    3. Theoretical and empirical analyses demonstrate that mean W, W at maximum biomass (Wmb), W when energy requirements are greatest (Wme) and the W when a species uses the greatest proportion of the energy available to all species in a W class (Wmpe) are not consistently related.
    4. For a population at equilibrium, relationships between mean W and Wme depend on the slope b of the relationship between trophic level and W. For marine fishes, data show that b varies widely among species and thus mean W is an unreliable indicator of the role of a species in the food web.
    5. Two different approaches, 'cross-species' and 'all individuals', have been used to estimate slopes of abundance-body mass relationships and to test the energetic equivalence hypothesis and related theory. The approaches, based on relationships between (1) log10 mean W and log10 mean N or B, and (2) log10 W and log10 N or B of all individuals binned into log10 W classes (size spectra), give different slopes and confidence intervals with the same data.
    6. Our results show that the 'all individuals' approach has the potential to provide more powerful tests of the energetic equivalence hypothesis and of the role of energy availability in determining slopes, but new theory and empirical analysis are needed to explain distributions of species relative abundance at W.
    7. Biases introduced when working with mean W in macroecological and food web studies are greatest when species have indeterminate growth, when relationships between W and trophic level are strong and when the range of species' W is narrow. [source]
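    The contrast in point 5 between the 'cross-species' and 'all individuals' regressions can be illustrated with a toy sketch. The community data below are invented purely for illustration; both slopes are ordinary least-squares fits on log10 axes, and the binning is a deliberately crude decade-wide size spectrum:

```python
import numpy as np

def ols_slope(x, y):
    """Ordinary least-squares slope of y on x."""
    return np.polyfit(x, y, 1)[0]

# Hypothetical community: per-species mean body mass W and abundance N.
W = np.array([1.0, 2.0, 100.0, 200.0])
N = np.array([1000.0, 10.0, 10.0, 100.0])

# (1) 'cross-species' slope: one point per species,
#     log10 mean N regressed on log10 mean W.
cross_species = ols_slope(np.log10(W), np.log10(N))

# (2) 'all individuals' slope: expand to individuals, bin them into
#     decade-wide log10 W classes, and regress log10 bin counts on class.
masses = np.repeat(W, N.astype(int))
classes = np.floor(np.log10(masses)).astype(int)
counts = np.bincount(classes)
occupied = counts > 0
size_spectrum = ols_slope(np.flatnonzero(occupied).astype(float),
                          np.log10(counts[occupied]))

# The same data yield clearly different slopes under the two approaches.
print(round(cross_species, 3), round(size_spectrum, 3))  # -0.281 -0.481
```

    Because the binned approach weights each species by its individuals while the cross-species approach weights each species equally, the two estimates diverge whenever abundance is unevenly spread across the body-mass range.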


    An algebraic algorithm for generation of three-dimensional grain maps based on diffraction with a wide beam of hard X-rays

    JOURNAL OF APPLIED CRYSTALLOGRAPHY, Issue 1 2004
    T. Markussen
    A reconstruction method is presented for generation of three-dimensional maps of the grain boundaries within powders or polycrystals. The grains are assumed to have a mosaic spread below 1°. They are mapped by diffraction with a wide beam of hard X-rays, using a setup similar to that of parallel-beam absorption contrast tomography. First the diffraction spots are sorted with respect to grain of origin. Next, for each grain the reconstruction is performed by an algebraic algorithm known as three-dimensional ART. From simulations it is found that reconstructions with a spatial accuracy better than the pixel size of the detector can be obtained from as few as five diffraction spots. The results are superior to three-dimensional reconstructions based on the same data using a variant of the filtered back-projection algorithm. In comparison with layer-by-layer type reconstructions based on the two-dimensional ART algorithm, as introduced by Poulsen & Fu [J. Appl. Cryst. (2003), 36, 1062-1068], the quality of the maps is found to be similar, provided that five to ten spots are available for analysis, while data acquisition with the three-dimensional method is much faster. The three-dimensional ART methodology is validated on experimental data. With state-of-the-art detectors, the spatial accuracy is estimated to be 5 µm. [source]
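    The algebraic reconstruction technique (ART) at the core of such methods is, in its simplest form, a Kaczmarz-style row-by-row projection onto the hyperplanes of a linear system. A minimal sketch, assuming a generic system A·x = b rather than the paper's actual diffraction geometry (the toy matrix and values below are invented for illustration):

```python
import numpy as np

def art_reconstruct(A, b, n_iter=50, relax=1.0):
    """ART/Kaczmarz sketch: cycle over equations, projecting the current
    estimate onto the hyperplane of each row of A in turn.

    A: (m, n) system matrix, one row per ray/projection.
    b: (m,) measured projection values.
    """
    x = np.zeros(A.shape[1])
    row_norms = (A ** 2).sum(axis=1)
    for _ in range(n_iter):
        for i in range(A.shape[0]):
            if row_norms[i] == 0:
                continue
            # move x toward the hyperplane A[i] @ x == b[i]
            x += relax * (b[i] - A[i] @ x) / row_norms[i] * A[i]
    return x

# Toy example: recover a 2-pixel "image" from three consistent projections.
A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
x_true = np.array([2.0, 3.0])
b = A @ x_true
x_est = art_reconstruct(A, b, n_iter=200)
print(np.round(x_est, 3))  # recovers [2. 3.]
```

    For a consistent system, the iterates converge to a solution of A·x = b; with few, noisy projections (as with five diffraction spots per grain), the relaxation factor and iteration count control the trade-off between fit and noise amplification.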


    Clouds make nerds look good: field evidence of the impact of incidental factors on decision making

    JOURNAL OF BEHAVIORAL DECISION MAKING, Issue 2 2007
    Uri Simonsohn
    Abstract Abundant experimental research has documented that incidental primes and emotions are capable of influencing people's judgments and choices. This paper examines whether the influence of such incidental factors is large enough to be observable in the field, by analyzing 682 actual university admission decisions. As predicted, applicants' academic attributes are weighted more heavily on cloudier days and non-academic attributes on sunnier days. The documented effects are of both statistical and practical significance: changes in cloud cover can increase a candidate's predicted probability of admission by an average of up to 11.9%. These results also shed light on the causes behind the long demonstrated unreliability of experts making repeated judgments from the same data. Copyright © 2006 John Wiley & Sons, Ltd. [source]


    Future eating and country keeping: what role has environmental history in the management of biodiversity?

    JOURNAL OF BIOGEOGRAPHY, Issue 5 2001
    D.M.J.S. Bowman
    In order to understand and moderate the effects of the accelerating rate of global environmental change, land managers and ecologists must not only think beyond their local environment but also put their problems into a historical context. It is intuitively obvious that historians should be natural allies of ecologists and land managers as they struggle to maintain biodiversity and landscape health. Indeed, 'environmental history' is an emerging field where the previously disparate intellectual traditions of ecology and history intersect to create a new and fundamentally interdisciplinary field of inquiry. Environmental history is rapidly becoming an important field, displacing many older environmentally focused academic disciplines as well as capturing the public imagination. By drawing on Australian experience I explore the role of 'environmental history' in managing biodiversity. First I consider some of the similarities and differences of the ecological and historical approaches to the history of the environment. Then I review two central questions in Australian environmental history: landscape-scale changes in woody vegetation cover since European settlement and the extinction of the marsupials in both historical and pre-historical time. These case studies demonstrate that environmental historians can reach conflicting interpretations despite using essentially the same data. The popular success of some environmental histories hinges on the fact that they narrate a compelling story concerning human relationships and human value judgements about landscape change. Ecologists must learn to harness the power of environmental history narratives to bolster land management practices designed to conserve biological heritage. They can do this by using various currently popular environmental histories as a point of departure for future research, for instance by testing the veracity of competing interpretations of landscape-scale change in woody vegetation cover.
They also need to learn how to write parables that communicate their research findings to land managers and the general public. However, no matter how sociologically or psychologically satisfying a particular environmental history narrative might be, it must be open to being superseded by new stories that incorporate the latest research discoveries and reflect changing social values of nature. It is contrary to a rational and publicly acceptable approach to land management to read a particular story as revealing the absolute truth. [source]


    Multiple classifier integration for the prediction of protein structural classes

    JOURNAL OF COMPUTATIONAL CHEMISTRY, Issue 14 2009
    Lei Chen
    Abstract Supervised classifiers, such as artificial neural networks, partition trees, and support vector machines, are often used for the prediction and analysis of biological data. However, choosing an appropriate classifier is not straightforward, because each classifier has its own strengths and weaknesses and each biological dataset has its own characteristics. By integrating many classifiers, one can avoid the dilemma of choosing a single classifier and achieve better classification results (Rahman et al., Multiple Classifier Combination for Character Recognition: Revisiting the Majority Voting System and Its Variation, Springer, Berlin, 2002, 167-178). The classification algorithms come from Weka (Witten and Frank, Data Mining: Practical Machine Learning Tools and Techniques, Morgan Kaufmann, San Francisco, 2005), a collection of software tools for machine learning algorithms. By integrating many predictors (classifiers) through simple voting, the correct prediction (classification) rates are 65.21% and 65.63% for a basic training dataset and an independent test set, respectively. These results are better than those of any single machine learning algorithm in Weka when exactly the same data are used. Furthermore, we introduce an integration strategy that accounts for both classifier weightings and classifier redundancy. A feature selection strategy, called minimum redundancy maximum relevance (mRMR), is adapted for algorithm selection to deal with classifier redundancy, and the weightings are based on the performance of each classifier. The best classification results are obtained when 11 algorithms selected by the mRMR method are integrated through weighted majority votes. As a result, the correct prediction rates rise to 68.56% and 69.29% for the basic training dataset and the independent test dataset, respectively.
The web-server is available at http://chemdata.shu.edu.cn/protein_st/. © 2009 Wiley Periodicals, Inc. J Comput Chem, 2009 [source]
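    The weighted majority vote used to integrate the classifiers can be sketched as follows. This is a generic illustration only: the labels and accuracy weightings below are invented, and the paper's actual strategy additionally removes redundant classifiers via mRMR before voting:

```python
from collections import defaultdict

def weighted_majority_vote(predictions, weights):
    """Combine class labels from several classifiers by weighted vote.

    predictions: list of labels, one per classifier.
    weights: matching list of weights (e.g. each classifier's accuracy).
    Returns the label with the largest total weight.
    """
    score = defaultdict(float)
    for label, w in zip(predictions, weights):
        score[label] += w
    return max(score, key=score.get)

# Three hypothetical classifiers vote on a protein structural class.
labels = ["alpha", "beta", "alpha"]
accuracies = [0.60, 0.70, 0.65]  # per-classifier weightings
print(weighted_majority_vote(labels, accuracies))  # alpha: 1.25 vs beta: 0.70
```

    With equal weights this reduces to the simple voting that gave the 65% rates; performance-based weights let stronger classifiers break ties in their favour.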


    Factors mediating the effect of gender on ninth-grade Turkish students' misconceptions concerning electric circuits

    JOURNAL OF RESEARCH IN SCIENCE TEACHING, Issue 6 2004
    Selen Sencar
    This study was designed to identify and analyze possible factors that mediate the effect of gender on ninth-grade Turkish students' misconceptions concerning electric circuits. A Simple Electric Circuit Concept Test (SECCT), including items with both practical and theoretical contexts, and an Interest-Experience Questionnaire about Electricity (IEQ) were administered to 1,678 ninth-grade students (764 male, 914 female) after the completion of a unit on electricity to assess students' misconceptions and interests-experiences about electricity. Results of the concept test indicated that general performances of the students were relatively low and that many students had misconceptions in interpreting electric circuits. When the data were analyzed using MANOVA and follow-up ANOVAs, a gender difference favoring males was observed on the dependent variable of total scores on the 10 practical items; however, there was no significant gender difference on the dependent variable of total scores on the six theoretical items. Moreover, when the same data were analyzed using MANCOVA and follow-up ANCOVAs, controlling for students' age and interest-experience related to electricity, the observed gender difference on the practical items was mediated by these covariates. © 2004 Wiley Periodicals, Inc. J Res Sci Teach 41: 603-616, 2004 [source]


    Characterization of the Grain-Boundary Character and Energy Distributions of Yttria Using Automated Serial Sectioning and EBSD in the FIB

    JOURNAL OF THE AMERICAN CERAMIC SOCIETY, Issue 7 2009
    Shen J. Dillon
    A dual-beam focused ion beam scanning electron microscope was used to collect a series of parallel electron backscatter diffraction maps of polycrystalline yttria. Using characteristics of the triple junctions, the individual layers were aligned and the geometries of the grain-boundary planes between the layers were determined. This information was used to calculate the five-parameter grain-boundary character distribution (GBCD) and grain-boundary energy distribution (GBED). The GBCD derived from the three-dimensional data was qualitatively the same as that derived from a stereological analysis of the same data. The anisotropy in the GBCD of yttria is relatively weak compared with other ceramics and is inversely correlated to the GBED. [source]


    "Automatism" and the emergence of dynamic psychiatry

    JOURNAL OF THE HISTORY OF THE BEHAVIORAL SCIENCES, Issue 1 2003
    Adam Crabtree
    This article is about the clash of two explanatory paradigms, each attempting to account for the same data of human experience. In the first half of the nineteenth century, physiologists investigated reflex actions and applied a recently coined word, "automatism," to describe actions which, although seeming to arise from higher centers, actually result from automatic reaction to sensory stimuli. Experiments with spinal reflexes led to the investigation of the reflex action of the brain or "cerebral automatisms." Reflex actions of this kind were used to explain everything from acting compulsively to composing symphonies. Physiological explanations of phenomena of this kind seemed insufficient to some and, in the 1880s, Frederic Myers and Pierre Janet developed psychological frameworks for understanding these phenomena, positing hidden centers of intelligence at work in the individual, outside ordinary awareness, which produce what came to be called "psychological automatisms." Their attempts to unify this psychological framework with the existing physiological one failed. Nevertheless, their work played a crucial role in paving the way for what Ellenberger called dynamic psychiatry, which accepts the reality of an unconscious dynamic of the psyche. © 2003 Wiley Periodicals, Inc. [source]


    New approach to 3D time-resolved angiography

    MAGNETIC RESONANCE IN MEDICINE, Issue 5 2002
    Bruno Madore
    Abstract TRICKS is an acquisition and reconstruction method capable of generating 3D time-resolved angiograms. Arguably, the main problem with TRICKS is the way it handles the outer regions of the k-space matrix, leading to artifacts at the edges of blood vessels. An alternative to the data-processing stage of TRICKS, designed to better represent edges and small vessels, is presented here. A weakness of the new approach is an increased sensitivity to motion compared to TRICKS. Since this method can use the same data as TRICKS, a hybrid reconstruction method could conceivably be developed where the advantages of both approaches are combined. Magn Reson Med 47:1022-1025, 2002. © 2002 Wiley-Liss, Inc. [source]


    Predictive toxicogenomics approaches reveal underlying molecular mechanisms of nongenotoxic carcinogenicity

    MOLECULAR CARCINOGENESIS, Issue 12 2006
    Alex Y. Nie
    Toxicogenomics technology defines toxicity gene expression signatures for early prediction and for generating hypotheses for mechanistic studies, both important approaches for evaluating the toxicity of drug candidate compounds. A large gene expression database built using cDNA microarrays and liver samples treated with over one hundred paradigm compounds was mined to determine gene expression signatures for nongenotoxic carcinogens (NGTCs). Data were obtained from male rats treated for 24 h. Training/testing sets of 24 NGTCs and 28 noncarcinogens were used to select genes. A semiexhaustive, nonredundant gene selection algorithm yielded six genes (nuclear transport factor 2, NUTF2; progesterone receptor membrane component 1, Pgrmc1; liver uridine diphosphate glucuronyltransferase, phenobarbital-inducible form, UDPGTr2; metallothionein 1A, MT1A; suppressor of lin-12 homolog, Sel1h; and methionine adenosyltransferase 1, alpha, Mat1a), which identified NGTCs with 88.5% prediction accuracy estimated by cross-validation. This six-gene signature set also predicted NGTCs with 84% accuracy when samples were hybridized to commercially available CodeLink oligo-based microarrays. To unveil molecular mechanisms of nongenotoxic carcinogenesis, 125 differentially expressed genes (P < 0.01) were selected by Student's t-test. These genes appear biologically relevant: of 71 well-annotated genes among the 125, 62 were overrepresented in five biochemical pathway networks (most linked to cancer), and all of these networks were linked by one gene, c-myc. Gene expression profiling at early time points accurately predicts the NGTC potential of compounds, and the same data can be mined effectively for other toxicity signatures. Predictive genes confirm prior work and suggest pathways critical for early stages of carcinogenesis. © 2006 Wiley-Liss, Inc. [source]


    MPowering ecologists: community assembly tools for community assembly rules

    OIKOS, Issue 7 2010
    Joshua Ladau
    Null model tests of presence-absence data ('NMTPAs') provide important tools for inferring effects of competition, facilitation, habitat filtering, and other ecological processes from observational data. Many NMTPAs have been developed, but they often yield conflicting conclusions when applied to the same data. Type I and II error rates, size, power, robustness and bias provide important criteria for assessing which tests are valid, but these criteria need to be evaluated contingent on the sample size, the null hypothesis of interest, and the assumptions that are appropriate for the data set being analyzed. In this paper, we confirm that this is the case using the software MPower, evaluating the validity of NMTPAs contingent on the null hypothesis being tested, the assumptions that can be made, and sample size. Evaluating the validity of NMTPAs contingent on these factors is important for ensuring that reliable inferences are drawn from observational data about the processes controlling community assembly. [source]
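    A minimal example of a null model test of presence-absence data: the observed C-score (checkerboard units per species pair) is compared against a Monte Carlo null distribution obtained by shuffling each species' occurrences across sites (row sums fixed, columns equiprobable). This is one of the simplest possible NMTPAs, shown for illustration only; the toy matrix is invented, and MPower evaluates far more elaborate tests:

```python
import numpy as np

rng = np.random.default_rng(0)

def c_score(m):
    """Mean checkerboard score over all species pairs.

    m: (species x sites) 0/1 presence-absence matrix.
    """
    s = m.shape[0]
    total, pairs = 0.0, 0
    for i in range(s):
        for j in range(i + 1, s):
            ri, rj = m[i].sum(), m[j].sum()
            shared = (m[i] & m[j]).sum()
            total += (ri - shared) * (rj - shared)
            pairs += 1
    return total / pairs

def null_test(m, n_sim=999):
    """Monte Carlo p-value: how often a shuffled matrix is at least
    as segregated (C-score >= observed) as the real one."""
    obs = c_score(m)
    null = np.array([c_score(np.array([rng.permutation(row) for row in m]))
                     for _ in range(n_sim)])
    p = (1 + (null >= obs).sum()) / (n_sim + 1)
    return obs, p

# Two species occupying perfectly complementary sets of eight sites.
m = np.array([[1, 1, 1, 1, 0, 0, 0, 0],
              [0, 0, 0, 0, 1, 1, 1, 1]])
obs, p = null_test(m)
print(obs, p)  # maximal segregation: obs = 16.0, small p
```

    Changing the randomization (e.g. fixing column sums as well) changes the null hypothesis being tested, which is exactly why different NMTPAs can disagree on the same data.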


    Comparative refinement of correct and incorrect structural models of tetrabutylammonium tetrabutylborate: pitfalls arising from poor-quality data

    ACTA CRYSTALLOGRAPHICA SECTION A, Issue 4 2010
    Vladimir Stilinović
    This paper demonstrates how the numerical parameters usually used to assess the quality of a crystal structure solution (R, wR and S) may be misleading when studying a model refined against poor-quality data. Weakly diffracting crystals of tetrabutylammonium tetrabutylborate, a low-density organic salt comprising isoelectronic cations and anions, were measured using Cu and Mo Kα radiation. Along with the correct structural model, six erroneous structural models were constructed and refined against the same data. For both data sets it was found that models based on an incorrect unit-cell choice give lower values of R and wR than the correct one, thus apparently being in better agreement with the measured data. Closer inspection of the measured data shows that this is in fact not the case. [source]
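    For reference, the agreement indices in question can be computed as in this sketch, which assumes one common SHELX-style convention (R1 on F, wR2 and goodness of fit S on F² with w = 1/σ²); real refinement programs use more elaborate weighting schemes:

```python
import numpy as np

def r_factors(Fo2, Fc2, sigma, n_params):
    """R1, wR2 and goodness of fit S (simplified conventions).

    Fo2, Fc2: observed and calculated squared structure factors.
    sigma: standard uncertainties of Fo2 (weights w = 1/sigma**2).
    n_params: number of refined parameters.
    """
    Fo = np.sqrt(np.clip(Fo2, 0, None))
    Fc = np.sqrt(np.clip(Fc2, 0, None))
    w = 1.0 / sigma ** 2
    r1 = np.abs(Fo - Fc).sum() / Fo.sum()
    wr2 = np.sqrt((w * (Fo2 - Fc2) ** 2).sum() / (w * Fo2 ** 2).sum())
    s = np.sqrt((w * (Fo2 - Fc2) ** 2).sum() / (len(Fo2) - n_params))
    return r1, wr2, s

# Toy data: a model that reproduces the data exactly gives zero residuals.
Fo2 = np.array([4.0, 9.0, 16.0])
r1, wr2, s = r_factors(Fo2, Fo2.copy(), sigma=np.ones(3), n_params=1)
print(r1, wr2, s)  # 0.0 0.0 0.0
```

    The paper's point is precisely that low values of these indices are necessary but not sufficient evidence for a correct model: an incorrect unit-cell choice can also drive them down.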


    Analysis of immunoglobulin glycosylation by LC-ESI-MS of glycopeptides and oligosaccharides

    PROTEINS: STRUCTURE, FUNCTION AND BIOINFORMATICS, Issue 14 2008
    Johannes Stadlmann
    Abstract Two LC-ESI-MS methods for the analysis of antibody glycosylation are presented. In the first approach, tryptic glycopeptides are separated by RP chromatography and analyzed by ESI-MS. This "glycopeptide strategy" allows a protein- and subclass-specific quantitation of both neutral and sialylated glycan structures. Additional information about under- or deglycosylation and the protein backbone, e.g., termini, can be extracted from the same data. In the second LC-ESI-MS method, released oligosaccharides are separated on porous graphitic carbon (PGC). A complete structural assignment of neutral and sialylated oligosaccharides occurring on antibodies is thereby achieved in one chromatographic run. The two methods were applied to polyclonal human IgG, to commercial mAbs expressed in CHO cells (Rituximab, Xolair, and Herceptin), in SP2/0 cells (Erbitux and Remicade) or in NS0 cells (Zenapax), and to the anti-HIV antibody 4E10 produced either in CHO cells or in a human cell line. Both methods require comparably little sample preparation and can be applied to SDS-PAGE bands. Both outperform non-MS methods in the reliability of peak assignment, and outperform MALDI-MS of underivatized glycans in the recording of sialylated structures. Regarding fast yet detailed structural assignment, LC-MS on graphitic carbon supersedes all other current methods. [source]


    MORAL CONTEXTUALISM AND MORAL RELATIVISM

    THE PHILOSOPHICAL QUARTERLY, Issue 232 2008
    Berit Brogaard
    Moral relativism provides a compelling explanation of linguistic data involving ordinary moral expressions like 'right' and 'wrong'. But it is a very radical view. Because relativism relativizes sentence truth to contexts of assessment, it forces us to revise standard linguistic theory. If, however, no competing theory explains all of the evidence, perhaps it is time for a paradigm shift. However, I argue that a version of moral contextualism can account for the same data as relativism without relativizing sentence truth to contexts of assessment. This version of moral contextualism is thus preferable to relativism on methodological grounds. [source]


    Can fluctuating asymmetry be used to detect inbreeding and loss of genetic diversity in endangered populations?

    ANIMAL CONSERVATION, Issue 2 2000
    Dean M. Gilligan
    Fluctuating asymmetry (FA), a measure of developmental stability, has been proposed as a simple technique for identifying populations suffering from inbreeding and a loss of genetic diversity. However, there is controversy regarding the relationship between FA and both allozyme heterozygosity and pedigree inbreeding coefficients (F). FA of sternopleural bristle number in Drosophila melanogaster was measured in populations maintained at effective sizes of 25 (8 replicates), 50 (6), 100 (4), 250 (3) and 500 (2) for 50 generations (inbreeding coefficients of 0.05–0.71). FA was calculated from the same data set using three different indices (FA1, FA5 and FA6). There was no significant relationship of FA with pedigree inbreeding coefficients for any of the three indices. The relationship between FA and allozyme heterozygosity was non-significant for indices FA5 and FA6 (the more powerful indices) and only significant for FA1. A second comparison of highly inbred (F ≈ 1) populations with their outbred base population showed significantly greater FA in the inbred populations only when analysed with FA6. Analysis of the same data using FA1 and FA5 showed non-significant relationships in the opposite direction. If a relationship between FA and genetic diversity does exist, it is weak and inconsistent. Consequently, our results do not support the use of FA as a monitoring tool to detect inbreeding or loss of genetic diversity. [source]
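The abstract names three FA indices (FA1, FA5 and FA6) without defining them. In Palmer and Strobeck's widely used numbering, FA1 is the mean absolute right–left difference; the sketch below computes it, together with the mean signed difference that helps distinguish fluctuating from directional asymmetry. FA5 and FA6 involve additional scaling and measurement-error corrections not reproduced here, so this is only a partial illustration of the indices used in the study.

```python
import numpy as np

def fa1(right, left):
    # Palmer & Strobeck's FA1: mean absolute right-left difference.
    return float(np.mean(np.abs(np.asarray(right) - np.asarray(left))))

def directional_bias(right, left):
    # Mean signed right-left difference; should be near zero when
    # asymmetry is fluctuating rather than directional.
    return float(np.mean(np.asarray(right) - np.asarray(left)))
```

As the study found, conclusions can differ depending on which index is applied to the same measurements, which is one reason FA makes an unreliable monitoring tool.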


    Controlling False Discoveries in Multidimensional Directional Decisions, with Applications to Gene Expression Data on Ordered Categories

    BIOMETRICS, Issue 2 2010
    Wenge Guo
    Summary Microarray gene expression studies over ordered categories are routinely conducted to gain insights into biological functions of genes and the underlying biological processes. Some common experiments are time-course/dose-response experiments where a tissue or cell line is exposed to a chemical at different doses and/or for different durations of time. A goal of such studies is to identify gene expression patterns/profiles over the ordered categories. This problem can be formulated as a multiple testing problem where for each gene the null hypothesis of no difference between the successive mean gene expressions is tested and further directional decisions are made if it is rejected. Most existing multiple testing procedures are devised for controlling the usual false discovery rate (FDR) rather than the mixed directional FDR (mdFDR), the expected proportion of Type I and directional errors among all rejections. Benjamini and Yekutieli (2005, Journal of the American Statistical Association, 100, 71–93) proved that an augmentation of the usual Benjamini–Hochberg (BH) procedure can control the mdFDR while testing simple null hypotheses against two-sided alternatives in terms of one-dimensional parameters. In this article, we consider the problem of controlling the mdFDR involving multidimensional parameters. To deal with this problem, we develop a procedure extending that of Benjamini and Yekutieli based on the Bonferroni test for each gene. A proof is given for its mdFDR control when the underlying test statistics are independent across the genes. The results of a simulation study evaluating its performance under independence as well as under dependence of the underlying test statistics across the genes relative to other relevant procedures are reported. Finally, the proposed methodology is applied to a time-course microarray data set obtained by Lobenhofer et al. (2002, Molecular Endocrinology, 16, 1215–1229).
We identified several important cell-cycle genes, such as DNA replication/repair gene MCM4 and replication factor subunit C2, which were not identified by the previous analyses of the same data by Lobenhofer et al. (2002) and Peddada et al. (2003, Bioinformatics, 19, 834–841). Although some of our findings overlap with previous findings, we identify several other genes that complement the results of Lobenhofer et al. (2002). [source]
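The Benjamini–Hochberg step-up procedure that the article's method augments can be sketched as follows. This is the base BH procedure controlling the usual FDR under independence, not the mdFDR-controlling augmentation developed in the article, and is included only to make the starting point concrete.

```python
import numpy as np

def benjamini_hochberg(pvals, q=0.05):
    # Step-up BH procedure: reject all hypotheses with p-values at or
    # below the largest p_(k) satisfying p_(k) <= k*q/m. Controls the
    # FDR at level q when the test statistics are independent.
    p = np.asarray(pvals, dtype=float)
    m = len(p)
    order = np.argsort(p)
    thresh = q * np.arange(1, m + 1) / m
    below = p[order] <= thresh
    reject = np.zeros(m, dtype=bool)
    if below.any():
        k = np.max(np.nonzero(below)[0])  # largest index meeting its threshold
        reject[order[: k + 1]] = True
    return reject
```

In the directional setting the abstract describes, each rejection is followed by a sign decision, and the augmentation adjusts the procedure so that Type I and directional errors are controlled jointly.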


    Efron-Type Measures of Prediction Error for Survival Analysis

    BIOMETRICS, Issue 4 2007
    Thomas A. Gerds
    Summary Estimates of the prediction error play an important role in the development of statistical methods and models, and in their applications. We adapt the resampling tools of Efron and Tibshirani (1997, Journal of the American Statistical Association, 92, 548–560) to survival analysis with right-censored event times. We find that flexible rules, like artificial neural nets, classification and regression trees, or regression splines, can be assessed and compared to less flexible rules in the same data in which they are developed. The methods are illustrated with data from a breast cancer trial. [source]
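A minimal sketch of the resampling idea being adapted: a leave-one-out bootstrap estimate of prediction error, in which each observation is assessed only by models trained on bootstrap samples that exclude it. This illustrates the uncensored case only; the censoring adjustments the article develops for survival data are not shown, and the `fit`/`predict`/`loss` callables are hypothetical placeholders for whatever prediction rule is being assessed.

```python
import numpy as np

rng = np.random.default_rng(1)

def bootstrap_prediction_error(x, y, fit, predict, loss, n_boot=100):
    # Leave-one-out bootstrap: each observation contributes to the error
    # estimate only through models fitted on bootstrap samples that
    # happen not to contain it ("out-of-bag" assessment).
    n = len(y)
    err = np.full((n_boot, n), np.nan)
    for b in range(n_boot):
        idx = rng.integers(0, n, n)            # bootstrap sample (with replacement)
        out = np.setdiff1d(np.arange(n), idx)  # observations left out
        if out.size == 0:
            continue
        model = fit(x[idx], y[idx])
        err[b, out] = loss(y[out], predict(model, x[out]))
    # Simplification: pool all out-of-bag evaluations in a single average.
    return float(np.nanmean(err))
```

Because the out-of-bag observations never influence the fitted model, flexible rules (neural nets, trees, splines) can be compared to simpler ones on the same data without the optimism of apparent error.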


    Sensitivity analysis of different methods of coding taxonomic polymorphism: an example from higher-level bat phylogeny

    CLADISTICS, Issue 6 2002
    Nancy B. Simmons
    New information concerning strengths and weaknesses of different methods of coding taxonomic polymorphisms suggests that results of some previous studies may have been unintentionally biased by the methods employed. In this study, we demonstrate that a form of sensitivity analysis can be used to evaluate the effects of different methods of coding taxonomic polymorphisms on the outcome of phylogenetic analyses. Our earlier analysis of higher-level relationships of bats (Mammalia: Chiroptera) employed superspecific taxa as terminals and scored taxonomic polymorphisms using ambiguity coding. Application of other methods of dealing with polymorphisms (excluding variable characters, inferring ancestral states, majority coding) to the same data yields phylogenetic results that differ somewhat from those originally reported based on ambiguity coding. Monophyly of some clades was supported in all analyses (e.g., Microchiroptera, Rhinopomatoidea, and Nataloidea), while other groups found to be monophyletic in the original study (e.g., Neotropical Nataloidea) appeared unresolved or nonmonophyletic when other methods were used to code taxonomic polymorphisms. Several groupings that were apparently refuted in the initial study (e.g., Noctilionoidea including Mystacinidae) were supported in some analyses, reducing some of the apparent incongruence between the trees in our earlier analysis (which were based principally on morphology) and other trees based on molecular data. Perceived support for various groupings (branch support, bootstrap values) was in some cases significantly affected by the methods employed. These results indicate that sensitivity analysis provides a useful tool for evaluating effects of different methods of dealing with taxonomic polymorphism in superspecific terminal taxa.
Variation in results obtained with different methods suggests that it is always preferable to sample at the species level when higher-level taxa exhibit taxonomic polymorphism, thus avoiding methodological biases associated with different methods of dealing with taxonomic polymorphisms during data analysis. [source]