Precise Definition (precise + definition)
Selected Abstracts

Psychiatric diagnoses in the context of genetic studies of bipolar disorder
BIPOLAR DISORDERS, Issue 6, 2001. Anne Duffy
Precise definition of the phenotype is an issue of critical importance for the future success of genetic studies of bipolar disorders. So far, an uncertain phenotypic spectrum and genetic heterogeneity are realities that have hampered progress in genetic studies. While recognition of a broader spectrum of related illnesses is important for some applications, for genetic studies a narrow spectrum of illness closely tied to the genotype is paramount. This paper highlights current dilemmas and trends associated with phenotype specification and traces historical approaches. Finally, we explore a number of strategic directions in the diagnostic approach to bipolar disorders that may better serve genetic studies. [source]

The metabolic syndrome and changing relationship between blood pressure and insulin with age, as observed in Aboriginal and Torres Strait Islander peoples
DIABETIC MEDICINE, Issue 11, 2005. A. E. Schutte
Abstract. Aims: To determine the prevalence of the metabolic syndrome (MS) among Aboriginal and Torres Strait Islander peoples. A further objective was to investigate the relationships between fasting insulin and blood pressure (BP) within these groups with increasing age. Methods: A cross-sectional population-based study included 369 Torres Strait Islanders (residing in Torres Strait and Far North Queensland) and 675 Aborigines from central Australia. Data necessary for classification of the MS were collected, including fasting and 2-h glucose and insulin, urinary albumin and creatinine, anthropometric measurements, BP and serum lipids. Results: The ATPIII criteria classified 43% of Torres Strait Islanders and 44% of Aborigines with the MS, whereas 32% and 28%, respectively, had the MS according to WHO criteria. Agreement between the two criteria was only modest (kappa coefficient from 0.28 to 0.57).
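The kappa coefficients just quoted measure chance-corrected agreement between the two diagnostic criteria. As a quick illustration of how such a statistic is computed, here is a minimal sketch of Cohen's kappa for two binary classifications; the subject data below are hypothetical, not the study's:

```python
def cohens_kappa(a, b):
    """Chance-corrected agreement between two binary classifications."""
    n = len(a)
    p_obs = sum(x == y for x, y in zip(a, b)) / n   # observed agreement
    pa, pb = sum(a) / n, sum(b) / n                 # 'positive' rate of each rater
    p_exp = pa * pb + (1 - pa) * (1 - pb)           # agreement expected by chance
    return (p_obs - p_exp) / (1 - p_exp)

# Hypothetical example: MS classification (1 = yes) of ten subjects
# under two different criteria
atp3 = [1, 1, 0, 1, 0, 0, 1, 0, 1, 0]
who  = [1, 0, 0, 1, 0, 0, 1, 1, 1, 0]
print(round(cohens_kappa(atp3, who), 2))  # 0.6 here; 0.28-0.57 in the study
```

Values near 1 indicate near-perfect agreement; values in the 0.28-0.57 range reported above are conventionally read as only fair-to-moderate.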
Factor analyses indicated no cluster including both insulin and BP in either population. Significant correlations (P < 0.05) [adjusted for gender, body mass index (BMI) and waist circumference] were observed between BP and fasting insulin: a positive correlation for Torres Strait Islanders aged 15–29 years, and an inverse correlation for Aborigines aged 40 years and older. Conclusion: Torres Strait Islanders and Aborigines had very high prevalences of the MS. Specific population characteristics (high prevalences of central obesity, dyslipidaemia and renal disease) may make the WHO definition preferable to the ATPIII definition in these population groups. The poor agreement between criteria suggests that a more precise definition of the metabolic syndrome that is applicable across populations is required. This study showed an inverse relationship with age for the correlation of BP and fasting insulin. [source]

The notion of 'phonology' in dyslexia research: cognitivism – and beyond
DYSLEXIA, Issue 3, 2007. Per Henning Uppstad
Abstract. Phonology has been a central concept in the scientific study of dyslexia over the past decades. Despite its central position, however, it is a concept with no precise definition or status. The present article investigates the notion of 'phonology' in the tradition of cognitive psychology. An attempt is made to characterize the basic assumptions of the phonological approach to dyslexia and to evaluate these assumptions on the basis of commonly accepted standards of empirical science. First, the core assumptions of phonological awareness are outlined and discussed. Second, the position of Paula Tallal is presented and discussed in order to shed light on an attempt to stretch the cognitive-psychological notion of 'phonology' towards auditory and perceptual aspects. Both the core assumptions and Tallal's position are rejected as unfortunate, albeit for different reasons.
Third, the outcome of this discussion is a search for what is referred to as a 'vulnerable theory' within this field. The present article claims that phonological descriptions must be based on observable linguistic behaviour, so that hypotheses can be falsified by data. Consequently, definitions of 'dyslexia' must be based on symptoms; causal aspects should not be included. In fact, we claim that causal aspects, such as 'phonological deficit', both exclude other causal hypotheses and lead to circular reasoning. If we are to use terms such as 'phonology' and 'phoneme' in dyslexia research, we must have more precise operationalizations of them. Copyright © 2007 John Wiley & Sons, Ltd. [source]

Staging anorexia nervosa: conceptualizing illness severity
EARLY INTERVENTION IN PSYCHIATRY, Issue 1, 2008. Sarah Maguire
Abstract. In recent years, there has been increasing attention to the conceptualization of anorexia nervosa (AN) and its diagnostic criteria. While varying levels of severity within the illness category of AN have long been appreciated, neither a precise definition of severity nor an empirical examination of severity in AN has been undertaken. The aim of this article is to review the current state of knowledge on illness severity and to propose a theoretical model for the definition and conceptualization of severity in AN. AN is associated with significant medical morbidity which is related to the 'severity' of presentation on such markers as body mass index, eating and purging behaviours. The development of a functional staging system, based on symptom severity, is indicated for reasons similar to those cited by the cancer lobby. Improving case management and making appropriate treatment recommendations have been the primary purpose of staging in other fields, and might also apply to AN.
Such a standardized staging system could potentially ease communication between treatment settings, and increase the specificity and comparability of research findings in the field of AN. [source]

Clinical Presentations and Phenomenology of Myoclonus
EPILEPSIA, Issue 2003. Edward Faught
Summary: The term "myoclonus" has been used to describe heterogeneous phenomena involving sudden movements, but there is no generally accepted, precise definition of myoclonus. Myoclonus can often be classified based on electroencephalographic (EEG) and/or electromyographic (EMG) data. Some myoclonic epilepsy syndromes, including juvenile myoclonic epilepsy, may frequently be misdiagnosed because of failure to obtain a complete patient history and/or failure to appreciate characteristic EEG changes. A good understanding of the features associated with myoclonic disorders (particularly the myoclonic epilepsies) and of features associated with other neurologic disorders that are often confused with myoclonic disorders is an invaluable aid in obtaining an accurate diagnosis and will ultimately help in determining the best course of treatment for patients. [source]

Optimum adaptive OFDM systems
EUROPEAN TRANSACTIONS ON TELECOMMUNICATIONS, Issue 3, 2003. Lorenzo Piazzo
When Orthogonal Frequency Division Multiplexing (OFDM) is used to transmit information over a frequency-selective channel, it is convenient to vary the power and the number of bits allocated to each subcarrier in order to optimize the system performance. In this paper, the three classical problems of transmission power minimization, error rate minimization and throughput maximization are investigated in a unified manner. The relations existing among these three problems are clarified and a precise definition of an optimum system is given. A general and rigorous way to extend the solution of any of the three problems in order to obtain the solution of the other two is presented.
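The per-subcarrier power and bit allocation described in this abstract can be illustrated with a greedy margin-adaptive bit-loading sketch in the Hughes-Hartogs style. The channel gains, the SNR-gap cost model and all numbers below are illustrative assumptions, not the paper's algorithm:

```python
def greedy_bit_loading(gains, total_bits, gap=1.0):
    """Margin-adaptive loading: hand out bits one at a time, always to
    the subcarrier whose next bit needs the least extra power.
    With an SNR-gap model, carrying b bits on a subcarrier of power
    gain g costs P(b) = gap * (2**b - 1) / g, so the marginal cost of
    going from b to b + 1 bits is gap * 2**b / g."""
    bits = [0] * len(gains)
    total_power = 0.0
    for _ in range(total_bits):
        # marginal power of one extra bit on each subcarrier
        costs = [gap * 2 ** bits[i] / g for i, g in enumerate(gains)]
        k = min(range(len(gains)), key=costs.__getitem__)
        total_power += costs[k]
        bits[k] += 1
    return bits, total_power

# Three subcarriers with very different gains: the strong one carries
# most of the bits, the weakest none.
bits, power = greedy_bit_loading([4.0, 1.0, 0.25], total_bits=6)
print(bits, power)  # [4, 2, 0] 6.75
```

Running the same loop until a power budget is exhausted (instead of a bit target) turns this transmission-power-minimization sketch into a throughput-maximization one, mirroring the duality among the three problems discussed above.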
This result is used to devise an efficient algorithm for error rate minimization. Copyright © 2003 AEI. [source]

Use of longitudinal data in genetic studies in the genome-wide association studies era: summary of Group 14
GENETIC EPIDEMIOLOGY, Issue S1, 2009. Berit Kerner
Abstract. Participants analyzed actual and simulated longitudinal data from the Framingham Heart Study for various metabolic and cardiovascular traits. The genetic information incorporated into these investigations ranged from selected single-nucleotide polymorphisms to genome-wide association arrays. Genotypes were incorporated using a broad range of methodological approaches, including conditional logistic regression, linear mixed models, generalized estimating equations, linear growth curve estimation, growth modeling, growth mixture modeling, population attributable risk fraction based on survival functions under the proportional hazards models, and multivariate adaptive splines for the analysis of longitudinal data. The specific scientific questions addressed by these different approaches also varied, ranging from a more precise definition of the phenotype, bias reduction in control selection, and estimation of effect sizes and genotype-associated risk, to direct incorporation of genetic data into longitudinal modeling approaches and the exploration of population heterogeneity with regard to longitudinal trajectories. The group reached several overall conclusions: (1) The additional information provided by longitudinal data may be useful in genetic analyses. (2) The precision of the phenotype definition as well as control selection in nested designs may be improved, especially if traits demonstrate a trend over time or have strong age-of-onset effects. (3) Analyzing genetic data stratified for high-risk subgroups defined by a unique development over time could be useful for the detection of rare mutations in common multifactorial diseases.
(4) Estimation of the population impact of genomic risk variants could be more precise. The challenges and computational complexity demanded by genome-wide single-nucleotide polymorphism data were also discussed. Genet. Epidemiol. 33 (Suppl. 1):S93–S98, 2009. © 2009 Wiley-Liss, Inc. [source]

Issues, progress and new results in robust adaptive control
INTERNATIONAL JOURNAL OF ADAPTIVE CONTROL AND SIGNAL PROCESSING, Issue 10, 2006. Sajjad Fekri
Abstract. We overview recent progress in the field of robust adaptive control, with special emphasis on methodologies that use multiple-model architectures. We argue that the selection of the number of models, estimators and compensators in such architectures must be based on a precise definition of the robust performance requirements. We illustrate some of the concepts and outstanding issues by presenting a new methodology that blends robust non-adaptive mixed µ-synthesis designs and stochastic hypothesis-testing concepts, leading to the so-called robust multiple-model adaptive control (RMMAC) architecture. A numerical example is used to illustrate the RMMAC design methodology, as well as its strengths and potential shortcomings. The latter motivated us to develop a variant architecture, denoted RMMAC/XI, that can be effectively used in highly uncertain exogenous plant disturbance environments. Copyright © 2006 John Wiley & Sons, Ltd. [source]

Revisiting N-continuous density-functional theory: Chemical reactivity and "Atoms" in "Molecules"
ISRAEL JOURNAL OF CHEMISTRY, Issue 3-4, 2003. Morrel H. Cohen
We construct an internally consistent density-functional theory valid for noninteger electron numbers N by precise definition of a density functional that is continuous in N. In this theory, charge transfer between the atoms of a heteronuclear diatomic molecule, which have been separated adiabatically to infinity, is avoided because the hardness for fractional occupation of a single HOMO spin-orbital is negative.
This N-continuous density functional makes possible a variational theory of "atoms" in "molecules" that exactly decomposes the molecular electron density into a sum of contributions from its parts. The parts are treated as though isolated. That theory, in turn, gives a deep foundation to the chemical reactivity theory, provided that the hardness of entities with vanishing spin density is positive, as argued to be the case here. This transition from negative to positive hardness closely parallels the transition from the Heitler-London to the Hund-Mulliken picture of molecular bonding. [source]

Characterization of N-palmitoylated human growth hormone by in situ liquid–liquid extraction and MALDI tandem mass spectrometry
JOURNAL OF MASS SPECTROMETRY (INCORP. BIOLOGICAL MASS SPECTROMETRY), Issue 6, 2007. Emmanuelle Sachon
Abstract. Acylation is a common post-translational modification found in secreted proteins and membrane-associated proteins, including signal-transducing and regulatory proteins. Acylation is also exploited in the pharmaceutical and biotechnology industry to increase the stability and lifetime of protein-based products. The presence of acyl moieties in proteins and peptides affects the physico-chemical properties of these species, thereby modulating protein stability, function, localization and molecular interactions. Characterization of protein acylation is a challenging analytical task, which includes the precise definition of the acylation sites in proteins and determination of the identity and molecular heterogeneity of the acyl moiety at each individual site. In this study, we generated a chemically modified human growth hormone (hGH) by incorporation of a palmitoyl moiety on the Nε group of a lysine residue. Monoacylation of the hGH protein was confirmed by determination of the intact molecular weight by mass spectrometry. Detailed analysis of protein acylation was achieved by analysis of peptides derived from hGH by protease treatment.
However, peptide mass mapping by MALDI MS using trypsin and AspN proteases and standard sample preparation methods did not reveal any palmitoylated peptides. In contrast, in situ liquid–liquid extraction (LLE) performed directly on the MALDI MS metal target enabled detection of acylated peptide candidates by MALDI MS and demonstrated that hGH was N-palmitoylated at multiple lysine residues. MALDI MS and MS/MS analysis of the modified peptides mapped the N-palmitoylation sites to Lys158, Lys172 and Lys140 or Lys145. This study demonstrates the utility of LLE/MALDI MS/MS for mapping and characterization of acylation sites in proteins and peptides and the importance of optimizing sample preparation methods for mass spectrometry-based determination of substoichiometric, multi-site protein modifications. Copyright © 2007 John Wiley & Sons, Ltd. [source]

Teaching and Learning Guide for: Locutionary, Illocutionary, Perlocutionary
LINGUISTICS & LANGUAGE COMPASS (ELECTRONIC), Issue 9, 2010. Mikhail Kissine
This guide accompanies the following article: Mikhail Kissine, 'Locutionary, Illocutionary, Perlocutionary', Language and Linguistics Compass 2/6 (2008), pp. 1189–1202. DOI: 10.1111/j.1749-818x.2008.00093.x. The terms locutionary act, illocutionary act and perlocutionary act originate from Austin's classical How to do things with words. The corresponding notions, however, prove difficult to define. Yet lack of careful delineation of each level can lead to important theoretical confusions. This Teaching and Learning Guide explains why proper understanding of Austin's trichotomy is crucial for semantics and pragmatics.

Author's Introduction
Most contemporary discussions in semantics and pragmatics employ, implicitly or explicitly, some or all of the concepts of locutionary, illocutionary or perlocutionary acts. These notions originate from Austin's posthumous and notoriously intricate book, How to do things with words.
The point of interest for the linguist, however, is not so much the exegesis of Austin's ideas as the precise delimitation of these levels of meaning. First, it is important to characterise the locutionary level, which falls short of any illocutionary force, to avoid contaminating analyses of utterance meanings with matters relative to the illocutionary level, viz. to the speech act performed. Second, the precise definition of illocutionary acts is an extremely difficult matter. However, the first, imperative step must be a clear demarcation between perlocutionary acts, relative to causal effects of the utterances, and the utterance's illocutionary force. Third, to assess theories of illocutionary forces, one must take into account the requirements for psychological and empirical plausibility. For instance, classical Gricean theories of illocutionary force attribution link it with the cognitive capacity to perform complex multi-layered mental state attributions, which is incompatible with the data available on the pragmatic and cognitive functioning of young children. In sum, gaining a better understanding of the tripartite distinction between the locutionary, illocutionary and perlocutionary levels is not a taxonomical exercise, but a prerequisite for anyone willing to tackle semantic and/or pragmatic issues with the right tools.

Suggested Reading
Austin, J.L. (1975) How to do things with words, Second edition, Oxford, Oxford University Press. Lecture VIII. Difficult reading, but essential to understand Austin's intuitions and the origin of the debate.
Strawson, P.F. (1964) "Intention and convention in speech acts", Philosophical Review, 73, 439–60. Classical criticism of Austin's claim about the conventionality of illocutionary acts and first formulation of a Gricean theory of speech acts.
Strawson, P.F. (1973) "Austin and 'Locutionary meaning'", in I. Berlin et al. (eds.) Essays on J.L. Austin, Oxford, Clarendon Press, 46–68.
This equally classical paper sheds light on the difficult notions of rhetic and locutionary acts; it paves the way for using these concepts interchangeably.
Recanati, F. (1987) Meaning and Force. The pragmatics of performative utterances, Cambridge, Cambridge University Press. Chapter 9. This is a lucid discussion and elaboration of Strawson's conception of the locutionary act as a potential for the illocutionary level.
Wilson, D. and Sperber, D. (1988) "Mood and the analysis of non-declarative sentences", in J. Dancy et al. (eds.) Human Agency, Language, Duty and Value. Philosophical essays in honour of J.O. Urmson, Stanford, Stanford University Press, 77–101. This paper gives important reasons for not confusing the analysis of mood, of the locutionary level, with the analysis of speech acts.
Kissine, M. (2009) "Illocutionary forces and what is said", Mind and Language, 24, 122–38. Provides a definition of locutionary acts as linguistic representations of mental states, and lays grounds for a theory of speech acts as reasons to believe or to act.
Bach, K. (1994) "Conversational impliciture", Mind and Language, 9, 124–62. An important defence of the distinction between illocutionary and locutionary acts. However, the reader should be warned that Bach conceives of locutionary acts as context-independent propositional radicals, which is not a self-evident position.
Alston (2000) Illocutionary Acts and Sentence Meaning, Ithaca, Cornell University Press. Chapter 2. Contains a clear and lucid criticism of theories that confuse the illocutionary and perlocutionary levels.
Dominicy, M. (2008) "Epideictic rhetoric and the representation of human decision and choice", in K. Korta and J. Garmendia (eds.) Meaning, Intentions, and Argumentation, Stanford, CSLI, 179–207. This paper contains a useful test for distinguishing verbs that describe illocutionary acts from those that describe perlocutionary acts.
It is also the first proposal to formulate the illocutionary/perlocutionary divide in Davidsonian terms.

Focus Questions
1. What kind of philosophy of action is called for by the distinction between locutions, perlocutions and illocutions?
2. Should the locutionary level always be fully propositional?
3. Can illocutionary acts be characterised in terms of prototypical perlocutionary effects?
4. Should illocutionary acts be divided into conventional (institutional) and non-conventional (non-institutional) ones?
5. Are there good reasons for singling out a locutionary level?
6. Does the attribution of illocutionary forces presuppose a complex mindreading process?

Connection to Related Material in Lectures or Discussions
1. The distinction between the locutionary and illocutionary levels is crucial for any discussion about the semantics/pragmatics interface. Many scholars hastily characterise semantics as related to sentence meaning and pragmatics as concerning the speech act performed. However, one should not take for granted that any level where the meaning is context-dependent is necessarily that of the illocutionary act performed.
2. This distinction can also be relevant for discussions about the meaning of moods. For instance, the imperative mood is often analysed in terms of the directive illocutionary force. However, there are cases where utterances of imperative sentences do not correspond to a directive speech act.
3. The distinction between perlocutionary and illocutionary acts remains central for any attempt to classify or to define illocutionary forces.
4. Different conceptions of illocutionary acts are important for discussions about the ontogeny and phylogeny of the pragmatic dimension(s) of linguistic competence.
[source]

Microscopic studies of the influence of main exposure time on parameters of flexographic printing plate produced by digital thermal method
MICROSCOPY RESEARCH AND TECHNIQUE, Issue 10, 2009. Liliya Harri
Abstract. The digital thermal technology of producing flexographic printing plates from photopolymer plates is one of the newest technologies. It allows flexographic plates to be developed without the use of any solvent. The process of producing flexographic printing plates by the digital thermal method consists of several main stages: back exposure, laser exposure, main exposure, thermal development, post-exposure and light finishing. Studies carried out using optical stereoscopic microscopy made it possible to determine the effect of the time of main exposure to ultraviolet radiation on the dot area, diameter and edge factor of halftone dots reproduced on a flexographic printing plate produced by the digital thermal method, as well as on the quality of reproduction of the surface and on the profiles of free-standing printing microelements. The results of these microscopic studies have made it possible to define criteria for establishing the optimum time of main exposure of photopolymer plates used in the digital thermal technology of producing flexographic printing plates. A precise definition of the criteria for determining the optimum time of main exposure will make it possible to reduce time-consuming control tests and to eliminate errors both in the process of manufacturing flexographic printing plates and in the printing process carried out with such plates. Microsc. Res. Tech., 2009. © 2009 Wiley-Liss, Inc. [source]

Arabidopsis pathology breathes new life into the necrotrophs-vs.-biotrophs classification of fungal pathogens
MOLECULAR PLANT PATHOLOGY, Issue 4, 2004. Richard P. Oliver
Summary. Fungal plant pathologists have for many decades attempted to classify pathogens into groups called necrotrophs, biotrophs and, more recently, hemibiotrophs.
Although these terms are well known and frequently used, disagreements about which pathogens fall into which classes, as well as about the precise definition of these terms, have conspired to limit their usefulness. Dogmas concerning the properties of the classes have been progressively eroded. However, the genetic analysis of disease resistance, particularly in the model plant Arabidopsis thaliana, has provided a biologically meaningful division based on whether defence against fungal pathogens is controlled via the salicylate or jasmonate/ethylene pathways. This mode-of-defence division distinguishes necrotrophs and biotrophs, but it limits the biotroph class to pathogens that possess haustoria. The small number and limited range of pathogens that infect Arabidopsis mean that several interesting questions are still unanswered. Do hemibiotrophs represent a distinct class or a subclass of the necrotrophs? Does the division apply to other plant families, and particularly to cereals? And does this classification help us understand the intricacies of either fungal pathogenicity or plant defence? [source]

Improved target volume characterization in stereotactic treatment planning of brain lesions by using high-resolution BOLD MR-venography
NMR IN BIOMEDICINE, Issue 7-8, 2001. Lothar R. Schad
Abstract. In this methodological paper I report the stereotactic correlation of different magnetic resonance imaging (MRI) techniques [MR angiography (MRA), MRI, blood bolus tagging (STAR), functional MRI, and high-resolution BOLD venography (HRBV)] in patients with cerebral arterio-venous malformations (AVM) and brain tumors. The patient's head was fixed in a stereotactic localization system which is usable in both MR systems and linear accelerator installations. Using phantom measurements, global geometric MR image distortions can be 'corrected' (reducing displacements to the size of a pixel) by calculations based on modeling the distortion as a fourth-order two-dimensional polynomial.
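A fourth-order two-dimensional polynomial correction of this kind can be estimated from control points by ordinary least squares. The sketch below uses synthetic control points and numpy, and is an illustration of the general technique, not the authors' implementation:

```python
import numpy as np

def poly2d_terms(x, y, order=4):
    """All monomials x**i * y**j with i + j <= order, as design-matrix columns."""
    return np.column_stack([x ** i * y ** j
                            for i in range(order + 1)
                            for j in range(order + 1 - i)])

def fit_warp(x_meas, y_meas, target, order=4):
    """Least-squares polynomial coefficients mapping measured to true coordinates."""
    coef, *_ = np.linalg.lstsq(poly2d_terms(x_meas, y_meas, order), target, rcond=None)
    return coef

# Synthetic 9 x 9 grid of control points with a known smooth distortion
xm, ym = (a.ravel() for a in np.meshgrid(np.linspace(-1, 1, 9), np.linspace(-1, 1, 9)))
x_true = xm + 0.05 * xm ** 2 - 0.02 * xm * ym   # undistorted x coordinate
coef = fit_warp(xm, ym, x_true)
x_corr = poly2d_terms(xm, ym) @ coef            # corrected x positions
print(np.max(np.abs(x_corr - x_true)))          # residual at machine precision
```

In practice the same fit is done once per coordinate (x and y) from a phantom grid, and the fitted polynomial is then applied to patient images acquired with the same sequence.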
Further object-induced local distortions can be corrected by additionally measured field maps. Using this method, multimodality matching could be performed automatically as long as all images are acquired in the same examination and the patient is sufficiently immobilized to allow precise definition of the target volume. Information about the hemodynamics of the AVM was provided by dynamic MRA with the STAR technique, leading to an improved definition of the size of the nidus and the origin of the feeding arteries, whereas HRBV imaging yielded detailed and improved information about the venous pattern and drainage. In addition, functional MRI was performed in patients with lesions close to the primary motor cortex area, leading to an improved definition of structures at risk for the high-dose application in radiosurgery. In patients with brain tumors, the potential of HRBV to probe tumor angiogenesis and its use in intensity-modulated treatment planning are still hampered by the open question of how to translate a BOLD signal pattern measured in the tumor to a dose distribution, which should be addressed in future studies. Copyright © 2001 John Wiley & Sons, Ltd. [source]

Critical evaluation of prognostic factors in childhood asthma
PEDIATRIC ALLERGY AND IMMUNOLOGY, Issue 2, 2002. H. P. Van Bever
Current knowledge of the natural history of asthma is improving through the establishment of a more precise definition of asthma linked with information from a number of large-scale longitudinal studies. Risk factors for the development of childhood asthma are now more clearly understood. They include gender, atopic status, genetic and familial factors, respiratory infections, and outdoor and indoor pollution (1). In the present review two types of asthma and their prognosis will be discussed:
1. Asthma in preschool children and its risk factors for evolution towards persistent childhood asthma.
2. Asthma in older children and its risk factors for evolution towards adult asthma.
[source]

Assuring Real Freedom of Movement in EU Direct Taxation
THE MODERN LAW REVIEW, Issue 6, 2000. Ian Roxan
The decisions of the European Court of Justice in applying the Treaty principles of freedom of movement to the direct taxation of individuals have been strongly criticised as taking an overly simplistic view of the interactions between national tax systems. The interactions often make non-discrimination an inappropriate criterion. This article proposes a framework, grounded in economic analysis, for understanding the implications of the interactions for freedom of movement. First, I establish a precise definition of obstacles to freedom of movement of individuals as costs of migration, as distinguished from incentives to migration (such as mere differences in national tax levels). Incentives can encourage economic distortions in migration, but they are not obstacles to migration (or free movement). Secondly, I develop the cross-migration test to distinguish costs of migration from incentives. I apply the test to show that two commonly used schemes of double tax relief, including exemption with progression, create unjustified obstacles to free movement. [source]

Aid and Reform in Failing States
ASIAN-PACIFIC ECONOMIC LITERATURE, Issue 1, 2008. Lisa Chauvet
This paper reviews the policy implications of research on reform in failing states (Chauvet and Collier 2006, 2007a, 2007b, 2008; Chauvet et al. 2006; Chauvet et al. 2007a, 2007b). After providing a precise definition of state failure and of reform in such states, we present the internal constraints impeding reform in failing states. Élite preferences and insufficient social knowledge seem to be the major constraints on reform. We find that financial aid tends to allow the ruling élite to postpone reform. Technical assistance, however, has some effectiveness in relaxing the capacity constraint on implementing reform, notably right at the beginning of reform.
[source]

Phylogenetics and Ecology: As Many Characters as Possible Should Be Included in the Cladistic Analysis
CLADISTICS, Issue 1, 2001. Philippe Grandcolas
As many data as possible must be included in any scientific analysis, provided that they follow the logical principles on which this analysis is based. Phylogenetic analysis is based on the basic principle of evolution, i.e., descent with modification. Consequently, ecological characters or any other nontraditional characters must be included in phylogenetic analyses, provided that they can plausibly be postulated to be heritable. The claim of Zrzavý (1997, Oikos 80, 186–192) or Luckow and Bruneau (1997, Cladistics 13, 145–151) that any character of interest should be included in the analysis is thus inaccurate. Many characters, broadly defined or extrinsic (such as distribution areas), cannot be considered as actually heritable. It is argued that we should attend more carefully to the precise definition and properties of characters of interest than decide a priori to include them in any case in the analysis. The symmetrical claim of de Queiroz (1996, Am. Nat. 148, 700–708) that some characters of interest should better be excluded from analyses to reconstruct their history is similarly inaccurate. If they match the logical principles of phylogenetic analysis, there is no acceptable reason to exclude them. The different statistical testing strategies of Zrzavý (1997) and de Queiroz (1996), aimed at justifying inclusion versus exclusion of characters, are ill-conceived, leading respectively to Type II and Type I errors. It is argued that phylogenetic analyses should not be constrained by testing strategies that are downstream of the logical principles of phylogenetics. Excluding characters and mapping them on an independent phylogeny produces a particular and suboptimal kind of secondary homology, the use of which can be justified only for preliminary studies dealing with broadly defined characters.
[source]

Multi-dimensional phenotyping: towards a new taxonomy for airway disease
CLINICAL & EXPERIMENTAL ALLERGY, Issue 10, 2005. A. J. Wardlaw
Summary. "All the real knowledge which we possess depends on methods by which we distinguish the similar from the dissimilar. The greater the number of natural distinctions this method comprehends, the clearer becomes our idea of things. The more numerous the objects which employ our attention, the more difficult it becomes to form such a method, and the more necessary." [1]
Classification is a fundamental part of medicine. Diseases are often categorized according to pre-20th-century descriptions and concepts of disease based on symptoms, signs and functional abnormalities rather than on underlying pathogenesis. Where the aetiology of disease has been revealed (for example, in the infectious diseases) a more precise classification has become possible, but in the chronic inflammatory diseases, and in the inflammatory airway diseases in particular, where pathogenesis has been stubbornly difficult to elucidate, we still use broad descriptive terms such as asthma and chronic obstructive pulmonary disease, which defy precise definition because they encompass a wide spectrum of presentations and physiological and cellular abnormalities. It is our contention that these broad-brush terms have outlived their usefulness and that we should be looking to create a new taxonomy of airway disease: a taxonomy that more closely reflects the spectrum of phenotypes encompassed within the term airway inflammatory diseases, and that gives full recognition to late-20th- and 21st-century insights into the disordered physiology and cell biology that characterize these conditions, in the expectation that these will map more closely to both aetiology and response to treatment.
Development of this taxonomy will require a far more complete and sophisticated correlation of the many variables that make up a condition than has hitherto been usual: an approach that encompasses multi-dimensional phenotyping and uses complex statistical tools such as cluster analysis. [source] Retrospective diagnosis of Kindler syndrome in a 37-year-old man, CLINICAL & EXPERIMENTAL DERMATOLOGY, Issue 1 2006, M. A. Thomson Summary Kindler syndrome is a rare autosomal recessive disorder characterized by acral blisters in infancy and early childhood, followed by photosensitivity, progressive poikiloderma and cutaneous atrophy. Other features include webbing of the toes and fingers, palmoplantar hyperkeratosis, gingival fragility, poor dentition, and mucosal involvement in the form of urethral, anal and oesophageal stenosis. The recent finding of KIND1 mutations in Kindler syndrome facilitates early diagnosis, prophylactic measures and more precise definition of the phenotype. In the family described here, molecular diagnosis of Kindler syndrome in an infant with acral blisters led to the belated diagnosis in a severely affected relative whose condition had remained unidentified for 37 years. [source] Guidelines for the treatment and management of new-onset diabetes after transplantation, CLINICAL TRANSPLANTATION, Issue 3 2005, Alan Wilkinson Abstract: Although graft and patient survival after solid organ transplantation have improved markedly in recent years, transplant recipients continue to experience an increased prevalence of cardiovascular disease (CVD) compared with the general population. A number of factors are known to contribute to the increased risk of CVD in this population, including hypertension, dyslipidemia and diabetes mellitus. Of these factors, new-onset diabetes after transplantation has been identified as one of the most important, being associated with reduced graft function and patient survival, and increased risk of graft loss. 
In 2003, International Consensus Guidelines on New-onset Diabetes after Transplantation were published, which aimed to establish a precise definition and diagnosis of the condition and recommend management strategies to reduce its occurrence and impact. These updated 2004 guidelines, developed in consultation with the International Diabetes Federation (IDF), extend the recommendations of the previous guidelines and encompass new-onset diabetes after kidney, liver and heart transplantation. It is hoped that adoption of these management approaches pre- and post-transplant will reduce individuals' risk of developing new-onset diabetes after transplantation as well as ameliorating the long-term impact of this serious complication. [source] What, if anything, is sympatric speciation? JOURNAL OF EVOLUTIONARY BIOLOGY, Issue 6 2008, B. M. FITZPATRICK Abstract Sympatric speciation has always fascinated evolutionary biologists, and for good reason; it pits diversifying selection directly against the tendency of sexual reproduction to homogenize populations. However, different investigators have used different definitions of sympatric speciation and different criteria for diagnosing cases of sympatric speciation. Here, we explore some of the definitions that have been used in empirical and theoretical studies. Definitions based on biogeography do not always produce the same conclusions as definitions based on population genetics. The most precise definitions make sympatric speciation an infinitesimal end point of a continuum. Because it is virtually impossible to demonstrate the occurrence of such a theoretical extreme, we argue that testing whether a case fits a particular definition is less informative than evaluating the biological processes affecting divergence. We do not deny the importance of geographical context for understanding divergence. 
Rather, we believe this context can be better understood by modelling and measuring quantities such as gene flow and selection than by assigning cases to discrete categories like sympatric and allopatric speciation. [source] A LEXICON FOR FLAVOR DESCRIPTIVE ANALYSIS OF GREEN TEA, JOURNAL OF SENSORY STUDIES, Issue 3 2007, JEEHYUN LEE ABSTRACT A lexicon for describing green tea was developed using descriptive analysis methods. A highly trained descriptive sensory panel identified, defined and referenced 31 flavor attributes for green tea. One hundred and thirty-eight green tea samples from nine countries (China, India, Japan, Kenya, Korea, Sri Lanka, Taiwan, Tanzania and Vietnam) were selected to represent a wide range of green teas. Attributes could be categorized as "Green" (asparagus, beany, Brussels sprout, celery, parsley, spinach, green beans, green herb-like); "Brown" (ashy/sooty, brown spice, burnt/scorched, nutty, tobacco); "Fruity/Floral" (fruity, floral/perfumy, citrus, fermented); "Mouthfeel" (astringent, tooth-etching); "Basic Tastes" (overall sweet, bitter); and other attributes (almond, animalic, grain, musty/new leather, mint, seaweed, straw-like). Some attributes, such as green, brown, bitter, astringent and tooth-etching, were found in most samples, but many attributes were found in only a few samples. Green tea processors, the food industry, researchers and consumers will benefit from this lexicon, with precise definitions and references that reliably differentiate and characterize the sensory attributes of green teas. PRACTICAL APPLICATIONS Green tea (and white tea) processors, food industrialists, researchers and consumers will benefit from this lexicon, with precise definitions and references that reliably differentiate and characterize the sensory attributes of green tea. 
[source] Understanding software maintenance and evolution by analyzing individual changes: a literature review, JOURNAL OF SOFTWARE MAINTENANCE AND EVOLUTION: RESEARCH AND PRACTICE, Issue 6 2009, Hans Christian Benestad Abstract Understanding, managing and reducing the costs and risks inherent in change are key challenges of software maintenance and evolution, addressed in empirical studies with many different research approaches. Change-based studies analyze data that describes the individual changes made to software systems. This approach can be effective in discovering cost and risk factors that are hidden at more aggregated levels. However, it is not trivial to derive appropriate measures of individual changes for specific measurement goals. The purpose of this review is to improve change-based studies by (1) summarizing how attributes of changes have been measured to reach specific study goals and (2) describing current achievements and challenges, leading to a guide for future change-based studies. Thirty-four papers conformed to the inclusion criteria. Forty-three attributes of changes were identified and classified according to a conceptual model developed for this purpose. The goal of each study was to characterize the evolution process, to assess causal factors of cost and risk, or to predict costs and risks. Effective accumulation of knowledge across change-based studies requires precise definitions of the attributes and measures of change. We recommend that new change-based studies base such definitions on the proposed conceptual model. Copyright © 2009 John Wiley & Sons, Ltd. [source]
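The point made in the abstract above, that change-based studies hinge on precise, operational definitions of change attributes, can be illustrated with a minimal sketch. The record fields and derived attributes below (churn-based change size, files touched, corrective ratio) are hypothetical illustrations of the kind of measures such studies define, not measures taken from any of the reviewed papers:

```python
from dataclasses import dataclass
from statistics import mean

# Hypothetical minimal record of one individual change (e.g. one commit);
# the field names are illustrative, not drawn from any specific study.
@dataclass
class Change:
    change_id: str
    files_touched: int
    lines_added: int
    lines_deleted: int
    is_corrective: bool  # bug fix vs. other maintenance work

def size(change: Change) -> int:
    """A precisely defined size attribute: total churn of the change."""
    return change.lines_added + change.lines_deleted

def summarize(changes: list[Change]) -> dict:
    """Aggregate per-change attributes to characterize an evolution process."""
    corrective = [c for c in changes if c.is_corrective]
    return {
        "n_changes": len(changes),
        "mean_size": mean(size(c) for c in changes),
        "mean_files_touched": mean(c.files_touched for c in changes),
        "corrective_ratio": len(corrective) / len(changes),
    }

# Illustrative data: four individual changes to some system.
changes = [
    Change("c1", files_touched=1, lines_added=10, lines_deleted=2, is_corrective=True),
    Change("c2", files_touched=3, lines_added=120, lines_deleted=40, is_corrective=False),
    Change("c3", files_touched=2, lines_added=5, lines_deleted=5, is_corrective=True),
    Change("c4", files_touched=1, lines_added=30, lines_deleted=10, is_corrective=False),
]

profile = summarize(changes)
print(profile)
```

Because each attribute has an explicit definition, profiles computed from different systems or studies remain comparable, which is the accumulation-of-knowledge concern the review raises.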