Selected Abstracts

Incentive Problems With Unidimensional Hidden Characteristics: A Unified Approach
ECONOMETRICA, Issue 4 2010. Martin F. Hellwig

This paper develops a technique for studying incentive problems with unidimensional hidden characteristics in a way that is independent of whether the type set is finite, the type distribution has a continuous density, or the type distribution has both mass points and an atomless part. With this technique, the proposition that optimal incentive schemes induce no distortion "at the top" and downward distortions "below the top" is extended to arbitrary type distributions. However, mass points in the interior of the type set require pooling with adjacent higher types and, unless there are other complications, a discontinuous jump in the transition from adjacent lower types.

Electromechanics of Cardiac Tissue: A Unified Approach to the Fully Coupled Excitation-Contraction Problem
PROCEEDINGS IN APPLIED MATHEMATICS & MECHANICS, Issue 1 2009. Serdar Göktepe

This contribution is concerned with a new, unified finite element approach to the fully coupled problem of cardiac electromechanics. In contrast to the existing numerical approaches suggested in the literature, we propose, to the best of our knowledge for the first time, a fully implicit, purely finite-element-based approach to the coupled problem. The system of coupled algebraic equations obtained by simultaneous linearization of the nonlinear weighted residual terms is solved monolithically. The proposed modular algorithmic framework leads to an unconditionally stable and geometrically flexible structure that can readily be extended towards complex ionic models of cardiac electrophysiology. The performance of the proposed approach is illustrated by the coupled electromechanical analysis of a biventricular generic heart model. (© 2009 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim)

Dose Finding for Continuous and Ordinal Outcomes with a Monotone Objective Function: A Unified Approach
BIOMETRICS, Issue 1 2009. Anastasia Ivanova

In many phase I trials, the design goal is to find the dose associated with a certain target toxicity rate. In some trials, the goal can be to find the dose with a certain weighted sum of rates of various toxicity grades. For others, the goal is to find the dose with a certain mean value of a continuous response. In this article, we describe a dose-finding design that can be used in any of the dose-finding trials described above: trials where the target dose is defined as the dose at which a certain monotone function of the dose equals a prespecified value. At each step of the proposed design, the normalized difference between the current dose and the target is computed. If that difference is close to zero, the dose is repeated. Otherwise, the dose is increased or decreased, depending on the sign of the difference.

Motivations for the Restoration of Ecosystems
CONSERVATION BIOLOGY, Issue 2 2006. Andre F. Clewell

The reasons ecosystems should be restored are numerous, disparate, generally understated, and commonly underappreciated. We offer a typology in which these reasons, or motivations, are ordered among five rationales: technocratic, biotic, heuristic, idealistic, and pragmatic. The technocratic rationale encompasses restoration that is conducted by government agencies or other large organizations to satisfy specific institutional missions and mandates. The biotic rationale for restoration is to recover lost aspects of local biodiversity. The heuristic rationale attempts to elicit or demonstrate ecological principles and biotic expressions. The idealistic rationale consists of personal and cultural expressions of concern or atonement for environmental degradation, reengagement with nature, and/or spiritual fulfillment.
The pragmatic rationale seeks to recover or repair ecosystems for their capacity to provide a broad array of natural services and products upon which human economies depend, and to counteract extremes in climate caused by ecosystem loss. We propose that technocratic restoration, as currently conceived and practiced, is too narrow in scope and should be broadened to include the pragmatic rationale, whose overarching importance is just beginning to be recognized. We suggest that technocratic restoration is too authoritarian, that idealistic restoration is overly restricted by its lack of administrative strengths, and that a melding of the two approaches would benefit both. Three recent examples are given of restoration that blends the technocratic, idealistic, and pragmatic rationales and demonstrates the potential for a more unified approach. The biotic and heuristic rationales can be satisfied within the contexts of the other rationales.

Equivalent force control method for generalized real-time substructure testing with implicit integration
EARTHQUAKE ENGINEERING AND STRUCTURAL DYNAMICS, Issue 9 2007. Bin Wu

This paper presents a new method, called the equivalent force control method, for solving the nonlinear equations of motion in a real-time substructure test using an implicit time integration algorithm. The method replaces the numerical iteration in implicit integration with a force-feedback control loop, while displacement control is retained to control the motion of an actuator. The method is formulated in such a way that it represents a unified approach that also encompasses the effective force test method. The accuracy and effectiveness of the method have been demonstrated with numerical simulations of real-time substructure tests with physical substructures represented by spring and damper elements, respectively.
The method has also been validated with actual tests in which a magnetorheological damper was used as the physical substructure. Copyright © 2007 John Wiley & Sons, Ltd.

Are there general mechanisms of animal home range behaviour? A review and prospects for future research
ECOLOGY LETTERS, Issue 6 2008

Home range behaviour is a common pattern of space use, with fundamental consequences for ecological processes. However, a general mechanistic explanation is still lacking. Research is split into three separate areas of inquiry (movement models based on random walks, individual-based models based on optimal foraging theory, and a statistical modelling approach) which have developed without much productive contact. Here we review recent advances in modelling home range behaviour, focusing particularly on the problem of identifying mechanisms that lead to the emergence of stable home ranges from unbounded movement paths. We discuss the issue of spatiotemporal scale, which is rarely considered in modelling studies, and highlight the need to consider more closely the dynamical nature of home ranges. Recent methodological and theoretical advances may, however, soon lead to a unified approach, conceptually unifying our understanding of the linkages between home range behaviour and ecological or evolutionary processes.

A unified approach to the implicit integration of standard, non-standard and viscous plasticity models
INTERNATIONAL JOURNAL FOR NUMERICAL AND ANALYTICAL METHODS IN GEOMECHANICS, Issue 11 2002. René de Borst

It is shown how modern concepts for integrating the elasto-plastic rate equations of standard plasticity via an implicit algorithm can be generalized to plasticity without an explicitly defined yield surface and to overstress-type models of viscoplasticity, where the stress point can be located outside the loading surface. For completeness, a tangent operator is derived that is consistent with the update algorithm. Copyright © 2002 John Wiley & Sons, Ltd.

A unified approach for the formulation of interaction problems by the boundary element method
INTERNATIONAL JOURNAL FOR NUMERICAL METHODS IN ENGINEERING, Issue 5 2006. Yalcín Mengi

A unified formulation is presented, based on the boundary element method, in a form suitable for performing interaction analyses by the substructure method for solid-solid and soil-structure problems. The proposed formulation permits the evaluation of all the elements of the impedance and input motion matrices simultaneously, in a single step, in terms of the system matrices of the boundary element method, without solving any special problem (such as a unit displacement or load problem) as required in conventional methods. It further eliminates the complicated procedure and the need for scattering analysis in the evaluation of input motion functions. To explain the formulation, it is first given for an inclusion interacting with an infinite surrounding medium under the influence of a seismic input, where both the inclusion and the surrounding medium are treated as viscoelastic. It is shown that the formulation for a rigid inclusion may be obtained from that for a flexible inclusion as a special case through a transformation. The formulation is then extended to other types of interaction problems: a multi-inclusion problem and an interaction problem involving a foundation embedded in a viscoelastic half-space. It is found that the proposed formulation remains essentially the same for all kinds of interaction problems, and it can be used not only in regular interaction analysis, but also in analyses involving diffraction of waves in a medium containing holes. Copyright © 2005 John Wiley & Sons, Ltd.
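In the simplest one-dimensional setting with linear isotropic hardening, the implicit plasticity integration described in the de Borst abstract above reduces to a closed-form backward-Euler return mapping. The sketch below is a generic textbook illustration, not the paper's algorithm; the function name and all material parameters are invented for the example.

```python
def return_map_1d(eps_e_trial, ep_old, E=200e3, H=10e3, sigma_y0=250.0):
    """One implicit (backward-Euler) stress update for 1D plasticity
    with linear isotropic hardening. Units are arbitrary (e.g. MPa)."""
    sigma_trial = E * eps_e_trial                          # elastic predictor
    f_trial = abs(sigma_trial) - (sigma_y0 + H * ep_old)   # trial yield function
    if f_trial <= 0.0:
        return sigma_trial, ep_old                         # step stays elastic
    dgamma = f_trial / (E + H)                             # plastic multiplier (closed form)
    sign = 1.0 if sigma_trial >= 0.0 else -1.0
    sigma = sigma_trial - E * dgamma * sign                # plastic corrector
    return sigma, ep_old + dgamma

# Elastic step: trial stress 200 stays below the initial yield stress 250.
s1, ep1 = return_map_1d(0.001, 0.0)
# Plastic step: trial stress 400 is returned to the hardened yield surface.
s2, ep2 = return_map_1d(0.002, 0.0)
```

After a plastic step the updated stress sits exactly on the hardened yield surface, |σ| = σ_y0 + H·ε_p, which is the defining property of the implicit update.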
On-line estimation and path planning for multiple vehicles in an uncertain environment
INTERNATIONAL JOURNAL OF ROBUST AND NONLINEAR CONTROL, Issue 8 2004. Jarurat Ousingsawat

A unified approach to cooperative target tracking and path planning for multiple vehicles is presented. All vehicles, friendly and adversarial, are assumed to be aircraft. Unlike the typical target tracking problem, which uses linear state and nonlinear output dynamics, a set of nonlinear aircraft dynamics is used in this work. Target state information is estimated in order to integrate it into a path planning framework. The objective is to fly from a start point to a goal in a highly dynamic, uncertain environment with multiple friendly and adversarial vehicles, without collision. The estimation architecture proposed is consistent with most path planning methods. Here, the path planning approach is based on an evolutionary computation technique, which is then combined with a nonlinear extended set membership filter in order to demonstrate a unified approach. A cooperative estimation approach among friendly vehicles is shown to improve the speed and routing of the path. Copyright © 2004 John Wiley & Sons, Ltd.

Adaptive approach for nonlinear sensitivity analysis of reaction kinetics
JOURNAL OF COMPUTATIONAL CHEMISTRY, Issue 9 2005. Illia Horenko

We present a unified approach to linear and nonlinear sensitivity analysis for models of reaction kinetics that are stated in terms of systems of ordinary differential equations (ODEs). The approach is based on the reformulation of the ODE problem as a density transport problem described by a Fokker-Planck equation. The resulting multidimensional partial differential equation is solved by extending the TRAIL algorithm originally introduced by Horenko and Weiser in the context of molecular dynamics (J. Comput. Chem. 2003, 24, 1921), and the method is discussed in comparison with Monte Carlo techniques. The extended TRAIL approach is fully adaptive and makes it easy to study the influence of nonlinear dynamical effects. We illustrate the scheme in an application to an enzyme-substrate model problem, with sensitivity analysis carried out with respect to initial concentrations and parameter values. © 2005 Wiley Periodicals, Inc. J Comput Chem 26: 941-948, 2005

Fluctuating asymmetry and developmental instability in evolutionary biology: past, present and future
JOURNAL OF EVOLUTIONARY BIOLOGY, Issue 6 2006. S. V. Dongen

The role of developmental instability (DI), as measured by fluctuating asymmetry (FA), in evolutionary biology has been the focus of a wealth of research for more than half a century. In spite of this long period and many published papers, our current state of knowledge, reviewed here, only allows us to conclude that patterns are heterogeneous and that very little is known about the underlying causes of this heterogeneity. In addition, the statistical properties of FA as a measure of DI are only poorly grasped because of a general lack of understanding of the underlying mechanisms that drive DI. If this area of research is not to be abandoned, more effort should be made to understand the observed heterogeneity, and attempts should be made to develop a unifying statistical protocol. More specifically, and perhaps most importantly, it is argued here that more attention should be paid to the usefulness of FA as a measure of DI, since many factors might blur this relationship. Furthermore, the genetic architecture, associations with fitness, and the importance of compensatory growth should be investigated under a variety of stress situations. In addition, more focus should be directed to the underlying mechanisms of DI, as well as to how these processes map to the observable phenotype. These insights could yield more efficient statistical models and a unified approach to the analysis of patterns in FA and DI.
The study of both DI and canalization is indispensable to obtain better insights into their possible common origin, especially because both have been suggested to play a role in micro- and macro-evolutionary processes.

National Institute of Neurological Disorders and Stroke (NINDS): Advances in understanding and treating neuropathy, 24-25 October 2006; Bethesda, Maryland
JOURNAL OF THE PERIPHERAL NERVOUS SYSTEM, Issue 1 2008. Eva L. Feldman

The National Institute of Neurological Disorders and Stroke sponsored a meeting to explore the current status of basic and clinical research in peripheral neurobiology and clinical neuropathy. The goal of the workshop was to identify areas where additional research could lead to the development of new therapeutics in the next 5 years. Participants discussed the current understanding of the disease mechanisms of axonal and demyelinating neuropathies, existing research techniques, disease biomarkers, and the assessment of neuropathy. Painful neuropathies were discussed at the basic scientific and clinical levels in relation to new insights into etiology and treatment. The meeting concluded with a discussion of therapeutic development in neuropathy and the need for a unified approach to multicenter trials. Short-term goals of the workshop were to form a working group for neuropathy, the Peripheral Neuropathy Study Group, to translate new scientific findings into therapies, and to complete clinical trials.

A unified approach to regression analysis under double-sampling designs
JOURNAL OF THE ROYAL STATISTICAL SOCIETY: SERIES B (STATISTICAL METHODOLOGY), Issue 3 2000. Yi-Hau Chen

We propose a unified approach to the estimation of regression parameters under double-sampling designs, in which a primary sample consisting of data on rough or proxy measures of the response and/or explanatory variables, as well as a validation subsample consisting of data on the exact measurements, are available.
We assume that the validation sample is a simple random subsample from the primary sample. Our proposal utilizes a specific parametric model to extract the partial information contained in the primary sample. The resulting estimator is consistent even if that model is misspecified, and it achieves higher asymptotic efficiency than the estimator based only on the validation data. Specific cases are discussed to illustrate the application of the proposed estimator.

Linear instability of ideal flows on a sphere
MATHEMATICAL METHODS IN THE APPLIED SCIENCES, Issue 3 2009. Yuri N. Skiba

A unified approach to the normal mode instability study of steady solutions to the vorticity equation governing the motion of an ideal incompressible fluid on a rotating sphere is considered. Four types of well-known solutions are considered, namely Legendre-polynomial (LP) flows, Rossby-Haurwitz (RH) waves, Wu-Verkley (WV) waves and modons. A conservation law for disturbances to each solution is derived and used to obtain a necessary condition for its exponential instability. By these conditions, Fjörtoft's (Tellus 1953; 5:225-230) average spectral number of the amplitude of an unstable mode must be equal to a special value. In the case of LP flows or RH waves, this value is related only to the degree of the basic flow. For the WV waves and modons, it depends both on the degree of the basic flow and on the spectral distribution of the mode energy in the inner and outer regions of the flow. Peculiarities of the instability conditions for different types of modons are discussed. The new instability conditions specify the spectral structure of growing disturbances, localizing them in phase space. For the LP flows, this condition complements the well-known Rayleigh-Kuo and Fjörtoft conditions related to the zonal flow profile. Some analytical and numerical examples are considered.
The maximum growth rate of unstable modes is also estimated, and the orthogonality of any unstable, decaying or non-stationary mode to the basic flow is shown in the energy inner product. The analytical instability results obtained here can also be applied to test the accuracy of computational programs and algorithms used for numerical stability studies. It should be stressed that Fjörtoft's spectral number, appearing both in the instability conditions and in the maximum growth rate estimates, is the parameter of paramount importance in the linear instability problem of ideal flows on a sphere. Copyright © 2008 John Wiley & Sons, Ltd.

Decomposing kernels of iterated operators - a unified approach
MATHEMATICAL METHODS IN THE APPLIED SCIENCES, Issue 9 2007. Guangbin Ren

For any operator D acting in an Abelian group, we study the kernel of its iterates D^k and describe a general approach for decomposing it through the kernel of the operator D itself and some other given operators T1, ..., T(k-1). In view of Almansi's famous theorem for polyharmonic functions, the different types of decomposition are characterized in terms of strong, weak and restricted Almansi decomposition properties. Sufficient conditions are given for the existence of such decompositions. The case of the iterated Dirac operator (cf. Math. Meth. Appl. Sci. 2002; 25:1541-1552) follows as a special case. Several other special cases are discussed. Finally, we prove corresponding decomposition theorems for the iterated weighted Laplacian (|x|^α Δ)^k, α ∈ (−∞, 2), and the iterated Helmholtz-type operator (Δ + λ)^k, λ ∈ ℂ. Copyright © 2006 John Wiley & Sons, Ltd.

Ontologies of nursing in an age of spiritual pluralism: closed or open worldview?
NURSING PHILOSOPHY, Issue 1 2010. Barbara Pesut, PhD, RN

North American society has undergone a period of sacralization in which ideas of spirituality have increasingly been infused into the public domain.
This sacralization is particularly evident in the nursing discourse, where it is common to find claims about the nature of persons as inherently spiritual, about what a spiritually healthy person looks like, and about the environment as spiritually energetic and interconnected. Nursing theoretical thinking has also used claims about the nature of persons, health, and the environment in an attempt to establish a unified ontology for the discipline. However, despite this common ground, there has been little discussion about the intersections between nursing philosophic thinking and the spirituality-in-nursing discourse, or about the challenges of adopting a common view of these claims within a spiritually pluralist society. The purpose of this paper is to discuss the call for ontological unity within nursing philosophic thinking in the context of the sacralization of a diverse society. I begin with a discussion of secularization and sacralization, illustrating the diversity of beliefs and experiences that characterize the current trend towards sacralization. I then discuss the challenges that a unified ontological perspective, or closed worldview, poses for this diversity, using examples from both a naturalistic and a unitary perspective. I conclude by arguing for a unified approach within nursing ethics rather than nursing ontology.

Unified approach for Euler-Lagrange equation arising in calculus of variations
OPTIMAL CONTROL APPLICATIONS AND METHODS, Issue 6 2004. D. S. Naidu

We address the development of a unified approach to the necessary conditions for the optimization of a functional arising in the calculus of variations. In particular, we develop a unified approach to the Euler-Lagrange equation that is simultaneously applicable to both shift (q)-operator-based discrete-time systems and derivative (d/dt)-operator-based continuous-time systems. It is shown that the Euler-Lagrange results that are now obtained separately for continuous- and discrete-time systems can be easily obtained from the unified approach. An illustrative example is given. Copyright © 2005 John Wiley & Sons, Ltd.

Sickle Cell Disease Summit: From clinical and research disparity to action
AMERICAN JOURNAL OF HEMATOLOGY, Issue 1 2009. Kathryn Hassell

The American Society of Pediatric Hematology/Oncology Sickle Cell Summit brought together a broad range of constituencies to identify a unified approach to healthcare and research disparities for sickle cell disease. Recommendations included the following: (1) speak with a unified voice representing all constituencies; (2) optimize access to care from knowledgeable health care providers and create a medical home for all individuals with the disease; (3) utilize population-based surveillance to measure outcomes; (4) develop overall approaches to basic, translational, clinical, and health services research; (5) enhance the community role in advocacy, education, service, and fundraising. Taskforces were identified to effect implementation. Am. J. Hematol., 2009. © 2008 Wiley-Liss, Inc.

A unified approach to the analysis of Horton-Strahler parameters of binary tree structures
RANDOM STRUCTURES AND ALGORITHMS, Issue 3-4 2002. Markus E. Nebel

The Horton-Strahler number arose naturally from problems in various fields, e.g. geology, molecular biology and computer science. Consequently, detailed investigations of related parameters for different classes of binary tree structures are of interest. This paper shows one way to perform a mathematical analysis of parameters related to the Horton-Strahler number in a unified manner, such that only a single analysis is needed to obtain results for many different classes of trees.
The method is explained through the examples of the expected Horton-Strahler number and its related r-th moments, the average number of critical nodes, and the expected distance between critical nodes. © 2002 Wiley Periodicals, Inc. Random Struct. Alg., 21: 252-277, 2002

A unified approach to estimation of nonlinear mixed effects and Berkson measurement error models
THE CANADIAN JOURNAL OF STATISTICS, Issue 2 2007. Liqun Wang

Mixed effects models and Berkson measurement error models are widely used. They share features which the author uses to develop a unified estimation framework. He deals with models in which the random effects (or measurement errors) have a general parametric distribution, whereas the random regression coefficients (or unobserved predictor variables) and error terms have nonparametric distributions. He proposes a second-order least squares estimator and a simulation-based estimator based on the first two moments of the conditional response variable given the observed covariates. He shows that both estimators are consistent and asymptotically normally distributed under fairly general conditions. The author also reports Monte Carlo simulation studies showing that the proposed estimators perform satisfactorily for relatively small sample sizes. Compared to the likelihood approach, the proposed methods are computationally feasible and do not rely on the normality assumption for random effects or other variables in the model.

Phenological stages of willow (Salix)
ANNALS OF APPLIED BIOLOGY, Issue 3 2010. Margaret M. Saska

The practice of uniformly recording biological plant growth stages or events has long been followed in agricultural production. In this study, the BBCH (Biologische Bundesanstalt, Bundessortenamt and CHemical industry) code has been applied to four precocious species of willow to define the growth stages important to this group. The studied taxa represent varieties of potential importance in the floral industry. A new BBCH code is proposed in which the annual cycle of willows is divided into clearly recognisable and easily distinguishable developmental phases, comprising eight principal stages, 30 secondary stages and six mesostages. Photographs illustrate the physical appearance of selected stages. The proposed BBCH code represents a unified approach that may be applied to a large number of Salix species.
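The moment-based idea in the Wang mixed effects/Berkson abstract above (estimation from the first two conditional moments of the response) can be illustrated in the simplest linear Berkson model y = β(x + u) + e, where only x is observed. The sketch below is a hypothetical two-step moment estimator, far simpler than the paper's second-order least squares estimator; the model, parameter values and function name are invented for the example.

```python
import random

def moment_fit_berkson(x, y):
    """Fit y = beta*(x + u) + e from observed (x, y) using the first two
    conditional moments: E[y|x] = beta*x and Var[y|x] = beta^2*Var(u) + Var(e)."""
    # First moment: OLS through the origin estimates beta.
    beta = sum(xi * yi for xi, yi in zip(x, y)) / sum(xi * xi for xi in x)
    # Second moment: mean squared residual estimates the combined noise
    # variance beta^2*Var(u) + Var(e), without assuming normality.
    v = sum((yi - beta * xi) ** 2 for xi, yi in zip(x, y)) / len(x)
    return beta, v

# Simulated check with beta = 2, Var(u) = 0.25, Var(e) = 0.09, so v should be near 1.09.
rng = random.Random(0)
x = [rng.uniform(-1, 1) for _ in range(20000)]
y = [2.0 * (xi + rng.gauss(0, 0.5)) + rng.gauss(0, 0.3) for xi in x]
beta_hat, v_hat = moment_fit_berkson(x, y)
```

Note that in the linear case only the combined variance β²Var(u) + Var(e) is identified from (x, y); separating the two components is part of what the paper's more general framework addresses.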
An Age-Stratified Poisson Model for Comparing Trends in Cancer Rates Across Overlapping Regions
BIOMETRICAL JOURNAL, Issue 4 2008. Yi Li

The annual percent change (APC) has been used as a measure to describe the trend in the age-adjusted cancer incidence or mortality rate over relatively short time intervals. Yearly data on these age-adjusted rates are available from the Surveillance, Epidemiology, and End Results (SEER) Program of the National Cancer Institute. The traditional method of estimating the APC is to fit a linear regression of the logarithm of the age-adjusted rates on time, using the least squares or weighted least squares method, and to use the estimate of the slope parameter to define the APC as the percent change in the rates between two consecutive years. For comparing the APCs of two regions, one uses a t-test, which assumes that the two datasets on the logarithm of the age-adjusted rates are independent and normally distributed with a common variance. Two modifications of this test, for when there is an overlap between the two regions or between the time intervals of the two datasets, have recently been developed. The first modification relaxes the assumption of independence of the two datasets but still assumes the common variance. The second modification relaxes the assumption of a common variance as well, but assumes that the variances of the age-adjusted rates are obtained using Poisson distributions for the mortality or incidence counts. In this paper, a unified approach to the problem of estimating the APC is undertaken by modelling the counts to follow an age-stratified Poisson regression model, and by deriving a corrected Z-test for testing the equality of two APCs. A simulation study is carried out to assess the performance of the test, and an application of the test to compare trends for a selected number of cancer sites, for two overlapping regions and with varying degrees of overlap in the time intervals, is presented. (© 2008 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim)

Impact of Population Substructure on Trend Tests for Genetic Case-Control Association Studies
BIOMETRICS, Issue 1 2010. Gang Zheng

Hidden population substructure in case-control data has the potential to distort the performance of Cochran-Armitage trend tests (CATTs) for genetic associations. Three possible scenarios that may arise are investigated here: (i) heterogeneity of genotype frequencies across unidentified subpopulations (PSI), (ii) heterogeneity of genotype frequencies and disease risk across unidentified subpopulations (PSII), and (iii) cryptic correlations within unidentified subpopulations. A unified approach is presented for deriving the bias and variance distortion under the three scenarios for any CATT in a general family. Using these analytical formulas, we evaluate numerically the excess type I error of the CATTs in the presence of population substructure. Our results provide insight into the properties of some proposed corrections for bias and variance distortion, and show why they may not fully correct for the effects of population substructure.

The new World Health Organization classification of haematopoietic and lymphoid tumours: a dermatopathological perspective
BRITISH JOURNAL OF DERMATOLOGY, Issue 4 2002. D. N. Slater

The World Health Organization (WHO) has published a new consensus classification of tumours of haematopoietic and lymphoid tissue, based on recognizable disease entities defined by clinical and scientific criteria. The WHO does not support the use of stand-alone organ-related classifications, such as for skin. The Royal College of Pathologists (London) has adopted the WHO classification in its minimum dataset for the histopathological reporting of lymphoma, and this will be used in the National Health Service Skin Cancer Dataset.
The purpose of this review is to highlight the principal primary and secondary cutaneous haematopoietic and lymphoid tumours defined in the WHO classification. The review also discusses selected problematic areas of the WHO classification relevant to the skin and contains suggestions to encourage a unified approach in the use of the WHO coded summary. These represent an attempt to facilitate future progress and research in the field of cutaneous lymphoma. They are intended as possible building blocks for wider discussion, not as alterations to the classification. The WHO classification has been compared to a road map that indicates directions for future clinical and scientific research. [source]
Decomposing changes in wage distributions: a unified approach
CANADIAN JOURNAL OF ECONOMICS, Issue 4 2002, Thomas Lemieux
Over the last fifteen years, many researchers have attempted to explain the determinants and changes of wage inequality. I propose a simple procedure to decompose changes in the distribution of wages, or in other distributions, into three factors: changes in regression coefficients, changes in the distribution of covariates, and changes in residuals. The procedure requires only estimating standard OLS regressions augmented by a logit or probit model. It can be extended by modelling residuals as a function of unmeasured skills and skill prices. Two empirical examples showing how the procedure works in practice are considered: in the first, sources of differences between the wage distributions of Alberta and British Columbia are examined; in the second, sources of change in overall wage inequality in the United States, 1973–99, are re-examined. Finally, the proposed procedure is compared with existing procedures. JEL classification: J3.
[source]
Implementation of a multidisciplinary guideline-driven approach to the care of the extremely premature infant improved hospital outcomes
ACTA PAEDIATRICA, Issue 2 2010, CA Nankervis
Abstract. Aim: To test the hypothesis that implementing guidelines for the standardized care of the extremely premature infant (<27 weeks) in the first week of life would improve patient outcomes in an all-referral NICU. Methods: Data were collected on all infants <27 weeks gestational age and <7 days of age on admission cared for using these small baby guidelines (SBG), as well as on all age-matched infants admitted the year prior (comparison). Results: Thirty-seven patients were cared for utilizing the SBG and 40 patients were in the comparison group. There were no differences between the groups in gestational age, birthweight or age on admission. There was no difference in survival to discharge (73% SBG, 70% comparison). The mean length of stay for survivors was 112 ± 38 days in the SBG group and 145 ± 76 days in the comparison group (p < 0.05).
Survival without BPD was greater in the SBG group (24%) than in the comparison group (9%; p < 0.05), and survival without severe IVH was greater in the SBG group (65%) than in the comparison group (38%; p < 0.01). Conclusions: These data demonstrate that applying a unified approach to the care of the extremely premature infant in the first week of life resulted in a decrease in the length of hospitalization and improved patient outcomes. [source]
Enantioselective Total Synthesis of Brevetoxin A: Convergent Coupling Strategy and Completion
CHEMISTRY - A EUROPEAN JOURNAL, Issue 36 2009, Michael
Abstract: A highly convergent, enantioselective total synthesis of brevetoxin A is reported. The development of a [X+2+X] Horner–Wadsworth–Emmons/cyclodehydration/reductive etherification convergent coupling strategy allowed a unified approach to the synthesis of two advanced tetracyclic fragments from four cyclic ether subunits. The Horner–Wittig coupling of the two tetracyclic fragments provided substrates that were explored for reductive etherification, the success of which delivered a late-stage tetraol intermediate. The tetraol was converted to the natural product through an expeditious selective oxidative process followed by methylenation. [source]
Contributions to the theory and practice of the chromatographic separation of enantiomers
CHIRALITY, Issue S1 2005, Volker Schurig
Abstract: The theory and practice of enantioselective capillary chromatography employing metal coordination compounds and modified cyclodextrins as chiral stationary phases are treated. A unified approach involving all contemporary chromatographic methods and a single enantioselective column is described. Reliable thermodynamic data on enantioselectivity are derived by the retention-increment method. The existence of an isoenantioselective temperature is demonstrated. Kinetic enantiomerization studies are presented.
The preparative-scale separation of enantiomers by gas chromatography with enantioselective packed columns is achieved. Unusual phenomena and future aspects of enantioselective chromatography are discussed. Chirality 17:S205–S226, 2005. © 2005 Wiley-Liss, Inc. [source]
Unified approach to KdV modulations
COMMUNICATIONS ON PURE & APPLIED MATHEMATICS, Issue 10 2001, Gennady A. El
We develop a unified approach to integrating the Whitham modulation equations. Our approach is based on the formulation of the initial-value problem for the zero-dispersion KdV as the steepest descent for the scalar Riemann–Hilbert problem [6] and on the method of generating differentials for the KdV–Whitham hierarchy [9]. By assuming hyperbolicity of the zero-dispersion limit for the KdV with general initial data, we bypass the inverse scattering transform and produce the symmetric system of algebraic equations describing the motion of the modulation parameters, plus the system of inequalities determining the number of oscillating phases at any fixed point on the (x, t)-plane. The resulting system effectively solves the zero-dispersion KdV with an arbitrary initial datum. © 2001 John Wiley & Sons, Inc. [source]
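Several of the abstracts above turn on the Cochran–Armitage trend test (CATT) whose behaviour under population substructure is analyzed by Zheng. A minimal illustrative sketch of the standard statistic is given below, assuming linear genotype scores 0, 1, 2 and the usual large-sample variance; function and variable names are ours, not from the paper:

```python
import numpy as np

def catt_z(cases, controls, scores=None):
    """Cochran-Armitage trend test for a 2 x k case-control genotype table.

    cases, controls: counts per genotype category (e.g. 0/1/2 copies of
    a risk allele). Returns the Z statistic, asymptotically N(0, 1)
    under the null hypothesis of no trend.
    """
    cases = np.asarray(cases, dtype=float)
    controls = np.asarray(controls, dtype=float)
    t = np.arange(len(cases)) if scores is None else np.asarray(scores, dtype=float)
    n = cases + controls              # per-genotype column totals
    N = n.sum()
    p = cases.sum() / N               # overall case fraction
    num = np.sum(t * (cases - p * n))                               # score-weighted observed minus expected
    var = p * (1 - p) * (np.sum(t**2 * n) - np.sum(t * n) ** 2 / N)  # null variance of num
    return num / np.sqrt(var)

# Cases enriched for higher genotype scores -> positive trend statistic.
z = catt_z(cases=[10, 20, 30], controls=[30, 20, 10])
print(round(z, 3))  # 4.472
```

The bias and variance-distortion results in the Zheng abstract describe how the numerator and variance above are misestimated when the sample is drawn from unidentified subpopulations, which is why a nominal N(0, 1) reference can give excess type I error.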