Major Approaches


Selected Abstracts


A novel approach and protocol for discovering extremely low-abundance proteins in serum

PROTEINS: STRUCTURE, FUNCTION AND BIOINFORMATICS, Issue 17 2006
Yoshinori Tanaka
Abstract The proteomic analysis of serum (plasma) has been a major approach to determining biomarkers essential for early disease diagnosis and drug discovery. The determination of these biomarkers, however, is analytically challenging since the dynamic concentration range of serum proteins/peptides is extremely wide (more than 10 orders of magnitude). Thus, reducing sample complexity prior to proteomic analysis is essential, particularly when analyzing low-abundance protein biomarkers. Here, we demonstrate a novel approach to the proteomic analysis of human serum that uses an originally developed serum protein separation device and a sequentially linked 3-D-LC-MS/MS system. Our hollow-fiber-membrane-based serum pretreatment device can efficiently deplete high-molecular-weight proteins and concentrate low-molecular-weight proteins/peptides automatically within 1 h. Four independent analyses of healthy human sera pretreated with this unique device, followed by 3-D-LC-MS/MS, successfully produced 12,000–13,000 MS/MS spectra and identified around 1800 proteins (>95% reliability) and 2300 proteins (>80% reliability). We believe that the unique serum pretreatment device and proteomic analysis protocol reported here could be a powerful tool for discovering physiological biomarkers, owing to their high throughput (3.7 days per sample analysis) and high performance in detecting low-abundance proteins in serum or plasma samples. [source]


Prevalence and epidemiologic characteristics of FASD from various research methods with an emphasis on recent in-school studies

DEVELOPMENTAL DISABILITIES RESEARCH REVIEW, Issue 3 2009
Philip A. May
Abstract Researching the epidemiology and estimating the prevalence of fetal alcohol syndrome (FAS) and other fetal alcohol spectrum disorders (FASD) for mainstream populations anywhere in the world has presented a challenge to researchers. Three major approaches have been used in the past: surveillance and record review systems, clinic-based studies, and active case ascertainment methods. The literature on each of these methods is reviewed citing the strengths, weaknesses, prevalence results, and other practical considerations for each method. Previous conclusions about the prevalence of FAS and total FASD in the United States (US) population are summarized. Active approaches which provide clinical outreach, recruitment, and diagnostic services in specific populations have been demonstrated to produce the highest prevalence estimates. We then describe and review studies utilizing in-school screening and diagnosis, a special type of active case ascertainment. Selected results from a number of in-school studies in South Africa, Italy, and the US are highlighted. The particular focus of the review is on the nature of the data produced from in-school methods and the specific prevalence rates of FAS and total FASD which have emanated from them. We conclude that FAS and other FASD are more prevalent in school populations, and therefore the general population, than previously estimated. We believe that the prevalence of FAS in typical, mixed-racial, and mixed-socioeconomic populations of the US is at least 2 to 7 per 1,000. Regarding all levels of FASD, we estimate that the current prevalence of FASD in populations of younger school children may be as high as 2–5% in the US and some Western European countries. © 2009 Wiley-Liss, Inc. Dev Disabil Res Rev 2009; 15:176–192. [source]


Can mechanism inform species' distribution models?

ECOLOGY LETTERS, Issue 8 2010
Lauren B. Buckley
Ecology Letters (2010) 13: 1041–1054 Abstract Two major approaches address the need to predict species distributions in response to environmental changes. Correlative models estimate parameters phenomenologically by relating current distributions to environmental conditions. By contrast, mechanistic models incorporate explicit relationships between environmental conditions and organismal performance, estimated independently of current distributions. Mechanistic approaches include models that translate environmental conditions into biologically relevant metrics (e.g. potential duration of activity), models that capture environmental sensitivities of survivorship and fecundity, and models that use energetics to link environmental conditions and demography. We compared how two correlative and three mechanistic models predicted the ranges of two species: a skipper butterfly (Atalopedes campestris) and a fence lizard (Sceloporus undulatus). Correlative and mechanistic models performed similarly in predicting current distributions, but mechanistic models predicted larger range shifts in response to climate change. Although mechanistic models theoretically should provide more accurate distribution predictions, there is much potential for improving their flexibility and performance. [source]
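As a concrete (and entirely hypothetical) illustration of the correlative approach described in this abstract, the sketch below fits a one-variable logistic regression relating simulated presence/absence records to site temperature. All data, parameter values, and function names are invented for the example; a real correlative SDM would use many climate covariates and observed occurrence records.

```python
import numpy as np

# Hypothetical correlative SDM: presence/absence (1/0) at survey sites is
# related to a single climate variable (mean site temperature) by logistic
# regression, fit here with plain gradient ascent on the log-likelihood.
rng = np.random.default_rng(0)
temp = rng.uniform(5, 30, 200)                      # site temperatures (°C)
true_p = 1 / (1 + np.exp(-(temp - 18)))             # species favours warm sites
present = (rng.random(200) < true_p).astype(float)  # simulated occurrences

# Fit slope w and intercept b of P(present) = sigmoid(w * temp + b).
w, b = 0.0, 0.0
for _ in range(5000):
    p = 1 / (1 + np.exp(-(w * temp + b)))
    w += 0.001 * np.mean((present - p) * temp)
    b += 0.001 * np.mean(present - p)

def occurrence_prob(t):
    """Predicted probability of presence at temperature t."""
    return 1 / (1 + np.exp(-(w * t + b)))
```

A mechanistic model would instead derive the response from measured physiology (e.g. thermal performance curves) rather than from the occurrence data themselves, which is why the two approaches can agree on current ranges yet diverge under novel climates.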


Quantitative trait linkage analysis by generalized estimating equations: Unification of variance components and Haseman-Elston regression

GENETIC EPIDEMIOLOGY, Issue 4 2004
Wei-Min Chen
Two of the major approaches for linkage analysis with quantitative traits in humans include variance components and Haseman-Elston regression. Previously, these were viewed as quite separate methods. We describe a general model, fit by use of generalized estimating equations (GEE), for which the variance components and Haseman-Elston methods (including many of the extensions to the original Haseman-Elston method) are special cases, corresponding to different choices for a working covariance matrix. We also show that the regression-based test of Sham et al. ([2002] Am. J. Hum. Genet. 71:238–253) is equivalent to a robust score statistic derived from our GEE approach. These results have several important implications. First, this work provides new insight regarding the connection between these methods. Second, asymptotic approximations for power and sample size allow clear comparisons regarding the relative efficiency of the different methods. Third, our general framework suggests important extensions to the Haseman-Elston approach which make more complete use of the data in extended pedigrees and allow a natural incorporation of environmental and other covariates. © 2004 Wiley-Liss, Inc. [source]
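To make the regression flavour of these methods concrete, here is a minimal, purely illustrative sketch of the original Haseman-Elston idea on simulated data (all numbers invented): the squared trait difference of each sib pair is regressed on the pair's estimated identical-by-descent (IBD) sharing at a marker, and linkage shows up as a negative slope. The paper's GEE framework generalizes this, recovering variance components or Haseman-Elston as special cases via the choice of working covariance matrix.

```python
import numpy as np

# Simulate sib pairs: IBD sharing proportion takes values 0, 0.5, 1 with
# the usual 1:2:1 sib-pair probabilities, and the trait difference has
# variance that shrinks as sharing increases (i.e. a linked locus).
rng = np.random.default_rng(1)
n = 2000
pihat = rng.choice([0.0, 0.5, 1.0], size=n, p=[0.25, 0.5, 0.25])
diff = rng.normal(0, np.sqrt(2.0 - pihat))  # trait difference per pair
y = diff ** 2                               # squared sib-pair difference

# Original Haseman-Elston: ordinary least squares of y on pihat.
X = np.column_stack([np.ones(n), pihat])
a, b = np.linalg.lstsq(X, y, rcond=None)[0]
# Under linkage, slope b is negative: pairs sharing more alleles are more alike.
```

In practice the test is the one-sided t-test on the slope; the GEE formulation replaces this OLS fit with estimating equations under a chosen working covariance.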


Intervention program to reduce waiting time of a dermatological visit: Managed overbooking and service centralization as effective management tools

INTERNATIONAL JOURNAL OF DERMATOLOGY, Issue 8 2007
Yuval Bibi MD
Background: Long waiting times are an impediment to dermatological patient care worldwide, resulting in significant disruption of clinical care and frustration among carers and patients. Objective: To reduce waiting times for dermatological appointments. Methods: A focus group including dermatologists and management personnel reviewed the scheduling process, mapped potential problems, and proposed a comprehensive intervention program. The two major approaches taken in the intervention program were revision of the scheduling process by managed overbooking of patient appointments and centralization of the dermatological service into a single centralized dermatological clinic. Results: Following the intervention program, the average waiting time for dermatological appointments decreased from 29.3 to 6.8 days. The number of scheduled appointments per 6 months rose from 17,007 to 20,433. The non-attendance (no-show) proportion decreased from 33% to 28%. Dermatologist work-hours did not change significantly. Conclusions: Waiting lists for dermatological consultations were substantially shortened by managed overbooking of patient appointments and centralization of the service. [source]
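The abstract does not describe the clinic's actual overbooking rule, but the general idea behind managed overbooking can be sketched as a simple probabilistic calculation: given a no-show rate like the 33% reported here, book the largest number of patients whose probability of exceeding session capacity stays below a tolerance. The function names, capacity, and tolerance below are assumptions for illustration only.

```python
from math import comb

def overflow_prob(n_booked, capacity, p_noshow):
    """P(arrivals > capacity) when each booked patient independently
    shows up with probability 1 - p_noshow (Binomial model)."""
    p_show = 1 - p_noshow
    return sum(
        comb(n_booked, k) * p_show**k * (1 - p_show)**(n_booked - k)
        for k in range(capacity + 1, n_booked + 1)
    )

def max_bookings(capacity, p_noshow, tol=0.05):
    """Largest booking count whose overflow probability stays below tol."""
    n = capacity
    while overflow_prob(n + 1, capacity, p_noshow) <= tol:
        n += 1
    return n
```

For example, a session with 20 slots and a 33% no-show rate can safely be booked beyond 20 appointments while keeping the chance of overflow under 5%, which is how overbooking raises throughput without a proportional rise in turned-away patients.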


Domestic Interests, Ideas and Integration: Lessons from the French Case

JCMS: JOURNAL OF COMMON MARKET STUDIES, Issue 1 2000
Craig Parsons
Both the major approaches to European integration, 'intergovernmentalism' and 'neofunctionalism', model integration as reflecting the demands of domestic interest groups. Where scholars qualify this basic model, they typically see integration diverging gradually and unintentionally from its expectations. This article tests the interest-group model against research into French policy-making across the history of integration, and argues that French policies never clearly reflected this interest-group baseline. Instead, French choices for today's European Union (as opposed to widely different historical alternatives) can only be explained with reference to French elites' ideas about Europe. Additionally, national leaders' ideas have set the main conditions for the success or failure of supranational entrepreneurship in Europe's 'grand bargains'. [source]


An interactive fuzzy satisficing method for multiobjective stochastic linear programming problems using chance constrained conditions

JOURNAL OF MULTI CRITERIA DECISION ANALYSIS, Issue 3 2002
Masatoshi Sakawa
Abstract Two major approaches have been developed to deal with randomness or ambiguity in mathematical programming problems: stochastic programming approaches and fuzzy programming approaches. In this paper, we focus on multiobjective linear programming problems with random variable coefficients in objective functions and/or constraints. Using chance constrained programming techniques, the stochastic programming problems are transformed into deterministic ones. As a fusion of the stochastic and fuzzy approaches, after determining the fuzzy goals of the decision maker, an interactive fuzzy satisficing method is presented that derives a satisficing solution for the decision maker by updating the reference membership levels. Copyright © 2003 John Wiley & Sons, Ltd. [source]
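As a toy illustration of the chance-constrained transformation (the paper's setting is multiobjective and more general; the normally distributed right-hand side here is an assumption made for the example): a constraint a·x ≤ b with b ~ N(μ, σ²), required to hold with probability at least α, is equivalent to the deterministic constraint a·x ≤ μ + σΦ⁻¹(1 − α), where Φ⁻¹ is the standard normal quantile function.

```python
from statistics import NormalDist

def deterministic_rhs(mu, sigma, alpha):
    """Right-hand side making  a*x <= rhs  equivalent to
    P(a*x <= b) >= alpha  when b ~ Normal(mu, sigma**2)."""
    return mu + sigma * NormalDist().inv_cdf(1 - alpha)

# Requiring 95% satisfaction tightens the constraint below its mean value 10:
rhs_95 = deterministic_rhs(10.0, 2.0, 0.95)
```

At α = 0.5 the deterministic right-hand side reduces to the mean μ, and raising α shrinks the feasible region further, which is exactly the trade-off the interactive fuzzy satisficing method lets the decision maker explore.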


Interpersonal trust and voluntary associations: examining three approaches

THE BRITISH JOURNAL OF SOCIOLOGY, Issue 3 2002
Helmut Anheier
ABSTRACT The relationship between interpersonal trust and membership in voluntary associations is a persistent research finding in sociology. What is more, the notion of trust has become a central issue in current social science theorizing, covering such diverse approaches as transaction cost economics and cognitive sociology. In different ways and for different purposes, these approaches address the role of voluntary organizations, although, as this paper argues, much of this thinking remains sketchy and underdeveloped. Against an empirical portrait of this relationship, the purpose of this paper is to assess such theorizing. We first set out to explicate major approaches to trust in economics, sociology and political science, using the non-profit or voluntary organization as a focal point. We then examine the various approaches in terms of their strengths and weaknesses and, finally, identify key areas for theoretical development. In particular, we point to the social movement literature, the social psychology of trust, and recent thinking about civil society. [source]


Principles of pharmacoeconomics and their impact on strategic imperatives of pharmaceutical research and development

BRITISH JOURNAL OF PHARMACOLOGY, Issue 7 2010
József Bodrogi
The importance of evidence-based health policy is widely acknowledged among health care professionals, patients and politicians. Health care resources available for medical procedures, including pharmaceuticals, are limited all over the world. Economic evaluations help to alleviate the burden of scarce resources by improving the allocative efficiency of health care financing. Reimbursement of new medicines is subject to their cost-effectiveness and affordability in more and more countries. There are three major approaches to calculate the cost-effectiveness of new pharmaceuticals. Economic analyses alongside pivotal clinical trials are often inconclusive due to the suboptimal collection of economic data and protocol-driven costs. The major limitation of observational naturalistic economic evaluations is the selection bias and that they can be conducted only after registration and reimbursement. Economic modelling is routinely used to predict the cost-effectiveness of new pharmaceuticals for reimbursement purposes. Accuracy of cost-effectiveness estimates depends on the quality of input variables; validity of surrogate end points; and appropriateness of modelling assumptions, including model structure, time horizon and sophistication of the model to differentiate clinically and economically meaningful outcomes. These economic evaluation methods are not mutually exclusive; in practice, economic analyses often combine data collection alongside clinical trials or observational studies with modelling. The need for pharmacoeconomic evidence has fundamentally changed the strategic imperatives of research and development (R&D). Therefore, professionals in pharmaceutical R&D have to be familiar with the principles of pharmacoeconomics, including the selection of health policy-relevant comparators, analytical techniques, measurement of health gain by quality-adjusted life-years and strategic pricing of pharmaceuticals. [source]
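The cost-effectiveness calculation at the core of these evaluations can be sketched in a few lines. All costs, treatment durations, and utility weights below are invented for illustration; real analyses also discount future costs and effects and propagate uncertainty in every input.

```python
# Hypothetical incremental cost-effectiveness ratio (ICER) of a new drug
# versus a comparator, with health gain measured in quality-adjusted
# life-years (QALYs = life-years * utility weight of the health state).
def qalys(years, utility):
    """Quality-adjusted life-years for a health state."""
    return years * utility

def icer(cost_new, qaly_new, cost_old, qaly_old):
    """Extra cost per extra QALY of the new treatment."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# New drug: 6 years at utility 0.80; comparator: 5 years at utility 0.70.
delta = icer(cost_new=42_000, qaly_new=qalys(6, 0.80),
             cost_old=30_000, qaly_old=qalys(5, 0.70))
# 12,000 extra cost for 1.3 extra QALYs: roughly 9,231 per QALY gained.
```

Reimbursement decisions then compare this ratio against a willingness-to-pay threshold per QALY, which is why the choice of comparator and the validity of the effectiveness inputs dominate the result.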