Common Approach
Selected Abstracts

Divergent and Linear Solid-Phase Synthesis of PNA Containing Thiazole Orange as Artificial Base
EUROPEAN JOURNAL OF ORGANIC CHEMISTRY, Issue 15 2005, Dilip V. Jarikote
Abstract: Fluorescent nucleobase surrogates provide nucleic acids with interesting properties. We have recently introduced thiazole orange as a base surrogate into PNA and found that the so-called FIT (Forced Intercalation of Thiazole orange) PNA probes signal hybridization by enhancements of fluorescence. Common approaches to modifying nucleobases or introducing nucleobase surrogates rely on monomer building blocks that have been synthesized in solution phase. The need to prefabricate a base-modified building block can hold up progress if several base modifications or base surrogates are to be evaluated. Herein, a method for divergent solid-phase synthesis is presented that facilitates the screening of base surrogates that fluoresce upon hybridization. An Fmoc/Aloc-protected submonomer allowed the application of commonly used Fmoc-based solid-phase synthesis protocols, while removal of the fully orthogonal Aloc group enabled the on-resin introduction of base surrogates after the linear chain assembly had been completed. The divergent solid-phase synthesis strategy is automatable, gives overall yields matching those of linear solid-phase synthesis and, most importantly, provides rapid access to any kind of base-modified PNA. (© Wiley-VCH Verlag GmbH & Co. KGaA, 69451 Weinheim, Germany, 2005)

Integrating multidimensional geophysical data
ARCHAEOLOGICAL PROSPECTION, Issue 1 2006, Kenneth L. Kvamme
Abstract: Surveys that utilize multiple geophysical methods offer greater insights about the subsurface because each one generally yields different information.
Common approaches to integrating or 'fusing' multidimensional geophysical data are investigated utilizing computer graphics, geographical information system (GIS), mathematical and statistical solutions. These approaches are synthesized into graphical, discrete and continuous domains. It is shown that graphical approaches allow complex visualizations of the subsurface, but only images are generated and their dimensionality tends to be low. Discrete methods incorporate any number of geophysical dimensions, allow application of powerful Boolean operations, and produce unambiguous maps of anomaly presence or absence, but many of these methods rely on arbitrary thresholds that define only robust anomalies. Continuous data integrations offer capabilities beyond other methods because robust and subtle anomalies are simultaneously expressed, new quantitative information is generated, and interpretive data are derived in the form of regression weights, factor loadings, and the like, that reveal interrelationships and underlying dimensionality. All approaches are applied to a common data set obtained at Army City, Kansas, a World War I era commercial complex that serviced troops in nearby Camp Funston (now Fort Riley). Utilizing data from six geophysical surveys (magnetic gradiometry, electrical resistivity, ground-penetrating radar, magnetic susceptibility, soil conductivity, aerial thermography), various data integrations reveal the structure of this nearly forgotten town. Copyright © 2005 John Wiley & Sons, Ltd.

Modeling the model organism Dictyostelium discoideum
DEVELOPMENT GROWTH & DIFFERENTIATION, Issue 6 2000, Seido Nagano
Abstract: The cellular slime mold Dictyostelium discoideum is a fascinating organism, not only for biologists, but also for physicists.
Since the Belousov–Zhabotinskii reaction pattern, a well-known non-linear phenomenon in chemistry, was observed during aggregation of Dictyostelium amoebae, Dictyostelium has been one of the major subjects of non-linear dynamics studies. Macroscopic theory, such as continuous cell density approximation, has been a common approach to studying pattern formation since the pioneering work of Turing. Recently, promising microscopic approaches, such as the cellular dynamics method, have emerged. They have shown that Dictyostelium is useful as a model system in biology. The synchronization mechanism of oscillatory production of cyclic adenosine 3′,5′-monophosphate in Dictyostelium is discussed in detail to show how it is a universal feature that can explain synchronization in other organisms.

Improving the Evaluation of Rural Development Policy
EUROCHOICES, Issue 1 2010, David Blandford
Summary: A previous EuroChoices (Vol. 7, No. 1) compared and contrasted approaches to rural development policy in the EU and US. This Special Issue focuses on the evaluation of these policies, drawing on a workshop held in June 2009 at the OECD Conference Center in Paris. Evaluation is an activity that runs parallel with policymaking and is capable of contributing to effectiveness and efficiency at all stages. Evaluators, wherever they work and whatever aspect of rural development is their focus, face some common technical problems. These include multiple (and often ill-defined) policy objectives, the choice of appropriate indicators (especially the need to distinguish between outputs and outcomes), how to establish baseline values, where to draw boundaries in terms of impact and time, and the identification of additionality and causality.
Ensuring that lessons learned from evaluation are actually applied is problematic. Experiences covered in this Issue include the use of macro and case-study approaches, and various schemes (investment in human and social capital, and agri-environment and forestry). There is an inherent tension between using a common approach across countries and regions in the interests of comparability and the flexibility needed to capture all the relevant factors in the diverse situations in which rural development actions take place.
Limits of fine-mapping a quantitative trait
GENETIC EPIDEMIOLOGY, Issue 2 2003, Larry D. Atwood
Abstract: Once a significant linkage is found, an important goal is reducing the error in the estimated location of the linked locus. A common approach to reducing location error, called fine-mapping, is the genotyping of additional markers in the linked region to increase the genetic information. The utility of fine-mapping for quantitative trait linkage analysis is largely unknown. To explore this issue, we performed a fine-mapping simulation in which the region containing a significant linkage at a 10-centiMorgan (cM) resolution was fine-mapped at 2, 1, and 0.5 cM. We simulated six quantitative trait models in which the proportion of variation due to the quantitative trait locus (QTL) ranged from 0.20 to 0.90. We used four sampling designs that were all combinations of 100 and 200 families of sizes 5 and 7. Variance components linkage analysis (Genehunter) was performed until 1,000 replicates were found with a maximum lodscore greater than 3.0. For each of these 1,000 replications, we repeated the linkage analysis three times: once for each of the fine-map resolutions. For the most realistic model, reduction in the average location error ranged from 3% to 15% for 2-cM fine-mapping and from 3% to 18% for 1-cM fine-mapping, depending on the number of families and family size. Fine-mapping at 0.5 cM did not differ from the 1-cM results. Thus, if the QTL accounts for a small proportion of the variation, as is the case for realistic traits, fine-mapping has little value. Genet Epidemiol 24:99–106, 2003. © 2003 Wiley-Liss, Inc.

What determines the relationship between plant diversity and habitat productivity?
GLOBAL ECOLOGY, Issue 6 2008, Martin Zobel
ABSTRACT: The relationship between biodiversity and habitat productivity has been a fundamental topic in ecology.
Although the relationship between these parameters may exhibit different shapes, the unimodal shape has been frequently encountered. The decrease in diversity at high productivity has usually been attributed to competitive exclusion. We suggest that evolutionary history and dispersal limitation may be even more important in shaping the diversity–productivity relationship. On a global scale, unimodal diversity–productivity relationships dominate in temperate regions, whereas positive relationships are more common in the tropics. This difference can be accounted for by contrasting evolutionary history. Temperate regions have smaller species pools for productive habitats since these habitats have been scarce historically for speciation, while the opposite is true for the tropics. In addition, dispersal within a region may limit diversity either due to the lack of dispersal syndromes at low productivity or the low number of diaspores at high productivity. Thereafter, biotic interactions (competition and facilitation) can shape the relationship. All these processes can act independently or concurrently. We recommend that the common approach to examining empirical diversity–environment relationships should start with the role of large-scale processes such as evolutionary history and dispersal limitation, followed by influences associated with ecological interactions.

Intravenous Valproate Sodium in the Treatment of Daily Headache
HEADACHE, Issue 6 2002, Tamara H. Schwartz MD
Background: Treatment of chronic daily headache/transformed migraine is challenging, especially when it is complicated by overuse of analgesics, triptans, or both. One common approach involves the use of repetitive intravenous dihydroergotamine. We investigated the use of intravenous valproate sodium in the treatment of chronic daily headache/transformed migraine in patients who had contraindications to the use of or had failed treatment with dihydroergotamine.
Methods: We administered intravenous valproate sodium (Depacon) to patients with chronic daily headache/transformed migraine (loading dose 15 mg/kg, followed by 5 mg/kg every 8 hours). All analgesics and triptans were discontinued prior to treatment with divalproex sodium, and preventative medications for migraine were begun or continued. All patients received instruction in behavioral modification and the proper use of analgesics and triptans. Results: Improvement in headache was reported by 80% of the patients treated, and valproate sodium was tolerated well by most. Conclusion: Intravenous valproate sodium may be of assistance in the initial management of patients with chronic daily headache/transformed migraine and analgesic/triptan overuse, especially when dihydroergotamine is ineffective or contraindicated.

Empirical implications of response acquiescence in discrete-choice contingent valuation
HEALTH ECONOMICS, Issue 10 2006, Raymond Y. T. Yeung
Abstract: The use of discrete-choice contingent valuation (CV) to elicit individuals' preferences, expressed as maximum willingness-to-pay (WTP), although primarily developed in environmental economics, has been popular in the economic evaluation of health and healthcare. However, a concern with this method is the potential for 'over-estimating' WTP values due to the presence of response acquiescence, or 'yea-saying' bias. Based on a CV survey conducted to estimate physicians' valuation of clinic computerization, the extent of such bias was estimated from a within-sample open-ended valuation question following the respondents' discrete-choice response. Analysis of these data suggests that not only was response acquiescence an issue, but also that the parametric estimation of mean and median WTP, the most common approach to estimating WTP from discrete-choice data, would potentially magnify such bias (to varying degrees depending on the distributional assumptions applied).
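As a rough illustration of the parametric step the abstract refers to, the sketch below fits a linear logit to hypothetical dichotomous-choice data and recovers the implied median WTP. The bid amounts, acceptance proportions, and the simple grouped-data (minimum-logit) fit are all invented for the example; they are not the survey's actual data or estimation method.

```python
import math

# Hypothetical dichotomous-choice CV data: bid amounts and the observed
# proportion of "yes" (willing-to-pay) responses at each bid.
bids = [10, 20, 40, 80, 160]
p_yes = [0.90, 0.75, 0.55, 0.30, 0.10]

# Grouped-data (minimum-logit) fit: regress empirical log-odds on bid.
logits = [math.log(p / (1 - p)) for p in p_yes]

n = len(bids)
mx = sum(bids) / n
my = sum(logits) / n
b = sum((x - mx) * (y - my) for x, y in zip(bids, logits)) / sum((x - mx) ** 2 for x in bids)
a = my - b * mx

# For a linear logit model P(yes) = 1 / (1 + exp(-(a + b*bid))), the median
# WTP (equal to the mean under logistic symmetry) is -a/b.
median_wtp = -a / b
print(round(median_wtp, 1))
```

The point the abstract makes is that this parametric step depends on the assumed error distribution, so yea-saying bias in the raw responses can be amplified to different degrees by different distributional choices.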
The possible extent of CV design versus analysis effects in discrete-choice methods therefore warrants further exploration. Copyright © 2006 John Wiley & Sons, Ltd.

Easy to say, difficult to do: diversity management in retail
HUMAN RESOURCE MANAGEMENT JOURNAL, Issue 3 2005, Carley Foster
This article examines how operational managers are interpreting the management of diversity in practice. It is explicitly concerned with the way in which managing diversity was understood and applied in one large, long-established British retailing company. The findings suggest that while the business benefits attributed to diversity management are appealing to employers, it is a concept that lacks clarity for line managers both in terms of what it is and how it should be implemented within the anti-discrimination legal framework. Line managers, familiar with the value of demonstrating a common approach in their decision-making as the key means of defence against claims of discriminatory treatment, regarded a diversity management agenda concerned with recognising and responding to individual differences as more likely to lead to feelings of unfairness and claims of unequal treatment. It will be argued that, in the implementation of organisational diversity initiatives, employers need to take greater account of the tensions facing line managers, their interpretation of diversity management and perceptions of fair treatment, as well as the operational context.

Identification of autoregressive models in the presence of additive noise
INTERNATIONAL JOURNAL OF ADAPTIVE CONTROL AND SIGNAL PROCESSING, Issue 5 2008, Roberto Diversi
Abstract: A common approach in modeling signals in many engineering applications consists in adopting autoregressive (AR) models: filters with transfer functions having a unitary numerator, driven by white noise.
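To make the AR setup concrete, here is a minimal sketch of the noise-free case (not the authors' algorithm, which additionally handles observation noise by combining low- and high-order equations): it simulates an AR(2) process driven by white noise and recovers the coefficients from the standard low-order Yule–Walker equations. The coefficient values 0.6 and 0.2 are arbitrary choices for the demonstration.

```python
import random

random.seed(0)
a1_true, a2_true = 0.6, 0.2

# Simulate the AR(2) process x[t] = a1*x[t-1] + a2*x[t-2] + e[t]:
# a filter with unit numerator driven by white Gaussian noise e[t].
n = 20000
x = [0.0, 0.0]
for _ in range(n):
    x.append(a1_true * x[-1] + a2_true * x[-2] + random.gauss(0.0, 1.0))
x = x[2:]

def autocov(series, lag):
    """Biased sample autocovariance at the given lag."""
    m = sum(series) / len(series)
    return sum((series[t] - m) * (series[t - lag] - m)
               for t in range(lag, len(series))) / len(series)

r0, r1, r2 = (autocov(x, k) for k in range(3))

# Low-order Yule-Walker system for AR(2):
#   [r0 r1] [a1]   [r1]
#   [r1 r0] [a2] = [r2]
det = r0 * r0 - r1 * r1
a1_hat = (r0 * r1 - r1 * r2) / det
a2_hat = (r0 * r2 - r1 * r1) / det
print(a1_hat, a2_hat)
```

When the observations themselves are corrupted by additive noise, r0 is inflated by the noise variance, which biases exactly this estimate; that bias is what the AR-plus-noise identification described next is designed to remove.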
Despite their wide application, these models do not take into account the possible presence of errors on the observations and cannot prove accurate when these errors are significant. AR plus noise models constitute an extension of AR models that also considers the presence of an observation noise. This paper describes a new algorithm for the identification of AR plus noise models that is characterized by a very good compromise between accuracy and efficiency. This algorithm, taking advantage of both low- and high-order Yule–Walker equations, also guarantees the positive definiteness of the autocorrelation matrix of the estimated process and allows estimation of the equation error and observation noise variances. It is also shown how the proposed procedure can be used for estimating the order of the AR model. The new algorithm is compared with some traditional algorithms by means of Monte Carlo simulations. Copyright © 2007 John Wiley & Sons, Ltd.

Salinity patterns in irrigation systems, a threat to be demystified, a constraint to be managed: Field evidence from Algeria and Tunisia
IRRIGATION AND DRAINAGE, Issue S3 2009, S. Bouarfa
Keywords: irrigation; salinity management; residual alkalinity; farmers' perceptions and strategies; Maghreb
Abstract: Salinity problems induced by irrigation are often presented in the literature as a threat that can only be managed at the irrigation scheme scale by installing subsurface drainage. On the other hand, salinity is a constraint that has often been successfully managed locally by farmers adapting their practices. However, the continuing expansion of irrigation with related water scarcity problems, plus the increasing use of groundwater of marginal quality, has resulted in a new challenge that is difficult to handle at the farm level only.
To assess the dynamics of soil salinity and water quality together with farmers' salinity management practices, we adapted a common approach to analyze two contrasting salinity patterns: a traditional salinity pattern in an oasis (Fatnassa, Tunisia), and a recent sodicity pattern in a large irrigation scheme (Lower Chelif, Algeria). This approach, which combines surveys of farmers' perceptions and practices with salinity measurements and geochemical analysis, paves the way for more integrated management of salinity problems related to water scarcity. Copyright © 2009 John Wiley & Sons, Ltd.
Testing alternate ecological approaches to seagrass rehabilitation: links to life-history traits
JOURNAL OF APPLIED ECOLOGY, Issue 5 2010, Andrew D. Irving
Summary:
1. Natural resources and ecosystem services provided by the world's major biomes are increasingly threatened by anthropogenic impacts. Rehabilitation is a common approach to recreating and maintaining habitats, but limitations to the success of traditional techniques necessitate new approaches.
2. Almost one-third of the world's productive seagrass meadows have been lost in the past 130 years. Using a combined total of three seagrass species at seven sites over 8 years, we experimentally assessed the performance of multiple rehabilitation methods that utilize fundamentally different ecological approaches.
3. First, traditional methods of transplantation were tested and produced varied survival (0–80%) that was site dependent. Secondly, seedling culture and outplanting produced poor survival (2–9%) but reasonable growth. Finally, a novel method that used sand-filled bags of hessian to overcome limitations of traditional techniques by facilitating recruitment and establishment of seedlings in situ produced recruit densities of 150–350 seedlings m−2, with long-term survival (up to 38 months) ranging from 0 to 72 individuals m−2.
4. Results indicate that facilitating seagrass recruitment in situ using hessian bags can provide a new tool to alleviate current limitations to successful rehabilitation (e.g. mobile sediments, investment of time and resources), leading to more successful management and mitigation of contemporary losses. Hessian bags have distinct environmental and economic advantages over other methods tested in that they do not damage existing meadows, are biodegradable, quick to deploy, and cost less per hectare (US$16 737) than the estimated ecosystem value of seagrass meadows (US$27 039 year−1).
5. Synthesis and applications.
This research demonstrates how exploring alternate ecological approaches to habitat rehabilitation can expand our collective toolbox for successfully re-creating complex and productive ecosystems, and alleviate the destructive side-effects and low success rates of more traditional techniques. Moreover, new methods can offer economic and environmental solutions to the restrictions placed upon managers of natural resources.

A tale of two matrices: multivariate approaches in evolutionary biology
JOURNAL OF EVOLUTIONARY BIOLOGY, Issue 1 2007, M. W. Blows
Abstract: Two symmetric matrices underlie our understanding of microevolutionary change. The first is the matrix of nonlinear selection gradients (γ), which describes the individual fitness surface. The second is the genetic variance–covariance matrix (G) that influences the multivariate response to selection. A common approach to the empirical analysis of these matrices is the element-by-element testing of significance, and subsequent biological interpretation of pattern based on these univariate and bivariate parameters. Here, I show why this approach is likely to misrepresent the genetic basis of quantitative traits, and the selection acting on them, in many cases. Diagonalization of square matrices is a fundamental aspect of many of the multivariate statistical techniques used by biologists. Applying this, and other related approaches, to the analysis of the structure of γ and G matrices gives greater insight into the form and strength of nonlinear selection, and the availability of genetic variance for multiple traits.

A simple covarion-based approach to analyse nucleotide substitution rates
JOURNAL OF EVOLUTIONARY BIOLOGY, Issue 4 2002, J. Siltberg
Using the ratio of nonsynonymous to synonymous nucleotide substitution rates (Ka/Ks) is a common approach for detecting positive selection.
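As a toy numerical illustration of the Ka/Ks idea (all counts are invented, and real analyses such as the Nei–Gojobori method additionally count sites codon by codon and correct for multiple substitutions):

```python
# Hypothetical substitution and site counts for a gene pair.
nonsyn_subs, syn_subs = 8, 3            # observed substitutions
nonsyn_sites, syn_sites = 420.0, 180.0  # nonsynonymous / synonymous site totals

ka = nonsyn_subs / nonsyn_sites  # nonsynonymous substitutions per nonsynonymous site
ks = syn_subs / syn_sites        # synonymous substitutions per synonymous site
ratio = ka / ks                  # Ka/Ks > 1 suggests positive selection

print(round(ratio, 3))  # → 1.143
```

A gene-wide ratio like this averages over all sites, which is precisely the limitation the covarion-based approach below addresses.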
However, calculation of this ratio over a whole gene combines amino acid sites that may be under positive selection with those that are highly conserved. We introduce a new covarion-based method to sample only the sites potentially under selective pressure. Using ancestral sequence reconstruction over a phylogenetic tree coupled with calculation of Ka/Ks ratios, positive selection is better detected by this simple covarion-based approach than by a whole-gene analysis or a windowing analysis. This is demonstrated on a synthetic dataset and is tested on primate leptin, which indicates a previously undetected round of positive selection in the branch leading to Gorilla gorilla.

Feedback control of dissipative PDE systems using adaptive model reduction
AICHE JOURNAL, Issue 4 2009, Amit Varshney
Abstract: The problem of feedback control of spatially distributed processes described by highly dissipative partial differential equations (PDEs) is considered. Typically, this problem is addressed through model reduction, where finite-dimensional approximations to the original infinite-dimensional PDE system are derived and used for controller design. The key step in this approach is the computation of basis functions that are subsequently utilized to obtain finite-dimensional ordinary differential equation (ODE) models using the method of weighted residuals. A common approach to this task is the Karhunen-Loève expansion combined with the method of snapshots. To circumvent the issue of a priori availability of a sufficiently large ensemble of PDE solution data, the focus is on the recursive computation of eigenfunctions as additional data from the process becomes available. Initially, an ensemble of eigenfunctions is constructed based on a relatively small number of snapshots, and the covariance matrix is computed. The dominant eigenspace of this matrix is then utilized to compute the empirical eigenfunctions required for model reduction.
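The snapshot step just described can be sketched in a few lines. This is a self-contained toy with synthetic "solution" data, not the authors' recursive algorithm: it forms the snapshot covariance (Gram) matrix, extracts its dominant eigenvector by power iteration, and assembles the corresponding leading empirical eigenfunction.

```python
import math

n_grid, n_snap = 50, 8
grid = [i / (n_grid - 1) for i in range(n_grid)]

# Synthetic snapshots: one dominant spatial mode with varying amplitude
# plus a weaker second mode (stand-ins for PDE solution data).
snapshots = [
    [(1.0 + 0.1 * k) * math.sin(math.pi * xi)
     + 0.05 * k * math.sin(2 * math.pi * xi) for xi in grid]
    for k in range(n_snap)
]

# Covariance (Gram) matrix of the ensemble, C[i][j] = <u_i, u_j> / n_snap.
C = [[sum(a * b for a, b in zip(ui, uj)) / n_snap for uj in snapshots]
     for ui in snapshots]

# Power iteration for the dominant eigenvector of C (its dominant eigenspace).
v = [1.0] * n_snap
for _ in range(200):
    w = [sum(C[i][j] * v[j] for j in range(n_snap)) for i in range(n_snap)]
    norm = math.sqrt(sum(wi * wi for wi in w))
    v = [wi / norm for wi in w]

# Leading empirical eigenfunction: the corresponding combination of snapshots,
# normalized to unit discrete norm.
phi = [sum(v[k] * snapshots[k][i] for k in range(n_snap)) for i in range(n_grid)]
norm_phi = math.sqrt(sum(p * p for p in phi))
phi = [p / norm_phi for p in phi]
```

Here the dominant eigenvector is recomputed from scratch; the adaptive scheme described in the abstract instead updates the small eigenspace as each new snapshot arrives, which is what keeps the computational burden low.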
This dominant eigenspace is recomputed with the addition of each snapshot, with a possible increase or decrease in its dimensionality; because the dimensionality is small, the computational burden is relatively light. The proposed approach is applied to representative examples of dissipative PDEs, with both linear and nonlinear spatial differential operators, to demonstrate the effectiveness of the proposed methodology. © 2009 American Institute of Chemical Engineers AIChE J, 2009

Development of a nanofiltration process to improve the stability of a novel anti-MRSA carbapenem drug candidate
JOURNAL OF PHARMACEUTICAL SCIENCES, Issue 4 2002, V. Antonucci
Abstract: The benzenesulfonate salt of the anti-methicillin-resistant Staphylococcus aureus carbapenem antibiotic studied is a crystalline, nonhygroscopic powder which is stable at room temperature, making it an ideal compound for long-term storage. However, the limited aqueous solubility of this salt prohibits parenteral administration. Conversely, the chloride salt of this carbapenem demonstrates opposing characteristics; it is quantitatively soluble in water but is amorphous and subject to significant hydrolytic degradation in the solid state. Given two such extreme alternatives for pharmaceutical salt selection, a common approach is to develop the bioavailable salt and devise manufacturing and storage conditions that minimize degradation. This report describes a different approach to this manufacturing dilemma via the application of a simple and efficient nanofiltration process to convert the benzenesulfonate salt (storage entity) to the chloride salt (formulated drug product). Such an approach combines the positive attributes of these two salt forms into a single scalable process that reduces processing cycle times via elimination of redundant unit operations, increases flexibility in manufacturing scheduling, and improves overall product quality. © 2002 Wiley-Liss, Inc.
and the American Pharmaceutical Association. J Pharm Sci 91: 923–932, 2002

PREDICTION OF STREAM TEMPERATURE IN FORESTED WATERSHEDS
JOURNAL OF THE AMERICAN WATER RESOURCES ASSOCIATION, Issue 1 2004, V. Sridhar
ABSTRACT: Removal of streamside vegetation changes the energy balance of a stream, and hence its temperature. A common approach to mitigating the effects of logging on stream temperature is to require establishment of buffer zones along stream corridors. A simple energy balance model is described for prediction of stream temperature in forested headwater watersheds that allows evaluation of the performance of such measures. The model is designed for application to "worst case" or maximum annual stream temperature, under low-flow conditions with maximum annual solar radiation and air temperature. Low flows are estimated via a regional regression equation with independent variables readily accessible from GIS databases. Testing of the energy balance model was performed using field data for mostly forested basins on both the west and east slopes of the Cascade Mountains, and the model was then evaluated using the regional equations for low flow and observed maximum reach temperatures in three different east-slope Cascades catchments. A series of sensitivity analyses showed that increasing the buffer width beyond 30 meters did not significantly decrease stream temperatures, and that other vegetation parameters, such as leaf area index, average tree height, and to a lesser extent streamside vegetation buffer width, more strongly affected maximum stream temperatures.

Inference for two-stage adaptive treatment strategies using mixture distributions
JOURNAL OF THE ROYAL STATISTICAL SOCIETY: SERIES C (APPLIED STATISTICS), Issue 1 2010, Abdus S.
Wahed
Summary: Treatment of complex diseases such as cancer, leukaemia, acquired immune deficiency syndrome and depression usually follows complex treatment regimes consisting of time-varying multiple courses of the same or different treatments. The goal is to achieve the largest overall benefit defined by a common end point such as survival. An adaptive treatment strategy refers to a sequence of treatments that are applied at different stages of therapy based on the individual's history of covariates and intermediate responses to the earlier treatments. However, in many cases treatment assignment depends only on intermediate response and prior treatments. Clinical trials are often designed to compare two or more adaptive treatment strategies. A common approach that is used in these trials is sequential randomization. Patients are randomized on entry into available first-stage treatments and then, on the basis of the response to the initial treatments, are randomized to second-stage treatments, and so on. The analysis often ignores this feature of randomization and frequently conducts separate analyses for each stage. Recent literature has suggested several semiparametric and Bayesian methods for inference related to adaptive treatment strategies from sequentially randomized trials. We develop a parametric approach using mixture distributions to model the survival times under different adaptive treatment strategies. We show that the estimators proposed are asymptotically unbiased and can be easily implemented by using existing routines in statistical software packages.

Normal and Lateral Deformation of Lyotropically Ordered Polymer Brush
MACROMOLECULAR THEORY AND SIMULATIONS, Issue 9 2006, Alexey A. Polotsky
Abstract: A planar polymer brush formed by semirigid chains of freely jointed rigid segments and immersed in a solvent is considered.
Collapse of the brush induced by deterioration of the solvent quality, and its deformation by an external normal or lateral force, are studied. It is demonstrated that these three different situations can be described in the framework of a common approach. It is shown that the collapse is accompanied by liquid-crystalline (LC) ordering within the brush. The LC transition can be jump-like (first order) or continuous, depending on the segments' aspect ratio and the grafting density. The transition point is investigated in detail, and the corresponding phase diagrams are calculated. It is shown that the phase diagrams of a normally deformed brush have different structures: with a narrow 'leg' in the good-solvent region for a sparsely grafted brush; with, in addition, two coexistence regions and a triple point for shorter segment lengths; or without these features if the chains are densely grafted. For the laterally deformed brush, the phase diagrams have similar structures, with a critical point in the good-solvent regime. Polymer brush subjected to deformation by a normal (top) and a lateral (bottom) external force. [source] Three versions of an ethics of care NURSING PHILOSOPHY, Issue 4 2009 Steven D. Edwards PhD M.Phil BA(hons) Abstract The ethics of care still appeals to many in spite of the penetrating criticisms of it that have been presented over the past 15 years or so. This paper tries to offer an explanation for this, and then to engage critically with three versions of an ethics of care. The explanation consists firstly in the close affinities between nursing and care. The three versions identified below are a first by Gilligan (1982), a second by Tronto (1993), and a third by Gastmans (2006); see also Little (1998). Each version is described and then subjected to criticism.
It is concluded that where the ethics of care is presented in a distinctive way, it is at its least plausible; where it is stated in more plausible forms, it is neither sufficiently distinct from nor superior to at least one other common approach to nursing ethics, namely the much-maligned 'four principles' approach. What is added by this paper to what is already known: as the article tries to explain, in spite of being subjected to sustained criticism, the ethics of care retains its appeal to many scholars. The paper tries to explain why, partly by distinguishing three different versions of an ethics of care. It is also shown that all three versions are beset with problems, the least serious of which is their lack of distinctiveness from other approaches to moral problems in health care. [source] Determinants of the optimal first-line therapy for follicular lymphoma: A decision analysis, AMERICAN JOURNAL OF HEMATOLOGY, Issue 4 2010 Rebecca L. Olin Combination immunochemotherapy is the most common approach for initial therapy of patients with advanced-stage follicular lymphoma, but no consensus exists as to the optimal selection or sequence of available regimens. We undertook this decision analysis to systematically evaluate the parameters affecting the choice of early therapy in patients with this disease. We designed a Markov model incorporating the three most commonly utilized regimens (RCVP, RCHOP, and RFlu) in combinations of first- and second-line therapies, with the endpoint of number of quality-adjusted life years (QALYs) until disease progression. Data sources included Phase II and Phase III trials and literature estimates of long-term toxicities and health state utilities. Meta-analytic methods were used to derive the values and ranges of regimen-related parameters.
Based on our model, the strategy associated with the greatest number of expected quality-adjusted life years was treatment with RCHOP in first-line therapy followed by treatment with RFlu in second-line therapy (9.00 QALYs). Strategies containing RCVP either in first- or second-line therapy resulted in the lowest number of QALYs (range 6.24–7.71). Sensitivity analysis used to determine the relative contribution of each model parameter identified PFS after first-line therapy and not short-term QOL as the most important factor in prolonging overall quality-adjusted life years. Our results suggest that regimens associated with a longer PFS provide a greater number of total QALYs, despite their short-term toxicities. For patients without contraindications to any of these regimens, use of a more active regimen may maximize overall quality of life. Am. J. Hematol. 2010. © 2010 Wiley-Liss, Inc. [source] Genome-wide association studies and the genetic dissection of complex traits, AMERICAN JOURNAL OF HEMATOLOGY, Issue 8 2009 Paola Sebastiani The availability of affordable high-throughput technology for parallel genotyping has opened the field of genetics to genome-wide association studies (GWAS), and in the last few years hundreds of articles reporting results of GWAS for a variety of heritable traits have been published. What do these results tell us? Although GWAS have discovered a few hundred reproducible associations, this number is underwhelming in relation to the huge amount of data produced, and challenges the conjecture that common variants may be the genetic causes of common diseases. We argue that the massive amount of genetic data that result from these studies remains largely unexplored and unexploited because of the challenge of mining and modeling enormous data sets, the difficulty of using nontraditional computational techniques and the focus of accepted statistical analyses on controlling the false positive rate rather than limiting the false negative rate.
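The decision analysis in the follicular-lymphoma abstract above amounts to ranking first-/second-line sequences by expected QALYs. The sketch below is a hypothetical simplification of such a comparison: the utilities, PFS values, and toxicity penalty are invented placeholders, not the paper's data or its Markov model.

```python
# Sketch of comparing two-stage treatment sequences by expected QALYs.
# QALYs for a sequence = sum over stages of (time in a health state) *
# (utility of that state).  All numbers below are invented placeholders.

UTILITY = {"RCVP": 0.85, "RCHOP": 0.78, "RFlu": 0.80, "progression_free": 0.95}

def expected_qalys(first_line, second_line, pfs1, pfs2, tox_months=6):
    """Expected QALYs until progression for a first/second-line sequence.

    pfs1, pfs2: assumed mean progression-free survival (years) per line.
    tox_months: assumed months of reduced quality of life during therapy.
    """
    tox_years = tox_months / 12.0
    q = 0.0
    for regimen, pfs in ((first_line, pfs1), (second_line, pfs2)):
        # during-treatment utility penalty, then higher utility while
        # progression-free for the remainder of that line's PFS
        q += tox_years * UTILITY[regimen]
        q += max(pfs - tox_years, 0.0) * UTILITY["progression_free"]
    return q

strategies = {
    ("RCHOP", "RFlu"): expected_qalys("RCHOP", "RFlu", 5.0, 4.5),
    ("RCVP", "RCHOP"): expected_qalys("RCVP", "RCHOP", 3.0, 4.0),
}
best = max(strategies, key=strategies.get)
print(best, round(strategies[best], 2))
```

Even in this toy version, the sequence with the longer assumed PFS wins despite a slightly lower during-treatment utility, mirroring the abstract's sensitivity-analysis conclusion that PFS, not short-term QOL, dominates total QALYs.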
In this article, we will review the common approach to analysis of GWAS data and then discuss options to learn more from these data. We will use examples from our ongoing studies of sickle cell anemia and also GWAS in multigenic traits. Am. J. Hematol., 2009. © 2009 Wiley-Liss, Inc. [source] Power and sample size when multiple endpoints are considered PHARMACEUTICAL STATISTICS: THE JOURNAL OF APPLIED STATISTICS IN THE PHARMACEUTICAL INDUSTRY, Issue 3 2007 Stephen Senn Abstract A common approach to analysing clinical trials with multiple outcomes is to control the probability for the trial as a whole of making at least one incorrect positive finding under any configuration of true and false null hypotheses. Popular approaches are to use Bonferroni corrections or structured approaches such as, for example, closed-test procedures. As is well known, such strategies, which control the family-wise error rate, typically reduce the type I error for some or all the tests of the various null hypotheses to below the nominal level. In consequence, there is generally a loss of power for individual tests. What is less well appreciated, perhaps, is that depending on approach and circumstances, the test-wise loss of power does not necessarily lead to a family-wise loss of power. In fact, it may be possible to increase the overall power of a trial by carrying out tests on multiple outcomes without increasing the probability of making at least one type I error when all null hypotheses are true. We examine two types of problems to illustrate this. Unstructured testing problems arise typically (but not exclusively) when many outcomes are being measured. We consider the case of more than two hypotheses when a Bonferroni approach is being applied, while for illustration we assume compound symmetry to hold for the correlation of all variables.
Using the device of a latent variable, it is easy to show that power is not reduced as the number of variables tested increases, provided that the common correlation coefficient is not too high (say less than 0.75). Afterwards, we will consider structured testing problems. Here, multiplicity problems arising from the comparison of more than two treatments, as opposed to more than one measurement, are typical. We conduct a numerical study and conclude again that power is not reduced as the number of tested variables increases. Copyright © 2007 John Wiley & Sons, Ltd. [source] The Anglo-American Origins and International Diffusion of the "Third Way" POLITICS & POLICY, Issue 1 2003 Donley T. Studlar Although much has been written about the meanings of the "Third Way," a term popularized by Prime Minister Tony Blair in Britain and U.S. President Bill Clinton to characterize their similar approaches to governing, little analysis has been done of the phenomenon of the rapid diffusion of this concept internationally. Although the Democratic Leadership Council used the term first in the United States in 1991, it was decided at a high-level meeting between Clinton and New Labour executive officials in 1997 to popularize the term to describe their common approach to governing. This paper describes both the intellectual and political sources of this concept and how it has spread, not only as a label for its originators, but also to other governments and parties in the world. The test of whether the Third Way becomes recognized as a coherent ideology will be whether, over time, those who advocate it become identified with distinctive, consistent policies. [source] Identification of plastic material parameters with error estimation PROCEEDINGS IN APPLIED MATHEMATICS & MECHANICS, Issue 1 2005 Jaan Unger In recent years, inverse analysis has become a common approach to typical engineering problems such as model identification.
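The latent-variable argument in the multiple-endpoints abstract above is easy to check by simulation. The sketch below assumes compound symmetry via a shared latent factor; the correlation, effect size, and one-sided testing setup are illustrative assumptions, not Senn's actual numerical study.

```python
import random
from statistics import NormalDist

random.seed(1)
ND = NormalDist()

def familywise_power(k, rho=0.5, delta=2.5, alpha=0.05, sims=20_000):
    """P(reject at least one of k one-sided tests at Bonferroni level alpha/k).

    Endpoints share a latent factor: X_j = sqrt(rho)*Z + sqrt(1-rho)*E_j + delta,
    which gives every pair of test statistics correlation rho (compound
    symmetry).  delta is the common standardized effect size.
    """
    crit = ND.inv_cdf(1 - alpha / k)  # Bonferroni-adjusted critical value
    hits = 0
    for _ in range(sims):
        z = random.gauss(0, 1)        # shared latent variable
        for _ in range(k):
            x = (rho ** 0.5) * z + ((1 - rho) ** 0.5) * random.gauss(0, 1) + delta
            if x > crit:
                hits += 1
                break                 # one rejection suffices family-wise
    return hits / sims

for k in (1, 2, 5, 10):
    print(k, round(familywise_power(k), 3))
```

With rho = 0.5 (below the abstract's ~0.75 threshold), the family-wise power should not fall as k grows, even though each individual test pays the Bonferroni penalty: the extra endpoints give extra chances to reject.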
In this contribution, the inverse problem is discussed in light of taking experimental uncertainties into account. This involves in particular the propagation of experimental errors and the analysis of the sensitivity of the model response to variations in the model parameters to be determined. The method is applied to an elasto-viscoplastic material model which is used in the context of electromagnetic high-speed forming. (© 2005 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim) [source] Estimating false discovery rates for peptide and protein identification using randomized databases PROTEINS: STRUCTURE, FUNCTION AND BIOINFORMATICS, Issue 12 2010 Gregory Hather Abstract MS-based proteomics characterizes the protein contents of biological samples. The most common approach is to first match observed MS/MS peptide spectra against theoretical spectra from a protein sequence database and then to score these matches. The false discovery rate (FDR) can be estimated as a function of the score by searching together the protein sequence database and its randomized version and comparing the score distributions of the randomized versus nonrandomized matches. This work introduces a straightforward isotonic regression-based method to estimate the cumulative FDRs and local FDRs (LFDRs) of peptide identification. Our isotonic method not only performed as well as other methods used for comparison, but also has the advantages of being: (i) monotonic in the score, (ii) computationally simple, and (iii) not dependent on assumptions about score distributions. We demonstrate the flexibility of our approach by using it to estimate FDRs and LFDRs for protein identification using summaries of the peptide spectra scores. We reconfirmed that several of these methods were superior to a two-peptide rule. Finally, by estimating both the FDRs and LFDRs, we showed that, for both peptide and protein identification, moderate FDR values (5%) corresponded to large LFDR values (53 and 60%).
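The decoy-database logic in the FDR abstract above can be sketched in a few lines. The scores and labels below are invented, and the cumulative-maximum step is only a crude stand-in for the paper's isotonic regression; in practice the inputs come from searching spectra against a concatenated real + randomized database.

```python
# Sketch of decoy-based FDR estimation with a monotonicity constraint.
# Any match to the randomized ("decoy") half of the database is known to
# be false, so the decoy count above a score threshold estimates the
# number of false matches among the target hits above that threshold.

def fdr_curve(matches):
    """matches: list of (score, is_decoy).  Returns [(score, fdr)] sorted
    by decreasing score, with the FDR forced to be monotone in the score."""
    ordered = sorted(matches, key=lambda m: -m[0])
    targets = decoys = 0
    raw = []
    for score, is_decoy in ordered:
        if is_decoy:
            decoys += 1
        else:
            targets += 1
        raw.append((score, decoys / max(targets, 1)))
    # Enforce monotonicity (a crude stand-in for isotonic regression):
    # the estimated FDR may not decrease as the score threshold is lowered.
    monotone, worst = [], 0.0
    for score, fdr in raw:
        worst = max(worst, fdr)
        monotone.append((score, min(worst, 1.0)))
    return monotone

# Invented example scores: True marks a decoy match.
matches = [(9.1, False), (8.7, False), (8.2, False), (7.9, True),
           (7.5, False), (7.0, True), (6.4, False), (6.0, True)]
for score, fdr in fdr_curve(matches):
    print(f"score >= {score:.1f}: estimated FDR = {fdr:.2f}")
```

The monotone curve makes threshold selection well defined: choosing the lowest score whose estimated FDR stays under, say, 5% gives an unambiguous cutoff, which is the practical point of requiring monotonicity in the score.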
[source] Physical and computational analysis of the yeast Kluyveromyces lactis secreted proteome PROTEINS: STRUCTURE, FUNCTION AND BIOINFORMATICS, Issue 13 2008 Catherine L. Swaim Abstract Secretion of proteins is the most common approach to protein expression in Kluyveromyces lactis. A proteomic analysis was performed on spent fermentation medium following bioreactor propagation of a wild-type industrial strain to identify proteins naturally secreted by K. lactis cells. Multidimensional separations were conducted, and RP online ESI-MS/MS analysis identified 81 secreted proteins. In addition, an in silico analysis predicted 178 K. lactis proteins to be secreted via the general secretory pathway (GSP). These two datasets were compared and approximately 70% of the K. lactis proteins detected in the culture medium possessed a GSP sequence. The detected proteins included those involved with cell wall structure and synthesis, carbohydrate metabolism, and proteolysis, a result that may have significant bearing on heterologous protein expression. Additionally, both the experimental and in silico datasets were compared to similar, previously published datasets for Candida albicans. With the methodology presented here, we provide the deepest penetration into a yeast secretome yet reported. [source] Commercial exploitation of new technologies arising from university research: start-ups and markets for technology R & D MANAGEMENT, Issue 4 2007 Fred Pries The creation of start-up firms is an important method of commercializing new technologies arising from R&D at universities and other research institutions. Most research into start-ups presumes that these firms develop products or services. However, start-ups may operate through markets for technology by selling or licensing rights to use their technology to other firms, typically established firms, which develop and sell new products or services based on the technology.
In this study of 57 public start-up firms created to commercialize the results of university research, we find evidence that (1) operating through markets for technology is a common approach to commercialization, (2) start-ups that operate in markets for technology can be effectively distinguished in practice from start-ups operating through product markets, and (3) there are substantive differences in the business activities of firms depending on whether they operate through product markets or markets for technology. [source] Cysteine-reactive covalent capture tags for enrichment of cysteine-containing peptides RAPID COMMUNICATIONS IN MASS SPECTROMETRY, Issue 21 2009 Priscille Giron Considering the tremendous complexity and the wide dynamic range of protein samples of biological origin and their proteolytic peptide mixtures, proteomics largely requires simplification strategies. One common approach to reduce sample complexity is to target a particular amino acid in proteins or peptides, such as cysteine (Cys), with chemical tags in order to reduce the analysis to a subset of the whole proteome. The present work describes the synthesis and the use of two new cysteinyl tags, so-called cysteine-reactive covalent capture tags (C3T), for the isolation of Cys-containing peptides. These bifunctional molecules were specifically designed to react with cysteines through iodoacetyl and acryloyl moieties and permit efficient selection of the tagged peptides. To do so, a thioproline was chosen as the isolating group to form, after a deprotection/activation step, a thiazolidine with an aldehyde resin by the covalent capture (CC) method. The applicability of the enrichment strategy was demonstrated on small synthetic peptides as well as on peptides derived from digested proteins. Mass spectrometric (MS) analysis and tandem mass spectrometric (MS/MS) sequencing confirmed the efficient and straightforward selection of the cysteine-containing peptides.
The combination of C3T and CC methods provides an effective alternative to reduce sample complexity and access low-abundance proteins. Copyright © 2009 John Wiley & Sons, Ltd. [source] Restoring Stream Ecosystems: Lessons from a Midwestern State RESTORATION ECOLOGY, Issue 3 2004 Ashley H. Moerke Abstract Reach-scale stream restorations are becoming a common approach to repair degraded streams, but the effectiveness of these projects is rarely evaluated or reported. We surveyed governmental, private, and nonprofit organizations in the state of Indiana to determine the frequency and nature of reach-scale stream restorations in this midwestern U.S. state. For 10 attempted restorations in Indiana, questionnaires and on-site assessments were used to better evaluate current designs for restoring stream ecosystems. At each restoration site, habitat and water quality were evaluated in restored and unrestored reaches. Our surveys identified commonalities across all restorations, including the type of restoration, project goals, structures installed, and level of monitoring conducted. In general, most restorations were described as stream-relocation projects that combined riparian and in-stream enhancements. Fewer than half of the restorations conducted pre- or post-restoration monitoring, and most monitoring involved evaluations of riparian vegetation rather than aquatic variables. On-site assessments revealed that restored reaches had significantly smaller stream widths and greater depths than did upstream unrestored reaches, but riparian canopy cover often was lower in restored than in unrestored reaches. This study provides basic information on midwestern restoration strategies, which is needed to identify strengths and weaknesses in current practices and to better inform future stream restorations. [source]