Robust Approach
Selected Abstracts

Mixed biodiversity benefits of agri-environment schemes in five European countries
ECOLOGY LETTERS, Issue 3 2006
D. Kleijn

Abstract Agri-environment schemes are an increasingly important tool for the maintenance and restoration of farmland biodiversity in Europe, but their ecological effects are poorly known. Scheme design is partly based on non-ecological considerations and poses important restrictions on evaluation studies. We describe a robust approach to evaluating agri-environment schemes and use it to evaluate the biodiversity effects of such schemes in five European countries. We compared species density of vascular plants, birds, bees, grasshoppers and crickets, and spiders on 202 paired fields, one with an agri-environment scheme, the other conventionally managed. In all countries, agri-environment schemes had marginal to moderately positive effects on biodiversity. However, uncommon species benefited in only two of five countries, and species listed in Red Data Books rarely benefited from agri-environment schemes. Scheme objectives may need to differentiate between the biodiversity of common species, which can be enhanced with relatively simple modifications in farming practices, and the diversity or abundance of endangered species, which require more elaborate conservation measures. [source]

Using propensity scores to estimate the effects of insecticides on stream invertebrates from observational data
ENVIRONMENTAL TOXICOLOGY & CHEMISTRY, Issue 7 2009
Lester L. Yuan

Abstract Analyses of observational data can provide insights into relationships between environmental conditions and biological responses across a broader range of natural conditions than experimental studies, potentially complementing insights gained from experiments. However, observational data must be analyzed carefully to minimize the likelihood that confounding variables bias observed relationships.
Propensity scores provide a robust approach for controlling for the effects of measured confounding variables when analyzing observational data. Here, we use propensity scores to estimate changes in mean invertebrate taxon richness in streams that have experienced insecticide concentrations exceeding aquatic life use benchmark concentrations. A simple comparison of richness in sites exposed to elevated insecticides with those that were not exposed suggests that exposed sites had, on average, 6.8 fewer taxa than unexposed sites. The presence of potential confounding variables makes it difficult to assert a causal relationship from this simple comparison. After controlling for confounding factors using propensity scores, the difference in richness between exposed and unexposed sites was reduced to 4.1 taxa, a difference that was still statistically significant. Because the propensity score analysis controlled for the effects of a wide variety of possible confounding variables, we infer that the change in richness observed in the propensity score analysis was likely caused by insecticide exposure. [source]

Determination of Transverse Dispersion Coefficients from Reactive Plume Lengths
GROUND WATER, Issue 2 2006
Olaf A. Cirpka

With most existing methods, transverse dispersion coefficients are difficult to determine. We present a new, simple, and robust approach based on steady-state transport of a reacting agent introduced over a certain height into the porous medium of interest. The agent reacts with compounds in the ambient water. In our application, we use an alkaline solution injected into acidic ambient water. Threshold values of pH are visualized by adding standard pH indicators. Since aqueous-phase acid-base reactions can be considered practically instantaneous and the only process leading to mixing of the reactants is transverse dispersion, the length of the plume is controlled by the ratio of transverse dispersion to advection.
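As a hedged, minimal sketch of the propensity-score idea summarized in the Yuan abstract above: fit a model of exposure probability given a confounder, then compare outcomes within strata of similar propensity. The simulated data, the logistic-regression fit, and all variable names below are illustrative assumptions, not the authors' actual analysis.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4000

# Simulated data: confounder x drives both "exposure" t and outcome y,
# so the naive exposed-vs-unexposed contrast is biased.
x = rng.normal(size=n)
p_true = 1.0 / (1.0 + np.exp(-1.5 * x))            # true propensity
t = rng.random(n) < p_true                          # exposure indicator
y = 2.0 * t + 3.0 * x + 0.5 * rng.normal(size=n)    # true effect = 2.0

naive = y[t].mean() - y[~t].mean()                  # confounded estimate

# Fit a logistic-regression propensity model by gradient ascent.
X = np.column_stack([np.ones(n), x])
beta = np.zeros(2)
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    beta += 0.5 * X.T @ (t - p) / n
p_hat = 1.0 / (1.0 + np.exp(-X @ beta))

# Stratify on the estimated propensity score and average the
# within-stratum exposed-minus-unexposed differences.
edges = np.quantile(p_hat, [0.2, 0.4, 0.6, 0.8])
stratum = np.digitize(p_hat, edges)
diffs = [y[(stratum == s) & t].mean() - y[(stratum == s) & ~t].mean()
         for s in range(5)]
adjusted = float(np.mean(diffs))

print(round(naive, 2), round(adjusted, 2))
```

With the confounder controlled, the adjusted difference sits much closer to the simulated effect of 2.0 than the naive contrast, mirroring the 6.8 → 4.1 taxa shift reported in the abstract.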
We use existing closed-form expressions for multidimensional steady-state transport of conservative compounds in order to evaluate the concentration distributions of the reacting compounds. Based on these results, we derive an easy-to-use expression for the length of the reactive plume: it is proportional to the injection height squared times the velocity, and inversely proportional to the transverse dispersion coefficient. Solving this expression for the transverse dispersion coefficient, we can estimate its value from the length of the alkaline plume. We apply the method to two experimental setups of different dimension. The computed transverse dispersion coefficients are rather small. We conclude that at slow but realistic ground water velocities, the contribution of effective molecular diffusion to transverse dispersion cannot be neglected. This results in plume lengths that increase with increasing velocity. [source]

Computational methods for optical molecular imaging
INTERNATIONAL JOURNAL FOR NUMERICAL METHODS IN BIOMEDICAL ENGINEERING, Issue 12 2009
Duan Chen

Abstract A new computational technique, the matched interface and boundary (MIB) method, is presented to model photon propagation in biological tissue for optical molecular imaging. Optical properties differ significantly between the organs of small animals, resulting in discontinuous coefficients in the diffusion equation model. The complex organ shapes of small animals also induce singularities in the geometric model. The MIB method is designed as a dimension-splitting approach that decomposes a multidimensional interface problem into one-dimensional ones. The methodology simplifies the topological relation near an interface and is able to handle discontinuous coefficients and complex interfaces with geometric singularities.
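The proportionality stated in the Cirpka abstract above can be written out explicitly; note that the dimensionless prefactor β is an assumption here, since the abstract gives only the proportionality, not the exact constant:

```latex
% h: injection height, v: seepage velocity, D_t: transverse dispersion
% coefficient, beta: assumed dimensionless prefactor.
L_{\mathrm{plume}} = \beta\,\frac{h^{2}\,v}{D_{t}}
\quad\Longrightarrow\quad
D_{t} = \beta\,\frac{h^{2}\,v}{L_{\mathrm{plume}}}
```

Solving for D_t in this way is what lets the measured plume length serve as the estimator described in the abstract.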
In the present MIB method, both the interface jump condition and the photon flux jump condition are rigorously enforced at the interface location by using only the lowest-order jump conditions. The solution near the interface is smoothly extended across the interface so that central finite difference schemes can be employed without loss of accuracy. A wide range of numerical experiments are carried out to validate the proposed MIB method. Second-order convergence is maintained in all benchmark problems, and fourth-order convergence is also demonstrated for some three-dimensional problems. The robustness of the proposed method with respect to the variable strength of the linear term of the diffusion equation is also examined. The performance of the present approach is compared with that of the standard finite element method. The numerical study indicates that the proposed method is a potentially efficient and robust approach for optical molecular imaging. Copyright © 2008 John Wiley & Sons, Ltd. [source]

A robust approach to the UAV task assignment problem
INTERNATIONAL JOURNAL OF ROBUST AND NONLINEAR CONTROL, Issue 2 2008
Mehdi Alighanbari

Abstract This paper presents a new robust approach to the task assignment of unmanned aerial vehicles (UAVs) operating in uncertain dynamic environments, for which the optimization data, such as target cost and target–UAV distances, are time varying and uncertain. The impact of this uncertainty in the data is mitigated by tightly integrating two approaches for improving the robustness of the assignment algorithm. One approach is to design task assignment plans that are robust to the uncertainty in the data, which reduces the sensitivity to errors in the situational awareness (SA) but can be overly conservative for long-duration plans.
A second approach is to replan as the SA is updated, which results in the best plan given the current information, but can lead to a churning type of instability if the updates are performed too rapidly. The strategy proposed in this paper combines robust planning with the techniques developed to eliminate churning. This combination results in the robust filter-embedded task assignment algorithm that uses both proactive techniques that hedge against the uncertainty, and reactive approaches that limit churning behavior by the vehicles. Numerous simulations are shown to demonstrate the performance benefits of this new algorithm. Copyright © 2007 John Wiley & Sons, Ltd. [source]

Spatiotemporal Correlation Between Phase Singularities and Wavebreaks During Ventricular Fibrillation
JOURNAL OF CARDIOVASCULAR ELECTROPHYSIOLOGY, Issue 10 2003
YEN-BIN LIU M.D.

Introduction: Phase maps and the detection of phase singularities (PSs) have become a well-developed method for characterizing the organization of ventricular fibrillation (VF). How precisely PS colocalizes with wavebreak (WB) during VF, however, is unknown. Methods and Results: We performed optical mapping of 27 episodes of VF in nine Langendorff-perfused rabbit hearts. A WB is a point where the activation wavefront and the repolarization waveback meet. A PS is a site where its phase is ambiguous and its neighboring pixels exhibit a continuous phase progression from −π to +π. The correlation coefficient between the number of WBs and PSs was 0.78 ± 0.09 for each heart and 0.81 for all VF episodes (P < 0.001), indicating a significant temporal correlation. We then superimposed the WBs and PSs for every 100 frames of each episode. These maps showed a high degree of spatial colocalization. To quantify spatial colocalization, the spatial shifts between the cumulative maps of WBs and PSs in corresponding frames were calculated by automatic alignment to obtain maximum overlap between these two maps.
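A minimal sketch of how a phase singularity can be located numerically, in the spirit of the Liu abstract above: a pixel is flagged as a PS when the wrapped phase summed around a small closed loop winds through ±2π. The synthetic phase field below is an illustrative assumption, not the authors' optical-mapping pipeline.

```python
import numpy as np

# Synthetic phase field with one singularity at (9.5, 9.5):
# phase = angle of the position vector relative to that center.
ny, nx = 20, 20
Y, X = np.mgrid[0:ny, 0:nx]
phase = np.angle((X - 9.5) + 1j * (Y - 9.5))

def wrap(d):
    """Wrap phase differences into (-pi, pi]."""
    return (d + np.pi) % (2.0 * np.pi) - np.pi

# Winding number around each 2x2 plaquette: sum the wrapped phase
# differences along the closed loop; a total of +/-2*pi marks a PS,
# while non-singular plaquettes sum to zero.
d1 = wrap(phase[:-1, 1:] - phase[:-1, :-1])   # along the top edge
d2 = wrap(phase[1:, 1:] - phase[:-1, 1:])     # down the right edge
d3 = wrap(phase[1:, :-1] - phase[1:, 1:])     # along the bottom edge
d4 = wrap(phase[:-1, :-1] - phase[1:, :-1])   # up the left edge
charge = d1 + d2 + d3 + d4

singularities = np.argwhere(np.abs(charge) > np.pi)
print(singularities)   # plaquette enclosing the phase singularity
```

Because the winding number is a topological quantity, the detection is insensitive to where the branch cut of the phase falls, which is what makes this loop-sum test robust in practice.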
The spatial shifts were 0.04 ± 0.31 mm on the x-axis and 0.06 ± 0.27 mm on the y-axis over a 20 × 20 mm² mapped field, indicating highly significant spatial correlation. Conclusion: Phase mapping provides a convenient and robust approach to quantitatively describe wave propagation and organization during VF. The close spatiotemporal correlation between PSs and WBs establishes that PSs are a valid alternate representation of WBs during VF and further validates the use of phase mapping in the study of VF dynamics. (J Cardiovasc Electrophysiol, Vol. 14, pp. 1103-1109, October 2003) [source]

A non-Gaussian generalization of the Airline model for robust seasonal adjustment
JOURNAL OF FORECASTING, Issue 5 2006
JOHN A. D. ASTON

Abstract In their seminal book Time Series Analysis: Forecasting and Control, Box and Jenkins (1976) introduce the Airline model, which is still routinely used for the modelling of economic seasonal time series. The Airline model is for a differenced time series (in levels and seasons) and constitutes a linear moving average of lagged Gaussian disturbances which depends on two coefficients and a fixed variance. In this paper a novel approach to seasonal adjustment is developed that is based on the Airline model and that accounts for outliers and breaks in time series. For this purpose we consider the canonical representation of the Airline model. It takes the model as a sum of trend, seasonal and irregular (unobserved) components which are uniquely identified as a result of the canonical decomposition. The resulting unobserved-components time series model is extended by components that allow for outliers and breaks. When all components depend on Gaussian disturbances, the model can be cast in state space form and the Kalman filter can compute the exact log-likelihood function. Related filtering and smoothing algorithms can be used to compute minimum mean squared error estimates of the unobserved components.
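For reference, the Airline model described in the Aston abstract above is the ARIMA(0,1,1)×(0,1,1)_s process (1−B)(1−B^s) y_t = (1+θB)(1+ΘB^s) ε_t, with the two coefficients θ and Θ. The pure-numpy sketch below simulates it and verifies the moving-average form of the doubly differenced series; the coefficient values are arbitrary illustrations, not estimates from any data.

```python
import numpy as np

rng = np.random.default_rng(1)
theta, Theta, s, n = 0.4, 0.6, 12, 240

# Airline model: (1 - B)(1 - B^s) y_t = (1 + theta*B)(1 + Theta*B^s) e_t.
# Build the doubly differenced series d_t directly from the MA form:
# d_t = e_t + theta*e_{t-1} + Theta*e_{t-s} + theta*Theta*e_{t-s-1}.
e = rng.normal(size=n + s + 1)
d = (e[s + 1:] + theta * e[s:-1]
     + Theta * e[1:n + 1] + theta * Theta * e[:n])

# Undo the double differencing to recover y_t (integration in levels
# and in seasons), starting from zero initial conditions.
y = np.zeros(n + s + 1)            # first s+1 values serve as initials
for i in range(n):
    j = i + s + 1
    y[j] = d[i] + y[j - 1] + y[j - s] - y[j - s - 1]

# Check: re-applying (1 - B)(1 - B^s) to y recovers d exactly.
lev = y[1:] - y[:-1]               # (1 - B) y
dd = lev[s:] - lev[:-s]            # then (1 - B^s)
print(np.allclose(dd, d))
```

In practice this model is usually fitted via a state-space/Kalman-filter routine (as the abstract notes) rather than simulated by hand; the sketch only makes the differencing structure concrete.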
However, the outlier and break components typically rely on heavy-tailed densities such as the t distribution or a mixture of normals. For this class of non-Gaussian models, Monte Carlo simulation techniques will be used for estimation, signal extraction and seasonal adjustment. This robust approach to seasonal adjustment allows outliers to be accounted for, while keeping the underlying structures that are currently used to aid reporting of economic time series data. Copyright © 2006 John Wiley & Sons, Ltd. [source]

Electric field controlled electrospray deposition for precise particle pattern and cell pattern formation
AICHE JOURNAL, Issue 10 2010
Jingwei Xie

Abstract Photolithography, soft lithography, and ink jetting have been used for automated micropattern fabrication. However, most methods for the microfabrication of surface patterns are limited to investigating the material properties of substrates, with high cost and complex procedures. In the present study, we show a simple (single-step) yet versatile and robust approach to generate biodegradable polymeric particle patterns on a substrate using electrospray deposition through a mask. Various particle patterns, including patterned dots, circles, squares, and bands, can be easily formed, and the features of the particle patterns can also be tailored using different masks and electrostatic focusing effects. Furthermore, cell patterns can be achieved on the surface of particle patterns by blocking the areas without particle deposition on the substrate and culturing cells on the substrate. The polymeric particle patterns and cell patterns developed in this study could be used in the high-throughput screening of sustained-release formulations, cell-based sensing, and drug discovery. In addition to experimental results, an analysis of the associated electric field is used to investigate quantitatively the nature of the focusing effect. Scaling analysis is also applied to obtain the dominant terms in the electrospray deposition process.
© 2010 American Institute of Chemical Engineers AIChE J, 2010 [source]

Communicating and judging the quality of qualitative research: the need for a new language
JOURNAL OF HUMAN NUTRITION & DIETETICS, Issue 3 2003
S. A. Fade

Abstract Background Traditionally, UK dietitians have tended to take a more quantitative approach to research. Qualitative research, which gives an in-depth view of people's experiences and beliefs, is also now being used to help answer some important dietetic research questions. Review A review of the limited number of qualitative research papers in the Journal of Human Nutrition and Dietetics 1990–2002 (nine papers in all) revealed a lack of specific discussion of the quality strategies commonly used in qualitative research. This could indicate a less than robust approach, but might also reflect a different perspective on quality, or simply the difficulties associated with disseminating qualitative research to a profession whose members lack familiarity with the language. The fact that qualitative research seems to be used rarely may also indicate a poor understanding of its role. Purpose of this paper This paper seeks to clarify the potential role of qualitative research and draws on previously published guidelines for demonstrating quality. It is hoped that this will offer dietitians a framework for carrying out qualitative research and a language for reporting it, as well as acting as a stimulus for discussion. [source]

Child Maltreatment in Diverse Households: Challenges to Law, Theory, and Practice
JOURNAL OF LAW AND SOCIETY, Issue 1 2008
Julia Brophy

In the United Kingdom relatively little attention has been paid to 'race' and racism and the role of cultural, religious, and linguistic diversity in care proceedings. This paper will look at the background, law, and guidance on diversity in this field and explore the impact of notions of diversity in evidence before the courts.
It will look at their relevance in allegations of 'significant harm' to children and failures of parenting, and the coverage of diverse backgrounds in expert reports and parents' statements. It will argue that while there is no evidence that the threshold criteria for a care order require reassessing, there is room for considerable improvement in attention to issues of diversity in evidence and in the experiences of parents attending court. The paper will explore the implications of the studies for theorizing law and the duties of the state, and look at notions such as cultural relativism and concepts imported from cultural anthropology for determining culturally acceptable parenting. It will highlight problems with these approaches and demonstrate why 'paradigms of intersectionality' is a more useful and robust approach. [source]

Cerebral blood flow and oxygen metabolism measured with the Kety–Schmidt method using nitrous oxide
ACTA ANAESTHESIOLOGICA SCANDINAVICA, Issue 2 2009
S. TAUDORF

Background: The Kety–Schmidt method is the reference method for measuring global cerebral blood flow (CBF), cerebral metabolic rates (CMR) and flux, especially where scanners are unavailable or impractical. Our primary objective was to assess the repeatability of the Kety–Schmidt method in a variety of different approaches using inhaled nitrous oxide (N2O) as the tracer, combined with photoacoustic spectrometry. A secondary objective was to assess the impact of this tracer on the systemic vascular concentration of nitrite (NO2−). Methods: Twenty-nine healthy male volunteers underwent 61 CBF measurements by breathing a normoxic gas mixture containing 5% N2O until tension equilibrium. Paired blood samples were collected from an arterial and a jugular bulb catheter in the saturation or desaturation phase, by continuous or discontinuous sampling. N2O concentration was measured with photoacoustic spectrometry after equilibration of blood samples with air.
CBF was calculated by the Kety–Schmidt equation. The CMR of oxygen (CMRO2) was determined by the Fick principle. NO2− in plasma and red blood cells (RBC) was measured by ozone-based chemiluminescence. Results: The most robust approach for CBF measurement was achieved by discontinuous sampling in the desaturation phase [CBF 64 (95% confidence interval, 59–71) ml/100 g/min; CMRO2 1.8 (1.7–2.0) µmol/g/min]. The tracer did not influence plasma or RBC NO2− (P > 0.05 vs. baseline). Conclusion: These findings confirm the reliability and robustness of the Kety–Schmidt method using inhaled N2O for the measurement of global CBF and CMR. At the low tracer concentration used, altered NO metabolism is unlikely to have affected cerebral haemodynamic function. [source]

Modelling cross-hybridization on phylogenetic DNA microarrays increases the detection power of closely related species
MOLECULAR ECOLOGY RESOURCES, Issue 1 2009
JULIA C. ENGELMANN

Abstract DNA microarrays are a popular technique for the detection of microorganisms. Several approaches using specific oligomers targeting one or a few marker genes for each species have been proposed. Data analysis is usually limited to calling a species present when its oligomer exceeds a certain intensity threshold. While this strategy works reasonably well for distantly related species, it does not work well for very closely related species: cross-hybridization of non-target DNA prevents a simple identification based on signal intensity. The majority of species of the same genus have a sequence similarity of over 90%. For biodiversity studies down to the species level, it is therefore important to increase the detection power for closely related species. We propose a simple, cost-effective and robust approach for biodiversity studies using DNA microarray technology and demonstrate it on scenedesmacean green algae.
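A schematic sketch of the two calculations named in the Taudorf abstract above: the Kety–Schmidt integral for CBF and the Fick principle for CMRO2. The synthetic saturation curves, the partition coefficient, and the arteriovenous O2 difference are all illustrative assumptions, not the study's data.

```python
import numpy as np

# Synthetic N2O saturation curves (arbitrary units): arterial tension
# held constant, jugular-venous tension approaching it exponentially.
t = np.linspace(0.0, 10.0, 2001)      # minutes
k = 0.5                               # assumed equilibration rate, 1/min
ca = np.ones_like(t)                  # arterial tracer concentration
cv = 1.0 - np.exp(-k * t)             # jugular-venous concentration

# Kety-Schmidt: CBF = 100 * lambda * Cv(T) / integral of (Ca - Cv) dt,
# in ml/100 g/min, with lambda the blood/brain partition coefficient.
lam = 1.0                             # assumed partition coefficient
f = ca - cv                           # arteriovenous tracer difference
integral = float(np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(t)))  # trapezoid
cbf = 100.0 * lam * cv[-1] / integral

# Fick principle: CMRO2 = CBF * arteriovenous O2 content difference.
avdo2 = 2.8                           # assumed A-V O2 difference, umol/ml
cmro2 = cbf / 100.0 * avdo2           # umol/g/min

print(round(cbf, 1), round(cmro2, 2))
```

For the exponential curves assumed here the ratio Cv(T)/∫(Ca−Cv)dt equals k analytically, so the sketch yields CBF = 100·λ·k = 50 ml/100 g/min, in the same range as the 64 ml/100 g/min reported in the abstract.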
The internal transcribed spacer 2 (ITS2) rDNA sequence was chosen as the marker because it is suitable for distinguishing all eukaryotic species, even though parts of it are virtually identical in closely related species. We show that by modelling hybridization behaviour with a matrix algebra approach, we are able to identify closely related species that cannot be distinguished with a threshold on signal intensity. Thus this proof-of-concept study shows that by adding a simple and robust data analysis step to the evaluation of DNA microarrays, species detection can be significantly improved for closely related species with a high sequence similarity. [source]

A robust approach for assessing misclassification rates under the two-component measurement error model
APPLIED STOCHASTIC MODELS IN BUSINESS AND INDUSTRY, Issue 4 2010
Daniela Cocchi

Abstract The majority of actions designed to improve processes and quality include the assessment of the capability of a measurement system. The statistical model relating the measured value to the true, but not observable, value of a product characteristic is usually Gaussian and additive. In this paper we propose to extend this model to a more general formulation by introducing the structure of the two-component error model. An approximated method for evaluating the misclassification rates under the two-component error model is proposed and assessed. Copyright © 2009 John Wiley & Sons, Ltd. [source]

Robust Isolation Of Sensor Failures
ASIAN JOURNAL OF CONTROL, Issue 1 2003
R. Xu

ABSTRACT Sensor self-validity check is a critical step in system control and fault diagnostics. In this paper, a robust approach to isolating sensor failures is proposed. First, a residual model for a given system is built off-line, directly from input-output measurement data. The residual model outputs are called "primary residuals" and are zero when there is no fault.
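A toy illustration of why a matrix model beats a raw intensity threshold for closely related species, in the spirit of the Engelmann abstract above; the hybridization matrix, concentrations, and thresholds are invented for illustration only.

```python
import numpy as np

# Rows = probes, columns = species. The off-diagonal entries model
# cross-hybridization between two closely related species.
H = np.array([[1.0, 0.9],
              [0.9, 1.0]])

conc = np.array([1.0, 0.0])      # only species A is present
signal = H @ conc                # observed probe intensities: [1.0, 0.9]

# Naive analysis: an intensity threshold calls BOTH species present,
# because cross-hybridization lights up the probe for species B too.
threshold_calls = signal > 0.5   # -> [True, True], false positive for B

# Matrix approach: invert the linear hybridization model signal = H @ c
# to recover the underlying concentrations.
conc_hat = np.linalg.solve(H, signal)
present = conc_hat > 0.1         # -> only species A is called present

print(threshold_calls, np.round(conc_hat, 6), present)
```

With noisy real data one would fit the model in a least-squares sense (and constrain concentrations to be non-negative) rather than invert exactly, but the noise-free case already shows how the cross-hybridization term is explained away.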
Most conventional approaches to residual model generation are indirect, as they first require the determination of state-space or other models using standard system identification algorithms. Second, a new max-min design of structured residuals, which can maximize the sensitivity of the structured residuals with respect to sensor failures, is proposed. Based on the structured residuals, one can then isolate the sensor failures. This design can also be done in an off-line manner. It is an optimization procedure that avoids locally optimal solutions. Simulation and experimental results demonstrate the effectiveness of the proposed method. [source]

Salt tolerant membrane adsorbers for robust impurity clearance
BIOTECHNOLOGY PROGRESS, Issue 6 2009
William T. Riordan

Abstract Clearance of impurities such as viruses, host cell protein (HCP), and DNA is a critical purification design consideration for the manufacture of monoclonal antibody therapeutics. Anion exchange chromatography has frequently been utilized to accomplish this goal; however, anion exchange adsorbents based on the traditional quaternary amine (Q) ligand are sensitive to salt concentration, leading to reduced clearance of impurities at moderate salt concentrations (50–150 mM). In this report, membrane adsorbers incorporating four alternative salt tolerant anion exchange ligands were examined for impurity clearance: agmatine, tris-2-aminoethyl amine, polyhexamethylene biguanide (PHMB), and polyethyleneimine. Each of these ligands provided greater than 5 log reduction value (LRV) for viral clearance of phage ΦX174 (pI ≈ 6.7) at pH 7.5 and phage PR772 (pI ≈ 4) at pH 4.2 in the presence of salt. Under these same conditions, the commercial Q membrane adsorber provided no clearance (zero LRV). Clearance of host-cell protein at pH 7.5 was the most challenging test case; only PHMB maintained 1.5 LRV in 150 mM salt.
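For reference, the log reduction value quoted throughout the Riordan abstract above is the base-10 logarithm of the feed-to-filtrate concentration ratio; the titers in the example below are made-up numbers, not data from the study.

```python
import math

def log_reduction_value(feed_titer: float, filtrate_titer: float) -> float:
    """LRV = log10(feed / filtrate); each unit is a 10-fold clearance."""
    return math.log10(feed_titer / filtrate_titer)

# Hypothetical phage challenge: 1e7 pfu/ml in, 1e2 pfu/ml out.
lrv = log_reduction_value(1e7, 1e2)
print(lrv)  # 5.0
```

An LRV of 5 thus corresponds to removing 99.999% of the challenge phage, which is why "greater than 5 LRV" is a meaningful viral-clearance claim.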
The salt tolerance of PHMB was attributed to its large positive net charge through the presence of multiple biguanide groups that participated in electrostatic and hydrogen bonding interactions with the impurity molecules. On the basis of the results of this study, membrane adsorbers that incorporate salt tolerant anion exchange ligands provide a robust approach to impurity clearance during the purification of monoclonal antibodies. © 2009 American Institute of Chemical Engineers Biotechnol. Prog., 2009 [source]

Dealing with uncertainty: adaptive approaches to sustainable river management
AQUATIC CONSERVATION: MARINE AND FRESHWATER ECOSYSTEMS, Issue 4 2002
M.J. CLARK
Article first published online: 5 AUG 200

Abstract
1. Sustainable river management is the proclaimed aim of many agencies and institutions, but it remains challenging to bring this worthy ideal from the level of political rhetoric to that of practical river management.
2. Amongst the many drivers that already pressure the river manager, from internal institutional goals, through political aspirations, to systemic change within the biophysical process system, one common element emerges: that of prevailing uncertainty.
3. Once it has been accepted that conventional science and engineering approaches to uncertainty (risk) minimization may be sub-optimal in a truly holistic (biophysical, socio-economic, political) system, the challenge emerges of developing a more appropriate framework without destroying over-burdened managers and management systems in the process.
4. It is argued that the necessary components are often already in place or under consideration. A linked model is proposed comprising practical measures of sustainability, robust approaches to uncertainty (if necessary, involving attitude change), responsive (adaptive) management frameworks, and an important underpinning of fuzzy decision support. Copyright © 2002 John Wiley & Sons, Ltd. [source]