Several Approaches
Selected Abstracts

On coordination and its significance to distributed and multi-agent systems
CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 4 2006
Sascha Ossowski

Coordination is one of those words: it appears in most scientific and social fields, in politics, in warfare, and it is even the subject of sports talk. While the word may convey different ideas to different people, its definition is quite similar across fields: it relates to the control, planning, and execution of activities performed by distributed (and perhaps independent) actors. Computer scientists working on distributed systems and agents focus on the distribution aspect of this concept. They see coordination as a field of its own, one that complements standard fields such as those mentioned above. This paper explains the term coordination in relation to distributed and multi-agent systems. Several approaches to coordination are described and put in perspective. The paper finishes with a look at what we call emergent coordination and its potential for efficiently handling coordination in open environments. Copyright © 2005 John Wiley & Sons, Ltd.

[source]

Technology and innovation in the psychosocial treatment of methamphetamine use, risk and dependence
DRUG AND ALCOHOL REVIEW, Issue 3 2008
FRANCES J. KAY-LAMBKIN

Issues. The dramatic increase in methamphetamine use has created an urgent need to develop and disseminate high-quality, effective treatments and management strategies for methamphetamine use problems. Although some evidence supports psychological, pharmacological and other approaches to treating problematic methamphetamine use, other evidence suggests that many methamphetamine users do not access these treatment options because of a range of individual and service-level barriers. Approach.
A review of the available research literature was undertaken to identify treatment strategies for methamphetamine users that overcome the problems associated with treatment access for this important target group and that involve technological and other innovative approaches. Key Findings. Several approaches to addressing problematic methamphetamine use have been suggested, including assertive engagement strategies, flexibility in the provision of treatment, retention strategies, and multi-focused intervention packages such as stepped care, perhaps including new technologies as alternatives or supplements to face-to-face treatment. No research currently examines the possible benefit of these strategies for people with methamphetamine use problems. Implications. Stepped-care intervention packages have the potential to address many of the current challenges faced by both clinicians and clients in treating methamphetamine use problems. Conclusions. Although promising, these approaches require further attention and research effort, particularly among this specific group of users.

[source]

What can otolith examination tell us about the level of perturbations of Salmonid fish from the Kerguelen Islands?
ECOLOGY OF FRESHWATER FISH, Issue 4 2008
F. Morat

Otoliths preserve a continuous record of the life cycle from the natal through the adult stage. For that reason, the morphological and chemical characteristics of the otoliths of two nonnative salmonids, brown trout (Salmo trutta) and brook charr (Salvelinus fontinalis), from populations on the Kerguelen Islands were compared. Several approaches were used to study the relationships between otolith morphometry, crystal morph and chemical elemental composition. The species sampled in Kerguelen are well differentiated by their otolith shape.
The results indicate that ecotypes and river populations can be reasonably well differentiated on the basis of otolith shape. The crystallisation study revealed the presence of a particular form, vaterite, at a high rate: in 45% of S. fontinalis and 18% of Salmo trutta fario otoliths. Moreover, vaterite and aragonite otoliths differed in chemical composition.

[source]

Comprehensive proteome analysis by chromatographic protein prefractionation
ELECTROPHORESIS, Issue 7-8 2004
Pierre Lescuyer

Protein copy number spans 7 to 8 orders of magnitude in cells and probably up to 12 orders of magnitude in plasma. Classical silver-stained two-dimensional electrophoresis (2-DE) can only display up to four orders of magnitude. This is a major drawback, since most regulatory proteins are assumed to be low-abundance gene products. It is thus clear that separating low-copy-number proteins in amounts sufficient for post-separation analysis is an important issue in proteome studies aiming at a comprehensive description of the proteome of any given cell type. Whether a polypeptide is visible on a 2-DE gel depends on its copy number, on the quantity loaded onto the gel and on the method of detection. As the amount of protein that can be loaded onto a gel is limited, one efficient solution is to fractionate the sample prior to 2-DE analysis. Several approaches exist, including subcellular fractionation, affinity purification, and chromatographic and electrophoretic protein prefractionation. The chromatographic step adds a new dimension to the protein separation by exploiting specific protein properties: proteins are adsorbed to a surface and eluted differentially under controlled conditions. This review article presents studies combining chromatography-based methods with 2-DE analysis and draws general conclusions on this strategy.
[source]

Design of an estimator of the kinematics of AC contactors
EUROPEAN TRANSACTIONS ON ELECTRICAL POWER, Issue 7 2009
Jordi-Roger Riba Ruiz

This paper develops an estimator of the kinematics of the movable parts of any AC-powered contactor. The estimator uses easily measurable electrical variables, such as the voltage across the coil terminals and the current flowing through the main coil of the contactor. Hence, a low-cost microcontroller could implement a control algorithm to reduce the undesirable phenomenon of contact bounce, which causes severe erosion of the contacts and dramatically reduces their electrical life and reliability. To develop such an estimator, it is essential to have a robust model of the contactor. Therefore, a rigorous parametric model that predicts the dynamic response of the AC contactor is proposed. It solves the coupled mechanical and electromagnetic differential equations that govern the dynamics of the contactor by applying a Runge-Kutta-based solver. Several approaches have been described in the technical literature; most are based on computationally expensive finite-element methods or on simplified parametric models. The parametric model presented here takes into account the fringing flux and treats the shading-ring interaction from a general point of view, thus avoiding simplifying assumptions. Copyright © 2008 John Wiley & Sons, Ltd.

[source]

Basis functions for the consistent and accurate representation of surface mass loading
GEOPHYSICAL JOURNAL INTERNATIONAL, Issue 1 2007
Peter J. Clarke

Inversion of geodetic site displacement data to infer surface mass loads has previously been demonstrated using a spherical harmonic representation of the load.
This method suffers from the continent-rich, ocean-poor distribution of geodetic data, coupled with the predominance of the continental load (water storage and atmospheric pressure) over the ocean-bottom pressure (including the inverse barometer response). Finer-scale inversion becomes unstable because the rapidly increasing number of parameters is poorly constrained by the data geometry. Several approaches have previously been tried to mitigate this, including the adoption of constraints over the oceanic domain derived from ocean circulation models, the use of smoothness constraints for the oceanic load, and the incorporation of GRACE gravity field data. However, these methods do not treat mass conservation or the ocean's equilibrium-tide response to the total gravitational field appropriately. Instead, we propose a modified set of basis functions as an alternative to standard spherical harmonics. Our basis functions allow variability of the load over continental regions, but impose global mass conservation and equilibrium tidal behaviour of the oceans. We test our basis functions first for the efficiency of fitting realistic modelled surface loads, and then for the accuracy of the inferred load compared with the known model load, using synthetic geodetic displacements with real GPS network geometry. Compared to standard spherical harmonics, our basis functions yield a better fit to the model loads over the period 1997-2005 for an equivalent number of parameters, and provide a more accurate and stable fit to the synthetic geodetic displacements. In particular, recovery of the low-degree coefficients is greatly improved. Using a nine-parameter fit, we are able to model 58 per cent of the variance in the synthetic degree-1 zonal coefficient time-series, 38-41 per cent of the degree-1 non-zonal coefficients, and 80 per cent of the degree-2 zonal coefficient.
An equivalent spherical harmonic estimate truncated at degree 2 models the degree-1 zonal coefficient similarly well (56 per cent of variance), but models only 59 per cent of the degree-2 zonal coefficient variance and is unable to model the degree-1 non-zonal coefficients.

[source]

Approaches for derivation of environmental quality criteria for substances applied in risk assessment of discharges from offshore drilling operations
INTEGRATED ENVIRONMENTAL ASSESSMENT AND MANAGEMENT, Issue 2 2008
Dag Altin

In order to achieve the offshore petroleum industry's "zero harm" goal for the environment, the environmental impact factor for drilling discharges was developed as a tool to identify and quantify the environmental risks associated with disposal of drilling discharges to the marine environment. As an initial step in this work, the main categories of substances associated with drilling discharges and assumed to contribute to toxic or nontoxic stress were identified and evaluated for inclusion in the risk assessment. The selection was based on the known toxicological properties of the substances, or on the total amount discharged together with their potential to accumulate in the water column or sediments to levels that could be expected to cause toxic or nontoxic stress to the biota. Based on these criteria, three categories of chemicals were identified for risk assessment of the water column and sediments: natural organic substances, metals, and drilling fluid chemicals. Several approaches for deriving environmentally safe threshold concentrations, as predicted no-effect concentrations, were evaluated in the process. For the water column, consensus was reached on using the species sensitivity distribution approach for metals and the assessment factor approach for natural organic substances and added drilling chemicals. For the sediments, the equilibrium partitioning approach was selected for all three categories of chemicals.
The theoretically derived sediment quality criteria were compared to field-derived threshold effect values based on statistical approaches applied to sediment monitoring data from the Norwegian Continental Shelf. The basis for deriving predicted no-effect concentration values for drilling discharges should be consistent with the principles of environmental risk assessment described in the Technical Guidance Document on Risk Assessment issued by the European Union.

[source]

Animal use replacement, reduction, and refinement: Development of an integrated testing strategy for bioconcentration of chemicals in fish
INTEGRATED ENVIRONMENTAL ASSESSMENT AND MANAGEMENT, Issue 1 2007
Watze de Wolf

When addressing the use of fish in assessing the environmental safety of chemicals and effluents, there are many opportunities for applying the principles of the 3Rs: Reduce, Refine, and Replace. The current environmental regulatory testing strategy for bioconcentration and secondary poisoning is reviewed, and alternative approaches that provide useful information are described. Several approaches can reduce the number of fish used in the Organization for Economic Cooperation and Development (OECD) Test Guideline 305, including alternative in vivo test methods such as the dietary accumulation test and the static exposure approach. The best replacement approach would seem to combine read-across, chemical grouping, and quantitative structure-activity relationships with an assessment of the key processes in bioconcentration: adsorption, distribution, metabolism, and excretion. Biomimetic extraction is particularly useful for addressing bioavailable chemicals and is in some circumstances capable of predicting uptake. The use of alternative organisms such as invertebrates should also be considered. A single cut-off value for molecular weight and size beyond which no absorption will take place cannot be identified.
Recommendations for their use in bioaccumulation (B) categorization schemes are provided. Assessment of biotransformation with in vitro assays and in silico approaches holds significant promise. Further research is needed to identify their variability and confidence limits, and ways to use these as a basis for estimating bioconcentration factors. A tiered bioconcentration testing strategy is developed that takes account of the alternatives discussed.

[source]

Alternatives to pilot plant experiments in cheese-ripening studies
INTERNATIONAL JOURNAL OF DAIRY TECHNOLOGY, Issue 4 2001
Shakeel-ur-rehman

Experimental studies on cheese have several objectives, from assessing the influence of the microflora and enzymes indigenous to milk to evaluating starters and adjuncts. Several studies have been undertaken to evaluate the influence of an individual ripening agent in the complex environment of cheese. Cheesemaking experiments, even on a pilot scale, are expensive and time-consuming, and when controlled bacteriological conditions are needed, pilot plant experiments are difficult to perform. Cheese curd slurries are simple models that can be prepared under sterile conditions in the laboratory and can be used as an intermediate between test tubes and cheese trials, but they probably cannot replace the latter. Miniature model cheeses are similar to pilot plant cheeses and can be manufactured under sterile conditions. Several approaches to assessing the role of cheese-ripening agents are reviewed in this paper.

[source]

Rural Poverty and Development Strategies in Latin America
JOURNAL OF AGRARIAN CHANGE, Issue 4 2006
CRISTÓBAL KAY

Several approaches to the study of poverty are discussed, to learn from their strengths as well as their weaknesses. For this purpose the concepts of marginality, social exclusion, new rurality and rural livelihoods, as well as the ethnic and gender dimensions of poverty, are examined.
The debate on the peasantization (capitalization) or proletarianization (pauperization) of the peasantry sets the scene for the analysis of the different strategies adopted by peasants and rural labourers to secure their survival and perhaps achieve some prosperity. In examining the success or failure of interventions by governments, civil society and international organizations in the reduction of poverty, it is claimed that the State has a key role to play. Furthermore, it is argued that poverty is caused and reproduced by the unequal distribution of resources and power at the household, local, national and international levels. Therefore, the starting point for the eradication of poverty has to be the implementation of a development strategy that addresses such inequalities while at the same time achieving competitiveness within the global system.

[source]

Protein folding in the post-genomic era
JOURNAL OF CELLULAR AND MOLECULAR MEDICINE, Issue 3 2002
Jeannine M. Yon

Protein folding is a topic of fundamental interest since it concerns the mechanisms by which the genetic message is translated into the three-dimensional and functional structure of proteins. In these post-genomic times, knowledge of these fundamental principles is required to exploit the information contained in the increasing number of sequenced genomes. Protein folding also has practical applications in understanding different pathologies and in developing novel therapeutics to prevent diseases associated with protein misfolding and aggregation. Significant advances have been made, ranging from the Anfinsen postulate to the "new view", which describes the folding process in terms of an energy landscape. These new insights arise from both theoretical and experimental studies. The problem of folding in the cellular environment is briefly discussed.
The modern view of misfolding and aggregation processes involved in pathologies such as prion and Alzheimer's diseases is presented. Several approaches to structure prediction, a very active field of research, are described.

[source]

Characterizing, Propagating, and Analyzing Uncertainty in Life-Cycle Assessment: A Survey of Quantitative Approaches
JOURNAL OF INDUSTRIAL ECOLOGY, Issue 1 2007
Shannon M. Lloyd

Life-cycle assessment (LCA) practitioners build models to quantify resource consumption, environmental releases, and potential environmental and human health impacts of product systems. Most often, practitioners define a model structure, assign a single value to each parameter, and build deterministic models to approximate environmental outcomes. This approach fails to capture the variability and uncertainty inherent in LCA. To make good decisions, decision makers need to understand the uncertainty in, and divergence between, LCA outcomes for different product systems. Several approaches for conducting LCA under uncertainty have been proposed and implemented. For example, Monte Carlo simulation and fuzzy set theory have been applied in a limited number of LCA studies. These approaches are well understood and generally accepted in quantitative decision analysis, but they do not guarantee reliable outcomes. A survey of approaches used to incorporate quantitative uncertainty analysis into LCA is presented. The suitability of each approach for providing reliable outcomes and enabling better decisions is discussed, approaches that may lead to overconfident or unreliable results are identified, and guidance for improving uncertainty analysis in LCA is provided.

[source]

Stenting of Bifurcation Lesions: A Rational Approach
JOURNAL OF INTERVENTIONAL CARDIOLOGY, Issue 6 2001
THIERRY LEFÈVRE, M.D., FSCAI

The occurrence of stenosis in or next to coronary bifurcations is relatively frequent and generally underestimated.
In our experience, such lesions account for 15-18% of all percutaneous coronary interventions (PCI). The main reasons for this are that (1) the coronary arteries are like the branches of a tree, with many ramifications, and (2) because of axial plaque redistribution, especially after stent implantation, PCI of lesions located next to a coronary bifurcation almost inevitably causes plaque shifting into the side branches. PCI treatment of coronary bifurcation lesions remains challenging. Balloon dilatation used to be associated with less than satisfactory immediate results, a high complication rate, and an unacceptable restenosis rate. The kissing balloon technique resulted in improved, though suboptimal, outcomes. Several approaches were then suggested, such as rotational or directional atherectomy, but these techniques did not translate into significantly enhanced results. With the advent of second-generation stents in 1996, the authors set up an observational study of coronary bifurcation stenting combined with a bench test of the various stents available. Over the last 5 years, techniques, strategies, and stent design have improved. As a result, the authors have been able to define a rational approach to coronary bifurcation stenting. The bench study analyzed the behavior of stents and allowed stents that are not compatible with the treatment of coronary bifurcations to be discarded. Most importantly, the study revealed that stent deformation due to the opening of a strut is a constant phenomenon that must be corrected by kissing balloon inflation. Moreover, it was observed that opening a stent strut into a side branch can permit stenting, at least partly, of the side-branch ostium. This led to the provocative concept of "stenting both branches with a single stent."
Therefore, a simple approach is currently implemented in the majority of cases: stenting of the main branch with provisional stenting of the side branch. The technique consists of inserting a guidewire in each coronary branch. A stent is then positioned in the main branch, with a wire "jailed" in the side branch. The wires are then exchanged, starting with the main-branch wire, which is passed through the stent struts into the side branch. After opening the stent struts in the side branch, kissing balloon inflation is performed. A second stent is deployed in the side branch only if the result is suboptimal. Over the last 2 years, this technique has been associated with a 98% angiographic success rate in both branches. Two stents are used in 30-35% of cases and final kissing balloon inflation is performed in >95% of cases. The in-hospital major adverse cardiac event (MACE) rate is around 5% and 7-month target vessel revascularization (TVR) is 13%. Several stents specifically designed for coronary bifurcation lesions are currently being investigated, with the objective of simplifying the approach for all users. In the near future, the use of drug-eluting stents should reduce the risk of restenosis.

[source]

Physical and chemical considerations of damage induced in protein crystals by synchrotron radiation: a radiation chemical perspective
JOURNAL OF SYNCHROTRON RADIATION, Issue 6 2002
Peter O'Neill

Radiation-induced degradation of protein or DNA samples by synchrotron radiation is an inherent problem in X-ray crystallography, especially at the "brighter" light sources. This short review gives a radiation chemical perspective on some of the physical and chemical processes that need to be considered in understanding the pathways leading to the gradual degradation of the samples.
Under the conditions used for X-ray crystallography, at temperatures below 100 K in the presence of cryoprotectant agents, the majority of radiation damage to protein samples arises from direct ionization of the amino acid residues and their associated water molecules. Some of the chemical processes that may occur at these protein centres, such as bond scission, are discussed. Several approaches that may reduce radiation damage are discussed, using agents known from radiation chemistry to minimize radical-induced degradation of the sample.

[source]

High-power RF photodiodes and their applications
LASER & PHOTONICS REVIEWS, Issue 1-2 2009
T. Nagatsuma

There has been increasing interest in the photonic generation of RF signals in the millimeter-wave (30 GHz to 300 GHz) and terahertz-wave (0.1 THz to 10 THz) regions, in which photodiodes play a key role. This paper reviews recent progress in high-power RF photodiodes, such as uni-traveling-carrier photodiodes (UTC-PDs), that operate at these frequencies. Several approaches to increasing both the bandwidth and the output power of photodiodes are discussed, and promising applications to broadband wireless communications and spectroscopic sensing are described.

[source]

Linear and Nonlinear Viscoelasticity of a Model Unentangled Polymer Melt: Molecular Dynamics and Rouse Modes Analysis
MACROMOLECULAR THEORY AND SIMULATIONS, Issue 3 2006
Mihail Vladkov

Using molecular dynamics simulations, we determine the linear and nonlinear viscoelastic properties of a model polymer melt in the unentangled regime. Several approaches are compared for the computation of linear moduli and viscosity, including Green-Kubo and nonequilibrium molecular dynamics (NEMD). An alternative approach, based on the use of the Rouse modes, is also discussed. This approach could be used to assess local viscoelastic properties in inhomogeneous systems.
We also examine the contributions of different interactions to the viscoelastic moduli and explain the microscopic mechanisms involved in the mechanical response of the melt to external stress.

[source]

One technique, two approaches, and results: Thoracic duct cannulation in small laboratory animals
MICROSURGERY, Issue 3 2003
Mihai Ionac, M.D., Ph.D.

Experimental studies in immunology, pharmacology, or hematology require the sampling of thoracic duct lymph in awake and unrestrained rats or mice. Several approaches have been described for cannulation of the thoracic duct, but they are characterized by modest reproducibility and a low lymph flow rate. An improved technique for obtaining thoracic duct lymph is described here, emphasizing the similarities and differences between rats and mice (average weights of 305 and 15 g, respectively). Rats yielded a mean of 55.6 ml/day of thoracic duct lymph, while lymph output in mice reached unexpected volumes of 29.3 ml/day. The use of an operating microscope and a silicone cannula, and the maintenance of the animals' mobility during lymph collection, offer a reliable method for a high and constant output of thoracic duct lymph. Relevant aspects of murine thoracic duct anatomy are also identified. © 2003 Wiley-Liss, Inc. MICROSURGERY 23:239-245, 2003.

[source]

Modelling cross-hybridization on phylogenetic DNA microarrays increases the detection power of closely related species
MOLECULAR ECOLOGY RESOURCES, Issue 1 2009
JULIA C. ENGELMANN

DNA microarrays are a popular technique for the detection of microorganisms. Several approaches using specific oligomers targeting one or a few marker genes for each species have been proposed. Data analysis is usually limited to calling a species present when its oligomer exceeds a certain intensity threshold.
While this strategy works reasonably well for distantly related species, it does not work well for very closely related species: cross-hybridization of nontarget DNA prevents a simple identification based on signal intensity. The majority of species of the same genus have a sequence similarity of over 90%. For biodiversity studies down to the species level, it is therefore important to increase the detection power for closely related species. We propose a simple, cost-effective and robust approach for biodiversity studies using DNA microarray technology and demonstrate it on scenedesmacean green algae. The internal transcribed spacer 2 (ITS2) rDNA sequence was chosen as the marker because it can distinguish all eukaryotic species, even though parts of it are virtually identical in closely related species. We show that by modelling hybridization behaviour with a matrix-algebra approach, we are able to identify closely related species that cannot be distinguished by a threshold on signal intensity. This proof-of-concept study thus shows that by adding a simple and robust data-analysis step to the evaluation of DNA microarrays, species detection can be significantly improved for closely related species with high sequence similarity.

[source]

Feasibility of Pulmonary Vein Ostia Radiofrequency Ablation in Patients with Atrial Fibrillation: A Multicenter Study (CACAF Pilot Study)
PACING AND CLINICAL ELECTROPHYSIOLOGY, Issue 1p2 2003
GIUSEPPE STABILE

Radiofrequency (RF) catheter ablation has been proposed as a treatment for atrial fibrillation (AF). Several approaches have been reported, with success rates dependent on procedural volume and operator experience. This is the first report of a multicenter study of RF ablation of AF.
We treated 44 men and 25 women with paroxysmal (n = 40) or persistent (n = 29) drug-refractory AF. Circular pulmonary vein (PV) ostial lesions were deployed transseptally, during sinus rhythm (n = 42) or AF (n = 26), under three-dimensional electroanatomic guidance. Cavo-tricuspid isthmus ablation was performed in 27 (40%) patients. The mean procedure time was 215 ± 76 minutes (range 93-530), mean fluoroscopic exposure 32 ± 14 minutes (12-79), and mean number of RF pulses per patient 56 ± 29 (18-166). The mean numbers of separate PV ostia mapped and isolated per patient were 3.9 ± 0.5 and 3.8 ± 0.7, respectively. Major complications were observed in 3 (4%) patients: pericardial effusion, transient ischemic attack, and tamponade. At 1-month follow-up, 21 of 68 (31%) patients had had AF recurrences, of whom 8 required electrical cardioversion. After the first month, over a mean period of 9 ± 3 (5-14) months, 57 (84%) patients remained free of atrial arrhythmias. RF ablation of AF by circumferential PV ostial ablation is feasible with a high short-term success rate. While the procedure and fluoroscopic exposure durations were short, the incidence of major cardiac complications was not negligible. (PACE 2003; 26[Pt. II]:284-287)

[source]

THE VALUE OF SKU RATIONALIZATION IN PRACTICE (THE POOLING EFFECT UNDER SUBOPTIMAL INVENTORY POLICIES AND NONNORMAL DEMAND)
PRODUCTION AND OPERATIONS MANAGEMENT, Issue 1 2003
JOSÉ A. ALFARO

Several approaches to the widely recognized challenge of managing product variety rely on the pooling effect. Pooling can be accomplished through the reduction of the number of products or stock-keeping units (SKUs), through postponement of differentiation, or in other ways. These approaches are well known and becoming widely applied in practice. However, theoretical analyses of the pooling effect assume an optimal inventory policy before and after pooling and, in most cases, normally distributed demand.
In this article, we address the effects of nonoptimal inventory policies and of nonnormally distributed demand on the value of pooling. First, we show that there is always a range of current inventory levels within which pooling is better and beyond which optimizing the inventory policy is better. We also find that the value of pooling may be negative when the inventory policy in use is suboptimal. Second, we use extensive Monte Carlo simulation to examine the value of pooling for nonnormal demand distributions. We find that the value of pooling varies relatively little across the distributions we used, but that it varies considerably with the concentration of uncertainty. We also find that the ranges within which pooling is preferred over optimizing the inventory policy are generally quite wide but vary considerably across distributions. Together, this indicates that the value of pooling under an optimal inventory policy is robust across distributions, but that its sensitivity to suboptimal policies is not. Third, we use a set of real (and highly erratic) demand data to analyze the benefits of pooling under optimal and suboptimal policies and nonnormal demand with a high number of SKUs. With our specific but highly nonnormal demand data, we find that pooling is beneficial and robust to suboptimal policies. Altogether, this study provides deeper theoretical, numerical, and empirical understanding of the value of pooling.
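The pooling effect analyzed in the SKU rationalization abstract above lends itself to a short Monte Carlo sketch (a hypothetical illustration with made-up lognormal demand parameters, not the authors' model or data): for two SKUs with independent, nonnormal demand, the stock required to reach a 95% service level is lower when inventory is pooled, because a high quantile of the summed demand is smaller than the sum of the individual quantiles.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000  # Monte Carlo draws

# Two SKUs with independent, skewed (lognormal, i.e. nonnormal) demand.
d1 = rng.lognormal(mean=3.0, sigma=0.8, size=n)
d2 = rng.lognormal(mean=3.0, sigma=0.8, size=n)

# Stock needed for a 95% cycle service level, held separately vs pooled.
separate = np.quantile(d1, 0.95) + np.quantile(d2, 0.95)
pooled = np.quantile(d1 + d2, 0.95)
saving = 1.0 - pooled / separate

print(f"separate stock: {separate:.1f}")
print(f"pooled stock:   {pooled:.1f}")
print(f"pooling saves:  {100 * saving:.1f}%")
```

Replacing the lognormal draws with an empirical demand sample, or embedding a fixed (suboptimal) reorder policy instead of the quantile rule, changes the size of the benefit, which is the sensitivity the article studies.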
[source]

Hypoxia-inducible factor 1α inhibits the fibroblast-like markers type I and type III collagen during hypoxia-induced chondrocyte redifferentiation: Hypoxia not only induces type II collagen and aggrecan, but it also inhibits type I and type III collagen in the hypoxia-inducible factor 1α-dependent redifferentiation of chondrocytes
ARTHRITIS & RHEUMATISM, Issue 10 2009
Elise Duval

Objective. Autologous chondrocyte implantation requires expansion of cells ex vivo, leading to dedifferentiation of chondrocytes (loss of aggrecan and type II collagen in favour of type I and type III collagens). Several approaches have been described for redifferentiation of these cells. Among them, low oxygen tension has been exploited to restore the differentiated chondrocyte phenotype, but the molecular mechanisms of this process remain unclear. Under hypoxia, one of the major factors involved is hypoxia-inducible factor 1α (HIF-1α). The purpose of this study was to investigate the role of HIF-1α during human chondrocyte redifferentiation. Methods. We used complementary approaches to achieve HIF-1α loss (inhibition by cadmium ions and dominant-negative expression) or gain (ectopic expression and cobalt ion treatment) of function. Expression of chondrocyte and fibroblast-like phenotype markers was determined using real-time reverse transcription-polymerase chain reaction and Western blot analyses. Binding activities of HIF-1α and SOX9, a pivotal transcription factor of chondrogenesis, were evaluated by electrophoretic mobility shift assays and by chromatin immunoprecipitation assay. Results. We found that hypoxia and HIF-1α not only induced the expression of SOX9, COL2A1, and aggrecan, but simultaneously inhibited the expression of COL1A1, COL1A2, and COL3A1. In addition, we identified binding of HIF-1α to the aggrecan promoter, the first reported demonstration of this binding.
Conclusion This study is the first to show a bimodal role of HIF-1α in cartilage homeostasis, since HIF-1α was shown to favor specific markers and to impair dedifferentiation. This suggests that manipulation of HIF-1α could represent a promising approach to the treatment of osteoarthritis. [source] Carbon-accounting methods and reforestation incentives AUSTRALIAN JOURNAL OF AGRICULTURAL & RESOURCE ECONOMICS, Issue 2 2003 Oscar J. Cacho The emission of greenhouse gases, particularly carbon dioxide, and the consequent potential for climate change are the focus of increasing international concern. Temporary land-use change and forestry (LUCF) projects can be implemented to offset permanent emissions of carbon dioxide from the energy sector. Several approaches to accounting for carbon sequestration in LUCF projects have been proposed. In the present paper, the economic implications of adopting four of these approaches are evaluated in a normative context. The analysis is based on simulation of Australian farm forestry systems. Results are interpreted from the standpoint of both investors and landholders. The roles of baselines and transaction costs are discussed. [source] High-Dimensional Cox Models: The Choice of Penalty as Part of the Model Building Process BIOMETRICAL JOURNAL, Issue 1 2010 Axel Benner Abstract The Cox proportional hazards regression model is the most popular approach to modelling covariate information for survival times. In this context, the development of high-dimensional models, where the number of covariates is much larger than the number of observations (p ≫ n), is an ongoing challenge. A practicable approach in such situations is to use ridge-penalized Cox regression. Besides focussing on finding the best prediction rule, one is often interested in determining a subset of covariates that are the most important ones for prognosis. This could be a gene set in the biostatistical analysis of microarray data.
Covariate selection can then, for example, be done by L1-penalized Cox regression using the lasso (Tibshirani (1997). Statistics in Medicine 16, 385–395). Several approaches beyond the lasso that incorporate covariate selection have been developed in recent years. These include modifications of the lasso as well as nonconvex variants such as the smoothly clipped absolute deviation (SCAD) penalty (Fan and Li (2001). Journal of the American Statistical Association 96, 1348–1360; Fan and Li (2002). The Annals of Statistics 30, 74–99). The purpose of this article is to implement them practically into the model building process when analyzing high-dimensional data with the Cox proportional hazards model. To evaluate penalized regression models beyond the lasso, we included SCAD variants and the adaptive lasso (Zou (2006). Journal of the American Statistical Association 101, 1418–1429). We compare them with "standard" applications such as ridge regression, the lasso, and the elastic net. Predictive accuracy, features of variable selection, and estimation bias are studied to assess the practical use of these methods. We observed that the performance of SCAD and the adaptive lasso is highly dependent on nontrivial preselection procedures, and a practical solution to this problem does not yet exist. Since there is a high risk of missing relevant covariates when SCAD or the adaptive lasso is applied after an inappropriate initial selection step, we recommend staying with the lasso or the elastic net in actual data applications. With respect to the promising results for truly sparse models, however, we see some advantage for SCAD and the adaptive lasso should better preselection procedures become available. This requires further methodological research. [source] Nanorobots: The Ultimate Wireless Self-Propelled Sensing and Actuating Devices CHEMISTRY - AN ASIAN JOURNAL, Issue 9 2009 Dr. Samuel Sánchez
Abstract Natural motor proteins, "bionanorobots," have inspired researchers to develop artificial nanomachines (nanorobots) able to move autonomously by the conversion of chemical to mechanical energy. Such artificial nanorobots are self-propelled by the electrochemical decomposition of a fuel (up to now, hydrogen peroxide). Several approaches have been developed to provide nanorobots with added functionality, such as controlling their movement, increasing their power output, or transporting different cargoes. In this Focus Review we discuss the recent advances in nanorobots based on metallic nanowires, which can sense, deliver, and actuate in complex environments, looking towards real applications in the not-too-distant future. [source] Oral and general health behaviours among Chinese urban adolescents COMMUNITY DENTISTRY AND ORAL EPIDEMIOLOGY, Issue 1 2008 Poul Erik Petersen Abstract Objectives: The objectives of this study were to measure the association of general and oral health-related behaviours with living conditions and to explore the interrelationships between general and oral health-related behaviours in Chinese urban adolescents.
Methods: A cross-sectional survey of 2662 adolescents was conducted in eight Chinese provincial capitals. The response rate was 92%. The study population was selected through multistage cluster sampling and comprised three age groups: 11, 13 and 15 years. Data on oral and general health, lifestyles as well as living conditions were collected by means of self-administered structured questionnaires. Several additive indices were constructed from answers to the questions on specific behaviours, and participants were categorized according to scores on each component of health-related behaviour for statistical analyses by frequency distributions, regression analyses and factor analyses. Results: Oral health-related behaviours among adolescents were associated with the socioeconomic status of parents, school performance and peer relationships. The odds ratio for a dental visit was 0.63 in adolescents of poorly educated parents, and the corresponding figure for regular oral hygiene practices was 0.62. The odds ratio for tobacco use was 3 for adolescents with poor performance in school, while that for consuming sugary foods/drinks was 1.3. Adolescents with high levels of preventive oral health practices also demonstrated general health-promoting behaviours. In factor analysis of general and oral health-related behaviours, three factors were isolated: (a) risk behaviours (loadings 0.48–0.66), (b) health-promoting behaviours (loadings 0.60–0.64) and (c) help-seeking behaviours (loadings 0.56–0.67). Conclusion: The findings support a multidimensional model of health behaviour. Several approaches and multiple methods should be applied in oral health education in order to modify behaviours that affect oral health.
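The additive-index construction this abstract mentions (score items, then categorize participants by score) can be sketched as follows. The item names and cut-offs are invented for illustration; the study's actual questionnaire items are not published here.

```python
# Hypothetical item names -- stand-ins for the survey's behaviour questions.
RISK_ITEMS = ["smokes", "drinks_sugary_soda", "skips_toothbrushing"]

def risk_index(answers):
    """Additive index: one point per risk behaviour reported."""
    return sum(1 for item in RISK_ITEMS if answers.get(item, False))

def categorize(score, cutoffs=(1, 2)):
    """Map an index score to low / medium / high for tabulation."""
    if score < cutoffs[0]:
        return "low"
    if score < cutoffs[1]:
        return "medium"
    return "high"

participant = {"smokes": False, "drinks_sugary_soda": True,
               "skips_toothbrushing": True}
score = risk_index(participant)
print(score, categorize(score))  # 2 high
```

The resulting categories are what frequency tables, regression, and factor analysis would then operate on.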
[source] On web communities mining and recommendation CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 5 2009 Yanchun Zhang Abstract Because of the lack of a uniform schema for web documents and the sheer amount and dynamics of web data, both the effectiveness and the efficiency of information management and retrieval of web data are often unsatisfactory when using conventional data management and searching techniques. To address this issue, we have adopted web mining and web community analysis approaches. On the basis of the analysis of web document contents, hyperlink analysis, user access logs and semantic analysis, we have developed various approaches or algorithms to construct and analyze web communities, and to make recommendations. This paper introduces and discusses several approaches to web community mining and recommendation. Copyright © 2009 John Wiley & Sons, Ltd. [source] On case-crossover methods for environmental time series data ENVIRONMETRICS, Issue 2 2007 Heather J. Whitaker Abstract Case-crossover methods are widely used for analysing data on the association between health events and environmental exposures. In recent years, several approaches to choosing referent periods have been suggested, with much discussion of two types of bias: bias due to temporal trends, and overlap bias. In the present paper, we revisit the case-crossover method, focusing on its origin in the case-control paradigm, in order to throw new light on these biases. We emphasise the distinction between methods based on case-control logic (such as the symmetric bi-directional (SBI) method), for which overlap bias is a consequence of non-exchangeability of the exposure series, and methods based on cohort logic (such as the time-stratified (TS) method), for which overlap bias does not arise. We show by example that the TS method may suffer severe bias from residual seasonality. This method can be extended to control for seasonality.
However, time series regression is more flexible than case-crossover methods for the analysis of data on shared environmental exposures. We conclude that time series regression ought to be adopted as the method of choice in such applications. Copyright © 2006 John Wiley & Sons, Ltd. [source] Discussion on 'Personality psychology as a truly behavioural science' by R. Michael Furr EUROPEAN JOURNAL OF PERSONALITY, Issue 5 2009 Article first published online: 14 JUL 2009 Yes We Can! A Plea for Direct Behavioural Observation in Personality Research MITJA D. BACK and BORIS EGLOFF Department of Psychology, Johannes Gutenberg University Mainz, Germany mback@uni-leipzig.de Furr's target paper (this issue) seeks to enhance the standing of personality psychology as a truly behavioural science. We wholeheartedly agree with this goal. In our comment we argue for more specific and ambitious requirements for behavioural personality research. Specifically, we show why behaviour should be observed directly. Moreover, we illustratively describe potentially interesting approaches in behavioural personality research: lens model analyses, the observation of multiple behaviours in diverse experimentally created situations and the observation of behaviour in real life. Copyright © 2009 John Wiley & Sons, Ltd. The Categories of Behaviour Should be Clearly Defined PETER BORKENAU Department of Psychology, Martin-Luther University Halle-Wittenberg, Germany p.borkenau@psych.uni-halle.de The target paper is helpful in clarifying the terminology as well as the strengths and weaknesses of several approaches to collecting behavioural data. Insufficiently considered, however, is the clarity of the categories being used for the coding of behaviour.
Evidence is reported showing that interjudge agreement for retrospective and even concurrent codings of behaviour does not exceed interjudge agreement for personality traits if the categories being used for the coding of behaviour are not clearly defined. By contrast, if the behaviour to be registered is unambiguously defined, interjudge agreement may be almost perfect. Copyright © 2009 John Wiley & Sons, Ltd. Behaviour Functions in Personality Psychology PHILIP J. CORR Department of Psychology, Faculty of Social Sciences, University of East Anglia, Norwich, UK Philip.Corr@btopenworld.com Furr's target paper highlights the importance, yet under-representation, of behaviour in published articles in personality psychology. Whilst agreeing with most of his points, I remain unclear as to how behaviour (as specifically defined by Furr) relates to other forms of psychological data (e.g. cognitive task performance). In addition, it is not clear how the functions of behaviour are to be decided: different behaviours may serve the same function, and identical behaviours may serve different functions. To clarify these points, the methodological and theoretical aspects of Furr's proposal would benefit from delineation. Copyright © 2009 John Wiley & Sons, Ltd. On the Difference Between Experience-Sampling Self-Reports and Other Self-Reports WILLIAM FLEESON Department of Psychology, Wake Forest University, Winston-Salem, NC, USA fleesonW@wfu.edu Furr's fair but evaluative consideration of the strengths and weaknesses of behavioural assessment methods is a great service to the field. As part of his consideration, Furr makes a subtle and sophisticated distinction between different self-report methods. It is easy to dismiss all self-reports as poor measures, because some are poor. In contrast, Furr points out that the immediacy of the self-reports of behaviour in experience-sampling makes experience-sampling one of the three strongest methods for assessing behaviour.
This comment supports his conclusion by arguing that ESM greatly diminishes one of the three major problems afflicting self-reports (lack of knowledge) and that direct observations also suffer from the other two major problems afflicting self-reports. Copyright © 2009 John Wiley & Sons, Ltd. What and Where is 'Behaviour' in Personality Psychology? LAURA A. KING and JASON TRENT Department of Psychology, University of Missouri, Columbia, USA kingla@missouri.edu Furr is to be lauded for presenting a coherent and persuasive case for the lack of behavioural data in personality psychology. While agreeing wholeheartedly that personality psychology could benefit from greater inclusion of behavioural variables, here we question two aspects of Furr's analysis: first, his definition of behaviour and, second, his evidence that behaviour is under-appreciated in personality psychology. Copyright © 2009 John Wiley & Sons, Ltd. Naturalistic Observation of Daily Behaviour in Personality Psychology MATTHIAS R. MEHL Department of Psychology, University of Arizona, Tucson, AZ, USA mehl@email.arizona.edu This comment highlights naturalistic observation as a specific method within Furr's (this issue) cluster 'direct behavioural observation' and discusses the Electronically Activated Recorder (EAR) as a naturalistic observation sampling method that can be used in relatively large, nomothetic studies. Naturalistic observation with a method such as the EAR can inform researchers' understanding of personality in its relationship to daily behaviour in two important ways. It can help calibrate personality effects against act-frequencies of real-world behaviour and provide ecological, behavioural personality criteria that are independent of self-report. Copyright © 2009 John Wiley & Sons, Ltd. Measuring Behaviour D. S. MOSKOWITZ and JENNIFER J.
RUSSELL Department of Psychology, McGill University, Montreal, Canada dsm@psych.mcgill.ca Furr (this issue) provides an illuminating comparison of the strengths and weaknesses of various methods for assessing behaviour. In the selection of a method for assessing behaviour, there should be a careful analysis of the definition of the behaviour and the purpose of assessment. This commentary clarifies and expands upon some points concerning the suitability of experience sampling measures, referred to as Intensive Repeated Measurements in Naturalistic Settings (IRM-NS). IRM-NS measures are particularly useful for constructing measures of differing levels of specificity or generality, for providing individual difference measures which can be associated with multiple layers of contextual variables, and for providing measures capable of reflecting the variability and distributional features of behaviour. Copyright © 2009 John Wiley & Sons, Ltd. Behaviours, Non-Behaviours and Self-Reports SAMPO V. PAUNONEN Department of Psychology, University of Western Ontario, London, Canada paunonen@uwo.ca Furr's (this issue) thoughtful analysis of the contemporary body of research in personality psychology has led him to two conclusions: our science does not do enough to study real, observable behaviours; and, when it does, too often it relies on 'weak' methods based on retrospective self-reports of behaviour. In reply, I note that many researchers are interested in going beyond the study of individual behaviours to the behaviour trends embodied in personality traits, and that the self-report of behaviour, using well-validated personality questionnaires, is often the best measurement option. Copyright © 2009 John Wiley & Sons, Ltd.
An Ethological Perspective on How to Define and Study Behaviour LARS PENKE Department of Psychology, The University of Edinburgh, Edinburgh, UK lars.penke@ed.ac.uk While Furr (this issue) makes many important contributions to the study of behaviour, his definition of behaviour is somewhat questionable and also lacks a broader theoretical frame. I provide some historical and theoretical background on the study of behaviour in psychology and biology, from which I conclude that a general definition of behaviour might be out of reach. However, psychological research can gain from adding a functional perspective on behaviour in the tradition of Tinbergen's four questions, which takes the long-term outcomes and fitness consequences of behaviours into account. Copyright © 2009 John Wiley & Sons, Ltd. What is a Behaviour? MARCO PERUGINI Faculty of Psychology, University of Milan-Bicocca, Milan, Italy marco.perugini@unimib.it The target paper proposes an interesting framework for classifying behaviour as well as a convincing plea to use it more often in personality research. However, besides some potential issues in the definition of what a behaviour is, the application of the proposed definition to specific cases is at times inconsistent. I argue that this is because Furr attempts to provide a theory-free definition yet implicitly uses theoretical considerations when applying the definition to specific cases. Copyright © 2009 John Wiley & Sons, Ltd. Is Personality Really the Study of Behaviour? MICHAEL D. ROBINSON Department of Psychology, North Dakota State University, Fargo, ND, USA Michael.D.Robinson@ndsu.edu Furr (this issue) contends that behavioural studies of personality are particularly important, have been under-appreciated, and should be privileged in the future. The present commentary instead suggests that personality psychology has more value as an integrative science than as one that narrowly pursues a behavioural agenda.
Cognition, emotion, motivation, the self-concept and the structure of personality are important topics regardless of their possible links to behaviour. Indeed, the ultimate goal of personality psychology is to understand individual difference functioning broadly considered rather than behaviour narrowly considered. Copyright © 2009 John Wiley & Sons, Ltd. Linking Personality and Behaviour Based on Theory MANFRED SCHMITT Department of Psychology, University of Koblenz-Landau, Landau, Germany schmittm@uni-landau.de My comments on Furr's (this issue) target paper 'Personality as a Truly Behavioural Science' are meant to complement his behavioural taxonomy and sharpen some of the presumptions and conclusions of his analysis. First, I argue that the relevance of behaviour for our field depends on how we define personality. Second, I propose that every taxonomy of behaviour should be grounded in theory. The quality of behavioural data does not depend only on the validity of the measures we use; it also depends on how well behavioural data reflect theoretical assumptions about the causal factors and mechanisms that shape behaviour. Third, I suggest that the quality of personality theories, personality research and behavioural data will profit from ideas about the psychological processes and mechanisms that link personality and behaviour. Copyright © 2009 John Wiley & Sons, Ltd. The Apparent Objectivity of Behaviour is Illusory RYNE A. SHERMAN, CHRISTOPHER S. NAVE and DAVID C. FUNDER Department of Psychology, University of California, Riverside, CA, USA funder@ucr.edu It is often presumed that objective measures of behaviour (e.g. counts of the number of smiles) are more scientific than more subjective measures of behaviour (e.g. ratings of the degree to which a person behaved in a cheerful manner). We contend that the apparent objectivity of any behavioural measure is illusory.
First, the reliability of more subjective measures of behaviour is often strikingly similar to the reliability of so-called objective measures. Further, a growing body of literature suggests that subjective measures of behaviour provide more valid measures of the psychological constructs of interest. Copyright © 2009 John Wiley & Sons, Ltd. Personality and Behaviour: A Neglected Opportunity? LIAD UZIEL and ROY F. BAUMEISTER Department of Psychology, Florida State University, Tallahassee, FL, USA Baumeister@psy.fsu.edu Personality psychology has neglected the study of behaviour. Furr's efforts to provide a stricter definition of behaviour will not solve the problem, although they may be helpful in other ways. His articulation of various research strategies for studying behaviour will be more helpful in enabling personality psychology to contribute important insights and principles about behaviour. The neglect of behaviour may have roots in how personality psychologists define the mission of their field, but expanding that mission to encompass behaviour would be a positive step. Copyright © 2009 John Wiley & Sons, Ltd. [source] Glycolipid receptor depletion as an approach to specific antimicrobial therapy FEMS MICROBIOLOGY LETTERS, Issue 1 2006 Majlis Svensson Abstract Mucosal pathogens recognize glycoconjugate receptors at the site of infection, and attachment is an essential first step in disease pathogenesis. Inhibition of attachment may prevent disease, and several approaches have been explored. This review discusses the prevention of bacterial attachment and disease by agents that modify the glycosylation of cell surface glycoconjugates. Glycosylation inhibitors were tested in the urinary tract infection model, where P-fimbriated Escherichia coli rely on glycosphingolipid receptors for attachment and tissue attack. N-butyldeoxynojirimycin blocked the expression of glucosylceramide-derived glycosphingolipids, and attachment was reduced.
Bacterial persistence in the kidneys was impaired and the inflammatory response was abrogated. N-butyldeoxynojirimycin was inactive against strains that failed to engage these receptors, including type 1 fimbriated or nonadhesive strains. In vivo attachment has been successfully prevented by soluble receptor analogues, but there is little clinical experience with such inhibitors. Large-scale synthesis of complex carbohydrates, which could be used as attachment inhibitors, remains a technical challenge. Antibodies to bacterial lectins involved in attachment may be efficient inhibitors, and fimbrial vaccines have been developed. Glycosylation inhibitors have been shown to be safe and efficient in patients with lipid storage disease and might therefore be tested in urinary tract infection. This approach differs from current therapies, including antibiotics, in that it targets the pathogens that recognize these receptors. [source] Testing association for markers on the X chromosome GENETIC EPIDEMIOLOGY, Issue 8 2007 Gang Zheng Abstract Test statistics for association between markers on autosomal chromosomes and a disease have been extensively studied. No research has been reported on the performance of such test statistics for association on the X chromosome. With 100,000 or more single-nucleotide polymorphisms (SNPs) available for genome-wide association studies, thousands of them come from the X chromosome. The X chromosome contains rich information about population history and linkage disequilibrium. To identify X-linked marker susceptibility to a disease, it is important to study the properties of various statistics that can be used to test for association on the X chromosome. In this article, we compare the performance of several approaches for testing association on the X chromosome, and examine how departure from Hardy-Weinberg equilibrium would affect the type I error and power of these association tests using X-linked SNPs.
The results are applied to the X chromosome of Klein et al. [2005], a genome-wide association study with 100K SNPs for age-related macular degeneration. We found that a SNP (rs10521496) covered by DIAPH2, known to cause premature ovarian failure (POF) in females, is associated with age-related macular degeneration. Genet. Epidemiol. 2007. Published 2007 Wiley-Liss, Inc. [source] |
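The Hardy-Weinberg issue the preceding abstract raises can be made concrete with a minimal sketch: a one-degree-of-freedom chi-square test of Hardy-Weinberg equilibrium on female genotype counts (for X-linked SNPs, males are hemizygous and contribute only one allele, so only females enter a genotype-based HWE test). The counts below are invented, and this is not the paper's own test statistic; a real analysis would typically prefer an exact HWE test.

```python
import math

def hwe_chisq_females(n_AA, n_Aa, n_aa):
    """Chi-square (1 df) test of Hardy-Weinberg equilibrium among
    females at an X-linked SNP, from observed genotype counts."""
    n = n_AA + n_Aa + n_aa
    p = (2 * n_AA + n_Aa) / (2 * n)      # frequency of allele A
    q = 1 - p
    observed = [n_AA, n_Aa, n_aa]
    expected = [n * p * p, 2 * n * p * q, n * q * q]
    chi2 = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
    # For 1 df the chi-square survival function has a closed form.
    p_value = math.erfc(math.sqrt(chi2 / 2))
    return chi2, p_value

chi2, pval = hwe_chisq_females(380, 440, 180)  # hypothetical counts
print(f"chi2 = {chi2:.3f}, p = {pval:.4f}")
```

With these invented counts the allele frequency is 0.6, the expected counts are (360, 480, 160), and the test flags a heterozygote deficit, the kind of HWE departure whose effect on type I error and power the paper studies.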