Components Used (component + used)


Selected Abstracts


Tunable scheduling in a GridRPC framework

CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 9 2008
A. Amar
Abstract Among existing grid middleware approaches, one simple, powerful, and flexible approach consists of using servers available in different administrative domains through the classic client-server or remote procedure call paradigm. Network Enabled Servers (NES) implement this model, also called GridRPC. Clients submit computation requests to a scheduler, whose goal is to find a server available on the grid using some performance metric. The aim of this paper is to give an overview of a NES middleware developed in the GRAAL team called distributed interactive engineering toolbox (DIET) and to describe recent developments around plug-in schedulers, workflow management, and tools. DIET is a hierarchical set of components used for the development of applications based on computational servers on the grid. Copyright © 2007 John Wiley & Sons, Ltd. [source]
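
As a purely illustrative aside on the scheduling idea described in this abstract, the short Python sketch below shows a client request being routed to the server that scores best under a pluggable performance metric. It is not DIET's or the GridRPC API's real interface; the server attributes and the example metric are invented for the demonstration.

```python
# Toy illustration of plug-in scheduling (not DIET's real API): the client
# submits a request to a scheduler, which ranks candidate servers with a
# user-supplied performance metric and returns the cheapest one.
def pick_server(servers, metric):
    """servers: list of dicts describing available servers;
    metric: callable returning an estimated cost (lower is better)."""
    return min(servers, key=metric)

servers = [
    {"name": "node-a", "load": 0.2, "gflops": 5.0, "latency_ms": 40.0},
    {"name": "node-b", "load": 0.7, "gflops": 20.0, "latency_ms": 5.0},
]

# A 'plug-in' metric trading current load against raw compute speed.
fastest_free = lambda s: s["load"] + 1.0 / s["gflops"]
print(pick_server(servers, fastest_free)["name"])
```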


Implementation of the 'Fresh Start' smoking cessation programme to 23 antenatal clinics: a randomized controlled trial investigating two methods of dissemination

DRUG AND ALCOHOL REVIEW, Issue 1 2001
MARGARET COOKE
Abstract The aim of the research was to investigate the effect of two methods of dissemination on the implementation of a smoking cessation programme and the use of smoking cessation interventions in antenatal clinics. A repeated-measures randomized design was used. Hospital antenatal clinics (n = 23) were randomized to simple or intensive dissemination groups. All clinics in NSW with > 500 births were asked to participate. A survey of all clinical staff (n = 323) in 23 antenatal clinics was carried out prior to programme dissemination and 18 months after dissemination (n = 283). The response rate was 63% at baseline and 64% at follow-up. Smoking cessation intervention increased significantly after dissemination (F(18,1) = 49.26, p < 0.001). The average number of smoking cessation interventions provided by clinics after programme dissemination increased from 4.5 to 7.48 (mean difference 2.98, t(19) = 7.08, p < 0.001, 95% CI 2.1-3.86). Type of dissemination did not influence the number of programme components used or the number of smoking cessation interventions offered, nor did the estimated proportion of clients offered intervention by clinicians vary with the type of dissemination. A simple mail-out of a smoking cessation programme to antenatal clinics for use during pregnancy can increase clinician intervention for smoking. When more intensive methods of dissemination are used, the quality of the interventions implemented by clinicians improves. More research on dissemination methods is required, as neither method produced systematic or sustained use of the programme. [source]
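
As a quick consistency check of the reported statistics (our own arithmetic, not part of the study), the standard error implied by the mean difference and t statistic reproduces the stated 95% confidence interval:

```python
from scipy import stats

# Reported values: mean difference 2.98, t(19) = 7.08, 95% CI 2.1-3.86.
mean_diff, t_stat, df = 2.98, 7.08, 19
se = mean_diff / t_stat                  # implied standard error, ~0.42
t_crit = stats.t.ppf(0.975, df)          # two-sided 95% critical value, ~2.09
print(mean_diff - t_crit * se, mean_diff + t_crit * se)  # ~ (2.10, 3.86)
```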


Improvement of information filtering by independent components selection

ELECTRICAL ENGINEERING IN JAPAN, Issue 2 2008
Takeru Yokoi
Abstract We propose an improvement of an information filtering process using independent components selection. The independent components are obtained by Independent Component Analysis and are regarded as topics. Selecting independent components is an efficient way to improve the accuracy of information filtering, since it extracts similar topics by focusing on their meanings. To achieve this, we select topics with the Maximum Distance Algorithm using Jensen-Shannon divergence. Document vectors are then represented in terms of the selected topics, and a user profile is created from the transformed data through relevance feedback. Finally, we rank documents against the user profile and evaluate the accuracy by imputation precision. We carried out an evaluation experiment to confirm the validity of the proposed method, which takes the meanings of the components into account. © 2008 Wiley Periodicals, Inc. Electr Eng Jpn, 163(2): 49-56, 2008; Published online in Wiley InterScience (www.interscience.wiley.com). DOI 10.1002/eej.20519 [source]
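
A minimal sketch of the pipeline described above, assuming a document-term matrix as input, is given below; the function names, the greedy maximum-distance selection, and all parameter values are illustrative choices rather than the authors' implementation.

```python
import numpy as np
from scipy.spatial.distance import jensenshannon
from sklearn.decomposition import FastICA

def extract_topics(doc_term, n_topics=20, seed=0):
    """Treat independent components of a document-term matrix as topics,
    normalised to term distributions."""
    ica = FastICA(n_components=n_topics, random_state=seed)
    ica.fit(doc_term)                       # rows: documents, cols: terms
    comps = np.abs(ica.components_)         # shape (n_topics, n_terms)
    return comps / comps.sum(axis=1, keepdims=True)

def select_topics(topics, n_keep):
    """Greedy maximum-distance selection: repeatedly keep the topic whose
    smallest Jensen-Shannon distance to the already-selected set is largest."""
    chosen = [0]                            # start from an arbitrary topic
    while len(chosen) < n_keep:
        dists = np.array([min(jensenshannon(t, topics[c]) for c in chosen)
                          for t in topics])
        dists[chosen] = -1.0                # never re-select a chosen topic
        chosen.append(int(dists.argmax()))
    return topics[chosen]
```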


How good are the Electrodes we use in PEFC?

FUEL CELLS, Issue 3 2004
M. Eikerling
Abstract Basically, companies and laboratories implement production methods for their electrodes on the basis of experience, technical capabilities and commercial preferences. But how does one know whether they have ended up with the best possible electrode for the components used? What should be (i) the optimal thickness of the catalyst layer, (ii) the relative amounts of electronically conducting component (catalyst, with support, if used), electrolyte and pores, and (iii) the "particle size distributions" in these mesophases? We may be pleased with our MEAs, but could we make them better? The details of excellently working MEA structures are typically not a subject of open discussion; moreover, hardly anyone in the fuel cell business would like to admit that their electrodes could have been made much better. Therefore, we only rarely find (far from systematic) experimental reports on this most important issue. The message of this paper is to illustrate how strongly the MEA morphology can affect the performance and to pave the way for the development of the theory. A full analysis should address the performance at different current densities, which is possible and is partially shown in this paper, but the vital trends can be demonstrated with the linear polarization resistance, the signature of electrode performance. The latter is expressed through a minimum number of key parameters characterizing the processes taking place in the MEA. Model expressions of percolation theory can then be used to approximate the dependence on these parameters. The effects revealed are dramatic. Of course, the corresponding curves will not be reproduced literally in experiments, since these illustrations use crude expressions inspired by the theory of percolation on a regular lattice, whereas the actual mesoscopic architecture of the MEA is much more complicated. However, they give us a flavour of the reserves that might be released by smart MEA design. [source]
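
To give a flavour of the kind of dependence discussed here, the crude sketch below combines a regular-lattice percolation expression for the effective electrolyte conductivity with a toy linear polarization resistance; the threshold, exponent, and all numerical values are illustrative assumptions, not values from the paper.

```python
import numpy as np

def effective_conductivity(x, x_c=0.33, t=2.0, sigma_bulk=0.1):
    """Percolation-type effective conductivity (S/cm): negligible below the
    threshold x_c, rising as (x - x_c)**t above it."""
    x = np.asarray(x, dtype=float)
    return np.where(x > x_c, sigma_bulk * ((x - x_c) / (1 - x_c)) ** t, 1e-12)

def polarization_resistance(x_electrolyte, thickness_cm=1e-3, r_ct=0.5):
    """Toy linear polarization resistance (Ohm cm^2): a charge-transfer term
    plus the ohmic penalty of proton transport through the partially
    connected electrolyte network in the catalyst layer."""
    return r_ct + thickness_cm / effective_conductivity(x_electrolyte)

for x in (0.35, 0.45, 0.60):   # electrolyte volume fractions
    print(x, polarization_resistance(x))   # resistance drops sharply with x
```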


Wave-induced progressive liquefaction in a poro-elastoplastic seabed: A two-layered model

INTERNATIONAL JOURNAL FOR NUMERICAL AND ANALYTICAL METHODS IN GEOMECHANICS, Issue 5 2009
Z. Liu
Abstract In this study, the prediction model proposed by Sassa et al. (Geotechnique 2001; 51(10):847-857) for the wave-induced progressive liquefaction in marine sediment, based on a two-layered inviscid fluid system, is re-examined. An alternative approach within a framework similar to that of Sassa et al. (Geotechnique 2001; 51(10):847-857) is developed to correct the inappropriate mechanism of the wave components used. Then, a two-layered wave model that includes viscous effects is established and applied to describe the progressive nature of wave-induced liquefaction. A comprehensive comparison shows that Sassa's model overestimates the maximum liquefaction depth. It is found that the viscosity of the liquefied soil cannot be ignored and that the solution for an infinite seabed is not suitable for liquefaction analysis of a shallow seabed. A parametric study demonstrates the significant influence of numerous wave and soil characteristics on the liquefaction depth. Copyright © 2008 John Wiley & Sons, Ltd. [source]


A spectral projection method for the analysis of autocorrelation functions and projection errors in discrete particle simulation

INTERNATIONAL JOURNAL FOR NUMERICAL METHODS IN FLUIDS, Issue 7 2008
André Kaufmann
Abstract Discrete particle simulation is a well-established tool for the simulation of particles and droplets suspended in turbulent flows of academic and industrial applications. The study of some properties, such as the preferential concentration of inertial particles in regions of high shear and low vorticity, requires the computation of autocorrelation functions. This can be a tedious task, as the discrete point particles need to be projected in some manner to obtain continuous autocorrelation functions. Projection of particle properties onto a computational grid, for instance the grid of the carrier phase, is furthermore an issue when quantities such as particle concentrations are to be computed or source terms are exchanged between the carrier phase and the particles. The errors committed by commonly used projection methods are often unknown and are difficult to analyse. Grid and sampling size limit the possibilities in terms of precision per computational cost. Here, we present a spectral projection method that is not affected by sampling issues and addresses all of the above issues. The technique is limited only by computational resources and is easy to parallelize. The only visible drawback is the restriction to simple geometries, which limits the method to academic applications. The spectral projection method consists of a discrete Fourier transform of the particle locations. The Fourier-transformed particle number density and momentum fields can then be used to compute the autocorrelation functions and the continuous physical-space fields for the evaluation of the projection method's error. The number of Fourier components used to discretize the projector kernel can be chosen such that the corresponding characteristic length scale is as small as needed. This makes it possible to study the phenomena of particle motion, for example in a region of preferential concentration that may be smaller than the cell size of the carrier phase grid. The precision of the spectral projection method therefore depends only on the number of Fourier modes considered. Copyright © 2008 John Wiley & Sons, Ltd. [source]
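
A minimal one-dimensional sketch of the core idea, with our own choice of normalisation and assuming periodic particle positions (this is not the authors' code), evaluates the Fourier coefficients of the particle number density directly from the particle locations and recovers an autocorrelation from the resulting power spectrum:

```python
import numpy as np

def density_modes(x, n_modes, L=1.0):
    """Fourier coefficients of the number density of point particles at
    positions x in [0, L): n_hat(k) = sum_p exp(-i k x_p)."""
    k = 2.0 * np.pi * np.arange(-n_modes, n_modes + 1) / L
    return np.exp(-1j * np.outer(k, x)).sum(axis=1), k

def autocorrelation(n_hat, k, r):
    """Autocorrelation of the density field at separations r via the
    Wiener-Khinchin relation, R(r) ~ sum_k |n_hat(k)|^2 exp(i k r)."""
    power = np.abs(n_hat) ** 2
    return np.real(np.exp(1j * np.outer(r, k)) @ power) / power.sum()

rng = np.random.default_rng(0)
x = rng.random(10_000)                     # synthetic particle positions
n_hat, k = density_modes(x, n_modes=64)    # resolution set only by mode count
R = autocorrelation(n_hat, k, r=np.linspace(0.0, 0.5, 51))
```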


Implications of production sharing on exchange rate pass-through

INTERNATIONAL JOURNAL OF FINANCE & ECONOMICS, Issue 4 2009
Amit Ghosh
Abstract This paper presents a theoretical model to analyse exchange rate pass-through when there is cross-border production sharing. With production sharing between two nations, pass-through operates at two different levels: at the level of the imported parts and components used in making the final good, and at the level of the final good itself. We find that the higher the pricing-to-market at the intermediate-good level, the lower the pass-through for the final good. The model is further extended to analyse three-country production sharing, with substitution between two alternative sources of imported inputs. Finally, we draw some policy implications. Copyright © 2008 John Wiley & Sons, Ltd. [source]


Retrofitting implant overdenture attachments: A clinical report

JOURNAL OF PROSTHODONTICS, Issue 2 2003
Harel Simon DMD
The remake of implant-overdenture prostheses on preexisting implants can present the clinician with a challenge, especially when the prosthetic components used initially cannot be replaced. The difficulty of remaking the prosthesis may be further increased by implant attachments designed to be cemented to the implant itself, a feature that complicates future replacement. This clinical report describes the restoration of worn implant attachments using retrofit components. [source]


Laser Polishing in Medical Engineering

LASER TECHNIK JOURNAL, Issue 2 2010
Laser Polishing of Components for Left Ventricular Assist Devices
Cardiac surgery has made significant progress during the last 50 years. Nowadays, almost every congenital or acquired dysfunction of the heart can be treated clinically, or at least its etiopathology can be alleviated. During these years, implantable Left Ventricular Assist Devices (LVADs) have proven to be an effective and reliable medical product. In particular, the survival rate of patients with cardiac insufficiency has risen due to these devices. This type of heart-assist device is implanted either to bridge the time until cardiac transplantation or recovery, or as a permanent implant in the patient's body. Berlin Heart GmbH produces the clinically tested axial pump system INCOR® (Figure 1, above). The INCOR heart-assist pump is a powerful implantable LVAD which has been used in more than 500 clinical applications. The main function of the axial pump is to unload the patient's heart by transporting blood from the left ventricle to the aorta. In order to assure high reliability of the pump's operation, the components used for blood transport have to be highly bio- and hemocompatible. [source]


Facilitating substance phase-out through material information systems and improving environmental impacts in the recycling stage of a product

NATURAL RESOURCES FORUM, Issue 3 2010
Daniel Paska
Abstract The amount of electrical and electronic products is increasing rapidly, and this inevitably leads to the generation of large quantities of waste from these goods. Some of the generated e-waste ends up in regions with sub-standard recycling systems and may be processed under poor conditions. During uncontrolled incineration, halogenated dioxins and furans can be generated from brominated and chlorinated compounds in the products. In order to reduce the health and environmental risks involved in the recycling stage of the life cycle of electronics, an effective design-for-environment process must be established during the product development phase. Knowledge of the chemical substances in the product is crucial to being able to make informed decisions. With full knowledge of the material content of procured components, phase-outs of unwanted substances, such as halogenated substances, can be performed in an effective manner. Information is therefore the key to success in phasing out substances, facilitating compliance with legal provisions for manufacturers of electrical and electronic devices, and improving the environmental footprint of products as they reach the end of the life cycle. After an introduction to the challenges of electronics waste management, this paper describes supply chain information systems and how they are used to facilitate substance phase-outs in the electronics industry. Sony Ericsson has been working on phase-outs of unwanted substances since it was founded in 2001. Through the introduction of a material declaration system that keeps track of all substances in the components used in the company's products, Sony Ericsson has been able to replace unwanted substances and improve environmental impacts at the recycling stage of a product. [source]
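
The role of a full material declaration in such a phase-out can be made concrete with a small data-model sketch; the class names, fields, and threshold logic below are our own illustration, not Sony Ericsson's actual system.

```python
from dataclasses import dataclass, field

@dataclass
class Component:
    part_number: str
    substances: dict = field(default_factory=dict)  # substance id -> mass fraction

@dataclass
class Product:
    name: str
    components: list = field(default_factory=list)

def affected_products(products, substance_id, max_fraction=0.0):
    """List products containing any component whose declared content of
    `substance_id` exceeds `max_fraction`, i.e. the parts a phase-out
    decision would require replacing."""
    return [p.name for p in products
            if any(c.substances.get(substance_id, 0.0) > max_fraction
                   for c in p.components)]

# Hypothetical example: find devices still containing a brominated flame retardant.
phone = Product("phone-A", [Component("PCB-1", {"TBBPA": 0.04}),
                            Component("LCD-2", {})])
print(affected_products([phone], "TBBPA"))
```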


Rapid characterization and quality control of complex cell culture media solutions using Raman spectroscopy and chemometrics

BIOTECHNOLOGY & BIOENGINEERING, Issue 2 2010
Boyan Li
Abstract The use of Raman spectroscopy coupled with chemometrics for the rapid identification, characterization, and quality assessment of complex cell culture media components used for industrial mammalian cell culture was investigated. Raman spectroscopy offers significant advantages for the analysis of complex, aqueous-based materials used in biotechnology because there is no need for sample preparation and water is a weak Raman scatterer. We demonstrate the efficacy of the method for the routine analysis of dilute aqueous solutions of five different chemically defined (CD) commercial media components used in a Chinese Hamster Ovary (CHO) cell manufacturing process for recombinant proteins. The chemometric processing of the Raman spectral data is the key factor in developing robust methods. Here, we discuss the optimum methods for eliminating baseline drift, background fluctuations, and other instrumentation artifacts to generate reproducible spectral data. Principal component analysis (PCA) and soft independent modeling of class analogy (SIMCA) were then employed in the development of a robust routine for both identification and quality evaluation of the five different media components. These methods have the potential to be extremely useful in an industrial context for "in-house" sample handling, tracking, and quality control. Biotechnol. Bioeng. 2010;107: 290-301. © 2010 Wiley Periodicals, Inc. [source]
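
To make the PCA/SIMCA step concrete, here is a minimal SIMCA-style classifier sketch, assuming preprocessed (baseline-corrected, normalised) spectra supplied as plain arrays; the class structure and residual statistic are simplifications of our own, not the authors' chemometric routine.

```python
import numpy as np
from sklearn.decomposition import PCA

class SimpleSIMCA:
    """One PCA model per media component; a new spectrum is assigned to the
    class whose model reconstructs it best, relative to that class's typical
    training residual (a full SIMCA would add F-test acceptance limits)."""

    def __init__(self, n_components=3):
        self.n_components = n_components
        self.models = {}

    def fit(self, spectra_by_class):
        # spectra_by_class: {label: array of shape (n_spectra, n_wavenumbers)}
        for label, X in spectra_by_class.items():
            pca = PCA(n_components=self.n_components).fit(X)
            resid = X - pca.inverse_transform(pca.transform(X))
            s0 = np.sqrt(np.mean(np.sum(resid ** 2, axis=1)))  # typical residual
            self.models[label] = (pca, s0)
        return self

    def classify(self, spectrum):
        distances = {}
        for label, (pca, s0) in self.models.items():
            x = spectrum.reshape(1, -1)
            r = x - pca.inverse_transform(pca.transform(x))
            distances[label] = np.sqrt(np.sum(r ** 2)) / s0
        return min(distances, key=distances.get), distances
```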


In vitro neuromuscular activity of snake venoms

CLINICAL AND EXPERIMENTAL PHARMACOLOGY AND PHYSIOLOGY, Issue 9 2002
Wayne C Hodgson
Summary 1. Snake venoms consist of a multitude of pharmacologically active components used for the capture of prey. Neurotoxins are particularly important in this regard, producing paralysis of skeletal muscles. These neurotoxins can be classified according to their site of action (i.e. pre- or post-synaptic). 2. Presynaptic neurotoxins, which display varying phospholipase A2 activities, have been identified in the venoms of the four major families of venomous snakes (i.e. Crotalidae, Elapidae, Hydrophiidae and Viperidae). The blockade of transmission produced by these toxins is usually characterized by a triphasic effect on acetylcholine release. Considerable work has been directed at identifying the binding site(s) on the presynaptic nerve terminal for these toxins, although their mechanism of action remains unclear. 3. Post-synaptic neurotoxins are antagonists of the nicotinic receptor on the skeletal muscle. Depending on their sequence, post-synaptic toxins are subdivided into short- and long-chain toxins. These toxins display different binding kinetics and different affinities for subtypes of nicotinic receptors. Post-synaptic neurotoxins have only been identified in venoms from the families Elapidae and Hydrophiidae. 4. Due to the high cost of developing new antivenoms and the reluctance of many companies to engage in this area of research, new methodologies are required to test the efficacy of existing antivenoms to ensure their optimal use. While chicken eggs have proven useful for the examination of haemorrhagic venoms, this procedure is not suited to venoms that primarily display neurotoxic activity. The chick biventer cervicis muscle preparation has proven useful for this purpose, enabling the rapid screening of antivenoms against a range of venoms. 5. Historically, the lethality of snake venoms has been based on murine LD50 studies. For ethical reasons, these studies are being superseded by in vitro studies. Instead, the time taken to produce 90% inhibition of nerve-mediated twitches (i.e. t90) in skeletal muscle preparations can be determined. However, these two procedures result in different rank orders because they measure two different parameters. While murine LD50 determinations are based on 'quantity', t90 values are based on how 'quickly' a venom acts. Therefore, knowledge of both parameters is still desirable. 6. In vitro neuromuscular preparations have proven to be invaluable tools in the examination of snake venoms and isolated neurotoxins. They will continue to play a role in further elucidating the mechanism of action of these highly potent toxins. Further study of these toxins may provide more highly specific research tools or lead compounds for pharmaceutical agents. [source]