Specific Application



Selected Abstracts


Mechanism of fiber–matrix separation in ribbed compression molded parts

POLYMER COMPOSITES, Issue 4 2007
Alejandro Londoño-Hurtado
This paper presents a model that predicts fiber–matrix separation in ribbed sections of compression molded parts. The model combines a mechanical analysis of compression molding with some experimentally measured variables. It is shown that with a higher closing speed, the viscosity of the material will increase and fiber–matrix separation can be reduced. Specific applications for this method are compression molding of sheet molding compounds and glass mat-reinforced thermoplastics. POLYM. COMPOS., 28:451–457, 2007. © 2007 Society of Plastics Engineers


Bank and Nonbank Financial Intermediation

THE JOURNAL OF FINANCE, Issue 6 2004
PHILIP BOND
ABSTRACT Conglomerates, trade credit arrangements, and banks are all instances of financial intermediation. However, these institutions differ significantly in the extent to which the projects financed absorb aggregate intermediary risk, in whether or not intermediation is carried out by a financial specialist, in the type of projects they fund, and in the type of claims they issue to investors. The paper develops a simple unified model that both accounts for the continued coexistence of these different forms of intermediation and explains why they differ. Specific applications to conglomerate firms, trade credit, and banking are discussed.


Computer-based management environment for an assembly language programming laboratory

COMPUTER APPLICATIONS IN ENGINEERING EDUCATION, Issue 1 2007
Santiago Rodríguez
Abstract This article describes the environment used in the Computer Architecture Department of the Technical University of Madrid (UPM) for managing small laboratory work projects and a specific application for an Assembly Language Programming Laboratory. The approach is based on a chain of tools that a small team of teachers can use to efficiently manage a course with a large number of students (400 per year). Students use this tool chain to complete their assignments using an MC88110 CPU simulator also developed by the Department. Students use a Delivery Agent tool to send files containing their implementations. These files are stored in one of the Department servers. Every student laboratory assignment is tested by an Automatic Project Evaluator that executes a set of previously designed and configured tests. These tools are used by teachers to manage mass courses, thereby avoiding restrictions on students working on the same assignment. This procedure may encourage students to copy others' laboratory work and we have therefore developed a complementary tool to help teachers find "replicated" laboratory assignment implementations. This tool is a plagiarism detection assistant that completes the tool-chain functionality. Jointly, these tools have demonstrated over the last decade that important benefits can be gained from the exploitation of a global laboratory work management system. Some of the benefits may be transferable to an area of growing importance that we have not directly explored, i.e. distance learning environments for technical subjects. © 2007 Wiley Periodicals, Inc. Comput Appl Eng Educ 15: 41–54, 2007; Published online in Wiley InterScience (www.interscience.wiley.com); DOI 10.1002/cae.20094


Is the Impact of Public Investment Neutral Across the Regional Income Distribution?

ECONOMIC GEOGRAPHY, Issue 3 2005
Evidence from Mexico
Abstract: This article investigates the contribution of public investment to the reduction of regional inequalities, with a specific application to Mexico. We examine the impact of public investment according to the position of each region in the conditional distribution of regional income by using quantile regression as an empirical technique. The results confirm the hypothesis that regional inequalities can indeed be attributed to the regional distribution of public investment; the observed pattern shows that public investment mainly helped to reduce regional inequalities among the richest regions.
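The quantile-regression idea used above can be sketched in a few lines: a linear conditional quantile is fit by minimizing the pinball (check) loss. The subgradient-descent fit and synthetic data below are illustrative only, not the authors' estimator or data.

```python
import numpy as np

def fit_quantile(x, y, tau, lr=0.02, iters=30000):
    """Fit a + b*x for the tau-th conditional quantile by subgradient descent
    on the pinball loss mean(max(tau*r, (tau-1)*r)), r = y - (a + b*x)."""
    a, b = 0.0, 0.0
    for _ in range(iters):
        r = y - (a + b * x)
        g = np.where(r >= 0, -tau, 1.0 - tau)   # d(loss)/d(prediction)
        a -= lr * g.mean()
        b -= lr * (g * x).mean()
    return a, b

# synthetic "regional income" data with a known linear relationship
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 5.0, 1000)
y = 1.0 + 2.0 * x + rng.normal(0.0, 0.5, 1000)
a50, b50 = fit_quantile(x, y, tau=0.5)          # median regression
```

Fitting at several values of tau (e.g. 0.1, 0.5, 0.9) is what lets one ask whether the slope differs across the conditional distribution, which is the crux of the article's question.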


Imaging of the shoulder

EQUINE VETERINARY EDUCATION, Issue 4 2010
W. R. Redding
Summary Diagnosis of lameness associated with the shoulder region requires a careful clinical examination, the use of specifically placed intra-articular analgesia and a combination of some common imaging techniques to accurately define the source of pain. Most equine practices performing lameness examinations in the horse have the radiographic and ultrasonographic equipment necessary to accurately image the shoulder. This article presents a description of the unique anatomy of the shoulder and the specific application of radiographic and ultrasonographic techniques to provide a complete set of diagnostic images of the shoulder region. A brief discussion of nuclear scintigraphy of this region is also included.


Trying to See Red Through Stickleback Photoreceptors: Functional Substitution of Receptor Sensitivities

ETHOLOGY, Issue 3 2006
Mickey P. Rowe
A key to understanding animal behavior is knowledge of the sensory information animals extract from their environment. For visually motivated tasks, the information animals obtain through their eyes is often assumed to be essentially the same as that perceived by humans. However, known differences in structure and processing among the visual systems of different animals clearly indicate that the world seen by each is different. A well-characterized difference between human and other animal visual systems is the number of types and spectral sensitivities of their photoreceptors. We are developing a technique, functional substitution, that exploits knowledge of these differences to portray, for human subjects, colors as they would appear through the photoreceptors of another animal. In a specific application, we ask human subjects to rank hues of male threespine stickleback (Gasterosteus aculeatus) throats viewed through stickleback photopigments. We compare these ranks to ranks of the same throat hues viewed through normal human photoreceptors. We find essentially no difference between the two sets of rankings. This suggests that any differences in human and stickleback rankings of such hues would result from differences in post-receptoral neural processing. Using a previously developed model of stickleback neural processing, we established another ranking of the hues, which was again essentially the same as the rankings produced by the human subjects. A growing literature indicates that stickleback do rank such hues in the evaluation of males as potential mates or threats. Although our results do not demonstrate that humans and stickleback use the same mechanisms to assess color, our experiments failed to show that stickleback and human rankings of throat hues should differ.
Nevertheless, a comparison of all these rankings to ranks derived from subjective color scoring by human observers suggests that color scoring may utilize other cues and should thus be used cautiously.


Daily streamflow modelling and assessment based on the curve-number technique

HYDROLOGICAL PROCESSES, Issue 16 2002
Jin-Yong Choi
Abstract A cell-based long-term hydrological model (CELTHYM) that can be integrated with a geographical information system (GIS) was developed to predict continuous stream flow from small agricultural watersheds. The CELTHYM uses a cell-by-cell soil moisture balance approach. For surface runoff estimation, the curve number technique considering soil moisture on a daily basis was used, and release rate was used to estimate baseflow. Evapotranspiration was computed using the FAO modified Penman equation that considered land-use-based crop coefficients, soil moisture and the influence of topography on radiation. A rice paddy field water budget model was also adapted for the specific application of the model to East Asia. Model sensitivity analysis was conducted to obtain operational information about the model calibration parameters. The CELTHYM was calibrated and verified with measured runoff data from the WS#1 and WS#3 watersheds of the Seoul National University, Department of Agricultural Engineering, in Hwaseong County, Kyounggi Province, South Korea. The WS#1 watershed comprises about 35.4% rice paddy fields and 42.3% forest, whereas the WS#3 watershed is about 85.0% forest and 11.5% rice paddy fields. The CELTHYM was calibrated for the parameter release rate, K, and soil moisture storage coefficient, STC, and results were compared with the measured runoff data for 1986. The validation results for WS#1 considering all daily stream flow were poor, with R2, E2 and RMSE having values of 0.40, −6.63 and 9.69 (mm), respectively, but validation results for days without rainfall were statistically significant (R2 = 0.66). Results for WS#3 showed good agreement with observed data for all days, and R2, E2 and RMSE were 0.92, 0.91 and 2.23 (mm), respectively, suggesting potential for CELTHYM application to other watersheds.
The direct runoff and water balance components for watershed WS#1 with significant areas of paddy fields did not perform well, suggesting that additional study of these components is needed. Copyright © 2002 John Wiley & Sons, Ltd.
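The curve-number runoff step at the heart of such models follows the standard SCS-CN relation; a minimal sketch of that relation (not the CELTHYM code, which additionally adjusts CN for daily soil moisture):

```python
def scs_runoff(p_mm, cn):
    """Direct runoff (mm) from daily rainfall p_mm using the SCS curve-number
    method: Q = (P - Ia)^2 / (P - Ia + S), with Ia = 0.2*S."""
    s = 25400.0 / cn - 254.0      # potential maximum retention S (mm)
    ia = 0.2 * s                  # initial abstraction
    if p_mm <= ia:
        return 0.0                # all rainfall abstracted, no runoff
    return (p_mm - ia) ** 2 / (p_mm - ia + s)
```

For example, 50 mm of rain on a CN = 75 surface yields about 9.3 mm of direct runoff, while CN = 100 (impervious, S = 0) returns the full 50 mm.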


Thinking inside the box: Novel linear scaling algorithm for Coulomb potential evaluation

INTERNATIONAL JOURNAL OF QUANTUM CHEMISTRY, Issue 4 2006
David C. Thompson
Abstract Beginning with the Poisson equation, and expanding the electronic potential in terms of sine functions, the natural orbitals for describing the particle-in-a-box problem, we find that simple analytic forms can be found for the evaluation of the Coulomb energy for both the interacting and non-interacting system of N electrons in a box. This method is reminiscent of the fast Fourier transform and scales linearly. To improve the usefulness of this result, we generalize the idea by considering a molecular system, embedded in a box, within which we determine the electrostatic potential in the same manner as that described for our model systems. Within this general formalism, we consider both periodic and aperiodic recipes with specific application to systems described using Gaussian orbitals, although in principle the method is completely general. © 2005 Wiley Periodicals, Inc. Int J Quantum Chem, 2006
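The sine-expansion idea can be illustrated on the 1D Poisson equation with box (Dirichlet) boundaries: project the source onto sin(nπx/L) modes and divide each coefficient by the corresponding eigenvalue of −d²/dx². The direct O(N·M) projection below is a didactic stand-in for the FFT-like linear-scaling scheme the paper develops.

```python
import numpy as np

def poisson_sine_solve(f, L, n_modes):
    """Solve -u''(x) = f(x) with u(0) = u(L) = 0 by expanding in sin(n*pi*x/L)."""
    n_pts = len(f)
    h = L / (n_pts + 1)
    x = np.linspace(h, L - h, n_pts)            # interior grid points
    u = np.zeros(n_pts)
    for n in range(1, n_modes + 1):
        phi = np.sin(n * np.pi * x / L)
        c_n = (2.0 / L) * np.sum(f * phi) * h   # sine coefficient of f
        u += c_n / (n * np.pi / L) ** 2 * phi   # divide by the -d2/dx2 eigenvalue
    return x, u

# check against the analytic solution u = sin(pi*x)/pi**2 for f = sin(pi*x)
x = np.linspace(1.0 / 200, 1.0 - 1.0 / 200, 199)
_, u = poisson_sine_solve(np.sin(np.pi * x), 1.0, 10)
err = np.max(np.abs(u - np.sin(np.pi * x) / np.pi ** 2))
```

Replacing the explicit mode loop with a fast sine transform is what makes the approach FFT-like in cost.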


The uptake of applied ecology

JOURNAL OF APPLIED ECOLOGY, Issue 1 2002
S. J. Ormerod
Summary
1. We asked 229 authors who have published recently in the Journal of Applied Ecology (1999–2001) whether their papers made management or policy recommendations and whether they had evidence of consequent uptake.
2. A total of 108 respondents working in the UK (34%), Europe (30%), the Americas (12%), Australasia (11%), Asia (7%) and Africa (6%) reported on 110 papers. They represented agro-ecosystems (35%), temperate forests or woodlands (16%), savanna, grass or arid lands (11%), rivers or wetlands (10%), estuaries or marine systems (7%) and tropical forests (5%). The major organisms were invertebrates (27%), birds (24%), mammals (21%) and higher plants (21%). Topics apparently under-represented in recent coverage include ecosystem science, urban areas, soils, mountain systems, fish, amphibians and lower organisms such as algae.
3. Almost all papers (99%) carried recommendations, and for 57% there was evidence of uptake in the broad categories of 'environmental management or models', 'information, training and education' and 'monitoring and assessment'. Most uptake involved large geographical scales through habitat or species management plans (32% of cases), effects on reserve design or designation (6%), and effects on agri-environmental policy (5%). The development of further research (11%), the communication of methods to other ecologists (9%), the dissemination of recommendations to practitioners or agencies (7%), and uptake in training or education (5%) were important uses of information.
4. Prestige from publication in the Journal of Applied Ecology aided several authors in convincing end-users of research value. User involvement in research as participants or funders was widespread (> 42% of papers), a fact which almost certainly promotes uptake along with the parallel dissemination of management messages. We view applied issues as an important interface between end-users and ecologists, of value to 'both' communities, but suggest that improved communication will further benefit the sponsorship and application of ecological science.
5. The major reason offered for lack of uptake was that it was still too soon after publication (21% of respondents). Costs, difficulty of implementation, the scale of the problem, and 'challenges to existing thinking' each figured in more than one response.
6. For some respondents, papers were led by curiosity rather than the need for direct application. Several authors published in the Journal to share ideas internationally, or said that recommendations were general, conceptual or long-term rather than specific. The editors of the Journal of Applied Ecology recognize the seminal importance of contributions that affect policy incrementally and conceptually as much as those with specific application.
7. These data provide evidence that ecological science is aiding environmental management and policy across a wide range of regions, ecosystems and types of organisms; rather than merely detecting problems, applied ecology is offering solutions both directly and more diffusely through conceptual advance. We invite the user community to offer their own perspectives about the value of research-led publications such as this Journal, about how links between researchers and users might be strengthened, and about how the uptake of applied ecology might be further advanced.


Meta-analysis in model implementation: choice sets and the valuation of air quality improvements

JOURNAL OF APPLIED ECONOMETRICS, Issue 6 2007
H. Spencer Banzhaf
This research illustrates how the methods developed for meta-analysis can serve to document and summarize voluminous information derived from repeated sensitivity analyses. Our application is to the sensitivity of welfare estimates derived from discrete choice models to assumptions about the choice set. These assumptions affect welfare estimates through both the estimated parameters of the model and, conditional on the parameters, the substitution among alternatives. In our specific application, the evaluation is in terms of estimated benefits of air quality improvements in Los Angeles based on discrete choices of neighborhood and housing. Copyright © 2007 John Wiley & Sons, Ltd.
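In a random-utility logit model of neighborhood and housing choice, the welfare measure in question is the familiar log-sum (expected consumer surplus). The toy utilities below are made up; they only illustrate the mechanical point that the assumed choice set moves the welfare estimate even with parameters held fixed.

```python
import numpy as np

def expected_cs(v, marg_util_income):
    """Expected consumer surplus under a multinomial logit with utilities v:
    E[CS] = ln(sum_j exp(v_j)) / (marginal utility of income)."""
    return float(np.log(np.sum(np.exp(v))) / marg_util_income)

# utilities of four hypothetical neighborhoods before/after an air-quality improvement
v0 = np.array([1.0, 0.5, 0.2, 0.0])
v1 = v0 + np.array([0.4, 0.4, 0.0, 0.0])       # improvement reaches two of them
full = expected_cs(v1, 1.0) - expected_cs(v0, 1.0)
# restricting the choice set to the two improved alternatives inflates the gain
restricted = expected_cs(v1[:2], 1.0) - expected_cs(v0[:2], 1.0)
```

With the restricted set containing only improved alternatives, the estimated gain equals the full 0.4 utility shift, whereas the full choice set dilutes it through substitution.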


Health Care Need: Three Interpretations

JOURNAL OF APPLIED PHILOSOPHY, Issue 2 2006
ANDREAS HASMAN
abstract The argument that scarce health care resources should be distributed so that patients in 'need' are given priority for treatment is rarely contested. In this paper, we argue that if need is to play a significant role in distributive decisions it is crucial that what is meant by need can be precisely articulated. Following a discussion of the general features of health care need, we propose three principal interpretations of need, each of which focuses on separate intuitions. Although this account may not be a completely exhaustive reflection of what people mean when they refer to need, the three interpretations provide a starting-point for further debate of what the concept means in its specific application. We discuss combined interpretations, the meaning of grading needs, and compare needs-based priority setting to social welfare maximisation.


Challenging the Bioethical Application of the Autonomy Principle within Multicultural Societies

JOURNAL OF APPLIED PHILOSOPHY, Issue 1 2004
Andrew Fagan
abstract This article critically re-examines the application of the principle of patient autonomy within bioethics. In complex societies such as those found in North America and Europe, health care professionals are increasingly confronted by patients from diverse ethnic, cultural, and religious backgrounds. This affects the relationship between clinicians and patients to the extent that patients' deliberations upon the proposed courses of treatment can, in various ways and to varying extents, be influenced by their ethnic, cultural, and religious commitments. The principle of patient autonomy is the main normative constraint imposed upon medical treatment. Bioethicists typically appeal to the principle of patient autonomy as a means for generally attempting to resolve conflict between patients and clinicians. In recent years a number of bioethicists have responded to the condition of multiculturalism by arguing that the autonomy principle provides the basis for a common moral discourse capable of regulating the relationship between clinicians and patients in those situations where patients' beliefs and commitments do or may contradict the ethos of biomedicine. This article challenges that claim. I argue that the precise manner in which the autonomy principle is philosophically formulated within such accounts prohibits bioethicists' deployment of autonomy as a core ideal for a common moral discourse within multicultural societies. The formulation of autonomy underlying such accounts cannot be extended to simply assimilate individuals' most fundamental religious and cultural commitments and affiliations per se. I challenge the assumption that respecting prospective patients' fundamental religious and cultural commitments is necessarily always compatible with respecting their autonomy. I argue that the character of some peoples' relationship with their cultural or religious community acts to significantly constrain the possibilities for acting autonomously.
The implication is clear. The autonomy principle may presently be invalidly applied in certain circumstances because the conditions for the exercise of autonomy have not been fully or even adequately satisfied. This is a controversial claim. The precise terms of my argument, while addressing the specific application of the autonomy principle within bioethics, will resonate beyond this sphere and raise questions for attempts to establish a common moral discourse upon the ideal of personal autonomy within multicultural societies generally.


Liposomes in dermatology today

JOURNAL OF THE EUROPEAN ACADEMY OF DERMATOLOGY & VENEREOLOGY, Issue 5 2009
J De Leeuw
Abstract Liposomes are vesicles consisting of spherical phospholipid bilayers with specific properties making them useful for topical application of drugs. Liposome research has expanded considerably over the last 30 years and nowadays it is possible to construct a wide range of liposomes varying in size, phospholipid composition and surface characteristics to suit the specific application for which they are intended. In dermatology, the topical application of liposomes has proven to be of therapeutic value. Liposomes can be used as carriers for hydrophilic as well as lipophilic therapeutic agents because of their amphipathic character. They may improve stabilization of instable drugs by encapsulating them and serve as penetration enhancers facilitating the transport of compounds that otherwise cannot penetrate the skin. Liposomes help in reducing skin irritation by sustaining the release of drugs and by hydration of the epidermis. They also have the potential to target drugs into the pilosebaceous structures and hence they have an additional advantage for treatment of hair follicle-associated disorders. Clinical data indicate that 5-ALA encapsulated in liposomes improves the quality of Fluorescence Diagnosis by ALA-induced Porphyrins (FD) and optimizes the results of Photodynamic Therapy (PDT). Conflicts of interest: none declared.


Bayesian calibration of computer models

JOURNAL OF THE ROYAL STATISTICAL SOCIETY: SERIES B (STATISTICAL METHODOLOGY), Issue 3 2001
Marc C. Kennedy
We consider prediction and uncertainty analysis for systems which are approximated using complex mathematical models. Such models, implemented as computer codes, are often generic in the sense that by a suitable choice of some of the model's input parameters the code can be used to predict the behaviour of the system in a variety of specific applications. However, in any specific application the values of necessary parameters may be unknown. In this case, physical observations of the system in the specific context are used to learn about the unknown parameters. The process of fitting the model to the observed data by adjusting the parameters is known as calibration. Calibration is typically effected by ad hoc fitting, and after calibration the model is used, with the fitted input values, to predict the future behaviour of the system. We present a Bayesian calibration technique which improves on this traditional approach in two respects. First, the predictions allow for all sources of uncertainty, including the remaining uncertainty over the fitted parameters. Second, they attempt to correct for any inadequacy of the model which is revealed by a discrepancy between the observed data and the model predictions from even the best-fitting parameter values. The method is illustrated by using data from a nuclear radiation release at Tomsk, and from a more complex simulated nuclear accident exercise. [source]
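The calibration step can be caricatured with a one-parameter "code" and a grid posterior: field observations update a prior over the unknown input, and predictions then carry the remaining parameter uncertainty. The full method additionally uses a Gaussian-process emulator and a model-discrepancy term, both omitted in this sketch; the simulator, data and noise level below are invented.

```python
import numpy as np

def code(x, theta):
    """Stand-in for an expensive simulator with one calibration input theta."""
    return theta * x

rng = np.random.default_rng(0)
x_obs = np.linspace(0.1, 1.0, 20)
y_obs = code(x_obs, 2.0) + rng.normal(0.0, 0.1, 20)   # field data, true theta = 2

# flat-prior grid posterior over theta given Gaussian observation error (sd 0.1)
thetas = np.linspace(0.0, 4.0, 801)
loglik = np.array([-0.5 * np.sum((y_obs - code(x_obs, t)) ** 2) / 0.1 ** 2
                   for t in thetas])
post = np.exp(loglik - loglik.max())
post /= post.sum()
theta_mean = float(np.sum(thetas * post))                          # posterior mean
theta_sd = float(np.sqrt(np.sum((thetas - theta_mean) ** 2 * post)))  # its spread
```

The point of the Bayesian treatment is exactly that `theta_sd` survives into the predictions instead of being discarded after an ad hoc fit.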


Understanding Materials Processing Lasers

LASER TECHNIK JOURNAL, Issue 5 2009
A Comprehensive Overview Covering the Capabilities and Applicability of the Major Systems
A decade ago, flowing gas CO2 lasers dominated the market for materials processing applications, such as cutting, welding and marking of metals and organics. However, over the past few years, new laser technologies have emerged that give the materials processor more options in choosing a laser source. Specifically, these developments include the introduction of higher power sealed CO2 lasers, the advent of fiber lasers, and improvements in the brightness of direct diode laser systems. The result is that choosing the right laser for a specific application is now a more complex task than in the past. This article reviews the technology and capabilities of all these laser types and provides guidance on identifying the optimal source for a specific need.


A New Technique for Preparing a Filled Type of Polymeric Gradient Material

MACROMOLECULAR MATERIALS & ENGINEERING, Issue 11 2006
Yong-Bin Zhu
Abstract Summary: So-called functionally gradient materials have received increased attention as a new type of composite whose microelements, including composition and structure, change spatially to optimize the gradient properties for a specific application. In this study, a new technique for continuously preparing a filled type of polymeric gradient material (PGM) was investigated, based on co-extrusion/gradient distribution/2-dimensional mixing with conventional polymeric material processing facilities. The processing line from co-extrusion and gradient distribution to 2-dimensional mixing was realized with two extruders, a gradient distribution unit and 2-dimensional mixing units; the gradient distribution unit and the 2-dimensional mixing units were designed in our group. As an example, a PE/GB (polyethylene/glass bead) PGM was prepared using this new technique. The gradient variation of composition along the sample thickness direction was studied by TG and SEM. The TG results indicated that a gradient variation of the content of GB was formed along the thickness of the sample. Direct evidence of the gradient distribution of GB came from SEM observation, which showed an increasing stacking density of GB along the sample thickness. Experimental results indicated that the processing method with co-extrusion/gradient distribution/2-dimensional mixing can serve as a new way to produce a filled type of PGM and is worthy of further investigation. [Graphical abstract: the prepared polyethylene/glass bead PGM; the graph illustrates the glass bead concentration gradient across the sample thickness.]


Does the lipid membrane composition of arsonoliposomes affect their anticancer activity?

MOLECULAR NUTRITION & FOOD RESEARCH (FORMERLY NAHRUNG/FOOD), Issue 5 2009
A cell culture study
Abstract Sonicated arsonoliposomes were prepared using an arsonolipid with a palmitic acid acyl chain (C16), mixed with phosphatidylcholine (PC) or 1,2-distearoyl-sn-glycero-3-phosphocholine (DSPC), and cholesterol (Chol), at a C16/PC (or DSPC)/Chol molar ratio of 8:12:10. PEG-lipid (1,2-distearoyl-sn-glycero-3-phosphoethanolamine conjugated to polyethylene glycol 2000) containing vesicles (PEGylated arsonoliposomes; PC-based and DSPC-based) were also prepared. The cytotoxicity of these arsonoliposomes towards different cancer cells (human promyelocytic leukaemia NB4, prostatic cancer PC3, human breast adenocarcinoma MDA-MB-468 and human T-lymphocyte MT-4 cells), and also towards human umbilical vein endothelial cells (HUVECs), was evaluated by calculating the arsonoliposome-induced growth inhibition of the cells with the MTT assay. IC-50 values were interpolated from cell number/arsonoliposome concentration curves. The results reveal that all types of arsonoliposomes evaluated significantly inhibit the growth of most of the cancer cells studied (PC3, NB4, MT4), with the exception of the MDA-MB-468 breast cancer cells, which were minimally affected by arsonoliposomes, in some cases even less than HUVECs. Nevertheless, for the same cell type the differences between the different types of arsonoliposomes were significant but not proportional to their stability, indicating that the formation of arsonoliposomes with very stable membranes is not a problem for their anticancer activity. It is therefore concluded that arsonoliposome composition should be adjusted in accordance with their in vivo kinetics and the biodistribution of As and/or encapsulated drug desired for each specific application.
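Interpolating an IC-50 from a cell-number/concentration curve reduces to finding where viability crosses 50% of control; a minimal sketch with hypothetical MTT data (the doses and viabilities below are invented, and real analyses usually interpolate on log-concentration or fit a sigmoid):

```python
import numpy as np

def ic50(conc, viability):
    """Concentration at 50% viability, linearly interpolated.
    Assumes viability decreases as concentration increases."""
    # np.interp needs an increasing x-axis, so reverse the decreasing viabilities
    return float(np.interp(0.5, viability[::-1], conc[::-1]))

conc = np.array([1.0, 10.0, 100.0])   # hypothetical doses
viab = np.array([0.9, 0.5, 0.1])      # fraction of control from the MTT assay
```

Here `ic50(conc, viab)` returns 10.0, the dose at which viability passes through 50%.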


Coherent synchrotron emission from cosmic ray air showers

MONTHLY NOTICES OF THE ROYAL ASTRONOMICAL SOCIETY, Issue 4 2006
Qinghuan Luo
ABSTRACT Coherent synchrotron emission by particles moving along semi-infinite tracks is discussed, with a specific application to radio emission from air showers induced by high-energy cosmic rays. It is shown that in general, radiation from a particle moving along a semi-infinite orbit consists of usual synchrotron emission and modified impulsive bremsstrahlung. The latter component is due to the instantaneous onset of the curved trajectory of the emitting particle at its creation. Inclusion of the bremsstrahlung leads to broadening of the radiation pattern and a slower decay of the spectrum at the cut-off frequency than the conventional synchrotron emission. Possible implications of these features for air shower radio emission are discussed.


Analysis of parameterized quadratic eigenvalue problems in computational acoustics with homotopic deviation theory

NUMERICAL LINEAR ALGEBRA WITH APPLICATIONS, Issue 6 2006
F. Chaitin-Chatelin
Abstract This paper analyzes a family of parameterized quadratic eigenvalue problems from acoustics in the framework of homotopic deviation theory. Our specific application is the acoustic wave equation (in 1D and 2D) where the boundary conditions are partly pressure release (homogeneous Dirichlet) and partly impedance, with a complex impedance parameter; its reciprocal, the admittance t, is the classical homotopy parameter. In particular, we study the spectrum as t → ∞. We show that in the limit, part of the eigenvalues remain bounded and converge to the so-called kernel points. We also show that there exist so-called critical points that correspond to frequencies for which no finite value of the admittance can cause a resonance. Finally, the physical interpretation that the impedance condition is transformed into a pressure release condition as |t| → ∞ enables us to give the kernel points in closed form as eigenvalues of the discrete Dirichlet problem. Copyright © 2006 John Wiley & Sons, Ltd.
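Quadratic eigenvalue problems (λ²M + λC + K)x = 0 of this kind are commonly solved through a companion linearization that doubles the dimension. A generic sketch assuming M is invertible (this is the standard textbook device, not the paper's specific discretization):

```python
import numpy as np

def qep_eigenvalues(M, C, K):
    """Eigenvalues of the quadratic pencil l^2*M + l*C + K = 0 via the
    first companion linearization A = [[0, I], [-M^-1 K, -M^-1 C]]."""
    n = M.shape[0]
    Minv = np.linalg.inv(M)
    A = np.block([[np.zeros((n, n)), np.eye(n)],
                  [-Minv @ K, -Minv @ C]])
    return np.linalg.eigvals(A)

# undamped test problem: M = I, C = 0, K = diag(1, 4)  ->  l = ±1i, ±2i
lam = qep_eigenvalues(np.eye(2), np.zeros((2, 2)), np.diag([1.0, 4.0]))
```

Sweeping the admittance parameter through such a solver is the brute-force counterpart of the homotopic deviation analysis, which instead characterizes the t → ∞ limit analytically.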


Stereotypes, Asian Americans, and Wages: An Empirical Strategy Applied to Computer Use at Work

AMERICAN JOURNAL OF ECONOMICS AND SOCIOLOGY, Issue 2 2009
Sanae Tashiro
This article examines the effect on wages of the Asian-American stereotype as mathematically and technically adept, and the role this stereotype may play in explaining racial wage differences. We propose an empirical strategy to examine the influence of stereotypes on labor market outcomes, with a specific application to the wage premium associated with computer use at work. Using Current Population Survey data, ordinary least squares estimates do not provide compelling evidence that a positive stereotype affects wages for Asian Americans.


Polymer electrolyte membranes for high-temperature fuel cells based on aromatic polyethers bearing pyridine units

POLYMER INTERNATIONAL, Issue 11 2009
Joannis K Kallitsis
Abstract This review is focused on the design and synthesis of new high-temperature polymer electrolytes based on aromatic polyethers bearing polar pyridine moieties in the main chain. Such materials are designed to be used in polymer electrolyte fuel cells operating at temperatures higher than 100 °C. New monomers and polymers have been synthesized and characterized within this field in respect of their suitability for this specific application. Copolymers with optimized structures, combining excellent film-forming properties with high mechanical, thermal and oxidative stability and controlled acid uptake, have been synthesized which, after doping with phosphoric acid, result in ionically conducting membranes. Such materials have been studied in respect of their conductivity under various conditions and used for the construction of membrane-electrode assemblies (MEAs) for fuel cells operating at temperatures up to 180 °C. New polymeric membranes, improved in terms of oxidative stability and mechanical properties in the doped state, have been synthesized and used effectively for MEA construction and single-cell testing. Copyright © 2009 Society of Chemical Industry


A computer experiment application to the design and optimization of a capacitive accelerometer

APPLIED STOCHASTIC MODELS IN BUSINESS AND INDUSTRY, Issue 2 2009
M. J. Alvarez
Abstract An accelerometer is a transducer that allows measuring the acceleration acting on a structure. Physically, an accelerometer consists of a central mass suspended by thin and flexible arms, and its performance is highly dependent on the dimensions of both the mass and the arms. The two most important parameters when evaluating the performance of these devices are the sensitivity and the operating frequency range (or bandwidth), the latter being limited to a fraction of the resonance frequency. Therefore, it is very convenient to gain knowledge on how changes in the dimensions of the mass and arms affect the value of the natural frequency of the accelerometer, as it will provide guidelines to design accelerometers that fulfil frequency requirements of a specific application. A quadratic polynomial function of the natural logarithm of the frequency versus geometrical factors has been obtained using the response surface methodology approach. A face-centered cube design was used in the experimentation. The data were obtained by conducting computer simulations using finite element design techniques. A better understanding of how these variables affect the value of frequency has been reached, which will be very useful for device design purposes. Copyright © 2009 John Wiley & Sons, Ltd.
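The response-surface step amounts to fitting a full quadratic polynomial in coded factors by ordinary least squares over the design points. A two-factor sketch on a face-centered cube design; the factor meanings and coefficient values below are made up, and the "simulation" is a noiseless stand-in for the finite element runs:

```python
import numpy as np

# face-centered cube design in two coded factors (levels -1, 0, +1): 9 runs
levels = np.array([-1.0, 0.0, 1.0])
X1, X2 = np.meshgrid(levels, levels)
x1, x2 = X1.ravel(), X2.ravel()

def simulated_lnf(x1, x2):
    """Stand-in for ln(natural frequency) from a finite element run."""
    return 5.0 + 0.8 * x1 - 0.3 * x2 + 0.1 * x1 * x2 - 0.05 * x1**2 + 0.02 * x2**2

y = simulated_lnf(x1, x2)

# full quadratic model: intercept, linear, interaction, pure quadratic terms
A = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
```

With 9 runs and 6 coefficients the noiseless fit is exact; with real simulation scatter the same least-squares step returns the smoothed response surface used for design guidance.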


Semiparametric Regression Modeling with Mixtures of Berkson and Classical Error, with Application to Fallout from the Nevada Test Site

BIOMETRICS, Issue 1 2002
Bani Mallick
Summary. We construct Bayesian methods for semiparametric modeling of a monotonic regression function when the predictors are measured with classical error, Berkson error, or a mixture of the two. Such methods require a distribution for the unobserved (latent) predictor, a distribution we also model semiparametrically. Such combinations of semiparametric methods for the dose-response function as well as the latent variable distribution have not been considered in the measurement error literature for any form of measurement error. In addition, our methods represent a new approach to problems where the measurement error combines Berkson and classical components. While the methods are general, we develop them around a specific application, namely, the study of thyroid disease in relation to radiation fallout from the Nevada Test Site. We use these data to illustrate our methods, which suggest a point estimate (posterior mean) of relative risk at high doses nearly double that of previous analyses, but also suggest much greater uncertainty in the relative risk. [source]
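The distinction between the two error structures drives everything in this literature, so a small simulation is useful. The sketch below is illustrative only (it is not the paper's Bayesian model): in a simple linear dose-response, classical error (observed = true + noise) attenuates a naive regression slope, while Berkson error (true = assigned + noise) leaves it unbiased. All sample sizes and variances are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Classical error: we observe w = x + u, where x is the true dose.
x = rng.normal(0.0, 1.0, n)
w_classical = x + rng.normal(0.0, 1.0, n)

# Berkson error: the true dose scatters around an assigned value w,
# x = w + u (e.g., doses assigned by location, true doses vary around them).
w_berkson = rng.normal(0.0, 1.0, n)
x_berkson = w_berkson + rng.normal(0.0, 1.0, n)

def slope(pred, x_true):
    """OLS slope when y = 2*x + eps is regressed on an error-prone predictor."""
    y = 2.0 * x_true + rng.normal(0.0, 0.1, len(x_true))
    return np.cov(pred, y)[0, 1] / np.var(pred)

s_classical = slope(w_classical, x)    # attenuated toward zero (about 1/2 here)
s_berkson = slope(w_berkson, x_berkson)  # approximately unbiased
print(s_classical, s_berkson)
```

With equal true-dose and error variances, the classical-error slope shrinks by the reliability ratio var(x)/var(w) = 1/2, which is why naive analyses of classically mismeasured doses understate the dose-response.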


The Role of Functional Parameters for Topographical Characterization of Bone-Anchored Implants

CLINICAL IMPLANT DENTISTRY AND RELATED RESEARCH, Issue 2 2006
Anna Arvidsson MSc
ABSTRACT Background: The surface topographical characterization of bone-anchored implants has been recommended to be based on amplitude, spatial, and hybrid parameters. There are also functional parameters that have the potential to describe characteristics important for a specific application. Purpose: The aim of the present study was to evaluate whether parameters that have been described as functional in engineering applications are also relevant in the topographical characterization of bone-anchored implants. Materials and Methods: The surface topography of threaded titanium implants with different surface roughness (Sa, Sds, and Sdr) was analyzed with an optical interferometer, and five candidate functional parameters (Sbi, Sci, Svi, Sm, and Sc) were calculated. Examples of the same parameters for five commercially available dental implants were also calculated. Results: The highest core fluid retention index (Sci) was displayed by the turned implants, followed by fixtures blasted with 250- and 25-µm particles, respectively. Fixtures blasted with 75-µm Al2O3 particles displayed the lowest Sci value. This is the inverse of the bone biological ranking based on earlier in vivo studies of the experimental surfaces included in the present study. Conclusion: A low core fluid retention index (Sci) seems favorable for bone-anchored implants. It is therefore suggested that Sci be added to the set of topographical parameters for bone-anchored implants, to help predict the biological outcome. [source]
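Of the parameters named above, the amplitude parameter Sa has the simplest definition and illustrates how such descriptors are computed from interferometer data: it is the arithmetic mean deviation of the surface heights from the mean plane. The sketch below uses a synthetic height map; a real analysis would use the measured 2-D height array, and the functional parameters (Sbi, Sci, Svi) would additionally require the bearing area curve, which is not shown here.

```python
import numpy as np

def sa(height_map):
    """Arithmetic mean height Sa: mean absolute deviation from the mean plane."""
    z = np.asarray(height_map, float)
    return np.mean(np.abs(z - z.mean()))

# Synthetic rough surface: a sinusoidal texture plus noise, in micrometres.
yy, xx = np.mgrid[0:128, 0:128]
surface = 0.5 * np.sin(xx / 5.0) + np.random.default_rng(1).normal(0, 0.1, (128, 128))
print(f"Sa = {sa(surface):.3f} um")
```

A flat plane gives Sa = 0 regardless of its offset, since heights are referenced to the mean plane; this is why Sa alone cannot distinguish surfaces with the same amplitude but different spatial texture, motivating the spatial, hybrid, and functional parameters discussed in the abstract.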


Parallel processing of remotely sensed hyperspectral imagery: full-pixel versus mixed-pixel classification

CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 13 2008
Antonio J. Plaza
Abstract The rapid development of space and computer technologies makes it possible to store huge amounts of remotely sensed image data, collected using airborne and satellite instruments. In particular, NASA is continuously gathering high-dimensional image data with Earth-observing hyperspectral sensors such as the Jet Propulsion Laboratory's airborne visible/infrared imaging spectrometer (AVIRIS), which measures reflected radiation in hundreds of narrow spectral bands at different wavelength channels for the same area on the surface of the Earth. The development of fast techniques for transforming massive amounts of hyperspectral data into scientific understanding is critical for space-based Earth science and planetary exploration. Despite the growing interest in hyperspectral imaging research, only a few efforts have been devoted to the design of parallel implementations in the literature, and detailed comparisons of standardized parallel hyperspectral algorithms are currently unavailable. This paper compares several existing and new parallel processing techniques for pure and mixed-pixel classification in hyperspectral imagery. The distinction between pure and mixed-pixel analysis is linked to the application domain under consideration, and results from the very rich spectral information available from hyperspectral instruments. In some cases, such information allows image analysts to overcome the constraints imposed by limited spatial resolution. In most cases, however, the spectral bands collected by hyperspectral instruments have high statistical correlation, and efficient parallel techniques are required to reduce the dimensionality of the data while retaining the spectral information that allows for the separation of the classes. In order to address this issue, this paper also develops a new parallel feature extraction algorithm that integrates the spatial and spectral information.
The proposed technique is evaluated (from the viewpoint of both classification accuracy and parallel performance) and compared with other parallel techniques for dimensionality reduction and classification in the context of three representative application case studies: urban characterization, land-cover classification in agriculture, and mapping of geological features, using AVIRIS data sets with detailed ground truth. Parallel performance is assessed using Thunderhead, a massively parallel Beowulf cluster at NASA's Goddard Space Flight Center. The detailed cross-validation of parallel algorithms conducted in this work may help image analysts select parallel algorithms for specific applications. Copyright © 2008 John Wiley & Sons, Ltd. [source]
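The dimensionality-reduction step that the abstract describes (compressing hundreds of highly correlated bands while preserving class-separating spectral information) can be illustrated with a plain principal component analysis over the spectral bands of a (rows, cols, bands) cube. This is a generic sketch, not the paper's spatial-spectral algorithm, and the synthetic cube below (two mixed endmember spectra) is purely illustrative.

```python
import numpy as np

def spectral_pca(cube, n_components):
    """Project each pixel's spectrum onto the top principal components."""
    rows, cols, bands = cube.shape
    pixels = cube.reshape(-1, bands).astype(float)
    pixels -= pixels.mean(axis=0)            # centre each band
    # Eigendecomposition of the band covariance matrix (bands x bands).
    cov = pixels.T @ pixels / (pixels.shape[0] - 1)
    eigvals, eigvecs = np.linalg.eigh(cov)
    order = np.argsort(eigvals)[::-1][:n_components]
    return (pixels @ eigvecs[:, order]).reshape(rows, cols, n_components)

# Tiny synthetic cube: two latent endmember spectra mixed across 32x32 pixels.
rng = np.random.default_rng(0)
endmembers = rng.random((2, 100))            # 2 spectra, 100 bands
abundances = rng.random((32 * 32, 2))        # per-pixel mixing fractions
cube = (abundances @ endmembers).reshape(32, 32, 100)

reduced = spectral_pca(cube, n_components=3)
print(reduced.shape)
```

In a data-parallel setting, the covariance accumulation (`pixels.T @ pixels`) is the natural place to partition work: each node computes a partial covariance over its spatial block, and the partial sums are reduced before the eigendecomposition.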


Capillary and microchip electrophoresis in microdialysis: Recent applications

ELECTROPHORESIS, Issue 1 2010
Elizabeth Guihen
Abstract The theme of this review is to highlight the importance of microscale electrophoretic separation systems in microdialysis (µD). The ability of CE and MCE to yield very rapid and highly efficient separations using just nanolitre volumes of microdialysate samples is also discussed. Recent advances in this area are highlighted through some exciting new applications, as is the need for further innovation. The first section briefly introduces the concept of µD sampling coupled with electrophoresis-based separation and the inherent advantages of this approach. The following section highlights some specific applications of CE separations in the detection of important biomarkers such as low-molecular-weight neurotransmitters, amino acids, and other molecules that are frequently encountered in µD. Various detection modes in CE are outlined and some of their advantages and drawbacks are discussed. The last section introduces the concept of micro-total analysis systems and the coupling of MCE and µD, illustrating some of the latest innovations. The concluding section reflects on the future of this important chemical alliance between µD and CE/MCE. [source]


Comparative sediment quality guideline performance for predicting sediment toxicity in Southern California, USA

ENVIRONMENTAL TOXICOLOGY & CHEMISTRY, Issue 12 2005
Doris E. Vidal
Abstract Several types of sediment quality guidelines (SQGs) are used by multiple agencies in southern California (USA) to interpret sediment chemistry data, yet little information is available to identify the best approaches to use. The objective of this study was to evaluate the ability of five SQGs to predict the presence and absence of sediment toxicity in coastal southern California: the effects range-median quotient (ERMq), consensus moderate effect concentration (consensus MEC), mean sediment quality guideline quotient (SQGQ1), apparent effects threshold (AET), and equilibrium partitioning (EqP) for organics. Large differences in predictive ability among the SQGs were obtained when each approach was applied to the same southern California data set. Sediment quality guidelines that performed well in identifying nontoxic samples were not necessarily the best predictors of toxicity. In general, the mean ERMq, SQGQ1, and consensus MECq approaches had a better overall predictive ability than the AET and EqP for organics approaches. In addition to evaluating the predictive ability of SQGs addressing chemical mixtures, the effect of an individual SQG value (DDT) was also evaluated for the mean ERMq with and without DDT. The mean ERMq without DDT had a better ability to predict toxic samples than the mean ERMq with DDT. Similarities in discriminatory ability between different approaches, variations in accuracy among SQG values for some chemicals, and the presence of complex mixtures of contaminants in most samples underscore the need to apply SQGs in combination, such as the mean quotient. Management objectives and SQG predictive ability using regional data should be determined beforehand, so that the most appropriate SQG approach and critical values can be identified for specific applications. [source]
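The mean-quotient idea the abstract recommends is arithmetically simple: divide each measured concentration by its guideline value and average the quotients over the chemicals measured. The sketch below is a hypothetical illustration; the chemical names, concentrations, and guideline values are placeholders, not data from the study.

```python
def mean_sqg_quotient(concentrations, guidelines):
    """Mean of concentration/guideline ratios over the chemicals measured."""
    quotients = [concentrations[c] / guidelines[c]
                 for c in concentrations if c in guidelines]
    return sum(quotients) / len(quotients)

# Made-up sediment sample (mg/kg dry weight) and made-up guideline values.
sample = {"cadmium": 4.8, "lead": 110.0, "zinc": 270.0}
guideline = {"cadmium": 9.6, "lead": 218.0, "zinc": 410.0}

q = mean_sqg_quotient(sample, guideline)
print(round(q, 3))
```

Averaging the quotients, rather than counting exceedances, lets one strongly elevated chemical raise the index even when the others sit below their guidelines, which is the behaviour the mixture-oriented approaches in the study rely on.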


Nanomaterials for Neural Interfaces

ADVANCED MATERIALS, Issue 40 2009
Nicholas A. Kotov
Abstract This review focuses on the application of nanomaterials for neural interfacing. The junction between nanotechnology and neural tissues is particularly worthy of scientific attention for several reasons: (i) Neural cells are electroactive, and the electronic properties of nanostructures can be tailored to match the charge transport requirements of electrical cellular interfacing. (ii) The unique mechanical and chemical properties of nanomaterials are critical for integration with neural tissue as long-term implants. (iii) Solutions to many critical problems in neural biology/medicine are limited by the availability of specialized materials. (iv) Neuronal stimulation is needed for a variety of common and severe health problems. This confluence of need, accumulated expertise, and potential impact on the well-being of people suggests the potential of nanomaterials to revolutionize the field of neural interfacing. In this review, we begin with foundational topics, such as the current status of neural electrode (NE) technology, the key challenges facing the practical utilization of NEs, and the potential advantages of nanostructures as components of chronic implants. After that, a detailed account of the toxicology and biocompatibility of nanomaterials with respect to neural tissue is given. Next, we cover a variety of specific applications of nanoengineered devices, including drug delivery, imaging, topographic patterning, electrode design, nanoscale transistors for high-resolution neural interfacing, and photoactivated interfaces. We also critically evaluate the specific properties of particular nanomaterials, including nanoparticles, nanowires, and carbon nanotubes, that can be exploited in neuroprosthetic devices. The most promising future areas of research and practical device engineering are discussed as a conclusion to the review. [source]


Preparation of Boron-Carbide/Carbon Nanofibers from a Poly(norbornenyldecaborane) Single-Source Precursor via Electrostatic Spinning

ADVANCED MATERIALS, Issue 7 2005
T. Welna
Pyrolysis of poly(norbornenyldecaborane) that has been electrostatically spun provides a route to non-woven mats of boron-carbide/carbon ceramic nanofibers with narrow distributions and controllable dimensions (see Figure). This approach allows the fabrication of composite ceramic fibers with varying composition, which could be tailored to suit specific applications. [source]


Food applications of trans fatty acid substitutes

INTERNATIONAL JOURNAL OF FOOD SCIENCE & TECHNOLOGY, Issue 5 2007
Paul Wassell
Summary The review outlines the increasing need to reduce trans fatty acids, and addresses the functionality issues of various trans free solutions through discussion of hydrogenation, interesterification, and fractionation, and their influence on fat crystallisation and solid fat content. Caution is urged not to focus solely on physio-chemical aspects, but to approach trans free designing for specific food applications from a multidisciplinary angle. Examples of specific applications; margarines, shortenings and frying oils are given. The review also offers a glimpse into what the future trans free trends may hold. [source]