Existing Technology (existing + technology)

Selected Abstracts


Surface plasmon resonance for high-throughput ligand screening of membrane-bound proteins

BIOTECHNOLOGY JOURNAL, Issue 11 2009
Jennifer A. Maynard Dr.
Abstract Technologies based on surface plasmon resonance (SPR) have allowed rapid, label-free characterization of protein-protein and protein-small molecule interactions. SPR has become the gold standard in industrial and academic settings, in which the interaction between a pair of soluble binding partners is characterized in detail or a library of molecules is screened for binding against a single soluble protein. In spite of these successes, SPR is only beginning to be adapted to the needs of membrane-bound proteins, which are difficult to study in situ but represent promising targets for drug and biomarker development. Existing technologies, such as BIAcore™, have been adapted for membrane protein analysis by building supported lipid layers or capturing lipid vesicles on existing chips. Newer technologies, still in development, will allow membrane proteins to be presented in native or near-native formats. These include SPR nanopore arrays, in which lipid bilayers containing membrane proteins stably span small pores that are addressable from both sides of the bilayer. Here, we discuss current SPR instrumentation and the potential for SPR nanopore arrays to enable quantitative, high-throughput screening of G protein-coupled receptor ligands and applications in basic cellular biology. [source]
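
The interaction parameters that SPR instruments report are the association and dissociation rate constants of a binding pair. Below is a minimal sketch of the underlying 1:1 Langmuir kinetics; the rate constants and concentration are illustrative assumptions, not values from the article:

```python
import numpy as np

# 1:1 Langmuir binding as seen by an SPR sensorgram:
# dR/dt = ka * C * (Rmax - R) - kd * R, with response R in resonance units (RU).
ka = 1e5       # association rate constant (1/(M*s)), assumed
kd = 1e-3      # dissociation rate constant (1/s), assumed
C = 100e-9     # analyte concentration (M), assumed
Rmax = 100.0   # response at surface saturation (RU), assumed

t = np.linspace(0.0, 300.0, 1000)         # association phase (s)
kobs = ka * C + kd                        # observed exponential rate
R_assoc = (ka * C * Rmax / kobs) * (1.0 - np.exp(-kobs * t))

R_dissoc = R_assoc[-1] * np.exp(-kd * t)  # dissociation phase after analyte washout

KD = kd / ka                              # equilibrium dissociation constant
print(f"KD = {KD:.2e} M")                 # 1.00e-08 M for these constants
```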


Virtual laboratory: A distributed collaborative environment

COMPUTER APPLICATIONS IN ENGINEERING EDUCATION, Issue 1 2004
Tiranee Achalakul
Abstract This article proposes the design framework of a distributed, real-time collaborative architecture. The architecture concept allows information to be fused, disseminated, and interpreted collaboratively, in real time, among researchers living on different continents. The architecture is based on the distributed object technology DCOM. In our framework, every module can be viewed as an object. Each of these objects communicates and exchanges data with the others via a set of interfaces and connection points. We constructed the virtual laboratory based on the proposed architecture. The laboratory allows multiple analysts to work together collaboratively through a standard web browser using a set of tools, namely chat, whiteboard, audio/video exchange, file transfer, and application sharing. Several existing technologies, such as NetMeeting, are integrated to provide collaborative functions. Finally, the virtual laboratory quality evaluation is described with an example application of remote collaboration in satellite image fusion and analysis. © 2004 Wiley Periodicals, Inc. Comput Appl Eng Educ 12: 44–53, 2004; Published online in Wiley InterScience (www.interscience.wiley.com); DOI 10.1002/cae.20008 [source]
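
As a rough illustration of the interface/connection-point pattern described above, here is a minimal sketch transliterated from DCOM into plain Python; the class and method names are hypothetical, not the authors' API:

```python
class ConnectionPoint:
    """Lets a module broadcast events to every connected sink (observer)."""
    def __init__(self):
        self._sinks = []

    def advise(self, sink):          # DCOM terminology for "subscribe"
        self._sinks.append(sink)

    def fire(self, event, payload):
        for sink in self._sinks:
            sink.on_event(event, payload)


class WhiteboardModule:
    """One collaborative-tool object; chat, file transfer, etc. look alike."""
    def __init__(self):
        self.connection_point = ConnectionPoint()

    def draw(self, stroke):
        # Broadcast the stroke so every remote analyst's view stays in sync.
        self.connection_point.fire("stroke", stroke)


class AnalystSink:
    def __init__(self, name):
        self.name = name

    def on_event(self, event, payload):
        print(f"{self.name} received {event}: {payload}")


wb = WhiteboardModule()
wb.connection_point.advise(AnalystSink("analyst_in_Bangkok"))
wb.connection_point.advise(AnalystSink("analyst_in_Boston"))
wb.draw({"from": (0, 0), "to": (10, 5)})
```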


Automotive Material Sustainability Through Reversible Adhesives

ADVANCED ENGINEERING MATERIALS, Issue 7 2010
Allan R. Hutchinson
This communication defines the key existing technologies for reversible adhesion and bonded-joint disassembly, and introduces the reader to early experimental findings on the use of thermally labile functional additives in an adhesive matrix. These additives have been found to induce localized, out-of-plane stresses in a joint's bondline, allowing the adhesive to disbond. It has been found that the combination of additive and adhesive matrix is key to the relationship between joint disassembly and joint strength. [source]


Phylogenetic analysis of developmental and postnatal mouse cell lineages

EVOLUTION AND DEVELOPMENT, Issue 1 2010
Stephen J. Salipante
SUMMARY Fate maps depict how cells are related through past lineage relationships, and are useful tools for studying developmental and somatic processes. However, with existing technologies, it has not been possible to generate detailed fate maps of complex organisms such as the mouse. We and others have therefore proposed a novel approach, "phylogenetic fate mapping," in which patterns of somatic mutation carried by the individual cells of an animal are used to retrospectively deduce lineage relationships through phylogenetic inference. Here, we have cataloged genomic polymorphisms at 324 mutation-prone polyguanine tracts for nearly 300 cells isolated from a single mouse, and have explored the cells' lineage relationships both phylogenetically and through a network-based approach. We present a model of mouse embryogenesis in which an early period of substantial cell mixing is followed by more coherent growth of clones later. We find that cells from certain tissues have greater numbers of close relatives in other specific tissues than expected by chance, suggesting that those populations arise from a similar pool of ancestral lineages. Finally, we have investigated the dynamics of cell turnover (the frequency of cell loss and replacement) in postnatal tissues. This work offers a longitudinal study of developmental lineages, from conception to adulthood, and provides insight into basic questions of mouse embryology as well as the somatic processes that occur after birth. [source]
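
A toy sketch of the inference idea, under stated assumptions: cells are simulated to accumulate rare length changes at 324 polyguanine tracts, and standard hierarchical clustering (a stand-in for the full phylogenetic and network analyses in the paper) recovers lineage structure from the mutation profiles:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Synthetic lineage: cells inherit rare length changes at 324 mutation-prone
# polyguanine tracts; relatedness is then inferred from the mutation profiles.
rng = np.random.default_rng(0)

def divide(parent, mu=0.02):
    """Return a daughter cell with rare +/-1 slippage mutations per tract."""
    child = parent.copy()
    hits = rng.random(child.size) < mu
    child[hits] += rng.choice([-1, 1], hits.sum())
    return child

zygote = np.zeros(324)
a, b = divide(zygote), divide(zygote)      # two founder lineages
cells = np.array([divide(divide(a)) for _ in range(3)] +
                 [divide(divide(b)) for _ in range(3)])

# Pairwise distance = fraction of tracts at which two cells differ;
# average-linkage clustering stands in for full phylogenetic inference.
tree = linkage(cells, method="average", metric="hamming")
print(fcluster(tree, t=2, criterion="maxclust"))  # should separate the two clades
```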


A Bayesian model averaging approach for cost-effectiveness analyses

HEALTH ECONOMICS, Issue 7 2009
Caterina Conigliani
Abstract We consider the problem of assessing new and existing technologies for their cost-effectiveness in the case where data on both costs and effects are available from a clinical trial, and we address it by means of the cost-effectiveness acceptability curve. The main difficulty in these analyses is that cost data usually exhibit highly skewed and heavy-tailed distributions, so that it can be extremely difficult to produce realistic probabilistic models for the underlying population distribution, and in particular to model accurately the tail of the distribution, which is highly influential in estimating the population mean. Here, in order to integrate the uncertainty about the model into the analysis of cost data and into cost-effectiveness analyses, we consider an approach based on Bayesian model averaging: instead of choosing a single parametric model, we specify a set of plausible models for costs and estimate the mean cost with a weighted mean of its posterior expectations under each model, with weights given by the posterior model probabilities. The results are compared with those obtained with a semi-parametric approach that does not require any assumption about the distribution of costs. Copyright © 2008 John Wiley & Sons, Ltd. [source]
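
A minimal sketch of the model-averaging step on synthetic cost data, using a BIC approximation to the posterior model probabilities in place of the paper's full Bayesian computation; the candidate models and all numbers are illustrative:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
costs = rng.lognormal(mean=7.0, sigma=1.2, size=200)  # hypothetical skewed costs

# Candidate parametric models for the cost distribution
models = {"lognormal": stats.lognorm, "gamma": stats.gamma,
          "weibull": stats.weibull_min}

results = {}
for name, dist in models.items():
    params = dist.fit(costs, floc=0)                 # ML fit, location fixed at 0
    loglik = np.sum(dist.logpdf(costs, *params))
    k = len(params) - 1                              # free parameters
    bic = k * np.log(len(costs)) - 2.0 * loglik
    results[name] = (bic, dist.mean(*params))        # (BIC, implied mean cost)

# BIC approximation to posterior model probabilities (equal prior weights)
bics = np.array([v[0] for v in results.values()])
weights = np.exp(-0.5 * (bics - bics.min()))
weights /= weights.sum()

# Model-averaged mean cost: weighted mean over the candidate models
bma_mean = sum(w * results[name][1] for w, name in zip(weights, results))
print(f"Model-averaged mean cost: {bma_mean:.2f}")
```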


How to make single small holes with large aspect ratios

PHYSICA STATUS SOLIDI - RAPID RESEARCH LETTERS, Issue 2-3 2009
Helmut Föll
Abstract In many areas of research there is a need for single small holes with diameters from a few nm to several µm and large aspect ratios, a need that is hard to meet with existing technologies. In a proof of principle, it is shown that suitable single holes, or specific arrays of a few single holes, can be made by first etching a very large number of small and deep holes or pores into semiconductors like Si or InP by established electrochemical means, followed by masking the desired holes and filling all others with, e.g., a metal in a galvanic process. The potential and limitations of this technique are briefly discussed. (© 2009 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim) [source]


Techno-Economic Analysis of Hydrazine Hydrate Technologies

CHEMICAL ENGINEERING & TECHNOLOGY (CET), Issue 9 2010
P. Nikhitha
Abstract The increasing world demand for hydrazine hydrate emphasizes the need for a techno-economic analysis of the existing technologies. Three processes, namely the Raschig process, the urea process, and the peroxide-ketazine process, are chosen for technical analysis followed by cost estimation and economic assessment. The technical part involves the development of flow sheets, process design and calculations, and the estimation of raw materials, labor, utilities, and process equipment through sizing and other sub-components. The economic part comprises the estimation of working capital, fixed capital investment, total capital investment, and total production costs. Economic parameters such as net profit, rate of return, payback period, and break-even point are also estimated to complete the economic analysis. The results obtained from the technical analysis and economic feasibility studies show that the peroxide-ketazine-based hydrazine hydrate technology has clear advantages in terms of raw material consumption and economic competitiveness. [source]
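
A small sketch of how the named economic parameters fit together; all monetary figures are hypothetical placeholders, not values from the study:

```python
# Illustrative computation of the economic parameters named in the abstract.
fixed_capital = 12_000_000.0          # fixed capital investment (USD), assumed
working_capital = 2_000_000.0         # working capital (USD), assumed
total_capital = fixed_capital + working_capital

annual_revenue = 9_000_000.0          # hydrazine hydrate sales (USD/yr), assumed
annual_production_cost = 6_500_000.0  # total production cost (USD/yr), assumed

net_profit = annual_revenue - annual_production_cost
rate_of_return = net_profit / total_capital   # simple rate of return
payback_period = fixed_capital / net_profit   # years, ignoring depreciation/tax

# Break-even point: fraction of capacity where revenue just covers costs,
# splitting the production cost into fixed and variable parts.
fixed_cost = 2_000_000.0                      # assumed fixed share (USD/yr)
variable_cost = annual_production_cost - fixed_cost   # at full capacity
break_even_fraction = fixed_cost / (annual_revenue - variable_cost)

print(f"Net profit:     {net_profit:,.0f} USD/yr")
print(f"Rate of return: {rate_of_return:.1%}")
print(f"Payback period: {payback_period:.1f} yr")
print(f"Break-even at   {break_even_fraction:.0%} of capacity")
```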


A North American multilaboratory study of CD4 counts using flow cytometric panleukogating (PLG): A NIAID-DAIDS Immunology Quality Assessment Program Study

CYTOMETRY, Issue S1 2008
Thomas N. Denny
Abstract Background The global HIV/AIDS pandemic and guidelines for initiating anti-retroviral therapy (ART) and opportunistic infection prophylaxis demand affordable, reliable, and accurate CD4 testing. PanLeukogated CD4 (PLG), a simple, innovative approach that uses existing technology and has been applied successfully in resource-challenged settings, could offer cost savings and improved precision. Methods Day-old whole blood from 99 HIV+ donors was simultaneously studied in five North American laboratories to compare the performance of their predicate methods with the dual-platform PLG method. The predicate technology included varying 4-color CD45/CD3/CD4/CD8 protocols on different flow cytometers. Each laboratory also assayed eight replicate specimens of day-old blood from 10 to 14 local donors. Bias and precision of the predicate and PLG methods were studied between and within the participating laboratories. Results Significantly (P < 0.0001) improved between-laboratory precision/coefficient of variation (CV%) was noted using the PLG method (overall median CV 9.3% vs. predicate median CV 13.1%). Within-laboratory precision was also significantly (P < 0.0001) better overall using PLG (median CV 4.6% vs. predicate median CV 6.2%) and in 3 of the 5 laboratories. PLG counts tended to be 11% smaller than predicate counts (P < 0.0001) for shipped (median predicate − PLG = 31) and local specimens (median predicate − PLG = 23), both overall and in 4 of 5 laboratories (median decreases of 4, 16, 20, and 21% in shipped specimens); the other laboratory had a median increase of 5%. Conclusion Laboratories using predicate CD4 methods similar to those in this study could improve their between-laboratory and within-laboratory precision, and reduce costs, by switching to the PLG method after adequate training, if a change (usually a decrease) in CD4 counts is acceptable to their health systems. © 2008 Clinical Cytometry Society [source]
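
The precision metric throughout the study is the coefficient of variation (CV%) of replicate CD4 counts. A minimal sketch on made-up counts, assuming eight replicates per donor as in the study design:

```python
import numpy as np

# Hypothetical replicate CD4 counts (cells/uL) for one donor in two labs.
replicates = {
    "lab_A": [412, 398, 405, 420, 401, 415, 409, 396],
    "lab_B": [388, 402, 379, 395, 410, 385, 399, 391],
}

for lab, counts in replicates.items():
    counts = np.asarray(counts, dtype=float)
    cv_pct = 100.0 * counts.std(ddof=1) / counts.mean()  # within-lab CV%
    print(f"{lab}: mean CD4 = {counts.mean():.0f}, CV = {cv_pct:.1f}%")

# Between-laboratory CV%: spread of the lab means for the same donor.
lab_means = [np.mean(v) for v in replicates.values()]
between_cv = 100.0 * np.std(lab_means, ddof=1) / np.mean(lab_means)
print(f"between-lab CV = {between_cv:.1f}%")
```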


Glucose sensors: a review of current and emerging technology

DIABETIC MEDICINE, Issue 3 2009
N. S. Oliver
Abstract Glucose monitoring technology has been used in the management of diabetes for three decades. Traditional devices use enzymatic methods to measure glucose concentration and provide point-sample information. More recently, continuous glucose monitoring devices have become available, providing more detailed data on glucose excursions. In future applications, the continuous glucose sensor may become a critical component of the closed-loop insulin delivery system and, as such, must be selective, rapid, predictable, and acceptable for continuous patient use. Many potential sensing modalities are being pursued, including optical and transdermal techniques. This review aims to summarize existing technology and the methods for assessing glucose sensing devices, and to provide an overview of emergent sensing modalities. [source]


Exhaled Nitric Oxide Levels during Acute Asthma Exacerbation

ACADEMIC EMERGENCY MEDICINE, Issue 7 2005
Michelle Gill MD
Abstract Objectives: Fractional exhaled nitric oxide (FENO) has been shown in laboratory settings and trials of patients with stable asthma to correlate with the degree of airway inflammation. The authors hypothesized that the technique of measuring FENO would be reproducible in the setting of acute asthma in the emergency department (ED) and that FENO results during ED visits would potentially predict disposition, predict relapse following discharge, and correlate with the National Institutes of Health (NIH) asthma severity scale and peak expiratory flow measurements. Methods: The authors prospectively measured FENO in a convenience sample of ED patients with acute exacerbations of asthma, both at the earliest possible opportunity and one hour later. Each assessment point included triplicate measurements to assess reproducibility. The authors also performed spirometry and classified asthma severity using the NIH asthma severity scale. Discharged patients were contacted at 72 hours to determine whether their asthma had relapsed. Results: The authors discontinued the trial (n = 53) after a planned interim analysis demonstrated reproducibility (coefficient of variation, 15%) substantially worse than their a priori threshold for precision (4%). There was no association between FENO response and corresponding changes in spirometry or clinical scores. Areas under the receiver operating characteristic curves for the prediction of hospitalization and relapse were poor (0.579 and 0.713, respectively). Conclusions: FENO measurements in ED patients with acute asthma exacerbations were poorly reproducible and did not correlate with standard measures of asthma severity. These results suggest that, using existing technology, FENO is not a useful marker for assessing severity, response to treatment, or disposition of acute asthmatic patients in the ED. [source]
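
A short sketch of the ROC/AUC analysis used here, computed via the Mann-Whitney statistic on synthetic FENO values; the study itself reported AUCs of 0.579 and 0.713:

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical FENO values (ppb) for admitted vs. discharged patients.
feno_admitted = rng.normal(45, 20, 25)
feno_discharged = rng.normal(38, 20, 28)

def roc_auc(pos, neg):
    """AUC via the Mann-Whitney U statistic: P(random pos > random neg)."""
    pos, neg = np.asarray(pos), np.asarray(neg)
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (len(pos) * len(neg))

print(f"AUC for predicting admission: {roc_auc(feno_admitted, feno_discharged):.3f}")
```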


Nanostructured Bulk Silicon as an Effective Thermoelectric Material

ADVANCED FUNCTIONAL MATERIALS, Issue 15 2009
Sabah K. Bux
Abstract Thermoelectric power sources have consistently demonstrated their extraordinary reliability and longevity for deep space missions and small unattended terrestrial systems. However, more efficient bulk materials and practical devices are required to improve existing technology and expand into large-scale waste heat recovery applications. Research has long focused on complex compounds that best combine the electrical properties of degenerate semiconductors with the low thermal conductivity of glassy materials. Recently, it has been found that nanostructuring is an effective method to decouple electrical and thermal transport parameters. Dramatic reductions in the lattice thermal conductivity are achieved by nanostructuring bulk silicon with limited degradation in its electron mobility, leading to an unprecedented increase by a factor of 3.5 in its performance over that of the parent single-crystal material. This makes nanostructured bulk (nano-bulk) Si an effective high-temperature thermoelectric material that performs at about 70% of the level of state-of-the-art Si0.8Ge0.2 but without the need for expensive and rare Ge. [source]
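
The performance metric behind the reported factor-of-3.5 gain is the dimensionless thermoelectric figure of merit ZT = S²σT/κ. A sketch with rough, illustrative property values (not the paper's measurements) showing how a drop in thermal conductivity propagates to ZT:

```python
# Thermoelectric figure of merit: ZT = S^2 * sigma * T / kappa.
S = 300e-6       # Seebeck coefficient (V/K), illustrative for doped Si
sigma = 5e4      # electrical conductivity (S/m), illustrative
T = 1000.0       # absolute temperature (K)

kappa_single_crystal = 100.0  # thermal conductivity (W/m/K), illustrative
kappa_nanobulk = 10.0         # reduced by phonon scattering at grain boundaries

def zt(S, sigma, kappa, T):
    return S**2 * sigma * T / kappa

print(f"ZT (single crystal): {zt(S, sigma, kappa_single_crystal, T):.3f}")
print(f"ZT (nano-bulk):      {zt(S, sigma, kappa_nanobulk, T):.3f}")
```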


Quantification of the graphical details of collagen fibrils in transmission electron micrographs

JOURNAL OF MICROSCOPY, Issue 1 2001
Y. Xia
A novel 2D image analysis technique is demonstrated. Using digitized images of articular cartilage from transmission electron microscopy (TEM), this technique performs a localized 'vector' analysis at each region that is large enough to include several to tens of collagen fibrils but small enough to provide a fine resolution for the whole tissue. For each small, localized region, the morphology of the collagen fibrils can be characterized by three quantities essential to the nature of the tissue: the concentration of the fibrils, the overall orientation of the fibrils, and the anisotropy of the fibrils. This technique is capable of providing new insight into the existing technology by assigning quantitative attributes to the qualitative graphics. The assigned quantities are sensitive to the fine structure of the collagen matrix and meaningful in terms of its architectural nature. These quantities could provide a critical linkage between the ultrastructure of the tissue and the macroscopic behaviours of the material. In addition, coarse-graining the microscopic resolution of EM without compromising the essential features of the tissue's structure provides a direct view of the tissue's morphology and permits direct correlations and comparisons among interdisciplinary techniques. [source]
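
One plausible realization of such a localized vector analysis is the image structure tensor, which yields a dominant orientation and an anisotropy index per window. The sketch below applies this standard construction to a synthetic window; the function and parameter names are our own, not the authors' exact method:

```python
import numpy as np

def region_descriptors(window):
    """Concentration, fibril orientation, and anisotropy for one image window."""
    gy, gx = np.gradient(window.astype(float))
    # Structure tensor components, averaged over the window
    jxx, jyy, jxy = (gx * gx).mean(), (gy * gy).mean(), (gx * gy).mean()
    theta = 0.5 * np.arctan2(2.0 * jxy, jxx - jyy)   # dominant gradient direction
    fibril_angle = theta + np.pi / 2.0               # fibrils run perpendicular to it
    lam1 = 0.5 * (jxx + jyy) + 0.5 * np.hypot(jxx - jyy, 2.0 * jxy)
    lam2 = 0.5 * (jxx + jyy) - 0.5 * np.hypot(jxx - jyy, 2.0 * jxy)
    anisotropy = (lam1 - lam2) / (lam1 + lam2 + 1e-12)  # 0 isotropic, 1 aligned
    concentration = window.mean()  # stain density as a crude proxy for fibril content
    return concentration, fibril_angle, anisotropy

# Synthetic TEM-like window with fibrils running at ~135 degrees
y, x = np.mgrid[0:64, 0:64]
window = np.sin(0.5 * (x + y))
conc, angle, aniso = region_descriptors(window)
print(f"orientation = {np.degrees(angle):.0f} deg, anisotropy = {aniso:.2f}")
```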


Approaches for assessing hazards and risks to workers and the public from contaminated land

REMEDIATION, Issue 1 2007
Michael Gochfeld
Many public agencies and private entities are faced with assessing the risks to humans from contamination on their lands. The United States Department of Energy (US DOE) and Department of Defense are responsible for large holdings of contaminated land and face a long-term and costly challenge to assure sustainable protectiveness. With increasing interest in the conversion of brownfields to productive uses, many former industrial properties must also be assessed to determine compatible future land uses. In the United States, many cleanup plans or actions are based on the Comprehensive Environmental Response, Compensation, and Liability Act, which provides important but incomplete coverage of these issues, although many applications have tried to involve stakeholders at multiple steps. Where there is the potential for exposure to workers, the public, and the environment from either cleanup or leaving residual contamination in place, a more comprehensive approach is needed to evaluate and balance the present and future risks from existing contamination, from remediation actions, and from postremediation residual contamination. This article focuses on the US DOE, the agency with the largest hazardous waste remediation task in the world. A framework is presented, extending from preliminary assessment through risk assessment and balancing, epidemiology, monitoring, communication, and stakeholder involvement, that is useful for assessing risk to workers and site neighbors. Examples are provided for people who eat fish, meat, or fruit from contaminated habitats. The US DOE's contaminated sites are unique in a number of ways: (1) huge physical footprint, (2) types of waste (mixed radiation/chemical), and (3) quantities of waste. Proposed future land uses provide goals for remediation, but since some contamination is of a type or magnitude that cannot be cleaned up with existing technology, this in turn constrains future land use options, requiring an iterative approach. The risk approaches must fit a range of future land uses and end-states, from leave-in-place to complete cleanup. This will include not only traditional risk methodologies, but also the assessment and surveillance necessary for stewards to monitor long-term risk from historic and future exposure and maintain sustainable protectiveness. Because of the distinctiveness of DOE sites, application of the methodologies developed here to other waste site situations requires site-specific evaluation. © 2007 Wiley Periodicals, Inc. [source]


Dynamic or Static Capabilities?

THE JOURNAL OF PRODUCT INNOVATION MANAGEMENT, Issue 5 2009
Process Management Practices and Response to Technological Change
Whether and how organizations adapt to changes in their environments has been a prominent theme in organization and strategy research. Within this research, there is controversy about whether organizational routines hamper or facilitate adaptation. Organizational routines give rise to inertia but are also the vehicles for change in recent work on dynamic capabilities. This rising interest in routines in research coincides with an increase in management practices focused on organizational routines and processes. This study explores how the increasing use of process management practices affected organizational response to a major technological change through new product developments. The empirical setting is the photography industry over a decade, during the shift from silver-halide chemistry to digital technology. The advent and rise of practices associated with the new ISO 9000 certification program in the 1990s coincided with increasing technological substitution in photography, allowing for assessing how increasing attention to routines through ISO 9000 practices over time affected ongoing responsiveness to the technological change. The study further compares the effects for the incumbent firms in the existing technology with nonincumbent firms entering from elsewhere. Relying on longitudinal panel data models as well as hazard models, findings show that greater process management practices dampened response to new generations of digital technology, but this effect differed for incumbents and nonincumbents. Increasing use of process management practices over time had a greater negative effect on incumbents' response to the rapid technological change. The study contributes to research in technological change by highlighting specific management practices that may create disconnects between firms' capabilities and changing environments and disadvantage incumbents in the face of radical technological change. This research also contributes to literature on organizational routines and capabilities. Studying the effects of increasing ISO 9000 practices undertaken in firms provides an opportunity to gauge the effects of systematic routinization of organizational activities and their effects on adaptation. This research also contributes to management practice. The promise of process management is to help firms adapt to changing environments, and, as such, managers facing technological change may adopt process management practices as a response to uncertainty and change. But managers must more fully understand the potential benefits and risks of process management to ensure these practices are used in the appropriate contexts. [source]


Enhanced energy efficiency and reliability of telecommunication equipment with the introduction of novel air cooled thermal architectures

BELL LABS TECHNICAL JOURNAL, Issue 2 2010
Domhnaill Hernon
In the past, thermal management was an afterthought in the design process of a product because heat dissipation loads and densities were minute and did not adversely affect component reliability. In fact, it may be stated that, historically, the sole purpose of thermal management was to ensure component operation below a critical temperature, thereby providing reliable equipment operation for a given time period. However, this mindset has evolved in recent years given current economic and energy concerns. Climate change concern owing to vast greenhouse gas emissions, increasing fuel and electricity costs, and a general trend towards energy-efficiency awareness have promoted thermal management to the forefront of "green" innovation within the information and communications technology (ICT) sector. Considering that up to 50 percent of the energy budget of a data center is spent on cooling equipment and that two percent of the United States' annual electricity is consumed by telecommunications equipment, it becomes obvious that thermal management has a key role to play in the development of eco-sustainable solutions. This paper provides an overview of the importance of thermal management for reliable component operation and highlights the research areas where improved energy efficiency can be achieved. Novel air-cooled thermal solutions demonstrating significant energy savings and improved reliability over existing technology are presented, including three-dimensional (3D) monolithic heat sinks and vortex generators. © 2010 Alcatel-Lucent. [source]
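
A back-of-the-envelope sketch of the air-cooling budget that drives heat-sink design, with illustrative numbers: a component stays below its critical temperature only if the stack of thermal resistances from junction to ambient is small enough for the dissipated power:

```python
# Junction temperature model: T_junction = T_ambient + P * (R_jc + R_tim + R_sa).
# All values below are illustrative assumptions, not figures from the paper.
power = 40.0            # heat dissipated by the component (W)
t_ambient = 45.0        # inlet air temperature in the shelf (deg C)
t_junction_max = 105.0  # critical junction temperature (deg C)

r_junction_case = 0.3   # K/W, junction-to-case (from a component datasheet)
r_interface = 0.2       # K/W, thermal interface material

# Largest sink-to-air resistance the design can tolerate:
r_sink_max = (t_junction_max - t_ambient) / power - r_junction_case - r_interface
print(f"Required heat-sink resistance: <= {r_sink_max:.2f} K/W")
# A better heat sink (e.g., 3D monolithic fins) or stronger airflow lowers the
# achievable sink-to-air resistance and thus the junction temperature.
```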


Screening and prevention of diabetic blindness

ACTA OPHTHALMOLOGICA, Issue 4 2000
Einar Stefánsson
ABSTRACT. : Diabetic eye disease remains a major cause of blindness in the world. Laser treatment for proliferative diabetic retinopathy and diabetic macular edema became available more than two decades ago. The outcome of treatment depends on the timing of laser treatment. The laser treatment is optimally delivered when high-risk characteristics have developed in proliferative retinopathy or diabetic macular edema and before this has significantly affected vision. Laser treatment is usually successful if applied during this optimal period whereas the treatment benefit falls sharply if the treatment is applied too late. In order to optimize the timing of laser treatment in diabetic eye disease screening programs have been established. The oldest screening program is 20 years old and several programs have been established during the last decade. In this paper the organisation and methods of screening programs are described including direct and photographic screening. The incidence and prevalence of blindness is much lower in populations where screening for diabetic eye disease has been established compared to diabetic populations without screening. Technical advantages may allow increased efficiency and telescreening. From a public health standpoint screening for diabetic eye disease is one of the most cost effective health procedures available. Diabetic eye disease can be prevented using existing technology and the cost involved is many times less than the cost of diabetic blindness. [source]