Technologies Available

Selected Abstracts
Alternatives to the Conference Status Quo: Summary Recommendations from the 2008 CORD Academic Assembly Conference Alternatives Workgroup
ACADEMIC EMERGENCY MEDICINE, Issue 2009. Annie T. Sadosty MD
Abstract Objective: A panel of Council of Emergency Medicine Residency Directors (CORD) members was asked to examine the existing Accreditation Council for Graduate Medical Education (ACGME) EM Program Requirements pertaining to educational conferences, identify best practices, and recommend revisions as appropriate. Methods: Using a quasi-Delphi technique, 30 emergency medicine (EM) residency program directors and faculty examined the existing requirements. Findings were presented to the CORD members attending the 2008 CORD Academic Assembly and disseminated to the broader membership through the CORD e-mail list server. Results: The following four ACGME EM Program Requirements were examined and recommendations made:
1. The 5 hours/week conference requirement: For fully accredited programs in good standing, outcomes should drive how programs allocate and mandate educational time. Maintain the 5 hours/week conference requirement for new programs, programs with provisional accreditation, programs in difficult political environs, and those with short accreditation cycles. If the program requirements must retain a minimum hours/week reference, future requirements should take into account varying program lengths (3 versus 4 years).
2. The 70% attendance requirement: Develop a new requirement that allows programs more flexibility to customize according to local resources, individual residency needs, and individual resident needs.
3. The requirement for synchronous versus asynchronous learning: Synchronous and asynchronous learning activities each have advantages and disadvantages. The ideal curriculum capitalizes on the strengths of both through a deliberate mixture of the two.
4. Educationally justified innovations: Transition from process-based program requirements to outcomes-based requirements.
Conclusions: The conference requirements that were logical and helpful years ago may not be logical or helpful now. Technologies available to educators have changed, the amount of material to cover has grown, and online on-demand education has grown even more. We believe that flexibility is needed to customize EM education to suit individual resident and individual program needs, to capitalize on regional and national resources when local resources are limited, to innovate, and to analyze and evaluate interventions with an eye toward outcomes. [source]

Technical Cost Modeling for the Mechanical Milling at Cryogenic Temperature (Cryomilling)
ADVANCED ENGINEERING MATERIALS, Issue 8 2004. J. Ye
Cryomilling is one of the few technologies available to fabricate a large quantity of nanostructured materials. No matter how exciting and promising a technology is, its ultimate realization is invariably dependent on economic success. Technical cost modeling was employed in this paper to analyze the processing cost of cryomilling. The results demonstrated that cryomilling has the potential to be commercially economical for fabricating nanostructured materials. [source]
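The cryomilling abstract above rests on technical cost modeling, which breaks a processing cost down into contributions such as capital amortization, consumables, energy, and labor, then divides by throughput. As a rough illustration of the approach only, here is a minimal sketch in Python; every parameter value (liquid-nitrogen consumption, power draw, labor rate, and so on) is a placeholder assumption, not data from Ye's paper.

```python
# Minimal technical cost model sketch for a batch cryomilling process.
# All parameter values are illustrative placeholders, not figures from the paper.

def cryomilling_cost_per_kg(
    batch_kg=0.5,             # powder processed per batch (kg)
    cycle_h=8.0,              # milling time per batch (h)
    ln2_rate_kg_h=10.0,       # liquid-nitrogen consumption (kg/h)
    ln2_price=0.15,           # $/kg of LN2
    power_kw=5.0,             # mill power draw (kW)
    electricity_price=0.10,   # $/kWh
    labor_h_per_batch=1.0,    # operator attention per batch (h)
    labor_rate=40.0,          # $/h
    capital_cost=250_000.0,   # installed equipment cost ($)
    life_years=10, batches_per_year=800,
):
    consumables = ln2_rate_kg_h * cycle_h * ln2_price
    energy = power_kw * cycle_h * electricity_price
    labor = labor_h_per_batch * labor_rate
    amortization = capital_cost / (life_years * batches_per_year)
    return (consumables + energy + labor + amortization) / batch_kg

print(f"~${cryomilling_cost_per_kg():,.0f} per kg of cryomilled powder")
```

Running sensitivity analysis on such a model (varying batch size, LN2 price, or utilization) is what identifies the cost drivers and supports the paper's economic conclusion.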
Towards correlative imaging of plant cortical microtubule arrays: combining ultrastructure with real-time microtubule dynamics
JOURNAL OF MICROSCOPY, Issue 3 2009. D.A. Barton
Summary There are a variety of microscope technologies available to image plant cortical microtubule arrays. These can be applied to investigate specific questions relating to array function, ultrastructure or dynamics. Immunocytochemistry combined with confocal laser scanning microscopy provides low-resolution "snapshots" of cortical microtubule arrays at the time of fixation, whereas live cell imaging of fluorescent fusion proteins highlights the dynamic characteristics of the arrays. High-resolution scanning electron microscopy provides surface detail about the individual microtubules that form cortical microtubule arrays and can also resolve cellulose microfibrils that form the innermost layer of the cell wall. Transmission electron microscopy of the arrays in cross section can be used to examine links between microtubules and the plasma membrane and, combined with electron tomography, has the potential to provide a complete picture of how individual microtubules are spatially organized within the cortical cytoplasm. Combining these high-resolution imaging techniques with the expression of fluorescent cytoskeletal fusion proteins in live cells using correlative microscopy procedures will usher in a radical change in our understanding of the molecular dynamics that underpin the organization and function of the cytoskeleton. [source]

Unravelling response-specificity in Ca2+ signalling pathways in plant cells
NEW PHYTOLOGIST, Issue 1 2001. Jason J. Rudd
Summary Considerable advances have been made, both in the technologies available to study changes in intracellular cytosolic free Ca2+ ([Ca2+]i) and in our understanding of Ca2+ signalling cascades in plant cells, but how specificity can be generated from such a ubiquitous component as Ca2+ remains unclear. Recently the concept of 'Ca2+ signatures' has been formulated; tight control of the temporal and spatial characteristics of alterations in [Ca2+]i signals is thought to be responsible, at least in part, for the specificity of the response. However, the way in which Ca2+ signatures are decoded, which depends on the nature and location of the targets of the Ca2+ signals, has received little attention. In a few key systems, progress is being made on how diverse Ca2+ signatures might be transduced within cells in response to specific signals. Valuable pieces of the signal-specificity puzzle are being put together, and this is illustrated here using some key examples; these emphasize the global importance of Ca2+-mediated signal-transduction cascades in the responses of plants to a wide diversity of extracellular signals. However, the way in which signal specificity is encoded and transduced is still far from clear. [source]

Research and Development Trends in Biodiesel
ASIA-PACIFIC JOURNAL OF CHEMICAL ENGINEERING, Issue 5-6 2004. V. Rudolph
Biodiesel, a derivative of plant oils or animal fats, has gained widespread acceptance in recent years as a sustainable alternative fuel to petroleum diesel due to its environmental benefits and renewability. Although there are several different ways in which biodiesel can be used or formulated as a fuel, such as direct blending, microemulsions and thermal cracking, the most widespread remains the alkyl esters of fatty acids obtained through transesterification of the oils or fats. In transesterification, triglycerides, the main constituents of oils and fats, are converted into esters through reaction with simple alcohols. The physical and chemical properties of the esters thus obtained are very similar to those of petroleum diesel. This paper reviews the current technologies available for the transesterification of vegetable oils and animal fats and identifies that the biggest factor deterring greater market uptake of biodiesel is its cost. It concludes that, in addition to a government policy framework, e.g. to reduce the pump price of biodiesel through fuel tax exemption, further technological development presents significant scope for improvement. At present, there are no suitable, developed transesterification technologies that can handle cheap, low-quality feedstocks, including waste animal fats and spent cooking oils. These feedstocks contain high percentages of water and free fatty acids, which are extremely detrimental to the yield and reaction rates of the transesterification processes. This paper also suggests some future research and development directions and requirements for more competitive biodiesel production. [source]
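To make the transesterification stoichiometry in the biodiesel abstract concrete: one mole of triglyceride reacts with three moles of alcohol to give three moles of alkyl ester plus one mole of glycerol, and excess alcohol (commonly around a 6:1 molar ratio) is fed to drive the equilibrium. The Python sketch below works through the mass balance; the 6:1 ratio and the choice of triolein as a model triglyceride are illustrative assumptions, not values from the review.

```python
# Transesterification mass balance: triolein + 3 MeOH -> 3 methyl oleate + glycerol.
# Molecular weights in g/mol; triolein stands in for a generic vegetable-oil triglyceride.
MW_TRIOLEIN = 885.4
MW_METHANOL = 32.04
MW_METHYL_OLEATE = 296.5
MW_GLYCEROL = 92.09

oil_kg = 100.0        # feed oil (kg)
molar_ratio = 6.0     # methanol:oil molar ratio (assumed excess)

mol_oil = oil_kg * 1000 / MW_TRIOLEIN
meoh_fed_kg = mol_oil * molar_ratio * MW_METHANOL / 1000
meoh_reacted_kg = mol_oil * 3 * MW_METHANOL / 1000
ester_kg = mol_oil * 3 * MW_METHYL_OLEATE / 1000    # theoretical biodiesel yield
glycerol_kg = mol_oil * MW_GLYCEROL / 1000

print(f"methanol fed: {meoh_fed_kg:.1f} kg ({meoh_fed_kg - meoh_reacted_kg:.1f} kg excess to recover)")
print(f"theoretical ester yield: {ester_kg:.1f} kg, glycerol by-product: {glycerol_kg:.1f} kg")
```

The near one-to-one mass yield of ester from oil is why feedstock price dominates biodiesel cost, which is consistent with the review's conclusion.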
High-throughput DNA sequencing: concepts and limitations
BIOESSAYS, Issue 6 2010. Martin Kircher
Abstract Recent advances in DNA sequencing have revolutionized the field of genomics, making it possible for even single research groups to generate large amounts of sequence data very rapidly and at a substantially lower cost. These high-throughput sequencing technologies make deep transcriptome sequencing and transcript quantification, whole genome sequencing and resequencing available to many more researchers and projects. However, while the cost and time have been greatly reduced, the error profiles and limitations of the new platforms differ significantly from those of previous sequencing technologies. The selection of an appropriate sequencing platform for particular types of experiments is an important consideration, and requires a detailed understanding of the technologies available, including sources of error, error rate, and the speed and cost of sequencing. We review the relevant concepts and compare the issues raised by the current high-throughput DNA sequencing technologies. We analyze how future developments may overcome these limitations and what challenges remain. [source]
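Two small calculations underlie much of the platform-selection reasoning the sequencing review describes: per-base error rates are reported as Phred-scaled qualities (Q = -10 log10 p), and expected genome coverage follows the Lander-Waterman model (c = LN/G, with the fraction of bases left uncovered approximately e^-c). The Python sketch below applies both; the read length, read count, and genome size are made-up example values, not figures from the review.

```python
import math

# Phred quality to error probability: Q = -10 * log10(p)  =>  p = 10**(-Q/10)
def phred_to_error_prob(q):
    return 10 ** (-q / 10)

# Lander-Waterman expected coverage and fraction of genome left uncovered.
def coverage_stats(read_len, n_reads, genome_size):
    c = read_len * n_reads / genome_size   # mean depth of coverage
    uncovered = math.exp(-c)               # P(a given base is hit by zero reads)
    return c, uncovered

# Illustrative run: 100 bp reads, 200 million reads, 3.1 Gb human-sized genome.
c, uncovered = coverage_stats(100, 200e6, 3.1e9)
print(f"Q20 -> {phred_to_error_prob(20):.0%} error; Q30 -> {phred_to_error_prob(30):.1%} error")
print(f"mean coverage {c:.1f}x; ~{uncovered:.2%} of bases expected uncovered")
```

Comparing platforms then reduces to trading quality profile (the Q values actually achieved) against the coverage purchasable per dollar.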
Metabolites in safety testing: metabolite identification strategies in discovery and development
BIOPHARMACEUTICS AND DRUG DISPOSITION, Issue 4 2009. Angus N. R. Nedderman
Abstract The publication of the FDA MIST guidelines in 2008, together with the acknowledged importance of metabolism data for the progression of novel compounds through drug discovery and drug development, has resulted in a renewed focus on the metabolite identification strategies utilised throughout the pharmaceutical industry. Given the plethora of existing and emerging technologies available to the metabolite identification scientist, it is argued that increased diligence should be applied to metabolism studies in the early stages of both drug discovery and drug development, in order to more routinely impact chemical design and to comply with the concepts of the MIST guidance without re-positioning the definitive radiolabelled studies from their typical place in late development. Furthermore, these strategic elements should be augmented by a broad and thorough understanding of the impact of the derived metabolism data, most notably considerations of absolute abundance, structure and pharmacological activity, such that they can be put into proper context as part of a holistic safety strategy. The combination of these approaches should ensure a metabolite identification strategy that successfully applies the principles of the MIST guidance throughout the discovery/development continuum and thereby provides appropriate confidence in support of human safety. Copyright © 2009 John Wiley & Sons, Ltd. [source]

A New Numerical Approach for a Detailed Multicomponent Gas Separation Membrane Model and AspenPlus Simulation
CHEMICAL ENGINEERING & TECHNOLOGY (CET), Issue 7 2005. M. H. Murad Chowdhury
Abstract A new numerical solution approach is presented for a widely accepted model developed earlier by Pan [1] for multicomponent gas separation by high-flux asymmetric membranes. The advantage of the new technique is that it can easily be incorporated into commercial process simulators such as AspenPlus™ [2] as a user-model for an overall membrane process study and for the design and simulation of hybrid processes (i.e., membrane plus chemical absorption or membrane plus physical absorption). The proposed technique does not require initial estimates of the pressure, flow and concentration profiles inside the fiber, as Pan's original approach does, thus allowing faster execution of the model equations. The numerical solution was formulated as an initial value problem (IVP). Either the Adams-Moulton or the Gear backward differentiation formula (BDF) method was used for solving the non-linear differential equations, and a modified Powell hybrid algorithm with a finite-difference approximation of the Jacobian was used to solve the non-linear algebraic equations. The model predictions were validated with experimental data reported in the literature for different types of membrane gas separation systems with or without purge streams. The robustness of the new numerical technique was also tested by simulating stiff problems such as air dehydration. This demonstrates the potential of the new solution technique to handle different membrane systems conveniently. As an illustration, a multi-stage membrane plant with recycle and purge streams has been designed and simulated for CO2 capture from a 500 MW power plant flue gas, as a first step to build hybrid processes and also to make an economic comparison among the different separation technologies available for CO2 separation from flue gas. [source]
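The numerical recipe in the membrane abstract (pose the fiber profiles as an initial value problem, integrate with a BDF method, and solve the embedded non-linear algebraic equations with a modified Powell hybrid method) maps directly onto standard SciPy tools: solve_ivp(method="BDF") and fsolve, which wraps MINPACK's Powell hybrid solver with a finite-difference Jacobian. The sketch below applies that recipe to a far simpler problem than Pan's model, a binary cross-flow permeator, purely to show the solver structure; the permeances, pressures, feed, and area are invented illustrative numbers.

```python
# Binary cross-flow membrane permeator as an IVP, in the spirit of the abstract's
# numerical approach (BDF integrator + Powell hybrid algebraic solver).
# This is a toy model, NOT Pan's detailed multicomponent model; all values are illustrative.
from scipy.integrate import solve_ivp
from scipy.optimize import fsolve   # MINPACK hybrd: modified Powell hybrid method

QA, QB = 1.0e-9, 2.0e-10   # permeances of A and B, mol/(s m^2 Pa) (assumed)
PH, PL = 10.0e5, 1.0e5     # feed- and permeate-side pressures, Pa (assumed)

def local_permeate_frac(x):
    """Solve the non-linear algebraic equation y = J_A / (J_A + J_B) for the
    local permeate mole fraction y at retentate composition x."""
    def residual(y):
        jA = QA * (x * PH - y * PL)
        jB = QB * ((1 - x) * PH - (1 - y) * PL)
        return y * (jA + jB) - jA
    return float(fsolve(residual, x)[0])

def rhs(area, F):
    """ODEs along the membrane area: component flows fall by the local fluxes."""
    FA, FB = F
    x = FA / (FA + FB)
    y = local_permeate_frac(x)
    jA = QA * (x * PH - y * PL)
    jB = QB * ((1 - x) * PH - (1 - y) * PL)
    return [-jA, -jB]

feed = [0.21, 0.79]  # mol/s of A and B (assumed feed)
sol = solve_ivp(rhs, (0.0, 1000.0), feed, method="BDF")  # 1000 m^2 of membrane

FA, FB = sol.y[:, -1]
print(f"retentate: {FA + FB:.3f} mol/s at x_A = {FA / (FA + FB):.3f}")
print(f"stage cut: {1 - (FA + FB) / sum(feed):.3f}")
```

Pan's actual model adds permeate pressure build-up inside the fiber bore and multicomponent coupling, but the IVP-plus-embedded-algebraic-solver structure is the same.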
[source]

Guest Lecture, 9.00-9.45, Wednesday 17 September 2003
CYTOPATHOLOGY, Issue 2003. Peter A. Hall MD PhD FRCPath
The past decades have seen an explosion in our knowledge of the molecular events underpinning the pathogenesis of many disease processes. Furthermore, there have been enormous technical advances, with the ability to identify, clone and sequence genes and to characterize their protein products now being commonplace in research settings. However, despite many claims as to the utility of molecular and biochemical methods in pathology, only very few laboratories employ such methods in a clinical setting. Indeed, the impact of molecular medicine has been more talked about than real. Why is this? The goal of this presentation is to address this question and present some perspectives on the future of molecular pathology. I shall overview, for the BSCC, the current state of the technology available for gene analysis and explore the developments needed before the mirage of molecular pathology becomes a clinical reality. [source]

The European Service Mapping Schedule (ESMS): development of an instrument for the description and classification of mental health services
ACTA PSYCHIATRICA SCANDINAVICA, Issue 2000. S. Johnson
Objective: This paper describes the development of an instrument for description and classification of mental health services and for measurement of service use. Purposes to be served by the instrument include: (i) identification of gaps in the spectrum of services in a catchment area; (ii) obtaining background information which may be important to understanding why apparently similar interventions lead to different outcomes in different areas; (iii) investigating how introduction of a particular type of service influences use of other local services; and (iv) understanding the relationship between sociodemographic factors and service use. Method: The instrument was developed through meetings of an international expert panel and pilot stages in several European countries. Results: Use of the European Service Mapping Schedule (ESMS) appears feasible in several countries and allowed description and classification of the full range of services identified within each of the study catchment areas. Conclusion: The ESMS promises to fill a gap in the technology available for mental health services research. Further practical experience of its use for a variety of purposes in a variety of settings is now needed to indicate how far the ESMS successfully generates data which are useful to researchers and planners. [source]

Surgery for Ruptured Sinus of Valsalva Aneurysm into Right Ventricular Outflow Tract: Role of Intraoperative 2D and Real Time 3D Transesophageal Echocardiography
ECHOCARDIOGRAPHY, Issue 7 2010. Shrinivas Gadhinglajkar M.D.
A major limitation of 2D echocardiography during surgery for a complex cardiac lesion is its inability to provide an accurate spatial orientation of the structure. The real time 3D transesophageal echocardiography (RT-3D-TEE) technology available in the Philips IE 33 ultrasound machine is relatively new to the operating suite. We evaluated its intraoperative utility in a patient who was operated on for repair of a ruptured sinus of Valsalva aneurysm (RSOVA) and closure of a supracristal ventricular septal defect. The VSD and RSOVA were visualized more clearly on intraoperative RT-3D-TEE than on 2D echocardiography. The acquired images could be virtually cropped and displayed in anatomical views to the operating surgeon for a clear orientation to the anatomy of the lesion. RT-3D-TEE is a potential intraoperative monitoring tool in surgeries for complex cardiac lesions. (Echocardiography 2010;27:E65-E69) [source]

The Impact of Mergers and Acquisitions on the Efficiency of the US Banking Industry: Further Evidence
JOURNAL OF BUSINESS FINANCE & ACCOUNTING, Issue 1-2 2008. Adel A. Al-Sharkas
Abstract: Using the Stochastic Frontier Approach (SFA), this study investigates the cost and profit efficiency effects of bank mergers on the US banking industry. We also use the non-parametric technique of Data Envelopment Analysis (DEA) to evaluate the production structure of merged and non-merged banks. The empirical results indicate that mergers have improved the cost and profit efficiency of banks. Further, evidence shows that merged banks have lower costs than non-merged banks because they are using the most efficient technology available (technical efficiency) as well as a cost-minimizing input mix (allocative efficiency). The results suggest that there is an economic rationale for future mergers in the banking industry. Finally, mergers may allow the banking industry to take advantage of the opportunities created by improved technology. [source]
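The banking-efficiency abstract relies on Data Envelopment Analysis, which scores each decision-making unit (here, a bank) by solving a small linear program: find the largest radial input contraction theta such that some non-negative combination of all banks still matches the evaluated bank's outputs with no more than theta times its inputs. Below is a minimal sketch of the input-oriented CCR envelopment model using scipy.optimize.linprog; the four-bank input/output data are invented for illustration, and the SFA half of the paper's methodology is not shown.

```python
# Input-oriented CCR DEA: for each bank o, solve
#   min theta  s.t.  sum_j lam_j x_j <= theta * x_o,  sum_j lam_j y_j >= y_o,  lam >= 0
import numpy as np
from scipy.optimize import linprog

X = np.array([[20., 30.], [15., 25.], [40., 50.], [25., 20.]])  # inputs (e.g., labor, funds)
Y = np.array([[100.], [80.], [120.], [90.]])                    # outputs (e.g., loans)
n, m, s = X.shape[0], X.shape[1], Y.shape[1]

for o in range(n):
    # Decision vector z = [theta, lam_1..lam_n]; minimize theta.
    c = np.r_[1.0, np.zeros(n)]
    A_in = np.hstack([-X[o].reshape(m, 1), X.T])     # inputs:  X.T lam - theta*x_o <= 0
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])      # outputs: -Y.T lam <= -y_o
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(m), -Y[o]],
                  bounds=[(0, None)] * (n + 1), method="highs")
    print(f"bank {o}: efficiency score theta = {res.x[0]:.3f}")
```

A score of 1.0 places a bank on the efficient frontier; a score below 1.0 measures how far its inputs could be shrunk while still producing its outputs, which is how the paper separates technical from allocative effects.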
An affordable modular mobile robotic platform with fuzzy logic control and evolutionary artificial neural networks
JOURNAL OF FIELD ROBOTICS (FORMERLY JOURNAL OF ROBOTIC SYSTEMS), Issue 8 2004. Maurice Tedder
Autonomous robotics projects encompass the rich nature of integrated systems that include mechanical, electrical, and computational software components. The availability of smaller and cheaper hardware components has helped make possible a new dimension in operational autonomy. This paper describes a mobile robotic platform consisting of several integrated modules: a laptop computer that serves as the main control module, a microcontroller-based motion control module, a vision processing module, a sensor interface module, and a navigation module. The laptop computer module contains the main software development environment with a user interface to access and control all other modules. Programming language independence is achieved by using standard input/output computer interfaces, including RS-232 serial port, USB, networking, audio input and output, and parallel port devices. However, with the same hardware technology available to all, the distinguishing factor for intelligent systems in most cases becomes the software design. The software for autonomous robots must intelligently control the hardware so that it functions in unstructured, dynamic, and uncertain environments while maintaining autonomous adaptability. This paper describes how we introduced fuzzy logic control to one robot platform in order to solve the 2003 Intelligent Ground Vehicle Competition (IGVC) Autonomous Challenge problem. It also describes the introduction of a hybrid software design that utilizes fuzzy evolutionary artificial neural network techniques. In this design, rather than using a directly coded control program, the robot's artificial neural network is first trained on a training data set, using evolutionary optimization techniques to adjust the weight values between neurons. The trained neural network with a weighted-average defuzzification method was able to make correct decisions for unseen vision patterns in the IGVC Autonomous Challenge. A comparison of the Lawrence Technological University robot designs with those of the other competing schools shows that our platforms were the most affordable robot systems to use as tools for computer science and engineering education. © 2004 Wiley Periodicals, Inc. [source]

ARCHAEOLOGICAL PETROLOGY AND THE ARCHAEOMETRY OF LITHIC MATERIALS
ARCHAEOMETRY, Issue 2 2008. M. S. Shackley
For 50 years, archaeologists and physical scientists have been dating, determining the composition of, and measuring stone tools, and reporting them in Archaeometry and many other journals. In Archaeometry specifically, the number of papers devoted to the analysis of lithic material has increased at least 30-fold since volume 1 in 1958. This reflects not only an increase in the number of scholars devoting their time to the archaeometry of stone, but also increases in the quality and quantity of instrumental technology available to researchers in the field. [source]
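Returning to the field-robotics abstract above: its hybrid design combines fuzzy rule evaluation, weighted-average defuzzification, and evolutionary tuning of weights. The sketch below is a deliberately tiny reconstruction of that general idea, not Lawrence Tech's actual controller: two obstacle-distance inputs, a four-rule fuzzy steering table whose consequent weights are tuned by a simple (1+lambda) evolution strategy against a handful of hand-made training pairs. All membership functions, rules, and data are invented for illustration.

```python
# Toy fuzzy steering controller with evolutionary tuning of rule consequents.
# An illustrative reconstruction of the "fuzzy + evolutionary" idea only.
import random

def near(d):  # membership: obstacle "near" for a normalized distance in [0, 1]
    return max(0.0, 1.0 - d / 0.5)

def far(d):
    return 1.0 - near(d)

def steer(dl, dr, consequents):
    """Evaluate 4 rules (min as AND) and defuzzify by weighted average.
    Output: steering command in [-1, 1] (negative = left)."""
    strengths = [
        min(near(dl), far(dr)),   # obstacle left  -> steer right
        min(far(dl), near(dr)),   # obstacle right -> steer left
        min(far(dl), far(dr)),    # clear          -> straight
        min(near(dl), near(dr)),  # boxed in       -> evolved escape bias
    ]
    total = sum(strengths) or 1e-9
    return sum(s * c for s, c in zip(strengths, consequents)) / total

# Hand-made training pairs: (dist_left, dist_right) -> desired steering.
DATA = [((0.1, 0.9), 1.0), ((0.9, 0.1), -1.0), ((0.9, 0.9), 0.0), ((0.2, 0.3), 0.5)]

def mse(consequents):
    return sum((steer(dl, dr, consequents) - t) ** 2 for (dl, dr), t in DATA) / len(DATA)

# (1 + lambda) evolution strategy on the 4 rule consequents.
random.seed(1)
best = [random.uniform(-1, 1) for _ in range(4)]
for _ in range(300):
    children = [[g + random.gauss(0, 0.2) for g in best] for _ in range(8)]
    best = min(children + [best], key=mse)

print("evolved consequents:", [round(g, 2) for g in best], "MSE:", round(mse(best), 4))
```

The paper's version evolves the weights of a full neural network over vision patterns rather than four rule consequents, but the evaluate-mutate-select loop and the weighted-average defuzzification step are the same in spirit.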