Simulator


Kinds of Simulator

  • circuit simulator
  • commercial simulator
  • earth simulator
  • human patient simulator
  • network simulator
  • patient simulator
  • process simulator
  • rainfall simulator
  • reality simulator
  • solar simulator
  • surgery simulator
  • surgical simulator
  • system simulator
  • the earth simulator
  • transfer simulator
  • virtual reality simulator


  • Selected Abstracts


    A performance comparison between the Earth Simulator and other terascale systems on a characteristic ASCI workload

    CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 10 2005
    Darren J. Kerbyson
    Abstract This work gives a detailed analysis of the relative performance of the recently installed Earth Simulator and the next top four systems in the Top500 list using predictive performance models. The Earth Simulator uses vector processing nodes interconnected using a single-stage, cross-bar network, whereas the next top four systems are built using commodity based superscalar microprocessors and interconnection networks. The performance that can be achieved results from an interplay of system characteristics, application requirements and scalability behavior. Detailed performance models are used here to predict the performance of two codes representative of the ASCI workload, namely SAGE and Sweep3D. The performance models encapsulate fully the behavior of these codes and have been previously validated on many large-scale systems. One result of this analysis is to size systems, built from the same nodes and networks as those in the top five, that will have the same performance as the Earth Simulator. In particular, the largest ASCI machine, ASCI Q, is expected to achieve a similar performance to the Earth Simulator on the representative workload. Published in 2005 by John Wiley & Sons, Ltd. [source]
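
    The abstract above hinges on the idea of an analytic performance model that can be evaluated for hypothetical machine configurations. The sketch below is a deliberately simplified model of that kind (it is not the published SAGE or Sweep3D models): per-iteration time is compute plus nearest-neighbour communication, and a hypothetical commodity cluster is sized until it matches a target runtime. All machine parameters are invented for illustration.

```python
# Minimal sketch of a predictive performance model in the spirit described
# above (NOT the actual SAGE/Sweep3D models).  Per-iteration time is modelled
# as compute time plus nearest-neighbour communication time; we then search
# for the node count at which a hypothetical commodity cluster matches a
# target (Earth-Simulator-like) runtime.  All numbers are assumptions.

def model_time(work_flops, nodes, flops_per_node, msgs, msg_bytes,
               latency_s, bandwidth_Bps):
    compute = work_flops / (nodes * flops_per_node)            # perfectly parallel compute
    comm = msgs * (latency_s + msg_bytes / bandwidth_Bps)      # per-iteration messaging
    return compute + comm

def nodes_to_match(target_time, **machine):
    nodes = 1
    while model_time(nodes=nodes, **machine) > target_time and nodes < 10**6:
        nodes *= 2                                             # coarse doubling search
    return nodes

if __name__ == "__main__":
    work = 5e12                                                # flops per iteration (assumed)
    target = model_time(work, nodes=640, flops_per_node=8e9,   # vector-like system
                        msgs=6, msg_bytes=1e6,
                        latency_s=5e-6, bandwidth_Bps=1.0e10)
    commodity = dict(work_flops=work, flops_per_node=2e9,      # superscalar-like system
                     msgs=6, msg_bytes=1e6,
                     latency_s=8e-6, bandwidth_Bps=2.5e8)
    print(f"target time per iteration: {target*1e3:.2f} ms")
    print("commodity nodes needed   :", nodes_to_match(target, **commodity))
```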


    Evaluating high-performance computers

    CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 10 2005
    Jeffrey S. Vetter
    Abstract Comparisons of high-performance computers based on their peak floating point performance are common but seldom useful when comparing performance on real workloads. Factors that influence sustained performance extend beyond a system's floating-point units, and real applications exercise machines in complex and diverse ways. Even when it is possible to compare systems based on their performance, other considerations affect which machine is best for a given organization. These include the cost, the facilities requirements (power, floorspace, etc.), the programming model, the existing code base, and so on. This paper describes some of the important measures for evaluating high-performance computers. We present data for many of these metrics based on our experience at Lawrence Livermore National Laboratory (LLNL), and we compare them with published information on the Earth Simulator. We argue that evaluating systems involves far more than comparing benchmarks and acquisition costs. We show that evaluating systems often involves complex choices among a variety of factors that influence the value of a supercomputer to an organization, and that the high-end computing community should view cost/performance comparisons of different architectures with skepticism. Published in 2005 by John Wiley & Sons, Ltd. [source]
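
    To make the argument about looking beyond peak flops concrete, the snippet below compares two invented systems on sustained efficiency, performance per dollar and performance per watt. The figures and system names are illustrative assumptions, not values from the paper or from LLNL.

```python
# Illustrative sketch of comparing systems on more than peak flops, as argued
# above.  The systems, numbers and weights are invented for the example; the
# point is that sustained performance, cost and power all enter the comparison,
# not that these are the paper's figures.

systems = {
    "vector_system":     dict(peak_tflops=40.0, sustained_tflops=12.0,
                              cost_musd=350.0, power_mw=6.0),
    "commodity_cluster": dict(peak_tflops=20.0, sustained_tflops=2.5,
                              cost_musd=60.0, power_mw=2.0),
}

for name, s in systems.items():
    efficiency = s["sustained_tflops"] / s["peak_tflops"]           # fraction of peak on a real workload
    perf_per_dollar = s["sustained_tflops"] / s["cost_musd"]        # Tflop/s per M$
    perf_per_watt = s["sustained_tflops"] / (s["power_mw"] * 1e6)   # Tflop/s per W
    print(f"{name:18s} eff={efficiency:5.1%} "
          f"Tflops/M$={perf_per_dollar:6.3f} Tflops/W={perf_per_watt:.2e}")
```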


    Plasma Edge Physics with B2-Eirene

    CONTRIBUTIONS TO PLASMA PHYSICS, Issue 1-2 2006
    R. Schneider
    Abstract The B2-Eirene code package was developed to give better insight into the physics in the scrape-off layer (SOL), which is defined as the region of open field-lines intersecting walls. The SOL is characterised by the competition between parallel and perpendicular transport, which makes it an inherently two-dimensional system. Descriptions of plasma-wall interaction and of atomic processes are necessary ingredients for an understanding of the scrape-off layer. This paper concentrates on understanding the basic physics by combining the results of the code with experiments and analytical models or estimates. This work mainly focuses on divertor tokamaks, but most of the arguments and principles can easily be adapted to other concepts such as island divertors in stellarators or limiter devices. The paper presents the basic equations for plasma transport and the basic models for neutral transport. These define the basic ingredients for the SOLPS (Scrape-Off Layer Plasma Simulator) code package. A first level of understanding is approached for pure hydrogenic plasmas, based both on simple models and on B2-Eirene simulations neglecting drifts and currents; the main topic here is the influence of neutral transport on the different operation regimes. This part finishes with time-dependent phenomena for the pure plasma, the so-called Edge Localised Modes (ELMs). Then, the influence of impurities on the SOL plasma is discussed. Understanding impurity physics in the SOL requires a rather complex combination of different aspects: the impurity production process has to be understood, the effects of impurities in terms of radiation losses have to be included, and finally impurity transport is necessary. These are introduced with rising complexity, starting with simple estimates and then analysing the detailed parallel force balance and the flow pattern of impurities. Using this, impurity compression and radiation instabilities are studied. This part ends, combining all the elements introduced before, with specific, detailed results from different machines. Then, the effect of drifts and currents is introduced and their consequences presented. Finally, some work on deriving scaling laws for the anomalous turbulent transport based on automatic edge transport code fitting procedures is described. (© 2006 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim) [source]


    Porcine Sebaceous Cyst Model: An Inexpensive, Reproducible Skin Surgery Simulator

    DERMATOLOGIC SURGERY, Issue 8 2005
    Jonathan Bowling MBChB
    Background. Surgical simulators are an established part of surgical training and are regularly used as part of the objective structured assessment of technical skills. Specific artificial skin models representing cutaneous pathology are available, although they are expensive when compared with pigskin. The limitations of artificial skin models include their difficulty in representing lifelike cutaneous pathology. Objective. Our aim was to devise an inexpensive, reproducible surgical simulator that provides the most lifelike representation of the sebaceous cyst. Materials and Methods. Pigskin, either pig's feet/trotters or pork belly, was incised, and a paintball was inserted subcutaneously and fixed with cyanoacrylic glue. Results. This model has regularly been used in cutaneous surgical courses that we have organized. The degree of difficulty can be adjusted by adding more cyanoacrylic glue or by allowing more time for the paintball to absorb fluid from the surrounding tissue. Conclusions. The degree of correlation with lifelike cutaneous pathology is such that we recommend that all courses involved in basic skin surgery should consider using the porcine sebaceous cyst model when teaching excision of sebaceous cysts. [source]


    Analysis of the effects of ultrafine particulate matter while accounting for human exposure

    ENVIRONMETRICS, Issue 2 2009
    B. J. REICH
    Abstract Particulate matter (PM) has been associated with mortality in several epidemiological studies. The US EPA currently regulates PM10 and PM2.5 (mass concentration of particles with diameter less than 10 and 2.5 µm, respectively), but it is not clear which sizes of particles are most responsible for adverse health outcomes. A current hypothesis is that ultrafine particles with diameter less than 0.1 µm are particularly harmful because their small size allows them to penetrate deep into the lungs. This paper investigates the association between exposure to particles of varying diameter and daily mortality. We propose a new dynamic factor analysis model to relate the ambient concentrations of several sizes of particles, with diameters ranging from 0.01 to 0.40 µm, to mortality. We introduce a Bayesian model that converts ambient concentrations into simulated personal exposure using the EPA's Stochastic Human Exposure and Dose Simulator, and relates simulated exposure to mortality. Using new data from Fresno, CA, we find that the 4-day lag of particles with diameter between 0.02 and 0.08 µm is associated with mortality. This is consistent with the small particles hypothesis. Copyright © 2008 John Wiley & Sons, Ltd. [source]
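
    As a rough illustration of the statistical idea of relating a lagged exposure series to daily mortality counts, the sketch below fits a Poisson regression with a single 4-day lag on simulated data. It is not the paper's Bayesian dynamic factor model and does not use the SHEDS exposure simulator; the data and coefficients are made up.

```python
# Minimal sketch of relating a lagged exposure series to daily mortality
# counts.  This is NOT the paper's Bayesian dynamic factor / SHEDS model;
# it only illustrates a single 4-day-lag association using a Poisson
# regression on simulated data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n_days = 500
ufp = rng.gamma(shape=2.0, scale=1.0, size=n_days)      # simulated ultrafine PM concentration

lag = 4
ufp_lag4 = ufp[:-lag]                                    # exposure 4 days before each death count
rate = np.exp(1.5 + 0.10 * ufp_lag4)                     # association built into the simulation
deaths = rng.poisson(rate)                               # daily mortality counts

X = sm.add_constant(ufp_lag4)
fit = sm.GLM(deaths, X, family=sm.families.Poisson()).fit()
print(fit.params)        # slope should recover roughly 0.10
print(fit.conf_int())    # 95% CI for the lag-4 exposure effect
```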


    A Simulator-based Medical Education Service

    ACADEMIC EMERGENCY MEDICINE, Issue 8 2002
    James A. Gordon MD
    No abstract is available for this article. [source]


    Abolishing the Tax-Free Threshold in Australia: Simulating Alternative Reforms

    FISCAL STUDIES, Issue 2 2009
    John Creedy
    JEL classification: H24; H31. Abstract This paper examines the role of the tax-free income tax threshold in a complex tax and transfer system consisting of a range of taxes and benefits, each with its own taper rates and thresholds. Considering a tax and benefit system with benefit taper rates whereby some benefits are received by income groups other than those at the bottom of the distribution, it is suggested that a tax-free threshold is not a necessary requirement to achieve redistribution. Four alternative policy changes, each involving the elimination of the tax-free threshold in Australia and designed to achieve approximate revenue neutrality, were examined using the Melbourne Institute Tax and Transfer Simulator. A range of implications were examined, including labour supply responses to tax changes and the effects of policy changes on inequality and social welfare. The results demonstrate that it is possible to eliminate the tax-free threshold under approximate overall revenue and distribution neutrality, but that it is impossible to improve labour supply incentives at the same time. In order to achieve improved incentives, either revenue or distribution neutrality has to be sacrificed. [source]
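
    The interaction between a tax-free threshold and means-tested benefit tapers is easiest to see with a small worked example. The sketch below computes net income and the effective marginal tax rate (EMTR) under an invented schedule; the thresholds, rates and benefit amounts are illustrative only and are not the Australian system or MITTS parameters.

```python
# Minimal sketch of how a tax-free threshold and a means-tested benefit taper
# combine to produce effective marginal tax rates (EMTRs).  The thresholds,
# rates and benefit amounts below are invented for illustration; they are not
# the Australian schedule or the MITTS parameters.

def net_income(gross, threshold=6000.0, tax_rate=0.30,
               benefit=8000.0, taper=0.50, free_area=3000.0):
    tax = max(0.0, gross - threshold) * tax_rate          # tax above the tax-free threshold
    withdrawn = max(0.0, gross - free_area) * taper       # benefit withdrawn above the free area
    benefit_paid = max(0.0, benefit - withdrawn)
    return gross - tax + benefit_paid

def emtr(gross, delta=1.0, **kwargs):
    # effective marginal tax rate: share of an extra dollar lost to tax plus withdrawal
    return 1.0 - (net_income(gross + delta, **kwargs) - net_income(gross, **kwargs)) / delta

if __name__ == "__main__":
    for g in (2000, 5000, 10000, 20000, 40000):
        print(f"gross {g:6d}  net {net_income(g):9.2f}  EMTR {emtr(g):5.2f}")
```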


    Transport and environmental temperature variability of eggs and larvae of the Japanese anchovy (Engraulis japonicus) and Japanese sardine (Sardinops melanostictus) in the western North Pacific estimated via numerical particle-tracking experiments

    FISHERIES OCEANOGRAPHY, Issue 2 2009
    SACHIHIKO ITOH
    Abstract Numerical particle-tracking experiments were performed to investigate the transport and variability in environmental temperature experienced by eggs and larvae of Pacific stocks of the Japanese anchovy (Engraulis japonicus) and Japanese sardine (Sardinops melanostictus) using high-resolution outputs of the Ocean General Circulation Model for the Earth Simulator (OFES) and the observed distributions of eggs collected from 1978 to 2004. The modeled anchovy individuals tend to be trapped in coastal waters or transported to the Kuroshio–Oyashio transition region. In contrast, a large proportion of the sardines are transported to the Kuroshio Extension. The egg density-weighted mean environmental temperature until day 30 of the experiment was 20–24°C for the anchovy and 17–20°C for the sardine, which can be explained by spawning areas and seasons, and interannual oceanic variability. Regression analyses revealed that the contribution of environmental temperature to the logarithm of recruitment per spawning (expected to have a negative relationship with the mean mortality coefficient) was significant for both the anchovy and sardine, especially until day 30, which can be regarded as the initial stages of their life cycles. The relationship was quadratic for the anchovy, with an optimal temperature of 21–22°C, and linear for the sardine, with a negative coefficient. Differences in habitat areas and temperature responses between the sardine and anchovy are suggested to be important factors in controlling the dramatic out-of-phase fluctuations of these species. [source]
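
    The core of a particle-tracking experiment is repeatedly advecting virtual eggs or larvae through a velocity field and recording the environment along each trajectory. The sketch below does this with an idealised analytic flow and a simple forward-Euler step; it stands in for, and is much simpler than, tracking through the OFES model output used in the study.

```python
# Minimal sketch of a numerical particle-tracking experiment: passive particles
# are advected through a velocity field and the ambient temperature along each
# trajectory is recorded.  The analytic velocity/temperature fields and the
# forward-Euler step are illustrative stand-ins for the OFES output and the
# tracking scheme actually used in the study.
import numpy as np

def velocity(x, y):
    # idealised eastward jet with a weak meandering component, in degrees/day
    u = 0.2 + 0.15 * np.exp(-((y - 35.0) / 3.0) ** 2)
    v = 0.05 * np.sin(0.5 * x)
    return u, v

def temperature(x, y):
    return 28.0 - 0.4 * (y - 20.0)          # simple north-south SST gradient, in °C

def track(x0, y0, days=30, dt=0.5):
    xs, ys, temps = [x0], [y0], [temperature(x0, y0)]
    for _ in range(int(days / dt)):
        u, v = velocity(xs[-1], ys[-1])
        xs.append(xs[-1] + u * dt)          # forward-Euler advection step
        ys.append(ys[-1] + v * dt)
        temps.append(temperature(xs[-1], ys[-1]))
    return np.array(xs), np.array(ys), np.array(temps)

if __name__ == "__main__":
    # release virtual "eggs" at three latitudes and compare their thermal histories
    for lat in (30.0, 33.0, 36.0):
        _, _, t = track(x0=135.0, y0=lat)
        print(f"release at {lat:.0f}N: mean ambient temperature over 30 days = {t.mean():.1f} °C")
```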


    Net primary productivity mapped for Canada at 1-km resolution

    GLOBAL ECOLOGY, Issue 2 2002
    J Liu
    Abstract Aim: To map net primary productivity (NPP) over the Canadian landmass at 1-km resolution. Location: Canada. Methods: A simulation model, the Boreal Ecosystem Productivity Simulator (BEPS), has been developed. The model uses a sunlit and shaded leaf separation strategy and a daily integration scheme in order to implement an instantaneous leaf-level photosynthesis model over large areas. Two key driving variables, leaf area index (every 10 days) and land cover type (annual), are derived from satellite measurements of the Advanced Very High Resolution Radiometer (AVHRR). Other spatially explicit input data are also prepared, including daily meteorological data (radiation, precipitation, temperature, and humidity), available soil water holding capacity (AWC) and forest biomass. The model outputs are compared with ground plot data to ensure that no significant systematic biases are created. Results: The simulation results show that Canada's annual net primary production was 1.22 Gt C year⁻¹ in 1994, 78% attributed to forests, mainly the boreal forest, without considering the contribution of the understorey. The NPP averaged over the entire landmass was ~140 g C m⁻² year⁻¹ in 1994. Geographically, NPP varied greatly among ecozones and provinces/territories. The seasonality of NPP is characterized by strong summer photosynthesis capacities and a short growing season in northern ecosystems. Conclusions: This study is the first attempt to simulate Canada-wide NPP with a process-based model at 1-km resolution and using a daily step. The statistics of NPP are therefore expected to be more accurate than previous analyses at coarser spatial or temporal resolutions. The use of remote sensing data makes such simulations possible. BEPS is capable of integrating the effects of climate, vegetation, and soil on plant growth at a regional scale. BEPS and its parameterization scheme and products can be a basis for future studies of the carbon cycle in mid-high latitude ecosystems. [source]
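
    The sunlit/shaded leaf separation mentioned above can be illustrated with the standard two-leaf partition of leaf area index and an hourly integration over a day. The sketch below uses that generic formulation with invented photosynthesis rates and site parameters; it is not the BEPS model or its parameter values.

```python
# Minimal sketch of the sunlit/shaded "two-leaf" upscaling strategy: leaf area
# index (LAI) is split into sunlit and shaded fractions for a given solar
# zenith angle, leaf-level photosynthesis rates are applied to each fraction,
# and the result is integrated hourly over a day.  The partition formula is
# the standard two-leaf form; the rates and site parameters are invented for
# illustration and are not BEPS parameter values.
import math

def sunlit_shaded_lai(lai, cos_zenith, k=0.5):
    ext = k / max(cos_zenith, 1e-3)                    # extinction coefficient for the direct beam
    lai_sun = (1.0 - math.exp(-ext * lai)) / ext       # sunlit LAI
    return lai_sun, lai - lai_sun                      # (sunlit, shaded)

def daily_gpp(lai, lat_deg=55.0, doy=180, a_sun=12.0, a_sh=4.0):
    """Integrate canopy photosynthesis (µmol m-2 s-1) hourly, return mol m-2 day-1."""
    decl = math.radians(23.45) * math.sin(2 * math.pi * (284 + doy) / 365)
    lat = math.radians(lat_deg)
    total = 0.0
    for hour in range(24):
        hour_angle = math.radians(15.0 * (hour - 12))
        cos_z = (math.sin(lat) * math.sin(decl)
                 + math.cos(lat) * math.cos(decl) * math.cos(hour_angle))
        if cos_z <= 0.0:
            continue                                    # night-time: no photosynthesis
        lai_sun, lai_sh = sunlit_shaded_lai(lai, cos_z)
        total += (a_sun * lai_sun + a_sh * lai_sh) * 3600.0 * 1e-6   # µmol -> mol per hour
    return total

if __name__ == "__main__":
    for lai in (1.0, 3.0, 5.0):
        print(f"LAI {lai:.0f}: daily canopy photosynthesis ≈ {daily_gpp(lai):.2f} mol C m⁻² day⁻¹")
```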


    On the effects of triangulated terrain resolution on distributed hydrologic model response

    HYDROLOGICAL PROCESSES, Issue 11 2005
    Enrique R. Vivoni
    Abstract Distributed hydrologic models based on triangulated irregular networks (TIN) provide a means for computational efficiency in small to large-scale watershed modelling through an adaptive, multiple resolution representation of complex basin topography. Despite previous research with TIN-based hydrology models, the effect of triangulated terrain resolution on basin hydrologic response has received surprisingly little attention. Evaluating the impact of adaptive gridding on hydrologic response is important for determining the level of detail required in a terrain model. In this study, we address the spatial sensitivity of the TIN-based Real-time Integrated Basin Simulator (tRIBS) in order to assess the variability in the basin-averaged and distributed hydrologic response (water balance, runoff mechanisms, surface saturation, groundwater dynamics) with respect to changes in topographic resolution. Prior to hydrologic simulations, we describe the generation of TIN models that effectively capture topographic and hydrographic variability from grid digital elevation models. In addition, we discuss the sampling methods and performance metrics utilized in the spatial aggregation of triangulated terrain models. For a 64 km² catchment in northeastern Oklahoma, we conduct a multiple resolution validation experiment by utilizing the tRIBS model over a wide range of spatial aggregation levels. Hydrologic performance is assessed as a function of the terrain resolution, with the variability in basin response attributed to variations in the coupled surface–subsurface dynamics. In particular, resolving the near-stream, variable source area is found to be a key determinant of model behaviour as it controls the dynamic saturation pattern and its effect on rainfall partitioning. A relationship between the hydrologic sensitivity to resolution and the spatial aggregation of terrain attributes is presented as an effective means for selecting the model resolution. Finally, the study highlights the important effects of terrain resolution on distributed hydrologic model response and provides insight into the multiple resolution calibration and validation of TIN-based hydrology models. Copyright © 2005 John Wiley & Sons, Ltd. [source]


    Game theory application to Fed Cattle procurement in an experimental market

    AGRIBUSINESS : AN INTERNATIONAL JOURNAL, Issue 1 2009
    Jared G. Carlberg
    Consolidation in meatpacking has elicited many market power concerns and studies. A noncooperative, infinitely repeated game theory model was developed and an empirical model estimated to measure beef packing firm behavior in cattle procurement. Experimental market data from three semester-long classes using the Fed Cattle Market Simulator (FCMS) were used. Collusive behavior was found for all three data periods though the extent of collusion varied across semester-long data periods. Results may have been influenced by market conditions imposed on the experimental market in two of the three semesters. One was a marketing agreement between the largest packer and two feedlots and the other involved limiting the amount and type of public market information available to participants. Findings underscore the need for applying game theory to real-world transaction-level, fed cattle market data. [EconLit Citations: C730, L100]. © 2009 Wiley Periodicals, Inc. [source]


    Robust load–frequency regulation: A real-time laboratory experiment

    OPTIMAL CONTROL APPLICATIONS AND METHODS, Issue 6 2007
    Hassan Bevrani
    Abstract This paper addresses a new method for robust decentralized design of proportional-integral-based load–frequency control (LFC) with communication delays. In the proposed methodology, the LFC problem is reduced to a static output feedback control synthesis for a multiple-delay power system, and the control parameters are then easily obtained using a robust H∞ control technique. To demonstrate the efficiency of the proposed control strategy, an experimental study has been performed on the Analog Power System Simulator at the Research Laboratory of the Kyushu Electric Power Co. in Japan. Copyright © 2007 John Wiley & Sons, Ltd. [source]
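
    A minimal way to see why communication delay matters in load–frequency control is to simulate a single-area system whose PI controller acts on a delayed frequency measurement. The sketch below does exactly that with an assumed first-order model and hand-picked gains; it is not the decentralized H∞ design or the laboratory setup described in the abstract.

```python
# Minimal sketch of PI-based load-frequency control (LFC) with a communication
# delay, simulated for a single-area system.  The first-order model, gains and
# delay below are illustrative assumptions, not the paper's design.
import numpy as np

M, D = 10.0, 1.0            # inertia and damping (assumed per-unit values)
Kp, Ki = 0.4, 0.3           # PI gains (assumed)
delay_s, dt, T = 0.5, 0.01, 40.0
n, n_delay = int(T / dt), int(delay_s / dt)

freq_dev = 0.0              # frequency deviation (p.u.)
integral = 0.0
u_history = [0.0] * (n_delay + 1)   # buffer so the control signal arrives late
load_step = 0.1             # 0.1 p.u. load increase at t = 1 s

trace = []
for k in range(n):
    t = k * dt
    d_load = load_step if t >= 1.0 else 0.0
    ace = freq_dev                          # area control error (single area: just the frequency deviation)
    integral += ace * dt
    u_history.append(-(Kp * ace + Ki * integral))
    u_delayed = u_history[-1 - n_delay]     # controller output seen only after the delay
    freq_dev += dt * (u_delayed - d_load - D * freq_dev) / M
    trace.append(freq_dev)

print(f"max frequency dip : {min(trace):.4f} p.u.")
print(f"final deviation   : {trace[-1]:.4f} p.u.")
```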


    Minimal erythema dose after multiple UV exposures depends on pre-exposure skin pigmentation

    PHOTODERMATOLOGY, PHOTOIMMUNOLOGY & PHOTOMEDICINE, Issue 4 2004
    M. Henriksen
    Background/purpose: Phototherapy consists of multiple ultraviolet (UV) exposures. Most previous studies have focused on erythema following a single UV exposure in fair-skinned persons. Although it is well known that phototherapy lowers the daily UV-threshold dose for erythema in clinical practice, this is insufficiently documented under controlled experimental conditions. The purpose of this study was to quantify the change in the daily threshold dose for a specific erythema grade after 1–4 consecutive daily UV exposures. Methods: Forty-nine healthy volunteers (skin types II–V) with varying pigmentation quantified by skin reflectance. Two UV sources were used: a narrowband UVB (Philips TL01) and a Solar Simulator (Solar Light Co.). Just perceptible erythema after 24 h was chosen as the minimal erythema dose (+); erythema grades + and ++ were assessed. Results: We found a positive and significant exponential relationship between skin pigmentation and the UV dose needed to elicit a specific erythema grade on the back after 1–4 UV exposures. After repetitive UV exposures the UV dose had to be lowered more in dark-skinned persons compared with fair-skinned persons to elicit a certain erythema grade. This applied to both UV sources and all erythema grades. Conclusion: In the dark-skinned persons the daily UV dose after 4 days of UV exposure should be lowered by 40–50% to avoid burns compared with the single UV exposure. For the most fair-skinned persons essentially no reduction in the daily UV dose was needed. Our results indicate that the pre-exposure pigmentation level can guide the UV dosage in phototherapy. [source]
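
    The reported exponential relationship between pigmentation and threshold UV dose can be expressed as dose = a·exp(b·pigmentation) and fitted by linear regression on the log of the dose. The sketch below does this on invented data points and then applies an illustrative 40-50% day-4 dose reduction for a dark-skinned subject; none of the numbers are the study's measurements.

```python
# Minimal sketch of the kind of exponential dose-pigmentation relationship the
# study reports: the UV dose needed to elicit a given erythema grade is
# modelled as dose = a * exp(b * pigmentation) and fitted by linear regression
# on log(dose).  The data points below are invented for illustration.
import numpy as np

pigmentation = np.array([10, 15, 20, 25, 30, 35, 40])      # reflectance-based pigment index (assumed)
med_dose = np.array([120, 150, 200, 270, 350, 470, 620])    # MED in J/m^2 (illustrative values)

b, log_a = np.polyfit(pigmentation, np.log(med_dose), 1)    # slope, intercept of log-linear fit
a = np.exp(log_a)
print(f"fitted model: MED ≈ {a:.1f} * exp({b:.3f} * pigmentation)")

# illustrative dose reduction after repeated exposures: a darker-skinned
# subject might need a 40-50% reduction, a fair-skinned subject almost none
for pigment, reduction in [(40, 0.45), (10, 0.00)]:
    single = a * np.exp(b * pigment)
    print(f"pigment {pigment}: day-4 dose ≈ {single * (1 - reduction):.0f} J/m^2 "
          f"(single-exposure MED {single:.0f})")
```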


    Monte Carlo study of 2D electron gas transport including Pauli exclusion principle in highly doped silicon

    PHYSICA STATUS SOLIDI (C) - CURRENT TOPICS IN SOLID STATE PHYSICS, Issue 1 2008
    F. Carosella
    Abstract A Multi Sub-band Monte Carlo Simulator improved to efficiently include the Pauli Exclusion Principle is presented. It is used to study transport in highly doped, ultra-thin silicon films. Both the steady state and the transient regime of transport in silicon films under a uniform driving field are investigated. This approach is intended to be incorporated into a full device simulator to improve the modeling of the access regions of nano-Double Gate MOSFETs. (© 2008 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim) [source]
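
    In ensemble Monte Carlo transport codes, the Pauli exclusion principle is commonly enforced by a rejection step: a proposed post-scattering state is accepted with probability 1 - f, where f is the occupation of the final state. The sketch below shows that generic rejection scheme with a Fermi-Dirac estimate of f; it is not the multi-subband simulator described in the abstract.

```python
# Minimal sketch of how the Pauli exclusion principle is typically enforced in
# an ensemble Monte Carlo transport simulation: a proposed post-scattering
# state is accepted with probability 1 - f(E_final), where f is the occupation
# of the final state (a Fermi-Dirac estimate here for simplicity).  This is a
# generic rejection scheme, not the specific simulator of the abstract.
import math
import random

kT = 0.0259            # thermal energy at 300 K (eV)
E_fermi = 0.10         # quasi-Fermi level in a highly doped film (assumed, eV)

def occupation(energy_eV):
    """Fermi-Dirac occupation used as the estimate of f for the final state."""
    return 1.0 / (1.0 + math.exp((energy_eV - E_fermi) / kT))

def scatter_with_pauli(e_initial, e_final):
    """Return the carrier energy after attempting a scattering event."""
    if random.random() < 1.0 - occupation(e_final):
        return e_final          # final state likely empty: transition accepted
    return e_initial            # final state likely occupied: transition blocked

if __name__ == "__main__":
    random.seed(1)
    accepted = sum(scatter_with_pauli(0.20, 0.05) == 0.05 for _ in range(10000))
    print(f"acceptance rate into a low-energy (mostly filled) state: {accepted/10000:.2%}")
    accepted = sum(scatter_with_pauli(0.05, 0.30) == 0.30 for _ in range(10000))
    print(f"acceptance rate into a high-energy (mostly empty) state: {accepted/10000:.2%}")
```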


    Shear stress nucleation in microcellular foaming process

    POLYMER ENGINEERING & SCIENCE, Issue 6 2002
    Lee Chen
    The effect of shear stress on the foaming process has been studied using the Foaming Process Simulator developed previously. The polymer samples were saturated with gas in the test chamber. A rotor was used to apply shear stress to the polymer samples. Foams were obtained by releasing the pressure quickly. Polystyrene, filled and unfilled, was used as the material. The cell density was analyzed with a scanning electron microscope. It was found that the cell density was significantly increased by introducing shear stress. The higher the shear stress, the more significant the effect. A cell stretch model has been developed to explain the cell nucleation enhancement with shear stress. The nucleation sites are stretched under the shear stress. The stretched nuclei are much easier to expand for cell formation owing to their larger surface areas and non-spherical shapes. The model prediction shows the same tendency of the effect of shear stress observed in the experiment. The key issue with shear stress nucleation is the transformation of mechanical shear energy into surface energy. [source]


    Flattening the Effective Marginal Tax Rate Structure in Australia: Policy Simulations Using the Melbourne Institute Tax and Transfer Simulator

    THE AUSTRALIAN ECONOMIC REVIEW, Issue 2 2003
    John Creedy
    This article uses the Melbourne Institute Tax and Transfer Simulator to examine the effects of a reduction in the means-tested benefit taper, or withdrawal, rates in Australia to 30 per cent. That is, all taper rates of 50 per cent and 70 per cent in the March 1998 benefit system are reduced to 30 per cent, while leaving all basic benefit levels unchanged. This change is therefore expected to 'flatten' the tax structure by reducing the high marginal tax rates applying to those with relatively low incomes and increasing the marginal tax rates of medium incomes. Simulations in which all individuals are assumed to remain at their pre-reform labour supply levels are compared with behavioural simulations in which the majority of individuals are free to adjust the number of hours worked. The results reflect only the supply side of the labour market. The database used is the 1997-98 Survey of Income and Housing Costs, so that weekly incomes are based on the financial year 1997-98. The comparison shows that, for sole parents, accounting for behavioural effects of the reform results in a lower estimated expenditure for government, whereas for couples, accounting for behavioural effects results in a higher estimated expenditure. [source]


    Simulating the Behavioural Effects of Welfare Reforms Among Sole Parents in Australia

    THE ECONOMIC RECORD, Issue 242 2002
    Alan Duncan
    This paper derives and estimates an econometric model of labour supply among sole parents in Australia, using modelling techniques which treat the labour supply decision as a utility maximising choice between a given number of discrete states. The model is then used to look at the likely effects of actual and hypothetical welfare policy reforms. Model estimates are based upon net incomes generated by the Melbourne Institute Tax and Transfer Simulator (MITTS), developed at the Melbourne Institute in collaboration with the Department of Family and Community Services (FaCS). [source]


    Temporal Bone Simulator as a Training and Assessment Tool for Temporal Bone Dissection

    THE LARYNGOSCOPE, Issue S3 2010
    Wes A. Allison MD
    No abstract is available for this article. [source]


    Using a Virtual Reality Temporal Bone Simulator to Assess Otolaryngology Trainees

    THE LARYNGOSCOPE, Issue 2 2007
    Molly Zirkle MD
    Abstract Objective: The objective of this study is to determine the feasibility of computerized evaluation of resident performance using hand motion analysis on a virtual reality temporal bone (VR TB) simulator. We hypothesized that both computerized analysis and expert ratings would discriminate the performance of novices from experienced trainees. We also hypothesized that performance on the virtual reality temporal bone simulator (VR TB) would differentiate based on previous drilling experience. Study Design: The authors conducted a randomized, blind assessment study. Methods: Nineteen volunteers from the Otolaryngology–Head and Neck Surgery training program at the University of Toronto drilled both a cadaveric TB and a simulated VR TB. Expert reviewers were asked to assess operative readiness of the trainee based on a blind video review of their performance. Computerized hand motion analysis of each participant's performance was conducted. Results: Expert raters were able to discriminate novices from experienced trainees (P < .05) on cadaveric temporal bones, and there was a trend toward discrimination on VR TB performance. Hand motion analysis showed that experienced trainees had better movement economy than novices (P < .05) on the VR TB. Conclusion: Performance, as measured by hand motion analysis on the VR TB simulator, reflects trainees' previous drilling experience. This study suggests that otolaryngology trainees could accomplish initial temporal bone training on a VR TB simulator, which can provide feedback to the trainee, and may reduce the need for constant faculty supervision and evaluation. [source]
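
    "Movement economy" is typically quantified from tracked instrument positions with metrics such as total path length, number of discrete movements and time taken. The sketch below computes such metrics from synthetic trajectories; the thresholds and data are invented and this is not the study's analysis pipeline.

```python
# Minimal sketch of hand-motion metrics used to quantify movement economy from
# a tracked instrument: total path length, number of discrete movements
# (velocity above a threshold), and time taken.  The synthetic trajectories
# and thresholds are invented, not the study's actual analysis.
import numpy as np

def motion_metrics(positions, dt=0.01, vel_threshold=0.05):
    """positions: (n, 3) array of hand/drill tip coordinates in metres."""
    steps = np.diff(positions, axis=0)
    path_length = np.linalg.norm(steps, axis=1).sum()
    speed = np.linalg.norm(steps, axis=1) / dt
    moving = speed > vel_threshold
    # count rising edges of the "moving" state as discrete movements
    n_movements = int(np.sum(moving[1:] & ~moving[:-1]) + moving[0])
    return path_length, n_movements, len(positions) * dt

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    smooth = np.cumsum(rng.normal(0, 1e-4, size=(3000, 3)), axis=0)   # economical movement
    jerky = np.cumsum(rng.normal(0, 8e-4, size=(3000, 3)), axis=0)    # novice-like movement
    for label, traj in [("experienced-like", smooth), ("novice-like", jerky)]:
        length, moves, duration = motion_metrics(traj)
        print(f"{label:17s} path={length:6.3f} m  movements={moves:4d}  time={duration:.0f} s")
```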


    Impact of the Endoscopic Sinus Surgical Simulator on Operating Room Performance

    THE LARYNGOSCOPE, Issue 7 2002
    Charles V. Edmond Jr. MD
    Abstract Objectives/Hypothesis: The aim of this study is to evaluate an endoscopic sinus surgical simulator (ESS) as a training device and to introduce a methodology to assess its impact on actual operating room performance. Study Design: Prospective evaluation of the endoscopic sinus surgical simulator as a trainer. Methods: Ten junior and senior ear, nose and throat residents served as subjects, some of whom had prior training with the simulator. The evaluation team collected several measures, which were analyzed for a statistical correlation, including simulator scores, operating room performance rating, ratings of videotaped operating room procedures, and surgical competency rating. Results: These findings suggest the ESS simulator positively affects initial operating room performance across all measures as judged by senior surgeons rating anonymous videotapes of those procedures. The two simulation-trained residents were rated consistently better than the other two residents across all measures. These differences approached statistical significance for two items: anterior ethmoidectomy (P = .06; P < .05) and surgical confidence (P = .09; P < .05). In addition, the 3 subjects with the highest overall scores on the competency evaluation also had 3 of the 4 highest cumulative simulation times. Conclusions: The endoscopic sinus surgical simulator is a valid training device and appears to positively impact operating room performance among junior otolaryngology residents. [source]


    An improved PDF cloud scheme for climate simulations

    THE QUARTERLY JOURNAL OF THE ROYAL METEOROLOGICAL SOCIETY, Issue 651 2010
    Akira Kuwano-Yoshida
    Abstract An efficient grid-scale cloud scheme for climate simulation is implemented in the atmospheric general circulation model for the Earth Simulator (AFES). The new cloud scheme uses statistical partial condensation using joint-Gaussian probability distribution functions (PDFs) of the liquid water potential temperature and total water content, with standard deviations estimated by the moist Mellor–Yamada level-2 turbulence scheme. It also adopts improved closure parameters based on large-eddy simulations and a revised mixing length that varies with the stability and turbulent kinetic energy. These changes not only enable better representation of low-level boundary layer clouds, but also improve the atmospheric boundary layer structure. Sensitivity experiments for vertical resolution suggest that O(100–200 m) intervals are adequate to represent well-mixed boundary layers with the new scheme. The new scheme performs well at relatively low horizontal resolution (about 150 km), although inversion layers near the coast become more intense at a higher horizontal resolution (about 50 km). Copyright © 2010 Royal Meteorological Society [source]
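
    The statistical partial-condensation step can be summarised with simple Gaussian algebra: given a grid-mean saturation excess Q and a sub-grid standard deviation σ, the cloud fraction is the probability that the local excess is positive, and the mean condensate is E[max(excess, 0)]. The sketch below evaluates these expressions for a single (assumed univariate) Gaussian; the full scheme uses a joint-Gaussian PDF of liquid-water potential temperature and total water, with σ supplied by the turbulence scheme.

```python
# Minimal sketch of statistical partial condensation with a Gaussian PDF:
# cloud fraction = P(local saturation excess > 0) and mean condensate =
# E[max(excess, 0)].  The numbers are illustrative; the actual scheme uses a
# joint-Gaussian PDF of liquid-water potential temperature and total water.
import math

def partial_condensation(q_excess_mean, sigma):
    q1 = q_excess_mean / sigma
    cloud_fraction = 0.5 * (1.0 + math.erf(q1 / math.sqrt(2.0)))
    phi = math.exp(-0.5 * q1 * q1) / math.sqrt(2.0 * math.pi)       # standard normal density
    mean_condensate = q_excess_mean * cloud_fraction + sigma * phi  # E[max(excess, 0)]
    return cloud_fraction, mean_condensate

if __name__ == "__main__":
    sigma = 0.3e-3   # sub-grid variability of the saturation excess (kg/kg), assumed
    for q_mean in (-0.6e-3, -0.2e-3, 0.0, 0.2e-3):   # grid mean below/at/above saturation
        cf, ql = partial_condensation(q_mean, sigma)
        print(f"mean excess {q_mean*1e3:+.1f} g/kg -> cloud fraction {cf:4.2f}, "
              f"condensate {ql*1e3:.3f} g/kg")
```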


    Pre-oxygenation and apnoea in pregnancy: changes during labour and with obstetric morbidity in a computational simulation

    ANAESTHESIA, Issue 4 2009
    S. H. McClelland
    Summary Using the Nottingham Physiology Simulator, we investigated the effects of labour, obesity, sepsis, pre-eclampsia, maternal haemorrhage and multiple pregnancy on pre-oxygenation and apnoea during rapid sequence induction in term pregnancy. Pre-oxygenation with 100% oxygen was followed by simulated rapid sequence induction when end-tidal nitrogen tension was less than 1 kPa, and apnoea. Labour, morbid obesity and sepsis accelerated pre-oxygenation and de-oxygenation during apnoea. Fastest pre-oxygenation was in labour, with 95% of the maximum change in expired oxygen tension occurring in 47 s, compared to 97 s in a standard pregnant subject. The labouring subject with a body mass index of 50 kg·m⁻² demonstrated the fastest desaturation, the time taken to fall to an arterial saturation < 90% being 98 s, compared to 292 s in a standard pregnant subject. Pre-eclampsia prolonged pre-oxygenation and tolerance to apnoea. Maternal haemorrhage and multiple pregnancy had minor effects. Our results inform the risk-benefit comparison of the anaesthetic options for Caesarean section. [source]
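
    A rough back-of-envelope model explains the direction of these results: safe apnoea time scales with the usable oxygen store (roughly FRC times the alveolar oxygen fraction) divided by oxygen consumption, and labour and obesity reduce the numerator while raising the denominator. The sketch below uses generic textbook-order values chosen purely for illustration; it is not the Nottingham Physiology Simulator model and does not reproduce the study's figures.

```python
# Rough back-of-envelope sketch of why labour and obesity shorten safe apnoea
# time: usable oxygen store ~ FRC * alveolar O2 fraction, consumed at VO2.
# All values are generic, order-of-magnitude assumptions for illustration.

def apnoea_time_minutes(frc_ml, fao2=0.9, usable_fraction=0.7, vo2_ml_per_min=250):
    usable_o2 = frc_ml * fao2 * usable_fraction      # ml of oxygen that can be drawn down
    return usable_o2 / vo2_ml_per_min

subjects = {
    "non-pregnant adult": dict(frc_ml=2300, vo2_ml_per_min=250),
    "term pregnancy":     dict(frc_ml=1800, vo2_ml_per_min=300),   # FRC down, VO2 up
    "labouring, BMI 50":  dict(frc_ml=1200, vo2_ml_per_min=400),   # both effects larger
}

for name, params in subjects.items():
    print(f"{name:20s} ~{apnoea_time_minutes(**params):.1f} min to significant desaturation")
```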


    Application of general-purpose computing on graphics processing units (GPGPU) in CFD techniques for fire safety simulations

    BAUPHYSIK, Issue 4 2009
    Hendrik C. Belaschk Dipl.-Ing.
    Keywords: calculation methods; fire protection engineering. Abstract The use of fire simulation programs based on computational fluid dynamics (CFD) techniques is becoming more and more widespread in practice. The increase in available computing power enables the effects of possible fire scenarios to be modelled in order to derive useful information for practical applications (e.g. analysis of the reliability of fire protection concepts). However, despite the progress in computing power, the performance of currently available computers is inadequate for simulating a building fire including all relevant physical and chemical processes with maximum accuracy. The models for calculating the spread of fire and smoke implemented in the computer programs therefore always represent a compromise between practical computing efficiency and level of modelling detail. This paper illustrates the reasons for the high computing power demand of CFD techniques and describes potential problems and sources of error resulting from the simplifications applied in the models. In addition, the paper presents a new technology approach that massively increases the computing power of a personal computer using special software and standard 3D graphics cards. The Fire Dynamics Simulator (FDS) is used as an example to demonstrate that the required calculation time for a fire simulation on a personal computer can be reduced by a factor of 20 and more. [source]
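
    The reason CFD fire models map well onto GPUs is that most of the work is a data-parallel stencil update applied independently to every grid cell. The NumPy sketch below shows that access pattern with a single explicit diffusion step for a temperature-like field; it only illustrates the parallel structure and is not the FDS solver or a GPU implementation.

```python
# Minimal sketch of the data-parallel stencil pattern that makes CFD fire
# models suitable for GPUs: one explicit diffusion step for a scalar field,
# applied independently to every interior grid cell (on a GPU, one thread per
# cell).  This is only an illustration of the access pattern, not FDS.
import numpy as np

def diffusion_step(field, alpha=0.1):
    """One explicit step of du/dt = alpha * laplacian(u) on a 2D grid (dx = dt = 1)."""
    new = field.copy()
    new[1:-1, 1:-1] = field[1:-1, 1:-1] + alpha * (
        field[2:, 1:-1] + field[:-2, 1:-1] +
        field[1:-1, 2:] + field[1:-1, :-2] -
        4.0 * field[1:-1, 1:-1]
    )
    return new

if __name__ == "__main__":
    temp = np.full((256, 256), 20.0)
    temp[120:136, 120:136] = 600.0          # hot spot standing in for a fire source
    for _ in range(100):
        temp = diffusion_step(temp)
    print(f"peak temperature after 100 steps: {temp.max():.1f} °C")
```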


    Fire play: ICCARUS (Intelligent Command and Control, Acquisition and Review Using Simulation)

    BRITISH JOURNAL OF EDUCATIONAL TECHNOLOGY, Issue 2 2008
    James Powell
    Is it possible to educate a fire officer to deal intelligently with the command and control of a major fire event he will never have experienced? The authors of this paper believe it is, and present here just one solution to this training challenge. It involves the development of an intelligent simulation based upon computer managed interactive media. The expertise and content underpinning this educational development was provided by the West Midlands Fire Service. Their brief for this training programme was unambiguous and to the point:
    1. Do not present the trainee with a model answer, because there are no generic fires. Each incident is novel, complex, and often 'wicked' in that it changes obstructively as it progresses. Thus firefighting demands that Commanders impose their individual intelligence on each problem to solve it.
    2. A suitable Educational Simulator should stand alone; operate in real time; emulate as nearly as possible the 'feel' of the fireground; present realistic fire progress; incorporate the vast majority of those resources normally present at a real incident; bombard the trainee with information from those sources; and provide as few system-prompts as possible.
    3. There should also be an interrogable visual debrief which can be used after the exercise to give the trainees a firm understanding of the effects of their actions. This allows them to draw their own conclusions about their command effectiveness. Additionally, such a record of command and control will be an ideal initiator of tutorial discussion.
    4. The simulation should be realisable on a hardware/software platform of £10 000.
    5. The overriding requirement is that the simulation should 'emulate as nearly as possible the feelings and stresses of the command role'. [source]


    5 The Contraption: A Low-Cost Participatory Hemodynamic Simulator

    ACADEMIC EMERGENCY MEDICINE, Issue 2008
    James Ritchie
    A hemodynamic simulator assembled from readily-available, inexpensive components can be used to demonstrate complex, clinically pertinent physiologic concepts in a hands-on experiential setting. Our simulator is composed of clear plastic tubing, squeeze bulbs, Heimlich valves, simple plastic connectors, balloons, IV tubing, plastic storage containers, a low-pressure gauge, and a child's water wheel. After a short introduction, student participants reproduce cardiac and systemic vascular function in a coordinated simulation. Normal functional physiology is demonstrated, followed by scripted changes in physiologic conditions. At least four students are simultaneously involved in managing the simulation, including squeezing the bulbs in simulating heart chamber contraction, modifying afterload, preload, and heart rate, and assessing output parameters such as blood pressure, cerebral blood flow, and cardiac output. Using this model, we are able to demonstrate and teach the following concepts: preload, afterload, hypertensive consequences, effects of dysrhythmias, valve disorders, preload criticality with disorders such as tamponade and right ventricular MI, gradual nature of change in physiology, normal compensation despite serious malfunction, relationship of blood pressure with cardiac output, shock state despite normal BP, neurogenic shock, septic shock, hypovolemic shock, cardiogenic shock, cardiac work, maximum blood pressure, vasopressor physiology, diastolic dysfunction coupled with decreased preload or atrial dysfunction, and CHF treatment options. Trainee feedback has been overwhelmingly positive. Trainees at all levels of training, including EMTs and senior EM residents, have grasped complex hemodynamic physiology concepts intuitively after participating with this trainer. [source]


    Development of Divertor Plasma Simulators with High Heat Flux Plasmas and its Application to Nuclear Fusion Study: A Review

    IEEJ TRANSACTIONS ON ELECTRICAL AND ELECTRONIC ENGINEERING, Issue 4 2009
    Noriyasu Ohno Member
    Abstract This overview describes recent studies of nuclear fusion by using divertor plasma simulators generating high heat flux plasma. The experimental results regarding plasma-material interaction related to tungsten and carbon dusts, and detached recombining plasma are reviewed. © 2009 Institute of Electrical Engineers of Japan. Published by John Wiley & Sons, Inc. [source]


    Effectiveness of simulation on health profession students' knowledge, skills, confidence and satisfaction

    INTERNATIONAL JOURNAL OF EVIDENCE BASED HEALTHCARE, Issue 3 2008
    Susan Laschinger
    Abstract Background: Despite the recent wave of interest being shown in high-fidelity simulators, they do not represent a new concept in healthcare education. Simulators have been a part of clinical education since the 1950s. The growth of patient simulation as a core educational tool has been driven by a number of factors. Declining inpatient populations, concerns for patient safety and advances in learning theory are forcing healthcare educators to look for alternatives to the traditional clinical encounter for skill acquisition for students. Objective: The aim of this review was to identify the best available evidence on the effectiveness of using simulated learning experiences in pre-licensure health profession education. Inclusion criteria. Types of studies: This review considered any experimental or quasi-experimental studies that addressed the effectiveness of using simulated learning experiences in pre-licensure health profession practice. In the absence of randomised controlled trials, other research designs were considered for inclusion, such as, but not limited to: non-randomised controlled trials and before-and-after studies. Types of participants: This review included participants who were pre-licensure practitioners in nursing, medicine, and rehabilitation therapy. Types of intervention(s)/phenomena of interest: Studies that evaluated the use of human physical anatomical models with or without computer support, including whole-body or part-body simulators, were included. Types of outcome measures: Student outcomes included knowledge acquisition, skill performance, learner satisfaction, critical thinking, self-confidence and role identity. Search strategy: Using a defined search and retrieval method, the following databases were accessed for the period 1995–2006: Medline, CINAHL, Embase, PsycINFO, HealthSTAR, Cochrane Database of Systematic Reviews and ERIC. Methodological quality: Each paper was assessed by two independent reviewers for methodological quality prior to inclusion in the review using the standardised critical appraisal instruments for evidence of effectiveness, developed by the Joanna Briggs Institute. Disagreements were dealt with by consultation with a third reviewer. Data collection: Information was extracted from each paper independently by two reviewers using the standardised data extraction tool from the Joanna Briggs Institute. Disagreements were dealt with by consultation with a third reviewer. Data synthesis: Due to the type of designs and quality of available studies, it was not possible to pool quantitative research study results in statistical meta-analysis. As statistical pooling was not possible, the findings are presented in descriptive narrative form. Results: Twenty-three studies were selected for inclusion in this review, including partial task trainers and high-fidelity human patient simulators. The results indicate that there is high learner satisfaction with using simulators to learn clinical skills. The studies demonstrated that human patient simulators, which are used for teaching higher level skills such as airway management and physiological concepts, are useful. While there are short-term gains in knowledge and skill performance, it is evident that performance of skills declines over time after initial training. Conclusion: At best, simulation can be used as an adjunct for clinical practice, not a replacement for everyday practice. Students enjoyed the sessions and using the models purportedly makes learning easier. However, it remains unclear whether the skills learned through a simulation experience transfer into real-world settings. More research is needed to evaluate whether the skills acquired with this teaching methodology transfer to the practice setting, such as the impact of simulation training on team function. [source]


    Study of parallel numerical methods for semiconductor device simulation

    INTERNATIONAL JOURNAL OF NUMERICAL MODELLING: ELECTRONIC NETWORKS, DEVICES AND FIELDS, Issue 1 2006
    Natalia Seoane
    Abstract Simulators of semiconductor devices have to solve systems of equations generated by the discretization of partial differential equations, and this is the most time-consuming part of the simulation process. Therefore, the use of an effective method to solve these linear systems is essential. In this work we have evaluated the efficiency of different parallel direct and iterative solvers used for the solution of the drift–diffusion equations in semiconductor device simulation. Several preconditioning techniques have been applied in order to minimize the execution times. We have found that the FGMRES and BCGSTAB solvers preconditioned with Additive Schwarz are the most suitable for these types of problems. The results were obtained on an HP Superdome cluster with 128 Itanium2 1.5 GHz processors. Copyright © 2005 John Wiley & Sons, Ltd. [source]
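
    The solver combination named in the abstract (Krylov methods plus a preconditioner) can be illustrated with SciPy's GMRES and BiCGSTAB on a sparse test matrix. In the sketch below a 2D Poisson matrix stands in for the discretised device equations, and an incomplete-LU factorisation stands in for the Additive Schwarz preconditioner, which only makes sense in a parallel, domain-decomposed setting; the intent is only to show how a preconditioner plugs into the iterative solver.

```python
# Minimal sketch of solving a sparse drift-diffusion-like linear system with
# preconditioned Krylov solvers (GMRES and BiCGSTAB).  A 2D Poisson matrix
# stands in for the discretised device equations; an incomplete-LU
# factorisation stands in for the paper's Additive Schwarz preconditioner.
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

n = 50                                               # grid is n x n
main = 4.0 * np.ones(n * n)
off = -1.0 * np.ones(n * n - 1)
off[np.arange(1, n * n) % n == 0] = 0.0              # no coupling across grid-row boundaries
far = -1.0 * np.ones(n * n - n)
A = sp.diags([main, off, off, far, far], [0, 1, -1, n, -n], format="csc")
b = np.ones(n * n)

ilu = spla.spilu(A)                                  # incomplete LU factorisation
M = spla.LinearOperator(A.shape, matvec=ilu.solve)   # preconditioner as a linear operator

for name, solver in [("GMRES", spla.gmres), ("BiCGSTAB", spla.bicgstab)]:
    x, info = solver(A, b, M=M)
    residual = np.linalg.norm(b - A @ x) / np.linalg.norm(b)
    print(f"{name:8s} converged flag={info}  relative residual={residual:.2e}")
```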


    Simulation in a Disaster Drill: Comparison of High-fidelity Simulators versus Trained Actors

    ACADEMIC EMERGENCY MEDICINE, Issue 11 2008
    Brian Gillett MD
    Abstract Objectives: High-fidelity patient simulation provides lifelike medical scenarios with real-time stressors. Mass casualty drills must construct a realistic incident in which providers care for multiple injured patients while simultaneously coping with numerous stressors designed to tax an institution's resources. This study compared the value of high-fidelity simulated patients with live actor-patients. Methods: A prospective cohort study was conducted during two mass casualty drills in December 2006 and March 2007. The providers' completion of critical actions was tested in live actor-patients and simulators. A posttest survey compared the participants' perception of "reality" between the simulators and live actor victims. Results: The victims (n = 130) of the mass casualty drill all had burn-, blast-, or inhalation-related injuries. The participants consisted of physicians, residents, medical students, clerks, and paramedics. The authors compared the team's execution of the 136 critical actions (17 critical actions × 8 scenarios) between the simulators and the live actor-patients. Only one critical action was missed in the simulator group and one in the live actor group, resulting in a miss rate of 0.74% (95% confidence interval [CI] = 0.01% to 4.5%). All questionnaires were returned and analyzed. The vast majority of participants disagreed or strongly disagreed that the simulators were a distraction from the disaster drill. More than 96% agreed or strongly agreed that they would recommend the simulator as a training tool. The mean survey scores for all participants demonstrated agreement that the simulators closely mimicked real-life scenarios, accurately represented disease states, and heightened the realism of patient assessment and treatment options during the drill, with the exception of nurse participants, who agreed slightly less strongly. Conclusions: This study demonstrated that simulators compared to live actor-patients have equivalent results in prompting critical actions in mass casualty drills and increase the perceived reality of such exercises. [source]


    3D virtual simulator for breast plastic surgery

    COMPUTER ANIMATION AND VIRTUAL WORLDS (PREV: JNL OF VISUALISATION & COMPUTER ANIMATION), Issue 3-4 2008
    Youngjun Kim
    Abstract We propose novel 3D virtual simulation software for breast plastic surgery. Our software comprises two processes: 3D torso modeling and virtual simulation of the surgical result. First, image-based modeling is performed in order to obtain a female subject's 3D torso data. Our image-based modeling method utilizes a template model, which is deformed according to the patient's photographs. For the deformation, we applied Procrustes analysis and radial basis functions (RBF). In order to enhance realism, the subject's photographs are mapped onto the mesh. Second, from the modeled subject data, we simulate the subject's virtual appearance after plastic surgery by morphing the shape of the breasts. We solve the simulation problem with an example-based approach: the subject's virtual shape is obtained from the relations between pairs of feature-point sets taken from previous patients' photographs before and after surgery. Copyright © 2008 John Wiley & Sons, Ltd. [source]
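
    The template-deformation step described above can be sketched as radial-basis-function interpolation of landmark displacements: the displacements that carry template landmarks onto the subject's landmarks are interpolated smoothly over every template vertex. The code below uses a Gaussian kernel and invented landmark coordinates, and omits the Procrustes alignment; it is not the paper's implementation.

```python
# Minimal sketch of RBF-based template deformation: landmark displacements
# between a template and a subject are interpolated over the whole template
# mesh with a Gaussian radial basis function, so every vertex moves smoothly
# toward the subject's shape.  Landmarks are invented; Procrustes alignment
# is omitted; this is not the paper's implementation.
import numpy as np

def rbf_warp(template_vertices, src_landmarks, dst_landmarks, eps=50.0):
    """Warp template_vertices so that src_landmarks map (approximately) to dst_landmarks."""
    def kernel(a, b):
        d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (eps ** 2))
    K = kernel(src_landmarks, src_landmarks) + 1e-8 * np.eye(len(src_landmarks))
    weights = np.linalg.solve(K, dst_landmarks - src_landmarks)   # one weight vector per landmark
    return template_vertices + kernel(template_vertices, src_landmarks) @ weights

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    template = rng.uniform(0, 100, size=(500, 3))          # stand-in for a template torso mesh
    src = rng.uniform(0, 100, size=(8, 3))                  # landmarks on the template
    dst = src + rng.normal(0, 5, size=src.shape)            # corresponding subject landmarks
    warped = rbf_warp(template, src, dst)
    check = rbf_warp(src, src, dst)                          # landmarks should land near dst
    print("max landmark error after warp:", float(np.abs(check - dst).max()))
```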