Simulations
Selected Abstracts

DEVELOPMENT AND VALIDATION OF A NEW FACILITY FOR LOW EARTH ORBIT THERMAL CYCLING SIMULATION
EXPERIMENTAL TECHNIQUES, Issue 5 2009
P. Micciché
First page of article [source]

SIMULATION OF THIN-FILM DEODORIZERS IN PALM OIL REFINING
JOURNAL OF FOOD PROCESS ENGINEERING, Issue 2010
ROBERTA CERIANI
ABSTRACT
As the need for healthier fats and oils (natural vitamin and trans fat contents) and the interest in biofuels grow, many changes in the world's vegetable oil market are driving the oil industry to develop new technologies and to recycle traditional ones. Computational simulation is widely used in the chemical and petrochemical industries as a tool for the optimization and design of (new) processes, but this is not yet the case in the edible oil industry. Thin-film deodorizers are novel equipment developed for the steam deacidification of vegetable oils, and no work on the simulation of this type of equipment could be found in the open literature. This paper aims to fill this gap by presenting results from a study of the effect of processing variables, such as temperature, pressure and percentage of stripping steam, on the final quality of the product (deacidified palm oil) in terms of final oil acidity, tocopherol content and neutral oil loss. The simulation results have been evaluated using response surface methodology. The model generated by the statistical analysis for tocopherol retention has been validated by matching its results with industrial data published in the open literature.
PRACTICAL APPLICATIONS
This work is a continuation of our previous works (Ceriani and Meirelles 2004a, 2006; Ceriani et al. 2008), dealing with the simulation of continuous deodorization and/or steam deacidification for a variety of vegetable oils using stage-wise columns, and analyzing both the countercurrent and the cross-flow patterns.
In this work, we have studied thin-film deodorizers, which are novel equipment developed for the steam deacidification of vegetable oils. Here, we highlight issues related to final oil product quality and the corresponding process variables. [source]

MONTE CARLO SIMULATION OF FAR INFRARED RADIATION HEAT TRANSFER: THEORETICAL APPROACH
JOURNAL OF FOOD PROCESS ENGINEERING, Issue 4 2006
F. TANAKA
ABSTRACT
We developed radiation heat transfer models that combine the Monte Carlo (MC) method with a computational fluid dynamics approach, together with two-dimensional heat transfer models based on the fundamental quantum physics of radiation and on fluid dynamics. We investigated far infrared radiation (FIR) heating in laminar and buoyancy-driven airflow. A simple prediction model for laminar airflow was tested against an analytical solution and commercial software (CFX 4). The number of photon tracks adequate for the MC simulation was established. For the complex-geometry model, the predicted results agreed well with the experimental data, with a root mean square error of 3.8 K. Because public concern over food safety is increasing, we applied this model to the prediction of thermal inactivation levels by coupling it with a microbial kinetics model. Under buoyancy airflow conditions, the uniformity of FIR heating was improved by selecting an adequate wall temperature and emissivity. [source]

SOLID FOODS FREEZE-DRYING SIMULATION AND EXPERIMENTAL DATA
JOURNAL OF FOOD PROCESS ENGINEERING, Issue 2 2005
S. KHALLOUFI
ABSTRACT
This article presents a mathematical model describing the unsteady heat and mass transfer during the freeze drying of biological materials. The model was built from the mass and energy balances in the dried and frozen regions of the material undergoing freeze drying. A set of coupled nonlinear partial differential equations permitted the description of the temperature and pressure profiles, together with the position of the sublimation interface.
These equations were cast in a finite element scheme and solved numerically, using the Newton-Raphson approach to handle the nonlinear problem and track the interface position. Most parameters involved in the model (i.e., thermal conductivity, specific heat, density, heat and mass transfer coefficients, etc.) were obtained from experimental data cited in the literature. The dehydration kinetics and the temperature profiles of potato and apple slabs were experimentally determined during freeze drying. The simulation results agreed closely with the experimental water content data. The prediction of temperature profiles within the solid was, however, less accurate. [source]

UNDERGRADUATE APPELLATE SIMULATION IN AMERICAN COLLEGES
JOURNAL OF LEGAL STUDIES EDUCATION, Issue 1 2001
Charles R. Knerr
[source]

GEOLOGICAL MODEL EVALUATION THROUGH WELL TEST SIMULATION: A CASE STUDY FROM THE WYTCH FARM OILFIELD, SOUTHERN ENGLAND
JOURNAL OF PETROLEUM GEOLOGY, Issue 1 2007
S.Y. Zheng
This paper presents an approach to the evaluation of reservoir models using transient pressure data. Braided fluvial sandstones exposed in cliffs in SW England were studied as the surface equivalent of the Triassic Sherwood Sandstone, a reservoir unit at the nearby Wytch Farm oilfield. Three reservoir models were built; each used a different modelling approach, ranging in complexity from stochastic pixel-based modelling using commercially available software to a spreadsheet random number generator. To test these models, numerical well test simulations were conducted using sector models extracted from the geological models. The simulation results were then evaluated against the actual well test data in order to find the model that best represented the field geology. Two wells at the Wytch Farm field were studied. The results suggested that for one of the sampled wells, the model built using the spreadsheet random number generator gave the best match to the well test data.
In this well, the permeability from the test interpretation matched the geometric average permeability. This average is the "correct" upscaled permeability for a random system, which was consistent with the random nature of the geological model. For the second well investigated, a more complex "channel object" model appeared to fit the dynamic data better. All the models were built with stationary properties. However, the well test data suggested that some parts of the field have different statistical properties and hence show non-stationarity. These differences would have to be built into the model representing the local geology. This study presents a workflow that is not yet considered standard in the oil industry, and the use of dynamic data to evaluate geological models requires further development. The study highlights the fact that the comparison or matching of results from reservoir models and well-test analyses is not always straightforward, in that different models may match different wells. It emphasises the need for integrated analyses of geological and engineering data. The methods and procedures presented are intended to form a feedback loop which can be used to evaluate the representativeness of a geological model. [source]

DIEL RHYTHM OF ALGAL PHOSPHATE UPTAKE RATES IN P-LIMITED CYCLOSTATS AND SIMULATION OF ITS EFFECT ON GROWTH AND COMPETITION
JOURNAL OF PHYCOLOGY, Issue 4 2002
Chi-Yong Ahn
Oscillations in the phosphate (Pi) uptake rates of three species of green algae were examined in a P-limited cyclostat. For Ankistrodesmus convolutus Corda and Chlorella vulgaris Beyerinck, the Pi uptake rates increased during the daytime and decreased at night. In contrast, Chlamydomonas sp. exhibited the opposite uptake pattern. Cell densities also oscillated under a light:dark cycle, with division occurring at a species-specific time rather than continuously. In general, the cell densities exhibited an inverse relationship with the Pi uptake rates.
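As an aside on the geometric-average upscaling invoked in the Wytch Farm abstract above: for a spatially random permeability field, the effective (upscaled) permeability is commonly taken as the geometric mean of the cell values, which always lies below the arithmetic mean. A minimal sketch, with hypothetical permeability values in millidarcy (not data from the study):

```python
import math

def geometric_mean(values):
    """Geometric mean via log-averaging, numerically safer than a raw product."""
    return math.exp(sum(math.log(v) for v in values) / len(values))

# Hypothetical cell permeabilities (mD) from a random geological model.
perms = [100.0, 200.0, 400.0, 800.0]

k_geo = geometric_mean(perms)    # "correct" upscaled value for a random field
k_ari = sum(perms) / len(perms)  # arithmetic mean: upper bound (flow along layers)

print(round(k_geo, 2), round(k_ari, 2))  # geometric mean < arithmetic mean
```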
A competition experiment between A. convolutus and C. vulgaris in a P-limited cyclostat resulted in the dominance of C. vulgaris, regardless of the relative initial cell concentrations. Chlorella vulgaris also dominated in a mixed culture with Chlamydomonas sp., irrespective of the initial seeding ratio and dilution rate. However, Chlamydomonas sp. and A. convolutus coexisted in their competition experiment, with a gradual decrease of Chlamydomonas sp. when the two were equally inoculated. Mathematical expressions for the oscillations in the Pi uptake rate and for the species-specific cell division gate were used to develop a simulation model based on the Droop equation. The simulation results for each of the species conformed reasonably well to the experimental data. The results of the competition experiments also matched the competition simulation predictions quite well, although the experimental competition was generally more delayed than in the simulations. In conclusion, the model simulation incorporating the effect of diel rhythms in nutrient uptake clearly demonstrated that species diversity can be enhanced by different oscillation patterns in resource uptake, even under limitation by the same resource. [source]

NUMERICAL MODELING AND SIMULATION ON THE SWALLOWING OF JELLY
JOURNAL OF TEXTURE STUDIES, Issue 4 2009
H. MIZUNUMA
ABSTRACT
Studies of the swallowing process are especially important for the development of care foods for dysphagia. However, the effectiveness of experiments on human subjects is somewhat limited owing to instrument resolution, stress to the subjects and the risk of aspiration. These problems may be resolved if numerical simulation of swallowing can be used as an alternative investigative tool. On this basis, a numerical model is proposed to simulate the swallowing of a simple jelly bolus.
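The Droop (cell-quota) equation on which the phycology competition simulation above is based makes growth depend on the internal nutrient quota rather than on the external concentration. A minimal forward-Euler sketch; all parameter values are hypothetical, not taken from the study:

```python
# Droop model: mu = mu_max * (1 - Q_min / Q), where Q is the internal P quota;
# the quota is replenished by uptake rho and diluted by growth.
# Parameter values below are illustrative only.
mu_max = 1.2     # maximum specific growth rate (1/day)
Q_min = 0.5      # subsistence quota (fmol P per cell)
rho = 1.0        # phosphate uptake rate (fmol P per cell per day)

Q, dt = 2.0, 0.001               # initial quota; Euler time step (days)
for _ in range(200_000):         # integrate 200 days, ample for steady state
    mu = mu_max * (1.0 - Q_min / Q)   # Droop growth rate
    Q += dt * (rho - mu * Q)          # quota balance: uptake minus dilution

# At steady state rho = mu * Q, i.e. Q* = Q_min + rho / mu_max.
print(round(Q, 3), round(mu, 3))
```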
The structure of the pharynx was modeled using a finite element method, and the swallowing movements were defined by posterior pharyngeal wall shift, laryngeal elevation and epiglottis retroflexion. The rheological characteristics of the jelly were investigated using an oscillatory rheometer and a compression test. A Maxwell three-element model was applied as the rheological model of the jelly. The model constants were obtained from the compression tests because their mode of deformation and stress level were similar to those of the swallowed jelly. The frictional relationship between the organs and the jelly was estimated experimentally from frictional measurements between the jelly and a wet sloping surface. The simulations of the soft and hard jellies showed different patterns of swallowing depending on hardness, and the soft jelly produced faster swallowing because of its flexibility.
PRACTICAL APPLICATIONS
The objective of this study is to develop a numerical simulation model of swallowing. Numerical modeling is suited to the quantitative analysis of the swallowing process and may also be expected to enable a systematic study of care foods that are safe and offer some degree of comfort to patients suffering from swallowing disorders. The computer simulation can be used for evaluation without dangerous risks to the patient. [source]

HYDROLOGIC SIMULATION OF THE LITTLE WASHITA RIVER EXPERIMENTAL WATERSHED USING SWAT
JOURNAL OF THE AMERICAN WATER RESOURCES ASSOCIATION, Issue 2 2003
Michael W. Van Liew
ABSTRACT:
Precipitation and streamflow data from three nested subwatersheds within the Little Washita River Experimental Watershed (LWREW) in southwestern Oklahoma were used to evaluate the capability of the Soil and Water Assessment Tool (SWAT) to predict streamflow under varying climatic conditions.
Eight years of precipitation and streamflow data were used to calibrate parameters in the model, and 15 years of data were used for model validation. SWAT was calibrated on the smallest and largest subwatersheds for a wetter-than-average period of record. The model was then validated on a third subwatershed over a range of climatic conditions that included dry, average and wet periods. Calibration of the model involved a multistep approach. A preliminary calibration was conducted to estimate model parameters so that measured and simulated yearly and monthly runoff were in agreement for the respective calibration periods. Model parameters were then fine-tuned based on a visual inspection of daily hydrographs and flow frequency curves. Calibration on a daily basis resulted in higher baseflows and lower peak runoff rates than were obtained in the preliminary calibration. Test results show that once the model was calibrated for wet climatic conditions, it predicted streamflow responses well over the wet, average and dry climatic conditions selected for model validation. Monthly coefficients of efficiency were 0.65, 0.86 and 0.45 for the dry, average and wet validation periods, respectively. The results of this investigation indicate that, once calibrated, SWAT is capable of providing adequate simulations for hydrologic investigations related to the impact of climate variations on the water resources of the LWREW. [source]

STORMFLOW SIMULATION USING A GEOGRAPHICAL INFORMATION SYSTEM WITH A DISTRIBUTED APPROACH
JOURNAL OF THE AMERICAN WATER RESOURCES ASSOCIATION, Issue 4 2001
Zhongbo Yu
ABSTRACT:
With the increasing availability of digital and remotely sensed data, such as land use, soil texture and digital elevation models (DEMs), geographic information systems (GIS) have become an indispensable tool for preprocessing data sets for watershed hydrologic modeling and for postprocessing simulation results.
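The monthly "coefficients of efficiency" quoted for the SWAT validation above are, in this literature, Nash-Sutcliffe efficiencies. A minimal sketch of that computation on hypothetical observed and simulated monthly flows (not data from the Little Washita study):

```python
def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 - SSE / SST. A value of 1.0 is a perfect
    fit; 0.0 means the model is no better than the mean of the observations."""
    mean_obs = sum(observed) / len(observed)
    sse = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    sst = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - sse / sst

# Hypothetical monthly runoff values (mm).
obs = [2.0, 4.0, 6.0, 8.0]
sim = [2.5, 3.5, 6.5, 7.5]
nse = nash_sutcliffe(obs, sim)
print(nse)  # 0.95
```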
However, model inputs and outputs must be transferred between the model and the GIS. These transfers can be greatly simplified by incorporating the model itself into the GIS environment. To this end, a simple hydrologic model, which incorporates the curve number method of rainfall-runoff partitioning, a ground-water base-flow routine and the Muskingum flow routing procedure, was implemented in the GIS. The model interfaces directly with the stream network, flow direction and watershed boundary data generated using standard GIS terrain analysis tools, and while the model is running, the various data layers may be viewed at each time step using the full display capabilities. The terrain analysis tools were first used to delineate the drainage basins and stream networks of the Susquehanna River. The model was then used to simulate the hydrologic response of the Upper West Branch of the Susquehanna to two different storms. The simulated streamflow hydrographs compare well with the observed hydrographs at the basin outlet. [source]

LARGE-SCALE SIMULATION OF THE HUMAN ARTERIAL TREE
CLINICAL AND EXPERIMENTAL PHARMACOLOGY AND PHYSIOLOGY, Issue 2 2009
L Grinberg
SUMMARY
1. Full-scale simulations of the virtual physiological human (VPH) will require significant advances in modelling, multiscale mathematics, scientific computing and further advances in medical imaging. Herein, we review some of the main issues that need to be resolved in order to make three-dimensional (3D) simulations of blood flow in the human arterial tree feasible in the near future.
2. A straightforward approach is computationally prohibitive even on the emerging petaflop supercomputers, so a three-level hierarchical approach based on vessel size is required, consisting of: (i) a macrovascular network (MaN); (ii) a mesovascular network (MeN); and (iii) a microvascular network (MiN).
We present recent simulations of MaN obtained by solving the 3D Navier–Stokes equations on arterial networks with tens of arteries and bifurcations, accounting for the neglected dynamics through proper boundary conditions.
3. A multiscale simulation coupling MaN–MeN–MiN and running on hundreds of thousands of processors on petaflop computers will require no more than a few CPU hours per cardiac cycle within the next 5 years. The rapidly growing capacity of supercomputing centres opens up the possibility of simulation studies of cardiovascular diseases, drug delivery, perfusion in the brain and other pathologies. [source]

III. SIMULATIONS 1a, 1b, AND 1c: THE ROLE OF MOVING PARTS IN FORMING REPRESENTATIONS OF OBJECTS
MONOGRAPHS OF THE SOCIETY FOR RESEARCH IN CHILD DEVELOPMENT, Issue 1 2008
Article first published online: 16 MAR 200
First page of article [source]

Instream Flow Science For Sustainable River Management
JOURNAL OF THE AMERICAN WATER RESOURCES ASSOCIATION, Issue 5 2009
Geoffrey E Petts
Abstract:
Concerns for water resources have inspired research developments to determine the ecological effects of water withdrawals from rivers and of flow regulation below dams, and to advance tools for determining the flows required to sustain healthy riverine ecosystems. This paper reviews the advances of this environmental flows science over the past 30 years since the introduction of the Instream Flow Incremental Methodology. Its central component, Physical HABitat SIMulation, has had a global impact, internationalizing the e-flows agenda and promoting new science. A global imperative to set e-flows, including an emerging trend to set standards at the regional scale, has led to developments of hydrological and hydraulic approaches, but expert judgment remains a critical element of the complex decision-making process around water allocations.
It is widely accepted that river ecosystems are dependent upon the natural variability of flow (the flow regime) that is typical of each hydro-climatic region and upon the range of habitats found within each channel type within each region. But as the sophistication of physical (hydrological and hydraulic) models has advanced, the emerging biological evidence to support those assumptions has remained limited. Empirical studies have been important to validate instream flow recommendations, but they have not generated transferable relationships because of the complex nature of biological responses to hydrological change, which must be evaluated over decadal time-scales. New models are needed to incorporate our evolving knowledge of climate cycles and morphological sequences of channel development, but most importantly we need long-term research involving both physical scientists and biologists to develop new models of population dynamics that will advance the biological basis for 21st-century e-flow science. [source]

Simulation of compression refrigeration systems
COMPUTER APPLICATIONS IN ENGINEERING EDUCATION, Issue 3 2006
Jaime Sieres
Abstract
This study presents the main features of a software package for simulating vapor compression refrigeration systems that are designed by the user. A library of 10 different components is available: compressor, expansion device, condenser, evaporator, heat exchanger, flash tank, direct intercooler flash tank, indirect intercooler flash tank, mixer and splitter. With these components and a library of different refrigerants, many different refrigeration systems may be solved. Through a user-friendly interface, the user can draw the system scheme by adding the different components, connecting them and entering the input data. Results are presented in the form of tables, and the cycle diagram of the system is drawn on the log P–h and T–s thermodynamic charts. © 2006 Wiley Periodicals, Inc.
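The state-point bookkeeping behind such a cycle simulator can be illustrated with the ideal single-stage vapor-compression cycle: the coefficient of performance is the evaporator enthalpy gain divided by the compressor enthalpy rise, with an isenthalpic expansion device. The enthalpy values below are hypothetical, not output of the described software:

```python
# Ideal single-stage vapor-compression cycle; state points as drawn on the
# log P-h diagram: 1 evaporator outlet, 2 compressor outlet, 3 condenser
# outlet, 4 evaporator inlet. Enthalpy values are hypothetical (kJ/kg).
h1, h2, h3 = 400.0, 430.0, 250.0
h4 = h3                       # expansion device: isenthalpic throttling

q_evap = h1 - h4              # refrigerating effect per kg of refrigerant
w_comp = h2 - h1              # specific compressor work
cop = q_evap / w_comp         # coefficient of performance

print(q_evap, w_comp, cop)    # 150.0 30.0 5.0
```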
Comput Appl Eng Educ 14: 188–197, 2006; Published online in Wiley InterScience (www.interscience.wiley.com); DOI 10.1002/cae.20075 [source]

Hybrid Simulation of Miscible Mixing with Viscous Fingering
COMPUTER GRAPHICS FORUM, Issue 2 2010
Seung-Ho Shin
Abstract
By modeling mass transfer phenomena, we simulate solids and liquids dissolving or changing into other substances. We also deal with the very small-scale phenomena that occur when one fluid spreads out at the interface of another. We model the pressure at the interfaces between fluids with Darcy's law and represent the viscous fingering phenomenon, in which a fluid interface spreads out with a fractal-like shape. We use hybrid grid-based simulation and smoothed particle hydrodynamics (SPH) to simulate intermolecular diffusion and attraction using particles at a computable scale. We have produced animations showing fluids mixing and objects dissolving. [source]

A Hybrid Approach to Multiple Fluid Simulation using Volume Fractions
COMPUTER GRAPHICS FORUM, Issue 2 2010
Nahyup Kang
Abstract
This paper presents a hybrid approach to multiple fluid simulation that can handle miscible and immiscible fluids simultaneously. We combine distance functions and volume fractions to capture not only the discontinuous interface between immiscible fluids but also the smooth transition between miscible fluids. Our approach consists of four steps: velocity field computation, volume fraction advection, miscible fluid diffusion, and visualization. By providing a scheme for combining volume fractions with level set functions, we are able to take advantage of both representations of fluids. From the system point of view, our work is the first Eulerian grid-based multiple fluid simulation to include both miscible and immiscible fluids. From the technical point of view, our approach addresses the issues arising from variable density and viscosity, together with material diffusion.
We demonstrate the effectiveness of our approach in handling multiple miscible and immiscible fluids through experiments. [source]

A Time Model for Time-Varying Visualization
COMPUTER GRAPHICS FORUM, Issue 6 2009
M. Wolter
I.3.6 [Computer Graphics]: Methodology and Techniques; I.6.6 [Simulation and Modelling]: Simulation Output Analysis
Abstract
The analysis of unsteady phenomena is an important topic for scientific visualization. Several time-dependent visualization techniques exist, as do solutions for dealing with the enormous size of time-varying data in interactive visualization. Many current visualization toolkits support displaying time-varying data sets. However, for the interactive exploration of time-varying data in scientific visualization, no common time model describing the temporal properties that occur in the visualization process has been established. In this work, we propose a general time model which classifies the time frames of simulation phenomena and the connections between the different time scales in the analysis process. The model is designed for intuitive interaction with time in visualization applications, for the domain expert as well as for the developer of visualization tools. We demonstrate the benefits of our model by applying it to two use cases with different temporal properties. [source]

Simulation of two-phase flow with sub-scale droplet and bubble effects
COMPUTER GRAPHICS FORUM, Issue 2 2009
Viorel Mihalef
Abstract
We present a new Eulerian-Lagrangian method for physics-based simulation of fluid flow, which includes automatic generation of sub-scale spray and bubbles. The Marker Level Set method is used to provide a simple geometric criterion for free marker generation. A filtering method, inspired by Weber number thresholding, further controls the free marker generation in a physics-based manner. Two separate models are used, one for sub-scale droplets, the other for sub-scale bubbles.
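As context for the sub-scale bubble treatment in the two-phase flow abstract: a classic closure of the kind it cites evolves each bubble at its Stokes terminal velocity, balancing buoyancy against viscous drag. A sketch with hypothetical bubble parameters (not values from the paper):

```python
# Stokes'-law terminal rise velocity of a small bubble: buoyancy balances
# viscous drag, v = 2 * g * r^2 * (rho_l - rho_g) / (9 * mu). Valid only at
# small Reynolds number; all values below are illustrative.
g = 9.81        # gravitational acceleration (m/s^2)
r = 50e-6       # bubble radius (m)
rho_l = 998.0   # liquid (water) density (kg/m^3)
rho_g = 1.2     # gas (air) density (kg/m^3)
mu = 1.0e-3     # liquid dynamic viscosity (Pa s)

v = 2.0 * g * r**2 * (rho_l - rho_g) / (9.0 * mu)   # terminal velocity (m/s)
re = rho_l * v * (2.0 * r) / mu                     # Reynolds number check
print(round(v * 1000.0, 2), round(re, 2))           # mm/s and Re
```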
Droplets are evolved in a Newtonian manner, using a density-extension drag force field, while bubbles are evolved using a model based on Stokes' law. We show that our model for sub-scale droplet and bubble dynamics is simple to couple with a full (macro-scale) Navier-Stokes two-phase flow model and is quite powerful in its applications. Our animations include coarse-grained multiphase features interacting with fine-scale multiphase features. [source]

Pedestrian Reactive Navigation for Crowd Simulation: a Predictive Approach
COMPUTER GRAPHICS FORUM, Issue 3 2007
Sébastien Paris
This paper addresses the problem of autonomous navigation of virtual pedestrians for crowd simulation. It describes a method for solving interactions between pedestrians and avoiding inter-collisions. Our approach is agent-based and predictive: each agent perceives the surrounding agents and extrapolates their trajectories in order to react to potential collisions. We aim at obtaining realistic results, so the proposed model is calibrated from experimental motion capture data. Our method is shown to be valid and solves major drawbacks of previous approaches, such as oscillations due to a lack of anticipation. We first describe the mathematical representation used in our model, then detail its implementation, and finally its calibration and validation against real data. [source]

Recent Developments and Applications of Haptic Devices
COMPUTER GRAPHICS FORUM, Issue 2 2003
S. D. Laycock
Abstract
Over recent years a variety of haptic feedback devices have been developed and are being used in a number of important applications. They range from joysticks used in the entertainment industry to specialised devices used in medical applications. This paper describes the recent developments of these devices and shows how they have been applied.
It also examines how haptic feedback has been combined with visual display devices, such as virtual reality walls and workbenches, in order to improve the immersive experience. ACM CCS: H.5.2 [Information Interfaces and Presentation]: Haptic I/O; I.3.8 [Computer Graphics]: Applications; I.6 [Simulation and Modelling]: Applications [source]

STRANDS: Interactive Simulation of Thin Solids using Cosserat Models
COMPUTER GRAPHICS FORUM, Issue 3 2002
Dinesh K. Pai
Strands are thin elastic solids that are visually well approximated as smooth curves, yet possess essential physical behaviors characteristic of solid objects, such as twisting. Common examples in computer graphics include sutures, catheters and tendons in surgical simulation, and hairs, ropes and vegetation in animation. Physical models based on spring meshes or 3D finite elements for such thin solids are either inaccurate or inefficient for interactive simulation. In this paper we show that models based on the Cosserat theory of elastic rods are very well suited to interactive simulation of these objects. The physical model reduces to a system of spatial ordinary differential equations that can be solved efficiently for typical boundary conditions. The model handles the important geometric non-linearity due to large changes in shape. We introduce the Cosserat-type physical models, describe efficient numerical methods for their interactive simulation, and present implementation results. [source]

Fast and Controllable Simulation of the Shattering of Brittle Objects
COMPUTER GRAPHICS FORUM, Issue 2 2001
Jeffrey Smith
First page of article [source]

Fuzzy Monte Carlo Simulation and Risk Assessment in Construction
COMPUTER-AIDED CIVIL AND INFRASTRUCTURE ENGINEERING, Issue 4 2010
N. Sadeghi
However, subjective and linguistically expressed information results in added non-probabilistic uncertainty in construction management. Fuzzy logic has been used successfully for representing such uncertainties in construction projects.
In practice, an approach that can handle both random and fuzzy uncertainties in a risk assessment model is necessary. This article discusses the deficiencies of the available methods and proposes a Fuzzy Monte Carlo Simulation (FMCS) framework for risk analysis of construction projects. In this framework, we construct a fuzzy cumulative distribution function as a novel way to represent uncertainty. To verify the feasibility of the FMCS framework and demonstrate its main features, the authors have developed a special-purpose simulation template for cost range estimating. This template is employed to estimate the cost of a highway overpass project. [source]

Integrating Messy Genetic Algorithms and Simulation to Optimize Resource Utilization
COMPUTER-AIDED CIVIL AND INFRASTRUCTURE ENGINEERING, Issue 6 2009
Tao-ming Cheng
Various resource distribution modeling scenarios were tested in simulation to determine their system performances. MGA operations were then applied in the selection of the best resource utilization schemes based on those performances. A case study showed that this new modeling mechanism, along with the implemented computer program, could not only ease the process of developing optimal resource utilization, but could also improve the system performance of the simulation model. [source]

Simulation of Accuracy Performance for Wireless Sensor-Based Construction Asset Tracking
COMPUTER-AIDED CIVIL AND INFRASTRUCTURE ENGINEERING, Issue 5 2009
Mirosław J. Skibniewski
In particular, identifying the locations of distributed mobile entities through wireless communications becomes the primary task in realizing the remote tracking and monitoring of construction assets.
Even though several alternative solutions have been introduced, utilizing recent technologies such as radio frequency identification (RFID) and the global positioning system (GPS), they have not provided a solid path toward accurate and scalable tracking frameworks in large-scale construction domains, owing to limited capability and inflexible networking architectures. This article introduces a new tracking architecture using wireless sensor modules and characterizes its accuracy performance using a numerical simulation approach based on the time-of-flight method. By combining radio frequency (RF) and ultrasound (US) signals, the simulation showed an enhanced accuracy performance over the use of an RF signal alone. The proposed approach can provide guidelines for further exploration of hardware/software design and for experimental analysis to implement the construction asset tracking framework. [source]

Nondestructive Evaluation of Elastic Properties of Concrete Using Simulation of Surface Waves
COMPUTER-AIDED CIVIL AND INFRASTRUCTURE ENGINEERING, Issue 8 2008
Jae Hong Kim
In this study, to extract information from a surface waveform beyond the simple wave velocity, artificial intelligence engines are employed to estimate the simulation parameters, that is, the properties of the elastic materials. The artificial neural networks developed are trained with a numerical database whose stability has been secured. In the process, an appropriate shape of the force–time function for an impact load is assumed so as to avoid the Gibbs phenomenon, and the proposed principal wavelet-component analysis accomplishes feature extraction on the wavelet-transformed signal. The results of the estimation are validated with experiments focused on concrete materials.
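The RF/ultrasound combination in the asset-tracking abstract above exploits the large speed gap between the two signals: the RF pulse arrives effectively instantaneously over room scales, so the RF-to-ultrasound arrival delay multiplied by the speed of sound gives the range. A minimal sketch with hypothetical timestamps (not values from the article):

```python
V_SOUND = 343.0   # speed of sound in air at roughly 20 C (m/s)

def tof_range(t_rf_arrival, t_us_arrival):
    """Distance from the gap between RF and ultrasound arrival times.
    RF travel time is negligible at these scales, so the gap is effectively
    the ultrasound time of flight."""
    return V_SOUND * (t_us_arrival - t_rf_arrival)

# Hypothetical arrival timestamps (seconds) at one receiver node.
d = tof_range(t_rf_arrival=0.000, t_us_arrival=0.005)
print(round(d, 3))   # distance in metres
```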
[source]

Bi-level Programming Formulation and Heuristic Solution Approach for Dynamic Traffic Signal Optimization
COMPUTER-AIDED CIVIL AND INFRASTRUCTURE ENGINEERING, Issue 5 2006
Dazhi Sun
Conventional methods of signal timing optimization assume a given traffic flow pattern, whereas traffic assignment is performed under the assumption of fixed signal timing. This study develops a bi-level programming formulation and heuristic solution approach (HSA) for dynamic traffic signal optimization in networks with time-dependent demand and stochastic route choice. In the bi-level programming model, the upper-level problem represents the decision-making behavior (signal control) of the system manager, while user travel behavior is represented at the lower level. The HSA consists of a Genetic Algorithm (GA) and a Cell Transmission Simulation (CTS)-based Incremental Logit Assignment (ILA) procedure. The GA is used to seek the upper-level signal control variables. ILA is developed to find the user-optimal flow pattern at the lower level, and CTS is implemented to propagate traffic and collect real-time traffic information. The performance of the HSA is investigated in numerical applications on a sample network. These applications compare the efficiency and quality of the global optima achieved by an Elitist GA and a Micro GA. Furthermore, the impact of different information updating frequencies and different GA population sizes on system performance is analyzed. [source]

Initialization Strategies in Simulation-Based SFE Eigenvalue Analysis
COMPUTER-AIDED CIVIL AND INFRASTRUCTURE ENGINEERING, Issue 5 2005
Song Du
Poor initializations often result in slow convergence, and in certain instances may lead to an incorrect or irrelevant answer. The problem of selecting an appropriate starting vector becomes even more complicated when the structure involved is characterized by properties that are random in nature. Here, a good initialization for one sample could be poor for another sample.
Thus, proper eigenvector initialization for uncertainty analysis involving Monte Carlo simulations is essential for efficient random eigenvalue analysis. Most simulation procedures to date have been sequential in nature: a random vector describing the structural system is simulated, an FE analysis is conducted, the response quantities are identified by post-processing, and the process is repeated until the standard error in the response of interest is within desired limits. A different approach is to generate all the sample (random) structures prior to performing any FE analysis, sequentially rank-order them according to some appropriate measure of distance between the realizations, and perform the FE analyses in that rank order, using the results of the previous analysis as the initialization for the current one. The sample structures may also be ordered into a tree-type data structure, where each node represents a random sample; the traversal of the tree starts from the root and continues until every node has been visited exactly once. This approach differs from the sequential ordering approach in that it uses the solution of the "closest" node to initialize the iterative solver. The computational efficiencies that result from such orderings (at a modest expense of additional data storage) are demonstrated through a stability analysis of a system with closely spaced buckling loads and the modal analysis of a simply supported beam. [source]

From a Product Model to Visualization: Simulation of Indoor Flows with Lattice-Boltzmann Methods
COMPUTER-AIDED CIVIL AND INFRASTRUCTURE ENGINEERING, Issue 6 2004
Siegfried Kühner
All models are derived from a product data model based on Industry Foundation Classes. Concepts of the Lattice-Boltzmann method, used as the numerical kernel of our simulation system, are described. We take advantage of spacetrees as a central data structure for all geometry-related objects.
Finally, we describe some advanced postprocessing and visualization techniques that allow huge amounts of simulation data to be analyzed efficiently. [source]

Efficient and fair scheduling for two-level information broadcasting systems
CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 18 2008
Byoung-Hoon Lee
Abstract
In a ubiquitous environment, there are many applications in which a server disseminates information of common interest to pervasive clients and devices. For example, an advertisement server sends information from a broadcast server to display devices. We propose an efficient information scheduling scheme for information broadcast systems that reduces the average waiting time for information access while maintaining fairness between information items. Our scheme allocates information items adaptively according to their relative popularity at each local server. Simulation results show that our scheme can reduce the waiting time by up to 30% compared with the round robin scheme while maintaining cost-effective fairness. Copyright © 2008 John Wiley & Sons, Ltd. [source]
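The kind of gain over round robin reported in the final abstract can be reproduced analytically for an idealized periodic broadcast: if item i occupies k_i of N unit slots, evenly spaced, a random request waits N/(2*k_i) slots on average, and the classic square-root rule from the broadcast-scheduling literature allocates slots proportional to the square root of the access probability. A sketch with hypothetical popularities and idealized, real-valued slot counts (not the two-level scheme of the paper itself):

```python
import math

def mean_wait(popularity, slots):
    """Expected wait, in slot units, for a periodic broadcast cycle in which
    item i occupies slots[i] of the n slots, evenly spaced: a random request
    for item i waits n / (2 * slots[i]) on average."""
    n = sum(slots)
    return sum(p * n / (2.0 * k) for p, k in zip(popularity, slots))

p = [0.6, 0.3, 0.1]   # hypothetical access probabilities for three items
N = 10.0              # cycle length in unit-length slots (idealized)

round_robin = [N / len(p)] * len(p)          # equal share for every item
w = [math.sqrt(pi) for pi in p]
sqrt_rule = [N * wi / sum(w) for wi in w]    # slots proportional to sqrt(p)

w_rr = mean_wait(p, round_robin)
w_sqrt = mean_wait(p, sqrt_rule)
print(round(w_rr, 3), round(w_sqrt, 3))      # popularity-aware wait is lower
```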