Simulation Approach


Kinds of Simulation Approach

  • Monte Carlo simulation approach


  • Selected Abstracts


    A Parallelised High Performance Monte Carlo Simulation Approach for Complex Polymerisation Kinetics

    MACROMOLECULAR THEORY AND SIMULATIONS, Issue 6 2007
    Hugh Chaffey-Millar
    Abstract A novel, parallelised approach to Monte Carlo simulations for the computation of full molecular weight distributions (MWDs) arising from complex polymerisation reactions is presented. The parallel Monte Carlo method constitutes perhaps the most comprehensive route to the simulation of full MWDs of multiple chain length polymer entities and can also provide detailed microstructural information. New fundamental insights have been developed with regard to the Monte Carlo process in at least three key areas: (i) an insufficient system size is demonstrated to create inaccuracies via poor representation of the most improbable events and least numerous species; (ii) advanced algorithmic principles and compiler technology known from computer science have been used to provide speed improvements; and (iii) the parallelisability of the algorithm has been explored and excellent scalability demonstrated. At present, the parallel Monte Carlo method presented herein compares very favourably in speed with the latest developments in the h-p Galerkin method-based PREDICI software package while providing significantly more detailed microstructural information. It seems viable to fuse parallel Monte Carlo methods with those based on the h-p Galerkin methods to achieve an optimum of information depth for the modelling of complex macromolecular kinetics and the resulting microstructural information. [source]
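    As a rough illustration of the kinetic Monte Carlo idea underlying such MWD computations (a minimal serial sketch with purely illustrative rate constants, not the authors' parallelised algorithm), a Gillespie-type simulation of propagation and termination might look as follows:

```python
import math
import random
from collections import Counter

def gillespie_polymerisation(n_monomer=50_000, n_radicals=200,
                             k_prop=1.0, k_term=0.05, seed=1):
    """Minimal kinetic Monte Carlo (Gillespie) sketch of chain growth.

    Living radicals add monomer (propagation) or terminate pairwise by
    disproportionation; the resulting chain-length distribution is a crude
    stand-in for a molecular weight distribution (MWD). Rate constants and
    counts are illustrative, not fitted to any real system.
    """
    rng = random.Random(seed)
    monomer = n_monomer
    radicals = [1] * n_radicals          # current chain length of each living radical
    chains = Counter()                   # chain length -> number of dead chains
    t = 0.0
    while monomer > 0 and len(radicals) > 1:
        a_prop = k_prop * monomer * len(radicals)
        a_term = k_term * len(radicals) * (len(radicals) - 1) / 2
        a_tot = a_prop + a_term
        t += -math.log(rng.random()) / a_tot           # exponential waiting time
        if rng.random() * a_tot < a_prop:              # propagation event
            i = rng.randrange(len(radicals))
            radicals[i] += 1
            monomer -= 1
        else:                                          # termination event
            i, j = rng.sample(range(len(radicals)), 2)
            for k in sorted((i, j), reverse=True):
                chains[radicals.pop(k)] += 1
    chains.update(radicals)              # include chains still living at the end
    return chains, t

if __name__ == "__main__":
    mwd, t_end = gillespie_polymerisation()
    total = sum(mwd.values())
    number_avg = sum(n * c for n, c in mwd.items()) / total
    print(f"chains: {total}, number-average chain length: {number_avg:.1f}")
```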


    A Holistic Simulation Approach from a Measured Load to Element Stress Using Combined Multi-body Simulation and Finite Element Modelling

    PROCEEDINGS IN APPLIED MATHEMATICS & MECHANICS, Issue 1 2009
    Matthias Harter
    The design of vehicle bodies requires knowledge of the vehicle's structural response to external loads and disturbances. In rigid multi-body simulation, the dynamic behaviour of complex systems is calculated with rigid bodies, neglecting body elasticity. In finite element models, on the other hand, large numbers of degrees of freedom are used to represent the elastic properties of a single body. Both simulation methods can be combined if the finite element model is reduced to a number of degrees of freedom feasible for multi-body simulation. The application to practical purposes requires the use and interconnection of several different software tools. In this contribution a holistic method is presented, which starts with the measurement or synthesis of loads and excitations, continues with the integration of a reduced finite element model into a multi-body system and the dynamic response calculation of this combined model, and concludes with the expansion of the results to the full finite element model for calculating strain and stress values at any point of the finite element mesh. The applied software tools are Simpack, Nastran, and Matlab. An example is given with a railway vehicle simulated on measured track geometry. (© 2009 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim) [source]
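    The reduction and expansion steps described above can be illustrated with a static (Guyan) condensation sketch; the actual Simpack/Nastran workflow typically uses more elaborate reductions such as Craig-Bampton, so this is only a minimal stand-in with an illustrative spring-chain stiffness matrix:

```python
import numpy as np

def guyan_reduce(K, master):
    """Static (Guyan) condensation of a stiffness matrix to master DOFs.

    Returns the reduced stiffness matrix and the transformation matrix T
    used to expand reduced displacements back to all DOFs. Exact for static
    loads applied only at master DOFs; dynamic reduction needs more care.
    """
    n = K.shape[0]
    slave = np.setdiff1d(np.arange(n), master)
    Kmm = K[np.ix_(master, master)]
    Kms = K[np.ix_(master, slave)]
    Ksm = K[np.ix_(slave, master)]
    Kss = K[np.ix_(slave, slave)]
    T_slave = -np.linalg.solve(Kss, Ksm)     # slave DOFs follow masters statically
    K_red = Kmm + Kms @ T_slave
    T = np.zeros((n, len(master)))           # expansion matrix: full = T @ reduced
    T[master, np.arange(len(master))] = 1.0
    T[slave] = T_slave
    return K_red, T

if __name__ == "__main__":
    # 4-DOF spring chain anchored to ground at the left end (illustrative numbers)
    k = 1000.0
    K = k * np.array([[ 2, -1,  0,  0],
                      [-1,  2, -1,  0],
                      [ 0, -1,  2, -1],
                      [ 0,  0, -1,  1]])
    K_red, T = guyan_reduce(K, master=np.array([0, 3]))
    f_red = np.array([0.0, 10.0])            # load applied at the free end
    u_master = np.linalg.solve(K_red, f_red)
    u_full = T @ u_master                    # expansion back to the full model
    print(u_full)
```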


    Exchange Rates and Cash Flows in Differentiated Product Industries: A Simulation Approach

    THE JOURNAL OF FINANCE, Issue 5 2007
    RICHARD FRIBERG
    ABSTRACT How do exchange rate changes impact firms' cash flows? We extend a simulation method developed in industrial organization to answer this question. We use prices, quantities, and product characteristics for differentiated products, coupled with a discrete choice framework and an assumption of price competition, to estimate marginal costs for all producers. Using a Monte Carlo approach we generate counterfactual prices and profits for different levels of exchange rates. We illustrate the method using the market for bottled water. Our results stress that even in a relatively simple market such as this one, different brands face very different exchange rate risks. [source]
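    A minimal sketch of the Monte Carlo exposure idea, with illustrative numbers and no demand estimation or price competition (unlike the paper's discrete choice framework):

```python
import numpy as np

def cash_flow_distribution(price, quantity, domestic_cost, foreign_cost,
                           e0=1.0, sigma=0.10, n_draws=10_000, seed=0):
    """Monte Carlo sketch of the exchange-rate exposure of a single brand.

    Profit per period = (price - domestic_cost - e * foreign_cost) * quantity,
    with the exchange rate e drawn from a lognormal distribution around e0.
    All figures are illustrative assumptions, not the paper's estimates.
    """
    rng = np.random.default_rng(seed)
    e = e0 * rng.lognormal(mean=-0.5 * sigma**2, sigma=sigma, size=n_draws)
    profit = (price - domestic_cost - e * foreign_cost) * quantity
    return {
        "mean": profit.mean(),
        "5% quantile": np.quantile(profit, 0.05),
        "std": profit.std(),
    }

if __name__ == "__main__":
    print(cash_flow_distribution(price=2.0, quantity=1e6,
                                 domestic_cost=0.8, foreign_cost=0.7))
```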


    Advanced Experimental and Simulation Approaches to Meet Reliability Challenges of New Electronics Systems

    ADVANCED ENGINEERING MATERIALS, Issue 4 2009
    Dietmar Vogel
    Abstract This paper focuses on some advanced aspects of physics-of-failure approaches. Tracing failure modes under realistic loading is a key issue in separating out the relevant failure sites to be studied in more detail. In the past, design of experiment (DoE) tools have been developed to handle this problem. They make it possible to optimize design and/or material selection with respect to different failure mechanisms and sites. The application of these methods is demonstrated by optimizations performed for fracture problems. Interface fracture has been chosen as one of the most important failure mechanisms. Finally, local stress and strain measurement tools developed over the past years are presented. These tools validate simulation results and therefore the underlying mechanical modeling. In particular, local stress measurement tools under development are needed to make realistic assumptions about loading conditions and to provide residual stress data for FEA. [source]


    Simulation of Accuracy Performance for Wireless Sensor-Based Construction Asset Tracking

    COMPUTER-AIDED CIVIL AND INFRASTRUCTURE ENGINEERING, Issue 5 2009
    Mirosław J. Skibniewski
    In particular, identifying the location of distributed mobile entities through wireless communication becomes the primary task in realizing remote tracking and monitoring of construction assets. Even though several alternative solutions based on recent technologies, such as radio frequency identification (RFID) and the global positioning system (GPS), have been introduced, they have not provided a solid path toward accurate and scalable tracking frameworks in large-scale construction domains, owing to limited capability and inflexible networking architectures. This article introduces a new tracking architecture using wireless sensor modules and evaluates its accuracy performance using a numerical simulation approach based on the time-of-flight method. By combining radio frequency (RF) and ultrasound (US) signals, the simulation achieved better accuracy than the use of an RF signal alone. The proposed approach can provide guidelines for further exploration of hardware/software design and for experimental analysis to implement a framework for tracking construction assets. [source]
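    The accuracy gain from combining RF and US signals can be sketched with a simple time-of-flight error model; the propagation speeds are physical constants, while the timing jitter and range below are illustrative assumptions rather than values from the article:

```python
import numpy as np

SPEED_OF_LIGHT = 3.0e8   # m/s
SPEED_OF_SOUND = 343.0   # m/s, dry air at ~20 °C

def ranging_error(true_range_m, timer_jitter_s=1e-6, n_trials=10_000, seed=0):
    """Sketch of time-of-flight ranging accuracy, RF-only versus combined RF+US.

    With combined signals the RF pulse serves as a time reference and the
    range is recovered from the ultrasound delay, so the same clock jitter
    translates into millimetres instead of hundreds of metres.
    """
    rng = np.random.default_rng(seed)
    jitter = rng.normal(0.0, timer_jitter_s, n_trials)

    # RF only: range = c * measured delay, so jitter scales with c
    rf_only = SPEED_OF_LIGHT * (true_range_m / SPEED_OF_LIGHT + jitter)
    # RF + US: RF marks t0, ultrasound arrival gives the delay, jitter scales with v_sound
    rf_us = SPEED_OF_SOUND * (true_range_m / SPEED_OF_SOUND + jitter)

    return (np.abs(rf_only - true_range_m).mean(),
            np.abs(rf_us - true_range_m).mean())

if __name__ == "__main__":
    err_rf, err_rf_us = ranging_error(true_range_m=20.0)
    print(f"mean error, RF only: {err_rf:.2f} m; RF+US: {err_rf_us * 1000:.2f} mm")
```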


    Confronting Uncertainty and Missing Values in Environmental Value Transfer as Applied to Species Conservation

    CONSERVATION BIOLOGY, Issue 5 2010
    SONIA AKTER
    Keywords: species conservation; transfer error; uncertainty; environmental value transfer; nonuse values. Abstract: The nonuse (or passive) value of nature is important but time-consuming and costly to quantify with direct surveys. In the absence of estimates of these values, there will likely be less investment in conservation actions that generate substantial nonuse benefits, such as conservation of native species. To help inform decisions about the allocation of conservation dollars that would otherwise reflect the lack of estimates of nonuse values, these values can be estimated indirectly by environmental value transfer (EVT). EVT uses existing data or information from a study site such that the estimated monetary value of an environmental good is transferred to another location or policy site. A major challenge in the use of EVT is the uncertainty about the sign and size of the transfer error (i.e., the percentage by which the transferred value exceeds the actual value) that results from transferring direct estimates of nonuse values from a study site to the policy site, the site where the value is applied. An EVT is most useful if the decision-making framework does not require highly accurate information and when the conservation decision is constrained by time and financial resources. To account for uncertainty in the decision-making process, a decision heuristic that guides the decision process and illustrates the possible decision branches can be followed. To account for the uncertainty associated with the transfer of values from one site to another, we developed a risk and simulation approach that uses Monte Carlo simulations to evaluate the net benefits of conservation investments while taking into account different possible distributions of transfer error. This method does not reduce transfer error, but it provides a way to account for its effect in conservation decision making. Our risk and simulation approach and decision-based framework on when to use EVT offer better-informed decision making in conservation. [source]
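    A minimal sketch of the Monte Carlo treatment of transfer error; the error distribution and monetary figures are illustrative assumptions, not the paper's fitted values:

```python
import numpy as np

def net_benefit_under_transfer_error(transferred_value, cost,
                                     error_mean=0.4, error_sd=0.3,
                                     n_draws=100_000, seed=0):
    """Monte Carlo sketch of conservation net benefits under transfer error.

    Transfer error is the percentage by which the transferred value exceeds
    the (unknown) true value, so true_value = transferred_value / (1 + error).
    The normal error distribution here is an illustrative assumption.
    """
    rng = np.random.default_rng(seed)
    error = rng.normal(error_mean, error_sd, n_draws)
    error = np.clip(error, -0.9, None)            # keep 1 + error positive
    true_value = transferred_value / (1.0 + error)
    net_benefit = true_value - cost
    return {"P(net benefit > 0)": float((net_benefit > 0).mean()),
            "median net benefit": float(np.median(net_benefit))}

if __name__ == "__main__":
    print(net_benefit_under_transfer_error(transferred_value=5e6, cost=3e6))
```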


    A simulation approach to determine statistical significance of species turnover peaks in a species-rich tropical cloud forest

    DIVERSITY AND DISTRIBUTIONS, Issue 6 2007
    K. Bach
    ABSTRACT Use of β-diversity indices in the study of the spatial distribution of species diversity is hampered by the difficulty of applying significance tests. To overcome this problem we used a simulation approach in a study of species turnover of ferns, aroids, bromeliads, and melastomes along an elevational gradient from 1700 m to 3400 m in a species-rich tropical cloud forest of Bolivia. Three parameters of species turnover (number of upper and lower elevational species limits per elevational step, and the Wilson–Shmida similarity index between adjacent steps) were analysed. Significant species turnover limits were detected at 2000 (± 50) m and 3050 m, which roughly coincided with the elevational limits of the main vegetation types recognized in the study area. The taxon specificity of elevational distributions implies that no single plant group can be used as a reliable surrogate for overall plant diversity and that the response to future climate change will be taxon-specific, potentially leading to the formation of plant communities lacking modern analogues. Mean elevational range size of plant species was 490 m (± 369). Elevational range sizes of terrestrial species were shorter than those of epiphytes. We conclude that our simulation approach provides an alternative means of assessing the statistical significance of levels of species turnover along ecological gradients without the limitations imposed by traditional statistical approaches. [source]
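    The significance-testing idea can be sketched with a simple null model that randomises species' elevational ranges along the gradient and compares observed range-limit counts per elevational step against the simulated distribution; the published randomisation scheme differs in detail:

```python
import numpy as np

def turnover_significance(range_limits, steps, n_sim=1000, seed=0):
    """Null-model sketch for testing peaks in species turnover.

    range_limits: (n_species, 2) array of lower and upper elevational limits.
    The null model slides each species' range to a random position on the
    gradient while keeping its extent, then records how often the simulated
    number of range limits per elevational step reaches the observed count.
    """
    rng = np.random.default_rng(seed)
    lo, hi = steps[0], steps[-1]
    extents = range_limits[:, 1] - range_limits[:, 0]

    def limits_per_step(lims):
        # count lower and upper range limits falling in each elevational step
        return (np.histogram(lims[:, 0], steps)[0]
                + np.histogram(lims[:, 1], steps)[0])

    observed = limits_per_step(range_limits)
    exceed = np.zeros_like(observed)
    for _ in range(n_sim):
        start = rng.uniform(lo, hi - extents)          # random placement
        sim = np.column_stack([start, start + extents])
        exceed += (limits_per_step(sim) >= observed)
    return observed, exceed / n_sim                    # per-step p-values

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    lower = rng.uniform(1700, 3000, 200)               # illustrative fake ranges
    ranges = np.clip(np.column_stack([lower, lower + rng.uniform(100, 800, 200)]),
                     1700, 3400)
    obs, p = turnover_significance(ranges, steps=np.arange(1700, 3401, 100))
    print(p)
```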


    Effects of spatially structured vegetation patterns on hillslope erosion in a semiarid Mediterranean environment: a simulation study

    EARTH SURFACE PROCESSES AND LANDFORMS, Issue 2 2005
    Matthias Boer
    Abstract A general trend of decreasing soil loss rates with increasing vegetation cover fraction is widely accepted. Field observations and experimental work, however, show that the form of the cover–erosion function can vary considerably, in particular for low cover conditions that prevail on arid and semiarid hillslopes. In this paper the structured spatial distribution of the vegetation cover and associated soil attributes is proposed as one of the possible causes of variation in cover–erosion relationships, in particular in dryland environments where patchy vegetation covers are common. A simulation approach was used to test the hypothesis that hillslope discharge and soil loss could be affected by variation in the spatial correlation structure of coupled vegetation cover and soil patterns alone. The Limburg Soil Erosion Model (LISEM) was parameterized and verified for a small catchment with discontinuous vegetation cover at Rambla Honda, SE Spain. Using the same parameter sets LISEM was subsequently used to simulate water and sediment fluxes on 1 ha hypothetical hillslopes with simulated spatial distributions of vegetation and soil parameters. Storms of constant rainfall intensity in the range of 30–70 mm h⁻¹ and 10–30 min duration were applied. To quantify the effect of the spatial correlation structure of the vegetation and soil patterns, predicted discharge and soil loss rates from hillslopes with spatially structured distributions of vegetation and soil parameters were compared with those from hillslopes with spatially uniform distributions. The results showed that the spatial organization of bare and vegetated surfaces alone can have a substantial impact on predicted storm discharge and erosion. In general, water and sediment yields from hillslopes with spatially structured distributions of vegetation and soil parameters were greater than from identical hillslopes with spatially uniform distributions. Within a storm the effect of spatially structured vegetation and soil patterns was observed to be highly dynamic, and to depend on rainfall intensity and slope gradient. Copyright © 2005 John Wiley & Sons, Ltd. [source]


    Patents and R&D as Real Options

    ECONOMIC NOTES, Issue 1 2004
    Eduardo S. Schwartz
    This article develops and implements a simulation approach to value patents and patent-protected R&D projects based on the Real Options approach. It takes into account uncertainty in the cost-to-completion of the project, uncertainty in the cash flows to be generated from the project, and the possibility of catastrophic events that could put an end to the effort before it is completed. It also allows for the possibility of abandoning the project when costs turn out to be larger than expected or when estimated cash flows turn out to be smaller than anticipated. This abandonment option represents a very substantial part of the project's value when the project is marginal and/or when uncertainty is large. The model presented can be used to evaluate the effects of regulation on the cost of innovation and the amount of innovative output. The main focus of the article is the pharmaceutical industry. The framework, however, applies just as well to other research-intensive industries such as software or hardware development. (J.E.L.: G31, O22, O32). [source]
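    A minimal Monte Carlo sketch of the abandonment option's effect on project value, under illustrative parameters: a geometric Brownian value process, a constant investment rate, a Poisson-type catastrophic risk, and a simple myopic abandonment rule rather than the paper's optimal exercise policy:

```python
import numpy as np

def patent_rd_value(v0=100.0, total_cost=80.0, invest_rate=10.0,
                    mu=0.0, sigma=0.35, hazard=0.05, r=0.05,
                    dt=0.25, n_paths=5_000, allow_abandon=True, seed=0):
    """Monte Carlo sketch of an R&D project valued as a real option.

    v0: current estimate of the cash-flow value on completion (follows a GBM);
    total_cost: expected cost-to-completion, paid at invest_rate per year;
    hazard: annual probability of a catastrophic event killing the project;
    allow_abandon: stop investing whenever the value estimate drops below the
    remaining cost. All parameter values are illustrative assumptions.
    """
    rng = np.random.default_rng(seed)
    values = np.zeros(n_paths)
    for p in range(n_paths):
        v, cost_left, t, pv, alive = v0, total_cost, 0.0, 0.0, True
        while cost_left > 0 and alive:
            if allow_abandon and v < cost_left:      # simple abandonment rule
                alive = False
                break
            spend = min(invest_rate * dt, cost_left)
            pv -= spend * np.exp(-r * t)             # discounted investment outlay
            cost_left -= spend
            if rng.random() < hazard * dt:           # catastrophic failure
                alive = False
                break
            v *= np.exp((mu - 0.5 * sigma**2) * dt
                        + sigma * np.sqrt(dt) * rng.normal())
            t += dt
        if alive and cost_left <= 0:
            pv += v * np.exp(-r * t)                 # receive the project value
        values[p] = pv
    return values.mean()

if __name__ == "__main__":
    with_option = patent_rd_value(allow_abandon=True)
    without = patent_rd_value(allow_abandon=False)
    print(f"value with abandonment {with_option:.1f}, without {without:.1f}")
```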


    Optimization of Monte Carlo Procedures for Value at Risk Estimates

    ECONOMIC NOTES, Issue 1 2002
    Sabrina Antonelli
    This paper proposes a methodology which improves the computational efficiency of the Monte Carlo simulation approach to value at risk (VaR) estimation. Principal components analysis is used to reduce the number of relevant sources of risk driving the portfolio dynamics. Moreover, large deviations techniques are used to provide an estimate of the minimum number of price scenarios to be simulated to attain a given accuracy. Numerical examples are provided and show the good performance of the proposed methodology. (J.E.L.: C15, G1). [source]
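    A sketch of the principal-components reduction step for Monte Carlo VaR, with illustrative data and zero-mean normal factors; the paper's large-deviations bound on the required number of scenarios is not reproduced here:

```python
import numpy as np

def pca_monte_carlo_var(returns, weights, n_factors=3, alpha=0.99,
                        n_scenarios=50_000, seed=0):
    """Sketch of Monte Carlo VaR with principal-components reduction.

    returns: (T, n_assets) history of asset returns; weights: portfolio
    weights. Only the leading principal components of the covariance matrix
    are simulated, reducing the number of risk factors driving the scenarios.
    """
    rng = np.random.default_rng(seed)
    cov = np.cov(returns, rowvar=False)
    eigval, eigvec = np.linalg.eigh(cov)
    order = np.argsort(eigval)[::-1][:n_factors]
    load = eigvec[:, order] * np.sqrt(eigval[order])       # factor loadings
    z = rng.standard_normal((n_scenarios, n_factors))
    scenario_returns = z @ load.T                           # simulated asset returns
    pnl = scenario_returns @ weights
    return -np.quantile(pnl, 1.0 - alpha)                   # VaR at level alpha

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    hist = rng.multivariate_normal(np.zeros(10),
                                   0.0001 * (np.eye(10) + 0.5), size=500)
    w = np.full(10, 0.1)
    print(f"99% one-day VaR: {pca_monte_carlo_var(hist, w):.4f}")
```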


    Non parametric VaR Techniques: Myths, Realities

    ECONOMIC NOTES, Issue 2 2001
    VaR (value-at-risk) estimates are currently based on two main techniques: the variance-covariance approach or simulation. Statistical and computational problems affect the reliability of these techniques. We illustrate a new technique, filtered historical simulation (FHS), designed to remedy some of the shortcomings of the simulation approach. We compare the estimates it produces with traditional bootstrapping estimates. (J.E.L.: G19). [source]
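    A minimal sketch of the FHS idea, using an EWMA variance filter as a stand-in for the GARCH filter usually employed: returns are devolatilised, the standardised residuals are bootstrapped, and scenarios are rescaled by the current volatility forecast. Parameters are illustrative:

```python
import numpy as np

def filtered_historical_var(returns, lam=0.94, alpha=0.99,
                            n_scenarios=10_000, seed=0):
    """Sketch of filtered historical simulation (FHS) for one-day VaR."""
    rng = np.random.default_rng(seed)
    returns = np.asarray(returns, dtype=float)
    var = np.empty_like(returns)
    var[0] = returns.var()
    for t in range(1, len(returns)):                  # EWMA variance recursion
        var[t] = lam * var[t - 1] + (1 - lam) * returns[t - 1] ** 2
    residuals = returns / np.sqrt(var)                # devolatilised returns
    sigma_next = np.sqrt(lam * var[-1] + (1 - lam) * returns[-1] ** 2)
    scenarios = sigma_next * rng.choice(residuals, size=n_scenarios, replace=True)
    return -np.quantile(scenarios, 1.0 - alpha)

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    r = 0.01 * rng.standard_normal(1000) * np.linspace(0.5, 1.5, 1000)
    print(f"99% FHS VaR: {filtered_historical_var(r):.4f}")
```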


    SEX-RATIO CONFLICT BETWEEN QUEENS AND WORKERS IN EUSOCIAL HYMENOPTERA: MECHANISMS, COSTS, AND THE EVOLUTION OF SPLIT COLONY SEX RATIOS

    EVOLUTION, Issue 12 2005
    Ken R. Helms
    Abstract Because workers in the eusocial Hymenoptera are more closely related to sisters than to brothers, theory predicts that natural selection should act on them to bias (change) sex allocation to favor reproductive females over males. However, selection should also act on queens to prevent worker bias. We use a simulation approach to analyze the coevolution of this conflict in colonies with single, once-mated queens. We assume that queens bias the primary (egg) sex ratio and workers bias the secondary (adult) sex ratio, both at some cost to colony productivity. Workers can bias either by eliminating males or by directly increasing female caste determination. Although variation among colonies in kin structure is absent, simulations often result in bimodal (split) colony sex ratios. This occurs because of the evolution of two alternative queen or two alternative worker biasing strategies, one that biases strongly and another that does not bias at all. Alternative strategies evolve because the mechanisms of biasing result in accelerating benefits per unit cost with increasing bias, resulting in greater fitness for strategies that bias more and bias less than the population equilibrium. Strategies biasing more gain from increased biasing efficiency whereas strategies biasing less gain from decreased biasing cost. Our study predicts that whether queens or workers evolve alternative strategies depends upon the mechanisms that workers use to bias the sex ratio, the relative cost of queen and worker biasing, and the rates at which queen and worker strategies evolve. Our study also predicts that population and colony level sex allocation, as well as colony productivity, will differ diagnostically according to whether queens or workers evolve alternative biasing strategies and according to what mechanism workers use to bias sex allocation. [source]


    A Computational Framework for Patient-Specific Analysis of Articular Cartilage Incorporating Structural Information from DT-MRI

    GAMM - MITTEILUNGEN, Issue 2 2009
    David M. Pierce
    Abstract Accurate techniques for simulating soft biological tissue deformation are an increasingly valuable tool in many areas of biomechanical analysis and medical image computing. To model the morphology and the material response of human articular cartilage, a phenomenological and patient-specific simulation approach incorporating the collagen fibre fabric is proposed. We then demonstrate a unique combination of ultra-high field Diffusion Tensor Magnetic Resonance Imaging (17.6T DT-MRI) and a novel numerical approach incorporating the empirical data to predict the collagen fibre fabric deformation in an indentation experiment. (© 2009 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim) [source]


    Patterns and Determinants of Historical Woodland Clearing in Central-Western New South Wales, Australia

    GEOGRAPHICAL RESEARCH, Issue 4 2007
    MICHAEL BEDWARD
    Abstract We consider the history of woodland clearing in central-western New South Wales, Australia, which has led to the present highly cleared and fragmented landscape. A combined approach is used, examining available historical land-use data and applying regression analysis to relate the pattern of cleared and wooded areas in the recent landscape to environmental variables, taking into account the contagious nature of clearing. We also ask whether it would be possible to apply a simple simulation modelling approach to reconstruct a credible historical sequence of clearing in the study area. The historical data indicate that annual clearing rates have varied substantially in the study area and that selective tree removal (ringbarking and thinning) has been common. These findings make it unlikely that a simple simulation approach would replicate the spatial and temporal sequence of woodland loss. Our regression results show that clearing patterns can be related to environmental variables, particularly annual rainfall and estimated pre-European vegetation type, but that patterns are dominated by contagion. [source]


    Improving expected tail loss estimates with neural networks

    INTELLIGENT SYSTEMS IN ACCOUNTING, FINANCE & MANAGEMENT, Issue 2 2005
    J. R. Aragonés
    Expected tail loss (ETL) and other 'coherent' risk measures are rapidly gaining acceptance amongst risk managers due to the limitations of value-at-risk (VaR) as a risk measure. In this article we explore the use of multilayer perceptron supervised neural networks to improve our estimates of ETL numbers using information from both tails of the distribution. We compare the results with the historical simulation approach to the estimation of VaR and ETL. The evaluation results indicate that the ETL estimates using neural networks are superior to historical simulation ETL estimates in all periods except for one, and in that case the historical ETL is slightly superior. Overall, therefore, when the whole period is considered, our results indicate that the network estimates of ETL are superior to the historical ones. Finally, one of the most interesting results of the study is the fact that the neural networks seem to indicate that VaR and ETL (as a function of VaR itself) are dependent not only on the negative returns observed, but also on large positive returns, which indicates that too much emphasis on losses could lead us to overlook important risk information arising from large positive returns. Copyright © 2005 John Wiley & Sons, Ltd. [source]
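    For reference, the historical-simulation benchmark quantities (VaR and ETL, also called expected shortfall) that the network estimates are compared against can be computed as follows; this is a simplified sketch, not the authors' evaluation code:

```python
import numpy as np

def historical_var_etl(returns, alpha=0.99):
    """Historical-simulation VaR and expected tail loss (ETL) sketch.

    VaR is the loss quantile at level alpha; ETL is the average loss
    beyond VaR.
    """
    losses = -np.asarray(returns, dtype=float)
    var = np.quantile(losses, alpha)
    etl = losses[losses >= var].mean()
    return var, etl

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    r = 0.01 * rng.standard_normal(2000)        # illustrative return series
    var, etl = historical_var_etl(r)
    print(f"VaR {var:.4f}, ETL {etl:.4f}")
```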


    Adaptive preconditioning of linear stochastic algebraic systems of equations

    INTERNATIONAL JOURNAL FOR NUMERICAL METHODS IN BIOMEDICAL ENGINEERING, Issue 11 2007
    Y. T. Feng
    Abstract This paper proposes an adaptively preconditioned iterative method for the solution of large-scale linear stochastic algebraic systems of equations with one random variable that arise from the stochastic finite element modelling of linear elastic problems. First, a rank-one a posteriori preconditioner is introduced for a general linear system of equations. This concept is then developed into an effective adaptive preconditioning scheme for the iterative solution of the stochastic equations in the context of a modified Monte Carlo simulation approach. To limit the maximum number of base vectors used in the scheme, a simple selection criterion is proposed to update the base vectors. Finally, numerical experiments are conducted to assess the performance of the proposed adaptive preconditioning strategy, which indicate that the scheme with very few base vectors can improve the convergence of standard incomplete Cholesky preconditioning by up to 50%. Copyright © 2006 John Wiley & Sons, Ltd. [source]


    An OPNET-based simulation approach for deploying VoIP

    INTERNATIONAL JOURNAL OF NETWORK MANAGEMENT, Issue 3 2006
    K. Salah
    A massive deployment of VoIP over IP networks is currently taking place. VoIP deployment is a challenging task for network researchers and engineers. This paper presents a detailed simulation approach for deploying VoIP successfully. The simulation uses the OPNET network simulator. OPNET has recently gained considerable popularity in both academia and industry, but there is no formal or known approach or methodology as to how OPNET can be used to assess the support and readiness of an existing network for deploying VoIP. Our approach and the work presented in this paper predict, prior to the purchase and deployment of VoIP equipment, the number of VoIP calls that can be sustained by an existing network while satisfying the QoS requirements of all network services and leaving adequate capacity for future growth. As a case study, we apply the simulation approach to a typical network of a small enterprise. The paper presents a detailed description of simulation models for network topology and elements using OPNET. The paper describes modeling and representation of background and VoIP traffic, as well as various simulation configurations. Moreover, the paper discusses many design and engineering issues pertaining to the deployment of VoIP. These issues include characteristics of VoIP traffic and QoS requirements, VoIP flow and call distribution, defining future growth capacity, and measurement and impact of background traffic. Copyright © 2006 John Wiley & Sons, Ltd. [source]
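    Before running a full OPNET study, a back-of-the-envelope capacity estimate of the kind the simulation refines can be sketched as follows; the codec and per-packet overhead figures are typical G.711 assumptions, not values from the paper:

```python
def voip_calls_supported(link_mbps=100.0, utilization_target=0.75,
                         codec_kbps=64.0, packet_ms=20.0, header_bytes=58):
    """Back-of-the-envelope sketch of VoIP call capacity on a single link.

    Assumes a G.711-style codec (64 kbit/s, 20 ms packetization) and roughly
    58 bytes of RTP/UDP/IP plus layer-2 overhead per packet. Real deployments
    are sized with simulation because background traffic and QoS queuing
    change the answer substantially.
    """
    packets_per_s = 1000.0 / packet_ms
    payload_bytes = codec_kbps * 1000.0 / 8.0 / packets_per_s
    per_call_bps = (payload_bytes + header_bytes) * 8.0 * packets_per_s
    usable_bps = link_mbps * 1e6 * utilization_target
    return int(usable_bps // per_call_bps)

if __name__ == "__main__":
    print("calls supported on 100 Mb/s at 75% utilization:",
          voip_calls_supported())
```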


    A simulation-based reliability assessment approach for congested transit network

    JOURNAL OF ADVANCED TRANSPORTATION, Issue 1 2004
    Yafeng Yin
    This paper is an attempt to develop a generic simulation-based approach to assess transit service reliability, taking into account the interaction between network performance and passengers' route choice behaviour. Three types of reliability, namely system-wide travel time reliability, schedule reliability, and direct boarding waiting-time reliability, are defined from the perspectives of the community or transit administration, the operator, and passengers, respectively. A Monte Carlo simulation approach with an embedded stochastic user equilibrium transit assignment model is proposed to quantify these three reliability measures of transit service. A simple transit network with a bus rapid transit (BRT) corridor is analysed as a case study, where the impacts of BRT components on transit service reliability are evaluated preliminarily. [source]
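    A minimal Monte Carlo sketch of two of the reliability measures (schedule reliability and boarding waiting time) for a single route, without the stochastic user equilibrium assignment that the paper embeds; all parameters are illustrative:

```python
import numpy as np

def transit_reliability(n_segments=10, mean_seg_time=3.0, seg_cv=0.25,
                        headway=10.0, scheduled=33.0, slack=2.0,
                        n_runs=20_000, seed=0):
    """Monte Carlo sketch of schedule reliability and boarding waiting time.

    Segment running times are lognormal; schedule reliability is the share
    of runs finishing within `slack` minutes of the scheduled time, and the
    mean boarding wait is approximated from headway variability using the
    random-arrival formula E[h^2] / (2 E[h]).
    """
    rng = np.random.default_rng(seed)
    sigma = np.sqrt(np.log(1 + seg_cv**2))
    mu = np.log(mean_seg_time) - 0.5 * sigma**2
    times = rng.lognormal(mu, sigma, size=(n_runs, n_segments)).sum(axis=1)
    schedule_rel = np.mean(np.abs(times - scheduled) <= slack)
    headways = np.clip(headway + (times - times.mean()), 0.5, None)
    mean_wait = (headways**2).mean() / (2 * headways.mean())
    return schedule_rel, mean_wait

if __name__ == "__main__":
    rel, wait = transit_reliability()
    print(f"schedule reliability {rel:.2%}, mean boarding wait {wait:.1f} min")
```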


    Genetic signatures in an invasive parasite of Anguilla anguilla correlate with differential stock management

    JOURNAL OF FISH BIOLOGY, Issue 1 2010
    S. Wielgoss
    In this article, it is shown that available genetic tools for the omnipresent parasite Anguillicoloides crassus in European eels Anguilla anguilla are sensitive to different immigration rates into local A. anguilla stocks for two separate river systems. Relying on four highly polymorphic microsatellite markers, it was inferred that under natural recruitment, nematode samples meet Hardy–Weinberg expectations for a single panmictic population, while genetic signals show signs of a strong Wahlund effect, most likely due to very recent population mixing under frequent restocking of young A. anguilla. This was indicated by a low but significant FST value among within-host populations (infrapopulations) along with high inbreeding indices FIS consistent over all loci. The latter signal is shown to stem from high levels of admixture and the presence of first-generation migrants, and alternative explanations such as marker- and sex-specific biases in the nematode populations could be dismissed. Moreover, the slightly increased degree of relatedness within infrapopulations in the stocked river system cannot explain the excessive inbreeding values found, which are most likely a direct consequence of a recent influx of already infected fish harbouring parasites with different genetic signatures. Applying a simulation approach using known variables from the nematode's invasion history, only the artificial introduction of a Wahlund effect leads to a close match between simulated and real data, which is a strong argument for using the parasite as a biological tag for detecting and characterizing fish translocation. [source]


    Multiphase flow and mixing in dilute bubble swarms

    AICHE JOURNAL, Issue 9 2010
    Stefan Radl
    Abstract High-fidelity three-dimensional (3-D) simulations of multiphase flow and mixing in dilute bubble swarms were performed using the Euler-Lagrange simulation approach. The simulations included species transport as well as complex chemical reactions. It was found that the algebraic SGS model satisfactorily predicts experimental data for the mean flow field. A detailed description of multiphase flow was developed and used to simulate the time evolution of scalar and reactive mixing in a bubble column. An analysis involving the scale of segregation, a metric that characterizes the mean driving force for mixing, is applied for the first time to multiphase flow. The study shows that the scale of segregation is inversely proportional to the bubble diameter at constant gas-feed rate, but only a weak function of the gas-feed rate. Also, we observed significant differences in mixing metrics between reactive and nonreactive systems. © 2010 American Institute of Chemical Engineers AIChE J, 2010 [source]


    Quantification of Chemical Striae in Inorganic Melts and Glasses through Picture Processing

    JOURNAL OF THE AMERICAN CERAMIC SOCIETY, Issue 9 2010
    Martin Jensen
    Chemical striations occur in various types of inorganic melts, such as lava and glass melts, and affect the physical properties of materials. This paper reports a quantitative study of the chemical striations in iron-rich aluminosilicate melts and glasses. In this study, an integrated method has been established, which consists of sample preparation, image acquisition, Fourier transformation, and characteristic value determination. The principle of the established method is illustrated by picture-processing-based simulation. The extent of the chemical striations and the diffusion length of the striae can be measured using this method. It is found that the extent of the chemical striations is rather sensitive to the melting technique. Furthermore, the impact of chemical diffusion and stirring on the extent of striations is revealed using the picture-processing-based simulation approach. The diffusion process eliminates small striae and reduces the intensity of the larger ones. At a constant temperature, diffusion determines the transformation rate of an inhomogeneous melt into a homogeneous one. During stirring, the size distribution of the large striae becomes broader, but the overall intensity of the striae becomes smaller. [source]
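    The Fourier-based quantification step can be sketched on a one-dimensional grey-level profile; the published method's windowing and characteristic-value definitions are not reproduced here:

```python
import numpy as np

def stria_spectrum(profile, pixel_size_mm=0.01):
    """Sketch of Fourier-based quantification of chemical striae.

    profile: 1-D grey-level profile sampled across a polished sample image.
    The amplitude spectrum gives the intensity of striations at each spatial
    wavelength; the sum over non-zero frequencies serves as a crude overall
    striation measure.
    """
    profile = np.asarray(profile, dtype=float)
    profile -= profile.mean()                        # remove the DC component
    amp = np.abs(np.fft.rfft(profile)) / len(profile)
    freq = np.fft.rfftfreq(len(profile), d=pixel_size_mm)   # cycles per mm
    striae_intensity = amp[1:].sum()
    dominant_wavelength = 1.0 / freq[1:][np.argmax(amp[1:])]
    return striae_intensity, dominant_wavelength

if __name__ == "__main__":
    x = np.arange(2048) * 0.01                       # position in mm
    profile = (100 + 5 * np.sin(2 * np.pi * x / 0.8)     # synthetic 0.8 mm striae
               + np.random.default_rng(4).normal(0, 1, x.size))
    print(stria_spectrum(profile))
```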


    RAINGAGE NETWORK DESIGN USING NEXRAD PRECIPITATION ESTIMATES

    JOURNAL OF THE AMERICAN WATER RESOURCES ASSOCIATION, Issue 5 2002
    A. Allen Bradley
    ABSTRACT: A general framework is proposed for using precipitation estimates from NEXRAD weather radars in raingage network design. NEXRAD precipitation products are used to represent space-time rainfall fields, which can be sampled by hypothetical raingage networks. A stochastic model is used to simulate gage observations based on the areal average precipitation for radar grid cells. The stochastic model accounts for subgrid variability of precipitation within the cell and for gage measurement errors. The approach is ideally suited to raingage network design in regions with strong climatic variations in rainfall, where conventional methods are sometimes lacking. A case study involving the estimation of areal average precipitation for catchments in the Catskill Mountains illustrates the approach. The case study shows how the simulation approach can be used to quantify the effects of gage density, basin size, spatial variation of precipitation, and gage measurement error on network estimates of areal average precipitation. Although the quality of NEXRAD precipitation products imposes limitations on their use in network design, weather radars can provide valuable information for empirical assessment of raingage network estimation errors. Still, the biggest challenge in quantifying estimation errors is understanding subgrid spatial variability. The results from the case study show that the spatial correlation of precipitation at subgrid scales (4 km and less) is difficult to quantify, especially for short sampling durations. Network estimation errors for hourly precipitation are extremely sensitive to the uncertainty in subgrid spatial variability, although for storm total accumulation they are much less sensitive. [source]
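    A minimal sketch of the stochastic gage-sampling idea; the lognormal subgrid variability and multiplicative Gaussian gage error below are illustrative assumptions, not the paper's fitted model:

```python
import numpy as np

def simulate_network_error(cell_rain, n_gages=3, subgrid_cv=0.3,
                           gage_error_cv=0.05, n_trials=5_000, seed=0):
    """Sketch of a stochastic raingage-sampling model.

    cell_rain: areal-average precipitation (mm) for the radar grid cells
    covering a catchment. Each hypothetical gage sees a cell average
    perturbed by subgrid variability and by measurement error; the network
    estimate is the mean over gages, compared with the catchment average.
    """
    rng = np.random.default_rng(seed)
    cell_rain = np.asarray(cell_rain, dtype=float)
    truth = cell_rain.mean()                        # catchment areal average
    sg_sigma = np.sqrt(np.log(1 + subgrid_cv**2))   # lognormal with mean 1
    errors = np.empty(n_trials)
    for t in range(n_trials):
        cells = rng.choice(cell_rain, size=n_gages)             # gage locations
        subgrid = rng.lognormal(-0.5 * sg_sigma**2, sg_sigma, n_gages)
        meas = rng.normal(1.0, gage_error_cv, n_gages)
        errors[t] = (cells * subgrid * meas).mean() - truth
    return errors.mean(), errors.std()              # bias and standard error

if __name__ == "__main__":
    cells = [4.0, 5.5, 6.1, 3.2, 4.8, 5.0]          # illustrative hourly totals
    bias, se = simulate_network_error(cells)
    print(f"network bias {bias:.2f} mm, standard error {se:.2f} mm")
```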


    Effect of Pressure on the Miscibility of Polyethylene/Poly(ethylene-alt-propylene) Blends

    MACROMOLECULAR THEORY AND SIMULATIONS, Issue 7 2006
    Phillip Choi
    Abstract Summary: The effect of density, and hence pressure, on the miscibility of a 50:50 mol/mol PE/PEP blend was studied using a coarse-grained MC simulation approach on a high-coordination lattice, with the conformations of the coarse-grained chains constrained by the RIS model. Interchain pair correlation functions are used to assess the miscibility of the mixtures. Miscibility increases with increasing temperature over the range −50 to 150 °C. It is rather insensitive to pressure at high temperatures, but at −50 °C the blend miscibility increases with decreasing pressure. The findings are consistent with the fact that the blend is a UCST blend and that the simulation temperatures used, except −50 °C, were considerably higher than the UCST of the blend. The pressure dependence of the blend miscibility observed near −50 °C is also in agreement with the experimental observation that the blend exhibits a negative volume change of mixing. The present work demonstrates that the coarse-grained MC approach, when used with periodic boundary cells of different sizes filled with the same number of chains, is capable of capturing the pressure dependence of UCST blends. In addition, such a simulation also provides insights into the molecular origin of the observed pressure dependence of miscibility. In the present case, the segregation of PE and PEP chains at low temperatures and high pressure originates from the fact that fully extended segments of PE chains tend to cluster so that their intermolecular interactions can be maximized. As the temperature increases, there is a decrease in the probability of a trans state at a C–C bond in PE, and therefore the attraction between the PE chains is reduced at higher temperatures, promoting miscibility and the UCST behaviour. (Figure: density (pressure) dependence of the second-shell pair correlation function values for a 50/50 PE/PEP blend at −50 °C.) [source]


    When can ecological speciation be detected with neutral loci?

    MOLECULAR ECOLOGY, Issue 11 2010
    XAVIER THIBERT-PLANTE
    Abstract It is not yet clear under what conditions empirical studies can reliably detect progress toward ecological speciation through the analysis of allelic variation at neutral loci. We use a simulation approach to investigate the range of parameter space under which such detection is, and is not, likely. We specifically test for the conditions under which divergent natural selection can cause a 'generalized barrier to gene flow' that is present across the genome. Our individual-based numerical simulations focus on how population divergence at neutral loci varies in relation to recombination rate with a selected locus, divergent selection on that locus, migration rate and population size. We specifically test whether genetic differences at neutral markers are greater between populations in different environments than between populations in similar environments. We find that this expected signature of ecological speciation can be detected under part of the parameter space, most consistently when divergent selection is strong and migration is intermediate. By contrast, the expected signature of ecological speciation is not reliably detected when divergent selection is weak or migration is low or high. These findings provide insights into the strengths and weaknesses of using neutral markers to infer ecological speciation in natural systems. [source]


    Involvement of motor pathways in corticobasal syndrome detected by diffusion tensor tractography

    MOVEMENT DISORDERS, Issue 2 2009
    Kai Boelmans MD
    Abstract Corticobasal syndrome (CBS) is a progressive parkinsonian disease characterized by cortical and subcortical neuronal loss. Although motor disabilities are a core feature of CBS, the involvement of motor pathways in this condition has not been completely clarified. We used magnetic resonance diffusion tensor imaging (DTI) to study corticospinal and transcallosal motor projections in CBS, and applied fiber tractography to analyze the axonal integrity of white matter projections. Ten patients with CBS were compared with 10 age-matched healthy controls. Fiber tracts were computed using a Monte-Carlo simulation approach. Tract-specific mean values of the apparent diffusion coefficient (ADC) and fractional anisotropy (FA) were determined. CBS patients showed a reduction of corticospinal tract (CST) fibers on the first affected side with significantly increased ADC and reduced FA values. In the corpus callosum (CC), particularly in the posterior trunk, patients also had significantly reduced fiber projections, with a higher ADC and lower FA than controls. This pattern indicates changes of the white matter integrity in both CST and CC. Thus, magnetic resonance DTI can be used to assess motor pathway involvement in CBS patients. © 2008 Movement Disorder Society [source]


    Movement patterns and study area boundaries: influences on survival estimation in capture–mark–recapture studies

    OIKOS, Issue 8 2008
    Gregg E. Horton
    The inability to account for the availability of individuals in the study area during capture–mark–recapture (CMR) studies and the resultant confounding of parameter estimates can make correct interpretation of CMR model parameter estimates difficult. Although important advances based on the Cormack–Jolly–Seber (CJS) model have resulted in estimators of true survival that work by unconfounding either death or recapture probability from availability for capture in the study area, these methods rely on the researcher's ability to select a method that is correctly matched to emigration patterns in the population. If incorrect assumptions regarding site fidelity (non-movement) are made, it may be difficult or impossible as well as costly to change the study design once the incorrect assumption is discovered. Subtleties in characteristics of movement (e.g. life history-dependent emigration, nomads vs territory holders) can lead to mixtures in the probability of being available for capture among members of the same population. The result of these mixtures may be only a partial unconfounding of emigration from other CMR model parameters. Biologically-based differences in individual movement can combine with constraints on study design to further complicate the problem. Because of the intricacies of movement and its interaction with other parameters in CMR models, quantification of and solutions to these problems are needed. Based on our work with stream-dwelling populations of Atlantic salmon Salmo salar, we used a simulation approach to evaluate existing CMR models under various mixtures of movement probabilities. The Barker joint data model provided unbiased estimates of true survival under all conditions tested. The CJS and robust design models provided similarly unbiased estimates of true survival but only when emigration information could be incorporated directly into individual encounter histories. For the robust design model, Markovian emigration (future availability for capture depends on an individual's current location) was a difficult emigration pattern to detect unless survival and especially recapture probability were high. Additionally, when local movement was high relative to study area boundaries and movement became more diffuse (e.g. a random walk), local movement and permanent emigration were difficult to distinguish and had consequences for correctly interpreting the survival parameter being estimated (apparent survival vs true survival). [source]
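    The confounding of survival with site fidelity can be illustrated by simulating encounter histories under permanent emigration; this is a sketch for intuition only, not one of the estimators compared in the study:

```python
import numpy as np

def simulate_cjs(n_ind=2000, n_occasions=8, phi=0.8, p_cap=0.5,
                 emigration=0.15, seed=0):
    """Sketch of how permanent emigration biases apparent survival.

    Individuals survive each interval with probability phi, permanently
    leave the study area with probability `emigration`, and are detected
    with probability p_cap while alive and present. Apparent survival
    estimated from such data confounds phi with site fidelity
    (1 - emigration). All parameter values are illustrative.
    """
    rng = np.random.default_rng(seed)
    histories = np.zeros((n_ind, n_occasions), dtype=int)
    for i in range(n_ind):
        alive, present = True, True
        histories[i, 0] = 1                      # marked on the first occasion
        for t in range(1, n_occasions):
            alive = alive and rng.random() < phi
            present = present and rng.random() >= emigration
            if alive and present and rng.random() < p_cap:
                histories[i, t] = 1
    return histories

if __name__ == "__main__":
    for emig in (0.0, 0.15):
        h = simulate_cjs(emigration=emig)
        reencountered = (h[:, 1:].sum(axis=1) > 0).mean()
        print(f"emigration {emig:.2f}: fraction ever re-encountered "
              f"{reencountered:.2f} (expected apparent survival ~ "
              f"{0.8 * (1 - emig):.2f} per interval)")
```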


    Deterministic simulation of transport in MOSFETs by coupling Wigner, Poisson and Schrödinger equations

    PHYSICA STATUS SOLIDI (A) APPLICATIONS AND MATERIALS SCIENCE, Issue 11 2008
    J. Kefi-Ferhane
    Abstract In this paper, we present a new simulation approach for quantum transport in short and thin MOSFET channels which is based on a deterministic resolution of 1D Wigner Transport Equation and its couplings with 2D Poisson and 1D Schrödinger equations. (© 2008 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim) [source]


    The design and use of an agent-based model to simulate the 1918 influenza epidemic at Norway House, Manitoba

    AMERICAN JOURNAL OF HUMAN BIOLOGY, Issue 3 2009
    Connie Carpenter
    Agent-based modeling provides a new approach to the study of virgin soil epidemics like the 1918 flu. In this bottom-up simulation approach, a landscape can be created and populated with a heterogeneous group of agents who move and interact in ways that more closely resemble human behavior than is usually seen in other modeling techniques. In this project, an agent-based model was constructed to simulate the spread of the 1918 influenza pandemic through the Norway House community in Manitoba, Canada. Archival, ethnographic, epidemiological, and biological information were used to aid in designing the structure of the model and to estimate values for model parameters. During the epidemic, Norway House was a Hudson's Bay Company post and a Swampy Cree-Métis settlement with an economy based on hunting, fishing, and the fur trade. The community followed a traditional, seasonal travel pattern of summer aggregation and winter dispersal. The model was used to examine how seasonal community structures and associated population movement patterns may have influenced disease transmission and epidemic spread. Simulations of the model clearly demonstrate that human behavior can significantly influence epidemic outcomes. Am. J. Hum. Biol. 2009. © 2008 Wiley-Liss, Inc. [source]
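    A minimal agent-based sketch of how aggregation versus dispersal changes epidemic spread; this is a toy SIR-style model with illustrative parameters, not the Norway House model itself:

```python
import random

def run_epidemic(n_agents=600, n_groups=12, days=150, aggregated=True,
                 p_transmit=0.05, contacts_per_day=6, infectious_days=5, seed=0):
    """Minimal agent-based epidemic sketch.

    Each agent belongs to a group (e.g. a winter camp). In the aggregated
    season an infectious agent's daily contacts are drawn from the whole
    community; in the dispersed season only from the agent's own group.
    Returns the final attack rate. All parameter values are illustrative.
    """
    rng = random.Random(seed)
    group = [i % n_groups for i in range(n_agents)]
    members = [[i for i in range(n_agents) if group[i] == g] for g in range(n_groups)]
    everyone = list(range(n_agents))
    state = ["S"] * n_agents                     # S, I or R
    days_infectious = [0] * n_agents
    state[0] = "I"                               # single index case
    for _ in range(days):
        newly_infected = set()
        for i in range(n_agents):
            if state[i] != "I":
                continue
            pool = everyone if aggregated else members[group[i]]
            for _ in range(contacts_per_day):
                j = rng.choice(pool)
                if state[j] == "S" and rng.random() < p_transmit:
                    newly_infected.add(j)
            days_infectious[i] += 1
            if days_infectious[i] >= infectious_days:
                state[i] = "R"
        for j in newly_infected:
            state[j] = "I"
    return sum(s != "S" for s in state) / n_agents

if __name__ == "__main__":
    print("attack rate, aggregated community:", run_epidemic(aggregated=True))
    print("attack rate, dispersed community: ", run_epidemic(aggregated=False))
```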


    Modelling the light induced metastable effects in amorphous silicon

    PHYSICA STATUS SOLIDI (C) - CURRENT TOPICS IN SOLID STATE PHYSICS, Issue 2 2008
    G. Munyeme
    Abstract We present results of computer simulations of the light-induced degradation of amorphous silicon solar cells. It is now well established that when amorphous silicon is illuminated, the density of dangling bond states increases. Dangling bond states produce amphoteric electronic mid-gap states which act as efficient charge trapping and recombination centres. The increase in dangling bond states causes a decrease in the performance of amorphous silicon solar cells. To show this effect, a modelling approach has been developed which uses a density of localised states with exponentially increasing band-tails and a dangling bond defect state distribution chosen according to the defect pool model. The calculation of the evolution of the dangling bond state density during illumination has been achieved through a dynamic scaling relation derived from a defect creation model. The approach considers the amphoteric nature of the dangling bond state and thus accounts for the contributions of the different charge states of the dangling bond during the degradation process. This paper describes the simulation approach, which calculates the defect density as a function of energy, position in the solar cell, and illumination time. In excellent agreement with other workers, our simulation results show that the increase in the density of neutral dangling bond states during illumination is higher than that of the charged states. (© 2008 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim) [source]


    Tryptophanyl fluorescence lifetime distribution of hyperthermophilic β-glycosidase from molecular dynamics simulation: A comparison with the experimental data

    PROTEIN SCIENCE, Issue 9 2000
    Ettore Bismuto
    Abstract A molecular dynamics simulation approach has been utilized to understand the unusual fluorescence emission decay observed for ,-glycosidase from the hyperthermophilic bacterium Solfolobus sulfataricus (S,gly), a tetrameric enzyme containing 17 tryptophanyl residues for each subunit. The tryptophanyl emission decay of (S,gly) results from a bimodal distribution of fluorescence lifetimes with a short-lived component centered at 2.5 ns and a long-lived one at 7.4 ns Bismuto E, Nucci R, Rossi M, Irace G, 1999, Proteins 27:71,79). From the examination of the trajectories of the side chains capable of causing intramolecular quenching for each tryptophan microenvironment and using a modified Stern,Volmer model for the emission quenching processes, we calculated the fluorescence lifetime for each tryptophanyl residue of S,gly at two different temperatures, i.e., 300 and 365 K. The highest temperature was chosen because in this condition S,lgy evidences a maximum in its catalytic activity and is stable for a very long time. The calculated lifetime distributions overlap those experimentally determined. Moreover, the majority of trytptophanyl residues having longer lifetimes correspond to those originally identified by inspection of the crystallographic structure. The tryptophanyl lifetimes appear to be a complex function of several variables, such as microenvironment viscosity, solvent accessibility, the chemical structure of quencher side chains, and side-chain dynamics. The lifetime calculation by MD simulation can be used to validate a predicted structure by comparing the theoretical data with the experimental fluorescence decay results. [source]