Monte Carlo Simulations (carlo + simulation)


Kinds of Monte Carlo Simulations

  • Monte Carlo simulation

  • Terms modified by Monte Carlo Simulations

  • Monte Carlo simulation approach
  • Monte Carlo simulation method
  • Monte Carlo simulation studies

  • Selected Abstracts


    MONTE CARLO SIMULATION OF FAR INFRARED RADIATION HEAT TRANSFER: THEORETICAL APPROACH

    JOURNAL OF FOOD PROCESS ENGINEERING, Issue 4 2006
    F. TANAKA
    ABSTRACT We developed radiation heat transfer models combining the Monte Carlo (MC) method with a computational fluid dynamics approach, together with two-dimensional heat transfer models based on the fundamental quantum physics of radiation and fluid dynamics. We investigated far infrared radiation (FIR) heating in laminar and buoyancy airflow. A simple prediction model in laminar airflow was tested against an analytical solution and commercial software (CFX 4). The adequate number of photon tracks for the MC simulation was established. For the complex design model, the predicted results agreed well with the experimental data, with a root mean square error of 3.8 K. Because public concern about food safety is increasing, we applied this model to the prediction of the thermal inactivation level by coupling it with a microbial kinetics model. Under buoyancy airflow conditions, the uniformity of FIR heating was improved by selecting an adequate wall temperature and emissivity. [source]
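
To make the photon-tracking step concrete, here is a minimal sketch (not the authors' CFD-coupled model): it estimates a radiation view factor by tracing diffusely emitted rays from a unit-square emitter to a parallel unit-square receiver one unit away. The geometry and photon count are illustrative assumptions; the analytical value quoted in the print statement is the standard tabulated view factor for this configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

def view_factor_mc(n_photons=100_000):
    """Estimate the view factor from a unit-square emitter at z=0 to a
    parallel unit-square receiver at z=1 by tracking random photon rays."""
    # Random emission points on the emitter surface
    x0 = rng.random(n_photons)
    y0 = rng.random(n_photons)
    # Diffuse (Lambertian) emission: polar angle sampled with pdf ~ cos(theta)
    theta = np.arcsin(np.sqrt(rng.random(n_photons)))
    phi = 2.0 * np.pi * rng.random(n_photons)
    # Ray/plane intersection with the receiver plane z = 1
    t = 1.0 / np.cos(theta)
    x1 = x0 + t * np.sin(theta) * np.cos(phi)
    y1 = y0 + t * np.sin(theta) * np.sin(phi)
    hits = (0.0 <= x1) & (x1 <= 1.0) & (0.0 <= y1) & (y1 <= 1.0)
    return hits.mean()

F = view_factor_mc()
print(f"MC view factor: {F:.4f}  (tabulated value for this geometry: 0.1998)")
```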


    Fuzzy Monte Carlo Simulation and Risk Assessment in Construction

    COMPUTER-AIDED CIVIL AND INFRASTRUCTURE ENGINEERING, Issue 4 2010
    N. Sadeghi
    Subjective and linguistically expressed information results in added non-probabilistic uncertainty in construction management. Fuzzy logic has been used successfully to represent such uncertainties in construction projects. In practice, an approach that can handle both random and fuzzy uncertainties in a risk assessment model is necessary. This article discusses the deficiencies of the available methods and proposes a Fuzzy Monte Carlo Simulation (FMCS) framework for risk analysis of construction projects. In this framework, we construct a fuzzy cumulative distribution function as a novel way to represent uncertainty. To verify the feasibility of the FMCS framework and demonstrate its main features, the authors have developed a special-purpose simulation template for cost range estimating. This template is employed to estimate the cost of a highway overpass project. [source]
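
A minimal sketch of the fuzzy-Monte-Carlo idea, assuming a single cost item with a normally distributed labour cost and a triangular fuzzy productivity factor (all numbers invented): running plain MC at the endpoints of each alpha-cut of the fuzzy input brackets the CDF, which is one simple way to build the kind of fuzzy cumulative distribution function the abstract describes.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical cost item: a random (normal) labour cost combined with a
# fuzzy (triangular) productivity factor that is known only linguistically.
def simulate_cost(productivity, n=20_000):
    labour = rng.normal(100.0, 10.0, n)   # probabilistic uncertainty
    return labour / productivity          # cost samples

def alpha_cut(tri, alpha):
    """Interval of a triangular fuzzy number (a, m, b) at membership alpha."""
    a, m, b = tri
    return a + alpha * (m - a), b - alpha * (b - m)

fuzzy_productivity = (0.8, 1.0, 1.1)      # pessimistic / most likely / optimistic
for alpha in (0.0, 0.5, 1.0):
    lo, hi = alpha_cut(fuzzy_productivity, alpha)
    # MC at the two alpha-cut endpoints brackets the CDF, giving an
    # interval-valued (fuzzy) probability at each cost level.
    p_upper = (simulate_cost(hi) <= 110.0).mean()  # high productivity -> low cost
    p_lower = (simulate_cost(lo) <= 110.0).mean()
    print(f"alpha={alpha:.1f}: P(cost <= 110) in [{p_lower:.3f}, {p_upper:.3f}]")
```

At alpha = 1 the interval collapses to a single point, recovering an ordinary probabilistic CDF.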


    Prediction of Chain Length Distribution of Polystyrene Made in Batch Reactors with Bifunctional Free-Radical Initiators Using Dynamic Monte Carlo Simulation

    MACROMOLECULAR REACTION ENGINEERING, Issue 3 2007
    Ibrahim M. Maafa
    Abstract The objective of this paper is to present a dynamic Monte Carlo model that is able to simulate the polymerization of styrene with bifunctional free-radical initiators in a batch reactor. The model can predict the dynamic evolution of the chain length distribution of polystyrene in the reactor. The model includes all relevant polymerization mechanistic steps, including chemical and thermal radical generation, and diffusion-controlled termination. The model was applied to styrene polymerization and the Monte Carlo estimates for chain length averages were compared to those obtained with the method of moments. Excellent agreement was obtained between the two methods. Although styrene polymerization was used as a case study, the proposed methodology can be easily extended to any other polymer type made by free-radical polymerization. [source]
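
As a much-reduced illustration of how a chain length distribution emerges from stochastic simulation (the paper's model additionally includes bifunctional initiation, thermal radical generation and diffusion-controlled termination), the sketch below assumes a constant propagation probability p, which yields the Flory most-probable CLD and lets the Monte Carlo averages be checked against closed-form results.

```python
import numpy as np

rng = np.random.default_rng(2)

# Propagation probability for a growing radical under pseudo-steady-state:
# p = kp[M] / (kp[M] + kt[R*]); here taken as a fixed illustrative value.
p = 0.999

# Each dead chain's length = number of monomer additions before termination,
# i.e. geometrically distributed (the Flory most-probable CLD).
lengths = rng.geometric(1.0 - p, size=100_000)

mn = lengths.mean()                              # number-average chain length
mw = (lengths.astype(float) ** 2).mean() / mn    # weight-average chain length
print(f"MC     : Mn = {mn:.0f},  PDI = {mw / mn:.3f}")
print(f"theory : Mn = {1/(1-p):.0f},  PDI = 1 + p = {1 + p:.3f}")
```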


    Chain Length Distributions of Polyolefins Made with Coordination Catalysts at Very Short Polymerization Times: Analytical Solution and Monte Carlo Simulation

    MACROMOLECULAR REACTION ENGINEERING, Issue 1 2007
    João B. P. Soares
    Abstract We developed an analytical solution describing how the chain length distribution (CLD) of polymers made with coordination polymerization catalysts varies as a function of time at very short polymerization times, before the CLD becomes fully developed. We compared the analytical solution with a dynamic Monte Carlo model for validation, obtaining excellent agreement. Our analytical solution can be used to determine when the steady-state hypothesis, commonly used in polymerization models, becomes valid as a function of polymer chain length. We also extended our model to describe polymerization with multiple-site-type catalysts. Depending on the polymerization kinetic parameters of the different site types on the catalyst, the fully developed CLD is reached through very different intermediate CLDs. This modeling approach, although rather simplified, can be used to interpret results from short-polymerization-time experiments such as those done in stopped-flow reactors. [source]


    Dynamic Monte Carlo Simulation of Graft Copolymers Made with ATRP and Metallocene Catalysts

    MACROMOLECULAR SYMPOSIA, Issue 1 2006
    Mamdouh Al-Harthi
    Abstract The synthesis of polyolefin graft copolymers made by coordination polymerization was studied by dynamic Monte Carlo simulation. Macromonomers with a narrow molecular weight distribution and terminal vinyl groups, made by atom-transfer radical polymerization (ATRP), were incorporated randomly into the polyolefin backbone. In addition to average molecular weights and the polydispersity index, the model predicts the complete molecular weight distribution (MWD) and branching density of the graft copolymer. The effect of the macromonomer concentration on the grafting efficiency was also studied. [source]


    Monte Carlo Simulation of Polymer Reactions at Interfaces

    MACROMOLECULAR THEORY AND SIMULATIONS, Issue 4 2007
    Andreas John
    Abstract Adhesion between immiscible polymers during two-component injection moulding may be improved by transreactions of properly functionalised components. We performed MC simulations based on the three-dimensional coarse-grained bond fluctuation model (BFM), including a thermal interaction potential, to characterise the behaviour of several selected types of chemical reactions governed by activation energies of EA = 0, 1, 3 and 5 kBT. The consumption of reactive monomers for all the reactions in the time interval below the Rouse time τR exhibits a typical crossover from kinetically controlled to diffusion-controlled behaviour and can be described by a bimolecular kinetic ansatz. [source]
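
A hedged sketch of how an activation energy typically enters such a reaction MC step (the authors' BFM implementation is not reproduced here): a reaction attempt between contacting reactive monomers is accepted with an Arrhenius-like probability exp(-EA/kBT), so the acceptance rate falls exponentially across the barriers quoted in the abstract.

```python
import numpy as np

rng = np.random.default_rng(3)

def attempt_reaction(e_activation_kT):
    """Accept a reaction between two contacting reactive monomers with an
    Arrhenius-like acceptance probability exp(-E_A / k_B T)."""
    return rng.random() < np.exp(-e_activation_kT)

# Fraction of accepted reaction attempts for the barriers in the abstract
for ea in (0.0, 1.0, 3.0, 5.0):
    acc = np.mean([attempt_reaction(ea) for _ in range(100_000)])
    print(f"E_A = {ea} kBT -> acceptance ~ {acc:.3f} (exp(-E_A) = {np.exp(-ea):.3f})")
```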


    Monte Carlo Simulation of ABA Triblock Copolymer Melts Confined in a Cylindrical Nanotube

    MACROMOLECULAR THEORY AND SIMULATIONS, Issue 2 2007
    Xingqing Xiao
    Abstract Monte Carlo simulations were used to identify the microphase morphologies of ABA triblock copolymer melts confined in a cylindrical nanotube. The influences of the volume fraction of the mid-block B (fB), the radius of the nanotube (R) and the asymmetry of the ABA triblock copolymer chain were discussed in detail. As fB varies, a series of structures were observed under different conditions: double-continuous, three-layer concentric cylinder barrel, porous net, double helices and new multiplex structures. In addition, stacked-disk, catenoid-cylinder and multi-layer concentric cylinder barrel structures occur in turn as R changes. The relation between the circular lamella period L and the layer number Nlayer of the concentric cylinder barrel with increasing R was investigated to further explain the delayed microphase transition of the multi-layer concentric cylinder barrel structures. As the asymmetry of the ABA triblock copolymer chain increases, the short A blocks tend to sit at the interface between the A-rich and B-rich circular lamellae. [source]


    Monte Carlo Simulation of Degradation of Porous Poly(lactide) Scaffolds, 1: Effect of Porosity on pH

    MACROMOLECULAR THEORY AND SIMULATIONS, Issue 9 2006
    Abstract Summary: The Monte Carlo method was used to simulate the degradation of porous PLA scaffolds. The simulated volume was assumed to be divided homogeneously between pore and solid PLA, with a ratio equal to the bulk porosity of the scaffold. The volume was divided into surface and bulk elements, where the surface elements were in direct contact with the aqueous degradation medium while the bulk elements were surrounded by pore and solid PLA. The effect of degradation time on PLA ester groups and carboxylic acid end-groups for surface and bulk elements, on pH, on the PLA degradation rate and mass loss, and on the PLA molecular weight distribution was simulated. For surface elements, the pH remained constant at 7.4 over the entire degradation time, while for bulk elements it decreased significantly, to as low as 5.8. The largest drop in pH within the scaffold was observed for the highest porosity of 90%. There was a lag time of at least 7 weeks in the mass loss for both surface and bulk elements for porosities ranging from 70 to 90%. The mass loss for bulk elements was considerably faster than for surface elements. This difference in the rate of mass loss between the surface and bulk elements could affect the 3D morphology and dimensional stability of the scaffold in vivo as degradation proceeds. The simulation predicts that, due to differences in the rate of bulk and surface degradation, hollow structures could form inside the scaffold after 19, 17, and 15 weeks for initial porosities of 70, 80, and 90%, respectively. Figure: schematic diagram illustrating the degradation of an element on the outer surface of the scaffold (surface element) versus an element within the volume of the scaffold (bulk element). [source]
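
A toy sketch of the simulation idea, not the authors' element-based model: ester bonds hydrolyse at random with a mildly autocatalytic rate, and pH is estimated from the carboxylic end-group concentration via the weak-acid equilibrium. The rate constant, autocatalysis strength and concentration scale are invented; the one physical constant used is the pKa of lactic acid (about 3.86), and the pH is capped at the buffered-medium value of 7.4.

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy bulk model: n ester bonds; each intact bond hydrolyses per weekly step
# with probability k * (1 + autocat * broken_fraction) (acid autocatalysis).
n_bonds = 100_000
alive = np.ones(n_bonds, dtype=bool)
k, autocat = 2e-3, 5.0
ka = 10.0 ** -3.86          # Ka of lactic acid (pKa ~ 3.86)
c0 = 0.1                    # illustrative acid concentration scale [mol/L]

for week in range(21):
    broken = 1.0 - alive.mean()                # fraction of -COOH end groups
    c_acid = max(c0 * broken, 1e-12)
    # Weak-acid equilibrium: [H+] = (-Ka + sqrt(Ka^2 + 4 Ka C)) / 2
    h = (-ka + np.sqrt(ka**2 + 4.0 * ka * c_acid)) / 2.0
    ph = min(-np.log10(h), 7.4)                # cap at the buffered medium pH
    if week % 5 == 0:
        print(f"week {week:2d}: intact bonds {alive.mean():6.1%}  pH ~ {ph:.2f}")
    p_break = k * (1.0 + autocat * broken)
    alive &= rng.random(n_bonds) >= p_break
```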


    Interface Structure between Immiscible Reactive Polymers under Transreaction: a Monte Carlo Simulation

    MACROMOLECULAR THEORY AND SIMULATIONS, Issue 5 2005
    Xuehao He
    Abstract Summary: The interface structure between two immiscible melts, a polycondensate polymer A (e.g., polycarbonate, polyester or polyamide) and a polymer B, was studied by means of Monte Carlo simulations using the bond fluctuation model. Polymer B contained a reactive end group (e.g., OH, NH2 or COOH). Copolymers were generated in situ at the interface by transreactions (alcoholysis, aminolysis or acidolysis), comprising blocks of A of various lengths depending on the position of the transreaction in the polycondensate chain A. The copolymer content at the interface increased with time, particularly fast at the early stage. Fragments of polymer A were released carrying an end group reactive towards polymer A, so internal transreactions proceeded as well. An asymmetric interface structure was formed. The simulation also showed that copolymers generated by interfacial transreactions increased the compatibility of the two polymers and enhanced the adhesion strength at the interface. [source]


    Adsorption Behavior of Asymmetrical Triblock Copolymers at the Solid-Liquid Interface by Monte Carlo Simulation

    MACROMOLECULAR THEORY AND SIMULATIONS, Issue 8 2004
    Changjun Peng
    Abstract Summary: Monte Carlo simulation on a simple lattice model has been used to study the adsorption of asymmetrical triblock copolymers from a non-selective solvent at the solid-liquid interface. The size distributions of train, loop and tail configurations for these copolymers are obtained, as well as other details of the adsorption-layer microstructure. The influence of adsorption energy and the role of molecular symmetry are also investigated. The segment-density profile, the adsorbed amount, the surface coverage, and the adsorption-layer thickness have been determined. Finally, it is shown that the adsorption behavior of an asymmetrical copolymer can be predicted from that of the symmetrical copolymer. Figure: size distributions of the tail configuration for A8-kB20Ak copolymers. [source]


    Monte Carlo Simulations of the Electron Currents Collected by Electrostatic Probes

    CONTRIBUTIONS TO PLASMA PHYSICS, Issue 7-8 2004
    D. Trunec
    Abstract We have carried out Monte Carlo simulation of the motion of electrons in the space charge sheath surrounding a cylindrical Langmuir (electrostatic) probe. The electron currents to the probe have been calculated from these simulations for different conditions (pressure of neutral gas, presence of magnetic field). The results of the simulations have been compared with recent Langmuir probe measurements made in cylindrical DC magnetron plasmas. (© 2004 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim) [source]


    Kinetic Monte Carlo Simulations of Precipitation

    ADVANCED ENGINEERING MATERIALS, Issue 12 2006
    E. Clouet
    Abstract We present some recent applications of the atomistic diffusion model and of the kinetic Monte Carlo (KMC) algorithm to systems of industrial interest, i.e. Al-Zr-Sc and Fe-Nb-C alloys, and to model systems. These applications include studies of homogeneous and heterogeneous precipitation, as well as of phase transformation under irradiation. The KMC simulations are also used to test the main assumptions and limitations of simpler models and classical theories used in industry, e.g. classical nucleation theory. [source]


    Adsorption of Semiflexible Chains on Nanostriped Surfaces: Monte Carlo Simulations

    MACROMOLECULAR THEORY AND SIMULATIONS, Issue 1 2007
    Abdullah AlSunaidi
    Abstract Monte Carlo simulations were carried out to investigate the adsorption of semiflexible chains from a semidilute solution onto substrates with periodic stripes of width w. The chains are made of N = 10 fused monomers of diameter σ, interacting with each other through excluded-volume interactions and with the stripes via a square-well potential of depth ε and width δ. The surface coverage was found to increase with increasing chain stiffness and to decrease with increasing stripe width. At small w, more flexible chains are adsorbed than stiff chains. Analysis of the radius of gyration of the chains showed that when w < 8σ, the component along the stripe direction is significantly larger than the others. The orientational order parameter reveals that, for small w, the chains prefer to align along the stripe direction. [source]


    Monte Carlo Simulations of the Morphologies and Conformations of Triblock Copolymer Thin Films

    MACROMOLECULAR THEORY AND SIMULATIONS, Issue 2 2006
    Yongmin Huang
    Abstract Summary: The morphologies and conformations of triblock copolymer (ABA and ABC) thin films confined between two identical walls were investigated by Monte Carlo simulation using the bond-length-fluctuation and cavity-diffusion algorithms on a cubic lattice. The effects of the wall-block interactions, the copolymer chain composition and the film thickness on the morphologies, as well as on the fraction of chain "bridge" conformations fbridge, are presented in detail. In the ABA thin film, column, parallel, perforated and perpendicular lamellae were distinguished; furthermore, the transition of morphology and the variation of fbridge with increasing thickness were revealed. In the ABC thin film, lamellar morphologies, especially perpendicular lamellae, predominate as the wall-block interactions and the thickness are varied. The results are consistent with theoretical predictions such as DDFT and with simulations reported in the literature. Figure: isodensity profile of an A5B5A5 thin film. [source]


    Monte Carlo Simulations of the 3D Ashkin-Teller Model: continuous phase transition lines

    PHYSICA STATUS SOLIDI (B) BASIC SOLID STATE PHYSICS, Issue 2 2003
    G. Musiał
    Abstract Large-scale Monte Carlo simulations, based on the invariance of the Binder cumulant Q, have been performed for continuous phase transitions in the three-dimensional Ashkin-Teller spin-lattice model on a cubic lattice. Using the universality hypothesis and finite-size-scaling analysis, the Ising character of the phase transitions from the antiferromagnetic to the paramagnetic phase, where the behavior of the cumulant Q differs from that in the Ising model, is demonstrated. Some preliminary results demonstrating the existence of tricritical points are also presented. [source]
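
The method rests on the Binder cumulant Q being (asymptotically) size-independent at a continuous transition. Below is a minimal sketch using the 2D Ising model rather than the paper's 3D Ashkin-Teller model; lattice sizes, temperatures and sweep counts are illustrative, and the Q = <m^2>^2 / <m^4> convention is assumed.

```python
import numpy as np

rng = np.random.default_rng(5)

def binder_cumulant(L, T, sweeps=2000, therm=500):
    """Metropolis MC of the 2D Ising model on an L x L lattice; returns the
    Binder cumulant Q = <m^2>^2 / <m^4>. Curves of Q(T) for different L
    intersect near the critical temperature of a continuous transition."""
    s = rng.choice(np.array([-1, 1]), size=(L, L))
    m2 = m4 = 0.0
    for sweep in range(sweeps):
        for _ in range(L * L):                   # one Monte Carlo sweep
            i, j = rng.integers(L), rng.integers(L)
            nb = (s[(i+1) % L, j] + s[(i-1) % L, j]
                  + s[i, (j+1) % L] + s[i, (j-1) % L])
            dE = 2.0 * s[i, j] * nb              # energy cost of flipping s[i,j]
            if dE <= 0.0 or rng.random() < np.exp(-dE / T):
                s[i, j] = -s[i, j]
        if sweep >= therm:
            m = s.mean()
            m2 += m * m
            m4 += m ** 4
    n = sweeps - therm
    return (m2 / n) ** 2 / (m4 / n)

# Q is (nearly) size-independent at Tc ~ 2.269 and size-dependent away from it
for L in (8, 16):
    qs = [binder_cumulant(L, T) for T in (2.1, 2.269, 2.5)]
    print(L, [round(q, 3) for q in qs])
```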


    A Methodology for Assessing Transportation Network Terrorism Risk with Attacker and Defender Interactions

    COMPUTER-AIDED CIVIL AND INFRASTRUCTURE ENGINEERING, Issue 6 2010
    Pamela M. Murray-Tuite
    Decision makers need a methodology that can capture the complex attacker-defender interactions and help them understand the overall effects on the transportation system, as well as the consequences of asset failure. This article presents such a methodology, which uses degree-of-belief probabilities of target-attack-method combinations, updated using Bayes' theorem after evidence of the attack is obtained. Monte Carlo simulation generates the probability of link-capacity effects by sampling from distributions of capacity reductions due to pre-event security measures, substitutions, target failure, and post-event security measures. The average capacity reduction for a particular target-attack-method combination serves as an input to the traffic assignment-simulation package DYNASMART-P to determine travel time effects. The methodology is applied to a sample network based on the northern Virginia area. Results based on notional data indicated that pre-event security measures reduced attack probabilities, but in some cases increased the mobility consequences. Thus, decision makers must carefully evaluate the effects of their decisions. [source]
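
A small sketch of the two probabilistic steps the abstract describes, with entirely notional numbers: a Bayes'-theorem update of degree-of-belief scenario probabilities given evidence, followed by Monte Carlo sampling of the remaining link capacity.

```python
import numpy as np

rng = np.random.default_rng(6)

# Notional degree-of-belief priors over target-attack-method combinations
scenarios  = ["bridge+explosive", "bridge+ramming", "tunnel+explosive"]
prior      = np.array([0.5, 0.2, 0.3])
# Notional P(evidence | scenario), e.g. a report of explosives procurement
likelihood = np.array([0.8, 0.1, 0.6])

posterior = prior * likelihood / (prior * likelihood).sum()   # Bayes' theorem
for s, p in zip(scenarios, posterior):
    print(f"P({s} | evidence) = {p:.3f}")

# Monte Carlo over link-capacity effects for one combination: remaining
# capacity = (effect of pre-event security) x (1 - fraction destroyed)
n = 50_000
pre_event = rng.uniform(0.9, 1.0, n)   # capacity kept thanks to pre-event measures
destroyed = rng.beta(2.0, 5.0, n)      # fraction of capacity destroyed by the attack
remaining = pre_event * (1.0 - destroyed)
print(f"mean remaining capacity {remaining.mean():.2f}, "
      f"5th percentile {np.percentile(remaining, 5):.2f}")
```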


    Prediction of Onset of Corrosion in Concrete Bridge Decks Using Neural Networks and Case-Based Reasoning

    COMPUTER-AIDED CIVIL AND INFRASTRUCTURE ENGINEERING, Issue 2 2005
    G. Morcous
    The proposed approach is based on the integration of artificial neural networks (ANN), case-based reasoning (CBR), a mechanistic model, and Monte Carlo simulation (MCS). A probabilistic mechanistic model is used to generate the distribution of the time to corrosion initiation based on statistical models of the governing parameters obtained from field data. The proposed ANN and CBR models act as universal functional mapping tools to approximate the relationship between the input and output of the mechanistic model. These tools are integrated with the MCS technique to generate the distribution of the corrosion initiation time using the distributions of the governing parameters. The proposed methodology is applied to predict the time to corrosion initiation of the top reinforcing steel in the concrete deck of the Dickson Bridge in Montreal. This study demonstrates the feasibility, adequate reliability, and computational efficiency of the proposed integrated ANN-MCS and CBR-MCS approaches for preliminary project-level and network-level analyses. [source]
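
The abstract does not spell out the mechanistic model; a common choice for corrosion initiation, assumed here purely for illustration, is Fickian chloride diffusion, which gives the initiation time t = x^2 / (4 D [erfinv(1 - Ct/Cs)]^2). The sketch below propagates notional parameter distributions through that formula by Monte Carlo simulation.

```python
import numpy as np
from scipy.special import erfinv

rng = np.random.default_rng(14)

n = 100_000
# Notional distributions of the governing parameters (Fickian diffusion model):
x  = rng.normal(60.0, 10.0, n)             # concrete cover depth [mm]
D  = rng.lognormal(np.log(20.0), 0.4, n)   # chloride diffusion coeff. [mm^2/yr]
cs = rng.normal(0.60, 0.10, n)             # surface chloride content
ct = rng.normal(0.15, 0.03, n)             # chloride threshold for initiation

ok = (cs > ct) & (ct > 0) & (x > 0)
# Invert C(x,t) = cs * (1 - erf(x / (2 sqrt(D t)))) = ct for t:
ti = x[ok]**2 / (4.0 * D[ok] * erfinv(1.0 - ct[ok] / cs[ok])**2)
print(f"time to corrosion initiation: median {np.median(ti):.1f} yr, "
      f"10th percentile {np.percentile(ti, 10):.1f} yr")
```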


    Incorporating Uncertainty into Demographic Modeling: Application to Shark Populations and Their Conservation

    CONSERVATION BIOLOGY, Issue 4 2002
    Enric Cortés
    I explored the effects of uncertainty in demographic traits on demographic analyses of sharks, an approach not previously used for this taxon. I used age-structured life tables and Leslie matrices based on a prebreeding survey and a yearly time step, applied only to females, to model the demography of 41 populations from 38 species of sharks representing four orders and nine families. I used Monte Carlo simulation to reflect uncertainty in the estimates of demographic traits and to calculate population statistics and elasticities for these populations; I used correlation analysis to identify the demographic traits that explained most of the variation in population growth rates (λ). The populations I examined fell along a continuum of life-history characteristics that can be linked to elasticity patterns. Sharks characterized by early age at maturity, short lifespan, and large litter size had high λ values and short generation times, whereas sharks that mature late and have long lifespans and small litters had low λ values and long generation times. Sharks at the "fast" end of the spectrum tended to have comparable adult and juvenile survival elasticities, whereas sharks at the "slow" end of the continuum had high juvenile survival elasticity and low age-zero survival (or fertility) elasticity. Ratios of adult survival to fertility elasticities and juvenile survival to fertility elasticities suggest that many of the populations studied do not possess the biological attributes necessary to restore λ to its original level after moderate levels of exploitation. Elasticity analysis suggests that changes in juvenile survival would have the greatest effect on λ, and correlation analysis indicates that variation in juvenile survival, age at maturity, and reproduction accounts for most of the variation in λ. In general, combined results from elasticity and correlation analyses suggest that research, conservation, and management efforts should focus on these demographic traits. [source]
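
A compact sketch of the λ-and-elasticity machinery the abstract uses, with notional shark-like vital rates (the real analysis used species-specific life tables): λ is the dominant eigenvalue of the Leslie matrix, and the elasticities are e_ij = (a_ij / λ) v_i w_j / <v, w>, where w and v are the right and left eigenvectors.

```python
import numpy as np

rng = np.random.default_rng(7)

def leslie(fert, surv):
    """Leslie matrix: fertilities on the first row, survival on the subdiagonal."""
    n = len(fert)
    A = np.zeros((n, n))
    A[0, :] = fert
    A[np.arange(1, n), np.arange(n - 1)] = surv
    return A

def lam_and_elasticity(A):
    """Dominant eigenvalue (lambda) and elasticities e_ij = (a_ij/lam) v_i w_j / <v,w>."""
    vals, vecs = np.linalg.eig(A)
    k = np.argmax(vals.real)
    lam = vals[k].real
    w = np.abs(vecs[:, k].real)                    # stable age distribution
    vals_t, vecs_t = np.linalg.eig(A.T)
    kt = np.argmin(np.abs(vals_t - lam))
    v = np.abs(vecs_t[:, kt].real)                 # reproductive values
    elas = A * np.outer(v, w) / (v @ w) / lam      # elasticities (sum to 1)
    return lam, elas

# Notional shark-like vital rates with uniform uncertainty (4 age classes)
lams = []
for _ in range(5000):
    fert = np.array([0.0, 0.0, rng.uniform(1.0, 3.0), rng.uniform(2.0, 4.0)])
    surv = rng.uniform([0.4, 0.6, 0.7], [0.6, 0.8, 0.9])
    lam, elas = lam_and_elasticity(leslie(fert, surv))
    lams.append(lam)
print(f"lambda: median {np.median(lams):.3f}, 90% interval "
      f"({np.percentile(lams, 5):.3f}, {np.percentile(lams, 95):.3f})")
print("elasticities of the last draw:\n", elas.round(3))
```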




    Efficient sampling and data reduction techniques for probabilistic seismic lifeline risk assessment

    EARTHQUAKE ENGINEERING AND STRUCTURAL DYNAMICS, Issue 10 2010
    Nirmal Jayaram
    Abstract Probabilistic seismic risk assessment for spatially distributed lifelines is less straightforward than for individual structures. While procedures such as the 'PEER framework' have been developed for risk assessment of individual structures, they are not easily applicable to distributed lifeline systems, due to difficulties in describing ground-motion intensity (e.g. spectral acceleration) over a region (in contrast to ground-motion intensity at a single site, which is easily quantified using Probabilistic Seismic Hazard Analysis), and because the link between the ground-motion intensities and lifeline performance is usually not available in closed form. As a result, Monte Carlo simulation (MCS) and its variants are well suited for characterizing ground motions and computing the resulting losses to lifelines. This paper proposes a simulation-based framework for developing a small but stochastically representative catalog of earthquake ground-motion intensity maps that can be used for lifeline risk assessment. In this framework, Importance Sampling is used to preferentially sample 'important' ground-motion intensity maps, and K-Means Clustering is used to identify and combine redundant maps in order to obtain a small catalog. The effects of sampling and clustering are accounted for through a weighting on each remaining map, so that the resulting catalog is still a probabilistically correct representation. The feasibility of the proposed simulation framework is illustrated by using it to assess the seismic risk of a simplified model of the San Francisco Bay Area transportation network. A catalog of just 150 intensity maps is generated to represent hazard at 1038 sites from 10 regional fault segments causing earthquakes with magnitudes between five and eight. The risk estimates obtained using these maps are consistent with those obtained using conventional MCS utilizing many orders of magnitude more ground-motion intensity maps. Therefore, the proposed technique can be used to drastically reduce the computational expense of a simulation-based risk assessment without compromising the accuracy of the risk estimates. This will facilitate computationally intensive risk analysis of systems such as transportation networks. Finally, the study shows that the uncertainties in the ground-motion intensities and the spatial correlations between ground-motion intensities at various sites must be modeled in order to obtain unbiased estimates of lifeline risk. Copyright © 2010 John Wiley & Sons, Ltd. [source]
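
A toy sketch of the sample-weight-cluster pipeline, using three correlated "sites" instead of 1038 and multivariate-normal log intensities (the real framework simulates maps from seismic source and ground-motion models): importance sampling draws from a shifted density, each map carries a weight p/q, and K-means reduces the set to a small weighted catalog.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(8)

# Toy "intensity maps": log intensity at 3 sites, spatially correlated.
# True density p: standard MVN; importance density q: mean shifted upward
# so rare, damaging maps are sampled more often.
mean_p, mean_q = np.zeros(3), 1.5 * np.ones(3)
cov = 0.5 * np.eye(3) + 0.5                  # diag 1.0, off-diagonal 0.5
n = 2000
maps = rng.multivariate_normal(mean_q, cov, n)

def log_mvn(x, mu, cov):
    d = x - mu
    inv = np.linalg.inv(cov)
    _, logdet = np.linalg.slogdet(cov)
    return -0.5 * (np.einsum('ij,jk,ik->i', d, inv, d)
                   + logdet + 3 * np.log(2 * np.pi))

w = np.exp(log_mvn(maps, mean_p, cov) - log_mvn(maps, mean_q, cov))  # IS weights

# Reduce 2000 maps to a 20-map catalog: cluster, keep one map (medoid) per
# cluster, and carry the cluster's total weight so probabilities stay correct.
km = KMeans(n_clusters=20, n_init=10, random_state=0).fit(maps)
catalog, weights = [], []
for c in range(20):
    idx = np.where(km.labels_ == c)[0]
    medoid = idx[np.argmin(((maps[idx] - km.cluster_centers_[c])**2).sum(1))]
    catalog.append(maps[medoid])
    weights.append(w[idx].sum())
weights = np.array(weights) / w.sum()

# Check: catalog estimate of P(intensity at site 0 > 2) vs full IS estimate
exceed_full = (w * (maps[:, 0] > 2)).sum() / w.sum()
exceed_cat = sum(wt * (m[0] > 2) for wt, m in zip(weights, catalog))
print(f"full IS: {exceed_full:.4f}   20-map catalog: {exceed_cat:.4f}")
```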


    Incremental dynamic analysis for estimating seismic performance sensitivity and uncertainty

    EARTHQUAKE ENGINEERING AND STRUCTURAL DYNAMICS, Issue 2 2010
    Dimitrios Vamvatsikos
    Abstract Incremental dynamic analysis (IDA) is presented as a powerful tool to evaluate the variability in the seismic demand and capacity of non-deterministic structural models, building upon existing methodologies of Monte Carlo simulation and approximate moment estimation. A nine-story steel moment-resisting frame is used as a testbed, employing parameterized moment-rotation relationships with non-deterministic quadrilinear backbones for the beam plastic hinges. The uncertain properties of the backbones include the yield moment, the post-yield hardening ratio, the end-of-hardening rotation, the slope of the descending branch, the residual moment capacity and the ultimate rotation reached. IDA is employed to accurately assess the seismic performance of the model for any combination of the parameters by performing multiple nonlinear time-history analyses for a suite of ground motion records. Sensitivity analyses at both the IDA and the static pushover level reveal the yield moment and the two rotational-ductility parameters to be the most influential for the frame behavior. To propagate the parametric uncertainty to the actual seismic performance we employ (a) Monte Carlo simulation with Latin hypercube sampling, (b) point-estimate and (c) first-order second-moment techniques, thus offering competing methods that represent different compromises between speed and accuracy. The final results provide firm ground for challenging current assumptions in seismic guidelines on using a median-parameter model to estimate the median seismic performance and employing the well-known square-root-sum-of-squares rule to combine aleatory randomness and epistemic uncertainty. Copyright © 2009 John Wiley & Sons, Ltd. [source]
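
A minimal sketch of the Latin hypercube sampling step for the backbone parameters (the distributions below are invented; the paper's parameter ranges are not reproduced): each uncertain parameter is stratified into n equal-probability bins, one sample per bin, with independently permuted columns.

```python
import numpy as np
from scipy.stats import lognorm, norm

rng = np.random.default_rng(9)

def latin_hypercube(n_samples, n_dims):
    """Stratified sampling: each column holds one point per equal-probability
    stratum of the unit interval, in an independently permuted order."""
    u = (rng.random((n_samples, n_dims))
         + np.arange(n_samples)[:, None]) / n_samples
    for d in range(n_dims):
        u[:, d] = u[rng.permutation(n_samples), d]
    return u

# Map unit samples to notional plastic-hinge backbone parameters via inverse CDFs
u = latin_hypercube(30, 3)
a_my = norm.ppf(u[:, 0], loc=1.0, scale=0.1)     # normalised yield moment
mu_c = lognorm.ppf(u[:, 1], s=0.3, scale=3.0)    # end-of-hardening ductility
mu_u = lognorm.ppf(u[:, 2], s=0.3, scale=6.0)    # ultimate ductility
print("first three model realisations (aMy, mu_c, mu_u):")
print(np.c_[a_my, mu_c, mu_u][:3].round(3))
```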


    Statistical performance analysis of seismic-excited structures with active interaction control

    EARTHQUAKE ENGINEERING AND STRUCTURAL DYNAMICS, Issue 7 2003
    Yunfeng Zhang
    Abstract This paper presents a statistical performance analysis of a semi-active structural control system for suppressing the vibration response of building structures during strong seismic events. The proposed semi-active mass damper device consists of a high-frequency mass damper with large stiffness, and an actively controlled interaction element that connects the mass damper to the structure. Through actively modulating the operating states of the interaction elements according to pre-specified control logic, vibrational energy in the structure is dissipated in the mass damper device and the vibration of the structure is thus suppressed. The control logic, categorized under active interaction control, is defined directly in physical space by minimizing the inter-storey drift of the structure to the maximum extent. This semi-active structural control approach has been shown to be effective in reducing the vibration response of building structures due to specific earthquake ground motions. To further evaluate the control performance, a Monte Carlo simulation of the seismic response of a three-storey steel-framed building model equipped with the proposed semi-active mass damper device is performed based on a large ensemble of artificially generated earthquake ground motions. A procedure for generating code-compatible artificial earthquake accelerograms is also briefly described. The results obtained clearly demonstrate the effectiveness of the proposed semi-active mass damper device in controlling vibrations of building structures during large earthquakes. Copyright © 2003 John Wiley & Sons, Ltd. [source]


    Variable reporting and quantitative reviews: a comparison of three meta-analytical techniques

    ECOLOGY LETTERS, Issue 5 2003
    Marc J. Lajeunesse
    Abstract Variable reporting of results can influence quantitative reviews by limiting the number of studies available for analysis, thereby influencing both the type of analysis and the scope of the review. We performed a Monte Carlo simulation to determine the statistical errors of three meta-analytical approaches and related how such errors were affected by the number of constituent studies. Hedges' d and effect sizes based on item response theory (IRT) had similarly improved error rates with increasing numbers of studies when there was no true effect, but IRT was conservative when there was a true effect. The log response ratio had low precision for detecting null effects as a result of overestimation of effect sizes, but high ability to detect true effects, largely irrespective of the number of studies. Traditional meta-analyses based on Hedges' d are preferred; however, quantitative reviews should use various methods in concert to improve representation and inference from summaries of published data. [source]
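
For reference, two of the effect-size metrics compared in the abstract can be computed as follows (the example study numbers are invented; the IRT-based effect size is omitted because its construction is specific to the paper):

```python
import numpy as np

def hedges_d(m1, s1, n1, m2, s2, n2):
    """Hedges' d: standardized mean difference with small-sample correction J."""
    s_pool = np.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    j = 1.0 - 3.0 / (4.0 * (n1 + n2) - 9.0)        # Hedges' bias correction
    d = j * (m1 - m2) / s_pool
    var_d = (n1 + n2) / (n1 * n2) + d**2 / (2.0 * (n1 + n2))
    return d, var_d

def log_response_ratio(m1, s1, n1, m2, s2, n2):
    """ln R = ln(m1/m2) with its approximate sampling variance."""
    lr = np.log(m1 / m2)
    var = s1**2 / (n1 * m1**2) + s2**2 / (n2 * m2**2)
    return lr, var

# Invented example study: treatment vs control means, SDs and sample sizes
d, vd = hedges_d(10.2, 2.0, 12, 9.0, 2.2, 15)
lr, vlr = log_response_ratio(10.2, 2.0, 12, 9.0, 2.2, 15)
print(f"Hedges' d = {d:.3f} (var {vd:.3f}); ln R = {lr:.3f} (var {vlr:.4f})")
```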


    Early survival of marble trout Salmo marmoratus: evidence for density dependence?

    ECOLOGY OF FRESHWATER FISH, Issue 2 2007
    S. Vincenzi
    Abstract: The role of endogenous and exogenous factors in regulating the population dynamics of freshwater salmonids is still a matter of debate. The aim of the present work was to assess the relative importance of density-dependent and density-independent factors in determining the survival of marble trout (Salmo marmoratus) yearlings in two populations living in Slovenian streams (Zakojska and Gorska). The investigation combined a classical life table analysis with Monte Carlo simulation. Size-dependent fecundity was estimated by stripping wild adults at the fish farm. A significant positive relationship was found between the length of marble trout females and the number of eggs produced. Egg density was the major determinant of survival from egg to age 1+ (σ0) in both streams. Residuals of the relationship between σ0 and egg density were positively correlated with rainfall only in Zakojska, probably because, within a certain range, more intense rainfall increases stream flow and, consequently, suitable habitat for trout. Our study shows how density-dependent and environmental factors can interact to determine the survival of marble trout during the first year of life. [source]


    Measuring and Optimizing Portfolio Credit Risk: A Copula-based Approach,

    ECONOMIC NOTES, Issue 3 2004
    Annalisa Di Clemente
    In this work, we present a methodology for measuring and optimizing the credit risk of a loan portfolio that takes into account the non-normality of the credit loss distribution. In particular, we aim at accurately modelling joint default events for credit assets. To achieve this goal, we build the loss distribution of the loan portfolio by Monte Carlo simulation. The times until default of each obligor in the portfolio are simulated following a copula-based approach. In particular, we study four different types of dependence structure for the credit assets in the portfolio: the Gaussian copula, the Student's t-copula, the grouped t-copula and the Clayton n-copula (or Cook-Johnson copula). Our aim is to assess the impact of each type of copula on the value of different portfolio risk measures, such as expected loss, maximum loss, credit value at risk and expected shortfall. In addition, we want to verify whether and how the optimal portfolio composition may change when various types of copula are used to describe the default dependence structure. In order to optimize portfolio credit risk, we minimize the conditional value at risk, a risk measure both relevant and tractable, by solving a simple linear programming problem subject to the traditional constraints of balance, portfolio expected return and trading. The outcomes, in terms of optimal portfolio compositions, obtained under the different default dependence structures are compared with each other. The solution of the risk minimization problem may suggest how to restructure inefficient loan portfolios in order to obtain their best risk/return profile. In the absence of a developed secondary market for loans, the investment strategies indicated by the solution vector may be followed by utilizing credit default swaps. [source]
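
A minimal sketch of the copula step for one of the four dependence structures studied, the one-factor Gaussian copula (portfolio size, default probability and correlation are invented; recovery is ignored): correlated uniforms are produced from a common factor and mapped through exponential marginal default-time distributions, and risk measures are read off the simulated loss distribution.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(10)

# Portfolio of 50 obligors, 1-year default probability 2%, exponential
# marginal default times, Gaussian copula with uniform correlation 0.3.
n_obligors, n_sims, pd1y, rho = 50, 100_000, 0.02, 0.3
hazard = -np.log(1.0 - pd1y)                  # so that P(tau <= 1) = pd1y
exposure = 1.0                                # unit loss per default (no recovery)

# One-factor Gaussian copula: X_i = sqrt(rho) Z + sqrt(1 - rho) eps_i
z = rng.standard_normal((n_sims, 1))
eps = rng.standard_normal((n_sims, n_obligors))
x = np.sqrt(rho) * z + np.sqrt(1.0 - rho) * eps
u = norm.cdf(x)                               # correlated uniforms
tau = -np.log(1.0 - u) / hazard               # exponential default times

loss = (tau <= 1.0).sum(axis=1) * exposure    # 1-year portfolio loss
var99 = np.percentile(loss, 99)
es99 = loss[loss >= var99].mean()             # expected shortfall beyond VaR
print(f"EL {loss.mean():.2f}   99% VaR {var99:.0f}   99% ES {es99:.2f}")
```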


    A STATISTICAL TEST OF UNBIASED EVOLUTION OF BODY SIZE IN BIRDS

    EVOLUTION, Issue 12 2002
    Folmer Bokma
    Abstract: Of the approximately 9500 bird species, the vast majority are small-bodied. That is a general feature of evolutionary lineages, also observed for instance in mammals and plants. The avian interspecific body size distribution is right-skewed even on a logarithmic scale. This has previously been interpreted as evidence that body size evolution has been biased. However, a procedure to test for unbiased evolution from the shape of body size distributions was lacking. In the present paper unbiased body size evolution is defined precisely, and a statistical test is developed based on Monte Carlo simulation of unbiased evolution. Application of the test to birds suggests that it is highly unlikely that avian body size evolution has been unbiased as defined. Several possible explanations for this result are discussed. A plausible explanation is that the general model of unbiased evolution assumes that population size and generation time do not affect the evolutionary variability of body size, that is, that micro- and macroevolution are decoupled, which theory suggests is unlikely to be the case. [source]
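
Loosely illustrating the testing idea only (the splitting process below is a simplification, not the paper's precise definition of unbiased evolution): simulate driftless evolution of log body size with random lineage splitting, collect the skewness of the resulting tip distribution over many replicates, and compare an observed skewness against that null distribution.

```python
import numpy as np
from scipy.stats import skew

rng = np.random.default_rng(13)

def simulate_unbiased(n_species=200, sigma=0.05):
    """Driftless (unbiased) evolution of log body size: all lineages take a
    random-walk step each time unit, and one randomly chosen lineage splits
    per step until n_species tips exist (a simplified speciation process)."""
    sizes = [0.0]                                       # root lineage, log size
    while len(sizes) < n_species:
        sizes = [s + rng.normal(0.0, sigma) for s in sizes]
        sizes.append(sizes[rng.integers(len(sizes))])   # random speciation
    return np.array(sizes)

null_skews = [skew(simulate_unbiased()) for _ in range(200)]
print(f"null skewness of log body size: mean {np.mean(null_skews):.3f}, "
      f"95% interval ({np.percentile(null_skews, 2.5):.3f}, "
      f"{np.percentile(null_skews, 97.5):.3f})")
# An observed skewness far outside this interval would reject unbiased evolution.
```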


    The perturbation method and the extended finite element method: an application to fracture mechanics problems

    FATIGUE & FRACTURE OF ENGINEERING MATERIALS AND STRUCTURES, Issue 8 2006
    ABSTRACT The extended finite element method has been successful in the numerical simulation of fracture mechanics problems. With this methodology, unlike the conventional finite element method, discretization of the domain with a mesh adapted to the geometry of the discontinuity is not required. On the other hand, in traditional fracture mechanics all variables are considered deterministic (uniquely defined by a given numerical value), even though the uncertainty associated with these variables (external loads, geometry and material properties, among others) is well known. This paper presents a novel application of the perturbation method along with the extended finite element method to treat these uncertainties. The methodology has been implemented in commercial software, and the results are compared with those obtained by means of a Monte Carlo simulation. [source]
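
A minimal scalar illustration of the perturbation idea on the textbook stress intensity factor K = σ√(πa) (the paper applies it within XFEM; the distributions here are invented): first-order perturbation propagates means and variances through the gradient, and a Monte Carlo run provides the reference.

```python
import numpy as np

rng = np.random.default_rng(11)

# Stress intensity factor K = sigma * sqrt(pi * a) with uncertain load and
# crack length (all values illustrative).
mu_s, sd_s = 100.0, 10.0        # stress [MPa]
mu_a, sd_a = 0.01, 0.001        # crack length [m]

# First-order perturbation (FOSM): mean at the means, variance from gradients
k_mean = mu_s * np.sqrt(np.pi * mu_a)
dk_ds = np.sqrt(np.pi * mu_a)                        # dK/dsigma
dk_da = mu_s * np.sqrt(np.pi) / (2 * np.sqrt(mu_a))  # dK/da
k_var = dk_ds**2 * sd_s**2 + dk_da**2 * sd_a**2      # independent inputs

# Monte Carlo reference
s = rng.normal(mu_s, sd_s, 1_000_000)
a = rng.normal(mu_a, sd_a, 1_000_000)
k = s * np.sqrt(np.pi * np.abs(a))                   # abs() guards rare a < 0
print(f"perturbation: mean {k_mean:.2f}, std {np.sqrt(k_var):.2f}")
print(f"Monte Carlo : mean {k.mean():.2f}, std {k.std():.2f}")
```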


    Electrically Conductive Thin Films Prepared from Layer-by-Layer Assembly of Graphite Platelets

    ADVANCED FUNCTIONAL MATERIALS, Issue 7 2009
    Mubarak Alazemi
    Abstract Layer-by-layer (LBL) assembly of carbon nanoparticles for low-electrical-contact-resistance thin film applications is demonstrated. The nanoparticles consist of irregularly shaped graphite platelets, with an acrylamide/β-methacryloxyethyltrimethylammonium copolymer as the cationic binder. The nanoparticle zeta (ζ) potential, and thereby the electrostatic interactions, are varied by altering the pH of the graphite suspension as well as that of the binder suspension. Film thickness as a function of zeta potential, immersion time, and the number of layers deposited is obtained using Monte Carlo simulation of the energy dispersive spectroscopy measurements. The multilayer film surface morphology is visualized via field-emission scanning electron microscopy and atomic-force microscopy. Thin-film electrical properties are characterized using electrical contact resistance measurements. Graphite nanoparticles are found to self-assemble onto gold substrates through two distinct yet overlapping mechanisms. The first mechanism is characterized by logarithmic carbon uptake with respect to the number of deposition cycles and slow clustering of nanoparticles on the gold surface. The second mechanism results from more rapid LBL nanoparticle assembly and is characterized by linear weight uptake with respect to the number of deposition cycles and a constant bilayer thickness of 15 to 21 nm. The thin-film electrical contact resistance is found to be proportional to the thickness after equilibration of the bilayer structure. Measured values range from 1.6 mΩ·cm2 at 173 nm to 3.5 mΩ·cm2 at 276 nm. The coating volume resistivity is reduced when electrostatic interactions are enhanced during LBL assembly. [source]


    Stochastic Study of Solute Transport in a Nonstationary Medium

    GROUND WATER, Issue 2 2006
    Bill X. Hu
    A Lagrangian stochastic approach is applied to develop a method of moments for solute transport in a physically and chemically nonstationary medium. Stochastic governing equations for the mean solute flux and the solute covariance are obtained analytically, to first-order accuracy in the log-conductivity and/or chemical sorption variances, and solved numerically using the finite-difference method. The developed method, the numerical method of moments (NMM), is used to predict radionuclide solute transport processes in the saturated zone below the Yucca Mountain project area. The mean, variance, and upper bound of the radionuclide mass flux through a control plane 5 km downstream of the footprint of the repository are calculated. According to their chemical sorption capacities, the various radionuclides are grouped as nonreactive, weakly sorbing, and strongly sorbing chemicals. The NMM is used to study their transport processes and influencing factors. To verify the method of moments, a Monte Carlo simulation is conducted for nonreactive chemical transport. Results indicate that the two methods are consistent, but the NMM is computationally more efficient than the Monte Carlo method. This study adds to the ongoing debate in the literature on the effect of heterogeneity on solute transport prediction, especially on prediction uncertainty, by showing that the standard deviation of the solute flux is larger than the mean solute flux even when the heterogeneity of the hydraulic conductivity within each geological layer is mild. This study provides a method that may become an efficient calculation tool for many environmental projects. [source]
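
As a pocket-sized analogue of the Monte Carlo side of this comparison, here is random-walk particle tracking for 1D advection-dispersion with invented parameters, checked against the analytical plume moments (mean v·t, variance 2·D·t):

```python
import numpy as np

rng = np.random.default_rng(15)

# Random-walk particle tracking for 1D advection-dispersion:
# dx = v dt + sqrt(2 D dt) * xi, with xi ~ N(0, 1)
v, D, dt, n_steps, n_particles = 0.5, 0.1, 0.1, 1000, 50_000
x = np.zeros(n_particles)
for _ in range(n_steps):
    x += v * dt + np.sqrt(2.0 * D * dt) * rng.standard_normal(n_particles)

t = n_steps * dt
print(f"mean position: MC {x.mean():.2f} vs analytical {v * t:.2f}")
print(f"variance     : MC {x.var():.2f} vs analytical {2 * D * t:.2f}")
```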


    The impact of using different costing methods on the results of an economic evaluation of cardiac care: microcosting vs gross-costing approaches

    HEALTH ECONOMICS, Issue 4 2009
    Fiona M. Clement (née Shrive)
    Abstract Background: Published guidelines on the conduct of economic evaluations provide little guidance regarding the use and potential bias of different costing methods. Objectives: Using microcosting and two gross-costing methods, we (1) compared the cost estimates within and across subjects, and (2) determined the impact on the results of an economic evaluation. Methods: Microcosting estimates were obtained from the local health region, and gross-costing estimates were obtained from two government bodies (one provincial and one national). Total inpatient costs were described for each method. Using an economic evaluation of sirolimus-eluting stents, we compared the incremental cost-utility ratios that resulted from applying each method. Results: Microcosting, Case-Mix-Grouper (CMG) gross-costing, and Refined-Diagnosis-Related-Grouper (rDRG) gross-costing resulted in 4-year mean cost estimates of $16,684, $16,232, and $10,474, respectively. Using Monte Carlo simulation, the cost per QALY gained was $41,764 (95% CI: $41,182-$42,346), $42,538 (95% CI: $42,167-$42,907), and $36,566 (95% CI: $36,172-$36,960) for microcosting, rDRG-derived and CMG-derived estimates, respectively (P<0.001). Conclusions: Within subjects, the three costing methods produced markedly different cost estimates. The difference in cost-utility values produced by each method is modest but of a magnitude that could influence a decision to fund a new intervention. Copyright © 2008 John Wiley & Sons, Ltd. [source]
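
The Monte Carlo step reported in the Results can be sketched as a standard probabilistic sensitivity analysis (the distributions below are invented and far simpler than the study's): sample incremental costs and QALYs, form the ratio, and read off percentile intervals and the probability of cost-effectiveness at a willingness-to-pay threshold.

```python
import numpy as np

rng = np.random.default_rng(12)

# Notional probabilistic sensitivity analysis for an incremental
# cost-utility ratio (cost per QALY gained), new treatment vs usual care.
n = 100_000
d_cost = rng.normal(4200.0, 300.0, n)        # incremental cost ($)
d_qaly = rng.normal(0.10, 0.012, n)          # incremental QALYs

icur = d_cost / d_qaly
lo, hi = np.percentile(icur, [2.5, 97.5])
print(f"cost per QALY: ${np.median(icur):,.0f} (95% CI ${lo:,.0f}-${hi:,.0f})")

# Probability of being cost-effective at a $50,000/QALY threshold
wtp = 50_000.0
print(f"P(cost-effective at ${wtp:,.0f}/QALY) = "
      f"{np.mean(d_cost <= wtp * d_qaly):.3f}")
```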