Simulation Experiments

Selected Abstracts


    Simulation of resource synchronization in a dynamic real-time distributed computing environment

    CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 14 2004
    Chen Zhang
    Abstract Today, more and more distributed computer applications are being modeled and constructed using real-time principles and concepts. In 1989, the Object Management Group (OMG) formed a Real-Time Special Interest Group (RT SIG) with the goal of extending the Common Object Request Broker Architecture (CORBA) standard to include real-time specifications. This group's most recent efforts have focused on the requirements of dynamic distributed real-time systems. One open problem in this area is resource access synchronization for tasks employing dynamic priority scheduling. This paper presents two resource synchronization protocols that the authors have developed which meet the requirements of dynamic distributed real-time systems as specified by Dynamic Scheduling Real-Time CORBA (DSRT CORBA). The proposed protocols can be applied to both Earliest Deadline First (EDF) and Least Laxity First (LLF) dynamic scheduling algorithms, allow distributed nested critical sections, and avoid unnecessary runtime overhead. In order to evaluate the performance of the proposed protocols, we analyzed each protocol's schedulability. Since the schedulability of the system is affected by numerous system configuration parameters, we have designed simulation experiments to isolate and illustrate the impact of each individual system parameter. Simulation experiments show the proposed protocols have better performance than one would realize by applying a schema that utilizes dynamic priority ceiling update. Copyright © 2004 John Wiley & Sons, Ltd. [source]
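The protocols themselves are specific to DSRT CORBA, but the dynamic-priority discipline they build on is easy to illustrate. Below is a minimal single-processor Earliest Deadline First simulator over a hypothetical task set; it sketches only the scheduling rule named in the abstract, not the authors' synchronization protocol, and all task parameters are invented.

```python
import heapq

def edf_schedule(jobs, horizon):
    """Preemptive Earliest Deadline First on one processor.

    jobs: list of (arrival, exec_time, absolute_deadline) tuples.
    Returns (trace, misses): which job ran at each tick, and indices of
    jobs that could not meet their deadline.
    """
    pending = sorted(enumerate(jobs), key=lambda kv: kv[1][0])
    ready, remaining, trace, misses, i = [], {}, [], [], 0
    for t in range(horizon):
        while i < len(pending) and pending[i][1][0] <= t:   # release arrivals
            j, (arr, wcet, dl) = pending[i]
            heapq.heappush(ready, (dl, j))
            remaining[j] = wcet
            i += 1
        while ready and ready[0][0] <= t:                   # deadline passed, unfinished
            misses.append(heapq.heappop(ready)[1])
        if ready:                                           # run earliest-deadline job
            dl, j = ready[0]
            remaining[j] -= 1
            trace.append((t, j))
            if remaining[j] == 0:
                heapq.heappop(ready)
    return trace, misses

print(edf_schedule([(0, 3, 9), (1, 2, 4), (2, 1, 12)], horizon=8))
```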


    Aggregation in Matching Markets

    INTERNATIONAL ECONOMIC REVIEW, Issue 1 2000
    John K. Dagsvik
    This article develops aggregate relations for a matching market of heterogeneous suppliers and demanders. Under particular assumptions about the distribution of preferences and the matching game, asymptotic aggregate relations for the number of realized matches of different types in the presence of flexible contracts (such as a price) are derived. Simulation experiments demonstrate that the model also provides excellent predictions in small populations. The potential for applications within demographic, labor market, and welfare analyses is discussed. [source]


    Adaptive least mean squares block Volterra filters

    INTERNATIONAL JOURNAL OF CIRCUIT THEORY AND APPLICATIONS, Issue 4 2001
    Tarek I. Haweel
Abstract Adaptive filtering has found many applications in situations where the underlying signals are changing or unknown. While linear filters are simple from implementation and conceptual points of view, many signals are non-linear in nature. Non-linear filters based on truncated Volterra expansions can effectively model a large number of systems. Unfortunately, the resulting input auto-moment matrix is ill conditioned, which results in a slow convergence rate. This paper proposes a class of block adaptive Volterra filters in which the input sequences are Hadamard transformed to improve the condition number of the input auto-moment matrix and consequently improve the convergence rate. This is achieved by the decorrelation effect produced by the orthogonality of the transform. Since the Hadamard transformation employs only ±1's, the additional computational and implementation burden is small. The effect of additive white Gaussian noise is introduced. Simulation experiments are given to illustrate the improved performance of the proposed method over the conventional Volterra LMS method. Copyright © 2001 John Wiley & Sons, Ltd. [source]
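The mechanism the paper relies on, a Hadamard transform plus per-coefficient power normalization improving the conditioning of the input auto-moment matrix, can be checked numerically. The snippet below is a toy check with an AR(1) input and one quadratic Volterra term, not the authors' block filter; the regressor layout and the AR coefficient are assumptions for illustration.

```python
import numpy as np
from scipy.linalg import hadamard

rng = np.random.default_rng(0)
M = 8                                   # regressor length (power of 2 for Hadamard)
x = rng.standard_normal(4096)
for n in range(1, x.size):              # AR(1) input: strongly correlated, hence an
    x[n] += 0.95 * x[n - 1]             # ill-conditioned auto-moment matrix

# Truncated second-order Volterra regressor: 7 linear taps + 1 quadratic term.
U = np.array([np.concatenate([x[n - 6 : n + 1][::-1], [x[n] * x[n - 1]]])
              for n in range(6, x.size)])
R = U.T @ U / len(U)                    # input auto-moment matrix

H = hadamard(M) / np.sqrt(M)            # orthonormal +/-1 transform
Rh = H @ R @ H.T                        # auto-moments after the transform
d = np.sqrt(np.diag(Rh))                # per-coefficient power normalization
Rn = Rh / np.outer(d, d)

print("condition number before:", np.linalg.cond(R))
print("after Hadamard + power normalization:", np.linalg.cond(Rn))
```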


    A zone co-operation approach for efficient caching in mobile ad hoc networks

    INTERNATIONAL JOURNAL OF COMMUNICATION SYSTEMS, Issue 9 2006
    Narottam Chand
Abstract A Mobile Ad hoc NETwork (MANET) presents a constrained communication environment due to fundamental limitations of client resources, insufficient wireless bandwidth and users' frequent mobility. Caching of frequently accessed data in such an environment is a potential technique that can improve data access performance and availability. Co-operative caching, which allows the sharing and co-ordination of cached data among clients, can further explore the potential of caching techniques. In this paper, we propose a novel scheme, called zone co-operative (ZC), for caching in MANETs. In the ZC scheme, the one-hop neighbours of a mobile client form a co-operative cache zone. For a data miss in the local cache, each client first searches the data in its zone before forwarding the request to the next client that lies along the routing path towards the server. As part of cache management, a cache admission control and a value-based replacement policy are developed to improve data accessibility and reduce the local cache miss ratio. An analytical study of ZC based on data popularity, node density and transmission range is also performed. Simulation experiments show that the ZC caching mechanism achieves significant improvements in cache hit ratio and average query latency in comparison with other caching strategies. Copyright © 2006 John Wiley & Sons, Ltd. [source]
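As a schematic illustration of the lookup order described in the abstract (local cache, then the one-hop zone, then the routing path towards the server), here is a hypothetical sketch. All names are invented, the replacement policy is omitted, and the single line marked "admission control" stands in for the real scheme's much richer cache management.

```python
class Node:
    def __init__(self, node_id):
        self.id = node_id
        self.cache = {}          # data_id -> value; replacement policy omitted
        self.neighbors = []      # one-hop neighbours form this node's zone

def zc_lookup(node, data_id, path_to_server, server_data):
    """Resolve a request: local cache, then the zone, then the next hop."""
    if data_id in node.cache:
        return node.cache[data_id], f"local hit at {node.id}"
    for peer in node.neighbors:                      # zone co-operation step
        if data_id in peer.cache:
            return peer.cache[data_id], f"zone hit at {peer.id}"
    if path_to_server:                               # forward towards the server
        value, where = zc_lookup(path_to_server[0], data_id,
                                 path_to_server[1:], server_data)
    else:
        value, where = server_data[data_id], "served by origin server"
    node.cache[data_id] = value                      # admission control would filter
    return value, where

a, b, c = Node("A"), Node("B"), Node("C")
b.neighbors = [c]; c.cache["d1"] = 42
print(zc_lookup(a, "d1", [b], {"d1": 42}))           # zone hit at C, via B
```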


A modularized framework for solving an economic–environmental power generation mix problem

    INTERNATIONAL JOURNAL OF ENERGY RESEARCH, Issue 9 2004
    Haoxiang Xia
Abstract This paper presents a modularized simulation modelling framework for evaluating the impacts on economic cost and CO2 emissions resulting from the introduction of a solid oxide fuel cell (SOFC) system into the existing mix of centralized power generation technologies in Japan. The framework comprises three parts: a dual-objective linear programming model that solves the generation best-mix problem for the existing power generation technologies; a nonlinear SOFC system model in which the economic cost and CO2 emissions of the SOFC system can be calculated; and the Queuing Multi-Objective Optimizer (QMOO), a multi-objective evolutionary algorithm (MOEA) developed at the EPFL in Switzerland, as the overall optimizer of the combined power supply system. Thus, the framework integrates an evolutionary algorithm that is more suitable for handling nonlinearities with a calculus-based method that is more efficient in solving linear programming problems. Simulation experiments show that the framework is successful in solving the stated problem. Moreover, the three components of the modularized framework can be interconnected through a platform-independent model integration environment. As a result, the framework is flexible and scalable, and can potentially be modified and/or integrated with other models to study more complex problems. Copyright © 2004 John Wiley & Sons, Ltd. [source]


    A new Bayesian formulation for Holt's exponential smoothing

    JOURNAL OF FORECASTING, Issue 3 2009
    Robert R. Andrawis
Abstract In this paper we propose a Bayesian forecasting approach for Holt's additive exponential smoothing method. Starting from the state space formulation, a formula for the forecast is derived and reduced to a two-dimensional integration that can be computed numerically in a straightforward way. In contrast to much of the work on exponential smoothing, this method produces the forecast density and, in addition, it treats the initial level and initial trend as part of the parameters to be evaluated. Another contribution of this paper is a way to reduce the maximum likelihood parameter estimation procedure to the evaluation of a two-dimensional grid, rather than applying a five-variable optimization procedure. Simulation experiments confirm that both proposed methods give favorable performance compared to other approaches. Copyright © 2008 John Wiley & Sons, Ltd. [source]
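For reference, the deterministic recursion underlying Holt's additive method is shown below. The paper's contribution is the Bayesian treatment (forecast densities, integration over the smoothing parameters and the initial level and trend), which this sketch does not reproduce; the function name and the fixed parameter values are mine.

```python
def holt_forecast(y, alpha, beta, horizon=1, level0=None, trend0=None):
    """Holt's additive (linear-trend) exponential smoothing recursion
    with fixed smoothing constants alpha (level) and beta (trend)."""
    level = y[0] if level0 is None else level0
    trend = y[1] - y[0] if trend0 is None else trend0
    for t in range(1, len(y)):
        prev = level
        level = alpha * y[t] + (1 - alpha) * (level + trend)   # level update
        trend = beta * (level - prev) + (1 - beta) * trend     # trend update
    return [level + h * trend for h in range(1, horizon + 1)]  # h-step forecasts

print(holt_forecast([10.0, 11.2, 12.1, 13.4, 14.2], alpha=0.5, beta=0.3, horizon=3))
```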


    A Bayesian threshold nonlinearity test for financial time series

    JOURNAL OF FORECASTING, Issue 1 2005
    Mike K. P. So
Abstract We propose in this paper a threshold nonlinearity test for financial time series. Our approach adopts reversible-jump Markov chain Monte Carlo methods to calculate the posterior probabilities of two competing models, namely the GARCH and threshold GARCH models. Posterior evidence favouring the threshold GARCH model indicates threshold nonlinearity or volatility asymmetry. Simulation experiments demonstrate that our method works very well in distinguishing GARCH and threshold GARCH models. Sensitivity analysis shows that our method is robust to misspecification in the error distribution. In the application to 10 market indexes, clear evidence of threshold nonlinearity is discovered, thus supporting volatility asymmetry. Copyright © 2005 John Wiley & Sons, Ltd. [source]
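A minimal simulator for the kind of two-regime threshold GARCH the test discriminates is sketched below. The regime split on the sign of the previous return, the parameter values, and the function name are illustrative assumptions, not the paper's specification, and the reversible-jump MCMC test itself is not reproduced.

```python
import numpy as np

def simulate_tgarch(n, omega=(0.05, 0.10), alpha=(0.05, 0.15), beta=0.8,
                    threshold=0.0, seed=0):
    """Two-regime threshold GARCH(1,1): the variance equation switches on
    whether the previous return exceeds `threshold`."""
    rng = np.random.default_rng(seed)
    r, h = np.zeros(n), np.full(n, 0.1)
    for t in range(1, n):
        k = 0 if r[t - 1] > threshold else 1              # regime indicator
        h[t] = omega[k] + alpha[k] * r[t - 1] ** 2 + beta * h[t - 1]
        r[t] = np.sqrt(h[t]) * rng.standard_normal()
    return r, h

returns, variance = simulate_tgarch(1000)
```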


    A Bayesian nonlinearity test for threshold moving average models

    JOURNAL OF TIME SERIES ANALYSIS, Issue 5 2010
    Qiang Xia
We propose a Bayesian test for nonlinearity of threshold moving average (TMA) models. First, we obtain the marginal posterior densities of all parameters, including the threshold and delay, of the TMA model using a Gibbs sampler with the Metropolis–Hastings algorithm. Then we adopt reversible-jump Markov chain Monte Carlo methods to calculate the posterior probabilities for the MA and TMA models. Posterior evidence in favour of the TMA model indicates threshold nonlinearity. Simulation experiments and a real example show that our method works very well in distinguishing MA and TMA models. [source]


    First-order rounded integer-valued autoregressive (RINAR(1)) process

    JOURNAL OF TIME SERIES ANALYSIS, Issue 4 2009
    M. Kachour
Abstract. We introduce a new class of autoregressive models for integer-valued time series using the rounding operator. Compared with classical INAR models based on the thinning operator, the new models have several advantages: a simple innovation structure, autoregressive coefficients with arbitrary signs, possible negative values for the time series and possible negative values for the autocorrelation function. Focusing on the first-order RINAR(1) model, we give conditions for its ergodicity and stationarity. For parameter estimation, a least squares estimator is introduced and we prove its consistency under a suitable identifiability condition. Simulation experiments as well as analyses of real data sets are carried out to demonstrate the model's performance. [source]
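A plausible reading of the RINAR(1) recursion can be simulated in a few lines. The exact form below, and in particular the innovation law (a difference of two Poisson variates, so the noise is integer-valued with mean zero), is an assumption for illustration rather than the paper's definition.

```python
import numpy as np

def simulate_rinar1(n, alpha=-0.6, lam=1.0, seed=0):
    """One reading of the RINAR(1) recursion,
        X_t = round(alpha * X_{t-1} + lam) + eps_t.
    Unlike thinning-based INAR models, alpha may be negative and X_t may
    take negative values."""
    rng = np.random.default_rng(seed)
    x = np.zeros(n, dtype=int)
    for t in range(1, n):
        eps = int(rng.poisson(1.0) - rng.poisson(1.0))      # integer, mean zero
        x[t] = int(round(alpha * x[t - 1] + lam)) + eps     # round: half-to-even
    return x

print(simulate_rinar1(10))
```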


    A MODEL FOR BATCH ADVANCED AVAILABLE-TO-PROMISE

    PRODUCTION AND OPERATIONS MANAGEMENT, Issue 4 2002
    CHIEN-YU CHEN
The available-to-promise (ATP) function is becoming increasingly important in supply chain management since it directly links production resources with customer orders. In this paper, a mixed integer programming (MIP) ATP model is presented. This model can provide an order-promising and -fulfillment solution for a batch of orders that arrive within a predefined batching interval. A variety of constraints, such as raw material availability, production capacity, material compatibility, and customer preferences, are considered. Simulation experiments using the model investigate the sensitivity of supply chain performance to changes in certain parameters, such as batching interval size and customer order flexibility. [source]
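The flavour of the optimization can be seen in a toy linear relaxation: two orders competing for machine capacity and raw material. The real model is a mixed integer program with batching and preference constraints; every number below is invented, and scipy's LP solver stands in for a MIP solver.

```python
from scipy.optimize import linprog

# Toy batch ATP as a linear relaxation: maximize profit 5*x1 + 4*x2 subject to
#   2*x1 + 1*x2 <= 100   (production capacity, hours)
#   1*x1 + 3*x2 <= 90    (raw material, kg)
#   0 <= x_i <= ordered quantity
res = linprog(c=[-5, -4],                        # linprog minimizes, so negate
              A_ub=[[2, 1], [1, 3]], b_ub=[100, 90],
              bounds=[(0, 40), (0, 25)])
print(res.x)                                     # promised quantities per order
```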


    USING LEAST SQUARE SVM FOR NONLINEAR SYSTEMS MODELING AND CONTROL

    ASIAN JOURNAL OF CONTROL, Issue 2 2007
    Haoran Zhang
ABSTRACT The support vector machine is a learning technique based on the structural risk minimization principle, and it is also a class of regression method with good generalization ability. This paper first introduces the mathematical model of the regression least squares support vector machine (LSSVM) and designs incremental learning algorithms using block-matrix calculation formulae. It then uses the LSSVM to model nonlinear systems and, on the basis of that model, controls them by the model predictive method. Simulation experiments indicate that the proposed method provides satisfactory performance: it achieves modeling performance superior to the conventional neural-network-based method, as well as good control performance. [source]
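The batch LSSVM regression solve that the paper starts from reduces to one linear system; a minimal sketch with an RBF kernel follows. The incremental block-matrix update and the predictive controller are the paper's contributions and are not shown, and the function names and hyperparameter values here are mine.

```python
import numpy as np

def lssvm_fit(X, y, gamma=10.0, sigma=1.0):
    """Train an LSSVM regressor by solving its KKT linear system:
        [0      1^T        ] [b]   [0]
        [1  K + I/gamma    ] [a] = [y]
    with an RBF kernel. Returns (dual coefficients a, bias b)."""
    n = len(y)
    K = np.exp(-np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=2)
               / (2 * sigma ** 2))
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate([[0.0], y]))
    return sol[1:], sol[0]

def lssvm_predict(X_train, a, b, X_new, sigma=1.0):
    K = np.exp(-np.sum((X_new[:, None, :] - X_train[None, :, :]) ** 2, axis=2)
               / (2 * sigma ** 2))
    return K @ a + b

X = np.linspace(0, 6, 40)[:, None]
y = np.sin(X).ravel()
a, b = lssvm_fit(X, y)
print(lssvm_predict(X, a, b, np.array([[1.5]])))   # close to sin(1.5)
```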


    Labour Market Policies and Long-term Unemployment in a Flow Model of the Australian Labour Market

    AUSTRALIAN ECONOMIC PAPERS, Issue 2 2003
    Ric D. Herbert
This paper develops a general equilibrium job matching model, which is used to assess the impact of active labour market policies, reductions in unemployment benefits and reductions in worker bargaining power on long-term unemployment and other key macro variables. The model is calibrated using Australian data. Simulation experiments are conducted through impulse response analysis. The simulations suggest that active labour market programs (ALMPs) targeted at the long-term unemployed have a small net impact and produce adverse spillover effects on short-term unemployment. Reducing the level of unemployment benefits relative to wages and reducing worker bargaining power have more substantial effects on total and long-term unemployment, with none of the spillover effects of ALMPs. [source]


    On Estimating Medical Cost and Incremental Cost-Effectiveness Ratios with Censored Data

    BIOMETRICS, Issue 4 2001
    Hongwei Zhao
    Summary. Medical cost estimation is very important to health care organizations and health policy makers. We consider cost-effectiveness analysis for competing treatments in a staggered-entry, survival-analysis-based clinical trial. We propose a method for estimating mean medical cost over patients in such settings. The proposed estimator is shown to be consistent and asymptotically normal, and its asymptotic variance can be obtained. In addition, we propose a method for estimating the incremental cost-effectiveness ratio and for obtaining a confidence interval for it. Simulation experiments are conducted to evaluate our proposed methods. Finally, we apply our methods to a clinical trial comparing the cost effectiveness of implanted cardiac defibrillators with conventional therapy for individuals at high risk for ventricular arrhythmias. [source]
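The best-known device for cost estimation under censoring, weighting complete cases by the inverse of the estimated censoring survival, is sketched below. This is a textbook-style illustration of the weighting idea (ties handled naively), not the authors' estimator; their variance formulas and the ICER confidence interval are developed in the paper itself.

```python
import numpy as np

def km_censoring_survival(times, delta):
    """Kaplan-Meier estimate of the censoring survival K(t) = P(C > t),
    treating censoring (delta == 0) as the event; ties handled naively."""
    order = np.argsort(times)
    t, d = np.asarray(times)[order], np.asarray(delta)[order]
    surv, at_risk, K = 1.0, len(t), {}
    for i in range(len(t)):
        if d[i] == 0:                            # a censoring "event"
            surv *= (at_risk - 1) / at_risk
        at_risk -= 1
        K[t[i]] = surv
    return K

def ipw_mean_cost(costs, times, delta):
    """Inverse-probability-weighted mean cost: complete (uncensored) cases
    weighted by 1 / K_hat(T_i)."""
    K = km_censoring_survival(times, delta)
    w = np.array([delta[i] / max(K[times[i]], 1e-12) for i in range(len(costs))])
    return float(np.sum(w * np.asarray(costs)) / len(costs))

# hypothetical data: (cost, follow-up time, death indicator; 0 = censored)
print(ipw_mean_cost([12.0, 30.5, 8.1, 22.4, 17.9],
                    [1.0, 4.2, 0.7, 3.1, 2.5],
                    [1, 0, 1, 1, 0]))
```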


    The practice of non-parametric estimation by solving inverse problems: the example of transformation models

    THE ECONOMETRICS JOURNAL, Issue 3 2010
    Frédérique Fève
Summary: This paper considers a semi-parametric version of the transformation model φ(Y) = β′X + U under exogeneity or instrumental variables assumptions (E(U | X) = 0 or E(U | W) = 0, where W denotes the instruments). This model is used as an example to illustrate the practice of estimation by solving linear functional equations. The paper focuses in particular on the data-driven selection of the regularization parameter and of the bandwidths. Simulation experiments illustrate the relevance of this approach. [source]
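The regularization step central to such inverse-problem estimation is, in its simplest Tikhonov form, one line of linear algebra. The sketch below is generic, not the paper's estimator, and its data-driven choice of the regularization parameter and bandwidths is exactly what this sketch leaves out.

```python
import numpy as np

def tikhonov_solve(K, r, alpha):
    """Regularized solution of the discretized linear functional equation
    K phi = r:  phi_alpha = (alpha I + K^T K)^{-1} K^T r.
    alpha > 0 trades bias for stability when K is ill-posed."""
    n = K.shape[1]
    return np.linalg.solve(alpha * np.eye(n) + K.T @ K, K.T @ r)
```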


    Temperature drop analysis of the thruster in a space cryogenic environment

    HEAT TRANSFER - ASIAN RESEARCH (FORMERLY HEAT TRANSFER-JAPANESE RESEARCH), Issue 2 2007
    Ze-Juan Xiao
Abstract Based on the conservation of energy, a coupled heat-transfer physical model and a set of mathematical equations are put forward to calculate the main components of the thruster, including the capillary injection tube, the aggregate organ, the injection plate, and the bracket, when they are exposed to a space cryogenic environment. The typical temperature drop course of a 10N monopropellant thruster has been calculated by this computational model. The calculation results agree well with test data from a vacuum cryogenic simulation experiment performed on the ground. The temperature of the injection tube provides the thermal boundary conditions for calculating the temperature drop of the propellant flowing through it, which in turn provides the criterion for judging whether the propellant freezes. Because the upper stage has no air conditioning, the injection tube is a weak link in the cryogenic reliability of the thruster and is considered one of the most important subjects of the reliability research. © 2007 Wiley Periodicals, Inc. Heat Trans Asian Res, 36(2): 85–95, 2007; Published online in Wiley InterScience (www.interscience.wiley.com). DOI 10.1002/htj.20144 [source]


    Evaluation of operators' performance for automation design in the fully digital control room of nuclear power plants

    HUMAN FACTORS AND ERGONOMICS IN MANUFACTURING & SERVICE INDUSTRIES, Issue 1 2010
    Chiuhsiang Joe Lin
Abstract Recent technical developments in computer hardware and software have meant that human–machine systems can be automated in many respects. If automation fails, however, human operators can have difficulty in recognizing the existence of a problem, identifying what has failed, and taking corrective action to remedy these out-of-the-loop (OOTL) performance problems. Several studies have suggested that taxonomies of levels of automation (LOAs) and types of automation (TOAs) can be used to solve OOTL problems. This study examined the impact of LOAs in process control automation within the context of nuclear power plants (NPPs). A simulation experiment in an NPP is performed to validate this framework using an automatic mode and a semiautomatic mode. Mental demand is found to be significantly reduced under the automatic mode; however, participants felt frustrated with this LOA. Situation awareness is found to be similar in the two modes. The results of an end-of-experiment subjective rating reveal that participants were evenly divided between the two modes with respect to generating and selecting functions. It is therefore suggested that human operators be involved in generating and selecting functions under an automatic mode. © 2009 Wiley Periodicals, Inc. [source]


    Building neural network models for time series: a statistical approach

    JOURNAL OF FORECASTING, Issue 1 2006
    Marcelo C. Medeiros
    Abstract This paper is concerned with modelling time series by single hidden layer feedforward neural network models. A coherent modelling strategy based on statistical inference is presented. Variable selection is carried out using simple existing techniques. The problem of selecting the number of hidden units is solved by sequentially applying Lagrange multiplier type tests, with the aim of avoiding the estimation of unidentified models. Misspecification tests are derived for evaluating an estimated neural network model. All the tests are entirely based on auxiliary regressions and are easily implemented. A small-sample simulation experiment is carried out to show how the proposed modelling strategy works and how the misspecification tests behave in small samples. Two applications to real time series, one univariate and the other multivariate, are considered as well. Sets of one-step-ahead forecasts are constructed and forecast accuracy is compared with that of other nonlinear models applied to the same series. Copyright © 2006 John Wiley & Sons, Ltd. [source]
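A toy version of the model class, a single-hidden-layer feedforward autoregression fitted by plain gradient descent, is sketched below. The paper's variable selection, Lagrange multiplier type tests, and estimation details are not reproduced; the function name, network size, and all hyperparameters here are arbitrary.

```python
import numpy as np

def fit_ar_nn(y, p=2, hidden=3, lr=0.01, epochs=2000, seed=0):
    """Fit y_t ~ w2 . tanh(W1 [y_{t-1},...,y_{t-p}] + b1) + b2
    by gradient descent on squared error."""
    rng = np.random.default_rng(seed)
    y = np.asarray(y, float)
    X = np.column_stack([y[p - i - 1 : len(y) - i - 1] for i in range(p)])
    tgt = y[p:]
    W1 = rng.normal(0, 0.5, (hidden, p)); b1 = np.zeros(hidden)
    w2 = rng.normal(0, 0.5, hidden); b2 = 0.0
    for _ in range(epochs):
        H = np.tanh(X @ W1.T + b1)              # hidden activations
        e = H @ w2 + b2 - tgt                   # residuals
        G = (1 - H ** 2) * np.outer(e, w2)      # back-prop through tanh
        W1 -= lr * G.T @ X / len(tgt); b1 -= lr * G.mean(axis=0)
        w2 -= lr * H.T @ e / len(tgt); b2 -= lr * e.mean()
    return W1, b1, w2, b2

rng = np.random.default_rng(1)
y = np.zeros(300)
for t in range(2, 300):                          # a mildly nonlinear AR(2) toy series
    y[t] = 0.6 * y[t - 1] - 0.3 * y[t - 2] + 0.2 * np.tanh(y[t - 1]) + rng.normal(0, 0.1)
W1, b1, w2, b2 = fit_ar_nn(y)
```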


    Consistent estimation of binary-choice panel data models with heterogeneous linear trends

    THE ECONOMETRICS JOURNAL, Issue 2 2006
    Alban Thomas
Summary: This paper presents an extension of fixed effects binary choice models for panel data, to the case of heterogeneous linear trends. Two estimators are proposed: a Logit estimator based on double conditioning and a semiparametric, smoothed maximum score estimator based on double differences. We investigate small-sample properties of these estimators with a Monte Carlo simulation experiment, and compare their statistical properties with standard fixed effects procedures. An empirical application to land renting decisions of Russian households between 1996 and 2002 is proposed. [source]


    Supplementary oxygen and temperature management during live transportation of greenlip abalone, Haliotis laevigata (Donovan, 1808)

    AQUACULTURE RESEARCH, Issue 7 2009
    Erin J Bubner
Abstract Live greenlip abalone, Haliotis laevigata, are highly valued in Australian export markets with demand increasingly being met with cultured stock. Live transportation of abalone requires the maintenance of favourable conditions within transport containers for periods exceeding 35 h. We examined the combined effects of temperature regulation (ice provision) and of supplemental oxygen (60% and 100% concentrations) on mortality rates of abalone over 7 days following a 35-h simulated live-transport experiment. We also examined the physiological condition of greenlip abalone (oxygen consumption rate, haemolymph pH and weight) during the simulation experiment. The provision of ice and supplementary oxygen reduced abalone mortalities. Omission of ice and supplementary oxygen during the transport simulation resulted in mortality rates ranging from 70% to 100%. The addition of ice to containers with ambient oxygen concentrations decreased average mortality rates by 50%. While supplementary oxygen further reduced these rates, the provision of both ice and 100% oxygen was by far the most effective combination, reducing mortalities to between 2% and 6%. Supplementary oxygen increased oxygen consumption rates of abalone above those transported at ambient oxygen concentrations. Live-transport decreased haemolymph pH in all treatments but was most pronounced in treatments without ice or supplementary oxygen. On average, abalone lost 7–13% of their weight during the simulation but this loss was independent of transport treatment. [source]


    HEAVY-TAILED-DISTRIBUTED THRESHOLD STOCHASTIC VOLATILITY MODELS IN FINANCIAL TIME SERIES

    AUSTRALIAN & NEW ZEALAND JOURNAL OF STATISTICS, Issue 1 2008
    Cathy W. S. Chen
    Summary To capture mean and variance asymmetries and time-varying volatility in financial time series, we generalize the threshold stochastic volatility (THSV) model and incorporate a heavy-tailed error distribution. Unlike existing stochastic volatility models, this model simultaneously accounts for uncertainty in the unobserved threshold value and in the time-delay parameter. Self-exciting and exogenous threshold variables are considered to investigate the impact of a number of market news variables on volatility changes. Adopting a Bayesian approach, we use Markov chain Monte Carlo methods to estimate all unknown parameters and latent variables. A simulation experiment demonstrates good estimation performance for reasonable sample sizes. In a study of two international financial market indices, we consider two variants of the generalized THSV model, with US market news as the threshold variable. Finally, we compare models using Bayesian forecasting in a value-at-risk (VaR) study. The results show that our proposed model can generate more accurate VaR forecasts than can standard models. [source]


    PATTERN RECOGNITION VIA ROBUST SMOOTHING WITH APPLICATION TO LASER DATA

    AUSTRALIAN & NEW ZEALAND JOURNAL OF STATISTICS, Issue 2 2007
    Carlo Grillenzoni
Summary Nowadays airborne laser scanning is used in many territorial studies, providing point data which may contain strong discontinuities. Motivated by the need to interpolate such data and preserve their edges, this paper considers robust nonparametric smoothers. These estimators, when implemented with bounded loss functions, have suitable jump-preserving properties. Iterative algorithms are developed here, and are equivalent to nonlinear M-smoothers, but have the advantage of resembling linear kernel regression. The selection of their coefficients is carried out by combining cross-validation and robust-tuning techniques. Two real case studies and a simulation experiment confirm the validity of the method; in particular, the performance in building recognition is excellent. [source]
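The core idea, an iterated kernel smoother whose weights are damped by a bounded influence function of the current residuals so that points across an edge stop contributing, fits in a few lines. This is a sketch in the spirit of such M-smoothers, not the paper's algorithm: the bandwidth h and robustness scale c below are hand-picked rather than chosen by the cross-validation and robust-tuning procedure described.

```python
import numpy as np

def robust_kernel_smoother(x, y, h=0.05, c=0.5, iters=10):
    """Jump-preserving M-smoother as an iterated, reweighted kernel regression."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    Kx = np.exp(-0.5 * ((x[:, None] - x[None, :]) / h) ** 2)   # kernel matrix
    m = (Kx @ y) / Kx.sum(axis=1)                              # linear (non-robust) start
    for _ in range(iters):
        res = y[None, :] - m[:, None]                          # residuals from current fit
        W = Kx * np.exp(-0.5 * (res / c) ** 2)                 # bounded-loss damping
        m = (W * y[None, :]).sum(axis=1) / W.sum(axis=1)
    return m

x = np.linspace(0, 1, 200)
y = (x > 0.5).astype(float) + np.random.default_rng(0).normal(0, 0.1, 200)
m = robust_kernel_smoother(x, y)            # preserves the jump at x = 0.5
```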


    MAXIMUM LIKELIHOOD ESTIMATION FOR A POISSON RATE PARAMETER WITH MISCLASSIFIED COUNTS

    AUSTRALIAN & NEW ZEALAND JOURNAL OF STATISTICS, Issue 2 2005
    James D. Stamey
Summary This paper proposes a Poisson-based model that uses both error-free data and error-prone data subject to misclassification in the form of false-negative and false-positive counts. It derives maximum likelihood estimators (MLEs) for the Poisson rate parameter and the two misclassification parameters: the false-negative parameter and the false-positive parameter. It also derives expressions for the information matrix and the asymptotic variances of the MLE for the rate parameter, the MLE for the false-positive parameter, and the MLE for the false-negative parameter. Using these expressions the paper analyses the value of the fallible data. It studies characteristics of the new double-sampling rate estimator via a simulation experiment and applies the new MLE estimators and confidence intervals to a real dataset. [source]


    Testing the Ratio of Two Poisson Rates

    BIOMETRICAL JOURNAL, Issue 2 2008
    Kangxia Gu
Abstract In this paper we compare the properties of four different general approaches for testing the ratio of two Poisson rates. Asymptotically normal tests, tests based on approximate p-values, exact conditional tests, and a likelihood ratio test are considered. The properties and power performance of these tests are studied by a Monte Carlo simulation experiment. Sample size calculation formulae are given for each of the test procedures and their validities are studied. Some recommendations favoring the likelihood ratio and certain asymptotic tests are based on these simulation results. Finally, all of the test procedures are illustrated with two real life medical examples. (© 2008 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim) [source]
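Two of the four families compared are easy to state concretely: the exact conditional test (given the total, the first count is binomial) and a Wald-type asymptotic test on the log rate ratio. The sketch below, with x_i events over exposure t_i, is a standard formulation and not necessarily the exact variants studied in the paper.

```python
import math
from scipy import stats

def poisson_ratio_tests(x1, t1, x2, t2, rho0=1.0):
    """Test H0: lambda1/lambda2 = rho0 from counts x_i over exposures t_i.
    (i) Exact conditional: given x1 + x2, x1 ~ Binomial with success
        probability t1*rho0 / (t1*rho0 + t2) under H0.
    (ii) Wald-type asymptotic test on the log rate ratio (needs x1, x2 > 0)."""
    p0 = t1 * rho0 / (t1 * rho0 + t2)
    exact_p = stats.binomtest(x1, x1 + x2, p0).pvalue
    z = (math.log((x1 / t1) / (x2 / t2)) - math.log(rho0)) \
        / math.sqrt(1 / x1 + 1 / x2)
    wald_p = 2 * stats.norm.sf(abs(z))
    return exact_p, wald_p

print(poisson_ratio_tests(x1=30, t1=100.0, x2=18, t2=120.0))
```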


    Measurement Error in a Random Walk Model with Applications to Population Dynamics

    BIOMETRICS, Issue 4 2006
    John Staudenmayer
    Summary Population abundances are rarely, if ever, known. Instead, they are estimated with some amount of uncertainty. The resulting measurement error has its consequences on subsequent analyses that model population dynamics and estimate probabilities about abundances at future points in time. This article addresses some outstanding questions on the consequences of measurement error in one such dynamic model, the random walk with drift model, and proposes some new ways to correct for measurement error. We present a broad and realistic class of measurement error models that allows both heteroskedasticity and possible correlation in the measurement errors, and we provide analytical results about the biases of estimators that ignore the measurement error. Our new estimators include both method of moments estimators and "pseudo"-estimators that proceed from both observed estimates of population abundance and estimates of parameters in the measurement error model. We derive the asymptotic properties of our methods and existing methods, and we compare their finite-sample performance with a simulation experiment. We also examine the practical implications of the methods by using them to analyze two existing population dynamics data sets. [source]
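The central bias, and one simple moment-based correction consistent with the article's setup, can be seen in a short simulation: differencing the observed series inflates the estimated process variance by twice the measurement-error variance, while the lag-1 autocovariance of the differences estimates the negative of that error variance. The correction below is a generic method-of-moments device under i.i.d. measurement error, not necessarily the authors' preferred estimator.

```python
import numpy as np

rng = np.random.default_rng(1)
mu, sig2, tau2, n = 0.05, 0.04, 0.09, 5000
N = np.cumsum(mu + rng.normal(0, np.sqrt(sig2), n))   # true random walk with drift
Y = N + rng.normal(0, np.sqrt(tau2), n)               # observed with measurement error

d = np.diff(Y)
naive = np.var(d, ddof=1)                # estimates sig2 + 2*tau2 (biased upward)
lag1 = np.cov(d[:-1], d[1:])[0, 1]       # estimates -tau2
corrected = naive + 2 * lag1             # method-of-moments estimate of sig2
print(f"naive {naive:.3f}  corrected {corrected:.3f}  true {sig2:.3f}")
```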


    Effect of Cross Linking Agent on Alkali/Surfactant/Polymer

    CHINESE JOURNAL OF CHEMISTRY, Issue 1 2008
    Ke ZHANG
Abstract The alkali/surfactant/polymer (ASP) multisystem flooding technique, which has broad application prospects, is one of the enhanced oil recovery (EOR) methods. By adding organic chromium to the ASP, the molecular structure of the polymer was changed and the capability of the ASP to control the mobility coefficient was improved. The results showed that the multisystem could still maintain ultra-low interfacial tension between the multisystem and crude oil after the addition of Cr3+. The resistance factor and residual resistance factor, the indicators that describe the capability of controlling mobility, increased strikingly, and the storage modulus and loss modulus, the indicators that describe viscoelasticity, also increased. The results of a physical simulation experiment indicated that this improved ASP could increase the recovery ratio by 4.3% compared to the common ASP multisystem. [source]


    Investigating Driver Injury Severity in Traffic Accidents Using Fuzzy ARTMAP

    COMPUTER-AIDED CIVIL AND INFRASTRUCTURE ENGINEERING, Issue 6 2002
    Hassan T. Abdelwahab
This paper applies fuzzy adaptive resonance theory MAP (fuzzy ARTMAP) neural networks to analyze and predict injury severity for drivers involved in traffic accidents. The paper presents a modified version of fuzzy ARTMAP in which the training patterns are ordered using the K-means algorithm before being presented to the neural network. The paper presents three applications of fuzzy ARTMAP for analyzing driver injury severity for drivers involved in accidents on highways, signalized intersections, and toll plazas. The analysis is based on central Florida's traffic accident database. Results showed that the ordered fuzzy ARTMAP reduced the network size and improved performance. To facilitate the application of fuzzy ARTMAP, a series of simulation experiments to extract knowledge from the models was suggested. Results of the fuzzy ARTMAP neural network showed that female drivers experience higher severity levels than male drivers. Vehicle speed at the time of an accident increases the likelihood of high injury severity. Wearing a seat belt decreases the chance of having severe injuries. Drivers in passenger cars are more likely to experience a higher injury severity level than those in vans or pickup trucks. Point of impact, area type, driving under the influence, and driver age were also among the factors that influence the severity level. [source]


    Research Methods of Inquiry

    ACADEMIC EMERGENCY MEDICINE, Issue 11 2006
    Joel Rodgers MA
Incidents of significant consequence that create surge may require special research methods to provide reliable, generalizable results. This report was constructed through a process of literature review, expert panel discussion at the journal's consensus conference, and iterative development. Traditional clinical research methods that are well accepted in medicine are exceptionally difficult to use for surge incidents because the incidents are very difficult to reliably predict, the consequences vary widely, human behaviors are heterogeneous in response to incidents, and temporal conditions prioritize limited resources for response rather than data collection. Current literature on surge research methods has found some degree of reliability and generalizability in case-control designs, postincident survey methods, and ethnographic designs. Novel methods that show promise for studying surge include carefully validated simulation experiments and survey methods that produce validated results from representative populations. Methodologists and research scientists should consider quasi-experimental designs and case-control studies in areas with recurrent high-consequence incidents (e.g., earthquakes and hurricanes). Specialists that need to be well represented in this area of research include emergency physicians and critical care physicians, simulation engineers, cost economists, sociobehavioral methodologists, and others. [source]


    The Macro-Economic Effects of Directed Credit Policies: A Real-Financial CGE Evaluation for India

    DEVELOPMENT AND CHANGE, Issue 3 2001
    C. W. M. Naastepad
    The effectiveness of directed credit programmes as an instrument for economic development is the subject of considerable debate. However, the focus of this debate is almost exclusively on the intra-sectoral effects of directed credit and its adverse effects on financial sector performance, neglecting possible spillover effects on demand, production and investment in the rest of the economy. This article tries to fill this gap by examining the macro-economic effects of directed credit in India with the help of a novel real-financial computable general equilibrium (CGE) model. Focusing on credit rather than money, the model goes beyond earlier modelling approaches by (1) incorporating directed credit policy and credit rationing; (2) recognizing the dual role of credit for working capital and investment; and (3) allowing for switches between credit-constrained, capacity-constrained and demand-constrained regimes. The results from short- and medium-term simulation experiments with the model indicate that, when credit market failures result in rationing as in India's agricultural and small-scale industrial sectors, the macro-economic effects of directed credit are likely to be significant and positive. [source]


    Soil detachment and transport on field- and laboratory-scale interrill areas: erosion processes and the size-selectivity of eroded sediment

    EARTH SURFACE PROCESSES AND LANDFORMS, Issue 8 2006
    O. Malam Issa
Abstract Field- and laboratory-scale rainfall simulation experiments were carried out in an investigation of the temporal variability of erosion processes on interrill areas, and the effects of such variation upon sediment size characteristics. Poorly aggregated sandy soils from the semi-arid environment of Senegal, West Africa, were used on both a 40 m2 field plot and a 0·25 m2 laboratory plot; rainfall intensity for all experiments was 70 mm h⁻¹ with a duration of 1 to 2 hours. Time-series measurements were made of the quantity and the size distribution of eroded material: these permitted an estimate of the changing temporal balance between the main erosion processes (splash and wash). Results from both spatial scales showed a similar temporal pattern of runoff generation and sediment concentration. For both spatial scales, the dominant erosional process was detachment by raindrops; this resulted in a dynamic evolution of the soil surface under raindrop impact, with the rapid formation of a sieving crust followed by an erosion crust. However, a clear difference was observed between the two scales regarding the size of particles detached by both splash and wash. While all measured values were lower than the mean weight diameter (MWD) value of the original soil (mean 0·32 mm), demonstrating the size-selective nature of wash and splash processes, the MWD values of washed and splashed particles at the field scale ranged from 0·08 to 0·16 mm and from 0·12 to 0·30 mm respectively, whereas the MWD values of washed and splashed particles at the laboratory scale ranged from 0·13 to 0·29 mm and from 0·21 to 0·32 mm respectively. Thus only at the field scale were the soil particles detached by splash notably coarser than those transported by wash. This suggests a transport-limited erosion process at the field scale. Differences were also observed between the dynamics of the soil loss by wash at the two scales, since results showed wider scatter in the field compared to the laboratory experiments. This scatter is probably related to the change in soil surface characteristics due to the size-selectivity of the erosion processes at this spatial scale. Copyright © 2006 John Wiley & Sons, Ltd. [source]
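For reference, the mean weight diameter used here is the mass-fraction-weighted mean of the size classes' midpoint diameters. A minimal computation follows; the sieve classes and fractions are invented for illustration.

```python
def mean_weight_diameter(class_bounds_mm, mass_fractions):
    """MWD = sum over size classes of (class midpoint diameter * mass fraction)."""
    assert abs(sum(mass_fractions) - 1.0) < 1e-9, "fractions must sum to 1"
    mids = [(lo + hi) / 2 for lo, hi in class_bounds_mm]
    return sum(d * w for d, w in zip(mids, mass_fractions))

# e.g. four hypothetical sieve classes (mm) with hypothetical mass fractions
print(mean_weight_diameter([(0, 0.05), (0.05, 0.1), (0.1, 0.25), (0.25, 0.5)],
                           [0.2, 0.3, 0.3, 0.2]))
```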