Modeling Framework (modeling + framework)

Selected Abstracts
A Modeling Framework for Supply Chain Simulation: Opportunities for Improved Decision Making. DECISION SCIENCES, Issue 1, 2005. D. J. Van Der Zee.
ABSTRACT: Owing to its inherent modeling flexibility, simulation is often regarded as the proper means for supporting decision making on supply chain design. The ultimate success of supply chain simulation, however, is determined by a combination of the analyst's skills, the chain members' involvement, and the modeling capabilities of the simulation tool. This combination should provide the basis for a realistic simulation model, which is both transparent and complete. The need for transparency is especially strong for supply chains as they involve (semi)autonomous parties each having their own objectives. Mutual trust and model effectiveness are strongly influenced by the degree of completeness of each party's insight into the key decision variables. Ideally, visual interactive simulation models present an important communicative means for realizing the required overview and insight. Unfortunately, most models strongly focus on physical transactions, leaving key decision variables implicit for some or all of the parties involved. This especially applies to control structures, that is, the managers or systems responsible for control, their activities and their mutual attuning of these activities. Control elements are, for example, dispersed over the model, are not visualized, or form part of the time-indexed scheduling of events. In this article, we propose an alternative approach that explicitly addresses the modeling of control structures. First, we will conduct a literature survey with the aim of listing simulation model qualities essential for supporting successful decision making on supply chain design. Next, we use this insight to define an object-oriented modeling framework that facilitates supply chain simulation in a more realistic manner. This framework is meant to contribute to improved decision making in terms of recognizing and understanding opportunities for improved supply chain design. Finally, the use of the framework is illustrated by a case example concerning a supply chain for chilled salads. [source]
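The framework itself is not reproduced in this listing, but the abstract's central point, that control structures should be explicit model elements rather than being buried in time-indexed event scheduling, can be illustrated with a minimal object-oriented sketch. The class names, the order-up-to policy, and all numbers below are hypothetical illustrations, not the authors' framework.

```python
# Minimal object-oriented sketch of a supply chain in which the control
# structure (who decides, and by what rule) is an explicit, inspectable
# object rather than being hidden inside event scheduling.

import random

class Stage:
    """A supply chain party holding inventory and facing demand."""
    def __init__(self, name, inventory):
        self.name = name
        self.inventory = inventory

class OrderUpToController:
    """Explicit control element: reviews a stage and places replenishment orders."""
    def __init__(self, stage, order_up_to):
        self.stage = stage
        self.order_up_to = order_up_to

    def decide(self):
        # The decision rule is visible to every chain member, not implicit.
        return max(0, self.order_up_to - self.stage.inventory)

retailer = Stage("retailer", inventory=40)
controller = OrderUpToController(retailer, order_up_to=50)

random.seed(1)
for week in range(8):
    demand = random.randint(5, 15)
    sold = min(demand, retailer.inventory)
    retailer.inventory -= sold
    order = controller.decide()      # the control decision, made explicit
    retailer.inventory += order      # assume delivery before the next review
    print(f"week {week}: demand={demand} sold={sold} order={order} "
          f"inventory={retailer.inventory}")
```

Making the controller a first-class object is the design point: its policy can be inspected, visualized, and swapped out without touching the physical flow logic.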
Direct Manipulation and Interactive Sculpting of PDE Surfaces. COMPUTER GRAPHICS FORUM, Issue 3, 2000. Haixia Du.
This paper presents an integrated approach and a unified algorithm that combine the benefits of PDE surfaces and powerful physics-based modeling techniques within one single modeling framework, in order to realize the full potential of PDE surfaces. We have developed a novel system that allows direct manipulation and interactive sculpting of PDE surfaces at arbitrary location, hence supporting various interactive techniques beyond the conventional boundary control. Our prototype software affords users to interactively modify point, normal, curvature, and arbitrary region of PDE surfaces in a predictable way. We employ several simple, yet effective numerical techniques including the finite-difference discretization of the PDE surface, the multigrid-like subdivision on the PDE surface, the mass-spring approximation of the elastic PDE surface, etc. to achieve real-time performance. In addition, our dynamic PDE surfaces can also be approximated using standard bivariate B-spline finite elements, which can subsequently be sculpted and deformed directly in real-time subject to intrinsic PDE constraints. Our experiments demonstrate many attractive advantages of our dynamic PDE formulation such as intuitive control, real-time feedback, and usability to the general public. [source]

Biotic ligand model of the acute toxicity of metals. ENVIRONMENTAL TOXICOLOGY & CHEMISTRY, Issue 10, 2001.
Abstract: The biotic ligand model (BLM) was developed to explain and predict the effects of water chemistry on the acute toxicity of metals to aquatic organisms. The biotic ligand is defined as a specific receptor within an organism where metal complexation leads to acute toxicity. The BLM is designed to predict metal interactions at the biotic ligand within the context of aqueous metal speciation and competitive binding of protective cations such as calcium. Toxicity is defined as accumulation of metal at the biotic ligand at or above a critical threshold concentration. This modeling framework provides mechanistic explanations for the observed effects of aqueous ligands, such as natural organic matter, and water hardness on metal toxicity. In this paper, the development of a copper version of the BLM is described. The calibrated model is then used to calculate LC50 (the lethal concentration for 50% of test organisms) and is evaluated by comparison with published toxicity data sets for freshwater fish (fathead minnow, Pimephales promelas) and Daphnia. [source]
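A rough sketch of the BLM logic is competitive equilibrium binding at the biotic ligand: the free metal ion competes with protective cations for a fixed pool of sites, and the LC50 is the metal activity at which occupancy reaches a critical threshold. The stability constants and critical occupancy below are placeholder values, not the calibrated copper parameters from the paper.

```python
# Sketch of the biotic ligand idea: free Cu2+ competes with protective cations
# (Ca2+, H+) for biotic ligand sites; toxicity is predicted when the fraction
# of sites occupied by Cu reaches a critical threshold. Constants are
# illustrative placeholders, not calibrated values.

log_K = {"Cu": 7.4, "Ca": 3.6, "H": 5.4}   # hypothetical stability constants (L/mol)
f_crit = 0.30                               # hypothetical critical occupancy

def cu_fraction_bound(cu_free, ca_free, h_free):
    """Fraction of biotic ligand sites occupied by free Cu2+ (molar inputs)."""
    k_cu, k_ca, k_h = (10 ** log_K[i] for i in ("Cu", "Ca", "H"))
    bound = k_cu * cu_free
    return bound / (1.0 + bound + k_ca * ca_free + k_h * h_free)

def predicted_lc50(ca_free, h_free):
    """Free-Cu activity at which occupancy reaches the critical threshold."""
    competition = 1.0 + 10 ** log_K["Ca"] * ca_free + 10 ** log_K["H"] * h_free
    return f_crit / (1.0 - f_crit) * competition / 10 ** log_K["Cu"]

# Hardness effect: more Ca2+ means more Cu is needed to reach f_crit.
for ca in (2e-4, 1e-3, 3e-3):
    print(f"Ca = {ca:.0e} M -> predicted LC50 (free Cu) = "
          f"{predicted_lc50(ca, h_free=1e-7):.2e} M")
```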
GENETIC STUDY: The dopamine D4 Receptor (DRD4) gene exon III polymorphism, problematic alcohol use and novelty seeking: direct and mediated genetic effects. ADDICTION BIOLOGY, Issue 2, 2009. Lara A. Ray.
ABSTRACT: The present study sought to integrate convergent lines of research on the associations among the dopamine D4 receptor (DRD4) gene, novelty seeking and drinking behaviors with the overall goal of elucidating genetic influences on problematic drinking in young adulthood. Specifically, this study tested a model in which novelty seeking mediated the relationship between DRD4 variable number of tandem repeats (VNTR) genotype and problematic alcohol use. Participants (n = 90, 40 females) were heavy-drinking college students. Analyses using a structural equation modeling framework suggested that the significant direct path between DRD4 VNTR genotype and problematic alcohol use was reduced to a trend level in the context of a model that included novelty seeking as a mediator, thereby suggesting that the effects of DRD4 VNTR genotype on problematic alcohol use among heavy-drinking young adults were partially mediated by novelty seeking. Cross-group comparisons indicated that the relationships among the model variables were not significantly different in models for men versus women. These results extend recent findings of the association between this polymorphism of the DRD4 receptor gene, problematic alcohol use and novelty seeking. These findings may also help elucidate the specific pathways of risk associated with genetic influences on alcohol use and abuse phenotypes. [source]
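The mediation structure tested in this study (genotype to novelty seeking to problematic drinking) can be sketched with ordinary regressions and the product-of-coefficients estimate of the indirect effect; the paper itself used a full structural equation model with cross-group comparisons, which is not reproduced here. Data and effect sizes below are synthetic.

```python
# Product-of-coefficients sketch of mediation: genotype -> novelty seeking
# (mediator) -> problematic alcohol use. All data are simulated.

import numpy as np

rng = np.random.default_rng(0)
n = 90
genotype = rng.integers(0, 2, n).astype(float)    # toy coding: 1 = long-allele carrier
novelty = 0.5 * genotype + rng.normal(0, 1, n)    # mediator
drinking = 0.6 * novelty + 0.1 * genotype + rng.normal(0, 1, n)

def ols(y, *xs):
    X = np.column_stack([np.ones(len(y)), *xs])
    return np.linalg.lstsq(X, y, rcond=None)[0]

a = ols(novelty, genotype)[1]                       # genotype -> mediator
b, c_prime = ols(drinking, novelty, genotype)[1:]   # mediator path and direct path
c_total = ols(drinking, genotype)[1]                # total effect

print(f"total effect c      = {c_total:.3f}")
print(f"direct effect c'    = {c_prime:.3f}")
print(f"indirect effect a*b = {a * b:.3f}  (partial mediation if c' shrinks toward 0)")
```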
INVESTIGATING EVOLUTIONARY TRADE-OFFS IN WILD POPULATIONS OF ATLANTIC SALMON (SALMO SALAR): INCORPORATING DETECTION PROBABILITIES AND INDIVIDUAL HETEROGENEITY. EVOLUTION, Issue 9, 2010. Mathieu Buoro.
Evolutionary trade-offs among demographic parameters are important determinants of life-history evolution. Investigating such trade-offs under natural conditions has been limited by inappropriate analytical methods that fail to address the bias in demographic estimates that can result when issues of detection (uncertain detection of individual) are ignored. We propose a new statistical approach to quantify evolutionary trade-offs in wild populations. Our method is based on a state-space modeling framework that focuses on both the demographic process of interest as well as the observation process. As a case study, we used individual mark-recapture data for stream-dwelling Atlantic salmon juveniles in the Scorff River (Southern Brittany, France). In freshwater, juveniles face two life-history choices: migration to the ocean and sexual maturation (for males). Trade-offs may appear with these life-history choices and survival, because all are energy dependent. We found a cost of reproduction on survival for fish staying in freshwater and a survival advantage associated with the "decision" to migrate. Our modeling framework opens up promising prospects for the study of evolutionary trade-offs when some life-history traits are not, or only partially, observable. [source]

Incorporating covariates in mapping heterogeneous traits: a hierarchical model using empirical Bayes estimation. GENETIC EPIDEMIOLOGY, Issue 7, 2007. Swati Biswas.
Abstract: Complex genetic traits are inherently heterogeneous, i.e., they may be caused by different genes, or non-genetic factors, in different individuals. So, for mapping genes responsible for these diseases using linkage analysis, heterogeneity must be accounted for in the model. Heterogeneity across different families can be modeled using a mixture distribution by letting each family have its own heterogeneity parameter denoting the probability that its disease-causing gene is linked to the marker map under consideration. A substantial gain in power is expected if covariates that can discriminate between the families of linked and unlinked types are incorporated in this modeling framework. To this end, we propose a hierarchical Bayesian model, in which the families are grouped according to various (categorized) levels of covariate(s). The heterogeneity parameters of families within each group are assigned a common prior, whose parameters are further assigned hyper-priors. The hyper-parameters are obtained by utilizing the empirical Bayes estimates. We also address related issues such as evaluating whether the covariate(s) under consideration are informative and grouping of families. We compare the proposed approach with one that does not utilize covariates and show that our approach leads to considerable gains in power to detect linkage and in precision of interval estimates through various simulation scenarios. An application to the asthma datasets of Genetic Analysis Workshop 12 also illustrates this gain in a real data analysis. Additionally, we compare the performances of microsatellite markers and single nucleotide polymorphisms for our approach and find that the latter clearly outperforms the former. Genet. Epidemiol. 2007. © 2007 Wiley-Liss, Inc. [source]

Large-scale plant light-use efficiency inferred from the seasonal cycle of atmospheric CO2. GLOBAL CHANGE BIOLOGY, Issue 8, 2004. Christopher J. Still.
Abstract: We combined atmospheric CO2 measurements, satellite observations, and an atmospheric transport model in an inverse modeling framework to infer a key property of vegetation physiology, the light-use efficiency (LUE) of net primary production, for large geographic regions. We find the highest LUE in boreal regions and in the northern hemisphere tropics. Within boreal zones, Eurasian LUE is higher than North American LUE and has a distinctly different seasonal profile. This longitudinal asymmetry is consistent with ecological differences expected from the much greater cover of deciduous vegetation in boreal Eurasia caused by the vast Siberian forests of the deciduous conifer, Larch. Inferred LUE of the northern hemisphere tropics is also high and displays a seasonal profile consistent with variations of both cloud cover and C4 vegetation activity. [source]

Ground Water Sustainability: Methodology and Application to the North China Plain. GROUND WATER, Issue 6, 2008. Jie Liu.
This article analyzes part of a ground water flow system in the North China Plain (NCP) subject to severe overexploitation and rapid depletion. A transient ground water flow model was constructed and calibrated to quantify the changes in the flow system since the predevelopment 1950s. The flow model was then used in conjunction with an optimization code to determine optimal pumping schemes that improve ground water management practices. Finally, two management scenarios, namely, urbanization and the South-to-North Water Transfer Project, were evaluated for their potential impacts on the ground water resources in the study area. Although this study focuses on the NCP, it illustrates a general modeling framework for analyzing the sustainability, or the lack thereof, of ground water flow systems driven by similar hydrogeologic and economic conditions.
The numerical simulation is capable of quantifying the various components of the overall flow budget and evaluating the impacts of different management scenarios. The optimization modeling allows the determination of the maximum "sustainable pumping" that satisfies a series of prescribed constraints. It can also be used to minimize the economic costs associated with ground water development and management. Furthermore, since the NCP is one of the most water scarce and economically active regions in the world, the conclusions and insights from this study are of general interest and international significance. [source]

Modeling and compensation for angular transmission error in harmonic drive gearings. IEEJ TRANSACTIONS ON ELECTRICAL AND ELECTRONIC ENGINEERING, Issue 2, 2009. Masafumi Yamamoto, Student Member.
Abstract: This paper presents a novel modeling and compensation approach for the angular transmission error in harmonic drive gearings. In the modeling, physical phenomena of the transmission error due to nonlinear elastic deformations in the micro-displacement region are especially dealt with, as well as the synchronous component which has been discussed in a variety of conventional studies. On the basis of the analyses of the phenomena, the nonlinear elastic component is mathematically modeled by applying a modeling framework for the rolling friction with hysteresis attributes. The proposed transmission error model has been applied to the positioning system in a model-based feedforward compensation manner. Experimental results using a prototype show the effectiveness of the proposed modeling and compensation. Copyright © 2009 Institute of Electrical Engineers of Japan. Published by John Wiley & Sons, Inc. [source]

An MDE modeling framework for measurable goal-oriented requirements. INTERNATIONAL JOURNAL OF INTELLIGENT SYSTEMS, Issue 8, 2010. Fernando Molina.
It is a proven fact that the appropriate management of requirements is one of the most influential factors in the success of software development projects. With the advent of the model-driven engineering (MDE) development paradigm, the need for formal gathering mechanisms, which provide the necessary degree of nonambiguity and detail, has led to the proposal of a myriad of requirements metamodels. However, a great disparity exists, both in the concepts/relationships and in the underlying semantics involved in each metamodel. Moreover, most existing proposals lack backward (e.g., alignment with business goals) or forward (e.g., connection with validation methods) traceability. In view of this situation, this article proposes a "measurable requirements metamodel" that offers support to the elicitation of measurable requirements. This support is based on the explicit connection of goals, requirements, and measures, thus fostering a goal-driven measurable requirements engineering (RE) perspective. Additionally, since it is well known that metamodels only reflect the abstract syntax of the modeling language, the proposed metamodel also includes a notation (concrete syntax) which, for reasons of understandability, is based on the goal-oriented requirements language (GRL) notation. This notation is supported by a unified modeling language (UML) profile that facilitates its adoption by RE analysts in the context of any UML-based software engineering process. To support this proposal, an Eclipse tool has been developed. This tool permits the integration of measurable requirements as a driving force in the context of a given MDE development process. © 2010 Wiley Periodicals, Inc. [source]
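The metamodel is defined as a UML profile with Eclipse tooling, which cannot be shown in this listing; its core idea, explicit traceability from goals to requirements to the measures that make them verifiable, can be sketched with plain data classes. All class names, attributes, and the example goal below are hypothetical.

```python
# Plain-Python sketch of goal-to-requirement-to-measure traceability, the
# central relationship of a "measurable requirements" metamodel. Names are
# hypothetical; the actual metamodel is a UML profile, not Python.

from dataclasses import dataclass, field

@dataclass
class Measure:
    name: str
    unit: str
    target: float

@dataclass
class Requirement:
    text: str
    measures: list = field(default_factory=list)      # forward traceability

@dataclass
class Goal:
    statement: str
    requirements: list = field(default_factory=list)  # backward traceability

goal = Goal("Reduce the time users need to locate a product in the web shop")
req = Requirement("Product search shall return results quickly under load")
req.measures.append(Measure("95th-percentile search latency", "ms", target=300))
goal.requirements.append(req)

for r in goal.requirements:
    status = "measurable" if r.measures else "NOT measurable"
    print(f"Goal: {goal.statement}")
    print(f"  Requirement: {r.text} [{status}]")
    for m in r.measures:
        print(f"    Measure: {m.name} <= {m.target} {m.unit}")
```

A requirement without at least one attached measure is flagged, which is the kind of check a goal-driven, measurement-oriented RE tool can automate.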
Two-Sided Platforms: Product Variety and Pricing Structures. JOURNAL OF ECONOMICS & MANAGEMENT STRATEGY, Issue 4, 2009. Andrei Hagiu.
This paper provides a new modeling framework to analyze two-sided platforms connecting producers and consumers. In contrast to the existing literature, indirect network effects are determined endogenously, through consumers' taste for variety and producer competition. Three new aspects of platform pricing structures are derived. First, the optimal platform pricing structure shifts towards extracting more rents from producers relative to consumers when consumers have stronger demand for variety, since producers become less substitutable. With platform competition, consumer preferences for variety, producer market power, and producer economies of scale in multihoming also make platforms' price-cutting strategies on the consumer side less effective. This second effect on equilibrium pricing structures goes in the opposite direction relative to the first one. Third, variable fees charged to producers can serve to trade off producer innovation incentives against the need to reduce a platform holdup problem. [source]

Radio transmitters do not affect the body condition of Savannah Sparrows during the fall premigratory period. JOURNAL OF FIELD ORNITHOLOGY, Issue 4, 2009. Lauren F. Rae.
ABSTRACT: Radio telemetry can be a valuable tool for studying the behavior, physiology, and demography of birds. We tested the assumption that radio transmitters have no adverse effects on body condition in an island population of Savannah Sparrows (Passerculus sandwichensis). To assess possible changes in condition, 20 radiotagged and 25 nontagged Savannah Sparrows were captured and recaptured throughout the postfledging period. We used four measures of condition: mass, an index of fat-free dry mass (measured via heavy water dilution), pectoral muscle depth (measured via ultrasound imaging), and an index of fat mass (measured via heavy water dilution). Using both a generalized linear modeling framework and paired design, we found no significant differences in the body condition of radiotagged and nontagged adults and juveniles. Thus, our results provide evidence that radio transmitters have no effect on the condition of Savannah Sparrows during the premigratory period.
RESUMEN (translated from Spanish): Radio telemetry can be a valuable tool for studying the behavior, physiology, and demography of birds. Using a population of sparrows (Passerculus sandwichensis), we tested the premise that radio transmitters have no adverse effect on the body condition of birds. To determine the possible effect on the condition of the birds, we captured and recaptured 20 individuals with radio transmitters and 25 individuals without them during the post-fledging period. We used four measures of condition: mass, an index of fat-free dry mass (measured through heavy water dilution), pectoral muscle depth (determined with an ultrasound machine), and an index of lipid mass (measured via heavy water dilution). Using both a generalized linear model framework and a paired design, we found no significant differences in body condition between adults and juveniles with or without transmitters. Our results provide evidence that radio transmitters have no adverse effect on the condition of the sparrows studied during the migratory period. [source]
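The comparison behind this result can be sketched as a linear model of a condition measure on transmitter status and age class, checking whether the tag coefficient differs from zero. The data below are simulated under a null transmitter effect; the study's generalized linear modeling framework and paired design are richer than this sketch.

```python
# Sketch: regress change in mass on tag status and age class and check the
# tag coefficient. All data are simulated with no true transmitter effect.

import numpy as np

rng = np.random.default_rng(42)
n_tagged, n_control = 20, 25
tagged = np.r_[np.ones(n_tagged), np.zeros(n_control)]
adult = rng.integers(0, 2, n_tagged + n_control).astype(float)
mass_change = 0.0 * tagged + 0.3 * adult + rng.normal(0, 0.8, tagged.size)  # grams

X = np.column_stack([np.ones_like(tagged), tagged, adult])
beta, res, *_ = np.linalg.lstsq(X, mass_change, rcond=None)
sigma2 = res[0] / (tagged.size - X.shape[1])
se = np.sqrt(sigma2 * np.linalg.inv(X.T @ X).diagonal())

print(f"estimated transmitter effect = {beta[1]:.3f} g (SE {se[1]:.3f}); "
      f"an interval covering 0 is consistent with 'no effect'")
```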
Design and Analysis of Bioenergy Networks: A Complex Adaptive Systems Approach. JOURNAL OF INDUSTRIAL ECOLOGY, Issue 2, 2009.
Summary: This article presents a new methodology for designing industrial networks and analyzing them dynamically from the standpoint of sustainable development. The approach uses a combination of optimization and simulation tools. Assuming "top-down" overarching control of the network, we use global dynamic optimization to determine which evolutionary pathways are preferred in terms of economic, social, and environmental performance. Considering the autonomy of network entities and their actions, we apply agent-based simulation to analyze how the network actually evolves. These two perspectives are integrated into a powerful multiscale modeling framework for evaluating the consequences of new policy instruments or different business strategies aimed at stimulating sustainable development as well as identifying optimal leverage points for improved performance of the network in question. The approach is demonstrated for a regional network of interdependent organizations deploying a set of bioenergy technologies within a developing-economy context. [source]

A time-dependent multiphysics, multiphase modeling framework for carbon nanotube synthesis using chemical vapor deposition. AICHE JOURNAL, Issue 12, 2009. Mahmoud Reza Hosseini.
Abstract: A time-dependent multiphysics, multiphase model is proposed and fully developed here to describe carbon nanotubes (CNTs) fabrication using chemical vapor deposition (CVD). The fully integrated model accounts for chemical reaction as well as fluid, heat, and mass transport phenomena. The feed components for the CVD process are methane (CH4), as the primary carbon source, and hydrogen (H2). Numerous simulations are performed for a wide range of fabrication temperatures (973.15-1273.15 K) as well as different CH4 (500-1000 sccm) and H2 (250-750 sccm) flow rates. The effect of temperature, total flow rate, and feed mixture ratio on CNTs growth rate as well as the effect of amorphous carbon formation on the final product are calculated and compared with experimental results. The outcomes from this study provide a fundamental understanding and basis for the design of an efficient CNT fabrication process that is capable of producing a high yield of CNTs, with a minimum amount of amorphous carbon. © 2009 American Institute of Chemical Engineers AIChE J, 2009 [source]

Quantitative framework for reliable safety analysis. AICHE JOURNAL, Issue 1, 2002. Haitao Huang.
The effectiveness of any methodology used to identify hazards in chemical processes affects both safety and economics. To achieve maximum safety at minimum cost, a conservative, but realistic, analysis must be carried out. An approach to hazard identification is proposed based on a detailed process model which includes nonlinear dynamics and uncertainty. A new modeling framework, the region-transition model (RTM), is developed, which enables the simulation of regions of the operating space through an extension of the hybrid state transition system formalism. The RTM is illustrated on a nonlinear batch reactor with parameter uncertainty. A safety-verification algorithm identifies regions of the input space (initial conditions and external inputs) which guarantee safe operation. The algorithm is successfully applied to three examples: a tank with overflow and underflow, a batch reactor with an exothermic reaction, and a CSTR with feed preheating. [source]
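A simplified illustration of region-based safety verification, in the spirit of the exothermic batch reactor example, is to sweep a grid of initial conditions, simulate at the pessimistic end of an uncertain rate constant, and retain only the initial conditions that never exceed a temperature limit. All kinetic and thermal constants below are illustrative; this is a brute-force sketch, not the RTM formalism or the paper's algorithm.

```python
# Grid sweep over initial conditions of a toy exothermic batch reactor,
# simulated under a +20% worst case on the uncertain pre-exponential factor;
# an initial condition is "safe" if the temperature limit is never violated.

import numpy as np

T_MAX = 380.0                      # safety limit (K)
k0, Ea_over_R = 5e6, 6000.0        # Arrhenius parameters (1/min, K)
dH, k_cool, T_coolant = 60.0, 0.05, 300.0

def peak_temperature(T0, c0, k0_scale, dt=0.1, steps=2000):
    """Euler simulation; returns the highest temperature reached."""
    T, c, peak = T0, c0, T0
    for _ in range(steps):
        r = k0_scale * k0 * np.exp(-Ea_over_R / T) * c
        c = max(c - dt * r, 0.0)
        T += dt * (dH * r - k_cool * (T - T_coolant))
        peak = max(peak, T)
    return peak

for T0 in (300.0, 320.0, 340.0):
    cells = []
    for c0 in (0.5, 1.0, 1.5):
        safe = peak_temperature(T0, c0, k0_scale=1.2) < T_MAX
        cells.append(f"c0={c0}: {'safe' if safe else 'UNSAFE'}")
    print(f"T0={T0:.0f} K   " + "   ".join(cells))
```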
Integrating Person-Centered and Variable-Centered Approaches in the Study of Developmental Courses and Transitions in Alcohol Use: Introduction to the Special Section. ALCOHOLISM, Issue 6, 2000. Marsha E. Bates.
This special section consists of research from the symposium "Integrating Person-Centered and Variable-Centered Approaches to the Study of Developmental Courses and Transitions in Alcohol Use," presented at the 1999 Annual Meeting of the Research Society on Alcoholism. The section focuses on ways to integrate variable-centered and person-centered approaches to better understand longitudinal trajectories of alcohol use and associated problems. Our aim is to increase awareness and discussion of alternative conceptual and quantitative approaches that involve both a person-centered and a variable-centered component, and to make these methods more accessible to alcohol and other drug researchers. The first paper provides a general latent variable modeling framework within which to conceptualize developmental questions that involve the combination of continuous latent variables and categorical variables that represent classifications of individuals into meaningful subgroups. This is followed by three empirical papers that use integrative methods to examine early adult outcomes of adolescent binge drinking; potential mediators of familial alcoholism effects on alcohol and tobacco use disorder comorbidity; and the ability of psychopathology, substance use, and parental history of alcohol problems to predict individual differences in the likelihood of transitions in drinking behavior during adolescence. The section concludes with a discussion of the statistical basis for integrating person-centered and variable-centered methods, a comparison of study findings, and directions for future research. [source]

Integrated Modular Modeling of Water and Nutrients From Point and Nonpoint Sources in the Patuxent River Watershed. JOURNAL OF THE AMERICAN WATER RESOURCES ASSOCIATION, Issue 3, 2008. Zhi-Jun Liu.
Abstract: We present a simple modular landscape simulation model that is based on a watershed modeling framework in which different sets of processes occurring in a watershed can be simulated separately with different models. The model consists of three loosely coupled submodels: a rainfall-runoff model (TOPMODEL) for runoff generation in a subwatershed, a nutrient model for estimation of nutrients from nonpoint sources in a subwatershed, and a stream network model for integration of point and nonpoint sources in the routing process. The model performance was evaluated using monitoring data in the watershed of the Patuxent River, a tributary to the Chesapeake Bay in Maryland, from July 1997 through August 1999. Despite its simplicity, the landscape model predictions of streamflow, sediment, and nutrient loads were as good as or better than those of the Hydrological Simulation Program-Fortran model, one of the most widely used comprehensive watershed models. The landscape model was applied to predict discharges of water, sediment, silicate, organic carbon, nitrate, ammonium, organic nitrogen, total nitrogen, organic phosphorus, phosphate, and total phosphorus from the Patuxent watershed to its estuary. The predicted annual water discharge to the estuary was very close to the measured annual total in terms of percent errors for both years of the study period (within 2%).
The model predictions for loads of nutrients were also good (20-30%) or very good (<20%) with exceptions of sediment (40%), phosphate (36%), and organic carbon (53%) for Year 1. [source]

OUTPUT VERSUS INPUT CONTROLS UNDER UNCERTAINTY: THE CASE OF A FISHERY. NATURAL RESOURCE MODELING, Issue 2, 2009. Satoshi Yamazaki.
Abstract: The paper compares the management outcomes with a total allowable catch (TAC) and a total allowable effort (TAE) in a fishery under uncertainty. Using a dynamic programming model with multiple uncertainties and estimated growth, harvest, and effort functions from one of the world's largest fisheries, the relative economic and biological benefits of a TAC and TAE are compared and contrasted in a stochastic environment. This approach provides a decision and modeling framework to compare instruments and achieve desired management goals. A key finding is that neither instrument is always preferred in a world of uncertainty and that regulator's risk aversion and weighting in terms of expected net profits and biomass, and the trade-offs in terms of expected values and variance determine instrument choice. [source]
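The TAC-versus-TAE comparison can be sketched as stochastic dynamic programming on a biomass grid, with the key structural difference that a TAC fixes the harvest directly while a TAE fixes effort, so the realized catch responds to the stock. Growth, price, and cost parameters below are illustrative, not the estimated functions from the paper, and the sketch implies nothing about which instrument is preferred in practice.

```python
# Value iteration over a biomass grid under recruitment uncertainty,
# comparing a catch instrument (TAC) with an effort instrument (TAE).
# All functional forms and parameters are invented for illustration.

import numpy as np

r, K, q = 0.6, 1.0, 0.5                    # logistic growth, capacity, catchability
price, cost, beta = 10.0, 2.0, 0.95        # price per unit catch, cost per unit effort
shocks, probs = np.array([0.8, 1.0, 1.2]), np.array([0.25, 0.5, 0.25])

B = np.linspace(0.05, 1.2, 40)             # biomass grid
controls = np.linspace(0.0, 0.4, 21)       # candidate TAC (catch) or TAE (effort) levels

def value_iteration(instrument, iterations=300):
    V = np.zeros_like(B)
    for _ in range(iterations):
        V_new = np.empty_like(B)
        for i, b in enumerate(B):
            best = -np.inf
            for u in controls:
                if instrument == "TAC":
                    harvest = min(u, b)
                    effort = harvest / (q * b)
                else:                      # TAE: realized catch scales with the stock
                    effort = u
                    harvest = min(q * u * b, b)
                profit = price * harvest - cost * effort
                b_next = np.clip(b - harvest + r * b * (1 - b / K) * shocks, B[0], B[-1])
                best = max(best, profit + beta * (probs @ np.interp(b_next, B, V)))
            V_new[i] = best
        V = V_new
    return V

for instrument in ("TAC", "TAE"):
    V = value_iteration(instrument)
    print(f"{instrument}: expected discounted profit at B = K/2 is {np.interp(0.5, B, V):.2f}")
```

A fuller treatment would weight biomass as well as profit in the objective, mirroring the regulator preferences the abstract identifies as decisive.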
An extensible modeling framework for dynamic reassignment and rerouting in cooperative airborne operations. NAVAL RESEARCH LOGISTICS: AN INTERNATIONAL JOURNAL, Issue 7, 2010. Chase C. Murray.
Abstract: Unmanned aerial vehicles (UAVs), increasingly vital to the success of military operations, operate in a complex and dynamic environment, sometimes in concert with manned aircraft. We present an extensible modeling framework for the solution to the dynamic resource management (DRM) problem, where airborne resources must be reassigned to time-sensitive tasks in response to changes in battlespace conditions. The DRM problem is characterized by diverse tasks with time windows, heterogeneous resources with fuel- and payload-capacity limitations, and multiple competing objectives. We propose an integer linear programming formulation for this problem, where mathematical feasibility is guaranteed. Although motivated by airborne military operations, the proposed general modeling framework is applicable to a wide array of settings, such as disaster relief operations. Additionally, land- or water-based operations may be modeled within this framework, as well as any combination of manned and unmanned vehicles. © 2010 Wiley Periodicals, Inc. Naval Research Logistics, 2010 [source]

Bicriteria product design optimization: An efficient solution procedure using AND/OR trees. NAVAL RESEARCH LOGISTICS: AN INTERNATIONAL JOURNAL, Issue 6, 2002. S. Raghavan.
Competitive imperatives are causing manufacturing firms to consider multiple criteria when designing products. However, current methods to deal with multiple criteria in product design are ad hoc in nature. In this paper we present a systematic procedure to efficiently solve bicriteria product design optimization problems. We first present a modeling framework, the AND/OR tree, which permits a simplified representation of product design optimization problems. We then show how product design optimization problems on AND/OR trees can be framed as network design problems on a special graph, a directed series-parallel graph. We develop an enumerative solution algorithm for the bicriteria problem that requires as a subroutine the solution of the parametric shortest path problem. Although this parametric problem is hard on general graphs, we show that it is polynomially solvable on the series-parallel graph. As a result we develop an efficient solution algorithm for the product design optimization problem that does not require the use of complex and expensive linear/integer programming solvers. As a byproduct of the solution algorithm, sensitivity analysis for product design optimization is also efficiently performed under this framework. © 2002 Wiley Periodicals, Inc. Naval Research Logistics 49: 574-592, 2002; Published online in Wiley InterScience (www.interscience.wiley.com). DOI 10.1002/nav.10031 [source]

Mathematical modeling for the ionic inclusion process inside conducting polymer-based thin-films. POLYMER ENGINEERING & SCIENCE, Issue 11, 2008. Saptarshi Majumdar.
Ionic inclusion inside thin conducting polymer (CP) film is a major and common feature for actuator as well as membrane-based drug release. In this study, an electro-active polymeric thin-film system has been conceptualized. A PNP-electro-neutrality (Poisson-Nernst-Planck) based modeling framework with customized boundary conditions is used to depict the electrochemical phenomena. In the dynamic model, kinetics of probable redox reactions is included along with electro-migration and diffusion terms in the overall PNP framework. At steady state, interfacial voltage seems to hold the critically important role, while ionic migration and reaction kinetics play very crucial roles in determining the dynamics of such systems. The validated model predicts that lowering the standard potential of the polymeric electrode accelerates the process of ionic ingress. Higher ionic flux is obtained using slower voltage scan. Variation of diffusivity shows the large spectrum of relatively unexplored dynamics for such electro-active thin-film-based systems. The significance is in designing actuator- or membrane-based controlled molecular release systems. POLYM. ENG. SCI., 2008. © 2008 Society of Plastics Engineers [source]

Analysis of computer experiments with multiple noise sources. QUALITY AND RELIABILITY ENGINEERING INTERNATIONAL, Issue 2, 2010. Christian Dehlendorff.
Abstract: In this paper we present a modeling framework for analyzing computer models with two types of variations. The paper is based on a case study of an orthopedic surgical unit, which has both controllable and uncontrollable factors. Our results show that this structure of variation can be modeled effectively with linear mixed effects models and generalized additive models. Copyright © 2009 John Wiley & Sons, Ltd. [source]
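A minimal sketch of the kind of analysis this last abstract describes, a linear mixed effects model separating the effect of a controllable design factor from uncontrollable scenario-to-scenario variation in a stochastic simulation experiment, is shown below with synthetic data. The factor names are hypothetical, not the case-study settings, and the generalized additive component is omitted.

```python
# Linear mixed effects sketch: a controllable factor ("rooms") as a fixed
# effect, the uncontrollable scenario as a random block, and replicate noise.
# Data are synthetic; requires pandas and statsmodels.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
rows = []
for scenario in range(20):                       # uncontrollable noise source (block)
    block = rng.normal(0, 2.0)
    for rooms in (4, 5, 6):                      # controllable design factor
        for _ in range(5):                       # replicate runs (second noise source)
            rows.append({"scenario": scenario, "rooms": rooms,
                         "wait": 60 - 6 * rooms + block + rng.normal(0, 3.0)})
data = pd.DataFrame(rows)

result = smf.mixedlm("wait ~ rooms", data, groups=data["scenario"]).fit()
print(result.params)                             # fixed effect of rooms plus variance components
print("scenario-level variance:", float(result.cov_re.iloc[0, 0]))
```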
DYNAMIC ORDER SUBMISSION AND HERDING BEHAVIOR IN ELECTRONIC TRADING. THE JOURNAL OF FINANCIAL RESEARCH, Issue 1, 2010. Wing Lon Ng.
Abstract: I analyze the dynamic trading behavior of market participants by developing a bivariate modeling framework for describing the arrival process of buy and sell orders in a limit order book. The model contains an extended autoregressive conditional duration model with a flexible generalized Beta distribution to explain the duration process, combined with a dynamic logit model to capture the traders' order submission strategy. I find that the state of the order book as well as the speed of the order arrival have a significant influence on the order placement, inducing temporal asymmetric market movements. [source]

Falling and explosive, dormant, and rising markets via multiple-regime financial time series models. APPLIED STOCHASTIC MODELS IN BUSINESS AND INDUSTRY, Issue 1, 2010. Cathy W. S. Chen.
Abstract: A multiple-regime threshold nonlinear financial time series model, with a fat-tailed error distribution, is discussed and Bayesian estimation and inference are considered. Furthermore, approximate Bayesian posterior model comparison among competing models with different numbers of regimes is considered, which is effectively a test for the number of required regimes. An adaptive Markov chain Monte Carlo (MCMC) sampling scheme is designed, while importance sampling is employed to estimate Bayesian residuals for model diagnostic testing. Our modeling framework provides a parsimonious representation of well-known stylized features of financial time series and facilitates statistical inference in the presence of high or explosive persistence and dynamic conditional volatility. We focus on the three-regime case, where the main feature of the model is the capturing of mean and volatility asymmetries in financial markets, while allowing an explosive volatility regime. A simulation study highlights the properties of our MCMC estimators and the accuracy and favourable performance as a model selection tool, compared with a deviance criterion, of the posterior model probability approximation method. An empirical study of eight international oil and gas markets provides strong support for the three-regime model over its competitors, in most markets, in terms of model posterior probability and in showing three distinct regime behaviours: falling/explosive, dormant and rising markets. Copyright © 2009 John Wiley & Sons, Ltd. [source]
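The regime structure described above can be illustrated by simulating a simple three-regime self-exciting threshold autoregression in which the lagged observation selects the regime and each regime has its own drift, persistence, and volatility. Parameters are invented for illustration; the fat-tailed errors, Bayesian MCMC estimation, and model comparison machinery of the paper are not reproduced.

```python
# Simulation sketch of a three-regime threshold autoregression:
# the lagged value selects the regime (falling/explosive, dormant, rising).

import numpy as np

rng = np.random.default_rng(3)
r_lo, r_hi = -1.0, 1.0                        # thresholds on the lagged value

# (intercept, AR coefficient, volatility) for each regime
regimes = {
    "falling/explosive": (-0.3, 0.4, 2.0),    # negative drift, high volatility
    "dormant":           ( 0.0, 0.1, 0.5),
    "rising":            ( 0.2, 0.3, 1.0),
}

def regime_of(lagged):
    if lagged <= r_lo:
        return "falling/explosive"
    return "dormant" if lagged < r_hi else "rising"

y = [0.0]
for _ in range(1000):
    phi0, phi1, sigma = regimes[regime_of(y[-1])]
    y.append(phi0 + phi1 * y[-1] + sigma * rng.normal())

counts = {name: 0 for name in regimes}
for lag in y[:-1]:
    counts[regime_of(lag)] += 1
print({name: f"{c / (len(y) - 1):.0%} of periods" for name, c in counts.items()})
```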
Performance and cost analysis of future, commercially mature gasification-based electric power generation from switchgrass. BIOFUELS, BIOPRODUCTS AND BIOREFINING, Issue 2, 2009. Haiming Jin.
Abstract: Detailed process designs and mass/energy balances are developed using a consistent modeling framework and input parameter assumptions for biomass-based power generation at large scale (4536 dry metric tonnes per day switchgrass input), assuming future commercially mature component equipment performance levels. The simulated systems include two gasification-based gas turbine combined cycles (B-IGCC) designed around different gasifier technologies, one gasification-based solid oxide fuel cell cycle (B-IGSOFC), and a steam-Rankine cycle. The simulated design-point efficiency of the B-IGSOFC is the highest among all systems (51.8%, LHV basis), with modestly lower efficiencies for the B-IGCC design using a pressurized, oxygen-blown gasifier (49.5% LHV) and for the B-IGCC design using a low-pressure indirectly heated gasifier (48.6%, LHV). The steam-Rankine system has a simulated efficiency of 33.0% (LHV). Detailed capital costs are estimated assuming commercially mature ('Nth plant') technologies for the two B-IGCC and the steam-Rankine systems. B-IGCC systems are more capital-intensive than the steam-Rankine system, but discounted cash flow rate of return calculations highlight the total cost advantage of the B-IGCC systems when biomass prices are higher. Uncertainties regarding prospective mature-technology costs for solid oxide fuel cells and hot gas sulfur clean-up technologies assumed for the B-IGSOFC performance analysis make it difficult to evaluate the prospective electricity generating costs for B-IGSOFC relative to B-IGCC. The rough analysis here suggests that B-IGSOFC will not show improved economics relative to B-IGCC at the large scale considered here. © 2009 Society of Chemical Industry and John Wiley & Sons, Ltd [source]

Longitudinal Studies of Binary Response Data Following Case-Control and Stratified Case-Control Sampling: Design and Analysis. BIOMETRICS, Issue 2, 2010. Jonathan S. Schildcrout.
Summary: We discuss design and analysis of longitudinal studies after case-control sampling, wherein interest is in the relationship between a longitudinal binary response that is related to the sampling (case-control) variable and a set of covariates. We propose a semiparametric modeling framework based on a marginal longitudinal binary response model and an ancillary model for subjects' case-control status. In this approach, the analyst must posit the population prevalence of being a case, which is then used to compute an offset term in the ancillary model. Parameter estimates from this model are used to compute offsets for the longitudinal response model. Examining the impact of population prevalence and ancillary model misspecification, we show that time-invariant covariate parameter estimates, other than the intercept, are reasonably robust, but intercept and time-varying covariate parameter estimates can be sensitive to such misspecification. We study design and analysis issues impacting study efficiency, namely: choice of sampling variable and the strength of its relationship to the response, sample stratification, choice of working covariance weighting, and degree of flexibility of the ancillary model. The research is motivated by a longitudinal study following case-control sampling of the time course of attention deficit hyperactivity disorder (ADHD) symptoms. [source]

Bayesian Calibration of a Stochastic Kinetic Computer Model Using Multiple Data Sources. BIOMETRICS, Issue 1, 2010. D. A. Henderson.
Summary: In this article, we describe a Bayesian approach to the calibration of a stochastic computer model of chemical kinetics. As with many applications in the biological sciences, the data available to calibrate the model come from different sources. Furthermore, these data appear to provide somewhat conflicting information about the model parameters. We describe a modeling framework that allows us to synthesize this conflicting information and arrive at a consensus inference. In particular, we show how random effects can be incorporated into the model to account for between-individual heterogeneity that may be the source of the apparent conflict. [source]

A General Framework for the Analysis of Animal Resource Selection from Telemetry Data. BIOMETRICS, Issue 3, 2008. Devin S. Johnson.
Summary: We propose a general framework for the analysis of animal telemetry data through the use of weighted distributions. It is shown that several interpretations of resource selection functions arise when constructed from the ratio of a use and availability distribution. Through the proposed general framework, several popular resource selection models are shown to be special cases of the general model by making assumptions about animal movement and behavior. The weighted distribution framework is shown to be easily extended to readily account for telemetry data that are highly autocorrelated, as is typical with the use of new technology such as global positioning system (GPS) animal relocations. An analysis of simulated data using several models constructed within the proposed framework is also presented to illustrate the possible gains from the flexible modeling framework. The proposed model is applied to a brown bear data set from southeast Alaska. [source]
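In practice the use-versus-availability contrast at the heart of a resource selection function is often fit by comparing telemetry ("used") points with points drawn from the availability distribution, for example with logistic regression. The sketch below does exactly that on synthetic covariates; the paper's weighted-distribution derivation and its handling of autocorrelated relocations are not reproduced.

```python
# Use-versus-availability logistic regression sketch for a resource selection
# function. Covariates, coefficients, and sample sizes are synthetic.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(11)
n_avail = 2000
elev = rng.normal(0, 1, n_avail)                # standardized habitat covariates
forest = rng.binomial(1, 0.4, n_avail).astype(float)

# True selection: prefer forested, low-elevation sites (exponential RSF weights).
w = np.exp(-0.8 * elev + 1.2 * forest)
used = rng.choice(n_avail, size=300, replace=True, p=w / w.sum())

y = np.r_[np.ones(used.size), np.zeros(n_avail)]          # 1 = used, 0 = available
X = sm.add_constant(np.column_stack([np.r_[elev[used], elev],
                                     np.r_[forest[used], forest]]))
fit = sm.Logit(y, X).fit(disp=0)
print("estimated selection coefficients (elev, forest):", fit.params[1:].round(2))
```

The slope estimates approximately recover the weights used to generate the "used" sample, which is the standard justification for the logistic approximation to an exponential RSF.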
Spatial Multistate Transitional Models for Longitudinal Event Data. BIOMETRICS, Issue 1, 2008. F. S. Nathoo.
Summary: Follow-up medical studies often collect longitudinal data on patients. Multistate transitional models are useful for analysis in such studies where, at any point in time, individuals may be said to occupy one of a discrete set of states and interest centers on the transition process between states. For example, states may refer to the number of recurrences of an event, or the stage of a disease. We develop a hierarchical modeling framework for the analysis of such longitudinal data when the processes corresponding to different subjects may be correlated spatially over a region. Continuous-time Markov chains incorporating spatially correlated random effects are introduced. Here, joint modeling of both spatial dependence as well as dependence between different transition rates is required and a multivariate spatial approach is employed. A proportional intensities frailty model is developed where baseline intensity functions are modeled using parametric Weibull forms, piecewise-exponential formulations, and flexible representations based on cubic B-splines. The methodology is developed within the context of a study examining invasive cardiac procedures in Quebec. We consider patients admitted for acute coronary syndrome throughout the 139 local health units of the province and examine readmission and mortality rates over a 4-year period. [source]
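The continuous-time Markov chain at the core of such multistate models can be sketched directly: an intensity matrix Q over the states gives transition probabilities P(t) = expm(Qt). The states and rates below are illustrative placeholders; the spatially correlated frailties and spline baseline intensities of the paper are not shown.

```python
# Continuous-time Markov chain sketch for a multistate transitional model.
# Intensity matrix Q -> transition probabilities P(t) = expm(Q t).

import numpy as np
from scipy.linalg import expm

states = ["stable", "readmitted", "dead"]
Q = np.array([
    [-0.30,  0.25, 0.05],   # leave "stable" by readmission or death
    [ 0.40, -0.55, 0.15],   # leave "readmitted" by recovery or death
    [ 0.00,  0.00, 0.00],   # "dead" is absorbing
])                          # illustrative rates per year; each row sums to zero

for years in (1, 4):
    P = expm(Q * years)
    probs = ", ".join(f"P(stable -> {s}) = {p:.2f}" for s, p in zip(states, P[0]))
    print(f"after {years} year(s): {probs}")
```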