Process Models (process + models)

Selected Abstracts


A MULTINOMIAL APPROXIMATION FOR AMERICAN OPTION PRICES IN LÉVY PROCESS MODELS

MATHEMATICAL FINANCE, Issue 4 2006
Ross A. Maller
This paper gives a tree-based method for pricing American options in models where the stock price follows a general exponential Lévy process. A multinomial model for approximating the stock price process, which can be viewed as generalizing the binomial model of Cox, Ross, and Rubinstein (1979) for geometric Brownian motion, is developed. Under mild conditions, it is proved that the stock price process and the prices of American-type options on the stock, calculated from the multinomial model, converge to the corresponding prices under the continuous time Lévy process model. Explicit illustrations are given for the variance gamma model and the normal inverse Gaussian process when the option is an American put, but the procedure is applicable to a much wider class of derivatives including some path-dependent options. Our approach overcomes some practical difficulties that have previously been encountered when the Lévy process has infinite activity. [source]
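
The multinomial lattice reduces to the familiar Cox-Ross-Rubinstein binomial tree when the driving Lévy process is Brownian motion. As a point of reference, the sketch below prices an American put on that binomial special case by backward induction with an early-exercise check at every node; the Lévy generalization replaces the two branches with several branches matched to the Lévy increment distribution. All parameter values are illustrative.

```python
import numpy as np

def american_put_crr(S0, K, r, sigma, T, n):
    """Price an American put on a CRR binomial lattice (the geometric
    Brownian motion special case of the paper's multinomial trees)."""
    dt = T / n
    u = np.exp(sigma * np.sqrt(dt))         # up factor
    d = 1.0 / u                             # down factor
    p = (np.exp(r * dt) - d) / (u - d)      # risk-neutral up probability
    disc = np.exp(-r * dt)

    # payoff at maturity over terminal prices S0 * u^j * d^(n-j)
    S = S0 * u ** np.arange(n + 1) * d ** np.arange(n, -1, -1)
    V = np.maximum(K - S, 0.0)

    # backward induction with an early-exercise check at each node
    for i in range(n - 1, -1, -1):
        S = S0 * u ** np.arange(i + 1) * d ** np.arange(i, -1, -1)
        V = np.maximum(K - S, disc * (p * V[1:] + (1 - p) * V[:-1]))
    return V[0]

print(american_put_crr(S0=100, K=100, r=0.05, sigma=0.2, T=1.0, n=500))
```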


Status of Microbial Modeling in Food Process Models

COMPREHENSIVE REVIEWS IN FOOD SCIENCE AND FOOD SAFETY, Issue 1 2008
Bradley P. Marks
ABSTRACT: Food process models are typically aimed at improving process design or operation by optimizing some physical or chemical outcome, such as maximizing processing yield, minimizing energy usage, or maximizing nutrient retention. However, in seeking to achieve these objectives, one of the critical constraints is usually microbiological. For example, growth of pathogens or spoilage organisms must be held below a certain level, or pathogen reduction for a kill step must achieve a certain target. Therefore, mathematical models for microbial populations subjected to food processing operations are essential elements of the broader field of food process modeling. However, the complexity of the underlying biological phenomena presents special challenges in formulating, validating, and applying microbial models to real-world applications. In that context, the narrow purpose of this article is to (1) outline the general terminology and constructs of microbial models, (2) evaluate the state of knowledge/state of the art in application of these models, and (3) offer observations about current limitations and future opportunities in the area of predictive microbiology for food process modeling. [source]
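
A standard primary-model construct in predictive microbiology, of the kind this review surveys, is log-linear (first-order) inactivation parameterized by D- and z-values. The sketch below, with purely illustrative parameter values, computes the log reduction delivered by a kill step and the hold time needed for a target reduction; it is not a model taken from the article itself.

```python
import numpy as np

def d_value(d_ref, z, t_ref, t):
    """D-value (minutes per 1-log reduction) at temperature t, from a
    reference D-value via the classical z-value (Bigelow) model."""
    return d_ref * 10 ** ((t_ref - t) / z)

def log_reduction(time_min, d):
    """Log10 reduction delivered by holding time_min at constant D."""
    return time_min / d

# Illustrative numbers only: D60 = 5 min, z = 5 degC for some pathogen.
d65 = d_value(d_ref=5.0, z=5.0, t_ref=60.0, t=65.0)   # D at 65 degC
print("D(65) =", d65, "min")
print("log reduction after 3 min:", log_reduction(3.0, d65))
print("time for a 5-log kill step:", 5 * d65, "min")
```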


The use of GIS-based digital morphometric techniques in the study of cockpit karst

EARTH SURFACE PROCESSES AND LANDFORMS, Issue 2 2007
P. Lyew-Ayee
Abstract Cockpit karst landscapes are among the most distinctive landscapes in the world, and have been the focus of long-standing scientific interest. Early researchers used largely descriptive techniques to categorize the terrain, and subsequent work has not attempted to critically re-evaluate descriptions of landscapes using more sophisticated methods. The distinctive surface topography of cockpit karst areas can be characterized in order to compare them with other karst as well as non-karst areas, and to determine geological and/or climatic conditions that are responsible for the observed terrain. Process models of the rate of karst denudation or evolution can only be accurate if the contemporary morphology of the landscape is quantitatively and unambiguously defined. A detailed analysis of cockpit karst terrain is carried out using the latest GIS-based digital morphometric techniques in order to assess the nature of such terrain and provide further information for subsequent modelling, as well as other non-geomorphological applications, such as environmental management and conservation issues. The paper presents the methodology used for the digital analysis of terrain and landforms in the distinctive Cockpit Country area of Jamaica and its environs. The results indicate that cockpit karst may be categorized based on its vertical, horizontal and shape characteristics, as well as by looking at the semivariogram, slope characteristics, and landscape relief scale, which combine measures of vertical and horizontal scales. Copyright © 2006 John Wiley & Sons, Ltd. [source]
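
One of the morphometric diagnostics named above is the semivariogram, which summarizes how elevation variance grows with separation distance. The sketch below computes a simple one-dimensional empirical semivariogram along the rows of a gridded DEM; real analyses would use two-dimensional and directional variants, and the synthetic elevation grid here merely stands in for Cockpit Country data.

```python
import numpy as np

def semivariogram_rows(z, max_lag):
    """Empirical semivariogram along grid rows:
    gamma(h) = 0.5 * mean[(z(x) - z(x + h))^2], lag h in grid cells."""
    return np.array([0.5 * np.mean((z[:, h:] - z[:, :-h]) ** 2)
                     for h in range(1, max_lag + 1)])

# Synthetic stand-in for a DEM tile (a random walk has growing variance,
# so the semivariogram rises with lag, as it does for real terrain).
rng = np.random.default_rng(1)
dem = np.cumsum(rng.normal(size=(100, 200)), axis=1)
print(semivariogram_rows(dem, max_lag=5).round(2))
```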


The behaviour of soil process models of ammonia volatilization at contrasting spatial scales

EUROPEAN JOURNAL OF SOIL SCIENCE, Issue 6 2008
R. Corstanje
Summary. Process models are commonly used in soil science to obtain predictions at a spatial scale that is different from the scale at which the model was developed, or the scale at which information on model inputs is available. When this happens, the model and its inputs require aggregation or disaggregation to the application scale, and this is a complex problem. Furthermore, the validity of the aggregated model predictions depends on whether the model describes the key processes that determine the process outcome at the target scale. Different models may therefore be required at different spatial scales. In this paper we develop a diagnostic framework which allows us to judge whether a model is appropriate for use at one or more spatial scales, both with respect to the prediction of variations at those scales and with respect to the requirement for disaggregation of the inputs. We show that spatially nested analysis of the covariance of predictions with measured process outcomes is an efficient way to do this. This is applied to models of the processes that lead to ammonia volatilization from soil after the application of urea. We identify the component correlations at different scales of a nested scheme as the diagnostic with which to evaluate model behaviour. These correlations show how well the model emulates components of spatial variation of the target process at the scales of the sampling scheme. Aggregate correlations were identified as the most pertinent to evaluate models for prediction at particular scales, since they measure how well aggregated predictions at some scale correlate with aggregated values of the measured outcome. There are two circumstances under which models are used to make predictions. In the first case only the model is used to predict, and the most useful diagnostic is the concordance aggregate correlation. In the second case model predictions are assimilated with observations, which should correct bias in the prediction and errors in the variance; the aggregate correlations would be the most suitable diagnostic. [source]
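
The aggregate correlation diagnostic can be illustrated with a simple block-averaging stand-in for the paper's spatially nested scheme: aggregate both predictions and measured outcomes to a given scale and correlate them. In the hypothetical example below, the model captures coarse-scale variation but not fine-scale noise, so the correlation improves with aggregation; the nested analysis-of-covariance machinery itself is not reproduced.

```python
import numpy as np

def aggregate_correlation(pred, obs, block):
    """Correlation between block-aggregated predictions and observations.

    pred, obs: 1-D arrays on the finest sampling grid (length divisible
    by block). Aggregation is a block mean; the correlation at each block
    size plays the role of the paper's aggregate correlation diagnostic.
    """
    p = pred.reshape(-1, block).mean(axis=1)
    o = obs.reshape(-1, block).mean(axis=1)
    return np.corrcoef(p, o)[0, 1]

rng = np.random.default_rng(0)
truth = np.repeat(rng.normal(size=32), 8)         # coarse-scale signal
obs = truth + rng.normal(scale=1.0, size=256)     # fine-scale measurement noise
pred = truth + rng.normal(scale=0.5, size=256)    # model tracks the coarse scale
for block in (1, 4, 16, 64):
    print(block, round(aggregate_correlation(pred, obs, block), 2))
```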


Defining the moment of erosion: the principle of thermal consonance timing

EARTH SURFACE PROCESSES AND LANDFORMS, Issue 13 2005
D. M. Lawler
Abstract Geomorphological process research demands quantitative information on erosion and deposition event timing and magnitude, in relation to fluctuations in the suspected driving forces. This paper establishes a new measurement principle, thermal consonance timing (TCT), which delivers clearer, more continuous and quantitative information on erosion and deposition event magnitude, timing and frequency, to assist understanding of the controlling mechanisms. TCT is based on monitoring the switch from characteristically strong temperature gradients in sediment, to weaker gradients in air or water, which reveals the moment of erosion. The paper (1) derives the TCT principle from soil micrometeorological theory; (2) illustrates initial concept operationalization for field and laboratory use; (3) presents experimental data for simple soil erosion simulations; and (4) discusses initial application of TCT and perifluvial micrometeorology principles in the delivery of timing solutions for two bank erosion events on the River Wharfe, UK, in relation to the hydrograph. River bank thermal regimes respond, as soil temperature and energy balance theory predicts, with strong horizontal thermal gradients (often >1 K cm⁻¹ over 6.8 cm). TCT fixed the timing of two erosion events, the first during inundation, the second 19 h after the discharge peak and 13 h after re-emergence from the flow. This provides rare confirmation of delayed bank retreat, quantifies the time-lag involved, and suggests mass failure processes rather than fluid entrainment. Erosion events can be virtually instantaneous, implying 'catastrophic retreat' rather than 'progressive entrainment'. Considerable potential exists to employ TCT approaches for: validating process models in several geomorphological contexts; assisting process identification and improving discrimination of competing hypotheses of process dominance through high-resolution, simultaneous analysis of erosion and deposition events and driving forces; defining shifting erodibility and erosion thresholds; refining dynamic linkages in event-based sediment budget investigations; and deriving closer approximations to 'true' erosion and deposition rates, especially in self-concealing scour-and-fill systems. Copyright © 2005 John Wiley & Sons, Ltd. [source]
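
The TCT principle itself is algorithmically simple: monitor the temperature difference across a sensor pair buried in the bank, and flag the moment its gradient collapses from sediment-like to water- or air-like values. The sketch below, with a hypothetical detection threshold and the 6.8 cm sensor spacing mentioned above, scans a simulated record for that switch.

```python
import numpy as np

def tct_erosion_time(t, temp_inner, temp_outer, spacing_cm, thresh=0.5):
    """Thermal consonance timing sketch: return the first time at which
    the temperature gradient between two sensors (K/cm) collapses below
    `thresh`, signalling that the outer sensor has been exposed by erosion.

    The threshold is hypothetical; field values follow from the strong
    (>1 K/cm) gradients the paper reports in bank sediment.
    """
    grad = np.abs(temp_inner - temp_outer) / spacing_cm
    exposed = np.flatnonzero(grad < thresh)
    return t[exposed[0]] if exposed.size else None

hours = np.arange(0, 48, 0.25)
inner = 12 + 2 * np.sin(2 * np.pi * hours / 24)          # diurnal temperature wave
outer = np.where(hours < 30, inner + 8.0, inner + 0.2)   # gradient collapses at t = 30 h
print(tct_erosion_time(hours, inner, outer, spacing_cm=6.8))
```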


Vegetation structure characteristics and relationships of Kalahari woodlands and savannas

GLOBAL CHANGE BIOLOGY, Issue 3 2004
J.L. Privette
Abstract The Kalahari Transect is one of several International Geosphere–Biosphere Programme (IGBP) transects designed to address global change questions at the regional scale, in particular by exploiting natural parameter gradients (Koch et al., 1995). In March 2000, we collected near-synoptic vegetation structural data at five sites spanning the Kalahari's large precipitation gradient (about 300–1000 mm yr⁻¹) from southern Botswana (~24°S) to Zambia (~15°S). All sites were within the expansive Kalahari sandsheet. Common parameters, including plant area index (PAI), leaf area index (LAI) and canopy cover (CC), were measured or derived using several indirect instruments and at multiple spatial scales. Results show that CC and PAI increase with increasing mean annual precipitation. Canopy clumping, defined by the deviation of the gap size distribution from that of randomly distributed foliage, was fairly constant along the gradient. We provide empirical relationships relating these parameters to each other and to precipitation. These results, combined with those in companion Kalahari Transect studies, provide a unique and coherent test bed for ecological modeling. The data may be used to parameterize process models, as well as test internally predicted parameters and their variability in response to well-characterized climatological differences. [source]
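
The abstract reports empirical relationships between canopy structure and precipitation. A minimal sketch of fitting such a relationship is shown below; the site values are hypothetical stand-ins spanning the ~300-1000 mm yr⁻¹ gradient, not the paper's data, so the fitted coefficients are illustrative only.

```python
import numpy as np

# Hypothetical site values along the Kalahari precipitation gradient;
# the paper's actual site data and fitted coefficients are not reproduced.
map_mm = np.array([350.0, 450.0, 550.0, 800.0, 1000.0])   # mean annual precipitation
lai = np.array([0.6, 0.9, 1.2, 1.8, 2.3])                 # leaf area index

slope, intercept = np.polyfit(map_mm, lai, 1)             # simple linear relation
print(f"LAI ~ {intercept:.2f} + {slope:.4f} * MAP")
print("predicted LAI at 700 mm/yr:", round(intercept + slope * 700, 2))
```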


Preferential flows and travel time distributions: defining adequate hypothesis tests for hydrological process models

HYDROLOGICAL PROCESSES, Issue 12 2010
Keith J. Beven
Abstract This introduction to the second annual review issue of Hydrological Processes tries to put the collection of papers on preferential flows and travel time distributions into a more general context of testing models as hypotheses about how catchment systems function. It is suggested that, because of the possibilities of non-stationary and epistemic errors in both data and models, such tests could be carried out within a rejectionist limits-of-acceptability framework. The principles and difficulties of hypothesis testing within these particular research areas are discussed. An important point to take from this discussion is that the use of a formal testing framework, and the consequent rejection of models as hypotheses after allowing for uncertainties in the data, is the starting point for developing better theories and data sets. Copyright © 2010 John Wiley & Sons, Ltd. [source]
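
A rejectionist limits-of-acceptability test of the kind discussed here retains a model run only if every prediction falls within observation-specific acceptance limits. The sketch below applies fixed symmetric limits for brevity, whereas Beven argues the limits should reflect the (possibly non-stationary and epistemic) uncertainties in each observation; the parameter range and data are hypothetical.

```python
import numpy as np

def survives(pred, obs, lower, upper):
    """Limits-of-acceptability check: a model run is retained only if
    every prediction lies within the observation-specific limits."""
    return np.all((pred >= obs + lower) & (pred <= obs + upper))

rng = np.random.default_rng(2)
obs = np.array([1.0, 1.4, 2.2, 3.1, 2.6])   # observed series
lower, upper = -0.3, 0.3                     # acceptance band (here constant for brevity)

# Monte Carlo over a one-parameter model family: prediction = k * signal.
behavioural = [k for k in rng.uniform(0.5, 1.5, size=1000)
               if survives(k * obs, obs, lower, upper)]
print(len(behavioural), "of 1000 parameter sets survive rejection")
```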


A temporal perspective of the computer game development process

INFORMATION SYSTEMS JOURNAL, Issue 5 2009
Patrick Stacey
Abstract. This paper offers an insight into the games software development process from a time perspective by drawing on an in-depth study in a games development organization. The wider market for computer games now exceeds the annual global revenues of cinema. We have, however, only a limited scholarly understanding of how games studios produce games. Games projects require particular attention because their context is unique. Drawing on a case study, the paper offers a theoretical conceptualization of the development process of creative software, such as games software. We found that the process, as constituted by the interactions of developers, oscillates between two modes of practice: routinized and improvised, which sediment and flux the working rhythms in the context. This paper argues that while we may lay down the broad stages of creative software development in advance, the activities that constitute each stage, and the transition criteria from one to the next, may be left to the actors in the moment, to the temporality of the situation as it emerges. If all development activities are predefined, as advocated in various process models, this may leave little room for opportunity and the creative fruits that flow from opportunity, such as enhanced features, aesthetics and learning. [source]


On a class of switched, robustly stable, adaptive systems

INTERNATIONAL JOURNAL OF ADAPTIVE CONTROL AND SIGNAL PROCESSING, Issue 3 2001
Felipe M. Pait
Abstract A class of switched algorithms for adaptive control of SISO linear systems is described. The systems considered are assumed to belong to one among a finite number of classes of admissible process models, and each class is robustly stabilizable by some linear time-invariant controller. The control used is chosen in real time by a tuner or supervisor, according to observations of suitably defined 'identification errors'. The method preserves the robustness properties of the linear control design in an adaptive context, thus extending earlier ideas in multiple-model adaptive control by presenting a more flexible and less conservative framework for considering such systems. One motivating application is fault-tolerant control. Copyright © 2001 John Wiley & Sons, Ltd. [source]
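
The supervisory logic can be sketched as follows: maintain one monitored identification error per candidate model class and, at each step, apply the controller associated with the smallest error. Real supervisors add dwell-time or hysteresis safeguards against chattering, which are omitted here, and the candidate error signals below are synthetic.

```python
import numpy as np

def supervisor_switch(errors, forgetting=0.98):
    """Pick, at each time step, the controller index whose exponentially
    weighted squared identification error is currently smallest.

    errors: (T, m) instantaneous identification errors for m candidate
    model classes. Hysteresis/dwell-time logic is omitted for brevity.
    """
    T, m = errors.shape
    J = np.zeros(m)                      # monitored (filtered) squared errors
    choice = np.empty(T, dtype=int)
    for t in range(T):
        J = forgetting * J + errors[t] ** 2
        choice[t] = int(np.argmin(J))
    return choice

rng = np.random.default_rng(3)
e = rng.normal(size=(200, 3)) * np.array([1.0, 0.2, 1.5])  # class 1 fits best
print(supervisor_switch(e)[-5:])                           # settles on index 1
```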


The Mediating Role of Feedback Acceptance in the Relationship between Feedback and Attitudinal and Performance Outcomes

INTERNATIONAL JOURNAL OF SELECTION AND ASSESSMENT, Issue 4 2009
Frederik Anseel
The purpose of this study was to increase our understanding of applicant perceptions of feedback by drawing upon feedback process models. In Study 1, participants (N=125) completed a personality questionnaire as a first stage of a selection simulation. Results showed that the effect of feedback on attitudes toward the organization was mediated by feedback acceptance. In Study 2, participants (N=252) completed two parallel versions of an in-basket exercise and received informative feedback between the two versions. Results showed that the effect of feedback on subsequent test performance was partially mediated by feedback acceptance. Together, these results highlight the important role of feedback acceptance in selection and suggest new strategies to enhance applicant perceptions in selection. [source]


Process similarity and developing new process models through migration

AICHE JOURNAL, Issue 9 2009
Junde Lu
Abstract An industrial process may operate over a range of conditions to produce different grades of product. With a data-based model, as conditions change, a different process model must be developed. Adapting existing process models can allow using fewer experiments for the development of a new process model, resulting in a saving of time, cost, and effort. Process similarity is defined and classified based on process representation. A model migration strategy is proposed for one type of process similarity, family similarity, which involves developing a new process model by taking advantage of an existing base model, and process attribute information. A model predicting melt-flow-length in injection molding is developed and tested as an example and shown to give satisfactory results. © 2009 American Institute of Chemical Engineers AIChE J, 2009 [source]
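
One simple way to realize migration under family similarity, sketched below under that assumption, is to fit a slope-and-bias correction to the base model using the few experiments available at the new operating condition; the paper's full strategy also exploits process attribute information, which is omitted here.

```python
import numpy as np

def migrate(base_model, x_new, y_new):
    """Fit a slope/bias correction so that a * base(x) + b matches the
    few experiments available under the new operating condition."""
    f = base_model(x_new)
    a, b = np.polyfit(f, y_new, 1)       # two-parameter correction
    return lambda x: a * base_model(x) + b

base = lambda x: 2.0 * x + 1.0           # existing (base) process model
x_new = np.array([1.0, 2.0, 3.0])        # few experiments on the new grade
y_new = np.array([4.1, 6.4, 8.3])        # new process behaves differently
new_model = migrate(base, x_new, y_new)
print(new_model(np.array([1.5, 2.5])))
```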


Incremental identification of fluid multi-phase reaction systems

AICHE JOURNAL, Issue 4 2009
Claas Michalik
Abstract Despite their importance, rigorous process models are rarely available for reaction and especially multi-phase reaction systems. The high complexity of these systems, which is due to the superposed effects of mass transfer and intrinsic reaction, is the major barrier for the development of process models. A methodology that allows the systematic decomposition of mass transfer and chemical reaction, and thus enables the efficient identification of multi-phase reaction systems, is proposed in this work. The method is based on the so-called Incremental Identification Method, recently presented by Brendel et al., Chem Eng Sci. 2006;61:5404-5420. The method makes it easy to test the identifiability of a system on the basis of the available measurement data. If identifiability is given, the intrinsic reaction kinetics can be identified in a sound and numerically robust manner. These benefits are illustrated using a simulated 2-phase enzyme reaction system. © 2009 American Institute of Chemical Engineers AIChE J, 2009 [source]
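
The incremental idea can be caricatured in two steps: first estimate the reaction flux model-free from measured concentrations, then fit a candidate rate law to the estimated flux. The sketch below does this for a synthetic single-phase, first-order example; the mass-transfer decomposition that distinguishes the multi-phase case is not shown.

```python
import numpy as np

# Step 1 (incremental idea): estimate the reaction flux r(t) ~ -dC/dt
# directly from measured concentrations, without assuming any kinetics.
t = np.linspace(0.0, 5.0, 26)
C = 2.0 * np.exp(-0.8 * t)                  # synthetic "measurements"
r_est = -np.gradient(C, t)                  # finite-difference flux estimate

# Step 2: fit a candidate rate law r = k * C to the estimated fluxes.
k_hat = np.sum(r_est * C) / np.sum(C ** 2)  # least-squares slope through origin
print("estimated k:", round(k_hat, 3), "(true value 0.8)")
```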


The routinization of innovation research: a constructively critical review of the state-of-the-science

JOURNAL OF ORGANIZATIONAL BEHAVIOR, Issue 2 2004
Neil Anderson
In this review we argue that facilitators of innovation at the individual, group, and organizational levels have been reliably identified, and that validated process models of innovation have been developed. However, a content analysis of selected research published between 1997 and 2002 suggests a routinization of innovation research, with a heavy focus on replication–extension, cross-sectional designs, and a single level of analysis. We discuss five innovative pathways for future work: study innovation as an independent variable, across cultures, within a multi-level framework, and use meta-analysis and triangulation. To illustrate we propose a 'distress-related innovation' model of the relations between negatively connotated variables and innovation at the individual, group, and organizational levels of analysis. Copyright © 2004 John Wiley & Sons, Ltd. [source]


Factorized approach to nonlinear MPC using a radial basis function model

AICHE JOURNAL, Issue 2 2001
Sharad Bhartiya
A new computationally efficient approach for nonlinear model predictive control (NMPC) presented here uses the factorability of radial basis function (RBF) process models in a traditional model predictive control (MPC) framework. The key to the approach is to formulate the RBF process model so that it can make nonlinear predictions across a p-step horizon without using future unknown process measurements. The RBF model avoids error propagation from the use of model predictions as input in a recursive or iterative manner. The resulting NMPC formulation using the RBF model provides analytic expressions for the gradient and Hessian of the controller's objective function in terms of RBF network parameters. Solution of the NMPC optimization problem is simplified significantly by factorization of the RBF model output into terms containing only known and unknown parts of the process. [source]
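
The essential trick is that the regressor for the p-step-ahead prediction stacks the current measurements with the known future input sequence, so no unknown intermediate outputs need to be fed back. The sketch below trains a Gaussian RBF network of that form by linear least squares on synthetic data; the analytic gradient and Hessian expressions that the factorization provides are not reproduced.

```python
import numpy as np

def rbf_design(Z, centers, width):
    """Gaussian RBF features phi_ij = exp(-||z_i - c_j||^2 / width^2)."""
    d2 = ((Z[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / width ** 2)

# Regressor z stacks the current output with the p future inputs, so the
# p-step prediction needs no recursion through unknown intermediate outputs.
rng = np.random.default_rng(4)
p = 3
Z = rng.normal(size=(400, 1 + p))                       # [y_k, u_k, ..., u_{k+p-1}]
y_future = np.tanh(Z @ np.array([0.5, 0.3, 0.2, 0.1]))  # synthetic plant response

centers = Z[rng.choice(len(Z), size=25, replace=False)]
Phi = rbf_design(Z, centers, width=2.0)
w, *_ = np.linalg.lstsq(Phi, y_future, rcond=None)      # linear-in-weights fit

z_test = Z[:5]
print(rbf_design(z_test, centers, 2.0) @ w)             # direct p-step predictions
print(y_future[:5])
```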


A General Misspecification Test for Spatial Regression Models: Dependence, Heterogeneity, and Nonlinearity

JOURNAL OF REGIONAL SCIENCE, Issue 2 2001
Thomas De Graaff
There is an increasing awareness of the potentials of nonlinear modeling in regional science. This can be explained partly by the recognition of the limitations of conventional equilibrium models in complex situations, and also by the easy availability and accessibility of sophisticated computational techniques. Among the class of nonlinear models, dynamic variants based on, for example, chaos theory stand out as an interesting approach. However, the operational significance of such approaches is still rather limited and a rigorous statistical-econometric treatment of nonlinear dynamic modeling experiments is lacking. Against this background this paper is concerned with a methodological and empirical analysis of a general misspecification test for spatial regression models that is expected to have power against nonlinearity, spatial dependence, and heteroskedasticity. The paper seeks to break new research ground by linking the classical diagnostic tools developed in spatial econometrics to a misspecification test derived directly from chaos theory: the BDS test, developed by Brock, Dechert, and Scheinkman (1987). A spatial variant of the BDS test is introduced and applied in the context of two examples of spatial process models, one of which is concerned with the spatial distribution of regional investments in The Netherlands, the other with spatial crime patterns in Columbus, Ohio. [source]


Pseudomartingale estimating equations for modulated renewal process models

JOURNAL OF THE ROYAL STATISTICAL SOCIETY: SERIES B (STATISTICAL METHODOLOGY), Issue 1 2009
Fengchang Lin
Summary. We adapt martingale estimating equations based on gap time information to a general intensity model for a single realization of a modulated renewal process. The consistency and asymptotic normality of the estimators are proved under ergodicity conditions. Previous work has considered either parametric likelihood analysis or semiparametric multiplicative models using partial likelihood. The framework is generally applicable to semiparametric and parametric models, including additive and multiplicative specifications, and periodic models. It facilitates a semiparametric extension of a popular parametric earthquake model. Simulations and empirical analyses of Taiwanese earthquake sequences illustrate the methodology's practical utility. [source]


Residual analysis for spatial point processes (with discussion)

JOURNAL OF THE ROYAL STATISTICAL SOCIETY: SERIES B (STATISTICAL METHODOLOGY), Issue 5 2005
A. Baddeley
Summary. We define residuals for point process models fitted to spatial point pattern data, and we propose diagnostic plots based on them. The residuals apply to any point process model that has a conditional intensity; the model may exhibit spatial heterogeneity, interpoint interaction and dependence on spatial covariates. Some existing ad hoc methods for model checking (quadrat counts, scan statistic, kernel smoothed intensity and Berman's diagnostic) are recovered as special cases. Diagnostic tools are developed systematically, by using an analogy between our spatial residuals and the usual residuals for (non-spatial) generalized linear models. The conditional intensity λ plays the role of the mean response. This makes it possible to adapt existing knowledge about model validation for generalized linear models to the spatial point process context, giving recommendations for diagnostic plots. A plot of smoothed residuals against spatial location, or against a spatial covariate, is effective in diagnosing spatial trend or covariate effects. Q–Q plots of the residuals are effective in diagnosing interpoint interaction. [source]
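
For the simplest (homogeneous Poisson) case, the raw residual of a region B is the observed count minus the fitted intensity integrated over B. The quadrat sketch below illustrates this; the paper's residuals apply to general conditional intensities and come with smoothed and scaled variants not shown here.

```python
import numpy as np

def raw_residuals(points, lam_hat, grid=4, window=1.0):
    """Raw residuals R(B) = n(B) - integral_B lambda_hat over quadrats B.

    points: (n, 2) locations in the unit window; lam_hat: fitted (here
    constant) intensity. For an adequate model the residuals should
    fluctuate around zero with no spatial pattern.
    """
    edge = np.linspace(0, window, grid + 1)
    counts, _, _ = np.histogram2d(points[:, 0], points[:, 1], bins=[edge, edge])
    expected = lam_hat * (window / grid) ** 2
    return counts - expected

rng = np.random.default_rng(5)
n = rng.poisson(200.0)                 # homogeneous Poisson, lambda = 200
pts = rng.uniform(size=(n, 2))
lam_hat = len(pts)                     # MLE of constant intensity on the unit square
print(raw_residuals(pts, lam_hat).round(1))
```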


Improving the Quality of Information Flows in the Backend of a Product Development Process: a Case Study

QUALITY AND RELIABILITY ENGINEERING INTERNATIONAL, Issue 2 2005
Jaring Boersma
Abstract Considerable research has gone into designing effective product development processes. This, coupled with the increasing need for products that are able to deliver reliable, complex functionality with a high degree of innovation, presents a major challenge to modern day industries in the business of developing products. In order to incorporate relevant field experience in the design and manufacturing of new products, increasingly detailed information needs to be retrieved from the market in a very short amount of time. In one particular consumer electronics industry, business process models describing the information flow in the backend of the product development process indicated massive data loss and also serious data quality degradation. This paper attempts to show how such losses can be mitigated and also proposes a business model that can adequately capture information of a higher quality and in a more structured manner. The end result will be a product development process that provides better feedback on current product performance and is more responsive to future market needs. Copyright © 2005 John Wiley & Sons, Ltd. [source]


Identification and fine tuning of closed-loop processes under discrete EWMA and PI adjustments

QUALITY AND RELIABILITY ENGINEERING INTERNATIONAL, Issue 6 2001
Rong Pan
Abstract Conventional process identification techniques for an open-loop process use the cross-correlation function between historical values of the process input and of the process output. If the process is operated under a linear feedback controller, however, the cross-correlation function has no information on the process transfer function because of the linear dependency of the process input on the output. In this paper, several circumstances where a closed-loop system can be identified by the autocorrelation function of the output are discussed. It is assumed that a proportional-integral controller with known parameters is acting on the process while the output data were collected. The disturbance is assumed to be a member of a simple yet useful family of stochastic models, which is able to represent drift. It is shown that, with these general assumptions, it is possible to identify some dynamic process models commonly encountered in manufacturing. After identification, our approach suggests tuning the controller to a near-optimal setting according to a well-known performance criterion. Copyright © 2001 John Wiley & Sons, Ltd. [source]


Annotation: Recent Research Examining the Role of Peer Relationships in the Development of Psychopathology

THE JOURNAL OF CHILD PSYCHOLOGY AND PSYCHIATRY AND ALLIED DISCIPLINES, Issue 5 2001
Kirby Deater-Deckard
This Annotation highlights recent research on the role of peer group and friendship factors in the development of psychopathology in childhood and adolescence. Several processes are considered, including peer rejection (e.g., exclusion and victimization), social withdrawal and avoidance of peer interaction, and the socialization of deviant behavior and internalizing problems. The mediating influences of several proximal components are examined, including cognitive-perceptual factors and emotion regulation. In addition, the moderating influences of close friendship, age, gender, ethnicity, and group norms are considered. Several promising avenues for future directions in research are highlighted, including the examination of heterogeneity in developmental processes, further investigation of gender-based norms, and the application of multi-level modeling techniques and gene-environment process models. [source]


Using Reference Models within the Enterprise Resource Planning Lifecycle

AUSTRALIAN ACCOUNTING REVIEW, Issue 22 2000
MICHAEL ROSEMANN
ERP-specific reference models describe on a conceptual level the structure and functionality of enterprise resource planning solutions. However, these models focus on depicting executable processes and do not take into account tasks related to business engineering, system selection, implementation or change. This paper discusses how reference process models can be used within the entire ERP lifecycle. All phases of the ERP lifecycle have individual requirements for the management of the relevant knowledge. It will be shown how extended reference models can serve as a knowledge repository for enterprise resource planning. This paper includes several pragmatic recommendations for managers involved in ERP projects. [source]


Bayesian Nonparametric Modeling for Comparison of Single-Neuron Firing Intensities

BIOMETRICS, Issue 1 2010
Athanasios Kottas
Summary. We propose a fully inferential model-based approach to the problem of comparing the firing patterns of a neuron recorded under two distinct experimental conditions. The methodology is based on nonhomogeneous Poisson process models for the firing times of each condition with flexible nonparametric mixture prior models for the corresponding intensity functions. We demonstrate posterior inferences from a global analysis, which may be used to compare the two conditions over the entire experimental time window, as well as from a pointwise analysis at selected time points to detect local deviations of firing patterns from one condition to another. We apply our method on two neurons recorded from the primary motor cortex area of a monkey's brain while performing a sequence of reaching tasks. [source]
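
As a crude frequentist stand-in for the pointwise analysis described above, one can estimate each condition's intensity with a kernel smoother and inspect the pointwise difference; the sketch below does this on synthetic spike trains. The paper's approach instead places nonparametric mixture priors on the intensities, which yields full posterior uncertainty for such comparisons.

```python
import numpy as np

def kernel_intensity(spikes, grid, bw):
    """Gaussian-kernel estimate of a nonhomogeneous Poisson intensity."""
    d = grid[:, None] - spikes[None, :]
    return np.exp(-0.5 * (d / bw) ** 2).sum(axis=1) / (bw * np.sqrt(2 * np.pi))

rng = np.random.default_rng(6)
grid = np.linspace(0, 1, 101)
cond_a = np.sort(rng.uniform(size=80))        # synthetic spikes, condition A
cond_b = np.sort(rng.beta(2, 5, size=80))     # condition B fires earlier in the window
diff = kernel_intensity(cond_a, grid, 0.05) - kernel_intensity(cond_b, grid, 0.05)
print("largest pointwise rate difference at t =", grid[np.argmax(np.abs(diff))])
```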


Hierarchical Spatial Modeling of Additive and Dominance Genetic Variance for Large Spatial Trial Datasets

BIOMETRICS, Issue 2 2009
Andrew O. Finley
Summary. This article expands upon recent interest in Bayesian hierarchical models in quantitative genetics by developing spatial process models for inference on additive and dominance genetic variance within the context of large spatially referenced trial datasets. Direct application of such models to large spatial datasets is, however, computationally infeasible because of the cubic-order matrix algorithms involved in estimation. The situation is even worse in Markov chain Monte Carlo (MCMC) contexts, where such computations are performed for several iterations. Here, we discuss approaches that help obviate these hurdles without sacrificing the richness in modeling. For genetic effects, we demonstrate how an initial spectral decomposition of the relationship matrices negates the expensive matrix inversions required in previously proposed MCMC methods. For spatial effects, we outline two approaches for circumventing the prohibitively expensive matrix decompositions: the first leverages analytical results from Ornstein–Uhlenbeck processes that yield computationally efficient tridiagonal structures, whereas the second derives a modified predictive process model from the original model by projecting its realizations to a lower-dimensional subspace, thereby reducing the computational burden. We illustrate the proposed methods using a synthetic dataset with additive and dominance genetic effects and anisotropic spatial residuals, and a large dataset from a Scots pine (Pinus sylvestris L.) progeny study conducted in northern Sweden. Our approaches enable us to provide a comprehensive analysis of this large trial, which amply demonstrates that, in addition to violating basic assumptions of the linear model, ignoring spatial effects can result in downwardly biased measures of heritability. [source]
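
The predictive process idea is a projection: realize the spatial effect at m knots and map it to all n locations through the cross-covariance, w̃(s) = c(s)ᵀ C*⁻¹ w*, so only an m × m matrix is ever factored. The sketch below shows the projection in one dimension with an exponential covariance; the knot count, range and variance are illustrative.

```python
import numpy as np

def exp_cov(A, B, sigma2=1.0, phi=3.0):
    """Exponential covariance sigma2 * exp(-phi * distance)."""
    d = np.abs(A[:, None] - B[None, :])
    return sigma2 * np.exp(-phi * d)

rng = np.random.default_rng(7)
s = np.sort(rng.uniform(size=500))        # observation locations (1-D for brevity)
knots = np.linspace(0, 1, 25)             # m << n knot locations

C_star = exp_cov(knots, knots)            # m x m knot covariance, cheap to factor
c = exp_cov(s, knots)                     # n x m cross-covariance
jitter = 1e-10 * np.eye(25)               # numerical stabilizer
w_star = np.linalg.cholesky(C_star + jitter) @ rng.normal(size=25)

# Predictive process: project the knot realization to all n locations.
w_tilde = c @ np.linalg.solve(C_star + jitter, w_star)
print(w_tilde[:5].round(3))
```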


Wildlife Population Assessment: Past Developments and Future Directions

BIOMETRICS, Issue 1 2000
S. T. Buckland
Summary. We review the major developments in wildlife population assessment in the past century. Three major areas are considered: mark-recapture, distance sampling, and harvest models. We speculate on how these fields will develop in the next century. Topics for which we expect to see methodological advances include integration of modeling with Geographic Information Systems, automated survey design algorithms, advances in model-based inference from sample survey data, a common inferential framework for wildlife population assessment methods, improved methods for estimating population trends, the embedding of biological process models into inference, substantially improved models for conservation management, advanced spatiotemporal models of ecosystems, and greater emphasis on incorporating model selection uncertainty into inference. We discuss the kind of developments that might be anticipated in these topics. [source]


Application of agent-based system for bioprocess description and process improvement

BIOTECHNOLOGY PROGRESS, Issue 3 2010
Ying Gao
Abstract Modeling plays an important role in bioprocess development for design and scale-up. Predictive models can also be used in biopharmaceutical manufacturing to assist decision-making either to maintain process consistency or to identify optimal operating conditions. To predict the whole bioprocess performance, the strong interactions present in a processing sequence must be adequately modeled. Traditionally, bioprocess modeling considers process units separately, which makes it difficult to capture the interactions between units. In this work, a systematic framework is developed to analyze the bioprocesses based on a whole process understanding and considering the interactions between process operations. An agent-based approach is adopted to provide a flexible infrastructure for the necessary integration of process models. This enables the prediction of overall process behavior, which can then be applied during process development or once manufacturing has commenced, in both cases leading to the capacity for fast evaluation of process improvement options. The multi-agent system comprises a process knowledge base, process models, and a group of functional agents. In this system, agent components co-operate with each other in performing their tasks. These include the description of the whole process behavior, evaluating process operating conditions, monitoring of the operating processes, predicting critical process performance, and providing guidance to decision-making when coping with process deviations. During process development, the system can be used to evaluate the design space for process operation. During manufacture, the system can be applied to identify abnormal process operation events and then to provide suggestions as to how best to cope with the deviations. In all cases, the function of the system is to ensure an efficient manufacturing process. The implementation of the agent-based approach is illustrated via selected application scenarios, which demonstrate how such a framework may enable the better integration of process operations by providing a plant-wide process description to facilitate process improvement. © 2009 American Institute of Chemical Engineers Biotechnol. Prog., 2010 [source]


Public Relations Planning and Action as "Practical-Critical" Communication

COMMUNICATION THEORY, Issue 4 2003
Wayne D. Woodward
A practical-critical approach to communication contends that critical analysis should have practical consequences, specifically to extend participation and to introduce innovative forms of communication. Planning and action process models in public relations illustrate the approach. The practical-critical position develops out of a reconstructive revision of existing, instrumental models. The emphases are (a) variabilities and contingencies in communication, (b) temporal sequencing of cooperative activity, (c) conditions of uncertainty that are part of pursuing a shared focus through joint activity, and (d) the interdependent relations among material, symbolic, and relational dimensions of process planning and action. The practical-critical framework provides for continuous, dialectical analysis of a central focus of activity, while deriving benefits from the sequencing of cooperative effort. [source]

