Conventional Models

Selected Abstracts


Contact allergy, irritancy and 'danger'

CONTACT DERMATITIS, Issue 3 2000
J. P. McFadden
Conventional models of the immune response are based on distinguishing self from non-self. However, we consider that the more recently proposed 'danger' model may be an illuminating alternative for studying allergic contact dermatitis. In this model, an antigenic signal on its own would tend to produce tolerance. In contrast, in the presence of a 'danger' signal, which, in the case of allergic contact dermatitis, we suggest is usually cutaneous irritancy, the immune system would become activated, leading first to the induction of sensitization and subsequently to the elicitation of a contact hypersensitivity response. In most cases, both the antigenic signal and the irritant signal will come from the hapten, although, e.g., in an occupational setting, traumiterative dermatitis would be the source of the 'danger' signal. Typically, the irritant signal tends to be more concentration-dependent and is thus the overriding factor in determining the effective sensitizing and eliciting concentrations of the hapten. A further prediction of this hypothesis is that successful experiments demonstrating low-dose tolerance with contact allergens may be explained by the loss of the irritant effect at lower dilutions, whilst an antigenic stimulus remains present. [source]


Labor Taxation in Search Equilibrium with Home Production

GERMAN ECONOMIC REVIEW, Issue 4 2002
Bertil Holmlund
Conventional models of equilibrium unemployment typically imply that proportional taxes on labor earnings are neutral with respect to unemployment as long as the tax does not affect the replacement rate provided by unemployment insurance, i.e. unemployment benefits relative to after-tax earnings. When home production is an option, the conventional results may no longer hold. This paper uses a search equilibrium model with home production to examine the employment and welfare implications of labor taxes. The employment effect of a rise in a proportional tax is found to be negative for sufficiently low replacement rates, whereas it is ambiguous for moderate and high replacement rates. Numerical calibrations of the model indicate that employment generally falls when labor taxes are raised. [source]
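A minimal worked sketch of the neutrality condition the abstract invokes, with notation that is mine rather than the paper's: let w be the wage, t the proportional tax rate, and b the unemployment benefit. The replacement rate is

```latex
\rho = \frac{b}{(1 - t)\,w}
```

If benefits are indexed to after-tax earnings so that \rho stays fixed when t rises, the relative payoff of unemployment versus work is unchanged and the tax is neutral; home production adds an untaxed third use of time, which is how the neutrality result can break.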


Sorption of benzidine and 3,3′-dichlorobenzidine to lake sediments.

ENVIRONMENTAL TOXICOLOGY & CHEMISTRY, Issue 5 2005
1. Conceptualization and development of a multiparameter model
Abstract Aromatic amines, such as benzidine and 3,3′-dichlorobenzidine (DCB), are used in the manufacture of dyes and pigments. The prolonged use of these carcinogenic chemicals over the past generation has introduced a significant amount of contamination into the environment. Their persistence in several media has sparked a number of studies aimed at developing predictive tools for their fate and transport in the environment. In this study, benzidine and DCB batch isotherms were developed and evaluated. The sediment samples were variable in composition, ranging from sandy to silty-clay sediments. The batch samples were analyzed using high-performance liquid chromatography. Subsequently, a multiparameter model (MPM) that accounted for partitioning, covalent bonding, and cation exchange was developed and tested in an effort to understand the various mechanisms. Results showed the proposed model to be effective in predicting sorption of aromatic amines to lake sediments. The findings suggest that the MPM can provide a better understanding of the sorption process of aromatic amines than more conventional models. [source]
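As a hedged illustration of how such batch-isotherm data are typically fitted (this is a generic Freundlich fit, not the paper's multiparameter model; the data and starting values are hypothetical):

```python
# Generic Freundlich isotherm fit to hypothetical batch-equilibrium data;
# illustrative only, not the MPM described in the abstract.
import numpy as np
from scipy.optimize import curve_fit

def freundlich(C, KF, n):
    # q = KF * C^(1/n): sorbed amount vs. aqueous equilibrium concentration
    return KF * C ** (1.0 / n)

C_eq = np.array([0.05, 0.1, 0.5, 1.0, 2.0])       # mg/L (hypothetical)
q_eq = np.array([12.0, 19.0, 55.0, 80.0, 120.0])  # mg/kg (hypothetical)

(KF, n), _ = curve_fit(freundlich, C_eq, q_eq, p0=(50.0, 1.5))
print(f"KF = {KF:.1f} (mg/kg)(L/mg)^(1/n), n = {n:.2f}")
```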


Brain networks: Graph theoretical analysis and development models

INTERNATIONAL JOURNAL OF IMAGING SYSTEMS AND TECHNOLOGY, Issue 2 2010
Myoung Won Cho
Abstract A popular approach to understanding the brain is to build a map representing its structural network, also known as the connectome, at the scale of brain regions. Analysis based on graph theory provides quantitative insights into general topological principles of brain network organization. In particular, it has been shown that typical brain networks share topological properties, such as small-world and scale-free organization, with many other complex networks encountered in nature. Such topological properties are regarded as characteristics of the optimal neural connectivity for implementing efficient computation and communication; brains with disease or abnormality show distinguishable deviations in the graph-theoretical analysis. Because conventional models in graph theory are not adequate for direct application to the neural system, we also discuss a model for explaining how neural connectivity is organized. © 2010 Wiley Periodicals, Inc. Int J Imaging Syst Technol, 20, 108–116, 2010 [source]
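As a hedged sketch of the graph-theoretical quantities mentioned above, computed on a toy surrogate network (the sizes and parameters are illustrative, not taken from the paper):

```python
# Small-world diagnostics on a toy network standing in for a region-level
# connectome; all sizes and parameters are illustrative.
import networkx as nx

G = nx.connected_watts_strogatz_graph(n=90, k=6, p=0.1, seed=1)

C = nx.average_clustering(G)             # strong local clustering
L = nx.average_shortest_path_length(G)   # short average path length
degree_hist = nx.degree_histogram(G)     # degree distribution (tail shape)

print(f"clustering C = {C:.3f}, characteristic path length L = {L:.2f}")
```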


Removal of arsenic from simulated groundwater by GAC-Fe: A modeling approach

AICHE JOURNAL, Issue 7 2009
P. Mondal
Abstract A study on the kinetics and equilibrium of adsorption of arsenic species by iron-impregnated granular activated charcoal (GAC-Fe) is presented, using simulated groundwater containing arsenic (As(III):As(V)::1:1), Fe, and Mn at concentrations of 0.188 mg/L, 2.8 mg/L, and 0.6 mg/L, respectively. Also presented is the interaction effect of As, Fe, and Mn on the removal of arsenic species from the simulated contaminated groundwater. Among conventional models, the pseudo-second-order kinetic model and the Freundlich isotherm were adequate to explain the kinetics and the equilibrium of the adsorption process, respectively. However, compared with the conventional isotherms, an empirical polynomial isotherm provided a more accurate prediction of the equilibrium specific uptakes of arsenic species. Effects of the initial concentrations of As, Fe, and Mn on the removal of total arsenic (As(T)), As(V), and As(III) have been correlated within an error limit of −0.2 to +5.64%. © 2009 American Institute of Chemical Engineers AIChE J, 2009 [source]
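For reference, the two conventional models named in the abstract have standard textbook forms (q_t is the sorbed amount at time t, q_e its equilibrium value, C_e the equilibrium concentration; k_2, K_F, and n are fitted constants; the notation is mine):

```latex
\frac{dq_t}{dt} = k_2\,(q_e - q_t)^2
\;\;\Rightarrow\;\;
\frac{t}{q_t} = \frac{1}{k_2 q_e^2} + \frac{t}{q_e}
\qquad\text{(pseudo-second-order)}

q_e = K_F\, C_e^{1/n}
\qquad\text{(Freundlich)}
```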


Effect of chemical kinetics on feasible splits for reactive distillation

AICHE JOURNAL, Issue 3 2001
Nitin Chadda
Feasible direct and indirect sharp splits for multicomponent single-feed continuous reactive distillation are predicted with a model in which each column section is represented by a series of cocurrent isobaric flashes. In the limits of no reaction and of equilibrium chemical reaction, the model reduces to conventional models for distillation lines, and each column section can be represented by the same equations. At intermediate reaction rates, however, the models for the column sections differ, and new results for fixed points and feasible products are obtained. A bifurcation study shows the limits of feasibility, including the influence of flow rate, catalyst level, and holdup. Unlike distillation without reaction, limited ranges of feasibility in all of these variables are found. The method has been applied to five examples, one of which is described in detail. Feasibility predictions are validated by column simulations. [source]
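A hedged sketch of the limiting behavior described above, written in the Damköhler-number form common in this literature (the notation is assumed, not quoted from the paper): a section composition profile can be written as

```latex
\frac{dx_i}{d\xi} = (x_i - y_i) + \mathrm{Da}\,(\nu_i - \nu_T\, x_i)\, r(x)
```

For Da → 0 this reduces to the nonreactive distillation-line equation dx_i/dξ = x_i − y_i, while Da → ∞ pins the profile to chemical equilibrium; at intermediate Da the fixed points, and hence the feasible sharp splits, move with flow rate, catalyst level, and holdup.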


Formation of hard very high energy gamma-ray spectra of blazars due to internal photon–photon absorption

MONTHLY NOTICES OF THE ROYAL ASTRONOMICAL SOCIETY, Issue 3 2008
Felix A. Aharonian
ABSTRACT The energy spectra of TeV gamma-rays from blazars, after being corrected for intergalactic absorption in the extragalactic background light (EBL), appear unusually hard, a fact that poses challenges to the conventional models of particle acceleration in TeV blazars and/or to the EBL models. In this paper, we show that the internal absorption of gamma-rays caused by interactions with dense narrow-band radiation fields in the vicinity of compact gamma-ray production regions can lead to the formation of gamma-ray spectra of almost arbitrary hardness. This allows significant relaxation of the current tight constraints on particle acceleration and radiation models, although at the expense of enhanced requirements on the available non-thermal energy budget. The latter, however, is not a critical issue, as long as it can be largely compensated by Doppler boosting, assuming large (>10) Doppler factors of the relativistically moving gamma-ray production regions. The suggested scenario of formation of hard gamma-ray spectra predicts detectable synchrotron radiation from secondary electron–positron pairs, which might require a revision of the current 'standard paradigm' of spectral energy distributions of gamma-ray blazars. If the primary gamma-rays are of hadronic origin, related to pp or pγ interactions, the 'internal gamma-ray absorption' model predicts neutrino fluxes close to the detection threshold of the next-generation high-energy neutrino detectors. [source]
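For orientation, the internal absorption invoked here is photon–photon pair production, γγ → e⁺e⁻. In a hedged, textbook form (symbols mine, not the paper's), a gamma-ray of energy E is absorbed on a target photon of energy ε when

```latex
E\,\varepsilon\,(1 - \cos\theta) \;\ge\; 2\,(m_e c^2)^2,
\qquad
\tau_{\gamma\gamma}(E) \;\sim\; \sigma_{\gamma\gamma}\, n_{\rm ph}\, R
```

where θ is the collision angle, n_ph the target photon density, and R the size of the absorbing region. A dense narrow-band field therefore removes photons selectively near its threshold, which is how the escaping spectrum can emerge much harder than the injected one.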


JOINTLY-DETERMINED ECOLOGICAL THRESHOLDS AND ECONOMIC TRADE-OFFS IN WILDLIFE DISEASE MANAGEMENT

NATURAL RESOURCE MODELING, Issue 4 2007
ELI P. FENICHEL
ABSTRACT. We investigate wildlife disease management, in a bioeconomic framework, when the wildlife host is valuable and disease transmission is density-dependent. In density-dependent models, disease prevalence is reduced whenever the population is harvested below a host-density threshold: a population density below which disease prevalence declines and above which the disease becomes epidemic. In conventional models, the threshold is an exogenous function of disease parameters. We consider this case and find a steady state with positive disease prevalence to be optimal. Next, we consider a case in which disease dynamics are affected by both population controls and changes in human-environmental interactions. The host-density threshold is endogenous in this case. That is, the manager does not simply manage the population relative to the threshold, but rather manages both the population and the threshold. The optimal threshold depends on the economic and ecological trade-offs arising from the jointly-determined system. Accounting for this endogeneity can lead to reduced disease prevalence rates and higher population levels. Additionally, we show that ecological parameters that may be unimportant in conventional models that do not account for the endogeneity of the host-density threshold are potentially important when the threshold is recognized as endogenous. [source]
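A hedged sketch of the exogenous threshold in a standard density-dependent (mass-action) host-disease model, with my notation rather than the paper's: if transmission is βSI and infected hosts leave the infected class at rate μ + α + γ (natural mortality, disease-induced mortality, recovery), a rare infection spreads only if

```latex
R_0 = \frac{\beta N}{\mu + \alpha + \gamma} > 1
\quad\Longleftrightarrow\quad
N > N_T = \frac{\mu + \alpha + \gamma}{\beta}
```

Harvesting the host below N_T thus drives prevalence down. The paper's endogenous case corresponds to management actions that also shift β, making N_T itself a choice variable rather than a fixed parameter.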


SOFTWARE ENGINEERING CONSIDERATIONS FOR INDIVIDUAL-BASED MODELS

NATURAL RESOURCE MODELING, Issue 1 2002
GLEN E. ROPELLA
ABSTRACT. Software design is much more important for individual-based models (IBMs) than it is for conventional models, for three reasons. First, the results of an IBM are the emergent properties of a system of interacting agents that exist only in the software; unlike analytical model results, an IBM's outcomes can be reproduced only by exactly reproducing its software implementation. Second, outcomes of an IBM are expected to be complex and novel, making software errors difficult to identify. Third, an IBM needs 'systems software' that manages populations of multiple kinds of agents, often has nonlinear and multi-threaded process control, and simulates a wide range of physical and biological processes. General software guidelines for complex models are especially important for IBMs. (1) Have code critically reviewed by several people. (2) Follow prudent release management practices, keeping careful control over the software as changes are implemented. (3) Develop multiple representations of the model and its software; diagrams and written descriptions of code aid design and understanding. (4) Use appropriate and widespread software tools, which provide numerous major benefits; coding 'from scratch' is rarely appropriate. (5) Test the software continually, following a planned, multi-level, experimental strategy. (6) Provide tools for thorough, pervasive validation and verification. (7) Pay attention to how pseudorandom numbers are generated and used. Additional guidelines for IBMs include: (a) design the model's organization before starting to write code; (b) provide the ability to observe all parts of the model from the beginning; (c) make an extensive effort to understand how the model executes, including how often different pieces of code are called and by which objects; and (d) design the software to resemble the system being modeled, which helps maintain an understanding of the software. Strategies for meeting these guidelines include planning adequate resources for software development, using software professionals to implement models, and using tools like Swarm that are designed specifically for IBMs. [source]
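As a hedged sketch of guidelines (7) and (b), here is a minimal individual-based model with an explicitly seeded random number generator and an observer hook; all names and the toy dynamics are hypothetical:

```python
# Minimal IBM skeleton: seeded RNG for reproducibility (guideline 7) and an
# observer hook exposing model state from the start (guideline b).
import random
from dataclasses import dataclass

@dataclass
class Agent:
    energy: float

class Model:
    def __init__(self, seed, n_agents=100):
        # One explicitly seeded RNG per run: a run can be reproduced exactly.
        self.rng = random.Random(seed)
        self.agents = [Agent(self.rng.uniform(1.0, 10.0))
                       for _ in range(n_agents)]

    def step(self):
        # Toy dynamics: stochastic energy drift; agents die at zero energy.
        for a in self.agents:
            a.energy += self.rng.gauss(0.0, 1.0)
        self.agents = [a for a in self.agents if a.energy > 0.0]

    def observe(self):
        # Observer hook: report full population state, not just one summary.
        n = len(self.agents)
        mean_e = sum(a.energy for a in self.agents) / n if n else 0.0
        return {"agents": n, "mean_energy": round(mean_e, 3)}

model = Model(seed=42)
for t in range(10):
    model.step()
    print(t, model.observe())
```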


Production planning with resources subject to congestion

NAVAL RESEARCH LOGISTICS: AN INTERNATIONAL JOURNAL, Issue 2 2009
Jakob Asmundsson
Abstract A fundamental difficulty in developing effective production planning models has been accurately reflecting the nonlinear dependency between workload and lead times. We develop a mathematical programming model for production planning in multiproduct, single stage systems that captures the nonlinear dependency between workload and lead times. We then use outer linearization of this nonlinear model to obtain a linear programming formulation and extend it to multistage systems. Extensive computational experiments validate the approach and compare its results to conventional models that assume workload-independent planning lead times. © 2009 Wiley Periodicals, Inc. Naval Research Logistics, 2009 [source]
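A hedged sketch of the outer-linearization step (the symbols are mine, not the paper's): if expected output X in a period is bounded by a concave function f of the workload W, the nonlinear constraint can be replaced by a finite family of tangent lines,

```latex
X \le f(W)
\quad\leadsto\quad
X \le f(W_i) + f'(W_i)\,(W - W_i), \qquad i = 1,\dots,m
```

Because f is concave, every tangent lies above the curve, so the linearized feasible region contains the nonlinear one and tightens as tangents are added, keeping the planning model a linear program.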


Numerical and experimental investigation of shrinkage behavior of precision injection molded articles.

POLYMER ENGINEERING & SCIENCE, Issue 8 2008

In the accompanying paper, Part I, we presented the physical modeling and numerical formulation of new lateral-motion models. In Part II, the new models developed in Part I are validated by successful comparison of the calculated residual stress profiles with literature data. The predictions of birefringence, residual stress distribution, and shrinkage from the new lateral-motion modeling agree better with the corresponding experimental data than those from the conventional models. The new model's predictions fall between those of the two extreme cases corresponding to the conventional models. An extensive parametric study of processing conditions shows that the developed analysis system is capable of successfully predicting how shrinkage behavior varies with most processing conditions. In this regard, the new model enables better analysis-based design and optimization of precision injection-molded products. POLYM. ENG. SCI., 2008. © 2008 Society of Plastics Engineers [source]


Collapse as Cultural Revolution: Power and Identity in the Tiwanaku to Pacajes Transition

ARCHEOLOGICAL PAPERS OF THE AMERICAN ANTHROPOLOGICAL ASSOCIATION, Issue 1 2004
John Wayne Janusek
Inherent foundations of power are often made explicit in state collapse and ethnogenesis, among the most problematic processes tackled by archaeologists. Recent research on collapse globally indicates that conventional models prioritizing external change (e.g., environmental shift, immigration) fail to address the historical intricacies of and human agency involved in state fragmentation. Some recent models treat collapse as a sudden drop in political complexity, and most fail to elaborate how state collapse influenced postcollapse sociopolitical and cultural patterns. Synthesizing substantial recent research on Tiwanaku (A.D. 500–1150) and post-Tiwanaku Pacajes (A.D. 1150–1450) polities in the south-central Andes, I suggest that state collapse involved a fateful conjunction of sociopolitical and environmental transformations. Drought conditions descended upon a centralized yet highly fragile sociopolitical landscape that had become increasingly volatile during Tiwanaku's apogee. Collapse involved rapid transformation as well as slow, cumulative shifts and enduring continuities. It was a cultural revolution that began during Tiwanaku hegemony and drew heavily on existing practices and ideals. Grounded in practice theory, this case study finds human agency squarely in the center of macroprocesses such as collapse and situates Andean foundations of power in the matrix of local ideals, practices, and identities from which hegemonic regimes such as Tiwanaku were forged. [source]


After BitTorrent: Darknets to Native Data

ARCHITECTURAL DESIGN, Issue 5 2006
Anthony Burke
Abstract What are the implications of the inherent reflexivity of the Internet for the design professions? Anthony Burke argues that radically innovative and distributed forms of information exchange such as BitTorrent suggest a general shift away from the traditional conception of the architect as master builder to one more in line with the collaborative remixing and patching tactics of the hacker. BitTorrent is a communications protocol that allows massive information exchange across a virtually unlimited number of users with minimal resources. Through its sheer force of collectively pooled imagination, it provides a potent example of the sorts of platforms of information exchange that foster the new forms of communal organisation that Michael Hardt and Antonio Negri term the 'Multitude', and which productively challenge conventional models of cultural invention and production. In this context, Burke raises questions about the implications of this broader shift for the design professions' business organisation, as well as their more general methodologies. Copyright © 2006 John Wiley & Sons, Ltd. [source]


Sustainability practices of SMEs: the case of NZ

BUSINESS STRATEGY AND THE ENVIRONMENT, Issue 4 2006
S. R. Lawrence
Abstract While individually small and medium sized enterprises (SMEs) may have small social, environmental and financial impacts, cumulatively their impact is significant. One of the fundamental questions is how a single economic entity, especially a small-scale enterprise, can be engaged in the uptake of sustainability practices. This question is particularly pertinent to New Zealand, where 98% of enterprises are SMEs. In this paper questions are raised about the conventional models of 'business ethics' and accountability and their relevance to SMEs. The paper reports on actual practices and discusses the possibility of small enterprises having accountability for their social and environmental impacts. Ways of linking individual firm activities to sustainability, such as a communitarian model of accountability, are discussed and illustrated. Copyright © 2006 John Wiley & Sons, Ltd and ERP Environment. [source]


The Large-Scale Structure of Semantic Networks: Statistical Analyses and a Model of Semantic Growth

COGNITIVE SCIENCE - A MULTIDISCIPLINARY JOURNAL, Issue 1 2005
Mark Steyvers
Abstract We present statistical analyses of the large-scale structure of 3 types of semantic networks: word associations, WordNet, and Roget's Thesaurus. We show that they have a small-world structure, characterized by sparse connectivity, short average path lengths between words, and strong local clustering. In addition, the distributions of the number of connections follow power laws that indicate a scale-free pattern of connectivity, with most nodes having relatively few connections joined together through a small number of hubs with many connections. These regularities have also been found in certain other complex natural networks, such as the World Wide Web, but they are not consistent with many conventional models of semantic organization, based on inheritance hierarchies, arbitrarily structured networks, or high-dimensional vector spaces. We propose that these structures reflect the mechanisms by which semantic networks grow. We describe a simple model for semantic growth, in which each new word or concept is connected to an existing network by differentiating the connectivity pattern of an existing node. This model generates appropriate small-world statistics and power-law connectivity distributions, and it also suggests one possible mechanistic basis for the effects of learning history variables (age of acquisition, usage frequency) on behavioral performance in semantic processing tasks. [source]
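A hedged sketch of the growth process described above (the selection rule and parameter values are illustrative assumptions, not the paper's exact specification): each new node "differentiates" an existing node, chosen with probability proportional to its connectivity, and attaches to a subset of that node's neighbors.

```python
# Differentiation-based semantic growth: new nodes copy part of the
# neighborhood of an existing, well-connected node. Illustrative only.
import random

def grow(n_nodes=500, m=3, seed=0):
    rng = random.Random(seed)
    # Seed network: a small clique so every early node has m neighbors.
    adj = {i: {j for j in range(m + 1) if j != i} for i in range(m + 1)}
    for new in range(m + 1, n_nodes):
        # Differentiate an existing node, chosen ~ proportional to degree.
        nodes = list(adj)
        host = rng.choices(nodes, weights=[len(adj[u]) for u in nodes], k=1)[0]
        # The new node attaches to m random neighbors of the host.
        targets = rng.sample(sorted(adj[host]), k=min(m, len(adj[host])))
        adj[new] = set()
        for t in targets:
            adj[new].add(t)
            adj[t].add(new)
    return adj

net = grow()
degrees = sorted((len(nbrs) for nbrs in net.values()), reverse=True)
print("top degrees:", degrees[:5])  # heavy tail: a few hubs emerge
```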