Modeling Methods

Selected Abstracts


ModEco: an integrated software package for ecological niche modeling

ECOGRAPHY, Issue 4 2010
Qinghua Guo
ModEco is a software package for ecological niche modeling. It integrates a range of niche modeling methods within a geographical information system. ModEco provides a user-friendly platform that enables users to explore, analyze, and model species distribution data with relative ease. ModEco has several unique features: 1) it deals with different types of ecological observation data, such as presence and absence data, presence-only data, and abundance data; 2) it provides a range of models for presence-only data, such as presence-only models, pseudo-absence models, background-versus-presence models, and ensemble models; and 3) it includes relatively comprehensive tools for data visualization, feature selection, and accuracy assessment. [source]
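
A minimal sketch of the pseudo-absence strategy mentioned above, written in Python with scikit-learn rather than ModEco itself: random background points stand in for absences so a standard classifier can be fit to presence-only records. All data and values are invented for illustration.

```python
# Pseudo-absence sketch (not ModEco code): sample random background points
# and fit a classifier against presence-only records.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(42)

# Hypothetical environmental covariates at presence locations (e.g. temp, precip).
presence = rng.normal(loc=1.0, scale=0.5, size=(50, 2))
# Pseudo-absences: random draws from the whole study region.
pseudo_absence = rng.uniform(low=-2.0, high=3.0, size=(200, 2))

X = np.vstack([presence, pseudo_absence])
y = np.concatenate([np.ones(len(presence)), np.zeros(len(pseudo_absence))])

model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
# Relative habitat suitability for a new site:
print(model.predict_proba([[1.2, 0.8]])[0, 1])
```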


Differences in spatial predictions among species distribution modeling methods vary with species traits and environmental predictors

ECOGRAPHY, Issue 6 2009
Alexandra D. Syphard
Prediction maps produced by species distribution models (SDMs) influence decision-making in resource management or designation of land in conservation planning. Many studies have compared the prediction accuracy of different SDM modeling methods, but few have quantified the similarity among prediction maps. There has also been little systematic exploration of how the relative importance of different predictor variables varies among model types and affects map similarity. Our objective was to expand the evaluation of SDM performance for 45 plant species in southern California to better understand how map predictions vary among model types, and to explain what factors may affect spatial correspondence, including the selection and relative importance of different environmental variables. Four types of models were tested. Correlation among maps was highest between generalized linear models (GLMs) and generalized additive models (GAMs) and lowest between classification trees and GAMs or GLMs. Correlation between Random Forests (RFs) and GAMs was the same as between RFs and classification trees. Spatial correspondence among maps was influenced the most by model prediction accuracy (AUC) and species prevalence; map correspondence was highest when accuracy was high and prevalence was intermediate (average prevalence for all species was 0.124). Species functional type and the selection of climate variables also influenced map correspondence. For most (but not all) species, climate variables were more important than terrain or soil in predicting their distributions. Environmental variable selection varied according to modeling method, but the largest differences were between RFs and GLMs or GAMs. Although prediction accuracy was equal for GLMs, GAMs, and RFs, the differences in spatial predictions suggest that it may be important to evaluate the results of more than one model to estimate the range of spatial uncertainty before making planning decisions based on map outputs. This may be particularly important if models have low accuracy or if species prevalence is not intermediate. [source]
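
As a hedged illustration of the map-comparison idea (not the study's code or data), the sketch below fits a GLM stand-in (logistic regression) and a random forest to the same synthetic presence/absence data, then correlates their prediction "maps" over a regular covariate grid.

```python
# Comparing spatial predictions from two SDM types via map correlation.
import numpy as np
from scipy.stats import pearsonr
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
X = rng.uniform(0, 1, size=(500, 2))            # two environmental layers
p_true = 1 / (1 + np.exp(-(4 * X[:, 0] - 2)))   # species responds to layer 0
y = rng.binomial(1, p_true)

glm = LogisticRegression().fit(X, y)
rf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X, y)

# "Maps": predictions over a regular grid of the two covariates.
g = np.stack(np.meshgrid(np.linspace(0, 1, 50),
                         np.linspace(0, 1, 50)), -1).reshape(-1, 2)
r, _ = pearsonr(glm.predict_proba(g)[:, 1], rf.predict_proba(g)[:, 1])
print(f"map correlation (GLM vs RF): {r:.2f}")
```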


The effect of sample size and species characteristics on performance of different species distribution modeling methods

ECOGRAPHY, Issue 5 2006
Pilar A. Hernandez
Species distribution models should provide conservation practitioners with estimates of the spatial distributions of species requiring attention. These species are often rare and have limited known occurrences, posing challenges for creating accurate species distribution models. We tested four modeling methods (Bioclim, Domain, GARP, and Maxent) across 18 species with different levels of ecological specialization, using six sample size treatments and three evaluation measures. Our assessment revealed that Maxent was the most capable of the four methods at producing useful results with sample sizes as small as 5, 10, and 25 occurrences. The other methods performed reasonably well (Domain and GARP) to poorly (Bioclim) when presented with small samples. We show that multiple evaluation measures are necessary to determine the accuracy of models produced with presence-only data. Further, we found that model accuracy is greater for species with small geographic ranges and limited environmental tolerance, ecological characteristics shared by many rare species. Our results indicate that reasonable models can be made for some rare species, a result that should encourage conservationists to add distribution modeling to their toolbox. [source]
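
A toy version of the sample-size experiment (synthetic data; a logistic-regression stand-in rather than Bioclim, Domain, GARP, or Maxent) shows the kind of AUC-versus-sample-size behavior the study examined.

```python
# Illustrative sketch: how accuracy (AUC) degrades as occurrence records shrink.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(7)
Xb = rng.uniform(0, 1, size=(2000, 3))                 # background environment
occ = rng.normal(0.7, 0.1, size=(500, 3)).clip(0, 1)   # occurrences cluster in a niche
train_occ, test_occ = occ[:400], occ[400:]

X_test = np.vstack([test_occ, Xb[1000:]])
y_test = np.r_[np.ones(len(test_occ)), np.zeros(1000)]

for n in (5, 10, 25, 50, 100):
    sample = train_occ[rng.choice(len(train_occ), size=n, replace=False)]
    X = np.vstack([sample, Xb[:1000]])
    y = np.r_[np.ones(n), np.zeros(1000)]
    model = LogisticRegression().fit(X, y)
    auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
    print(f"n={n:3d}  AUC={auc:.2f}")
```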


Inverse modeling methods for indoor airborne pollutant tracking: literature review and fundamentals

INDOOR AIR, Issue 6 2007
X. Liu
First page of article [source]


Studies of molecular docking between fibroblast growth factor and heparin using generalized simulated annealing

INTERNATIONAL JOURNAL OF QUANTUM CHEMISTRY, Issue 13 2008
Samuel Silva da Rocha Pita
Abstract Since the mid-1970s, the central problem in molecular docking has been adequately treating the degrees of freedom of the protein (or receptor), owing to the roughness of the energy landscape and the high computational cost. Until recently, few algorithms had been developed that treat both ligand and receptor as simultaneously flexible at low computational cost. Generalized simulated annealing (GSA), a recently proposed statistical-mechanical method, has been employed in diverse works on global optimization problems. In this work, we applied this method to the molecular docking problem, taking the FGF-2/heparin complex as a test case. Since the requirements of an efficient docking algorithm are accuracy and speed, we tested the influence of the GSA parameters qA (new-configuration acceptance index), qV (energy-surface visiting index), and qT (temperature-decrease control) on the performance of the GSADOCK program. Our simulations showed that as the temperature parameter qT increases, the qA parameter follows the same behavior in the interval from 1.1 to 2.3. We found that the GSA parameters give the best performance for qA values from 1.1 to 1.3, qV values from 1.3 to 1.5, and qT values from 1.1 to 1.7. Most of the good qV values were equal or close to the good qT values. Finally, the implemented algorithm is trustworthy and can be employed as a molecular modeling tool. The final version of the program will be free of charge and accessible at our home page, or it may be requested from the authors by e-mail. © 2008 Wiley Periodicals, Inc. Int J Quantum Chem, 2008 [source]
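
For readers who want to experiment with GSA, SciPy's dual_annealing implements a generalized simulated annealing scheme; the sketch below runs it on a toy rugged landscape, not the FGF-2/heparin docking problem. SciPy's visit parameter plays roughly the role of qV, while its accept parameter is parameterized differently from the paper's qA, so the values here are simply within SciPy's allowed ranges.

```python
# GSA on a toy energy surface standing in for a docking score.
import numpy as np
from scipy.optimize import dual_annealing

def energy(x):
    """Rugged Rastrigin-like landscape with many local minima."""
    return np.sum(x**2) + 10 * np.sum(1 - np.cos(2 * np.pi * x))

bounds = [(-5.0, 5.0)] * 4          # four toy degrees of freedom
result = dual_annealing(energy, bounds, visit=1.4, seed=3)
print(result.x, result.fun)         # global minimum is at the origin
```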


Pervaporation separation of sodium alginate/chitosan polyelectrolyte complex composite membranes for the separation of water/alcohol mixtures: Characterization of the permeation behavior with molecular modeling techniques

JOURNAL OF APPLIED POLYMER SCIENCE, Issue 4 2007
Sang-Gyun Kim
Abstract Polyelectrolyte complex (PEC) membranes were prepared by the complexation of protonated chitosan with sodium alginate doped on a porous polysulfone supporting membrane. The pervaporation characteristics of the membranes were investigated with various alcohol/water mixtures. The physicochemical properties of the permeant molecules and polyion complex membranes were determined with molecular modeling methods, and the data from these methods were used to explain the permeation of water and alcohol molecules through the PEC membranes. The experimental results showed that the prepared PEC membranes had excellent pervaporation performance in most aqueous alcohol solutions and that the selectivity and permeability of the membranes depended on the molecular size, polarity, and hydrophilicity of the permeant alcohols. However, aqueous methanol solutions showed a permeation behavior different from that of the other alcohol solutions: methanol permeated the prepared PEC membranes more easily than water, even though water molecules are more polar and smaller than methanol molecules. The experimental results are discussed from the point of view of the physical properties of the permeant molecules and the membranes in the permeation state. © 2006 Wiley Periodicals, Inc. J Appl Polym Sci 103: 2634–2641, 2007 [source]
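
As a hedged illustration of the size/polarity comparison behind this argument (a stand-in, not the paper's molecular modeling protocol), RDKit can compute simple descriptors for the permeants.

```python
# Size and polarity proxies for the permeant molecules.
from rdkit import Chem
from rdkit.Chem import Descriptors, rdMolDescriptors

permeants = {"water": "O", "methanol": "CO", "ethanol": "CCO",
             "isopropanol": "CC(C)O"}
for name, smiles in permeants.items():
    mol = Chem.AddHs(Chem.MolFromSmiles(smiles))
    mw = Descriptors.MolWt(mol)                 # molecular size proxy
    tpsa = rdMolDescriptors.CalcTPSA(mol)       # topological polar surface area
    print(f"{name:12s} MW={mw:5.1f}  TPSA={tpsa:4.1f}")
```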


Complex molecular assemblies at hand via interactive simulations

JOURNAL OF COMPUTATIONAL CHEMISTRY, Issue 15 2009
Olivier Delalande
Abstract Studying complex molecular assemblies interactively is becoming an increasingly appealing approach to molecular modeling. Here we focus on interactive molecular dynamics (IMD) as a textbook example of interactive simulation methods. Such simulations can be useful in exploring and generating hypotheses about the structural and mechanical aspects of biomolecular interactions. For the first time, we carry out low-resolution coarse-grain IMD simulations. Such simplified modeling methods currently appear to be more suitable for interactive experiments and represent a well-balanced compromise between an important gain in computational speed and a moderate loss in modeling accuracy compared with higher resolution all-atom simulations. This is particularly useful for initial exploration and hypothesis development for rare molecular interaction events. We evaluate which applications are currently feasible using molecular assemblies from 1900 to over 300,000 particles. Three biochemical systems are discussed: the guanylate kinase (GK) enzyme, the outer membrane protease T, and the soluble N-ethylmaleimide-sensitive factor attachment protein receptor (SNARE) complex involved in membrane fusion. We induce large conformational changes, carry out interactive docking experiments, probe lipid–protein interactions, and are able to sense the mechanical properties of a molecular model. Furthermore, such interactive simulations facilitate exploration of modeling parameters for method improvement. For the purpose of these simulations, we have developed a freely available software library called MDDriver. It uses the IMD protocol from NAMD and facilitates the implementation and application of interactive simulations. With MDDriver it becomes very easy to render any particle-based molecular simulation engine interactive. Here we use its implementation in the Gromacs software as an example. © 2009 Wiley Periodicals, Inc. J Comput Chem, 2009 [source]
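
The following self-contained toy (plain NumPy; not MDDriver, NAMD, or Gromacs code) illustrates the core interactive-MD idea: a user-supplied force is injected into an ordinary dynamics loop at each step, here steering one bead of a coarse-grained chain while the rest relax.

```python
# Toy "interactive" dynamics: inject an external pull into the force loop.
import numpy as np

n, k, dt = 10, 100.0, 0.01
pos = np.zeros((n, 3))
pos[:, 0] = 0.5 * np.arange(n)                  # beads on the x-axis, spacing 0.5
vel = np.zeros((n, 3))

def spring_forces(p):
    """Harmonic bonds between neighboring beads, rest length 0.5."""
    f = np.zeros_like(p)
    bond = p[1:] - p[:-1]
    d = np.linalg.norm(bond, axis=1, keepdims=True)
    fb = -k * (d - 0.5) * bond / d              # force on the downstream bead
    f[1:] += fb
    f[:-1] -= fb
    return f

for step in range(1000):
    f = spring_forces(pos)
    f[-1] += np.array([1.0, 0.0, 0.0])          # "interactive" pull on last bead
    vel = 0.99 * (vel + dt * f)                 # damped explicit integration
    pos = pos + dt * vel

print("end-to-end distance:", np.linalg.norm(pos[-1] - pos[0]))
```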


Integrated estimation of measurement error with empirical process modeling: a hierarchical Bayes approach

AICHE JOURNAL, Issue 11 2009
Hongshu Chen
Abstract Advanced empirical process modeling methods such as those used for process monitoring and data reconciliation rely on information about the nature of noise in the measured variables. Because this likelihood information is often unavailable for many practical problems, approaches based on repeated measurements or process constraints have been developed for their estimation. Such approaches are limited by data availability and often lack theoretical rigor. In this article, a novel Bayesian approach is proposed to tackle this problem. Uncertainty about the error variances is incorporated in the Bayesian framework by setting noninformative priors for the noise variances. This general strategy is used to modify the Sampling-based Bayesian Latent Variable Regression (Chen et al., J Chemom., 2007) approach, to make it more robust to inaccurate information about the likelihood functions. Different noninformative priors for the noise variances are discussed and unified in this work. The benefits of this new approach are illustrated via several case studies. © 2009 American Institute of Chemical Engineers AIChE J, 2009 [source]
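
A minimal sketch of the core idea, assuming PyMC and synthetic data (this is not the authors' sampling-based Bayesian latent variable regression): place a weak prior on the unknown noise variance rather than treating it as known.

```python
# Bayesian regression with the noise level treated as unknown.
import numpy as np
import pymc as pm

rng = np.random.default_rng(0)
x = rng.normal(size=200)
y = 2.0 * x + rng.normal(scale=0.5, size=200)   # true noise sd hidden from the model

with pm.Model():
    beta = pm.Normal("beta", mu=0.0, sigma=10.0)
    sigma = pm.HalfCauchy("sigma", beta=1.0)    # weakly informative prior on noise sd
    pm.Normal("obs", mu=beta * x, sigma=sigma, observed=y)
    idata = pm.sample(1000, tune=1000, progressbar=False)

print(idata.posterior["sigma"].mean().item())   # should recover roughly 0.5
```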


Using R&D portfolio management to deal with dynamic risk

R & D MANAGEMENT, Issue 5 2008
Serghei Floricel
We develop a theoretical framework for understanding why firms adopt specific approaches for the management of innovation project portfolios. Our theory focuses on a key contingency factor for innovation, namely the dynamics of competitive environments. We use four dimensions to characterize the patterns of environmental dynamics: velocity, turbulence, growth and instability. The paper then proposes the concept of dynamic risk as a determinant of portfolio management processes. Dynamic risk results from second-order learning by a firm confronted with a specific dynamic pattern in its environment. This learning concerns the likely nature of threats and the required updating of cognitive frameworks in such environments. Attempts to deal with dynamic risk enable various actors inside the firm to understand what kind of dynamic capabilities are needed in their innovation portfolio management processes. As a result of this diffuse learning, firms tend to favor certain common characteristics in their concrete portfolio management activities. To advance the theorizing of these characteristics, the paper also proposes four dimensions of portfolio management: structure, commitment, emergence and integration. Based on arguments inspired by the dynamic capability and related literatures, we advance a series of hypotheses that relate the environmental dynamics dimensions to the portfolio management dimensions. These hypotheses are tested with a survey of 795 firms in a variety of sectors and on four continents, using original scales and structural equation modeling methods. The results show, among other findings, that high-velocity environments favor structured as well as integrated portfolio management approaches, while high-growth environments favor approaches that are structured but also commit significant resources to each project. Turbulent environments favor approaches that are emergent but also, contrary to our expectations, have high resource commitment levels. Finally, firms in unstable environments have a marginal preference for emergent approaches. The results could help advance the dynamic contingency perspective on dynamic capabilities, as well as improve the practice of innovation portfolio management. [source]
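
As a much-simplified stand-in for the paper's structural equation models, ordinary least squares can illustrate how one portfolio dimension might be regressed on the four environmental-dynamics dimensions; all variable names and data below are hypothetical.

```python
# OLS stand-in for the hypothesized environment -> portfolio relations.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n = 795  # matches the survey size reported in the abstract
df = pd.DataFrame({
    "velocity": rng.normal(size=n),
    "turbulence": rng.normal(size=n),
    "growth": rng.normal(size=n),
    "instability": rng.normal(size=n),
})
# Hypothetical data-generating link: velocity and growth drive structure.
df["structure"] = (0.4 * df["velocity"] + 0.3 * df["growth"]
                   + rng.normal(scale=1.0, size=n))

fit = smf.ols("structure ~ velocity + turbulence + growth + instability",
              data=df).fit()
print(fit.params.round(2))
```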


Process vs resource-oriented Petri net modeling of automated manufacturing systems

ASIAN JOURNAL OF CONTROL, Issue 3 2010
NaiQi Wu
Abstract Since the 1980s, Petri nets (PN) have been widely used to model automated manufacturing systems (AMS) for analysis, performance evaluation, simulation, and control. Most such models are based on process-oriented modeling methods and are thus termed process-oriented PN (POPN) in this paper. The recent study of deadlock avoidance problems in AMS led to another type of PN called resource-oriented PN (ROPN). This paper, for the first time, compares these two modeling methods and the resultant models in terms of modeling power, model complexity for analysis and control, and some critical properties. POPN models the part production processes straightforwardly, while ROPN is more compact and effective for deadlock resolution. The relations between the two models are investigated, and several examples are used to illustrate them. Copyright © 2010 John Wiley and Sons Asia Pte Ltd and Chinese Automatic Control Society [source]
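
A minimal token-game sketch (illustrative Python, not drawn from the paper) shows the firing rule common to both modeling styles: a transition is enabled when every input place holds a token, and firing moves tokens from inputs to outputs.

```python
# Tiny Petri net: places hold tokens; transitions consume and produce them.
class PetriNet:
    def __init__(self, marking):
        self.marking = dict(marking)            # place -> token count
        self.transitions = {}                   # name -> (inputs, outputs)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (inputs, outputs)

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) >= 1 for p in inputs)

    def fire(self, name):
        if not self.enabled(name):
            raise RuntimeError(f"transition {name} not enabled")
        inputs, outputs = self.transitions[name]
        for p in inputs:
            self.marking[p] -= 1
        for p in outputs:
            self.marking[p] = self.marking.get(p, 0) + 1

# A two-step production process competing for one machine (the resource).
net = PetriNet({"raw": 2, "machine_free": 1})
net.add_transition("load", ["raw", "machine_free"], ["busy"])
net.add_transition("unload", ["busy"], ["done", "machine_free"])
net.fire("load"); net.fire("unload")
print(net.marking)   # {'raw': 1, 'machine_free': 1, 'busy': 0, 'done': 1}
```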


A series of molecular dynamics and homology modeling computer labs for an undergraduate molecular modeling course

BIOCHEMISTRY AND MOLECULAR BIOLOGY EDUCATION, Issue 4 2010
Donald E. Elmore
Abstract As computational modeling plays an increasingly central role in biochemical research, it is important to provide students with exposure to common modeling methods in their undergraduate curriculum. This article describes a series of computer labs designed to introduce undergraduate students to energy minimization, molecular dynamics simulations, and homology modeling. These labs were created as part of a one-semester course on the molecular modeling of biochemical systems. Students who completed these activities felt that they were an effective component of the course, reporting improved comfort with the conceptual background and practical implementation of the computational methods. Although created as a component of a larger course, these activities could be readily adapted for a variety of other educational contexts. In addition, all of these labs use software that is freely available in an academic environment and can be run on fairly common computer hardware, making them accessible to teaching environments without extensive computational resources. [source]
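
In the spirit of the energy-minimization lab (the course itself uses dedicated modeling software, not this code), a few lines of SciPy suffice to minimize a Lennard-Jones dimer.

```python
# Find the minimum-energy separation of a Lennard-Jones pair.
import numpy as np
from scipy.optimize import minimize_scalar

def lj(r, epsilon=1.0, sigma=1.0):
    """Lennard-Jones pair potential in reduced units."""
    return 4 * epsilon * ((sigma / r) ** 12 - (sigma / r) ** 6)

res = minimize_scalar(lj, bounds=(0.8, 3.0), method="bounded")
print(f"r_min = {res.x:.3f} (analytic: {2**(1/6):.3f})")
```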


Varying rates of diversification in the genus Melitaea (Lepidoptera: Nymphalidae) during the past 20 million years

BIOLOGICAL JOURNAL OF THE LINNEAN SOCIETY, Issue 2 2009
JULIEN LENEVEU
The influence of Quaternary glacial cycles on the extant diversity of Holarctic species has been intensively studied. It has been hypothesized that palaeoclimatic changes are responsible for divergence events in lineages. Constant improvements in DNA sequencing and modeling methods, as well as in palaeoclimatic reconstruction, permit a deeper exploration of the general causes of speciation in geological time. In the present study, we sampled, as exhaustively as possible, the butterflies belonging to the genus Melitaea (Lepidoptera: Nymphalidae), which are widely spread in the Palaearctic region. We conducted analyses to assess the phylogeny of the genus and estimated the timing of divergence and the most likely distribution of ancestral populations. The results obtained indicate that the systematics of the genus is in need of revision and that the diversity of the genus has been profoundly shaped by palaeoenvironmental changes during its evolutionary history. The present study also emphasizes that, when employed with caveats, major palaeoenvironmental events could represent very powerful tools for calibrating the dating of divergences using molecular data. © 2009 The Linnean Society of London, Biological Journal of the Linnean Society, 2009, 97, 346–361. [source]


Statistical methods for longitudinal research on bipolar disorders

BIPOLAR DISORDERS, Issue 3 2003
John Hennen
Objectives: Outcomes research in bipolar disorders, because of complex clinical variation over time, poses demanding research design and statistical challenges. Longitudinal studies involving relatively large samples, with outcome measures obtained repeatedly over time, are required. In this report, statistical methods appropriate for such research are reviewed. Methods: Analytic methods appropriate for repeated measures data include: (i) endpoint analysis; (ii) endpoint analysis with last observation carried forward; (iii) summary statistic methods yielding one summary measure per subject; (iv) random effects and generalized estimating equation (GEE) regression modeling methods; and (v) time-to-event survival analyses. Results: The use and limitations of these methods are illustrated within a randomly selected (33%) subset of data obtained in two recently completed randomized, double-blind studies of acute mania. Outcome measures obtained repeatedly over 3 or 4 weeks of blinded treatment in active-drug and placebo subgroups included change-from-baseline Young Mania Rating Scale (YMRS) scores (a continuous measure) and achievement of a clinical response criterion (50% YMRS reduction). Four of the methods reviewed are especially suitable for use with these repeated measures data: (i) the summary statistic method; (ii) random/mixed effects modeling; (iii) GEE regression modeling; and (iv) survival analysis. Conclusions: Outcome studies in bipolar illness ideally should be longitudinal in orientation, obtain outcome data frequently over extended times, and employ large study samples. Missing data problems can be expected, and data analytic methods must accommodate missingness. [source]
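
A hedged sketch of the GEE approach the review describes, run on synthetic repeated-measures data shaped loosely like the trials (weekly YMRS change, drug vs placebo); statsmodels is assumed, and none of the numbers come from the actual studies.

```python
# GEE regression on synthetic repeated-measures (longitudinal) data.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(11)
n_subj, n_visits = 60, 4
df = pd.DataFrame({
    "subject": np.repeat(np.arange(n_subj), n_visits),
    "week": np.tile(np.arange(n_visits), n_subj),
    "drug": np.repeat(rng.integers(0, 2, n_subj), n_visits),
})
subj_effect = np.repeat(rng.normal(0, 2, n_subj), n_visits)  # within-subject correlation
df["ymrs_change"] = (-2.0 * df["week"] - 1.5 * df["drug"] * df["week"]
                     + subj_effect + rng.normal(0, 3, len(df)))

gee = smf.gee("ymrs_change ~ week * drug", groups="subject", data=df,
              cov_struct=sm.cov_struct.Exchangeable(),
              family=sm.families.Gaussian()).fit()
print(gee.params.round(2))   # week:drug term captures the treatment effect
```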


Status of Observational Models Used in Design and Control of Products and Processes

COMPREHENSIVE REVIEWS IN FOOD SCIENCE AND FOOD SAFETY, Issue 1 2008
Shyam S. Sablani
This article is part of a collection entitled "Models for Safety, Quality, and Competitiveness of the Food Processing Sector," published in Comprehensive Reviews in Food Science and Food Safety. It has been peer-reviewed and was written as a follow-up of a pre-IFT workshop, partially funded by the USDA NRI grant 2005-35503-16208. ABSTRACT:, Modeling techniques can play a vital role in developing and characterizing food products and processes. Physical, chemical, and biological changes that take place during food and bioproduct processing are very complex and experimental investigation may not always be possible due to time, cost, effort, and skills needed. In some cases even experiments are not feasible to conduct. Often it is difficult to visualize the complex behavior of a data set. In addition, modeling is a must for process design, optimization, and control. With the rapid development of computer technology over the past few years, more and more food scientists have begun to use computer-aided modeling techniques. Observation-based modeling methods can be very useful where time and resources do not allow complete physics-based understanding of the process. This review discusses the state of selected observation-based modeling techniques in the context of industrial food processing. [source]