Modeling Techniques
Selected Abstracts

Molecular Modeling Techniques in Material Sciences
CHEMPHYSCHEM, Issue 9 2006. Jörg-Rüdiger Hill, Lalitha Subramanian, Amitesh Maiti

No abstract is available for this article.

Image-Based Modeling of 3D Objects with Curved Surfaces
COMPUTER ANIMATION AND VIRTUAL WORLDS (previously Journal of Visualisation and Computer Animation), Issue 2 2008. Man Hee Lee

This paper addresses an image-based method for modeling 3D objects with curved surfaces based on the non-uniform rational B-spline (NURBS) representation. The user fits feature curves on a few calibrated images with 2D NURBS curves using an interactive user interface. Then, 3D NURBS curves are constructed by stereo reconstruction of the corresponding feature curves. Using these as building blocks, NURBS surfaces are reconstructed by known surface-building methods, including bilinear surfaces, ruled surfaces, generalized cylinders, and surfaces of revolution. In addition, we also employ various advanced techniques, including skinned surfaces, swept surfaces, and boundary patches. Based on these surface modeling techniques, it is possible to build various types of 3D shape models with textured curved surfaces without much effort. Copyright © 2007 John Wiley & Sons, Ltd.

Projective Texture Mapping with Full Panorama
COMPUTER GRAPHICS FORUM, Issue 3 2002. Dongho Kim

Projective texture mapping is used to project a texture map onto scene geometry. It has been used in many applications, since it eliminates the assignment of fixed texture coordinates and provides a good method of representing synthetic images or photographs in image-based rendering. But conventional projective texture mapping has limitations in the field of view and the degree of navigation, because only simple rectangular texture maps can be used. In this work, we propose the concept of panoramic projective texture mapping (PPTM).
It projects a cubic or cylindrical panorama onto the scene geometry. With this scheme, any polygonal geometry can receive the projection of a panoramic texture map, without using fixed texture coordinates or modeling multiple projective texture mappings. For fast real-time rendering, a hardware-based rendering method is also presented. Applications of PPTM include a panorama viewer similar to QuickTime VR and navigation in panoramic scenes created by image-based modeling techniques. Categories and Subject Descriptors (according to ACM CCS): I.3.3 [Computer Graphics]: Viewing Algorithms; I.3.7 [Computer Graphics]: Color, Shading, Shadowing, and Texture

Direct Manipulation and Interactive Sculpting of PDE Surfaces
COMPUTER GRAPHICS FORUM, Issue 3 2000. Haixia Du

This paper presents an integrated approach and a unified algorithm that combine the benefits of PDE surfaces and powerful physics-based modeling techniques within a single modeling framework, in order to realize the full potential of PDE surfaces. We have developed a novel system that allows direct manipulation and interactive sculpting of PDE surfaces at arbitrary locations, hence supporting various interactive techniques beyond conventional boundary control. Our prototype software allows users to interactively modify the point, normal, curvature, and arbitrary regions of PDE surfaces in a predictable way. We employ several simple yet effective numerical techniques, including finite-difference discretization of the PDE surface, multigrid-like subdivision on the PDE surface, and a mass-spring approximation of the elastic PDE surface, to achieve real-time performance. In addition, our dynamic PDE surfaces can also be approximated using standard bivariate B-spline finite elements, which can subsequently be sculpted and deformed directly in real time subject to intrinsic PDE constraints.
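To make the finite-difference idea behind such PDE-surface solvers concrete, here is a minimal, hedged sketch: a height field is relaxed toward the discrete Laplace (membrane) equation with its boundary held fixed, the simplest stand-in for the elliptic-PDE solves the abstract describes. The grid size, boundary values, and iteration count are illustrative choices, not details from the paper.

```python
import numpy as np

def relax_surface(z, iters=2000):
    """Jacobi iteration: each interior point moves to the mean of its
    4 neighbors; this converges to the discrete harmonic surface."""
    z = z.copy()
    for _ in range(iters):
        z[1:-1, 1:-1] = 0.25 * (z[:-2, 1:-1] + z[2:, 1:-1] +
                                z[1:-1, :-2] + z[1:-1, 2:])
    return z

n = 21
z0 = np.zeros((n, n))
z0[0, :] = 1.0          # one boundary edge lifted to height 1 and held fixed
surface = relax_surface(z0)

# Residual of the discrete Laplacian should be ~0 on the interior.
lap = (surface[:-2, 1:-1] + surface[2:, 1:-1] +
       surface[1:-1, :-2] + surface[1:-1, 2:] - 4 * surface[1:-1, 1:-1])
```

Interactive sculpting systems of the kind described above apply the same machinery, but with user-supplied point, normal, or curvature constraints replacing parts of the fixed boundary.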
Our experiments demonstrate many attractive advantages of our dynamic PDE formulation, such as intuitive control, real-time feedback, and usability for the general public.

Grouping Pavement Condition Variables for Performance Modeling Using Self-Organizing Maps
COMPUTER-AIDED CIVIL AND INFRASTRUCTURE ENGINEERING, Issue 2 2001. Nii O. Attoh-Okine

Different modeling techniques have been employed for the evaluation of pavement performance, determination of structural capacity, and performance prediction. The evaluation of performance involves the functional analysis of pavements based on the history of riding quality. Riding comfort and pavement performance can be conveniently defined in terms of roughness and pavement distresses. Thus, different models have been developed relating roughness to distresses in order to predict pavement performance. These models are often too complex, and parsimonious equations involving fewer variables are needed. Artificial neural networks have been used successfully in the development of performance-prediction models. This article demonstrates the use of artificial neural network self-organizing maps for grouping pavement condition variables when developing pavement performance models that evaluate pavement conditions on the basis of pavement distresses.

GAUGE: Grid Automation and Generative Environment
CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 10 2006. Francisco Hernández

The Grid has proven to be a successful paradigm for distributed computing. However, constructing applications that exploit all the benefits that the Grid offers is still not straightforward for either inexperienced or experienced users. Recent approaches to solving this problem employ a high-level abstract layer to ease the construction of applications for different Grid environments. These approaches facilitate the construction of Grid applications, but they are still tied to specific programming languages or platforms.
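The self-organizing-map grouping described in the pavement study above can be illustrated with a minimal sketch: condition variables become feature vectors, and training pulls a small 1-D map of nodes (and their grid neighbors) toward those vectors, so similar variables land on the same or nearby nodes. The data, map size, and learning schedule below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two artificial clusters of "condition variables" in 3-D feature space.
data = np.vstack([rng.normal(0.0, 0.1, (20, 3)),
                  rng.normal(1.0, 0.1, (20, 3))])

nodes = rng.random((5, 3))                # 5 map nodes, random init
for t in range(200):
    lr = 0.5 * (1 - t / 200)              # decaying learning rate
    sigma = max(2.0 * (1 - t / 200), 0.5) # shrinking neighborhood width
    for v in rng.permutation(data):
        bmu = np.argmin(((nodes - v) ** 2).sum(axis=1))  # best-matching unit
        dist = np.abs(np.arange(5) - bmu)                # distance on the map
        h = np.exp(-(dist ** 2) / (2 * sigma ** 2))      # neighborhood kernel
        nodes += lr * h[:, None] * (v - nodes)

# Variables from the same cluster should map to the same or nearby nodes.
labels = [int(np.argmin(((nodes - v) ** 2).sum(axis=1))) for v in data]
```

In the pavement application, the feature vectors would be measured distress and roughness variables rather than synthetic points.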
A new approach is presented in this paper that uses concepts of domain-specific modeling (DSM) to build a high-level abstract layer. With this DSM-based abstract layer, users are able to create Grid applications without knowledge of specific programming languages and without being bound to specific Grid platforms. An additional benefit of DSM is the capability to generate software artifacts for various Grid environments. This paper presents the Grid Automation and Generative Environment (GAUGE). The goal of GAUGE is to automate the generation of Grid applications so that inexperienced users can exploit the Grid fully. At the same time, GAUGE provides an open framework that experienced users can build upon and extend to tailor their applications to particular Grid environments or specific platforms. GAUGE employs domain-specific modeling techniques to accomplish this challenging task. Copyright © 2005 John Wiley & Sons, Ltd.

Modeling Mediation in the Etiology of Violent Behavior in Adolescence: A Test of the Social Development Model
CRIMINOLOGY, Issue 1 2001. Bu Huang

The social development model seeks to explain human behavior through specification of predictive and mediating developmental relationships. It incorporates the effects of empirical predictors ("risk factors" and "protective factors") for antisocial behavior and seeks to synthesize the most strongly supported propositions of control theory, social learning theory, and differential association theory. This article examines the fit of the social development model using constructs measured at ages 10, 13, 14, and 16 to predict violent behavior at age 18. The sample of 808 is from the longitudinal panel of the Seattle Social Development Project, which in 1985 surveyed fifth-grade students from schools serving high-crime neighborhoods in Seattle, Washington. Structural equation modeling techniques were used to examine the fit of the model to the data.
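For readers unfamiliar with the fit indices reported in studies like this one, CFI and RMSEA are simple functions of the model and baseline chi-square statistics. The formulas below are the standard definitions; the chi-square values and degrees of freedom are invented for illustration (only the sample size of 808 mirrors the study above).

```python
import math

def rmsea(chi2, df, n):
    """Root mean square error of approximation."""
    return math.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))

def cfi(chi2_m, df_m, chi2_b, df_b):
    """Comparative fit index: target model vs. independence (baseline) model."""
    d_m = max(chi2_m - df_m, 0.0)
    d_b = max(chi2_b - df_b, d_m)
    return 1.0 - d_m / d_b if d_b > 0 else 1.0

# Invented example values; N = 808 as in the sample described above.
chi2_model, df_model = 250.0, 180
chi2_base, df_base = 2400.0, 210
fit_rmsea = rmsea(chi2_model, df_model, 808)
fit_cfi = cfi(chi2_model, df_model, chi2_base, df_base)
```

Values of CFI at or above .90 and RMSEA at or below .05 are the conventional thresholds for adequate fit.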
The model fit the data (CFI ≥ .90, RMSEA ≤ .05). We conclude that the social development model adequately predicts violence at age 18 and mediates much of the effect of prior violence. Implications for theory and for prevention are discussed.

Examining Drivers of Course Performance: An Exploratory Examination of an Introductory CIS Applications Course
DECISION SCIENCES JOURNAL OF INNOVATIVE EDUCATION, Issue 1 2006. Rhonda A. Syler

The accelerating diffusion of broadband Internet access provides many opportunities for the development of pedagogically robust Web-based instruction (WBI). As the supporting technology infrastructure of broadband spreads, the attention of academic researchers is turning to issues such as the drivers of student usage of WBI. Specifically, the research presented herein examined the impact of WBI on a student's aggregate course performance. We hypothesized that learning independence (LI) is a determining factor in a student's use of WBI. In this study, we employed structural equation modeling techniques to examine the data and assess the direct and indirect effects of LI on WBI usage. The subjects, students in an introductory Computer Information Systems applications course, used a Web-based tutorial program for skills instruction. The findings of this study suggest that WBI usage has a significant impact on a student's course performance. Despite its plausibility, the effect of LI on WBI usage was not significant. However, we did conclude that two of the second-order factors of the LI construct have a direct effect on a student's performance in the course.

Methods to Account for Spatial Autocorrelation in the Analysis of Species Distributional Data: A Review
ECOGRAPHY, Issue 5 2007. Carsten F. Dormann

Species distributional or trait data based on range-map (extent-of-occurrence) or atlas survey data often display spatial autocorrelation, i.e., locations close to each other exhibit more similar values than those further apart. If this pattern remains present in the residuals of a statistical model based on such data, one of the key assumptions of standard statistical analyses, that residuals are independent and identically distributed (i.i.d.), is violated. The violation of the assumption of i.i.d. residuals may bias parameter estimates and can increase type I error rates (falsely rejecting the null hypothesis of no effect). While this is increasingly recognised by researchers analysing species distribution data, there is, to our knowledge, no comprehensive overview of the many available spatial statistical methods for taking spatial autocorrelation into account in tests of statistical significance. Here, we describe six different statistical approaches to infer correlates of species' distributions, for both presence/absence (binary response) and species abundance data (Poisson or normally distributed response), while accounting for spatial autocorrelation in model residuals: autocovariate regression; spatial eigenvector mapping; generalised least squares; (conditional and simultaneous) autoregressive models; and generalised estimating equations. A comprehensive comparison of the relative merits of these methods is beyond the scope of this paper. To demonstrate each method's implementation, however, we undertook preliminary tests based on simulated data. These preliminary tests verified that most of the spatial modeling techniques we examined showed good type I error control and precise parameter estimates, at least when confronted with simplistic simulated data containing spatial autocorrelation in the errors. However, we found that for presence/absence data the results and conclusions were very variable between the different methods. This is likely due to the low information content of binary maps.
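The first method listed above, autocovariate regression, is mechanically simple: the ordinary design matrix is augmented with a term summarizing the response at neighboring locations. Here is a hedged sketch on a grid with periodic (wrap-around) neighborhoods; the grid size, coefficients, and noise level are illustrative choices, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 30
x = rng.normal(size=(n, n))                       # an "environmental" predictor
y = 2.0 * x + rng.normal(scale=0.5, size=(n, n))  # response on a grid

# Autocovariate: mean response of the 4 rook neighbors of each cell
# (periodic boundary used here purely for simplicity).
pad = np.pad(y, 1, mode="wrap")
autocov = 0.25 * (pad[:-2, 1:-1] + pad[2:, 1:-1] +
                  pad[1:-1, :-2] + pad[1:-1, 2:])

# Ordinary least squares with design matrix [1, x, autocovariate].
X = np.column_stack([np.ones(n * n), x.ravel(), autocov.ravel()])
beta, *_ = np.linalg.lstsq(X, y.ravel(), rcond=None)
```

With spatially independent errors, as simulated here, the autocovariate coefficient stays near zero; the review's point is what happens to the environmental coefficients when real spatial structure is present.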
Also, in contrast with previous studies, we found that autocovariate methods consistently underestimated the effects of environmental controls of species distributions. Given their widespread use, in particular for the modelling of species presence/absence data (e.g., climate envelope models), we argue that this warrants further study and caution in their use. To aid other ecologists in making use of the methods described, code to implement them in freely available software is provided in an electronic appendix.

EEG Source Localization in Focal Epilepsy: Where Are We Now?
EPILEPSIA, Issue 2 2008. Chris Plummer

Electroencephalographic source localization (ESL) by noninvasive means is an area of renewed interest in clinical epileptology. This has been driven by innovations in the computer-assisted modeling of dipolar and distributed sources for the investigation of focal epilepsy, a process fueled by the ever-increasing computational power available to researchers for the analysis of scalp EEG recordings. However, demonstration of the validity and clinical utility of these mathematically derived source modeling techniques has struggled to keep pace. This review evaluates the current clinical "fitness" of ESL as applied to the focal epilepsies by examining some of the key studies performed in the field, with emphasis given to clinical work published in the last five years. In doing so, we discuss why ESL techniques have not made an impact on routine epilepsy practice, underlining some of the current problems and controversies in the field. We conclude by examining where ESL currently sits alongside magnetoencephalography and combined EEG-functional magnetic resonance imaging in the investigation of focal epilepsy.

How News Content Influences Anti-Immigration Attitudes: Germany, 1993-2005
EUROPEAN JOURNAL OF POLITICAL RESEARCH, Issue 4 2009. Hajo G. Boomgaarden

Immigration is an increasingly important political issue in Western democracies, and a crucial question relates to the antecedents of public attitudes towards immigrants. It is generally acknowledged that information relayed through the mass media plays a role in the formation of anti-immigration attitudes. This study considers whether news coverage of immigrants and immigration issues relates to macro-level dynamics of anti-immigration attitudes. It further explores whether this relationship depends on variation in relevant real-world contexts. The models simultaneously control for the effects of established contextual explanatory variables. Drawing on German monthly time-series data and on ARIMA time-series modeling techniques, it is shown that both the frequency and the tone of coverage of immigrant actors in the news significantly influence dynamics in anti-immigration attitudes. The strength of the effect of the news, however, depends on contextual variation in immigration levels and the number of asylum seekers. Implications of these findings are discussed in light of the increasing success of extreme-right parties and growing opposition to further European integration.

Cortical Sources of the Early Components of the Visual Evoked Potential
HUMAN BRAIN MAPPING, Issue 2 2002. Francesco Di Russo

This study aimed to characterize the neural generators of the early components of the visual evoked potential (VEP) to isoluminant checkerboard stimuli. Multichannel scalp recordings, retinotopic mapping, and dipole modeling techniques were used to estimate the locations of the cortical sources giving rise to the early C1, P1, and N1 components. Dipole locations were matched to anatomical brain regions visualized in structural magnetic resonance imaging (MRI) and to functional MRI (fMRI) activations elicited by the same stimuli.
These converging methods confirmed previous reports that the C1 component (onset latency 55 msec; peak latency 90-92 msec) was generated in the primary visual area (striate cortex; area 17). The early phase of the P1 component (onset latency 72-80 msec; peak latency 98-110 msec) was localized to sources in dorsal extrastriate cortex of the middle occipital gyrus, while the late phase of the P1 component (onset latency 110-120 msec; peak latency 136-146 msec) was localized to ventral extrastriate cortex of the fusiform gyrus. Among the N1 subcomponents, the posterior N150 could be accounted for by the same dipolar source as the early P1, while the anterior N155 was localized to a deep source in the parietal lobe. These findings clarify the anatomical origin of these VEP components, which have been studied extensively in relation to visual-perceptual processes. Hum. Brain Mapping 15:95-111, 2001. © 2001 Wiley-Liss, Inc.

Atomic-Level Studies of Molecular Self-Assembly on Metallic Surfaces
ADVANCED MATERIALS, Issue 10-11 2009. Giulia Tomba

Shrinking devices to the nanoscale while still maintaining accurate control over their structure and functionality is one of the major technological challenges of our era. The use of purposely directed self-assembly processes provides a smart alternative to the troublesome manipulation and positioning of nanometer-sized objects piece by piece. Here, we report on a series of recent works in which the in-depth study of appropriately chosen model systems addresses the two key points in self-assembly: building-block selection and control of bonding. We focus in particular on hydrogen bonding because of the stability, precision, and yet flexibility of nanostructures based on this interaction. Complementing experimental information with advanced atomistic modeling techniques based on quantum formalisms is a key feature of most investigations.
We thus highlight the role of theoretical modeling as we follow the progression toward more and more complex molecular building blocks, or "tectons". In particular, we will see that the use of three-dimensional, flexible tectons promises to be a powerful way to achieve highly sophisticated functional nanostructures. However, the increasing complexity of the assembly units makes it generally more difficult to control the supramolecular organization and to predict the assembly mechanisms. This creates a case for developing novel analysis methods and ever more advanced modeling techniques.

Flexible Constraints for Regularization in Learning from Data
INTERNATIONAL JOURNAL OF INTELLIGENT SYSTEMS, Issue 6 2004. Eyke Hüllermeier

By its very nature, inductive inference performed by machine learning methods is mainly data driven. Still, the incorporation of background knowledge, if available, can help to make inductive inference more efficient and to improve the quality of induced models. Fuzzy set-based modeling techniques provide a convenient tool for making expert knowledge accessible to computational methods. In this article, we exploit such techniques within the regularization (penalization) framework of inductive learning. The basic idea is to express knowledge about an underlying data-generating process in terms of flexible constraints and to penalize those models violating these constraints. An optimal model is one that achieves an optimal trade-off between fitting the data and satisfying the constraints. © 2004 Wiley Periodicals, Inc.

Recent Advances of Neural Network-Based EM-CAD
INTERNATIONAL JOURNAL OF RF AND MICROWAVE COMPUTER-AIDED ENGINEERING, Issue 5 2010. Humayun Kabir

In this article, we provide an overview of recent advances in computer-aided design techniques using neural networks for electromagnetic (EM) modeling and design applications.
Various recent neural network modeling techniques are summarized, including passive-component modeling and design and optimization using the models. Training data for the models are generated from EM simulations. The trained neural networks become fast and accurate models of EM structures. The models are then incorporated into various optimization methods and commercially available circuit simulators for fast design and optimization. We also provide an overview of a recently developed neural network inverse modeling technique. Training a neural network inverse model directly may become difficult due to the nonuniqueness of the input-output relationship in the inverse model. Training data containing multivalued solutions are therefore divided into groups according to derivative information. Multiple inverse submodels are built based on the divided data groups and are then combined to form a complete model. A comparison between the conventional EM-based design approach and the inverse design approach is also discussed. These computer-aided design techniques using neural models provide circuit-level simulation speed with EM-level accuracy, avoiding the high computational cost of EM simulation. © 2010 Wiley Periodicals, Inc. Int J RF and Microwave CAE, 2010.

Time-Domain Measurement and Modeling Techniques for Wideband Communication Components and Systems
INTERNATIONAL JOURNAL OF RF AND MICROWAVE COMPUTER-AIDED ENGINEERING, Issue 1 2003. Christopher P. Silva

The measurement and characterization of RF and microwave components and systems has been dominated by a tone-based frequency-domain paradigm because of its simplicity and accuracy. This review article describes new and accurate time-domain techniques that are superior for application to wideband and nonlinear contexts. © 2003 Wiley Periodicals, Inc. Int J RF and Microwave CAE 13: 5-31, 2003.
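The inverse-modeling idea summarized in the EM-CAD overview above (splitting multivalued training data by derivative information and fitting one submodel per monotonic branch) can be sketched with a toy problem. Here the forward map is y = x², whose inverse is two-valued; polynomial fits stand in for the neural network submodels trained on EM simulation data.

```python
import numpy as np

x = np.linspace(-1.0, 1.0, 401)
y = x ** 2                       # toy forward model with a non-unique inverse
dydx = np.gradient(y, x)         # derivative information used for splitting

branches = {}
for name, mask in [("neg", dydx < 0), ("pos", dydx > 0)]:
    # One inverse submodel x(y) per monotonic branch of the forward map.
    coeffs = np.polyfit(y[mask], x[mask], deg=6)
    branches[name] = np.poly1d(coeffs)

def inverse(y_target):
    """Combined model: return both candidate solutions for a target y."""
    return [branches["neg"](y_target), branches["pos"](y_target)]

sols = inverse(0.25)             # the two true solutions are -0.5 and +0.5
```

In the EM-CAD setting the same split lets each submodel learn a single-valued mapping, which is what makes direct training tractable.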
A Socio-Political and -Cultural Model of the War in Afghanistan
INTERNATIONAL STUDIES REVIEW, Issue 1 2010. Armando Geller

We present a simulation model of current conflict-torn Afghanistan in which a system-dynamics model is coupled with an agent-based model. Agent-based modeling techniques are applied to model individual cognition and behavior as well as group-formation processes. System-dynamics modeling is used to represent macro conflict processes, such as duration of violence and combat success ratio. The cognitive and behavioral processes are couched in a socio-cultural context and feed into the system-dynamics processes. This allows us to explore the relationship between local socio-culturally driven cognition and behavior and (dynamic) macro properties of armed conflict. We demonstrate the importance of analyzing conflict-torn Afghanistan through the interplay of adapting "traditional" socio-cultural mechanisms, political culture and power structures, and politico-economic macro-processes. We find that variations in the conflict's superstructure can be explained through variations in socio-culturally dependent structures. The model indicates limitations with regard to classical prediction, but is promising with regard to explanation-driven pattern forecasting.

Pervaporation Separation of Sodium Alginate/Chitosan Polyelectrolyte Complex Composite Membranes for the Separation of Water/Alcohol Mixtures: Characterization of the Permeation Behavior with Molecular Modeling Techniques
JOURNAL OF APPLIED POLYMER SCIENCE, Issue 4 2007. Sang-Gyun Kim

Polyelectrolyte complex (PEC) membranes were prepared by the complexation of protonated chitosan with sodium alginate doped on a porous, polysulfone-supporting membrane. The pervaporation characteristics of the membranes were investigated with various alcohol/water mixtures.
The physicochemical properties of the permeant molecules and polyion complex membranes were determined with molecular modeling methods, and the data from these methods were used to explain the permeation of water and alcohol molecules through the PEC membranes. The experimental results showed that the prepared PEC membranes had excellent pervaporation performance in most aqueous alcohol solutions and that the selectivity and permeability of the membranes depended on the molecular size, polarity, and hydrophilicity of the permeant alcohols. However, the aqueous methanol solutions showed a permeation behavior different from that of the other alcohol solutions: methanol permeated the prepared PEC membranes more easily than water, even though water molecules have stronger polarity and are smaller than methanol molecules. The experimental results are discussed from the point of view of the physical properties of the permeant molecules and the membranes in the permeation state. © 2006 Wiley Periodicals, Inc. J Appl Polym Sci 103: 2634-2641, 2007

Menstrual Age-Dependent Systematic Error in Sonographic Fetal Weight Estimation: A Mathematical Model
JOURNAL OF CLINICAL ULTRASOUND, Issue 3 2002. Max Mongelli

Purpose: We used computer modeling techniques to evaluate the accuracy of different types of sonographic formulas for estimating fetal weight across the full range of clinically important menstrual ages. Methods: Input data for the computer modeling techniques were derived from published British standards for normal distributions of sonographic biometric growth parameters and their correlation coefficients; these standards had been derived from fetal populations whose ages were determined using sonography.
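The simulation approach described in the fetal-weight Methods above, drawing correlated biometric parameters from assumed normal distributions, pushing them through a weight formula, and comparing the resulting median with a reference median, can be sketched as follows. All numbers here (means, SDs, the correlation, the formula coefficients, and the reference median) are invented for illustration; they are not the published British standards or any real published formula.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical term-gestation biometry: [BPD (cm), AC (cm)], correlated.
mean = np.array([9.3, 33.0])
cov = np.array([[0.16, 0.30],      # SDs ~0.4 and ~1.6 cm, correlation ~0.47
                [0.30, 2.56]])
bpd, ac = rng.multivariate_normal(mean, cov, size=50_000).T

def weight_formula(bpd, ac):
    """Hypothetical log-linear weight formula (grams); not a published one."""
    return 10 ** (1.742 + 0.0316 * bpd + 0.0457 * ac)

est = weight_formula(bpd, ac)
reference_median = 3500.0          # illustrative term birth-weight median (g)
bias_pct = 100 * (np.median(est) - reference_median) / reference_median
```

Repeating this at each menstrual age, with age-appropriate biometry distributions and real published formulas, yields the age-dependent bias curves the study reports.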
The accuracy of each of 10 formulas for estimating fetal weight was calculated by comparing the weight estimates obtained with these formulas in simulated populations with the weight estimates expected from birth-weight data, from 24 weeks' menstrual age to term. Preterm weights were estimated by interpolation from term birth weights using sonographic growth curves. With an ideal formula, the median weight estimates at term should not differ from the population birth-weight median. Results: The simulated output sonographic values closely matched those of the original population. The accuracy of fetal weight estimation differed by menstrual age and between the various formulas. Most methods tended to overestimate fetal weight at term. Shepard's formula progressively overestimated weights, from about 2% at 32 weeks to more than 15% at term. The accuracy of Combs's and Shinozuka's volumetric formulas varied least by menstrual age. Hadlock's formula underestimated preterm fetal weight by up to 7% and overestimated fetal weight at term by up to 5%. Conclusions: The accuracy of sonographic fetal weight estimation based on volumetric formulas is more consistent across menstrual ages than that of other methods. © 2002 Wiley Periodicals, Inc. J Clin Ultrasound 30:139-144, 2002; DOI 10.1002/jcu.10051

Computer Aided Design for Sustainable Industrial Processes: Specific Tools and Applications
AICHE JOURNAL, Issue 4 2009. Maurizio Fermeglia

Chemical process sustainability can be estimated using different sustainability indicators. The quantitative estimation of those indicators is necessary (i) for evaluating the environmental impact of a chemical process and (ii) for choosing the best design among different available alternatives. To accomplish these goals, the computerized calculation of sustainability indicators requires the use of at least three computer tools: (i) process simulation, (ii) molecular modeling, and (iii) sustainability-indicators software.
In this work, a complete software platform, the Process Sustainability Prediction Framework, integrated with process simulation programs that support the CAPE-OPEN interfaces, is presented and discussed. The article also contains a description and application of molecular modeling techniques for estimating different toxicological data, which are used in the calculation of sustainability indicators. A representative example of one chemical process, and the thermo-physical properties used in the toxicological data calculation, are reported to demonstrate the applicability of the software to real cases. © 2009 American Institute of Chemical Engineers AIChE J, 2009

The Impact of Underage Drinking Laws on Alcohol-Related Fatal Crashes of Young Drivers
ALCOHOLISM, Issue 7 2009. James C. Fell

Background: This study used a pre- to post-design to evaluate the influence on drinking-and-driving fatal crashes of six laws directed at youth aged 20 and younger and four laws targeting all drivers. Methods: Data on the laws were drawn from the Alcohol Policy Information System data set (1998 to 2005), the Digests of State Alcohol Highway Safety Related Legislation (1983 to 2006), and the Westlaw database. The Fatality Analysis Reporting System data set (1982 to 2004) was used to assess the ratio of drinking to nondrinking drivers involved in fatal crashes [the fatal crash incidence ratio (CIR)]. The data were analyzed using structural equation modeling techniques. Results: Significant decreases in the underage fatal CIR were associated with the presence of four of the laws targeting youth (possession, purchase, use and lose, and zero tolerance) and three of the laws targeting all drivers (a 0.08 blood alcohol concentration illegal per se law, a secondary or upgrade-to-primary seat belt law, and an administrative license revocation law). Beer consumption was associated with a significant increase in the underage fatal CIR.
The direct effects of the laws targeting drivers of all ages on adult drinking drivers aged 26 and older were similar but of a smaller magnitude compared with the findings for those aged 20 and younger. The two core underage drinking laws (purchase and possession) and the zero tolerance law are estimated to be saving 732 lives per year, controlling for other exposure factors. If all states adopted use and lose laws, an additional 165 lives could be saved annually. Conclusions: These results provide substantial support for the effectiveness of under-age-21 drinking laws, with four of the six laws examined having significant associations with reductions in underage drinking-and-driving fatal crashes. These findings point to the importance of key underage drinking and traffic safety laws in efforts to reduce underage drinking-driver crashes.

Developing the Changes in Attitude about the Relevance of Science (CARS) Questionnaire and Assessing Two High School Science Classes
JOURNAL OF RESEARCH IN SCIENCE TEACHING, Issue 8 2003. Marcelle A. Siegel

This study has two purposes: (a) methodological, to design and test a new instrument able to reflect changes in attitudes toward science over time, and (b) investigative, to find out the effect of two similar curricular treatments on the attitudes of two classes. Items about the relevance of science to students' lives were developed, pilot-tested, and analyzed using Rasch modeling. We then divided reliable items into three equivalent questionnaire forms. The final three forms of the questionnaire were used to assess high school students' attitudes. Over 18 weeks, one class used a core curriculum (Science and Sustainability) to learn science in the context of making decisions about societal issues. A second class used the same core curriculum, but with parts replaced by computer-based activities (Convince Me) designed to enhance the coherence of students' arguments.
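The Rasch analysis mentioned above for calibrating the attitude items rests on a simple item-response function: in the dichotomous Rasch model, the probability of endorsing an item depends only on the difference between the person parameter (here, attitude level) and the item difficulty. The formula is the standard one; the parameter values below are invented for illustration.

```python
import math

def rasch_prob(ability, difficulty):
    """Dichotomous Rasch model: P(endorse) = exp(b - d) / (1 + exp(b - d))."""
    return 1.0 / (1.0 + math.exp(-(ability - difficulty)))

p_easy = rasch_prob(ability=0.5, difficulty=-1.0)   # easy-to-endorse item
p_hard = rasch_prob(ability=0.5, difficulty=2.0)    # hard-to-endorse item
p_match = rasch_prob(ability=1.0, difficulty=1.0)   # ability equals difficulty
```

When ability equals difficulty the endorsement probability is exactly 0.5, which is what makes item difficulties and person measures comparable on one scale and allows equivalent questionnaire forms to be assembled from calibrated items.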
Using traditional and Rasch modeling techniques, we assessed the degree to which such instructional activities promoted students' beliefs that science is relevant to them. Both classes tended to agree more, over time, that science is relevant to their lives, and the increases were statistically equivalent between classes. This study suggests that, by using innovative, issue-based activities, it is possible to enhance students' attitudes about the relevance of science. © 2003 Wiley Periodicals, Inc. J Res Sci Teach 40: 757-775, 2003

Securitization of Mortality Risks in Life Annuities
JOURNAL OF RISK AND INSURANCE, Issue 2 2005. Yijia Lin

The purpose of this article is to study mortality-based securities, such as mortality bonds and swaps, and to price the proposed mortality securities. We focus on individual annuity data, although some of the modeling techniques could be applied to other lines of annuity or life insurance.

Recent Advances in Mathematical Modeling of Flow and Heat Transfer Phenomena in Glass Furnaces
JOURNAL OF THE AMERICAN CERAMIC SOCIETY, Issue 5 2002. Manoj K. Choudhary

This paper reviews significant advances in the mathematical modeling of flow and heat transfer phenomena in glass furnaces during the period 1996-2000. It describes developments in both the fundamental/scientific and practical aspects of modeling. The topics reviewed include developments in (a) model formulation and modeling techniques, (b) postprocessing modeling of glass quality and environmental emissions, (c) measurement of thermodynamic and transport properties of melts relevant to modeling, and (d) incorporation of model-based knowledge into process-control schemes. These developments are critically examined and assessed from an industrial perspective, and topics needing further research and development efforts are identified.
[source] STREAMFLOW DEPLETION: MODELING OF REDUCED BASEFLOW AND INDUCED STREAM INFILTRATION FROM SEASONALLY PUMPED WELLS, JOURNAL OF THE AMERICAN WATER RESOURCES ASSOCIATION, Issue 1 2001, Xunhong Chen ABSTRACT: Numerical modeling techniques are used to analyze streamflow depletion for stream-aquifer systems with baseflow. The analyses calculated two flow components generated by a pumping well located at a given distance from a river that is hydraulically connected to an unconfined aquifer. The two components are induced stream infiltration and reduced baseflow; both contribute to total streamflow depletion. Simulation results suggest that the induced infiltration, the volume of water discharged from the stream to the aquifer, has a shorter-term impact on streamflow, while the reduced baseflow curves show a longer-term effect. The peak impacts of the two hydrologic processes on streamflow occur separately. The separate analysis helps in understanding the hydrologic interactions between stream and aquifer. Practically, it provides useful information about contaminant transport from stream to aquifer when water quality is a concern, and for areas where water quantity is an issue, the separate analysis offers additional information for the development of a water resource management plan. [source] Theoretical Modeling in Hemodynamics of Microcirculation MICROCIRCULATION, Issue 8 2008, JACK LEE ABSTRACT Over the past decades, theoretical modeling has become an indispensable component of research into the hemodynamics of microcirculation. Numerous studies rely on modeling to provide quantitative insights into the interacting biophysical mechanisms that govern microcirculatory flow. The mechanical deformation of hematocytes has been addressed by continuum and molecular-informed computational models based on a growing body of experimental information. Theoretical analyses of single-vessel flow and blood rheology have led to a range of modeling approaches.
Until recently, computational constraints limited direct simulations of multi-particle flows involving deformation and/or aggregation, but recent studies have begun to address this challenge. Network-level analyses have provided insights into the biophysical principles underlying the design of the microcirculation. This approach has been used to complement available experimental data and to derive empirical models of microvascular blood rheology. Continued increases in computational performance applied to current modeling techniques will enable larger-scale simulations. In order to exploit this opportunity, integration of diverse theoretical approaches within a multi-scale framework is needed. [source] LEADERSHIP AND PROCEDURAL JUSTICE CLIMATE AS ANTECEDENTS OF UNIT-LEVEL ORGANIZATIONAL CITIZENSHIP BEHAVIOR PERSONNEL PSYCHOLOGY, Issue 1 2004, MARK G. EHRHART Despite an abundance of research conducted on organizational citizenship behavior (OCB) at the individual level of analysis, relatively little is known about unit-level OCB. To investigate the antecedents of unit-level OCB, data were collected from employees of 249 grocery store departments. Structural equation modeling techniques were used to test a model in which procedural justice climate was hypothesized to partially mediate the relationship between leadership behavior (servant-leadership) and unit-level OCB. Models were tested using both employee ratings and manager ratings of unit-level OCB. The results gave general support for the hypotheses, although there were some differences depending on the source of the OCB ratings (supervisor or subordinate), whether the type of department was controlled for, and whether a common method variance factor was included. Overall, the evidence generally supported the association of both servant-leadership and procedural justice climate with unit-level OCB.
Building on the current study, a multilevel framework for the study of OCB is presented in conjunction with a discussion of future research directions in four specific areas. [source] The design and use of an agent-based model to simulate the 1918 influenza epidemic at Norway House, Manitoba AMERICAN JOURNAL OF HUMAN BIOLOGY, Issue 3 2009, Connie Carpenter Agent-based modeling provides a new approach to the study of virgin soil epidemics like the 1918 flu. In this bottom-up simulation approach, a landscape can be created and populated with a heterogeneous group of agents who move and interact in ways that more closely resemble human behavior than is usually seen in other modeling techniques. In this project, an agent-based model was constructed to simulate the spread of the 1918 influenza pandemic through the Norway House community in Manitoba, Canada. Archival, ethnographic, epidemiological, and biological information were used to aid in designing the structure of the model and to estimate values for model parameters. During the epidemic, Norway House was a Hudson's Bay Company post and a Swampy Cree-Métis settlement with an economy based on hunting, fishing, and the fur trade. The community followed a traditional, seasonal travel pattern of summer aggregation and winter dispersal. The model was used to examine how seasonal community structures and associated population movement patterns may have influenced disease transmission and epidemic spread. Simulations of the model clearly demonstrate that human behavior can significantly influence epidemic outcomes. Am. J. Hum. Biol. 2009. © 2008 Wiley-Liss, Inc. [source] Understanding the kinetics and network formation of dimethacrylate dental resins POLYMERS FOR ADVANCED TECHNOLOGIES, Issue 6 2001, Lale G.
Lovell Abstract Dimethacrylate monomers are commonly used as the organic phase of dental restorative materials, but many questions remain about the underlying kinetics and network formation in these highly crosslinked photopolymer systems. Several novel experimental and modeling techniques that have been developed for other multifunctional (meth)acrylates were utilized to gain further insight into these resin systems. Specifically, this work investigates the copolymerization behavior of bis-GMA (2,2-bis[p-(2-hydroxy-3-methacryloxyprop-1-oxy)-phenyl]propane) and TEGDMA (triethylene glycol dimethacrylate), two monomers typically used for dental resin formulations. Near-infrared spectroscopy, electron paramagnetic resonance spectroscopy, as well as dynamic mechanical and dielectric analysis were used to characterize the kinetics, radical populations, and structural properties of this copolymer system. In addition, a kinetic model is described that provides valuable information about the network evolution during the formation of this crosslinked polymer. The results of these numerous studies illustrate that all of the aforementioned techniques can be readily applied to dental resin systems and consequently can be used to obtain a wealth of information about these systems. The application of these techniques provides insight into the complex polymerization kinetics and corresponding network formation, and as a result, a more complete understanding of the anomalous behaviors exhibited by these systems, such as diffusion-controlled kinetics and conversion-dependent network formation, is attained. Copyright © 2001 John Wiley & Sons, Ltd. [source] Party Identification and Core Political Values AMERICAN JOURNAL OF POLITICAL SCIENCE, Issue 4 2005, Paul Goren Party identification and core political values are central elements in the political belief systems of ordinary citizens. Are these predispositions related to one another?
Does party identification influence core political values or are partisan identities grounded in such values? This article draws upon theoretical works on partisan information processing and value-based reasoning to derive competing hypotheses about whether partisanship shapes political values or political values shape partisanship. The hypotheses are tested by using structural equation modeling techniques to estimate dynamic models of attitude stability and constraint with data from the 1992–94–96 National Election Study panel survey. The analyses reveal that partisan identities are more stable than the principles of equal opportunity, limited government, traditional family values, and moral tolerance; party identification constrains equal opportunity, limited government, and moral tolerance; and these political values do not constrain party identification. [source]
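The cross-lagged logic in the last abstract above (does wave-1 partisanship predict wave-2 values, or the reverse?) can be sketched numerically. The following is a minimal illustration only, not the authors' structural equation models: it generates synthetic two-wave panel data (all variable names and coefficients are invented for the example) and estimates the stability and cross-lagged paths with ordinary least squares, a simplified stand-in for SEM.

```python
import numpy as np

# Synthetic two-wave panel: partisanship is highly stable and shapes a core
# value at wave 2; by construction there is no reverse (value -> party) path.
rng = np.random.default_rng(0)
n = 2000
party_t1 = rng.normal(size=n)                          # party ID, wave 1
values_t1 = 0.3 * party_t1 + rng.normal(size=n)        # core value, wave 1
party_t2 = 0.8 * party_t1 + rng.normal(scale=0.5, size=n)
values_t2 = 0.5 * values_t1 + 0.3 * party_t1 + rng.normal(scale=0.5, size=n)

def ols(y, *xs):
    """OLS coefficients (intercept first) via least squares."""
    X = np.column_stack([np.ones_like(y), *xs])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

# Each wave-2 variable regressed on both wave-1 variables:
# coefficient on the same variable = stability path, on the other = cross-lag.
b_party = ols(party_t2, party_t1, values_t1)
b_values = ols(values_t2, values_t1, party_t1)

print(f"party_t2  ~ {b_party[1]:.2f}*party_t1 + {b_party[2]:.2f}*values_t1")
print(f"values_t2 ~ {b_values[1]:.2f}*values_t1 + {b_values[2]:.2f}*party_t1")
```

Run on these synthetic data, the party -> values cross-lag recovers a clearly positive coefficient while the values -> party cross-lag stays near zero, mirroring the asymmetric pattern the abstract reports (real SEM additionally models measurement error and latent constructs, which plain OLS does not).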