Traditional Models
Selected Abstracts

PERSPECTIVE: SEXUAL CONFLICT AND SEXUAL SELECTION: CHASING AWAY PARADIGM SHIFTS
EVOLUTION, Issue 6 2003. TOMMASO PIZZARI
Traditional models of sexual selection propose that partner choice increases both average male and average female fitness in a population. Recent theoretical and empirical work, however, has stressed that sexual conflict may be a potent broker of sexual selection. When the fitness interests of males and females diverge, a reproductive strategy that increases the fitness of one sex may decrease the fitness of the other sex. The chase-away hypothesis proposes that sexual conflict promotes sexually antagonistic, rather than mutualistic, coevolution, whereby manipulative reproductive strategies in one sex are counteracted by the evolution of resistance to such strategies in the other sex. In this paper, we consider the criteria necessary to demonstrate the chase-away hypothesis. Specifically, we review sexual conflict with particular emphasis on the chase-away hypothesis; discuss the problems associated with testing the predictions of the chase-away hypothesis and the extent to which these predictions and the predictions of traditional models of sexual selection are mutually exclusive; discuss misconceptions and mismeasures of sexual conflict; and suggest an alternative approach to demonstrate sexual conflict, measure the intensity of sexually antagonistic selection in a population, and elucidate the coevolutionary trajectories of the sexes.

Mortgage Lending, Sample Selection and Default
REAL ESTATE ECONOMICS, Issue 4 2000. Stephen L. Ross
Traditional models of mortgage default suffer from sample-selection bias because they do not control for the loan approval process. This paper estimates a sample-selection-corrected default model using the 1990 Boston Federal Reserve loan application sample and the 1992 Federal Housing Authority (FHA) foreclosure sample. A single-equation FHA default model appears to suffer from substantial selection bias, but the bias primarily arises from the omission of credit history and other variables that are only in the application sample. Therefore, default models that contain detailed information on applicants may not suffer from substantial selection bias. Finally, a test for prejudice-based discrimination is developed and conducted, but the findings are inconclusive.

Speciation and inversions: Chimps and humans
BIOESSAYS, Issue 9 2003. Jody Hey
A new set of models has resurrected a role for chromosomal inversions in the formation of new species [1-3]. Traditional models, which are generally considered to be unlikely in most cases, had imagined that inversions might aid speciation by directly causing low hybrid fitness. In contrast, the newer models focus on the effect that inversions have on local recombination rates. A test of these models found a strikingly high rate of amino-acid substitution within regions where humans and chimpanzees differ by inversions, suggesting perhaps that our ancestral species underwent a divergence process in which gene flow and inversions played a key role [4]. However, it remains uncertain whether this interesting finding is actually consistent with the proposed model. BioEssays 25:825-828, 2003. © 2003 Wiley Periodicals, Inc.

Isolation and high genetic diversity in dwarf mountain toads (Capensibufo) from South Africa
BIOLOGICAL JOURNAL OF THE LINNEAN SOCIETY, Issue 4 2010. KRYSTAL A. TOLLEY
Traditional models of amphibian dispersal and gene flow point to low dispersal and high philopatry. In recent years, this traditional view has been challenged and it appears that no general model holds across taxa. Conservation of amphibians cannot be addressed on an over-arching scale, but must come on a case-by-case basis, especially for range-restricted species where information on gene flow and migration must be incorporated into conservation efforts. The only two members of the genus Capensibufo Grandison, 1980 (Anura: Bufonidae) are range-restricted small bufonids, with distributions limited to montane areas in South Africa. Using a Bayesian analysis of two mitochondrial markers (16S and ND2), we examined the genetic patterns in Capensibufo rosei and Capensibufo tradouwi in order to understand both taxonomic and geographic boundaries. These species were not monophyletic, and demonstrate no clear taxonomic boundaries. Instead, the genus is extremely diverse genetically, with distinct lineages confined to isolated mountains that represent geographic boundaries. In addition, bioclimatic modelling using MAXENT and scenarios of climatic conditions at both the present and last glacial maximum suggest multiple bioclimatic and physical barriers to gene flow at present and in the past. We conclude that members of the genus have very low vagility, that current taxonomic boundaries are inadequate, and that strong geographic structuring has undoubtedly contributed to genetic diversity at the species level, rather than the population level. © 2010 The Linnean Society of London, Biological Journal of the Linnean Society, 2010, 100, 822-834.

Inclusion or control? Commissioning and contracting services for people with learning disabilities
BRITISH JOURNAL OF LEARNING DISABILITIES, Issue 4 2006. Liam Concannon
Accessible summary:
- The rise of new public management has seen the role of the social worker becoming increasingly administrative and less about face-to-face contact with service users.
- When commissioning managers seek to help people with learning disabilities plan their services, who actually makes the decisions?
- Direct payments are proposed as the answer for people with learning disabilities to take the lead, but is this a real shift in power from managers to service users?
This paper examines what commissioning and contracting means for people with learning disabilities. It asks if the voices of service users are heard when it comes to planning their services and, more significantly, whether their choices are respected and acted upon by commissioners. The government believes the introduction of direct payments will change the way social care is administered, by placing both the decision-making and the funding firmly in the hands of people with learning disabilities. However, the question remains as to how far this can be successful, considering the complicated administration and financial processes involved. The paper explores new ground in terms of research by investigating the effects that new public management, in the form of commissioning and contracting, has on the lives of people with learning disabilities. It looks at the relationship between the service user, care manager and commissioner, and asks whether management structures help individuals or actually create further barriers to participation and inclusion.
Summary: This paper seeks to critically assess the impact made by the introduction of commissioning and contracting as a new culture of social care in learning disability services. It offers an evaluation of the growth in importance of the user as consumer. Does the commissioning and the contract process give users with learning disabilities a greater influence over their services and ultimately their lives? It is suggested that, far from empowering people with learning disabilities to have a say in the services they want, the emerging culture of business contracts and new public management transfers power firmly back into the hands of professionals making the decisions. Social work practice is changing in response to major shifts in social trends and at the behest of market values. Traditional models are being rejected and the challenge for social work is to adapt itself to operate within a competency-based paradigm. The paper argues that at the centre of this new culture is a government use of a system of performance management that successfully drives down cost. Thus there remain contradictions between the adoption of a mixed economy of care; services planning; consumerism; resource constraints; and the communication difficulties experienced by many people with learning disabilities.

Why environmental scientists are becoming Bayesians
ECOLOGY LETTERS, Issue 1 2005. James S. Clark
Advances in computational statistics provide a general framework for the high-dimensional models typically needed for ecological inference and prediction. Hierarchical Bayes (HB) represents a modelling structure with capacity to exploit diverse sources of information, to accommodate influences that are unknown (or unknowable), and to draw inference on large numbers of latent variables and parameters that describe complex relationships. Here I summarize the structure of HB and provide examples for common spatiotemporal problems. The flexible framework means that parameters, variables and latent variables can represent broader classes of model elements than are treated in traditional models. Inference and prediction depend on two types of stochasticity: (1) uncertainty, which describes our knowledge of fixed quantities, applies to all 'unobservables' (latent variables and parameters), and declines asymptotically with sample size; and (2) variability, which applies to fluctuations that are not explained by deterministic processes and does not decline asymptotically with sample size. Examples demonstrate how different sources of stochasticity impact inference and prediction and how allowance for stochastic influences can guide research.

Biopolitical Utopianism in Educational Theory
EDUCATIONAL PHILOSOPHY AND THEORY, Issue 7 2007. Tyson Lewis
In this paper I shift the center of utopian debates away from questions of ideology towards the question of power. As a new point of departure, I analyze Foucault's notion of biopower as well as Hardt and Negri's theory of biopolitics. Arguing for a new hermeneutic of biopolitics in education, I then apply this lens to evaluate the educational philosophy of John Dewey. In conclusion, the paper suggests that while Hardt and Negri are missing an educational theory, John Dewey is missing a concept of democracy adequate to the biopolitical struggles of the multitude. Thus, I call for a synthesis of Dewey and Hardt and Negri in order to generate a biopedagogical practice beyond both traditional models of education and current standardization.

GENETICS AND RECENT HUMAN EVOLUTION
EVOLUTION, Issue 7 2007. Alan R. Templeton
Starting with "mitochondrial Eve" in 1987, genetics has played an increasingly important role in studies of the last two million years of human evolution. It initially appeared that genetic data resolved the basic models of recent human evolution in favor of the "out-of-Africa replacement" hypothesis, in which anatomically modern humans evolved in Africa about 150,000 years ago, started to spread throughout the world about 100,000 years ago, and subsequently drove to complete genetic extinction (replacement) all other human populations in Eurasia. Unfortunately, many of the genetic studies on recent human evolution have suffered from scientific flaws, including misrepresenting the models of recent human evolution, focusing upon hypothesis compatibility rather than hypothesis testing, committing the ecological fallacy, and failing to consider a broader array of alternative hypotheses. Once these flaws are corrected, there is actually little genetic support for the out-of-Africa replacement hypothesis. Indeed, when genetic data are used in a hypothesis-testing framework, the out-of-Africa replacement hypothesis is strongly rejected. The model of recent human evolution that emerges from a statistical hypothesis-testing framework does not correspond to any of the traditional models of human evolution, but it is compatible with fossil and archaeological data. These studies also reveal that any one gene or DNA region captures only a small part of human evolutionary history, so multilocus studies are essential. As more and more loci become available, genetics will undoubtedly offer additional insights and resolutions of human evolution.

Assessing the impact of mixing assumptions on the estimation of streamwater mean residence time
HYDROLOGICAL PROCESSES, Issue 12 2010. Fabrizio Fenicia
Catchment streamwater mean residence time (Tmr) is an important descriptor of hydrological systems, reflecting their storage and flow pathway properties. Tmr is typically inferred from the composition of stable water isotopes (oxygen-18 and deuterium) in observed rainfall and discharge. Currently, lumped parameter models based on convolution and sinewave functions are usually used for tracer simulation. These traditional models are based on simplistic assumptions that are often known to be unrealistic, in particular, steady flow conditions, linearity, complete mixing and others. However, the effect of these assumptions on Tmr estimation is seldom evaluated. In this article, we build a conceptual model that overcomes several assumptions made in traditional mixing models. Using data from the experimental Maimai catchment (New Zealand), we compare a complete-mixing (CM) model, where rainfall water is assumed to mix completely and instantaneously with the total catchment storage, with a partial-mixing (PM) model, where the tracer input is divided between an 'active' and a 'dead' storage compartment. We show that the inferred distribution of Tmr is strongly dependent on the treatment of mixing processes and flow pathways. The CM model returns estimates of Tmr that are well identifiable and are in general agreement with previous studies of the Maimai catchment. On the other hand, the PM model, motivated by a priori catchment insights, provides Tmr estimates that appear exceedingly large and highly uncertain. This suggests that water isotope composition measurements in rainfall and discharge alone may be insufficient for inferring Tmr. Given our model hypothesis, we also analysed the effect of different controls on Tmr. It was found that Tmr is controlled primarily by the storage properties of the catchment, rather than by the speed of streamflow response. This provides guidance on the type of information necessary to improve Tmr estimation. Copyright © 2010 John Wiley & Sons, Ltd.

Predicting river water temperatures using the equilibrium temperature concept with application on Miramichi River catchments (New Brunswick, Canada)
HYDROLOGICAL PROCESSES, Issue 11 2005. Daniel Caissie
Water temperature influences most of the physical, chemical and biological properties of rivers. It plays an important role in the distribution of fish and the growth rates of many aquatic organisms. Therefore, a better understanding of the thermal regime of rivers is essential for the management of important fisheries resources. This study deals with the modelling of river water temperature using a new and simplified model based on the equilibrium temperature concept. The equilibrium temperature concept is an approach where the net heat flux at the water surface can be expressed by a simple equation with fewer meteorological parameters than required with traditional models. This new water temperature model was applied on two watercourses of different size and thermal characteristics, but within a similar meteorological region, i.e., the Little Southwest Miramichi River and Catamaran Brook (New Brunswick, Canada). A study of the long-term thermal characteristics of these two rivers revealed that the greatest differences in water temperatures occurred during mid-summer peak temperatures. Data from 1992 to 1994 were used for the model calibration, while data from 1995 to 1999 were used for the model validation. Results showed a slightly better agreement between observed and predicted water temperatures for Catamaran Brook during the calibration period, with a root-mean-square error (RMSE) of 1·10 °C (Nash coefficient, NTD = 0·95) compared to 1·45 °C for the Little Southwest Miramichi River (NTD = 0·94). During the validation period, RMSEs were calculated at 1·31 °C for Catamaran Brook and 1·55 °C for the Little Southwest Miramichi River. Poorer model performances were generally observed early in the season (e.g., spring) for both rivers due to the influence of snowmelt conditions, while late summer to autumn modelling performances showed better results. Copyright © 2005 John Wiley & Sons, Ltd.

Towards a more general species-area relationship: diversity on all islands, great and small
JOURNAL OF BIOGEOGRAPHY, Issue 4 2001. Lomolino
Aim: To demonstrate a new and more general model of the species-area relationship that builds on traditional models, but includes the provision that richness may vary independently of island area on relatively small islands (the small island effect). Location: We analysed species-area patterns for a broad diversity of insular biotas from aquatic and terrestrial archipelagoes. Methods: We used breakpoint or piecewise regression methods by adding an additional term (the breakpoint transformation) to traditional species-area models. The resultant, more general, species-area model has three readily interpretable, biologically relevant parameters: (1) the upper limit of the small island effect (SIE), (2) an estimate of richness for relatively small islands and (3) the slope of the species-area relationship (in semi-log or log-log space) for relatively large islands. Results: The SIE, albeit of varying magnitude depending on the biotas in question, appeared to be a relatively common feature of the data sets we studied. The upper limit of the SIE tended to be highest for species groups with relatively high resource requirements and low dispersal abilities, and for biotas of more isolated archipelagoes. Main conclusions: The breakpoint species-area model can be used to test for the significance, and to explore patterns of variation in small island effects, and to estimate slopes of the species-area (semi-log or log-log) relationship after adjusting for SIE. Moreover, the breakpoint species-area model can be expanded to investigate three fundamentally different realms of the species-area relationship: (1) small islands where species richness varies independent of area, but with idiosyncratic differences among islands and with catastrophic events such as hurricanes, (2) islands beyond the upper limit of SIE where richness varies in a more deterministic and predictable manner with island area and associated ecological factors and (3) islands large enough to provide the internal geographical isolation (large rivers, mountains and other barriers within islands) necessary for in situ speciation.

The second educational revolution: rethinking education in the age of technology
JOURNAL OF COMPUTER ASSISTED LEARNING, Issue 1 2010. A. Collins
This paper drew upon a recent book (Rethinking Education in the Age of Technology) to summarize a number of prospects and challenges arising from the appropriation of digital technology into learning and educational practice.
Tensions between traditional models of schooling and the affordances of digital media were noted, while the promise of these technologies for shaping a new system of education was reviewed. It was argued that new technology brings radical opportunities but also significant challenges. The urgency of seeking a coherent model for the future of education in a technological age was stressed.

"I'm feeling lucky": The role of emotions in seeking information on the Web
JOURNAL OF THE AMERICAN SOCIETY FOR INFORMATION SCIENCE AND TECHNOLOGY, Issue 6 2006. James Kalbach
Recent research highlights the potential relevance of emotions in interface design. People can no longer be modeled as purely goal-driven, task-solving agents: They also have affective motivations for their choices and behavior implying an extended mandate for search design. Absent from current Web design practice, however, is a pattern for emotive criticism and design reflecting these new directions. Further, discussion of emotions and Web design is not limited to visual design or aesthetic appeal: Emotions users have as they interact with information also have design implications. The author outlines a framework for understanding users' emotional states as they seek information on the Web. It is inspired largely by Carol Kuhlthau's (1991, 1993, 1999) work in library services, particularly her information searching process (ISP), which is adapted to Web design practice. A staged approach resembling traditional models of information seeking behavior is presented here as the basis for creating appropriate search and navigation systems. This user-centered framework is flexible and solution-oriented, enjoys longevity, and considers affective factors. Its aim is a more comprehensive, conceptual analysis of the user's entire information search experience.

Binary models for marginal independence
JOURNAL OF THE ROYAL STATISTICAL SOCIETY: SERIES B (STATISTICAL METHODOLOGY), Issue 2 2008. Mathias Drton
Log-linear models are a classical tool for the analysis of contingency tables. In particular, the subclass of graphical log-linear models provides a general framework for modelling conditional independences. However, with the exception of special structures, marginal independence hypotheses cannot be accommodated by these traditional models. Focusing on binary variables, we present a model class that provides a framework for modelling marginal independences in contingency tables. The approach that is taken is graphical and draws on analogies with multivariate Gaussian models for marginal independence. For the graphical model representation we use bidirected graphs, which are in the tradition of path diagrams. We show how the models can be parameterized in a simple fashion, and how maximum likelihood estimation can be performed by using a version of the iterated conditional fitting algorithm. Finally we consider combining these models with symmetry restrictions.

A bicriterion approach for routing problems in multimedia networks
NETWORKS: AN INTERNATIONAL JOURNAL, Issue 4 2003. João C. N. Clímaco
Routing problems in communication networks supporting multiple services, namely, multimedia applications, involve the selection of paths satisfying multiple constraints (of a technical nature) and seeking simultaneously to "optimize" the associated metrics. Although traditional models in this area are single-objective, in many situations, it is important to consider different, eventually conflicting, objectives. In this paper, we consider a bicriterion model dedicated to calculating nondominated paths for specific traffic flows (associated with video services) in multiservice high-speed networks. The mathematical formulation of the problem and the bicriterion algorithmic approach developed for its resolution are presented together with computational tests regarding an application to video-traffic routing in a high-speed network. The algorithmic approach is an adaptation of recent work by Ernesto Martins and his collaborators, namely, the MPS algorithm. © 2003 Wiley Periodicals, Inc.

Borrower Credit and the Valuation of Mortgage-Backed Securities
REAL ESTATE ECONOMICS, Issue 4 2005. Francis A. Longstaff
We study the valuation of mortgage-backed securities when borrowers may have to refinance at premium rates because of their credit. The optimal refinancing strategy often results in prepayment being delayed significantly relative to traditional models. Furthermore, mortgage values can exceed par by much more than the cost of refinancing. Applying the model to an extensive sample of mortgage-backed security prices, we find that the implied credit spreads that match these prices closely parallel borrowers' actual spreads at the origination of the mortgage. These results suggest that models that incorporate borrower credit into the analysis may provide a promising alternative to the reduced-form prepayment models widely used in practice.

Marie Nathusius' Elisabeth and Fontane's Effi Briest: Mental Illness and Marital Discord in the "Century of Nerves"
THE GERMAN QUARTERLY, Issue 1 2010. Nicole Thesz
This comparative analysis of Marie Nathusius' Elisabeth (1856/57) and Theodor Fontane's Effi Briest (1895) reveals striking similarities. Both novels depict child brides whose disappointment in marriage leads to nervous ailments. The Kur that both heroines undergo represents one of several contrasts between domesticity and the outside world. Illness, in Nathusius's portrayal, is an opportunity to negotiate difficulties in marital relationships. While Elisabeth upholds traditional models of femininity, it also shows the husband's nervous reactions to discord. Like Effi Briest, there are implications of social pressures, but ultimately healing Elisabeth involves her free will to choose religious faith, and thus health or "das Heil". Fontane, in contrast, places the etiology of illness firmly within the vicissitudes of patriarchal society, which crushes the individual beneath its hypocritical norms. The Kur thus offers Effi no respite, and instead transports her toward isolation and untimely death.

The role of evaporite mobility in modifying subsidence patterns during normal fault growth and linkage, Halten Terrace, Mid-Norway
BASIN RESEARCH, Issue 2 2005. Nick J. Richardson
Well-calibrated seismic interpretation in the Halten Terrace of Mid-Norway demonstrates the important role that structural feedback between normal fault growth and evaporite mobility has for depocentre development during syn-rift deposition of the Jurassic-Early Cretaceous Viking and Fangst Groups. While the main rift phase reactivated pre-existing structural trends, and initiated new extensional structures, a Triassic evaporite interval decouples the supra-salt cover strata from the underlying basement, causing the development of two separate fault populations, one in the cover and the other confined to the pre-salt basement.
Detailed displacement-length analyses of both cover and basement fault arrays, combined with mapping of the component parts of the syn-rift interval, have been used to reveal the spatial and temporal evolution of normal fault segments and sediment depocentres within the Halten Terrace area. Significantly, the results highlight important differences with traditional models of normal fault-controlled subsidence, including those from parts of the North Sea where salt is absent. It can now be shown that evaporite mobility is intimately linked to the along-strike displacement variations of these cover and basement faults. The evaporites passively move beneath the cover in response to the extension, such that the evaporite thickness becomes greatest adjacent to regions of high fault displacement. The consequent evaporite swells can become large enough to have pronounced palaeobathymetric relief in hangingwall locations, associated with fault displacement maxima, the exact opposite situation to that predicted by traditional models of normal fault growth. Evaporite movement from previous extension also affects the displacement-length relationships of subsequently nucleated or reactivated faults. Evaporite withdrawal, on the other hand, tends to be a later-stage feature associated with the high stress regions around the propagating tips of normal faults or their coeval hangingwall release faults. The results indicate the important effect of, and structural feedback caused by, syn-rift evaporite mobility in heavily modifying subsidence patterns produced by normal fault array evolution. Despite their departure from published models, the results provide a new, generic framework within which to interpret extensional fault and depocentre development and evolution in areas in which mobile evaporites exist.

A Bayesian Approach to Surrogacy Assessment Using Principal Stratification in Clinical Trials
BIOMETRICS, Issue 2 2010. Yun Li
A surrogate marker (S) is a variable that can be measured earlier and often more easily than the true endpoint (T) in a clinical trial. Most previous research has been devoted to developing surrogacy measures to quantify how well S can replace T or examining the use of S in predicting the effect of a treatment (Z). However, the research often requires one to fit models for the distribution of T given S and Z. It is well known that such models do not have causal interpretations because the models condition on a postrandomization variable S. In this article, we directly model the relationship among T, S, and Z using a potential outcomes framework introduced by Frangakis and Rubin (2002, Biometrics 58, 21-29). We propose a Bayesian estimation method to evaluate the causal probabilities associated with the cross-classification of the potential outcomes of S and T when S and T are both binary. We use a log-linear model to directly model the association between the potential outcomes of S and T through the odds ratios. The quantities derived from this approach always have causal interpretations. However, this causal model is not identifiable from the data without additional assumptions. To reduce the nonidentifiability problem and increase the precision of statistical inferences, we assume monotonicity and incorporate prior belief that is plausible in the surrogate context by using prior distributions. We also explore the relationship among the surrogacy measures based on traditional models and this counterfactual model. The method is applied to the data from a glaucoma treatment study.

Export market correlation and strategic trade policy
CANADIAN JOURNAL OF ECONOMICS, Issue 1 2000. Mahmudul Anam
In the traditional models of strategic trade policy pioneered by Brander and Spencer, exports of the domestic firm, engaged in a Cournot-Nash competition with the foreign firm in a neutral market, must be subsidized to maximize national welfare. We demonstrate that when the firms play the Cournot-Nash game in two stochastic and positively correlated markets, it may be optimal to tax exports to the more volatile market while subsidizing them in the other. The policy combination reduces the amplitude of aggregate profit and raises the utility of the risk-averse firm in a manner similar to the theory of portfolio choice. JEL Classification: F12, D18

AN EFFICIENT MODEL FOR ENHANCING TEXT CATEGORIZATION USING SENTENCE SEMANTICS
COMPUTATIONAL INTELLIGENCE, Issue 3 2010. Shady Shehata
Most text categorization techniques are based on word and/or phrase analysis of the text. Statistical analysis of term frequency captures the importance of a term within a document only. However, two terms can have the same frequency in their documents, but one term contributes more to the meaning of its sentences than the other term. Thus, the underlying model should identify terms that capture the semantics of the text. In this case, the model can capture terms that present the concepts of the sentence, which leads to discovering the topic of the document. A new concept-based model that analyzes terms on the sentence, document, and corpus levels, rather than the traditional analysis of the document only, is introduced. The concept-based model can effectively discriminate between nonimportant terms with respect to sentence semantics and terms which hold the concepts that represent the sentence meaning. A set of experiments using the proposed concept-based model on different datasets in text categorization is conducted in comparison with the traditional models. The results demonstrate the substantial enhancement of categorization quality using the sentence-based, document-based and corpus-based concept analysis.

Applying business management models in health care
INTERNATIONAL JOURNAL OF HEALTH PLANNING AND MANAGEMENT, Issue 4 2002. Michael G. Trisolini
Most health care management training programmes and textbooks focus on only one or two models or conceptual frameworks, but the increasing complexity of health care organizations and their environments worldwide means that a broader perspective is needed. This paper reviews five management models developed for business organizations and analyses issues related to their application in health care. Three older, more 'traditional' models are first presented. These include the functional areas model, the tasks model and the roles model. Each is shown to provide a valuable perspective, but to have limitations if used in isolation. Two newer, more 'innovative' models are next discussed. These include total quality management (TQM) and reengineering. They have shown potential for enabling dramatic improvements in quality and cost, but have also been found to be more difficult to implement. A series of 'lessons learned' is presented to illustrate key success factors for applying them in health care organizations. In sum, each of the five models is shown to provide a useful perspective for health care management. Health care managers should gain experience and training with a broader set of business management models. Copyright © 2002 John Wiley & Sons, Ltd.
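
Several of the methods summarized in the abstracts above lend themselves to brief illustration. For the breakpoint species-area model described in the Journal of Biogeography abstract, the sketch below shows one minimal way to fit a two-segment (piecewise) regression in log-log space: richness is treated as flat below a breakpoint (the small island effect) and as rising linearly with log area above it, and candidate breakpoints are simply scanned. This is a generic reconstruction under those assumptions, not the authors' implementation; the function and parameter names are hypothetical.

```python
import numpy as np

def fit_breakpoint_sar(area, richness, n_candidates=200):
    """Two-segment species-area fit in log-log space: richness is constant
    below a breakpoint T (small island effect) and increases linearly with
    log10(area) above it. Returns (breakpoint, intercept, slope, rss)."""
    x = np.log10(np.asarray(area, dtype=float))
    y = np.log10(np.asarray(richness, dtype=float))
    best = None
    # Scan candidate breakpoints across the observed range of log-area.
    for T in np.linspace(x.min(), x.max(), n_candidates):
        hinge = np.maximum(x - T, 0.0)            # 0 below T, (x - T) above T
        X = np.column_stack([np.ones_like(x), hinge])
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        rss = float(((y - X @ coef) ** 2).sum())
        if best is None or rss < best[3]:
            best = (float(T), float(coef[0]), float(coef[1]), rss)
    return best

# Hypothetical usage with made-up island data:
# areas = [0.1, 0.5, 1, 5, 20, 100, 500]
# species = [4, 5, 4, 9, 15, 30, 52]
# breakpoint, intercept, slope, rss = fit_breakpoint_sar(areas, species)
```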
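The RMSE and Nash coefficient values reported in the river water temperature abstract (Hydrological Processes, 2005) are standard goodness-of-fit statistics. A minimal sketch of how such measures are typically computed from paired observed and simulated series is given below; it is a generic illustration, not the authors' code.

```python
import numpy as np

def rmse(observed, simulated):
    """Root-mean-square error between observed and simulated values."""
    obs = np.asarray(observed, dtype=float)
    sim = np.asarray(simulated, dtype=float)
    return float(np.sqrt(np.mean((obs - sim) ** 2)))

def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 means the model is
    no better than predicting the mean of the observations."""
    obs = np.asarray(observed, dtype=float)
    sim = np.asarray(simulated, dtype=float)
    return float(1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2))
```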
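The bicriterion routing abstract (Networks, 2003) is built around the idea of nondominated (Pareto-optimal) paths. The sketch below shows only the dominance test that determines which candidate paths are worth retaining when both criteria are minimized; it is not the MPS algorithm itself, and the data layout is an assumption made for illustration.

```python
def nondominated(costs):
    """Keep the (cost1, cost2) pairs that no other pair dominates.
    A pair dominates another if it is no worse on both criteria and
    strictly better on at least one (both criteria are minimized)."""
    kept = []
    for i, (a1, a2) in enumerate(costs):
        dominated = any(
            b1 <= a1 and b2 <= a2 and (b1 < a1 or b2 < a2)
            for j, (b1, b2) in enumerate(costs)
            if j != i
        )
        if not dominated:
            kept.append((a1, a2))
    return kept

# Example: nondominated([(3, 9), (4, 4), (6, 3), (7, 8)])
# returns [(3, 9), (4, 4), (6, 3)]; (7, 8) is dominated by (4, 4).
```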