Implicit Assumption
Selected Abstracts

Application of the New Keystone-Species Concept to Prairie Dogs: How Well Does It Work?
CONSERVATION BIOLOGY, Issue 6 2000
Natasha B. Kotliar

It has been suggested that the keystone-species concept should no longer be used in ecology and conservation, largely because the concept has been poorly defined. This prompted Power et al. (1996) to refine the definition: keystone species have large effects on community structure or ecosystem function (i.e., high overall importance), and this effect should be large relative to abundance (i.e., high community importance). Using prairie dogs (Cynomys spp.) as an example, I review operational and conceptual difficulties encountered in applying this definition. As applied to prairie dogs, the implicit assumption that overall importance is a linear function of abundance is invalid. In addition, community importance is sensitive to abundance levels, the definition of community, and sampling scale. These problems arise largely from the equation for community importance, as used in conjunction with removal experiments at single abundance levels. I suggest that we shift from the current emphasis on the dualism between keystone and nonkeystone species and instead examine how overall and community importance vary (1) with abundance, (2) across spatial and temporal scales, and (3) under diverse ecological conditions. In addition, I propose that a third criterion be incorporated into the definition: keystone species perform roles not performed by other species or processes. Examination of how these factors vary among populations of keystone species should help identify the factors contributing to, or limiting, keystone-level functions, thereby increasing the usefulness of the keystone-species concept in ecology and conservation. Although the quantitative framework of Power et al. falls short of being fully operational, my conceptual guidelines may improve the usefulness of the keystone-species concept. Careful attention to the factors that limit keystone function will help avoid misplaced emphasis on keystone species at the expense of other species.
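The "equation for community importance" at issue here is, in Power et al.'s (1996) formulation, CI = [(t_N - t_D)/t_N](1/p_i), where t_N is the value of a community trait with the species present, t_D its value after the species is removed, and p_i the species' proportional abundance. A minimal Python sketch, using hypothetical numbers and an assumed saturating (non-linear) effect of abundance on the trait, shows why a removal experiment at a single abundance level can give very different answers:

```python
# Hedged illustration of Power et al.'s (1996) community importance index.
# The effect curve and all numbers below are hypothetical, chosen only to
# mirror the abstract's point that overall importance need not be a linear
# function of abundance.

def community_importance(t_n: float, t_d: float, p_i: float) -> float:
    """CI = [(t_N - t_D) / t_N] * (1 / p_i)."""
    return ((t_n - t_d) / t_n) / p_i

def trait_with_species(p: float) -> float:
    """Hypothetical community trait that saturates as the species'
    proportional abundance p grows (a non-linear overall effect)."""
    return 10.0 + 20.0 * p / (0.1 + p)

t_d = trait_with_species(0.0)  # trait value with the species removed

# The same species, assessed by removal experiments at two abundance levels:
for p in (0.05, 0.40):
    ci = community_importance(trait_with_species(p), t_d, p)
    print(f"proportional abundance {p:.2f}: CI = {ci:.2f}")
```

With these made-up numbers, CI falls from 8.0 at low abundance to about 1.5 at high abundance, so whether the species clears a keystone threshold depends on the abundance at which the single removal experiment happens to be done, which is the sensitivity the abstract describes.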
[source]

THE ENFORCEMENT OF COOPERATION BY POLICING
EVOLUTION, Issue 7 2010
Claire El Mouden

Policing is regarded as an important mechanism for maintaining cooperation in human and animal social groups. A simple model providing a theoretical overview of the coevolution of policing and cooperation has been analyzed by Frank (1995, 1996b, 2003, 2009), and this suggests that policing will evolve to fully suppress cheating within social groups when relatedness is low. Here, we relax some of the assumptions made by Frank and investigate the consequences for policing and cooperation. First, we address the implicit assumption that the individual cost of investment into policing is reduced when selfishness dominates. We find that relaxing this assumption leads to policing being favored only at intermediate relatedness. Second, we address the assumption that policing fully recovers the loss of fitness incurred by the group owing to selfishness. We find that relaxing this assumption prohibits the evolution of full policing. Finally, we consider the impact of demography on the coevolution of policing and cooperation, in particular the role of kin competition in disfavoring the evolution of policing, using both a heuristic "open" model and a "closed" island model. We find that large groups and increased kin competition disfavor policing, and that policing is maintained more readily than it invades. Policing may be harder to evolve than previously thought. [source]

THE EVOLUTION OF SPERM-ALLOCATION STRATEGIES AND THE DEGREE OF SPERM COMPETITION
EVOLUTION, Issue 3 2005
Paul D. Williams

Abstract The prevailing viewpoint in the study of sperm competition is that male sperm-allocation strategies evolve in response to the degree of sperm competition an ejaculate can expect to experience within a given mating.
If males cannot assess the degree of sperm competition their ejaculate will face and/or they are unable to facultatively adjust sperm investment in response to perceived levels of competition, high sperm allocation (per mating) is predicted to evolve in the context of high sperm competition. An implicit assumption of the framework used to derive this result is that the degree of sperm competition is unaffected by changes in sperm-allocation strategies. We present theory based on an alternative perspective, in which the degree of sperm competition and the sperm-allocation strategy are coupled traits that coevolve. Our rationale is that the pattern of sperm allocation in the population will, in part, determine the level of sperm competition by affecting the number of ejaculates per female in the population. In this setting, evolution in sperm-allocation strategies is driven by changes in underlying environmental parameters that influence both the degree of sperm competition and sperm allocation. This change in perspective leads to predictions that are qualitatively different from those of previous theory. [source]

Role of the temperature distribution on the PN junction behaviour in the electro-thermal simulation
INTERNATIONAL JOURNAL OF NUMERICAL MODELLING: ELECTRONIC NETWORKS, DEVICES AND FIELDS, Issue 6 2004
Hatem Garrab

Abstract Electro-thermal simulations of a PIN-diode based on the finite-element method show a non-uniform temperature distribution inside the device during switching transients. Hence, the implicit assumption of a uniform temperature distribution when coupling an analytical electrical model and a thermal model has so far yielded inaccurate electro-thermal behaviour of the PIN-diode. The idea of including non-uniform temperature distribution in power semiconductor device models is not new, as accurate electro-thermal simulations are required for designing compact power electronic systems (such as ICs or MCMs).
Instead of using a one-dimensional finite difference or element method, bond graphs and the hydrodynamic method are utilized to build an electro-thermal model of the PIN-diode. The results obtained by this original technique are compared with those obtained by a commercial finite-element simulator. The results are similar, but the computation effort of the proposed technique is a fraction of that required by finite-element simulators. Moreover, the proposed technique may be applied easily to other power semiconductor devices. Copyright © 2004 John Wiley & Sons, Ltd. [source]

The Time Series Properties of Financial Ratios: Lev Revisited
JOURNAL OF BUSINESS FINANCE & ACCOUNTING, Issue 5-6 2003
Christos Ioannidis

This paper re-evaluates the time series properties of financial ratios. It presents new empirical analysis which explicitly allows for the possibility that financial ratios can be characterized as non-linear mean-reverting processes. Financial ratios are widely employed as explanatory variables in accounting and finance research, with applications ranging from the determinants of auditors' compensation to explaining firms' investment decisions. An implicit assumption in this empirical work is that the ratios are stationary so that the postulated models can be estimated by classical regression methods. However, recent empirical work on the time series properties of corporate financial ratios has reported that the level of the majority of ratios is described by non-stationary, I(1), integrated processes and that the ratio differences are parsimoniously described by random walks. We hypothesize that financial ratios may follow a random walk near their target level, but that the more distant a ratio is from target, the more likely the firm is to take remedial action to bring it back towards target.
This behavior will result in a significant size distortion of the conventional stationarity tests and lead to frequent non-rejection of the null hypothesis of non-stationarity, a finding which undermines the use of these ratios as reliable conditioning variables for the explanation of firms' decisions. [source]

A theoretical explanation for the retention mechanism of ion exclusion chromatography
JOURNAL OF SEPARATION SCIENCE, Issue 17 2003
Bronisław K. G.

Abstract Ion exclusion chromatography (IEC) is classically used for the separation of weak acid anions. Dilute strong acids (e.g. sulphuric or perchloric acid) or just water are used as eluents. To increase the exclusion effect, strong cation exchangers, characterized by a high concentration of functional groups, are applied. The inner column volume of commercially available columns is increased by increasing their size in comparison to traditional ones (usually 300×7.8 mm ID). The description of the retention mechanism of this technique implicitly assumes that both mobile and stationary phases are typical aqueous solutions, and that their dielectric constants are thus equal. This equality implies the equality of solute dissociation constants in both phases. Another implicit assumption is that the dead- and inner volumes of the column are constant and independent of the mobile phase composition. The present paper shows that stationary and mobile phases are generally characterized by different physicochemical parameters. Thus, they cannot be considered as regular aqueous solutions. Additionally, we show that weak cation exchanger resins, which are characterized by a relatively small concentration of functional groups, and weak acid based buffers can also be used in IEC. This would expand the possible applications of this method and enable, for example, the separation of strong acids (anions). The influence of ionic strength on the retention and dead- and inner column volumes is also discussed.
Finally, we also briefly describe the retention mechanism of electrostatic ion chromatography. [source]

ENDOGENOUS GROWTH, PRICE STABILITY AND MARKET DISEQUILIBRIA
METROECONOMICA, Issue 1 2010
Orlando Gomes

ABSTRACT Resorting to an endogenous growth framework, the paper studies the implications of taking market clearing as a long-term possibility rather than an implicit every-period assumption, as in conventional growth analysis. The main underlying assumption relates to an adjustment mechanism in which: (1) transitional dynamics are characterized by the persistence of an accumulated market imbalance, and (2) monetary authorities are able to guarantee price stability. The implications of this modeling structure are the following: (1) a market-clearing equilibrium may co-exist with other equilibrium points, (2) several types of stability outcomes are obtainable, and (3) monetary policy becomes relevant for growth. [source]

The Identity Division of Labor in Native Alaska
AMERICAN ANTHROPOLOGIST, Issue 1 2009
LISA FRINK

ABSTRACT There is often an implicit assumption that women's technologies and associated tasks in subsistence-based groups are expedient and simple. For instance, in Native Alaska, the butchering of fish has been depicted as arduous but uncomplicated work. On the contrary, closer examinations, as well as discussions with the people who are still learning and practicing subsistence tasks, indicate that this perspective is inaccurate. Instead, these taken-for-granted technologies and techniques require a lifetime of training and practice, and not all people achieve master status. Drawing from data on contemporary herring processing and the related tools of the trade, I explore the division of labor in the context of expertise and apprenticeship.
[Keywords: apprenticeship, expertise, gender, age, Alaska] [source]

Population in the UN Environment Programme's Global Environment Outlook 2000
POPULATION AND DEVELOPMENT REVIEW, Issue 3 2000
Article first published online: 27 JAN 200

Most specialized agencies in the United Nations system have taken to compiling a periodic status report on their field. The UN Environment Programme (UNEP) issued the first in a proposed biennial series in 1998, titled Global Environment Outlook-1 or GEO-1. The second in the series, Global Environment Outlook 2000, was published in 1999. GEO-2000 is described by UNEP's Executive Director, Klaus Töpfer, in the foreword as "a comprehensive integrated assessment of the global environment at the turn of the millennium, [and] a forward-looking document, providing a vision into the 21st century." Its status, however, is rendered uncertain by the printed caution that "The contents of this volume do not necessarily reflect the views or policies of UNEP or contributory organizations." GEO-2000 paints a generally bleak picture of environmental trends. It evidences a wide array of particulars ("In the Southern Ocean, the Patagonian toothfish is being over-fished and there is a large accidental mortality of seabirds caught up in fishing equipment"), but perhaps of more import are its statements about the root causes of environmental problems and what must be done. The excerpts below reflect some of these general views as they pertain to population. They are taken from the section entitled "Areas of danger and opportunity" in Chapter 1 of the report, and from the section "Tackling root causes" in Chapter 5. High resource consumption, fueled by affluent, Western lifestyles, is seen as a basic cause of environmental degradation. Cutting back this consumption will be required, freeing up resources for development elsewhere.
Materialist values associated with urban living are part of the problem, given the concentration of future population growth in cities. And "genuine globalization" will entail free movement of people as well as capital and goods, thus optimizing "the population to environmental carrying capacity." Some of these positions are at least questionable: the supposed 'innate environmental sensitivity of people raised on the land or close to nature', or the aim of 'globalization of population movements'. The latter does not appear in the recommendations, perhaps because of an implicit assumption that the effect of open borders on environmental trends is unlikely to be favorable. (For an earlier statement of the same sentiment, from 1927, see the comments by Albert Thomas, first director of the ILO, reproduced in the Archives section of PDR 9, no. 4.) [source]

INDIVIDUALIZATION AND PUBLIC SECTOR LEADERSHIP
PUBLIC ADMINISTRATION, Issue 1 2008
JOHN LAWLER

This is a conceptual paper whose aim is to relate the development of 'individualization' (Beck and Beck-Gernsheim 2002) to organizational leadership. It does this by examining individualization alongside the implicit assumption on which orthodox approaches to leadership are founded, namely that leadership is an individualized phenomenon. Despite the expanding literature on these topics, particularly that on leadership, these concepts have not been examined in relation to one another. This paper seeks to do this in two ways. Firstly, it highlights the increased attention given to leadership in the UK public sector, locating leadership as a continuation of public sector managerialism. Secondly, it discusses the development of the trend of individualization more broadly. The paper's main discussion focuses on leadership as an individual activity and on the consequences of that approach.
In particular, it argues that individualized leadership presents a restrictive perspective which does not allow for exploration of a broader range of leadership approaches, particularly that of distributed leadership, which have especial relevance for public sector organizations. [source]

Photoreactions and lateral patterning in Langmuir and Langmuir-Blodgett films
THE CHEMICAL RECORD, Issue 2 2007
Mutsuyoshi Matsumoto

Abstract Reversible morphological changes occur with photoisomerization of azobenzene in Langmuir-Blodgett (LB) films complexed with polycations, which contradicts an implicit assumption of the concept of free volume that two-dimensional film structures are preserved during the photoisomerization. J-aggregates of chromophores are formed by two processes. The first process is "light-induced J-aggregation," in which photoisomerized molecules form J-aggregates. The other process is "triggered J-aggregation," in which photoisomerization of one of the components triggers J-aggregation of another chemical species in the mixed films. Both processes of J-aggregation are in many cases accompanied by large morphological changes of the films. However, LB films fabricated using processes under isobaric conditions do not change their morphology during light-induced J-aggregation and are patterned with J-aggregates using ultraviolet illumination through a photomask. Phase separation in mixed LB films gives rise to two-dimensional patterns, which are used to fabricate templates by using an amphiphilic silane-coupling agent as one of the components in the mixed LB films. Nanopatterns are also fabricated. © 2007 The Japan Chemical Journal Forum and Wiley Periodicals, Inc. Chem Rec 7: 69-77; 2007. Published online in Wiley InterScience (www.interscience.wiley.com) DOI 10.1002/tcr.20099 [source]

Measuring forecast skill: is it real skill or is it the varying climatology?
THE QUARTERLY JOURNAL OF THE ROYAL METEOROLOGICAL SOCIETY, Issue 621C 2006
Thomas M. Hamill

Abstract It is common practice to summarize the skill of weather forecasts from an accumulation of samples spanning many locations and dates. In calculating many of these scores, there is an implicit assumption that the climatological frequency of event occurrence is approximately invariant over all samples. If the event frequency actually varies among the samples, the metrics may report a skill that is different from that expected. Many common deterministic verification metrics, such as threat scores, are prone to mis-reporting skill, and probabilistic forecast metrics such as the Brier skill score and the relative operating characteristic skill score can also be affected. Three examples are provided that demonstrate unexpected skill, two from synthetic data and one with actual forecast data. In the first example, positive skill was reported in a situation where metrics were calculated from a composite of forecasts that were comprised of random draws from the climatology of two distinct locations. As the difference in climatological event frequency between the two locations was increased, the reported skill also increased. A second example demonstrates that when the climatological event frequency varies among samples, the metrics may excessively weight samples with the greatest observational uncertainty. A final example demonstrates unexpectedly large skill in the equitable threat score of deterministic precipitation forecasts. Guidelines are suggested for how to adjust skill computations to minimize these effects. Copyright © 2006 Royal Meteorological Society [source]

Phylogeny of γ-proteobacteria: resolution of one branch of the universal tree?
BIOESSAYS, Issue 5 2004
James R. Brown

The reconstruction of bacterial evolutionary relationships has proven to be a daunting task because variable mutation rates and horizontal gene transfer (HGT) among species can cause grave incongruities between phylogenetic trees based on single genes.
Recently, a highly robust phylogenetic tree was constructed for 13 γ-proteobacteria using the combined alignments of 205 conserved orthologous proteins.1 Only two proteins had incongruent tree topologies, which were attributed to HGT between Pseudomonas species and Vibrio cholerae or enterics. While the evolutionary relationships among these species appear to be resolved, further analysis suggests that HGT events with other bacterial partners likely occurred; this alters the implicit assumption of γ-proteobacteria monophyly. Thus, any thorough reconstruction of bacterial evolution must not only choose a suitable set of molecular markers but also strive to reduce potential bias in the selection of species. BioEssays 26:463-468, 2004. © 2004 Wiley Periodicals, Inc. [source]

Environmental power analysis - a new perspective
ENVIRONMETRICS, Issue 5 2001
David R. Fox

Abstract Power analysis and sample-size determination are related tools that have recently gained popularity in the environmental sciences. Their indiscriminate application, however, can lead to wildly misleading results. This is particularly true in environmental monitoring and assessment, where the quality and nature of data are such that the implicit assumptions underpinning power and sample-size calculations are difficult to justify. When the assumptions are reasonably met, these statistical techniques provide researchers with an important capability for the allocation of scarce and expensive resources to detect putative impact or change. Conventional analyses are predicated on a general linear model and normal distribution theory, with statistical tests of environmental impact couched in terms of changes in a population mean. While these are 'optimal' statistical tests (uniformly most powerful), they nevertheless pose considerable practical difficulties for the researcher.
Compounding this difficulty is the subsequent analysis of the data and the imposition of a decision framework that commences with an assumption of 'no effect'. This assumption is only discarded when the sample data indicate demonstrable evidence to the contrary. The alternative ('green') view is that any anthropogenic activity has an impact on the environment, and therefore a more realistic initial position is to assume that the environment is already impacted. In this article we examine these issues and provide a re-formulation of conventional mean-based hypotheses in terms of population percentiles. Prior information or belief concerning the probability of exceeding a criterion is incorporated into the power analysis using a Bayesian approach. Finally, a new statistic is introduced which attempts to balance the overall power regardless of the decision framework adopted. Copyright © 2001 John Wiley & Sons, Ltd. [source]

Implementing a CMC tutor group for an existing distance education course
JOURNAL OF COMPUTER ASSISTED LEARNING, Issue 3 2000
M. Weller

Abstract 'Artificial Intelligence for Technology' (T396) is a distance learning course provided by the Open University of the UK using face-to-face tutorials. In 1997 a pilot study was undertaken of a computer-mediated communication (CMC) tutor group which consisted of volunteers from around the UK. The student feedback raised a number of issues, including: the need for a distinct function for the tutor group conference, the role of and demands on the tutor, and the benefits perceived by students. It is suggested that some issues arise from a conflict of cultures, each with its own implicit assumptions. The traditional face-to-face tutorial model is sometimes at variance with the demands of the new CMC-based tuition. [source]

Critical Theorizing: Enhancing Theoretical Rigor in Family Research
JOURNAL OF FAMILY THEORY & REVIEW, Issue 3 2009
Stan J. Knapp

Theory performs vital descriptive, sensitizing, integrative, explanatory, and value functions in the generation of knowledge about families. Yet theory and research can also simultaneously misconceive, desensitize, misdirect, misinterpret, and devalue. Overcoming the degenerative potentialities of theory and research requires attention to critical theorizing, a joint process of (a) critically examining the explicit and implicit assumptions of theory and research and (b) using dialogical theoretical practices. I draw upon the work of John Stuart Mill to argue that critical and dialogical theorizing is a vital and necessary practice in the production of understandings of family phenomena that are more fully scientific and empirical. A brief examination of behavioral research on marital interaction illustrates the importance of critical theorizing. [source]

What's politics got to do with it? Why donors find it so hard to come to terms with politics, and why this matters
JOURNAL OF INTERNATIONAL DEVELOPMENT, Issue 6 2009
Sue Unsworth

Abstract Donors are paying more attention to politics, and some are applying political analysis to specific aspects of development practice. But this is having little influence on mainstream debates about aid, and donors are not questioning their implicit assumptions about how development happens. There are powerful intellectual and institutional barriers to recognising that politics is central to the whole development process. This matters because, without a change in their mental models, donors will not invest in understanding local political dynamics, or give priority to strategically important but difficult issues. If they did so, they would discover some very practical opportunities for progress. Copyright © 2009 John Wiley & Sons, Ltd.
[source]

The Language of Multiple Identities among Dominican Americans
JOURNAL OF LINGUISTIC ANTHROPOLOGY, Issue 2 2000
Benjamin Bailey

As a group whose members are Hispanic, American, and largely of African descent, Dominican Americans must negotiate distinctive issues of identity in the United States. Language is central to these negotiations, both as a symbol of identity and as a medium through which to construct and display local social meanings. Dominican Americans use linguistic forms from multiple varieties of two codes, Spanish and English, to situationally activate various facets of their multiple identities. This multivariety linguistic and interactional construction of identities undermines implicit assumptions of uniformity and essentialism in U.S. linguistic and ethnic/racial categories, particularly in the construction of the category "African American." [source]

The Rights of Children, the Rights of Nations: Developmental Theory and the Politics of Children's Rights
JOURNAL OF SOCIAL ISSUES, Issue 4 2008
Colette Daiute

The Convention on the Rights of the Child (CRC), U.N. General Assembly (1989), is a major breakthrough in defining children as fully human and working to ensure them the attendant benefits worldwide. While children's rights as equal human beings may seem obvious in the 21st century, the politics of establishing and ensuring such rights are contentious. The CRC is a brilliant negotiation of conceptions of the child and international relations, yet certain tensions in the children's rights process lead to a lack of clarity in a global situation that continues to leave millions of children at risk. Analyzing the CRC and related practices from a developmental perspective can help identify obstacles to the advancement of children's rights, especially those related to opportunities for rights-based thinking and the exercise of self-determination and societal-determination rights.
In this article, I offer a qualitative analysis of children's rights in the context of what I refer to as the CRC activity-meaning system. I present a theoretical framework for considering this system of policy and practice as enacted in the CRC treaty and related monitoring, reporting, qualifying, and implementing documents. A discourse analysis of conceptions of the child and those responsible for ensuring their rights in seven representative documents (including the CRC Treaty, a report by the U.N. Committee on the Rights of the Child, minutes of a U.N. Security Council meeting, reports by a State Party, and a report by a civil society group in that country) reveals tensions inherent in the CRC activity-meaning system.1 Emerging from this analysis is a tension between children's rights and nations' rights. Created in part via explicit and implicit assumptions about child development in the CRC, as these posit responsibilities across actors in the broader CRC system, this tension challenges the implementation of children's rights and the development of children's rights-based understandings. I use this analysis to explain why future research and practice should address the development of children's rights-based understanding not only in terms of maturation or socialization but also as integral to salient conflicts in their everyday lives. [source]

How Race, Sex, and Age Frame the Use of Authority by Local Government Officials
LAW & SOCIAL INQUIRY, Issue 3 2010
Shannon Portillo

Thanks to the civil rights movement, women and racial and ethnic minorities increasingly hold positions of public authority, but they experience and exercise this authority differently from white men. Based on 162 narratives collected from 49 US local government officials (city administrators and police), I find that women, minorities, and younger officials in positions of authority face a paradox of rules.
Because they have lower social status with the public and within their organizations, they must rely on formal and explicit rules as a key basis for their authority, but such reliance causes their very authority to be questioned. Social status based on implicit assumptions about social identities, including race or ethnicity, sex, and age, originates outside of organizations and has effects society-wide. This study shows that social status continues to permeate US local government organizations in both subtle and explicit ways, even in bureaucratic settings that are formally committed to merit and professional norms. [source]

Statistical hypothesis testing in intraspecific phylogeography: nested clade phylogeographical analysis vs. approximate Bayesian computation
MOLECULAR ECOLOGY, Issue 2 2009
ALAN R. TEMPLETON

Abstract Nested clade phylogeographical analysis (NCPA) and approximate Bayesian computation (ABC) have been used to test phylogeographical hypotheses. Multilocus NCPA tests null hypotheses, whereas ABC discriminates among a finite set of alternatives. The interpretive criteria of NCPA are explicit and allow complex models to be built from simple components. The interpretive criteria of ABC are ad hoc and require the specification of a complete phylogeographical model. The conclusions from ABC are often influenced by implicit assumptions arising from the many parameters needed to specify a complex model. These complex models confound many assumptions so that biological interpretations are difficult. Sampling error is accounted for in NCPA, but ABC ignores important sources of sampling error that create pseudo-statistical power. NCPA generates the full sampling distribution of its statistics, but ABC only yields local probabilities, which in turn make it impossible to distinguish between a good-fitting model, a non-informative model, and an over-determined model.
Both NCPA and ABC use approximations, but the convergence of the approximations used in NCPA is well defined, whereas that of the approximations used in ABC is not. NCPA can analyse a large number of locations, but ABC cannot. Finally, the dimensionality of the tested hypothesis is known in NCPA, but not in ABC. As a consequence, the 'probabilities' generated by ABC are not true probabilities and are statistically non-interpretable. Accordingly, ABC should not be used for hypothesis testing, but simulation approaches are valuable when used in conjunction with NCPA or other methods that do not rely on highly parameterized models. [source] WEAK AND STRONG SUSTAINABILITY, ENVIRONMENTAL CONSERVATION AND ECONOMIC GROWTH NATURAL RESOURCE MODELING, Issue 3 2006 WERNER HEDIGER ABSTRACT. To investigate the role of explicit and implicit assumptions in different models of weak and strong sustainability, the Solow/Hartwick model of intergenerational equity with nonrenewable resources is gradually extended to include renewable resources, endogenous technical progress, and stock pollution. This reveals the fundamental role of endogenous technical progress for sustainable development, the inconsistency of implicit sustainability assumptions in various models, as well as the existence of a Hartwick rule for Daly's steady-state economy. Moreover, it shows that the concepts of Solow sustainability and strong sustainability coincide as a special case of weak sustainability. The latter integrates economic and environmental concerns and aims at maintaining the welfare potential of an economy over time. It does not rule out economic growth by assumption. Rather, the analysis shows that environmental conservation and economic growth can be compatible with each other, without jeopardizing social welfare. Finally, the analysis shows that the discussion of sustainability models cannot be restricted to the explicit differences that are usually pointed out by their authors and commentators.
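For readers unfamiliar with the method critiqued in the NCPA-vs-ABC abstract above, a minimal sketch of vanilla rejection ABC may be useful. The toy model (normal likelihood with known variance), the uniform prior, the sample-mean summary statistic, and the tolerance are all hypothetical choices for illustration; nothing here is taken from the article's analyses.

```python
import random
import statistics

def abc_rejection(observed_mean, n_obs, prior_draw, n_sims=10000, tol=0.1):
    """Vanilla rejection ABC: keep prior draws whose simulated summary
    statistic lands within `tol` of the observed one."""
    accepted = []
    for _ in range(n_sims):
        theta = prior_draw()                    # draw a candidate from the prior
        # simulate a data set of n_obs points under the candidate parameter
        sim = [random.gauss(theta, 1.0) for _ in range(n_obs)]
        if abs(statistics.mean(sim) - observed_mean) < tol:
            accepted.append(theta)              # retain draws that "fit"
    return accepted  # crude sample from the approximate posterior

# Hypothetical toy run: uniform prior on [0, 10], "observed" sample mean 4.2.
posterior = abc_rejection(4.2, n_obs=50,
                          prior_draw=lambda: random.uniform(0, 10))
```

Note that the abstract's criticism, that ABC yields only local, statistically non-interpretable 'probabilities', is aimed at exactly this acceptance step: the retained fraction depends on the arbitrary tolerance and summary statistic.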
Rather, implicit assumptions must be made explicit. [source] A guide to knowledge translation theory THE JOURNAL OF CONTINUING EDUCATION IN THE HEALTH PROFESSIONS, Issue 1 2006 Carole A. Estabrooks RN Abstract Despite calls over several decades for theory development, there remains no overarching knowledge-translation theory. However, a range of models and theoretical perspectives focused on narrower and related areas have been available for some time. We provide an overview of selected perspectives that we believe are particularly useful for developing testable and useful knowledge-translation interventions. In addition, we discuss adjuvant theories necessary to complement these perspectives. We draw from organizational innovation, health, and social sciences literature to illustrate the similarities and differences of various theoretical perspectives related to the knowledge-translation field. A variety of theoretical perspectives useful to knowledge translation exist. They are often spread across disciplinary boundaries, making them difficult to locate and use. Poor definitional clarity, discipline-specific terminology, and implicit assumptions often hinder the use of complementary perspectives. Health care environments are complex, and assessing the setting prior to selecting a theory should be the first step in knowledge-translation initiatives. Finding a fit between setting (context) and theory is important for knowledge-translation initiatives to succeed. Because one theory will not fit all contexts, it is helpful to understand and use several different theories. Although there are often barriers associated with combining theories from different disciplines, such obstacles can be overcome, and to do so will increase the likelihood that knowledge-translation initiatives will succeed.
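The NATURAL RESOURCE MODELING abstract above turns on the Hartwick rule. As a point of reference only, and not a formula taken from that article, the rule's textbook statement for an economy with reproducible capital $K$, production function $F(K,R)$, and a nonrenewable resource extracted at rate $R$ is:

```latex
% Hartwick rule (standard textbook form; symbols are conventional,
% not the article's own notation): investing the competitive rents
% from resource depletion in reproducible capital holds consumption
% constant over time.
\dot{K}(t) = F_R\bigl(K(t), R(t)\bigr)\, R(t)
```

Here $F_R$ is the marginal product of the resource, so the right-hand side is the total Hotelling rent earned on current extraction; the article's extensions add renewable resources, endogenous technical progress, and stock pollution to this baseline.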
[source] Is All Communication Created Equal?: An Investigation into the Effects of Communication Mode on Perceived Information Quality THE JOURNAL OF PRODUCT INNOVATION MANAGEMENT, Issue 2 2000 Elliot Maltz Enhancing communication between functions is crucial to successful product development and management. Previous work in the product innovation management literature has made two implicit assumptions: first, that increasing the frequency of information dissemination from one function to another always improves the perceived quality of the information received; second, that all types of interfunctional communication carry equal weight in the decision-making process of the target of that communication. The current study develops a typology of communication modes that suggests a rationale for why these assumptions may not be true. The empirical findings of the study, based on a survey of 504 nonmarketing managers, indicate that the relationship between total communication frequency and perceived information quality (PIQ) is nonlinear. Specifically, the study finds that marketing managers can either communicate too little or too much with nonmarketing managers. If they interact too infrequently, they run the risk of not understanding how to most effectively communicate market information. If they communicate too much, they may overload the manager with too much information and erode the overall quality of the information sent. In addition, some modes of communication are more effective than others at improving perceptions of the quality of market information. For instance, regular e-mail sent by marketing managers seems to have no effect on perceived information quality. On the other hand, e-mail sent with supporting documentation can have a strong positive effect on perceived information quality. Impromptu phone calls by marketing have less positive effects than scheduled phone calls.
Interestingly, too much of the wrong types of communication actually seems to reduce perceived information quality, and consequently the likelihood that market information will be used. The study also suggests that certain kinds of communication are better for sharing information with manufacturing managers, and others are more effective with R&D managers. For instance, disseminating information through written reports seems to reduce perceived information quality; this is particularly true for R&D managers. On the other hand, too many meetings can reduce PIQ, particularly on the part of manufacturing managers. Implications for theory and practice are discussed. [source]
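The abstract above reports a nonlinear, "too little or too much" relationship between communication frequency and PIQ. One simple way to picture such a relationship is an inverted-U quadratic; the functional form and every coefficient below are hypothetical illustrations, not estimates from the study.

```python
def piq(freq, b0=2.0, b1=0.8, b2=0.05):
    """Concave quadratic: PIQ first rises with contact frequency, then
    falls as information overload sets in (hypothetical coefficients)."""
    return b0 + b1 * freq - b2 * freq ** 2

# Grid-search the contact frequency (per period) that maximizes PIQ;
# analytically the peak sits at b1 / (2 * b2) = 8 for these coefficients.
optimal_freq = max(range(21), key=piq)
```

Under this toy shape, both very low and very high frequencies yield lower PIQ than the interior optimum, matching the abstract's "communicate too little or too much" finding in spirit only.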