Different Versions (different + version)
Selected Abstracts

Consumer versus citizen preferences in contingent valuation: evidence on the role of question framing. AUSTRALIAN JOURNAL OF AGRICULTURAL & RESOURCE ECONOMICS, Issue 4, 2005. Ville Ovaskainen.
Rather than individual consumer preferences, responses to referendum-style contingent valuation surveys on environmental goods may express citizen assessments that take into account benefits to others. We reconsider the consumer versus citizen hypothesis with a focus on the role of framing information. Survey data on conservation areas in Ilomantsi, Finland, are used. Different versions of the valuation question were used to encourage the respondents to take the consumer or the citizen role. As expected, the citizen version resulted in substantially fewer zero-WTP responses and protests and in higher mean and median WTP, suggesting that the framing information has a major effect on the preferences expressed. The findings support the idea of multiple preferences. For a more confident interpretation of contingent valuation responses, future studies should recognise their intended use in survey design and gather information about respondents' motives to determine the presence and type of altruistic motives. [source]

RUN/EDIT information processing mode and phasic cardiac acceleration. PSYCHOPHYSIOLOGY, Issue 6, 2008. Tytus Sosnowski.
Our previous research showed that tasks demanding the running of ready-to-use programs (RUN tasks) caused a greater tonic heart rate increase than did tasks that require problem solving (EDIT tasks). We also found a similar, though less consistent, effect in the analysis of phasic cardiac acceleration. The aim of the present study was to replicate the latter finding using new experimental tasks. Fifty-four male secondary school pupils were divided randomly into three experimental groups. Each group performed a different version of a nonsignaled reaction time (RT) task: simple RT, sensory choice RT, and semantic choice RT. Participants had to respond within an established time limit, but this limit was continuously modified in such a way that each participant was given positive feedback in approximately 50% of trials. In line with expectations, the simple RT task evoked greater phasic cardiac acceleration than did the choice RT tasks. [source]

Public Roles for the Medical Profession in the United States: Beyond Theories of Decline and Fall. THE MILBANK QUARTERLY, Issue 3, 2001. Rosemary A. Stevens.
The future role of national medical organizations as a moral voice in health policymaking in the United States deserves attention from both scholarly and strategic perspectives. Arguments for strengthening the public roles of organized professionalism include its long (if neglected) history of public service. Scholarship of the past 40 years has emphasized the decline of a profession imbued with self-interest, together with associated theories of organizational conflict. Through new concepts and language, a different version of organized medicine from that of the past might be invented for the future, one that draws on multiple medical organizations, encourages more effective cooperation with other health care groups, and builds on traditional professional agendas through adaptation and extension. [source]

Unreliability of the dot probe task. EUROPEAN JOURNAL OF PERSONALITY, Issue 7, 2005. Stefan C. Schmukle.
The dot probe task is a widely used measure of attention allocation to threatening stimuli.
The present two studies examine the reliability of different versions of this task using words as well as pictures as stimulus material. Estimates of both internal consistency and retest reliability over one week lead to the conclusion that the dot probe task is a completely unreliable measure of attentional allocation in non-clinical samples. This unreliability may explain the inconsistent findings for the dot probe task reported in the literature. Copyright © 2005 John Wiley & Sons, Ltd. [source]

A framework of knowledge versioning management. EXPERT SYSTEMS, Issue 3, 2004. M. T. Maliappis.
Knowledge is an inherently dynamic entity, continuously changing and evolving. In many cases, the coexistence of different versions of the same core knowledge is a necessity, as is the availability of the proper environment and tools to deal with knowledge versioning. In this paper, a framework for knowledge versioning management is proposed and implemented that deals with hybrid knowledge representation models using frames and rules. This framework facilitates knowledge version handling and maintenance, improving, in parallel, knowledge sharing and reuse. Knowledge components are stored in a set of tables and handled as data under the auspices of a database management system. The proper structure of tables and their relationships allows the creation of independent knowledge modules. Several knowledge modules can be assembled to construct higher-level modules, which finally form versions of knowledge. Corresponding knowledge base versions consist of several knowledge modules that are easy to handle and process in various application areas. The proposed framework has been implemented and thoroughly examined in an application area of great importance, such as pest management. [source]

It's not just what you do, it's the way that you do it: the effect of different payment card formats and survey administration on willingness to pay for health gain. HEALTH ECONOMICS, Issue 3, 2006. Richard D. Smith.
A general population sample of 314 Australian respondents were randomly allocated to complete a contingent valuation survey administered by face-to-face or telephone ('phone-mail-phone') interview. Although the telephone interview was quicker to complete, no significant difference was found in values obtained through either method. Within each sub-sample, respondents were also randomly allocated to one of three different versions of the payment card (PC) questionnaire format: values listed from high to low, values listed from low to high, and values randomly shuffled. The high-to-low version resulted in significantly higher values than the other versions. Further analyses indicate that the randomly shuffled PC version may produce the most 'valid' values. Copyright © 2005 John Wiley & Sons, Ltd. [source]

Evaluation of performance enhancing proxies in internet over satellite. INTERNATIONAL JOURNAL OF COMMUNICATION SYSTEMS, Issue 6, 2003. Navid Ehsan.
Performance enhancing proxies (PEPs) are widely used to improve the performance of TCP over high delay-bandwidth product links and links with high error probability. In this paper we analyse the performance of using TCP connection splitting in combination with web caching via traces obtained from a commercial satellite system. We examine the resulting performance gain under different scenarios, including the effect of caching, congestion, random loss and file sizes. We show, via analysis of our measurements, that the performance gain from splitting is highly sensitive to random losses and to the number of simultaneous connections, and that such sensitivity is alleviated by caching. On the other hand, the use of a splitting proxy enhances the value of web caching in that cache hits result in much more significant performance improvement over cache misses when TCP splitting is used. We also compare the performance of using different versions of HTTP in such a system. Copyright © 2003 John Wiley & Sons, Ltd. [source]
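Note: the main reason connection splitting helps on such links is that TCP slow start is otherwise paced by the long satellite round-trip time. The sketch below is a deliberately crude, hedged illustration of that single effect, not the traffic model used in the paper; the segment count, the RTT values and the assumption of an already ramped-up proxy-to-satellite connection are all invented for illustration, and the model ignores random loss, bandwidth limits and competing connections, which are precisely the effects the measurements examine.

```python
from math import ceil, log2

def slow_start_rounds(n_segments: int) -> int:
    """Round trips needed to send n segments when cwnd starts at 1 and
    doubles each round (1 + 2 + 4 + ... segments per round)."""
    return ceil(log2(n_segments + 1))

def end_to_end_time(n_segments: int, rtt_e2e: float) -> float:
    """Slow-start-limited transfer time over the full satellite path."""
    return slow_start_rounds(n_segments) * rtt_e2e

def split_time(n_segments: int, rtt_terrestrial: float, rtt_satellite: float) -> float:
    """With a splitting proxy: slow start is paid on the short terrestrial leg,
    and the satellite leg is assumed to reuse a persistent, already ramped-up
    connection, contributing roughly one satellite RTT of pipeline delay."""
    return slow_start_rounds(n_segments) * rtt_terrestrial + rtt_satellite

if __name__ == "__main__":
    n = 64                        # segments in a small web object (assumed)
    rtt_geo, rtt_lan = 0.6, 0.02  # seconds; assumed GEO and terrestrial RTTs
    print(f"no proxy:        {end_to_end_time(n, rtt_geo):.2f} s")
    print(f"splitting proxy: {split_time(n, rtt_lan, rtt_geo):.2f} s")
```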
Using a synthesised technique for grounded theory in nursing research. JOURNAL OF CLINICAL NURSING, Issue 16, 2009. Hsiao-Yu Chen.
Aims: To introduce a synthesised technique for using grounded theory in nursing research. Background: Nursing increasingly uses grounded theory for a broadened perspective on nursing practice and research. Nurse researchers have choices in how to choose and use grounded theory as a research method. These choices come from a deep understanding of the different versions of grounded theory, including Glaser's classic grounded theory and Strauss and Corbin's later approach. Design: A review of the literature related to grounded theory was conducted. Methods: This is a methodological review paper. Results: Nursing researchers intent on using a grounded theory methodology should pay attention to the theoretical discussions, including theoretical sampling, theoretical sensitivity, constant comparative methods and asking questions, keeping memoranda, diagramming, and identification of a core category and a resultant explanatory theory. A synthesised approach is developed for use, based on Strauss and Corbin's style of sampling and memoranda writing, but selecting theoretical coding families, which differ from the paradigm model of Strauss and Corbin, from the wide range suggested by Glaser. This led to the development of a multi-step synthesised approach to grounded theory data analysis based on the works of Glaser, Charmaz, and Strauss and Corbin. Conclusions: The use of this synthesised approach provides a true reflection of Glaser's idea of 'emergence of theory from the data' while employing Strauss and Corbin's style of sampling and memoranda writing. This multi-step synthesised method of data analysis maintains the philosophical perspective of grounded theory. Relevance to clinical practice: This method indicates how grounded theory has developed, where it might go next in nursing research and how it may continue to evolve. [source]

Cape Fear, Two Versions and Two Visions Separated by Thirty Years. JOURNAL OF LAW AND SOCIETY, Issue 1, 2001. Gerald J. Thain.
This essay examines the changes between 1962 and 1991 that occurred in the context within which the two very different versions of Cape Fear appeared. These two versions of the story of a threatened lawyer are emblematic of an altered perspective on law. The essay highlights the tension between art's role as a reflector of society and its values and its role in shaping social views. The inference, from the different portrayals of Sam Bowden, that there has been a systematic decline in the lawyer's status and public esteem is not, however, borne out in the cinematic field. The situation has become one of moral ambiguity, with the lawyer playing a more ambivalent role in society. [source]

The professionalism of practising law: A comparison across work contexts. JOURNAL OF ORGANIZATIONAL BEHAVIOR, Issue 8, 2008. Jean E. Wallace.
Traditionally, the literature assumed that solo practice best exemplifies the ideal professional work arrangement and that when professionals become salaried employees their professionalism is seriously threatened. The primary goal of this paper is to examine lawyers' sense of professionalism across two work contexts: solo practitioner offices and law firm settings. We also examine status distinctions within law firms, between associates and partners, and compare both to independent practitioners. Solo practitioners and law firm partners are similar on most key dimensions of professionalism, whereas the greatest contrasts occur between partners and associates within law firms. Partners and solo practitioners share similar experiences of autonomy and service as owner-managers, whereas partners and associates share greater collegiality among professionals, perhaps fostered through law firm cultures. All three groups report comparable amounts of variety in their work and are equally committed to the practice of law. The key factors that account for gaps in professionalism reflect the nature of law practices, primarily through time spent with corporate clients and pressure to generate profits. We conclude that different versions of lawyers' professionalism are influenced by the everyday aspects of their work and that one version is not necessarily more professional than the other. Copyright © 2008 John Wiley & Sons, Ltd. [source]

Assessing global diffusion with Web memetics: The spread and evolution of a popular joke. JOURNAL OF THE AMERICAN SOCIETY FOR INFORMATION SCIENCE AND TECHNOLOGY, Issue 12, 2009. Limor Shifman.
Memes are small units of culture, analogous to genes, which flow from person to person by copying or imitation. More than any previous medium, the Internet has the technical capabilities for global meme diffusion. Yet, to spread globally, memes need to negotiate their way through cultural and linguistic borders. This article introduces a new broad method, Web memetics, comprising extensive Web searches and combined quantitative and qualitative analyses, to identify and assess: (a) the different versions of a meme, (b) its evolution online, and (c) its Web presence and translation into common Internet languages. This method is demonstrated through one extensively circulated joke about men, women, and computers. The results show that the joke has mutated into several different versions and is widely translated, and that translations incorporate small, local adaptations while retaining the English versions' fundamental components. In conclusion, Web memetics has demonstrated its ability to identify and track the evolution and spread of memes online, with interesting results, albeit for only one case study. [source]

On the Nitroxide Quasi-Equilibrium in the Alkoxyamine-Mediated Radical Polymerization of Styrene. MACROMOLECULAR THEORY AND SIMULATIONS, Issue 2, 2006. Enrique Saldívar-Guerra.
The range of validity of two popular versions of the nitroxide quasi-equilibrium (NQE) approximation used in the theory of the kinetics of alkoxyamine-mediated styrene polymerization is systematically tested by simulation, comparing the approximate and exact solutions of the equations describing the system. The validity of the different versions of the NQE approximation is analyzed in terms of the relative magnitude of (dN/dt)/(dP/dt). The approximation with a rigorous NQE, kc[P][N] = kd[PN], where P, N and PN are the living radicals, nitroxide radicals and dormant species, respectively, with kinetic constants kc and kd, is found to be valid only for small values of the equilibrium constant K (10^-11 to 10^-12 mol·L^-1), and its validity is found to depend strongly on the value of K. On the other hand, the relaxed NQE approximation of Fischer and Fukuda, kc[P][N] = kd[PN]0, was found to be remarkably good up to values of K around 10^-8 mol·L^-1. This upper bound is numerically found to be 2 to 3 orders of magnitude smaller than the theoretical one given by Fischer. The relaxed NQE is the better approximation because it never completely neglects dN/dt. It is found that the difference between these approximations lies essentially in the number of significant figures taken for the approximation; still, this subtle difference results in dramatic changes in the predicted course of the reaction. Some results confirm previous findings, but a deeper understanding of the physico-chemical phenomena and their mathematical representation, and another viewpoint of the theory, is offered. Additionally, experiments and simulations indicate that polymerization rate data alone are not reliable for estimating the value of K, as recently suggested. (Figure: validity of the rigorous nitroxide quasi-equilibrium assumption as a function of the nitroxide equilibrium constant.) [source]
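To make the two quasi-equilibrium conditions concrete: with K = kd/kc, the relaxed form fixes the radical product at [P][N] = K·[PN]0, where [PN]0 is the initial alkoxyamine concentration, while the rigorous form ties it to the instantaneous dormant concentration [PN]. The short sketch below simply evaluates the relaxed relation for the K regimes mentioned in the abstract; the value of [PN]0 is an assumption chosen for illustration, not a figure from the paper.

```python
# Relaxed nitroxide quasi-equilibrium (Fischer/Fukuda): kc[P][N] = kd[PN]0,
# i.e. [P][N] = K * [PN]0 with K = kd/kc. Purely illustrative numbers.

PN0 = 0.02  # mol/L, assumed initial alkoxyamine (dormant species) concentration

for K in (1e-12, 1e-11, 1e-8):  # mol/L, spanning the regimes discussed above
    radical_product = K * PN0   # [P][N] implied by the relaxed NQE
    print(f"K = {K:.0e} mol/L  ->  [P][N] = {radical_product:.1e} mol^2/L^2")

# The rigorous NQE instead reads kc[P][N] = kd[PN]; per the abstract it only
# holds for the smallest K values (about 1e-11 to 1e-12 mol/L), even though
# the two conditions differ only in using [PN] versus [PN]0.
```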
McDowell on Kant: Redrawing the Bounds of Sense. METAPHILOSOPHY, Issue 4, 2000. Christopher Norris.
John McDowell's Mind and World is a notable attempt to redirect the interest of analytic philosophers toward certain themes in Kantian and more recent continental thought. Only thus, he believes, can we move beyond the various failed attempts, by Quine, Davidson, Rorty, and others, to achieve a naturalised epistemology that casts off the various residual "dogmas" of old-style logical empiricism. In particular, McDowell suggests that we return to Kant's ideas of "spontaneity" and "receptivity" as the two jointly operative powers of mind which enable thought to transcend the otherwise unbridgeable gulf between sensuous intuitions and concepts of understanding. However, this project miscarries for several reasons. Chief among them is the highly problematical nature of Kant's claims, taken over by McDowell without reference to their later treatment at the hands of subjective and objective idealists. Hence he tends to fall back into different versions of the same mind/world dualism. I then question McDowell's idea that Kant can be "naturalised" by reinterpreting those claims from a more hermeneutic or communitarian standpoint with its sources in Hegel, Wittgenstein, and Gadamer. For the result is to deprive Kant's philosophy of its distinctively critical dimension, not only with regard to epistemological issues but also in relation to matters of ethical and sociopolitical judgement. [source]

The minimum evolution problem: Overview and classification. NETWORKS: AN INTERNATIONAL JOURNAL, Issue 2, 2009. Daniele Catanzaro.
Molecular phylogenetics studies the hierarchical evolutionary relationships among organisms by means of molecular data. These relationships are typically described by means of weighted trees, or phylogenies, whose leaves represent the observed organisms, internal vertices the intermediate ancestors, and edges the evolutionary relationships between pairs of organisms. Molecular phylogenetics provides several criteria for selecting one phylogeny from among plausible alternatives. Usually, such criteria can be expressed in terms of objective functions, and the phylogenies that optimize them are referred to as optimal. One of the most important criteria is the minimum evolution (ME) criterion, which states that the optimal phylogeny for a given set of organisms is the one whose sum of edge weights is minimal. Finding the phylogeny that satisfies the ME criterion involves solving an optimization problem, called the minimum evolution problem (MEP), which is notoriously NP-hard. This article offers an overview of the MEP and discusses the different versions of it that occur in the literature. © 2008 Wiley Periodicals, Inc. NETWORKS, 2009 [source]
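As a toy illustration of the ME criterion itself (not of the algorithms surveyed in the article): given candidate phylogenies whose edge weights have already been estimated, the criterion simply selects the tree with the smallest total edge length. The topologies and weights below are invented for illustration; in practice the hard part, and the source of the NP-hardness, is searching the super-exponentially large space of topologies while estimating edge weights from the data.

```python
# Minimum evolution criterion applied to two hypothetical 4-taxon phylogenies.
# Edge weights are made-up numbers; u and v denote internal vertices.

candidate_phylogenies = {
    "tree_A": [(("A", "u"), 0.12), (("B", "u"), 0.09), (("u", "v"), 0.05),
               (("C", "v"), 0.11), (("D", "v"), 0.16)],
    "tree_B": [(("A", "u"), 0.14), (("C", "u"), 0.10), (("u", "v"), 0.07),
               (("B", "v"), 0.10), (("D", "v"), 0.15)],
}

def tree_length(edges):
    """The ME objective: sum of the edge weights of a phylogeny."""
    return sum(weight for _, weight in edges)

for name, edges in candidate_phylogenies.items():
    print(name, round(tree_length(edges), 3))

best = min(candidate_phylogenies, key=lambda t: tree_length(candidate_phylogenies[t]))
print("ME-optimal:", best)
```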
Three versions of an ethics of care. NURSING PHILOSOPHY, Issue 4, 2009. Steven D. Edwards PhD MPhil BA(Hons).
The ethics of care still appeals to many in spite of the penetrating criticisms of it which have been presented over the past 15 years or so. This paper tries to offer an explanation for this, and then to critically engage with three versions of an ethics of care. The explanation consists firstly in the close affinities between nursing and care. The three versions identified below are one by Gilligan (1982), a second by Tronto (1993), and a third by Gastmans (2006); see also Little (1998). Each version is described and then subjected to criticism. It is concluded that where the ethics of care is presented in a distinctive way, it is at its least plausible; where it is stated in more plausible forms, it is not sufficiently distinct from, nor superior to, at least one other common approach to nursing ethics, namely the much-maligned 'four principles' approach. What this paper adds to what is already known: as the article tries to explain, in spite of its being subjected to sustained criticism, the ethics of care retains its appeal to many scholars. The paper tries to explain why, partly by distinguishing three different versions of an ethics of care. It is also shown that all three versions are beset with problems, the least serious of which is distinctiveness from other approaches to moral problems in health care. [source]

Contextualism and the Factivity Problem. PHILOSOPHY AND PHENOMENOLOGICAL RESEARCH, Issue 3, 2008. Peter Baumann.
Epistemological contextualism, the claim that the truth-value of knowledge-attributions can vary with the context of the attributor, has recently faced a whole series of objections. The most serious one, however, has not been discussed much so far: the factivity objection. In this paper, I explain what the objection is and present three different versions of the objection. I then show that there is a good way out for the contextualist. However, in order to solve the problem the contextualist has to accept a relationalist version of contextualism. [source]

The Limits of Rational Choice: New Institutionalism in the Test Bed of Central Banking Politics in Australia. POLITICAL STUDIES, Issue 3, 2002. Stephen Bell.
This paper tests the explanatory capacities of different versions of new institutionalism by examining the Australian case of a general transition in central banking practice and monetary politics: namely, the increased emphasis on low inflation and central bank independence. Standard versions of rational choice institutionalism largely dominate the literature on the politics of central banking, but this approach (here termed RC1) fails to account for Australian empirics. RC1 has a tendency to establish actor preferences exogenously to the analysis; actors' motives are also assumed a priori; and actors' preferences are depicted in relatively static, ahistorical terms. There is also a tendency, even a methodological requirement, to assume relatively simple motives and preference sets among actors, in part because of the game-theoretic nature of RC1 reasoning. It is possible to build a more accurate rational choice model by re-specifying and essentially updating the context, incentives and choice sets that have driven rational choice in this case. Enter RC2. However, this move subtly introduces methodological shifts and new theoretical challenges. By contrast, historical institutionalism uses an inductive methodology. Compared with deduction, it is arguably better able to deal with complexity and nuance. It also utilises a dynamic, historical approach, and specifies (dynamically) endogenous preference formation by interpretive actors. Historical institutionalism is also able to more easily incorporate a wider set of key explanatory variables and wider social aggregates. Hence, it is argued that historical institutionalism is the preferred explanatory theory and methodology in this case. [source]

Asymptotics in Knuth's parking problem for caravans. RANDOM STRUCTURES AND ALGORITHMS, Issue 1, 2006. Jean Bertoin.
We consider a generalized version of Knuth's parking problem, in which caravans consisting of a random number of cars arrive at random on the unit circle. Then each car turns clockwise until it finds a free space to park. Extending a recent work by Chassaing and Louchard (Random Struct. Algor. 21(1) (2002), 76-119), we relate the asymptotics for the sizes of blocks formed by occupied spots with the dynamics of the additive coalescent. According to the behavior of the caravans' size tail distribution, several qualitatively different versions of the eternal additive coalescent are involved. © 2005 Wiley Periodicals, Inc. Random Struct. Alg., 2006 [source]
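The parking process described above is straightforward to mimic on a discretised circle. The sketch below is only a rough illustration of the mechanism, not of the paper's asymptotic analysis: the number of spots, the fill level and the uniform caravan-size law are arbitrary assumptions, whereas the results in the abstract concern the limiting block-size behaviour as a function of the caravan-size tail.

```python
import random

def occupied_block_sizes(occupied):
    """Sizes of the maximal runs of occupied spots around the circle."""
    n = len(occupied)
    if all(occupied):
        return [n]
    start = occupied.index(False)                 # rotate so spot 0 is free
    rotated = occupied[start:] + occupied[:start]
    sizes, run = [], 0
    for taken in rotated:
        if taken:
            run += 1
        elif run:
            sizes.append(run)
            run = 0
    if run:
        sizes.append(run)
    return sizes

def simulate(n_spots=1000, fill=0.8, max_caravan=5, seed=1):
    """Caravans of uniform random size (1..max_caravan) land at uniform random
    spots; each car rolls clockwise to the first free spot. All parameters are
    illustrative choices."""
    rng = random.Random(seed)
    occupied = [False] * n_spots
    parked = 0
    while parked < int(fill * n_spots):
        pos = rng.randrange(n_spots)              # caravan arrival point
        for _ in range(rng.randint(1, max_caravan)):
            while occupied[pos]:
                pos = (pos + 1) % n_spots         # keep driving clockwise
            occupied[pos] = True
            parked += 1
            if parked >= n_spots:
                break
    return occupied_block_sizes(occupied)

sizes = simulate()
print(f"{len(sizes)} occupied blocks, largest block = {max(sizes)}")
```

Changing the caravan-size distribution, and in particular making its tail heavier, is what, per the abstract, selects among the different versions of the eternal additive coalescent in the limit.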
Untimeliness as Moral Indictment: Tamil Agricultural Labouring Women's Use of Lament as Life Narrative. THE AUSTRALIAN JOURNAL OF ANTHROPOLOGY, Issue 2, 2007. Kalpana Ram.
How do Dalit women forge certain forms of critical perspectives in relation to their existence? This paper explores the very particular poetics that shape the women's responses to an invitation by the ethnographer to tell her their life stories. Their narratives made use of several dominant discourses in south India that ritually construct a woman's life as a teleology of an unfolding essence, an embodied force that comes into flower and fruition, and must be socially shaped and tended in order to bring about an auspicious confluence for both the woman and the social order. The women also made use of the structure and tropes of several styles of performance that have tragedy at their emotional heart, and which gain their force against the normative construction of the life cycle as temporality. By using these forms, women were able to bring into discourse several aspects of their experience of marriage that would otherwise gain no social recognition. In particular, they highlighted the prematurity of their marriage, having wed while still children themselves. The wider argument of this paper engages with two very different versions of agency: one predicated on the use of reason and consent by the individual, the other derived from an examination of the Dalit women's narratives. [source]
A low Reynolds number k-ε modelling of turbulent pipe flow: Flow pattern and energy balance. THE CANADIAN JOURNAL OF CHEMICAL ENGINEERING, Issue 2, 2001. Shirish S. Thakre.
The present paper addresses a comparative analysis of twelve different versions of low Reynolds number k-ε turbulence models. The predictive capability of the models has been tested on the basis of the flow patterns and the energy balance. Numerical simulations were performed at Reynolds numbers of 7400, 22 000 and 500 000. The predicted mean axial velocity and turbulent kinetic energy were compared with the experimental data of Durst et al. (1995) and Schildknecht et al. (1979) for the Reynolds numbers of 7400 and 22 000, respectively. The overall energy balance was established at the three Reynolds numbers of 7400, 22 000 and 500 000, and a comparison of all the models is presented. [source]

Solving the Problem of Easy Knowledge. THE PHILOSOPHICAL QUARTERLY, Issue 233, 2008. Tim Black.
Stewart Cohen argues that several epistemological theories fall victim to the problem of easy knowledge: they allow us to know far too easily that certain sceptical hypotheses are false and that how things seem is a reliable indicator of how they are. This problem is a result of the theories' interaction with an epistemic closure principle. Cohen suggests that the theories should be modified. I argue that attempts to solve the problem should focus on closure instead; a new and plausible epistemic closure principle can solve the problem of easy knowledge. My solution offers a uniform and more successful response to different versions of the problem of easy knowledge. [source]

Reconciliation and Political Legitimacy: The Old Australia and the New South Africa. AUSTRALIAN JOURNAL OF POLITICS AND HISTORY, Issue 2, 2003. Paul Muldoon.
In both Australia and South Africa a state-sponsored discourse of reconciliation has been deployed as a tool of national integration and state building. This usage has tended to encourage a politics of selective memory that runs contrary to the spirit of reconciliation as recognition of different views of the nation. This article seeks to recover (and promote) a more positive concept of reconciliation by treating it as a discursive, democratic space in which different versions of the national story can be acknowledged and negotiated. The cases of Australia and South Africa are used in a mutually illuminating way to explore what "telling the truth" about the past might mean and how such "truth-telling" might help restore legitimacy to liberal states confronted with a "broken moral order". [source]
Divisibility and the Moral Status of Embryos. BIOETHICS, Issue 5-6, 2001. Christian Munthe.
The phenomenon of twinning in early fetal development has become a popular source of doubt regarding the ascription of moral status to early embryos. In this paper, the possible moral basis for such a line of reasoning is critically analysed, with sceptical results. Three different versions of the argument from twinning are considered, all of which are found to rest on confusions between the actual division of embryos involved in twinning and the property of early embryos of being divisible, to be based on highly questionable ethical assumptions, or to imply inconsistent claims regarding the moral importance of potentiality and/or the moral status of embryos. This is taken to expose a number of related inconsistencies in the moral basis of pro-life positions. In particular, ascribing moral significance to the property of being (in)divisible is found to be incompatible with the claim that human individuals possess unique values which could underpin an absolute moral ban on murder. [source]

Global Model for Optimizing Crossflow Microfiltration and Ultrafiltration Processes: A New Predictive and Design Tool. BIOTECHNOLOGY PROGRESS, Issue 4, 2005. Gautam Lal Baruah.
A global model and algorithm that predicts the performance of crossflow MF and UF processes, individually or in combination, in the laminar flow regime is presented and successfully tested. The model accounts for solute polydispersity, ionic environment, electrostatics, membrane properties and operating conditions. Computer programs were written in Fortran 77 for different versions of the model algorithm, which can optimize MF/UF processes rapidly in terms of yield, purity, selectivity, or processing time. The model is validated successfully with three test cases: separation of bovine serum albumin (BSA) from hemoglobin (Hb), capture of immunoglobulin (IgG) from transgenic goat milk by MF, and separation of BSA from IgG by UF. These comparisons demonstrate the capability of the global model to conduct realistic in silico simulations of MF and UF processes. This model and algorithm should prove to be an invaluable tool for rapidly designing new, or optimizing existing, MF and UF processes, separately or in combination, in both the pressure-dependent and pressure-independent regimes. [source]
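The yield-purity-time trade-off such a tool explores can be illustrated with the standard constant-volume diafiltration mass balance, under which each solute decays exponentially with the number of diavolumes according to its sieving coefficient. This is generic ultrafiltration theory, not the authors' global model, and the sieving coefficients and the assumption of equal starting amounts below are purely illustrative.

```python
from math import exp

def remaining_fraction(sieving_coefficient: float, diavolumes: float) -> float:
    """Fraction of a solute left in the retentate after constant-volume
    diafiltration: C/C0 = exp(-S * D) in the ideal, fully mixed case."""
    return exp(-sieving_coefficient * diavolumes)

S_PRODUCT = 0.01   # assumed: a large, mostly retained protein (e.g. an IgG)
S_IMPURITY = 0.85  # assumed: a small contaminant that passes the membrane

for D in (0, 2, 4, 6, 8, 10):
    product_left = remaining_fraction(S_PRODUCT, D)
    impurity_left = remaining_fraction(S_IMPURITY, D)
    purity = product_left / (product_left + impurity_left)  # equal initial amounts assumed
    print(f"D = {D:2d}  yield = {product_left:6.1%}  purity = {purity:6.1%}")
```

More diavolumes buy purity at the cost of product yield and processing time, which is the kind of trade-off that an optimization over yield, purity, selectivity and processing time has to balance.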