Basic Assumptions
Selected Abstracts

Applying the developmental perspective in the psychiatric assessment and diagnosis of persons with intellectual disability: part I, assessment
JOURNAL OF INTELLECTUAL DISABILITY RESEARCH, Issue 1 2005. A. Dosen
Abstract. Background: In generic psychiatry there has been increasing interest among scientists in the developmental perspective. However, professionals active in the mental health care of people with intellectual disability (ID) have not shown the same degree of interest. The author of this article, who has had considerable rewarding experience with the developmental approach in the field of ID, considers the developmental perspective to be innovative and very useful in the psychiatric assessment, diagnosis and treatment of this population. The aim of the article is to stimulate a wider application of the developmental perspective and to encourage professional discussion of this issue. Methods: Basic assumptions of the developmental perspective are discussed, and assessment tools and methods are described. Results: In a case vignette, the advantages of developmentally based assessment are emphasized. Emotional development and personality development are viewed as the developmental components that play an important role in adaptive and maladaptive behaviour as well as in the onset and presentation of psychopathology. It is clear that interpretative insight into the totality of the psychosocial aspects of these individuals cannot be obtained by measuring the level of cognitive development alone. A wider frame of reference is needed for unambiguous psychiatric diagnostics. Therefore, replacing the three-dimensional paradigm (bio-psycho-social) with a four-dimensional one (bio-psycho-socio-developmental) for the assessment and diagnosis of persons with ID is proposed.

Isochrone ages for field dwarfs: method and application to the age–metallicity relation
MONTHLY NOTICES OF THE ROYAL ASTRONOMICAL SOCIETY, Issue 2 2004. Frédéric Pont
Abstract. A new method is presented to compute age estimates from theoretical isochrones using temperature, luminosity and metallicity data for individual stars. Based on Bayesian probability theory, this method avoids the systematic biases affecting simpler strategies and provides reliable estimates of the age probability distribution function for late-type dwarfs. Basic assumptions concerning the a priori parameter distribution suitable for the solar neighbourhood are combined with the likelihood assigned to the observed data to yield the complete posterior age probability. This method is especially relevant for G dwarfs in the 3–15 Gyr range of ages, crucial to the study of the chemical and dynamical history of the Galaxy. In many cases, it yields markedly different results from the traditional approach of reading the derived age from the isochrone nearest to the data point. We show that the dominant bias in the traditional approach is a strong preference for computed ages near the end-of-main-sequence lifetime. The Bayesian method compensates for this potential bias and generally assigns much higher probabilities to lower main-sequence ages, compared with short-lived evolved stages. This has a strong influence on any application to galactic studies, especially given the present uncertainties in the absolute temperature scale of the stellar evolution models. In particular, the known mismatch between the model predictions and the observations for moderately metal-poor dwarfs (−1 < [Fe/H] < −0.3) has a dramatic effect on the traditional age determination. We apply our method to the classic sample of Edvardsson et al., who derived the age–metallicity relation (AMR) of 189 field dwarfs with precisely determined abundances.
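The Bayesian recipe described above, a prior over stellar parameters multiplied by the likelihood of the observed data, can be illustrated with a toy grid computation. This is only a sketch of the general idea, not the authors' method: the Gaussian likelihood, the flat age prior, the Salpeter-like mass prior and the made-up "isochrone" function below are all assumptions introduced for the example.

```python
import numpy as np

def posterior_age(obs_teff, obs_logl, sigma_teff, sigma_logl,
                  ages, masses, model):
    """Posterior P(age | data) on a grid, marginalized over mass.

    model(age, mass) -> (teff, logL): a stand-in for real isochrone tables.
    Prior: flat in age, Salpeter-like (m^-2.35) in mass (assumptions).
    """
    post = np.zeros(len(ages))
    for i, age in enumerate(ages):
        for m in masses:
            teff, logl = model(age, m)
            chi2 = ((teff - obs_teff) / sigma_teff) ** 2 \
                 + ((logl - obs_logl) / sigma_logl) ** 2
            post[i] += m ** -2.35 * np.exp(-0.5 * chi2)
    return post / post.sum()

# Toy "isochrone": stars cool and brighten slightly as they age
# (purely illustrative numbers, not a stellar evolution model).
def toy_model(age, mass):
    teff = 5800.0 * mass ** 0.5 - 40.0 * age   # K
    logl = 0.1 * age * mass - 0.05             # log(L/Lsun)
    return teff, logl

ages = np.linspace(1.0, 15.0, 57)              # Gyr
masses = np.linspace(0.8, 1.2, 41)             # Msun
p = posterior_age(5600.0, 0.3, 80.0, 0.1, ages, masses, toy_model)
mean_age = (ages * p).sum()
```

Reading off the full distribution `p`, rather than the single nearest isochrone, is what allows the method to report realistic age uncertainties instead of a biased point estimate.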
We show how much of the observed scatter in the AMR is caused by the interplay between the systematic biases affecting the traditional age determination, the colour mismatch with the evolution models and the presence of undetected binaries. Using new parallax, temperature and metallicity data, our age determination for the same sample indicates that the intrinsic dispersion in the AMR is at most 0.15 dex and probably lower. In particular, we show that old, metal-rich objects ([Fe/H] ≥ 0.0 dex, age > 5 Gyr) and young, metal-poor objects ([Fe/H] < −0.5 dex, age < 6 Gyr) in many observed AMR plots are artefacts caused by too simple a treatment of the age determination. The incompatibility of those AMR plots with a well-mixed interstellar medium may therefore be only apparent. Incidentally, our results tend to restore confidence in the method of age determination from chromospheric activity for field dwarfs.

WTCP: an efficient mechanism for improving wireless access to TCP services
INTERNATIONAL JOURNAL OF COMMUNICATION SYSTEMS, Issue 1 2003. Karunaharan Ratnam
Abstract. The transmission control protocol (TCP) was designed mainly under the assumption of a relatively reliable wireline network. It is known to perform poorly in the presence of wireless links because of its basic assumption that any loss of a data segment is due to congestion, which leads it to invoke congestion control measures. On wireless access links, however, segment losses often occur because of wireless link errors or host mobility. For this reason, many proposals have recently appeared to improve TCP performance in such environments. They usually rely on the wireless access points (base stations) to retransmit data locally in order to hide wireless losses from TCP. In this paper, we present Wireless-TCP (WTCP), a new mechanism for improving wireless access to TCP services.
We use extensive simulations to evaluate TCP performance in the presence of congestion and wireless losses when the base station employs WTCP and the well-known Snoop proposal (A comparison of mechanisms for improving TCP performance in wireless networks. In ACM SIGCOMM Symposium on Communication, Architectures and Protocols, August 1996). Our results show that WTCP significantly improves the throughput of TCP connections owing to its unique feature of hiding the time spent by the base station on local recovery from wireless link errors, so that TCP's round-trip time estimation at the source is not affected. This proves critical, since otherwise the ability of the source to effectively detect congestion in the fixed wireline network is hindered. Copyright © 2003 John Wiley & Sons, Ltd.

Online process mean estimation using L1 norm exponential smoothing
NAVAL RESEARCH LOGISTICS: AN INTERNATIONAL JOURNAL, Issue 5 2009. Wei Jiang
Abstract. A basic assumption in process mean estimation is that all process data are clean. However, many sensor system measurements are corrupted with outliers. Outliers are observations that do not follow the statistical distribution of the bulk of the data and consequently may lead to erroneous results with respect to statistical analysis and process control. Robust estimators of the current process mean are crucial to outlier detection, data cleaning, process monitoring, and other process features. This article proposes an outlier-resistant mean estimator based on the L1 norm exponential smoothing (L1-ES) method. The L1-ES statistic is essentially model-free and demonstrably superior to existing estimators. It has the following advantages: (1) it captures process dynamics (e.g., autocorrelation), (2) it is resistant to outliers, and (3) it is easy to implement. © 2009 Wiley Periodicals, Inc.
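To see informally why an L1-type update resists outliers, compare ordinary exponential smoothing with a variant whose update step is bounded. This is a hedged sketch of the general principle only; the clipping rule and the constants `lam` and `cap` are assumptions for illustration, not the L1-ES estimator of the paper.

```python
def ewma(data, lam=0.2):
    """Ordinary (L2-type) exponential smoothing: outliers pull the mean hard."""
    s = data[0]
    out = []
    for x in data:
        s = (1 - lam) * s + lam * x
        out.append(s)
    return out

def robust_ewma(data, lam=0.2, cap=1.0):
    """Outlier-resistant smoothing: the innovation is clipped, so a single
    wild observation can move the estimate by at most lam * cap."""
    s = data[0]
    out = []
    for x in data:
        e = x - s
        e = max(-cap, min(cap, e))   # bound the influence of each observation
        s = s + lam * e
        out.append(s)
    return out

clean = [10.0] * 20
spiked = clean[:]
spiked[10] = 100.0                   # a gross outlier in an otherwise stable process
plain = ewma(spiked)
robust = robust_ewma(spiked)
```

On the spiked series, the ordinary estimate jumps to 28 at the outlier, while the capped version moves by at most `lam * cap = 0.2` per step and stays near the true mean of 10.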
Naval Research Logistics, 2009.

Diagnosing explainable heterogeneity of variance in random-effects models
THE CANADIAN JOURNAL OF STATISTICS, Issue 1 2000. Fan Zhang
Abstract. Data-analytic tools for models other than the normal linear regression model are relatively rare. Here we develop plots and diagnostic statistics for nonconstant variance in the random-effects model (REM). REMs for longitudinal data include both within- and between-subject variances. A basic assumption is that the two variance terms are constant across subjects. However, we often find that these variances are functions of covariates, and the data set has what we call explainable heterogeneity, which needs to be allowed for in the model. We characterize several types of heterogeneity of variance in REMs and develop three diagnostic tests using the score statistic: one for each of the two variance terms, and a third for a form of multivariate nonconstant variance. For each test we present an adjusted residual plot which can identify cases that are unusually influential on the outcome of the test.

Assimilation of radar-derived rain rates into the convective-scale model COSMO-DE at DWD
THE QUARTERLY JOURNAL OF THE ROYAL METEOROLOGICAL SOCIETY, Issue 634 2008. K. Stephan
Abstract. To improve very-short-range forecasts, particularly in convective situations, a version of the COSMO-Model (formerly known as LM) which simulates deep convection explicitly (horizontal grid length: 2.8 km) has been developed and is now run operationally at DWD. This model uses a prognostic type of precipitation scheme accounting for the horizontal drift of falling hydrometeors. To initialise convective-scale events, the latent heat nudging (LHN) approach has been adopted for the assimilation of surface precipitation rates derived from radar reflectivity data. It is found that a conventional LHN scheme designed for larger-scale models with diagnostic treatment of precipitation does not perform well and leads to strong overestimation of precipitation when applied to the convective-scale model with a prognostic treatment of precipitation. As illustrated here, surface precipitation and vertically integrated latent heating are far less correlated horizontally and temporally in such a model than with diagnostic precipitation, and this implies a violation of the basic assumption of LHN. Several revisions to the LHN scheme have therefore been developed in view of the characteristic model behaviour, so as to restore the validity of the basic assumption and to greatly reduce the overestimation of precipitation during assimilation.
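The basic LHN idea, scaling the model's latent-heating profile by the ratio of observed to modelled surface rain rate so that the model is nudged toward the radar-derived precipitation, can be sketched in a few lines. This is a schematic single-column illustration under assumed simplifications (a fixed heating profile, an ad hoc limit on the scaling factor), not the COSMO-DE scheme or any of its revisions.

```python
def lhn_increment(heating_profile, rr_model, rr_obs,
                  max_scale=2.0, eps=1e-6):
    """Temperature-increment profile from a simple latent-heat-nudging step.

    Scales the vertically resolved latent heating so that, if applied,
    the model precipitation would tend toward the observed rain rate.
    The scaling factor is limited to avoid huge increments where the
    model rain rate is near zero (the limit value is an assumption here).
    """
    ratio = rr_obs / max(rr_model, eps)
    ratio = min(max(ratio, 1.0 / max_scale), max_scale)
    # Increment = scaled heating minus original heating, level by level.
    return [(ratio - 1.0) * h for h in heating_profile]

# One model column: latent heating (K/h) on five vertical levels.
heating = [0.0, 0.5, 1.2, 0.8, 0.1]
inc = lhn_increment(heating, rr_model=1.0, rr_obs=1.5)  # model rains too little
```

In the example the model rains too little (1.0 versus 1.5 mm/h), so the returned temperature increments are positive and proportional to the heating at each level; the limiter keeps the increments bounded where the model rain rate is near zero.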
With the revised scheme, the model is able to simulate the precipitation patterns in good agreement with radar observations during the assimilation and the first hours of the forecast. The scheme also has a positive impact on screen-level parameters and on the longer-term climatology of the model. Extending the temporal impact of the radar observations further into the free forecast will be the focus of future research. Copyright © 2008 Royal Meteorological Society

Temperature perception and nociception
DEVELOPMENTAL NEUROBIOLOGY, Issue 1 2004. Barry G. Green
Abstract. The specificity theory of somesthesis holds that perceptions of warmth, cold, and pain are served by separate senses. Although no longer accepted in all its details, the theory's basic assumptions of anatomical and functional specificity have remained guiding principles in research on temperature perception and its relationship to pain. This article reviews the response characteristics of thermoreceptors, temperature-sensitive nociceptors, and their associated pathways in the context of old and new perceptual phenomena, most of which cannot be satisfactorily explained by the specificity theory. The evidence indicates that throughout most of the perceptual range, temperature sensitivity depends upon coactivation of, and interactions among, thermal and nociceptive pathways that are composed of both specific "labeled lines" and nonspecific, multimodal fibers. Adding to this complexity is evidence that tactile stimulation can influence the way in which thermal stimulation is perceived. It is argued that thermoreception is best defined as a functional subsystem of somesthesis that serves the very different and sometimes conflicting demands of thermoregulation, protection from thermal injury, and haptic perception. © 2004 Wiley Periodicals, Inc.
J Neurobiol 61: 13–29, 2004.

The notion of 'phonology' in dyslexia research: cognitivism – and beyond
DYSLEXIA, Issue 3 2007. Per Henning Uppstad
Abstract. Phonology has been a central concept in the scientific study of dyslexia over the past decades. Despite its central position, however, it is a concept with no precise definition or status. The present article investigates the notion of 'phonology' in the tradition of cognitive psychology. An attempt is made to characterize the basic assumptions of the phonological approach to dyslexia and to evaluate these assumptions on the basis of commonly accepted standards of empirical science. First, the core assumptions of phonological awareness are outlined and discussed. Second, the position of Paula Tallal is presented and discussed in order to shed light on an attempt to stretch the cognitive-psychological notion of 'phonology' towards auditory and perceptual aspects. Both the core assumptions and Tallal's position are rejected as unfortunate, albeit for different reasons. Third, the outcome of this discussion is a search for what is referred to as a 'vulnerable theory' within this field. The present article claims that phonological descriptions must be based on observable linguistic behaviour, so that hypotheses can be falsified by data. Consequently, definitions of 'dyslexia' must be based on symptoms; causal aspects should not be included. In fact, we claim that causal aspects, such as 'phonological deficit', both exclude other causal hypotheses and lead to circular reasoning. If we are to use terms such as 'phonology' and 'phoneme' in dyslexia research, we must have more precise operationalizations of them. Copyright © 2007 John Wiley & Sons, Ltd.

Hydrograph and unit hydrograph derivation in arid regions
HYDROLOGICAL PROCESSES, Issue 8 2007. Zekai
Abstract. Arid and semi-arid regions exhibit special hydrological features that are distinct from those of humid areas.
Unfortunately, empirical hydrological formulations developed for humid regions are often applied directly in arid and semi-arid regions without regard for their basic assumptions. During any storm rainfall in arid regions, the rainfall, infiltration and runoff components of the hydrological cycle have impacts on water resources. The basis of the methodology presented in this paper is the ratio of the runoff increment to the rainfall increment during an infinitesimally small time duration, which is the definition of the runoff coefficient for that duration. The ratio is obtained through a rational, physical and mathematical line of hydrological reasoning and is then integrated with the classical infiltration equation for the hydrograph determination. The parameters of the methodology are explained and empirical estimations of them are presented. The methodology works for rainfall and runoff from ungauged watersheds where infiltration measurements can be performed. A detailed comparison of the new approach with classical approaches, such as the rational formula and the Soil Conservation Service method, is presented. The approach is applied to two wadis within the Kingdom of Saudi Arabia. Copyright © 2006 John Wiley & Sons, Ltd.

Issues and Challenges of Emigration Dynamics in Developing Countries
INTERNATIONAL MIGRATION, Issue 4 2001. A.A. Afolayan
This article is a theory-based attempt to present the issues and challenges of emigration dynamics in developing countries. The topic is discussed within several basic assumptions: first, that emigration dynamics in developing countries have certain features that differ from those in developed countries; second, that the countries in the regions covered by the study (sub-Saharan Africa, Central America and the Caribbean, and South Asia) are representative of developing countries.
The article has been considerably facilitated by two recently concluded and reported projects: the IOM/UNFPA project, "Emigration dynamics in developing countries: sub-Saharan Africa, Central America and the Caribbean, and South Asia" (Appleyard, 1998, 1999), and the UAP/CEIFO project on "International migration in and from Africa: dimensions, challenges and prospects" (Adepoju and Hammar, 1996). Any serious academic study of emigration dynamics in developing countries must acknowledge these landmark scholarly studies if it is to advance understanding of the essential features of emigration dynamics in developing countries. A prime objective of the present article is to focus attention on aspects of the emigration process that will enable policy makers to utilize emigration for development, especially through national and international cooperation at regional and global levels. The article is predicated upon the need for a theory or model of emigration dynamics in developing countries that meets both internal and external dimensions. The adequacy of such a theory can be measured at three different levels: observation, description and explanation (Chomsky, 1965).

The ethics of research using electronic mail discussion groups
JOURNAL OF ADVANCED NURSING, Issue 5 2005. Debbie Kralik PhD RN
Aim: The aim of this paper is to identify and discuss the ethical considerations that have confronted and challenged the research team when researchers facilitate conversations using private electronic mail discussion lists. Background: The use of electronic mail group conversations, as a collaborative data generation method, remains underdeveloped in nursing. Ethical challenges associated with this approach to data generation have only begun to be considered.
Since receiving ethics approval for a study titled 'Describing transition with people who live with chronic illness', we have been challenged by many ethical dilemmas; hence we believe it is timely to share the issues that have confronted the research team. These discussions are essential so that we can understand the possibilities for research interaction, communication and collaboration made possible by advanced information technologies. Discussion: Our experiences in this study have increased our awareness of the need for ongoing ethical discussion about the privacy, confidentiality, consent, accountability and openness underpinning research with human participants when generating data using an electronic mail discussion group. We describe how we work at upholding these ethical principles, focusing on informed consent, participant confidentiality and privacy, participants as threats to themselves and one another, public–private confusion, employees with access, hackers, and threats from the researchers. Conclusion: A variety of complex issues arise during cyberspace research that can make the application of traditional ethical standards troublesome. Communication in cyberspace alters the temporal, spatial and sensory components of human interaction, thereby challenging traditional ethical definitions and calling into question some basic assumptions about identity and one's right to keep aspects of it confidential. Nurse researchers are bound by human research ethics protocols; however, the nature of research by electronic mail generates moral issues as well as ethical concerns. Vigilance by researchers is required to ensure that data are viewed within the scope of the enabling ethics approval.

Self-definition of women experiencing a nontraditional graduate fellowship program
JOURNAL OF RESEARCH IN SCIENCE TEACHING, Issue 8 2006. Gayle A. Buck
Women continue to be underrepresented in the fields of science, technology, engineering, and mathematics (STEM).
One factor contributing to this underrepresentation is the graduate school experience. Graduate programs in STEM fields are constructed around assumptions that ignore the reality of women's lives; however, emerging opportunities may lead to experiences that are more compatible for women. One such opportunity is the Graduate Teaching Fellows in K–12 Education (GK–12) Program, which was introduced by the National Science Foundation in 1999. Although this nontraditional graduate program was not designed explicitly for women, it provided an unprecedented context in which to research how changing some of the basic assumptions upon which a graduate school operates may impact women in science. This exploratory case study examines the self-definition of 8 women graduate students who participated in a GK–12 program at a major research university. The findings from this case study contribute to higher education's understanding of the terrain women graduate students in the STEM areas must navigate as they participate in programs thought to be more conducive to their modes of self-definition while they continue to seek success in the historically Eurocentric, masculine STEM fields. © 2006 Wiley Periodicals, Inc. J Res Sci Teach 43: 852–873, 2006.

The Cultural Paradigm of the Smaller Firm
JOURNAL OF SMALL BUSINESS MANAGEMENT, Issue 4 2004. Helen Haugh
This paper presents the findings from an ethnographic study of organizational culture and shared values in four smaller firms, the outcome of which was the identification of the cultural values shared between owner-managers (OMs) and employees in each firm. The research employed Schein's conceptualization of culture as a three-layer phenomenon, consisting of surface artifacts, shared values and beliefs, and basic assumptions. The analytical technique of grounded theory was employed to process the large volume of data gathered during the extended research period.
The data reveal a complex array of values in each firm, with only one firm exhibiting a homogeneous culture in which values are shared by all those working in the organization. In the remaining three firms, five values appear to be shared by all employees; however, this is overlaid by a pattern of subcultures differentiated by distinctive shared values. Interfirm analysis among the four firms found that the values of survival, independence, control, pragmatism, and financial prudence were shared by two or more firms. The research collectively defines these shared values as the cultural paradigm of the smaller firm.

Utopianism in psychology: The case of Wilhelm Reich
JOURNAL OF THE HISTORY OF THE BEHAVIORAL SCIENCES, Issue 2 2002. Petteri Pietikainen
This article examines utopian elements in Wilhelm Reich's writings in his American phase (1939–1957) in order to illustrate utopian sources of dynamic psychology. Although there are scholars who have used the term "psychological utopia" and applied it to individual thinkers (Reich, Marcuse, Fromm) and to specific psychological disciplines (psychoanalysis, behaviorism, cognitive psychology), the term itself has remained elusive and vague. Furthermore, there have been few attempts to systematically examine utopian elements in twentieth-century psychology in general, and the basic assumptions of psychological utopianism in particular. While pointing out that Reich's orgonomic theories have no scientific merit, this article argues for the relevance of his ideas for understanding the nature of utopianism in dynamic psychology. © 2002 Wiley Periodicals, Inc.

Tests for cycling in a signalling pathway
JOURNAL OF THE ROYAL STATISTICAL SOCIETY: SERIES C (APPLIED STATISTICS), Issue 4 2004. T. G. Müller
Summary: Cellular signalling pathways, mediating receptor activity to nuclear gene activation, are generally regarded as feed-forward cascades.
We analyse measured data from a partially observed signalling pathway and address the question of possible feedback cycling of the biochemical components involved between the nucleus and cytoplasm. First we address the question of cycling in general, starting from basic assumptions about the system. We reformulate the problem as a statistical test, leading to likelihood ratio tests under non-standard conditions. We find that the modelling approach without cycling is rejected. Afterwards, to differentiate between two transport mechanisms within the nucleus, we derive the appropriate dynamical models, which lead to two systems of ordinary differential equations. To compare both models we apply a statistical testing procedure based on bootstrap distributions. We find that one of the two transport mechanisms leads to a dynamical model that is rejected, whereas the other model is satisfactory.

Law versus the State: The Judicialization of Politics in Egypt
LAW & SOCIAL INQUIRY, Issue 4 2003. Tamir Moustafa
This study seeks to explain the paradoxical expansion of constitutional power in Egypt over the past two decades, despite that country's authoritarian political system. I find that the Egyptian regime established an independent constitutional court, capable of providing institutional guarantees on the security of property rights, in order to attract desperately needed private investment after the failure of its socialist-oriented development strategy. The court continued to expand its authority, fundamentally transforming the mode of interaction between state and society by supporting regime efforts to liberalize the economy while simultaneously providing new avenues for opposition activists and human rights groups to challenge the state.
The Egyptian case challenges some of our basic assumptions about the conditions under which we are likely to see a judicialization of politics, and it invites scholars to explore the dynamics of judicial politics in other authoritarian political systems.

Research paradigms in medical education research
MEDICAL EDUCATION, Issue 4 2010. Suzanne Bunniss
Medical Education 2010: 44: 358–366. Context: The growing popularity of less familiar methodologies in medical education research, and the use of related data collection methods, has made it timely to revisit some basic assumptions regarding knowledge and evidence. Methods: This paper outlines four major research paradigms and examines the methodological questions that underpin the development of knowledge through medical education research. Discussion: This paper explores the rationale behind different research designs, and shows how the underlying research philosophy of a study can directly influence what is captured and reported. It also explores the interpretivist perspective in some depth to show how less familiar paradigm perspectives can provide useful insights into the complex questions generated by modern healthcare practice. Conclusions: This paper concludes that the quality of research is defined by the integrity and transparency of the research philosophy and methods, rather than by the superiority of any one paradigm. By demonstrating that different methodological approaches deliberately include and exclude different types of data, this paper highlights how competing knowledge philosophies have practical implications for the findings of a study.

Qualitative methodologies I: asking research questions with reflexive insight
MUSCULOSKELETAL CARE, Issue 3 2007. Elizabeth D. Hale BA, CPsychol
Abstract. The purpose of this paper, the first of a series of two discussion pieces, is to introduce some of the issues in the debate surrounding qualitative research to the readers of Musculoskeletal Care.
Recent issues of the Journal have seen an informative focus on quantitative methods and statistical analysis, and here we provide an equivalent introduction to semi-structured interviewing and qualitative analysis in this series. In the qualitative tradition, we have tried to keep our discussion reflexive, transparent and contextualized within the history of the approach and the theoretical considerations that underlie it, including the origins, nature, methods and limits of the approach. We provide information that we hope is useful for readers with all levels of familiarity with qualitative research, building from an introduction to some basic assumptions and ethical issues. We also introduce one specific qualitative approach, interpretative phenomenological analysis, which researchers might wish to apply. In the accompanying paper, in a subsequent issue of Musculoskeletal Care, we will describe the potential application of this approach. Copyright © 2007 John Wiley & Sons, Ltd.

Spinoza's Proof of Necessitarianism
PHILOSOPHY AND PHENOMENOLOGICAL RESEARCH, Issue 2 2003. Olli Koistinen
This paper consists of four sections. The first section considers what the proof of necessitarianism in Spinoza's system requires. Also in the first section, Jonathan Bennett's (1984) reading of 1p16 as involving a commitment to necessitarianism is presented and accepted. The second section evaluates Bennett's suggestion of how Spinoza might have been led to conclude necessitarianism from his basic assumptions. The third section of the paper is devoted to Don Garrett's (1991) interpretation of Spinoza's proof. I argue that Bennett's and Garrett's interpretations of Spinoza's necessitarianism have shortcomings which justify an attempt to offer an alternative proof.
In the proof given in the fourth section, it is argued that Spinoza derived necessitarianism from the conjunction of the following principles: (i) the necessary existence of the substances; (ii) substance-property ontology; (iii) superessentialism; and (iv) the 'no shared attribute' thesis.

Social maladjustment and students with behavioral and emotional disorders: Revisiting basic assumptions and assessment issues
PSYCHOLOGY IN THE SCHOOLS, Issue 8 2004. Daniel Olympia
While much of the current focus in special education remains on the reauthorization of the Individuals with Disabilities Education Act of 1997, disparities in the identification of children with serious emotional disorders continue to plague special educators and school psychologists. Several years after the issue of social maladjustment and its relationship to serious emotional disturbance was discussed and debated, little appears to have changed. Children, adolescents, and families are subjected to widely varying philosophies, assessment procedures, and services based on questionable criteria used to determine whether a student "qualifies" for services under the Serious Emotional Disturbance (SED) designation. In this paper, we address how this issue has significantly affected access to services for students with serious emotional disturbances. Faulty assumptions regarding the relationship of social maladjustment to emotional disturbance in children and adolescents are identified, and the implications of these assumptions for children are described. The lack of research supporting specific tools developed to assess social maladjustment in the context of a serious emotional disorder, and the impact of this current practice on children, is addressed from a practical and ethical standpoint. The role of the school psychologist as gatekeeper is contrasted with the more positive role of facilitator. © 2004 Wiley Periodicals, Inc. Psychol Schs 41: 835–847, 2004.
[source] An attempt to create an ethic of transfer after conflict resolution in fractured communities: a modified formal grounded theory shaped by meta-ethnography PSYCHOTHERAPY AND POLITICS INTERNATIONAL, Issue 2 2007 Maurice Apprey Abstract A modified formal grounded theory on the ethic of transfer after conflict resolution has been established. There are two parts to this account. First, a phenomenologically driven set of basic assumptions is deployed to shape the praxis. Then a meta-ethnographic synthesis is used to combine different approaches to conflict resolution in order to create another discrete interventional practice in ways that make us uneasy about each of the prior practices. The result is an interventional approach that allows practitioners of conflict resolution in fractured communities to begin their interventions with an understanding of the cultural habitus in the first instance, followed up with processes of transformation through psychopolitical dialogues and ending with grassroots projects that return the conflict-resolution project into the hands of the stakeholders whose cultural habitus was determined at the outset. Finally, psychopolitical dialogue with the select group of stakeholders ends with the choice of a number of grassroots projects that in turn generalize the results from small groups into the larger population. Such an ethic of transfer then starts with gatekeepers to sanction the psychopolitical dialogues and returns to the same gatekeepers who guide the selection of grassroots projects. The result is a recursive loop that treats the ethic of transfer of a conflict resolution project as part of an organic whole rather than an addendum. Copyright © 2007 John Wiley & Sons, Ltd.
[source] Positron Emission Tomography Applied to Fluidization Engineering THE CANADIAN JOURNAL OF CHEMICAL ENGINEERING, Issue 1 2005 Chutima Dechsiri Abstract The movement of particles in a laboratory fluidized bed has been studied using Positron Emission Tomography (PET). With this non-invasive technique, both pulses of various shapes and single tracer particles were followed in 3-D. The equipment and materials used made it possible to label actual bed particles as tracers. This paper briefly describes the data analysis and some of the results, and compares them with a stochastic model. The results confirm the basic assumptions of the model qualitatively but not quantitatively. Further analysis of the results indicates that the system exhibits gulf-streaming, a feature which is not yet accounted for in the model, but is common in fluidized beds in practice.
[source] Unravelling the riddle of exhibitionism: A lesson in the power tactics of perverse interpersonal relationships THE INTERNATIONAL JOURNAL OF PSYCHOANALYSIS, Issue 1 2008 Richard Howard Tuch Through an examination of the varied paradoxes embedded within the phenomenon of genital exhibitionism, the author establishes exhibitionism as a paradigm for interpersonal relations whereby one individual entices another to lose himself, to a benign or dangerous degree, in a presented portrayal/enactment. Efforts to entice that cause an extreme loss of the subject's sense of self, making it exceedingly hard to break free of, are designed to render the subject powerless and take psychic possession of him. The perpetrator accomplishes this feat by interacting with his victim in ways capable of producing a sudden and profound regression with sufficient loss of autonomous ego functioning that the subject finds it hard, if not impossible, to act on his own behalf. The essential feature of the perpetrator's efforts is his violation of the unspoken but understood rules of interpersonal engagement that, when violated, cause extreme disorientation and a loss of trust in one's most basic assumptions about how humans treat one another. When this happens, we would describe the interaction as having been perverted to serve the exclusive needs of the predator, to gain as complete control over the other as possible. [source] 'Weaving thoughts': A method for presenting and commenting on psychoanalytic case material in a peer group THE INTERNATIONAL JOURNAL OF PSYCHOANALYSIS, Issue 5 2005 JOHAN NORMAN The authors argue that there are good reasons for seriously considering the dynamics of the peer group when discussing psychoanalytical case material. The setting and procedure have to protect the presenter and the group members and facilitate their working together.
The aim of this paper is to discuss the problems connected with presenting and discussing clinical psychoanalytical material in a peer group and to describe one such specific method, which the authors call the 'weaving thoughts' method. The design is primarily inspired by Bion's formulation 'thoughts in search of a thinker'. The group participants reflect on the presented clinical material in a way that the authors metaphorically describe as creating a weave of thoughts that emerges from the material. The aim of the method is to facilitate a work-group climate that allows thoughts to wander about, and to keep group members from debating and from compromising one another's integrity by letting basic assumptions come into power. The method is described from theoretical and practical points of view, with two illustrations of seminars according to this design and finally a discussion of the advantages and drawbacks of the method. [source] The capacity to be an analyst: A contribution from attachment research to the study of candidate selection THE INTERNATIONAL JOURNAL OF PSYCHOANALYSIS, Issue 6 2003 Janice Halpern In this paper the author discusses how the study of candidate selection, once a topic of vibrant research, has unfortunately languished. Certain qualities were thought to characterize the successful candidate. However, they were never successfully operationalized or empirically tested. Possibly because of this lack of empirical data, selectors today have difficulty articulating their criteria and are relying on intuition. In order to provide a more rational basis for contemporary selection, the author looks to the attachment literature. This makes sense because attachment theory shares some basic assumptions with contemporary psychoanalysis. The Adult Attachment Interview (AAI) is a research tool that predicts the ability of a parent to convey attachment security. It is scored by attending to how a person speaks about his early attachment experiences.
The AAI appears to tap into qualities similar to those that selection researchers have sought in their candidates. Further, the scoring method of the AAI appears to be similar to the last attempt by selection researchers to operationalize them. Given these similarities, the author recommends an empirical study using the AAI to operationalize these qualities in analytic candidates. The study would test their importance for success in the training program, thus offering selectors some empirical grounding for their choices. [source] The role and impact of mental simulation in design APPLIED COGNITIVE PSYCHOLOGY, Issue 3 2009 Bo T. Christensen Although theories of mental simulations have used different formulations of the premises of 'thought experiments', they can be fitted under a minimalist hypothesis stating that mental simulations are run under situations of uncertainty to turn that uncertainty into approximate answers. Three basic assumptions of mental simulations were tested using naturalistic data from engineering design. Results from the design protocols showed (1) initial representations in mental simulation had higher than base-rate uncertainty, (2) uncertainty in mental simulations was lowered after simulation runs, and (3) resulting representations had more approximations than base-rate or initial representations. Further, the reference to external representational systems (sketches and prototypes) was examined. It was found that prototypes had fewer technical/functional simulations compared to sketches or unsupported cognition. Although prototypes were associated with more approximation than unsupported cognition, the different external representation categories did not differ in information uncertainty. The results support the minimalist hypothesis of mental simulations. Copyright © 2008 John Wiley & Sons, Ltd.
[source] Revisiting the concept of lineage in prokaryotes: a phylogenetic perspective BIOESSAYS, Issue 5 2009 Yan Boucher Abstract Mutation and lateral transfer are two categories of processes generating genetic diversity in prokaryotic genomes. Their relative importance varies between lineages, yet both are complementary rather than independent, separable evolutionary forces. The replication process inevitably merges together their effects on the genome. We develop the concept of "open lineages" to characterize evolutionary lineages that over time accumulate more changes in their genomes by lateral transfer than by mutation. They contrast with "closed lineages," in which most of the changes are caused by mutation. Open and closed lineages are interspersed along the branches of any tree of prokaryotes. This patchy distribution conflicts with the basic assumptions of traditional phylogenetic approaches. As a result, a tree representation including both open and closed lineages is a misrepresentation. The evolution of all prokaryotic lineages cannot be studied under a single model unless new phylogenetic approaches that are more pluralistic about lineage evolution are designed. [source] Hierarchical Spatial Modeling of Additive and Dominance Genetic Variance for Large Spatial Trial Datasets BIOMETRICS, Issue 2 2009 Andrew O. Finley Summary This article expands upon recent interest in Bayesian hierarchical models in quantitative genetics by developing spatial process models for inference on additive and dominance genetic variance within the context of large spatially referenced trial datasets. Direct application of such models to large spatial datasets is, however, computationally infeasible because of the cubic-order matrix algorithms involved in estimation. The situation is even worse in Markov chain Monte Carlo (MCMC) contexts, where such computations are performed for several iterations.
Here, we discuss approaches that help obviate these hurdles without sacrificing the richness in modeling. For genetic effects, we demonstrate how an initial spectral decomposition of the relationship matrices negates the need for the expensive matrix inversions required in previously proposed MCMC methods. For spatial effects, we outline two approaches for circumventing the prohibitively expensive matrix decompositions: the first leverages analytical results from Ornstein-Uhlenbeck processes that yield computationally efficient tridiagonal structures, whereas the second derives a modified predictive process model from the original model by projecting its realizations to a lower-dimensional subspace, thereby reducing the computational burden. We illustrate the proposed methods using a synthetic dataset with additive and dominance genetic effects and anisotropic spatial residuals, and a large dataset from a Scots pine (Pinus sylvestris L.) progeny study conducted in northern Sweden. Our approaches enable us to provide a comprehensive analysis of this large trial, which amply demonstrates that, in addition to violating basic assumptions of the linear model, ignoring spatial effects can result in downwardly biased measures of heritability. [source] The Impact of Board Independence and CEO Duality on Firm Performance: A Quantile Regression Analysis for Indonesia, Malaysia, South Korea and Thailand BRITISH JOURNAL OF MANAGEMENT, Issue 3 2010 Dendi Ramdani We study the effect of board independence and CEO duality on firm performance for a sample of stock-listed enterprises from Indonesia, Malaysia, South Korea and Thailand, applying quantile regression. Quantile regression is more powerful than classical linear regression in this setting, since it can produce estimates for all conditional quantiles of the distribution of a response variable, whereas classical linear regression estimates only the conditional mean of the response variable.
Moreover, quantile regression is better able to handle violations of the basic assumptions of classical linear regression. Our empirical evidence shows that the effect of board independence and CEO duality on firm performance differs across the conditional quantiles of the distribution of firm performance, something classical linear regression would leave unidentified. This finding suggests that estimating the quantile effects of a response variable may well be more insightful than estimating only the mean effect. Additionally, we find a negative moderating effect of board size on the positive relationship between CEO duality and firm performance.