New Framework (new + framework)



Selected Abstracts


The Invisible (Inaudible) Woman: Nursing in the English Academy

GENDER, WORK & ORGANISATION, Issue 2 2005
Liz Meerabeau
Nursing is numerically a far larger academic discipline than medicine, and is situated in many more higher education institutions in England (over 50), whereas there are 21 medical schools. Like the rest of 'non-medical education and training' it is purchased through a quasi-market. Despite the size of this market, however, nursing education has until recently been largely invisible in policy documents, and the ambitions of nursing academics to develop their subject are seen as inappropriate. This article explores this invisibility and inaudibility, with particular reference to the 1997 Richards Report, Clinical Academic Careers, and the 2001 Nuffield Trust report, A New Framework for NHS/University Relations. It draws on the work of Davies on the 'professional predicament' of nursing to argue that, although the move of nursing education into higher education had the aim of improving its status, nursing has difficulty finding its voice within academia. As a result, issues which are salient for nursing (as for many of the health professions), such as a poor (or relatively poor) showing in the Research Assessment Exercise and the complexities of balancing research, teaching and maintaining clinical competence, are raised as high-profile issues only in medicine. [source]


A New Framework for Macroeconomics

AMERICAN JOURNAL OF ECONOMICS AND SOCIOLOGY, Issue 4 2009
Achieving Full Employment by Increasing Capital Turnover
Most forms of macroeconomics today, whether Keynesian or monetarist, presuppose that problems of economic instability can be treated as errors in financial management. Neither fiscal nor monetary policy recognizes the existence of systemic faults in the real economy that result in overinvestment in durable capital that turns over slowly, in contrast to forms of capital that interact more frequently with land and labor. Only by removing serious distortions in microeconomic relations can macroeconomic problems be resolved. The current global economic crisis exemplifies the limitations of policies that ignore distortions in the rate of turnover of investment capital. [source]


Nationalism in Ukraine: Towards A New Framework

POLITICS, Issue 2 2000
Taras Kuzio
Nationalism is the most abused term in contemporary Ukrainian studies. The majority of scholars have failed to place its use within either a theoretical or comparative framework due to the dominance of area studies and the Russo-centricity of Sovietology and post-Sovietology. Instead of defining it within political science parameters, 'nationalism' has been used in a subjective and negative manner by equating it solely, in an ethno-cultural sense, with Ukrainophones. As a result, scholars tend to place Ukrainophones on the right of the political spectrum. This article argues that this is fundamentally at odds with theory and comparative politics on two counts. First, 'nationalism' is a thin ideology and can function through all manner of ideologies ranging from communism to fascism. Second, all liberal democracies are composed of ethno-cultural and civic features and are therefore permeated by state (civic) nationalism. The article proposes an alternative three-fold framework for understanding 'nationalism' in Ukraine. [source]


ChemInform Abstract: The Oxoantimonates(III) Rb2Sb8O13 and Cs8Sb22O37: New Framework and Layer Structures with "Lone-Pair" Cations.

CHEMINFORM, Issue 34 2002
Franziska Emmerling
Abstract ChemInform is a weekly Abstracting Service, delivering concise information at a glance that was extracted from about 100 leading journals. To access a ChemInform Abstract of an article which was published elsewhere, please select a "Full Text" option. The original article is trackable via the "References" option. [source]


Maintaining Case-Based Reasoners: Dimensions and Directions

COMPUTATIONAL INTELLIGENCE, Issue 2 2001
David C. Wilson
Experience with the growing number of large-scale and long-term case-based reasoning (CBR) applications has led to increasing recognition of the importance of maintaining existing CBR systems. Recent research has focused on case-base maintenance (CBM), addressing such issues as maintaining consistency, preserving competence, and controlling case-base growth. A set of dimensions for case-base maintenance, proposed by Leake and Wilson, provides a framework for understanding and expanding CBM research. However, it also has been recognized that other knowledge containers can be equally important maintenance targets. Multiple researchers have addressed pieces of this more general maintenance problem, considering such issues as how to refine similarity criteria and adaptation knowledge. As with case-base maintenance, a framework of dimensions for characterizing more general maintenance activity, within and across knowledge containers, is desirable to unify and understand the state of the art, as well as to suggest new avenues of exploration by identifying points along the dimensions that have not yet been studied. This article presents such a framework by (1) refining and updating the earlier framework of dimensions for case-base maintenance, (2) applying the refined dimensions to the entire range of knowledge containers, and (3) extending the theory to include coordinated cross-container maintenance. The result is a framework for understanding the general problem of case-based reasoner maintenance (CBRM). Taking the new framework as a starting point, the article explores key issues for future CBRM research. [source]


An Algorithmic Framework for Specifying the Semantics of Discourse Relations

COMPUTATIONAL INTELLIGENCE, Issue 4 2000
Alistair Knott
In this paper, a new framework is proposed for defining the semantics of discourse relations and for defining the semantics of the utterances that relations link together. The proposal is to define relations in terms of the operation of an algorithm simulating the mental state of an agent interacting with the world. The algorithm interleaves perception, theorem proving, and action: The denotation of a complex utterance containing a relation between two simpler utterances is taken to be the description of the operation of the algorithm during the time interval identified by their Reichenbachian reference times. This proposal is presented in detail for two discourse relations. Its potential application in the treatment of mood, tense, aspect, and dialogue structure is also discussed in very general terms. [source]


Fast GPU-based Adaptive Tessellation with CUDA

COMPUTER GRAPHICS FORUM, Issue 2 2009
Michael Schwarz
Abstract Compact surface descriptions like higher-order surfaces are popular representations for both modeling and animation. However, for fast graphics-hardware-assisted rendering, they usually need to be converted to triangle meshes. In this paper, we introduce a new framework for performing on-the-fly crack-free adaptive tessellation of surface primitives completely on the GPU. Utilizing CUDA and its flexible memory write capabilities, we parallelize the tessellation task at the level of single surface primitives. We are hence able to derive tessellation factors, perform surface evaluation as well as generate the tessellation topology in real-time even for large collections of primitives. We demonstrate the power of our framework by exemplarily applying it to both bicubic rational Bézier patches and PN triangles. [source]


A framework for quad/triangle subdivision surface fitting: Application to mechanical objects

COMPUTER GRAPHICS FORUM, Issue 1 2007
Guillaume Lavoué
Abstract In this paper we present a new framework for subdivision surface approximation of three-dimensional models represented by polygonal meshes. Our approach, particularly suited for mechanical or Computer Aided Design (CAD) parts, produces a mixed quadrangle-triangle control mesh, optimized in terms of face and vertex numbers while remaining independent of the connectivity of the input mesh. Our algorithm begins with a decomposition of the object into surface patches. The main idea is to approximate the region boundaries first and then the interior data. Thus, for each patch, a first step approximates the boundaries with subdivision curves (associated with control polygons) and creates an initial subdivision surface by linking the boundary control points with respect to the lines of curvature of the target surface. A second step then optimizes the initial subdivision surface by iteratively moving control points and enriching regions according to the error distribution. The final control mesh defining the whole model is then created by assembling the local subdivision control meshes. This control polyhedron is much more compact than the original mesh and visually represents the same shape after several subdivision steps; hence it is particularly suitable for compression and visualization tasks. Experiments conducted on several mechanical models have demonstrated the coherence and efficiency of our algorithm compared with existing methods. [source]
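A subdivision curve refines a coarse control polygon toward a smooth limit curve, which is the mechanism the boundary-approximation step above relies on. As a purely illustrative sketch (the paper does not specify this particular scheme), one step of Chaikin's corner-cutting rule looks like:

```python
# One step of Chaikin's corner-cutting subdivision for an open polyline.
# Used here only to illustrate how a control polygon refines toward a
# smooth subdivision curve; the paper's actual scheme may differ.
def chaikin_step(points):
    """points: list of (x, y) control points; returns the refined polyline."""
    refined = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        # Cut each edge at 1/4 and 3/4 of its length.
        refined.append((0.75 * x0 + 0.25 * x1, 0.75 * y0 + 0.25 * y1))
        refined.append((0.25 * x0 + 0.75 * x1, 0.25 * y0 + 0.75 * y1))
    return refined

coarse = [(0.0, 0.0), (1.0, 2.0), (2.0, 0.0)]
fine = chaikin_step(coarse)
```

Each step rounds off corners, and repeated application converges to a quadratic B-spline curve defined by the control polygon.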


Reducing threats to species: threat reversibility and links to industry

CONSERVATION LETTERS, Issue 4 2010
Laura R. Prugh
Abstract Threats to species' persistence are typically mitigated via lengthy and costly recovery planning processes that are implemented only after species are at risk of extinction. To reduce overall threats and minimize risks to species not yet imperiled, a proactive and broad-scale framework is needed. Using data on threats to imperiled species in Canada to illustrate our approach, we link threats to industries causing the harm, thus providing regulators with quantitative data that can be used directly in cost-benefit and risk analyses to broadly reduce threat levels. We then show how ranking the ease of threat abatement and reversal assists prioritization by identifying threats that are easiest to mitigate as well as threats that are possible to abate but nearly impossible to reverse. This new framework increases the usefulness of widely available threat data for preventative conservation and species recovery. [source]


Applying the Collective Causal Mapping Methodology to Operations Management Curriculum Development,

DECISION SCIENCES JOURNAL OF INNOVATIVE EDUCATION, Issue 2 2007
Julie M. Hays
ABSTRACT Although the field of operations management has come a long way since its beginnings in scientific management, the field still appears somewhat amorphous and unstructured to many. Introductory operations management textbooks usually include a number of largely disjointed topics, which leave many students (and their instructors) without a coherent framework for understanding the field. As a result, the importance and sequencing of topics varies widely between courses and instructors, even within the same university. This article applies the newly developed Collective Causal Mapping Methodology to create a causal map for the entire field of operations management. The causal map is built on expert opinions collected from over 250 academics and practitioners representing many areas of expertise, schools, organizations, and countries. This collective causal map is then used to create a new framework for understanding and teaching operations management. This framework can aid instructors in determining which topics should be taught in an operations management course, how these topics might be grouped and sequenced, and the important interrelationships among the topics that should be stressed to students. [source]


A Parametric Approach to Flexible Nonlinear Inference

ECONOMETRICA, Issue 3 2001
James D. Hamilton
This paper proposes a new framework for determining whether a given relationship is nonlinear, what the nonlinearity looks like, and whether it is adequately described by a particular parametric model. The paper studies a regression or forecasting model of the form y_t = μ(x_t) + ε_t, where the functional form of μ(·) is unknown. We propose viewing μ(·) itself as the outcome of a random process. The paper introduces a new stationary random field m(·) that generalizes finite-differenced Brownian motion to a vector field and whose realizations could represent a broad class of possible forms for μ(·). We view the parameters that characterize the relation between a given realization of m(·) and the particular value of μ(·) for a given sample as population parameters to be estimated by maximum likelihood or Bayesian methods. We show that the resulting inference about the functional relation also yields consistent estimates for a broad class of deterministic functions μ(·). The paper further develops a new test of the null hypothesis of linearity based on the Lagrange multiplier principle and small-sample confidence intervals based on numerical Bayesian methods. An empirical application suggests that properly accounting for the nonlinearity of the inflation-unemployment trade-off may explain the previously reported uneven empirical success of the Phillips Curve. [source]


An evolutionary concept for altered steroid hormone metabolism in patients with rheumatoid arthritis

EXPERIMENTAL DERMATOLOGY, Issue 2 2005
Rainer H. Straub
The pathogenesis of chronic disabling inflammatory diseases (CDIDs) is only partly understood. The concepts presently in use focus mainly on abnormalities of the immune system, but this view is incomplete. The presented concept is a new framework for the pathogenesis of CDIDs. It integrates evolutionary theories with the classical immunological standpoint, which is further linked with a neuroendocrine immune view of erroneous homeostatic adaptation of the other supersystems (nervous system, endocrine system, reproductive system): 1. In CDIDs, the loss of tolerance against self and harmless foreign antigens leads to continuous immune aggression, which depends on a multifactorial, genetically polymorphic background (the initiation). 2. However, advantageous or disadvantageous adaptations to CDIDs were not evolutionarily conserved, because CDIDs severely impaired reproduction or appeared after the reproductive phase and thus implied a strong negative selection pressure. 3. Reactions of all supersystems are evolutionarily conserved for transient inflammatory reactions such as the elimination of infectious agents, wound healing, foreign body reaction and many others. 4. The sum of the false reactions of all supersystems, conserved for transient inflammation, provides the pathogenetic background for the chronification of CDIDs because a continuous aggressive situation is created (the chronification). The human disease rheumatoid arthritis is used as a prototypic CDID to illustrate the integrated viewpoint. The innervation of the synovial tissue is the focus of this concept. [source]


Understanding intention of movement from electroencephalograms

EXPERT SYSTEMS, Issue 5 2007
Heba Lakany
Abstract: In this paper, we propose a new framework for understanding intention of movement that can be used in developing non-invasive brain-computer interfaces. The proposed method is based on extracting salient features from brain signals recorded whilst the subject is actually performing (or imagining) a wrist movement in different directions. Our method focuses on analysing the brain signals at the time preceding wrist movement, i.e. while the subject is preparing (or intending) to perform the movement. Feature selection and classification of the direction are done using a wrapper method based on support vector machines (SVMs). The classification results show that we are able to discriminate the directions using features extracted from brain signals prior to movement. We then extract rules from the SVM classifiers to compare the features extracted for real and imaginary movements in an attempt to understand the mechanisms of intention of movement. Our new approach could potentially be useful in building brain-computer interfaces where a paralysed person could communicate with a wheelchair and steer it in the desired direction using a rule-based knowledge system built on an understanding of the subject's intention to move through his or her brain signals. [source]
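The wrapper method mentioned above scores candidate feature subsets by the accuracy of the trained classifier itself. The sketch below shows greedy forward selection in that style; to keep it dependency-free, a simple nearest-centroid classifier stands in for the SVMs used in the paper, and the synthetic data are purely illustrative:

```python
import numpy as np

# Greedy forward feature selection in the "wrapper" style: candidate
# feature subsets are scored by the accuracy of a classifier trained on
# them. A nearest-centroid classifier stands in here for the SVM used
# in the paper (an assumption made only to keep this sketch minimal).
def centroid_accuracy(X, y, feats):
    Xs = X[:, feats]
    centroids = {c: Xs[y == c].mean(axis=0) for c in np.unique(y)}
    pred = [min(centroids, key=lambda c: np.linalg.norm(row - centroids[c]))
            for row in Xs]
    return np.mean(np.array(pred) == y)

def forward_select(X, y, n_keep):
    chosen = []
    while len(chosen) < n_keep:
        rest = [f for f in range(X.shape[1]) if f not in chosen]
        best = max(rest, key=lambda f: centroid_accuracy(X, y, chosen + [f]))
        chosen.append(best)
    return chosen

rng = np.random.default_rng(0)
y = np.repeat([0, 1], 50)
X = rng.normal(size=(100, 5))
X[:, 2] += 3 * y          # only feature 2 carries class information
picked = forward_select(X, y, 1)
```

With only feature 2 separating the classes, the wrapper selects it first; in the paper the same loop would be driven by SVM classification accuracy on EEG features.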


On establishing the accuracy of noise tomography travel-time measurements in a realistic medium

GEOPHYSICAL JOURNAL INTERNATIONAL, Issue 3 2009
Victor C. Tsai
SUMMARY It has previously been shown that the Green's function between two receivers can be retrieved by cross-correlating time series of noise recorded at the two receivers. This property has been derived assuming that the energy in normal modes is uncorrelated and perfectly equipartitioned, or that the distribution of noise sources is uniform in space and the waves measured satisfy a high frequency approximation. Although a number of authors have successfully extracted travel-time information from seismic surface-wave noise, the reason for this success of noise tomography remains unclear since the assumptions inherent in previous derivations do not hold for dispersive surface waves on the Earth. Here, we present a simple ray-theory derivation that facilitates an understanding of how cross correlations of seismic noise can be used to make direct travel-time measurements, even if the conditions assumed by previous derivations do not hold. Our new framework allows us to verify that cross-correlation measurements of isotropic surface-wave noise give results in accord with ray-theory expectations, but that if noise sources have an anisotropic distribution or if the velocity structure is non-uniform then significant differences can sometimes exist. We quantify the degree to which the sensitivity kernel is different from the geometric ray and find, for example, that the kernel width is period-dependent and that the kernel generally has non-zero sensitivity away from the geometric ray, even within our ray theoretical framework. These differences lead to usually small (but sometimes large) biases in models of seismic-wave speed and we show how our theoretical framework can be used to calculate the appropriate corrections. Even when these corrections are small, calculating the errors within a theoretical framework would alleviate fears traditional seismologists may have regarding the robustness of seismic noise tomography. [source]
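The core measurement the summary describes can be reduced to a toy, single-source case: two receivers record the same random signal, one with a propagation delay, and the lag of the cross-correlation peak recovers the inter-receiver travel time. This idealization deliberately ignores the source-distribution and dispersion effects the paper analyses:

```python
import numpy as np

# Toy noise cross-correlation travel-time measurement: two receivers
# record the same random source, the second with a propagation delay.
# The lag of the cross-correlation peak recovers the delay, i.e. the
# inter-receiver travel time in samples.
rng = np.random.default_rng(1)
n, delay = 2048, 37                     # record length; true travel time
source = rng.normal(size=n)
rec_a = source
rec_b = np.zeros(n)
rec_b[delay:] = source[:-delay]         # delayed copy at receiver B

cc = np.correlate(rec_b, rec_a, mode="full")
lags = np.arange(-(n - 1), n)           # lag axis for "full" mode
measured = lags[np.argmax(cc)]          # peak lag = travel time
```

In realistic media the peak broadens and can shift when the noise field is anisotropic, which is exactly the bias the paper quantifies.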


Policy Drivers in UK Higher Education in Historical Perspective: 'Inside Out', 'Outside In' and the Contribution of Research

HIGHER EDUCATION QUARTERLY, Issue 2 2006
Michael Shattock
Where have been the main policy drivers for the development of British higher education over the last 50 years? This article argues that while higher education policy was once driven from the inside outwards, from the late 1970s it has been driven exclusively from the outside inwards. Policy decisions under either regime were rarely driven by research findings, especially from within the higher education community. The current imbalance between 'inside-out' and 'outside-in' policy formation is paradoxically most apparent when the higher education system has a more widely diversified funding base than at any time since the 1930s. The key policy challenge is now not what new policies are needed but what new framework should be developed for policy making. [source]


Ontology-based resource management

HUMAN FACTORS AND ERGONOMICS IN MANUFACTURING & SERVICE INDUSTRIES, Issue 6 2009
Jussi I. Kantola
Managers face many difficulties in managing organizational resources, with problems arising from multi-objectiveness, lack of holism, subjective understanding, different perceptions, vagueness, and complexity. This work presents an ontology-based management framework that aims to reduce the number of difficulties related to organizational resource management. In this framework, organizational resources are treated as ontologies. The technological part of the framework includes an online repository and an interpreter for ontologies; the repository provides global access to ontologies, and the interpreter enables reasoning based on meanings. Introspection and extrospection create unique instances of the ontologies. When these unique instances are used, it is possible to manage organizational resources in a new way. This management method uses situational perceptions and aspirations for the future of organizational resources. Currently, the new framework is in use in the private sector, the municipal sector, and several universities around the world. © 2009 Wiley Periodicals, Inc. [source]


Integrated Optimization by Multi-Objective Particle Swarm Optimization

IEEJ TRANSACTIONS ON ELECTRICAL AND ELECTRONIC ENGINEERING, Issue 1 2010
Masaru Kawarabayashi Student Member
Abstract In this letter, the integrated optimization system, a new framework for practical optimization, is extended to multi-objective optimization problems. This system is used to reduce the number of accesses to a simulator. On the basis of simulation results for some typical benchmark problems, it is shown that the proposed integrated optimization system makes it possible to obtain relatively good Pareto solutions with a drastic reduction in the number of function calls needed to evaluate the performance index values of the systems. Copyright © 2010 Institute of Electrical Engineers of Japan. Published by John Wiley & Sons, Inc. [source]
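Multi-objective optimizers such as the one in this letter return a set of Pareto (non-dominated) solutions rather than a single optimum. A minimal dominance filter for a minimization problem, independent of the PSO machinery itself, can be sketched as:

```python
# Minimal Pareto filter for minimization problems: keep the points not
# dominated by any other. This is the acceptance test a multi-objective
# optimizer's archive applies to candidate solutions; the swarm update
# rules themselves are omitted for brevity.
def dominates(p, q):
    """p dominates q: no worse in every objective, strictly better in one."""
    return all(a <= b for a, b in zip(p, q)) and any(a < b for a, b in zip(p, q))

def pareto_front(points):
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

candidates = [(1.0, 4.0), (2.0, 2.0), (4.0, 1.0), (3.0, 3.0), (4.0, 4.0)]
front = pareto_front(candidates)
```

Here (3, 3) and (4, 4) are both dominated by (2, 2) and are dropped, leaving the three trade-off points on the front.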


Client-Led Information System Creation (CLIC): navigating the gap

INFORMATION SYSTEMS JOURNAL, Issue 3 2005
Donna Champion
Abstract. This paper offers a new framework to facilitate an interpretive approach to client-led information system development, referred to as CLIC (Client-Led Information System Creation). The challenge of moving seamlessly through a process of information systems (IS) design is still the subject of much research in the IS field. Attempts to address the difficulties of 'bridging the gap' between a client's business needs and an information system definition have hitherto not provided a coherent and practical approach. Rather than attempting to bridge the gap, this paper describes an approach to managing it by helping clients navigate the information system design (or inquiry) process in a coherent manner. The framework has been developed through practice, and the paper provides an example of navigating through the design phase taken from an Action Research field study in a major UK bank. [source]


Non-inferior Nash strategies for routing control in parallel-link communication networks

INTERNATIONAL JOURNAL OF COMMUNICATION SYSTEMS, Issue 4 2005
Yong Liu
Abstract We consider a routing control problem in a two-node parallel-link communication network shared by competing teams of users. Each team has various types of entities (traffic or jobs) to be routed on the network. The users in each team cooperate for the benefit of their team so as to achieve optimal routing over network links. The teams, on the other hand, compete among themselves for the network resources, and each has an objective function that relates to the overall performance of the network. For each team, there is a centralized decision-maker, called the team leader or manager, who coordinates the routing strategies among all entities in his team. A game-theoretic approach to deal with both cooperation within each team and competition among the teams, called the Non-inferior Nash strategy, is introduced. Considering the roles of a group manager in this context, the concept of a Non-inferior Nash strategy with a team leader is introduced. This multi-team solution provides a new framework for analysing hierarchically controlled systems so as to address complicated coordination problems among the various users. This strategy is applied to derive the optimal routing policies for all users in the network. It is shown that Non-inferior Nash strategies with a team leader are effective in improving the overall network performance. Various other types of strategies, such as team optimization and Nash strategies, are also discussed for the purpose of comparison. Copyright © 2005 John Wiley & Sons, Ltd. [source]


Aggregation of linguistic labels when semantics is based on antonyms

INTERNATIONAL JOURNAL OF INTELLIGENT SYSTEMS, Issue 4 2001
Vicenç Torra
In this work, we introduce aggregation operators for linguistic labels (that is, ordinal scales) when different experts (or information sources) use different domains to express their knowledge. The aggregated value is computed by (i) first building a unified framework, (ii) transforming all the initial values into this new framework, (iii) aggregating the transformed values, and (iv) finally applying a reverse transformation. The transformations and all the constructions are based on assuming an existing semantics for all the domains. In this work, we consider a semantics based on the existence of an antonym (or a set of them) for each element in the domain. This is equivalent to a semantics based on negation functions. © 2001 John Wiley & Sons, Inc. [source]
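The four-step procedure (unify, transform, aggregate, reverse-transform) can be illustrated with a deliberately simple transformation, a normalized-index map into [0, 1]. The paper instead derives its transformations from antonym-based semantics, so the mapping below is only a stand-in for the general idea:

```python
# Sketch of the four-step aggregation for experts who use ordinal scales
# of different sizes. The map into the unified framework is a simple
# normalized-index transformation here; the paper's operators are
# derived from antonym-based semantics instead.
def to_unified(label_index, scale_size):
    return label_index / (scale_size - 1)          # steps (i)-(ii): into [0, 1]

def from_unified(value, scale_size):
    return round(value * (scale_size - 1))         # step (iv): reverse map

def aggregate(labelled):
    """labelled: list of (label_index, scale_size) pairs, one per expert."""
    unified = [to_unified(i, k) for i, k in labelled]
    return sum(unified) / len(unified)             # step (iii): aggregate

# Expert 1 answers 2 on a 3-label scale; expert 2 answers 3 on a 5-label scale.
opinions = [(2, 3), (3, 5)]
result_on_5 = from_unified(aggregate(opinions), 5)
```

The unified values are 1.0 and 0.75, so the aggregate is 0.875, which maps back to the top region of the 5-label target scale.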


Constructive algorithm for dynamic observer error linearization via integrators: single output case

INTERNATIONAL JOURNAL OF ROBUST AND NONLINEAR CONTROL, Issue 1 2007
Kyung T. Yu
Abstract Dynamic observer error linearization, which has been introduced recently, is a new framework for observer design. Although this approach unifies several existing results on the problem and extends the class of systems that can be transformed into an observable linear system with an injection term of known signals, constructive algorithms to check its applicability are not yet available. In this paper, a constructive algorithm is proposed to solve the problem under some restrictions on the system structure and on the auxiliary dynamics introduced in the problem. The algorithm is constructive in the sense that the components of the transformation can be obtained step by step. Copyright © 2006 John Wiley & Sons, Ltd. [source]


Learning conditions at work: a framework to understand and assess informal learning in the workplace

INTERNATIONAL JOURNAL OF TRAINING AND DEVELOPMENT, Issue 1 2004
Sveinung Skule
The purpose of this article is to develop a framework to understand and assess the quality of learning environments in the workplace. It is argued that indicators used to measure and assess informal learning at work, at both the national and the enterprise level, are underdeveloped. Consequently, current frameworks to measure and benchmark learning are heavily biased towards education and formal training. A new framework is developed, based on a quantitative survey representative of the private sector in Norway. The framework consists of seven learning conditions, which have significant effects on informal learning at work. Implications for further research, policy and practice are discussed. [source]


Critical appraisal of rigour in interpretive phenomenological nursing research

JOURNAL OF ADVANCED NURSING, Issue 2 2006
Lorna De Witt BScN RN
Aim. This paper reports a critical review of published nursing research for expressions of rigour in interpretive phenomenology, and a new framework of rigour specific to this methodology is proposed. Background. The rigour of interpretive phenomenology is an important nursing research methods issue that has direct implications for the legitimacy of nursing science. The use of a generic set of qualitative criteria of rigour for interpretive phenomenological studies is problematic because it is philosophically inconsistent with the methodology and creates obstacles to full expression of rigour in such studies. Methods. A critical review was conducted of the published theoretical interpretive phenomenological nursing literature from 1994 to 2004, and the expressions of rigour in this literature were identified. We used three sources to inform the derivation of a proposed framework of expressions of rigour for interpretive phenomenology: the phenomenological scholar van Manen, the theoretical interpretive phenomenological nursing literature, and Madison's criteria of rigour for hermeneutic phenomenology. Findings. The nursing literature reveals a broad range of criteria for judging the rigour of interpretive phenomenological research. The proposed framework for evaluating rigour in this kind of research contains the following five expressions: balanced integration, openness, concreteness, resonance, and actualization. Balanced integration refers to the intertwining of philosophical concepts in the study methods and findings and a balance between the voices of study participants and the philosophical explanation. Openness is related to a systematic, explicit process of accounting for the multiple decisions made throughout the study process. Concreteness relates to the usefulness of study findings for practice. Resonance encompasses the experiential or felt effect of reading study findings upon the reader. Finally, actualization refers to the future realization of the resonance of study findings. Conclusion. Adoption of this or similar frameworks of expressions of rigour could help to preserve the integrity and legitimacy of interpretive phenomenological nursing research. [source]


Can forecasting performance be improved by considering the steady state?

JOURNAL OF FORECASTING, Issue 1 2008
An application to Swedish inflation and interest rates
Abstract This paper investigates whether the forecasting performance of Bayesian autoregressive and vector autoregressive models can be improved by incorporating prior beliefs about the steady state of the time series in the system. Traditional methodology is compared to the new framework, in which a mean-adjusted form of the models is employed, by estimating the models on Swedish inflation and interest rate data from 1980 to 2004. Results show that the out-of-sample forecasting ability of the models is practically unchanged for inflation but significantly improved for the interest rate when informative prior distributions on the steady state are provided. The findings in this paper imply that this new methodology could be useful since it allows us to sharpen our forecasts in the presence of potential pitfalls such as near unit root processes and structural breaks, in particular when relying on small samples. Copyright © 2008 John Wiley & Sons, Ltd. [source]
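The mean-adjusted form referred to above writes an autoregression as deviations from its steady state, e.g. y_t - mu = rho * (y_{t-1} - mu) + e_t, so that prior beliefs can be placed directly on mu. A toy forecast iteration (with illustrative numbers, not the paper's estimates) shows the consequence: point forecasts decay geometrically toward the assumed steady state:

```python
# Point forecasts from a mean-adjusted AR(1),
#   y_t - mu = rho * (y_{t-1} - mu) + e_t.
# The h-step forecast is mu + rho**h * (y_T - mu), so forecasts shrink
# geometrically toward the steady state mu. Numbers are illustrative only.
def forecast_path(y_last, mu, rho, horizon):
    return [mu + rho**h * (y_last - mu) for h in range(1, horizon + 1)]

# Inflation currently at 6% with a 2% steady-state prior and persistence 0.8.
path = forecast_path(y_last=6.0, mu=2.0, rho=0.8, horizon=12)
```

Anchoring mu with an informative prior is what pins down the long-run end of the forecast path even when rho is close to one.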


A new framework for data reconciliation and measurement bias identification in generalized linear dynamic systems

AICHE JOURNAL, Issue 7 2010
Hua Xu
Abstract This article describes a new framework for data reconciliation in generalized linear dynamic systems, in which the well-known Kalman filter (KF) is inadequate for filtering. In contrast to the classical formulation, the proposed framework is in a more concise form while retaining the same filtering accuracy. This follows from the properties of linear dynamic systems and the features of the linear equality constrained least squares solution. Meanwhile, the statistical properties of the framework offer new potential for dynamic measurement bias detection and identification techniques. On the basis of this new framework, a filtering formula is rederived directly and the generalized likelihood ratio method is modified for generalized linear dynamic systems. Simulation studies of a material network illustrate the effects of both techniques and demonstrate the characteristics of the identification approach. Moreover, the new framework provides some insights into the connections between linear dynamic data reconciliation, linear steady-state data reconciliation, and the KF. © 2009 American Institute of Chemical Engineers AIChE J, 2010 [source]
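In the static, identity-covariance special case, data reconciliation reduces to an equality-constrained least squares problem: project the raw measurements onto the subspace satisfying the balance equations A x = 0. The closed-form projection below sketches that core idea only; the paper's generalized linear dynamic framework extends it well beyond this case:

```python
import numpy as np

# Steady-state data reconciliation as equality-constrained least squares:
# find x closest to the measurements y subject to the balance A x = 0.
# With identity measurement covariance, the solution is the projection
#   xhat = y - A.T @ inv(A @ A.T) @ A @ y.
def reconcile(y, A):
    correction = A.T @ np.linalg.solve(A @ A.T, A @ y)
    return y - correction

# One splitter: stream 1 splits into streams 2 and 3 (x1 - x2 - x3 = 0),
# but the raw measurements violate the balance by 0.4 units.
A = np.array([[1.0, -1.0, -1.0]])
y = np.array([10.1, 4.2, 5.5])
xhat = reconcile(y, A)
```

The reconciled flows satisfy the mass balance exactly while staying as close as possible (in least squares) to the measurements; the residual A y itself is what bias-detection tests such as the generalized likelihood ratio examine.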


Explaining organizational change in international development: the role of complexity in anti-corruption work

JOURNAL OF INTERNATIONAL DEVELOPMENT, Issue 8 2004
Bryane Michael
What explains the rapid expansion of programmes undertaken by donor agencies which may be labelled as 'anti-corruption programmes' in the 1990s? There are four schools of anti-corruption project practice: universalistic, state-centric, society-centric, and critical. Yet none can explain the expansion of anti-corruption projects. A 'complexity perspective' offers a new framework for looking at such growth. Such a complexity perspective addresses how project managers, by strategically interacting, can create emergent and evolutionary expansionary self-organization. Throughout the 'first wave' of anti-corruption activity in the 1990s, such self-organization was largely due to World Bank sponsored national anti-corruption programmes. More broadly, the experience of the first wave of anti-corruption practice sheds light on development theory and practice, helping to explain new development practice with its stress on multi-layeredness, participation, and indigenous knowledge. Copyright © 2004 John Wiley & Sons, Ltd. [source]


Conceptualizing Knowledge Creation: A Critique of Nonaka's Theory*

JOURNAL OF MANAGEMENT STUDIES, Issue 7 2006
Stephen Gourlay
abstract Nonaka's proposition that knowledge is created through the interaction of tacit and explicit knowledge involving four modes of knowledge conversion is flawed. Three of the modes appear plausible but none are supported by evidence that cannot be explained more simply. The conceptual framework omits inherently tacit knowledge, and uses a radically subjective definition of knowledge: knowledge is in effect created by managers. A new framework is proposed suggesting that different kinds of knowledge are created by different kinds of behaviour. Following Dewey, non-reflectional behaviour is distinguished from reflective behaviour, the former being associated with tacit knowledge, and the latter with explicit knowledge. Some of the implications for academic and managerial practice are considered. [source]


Dynamic operability analysis of nonlinear process networks based on dissipativity

AICHE JOURNAL, Issue 4 2009
Osvaldo J. Rojas
Abstract Most modern chemical plants are complex networks of multiple interconnected nonlinear process units, often with multiple recycle and by-pass streams and energy integration. Interactions between process units often lead to plant-wide operability problems (i.e., difficulties in process control). Plant-wide operability analysis is often difficult due to the complexity and nonlinearity of the processes. This article provides a new framework for dynamic operability analysis of plant-wide processes, based on the dissipativity of each process unit and the topology of the process network. Based on the concept of dissipative systems, this approach can deal with nonlinear processes directly. Developed from a network perspective, the proposed framework is also inherently scalable and can thus be applied to large process networks. © 2009 American Institute of Chemical Engineers AIChE J, 2009 [source]


Trait Psychology and Culture: Exploring Intercultural Comparisons

JOURNAL OF PERSONALITY, Issue 6 2001
Robert R. McCrae
Personality traits, studied for decades by Western personality psychologists, have recently been reconceptualized as endogenous basic tendencies that, within a cultural context, give rise to habits, attitudes, skills, beliefs, and other characteristic adaptations. This conceptualization provides a new framework for studying personality and culture at three levels. Transcultural research focuses on identifying human universals, such as trait structure and development; intracultural studies examine the unique expression of traits in specific cultures; and intercultural research characterizes cultures and their subgroups in terms of mean levels of personality traits and seeks associations between cultural variables and aggregate personality traits. As an example of the problems and possibilities of intercultural analyses, data on mean levels of Revised NEO Personality Inventory scales from college-age and adult samples (N = 23,031) of men and women from 26 cultures are examined. Results showed that age and gender differences resembled those found in American samples; different subsamples from each culture showed similar levels of personality traits; intercultural factor analysis yielded a close approximation to the Five-Factor Model; and factor scores were meaningfully related to other culture-level variables. However, mean trait levels were not apparent to expert raters, casting doubt on the accuracy of national stereotypes. Trait psychology can serve as a useful complement to cultural perspectives on human nature and personality. [source]
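The intercultural analysis described above aggregates individual scores into per-culture mean trait levels and then factor-analyzes the culture-level matrix. The sketch below illustrates that aggregation step and a principal-components approximation to culture-level factoring; the data are synthetic random draws, not NEO-PI-R scores, and the 26-culture-by-5-trait shape is borrowed from the abstract purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in: individual T-scores for 5 trait domains,
# 100 respondents in each of 26 hypothetical cultures.
n_cultures, n_per_culture, n_traits = 26, 100, 5
scores = rng.normal(50, 10, size=(n_cultures, n_per_culture, n_traits))

# Step 1: aggregate to culture-level mean trait profiles (26 x 5).
culture_means = scores.mean(axis=1)

# Step 2: culture-level ("ecological") factoring via PCA of the
# correlation matrix of standardized culture means.
corr = np.corrcoef(culture_means, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)

# Loadings of the strongest culture-level component.
loadings = eigvecs[:, -1] * np.sqrt(eigvals[-1])
print(culture_means.shape, eigvals.sum())
```

With real aggregate trait data, the eigenstructure of this culture-level correlation matrix is what one would compare against the individual-level Five-Factor structure; with random input, as here, only the mechanics of the procedure are meaningful.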


After Bristol: the healthcare of young children and the law

LEGAL STUDIES, Issue 2 2003
Jo Bridgeman
This paper considers the written statements provided to the Bristol Inquiry by parents whose children underwent cardiac surgery at the Bristol Royal Infirmary between 1984 and 1995, seeking to learn from their experiences, opinions, feelings and expectations. The law regulating the relationship between healthcare professional, parent and child is considered in light of these accounts. The limitations of the existing law are such that a new legal framework is required, one which fosters the relationship between healthcare professional, parent and child, supporting them in the shared endeavour of caring for the child. Of central importance within this new framework would be recognition of each child as a distinct individual and of the expertise which parents can contribute to the care of their child. [source]