Underlying Concepts (underlying + concept)
Selected Abstracts

'Best Practice' Options for the Legal Recognition of Customary Tenure
DEVELOPMENT AND CHANGE, Issue 3 2005
Daniel Fitzpatrick
Is there a 'best practice' model for the legal recognition of customary tenure? If not, is it possible to identify the circumstances in which a particular model would be most appropriate? This article considers these questions in the light of economic theories of property rights, particularly as illustrated by the World Bank's 2003 land policy report. While these theories have their flaws, the underlying concept of tenure security allows a typological framework for developing legal responses to customary tenure. In particular, this article suggests that the nature and degree of State legal intervention in a customary land system should be determined by reference to the nature and causes of any tenure insecurity. This hypothesis is discussed by reference to a wide variety of legal examples from Africa, Papua New Guinea and the South Pacific. The objective is not to suggest that law determines resource governance outcomes in pluralist normative environments, but to improve the quality of legal interventions in order to assist customary groups to negotiate better forms of tenure security and access to resources. [source]

Trust in Nurses Scale: construct validity and internal reliability evaluation
JOURNAL OF ADVANCED NURSING, Issue 3 2010
Laurel E. Radwin
Radwin L.E. & Cabral H.J. (2010) Trust in Nurses Scale: construct validity and internal reliability evaluation. Journal of Advanced Nursing 66(3), 683-689.
Abstract
Aim: This paper is a report of the continued psychometric evaluation of the Trust in Nurses Scale.
Background: Qualitative analyses indicate that trust in nurses is critically important to adult patients. Instruments that distinctively measure this concept are lacking. A middle-range theory of patient-centred nursing care provided the theoretical basis for the Trust in Nurses Scale.
Content validity was assessed by an expert panel and patient interviews. Construct validity and reliability were found acceptable using multi-trait/multi-item analysis techniques. These findings were previously reported.
Methods: Construct validity and reliability of the Trust in Nurses Scale were assessed in 2007 using data collected during 2004-2005 from 187 hospitalized patients in a haematology-oncology setting. Trust in nurses (the latent factor) was operationalized by five items (manifest variables) using confirmatory factor analyses. Fit statistics included the comparative fit index, Tucker-Lewis index, root mean square error of approximation and standardized root mean square residual. Internal consistency reliability was assessed using coefficient alpha.
Findings: Both a five-item and a four-item version demonstrate acceptable psychometric properties. The five-item version met three fit statistics criteria, and fifty-nine per cent of the variance was explained. The four-item version met all fit statistics criteria, and sixty-six per cent of the variance was explained. Acceptable internal consistency reliability was found for both versions.
Conclusion: Previous psychometric testing of the Trust in Nurses Scale provided evidence of the instrument's reliability, content validity and construct validity. The analyses presented here further support construct validity. Thus, the cumulative findings indicate that the instrument measures the underlying concept of trust with only a few items. [source]

Using effectiveness studies for prescribing research, part 1
JOURNAL OF CLINICAL PHARMACY & THERAPEUTICS, Issue 5 2002
N. Freemantle PhD
Summary
The process of evaluating pharmaceuticals has become highly conceptualized, in contrast to the lack of formal rules for assessing the effects of interventions on practice. We argue that clinical audit is a key factor prior to instigating an intervention and that randomized controlled evaluations are preferable.
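The internal-consistency check named in the Trust in Nurses Scale methods above (coefficient alpha over a small set of Likert items) can be sketched as follows. The responses here are simulated stand-ins, not the study's data, so the resulting value is purely illustrative.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Coefficient alpha for an (n_respondents, n_items) score matrix.

    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)
    """
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Simulated 5-item Likert responses (1-5) for 187 patients, with a shared
# latent "trust" factor so the items correlate (hypothetical data).
rng = np.random.default_rng(0)
trust = rng.normal(0, 1, size=(187, 1))
scores = np.clip(np.round(3 + trust + rng.normal(0, 0.8, size=(187, 5))), 1, 5)
print(f"alpha = {cronbach_alpha(scores):.2f}")
```

A high alpha here only reflects the correlation built into the simulated items; with real scale data the same function applies unchanged.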
We discuss the need for small-scale experiments prior to full trials to validate the underlying concept of an intervention, with the recognition that different approaches may be necessary. This includes open rather than blind assessments, and greater emphasis on qualitative issues during the development of interventions followed by quantitative appraisal of their impact. [source]

The evolution of trade-offs: where are we?
JOURNAL OF EVOLUTIONARY BIOLOGY, Issue 2 2007
D. A. ROFF
Abstract
Trade-offs are a core component of many evolutionary models, particularly those dealing with the evolution of life histories. In the present paper, we identify four topics of key importance for studies of the evolutionary biology of trade-offs. First, we consider the underlying concept of 'constraint'. We conclude that this term is typically used too vaguely and suggest that 'constraint' in the sense of a bias should be clearly distinguished from 'constraint' in the sense of proscribed combinations of traits or evolutionary trajectories. Secondly, we address the utility of the acquisition-allocation model (the 'Y-model'). We find that, whereas this model and its derivatives have provided new insights, a misunderstanding of the pivotal equation has led to incorrect predictions and faulty tests. Thirdly, we ask how trade-offs are expected to evolve under directional selection. A quantitative genetic model predicts that, under weak or short-term selection, the intercept will change but the slope will remain constant. Two empirical tests support this prediction, but these are based on comparisons of geographic populations: more direct tests will come from artificial selection experiments. Finally, we discuss what maintains variation in trade-offs, noting that at present little attention has been given to this question. We distinguish between phenotypic and genetic variation and suggest that the latter is most in need of explanation.
We suggest that four factors deserving investigation are mutation-selection balance, antagonistic pleiotropy, correlational selection and spatio-temporal variation, but, as in the other areas of research on trade-offs, empirical generalizations are impeded by a lack of data. Although this lack is discouraging, we suggest that it provides rich ground for further study and the integration of many disciplines, including the emerging field of genomics. [source]

Complementing Mass Customization Toolkits with User Communities: How Peer Input Improves Customer Self-Design
THE JOURNAL OF PRODUCT INNOVATION MANAGEMENT, Issue 6 2008
Nikolaus Franke
In this paper, the authors propose that the canonical customer-toolkit dyad in mass customization (MC) should be complemented with user communities. Many companies in various industries have begun to offer their customers the opportunity to design their own products online. The companies provide Web-based MC toolkits that allow customers who prefer individualized products to tailor items such as sneakers, personal computers (PCs), cars, kitchens, cereals, or skis to their specific preferences. Most existing MC toolkits are based on the underlying concept of an isolated, dyadic interaction process between the individual customer and the MC toolkit; information from external sources is not provided. As a result, most academic research on MC toolkits has focused on this dyadic perspective. The main premise of this paper is that novice MC toolkit users in particular might benefit greatly from information given by other customers. Pioneering research shows that customers in the computer gaming and digital music instruments industries are willing to support each other for the sake of efficient toolkit use (e.g., how certain toolkit functions work).
Expanding on that work, the present paper provides evidence that peer assistance also appears extremely useful in the two other major phases of the customer's individual self-design process, namely, the development of an initial idea and the evaluation of a preliminary design solution. Two controlled experiments were conducted in which 191 subjects used an MC toolkit to design their own individual skis. The authors found that, during the phase of developing an initial idea, having access to other users' designs as potential starting points stimulates the integration of existing solution chunks into the problem-solving process, which indicates more systematic problem-solving behavior. Peer customer input also turned out to have positive effects on the evaluation of preliminary design solutions. Providing other customers' opinions on interim design solutions stimulated favorable problem-solving behavior, namely, the integration of external feedback. The use of these two problem-solving heuristics in turn leads to an improved process outcome, that is, self-designed products that meet the preferences of the customers more effectively (measured in terms of perceived preference fit, purchase intention, and willingness to pay). These findings have important theoretical and managerial implications. [source]

Using the Malcolm Baldrige National Quality Award in Teaching: One Criteria, Several Perspectives
DECISION SCIENCES JOURNAL OF INNOVATIVE EDUCATION, Issue 2 2004
James A. Belohlav
ABSTRACT
The Malcolm Baldrige National Quality Award (MBNQA) has influenced the thinking and operations within organizations from all sectors of the American economy. This paper presents the experiences of three faculty members who have used the Criteria for Performance Excellence and the underlying concepts of the MBNQA to enhance the learning experiences of their students.
The authors discuss how Dale's Cone of Experience is employed, by means of concrete exercises and experiences, to better leverage the students' ability to understand abstract concepts. Formal end-of-term student evaluations indicate that the described approach has led to a higher level of student engagement in the learning process, as evidenced by more abundant and higher-quality feedback to the instructors. [source]

A view from the bridge: agreement between the SF-6D utility algorithm and the Health Utilities Index
HEALTH ECONOMICS, Issue 11 2003
Bernie J. O'Brien
Abstract
Background: The SF-6D is a new health state classification and utility scoring system based on 6 dimensions ('6D') of the Short Form 36, and permits a "bridging" transformation between SF-36 responses and utilities. The Health Utilities Index, mark 3 (HUI3) is a valid and reliable multi-attribute health utility scale that is widely used. We assessed within-subject agreement between SF-6D utilities and those from the HUI3.
Methods: Patients at increased risk of sudden cardiac death and participating in a randomized trial of implantable defibrillator therapy completed both instruments at baseline. Score distributions were inspected by scatterplot and histogram, and mean score differences were compared by paired t-test. Pearson correlation was computed between instrument scores and also between dimension scores within instruments. Between-instrument agreement was assessed by intra-class correlation coefficient (ICC).
Results: SF-6D and HUI3 forms were available from 246 patients. Mean scores for HUI3 and SF-6D were 0.61 (95% CI 0.60-0.63) and 0.58 (95% CI 0.54-0.62) respectively; a difference of 0.03 (p<0.03). Score intervals for HUI3 and SF-6D were (-0.21 to 1.0) and (0.30 to 0.95). Correlation between the instrument scores was 0.58 (95% CI 0.48-0.68) and agreement by ICC was 0.42 (95% CI 0.31-0.52). Correlations between dimensions of the SF-6D were higher than for the HUI3.
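The agreement analysis described above (paired mean difference, Pearson correlation between instrument scores, and a two-way intra-class correlation) can be sketched in outline. The utility scores below are simulated stand-ins for two imperfectly agreeing instruments, not the trial data, so all numbers are illustrative only.

```python
import numpy as np

def icc_2_1(y: np.ndarray) -> float:
    """ICC(2,1): two-way random effects, absolute agreement, single measures."""
    n, k = y.shape
    grand = y.mean()
    msr = k * ((y.mean(axis=1) - grand) ** 2).sum() / (n - 1)  # subjects
    msc = n * ((y.mean(axis=0) - grand) ** 2).sum() / (k - 1)  # instruments
    sse = ((y - grand) ** 2).sum() - msr * (n - 1) - msc * (k - 1)
    mse = sse / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Simulated utilities for 246 patients on two instruments with a small
# systematic offset (hypothetical values, clipped to each scale's range).
rng = np.random.default_rng(1)
truth = rng.uniform(0.3, 0.9, 246)
hui3 = np.clip(truth + rng.normal(0, 0.10, 246), -0.21, 1.0)
sf6d = np.clip(truth - 0.03 + rng.normal(0, 0.10, 246), 0.30, 0.95)

d = hui3 - sf6d
t_stat = d.mean() / (d.std(ddof=1) / np.sqrt(len(d)))  # paired t statistic
r = np.corrcoef(hui3, sf6d)[0, 1]                      # Pearson correlation
icc = icc_2_1(np.column_stack([hui3, sf6d]))
print(f"mean diff={d.mean():.3f}  t={t_stat:.2f}  r={r:.2f}  ICC={icc:.2f}")
```

Note the pattern the abstract reports: correlation can look respectable while the agreement ICC, which also penalizes systematic offsets between instruments, comes out lower.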
Conclusions: Our study casts doubt on whether utilities and QALYs estimated via the SF-6D are comparable with those from the HUI3. Utility differences may be due to differences in the underlying concepts of health being measured, or different measurement approaches, or both. No gold standard exists for utility measurement, and the SF-6D is a valuable addition that permits SF-36 data to be transformed into utilities to estimate QALYs. The challenge is developing a better understanding of why these classification-based utility instruments differ so markedly in their distributions and point estimates of derived utilities. Copyright © 2003 John Wiley & Sons, Ltd. [source]

Design spaces, measures and metrics for evaluating quality of time operators and consequences leading to improved algorithms by design - illustration to structural dynamics
INTERNATIONAL JOURNAL FOR NUMERICAL METHODS IN ENGINEERING, Issue 14 2005
X. Zhou
Abstract
For the first time, for time discretized operators, we describe and articulate the importance and notion of design spaces and algorithmic measures that not only can provide new avenues for improved algorithms by design, but also can distinguish, in general, the quality of computational algorithms for time-dependent problems; the particular emphasis is on structural dynamics applications for the purpose of illustration and demonstration of the basic concepts (the underlying concepts can be extended to other disciplines as well). For further developments in time discretized operators and/or for evaluating existing methods, from the established measures for computational algorithms, the conclusion is drawn that the most effective computational algorithm (in the sense of convergence, namely, stability and accuracy, and complexity, namely, the algorithmic formulation and algorithmic structure) should appear in a certain algorithmic structure of the design space amongst comparable algorithms.
With this conclusion, and also with the notion of providing new avenues leading to improved algorithms by design, as an illustration, a novel computational algorithm which departs from the traditional paradigm (in the sense of the LMS methods with which we are most familiar and which are widely used in commercial software) is designed into the perspective design space representation of comparable algorithms, and is termed here the forward displacement non-linearly explicit L-stable (FDEL) algorithm, which is unconditionally consistent and does not require non-linear iterations within each time step. From the established measures for comparable algorithms, simply for illustration purposes, the resulting design of the FDEL formulation is then compared with the commonly advocated explicit central difference method and the implicit Newmark average acceleration method (alternately, the same conclusion holds true against controllable numerically dissipative algorithms), which pertain to the class of linear multi-step (LMS) methods, for assessing both linear and non-linear dynamic cases. Through rigorous numerical experiments, the conclusion is finally drawn that the proposed new design of the FDEL algorithm, which is a direct consequence of the present notion of design spaces and measures, is the most effective algorithm to date, to our knowledge, in comparison to the class of second-order accurate algorithms pertaining to LMS methods for routine and general non-linear dynamic situations. Copyright © 2005 John Wiley & Sons, Ltd. [source]

Characterization of tissue structure at varying length scales using temporal diffusion spectroscopy
NMR IN BIOMEDICINE, Issue 7 2010
John C. Gore
Abstract
The concepts, theoretical behavior and experimental applications of temporal diffusion spectroscopy are reviewed and illustrated.
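The two LMS baselines named in the structural dynamics abstract above, the explicit central difference method and the implicit Newmark average acceleration method, can be sketched for an undamped linear single-degree-of-freedom oscillator u'' + omega^2 u = 0. This is an illustrative comparison of those standard baselines only; the FDEL algorithm itself is constructed in the article and is not reproduced here.

```python
import numpy as np

def central_difference(omega, dt, n_steps, u0=1.0, v0=0.0):
    """Explicit central difference: conditionally stable (needs omega*dt < 2)."""
    a0 = -omega**2 * u0
    u_prev = u0 - dt * v0 + 0.5 * dt**2 * a0   # fictitious starting step u_{-1}
    u = u0
    for _ in range(n_steps):
        u_next = 2 * u - u_prev + dt**2 * (-omega**2 * u)
        u_prev, u = u, u_next
    return u

def newmark_average_acceleration(omega, dt, n_steps, u0=1.0, v0=0.0):
    """Implicit Newmark (beta=1/4, gamma=1/2): unconditionally stable when linear."""
    u, v, a = u0, v0, -omega**2 * u0
    for _ in range(n_steps):
        rhs = u + dt * v + 0.25 * dt**2 * a       # displacement predictor
        u_new = rhs / (1 + 0.25 * (omega * dt)**2)  # solve the implicit update
        a_new = -omega**2 * u_new
        v = v + 0.5 * dt * (a + a_new)
        u, a = u_new, a_new
    return u

omega = 2 * np.pi          # 1 Hz oscillator, exact solution u(t) = cos(omega*t)
dt, steps = 0.001, 1000    # integrate one full period, to t = 1 s
for name, f in [("central difference", central_difference),
                ("Newmark avg accel", newmark_average_acceleration)]:
    print(f"{name}: u(1) = {f(omega, dt, steps):.4f}  (exact 1.0000)")
```

With a large step (omega*dt above 2) the explicit scheme diverges while the implicit one stays bounded, which is the stability distinction the abstract's design-space measures formalize.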
Temporal diffusion spectra are obtained using oscillating-gradient waveforms in diffusion-weighted measurements, and represent the manner in which various spectral components of molecular velocity correlations vary in different geometrical structures that restrict or hinder free movement. Measurements made at different gradient frequencies reveal information on the scale of restrictions or hindrances to free diffusion, and the shape of a spectrum reveals the relative contributions of spatial restrictions at different distance scales. Such spectra differ from other so-called diffusion spectra, which depict spatial frequencies and are defined at a fixed diffusion time. Experimentally, oscillating gradients at moderate frequency are more feasible for exploring restrictions at very short distances which, in tissues, correspond to structures smaller than cells. We describe the underlying concepts of temporal diffusion spectra and provide analytical expressions for the behavior of the diffusion coefficient as a function of gradient frequency in simple geometries with different dimensions. Diffusion in more complex model media that mimic tissues has been simulated using numerical methods. Experimental measurements of diffusion spectra have been obtained in suspensions of particles and cells, as well as in vivo in intact animals. An observation of particular interest is the increased contrast and heterogeneity observed in tumors using oscillating gradients at moderate frequency compared with conventional pulsed-gradient methods, and the potential for detecting changes in tumors early in their response to treatment. Computer simulations suggest that diffusion spectral measurements may be sensitive to intracellular structures, such as nuclear size, and that changes in tissue diffusion properties may be measured before there are changes in cell density. Copyright © 2010 John Wiley & Sons, Ltd.
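The core idea above, that short diffusion times (probed by high gradient frequencies) recover the free diffusivity while long diffusion times feel the restrictions, can be illustrated with a minimal 1D random walk between reflecting barriers. This is a generic sketch, not the article's simulations: the diffusivity and barrier spacing below are assumed typical values.

```python
import numpy as np

def apparent_D(D_free, barrier_spacing, t_diff, n_walkers=20000, n_steps=200, seed=2):
    """Apparent diffusion coefficient <x^2>/(2t) for 1D walkers between
    reflecting walls at 0 and barrier_spacing."""
    rng = np.random.default_rng(seed)
    dt = t_diff / n_steps
    x0 = rng.uniform(0, barrier_spacing, n_walkers)  # start uniformly inside
    x = x0.copy()
    for _ in range(n_steps):
        x += rng.normal(0, np.sqrt(2 * D_free * dt), n_walkers)
        x = np.abs(x)                                 # reflect off wall at 0
        x = barrier_spacing - np.abs(barrier_spacing - x)  # reflect off far wall
    return ((x - x0) ** 2).mean() / (2 * t_diff)

D = 2.0e-9   # assumed free water diffusivity, m^2/s
L = 5e-6     # assumed 5 micron barrier spacing (sub-cellular scale restriction)
for t in (1e-4, 1e-3, 1e-2):   # short to long diffusion times
    print(f"t = {t:.0e} s  ->  D_app = {apparent_D(D, L, t):.2e} m^2/s")
```

At the shortest time few walkers reach a wall, so D_app stays near the free value; at the longest time displacements saturate at the barrier spacing and D_app falls well below it, which is the restriction signature the spectra encode versus frequency.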
[source]

Social aspect of sustainable packaging
PACKAGING TECHNOLOGY AND SCIENCE, Issue 6 2010
Norbisimi Nordin
Abstract
Sustainability is one of the 'buzz' words widely discussed in the area of packaging nowadays. For many product manufacturing businesses, the incorporation of sustainability principles into their practice can only be made visible to others in the end product through its packaging. Beyond the criteria, underlying concepts and principles, most discussions of how to achieve the goals of sustainable packaging focus on the details of the models and practices adopted by the industry, and on the effectiveness and practicality of these practices in balancing economic profits and environmental benefits. While the economic and environmental bases of packaging sustainability have been examined and discussed in great detail, the same is not true of social considerations. Although the success of sustainable packaging development relies on both technological development and social considerations, many of the social aspects of sustainable packaging are often overlooked. Similarly, although many companies have put effort and initiative into elevating sustainability from an abstract goal into an immediate priority, relatively little is known about consumers' insight into packaging sustainability. Recognizing consumers as the final arbiters of the success of sustainable packaging, this paper explores consumers' perceptions of the sustainable packaging concept and of its impact on the environment, and discusses the factors that drive consumers' preferences and purchase decisions. The discussion and information gathered in this paper are intended to stimulate understanding of the importance of the social dimension of packaging sustainability and its role in supporting efforts to improve sustainability practice. Copyright © 2010 John Wiley & Sons, Ltd.
[source]

Peptide and protein quantification: A map of the minefield
PROTEINS: STRUCTURE, FUNCTION AND BIOINFORMATICS, Issue 4 2010
Marc Vaudel
Abstract
The increasing popularity of gel-free proteomics technologies has created a strong demand for compatible quantitative analysis methods. As a result, a plethora of different techniques has been proposed for performing gel-free quantitative analysis of proteomics samples. Each of these methods comes with certain strengths and shortcomings, and they are often dedicated to a specific purpose. This review presents a brief overview of the main methods, organized by their underlying concepts, and discusses the issues they raise, with a focus on data processing. Finally, we list the available software that can help with the data processing of quantitative experiments. We hope that this review will enable researchers to find the most appropriate method for their research objectives, and that it can also serve as a basis for creating a reliable data processing strategy. [source]