Theoretical Background
Selected Abstracts

I. Introduction, Theoretical Background, and Prior Research
MONOGRAPHS OF THE SOCIETY FOR RESEARCH IN CHILD DEVELOPMENT, Issue 4 2002
Melanie Killen
First page of article [source]

Organizational entrepreneurship: Theoretical background, some empirical tests, and directions for future research
HUMAN FACTORS AND ERGONOMICS IN MANUFACTURING & SERVICE INDUSTRIES, Issue 1 2005
Mariusz Bratnicki
The widely held contemporary perspective on entrepreneurship is dangerously premature. Currently existing theories are insufficient to understand the dynamic interplay among entrepreneurship, the state, and external environmental forces, as well as the organization's capacity to facilitate entrepreneurship and the resulting effectiveness. In this exploratory paper I identify organizational architecture variables that help to shape a firm's entrepreneurship, and introduce the topic of organizational enablers. The primary purpose of the investigation is to explore the potential of dialectics for entrepreneurship theory and for a development concept that refers to strategic contradictions, organizational enablers, and entrepreneurial behaviors. In particular, it focuses on understanding the organizational context of entrepreneurship and the entrepreneurial reconciliation made by managers who seek to increase their company's growth. I investigate empirically how the reconciliation of primary and secondary contradictions is related to entrepreneurial behaviors. The in-depth examination of organizational enablers and entrepreneurial behavior is only one example of how a dialectical approach can reshape our understanding of the complex, multilevel entrepreneurship process, which may have less to do with the behavior of individual members than with impersonal and seemingly insignificant forces. Finally, implications for future research are discussed. © 2005 Wiley Periodicals, Inc. Hum Factors Man 15: 15–33, 2005.
[source]

The multi-chain Markov switching model
JOURNAL OF FORECASTING, Issue 7 2005
Edoardo Otranto
Abstract: In many real phenomena the behaviour of a variable subject to different regimes depends on the state of other variables, or of the same variable observed in other subjects, so knowledge of the state of the latter can be important for forecasting the state of the former. In this paper a particular multivariate Markov switching model is developed to represent this case. The transition probabilities of this model are characterized by their dependence on the regimes of the other variables. The estimated transition probabilities provide useful information for forecasting the regimes of the variables analysed. The theoretical background and an application are shown. Copyright © 2005 John Wiley & Sons, Ltd. [source]

A graphical generalized implementation of SENSE reconstruction using Matlab
CONCEPTS IN MAGNETIC RESONANCE, Issue 3 2010
Hammad Omer
Abstract: Parallel acquisition of Magnetic Resonance Imaging (MRI) data has the potential to significantly reduce the scan time. SENSE is one of many techniques for the reconstruction of parallel MRI images. A generalized algorithm for SENSE reconstruction and its theoretical background are presented. This algorithm can be used for SENSE reconstruction at any acceleration factor between 2 and 8, for either phase-encode direction (horizontal or vertical), with or without regularization; the user can select a particular type of regularization. A GUI-based implementation of the algorithm is also given. Signal-to-noise ratio, artefact power, and the g-factor map are used to quantify the quality of reconstruction. The effects of different acceleration factors on these parameters are also discussed.
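The unfolding step at the heart of SENSE, a small least-squares solve per set of aliased pixels using the coil sensitivity maps, and the g-factor that quantifies the resulting noise amplification, can be sketched briefly. This is an illustrative numpy sketch with synthetic sensitivities, not the paper's Matlab GUI; the sensitivity and pixel values are invented.

```python
import numpy as np

rng = np.random.default_rng(0)
n_coils, R = 4, 2  # four receive coils, acceleration factor R = 2

# Hypothetical complex coil sensitivities at the R pixel positions that
# fold onto one another when only every R-th phase-encode line is acquired.
S = rng.normal(size=(n_coils, R)) + 1j * rng.normal(size=(n_coils, R))
rho_true = np.array([1.5 + 0.2j, 0.7 - 0.4j])  # true pixel values

a = S @ rho_true  # aliased coil signals observed at one folded location

# SENSE unfolding: least-squares solution of S @ rho = a
rho_hat, *_ = np.linalg.lstsq(S, a, rcond=None)

# g-factor for these pixels (identity noise covariance assumed):
# g_i = sqrt([(S^H S)^-1]_ii * [S^H S]_ii), which is always >= 1
SHS = S.conj().T @ S
g = np.sqrt(np.real(np.diag(np.linalg.inv(SHS)) * np.diag(SHS)))
```

In this noiseless sketch the unfolded values match the true pixels exactly; with noise, the unfolded SNR drops by roughly g times the square root of R relative to the full acquisition, which is why a g-factor map is a natural reconstruction quality measure.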
The GUI-based implementation of SENSE reconstruction provides easy selection of the various parameters needed for reconstruction of parallel MRI images and helps in efficient reconstruction and analysis of reconstruction quality. © 2010 Wiley Periodicals, Inc. Concepts Magn Reson Part A 36A: 178–186, 2010. [source]

NMR studies of chiral organic compounds in non-isotropic phases
CONCEPTS IN MAGNETIC RESONANCE, Issue 3 2008
Marek J. Potrzebowski
Abstract: In this article, new applications and perspectives of one- and two-dimensional NMR spectroscopy for the study of chiral organic compounds in non-isotropic phases (solid state and liquid crystals) are presented. The review is organized into five sections. The first part gives the theoretical background and a short introduction to solid-state NMR. The second part presents how the NMR isotropic chemical shift can be used to distinguish racemates from enantiomers. The third section discusses the power of the ODESSA pulse sequence for investigating racemates and enantiomers and for establishing enantiomeric excess. The fourth part shows how analysis of the principal elements of chemical shift tensors, obtained by means of 2D NMR techniques, can be applied to determine absolute configuration and conformational changes in the solid state. The final part presents recent achievements of chiral liquid crystal NMR methodology for the study of chiral compounds. © 2008 Wiley Periodicals, Inc. Concepts Magn Reson Part A 32A: 201–218, 2008. [source]

Historical review of sample preparation for chromatographic bioanalysis: pros and cons
DRUG DEVELOPMENT RESEARCH, Issue 3 2007
Min S. Chang
Abstract: Sample preparation is a major task in a regulated bioanalytical laboratory. The sample preparation procedure significantly impacts assay throughput, data quality, analysis cost, and employee satisfaction. Selecting and optimizing an appropriate sample preparation method is therefore essential for successful method development.
Because of our recent expertise, this article focuses on sample preparation for high-performance liquid chromatography with mass spectrometric detection (LC-MS), the most common detection technique for small molecules used in regulated bioanalytical laboratories. The sample preparation technologies discussed are pre-extraction and post-extraction sample processing, protein precipitation (PPT), liquid–liquid extraction (LLE), offline solid-phase extraction (SPE), and online solid-phase extraction. Since all of these techniques have been in use for more than two decades, numerous applications and variations exist for each. Rather than attempt to categorize each variation, we present the development history, a brief theoretical background, and selected references for each technique. The strengths and limitations of each method are discussed, including its throughput improvement potential. Where available, illustrations from presentations at various meetings by our laboratory are used to clarify our opinion. Drug Dev Res 68: 107–133, 2007. © 2007 Wiley-Liss, Inc. [source]

Active tendon control of cable-stayed bridges: a large-scale demonstration
EARTHQUAKE ENGINEERING AND STRUCTURAL DYNAMICS, Issue 7 2001
Frédéric Bossens
This paper presents a strategy for active damping of cable structures using active tendons. The first part of the paper summarizes the theoretical background: the control law is briefly presented together with the main results of an approximate linear theory, which allows the prediction of closed-loop poles with a root locus technique. The second part of the paper reports on experimental results obtained with two test structures. The first is a small-scale mock-up representative of a cable-stayed bridge during the construction phase, on which control of the parametric vibration of passive cables due to deck vibration is demonstrated.
The second is a 30 m long mock-up built on the reaction wall of the ELSA test facility at the JRC Ispra (Italy); this structure is used to demonstrate the practical implementation of the control strategy with hydraulic actuators. Copyright © 2001 John Wiley & Sons, Ltd. [source]

Guidelines on routine cerebrospinal fluid analysis. Report from an EFNS task force
EUROPEAN JOURNAL OF NEUROLOGY, Issue 9 2006
A great variety of neurological diseases require investigation of cerebrospinal fluid (CSF) to prove the diagnosis or to rule out relevant differential diagnoses. The objectives were to evaluate the theoretical background and provide guidelines for clinical use of routine CSF analysis, including total protein, albumin, immunoglobulins, glucose, lactate, cell count, cytological staining, and investigation of infectious CSF. The methods included a systematic Medline search for the above-mentioned variables and review of appropriate publications by one or more of the task force members; grading of evidence and recommendations was based on consensus by all task force members. It is recommended that CSF be analysed immediately after collection. If storage is needed, 12 ml of CSF should be partitioned into three to four sterile tubes. The albumin CSF/serum ratio (Qalb) should be preferred to total protein measurement, and the normal upper limit should be related to the patient's age. An elevated Qalb is a non-specific finding but occurs mainly in bacterial, cryptococcal, and tuberculous meningitis, leptomeningeal metastases, and acute and chronic demyelinating polyneuropathies. A pathological decrease of the CSF/serum glucose ratio or an increased lactate concentration indicates bacterial or fungal meningitis or leptomeningeal metastases. Intrathecal immunoglobulin G synthesis is best demonstrated by isoelectric focusing followed by specific staining.
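The recommended albumin quotient follows directly from its definition; the age-related upper limit in the sketch below uses the widely cited Reiber-style approximation Qalb_lim = (4 + age/15) x 10^-3, which is an assumption added here for illustration and not a value given in this abstract.

```python
def qalb(albumin_csf_mg_per_l: float, albumin_serum_g_per_l: float) -> float:
    """CSF/serum albumin quotient (dimensionless; serum converted to mg/L)."""
    return albumin_csf_mg_per_l / (albumin_serum_g_per_l * 1000.0)

def qalb_upper_limit(age_years: float) -> float:
    """Age-related upper reference limit, (4 + age/15) * 1e-3.
    Assumed Reiber-style formula, for illustration only."""
    return (4.0 + age_years / 15.0) * 1e-3

# Example: CSF albumin 300 mg/L, serum albumin 40 g/L, 60-year-old patient
q = qalb(300.0, 40.0)           # 0.0075
limit = qalb_upper_limit(60.0)  # 0.008
within_reference = q <= limit
```

The point of the age dependence is simply that a Qalb acceptable in an elderly patient may already be pathological in a young one.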
Cellular morphology (cytological staining) should be evaluated whenever pleocytosis is found or leptomeningeal metastases or pathological bleeding is suspected. Computed tomography-negative intrathecal bleeding should be investigated by bilirubin detection. [source]

Discussion on 'Personality psychology as a truly behavioural science' by R. Michael Furr
EUROPEAN JOURNAL OF PERSONALITY, Issue 5 2009
Article first published online: 14 JUL 200

Yes We Can! A Plea for Direct Behavioural Observation in Personality Research
MITJA D. BACK and BORIS EGLOFF, Department of Psychology, Johannes Gutenberg University Mainz, Germany (mback@uni-leipzig.de)
Furr's target paper (this issue) aims to enhance the standing of personality psychology as a truly behavioural science. We wholeheartedly agree with this goal. In our comment we argue for more specific and ambitious requirements for behavioural personality research. Specifically, we show why behaviour should be observed directly. Moreover, we illustratively describe potentially interesting approaches in behavioural personality research: lens model analyses, the observation of multiple behaviours in diverse experimentally created situations, and the observation of behaviour in real life. Copyright © 2009 John Wiley & Sons, Ltd.

The Categories of Behaviour Should Be Clearly Defined
PETER BORKENAU, Department of Psychology, Martin-Luther University Halle-Wittenberg, Germany (p.borkenau@psych.uni-halle.de)
The target paper is helpful in clarifying the terminology as well as the strengths and weaknesses of several approaches to collecting behavioural data. Insufficiently considered, however, is the clarity of the categories used for the coding of behaviour. Evidence is reported showing that interjudge agreement for retrospective and even concurrent codings of behaviour does not exceed interjudge agreement for personality traits if the categories used for the coding of behaviour are not clearly defined.
By contrast, if the behaviour to be registered is unambiguously defined, interjudge agreement may be almost perfect. Copyright © 2009 John Wiley & Sons, Ltd.

Behaviour Functions in Personality Psychology
PHILIP J. CORR, Department of Psychology, Faculty of Social Sciences, University of East Anglia, Norwich, UK (Philip.Corr@btopenworld.com)
Furr's target paper highlights the importance, yet under-representation, of behaviour in published articles in personality psychology. Whilst agreeing with most of his points, I remain unclear as to how behaviour (as specifically defined by Furr) relates to other forms of psychological data (e.g. cognitive task performance). In addition, it is not clear how the functions of behaviour are to be decided: different behaviours may serve the same function, and identical behaviours may serve different functions. To clarify these points, the methodological and theoretical aspects of Furr's proposal would benefit from delineation. Copyright © 2009 John Wiley & Sons, Ltd.

On the Difference Between Experience-Sampling Self-Reports and Other Self-Reports
WILLIAM FLEESON, Department of Psychology, Wake Forest University, Winston-Salem, NC, USA (fleesonW@wfu.edu)
Furr's fair but evaluative consideration of the strengths and weaknesses of behavioural assessment methods is a great service to the field. As part of this consideration, Furr makes a subtle and sophisticated distinction between different self-report methods. It is easy to dismiss all self-reports as poor measures, because some are poor. In contrast, Furr points out that the immediacy of the self-reports of behaviour in experience-sampling makes experience-sampling one of the three strongest methods for assessing behaviour. This comment supports his conclusion by arguing that ESM greatly diminishes one of the three major problems afflicting self-reports (lack of knowledge), and that direct observations also suffer from the other two major problems afflicting self-reports.
Copyright © 2009 John Wiley & Sons, Ltd.

What and Where Is 'Behaviour' in Personality Psychology?
LAURA A. KING and JASON TRENT, Department of Psychology, University of Missouri, Columbia, USA (kingla@missouri.edu)
Furr is to be lauded for presenting a coherent and persuasive case for the lack of behavioural data in personality psychology. While agreeing wholeheartedly that personality psychology could benefit from greater inclusion of behavioural variables, here we question two aspects of Furr's analysis: first, his definition of behaviour, and second, his evidence that behaviour is under-appreciated in personality psychology. Copyright © 2009 John Wiley & Sons, Ltd.

Naturalistic Observation of Daily Behaviour in Personality Psychology
MATTHIAS R. MEHL, Department of Psychology, University of Arizona, Tucson, AZ, USA (mehl@email.arizona.edu)
This comment highlights naturalistic observation as a specific method within Furr's (this issue) cluster of direct behavioural observation, and discusses the Electronically Activated Recorder (EAR) as a naturalistic observation sampling method that can be used in relatively large, nomothetic studies. Naturalistic observation with a method such as the EAR can inform researchers' understanding of personality in its relationship to daily behaviour in two important ways: it can help calibrate personality effects against act frequencies of real-world behaviour, and it can provide ecological, behavioural personality criteria that are independent of self-report. Copyright © 2009 John Wiley & Sons, Ltd.

Measuring Behaviour
D. S. MOSKOWITZ and JENNIFER J. RUSSELL, Department of Psychology, McGill University, Montreal, Canada (dsm@psych.mcgill.ca)
Furr (this issue) provides an illuminating comparison of the strengths and weaknesses of various methods for assessing behaviour. In selecting a method for assessing behaviour, there should be a careful analysis of the definition of the behaviour and the purpose of assessment.
This commentary clarifies and expands upon some points concerning the suitability of experience-sampling measures, referred to as Intensive Repeated Measurements in Naturalistic Settings (IRM-NS). IRM-NS measures are particularly useful for constructing measures of differing levels of specificity or generality, for providing individual difference measures that can be associated with multiple layers of contextual variables, and for providing measures capable of reflecting the variability and distributional features of behaviour. Copyright © 2009 John Wiley & Sons, Ltd.

Behaviours, Non-Behaviours and Self-Reports
SAMPO V. PAUNONEN, Department of Psychology, University of Western Ontario, London, Canada (paunonen@uwo.ca)
Furr's (this issue) thoughtful analysis of the contemporary body of research in personality psychology has led him to two conclusions: our science does not do enough to study real, observable behaviours; and, when it does, too often it relies on 'weak' methods based on retrospective self-reports of behaviour. In reply, I note that many researchers are interested in going beyond the study of individual behaviours to the behaviour trends embodied in personality traits, and that the self-report of behaviour, using well-validated personality questionnaires, is often the best measurement option. Copyright © 2009 John Wiley & Sons, Ltd.

An Ethological Perspective on How to Define and Study Behaviour
LARS PENKE, Department of Psychology, The University of Edinburgh, Edinburgh, UK (lars.penke@ed.ac.uk)
While Furr (this issue) makes many important contributions to the study of behaviour, his definition of behaviour is somewhat questionable and also lacks a broader theoretical frame. I provide some historical and theoretical background on the study of behaviour in psychology and biology, from which I conclude that a general definition of behaviour might be out of reach.
However, psychological research can gain from adding a functional perspective on behaviour in the tradition of Tinbergen's four questions, which takes the long-term outcomes and fitness consequences of behaviours into account. Copyright © 2009 John Wiley & Sons, Ltd.

What Is a Behaviour?
MARCO PERUGINI, Faculty of Psychology, University of Milan-Bicocca, Milan, Italy (marco.perugini@unimib.it)
The target paper proposes an interesting framework for classifying behaviour, as well as a convincing plea to use it more often in personality research. However, besides some potential issues in the definition of what constitutes a behaviour, the application of the proposed definition to specific cases is at times inconsistent. I argue that this is because Furr attempts to provide a theory-free definition yet implicitly uses theoretical considerations when applying the definition to specific cases. Copyright © 2009 John Wiley & Sons, Ltd.

Is Personality Really the Study of Behaviour?
MICHAEL D. ROBINSON, Department of Psychology, North Dakota State University, Fargo, ND, USA (Michael.D.Robinson@ndsu.edu)
Furr (this issue) contends that behavioural studies of personality are particularly important, have been under-appreciated, and should be privileged in the future. The present commentary instead suggests that personality psychology has more value as an integrative science than as one that narrowly pursues a behavioural agenda. Cognition, emotion, motivation, the self-concept, and the structure of personality are important topics regardless of their possible links to behaviour. Indeed, the ultimate goal of personality psychology is to understand individual-difference functioning broadly considered, rather than behaviour narrowly considered. Copyright © 2009 John Wiley & Sons, Ltd.
Linking Personality and Behaviour Based on Theory
MANFRED SCHMITT, Department of Psychology, University of Koblenz-Landau, Landau, Germany (schmittm@uni-landau.de)
My comments on Furr's (this issue) target paper 'Personality as a Truly Behavioural Science' are meant to complement his behavioural taxonomy and sharpen some of the presumptions and conclusions of his analysis. First, I argue that the relevance of behaviour for our field depends on how we define personality. Second, I propose that every taxonomy of behaviour should be grounded in theory. The quality of behavioural data depends not only on the validity of the measures we use, but also on how well behavioural data reflect theoretical assumptions about the causal factors and mechanisms that shape behaviour. Third, I suggest that the quality of personality theories, personality research, and behavioural data will profit from ideas about the psychological processes and mechanisms that link personality and behaviour. Copyright © 2009 John Wiley & Sons, Ltd.

The Apparent Objectivity of Behaviour Is Illusory
RYNE A. SHERMAN, CHRISTOPHER S. NAVE and DAVID C. FUNDER, Department of Psychology, University of California, Riverside, CA, USA (funder@ucr.edu)
It is often presumed that objective measures of behaviour (e.g. counts of the number of smiles) are more scientific than more subjective measures of behaviour (e.g. ratings of the degree to which a person behaved in a cheerful manner). We contend that the apparent objectivity of any behavioural measure is illusory. First, the reliability of more subjective measures of behaviour is often strikingly similar to that of so-called objective measures. Further, a growing body of literature suggests that subjective measures of behaviour provide more valid measures of the psychological constructs of interest. Copyright © 2009 John Wiley & Sons, Ltd.

Personality and Behaviour: A Neglected Opportunity?
LIAD UZIEL and ROY F.
BAUMEISTER, Department of Psychology, Florida State University, Tallahassee, FL, USA (Baumeister@psy.fsu.edu)
Personality psychology has neglected the study of behaviour. Furr's efforts to provide a stricter definition of behaviour will not solve the problem, although they may be helpful in other ways. His articulation of various research strategies for studying behaviour will be more helpful in enabling personality psychology to contribute important insights and principles about behaviour. The neglect of behaviour may have roots in how personality psychologists define the mission of their field, but expanding that mission to encompass behaviour would be a positive step. Copyright © 2009 John Wiley & Sons, Ltd. [source]

A hindrance to communication: the use of difficult and incomprehensible language
INTERNATIONAL JOURNAL OF APPLIED LINGUISTICS, Issue 2 2002
Karol Janicki
This paper gives a brief theoretical background to, and reports on, three empirical studies carried out within the theoretical framework of folk linguistics using questionnaire data. The paper is concerned with the layperson's reactions to the use of difficult and incomprehensible language. In study 1, subjects from Norway, Poland, Germany, and the USA were asked to indicate which professional groups exhibit the use of difficult language, and to suggest reasons for that use. In study 2, subjects from Norway, Poland, Germany, and the UK were asked to answer questions concerning the use of incomprehensible language in the academic community. Study 3 was similar to study 2, but recruited only highly comparable subjects from the USA and Poland. The three studies show that the use of difficult and incomprehensible language is perceived by the layperson as a serious sociolinguistic problem. They point to lawyers, politicians, computer specialists, academics, and medical doctors as the heaviest users of such language.
They also show that such language exerts a strongly negative impact on the educational process, and that the educational domain is a large potential field for future research in this respect. [source]

The evolution of, and revolution in, land surface schemes designed for climate models
INTERNATIONAL JOURNAL OF CLIMATOLOGY, Issue 5 2003
A. J. Pitman
Abstract: The land surface is a key component of climate models. It controls the partitioning of available energy at the surface between sensible and latent heat, and the partitioning of available water between evaporation and runoff. The land surface is also the location of the terrestrial carbon sink. Evidence is increasing that the influence of the land surface on climate is significant and that changes in the land surface can influence regional- to global-scale climate on time scales from days to millennia. Further, there is now a suggestion that the terrestrial carbon sink may decrease as global temperatures increase as a consequence of rising CO2 levels. This paper provides the theoretical background that explains why the land surface should play a central role in climate, together with evidence, sourced from climate model experiments, that it does. The paper then reviews the development of land surface models designed for climate models, from the early, very simple schemes through to recent efforts that couple biophysical processes to representations of carbon exchange. Significant problems remain to be addressed, including the difficulties in parameterizing hydrological processes, root processes, sub-grid-scale heterogeneity, and biogeochemical cycles. It is argued that continued development of land surface models requires more multidisciplinary effort by scientists with a wide range of skills.
However, it is also argued that the framework is now in place within the international community to build and maintain the latest generation of land surface models, and there should be considerable optimism that consolidating the recent rapid advances in land surface modelling will enhance our capability to simulate the impacts of land-cover change and of increasing CO2 on the global and regional environment. Copyright © 2003 Royal Meteorological Society [source]

Anterior cingulate dysfunction in geriatric depression
INTERNATIONAL JOURNAL OF GERIATRIC PSYCHIATRY, Issue 4 2008
George S. Alexopoulos
Abstract
Background: Although several brain abnormalities have been identified in geriatric depression, their relationship to the pathophysiological mechanisms leading to the development and perpetuation of this syndrome remains unclear.
Methods: This paper reviews findings on anterior cingulate cortex (ACC) function and on the relationship of ACC abnormalities to the clinical presentation and course of geriatric depression, in order to elucidate the pathophysiological role of the ACC in this disorder.
Results: The ACC is responsible for conflict detection and the emotional evaluation of error, and is connected to brain structures that regulate mood, the emotional valence of thought, and autonomic and visceral responses, all functions disturbed in depression. Geriatric depression is often accompanied by abnormalities in some executive functions and has a clinical presentation consistent with ACC abnormalities. Indices of ACC dysfunction are associated with adverse outcomes of geriatric depression.
Conclusions: Converging findings suggest that at least some ACC functions are abnormal in depression and that these abnormalities are pathophysiologically meaningful.
Indices of ACC dysfunction may be used to identify subgroups of depressed elderly patients with distinct illness courses and treatment needs, and may serve as the theoretical background for novel treatment development. Copyright © 2007 John Wiley & Sons, Ltd. [source]

Space, Boundaries, and the Problem of Order: A View from Systems Theory
INTERNATIONAL POLITICAL SOCIOLOGY, Issue 3 2007
Jan Helmig
The idea that our global polity is chiefly divided among territorially organized nation-states captures contemporary constellations of power and authority only insufficiently. Through a decoupling of power and the state, political spaces no longer match geographical spaces. Instead of simply acknowledging a challenge to the state, there is a need to rethink the changing meaning of space for political processes. The paper identifies three aspects of a reconceptualization of spatial assumptions that IR needs to address: the production of space, the constitutive role of boundaries, and the problem of order. With this contribution, we argue that one avenue towards understanding the production of space, and the questions of order that follow, is to converge systems theory and critical geopolitics. While the latter has already developed a conceptual apparatus for analyzing the production of space, the former comes with an encompassing theoretical background that takes "world society" as the starting point of analysis. In this respect, nation-states are understood as a form of internal differentiation of a wider system, namely world society. [source]

Future Directions for the Teaching and Learning of Statistics at the Tertiary Level
INTERNATIONAL STATISTICAL REVIEW, Issue 1 2001
Des F. Nicholl
Summary: Significant advances in, and the resultant impact of, Information Technology (IT) during the last fifteen years have resulted in a much more data-based society, a trend that can be expected to continue into the foreseeable future.
This phenomenon has had a real impact on the Statistics discipline and will continue to result in changes in both course content and course delivery. Major research directions have also evolved during the last ten years as a direct result of advances in IT, and their impact has started to flow into course content, at least for advanced courses. One question which arises is what impact this will have on the future training of statisticians, with respect to both course content and mode of delivery. At the tertiary level, the last 40 years have seen significant advances in theoretical aspects of the Statistics discipline. Universities have been outstanding at producing scholars with a strong theoretical background, but questions have been asked as to whether this has, to some degree, been at the expense of appropriate training of the users of statistics (the 'tradespersons'). Future directions in the teaching and learning of Statistics must take into account the impact of IT, together with the competing need to produce scholars as well as competent users of statistics, to meet the future needs of the marketplace. For Statistics to survive as a recognizable discipline, the ability to train statisticians who can communicate is also seen as an area of crucial importance. Satisfying the needs of society, as well as those of the profession, are the basic determinants that will drive the future teaching and training of statisticians at the tertiary level, and they form the basis of this presentation.
[source]

The use of simulation and post-simulation interview to examine the knowledge involved in community nursing assessment practice
JOURNAL OF ADVANCED NURSING, Issue 5 2000
Alison Bryans PhD MSc BA DipHV RGN RNT
This paper describes the development of an innovative research approach which used the complementary methods of simulation and post-simulation interview to examine the knowledge base involved in community nursing assessment practice in the United Kingdom. The study commenced in 1994, and the main phase of data gathering took place over a 3-week period in 1995. Having outlined the study's aim, context, and theoretical background, the paper focuses on the two main methods of data gathering used. Detailed description of the simulation method and the post-simulation interview, and the rationales for their use, is followed by a critical discussion identifying their particular strengths and weaknesses; threats to validity are also considered. It is argued that the combined use of a simulated assessment and a post-simulation structured interview has great potential as a means of exploring the knowledge involved in community nursing assessment practice. [source]

Hard-modelled trilinear decomposition (HTD) for an enhanced kinetic multicomponent analysis
JOURNAL OF CHEMOMETRICS, Issue 5 2002
Yorck-Michael Neuhold
Abstract: We present a novel approach for the kinetic, spectral, and chromatographic resolution of trilinear data sets acquired from slow chemical reaction processes via repeated chromatographic analysis with diode array detection.
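The kind of hard-modelled kinetic fit involved can be sketched for the simplest troublesome case, a consecutive first-order reaction A -> B -> C observed through the concentration of B. This is an illustrative scipy sketch (Levenberg-Marquardt least squares standing in for the NGL/M optimization), with invented rate constants, not the authors' code.

```python
import numpy as np
from scipy.optimize import curve_fit  # Levenberg-Marquardt for unbounded fits

def conc_B(t, k1, k2, A0=1.0):
    """Integrated rate law for B in A -> B -> C (valid for k1 != k2)."""
    return A0 * k1 * (np.exp(-k1 * t) - np.exp(-k2 * t)) / (k2 - k1)

t = np.linspace(0.0, 40.0, 50)
k1_true, k2_true = 0.30, 0.10
y = conc_B(t, k1_true, k2_true)  # noiseless synthetic "measurement"

# Fit the two rate constants from the kinetic trace (A0 held fixed at 1.0)
(k1_fit, k2_fit), _ = curve_fit(conc_B, t, y, p0=(0.25, 0.12))

# The bilinear ambiguity that extra resolution helps break: with a free
# amplitude, swapping k1 and k2 while rescaling A0 by k1/k2 reproduces
# exactly the same curve, so the two rate laws are indistinguishable.
ambiguous = np.allclose(conc_B(t, k2_true, k1_true, A0=k1_true / k2_true), y)
```

The swap invariance in the last line is the indistinguishability that can occur in consecutive first-order reactions, which an added chromatographic dimension can resolve.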
The method is based on fitting the rate constants of distinct chemical model reactions (hard-modelled, integrated rate laws) by a Newton–Gauss–Levenberg/Marquardt (NGL/M) optimization, in combination with principal component analysis (PCA) and/or evolving factor analysis (EFA), both known as powerful methods from bilinear data analysis. We call our method hard-modelled trilinear decomposition (HTD). Compared with classical bilinear hard-modelled kinetic data analysis, the additional chromatographic resolution leads to two major advantages: (1) the differentiation of otherwise indistinguishable rate laws, as can occur in consecutive first-order reactions; and (2) the circumvention of many problems due to rank deficiencies in the kinetic concentration profiles. In this paper we present the theoretical background of the algorithm and discuss selected chemical rate laws. Copyright © 2002 John Wiley & Sons, Ltd. [source]

The common patterns of nature
JOURNAL OF EVOLUTIONARY BIOLOGY, Issue 8 2009
S. A. FRANK
Abstract: We typically observe large-scale outcomes that arise from the interactions of many hidden, small-scale processes. Examples include the age of disease onset, rates of amino acid substitution, and the composition of ecological communities. The macroscopic patterns in each problem often vary around a characteristic shape that can be generated by neutral processes. A neutral generative model assumes that each microscopic process follows unbiased or random stochastic fluctuations: random connections of network nodes; amino acid substitutions with no effect on fitness; species that arise in or disappear from communities randomly. These neutral generative models often match common patterns of nature. In this paper, I present the theoretical background by which we can understand why these neutral generative models are so successful, showing where the classic patterns come from, such as the Poisson pattern, the normal or Gaussian pattern, and many others.
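The emergence of the Gaussian pattern from neutral aggregation can be checked with a small simulation: sum many decidedly non-Gaussian components, keep only the mean and variance through standardization, and watch the higher moments vanish. This is an illustrative sketch, not code from the paper; the component distribution and sample sizes are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(1)
n, reps = 400, 10000

# Aggregate n uniform components per replicate (a flat, far-from-Gaussian
# shape), then standardize to retain only the mean and variance.
totals = rng.uniform(size=(reps, n)).sum(axis=1)
z = (totals - totals.mean()) / totals.std()

# Information beyond the mean and variance washes out: the skewness and
# excess kurtosis of the standardized aggregate are close to zero.
skewness = np.mean(z**3)
excess_kurtosis = np.mean(z**4) - 3.0
```

Repeating the experiment while preserving only the mean (for example, exponentially distributed waiting times of a memoryless process) attracts instead to the exponential pattern, in line with the maximum-entropy framework the abstract goes on to describe.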
Each classic pattern was often discovered by a simple neutral generative model. The neutral patterns share a special characteristic: they describe the patterns of nature that follow from simple constraints on information. For example, any aggregation of processes that preserves information only about the mean and variance attracts to the Gaussian pattern; any aggregation that preserves information only about the mean attracts to the exponential pattern; any aggregation that preserves information only about the geometric mean attracts to the power law pattern. I present a simple and consistent informational framework of the common patterns of nature based on the method of maximum entropy. This framework shows that each neutral generative model is a special case that helps to discover a particular set of informational constraints; those informational constraints define a much wider domain of non-neutral generative processes that attract to the same neutral pattern. [source]

Concept maps: Experiments on dynamic thinking
JOURNAL OF RESEARCH IN SCIENCE TEACHING, Issue 3 2007
Natalia Derbentseva
Three experiments were conducted to examine the effects of map structure, concept quantification, and focus question on dynamic thinking during a Concept Map (CMap) construction task. The first experiment compared cyclic and hierarchical structures. The second experiment examined the impact of the quantification of the header concept in the map. The third experiment explored the effect of the focus question on the map. For all three experiments, the content of the CMaps was assessed for the number of dynamic propositions and the number of quantified concepts. The results show that the cyclic structure, the quantification of the header concept, and the focus question "How" significantly increased dynamic thinking. The studies, the theoretical background, and the implications of the findings are discussed. © 2006 Wiley Periodicals, Inc.
J Res Sci Teach 44: 448–465, 2007 [source]

Rapid analysis of food products by means of high speed gas chromatography
JOURNAL OF SEPARATION SCIENCE, JSS, Issue 4 2007
Paola Donato
Abstract Since the invention of GC, there has been an ever increasing interest within the chromatographic community in faster GC methods. This is obviously related to the fact that the number of samples subjected to GC analysis has risen greatly. Nowadays, in routine analytical applications, sample throughput is often the most important aspect considered when choosing an analytical method. Gas chromatographic instrumentation, especially in the last decade, has been subject to continuous and considerable improvement. High-speed injection systems, electronic gas pressure control, rapid oven heating/cooling and fast detection are currently available in a variety of commercial gas chromatographs. The main consequence of these favourable developments is that high-speed GC is being increasingly employed for routine analysis in different fields. Furthermore, the employment of dedicated software makes the passage from a conventional to a fast GC method a rather simple step. The present review provides an overview of the employment of fast GC techniques for the analysis of food constituents and contaminants. A brief historical and theoretical background is also provided for the approaches described. [source]

Promoting Lifestyle Change in the Prevention and Management of Type 2 Diabetes
JOURNAL OF THE AMERICAN ACADEMY OF NURSE PRACTITIONERS, Issue 8 2003
Robin Whittemore PhD, APRN
Purpose: To present the theoretical background for lifestyle change interventions in the prevention and management of type 2 diabetes and to provide pragmatic strategies for advanced practice nurses (APNs) to incorporate such interventions into their practices.
Data Sources: Selected scientific literature and the Internet.
Conclusions: There is an epidemic of obesity and type 2 diabetes among adults in the United States.
Preventing or managing these health conditions requires significant lifestyle changes by individuals.
Implications for Practice: APNs are in a key position to deliver lifestyle change interventions, particularly in the primary care setting. Strategies to assist APNs with lifestyle change counseling include (a) assessment, (b) mutual decision making, (c) referral to education programs, (d) individualized treatment goals, (e) strategies to assist with problem solving, (f) continuing support and encouragement, (g) relapse prevention, and (h) ongoing follow-up. [source]

The impact of the budget deficit on inflation in the Islamic Republic of Iran
OPEC ENERGY REVIEW, Issue 1 2005
Abbas Alavirad
In this paper, we measure and analyse the impact of the budget deficit on inflation in the Islamic Republic of Iran. After briefly reviewing the theoretical background, we use univariate cointegration tests, such as the autoregressive distributed lag model (ARDL) and Phillips-Hansen methods, to study the relationship between the two in the long term. Additionally, we use the error correction model to study the behaviour of the model in the short run. Our analysis is based on annual time series data from 1963 to 1999, and our results show that budget deficits, as well as liquidity, do have a significant impact on inflation rates in the Islamic Republic of Iran. [source]

Reproducibility of the diagnosis of small adenocarcinoma of the lung and usefulness of an educational program for the diagnostic criteria
PATHOLOGY INTERNATIONAL, Issue 1 2005
Masayuki Noguchi
Using 32 small adenocarcinomas of the lung, including bronchioloalveolar carcinoma (BAC), the reproducibility of diagnosis under the modified diagnostic criteria for small adenocarcinoma (Cancer 75; 2844, 1995) and the effectiveness of an educational program for 27 volunteer general pathologists were examined. The average coincidence rate of diagnosis before and after the program was 42.4% and 56.6%, respectively.
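Returning to the error correction model in the Alavirad abstract above: the two-step logic (a long-run cointegrating regression, then a differenced regression in which the lagged deviation enters with a negative adjustment coefficient) can be sketched on synthetic data. This is a toy illustration with assumed series, not the paper's Iranian data or its ARDL/Phillips-Hansen estimators:

```python
import numpy as np

rng = np.random.default_rng(1)
T = 2000

# Toy cointegrated pair: x is a random walk (stand-in for the deficit);
# y tracks 2*x in the long run (stand-in for inflation), with a stationary
# deviation u that decays at rate 0.5 per period.
x = np.cumsum(rng.standard_normal(T))
u = np.zeros(T)
for s in range(1, T):
    u[s] = 0.5 * u[s - 1] + rng.standard_normal()
y = 2.0 * x + u

# Step 1: long-run (cointegrating) regression  y = a + b*x
A = np.column_stack([np.ones(T), x])
a_hat, b_hat = np.linalg.lstsq(A, y, rcond=None)[0]
ect = y - (a_hat + b_hat * x)            # error-correction term

# Step 2: short-run ECM  dy_t = c + g*ect_{t-1} + d*dx_t + e_t
dy, dx = np.diff(y), np.diff(x)
B = np.column_stack([np.ones(T - 1), ect[:-1], dx])
c_hat, g_hat, d_hat = np.linalg.lstsq(B, dy, rcond=None)[0]
# g_hat < 0: deviations from the long-run relation are corrected over time
```

The estimated adjustment coefficient comes out negative (near the true value of -0.5 here), which is the signature of error correction: the further inflation drifts from its long-run relation with the deficit, the stronger the pull back.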
The coincidence rate after the program was significantly higher than that before the program (P < 0.05). In contrast, the average coincidence rate of six lung cancer specialists was 71.4%, significantly higher than that of the general pathologists after the program (P < 0.05). When the cases were divided into two groups, in situ adenocarcinoma (BAC and BAC with alveolar collapse) and early invasive adenocarcinoma, the average coincidence rate for the general pathologists after the program increased to 85.3%, significantly higher than that before the program (80.3%; P < 0.05). The rate for the specialists was 89%, higher than that for the general pathologists after the program, but not significantly so. This trial was thought to provide a theoretical background for the histological diagnosis of peripheral-type adenocarcinoma of the lung and to justify the existing diagnostic criteria. [source]

Probiotics and prebiotics in atopic dermatitis: review of the theoretical background and clinical evidence
PEDIATRIC ALLERGY AND IMMUNOLOGY, Issue 2p2 2010
Leontien B. Van Der Aa
van der Aa LB, Heymans HSA, van Aalderen WMC, Sprikkelman AB. Probiotics and prebiotics in atopic dermatitis: review of the theoretical background and clinical evidence. Pediatr Allergy Immunol 2010: 21: e355–e367. © 2009 John Wiley & Sons A/S
The prevalence of atopic dermatitis (AD) has risen over the past decades, especially in western societies. According to the revised hygiene hypothesis, this increase is caused by a changed intestinal colonization pattern during infancy, which has an impact on the immune system. Manipulating the intestinal microflora with pro-, pre- or synbiotics is an innovative way to prevent or treat AD.
This review provides an overview of the theoretical basis for using probiotics and prebiotics in AD and presents the current evidence from randomized controlled trials (RCTs) regarding prevention and treatment of AD and food allergy in children with pro-, pre- and synbiotics. Seven RCTs on prevention and 12 RCTs on treatment were found by searching the PubMed, Embase and Cochrane databases. Results of these trials are conflicting. In conclusion, at this moment there is not enough evidence to support the use of pro-, pre- or synbiotics for the prevention or treatment of AD in children in clinical practice. [source]

Modeling of Active Noise and Vibration Control with Finite Elements and Boundary Elements
PROCEEDINGS IN APPLIED MATHEMATICS & MECHANICS, Issue 1 2009
Stefan Ringwelski
A recently developed coupled finite element–boundary element modeling scheme for the design of active noise and vibration control of multi-coupled structural-acoustic systems is presented. The approach allows the computation of structural vibrations and the resulting sound fields. By means of an example, the paper describes the theoretical background of the coupled approach. In order to show the performance of the developed approach, test simulations are carried out in the frequency domain. (© 2009 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim) [source]

Representation and Simulation of Smart Structures in Multibody Dynamics
PROCEEDINGS IN APPLIED MATHEMATICS & MECHANICS, Issue 1 2003
Andreas Heckmann
This paper presents a methodology for the simulation of smart structures with piezoceramic patches by means of multibody dynamics. The theoretical background is outlined, adapting a modal multifield approach. An application example is then used to illustrate the implemented process chain. This procedure provides the framework for the development of an environment in which to design, optimise and verify all vibration control elements.
[source]

Application of electron transfer dissociation (ETD) for the analysis of posttranslational modifications
PROTEINS: STRUCTURE, FUNCTION AND BIOINFORMATICS, Issue 21 2008
Julia Wiesner
Abstract Despite major advances in the field of proteomics, the analysis of PTMs still poses a major challenge, so far preventing insights into the role and regulation of protein networks. Additionally, top-down sequencing of proteins is another powerful approach to reveal comprehensive information on biological function. A commonly used fragmentation technique in MS-based peptide sequencing is CID. As CID often fails in PTM analysis and performs best on doubly charged, short and middle-sized peptides, confident peptide identification may be hampered. A newly developed fragmentation technique, electron transfer dissociation (ETD), supports both PTM and top-down analysis and generally results in more confident identification of long, highly charged or modified peptides. The following review presents the theoretical background of ETD and its technical implementation in mass analyzers. Furthermore, current improvements of ETD and approaches for PTM analysis and top-down sequencing are introduced. Alternating between the two fragmentation techniques, ETD and CID, increases the amount of information derived from peptide fragmentation, thereby enhancing both peptide sequence coverage and the confidence of peptide and protein identification. [source]

Imaging geophysical data – taking the viewer into account
ARCHAEOLOGICAL PROSPECTION, Issue 1 2004
T. J. Dennis
Abstract A common way of presenting geophysical data from two-dimensional sources is as a grey-scale image. Some theoretical background to discrete image representation is described, and the deleterious effects of inappropriate (too sparse) sampling and display of such images are discussed in an archaeological context.
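The effect of too-sparse sampling can be reproduced in a few lines (assumed example numbers, not survey data): a spatial feature at 9 cycles per metre, sampled every 0.1 m, lies above the Nyquist limit of 5 cycles per metre and therefore aliases down to an apparent 1 cycle per metre.

```python
import numpy as np

# A feature varying at 9 cycles per metre, sampled every 0.1 m
# (10 samples per metre).  The Nyquist limit is 5 cycles per metre,
# so the feature is undersampled and must alias.
f_true = 9.0               # cycles per metre (assumed example value)
dx = 0.1                   # sample spacing in metres
x = np.arange(0.0, 10.0, dx)
signal = np.sin(2 * np.pi * f_true * x)

spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(len(signal), d=dx)   # 0 .. 5 cycles per metre
f_apparent = freqs[np.argmax(spectrum)]      # where the energy shows up
```

The spectral peak lands at |9 - 10| = 1 cycle per metre: the undersampled feature is indistinguishable from a much coarser one, which is exactly the distortion the abstract attributes to sample spacings larger than the sensor's effective footprint.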
In high-quality images, such as magazine illustrations or digital television, the sampling densities can be sufficiently high to avoid the appearance of artefacts. Geophysical images, in contrast, are often sampled at very low densities; if the effective area of each sample is significantly less than the sample spacing, then the classic effect called 'aliasing' in communication engineering, caused by the violation of Nyquist's criterion, will be seen. Knowledge of the sensor's footprint can be used to select an appropriate sample density, and so minimize this source of distortion. To maximize the visibility of what may be low-contrast structures immersed in a high level of background noise, it is helpful also to consider the bandpass nature of the spatial frequency response of the human visual system. The non-linear phenomenon of visual masking is shown to influence the choice of presentation methods. Copyright © 2004 John Wiley & Sons, Ltd. [source]

Einheit von Wissenschaft und Kunst im Brückenbau: Hellmut Homberg (1909–1990) – Leben und Wirken (Teil I)
BAUTECHNIK, Issue 10 2009
Karl-Eugen Kurrer Dr.-Ing.
Abstract Am 5. September 2009 wäre Hellmut Homberg einhundert Jahre alt geworden. Dies ist Anlass, an sein vor allem den Brückenbau bereicherndes Wirken zu erinnern und ein, wenn auch nicht vollständiges, Werkverzeichnis zu erstellen. Das gilt sowohl für die unter seiner maßgebenden Mitwirkung entworfenen Brücken als auch für seine völlig neuartigen, die statischen Berechnungen in der Praxis erleichternden, streng theoretisch hergeleiteten und eine zutreffende Bemessung der Kreuzwerke und orthotropen Fahrbahnplatten ermöglichenden Tafel- und Tabellenwerke. Teil I schildert Hombergs beruflichen Weg und versucht, ein Porträt zu zeichnen; Teil II ist seinen theoretischen Untersuchungen gewidmet, und Teil III geht auf besondere Brücken ein.
Harmony between science and art in bridge-building: Hellmut Homberg (1909–90) – life and work (part I).
Hellmut Homberg would have been 100 years old on 5 September 2009. This is an opportunity to look back on his work, which so enriched the world of bridge-building in particular, and also a chance to compile a catalogue of his work, albeit incomplete. This applies to the bridges in whose design he played an influential role and also to his books of mathematical and design tables that enabled the accurate design of beam grids and orthotropic bridge decks. The tables, with their rigorous theoretical background, were at the time quite new and eased structural calculations in everyday practice. Part I describes Homberg's professional career and attempts to draw a portrait of the man; part II is devoted to his theoretical studies, and part III deals with particular bridges. [source]

The collaborative practice model for bipolar disorder: design and implementation in a multi-site randomized controlled trial
BIPOLAR DISORDERS, Issue 5 2001
Mark S Bauer
Article first published online: 7 JUL 200
Bipolar disorder remains a high-morbidity and costly illness in general clinical practice, despite the availability of efficacious medications. This 'efficacy–effectiveness gap' [1,2] may be addressed by better organizing systems of care. One type of intervention is the 'collaborative practice model', which can be defined as an organization of care that a) emphasizes development in the patient of illness management skills, and b) supports provider capability and availability in order to c) engage patients in timely, joint decision-making regarding their illness. This article describes such a collaborative practice model for bipolar disorder, designed to be widely adoptable and sustainable in general clinical practice.
The first part of the article describes the theoretical background from which the collaborative practice approach developed, emphasizing its origins in the lithium clinics of the 1970s, in nursing theory and practice, and more recently in the management of chronic medical diseases. The second part describes the structure of one such intervention, the Bipolar Disorders Program (BDP) developed in the Veterans Affairs health care system. The third part summarizes results from single-site studies of the intervention. The fourth part describes several key issues in its implementation in an ongoing multi-site randomized controlled trial, VA Cooperative Study Program (CSP) #430. Data to date indicate that such collaborative practice interventions may improve important process and intermediate outcome variables for bipolar disorder. The BDP provides an example of a multi-faceted collaborative practice model that can be manualized and implemented across multiple sites in a randomized controlled trial. [source]

Membrane Technology for Water Treatment
CHEMICAL ENGINEERING & TECHNOLOGY (CET), Issue 8 2010
Th. Peters
Abstract Membrane processes have become very important tools in water management and water-related environmental engineering, because their efficiency has been proven from a technical and economic, as well as an ecological, point of view. This situation is partially based on results obtained during the operation of reverse osmosis systems that were developed in the early days of this technology for the desalination of seawater. Details regarding the theoretical background of these pressure-driven membrane processes, examples of their application in water treatment, limiting factors, operational data and results for the purification efficiency are considered as the basis for the discussion of decision-supporting criteria for the selection of these technologies for possible applications, and as a basis for the evaluation of future developments. [source]