Rigor


Kinds of Rigor

  • methodological rigor
  • scientific rigor


Selected Abstracts


    Critical Theorizing: Enhancing Theoretical Rigor in Family Research

    JOURNAL OF FAMILY THEORY & REVIEW, Issue 3 2009
    Stan J. Knapp
    Theory performs vital descriptive, sensitizing, integrative, explanatory, and value functions in the generation of knowledge about families. Yet theory and research can also simultaneously misconceive, desensitize, misdirect, misinterpret, and devalue. Overcoming the degenerative potentialities of theory and research requires attention to critical theorizing, a joint process of (a) critically examining the explicit and implicit assumptions of theory and research and (b) using dialogical theoretical practices. I draw upon the work of John Stuart Mill to argue that critical and dialogical theorizing is a vital and necessary practice in the production of understandings of family phenomena that are more fully scientific and empirical. A brief examination of behavioral research on marital interaction illustrates the importance of critical theorizing. [source]


    Methodological Rigor in the Study of Transfer: Identifying L1 Influence in the Interlanguage Lexicon

    LANGUAGE LEARNING, Issue 2 2000
    Scott Jarvis
    Numerous conflicting claims exist concerning the nature of L1 influence. This article argues that much of the confusion could be eliminated if a unified framework were established for this area of inquiry. Such a framework would minimally require transfer studies to consider at least 3 potential effects of L1 influence: (a) intra-L1-group similarities, (b) inter-L1-group differences, and (c) L1-IL performance similarities. This study examines all three types of evidence in the English lexical reference of Finnish-speaking and Swedish-speaking Finns at multiple levels of age and L2 exposure in three different but related elicitation tasks. The results suggest a subtle yet demonstrable presence for L1 influence in this area of interlanguage performance. [source]


    Methodological aspects of rigor in qualitative nursing research on families involved in intensive care units: A literature review

    NURSING & HEALTH SCIENCES, Issue 1 2007
    Sevald Høye, RN, MNSc
    Abstract: Rigor has important ramifications for the entire qualitative research process. The aim of this study was to evaluate aspects of methodological congruence by focusing on four dimensions of rigor in qualitative nursing research related to the presence of patients' family members in the intensive care unit. Eight research papers covering the years 1990–2004 were analyzed by means of one of Burns and Grove's standards, methodological congruence, for critique and consistency. The results show that there are varying degrees of focus on procedural rigor, such as limitations and bias. Ethical rigor is described clearly in some papers, while others lack descriptions of confidentiality and the voluntary nature of participation. However, all papers contain descriptions of qualitative data analysis. In conclusion, there were strengths in procedural rigor and auditability, but also some limitations in the identification of theoretical development and the scientific tradition on which the article is based. [source]


    Maturation of Corporate Governance Research, 1993–2007: An Assessment

    CORPORATE GOVERNANCE, Issue 3 2009
    Boris Durisin
    ABSTRACT Manuscript Type: Review. Research Question/Issue: This study seeks to investigate whether governance research in fact is a discipline or whether it is rather the subject of multi-disciplinary research. We map the intellectual structure of corporate governance research and its evolution from 1993 to 2007. Research Findings/Results: Based on the analysis of more than 1,000 publications and 48,000 citations in Corporate Governance: An International Review (CGIR) and other academic journals, our study identifies the most influential works, the dominant subfields, and their evolution. Our study assesses the maturation of corporate governance research as a discipline; it finds increasing sophistication, depth and rigor, and consistency in its intellectual structure. Theoretical Implications: There is a large body of accumulated corporate governance research in the US, yet there is an empirical gap on cross-national studies in the literature. Furthermore, hardly any of the top cited works undertake their study in a cross-national setting. Thus, corporate governance research and CGIR, in its quest to contribute to a global theory of corporate governance, might benefit if articles have a cross-national methodological approach and empirical grounding in their research design and if articles explicitly aim at stating the theoretical underpinnings they draw on. Practical Implications: Globalists find in CGIR an outlet addressing economics and finance (e.g., whether and how compensation or dismissal of CEOs is related to board characteristics), management (e.g., whether and how best practice codes adoption is related to board characteristics and performance), and accounting (e.g., whether and how earnings manipulation is related to board characteristics) issues globally. [source]


    Similar Deficiencies in Procedural Dermatology and Dermatopathology Fellow Evaluation despite Different Periods of ACGME Accreditation: Results of a National Survey

    DERMATOLOGIC SURGERY, Issue 7 2008
    SCOTT R. FREEMAN MD
    BACKGROUND Fellow evaluation is required by the Accreditation Council for Graduate Medical Education (ACGME). Procedural dermatology fellowship accreditation by the ACGME began in 2003 while dermatopathology accreditation began in 1976. OBJECTIVE The objective was to compare fellow evaluation rigor between ACGME-accredited procedural dermatology and dermatopathology fellowships. METHODS Questionnaires were mailed to fellowship directors of the ACGME-accredited (2006–2007) procedural dermatology and dermatopathology fellowship programs. Information was collected regarding evaluation form development, delivery, and collection. RESULTS The response rates were 74% (25/34) and 53% (24/45) for procedural and dermatopathology fellowship programs, respectively. Sixteen percent (4/25) of procedural dermatology and 25% (6/24) of dermatopathology programs do not evaluate fellows. Fifty percent or less of program (4/8 procedural dermatology and 3/7 dermatopathology) evaluation forms address all six core competencies required by the ACGME. CONCLUSION Procedural fellowships are evaluating fellows as rigorously as the more established dermatopathology fellowships. Both show room for improvement because one in five programs reported not evaluating fellows and roughly half of the evaluation forms provided do not address the six ACGME core competencies. [source]


    Negative symptoms of schizophrenia: a problem that will not go away

    ACTA PSYCHIATRICA SCANDINAVICA, Issue 1 2007
    S. M. Stahl
    Objective: Negative symptoms of schizophrenia are a common, enduring, and debilitating component of the psychopathology of schizophrenia. Although efforts thus far to elucidate a distinct schizophrenia subtype based upon negative symptoms have yielded mixed results, there are nevertheless neurobiological correlates of the negative symptom typology. Method: A review of nosology, typology, and assessment tools for determining core negative symptoms in schizophrenia. Results: Negative symptoms can be difficult to evaluate objectively. Current rating scales 'capture' key domains of negative symptoms, in spite of considerable overlap between these domains. However, each objective assessment trades off methodological rigor and detail against brevity of assessment and ease of use. Conclusion: The description of new methods for measuring these devastating symptoms, coupled with the ongoing development of novel antipsychotics and agents that augment antipsychotics, has fuelled renewed interest in the evaluation of negative symptoms and optimism that better treatments for negative symptoms can be found. [source]


    The Dependence of Growth-Model Results on Proficiency Cut Scores

    EDUCATIONAL MEASUREMENT: ISSUES AND PRACTICE, Issue 4 2009
    Andrew D. Ho
    States participating in the Growth Model Pilot Program reference individual student growth against "proficiency" cut scores that conform with the original No Child Left Behind Act (NCLB). Although achievement results from conventional NCLB models are also cut-score dependent, the functional relationships between cut-score location and growth results are more complex and are not currently well described. We apply cut-score scenarios to longitudinal data to demonstrate the dependence of state- and school-level growth results on cut-score choice. This dependence is examined along three dimensions: 1) rigor, as states set cut scores largely at their discretion, 2) across-grade articulation, as the rigor of proficiency standards may vary across grades, and 3) the time horizon chosen for growth to proficiency. Results show that the selection of plausible alternative cut scores within a growth model can change the percentage of students "on track to proficiency" by more than 20 percentage points and reverse accountability decisions for more than 40% of schools. We contribute a framework for predicting these dependencies, and we argue that the cut-score dependence of large-scale growth statistics must be made transparent, particularly for comparisons of growth results across states. [source]
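
    Illustrative sketch (not the authors' model): the cut-score dependence described above can be reproduced in miniature. The Python toy below assumes invented score distributions, a simple linear growth projection, and two hypothetical cut scores and time horizons, and shows how the share of students counted "on track to proficiency" moves with those choices.

        import numpy as np

        rng = np.random.default_rng(0)

        # Hypothetical scale scores for 1,000 students in grades 3 and 4.
        grade3 = rng.normal(200, 25, size=1000)
        grade4 = grade3 + rng.normal(10, 8, size=1000)  # one year of growth

        def on_track(score_now, annual_growth, cut, years_left):
            """'On track' if projecting current growth forward reaches the
            proficiency cut score within the chosen time horizon."""
            return (score_now + annual_growth * years_left) >= cut

        growth = grade4 - grade3
        for cut in (215, 230):        # two plausible alternative cut scores
            for horizon in (2, 4):    # alternative time horizons to proficiency
                pct = 100 * on_track(grade4, growth, cut, horizon).mean()
                print(f"cut={cut}, horizon={horizon}y: {pct:.1f}% on track")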


    A Guide to Understanding and Developing Performance-Level Descriptors

    EDUCATIONAL MEASUREMENT: ISSUES AND PRACTICE, Issue 4 2008
    Marianne Perie
    There has been much discussion recently about why the percentage of students scoring Proficient or above varies as much as it does on state assessments across the country. However, most of these discussions center on the leniency or rigor of the cut score. Yet, the cut score is developed in a standard-setting process that depends heavily on the definition for each level of performance. Good performance-level descriptors (PLDs) can be the foundation of an assessment program, driving everything from item development to cut scores to reporting. PLDs should be written using a multistep process. First, policymakers determine the number and names of the levels. Next, they develop policy definitions specifying the level of rigor intended by each level, regardless of the grade or subject to which it is applied. Finally, content experts and education leaders should supplement these policy definitions with specific statements related to the content standards for each assessment. This article describes a process for developing PLDs, contrasts that with current state practice, and discusses the implication for interpreting the word "proficient," which is the keystone of No Child Left Behind. [source]


    Protective Effect of HOE642, a Selective Blocker of Na+-H+ Exchange, Against the Development of Rigor Contracture in Rat Ventricular Myocytes

    EXPERIMENTAL PHYSIOLOGY, Issue 1 2000
    Marisol Ruiz-Meana
    The objective of this study was to investigate the effect of Na+-H+ exchange (NHE) and HCO3−-Na+ symport inhibition on the development of rigor contracture. Freshly isolated adult rat cardiomyocytes were subjected to 60 min metabolic inhibition (MI) and 5 min re-energization (Rx). The effects of perfusion with HCO3−-containing or HCO3−-free buffer, with or without the NHE inhibitor HOE642 (7 μM), were investigated during MI and Rx. In HCO3−-free conditions, HOE642 reduced the percentage of cells developing rigor during MI from 79 ± 1% to 40 ± 4% (P < 0.001) without modifying the time at which rigor appeared. This resulted in a 30% reduction of hypercontracture during Rx (P < 0.01). The presence of HCO3− abolished the protective effect of HOE642 against rigor. Cells that had developed rigor underwent hypercontracture during Rx independently of treatment allocation. Ratiofluorescence measurement demonstrated that the rise in cytosolic Ca2+ (fura-2) occurred only after the onset of rigor, and was not influenced by HOE642. NHE inhibition did not modify the Na+ rise (SBFI) during MI, but exaggerated the initial fall of intracellular pH (BCECF). In conclusion, HOE642 has a protective effect against rigor during energy deprivation, but only when HCO3−-dependent transporters are inhibited. This effect is independent of changes in cytosolic Na+ or Ca2+ concentrations. [source]


    "The probable industrial origin of archaeological daub at an Iron Age site in northeast Thailand" (Parr and Boyd, 2002): A comment on the inappropriate application of geophysical and geochemical techniques to an archaeological question

    GEOARCHAEOLOGY: AN INTERNATIONAL JOURNAL, Issue 8 2003
    Maria Cotter
    Parr and Boyd (2002) used colorimetric analysis in combination with geophysical and geochemical techniques to estimate firing temperatures for archaeological daub from an Iron Age site in Thailand. They suggest that the daub was fired at high temperatures and, therefore, is indicative of kiln utilization and increased industrialization during that period in Thailand. They argue that the adoption of a multimethod analytical approach in which the combination of data derived from ICP-MS, X-ray diffraction, and magnetic susceptibility analyses of daub samples, coupled with microscopic and macroscopic examination of samples, enhances the accuracy of their interpretations. While they should be commended for attempting to substantiate their claims using many geophysical and geochemical techniques, their arguments are flawed by the misapplication of the techniques described and/or over-interpretation of the data generated by such techniques. Therefore, Parr and Boyd's (2002:285) point about methodology ("that the combined interpretation of independent measures provides a better estimate of the original firing temperatures of the archaeological material than has hitherto been possible") is made redundant by the lack of scientific rigor applied to the independent measures used for this study. © 2003 Wiley Periodicals, Inc. [source]


    Classification Analysis of World Economic Regions

    GEOGRAPHICAL ANALYSIS, Issue 4 2001
    Raymond J. Dezzani
    Economic classifications of countries are of continuing utility for comparative and analytic purposes. However, traditional methods of arriving at classifications are often ad hoc, subjective, and imprecise, not permitting the assignments to be used for closer analysis. Discriminant analysis is used in this paper to isolate a time-specific set of economic factors delimiting economic state categories that correspond to core-periphery states. The core-periphery framework is shown to be a special case of a hierarchical market scheme. The purposes of this work are (1) to create a theoretically grounded, empirically derived classification over several time periods to permit dynamic comparisons to be made and provide an explanation of change in the global economy, and (2) to provide feedback information from the classification to supply the necessary rigor and quantitative insight to the world-systems theoretical framework. Results of the analysis suggest that different economic variables provide varying levels of explanation at different times. In particular, variables representing factor endowment provide a greater measure of explanation early in the sequence (for example, 1960) while trade and investment measures are of greater importance in the latter part of the study sequence (for example, 1990). OPEC countries significantly bifurcate the world-economy classification in 1970 and exhibit separate class characteristics. Even within the short time period, a number of countries are shown to transit among the classes. The model is also able to capture the dependence structure implicit in the world-systems framework. [source]
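
    Illustrative sketch (hypothetical indicators and labels, not the authors' data or variable set): a minimal discriminant analysis of the kind applied above, classifying countries into periphery, semiperiphery, and core from a few economic variables.

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        rng = np.random.default_rng(1)

        # Invented per-country indicators, e.g. log GDP per capita, trade share,
        # inward investment share; classes 0=periphery, 1=semiperiphery, 2=core.
        X = np.vstack([rng.normal(loc, 0.5, size=(30, 3))
                       for loc in ([1.0, 1.0, 1.0], [2.5, 2.0, 1.8], [4.0, 3.0, 2.6])])
        y = np.repeat([0, 1, 2], 30)

        lda = LinearDiscriminantAnalysis().fit(X, y)

        # Class means hint at which variables separate the classes, echoing the
        # finding that different variables explain at different times.
        print("class means:\n", lda.means_)
        print("predicted class of a new country:", lda.predict([[2.4, 2.1, 1.7]]))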


    Cluster headache: the challenge of clinical trials.

    HEADACHE, Issue 3 2003
    K Moore
    Curr Pain Headache Rep. 2002 Feb;6(1):52-56 The design and execution of clinical trials pose special problems for cluster headache. Although there is less inter-individual and intra-individual variability of attacks than seen with migraine, the brevity of attacks, spontaneous remissions unrelated to treatment, and the relative rarity of cluster headaches challenge investigators. The International Headache Society has developed guidelines that represent a compromise between scientific rigor and practicality. Only injectable sumatriptan for acute attacks and verapamil for prophylaxis have demonstrated a robust therapeutic effect in controlled clinical trials. Comment: Kenneth Moore raises important methodological considerations. Is it possible to undertake crossover trials comparing different active treatments? He is correct in his assertion that few agents show robust efficacy. A major issue relates to the proportion of patients with episodic versus chronic cluster headache where efficacy of active treatments can vary. For example, oral zolmitriptan was effective against placebo only in those patients with episodic disease (Bahra A, Gawel MJ, Hardebo JE, Millson DS, Breen SA, Goadsby PJ. Oral zolmitriptan is effective in the acute treatment of cluster headache. Neurology. 2000;54:1832-1839). And a set of small studies on melatonin and cluster demonstrates the problems Dr. Moore describes. In one study (Leone M, D'Amico D, Moschiano F, Fraschini F, Bussone G. Melatonin versus placebo in the prophylaxis of cluster headache: a double-blind pilot study with parallel groups. Cephalalgia. 1996;16:494-496), melatonin worked only in episodic, not chronic cluster patients. In the second study (Pringsheim T, Magnoux E, Dobson CF, Hamel E, Aube M. Melatonin as adjunctive therapy in the prophylaxis of cluster headache: a pilot study. Headache. 2002;42:787-792), melatonin did not work better than placebo in either episodic or chronic cluster patients. Furthermore, the paper abstracted above by Torelli and Manzoni suggests that episodic cluster may progress to chronic cluster as a result of extrinsic factors such as smoking. Finally, there are ethical issues in placebo-controlled cluster studies, given the severity of the pain and the availability of effective acute and chronic treatments. As noted above, Dr. Peter Goadsby points out the need to persevere with these studies to find nonvasoactive treatments for patients with cluster headache. DSM and SJT [source]


    A modest proposal: a testable differentiation between third- and fourth-order information complexity

    INTERNATIONAL JOURNAL OF APPLIED PSYCHOANALYTIC STUDIES, Issue 4 2006
    Kathryn Cason
    Abstract In Human Capability, Jaques and Cason (1994) described the importance of the Third and Fourth Orders of Information Complexity used by adults working to create and manage our commercial endeavors, govern our countries, and provide services such as healthcare and education to our populations. Today our knowledge of these two Orders is still couched in descriptive terms, and is therefore less open to testing at the level of rigor science requires. In order to pursue a better understanding of how to more effectively educate and employ this capability in the adult population it is necessary to have clarity about the boundaries of these apparently discontinuous innate human "processes." The authors here set out important aspects of their continued inquiry. Copyright © 2006 John Wiley & Sons, Ltd. [source]


    Theorizing Migration Policy: Is There a Third Way?

    INTERNATIONAL MIGRATION REVIEW, Issue 1 2007
    Christina Boswell
    This article critically reviews theories of migration policy according to two criteria: methodological rigor and explanatory plausibility. It finds that political economy accounts are theoretically robust, but at the price of oversimplification. Neo-institutional theories offer more sophisticated accounts, but fall short on a number of methodological and explanatory counts. As an alternative, this article suggests a theory focusing on the functional imperatives of the state in the area of migration, which shape its responses to societal interests and institutional structures. [source]


    A Pragmatic Guide to Qualitative Historical Analysis in the Study of International Relations

    INTERNATIONAL STUDIES PERSPECTIVES, Issue 4 2002
    Cameron G. Thies
    Researchers using qualitative methods, including case studies and comparative case studies, are becoming more self-conscious in enhancing the rigor of their research designs so as to maximize their explanatory leverage with a small number of cases. One aspect of qualitative research that has not received as much attention is the use of primary and secondary source material as data or evidence. This essay explores the potential problems encountered by political scientists as they conduct archival research or rely on secondary source material produced by historians. The essay also suggests guidelines for researchers to minimize the main problems associated with qualitative historical research, namely, investigator bias and unwarranted selectivity in the use of historical source materials. These guidelines should enable advanced undergraduates and graduate students to enhance the quality of their historically minded political science scholarship. [source]


    Application of molecular clocks in ornithology revisited

    JOURNAL OF AVIAN BIOLOGY, Issue 6 2006
    A. Townsend Peterson
    Molecular clocks have seen many applications in ornithology, but many applications are uncritical. In this commentary, I point out logical inconsistencies in many uses of clocks in avian molecular systematics. I call for greater rigor in the application of molecular clocks: clocks should only be used when clocklike behavior has been tested and confirmed, and when appropriate calibrations are available. Authors and reviewers should insist on such rigor to assure that systematics is indeed scientific, and not just storytelling. [source]


    Taking stock of naturalistic decision making

    JOURNAL OF BEHAVIORAL DECISION MAKING, Issue 5 2001
    Raanan Lipshitz
    Abstract We review the progress of naturalistic decision making (NDM) in the decade since the first conference on the subject in 1989. After setting out a brief history of NDM we identify its essential characteristics and consider five of its main contributions: recognition-primed decisions, coping with uncertainty, team decision making, decision errors, and methodology. NDM helped identify important areas of inquiry previously neglected (e.g. the use of expertise in sizing up situations and generating options), it introduced new models, conceptualizations, and methods, and recruited applied investigators into the field. Above all, NDM contributed a new perspective on how decisions (broadly defined as committing oneself to a certain course of action) are made. NDM still faces significant challenges, including improvement of the quantity and rigor of its empirical research, and confirming the validity of its prescriptive models. Copyright © 2001 John Wiley & Sons, Ltd. [source]


    Endotoxin-like reaction following once-daily gentamicin

    ACTA ANAESTHESIOLOGICA SCANDINAVICA, Issue 6 2009
    E. E. ALY
    An endotoxin-like reaction is a host response to an agent that induces the release of endogenous pyrogens, including cytokines. The typical reaction that is associated with gentamicin is fever and chills, rigor, shivering, tachycardia with hypertension or hypotension, respiratory symptoms and muscle cramps. We report a case of a 92-year-old patient who developed an endotoxin-like reaction in the post-operative recovery unit following 200 mg of gentamicin. The reported side effect is not included in the drug sheet or in the British National Formulary. No similar incidents were reported in the UK. We discuss the clinical picture of this rare event, along with a review of the literature and recommendations. [source]


    Injection-Salting of pre rigor Fillets of Atlantic Salmon (Salmo salar)

    JOURNAL OF FOOD SCIENCE, Issue 1 2007
    Sveinung Birkeland
    ABSTRACT: The effects of temperature (−1, 4, and 10 °C), brine concentration (12% and 25% NaCl), injection volumes, and needle densities were investigated on fillet weight gain (%), salt content (%), fillet contraction (%), and muscle gaping in pre rigor brine-injected fillets of Atlantic salmon (Salmo salar). Increased brine concentration (12% to 25%) significantly increased the initial (< 5 min after injection) and final contraction (24 h after injection) of pre rigor fillets. Increased brine concentration significantly reduced weight gain and increased salt content but had no significant effect on muscle gaping. The temperatures tested did not significantly affect weight gain, fillet contraction, or gaping score. Significant regressions (P < 0.01) between the injection volume and weight gain (range: 2.5% to 15.5%) and salt content (range: 1.7% to 6.5%) were observed for injections of pre rigor fillets. Double injections significantly increased the weight gain and salt content compared to single injections. Initial fillet contraction measured 30 min after brine injection increased significantly (P < 0.01) with increasing brine injection volume, but no significant difference in the fillet contraction was observed 12 h after brine injection (range: 7.9% to 8.9%). Brine-injected post rigor control fillets obtained higher weight gain, higher salt content, more muscle gaping, and significantly lower fillet contraction compared to the pre rigor injected fillets. Injection-salting is an applicable technology for obtaining satisfactory salt contents and homogeneously distributing the salt into the muscle of pre rigor fillets of Atlantic salmon before further processing steps such as drying and smoking. [source]


    Keeping Quality of Sea-Frozen Thawed Cod Fillets on Ice

    JOURNAL OF FOOD SCIENCE, Issue 9 2001
    E. Martinsdóttir And
    ABSTRACT: The objective was to evaluate the suitability of sea-frozen, thawed cod fillets for the "chilled" seafood market. Fillets were kept frozen for 17 mo. After thawing, fillets were kept iced and at 4°C. Microbiological research on fillets showed higher initial numbers in post-rigor than pre-rigor fillets. Pre-rigor fillets were judged fresher after 2 mo of storage compared to post-rigor. With longer freezer storage, lower initial freshness scores were obtained, and formation of trimethylamine in thawed fillets was slowed. Thawed fillets frozen prior to rigor merited higher scores for freshness than fillets frozen post-rigor. This difference decreased with prolonged freezer storage. The results strongly indicate that fillets should be frozen pre-rigor. [source]


    Kumho, Daubert, and the Nature of Scientific Inquiry: Implications for Forensic Anthropology

    JOURNAL OF FORENSIC SCIENCES, Issue 4 2008
    Christopher R. Grivas M.S.
    Abstract: In the last 15 years, the US Supreme Court has implemented major changes concerning the admittance of expert testimony. In 1993, Daubert v. Merrell Dow Pharmaceuticals superseded the Frye ruling in federal courts and established judges, not the scientific community, as the gatekeepers regarding the credibility of scientific evidence. In 1999, a lesser-known but equally important decision, Kumho Tire v. Carmichael, ruled that technical expert testimony needed to employ the same rigor as outlined in Daubert, but experts can develop theories based on observations and apply such theories to the case before the court. Anthropology has never been defined as a hard science. Yet, many recent publications have modified existing techniques to meet the Daubert criteria, while none have discussed the significance of Kumho to anthropological testimony. This paper examines the impact of Daubert and Kumho on forensic anthropology and illustrates areas of anthropological testimony best admitted under Kumho's guidance. [source]


    Understanding the Plan: A Studio Experience

    JOURNAL OF INTERIOR DESIGN, Issue 3 2006
    Allan Hing M.A.
    ABSTRACT The plan is a powerful tool in the design process that requires both intellectual and creative rigor. The focus of this article is the methodology of how the plan is presented, developed, and understood in sophomore interior design studio. The studio's goal is to give students a foundation of understanding by broadening their knowledge of spatial design through the study of the plan and plan language. The plan is what architects and interior and landscape designers use to move people through space, to organize space, and to place objects in space. The article outlines how this material is presented through readings, lectures, and design projects. The lessons require students to be creative and analytical in developing a plan, to gain visual literacy in understanding a plan and its spaces, and to use plan language in their explanations in studio. Students should learn to think and speak in terms of entry, path, and goal. Plan language includes such terms as axis, centering and re-centering, symmetry, focal points, gesturing, reinforcement, in-line, articulation, and hierarchy. Students are required to take a letterform and develop an orchestrated spatial walk through the form using plan language, and then they must complete a series of diagrams. Past and present plan types are analyzed and important architects who have contributed to the plan and plan language are discussed (Mackintosh, Wright, Le Corbusier, and Scarpa). The plan is the element which most interior designers use to develop space. Therefore, students and educators should have a greater understanding and vocabulary for such an important tool. [source]


    Integrated estimation of measurement error with empirical process modeling: A hierarchical Bayes approach

    AICHE JOURNAL, Issue 11 2009
    Hongshu Chen
    Abstract Advanced empirical process modeling methods such as those used for process monitoring and data reconciliation rely on information about the nature of noise in the measured variables. Because this likelihood information is often unavailable for many practical problems, approaches based on repeated measurements or process constraints have been developed for their estimation. Such approaches are limited by data availability and often lack theoretical rigor. In this article, a novel Bayesian approach is proposed to tackle this problem. Uncertainty about the error variances is incorporated in the Bayesian framework by setting noninformative priors for the noise variances. This general strategy is used to modify the Sampling-based Bayesian Latent Variable Regression (Chen et al., J Chemom., 2007) approach, to make it more robust to inaccurate information about the likelihood functions. Different noninformative priors for the noise variables are discussed and unified in this work. The benefits of this new approach are illustrated via several case studies. © 2009 American Institute of Chemical Engineers AIChE J, 2009 [source]
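
    Illustrative sketch (a generic toy, not the authors' Bayesian latent variable regression): the core move of placing a noninformative prior on an unknown noise variance and sampling it jointly with the other unknowns can be shown with a two-step Gibbs sampler for repeated measurements of a single true value.

        import numpy as np

        rng = np.random.default_rng(2)
        y = rng.normal(5.0, 0.3, size=50)   # noisy measurements of one true value

        # Model: y_i ~ N(mu, sigma2); flat prior on mu, p(sigma2) ~ 1/sigma2.
        n, draws = len(y), 5000
        mu, sigma2 = y.mean(), y.var()
        samples = np.empty((draws, 2))
        for t in range(draws):
            # mu | sigma2, y  ~  Normal(ybar, sigma2/n)
            mu = rng.normal(y.mean(), np.sqrt(sigma2 / n))
            # sigma2 | mu, y  ~  Inverse-Gamma(n/2, sum((y - mu)^2)/2)
            sigma2 = 1.0 / rng.gamma(n / 2.0, 2.0 / np.sum((y - mu) ** 2))
            samples[t] = (mu, sigma2)

        burn = samples[1000:]                # discard warm-up draws
        print("posterior mean of mu:", burn[:, 0].mean())
        print("posterior mean of noise variance:", burn[:, 1].mean())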


    Strategies to Facilitate Lifestyle Change Associated with Diabetes Mellitus

    JOURNAL OF NURSING SCHOLARSHIP, Issue 3 2000
    Robin Whittemore
    Purpose: To critically examine the literature about strategies and factors that influence lifestyle change in people with diabetes mellitus. Method: This integrative literature review included an extensive search of published literature about strategies to facilitate lifestyle change related to diabetes mellitus. Articles reviewed were empirical studies focused on lifestyle change and were published between 1985 and 1999. Meta-analyses and relevant reviews of the literature were also included. Over 90 articles were initially identified, 72 met the specified criteria and could be categorized according to a theoretical framework or a broad topic area. Findings: Studies were clustered into the categories of educational, behavioral, cultural, and health beliefs that influence or are barriers to lifestyle change. Studies indicate that positive outcomes are associated with diabetes education programs that focus on self-management, emphasize behavioral strategies, and provide culturally relevant information. Conclusions: Theoretically grounded research in diabetes care is imperative for the future. Expansion of research methods, continued methodological rigor of studies, and instrument development would contribute to knowledge development in diabetes care. Research priorities are proposed. [source]


    A Taxonomy of Passive Behaviors in People with Alzheimer's Disease

    JOURNAL OF NURSING SCHOLARSHIP, Issue 3 2000
    Kathleen Byrne Colling
    Purpose: To construct a taxonomy of passive behaviors for understanding people with Alzheimer's disease. Passive behaviors are those associated with decreased motor movements, decreasing interactions with the environment, and feelings of apathy and listlessness. Little is known about behaviors associated with passivity, and these behaviors have not been categorized. Organizing Construct: Taxonomy construction. Passive behaviors in people with Alzheimer's disease were conceptualized as disturbing behaviors, patterns of personality change, and negative symptoms. Methods: The taxonomy was developed using critical reviews of 15 empirical studies published 1985 through 1998. Procedures included listing behaviors; clustering behaviors into inductively derived groupings; conducting an expert panel review, making revisions, and conducting a second review; and establishing global and category-by-category reliability using Cohen's Kappa. Findings: The nonhierarchic, natural taxonomy indicated five categories of behaviors associated with passivity in Alzheimer's disease: diminutions of cognition, psychomotor activity, emotions, interactions with people, and interactions with the environment. Analysis indicated substantial agreement beyond chance and showed statistically significant agreement among the six nurse-expert raters. Areas of synchrony between the taxonomy and the Need-Driven Dementia-Compromised Behavior Model were identified. Conclusions: This taxonomy of passive behaviors in patients with Alzheimer's disease showed empirical rigor and compatibility with a middle-range theory and can be viewed as a sensitizing analytic scheme to guide future practice, research, and theory development. [source]
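
    Illustrative sketch (made-up ratings, not the study's data): Cohen's Kappa, used above for rater reliability, corrects the raw agreement between two raters for the agreement expected by chance.

        from collections import Counter

        def cohens_kappa(r1, r2):
            """Chance-corrected agreement between two raters."""
            n = len(r1)
            p_obs = sum(a == b for a, b in zip(r1, r2)) / n
            c1, c2 = Counter(r1), Counter(r2)
            p_exp = sum(c1[k] * c2[k] for k in c1) / n ** 2
            return (p_obs - p_exp) / (1 - p_exp)

        # Hypothetical assignments to the taxonomy's five passivity categories.
        rater1 = ["cognition", "emotion", "people", "environment", "psychomotor", "emotion"]
        rater2 = ["cognition", "emotion", "people", "psychomotor", "psychomotor", "emotion"]
        print(f"kappa = {cohens_kappa(rater1, rater2):.2f}")   # 0.79 here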


    Density functional theory for chemical engineering: From capillarity to soft materials

    AICHE JOURNAL, Issue 3 2006
    Jianzhong Wu
    Abstract Understanding the microscopic structure and macroscopic properties of condensed matter from a molecular perspective is important for both traditional and modern chemical engineering. A cornerstone of such understanding is provided by statistical mechanics, which bridges the gap between molecular events and the structural and physiochemical properties of macro- and mesoscopic systems. With ever-increasing computer power, molecular simulations and ab initio quantum mechanics are promising to provide a nearly exact route to accomplishing the full potential of statistical mechanics. However, in light of their versatility for solving problems involving multiple length and timescales that are yet unreachable by direct simulations, phenomenological and semiempirical methods remain relevant for chemical engineering applications in the foreseeable future. Classical density functional theory offers a compromise: on the one hand, it is able to retain the theoretical rigor of statistical mechanics and, on the other hand, similar to a phenomenological method, it demands only modest computational cost for modeling the properties of uniform and inhomogeneous systems. Recent advances are summarized of classical density functional theory with emphasis on applications to quantitative modeling of the phase and interfacial behavior of condensed fluids and soft materials, including colloids, polymer solutions, nanocomposites, liquid crystals, and biological systems. Attention is also given to some potential applications of density functional theory to material fabrications and biomolecular engineering. © 2005 American Institute of Chemical Engineers AIChE J, 2006 [source]


    A simulation-optimization framework for research and development pipeline management

    AICHE JOURNAL, Issue 10 2001
    Dharmashankar Subramanian
    The Research and Development Pipeline management problem has far-reaching economic implications for new-product-development-driven industries, such as pharmaceutical, biotechnology and agrochemical industries. Effective decision-making is required with respect to portfolio selection and project task scheduling in the face of significant uncertainty and an ever-constrained resource pool. The here-and-now stochastic optimization problem inherent to the management of an R&D Pipeline is described in its most general form, as well as a computing architecture, Sim-Opt, that combines mathematical programming and discrete event system simulation to assess the uncertainty and control the risk present in the pipeline. The R&D Pipeline management problem is viewed in Sim-Opt as the control problem of a performance-oriented, resource-constrained, stochastic, discrete-event, dynamic system. The concept of time lines is used to study multiple unique realizations of the controlled evolution of the discrete-event pipeline system. Four approaches using various degrees of rigor were investigated for the optimization module in Sim-Opt, and their relative performance is explored through an industrially motivated case study. Methods are presented to efficiently integrate information across the time lines from this framework. This integration of information demonstrated in a case study was used to infer a creative operational policy for the corresponding here-and-now stochastic optimization problem. [source]
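
    Illustrative sketch (a toy pipeline, not the Sim-Opt architecture): the "time lines" idea of replaying many stochastic evolutions of the same controlled pipeline and aggregating across realizations can be mimicked with a small Monte Carlo loop; the project names, stage success probabilities, and durations below are invented.

        import numpy as np

        rng = np.random.default_rng(3)

        # Toy pipeline: per-project stage success probabilities and mean durations.
        projects = {"A": ([0.8, 0.5, 0.3], [1.0, 2.0, 3.0]),
                    "B": ([0.9, 0.6, 0.4], [0.5, 1.5, 2.5])}

        def run_time_line(rng):
            """One stochastic realization: total elapsed time and launched projects."""
            t, launched = 0.0, []
            for name, (p_stages, d_stages) in projects.items():  # one shared resource
                for p, d in zip(p_stages, d_stages):
                    t += rng.exponential(d)        # random stage duration
                    if rng.random() > p:           # stage failure kills the project
                        break
                else:
                    launched.append(name)          # all stages succeeded
            return t, launched

        runs = [run_time_line(rng) for _ in range(10_000)]
        print("P(at least one launch):", np.mean([len(l) > 0 for _, l in runs]))
        print("mean pipeline time:", round(float(np.mean([t for t, _ in runs])), 2))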


    Future Prospects for Biomarkers of Alcohol Consumption and Alcohol-Induced Disorders

    ALCOHOLISM, Issue 6 2010
    Willard M. Freeman
    The lack of reliable measures of alcohol intake is a major obstacle to the diagnosis, treatment, and research of alcohol abuse and alcoholism. Successful development of a biomarker that allows for accurate assessment of alcohol intake and drinking patterns would not only be a major advance in clinical care but also a valuable research tool. A number of advances have been made in testing the validity of proposed biomarkers as well as in identifying potential new biomarkers through systems biology approaches. This commentary will examine the definition of a biomarker of heavy drinking, the types of potential biomarkers, the steps in biomarker development, the current state of biomarker development, and critical obstacles for the field. The challenges in developing biomarkers for alcohol treatment and research are similar to those found in other fields. However, the alcohol research field must reach a competitive level of rigor and organization. We recommend that NIAAA consider taking a leadership role in organizing investigators in the field and providing a common set of clinical specimens for biomarker validation studies. [source]


    Patient Recall in Advanced Education in Prosthodontics Programs in the United States

    JOURNAL OF PROSTHODONTICS, Issue 4 2010
    Fatemeh S. Afshari DMD
    Abstract Purpose: This study surveyed program directors of Advanced Education Programs in Prosthodontics (AEPP) in the United States to determine the extent, type, incidence, and perceived effectiveness of implemented recall systems. Material and Methods: Surveys were sent to AEPP directors across the United States to assess their program's recall protocol. This survey first identified whether an active recall program existed. For programs with recall systems, rigor in promoting ongoing oral health was surveyed by focusing on recall frequency, patient tracking protocol, involved personnel, interaction with other university departments, provided clinical procedures, and therapy completion protocol. Whether the directors perceived that their recall system was successful was also investigated. Results: Thirty-three of 46 programs responded, giving a response rate of 72%. Of these 33 programs, only 21 (64%) had an active recall system, although 30 (91%) believed recall to be important. Twelve (57%) directors with recall programs considered their system to be effective. Conclusions: Prosthodontic program directors felt their program's recall effectiveness could be improved. Due to the numerous potential benefits of an active recall system, AEPPs should consider implementing or enhancing their recall programs. Further studies are indicated to determine specific criteria that describe an effective recall system for prosthodontic programs within the context of patient health promotion, program curriculum, and financial ramifications. [source]


    So mechanical or routine: The not original in Feist

    JOURNAL OF THE AMERICAN SOCIETY FOR INFORMATION SCIENCE AND TECHNOLOGY, Issue 4 2010
    Julian Warner
    The United States Supreme Court case of 1991, Feist Publications, Inc. v. Rural Tel. Service Co., continues to be highly significant for property in data and databases, but remains poorly understood. The approach taken in this article contrasts with previous studies. It focuses upon the "not original" rather than the original. The delineation of the absence of a modicum of creativity in selection, coordination, and arrangement of data as a component of the not original forms a pivotal point in the Supreme Court decision. The author also aims at elucidation rather than critique, using close textual exegesis of the Supreme Court decision. The results of the exegesis are translated into a more formal logical form to enhance clarity and rigor. The insufficiently creative is initially characterized as "so mechanical or routine." Mechanical and routine are understood in their ordinary discourse senses, as a conjunction or as connected by AND, and as the central clause. Subsequent clauses amplify the senses of mechanical and routine without disturbing their conjunction. The delineation of the absence of a modicum of creativity can be correlated with classic conceptions of computability. The insufficiently creative can then be understood as a routine selection, coordination, or arrangement produced by an automatic mechanical procedure or algorithm. An understanding of a modicum of creativity and of copyright law is also indicated. The value of the exegesis and interpretation is identified as its final simplicity, clarity, comprehensiveness, and potential practical utility. [source]
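
    Illustrative sketch (one possible rendering, not the author's published formalization): the pivotal clause, read as the conjunction the abstract describes, might be written in logic notation as

        % 'Not original': selection, coordination, or arrangement lacking a
        % modicum of creativity is mechanical AND routine, amplified as being
        % producible by an automatic mechanical procedure (an algorithm).
        \neg\mathrm{Creative}(w) \iff \mathrm{Mechanical}(w) \land \mathrm{Routine}(w)
        \qquad
        \mathrm{Mechanical}(w) \land \mathrm{Routine}(w) \implies
            \exists A\,[\mathrm{Algorithm}(A) \land \mathrm{Produces}(A, w)]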