Statistical Concepts

Selected Abstracts


Conceptual problems in laypersons' understanding of individualized cancer risk: a qualitative study

HEALTH EXPECTATIONS, Issue 1 2009
Paul K. J. Han MD MA MPH
Abstract Objective: To explore laypersons' understanding of individualized cancer risk estimates, and to identify conceptual problems that may limit this understanding. Background: Risk prediction models are increasingly used to provide people with information about their individual risk of cancer and other diseases. However, laypersons may have difficulty understanding individualized risk information because of conceptual as well as computational problems. Design: A qualitative study was conducted using focus groups. Semi-structured interviews explored participants' understandings of the concept of risk and their interpretations of a hypothetical individualized colorectal cancer risk estimate. Setting and participants: Eight focus groups were conducted with 48 adults aged 50–74 years residing in two major US metropolitan areas. Participants had high school or greater education, some familiarity with information technology, and no personal or family history of cancer. Results: Several important conceptual problems were identified. Most participants thought of risk not as a neutral statistical concept, but as signifying danger and emotional threat, and viewed cancer risk in terms of concrete risk factors rather than mathematical probabilities. Participants had difficulty acknowledging the uncertainty implicit in the concept of risk, and judging the numerical significance of individualized risk estimates. The most challenging conceptual problems related to conflict between subjective and objective understandings of risk, and difficulties translating aggregate-level objective risk estimates to the individual level. Conclusions: Several conceptual problems limit laypersons' understanding of individualized cancer risk information. These problems have implications for future research on health numeracy, and for the application of risk prediction models in clinical and public health settings. [source]


Foundational Value of Statistics Education for Management Curriculum

INTERNATIONAL STATISTICAL REVIEW, Issue 3 2007
Hirokuni Tamura
Human information processing; Education/Teaching; Decision making; Forecasting; Statistical models Summary The purpose of this paper is to propose a unique and distinct value of statistics education for management. The 1986 inaugural conference on Making Statistics More Effective in Schools of Business (MSMESB) proposed valuable guidelines for reforming statistics education in schools of business. However, a survey conducted by McAlevey & Everett (2001) found that their impact has been minimal, and argued that the structural problems many business schools face are the likely cause. We argue that these structural problems exist because the value of the body of statistical tools for management is ambiguous and has not been made explicit. The unique and distinct value of statistics for management can be identified as the body of tools necessary to meet the inherent needs of a manager charged with making predictive judgments in the face of data. The need arises because human information-processing capacity is quite limited, as the findings of researchers in cognitive psychology testify. These findings also affirm that the basic statistical concepts needed for processing data cannot be learned from management experience alone. The model of a manager faced with data, combined with the evidence of inherent limitations on human information-processing capacity, establishes the foundational value of statistics training in the management curriculum. Statistics education in business schools will be made more effective when management educators recognize this value of the discipline, lend their support, and reward ownership of and commitment to continuous improvement and innovation in the business statistics curriculum. [source]


Integrating the Principles of Evidence-Based Practice Into Clinical Practice

JOURNAL OF THE AMERICAN ACADEMY OF NURSE PRACTITIONERS, Issue 3 2004
Kathleen A. Klardie RN
Column Editor Comment This series of articles illustrates many considerations relevant to the application of clinical practice guidelines (CPGs). This particular column describes the actions of a nurse practitioner (NP) striving to understand the foundations of recommendations that are based largely on expert opinion. Although application of CPGs does not generally require this degree of investigation, it is essential that providers understand the processes used to interpret the basis of recommendations, including the application of basic statistical concepts, when making decisions about how recommendations apply to individual patient scenarios. Utilizing evidence-based practice when providing patient care requires a range of skills that allows the NP to locate appropriate research evidence, to develop an understanding of the statistics used in interpreting and reporting research, and to evaluate the effects of interventions on patient outcomes. The application of the key concepts of evidence-based practice within the primary care setting is explored through a hypothetical patient scenario, which was created as the focal point for three articles that illustrate principles of evidence-based practice. The goal of this series of articles is to provide a basic understanding of evidence-based practice and its application in clinical practice. This article explores the use of interventions selected from CPGs and investigates the potential effects of recommended interventions on patient outcomes. Commonly encountered statistical concepts are reviewed, and examples of their application in interpreting and reporting research are demonstrated. The principles of relative risk, relative risk reduction, absolute risk reduction, and numbers needed to treat are described. This review provides the NP with some basic skills to determine both the quality and usefulness of research. [source]
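The four effect measures named in the abstract above (relative risk, relative risk reduction, absolute risk reduction, and number needed to treat) all derive from the same two event rates. The following sketch shows the standard textbook definitions; the function name and the trial counts are illustrative, not taken from the article.

```python
def risk_measures(events_treated, n_treated, events_control, n_control):
    """Compute common effect measures from two-arm event counts."""
    cer = events_control / n_control   # control event rate
    eer = events_treated / n_treated   # experimental event rate
    rr = eer / cer                     # relative risk
    rrr = 1 - rr                       # relative risk reduction
    arr = cer - eer                    # absolute risk reduction
    nnt = 1 / arr                      # number needed to treat
    return {"RR": rr, "RRR": rrr, "ARR": arr, "NNT": nnt}

# Hypothetical trial: 10/100 events on treatment vs. 20/100 on control.
m = risk_measures(10, 100, 20, 100)
# RR = 0.5, RRR = 0.5, ARR = 0.10, NNT = 10
```

Note how the same relative risk reduction (50%) can correspond to very different absolute risk reductions depending on the baseline event rate, which is why the abstract treats ARR and NNT as separate concepts worth teaching.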


Statistical methods in spatial genetics

MOLECULAR ECOLOGY, Issue 23 2009
GILLES GUILLOT
Abstract The joint analysis of spatial and genetic data is rapidly becoming the norm in population genetics. More and more studies explicitly describe and quantify the spatial organization of genetic variation and try to relate it to underlying ecological processes. As it has become increasingly difficult to keep abreast of the latest methodological developments, we review the statistical toolbox available to analyse population genetic data in a spatially explicit framework. We mostly focus on statistical concepts but also discuss practical aspects of the analytical methods, highlighting not only the potential of various approaches but also methodological pitfalls. [source]


A Framework for Unifying Formal and Empirical Analysis

AMERICAN JOURNAL OF POLITICAL SCIENCE, Issue 3 2010
Jim Granato
An important disconnect exists between the current use of formal modeling and applied statistical analysis. In general, a lack of linkage between the two can produce statistically significant parameters of ambiguous origin that, in turn, fail to assist in falsifying theories and hypotheses. To address this scientific challenge, a framework for unification is proposed. Methodological unification leverages the mutually reinforcing properties of formal and applied statistical analysis to produce greater transparency in relating theory to test. This framework for methodological unification, or what has been referred to as the empirical implications of theoretical models (EITM), includes (1) connecting behavioral (formal) and applied statistical concepts, (2) developing behavioral (formal) and applied statistical analogues of these concepts, and (3) linking and evaluating the behavioral (formal) and applied statistical analogues. The elements of this EITM framework are illustrated with examples from voting behavior, macroeconomic policy and outcomes, and political turnout. [source]