Intuitive Appeal



Selected Abstracts


THE INTERACTION OF ANTISOCIAL PROPENSITY AND LIFE-COURSE VARYING PREDICTORS OF DELINQUENT BEHAVIOR: DIFFERENCES BY METHOD OF ESTIMATION AND IMPLICATIONS FOR THEORY

CRIMINOLOGY, Issue 2 2007
GRAHAM C. OUSEY
Recent criminological research has explored the extent to which stable propensity and life-course perspectives may be integrated to provide a more comprehensive explanation of variation in individual criminal offending. One line of these integrative efforts focuses on the ways that stable individual characteristics may interact with, or modify, the effects of life-course varying social factors. Given their consistency with the long-standing view that person–environment interactions contribute to variation in human social behavior, these theoretical integration attempts have great intuitive appeal. However, a review of past criminological research suggests that conceptual and empirical complexities have, so far, somewhat dampened the development of a coherent theoretical understanding of the nature of interaction effects between stable individual antisocial propensity and time-varying social variables. In this study, we outline and empirically assess several of the sometimes conflicting hypotheses regarding the ways that antisocial propensity moderates the influence of time-varying social factors on delinquent offending. Unlike some prior studies, however, we explicitly measure the interactive effects of stable antisocial propensity and time-varying measures of selected social variables on changes in delinquent offending. In addition, drawing on recent research suggesting that the relative ubiquity of interaction effects in past studies may be partly attributable to the poorly suited application of linear statistical models to delinquency data, we test our interaction hypotheses in both least-squares and tobit estimation frameworks. Our findings suggest that the method of estimation matters, with interaction effects appearing readily under least squares but not under tobit estimation. The implications of these findings for future conceptual and empirical work on stable propensity/time-varying social variable interaction effects are discussed. [source]
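A minimal sketch, not taken from the paper, of why the estimation framework can matter: when an offending outcome is censored at zero, an ordinary least-squares fit of a propensity-by-peers interaction can pick up a spurious interaction that a tobit (censored-regression) fit does not. All variable names, effect sizes, and the simulated data below are invented for illustration.

```python
# Simulated illustration: OLS vs. tobit estimates of an interaction coefficient
# when the outcome is left-censored at zero and the latent model has NO interaction.
import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(0)
n = 2000
propensity = rng.normal(size=n)            # stable antisocial propensity (invented)
peers = rng.normal(size=n)                 # time-varying social variable (invented)
latent = 0.5 * propensity + 0.8 * peers + rng.normal(scale=1.5, size=n)
y = np.maximum(latent, 0.0)                # offending scale censored at zero

X = np.column_stack([np.ones(n), propensity, peers, propensity * peers])

# Least-squares estimate of the interaction coefficient (last column of X)
beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]

# Tobit (left-censored at zero) negative log-likelihood
def neg_loglik(params):
    beta, sigma = params[:-1], np.exp(params[-1])
    xb = X @ beta
    censored = y <= 0.0
    ll = np.where(
        censored,
        stats.norm.logcdf(-xb / sigma),                      # P(latent <= 0)
        stats.norm.logpdf((y - xb) / sigma) - np.log(sigma), # density of observed y
    )
    return -ll.sum()

res = optimize.minimize(neg_loglik, np.append(beta_ols, 0.0), method="BFGS")
beta_tobit = res.x[:-1]

print("interaction coefficient, OLS  :", beta_ols[3])
print("interaction coefficient, tobit:", beta_tobit[3])
```

Because the censored conditional mean is a nonlinear function of the linear predictor, OLS tends to load that curvature onto the product term even when the latent model contains no interaction, whereas the tobit fit recovers an interaction coefficient near zero.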


Seasonal mortality and the effect of body size: a review and an empirical test using individual data on brown trout

FUNCTIONAL ECOLOGY, Issue 4 2008
Stephanie M. Carlson
Summary. 1. For organisms inhabiting strongly seasonal environments, over-winter mortality is thought to be severe and size-dependent, with larger individuals presumed to survive at a higher rate than smaller conspecifics. Despite the intuitive appeal and prevalence of these ideas in the literature, few studies have formally tested these hypotheses. 2. Here we tested the support for these two hypotheses in stream-dwelling salmonids. In particular, we combined an empirical study, in which we tracked the fate of individually marked brown trout across multiple seasons and multiple years, with a literature review, in which we compiled the results of all previous pertinent research on stream-dwelling salmonids. 3. We report that over-winter mortality does not consistently exceed mortality during other seasons. This result emerged both from our own research and from our review of previous research on whether winter survival is lower than survival during other seasons. 4. We also report that bigger is not always better in terms of survival; indeed, bigger is often worse. Again, this result emerged both from our own empirical work and from the compilation of previous research on the relationship between size and survival. 5. We suggest that these results are not entirely unexpected, because self-sustaining populations are presumably adapted to the predictable seasonal variation in environmental conditions that they experience. [source]


Use of pharmacoeconomics in prescribing research.

JOURNAL OF CLINICAL PHARMACY & THERAPEUTICS, Issue 4 2003
Part 4: is cost-utility analysis a useful tool?
Summary. This paper is the fourth in the 'Research Note' series describing various aspects of pharmacoeconomics in prescribing research. This article describes cost-utility analysis and how it can be incorporated into pharmacoeconomic evaluation. While this approach has intuitive appeal, uncertainties about the methods for estimating utilities mean that the value of cost-utility analysis in prescribing research is still to be established. Importantly, cost-utility analysis may not capture aspects of quality of life that enable meaningful comparisons between prescribing choices at an individual or societal level. [source]
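As a concrete illustration of the arithmetic behind a cost-utility comparison, the sketch below computes an incremental cost per quality-adjusted life year (QALY) for two hypothetical prescribing choices; the costs, utility weights, and time horizon are invented, not drawn from the paper.

```python
# Hypothetical numbers only: the arithmetic of a cost-utility comparison.
# QALYs = utility weight x years spent in the health state.
costs   = {"drug_A": 1200.0, "drug_B": 2500.0}   # cost per patient (currency units)
utility = {"drug_A": 0.70,   "drug_B": 0.78}     # utility weights on a 0-1 scale
years   = 5.0                                    # time horizon

qalys = {name: u * years for name, u in utility.items()}

incremental_cost  = costs["drug_B"] - costs["drug_A"]
incremental_qalys = qalys["drug_B"] - qalys["drug_A"]

# Incremental cost-utility ratio: extra cost per extra QALY gained
icur = incremental_cost / incremental_qalys
print(f"Incremental cost per QALY gained: {icur:.0f}")
```

The uncertainty the paper highlights enters through the utility weights: small changes in how the 0.70 and 0.78 are elicited can move the ratio substantially.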


Non-parametric tests for distributional treatment effect for randomly censored responses

JOURNAL OF THE ROYAL STATISTICAL SOCIETY: SERIES B (STATISTICAL METHODOLOGY), Issue 1 2009
Myoung-jae Lee
Summary. For a binary treatment d = 0, 1 and the corresponding 'potential response' Y0 for the control group (d = 0) and Y1 for the treatment group (d = 1), one definition of no treatment effect is that Y0 and Y1 follow the same distribution given a covariate vector X. Koul and Schick have provided a non-parametric test for no distributional effect when the realized response (1 − d)Y0 + dY1 is fully observed and the distribution of X is the same across the two groups. This test is thus not applicable to censored responses, nor to non-experimental (i.e. observational) studies that entail different distributions of X across the two groups. We propose 'X-matched' non-parametric tests generalizing the test of Koul and Schick, following an idea of Gehan. Our tests are applicable to non-experimental data with randomly censored responses. In addition to these motivations, the tests have several advantages. First, they have the intuitive appeal of comparing all available pairs across the treatment and control groups, instead of selecting a number of matched controls (or treated) as in the usual pair or multiple matching. Second, whereas most matching estimators or tests face a problem of non-overlapping support (of X) across the two groups, our tests have a built-in protection against this problem. Third, Gehan's idea allows the tests to make good use of censored observations. A simulation study is conducted, and an empirical illustration for a job-training effect on the duration of unemployment is provided. [source]
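A minimal sketch of the Gehan-style pairwise scoring that underlies the idea of comparing all available treated-control pairs under right censoring, without the covariate ('X-matched') refinement the proposed tests add. The durations, censoring indicators, and group sizes below are invented.

```python
# Gehan-type pairwise comparison for right-censored responses (illustrative only).
import numpy as np

def gehan_score(t1, d1, t0, d0):
    """+1 if the treated response is known to exceed the control response,
    -1 if it is known to be smaller, 0 if censoring leaves the order ambiguous.
    d = 1 means the response was observed; d = 0 means it was censored."""
    if d0 == 1 and (t1 > t0 or (t1 == t0 and d1 == 0)):
        return 1
    if d1 == 1 and (t0 > t1 or (t0 == t1 and d0 == 0)):
        return -1
    return 0

def gehan_statistic(t_treat, d_treat, t_ctrl, d_ctrl):
    """Sum of pairwise scores over all treated-control pairs; values far from
    zero suggest a distributional difference between the two groups."""
    return sum(
        gehan_score(t1, d1, t0, d0)
        for t1, d1 in zip(t_treat, d_treat)
        for t0, d0 in zip(t_ctrl, d_ctrl)
    )

# Invented unemployment-duration-style example (weeks; d = 0 means still unemployed at follow-up)
t_treat = np.array([5, 8, 12, 20, 30]); d_treat = np.array([1, 1, 1, 0, 1])
t_ctrl  = np.array([7, 9, 15, 25, 25]); d_ctrl  = np.array([1, 1, 0, 1, 1])
print(gehan_statistic(t_treat, d_treat, t_ctrl, d_ctrl))
```

Censored observations still contribute to every pair in which the ordering is unambiguous, which is the sense in which Gehan's idea "makes good use" of them.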


Utopia and the doubters: truth, transition and the law

LEGAL STUDIES, Issue 3 2008
Colm Campbell
Truth commissions have an intuitive appeal in squaring the circle of peace and accountability post-conflict, but some claims for their benefits risk utopianism. Law provides both opportunities and pitfalls for post-conflict justice initiatives, including the operation of truth commissions. Rather than adopting a heavily legalised approach derived from Public Inquiries, a 'holistic legal model', employing social science fact-finding methodologies to explore patterns of violations and drawing appropriately on legal standards, may provide the best option for a possible Northern Ireland truth commission. [source]


The statistics of refractive error maps: managing wavefront aberration analysis without Zernike polynomials

OPHTHALMIC AND PHYSIOLOGICAL OPTICS, Issue 3 2009
D. Robert Iskander
Abstract The refractive error of a human eye varies across the pupil and therefore may be treated as a random variable. The probability distribution of this random variable provides a means for assessing the main refractive properties of the eye without the necessity of a traditional functional representation of wavefront aberrations. To demonstrate this approach, the statistical properties of refractive error maps are investigated. Closed-form expressions are derived for the probability density function (PDF) and its statistical moments for the general case of rotationally symmetric aberrations. A closed-form expression for the PDF of a general non-rotationally symmetric wavefront aberration is difficult to derive. However, for specific cases, such as astigmatism, a closed-form expression of the PDF can be obtained. Further, interpretation of the distribution of the refractive error map, as well as its moments, is provided for a range of wavefront aberrations measured in real eyes. These are evaluated using kernel density and sample moment estimators. It is concluded that the refractive error domain allows non-functional analysis of wavefront aberrations based on simple statistics in the form of its sample moments. Clinicians may find this approach to wavefront analysis easier to interpret owing to the clinical familiarity and intuitive appeal of refractive error maps. [source]
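A minimal sketch of the non-functional analysis described: given refractive error sampled across the pupil, estimate its distribution with a kernel density estimator and summarise it with sample moments. The simulated map below (power varying quadratically with pupil radius) and its coefficients are assumptions for illustration, not the paper's data or formulas.

```python
# Kernel density estimate and sample moments of a simulated refractive error map.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 5000

# Pupil positions sampled uniformly over a unit-radius pupil
r = np.sqrt(rng.uniform(0.0, 1.0, size=n))      # radial coordinate

# Assumed rotationally symmetric refractive error map (dioptres): power varying
# quadratically with pupil radius; the coefficients are invented
refractive_error = -1.00 + 0.75 * r**2

# Kernel density estimate of the refractive error distribution
kde = stats.gaussian_kde(refractive_error)
grid = np.linspace(refractive_error.min(), refractive_error.max(), 200)
density = kde(grid)                              # estimated PDF over the grid

# Sample moments summarising the same distribution
print("mean     :", refractive_error.mean())
print("variance :", refractive_error.var(ddof=1))
print("skewness :", stats.skew(refractive_error))
print("kurtosis :", stats.kurtosis(refractive_error))
```

The moments summarise the map directly in dioptres, which is what makes the refractive error domain easier to read clinically than a list of polynomial coefficients.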


Putting the Image Back in Imagination

PHILOSOPHY AND PHENOMENOLOGICAL RESEARCH, Issue 1 2001
AMY KIND
Despite their intuitive appeal and a long philosophical history, imagery-based accounts of the imagination have fallen into disfavor in contemporary discussions. The philosophical pressure to reject such accounts seems to derive from two distinct sources. First, the fact that mental images have proved difficult to accommodate within a scientific conception of the mind has led to numerous attempts to explain away their existence, and this in turn has led to attempts to explain the phenomenon of imagining without reference to such ontologically dubious entities as mental images. Second, even those philosophers who accept mental images in their ontology have worried about what seem to be fairly obvious examples of imaginings that occur without imagery. In this paper, I aim to relieve both of these points of philosophical pressure and, in the process, develop a new imagery-based account of the imagination: the imagery model. [source]


Power of Genetic Association Studies with Fixed and Random Genotype Frequencies

ANNALS OF HUMAN GENETICS, Issue 5 2010
Julia Kozlitina
Summary. When estimating the power of genetic association studies, the allele and genotype frequencies are often assumed to be known, and the numbers of individuals with each genotype are set equal to their expectations under Hardy-Weinberg equilibrium. In fact, both allele and genotype frequencies are unknown and thus random. It has previously been suggested that ignoring uncertainty in these parameters could lead to inflated power expectations. To overcome the problem, one can average power estimates over the distributions of unknown frequencies. We investigate the power-averaging method and find that, despite its intuitive appeal, it may not improve accuracy in practice, while significantly increasing computational time. For a fixed allele frequency, we show that the amount of overestimation diminishes rapidly with sample size and is completely negligible for N > 200. For an unknown frequency, the result of averaging depends on the genetic model, and may not always provide a more conservative estimate of power. We explore the effect of uncertainty in the factors that determine statistical power of association studies and propose a more economical approach to the power analysis. [source]
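A minimal sketch of the fixed-versus-averaged power comparison the abstract discusses, using a simple allele-based two-proportion z-test as the association test. The sample sizes, odds ratio, allele frequency, and significance threshold are invented for illustration and are not the paper's settings.

```python
# Power at a fixed control allele frequency vs. power averaged over the
# sampling variability of that frequency (normal-approximation allele test).
import numpy as np
from scipy import stats

def power_two_proportion(p_ctrl, odds_ratio, n_cases, n_controls, alpha=5e-8):
    """Approximate power of a two-sided comparison of case vs. control allele frequencies."""
    odds_case = odds_ratio * p_ctrl / (1.0 - p_ctrl)
    p_case = odds_case / (1.0 + odds_case)
    n1, n0 = 2 * n_cases, 2 * n_controls          # allele counts per group
    se = np.sqrt(p_case * (1 - p_case) / n1 + p_ctrl * (1 - p_ctrl) / n0)
    z_crit = stats.norm.ppf(1 - alpha / 2)
    z_shift = abs(p_case - p_ctrl) / se
    return stats.norm.cdf(z_shift - z_crit) + stats.norm.cdf(-z_shift - z_crit)

p0, n = 0.20, 1000

# Power with the allele frequency treated as fixed and known
fixed_power = power_two_proportion(p0, odds_ratio=1.4, n_cases=n, n_controls=n)

# Power averaged over the binomial sampling distribution of the frequency estimate
rng = np.random.default_rng(2)
p_draws = rng.binomial(2 * n, p0, size=10_000) / (2 * n)
averaged_power = np.mean(
    [power_two_proportion(p, odds_ratio=1.4, n_cases=n, n_controls=n) for p in p_draws]
)

print(f"fixed-frequency power : {fixed_power:.3f}")
print(f"averaged power        : {averaged_power:.3f}")
```

At sample sizes like this the two numbers are typically close, which is consistent with the abstract's point that the extra averaging step may not be worth its computational cost.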


Reparameterizing the Pattern Mixture Model for Sensitivity Analyses Under Informative Dropout

BIOMETRICS, Issue 4 2000
Michael J. Daniels
Summary. Pattern mixture models are frequently used to analyze longitudinal data where missingness is induced by dropout. For measured responses, it is typical to model the complete data as a mixture of multivariate normal distributions, where mixing is done over the dropout distribution. Fully parameterized pattern mixture models are not identified by incomplete data; Little (1993, Journal of the American Statistical Association, 88, 125–134) has characterized several identifying restrictions that can be used for model fitting. We propose a reparameterization of the pattern mixture model that allows investigation of sensitivity to assumptions about nonidentified parameters in both the mean and variance, allows consideration of a wide range of nonignorable missing-data mechanisms, and has intuitive appeal for eliciting plausible missing-data mechanisms. The parameterization makes clear an advantage of pattern mixture models over parametric selection models, namely that the missing-data mechanism can be varied without affecting the marginal distribution of the observed data. To illustrate the utility of the new parameterization, we analyze data from a recent clinical trial of growth hormone for maintaining muscle strength in the elderly. Dropout occurs at a high rate and is potentially informative. We undertake a detailed sensitivity analysis to understand the impact of missing-data assumptions on the inference about the effects of growth hormone on muscle strength. [source]
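A minimal sketch of the pattern-mixture sensitivity idea in a deliberately simplified one-outcome, two-pattern setting (not the multivariate normal model the paper develops): the mean among dropouts is not identified from the observed data, so it is indexed by a sensitivity parameter delta and the implied marginal mean is traced over a plausible range. All numbers are simulated and purely illustrative.

```python
# Two-pattern (completers vs. dropouts) sensitivity analysis for a single follow-up outcome.
import numpy as np

rng = np.random.default_rng(3)
n = 500

# Simulated trial arm; dropout is generated independently here purely for simplicity
dropout = rng.uniform(size=n) < 0.3               # 30% drop out before follow-up
outcome = rng.normal(loc=2.0, scale=1.0, size=n)  # outcome for everyone (unseen for dropouts)
observed = outcome[~dropout]                      # the analyst only sees completers

pi_dropout = dropout.mean()                       # estimated dropout-pattern proportion
mu_completers = observed.mean()                   # identified from the observed data

# Identifying restriction indexed by delta: mean among dropouts = completer mean + delta.
# Delta is not estimable from the data, so it is varied over a plausible range.
for delta in [-1.0, -0.5, 0.0, 0.5, 1.0]:
    mu_dropouts = mu_completers + delta
    marginal_mean = (1 - pi_dropout) * mu_completers + pi_dropout * mu_dropouts
    print(f"delta = {delta:+.1f}  implied marginal mean = {marginal_mean:.3f}")
```

Because delta only shifts the unobserved component of the mixture, changing it leaves the fit to the observed data untouched, which mirrors the advantage over selection models noted in the abstract.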