Statistical Methodology

Distribution by Scientific Domains


Selected Abstracts


Teaching Statistical Consulting Before Statistical Methodology

AUSTRALIAN & NEW ZEALAND JOURNAL OF STATISTICS, Issue 2 2003
Ross H. Taplin
Abstract This paper outlines and discusses the advantages of an 'Introduction to Statistical Consulting' course (ISC) that exposes students to statistical consulting early in their studies. The course is intended for students before, or while, they study their units in statistical techniques, and assumes only a first-year introductory statistics unit. The course exposes undergraduate students to the application of statistics and helps develop statistical thinking. An important goal is to introduce students to work as a statistician early in their studies because this motivates some students to study statistics further and provides a framework to motivate the learning of further statistical techniques. The ISC has proved popular with students, and this paper discusses the reasons for this popularity and the benefits of an ISC to statistical education and the statistics profession. [source]


Therapeutic Equivalence – Clinical Issues and Statistical Methodology in Noninferiority Trials

BIOMETRICAL JOURNAL, Issue 1 2005
Axel Munk
This special issue on therapeutic equivalence contains a selection of 8 papers presented at the conference 'Therapeutic Equivalence – Clinical Issues and Statistical Methodology in Noninferiority Trials', held in Düsseldorf, December 12–13, 2003. The aim of this conference was to gather experts from academia, industry and regulatory agencies in the field of therapeutic equivalence, in particular noninferiority trials. Although originally initiated as a small workshop, it soon became clear that there was a strong need to discuss these challenging issues before a broader audience. Indeed, the feedback to this conference was overwhelming; in the end, more than 300 researchers participated. Hence the idea emerged to collect the results and discussions in a single journal issue. It took more than a year to complete, and various developments in this rapidly evolving area were incorporated along the way. We are very grateful to the Editors E. Brunner and M. Schumacher of the Biometrical Journal for their encouragement and support in publishing this special issue on the occasion of this conference. Further, the technical assistance and expertise of G. Skipka and K. Thangavelu is gratefully acknowledged. We are also indebted to Peter Bauer and Stephen Senn for their discussions of the subsequent articles by Bristol, Freitag, Hauschke, Slacik-Erben, Hensen and Kaufmann, Hung, Wang and O'Neill, Lange and Freitag, Tsong and Zhang, Wellek, and last but not least we would like to thank Joachim Röhmel for his contribution to this special issue. Joachim Röhmel has contributed significantly over the last three decades to various branches of biostatistical research, and in particular to the design and analysis of equivalence trials. The aim of this special issue is therefore twofold: it is also devoted to the occasion of Joachim Röhmel's retirement from BfArM in 2004. In the following, we briefly express our deep appreciation of his scientific work.
(© 2005 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim) [source]


Performance and effectiveness trade-off for checkpointing in fault-tolerant distributed systems

CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 1 2007
Panagiotis Katsaros
Abstract Checkpointing has a crucial impact on systems' performance and fault-tolerance effectiveness: excessive checkpointing results in performance degradation, while deficient checkpointing incurs expensive recovery. In distributed systems with independent checkpoint activities there is no easy way to determine checkpoint frequencies that optimize response-time and fault-tolerance costs at the same time. The purpose of this paper is to investigate the potential of a statistical decision-making procedure. We adopt a simulation-based approach for obtaining performance metrics that are afterwards used to determine a trade-off between checkpoint interval reductions and efficiency in performance. Statistical methodology, including experimental design, regression analysis and optimization, provides the framework for comparing configurations, which may use different fault-tolerance mechanisms (replication-based or message-logging-based). Systematic investigation also allows us to take into account additional design factors, such as load balancing. The method is described in terms of a standardized object replication model (OMG FT-CORBA), but it could also be applied in other (e.g. process-based) computational models. Copyright © 2006 John Wiley & Sons, Ltd. [source]
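The tension the abstract describes (checkpoint too often and you pay in overhead, too rarely and you pay in recovery) also has a classic closed-form first-order answer, Young's formula for the optimal checkpoint interval. The sketch below is illustrative only and is not the paper's simulation-based procedure; the checkpoint cost and failure-rate figures are hypothetical.

```python
import math

def young_interval(checkpoint_cost_s: float, mtbf_s: float) -> float:
    """First-order optimal checkpoint interval (Young, 1974).

    checkpoint_cost_s: time to write one checkpoint (seconds)
    mtbf_s: mean time between failures (seconds)
    """
    return math.sqrt(2.0 * checkpoint_cost_s * mtbf_s)

# Hypothetical example: 30 s checkpoints, one failure per 24 h on average
interval = young_interval(30.0, 24 * 3600.0)
print(round(interval / 60.0, 1), "minutes between checkpoints")
```

The formula balances expected checkpoint overhead against expected rework after a failure; the paper's approach goes further by searching over configurations with simulation and regression rather than assuming this simple cost model.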


Preliminary testing for normality: some statistical aspects of a common concept

CLINICAL & EXPERIMENTAL DERMATOLOGY, Issue 6 2006
V. Schoder
Summary Background. Statistical methodology has become an increasingly important topic in dermatological research. The adequacy of a statistical procedure depends, among other things, on distributional assumptions. In dermatological articles, the choice between parametric and nonparametric methods is often based on preliminary goodness-of-fit tests. Aim. For the special case of the assumption of normally distributed data, the Kolmogorov–Smirnov test is the most popular choice. We investigated the performance of this test on four types of non-normal data, representing the majority of real data in dermatological research. Methods. Simulations were run to assess the performance of the Kolmogorov–Smirnov test, depending on sample size and severity of violations of normality. Results. The Kolmogorov–Smirnov test performs badly on data with single outliers, 10% outliers and skewed data at sample sizes < 100, whereas normality is rejected to an acceptable degree for Likert-type data. Conclusion. Preliminary testing for normality is not recommended for small-to-moderate sample sizes. [source]
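A simulation of the kind the abstract describes can be sketched in a few lines. This is a minimal re-creation, not the authors' code: the sample size, replication count and skewed alternative (exponential data) are chosen for illustration, and the common practice of plugging the sample mean and SD into the Kolmogorov–Smirnov test is reproduced deliberately, since that is what the test's poor small-sample behaviour refers to.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

def rejection_rate(sampler, n, reps=2000, alpha=0.05):
    """Fraction of simulated samples in which the Kolmogorov-Smirnov test
    (with mean/SD estimated from the data, as commonly done in practice)
    rejects normality at level alpha."""
    rejections = 0
    for _ in range(reps):
        x = sampler(n)
        z = (x - x.mean()) / x.std(ddof=1)  # plug-in standardisation
        if stats.kstest(z, "norm").pvalue < alpha:
            rejections += 1
    return rejections / reps

# Clearly skewed data: the test should reject normality, but at n = 30
# it does so only in a fraction of the replications
rate = rejection_rate(lambda n: rng.exponential(size=n), 30)
print(f"rejection rate for exponential data at n=30: {rate:.2f}")
```

Note that using standard KS critical values after estimating the mean and SD makes the test conservative (the Lilliefors correction addresses this), which compounds the low power at small sample sizes.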


Effects of 17β-estradiol exposure on Xenopus laevis gonadal histopathology

ENVIRONMENTAL TOXICOLOGY & CHEMISTRY, Issue 5 2010
Jeffrey C. Wolf
Abstract The natural estrogen 17β-estradiol (E2) is a potential environmental contaminant commonly employed as a positive control substance in bioassays involving estrogenic effects. The aquatic anuran Xenopus laevis is a frequent subject of reproductive endocrine disruptor research; however, histopathological investigations have tended to be less than comprehensive. Consequently, a study was designed to characterize gross and microscopic changes in the gonads of X. laevis as a result of E2 exposure. Additional goals of this study, which consisted of three separate experiments, included the standardization of diagnostic terminology and criteria, the validation of statistical methodology, and the establishment of a half maximal effective concentration (EC50) for E2, as defined by an approximately 50% conversion of presumptive genotypic males to phenotypic females. In the first experiment, frogs were exposed to nominal concentrations of 0, 0.2, 1.5, or 6.0 µg/L E2. From these experimental results and those of a subsequent range-finding trial, the EC50 for E2 was determined to be approximately 0.2 µg/L. This E2 concentration was utilized in the other two experiments, which were performed at different facilities to confirm the reproducibility of results. Experiments were conducted according to Good Laboratory Practice guidelines, and the histopathologic evaluations were peer reviewed by an independent pathologist. Among the three trials, the histopathological findings that were strongly associated with E2 exposure (p < 0.001 to 0.0001) included an increase in the proportion of phenotypic females, mixed sex, dilated testis tubules, dividing gonocytes in the testis, and dilated ovarian cavities in phenotypic ovaries. A comparison of the gross and microscopic evaluations suggested that some morphologic changes in the gonads may potentially be missed if studies rely entirely on macroscopic assessment. Environ. Toxicol. Chem. 2010;29:1091–1105. © 2010 SETAC [source]


Estimation of trends in high urban ozone levels using the quantiles of the generalized extreme value (GEV) distribution

ENVIRONMETRICS, Issue 5 2010
Hortensia J. Reyes
Abstract In this paper we propose a statistical methodology to analyze the trends of very high values of tropospheric ozone. The methodology is based on the estimation of percentiles of the distribution of extreme values. The asymptotic distribution of the estimated percentiles is derived and shown to be normal, which allows us to use linear regression to investigate linear and non-linear trends. To illustrate the proposed methodology we use ozone-level data from monitoring stations in Mexico City for the period 1986 to 2005. The analysis indicates a decrease in the very high ozone levels in later years. Copyright © 2009 John Wiley & Sons, Ltd. [source]
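The first step of such an analysis, estimating high percentiles of an extreme-value distribution, can be sketched with `scipy.stats.genextreme`. The data below are simulated from an assumed GEV, not the Mexico City records, and all parameter values are illustrative.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical ozone maxima (ppb); simulated from a known GEV as a check.
# Note scipy's shape parameter c equals minus the usual GEV shape xi.
true = stats.genextreme(c=-0.1, loc=80, scale=15)
maxima = true.rvs(size=200, random_state=rng)

# Fit the GEV by maximum likelihood and read off a high percentile
shape, loc, scale = stats.genextreme.fit(maxima)
fitted = stats.genextreme(shape, loc, scale)

q95 = fitted.ppf(0.95)
print(f"estimated 95th percentile: {q95:.1f} ppb")
```

In the paper's setting, percentiles like this one are estimated per period, and their asymptotic normality justifies regressing them on time to detect a trend.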


Emissions of greenhouse gases attributable to land transport activities: modelling and analysis using I-CIR stochastic diffusion – the case of Spain

ENVIRONMETRICS, Issue 2 2008
R. Gutiérrez
Abstract In this study, carried out on the basis of the conclusions and methodological recommendations of the Fourth Assessment Report (2007) of the Intergovernmental Panel on Climate Change (IPCC), we consider the emissions of greenhouse gases (GHG), and particularly those of CO2, attributable to land transport activities across all sectors of the economy, as these constitute a significant proportion of total GHG emissions. The case of Spain, in particular, exemplifies a worrying situation in this respect, both in itself and in the context of the European Union. To analyse the evolution of these emissions, to enable medium-term forecasts to be made and to obtain a model for analysing the effects of possible corrective mechanisms, we have statistically fitted an inverse Cox-Ingersoll-Ross (I-CIR) type nonlinear stochastic diffusion process to the real data measured for the period 1990–2004, during which the Kyoto protocol has been applicable. To study the evolution of the trend of these emissions, probabilistic complements such as trend functions and the stationary distribution are incorporated, together with a statistical methodology (estimation and asymptotic inference) for this diffusion; these tools are necessary for applying the proposed analytical methodology. Copyright © 2007 John Wiley & Sons, Ltd. [source]


Predicting river water temperatures using stochastic models: case study of the Moisie River (Québec, Canada)

HYDROLOGICAL PROCESSES, Issue 1 2007
Behrouz Ahmadi-Nedushan
Abstract Successful applications of stochastic models for simulating and predicting daily stream temperature have been reported in the literature. These stochastic models have been generally tested on small rivers and have used only air temperature as an exogenous variable. This study investigates the stochastic modelling of daily mean stream water temperatures on the Moisie River, a relatively large unregulated river located in Québec, Canada. The objective of the study is to compare different stochastic approaches previously used on small streams to relate mean daily water temperatures to air temperatures and streamflow indices. Various stochastic approaches are used to model the water temperature residuals, representing short-term variations, which were obtained by subtracting the seasonal components from water temperature time-series. The first three models, a multiple regression, a second-order autoregressive model, and a Box and Jenkins model, used only lagged air temperature residuals as exogenous variables. The root-mean-square error (RMSE) for these models varied between 0·53 and 1·70 °C, and the second-order autoregressive model provided the best results. A statistical methodology using best subsets regression is proposed to model the combined effect of discharge and air temperature on stream temperatures. Various streamflow indices were considered as additional independent variables, and models with different numbers of variables were tested. The results indicated that the best model included relative change in flow as the most important streamflow index. The RMSE for this model was of the order of 0·51 °C, which shows a small improvement over the first three models that did not include streamflow indices. Ridge regression was applied to this model to alleviate the potential statistical inadequacies associated with multicollinearity. The amplitude and sign of the ridge regression coefficients seem to be more in agreement with prior expectations (e.g. positive correlation between water temperature residuals of different lags) and make more physical sense. Copyright © 2006 John Wiley & Sons, Ltd. [source]
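The core of the second-order autoregressive approach can be sketched as an ordinary least-squares fit of the residual series on its own two most recent lags. The series below is synthetic, with hypothetical coefficients; the real analysis used the Moisie River residuals after removing the seasonal component.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "residual" series: water-temperature departures from the
# seasonal component (stand-in for the deseasonalised river data)
n = 1000
resid = np.zeros(n)
for t in range(2, n):
    resid[t] = 0.8 * resid[t - 1] - 0.2 * resid[t - 2] + rng.normal(0, 0.5)

# Fit AR(2) by least squares: r_t ~ phi1 * r_{t-1} + phi2 * r_{t-2}
X = np.column_stack([resid[1:-1], resid[:-2]])
y = resid[2:]
phi, *_ = np.linalg.lstsq(X, y, rcond=None)

pred = X @ phi
rmse = np.sqrt(np.mean((y - pred) ** 2))
print("phi:", np.round(phi, 2), "RMSE:", round(rmse, 2))
```

Extending `X` with columns of lagged air-temperature residuals and streamflow indices, and selecting among them, gives the best-subsets variant described in the abstract.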


Karl Pearson and the Establishment of Mathematical Statistics

INTERNATIONAL STATISTICAL REVIEW, Issue 1 2009
M. Eileen Magnello
Summary At the end of the nineteenth century, the content and practice of statistics underwent a series of transitions that led to its emergence as a highly specialised mathematical discipline. These intellectual and later institutional changes were, in part, brought about by a mathematical-statistical translation of Charles Darwin's redefinition of the biological species as something that could be viewed in terms of populations. Karl Pearson and W.F.R. Weldon's mathematical reconceptualisation of Darwinian biological variation and "statistical" population of species in the 1890s provided the framework within which a major paradigmatic shift occurred in statistical techniques and theory. Weldon's work on the shore crab in Naples and Plymouth from 1892 to 1895 not only brought the two men to the forefront of ideas on speciation and provided the impetus for Pearson's earliest statistical innovations, but also led Pearson to shift his professional interests from an established career as a mathematical physicist to a new one as a biometrician. The innovative statistical work Pearson undertook with Weldon in 1892 and later with Francis Galton in 1894 enabled him to lay the foundations of modern mathematical statistics. While Pearson's diverse publications, his establishment of four laboratories and the creation of new academic departments underscore the plurality of his work, the main focus of his life-long career was the establishment and promulgation of his statistical methodology. [source]


Interpreting trends in cancer patient survival

JOURNAL OF INTERNAL MEDICINE, Issue 2 2006
P. W. DICKMAN
Abstract Data on cancer patient survival are an invaluable tool in the evaluation of therapeutic progress against cancer as well as other lethal diseases. As with all quantitative information routinely used in evidence-based clinical management , including diagnostic tests, prognostic markers and comparisons of therapeutic interventions , data on patient survival require evaluation based on an understanding of the underlying statistical methodology, methods of data collection and classification, and, most notably, clinical and biologic insight. This article contains an introduction to the methods used for estimating cancer patient survival, including cause-specific survival, relative survival and period analysis. The methods, and their interpretation, are illustrated through presentation of trends in incidence, mortality and patient survival for a range of different cancers. Our aim was to lay out the strengths and limitations of survival analysis as a tool in the evaluation of progress in the diagnosis and treatment of cancer. [source]
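The building block behind the survival estimates discussed above is the product-limit (Kaplan–Meier) estimator. The from-scratch sketch below uses toy data and plain cause-specific censoring; it is an illustration of the estimator itself, not of the registry methodology in the article (relative survival and period analysis additionally require population life tables).

```python
def kaplan_meier(times, events):
    """Product-limit estimate of the survivor function.

    times: follow-up time for each patient
    events: 1 if the event (e.g. cancer death) occurred, 0 if censored
    Returns a list of (time, survival) steps at each event time."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0
    steps = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(e for tt, e in data if tt == t)
        at_this_time = sum(1 for tt, _ in data if tt == t)
        if deaths:
            surv *= 1.0 - deaths / n_at_risk
            steps.append((t, surv))
        n_at_risk -= at_this_time
        i += at_this_time
    return steps

# Toy cohort: times in years, 1 = death from the disease, 0 = censored
times = [1, 2, 2, 3, 4, 5, 5, 6]
events = [1, 1, 0, 1, 0, 1, 1, 0]
print(kaplan_meier(times, events))
```

Censored patients leave the risk set without contributing a drop in the curve, which is exactly how the estimator uses incomplete follow-up.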


DISCUSSION II ON FITTING EQUATIONS TO SENSORY DATA

JOURNAL OF SENSORY STUDIES, Issue 1 2000
STEVEN M. SEIFERHELD
ABSTRACT In his article "On Fitting Equations to Sensory Data", Moskowitz suggests many strategies for model fitting which depart from current statistical methodology. Four areas discussed by Moskowitz are addressed here: (1) forcing terms into a model; (2) the use of hold-out samples; (3) the use of aggregate data (averaging across people, suppressing the person-to-person variation); and (4) the use of random data as a predictor variable in a regression equation. All four of these points will be examined within this article. [source]


Enhancing statistical education by using role-plays of consultations

JOURNAL OF THE ROYAL STATISTICAL SOCIETY: SERIES A (STATISTICS IN SOCIETY), Issue 2 2007
Ross Taplin
Summary. Role-plays in which students act as clients and statistical consultants to each other in pairs have proved to be an effective class exercise. As well as helping to teach statistical methodology, they are effective at encouraging statistical thinking, problem solving, the use of context in applied statistical problems and improving attitudes towards statistics and the statistics profession. Furthermore, they are fun. This paper explores the advantages of using role-plays and provides some empirical evidence supporting their success. The paper argues that there is a place for teaching statistical consulting skills well before the traditional post-graduate qualification in statistics, including to school students with no knowledge of techniques in statistical inference. [source]


Response surface methodology for optimizing the fermentation medium of Clostridium butyricum

LETTERS IN APPLIED MICROBIOLOGY, Issue 4 2004
G.Q. He
Abstract Aims: Strains of Clostridium butyricum have been increasingly used as probiotics for both animals and humans. The aim of this study was to develop a growth medium for cultivating C. butyricum ZJUCB using a statistical methodology. Methods and Results: Response surface methodology (RSM) was used to evaluate the effects of the variables, namely the concentrations of glucose, pectin, soyabean cake extract, casein, corn steep flour, ammonium sulphate and sodium bicarbonate, and the initial pH of the medium. A fractional factorial design was applied to identify the main factors affecting the growth of a probiotic strain of C. butyricum currently preserved in our lab, and a central composite experimental design was adopted to derive a statistical model for optimizing the composition of the fermentation medium. The experimental results showed that the optimum fermentation medium for the growth of C. butyricum was composed of 2% glucose (w/v), 0·5% pectin (w/v), 0·2% casein (w/v), 3·98% soyabean cake extract, 0·1% (NH4)2SO4 (w/v), 0·124% NaHCO3 (w/v), 0·37% corn steep flour (w/v), 0·02% MnSO4·H2O (w/v), 0·02% MgSO4·7H2O (w/v) and 0·002% CaCl2 (w/v) at pH 7·5. Conclusions: After incubation for 24 h in the optimum fermentation medium, the population of viable organisms was estimated to be 10⁹ CFU ml⁻¹. In the present study, we report the optimization of a growth medium that produced increased yields using a statistical approach. Significance and Impact of the Study: The use of bacteria as probiotics is showing increasing potential. The development of a high-yield growth medium is an obvious need, and the statistical approach to optimizing a growth medium is innovative. [source]
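The central-composite / response-surface step can be sketched as fitting a second-order polynomial to the design points and solving for its stationary point. Everything below is hypothetical: two coded factors stand in for two medium components, the response is simulated with an assumed interior optimum, and a simple grid of coded levels stands in for a full central composite design (factorial + axial + centre points).

```python
import numpy as np

rng = np.random.default_rng(7)

# Coded factor levels for two medium components (e.g. glucose and
# soyabean cake extract); a grid spanning the central composite region
levels = [-1.414, -1.0, 0.0, 1.0, 1.414]
X1, X2 = np.meshgrid(levels, levels)
x1, x2 = X1.ravel(), X2.ravel()

# Simulated response (growth) with an assumed optimum at (0.5, -0.3)
y = 9.0 - 1.0 * (x1 - 0.5) ** 2 - 0.8 * (x2 + 0.3) ** 2 \
    + rng.normal(0, 0.05, x1.size)

# Second-order response-surface model fitted by least squares
D = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
b, *_ = np.linalg.lstsq(D, y, rcond=None)

# Stationary point: set the gradient of the fitted quadratic to zero
A = np.array([[2 * b[3], b[5]], [b[5], 2 * b[4]]])
opt = np.linalg.solve(A, -b[1:3])
print("optimum (coded units):", np.round(opt, 2))
```

In the study, coded optima like this are mapped back to physical concentrations, which is how composition figures such as 3·98% soyabean cake extract arise.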


Estimation of the seed dispersal kernel from exact identification of source plants

MOLECULAR ECOLOGY, Issue 23 2007
JUAN J. ROBLEDO-ARNUNCIO
Abstract The exact identification of individual seed sources through genetic analysis of seed tissue of maternal origin has recently brought the full analytical potential of parentage analysis to the study of seed dispersal. No specific statistical methodology has been described so far, however, for estimation of the dispersal kernel function from categorical maternity assignment. In this study, we introduce a maximum-likelihood procedure to estimate the seed dispersal kernel from exact identification of seed sources. Using numerical simulations, we show that the proposed method, unlike other approaches, is independent of seed fecundity variation, yielding accurate estimates of the shape and range of the seed dispersal kernel under varied sampling and dispersal conditions. We also demonstrate how an obvious estimator of the dispersal kernel, the maximum-likelihood fit of the observed distribution of dispersal distances to seed traps, can be strongly biased due to the spatial arrangement of seed traps relative to source plants. Finally, we illustrate the use of the proposed method with a previously published empirical example for the animal-dispersed tree species Prunus mahaleb. [source]
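For a concrete flavour of kernel estimation from exactly identified seed sources, the sketch below fits a two-dimensional exponential dispersal kernel to a set of mother-to-seed distances. This is not the paper's likelihood (which accounts for the sampling design and fecundity); the data are simulated and the kernel family and scale are assumptions chosen because they admit a closed-form MLE.

```python
import numpy as np

rng = np.random.default_rng(3)

# Distances from identified mother plants to sampled seeds, simulated
# from a 2-D exponential kernel with scale a = 25 m: r ~ Gamma(2, a)
a_true = 25.0
distances = rng.gamma(shape=2.0, scale=a_true, size=500)

# For the kernel f(r) = exp(-r/a) / (2*pi*a^2), the density of the
# distance is r * exp(-r/a) / a^2, so the MLE of a is mean(r) / 2
a_hat = distances.mean() / 2.0
print(f"estimated kernel scale: {a_hat:.1f} m (true {a_true} m)")

# Mean dispersal distance implied by the fitted kernel (E[r] = 2a)
print(f"implied mean dispersal distance: {2 * a_hat:.1f} m")
```

The paper's point about seed-trap bias can be seen in this framing: fitting the same distribution to distances observed only at trap locations conditions on the trap layout, which distorts the distance density unless the layout is modelled.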


Assessing the impact of ICH E9

PHARMACEUTICAL STATISTICS: THE JOURNAL OF APPLIED STATISTICS IN THE PHARMACEUTICAL INDUSTRY, Issue 2 2008
David Brown
Abstract The ICH harmonized tripartite guideline 'Statistical Principles for Clinical Trials', more commonly referred to as ICH E9, was adopted by the regulatory bodies of the European Union, Japan and the USA in 1998. This document united related guidance documents on statistical methodology from each of the three ICH regions, and meant that for the first time clear consistent guidance on statistical principles was available to those conducting and reviewing clinical trials. At the 10th anniversary of the guideline's adoption, this paper discusses the influence of ICH E9 by presenting a perspective on how approaches to some aspects of clinical trial design, conduct and analysis have changed in that time in the context of regulatory submissions in the European Union. Copyright © 2008 John Wiley & Sons, Ltd. [source]


Methodology, Statistics, and Voting Error: An Exploration of 2000 Presidential Election Data in Two States

POLICY STUDIES JOURNAL, Issue 1 2005
Geralyn M. Miller
In the wake of the voting controversy of Election 2000, and the passage of a congressional measure designed to fix what many believe is an ailing voting system, the impact of voting equipment on residual voting error has become a crucial research question as the states prepare to replace existing voting equipment with matching federal funds, to adjust existing equipment, or to face yet more lawsuits. Most existing studies of the link between voting equipment and residual voting error have examined voting equipment across the states rather than within individual states, generating results that are subject to possible aggregation bias. Using a variety of statistical techniques, data on the Election 2000 U.S. presidential and U.S. senatorial races are analyzed in an attempt to determine the impact of voting equipment on intrastate voting error levels in those races. This study presents analysis of two sets of state data, from Wyoming and Pennsylvania, and argues that the infamous punch-card voting equipment may not be a significant contributor to increased voter error when analyzed within a single state, contrary to existing research indicating that it is significant when analyzed across multiple states. This research underscores the importance of researchers' ideological perspectives in the application of statistical methodology to the American policy arena. [source]


Fetal size charts for the Italian population.

PRENATAL DIAGNOSIS, Issue 6 2005
Normative curves of head, abdomen, long bones
Abstract Objective To describe size charts developed from fetuses of Italian couples. Method Prospective cross-sectional investigation conducted in three referral centers for prenatal diagnosis. The population of the study included fetuses between the 16th and the 40th week of gestation recruited prospectively and examined only once for the purpose of this study. Exclusion criteria comprised all maternal and/or fetal conditions possibly affecting fetal biometry. The following biometric variables were measured: biparietal diameter, head circumference, abdominal circumference, and femur, tibia, humerus, ulna and radius length. The statistical procedure recommended for analyzing this type of data set was employed to derive normal ranges and percentiles. Birthweight was also recorded. Our centiles were then compared with results from other studies. Results The best-fitted regression model to describe the relationships of head circumference and abdominal circumference with gestational age was cubic, whereas a simple quadratic model fitted BPD and the lengths of the long bones. Models fitting the SD were straight lines or quadratic curves. Neither the use of fractional polynomials (the greatest power of the polynomials being 3) nor the logarithmic transformation improved the fitting of the curves. Conclusion We have established size charts for fetuses from Italian couples using the recommended statistical approach. Since the mean birthweight in this study is not statistically different from the official birthweight reported for the Italian population, these reference intervals, developed according to the currently approved statistical methodology, can be employed during second- and third-trimester obstetric ultrasound of fetuses from Italian couples. Copyright © 2005 John Wiley & Sons, Ltd. [source]


Modelling and forecasting vehicle stocks using the trends of stochastic Gompertz diffusion models: The case of Spain

APPLIED STOCHASTIC MODELS IN BUSINESS AND INDUSTRY, Issue 3 2009
R. Gutiérrez
Abstract In the present study, we treat the stochastic homogeneous Gompertz diffusion process (SHGDP) by the approach of the Kolmogorov equation. First, using a transformation of diffusion processes, we show that the probability transition density function of this process has a lognormal time-dependent distribution, from which the trend and conditional trend functions and the stationary distribution are obtained. Second, the maximum likelihood approach is adapted to the problem of parameter estimation in the drift and diffusion coefficients using discrete sampling of the process, and approximate asymptotic confidence intervals for the parameters are obtained. We then obtain the corresponding inference for the stochastic homogeneous lognormal diffusion process as a limit of the inference for the SHGDP as the deceleration factor tends to zero. A statistical methodology, based on the above results, is proposed for trend analysis. This methodology is applied to modelling and forecasting vehicle stocks. Finally, an application using real data, namely the total vehicle stocks in Spain, is given to illustrate the methodology presented. Copyright © 2008 John Wiley & Sons, Ltd. [source]
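The deterministic trend of a Gompertz diffusion has the familiar saturating form x(t) = x0 · exp((m/β)(1 − e^(−βt))). The sketch below fits that trend function to a series by nonlinear least squares rather than by the paper's maximum-likelihood diffusion inference, and the "vehicle stock" series is simulated, with all parameter values hypothetical.

```python
import numpy as np
from scipy.optimize import curve_fit

def gompertz_trend(t, x0, m, beta):
    """Deterministic Gompertz trend x(t) = x0 * exp((m/beta)(1 - exp(-beta t)))."""
    return x0 * np.exp((m / beta) * (1.0 - np.exp(-beta * t)))

# Hypothetical annual vehicle-stock series (index units), not the Spanish data
rng = np.random.default_rng(5)
t = np.arange(0, 20)
obs = gompertz_trend(t, 10.0, 0.30, 0.12) * rng.lognormal(0, 0.01, t.size)

params, _ = curve_fit(gompertz_trend, t, obs, p0=[8.0, 0.2, 0.1])
x0, m, beta = params
print(f"x0={x0:.2f}, m={m:.3f}, beta={beta:.3f}")

# Medium-term forecast from the fitted trend function
print("trend forecast at t=25:", round(gompertz_trend(25, *params), 1))
```

The deceleration factor β controls how quickly growth saturates; as β → 0 the trend approaches exponential growth, mirroring the lognormal-diffusion limit described in the abstract.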


COMPARING GLASS COMPOSITIONAL ANALYSES*

ARCHAEOMETRY, Issue 3 2006
M. J. BAXTER
In a recently published study of Romano-British colourless glass compositions, using inductively coupled plasma spectroscopy, 28 glasses from Colchester sampled in a previous study were resampled. This was done deliberately, with a view to examining the repeatability of results from sampling on different occasions. We report on our results here, developing in the process some simple statistical methodology that could be applied in similar situations. The potential for combining analyses undertaken at different times is discussed and illustrated. [source]


Joint Spatial Modeling of Recurrent Infection and Growth with Processes under Intermittent Observation

BIOMETRICS, Issue 2 2010
F. S. Nathoo
Summary In this article, we present a new statistical methodology for longitudinal studies in forestry, where trees are subject to recurrent infection, and the hazard of infection depends on tree growth over time. Understanding the nature of this dependence has important implications for reforestation and breeding programs. Challenges arise for statistical analysis in this setting with sampling schemes leading to panel data, exhibiting dynamic spatial variability, and incomplete covariate histories for hazard regression. In addition, data are collected at a large number of locations, which poses computational difficulties for spatiotemporal modeling. A joint model for infection and growth is developed wherein a mixed nonhomogeneous Poisson process, governing recurring infection, is linked with a spatially dynamic nonlinear model representing the underlying height growth trajectories. These trajectories are based on the von Bertalanffy growth model and a spatially varying parameterization is employed. Spatial variability in growth parameters is modeled through a multivariate spatial process derived through kernel convolution. Inference is conducted in a Bayesian framework with implementation based on hybrid Monte Carlo. Our methodology is applied for analysis in an 11-year study of recurrent weevil infestation of white spruce in British Columbia. [source]
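The growth side of the joint model rests on the von Bertalanffy trajectory H(t) = A(1 − e^(−k(t − t0))). The sketch below fits that curve to a single simulated tree by nonlinear least squares; the study itself lets A, k vary spatially via a kernel-convolved process and estimates everything jointly in a Bayesian framework, so this is only the building block, with hypothetical numbers.

```python
import numpy as np
from scipy.optimize import curve_fit

def von_bertalanffy(t, A, k, t0):
    """Height trajectory H(t) = A * (1 - exp(-k * (t - t0)))."""
    return A * (1.0 - np.exp(-k * (t - t0)))

# Hypothetical annual height measurements for one tree (metres),
# over an 11-year observation window as in the study design
rng = np.random.default_rng(11)
years = np.arange(1, 12)
heights = von_bertalanffy(years, 12.0, 0.15, 0.0) \
    + rng.normal(0, 0.05, years.size)

(A, k, t0), _ = curve_fit(von_bertalanffy, years, heights,
                          p0=[10.0, 0.1, 0.0])
print(f"asymptotic height A={A:.1f} m, growth rate k={k:.3f}")
```

In the joint model, fitted trajectories like this one feed into the infection hazard as a time-varying covariate, which is what links growth to recurrent infection.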