Statistical Techniques

Distribution by Scientific Domains
Distribution within Life Sciences

Kinds of Statistical Techniques

  • multivariate statistical techniques
  • standard statistical techniques


  • Selected Abstracts


    A Bridge to Advanced Statistical Techniques

    CONSERVATION BIOLOGY, Issue 3 2004
    Michelle A. Marvier
    No abstract is available for this article. [source]


    The distribution of file transmission duration in the web

    INTERNATIONAL JOURNAL OF COMMUNICATION SYSTEMS, Issue 5 2004
    R. Nossenson
    Abstract It is well known that the distribution of file transmission durations in the Web is heavy-tailed (A Practical Guide to Heavy Tails: Statistical Techniques and Applications. Birkhauser: Boston, 1998; 3–26). This paper attempts to understand the reasons for this phenomenon by isolating the three major factors influencing transmission duration: file size, network conditions and server load. We present evidence that the transmission-duration distribution (TDD) of the same file from the same server to the same client in the Web is Pareto and therefore heavy-tailed. Furthermore, text-file transmission delay for a specific client/server pair is not significantly affected by file size: all files transmitted from the same server to the same client have very similar transmission-duration distributions, regardless of their size. We use simulations to estimate the impact of network conditions and server load on the TDD. When the server and the client are on the same local network, the TDD of each file is usually Pareto as well (for server files and client requests that are distributed in a realistic way). By examining a wide-area network situation, we conclude that the network conditions do not have a major influence on the heavy-tailed behaviour of TDD. In contrast, the server load is shown to have a significant impact on the high variability of this distribution. Copyright © 2004 John Wiley & Sons, Ltd. [source]
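The heavy-tail (Pareto) property claimed above can be checked directly on a sample of durations with a Hill estimator of the tail index. The sketch below is illustrative only: it uses synthetic Pareto data with an assumed tail index of 1.5, not the paper's measurements.

```python
import math
import random

def hill_estimator(samples, k):
    """Estimate the Pareto tail index alpha from the k largest observations."""
    xs = sorted(samples, reverse=True)
    # Mean log-excess of the top-k order statistics over the (k+1)-th.
    log_excess = sum(math.log(xs[i] / xs[k]) for i in range(k)) / k
    return 1.0 / log_excess

random.seed(42)
# Synthetic "transmission durations" with an exact Pareto tail, alpha = 1.5.
durations = [random.paretovariate(1.5) for _ in range(20000)]
alpha_hat = hill_estimator(durations, k=2000)
print(round(alpha_hat, 2))  # close to the true tail index of 1.5
```

An estimated alpha below 2 would indicate infinite variance, the regime the abstract associates with Web transmission durations.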


    Modern Multivariate Statistical Techniques: Regression, Classification, and Manifold Learning by IZENMAN, A. J.

    BIOMETRICS, Issue 3 2009
    Debashis Ghosh
    No abstract is available for this article. [source]


    Controls on modern alluvial fan processes in the central Alps, northern Italy

    EARTH SURFACE PROCESSES AND LANDFORMS, Issue 3 2004
    Giovanni B. Crosta
    Abstract Alluvial fan development in Alpine areas is often affected by catastrophic sedimentary processes associated with extreme flood events, causing serious risks for people living on the fans. Hazard assessment in these areas depends on proper identification of the dominant sedimentary processes on the fans. Data from a set of 209 alluvial fans from the central Alps of Italy are presented in this paper and analysed with the help of various statistical techniques (linear regression, principal components analysis, cluster analysis, discriminant analysis and logistic regression). First, we used modern sedimentary facies and historical records (flood events since the 15th century) to distinguish between the two dominant sedimentary processes on alluvial fans: debris flows and streamflows. Then, in order to analyse the main controls on past and present fan processes, 36 morphological, geological and land-use variables were analysed. As with observations for arid-environment fans, catchment morphology is the most influential factor in the study area, whereas geology and land use are minor controls. The role of climatic change and landsliding within the catchments also seems to be very important and is discussed. Statistical techniques also help in differentiating groups of alluvial fans by sets of controlling factors, including stage and type of evolution. Finally, by using discriminant analysis and logistic regression, we classified alluvial fans according to the dominant sedimentary process, with a success rate ranging between 75 and 92 per cent. Copyright © 2004 John Wiley & Sons, Ltd. [source]
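Logistic regression, one of the techniques listed above, produces exactly this kind of two-class success rate. A minimal sketch on synthetic data (the predictors, their distributions, and the labels are hypothetical stand-ins, not the paper's 209-fan data set):

```python
import math
import random

def train_logistic(X, y, lr=0.1, epochs=200):
    """Plain per-sample gradient-descent logistic regression (bias + weights)."""
    w = [0.0] * (len(X[0]) + 1)
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
            p = 1.0 / (1.0 + math.exp(-max(-30.0, min(30.0, z))))  # clamped sigmoid
            err = yi - p
            w[0] += lr * err
            for j, xj in enumerate(xi):
                w[j + 1] += lr * err * xj
    return w

def predict(w, xi):
    return 1 if w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi)) > 0 else 0

random.seed(1)
# Hypothetical morphometric predictors: (slope gradient, log catchment area).
# Label 1 = debris-flow dominated fan, 0 = streamflow dominated fan.
debris = [(random.gauss(0.6, 0.1), random.gauss(1.0, 0.4)) for _ in range(100)]
stream = [(random.gauss(0.3, 0.1), random.gauss(2.0, 0.4)) for _ in range(100)]
X, y = debris + stream, [1] * 100 + [0] * 100
w = train_logistic(X, y)
accuracy = sum(predict(w, xi) == yi for xi, yi in zip(X, y)) / len(X)
```

The fitted model's classification accuracy is the same kind of figure as the 75-92 per cent success rates reported in the abstract.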


    Outcomes of permanent family placement for children of minority ethnic origin

    CHILD & FAMILY SOCIAL WORK, Issue 1 2001
    P.G. Moffatt
    The paper explores the outcomes of permanent family placement for children of minority ethnic origin, using a sample of 254 placements drawn, in the main, from a cohort of 1165 British children placed between 1980 and 1985. Statistical techniques are used to explore the relationship between 'success' (defined, for the purposes of this paper, as the placement not known to have broken down) and a range of variables, including the characteristics of the child, the birth parents and the adoptive parents, and the type of placement. In most respects, the findings are consistent with those of similar studies. Age at placement is found to have an important effect, with success least likely for children placed in the middle age range. Lack of problem behaviours in the children at the time of placement is also found to be associated with success. Variables which are found to have no effect on the probability of success include ethnic origin of the child, whether it is an adoptive or permanent foster placement, and whether contact with birth parents continued after placement. There was an interesting gender effect in that boys did better in 'transracial' than in 'matched' placements, but the opposite was the case for girls. [source]


    A social agent pedestrian model

    COMPUTER ANIMATION AND VIRTUAL WORLDS (PREV: JNL OF VISUALISATION & COMPUTER ANIMATION), Issue 3-4 2008
    Andrew Park
    Abstract This paper presents a social agent pedestrian model based on experiments with human subjects. Research studies in criminology and environmental psychology show that certain features of the urban environment generate fear in people, causing them to take alternate routes. The Crime Prevention Through Environmental Design (CPTED) strategy has been implemented to reduce fear of crime and crime itself. Our initial prototype of a pedestrian model was developed based on these findings of criminology research. In the course of validating our model, we constructed a virtual environment (VE) that resembles a well-known fear-generating area, in which several decision points were set up. Sixty human subjects were invited to navigate the VE, and their choices of routes and their comments during follow-up interviews were analyzed using statistical techniques and content analysis. Through our experimental results, we gained new insights into pedestrians' behavior and suggest a new, enhanced and articulated agent model of a pedestrian. Our research provides not only a realistic pedestrian model, but also a new methodology for criminology research. Copyright © 2008 John Wiley & Sons, Ltd. [source]


    Modeling human affective postures: an information theoretic characterization of posture features

    COMPUTER ANIMATION AND VIRTUAL WORLDS (PREV: JNL OF VISUALISATION & COMPUTER ANIMATION), Issue 3-4 2004
    P. Ravindra De Silva
    One of the challenging issues in affective computing is to give a machine the ability to recognize the mood of a person. Efforts in that direction have mainly focused on facial and oral cues. Gestures have recently been considered as well, but with less success. Our aim is to fill this gap by identifying and measuring the saliency of posture features that play a role in affective expression. As a case study, we collected affective gestures from human subjects using a motion capture system. We first described these gestures with spatial features, as suggested in studies on dance. Through standard statistical techniques, we verified that there was a statistically significant correlation between the emotion intended by the acting subjects and the emotion perceived by the observers. We used discriminant analysis to build affective posture predictive models and to measure the saliency of the proposed set of posture features in discriminating between four basic emotional states: angry, fear, happy, and sad. An information theoretic characterization of the models shows that the set of features discriminates well between emotions, and also that the models built outperform the human observers. Copyright © 2004 John Wiley & Sons, Ltd. [source]


    Growth curve analyses are best suited to examine the relation between developmental pathways and selective breeding: Comment on Hofer, Shair, Masmela, & Brunelli, "Developmental effects of selective breeding for an infantile trait: The rat pup ultrasonic isolation call"

    DEVELOPMENTAL PSYCHOBIOLOGY, Issue 4 2001
    George F. Michel
    Abstract Hofer, Brunelli, Shair, & Masmela (2001) provide valuable information about the effects of selective breeding on rat-pup behaviors and physiology. Although the design and statistical analytic techniques employed are typical of those used to evaluate behavioral development in animals, I offer several suggestions about how to evaluate the influence of selective breeding on developmental pathways using modern statistical techniques. As Hofer et al. demonstrate, the development of rat behavior and physiology can be an excellent model for examining the relation between selection and development. © 2001 John Wiley & Sons, Inc. Dev Psychobiol 39: 247–250, 2001 [source]


    Biogeographical patterns of endemic terrestrial Afrotropical birds

    DIVERSITY AND DISTRIBUTIONS, Issue 3 2002
    H. M. De Klerk
    Abstract. Biogeographical zones are described for terrestrial bird species endemic to the Afrotropics using up-to-date distributional data and multivariate statistical techniques. This provides an objective basis for a hierarchy of subregions, provinces and districts, based on a set of rules. Results are compared to previous studies at continental and regional scales. Biogeographical zones for passerines and non-passerines are compared and found to be similar. Peaks of species richness and narrow endemism are described for the six major subdivisions (subregions) identified by the cluster analysis. Coincidence of peaks of species richness and narrow endemism is found to be low, such that areas selected to represent high species richness tallies will often fail to represent narrow endemics. Strong regionalization of Afrotropical birds indicates the need to use a biogeographical framework in conservation priority setting exercises to ensure that unique, but species-poor, avifaunas are not neglected. [source]


    Assessment of shallow landslide susceptibility by means of multivariate statistical techniques

    EARTH SURFACE PROCESSES AND LANDFORMS, Issue 12 2001
    Cristina Baeza
    Abstract Several multivariate statistical analyses have been performed to identify the most influential geological and geomorphological parameters on shallow landsliding and to quantify their relative contribution. A data set was first prepared including more than 30 attributes of 230 failed and unfailed slopes. Principal component analysis, t-tests and one-way tests allowed a preliminary selection of the most significant variables, which were used as input variables for the discriminant analysis. The function obtained successfully classified 88.5 per cent of the overall slope population and 95.6 per cent of the failed slopes. Slope gradient, watershed area and land use appeared as the most powerful discriminant factors. A landslide susceptibility map, based on the scores of the discriminant function, has been prepared for the Ensija range in the Eastern Pyrenees. An index of relative landslide density shows that the results of the map are consistent. Copyright © 2001 John Wiley & Sons, Ltd. [source]
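For two classes described by a single variable with roughly equal spread, a linear discriminant reduces to a midpoint threshold between the class means. The sketch below illustrates that idea on synthetic failed/unfailed slope gradients; the numbers are assumptions for illustration, not the study's data.

```python
import random
import statistics

random.seed(11)
# Hypothetical slope gradients (degrees) for failed and unfailed slopes.
failed = [random.gauss(32.0, 4.0) for _ in range(115)]
unfailed = [random.gauss(22.0, 4.0) for _ in range(115)]

# Equal-variance 1-D linear discriminant: classify by the midpoint of the means.
threshold = (statistics.mean(failed) + statistics.mean(unfailed)) / 2
correct = (sum(x > threshold for x in failed)
           + sum(x <= threshold for x in unfailed))
rate = 100.0 * correct / (len(failed) + len(unfailed))
print(round(rate, 1))  # overall percentage correctly classified
```

The resulting percentage plays the same role as the 88.5 per cent overall classification rate quoted in the abstract, though the real study used a multivariate discriminant function.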


    Randomized controlled trial comparing the effectiveness and safety of intranasal and intramuscular naloxone for the treatment of suspected heroin overdose

    ADDICTION, Issue 12 2009
    Debra Kerr
    ABSTRACT Aims Traditionally, the opiate antagonist naloxone has been administered parenterally; however, intranasal (i.n.) administration has the potential to reduce the risk of needlestick injury. This is important when working with populations known to have a high prevalence of blood-borne viruses. Preliminary research suggests that i.n. administration might be effective, but suboptimal naloxone solutions were used. This study compared the effectiveness of concentrated (2 mg/ml) i.n. naloxone to intramuscular (i.m.) naloxone for suspected opiate overdose. Methods This randomized controlled trial included patients treated for suspected opiate overdose in the pre-hospital setting. Patients received 2 mg of either i.n. or i.m. naloxone. The primary outcome was the proportion of patients who responded within 10 minutes of naloxone treatment. Secondary outcomes included time to adequate response and requirement for supplementary naloxone. Data were analysed using multivariate statistical techniques. Results A total of 172 patients were enrolled into the study. Median age was 29 years and 74% were male. Rates of response within 10 minutes were similar: i.n. naloxone (60/83, 72.3%) compared with i.m. naloxone (69/89, 77.5%) [difference: −5.2%, 95% confidence interval (CI) −18.2 to 7.7]. No difference was observed in mean response time (i.n.: 8.0, i.m.: 7.9 minutes; difference 0.1, 95% CI −1.3 to 1.5). Supplementary naloxone was administered to fewer patients who received i.m. naloxone (i.n.: 18.1%; i.m.: 4.5%) (difference: 13.6%, 95% CI 4.2 to 22.9). Conclusions Concentrated intranasal naloxone reversed heroin overdose successfully in 82% of patients. Time to adequate response was the same for both routes, suggesting that the i.n. route of administration is of similar effectiveness to the i.m. route as a first-line treatment for heroin overdose. [source]
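The headline comparison can be reproduced from the counts quoted in the abstract with a standard normal-approximation (Wald) confidence interval for a difference of two proportions. Only the figures given above are used; the exact method of the original analysis is not specified here.

```python
import math

def diff_of_proportions_ci(k1, n1, k2, n2, z=1.96):
    """95% normal-approximation (Wald) CI for the difference p1 - p2."""
    p1, p2 = k1 / n1, k2 / n2
    diff = p1 - p2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return diff, diff - z * se, diff + z * se

# Responders within 10 minutes: i.n. 60/83 vs i.m. 69/89 (from the abstract).
diff, lo, hi = diff_of_proportions_ci(60, 83, 69, 89)
print(round(diff * 100, 1), round(lo * 100, 1), round(hi * 100, 1))
# -5.2 -18.2 7.7, matching the interval reported above
```

Because the interval spans zero, the data are consistent with no difference between routes, which is the abstract's conclusion.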


    Adjusting for mortality effects in chronic toxicity testing: Mixture model approach

    ENVIRONMENTAL TOXICOLOGY & CHEMISTRY, Issue 1 2000
    Shin Cheng David Wang
    Abstract Chronic toxicity tests, such as the Ceriodaphnia dubia 7-d test, are typically analyzed using standard statistical methods such as analysis of variance or regression. Recent research has emphasized the use of Poisson regression or more generalized regression for the analysis of the fecundity data from these studies. A possible problem in using standard statistical techniques is that mortality may occur from toxicant effects as well as reduced fecundity. A mixture model that accounts for fecundity and mortality is proposed for the analysis of data arising from these studies. Inferences about key parameters in the model are discussed. A joint estimate of the inhibition concentration is proposed based on the model. Confidence interval estimation via the bootstrap method is discussed. An example is given for a study involving copper and mercury. [source]
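The bootstrap interval estimation mentioned above has a simple generic form: resample the data with replacement, recompute the statistic, and take percentiles of the replicates. A minimal percentile-bootstrap sketch on made-up fecundity counts (the data and the choice of the mean as the statistic are assumptions for illustration):

```python
import random
import statistics

def bootstrap_ci(data, stat=statistics.mean, n_boot=2000, alpha=0.05):
    """Percentile bootstrap confidence interval for a statistic."""
    reps = []
    for _ in range(n_boot):
        resample = [random.choice(data) for _ in data]
        reps.append(stat(resample))
    reps.sort()
    lo = reps[int(alpha / 2 * n_boot)]
    hi = reps[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

random.seed(7)
# Hypothetical fecundity counts (offspring per female) at one concentration.
fecundity = [22, 25, 19, 30, 27, 24, 18, 26, 23, 21]
lo, hi = bootstrap_ci(fecundity)
```

The same resampling loop applies to more complex statistics, such as the joint inhibition-concentration estimate the paper derives from its mixture model.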


    Environmental power analysis , a new perspective

    ENVIRONMETRICS, Issue 5 2001
    David R. Fox
    Abstract Power analysis and sample-size determination are related tools that have recently gained popularity in the environmental sciences. Their indiscriminate application, however, can lead to wildly misleading results. This is particularly true in environmental monitoring and assessment, where the quality and nature of the data are such that the implicit assumptions underpinning power and sample-size calculations are difficult to justify. When the assumptions are reasonably met, these statistical techniques provide researchers with an important capability for the allocation of scarce and expensive resources to detect putative impact or change. Conventional analyses are predicated on a general linear model and normal distribution theory, with statistical tests of environmental impact couched in terms of changes in a population mean. While these are 'optimal' statistical tests (uniformly most powerful), they nevertheless pose considerable practical difficulties for the researcher. Compounding this difficulty is the subsequent analysis of the data and the imposition of a decision framework that commences with an assumption of 'no effect'. This assumption is only discarded when the sample data provide demonstrable evidence to the contrary. The alternative ('green') view is that any anthropogenic activity has an impact on the environment, and therefore a more realistic initial position is to assume that the environment is already impacted. In this article we examine these issues and provide a re-formulation of conventional mean-based hypotheses in terms of population percentiles. Prior information or belief concerning the probability of exceeding a criterion is incorporated into the power analysis using a Bayesian approach. Finally, a new statistic is introduced which attempts to balance the overall power regardless of the decision framework adopted. Copyright © 2001 John Wiley & Sons, Ltd. [source]
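The conventional mean-based calculation the abstract critiques has a closed form for a two-sided one-sample z-test: n = ((z_{1-α/2} + z_{power}) σ / δ)². A minimal sketch, with the effect size and standard deviation chosen purely for illustration:

```python
import math
from statistics import NormalDist

def sample_size(delta, sigma, alpha=0.05, power=0.80):
    """n for a two-sided one-sample z-test to detect a mean shift of delta."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # 1.96 for alpha = 0.05
    z_beta = NormalDist().inv_cdf(power)           # 0.84 for 80% power
    return math.ceil(((z_alpha + z_beta) * sigma / delta) ** 2)

# Assumed numbers: detect a shift of half a standard deviation.
print(sample_size(delta=0.5, sigma=1.0))  # 32
```

The formula's sensitivity to the assumed σ and δ is precisely why the abstract warns against indiscriminate application when those assumptions are hard to justify.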


    Aggregate time-series regression in the field of alcohol

    ADDICTION, Issue 7 2001
    Jürgen Rehm
    Time-series regression of successive aggregate data has been used widely to explore relationships between substance use and harm, or to measure effects of policy interventions. This paper suggests minimal standards to conduct such analyses with respect to validity and reliability of underlying data, statistical techniques and requirements and interpretation of results. [source]


    Research Trends in Textiles and Clothing: An Analysis of Three Journals, 1980–1999

    FAMILY & CONSUMER SCIENCES RESEARCH JOURNAL, Issue 2 2001
    Sharron J. Lennon
    The purpose of this research was to assess trends in research, research strategies, data analysis techniques, funding sources, affiliations, and the use of theoretical frameworks in textiles and clothing research. Empirical research focused on textiles and clothing and published in three home economics-related journals (Journal of Family and Consumer Sciences, Family and Consumer Sciences Research Journal, and Clothing and Textiles Research Journal) from 1980 to 1999 was content analyzed (N = 586). Although survey methodology and experimentation were the first and second most-used research strategies in all but one 5-year period from 1980 to 1999, fieldwork has increased. Data analysis techniques were primarily quantitative, with increases in the use of some advanced statistical techniques. However, the qualitative treatment of data also increased. Suggestions for graduate education and faculty development are offered. [source]


    Moduli stabilisation and applications in IIB string theory

    FORTSCHRITTE DER PHYSIK/PROGRESS OF PHYSICS, Issue 3 2007
    J.P. Conlon
    String compactifications represent the most promising approach towards unifying general relativity with particle physics. However, naive compactifications give rise to massless particles (moduli) which would mediate unobserved long-range forces, and it is therefore necessary to generate a potential for the moduli. In the introductory chapters I review this problem and recall how in IIB compactifications the dilaton and complex structure moduli can be stabilised by 3-form fluxes. There exist very many possible discrete flux choices, which motivates the use of statistical techniques to analyse this discretuum of choices. Such approaches generate formulae predicting the distribution of vacua, and I describe numerical tests of these formulae on the Calabi-Yau P4[1,1,2,2,6]. Stabilising the Kähler moduli requires nonperturbative superpotential effects. I review the KKLT construction and explain why this must in general be supplemented with perturbative Kähler corrections. I show how the incorporation of such corrections generically leads to non-supersymmetric minima at exponentially large volumes, giving a detailed account of the α′ expansion and its relation to Kähler corrections. I illustrate this with explicit computations for the Calabi-Yau P4[1,1,1,6,9]. The next part of the article examines phenomenological applications of this construction. I first describe how the magnitude of the soft supersymmetry parameters may be computed. In the large-volume models the gravitino mass and soft terms are volume-suppressed. As the volume is naturally very large, this gives a dynamical solution of the hierarchy problem. I also demonstrate the existence of a fine structure in the soft terms, with gaugino masses naturally lighter than the gravitino mass by a factor ln(MP/m3/2). A second section gives a detailed analysis of the relationship of moduli stabilisation to the QCD axions relevant to the strong CP problem, proving a no-go theorem on the compatibility of a QCD axion with supersymmetric moduli stabilisation. I describe how QCD axions can coexist with nonsupersymmetric perturbative stabilisation and how the large-volume models naturally contain axions with decay constants that are phenomenologically allowed and satisfy the appealing relationship fa2 ∼ MPMsusy. A further section describes how a simple and predictive inflationary model can be built in the context of the above large-volume construction, using the no-scale Kähler potential to avoid the η problem. I finally conclude, summarising the phenomenological scenario and outlining the prospects for future work. [source]


    A comparative analysis of the diving behaviour of birds and mammals

    FUNCTIONAL ECOLOGY, Issue 5 2006
    L. G. HALSEY
    Summary 1. We use a large interspecific data set on diving variables for birds and mammals, and statistical techniques to control for the effects of phylogenetic non-independence, to assess evolutionary associations among different elements of diving behaviour across a broad and diverse range of diving species. Our aim is to assess whether the diving ability of homeothermic vertebrates is influenced by factors other than the physiology of the species. 2. Body mass is related to dive duration even when dive depth is controlled for, and thus for a given dive depth, larger species dive for longer. This implies that larger species have a greater capacity for diving than is expressed in their dive depth. Larger animals that dive shallowly, probably for ecological reasons such as water depth, make use of the physiological advantage that their size confers by diving for longer. 3. Dive duration correlates with dive depth more strongly than with body mass. This confirms that some animals are poor divers for their body mass, either because of a lower physiological capacity or because their behaviour limits their diving. 4. Surface duration relates not only to dive duration but also to dive depth, as well as to both independently. This indicates a relationship between dive depth and surface duration controlling for dive duration, which suggests that deeper dives are energetically more expensive than shallow dives of the same duration. 5. Taxonomic class does not improve any of the dive variable models in the present study. There is thus an unsuspected consistency in the broad responses of different groups to the effects on diving of the environment, which are therefore general features of diving evolution. [source]


    Predicting potential impacts of climate change on the geographical distribution of enchytraeids: a meta-analysis approach

    GLOBAL CHANGE BIOLOGY, Issue 11 2007
    MARÍA JESÚS I. BRIONES
    Abstract The expectation that atmospheric warming will be most pronounced at higher latitudes means that Arctic and montane systems, with predominantly organic soils, will be particularly influenced by climate change. One group of soil fauna, the enchytraeids, is commonly the major soil faunal component in specific biomes, frequently exceeding above-ground fauna in biomass terms. These organisms have a crucial role in carbon turnover in organic rich soils and seem particularly sensitive to temperature changes. In order to predict the impacts of climate change on this important group of soil organisms we reviewed data from 44 published papers using a combination of conventional statistical techniques and meta-analysis. We focused on the effects of abiotic factors on total numbers of enchytraeids (a total of 611 observations) and, more specifically, concentrated on total numbers, vertical distribution and age groupings of the well-studied species Cognettia sphagnetorum (228 observations). The results highlight the importance of climatic factors, together with vegetation and soil type in determining global enchytraeid distribution; in particular, cold and wet environments with mild summers are consistently linked to greater densities of enchytraeids. Based on the upper temperature distribution limits reported in the literature, and identified from our meta-analyses, we also examined the probable future geographical limits of enchytraeid distribution in response to predicted global temperature changes using the HadCM3 model climate output for the period between 2010 and 2100. Based on the existing data we identify that a maximum mean annual temperature threshold of 16 °C could be a critical limit for present distribution of field populations, above which their presence would decline markedly, with certain key species, such as C. sphagnetorum, being totally lost from specific regions. 
We discuss the potential implications for carbon turnover in these organic soils where these organisms currently dominate and, consequently, their future role as C sink/source in response to climate change. [source]


    Waiting for Broadband: Local Competition and the Spatial Distribution of Advanced Telecommunication Services in the United States

    GROWTH AND CHANGE, Issue 2 2004
    TONY H. GRUBESIC
    ABSTRACT With the passage of the Telecommunications Act of 1996, Congress directed the Federal Communications Commission and all fifty U.S. states to encourage the deployment of advanced telecommunication capability in a reasonable and timely manner. Today, with the rollout of advanced data services such as digital subscriber lines (xDSL), cable modems, and fixed wireless technologies, broadband has become an important component of telecommunication service and competition. Unfortunately, the deployment of last-mile infrastructure enabling high-speed access has proceeded more slowly than anticipated and competition in many areas is relatively sparse. More importantly, there are significant differences in the availability of broadband services between urban and rural areas. This paper explores aspects of broadband access as a function of market demand and provider competition. Data collected from the Federal Communications Commission is analyzed using a geographic information system and spatial statistical techniques. Results suggest significant spatial variation in broadband Internet access as a function of provider competition in the United States. [source]


    A classification of drainage and macropore flow in an agricultural catchment

    HYDROLOGICAL PROCESSES, Issue 1 2002
    Dr C. M. Heppell
    Abstract This paper uses a variety of multivariate statistical techniques in order to improve current understanding of the antecedent and rainfall controls on drainage characteristics for an agricultural underdrained clay site. Using the dataset obtained from a two-year hillslope study at Wytham (Oxfordshire, UK), a number of patterns in the nature and style of drainage events were explored. First, using principal components analysis, a distinction was drawn between drainflow controlled by antecedent conditions and drainflow controlled by rainfall characteristics. Dimensional analysis then distinguished between two further types of drainflow event: antecedent limited events (ALE) and non-antecedent limited events (NALE). These were drainflow events requiring a minimum antecedent hydraulic head to occur (ALE) and events that occurred in response to rainfall irrespective of the antecedent conditions, because the rainfall was either of high enough intensity or duration to prompt a response in drainflow (NALE). Second, the dataset also made possible a preliminary investigation into the controls on, and types of, macropore flow at the site. Principal components analysis identified that rainfall characteristics were more important than antecedent conditions in generating high proportions of macropore flow in drainflow. Of the rainfall characteristics studied, rainfall amount and intensity were the dominant controls on the amount of macropore flow, with duration as a secondary control. Two styles of macropore flow were identified: intensity-driven and duration-driven. Intensity-driven events are characterized by rainfall of high intensity and short duration. During such events the amount of macropore flow is proportional to the rainfall intensity and the interaction between macropore and matrix flow is kinetically limited. The second style of macropore flow is characterized by long-duration events.
For these events the amount of macropore flow approaches a maximum value whatever the rainfall duration. This suggests that these events are characterized by an equilibrium interaction between macropores and matrix flow. Copyright © 2002 John Wiley & Sons, Ltd. [source]


    Hierarchical Bayesian modelling of wind and sea surface temperature from the Portuguese coast

    INTERNATIONAL JOURNAL OF CLIMATOLOGY, Issue 9 2010
    Ricardo T. Lemos
    Abstract In this work, we revisit a recent analysis that pointed to an overall relaxation of the Portuguese coastal upwelling system between 1941 and 2000, and apply more elaborate statistical techniques to assess that evidence. Our goal is to fit a model for environmental variables that accommodates seasonal cycles, long-term trends, short-term fluctuations with some degree of autocorrelation, and cross-correlations between measuring sites and variables. Reference cell coding is used to investigate similarities in behaviour among sites. Parameter estimation is performed in a single modelling step, thereby producing more reliable credibility intervals than previous studies. This is of special importance in the assessment of trend significance. We employ a Bayesian approach with a purposely developed Markov chain Monte Carlo method to explore the posterior distribution of the parameters. Our results substantiate most previous findings and provide new insight on the relationship between wind and sea surface temperature off the Portuguese coast. Copyright © 2009 Royal Meteorological Society [source]
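The Markov chain Monte Carlo exploration described above can be illustrated in miniature with a random-walk Metropolis sampler for the simplest possible model: the mean of normally distributed data under a flat prior. Everything here (the data, the model, the tuning constants) is an assumed toy setup, not the paper's hierarchical model.

```python
import math
import random
import statistics

def metropolis_normal_mean(data, sigma=1.0, steps=4000, step_size=0.5):
    """Random-walk Metropolis for the mean of N(mu, sigma^2) with a flat prior."""
    def log_lik(mu):
        return -sum((x - mu) ** 2 for x in data) / (2.0 * sigma ** 2)
    mu, draws = 0.0, []
    for _ in range(steps):
        prop = mu + random.gauss(0.0, step_size)
        # Accept with probability min(1, posterior ratio).
        if math.log(random.random()) < log_lik(prop) - log_lik(mu):
            mu = prop
        draws.append(mu)
    return draws[steps // 2:]  # discard the first half as burn-in

random.seed(3)
# Hypothetical monthly SST anomalies (deg C) at one coastal station.
sst = [random.gauss(0.4, 1.0) for _ in range(50)]
post = metropolis_normal_mean(sst)
post_mean = statistics.mean(post)
```

With a flat prior and known σ, the posterior mean should sit close to the sample mean; credibility intervals come directly from the quantiles of the retained draws.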


    Clustering with artificial neural networks and traditional techniques

    INTERNATIONAL JOURNAL OF INTELLIGENT SYSTEMS, Issue 4 2003
    G. Tambouratzis
    In this article, two clustering techniques based on neural networks are introduced. The two neural network models are the Harmony theory network (HTN) and the self-organizing logic neural network (SOLNN), both of which are characterized by parallel processing, a distributed architecture, and a large number of nodes. After describing their clustering characteristics and potential, a comparison with classical statistical techniques is performed. This comparison establishes a correspondence between each neural network clustering technique and the particular affinity metrics used by the corresponding statistical methods. In particular, the HTN is found to perform the clustering task with an accuracy similar to that of the best statistical methods, while it is further capable of proposing an optimal number of groups into which the patterns may be clustered. The SOLNN, on the other hand, combines high clustering accuracy with the ability to cluster higher-dimensional patterns without a considerable increase in processing time. © 2003 Wiley Periodicals, Inc. [source]
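A classical baseline of the kind such neural models are measured against can be sketched as plain k-means; this is a generic illustration on synthetic patterns, not the specific statistical methods used in the article.

```python
import random

def kmeans(points, centers, iters=20):
    """Plain k-means (a classical clustering baseline, in pure Python)."""
    for _ in range(iters):
        clusters = [[] for _ in centers]
        for p in points:
            j = min(range(len(centers)),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
            clusters[j].append(p)
        # Move each centre to the mean of its cluster (keep it if the cluster is empty)
        centers = [tuple(sum(d) / len(cl) for d in zip(*cl)) if cl else centers[j]
                   for j, cl in enumerate(clusters)]
    return centers, clusters

random.seed(2)
# Two well-separated synthetic groups of 2-D patterns
pts = ([(random.gauss(0, 0.3), random.gauss(0, 0.3)) for _ in range(50)]
       + [(random.gauss(3, 0.3), random.gauss(3, 0.3)) for _ in range(50)])
# Deterministic initialisation for the sketch: one seed point from each group
centers, clusters = kmeans(pts, [pts[0], pts[-1]])
```

Note that k-means must be told the number of groups in advance, which is exactly the limitation the HTN's ability to propose an optimal number of clusters addresses.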


    Karl Pearson and the Establishment of Mathematical Statistics

    INTERNATIONAL STATISTICAL REVIEW, Issue 1 2009
    M. Eileen Magnello
    Summary At the end of the nineteenth century, the content and practice of statistics underwent a series of transitions that led to its emergence as a highly specialised mathematical discipline. These intellectual and later institutional changes were, in part, brought about by a mathematical-statistical translation of Charles Darwin's redefinition of the biological species as something that could be viewed in terms of populations. Karl Pearson and W.F.R. Weldon's mathematical reconceptualisation of Darwinian biological variation and "statistical" population of species in the 1890s provided the framework within which a major paradigmatic shift occurred in statistical techniques and theory. Weldon's work on the shore crab in Naples and Plymouth from 1892 to 1895 not only brought them to the forefront of ideas on speciation and provided the impetus for Pearson's earliest statistical innovations, but it also led to Pearson shifting his professional interests from an established career as a mathematical physicist to a new one as a biometrician. The innovative statistical work Pearson undertook with Weldon in 1892 and later with Francis Galton in 1894 enabled him to lay the foundations of modern mathematical statistics. While Pearson's diverse publications, his establishment of four laboratories and the creation of new academic departments underscore the plurality of his work, the main focus of his life-long career was the establishment and promulgation of his statistical methodology. [source]


    War and Peace in Space and Time: The Role of Democratization

    INTERNATIONAL STUDIES QUARTERLY, Issue 1 2000
    Kristian S. Gleditsch
    Democratization reduces the risk of war, but uneven transitions toward democracy can increase the probability of war. Using country-level data on democratization and international war from the period 1875–1996, we develop a generalized additive statistical model reassessing this claim in light of temporal and spatial dependence. We also develop a new geopolitical database of contiguities and demonstrate new statistical techniques for probing the extent of spatial clustering and its impact on the relationship between democratization and war. Our findings reaffirm that democratization generally does reduce the risk of war, but that large swings back and forth between democracy and autocracy can increase war proneness. We show that the historical context of peace diminishes the risk of war, while a regional context plagued by conflict greatly magnifies it. [source]


    Food Scares and Trust: A European Study

    JOURNAL OF AGRICULTURAL ECONOMICS, Issue 1 2008
    Mario Mazzocchi
    Abstract The complex interactions between the determinants of food purchase under risk are explored using the SPARTA model, based on the theory of planned behaviour, and estimated through a combination of multivariate statistical techniques. The application investigates chicken consumption choices in two scenarios: (a) a 'standard' purchasing situation; and (b) following a hypothetical Salmonella scare. The data are from a nationally representative survey of 2,725 respondents from five European countries: France, Germany, Italy, the Netherlands and the United Kingdom. Results show that the effects and interactions of behavioural determinants vary significantly within Europe. Only in the case of a food scare do risk perceptions and trust come into play. The policy priority should be on building and maintaining trust in food and health authorities and research institutions, while food chain actors could mitigate the consequences of a food scare through public trust. No relationship is found between socio-demographic variables and consumer trust in food safety information. [source]


    Effect of Off-Pump Coronary Artery Bypass Grafting on Risk-Adjusted and Cumulative Sum Failure Outcomes After Coronary Artery Surgery

    JOURNAL OF CARDIAC SURGERY, Issue 6 2002
    Richard J. Novick M.D.
    We therefore applied CUSUM, as well as standard statistical techniques, to analyze a surgeon's experience with off-pump coronary artery bypass grafting (OPCAB) and on-pump procedures to determine whether the two techniques have similar or different outcomes. Methods: In 320 patients undergoing nonemergent, first-time coronary artery bypass grafting, preoperative patient characteristics, rates of mortality and major complications, and ICU and hospital lengths of stay were compared between the on-pump and OPCAB cohorts using Fisher's exact tests and Wilcoxon two-sample tests. Predicted mortality and length of stay were determined using previously validated models of the Cardiac Care Network of Ontario. Observed versus expected ratios of both variables were calculated for the two types of procedures. Furthermore, CUSUM curves were constructed for the on-pump and OPCAB cohorts. A multivariable analysis of the predictors of hospital length of stay was also performed to determine whether the type of coronary artery bypass procedure had an independent impact on this variable. Results: The predicted mortality risk and predicted hospital length of stay were almost identical in the 208 on-pump patients (2.2 ± 3.9%; 8.2 ± 2.5 days) and the 112 OPCAB patients (2.0 ± 2.2%; 7.8 ± 2.1 days). The incidences of hospital mortality and postoperative stroke were 2.9% and 2.4% in on-pump patients versus zero in OPCAB patients (p = 0.09 and 0.17, respectively). Mechanical ventilation for greater than 48 hours was significantly less common in OPCAB (1.8%) than in on-pump patients (7.7%, p = 0.04). The rate of 10 major complications was 14.9% in on-pump versus 8.0% in OPCAB patients (p = 0.08). OPCAB patients experienced a hospital length of stay that was a median of 1.0 day shorter than on-pump patients (p = 0.01). The observed versus expected ratio for length of stay was 0.78 in OPCAB patients versus 0.95 in on-pump patients. 
On CUSUM analysis, the failure curve in OPCAB patients was negative and flatter than that of on-pump patients throughout the duration of the study. Furthermore, OPCAB was an independent predictor of reduced hospital length of stay on multivariable analysis. Conclusions: OPCAB was associated with better outcomes than on-pump coronary artery bypass despite a similar predicted risk. This robust finding was documented by sensitive CUSUM analysis, by standard statistical techniques, and by a multivariable analysis of the independent predictors of hospital length of stay. (J Card Surg 2002;17:520-528) [source]
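The CUSUM failure curve itself is simple to construct: a running sum of observed failures minus predicted risk, which drifts downward when a surgeon's patients fare better than the risk model predicts. The sketch below uses the cohort size (112 OPCAB patients), the roughly 2% predicted mortality, and the zero observed deaths quoted in the abstract; treating the per-case risk as flat is a simplifying assumption.

```python
def cusum_failure_curve(outcomes, predicted_risks):
    """Risk-adjusted CUSUM: running sum of (observed failure - predicted risk).
    A negative, flattening curve means fewer failures than the model predicts."""
    curve, s = [], 0.0
    for failed, risk in zip(outcomes, predicted_risks):
        s += failed - risk
        curve.append(s)
    return curve

# 112 OPCAB cases with ~2% predicted mortality and no observed deaths,
# matching the figures quoted in the abstract
curve = cusum_failure_curve([0] * 112, [0.02] * 112)
print(round(curve[-1], 2))  # 112 * (0 - 0.02) = -2.24
```

The steady downward slope with no upward jumps is what a "negative and flatter" failure curve looks like; an on-pump curve with occasional deaths would show upward steps of nearly 1 at each failure.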


    Predictive and interpolative biplots applied to canonical variate analysis in the discrimination of vegetable oils by their fatty acid composition

    JOURNAL OF CHEMOMETRICS, Issue 9 2004
    M. Rui Alves
    Abstract The fatty acid profiles of 120 commercial unblended peanut, corn, soybean and sunflower vegetable oils and 17 commercial brands of blended edible oils were determined by HRGC/FID/capillary column, including several cis and trans isomers of mono-, di- and tri-unsaturated fatty acids as well as fatty acids with an odd number of carbon atoms. Although many statistical techniques may show some usefulness in the description and analysis of the data obtained, predictive biplots applied to canonical variate analysis prove to be a very useful way of carrying out interpretations and an important aid in building up models, while interpolative biplots display great advantages in the utilization of models for classification purposes on a day-to-day basis. Furthermore, these biplots require only a modest understanding of statistical tools, since all judgements are made regarding original fatty acids and original measuring units. Copyright © 2005 John Wiley & Sons, Ltd. [source]


    Statistical analysis of catalyst degradation in a semi-continuous chemical production process

    JOURNAL OF CHEMOMETRICS, Issue 8 2001
    Eleftherios Kaskavelis
    Abstract The effect of decaying catalyst efficacy in a commercial-scale, semi-continuous petrochemical process was investigated. The objective was to gain a better understanding of process behaviour and its effect on production rate. The process includes a three-stage reaction performed in fixed bed reactors. Each of the three reaction stages consists of a number of catalyst beds that are changed periodically to regenerate the catalyst. Product separation and reactant recycling are then performed in a series of distillation columns. In the absence of specific measurements of the catalyst properties, process operational data are used to assess catalyst decay. A number of statistical techniques were used to model production rate as a function of process operation, including information on short- and long-term catalyst decay. It was found that ridge regression, partial least squares and stepwise selection multiple linear regression yielded similar predictive models. No additional benefit was found from the application of non-linear partial least squares or Curds and Whey. Finally, through time series profiles of total daily production volume, corresponding to individual in-service cycles of the different reaction stages, short-term catalyst degradation was assessed. It was shown that by successively modelling the process as a sequence of batches corresponding to cycles of each reaction stage, considerable economic benefit could be realized by reducing the maximum cycle length in the third reaction stage. Copyright © 2001 John Wiley & Sons, Ltd. [source]
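Ridge regression, one of the techniques found to yield similar predictive models, can be illustrated in closed form for two collinear predictors. The process variables below are synthetic stand-ins for the plant measurements, chosen to show why a penalized fit is attractive when predictors are nearly redundant.

```python
import random

def ridge_two_predictors(X, y, lam):
    """Closed-form ridge for two predictors: solves (X'X + lam*I) b = X'y
    using the explicit 2x2 inverse (pure-Python sketch)."""
    a = sum(x[0] * x[0] for x in X) + lam
    b = sum(x[0] * x[1] for x in X)
    d = sum(x[1] * x[1] for x in X) + lam
    g0 = sum(x[0] * yi for x, yi in zip(X, y))
    g1 = sum(x[1] * yi for x, yi in zip(X, y))
    det = a * d - b * b
    return (d * g0 - b * g1) / det, (a * g1 - b * g0) / det

random.seed(4)
# Synthetic stand-in for collinear plant measurements driving production rate
X, y = [], []
for _ in range(100):
    u = random.gauss(0, 1)
    row = (u + random.gauss(0, 0.1), u + random.gauss(0, 0.1))  # nearly redundant pair
    X.append(row)
    y.append(2.0 * row[0] + 1.0 * row[1] + random.gauss(0, 0.2))

b0, b1 = ridge_two_predictors(X, y, lam=1.0)
```

With the predictors this collinear, the penalty shares the combined effect (about 3 here) across both coefficients instead of letting ordinary least squares produce unstable, inflated estimates, which is why ridge and the latent-variable method PLS tend to behave similarly on such data.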


    Using effectiveness studies for prescribing research, part 2

    JOURNAL OF CLINICAL PHARMACY & THERAPEUTICS, Issue 6 2002
    N. Freemantle PhD
    Summary Trials that consider the effects of interventions on the prescribing behaviour of clinicians often have complex design implications, resulting in data with an inherent hierarchical structure. It follows that both the experimental design and the analysis plan must account for this structure, and thus results should be considered in terms of clinician behaviour rather than individual patient response. We describe this change in perspective and the necessity of using statistical techniques that allow potential confounding effects to be incorporated. We also discuss the appropriateness of some specific outcomes in relation to these trials. [source]
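The change of perspective the authors describe, analysing at the level of the clinician rather than the patient, can be sketched with simulated data. The cluster sizes, prescribing rates, and intervention effect below are invented for illustration; the point is that each clinician contributes one summary value, so the unit of analysis matches the unit of randomisation.

```python
import random
from statistics import mean

def clinician_level_rates(clusters):
    """Summarise each clinician (cluster) by their own prescribing rate,
    so the unit of analysis matches the unit of randomisation."""
    return [mean(patients) for patients in clusters]

def simulate_arm(base_rate, n_clinicians=20, n_patients=30):
    """One trial arm: outcomes within a clinician share that clinician's own
    rate, which is what induces the hierarchical (clustered) structure."""
    arm = []
    for _ in range(n_clinicians):
        p = min(max(base_rate + random.gauss(0, 0.1), 0.0), 1.0)
        arm.append([1 if random.random() < p else 0 for _ in range(n_patients)])
    return arm

random.seed(5)
control = simulate_arm(0.30)       # usual practice
intervention = simulate_arm(0.45)  # after the prescribing intervention
effect = (mean(clinician_level_rates(intervention))
          - mean(clinician_level_rates(control)))
```

A naive patient-level comparison of the same data would give a similar point estimate but a spuriously narrow confidence interval, because the 600 patients per arm are not 600 independent observations; the clinician-level summary makes the correct effective sample size explicit.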