Statistical Information

Selected Abstracts


The effect of discordance among violence and general recidivism risk estimates on predictive accuracy

CRIMINAL BEHAVIOUR AND MENTAL HEALTH, Issue 3 2006
Jeremy F. Mills
Introduction: Previous research has shown that the prediction of short-term inpatient violence is negatively affected when clinicians' inter-rater agreement is low and when confidence in the estimate of risk is low. This study examined the effect of discordance between risk assessment instruments used to predict long-term general and violence risk in offenders. Methods: The Psychopathy Checklist - Revised (PCL-R), Level of Service Inventory - Revised (LSI-R), Violence Risk Appraisal Guide (VRAG), and the General Statistical Information on Recidivism (GSIR) were the four risk-prediction instruments used to predict post-release general and violent recidivism within a sample of 209 offenders. Results: The findings lend empirical support to the assumption that predictive accuracy is threatened where there is discordance between risk estimates. Discordance between instruments reduced predictive accuracy for all instruments except the GSIR. Further, the influence of discordance was shown to be greater on some instruments than on others: discordance had a moderating effect on both the PCL-R and LSI-R but not on the VRAG and GSIR. Conclusions: When attempting to predict recidivism, there is a distinct advantage to employing measures such as the LSI-R, which include dynamic variables and intervention-related criminogenic domains, over measures of purely fixed characteristics, such as the GSIR; however, if there is discordance between the risk estimates, caution should be exercised and more reliance on the more static, historically based instrument may be indicated. Copyright © 2006 John Wiley & Sons, Ltd. [source]


Semi-empirical model for site effects on acceleration time histories at soft-soil sites.

EARTHQUAKE ENGINEERING AND STRUCTURAL DYNAMICS, Issue 11 2004
Part 1: formulation and development
Abstract A criterion is developed for the simulation of realistic artificial ground motion histories at soft-soil sites, corresponding to a detailed ground motion record at a reference firm-ground site. A complex transfer function is defined as the Fourier transform of the ground acceleration time history at the soft-soil site divided by the Fourier transform of the acceleration record at the firm-ground site. Working with both the real and the imaginary components of the transfer function, and not only with its modulus, serves to keep the statistical information about the wave phases (and, therefore, about the time variation of amplitudes and frequencies) in the algorithm used to generate the artificial records. Samples of these transfer functions, associated with a given pair of soft-soil and firm-ground sites, are empirically determined from the corresponding pairs of simultaneous records. Each function included in a sample is represented as the superposition of the transfer functions of the responses of a number of oscillators. This formulation is intended to account for the contributions of trains of waves following different patterns in the vicinity of both sites. The properties of the oscillators play the role of parameters of the transfer functions. They vary from one seismic event to another. Part of the variation is systematic, and can be explained in terms of the influence of ground motion intensity on the effective values of stiffness and damping of the artificial oscillators. Another part is random in nature; it reflects the random characteristics of the wave propagation patterns associated with the different events. The semi-empirical model proposed recognizes both types of variation. The influence of intensity is estimated by means of a conventional one-dimensional shear wave propagation model. This model is used to derive an intensity-dependent modification of the values of the empirically determined model parameters in those cases where the firm-ground earthquake intensity used to determine these parameters differs from that corresponding to the seismic event for which the simulated records are to be obtained. Copyright © 2004 John Wiley & Sons, Ltd. [source]
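
The central quantity here, a complex transfer function formed as the ratio of Fourier transforms of paired soft-soil and firm-ground records, can be sketched in a few lines. This is a minimal illustration under assumed inputs (synthetic stand-in records, fixed sampling interval), not the authors' implementation, which further represents the sampled transfer functions as superposed oscillator responses.

    import numpy as np

    def complex_transfer_function(a_soft, a_firm, dt, eps=1e-12):
        """Ratio of Fourier transforms of paired acceleration records.

        Keeping the complex ratio (not just its modulus) preserves the relative
        phase information between the two sites.
        """
        n = min(len(a_soft), len(a_firm))
        A_soft = np.fft.rfft(a_soft[:n])
        A_firm = np.fft.rfft(a_firm[:n])
        freqs = np.fft.rfftfreq(n, d=dt)
        H = A_soft / (A_firm + eps)      # complex-valued transfer function
        return freqs, H

    # usage sketch with synthetic stand-in records (placeholders, not real data)
    dt = 0.01
    t = np.arange(0.0, 40.0, dt)
    a_firm = np.random.randn(t.size)
    a_soft = 2.5 * np.convolve(a_firm, np.ones(20) / 20, mode="same")
    freqs, H = complex_transfer_function(a_soft, a_firm, dt)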


Settlement in Tax Evasion Prosecution

ECONOMICA, Issue 283 2004
Inés Macho-Stadler
It is often argued that, even if optimal ex post, settlement dilutes deterrence ex ante. We analyse the interest for the tax authority of committing, ex ante, to a settlement strategy. We show that to commit to the use of settlements is ex ante optimal when the tax authority receives signals that provide statistical information about the taxpayers' true tax liability. The more informative the signal, the larger the additional expected revenue raised by the tax authority when using settlement as a policy tool. [source]


Measuring the amount of statistical information in the EPT index

ENVIRONMETRICS, Issue 1 2005
Patty L. Kitchin
Abstract Biological monitoring is the process of measuring the effect of environmental stress on the environment. Aquatic macroinvertebrates are widely used in the monitoring of freshwater lotic systems. The macroinvertebrate fauna of a reference stream is commonly compared to the fauna of an impacted stream that is affected by an environmental stressor. The smaller the similarity between these two streams, the greater the effect of pollution or stress on the impacted stream. Many richness measures, or statistics, exist for measuring similarity. These statistics can be computed using different levels of taxonomic resolution (species, genus and family). Many aquatic biologists believe that species-level identifications, which require exorbitant time and expertise, are needed for correct data interpretations. The actual amount of information provided by these statistics at different taxonomic levels has never been measured. This article evaluates the amount of statistical information provided by the EPT index as compared to a sufficient statistic at the various levels of taxonomic resolution. Copyright © 2004 John Wiley & Sons, Ltd. [source]


Using spatial models and kriging techniques to optimize long-term ground-water monitoring networks: a case study

ENVIRONMETRICS, Issue 5-6 2002
Kirk Cameron
Abstract In a pilot project, a spatial and temporal algorithm (geostatistical temporal-spatial, or GTS) was developed for optimizing long-term monitoring (LTM) networks. Data from two monitored ground-water plumes were used to test the algorithm. The primary objective was to determine the degree to which sampling, laboratory analysis, and/or well construction resources could be pared without losing key statistical information concerning the plumes. Optimization of an LTM network requires an accurate assessment of both ground-water quality over time and trends or other changes in individual monitoring wells. Changes in interpolated concentration maps over time indicate whether ground-water quality has improved or declined. GTS separately identifies temporal and spatial redundancies. Temporal redundancy may be reduced by lengthening the time between sample collection. Spatial redundancy may be reduced by removing from the network wells that do not significantly affect the assessment of ground-water quality. Part of the temporal algorithm in GTS involves computation of a composite temporal variogram to determine the least redundant overall sampling interval. Under this measure of autocorrelation between sampling events, the lag time at which the variogram reaches a sill is the sampling interval at which same-well measurements lack correlation and are therefore non-redundant. The spatial algorithm assumes that well locations are redundant if nearby wells offer nearly the same statistical information about the underlying plume. A well was considered redundant if its removal did not significantly change: (i) an interpolated map of the plume; (ii) the local kriging variances in that section of the plume; and (iii) the average global kriging variance. To identify well redundancy, local kriging weights were accumulated into global weights and used to gauge each well's relative contribution to the interpolated plume map. By temporarily removing the subset of wells with the lowest global kriging weights and re-mapping the plume, it was possible to determine how many wells could be removed without losing critical information. Test results from the Massachusetts Military Reservation (MMR) indicated that substantial savings in sampling, analysis and operational costs could be realized by utilizing GTS. Annual budgetary savings that would accrue were estimated at between 35 per cent and 5 per cent for both LTM networks under study. Copyright © 2002 John Wiley & Sons, Ltd. [source]
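
The composite temporal variogram at the heart of the temporal algorithm is, in essence, an empirical semivariogram of same-well measurements over time lags; the lag at which it levels off (the sill) suggests the sampling interval beyond which measurements stop being redundant. Below is a generic sketch with made-up monthly data, not the GTS code itself.

    import numpy as np

    def empirical_semivariogram(times, values, lag_edges):
        """Average squared half-differences of same-well measurements, binned by time lag."""
        times = np.asarray(times, dtype=float)
        values = np.asarray(values, dtype=float)
        i, j = np.triu_indices(len(times), k=1)
        lags = np.abs(times[i] - times[j])
        sq = 0.5 * (values[i] - values[j]) ** 2
        gamma = []
        for lo, hi in zip(lag_edges[:-1], lag_edges[1:]):
            mask = (lags >= lo) & (lags < hi)
            gamma.append(sq[mask].mean() if mask.any() else np.nan)
        return np.array(gamma)

    # usage sketch: 60 months of synthetic same-well concentrations
    times = np.arange(60)
    values = np.sin(times / 6.0) + 0.3 * np.random.randn(times.size)
    gamma = empirical_semivariogram(times, values, lag_edges=np.arange(0, 37, 3))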


Outcome of psychological treatments of pathological gambling: a review and meta-analysis

ADDICTION, Issue 10 2005
Ståle Pallesen
ABSTRACT Aims To investigate the short- and long-term effects of psychological treatments of pathological gambling and factors relating to treatment outcome. Design and setting This study provides a quantitative meta-analytical review of psychotherapeutic treatments of pathological gambling. Studies were identified by computer search in the PsycINFO and Medline databases covering the period from 1966 to 2004, as well as from relevant reference lists. Inclusion criteria The target problem was pathological gambling, the treatment was psychological, the study was published in English and outcomes directly pertaining to gambling were employed. Single case studies, studies where elimination of gambling was not the priority and studies with insufficient statistical information were excluded from the present meta-analysis. Participants A total of 37 outcome studies, published or reported between 1968 and 2004, were identified. Of these, 15 were excluded; thus 22 studies were included, involving 1434 subjects. The grand mean age was 40.1 years. The overall proportion of men was 71.5%. Measurements The included studies were coded for outcome measures of pathological gambling. For each condition, means and standard deviations for gambling-related outcome measures, all based upon self-reports or therapist ratings, were compiled at three points in time: baseline, post-treatment and the last follow-up reported. Findings Effect sizes represent the difference between the mean score in a treatment condition and a control condition, or the difference between mean scores at separated points in time for one group, expressed in terms of standard deviation units. At post-treatment the analysis indicated that psychological treatments were more effective than no treatment, yielding an overall effect size of 2.01 (P < 0.01). At follow-up (averaging 17.0 months) the corresponding effect size was 1.59 (P < 0.01). A multiple regression analysis showed that the magnitude of effect sizes at post-treatment was lower in studies including only patients with a formal diagnosis of pathological gambling, compared with studies not employing such inclusion criteria. Effect sizes were also higher in randomized controlled trials compared with non-randomized trials, higher in within-subjects designs compared with between-subjects designs, and positively related to the number of therapy sessions. No mediator variables were significantly related to the magnitude of the effect sizes at follow-up. Conclusion Psychological interventions for pathological gambling seem to yield very favourable short- and long-term outcomes. [source]
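
The effect-size measure described here is a standardized mean difference: the gap between two means expressed in standard-deviation units. A minimal sketch of the pooled-standard-deviation (Cohen's d) variant follows, with hypothetical numbers; the review may well have used a different estimator or small-sample corrections.

    import math

    def standardized_mean_difference(mean_t, sd_t, n_t, mean_c, sd_c, n_c):
        """Difference between treatment and control means in pooled-SD units."""
        pooled_sd = math.sqrt(((n_t - 1) * sd_t ** 2 + (n_c - 1) * sd_c ** 2) / (n_t + n_c - 2))
        return (mean_t - mean_c) / pooled_sd

    # hypothetical gambling-severity scores (lower is better after treatment)
    d = standardized_mean_difference(mean_t=4.0, sd_t=3.0, n_t=30, mean_c=10.0, sd_c=3.0, n_c=30)
    print(d)  # -2.0: treatment group scores two standard deviations below control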


Geolocation of Atlantic cod (Gadus morhua) movements in the Gulf of Maine using tidal information

FISHERIES OCEANOGRAPHY, Issue 4 2007
J. P. GRÖGER
Abstract Information derived from archival tags (digital storage tags, DSTs) was used to backtrack the migration of 11 tagged Atlantic cod (Gadus morhua) during 2001 in Massachusetts Bay, the Gulf of Maine, and Georges Bank. The DST tags continuously recorded time, temperature and depth. To geolocate fish positions during their time at large, we first extracted the tidal signal from the pressure recordings on the DST tags, and then compared the resulting data to predictions from a Massachusetts Bay tidal model, which provided geographical coordinates at a given date and time. Using least-squares criteria within an estimated geographical region of confidence that was constrained by biological and statistical information (e.g. swimming speed, known release and recapture locations, and bottom depth), we were able to geolocate the fish. The resultant geolocated migration tracks indicate a large degree of movement of Atlantic cod in the region and an elevated importance of the Great South Channel (GSC) migration corridor between Massachusetts Bay and the western Georges Bank and Nantucket Shoals region. This observation contrasts strongly with inferences of limited movements by Atlantic cod based on conventional tag recapture methods (a mean of 1200 km traveled as measured by geolocation versus 44 km as measured by conventional tagging). This study demonstrates that geolocation methodologies applied to archival tag studies hold great promise of becoming an important new tool for fisheries managers to quantify the movements of fishes. It also points out the need for greater collaboration between fisheries scientists and oceanographers, and particularly for the development of improved tidal models to cover stock regions more accurately and with higher precision. [source]
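
The geolocation step, matching the tidal signal extracted from a tag's depth record against a tidal model's predictions over candidate grid cells and picking the best least-squares fit, can be sketched as below. This is a toy illustration with hypothetical cells; the actual procedure also applies the biological and statistical constraints described above.

    import numpy as np

    def geolocate_by_tide(obs_tide, model_tide_by_cell, candidate_cells):
        """Return the candidate cell whose predicted tidal curve best matches the tag's signal."""
        best_cell, best_sse = None, np.inf
        for cell in candidate_cells:
            resid = obs_tide - model_tide_by_cell[cell]
            sse = float(np.dot(resid, resid))   # least-squares criterion
            if sse < best_sse:
                best_cell, best_sse = cell, sse
        return best_cell, best_sse

    # toy usage: two hypothetical cells with different tidal amplitude and phase
    t = np.linspace(0.0, 24.0, 97)
    model = {"A": 1.0 * np.sin(2 * np.pi * t / 12.42),
             "B": 2.0 * np.sin(2 * np.pi * t / 12.42 + 0.8)}
    obs = model["B"] + 0.1 * np.random.randn(t.size)
    print(geolocate_by_tide(obs, model, ["A", "B"]))   # expected: cell "B"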


The design of an optimal filter for monthly GRACE gravity models

GEOPHYSICAL JOURNAL INTERNATIONAL, Issue 2 2008
R. Klees
SUMMARY Most applications of the publicly released Gravity Recovery and Climate Experiment (GRACE) monthly gravity field models require the application of a spatial filter to help suppress noise and other systematic errors present in the data. The most common approach makes use of a simple Gaussian averaging process, which is often combined with a 'destriping' technique in which coefficient correlations within a given degree are removed. As brute-force methods, neither of these techniques takes into consideration the statistical information from the gravity solution itself and, while they perform well overall, they can often end up removing more signal than necessary. Other optimal filters have been proposed in the literature; however, none has attempted to make full use of all information available from the monthly solutions. By examining the underlying principles of filter design, a filter has been developed that incorporates the noise and full signal variance-covariance matrices to tailor the filter to the error characteristics of a particular monthly solution. The filter is both anisotropic and non-symmetric, meaning it can accommodate noise of an arbitrary shape, such as the characteristic stripes. The filter minimizes the mean-square error and, in this sense, can be considered the optimal filter. Through both simulated and real data scenarios, this improved filter is shown to preserve the highest amount of gravity signal when compared with other standard techniques, while simultaneously minimizing leakage effects and producing smooth solutions in areas of low signal. [source]
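
In coefficient space, a filter that minimizes mean-square error given a signal covariance S and a noise covariance N takes the familiar Wiener form W = S (S + N)^(-1); using full covariance matrices is what makes the result anisotropic and non-symmetric. The sketch below only illustrates that general idea on a tiny assumed covariance pair; it is not the authors' filter, which is built from the actual GRACE signal and noise variance-covariance information.

    import numpy as np

    def mse_optimal_filter(signal_cov, noise_cov):
        """Wiener-type filter matrix W = S (S + N)^(-1) minimizing mean-square error."""
        return signal_cov @ np.linalg.inv(signal_cov + noise_cov)

    # toy usage: correlated ("stripy") noise leads to a non-symmetric filter matrix
    S = np.array([[4.0, 1.5], [1.5, 2.0]])   # assumed signal covariance
    N = np.array([[1.0, 0.8], [0.8, 3.0]])   # assumed noise covariance
    W = mse_optimal_filter(S, N)
    filtered = W @ np.array([2.0, -1.0])     # apply to a noisy coefficient vector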


Improving Parsing of 'BA' Sentences for Machine Translation

IEEJ TRANSACTIONS ON ELECTRICAL AND ELECTRONIC ENGINEERING, Issue 1 2008
Dapeng Yin Non-member
Abstract Research on Chinese-Japanese machine translation has been under way for many years, and the field is becoming increasingly refined. In practical machine translation systems, the processing of simple, short Chinese sentences gives reasonably good results. However, the translation of complex, long Chinese sentences still presents difficulties. For example, these systems are still unable to solve the translation problems posed by complex 'BA' sentences. In this article a new method for parsing 'BA' sentences for machine translation, based on valency theory, is proposed. A 'BA' sentence is one that contains the prepositional word 'BA'. The structural characteristic of a 'BA' sentence is that the original verb follows the object word: the object word after the 'BA' preposition is used as an adverbial modifier of an active word. First, a large number of grammar items from Chinese grammar books were collected, and elementary judgment rules were set by classifying and consolidating the collected grammar items. These judgment rules were then applied to actual Chinese text and modified by checking their results, using statistical information from an actual corpus. A five-segment model for 'BA' sentence translation is put forward on the basis of this analysis. Finally, we applied the proposed model in our machine translation system and evaluated the experimental results. It achieved a 91.3% rate of accuracy, and this satisfactory result verifies the effectiveness of our five-segment model for 'BA' sentence translation. Copyright © 2007 Institute of Electrical Engineers of Japan. Published by John Wiley & Sons, Inc. [source]


Comparison between non-probabilistic interval analysis method and probabilistic approach in static response problem of structures with uncertain-but-bounded parameters

INTERNATIONAL JOURNAL FOR NUMERICAL METHODS IN BIOMEDICAL ENGINEERING, Issue 4 2004
Zhiping Qiu
Abstract The uncertainty present in many practical engineering analysis and design problems can be modelled using probabilistic methods or non-probabilistic interval analysis methods. One motivation for using non-probabilistic interval analysis models rather than probabilistic models for uncertain variables is the general dearth of information available for characterizing the uncertainties. Non-probabilistic interval analysis methods are less information-intensive than probabilistic models, since no density information is required. Instead of conventional optimization studies, where the minimum possible response is sought, here an uncertainty model is developed as an anti-optimization problem of finding both the least favourable response and the most favourable response under the constraints of the set-theoretical description. Non-probabilistic interval analysis methods have been used for dealing with uncertain phenomena in a wide range of engineering applications. This paper compares the non-probabilistic interval analysis method with the probabilistic approach for the static response problem of structures with uncertain-but-bounded parameters, by means of mathematical proofs and numerical calculations. The results show that, when the interval vector of the uncertain parameters is determined from the probabilistic and statistical information, the width of the static displacement obtained by the non-probabilistic interval analysis method is larger than that obtained by the probabilistic approach for structures with uncertain-but-bounded structural parameters. This is just the result we expect, since according to the definitions of probabilistic theory and interval mathematics, the region determined by the non-probabilistic interval analysis method should contain the one predicted by the probabilistic approach. Copyright © 2004 John Wiley & Sons, Ltd. [source]
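
The containment argument can be illustrated on the simplest static problem: a single spring with u = F/k and a stiffness known only to lie in an interval. Any probability model supported on that same interval yields a high-probability displacement range that sits inside the interval-analysis bounds. A hedged toy sketch (a uniform distribution is assumed for the probabilistic side):

    import numpy as np

    # single degree of freedom: u = F / k, stiffness bounded but otherwise unknown
    F = 10.0
    k_lo, k_hi = 90.0, 110.0

    # non-probabilistic interval analysis: extreme responses occur at the stiffness bounds
    u_interval = (F / k_hi, F / k_lo)

    # probabilistic treatment on the same support (uniform assumption, Monte Carlo)
    rng = np.random.default_rng(0)
    u_samples = F / rng.uniform(k_lo, k_hi, 100_000)
    u_prob = tuple(np.quantile(u_samples, [0.005, 0.995]))   # central 99% interval

    print("interval bounds        :", u_interval)   # wider: encloses the probabilistic interval
    print("99% probabilistic range:", u_prob)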


Optimal and self-tuning fusion Kalman filters for discrete-time stochastic singular systems

INTERNATIONAL JOURNAL OF ADAPTIVE CONTROL AND SIGNAL PROCESSING, Issue 10 2008
Shu-Li Sun
Abstract Based on the optimal fusion estimation algorithm weighted by scalars in the linear minimum-variance sense, a distributed optimal fusion Kalman filter weighted by scalars is presented for discrete-time stochastic singular systems with multiple sensors and correlated noises. A cross-covariance matrix of filtering errors between any two sensors is derived. When the noise statistical information is unknown, a distributed identification approach is presented based on correlation functions and the weighted-average method. Further, a distributed self-tuning fusion filter is given, which involves two fusion stages: the first stage identifies the noise covariances and the second stage obtains the fused state filter. A simulation verifies the effectiveness of the proposed algorithm. Copyright © 2008 John Wiley & Sons, Ltd. [source]
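
The building block of scalar-weighted fusion is a linear minimum-variance combination of the local estimates. With uncorrelated filtering errors this reduces to inverse-variance weighting; the paper's algorithm additionally uses the derived cross-covariances between sensors. A hedged two-sensor sketch of the simplified, uncorrelated case:

    import numpy as np

    def fuse_scalar_weights(estimates, variances):
        """Minimum-variance fusion of unbiased local estimates, assuming uncorrelated errors."""
        estimates = np.asarray(estimates, dtype=float)
        variances = np.asarray(variances, dtype=float)
        w = (1.0 / variances) / np.sum(1.0 / variances)   # scalar weights, summing to one
        fused = float(w @ estimates)
        fused_var = float(1.0 / np.sum(1.0 / variances))
        return fused, fused_var

    # toy usage: two sensors estimating the same state component
    print(fuse_scalar_weights([1.2, 0.8], [0.04, 0.09]))   # fused value leans toward the more precise sensor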


Nursing and midwifery management of hypoglycaemia in healthy term neonates

INTERNATIONAL JOURNAL OF EVIDENCE BASED HEALTHCARE, Issue 7 2005
Vivien Hewitt BSc(Hons) GradDipLib
Executive summary Objectives The primary objective of this review was to determine the best available evidence for maintenance of euglycaemia in healthy term neonates, and for the management of asymptomatic hypoglycaemia in otherwise healthy term neonates. Inclusion criteria Types of studies The review included any relevant published or unpublished studies undertaken between 1995 and 2004. Studies that focus on the diagnostic accuracy of point-of-care devices for blood glucose screening and/or monitoring in the neonate were initially included as a subgroup of this review. However, the technical nature and complexity of the statistical information published in diagnostic studies retrieved during the literature search stage, as well as the considerable volume of published research in this area, suggested that it would be more feasible to analyse diagnostic studies in a separate systematic review. Types of participants The review focused on studies that included healthy term (37- to 42-week gestation), appropriate size for gestational age neonates in the first 72 h after birth. Exclusions: preterm or small for gestational age newborns; term neonates with a diagnosed medical or surgical condition, congenital or otherwise; babies of diabetic mothers; neonates with symptomatic hypoglycaemia; and large for gestational age neonates (as a significant proportion are born to diabetic mothers). Types of intervention All interventions that fell within the scope of practice of a midwife/nurse were included: type (breast or breast milk substitutes), amount and/or timing of feeds (for example, initiation of feeding and frequency); regulation of body temperature; and monitoring (including screening) of neonates, including blood or plasma glucose levels and signs and symptoms of hypoglycaemia. Interventions that required initiation by a medical practitioner were excluded from the review. Types of outcome measures Outcomes of interest included: occurrence of hypoglycaemia; re-establishment and maintenance of blood or plasma glucose levels at or above a set threshold (as defined by the particular study); successful breast-feeding; and developmental outcomes. Types of research designs The review initially focused on randomised controlled trials reported from 1995 to 2004. Insufficient randomised controlled trials were identified, so the review was expanded to include additional cohort and cross-sectional studies for possible inclusion in a narrative summary. Search strategy The major electronic databases, including MEDLINE/PubMed, CINAHL, EMBASE, LILACS and the Cochrane Library, among others, were searched using accepted search techniques to identify relevant published and unpublished studies undertaken between 1995 and 2004. Efforts were made to locate any relevant unpublished materials, such as conference papers, research reports and dissertations. Printed journals were hand-searched and reference lists checked for potentially useful research. The year 1995 was selected as the starting point in order to identify any research that had not been included in the World Health Organisation review, which covered literature published up to 1996. The search was not limited to English language studies. Assessment of quality Three primary reviewers conducted the review, assisted by a review panel. The review panel comprised nine nurses with expertise in neonatal care, drawn from senior staff in several metropolitan neonatal units and education programs. Authorship of journal articles was not concealed from the reviewers.
Methodological quality of each study that met the inclusion criteria was assessed by two reviewers, using a quality assessment checklist developed for the review. Disagreements between reviewers were resolved through discussion or with the assistance of a third reviewer. Data extraction and analysis Two reviewers used a data extraction form to independently extract data relating to the study design, setting and participants; study focus and intervention(s); and measurements and outcomes. As only one relevant randomised controlled trial was found, a meta-analysis could not be conducted, nor could tables be constructed to illustrate comparisons between studies. Instead, the findings were summarised in a narrative identifying any relevant findings that emerged from the data. Results Seven studies met the inclusion criteria for the objective of this systematic review. The review provided information on the effectiveness of three categories of intervention (type of feeds, timing of feeds and thermoregulation) on two of the outcome measures identified in the review protocol: prevention of hypoglycaemia, and re-establishment and maintenance of blood or plasma glucose levels above the set threshold (as determined by the particular study). There was no evidence available on which to base conclusions about the effectiveness of monitoring or about developmental outcomes, and insufficient evidence regarding breast-feeding success. Given that only a narrative review was possible, the findings of this review should be interpreted with caution. The findings suggest that hypoglycaemia in healthy, breast-fed term infants of appropriate size for gestational age is uncommon, and that routine screening of these infants is not indicated. The method and timing of early feeding has little or no influence on the neonatal blood glucose measurement at 1 h in normal term babies. In healthy, breast-fed term infants the initiation and timing of feeds in the first 6 h of life has no significant influence on plasma glucose levels. The colostrum of primiparous mothers provides sufficient nutrition for the infant in the first 24 h after birth, and supplemental feeds or extra water are unnecessary. Skin-to-skin contact appears to provide an optimal environment for fetal-to-neonatal adaptation after birth and can help to maintain body temperature and adequate blood glucose levels in healthy term newborn infants, as well as providing an ideal opportunity to establish early bonding behaviours. Implications for practice The seven studies analysed in this review confirm the World Health Organisation's first three recommendations for prevention and management of asymptomatic hypoglycaemia, namely: (1) early and exclusive breast-feeding is safe to meet the nutritional needs of healthy term newborns worldwide; (2) healthy term newborns that are breast-fed on demand need not have their blood glucose routinely checked and need no supplementary foods or fluids; and (3) healthy term newborns do not develop 'symptomatic' hypoglycaemia as a result of simple underfeeding. If an infant develops signs suggesting hypoglycaemia, look for an underlying condition; detection and treatment of the cause are as important as correction of the blood glucose level. If there are any concerns that the newborn infant might be hypoglycaemic, it should be given another feed. Given the importance of thermoregulation, skin-to-skin contact should be promoted and 'kangaroo care' encouraged in the first 24 h after birth. While it is important to maintain the infant's body temperature, care should be taken to ensure that the child does not become overheated. [source]


Agricultural Economics and Rural Development: Marriage or Divorce?

JOURNAL OF AGRICULTURAL ECONOMICS, Issue 3 2001
Presidential Address
Rural development is a growing field of interest, both in policy and conceptual terms. However, it is sometimes loosely defined, and statistical information is therefore often confusing. This paper attempts to clarify some of the concepts involved, and then to explore the relationship between the study of rural development and that of agricultural economics. It is argued that much would be gained by a clearer separation between the economics of land, covering its environmental aspects as well as food production, and that of the rural population and economy. Links between the two certainly exist, but are better recognised individually, by analysts and policymakers alike. [source]


Rhetoric of Atrocities: The Place of Horrific Human Rights Abuses in Presidential Persuasion Efforts

PRESIDENTIAL STUDIES QUARTERLY, Issue 2 2007
ERAN N. BEN-PORATH
An analysis of presidential rhetoric in the post-Cold War era finds that in building the case for imminent war, presidents turn to narrative descriptions of specific atrocities, namely rape, torture, and victimization of children. By the same token, presidents wishing to avoid American involvement in war use abstract terms and statistical information concerning human rights crises, but refrain from detailing personalized stories of abuse. This study expands on the theory of savagery as a necessary component in enemy construction and on the literature concerning the changing rhetorical landscape of the post-Cold War era. The analysis finds the rhetoric of atrocities employed and avoided, in similar fashion, by three presidents and across several different settings. The implications are discussed in the article. [source]


Method for estimating decomposition characteristics of energetic chemicals

PROCESS SAFETY PROGRESS, Issue 4 2003
Sima Chervin
Experimental data on the decomposition characteristics of approximately 400 chemicals, representing various classes of energetic materials, were summarized by chemical class and statistically analyzed. Average decomposition characteristics, such as energy of decomposition and decomposition onset temperature, were determined for chemical classes containing the following energetic groups: nitro, nitroso, N-oxide, oxime, hydroxylamine, tetrazole, azide, triazene, triazole, diazo, azo, hydrazine, and perchlorate. Additional statistical information is presented for each chemical class, such as the number of chemicals analyzed, ranges, and standard deviations for the decomposition parameters analyzed. For chemical classes containing an energetic group attached to an aromatic ring, the presence and position of another substituting group in the ring can significantly influence the decomposition onset temperature. The study summarizes the list of activating and deactivating functional groups, and the positions in the ring where the strongest activation or deactivation occurs. The authors also recommend a method for estimating decomposition parameters of new chemicals. [source]
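
The per-class tabulation described (counts, means, standard deviations and ranges of onset temperature and decomposition energy) is a grouped summary that can be reproduced on any such dataset. A sketch with invented records and assumed column names, purely to show the shape of the analysis:

    import pandas as pd

    # hypothetical measurements; the study itself covered roughly 400 chemicals
    df = pd.DataFrame({
        "energetic_class":   ["nitro", "nitro", "azide", "azide", "tetrazole"],
        "onset_temp_C":      [285.0, 310.0, 165.0, 150.0, 205.0],
        "decomp_energy_J_g": [-1500.0, -1800.0, -2400.0, -2600.0, -2100.0],
    })

    summary = (df.groupby("energetic_class")[["onset_temp_C", "decomp_energy_J_g"]]
                 .agg(["count", "mean", "std", "min", "max"]))
    print(summary)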


A second threshold for the hard-core model on a Bethe lattice

RANDOM STRUCTURES AND ALGORITHMS, Issue 3 2004
Graham R. Brightwell
We determine the approximate value of a critical activity for the hard-core model on the Bethe lattice, which determines whether the unique simple invariant Gibbs measure is extremal. This "recovery threshold" turns out to be different both from the threshold for unique Gibbs measure and (in contrast to the Ising model) from the threshold for recovery of root information using only statistical information about distant sites. © 2004 Wiley Periodicals, Inc. Random Struct. Alg., 2004 [source]


Representation of 3D heterogeneous cloud fields using copulas: Theory for water clouds

THE QUARTERLY JOURNAL OF THE ROYAL METEOROLOGICAL SOCIETY, Issue 636 2008
Peter M. Norris
Abstract It is shown that a general representation of GCM column cloud fraction within probability density function (PDF)-based statistical cloud parametrizations can be obtained using statistical functions called copulas that encapsulate the dependence structure of rank statistics in a multivariate system. Using this theory, a new Gaussian copula formulation of GCM cloud overlap is obtained. The copula approach provides complete flexibility in the choice of the marginal PDF of each layer's moisture and temperature, and, compared with earlier approaches, including the 'generalized overlap' approach, allows a far more general specification of the correlation between any pair of layers. It also allows easy addition of new layer variables, such as temperature, into the modelled grid-column statistics. As a preliminary test of this formulation, its ability to statistically describe a cloud-resolving model simulation of a complex multi-layer case-study, including both large-scale and convective clouds, is examined. The Gaussian copula cloud fraction is found to be significantly less biased than other common cloud overlap methods for this case-study. Estimates of several nonlinear quantities are also improved with the Gaussian copula model: the variance of condensed water path and the fluxes of solar and thermal radiation at atmospheric column boundaries. This first paper, though limited to the simpler case of water clouds, addresses subgrid-scale variability in both moisture and temperature. This work is envisaged as a first step towards developing a generalized statistical framework for GCM cloud parametrization and for assimilating statistical information from high-resolution satellite observations into GCMs and global analyses. Copyright © 2008 Royal Meteorological Society [source]
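
The mechanics of a Gaussian copula, drawing correlated Gaussian variates, converting them to uniform ranks, and pushing those ranks through each layer's own marginal distribution, can be sketched generically as follows. The correlation matrix and marginals here are assumptions for illustration, not the parametrization used in the paper.

    import numpy as np
    from scipy import stats

    def gaussian_copula_sample(corr, marginal_ppfs, n_samples, seed=0):
        """Sample variables with a Gaussian dependence structure and arbitrary marginals.

        corr: inter-layer correlation matrix of the underlying Gaussian
        marginal_ppfs: one inverse-CDF (ppf) callable per layer/variable
        """
        rng = np.random.default_rng(seed)
        L = np.linalg.cholesky(corr)
        z = rng.standard_normal((n_samples, corr.shape[0])) @ L.T   # correlated Gaussians
        u = stats.norm.cdf(z)                                       # uniform ranks, dependence kept
        return np.column_stack([ppf(u[:, k]) for k, ppf in enumerate(marginal_ppfs)])

    # toy usage: two "layers" with different total-water marginals and 0.6 inter-layer correlation
    corr = np.array([[1.0, 0.6], [0.6, 1.0]])
    marginals = [stats.gamma(a=2.0, scale=0.5).ppf, stats.lognorm(s=0.4, scale=1.0).ppf]
    samples = gaussian_copula_sample(corr, marginals, n_samples=10_000)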


Who's afraid of the big bad wolf?

AUSTRALIAN OCCUPATIONAL THERAPY JOURNAL, Issue 4 2007
Making sense of results from randomised controlled trials
Background: Reading and interpreting results from research can be challenging. Many therapists read the abstract and discussion but confess to bypassing the results section of journal articles. Methods: This paper discusses the necessary steps for reading the results section of a published randomised controlled trial. Recent clinical trials are used as examples. Results: An ability to read and interpret the results section of a paper requires the therapist to consider whether the tests conducted are appropriate, the results reported are accurate and the conclusions drawn by the authors are appropriate. Estimates of treatment effect sizes can be calculated by the therapist and used to determine whether the effect of treatment is likely to be large enough to be "clinically worthwhile". Conclusions: The ability to extract meaningful statistical information from clinical trials is a fundamental skill that will enhance therapists' knowledge and understanding of evidence-based practice. [source]