Various Assumptions


Selected Abstracts


Missing data assumptions and methods in a smoking cessation study

ADDICTION, Issue 3 2010
Sunni A. Barnes
ABSTRACT Aim A sizable percentage of subjects do not respond to follow-up attempts in smoking cessation studies. The usual procedure in the smoking cessation literature is to assume that non-respondents have resumed smoking. This study used data from a study with a high follow-up rate to assess the degree of bias that may be caused by different methods of imputing missing data. Design and methods Based on a large data set with very little missing follow-up information at 12 months, a simulation study was undertaken to compare and contrast missing data imputation methods (assuming smoking, propensity score matching and optimal matching) under various assumptions as to how the missing data arose (randomly generated missing values, increased non-response from smokers and a hybrid of the two). Findings Missing data imputation methods all resulted in some degree of bias which increased with the amount of missing data. Conclusion None of the missing data imputation methods currently available can compensate for bias when there are substantial amounts of missing data. [source]
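
A minimal simulation sketch of the comparison described above, assuming a hypothetical abstinence rate and drop-out probabilities rather than the study's actual data: it shows how "assume smoking" imputation biases the estimated quit rate downward, and how the bias depends on whether non-response is random or concentrated among smokers.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000
true_quit = rng.random(n) < 0.25          # hypothetical 25% true abstinence rate

def assume_smoking_estimate(quit, p_missing_quit, p_missing_smoke):
    """Impose missingness, code non-respondents as smokers, return estimated quit rate."""
    p_miss = np.where(quit, p_missing_quit, p_missing_smoke)
    missing = rng.random(len(quit)) < p_miss
    imputed = np.where(missing, False, quit)          # 'assume smoking' imputation
    return imputed.mean()

print("true quit rate:       %.3f" % true_quit.mean())
# Missing completely at random: 20% lost to follow-up regardless of status
print("MCAR, assume smoking: %.3f" % assume_smoking_estimate(true_quit, 0.20, 0.20))
# Non-response concentrated among smokers: quitters 13%, smokers 26% missing
print("MNAR, assume smoking: %.3f" % assume_smoking_estimate(true_quit, 0.13, 0.26))
```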


Can cocaine use be evaluated through analysis of wastewater?

ADDICTION, Issue 5 2009
A nation-wide approach conducted in Belgium
ABSTRACT Aims Cocaine is the second most-used illicit drug world-wide and its consumption is increasing significantly, especially in western Europe. Until now, the annual prevalence has been estimated indirectly by means of interviews. A recently introduced, direct nation-wide approach based on measurements of the major urinary excreted metabolite of cocaine, benzoylecgonine, in wastewater is proposed. Design Wastewater samples from 41 wastewater treatment plants (WWTPs) in Belgium, covering approximately 3 700 000 residents, were collected. Each WWTP was sampled on Wednesdays and Sundays during two sampling campaigns in 2007–08. Samples were analysed for cocaine (COC) and its metabolites, benzoylecgonine (BE) and ecgonine methylester (EME), by a validated procedure based on liquid chromatography coupled with tandem mass spectrometry. Concentrations of BE were used to calculate cocaine consumption (g/day per 1000 inhabitants) for each WWTP region and for both sampling campaigns (g/year per 1000 inhabitants). Findings Weekend days showed significantly higher cocaine consumption compared with weekdays. The highest cocaine consumption was observed for WWTPs receiving wastewater from large cities, such as Antwerp, Brussels and Charleroi. Results were extrapolated to the total Belgian population and an estimate of the yearly prevalence of cocaine use was made based on various assumptions. An estimated 1.88 tonnes (t) of cocaine per year [standard error (SE) 0.05 t] is consumed in Belgium, corresponding to a yearly prevalence of 0.80% (SE 0.02%) for the Belgian population aged 15–64 years. This result is in agreement with an earlier reported estimate of the Belgian prevalence of cocaine use obtained through socio-epidemiological studies (0.9% for people aged 15–64 years). Conclusions Wastewater analysis is a promising tool to evaluate cocaine consumption at both the local and national scale. This rapid and direct estimation of the prevalence of cocaine use in Belgium corresponds with socio-epidemiological data. However, the strategy needs to be refined further to allow a more exact calculation of cocaine consumption from concentrations of BE in wastewater. [source]
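
A sketch of the kind of back-calculation involved, from a BE concentration and plant inflow to cocaine consumed per 1000 inhabitants. The molar-mass ratio is standard chemistry; the excretion fraction and the example plant figures are illustrative assumptions, not the study's validated correction factors.

```python
# All parameter values are assumptions for the sake of the sketch, not the study's
# validated correction factors.
MW_COC, MW_BE = 303.4, 289.3       # molar masses of cocaine and benzoylecgonine (g/mol)
EXCRETED_AS_BE = 0.30              # assumed fraction of a cocaine dose excreted as BE

def cocaine_load(be_ng_per_l, flow_m3_per_day, population):
    """Cocaine consumption in g/day per 1000 inhabitants for one WWTP catchment."""
    be_g_per_day = be_ng_per_l * 1e-9 * flow_m3_per_day * 1000.0   # ng/L -> g/day
    coc_g_per_day = be_g_per_day * (MW_COC / MW_BE) / EXCRETED_AS_BE
    return coc_g_per_day / (population / 1000.0)

# Hypothetical plant: 600 ng/L BE, 100 000 m3/day inflow, 250 000 residents served
print(round(cocaine_load(600, 100_000, 250_000), 2), "g/day per 1000 inhabitants")
```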


IBDfinder and SNPsetter: Tools for pedigree-independent identification of autozygous regions in individuals with recessive inherited disease

HUMAN MUTATION, Issue 6 2009
Ian M. Carr
Abstract Autozygosity mapping of recessive genes can be performed on a small number of affected individuals from consanguineous pedigrees. With the advent of microarray SNP analysis, acquiring genotype data has become extremely simple and quick, in comparison to gene mapping with microsatellite markers. However, the subsequent data analysis required to identify autozygous regions can still be a significant obstacle. For rapid gene identification, it may be desirable to integrate information from heterogeneous groups of affected individuals, both familial and isolated, under various assumptions of ancestry and locus heterogeneity that are not amenable to formal linkage analysis. Unfortunately, there are few computer programs aimed specifically at facilitating this type of data sifting. Here, we demonstrate two new programs that facilitate the identification of autozygous regions within a heterogeneous SNP dataset derived from familial and sporadic affected individuals. Hum Mutat 30:1–8, 2009. © 2009 Wiley-Liss, Inc. [source]
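
A toy sketch of the data-sifting step such tools automate: scanning SNP genotypes from several affected individuals for long runs in which all of them are homozygous (candidate autozygous regions). The genotype coding and the position of the shared block are invented for illustration; IBDfinder and SNPsetter themselves offer far richer filtering.

```python
import numpy as np

# Genotypes coded 0/2 = homozygous, 1 = heterozygous; rows = affected individuals.
# Hypothetical data with a shared homozygous block around SNPs 40-59.
rng = np.random.default_rng(1)
genotypes = rng.integers(0, 3, size=(4, 100))
genotypes[:, 40:60] = 2

def shared_autozygous_runs(gt, min_len=10):
    """Return (start, end) SNP index pairs where every individual is homozygous."""
    all_hom = np.all(gt != 1, axis=0)        # a heterozygote in anyone breaks the run
    runs, start = [], None
    for i, hom in enumerate(np.append(all_hom, False)):   # trailing False closes a run
        if hom and start is None:
            start = i
        elif not hom and start is not None:
            if i - start >= min_len:
                runs.append((start, i - 1))
            start = None
    return runs

print(shared_autozygous_runs(genotypes))
```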


Three-dimensional viscous flow over rotating periodic structures

INTERNATIONAL JOURNAL FOR NUMERICAL METHODS IN ENGINEERING, Issue 5 2003
Kyu-Tae Kim
Abstract The three-dimensional Stokes flow in a periodic domain is examined in this study. The problem corresponds closely to the flow inside internal mixers, where the flow is driven by the movement of a rotating screw; the outer barrel remaining at rest. A hybrid spectral/finite-difference approach is proposed for the general expansion of the flow field and the solution of the expansion coefficients. The method is used to determine the flow field between the screw and barrel. The regions of elongation and shear are closely examined. These are the two mechanisms responsible for mixing. Besides its practical importance, the study also allows the assessment of the validity of the various assumptions usually adopted in mixing and lubrication problems. Copyright © 2003 John Wiley & Sons, Ltd. [source]
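
One standard way to separate the two mixing mechanisms mentioned above is to split the local velocity gradient into its rate-of-strain and rate-of-rotation parts and form a "flow number". The sketch below, with hand-picked gradient tensors, illustrates that diagnostic under assumed values; it is not the paper's hybrid spectral/finite-difference solver.

```python
import numpy as np

def flow_number(grad_u):
    """Mixing index lambda = |D| / (|D| + |W|) for a 3x3 velocity gradient tensor.
    lambda ~ 1: pure elongation, 0.5: simple shear, 0: rigid rotation."""
    D = 0.5 * (grad_u + grad_u.T)            # rate-of-strain (elongation and shear)
    W = 0.5 * (grad_u - grad_u.T)            # rate-of-rotation (vorticity)
    nD, nW = np.linalg.norm(D), np.linalg.norm(W)
    return nD / (nD + nW)

shear = np.array([[0.0, 1.0, 0.0], [0.0, 0.0, 0.0], [0.0, 0.0, 0.0]])    # simple shear
elong = np.array([[1.0, 0.0, 0.0], [0.0, -0.5, 0.0], [0.0, 0.0, -0.5]])  # uniaxial extension
print(flow_number(shear), flow_number(elong))   # about 0.5 and 1.0
```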


Aftershocks: Postwar Leadership Survival, Rivalry, and Regime Dynamics

INTERNATIONAL STUDIES QUARTERLY, Issue 4 2004
Michael Colaresi
Under what conditions are leaders replaced after a war? Past research has reported that the outcome of the war and regime type affect postwar leadership tenure. Yet, this does not exhaust the conditions that could potentially influence political survival. In this article, I reexamine the links between regime type and leadership replacement after a war. I show that past research has failed to account for the dynamics of political leadership, and in the process has misrepresented the evidence supporting previous theories. I then show, using event history techniques, that both internal and external factors can alter leadership trajectories after a war. Specifically, war outcomes significantly affect the job security of a leader outside of international rivalry, but have less of an effect within rivalry. Additionally, relaxing various assumptions concerning the relationship between leadership survival and regime type leads to a richer understanding of the process of postwar leadership turnover. Finally, several propositions concerning the interaction between regime type and the costs of war are not supported in this analysis. [source]
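
Event history analysis of this kind starts from right-censored tenure data. A hand-rolled Kaplan-Meier estimator on invented postwar tenures (all numbers below are hypothetical) illustrates the basic comparison of survival after victory versus defeat, though the article itself relies on richer regression-based event history techniques.

```python
import numpy as np

def kaplan_meier(time, event):
    """Return event times and survival probabilities for right-censored durations."""
    time, event = np.asarray(time, float), np.asarray(event, int)
    out_t, out_s, surv = [], [], 1.0
    for t in np.unique(time):
        at_risk = np.sum(time >= t)
        failed = np.sum((time == t) & (event == 1))
        if failed:
            surv *= 1.0 - failed / at_risk
            out_t.append(t)
            out_s.append(round(surv, 3))
    return out_t, out_s

# Hypothetical postwar tenures (years) and replacement indicators (1 = replaced)
defeat_t, defeat_e = [1, 1, 2, 3, 5, 8], [1, 1, 1, 1, 0, 1]
victory_t, victory_e = [2, 4, 6, 7, 9, 12], [1, 0, 1, 0, 1, 0]
print("after defeat: ", kaplan_meier(defeat_t, defeat_e))
print("after victory:", kaplan_meier(victory_t, victory_e))
```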


Forecasting with panel data

JOURNAL OF FORECASTING, Issue 2 2008
Badi H. Baltagi
Abstract This paper gives a brief survey of forecasting with panel data. It begins with a simple error component regression model and surveys the best linear unbiased prediction under various assumptions of the disturbance term. This includes various ARMA models as well as spatial autoregressive models. The paper also surveys how these forecasts have been used in panel data applications, running horse races between heterogeneous and homogeneous panel data models using out-of-sample forecasts. Copyright © 2008 John Wiley & Sons, Ltd. [source]
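
A compact sketch of the best linear unbiased prediction idea for the simplest one-way error component model: the forecast for an individual adds a shrunken version of that individual's average residual to the regression prediction. The data are synthetic, pooled OLS stands in for GLS, and the variance components use simple moment estimates; this is an illustration of the textbook formula, not the paper's full survey.

```python
import numpy as np

rng = np.random.default_rng(2)
N, T, beta = 50, 10, np.array([1.0, 0.5])
mu = rng.normal(0.0, 1.0, N)                            # individual effects
X = rng.normal(size=(N, T, 2))
y = X @ beta + mu[:, None] + rng.normal(0.0, 0.5, (N, T))

# Pooled OLS for beta (a consistent stand-in for GLS in this sketch)
beta_hat = np.linalg.lstsq(X.reshape(-1, 2), y.reshape(-1), rcond=None)[0]
resid = y - X @ beta_hat                                # N x T residual matrix

# Variance components and the BLUP weight theta = T*s_mu^2 / (s_nu^2 + T*s_mu^2)
s_nu2 = np.mean((resid - resid.mean(axis=1, keepdims=True)) ** 2)
s_mu2 = max(np.var(resid.mean(axis=1)) - s_nu2 / T, 0.0)
theta = T * s_mu2 / (s_nu2 + T * s_mu2)

# One-step-ahead prediction for each individual: x'beta_hat + theta * own mean residual
x_future = rng.normal(size=(N, 2))
forecast = x_future @ beta_hat + theta * resid.mean(axis=1)
print("theta =", round(float(theta), 2), "| first forecasts:", forecast[:5].round(2))
```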


Affirmative action, duality of error, and the consequences of mispredicting the academic performance of African American college applicants

JOURNAL OF POLICY ANALYSIS AND MANAGEMENT, Issue 1 2002
Jeryl L. Mumpower
The implications of different potential affirmative action policies depend on three factors: selection rate from the applicant pool, base rate of qualified applicants, and accuracy of performance predictions. A series of analyses was conducted under various assumptions concerning affirmative action plans, causes of racial differences in average college admissions test scores, and racial differences in accuracy of performance predictions. Evidence suggesting a lower level of predictive accuracy for African Americans implies that, under a program of affirmative action, both proportionately more false positives (matriculated students who do not succeed) and proportionately more false negatives (rejected applicants who could have succeeded) will be found among African American applicants. Unless equivalent levels of predictive accuracy are achieved for both groups, no admission policy can be fair simultaneously to majority group applicants and African American applicants. © 2002 by the Association for Public Policy Analysis and Management. [source]
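
The three factors can be turned into error rates with a small Monte Carlo: assume true performance and the admissions predictor are jointly normal with correlation equal to the predictive accuracy, then count false positives and false negatives at given base and selection rates. All numerical values below are hypothetical, not the article's.

```python
import numpy as np

def error_rates(accuracy, base_rate, selection_rate, n=1_000_000, seed=0):
    """Monte Carlo false-positive / false-negative proportions when admission is based
    on a predictor correlated `accuracy` with true performance (both standard normal)."""
    rng = np.random.default_rng(seed)
    true = rng.standard_normal(n)
    pred = accuracy * true + np.sqrt(1 - accuracy**2) * rng.standard_normal(n)
    qualified = true > np.quantile(true, 1 - base_rate)
    admitted = pred > np.quantile(pred, 1 - selection_rate)
    fp = np.mean(admitted & ~qualified)      # matriculated but would not succeed
    fn = np.mean(~admitted & qualified)      # rejected but would have succeeded
    return round(fp, 3), round(fn, 3)

# Hypothetical figures: half the pool could succeed, 40% are admitted,
# predictive accuracy 0.50 for one group versus 0.35 for the other
print(error_rates(0.50, 0.5, 0.4))
print(error_rates(0.35, 0.5, 0.4))
```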


Demographic Issues in Longevity Risk Analysis

JOURNAL OF RISK AND INSURANCE, Issue 4 2006
Eric Stallard
Fundamental to the modeling of longevity risk is the specification of the assumptions used in demographic forecasting models that are designed to project past experience into future years, with or without modifications based on expert opinion about influential factors not represented in the historical data. Stochastic forecasts are required to explicitly quantify the uncertainty of forecasted cohort survival functions, including uncertainty due to process variance, parameter errors, and model misspecification errors. Current applications typically ignore the latter two sources although the potential impact of model misspecification errors is substantial. Such errors arise from a lack of understanding of the nature and causes of historical changes in longevity and the implications of these factors for the future. This article reviews the literature on the nature and causes of historical changes in longevity and recent efforts at deterministic and stochastic forecasting based on these data. The review reveals that plausible alternative sets of forecasting assumptions have been derived from the same sets of historical data, implying that further methodological development will be needed to integrate the various assumptions into a single coherent forecasting model. Illustrative calculations based on existing forecasts indicate that the ranges of uncertainty for older cohorts' survival functions will be at a manageable level. Uncertainty ranges for younger cohorts will be larger and the need for greater precision will likely motivate further model development. [source]
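
As one concrete example of the stochastic-forecasting machinery this kind of review covers, the sketch below fits a Lee-Carter-style decomposition to synthetic log death rates and propagates uncertainty in the time index with a random walk with drift. The data, age range, and parameter values are all invented, and Lee-Carter is only one of the forecasting models the article discusses.

```python
import numpy as np

rng = np.random.default_rng(3)
ages, years = np.arange(60, 100), np.arange(1960, 2010)
# Synthetic log death rates: Gompertz-like age profile plus a downward time trend
log_m = (-9.0 + 0.09 * ages)[:, None] - 0.01 * (years - years[0])[None, :] \
        + rng.normal(0.0, 0.02, (ages.size, years.size))

# Lee-Carter decomposition: log m(x,t) ~ a(x) + b(x) * k(t), via a rank-1 SVD
a = log_m.mean(axis=1)
U, s, Vt = np.linalg.svd(log_m - a[:, None], full_matrices=False)
b, k = U[:, 0], s[0] * Vt[0]
b, k = b / b.sum(), k * b.sum()          # usual normalisation; the product b*k is unchanged

# Random walk with drift for k(t): simulate 1000 stochastic paths 30 years ahead
steps = np.diff(k)
paths = k[-1] + np.cumsum(steps.mean() + steps.std(ddof=1)
                          * rng.standard_normal((1000, 30)), axis=1)
# Uncertainty in the projected death rate at age 80, 30 years after the data end
m80 = np.exp(a[ages == 80] + b[ages == 80] * paths[:, -1])
print(np.percentile(m80, [5, 50, 95]).round(4))
```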


Stochastic Regression Model with Dependent Disturbances

JOURNAL OF TIME SERIES ANALYSIS, Issue 2 2001
Kokyo Choy
In this paper, we consider the estimation of the coefficient of a stochastic regression model whose explanatory variables and disturbances are permitted to exhibit short-memory or long-memory dependence. Three estimators of the coefficient are proposed. A variety of their asymptotics are illuminated under various assumptions on the explanatory variables and the disturbances. Numerical studies of the theoretical results are given. They show some unexpected aspects of the asymptotics of the three estimators. [source]
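
The finite-sample behaviour of such estimators is often easiest to appreciate by simulation. The sketch below assumes short-memory AR(1) dependence in both the regressor and the disturbance (long-memory processes would need a different generator) and examines the sampling spread of the ordinary least-squares slope; it does not reproduce the paper's three proposed estimators.

```python
import numpy as np

def ar1(n, phi, rng):
    """AR(1) series with unit-variance innovations (short-memory dependence)."""
    x = np.empty(n)
    x[0] = rng.standard_normal() / np.sqrt(1 - phi**2)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + rng.standard_normal()
    return x

rng = np.random.default_rng(4)
beta, n, reps = 1.0, 200, 2000
estimates = np.empty(reps)
for r in range(reps):
    x = ar1(n, 0.8, rng)                      # dependent explanatory variable
    u = ar1(n, 0.8, rng)                      # dependent disturbances
    y = beta * x + u
    estimates[r] = np.sum(x * y) / np.sum(x * x)    # OLS slope, no intercept
print("mean", estimates.mean().round(3), "sd", estimates.std().round(3))
```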


Narrative vigilance: the analysis of stories in health care

NURSING PHILOSOPHY, Issue 2 2005
John Paley ma
Abstract The idea of narrative has been widely discussed in the recent health care literature, including nursing, and has been portrayed as a resource for both clinical work and research studies. However, the use of the term 'narrative' is inconsistent, and various assumptions are made about the nature (and functions) of narrative: narrative as a naive account of events; narrative as the source of 'subjective truth'; narrative as intrinsically fictional; and narrative as a mode of explanation. All these assumptions have left their mark on the nursing literature, and all of them (in our view) are misconceived. Here, we argue that a failure to distinguish between 'narrative' and 'story' is partly responsible for these misconceptions, and we offer an analysis that shows why the distinction between them is essential. In doing so, we borrow the concept of 'narrativity' from literary criticism. Narrativity is something that a text has degrees of, and our proposal is that the elements of narrativity can be 'sorted' roughly into a continuum, at the 'high narrativity' end of which we find 'story'. On our account, 'story' is an interweaving of plot and character, whose organization is designed to elicit a certain emotional response from the reader, while 'narrative' refers to the sequence of events and the (claimed) causal connections between them. We suggest that it is important not to confuse the emotional persuasiveness of the 'story' with the objective accuracy of the 'narrative', and to this end we recommend what might be called 'narrative vigilance'. There is nothing intrinsically authentic, or sacrosanct, or emancipatory, or paradigmatic about narrative itself, even though the recent health care literature has had a marked tendency to romanticize it. [source]


Mid-domain models as predictors of species diversity patterns: bathymetric diversity gradients in the deep sea

OIKOS, Issue 3 2005
Craig R. McClain
Geometric constraints represent a class of null models that describe how species diversity may vary between hard boundaries that limit geographic distributions. Recent studies have suggested that a number of large scale biogeographic patterns of diversity (e.g. latitude, altitude, depth) may reflect boundary constraints. However, few studies have rigorously tested the degree to which mid-domain null predictions match empirical patterns or how sensitive the null models are to various assumptions. We explore how variation in the assumptions of these models alters null depth ranges and consequently bathymetric variation in diversity, and test the extent to which bathymetric patterns of species diversity in deep sea gastropods, bivalves, and polychaetes match null predictions based on geometric constraints. Range-size distributions and geographic patterns of diversity produced by these null models are sensitive to the relative position of the hard boundaries, the specific algorithms used to generate range sizes, and whether species are continuously or patchily distributed between range end points. How well empirical patterns support null expectations is highly dependent on these assumptions. Bathymetric patterns of species diversity for gastropods, bivalves and polychaetes differ substantially from null expectations, suggesting that geometric constraints do not account for diversity–depth patterns in the deep sea benthos. [source]
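
A minimal version of such a geometric-constraints null model: draw range sizes from an assumed distribution, place each range uniformly at random between the domain boundaries, and count overlapping ranges along the gradient. The characteristic mid-domain richness peak emerges, and, as the abstract stresses, its shape depends on the range-size algorithm chosen; the uniform choice below is just one assumption.

```python
import numpy as np

rng = np.random.default_rng(5)
n_species = 500
sizes = rng.random(n_species)                 # assumed range-size distribution (uniform)
lo = rng.random(n_species) * (1.0 - sizes)    # lower endpoint, keeping the range in [0, 1]
hi = lo + sizes

# Species richness along the (depth-like) gradient: count ranges overlapping each point
grid = np.linspace(0.01, 0.99, 49)
richness = np.array([np.sum((lo <= g) & (hi >= g)) for g in grid])
print("richness near the boundaries:", richness[0], richness[-1])
print("richness at mid-domain:      ", richness[len(grid) // 2])
```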


Industrial energy policy: a case study of demand in Kuwait

OPEC ENERGY REVIEW, Issue 2 2006
M. Nagy Eltony
The purpose behind building the industrial energy demand model was to enable assessment of the impact of potential policy options and to forecast future energy demand under various assumptions, including the impact of the possible removal of energy subsidies in accordance with the World Trade Organization (WTO) agreement. The results of the model, based on three scenarios, underline several important issues. With nominal energy prices staying the same (the status quo) and with inflation and economic growth continuing to expand (i.e. the baseline scenario), industrial demand is expected to grow: energy consumption in this sector is projected to grow at an annual rate of about 3.5 per cent throughout the forecast period. In the moderate scenario, however, this drops to 1.9 per cent, and when all energy subsidies are removed, as in the extreme scenario, energy consumption is projected to grow by only 1.5 per cent annually over the same period. Moreover, with regard to inter-fuel substitution, the model forecast indicates that electricity and natural gas consumption will decline, while the consumption of oil products will increase in all scenarios. The results of the model also indicate that changes to the price structure of energy resources should be made in a comprehensive manner. In other words, electricity prices should be adjusted upwards at the same time as the prices of oil products and natural gas; otherwise, a massive inter-fuel substitution will occur within the various consuming industries. [source]
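
The scenario growth rates quoted above translate into quite different demand levels over a planning horizon; a trivial compound-growth calculation (the base value and horizon are hypothetical, only the rates come from the abstract) makes the gap concrete.

```python
# Compound growth under the three scenarios quoted above; the base index value and
# the horizon are hypothetical, only the growth rates come from the abstract.
base_demand, horizon = 100.0, 15
for name, rate in [("baseline", 0.035), ("moderate", 0.019), ("extreme", 0.015)]:
    projected = base_demand * (1.0 + rate) ** horizon
    print(f"{name:9s} scenario: demand index {projected:.1f} after {horizon} years")
```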


Methodology and model for performance and cost comparison of innovative treatment technologies at wood preserving sites

REMEDIATION, Issue 1 2001
Mark L. Evans
Wood preserving facilities have used a variety of compounds, including pentachlorophenol (PCP), creosote, and certain metals, to extend the useful life of wood products. Past operations and waste management practices resulted in soil and water contamination at a portion of the more than 700 wood preserving sites in the United States (EPA, 1997). Many of these sites are currently being addressed under federal, state, or voluntary cleanup programs. The U.S. Environmental Protection Agency (EPA) National Risk Management Research Laboratory (NRMRL) has responded to the need for information aimed at facilitating remediation of wood preserving sites by conducting treatability studies, issuing guidance, and preparing reports. This article presents a practical methodology and computer model for screening the performances and comparing the costs of seven innovative technologies that could be used for the treatment of contaminated soils at user-specified wood preserving sites. The model incorporates a technology screening function and a cost-estimating function developed from literature searches and vendor information solicited for this study. This article also provides background information on the derivation of various assumptions and default values used in the model, common contaminants at wood preserving sites, and recent trends in the cleanup of such sites. © 2001 John Wiley & Sons, Inc. [source]


The Cost of Monopoly in Australian Manufacturing

THE AUSTRALIAN ECONOMIC REVIEW, Issue 4 2001
Robert Dixon
This article looks at the deadweight loss arising from monopoly elements in Australian manufacturing under various assumptions and its relationship with the level of concentration. [source]
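
The usual starting point for such a calculation is the Harberger-triangle approximation, in which the deadweight loss is roughly half the demand elasticity times the squared price-cost margin times industry revenue. The figures below are illustrative assumptions, not the article's estimates for Australian manufacturing.

```python
def harberger_dwl(margin, elasticity, revenue):
    """Deadweight loss ~ 0.5 * elasticity * margin**2 * revenue, margin = (P - MC) / P."""
    return 0.5 * elasticity * margin**2 * revenue

# Hypothetical industry: 10% price-cost margin, demand elasticity 1.5, revenue 2000 ($m)
print(harberger_dwl(0.10, 1.5, 2000.0), "($m per year)")   # 15.0
```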


Peculiar relics from Primordial Black Holes in the inflationary paradigm

ANNALEN DER PHYSIK, Issue 3 2004
A. Barrau
Abstract Depending on various assumptions on the energy scale of inflation and assuming a primordial power spectrum of a step-like structure, we explore new possibilities for Primordial Black Holes (PBH) and Planck relics to contribute substantially to Cold Dark Matter in the Universe. A recently proposed possibility to produce Planck relics in four-dimensional string gravity is considered in this framework. Possible experimental detection of PBHs through gravitational waves is also explored. We stress that inflation with a low energy scale, and also possibly when Planck relics are produced, leads unavoidably to relics originating from PBHs that are not effectively classical during their formation, rendering the usual formalism inadequate for them. [source]


On the Chandra Detection of Diffuse X-Ray Emission from Sgr A*

ASTRONOMISCHE NACHRICHTEN, Issue S1 2003
M. E. Pessah
Abstract Kinematic studies of the stellar motions near Sgr A* have revealed the presence of several million solar masses of dark matter enclosed within 0.015 parsecs of the Galactic Center. However, it is not yet clear what fraction of this material is contained within a single point-like object, as opposed to an extended distribution of orbiting matter (e.g., in the form of neutron stars). Recent Chandra observations suggest that the X-ray emission from this source is partially diffuse. This result provides an important clue that can be used to set some constraints on the mass distribution surrounding the black hole. Here, we develop a simple model in which the diffuse emission is produced by a halo of neutron stars accreting from the gas falling toward the center. We discuss the various accretion mechanisms that are likely to contribute significantly to the X-ray flux, and show that a highly magnetized fraction of old neutron stars may account for the diffuse high-energy source. If this picture is correct, the upper bound to the mass of the central black hole is ∼2.2 × 10⁶ M⊙. The core radius of the dark cluster must then be ∼0.06 pc. We also discuss the sensitivity of our results to the various assumptions made in our calculations. [source]


EXPONENTIAL SMOOTHING AND NON-NEGATIVE DATA

AUSTRALIAN & NEW ZEALAND JOURNAL OF STATISTICS, Issue 4 2009
Muhammad Akram
Summary The most common forecasting methods in business are based on exponential smoothing, and the most common time series in business are inherently non-negative. Therefore it is of interest to consider the properties of the potential stochastic models underlying exponential smoothing when applied to non-negative data. We explore exponential smoothing state space models for non-negative data under various assumptions about the innovations, or error, process. We first demonstrate that prediction distributions from some commonly used state space models may have an infinite variance beyond a certain forecasting horizon. For multiplicative error models that do not have this flaw, we show that sample paths will converge almost surely to zero even when the error distribution is non-Gaussian. We propose a new model with similar properties to exponential smoothing, but which does not have these problems, and we develop some distributional properties for our new model. We then explore the implications of our results for inference, and compare the short-term forecasting performance of the various models using data on the weekly sales of over 300 items of costume jewelry. The main findings of the research are that the Gaussian approximation is adequate for estimation and one-step-ahead forecasting. However, as the forecasting horizon increases, the approximate prediction intervals become increasingly problematic. When the model is to be used for simulation purposes, a suitably specified scheme must be employed. [source]
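
A sketch of the local-level multiplicative-error model (ETS(M,N,N)) that the abstract alludes to, simulated with Gaussian errors and assumed parameter values. The long-horizon drift of the simulated level toward zero illustrates the almost-sure convergence behaviour discussed; the authors' proposed alternative model is not reproduced here.

```python
import numpy as np

def simulate_mnn(level0, alpha, sigma, horizon, n_paths, rng):
    """ETS(M,N,N): y_t = l_{t-1} * (1 + e_t),  l_t = l_{t-1} * (1 + alpha * e_t)."""
    paths = np.empty((n_paths, horizon))
    level = np.full(n_paths, level0, dtype=float)
    for t in range(horizon):
        e = sigma * rng.standard_normal(n_paths)   # Gaussian errors (one possible choice)
        paths[:, t] = level * (1.0 + e)
        level = level * (1.0 + alpha * e)
    return paths

rng = np.random.default_rng(6)
paths = simulate_mnn(level0=100.0, alpha=0.3, sigma=0.2, horizon=200, n_paths=2000, rng=rng)
# The median simulated value drifts downward as the horizon grows, reflecting the
# convergence of the level toward zero even though each one-step error has mean zero
print(np.median(paths[:, [9, 49, 199]], axis=0).round(2))
```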