Random Coefficients

Selected Abstracts


When Should Epidemiologic Regressions Use Random Coefficients?

BIOMETRICS, Issue 3 2000
Sander Greenland
Summary. Regression models with random coefficients arise naturally in both frequentist and Bayesian approaches to estimation problems. They are becoming widely available in standard computer packages under the headings of generalized linear mixed models, hierarchical models, and multilevel models. I here argue that such models offer a more scientifically defensible framework for epidemiologic analysis than the fixed-effects models now prevalent in epidemiology. The argument invokes an antiparsimony principle attributed to L. J. Savage, which is that models should be rich enough to reflect the complexity of the relations under study. It also invokes the countervailing principle that you cannot estimate anything if you try to estimate everything (often used to justify parsimony). Regression with random coefficients offers a rational compromise between these principles as well as an alternative to analyses based on standard variable-selection algorithms and their attendant distortion of uncertainty assessments. These points are illustrated with an analysis of data on diet, nutrition, and breast cancer. [source]
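Greenland's "rational compromise" has a compact algebraic form. The sketch below uses synthetic data and a known second-stage variance (illustrative assumptions, not the paper's diet analysis): Gaussian random coefficients turn the fit into a shrinkage (ridge-type) estimator, which outperforms the free fixed-effects fit precisely in the many-weak-effects regime the abstract describes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative stand-in for the diet/nutrition setting: many weak exposure
# effects, too many to estimate freely with precision (synthetic data, not
# the paper's). Coefficients are drawn from a second-stage distribution,
# which is exactly the random-coefficients assumption.
n, p = 100, 50
X = rng.normal(size=(n, p))
beta_true = rng.normal(scale=0.1, size=p)
y = X @ beta_true + rng.normal(size=n)

sigma2 = 1.0    # residual variance (taken as known for simplicity)
tau2 = 0.1**2   # second-stage variance of the random coefficients

# Fixed-effects fit: every coefficient estimated freely (OLS).
beta_ols = np.linalg.solve(X.T @ X, X.T @ y)

# Random-coefficients (semi-Bayes) fit: coefficients shrunk toward the
# second-stage mean 0 -- algebraically a ridge estimator with penalty
# sigma2/tau2, the compromise between rich and estimable models.
beta_rc = np.linalg.solve(X.T @ X + (sigma2 / tau2) * np.eye(p), X.T @ y)

mse_ols = np.mean((beta_ols - beta_true) ** 2)
mse_rc = np.mean((beta_rc - beta_true) ** 2)
print(f"fixed-effects coefficient MSE:       {mse_ols:.4f}")
print(f"random-coefficients coefficient MSE: {mse_rc:.4f}")
```

In this regime the shrinkage fit has markedly lower coefficient error, at the cost of a small bias toward the second-stage mean.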


OBESITY AND NUTRIENT CONSUMPTION: A RATIONAL ADDICTION?

CONTEMPORARY ECONOMIC POLICY, Issue 3 2007
TIMOTHY J. RICHARDS
Recent research shows that the dramatic rise in obesity in the United States is due more to the overconsumption of unhealthy foods than underactivity. This study tests for an addiction to food nutrients as a potential explanation for the apparent excessive consumption. A random coefficients (mixed) logit model is used to test a multivariate rational addiction model. The results reveal a particularly strong addiction to carbohydrates. The implication of this finding is that price-based policies, sin taxes, or produce subsidies that change the expected future costs and benefits of consuming carbohydrate-intensive foods may be effective in controlling excessive nutrient intake. (JEL D120, I120, C230) [source]
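A random coefficients (mixed) logit integrates the choice probabilities over a distribution of individual tastes, usually by simulation. The toy below uses hypothetical prices and taste parameters, far simpler than the paper's multivariate rational addiction model, but shows the mechanical core:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy mixed (random coefficients) logit: each consumer's price sensitivity
# beta_i is a random draw, so choice probabilities are averaged over the
# taste distribution by simulation (illustrative numbers, not the paper's).
prices = np.array([1.0, 2.0, 3.0])        # three food alternatives
beta_mean, beta_sd = -1.0, 0.5            # taste distribution parameters
R = 10_000                                # simulation draws

betas = rng.normal(beta_mean, beta_sd, size=R)    # one sensitivity per draw
utilities = betas[:, None] * prices[None, :]      # R x 3 utility matrix
expu = np.exp(utilities)
probs = expu / expu.sum(axis=1, keepdims=True)    # plain logit per draw
mixed_probs = probs.mean(axis=0)                  # integrate over tastes

print("simulated mixed logit choice probabilities:", mixed_probs.round(3))
```

Estimation then searches over (beta_mean, beta_sd) to match observed choice shares; here only the forward simulation is shown.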


A general model for predicting brown tree snake capture rates

ENVIRONMETRICS, Issue 3 2003
Richard M. Engeman
Abstract The inadvertent introduction of the brown tree snake (Boiga irregularis) to Guam has resulted in the extirpation of most of the island's native terrestrial vertebrates, has presented a health hazard to small children, and also has produced economic problems. Trapping around ports and other cargo staging areas is central to a program designed to deter dispersal of the species. Sequential trapping of smaller plots is also being used to clear larger areas of snakes in preparation for endangered species reintroductions. Traps and trapping personnel are limited resources, which places a premium on the ability to plan the deployment of trapping efforts. In a series of previous trapping studies, data on brown tree snake removal from forested plots were found to be well modeled by exponential decay functions. For the present article, we considered a variety of model forms and estimation procedures, and used capture data from individual plots as random subjects to produce a general random coefficients model for making predictions of brown tree snake capture rates. The best model was an exponential decay with positive asymptote produced using nonlinear mixed model estimation where variability among plots was introduced through the scale and asymptote parameters. Practical predictive abilities were used in model evaluation so that a manager could project capture rates in a plot after a period of time, or project the amount of time required for trapping to reduce capture rates to a desired level. The model should provide managers with a tool for optimizing the allocation of limited trapping resources. Copyright © 2003 John Wiley & Sons, Ltd. [source]
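The fixed-effects core of such a model, an exponential decay toward a positive asymptote, can be sketched for a single simulated plot with ordinary nonlinear least squares (the paper's analysis additionally lets the scale and asymptote vary randomly across plots; all parameter values here are illustrative):

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(2)

# Capture rate over trap-nights: exponential decay toward a positive
# asymptote. Simulated data for one plot with made-up parameters.
def capture_rate(t, scale, decay, asymptote):
    return scale * np.exp(-decay * t) + asymptote

t = np.arange(0, 30.0)
true = (0.8, 0.25, 0.05)                      # scale, decay, asymptote
y = capture_rate(t, *true) + rng.normal(scale=0.02, size=t.size)

popt, _ = curve_fit(capture_rate, t, y, p0=(1.0, 0.1, 0.0))
print("estimated (scale, decay, asymptote):", np.round(popt, 3))

# The management question from the abstract: how long until the capture
# rate falls below a target level (0.1 here)?
scale, decay, asymptote = popt
t_target = -np.log((0.1 - asymptote) / scale) / decay
print(f"projected days to reach rate 0.1: {t_target:.1f}")
```

A mixed-model fit would pool such curves across plots, borrowing strength so that sparsely trapped plots still get usable projections.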


Tissue Oxygenation Does Not Predict Central Venous Oxygenation in Emergency Department Patients With Severe Sepsis and Septic Shock

ACADEMIC EMERGENCY MEDICINE, Issue 4 2010
Anthony M. Napoli MD
Abstract Objectives: This study sought to determine whether tissue oxygenation (StO2) could be used as a surrogate for central venous oxygenation (ScVO2) in early goal-directed therapy (EGDT). Methods: The study enrolled a prospective convenience sample of patients aged ≥18 years with sepsis and systolic blood pressure <90 mm Hg after 2 L of normal saline or lactate >4 mmol/L, who received a continuous central venous oximetry catheter. StO2 and ScVO2 were measured at 15-minute intervals. Data were analyzed using a random coefficients model, correlations, and Bland-Altman plots. Results: There were 284 measurements in 40 patients. While a statistically significant relationship existed between StO2 and ScVO2 (F(1,37) = 10.23, p = 0.002), StO2 appears to systematically overestimate at lower ScVO2 and underestimate at higher ScVO2. This was reflected in the fixed-effect slope of 0.49 (95% confidence interval [CI] = 0.266 to 0.720) and intercept of 34 (95% CI = 14.681 to 50.830), which were significantly different from 1 and 0, respectively. The initial point correlation (r = 0.5) was fair, but there was poor overall agreement (bias = 4.3, limits of agreement = −20.8 to 29.4). Conclusions: Correlation between StO2 and ScVO2 was fair. The two measures trend in the same direction, but clinical use of StO2 in lieu of ScVO2 is unsubstantiated due to large and systematic biases. However, these biases may reflect real physiologic states. Further research may investigate whether these measures could be used in concert as prognostic indicators. ACADEMIC EMERGENCY MEDICINE 2010; 17:349–352 © 2010 by the Society for Academic Emergency Medicine [source]
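The agreement statistics reported above come from a Bland-Altman analysis, which is straightforward to compute. The sketch below uses simulated paired readings built to mimic the reported slope and intercept (illustrative values, not the study's data):

```python
import numpy as np

rng = np.random.default_rng(3)

# Bland-Altman agreement analysis. Simulated paired oximetry readings:
# sto2 tracks scvo2 with slope < 1, producing the proportional bias the
# abstract describes (made-up noise level; not the study data).
scvo2 = rng.uniform(40, 90, size=284)
sto2 = 34 + 0.49 * scvo2 + rng.normal(scale=6, size=scvo2.size)

diff = sto2 - scvo2
bias = diff.mean()
sd = diff.std(ddof=1)
loa_low, loa_high = bias - 1.96 * sd, bias + 1.96 * sd

print(f"bias = {bias:.1f}, limits of agreement = ({loa_low:.1f}, {loa_high:.1f})")
```

Wide limits of agreement like these, even with a modest mean bias, are what rules out substituting one measure for the other clinically.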


Generalized probabilistic approach of uncertainties in computational dynamics using random matrices and polynomial chaos decompositions

INTERNATIONAL JOURNAL FOR NUMERICAL METHODS IN ENGINEERING, Issue 8 2010
Christian Soize
Abstract A new generalized probabilistic approach of uncertainties is proposed for computational models in structural linear dynamics, and can be extended without difficulty to computational linear vibroacoustics and to computational non-linear structural dynamics. This method allows the prior probability model of each type of uncertainty (model-parameter uncertainties and modeling errors) to be separately constructed and identified. The modeling errors are not taken into account with the usual output-prediction-error method, but with the nonparametric probabilistic approach of modeling errors recently introduced and based on the use of random matrix theory. The theory, an identification procedure and a numerical validation are presented. Then a chaos decomposition with random coefficients is proposed to represent the prior probabilistic model of random responses. The random germ is related to the prior probability model of model-parameter uncertainties. The random coefficients are related to the prior probability model of modeling errors and then depend on the random matrices introduced by the nonparametric probabilistic approach of modeling errors. A validation is presented. Finally, a future perspective is introduced when experimental data are available. The prior probability model of the random coefficients can be improved in constructing a posterior probability model using the Bayesian approach. Copyright © 2009 John Wiley & Sons, Ltd. [source]
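The core nonparametric idea, replacing a nominal matrix by a random symmetric positive-definite ensemble whose mean is the nominal matrix, can be sketched with a normalized Wishart-type construction (a simplification: the paper's ensembles additionally control the dispersion level, which this toy version does not):

```python
import numpy as np

rng = np.random.default_rng(4)

# Random symmetric positive-definite matrices with a prescribed mean.
# K = L (G G^T / m) L^T is SPD by construction, and since E[G G^T / m] = I,
# its mean is the nominal matrix K_nominal = L L^T.
K_nominal = np.array([[4.0, -1.0, 0.0],
                      [-1.0, 4.0, -1.0],
                      [0.0, -1.0, 4.0]])
L = np.linalg.cholesky(K_nominal)
n, m = K_nominal.shape[0], 500        # larger m -> smaller fluctuations

def sample_K():
    G = rng.normal(size=(n, m))
    return L @ (G @ G.T / m) @ L.T

samples = np.stack([sample_K() for _ in range(2000)])
print("empirical mean of samples:\n", samples.mean(axis=0).round(2))
print("min eigenvalue of one draw:", np.linalg.eigvalsh(samples[0]).min().round(3))
```

Propagating such random matrices through the dynamics is what makes the response's chaos coefficients random, as described in the abstract.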


Analysis and implementation issues for the numerical approximation of parabolic equations with random coefficients

INTERNATIONAL JOURNAL FOR NUMERICAL METHODS IN ENGINEERING, Issue 6-7 2009
F. Nobile
Abstract We consider the problem of numerically approximating statistical moments of the solution of a time-dependent linear parabolic partial differential equation (PDE), whose coefficients and/or forcing terms are spatially correlated random fields. The stochastic coefficients of the PDE are approximated by truncated Karhunen–Loève expansions driven by a finite number of uncorrelated random variables. After approximating the stochastic coefficients, the original stochastic PDE turns into a new deterministic parametric PDE of the same type, the dimension of the parameter set being equal to the number of random variables introduced. After proving that the solution of the parametric PDE problem is analytic with respect to the parameters, we consider global polynomial approximations based on tensor product, total degree or sparse polynomial spaces and constructed by either a Stochastic Galerkin or a Stochastic Collocation approach. We derive convergence rates for the different cases and present numerical results that show how these approaches are a valid alternative to the more traditional Monte Carlo Method for this class of problems. Copyright © 2009 John Wiley & Sons, Ltd. [source]
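On a grid, a truncated Karhunen–Loève expansion reduces to an eigendecomposition of the covariance matrix. The sketch below builds one for a 1-D field with exponential covariance (illustrative correlation length and truncation level; the paper works with the continuous expansion):

```python
import numpy as np

rng = np.random.default_rng(5)

# Truncated Karhunen-Loeve expansion of a 1-D random field with
# exponential covariance, discretized on a grid. A stand-in for the
# random diffusion coefficient of the PDE (illustrative parameters).
x = np.linspace(0.0, 1.0, 101)
corr_len = 0.2
C = np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)  # covariance matrix

eigvals, eigvecs = np.linalg.eigh(C)
order = np.argsort(eigvals)[::-1]                        # descending modes
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

N = 10                                                   # truncation level
xi = rng.normal(size=N)                                  # uncorrelated germ
field = eigvecs[:, :N] @ (np.sqrt(eigvals[:N]) * xi)     # one realization

captured = eigvals[:N].sum() / eigvals.sum()
print(f"variance captured by {N} modes: {captured:.1%}")
```

The N germ variables xi become the parameters of the deterministic parametric PDE; the decay of the eigenvalues is what makes sparse polynomial approximation in those parameters effective.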


Generalized polynomial chaos and random oscillators

INTERNATIONAL JOURNAL FOR NUMERICAL METHODS IN ENGINEERING, Issue 3 2004
D. Lucor
Abstract We present a new approach to obtain solutions for general random oscillators using a broad class of polynomial chaos expansions, which are more efficient than the classical Wiener–Hermite expansions. The approach is general, but here we present results only for linear oscillators with random forcing or random coefficients. In this context, we are able to obtain relatively sharp error estimates in the representation of the stochastic input as well as the solution. We have also performed computational comparisons with Monte Carlo simulations which show that the new approach can be orders of magnitude faster, especially for compact distributions. Copyright © 2004 John Wiley & Sons, Ltd. [source]
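The efficiency claim can be illustrated on an even simpler random-coefficient problem than the oscillator: the decaying response y(t) = exp(-kt) with Gaussian k, whose mean is known in closed form. Gauss-Hermite quadrature (the machinery behind Hermite chaos collocation) matches it with a handful of evaluations where Monte Carlo needs thousands (a stand-in example, not the paper's setup):

```python
import numpy as np

# Mean of y(t) = exp(-k t) with k ~ N(mu, sigma^2).
# Exact value: E[exp(-k t)] = exp(-mu t + sigma^2 t^2 / 2).
mu, sigma, t = 1.0, 0.2, 1.0

# Probabilists' Gauss-Hermite rule: 8 nodes, weights sum to sqrt(2 pi).
nodes, weights = np.polynomial.hermite_e.hermegauss(8)
k_nodes = mu + sigma * nodes
mean_gh = np.sum(weights * np.exp(-k_nodes * t)) / np.sqrt(2 * np.pi)

mean_exact = np.exp(-mu * t + 0.5 * sigma**2 * t**2)

rng = np.random.default_rng(6)
mean_mc = np.exp(-(mu + sigma * rng.normal(size=10_000)) * t).mean()

print(f"exact {mean_exact:.6f}  quadrature {mean_gh:.6f}  MC {mean_mc:.6f}")
```

Eight deterministic solves already beat ten thousand random ones here, which is the "orders of magnitude" advantage the abstract reports for smooth, compactly concentrated inputs.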


Labour force participation rates at the regional and national levels of the European Union: An integrated analysis*

PAPERS IN REGIONAL SCIENCE, Issue 4 2007
J. Paul Elhorst
Keywords: space-time data; multilevel analysis; spatial autocorrelation; labour force participation; labour market policy

Abstract. This study investigates the causes of variation in regional labour force participation rates in a cross-country perspective. A microeconomic framework of the labour force participation decision is aggregated across individuals to obtain an explanatory model of regional participation rates in which both regional-level and national-level variables serve as explanatory variables. An appropriate econometric model of random coefficients for the regional variables and fixed coefficients for the national variables is developed, further taking into account that observations may be correlated over time and in space and that some of the explanatory variables are not strictly exogenous. This model is estimated for men and for women, using annual 1983–1997 Eurostat data from 157 regions across 13 EU countries. The hypotheses that regional participation rates in the EU are determined by a common structure and that labour force participation can be encouraged by a common policy must be strongly rejected. [source]
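The first step of such a random-coefficients panel analysis can be sketched Swamy-style: fit the participation equation separately by region, then summarize the cross-region distribution of the slopes (synthetic data and a single regressor; the paper's estimator additionally handles fixed national-level coefficients, spatial correlation, and endogeneity):

```python
import numpy as np

rng = np.random.default_rng(7)

# Random-coefficients panel, two-step sketch: each region has its own
# slope drawn from a common distribution; per-region OLS recovers the
# slopes, whose spread measures cross-region heterogeneity.
n_regions, T = 50, 15
beta_region = 0.5 + rng.normal(scale=0.1, size=n_regions)  # random slopes

slopes = np.empty(n_regions)
for r in range(n_regions):
    wage = rng.normal(size=T)                              # regional regressor
    particip = 0.6 + beta_region[r] * wage + rng.normal(scale=0.05, size=T)
    X = np.column_stack([np.ones(T), wage])
    slopes[r] = np.linalg.lstsq(X, particip, rcond=None)[0][1]

print(f"mean slope {slopes.mean():.3f}, between-region sd {slopes.std(ddof=1):.3f}")
```

A between-region standard deviation well above the per-region sampling error is the kind of evidence behind the paper's rejection of a common structure.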


Maximum likelihood estimates for the Hildreth–Houck random coefficients model

THE ECONOMETRICS JOURNAL, Issue 1 2002
Asad Zaman
We explore maximum likelihood (ML) estimation of the Hildreth–Houck random coefficients model. We show that the global ML estimator can be inconsistent. We develop an alternative local ML (LML) estimator and prove that it is consistent and asymptotically efficient for points in the interior of the parameter space. Properties of the LML estimator and comparisons with common method of moments (MM) estimates are studied via Monte Carlo simulation. Boundary parameters lead to nonstandard asymptotic distributions for the LML, which are described. The LML is used to develop a modification of the LR test for random coefficients. Simulations suggest that the LR test is more powerful against distant alternatives than the Breusch–Pagan (BP) Lagrange multiplier test. A simple modification of the BP test also appears to be more powerful than the BP. [source]
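The MM benchmark mentioned above has a classic two-step form: fit OLS, then regress the squared residuals on the squared regressors to recover the coefficient variances, since in this model Var(y_i | x_i) = Σ_j λ_j x_ij². A simulation sketch with illustrative parameter values (this is the simple MM step, not the paper's local ML estimator):

```python
import numpy as np

rng = np.random.default_rng(8)

# Hildreth-Houck model: y_i = x_i'(beta + v_i) with v_i mean-zero and
# Var(v_ij) = lam[j], so the regression error is heteroskedastic with
# variance sum_j lam[j] * x_ij^2.
n = 5000
X = np.column_stack([np.ones(n), rng.uniform(1, 3, size=n)])
beta = np.array([1.0, 2.0])
lam = np.array([0.5, 0.25])              # coefficient variances to recover

v = rng.normal(size=(n, 2)) * np.sqrt(lam)
y = np.einsum("ij,ij->i", X, beta + v)

# Step 1: OLS for the mean coefficients.
b_ols = np.linalg.lstsq(X, y, rcond=None)[0]
# Step 2: regress squared residuals on squared regressors for lam.
resid2 = (y - X @ b_ols) ** 2
lam_hat = np.linalg.lstsq(X**2, resid2, rcond=None)[0]

print("beta_hat:", b_ols.round(2), " lambda_hat:", lam_hat.round(2))
```

The MM step is simple and always defined, but inefficient; the abstract's point is that a carefully localized ML estimator does better while avoiding the inconsistency of the global ML.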

