Restrictive Assumptions (restrictive + assumption)

Selected Abstracts


Insurance, Bond Covenants, and Under- or Over-investment With Risky Asset Reconstitution

JOURNAL OF RISK AND INSURANCE, Issue 1 2007
Arthur Hau
Traditional theory predicts that the shareholders of a limited liability company financed partly by bonds may underinvest by not replacing damaged company assets. It also precludes the possibility of overinvestment. By relaxing the restrictive assumption maintained under traditional theory, namely, that the effects of reconstituting damaged assets are nonstochastic, this article shows that both over- and underinvestment are possible. It is shown that these moral hazard problems can be mitigated by incorporating appropriate insurance requirements into bond covenants. Moreover, it is shown that the insurance requirements for alleviating underinvestment and overinvestment are quite different. In particular, for underinvestment, the required insurance only needs to make the bonds riskless in the best asset reconstitution states among the loss states in which the company value falls short of the promised bond repayment; for overinvestment, however, the required insurance should make the bonds totally riskless. The difference in insurance requirements is especially important when insurance is actuarially unfavorable, so that more-than-required insurance is always undesirable. [source]
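
As a rough numeric illustration of the under- and overinvestment logic above (hypothetical numbers, not from the paper), the following Python sketch compares the firm's total gain from reconstitution with the equity holders' gain under limited liability:

```python
# Hypothetical numbers illustrating the abstract's logic; not from the paper.
# A firm owes bondholders D. After damage the firm is worth 60 unless
# shareholders pay a reconstitution cost whose outcome is risky.

D = 100.0          # promised bond repayment
V_damaged = 60.0   # firm value if assets are not reconstituted

def equity(v):     # limited-liability equity payoff
    return max(v - D, 0.0)

# Under-investment: reconstitution raises expected firm value by more than
# its cost, but bondholders capture the gain, so equity declines to invest.
cost = 30.0
outcomes, probs = [150.0, 70.0], [0.5, 0.5]
firm_gain = sum(p * v for v, p in zip(outcomes, probs)) - V_damaged
equity_gain = sum(p * equity(v) for v, p in zip(outcomes, probs)) - equity(V_damaged)
print(firm_gain > cost, equity_gain > cost)   # True, False: efficient but refused

# Over-investment (risk shifting): a negative-NPV, high-variance
# reconstitution that equity nevertheless prefers.
cost = 15.0
outcomes, probs = [200.0, 20.0], [0.25, 0.75]
firm_gain = sum(p * v for v, p in zip(outcomes, probs)) - V_damaged
equity_gain = sum(p * equity(v) for v, p in zip(outcomes, probs)) - equity(V_damaged)
print(firm_gain > cost, equity_gain > cost)   # False, True: inefficient but chosen
```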


A Coincident Index, Common Factors, and Monthly Real GDP

OXFORD BULLETIN OF ECONOMICS & STATISTICS, Issue 1 2010
Roberto S. Mariano
Abstract The Stock–Watson coincident index and its subsequent extensions assume a static linear one-factor model for the component indicators. This restrictive assumption is unnecessary if one defines a coincident index as an estimate of monthly real gross domestic product (GDP). This paper estimates Gaussian vector autoregression (VAR) and factor models for latent monthly real GDP and other coincident indicators using the observable mixed-frequency series. For maximum likelihood estimation of a VAR model, the expectation-maximization (EM) algorithm helps in finding a good starting value for a quasi-Newton method. The smoothed estimate of latent monthly real GDP is a natural extension of the Stock–Watson coincident index. [source]
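
A minimal sketch of the mixed-frequency state-space idea, assuming a scalar latent monthly growth rate, one monthly indicator, and quarterly observations formed as three-month sums; the paper's full VAR, EM initialization, and smoothing step are not reproduced:

```python
import numpy as np

# Sketch only: scalar latent monthly growth x_t, AR(1) dynamics; a monthly
# indicator observes x_t, and "quarterly GDP" observes x_t + x_{t-1} + x_{t-2}
# every third month. State vector s_t = [x_t, x_{t-1}, x_{t-2}].
rng = np.random.default_rng(0)
phi, q, r_m, r_q, n = 0.8, 1.0, 0.5, 0.2, 120

x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + rng.normal(scale=np.sqrt(q))
y_m = x + rng.normal(scale=np.sqrt(r_m), size=n)       # monthly indicator

T = np.array([[phi, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
Q = np.diag([q, 0.0, 0.0])
Z_m = np.array([[1.0, 0.0, 0.0]])                      # indicator only
Z_q = np.array([[1.0, 0.0, 0.0], [1.0, 1.0, 1.0]])     # indicator + quarterly

s, P = np.zeros(3), np.eye(3) * 10.0
est = np.zeros(n)
for t in range(n):
    s, P = T @ s, T @ P @ T.T + Q                      # predict
    if t % 3 == 2:                                     # quarter end
        Z, H = Z_q, np.diag([r_m, r_q])
        y_q = x[t] + x[t - 1] + x[t - 2] + rng.normal(scale=np.sqrt(r_q))
        y = np.array([y_m[t], y_q])
    else:
        Z, H = Z_m, np.array([[r_m]])
        y = np.array([y_m[t]])
    S = Z @ P @ Z.T + H
    K = P @ Z.T @ np.linalg.inv(S)                     # Kalman gain
    s, P = s + K @ (y - Z @ s), P - K @ Z @ P          # update
    est[t] = s[0]                                      # filtered monthly growth

print(np.corrcoef(est, x)[0, 1])                       # tracks the latent series
```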


A GENERAL MULTIVARIATE EXTENSION OF FISHER'S GEOMETRICAL MODEL AND THE DISTRIBUTION OF MUTATION FITNESS EFFECTS ACROSS SPECIES

EVOLUTION, Issue 5 2006
Guillaume Martin
Abstract The evolution of complex organisms is a puzzle for evolutionary theory because beneficial mutations should be less frequent in complex organisms, an effect termed the "cost of complexity." However, little is known about how the distribution of mutation fitness effects (f(s)) varies across genomes. The main theoretical framework for addressing this issue is Fisher's geometric model and related phenotypic landscape models, but it suffers from several restrictive assumptions. In this paper, we show how several of these limitations may be overcome. We then propose a model of f(s) that extends Fisher's model to account for arbitrary mutational and selective interactions among n traits. We show that these interactions result in an f(s) equivalent to that predicted by a much smaller number of independent traits. We test our predictions by comparing empirical f(s) across species of various gene numbers as a surrogate for complexity. This survey reveals, as predicted, that mutations tend to be more deleterious, less variable, and less skewed in higher organisms. However, only limited differences in the shape of f(s) are observed from Escherichia coli to nematodes or fruit flies, a pattern consistent with a model of random phenotypic interactions across many traits. Overall, these results suggest that there may be a cost to phenotypic complexity, although a much weaker one than suggested by earlier theoretical work. More generally, the model seems to qualitatively capture, and possibly explain, the variation of f(s) from lower to higher organisms, which opens a large array of potential applications in evolutionary genetics. [source]
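
The following Monte Carlo sketch illustrates the basic "cost of complexity" effect in an isotropic Fisher geometric model, a simplification of the paper's general anisotropic extension; all parameter values are hypothetical:

```python
import numpy as np

# Isotropic Fisher geometric model: Gaussian fitness peak at the origin,
# population displaced from the optimum along one axis, random mutations.
rng = np.random.default_rng(1)

def fitness_effects(n_traits, d=1.0, sigma=0.1, n_mut=100_000):
    z0 = np.zeros(n_traits)
    z0[0] = d                                          # start off-optimum
    dz = rng.normal(scale=sigma, size=(n_mut, n_traits))
    w0 = np.exp(-0.5 * d ** 2)                         # fitness before mutation
    w1 = np.exp(-0.5 * ((z0 + dz) ** 2).sum(axis=1))   # fitness after mutation
    return w1 / w0 - 1.0                               # selection coefficient s

for n in (2, 10, 50):
    s = fitness_effects(n)
    # More traits: mean s more negative, beneficial mutations rarer.
    print(n, s.mean(), s.std(), (s > 0).mean())
```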


ADAPTIVE CONSTRAINTS AND THE PHYLOGENETIC COMPARATIVE METHOD: A COMPUTER SIMULATION TEST

EVOLUTION, Issue 1 2002
Emilia P. Martins
Abstract Recently, the utility of modern phylogenetic comparative methods (PCMs) has been questioned because of the seemingly restrictive assumptions required by these methods. Although most comparative analyses involve traits thought to be undergoing natural or sexual selection, most PCMs require an assumption that the traits be evolving by less directed random processes, such as Brownian motion (BM). In this study, we use computer simulation to generate data under more realistic evolutionary scenarios and consider the statistical abilities of a variety of PCMs to estimate correlation coefficients from these data. We found that correlations estimated without taking phylogeny into account were often quite poor and never substantially better than those produced by the other tested methods. In contrast, most PCMs performed quite well even when their assumptions were violated. Felsenstein's independent contrasts (FIC) method gave the best performance in many cases, even when weak constraints had been acting throughout phenotypic evolution. When strong constraints acted in opposition to variance-generating (i.e., BM) forces, however, FIC correlation coefficients were biased in the direction of those BM forces. In most cases, all other PCMs tested (phylogenetic generalized least squares, phylogenetic mixed model, spatial autoregression, and phylogenetic eigenvector regression) yielded good statistical performance, regardless of the details of the evolutionary model used to generate the data. Actual parameter estimates given by different PCMs for each dataset, however, were occasionally very different from one another, suggesting that the choice among them should depend on the types of traits and evolutionary processes being considered. [source]
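
For concreteness, a hand-rolled sketch of Felsenstein's independent contrasts, one of the PCMs compared in the study, on a toy four-tip tree (tree shape and trait values are hypothetical):

```python
import numpy as np

# Standard FIC pruning algorithm on a balanced 4-tip tree with two traits.
# A tip is (trait_vector, branch_length); an internal node is
# (left_child, right_child, branch_length).

def contrasts(node, out):
    """Post-order pruning; appends standardized contrasts to `out`."""
    if len(node) == 2:                              # tip: (x, v)
        return node
    left, right, v = node
    xl, vl = contrasts(left, out)
    xr, vr = contrasts(right, out)
    out.append((xl - xr) / np.sqrt(vl + vr))        # standardized contrast
    x = (xl / vl + xr / vr) / (1 / vl + 1 / vr)     # weighted ancestral value
    return x, v + vl * vr / (vl + vr)               # lengthen branch for error

# ((A:1, B:1):0.5, (C:1, D:1):0.5), two traits per tip
tree = (
    ((np.array([1.0, 1.2]), 1.0), (np.array([2.0, 2.1]), 1.0), 0.5),
    ((np.array([0.5, 0.4]), 1.0), (np.array([1.5, 1.7]), 1.0), 0.5),
    0.0,
)
cs = []
contrasts(tree, cs)
C = np.array(cs)                                     # 3 contrasts x 2 traits
# Correlation of contrasts, forced through the origin (standard for FIC):
r = (C[:, 0] @ C[:, 1]) / np.sqrt((C[:, 0] @ C[:, 0]) * (C[:, 1] @ C[:, 1]))
print(r)
```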


Annual streamflow modelling with asymmetric distribution function

HYDROLOGICAL PROCESSES, Issue 17 2008
Nermin Sarlak
Abstract Classical autoregressive (AR) models have been used for forecasting streamflow data in spite of restrictive assumptions, such as the normality assumption for innovations. The main reason for making this assumption is the difficulty of finding model parameters for non-normal distribution functions. However, the modified maximum likelihood (MML) procedure used for estimating autoregressive model parameters assumes a non-normally distributed residual series. The aim of this study is to compare the performance of the AR(1) model with asymmetric innovations with that of the classical autoregressive model for annual hydrological data. The models considered are applied to annual streamflow data obtained from two streamflow gauging stations in the Kızılırmak Basin, Turkey. Copyright © 2008 John Wiley & Sons, Ltd. [source]
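
A small simulation sketch of an AR(1) model with skewed innovations; note that the paper estimates parameters by modified maximum likelihood (MML), whereas this toy version uses a simple least-squares fit and a residual skewness check:

```python
import numpy as np

# AR(1) with mean-zero, right-skewed (shifted-gamma) innovations.
rng = np.random.default_rng(2)
n, phi = 500, 0.6

eps = rng.gamma(shape=2.0, scale=1.0, size=n) - 2.0   # mean zero, skewed
q = np.zeros(n)
for t in range(1, n):
    q[t] = phi * q[t - 1] + eps[t]                    # synthetic annual flows

# Lag-one least-squares estimate of phi
phi_hat = (q[1:] @ q[:-1]) / (q[:-1] @ q[:-1])
resid = q[1:] - phi_hat * q[:-1]

# Residual skewness: near zero would support the normality assumption;
# markedly positive favors an asymmetric innovation distribution.
skew = ((resid - resid.mean()) ** 3).mean() / resid.std() ** 3
print(phi_hat, skew)
```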


An H∞ algorithm for the windsurfer approach to adaptive robust control

INTERNATIONAL JOURNAL OF ADAPTIVE CONTROL AND SIGNAL PROCESSING, Issue 8 2004
Arvin Dehghani
Abstract The windsurfing approach to iterative control requires a series of controller designs with the gradual expanding of the closed-loop bandwidth, and in the end in order to stop the algorithm, some validation tests are carried out. In this paper, an ,, design algorithm is introduced in order to remove the empirical aspect from the stopping criteria and to make the procedure more systematic, hence facilitating the design. Moreover, some restrictive assumptions on the plant model are lifted and some issues with the controller design step are tackled by the introduction of a new design method. This enables us to address a wider class of practical problems. Copyright © 2004 John Wiley & Sons, Ltd. [source]


Generating Dichotomous Item Scores with the Four-Parameter Beta Compound Binomial Model

JOURNAL OF EDUCATIONAL MEASUREMENT, Issue 3 2007
Patrick O. Monahan
A Monte Carlo simulation technique for generating dichotomous item scores is presented that implements (a) a psychometric model with different explicit assumptions than traditional parametric item response theory (IRT) models, and (b) item characteristic curves without restrictive assumptions concerning mathematical form. The four-parameter beta compound-binomial (4PBCB) strong true score model (with two-term approximation to the compound binomial) is used to estimate and generate the true score distribution. The nonparametric item-true score step functions are estimated by classical item difficulties conditional on proportion-correct total score. The technique performed very well in replicating inter-item correlations, item statistics (point-biserial correlation coefficients and item proportion-correct difficulties), first four moments of total score distribution, and coefficient alpha of three real data sets consisting of educational achievement test scores. The technique replicated real data (including subsamples of differing proficiency) as well as the three-parameter logistic (3PL) IRT model (and much better than the 1PL model) and is therefore a promising alternative simulation technique. This 4PBCB technique may be particularly useful as a more neutral simulation procedure for comparing methods that use different IRT models. [source]
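
A hedged sketch of the generating idea: true proportion-correct scores drawn from a four-parameter beta, then dichotomous item scores generated conditionally. The logistic item curves and all parameter values below are placeholders, not the paper's estimated nonparametric step functions:

```python
import numpy as np

# Draw true scores from a four-parameter beta (a beta rescaled to [l, u]),
# then generate 0/1 item scores conditional on the true score.
rng = np.random.default_rng(3)
n_people, n_items = 1000, 20

a, b, l, u = 2.0, 3.0, 0.15, 0.95                     # hypothetical 4PB parameters
tau = l + (u - l) * rng.beta(a, b, size=n_people)     # true proportion-correct

diff = np.linspace(-1.5, 1.5, n_items)                # hypothetical difficulties
p = 1 / (1 + np.exp(-(4 * (tau[:, None] - 0.5) - diff[None, :])))
scores = (rng.random((n_people, n_items)) < p).astype(int)

total = scores.sum(axis=1)
print(total.mean(), total.std(), np.corrcoef(tau, total)[0, 1])
```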


The Scaling of Mixed-Item-Format Tests With the One-Parameter and Two-Parameter Partial Credit Models

JOURNAL OF EDUCATIONAL MEASUREMENT, Issue 3 2000
Robert C. Sykes
Item response theory scalings were conducted for six tests with mixed item formats. These tests differed in their proportions of constructed-response (c.r.) and multiple-choice (m.c.) items and in overall difficulty. The scalings included those based on scores for the c.r. items that had maintained the number of levels in the item rubrics, either produced from single ratings or from multiple ratings that were averaged and rounded to the nearest integer, as well as scalings for a single form of c.r. items obtained by summing multiple ratings. A one-parameter (1PPC) or two-parameter (2PPC) partial credit model was used for the c.r. items and the one-parameter logistic (1PL) or three-parameter logistic (3PL) model for the m.c. items. Item fit was substantially worse with the combination 1PL/1PPC model than with the 3PL/2PPC model due to the former's restrictive assumptions that there would be no guessing on the m.c. items and that item discrimination would be equal across items and item types. The presence of varying item discriminations resulted in the 1PL/1PPC model producing estimates of item information that could be spuriously inflated for c.r. items that had three or more score levels. Information for some items with summed ratings was usually overestimated by 300% or more under the 1PL/1PPC model. These inflated information values resulted in underestimated standard errors of ability estimates. The constraints posed by the restricted model suggest limitations on the testing contexts in which the 1PL/1PPC model can be accurately applied. [source]
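
For reference, the item response functions named in the abstract, written as a sketch with the usual parameterizations (not the operational scaling code):

```python
import numpy as np

# 3PL for multiple-choice items; generalized partial credit (2PPC) for
# constructed-response items.

def p_3pl(theta, a, b, c):
    """3PL: discrimination a, difficulty b, guessing floor c."""
    return c + (1 - c) / (1 + np.exp(-a * (theta - b)))

def p_2ppc(theta, a, deltas):
    """2PPC/GPC: P(score = k), k = 0..m, for m step thresholds `deltas`."""
    steps = np.concatenate(([0.0], np.cumsum(a * (theta - np.asarray(deltas)))))
    expd = np.exp(steps - steps.max())            # stabilized softmax
    return expd / expd.sum()

theta = 0.5
print(p_3pl(theta, a=1.2, b=0.0, c=0.2))          # m.c. item with guessing
print(p_2ppc(theta, a=0.8, deltas=[-0.5, 0.4, 1.1]))  # 4-level c.r. item
# Forcing a common a across all items and c = 0 recovers the restrictive
# 1PL/1PPC combination the abstract criticizes.
```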


Parasites and deleterious mutations: interactions influencing the evolutionary maintenance of sex

JOURNAL OF EVOLUTIONARY BIOLOGY, Issue 5 2010
A. W. PARK
Abstract The restrictive assumptions associated with purely genetic and purely ecological mechanisms suggest that neither of the two forces, in isolation, can offer a general explanation for the evolutionary maintenance of sex. Consequently, attention has turned to pluralistic models (i.e. models that apply both ecological and genetic mechanisms). Existing research has shown that combining mutation accumulation and parasitism allows restrictive assumptions about genetic and parasite parameter values to be relaxed while still predicting the maintenance of sex. However, several empirical studies have shown that deleterious mutations and parasitism can reduce fitness to a greater extent than would be expected if the two acted independently. We show how interactions between these genetic and ecological forces can completely reverse predictions about the evolution of reproductive modes. Moreover, we demonstrate that synergistic interactions between infection and deleterious mutations can render sex evolutionarily stable even when there is antagonistic epistasis among deleterious mutations, thereby widening the conditions for the evolutionary maintenance of sex. [source]
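
A toy fitness calculation (hypothetical functional forms, not the paper's model) showing what a synergistic interaction between infection and mutation load means: the joint fitness cost exceeds what the two costs would produce acting independently:

```python
import numpy as np

# Fitness as a function of mutation number k and infection status.
alpha, beta = 0.02, 0.01   # per-mutation cost and epistasis among mutations
sigma = 0.3                # fitness cost of infection alone
gamma = 0.05               # hypothetical extra cost per mutation when infected

def w(k, infected, gamma=0.0):
    base = np.exp(-alpha * k - beta * k ** 2)         # mutation load
    if infected:
        base *= (1 - sigma) * np.exp(-gamma * k)      # infection x load synergy
    return base

k = 5
independent = w(k, False) * (1 - sigma)   # the two forces acting independently
interacting = w(k, True, gamma)           # with the synergistic interaction
print(independent, interacting)           # synergy: interacting < independent
```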


An energetic material model for time-dependent ferroelectric behaviour: existence and uniqueness

MATHEMATICAL METHODS IN THE APPLIED SCIENCES, Issue 12 2006
Alexander Mielke
Abstract We discuss rate-independent engineering models for the multi-dimensional behaviour of ferroelectric materials. These models capture the non-linear and hysteretic behaviour of such materials. We show that these models can be formulated in an energetic framework which is based on the elastic and the electric displacements as reversible variables and on interior, irreversible variables like the remanent polarization. We provide quite general conditions on the constitutive laws which guarantee the existence of a solution. Under more restrictive assumptions we are also able to establish uniqueness results. Copyright © 2006 John Wiley & Sons, Ltd. [source]
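
A one-dimensional sketch of the energetic framework's time-incremental structure, assuming a quadratic stored energy and a dissipation proportional to the change in remanent polarization; the closed-form minimizer of each incremental problem reproduces hysteresis. This is a scalar toy model, not the paper's multi-dimensional ferroelectric system:

```python
import math
import numpy as np

# At each time step the internal variable P solves
#   min_P  (a/2) P^2 - E_ext P + c |P - P_prev|
# (stored energy minus field work plus rate-independent dissipation).
a, c = 1.0, 0.5            # stiffness and dissipation coefficient

def step(P_prev, E_ext):
    """Closed-form minimizer of the incremental problem (scalar play)."""
    if E_ext > a * P_prev + c:
        return (E_ext - c) / a
    if E_ext < a * P_prev - c:
        return (E_ext + c) / a
    return P_prev          # driving force below threshold: P is stuck

P, loop = 0.0, []
for t in np.linspace(0, 4 * math.pi, 400):
    E_ext = 2.0 * math.sin(t)          # cyclic applied field
    P = step(P, E_ext)
    loop.append((E_ext, P))            # traces a hysteresis loop in (E, P)

# Nearly the same field on the rising and falling branches, different P:
print(loop[17], loop[83])
```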


How to Analyze Political Attention with Minimal Assumptions and Costs

AMERICAN JOURNAL OF POLITICAL SCIENCE, Issue 1 2010
Kevin M. Quinn
Previous methods of analyzing the substance of political attention have had to make several restrictive assumptions or have been prohibitively costly when applied to large-scale political texts. Here, we describe a topic model for legislative speech, a statistical learning model that uses word choices to infer topical categories covered in a set of speeches and to identify the topic of specific speeches. Our method estimates, rather than assumes, the substance of topics, the keywords that identify topics, and the hierarchical nesting of topics. We use the topic model to examine the agenda in the U.S. Senate from 1997 to 2004. Using a new database of over 118,000 speeches (70,000,000 words) from the Congressional Record, our model reveals speech topic categories that are both distinctive and meaningfully interrelated, offering a richer view of democratic agenda dynamics than had previously been possible. [source]
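
As a stand-in sketch, plain LDA on a hypothetical toy corpus shows the core estimation idea (topics and keywords inferred from word choices rather than assumed in advance); the authors' model additionally imposes hierarchical and dynamic structure that vanilla LDA lacks:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Toy corpus of "speeches"; topics are estimated, not specified by hand.
speeches = [
    "defense appropriations military readiness troops",
    "medicare prescription drug coverage seniors health",
    "troops armed forces defense budget procurement",
    "health insurance patients hospitals medicare",
]
vec = CountVectorizer()
X = vec.fit_transform(speeches)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)
terms = vec.get_feature_names_out()
for k, comp in enumerate(lda.components_):
    top = comp.argsort()[-4:][::-1]
    print(f"topic {k}:", [terms[i] for i in top])   # estimated keywords
print(lda.transform(X).round(2))                    # per-speech topic shares
```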


On the Specification and Estimation of the Production Function for Cognitive Achievement

THE ECONOMIC JOURNAL, Issue 485 2003
Petra E. Todd
This paper considers methods for modelling the production function for cognitive achievement in a way that captures theoretical notions that child development is a cumulative process depending on the history of family and school inputs and on innate ability. It develops a general modelling framework that accommodates many of the estimating equations used in the literature. It considers different ways of addressing data limitations, and it makes precise the identifying assumptions needed to justify alternative approaches. Commonly used specifications are shown to place restrictive assumptions on the production technology. Ways of testing modelling assumptions and of relaxing them are discussed. [source]
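
A simulated sketch of one such restriction: a "value-added" regression of the current score on the current input and the lagged score mis-estimates the input effect when past inputs and ability enter the true technology (the data-generating process below is hypothetical):

```python
import numpy as np

# True technology: A2 depends on current and past inputs and on ability.
rng = np.random.default_rng(4)
n = 5000
ability = rng.normal(size=n)
x1 = rng.normal(size=n) + 0.5 * ability          # period-1 input
x2 = rng.normal(size=n) + 0.5 * ability          # period-2 input
A1 = 1.0 * x1 + 1.0 * ability + rng.normal(size=n)
A2 = 1.0 * x2 + 0.4 * x1 + 1.0 * ability + rng.normal(size=n)

def ols(y, cols):
    X = np.column_stack([np.ones(n)] + cols)
    return np.linalg.lstsq(X, y, rcond=None)[0]

print(ols(A2, [x2, A1]))            # value-added spec: coefficients biased
print(ols(A2, [x2, x1, ability]))   # full history + ability: ~ (0, 1, 0.4, 1)
```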


Programmed motion in the presence of homogeneity

ASTRONOMISCHE NACHRICHTEN, Issue 8 2009
G. Bozis
Abstract In the framework of the inverse problem of dynamics, we face the following question with reference to the motion of one material point: Given a region Torb of the xy plane, described by the inequality g(x, y) ≤ c0, are there potentials V = V(x, y) which can produce monoparametric families of orbits f(x, y) = c (also to be found) lying exclusively in the region Torb? As the relevant PDEs are nonlinear, an answer to this question (generally affirmative, but not with assurance) can be given by determining certain constants specifying the pertinent functions. In this paper we ease the mathematics involved by making certain simplifying assumptions concerning the homogeneity of both the function g(x, y) (describing the boundary of Torb) and the slope function γ(x, y) = fy/fx (representing the required family f(x, y) = c). We develop a method to treat the problem so formulated and we show that, even under these restrictive assumptions, an affirmative answer is guaranteed provided that two algebraic equations have at least one solution in common (© 2009 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim) [source]


Estimation for the Optimal Combination of Markers without Modeling the Censoring Distribution

BIOMETRICS, Issue 1 2009
Chin-Tsang Chiang
Summary In the time-dependent receiver operating characteristic curve analysis with several baseline markers, research interest focuses on seeking appropriate composite markers to enhance the accuracy in predicting the vital status of individuals over time. Based on censored survival data, we proposed a more flexible estimation procedure for the optimal combination of markers under the validity of a time-varying coefficient generalized linear model for the event time without restrictive assumptions on the censoring pattern. The consistency of the proposed estimators is also established in this article. In contrast, the inverse probability weighting (IPW) approach might introduce a bias when the selection probabilities are misspecified in the estimating equations. The performance of both estimation procedures are examined and compared through a class of simulations. It is found from the simulation study that the proposed estimators are far superior to the IPW ones. Applying these methods to an angiography cohort, our estimation procedure is shown to be useful in predicting the time to all-cause and coronary artery disease related death. [source]