Weaker Assumptions
Selected Abstracts

Economic aspects of human cloning and reprogenetics
ECONOMIC POLICY, Issue 36, 2003. Gilles Saint-Paul

SUMMARY: While most discussions of human cloning start and end with ethics, this paper analyses the economics of human cloning. I analyse the incentives for cloning and its implications for the long-run distribution of skills and income. I discuss models of human cloning for different motives, focusing on those that tend to produce new human beings with improved ability. I distinguish three cases: cloning as a means of assisted reproduction for infertile couples, cloning by fertile couples aimed at producing high-ability offspring, and, finally, financially motivated cloning. The third case supposes that the creator of a clone can appropriate some fraction of the clone's future income. Even if this fraction is small, the possibility of producing exceptionally talented clones with correspondingly high incomes might make it profitable, and thus turn cloning into a form of financial investment. An important consequence of these models is that, to the extent that ability is genetically determined and cloners prefer to make high-ability clones, cloning will act as a form of what might be called 'unnatural selection'. Following standard Darwinian logic, such selection will tend to increase the proportion of high-ability people in society. Indeed, under some assumptions the distribution of ability eventually converges to a mass point at the highest possible ability level. Under weaker assumptions, it is shown that ability-reducing genes are eventually eliminated. These results do not depend on cloning displacing sexual reproduction or even being widespread; they hold even if a small, or even negligible, number of top-ability workers is cloned in a small (but not negligible) number of copies.

The paper discusses the plausibility of the models and their results in light of the evidence on marriage markets, child selection, assisted human reproduction, and animal husbandry. Finally, it is shown how the analysis can be used to help formulate policies toward cloning, whether they aim at preventing it or managing its external effects.

Structure of the optimal income tax in the quasi-linear model
INTERNATIONAL JOURNAL OF ECONOMIC THEORY, Issue 1, 2007. Nigar Hashimzade
JEL codes: H21; H24

Existing numerical characterizations of the optimal income tax have been based on a limited number of model specifications. As a result, they do not reveal which properties are general. We determine the optimal tax in the quasi-linear model under weaker assumptions than have previously been used; in particular, we remove the assumption of a lower bound on the utility of zero consumption and the need to permit negative labor incomes. A Monte Carlo analysis is then conducted in which economies are selected at random and the optimal tax function constructed. The results show that in a significant proportion of economies the marginal tax rate rises at low skills and falls at high skills. The average tax rate is equally likely to rise or fall with skill at low skill levels, rises in the majority of cases in the centre of the skill range, and falls at high skills. These results are consistent across all the specifications we test. We then extend the analysis to show that these results also hold for Cobb-Douglas utility.

Improved estimation of portfolio value-at-risk under copula models with mixed marginals
THE JOURNAL OF FUTURES MARKETS, Issue 10, 2006. Douglas J. Miller

Portfolio value-at-risk (PVAR) is widely used in practice, but recent criticisms have focused on risks arising from biased PVAR estimates due to model specification errors and other problems.
The PVAR estimation method proposed in this article combines generalized Pareto distribution tails with the empirical density function to model the marginal distributions for each asset in the portfolio, and a copula model is used to form a joint distribution from the fitted marginals. The copula-mixed distribution (CMX) approach converges in probability to the true marginal return distribution but is based on weaker assumptions that may be appropriate for the returns data found in practice. CMX is used to estimate the joint distribution of log returns for the Taiwan Stock Exchange (TSE) index and the associated futures contracts on SGX and TAIFEX. The PVAR estimates for various hedge portfolios are computed from the fitted CMX model, and backtesting diagnostics indicate that CMX outperforms the alternative PVAR estimators. © 2006 Wiley Periodicals, Inc. Jrl Fut Mark 26:997-1018, 2006

Testing Marginal Homogeneity Against Stochastic Order in Multivariate Ordinal Data
BIOMETRICS, Issue 2, 2009. B. Klingenberg

Summary: Many assessment instruments used in the evaluation of toxicity, safety, pain, or disease progression consider multiple ordinal endpoints to fully capture the presence and severity of treatment effects. Contingency tables underlying these correlated responses are often sparse and imbalanced, rendering asymptotic results unreliable or model fitting prohibitively complex without overly simplistic assumptions on the marginal and joint distributions. Instead of a modeling approach, we look at stochastic order and marginal inhomogeneity as an expression or manifestation of a treatment effect under much weaker assumptions. Often, endpoints are grouped together into physiological domains or by the body function they describe. We derive tests based on these subgroups, which might supplement or replace the individual endpoint analysis because they are more powerful.

The permutation or bootstrap distribution is used throughout to obtain global, subgroup, and individual significance levels, as they naturally incorporate the correlation among endpoints. We provide a theorem that establishes a connection between marginal homogeneity and the stronger exchangeability assumption under the permutation approach. Multiplicity adjustments for the individual endpoints are obtained via stepdown procedures, while subgroup significance levels are adjusted via the full closed testing procedure. The proposed methodology is illustrated using a collection of 25 correlated ordinal endpoints, grouped into six domains, to evaluate the toxicity of a chemical compound.
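The central point of the Klingenberg abstract — that a permutation distribution "naturally incorporates the correlation among endpoints" — can be illustrated with a minimal sketch. This is not the paper's procedure: the function name `global_perm_test`, the max-of-mean-differences statistic, and the add-one p-value correction are all illustrative assumptions. The part that does reflect the abstract is that group labels are permuted across whole subject rows, so the joint dependence among the ordinal endpoints is left intact under the null.

```python
import numpy as np

def global_perm_test(X, labels, n_perm=2000, seed=0):
    """Global permutation test across correlated ordinal endpoints (sketch).

    X      : (n_subjects, n_endpoints) array of ordinal scores.
    labels : boolean array, True for one treatment group.
    Permuting subject labels (whole rows of X) preserves the correlation
    structure among endpoints under the null of no treatment effect.
    """
    rng = np.random.default_rng(seed)

    def stat(lab):
        # Per-endpoint difference in mean score between the two groups;
        # the global statistic is the largest absolute difference.
        return np.abs(X[lab].mean(axis=0) - X[~lab].mean(axis=0)).max()

    observed = stat(labels)
    exceed = sum(stat(rng.permutation(labels)) >= observed
                 for _ in range(n_perm))
    # Add-one correction keeps the p-value valid and strictly positive.
    return (exceed + 1) / (n_perm + 1)
```

Because the statistic is the maximum over endpoints, a single permutation reference distribution yields a global significance level without any explicit model for the joint distribution — the motivation the abstract gives for preferring this over model fitting on sparse contingency tables.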
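The CMX abstract above combines generalized Pareto distribution tails with an empirical body for each marginal. The peaks-over-threshold tail fit at the core of that idea can be sketched as follows; the threshold choice, the plain maximum-likelihood fit via `scipy.stats.genpareto`, and the function name `gpd_tail_var` are my own illustrative assumptions, not the paper's estimator.

```python
import numpy as np
from scipy.stats import genpareto

def gpd_tail_var(losses, u, alpha=0.99):
    """Peaks-over-threshold sketch of a GPD tail quantile (one-asset VaR).

    losses : 1-D array of losses (positive values are losses).
    u      : tail threshold; excesses above u are fitted with a GPD.
    alpha  : confidence level of the VaR estimate (assumes alpha is far
             enough in the tail that 1 - alpha < n_u / n).
    """
    excess = losses[losses > u] - u
    n, n_u = len(losses), len(excess)
    # Fit the GPD to the excesses, pinning the location parameter at zero.
    xi, _, beta = genpareto.fit(excess, floc=0)
    # Invert the fitted tail survival function
    #   P(L > u + y) ~ (n_u / n) * (1 + xi * y / beta)^(-1/xi)
    # at probability 1 - alpha (formula assumes xi != 0).
    return u + (beta / xi) * (((1 - alpha) * n / n_u) ** (-xi) - 1.0)
```

In the CMX setting this tail estimate would apply per asset, with the empirical distribution used inside the thresholds and a copula joining the fitted marginals into the portfolio's joint distribution.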