Reasonable Assumptions

Selected Abstracts

How Much Will Feeding More and Wealthier People Encroach on Forests?
POPULATION AND DEVELOPMENT REVIEW, Issue 2 2001
Paul E. Waggoner

Forests have recently expanded in many countries. The success of the world, including both rich and poor, in following this trend depends on future changes in population, income per capita, appetite, and crop yields. Extended to the year 2050, the strengths of these forces, estimated from experience, project cropland shrinking by nearly 200 million hectares, more than three times the land area of France. Changes in some of the forces, with crop yield the most manageable, could double the shrinkage. Reasonable assumptions about the forces can also make the distribution of spared land between rich and poor countries roughly equal. Although the encroachment factor translating cropland change into forest land change varies greatly, one-third or more of the cropland spared could become forest. [source]
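The projection logic behind such estimates can be illustrated with a toy version of the identity cropland = population x consumption per capita / yield. The sketch below is not Waggoner's model; the base area and the growth factors are invented assumptions, tuned only to land near the order of magnitude the abstract quotes.

```python
# Toy version of the projection identity: cropland demand scales with
# population and per-capita consumption, and inversely with yield.
# All values below are illustrative assumptions, not the paper's estimates.

cropland_2000 = 1500.0     # Mha, assumed base-year global cropland
pop_factor = 1.40          # assumed population growth to 2050
appetite_factor = 1.20     # assumed income/appetite growth
yield_factor = 1.95        # assumed crop-yield growth

cropland_2050 = cropland_2000 * pop_factor * appetite_factor / yield_factor
change = cropland_2050 - cropland_2000
print(f"cropland change: {change:+.0f} Mha")             # about -208 Mha
print(f"x France (54.4 Mha): {abs(change) / 54.4:.1f}")  # about 3.8
```

Raising the assumed yield factor, the "most manageable" force in the abstract, is what deepens the projected shrinkage.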
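Calculation of IBD probabilities with dense SNP or sequence data
GENETIC EPIDEMIOLOGY, Issue 6 2008
Jonathan M. Keith

The probabilities that two individuals share 0, 1, or 2 alleles identical by descent (IBD) at a given genotyped marker locus are quantities of fundamental importance for disease gene and quantitative trait mapping and in family-based tests of association. Until recently, genotyped markers were sufficiently sparse that founder haplotypes could be modelled as having been drawn from a population in linkage equilibrium for the purpose of estimating IBD probabilities. However, with the advent of high-throughput single nucleotide polymorphism genotyping assays, this is no longer a reasonable assumption. Indeed, the imminent arrival of individual sequencing will enable high-density single nucleotide polymorphism genotyping on a scale for which current algorithms are not equipped. In this paper, we present a simple new model in which founder haplotypes are modelled as a Markov chain. Another important innovation is that genotyping errors are explicitly incorporated into the model. We compare results obtained using the new model to those obtained using the popular genetic linkage analysis package Merlin, with and without using the cluster model of linkage disequilibrium that is incorporated into that program. We find that the new model results in accuracy approaching that of Merlin with haplotype blocks, but achieves this with orders of magnitude faster run times. Moreover, the new algorithm scales linearly with number of markers, irrespective of density, whereas Merlin scales supralinearly. We also confirm a previous finding that ignoring linkage disequilibrium in founder haplotypes can cause errors in the calculation of IBD probabilities. Genet. Epidemiol. 2008. © 2008 Wiley-Liss, Inc. [source]

A minimal sketch of the modelling contrast the abstract describes: founder haplotypes scored marker by marker under linkage equilibrium versus under a first-order Markov chain that captures local LD, with genotyping error folded in as an emission probability. All frequencies and transition values are made-up illustrative numbers, and the sketch omits the IBD machinery itself.

```python
# Contrast of linkage-equilibrium vs Markov-chain founder-haplotype models.
import numpy as np

h     = np.array([0, 1, 1, 0, 1])              # a haplotype over 5 SNPs
freq1 = np.array([0.3, 0.6, 0.5, 0.2, 0.7])    # assumed P(allele = 1) per marker

def p_le(h):
    """Linkage-equilibrium model: markers independent."""
    return np.prod(freq1**h * (1 - freq1)**(1 - h))

# Assumed LD parameters: trans[a, b] = P(next allele = b | current = a).
trans = np.array([[0.8, 0.2],
                  [0.3, 0.7]])

def p_markov(h):
    """First-order Markov model: each allele depends on its left neighbour."""
    p = freq1[0] if h[0] == 1 else 1 - freq1[0]
    for a, b in zip(h[:-1], h[1:]):
        p *= trans[a, b]
    return p

def p_observed(obs, true, eps=0.01):
    """Genotyping error as an emission: each observed allele is wrong w.p. eps."""
    return np.prod(np.where(obs == true, 1 - eps, eps))

print(p_le(h), p_markov(h), p_observed(h, h))
```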
Further uses of the heat of oxidation in chemical hazard assessment
PROCESS SAFETY PROGRESS, Issue 1 2003
Laurence G. Britton

Flammability: The "net heat of oxidation" technique described in an earlier publication is extended to predicting the lower flammable limits, lower limit flame temperatures, and limiting oxygen concentrations of chlorinated organic fuels having H:Cl ratios greater than unity. A new Rule is derived for predicting the effect of initial temperature on the lower flammable limits and limiting oxygen concentrations of organic fuels. It is suggested that this Rule be used in preference to the modified "Burgess-Wheeler" Rule. The effect of initial pressure is discussed.

Instability: Net heats of oxidation (kcal/mol oxygen) for a series of disparate fuel groups are compared with ΔHD, the maximum heat of decomposition (cal/g) calculated using CHETAH methodology. Given the reasonable assumption that CHETAH's "maximum heat of decomposition" cannot exceed the net heat of combustion ΔHC, examination is made as to whether the ratio of these parameters (each expressed in units of kcal/mol), coined the "Reaction Heat Ratio" (RH), provides a useful new indicator for instability assessment. Of these parameters, the net heat of oxidation (ΔHC/S) is the best indicator to help assign NFPA Instability Ratings. However, ΔHC/S cannot generally be used to assign ratings for organo-peroxides. Also, its performance as an indicator for hazardous polymerization depends on the ΔHC/S difference between the reacting monomer and the polymer product, so it should become increasingly unreliable as the monomer ΔHC/S approaches -100 kcal/mol oxygen. The ranking method tacitly assumes organic polymers to have a constant heat of oxidation of about -100 kcal/mol oxygen. Errors in this assumption must invalidate the ranking approach where ΔHC/S differences are small. Finally, separate "cut-offs" must be used at each NFPA Instability Rating for organo-nitrates versus other organics containing combinations of CHON atoms. Additional materials need to be examined to extend this preliminary analysis. The net heat of oxidation would be a useful additional output parameter of the CHETAH program, if only for its application in flammability assessment. No conclusions are drawn regarding the usefulness of net heat of oxidation or RH in conducting CHETAH hazard assessments, since this procedure requires consideration of several variables. However, the analysis may be helpful to the ASTM E 27.07 subcommittee responsible for developing the program. For example, the -ΔHD ≥ 700 cal/g cut-off used to assign a "high" CHETAH hazard rating typically corresponds to organic materials rated NFPA 1, the second to lowest hazard rating. [source]
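The two indicators can be illustrated numerically. The methane combustion figures below are standard handbook values; the decomposition heat is a placeholder chosen only to show how the Reaction Heat Ratio is formed, not a CHETAH output.

```python
# Net heat of oxidation and Reaction Heat Ratio for methane (illustrative).
# CH4 + 2 O2 -> CO2 + 2 H2O(g): net heat of combustion ~ -191.8 kcal/mol.

dHc = -191.8   # kcal/mol fuel, net (water as vapour), handbook value
S = 2.0        # mol O2 consumed per mol fuel

heat_of_oxidation = dHc / S
print(f"dHc/S = {heat_of_oxidation:.1f} kcal/mol O2")
# ~ -95.9, close to the 'typical organic' value of about -100 kcal/mol
# oxygen that the ranking method assumes.

dHd = -15.0    # kcal/mol, hypothetical maximum heat of decomposition
RH = dHd / dHc # Reaction Heat Ratio: both terms in kcal/mol
print(f"RH = {RH:.3f}")
# A small RH means decomposition releases only a small fraction of the
# fuel's total oxidation energy.
```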
Modelling and analysis of attenuation anisotropy in multi-azimuth VSP data from the Clair field
GEOPHYSICAL PROSPECTING, Issue 5 2007
Sonja Maultzsch

Anisotropic variations in attenuation are of interest since they can give information on the fracture system and may be more amenable to measurement than absolute attenuation values. We examine methods for detecting changes in relative attenuation with azimuth from VSP data, and validate the techniques on synthetic data. Analysis of a multi-azimuth walkaway VSP data set from a fractured hydrocarbon reservoir indicates that such azimuthal variations in P-wave attenuation are observable. The effects are localized in the reservoir, and analysis allows the prediction of a fracture strike direction, which agrees with geological information. The observed effects can be modelled under reasonable assumptions, which suggests the validity of the link between the anisotropic attenuation and the fracturing. [source]

The theory of currents through small bridge molecules
INTERNATIONAL JOURNAL OF QUANTUM CHEMISTRY, Issue 10 2007
B. L. Burrows

A model to treat the theory of currents through small bridge molecules connected to leads is constructed using the time-dependent Schrödinger equation and the wide-band approximation to treat the leads. It is shown that the behaviour of the current through the bridge may be summarized by considering three time periods: a transient period, a quasi-steady-state period and a decaying period. The results obtained are compared with previous work and in particular it is shown that, under reasonable assumptions, they are in accord with the more conventional time-independent scattering theory approaches in the steady-state period. Illustrative calculations are presented for both chain and ring bridge molecules. © 2007 Wiley Periodicals, Inc. Int J Quantum Chem, 2007 [source]

VOTING, INEQUALITY AND REDISTRIBUTION
JOURNAL OF ECONOMIC SURVEYS, Issue 1 2007
Rainald Borck

This paper surveys models of voting on redistribution. Under reasonable assumptions, the baseline model produces an equilibrium with the extent of redistributive taxation chosen by the median income earner. If the median is poorer than average, redistribution is from rich to poor, and increasing inequality increases redistribution. However, under different assumptions about the economic environment, redistribution may not be simply rich to poor, and inequality need not increase redistribution. Several lines of argument are presented, in particular, political participation, public provision of private goods, public pensions, and tax avoidance or evasion. [source]
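The baseline result can be sketched with a Meltzer-Richard-style toy model: a linear tax rebated lump sum, a quadratic deadweight loss, and a decisive median voter. The functional form and the numbers below are illustrative assumptions, not taken from the survey.

```python
# Median-voter redistribution sketch. A voter with income y consumes
# c = (1 - t)*y + T, where the transfer T = t*y_mean*(1 - delta*t)
# shrinks with a quadratic deadweight loss.

def preferred_tax(y, y_mean, delta=0.5):
    """First-order condition: -y + y_mean*(1 - 2*delta*t) = 0,
    giving t* = (1 - y/y_mean) / (2*delta), clipped to [0, 1]."""
    t = (1 - y / y_mean) / (2 * delta)
    return min(max(t, 0.0), 1.0)

y_mean = 100.0
for y_median in (90.0, 70.0, 50.0):  # rising inequality: median falls below mean
    print(y_median, preferred_tax(y_median, y_mean))
# The equilibrium tax rises as median income falls relative to the mean:
# the baseline 'inequality increases redistribution' result. The survey's
# point is that richer environments can overturn this comparative static.
```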
The Determinants of Child Labour: The Role of Primary Product Specialization
LABOUR, Issue 2 2005
Leonardo Becchetti

The paper tests predictions of a traditional intra-household bargaining model which, under reasonable assumptions, shows that lack of bargaining power in the value chain significantly reduces the capacity for obtaining benefits from increased product demand arising from trade liberalization and therefore is positively associated with child labour. Cross-sectional and panel negative binomial estimates in a sample of emerging countries support this hypothesis. They show that proxies of domestic workers' bargaining power in the international division of labour (such as the share of primary product exports) are significantly related to child labour, net of the effect of traditional controls such as parental income, quality of education, international aid, and trade liberalization. The positive impact of the share of primary product exports on child labour outlines a potential paradox. The paradox suggests that trade liberalization does not always have straightforward positive effects on social indicators and that its short-run effects on income distribution and distribution of skills and market power across countries need to be carefully evaluated. [source]

Influence of non-random incorporation of Mn ions on the magnetotransport properties of Ga1-xMnxAs alloys
PHYSICA STATUS SOLIDI (C) - CURRENT TOPICS IN SOLID STATE PHYSICS, Issue 3 2008
C. Michel

We study theoretically the influence of a spatially non-random incorporation of Mn ions on the magnetotransport in paramagnetic Ga1-xMnxAs alloys. Such non-randomness may be introduced during post-growth annealing treatment. We use a resistor-network model for describing the electrical transport of this disordered semiconductor system as a function of temperature and external magnetic field. The model is founded on classical semiconductor band transport and neglects many-body interactions. The peculiarities of paramagnetic dilute magnetic semiconductors, in particular the magnetic-field-induced changes of the density of states, the broad acceptor-energy distribution, and the interplay of magnetic-field-independent disorder (due to the alloying of GaAs with Mn) and magnetic-field-dependent disorder (due to the giant Zeeman splitting), are accounted for in a mean-field fashion. We have previously shown that this empirical transport model, based on reasonable assumptions and realistic material parameters, yields a satisfactory quantitative description of the experimentally obtained temperature and magnetic-field dependence of the resistivity of Ga0.98Mn0.02As samples annealed at different temperatures. For Ga1-xMnxAs alloys annealed at temperatures above 500 °C, where structural changes lead to the formation of MnAs clusters, the transport is dominated by the paramagnetic GaAs:Mn host matrix as the cluster density is below the percolation threshold. We show that in this situation the transport results can only be explained by accounting for a non-random Mn distribution. Thus the analysis shown here provides further understanding of the annealing-induced changes of the transport properties in dilute magnetic III-Mn-V semiconductors. (© 2008 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim) [source]

Presidential Address: The Cost of Active Investing
THE JOURNAL OF FINANCE, Issue 4 2008
Kenneth R. French

I compare the fees, expenses, and trading costs society pays to invest in the U.S. stock market with an estimate of what would be paid if everyone invested passively. Averaging over 1980-2006, I find investors spend 0.67% of the aggregate value of the market each year searching for superior returns. Society's capitalized cost of price discovery is at least 10% of the current market cap. Under reasonable assumptions, the typical investor would increase his average annual return by 67 basis points over the 1980-2006 period if he switched to a passive market portfolio. [source]
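The headline numbers fit together with simple perpetuity arithmetic. The discount rate below is an assumption chosen to show how a 0.67% annual flow capitalizes to roughly 10% of market value; the abstract itself does not state the rate used.

```python
# Back-of-the-envelope arithmetic behind the abstract's figures.

annual_cost = 0.0067   # 0.67% of aggregate market value spent per year, 1980-2006
r = 0.067              # assumed discount rate (net of growth) for a perpetuity

capitalized = annual_cost / r
print(f"capitalized cost of price discovery: {capitalized:.1%} of market cap")
# 10.0%, matching the 'at least 10%' figure for this choice of r.

# If the resources were not spent, the average investor's return would rise
# by the same annual flow: the 67 basis points the abstract reports.
print(f"gain from switching to passive: {annual_cost * 1e4:.0f} bp/yr")
```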
Response Adaptive Designs with a Variance-penalized Criterion
BIOMETRICAL JOURNAL, Issue 5 2009
Yanqing Yi

We consider a response adaptive design of clinical trials with a variance-penalized criterion. It is shown that this criterion evaluates the performance of a response adaptive design based on both the number of patients assigned to the better treatment and the power of the statistical test. A new proportion of treatment allocation is proposed and the doubly biased coin procedure is used to target the proposed proportion. Under reasonable assumptions, the proposed design is demonstrated to generate an asymptotic variance of allocation proportions which is smaller than that of the drop-the-loser design. Simulation comparisons of the proposed design with some existing designs are presented. [source]

Two-dimensional Numerical Modeling Research on Continent Subduction Dynamics
ACTA GEOLOGICA SINICA (ENGLISH EDITION), Issue 1 2004
WANG Zhimin

Continent subduction is one of the hot research problems in geoscience. Starting from documentary evidence that oceanic crust is subducted, and from the reasonable assumptions that continental crust can also be subducted and that the process can be simplified to a discontinuous plane-strain model, new two-dimensional numerical models have been set up and run with the finite element software ANSYS. The modeling results show that it is entirely possible for continental crust to be subducted to a depth of 120 km under certain circumstances and conditions. Simulations of continental subduction driven by a single dynamical factor have also been made, including the pull force of the subducted oceanic lithosphere, the drag force connected with mantle convection, and the push force of the mid-ocean ridge. These experiments show that the drag force connected with mantle convection is critical for continent subduction. [source]
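The relative size of the driving forces the abstract lists can be bounded with standard textbook magnitudes. The slab geometry, density contrast, and ridge-push figure below are generic illustrative values, not the paper's model inputs.

```python
# Order-of-magnitude comparison of two of the three drivers the abstract
# lists, per metre of trench/ridge length. All inputs are assumed,
# textbook-style values.

g = 9.8            # m/s^2
drho = 70.0        # kg/m^3, assumed density excess of the cold slab
thickness = 80e3   # m, assumed slab thickness
length = 600e3     # m, assumed subducted slab length

slab_pull = g * drho * thickness * length   # N per metre of trench
ridge_push = 3e12                           # N/m, typical textbook estimate

print(f"slab pull  ~ {slab_pull:.1e} N/m")  # ~ 3e13 N/m
print(f"ridge push ~ {ridge_push:.1e} N/m")
# Slab pull exceeds ridge push by roughly an order of magnitude, which is
# why the balance among these drivers, and the basal mantle drag the
# abstract finds critical, governs whether continental crust is dragged down.
```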