A Priori Assumptions (priori + assumption)
Selected Abstracts

Valuing food-borne risks using time-series data: The case of E. coli O157:H7 and BSE crises in Japan
AGRIBUSINESS: AN INTERNATIONAL JOURNAL, Issue 2 2006
Shunji Oniki
This study evaluates changes in consumers' concerns about food safety after the outbreaks of E. coli O157 and bovine spongiform encephalopathy (BSE) in Japan, using household consumption time-series data. A food demand system for Japanese households is estimated with the linear approximate almost-ideal demand system (AIDS) model to evaluate the willingness to accept (WTA) compensation for risk. The Kalman filtering method is applied to produce estimates without an a priori assumption regarding the timing of the changes. The WTA value rises immediately after a food safety crisis occurs and declines shortly afterwards; however, it does not return to previous levels for an extended period. A possible explanation for these lingering effects is habit formation and learning effects in consumption. [EconLit citations: D12, D18, Q13]. © 2006 Wiley Periodicals, Inc. Agribusiness 22: 219–232, 2006. [source]

Green's function and excitation spectrum of finite lattices
PHYSICA STATUS SOLIDI (B) BASIC SOLID STATE PHYSICS, Issue 8 2006
S. Cojocaru
Abstract New analytical results are obtained for the Green's function of a finite one-dimensional lattice with nearest-neighbor interaction using the discrete Fourier transform. The method offers an alternative to the Bethe Ansatz but does not require any a priori assumption on the form of the wavefunction. This makes it suitable for extension to nano-ferromagnets of higher dimensions. Solutions of the Heisenberg spin chain with periodic and open boundary conditions are considered as examples. (© 2006 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim) [source]

The Illusion of Equity: An Examination of Community Based Natural Resource Management and Inequality in Africa
GEOGRAPHY COMPASS (ELECTRONIC), Issue 9 2010
Cerian Gibbes
This article examines the dual goals of community based natural resource management (CBNRM): protecting the environment (specifically wildlife) and enhancing the socio-economic equity of communities. As described in the literature, CBNRM should integrate ecological sustainability, economic efficiency, and social equity (Pagdee et al. 2006). Although occasionally successful at the first two objectives, the enhancement of social equity is often wanting owing to a priori assumptions about communities and resource management devolution. This article, based largely on published literature, addresses the constraints and opportunities for successful CBNRM in Africa, focusing on southern Africa, as that part of the world has been one of the early testing grounds for these environmental management ideas. [source]

An Analytical Solution for Ground Water Transit Time through Unconfined Aquifers
GROUND WATER, Issue 4 2005
R. Chesnaux
An exact, closed-form analytical solution is developed for calculating ground water transit times within Dupuit-type flow systems. The solution applies to steady-state, saturated flow through an unconfined, horizontal aquifer recharged by surface infiltration and discharging to a downgradient fixed-head boundary. The upgradient boundary can represent, using the same equation, either a no-flow boundary or a fixed head. The approach is unique among travel-time calculations in that it makes no a priori assumptions regarding the limit of the water-table rise with respect to the minimum saturated aquifer thickness. The computed travel times are verified against a numerical model, and examples are provided which show that the predicted travel times can be on the order of nine times longer than those of existing analytical solutions.
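The paper's closed-form solution is not reproduced here, but the kind of Dupuit-type transit-time calculation it addresses can be sketched numerically. The sketch below assumes a textbook configuration (no-flow divide at x = 0, fixed head hL at x = L, uniform recharge N); the function name and all parameter values are illustrative, not taken from the paper.

```python
import numpy as np

def dupuit_transit_time(x0, L=100.0, hL=10.0, K=1e-4, N=1e-8, n_e=0.3, steps=10_000):
    """Travel time (s) from x = x0 to the outlet at x = L for steady Dupuit flow.

    Assumes a no-flow divide at x = 0, a fixed head hL (m) at x = L, uniform
    recharge N (m/s), hydraulic conductivity K (m/s) and effective porosity n_e.
    """
    x = np.linspace(x0, L, steps)
    # Dupuit water-table profile: h^2 = hL^2 + (N/K) * (L^2 - x^2)
    h = np.sqrt(hL**2 + (N / K) * (L**2 - x**2))
    q = N * x                 # discharge per unit width, from mass balance
    v = q / (n_e * h)         # average linear (seepage) velocity
    inv_v = 1.0 / v
    # t = integral of dx / v(x), trapezoidal rule
    return float(np.sum(0.5 * (inv_v[1:] + inv_v[:-1]) * np.diff(x)))
```

Water entering nearer the divide travels longer, so `dupuit_transit_time(10.0)` exceeds `dupuit_transit_time(50.0)`. The point x0 = 0 is excluded because the seepage velocity vanishes at the divide and the travel-time integral diverges there.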
[source]

A geomorphological explanation of the unit hydrograph concept
HYDROLOGICAL PROCESSES, Issue 4 2004
C. Cudennec
Abstract The water path from any point of a basin to the outlet through the self-similar river network was considered. This hydraulic path was split into components within the Strahler ordering scheme. For the entire basin, we assumed the probability density functions of the lengths of these components, reduced by the scaling factor, to be independent and isotropic. With these assumptions, we propose a statistical-physics reasoning (similar to Maxwell's) that considers a hydraulic-length symbolic space built on the self-similar lengths of the components. Theoretical expressions of the probability density functions of the hydraulic length and of the lengths of all the components were derived. These expressions are gamma laws expressed in terms of simple geomorphological parameters. We validated our theory with experimental observations from two French basins that differ in size and relief. From the comparisons, we discuss the relevance of the assumptions and show how a gamma-law structure underlies the river network organization, albeit under the influence of a strong hierarchy constraint. These geomorphological results have been translated into travel-time probability density functions through the hydraulic linear hypothesis. This translation provides deterministic explanations of some famous a priori assumptions of the unit hydrograph and geomorphological unit hydrograph theories, such as the general gamma-law shape and the exponential distribution of residence time in Strahler states. Copyright © 2004 John Wiley & Sons, Ltd. [source]

Numerical modelling of fluid flow in microscopic images of granular materials
INTERNATIONAL JOURNAL FOR NUMERICAL AND ANALYTICAL METHODS IN GEOMECHANICS, Issue 1 2002
E. Masad
Abstract A program for the simulation of two-dimensional (2-D) fluid flow at the microstructural level of a saturated anisotropic granular medium is presented. The program provides a numerical solution to the complete set of Navier–Stokes equations without a priori assumptions on the viscous or convection components. This makes it especially suited to simulating flows of fluids with different density and viscosity values and over a wide range of granular material porosity. The analytical solution for fluid flow in a simple microstructure of porous medium is used to verify the computer program. Subsequently, the flow field is computed within microscopic images of granular materials that differ in porosity, particle size and particle shape. The computed flow fields are shown to follow certain paths depending on air-void size and connectivity. The permeability tensor coefficients are derived from the flow fields, and their values are shown to compare well with laboratory experimental data on glass beads, Ottawa sand and silica sands. The directional distribution of permeability is expressed in a functional form and its anisotropy is quantified. Permeability anisotropy is found to be more pronounced in the silica sand medium, which consists of elongated particles. Copyright © 2001 John Wiley & Sons, Ltd. [source]

Social Dialogue: A Potential "Highroad" to Policies Addressing Ageing in The EU Member States
INTERNATIONAL SOCIAL SECURITY REVIEW, Issue 1 2006
Hedva Sarfati
The impact of demographic ageing on the sustainability of pensions has become the focus of heated debate in Europe as governments try to reform their welfare systems. Among the most vocal opponents of the reforms are employers' organizations and managements, trade unions and individual workers. The article looks at the issues at stake and the relevance of social dialogue, despite difficulties, for reaching consensus among all stakeholders on acceptable labour market and pension reforms.
These reforms have to be comprehensive and free from ideological a priori assumptions. Specific examples of available options are mentioned, including Finland, Denmark, Spain and the United Kingdom. [source]

A new shallow water model with polynomial dependence on depth
MATHEMATICAL METHODS IN THE APPLIED SCIENCES, Issue 5 2008
José M. Rodríguez
Abstract In this paper, we study two-dimensional Euler equations in a domain with small depth. With this aim, we introduce a small non-dimensional parameter ε related to the depth, and we use asymptotic analysis to study what happens when ε becomes small. We obtain a model for small ε that, after coming back to the original domain, gives us a shallow water model that allows for a non-constant bottom, and in which the horizontal velocity has a dependence on z introduced by the vorticity when it is not zero. This represents an interesting novelty with respect to the shallow water models found in the literature. We point out that we do not need to make a priori assumptions about velocity or pressure behaviour to obtain the model. The new model is able to approximate the solutions to the Euler equations with dependence on z (recovering the same velocity profile), whereas the classic model obtains only the average velocity. Copyright © 2007 John Wiley & Sons, Ltd. [source]

Shifting distributions and speciation: species divergence during rapid climate change
MOLECULAR ECOLOGY, Issue 3 2007
BRYAN C. CARSTENS
Abstract Questions about how shifting distributions contribute to species diversification remain virtually without answer, even though rapid climate change during the Pleistocene clearly impacted genetic variation within many species. One factor that has prevented this question from being adequately addressed is the lack of precision associated with estimates of species divergence made from a single genetic locus and without incorporating processes that are biologically important as populations diverge.
Analysis of DNA sequences from multiple variable loci in a coalescent framework that (i) corrects for gene divergence pre-dating speciation, and (ii) derives divergence-time estimates without making a priori assumptions about the processes underlying patterns of incomplete lineage sorting between species (i.e. allows for the possibility of gene flow during speciation) is critical to overcoming the inherent logistical and analytical difficulties of inferring the timing and mode of speciation during the dynamic Pleistocene. Estimates of species divergence that ignore these processes, use single-locus data, or do both can dramatically overestimate species divergence. For example, using a coalescent approach with data from six loci, the divergence between two species of montane Melanoplus grasshoppers is estimated at between 200 000 and 300 000 years before present, far more recent than divergence estimates made using single-locus data or without the incorporation of population-level processes. Melanoplus grasshoppers radiated in the sky islands of the Rocky Mountains, and the analysis of divergence between these species suggests that the isolation of populations in multiple glacial refugia was an important factor in promoting speciation. Furthermore, the low estimates of gene flow between the species indicate that reproductive isolation must have evolved rapidly for the incipient species boundaries to be maintained through the subsequent glacial periods and shifts in species distributions. [source]

Trajectory analysis: concepts and applications
BASIN RESEARCH, Issue 5 2009
W. Helland-Hansen
ABSTRACT Shoreline and shelf-edge trajectories describe the migration through time of sedimentary systems, using geomorphological breaks-in-slope that are associated with key changes in depositional processes and products. Analysis of these trajectories provides a simple descriptive tool that complements and extends conventional sequence stratigraphic methods and models.
Trajectory analysis offers four advantages over a sequence stratigraphic interpretation based on systems tracts: (1) each genetically related advance or retreat of a shoreline or shelf edge is viewed in the context of a continuously evolving depositional system, rather than as several discrete systems tracts; (2) subtle changes in depositional response (e.g. within systems tracts) can be identified and honoured; (3) trajectory analysis does not anticipate the succession of depositional events implied by systems-tract models; and (4) the descriptive emphasis of trajectory analysis does not involve any a priori assumptions about the type or nature of the mechanisms that drive sequence development. These four points allow the level of detail in a trajectory-based interpretation to be directly tailored to the available data, such that the interpretation may be qualitative or quantitative in two or three dimensions. Four classes of shoreline trajectory are recognized: ascending regressive, descending regressive, transgressive and stationary (i.e. nonmigratory). Ascending regressive and high-angle (accretionary) transgressive trajectories are associated with expanded facies belt thicknesses, the absence of laterally extensive erosional surfaces, and relatively high preservation of the shoreline depositional system. In contrast, descending regressive and low-angle (nonaccretionary) transgressive trajectories are associated with foreshortened and/or missing facies belts, the presence of laterally extensive erosional surfaces, and relatively low preservation of the shoreline depositional system. Stationary trajectories record shorelines positioned at a steeply sloping shelf edge, with accompanying bypass of sediment to the basin floor. Shelf-edge trajectories represent larger spatial and temporal scales than shoreline trajectories, and they can be subdivided into ascending, descending and stationary (i.e. nonmigratory) classes. 
Ascending trajectories are associated with a relatively large number and thickness of shoreline tongues (parasequences), the absence of laterally extensive erosional surfaces on the shelf, and relatively low sediment supply to the basin floor. Descending trajectories are associated with few, thin shoreline tongues, the presence of laterally extensive erosional surfaces on the shelf, and high sediment supply to basin-floor fan systems. Stationary trajectories record near-total bypass of sediment across the shelf and mass transfer to the basin floor. [source]

Atomic Properties of Amino Acids: Computed Atom Types as a Guide for Future Force-Field Design
CHEMPHYSCHEM, Issue 8 2003
Dr. Paul L. A. Popelier
Abstract Quantum chemical topology (QCT) can propose atom types by direct computation rather than by chemical intuition. In previous work, the molecular electron densities of the 20 amino acids and smaller derived molecules were partitioned into a set of 760 topological atoms. Each atom was characterised by seven atomic properties and subjected to cluster analysis element by element, that is, for C, H, O, N, and S. From the respective dendrograms, 21 carbon, 7 hydrogen, 2 nitrogen, 6 oxygen, and 6 sulfur atom types were distinguished. Herein, we contrast the QCT atom types with those of the assisted model building with energy refinement (AMBER) force field. We conclude that despite fair agreement between QCT and AMBER atom types, the latter are sometimes underdifferentiated and sometimes overdifferentiated. In summary, we suggest that QCT is a useful guide in designing new force fields or improving existing ones. The computational origin of QCT atom types makes their determination unbiased compared with atom-type determination by chemical intuition and a priori assumptions. We provide a list of specific recommendations.
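As a rough illustration of the element-by-element cluster analysis described above, the sketch below applies hierarchical (Ward) clustering to a toy property matrix. The three "environments", the seven property columns, and all numerical values are synthetic stand-ins, not the paper's QCT data.

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

rng = np.random.default_rng(0)

# Synthetic stand-in for one element (say, carbon): three hypothetical atomic
# environments, each sampled 20 times, described by seven atomic properties.
centres = rng.normal(size=(3, 7))
atoms = np.vstack([c + 0.05 * rng.normal(size=(20, 7)) for c in centres])

# Ward linkage builds the dendrogram; cutting it yields discrete atom types.
Z = linkage(atoms, method="ward")
atom_types = fcluster(Z, t=3, criterion="maxclust")
```

In the paper, the cut level would be read off each element's dendrogram; here the number of clusters (three) is simply imposed via `maxclust`.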
[source]

Hovenkamp's ostracized vicariance analysis: testing new methods of historical biogeography
CLADISTICS, Issue 4 2008
Article first published online: 7 DEC 200, Simone Fattorini
The methods currently employed in cladistic biogeography usually give contrasting results and are theoretically disputed. In two overlooked papers, Hovenkamp (1997, 2001) strongly criticized the methods currently used by biogeographers and proposed two alternatives. However, his criticisms have remained unanswered and his methods are rarely applied. I used three different data sets to show the superiority of Hovenkamp's methods. Neither of the methods proposed by Hovenkamp suffers from the unrealistic assumptions that underlie other methods commonly used in cladistic biogeography. The method proposed in 2001 is more powerful than the earlier method published in 1997 because it does not rely on a priori assumptions about the areas involved. However, the 1997 method may be a valid alternative for large data sets. © The Willi Hennig Society 2007. [source]