Future Evolution


Selected Abstracts

Factors determining mammal species richness on habitat islands and isolates: habitat diversity, disturbance, species interactions and guild assembly rules

GLOBAL ECOLOGY, Issue 1 2000
Barry J. Fox
Abstract 1. For over three decades the equilibrium theory of island biogeography has galvanized studies in ecological biogeography. Studies of oceanic islands and of natural habitat islands share some similarities with continental studies, particularly in developed regions where habitat fragmentation results from many land uses. Increasingly, remnant habitat is in the form of isolates created by the clearing and destruction of natural areas. Future evolution of a theory to predict patterns of species abundance may well come from the application of island biogeography to habitat fragments or isolates. 2. In this paper we consider four factors other than area and isolation that influence the number and type of mammal species coexisting in one place: habitat diversity, habitat disturbance, species interactions and guild assembly rules. In all examples our data derive from mainland habitat, fragmented to differing degrees, with different levels of isolation. 3. Habitat diversity is seen to be a good predictor of species richness. Increased levels of disturbance produce a relatively greater decrease in species richness on smaller than on larger isolates. Species interactions in the recolonization of highly disturbed sites, such as regenerating mined sites, are analogous to island colonization. Species replacement sequences in secondary successions indicate not just how many, but which species are included. Lastly, the complement of species established on islands, or in insular habitats, may be governed by guild assembly rules. These contributions may assist in taking a renewed theory into the new millennium. [source]


Perspective: Evolution and detection of genetic robustness

EVOLUTION, Issue 9 2003
J. Arjan G. M. de Visser
Abstract Robustness is the invariance of phenotypes in the face of perturbation. The robustness of phenotypes appears at various levels of biological organization, including gene expression, protein folding, metabolic flux, physiological homeostasis, development, and even organismal fitness. The mechanisms underlying robustness are diverse, ranging from thermodynamic stability at the RNA and protein level to behavior at the organismal level. Phenotypes can be robust either against heritable perturbations (e.g., mutations) or nonheritable perturbations (e.g., the weather). Here we primarily focus on the first kind of robustness, genetic robustness, and survey three growing avenues of research: (1) measuring genetic robustness in nature and in the laboratory; (2) understanding the evolution of genetic robustness; and (3) exploring the implications of genetic robustness for future evolution. [source]

Conditional Probabilistic Population Projections: An Application to Climate Change

Brian C. O'Neill
Summary Future changes in population size, composition, and spatial distribution are key factors in the analysis of climate change, and their future evolution is highly uncertain. In climate change analyses, population uncertainty has traditionally been accounted for by using alternative scenarios spanning a range of outcomes. This paper illustrates how conditional probabilistic projections offer a means of combining probabilistic approaches with the scenario-based approach typically employed in the development of greenhouse gas emissions projections. The illustration combines a set of emissions scenarios developed by the Intergovernmental Panel on Climate Change (IPCC) with existing probabilistic population projections from IIASA. Results demonstrate that conditional probabilistic projections have the potential to account more fully for uncertainty in emissions within conditional storylines about future development patterns, to provide a context for judging the consistency of individual scenarios with a given storyline, and to provide insight into relative likelihoods across storylines, at least from a demographic perspective. They may also serve as a step toward more comprehensive quantification of uncertainty in emissions projections. [source]
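The conditioning idea in the abstract can be sketched with a toy Monte Carlo exercise: draw population outcomes from an unconditional probabilistic projection, then restrict attention to the subset consistent with a storyline. All numbers below (the population distribution, the 8-billion cutoff, the per-capita emissions factor) are invented for illustration; they are not IIASA's projections or IPCC scenario values.

```python
import random

rng = random.Random(1)

# Hypothetical draws of 2100 world population (billions) from an
# unconditional probabilistic projection (made-up distribution).
draws = [rng.gauss(9.0, 1.5) for _ in range(100_000)]

# A "low-population storyline" conditions the projection on an outcome range.
low_story = [p for p in draws if p <= 8.0]

# Emissions scale with population under a fixed per-capita assumption
# (hypothetical value, for illustration only).
per_capita = 1.2  # emissions units per person per year
uncond_mean = sum(draws) / len(draws) * per_capita
cond_mean = sum(low_story) / len(low_story) * per_capita
```

Conditioning shifts the whole emissions distribution: the mean emissions level implied by the low-population storyline is below the unconditional mean, while the spread within the storyline still quantifies the remaining demographic uncertainty.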

FOX, `free objects for crystallography': a modular approach to ab initio structure determination from powder diffraction

Vincent Favre-Nicolin
A new program has been developed for ab initio crystal structure determination from powder diffraction data (X-ray and neutron). It uses global-optimization algorithms to solve the structure by performing trials in direct space. It is a modular program, capable of using several criteria for evaluating each trial configuration (e.g. multi-pattern). It is also modular in the description of the crystal content, with the possibility of describing building blocks in the sample, such as polyhedra or molecules, and with automatic adaptive handling of special positions and sharing of identical atoms between neighbouring building blocks. It can therefore find the correct structure without any assumption about the connectivity of the building blocks and is suitable for any kind of material. Several optimization algorithms (simulated annealing, parallel tempering) are available, with the possibility of choosing the convergence criterion as a combination of available cost functions. This program is freely available for Linux and Windows platforms; it is also fully `open source', which, combined with an object-oriented design and a complete developer documentation, ensures its future evolution. [source]
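The direct-space strategy the abstract describes (propose trial configurations, score each against the data, accept or reject via a global optimizer) can be illustrated with a generic simulated-annealing loop. The cost function below is a toy stand-in that recovers a single fractional coordinate; it is not FOX's actual powder-pattern agreement factor, and the cooling parameters are arbitrary.

```python
import math
import random

def simulated_anneal(cost, propose, x0, t0=1.0, cooling=0.999, steps=20_000, seed=0):
    """Generic simulated-annealing minimizer with Metropolis acceptance."""
    rng = random.Random(seed)
    x, c = x0, cost(x0)
    best_x, best_c = x, c
    t = t0
    for _ in range(steps):
        x_new = propose(x, rng)
        c_new = cost(x_new)
        # Always accept downhill moves; accept uphill moves with
        # Boltzmann probability exp(-delta/T), so early high-T steps explore.
        if c_new < c or rng.random() < math.exp((c - c_new) / max(t, 1e-12)):
            x, c = x_new, c_new
            if c < best_c:
                best_x, best_c = x, c
        t *= cooling  # geometric cooling schedule
    return best_x, best_c

# Toy stand-in for a diffraction agreement factor: recover a fractional
# coordinate whose "observed" value is 0.3 (hypothetical target).
target = 0.3
cost = lambda x: (x % 1.0 - target) ** 2
propose = lambda x, rng: x + rng.gauss(0.0, 0.05)

best_x, best_c = simulated_anneal(cost, propose, x0=0.9)
```

Parallel tempering, the other algorithm the program offers, extends this by running several such chains at different temperatures and occasionally swapping configurations between them.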

A linear benchmark for forecasting GDP growth and inflation?

Massimiliano Marcellino
Article first published online: 30 APR 200
Abstract Predicting the future evolution of GDP growth and inflation is a central concern in economics. Forecasts are typically produced either from economic theory-based models or from simple linear time series models. While a time series model can provide a reasonable benchmark to evaluate the value added of economic theory relative to the pure explanatory power of the past behavior of the variable, recent developments in time series analysis suggest that more sophisticated time series models could provide more serious benchmarks for economic models. In this paper we evaluate whether these complicated time series models can outperform standard linear models for forecasting GDP growth and inflation. We consider a large variety of models and evaluation criteria, using a bootstrap algorithm to evaluate the statistical significance of our results. Our main conclusion is that in general linear time series models can hardly be beaten if they are carefully specified. However, we also identify some important cases where the adoption of a more complicated benchmark can alter the conclusions of economic analyses about the driving forces of GDP growth and inflation. Copyright © 2008 John Wiley & Sons, Ltd. [source]
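The spirit of the benchmarking exercise can be sketched with a toy out-of-sample comparison: fit a simple linear AR(1) model by least squares and check whether it beats a naive no-change forecast. The simulated series, sample split, and parameters below are hypothetical; the paper's model set and evaluation criteria are far richer.

```python
import math
import random

def fit_ar1(y):
    """OLS fit of y_t = a + b * y_{t-1} + e_t."""
    x, z = y[:-1], y[1:]
    n = len(x)
    mx, mz = sum(x) / n, sum(z) / n
    b = sum((xi - mx) * (zi - mz) for xi, zi in zip(x, z)) / sum((xi - mx) ** 2 for xi in x)
    a = mz - b * mx
    return a, b

def rmse(errors):
    return math.sqrt(sum(e * e for e in errors) / len(errors))

# Simulate a persistent "growth-like" series from a true AR(1) process.
rng = random.Random(42)
y = [0.0]
for _ in range(1200):
    y.append(1.0 + 0.7 * y[-1] + rng.gauss(0.0, 1.0))

train, test = y[:1000], y[1000:]
a, b = fit_ar1(train)

ar_errs, rw_errs = [], []
for t in range(len(test) - 1):
    prev, actual = test[t], test[t + 1]
    ar_errs.append(actual - (a + b * prev))  # linear AR(1) benchmark
    rw_errs.append(actual - prev)            # naive no-change forecast
```

On persistent but mean-reverting data like this, the carefully specified linear model is hard to beat, which is the paper's headline finding; the interesting cases are the exceptions where richer benchmarks change the economic conclusions.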

Strategic bias, herding behaviour and economic forecasts

Jordi Pons-Novell
Abstract Professional forecasters may have objectives other than minimizing expected squared forecast errors. This paper studies whether the people or companies that make forecasts behave strategically, aiming to maximize publicity, salary or prestige, or more generally to minimize some loss function; or whether, on the contrary, they make forecasts that resemble consensus forecasts (herding behaviour). The study also analyses whether forecasters make more radical forecasts, that is, deviate further from the consensus, as they gain reputation and experience. To this end the Livingston Survey is used, a panel of experts who make forecasts on the future evolution of the United States economy. Copyright © 2003 John Wiley & Sons, Ltd. [source]
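A minimal way to quantify "deviation from consensus" is the average absolute gap between each forecaster's prediction and the cross-sectional panel mean in each survey round. The panel below is entirely hypothetical and only illustrates the statistic, not the Livingston Survey data or the paper's estimator.

```python
from statistics import mean

# Hypothetical panel: forecaster -> inflation forecasts over four survey rounds.
panel = {
    "veteran": [2.9, 3.4, 2.1, 3.8],   # bolder, more radical forecasts
    "junior_a": [3.0, 3.1, 2.9, 3.0],  # hugs the consensus
    "junior_b": [3.1, 3.0, 3.0, 3.1],  # hugs the consensus
}

# Consensus per round = cross-sectional mean of all panel forecasts.
rounds = list(zip(*panel.values()))
consensus = [mean(r) for r in rounds]

def avg_abs_deviation(name):
    """Average absolute gap from consensus: a simple anti-herding score."""
    return mean(abs(f - c) for f, c in zip(panel[name], consensus))

scores = {name: avg_abs_deviation(name) for name in panel}
```

A herding forecaster scores near zero; under the hypothesis the paper examines, scores should rise with a forecaster's reputation and experience.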

Is Bancassurance a Viable Model for Financial Firms?

L. Paige Fields
The bancassurance (i.e., bank and insurance company combinations) model for financial firm architecture has been widely used in Europe and recently has been adopted by U.S. financial firms. We provide evidence regarding the viability of bancassurance combinations for U.S. and non-U.S. mergers between 1997 and 2002. We find positive gains and no significant risk shifts for shareholders of bidding firms, and that higher CEO stock ownership results in less positive gains for shareholders. These and other results suggest that bancassurance firms are viable entities that may play an important role in the future evolution of the U.S. financial system. [source]

Information resources in High-Energy Physics: Surveying the present landscape and charting the future course

Anne Gentil-Beccot
Access to previous results is of paramount importance in the scientific process. Recent progress in information management focuses on building e-infrastructures for the optimization of the research workflow, through both policy-driven and user-pulled dynamics. For decades, High Energy Physics (HEP) has pioneered innovative solutions in the field of information management and dissemination. In light of a transforming information environment, it is important to assess the current usage of information resources by researchers, and HEP provides a unique test bed for this assessment. A survey of about 10% of practitioners in the field reveals usage trends and information needs. Community-based services, such as the pioneering arXiv and SPIRES systems, largely answer the needs of scientists, with a limited but increasing fraction of younger users relying on Google. Commercial services offered by publishers or database vendors are essentially unused in the field. The survey offers insight into the most important features that users require to optimize their research workflow. These results inform the future evolution of information management in HEP and, as these researchers are traditionally "early adopters" of innovation in scholarly communication, can inspire developments of disciplinary repositories serving other communities. [source]

The High-Volume Return Premium

Simon Gervais
The idea that extreme trading activity contains information about the future evolution of stock prices is investigated. We find that stocks experiencing unusually high (low) trading volume over a day or a week tend to appreciate (depreciate) over the course of the following month. We argue that this high-volume return premium is consistent with the idea that shocks in the trading activity of a stock affect its visibility, and in turn the subsequent demand and price for that stock. Return autocorrelations, firm announcements, market risk, and liquidity do not seem to explain our results. [source]
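The sorting logic behind such a study can be sketched as follows: classify stock-days by trading volume relative to a trailing average, then compare subsequent returns across the high- and low-volume buckets. The tickers, cutoffs, and return figures below are invented for illustration; they are not the paper's sample, thresholds, or results.

```python
from statistics import mean

# Hypothetical observations: (ticker, today's volume / trailing average volume,
# next-month return). Purely illustrative data.
observations = [
    ("A", 3.2, 0.021), ("B", 0.4, -0.012), ("C", 2.8, 0.015),
    ("D", 0.5, -0.008), ("E", 1.0, 0.004), ("F", 2.5, 0.011),
    ("G", 0.3, -0.015), ("H", 1.1, 0.002),
]

def classify(rel_volume, hi=2.0, lo=0.6):
    """Tag a stock-day as high-, low-, or normal-volume (hypothetical cutoffs)."""
    if rel_volume >= hi:
        return "high"
    if rel_volume <= lo:
        return "low"
    return "normal"

buckets = {"high": [], "low": [], "normal": []}
for ticker, rel_vol, next_ret in observations:
    buckets[classify(rel_vol)].append(next_ret)

# The high-volume return premium: subsequent returns of unusually active
# stocks minus those of unusually quiet ones.
premium = mean(buckets["high"]) - mean(buckets["low"])
```

In the paper's visibility story, the volume shock itself (not news or risk) raises investor attention and hence demand, so the premium above is positive in the data.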