Empirical Predictions (empirical + prediction)

Selected Abstracts


Empirical prediction of debris-flow mobility and deposition on fans

EARTH SURFACE PROCESSES AND LANDFORMS, Issue 2 2010
Christian Scheidl
Abstract A new method to predict the runout of debris flows is presented. A database of documented sediment-transporting events in torrent catchments of Austria, Switzerland and northern Italy has been compiled using common classification techniques. With these data we test an empirical relationship between planimetric deposition area and event volume, and compare it with results from other studies. We introduce a new empirical relation to determine the mobility coefficient as a function of geomorphologic catchment parameters. The mobility coefficient is thought to reflect some of the flow properties during the depositional phase of the debris-flow event. The empirical equations are implemented in a geographical information system (GIS)-based simulation program and combined with a simple flow routing algorithm to determine the potential runout area covered by debris-flow deposits. For a given volume and starting point of the deposits, a Monte Carlo technique is used to produce flow paths that simulate the spreading effect of a debris flow. The runout zone is delineated by confining the simulated potential spreading area in the downslope direction with the empirically determined planimetric deposition area. The debris-flow volume is then distributed over the predicted area according to the calculated outflow probability of each cell. The simulation uses the ARC-Objects environment of ESRI© and is adapted to run with high-resolution (2.5 m × 2.5 m) digital elevation models, generated for example from LiDAR data. The simulation program, called TopRunDF, is tested with debris-flow events of 1987 and 2005 in Switzerland. Copyright © 2009 John Wiley & Sons, Ltd. [source]
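The abstract does not give TopRunDF's internals, but the general recipe it describes (Monte Carlo random walks routed downslope over a DEM, confined by an empirically predicted planimetric deposition area) can be sketched as follows. This is an illustrative sketch only: the B = k·V^(2/3) area-volume scaling, the grid resolution, and all function and parameter names are assumptions, not the published method.

```python
import numpy as np

def simulate_runout(dem, start, volume, k_mobility, n_walks=2000, seed=0):
    """Illustrative Monte Carlo spreading of a debris-flow deposit on a DEM grid.

    dem        -- 2-D array of cell elevations (m), e.g. on a 2.5 m x 2.5 m grid
    start      -- (row, col) of the deposition starting point
    volume     -- event volume (m^3)
    k_mobility -- assumed mobility coefficient in the scaling B = k * V**(2/3)
    Returns per-cell outflow probabilities and an approximate deposit thickness (m).
    """
    rng = np.random.default_rng(seed)
    cell_area = 2.5 * 2.5                          # assumed grid resolution (m^2)
    max_area = k_mobility * volume ** (2.0 / 3.0)  # empirical planimetric deposition area
    max_cells = max(1, int(max_area / cell_area))  # confinement of the spreading area
    visits = np.zeros_like(dem, dtype=float)

    for _ in range(n_walks):
        r, c = start
        for _ in range(max_cells):                 # each walk stops once the area budget is spent
            visits[r, c] += 1
            nbrs, drops = [], []
            for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                rr, cc = r + dr, c + dc
                if 0 <= rr < dem.shape[0] and 0 <= cc < dem.shape[1]:
                    drop = dem[r, c] - dem[rr, cc]
                    if drop > 0:                   # only downslope neighbours are candidates
                        nbrs.append((rr, cc))
                        drops.append(drop)
            if not nbrs:                           # local sink: this walk ends here
                break
            p = np.asarray(drops) / sum(drops)     # steeper descent, higher transition probability
            r, c = nbrs[rng.choice(len(nbrs), p=p)]

    prob = visits / visits.sum()                   # per-cell outflow probability
    # keep only the most frequently visited cells, up to the empirical area limit
    keep = np.argsort(prob, axis=None)[::-1][:max_cells]
    mask = np.zeros(dem.size, dtype=bool)
    mask[keep] = True
    mask = mask.reshape(dem.shape) & (visits > 0)
    p_dep = np.where(mask, prob, 0.0)
    p_dep /= p_dep.sum()                           # distribute the volume over the kept cells
    thickness = p_dep * volume / cell_area
    return prob, thickness
```

In the published method the mobility coefficient comes from the empirical regression on geomorphologic catchment parameters; here k_mobility is simply an input.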


Standing Facilities and Interbank Borrowing: Evidence from the Federal Reserve's New Discount Window

INTERNATIONAL FINANCE, Issue 3 2003
Craig Furfine
Standing facilities are designed to place an upper bound on the rates at which financial institutions lend to one another overnight, reducing the volatility of the overnight interest rate, typically the rate targeted by central banks. However, improper design of the facility might decrease a bank's incentive to participate actively in the interbank market. Thus, the mere availability of central-bank-provided credit may lead to greater use than the characteristics of the interbank market alone would predict. Conversely, banks may perceive a stigma from using such facilities and thus borrow less than one might expect, limiting the facilities' effectiveness at dampening interest-rate volatility. We develop a model demonstrating these two alternative implications of a standing facility. Empirical predictions of the model are then tested using data from the Federal Reserve's new primary credit facility and the US federal funds market. A comparison of data from before and after recent changes to the discount window suggests continued reluctance to borrow from the Federal Reserve. [source]
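The abstract does not state the model formally; as a rough illustration (not the authors' specification), the two effects can be captured by letting a perceived stigma cost s per unit borrowed shift the effective ceiling that the facility places on the overnight rate:

```latex
% Illustration only: r_m is the interbank (federal funds) rate, r_f the
% standing-facility (primary credit) rate, and s the perceived stigma cost
% of borrowing from the facility. A bank taps the facility only when
% r_m > r_f + s, so the rate it actually pays is
r_{\text{paid}} \;=\; \min\{\, r_m,\; r_f + s \,\}.
% With s > 0 the effective ceiling sits above r_f, so rate spikes are damped
% less than intended; with s < 0 (a convenience value rather than a stigma)
% the facility is used more heavily than interbank conditions alone predict.
```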


Amino acid ingestion and glucose metabolism – A review

IUBMB LIFE, Issue 9 2010
Mary C. Gannon
Abstract Interest in the effect of proteins or amino acids on glucose metabolism dates back at least a century, largely because it was demonstrated that the amino acids from ingested protein could be converted into glucose. Indeed, these observations influenced the dietary information provided to people with diabetes. Subsequently, it was shown that ingested protein did not raise the blood glucose concentration. It was also shown that proteins could stimulate a rise in insulin and glucagon, but that the response to various proteins differed. In addition, individual amino acids could stimulate a rise in insulin and in glucagon concentrations. When individual amino acids are ingested by normal subjects, there is an ordering of the insulin and glucagon responses. However, the order is not the same for insulin and glucagon. Moreover, the metabolic response cannot be predicted from the functional groups of the amino acids. Thus, empirical prediction of the metabolic response to ingested single amino acids is not possible. © 2010 IUBMB. IUBMB Life, 62(9): 660–668, 2010 [source]


Forecast Dispersion and the Cross Section of Expected Returns

THE JOURNAL OF FINANCE, Issue 5 2004
TIMOTHY C. JOHNSON
ABSTRACT Recent work by Diether, Malloy, and Scherbina (2002) has established a negative relationship between stock returns and the dispersion of analysts' earnings forecasts. I offer a simple explanation for this phenomenon based on the interpretation of dispersion as a proxy for unpriced information risk arising when asset values are unobservable. The relationship then follows from a general options-pricing result: For a levered firm, expected returns should always decrease with the level of idiosyncratic asset risk. This story is formalized with a straightforward model. Reasonable parameter values produce large effects, and the theory's main empirical prediction is supported in cross-sectional tests. [source]
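The options-pricing mechanism in the abstract can be illustrated numerically (a sketch of the general Black-Scholes logic, not the paper's model): treating levered equity as a call option on firm assets, the elasticity of equity value with respect to asset value falls as asset volatility rises, and since the equity risk premium scales with that elasticity, expected equity returns fall too. All parameter values below are arbitrary placeholders.

```python
import numpy as np
from scipy.stats import norm

def equity_elasticity(assets, debt_face, sigma, r=0.03, T=5.0):
    """Black-Scholes view of levered equity as a call option on firm assets.

    Returns (equity value, elasticity), where elasticity = N(d1) * A / E.
    The equity risk premium is roughly elasticity * asset risk premium,
    so a falling elasticity means a falling expected equity return.
    """
    d1 = (np.log(assets / debt_face) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    equity = assets * norm.cdf(d1) - debt_face * np.exp(-r * T) * norm.cdf(d2)
    elasticity = norm.cdf(d1) * assets / equity
    return equity, elasticity

# Higher (idiosyncratic) asset volatility -> lower elasticity -> lower expected equity return.
for sigma in (0.1, 0.2, 0.4, 0.8):
    E, omega = equity_elasticity(assets=100.0, debt_face=80.0, sigma=sigma)
    print(f"sigma={sigma:.1f}  equity={E:6.2f}  elasticity={omega:5.2f}")
```

With these placeholder numbers the elasticity falls from roughly 3.1 at 10% asset volatility to roughly 1.2 at 80%, which is the direction of the dispersion-return relationship the paper documents.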


Cyclic tests on large-scale models of existing bridge piers with rectangular hollow cross-section

EARTHQUAKE ENGINEERING AND STRUCTURAL DYNAMICS, Issue 13 2003
A. V. Pinto
Abstract Cyclic tests on two large-scale models of existing bridge piers with rectangular hollow cross-section were performed in the ELSA laboratory. The prototype structure is an existing reinforced concrete highway bridge constructed in Austria in 1975. The piers exhibited several seismic deficiencies and consequently showed poor hysteretic behaviour, limited deformation capacity and undesirable failure modes that do not comply with the requirements of modern codes for seismic-resistant structures. Experimental data are compared with numerical and empirical predictions. Copyright © 2003 John Wiley & Sons, Ltd. [source]


The tri-trophic niche concept and adaptive radiation of phytophagous insects

ECOLOGY LETTERS, Issue 12 2005
Michael S. Singer
Abstract A conceptual divide exists between ecological and evolutionary approaches to understanding adaptive radiation, although the phenomenon is inherently both ecological and evolutionary. This divide is evident in studies of phytophagous insects, a highly diverse group that has been frequently investigated with the implicit or explicit goal of understanding its diversity. Whereas ecological studies of phytophagous insects increasingly recognize the importance of tri-trophic interactions as determinants of niche dimensions such as host-plant associations, evolutionary studies typically neglect the third trophic level. Here we attempt to reconcile ecological and evolutionary approaches through the concept of the ecological niche. We specifically present a tri-trophic niche concept as a foil to the traditional bi-trophic niche concept for phytophagous insects. We argue that these niche concepts have different implications for understanding herbivore community structure, population divergence, and evolutionary diversification. To this end, we offer contrasting empirical predictions of bi- and tri-trophic niche concepts for patterns of community structure, the process of population divergence, and patterns of evolutionary diversification of phytophagous insects. [source]


Ecoimmunity: immune tolerance by symmetric co-evolution

EVOLUTION AND DEVELOPMENT, Issue 6 2007
Uri Nevo
SUMMARY It is widely accepted that immune tolerance toward "self" is established by central and peripheral adaptations of the immune system. Mechanisms that have been demonstrated to play a role in the induction and maintenance of tolerance include thymic deletion of self-reactive T cells, peripheral T cell anergy and apoptosis, as well as thymic and peripheral induction of regulatory T cells. However, a large body of experimental findings cannot be rationalized solely on the basis of adaptations of the immune system to its environment. Here we propose a new model, termed Ecoimmunity, in which the immune system and the tissue are viewed as two sides of a continuously active and co-evolving predator–prey system. Ecoimmunity views self-tolerance not as an equilibrium in which autoimmunity is chronically suppressed, but as a symmetrical, balanced conflict between the ability of immune cells to destroy tissue cells by numerous mechanisms and the capacity of adapted tissue cells to avoid predation. This balance evolves during ontogeny: in parallel with immune adaptations, embryonic tissue cells adapt their phenotype to the corresponding immune activity by developing the ability to escape or modulate damaging local immune responses. This phenotypic plasticity of tissue cells is directed by epigenetic selection of gene expression patterns and cellular phenotypes amid ongoing immune pressure. Thus, whereas some immune cells prey predominantly on pathogens and infected cells, self-reactive cells continuously prey on incompetent tissue cells that fail to express the adapted phenotype and resist predation. This model uses ecological generalization to reconcile current contradictory observations as well as classical enigmas related both to autoimmunity and to tolerance toward foreign tissues. Finally, it provides empirical predictions and alternative strategies toward clinical challenges. [source]
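The predator-prey framing lends itself to a toy simulation. The following sketch is purely illustrative (not the authors' model, and all rates are invented); it only shows that a conflict in which immune pressure both culls non-adapted tissue cells and drives their adaptation can settle into a standing balance rather than ending in tissue destruction or immune silence.

```python
import numpy as np

def simulate(T=400.0, dt=0.01):
    """Toy ecoimmunity dynamics: illustrative only, not the published model.

    S : tissue cells that have not adapted to local immune pressure ("prey")
    R : tissue cells expressing an adapted, predation-resistant phenotype
    I : self-reactive immune cells ("predators")
    """
    S, R, I = 1.0, 0.0, 0.05          # initial susceptible, resistant, immune pools
    r, K = 0.5, 1.0                   # tissue turnover rate and carrying capacity
    cull, adapt = 1.0, 0.3            # per-encounter culling and adaptation rates
    eff, decay, loss = 0.6, 0.3, 0.1  # immune gain per kill, immune decay, loss of R
    for _ in range(int(T / dt)):      # simple forward-Euler integration
        encounters = I * S
        dS = r * S * (1 - S / K) - (cull + adapt) * encounters
        dR = adapt * encounters - loss * R
        dI = eff * cull * encounters - decay * I
        S, R, I = S + dS * dt, R + dR * dt, I + dI * dt
    return S, R, I

print("standing balance (S, R, I):", np.round(simulate(), 3))
```

The printed steady state keeps all three pools positive, i.e. a tolerant standoff rather than elimination of either side.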


Signaling, Free Cash Flow and "Nonmonotonic" Dividends

FINANCIAL REVIEW, Issue 1 2010
Kathleen Fuller
Abstract (JEL classification: G35) Many argue that dividends either signal future earnings or dispose of excess cash. Empirical support is inconclusive, potentially because no model combines both rationales. This paper does. Higher-quality firms pay dividends to eliminate the free cash-flow problem, while firms that outsiders perceive as lower quality pay dividends to signal future earnings and reduce the free cash-flow problem. In equilibrium, dividends are nonmonotonic with respect to the signal observed by outsiders; the highest-quality firms pay smaller dividends than firms of lower perceived quality. The model reconciles the existing literature and generates new empirical predictions that are tested and supported. [source]


A Double Moral Hazard Model of Organization Design

JOURNAL OF ECONOMICS & MANAGEMENT STRATEGY, Issue 1 2010
Elazar Berkovitch
We develop a theory of organization design in which the firm's structure is chosen by trading off ex post efficiency in the implementation of projects against ex ante efficiency in the selection of projects. Using our framework, we derive a novel set of empirical predictions regarding differences between firms with a functional structure and firms with a divisional structure. We examine how the overall profitability of the two structures is affected by various factors like size, complexity, and asymmetry in the importance of tasks and also explore the desirability of adopting a narrow business strategy. [source]


An Activation-Based Model of Sentence Processing as Skilled Memory Retrieval

COGNITIVE SCIENCE - A MULTIDISCIPLINARY JOURNAL, Issue 3 2005
Richard L. Lewis
Abstract We present a detailed process theory of the moment-by-moment working-memory retrievals and associated control structure that subserve sentence comprehension. The theory is derived from the application of independently motivated principles of memory and cognitive skill to the specialized task of sentence parsing. The resulting theory construes sentence processing as a series of skilled associative memory retrievals modulated by similarity-based interference and fluctuating activation. The cognitive principles are formalized in computational form in the Adaptive Control of Thought,Rational (ACT,R) architecture, and our process model is realized in ACT,R. We present the results of 6 sets of simulations: 5 simulation sets provide quantitative accounts of the effects of length and structural interference on both unambiguous and garden-path structures. A final simulation set provides a graded taxonomy of double center embeddings ranging from relatively easy to extremely difficult. The explanation of center-embedding difficulty is a novel one that derives from the model' complete reliance on discriminating retrieval cues in the absence of an explicit representation of serial order information. All fits were obtained with only 1 free scaling parameter fixed across the simulations; all other parameters were ACT,R defaults. The modeling results support the hypothesis that fluctuating activation and similarity-based interference are the key factors shaping working memory in sentence processing. We contrast the theory and empirical predictions with several related accounts of sentence-processing complexity. [source]