Make Predictions (make + prediction)

Selected Abstracts


Job completion prediction using case-based reasoning for Grid computing environments

CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 9 2007
Lilian Noronha Nassif
Abstract One of the main focuses of Grid computing is solving resource-sharing problems in multi-institutional virtual organizations. In such heterogeneous and distributed environments, selecting the best resource to run a job is a complex task. The solutions currently employed still present numerous challenges, one of which is how to let users know when a job will finish; consequently, advance reservation remains unavailable. This article presents a new approach that makes predictions of job execution time in Grid environments by applying the case-based reasoning paradigm. The work includes the development of a new case retrieval algorithm involving relevance sequence and similarity degree calculations. The prediction model is part of a multi-agent system that selects the best resource of a computational Grid to run a job. Agents representing candidate resources for job execution make predictions in a distributed and parallel manner. The technique presented here can be used in Grid environments at operation time to assist users with batch job submissions. Experimental results validate the prediction accuracy of the proposed mechanisms and the performance of our case retrieval algorithm. Copyright © 2006 John Wiley & Sons, Ltd. [source]
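
For readers unfamiliar with case-based reasoning, the minimal sketch below illustrates the general idea: past jobs are stored as cases, and a new job's runtime is estimated from its most similar stored cases. It is a nearest-neighbour illustration with hypothetical feature names, not the authors' relevance-sequence retrieval algorithm.

```python
# Minimal nearest-neighbour sketch of case-based runtime prediction.
# Hypothetical features; this shows only the general CBR idea, not the
# paper's relevance-sequence and similarity-degree calculations.
from dataclasses import dataclass
from math import sqrt
from typing import Tuple

@dataclass
class Case:
    features: Tuple[float, ...]  # e.g. input size, CPU speed, current load
    runtime: float               # observed execution time, in seconds

def similarity(a, b):
    """Inverse-distance similarity between two feature vectors."""
    distance = sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return 1.0 / (1.0 + distance)

def predict_runtime(case_base, new_features, k=3):
    """Estimate runtime as the similarity-weighted mean of the k most similar cases."""
    ranked = sorted(case_base,
                    key=lambda c: similarity(c.features, new_features),
                    reverse=True)[:k]
    weights = [similarity(c.features, new_features) for c in ranked]
    return sum(w * c.runtime for w, c in zip(weights, ranked)) / sum(weights)

# Toy case base: (input size, CPU speed, load) -> observed runtime in seconds
cases = [Case((100, 2.0, 0.3), 120.0),
         Case((200, 2.0, 0.5), 260.0),
         Case((150, 3.0, 0.2), 110.0)]
print(predict_runtime(cases, (160, 2.5, 0.4)))
```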


Latitudinal gradients in diversity: real patterns and random models

ECOGRAPHY, Issue 3 2001
Patricia Koleff
Mid-domain models have been argued to provide a default explanation for the best-known spatial pattern in biodiversity, namely the latitudinal gradient in species richness. These models assume no environmental gradients, but merely a random latitudinal association between the size and placement of the geographic ranges of species. A mid-domain peak in richness is generated because when the latitudinal extents of species in a given taxonomic group are bounded to north and south, perhaps by a physical constraint such as a continental edge or perhaps by a climatic constraint such as a critical temperature or precipitation threshold, then the number of ways in which ranges can be distributed changes systematically between the bounds. In addition, such models make predictions about latitudinal variation in the latitudinal extents of the distributions of species, and in beta diversity (the spatial turnover in species identities). Here we test how well five mid-domain models predict observed latitudinal patterns of species richness, latitudinal extent and beta diversity in two groups of birds, parrots and woodpeckers, across the New World. Whilst both groups exhibit clear gradients in richness and beta diversity and the general trend in species richness is acceptably predicted (but not accurately, unless substantial empirical information is assumed), the fit of these models is uniformly poor for beta diversity and latitudinal range extent. This suggests either that, at least for these data, mid-domain models as presently formulated are too simplistic, or that in practice the mid-domain effect is not significant in determining geographical variation in diversity. [source]
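
The mid-domain effect itself is easy to reproduce with a toy null model. The sketch below is an illustration only, not one of the five models tested in the paper: ranges of random extent are placed at random within a bounded one-dimensional domain, and richness is counted per latitudinal band.

```python
# Toy mid-domain null model: no environmental gradient is assumed, yet
# richness peaks near the middle of the domain simply because ranges must
# fit between the two boundaries.
import random

def mid_domain_richness(n_species=1000, n_bands=60, seed=1):
    random.seed(seed)
    richness = [0] * n_bands
    for _ in range(n_species):
        extent = random.randint(1, n_bands)          # random range size
        start = random.randint(0, n_bands - extent)  # placement kept inside the bounds
        for band in range(start, start + extent):
            richness[band] += 1
    return richness

r = mid_domain_richness()
print("edge richness:", r[0], r[-1], " mid-domain richness:", r[len(r) // 2])
```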


Capturing pressure-dependence in automated mechanism generation: Reactions through cycloalkyl intermediates

INTERNATIONAL JOURNAL OF CHEMICAL KINETICS, Issue 3 2003
David M. Matheu
Chemical kinetic mechanisms for gas-phase processes (including combustion, pyrolysis, partial oxidation, or the atmospheric oxidation of organics) will often contain hundreds of species and thousands of reactions. The size and complexity of such models, and the need to ensure that important pathways are not left out, have inspired the use of computer tools to generate such large chemical mechanisms automatically. But the models produced by existing computerized mechanism generation codes, as well as a great many large mechanisms generated by hand, do not include pressure-dependence in a general way. This is due to the difficulty of computing the large number of k(T, P) estimates required. Here we present a fast, automated method for computing k(T, P) on-the-fly during automated mechanism generation. It uses as its principal inputs the same high-pressure-limit rate estimation rules and group-additivity thermochemistry estimates employed by existing computerized mechanism-generation codes, and automatically identifies the important chemically activated intermediates and pathways. We demonstrate the usefulness of this approach on a series of pressure-dependent reactions through cycloalkyl radical intermediates, including systems with over 90 isomers and 200 accessible product channels. We test the accuracy of these computer-generated k(T, P) estimates against experimental data on the systems H + cyclobutene, H + cyclopentene, H + cyclohexene, C2H3 + C2H4, and C3H5 + C2H4, and make predictions for temperatures and pressures where no experimental data are available. © 2002 Wiley Periodicals, Inc. Int J Chem Kinet 35: 95–119, 2003 [source]
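
As a point of reference for what a pressure-dependent rate coefficient looks like, the sketch below evaluates a textbook Lindemann falloff from hypothetical high- and low-pressure-limit Arrhenius parameters. It is purely illustrative; the paper's method is a more detailed treatment of chemically activated intermediates, not a Lindemann fit.

```python
# Textbook Lindemann falloff sketch (illustrative only). k(T, P) interpolates
# between the low-pressure limit k0*[M] and the high-pressure limit k_inf.
import math

R = 8.314  # gas constant, J mol^-1 K^-1

def arrhenius(A, n, Ea, T):
    """Modified Arrhenius form k = A * T^n * exp(-Ea / (R*T))."""
    return A * T**n * math.exp(-Ea / (R * T))

def lindemann_k(T, P, A_inf, n_inf, Ea_inf, A0, n0, Ea0):
    """Pressure-dependent rate coefficient from high- and low-pressure-limit parameters."""
    k_inf = arrhenius(A_inf, n_inf, Ea_inf, T)
    k0 = arrhenius(A0, n0, Ea0, T)
    M = P / (R * T)          # ideal-gas bath-gas concentration, mol m^-3
    Pr = k0 * M / k_inf      # reduced pressure
    return k_inf * Pr / (1.0 + Pr)

# Hypothetical parameters, not fitted to any of the systems studied in the paper
print(lindemann_k(T=1000.0, P=1.0e5,
                  A_inf=1.0e13, n_inf=0.0, Ea_inf=1.5e5,
                  A0=1.0e7, n0=0.0, Ea0=1.0e5))
```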


Simulations of magnetic fields in the cosmos

ASTRONOMISCHE NACHRICHTEN, Issue 5-6 2006
M. Brüggen
Abstract The origin of large-scale magnetic fields in clusters of galaxies remains controversial. The intergalactic magnetic field within filaments should be less polluted by magnetised outflows from active galaxies than magnetic fields in clusters. Therefore, filaments may be a better laboratory to study magnetic field amplification by structure formation than galaxy clusters, which typically host many more active galaxies. We present highly resolved cosmological adaptive mesh refinement simulations of magnetic fields in the cosmos and make predictions about the evolution and structure of magnetic fields in filaments. Comparing our results to observational evidence of magnetic fields in filaments suggests that amplification of seed fields by gravitational collapse is not sufficient to produce the magnetic fields observed in the intergalactic medium. Finally, implications for cosmic ray transport and the impact of magnetic fields on delayed photons from gamma-ray bursts are discussed. (© 2006 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim) [source]
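
A rough back-of-envelope scaling (ours, not the paper's) helps make this conclusion plausible. For a magnetic field frozen into an isotropically collapsing gas,

```latex
% Flux-freezing scaling for isotropic compression (illustrative estimate only)
B \propto \rho^{2/3},
\qquad
\frac{B_{\mathrm{filament}}}{B_{\mathrm{seed}}}
  \sim \left(\frac{\rho_{\mathrm{filament}}}{\bar\rho}\right)^{2/3}
  \sim 10^{2/3} \approx 5
\quad \text{for an assumed filament overdensity of} \sim 10 .
```

Under this scaling, even a generous filament overdensity amplifies a seed field by less than an order of magnitude, far short of the gap between plausible primordial seed fields and the field strengths the observations probe.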


PREDICTION-FOCUSED MODEL SELECTION FOR AUTOREGRESSIVE MODELS

AUSTRALIAN & NEW ZEALAND JOURNAL OF STATISTICS, Issue 4 2007
Gerda Claeskens
Summary In order to make predictions of future values of a time series, one needs to specify a forecasting model. A popular choice is an autoregressive time-series model, for which the order of the model is chosen by an information criterion. We propose an extension of the focused information criterion (FIC) for model-order selection, with emphasis on high predictive accuracy (i.e. a low mean squared forecast error). We obtain theoretical results and illustrate by means of a simulation study and some real data examples that the FIC is a valid alternative to the Akaike information criterion (AIC) and the Bayesian information criterion (BIC) for selection of a prediction model. We also illustrate the possibility of using the FIC for purposes other than forecasting, and explore its use in an extended model. [source]
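
For context, the sketch below implements the conventional AIC route to autoregressive order selection, the baseline against which the FIC is compared. It is not an implementation of the FIC itself, and all simulated values are illustrative.

```python
# AIC-based selection of the AR order p (the standard baseline; not the FIC).
import numpy as np

def select_ar_order_aic(x, max_order=8):
    """Fit AR(p) by least squares for p = 1..max_order and return the AIC-minimising p."""
    x = np.asarray(x, dtype=float)
    best_p, best_aic = None, np.inf
    for p in range(1, max_order + 1):
        # Predict x[t] from an intercept and the p lagged values x[t-1], ..., x[t-p]
        lags = np.column_stack([x[p - k - 1:len(x) - k - 1] for k in range(p)])
        A = np.column_stack([np.ones(len(x) - p), lags])
        y = x[p:]
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        rss = float(np.sum((y - A @ coef) ** 2))
        n = len(y)
        aic = n * np.log(rss / n) + 2 * (p + 1)  # Gaussian log-likelihood up to a constant
        if aic < best_aic:
            best_p, best_aic = p, aic
    return best_p

# Simulate an AR(2) series and check that the selected order is sensible
rng = np.random.default_rng(0)
x = np.zeros(500)
for t in range(2, 500):
    x[t] = 0.6 * x[t - 1] - 0.3 * x[t - 2] + rng.normal()
print(select_ar_order_aic(x))
```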


Reliable high-throughput screening with Pichia pastoris by limiting yeast cell death phenomena

FEMS YEAST RESEARCH, Issue 2 2004
Roland Weis
Abstract Comparative screening of gene expression libraries employing the potent industrial host Pichia pastoris for improving recombinant eukaryotic enzymes by protein engineering had been an unsolved task. We simplified the protocol for protein expression by P. pastoris and scaled it down to 0.5-ml cultures. By optimising standard growth conditions and procedures, we diminished programmed cell death and necrosis of P. pastoris in microscale cultures. Uniform cell growth in 96-deep-well plates now allows for high-throughput protein expression and screening for improved enzyme variants. Furthermore, switching from one host for protein engineering to another host for enzyme production becomes dispensable, which accelerates the protein breeding cycles and makes predictions for large-scale production more accurate. [source]


Metabolic cold adaptation and developmental plasticity in metabolic rates among species in the Fundulus notatus species complex

FUNCTIONAL ECOLOGY, Issue 5 2010
Jacob Schaefer
Summary 1. In ectotherms, temperature and body size are the most influential and well-studied variables affecting metabolic rate. Understanding the mechanisms driving the evolution of metabolic rates is crucial to broader ecological theory. The metabolic cold adaptation hypothesis (MCA) makes predictions about the evolution of ectotherm metabolic rates and temperature-metabolic rate reaction norms. 2. We examined intra- and interspecific patterns in metabolic rate among populations in the Fundulus notatus species group (F. notatus, F. olivaceus and F. euryzonus). We ask whether patterns of intra- and interspecific variability in metabolic rate are consistent with the MCA and whether metabolic rates in general are developmentally plastic. 3. Support for the MCA was mixed among intra- and interspecific tests. The northern population of F. olivaceus had an increased metabolic rate and no difference in temperature sensitivity (slope of the temperature-metabolic rate reaction norm). Northern populations of F. notatus had lower temperature sensitivity and no difference in overall metabolic rate. The southern coastal drainage endemic (F. euryzonus) had intermediate metabolic rates compared to southern populations of the other two more broadly distributed species. Metabolic rates were also developmentally plastic: adults reared at warmer temperatures had lower metabolic rates after accounting for body size and temperature. 4. Differences in thermal regimes explain some of the variability in metabolic rates among populations, consistent with the MCA. However, interspecific comparisons are not consistent with the MCA and are likely influenced by species differences in ecology and life-history strategies. [source]


Damage control – a possible non-proteolytic role for ubiquitin in limiting neurodegeneration

NEUROPATHOLOGY & APPLIED NEUROBIOLOGY, Issue 2 2001
D. A. Gray
Ubiquitin can be detected in the neuronal and glial inclusions that are the diagnostic hallmarks of a number of human neurodegenerative diseases. It has been assumed that the presence of ubiquitin signifies the failed attempt of the cell to remove abnormal protein structures, which have been allowed to aggregate. The burden of abnormal protein arising from genetic mutations or cumulative oxidative damage might in the course of time overwhelm the ubiquitin-proteasome pathway (whose responsibility it is to eliminate misfolded or damaged proteins). However, ubiquitin may still serve a protective purpose distinct from its role in proteolysis. The physical properties of ubiquitin are such that a surface coating of ubiquitin should preclude further growth of the aggregate, prevent non-productive interactions, and conceal the contents from detection mechanisms that might ultimately kill the cell. This 'nonstick coating' hypothesis makes predictions about the nature of the conjugated ubiquitin and the consequences of removing it. [source]