Basic Tools


Selected Abstracts

Stochastic optimization for the ruin probability

Prof. Dr. rer. nat. Manfred Schäl
The Cramér-Lundberg insurance model is studied where the risk process can be controlled by reinsurance and by investment in a financial market. The performance criterion is the ruin probability. The problem can be embedded in the framework of discrete-time stochastic dynamic programming. Basic tools are the Howard improvement and the verification theorem. Explicit conditions are obtained for the optimality of employing no reinsurance and of not investing in the market. [source]
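The dynamic-programming framework described above can be sketched numerically. The toy model below is an invented discretisation, not the paper's: surplus moves on an integer grid, the insurer chooses a proportional retention level each period, and value iteration converges to the minimal ruin probability (Howard improvement would instead alternate policy evaluation with greedy policy improvement).

```python
import numpy as np

# Hypothetical toy discretisation: surplus levels 0..N, a small claim
# distribution, and proportional-reinsurance retention levels to choose from.
N = 50                                # surplus grid; level N treated as "safe"
claims = {0: 0.6, 2: 0.3, 6: 0.1}     # claim size -> probability (toy numbers)
premium = 1                           # premium income per period at full retention
retentions = [0.5, 1.0]               # fraction of each claim (and premium) kept

def step_ruin(V):
    """One application of the dynamic-programming operator to the ruin-probability vector V."""
    Vnew = np.zeros(N + 1)
    for x in range(N):                # state N is absorbing with ruin probability 0
        best = 1.0
        for a in retentions:
            p_ruin = 0.0
            for y, p in claims.items():
                nxt = x + a * premium - a * y   # retained premium minus retained claim
                if nxt < 0:
                    p_ruin += p                  # ruin event
                else:
                    p_ruin += p * V[min(int(nxt), N)]
            best = min(best, p_ruin)             # the control minimising ruin probability
        Vnew[x] = best
    return Vnew

V = np.ones(N + 1)
V[N] = 0.0
for _ in range(500):                  # iterate to (approximate) fixed point
    V = step_ruin(V)
print("ruin probability from surplus 5:", round(V[5], 4))
```

The fixed point of this operator is the minimal ruin probability; the verification theorem mentioned in the abstract is what justifies reading the fixed point as the value of the control problem.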

Implementing molecular connectivity theory, a basic tool in modeling drugs

Lionello Pogliani
Abstract The concepts of chain graph, general graph, and complete graph have been used to implement the graph framework of molecular connectivity (MC) theory. Some concepts of this theory have been addressed using "external" theoretical concepts belonging mostly to quantum or structural chemistry, with no direct counterpart in graph theory. Thus, while the concept of chain graph can be used to tackle the cis-trans isomerism problem, the concept of pseudograph, or general graph, can be used to tackle the description of the σ-, π-, and nonbonding n-electrons. The concept of complete graph can instead be used to tackle the electron core problem of the atoms of a molecule. Graph concepts can also be used to tackle the problem of the hydrogen contribution in hydrogen-depleted graphs, which are encoded with the aid of a perturbation parameter that differentiates between compounds with similar hydrogen-suppressed chemical graphs, such as the graphs of CH3F and BH2F. These concepts have allowed the redesign of a central parameter of MC theory, the valence delta, giving MC indices with improved model quality, as exemplified here with different properties for each treated topic. © 2007 Wiley-Liss, Inc. and the American Pharmacists Association J Pharm Sci 96:1856–1871, 2007 [source]
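As a concrete, much-simplified illustration of a molecular connectivity index, the sketch below computes the classical first-order index (1-chi) from a hydrogen-suppressed graph using plain vertex degrees as delta values; the valence-delta refinements the abstract describes are not modelled here.

```python
from collections import Counter

# Hydrogen-suppressed carbon skeleton of isobutane, C(CH3)3:
# vertex 1 is the central carbon, bonded to vertices 0, 2, 3.
edges_isobutane = [(0, 1), (1, 2), (1, 3)]

def chi1(edges):
    """First-order connectivity index: sum over edges of (delta_i * delta_j)**-0.5,
    where delta is the simple vertex degree."""
    deg = Counter()
    for i, j in edges:
        deg[i] += 1
        deg[j] += 1
    return sum((deg[i] * deg[j]) ** -0.5 for i, j in edges)

# Each edge joins a degree-3 and a degree-1 vertex, so 1-chi = 3/sqrt(3) = sqrt(3).
print(round(chi1(edges_isobutane), 4))
```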

Unsaturated incompressible flows in adsorbing porous media

A. Fasano
We study a free boundary problem modelling the penetration of a liquid through a porous material in the presence of absorbing granules. The geometry is one-dimensional. The early stage of penetration is considered, when the flow is unsaturated. Since the hydraulic conductivity depends both on saturation and on porosity, and the latter changes due to absorption, the main coefficient in the flow equation depends on the free boundary and on the history of the process. Some results were obtained in Fasano (Math. Meth. Appl. Sci. 1999; 22:605) for a simplified version of the model. Here existence and uniqueness are proved in a class of weighted Hölder spaces in a more general situation. Basic tools are the estimates on a non-standard linear boundary value problem for the heat equation in an initially degenerate domain (Rend. Mat. Acc. Lincei 2002; 13:23). Copyright © 2003 John Wiley & Sons, Ltd. [source]


ARCHAEOMETRY, Issue 3 2009
Geographical Information Systems (GIS) are being incorporated into archaeology as a technique to improve the understanding of spatial organization and the relationships among finds within specific areas. Although their use as a basic tool in predicting the location of archaeological sites or in assessing the extent of their catchment areas is relatively common, they have in general been applied less often to the study of the spatial distribution of archaeological remains within individual deposits, and in particular to faunal assemblages. Despite this, they can prove essential to fully understanding dispersion and grouping patterns within deposits, and, together with various correlation analytical techniques, they provide valuable information about the economic organization of settlements and inhabitant lifeways. To demonstrate the potential of this methodology, a zooarchaeological GIS has been prepared for the Middle and Late Magdalenian and Azilian layers in El Mirón Cave (eastern Cantabria, Spain), and the spatial distribution patterns of various attributes of the archaeological record have been analysed. Significant conclusions in terms of type and duration of human occupation have been drawn. [source]
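One simple clustering-versus-dispersion statistic of the kind such an intra-site spatial analysis might use is the Clark-Evans nearest-neighbour ratio. The sketch below applies it to invented find coordinates (no edge correction, which a real analysis would need): R well below 1 suggests clustering, R near 1 spatial randomness.

```python
import math

# Hypothetical find coordinates (metres) within a 2 m x 2 m excavation unit;
# the two tight groups are deliberate, to produce a clustered pattern.
finds = [(0.2, 0.3), (0.25, 0.35), (0.3, 0.3), (1.8, 1.7), (1.75, 1.65)]
area = 2.0 * 2.0

def clark_evans(points, area):
    """Ratio of observed mean nearest-neighbour distance to the mean expected
    under complete spatial randomness (no edge correction)."""
    n = len(points)
    mean_nn = sum(
        min(math.dist(p, q) for q in points if q is not p) for p in points
    ) / n
    expected = 0.5 / math.sqrt(n / area)
    return mean_nn / expected

print("R =", round(clark_evans(finds, area), 3))  # well below 1: clustered
```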

Assessing the health of European rivers using functional ecological guilds of fish communities: standardising species classification and approaches to metric selection

Abstract The functional ecological guild approach is the cornerstone for the development of Indices of Biotic Integrity and multi-metric indices to assess the ecological status of aquatic systems. These indices combine metrics (unit-specific measures of a functional component of the fish community known to respond to degradation) into a single measure of ecological assessment. The guild approach provides an operational unit linking individual species characteristics with the community as a whole. Species are grouped into guilds based on some degree of overlap in their ecological niches, regardless of taxonomic relationships. Although European fish species have been classified into ecological guilds, the classification has not been standardised Europe-wide, nor carried out with the aim of defining guilds from which metrics can be developed for ecological assessment purposes. This paper examines the approach used by the EU project FAME to classify European fish species into consistent ecological guilds and to identify suitable metrics as basic tools for the development of a standardised ecological assessment method for European rivers to meet the requirements of the Water Framework Directive. [source]
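A minimal sketch of the multi-metric idea, with invented metric names rather than FAME's actual metrics: each guild-based metric is normalised against a reference expectation and the scores are averaged into a single index value.

```python
# Invented reference expectations and a degraded-site observation; all three
# metrics are assumed to decrease with degradation.
reference = {"pct_intolerant": 40.0, "n_lithophilic_spp": 6, "pct_invertivores": 55.0}
observed  = {"pct_intolerant": 12.0, "n_lithophilic_spp": 2, "pct_invertivores": 30.0}

def multimetric_index(obs, ref):
    """Score each metric as observed/reference capped at 1, then average the
    scores into a single 0-1 index (1 = reference condition)."""
    scores = [min(obs[m] / ref[m], 1.0) for m in ref]
    return sum(scores) / len(scores)

print(round(multimetric_index(observed, reference), 3))
```

Real indices use more careful scoring (trisection, site-type-specific references, metrics that increase with degradation), but the combination step is essentially this averaging.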

Predicting avian patch occupancy in a fragmented landscape: do we know more than we think?

Danielle F. Shanahan
Summary
1. A recent and controversial topic in landscape ecology is whether populations of species respond to habitat fragmentation in a general fashion. Empirical research has provided mixed support, resulting in controversy about the use of general rules in landscape management. Rather than simply assessing post hoc whether individual species follow such rules, a priori testing could shed light on their accuracy and utility for predicting species response to landscape change.
2. We aim to create an a priori model that predicts the presence or absence of multiple species in habitat patches. Our goal is to balance general theory with relevant species life-history traits to obtain high prediction accuracy. To increase the utility of this work, we aim to use accessible methods that can be applied using readily available, inexpensive resources.
3. The classification tree patch-occupancy model we create for birds is based on habitat suitability, minimum area requirements, dispersal potential of each species and overall landscape connectivity.
4. To test our model we apply it to the South East Queensland region, Australia, for 17 bird species with varying dispersal potential and habitat specialization. We test the accuracy of our predictions using presence/absence information for 55 vegetation patches.
5. Overall we achieve a Cohen's kappa of 0·33, or 'fair' agreement between the model predictions and test data sets, and generally a very high level of absence prediction accuracy. Habitat specialization appeared to influence the accuracy of the model for different species.
6. We also compare the a priori model to the statistically derived model for each species. Although this 'optimal model' generally differed from our original predictive model, the process revealed ways in which it could be improved for future attempts.
7. Synthesis and applications. Our study demonstrates that ecological generalizations alongside basic resources (a vegetation map and some species-specific information) can provide conservative accuracy for predicting species occupancy in remnant vegetation patches. We show that the process of testing and developing models based on general rules could provide basic tools for conservation managers to understand the impact of current or planned landscape change on wildlife populations. [source]
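The agreement statistic this study reports, Cohen's kappa, is straightforward to compute from paired prediction and observation vectors. The vectors below are invented toy data, not the study's 55 patches.

```python
# Toy presence (1) / absence (0) vectors: model predictions vs field observations.
pred = [1, 1, 0, 0, 0, 1, 0, 0, 1, 0]
obs  = [1, 0, 0, 0, 1, 1, 0, 0, 0, 0]

def cohens_kappa(a, b):
    """Chance-corrected agreement between two binary ratings of the same items."""
    n = len(a)
    po = sum(x == y for x, y in zip(a, b)) / n      # observed agreement
    p1a, p1b = sum(a) / n, sum(b) / n               # marginal presence rates
    pe = p1a * p1b + (1 - p1a) * (1 - p1b)          # agreement expected by chance
    return (po - pe) / (1 - pe)

print(round(cohens_kappa(pred, obs), 3))
```

Values around 0.2 to 0.4 are conventionally labelled 'fair' agreement, which is how the study characterises its kappa of 0·33.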

Bayesian statistics in medical research: an intuitive alternative to conventional data analysis

Lyle C. Gurrin BSc (Hons), AStat
Summary Statistical analysis of both experimental and observational data is central to medical research. Unfortunately, the process of conventional statistical analysis is poorly understood by many medical scientists. This is due, in part, to the counter-intuitive nature of the basic tools of traditional (frequency-based) statistical inference. For example, the proper definition of a conventional 95% confidence interval is quite confusing. It is based upon the imaginary results of a series of hypothetical repetitions of the data generation process and subsequent analysis. Not surprisingly, this formal definition is often ignored and a 95% confidence interval is widely taken to represent a range of values that is associated with a 95% probability of containing the true value of the parameter being estimated. Working within the traditional framework of frequency-based statistics, this interpretation is fundamentally incorrect. It is perfectly valid, however, if one works within the framework of Bayesian statistics and assumes a 'prior distribution' that is uniform on the scale of the main outcome variable. This reflects a limited equivalence between conventional and Bayesian statistics that can be used to facilitate a simple Bayesian interpretation based on the results of a standard analysis. Such inferences provide direct and understandable answers to many important types of question in medical research. For example, they can be used to assist decision making based upon studies with unavoidably low statistical power, where non-significant results are all too often, and wrongly, interpreted as implying 'no effect'. They can also be used to overcome the confusion that can result when statistically significant effects are too small to be clinically relevant.
This paper describes the theoretical basis of the Bayesian-based approach and illustrates its application with a practical example that investigates the prevalence of major cardiac defects in a cohort of children born using the assisted reproduction technique known as ICSI (intracytoplasmic sperm injection). [source]
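The limited equivalence described above can be illustrated numerically: with a prior that is uniform on the outcome scale, the posterior for the parameter is approximately Normal(estimate, SE), so a conventional 95% interval can be read as a 95% credible interval and direct probability statements become available. The numbers below are invented for illustration, not taken from the ICSI cohort.

```python
from statistics import NormalDist

# Invented summary statistics from a hypothetical low-power study.
estimate, se = 1.8, 1.2                  # e.g. an estimated risk difference and its SE
posterior = NormalDist(mu=estimate, sigma=se)   # posterior under a uniform prior

ci = (estimate - 1.96 * se, estimate + 1.96 * se)   # conventional 95% interval
p_positive = 1 - posterior.cdf(0.0)                 # Pr(effect > 0 | data)

print("95% interval:", tuple(round(x, 2) for x in ci))
print("posterior Pr(effect > 0):", round(p_positive, 3))
```

Here the 95% interval crosses zero, so a conventional analysis would report "non-significant"; yet the posterior probability of a positive effect is about 0.93, exactly the kind of direct statement the abstract argues is more useful in low-power settings.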

Tandem mass spectrometry of synthetic polymers

Anna C. Crecelius
Abstract The detailed characterization of macromolecules plays an important role for synthetic chemists in defining and specifying the structure and properties of successfully synthesized polymers. The search for new characterization techniques for polymers is essential to the continued development of improved synthesis methods. The application of tandem mass spectrometry to the detailed characterization of synthetic polymers, using the soft ionization techniques matrix-assisted laser desorption/ionization mass spectrometry (MALDI-MS) and electrospray ionization mass spectrometry (ESI-MS), which have become basic tools in proteomics, has increased greatly in recent years and is summarized in this perspective. Examples of a variety of homopolymers, such as poly(methyl methacrylate) and poly(ethylene glycol), as well as copolymers, e.g. copolyesters, are given. The advanced mass spectrometric techniques described in this review will presumably become basic tools in polymer chemistry in the near future. Copyright © 2009 John Wiley & Sons, Ltd. [source]
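For a homopolymer, the peak series such spectra produce is easy to predict: m/z = n * M(repeat) + M(end groups) + M(cation) for singly charged ions. The sketch below uses the monoisotopic repeat mass of methyl methacrylate; the H/H end groups and Na+ cationisation are assumptions for illustration, not taken from the review.

```python
# Monoisotopic masses (Da): MMA repeat unit C5H8O2, assumed H + H end groups,
# and the Na+ cation commonly used in MALDI of PMMA.
M_repeat = 100.0524
M_ends = 2 * 1.00783
M_Na = 22.9898

def pmma_mz(n):
    """Expected m/z of the singly charged sodiated n-mer of this hypothetical PMMA."""
    return n * M_repeat + M_ends + M_Na

series = [round(pmma_mz(n), 2) for n in range(10, 13)]
print(series)  # consecutive oligomer peaks, spaced by the repeat mass
```

Matching an observed peak spacing to a repeat mass, and the residual offset to end groups plus cation, is the standard first step before tandem MS fragments individual oligomer ions.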

Further results on the asymptotic growth of entire solutions of iterated Dirac equations in ℝ^n

D. Constales
Abstract In this paper, we establish some further results on the asymptotic growth behaviour of entire solutions to iterated Dirac equations in ℝ^n. Solutions to this type of system of partial differential equations are often called polymonogenic or k-monogenic. In the particular cases where k is even, one deals with polyharmonic functions. These are of central importance for a number of concrete problems arising in engineering and physics, such as the biharmonic equation for the description of the stream function in the Stokes flow regime with low Reynolds numbers, and elasticity problems in plates. The asymptotic study that we perform within the context of these PDEs starts from the Taylor series representation of their solutions. Generalizations of the maximum term and the central index serve as basic tools in our analysis. By applying these tools we then establish explicit asymptotic relations between the growth behaviour of polymonogenic functions, the growth behaviour of their iterated radial derivatives and that of the functions obtained by applying iterations of the Dirac operator to them. Copyright © 2005 John Wiley & Sons, Ltd. [source]
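The two growth tools named above have simple computational analogues for a truncated power series: the maximum term mu(r) = max over n of |a_n| r^n, and the central index nu(r), the index at which that maximum is attained. The sketch below evaluates both for exp(z), where a_n = 1/n!; it illustrates the classical scalar notions, not the Clifford-algebra generalizations the paper develops.

```python
from math import factorial

def max_term_and_central_index(coeffs, r):
    """Return (mu(r), nu(r)) for the power series with coefficients coeffs[n],
    truncated at len(coeffs) terms."""
    terms = [abs(a) * r**n for n, a in enumerate(coeffs)]
    nu = max(range(len(terms)), key=terms.__getitem__)  # index of the largest term
    return terms[nu], nu

coeffs = [1 / factorial(n) for n in range(60)]  # Taylor coefficients of exp(z)
mu, nu = max_term_and_central_index(coeffs, 10.0)
print(nu)  # for exp, nu(r) is roughly floor(r): terms grow while r/n > 1
```

For entire functions, the growth of mu(r) and nu(r) as r increases encodes the order and type of the function, which is why these quantities drive asymptotic growth estimates of the kind the abstract describes.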

Mega thinking and planning: An introduction to defining and delivering individual and organizational success

Roger Kaufman
All organizations are means to societal ends, and thus Mega thinking and planning starts with a primary focus on adding value for all stakeholders, including our shared society. It is pragmatic, realistic, practical, and ethical. Defining and achieving continual organizational success is possible. It relies on three basic elements: (1) a societal value-added "frame of mind" (your perspective and commitment about your organization, people, and our shared world), (2) shared determination and agreement on where to head and why (everyone who can or might be affected by the shared objectives must agree on purposes and results criteria), and (3) pragmatic and basic tools. The article presents the basic concepts for thinking and planning Mega to define and deliver value to internal and external partners, defining and delivering individual and organizational success. [source]