Ultimate Aim (ultimate + aim)


Selected Abstracts


Inversion of time-dependent nuclear well-logging data using neural networks

GEOPHYSICAL PROSPECTING, Issue 1 2008
Laura Carmine
ABSTRACT The purpose of this work was to investigate a new and fast inversion methodology for the prediction of subsurface formation properties such as porosity, salinity and oil saturation, using time-dependent nuclear well-logging data. Although the ultimate aim is to apply the technique to real-field data, an initial investigation, as described in this paper, was first required; this has been carried out using simulation results from the time-dependent radiation transport problem within a borehole. Simulated neutron and γ-ray fluxes at two sodium iodide (NaI) detectors, one near and one far from a pulsed neutron source emitting at ~14 MeV, were used for the investigation. A total of 67 energy groups from the BUGLE96 cross-section library together with 567 property combinations were employed to generate the original flux responses, achieved by solving numerically the time-dependent Boltzmann radiation transport equation in its even-parity form. Material property combinations (scenarios) and their corresponding target outputs (flux responses at the detectors) are used to train the artificial neural networks (ANNs), and test data are used to assess the accuracy of the ANNs. The trained networks are then used to produce a surrogate for the forward model, which is expensive in terms of computational time and resources; a simple inversion method is then applied to the surrogate to calculate material properties from the time evolution of the flux responses at the two detectors. The fast surrogate model comprises 8026 artificial neural networks, each consisting of an input layer with three input units (neurons) for porosity, salinity and oil saturation, two hidden layers, and one output neuron representing the scalar photon or neutron flux prediction at the detector. This is the first time this technique has been applied to invert pulsed neutron logging tool data, and the results produced are very promising. The next step in the procedure is to apply the methodology to real data. [source]
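
The surrogate-plus-inversion idea can be illustrated compactly. The sketch below (Python with scikit-learn; not the authors' code) trains one small two-hidden-layer network per time bin of a toy forward model and then recovers the three properties by a grid search over the surrogate. The toy physics, network sizes, grid resolution and the use of one network per time bin are illustrative assumptions standing in for the 8026-network surrogate described above.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

def toy_forward(props, t):
    """Stand-in for the expensive transport solve: detector flux vs. time."""
    porosity, salinity, oil_sat = props
    decay = 1.0 + 2.0 * porosity + 0.5 * salinity - 0.8 * oil_sat
    return np.exp(-decay * t)

t_bins = np.linspace(0.1, 2.0, 20)               # time evolution of the response
X_train = rng.uniform(0.0, 1.0, size=(567, 3))   # property combinations (scenarios)
Y_train = np.array([toy_forward(p, t_bins) for p in X_train])

# One small two-hidden-layer network per time bin stands in for the paper's
# bank of per-response ANNs (3 inputs, two hidden layers, 1 output neuron).
surrogates = [
    MLPRegressor(hidden_layer_sizes=(8, 8), max_iter=3000, random_state=0)
    .fit(X_train, Y_train[:, k])
    for k in range(len(t_bins))
]

# Simple inversion: grid search for the property triple whose surrogate
# response best matches an "observed" flux history.
true_props = np.array([0.25, 0.6, 0.4])
observed = toy_forward(true_props, t_bins)

g = np.linspace(0.0, 1.0, 21)
grid = np.array(np.meshgrid(g, g, g)).reshape(3, -1).T          # all combinations
pred = np.column_stack([net.predict(grid) for net in surrogates])
misfit = np.sum((pred - observed) ** 2, axis=1)
print("recovered (porosity, salinity, oil saturation):", grid[np.argmin(misfit)])
```

Once trained, the surrogate makes the grid search cheap, which is the point of replacing the transport solver for the inversion step.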


Energy group optimization for forward and inverse problems in nuclear engineering: application to down-well logging problems

GEOPHYSICAL PROSPECTING, Issue 2 2006
Elsa Aristodemou
ABSTRACT Simulating radiation transport of neutral particles (neutrons and γ-ray photons) within subsurface formations has been an area of research in the nuclear well-logging community since the 1960s, with many researchers exploiting existing computational tools already available within the nuclear reactor community. Deterministic codes became a popular tool, with the radiation transport equation being solved using a discretization of the phase-space of the problem (energy, angle, space and time). The energy discretization in such codes is based on the multigroup approximation, or equivalently the discrete finite-difference energy approximation. One of the uncertainties in simulating radiation transport problems has therefore been the choice of multigroup energy structure. The nuclear reactor community has tackled the problem by optimizing existing nuclear cross-section libraries using a variety of group-collapsing codes, whilst the nuclear well-logging community has relied, until now, on libraries developed for the nuclear reactor community. However, although the utilization of such libraries has been extremely useful in the past, it has also become clear that a larger number of energy groups is available than is necessary for well-logging problems. It was obvious, therefore, that a multigroup energy structure specific to the needs of the nuclear well-logging community needed to be established. This would have the benefit of reducing computational time (the ultimate aim of this work) for both the stochastic and deterministic calculations, since computational time increases with the number of energy groups. We therefore present in this study two methodologies that enable the optimization of any multigroup neutron–γ energy structure. Although we test our theoretical approaches on nuclear well-logging synthetic data, the methodologies can be applied to other radiation transport problems that use the multigroup energy approximation. The first approach considers the effect of collapsing the neutron groups by solving the forward transport problem directly using the deterministic code EVENT, and obtaining neutron and γ-ray fluxes deterministically for the different group-collapsing options. The best collapsing option is chosen as the one which minimizes the effect on the γ-ray spectrum. In this methodology, parallel processing is implemented to reduce computational times. The second approach uses the uncollapsed output from neural network simulations in order to estimate the new, collapsed fluxes for the different collapsing cases. Subsequently, an inversion technique is used to calculate the properties of the subsurface based on the collapsed fluxes. The best collapsing option is chosen as the one that predicts the subsurface properties with minimal error. The fundamental difference between the two methodologies relates to their treatment of the generated γ-rays. The first methodology takes the generation of γ-rays fully into account by solving the transport equation directly. The second methodology assumes that the reduction of the neutron groups has no effect on the γ-ray fluxes. It does, however, utilize an inversion scheme to predict the subsurface properties reliably, and it looks at the effect of collapsing the neutron groups on these predictions. Although the second procedure is favoured because of (a) the speed with which a solution can be obtained and (b) the application of an inversion scheme, its results need to be validated against a physically more stringent methodology.
A comparison of the two methodologies is therefore given. [source]
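
The core operation in both approaches, collapsing a fine multigroup structure and judging a collapsing option by the error it introduces, can be sketched briefly. The toy Python example below is not either paper's code and is no substitute for the EVENT solver: it flux-weights a fine-group cross-section set into candidate coarse structures and ranks the candidates by the reaction-rate error incurred on a second spectrum. The 67-group count echoes BUGLE96, but all data and candidate structures are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
n_fine = 67                       # mirrors the BUGLE96 neutron group count (toy data below)
flux = rng.random(n_fine)         # fine-group scalar flux (reference spectrum, toy)
sigma = rng.random(n_fine)        # fine-group cross section (toy)
other_flux = rng.random(n_fine)   # a second spectrum, e.g. at a different detector (toy)

def collapse(boundaries):
    """Collapse fine groups into coarse groups at the given index boundaries.

    The coarse flux is the sum of the fine-group fluxes; the coarse cross
    section is the flux-weighted average, which preserves the reaction rate
    of the reference spectrum within each coarse group.
    """
    coarse_flux, coarse_sigma = [], []
    edges = [0] + list(boundaries) + [n_fine]
    for lo, hi in zip(edges[:-1], edges[1:]):
        phi = flux[lo:hi].sum()
        coarse_flux.append(phi)
        coarse_sigma.append((flux[lo:hi] * sigma[lo:hi]).sum() / phi)
    return np.array(coarse_flux), np.array(coarse_sigma)

def score(boundaries):
    """Reaction-rate error when a *different* spectrum is evaluated with cross
    sections that were collapsed against the reference spectrum."""
    _, sig_c = collapse(boundaries)
    edges = [0] + list(boundaries) + [n_fine]
    rate_fine = (other_flux * sigma).sum()
    rate_coarse = sum(other_flux[lo:hi].sum() * s
                      for (lo, hi), s in zip(zip(edges[:-1], edges[1:]), sig_c))
    return abs(rate_coarse - rate_fine)

# Compare a few candidate coarse-group structures and keep the best one.
options = {
    "8 groups": [5, 12, 20, 30, 40, 50, 60],
    "4 groups": [15, 35, 55],
    "2 groups": [30],
}
for name, b in options.items():
    print(f"{name}: reaction-rate error = {score(b):.3e}")
print("preferred structure:", min(options, key=lambda k: score(options[k])))
```

The scoring criterion here is deliberately simple; the papers score options against the γ-ray spectrum or against inverted subsurface properties, but the ranking logic is the same.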


Doctrine and fairness in the law of contract*

LEGAL STUDIES, Issue 4 2009
Andrew Phang
This paper explores, through illustrations from the law of contract, the important central theme to the effect that the rules and principles, which constitute the doctrine of the law, are not ends in themselves but are, rather, the means through which the courts arrive at substantively fair outcomes in the cases before them. The paper focuses on the concept of 'radicalism', which relates to the point at which the courts decide that it is legally permissible to hold that a contract should come to an end because a radical or fundamental 'legal tipping point' has not only been arrived at but has, in fact, been crossed. It explores the role of this concept as embodied in the doctrines of frustration, common mistake, discharge by breach, as well as fundamental breach in the context of exception clauses, in particular how 'radicalism' with regard to these doctrines can be viewed from the (integrated) perspectives of structure, linkage and fairness. The paper also touches briefly on linkages amongst the doctrines of economic duress, undue influence and unconscionability, as well as the ultimate aim these doctrines share of achieving fair outcomes in the cases concerned. [source]


Advances on the compositional analysis of glycosphingolipids combining thin-layer chromatography with mass spectrometry

MASS SPECTROMETRY REVIEWS, Issue 3 2010
Johannes Müthing
Abstract Glycosphingolipids (GSLs), composed of a hydrophilic carbohydrate chain and a lipophilic ceramide anchor, play pivotal roles in countless biological processes, including infectious diseases and the development of cancer. Knowledge of the number and sequence of monosaccharides and their anomeric configuration and linkage type, which make up the principal items of the glyco code of biologically active carbohydrate chains, is essential for exploring the function of GSLs. As part of the investigation of the vertebrate glycome, GSL analysis is undergoing rapid expansion owing to the application of novel biochemical and biophysical technologies. Mass spectrometry (MS) takes part in the network of collaborations to further unravel structural and functional aspects within the fascinating world of GSLs, with the ultimate aim of better defining their role in human health and disease. However, a single-method analytical MS technique without supporting tools is limited, yielding only partial structural information. Because of its superior resolving power, robustness, and easy handling, high-performance thin-layer chromatography (TLC) is widely used as an invaluable tool in GSL analysis. The intention of this review is to give an insight into current advances obtained by coupling complementary techniques such as TLC and mass spectrometry. A retrospective view of the development of this concept and the recent improvements obtained by merging (1) TLC separation of GSLs, (2) their detection with oligosaccharide-specific proteins, and (3) in situ MS analysis of protein-detected GSLs directly on the TLC plate, are provided. The procedure works on a nanogram scale and was successfully applied to the identification of cancer-associated GSLs in several types of human tumors. The combination of these two complementary techniques opens new doors by delivering specific structural information on trace quantities of GSLs with only limited investment in sample preparation. © 2009 Wiley Periodicals, Inc. Mass Spec Rev 29:425-479, 2010 [source]


HOW TO ANALYZE IMMEDIATE EXPERIENCE: HINTIKKA, HUSSERL, AND THE IDEA OF PHENOMENOLOGY

METAPHILOSOPHY, Issue 3 2008
Abstract: This article discusses Jaakko Hintikka's interpretation of the aims and method of Husserl's phenomenology. I argue that Hintikka misrepresents Husserl's phenomenology on certain crucial points. More specifically, Hintikka misconstrues Husserl's notion of "immediate experience" and consequently fails to grasp the functions of the central methodological tools known as the "epoché" and the "phenomenological reduction." The result is that the conception of phenomenology he attributes to Husserl is very far from realizing the philosophical potential of Husserl's position. Hence if we want a fruitful rapprochement between analytical philosophy and Continental phenomenology of the kind that is Hintikka's ultimate aim, then Hintikka's account of Husserl needs correcting on a number of crucial points. [source]


The generation of nisin variants with enhanced activity against specific Gram-positive pathogens

MOLECULAR MICROBIOLOGY, Issue 1 2008
Des Field
Summary Nisin is the prototype of the lantibiotic group of antimicrobial peptides. It exhibits broad spectrum inhibition of Gram-positive bacteria including important food pathogens and clinically relevant antibiotic-resistant bacteria. Significantly, the gene-encoded nature of nisin means that it can be subjected to gene-based bioengineering to generate novel derivatives. Here, we take advantage of this to generate the largest bank of randomly mutated nisin derivatives reported to date, with the ultimate aim of identifying variants with enhanced bioactivity. This approach led to the identification of a nisin-producing strain with enhanced bioactivity against the mastitic pathogen Streptococcus agalactiae resulting from an amino acid change in the hinge region of the peptide (K22T). Prompted by this discovery, site-directed and site-saturation mutagenesis of the hinge region residues was employed, resulting in the identification of additional derivatives, most notably N20P, M21V and K22S, with enhanced bioactivity and specific activity against Gram-positive pathogens including Listeria monocytogenes and/or Staphylococcus aureus. The identification of these derivatives represents a major step forward in the bioengineering of nisin, and lantibiotics in general, and confirms that peptide engineering can deliver derivatives with enhanced antimicrobial activity against specific problematic spoilage and pathogenic microbes or against Gram-positive bacteria in general. [source]
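
The hinge-region variant space explored by site-saturation mutagenesis is small enough to enumerate directly. The Python sketch below lists every single-residue substitution at positions 20-22; it illustrates the variant space only, not the authors' mutagenesis or screening pipeline, and the nisin A sequence shown is the commonly cited mature peptide sequence, which should be checked against a primary reference before use.

```python
# Enumerate single-substitution site-saturation variants of the nisin hinge region.
AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"
NISIN_A = "ITSISLCTPGCKTGALMGCNMKTATCHCSIHVSK"   # mature 34-residue peptide (commonly cited)
HINGE_POSITIONS = (20, 21, 22)                    # N20, M21, K22 (1-based numbering)

def site_saturation(seq, positions):
    """Yield (name, variant_sequence) for every single substitution at the
    given 1-based positions, skipping the wild-type residue."""
    for pos in positions:
        wild_type = seq[pos - 1]
        for aa in AMINO_ACIDS:
            if aa == wild_type:
                continue
            name = f"{wild_type}{pos}{aa}"        # e.g. K22T, N20P, M21V
            variant = seq[:pos - 1] + aa + seq[pos:]
            yield name, variant

variants = dict(site_saturation(NISIN_A, HINGE_POSITIONS))
print(len(variants), "single-substitution hinge variants")   # 3 positions x 19 = 57
print("K22T ->", variants["K22T"])
```

Fifty-seven variants per hinge position set is a tractable screening panel, which is why site-saturation of these three residues followed naturally from the initial random-mutagenesis hit.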


Between Reason and Common Sense

PHILOSOPHICAL INVESTIGATIONS, Issue 2 2005
On the Very Idea of Necessary (though Unwarranted) Belief
This essay is intended as a companion-piece to my article, "Reality in Common Sense: Reflections on Realism and Anti-Realism from a 'Common Sense Naturalist' Perspective" (Philosophical Investigations, Vol. 25, No. 4, October 2002). It explores the epistemological dimension of the Common Sense Naturalism that I developed in that earlier, predominantly metaphysical essay; a position that combines the views of David Hume, Thomas Reid, and the Wittgenstein of On Certainty. My ultimate aim is to produce a comprehensive philosophy of common sense, one that, with future installments, will come to include an ethical and social-political philosophy as well. "Between Reason and Common Sense" offers a common sense naturalist reply to the skeptic. My basic argument is that the skeptic makes a Rylean category mistake when he applies the concept of warrant to epistemologically basic beliefs, such as the belief in the external world or in the continued and distinct existence of bodies. He misidentifies these beliefs as being ordinary, when they are really part of the framework that makes the practices of believing and justifying possible. As a result, they are not themselves open to confirmation or disconfirmation. I also try to characterize the nature of the necessity carried by framework beliefs, in a way that avoids the charge that the common sense naturalist is simply a closet foundationalist. [source]


Affinity reagent resources for human proteome detection: Initiatives and perspectives

PROTEINS: STRUCTURE, FUNCTION AND BIOINFORMATICS, Issue 16 2007
Oda Stoevesandt
Abstract Essential to the ambition of characterising fully the human proteome are systematic and comprehensive collections of specific affinity reagents directed against all human proteins, including splice variants and modifications. Although a large number of affinity reagents are available commercially, their quality is often questionable and only a fraction of the proteome is covered. In order for more targets to be examined, there is a need for broad availability of panels of affinity reagents, including binders against proteins of unknown functions. The most familiar affinity reagents are antibodies and their fragments, but engineered forms of protein scaffolds and nucleic acid aptamers with similar diversity and binding properties are becoming viable alternatives. Recent initiatives in Europe and the USA have been established to improve both the availability and quality of reagents for affinity proteomics, with the ultimate aim of creating standardised collections of well-validated binding molecules for proteome analysis. As well as coordinating affinity reagent production through existing resources and technology providers, these projects aim to benchmark key molecular entities, tools, and applications, and establish the bioinformatics framework and databases needed. The benefits of such reagent resources will be seen in basic research, medicine and the biotechnology and pharmaceutical industries. [source]


Nurturing knowledge: the UK Higher Education Links scheme

PUBLIC ADMINISTRATION & DEVELOPMENT, Issue 2 2003
Derek A. Eldridge
This article examines the development of academic networks and expertise through the UK Higher Education Links scheme, which is funded by the UK Department for International Development, managed by the British Council and supported by the principals of UK higher education institutions. The links are established between UK and overseas universities primarily to enhance research and/or teaching capacity, with the ultimate aim of alleviating poverty and promoting sustainable development. This article draws on data gathered for a large-scale, multiple-method evaluation which endorsed the scheme's continuation. It is argued that a crucial factor helping to make individual links a success was good relationships between respective co-ordinators, although the nature of these relationships varied. The article discusses the extent to which the formation of fruitful academic networks and partnerships enabling knowledge transfer was encouraged. Copyright © 2003 John Wiley & Sons, Ltd. [source]


The evolution of alternative morphs: density-dependent determination of larval colour dimorphism in a butterfly

BIOLOGICAL JOURNAL OF THE LINNEAN SOCIETY, Issue 2 2009
KARL GOTTHARD
Understanding the ultimate causes for the presence of polymorphisms within populations requires knowledge of how the expression of discrete morphs is regulated. In the present study, we explored the determination mechanism of a colour dimorphism in larvae of the butterfly Pararge xiphia (Satyrinae: Nymphalidae) with the ultimate aim of understanding its potential adaptive value. Last-instar larvae of P. xiphia develop into either a green or a brown morph, although all individuals are invariably green during the preceding three instars. A series of laboratory experiments reveals that morph development is strongly environmentally dependent and not the result of alternative alleles at one locus. Photoperiod, temperature, and in particular larval density, all influenced morph determination. The strong effect of a high larval density in inducing the brown morph parallels other known cases of density-dependent melanization in Lepidopteran larvae. Because melanization is often correlated with increased immune function, this type of determination mechanism is expected to be adaptive. However, the ecology and behaviour of P. xiphia larvae suggest that increased camouflage under high-density conditions may be an additional adaptive explanation. We conclude that the colour dimorphism of P. xiphia larvae is determined by a developmental threshold that is influenced both by heredity and by environmental conditions, and that selection for increased immune function and camouflage under high-density conditions may be responsible for maintaining the dimorphism. © 2009 The Linnean Society of London, Biological Journal of the Linnean Society, 2009, 98, 256–266. [source]
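
The kind of determination mechanism proposed, a developmental threshold influenced by both heredity and environment, can be captured in a minimal liability-threshold sketch. The Python example below is illustrative only: the coefficients for density, photoperiod and temperature are assumptions, not estimates from the study, but it shows how the brown-morph frequency rises with larval density under such a model.

```python
import numpy as np

rng = np.random.default_rng(42)

def morph(genetic_value, density, day_length_h, temperature_c, threshold=1.0):
    """Return 'brown' or 'green' for one larva under a toy liability-threshold model."""
    liability = (
        genetic_value                      # heritable component
        + 0.6 * np.log1p(density)          # crowding pushes towards brown
        - 0.05 * (day_length_h - 12.0)     # longer days push towards green
        - 0.03 * (temperature_c - 20.0)    # warmth pushes towards green
        + rng.normal(0.0, 0.2)             # developmental noise
    )
    return "brown" if liability > threshold else "green"

# In this toy model, the fraction of brown morphs rises with larval density.
for density in (1, 5, 20):
    morphs = [morph(rng.normal(0.0, 0.3), density, 14.0, 22.0) for _ in range(1000)]
    print(f"density {density:>2}: {morphs.count('brown') / 1000:.2f} brown")
```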