Make Use (make + use)


Selected Abstracts


Multiversion concurrency control for the generalized search tree

CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 12 2009
Walter Binder
Abstract Many read-intensive systems where fast access to data is more important than the rate at which data can change make use of multidimensional index structures, like the generalized search tree (GiST). Although in these systems the indexed data are rarely updated and read access is highly concurrent, the existing concurrency control mechanisms for multidimensional index structures are based on locking techniques, which cause significant overhead. In this article we present the multiversion-GiST (MVGiST), an in-memory mechanism that extends the GiST with multiversion concurrency control. The MVGiST enables lock-free read access and ensures a consistent view of the index structure throughout a reader's series of queries, by creating lightweight, read-only versions of the GiST that share unchanging nodes among themselves. An example of a system with high read to write ratio, where providing wait-free queries is of utmost importance, is a large-scale directory that indexes web services according to their input and output parameters. A performance evaluation shows that for low update rates, the MVGiST significantly improves scalability w.r.t. the number of concurrent read accesses when compared with a traditional, locking-based concurrency control mechanism. We propose a technique to control memory consumption and confirm through our evaluation that the MVGiST efficiently manages memory. Copyright © 2009 John Wiley & Sons, Ltd. [source]
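The version-sharing mechanism described here, read-only snapshots that share unchanging nodes with the live tree, is essentially path-copying (copy-on-write). A minimal Python sketch of that building block, with illustrative names and structure rather than the authors' implementation:

```python
# Illustrative path-copying in the spirit of the MVGiST: a writer copies
# only the nodes on the modified path; readers pin an immutable root and
# never take locks. Names and structure are hypothetical.

class Node:
    def __init__(self, keys, children=None):
        self.keys = keys                 # index entries
        self.children = children or []   # empty list marks a leaf

def choose_subtree(node, key):
    # Placeholder for the GiST penalty-based choice of subtree.
    return 0

def insert(root, key):
    """Return a NEW root; untouched subtrees are shared, not copied."""
    if not root.children:                # leaf: copy and extend
        return Node(sorted(root.keys + [key]))
    i = choose_subtree(root, key)
    new_child = insert(root.children[i], key)
    children = root.children[:i] + [new_child] + root.children[i + 1:]
    return Node(root.keys, children)     # copy along the path only

# A reader pins v1 and sees a consistent snapshot, lock-free:
v1 = Node([10, 20])
v2 = insert(v1, 15)   # v1 is untouched and still fully consistent
```

A writer publishes the new root atomically once the copied path is complete, so a reader's series of queries always runs against one consistent version.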


THE EARNINGS EFFECT OF EDUCATION AT COMMUNITY COLLEGES

CONTEMPORARY ECONOMIC POLICY, Issue 1 2010
DAVE E. MARCOTTE
In this paper, I make use of data from the 2000 follow-up of the National Education Longitudinal Survey postsecondary education transcript files to extend what is known about the value of education at community colleges. I examine the effects of enrollment in community colleges on students' subsequent earnings. I estimate the effects of credits earned separately from credentials because community colleges are often used as a means for students to engage in study not necessarily leading to a degree or certificate. I find consistent evidence of wage and salary effects of both credits and degrees, especially for women. There is no substantial evidence that enrollment in vocational rather than academic coursework has a particularly beneficial effect, however. (JEL I2, J24) [source]


A linguistic interpretation of Welford's hijack hypothesis

CORPORATE SOCIAL RESPONSIBILITY AND ENVIRONMENTAL MANAGEMENT, Issue 2 2010
Mark Brown
Abstract This paper offers a linguistic reinterpretation of Welford's 1997 hijack hypothesis, arguing that the hijack of the discourse of the radical environment is simply a process of appropriation, i.e., the adoption of particular words in order to make use of them within the green corporations' own frames of experience. Results are presented from an empirical study using two large 'databases' of language. These are electronic collections of texts taken from British environmental organizations (the radical non-governmental organizations, NGOs) and from UK corporations that wish to be environmentally friendly (green business). The results show that there are very marked differences in the physical contextualization of a selection of words which are used by both the radical NGOs and green business. The paper concludes by noting the need to take the analysis a stage further by comparing the usage of particular words by the two discourse communities. Copyright © 2010 John Wiley & Sons, Ltd and ERP Environment. [source]


Genetics of anxiety disorders: the complex road from DSM to DNA

DEPRESSION AND ANXIETY, Issue 11 2009
Jordan W. Smoller M.D. Sc.D.
Abstract Anxiety disorders are among the most common psychiatric disorders, affecting one in four individuals over a lifetime. Although our understanding of the etiology of these disorders is incomplete, familial and genetic factors are established risk factors. However, identifying the specific causal genes has been difficult. Within the past several years, advances in molecular and statistical genetic methods have made the genetic dissection of complex disorders a feasible project. Here we provide an overview of these developments, with a focus on their implications for genetic studies of anxiety disorders. Although the genetic and phenotypic complexity of the anxiety disorders presents formidable challenges, advances in neuroimaging and experimental animal models of anxiety and fear offer important opportunities for discovery. Real progress in identifying the genetic basis of anxiety disorders will require integrative approaches that make use of these biologic tools as well as larger-scale genomic studies. If successful, such efforts may yield novel and more effective approaches for the prevention and treatment of these common and costly disorders. Depression and Anxiety, 2009. © 2009 Wiley-Liss, Inc. [source]


Transnational Advocacy Networks and Affirmative Action for Dalits in India

DEVELOPMENT AND CHANGE, Issue 2 2008
Jens Lerche
ABSTRACT In India, movements and parties representing the lowest ranking dalit caste groups have followed different strategies in their struggle against social, economic and cultural discrimination. In this article, a new dalit movement making use of a 'transnational advocacy network strategy' will be compared to a more 'classical' dalit political party. The main policy target for the new movement is an extension of existing affirmative action policies, while the dalit BSP party focuses more on emancipatory issues. Based on an analysis of the impacts of the BSP and of the new movement at the grassroots level, it is argued that the achievements of the new movement are tempered by the fact that in order to make use of international discourses and political pressure, the movement has had to develop a strategy and policy proposals compatible with existing mainstream neoliberal discourses. This depoliticizes the policies, and hence makes them of less importance strategically. It is argued that this is likely to be a difficulty for transnational advocacy networks in general. [source]


Experiences with integrated impact assessment: empirical evidence from a survey in three European member states

ENVIRONMENTAL POLICY AND GOVERNANCE, Issue 5 2009
Martin Achtnicht
Abstract The paper contributes to the discussion on the use of methods and quantification in regulatory impact assessment. We investigate whether there are differences between the three dimensions of sustainable development in terms of the methodical efforts to assess potential impacts. Based on a survey in Germany, the Netherlands and the UK, we provide some evidence regarding these questions. We find that regulatory impact assessment is still biased towards assessing intended and mainly economic costs. There is a gap between the recommended use of methods in official guidance documents and the practice in member states. The reason for this gap can be seen in the existence of operational problems in practice such as lack of data or lack of tools. However, we find that the degree of sophistication of conducted IAs can be improved if responsible desk officers receive training or make use of guidance documents and receive support from co-ordination units. Copyright © 2009 John Wiley & Sons, Ltd and ERP Environment. [source]


Performance analysis of optically preamplified DC-coupled burst mode receivers

EUROPEAN TRANSACTIONS ON TELECOMMUNICATIONS, Issue 3 2009
T. J. Zuo
Bit error rate and threshold acquisition penalty evaluation is performed for an optically preamplified DC-coupled burst mode receiver using a moment generating function (MGF) description of the signal plus noise. The threshold itself is a random variable and is also described using an appropriate MGF. Chernoff bound (CB), modified Chernoff bound (MCB) and the saddle-point approximation (SPA) techniques make use of the MGF to provide the performance analyses. This represents the first time that these widely used approaches to receiver performance evaluation have been applied to an optically preamplified burst mode receiver and it is shown that they give threshold acquisition penalty results in good agreement with a prior existing approach, whilst having the facility to incorporate arbitrary receiver filtering, receiver thermal noise and non-ideal extinction ratio. A traditional Gaussian approximation (GA) is also calculated and comparison shows that it is clearly less accurate (it exceeds the upper bounds provided by CB and MCB) in the realistic cases examined. It is deduced, in common with the equivalent continuous mode analysis, that the MCB is the most sensible approach. Copyright © 2009 John Wiley & Sons, Ltd. [source]
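For orientation, the bounds named above can be written in their textbook forms in terms of the MGF M_Y(s) = E[e^{sY}] of the receiver decision variable Y; these are generic forms, not the paper's burst-mode expressions, which additionally treat the decision threshold as a random variable:

```latex
% Chernoff bound (CB) on the tail of the decision variable Y:
P(Y \ge a) \;\le\; \min_{s>0}\, e^{-sa}\, M_Y(s)

% Modified Chernoff bound (MCB), for receivers whose noise includes an
% additive Gaussian (thermal) component of variance \sigma^2:
P(Y \ge a) \;\le\; \min_{s>0}\, \frac{e^{-sa}\, M_Y(s)}{s\,\sigma\sqrt{2\pi}}
```

The saddle-point approximation evaluates the same MGF-based tail integral near its saddle point instead of bounding it, which is why all three techniques share the MGF description of signal plus noise.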


New concepts of microbial treatment processes for the nitrogen removal in wastewater

FEMS MICROBIOLOGY REVIEWS, Issue 4 2003
Ingo Schmidt
Abstract Many countries strive to reduce the emissions of nitrogen compounds (ammonia, nitrate, NOx) to the surface waters and the atmosphere. Since mainstream domestic wastewater treatment systems are usually already overloaded with ammonia, a dedicated nitrogen removal from concentrated secondary or industrial wastewaters is often more cost-effective than the disposal of such wastes to domestic wastewater treatment. The cost-effectiveness of separate treatment has increased dramatically in the past few years, since several processes for the biological removal of ammonia from concentrated waste streams have become available. Here, we review those processes that make use of new concepts in microbiology: partial nitrification, nitrifier denitrification and anaerobic ammonia oxidation (the anammox process). These processes target the removal of ammonia from gases, and ammonium-bicarbonate from concentrated wastewaters (i.e. sludge liquor and landfill leachate). The review addresses the microbiology of these processes, the consequences for their application, the current status of their application, and future developments. [source]
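The two core reactions exploited by these processes have well-established overall stoichiometries (standard microbiology, shown here for orientation rather than taken from the review itself):

```latex
% Partial nitrification: ammonium is oxidized only as far as nitrite.
\mathrm{NH_4^+} + 1.5\,\mathrm{O_2} \;\rightarrow\; \mathrm{NO_2^-} + \mathrm{H_2O} + 2\,\mathrm{H^+}

% Anammox: ammonium is oxidized anaerobically, with nitrite as the
% electron acceptor, yielding dinitrogen gas.
\mathrm{NH_4^+} + \mathrm{NO_2^-} \;\rightarrow\; \mathrm{N_2} + 2\,\mathrm{H_2O}
```

Pairing the two roughly halves the aeration demand and requires no organic carbon source, which underlies the cost-effectiveness argument made above.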


Guanine-Based Biogenic Photonic-Crystal Arrays in Fish and Spiders

ADVANCED FUNCTIONAL MATERIALS, Issue 2 2010
Avital Levy-Lior
Abstract Biological photonic systems composed of anhydrous guanine crystals evolved separately in several taxonomic groups. Here, two such systems found in fish and spiders, both of which make use of anhydrous guanine crystal plates to produce structural colors, are examined. Measurements of the photonic-crystal structures using cryo-SEM show that the crystal plates in both fish skin and spider integument are ~20 nm thick. The reflective unit in the fish comprises stacks of single plates alternating with ~230-nm-thick cytoplasm layers. In the spiders the plates are formed as doublet crystals, cemented by 30-nm layers of amorphous guanine, and are stacked with ~200 nm of cytoplasm between crystal doublets. They achieve light reflective properties through the control of crystal morphology and stack dimensions, reaching similar efficiencies of light reflectivity in both fish skin and spider integument. The structure of guanine plates in spiders is compared with the more common situation in which guanine occurs in the form of relatively unorganized prismatic crystals, yielding a matt white coloration. [source]
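As a rough consistency check on these dimensions, the first-order reflectance peak of an ideal two-component multilayer is λ ≈ 2(n_g d_g + n_c d_c). The refractive indices below are assumed values for anhydrous guanine and cytoplasm; they are not given in the abstract:

```latex
\lambda_{\max} \;\approx\; 2\,(n_g d_g + n_c d_c)
\;=\; 2\,(1.83 \times 20\,\mathrm{nm} + 1.33 \times 230\,\mathrm{nm})
\;\approx\; 685\,\mathrm{nm}
```

That is, the quoted plate and cytoplasm thicknesses put the reflectance peak squarely in the visible, consistent with structural coloration.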


A comparative analysis of the diving behaviour of birds and mammals

FUNCTIONAL ECOLOGY, Issue 5 2006
L. G. HALSEY
Summary
1. We use a large interspecific data set on diving variables for birds and mammals, and statistical techniques to control for the effects of phylogenetic non-independence, to assess evolutionary associations among different elements of diving behaviour across a broad and diverse range of diving species. Our aim is to assess whether the diving ability of homeothermic vertebrates is influenced by factors other than the physiology of the species.
2. Body mass is related to dive duration even when dive depth is controlled for, and thus, for a given dive depth, larger species dive for longer. This implies that larger species have a greater capacity for diving than is expressed in their dive depth. Larger animals that dive shallowly, probably for ecological reasons such as water depth, make use of the physiological advantage that their size confers by diving for longer.
3. Dive duration correlates with dive depth more strongly than with body mass. This confirms that some animals are poor divers for their body mass, either because of a lower physiological capacity or because their behaviour limits their diving.
4. Surface duration relates not only to dive duration but also to dive depth, as well as to both independently. This indicates a relationship between dive depth and surface duration controlling for dive duration, which suggests that deeper dives are energetically more expensive than shallow dives of the same duration.
5. Taxonomic class does not improve any of the dive variable models in the present study. There is thus an unsuspected consistency in the broad responses of different groups to the effects on diving of the environment, which are therefore general features of diving evolution. [source]


Comparison of methods to model the gravitational gradients from topographic data bases

GEOPHYSICAL JOURNAL INTERNATIONAL, Issue 3 2006
Christopher Jekeli
SUMMARY A number of methods have been developed over the last few decades to model the gravitational gradients using digital elevation data. All methods are based on second-order derivatives of the Newtonian mass integral for the gravitational potential. Foremost are algorithms that divide the topographic masses into prisms or more general polyhedra and sum the corresponding gradient contributions. Other methods are designed for computational speed and make use of the fast Fourier transform (FFT), require a regular rectangular grid of data, and yield gradients on the entire grid, but only at constant altitude. We add to these the ordinary numerical integration (in horizontal coordinates) of the gradient integrals. In total we compare two prism, two FFT and two ordinary numerical integration methods using 1″ elevation data in two topographic regimes (rough and moderate terrain). Prism methods depend on the type of finite elements that are generated with the elevation data; in particular, alternative triangulations can yield significant differences in the gradients (up to tens of Eötvös). The FFT methods depend on a series development of the topographic heights, requiring terms up to 14th order in rough terrain; and one popular method has significant bias errors (e.g. 13 Eötvös in the vertical-vertical gradient) embedded in its practical realization. The straightforward numerical integrations, whether on a rectangular or triangulated grid, yield sub-Eötvös differences in the gradients when compared to the other methods (except near the edges of the integration area) and they are as efficient computationally as the finite element methods. [source]
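All six methods discretize the same quantity: the second derivatives of the Newtonian potential integral. For a constant-density topographic mass Ω, this gradient tensor takes the standard form (included here for orientation):

```latex
\Gamma_{ij}(P) \;=\; G\rho \int_{\Omega}
\frac{3\,\Delta x_i\,\Delta x_j - r^2 \delta_{ij}}{r^5}\, dV,
\qquad r = \lVert \mathbf{x}_P - \mathbf{x}' \rVert,\;
\Delta x_i = x_{P,i} - x'_i
```

Prism and polyhedron methods evaluate this integral analytically per finite element, the FFT methods evaluate a series expansion of it on a grid, and the ordinary numerical integrations apply quadrature to it directly.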


Statistical prediction of global sea-surface temperature anomalies

INTERNATIONAL JOURNAL OF CLIMATOLOGY, Issue 14 2003
A. W. Colman
Abstract Sea-surface temperature (SST) is one of the principal factors that influence seasonal climate variability, and most seasonal prediction schemes make use of information regarding SST anomalies. In particular, dynamical atmospheric prediction models require global gridded SST data prescribed through the target season. The simplest way of providing those data is to persist the SST anomalies observed at the start of the forecast at each grid point, with some damping, and this strategy has proved to be quite effective in practice. In this paper we present a statistical scheme that aims to improve that basic strategy by combining three individual methods together: simple persistence, canonical correlation analysis (CCA), and nearest-neighbour regression. Several weighting schemes were tested: the best of these is one that uses equal weight in all areas except the east tropical Pacific, where CCA is preferred. The overall performance of the combined scheme is better than the individual schemes. The results show improvements in tropical ocean regions for lead times beyond 1 or 2 months, but the skill of simple persistence is difficult to beat in the extratropics at all lead times. Aspects such as averaging periods and grid size were also investigated: results showed little sensitivity to these factors. The combined statistical SST prediction scheme can also be used to improve statistical regional rainfall forecasts that use SST anomaly patterns as predictors. Copyright © Crown Copyright 2003. Published by John Wiley & Sons, Ltd. [source]
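A minimal sketch of the combination strategy just described: damped persistence as the baseline, with the CCA forecast preferred in the east tropical Pacific and equal weights elsewhere. The damping factor, weights and stand-in component forecasts below are illustrative placeholders, not the paper's fitted values:

```python
def damped_persistence(anom0, lead_months, r=0.8):
    """Persist the initial SST anomaly, damped each month by an assumed
    lag-1 autocorrelation r (r = 0.8 is a placeholder)."""
    return anom0 * r ** lead_months

def combined_forecast(persist, cca, nn, in_east_tropical_pacific):
    """Equal-weight blend of the three methods, except in the east
    tropical Pacific, where CCA alone is preferred (per the abstract)."""
    if in_east_tropical_pacific:
        return cca
    return (persist + cca + nn) / 3.0

# One grid point at a 3-month lead: 0.6 K initial anomaly, with
# stand-in CCA and nearest-neighbour forecasts of 0.4 K and 0.5 K.
fc = combined_forecast(damped_persistence(0.6, 3), 0.4, 0.5,
                       in_east_tropical_pacific=False)
print(round(fc, 3))   # 0.402
```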


Marketing information systems in tourism and hospitality small- and medium-sized enterprises: a study of Internet use for market intelligence

INTERNATIONAL JOURNAL OF TOURISM RESEARCH, Issue 4 2001
Emma Wood
Abstract This study investigates the nature of marketing information systems (MkIS) within small- and medium-sized enterprises (SMEs) and focuses on the importance of external information and market intelligence. The sources of market intelligence are investigated with particular emphasis on understanding the usefulness of the Internet for external information gathering. The empirical research to support the study uses survey methods to investigate marketing information systems, market intelligence and Internet use within hospitality and tourism SMEs in the Yorkshire and Humber region. The findings indicate that SMEs in this sector make use of informal marketing information systems which mainly concentrate on internal and immediate operating environment data. Important wider market intelligence is underutilised owing mainly to the resource constraints of these smaller businesses. The Internet has not yet been recognised as an important source for market intelligence despite having the benefits of providing much of the necessary data more quickly and at a lower cost than many other sources. Copyright © 2001 John Wiley & Sons, Ltd. [source]


Resident-oriented care in nursing homes: effects on nurses

JOURNAL OF ADVANCED NURSING, Issue 6 2004
Afke J.M.B. Berkhout PhD
Background. In a resident-oriented care model, patients are assigned to primary nurses. These primary nurses are responsible for the total nursing care of their patients and make use of the nursing process. According to job demand-control models, these enlarged and enriched jobs can be described in terms of autonomy, job demands and social support, and the presence of these work characteristics has a positive influence on workers' psychological and behavioural outcomes. Aims. This paper reports a study to investigate the extent to which the various features of resident-oriented care were implemented, and its effects on nurses' work characteristics and on psychological and behavioural outcomes, in three Dutch nursing homes. Methods. In a quasi-experimental design, experimental and control groups were followed over 22 months, using a pretest and two post-tests with questionnaires, interviews and qualitative observations. Results. The quantitative data showed significant increases in resident assignment, the two variables measuring the nursing process and, in the psycho-geriatric experimental group, in resident-oriented tasks. The qualitative data showed that a partly task-oriented division of labour was still used and that the planned delegation of coordination tasks to primary nurses was not fully achieved. Effects on work perceptions were limited. After implementation of the new system, the experimental group showed an increase in job autonomy. Conclusions. The intervention appeared to be only partly successful. Most of the expected results regarding work characteristics and psychological and behavioural outcomes did not materialize. Theoretical and methodological reflections are presented in the light of these findings. [source]


Basic ingredients of free energy calculations: A review

JOURNAL OF COMPUTATIONAL CHEMISTRY, Issue 8 2010
Clara D. Christ
Abstract Methods to compute free energy differences between different states of a molecular system are reviewed with the aim of identifying their basic ingredients and their utility when applied in practice to biomolecular systems. A free energy calculation comprises three basic components: (i) a suitable model or Hamiltonian, (ii) a sampling protocol with which one can generate a representative ensemble of molecular configurations, and (iii) an estimator of the free energy difference itself. Alternative sampling protocols can be distinguished according to whether one or more states are to be sampled. In cases where only a single state is considered, six alternative techniques can be distinguished: (i) changing the dynamics, (ii) deforming the energy surface, (iii) extending the dimensionality, (iv) perturbing the forces, (v) reducing the number of degrees of freedom, and (vi) multi-copy approaches. In cases where multiple states are to be sampled, the three primary techniques are staging, importance sampling, and adiabatic decoupling. Estimators of the free energy can be classified as global methods, which either count the number of times a given state is sampled or use energy differences, or as local methods, which either make use of the force or are based on transition probabilities. Finally, this overview of the available techniques and how they can be best used in a practical context is aimed at helping the reader choose the most appropriate combination of approaches for the biomolecular system, Hamiltonian and free energy difference of interest. © 2009 Wiley Periodicals, Inc. J Comput Chem, 2010 [source]
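As concrete instances of the estimator classes mentioned above: the Zwanzig free energy perturbation formula is a global, energy-difference estimator, and thermodynamic integration is a local, force-based one (both standard results, reproduced here for orientation):

```latex
% Free energy perturbation (Zwanzig): a global estimator built from
% energy differences sampled in state A.
\Delta F_{A \rightarrow B} \;=\; -k_B T\,
\ln \left\langle e^{-(U_B - U_A)/k_B T} \right\rangle_A

% Thermodynamic integration: a local estimator using the generalized
% force conjugate to the coupling parameter \lambda.
\Delta F \;=\; \int_0^1
\left\langle \frac{\partial H(\lambda)}{\partial \lambda} \right\rangle_{\lambda}
d\lambda
```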


Life table response experiment analysis of the stochastic growth rate

JOURNAL OF ECOLOGY, Issue 2 2010
Hal Caswell
Summary
1. Life table response experiment (LTRE) analyses decompose treatment effects on a dependent variable (usually, but not necessarily, population growth rate) into contributions from differences in the parameters that determine that variable.
2. Fixed, random and regression LTRE designs have been applied to plant populations in many contexts. These designs all make use of the derivative of the dependent variable with respect to the parameters, and describe differences as sums of linear approximations.
3. Here, I extend LTRE methods to analyse treatment effects on the stochastic growth rate log λs. The problem is challenging because a stochastic model contains two layers of dynamics: the stochastic dynamics of the environment and the response of the vital rates to the state of the environment. I consider the widely used case where the environment is described by a Markov chain.
4. As the parameters describing the environmental Markov chain do not appear explicitly in the calculation of log λs, derivatives cannot be calculated. The solution presented here combines derivatives for the vital rates with an alternative (and older) approach, due to Kitagawa and Keyfitz, that calculates contributions in a way analogous to the calculation of main effects in statistical models.
5. The resulting LTRE analysis decomposes log λs into contributions from differences in: (i) the stationary distribution of environmental states, (ii) the autocorrelation pattern of the environment, and (iii) the stage-specific vital rate responses within each environmental state.
6. As an example, the methods are applied to a stage-classified model of the prairie plant Lomatium bradshawii in a stochastic fire environment.
7. Synthesis. The stochastic growth rate is an important parameter describing the effects of environmental fluctuations on population viability. Like any growth rate, it responds to differences in environmental factors. Without a decomposition analysis there is no way to attribute differences in the stochastic growth rate to particular parts of the life cycle or particular aspects of the stochastic environment. The methods presented here provide such an analysis, extending the LTRE analyses already available for deterministic environments. [source]
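For readers unfamiliar with the notation: the stochastic growth rate, and the linear-approximation logic that derivative-based LTRE designs rely on, can be written as follows (standard definitions, not the paper's full Markov-chain decomposition):

```latex
% Stochastic growth rate of total population size N(t):
\log \lambda_s \;=\; \lim_{T \to \infty} \frac{1}{T}
\log \frac{N(T)}{N(0)}

% First-order LTRE decomposition of a treatment effect into
% contributions from parameter differences \Delta\theta_i:
\Delta \log \lambda_s \;\approx\;
\sum_i \frac{\partial \log \lambda_s}{\partial \theta_i}\, \Delta \theta_i
```

The difficulty flagged in point 4 is that for the parameters of the environmental Markov chain the derivative above is unavailable, hence the hybrid with the Kitagawa-Keyfitz style of contribution.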


Off-axis electron holography of electrostatic potentials in unbiased and reverse biased focused ion beam milled semiconductor devices

JOURNAL OF MICROSCOPY, Issue 3 2004
A. C. TWITCHETT
Summary Off-axis electron holography in the transmission electron microscope (TEM) is used to measure two-dimensional electrostatic potentials in both unbiased and reverse biased silicon specimens that each contain a single p-n junction. All the specimens are prepared for examination in the TEM using focused ion beam (FIB) milling. The in situ electrical biasing experiments make use of a novel specimen geometry, which is based on a combination of cleaving and FIB milling. The design and construction of an electrical biasing holder are described, and the effects of TEM specimen preparation on the electrostatic potential in the specimen, as well as on fringing fields beyond the specimen surface, are assessed. [source]
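The measurement rests on the standard relation between the holographic phase and the projected electrostatic potential (a textbook result; the constant quoted is approximate and depends on the accelerating voltage):

```latex
% Phase shift of the electron wave through a specimen of thickness t:
\phi(x,y) \;=\; C_E \int V(x,y,z)\, dz \;\approx\; C_E\, V(x,y)\, t,
\qquad C_E \approx 7.3 \times 10^{-3}\
\mathrm{rad\,V^{-1}\,nm^{-1}} \ \text{at } 200\ \mathrm{kV}
```

The built-in potential across the p-n junction therefore appears directly as a phase step in the reconstructed hologram, which is what makes the technique sensitive to dopant distributions.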


From blog to bebo and beyond: text, risk, participation

JOURNAL OF RESEARCH IN READING, Issue 1 2009
Victoria Carrington
This paper broadly explores the notion that text is an artefact that encodes and displays the tensions, resistances, positioning and affinities of its producer and, further, that many of these drivers have their source in quite significant shifts in the broad contours of contemporary Western culture. Against this background, two different artefacts are analysed in this paper: a blog and a bebo page. The blog has been produced by an adult female academic and the bebo page by an early adolescent girl. These text producers and users are positioned quite differently in terms of geography, education, life experience, identity, social class and interests. They also have differential access to and experience of digital technologies. However, they both make use of the affordances of technologies, in particular Internet-connected laptops and desktops, to create and disseminate these texts to do 'work' on their behalf in particular social domains. [source]


Risk news in the world of Internet newsgroups

JOURNAL OF SOCIOLINGUISTICS, Issue 1 2001
Kay Richardson
The coming of the Internet has provided those who are able to benefit from it new ways of giving and seeking information. These new contexts of communication include newsgroups, very much a text-based form of interaction with little visual enhancement. In the new era of 'risk society' (Beck 1992) people make use of newsgroups to talk about the risks which now confront the world, in their pursuit of trustworthy information and informants. Using the affair of Mad Cow Disease (BSE), with particular reference to the crisis in 1996, this article explores the dynamics of news exchange via the newsgroups as a process which is Interactive, International, Interested and Intertextual. These characteristics result in a form of discourse through which participants engage in the interpersonal social construction of risk. The credibility of the proposition that BSE poses a health risk to humans is the focus of their discussions: they are concerned with the nature of the evidence for that proposition and with the reliability of the sources responsible for endorsing it. [source]


Indicating ontology data quality, stability, and completeness throughout ontology evolution

JOURNAL OF SOFTWARE MAINTENANCE AND EVOLUTION: RESEARCH AND PRACTICE, Issue 1 2007
Anthony M. Orme
Abstract Many application areas today make use of ontologies. With the advent of semantic Web technologies, ontology based systems have become widespread. Developing an ontology is part of the necessary early development of an ontology-based system. Since the validity and quality of the ontology data directly affects the validity and quality of the system using the ontology, evolution of the ontology data directly affects the evolution and/or maintenance of the ontology-based systems that depend on and employ the ontology data. Our research examines the quality, completeness, and stability of ontology data as ontologies evolve. We propose a metrics suite, based on standard software quality concepts, to measure the complexity and cohesion of ontology data. First we theoretically validate our metrics. Then we examine empirically whether our metrics determine ontology data quality, by comparing them to human evaluator ratings. We conclude that several of our metrics successfully determine ontology complexity or cohesion. Finally, we examine, over evolving ontology data, whether our metrics determine ontology completeness and stability. We determine that various metrics reflect different kinds of changes. Our experiments indicate that our metrics measure ontology stability and completeness; however, the interpretation of specific metric values and the interaction of different metrics require further study. Copyright © 2007 John Wiley & Sons, Ltd. [source]


A model selection approach for the identification of quantitative trait loci in experimental crosses

JOURNAL OF THE ROYAL STATISTICAL SOCIETY: SERIES B (STATISTICAL METHODOLOGY), Issue 4 2002
Karl W. Broman
Summary. We consider the problem of identifying the genetic loci (called quantitative trait loci (QTLs)) contributing to variation in a quantitative trait, with data on an experimental cross. A large number of different statistical approaches to this problem have been described; most make use of multiple tests of hypotheses, and many consider models allowing only a single QTL. We feel that the problem is best viewed as one of model selection. We discuss the use of model selection ideas to identify QTLs in experimental crosses. We focus on a back-cross experiment, with strictly additive QTLs, and concentrate on identifying QTLs, considering the estimation of their effects and precise locations of secondary importance. We present the results of a simulation study to compare the performances of the more prominent methods. [source]
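In a back-cross with strictly additive QTLs, the model being selected among can be written as below; the δ-penalized BIC shown is one criterion used in this literature for exactly this purpose, though the details of the authors' procedure may differ:

```latex
% Additive QTL model for a back-cross; g_{ij} codes the genotype of
% individual i at locus j, and \gamma is the selected set of loci.
y_i \;=\; \mu + \sum_{j \in \gamma} \beta_j\, g_{ij} + \epsilon_i,
\qquad g_{ij} \in \{0, 1\},\; \epsilon_i \sim N(0, \sigma^2)

% Model selection: choose the subset \gamma minimizing a penalized
% criterion, with \delta \ge 1 tuned to control false QTL detections.
\mathrm{BIC}_{\delta}(\gamma) \;=\;
n \log \mathrm{RSS}(\gamma) + \delta\, |\gamma| \log n
```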


Effects of radio-collars on European badgers (Meles meles)

JOURNAL OF ZOOLOGY, Issue 1 2002
F. A. M. Tuyttens
Abstract The relationships between radio-collaring/tracking and 12 biometric parameters in a population of badgers (Meles meles) that were live-trapped in south-west England were investigated. The length of time for which a badger had worn a radio-collar was selected as an explanatory variable in generalized linear models of three biometric parameters (body condition, body weight and testes volume) irrespective of whether or not age class was included as a variable in the analyses. There was evidence that badgers that had been carrying a radio-collar for 1-100 days had lower body condition scores compared both with badgers that had not been collared and with those that had been collared for longer than 100 days, suggesting a post-collaring acclimation period. In addition, the time period between first and last capture was longer for radio-collared than non-collared badgers. It is unlikely that this was due to an effect of collaring on trappability or to non-random selection of badgers for collaring. Although testes size differed between non-collared badgers and badgers that had been tagged for > 100 days, the relationship between radio-collaring and reproductive output remained unproven. These results highlight not only the need to assess the welfare aspects of radio-collaring but also the potential intricacy of corollaries of collaring. Explorations such as that reported here are important to the validity of studies that make use of radio-telemetry. [source]


Use of the Rotation Vector in Brownian Dynamics Simulation of Transient Electro-Optical Properties

MACROMOLECULAR THEORY AND SIMULATIONS, Issue 1 2009
Tom Richard Evensen
Abstract We have recently developed a new singularity-free algorithm for Brownian dynamics simulation of free rotational diffusion. The algorithm is rigorously derived from kinetic theory and makes use of the Cartesian components of the rotation vector as the generalized coordinates describing angular orientation. Here, we report on the application of this new algorithm in Brownian dynamics simulations of transient electro-optical properties. This work serves two main purposes. Firstly, it demonstrates the integrity of the new algorithm for BD simulations of the most common transient electro-optic experiments. Secondly, it provides new insight into the performance of the new algorithm compared to algorithms that make use of the Euler angles. We study the transient electrically induced birefringence in dilute solutions of rigid particles with an anisotropic polarization tensor in response to external electric field pulses. Both the use of a single electric pulse and the use of two electric pulses with opposite polarity are analyzed. We document that the new singularity-free algorithm performs flawlessly. We find that, for these types of systems, the new singularity-free algorithm, in general, outperforms similar algorithms based on the Euler angles. In a wider perspective, the most important aspect of this work is that it serves as an important reference for future development of efficient BD algorithms for studies of more complex systems. These systems include polymers consisting of rigid segments with single-segment translational-rotational coupling, segment-segment fluid-dynamic interactions and holonomic constraints. [source]
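A standard benchmark for such simulations is the field-free decay of the electrically induced birefringence of rigid particles, which is single-exponential in the rotational diffusion coefficient D_r (a classical result useful for validating rotational BD algorithms; not specific to this paper):

```latex
% Decay of induced birefringence after the field is switched off:
\Delta n(t) \;=\; \Delta n(0)\, e^{-6 D_r t}
```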


Persons, Places, and Times: The Meanings of Repetition in an STD Clinic

MEDICAL ANTHROPOLOGY QUARTERLY, Issue 2 2007
Lori Leonard
In this article we work the tensions between the way clinical medicine and public health necessarily construct the problem of "repetition" in the context of a sexually transmitted disease (STD) clinic and the ways patients narrate their illness experiences. This tension, between clinical and epidemiological exigencies and the messiness of lived experience, is a recurring theme of work conducted at the intersections of epidemiology, anthropology, and clinical medicine. Clinically, repeated infections are a threat to the individual body and to "normal" biological processes like reproduction. From a public health perspective, "repeaters" are imagined to be part of a "core group" that keeps infections in circulation, endangering the social body. Yet patients' accounts are anchored in particular social histories, and their experiences rely on different time scales than those implicated in either of these types of readings. Extended analyses are provided of two such accounts: one in which repetition can be "read" as part of a performance of recovery, and one in which repetition is bound up in the effort to avoid becoming the involuntary subject of institutionally administered intervention. We argue the need to open up the category of repeaters to include the social and draw on work by Cheryl Mattingly to suggest that one way to do this in the context of the STD clinic might be to adopt forms of therapeutic practice that make use of interpretive, in addition to technical, skills. [source]


Novel initiation genes in squamous cell carcinomagenesis: A role for substrate-specific ubiquitylation in the control of cell survival

MOLECULAR CARCINOGENESIS, Issue 8 2007
Amador Albor
Abstract The study of experimental epidermal carcinogenesis offers several advantages over other epithelial carcinogenesis models, including easy accessibility and a database of research findings spanning over a century. Our studies make use of a clonal in vitro/in vivo keratinocyte carcinogenesis model with low frequency of ras mutation and derivative clonal-initiated lineages with distinct tumor fate. Analysis of this model has yielded candidate genes involved in the stages of initiation and tumorigenic progression, and has revealed novel roles for ubiquitylation in transcriptional control of survival and apoptotic pathways during the early stages of carcinogenesis. The expression of a recently described E3-ubiquitin ligase, Trim32, is elevated during initiation, and ectopic expression of Trim32 confers extended survival in response to terminal differentiation and ultraviolet light (UV) B/TNF-α death signals. Trim32 binds and ubiquitylates Piasy, controlling its stability and accumulation. Piasy is a SUMOylation factor involved in the control of apoptosis, senescence, and NF-κB activation. NF-κB is a survival factor for keratinocytes in response to UV irradiation, the main carcinogenic stimulus for the epidermis. Piasy inhibits NF-κB activity, and promotes keratinocyte apoptosis in response to UV and TNF-α. In human skin squamous cell carcinoma (SCC) samples, we found an inverse correlation between Trim32 and Piasy expression, supporting a role for Trim32-Piasy interaction in human epidermal carcinogenesis. Our hypothesis is that increased expression of Trim32 may enhance epidermal carcinogenesis by increasing the threshold of NF-κB activity through Piasy downmodulation. © 2007 Wiley-Liss, Inc. [source]


Introduction to "Ethnographic Emergences"

AMERICAN ANTHROPOLOGIST, Issue 1 2005
BILL MAURER
This introduction situates the articles in this "In Focus" in terms of the history of anthropological theory. I argue that the objects under ethnographic scrutiny here compel a rethinking of ethnography as a method and a retooling of the theoretical apparatus of the discipline. Such fields as medicine, science, media, law, and environment pose challenges to modernist analytical toolkits because they are always already complex hybrids of nature and culture. They do not stay put inside their own analytical frames. They are also autodocumentary and make use of the shift in perspective between general and particular to generate knowledge, much as anthropology does. This introduction is an argument for an anthropology of emergence that is not content to settle for mere descriptive adequacy but that uses its objects to unsettle anthropological claims to knowledge. [source]


Single-warehouse multi-retailer inventory systems with full truckload shipments

NAVAL RESEARCH LOGISTICS: AN INTERNATIONAL JOURNAL, Issue 5 2009
Yue Jin
Abstract We consider a multi-stage inventory system composed of a single warehouse that receives a single product from a single supplier and replenishes the inventory of n retailers through direct shipments. Fixed costs are incurred for each truck dispatched and all trucks have the same capacity limit. Costs are stationary, or more generally monotone as in Lippman (Management Sci 16, 1969, 118-138). Demands for the n retailers over a planning horizon of T periods are given. The objective is to find the shipment quantities over the planning horizon to satisfy all demands at minimum system-wide inventory and transportation costs without backlogging. Using the structural properties of optimal solutions, we develop (1) an O(T²) algorithm for the single-stage dynamic lot sizing problem; (2) an O(T³) algorithm for the case of a single-warehouse single-retailer system; and (3) a nested shortest-path algorithm for the single-warehouse multi-retailer problem that runs in polynomial time for a given number of retailers. To overcome the computational burden when the number of retailers is large, we propose aggregated and disaggregated Lagrangian decomposition methods that make use of the structural properties and the efficient single-stage algorithm. Computational experiments show the effectiveness of these algorithms and the gains associated with coordinated versus decentralized systems. Finally, we show that the decentralized solution is asymptotically optimal. © 2009 Wiley Periodicals, Inc. Naval Research Logistics 2009 [source]
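The single-stage O(T²) result has the classical lot-sizing dynamic-programming structure. A sketch under assumed cost primitives, a fixed cost per truck of capacity C plus linear holding cost, which is an illustration rather than the paper's exact model:

```python
# Single-stage lot sizing with full-truckload dispatch costs:
# one shipment in period i serves periods i..j, costing
# ceil(q / cap) * truck_cost plus h per unit per period held.
# Prefix sums make each transition O(1), giving O(T^2) overall.
import math

def single_stage_lot_size(demand, truck_cost, cap, h):
    T = len(demand)
    d = [0.0] + list(demand)          # 1-indexed demands d[1..T]
    S = [0.0] * (T + 1)               # prefix sums of d[t]
    W = [0.0] * (T + 1)               # prefix sums of t * d[t]
    for t in range(1, T + 1):
        S[t] = S[t - 1] + d[t]
        W[t] = W[t - 1] + t * d[t]
    INF = float("inf")
    f = [0.0] + [INF] * T             # f[j]: min cost covering 1..j
    for j in range(1, T + 1):
        for i in range(1, j + 1):     # last shipment arrives in period i
            q = S[j] - S[i - 1]
            hold = h * ((W[j] - W[i - 1]) - i * q)   # units held (t - i) periods
            cost = math.ceil(q / cap) * truck_cost + hold
            f[j] = min(f[j], f[i - 1] + cost)
    return f[T]

print(single_stage_lot_size([3, 5, 2, 7], truck_cost=10, cap=6, h=0.5))
```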


Exploiting self-canceling demand point aggregation error for some planar rectilinear median location problems

NAVAL RESEARCH LOGISTICS: AN INTERNATIONAL JOURNAL, Issue 6 2003
R.L. Francis
When solving location problems in practice it is quite common to aggregate demand points into centroids. Solving a location problem with aggregated demand data is computationally easier, but the aggregation process introduces error. We develop theory and algorithms for certain types of centroid aggregations for rectilinear 1-median problems. The objective is to construct an aggregation that minimizes the maximum aggregation error. We focus on row-column aggregations, and make use of aggregation results for 1-median problems on the line to do aggregation for 1-median problems in the plane. The aggregations developed for the 1-median problem are then used to construct approximate n-median problems. We test the theory computationally on n-median problems (n ≥ 1) using both randomly generated, as well as real, data. Every error measure we consider can be well approximated by some power function in the number of aggregate demand points. Each such function exhibits decreasing returns to scale. © 2003 Wiley Periodicals, Inc. Naval Research Logistics 50: 614-637, 2003. [source]
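The building block referred to above, the 1-median problem on a line, is solved by a weighted median, and the planar rectilinear 1-median separates into one such problem per coordinate. A minimal sketch of that standard fact (not the paper's aggregation algorithm itself):

```python
# The rectilinear 1-median separates by coordinate: the optimal x (and
# y) is a weighted median of the demand points' x (and y) coordinates.

def weighted_median(values, weights):
    """Smallest value at which cumulative weight reaches half the total."""
    order = sorted(range(len(values)), key=lambda k: values[k])
    half = sum(weights) / 2.0
    acc = 0.0
    for k in order:
        acc += weights[k]
        if acc >= half:
            return values[k]

def rectilinear_1_median(points, weights):
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return weighted_median(xs, weights), weighted_median(ys, weights)

print(rectilinear_1_median([(0, 0), (4, 1), (5, 6)], [1, 1, 3]))  # (5, 6)
```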


Quantitative MRI for the assessment of bone structure and function

NMR IN BIOMEDICINE, Issue 7 2006
Felix W. Wehrli
Abstract Osteoporosis is the most common degenerative disease in the elderly. It is characterized by low bone mass and structural deterioration of bone tissue, leading to morbidity and increased fracture risk in the hip, spine and wrist, all sites of predominantly trabecular bone. Bone densitometry, currently the standard methodology for diagnosis and treatment monitoring, has significant limitations in that it cannot provide information on the structural manifestations of the disease. Recent advances in imaging, in particular MRI, can now provide detailed insight into the architectural consequences of disease progression and regression in response to treatment. The focus of this review is on the emerging methodology of quantitative MRI for the assessment of structure and function of trabecular bone. During the past 10 years, various approaches have been explored for obtaining image-based quantitative information on trabecular architecture. Indirect methods that do not require resolution on the scale of individual trabeculae, and can therefore be practiced at any skeletal location, make use of the induced magnetic fields in the intertrabecular space. These fields, which have their origin in the greater diamagnetism of bone relative to surrounding marrow, can be measured in various ways, most typically in the form of R2′, the recoverable component of the total transverse relaxation rate. Alternatively, the trabecular network can be quantified by high-resolution MRI (µ-MRI), which requires resolution adequate to at least partially resolve individual trabeculae. Micro-MRI-based structure analysis is therefore technically demanding in terms of image acquisition and the algorithms needed to extract the structural information under conditions of limited signal-to-noise ratio and resolution. Other requirements that must be met include motion correction and image registration, both critical for achieving the reproducibility needed in repeat studies. Key clinical applications targeted involve fracture risk prediction and evaluation of the effect of therapeutic intervention. Copyright © 2006 John Wiley & Sons, Ltd. [source]
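The decomposition behind R2′ is the standard relation between the total, irreversible and recoverable transverse relaxation rates:

```latex
% Total transverse relaxation rate splits into an irreversible part
% (R2) and a recoverable part (R2') due to static field inhomogeneity:
R_2^{*} \;=\; R_2 + R_2'
\qquad\Longrightarrow\qquad
R_2' \;=\; R_2^{*} - R_2
```

R2′ isolates the contribution of the static field inhomogeneity induced by the bone-marrow susceptibility difference, which is what makes it a useful indirect probe of trabecular density and orientation.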


What Moore's Paradox Is About

PHILOSOPHY AND PHENOMENOLOGICAL RESEARCH, Issue 1 2001
CLAUDIO DE ALMEIDA
On the basis of arguments showing that none of the most influential analyses of Moore's paradox yields a successful resolution of the problem, a new analysis of it is offered. It is argued that, in attempting to render verdicts of either inconsistency or self-contradiction or self-refutation, those analyses have all failed to satisfactorily explain why a Moore-paradoxical proposition is such that it cannot be rationally believed. According to the proposed solution put forward here, a Moore-paradoxical proposition is one for which the believer can have no non-overridden evidence. The arguments for this claim make use of some of Peter Klein's views on epistemic defeasibility. It is further suggested that this proposal may have important meta-epistemological implications. [source]