New Set (new + set)


Selected Abstracts

Synthesis and Structure–Activity Relationships of a New Set of 1,2,4-Triazolo[4,3-a]quinoxalin-1-one Derivatives (I) and (II) as Adenosine Receptor Antagonists.

CHEMINFORM, Issue 46 2003
Vittoria Colotta
No abstract is available for this article. [source]

New sets of solubility parameters of linear and crosslinked aromatic polyamides

Stefano Fiori
Abstract It is generally accepted that linear and crosslinked polymeric materials, including polyamides, share the same solution properties and, consequently, the same solubility parameters. However, despite its great practical importance, a thorough study aimed at determining the best solvent media for dissolving linear aromatic polyamides has not yet been performed or, at least, has not been published. Here we report on the solubility parameters of linear and crosslinked aromatic polyamides and demonstrate that assuming these two classes have the same solubility properties can lead to dramatically erroneous results. Two new, distinct parameter sets for linear and crosslinked aromatic polyamides are proposed: linear poly(p-phenylene terephthalamide) is characterized by δp, δd, and δH equal to 8.6, 18.4, and 11.3, respectively, whereas the corresponding values for the crosslinked aromatic polyamides considered are 11.5, 16.8, and 10.2. © 2009 Wiley Periodicals, Inc. J Appl Polym Sci, 2010 [source]
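One standard way to compare the two reported parameter sets is the Hansen distance Ra, with the conventional factor-of-four weighting on the dispersion term. A minimal sketch, assuming the abstract's values are in MPa½ and listed in the order δp, δd, δH as stated:

```python
import math

def hansen_distance(a, b):
    """Conventional Hansen distance Ra between two solubility-parameter
    sets given as (delta_d, delta_p, delta_h) tuples."""
    dd, dp, dh = (a[i] - b[i] for i in range(3))
    return math.sqrt(4.0 * dd ** 2 + dp ** 2 + dh ** 2)

# Values reported in the abstract, reordered to (delta_d, delta_p, delta_h)
linear_ppta = (18.4, 8.6, 11.3)   # linear poly(p-phenylene terephthalamide)
crosslinked = (16.8, 11.5, 10.2)  # crosslinked aromatic polyamides

ra = hansen_distance(linear_ppta, crosslinked)
print(f"Ra = {ra:.2f}")  # Ra = 4.46
```

A separation of roughly 4.5 units between the two sets illustrates the abstract's conclusion that linear and crosslinked aromatic polyamides should not be assumed to share solubility behaviour.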

Quantification of Extinction Risk: IUCN's System for Classifying Threatened Species

Keywords: conservation priority setting; threatened species; IUCN Red List; extinction risk. Abstract: The International Union for Conservation of Nature (IUCN) Red List of Threatened Species was increasingly used during the 1980s to assess the conservation status of species for policy and planning purposes. This use stimulated the development of a new set of quantitative criteria for listing species in the categories of threat: critically endangered, endangered, and vulnerable. These criteria, which were intended to be applicable to all species except microorganisms, were part of a broader system for classifying threatened species and were fully implemented by IUCN in 2000. The system and the criteria have been widely used by conservation practitioners and scientists and now underpin one indicator being used to assess the Convention on Biological Diversity 2010 biodiversity target. We describe the process and the technical background to the IUCN Red List system. The criteria refer to fundamental biological processes underlying population decline and extinction. But given major differences between species, the threatening processes affecting them, and the paucity of knowledge relating to most species, the IUCN system had to be both broad and flexible to be applicable to the majority of described species. The system was designed to measure the symptoms of extinction risk, and uses five independent criteria relating to aspects of population loss and decline of range size. A species is assigned to a threat category if it meets the quantitative threshold for at least one criterion. The criteria and the accompanying rules and guidelines used by IUCN are intended to increase the consistency, transparency, and validity of its categorization system, but they necessitate some compromises that affect the applicability of the system and the species lists that result.
In particular, choices were made over the assessment of uncertainty, poorly known species, depleted species, population decline, restricted ranges, and rarity; all of these affect the way red lists should be viewed and used. Processes related to priority setting and the development of national red lists need to take account of some assumptions in the formulation of the criteria. [source]

The Puzzle of Museum Educational Practice: A Comment on Rounds and Falk

Daniel Spock
The mandate that museums place education at the center of their public service role has had the effect of framing a new set of questions and, inevitably, problems. If museums have primary value to society as educational institutions, what kind of learning actually happens in them? Jay Rounds and John Falk, writing at the leading edge of this inquiry, explore curiosity, motivation and self-identity as paramount considerations for the special type of learning museums promote. Their analyses present interesting challenges for the museum practitioner, who may observe that people find the pursuit of curiosity pleasurable and value it more highly than knowledge acquisition. The practitioner may conclude that museums have a calling: They stand for the value of curiosity for its own sake, and for that reason will never wear out their welcome. [source]

Making White: Constructing Race in a South African High School

Nadine Dolby
As a social and cultural phenomenon, race is continually remade within changing circumstances and is constructed and located, in part, in institutions' pedagogical practices and discourses. In this article I examine how the administration of a multiracial, working-class high school in Durban, South Africa produces "white" in an era of political and social transition. As the population of Fernwood High School (a pseudonym) shifts from majority white working class to black working class, the school administration strives to reposition the school as "white," despite its predominantly black student population. This whiteness is not only a carryover from the apartheid era, but is actively produced within a new set of circumstances. Using the discourses and practices of sports and standards, the school administration attempts to create a whiteness that separates the school from the newly democratic nation-state of South Africa. Despite students' and some staff's general complacency and outright resistance, rugby and athletics are heralded as critical nodes of the school's "white" identity, connecting the school to other, local white schools, and disconnecting it from black schools. Dress standards function in a similar manner, creating an imagined equivalence between Fernwood and other white schools in Durban (and elite schools around the world), and disassociating Fernwood from black schools in South Africa and the "third world" writ large. This pedagogy of whiteness forms the core of the administration's relationship with Fernwood students, and maps how race is remade within a changing national context. [source]

Functional studies of an evolutionarily conserved, cytochrome b5 domain protein reveal a specific role in axonemal organisation and the general phenomenon of post-division axonemal growth in trypanosomes

CYTOSKELETON, Issue 1 2009
Helen Farr
Abstract Eukaryotic cilia and flagella are highly conserved structures composed of a canonical 9+2 microtubule axoneme. Several recent proteomic studies of cilia and flagella have been published, including a proteome of the flagellum of the protozoan parasite Trypanosoma brucei. Comparing proteomes reveals many novel proteins that appear to be widely conserved in evolution. Amongst these, we found a previously uncharacterised protein which localised to the axoneme in T. brucei, and therefore named it Trypanosome Axonemal protein (TAX)-2. Ablation of the protein using RNA interference in the procyclic form of the parasite has no effect on growth but causes a reduction in motility. Using transmission electron microscopy, various structural defects were seen in some axonemes, most frequently with microtubule doublets missing from the 9+2 arrangement. RNAi knockdown of TAX-2 expression in the bloodstream form of the parasite caused defects in growth and cytokinesis, a further example of the effects caused by loss of flagellar function in bloodstream form T. brucei. In procyclic cells we used a new set of vectors to ablate protein expression in cells expressing a GFP:TAX-2 fusion protein, which enabled us to easily quantify protein reduction and visualise axonemes made before and after RNAi induction. This establishes a useful generic technique but also revealed a specific observation that the new flagellum on the daughter trypanosome continues growth after cytokinesis. Our results provide evidence for TAX-2 function within the axoneme, where we suggest that it is involved in processes linking the outer doublet microtubules and the central pair. Cell Motil. Cytoskeleton 2008. © 2008 Wiley-Liss, Inc. [source]

Strengthening Public Safety Nets from the Bottom Up

Jonathan Morduch
Helping to reduce vulnerability poses a new set of challenges for public policy. A starting point is understanding the ways in which communities and extended families try to cope with difficulties in the absence of public interventions. Coping mechanisms range from the informal exchange of transfers and loans to more structured institutions that enable an entire community to provide protection to its neediest members. This article describes ways of building public safety nets to complement and extend informal and private institutions. The most effective policies will combine transfer systems that are sensitive to existing mechanisms with new institutions for providing insurance and credit and for generating savings. [source]

Divergent roles of the DEAD-box protein BS-PL10, the urochordate homologue of human DDX3 and DDX3Y proteins, in colony astogeny and ontogeny

Amalia Rosner
Abstract Proteins of the highly conserved PL-10 (Ded1P) subfamily of the DEAD-box family participate in a wide variety of biological functions. However, the entire spectrum of their functions in both vertebrates and invertebrates is still unknown. Here, we isolated the Botryllus schlosseri (Urochordata) homologue, BS-PL10, revealing its distributions and functions in ontogeny and colony astogeny. In botryllid ascidians, the colony grows by increasing the number of modular units (each called a zooid) through a whole-colony synchronized and weekly cyclical astogenic budding process (blastogenesis). At the level of the colony, both BS-PL10 mRNA and its protein (78 kDa) fluctuate in a weekly pattern that corresponds with the animal's blastogenic cycle, increasing from blastogenic stage A to blastogenic stage D. At the organ/module level, a sharp decline is revealed. Primary and secondary developing buds express high levels of BS-PL10 mRNA and protein at all blastogenic stages. These levels are reduced four to nine times in the new set of functional zooids. This portrait of colony astogeny differs from its ontogeny. Oocytes and sperm cells express high levels of BS-PL10 protein only at early stages of development. Young embryos reveal background levels, with increased expression in some organs at more developed stages. Results reveal that higher levels of BS-PL10 mRNA and protein are characteristic of multipotent soma and germ cells, but patterns deviate between two populations of differentiating stem cells: the stem cells involved in weekly blastogenesis and those involved in embryogenesis. Two types of experimental manipulations, zooidectomy and siRNA assays, have confirmed the importance of BS-PL10 for cell differentiation and organogenesis. BS-PL10 (phylogenetically matching the animal's position in the evolutionary tree) is the only member of this subfamily in B. schlosseri, featuring a wide range of biological activities, some of which represent pivotal roles. The surprising weekly cyclical expression and the participation in cell differentiation posit this molecule as a model system for studying the PL10 protein subfamily. Developmental Dynamics 235:1508–1521, 2006. © 2006 Wiley-Liss, Inc. [source]

New Humanitarianism: Does It Provide a Moral Banner for the 21st Century?

DISASTERS, Issue 4 2001
Fiona Fox
There is a 'new humanitarianism' for the new millennium. It is 'principled', 'human-rights based' and politically sensitive. Above all it is new. It marks a break from the past and a rejection of the traditional principles that guided humanitarianism through the last century. New humanitarians reject the political naivety of the past, assess the long-term political impact of relief and are prepared to see humanitarian aid used as a tool to achieve human rights and political goals. New humanitarianism is compelling, in tune with our times and offers a new moral banner for humanitarians to cling to as we enter the new millennium. Or does it? After outlining the key elements of new humanitarianism, including the human-rights approach and developmental relief, the paper spells out some of the dangers. The author claims that new humanitarianism results in an overt politicisation of aid in which agencies themselves use relief as a tool to achieve wider political goals. The paper shows how this approach has spawned a new conditionality which allows for aid to be withheld, and has produced a moral hierarchy of victims in which some are more deserving than others. The paper concludes with a plea for a revival of the principle of universalism as the first step to a new set of principles. [source]

Initial hydrologic and geomorphic response following a wildfire in the Colorado Front Range

John A. Moody
Abstract A wildfire in May 1996 burned 4690 hectares in two watersheds forested by ponderosa pine and Douglas fir in a steep, mountainous landscape with a summer, convective thunderstorm precipitation regime. The wildfire lowered the erosion threshold in the watersheds, and consequently amplified the subsequent erosional response to shorter time interval episodic rainfall and created both erosional and depositional features in a complex pattern throughout the watersheds. The initial response during the first four years was an increase in runoff and erosion rates followed by decreases toward pre-fire rates. The maximum unit-area peak discharge was 24 m³ s⁻¹ km⁻² for a rainstorm in 1996 with a rain intensity of 90 mm h⁻¹. Recovery to pre-fire conditions seems to have occurred by 2000 because, for a maximum 30-min rainfall intensity of 50 mm h⁻¹, the unit-area peak discharge in 1997 was 6.6 m³ s⁻¹ km⁻², while in 2000 a similar intensity produced only 0.11 m³ s⁻¹ km⁻². Rill erosion accounted for 6 per cent, interrill erosion for 14 per cent, and drainage erosion for 80 per cent of the initial erosion in 1996. This represents about a 200-fold increase in erosion rates on hillslopes, which had a recovery or relaxation time of about three years. About 67 per cent of the initially eroded sediment is still stored in the watersheds after four years, with an estimated residence time greater than 300 years. This residence time is much greater than the fire recurrence interval, so erosional and depositional features may become legacies from the wildfire and may affect landscape evolution by acting as a new set of initial conditions for subsequent wildfire and flood sequences. Published in 2001 by John Wiley & Sons, Ltd. [source]
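The unit-area peak discharges quoted in this abstract are simply flood peaks normalized by contributing drainage area, which allows comparison across basins of different size. A minimal sketch; the peak discharge and drainage area in the example are hypothetical, since the abstract does not report the watershed areas:

```python
def unit_area_peak_discharge(q_peak_m3_per_s, area_km2):
    """Peak discharge normalized by drainage area, in m^3 s^-1 km^-2."""
    return q_peak_m3_per_s / area_km2

# Hypothetical example: a 12 m^3/s peak from a 0.5 km^2 burned sub-watershed
q_1996 = unit_area_peak_discharge(12.0, 0.5)  # 24.0, the 1996 maximum quoted above

# The 1997-to-2000 decline for similar rainfall corresponds to a roughly 60-fold drop
recovery_factor = 6.6 / 0.11
```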

The Temporary Staffing Industry: Growth Imperatives and Limits to Contingency,

Nik Theodore
Abstract: The temporary staffing industry (TSI) in the United States has enjoyed explosive growth since the 1970s, during which time the market for temporary labor has become increasingly complex and diverse. Rather than focus, as has typically been done, on the wider labor market effects of this sustained expansion in temporary employment, this article explores patterns and processes of industrial restructuring in the TSI itself. The analysis reveals a powerfully recursive relationship among evolving TSI business practices, the industry's strategies for building and extending the market, and urban labor market outcomes as the sector has grown through a series of qualitatively differentiated phases of development or "modes of growth." Moreover, the distinctive character of the TSI's geographic rollout raises a new set of questions concerning, inter alia, the links between temping and labor market deregulation, the nature of local competition, the scope for and limits of value-adding strategies, and the emerging global structure of the temp market. This idiosyncratic industry, which has been a conspicuous beneficiary of growing economic instability, has, throughout the past three decades, restructured continuously through a period of sustained but highly uneven growth. In so doing, it has proved to be remarkably inventive in extending the market for contingent labor, but has encountered a series of (possibly structural) obstacles to further expansion in its domestic market. These obstacles, in turn, have triggered an unprecedented phase of international integration in the TSI, along with a new mode of development: global growth. [source]

Predicting ready biodegradability in the Japanese ministry of international trade and industry test

Jay Tunkel
Abstract Two new predictive models for assessing a chemical's biodegradability in the Japanese Ministry of International Trade and Industry (MITI) ready biodegradation test have been developed. The new methods use an approach similar to that in the existing BIOWIN© program, in which the probability of rapid biodegradation is estimated by means of multiple linear or nonlinear regression against counts of 36 chemical substructures (molecular fragments) plus molecular weight (mol wt). The data set used to develop the new models consisted of results (pass/no pass) from the MITI test for 884 discrete organic chemicals. This data set was first divided into randomly selected training and validation sets, and new coefficients were derived for the training set using the BIOWIN fragment library and mol wt as independent variables. Based on these results, the fragment library was then modified by deleting some fragments and adding or refining others, and the new set of independent variables (42 substructures and mol wt) was fit to the MITI data. The resulting linear and nonlinear regression models accurately classified 81% of the chemicals in an independent validation set. Like the established BIOWIN models, the MITI models are intended for use in chemical screening and in setting priorities for further review. [source]
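The BIOWIN-style models described in this abstract score a molecule by regressing pass/no-pass outcomes against substructure counts and molecular weight. Below is a minimal sketch of the nonlinear (logistic) form; the fragment names and all coefficients are hypothetical placeholders, not the fitted MITI values:

```python
import math

def miti_pass_probability(fragment_counts, mol_wt, coeffs, intercept, mw_coeff):
    """BIOWIN-style nonlinear (logistic) model: probability that a chemical
    passes the MITI ready-biodegradation test, computed from substructure
    counts and molecular weight. All coefficients here are hypothetical."""
    z = intercept + mw_coeff * mol_wt
    z += sum(coeffs[frag] * count for frag, count in fragment_counts.items())
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical fragments: esters tend to raise, aromatic rings to lower, the score
coeffs = {"ester": 1.2, "aromatic_ring": -0.8}
p = miti_pass_probability({"ester": 1, "aromatic_ring": 2}, mol_wt=150.0,
                          coeffs=coeffs, intercept=0.5, mw_coeff=-0.005)
# p lies in (0, 1); screening practice classifies against a 0.5 cutoff
```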

Fruit Colour Preferences of Redwings (Turdus iliacus): Experiments with Hand-Raised Juveniles and Wild-Caught Adults

ETHOLOGY, Issue 6 2004
Johanna Honkavaara
Certain fruit colours and their contrast with the background coloration are suggested to attract frugivorous birds. To test the attractiveness of different colours, we performed three experiments in the laboratory under controlled light conditions. In the first two experiments, we studied the fruit colour preferences of naive juvenile redwings. In the third experiment, we investigated whether the contrast of the fruit colour with the background coloration affects the preferences of both naive juveniles and experienced adult redwings. In the first experiment, juvenile birds preferred black, UV-blue, and red berries to white ones. In pairwise trials, a new set of juveniles still preferred red berries to white ones. When testing the effect of contrasts on their choice, juveniles preferred UV-blue berries to red ones on a UV-blue background. However, no preference was found when the background was either red or green. Adult redwings preferred UV-blue berries to red ones on all backgrounds. According to these results, juveniles seem to have an innate avoidance of white berries. Furthermore, the foraging decisions of fruit-eating birds are affected more by fruit colour than by its contrast with the background coloration, at least when contrasting displays are encountered from relatively short distances. Differences in the preferences of adult and juvenile birds also indicate that learning plays a role in fruit choices. [source]

Rational design of new CpG oligonucleotides that combine B cell activation with high IFN-α induction in plasmacytoid dendritic cells

Gunther Hartmann
Abstract Two different types of CpG motif-containing oligonucleotides (CpG ODN) have been described: CpG-A, with high induction of IFN-α in plasmacytoid dendritic cells; and CpG-B, with little induction of IFN-α but potent activation of B cells. In this study, we demonstrate that CpG-A fail to activate B cells unless plasmacytoid dendritic cells are present. We identified a new set of CpG ODN sequences which induces high levels of IFN-α in plasmacytoid dendritic cells but remains capable of directly activating B cells. These new CpG ODN (termed CpG-C) are more potent stimulants of B cells than CpG-B owing to their ability to activate B cells both directly and indirectly (via plasmacytoid dendritic cells). The sequence of CpG-C combines structural elements of both CpG-A and CpG-B. The most potent sequence, M362, contains a 5′-end 'TCGTCG motif' and a 'GTCGTT motif', both of which are present in CpG-B (ODN 2006); a palindromic sequence characteristic of CpG-A (ODN 2216); but no poly-G motif, which is required for CpG-A. In conclusion, we defined the first CpG-containing sequences that potently activate both TLR9-expressing immune cell subsets in humans, the plasmacytoid dendritic cell and the B cell. CpG-C may allow for improved therapeutic immunomodulation in vivo. [source]
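The structural features attributed to CpG-C in this abstract (a 5′ TCGTCG start, a GTCGTT motif, a palindromic stretch, no poly-G run) can be screened for mechanically. A minimal sketch; the example 23-mer is hypothetical and is not the M362 sequence:

```python
def revcomp(seq):
    """Reverse complement of a DNA string."""
    comp = {"A": "T", "T": "A", "G": "C", "C": "G"}
    return "".join(comp[b] for b in reversed(seq))

def has_palindrome(seq, min_len=6):
    """True if seq contains a stretch equal to its own reverse complement."""
    n = len(seq)
    for length in range(min_len, n + 1, 2):  # palindromes have even length
        for start in range(n - length + 1):
            sub = seq[start:start + length]
            if sub == revcomp(sub):
                return True
    return False

def looks_like_cpg_c(seq):
    """Check the CpG-C structural features described in the abstract.
    Hypothetical helper, not a validated ODN classifier."""
    return (seq.startswith("TCGTCG")    # 5'-end TCGTCG motif (as in CpG-B)
            and "GTCGTT" in seq         # GTCGTT motif (as in CpG-B)
            and has_palindrome(seq, 6)  # palindromic stretch (as in CpG-A)
            and "GGGG" not in seq)      # no poly-G run

example = "TCGTCGAACGTTCGAGTCGTTAA"  # hypothetical sequence, not M362
print(looks_like_cpg_c(example))  # True
```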

Perirhinal cortex neuronal activity related to long-term familiarity memory in the macaque

Christian Hölscher
Abstract Lesion studies suggest that the perirhinal cortex plays a role in object recognition memory. To analyse its role, the activity of single neurons in the perirhinal cortex was recorded in three rhesus monkeys (Macaca mulatta) performing a delayed matching-to-sample task with up to three intervening stimuli. A set of familiar visual stimuli was used. Some neurons had activity related to working memory, in that they responded more to the sample than to the match image within a trial, as shown previously. However, when a novel set of stimuli was introduced, the neuronal responses were on average only 47% of the magnitude of the responses to the familiar set of stimuli. Moreover, it was shown in eight different replications in three monkeys that the responses of the perirhinal cortex neurons gradually increased over hundreds of presentations of the new set of (initially novel) stimuli to become as large as with the already familiar stimuli. The mean number of 1.3-s presentations to induce this effect was 400, occurring over 7–13 days. These results show that perirhinal cortex neurons represent the very long-term familiarity of visual stimuli. A representation of the long-term familiarity of visual stimuli may be important for many aspects of social behaviour, and part of the impairment in temporal lobe amnesia may be related to the difficulty of building representations of the degree of familiarity of stimuli. [source]

From doves to hawks: A spatial analysis of voting in the Monetary Policy Committee of the Bank of England

This article examines the making of monetary policy in the United Kingdom between 1997 and 2008 by analysing voting behaviour in the Bank of England's Monetary Policy Committee (MPC). It provides a new set of measures for the monetary policy preferences of individual MPC members by estimating a Bayesian item response model. The article demonstrates the usefulness of these measures by comparing the ideal points of outgoing MPC members with their successors and by looking at changes over time in the median ideal point on the MPC. The analysis indicates that the British Government has been able to move the position of the median voter on the MPC through its appointments to the Committee. This highlights the importance of central bank appointments for monetary policy. [source]
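A Bayesian item response model of this kind treats each rate decision as an "item" with a cut-point, and each member's propensity to vote for the higher rate as a latent ideal point. A minimal sketch of the idea with a crude grid-based MAP estimate; all vote data and item parameters below are hypothetical, not the paper's estimates:

```python
import math

def vote_probability(ideal_point, cutpoint, discrimination=1.0):
    """Two-parameter item-response model: probability that an MPC member
    votes for the higher interest rate on a given decision."""
    z = discrimination * (ideal_point - cutpoint)
    return 1.0 / (1.0 + math.exp(-z))

def map_ideal_point(votes, cutpoints, grid=None):
    """Crude maximum a posteriori estimate of an ideal point on a grid,
    with a standard normal prior (hawkish = positive by convention)."""
    if grid is None:
        grid = [x / 100.0 for x in range(-300, 301)]
    def log_post(theta):
        lp = -0.5 * theta ** 2  # N(0, 1) prior, up to a constant
        for v, c in zip(votes, cutpoints):
            p = vote_probability(theta, c)
            lp += math.log(p) if v else math.log(1.0 - p)
        return lp
    return max(grid, key=log_post)

# Hypothetical member who backs the rate rise on the three easier items
theta = map_ideal_point(votes=[1, 1, 1, 0], cutpoints=[-1.0, -0.5, 0.0, 2.0])
```

Comparing such estimates for outgoing members and their successors, as the article does, then reduces to comparing the recovered ideal points.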

Education-based group identity and consciousness in the authoritarian-libertarian value conflict

The increasing importance of New Politics or authoritarian-libertarian values to electoral behaviour in advanced Western industrial democracies, and the previously documented strong link between such values and educational attainment, indicates that, contrary to the claims of some New Politics theorists, the ideological conflict is anchored in the social structure, in particular in educational groups. For this interpretation to be warranted, however, it should be possible to document the existence of education-based group identity and group consciousness related to the value conflict. The article develops indicators of the core variables out of Social Identity Theory. Based on a unique survey from Denmark, which includes the new set of indicators, the analyses show that members of the high and low education groups have developed both group identity and consciousness reflecting a conflict between the groups, and that these factors are related to authoritarian-libertarian values. The results are interpreted as reflecting a relationship of dominance, which supports the view that the ideological conflict is structurally anchored. [source]

Toric duality, Seiberg duality and Picard-Lefschetz transformations

S. Franco
Toric Duality arises as an ambiguity in computing the quiver gauge theory living on a D3-brane which probes a toric singularity. It is reviewed how, in simple cases, Toric Duality is Seiberg Duality. The set of all Seiberg Dualities on a single node in the quiver forms a group which is contained in a larger group given by a set of Picard-Lefschetz transformations. This leads to elements in the group (sometimes called fractional Seiberg duals) which are not Seiberg Duality on a single node, thus providing a new set of gauge theories which flow to the same universality class in the infrared. [source]

Accelerated rehabilitation of an edentulous patient with an implant retained dental prosthesis: a case report

Gerald McKenna
This case report details the successful rehabilitation of an edentulous patient using a complete upper prosthesis and a lower implant-retained overdenture. The provision of care was split between a specialist centre and a primary care setting, an approach that reduced inconvenience to the patient. Modern surgical and prosthodontic techniques also reduced the total delivery time. After the initial consultation, a new set of complete dentures was prescribed, with changes in design from the originals, and placement of two mandibular implants was planned to stabilise and retain the mandibular denture. The first line of treatment involved provision of the new dentures, constructed by the patient's general dental practitioner. Dental implants were then placed in a specialist centre, and the patient returned to the dental practice for attachment of the lower denture to the implants. The benefits and success of mandibular implant-retained dentures are well documented. On delivery of the overdenture, the patient reported increased satisfaction with his prostheses, which allowed him to eat a greater range of foods and gave him confidence when speaking and socialising. [source]

Learning a new leadership game plan at Paychex

Dan Heffernan
The rules for success have changed at perennially successful Paychex. Greater operational and product/service complexity requires a new leadership style to produce the strong performance stakeholders have come to expect. Through a new set of leadership development programs, the Paychex operations organization is rapidly reshaping its leadership culture by setting expectations and building competencies for proactive problem solving, open dialogue, and greater accountability. © 2005 Wiley Periodicals, Inc. [source]

Improvement of the basic correlating equations and transition criteria of natural convection heat transfer

Shi-Ming Yang
Abstract In this paper, improvements in the basic physical laws of natural convection heat transfer were implemented in two major respects by incorporating recent research findings in this field. A preferred transition criterion was adopted to correlate all of the experimental data. Since transition correlations are primarily flow stability problems, the Grashof number, instead of the Rayleigh number, was found to be the preferred criterion. Furthermore, in the case of natural convection heat transfer from a horizontal cylinder, a series of experimental data in the high-Rayleigh-number regions recently became available. These data made it possible to establish new reliable correlations and also to test the validity of previous correlations. It is concluded that the previous correlation for a horizontal cylinder in high-Rayleigh-number regions was based on unreliable experimental results. The transition for a horizontal cylinder occurred at much higher values of the Rayleigh number than the previous recommendation. In the case of natural convection heat transfer from a vertical plate, more accurate property values for air under pressurized conditions are now available. This made it possible to replot the reliable data of Saunders. From this result and the experimental results of Warner and Arpaci, a new set of basic correlations in natural convection heat transfer for the laminar, transitional, and turbulent regimes is recommended. These recommendations reflect a better understanding of the basic physical laws in the field of heat convection. © 2001 Scripta Technica, Heat Trans Asian Res, 30(4): 293–300, 2001 [source]
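The abstract's distinction between the Grashof and Rayleigh numbers can be illustrated by computing both groups from fluid properties. A minimal sketch; the air property values and the ~10⁹ transition threshold are illustrative textbook figures, not the paper's new correlations:

```python
def grashof(g, beta, delta_t, length, nu):
    """Grashof number, the ratio of buoyancy to viscous forces:
    Gr = g * beta * dT * L^3 / nu^2"""
    return g * beta * delta_t * length ** 3 / nu ** 2

def rayleigh(gr, pr):
    """Rayleigh number: Ra = Gr * Pr."""
    return gr * pr

# Air near 300 K at a 0.2 m vertical plate with a 10 K temperature difference
g, beta, nu, pr = 9.81, 1.0 / 300.0, 1.6e-5, 0.71
gr = grashof(g, beta, 10.0, 0.2, nu)
ra = rayleigh(gr, pr)
# gr is about 1.0e7, well below the commonly quoted ~1e9 vertical-plate
# transition, so the boundary layer would still be laminar by that criterion
```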

Hippocampal granule cells opt for early retirement

HIPPOCAMPUS, Issue 10 2010
C.B. Alme
Abstract Increased excitability and plasticity of adult-generated hippocampal granule cells during a critical period suggests that they may "orthogonalize" memories according to time. One version of this "temporal tag" hypothesis suggests that young granule cells are particularly responsive during a specific time period after their genesis, allowing them to play a significant role in sculpting CA3 representations, after which they become much less responsive to any input. An alternative possibility is that the granule cells active during their window of increased plasticity and excitability become selectively tuned to events that occurred during that time and participate in later reinstatement of those experiences, to the exclusion of other cells. To discriminate between these possibilities, rats were exposed to different environments at different times over many weeks, and cell activation was subsequently assessed during a single session in which all environments were revisited. Dispersing the initial experiences in time did not lead to the increase in total recruitment at reinstatement time predicted by the selective tuning hypothesis. The data indicate that, during a given time frame, only a very small number of granule cells participate in many experiences, with most not participating significantly in any. Based on these and previous data, the small excitable population of granule cells probably corresponds to the most recently generated cells. It appears that, rather than contributing to the recollection of long past events, most granule cells, possibly 90–95%, are effectively "retired." If granule cells indeed sculpt CA3 representations (which remains to be shown), then a possible consequence of having a new set of granule cells participate when old memories are reinstated is that new representations of these experiences might be generated in CA3.
Whatever the case, the present data may be interpreted to undermine the standard "orthogonalizer" theory of the role of the dentate gyrus in memory. © 2010 Wiley-Liss, Inc. [source]

Octree-based reasonable-quality hexahedral mesh generation using a new set of refinement templates

Yasushi Ito
Abstract An octree-based mesh generation method is proposed to create reasonable-quality, geometry-adapted unstructured hexahedral meshes automatically from triangulated surface models without any sharp geometrical features. A new, easy-to-implement, easy-to-understand set of refinement templates is developed to perform local mesh refinement efficiently even for concave refinement domains without creating hanging nodes. A buffer layer is inserted on an octree core mesh to improve the mesh quality significantly. Laplacian-like smoothing, angle-based smoothing and local optimization-based untangling methods are used with certain restrictions to further improve the mesh quality. Several examples are shown to demonstrate the capability of our hexahedral mesh generation method for complex geometries. Copyright © 2008 John Wiley & Sons, Ltd. [source]

Fully hierarchical divergence-conforming basis functions on tetrahedral cells, with applications

Matthys M. Botha
Abstract A new set of hierarchical, divergence-conforming, vector basis functions on curvilinear tetrahedrons is presented. The basis can model both mixed- and full-order polynomial spaces to arbitrary order, as defined by Raviart and Thomas, and Nédélec. Solenoidal and non-solenoidal components are separately represented on the element, except in the case of the mixed first-order space, for which a decomposition procedure on the global, mesh-wide level is presented. Therefore, the hierarchical aspect of the basis can be made to extend down to zero polynomial order. The basis can be used to model divergence-conforming quantities, such as electromagnetic flux and current density, fluid velocity, etc., within numerical methods such as the finite element method (FEM) or integral equation-based methods. The basis is ideally suited to p-adaptive analysis. The paper concludes with two example applications. The first is the FEM-based solution of the linearized acoustic vector wave equation, where it is shown how the decomposition into solenoidal components and their complements can be used to stabilize the method at low frequencies. The second is the solution of the electric-field volume integral equation for electromagnetic scattering analysis, where the benefits of the decomposition are again demonstrated. Copyright © 2006 John Wiley & Sons, Ltd. [source]

A promising boundary element formulation for three-dimensional viscous flow

Xiao-Wei Gao
Abstract In this paper, a new set of boundary-domain integral equations is derived from the continuity and momentum equations for three-dimensional viscous flows. The primary variables involved in these integral equations are velocity, traction, and pressure. The final system of equations entering the iteration procedure only involves velocities and tractions as unknowns. In the use of the continuity equation, a complex-variable technique is used to compute the divergence of velocity for internal points, while the traction-recovery method is adopted for boundary points. Although the derived equations are valid for steady, unsteady, compressible, and incompressible problems, the numerical implementation is only focused on steady incompressible flows. Two commonly cited numerical examples and one practical pipe flow problem are presented to validate the derived equations. Copyright © 2004 John Wiley & Sons, Ltd. [source]

A σ-coordinate three-dimensional numerical model for surface wave propagation

Pengzhi Lin
Abstract A three-dimensional numerical model based on the full Navier–Stokes equations (NSE) in σ-coordinates is developed in this study. The σ-coordinate transformation is first introduced to map the irregular physical domain, with its wavy free surface and uneven bottom, onto a regular computational domain with the shape of a rectangular prism. Using the chain rule of partial differentiation, a new set of governing equations is derived in the σ-coordinate system from the original NSE defined in Cartesian coordinates. The operator splitting method (Li and Yu, Int. J. Num. Meth. Fluids 1996; 23: 485–501), which splits the solution procedure into advection, diffusion, and propagation steps, is used to solve the modified NSE. The model is first tested for mass and energy conservation, as well as mesh convergence, using an example of water sloshing in a confined tank. Excellent agreement between numerical results and analytical solutions is obtained. The model is then used to simulate two- and three-dimensional solitary waves propagating in constant depth. Very good agreement between numerical results and analytical solutions is obtained for both free-surface displacements and velocities. Finally, a more realistic case of a periodic wave train passing over a submerged breakwater is simulated. Comparisons between numerical results and experimental data are promising. The model is proven to be an accurate tool for subsequent studies of wave-structure interaction. Copyright © 2002 John Wiley & Sons, Ltd. [source]
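The essence of the σ-coordinate transformation is a stretching of the vertical axis so that the wavy free surface and uneven bottom both map to flat coordinate planes. A minimal sketch, assuming the common convention σ = (z + h)/(η + h) with σ = 0 at the bottom z = −h and σ = 1 at the free surface z = η (the paper's exact convention may differ):

```python
def to_sigma(z, eta, h):
    """Map physical elevation z to sigma, given free-surface elevation
    eta and still-water depth h at this horizontal location.
    sigma = 0 at the bottom (z = -h), sigma = 1 at the surface (z = eta)."""
    return (z + h) / (eta + h)

def to_z(sigma, eta, h):
    """Inverse mapping: recover physical elevation z from sigma."""
    return sigma * (eta + h) - h
```

Because σ depends on the local η and h, derivatives in the transformed equations pick up extra metric terms via the chain rule, which is exactly why a new set of governing equations must be derived in the σ system.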

The development of a new set of long-term climate averages for the UK

Matthew Perry
Abstract Monthly and annual long-term average datasets of 13 climate variables are generated for the periods 1961–90 and 1971–2000 using a consistent analysis method. Values are produced for each station in the Met Office's observing network and for a rectangular grid of points covering the UK at a horizontal spacing of 1 km. The variables covered are mean, maximum, minimum, grass minimum and soil temperature, days of air and ground frost, precipitation, days with rain exceeding 0.2 and 1 mm, sunshine, and days with thunder and snow cover. Gaps in the monthly station data are filled with estimates obtained via regression relationships with a number of well-correlated neighbours, and long-term averages are then calculated for each site. Gridded datasets are created by inverse-distance-weighted interpolation of regression residuals obtained from the station averages. This method does not work well for days of frost, thunder and snow, so an alternative approach is used. This involves first producing a grid of values for each month from the available station data. The gridded long-term average datasets are then obtained by averaging the monthly grids. The errors associated with each stage in the process are assessed, including verification of the gridding stage by leaving out a set of stations. The estimation of missing values allows a dense network of stations to be used, and this, along with the range of independent variables used in the regression, allows detailed and accurate climate datasets and maps to be produced. The datasets have a range of applications, and the maps are freely available through the Met Office website. © Crown Copyright 2005. Reproduced with the permission of Her Majesty's Stationery Office. Published by John Wiley & Sons, Ltd. [source]
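The gridding step the abstract describes, inverse-distance-weighted interpolation of station residuals onto the 1 km grid, can be sketched as follows. The power parameter and the station data are illustrative assumptions; the Met Office analysis uses its own weighting and neighbour-selection details:

```python
import math

def idw(x, y, stations, p=2.0):
    """Inverse-distance-weighted estimate at grid point (x, y).

    stations: list of (sx, sy, value) tuples, e.g. regression residuals
    at station locations. p is the (assumed) distance-decay power.
    """
    num = den = 0.0
    for sx, sy, v in stations:
        d = math.hypot(x - sx, y - sy)
        if d == 0.0:
            return v  # grid point coincides with a station: use it exactly
        w = 1.0 / d**p
        num += w * v
        den += w
    return num / den
```

Interpolating the *residuals* rather than the raw station averages means the regression (on elevation, coastal proximity, and so on) carries the systematic spatial variation, and IDW only has to smooth what the regression leaves behind.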

Improved process monitoring using nonlinear principal component models

David Antory
This paper presents two new approaches for use in complete process monitoring. The first concerns the identification of nonlinear principal component models. This involves the application of linear principal component analysis (PCA), prior to the identification of a modified autoassociative neural network (AAN) as the required nonlinear PCA (NLPCA) model. The benefits are that (i) the number of the reduced set of linear principal components (PCs) is smaller than the number of recorded process variables, and (ii) the set of PCs is better conditioned as redundant information is removed. The result is a new set of input data for a modified neural representation, referred to as a T2T network. The T2T NLPCA model is then used for complete process monitoring, involving fault detection, identification and isolation. The second approach introduces a new variable reconstruction algorithm, developed from the T2T NLPCA model. Variable reconstruction can enhance the findings of the contribution charts still widely used in industry by reconstructing the outputs from faulty sensors to produce more accurate fault isolation. These ideas are illustrated using recorded industrial data relating to developing cracks in an industrial glass melter process. A comparison of linear and nonlinear models, together with the combined use of contribution charts and variable reconstruction, is presented. © 2008 Wiley Periodicals, Inc. [source]
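The linear PCA pre-processing step described above, projecting the recorded process variables onto a smaller, better-conditioned set of principal components before the autoassociative network is identified, can be sketched as below. The toy data, the SVD-based implementation, and the number of retained components are assumptions for illustration:

```python
import numpy as np

def pca_reduce(X, n_components):
    """Center X (samples x variables) and project onto the leading
    principal components, returning the score matrix T."""
    Xc = X - X.mean(axis=0)
    # SVD of the centered data: rows of Vt are the principal directions,
    # ordered by decreasing explained variance.
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T

# Illustrative stand-in for recorded process variables.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))
T = pca_reduce(X, 3)  # reduced, decorrelated scores feed the NLPCA network
```

Feeding the network the scores T instead of the raw 10 variables removes redundant (collinear) information first, which is the conditioning benefit the paper cites for its T2T architecture.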

Visualization of dynamic systems with performance maps: A rough computing approach

James J. Alpigini
A visualization technique titled the "performance map" is considered, which is derived from the Julia set common in the visualization of iterative chaos. Such maps are generated automatically and require a minimum of a priori knowledge of the system under evaluation. By the use of intuitively derived evaluation rules combined with color coding, they convey a wealth of information to the informed user about dynamic behaviors of a system that may be hidden from all but the expert analyst. The concept of rough sets is then presented and used to derive a new set of rules to affect map generation. This derivation serves to formalize rule generation and further serves to minimize the number of variables to test during the system evaluation. © 2002 John Wiley & Sons, Inc. [source]

Estimated drug use based on direct questioning and open-ended questions: responses in the 2006 National Survey on Drug Use and Health

Larry A. Kroutil
Abstract Substance use surveys may use open-ended items to supplement questions about specific drugs and obtain more exhaustive information on illicit drug use. However, these questions are likely to underestimate the prevalence of use of specific drugs. Little is known about the extent of such underestimation or the groups most prone to under-reporting. Using data from the 2006 National Survey on Drug Use and Health (NSDUH), a civilian, non-institutionalized population survey of persons aged 12 or older in the United States, we compared drug use estimates based on open-ended questions with estimates from a new set of direct questions that occurred later in the interview. For the drugs covered by the direct questions, estimates of lifetime drug use based on open-ended questions were often at least seven times lower than those based on direct questions. Among adults identified in the direct questions as substance users, lower educational levels were consistently associated with non-reporting of use in the open-ended questions. Given NSDUH's large annual sample size (~67,000 interviews), combining data across future survey years could increase our understanding of characteristics associated with non-reporting of use in open-ended questions and allow drug use trends to be extrapolated to survey years in which only open-ended question data are available. Copyright © 2010 John Wiley & Sons, Ltd. [source]