Scientific Community (scientific + community)



Selected Abstracts


Editor's Introduction. Autonomy of Inquiry: Shaping the Future of Emerging Scientific Communities

MANAGEMENT AND ORGANIZATION REVIEW, Issue 1 2009
Anne S. Tsui
Abstract Over two decades, research in Chinese management has exploited existing questions, theories, constructs, and methods developed in the Western context. Lagging are exploratory studies to address questions relevant to Chinese firms and to develop theories that offer meaningful explanations of Chinese phenomena. Framed as a debate between pursuing a theory of Chinese management versus a Chinese theory of management, this forum, through the voices of thirteen scholars, provides an analysis of the reasons for the current status of Chinese management research and offers alternatives to shape the future of Chinese management studies. Based on the principle of autonomy of inquiry and heeding the warning of the constraint of normal science, the Chinese management research community can shape its own future by engaging in research that may contribute to global management knowledge and address meaningful local management problems. [source]


The European Research Area: On the Way Towards a European Scientific Community?

EUROPEAN LAW JOURNAL, Issue 5 2006
Álvaro De Elera
The aim of the European Research Area was to create an 'internal market of research', in contrast with previous efforts in research policy that amounted to continued fragmentation. Lack of support from both Member States and the Council, together with the almost exclusive use of the Open Method of Coordination for the design of the Area, meant that the initially high ambitions were not met. The social repercussions of the project were also watered down as a consequence. [source]


Open Letter to the Scientific Community of Mycologists.

PLANT BIOLOGY, Issue 6 2000
Inputs from Referees Requested
No abstract is available for this article. [source]


Using Web 2.0 for scientific applications and scientific communities

CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 5 2009
Marlon E. Pierce
Abstract Web 2.0 approaches are revolutionizing the Internet, blurring lines between developers and users and enabling collaboration and social networks that scale into the millions of users. As discussed in our previous work, the core technologies of Web 2.0 effectively define a comprehensive distributed computing environment that parallels many of the more complicated service-oriented systems such as Web service and Grid service architectures. In this paper we build upon this previous work to discuss the applications of Web 2.0 approaches to four different scenarios: client-side JavaScript libraries for building and composing Grid services; integrating server-side portlets with 'rich client' AJAX tools and Web services for analyzing Global Positioning System data; building and analyzing folksonomies of scientific user communities through social bookmarking; and applying microformats and GeoRSS to problems in scientific metadata description and delivery. Copyright © 2009 John Wiley & Sons, Ltd. [source]
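As a concrete illustration of the last scenario, the sketch below builds a GeoRSS-annotated Atom entry for a hypothetical GPS-station record using Python's standard library and the GeoRSS "Simple" point encoding; the station name and coordinates are invented, and this is not code from the authors' services.

```python
# Illustrative sketch: emit a GeoRSS-annotated Atom entry for a hypothetical
# GPS station record. The georss:point element follows the GeoRSS "Simple"
# encoding ("latitude longitude", space separated).
import xml.etree.ElementTree as ET

ATOM = "http://www.w3.org/2005/Atom"
GEORSS = "http://www.georss.org/georss"
ET.register_namespace("", ATOM)
ET.register_namespace("georss", GEORSS)

def gps_entry(station_id: str, lat: float, lon: float) -> ET.Element:
    entry = ET.Element(f"{{{ATOM}}}entry")
    ET.SubElement(entry, f"{{{ATOM}}}title").text = f"Daily position solution for {station_id}"
    ET.SubElement(entry, f"{{{ATOM}}}id").text = f"urn:example:gps:{station_id}"
    ET.SubElement(entry, f"{{{GEORSS}}}point").text = f"{lat} {lon}"
    return entry

# Example usage with made-up values
print(ET.tostring(gps_entry("P473", 33.96, -117.32), encoding="unicode"))
```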


Scientific workflow management and the Kepler system

CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 10 2006
Bertram Ludäscher
Abstract Many scientific disciplines are now data and information driven, and new scientific knowledge is often gained by scientists putting together data analysis and knowledge discovery 'pipelines'. A related trend is that more and more scientific communities realize the benefits of sharing their data and computational services, and are thus contributing to a distributed data and computational community infrastructure (a.k.a. 'the Grid'). However, this infrastructure is only a means to an end and ideally scientists should not be too concerned with its existence. The goal is for scientists to focus on development and use of what we call scientific workflows. These are networks of analytical steps that may involve, e.g., database access and querying steps, data analysis and mining steps, and many other steps including computationally intensive jobs on high-performance cluster computers. In this paper we describe characteristics of and requirements for scientific workflows as identified in a number of our application projects. We then elaborate on Kepler, a particular scientific workflow system, currently under development across a number of scientific data management projects. We describe some key features of Kepler and its underlying Ptolemy II system, planned extensions, and areas of future research. Kepler is a community-driven, open source project, and we always welcome related projects and new contributors to join. Copyright © 2005 John Wiley & Sons, Ltd. [source]
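Kepler itself is built in Java on top of Ptolemy II; purely as a language-neutral illustration of the "network of analytical steps" idea, the toy sketch below composes three stages of a workflow in Python. All names and data are invented and it does not represent Kepler's actor model.

```python
# Toy illustration of a scientific workflow as a network of analytical steps:
# a database-query stage feeds an analysis stage, which feeds a summary stage.
from typing import Iterable

def query_database(threshold: float) -> Iterable[dict]:
    # Stand-in for a database access/query step; records are fabricated.
    records = [{"id": i, "value": v} for i, v in enumerate([0.2, 1.5, 3.1, 0.9])]
    return (r for r in records if r["value"] > threshold)

def analyse(records: Iterable[dict]) -> Iterable[float]:
    # Stand-in for a data analysis / mining step.
    return (r["value"] ** 0.5 for r in records)

def summarise(values: Iterable[float]) -> dict:
    vals = list(values)
    return {"n": len(vals), "mean": sum(vals) / len(vals) if vals else None}

# Compose the pipeline; each stage could equally be a remote Grid service call.
print(summarise(analyse(query_database(threshold=1.0))))
```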


SOMATOTYPING, ANTIMODERNISM, AND THE PRODUCTION OF CRIMINOLOGICAL KNOWLEDGE

CRIMINOLOGY, Issue 4 2007
NICOLE RAFTER
This study analyzes the work of William H. Sheldon, the psychologist, physician, and advocate of the study of body types. It investigates how he arrived at his much-repeated finding that a correlation exists between mesomorphy (a stocky, muscular body build) and delinquency and how his ideas were validated and perpetuated. It reviews what Sheldon actually said about the causes of crime; identifies his goals in searching for a relationship between body shape and criminality; explains how he found audiences for his biological theory at a time when sociological approaches dominated criminology; and attempts to understand the current criminological ambivalence about the scientific status of Sheldon's work, even though it was discredited decades ago. I argue that the tripartite structure of Sheldon's thought attracted three different audiences (methodologists, social scientists, and supporters) and that it encouraged the supporters to fund his research without reference to the critiques of the social scientists. I also argue that somatotyping was part of a broader antimodernist reaction within international scientific communities against the dislocations of twentieth-century life. To understand the origins, acceptance, and maintenance of criminological ideas, we need a historical perspective on figures of the past. Positivism may inform us about what is true and false, but we also need to know how truth and falsity have been constructed over time and how the ideas of earlier criminologists were shaped by their personal and social contexts. [source]


Model uncertainty in the ecosystem approach to fisheries

FISH AND FISHERIES, Issue 4 2007
Simeon L. Hill
Abstract Fisheries scientists habitually consider uncertainty in parameter values, but often neglect uncertainty about model structure, an issue of increasing importance as ecosystem models are devised to support the move to an ecosystem approach to fisheries (EAF). This paper sets out pragmatic approaches with which to account for uncertainties in model structure and we review current ways of dealing with this issue in fisheries and other disciplines. All involve considering a set of alternative models representing different structural assumptions, but differ in how those models are used. The models can be asked to identify bounds on possible outcomes, find management actions that will perform adequately irrespective of the true model, find management actions that best achieve one or more objectives given weights assigned to each model, or formalize hypotheses for evaluation through experimentation. Data availability is likely to limit the use of approaches that involve weighting alternative models in an ecosystem setting, and the cost of experimentation is likely to limit its use. Practical implementation of an EAF should therefore be based on management approaches that acknowledge the uncertainty inherent in model predictions and are robust to it. Model results must be presented in ways that represent the risks and trade-offs associated with alternative actions and the degree of uncertainty in predictions. This presentation should not disguise the fact that, in many cases, estimates of model uncertainty may be based on subjective criteria. The problem of model uncertainty is far from unique to fisheries, and a dialogue among fisheries modellers and modellers from other scientific communities will therefore be helpful. [source]
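As a minimal sketch of two of the strategies described above, bounding the possible outcomes across a set of alternative models and producing a weighted prediction from weights assigned to each model, the toy example below uses three invented surrogate models; it is illustrative only and not taken from the paper.

```python
# Three alternative (invented) models of how a stock responds to fishing effort.
models = {
    "prey_dependent":  lambda effort: 100 - 40 * effort,
    "ratio_dependent": lambda effort: 100 - 60 * effort,
    "depensatory":     lambda effort: 100 - 80 * effort ** 1.5,
}
# Illustrative weights expressing relative belief in each structural assumption.
weights = {"prey_dependent": 0.5, "ratio_dependent": 0.3, "depensatory": 0.2}

def predict(effort: float):
    outcomes = {name: m(effort) for name, m in models.items()}
    bounds = (min(outcomes.values()), max(outcomes.values()))   # outcome envelope
    weighted = sum(weights[name] * y for name, y in outcomes.items())
    return outcomes, bounds, weighted

outcomes, bounds, weighted = predict(effort=0.8)
print(bounds, round(weighted, 1))
```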


Making History, Talking about History

HISTORY AND THEORY, Issue 2 2001
José Carlos Bermejo Barrera
Making history (in the sense of writing it) is often set against talking about it, with most historians considering writing history to be better than talking about it. My aim in this article is to analyze the topic of making history versus talking about history in order to understand most historians' evident decision to ignore talking about history. Ultimately my goal is to determine whether it is possible to talk about history with any sense. To this end, I will establish a typology of the different forms of talking practiced by historians, using a chronological approach, from the Greek and Roman emphasis on the visual witness to present-day narrativism and textual analysis. Having recognized the peculiar textual character of the historiographical work, I will then discuss whether one can speak of a method for analyzing historiographical works. After considering two possible approaches, the philosophy of science and literary criticism, I offer my own proposal. This involves breaking the dichotomy between making and talking about history, adopting a fuzzy method that overcomes the isolation of self-named scientific communities, and that destroys the barriers among disciplines that work with the same texts but often from mutually excluding perspectives. Talking about history is only possible if one knows about history and about its sources and methods, but also about the foundations of the other social sciences and about the continuing importance of traditional philosophical problems of Western thought in the fields of history and the human sciences. [source]


Biological indicators of prognosis in Ewing's sarcoma: An emerging role for lectin galactoside-binding soluble 3 binding protein (LGALS3BP)

INTERNATIONAL JOURNAL OF CANCER, Issue 1 2010
Diana Zambelli
Abstract Starting from an experimental model that accounts for the two most important obstacles to successful therapy of Ewing's sarcoma (EWS), chemoresistance and the presence of metastasis at the time of diagnosis, we defined a molecular signature of potential prognostic value. Functional annotation of differentially regulated genes revealed 3 major networks related to cell cycle, cell-to-cell interactions and cellular development. The prognostic impact of 8 genes, representative of these 3 networks, was validated in 56 EWS patients. High mRNA expression levels of HINT1, IFITM2, LGALS3BP, STOML2 and c-MYC were associated with a reduced risk of death and a lower risk of developing metastasis. At multivariate analysis, LGALS3BP, a matricellular protein with a role in tumor progression and metastasis, was the most important predictor of event-free survival and overall survival. The association between LGALS3BP and prognosis was confirmed at the protein level when expression of the molecule was determined in tumor tissues but not in serum, indicating a role for the protein in the local tumor microenvironment. Engineered enhancement of LGALS3BP expression in EWS cells resulted in inhibition of anchorage-independent cell growth and reduction of cell migration and metastasis. Silencing of LGALS3BP expression reverted cell behavior with respect to in vitro parameters, thus providing further functional validation of the genetic data obtained in clinical samples. Thus, we propose LGALS3BP as a novel, reliable indicator of prognosis, and we offer genetic signatures to the scientific communities for cross-validation and meta-analysis, which are indispensable tools for a rare tumor such as EWS. [source]


The COST 723 Action

THE QUARTERLY JOURNAL OF THE ROYAL METEOROLOGICAL SOCIETY, Issue S2 2007
W. A. Lahoz
Abstract An overview is provided of the COST 723 Action, 'Data Exploitation and Modelling of the Upper Troposphere and Lower Stratosphere'. The three working groups are introduced and a summary of Action activities within them is provided. The achievements of the Action are: three international workshops; the LAUTLOS humidity measurement campaign; dedicated meetings to discuss the quality of upper troposphere/lower stratosphere ozone and humidity measurements; two journal special issues; more than 90 papers in the peer-reviewed literature; one international summer school; and a successor COST Action which builds on COST 723. The recommendations made are: for COST to continue to support the short-term scientific missions instrument, as they are perceived to be value for money; to encourage the use of COST money to increase links between COST Actions and other scientific communities; and for the COST secretariat to recommend that Actions consider a summer school instead of a final workshop or meeting. Copyright © 2007 Royal Meteorological Society [source]


Chronic cerebrospinal venous insufficiency and multiple sclerosis

ANNALS OF NEUROLOGY, Issue 3 2010
Omar Khan MD
A chronic state of impaired venous drainage from the central nervous system, termed chronic cerebrospinal venous insufficiency (CCSVI), is claimed to be a pathologic phenomenon exclusively seen in multiple sclerosis (MS). This has invigorated the causal debate of MS and generated immense interest in the patient and scientific communities. A potential shift in the treatment paradigm of MS involving endovascular balloon angioplasty or venous stent placement has been proposed as well as conducted in small patient series. In some cases, it may have resulted in serious injury. In this Point of View, we discuss the recent investigations that led to the description of CCSVI as well as the conceptual and technical shortcomings that challenge the potential relationship of this phenomenon to MS. The need for conducting carefully designed and rigorously controlled studies to investigate CCSVI has been recognized by the scientific bodies engaged in MS research. Several scientific endeavors examining the presence of CCSVI in MS are being undertaken. At present, invasive and potentially dangerous endovascular procedures as therapy for patients with MS should be discouraged until such studies have been completed, analyzed, and debated in the scientific arena. ANN NEUROL 2010;67:286–290 [source]


A Windows-based interface for teaching image processing

COMPUTER APPLICATIONS IN ENGINEERING EDUCATION, Issue 2 2010
Melvin Ayala
Abstract The use of image processing in research represents a challenge to members of the scientific community who are interested in its various applications but are not familiar with this area of expertise. In academia as well as in industry, fundamental concepts such as image transformations, filtering, noise removal, morphology, and convolution/deconvolution, among others, require extra effort to be understood. Additionally, algorithms for image reading and visualization in computers are not always easy to develop for inexperienced researchers. This environment has led to an adverse situation in which most students and researchers develop their own image processing code for operations that are already standards in image processing, a redundant process which only exacerbates the situation. To resolve this dilemma, this article proposes a user-friendly computer interface with a dual objective: to free students and researchers from the learning time needed to understand and apply diverse imaging techniques, and to provide them with the option to enhance or reprogram such algorithms through direct access to the software code. The interface was thus developed with the intention of assisting in understanding and performing common image processing operations through simple commands that can be performed mostly by mouse clicks. The visualization of pseudo code after each command execution makes the interface attractive, while saving time and helping users learn such practical concepts. © 2009 Wiley Periodicals, Inc. Comput Appl Eng Educ 18: 213–224, 2010; Published online in Wiley InterScience (www.interscience.wiley.com); DOI 10.1002/cae.20171 [source]
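As an example of the kind of standard operation such an interface typically wraps, the sketch below implements a simple 3 x 3 mean (smoothing) filter as a convolution with NumPy; it is a generic illustration, not code from the interface described in the paper.

```python
# Generic illustration of a basic image-processing operation: smoothing an
# image with a k x k averaging kernel (zero-padded borders).
import numpy as np

def mean_filter(image: np.ndarray, k: int = 3) -> np.ndarray:
    pad = k // 2
    padded = np.pad(image, pad, mode="constant")   # zero padding at the borders
    out = np.zeros_like(image, dtype=float)
    for i in range(image.shape[0]):
        for j in range(image.shape[1]):
            out[i, j] = padded[i:i + k, j:j + k].mean()
    return out

# Example usage on a small random "noisy" image.
noisy = np.random.default_rng(0).random((5, 5))
print(mean_filter(noisy).round(2))
```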


A comparison of using Taverna and BPEL in building scientific workflows: the case of caGrid

CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 9 2010
Wei Tan
Abstract With the emergence of 'service-oriented science', the need arises to orchestrate multiple services to facilitate scientific investigation, that is, to create 'science workflows'. We present here our findings in providing a workflow solution for the caGrid service-based grid infrastructure. We choose BPEL and Taverna as candidates, and compare their usability in the lifecycle of a scientific workflow, including workflow composition, execution, and result analysis. Our experience shows that BPEL as an imperative language offers a comprehensive set of modeling primitives for workflows of all flavors, whereas Taverna offers a dataflow model and a more compact set of primitives that facilitates dataflow modeling and pipelined execution. We hope that this comparison study not only helps researchers to select a language or tool that meets their specific needs, but also offers some insight into how a workflow language and tool can fulfill the requirements of the scientific community. Copyright © 2009 John Wiley & Sons, Ltd. [source]
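To make the imperative-versus-dataflow distinction concrete, the following sketch (plain Python, not BPEL or Taverna syntax) writes the same two-step job both ways; in the dataflow version, downstream processing can begin as soon as the first item is produced, which is the pipelined execution the comparison highlights.

```python
# Illustrative contrast between an imperative sequence and a streaming dataflow.
def fetch(n):                      # stand-in for a data-producing service
    for i in range(n):
        yield i

def annotate(x):                   # stand-in for an analysis service
    return {"item": x, "score": x * 2}

# Imperative style: run each step to completion, keep intermediate lists.
items = list(fetch(5))
annotated = [annotate(x) for x in items]

# Dataflow style: compose lazily; items flow through the pipeline one at a time.
pipeline = (annotate(x) for x in fetch(5))

print(annotated == list(pipeline))   # True: same result, different execution model
```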


Reliability in grid computing systems

CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 8 2009
Christopher Dabrowski
Abstract In recent years, grid technology has emerged as an important tool for solving compute-intensive problems within the scientific community and in industry. To further the development and adoption of this technology, researchers and practitioners from different disciplines have collaborated to produce standard specifications for implementing large-scale, interoperable grid systems. The focus of this activity has been the Open Grid Forum, but other standards development organizations have also produced specifications that are used in grid systems. To date, these specifications have provided the basis for a growing number of operational grid systems used in scientific and industrial applications. However, if the growth of grid technology is to continue, it will be important that grid systems also provide high reliability. In particular, it will be critical to ensure that grid systems are reliable as they continue to grow in scale, exhibit greater dynamism, and become more heterogeneous in composition. Ensuring grid system reliability in turn requires that the specifications used to build these systems fully support reliable grid services. This study surveys work on grid reliability that has been done in recent years and reviews progress made toward achieving these goals. The survey identifies important issues and problems that researchers are working to overcome in order to develop reliability methods for large-scale, heterogeneous, dynamic environments. The survey also illuminates reliability issues relating to standard specifications used in grid systems, identifying existing specifications that may need to be evolved and areas where new specifications are needed to better support reliability. Published in 2009 by John Wiley & Sons, Ltd. [source]


Guidelines for Systematic Review in Conservation and Environmental Management

CONSERVATION BIOLOGY, Issue 6 2006
ANDREW S. PULLIN
conservation policy; conservation practice; decision making; evidence-based knowledge transfer Abstract: An increasing number of applied disciplines are utilizing evidence-based frameworks to review and disseminate the effectiveness of management and policy interventions. The rationale is that increased accessibility of the best available evidence will provide a more efficient and less biased platform for decision making. We argue that there are significant benefits for conservation in using such a framework, but the scientific community needs to undertake and disseminate more systematic reviews before the full benefit can be realized. We devised a set of guidelines for undertaking formalized systematic review, based on a health services model. The guideline stages include planning and conducting a review, including protocol formation, search strategy, data inclusion, data extraction, and analysis. Review dissemination is addressed in terms of current developments and future plans for a Web-based open-access library. By the use of case studies we highlight critical modifications to guidelines for protocol formulation, data-quality assessment, data extraction, and data synthesis for conservation and environmental management. Ecological data presented significant but soluble challenges for the systematic review process, particularly in terms of the quantity, accessibility, and diverse quality of available data. In the field of conservation and environmental management there needs to be further engagement of scientists and practitioners to develop and take ownership of an evidence-based framework. [source]
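As an illustration of the data-synthesis stage mentioned in the guidelines, the sketch below pools three invented study effect sizes with a fixed-effect, inverse-variance weighting; it is a generic example of quantitative synthesis, not a procedure prescribed by the guidelines themselves.

```python
# Fixed-effect, inverse-variance pooling of per-study effect sizes.
# The three studies below are invented for the example.
studies = [
    {"effect": 0.40, "se": 0.15},
    {"effect": 0.10, "se": 0.20},
    {"effect": 0.25, "se": 0.10},
]

weights = [1 / s["se"] ** 2 for s in studies]                 # weight = 1 / variance
pooled = sum(w * s["effect"] for w, s in zip(weights, studies)) / sum(weights)
pooled_se = (1 / sum(weights)) ** 0.5

print(f"pooled effect = {pooled:.3f} +/- {1.96 * pooled_se:.3f} (95% CI half-width)")
```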


A common standard for conflict of interest disclosure in addiction journals

ADDICTION, Issue 11 2009
Merrill Goozner
ABSTRACT This paper presents a common standard for conflict of interest disclosure. The common standard was drafted by the authors, following consultation with a multi-disciplinary group of journal editors, publishers, bioethicists and other academics. It is presented here for the benefit of authors, editorial managers, journal editors and peer reviewers to stimulate discussion and to provide guidance to authors in reporting real, apparent and potential conflicts of interest. It is particularly relevant to addiction specialty journals because of the potential conflicts of interest associated with funding from the alcohol, tobacco, pharmaceutical and gambling industries. Following an appropriate period of vetting the common standard within the scientific community, it is recommended that journal editors adopt journal policies and reporting procedures that are consistent across journals. [source]


Automated comparative protein structure modeling with SWISS-MODEL and Swiss-PdbViewer: A historical perspective

ELECTROPHORESIS, Issue S1 2009
Nicolas Guex
Abstract SWISS-MODEL pioneered the field of automated modeling as the first protein modeling service on the Internet. In combination with the visualization tool Swiss-PdbViewer, the Internet-based Workspace and the SWISS-MODEL Repository, it provides a fully integrated sequence to structure analysis and modeling platform. This computational environment is made freely available to the scientific community with the aim to hide the computational complexity of structural bioinformatics and encourage bench scientists to make use of the ever-increasing structural information available. Indeed, over the last decade, the availability of structural information has significantly increased for many organisms as a direct consequence of the complementary nature of comparative protein modeling and experimental structure determination. This has a very positive and enabling impact on many different applications in biomedical research as described in this paper. [source]


Atrazine increases the sodium absorption in frog (Rana esculenta) skin

ENVIRONMENTAL TOXICOLOGY & CHEMISTRY, Issue 2 2006
Giuseppe Cassano
Abstract The presence of atrazine in agricultural sites has been linked to the decline in amphibian populations. The efforts of the scientific community generally are directed toward investigating the long-term effect of atrazine on complex functions (reproduction or respiration), but in the present study, we investigated the short-term effect on the short-circuit current (ISC), a quantitative measure of the ion transport operated by frog (Rana esculenta) skin. Treatment with 5 μM atrazine (1.08 mg/L) does not affect the transepithelial outfluxes of [14C]mannitol or [14C]urea; therefore, atrazine does not damage the barrier properties of frog skin. Atrazine causes a dose-dependent increase in the short-circuit current, with a minimum of 4.64 ± 0.76 μA/cm2 (11.05% ± 1.22%) and a maximum of 12.7 ± 0.7 μA/cm2 (35% ± 2.4%) measured at 10 nM and 5 μM, respectively. An increase in ISC also is caused by 5 μM ametryne, prometryn, simazine, terbuthylazine, or terbutryn (other atrazine derivatives). In particular, atrazine increases the transepithelial 22Na+ influx without affecting the outflux. Finally, stimulation of ISC by atrazine is suppressed by SQ 22536, H89, U73122, 2-aminoethoxydiphenyl borate, and W7 (blockers of adenylate cyclase, protein kinase A, phospholipase C, intracellular Ca2+ increase, and calmodulin, respectively), whereas indomethacin and calphostin C (inhibitors of cyclooxygenase and protein kinase C, respectively) have no effect. [source]


Incidence and impact of axial malformations in larval bullfrogs (Rana catesbeiana) developing in sites polluted by a coal-burning power plant

ENVIRONMENTAL TOXICOLOGY & CHEMISTRY, Issue 4 2000
William A. Hopkins
Abstract Amphibian malformations have recently received much attention from the scientific community, but few studies have provided evidence linking environmental pollution to larval amphibian malformations in the field. We document an increased incidence of axial malformations in bullfrog larvae (Rana catesbeiana) inhabiting two sites contaminated with coal combustion wastes. In the polluted sites, 18 and 37% of larvae exhibited lateral curvatures of the spine, whereas zero and 4% of larvae from two reference sites had similar malformations. Larvae from the most heavily polluted site had significantly higher tissue concentrations of potentially toxic trace elements, including As, Cd, Se, Cu, Cr, and V, compared with conspecifics from the reference sites. In addition, malformed larvae from the most contaminated site had decreased swimming speeds compared with those of normal larvae from the same site. We hypothesize that the complex mixture of contaminants produced by coal combustion is responsible for the high incidence of malformations and associated effects on swimming performance. [source]


Mechanisms of neurodegeneration in Huntington's disease

EUROPEAN JOURNAL OF NEUROSCIENCE, Issue 11 2008
Joana M. Gil
Abstract Huntington's disease (HD) is caused by an expansion of cytosine-adenine-guanine (CAG) repeats in the huntingtin gene, which leads to neuronal loss in the striatum and cortex and to the appearance of neuronal intranuclear inclusions of mutant huntingtin. Huntingtin plays a role in protein trafficking, vesicle transport, postsynaptic signaling, transcriptional regulation, and apoptosis. Thus, a loss of function of the normal protein and a toxic gain of function of the mutant huntingtin contribute to the disruption of multiple intracellular pathways. Furthermore, excitotoxicity, dopamine toxicity, metabolic impairment, mitochondrial dysfunction, oxidative stress, apoptosis, and autophagy have been implicated in the progressive degeneration observed in HD. Nevertheless, despite the efforts of a multidisciplinary scientific community, there is no cure for this devastating neurodegenerative disorder. This review presents an overview of the mechanisms that may contribute to HD pathogenesis. Ultimately, a better understanding of these mechanisms will lead to the development of more effective therapeutic targets. [source]


IS EVOLUTIONARY BIOLOGY STRATEGIC SCIENCE?

EVOLUTION, Issue 1 2007
Thomas R. Meagher
There is a profound need for the scientific community to be better aware of the policy context in which it operates. To address this need, Evolution has established a new Outlook feature section to include papers that explore the interface between society and evolutionary biology. This first paper in the series considers the strategic relevance of evolutionary biology. Support for scientific research in general is based on governmental or institutional expenditure that is an investment, and such investment is based on strategies designed to achieve particular outcomes, such as advance in particular areas of basic science or application. The scientific community can engage in the development of scientific strategies on a variety of levels, including workshops to explicitly develop research priorities and targeted funding initiatives to help define emerging scientific areas. Better understanding and communication of the scientific achievements of evolutionary biology, emphasizing immediate and potential societal relevance, are effective counters to challenges presented by the creationist agenda. Future papers in the Outlook feature section should assist the evolutionary biology community in achieving a better collective understanding of the societal relevance of their field. [source]


The structures of Escherichia coli O-polysaccharide antigens

FEMS MICROBIOLOGY REVIEWS, Issue 3 2006
Roland Stenutz
Abstract Escherichia coli is usually a non-pathogenic member of the human colonic flora. However, certain strains have acquired virulence factors and may cause a variety of infections in humans and in animals. There are three clinical syndromes caused by E. coli: (i) sepsis/meningitis; (ii) urinary tract infection and (iii) diarrhoea. Furthermore, the E. coli causing diarrhoea are divided into different 'pathotypes' depending on the type of disease, i.e. (i) enterotoxigenic; (ii) enteropathogenic; (iii) enteroinvasive; (iv) enterohaemorrhagic; (v) enteroaggregative and (vi) diffusely adherent. The serotyping of E. coli based on the somatic (O), flagellar (H) and capsular polysaccharide antigens (K) is used in epidemiology. The different antigens may be unique for a particular serogroup or antigenic determinants may be shared, resulting in cross-reactions with other serogroups of E. coli or even with other members of the family Enterobacteriaceae. To establish the uniqueness of a particular serogroup or to identify the presence of common epitopes, a database of the structures of O-antigenic polysaccharides has been created. The E. coli database (ECODAB) contains structures, nuclear magnetic resonance chemical shifts and, to some extent, cross-reactivity relationships. All fields are searchable. A ranking is produced based on similarity, which facilitates rapid identification of strains that are difficult to serotype (if known) based on classical agglutinating methods. In addition, results pertinent to the biosynthesis of the repeating units of O-antigens are discussed. The ECODAB is accessible to the scientific community at http://www.casper.organ.su.se/ECODAB/. [source]
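The abstract does not specify how the similarity ranking is computed; purely as an illustration of the idea, the sketch below ranks invented residue compositions against a query using a simple Jaccard index. The scoring and the strain data are placeholders and should not be taken as ECODAB's actual method.

```python
# Toy illustration of similarity-based ranking of O-antigen repeating units:
# candidates are ranked against a query by a Jaccard index over their sets of
# sugar residues. Residue compositions below are invented for the example.
def jaccard(a: set, b: set) -> float:
    union = a | b
    return len(a & b) / len(union) if union else 0.0

query = {"Glc", "GlcNAc", "Rha", "Gal"}
candidates = {
    "strain A": {"Glc", "GlcNAc", "Gal", "Col"},
    "strain B": {"GalNAc", "Fuc", "Per"},
    "strain C": {"Rha", "GlcNAc", "Glc", "Gal"},
}

for name in sorted(candidates, key=lambda k: jaccard(query, candidates[k]), reverse=True):
    print(name, round(jaccard(query, candidates[name]), 2))
```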


The long and winding road from the research laboratory to industrial applications of lactic acid bacteria

FEMS MICROBIOLOGY REVIEWS, Issue 3 2005
Martin Bastian Pedersen
Abstract Research innovations are constantly occurring in universities, research institutions and industrial research laboratories. These are reported in the scientific literature and presented to the scientific community in various congresses and symposia as well as through direct contacts and collaborations. Conversion of these research results to industrially useful innovations is, however, considerably more complex than generally appreciated. The long and winding road from the research laboratory to industrial applications will be illustrated with two recent examples from Chr. Hansen A/S: the implementation in industrial scale of a new production technology based on respiration by Lactococcus lactis and the introduction to the market of L. lactis strains constructed using recombinant DNA technology. [source]


Zeolite Catalysts with Tunable Hierarchy Factor by Pore-Growth Moderators

ADVANCED FUNCTIONAL MATERIALS, Issue 24 2009
Javier Pérez-Ramírez
Abstract The design of hierarchical zeolite catalysts is attempted through the maximization of the hierarchy factor (HF); that is, by enhancing the mesopore surface area without severe penalization of the micropore volume. For this purpose, a novel desilication variant involving NaOH treatment of ZSM-5 in the presence of quaternary ammonium cations is developed. The organic cation (TPA+ or TBA+) acts as a pore-growth moderator in the crystal by OH−-assisted silicon extraction, largely protecting the zeolite crystal during the demetallation process. The protective effect is not seen when using cations that are able to enter the micropores, such as TMA+. Engineering the pore structure at the micro- and mesolevel is essential to optimize transport properties and catalytic performance, as demonstrated in the benzene alkylation with ethylene, a representative mass-transfer limited reaction. The hierarchy factor is an appropriate tool to classify hierarchically structured materials. The latter point is of wide interest to the scientific community as it not only embraces mesoporous zeolites obtained by desilication methods but also enables quantitative comparison and correlation of various materials obtained by different synthetic methodologies. [source]
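The hierarchy factor is commonly defined in the desilication literature as the product of the relative micropore volume and the relative mesopore surface area; taking that definition as an assumption (the abstract itself does not restate it), a minimal sketch with invented porosity values:

```python
# Assumed definition (not quoted from the paper):
# HF = (V_micro / V_pore) * (S_meso / S_BET)
def hierarchy_factor(v_micro, v_pore, s_meso, s_bet):
    return (v_micro / v_pore) * (s_meso / s_bet)

# Illustrative, made-up values for a parent and an alkaline-treated zeolite.
parent  = hierarchy_factor(v_micro=0.17, v_pore=0.25, s_meso=40,  s_bet=420)
treated = hierarchy_factor(v_micro=0.13, v_pore=0.60, s_meso=235, s_bet=460)
print(round(parent, 3), round(treated, 3))  # mesoporosity gained without losing all micropore volume
```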


Quantitative analysis of the scientific literature on acetaminophen in medicine and biology: a 2003–2005 study

FUNDAMENTAL & CLINICAL PHARMACOLOGY, Issue 2 2009
Claude Robert
Abstract This study quantifies the utilization of acetaminophen in life sciences and clinical medicine using bibliometric indicators. A total of 1626 documents involving acetaminophen published by 74 countries during 2003–2005 in the Thomson Scientific Life Sciences and Clinical Medicine collections were identified and analyzed. The USA leads in the number of publications, followed by the UK and other industrialized countries, including France, Japan and Germany; the presence of countries such as China, India and Turkey among the top 15 countries deserves to be noticed. The European Union stands as a comparable contributor to the USA, both in terms of number of publications and in terms of the profile of papers distributed among subcategories of Life Sciences and Clinical Medicine disciplines. All documents were published in 539 different journals. The most prolific journals were related to pharmacology and/or pharmaceutics. All aspects of acetaminophen (chemistry, pharmacokinetics, metabolism, etc.) were studied, with primary interest in therapeutic use (42%) and adverse effects (28%), a large part of the latter publications focusing on acetaminophen hepatotoxicity. This quantitative overview documents the interest of the scientific community in this analgesic and complements the various review documents that regularly appear in the scientific literature. [source]


New pharmacological strategies against metastatic spread

FUNDAMENTAL & CLINICAL PHARMACOLOGY, Issue 5 2008
G.Y. Perret
Abstract Although metastatic spread is the most frequent cause of death in cancer patients, there are very few drugs specifically targeting this process. The basis for a new antimetastatic drug discovery strategy is weak because many unknowns remain in our understanding of the mechanisms of the metastatic cascade. Moreover, the current experimental models are too simplistic and do not account for the complexity of the phenomenon. Some targets have been identified but too few are validated. Among them, the metastasis suppressor genes seem to be the most promising. In spite of this, during recent years, a dozen molecules which fulfil the definition of a specific antimetastatic drug, one that inhibits metastases without altering the growth of the primary tumour (which can be eradicated by surgery), have been identified and assessed as proof of concept. The continuation of this effort would be more efficient if the objectives were defined more precisely. It is particularly important to distinguish molecules that prevent spread of metastatic cells from the early-stage primary tumour from those that induce regression of established metastases or inhibit the transition from disseminated occult tumour cells to dormant micrometastasis. This second goal is a priori more relevant in the current clinical setting, where the detection of early metastatic spread is very difficult, and therefore calls for greater effort on the part of the scientific community. [source]


From clergymen to computers,the advent of virtual palaeontology

GEOLOGY TODAY, Issue 3 2010
Russell J. Garwood
Palaeontology was established as a science in the Victorian era, yet has roots that stretch deeper into the recesses of history. More than 2000 years ago, the Greek philosopher Aristotle deduced that fossil sea shells were once living organisms, and around 500 BC Xenophanes used fossils to argue that many areas of land must have previously been submarine. In 1027, the Persian scholar Avicenna suggested that organisms were fossilized by petrifying fluids; this theory was accepted by most natural philosophers up until the eighteenth-century Enlightenment, and even beyond. The late 1700s were notable for the work of Georges Cuvier, who established the reality of extinction. This, coupled with advances in the recognition of faunal successions made by the canal engineer William Smith, laid the framework for the discipline that would become known as palaeontology. As the nineteenth century progressed, the scientific community became increasingly well organized. Most fossil workers were gentleman scientists and members of the clergy, who self-funded their studies in a new and exciting field. Many of the techniques used to study fossils today were developed during this 'classical' period. Perhaps the most fundamental of these is to expose a fossil by splitting the rock housing it, and then conduct investigations based upon the exposed surface (Fig. 1). This approach has served the science well in the last two centuries, having been pivotal to innumerable advances in our understanding of the history of life. Nevertheless, there are many cases where splitting a rock in this way results in incomplete data recovery: those where the fossils are not flattened, but are preserved in three dimensions. Even the ephemeral soft tissues of organisms are occasionally preserved in a three-dimensional state, for example in the Herefordshire, La Voulte Sûr Rhone and Orsten 'Fossil Lagerstätten' (sites of exceptional fossil preservation). These rare and precious deposits provide a wealth of information about the history of life on Earth, and are perhaps our most important resource in the quest to understand the palaeobiology of extinct organisms. With the aid of twenty-first century technology, we can now make the most of these opportunities through the field of 'virtual palaeontology', computer-aided visualization of fossils. Figure 1. A split nodule showing the fossil within, in this case a cockroachoid insect. Fossil 4 cm long (From Garwood & Sutton, in press). [source]


Three Secondary Reference Materials for Lithium Isotope Measurements: Li7-N, Li6-N and LiCl-N Solutions

GEOSTANDARDS & GEOANALYTICAL RESEARCH, Issue 1 2007
Jean Carignan
reference materials; Li isotopes; Li solutions; QUAD-ICP-MS; MC-ICP-MS The CRPG (Nancy, France) has prepared secondary reference materials for Li isotope measurements by mixing 7Li or 6Li spikes and either L-SVEC or IRMM-016 certified reference materials to produce solutions having a known Li concentration and isotopic composition. The Li7-N and Li6-N solution samples (1.5 mol l−1 HNO3) have nominal δ7Li isotopic compositions of 30.1‰ and −9.7‰, respectively, relative to L-SVEC and concentrations of 100 mg l−1. Repeated measurement of these samples using the QUAD-ICP-MS at the CRPG yielded δ7Li of 30.4 ± 1.1‰ (n = 13) and −8.9 ± 0.9‰ (n = 9) at the 2s level of confidence. An additional LiCl-N solution was measured and yielded a delta value of 9.5 ± 0.6‰ (n = 3). Identical results were obtained at the BRGM (Orléans, France) from determinations performed with a Neptune MC-ICP-MS (30.2 ± 0.3‰, n = 89 for the Li7-N, −8.0 ± 0.3‰, n = 38 for the Li6-N and 10.1 ± 0.2‰, n = 46 for LiCl-N at the 2s level of confidence). The deviation of the measured composition relative to the nominal value for the Li6-N solution might be explained by either contamination during preparation or an error during sample weighing. These secondary reference materials, previously passed through ion exchange resin or directly analysed, may be used for checking the accuracy of Li isotopic measurements over a range of almost 40‰ and will be available to the scientific community upon request to J. Carignan or N. Vigier, CRPG. [source]
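For reference, the δ7Li values quoted above follow the standard per-mil delta convention relative to the L-SVEC reference material, which can be written as:

```latex
% Standard delta notation for Li isotope ratios, reported in parts per thousand
\delta^{7}\mathrm{Li} \;=\;
  \left[
    \frac{\left(^{7}\mathrm{Li}/\,^{6}\mathrm{Li}\right)_{\mathrm{sample}}}
         {\left(^{7}\mathrm{Li}/\,^{6}\mathrm{Li}\right)_{\mathrm{L\text{-}SVEC}}}
    - 1
  \right] \times 1000
```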


The importance of rapid, disturbance-induced losses in carbon management and sequestration

GLOBAL ECOLOGY, Issue 1 2002
David D. Breshears
Abstract Management of terrestrial carbon fluxes is being proposed as a means of increasing the amount of carbon sequestered in the terrestrial biosphere. This approach is generally viewed only as an interim strategy for the coming decades while other longer-term strategies are developed and implemented, the most important being the direct reduction of carbon emissions. We are concerned that the potential for rapid, disturbance-induced losses may be much greater than is currently appreciated, especially by the decision-making community. Here we wish to: (1) highlight the complex and threshold-like nature of disturbances (such as fire and drought, as well as the erosion associated with each) that could lead to carbon losses; (2) note the global extent of ecosystems that are at risk of such disturbance-induced carbon losses; and (3) call for increased consideration of and research on the mechanisms by which large, rapid, disturbance-induced losses of terrestrial carbon could occur. Our lack of ability as a scientific community to predict such ecosystem dynamics precludes the effective incorporation of these processes into strategies and policies related to carbon management and sequestration. Consequently, scientists need to do more to improve quantification of these potential losses and to integrate them into sound, sustainable policy options. [source]


'Brain circulation' and transnational knowledge networks: studying long-term effects of academic mobility to Germany, 1954–2000

GLOBAL NETWORKS, Issue 3 2009
HEIKE JÖNS
Abstract ,Brain circulation' has become a buzzword for describing the increasingly networked character of highly skilled migration. In this article, the concept is linked to academics' work on circular mobility to explore the long-term effects of their research stays in Germany during the second half of the twentieth century. Based on original survey data on more than 1800 former visiting academics from 93 countries, it is argued that this type of brain circulation launched a cumulative process of subsequent academic mobility and collaboration that contributed significantly to the reintegration of Germany into the international scientific community after the Second World War and enabled the country's rise to the most important source for international co-authors of US scientists and engineers in the twenty-first century. In this article I discuss regional and disciplinary specificities in the formation of transnational knowledge networks through circulating academics and suggest that the long-term effects can be fruitfully conceptualized as accumulation processes in ,centres of calculation'. [source]