Scientific Efforts

Selected Abstracts


Forecast verification: current status and future directions

METEOROLOGICAL APPLICATIONS, Issue 1 2008
Dr B. Casati
Abstract Research and development of new verification strategies, together with the reassessment of traditional forecast verification methods, have received a great deal of attention from the scientific community in the last decade. This scientific effort has arisen from the need to respond to changes encompassing several aspects of the verification process, such as the evolution of forecasting systems, or the desire for more meaningful verification approaches that address specific forecast user requirements. Verification techniques that account for the spatial structure and the presence of features in forecast fields, and which are designed specifically for high-resolution forecasts, have been developed. The advent of ensemble forecasts has motivated the re-evaluation of some of the traditional scores and the development of new verification methods for probability forecasts. The expected climatological increase of extreme events and their potential socio-economic impacts have revitalized research studies addressing the challenges of extreme event verification. Verification issues encountered in the operational forecasting environment have been widely discussed, verification needs for different user communities have been identified, and models to assess the forecast value for specific users have been proposed. Proper verification practice and correct interpretation of verification statistics have been extensively promoted through recent publications and books, tutorials and workshops, and the development of open-source software and verification tools. This paper addresses some of the current issues in forecast verification, reviews some of the most recently developed verification techniques, and provides recommendations for future research. Copyright © 2008 Royal Meteorological Society and Crown in the right of Canada. [source]
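
As an illustration of the traditional scores for probability forecasts that the abstract says are being re-evaluated, the following minimal sketch computes the Brier score and its skill relative to climatology for a handful of hypothetical probability forecasts; the data and the choice of score are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch: the Brier score, one of the traditional scores for
# probability forecasts mentioned in the review. All values are hypothetical.
import numpy as np

def brier_score(prob_forecasts, observations):
    """Mean squared difference between forecast probabilities (0..1)
    and binary observations (0 = event did not occur, 1 = it did)."""
    p = np.asarray(prob_forecasts, dtype=float)
    o = np.asarray(observations, dtype=float)
    return np.mean((p - o) ** 2)

def brier_skill_score(prob_forecasts, observations):
    """Skill relative to a climatological reference forecast
    (the observed base rate); 1 is perfect, 0 is no skill."""
    o = np.asarray(observations, dtype=float)
    bs = brier_score(prob_forecasts, o)
    bs_ref = brier_score(np.full_like(o, o.mean()), o)
    return 1.0 - bs / bs_ref

# Hypothetical ensemble-derived rain probabilities and observed outcomes
probs = [0.9, 0.7, 0.2, 0.1, 0.6, 0.05]
obs   = [1,   1,   0,   0,   0,   0   ]
print(f"Brier score: {brier_score(probs, obs):.3f}")
print(f"Brier skill score: {brier_skill_score(probs, obs):.3f}")
```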


Risk assessment of genetically modified crops for nutrition and health

NUTRITION REVIEWS, Issue 1 2009
Javier A. Magaña-Gómez
The risk assessment of genetically modified (GM) crops for human nutrition and health has not been systematic. Evaluations for each GM crop or trait have been conducted using different feeding periods, animal models, and parameters. The most common finding is that GM and conventional sources induce similar nutritional performance and growth in animals. However, adverse microscopic and molecular effects of some GM foods in different organs or tissues have been reported. The diversity of methods and results across these risk assessments reflects the complexity of the subject. While there are currently no standardized methods to evaluate the safety of GM foods, attempts at harmonization are under way. More scientific effort is needed to build confidence in the evaluation and acceptance of GM foods. [source]


Leading science: staying informed without micromanaging (achieving and keeping the balance)

DRUG DEVELOPMENT RESEARCH, Issue 2 2005
Alice M. Sapienza
Abstract Those who lead scientific efforts understand that they must contribute to judgments of both the value and the meaning of new data emerging from the bench. However, there is a fine line between staying informed, on the one hand, and quashing morale and creativity by overmanaging scientists at the bench, on the other. We discuss why we believe that micromanagement may be more likely to occur in science and how it can impact the caliber of scientific output. We also provide guidelines for how leaders can move with the data appropriately and keep the balance between staying abreast of the work and putting the brakes on it. Drug Dev. Res. 64:99-104, 2005. © 2005 Wiley-Liss, Inc. [source]


Quantitative structure-activity relationship methods: Perspectives on drug discovery and toxicology

ENVIRONMENTAL TOXICOLOGY & CHEMISTRY, Issue 8 2003
Roger Perkins
Abstract Quantitative structure-activity relationships (QSARs) attempt to correlate chemical structure with activity using statistical approaches. QSAR models are useful for various purposes, including the prediction of activities of untested chemicals. Quantitative structure-activity relationships and other related approaches have attracted broad scientific interest, particularly in the pharmaceutical industry for drug discovery and in toxicology and environmental science for risk assessment. An assortment of new QSAR methods has been developed during the past decade, most of them focused on drug discovery. Besides advancing our fundamental knowledge of QSARs, these scientific efforts have stimulated their application in a wider range of disciplines, such as toxicology, where QSARs have not yet gained full appreciation. In this review, we attempt to summarize the status of QSAR with emphasis on illuminating the utility and limitations of QSAR technology. We first review two-dimensional (2D) QSAR, with a discussion of the availability and appropriate selection of molecular descriptors. We then describe three-dimensional (3D) QSAR and the key issues associated with this technology, and compare the relative suitability of 2D and 3D QSAR for different applications. Given the recent technological advances in biological research for rapid identification of drug targets, we mention several examples in which QSAR approaches are employed in conjunction with improved knowledge of the structure and function of the target receptor. The review concludes by discussing statistical validation of QSAR models, a topic that has received sparse attention in recent years despite its critical importance. [source]
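
To make the statistical core of 2D QSAR concrete, here is a minimal, hypothetical sketch: a handful of invented compounds, three simple descriptors, an ordinary least-squares fit, and a leave-one-out cross-validation of the kind the review's closing remarks on statistical validation refer to. Descriptor names, values, and activities are all illustrative assumptions, not data from the review.

```python
# Hypothetical 2D QSAR sketch: fit log-activity against simple molecular
# descriptors (logP, MW/100, H-bond donors) by least squares, then check the
# fit with leave-one-out cross-validation (q^2).
import numpy as np

# Invented descriptor table: rows = compounds, columns = [logP, MW/100, HBD]
X = np.array([
    [1.2, 1.8, 2],
    [2.5, 2.3, 1],
    [3.1, 2.9, 0],
    [0.8, 1.5, 3],
    [2.0, 2.1, 1],
    [3.6, 3.2, 0],
])
y = np.array([5.1, 6.0, 6.8, 4.7, 5.8, 7.1])  # hypothetical pIC50 values

def fit(X, y):
    A = np.column_stack([np.ones(len(X)), X])       # add intercept column
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

def predict(coef, X):
    return np.column_stack([np.ones(len(X)), X]) @ coef

coef = fit(X, y)
r2 = 1 - np.sum((y - predict(coef, X))**2) / np.sum((y - y.mean())**2)

# Leave-one-out cross-validated q^2
press = 0.0
for i in range(len(y)):
    mask = np.arange(len(y)) != i
    c = fit(X[mask], y[mask])
    press += (y[i] - predict(c, X[i:i+1])[0])**2
q2 = 1 - press / np.sum((y - y.mean())**2)

print(f"fitted r^2 = {r2:.2f}, cross-validated q^2 = {q2:.2f}")
```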


From Fundamental Polymerization Kinetics to Process Application: A Realistic Vision?

MACROMOLECULAR CHEMISTRY AND PHYSICS, Issue 5 2010
Christian Bauer
Abstract This contribution presents a ramble through the past, inspecting the development of fundamental scientific investigations into the elementary reaction steps of polymerizations. The question often arises whether the benefits of fundamental scientific investigation find their way into industrial practice and how they pay off in a better understanding of large-scale processes. High-pressure ethene polymerization is the system of interest here because 1) it is a complex process with a world production of several million tons per year and 2) although dealing with system pressures of up to 3000 bar is not an easy task, a long history, if not tradition, connects the scientific efforts to improve the process with the research groups of Franck and Buback in Karlsruhe, Schönemann and Luft in Darmstadt, and later Buback in Göttingen. The scope of this discussion will range from non-invasive on-line and in-line monitoring by spectroscopic methods to process modeling with advanced kinetic models and their potential for explorative work. [source]
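
For readers less familiar with elementary-step polymerization kinetics, the sketch below integrates the classical steady-state rate law for free-radical polymerization. The rate constants and concentrations are placeholder values chosen for illustration, not measured parameters of the high-pressure ethene process discussed in the paper.

```python
# Minimal sketch of classical free-radical polymerization kinetics under the
# steady-state assumption: Rp = kp * [M] * sqrt(f * kd * [I] / kt).
# All rate constants and concentrations are placeholders for illustration.
import math

kd = 1.0e-5   # initiator decomposition rate constant, 1/s
kp = 1.0e3    # propagation rate constant, L/(mol*s)
kt = 1.0e7    # termination rate constant, L/(mol*s)
f  = 0.6      # initiator efficiency
M0, I0 = 8.0, 5.0e-3   # initial monomer and initiator concentrations, mol/L

def integrate_conversion(t_end=3600.0, dt=1.0):
    """Integrate the monomer and initiator balances with explicit Euler steps."""
    M, I = M0, I0
    t = 0.0
    while t < t_end:
        Rp = kp * M * math.sqrt(f * kd * I / kt)  # rate of polymerization
        M -= Rp * dt                              # monomer consumed by propagation
        I -= kd * I * dt                          # first-order initiator decay
        t += dt
    return 1.0 - M / M0   # fractional monomer conversion

print(f"conversion after 1 h: {integrate_conversion():.1%}")
```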


Scientific infrastructure design: Information environments and knowledge provinces

PROCEEDINGS OF THE AMERICAN SOCIETY FOR INFORMATION SCIENCE & TECHNOLOGY (ELECTRONIC), Issue 1 2007
Karen S. Baker
Conceptual models and design processes shape the practice of information infrastructure building in the sciences. We consider two distinct perspectives: (i) a cyber view of disintermediation, where information technology enables data to flow from the 'field' on to the digital doorstep of the general end-user, and (ii) an intermediated view with bidirectional communications, where local participants act as mediators within an information environment. Drawing from the literatures of information systems and science studies, we argue that differences in conceptual models have critical implications for users and their working environments. While the cyber view is receiving considerable attention in current scientific efforts, highlighting the multiplicity of knowledge provinces with their respective worldviews opens up understandings of sociotechnical design processes and of knowledge work. The concept of a range of knowledge provinces enables the description of dynamic configurations with shifting boundaries and supports planning for a diversity of arrangements across the digital landscape. [source]


From the inside: scientists' own experience of good (and bad) management

R & D MANAGEMENT, Issue 5 2005
Alice M. Sapienza
One crucial yet relatively unexamined perspective on issues of concern to both organizations and nations, the creativity and productivity of scientific efforts, is the insider perspective. Insiders are privy to confidential information (in this study, first-hand observations of good and bad leadership) because of their position within the laboratory. The insider perspective can help answer such questions as: What are scientists' lived experiences of effective management? What have they observed as some of the impacts of ineffective management? What worries them in terms of their own capacity to lead and manage? This paper describes interim results of an ongoing, exploratory study of insiders in academia, government, and industry. For the past 5 years, more than 200 scientific researchers from Europe, Asia, and the US have been asked open-ended questions about (1) the best example of scientific leadership they have encountered; (2) the worst example; and (3) their most difficult problems leading scientific endeavours. Their responses to date have included unexpected and surprising results. Good leaders are most frequently described as caring and compassionate (in contrast to the expected description of technically competent). Bad leaders are most frequently described as (surprisingly) abusive. The other important (and 'unintended') finding is that gender inequity persists. These responses illuminate some of the challenges facing those who manage research and development (R&D), who study the management of R&D, and who are responsible for national policies regarding R&D. [source]


Variability and Impact on Design of Bioequivalence Studies

BASIC AND CLINICAL PHARMACOLOGY & TOXICOLOGY, Issue 3 2010
Achiel Van Peer
Revisions of the regulatory guidance have been driven by many questions raised over the past years, and by continuing scientific discussion of the most suitable statistical analysis methods and study designs, particularly for drugs and drug products with high within-subject variability. Although high within-subject variability is usually associated with a coefficient of variation of 30% or more, new approaches in the literature allow a gradual widening and levelling off of the bioequivalence limits towards some maximum values (e.g. 75-133%), dependent on the increase in within-subject variability. The two-way crossover single-dose study measuring the parent drug is still the design of first choice. For drugs with high within-subject variability, a partial replicate design, in which the reference product is repeated and the bioequivalence limits are scaled to the reference variability, has been proposed. In cases of high variability, more regulatory authorities may accept a two-stage or group-sequential bioequivalence design with an appropriately adjusted statistical analysis. This review also considers the mechanisms by which drugs and drug products may exhibit large variability. The physiological complexity of the gastrointestinal tract and its interaction with the physicochemical properties of drug substances may contribute to the variation in plasma drug concentration-time profiles and to variability between and within subjects. A review of bioequivalence studies submitted to the Food and Drug Administration's Office of Generic Drugs over the period 2003-2005 indicated that extensive pre-systemic metabolism of the drug substance, rather than a formulation factor, was the most important explanation for consistently highly variable drugs. These scientific efforts are expected to lead to further revisions of earlier regulatory guidance in other regions, as is currently the case in Europe. [source]
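
To illustrate the idea of bioequivalence limits that widen gradually with within-subject variability and then level off, here is a minimal sketch of a reference-scaled calculation. The scaling constant, the 30% CV trigger, and the cap are illustrative assumptions patterned on published reference-scaled approaches, not values quoted from this review or from any specific guidance.

```python
# Illustrative sketch of reference-scaled widening of bioequivalence limits:
# below a within-subject CV of ~30% the conventional 80-125% limits apply;
# above it the limits widen with the reference variability up to a cap.
# The constant k, the trigger, and the cap are assumptions for illustration.
import math

def scaled_be_limits(cv_within, k=0.760, cv_trigger=0.30, cap=(0.6984, 1.4319)):
    """Return (lower, upper) acceptance limits for the geometric mean ratio."""
    if cv_within <= cv_trigger:
        return (0.80, 1.25)                       # conventional limits
    s_wr = math.sqrt(math.log(cv_within**2 + 1))  # within-subject SD on log scale
    lower, upper = math.exp(-k * s_wr), math.exp(k * s_wr)
    return (max(lower, cap[0]), min(upper, cap[1]))

for cv in (0.20, 0.30, 0.40, 0.50, 0.60):
    lo, hi = scaled_be_limits(cv)
    print(f"within-subject CV = {cv:.0%}: limits {lo:.2%} - {hi:.2%}")
```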


Microbial systems engineering: First successes and the way ahead

BIOESSAYS, Issue 4 2010
Sven Dietz
Abstract The first promising results from "streamlined" minimal genomes tend to support the notion that these are a useful tool in biological systems engineering. However, compared with the speed with which microbial genome sequencing has provided us with a wealth of data for studying biological functions, building such genomes is a slow process. So far, only a few projects have emerged whose synthetic ambition even remotely matches our analytic capabilities. Here, we survey current technologies converging into a future ability to engineer large-scale biological systems. We argue that the underlying synthetic technology, de novo DNA synthesis, is already rather mature, particularly relative to the scope of our current synthetic ambitions. Furthermore, technologies for rationalizing the design of newly synthesized DNA fragments are emerging. These include techniques to implement complex regulatory circuits, suites of parts at the DNA and RNA level to fine-tune gene expression, and supporting computational tools. As such DNA fragments will, in most cases, be destined to operate in a cellular context, attention has to be paid to the potential interactions of the host with the functions encoded on the engineered DNA fragment. Here, the need of biological systems engineering for a robust and predictable bacterial host coincides with current scientific efforts to explore minimal bacterial genomes theoretically and experimentally. [source]
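
As a toy example of the "supporting computational tools" mentioned above, the sketch below uses a standard Hill-function model to predict how the output of a repressed promoter shifts as the repressor concentration and ribosome-binding-site strength are tuned; every parameter name and value is an invented assumption, not something drawn from the essay.

```python
# Hypothetical sketch of fine-tuning gene expression with interchangeable
# parts: a repressible promoter (Hill-function repression) combined with
# ribosome-binding sites (RBS) of different strengths. Values are invented.

def expression_level(repressor, rbs_strength, beta=100.0, K=40.0, n=2.0):
    """Steady-state protein output of a repressed promoter.

    repressor     -- repressor concentration (arbitrary units)
    rbs_strength  -- relative translation efficiency of the chosen RBS part
    beta          -- maximal transcription rate of the promoter part
    K, n          -- Hill constant and cooperativity of repression
    """
    transcription = beta / (1.0 + (repressor / K) ** n)
    return rbs_strength * transcription

for rbs in (0.2, 1.0, 5.0):            # weak, reference, strong RBS parts
    outputs = [expression_level(r, rbs) for r in (0, 40, 160)]
    print(f"RBS x{rbs}: output at [R] = 0/40/160 -> "
          + ", ".join(f"{o:.1f}" for o in outputs))
```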