Data Management

Selected Abstracts


I. DATA MANAGEMENT: RECOMMENDED PRACTICES

MONOGRAPHS OF THE SOCIETY FOR RESEARCH IN CHILD DEVELOPMENT, Issue 3 2006
Margaret R. Burchinal
First page of article [source]


Data management and statistical methods used in the analysis of balanced chromosome abnormalities in therapy-related myelodysplastic syndromes and therapy-related acute leukemia: Report from an International Workshop

GENES, CHROMOSOMES AND CANCER, Issue 4 2002
Theodore Karrison
First page of article [source]


Data management and quality assurance for an international project: the Indo-US Cross-National Dementia Epidemiology Study

INTERNATIONAL JOURNAL OF GERIATRIC PSYCHIATRY, Issue 6 2002
Rajesh Pandav
Abstract Background Data management and quality assurance play a vital but often neglected role in ensuring high quality research, particularly in collaborative and international studies. Objective A data management and quality assurance program was set up for a cross-national epidemiological study of Alzheimer's disease, with centers in India and the United States. Methods The study involved (a) the development of instruments for the assessment of elderly illiterate Hindi-speaking individuals; and (b) the use of those instruments to carry out an epidemiological study in a population-based cohort of over 5000 persons. Responsibility for data management and quality assurance was shared between the two sites. A cooperative system was instituted for forms and edit development, data entry, checking, transmission, and further checking to ensure that quality data were available for timely analysis. A quality control software program (CHECKS) was written expressly for this project to ensure the highest possible level of data integrity. Conclusions This report addresses issues particularly relevant to data management and quality assurance at developing country sites, and to collaborations between sites in developed and developing countries. Copyright © 2002 John Wiley & Sons, Ltd. [source]
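
The kind of automated checking described above can be pictured with a short sketch. The field names, ranges, and rules below are hypothetical illustrations of typical edit checks, not the actual CHECKS program written for the study.

```python
# Illustrative edit-check pass over entered study forms (hypothetical fields;
# not the actual CHECKS program described in the abstract).

def run_edit_checks(record):
    """Return a list of data-quality problems found in one entered form."""
    problems = []

    # Range check: cohort members are elderly adults.
    age = record.get("age")
    if age is None or not (55 <= age <= 110):
        problems.append(f"age out of range: {age!r}")

    # Consistency check: interview date must not precede enrolment date.
    if record.get("interview_date") and record.get("enrolment_date"):
        if record["interview_date"] < record["enrolment_date"]:
            problems.append("interview_date precedes enrolment_date")

    # Completeness check: cognitive score required when assessment was done.
    if record.get("assessment_done") and record.get("cognitive_score") is None:
        problems.append("assessment_done but cognitive_score missing")

    return problems


if __name__ == "__main__":
    form = {"age": 42, "assessment_done": True, "cognitive_score": None,
            "enrolment_date": "2000-03-01", "interview_date": "2000-02-15"}
    for issue in run_edit_checks(form):
        print("EDIT CHECK FAILED:", issue)
```

Checks of this kind can be run at entry time at either site and again after transmission, which is the cooperative loop the abstract describes.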


Development of applications with fuzzy objects in modern programming platforms

INTERNATIONAL JOURNAL OF INTELLIGENT SYSTEMS, Issue 11 2005
F. Berzal
Most of the applications currently being developed use object-oriented programming technology, as is the case for applications built with the Java or C# languages. Data management has not remained outside this trend, and object-oriented and object-relational database management systems have arisen as a result. Soft-computing applications need to manage imperfect data, and Fuzzy Sets Theory has proven to be a good choice for accomplishing the task of imperfect data management. In this article we present a framework that allows programmers of soft-computing applications to deal with fuzzy objects in a transparent and intuitive way. The framework can be used to develop object-oriented code in systems built on current mainstream object-oriented languages, so that imperfect information can be managed. © 2005 Wiley Periodicals, Inc. Int J Int Syst 20: 1117-1136, 2005. [source]
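
As a rough illustration of the idea of a fuzzy object, the sketch below attaches a discrete fuzzy set (labels with membership degrees) to an ordinary object attribute. The classes and membership functions are invented for illustration and are not taken from the framework presented in the article.

```python
# Minimal sketch of a "fuzzy attribute" on an ordinary object: the value is a
# discrete fuzzy set (labels with membership degrees) rather than a crisp value.
# Hypothetical classes; not the article's framework.

class FuzzyValue:
    def __init__(self, memberships):
        # memberships: dict mapping label -> degree in [0, 1]
        self.memberships = dict(memberships)

    def degree(self, label):
        return self.memberships.get(label, 0.0)

    def best_label(self):
        return max(self.memberships, key=self.memberships.get)


class Patient:
    def __init__(self, name, temperature_celsius):
        self.name = name
        # Imperfect information stored as a fuzzy object attribute.
        self.fever = FuzzyValue({
            "none": max(0.0, min(1.0, (37.5 - temperature_celsius) / 1.0)),
            "mild": max(0.0, 1.0 - abs(temperature_celsius - 38.0)),
            "high": max(0.0, min(1.0, (temperature_celsius - 38.5) / 1.0)),
        })


p = Patient("p1", 38.2)
print(p.fever.best_label(), round(p.fever.degree("mild"), 2))
```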


Adaptable cache service and application to grid caching

CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 9 2010
Laurent d'Orazio
Abstract Caching is an important element for tackling performance issues in large-scale distributed data management. However, caches are efficient only if they are well configured for their context of use; as a consequence, they are usually built from scratch. Such an approach proves expensive and time consuming in grids, where varied characteristics lead to many heterogeneous cache requirements. This paper proposes a framework facilitating the construction of sophisticated and dynamically adaptable caches for heterogeneous applications. The framework has enabled the evaluation of several configurations for distributed data querying systems and has led us to propose innovative approaches for semantic and cooperative caching. The paper also reports results obtained in bioinformatics data management on grids, showing the relevance of our proposals. Copyright © 2009 John Wiley & Sons, Ltd. [source]
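
A minimal sketch of the underlying idea, a cache whose replacement policy is a pluggable component so the same skeleton can be configured for different contexts of use, is shown below. The class and policy names are hypothetical and do not reproduce the framework evaluated in the paper.

```python
# Sketch of a cache with a pluggable replacement policy, so one cache skeleton
# can be configured for different grid applications. Hypothetical API.

from collections import OrderedDict

class LRUPolicy:
    def victim(self, entries):            # entries: OrderedDict key -> value
        return next(iter(entries))        # least recently used key

class SizeAwarePolicy:
    def victim(self, entries):
        return max(entries, key=lambda k: len(str(entries[k])))  # largest entry

class Cache:
    def __init__(self, capacity, policy):
        self.capacity, self.policy = capacity, policy
        self.entries = OrderedDict()

    def get(self, key):
        if key in self.entries:
            self.entries.move_to_end(key)   # refresh recency
            return self.entries[key]
        return None

    def put(self, key, value):
        if key not in self.entries and len(self.entries) >= self.capacity:
            self.entries.pop(self.policy.victim(self.entries))
        self.entries[key] = value

# Configure the same skeleton two different ways.
query_cache = Cache(capacity=1000, policy=LRUPolicy())
blob_cache = Cache(capacity=50, policy=SizeAwarePolicy())
```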


A flexible content repository to enable a peer-to-peer-based wiki

CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 7 2010
Udo Bartlang
Abstract Wikis, being major applications of the Web 2.0, are used for a large number of purposes, such as encyclopedias, project documentation, and coordination, both in open communities and in enterprises. At the application level, users are targeted as both consumers and producers of dynamic content. Yet this kind of peer-to-peer (P2P) principle is not applied at the technical level, which is still dominated by traditional client-server architectures. What is lacking is a generic platform that combines the scalability of the P2P approach with, for example, a wiki's requirements for consistent content management in a highly concurrent environment. This paper presents a flexible content repository system that is intended to close this gap by using a hybrid P2P overlay to support scalable, fault-tolerant, consistent, and efficient data operations for the dynamic content of wikis. On the one hand, the paper introduces the generic, overall architecture of the content repository. On the other hand, it describes the major building blocks that enable P2P data management at the system's persistent storage layer, and how these may be used to implement a P2P-based wiki application: (i) a P2P back-end administrates a wiki's actual content resources; (ii) on top, P2P service groups act as indexing groups to implement a wiki's search index. Copyright © 2009 John Wiley & Sons, Ltd. [source]
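
One common building block of structured or hybrid P2P overlays is consistent hashing, which maps content resources onto peers; a minimal sketch follows. The peer names and keys are invented, and the snippet illustrates the general technique rather than the repository's actual protocol.

```python
# Sketch of assigning wiki content resources to peers with consistent hashing,
# a common building block of structured/hybrid P2P overlays. Hypothetical peers.

import bisect
import hashlib

def h(value):
    return int(hashlib.sha1(value.encode()).hexdigest(), 16)

class Ring:
    def __init__(self, peers):
        self.ring = sorted((h(p), p) for p in peers)   # hash ring positions
        self.keys = [k for k, _ in self.ring]

    def peer_for(self, resource_key):
        # The first peer clockwise from the resource's hash owns it.
        i = bisect.bisect(self.keys, h(resource_key)) % len(self.ring)
        return self.ring[i][1]

ring = Ring(["peer-a", "peer-b", "peer-c"])
print(ring.peer_for("wiki/MainPage"), ring.peer_for("wiki/Sandbox"))
```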


On web communities mining and recommendation

CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 5 2009
Yanchun Zhang
Abstract Because of the lack of a uniform schema for web documents and the sheer amount and dynamics of web data, both the effectiveness and the efficiency of information management and retrieval of web data are often unsatisfactory when using conventional data management and searching techniques. To address this issue, we have adopted web mining and web community analysis approaches. On the basis of the analysis of web document contents, hyperlink analysis, user access logs and semantic analysis, we have developed various approaches or algorithms to construct and analyze web communities, and to make recommendations. This paper will introduce and discuss several approaches to web community mining and recommendation. Copyright © 2009 John Wiley & Sons, Ltd. [source]


A Grid-enabled problem-solving environment for advanced reservoir uncertainty analysis

CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 18 2008
Zhou Lei
Abstract Uncertainty analysis is critical for reservoir performance prediction. However, it is challenging because it relies on (1) massive, modeling-related, geographically distributed, terabyte- or even petabyte-scale data sets (geoscience and engineering data); (2) the ability to rapidly perform hundreds or thousands of flow simulations, identical runs with different models that calculate the impacts of various uncertainty factors; and (3) an integrated, secure, and easy-to-use problem-solving toolkit to assist uncertainty analysis. We leverage Grid computing technologies to address these challenges. We design and implement an integrated problem-solving environment, ResGrid, to effectively improve reservoir uncertainty analysis. ResGrid consists of data management, execution management, and a Grid portal. Data Grid tools, such as metadata, replica, and transfer services, are used to handle the massive size and geographically distributed character of the data sets. Workflow, task farming, and resource allocation are used to support large-scale computation. A Grid portal integrates the data management and computation solutions into a unified, easy-to-use interface, enabling reservoir engineers to specify uncertainty factors of interest and perform large-scale reservoir studies through a web browser. ResGrid has been used in petroleum engineering. Copyright © 2008 John Wiley & Sons, Ltd. [source]
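
The task-farming idea, running the same flow simulation many times with different uncertainty factors and gathering the results, can be sketched as below. The simulator call is a placeholder, not a ResGrid component.

```python
# Sketch of "task farming": the same flow simulation is run many times with
# different uncertainty factors and the results are gathered for analysis.
# run_flow_simulation is a hypothetical stand-in for a real simulator.

import itertools
from concurrent.futures import ProcessPoolExecutor

def run_flow_simulation(params):
    porosity, permeability = params
    # Placeholder for a real reservoir simulator invocation.
    return {"params": params, "recovery": 0.3 * porosity + 0.001 * permeability}

if __name__ == "__main__":
    porosities = [0.10, 0.15, 0.20, 0.25]
    permeabilities = [50, 100, 200, 400]           # millidarcy
    cases = list(itertools.product(porosities, permeabilities))

    with ProcessPoolExecutor() as pool:            # farm out the identical runs
        results = list(pool.map(run_flow_simulation, cases))

    best = max(results, key=lambda r: r["recovery"])
    print(len(results), "runs; best case:", best["params"])
```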


Applying content management to automated provenance capture

CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 5 2008
Karen L. Schuchardt
Abstract Workflows and data pipelines are becoming increasingly valuable to computational and experimental sciences. These automated systems are capable of generating significantly more data within the same amount of time compared to their manual counterparts. Automatically capturing and recording data provenance and annotation as part of these workflows is critical for data management, verification, and dissemination. We have been prototyping a workflow provenance system, targeted at biological workflows, that extends our content management technologies and other open source tools. We applied this prototype to the provenance challenge to demonstrate an end-to-end system that supports dynamic provenance capture, persistent content management, and dynamic searches of both provenance and metadata. We describe our prototype, which extends the Kepler system for the execution environment, the Scientific Annotation Middleware (SAM) content management software for data services, and an existing HTTP-based query protocol. Our implementation offers several unique capabilities and, through the use of standards, is able to provide access to the provenance record with a variety of commonly available client tools. Copyright © 2007 John Wiley & Sons, Ltd. [source]
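
A minimal sketch of automatic provenance capture, wrapping each workflow step so that its inputs, outputs, and timing are logged to a provenance store, is given below. The decorator and store are hypothetical and stand in for the Kepler/SAM-based implementation described above.

```python
# Sketch of automatically recording provenance while a workflow step runs:
# each call logs its inputs, output and timing into a provenance store.
# Hypothetical store; not the SAM/Kepler implementation.

import functools
import json
import time

PROVENANCE_LOG = []

def capture_provenance(step):
    @functools.wraps(step)
    def wrapper(*args, **kwargs):
        started = time.time()
        result = step(*args, **kwargs)
        PROVENANCE_LOG.append({
            "step": step.__name__,
            "inputs": {"args": [repr(a) for a in args], "kwargs": kwargs},
            "output": repr(result),
            "started": started,
            "duration_s": round(time.time() - started, 6),
        })
        return result
    return wrapper

@capture_provenance
def align_sequences(fasta_path):
    return fasta_path + ".aligned"      # stand-in for real work

align_sequences("sample.fasta")
print(json.dumps(PROVENANCE_LOG, indent=2))   # queryable provenance record
```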


Optimization of integrated Earth System Model components using Grid-enabled data management and computation

CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 2 2007
A. R. Price
Abstract In this paper, we present the Grid-enabled data management system that has been deployed for the Grid ENabled Integrated Earth system model (GENIE) project. The database system is an augmented version of the Geodise Database Toolbox and provides a repository for scripts, binaries and output data in the GENIE framework. By exploiting the functionality available in the Geodise toolboxes, we demonstrate how the database can be employed to tune parameters of coupled GENIE Earth System Model components to improve their match with observational data. A Matlab client provides a common environment for the project Virtual Organization and allows the scripting of bespoke tuning studies that can exploit multiple heterogeneous computational resources. We present the results of a number of tuning exercises performed on GENIE model components using multi-dimensional optimization methods. In particular, we find that it is possible to successfully tune models with up to 30 free parameters using Kriging and Genetic Algorithm methods. Copyright © 2006 John Wiley & Sons, Ltd. [source]
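
The tuning loop itself can be illustrated with a toy genetic-algorithm-style search that minimizes the misfit between a model output and an observation. The model, observation, and operators below are placeholders, not GENIE components or Geodise toolbox functions.

```python
# Toy illustration of tuning free model parameters against an observation with
# a genetic-algorithm-style loop (random init, selection, mutation).

import random

OBSERVATION = 3.7                                   # hypothetical target value

def model(params):                                  # stand-in for a model run
    return sum(p * w for p, w in zip(params, (1.0, 0.5, -0.25)))

def misfit(params):
    return abs(model(params) - OBSERVATION)

def tune(n_params=3, pop_size=30, generations=50):
    pop = [[random.uniform(-5, 5) for _ in range(n_params)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=misfit)
        survivors = pop[: pop_size // 2]            # selection
        children = [[g + random.gauss(0, 0.2) for g in random.choice(survivors)]
                    for _ in range(pop_size - len(survivors))]   # mutation
        pop = survivors + children
    return min(pop, key=misfit)

best = tune()
print("best parameters:", [round(g, 2) for g in best],
      "misfit:", round(misfit(best), 3))
```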


Automation in an addiction treatment research clinic: Computerised contingency management, ecological momentary assessment and a protocol workflow system

DRUG AND ALCOHOL REVIEW, Issue 1 2009
MASSOUD VAHABZADEH
Abstract Introduction and Aims. A challenge in treatment research is the necessity of adhering to protocol and regulatory strictures while maintaining flexibility to meet patients' treatment needs and to accommodate variations among protocols. Another challenge is the acquisition of large amounts of data in an occasionally hectic environment, along with the provision of seamless methods for exporting, mining and querying the data. Design and Methods. We have automated several major functions of our outpatient treatment research clinic for studies in drug abuse and dependence. Here we describe three such specialised applications: the Automated Contingency Management (ACM) system for the delivery of behavioural interventions, the transactional electronic diary (TED) system for the management of behavioural assessments and the Protocol Workflow System (PWS) for computerised workflow automation and guidance of each participant's daily clinic activities. These modules are integrated into our larger information system to enable data sharing in real time among authorised staff. Results. ACM and the TED have each permitted us to conduct research that was not previously possible. In addition, the time to data analysis at the end of each study is substantially shorter. With the implementation of the PWS, we have been able to manage a research clinic with an 80-patient capacity, having an annual average of 18 000 patient visits and 7300 urine collections with a research staff of five. Finally, automated data management has considerably enhanced our ability to monitor and summarise participant safety data for research oversight. Discussion and Conclusions. When developed in consultation with end users, automation in treatment research clinics can enable more efficient operations, better communication among staff and expansions in research methods. [Vahabzadeh M, Lin J-L, Mezghanni M, Epstein DH, Preston KL. Automation in an addiction treatment research clinic: Computerised contingency management, ecological momentary assessment and a protocol workflow system. Drug Alcohol Rev 2009;28:3-11] [source]


Use of a Voltammetric Electronic Tongue for Detection and Classification of Nerve Agent Mimics

ELECTROANALYSIS, Issue 14 2010
Inmaculada Campos
Abstract An electronic tongue (ET) based on pulse voltammetry has been used to predict the presence of nerve agent mimics in aqueous environments. The electronic tongue array consists of eight working electrodes (Au, Pt, Ir, Rh, Cu, Co, Ni and Ag) encapsulated on a stainless steel cylinder. Studies including principal component analysis (PCA), artificial neural networks (fuzzy ARTMAP) and partial least squares techniques (PLS) have been applied for data management and prediction models. For instance, the electronic tongue is able to discriminate the presence of the nerve agent simulants diethyl chlorophosphate (DCP) and diethyl cyanophosphate (DCNP) from the presence of other organophosphorous derivatives in water. Finally, PLS data analysis using a system of 3 compounds and 3 concentration levels shows good accuracy in concentration prediction for DCP and DCNP in aqueous environments. [source]
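
A minimal sketch of the chemometric pipeline named above, PCA for exploratory visualisation and PLS for concentration prediction, is shown below using synthetic data in place of measured electrode currents; the array dimensions are assumptions.

```python
# Sketch of the chemometric pipeline named in the abstract: PCA to explore the
# pulse-voltammetry responses and PLS to predict analyte concentration.
# Synthetic data; the real study used measured electrode currents.

import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
n_samples, n_features = 60, 8 * 40       # 8 electrodes x 40 pulse currents (assumed)
concentration = rng.uniform(0, 1, n_samples)        # e.g. simulant level (a.u.)
X = np.outer(concentration, rng.normal(size=n_features)) \
    + 0.05 * rng.normal(size=(n_samples, n_features))

scores = PCA(n_components=2).fit_transform(X)        # exploratory view
pls = PLSRegression(n_components=3).fit(X, concentration)
pred = pls.predict(X).ravel()

print("PC1 score spread:", round(scores[:, 0].std(), 3))
print("PLS correlation with true concentration:",
      round(np.corrcoef(pred, concentration)[0, 1], 3))
```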


Application of Six Sigma Methods for Improving the Analytical Data Management Process in the Environmental Industry

GROUND WATER MONITORING & REMEDIATION, Issue 2 2006
Christopher M. French
Honeywell applied the rigorous and well-documented Six Sigma quality-improvement approach to the complex, highly heterogeneous, and mission-critical process of remedial site environmental data management to achieve a sea change in terms of data quality, environmental risk reduction, and overall process cost reduction. The primary focus was to apply both qualitative and quantitative Six Sigma methods to improve electronic management of analytical laboratory data generated for environmental remediation and long-term monitoring programs. The process includes electronic data delivery, data QA/QC checking, data verification, data validation, database administration, regulatory agency reporting and linkage to spatial information, and real-time geographical information systems. The analysis identified that automated, centralized, web-based software tools delivered through a Software as a Service (SaaS) model are optimal for improving the process, resulting in cost reductions while simultaneously improving data quality and long-term data usability and preservation. A pilot project was completed that quantified cycle time and cost improvements of 50% and 65%, respectively. [source]


GENOMIZER: an integrated analysis system for genome-wide association data

HUMAN MUTATION, Issue 6 2006
Andre Franke
Abstract Genome-wide association analysis appears to be a promising way to identify heritable susceptibility factors for complex human disorders. However, the feasibility of large-scale genotyping experiments is currently limited by an incomplete marker coverage of the genome, a restricted understanding of the functional role of given genomic regions, and the small sample sizes used. Thus, genome-wide association analysis will be a screening tool to facilitate subsequent gene discovery rather than a means to completely resolve individual genetic risk profiles. The validation of association findings will continue to rely upon the replication of "leads" in independent samples from either the same or different populations. Even under such pragmatic conditions, the timely analysis of the large data sets in question poses serious technical challenges. We have therefore developed public-domain software, GENOMIZER, that implements the workflow of an association experiment, including data management, single-point and haplotype analysis, "lead" definition, and data visualization. GENOMIZER (www.ikmb.uni-kiel.de/genomizer) comes with a complete user manual, and is open-source software licensed under the GNU Lesser General Public License. We suggest that the use of this software will facilitate the handling and interpretation of the currently emerging genome-wide association data. Hum Mutat 27(6), 583-588, 2006. © 2006 Wiley-Liss, Inc. [source]
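
A single-point association test of the kind such software automates can be sketched as a chi-square test on an allele-count contingency table; the counts below are made up and are not GENOMIZER output.

```python
# Sketch of a single-point association test: chi-square on an allele-count
# table for cases versus controls at one marker. Made-up counts.

from scipy.stats import chi2_contingency

#                 allele A   allele a
allele_counts = [[420,       180],      # cases
                 [350,       250]]      # controls

chi2, p_value, dof, expected = chi2_contingency(allele_counts)
print(f"chi2 = {chi2:.2f}, p = {p_value:.2e}")   # small p suggests association
```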


Technology in nursing scholarship: Use of citation reference managers

INTERNATIONAL JOURNAL OF MENTAL HEALTH NURSING, Issue 3 2007
Cheryl M. Smith
ABSTRACT: Nurses, especially those in academia, feel the pressure to publish but have limited time to write. One of the more time-consuming and frustrating tasks of research, and subsequent publications, is the collection and organization of accurate citations of sources of information. The purpose of this article is to discuss three types of citation reference managers (personal bibliographic software) and how their use can provide consistency and accuracy in recording all the information needed for the research and writing process. The advantages and disadvantages of three software programs, EndNote, Reference Manager, and ProCite, are discussed. These three software products have a variety of options that can be used in personal data management to assist researchers in becoming published authors. [source]


Advancing Patient Safety through Process Improvements

JOURNAL FOR HEALTHCARE QUALITY, Issue 5 2009
Linda Elgart
Abstract: The department of Women's and Children's Services at the Hospital of Saint Raphael (HSR) in New Haven, CT, has initiated several different and successful approaches to reducing patient risk within the department. The department purchased a computerized fetal monitoring and documentation program that has improved the ability to provide high-level antepartal care for mothers and fetuses with automatic patient data management and continuous fetal heart rate surveillance. A Risk Reduction Grant offered through the hospital malpractice insurance program provided the financial assistance for all medical providers to become certified in electronic fetal monitoring. The certification is now a required educational standard for nurses, certified nurse midwives, and for physicians who work in the labor and delivery unit. Infant and pediatric security is incorporated into policy and practice measures that include hospital-wide drills for the prevention of infant abduction. The Obstetrics and Gynecology (OB/GYN) Quality Improvement Committee supports systematic reviews of identified clinical risks and works to find viable solutions to these problems. The hospital has supported specialized obstetrical care through the Maternal Fetal Medicine Unit (MFMU), Newborn Intensive Care Unit (NICU), the Inpatient Pediatric Unit, and the labor and delivery unit. In addition, HSR has initiated an enhanced medical informed consent that is available online for providers and a patient education tool that includes a computer room at the hospital for patient use. [source]


Strategic Auditing: An Incomplete Information Model

JOURNAL OF BUSINESS FINANCE & ACCOUNTING, Issue 5-6 2001
Peter Cheng
This paper presents a stylized model of the strategy game between the auditor and the client. The client is assumed to have either good or bad inherent risk in her reporting system. She chooses a reporting effort level to maintain the accounting records and data management depending on her type of inherent risk. The auditor chooses a high or low level of audit procedures. A high level of auditing procedures will reveal the client's type and effort from which the auditor can decide either to qualify the financial statements or to issue a clean report. The client and the auditor are assumed to move simultaneously. Pure strategy equilibria are derived for all the undominated strategies between the auditor and the client in the region of the model that is more similar to the Fellingham and Newman (1985) model. Unlike their model in which a high auditing level is never a pure strategy in equilibrium, we obtain pure strategy equilibria for high auditing levels. [source]


Tracheostomy management in Acute Care Facilities - a matter of teamwork

JOURNAL OF CLINICAL NURSING, Issue 9-10 2010
Vicki Parker
Aim. Implement and evaluate an inter-disciplinary team approach to tracheostomy management in non-critical care. Background. Trends towards early tracheostomy in intensive care units (ICU) have led to increased numbers of tracheostomy patients. Together with the push for earlier discharge from ICU, this poses challenges across disciplines and wards. Even though tracheostomy is performed across a range of patient groups, tracheostomy care is seen as the domain of specialist clinicians in critical care. It is crucial to ensure quality care regardless of the patient's destination after ICU. Design. A mixed method evaluation incorporating quantitative and qualitative approaches. Method. Data collection included pre-implementation and postimplementation clinical audits and staff surveys and a postimplementation tracheostomy team focus group. Descriptive and inferential analysis was used to identify changes in clinical indicators and staff experiences. Focus group data were analysed using iterative processes of thematic analysis. Results. Findings revealed significant reductions in mean hospital length of stay (LOS) for survivors, from 50 to 27 days (p < 0.0001), and an increase in the number of tracheostomy patients transferred to non-critical care wards in the post-implementation group (p = 0.006). The number of wards accepting patients from ICU increased from 3 to 7, and there was increased staff knowledge, confidence and awareness of the team's role. Conclusion. The team approach has led to work practice and patient outcome improvements. Organisational acceptance of the team has led to more wards indicating willingness to accept tracheostomy patients. Improved communication has resulted in more timely referral and better patient outcomes. Relevance to clinical practice. This study highlights the importance of inter-disciplinary teamwork in achieving effective patient outcomes and efficiencies. It offers a model of inter-disciplinary practice, supported by communication and data management, that can be replicated across other patient groups. [source]


Development and implementation of a multisite evaluation for the Women, Co-Occurring Disorders and Violence Study

JOURNAL OF COMMUNITY PSYCHOLOGY, Issue 4 2005
Julienne Giard
In this article we describe the development and implementation of the Substance Abuse and Mental Health Services Administration's (SAMHSA's) multisite Women, Co-Occurring Disorders and Violence Study (WCDVS), highlighting some of the challenges encountered, decisions made, and lessons learned. Four themes are discussed. First, the unique contributions of the consumer/survivor/recovering (C/S/R) women to the research process are described through instances where their knowledge and advocacy were clearly influential. Second, the solutions chosen to address research design challenges are recounted, as are the ways in which these choices played out. Third, the procedures for standardizing recruitment, data collection, and data management across sites are described. Finally, the strategies employed by the nine sites to retain contact with this challenging population are reviewed and successful techniques are highlighted. © 2005 Wiley Periodicals, Inc. J Comm Psychol 33: 411,427, 2005. [source]


A Web-Based Interactive Database System for a Transcranial Doppler Ultrasound Laboratory

JOURNAL OF NEUROIMAGING, Issue 1 2006
Mark J. Gorman MD
ABSTRACT Background. Variations in transcranial Doppler (TCD) examination performance techniques and interpretive paradigms between individual laboratories are a common challenge in the practice of TCD. Demand for rapid access to patient ultrasound examination data and report for use in intensive care settings has necessitated a more flexible approach to data management. Both of these issues may benefit from a computerized approach. Methods. We describe the application of a World Wide Web-based database system for use in an ultrasound laboratory. Results. Databasing information while generating a TCD report is efficient. Web accessibility allows rapid and flexible communication of time-sensitive report information and interpretation for more expeditious clinical decision making. Conclusions. Web-based applications can extend the reach and efficiency of traditionally structured medical laboratories. [source]


Effective Strategies for Implementing a Multicenter International Clinical Trial

JOURNAL OF NURSING SCHOLARSHIP, Issue 2 2008
Leanne M. Aitken
Purpose: International collaboration in research is essential in order to improve worldwide health. The purpose of this paper is to describe strategies used to administer an international multicenter trial to assess the effectiveness of a nursing educational intervention. Design: The study design was a two-group randomized multicenter international clinical trial conducted to determine whether a brief education and counselling intervention delivered by a nurse could reduce prehospital delay in the event of symptoms suggestive of acute coronary syndrome (ACS) in patients previously diagnosed with cardiovascular disease. Method: A flexible but well-defined project structure supported intervention consistency across five sites in three countries and included experienced project coordinators, multidimensional communication methods, strategies to optimize intervention fidelity, site-specific recruitment and retention techniques, centralized data management, and consideration of ethical and budgetary requirements. Findings: Staff at five sites enrolled 3,522 participants from three countries and achieved 80% follow-up at both 12 and 24 months. Conclusion: Multidimensional approaches to maintain consistency across study sites, while allowing flexibility to meet local expectations and needs, contributed to the success of this trial. Clinical Relevance: In order to support appropriate development of an evidence base for practice, nursing interventions should be tested in multiple settings. A range of strategies that proved effective in conducting a multicenter international trial is described in this paper. [source]


Computer Analysis of the Fetal Heart Rate

JOURNAL OF OBSTETRIC, GYNECOLOGIC & NEONATAL NURSING, Issue 5 2000
Patricia Robin McCartney RNC
Computer analysis of the fetal heart rate is a technology of the Information Age commercially available for research and clinical practice. Intelligent systems are engineered with algorithms or neural networks designed to simulate expert knowledge. Automated analysis has provided objective, standardized, and reproducible data used to research fetal heart rate responses in the antepartum and intrapartum setting. Perinatal information systems can integrate FHR analysis and data management. [source]


Proteome analysis of non-model plants: A challenging but powerful approach

MASS SPECTROMETRY REVIEWS, Issue 4 2008
Sebastien Christian Carpentier
Abstract Biological research has focused in the past on model organisms and most of the functional genomics studies in the field of plant sciences are still performed on model species or species that are characterized to a great extent. However, numerous non-model plants are essential as food, feed, or energy resource. Some features and processes are unique to these plant species or families and cannot be approached via a model plant. The power of all proteomic and transcriptomic methods, that is, high-throughput identification of candidate gene products, tends to be lost in non-model species due to the lack of genomic information or due to the sequence divergence to a related model organism. Nevertheless, a proteomics approach has a great potential to study non-model species. This work reviews non-model plants from a proteomic angle and provides an outline of the problems encountered when initiating the proteome analysis of a non-model organism. The review tackles problems associated with (i) sample preparation, (ii) the analysis and interpretation of a complex data set, (iii) the protein identification via MS, and (iv) data management and integration. We will illustrate the power of 2DE for non-model plants in combination with multivariate data analysis and MS/MS identification and will evaluate possible alternatives. © 2008 Wiley Periodicals, Inc., Mass Spec Rev 27: 354-377, 2008 [source]


Integrated deep drilling, coring, downhole logging, and data management in the Chicxulub Scientific Drilling Project (CSDP), Mexico

METEORITICS & PLANETARY SCIENCE, Issue 6 2004
Lothar Wohlgemuth
To date, a continuous scientific sampling of large impact craters from cover rocks to target material has only seldom been performed. The first project to deep-drill and core into one of the largest and best-preserved terrestrial impact structures was executed in the winter of 2001/2002 in the 65 Myr-old Chicxulub crater in Mexico using integrated core sampling and in situ measurements. The combined use of different techniques allows a three-dimensional insight and a better understanding of impact processes. Here, we report the integration of conventional rotary drilling techniques with wireline mining coring technology that was applied to drill the 1510 m-deep Yaxcopoil-1 (Yax-1) well about 40 km southwest of Mérida, Yucatán, Mexico. During the course of the project, we recovered approximately 900 m of intact core samples, including the transition from reworked ejecta to post-impact sediments and the transition from large blocks of tilted target material to impact-generated rocks, i.e., impact melt breccias and suevites. Coring was complemented by wireline geophysical measurements to obtain a continuous set of in situ petrophysical data of the borehole walls. The data acquired comprise natural radioactive element contents, compressional sonic wave velocities, and electrical resistivity values. All the digital data sets, including technical drilling parameters, initial scientific sample descriptions, and 360° core pictures, were distributed during the course of the operations via the Internet and were stored in the ICDP Drilling Information System (http://www.icdp-online.org), serving the global community of cooperating scientists as a basic information service. [source]


Physician and patient survey of allergic rhinitis: methodology

ALLERGY, Issue 2007
V. Higgins
Methodology for Disease Specific Programme (DSP©) surveys designed by Adelphi Group Products is used each year to survey patients and physicians on their perceptions of treatment effectiveness, symptoms and impact of diseases. These point-in-time surveys, conducted in the USA and Europe (France, Germany, Italy, Spain and UK), provide useful information on the real-world management and treatment of diseases. This paper describes the methodology for the DSP survey in allergic rhinitis, detailing the preparation of materials, recruitment of physicians, data collection and data management. [source]


High throughput testing platform for organic solar cells

PROGRESS IN PHOTOVOLTAICS: RESEARCH & APPLICATIONS, Issue 7 2008
Moritz K. Riede
Abstract In this paper we present a high throughput testing setup for organic solar cells that is necessary for an efficient analysis of their behaviour. The setup comprises process parameter logging, automated measurement data acquisition and subsequent data management and analysis. Utilising this setup, the reproducibility of solar cells and the effect of production parameter variations have been tested with a set of 360 solar cells based on the poly-3-hexylthiophene:1-(3-methoxycarbonyl)-propyl-1-1-phenyl-(6,6)C61 bulk heterojunction. Variations in power conversion efficiency between 1 and 3% were observed when varying production parameters that are hardly mentioned in the literature. The conditions during the vacuum deposition of the aluminium cathode turned out to have a significant effect. The key solar cell parameter affecting the performance was the fill factor (FF). As such, the work exemplifies the necessity for a combined approach to analyse the complex behaviour of organic solar cells. The developed high throughput testing setup provides a basis for efficient testing of production parameter variations and materials, and additionally opens the door for statistical analysis. Copyright © 2008 John Wiley & Sons, Ltd. [source]
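
The figures of merit mentioned above, in particular the fill factor, can be illustrated with a short sketch that extracts Jsc, Voc, FF, and power conversion efficiency from an I-V curve; the curve below is a synthetic diode-like example, not measured data.

```python
# Sketch of the key figures of merit extracted from a measured I-V curve in
# automated testing: short-circuit current density, open-circuit voltage,
# fill factor and power conversion efficiency. Synthetic illuminated curve.

import numpy as np

voltage = np.linspace(0.0, 0.62, 200)                       # V
current = 8e-5 * (np.exp(voltage / 0.05) - 1) - 8.5         # mA/cm^2, illuminated

j_sc = -current[0]                                  # short-circuit current density
v_oc = voltage[np.argmin(np.abs(current))]          # approximate open-circuit voltage
power = -voltage * current                          # delivered power density (mW/cm^2)
p_max = power.max()

fill_factor = p_max / (j_sc * v_oc)
efficiency = p_max / 100.0                          # assuming 100 mW/cm^2 AM1.5 light
print(f"Jsc={j_sc:.1f} mA/cm2  Voc={v_oc:.2f} V  "
      f"FF={fill_factor:.2f}  eta={efficiency:.1%}")
```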


Automated technologies and novel techniques to accelerate protein crystallography for structural genomics

PROTEINS: STRUCTURE, FUNCTION AND BIOINFORMATICS, Issue 4 2008
Babu A. Manjasetty Dr.
Abstract The sequence infrastructure that has arisen through large-scale genomic projects dedicated to protein analysis, has provided a wealth of information and brought together scientists and institutions from all over the world. As a consequence, the development of novel technologies and methodologies in proteomics research is helping to unravel the biochemical and physiological mechanisms of complex multivariate diseases at both a functional and molecular level. In the late sixties, when X-ray crystallography had just been established, the idea of determining protein structure on an almost universal basis was akin to an impossible dream or a miracle. Yet only forty years later, automated protein structure determination platforms have been established. The widespread use of robotics in protein crystallography has had a huge impact at every stage of the pipeline from protein cloning, over-expression, purification, crystallization, data collection, structure solution, refinement, validation and data management, all of which have become more or less automated with minimal human intervention necessary. Here, recent advances in protein crystal structure analysis in the context of structural genomics will be discussed. In addition, this review aims to give an overview of recent developments in high throughput instrumentation, and technologies and strategies to accelerate protein structure/function analysis. [source]


Finding Ways to Create Connections Among Communities: Partial Results of an Ethnography of Urban Public Health Nurses

PUBLIC HEALTH NURSING, Issue 1 2000
Judeen Schulte Ph.D.
The purpose of this ethnographic study was to describe the culture of public health nurses (PHNs) in a large, Midwestern urban health department. Data collection methods, data management, and analyses followed ethnographic procedures and resulted in the development of categories, domains, and cultural themes. The general study participants were PHNs, clients, supervisors, and administrators. The primary cultural theme that emerged was that public health nursing is finding ways to create connections among communities. Three interacting communities were identified: the local communities, communities created by individuals and families, and communities of resources. This article describes one of the three subthemes that emerged, processes used to help clients create connections, and describes how caring is shown uniquely in public health nursing. As a result of the study, implications for nursing practice, education, and research were developed. The results of the study supported a position that public health nursing is a unique nursing specialty. It also reinforced the applicability of an ethnographic design and methodology to nursing research. [source]