Common Standards (common + standards)

Selected Abstracts


DISCLOSING CONFLICTS OF INTEREST: COMMON STANDARDS IN UNCOMMON CONTEXTS

ADDICTION, Issue 11 2009
ISIDORE S. OBOT
No abstract is available for this article.


Detection and delineation of P and T waves in 12-lead electrocardiograms

EXPERT SYSTEMS, Issue 1 2009
Sarabjeet Mehta
Abstract: This paper presents an efficient method for the detection and delineation of P and T waves in 12-lead electrocardiograms (ECGs) using a support vector machine (SVM). Digital filtering techniques are used to remove power-line interference and baseline wander, and an SVM is then used as a classifier to detect and delineate the P and T waves. The performance of the algorithm is validated on original, simultaneously recorded 12-lead ECGs from the standard CSE (Common Standards for Quantitative Electrocardiography) multi-lead measurement library. Detection rates of 95.43% for P waves and 96.89% for T waves are achieved. Delineation performance is validated by calculating the mean and standard deviation of the differences between automatic annotations and the manual annotations of the referee cardiologists. The proposed method not only detects QRS complexes and P and T waves of all morphologies but also delineates them accurately; the onsets and offsets of the detected P and T waves fall within the tolerance limits given in the CSE library.
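
As a rough illustration of the kind of pipeline described above, the sketch below filters a signal to suppress power-line interference and baseline wander and then uses an SVM to classify fixed-length candidate windows. The filter settings, window length, and synthetic training data are assumptions for illustration, not details taken from the paper.

```python
# Hypothetical sketch: digital filtering followed by SVM classification of
# candidate windows as "P/T wave" versus "not a wave". All parameters and the
# synthetic data below are illustrative assumptions.
import numpy as np
from scipy.signal import butter, filtfilt, iirnotch
from sklearn.svm import SVC

FS = 500  # sampling rate in Hz (CSE recordings are 500 Hz)

def preprocess(ecg, fs=FS):
    """Remove 50 Hz power-line interference and baseline wander."""
    b_notch, a_notch = iirnotch(w0=50.0, Q=30.0, fs=fs)   # power-line notch
    ecg = filtfilt(b_notch, a_notch, ecg)
    b_hp, a_hp = butter(2, 0.5, btype="highpass", fs=fs)  # baseline wander
    return filtfilt(b_hp, a_hp, ecg)

def windows(ecg, width=100, step=50):
    """Slide fixed-length candidate windows over the filtered signal."""
    return np.array([ecg[i:i + width] for i in range(0, len(ecg) - width, step)])

# Illustrative training data: real labels would come from annotated beats.
rng = np.random.default_rng(0)
train_wave = rng.normal(0.3, 0.05, (50, 100))   # stand-ins for P/T-wave windows
train_noise = rng.normal(0.0, 0.05, (50, 100))  # stand-ins for non-wave windows
X = np.vstack([train_wave, train_noise])
y = np.array([1] * 50 + [0] * 50)

clf = SVC(kernel="rbf").fit(X, y)

test_ecg = preprocess(rng.normal(0.0, 0.05, 5000))
labels = clf.predict(windows(test_ecg))          # 1 = likely contains a P or T wave
print(f"{labels.sum()} of {len(labels)} windows flagged as P/T-wave candidates")
```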


PC-Based ECG Waveform Recognition: Validation of Novel Software Against a Reference ECG Database

ANNALS OF NONINVASIVE ELECTROCARDIOLOGY, Issue 2009
Corina-Dana Dota M.D.
Background: PC-based ECG measurement must cope reliably with normal as well as pathological ECGs. EClysis, a software package for ECG measurement, was tested against reference values from the Common Standards for Quantitative Electrocardiography (CSE) database. Methods: Digital ECGs (12 leads, 500 Hz) were recorded by the CSE project. Data Set 3 contains reference values for 125 ECGs (33 normal and 92 pathological): median values of measurements by 11 computer programs and by five cardiologists, respectively, referring to the earliest P and QRS onsets and to the latest P, QRS, and T offsets in any lead of a selected (index) beat. EClysis measured all ECGs automatically, without user intervention. Results: The PQRST points were correctly detected except in two ECGs with second- to third-degree AV block. The software was not designed to detect atrial activity in atrial fibrillation (n = 9) or flutter (n = 1); in one case of atrial fibrillation, atrial activity interfered with the positioning of the QRS and T offsets. Regression coefficients between EClysis and the CSE references (both software-generated and human) were above 0.95 (P < 0.0001), and 95% confidence intervals were calculated for the slope and intercept of the regression lines. Conclusions: PC-based detection and analysis of PQRST points showed a high level of agreement with the CSE database reference values.
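
The regression check reported above can be illustrated with a short sketch: automatic measurements are regressed against reference values, and 95% confidence intervals are derived for the slope and intercept. The arrays below are synthetic stand-ins, not data from the CSE database or the EClysis study.

```python
# Illustrative validation-by-regression sketch with synthetic data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
reference = rng.uniform(60, 120, 125)          # e.g. reference QRS onsets (ms)
automatic = reference + rng.normal(0, 2, 125)  # automatic measurements with small error

res = stats.linregress(automatic, reference)
print(f"r = {res.rvalue:.3f}, p = {res.pvalue:.2g}")

# 95% confidence intervals from the standard errors of slope and intercept
t_crit = stats.t.ppf(0.975, df=len(reference) - 2)
slope_ci = (res.slope - t_crit * res.stderr, res.slope + t_crit * res.stderr)
intercept_ci = (res.intercept - t_crit * res.intercept_stderr,
                res.intercept + t_crit * res.intercept_stderr)
print(f"slope 95% CI: {slope_ci}, intercept 95% CI: {intercept_ci}")
```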


The Open Grid Computing Environments collaboration: portlets and services for science gateways

CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 6 2007
Jay Alameda
Abstract We review the efforts of the Open Grid Computing Environments collaboration. By adopting a general three-tiered architecture based on common standards for portlets and Grid Web services, we can deliver numerous capabilities to science gateways from our diverse constituent efforts. In this paper, we discuss our support for standards-based Grid portlets using the Velocity development environment. Our Grid portlets are based on abstraction layers provided by the Java CoG kit, which hide the differences between Grid toolkits. Sophisticated services are decoupled from the portal container using Web service strategies. We describe advance information, semantic data, collaboration, and science application services developed by our consortium. Copyright © 2006 John Wiley & Sons, Ltd.


Euroethics – a database network on biomedical ethics

HEALTH INFORMATION & LIBRARIES JOURNAL, Issue 3 2006
Ylva Gavel
Background: Euroethics is a database covering European literature on ethics in medicine. It is produced within Eurethnet, a European information network on ethics in medicine and biotechnology. Objectives: The aim of Euroethics is to disseminate information on European bioethical literature that may otherwise be difficult to find. Methods: A collaboration model for pooling data from different centres was developed. The policy was to achieve data uniformity while still allowing for local differences in software, indexing practices and resources. Records contributed to the database follow common standards in terms of data fields and indexing terms. The indexing terms derive from two thesauri, the Thesaurus Ethics in the Life Sciences (TELS) and Medical Subject Headings (MeSH). Combining elements from previously developed search tools, the developers sought a technical solution optimized for this data model; an approach relying on a thesaurus database loaded alongside the bibliographic database is described. Results and conclusions: The present case study offers examples of possible approaches to several tasks often encountered in database development, such as merging data from diverse sources, getting the most out of the indexing terms used in a database, and handling more than one thesaurus in the same system.
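
A minimal sketch of the data model outlined above, purely illustrative and not the Euroethics implementation: centre-specific records are mapped onto a common field set, and indexing terms are retained only if they appear in one of the two thesauri. The field names and thesaurus entries are assumptions made for the example.

```python
# Hypothetical illustration of pooling records under common standards with
# two controlled vocabularies (TELS and MeSH).
from dataclasses import dataclass, field

TELS = {"informed consent", "research ethics"}   # illustrative thesaurus entries
MESH = {"Informed Consent", "Ethics, Medical"}

@dataclass
class Record:
    title: str
    centre: str
    indexing_terms: list = field(default_factory=list)

def merge(local_record: dict, centre: str) -> Record:
    """Map a centre-specific record onto the common field set and keep only
    indexing terms found in one of the two thesauri."""
    terms = [t for t in local_record.get("keywords", [])
             if t in TELS or t in MESH]
    return Record(title=local_record["title"], centre=centre, indexing_terms=terms)

rec = merge({"title": "Consent in biobank research",
             "keywords": ["informed consent", "Ethics, Medical", "unknown term"]},
            centre="Stockholm")
print(rec)   # the unrecognised term is dropped; the rest follow the common standard
```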


GIQAR position paper on 'Archiving and Good Laboratory Practice'

QUALITY ASSURANCE JOURNAL, Issue 4 2005
M. M. Brunetti
Abstract Archiving of documents and specimens generated during a non-clinical laboratory study is a basic Good Laboratory Practice (GLP) requirement. The records and materials that should be archived, as well as the characteristics and organisation of archive facilities, are addressed in the OECD Series on Principles of Good Laboratory Practice No. 1, OECD Principles of Good Laboratory Practice (as revised in 1997) [1]. However, in recent years, questions concerning archiving have been raised and the need for more detailed guidance on this matter has become evident. The aim of the Society for Applied Pharmacological Sciences/Italian Group of Quality Assurance in Research (SSFA/GIQAR) working group on 'Archiving according to GLP' was to issue a position paper, present it for discussion at an ad hoc round table with representatives of the Italian GLP monitoring authority, promote common standards, and provide additional recommendations on the storage and retention of records. Copyright © 2005 John Wiley & Sons, Ltd.


Methodology of QT-Interval Measurement in the Modular ECG Analysis System (MEANS)

ANNALS OF NONINVASIVE ELECTROCARDIOLOGY, Issue 2009
Jan A. Kors Ph.D.
Background: QT prolongation, as can be induced by drugs, signals the risk of life-threatening arrhythmias. The methodology of QT measurement in the Modular ECG Analysis System (MEANS) is described. Methods: In the simultaneously recorded leads of the standard 12-lead electrocardiogram (ECG), the QRS complexes are detected by a spatial velocity function. They are typed as dominant or nondominant, and a representative complex per lead is obtained by averaging over the dominant complexes. QRS onset and T end are determined by a template technique, and QT is measured. MEANS performance was evaluated on the 125 ECGs of the Common Standards for Quantitative Electrocardiography (CSE) multilead database, for which the waveform boundaries have been released. Results: MEANS correctly detected all 1445 complexes of the CSE library, with one false-positive detection due to a sudden baseline jump. All dominant complexes were correctly typed. The average of the differences between MEANS and the reference was less than 2 ms (one sample) for both QRS onset and T end, and 2.1 ms for QT duration; the standard deviation of the differences was 3.8, 8.4, and 10.4 ms, respectively. Conclusions: A standard deviation of 10.4 ms for QT measurement seems large when set against the regulatory requirement that a prolongation as small as 5 ms should be detected. However, QT variability across different individuals will be larger than QT variability measured within one individual during a pharmacological intervention. Finally, if the U wave is part of total repolarization, then T and U form a continuum and the end of T becomes questionable.
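
The QRS-detection step can be sketched as follows, under the assumption (not stated in the abstract) of a simple spatial velocity function built from summed absolute lead slopes, with a fixed relative threshold and refractory period; the parameters and synthetic signal are illustrative only, not the MEANS implementation.

```python
# Hedged sketch of a spatial-velocity-style QRS detector over simultaneous leads.
import numpy as np

FS = 500  # Hz

def spatial_velocity(leads: np.ndarray) -> np.ndarray:
    """leads: array of shape (n_leads, n_samples); returns one combined function."""
    dv = np.abs(np.diff(leads, axis=1))   # per-lead absolute slope
    return dv.sum(axis=0)                 # summed over leads

def detect_qrs(sv: np.ndarray, fs=FS, refractory_ms=200):
    """Threshold the spatial velocity and enforce a refractory period."""
    threshold = 0.5 * sv.max()            # assumed relative threshold
    refractory = int(refractory_ms * fs / 1000)
    peaks, last = [], -refractory
    for i, v in enumerate(sv):
        if v > threshold and i - last >= refractory:
            peaks.append(i)
            last = i
    return peaks

# Synthetic two-lead example with three sharp "QRS-like" deflections
lead = np.zeros(3 * FS)
for onset in (250, 750, 1250):
    lead[onset:onset + 10] = 1.0
leads = np.vstack([lead, 0.8 * lead])
print(detect_qrs(spatial_velocity(leads)))   # sample indices of detected complexes
```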