Data Access


Selected Abstracts


Data Governance and Stewardship: Designing Data Stewardship Entities and Advancing Data Access

HEALTH SERVICES RESEARCH, Issue 5p2 2010
Sara Rosenbaum
U.S. health policy is engaged in a struggle over access to health information, in particular the conditions under which information should be accessible for research when appropriate privacy protections and security safeguards are in place. The expanded use of health information, an inevitable step in an information age, is widely considered to be essential to health system reform. Models exist for the creation of data-sharing arrangements that promote proper use of information in a safe and secure environment and with attention to ethical standards. Data stewardship is a concept with deep roots in the science and practice of data collection, sharing, and analysis. Reflecting the values of fair information practice, data stewardship denotes an approach to the management of data, particularly data that can identify individuals. The concept of a data steward is intended to convey a fiduciary (or trust) level of responsibility toward the data. Data governance is the process by which responsibilities of stewardship are conceptualized and carried out. As the concept of health information data stewardship advances in a technology-enabled environment, the question is whether legal barriers to data access and use will begin to give way. One possible answer may lie in defining the public interest in certain data uses, tying provider participation in federal health programs to the release of all-payer data to recognized data stewardship entities for aggregation and management, and enabling such entities to foster the creation of knowledge through research. [source]


Toward replication in grids for digital libraries with freshness and correctness guarantees

CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 17 2008
Fuat Akal
Abstract Building digital libraries (DLs) on top of data grids while facilitating data access and minimizing access overheads is challenging. To achieve this, replication in a Grid has to provide dedicated features that are only partly supported by existing Grid environments. First, it must provide transparent and consistent access to distributed data. Second, it must dynamically control the creation and maintenance of replicas. Third, it should allow replication granularities beyond individual files. Fourth, users should be able to specify their freshness demands, i.e. whether they need the most recent data or are satisfied with slightly outdated data. Finally, all these tasks must be performed efficiently. This paper presents an approach to building a fully integrated and self-managing replication subsystem for data grids that provides all of the above features. Our approach is to start with an accepted replication protocol for database clusters, namely PDBREP, and to adapt it to the Grid. Copyright © 2008 John Wiley & Sons, Ltd. [source]
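The fourth requirement above, user-specified freshness demands, can be illustrated with a small sketch. This is a hypothetical simplification, not the PDBREP protocol itself: each replica records when it last applied updates, and a read tolerating `max_staleness` seconds may be served by any replica within that bound, letting slightly outdated copies absorb load.

```python
class Replica:
    def __init__(self, name, last_update_ts, load):
        self.name = name
        self.last_update_ts = last_update_ts  # when this replica was last refreshed
        self.load = load                      # current request load

def pick_replica(replicas, max_staleness, now):
    """Return the least-loaded replica satisfying the freshness demand."""
    fresh_enough = [r for r in replicas if now - r.last_update_ts <= max_staleness]
    if not fresh_enough:
        return None  # caller must wait for a refresh or read the master copy
    return min(fresh_enough, key=lambda r: r.load)

now = 1000.0
replicas = [
    Replica("r1", last_update_ts=999.0, load=8),  # 1 s stale, heavily loaded
    Replica("r2", last_update_ts=940.0, load=1),  # 60 s stale, nearly idle
]
# A reader tolerating two minutes of staleness is routed to the idle r2;
# a reader demanding data at most 5 s old must use r1.
print(pick_replica(replicas, 120.0, now).name)
print(pick_replica(replicas, 5.0, now).name)
```

The design point is that relaxed freshness demands enlarge the set of usable replicas, which is what makes freshness-aware replication cheaper than strict consistency.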


The development of a geospatial data Grid by integrating OGC Web services with Globus-based Grid technology

CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 14 2008
Liping Di
Abstract Geospatial science is the science and art of acquiring, archiving, manipulating, analyzing, communicating, modeling with, and utilizing spatially explicit data for understanding physical, chemical, biological, and social systems on or near the Earth's surface. In order to share distributed geospatial resources and facilitate interoperability, the Open Geospatial Consortium (OGC), an industry–government–academia consortium, has developed a set of widely accepted Web-based interoperability standards and protocols. Grid is the technology enabling resource sharing and coordinated problem solving in dynamic, multi-institutional virtual organizations. Geospatial Grid is an extension and application of Grid technology in the geospatial discipline. This paper discusses problems associated with directly using Globus-based Grid technology in the geospatial disciplines, the need for geospatial Grids, and the features of geospatial Grids. Then, the paper presents a research project that develops and deploys a geospatial Grid by integrating Web-based geospatial interoperability standards and technology developed by OGC with Globus-based Grid technology. The geospatial Grid technology developed by this project makes interoperable, personalized, on-demand data access and services a reality at large geospatial data archives. Such a technology can significantly reduce problems associated with archiving, manipulating, analyzing, and utilizing large volumes of geospatial data at distributed locations. Copyright © 2008 John Wiley & Sons, Ltd. [source]
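The OGC standards mentioned above are ordinary key-value HTTP interfaces, which is what makes them easy to bridge into Grid middleware. As a small illustration, here is how a client forms a Web Map Service (WMS) 1.1.1 GetMap request; the endpoint URL and layer name are placeholders, not taken from the paper.

```python
from urllib.parse import urlencode

def wms_getmap_url(endpoint, layer, bbox, width, height, fmt="image/png"):
    """Build a WMS 1.1.1 GetMap request URL for one layer and bounding box."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "SRS": "EPSG:4326",                      # geographic lat/lon coordinates
        "BBOX": ",".join(str(v) for v in bbox),  # minx,miny,maxx,maxy
        "WIDTH": str(width),
        "HEIGHT": str(height),
        "FORMAT": fmt,
    }
    return endpoint + "?" + urlencode(params)

url = wms_getmap_url("https://example.org/wms", "temperature",
                     (-130.0, 24.0, -66.0, 50.0), 800, 400)
print(url)
```

Because every parameter is an explicit query string field, a Grid service can rewrite or proxy such requests (e.g. to redirect them to a replica archive) without understanding the payload.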


APEX-Map: a parameterized scalable memory access probe for high-performance computing systems

CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 17 2007
Erich Strohmaier
Abstract The memory wall between the peak performance of microprocessors and their memory performance has become the prominent performance bottleneck for many scientific application codes. New benchmarks measuring data access speeds locally and globally in a variety of different ways are needed to explore the ever-increasing diversity of architectures for high-performance computing. In this paper, we introduce a novel benchmark, APEX-Map, which focuses on global data movement and measures how fast global data can be fed into computational units. APEX-Map is a parameterized, synthetic performance probe and integrates concepts for temporal and spatial locality into its design. Our first parallel implementation in MPI and various results obtained with it are discussed in detail. By measuring APEX-Map performance with parameter sweeps over a whole range of temporal and spatial localities, performance surfaces can be generated. These surfaces are ideally suited to study the characteristics of the computational platforms and are useful for performance comparison. Results on a global-memory vector platform and distributed-memory superscalar platforms clearly reflect the design differences between these architectures. Published in 2007 by John Wiley & Sons, Ltd. [source]
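The idea of a probe parameterized by temporal and spatial locality can be sketched in a few lines. This is a toy, single-process illustration, not APEX-Map itself (which runs in parallel over MPI and times the accesses); the parameter names M (memory size), L (contiguous block length, spatial locality) and alpha (skew of block starts, temporal locality) are assumptions for the sketch.

```python
import random

def access_stream(M, L, alpha, n_accesses, seed=0):
    """Generate a synthetic address stream with tunable locality."""
    rng = random.Random(seed)
    stream = []
    for _ in range(n_accesses // L):
        # Skewed choice of block start: alpha near 1 is roughly uniform
        # (poor temporal locality); large alpha concentrates starts near
        # address 0, modeling frequent reuse of a small hot region.
        start = int((M - L) * rng.random() ** alpha)
        stream.extend(range(start, start + L))  # contiguous run: spatial locality
    return stream

s = access_stream(M=1024, L=8, alpha=4.0, n_accesses=80)
print(len(s), max(s))
```

Sweeping L and alpha and timing the resulting accesses on a real machine is what produces the performance surfaces described above.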


An application portal for collaborative coastal modeling

CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 12 2007
Chongjie Zhang
Abstract We describe the background, architecture and implementation of a user portal for the SCOOP coastal ocean observing and modeling community. SCOOP is engaged in the real-time prediction of severe weather events, including tropical storms and hurricanes, and provides operational information including wind, storm surge and resulting inundation, which are important for emergency management. The SCOOP portal, built with the GridSphere Framework, currently integrates customized Grid portlet components for data access, job submission, resource management and notification. Copyright © 2007 John Wiley & Sons, Ltd. [source]


Optimizing Patching-based multicast for video-on-demand in wireless mesh networks

INTERNATIONAL JOURNAL OF COMMUNICATION SYSTEMS, Issue 9-10 2010
Fei Xie
Abstract In this work, we study the application of video-on-demand (VoD) in wireless mesh networks (WMN), a next-generation edge technology that provides broadband data access in residential, business and even city-wide networks. We adopt a Patching-based multicast technique to better utilize the bandwidth resources in the mesh network. We optimize Patching-based multicast by addressing two critical problems, namely, the Minimum Cost Multicast Tree (MCMT) problem and the Maximum Benefit Multicast Group (MBMG) problem. The MCMT problem is to find a minimum-cost multicast tree in the network. We show that finding such a tree in the WMN can be formulated as a graph theory problem: finding the tree with the minimum number of non-leaf nodes that spans all the nodes in the multicast group. We further prove the problem is NP-hard and propose a fast greedy algorithm to accommodate the real-time nature of the VoD application. We solve the MBMG problem by minimizing the communication cost of a Patching group in the entire network. A Markov model is proposed to capture the growth of the multicast group in the WMN. Simulation results validate the proposed solutions to the two problems. Copyright © 2009 John Wiley & Sons, Ltd. [source]
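The objective of minimizing non-leaf nodes reflects a wireless property: one broadcast by a relay reaches all of its neighbors, so fewer relays means fewer transmissions. A simplified greedy sketch of that idea (not the authors' algorithm, and ignoring connectivity of the chosen relays) repeatedly picks the node whose neighborhood covers the most still-uncovered multicast members:

```python
def greedy_relays(adj, group):
    """Greedily choose relay (non-leaf) nodes covering all group members."""
    uncovered = set(group)
    relays = []
    while uncovered:
        # Score each candidate by how many uncovered members it would reach.
        best = max(adj, key=lambda v: len(uncovered & adj[v]))
        gained = uncovered & adj[best]
        if not gained:
            break  # the graph cannot cover the remaining members
        relays.append(best)
        uncovered -= gained
    return relays

# Toy mesh topology: node -> set of neighbors (links assumed symmetric).
adj = {
    "a": {"b", "c", "d"},
    "b": {"a", "e"},
    "c": {"a"},
    "d": {"a", "f"},
    "e": {"b"},
    "f": {"d"},
}
print(greedy_relays(adj, group={"b", "c", "d", "e"}))
```

Here a single broadcast by "a" reaches three of the four members, so the greedy pass needs only two relays; an exact solution to the underlying problem is NP-hard, as the abstract notes.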


The use of Geographic Information Systems in climatology and meteorology: COST 719

METEOROLOGICAL APPLICATIONS, Issue 1 2005
Izabela Dyras
COST Action 719 started in 2001, and 20 European countries are currently participating. The main objectives of the Action are to establish interfaces between GIS and meteorological data, to assess the availability, contents and accessibility of meteorological and climatological data sets, and to encourage and foster European co-operation. The tasks are carried out within three working groups concentrating on issues such as data access and availability, methods of spatial interpolation, and recommendations for standardised GIS applications. The applications that have been adopted mainly focus on three parameters, i.e. precipitation, temperature and energy balance, for which three demonstration projects have been formulated. It is expected that the Action will result in recommendations for better and more cost-effective production of state-of-the-art meteorological and climatological information. The Action should also improve co-operation between European countries in applying GIS in meteorology, climatology and the environmental sciences, and produce better-trained personnel within the operational and scientific divisions of national meteorological services. Additionally, a visualisation system for climate data sets for internet applications is under development. This paper provides information concerning the work in progress on the demonstration projects made within COST 719. Copyright © 2005 Royal Meteorological Society. [source]
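One of the working-group topics above, spatial interpolation, gridded station observations into continuous fields. As a minimal sketch of one classic method (inverse distance weighting, chosen here for illustration; the station coordinates and values are made up):

```python
def idw(stations, x, y, power=2.0):
    """Interpolate a value at (x, y) from (sx, sy, value) station triples
    using inverse distance weighting: nearer stations count for more."""
    num = den = 0.0
    for sx, sy, value in stations:
        d2 = (x - sx) ** 2 + (y - sy) ** 2
        if d2 == 0.0:
            return value  # exactly at a station: return its observation
        w = 1.0 / d2 ** (power / 2.0)
        num += w * value
        den += w
    return num / den

# Three hypothetical precipitation gauges (x, y, mm of rain).
stations = [(0.0, 0.0, 10.0), (1.0, 0.0, 20.0), (0.0, 1.0, 30.0)]
print(round(idw(stations, 0.25, 0.25), 2))
```

The interpolated point sits closest to the 10 mm gauge, so its estimate is pulled well below the station average; comparing such methods (IDW, kriging, splines) against held-out stations is the kind of evaluation a working group on spatial interpolation performs.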


Access to linked administrative healthcare utilization data for pharmacoepidemiology and pharmacoeconomics research in Canada: anti-viral drugs as an example

PHARMACOEPIDEMIOLOGY AND DRUG SAFETY, Issue 11 2009
Nigel S. B. Rawson PhD
Abstract Purpose Administrative healthcare utilization data from Canadian provinces have been used for pharmacoepidemiology and pharmacoeconomics research, but limited transparency exists about opportunities for data access, who can access them, and processes to obtain data. An attempt was made to obtain data from all 10 provinces to evaluate access and its complexity. Methods An initial enquiry about the process and requirements to obtain data on individual, anonymized patients dispensed any of four anti-viral drugs in the ambulatory setting, linked with data from hospital and physician service claims, was sent to each province. Where a response was encouraging, a technical description of the data of interest was submitted. Results Data were unavailable from the provinces of New Brunswick, Newfoundland and Labrador, and Prince Edward Island, and inaccessible from British Columbia, Manitoba and Ontario due to policies that prohibit collaborative work with pharmaceutical industry researchers. In Nova Scotia, patient-level data were available but only on site. Data were accessible in Alberta, Quebec and Saskatchewan, although variation exists in the currency of the data, time to obtain data, approval requirements and insurance coverage eligibility. Conclusions As Canada moves towards a life-cycle management approach to drug regulation, more post-marketing studies will be required, potentially using administrative data. Linked patient-level drug and healthcare data are presently accessible to pharmaceutical industry researchers in four provinces, although only logistically realistic in three and limited to seniors and low-income individuals in two. Collaborative endeavours to improve access to provincial data and to create other data resources should be encouraged. Copyright © 2009 John Wiley & Sons, Ltd. [source]


Research Using Emergency Department,related Data Sets: Current Status and Future Directions

ACADEMIC EMERGENCY MEDICINE, Issue 11 2009
Jon Mark Hirshon MD
Abstract The 2009 Academic Emergency Medicine consensus conference focused on "Public Health in the ED: Surveillance, Screening and Intervention." One conference breakout session discussed the significant research value of health-related data sets. This article represents the proceedings from that session, primarily focusing on emergency department (ED)-related data sets and includes examples of the use of a data set based on ED visits for research purposes. It discusses types of ED-related data sets available, highlights barriers to research use of ED-related data sets, and notes limitations of these data sets. The paper highlights future directions and challenges to using these important sources of data for research, including identification of five main needs related to enhancing the use of ED-related data sets. These are 1) electronic linkage of initial and follow-up ED visits and linkage of information about ED visits to other outcomes, including costs of care, while maintaining deidentification of the data; 2) timely data access with minimal barriers; 3) complete data collection for clinically relevant and/or historical data elements, such as the external cause-of-injury code; 4) easy access to data that can be parsed into smaller jurisdictions (such as states) for policy and/or research purposes, while maintaining confidentiality; and 5) linkages between health survey data and health claims data. ED-related data sets contain much data collected directly from health care facilities, individual patient records, and multiple other sources that have significant potential impact for studying and improving the health of individuals and the population. [source]
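Need 1 above, linking a patient's initial and follow-up ED visits while keeping the data deidentified, is commonly approached with pseudonymous tokens. The following is an illustrative sketch (not a method from the article): direct identifiers are replaced by a keyed hash, so the same patient yields the same token across data sets, but the token cannot be reversed without the secret key held by a trusted linkage party.

```python
import hashlib
import hmac

def link_token(secret_key: bytes, *identifiers: str) -> str:
    """Derive a stable pseudonymous token from patient identifiers.
    Normalizing case and whitespace makes messy entries of the same
    identifiers hash to the same token."""
    material = "|".join(s.strip().lower() for s in identifiers)
    return hmac.new(secret_key, material.encode("utf-8"), hashlib.sha256).hexdigest()

key = b"held-only-by-the-trusted-linkage-party"  # hypothetical secret
visit1 = link_token(key, "Jane Doe", "1980-01-31")
visit2 = link_token(key, "JANE DOE ", "1980-01-31")  # same patient, messy entry
other = link_token(key, "John Doe", "1975-06-15")
print(visit1 == visit2, visit1 == other)
```

Using a keyed hash (HMAC) rather than a plain hash matters: without the key, an attacker cannot rebuild the token table by hashing candidate names and birth dates.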


Detecting Tropical Forests' Responses to Global Climatic and Atmospheric Change: Current Challenges and a Way Forward

BIOTROPICA, Issue 1 2007
Deborah A. Clark
ABSTRACT Because of tropical forests' disproportionate importance for world biodiversity and for the global carbon cycle, we urgently need to understand any effects on these ecosystems from the ongoing changes in climate and atmosphere. This review, intended to complement existing data reviews on this topic, focuses on three major classes of challenges that we currently face when trying to detect and interpret directional changes in tropical forests. One is the very limited existing information on the historical context of study sites. Lasting effects from past climate, natural disturbances, and/or human activities could be significantly affecting current-day processes in tropical forests and need to be investigated for all active field sites. Second, while progress has been made in recent years on standardizing and refining research approaches, a number of methods- and data-limitations continue to affect efforts both to detect within-forest changes and to relate them to ongoing environmental change. Important outstanding needs are improved sampling designs, longer time-series of observations, filling key data gaps, and data access. Finally, forest responses to ongoing environmental change are complex. The effects of many simultaneously changing environmental factors are integrated by the plants, and their responses can involve significant lags, carryovers, and non-linearities. Specifying effects of individual environmental changes, however, is required for accurate ecosystem-process models and thus for projecting future impacts on these forests. After discussing these several types of challenges and ways to address them, I conclude with a priority agenda for this critical area of research. Abstract in Spanish is available at http://www.blackwell-synergy.com/loi/btp. 
[source]