Data Archives
Selected Abstracts

Possible Environmental Factors Underlying Amphibian Decline in Eastern Puerto Rico: Analysis of U.S. Government Data Archives
CONSERVATION BIOLOGY, Issue 4 2001
Robert F. Stallard

The past three decades have seen major population declines among amphibian species at high elevations in eastern Puerto Rico, a region unique in the humid tropics for the degree of environmental monitoring carried out through the efforts of U.S. government agencies. I examined changes in environmental conditions by analyzing time-series data sets that extend back at least into the 1980s, a period when frog populations were declining. The data include forest cover; annual mean, minimum, and maximum daily temperature; annual rainfall; rain and stream chemistry; and atmospheric dust transport. I also examined satellite imagery and air-chemistry samples from a single National Aeronautics and Space Administration aircraft flight across the Caribbean, which showed patches of pollutants, described as thin sheets or lenses, in the lower troposphere. The main source of these pollutants appeared to be fires from land clearing and deforestation, primarily in Africa. Some pollutant concentrations were high and, in the case of ozone, approached health limits set for urban air. Urban pollution impinging on Puerto Rico, dust generation from Africa (potential soil pathogens), and tropical forest burning (gaseous pollutants) have all increased during the last three decades, overlapping the timing of amphibian declines in eastern Puerto Rico. None of the data sets pointed directly to changes so extreme that they might be considered a direct lethal cause of amphibian declines in Puerto Rico. More experimental research is required to link any of these environmental factors to this problem. [source]

Parallel heterogeneous CBIR system for efficient hyperspectral image retrieval using spectral mixture analysis
CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 9 2010
Antonio J. Plaza

Abstract: The purpose of content-based image retrieval (CBIR) is to retrieve, from real data stored in a database, information that is relevant to a query.
In remote sensing applications, the wealth of spectral information provided by latest-generation (hyperspectral) instruments has quickly introduced the need for parallel CBIR systems able to effectively retrieve features of interest from ever-growing data archives. To address this need, this paper develops a new parallel CBIR system specifically designed to run on heterogeneous networks of computers (HNOCs). These platforms have become a standard computing architecture in remote sensing missions due to the distributed nature of data repositories. The proposed heterogeneous system first extracts an image feature vector that characterizes image content with sub-pixel precision using spectral mixture analysis concepts, and then uses the obtained feature vector as a search reference. The system is validated using a complex hyperspectral image database and implemented on several networks of workstations and a Beowulf cluster at NASA's Goddard Space Flight Center. Our experimental results indicate that the proposed parallel system can efficiently retrieve hyperspectral images from complex image databases by adapting to the underlying parallel platform on which it is run, regardless of the heterogeneity in the compute nodes and communication links that form the platform. Copyright © 2009 John Wiley & Sons, Ltd. [source]

The development of a geospatial data Grid by integrating OGC Web services with Globus-based Grid technology
CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 14 2008
Liping Di

Abstract: Geospatial science is the science and art of acquiring, archiving, manipulating, analyzing, communicating, modeling with, and utilizing spatially explicit data for understanding physical, chemical, biological, and social systems on the Earth's surface or near the surface.
To share distributed geospatial resources and facilitate interoperability, the Open Geospatial Consortium (OGC), an industry-government-academia consortium, has developed a set of widely accepted Web-based interoperability standards and protocols. Grid is the technology enabling resource sharing and coordinated problem solving in dynamic, multi-institutional virtual organizations. A geospatial Grid is an extension and application of Grid technology in the geospatial discipline. This paper discusses the problems associated with directly using Globus-based Grid technology in the geospatial disciplines, the need for geospatial Grids, and the features of geospatial Grids. The paper then presents a research project that develops and deploys a geospatial Grid by integrating the Web-based geospatial interoperability standards and technology developed by OGC with Globus-based Grid technology. The geospatial Grid technology developed by this project makes interoperable, personalized, on-demand data access and services a reality at large geospatial data archives. Such technology can significantly reduce the problems associated with archiving, manipulating, analyzing, and utilizing large volumes of geospatial data at distributed locations. Copyright © 2008 John Wiley & Sons, Ltd. [source]

PETROLEUM MIGRATION, FAULTS AND OVERPRESSURE, PART I: CALIBRATING BASIN MODELLING USING PETROLEUM IN TRAPS – A REVIEW
JOURNAL OF PETROLEUM GEOLOGY, Issue 3 2006
D.A. Karlsen

This paper considers the principles of deciphering basin-scale hydrocarbon migration patterns using the geochemical information present in trapped petroleum. Petroleum accumulations in subsiding basins can be thought of as "data archives" within which stored information can help us to understand aspects of hydrocarbon formation and migration.
This information can impart a time-resolved picture of hydrocarbon migration in a basin in response to processes associated with progressive burial, particularly in the context of the occurrence and periodic activity of faults. This review, which includes a series of tentative models of migration-related processes in the extensional Halten Terrace area, offshore mid-Norway, illustrates how information from the migrating mobile hydrocarbon phase can be used to improve our knowledge of the static geological system. Of particular importance is the role of sub-seismic heterogeneities and faults in controlling migration processes. We focus on how the secondary migration process can be enhanced in a multi-source-rock basin such as the Halten Terrace, thereby increasing prospectivity. [source]

HST experience in data management
ASTRONOMISCHE NACHRICHTEN, Issue 6-8 2004
R. Albrecht

Abstract: The data generated by the Hubble Space Telescope (HST) pose a series of special requirements for the analysis process. The HST is operated in a semi-autonomous, pre-programmed manner, executing a queue of observing requests. Calibration is done "institutionally", i.e. not in response to individual observing programs but in the same manner for all users. Data products are generated for the observers and ingested into the HST science data archives to make them available for further exploitation through the Virtual Observatory. Added-value products can be generated by combining data from different programs. Interactive analysis tools are supplied to support users in the optimal exploitation of the data. (© 2004 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim) [source]

Comparison of 850-hPa relative humidity between ERA-40 and NCEP/NCAR re-analyses: detection of suspicious data in ERA-40
ATMOSPHERIC SCIENCE LETTERS, Issue 1 2009
Aurélien Ben Daoud

Abstract: An exploratory study is performed on the 850-hPa relative humidity data (RH850) extracted from the European Centre for Medium-Range Weather Forecasts 40-year Re-Analysis (ERA-40) and the National Centers for Environmental Prediction (NCEP)/National Center for Atmospheric Research (NCAR) re-analyses, covering a domain centred on western Europe (60°W–60°E, 15°N–75°N). The largest deviations between the two data archives are observed over the North Atlantic Ocean. In addition, unrealistic values of RH850 are detected in the ERA-40 re-analysis at resolutions of both 2.5° and 1.125°. There is no strong correlation between RH850 provided by ERA-40 and observations from radiosonde stations, thus ruling out a straightforward correction of the detected anomalous values. Copyright © 2009 Royal Meteorological Society [source]
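As a rough illustration of the kind of re-analysis comparison the last abstract describes, the sketch below computes the mean absolute deviation between two gridded RH850 fields and flags physically unrealistic values. This is not the authors' method: the toy grids, the 0–100% validity range, and all function and variable names are assumptions made for the example.

```python
# Illustrative sketch only: comparing two gridded relative-humidity
# fields and flagging unrealistic values, in the spirit of the
# ERA-40 vs. NCEP/NCAR RH850 comparison above. Grids and thresholds
# are invented for the example.
import numpy as np

def compare_rh_fields(rh_a, rh_b, valid_range=(0.0, 100.0)):
    """Return the mean absolute deviation (%) between two RH grids and a
    boolean mask of grid points where either field is unrealistic."""
    rh_a = np.asarray(rh_a, dtype=float)
    rh_b = np.asarray(rh_b, dtype=float)
    if rh_a.shape != rh_b.shape:
        raise ValueError("re-analysis grids must share the same shape")
    lo, hi = valid_range
    # Flag points falling outside the physically plausible RH range.
    suspicious = (rh_a < lo) | (rh_a > hi) | (rh_b < lo) | (rh_b > hi)
    deviation = np.abs(rh_a - rh_b)
    return deviation.mean(), suspicious

# Toy 3x3 grids standing in for ERA-40 and NCEP/NCAR RH850 fields;
# the 110% value imitates an anomalous ERA-40 grid point.
era = np.array([[50., 60., 70.], [55., 65., 75.], [40., 110., 45.]])
ncep = np.array([[52., 58., 71.], [54., 66., 73.], [42., 80., 44.]])
mad, mask = compare_rh_fields(era, ncep)
print(round(mad, 2), int(mask.sum()))  # mean deviation (%) and flagged-point count
```

In a real study the fields would come from the archived re-analysis grids at matching resolution, and flagged points would be cross-checked against independent observations (e.g. radiosondes) rather than corrected automatically.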