Related Tools


Selected Abstracts


Bridging the language gap in scientific computing: the Chasm approach

CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 2 2006
C. E. Rasmussen
Abstract Chasm is a toolkit providing seamless language interoperability between Fortran 95 and C++. Language interoperability is important to scientific programmers because scientific applications are predominantly written in Fortran, while software tools are mostly written in C++. Two design features differentiate Chasm from other related tools. First, we avoid the common-denominator type systems and programming models found in most Interface Definition Language (IDL)-based interoperability systems. Chasm uses the intermediate representation generated by a compiler front-end for each supported language as its source of interface information instead of an IDL. Second, bridging code is generated for each pairwise language binding, removing the need for a common intermediate data representation and multiple levels of indirection between the caller and callee. These features make Chasm a simple system that performs well, requires minimal user intervention, and in most instances generates bridging code automatically. Chasm is also easily extensible and highly portable. Copyright © 2005 John Wiley & Sons, Ltd. [source]
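The pairwise-binding idea can be sketched as follows. This is an illustrative toy generator, not Chasm's actual code: it takes interface information (as a compiler front-end might expose it) for a hypothetical Fortran subroutine `scale` and emits a C++ bridging declaration directly, with no IDL or common intermediate data representation in between.

```python
# Illustrative sketch of pairwise bridging-code generation (not Chasm itself).
# A hypothetical interface record for a Fortran 95 subroutine
#   subroutine scale(x, n, factor)   ! x(n) real, n integer, factor real
# is translated straight into a C++ stub declaration.
FORTRAN_TO_CXX = {"real": "float", "integer": "int", "double precision": "double"}

def emit_cxx_stub(name, args):
    """Emit a C++ declaration that binds directly to the Fortran symbol.

    Fortran passes all arguments by reference, and many compilers append a
    trailing underscore to external symbol names; the stub encodes both
    conventions so caller and callee meet with no extra indirection.
    """
    params = ", ".join(f"{FORTRAN_TO_CXX[ftype]}* {arg}" for arg, ftype in args)
    return f'extern "C" void {name}_({params});'

stub = emit_cxx_stub("scale", [("x", "real"), ("n", "integer"), ("factor", "real")])
print(stub)  # extern "C" void scale_(float* x, int* n, float* factor);
```

Because each language pair gets its own generated bridge, a call crosses exactly one boundary; the cost of supporting a new language is a new pairwise generator rather than a new mapping onto a shared lowest-common-denominator type system.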


Environmental power analysis - a new perspective

ENVIRONMETRICS, Issue 5 2001
David R. Fox
Abstract Power analysis and sample-size determination are related tools that have recently gained popularity in the environmental sciences. Their indiscriminate application, however, can lead to wildly misleading results. This is particularly true in environmental monitoring and assessment, where the quality and nature of data are such that the implicit assumptions underpinning power and sample-size calculations are difficult to justify. When the assumptions are reasonably met these statistical techniques provide researchers with an important capability for the allocation of scarce and expensive resources to detect putative impact or change. Conventional analyses are predicated on a general linear model and normal distribution theory with statistical tests of environmental impact couched in terms of changes in a population mean. While these are 'optimal' statistical tests (uniformly most powerful), they nevertheless pose considerable practical difficulties for the researcher. Compounding this difficulty is the subsequent analysis of the data and the impost of a decision framework that commences with an assumption of 'no effect'. This assumption is only discarded when the sample data indicate demonstrable evidence to the contrary. The alternative ('green') view is that any anthropogenic activity has an impact on the environment and therefore a more realistic initial position is to assume that the environment is already impacted. In this article we examine these issues and provide a re-formulation of conventional mean-based hypotheses in terms of population percentiles. Prior information or belief concerning the probability of exceeding a criterion is incorporated into the power analysis using a Bayesian approach. Finally, a new statistic is introduced which attempts to balance the overall power regardless of the decision framework adopted. Copyright © 2001 John Wiley & Sons, Ltd. [source]
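The conventional mean-based calculation the article critiques can be made concrete with the textbook one-sided z-test: power to detect a mean shift of a given size, and the sample size needed to reach a target power. This is the standard formula only, not the percentile-based or Bayesian reformulation the paper proposes; the effect size, standard deviation, and target values below are illustrative.

```python
# Textbook power and sample-size calculation for a one-sided z-test --
# the kind of mean-based analysis whose assumptions the article questions.
from statistics import NormalDist

def power_one_sided_z(effect, sigma, n, alpha=0.05):
    """Power to detect a mean increase `effect`, sd `sigma`, n samples."""
    std_normal = NormalDist()
    z_crit = std_normal.inv_cdf(1 - alpha)   # rejection threshold under H0
    shift = effect * n ** 0.5 / sigma        # standardized noncentrality
    return 1 - std_normal.cdf(z_crit - shift)

def sample_size_for_power(effect, sigma, target=0.80, alpha=0.05):
    """Smallest n whose power reaches the target (sample-size determination)."""
    n = 1
    while power_one_sided_z(effect, sigma, n, alpha) < target:
        n += 1
    return n

print(round(power_one_sided_z(effect=0.5, sigma=1.0, n=25), 3))  # ~0.804
print(sample_size_for_power(effect=0.5, sigma=1.0))              # 25
```

Note how sensitive the answer is to `sigma` and `effect`: both must be assumed in advance, which is exactly where the "indiscriminate application" the authors warn about tends to creep in.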


Hypertext support for the information needs of software maintainers

JOURNAL OF SOFTWARE MAINTENANCE AND EVOLUTION: RESEARCH AND PRACTICE, Issue 3 2004
Jussi Koskinen
Abstract Making changes safely to programs requires program comprehension and satisfaction of the information needs of software maintainers. In this paper we provide insights into improving hypertext-based software maintenance support by analyzing those information needs. There is a series of four earlier, detailed empirical studies on the information needs of professional C program maintainers. We focus on these studies, synthesize their results and determine sources from which the required information might be attained. An experimental research tool, the HyperSoft system, is used to demonstrate the satisfaction of information needs, and the system is analytically evaluated against the needs explored by the empirical studies. HyperSoft is based on our transient hypertext approach for software maintenance support. HyperSoft provides automatically generated hypertextual access structures and software visualizations. The results show that transient hypertext is a well-suited representational form of the typically required versatile information. The discussion also covers related tools, and the main features for providing the information required by maintainers are identified. The results show that the focus areas of these tools vary considerably. Copyright © 2004 John Wiley & Sons, Ltd. [source]
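The transient-hypertext idea, links computed on demand over ordinary source code rather than stored as a persistent document, can be sketched in miniature. This toy is not HyperSoft's implementation: it answers one common maintainer question ("where is this function defined, and where is it called?") by building use-to-definition links over a C fragment only when asked. The parsing is deliberately naive.

```python
# Toy sketch of a transient-hypertext access structure (not HyperSoft itself):
# link each call of a C function to the line where it is defined, computing
# the links on demand instead of storing a persistent hypertext document.
import re

C_SOURCE = """\
int add(int a, int b) { return a + b; }

int main(void) {
    int s = add(1, 2);
    return add(s, 3);
}
"""

def definition_line(source, func):
    """Return the 1-based line number where `func` appears to be defined."""
    for i, line in enumerate(source.splitlines(), start=1):
        if re.match(rf"\s*\w[\w\s\*]*\b{func}\s*\(", line):
            return i
    return None

def call_site_links(source, func):
    """Transient links (call line -> definition line), built on request."""
    target = definition_line(source, func)
    links = []
    for i, line in enumerate(source.splitlines(), start=1):
        if i != target and re.search(rf"\b{func}\s*\(", line):
            links.append((i, target))
    return links

print(call_site_links(C_SOURCE, "add"))  # [(4, 1), (5, 1)]
```

A real tool would, of course, use a proper C parser; the point of the sketch is only the access-structure shape, namely links generated from the program itself at the moment the maintainer needs them.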


The Identity Division of Labor in Native Alaska

AMERICAN ANTHROPOLOGIST, Issue 1 2009
LISA FRINK
ABSTRACT There is often an implicit assumption that women's technologies and associated tasks in subsistence-based groups are expedient and simple. For instance, in Native Alaska, the butchering of fish has been illustrated as arduous but uncomplicated work. On the contrary, closer examinations, as well as discussions with the people who are still learning and practicing subsistence tasks, indicate that this perspective is inaccurate. Instead, these taken-for-granted technologies and techniques require a lifetime of training and practice, and not all people achieve master status. Drawing from data from contemporary herring processing and the related tools of the trade, I explore the division of labor in the context of expertise and apprenticeship. [Keywords: apprenticeship, expertise, gender, age, Alaska] [source]


Proteomics FASTA Archive and Reference Resource

PROTEINS: STRUCTURE, FUNCTION AND BIOINFORMATICS, Issue 9 2008
Jayson A. Falkner
Abstract A FASTA file archive and reference resource has been added to ProteomeCommons.org. Motivation for this new functionality derives from two primary sources. The first is the recent FASTA standardization work done by the Human Proteome Organization's Proteomics Standards Initiative (HUPO-PSI). The second is the general lack of a uniform mechanism to properly cite FASTA files used in a study, and to publicly access such FASTA files post-publication. An extension to the Tranche data sharing network has been developed that includes web-pages, documentation, and tools for facilitating the use of FASTA files. These include conversion to the new HUPO-PSI format, and provisions for both citing and publicly archiving FASTA files. This new resource is available immediately, free of charge, and can be accessed at http://www.proteomecommons.org/data/fasta/. Source-code for related tools is also freely available under the BSD license. [source]
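The two ingredients the resource combines, reading FASTA records and identifying the exact file used in a study, can be sketched as follows. The hashing scheme here is illustrative only, not the actual Tranche/ProteomeCommons.org mechanism, and the records are example hemoglobin headers in the common UniProt header style.

```python
# Sketch: parse FASTA-formatted text and derive a stable content hash that
# could serve as a post-publication citation for the exact file used.
# The hash scheme is illustrative, not Tranche's actual identifier.
import hashlib

FASTA_TEXT = """\
>sp|P69905|HBA_HUMAN Hemoglobin subunit alpha
MVLSPADKTNVKAAWGKVGAHAGEYGAEALERMFLSFPTTKTYFPHF
>sp|P68871|HBB_HUMAN Hemoglobin subunit beta
MVHLTPEEKSAVTALWGKVNVDEVGGEALGRLLVVYPWTQRFFESFGDLSTPDAVMGNPK
"""

def parse_fasta(text):
    """Yield (header, sequence) pairs from FASTA-formatted text."""
    header, chunks = None, []
    for line in text.splitlines():
        if line.startswith(">"):
            if header is not None:
                yield header, "".join(chunks)
            header, chunks = line[1:].strip(), []
        elif line.strip():
            chunks.append(line.strip())
    if header is not None:
        yield header, "".join(chunks)

def citation_hash(text):
    """Content-addressed identifier: same bytes, same citation."""
    return hashlib.sha256(text.encode("utf-8")).hexdigest()[:16]

records = list(parse_fasta(FASTA_TEXT))
print(len(records))  # 2
```

Because the identifier is derived from the file's content rather than its location, a reader who retrieves the archived file can verify it is byte-for-byte the one the study cited, which is precisely the post-publication access problem the abstract describes.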