Computer Hardware

Selected Abstracts


A Numerical Model and Spreadsheet Interface for Pumping Test Analysis

GROUND WATER, Issue 4 2001
Gary S. Johnson
Curve-matching techniques have been the standard method of aquifer test analysis for several decades. A variety of techniques provide the capability of evaluating test data from confined, unconfined, leaky-aquitard, and other conditions. Each technique, however, is accompanied by a set of assumptions, and evaluating a combination of conditions can be complicated or impossible because of intractable mathematics or nonuniqueness of the solution. Numerical modeling of pumping tests offers two major advantages: (1) the user can choose which properties to calibrate and which assumptions to make; and (2) during calibration the user gains insight into the conceptual model of the flow system and the uncertainties in the analysis. Routine numerical modeling of pumping tests is now practical because of the computer hardware and software advances of the last decade. The RADFLOW model and spreadsheet interface presented in this paper constitute an easy-to-use numerical model for estimating aquifer properties from pumping test data. Layered conceptual models and their properties are evaluated in a trial-and-error estimation procedure. RADFLOW can treat most combinations of confined, unconfined, leaky-aquitard, partial-penetration, and borehole-storage conditions, and it is especially useful in stratified aquifer systems with no identifiable lateral boundaries. It has been verified against several analytical solutions and has been applied to the Snake River Plain Aquifer to develop and test conceptual models and provide estimates of aquifer properties. Because the model assumes axially symmetric flow, it is limited to representing multiple aquifer layers that are laterally continuous. [source]
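
To make the trial-and-error calibration idea concrete, below is a minimal Python sketch that uses the analytical Theis solution as the forward model. RADFLOW itself is a numerical model with a spreadsheet front end; the pumping rate, "observed" drawdowns, and trial parameter values here are hypothetical illustrations, not values from the paper.

    # Minimal sketch of trial-and-error aquifer-property estimation,
    # using the analytical Theis solution as the forward model.
    # All data and parameter values below are hypothetical.
    import numpy as np
    from scipy.special import exp1  # Theis well function W(u) = E1(u)

    def theis_drawdown(t, r, Q, T, S):
        """Drawdown s(r, t) for a fully penetrating well in a confined aquifer."""
        u = (r**2 * S) / (4.0 * T * t)
        return (Q / (4.0 * np.pi * T)) * exp1(u)

    # Hypothetical observed drawdowns at r = 30 m from a well pumping 0.02 m^3/s
    t_obs = np.array([600.0, 1800.0, 3600.0, 7200.0])  # time since pumping began, s
    s_obs = np.array([0.21, 0.35, 0.44, 0.53])         # observed drawdown, m

    # Trial-and-error loop: propose transmissivity T (m^2/s) and storativity S,
    # compare modeled to observed drawdown, and refine until the misfit is small.
    for T, S in [(5e-3, 1e-4), (7e-3, 2e-4), (9e-3, 3e-4)]:
        s_model = theis_drawdown(t_obs, r=30.0, Q=0.02, T=T, S=S)
        rmse = np.sqrt(np.mean((s_model - s_obs)**2))
        print(f"T={T:.0e}, S={S:.0e}: RMSE = {rmse:.3f} m")

In the workflow the paper describes, the same loop runs through the spreadsheet interface: the user adjusts layer properties, reruns the model, and compares simulated to observed drawdown until the fit is acceptable.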


Evaluation of operators' performance for automation design in the fully digital control room of nuclear power plants

HUMAN FACTORS AND ERGONOMICS IN MANUFACTURING & SERVICE INDUSTRIES, Issue 1 2010
Chiuhsiang Joe Lin
Abstract Recent technical developments in computer hardware and software have meant that human–machine systems can be automated in many respects. If automation fails, however, human operators can have difficulty recognizing that a problem exists, identifying what has failed, and taking corrective action; these are known as out-of-the-loop (OOTL) performance problems. Several studies have suggested that taxonomies of levels of automation (LOAs) and types of automation (TOAs) can be used to address OOTL problems. This study examined the impact of LOAs on process control automation within the context of nuclear power plants (NPPs). A simulation experiment in an NPP was performed to validate this framework using an automatic mode and a semiautomatic mode. Mental demand was significantly reduced under the automatic mode; however, participants felt frustrated with this LOA. Situation awareness was similar in the two modes. An end-of-experiment subjective rating revealed that participants were evenly divided between the two modes with respect to the generating and selecting functions. It is therefore suggested that human operators be involved in generating and selecting functions even under an automatic mode. © 2009 Wiley Periodicals, Inc. [source]
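
For readers unfamiliar with LOA taxonomies, the widely cited ten-level Sheridan–Verplank scale is a common reference point; a minimal sketch encoding it as a lookup table follows. Note that this is the generic taxonomy, not necessarily the exact LOA/TOA framework used in the study.

    # The classic ten-level Sheridan-Verplank automation scale, encoded as a
    # simple lookup table. This is the widely cited reference taxonomy; the
    # study's own LOA/TOA framework may differ in its details.
    SHERIDAN_VERPLANK_LOA = {
        1:  "Computer offers no assistance; the human does everything.",
        2:  "Computer offers a complete set of action alternatives.",
        3:  "Computer narrows the selection down to a few alternatives.",
        4:  "Computer suggests one alternative.",
        5:  "Computer executes the suggestion if the human approves.",
        6:  "Computer allows the human a restricted time to veto before acting.",
        7:  "Computer acts automatically, then necessarily informs the human.",
        8:  "Computer acts automatically and informs the human only if asked.",
        9:  "Computer acts automatically and informs the human only if it decides to.",
        10: "Computer decides everything and acts autonomously, ignoring the human.",
    }

    def describe_loa(level: int) -> str:
        """Return the textual description for a given level of automation."""
        return SHERIDAN_VERPLANK_LOA[level]

    # A "semiautomatic" and an "automatic" mode sit toward the middle and
    # upper end of such a scale, respectively.
    print(describe_loa(5))
    print(describe_loa(7))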


Parallel, distributed and GPU computing technologies in single-particle electron microscopy

ACTA CRYSTALLOGRAPHICA SECTION D, Issue 7 2009
Martin Schmeisser
Most known methods for determining the structure of macromolecular complexes are limited, or at least constrained at some point, by their computational demands. Recent developments in information technology such as multicore, parallel and GPU processing can be used to overcome these limitations. In particular, graphics processing units (GPUs), which were originally developed for rendering real-time effects in computer games, are now ubiquitous and provide unprecedented computational power for scientific applications. Each parallel-processing paradigm alone can improve overall performance; combining all of the paradigms, and thereby unleashing the full power of today's technology, makes feasible certain applications that were previously virtually impossible. In this article, state-of-the-art parallel-processing paradigms are introduced, the tools and infrastructure needed to apply them are presented, and a strategy for moving scientific applications to the next generation of computer hardware is outlined. [source]
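
As a minimal illustration of the data-parallel pattern the article builds on, the sketch below distributes an independent per-particle task across CPU cores with Python's standard multiprocessing module. Single-particle images can be processed independently, which is what makes the problem map so naturally onto multicore, cluster, and GPU hardware; the image stack and the scoring function here are hypothetical placeholders, not code from any electron microscopy package.

    # Data-parallel sketch: independent per-particle work distributed
    # across CPU cores. The same pattern maps onto clusters or GPUs.
    # Function and variable names are illustrative, not from an EM package.
    import multiprocessing as mp
    import numpy as np

    def align_particle(image: np.ndarray) -> float:
        """Placeholder per-particle task: score one image against a reference."""
        reference = np.ones_like(image)                # stand-in for a class average
        return float(np.vdot(image, reference).real)   # stand-in for a correlation score

    if __name__ == "__main__":
        # A hypothetical stack of 10,000 small particle images.
        rng = np.random.default_rng(0)
        stack = [rng.standard_normal((64, 64)) for _ in range(10_000)]

        # Embarrassingly parallel map over the particle stack.
        with mp.Pool() as pool:
            scores = pool.map(align_particle, stack)

        print(f"Scored {len(scores)} particles on {mp.cpu_count()} cores.")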


A series of molecular dynamics and homology modeling computer labs for an undergraduate molecular modeling course

BIOCHEMISTRY AND MOLECULAR BIOLOGY EDUCATION, Issue 4 2010
Donald E. Elmore
Abstract As computational modeling plays an increasingly central role in biochemical research, it is important to give students exposure to common modeling methods in their undergraduate curriculum. This article describes a series of computer labs designed to introduce undergraduate students to energy minimization, molecular dynamics simulations, and homology modeling. These labs were created as part of a one-semester course on the molecular modeling of biochemical systems. Students who completed these activities felt that they were an effective component of the course, reporting improved comfort with both the conceptual background and the practical implementation of the computational methods. Although created as a component of a larger course, the activities could be readily adapted to a variety of other educational contexts. In addition, all of the labs use software that is freely available in an academic environment and can run on fairly common computer hardware, making them accessible to teaching environments without extensive computational resources. [source]
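
As a flavor of the energy-minimization concept such labs introduce, here is a minimal steepest-descent sketch for a two-atom Lennard-Jones system. The course itself uses full modeling packages; this toy example, in reduced units with an arbitrary choice of epsilon = sigma = 1, only illustrates the idea of sliding downhill on an energy surface.

    # Toy energy minimization: steepest descent on a Lennard-Jones dimer.
    # Reduced units (epsilon = sigma = 1) are an arbitrary illustrative choice.
    def lj_energy(r, epsilon=1.0, sigma=1.0):
        """Lennard-Jones pair energy U(r) = 4*eps*[(sigma/r)^12 - (sigma/r)^6]."""
        sr6 = (sigma / r) ** 6
        return 4.0 * epsilon * (sr6**2 - sr6)

    def lj_force(r, epsilon=1.0, sigma=1.0):
        """Force along the bond: F = -dU/dr."""
        sr6 = (sigma / r) ** 6
        return 24.0 * epsilon * (2.0 * sr6**2 - sr6) / r

    r, step = 1.5, 0.01               # initial separation and descent step size
    for _ in range(200):
        r += step * lj_force(r)       # move downhill along the force
    print(f"Minimized separation: {r:.4f} (analytic minimum: {2**(1/6):.4f})")

The loop converges to the analytic minimum at r = 2^(1/6)*sigma, which is the kind of sanity check a minimization lab can ask students to perform before moving on to full molecular systems.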