Computing Technologies (computing + technology)


Selected Abstracts


Integration of General Sparse Matrix and Parallel Computing Technologies for Large-Scale Structural Analysis

COMPUTER-AIDED CIVIL AND INFRASTRUCTURE ENGINEERING, Issue 6 2002
Shang-Hsien Hsieh
General sparse matrix and parallel computing technologies are integrated in this study for the finite element solution of large-scale structural problems in a PC cluster environment. The general sparse matrix technique is first employed to reduce execution time and storage requirements for solving the simultaneous equilibrium equations in finite element analysis. To further reduce the time required for large-scale structural analyses, two parallel processing approaches for sharing computational workloads among collaborating processors are then investigated. One approach adopts a publicly available parallel equation solver, SPOOLES, to solve the sparse finite element equations directly, while the other employs a parallel substructure method for the finite element solution. This work focuses mainly on integrating the general sparse matrix technique with the parallel substructure method for large-scale finite element solutions. Additionally, numerical studies on several large-scale structural analyses have been conducted on a PC cluster to investigate the effectiveness of the general sparse matrix and parallel computing technologies in reducing the time and storage requirements of large-scale finite element structural analysis. [source]
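
The abstract contains no code; as a rough illustration of the sparse direct-solution step it describes, the Python sketch below assembles a tiny 1D bar stiffness matrix in sparse form and solves the equilibrium equations with SciPy's sparse direct solver, which here stands in for SPOOLES. The model, load, and mesh size are invented for the example; the paper applies the same idea to far larger models and distributes the work across a PC cluster.

```python
import numpy as np
from scipy.sparse import lil_matrix, csr_matrix
from scipy.sparse.linalg import spsolve

def assemble_bar_stiffness(n_elem, EA=1.0, L=1.0):
    """Assemble the global stiffness matrix of a 1D bar discretized into
    n_elem two-node elements (a deliberately tiny stand-in for a large model)."""
    n_nodes = n_elem + 1
    k_local = (EA * n_elem / L) * np.array([[1.0, -1.0], [-1.0, 1.0]])
    K = lil_matrix((n_nodes, n_nodes))
    for e in range(n_elem):
        dofs = [e, e + 1]
        for i in range(2):
            for j in range(2):
                K[dofs[i], dofs[j]] += k_local[i, j]
    return csr_matrix(K)

K = assemble_bar_stiffness(n_elem=1000)
f = np.zeros(K.shape[0])
f[-1] = 1.0                                  # unit load at the free end
# Fix the first DOF and solve K u = f with a sparse direct solver.
K_ff, f_f = K[1:, 1:], f[1:]
u = np.concatenate([[0.0], spsolve(K_ff, f_f)])
print("tip displacement:", u[-1])            # PL/(EA) = 1.0 for this bar
```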


The Role of Effective Modeling in the Development of Self-Efficacy: The Case of the Transparent Engine

DECISION SCIENCES JOURNAL OF INNOVATIVE EDUCATION, Issue 1 2007
Kevin P. Scheibe
ABSTRACT Computing technology augments learning in education in a number of ways. One particular method uses interactive programs to demonstrate complex concepts. The purpose of this article is to examine one type of interactive learning technology, the transparent engine. The transparent engine allows instructors and students to view and directly interact with educational concepts such as Web-enabled software development. The article first presents a framework describing transparent engines. The framework details four types of transparent engines: (1) enactive mastery/manipulatable, (2) enactive mastery/nonmanipulatable, (3) vicarious experience/manipulatable, and (4) vicarious experience/nonmanipulatable. Following this, we present the results of an experiment designed to examine this framework by testing its predictions for one quadrant, vicarious experience/nonmanipulatable. The results support the framework in that students taught concepts with the aid of the vicarious experience/nonmanipulatable transparent engine had significantly higher domain-specific self-efficacy compared to those taught the same concepts without this tool. [source]


A Grid-enabled problem-solving environment for advanced reservoir uncertainty analysis

CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 18 2008
Zhou Lei
Abstract Uncertainty analysis is critical for reservoir performance prediction, but it is challenging because it (1) relies on massive, geographically distributed, modeling-related data sets (geoscience and engineering data) at terabyte or even petabyte scale; (2) needs to rapidly perform hundreds or thousands of flow simulations, identical runs with different models that calculate the impacts of various uncertainty factors; and (3) requires an integrated, secure, and easy-to-use problem-solving toolkit to assist the analysis. We leverage Grid computing technologies to address these challenges. We design and implement an integrated problem-solving environment, ResGrid, to effectively improve reservoir uncertainty analysis. ResGrid consists of data management, execution management, and a Grid portal. Data Grid tools, such as metadata, replica, and transfer services, are used to handle the massive size and geographic distribution of the data sets. Workflow, task farming, and resource allocation are used to support large-scale computation. The Grid portal integrates the data management and computation solutions into a unified, easy-to-use interface, enabling reservoir engineers to specify uncertainty factors of interest and perform large-scale reservoir studies through a web browser. ResGrid has been used in petroleum engineering. Copyright © 2008 John Wiley & Sons, Ltd. [source]
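
As a loose sketch of the task-farming pattern the abstract mentions (not the ResGrid implementation, which submits jobs to Grid resources through its portal and workflow services), the Python fragment below farms a set of identical flow-simulation runs with different uncertainty realizations out to a local process pool. The function `run_flow_simulation`, its parameters, and the toy "recovery" formula are hypothetical placeholders for a real reservoir simulator.

```python
from concurrent.futures import ProcessPoolExecutor

def run_flow_simulation(realization):
    """Hypothetical stand-in for one reservoir flow simulation run.
    In a Grid setting this would be a job dispatched to a remote resource."""
    porosity, permeability = realization
    # ... the actual simulator would be invoked here ...
    return {"realization": realization,
            "recovery": 0.3 * porosity + 1e-4 * permeability}

def main():
    # One identical run per sampled combination of uncertainty factors (task farming).
    realizations = [(p, k) for p in (0.15, 0.20, 0.25) for k in (50, 100, 200)]
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(run_flow_simulation, realizations))
    best = max(results, key=lambda r: r["recovery"])
    print("best realization:", best)

if __name__ == "__main__":
    main()
```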


Information technology and transformations in social security policy and administration: A review

INTERNATIONAL SOCIAL SECURITY REVIEW, Issue 4 2001
Paul Henman
In this paper we analyse the interactive relationship between technology, administration and policy in social security. Focusing on new and emerging information and computing technologies, we show how they have been shaped and adopted by social security institutions in different countries, and explore their differential impact on recipients and staff, on organizational structures, and on policy and practice. We conclude that similar technologies have been adopted in a variety of ways to address different economic, social, political and organizational objectives and that, although these differences are becoming more blurred, different patterns have been associated with different welfare state regimes. [source]


Telerobotic systems design based on real-time CORBA

JOURNAL OF FIELD ROBOTICS (FORMERLY JOURNAL OF ROBOTIC SYSTEMS), Issue 4 2005
Michele Amoretti
A new class of telerobotic applications is making its way into research laboratories, fine arts or science museums, and industrial installations. Virtual laboratories and remote equipment maintenance are examples of these applications, which are built by exploiting distributed computing systems and Internet technologies. Distributed computing technologies provide several advantages to telerobotic applications, such as dynamic, multiuser access to remote resources from arbitrary user locations. Nonetheless, building these applications remains a substantial endeavor, especially when performance requirements must be met. The aim of this paper is to investigate how mainstream and advanced features of the CORBA object-oriented middleware can be put to work to meet the requirements of novel telerobotic applications. We show that the Real-Time CORBA extensions and asynchronous method invocation of CORBA services can be relied upon to meet performance and functional requirements, thereby enabling teleoperation on local area networks. Furthermore, CORBA services for concurrency control and large-scale data distribution enable geographic-scale access for robot teleprogramming. Limitations in the currently available implementations of the CORBA standard are also discussed, along with their implications. The effectiveness and suitability of several CORBA mechanisms for telerobotic applications are tested first individually and then by means of a software framework that exploits CORBA services and supports component-based development, software reuse, low development cost, and fully portable real-time and communication support. A comprehensive telerobotic application built on the framework is described and evaluated on both local and wide area networks. The application includes a robot manipulator and several sensory subsystems under concurrent access by multiple competing or collaborating operators, one of whom is equipped with a multimodal user interface acting as the master device. © 2005 Wiley Periodicals, Inc. [source]
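
The following Python sketch is only an analogy for the asynchronous method invocation pattern the paper relies on: a slow motion command is dispatched without blocking so the operator loop can keep polling sensors while the command executes. It uses a plain thread pool rather than a CORBA ORB, and the servant functions, their names, and the timing are invented for the example.

```python
import concurrent.futures
import time

# Hypothetical stand-ins for remote servant operations; names are illustrative only.
def move_arm(joint_targets):
    """Simulates a long-running manipulator motion command."""
    time.sleep(2.0)                      # network and actuation latency
    return {"reached": joint_targets}

def read_force_sensor():
    """Simulates a fast sensory query."""
    return 0.42

with concurrent.futures.ThreadPoolExecutor() as pool:
    # "Asynchronous invocation": dispatch the slow command without blocking...
    motion = pool.submit(move_arm, [0.0, 1.2, -0.5])
    # ...so the operator loop keeps polling sensors while the command runs.
    while not motion.done():
        print("force reading:", read_force_sensor())
        time.sleep(0.5)
    print("motion result:", motion.result())
```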


Parallel, distributed and GPU computing technologies in single-particle electron microscopy

ACTA CRYSTALLOGRAPHICA SECTION D, Issue 7 2009
Martin Schmeisser
Most known methods for the determination of the structure of macromolecular complexes are limited or at least restricted at some point by their computational demands. Recent developments in information technology such as multicore, parallel and GPU processing can be used to overcome these limitations. In particular, graphics processing units (GPUs), which were originally developed for rendering real-time effects in computer games, are now ubiquitous and provide unprecedented computational power for scientific applications. Each parallel-processing paradigm alone can improve overall performance; the increased computational performance obtained by combining all paradigms, unleashing the full power of today's technology, makes certain applications feasible that were previously virtually impossible. In this article, state-of-the-art paradigms are introduced, the tools and infrastructure needed to apply these paradigms are presented and a state-of-the-art infrastructure and solution strategy for moving scientific applications to the next generation of computer hardware is outlined. [source]
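
As a minimal illustration of the parallel-processing paradigms discussed (not code from the article), the sketch below applies an FFT-based low-pass filter, a typical per-image kernel in single-particle processing, to a stack of independent particle images via a process pool. The same per-image kernel is what a GPU port would offload, for example by swapping the NumPy array calls for a GPU array library; the image sizes and filter radius are arbitrary.

```python
import numpy as np
from concurrent.futures import ProcessPoolExecutor

def lowpass_filter(image):
    """Per-image kernel: FFT-based low-pass filter. The array expressions here
    are exactly the kind of data-parallel work a GPU executes efficiently."""
    F = np.fft.fftshift(np.fft.fft2(image))
    ny, nx = image.shape
    yy, xx = np.mgrid[-ny // 2:ny // 2, -nx // 2:nx // 2]
    mask = (xx**2 + yy**2) <= (min(nx, ny) // 8) ** 2
    return np.real(np.fft.ifft2(np.fft.ifftshift(F * mask)))

def main():
    # Task-level parallelism: each particle image is independent, so the stack
    # can be farmed out across cores (or cluster nodes, or GPUs).
    stack = [np.random.rand(128, 128).astype(np.float32) for _ in range(64)]
    with ProcessPoolExecutor() as pool:
        filtered = list(pool.map(lowpass_filter, stack))
    print(len(filtered), filtered[0].shape)

if __name__ == "__main__":
    main()
```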


Market-based grid resource co-allocation and reservation for applications with hard deadlines

CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 18 2009
Kurt Vanmechelen
Abstract Grid computing technology enables the creation of large-scale IT infrastructures that are shared across organizational boundaries. In such shared infrastructures, conflicts between user requirements are common and originate from the selfish actions that users perform when formulating their service requests. The introduction of economic principles into grid resource management offers a promising way of dealing with these conflicts. We develop and analyze both a centralized and a decentralized algorithm for economic grid resource management in the context of compute-bound applications with deadline-based quality-of-service requirements and non-migratable workloads. Through the use of reservations, we co-allocate resources across multiple providers in order to ensure that applications finish within their deadlines. An evaluation of both algorithms is presented, and their performance in terms of realized user value is compared with that of an existing market-based resource management algorithm. We establish that our algorithms, which operate under a more realistic workload model, can closely approximate the performance of this algorithm. We also quantify the effect of allowing local workload preemption and of different scheduling heuristics on the realized user value. Copyright © 2009 John Wiley & Sons, Ltd. [source]
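
The sketch below is a much-simplified, hypothetical illustration of deadline-aware reservation, not the authors' centralized or decentralized algorithms: each non-migratable job is greedily reserved on the single provider that can finish it earliest, in order of value density, and rejected if no provider can meet its deadline. The paper's algorithms additionally co-allocate across providers and consider preemption; all class names and numbers here are invented.

```python
from dataclasses import dataclass

@dataclass
class Provider:
    name: str
    speed: float                # work units per hour
    booked_until: float = 0.0   # end of the last reservation (hours)

@dataclass
class Request:
    job_id: str
    work: float                 # total work units (non-migratable job)
    deadline: float             # absolute deadline (hours)
    value: float                # user valuation if the job finishes on time

def reserve(requests, providers):
    """Greedy sketch: serve requests by value density, reserve the provider
    that finishes earliest, and reject jobs that would miss their deadline."""
    schedule = []
    for req in sorted(requests, key=lambda r: r.value / r.work, reverse=True):
        best = min(providers, key=lambda p: p.booked_until + req.work / p.speed)
        finish = best.booked_until + req.work / best.speed
        if finish <= req.deadline:
            schedule.append((req.job_id, best.name, best.booked_until, finish))
            best.booked_until = finish
        else:
            schedule.append((req.job_id, None, None, None))  # rejected
    return schedule

providers = [Provider("siteA", speed=4.0), Provider("siteB", speed=2.0)]
requests = [Request("job1", work=8.0, deadline=3.0, value=10.0),
            Request("job2", work=4.0, deadline=2.0, value=9.0)]
print(reserve(requests, providers))
```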


Parameter Estimation in the Error-in-Variables Models Using the Gibbs Sampler

THE CANADIAN JOURNAL OF CHEMICAL ENGINEERING, Issue 1 2006
Jessada J. Jitjareonchai
Abstract Least squares and maximum likelihood techniques have long been used in parameter estimation problems. However, those techniques provide only point estimates with unknown or approximate uncertainty information. Bayesian inference coupled with the Gibbs Sampler is an approach to parameter estimation that exploits modern computing technology and delivers estimation results complete with exact uncertainty information. The Error-in-Variables model (EVM) approach is investigated in this study. In it, both dependent and independent variables contain measurement errors, and the true values and uncertainties of all measurements are estimated. This EVM set-up leads to unusually large dimensionality in the estimation problem, which makes parameter estimation very difficult with classical techniques. In this paper, an innovative way of performing parameter estimation is introduced to chemical engineers. The paper shows that the method is simple and efficient, and that complete and accurate uncertainty information about the parameter estimates is readily available. Two real-world EVM examples are demonstrated: a large-scale linear model and an epidemiological model. The former is simple enough for most readers to grasp the new concepts without difficulty. The latter has very interesting features in that a Poisson distribution is assumed and a parameter with a known distribution is retained while the other unknown parameters are estimated. The Gibbs Sampler results are compared with those of least squares. [source]
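
To make the approach concrete, here is a small self-contained Gibbs sampler for a linear error-in-variables model with known measurement variances and flat priors, written for illustration rather than taken from the paper. The data, priors, and noise levels are invented, and the paper's examples (a large-scale linear model and an epidemiological model) are considerably more elaborate.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic linear error-in-variables data: both x and y are measured with noise.
n, a_true, b_true = 50, 1.0, 2.0
sigma_x, sigma_y = 0.3, 0.5                   # assumed-known measurement std devs
xi_true = rng.uniform(0.0, 10.0, n)           # latent "true" x values
x_obs = xi_true + rng.normal(0.0, sigma_x, n)
y_obs = a_true + b_true * xi_true + rng.normal(0.0, sigma_y, n)

# Gibbs sampler with flat priors on the intercept a, slope b, and latent xi.
a, b, xi = 0.0, 1.0, x_obs.copy()
draws = []
for it in range(5000):
    # 1. xi_i | a, b, data: product of the two normal likelihoods for x_obs_i and y_i
    prec = 1.0 / sigma_x**2 + b**2 / sigma_y**2
    mean = (x_obs / sigma_x**2 + b * (y_obs - a) / sigma_y**2) / prec
    xi = rng.normal(mean, 1.0 / np.sqrt(prec))
    # 2. a | b, xi, data: normal with mean of the residuals y - b*xi
    a = rng.normal(np.mean(y_obs - b * xi), sigma_y / np.sqrt(n))
    # 3. b | a, xi, data: regression through the origin of (y - a) on xi
    sxx = np.sum(xi**2)
    b = rng.normal(np.sum(xi * (y_obs - a)) / sxx, sigma_y / np.sqrt(sxx))
    if it >= 1000:                            # discard burn-in
        draws.append((a, b))

a_s, b_s = np.array(draws).T
print(f"a = {a_s.mean():.3f} +/- {a_s.std():.3f}, b = {b_s.mean():.3f} +/- {b_s.std():.3f}")
```

The posterior standard deviations reported in the last line are the "exact uncertainty information" the abstract refers to, obtained directly from the samples rather than from an asymptotic approximation.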