Software Applications (software + application)

Selected Abstracts


ON THE GEOCHRONOLOGICAL METHOD VERSUS FLOW SIMULATION SOFTWARE APPLICATION FOR LAHAR RISK MAPPING: A CASE STUDY OF POPOCATÉPETL VOLCANO, MEXICO

GEOGRAFISKA ANNALER SERIES A: PHYSICAL GEOGRAPHY, Issue 3 2010
ESPERANZA MUÑOZ-SALINAS
ABSTRACT. Lahars are hazardous events that can cause serious harm to people living close to volcanic areas; several were registered at different times in the last century, such as at Mt St Helens (USA) in 1980, Nevado del Ruiz (Colombia) in 1985 and Mt Pinatubo (Philippines) in 1991. Risk maps are currently used by decision-makers to help them plan for mitigating the risk posed by lahars. Such maps are produced from a series of tenets that take into account the distribution and chronology of past lahar deposits, and essentially two approaches have been used: (1) Flow Simulation Software (FSS), which simulates flows along channels in a Digital Elevation Model, and (2) the Geochronological Method (GM), in which the mapping is based on the evaluation of lahar magnitude and frequency. This study addresses the production of a lahar risk map using the two approaches (FSS and GM) for a study area at Popocatépetl volcano, Central Mexico. The case study is Santiago Xalitzintla, a town on the northern flank of Popocatépetl, where volcanic activity in recent centuries has triggered numerous lahars that have endangered local inhabitants. The FSS results were not satisfactory because they were not consistent with lahar sediment observations made during fieldwork. By contrast, the GM produced results consistent with these observations, and we therefore use them to assess the hazard and produce the risk map for the study area. [source]


Generic representation and evaluation of properties as a function of position in reciprocal space

JOURNAL OF APPLIED CRYSTALLOGRAPHY, Issue 6 2002
Kevin Cowtan
A generalized approach is described for evaluating arbitrary functions of position in reciprocal space. This generalization subsumes a whole range of calculations that form part of almost every crystallographic software application. Examples include scaling of structure factors, the calculation of structure-factor statistics, and some simple likelihood calculations for a single parameter. The generalized approach has a number of advantages: all these calculations may now be performed by a single software routine, which need only be debugged and optimized once; the existing approach of dividing reciprocal space into resolution shells, with discontinuities at the boundaries, is no longer necessary; and the implementation provided makes employing the new functionality extremely simple and concise. The calculation is split into three standard components, for which a number of implementations are provided for different tasks. A `basis function' describes some function of position in reciprocal space, the shape of which is determined by a small number of parameters. A `target function' describes the property for which a functional representation is required. An `evaluator' takes a basis and a target function and optimizes the parameters of the basis function to fit the target function. Ideally the components should be usable in any combination. [source]
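The three-component design described above lends itself to a compact illustration. The following Python sketch is our own reading of the decomposition, not the paper's actual interface: the class names, the Gaussian basis and the synthetic data are all assumptions made for illustration.

```python
import numpy as np
from scipy.optimize import minimize

class GaussianBasis:
    """Basis function: a smooth function of position in reciprocal space
    (here of s = 1/d^2), shaped by a small number of parameters."""
    def evaluate(self, s, params):
        k, b = params                      # scale and B-factor-like width
        return k * np.exp(-b * s / 4.0)

class MeanIntensityTarget:
    """Target function: the per-reflection property for which a smooth
    functional representation is required."""
    def __init__(self, s_obs, i_obs):
        self.s_obs, self.i_obs = s_obs, i_obs
    def residual(self, basis, params):
        fit = basis.evaluate(self.s_obs, params)
        return np.sum((self.i_obs - fit) ** 2)

def evaluator(basis, target, x0):
    """Evaluator: optimize the basis-function parameters to fit the
    target; any basis/target pair can be combined."""
    return minimize(lambda p: target.residual(basis, p), x0).x

# Usage: recover the parameters of a synthetic intensity falloff.
s = np.linspace(0.01, 0.25, 200)
i_obs = 100.0 * np.exp(-20.0 * s / 4.0)
print(evaluator(GaussianBasis(), MeanIntensityTarget(s, i_obs), [50.0, 10.0]))
# approximately [100, 20]
```

Because the evaluator never inspects the internals of the basis or target, any new calculation reduces to supplying one of each, which is the single-routine advantage the abstract describes.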


Determinants of software volatility: a field study

JOURNAL OF SOFTWARE MAINTENANCE AND EVOLUTION: RESEARCH AND PRACTICE, Issue 3 2003
Xiaoni Zhang
Abstract. Although technology advances have provided new tools for maintaining software, maintenance costs remain the largest component of software life-cycle cost. One factor claimed to drive the cost of maintenance is software volatility. The objective of this research is to investigate the relationship between certain software attributes and software volatility. In this study, software volatility refers to the frequency or number of enhancements to an application over a specified time. This metric is divided by the number of source lines of code (SLOC) to obtain a normalized measure that takes into account the size of the software application. The research model is built on previous research concerning software volatility. Three factors are examined to determine their influence on software volatility normalized for SLOC: age, software complexity, and software complexity normalized for SLOC. In addition, we introduce the notion that mean time between software enhancements moderates the relationship of age, complexity, and complexity normalized for SLOC with software volatility. A field study at a major corporation allowed for the collection of data covering a 13-year period. These data are used to empirically test the hypotheses presented in this study. As a moderator variable, mean time between enhancements significantly contributes to the explanatory power of a prediction model for software volatility adjusted for SLOC. Software administrators may wish to use the proposed model in their decision-making plans to control software costs. Copyright © 2003 John Wiley & Sons, Ltd. [source]
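Because the volatility metric is defined arithmetically, a worked example may help. The Python sketch below uses invented figures; the function names and the observation window are illustrative assumptions, not data from the field study.

```python
def volatility_per_sloc(n_enhancements: int, years: float, sloc: int) -> float:
    """Enhancement frequency per year, normalized for application size
    in source lines of code (SLOC)."""
    return (n_enhancements / years) / sloc

def mean_time_between_enhancements(years: float, n_enhancements: int) -> float:
    """The moderator variable examined in the study, in years."""
    return years / n_enhancements

# Hypothetical application: 52 enhancements over 13 years, 80,000 SLOC.
print(volatility_per_sloc(52, 13.0, 80_000))        # 5e-05 per year per line
print(mean_time_between_enhancements(13.0, 52))     # 0.25 years
```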


The Logistical Tracking System (LTS) five years later: What has been accomplished?

NEW DIRECTIONS FOR INSTITUTIONAL RESEARCH, Issue 135 2007
Nicolas A. Valcik
This chapter discusses the creation and implementation, at The University of Texas at Dallas, of a software application developed in-house to capture facilities, inventory, and personnel data for mandatory federal and state reporting. [source]


A randomized trial evaluating a cognitive simulator for laparoscopic appendectomy

ANZ JOURNAL OF SURGERY, Issue 9 2010
Benjamin P. T. Loveday
Abstract. Background: The Integrated Cognitive Simulator (ICS) is a software application that integrates text, anatomy, video and simulation for training clinical procedures. The aim of this randomized controlled trial was to determine the usability of the ICS laparoscopic appendectomy module, and to determine its effectiveness in training the cognitive skills required for the procedure. Methods: Junior surgical trainees were randomized into control and intervention groups. The latter had access to the ICS. Participants had three assessments: a pre-study questionnaire to determine demographics, 20 multiple-choice questions to assess procedural knowledge (training effectiveness) after 2 weeks, and a questionnaire to assess usability after 4 months. Results: Fifty-eight trainees were randomized. The overall response rate was 57%. The median scores for interface, functionality, usefulness and likelihood of utilization (usability) were 5/7 or higher. In the multiple-choice questions (training effectiveness), first-year trainees in the intervention group scored higher than the control group (14.9 versus 12.1, P = 0.04), but second-year trainees did not. Use of the ICS did not alter the participants' perceived need for intra-operative guidance. Conclusions: The ICS is considered highly usable by trainees. The ICS is effective for training cognitive skills for laparoscopic appendectomy among first-year surgical trainees. Training cognitive skills alone does not increase confidence in the ability to perform motor tasks. [source]


A review of the Behavioral Evaluation Strategy and Taxonomy (BEST®) software application

BEHAVIORAL INTERVENTIONS, Issue 4 2004
Tina M. Sidener
Recent computer technology has led to the development of a number of software applications that have been specifically designed for collecting and analyzing observational data in real time. Behavioral Evaluation Strategy and Taxonomy (BEST®) is an innovative software program that provides users with an effective way to collect, store, and analyze real-time observational data. The program comprises two distinct applications: BEST Collection® and BEST Analysis®. The purpose of the current article was to provide a critical review of BEST Version 4.1.6 for the Windows® (95/98/NT) operating system. The basis of this review was our use of BEST to collect and analyze data for several studies over a 2-year period. Copyright © 2004 John Wiley & Sons, Ltd. [source]


Cell deposition system based on laser guidance

BIOTECHNOLOGY JOURNAL, Issue 9 2006
Russell K. Pirlo
Abstract. We have designed a laser cell deposition system that employs the phenomenon of laser guidance to place single cells at specific points in a variety of in vitro environments. Here, we describe the components of the system: the laser optics, the deposition chamber, the microinjection cell feeding system and our custom system control software application. We discuss the requirements and challenges involved in laser guidance of cells and how our present system overcomes these challenges. We demonstrate that the patterning system is accurate to within one micrometer by repeatedly depositing polymer microspheres and measuring their positions. We demonstrate its ability to create highly defined living patterns by depositing a defined pattern of neurons whose neurite extensions display normal function. We found that the positional error of our system is smaller than the variations in cell size and the pattern disruptions that arise from normal cell movement during substrate adhesion. The laser cell deposition system is a potentially useful tool for achieving site- and time-specific placement of an individual cell in a cell culture for the systematic investigation of cell-cell and cell-extracellular matrix interactions. [source]
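The one-micrometer accuracy claim rests on a simple repeated-measurement procedure. The Python sketch below shows one way such a check could be computed; the target coordinates and the scatter of the simulated depositions are invented for illustration.

```python
import numpy as np

target = np.array([50.0, 50.0])                         # intended spot (um)
rng = np.random.default_rng(42)
measured = target + rng.normal(0.0, 0.3, size=(25, 2))  # 25 simulated depositions

errors = np.linalg.norm(measured - target, axis=1)      # radial error (um)
print(f"mean {errors.mean():.2f} um, max {errors.max():.2f} um")
# Sub-micrometer spread: every deposition lands within ~1 um of the target.
```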


Book and Media Reviews

ADDICTION BIOLOGY, Issue 4 2000
David Ball
Book reviews in this column will primarily be of titles focusing completely, or in part, on biological aspects of addiction. However, significant titles of general relevance to the addictions field will also be included, even if they are not "biological", as will titles of general methodological and clinical relevance, even if they are not on "addictions". Similar considerations will apply to other media (software, audio tapes and CDs, videos, etc.). However, specific "addictions" software applications seem to be relatively uncommon and, as these items are rarely reviewed elsewhere, we will endeavour to include reviews of some of the older programmes that are still useful, as well as new titles that appear. I would appreciate suggestions of any items suitable for review, but especially software and other media of specific relevance to the addictions. Please contact: Dr David Ball, National Addiction Centre, 4 Windsor Walk, London SE5 8AF, UK. Books reviewed in this article: Dual Diagnosis and Treatment: substance abuse and co-morbid medical and psychiatric disorders, Henry R. Kranzler & Bruce J. Rounsaville (Eds); Improving the Care of People in Substance Misuse Services: clinical audit project examples, Kirsty MacLean Steel & Claire Palmer; Software: European Legal Texts on Drugs (CD-ROM). [source]


Describing generic expertise models as object-oriented analysis patterns: the heuristic multi-attribute decision pattern

EXPERT SYSTEMS, Issue 3 2002
Ángeles Manjarrés
We report on work concerning the use of object-oriented analysis and design (OAD) methods in the development of artificial intelligence (AI) software applications, in which we compare such techniques to software development methods more commonly used in AI, in particular CommonKADS. As a contribution to clarifying the role of OAD methods in AI, in this paper we compare the analysis models of the object-oriented methods and the CommonKADS high-level expertise model. In particular, we study the correspondences between generic tasks, methods and ontologies in methodologies such as CommonKADS and analysis patterns in object-oriented analysis. Our aim in carrying out this study is to explore to what extent, in areas of AI where the object-oriented paradigm may be the most adequate way of conceiving applications, an analysis-level `pattern language' could play the role of the libraries of generic knowledge models in the more commonly used AI software development methods. As a case study we use the decision task, whose importance arises from its status as the basic task of the intelligent agent, and the associated heuristic multi-attribute decision method, for which we derive a corresponding decision pattern described in the Unified Modelling Language, a de facto standard in OAD. [source]
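Since the heuristic multi-attribute decision method anchors the case study, a minimal sketch of the task itself may be useful. The Python below implements the generic weighted-sum heuristic under our own assumptions; it is not the UML pattern derived in the paper.

```python
from typing import Dict

def decide(alternatives: Dict[str, Dict[str, float]],
           weights: Dict[str, float]) -> str:
    """Pick the alternative whose weighted sum of attribute values is highest."""
    def utility(attrs: Dict[str, float]) -> float:
        return sum(weights[a] * value for a, value in attrs.items())
    return max(alternatives, key=lambda name: utility(alternatives[name]))

# Usage with hypothetical alternatives and weights.
options = {
    "plan_a": {"cost": 0.4, "quality": 0.9},   # utility = -0.4 + 1.8 = 1.4
    "plan_b": {"cost": 0.8, "quality": 0.6},   # utility = -0.8 + 1.2 = 0.4
}
print(decide(options, weights={"cost": -1.0, "quality": 2.0}))  # plan_a
```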


Relative importance of evaluation criteria for enterprise systems: a conjoint study

INFORMATION SYSTEMS JOURNAL, Issue 3 2006
Mark Keil
Abstract. While a large body of research exists on the development and implementation of software, organizations are increasingly acquiring enterprise software packages [e.g. enterprise resource planning (ERP) systems] instead of custom developing their own software applications. To be competitive in the marketplace, software package development firms must manage the three-pronged trade-off between cost, quality and functionality. Surprisingly, prior research has made little attempt to investigate the characteristics of packaged software that influence management information system (MIS) managers' likelihood of recommending purchase. As a result, both the criteria by which MIS managers evaluate prospective packaged systems and the attributes that lead to commercially competitive ERP software products are poorly understood. This paper examines this understudied issue through a conjoint study. We focus on ERP systems, which are among the largest and most complex packaged systems that are purchased by organizations. In a conjoint study, 1008 evaluation decisions based on hypothetical ERP software package profiles were completed by managers in 126 organizations. The study represents the first empirical investigation of the relative importance that managers ascribe to various factors that are believed to be important in evaluating packaged software. The results provide important insights for both organizations that acquire such systems and those that develop them. The results show that functionality, reliability, cost, ease of use and ease of customization are judged to be important criteria, while ease of implementation and vendor reputation were not found to be significant. Functionality and reliability were found to be the most heavily weighted factors. We conclude the paper with a detailed discussion of the results and their implications for software acquisition and development practice. [source]
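The conjoint logic behind the study can be sketched briefly. In the Python below, part-worth utilities are estimated by ordinary least squares from ratings of hypothetical profiles; the attribute coding, profiles and ratings are invented, and this is only one common way of analysing conjoint data, not necessarily the authors' procedure.

```python
import numpy as np

attributes = ["functionality", "reliability", "cost",
              "ease_of_use", "ease_of_customization"]
# Each row is a hypothetical profile: 1 = favourable level, 0 = unfavourable.
profiles = np.array([
    [1, 1, 0, 1, 0], [0, 1, 1, 0, 1], [1, 0, 1, 1, 1], [0, 0, 0, 0, 0],
    [1, 1, 1, 0, 0], [0, 1, 0, 1, 1], [1, 0, 0, 0, 1], [0, 0, 1, 1, 0],
])
ratings = np.array([8.5, 6.0, 7.0, 2.0, 8.0, 6.5, 5.0, 4.0])

# Part-worth utilities via least squares, with an intercept column.
X = np.column_stack([np.ones(len(profiles)), profiles])
coefs, *_ = np.linalg.lstsq(X, ratings, rcond=None)
part_worths = coefs[1:]

# Relative importance: each attribute's share of total absolute utility.
importance = np.abs(part_worths) / np.abs(part_worths).sum()
for name, share in zip(attributes, importance):
    print(f"{name:22s} {share:.2f}")
```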


Reliability factors in business software: volatility, requirements and end-users

INFORMATION SYSTEMS JOURNAL, Issue 3 2002
Paul L Bowen
Abstract. Many business-oriented software applications are subject to frequent changes in requirements. This paper shows that, ceteris paribus, increases in the volatility of system requirements decrease the reliability of software. Further, systems that exhibit high volatility during the development phase are likely to have lower reliability during their operational phase. In addition to the typically higher volatility of requirements, end-users who specify the requirements of business-oriented systems are usually less technically oriented than people who specify the requirements of compilers, radar tracking systems or medical equipment. Hence, the characteristics of software reliability problems for business-oriented systems are likely to differ significantly from those of more technically oriented systems. [source]


Networking lessons in delivering 'Software as a Service', Part II

INTERNATIONAL JOURNAL OF NETWORK MANAGEMENT, Issue 6 2002
David Greschler
In part I of this paper, we described the origins and evolution of Software as a Service (SaaS) and its value proposition to Corporate IT, Service Providers, Independent Software Vendors and End Users. SaaS is a model in which software applications are deployed, managed, updated and supported on demand, like a utility, and are served to users centrally using servers that are internal or external to the enterprise. Applications are no longer installed locally on a user's desktop PC; instead, upgrades, licensing and version control, metering, support and provisioning are all managed at the server level. In part II we examine the lessons learned in researching, building and running an SaaS service. Copyright © 2002 John Wiley & Sons, Ltd. [source]


The status of training and education in information and computer technology of Australian nurses: a national survey

JOURNAL OF CLINICAL NURSING, Issue 20 2008
Robert Eley
Aims and objectives. A study was undertaken of the current knowledge and future training requirements of nurses in information and computer technology to inform policy to meet national goals for health. Background. The role of the modern clinical nurse is intertwined with information and computer technology, and adoption of such technology forms an important component of national strategies in health. The majority of nurses are expected to use information and computer technology during their work; however, the full extent of their knowledge and experience is unclear. Design. Self-administered postal survey. Methods. A 78-item questionnaire was distributed to 10,000 Australian Nursing Federation members to identify the nurses' use of information and computer technology. Eighteen items related to nurses' training and education in information and computer technology. Results. The response rate was 44%. Computers were used by 86.3% of respondents as part of their work-related activities. Between 4% and 17% of nurses had received training in each of 11 generic computer skills and software applications during their preregistration/pre-enrolment education, and between 12% and 30% as continuing professional education. Nurses who had received training believed that it was adequate to meet the needs of their job and was given at an appropriate time. Almost half of the respondents indicated that they required more training to better meet the information and computer technology requirements of their jobs, and a quarter believed that their level of computer literacy was restricting their career development. Nurses considered that the vast majority of employers did not encourage information and computer technology training and, for those for whom training was available, workload was the major barrier to uptake. Nurses favoured the introduction of a national competency standard in information and computer technology. Conclusions. For the considerable benefits of information and computer technology to be incorporated fully into the health system, employers must pay more attention to the training and education of nurses, who are the largest users of that technology. Relevance to clinical practice. Knowledge of the training and education needs of clinical nurses with respect to information and computer technology will provide a platform for the development of appropriate policies by government and by employers. [source]


An automated approach for abstracting execution logs to execution events

JOURNAL OF SOFTWARE MAINTENANCE AND EVOLUTION: RESEARCH AND PRACTICE, Issue 4 2008
Zhen Ming Jiang
Abstract. Execution logs are generated by output statements that developers insert into the source code. Execution logs are widely available and are helpful in monitoring, remote issue resolution, and system understanding of complex enterprise applications. There are many proposals for standardized log formats such as the W3C and SNMP formats. However, most applications use ad hoc non-standardized logging formats. Automated analysis of such logs is complex due to the loosely defined structure and a large non-fixed vocabulary of words. The large volume of logs, produced by enterprise applications, limits the usefulness of manual analysis techniques. Automated techniques are needed to uncover the structure of execution logs. Using the uncovered structure, sophisticated analysis of logs can be performed. In this paper, we propose a log abstraction technique that recognizes the internal structure of each log line. Using the recovered structure, log lines can be easily summarized and categorized to help comprehend and investigate the complex behavior of large software applications. Our proposed approach handles free-form log lines with minimal requirements on the format of a log line. Through a case study using log files from four enterprise applications, we demonstrate that our approach abstracts log files of different complexities with high precision and recall. Copyright © 2008 John Wiley & Sons, Ltd. [source]
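The core idea of separating the static and dynamic parts of a log line can be shown compactly. The Python sketch below is a simplified stand-in for the authors' technique: dynamic fields are masked with placeholders so that lines emitted by the same output statement collapse to one execution event; the regular expressions and sample lines are our own.

```python
import re
from collections import Counter

def abstract_line(line: str) -> str:
    """Replace dynamic fields with placeholders, leaving the static text."""
    line = re.sub(r"\b\d{1,3}(?:\.\d{1,3}){3}\b", "<IP>", line)  # IP addresses
    line = re.sub(r"0x[0-9a-fA-F]+", "<HEX>", line)              # hex values
    line = re.sub(r"\b\d+\b", "<NUM>", line)                     # integers
    return line

logs = [
    "user 4211 connected from 10.0.0.7",
    "user 17 connected from 192.168.1.20",
    "cache flush took 382 ms",
    "cache flush took 9 ms",
]
for template, count in Counter(abstract_line(l) for l in logs).items():
    print(count, template)
# 2 user <NUM> connected from <IP>
# 2 cache flush took <NUM> ms
```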


'You gotta lie to it': software applications and the management of technological change in a call centre

NEW TECHNOLOGY, WORK AND EMPLOYMENT, Issue 2 2007
Bob Russell
This paper advances an extended material analysis to the study of technological change in a call centre. It shows how such an analysis is particularly apropos for understanding the distance that often separates managerial intentions in introducing a new technology from the outcomes associated with how workers utilise it. [source]


Public bookmarks and private benefits: An analysis of incentives in social computing

PROCEEDINGS OF THE AMERICAN SOCIETY FOR INFORMATION SCIENCE & TECHNOLOGY (ELECTRONIC), Issue 1 2007
Rick Wash
Users of social computing websites are both producers and consumers of the information found on the site. This creates a novel problem for web-based software applications: how can website designers induce users to produce information that is useful for others? We study this question by interviewing users of the social bookmarking website del.icio.us. We find that for the users in our sample, metadata reflecting who bookmarked a webpage better supports information seeking than free-form keyword metadata (tags). We explain this finding by describing differences in the way that the design of del.icio.us motivates users to contribute by providing personal benefits for bookmarking and tagging. [source]


Linking data to electronic records

QUALITY ASSURANCE JOURNAL, Issue 2 2003
Heather Longden
Abstract. Today it is possible to maintain electronic records in a single application in compliance with 21 CFR Part 11. However, most electronic data for a sample in an analytical laboratory are spread across a number of software applications as well as traditional paper systems. This article examines how it is possible to link both paper and electronic records together in hybrid systems. A case study is used to demonstrate the practical aspects of a totally electronic process. Copyright © 2003 John Wiley & Sons, Ltd. [source]


The role of medical simulation: an overview

THE INTERNATIONAL JOURNAL OF MEDICAL ROBOTICS AND COMPUTER ASSISTED SURGERY, Issue 3 2006
Kevin Kunkler
Abstract. Robotic surgery and medical simulation have much in common: both use a mechanized interface that provides visual "patient" reactions in response to the actions of the health care professional (although simulation also includes touch feedback); both use monitors to visualize the progression of the procedure; and both use computer software applications through which the health care professional interacts. Both technologies are experiencing rapid adoption and are viewed as modalities that allow physicians to perform increasingly complex minimally invasive procedures while enhancing patient safety. A review of the literature and industry developments concludes that medical simulators can be useful tools in determining a physician's understanding and use of best practices, management of patient complications, appropriate use of instruments and tools, and overall competence in performing procedures. Future use of these systems depends on their impact on patient safety, procedure completion time and cost efficiency. The sooner simulation training can be used to support developing technologies and procedures, the better the results typically are. Continued studies are needed to identify and ensure the ongoing applicability of these systems for both training and certification. Copyright © 2006 John Wiley & Sons, Ltd. [source]


Life-science applications of the Cambridge Structural Database

ACTA CRYSTALLOGRAPHICA SECTION D, Issue 6-1 2002
Robin Taylor
Several studies show that the molecular geometries and intermolecular interactions observed in small-molecule crystal structures are relevant to the modelling of in vivo situations, although the influence of crystal packing is sometimes important and should always be borne in mind. Torsional distributions derived from the Cambridge Structural Database (CSD) can be used to map out potential-energy surfaces and thereby help identify experimentally validated conformational minima of molecules with several rotatable bonds. The use of crystallographic data in this way is complementary to in vacuo theoretical calculations since it gives insights into conformational preferences in condensed-phase situations. Crystallographic data also underpin many molecular-fragment libraries and programs for generating three-dimensional models from two-dimensional chemical structures. The modelling of ligand binding to metalloenzymes is assisted by information in the CSD on preferred coordination numbers and geometries. CSD data on intermolecular interactions are useful in structure-based inhibitor design both in indicating how probable a protein,ligand interaction is and what its geometry is likely to be. They can also be used to guide searches for bioisosteric replacements. Crystallographically derived information has contributed to many life-science software applications, including programs for locating binding `hot spots' on proteins, docking ligands into enzyme active sites, de novo ligand design, molecular superposition and three-dimensional QSAR. Overall, crystallographic data in general, and the CSD in particular, are very significant tools for the rational design of biologically active molecules. [source]
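One standard route from a torsional distribution to an energy profile is inverse-Boltzmann statistics; the abstract does not specify the method, so the Python below should be read as an assumption-laden sketch, with synthetic angles standing in for CSD-derived data.

```python
import numpy as np

kT = 0.593  # kcal/mol near 298 K
rng = np.random.default_rng(0)

# Synthetic torsion-angle observations (degrees) clustered near
# gauche(+/-) and trans minima, standing in for CSD-derived data.
angles = np.concatenate([rng.normal(60, 12, 500),
                         rng.normal(180, 15, 800),
                         rng.normal(-60, 12, 450)])

counts, edges = np.histogram(angles, bins=36, range=(-180, 180))
prob = (counts + 1) / (counts + 1).sum()       # add-one smoothing
energy = -kT * np.log(prob / prob.max())       # inverse Boltzmann, minimum at 0

centers = 0.5 * (edges[:-1] + edges[1:])
for c, e in zip(centers, energy):
    print(f"{c:8.1f} deg   {e:5.2f} kcal/mol")
```

Bins where observations cluster map to low pseudo-energy, so the printed profile exhibits minima near the populated rotamers, which is the sense in which torsional distributions map out a potential-energy surface.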

