Application Domain


Selected Abstracts


A Multiresolution Model for Soft Objects Supporting Interactive Cuts and Lacerations

COMPUTER GRAPHICS FORUM, Issue 3 2000
Fabio Ganovelli
Performing a truly interactive, physically based simulation of complex soft objects is still an open problem in computer animation/simulation. Given the application domain of virtual surgery training, a complete model should be realistic and interactive, and should enable the user to modify the topology of the objects. Recent papers propose the adoption of multiresolution techniques to optimize time performance by representing at high resolution only the object parts considered most important or critical. The speed-up obtainable at simulation time is counterbalanced by the need for a preprocessing phase strongly dependent on the topology of the object, with the drawback that dynamic topology modification becomes prohibitively expensive. In this paper we present an approach that couples multiresolution and topological modifications, based on a particle-system approach to the physical simulation. Our approach is based on a tetrahedral decomposition of space, chosen both for its suitability to support a particle system and for the ready availability of many techniques recently proposed for the simplification and multiresolution management of 3D simplicial decompositions. The multiresolution simulation system is designed to ensure the required speed-up and to support dynamic changes of the topology, e.g. due to cuts or lacerations of the represented tissue. [source]
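As a rough illustration of the kind of physical model such a system builds on, the sketch below advances a mass-spring particle system defined on the edges of a tetrahedral mesh by one explicit time step. It is a minimal sketch under invented parameters, not the authors' multiresolution scheme.

```python
# Illustrative sketch (not the paper's implementation): one explicit Euler step
# of a mass-spring particle system whose springs are the edges of a
# tetrahedral mesh.
import numpy as np

def step(positions, velocities, edges, rest_lengths,
         k=50.0, damping=0.98, mass=1.0, dt=1e-3):
    """Advance all particle positions by one explicit time step."""
    forces = np.zeros_like(positions)
    for (i, j), l0 in zip(edges, rest_lengths):
        d = positions[j] - positions[i]
        length = np.linalg.norm(d)
        if length > 1e-12:
            f = k * (length - l0) * (d / length)   # linear spring force on i
            forces[i] += f
            forces[j] -= f
    velocities = damping * (velocities + dt * forces / mass)
    return positions + dt * velocities, velocities
```

Cutting or lacerating the tissue would then amount to removing edges (and retetrahedralizing locally), which is why a topology-independent particle model is attractive for this application.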


Parallel processing of remotely sensed hyperspectral imagery: full-pixel versus mixed-pixel classification

CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 13 2008
Antonio J. Plaza
Abstract The rapid development of space and computer technologies makes it possible to store huge amounts of remotely sensed image data, collected using airborne and satellite instruments. In particular, NASA is continuously gathering high-dimensional image data with Earth-observing hyperspectral sensors such as the Jet Propulsion Laboratory's Airborne Visible/Infrared Imaging Spectrometer (AVIRIS), which measures reflected radiation in hundreds of narrow spectral bands at different wavelength channels for the same area on the surface of the Earth. The development of fast techniques for transforming massive amounts of hyperspectral data into scientific understanding is critical for space-based Earth science and planetary exploration. Despite the growing interest in hyperspectral imaging research, only a few efforts in the literature have been devoted to the design of parallel implementations, and detailed comparisons of standardized parallel hyperspectral algorithms are currently unavailable. This paper compares several existing and new parallel processing techniques for pure and mixed-pixel classification in hyperspectral imagery. The distinction between pure and mixed-pixel analysis is linked to the application domain under consideration, and results from the very rich spectral information available from hyperspectral instruments. In some cases, such information allows image analysts to overcome the constraints imposed by limited spatial resolution. In most cases, however, the spectral bands collected by hyperspectral instruments are highly correlated, and efficient parallel techniques are required to reduce the dimensionality of the data while retaining the spectral information that allows for the separation of the classes. To address this issue, this paper also develops a new parallel feature extraction algorithm that integrates spatial and spectral information. The proposed technique is evaluated (from the viewpoint of both classification accuracy and parallel performance) and compared with other parallel techniques for dimensionality reduction and classification in the context of three representative application case studies: urban characterization, land-cover classification in agriculture, and mapping of geological features, using AVIRIS data sets with detailed ground truth. Parallel performance is assessed using Thunderhead, a massively parallel Beowulf cluster at NASA's Goddard Space Flight Center. The detailed cross-validation of parallel algorithms conducted in this work may specifically help image analysts in the selection of parallel algorithms for specific applications. Copyright © 2008 John Wiley & Sons, Ltd. [source]
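To make the data-parallel pattern concrete, the toy sketch below splits a hyperspectral cube across worker processes and labels each pixel by its smallest spectral angle to a set of class spectra. The row-wise partitioning and the function names are assumptions for illustration only, not the algorithms benchmarked in the paper.

```python
# Hedged sketch of data-parallel full-pixel classification: each worker
# classifies a horizontal slice of the image cube independently.
import numpy as np
from multiprocessing import Pool

def spectral_angle(pixels, classes):
    # pixels: (n, bands), classes: (c, bands) -> (n, c) angles in radians
    p = pixels / np.linalg.norm(pixels, axis=1, keepdims=True)
    c = classes / np.linalg.norm(classes, axis=1, keepdims=True)
    return np.arccos(np.clip(p @ c.T, -1.0, 1.0))

def classify_chunk(args):
    chunk, classes = args
    flat = chunk.reshape(-1, chunk.shape[-1])
    labels = np.argmin(spectral_angle(flat, classes), axis=1)
    return labels.reshape(chunk.shape[:2])

def parallel_classify(cube, classes, workers=4):
    chunks = np.array_split(cube, workers, axis=0)   # split image rows
    with Pool(workers) as pool:
        parts = pool.map(classify_chunk, [(ch, classes) for ch in chunks])
    return np.concatenate(parts, axis=0)
```

Mixed-pixel analysis would replace the per-pixel label with a vector of endmember abundances, but the same partitioning of the cube across processors applies.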


GDKAT: A goal-driven knowledge acquisition tool for knowledge base development

EXPERT SYSTEMS, Issue 2 2000
Chien-Hsing Wu
While knowledge-based systems are being used extensively to assist in decision making, a critical factor that affects their performance and reliability is the quantity and quality of their knowledge bases. Knowledge acquisition requires an in-depth understanding both of knowledge modeling and of the application domain. Many knowledge acquisition tools have been developed to support knowledge base development. However, these tools share a weakness: a domain-dependent and complex acquisition process. Domain dependence limits the areas in which a tool can be applied, and a complex acquisition process makes it difficult to use. In this paper, we present a goal-driven knowledge acquisition tool (GDKAT) that helps elicit and store experts' declarative and procedural knowledge in knowledge bases for a user-defined domain. The tool is implemented using an object-oriented design methodology in a C++ Windows environment. An example used to demonstrate the GDKAT is also delineated. While the application domain for the example presented is reflow soldering in surface-mount printed circuit board assembly, the GDKAT can be used to develop knowledge bases for other domains as well. [source]
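The sketch below illustrates the general idea of goal-driven acquisition: knowledge elicited from an expert is stored under the goal that motivated it. The structure and field names are hypothetical, invented for illustration, and are not taken from GDKAT itself.

```python
# Minimal sketch, assuming a goal-indexed knowledge base: each user-defined
# goal groups the declarative facts and procedural rules elicited for it.
from dataclasses import dataclass, field

@dataclass
class Rule:
    conditions: list   # e.g. ["reflow_peak_temp > 245"]
    action: str        # e.g. "flag profile for review"

@dataclass
class Goal:
    name: str
    facts: dict = field(default_factory=dict)
    rules: list = field(default_factory=list)

kb = {}

def acquire(goal_name, facts, rules):
    """Store expert knowledge under the goal that drove its elicitation."""
    goal = kb.setdefault(goal_name, Goal(goal_name))
    goal.facts.update(facts)
    goal.rules.extend(rules)

acquire("avoid solder bridging",
        {"paste_type": "no-clean"},
        [Rule(["reflow_peak_temp > 245"], "flag profile for review")])
```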


The application of knowledge discovery in databases to post-marketing drug safety: example of the WHO database

FUNDAMENTAL & CLINICAL PHARMACOLOGY, Issue 2 2008
A. Bate
Abstract After market launch, new information on adverse effects of medicinal products is almost exclusively first highlighted by spontaneous reporting. As data sets of spontaneous reports have become larger and computational capability has increased, quantitative methods have been increasingly applied to such data sets. The screening of such data sets is an application of knowledge discovery in databases (KDD). Effective KDD is an iterative and interactive process made up of the following steps: developing an understanding of an application domain, creating a target data set, data cleaning and pre-processing, data reduction and projection, choosing the data mining task, choosing the data mining algorithm, data mining, interpretation of results, and consolidation and use of the acquired knowledge. The process of KDD as it applies to the analysis of spontaneous reports can be exemplified by its routine use on the 3.5 million suspected adverse drug reaction (ADR) reports in the WHO ADR database. Examples of new adverse effects first highlighted by the KDD process on WHO data include topiramate glaucoma, infliximab vasculitis and the association between selective serotonin reuptake inhibitors (SSRIs) and neonatal convulsions. The KDD process has already improved our ability to highlight previously unsuspected ADRs for clinical review in spontaneous reporting, and we anticipate that such techniques will be increasingly used in the successful screening of other healthcare data sets such as patient records in the future. [source]
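The data-mining step in such a pipeline is often a disproportionality screen over drug-reaction pairs. The sketch below shows a simplified observed-versus-expected measure purely as an illustration of that step; it is a generic toy, not the method used operationally on the WHO database.

```python
# Illustrative disproportionality screen: score each drug-ADR pair by how much
# more often it is co-reported than expected under independence.
import math
from collections import Counter

def disproportionality(reports):
    """reports: iterable of (drug, reaction) pairs from spontaneous reports."""
    pair_counts = Counter(reports)
    drug_counts = Counter(d for d, _ in reports)
    reaction_counts = Counter(r for _, r in reports)
    total = sum(pair_counts.values())
    scores = {}
    for (drug, reaction), observed in pair_counts.items():
        expected = drug_counts[drug] * reaction_counts[reaction] / total
        # log2 ratio of observed to expected co-reporting, lightly smoothed
        scores[(drug, reaction)] = math.log2((observed + 0.5) / (expected + 0.5))
    return scores

signals = disproportionality([("drugA", "glaucoma"), ("drugA", "nausea"),
                              ("drugB", "nausea"), ("drugA", "glaucoma")])
```

High-scoring pairs would then go to clinical review, which corresponds to the interpretation and consolidation steps of the KDD process described above.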


Lagrangian finite element treatment of transient vibration/acoustics of biosolids immersed in fluids

INTERNATIONAL JOURNAL FOR NUMERICAL METHODS IN ENGINEERING, Issue 5 2008
P. Krysl
Abstract The superposition principle is used to separate the incident acoustic wave from the scattered and radiated waves in a displacement-based finite element model. An absorbing boundary condition is applied to the perturbation part of the displacement. A linear constitutive equation allows for inhomogeneous, anisotropic materials, both fluids and solids. Displacement-based finite elements are used for all materials in the computational volume. Robust performance for materials with limited compressibility is achieved using assumed-strain nodally integrated simplex elements or incompatible-mode brick elements. A centered-difference time-stepping algorithm is formulated to handle general damping accurately and efficiently. Verification problems (the response of an empty steel cylinder immersed in water to a step plane wave, and the scattering of harmonic plane waves from an elastic sphere) are discussed for assumed-strain simplex and for voxel-based brick finite element models. A voxel-based modeling scheme for complex biological geometries is described, and two illustrative results are presented from the bioacoustics application domain: reception of sound by the human ear and simulation of biosonar in beaked whales. Copyright © 2007 John Wiley & Sons, Ltd. [source]
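For readers unfamiliar with explicit dynamics, the sketch below shows a generic central-difference time-stepping loop for a damped system of the form M a + C v + K u = f, under the simplifying assumption of lumped (diagonal) mass and damping matrices. It is a minimal sketch of the standard scheme, not the formulation derived in the paper.

```python
# Generic central-difference integrator for M*a + C*v + K*u = f(t),
# with M and C given as diagonal (lumped) vectors and K as a dense matrix.
import numpy as np

def central_difference(M, C, K, f, u0, v0, dt, steps):
    u_prev = u0 - dt * v0                       # fictitious step at t = -dt
    u = u0.copy()
    history = [u0.copy()]
    for n in range(steps):
        lhs = M / dt**2 + C / (2 * dt)          # element-wise: lumped matrices
        rhs = (f(n * dt) - K @ u
               + (2 * M / dt**2) * u
               - (M / dt**2 - C / (2 * dt)) * u_prev)
        u_next = rhs / lhs                      # diagonal solve
        u_prev, u = u, u_next
        history.append(u.copy())
    return np.array(history)
```

Because the left-hand side stays diagonal, each step costs only a matrix-vector product with K, which is what makes such explicit schemes attractive for large vibro-acoustic models.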


Humanoids and personal robots: Design and experiments

JOURNAL OF FIELD ROBOTICS (FORMERLY JOURNAL OF ROBOTIC SYSTEMS), Issue 12 2001
Paolo Dario
This paper addresses the field of humanoid and personal robotics: its objectives, motivations, and technical problems. The approach described in the paper is based on the analysis of humanoid and personal robots as an evolution from industrial to advanced and service robotics, driven by the need for helpful machines, as well as a synthesis of the dream of replicating humans. The first part of the paper describes the development of anthropomorphic components for humanoid robots, with particular regard to anthropomorphic sensors for vision and touch, an eight-d.o.f. arm, a three-fingered hand with sensorized fingertips, and control schemes for grasping. Then, the authors propose a user-oriented design methodology for personal robots, and describe their experience in the design, development, and validation of a real personal robot composed of a mobile unit integrating some of the anthropomorphic components introduced previously and aimed at operating in a distributed working environment. Based on the analysis of experimental results, the authors conclude that humanoid robotics is a tremendous and attractive technical and scientific challenge for robotics research. The real utility of humanoids has yet to be demonstrated, but personal assistance can be envisaged as a promising application domain. Personal robotics also poses difficult technical problems, especially related to the need for achieving adequate safety, proper human-robot interaction, useful performance, and affordable cost. When these problems are solved, personal robots will have an excellent chance for significant application opportunities, especially if integrated into future home automation systems, and if supported by the availability of humanoid robots. © 2001 John Wiley & Sons, Inc. [source]


Unifying clones with a generative programming technique: a case study

JOURNAL OF SOFTWARE MAINTENANCE AND EVOLUTION: RESEARCH AND PRACTICE, Issue 4 2006
Stan Jarzabek
Abstract Software clones (similar program structures repeated in variant forms) increase the risk of update anomalies and inflate program size and complexity, possibly contributing to high maintenance costs. Yet, programs are often polluted by clones. In this paper, we present a case study of cloning in the Java Buffer library, JDK 1.5. We found that at least 68% of the code in the Buffer library was contained in cloned classes or class methods. Close analysis of the program situations that led to cloning revealed difficulties in eliminating clones with conventional program design techniques. As a possible solution, we applied a generative technique, XVCL (XML-based Variant Configuration Language), to represent similar classes and methods in a generic, adaptable form. Concrete buffer classes could then be produced automatically from the generic structures. We argue, on analytical and empirical grounds, that unifying clones reduced conceptual complexity and enhanced the changeability of the Buffer library at rates proportional to the code size reduction (68%). We evaluated our solution in qualitative and quantitative ways, and conducted a controlled experiment to support this claim. The approach presented in the paper can be used to enhance the genericity and changeability of any program, independently of its application domain or programming language. As the solution is not without pitfalls, we discuss the trade-offs involved in applying it in projects. Copyright © 2006 John Wiley & Sons, Ltd. [source]
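The sketch below conveys the underlying idea only: a single generic structure from which concrete variants are generated by parameter substitution, instead of maintaining near-identical cloned classes by hand. XVCL itself is XML-based; the Python templating here, and the class and field names, are assumptions chosen to keep the example short.

```python
# Toy illustration of clone unification by generation: one generic "frame"
# produces the type-specific buffer variants that would otherwise be clones.
from string import Template

GENERIC_BUFFER = Template("""
class ${Type}Buffer:
    def __init__(self, capacity):
        self.data = [${zero}] * capacity
        self.position = 0

    def put(self, value: ${py_type}):
        self.data[self.position] = value
        self.position += 1
""")

VARIANTS = [
    {"Type": "Int",   "py_type": "int",   "zero": "0"},
    {"Type": "Float", "py_type": "float", "zero": "0.0"},
]

generated = "\n".join(GENERIC_BUFFER.substitute(v) for v in VARIANTS)
print(generated)   # concrete classes produced from the one generic structure
```

A change made once in the generic structure propagates to every generated variant, which is the source of the changeability gains the abstract reports.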


Architecture-based semantic evolution of embedded remotely controlled systems

JOURNAL OF SOFTWARE MAINTENANCE AND EVOLUTION: RESEARCH AND PRACTICE, Issue 3 2003
Lawrence Chung
Abstract Evolution of a software system is a natural process. In most systems, evolution takes place during the maintenance phase of their life cycles. Systems that have reached their limit of evolution have usually reached the end of their useful life and may have to be replaced. However, there are systems in which evolution occurs during the operational phase of their life cycles. Such systems are designed to evolve while in use or, in other words, to be adaptable. Semantically adaptable systems are of particular interest to industry, as such systems often adapt themselves to environmental change with little or no intervention from their developing or maintaining organization. Since embedded systems usually have a restricted hardware configuration, it is difficult to apply the techniques developed for non-embedded systems directly to embedded systems. This paper focuses on evolution through adaptation and develops the concepts and techniques for semantic evolution in embedded systems. As the first step in the development of a software solution, the architectures of software systems themselves have to be made semantically evolvable. In this paper we explore various architectural alternatives for the semantic evolution of embedded systems; these architectures are based on four different techniques that we have identified for semantic evolution in embedded systems. The development of these architectures follows the systematic process provided by the non-functional requirement (NFR) framework, which also permits the architectures to be rated in terms of their evolvability. As the field of embedded systems is vast, this paper concentrates on embedded systems that can be remotely controlled. In this application domain the embedded system is connected to an external controller by a communication link such as Ethernet, serial, radio frequency, etc., and receives commands from and sends responses to the external controller via the communication link. The architectures developed in this paper have been partly validated by applying them in a real embedded system: a test instrument used for testing cell phones. These architectures and techniques for semantic evolution in this application domain give a glimpse of what can be done to achieve semantic evolution in software-implemented systems. Copyright © 2003 John Wiley & Sons, Ltd. [source]
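The sketch below illustrates, in a generic way, how a remotely controlled embedded system can be made semantically evolvable: commands arriving over the communication link are dispatched through a table that can be extended at run time, so new behaviour can be added without replacing the dispatcher. This does not reproduce any specific architecture from the paper; the command names are invented.

```python
# Generic command dispatcher for a remotely controlled instrument; new
# commands can be registered while the system is in service.
class CommandDispatcher:
    def __init__(self):
        self.handlers = {}

    def register(self, name, handler):
        """Install or replace the behaviour bound to a command name."""
        self.handlers[name] = handler

    def dispatch(self, line):
        name, *args = line.strip().split()
        handler = self.handlers.get(name)
        if handler is None:
            return f"ERR unknown command {name}"
        return handler(*args)

dispatcher = CommandDispatcher()
dispatcher.register("PING", lambda: "PONG")
# Later, while the instrument is deployed, a new capability is added:
dispatcher.register("MEAS:POWER", lambda ch="1": f"power on channel {ch}")
print(dispatcher.dispatch("MEAS:POWER 2"))
```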


Tailoring the software maintenance process to better support complex systems evolution projects

JOURNAL OF SOFTWARE MAINTENANCE AND EVOLUTION: RESEARCH AND PRACTICE, Issue 1 2003
Paolo Donzelli
Abstract When an organization considers the evolution of a software-intensive system, the selection of the software maintenance process to be adopted must take into account the relevant technical criteria, such as the application domain, the size and complexity of the final product, the hosting system characteristics, etc., yet be driven by the specific organization's goals, environment and maturity. By describing and analysing a real project, this paper shows how different approaches and techniques, usually applied in isolation, can be selected, customized and combined to implement a software maintenance and evolution process that better satisfies the goals and meets the constraints of the organization. The project was undertaken to investigate the feasibility of enhancing an aircraft avionics system by integrating new capabilities and, eventually, to identify a quick, low-cost and low-risk solution. Copyright © 2003 John Wiley & Sons, Ltd. [source]


MPEG-7 in practice: Analysis of a television news retrieval application

JOURNAL OF THE AMERICAN SOCIETY FOR INFORMATION SCIENCE AND TECHNOLOGY, Issue 9 2007
Nastaran Fatemi
This article provides an overview of our experiments in using MPEG-7 in a television news retrieval application. Our study is based on a survey of professional users in the Télévision Suisse Romande (TSR) television news production environment. We address two main issues. First, we describe how the generic and voluminous MPEG-7 Schema can be exploited in the context of a specific application domain. Second, we discuss the problem of how to search MPEG-7 descriptions, which are detailed and complex by nature, via a high-level, user-oriented retrieval model. [source]
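The small example below hints at the gap the article discusses: audiovisual descriptions are deeply nested XML, so even a simple user request ("news segments mentioning a given keyword") becomes a structural query over the description tree. The element names are simplified placeholders, not the actual MPEG-7 schema.

```python
# Toy query over an MPEG-7-like description of a news programme.
import xml.etree.ElementTree as ET

description = """
<NewsDescription>
  <Segment start="00:00:12" end="00:01:05">
    <Keyword>election</Keyword><Speaker>anchor</Speaker>
  </Segment>
  <Segment start="00:01:05" end="00:02:30">
    <Keyword>weather</Keyword><Speaker>reporter</Speaker>
  </Segment>
</NewsDescription>
"""

root = ET.fromstring(description)
hits = [seg.get("start") for seg in root.findall("Segment")
        if any(k.text == "election" for k in seg.findall("Keyword"))]
print(hits)   # start times of segments matching the high-level query
```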


Automated software development with XML and the Java language

BELL LABS TECHNICAL JOURNAL, Issue 2 2000
Glenn R. Bruns
In software development with domain-specific languages (DSLs), one defines a requirements language for an application domain and then develops a compiler to generate an implementation from a requirements document. Because DSLs and DSL compilers are expensive to develop, DSLs are seen as cost effective only when many products of the same domain will be developed. In this paper, we show how the cost of DSL design and DSL compiler development can be reduced by defining DSLs as Extensible Markup Language (XML) dialects and by developing DSL compilers using commercial XML tools and the Java language. This approach is illustrated through the Call View Data Language (CDL), a new DSL that generates provisioning support code and database table definitions for Lucent Technologies' 7R/E™ Network Feature Server. [source]
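The sketch below shows the general shape of the approach, a requirements document written in an XML dialect plus a small generator that emits implementation artifacts. It is not CDL: the element names, attributes and generated SQL are invented for illustration.

```python
# Hedged sketch of an XML-dialect DSL: a requirements document listing data
# fields is "compiled" into a database table definition.
import xml.etree.ElementTree as ET

requirements = """
<callview name="subscriber_profile">
  <field name="subscriber_id" type="INTEGER"/>
  <field name="forwarding_number" type="VARCHAR(20)"/>
</callview>
"""

def generate_table(doc):
    root = ET.fromstring(doc)
    cols = ",\n  ".join(f'{f.get("name")} {f.get("type")}'
                        for f in root.findall("field"))
    return f'CREATE TABLE {root.get("name")} (\n  {cols}\n);'

print(generate_table(requirements))
```

Using off-the-shelf XML parsers for the front end is what removes most of the cost of building a dedicated DSL parser, which is the economic argument the abstract makes.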


Surface smoothing and quality improvement of quadrilateral/hexahedral meshes with geometric flow

INTERNATIONAL JOURNAL FOR NUMERICAL METHODS IN BIOMEDICAL ENGINEERING, Issue 1 2009
Yongjie Zhang
Abstract This paper describes an approach to smoothing the surface and improving the quality of quadrilateral/hexahedral meshes, with features preserved, using geometric flow. For quadrilateral surface meshes, the surface diffusion flow is selected to remove noise by relocating vertices in the normal direction, and the aspect ratio is improved, with features preserved, by adjusting vertex positions in the tangent direction. For hexahedral meshes, besides the surface vertex movement in the normal and tangent directions, interior vertices are relocated to improve the aspect ratio. Our method has the properties of noise removal, feature preservation and quality improvement of quadrilateral/hexahedral meshes, and it is especially suitable for biomolecular meshes because the surface diffusion flow preserves a sphere accurately if the initial surface is close to a sphere. Several demonstration examples are provided from a wide variety of application domains. Some extracted meshes have been extensively used in finite element simulations. Copyright © 2007 John Wiley & Sons, Ltd. [source]
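The sketch below illustrates the decomposition the abstract relies on: moving each vertex toward the average of its neighbours, split into a normal component (noise removal) and a tangential component (aspect-ratio improvement). It is a simple Laplacian-style pass under invented step sizes, not the surface diffusion flow used in the paper.

```python
# Illustrative smoothing pass: each vertex update is split into components
# along the vertex normal and within the tangent plane.
import numpy as np

def smooth(vertices, neighbours, normals, step_normal=0.5, step_tangent=0.5):
    """vertices: (n,3); neighbours: {i: [j,...]}; normals: (n,3) unit vectors."""
    new_vertices = vertices.copy()
    for i, nbrs in neighbours.items():
        centroid = vertices[nbrs].mean(axis=0)
        d = centroid - vertices[i]
        n = normals[i]
        d_normal = np.dot(d, n) * n          # component along the normal
        d_tangent = d - d_normal             # component in the tangent plane
        new_vertices[i] += step_normal * d_normal + step_tangent * d_tangent
    return new_vertices
```

Feature preservation would additionally freeze or constrain vertices tagged as lying on sharp edges, so that the tangential relaxation cannot smear them out.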


Fuzzy quantification in two real scenarios: Information retrieval and mobile robotics

INTERNATIONAL JOURNAL OF INTELLIGENT SYSTEMS, Issue 6 2009
Félix Díaz-Hermida
Fuzzy quantification supplies powerful tools for handling linguistic expressions. Nevertheless, its advantages are usually shown at the theoretical level, without proper empirical validation. In this work, we review the application of fuzzy quantification in two application domains. We provide empirical evidence on the adequacy of fuzzy quantification to support different tasks in the context of mobile robotics and information retrieval. This practical perspective aims at exemplifying the actual benefits that real applications can obtain from fuzzy quantifiers. © 2009 Wiley Periodicals, Inc. [source]
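As a minimal sketch of the basic idea, the example below evaluates a quantified statement such as "most documents are relevant" by passing a relative sigma-count of fuzzy membership degrees through a fuzzy quantifier. The quantifier shape and the membership values are illustrative assumptions, not the models evaluated in the paper.

```python
# Zadeh-style evaluation of the fuzzy quantified sentence "most X are A".
def most(proportion, lower=0.3, upper=0.8):
    """Piecewise-linear fuzzy quantifier 'most'."""
    if proportion <= lower:
        return 0.0
    if proportion >= upper:
        return 1.0
    return (proportion - lower) / (upper - lower)

relevance = [0.9, 0.7, 0.2, 0.8, 0.6]          # fuzzy degrees of relevance
sigma_count = sum(relevance) / len(relevance)  # relative sigma-count
print(most(sigma_count))                       # truth of "most are relevant"
```

In information retrieval such a score can rank documents against quantified criteria; in mobile robotics the same machinery can ground expressions like "most of the sensors report free space".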


A performance measurement paradigm for integrating strategy formulation: A review of systems and frameworks

INTERNATIONAL JOURNAL OF MANAGEMENT REVIEWS, Issue 1 2005
Kit Fai Pun
Measuring organizational performance plays a very important part in translating corporate strategy into results. Various emerging (non-traditional) performance systems have recently been devised to aid firms in selecting and implementing measures. This paper discusses strategy/measurement initiatives and compares ten emerging performance measurement systems with respect to a list of performance dimensions, the characteristics of performance measures, and the requirements of the development process. Although these systems have constraints tied to their own application domains, they stand by themselves empirically and/or theoretically, and provide guidance about what to measure and how to design performance measures that can be linked to the corporate strategy and objectives of an organization. This paper concludes that there is a need to develop a paradigm for integrating strategy formulation and performance measurement in organizations. [source]


The role of competence level in the self-efficacy-skills relationship: an empirical examination of the skill acquisition process and its implications for information technology training

INTERNATIONAL JOURNAL OF TRAINING AND DEVELOPMENT, Issue 2 2009
James P. Downey
The role of computer training has long been critical in organizations as reliance on technology for strategic advantage increases in importance. How to conduct such training most effectively has clear implications for organizations. This study examines one area of training that is not well understood: the role that competence level plays in the self-efficacy-competence relationship (if indeed it plays a role at all) during skill acquisition. Two opposing conceptual positions are presented from the literature, one suggesting that the relationship between self-efficacy and competence will be stronger early in the skill acquisition process (when competence is minimal), the other suggesting that the relationship will be stronger at mastery. Using a sample of over 600 respondents and structural equation modeling, the relationship between self-efficacy and competence in six different computing application domains is tested by dividing the respondents in each domain in half according to competence level. Results empirically demonstrate that level of competence makes a significant difference in these domains, and that those higher in ability typically have a stronger relationship with self-efficacy. Results also show that the relationship is weaker for those new to an application and for those who have mastered it. The important implications for training are discussed. [source]


Very large-scale neighborhood search

INTERNATIONAL TRANSACTIONS IN OPERATIONAL RESEARCH, Issue 4-5 2000
R.K. Ahuja
Abstract Neighborhood search algorithms are often the most effective approaches available for solving partitioning problems, a difficult class of combinatorial optimization problems arising in many application domains including vehicle routing, telecommunications network design, parallel machine scheduling, location theory, and clustering. A critical issue in the design of a neighborhood search algorithm is the choice of the neighborhood structure, that is, the manner in which the neighborhood is defined. Currently, the two-exchange neighborhood is the most widely used neighborhood for solving partitioning problems. The paper describes the cyclic exchange neighborhood, which is a generalization of the two-exchange neighborhood in which a neighbor is obtained by performing a cyclic exchange. The cyclic exchange neighborhood has substantially more neighbors compared to the two-exchange neighborhood. This paper outlines a network optimization based methodology to search the neighborhood efficiently and presents a proof of concept by applying it to the capacitated minimum spanning tree problem, an important problem in telecommunications network design. [source]
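To make the cyclic exchange concrete, the sketch below applies one such move to a partition: an element from each subset in the cycle is transferred to the next subset, generalizing the familiar two-exchange swap. This shows only what the move does; the paper's contribution, searching the exponentially large neighborhood efficiently via an improvement graph, is not reproduced here.

```python
# One cyclic exchange move on a partition of elements into subsets.
def cyclic_exchange(partition, cycle):
    """partition: list of sets; cycle: list of (subset_index, element) pairs.

    The element of cycle[k] moves into the subset of cycle[k+1] (cyclically)."""
    new_partition = [set(s) for s in partition]
    for k, (subset, element) in enumerate(cycle):
        target = cycle[(k + 1) % len(cycle)][0]
        new_partition[subset].remove(element)
        new_partition[target].add(element)
    return new_partition

parts = [{1, 2}, {3, 4}, {5, 6}]
# Move 2 -> subset 1, 4 -> subset 2, 6 -> subset 0 in a single move.
print(cyclic_exchange(parts, [(0, 2), (1, 4), (2, 6)]))
```

With a cycle of length two this reduces to the classical two-exchange, which is why the cyclic neighborhood strictly contains the two-exchange neighborhood.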


Test processes in software product evolution: a qualitative survey on the state of practice

JOURNAL OF SOFTWARE MAINTENANCE AND EVOLUTION: RESEARCH AND PRACTICE, Issue 1 2003
Per Runeson
Abstract In order to understand the state of test process practices in the software industry, we have conducted a qualitative survey covering software development departments at 11 companies in Sweden of different sizes and application domains. The companies develop products in an evolutionary manner, which means that either new versions are released regularly, or new product variants are released under new names. The survey was conducted through workshop and interview sessions, loosely guided by a questionnaire scheme. The main conclusions of the survey are that the documented development process is emphasized by larger organizations as a key asset, while smaller organizations tend to lean more on experienced people. Further, product evolution is performed primarily as new product variants for embedded systems, and as new versions for packaged software. The development is structured using incremental development or a daily-build approach; increments are used among more process-focused organizations, and daily build is more frequently utilized in less process-focused organizations. Test automation is performed using scripts for products with a focus on functionality, and recorded data for products with a focus on non-functional properties. Test automation is an issue which most organizations want to improve; handling the legacy parts of the product and related documentation presents a common problem in improvement efforts for product evolution. Copyright © 2003 John Wiley & Sons, Ltd. [source]


Behavioural modelling of long-lived evolution processes: some issues and an example

JOURNAL OF SOFTWARE MAINTENANCE AND EVOLUTION: RESEARCH AND PRACTICE, Issue 5 2002
M. M. Lehman
Abstract For reasons now well understood, application software that is regularly used for real-world problem solving must be continually adapted and enhanced to maintain its fitness to an ever-changing real world, its applications and application domains. This type of activity is termed progressive. As evolution continues, the complexity (functional, structural) of the evolving system is likely to increase unless work, termed anti-regressive, is undertaken to control and even reduce it. However, with progressive and anti-regressive work naturally competing for the same pool of resources, management requires a means of estimating the amount of work and resources to be applied to each of the two types. After providing the necessary background, the paper describes a systems dynamics model that can serve as the core of a tool to support decision making regarding optimal personnel allocation over the system lifetime. The model is provided as an example of the use of formalisms in modelling the behaviour of the evolution process. Copyright © 2002 John Wiley & Sons, Ltd. [source]
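The toy simulation below conveys the trade-off the abstract describes: progressive work adds functionality but also complexity, anti-regressive work reduces complexity, and the productivity of progressive work falls as complexity grows. The equations and coefficients are invented for illustration and are not the paper's systems dynamics model.

```python
# Toy system-dynamics sketch of allocating effort between progressive and
# anti-regressive work over a sequence of releases.
def simulate(total_effort=10.0, anti_regressive_share=0.3, releases=20):
    functionality, complexity = 0.0, 1.0
    for _ in range(releases):
        progressive = total_effort * (1 - anti_regressive_share)
        anti_regressive = total_effort * anti_regressive_share
        productivity = 1.0 / complexity            # declines as complexity grows
        functionality += progressive * productivity
        complexity += 0.1 * progressive - 0.2 * anti_regressive
        complexity = max(complexity, 1.0)
    return functionality

# Sweep the allocation to see where cumulative functionality peaks.
best = max((simulate(anti_regressive_share=s / 10), s / 10) for s in range(10))
print(best)
```

Even this crude model exhibits the qualitative behaviour of interest: spending nothing on anti-regressive work maximizes short-term output but depresses long-term delivery, which is precisely the allocation question the tool is meant to support.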


Use of Physicochemical Property Limits to Develop Rules for Identifying Chemical Substances with no Skin Irritation or Corrosion Potential

MOLECULAR INFORMATICS, Issue 9 2004
Ingrid Gerner
Abstract This is believed to be the first paper to promote the use of rules based on (quantitative) structure-activity relationship [(Q)SAR] models for identifying chemicals that are not likely to cause a specific adverse health effect, viz., skin irritation or corrosion. The purpose of this paper is to describe limit values for specific physicochemical properties that are appropriate for identifying chemical substances that have no skin irritation or corrosion potential. These physicochemical properties include melting point, molecular weight, octanol-water partition coefficient, surface tension, vapour pressure, aqueous solubility and lipid solubility. Based on analyses of 1833 chemicals, limit values for these properties were defined such that chemicals whose properties lie above or below the limits can be identified as having no skin irritation or corrosion potential. To facilitate classification and labeling, the application domains of these limits were constructed to correspond with the European Union's risk phrases for chemicals classified for skin irritation/corrosion, viz., R34, R35 or R38. This is the second of four companion papers. The first paper discussed mechanisms that can lead to significant skin irritation or corrosion after acute exposure to chemicals. The third paper described the application of structural alerts to identify chemical substances with skin irritation or corrosion potential. The fourth paper described the Skin Irritation Corrosion Rules Estimation Tool (SICRET), a user-friendly tool that allows non-(Q)SAR experts to identify chemical substances with skin irritation or corrosion potential based on physicochemical property limits and structural alerts. [source]
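The sketch below shows how such exclusion rules can be applied in practice as a simple screen over measured properties. The property names mirror those listed in the abstract, but the numeric thresholds are placeholders only; they are not the limit values derived in the paper.

```python
# Hedged rule screen: a chemical is flagged as having no irritation/corrosion
# potential if any exclusion rule fires. All thresholds below are hypothetical.
HYPOTHETICAL_LIMITS = {
    "melting_point_C":  ("greater_than", 200.0),   # placeholder threshold
    "molecular_weight": ("greater_than", 500.0),   # placeholder threshold
    "log_kow":          ("less_than",   -2.0),     # placeholder threshold
}

def no_irritation_potential(properties):
    """Return True if any exclusion rule fires for the given property values."""
    for prop, (relation, limit) in HYPOTHETICAL_LIMITS.items():
        value = properties.get(prop)
        if value is None:
            continue                                # missing data: rule cannot fire
        if relation == "greater_than" and value > limit:
            return True
        if relation == "less_than" and value < limit:
            return True
    return False

print(no_irritation_potential({"molecular_weight": 620.0, "log_kow": 1.2}))
```

In the companion papers this property-based screen is combined with structural alerts, so a substance is only exempted from testing when both lines of evidence agree.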


Knowledge Translation in International Emergency Medical Care

ACADEMIC EMERGENCY MEDICINE, Issue 11 2007
L. Kristian Arnold MD
More than 90% of the world population receives emergency medical care from different types of practitioners with little or no specific training in the field and with variable guidance and oversight. Emergency medical care is being recognized by actively practicing physicians around the world as an increasingly important domain in the overall health services package for a community. The know-do gap is well recognized as a major impediment to high-quality health care in much of the world. Knowledge translation principles for application in this highly varied young domain will require investigation of numerous aspects of the knowledge synthesis, exchange, and application domains in order to bring the greatest benefit of both explicit and tacit knowledge to increasing numbers of the world's population. This article reviews some of the issues particular to knowledge development and transfer in the international domain. The authors present a set of research proposals developed from a several-month online discussion among practitioners and teachers of emergency medical care in 16 countries from around the globe and from all economic strata, aimed at improving the flow of knowledge from developers and repositories of knowledge to the front lines of clinical care. [source]