Software System (software + system)

Selected Abstracts


Software systems: The missing element in M&A planning

JOURNAL OF CORPORATE ACCOUNTING & FINANCE, Issue 2 2003
Mike Fitzgerald
In the rush to complete a merger, the problem of merging two different IT systems often gets short shrift. Applications software, operations, and methodologies are particularly important to consider. Merging two companies will produce endless problems if their software isn't compatible. The author shows how to avoid that nightmare. What factors should you consider? How can you steer clear of common mistakes? © 2003 Wiley Periodicals, Inc. [source]


A direct circuit experiment system in non-immersive virtual environments for education and entertainment

COMPUTER APPLICATIONS IN ENGINEERING EDUCATION, Issue 2 2005
Quang-Cherng Hsu
Abstract This article contributes to the goal of "The Popular Science Teaching Research Project" and aims to enhance the programming abilities of mechanical engineering students. The example topics are drawn from physical science and include the battery, the lamp, and the electric circuit. These materials are designed using virtual-reality technology and are suitable for students as early as the fourth grade of primary school. They help students become familiar with new computer technology and provide an opportunity to study while playing virtual-reality computer games. The benefits of the developed virtual-reality application software system are virtualization of teaching equipment, reduced cost of teaching materials, unrestricted teaching styles, and optimization of learning procedures. © 2005 Wiley Periodicals, Inc. Comput Appl Eng Educ 13: 146–152, 2005; Published online in Wiley InterScience (www.interscience.wiley.com); DOI 10.1002/cae.20044 [source]


A decentralized and fault-tolerant Desktop Grid system for distributed applications

CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 3 2010
Heithem Abbes
Abstract This paper proposes a decentralized and fault-tolerant software system for managing Desktop Grid resources. Its main design principle is to eliminate the need for a centralized server, thereby removing the single point of failure and bottleneck of existing Desktop Grids. Instead, each node can alternately play the role of client or server. Our main contribution is the design of the PastryGrid protocol (based on Pastry) for Desktop Grids, which supports a wider class of applications, especially distributed applications with precedence constraints between tasks. We evaluate our approach against a centralized system on 205 machines executing 2500 tasks. The results show that our decentralized system outperforms XtremWeb-CH, configured as a master/slave system, with respect to turnaround time. Copyright © 2009 John Wiley & Sons, Ltd. [source]
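To make the precedence handling concrete, here is a minimal, hypothetical Python sketch of dependency-aware task dispatch of the kind a Desktop Grid supporting inter-task precedence must perform; it is not the PastryGrid protocol itself, and all names are illustrative.

```python
# Minimal sketch of precedence-aware task dispatch, as needed by a Desktop
# Grid that must respect dependencies between tasks. This is NOT the
# PastryGrid protocol; names and structure are illustrative only.
from collections import deque

def schedule(tasks, deps):
    """tasks: iterable of task ids; deps: dict task -> set of prerequisites.
    Yields tasks in an order that respects all precedence constraints."""
    remaining = {t: set(deps.get(t, ())) for t in tasks}
    ready = deque(t for t, d in remaining.items() if not d)
    done = set()
    while ready:
        t = ready.popleft()
        yield t                      # in a real grid: dispatch to a free node
        done.add(t)
        for u, d in remaining.items():
            if t in d:
                d.discard(t)
                if not d and u not in done and u not in ready:
                    ready.append(u)

# Example: t3 depends on t1 and t2.
print(list(schedule(["t1", "t2", "t3"], {"t3": {"t1", "t2"}})))
```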


Advanced eager scheduling for Java-based adaptive parallel computing

CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 7-8 2005
Michael O. Neary
Abstract Javelin 3 is a software system for developing large-scale, fault-tolerant, adaptively parallel applications. When all or part of an application can be cast as a master-worker or branch-and-bound computation, Javelin 3 frees application developers from concerns about inter-processor communication and fault tolerance among networked hosts, allowing them to focus on the underlying application. The paper describes a fault-tolerant task scheduler and its performance analysis. The task scheduler integrates work stealing with an advanced form of eager scheduling. It enables dynamic task decomposition, which improves host load-balancing in the presence of tasks whose non-uniform computational load is evident only at execution time. Speedup measurements of actual performance on up to 1000 hosts are presented. We analyze the expected performance degradation due to unresponsive hosts, and measure the actual performance degradation due to unresponsive hosts. Copyright © 2005 John Wiley & Sons, Ltd. [source]
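The combination of work taking and eager scheduling can be illustrated with a toy simulation (this is not Javelin 3's scheduler; the failure rate, host names, and shared pool are invented): idle hosts first take unstarted work, and once none remains they eagerly re-execute tasks whose results have not yet arrived, so slow or failed hosts cannot stall the computation.

```python
# Toy illustration of eager scheduling combined with a shared work pool.
# NOT Javelin 3's actual scheduler; all structure here is invented.
import random
from collections import deque

tasks = deque(range(10))        # unstarted tasks
in_flight = {}                  # task -> host that is (supposedly) running it
results = {}                    # task -> result

def next_task(host):
    if tasks:
        t = tasks.popleft()     # normal case: take unstarted work
    elif in_flight:
        t = random.choice(list(in_flight))   # eager: redo an unfinished task
    else:
        return None
    in_flight[t] = host
    return t

def complete(task, value):
    results[task] = value
    in_flight.pop(task, None)   # idempotent: duplicate completions are fine

while len(results) < 10:
    for host in ("host-A", "host-B"):
        t = next_task(host)
        if t is not None and random.random() > 0.2:   # 20% simulated failures
            complete(t, t * t)
print(sorted(results.items()))
```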


A quality-of-service-based framework for creating distributed heterogeneous software components

CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 12 2002
Rajeev R. Raje
Abstract Component-based software development offers a promising solution for taming the complexity found in today's distributed applications. Today's and future distributed software systems will certainly require combining heterogeneous software components that are geographically dispersed. For the successful deployment of such a software system, it is necessary that its realization, based on assembling heterogeneous components, not only meets the functional requirements, but also satisfies the non-functional criteria such as the desired quality of service (QoS). In this paper, a framework based on the notions of a meta-component model, a generative domain model and QoS parameters is described. A formal specification based on two-level grammar is used to represent these notions in a tightly integrated way so that QoS becomes a part of the generative domain model. A simple case study is described in the context of this framework. Copyright © 2002 John Wiley & Sons, Ltd. [source]


A randomized controlled trial of Sweet Talk, a text-messaging system to support young people with diabetes

DIABETIC MEDICINE, Issue 12 2006
V. L. Franklin
Abstract Aims To assess Sweet Talk, a text-messaging support system designed to enhance self-efficacy, facilitate uptake of intensive insulin therapy and improve glycaemic control in paediatric patients with Type 1 diabetes. Methods One hundred and twenty-six patients fulfilled the eligibility criteria: Type 1 diabetes for > 1 year, on conventional insulin therapy, aged 8–18 years. Ninety-two patients were randomized to conventional insulin therapy (n = 28), conventional therapy and Sweet Talk (n = 33) or intensive insulin therapy and Sweet Talk (n = 31). Goal-setting at clinic visits was reinforced by daily text messages from the Sweet Talk software system, containing personalized goal-specific prompts and messages tailored to patients' age, sex and insulin regimen. Results HbA1c did not change in patients on conventional therapy without or with Sweet Talk (10.3 ± 1.7 vs. 10.1 ± 1.7%), but improved in patients randomized to intensive therapy and Sweet Talk (9.2 ± 2.2%, 95% CI −1.9, −0.5, P < 0.001). Sweet Talk was associated with improvement in diabetes self-efficacy (conventional therapy 56.0 ± 13.7, conventional therapy plus Sweet Talk 62.1 ± 6.6, 95% CI +2.6, +7.5, P = 0.003) and self-reported adherence (conventional therapy 70.4 ± 20.0, conventional therapy plus Sweet Talk 77.2 ± 16.1, 95% CI +0.4, +17.4, P = 0.042). When surveyed, 82% of patients felt that Sweet Talk had improved their diabetes self-management and 90% wanted to continue receiving messages. Conclusions Sweet Talk was associated with improved self-efficacy and adherence, engaging a classically difficult-to-reach group of young people. While Sweet Talk alone did not improve glycaemic control, it may have had a role in supporting the introduction of intensive insulin therapy. Scheduled, tailored text messaging offers an innovative means of supporting adolescents with diabetes and could be adapted for other health-care settings and chronic diseases. [source]


Seroprevalence of Helicobacter pylori Infection Among Schoolchildren and Teachers in Taiwan

HELICOBACTER, Issue 3 2007
Ding-Bang Lin
Abstract Background: Helicobacter pylori is associated with chronic antral gastritis, which is related to duodenal ulcer, gastric ulcer, and probably gastric adenocarcinoma. Infection with H. pylori during childhood is considered an important risk factor for gastric carcinoma in adult life. Materials and Methods: To examine the epidemiologic characteristics of H. pylori infection among schoolchildren in central Taiwan, a community-based survey was carried out using stratified sampling in 10 elementary schools and three junior high schools, covering both students and their teachers. Serum specimens of 1950 healthy schoolchildren (aged 9–15 years) and 253 randomly sampled teachers were screened for H. pylori antibodies by enzyme-linked immunosorbent assay. Statistical analysis was performed using the SPSS for Windows statistical software system. Results: A total of 332 subjects were positive for H. pylori antibodies, giving an overall prevalence of 15.1%. The age-specific seropositive rates were 11.0% in the 9–12-year age group, 12.3% in the 13–15-year age group, and 45.1% in the teacher group. Seroprevalence increased with age (OR = 11.53; 95% CI = 6.73–19.74; p < .001 for teachers vs. children). There was no difference in the seroprevalence of H. pylori infection by gender, ethnicity, geographical area, socioeconomic level, parental education, sibship size, number of family members, or source of drinking water. Conclusion: The teachers had a much higher prevalence of H. pylori antibodies. This finding suggests that these teachers (adults) might have been infected in early childhood and implies that poor environmental and hygienic conditions might have been responsible. Poor water supply, sewage disposal, and other aspects of environmental hygiene in Taiwan before the early 1980s might have played a role in H. pylori infection. [source]
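As a reminder of how such an estimate is obtained, the sketch below computes an odds ratio and its 95% confidence interval (Woolf's method) from a 2×2 table; the counts are hypothetical and will not reproduce the study's adjusted estimate.

```python
# Worked sketch of an odds-ratio computation of the kind behind
# "OR = 11.53 (95% CI 6.73-19.74)". The 2x2 counts below are hypothetical.
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """a,b = exposed seropositive/seronegative; c,d = unexposed ditto."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)        # SE of ln(OR), Woolf method
    lo, hi = (math.exp(math.log(or_) + s * z * se) for s in (-1, +1))
    return or_, lo, hi

# Hypothetical counts: 114/253 teachers vs. 218/1950 children seropositive.
print(odds_ratio_ci(114, 253 - 114, 218, 1950 - 218))
```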


New centralized automatic vehicle location communications software system under GIS environment

INTERNATIONAL JOURNAL OF COMMUNICATION SYSTEMS, Issue 9 2005
Omar Al-Bayari
Abstract Recent advances in wireless communications and networks have integrated relatively new technologies, such as the Global Positioning System (GPS), with the popular Global System for Mobile Communication (GSM) second-generation cellular systems and with Geographic Information Systems (GIS) technologies. Automatic Vehicle Location (AVL) is based on a combination of GPS, GIS and telecommunication technologies. Automatic vehicle tracking systems are increasingly used for different purposes, especially those related to tracking one vehicle or a fleet of vehicles. In this work, we introduce a new AVL system developed under a GIS software environment. The centralized software at the control station transfers the intelligence of the tracking system from the in-car unit to the control-office PC software. Centralized software reduces the programming effort in the car unit and offers better fleet management. Moreover, the core of our system is based on the objects or controllers of the GIS software, which dramatically reduces the overall system cost. Our system provides easy access for changing system functions, making it straightforward to satisfy local needs. The design of our software is presented with an explanation of the new supporting technologies that were used to create the system. Finally, our software system has been validated using data from local road networks. Copyright © 2005 John Wiley & Sons, Ltd. [source]


Using a neural network in the software testing process

INTERNATIONAL JOURNAL OF INTELLIGENT SYSTEMS, Issue 1 2002
Meenakshi Vanmali
Software testing forms an integral part of the software development life cycle. Since the objective of testing is to ensure the conformity of an application to its specification, a test "oracle" is needed to determine whether a given test case exposes a fault or not. Using an automated oracle to support the activities of human testers can reduce the actual cost of the testing process and the related maintenance costs. In this paper, we present a new concept of using an artificial neural network as an automated oracle for a tested software system. A neural network is trained by the backpropagation algorithm on a set of test cases applied to the original version of the system. The network training is based on the "black-box" approach, since only inputs and outputs of the system are presented to the algorithm. The trained network can be used as an artificial oracle for evaluating the correctness of the output produced by new and possibly faulty versions of the software. We present experimental results of using a two-layer neural network to detect faults within mutated code of a small credit approval application. The results appear to be promising for a wide range of injected faults. © 2002 John Wiley & Sons, Inc. [source]
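A minimal sketch of the oracle idea, assuming a toy numeric function in place of the credit approval application and scikit-learn's MLPRegressor in place of the authors' backpropagation network:

```python
# Sketch (not the paper's system) of a neural network as a test oracle:
# train on input/output pairs from the trusted version, then flag outputs
# of a mutated version that deviate beyond a tolerance.
import numpy as np
from sklearn.neural_network import MLPRegressor

def original(x):                    # trusted "gold" version of the function
    return 2.0 * x[0] - x[1]

def mutated(x):                     # faulty version under test
    return 2.0 * x[0] + x[1]        # injected fault: sign flipped

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(500, 2))
y = np.array([original(x) for x in X])

oracle = MLPRegressor(hidden_layer_sizes=(32,), max_iter=5000, random_state=0)
oracle.fit(X, y)                    # black-box: only inputs and outputs used

X_test = rng.uniform(-1, 1, size=(20, 2))
pred = oracle.predict(X_test)
actual = np.array([mutated(x) for x in X_test])
suspects = np.abs(pred - actual) > 0.5          # tolerance is a tuning knob
print(f"{suspects.sum()} of {len(X_test)} test cases flagged as faulty")
```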


The World Mental Health (WMH) Survey Initiative version of the World Health Organization (WHO) Composite International Diagnostic Interview (CIDI)

INTERNATIONAL JOURNAL OF METHODS IN PSYCHIATRIC RESEARCH, Issue 2 2004
Ronald C. Kessler
Abstract This paper presents an overview of the World Mental Health (WMH) Survey Initiative version of the World Health Organization (WHO) Composite International Diagnostic Interview (CIDI) and a discussion of the methodological research on which the development of the instrument was based. The WMH-CIDI includes a screening module and 40 sections that focus on diagnoses (22 sections), functioning (four sections), treatment (two sections), risk factors (four sections), socio-demographic correlates (seven sections), and methodological factors (two sections). Innovations compared to earlier versions of the CIDI include expansion of the diagnostic sections, a focus on 12-month as well as lifetime disorders in the same interview, detailed assessment of clinical severity, and inclusion of information on treatment, risk factors, and consequences. A computer-assisted version of the interview is available along with a direct data entry software system that can be used to keypunch responses to the paper-and-pencil version of the interview. Computer programs that generate diagnoses are also available based on both ICD-10 and DSM-IV criteria. Elaborate CD-ROM-based training materials are available to teach interviewers how to administer the interview as well as to teach supervisors how to monitor the quality of data collection. Copyright © 2004 Whurr Publishers Ltd. [source]


A software system for rigid-body modelling of solution scattering data

JOURNAL OF APPLIED CRYSTALLOGRAPHY, Issue 3-1 2000
M.B. Kozin
A computer system for rigid-body modelling against solution scattering data is described. Fast algorithms to compute scattering from a complex of two arbitrarily positioned subunits are implemented and coupled with the graphics program ASSA (Kozin, Volkov & Svergun, 1997, J. Appl. Cryst. 30, 811–815). Mutual positions and orientations of the subunits (represented by low-resolution envelopes or by atomic models) can be determined by interactively fitting the experimental scattering curve from the complex. The system runs on the major Unix platforms (SUN, SGI and DEC workstations). [source]


A computer-assisted test design and diagnosis system for use by classroom teachers

JOURNAL OF COMPUTER ASSISTED LEARNING, Issue 6 2005
Q. He
Abstract Computer-assisted assessment (CAA) has become increasingly important in education in recent years. A variety of computer software systems have been developed to help assess the performance of students at various levels. However, such systems are primarily designed to provide objective assessment of students and analysis of test items, and their focus has mainly been on higher and further education. Although commercial professional systems are available for use by primary and secondary educational institutions, such systems are generally expensive and require skilled expertise to operate. In view of the rapid progress made in the use of computer-based assessment for primary and secondary students by education authorities in the UK and elsewhere, there is a need to develop systems that are economical and easy to use and that can provide the information teachers need to improve students' performance. This paper presents the development of a software system that provides a range of functions, including generating items and building item banks, designing tests, conducting tests on computers and analysing test results. Specifically, the system can generate information on the performance of students and test items that can easily be used to identify curriculum areas where students are underperforming. A case study based on data collected from five secondary schools in Hong Kong involved in the Curriculum, Evaluation and Management Centre's Middle Years Information System Project, Durham University, UK, has been undertaken to demonstrate the use of the system for diagnostic and performance analysis. [source]
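Two of the classical item statistics such a system can compute automatically are item difficulty and item discrimination; the sketch below uses a made-up response matrix.

```python
# Sketch of basic item analysis: difficulty (proportion correct) and
# discrimination (correlation between an item and the rest of the test).
# The response data are invented.
import numpy as np

# rows = students, cols = items; 1 = correct, 0 = wrong
R = np.array([[1, 1, 0, 1],
              [1, 0, 0, 1],
              [1, 1, 1, 1],
              [0, 0, 0, 1],
              [1, 1, 0, 0]])
total = R.sum(axis=1)

for j in range(R.shape[1]):
    difficulty = R[:, j].mean()                        # easier item -> higher
    rest = total - R[:, j]                             # corrected total score
    discrimination = np.corrcoef(R[:, j], rest)[0, 1]  # point-biserial
    print(f"item {j}: difficulty={difficulty:.2f} "
          f"discrimination={discrimination:.2f}")
```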


MPK: An open extensible motion planning kernel

JOURNAL OF FIELD ROBOTICS (FORMERLY JOURNAL OF ROBOTIC SYSTEMS), Issue 8 2001
Ian Gipson
The motion planning kernel (MPK) is a software system designed to facilitate development, testing, and comparison of robotic and geometric reasoning algorithms. Examples of such algorithms include automatic path planning, grasping, etc. The system has been designed to be open and extensible, so that new methods can be easily added and compared on the same platform. © 2001 John Wiley & Sons, Inc. [source]


Recommending change clusters to support software investigation: an empirical study

JOURNAL OF SOFTWARE MAINTENANCE AND EVOLUTION: RESEARCH AND PRACTICE, Issue 3 2010
Martin P. Robillard
Abstract During software maintenance tasks, developers often spend considerable effort investigating source code. This effort can be reduced if tools are available to help developers navigate the source code effectively. We studied to what extent developers can benefit from information contained in clusters of change sets to guide their investigation of a software system. We defined change clusters as groups of change sets that have a certain number of elements in common. Our analysis of 4200 change sets for seven different systems, covering a cumulative time span of over 17 years of development, showed that fewer than one in five tasks overlapped with change clusters. Furthermore, a detailed qualitative analysis of the results revealed that only 13% of the clusters associated with applicable change tasks were likely to be useful. We conclude that change clusters can only support a minority of change tasks, and should only be recommended if it is possible to do so at minimal cost to the developers. Copyright © 2009 John Wiley & Sons, Ltd. [source]
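A minimal sketch of the underlying idea (illustrative only; the paper's actual clustering procedure may differ): group change sets whose changed elements overlap by at least a Jaccard-similarity threshold.

```python
# Illustrative single-link grouping of change sets by element overlap.
def jaccard(a, b):
    return len(a & b) / len(a | b)

def cluster(change_sets, threshold=0.3):
    """change_sets: list of sets of changed elements (files, methods, ...)."""
    clusters = []
    for cs in change_sets:
        for cl in clusters:
            if any(jaccard(cs, other) >= threshold for other in cl):
                cl.append(cs)
                break
        else:
            clusters.append([cs])
    return clusters

changes = [{"A.java", "B.java"}, {"B.java", "C.java"}, {"X.java"}]
for i, cl in enumerate(cluster(changes)):
    print(f"cluster {i}: {[sorted(c) for c in cl]}")
```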


Viability for codifying and documenting architectural design decisions with tool support

JOURNAL OF SOFTWARE MAINTENANCE AND EVOLUTION: RESEARCH AND PRACTICE, Issue 2 2010
Rafael Capilla
Abstract Current software architecture practices have focused on modeling and documenting the architecture of a software system by means of several architectural views. In practice, standard architecture documentation lacks an explicit description of the decisions made and their underlying rationale, which often leads to knowledge loss. This strongly affects maintenance activities, as additional effort must be spent to replay the decisions made and to understand the changes performed in the design. Hence, codifying this architectural knowledge is a challenging task that requires adequate tool support. In this research, we test the capabilities of the Architecture Design Decision Support System (ADDSS), a web-based tool for supporting the creation, maintenance, use, and documentation of architectural design decisions (ADDs) together with their architectures. We used ADDSS to codify architectural knowledge and to maintain the trace links between design decisions and other software artefacts that help in maintenance operations. We illustrate the usage of the tool through four different experiences and discuss the potential benefits of using this architectural knowledge and its impact on maintenance and evolution activities. Copyright © 2009 John Wiley & Sons, Ltd. [source]


A metric-based approach to identifying refactoring opportunities for merging code clones in a Java software system

JOURNAL OF SOFTWARE MAINTENANCE AND EVOLUTION: RESEARCH AND PRACTICE, Issue 6 2008
Yoshiki Higo
Abstract A code clone is a code fragment that has identical or similar fragments elsewhere in the source code. The presence of code clones is generally regarded as one factor that makes software maintenance more difficult. For example, if a code fragment with code clones is modified, it is necessary to consider whether each of the other code clones has to be modified as well. Removing code clones is one way of avoiding the problems that arise due to their presence, and doing so makes the source code more maintainable and more comprehensible. This paper proposes a set of metrics that suggest how code clones can be refactored. The tool Aries, which automatically computes these metrics, is also presented. The tool gives metrics that are indicators for certain refactoring methods rather than suggesting the refactoring methods themselves. The tool performs only lightweight source code analysis; hence, it can be applied to large numbers of code lines. This paper also describes a case study that illustrates how this tool can be used. Based on the results of this case study, it can be concluded that this method can efficiently merge code clones. Copyright © 2008 John Wiley & Sons, Ltd. [source]
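To give a flavour of the kind of lightweight, metric-style analysis involved (this is illustrative and is not one of the Aries metrics): a similarity score between identifier-normalized fragments is a common first signal that two clones might be merged by an Extract Method refactoring.

```python
# Illustrative clone-similarity measurement after identifier normalization.
# Keywords are normalized too, which is acceptable for a rough sketch.
import difflib
import re

def normalize(code):
    """Replace identifiers with a placeholder so renamed clones still match."""
    return re.sub(r"\b[a-zA-Z_]\w*\b", "ID", code)

frag_a = "total = total + price[i];"
frag_b = "sum = sum + cost[j];"

sim = difflib.SequenceMatcher(
    None, normalize(frag_a), normalize(frag_b)).ratio()
print(f"normalized similarity: {sim:.2f}")  # near 1.0 -> candidate for merging
```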


Encapsulating targeted component abstractions using software Reflexion Modelling

JOURNAL OF SOFTWARE MAINTENANCE AND EVOLUTION: RESEARCH AND PRACTICE, Issue 2 2008
Jim Buckley
Abstract Design abstractions such as components, modules, subsystems or packages are often not made explicit in the implementation of legacy systems. Indeed, often the abstractions that are made explicit turn out to be inappropriate for future evolution agendas. This can make the maintenance, evolution and refactoring of these systems difficult. In this publication, we carry out a fine-grained evaluation of Reflexion Modelling as a technique for encapsulating user-targeted components. This process is a prelude to component recovery, reuse and refactoring. The evaluation takes the form of two in vivo case studies, where two professional software developers encapsulate components in a large, commercial software system. The studies demonstrate the validity of this approach and offer several best-use guidelines. Specifically, they argue that users benefit from having a strong mental model of the system in advance of Reflexion Modelling, even if that model is flawed, and that users should expend effort exploring the expected relationships present in Reflexion Models. Copyright © 2008 John Wiley & Sons, Ltd. [source]
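The core comparison behind Reflexion Modelling can be stated in a few lines: dependencies extracted from the source are lifted through a user-supplied mapping into the high-level model and classified as convergent, divergent, or absent. The sketch below uses invented file and module names.

```python
# Compact sketch of the reflexion-model comparison. All names are invented.
model_edges = {("UI", "Core"), ("Core", "Storage")}          # hypothesis
mapping = {"menu.c": "UI", "engine.c": "Core", "db.c": "Storage"}
source_deps = {("menu.c", "engine.c"), ("menu.c", "db.c")}   # extracted facts

lifted = {(mapping[a], mapping[b]) for a, b in source_deps}
print("convergent:", lifted & model_edges)    # predicted and found
print("divergent: ", lifted - model_edges)    # found but not predicted
print("absent:    ", model_edges - lifted)    # predicted but not found
```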


Analyzing software evolution through feature views

JOURNAL OF SOFTWARE MAINTENANCE AND EVOLUTION: RESEARCH AND PRACTICE, Issue 6 2006
Orla Greevy
Abstract Features encapsulate the domain knowledge of a software system and thus are valuable sources of information for a reverse engineer. When analyzing the evolution of a system, we need to know how and which features were modified to recover both the change intention and extent, namely which source artifacts are affected. Typically, the implementation of a feature crosscuts a number of source artifacts. To obtain a mapping between features and the source artifacts, we exercise the features and capture their execution traces. However, this results in large traces that are difficult to interpret. To tackle this issue we compact the traces into simple sets of source artifacts that participate in a feature's runtime behavior. We refer to these compacted traces as feature views. Within a feature view, we partition the source artifacts into disjoint sets of characterized software entities. The characterization defines the level of participation of a source entity in the features. We then analyze the features over several versions of a system and we plot their evolution to reveal how and which features were affected by code changes. We show the usefulness of our approach by applying it to a case study where we address the problem of merging parallel development tracks of the same system. Copyright © 2006 John Wiley & Sons, Ltd. [source]
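A minimal sketch of building feature views and characterizing entities by their level of participation (the trace data and category names are invented; the terminology follows the abstract):

```python
# Sketch: compact raw execution traces into feature views (sets of source
# artifacts), then characterize each entity by how many features use it.
traces = {                     # feature -> raw execution trace (with repeats)
    "save":  ["Editor.save", "File.write", "UI.refresh", "Log.info",
              "File.write"],
    "print": ["Editor.print", "UI.refresh", "Spooler.send", "Log.info"],
    "open":  ["Editor.open", "File.read", "Log.info"],
}
views = {f: set(t) for f, t in traces.items()}   # feature views

for e in sorted(set().union(*views.values())):
    n = sum(e in v for v in views.values())
    kind = ("single-feature" if n == 1 else
            "infrastructural" if n == len(views) else "group-feature")
    print(f"{e:14} in {n}/{len(views)} features: {kind}")
```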


Organizational evolution of digital signal processing software development

JOURNAL OF SOFTWARE MAINTENANCE AND EVOLUTION: RESEARCH AND PRACTICE, Issue 4 2006
Susanna Pantsar-Syväniemi
Abstract A base station, as a network element, has become an increasingly software-intensive system. Digital signal processing (DSP) software is hard real-time software that is a part of the software system needed in a base station. This article reports practical experiences related to organizing the development of embedded software in the telecommunication industry, at Nokia Networks. The article introduces the main factors influencing the development of DSP software and also compares the evolutionary process under study with both selected organizational models for a software product line and a multistage model for the software life cycle. We believe it is vitally important to formulate the organization according to the software architecture, and it is essential to have a dedicated development organization with long-term responsibility for the software. History shows that without long-term responsibility, there is no software reuse. In this paper we introduce a new organizational model for product line development. This new hybrid model clarifies long-term responsibilities in large software organizations with hundreds of staff members and formulates the organization according to the software architecture. Our case needs a couple more constraints to keep it in the evolution stage of the software life cycle. Thus, we extend the evolution phase in the multistage model to make it relevant for embedded, hard real-time software. Copyright © 2006 John Wiley & Sons, Ltd. [source]


Supporting the analysis of clones in software systems

JOURNAL OF SOFTWARE MAINTENANCE AND EVOLUTION: RESEARCH AND PRACTICE, Issue 2 2006
Cory J. Kapser
Abstract Code duplication is a well-documented problem in industrial software systems. There has been considerable research into techniques for detecting duplication in software, and there are several effective tools to perform this task. However, there have been few detailed qualitative studies into how cloning actually manifests itself within software systems. This is primarily due to the large result sets that many clone-detection tools return; these result sets are very difficult to manage without complementary tool support that can scale to the size of the problem, and this kind of support does not currently exist. In this paper we present an in-depth case study of cloning in a large software system that is in wide use, the Apache Web server; we provide insights into cloning as it exists in this system, and we demonstrate techniques to manage and make effective use of the large result sets of clone-detection tools. In our case study, we found several interesting types of cloning occurrences, such as 'cloning hotspots', where a single subsystem comprising only 17% of the system code contained 38.8% of the clones. We also found several examples of cloning behavior that were beneficial to the development of the system, in particular cloning as a way to add experimental functionality. Copyright © 2006 John Wiley & Sons, Ltd. [source]


Using software trails to reconstruct the evolution of software

JOURNAL OF SOFTWARE MAINTENANCE AND EVOLUTION: RESEARCH AND PRACTICE, Issue 6 2004
Daniel M. German
Abstract This paper describes a method to recover the evolution of a software system using its software trails: information left behind by the contributors to the development process of the product, such as mailing lists, Web sites, version control logs, software releases, documentation, and the source code. This paper demonstrates the use of this method by recovering the evolution of Ximian Evolution, a mail client for Unix. By extracting useful facts stored in these software trails and correlating them, it was possible to provide a detailed view of the history of this project. This view provides interesting insight into how an open source software project evolves and some of the practices used by its software developers. Copyright © 2004 John Wiley & Sons, Ltd. [source]


Visualizing feature evolution of large-scale software based on problem and modification report data

JOURNAL OF SOFTWARE MAINTENANCE AND EVOLUTION: RESEARCH AND PRACTICE, Issue 6 2004
Michael Fischer
Abstract Gaining higher-level evolutionary information about large software systems is a key challenge in dealing with increasing complexity and architectural deterioration. Modification reports and problem reports (PRs) taken from systems such as the concurrent versions system (CVS) and Bugzilla contain an overwhelming amount of information about the reasons and effects of particular changes. Such reports can be analyzed to provide a clearer picture about the problems concerning a particular feature or a set of features. Hidden dependencies of structurally unrelated but over time logically coupled files exhibit a good potential to illustrate feature evolution and possible architectural deterioration. In this paper, we describe the visualization of feature evolution by taking advantage of this logical coupling introduced by changes required to fix a reported problem. We compute the proximity of PRs by applying a standard technique called multidimensional scaling (MDS). The visualization of these data enables us to depict feature evolution by projecting PR dependence onto (a) feature-connected files and (b) the project directory structure of the software system. These two different views show how PRs, features and the directory tree structure relate. As a result, our approach uncovers hidden dependencies between features and presents them in an easy-to-assess visual form. A visualization of interwoven features can indicate locations of design erosion in the architectural evolution of a software system. As a case study, we used Mozilla and its CVS and Bugzilla data to show the applicability and effectiveness of our approach. Copyright © 2004 John Wiley & Sons, Ltd. [source]
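The MDS step can be sketched in a few lines of classical (Torgerson) scaling; the PR-to-PR dissimilarity matrix below is a toy stand-in for proximities derived from real problem reports.

```python
# Classical MDS: recover 2-D coordinates whose pairwise distances
# approximate a given dissimilarity matrix, ready for plotting.
import numpy as np

D = np.array([[0.0, 1.0, 4.0],      # toy PR-to-PR dissimilarity matrix
              [1.0, 0.0, 3.0],
              [4.0, 3.0, 0.0]])

n = D.shape[0]
J = np.eye(n) - np.ones((n, n)) / n          # centering matrix
B = -0.5 * J @ (D ** 2) @ J                  # double-centered Gram matrix
vals, vecs = np.linalg.eigh(B)               # eigenvalues in ascending order
idx = np.argsort(vals)[::-1][:2]             # two largest components
coords = vecs[:, idx] * np.sqrt(np.maximum(vals[idx], 0))
print(coords)                                # one 2-D point per PR
```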


An assessment strategy for identifying legacy system evolution requirements in eBusiness context

JOURNAL OF SOFTWARE MAINTENANCE AND EVOLUTION: RESEARCH AND PRACTICE, Issue 4-5 2004
Lerina Aversano
Abstract The enactment of eBusiness processes requires the effective usage of existing legacy applications in eBusiness initiatives. Technical issues alone are not enough to drive the evolution of existing legacy applications; problems concerning the perspectives, strategies, and business of the enterprises also have to be considered. In particular, there is a strict relationship between the evolution of legacy systems and the evolution of eBusiness processes. This paper proposes a strategy to extract the requirements for a legacy system evolution from the requirements of the eBusiness evolution. The proposed strategy aims at characterizing the software system within the whole environment in which its evolution will be performed. It provides a useful set of attributes addressing technical, process, and organizational issues. Moreover, a set of assessment activities is proposed, affecting the order in which the attributes are assessed. Copyright © 2004 John Wiley & Sons, Ltd. [source]


Architecture-based semantic evolution of embedded remotely controlled systems

JOURNAL OF SOFTWARE MAINTENANCE AND EVOLUTION: RESEARCH AND PRACTICE, Issue 3 2003
Lawrence Chung
Abstract Evolution of a software system is a natural process. In most systems, evolution takes place during the maintenance phase of their life cycles. Systems that have reached their limit in evolution have usually reached the end of their useful life and may have to be replaced. However, there are systems in which evolution occurs during the operational phase of their life cycles. Such systems are designed to evolve while in use or, in other words, to be adaptable. Semantically adaptable systems are of particular interest to industry, as such systems often adapt themselves to environment change with little or no intervention from their developing or maintaining organization. Since embedded systems usually have a restricted hardware configuration, it is difficult to apply the techniques developed for non-embedded systems directly to embedded systems. This paper focuses on evolution through adaptation and develops the concepts and techniques for semantic evolution in embedded systems. As the first step in the development of a software solution, the architectures of software systems themselves have to be made semantically evolvable. In this paper we explore various architectural alternatives for the semantic evolution of embedded systems; these architectures are based on four different techniques that we have identified for semantic evolution in embedded systems. The development of these architectures follows the systematic process provided by the non-functional requirement (NFR) framework, which also permits the architectures to be rated in terms of their evolvability. As the field of embedded systems is vast, this paper concentrates on those embedded systems that can be remotely controlled. In this application domain the embedded system is connected to an external controller by a communication link such as Ethernet, serial, or radio frequency, and receives commands from and sends responses to the external controller via the communication link. The architectures developed in this paper have been partly validated by applying them in a real embedded system, a test instrument used for testing cell phones. These architectures and techniques for semantic evolution in this application domain give a glimpse of what can be done to achieve semantic evolution in software-implemented systems. Copyright © 2003 John Wiley & Sons, Ltd. [source]


Consistent database sampling as a database prototyping approach

JOURNAL OF SOFTWARE MAINTENANCE AND EVOLUTION: RESEARCH AND PRACTICE, Issue 6 2002
Jesús Bisbal
Abstract Requirements elicitation has been reported to be the stage of software development at which errors have the most expensive consequences. Users usually find it difficult to articulate a consistent and complete set of requirements at the beginning of a development project. Prototyping is considered a powerful technique for easing this problem by exposing a partial implementation of the software system to the user, who can then identify required modifications. When prototyping data-intensive applications, a so-called prototype database is needed. This paper investigates how a prototype database can be built. Two different approaches are analysed, namely test databases and sample databases; the former populates the resulting database with synthetic values, while the latter uses data values from an existing database. The application areas that require prototype databases, in addition to requirements analysis, are also identified. The paper reports on existing research into the construction of both types of prototype databases, and indicates the type of application area to which each is best suited. This paper advocates the use of sample databases when an operational database is available, as is commonly the case in software maintenance and evolution. Domain-relevant data values and integrity constraints produce a prototype database that supports the information system development process better than synthetic data. The process of extracting a sample database is also investigated. Copyright © 2002 John Wiley & Sons, Ltd. [source]
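A minimal sketch of the consistency requirement that distinguishes sample databases from naive row sampling: child rows are kept only when their foreign keys point into the sampled parent rows, so referential integrity holds in the prototype. The schema is invented.

```python
# Sketch of consistency-preserving sampling: sample parents first, then keep
# only children whose foreign keys reference a sampled parent.
import random

customers = [{"id": i, "name": f"cust{i}"} for i in range(100)]
orders = [{"id": i, "customer_id": random.randrange(100)} for i in range(300)]

sampled_customers = random.sample(customers, 10)     # seed the sample
kept_ids = {c["id"] for c in sampled_customers}
sampled_orders = [o for o in orders if o["customer_id"] in kept_ids]

print(len(sampled_customers), "customers,", len(sampled_orders),
      "orders; every kept order references a sampled customer")
```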


MxCuBE: a synchrotron beamline control environment customized for macromolecular crystallography experiments

JOURNAL OF SYNCHROTRON RADIATION, Issue 5 2010
José Gabadinho
The design and features of a beamline control software system for macromolecular crystallography (MX) experiments developed at the European Synchrotron Radiation Facility (ESRF) are described. This system, MxCuBE, allows users to easily and simply interact with beamline hardware components and provides automated routines for common tasks in the operation of a synchrotron beamline dedicated to experiments in MX. Additional functionality is provided through intuitive interfaces that enable the assessment of the diffraction characteristics of samples, experiment planning, automatic data collection and the on-line collection and analysis of X-ray emission spectra. The software can be run in a tandem client-server mode that allows for remote control and relevant experimental parameters and results are automatically logged in a relational database, ISPyB. MxCuBE is modular, flexible and extensible and is currently deployed on eight macromolecular crystallography beamlines at the ESRF. Additionally, the software is installed at MAX-lab beamline I911-3 and at BESSY beamline BL14.1. [source]


Real-time accelerated interactive MRI with adaptive TSENSE and UNFOLD

MAGNETIC RESONANCE IN MEDICINE, Issue 2 2003
Michael A. Guttman
Abstract Reduced field-of-view (FOV) acceleration using time-adaptive sensitivity encoding (TSENSE) or unaliasing by Fourier encoding the overlaps using the temporal dimension (UNFOLD) can improve the depiction of motion in real-time MRI. However, increased computational resources are required to maintain a high frame rate and low latency in image reconstruction and display. A high-performance software system has been implemented to perform TSENSE and UNFOLD reconstructions for real-time MRI with interactive, on-line display. Images were displayed in the scanner room to investigate image-guided procedures. Examples are shown for normal volunteers and cardiac interventional experiments in animals using a steady-state free precession (SSFP) sequence. In order to maintain adequate image quality for interventional procedures, the imaging rate was limited to seven frames per second after an acceleration factor of 2 with a voxel size of 1.8 × 3.5 × 8 mm. Initial experiences suggest that TSENSE and UNFOLD can each improve the compromise between spatial and temporal resolution in real-time imaging, and can function well in interactive imaging. UNFOLD places no additional constraints on receiver coils, and is therefore more flexible than SENSE methods; however, the temporal image filtering can blur motion and reduce the effective acceleration. Methods are proposed to overcome the challenges presented by the use of TSENSE in interactive imaging. TSENSE may be temporarily disabled after changing the imaging plane to avoid transient artifacts as the sensitivity coefficients adapt. For imaging with a combination of surface and interventional coils, a hybrid reconstruction approach is proposed whereby UNFOLD is used for the interventional coils, and TSENSE with or without UNFOLD is used for the surface coils. Magn Reson Med 50:315–321, 2003. Published 2003 Wiley-Liss, Inc. [source]
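The key mechanism of UNFOLD can be demonstrated on synthetic data: with interleaved undersampling, the aliased ghost alternates sign from frame to frame, i.e. it is modulated to the temporal Nyquist frequency, where a notch filter along time can remove it. This sketch is a simplified illustration, not the authors' reconstruction pipeline.

```python
# Toy demonstration of UNFOLD-style temporal filtering on synthetic 1-D data.
import numpy as np

T, N = 64, 32                                   # frames, pixels per line
t = np.arange(T)[:, None]
obj = np.sin(2 * np.pi * t / 16.0) * np.hanning(N)       # slowly varying object
ghost = 0.5 * (1 + 0.2 * np.sin(2 * np.pi * t / 32.0))   # slowly varying ghost
frames = obj + ghost * (-1.0) ** t              # ghost flips sign every frame

F = np.fft.fft(frames, axis=0)                  # FFT along the time axis
F[T // 2 - 2: T // 2 + 3] = 0                   # notch at temporal Nyquist
unfolded = np.fft.ifft(F, axis=0).real

print("relative residual:",
      np.linalg.norm(unfolded - obj) / np.linalg.norm(obj))
```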


An automated tracking system to measure the dynamic properties of vesicles in living cells

MICROSCOPY RESEARCH AND TECHNIQUE, Issue 2 2007
Tien-Chuan Ku
Abstract Recent technological improvements have made it possible to examine the dynamics of individual vesicles at very high temporal and spatial resolution. Quantification of the dynamic properties of secretory vesicles is labor-intensive, and it is therefore crucial to develop software to automate the process of analyzing vesicle dynamics. Dual-threshold and binary image conversion were applied to enhance images and define the areas of the objects of interest to be tracked. The movements, changes in fluorescence intensity, and changes in the area of each tracked object were measured using a new software system named the Protein Tracking system (PTrack). Simulations revealed that the system accurately recognized tracked objects and measured their dynamic properties. Comparison of the results from tracking real time-lapse images manually with those obtained automatically using PTrack revealed similar patterns for changes in fluorescence intensity and a high accuracy (>89%). According to the tracking results, PTrack can distinguish different vesicular organelles that are similar in shape, based on their unique dynamic properties. In conclusion, the novel tracking system, PTrack, should facilitate automated quantification of the dynamic properties of vesicles, which is important when classifying vesicular protein locations. Microsc. Res. Tech. 2007. © 2006 Wiley-Liss, Inc. [source]
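A simplified sketch of the pipeline the abstract describes: threshold (a single threshold here, rather than the paper's dual-threshold scheme), label connected regions, take centroids, and link detections across frames by nearest neighbour. The frames are synthetic; PTrack's internals are not public.

```python
# Minimal detect-and-link tracker on synthetic fluorescence frames.
import numpy as np
from scipy import ndimage

def detect(frame, threshold=0.02):
    """Threshold, label connected regions, return centroids (row, col)."""
    labels, n = ndimage.label(frame > threshold)
    return np.array(ndimage.center_of_mass(frame, labels, range(1, n + 1)))

def link(prev, cur, max_dist=5.0):
    """Greedy nearest-neighbour linking of detections across two frames."""
    pairs = []
    for i, p in enumerate(prev):
        d = np.linalg.norm(cur - p, axis=1)
        j = int(np.argmin(d))
        if d[j] <= max_dist:
            pairs.append((i, j))        # vesicle i in frame 0 -> j in frame 1
    return pairs

f0 = np.zeros((64, 64)); f0[10, 10] = f0[40, 30] = 1.0  # two synthetic vesicles
f1 = np.zeros((64, 64)); f1[12, 11] = f1[41, 29] = 1.0  # ...slightly moved
f0 = ndimage.gaussian_filter(f0, 1.5)
f1 = ndimage.gaussian_filter(f1, 1.5)

print("links:", link(detect(f0), detect(f1)))
```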


Mechanical properties of injection molded long fiber polypropylene composites, Part 1: Tensile and flexural properties

POLYMER COMPOSITES, Issue 2 2007
K. Senthil Kumar
Innovative polymers and composites are broadening the range of applications and commercial production of thermoplastics. Long fiber-reinforced thermoplastics have received much attention due to their processability by conventional technologies. This study describes the development of long fiber reinforced polypropylene (LFPP) composites and the effect of fiber length and compatibilizer content on their mechanical properties. LFPP pellets of different sizes were prepared by an extrusion process using a specially designed radial impregnation die, and these pellets were injection molded to develop LFPP composites. Maleic-anhydride grafted polypropylene (MA-g-PP) was chosen as a compatibilizer, and its content was optimized by determining the interfacial properties through a fiber pullout test. Critical fiber length was calculated using the interfacial shear strength. Fiber length distributions were analyzed using a profile projector and an image analyzer software system. A fiber aspect ratio of more than 100 was achieved after injection molding. The results of the tensile and flexural properties of injection molded long glass fiber reinforced polypropylene with a glass fiber volume fraction of 0.18 are presented. It was found that the differences in pellet sizes improve the mechanical properties by 3–8%. Efforts are made to theoretically predict the tensile strength and modulus using the Kelly-Tyson and Halpin-Tsai models, respectively. POLYM. COMPOS., 28:259–266, 2007. © 2007 Society of Plastics Engineers [source]
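For reference, the two named models can be evaluated directly; the property values below are generic assumptions for E-glass/polypropylene, not the paper's measured data.

```python
# Sketch of the Kelly-Tyson strength and Halpin-Tsai modulus predictions.
# All property values are illustrative assumptions, not the study's data.
Ef, Em = 72.0e9, 1.5e9       # fiber / matrix moduli (Pa), assumed
sigf, tau = 2.0e9, 10.0e6    # fiber strength, interfacial shear strength (Pa)
d, Vf = 17e-6, 0.18          # fiber diameter (m), fiber volume fraction
sigm = 30.0e6                # matrix stress at composite failure (Pa), assumed

lc = sigf * d / (2 * tau)    # Kelly-Tyson critical fiber length
print(f"critical length lc = {lc*1e3:.2f} mm")

def halpin_tsai_E11(l):
    """Longitudinal modulus for aligned short fibers of length l (m)."""
    xi = 2 * l / d
    eta = (Ef / Em - 1) / (Ef / Em + xi)
    return Em * (1 + xi * eta * Vf) / (1 - eta * Vf)

def kelly_tyson_strength(lengths, fractions):
    """lengths: fiber lengths (m); fractions: volume fraction per class."""
    s = 0.0
    for l, v in zip(lengths, fractions):
        if l < lc:
            s += tau * l / d * v                 # sub-critical fibers
        else:
            s += sigf * (1 - lc / (2 * l)) * v   # super-critical fibers
    return s + sigm * (1 - Vf)                   # matrix contribution

lengths = [0.5e-3, 2e-3, 5e-3]      # toy fiber-length distribution
fractions = [0.05, 0.08, 0.05]      # sums to Vf = 0.18
print(f"E11 (l=2 mm) = {halpin_tsai_E11(2e-3)/1e9:.1f} GPa")
print(f"strength     = {kelly_tyson_strength(lengths, fractions)/1e6:.0f} MPa")
```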


Effect of single doses of maraviroc on the QT/QTc interval in healthy subjects

BRITISH JOURNAL OF CLINICAL PHARMACOLOGY, Issue 2008
John D. Davis
AIMS To assess the effect of a single dose of maraviroc on the QTc interval in healthy subjects and to evaluate the QTc interval-concentration relationship. METHODS A single-dose, placebo- and active-controlled, five-way crossover study was conducted to investigate the effects of maraviroc (100, 300, 900 mg) on QTc in healthy subjects. Moxifloxacin (400 mg) was used as the active comparator. The study was double-blind with respect to maraviroc/placebo and open-label for moxifloxacin. There was a 7-day wash-out period between each dose. QT interval measurements obtained directly from the electrocardiogram (ECG) recorder were corrected for heart rate using Fridericia's correction (QTcF). A placebo run-in day was conducted before period 3, when ECGs were collected at intervals while subjects were resting or during exercise. These ECGs, plus other predose ECGs, were used to evaluate the QT/RR relationship for each subject to enable calculation of an individual heart rate correction for their QT measurements (QTcI). ECGs were taken at various intervals pre- and postdose in each study period. Pharmacokinetic parameters were determined for each maraviroc dose. The end-points evaluated were QTcF at median time to maximum concentration (Tmax), based on the machine readings, and QTcI at median Tmax, based on manual over-reads of the QT/RR data. A separate analysis of variance was used for each of the pair-wise comparisons for each end-point. The relationship between QTc interval and plasma concentration was also investigated using a mixed-effects modelling approach, as implemented in the NONMEM software system. A one-stage model was employed in which the relationship between QT and RR and the effects of maraviroc plasma concentration on QT were estimated simultaneously. RESULTS The mean difference from placebo in machine-read QTcF at median Tmax for maraviroc 900 mg was 3.6 ms [90% confidence interval (CI) 1.5, 5.8]. For the active comparator, moxifloxacin, the mean difference from placebo in machine-read QTcF was 13.7 ms. The changes from placebo for each of the end-points were similar for men and women. No subjects receiving maraviroc or placebo had a QTcF ≥ 450 ms (men) or ≥ 470 ms (women), nor did any subject experience a QTcF increase ≥ 60 ms from baseline at any time point. Analysis based on the QTcI data obtained from the manual over-readings of the ECGs gave numerically very similar results. The QT:RR relationship was similar pre- and postdose and was not related to maraviroc concentration. The population estimate of the QT:RR correction factor was 0.324 (95% CI 0.309, 0.338). The population estimate of the slope describing the QT-concentration relationship was 0.97 µs ml ng−1 (95% CI −0.571, 2.48), equivalent to an increase of 0.97 ms in QT per 1000 ng ml−1 maraviroc plasma concentration. Most adverse events were mild to moderate in severity. CONCLUSIONS Single doses of maraviroc, up to and including 900 mg, had no clinically relevant effect on QTcF or QTcI. At all maraviroc doses and for both end-points, the mean difference from placebo for QTc was <4 ms. There was no apparent relationship between QT interval and maraviroc plasma concentration up to 2363 ng ml−1. This conclusion held in both male and female subjects, and there was no evidence of a change in the QT/RR relationship with concentration. [source]
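For illustration, the sketch below applies Fridericia's correction, fits an individual correction exponent, and estimates the exposure-response slope on simulated data; the study's individual data are not reproduced here, and in practice the individual exponent is estimated from drug-free ECGs rather than all records.

```python
# Sketch of QTcF, an individual QT/RR correction, and the concentration-QT
# slope, using simulated data (not the study's data).
import numpy as np

rng = np.random.default_rng(0)
rr = rng.uniform(0.7, 1.2, 200)                    # RR interval (s)
conc = rng.uniform(0, 2400, 200)                   # maraviroc (ng/ml)
qt = 400 * rr ** 0.32 + 0.001 * conc + rng.normal(0, 3, 200)   # QT (ms)

qtcf = qt / rr ** (1 / 3)                          # Fridericia correction

# Individual correction: fit alpha in QT = beta * RR^alpha on a log scale.
# (Simplified: a real analysis would fit alpha on drug-free ECGs only.)
alpha, _ = np.polyfit(np.log(rr), np.log(qt), 1)
qtci = qt / rr ** alpha

# Exposure-response: least-squares slope of QTc vs. concentration.
slope, intercept = np.polyfit(conc, qtci, 1)
print(f"alpha = {alpha:.3f} (cf. population estimate 0.324 in the abstract)")
print(f"slope = {slope*1000:.2f} ms per 1000 ng/ml")
```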