Distribution by Scientific Domains

Kinds of Software

  • analysis software
  • application software
  • available software
  • commercial software
  • computer software
  • control software
  • data analysis software
  • dedicated software
  • design software
  • educational software
  • element software
  • finite element software
  • free software
  • gi software
  • image analysis software
  • image processing software
  • image software
  • imaging software
  • management software
  • mapping software
  • modeling software
  • modelling software
  • navigation software
  • new software
  • nonmem software
  • open source software
  • open-source software
  • planning software
  • processing software
  • qualitative data analysis software
  • recognition software
  • simulation software
  • source software
  • specialized software
  • standard software
  • standard statistical software
  • statistical software
  • system software
  • user-friendly software

Terms modified by Software

  • software agent
  • software analysis
  • software application
  • software architecture
  • software available
  • software component
  • software defect
  • software design
  • software developers
  • software development
  • software development process
  • software engineering
  • software environment
  • software firm
  • software implementation
  • software industry
  • software library
  • software module
  • software organization
  • software package
  • software packages
  • software platform
  • software product
  • software products
  • software program
  • software solution
  • software suite
  • software system
  • software technology
  • software testing
  • software tool
  • software used

Selected Abstracts

    CODE IS SPEECH: Legal Tinkering, Expertise, and Protest among Free and Open Source Software Developers

    ABSTRACT In this essay, I examine the channels through which Free and Open Source Software (F/OSS) developers reconfigure central tenets of the liberal tradition, and the meanings of both freedom and speech, to defend against efforts to constrain their productive autonomy. I demonstrate how F/OSS developers contest and specify the meaning of liberal freedom, especially free speech, through the development of legal tools and discourses within the context of the F/OSS project. I highlight how developers concurrently tinker with technology and the law using similar skills, which transform and consolidate ethical precepts among developers. I contrast this legal pedagogy with more extraordinary legal battles over intellectual property, speech, and software. I concentrate on the arrests of two programmers, Jon Johansen and Dmitry Sklyarov, and on the protests they provoked, which unfolded between 1999 and 2003. These events are analytically significant because they dramatized and thus made visible tacit social processes. They publicized the challenge that F/OSS represents to the dominant regime of intellectual property (and clarified the democratic stakes involved) and also stabilized a rival liberal legal regime intimately connecting source code to speech. [source]

    Geeks, Social Imaginaries, and Recursive Publics

    Christopher Kelty
    This article investigates the social, technical, and legal affiliations among "geeks" (hackers, lawyers, activists, and IT entrepreneurs) on the Internet. The mode of association specific to this group is that of a "recursive public sphere" constituted by a shared imaginary of the technical and legal conditions of possibility for their own association. On the basis of fieldwork conducted in the United States, Europe, and India, I argue that geeks imagine their social existence and relations as much through technical practices (hacking, networking, and code writing) as through discursive argument (rights, identities, and relations). In addition, they consider a "right to tinker" a form of free speech that takes the form of creating, implementing, modifying, or using specific kinds of software (especially Free Software) rather than verbal discourse. [source]

    Globalization from Below: Free Software and Alternatives to Neoliberalism

    Sara Schoonmaker
    ABSTRACT This article explores one of the central struggles over the politics of globalization: forging alternatives to neoliberalism by developing new forms of globalization from below. It focuses on a unique facet of this struggle, rooted in the centrality of information technologies for global trade and production, as well as new forms of media and digital culture. The analysis has four main parts: examining the key role of software as a technological infrastructure for diverse forms of globalization; conceptualizing the contradictory implications of three software business models for realizing the utopian potential of digital technology to develop forms of globalization from below; exploring how three free and open source software business models were put into practice by Red Hat, IBM and the Free Software Foundation; and analysing Brazilian software policy as a form of globalization from below that challenges the historical dominance of the global North and seeks to develop new forms of digital inclusion and digital culture. [source]

    Automatic analysis of multiplex ligation-dependent probe amplification products (exemplified by a commercial kit for prenatal aneuploidy detection)

    ELECTROPHORESIS, Issue 22 2005
    Tommy Gerdes Dr.
    Abstract For use in routine prenatal diagnostics, we developed software and methods for automatic aneuploidy detection based on a commercial multiplex ligation-dependent probe amplification (MLPA) kit. The software and methods ensure a reliable, objective, and fast workflow, and may be applied to other types of MLPA kits. Following CE of MLPA amplification products, the software automatically identified the peak area for each probe, normalized it in relation to the neighboring peak areas of the test sample, computed the ratio relative to a reference created from normal samples, and compensated the ratio for a side effect of the normalization procedure that scaled all chromosomally normal DNA peak areas slightly up or down depending on the kind of aneuploidy present. For chromosomes 13, 18, 21, X, and Y, probe-reliability-weighted mean ratio values and corresponding SDs were calculated, and the significance of being outside a reference interval around ratio 1.0 was tested. p ≤ 1% suggested aneuploidy and 1% < p ≤ 5% suggested potential aneuploidy. Individual peaks whose normalized area was situated more than 4 SD from the corresponding reference suggested possible partial deletion or gain. Sample quality was automatically assessed. Control probes were not required. Having used the software and methods for two years, we conclude that a reliable, objective, and fast workflow is obtained. [source]
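The normalize-then-ratio logic described in the abstract above can be sketched in a few lines. This is an illustrative reconstruction, not the authors' code: the neighbor window, the synthetic peak areas, and the fixed gain/loss cut-offs (1.3 and 0.7, standing in for the paper's significance test against a reference interval) are all assumptions.

```python
# Sketch of MLPA ratio analysis: normalize each probe's peak area against its
# neighbors, divide by the identically normalized reference, and flag ratios
# far from 1.0. Thresholds and data are invented for illustration.

def normalized_ratios(sample_areas, reference_areas, window=2):
    """Normalize each peak area by the mean of its neighbors, then take
    the ratio against the identically normalized reference sample."""
    def local_norm(areas, i):
        lo, hi = max(0, i - window), min(len(areas), i + window + 1)
        neighbors = [a for j, a in enumerate(areas[lo:hi], lo) if j != i]
        return areas[i] / (sum(neighbors) / len(neighbors))
    return [local_norm(sample_areas, i) / local_norm(reference_areas, i)
            for i in range(len(sample_areas))]

def flag_probe(ratio, gain=1.3, loss=0.7):
    """Crude per-probe call; the real software tested significance instead."""
    if ratio >= gain:
        return "possible gain"
    if ratio <= loss:
        return "possible deletion"
    return "normal"

sample = [100, 95, 160, 102, 98]      # synthetic peak areas, probe 2 elevated
reference = [100, 100, 100, 100, 100]
ratios = normalized_ratios(sample, reference)
calls = [flag_probe(r) for r in ratios]
```

On this toy input only the elevated third probe is flagged; the published method additionally weighted probes by reliability and tested p-values rather than fixed cut-offs.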

    Predicting the Microstructure in Semi-Crystalline Thermoplastics using Software for the Simulation of Recrystallization in Metals

    W. Michaeli
    Software for the simulation of spherulite growth during the cooling of a quiescent melt has been developed by the authors and tested against experimental data. The tests verified good qualitative results: the calculated crystal microstructure and distribution (see Figure for a simulated pattern close to the mold surface) correspond well with the real morphology. [source]


    ABSTRACT. Lahars are hazardous events that can cause serious damage to people who live close to volcanic areas; several were registered at different times in the last century, such as at Mt St Helens (USA) in 1980, Nevado del Ruiz (Colombia) in 1985 and Mt Pinatubo (Philippines) in 1991. Risk maps are currently used by decision-makers to help them plan to mitigate the hazard-risk of lahars. Risk maps are produced based on a series of tenets that take into account the distribution and chronology of past lahar deposits, and basically two approaches have been used: (1) the use of Flow Simulation Software (FSS), which simulates flows along channels in a Digital Elevation Model, and (2) the Geochronological Method (GM), in which the mapping is based on the evaluation of lahar magnitude and frequency. This study addresses the production of a lahar risk map using the two approaches (FSS and GM) for a study area located at Popocatépetl volcano, Central Mexico. Santiago Xalitzintla, a town located on the northern flank of Popocatépetl volcano, where volcanic activity in recent centuries has triggered numerous lahars that have endangered local inhabitants, has been used for the case study. Results from FSS did not provide satisfactory findings because they were not consistent with lahar sediment observations made during fieldwork. By contrast, the GM produced results consistent with these observations, and therefore we use them to assess the hazard and produce the risk map for the study area. [source]

    Open Software: Can You Afford It? Can You Avoid It?

    Application of Six Sigma Methods for Improving the Analytical Data Management Process in the Environmental Industry

    Christopher M. French
    Honeywell applied the rigorous and well-documented Six Sigma quality-improvement approach to the complex, highly heterogeneous, and mission-critical process of remedial site environmental data management to achieve a sea change in terms of data quality, environmental risk reduction, and overall process cost reduction. The primary focus was to apply both qualitative and quantitative Six Sigma methods to improve electronic management of analytical laboratory data generated for environmental remediation and long-term monitoring programs. The process includes electronic data delivery, data QA/QC checking, data verification, data validation, database administration, regulatory agency reporting and linkage to spatial information, and real-time geographical information systems. The analysis identified that automated, centralized web-based software tools delivered through a Software as a Service (SaaS) model are optimal for improving the process, resulting in cost reductions while simultaneously improving data quality and long-term data usability and preservation. A pilot project was completed that quantified cycle time and cost improvements of 50% and 65%, respectively. [source]

    Nonlinear multiple regression methods: a survey and extensions

    Kenneth O. Cogger
    Abstract This paper reviews some nonlinear statistical procedures useful in function approximation, classification, regression and time-series analysis. Primary emphasis is on piecewise linear models such as multivariate adaptive regression splines, adaptive logic networks, hinging hyperplanes and their conceptual differences. Potential and actual applications of these methods are cited. Software for implementation is discussed, and practical suggestions are given for improvement. Examples show the relative capabilities of the various methods, including their ability for universal approximation. Copyright © 2010 John Wiley & Sons, Ltd. [source]
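The piecewise-linear model class this survey emphasizes (MARS-style basis functions, hinging hyperplanes) can be illustrated with a one-dimensional hinge fit. The knot search and the synthetic data below are purely illustrative and are not taken from any surveyed package.

```python
# Minimal hinge-function regression: fit y = a + b * max(0, x - c) by trying
# candidate knots c at the data points and solving ordinary least squares for
# (a, b) at each knot, keeping the knot with the smallest squared error.

def fit_hinge(xs, ys):
    best = None
    for c in xs:  # candidate knots at the data points
        h = [max(0.0, x - c) for x in xs]
        n = len(xs)
        sh, sy = sum(h), sum(ys)
        shh = sum(v * v for v in h)
        shy = sum(v * y for v, y in zip(h, ys))
        det = n * shh - sh * sh
        if det == 0:          # degenerate knot (all hinge values zero)
            continue
        b = (n * shy - sh * sy) / det
        a = (sy - b * sh) / n
        sse = sum((a + b * hv - y) ** 2 for hv, y in zip(h, ys))
        if best is None or sse < best[0]:
            best = (sse, a, b, c)
    return best  # (sse, intercept, slope, knot)

# Data generated from a true hinge at x = 3: flat at 1, then rising with slope 2.
xs = [0, 1, 2, 3, 4, 5, 6]
ys = [1.0, 1.0, 1.0, 1.0, 3.0, 5.0, 7.0]
sse, a, b, c = fit_hinge(xs, ys)
```

Full MARS and hinging-hyperplane algorithms add and prune many such terms over multivariate inputs; this shows only the single-basis building block.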

    Penetration of propylene glycol into dentine

    E. V. Cruz
    Abstract Cruz EV, Kota K, Huque J, Iwaku M, Hoshino E. Penetration of propylene glycol into dentine. International Endodontic Journal, 35, 330-336, 2002. Aim: This study aimed to evaluate penetration of propylene glycol into root dentine. Methodology: Safranin O in propylene glycol and in distilled water was introduced into root canals with and without an artificial smear layer. Dye diffusion through dentinal tubules was determined spectrophotometrically. The time required for dye to exit through the apical foramen using propylene glycol and distilled water as vehicles was also determined. The extent and areas of dye penetration on the split surfaces of roots were assessed using Adobe Photoshop and NIH Image software. Results: Propylene glycol allowed dye to exit faster through the apical foramen. The area and depth of dye penetration with propylene glycol were significantly greater than with distilled water (P < 0.0001). The smear layer significantly delayed the penetration of dye. Conclusion: Propylene glycol delivered dye through the root canal system rapidly and more effectively, indicating its potential use in delivering intracanal medicaments. [source]
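The image-based measurement of dye penetration can be illustrated with simple pixel counting: threshold a grayscale image of the split root surface and report the stained area as a fraction of the region of interest. This is a generic sketch of the kind of area quantification done in Adobe Photoshop or NIH Image, with an invented image and threshold.

```python
# Count dye-stained pixels in a grayscale image to estimate penetration area.

def stained_fraction(image, threshold):
    """image: list of rows of gray values (0 = dark stain, 255 = bright dentine).
    Pixels at or below `threshold` count as dye-penetrated."""
    stained = sum(1 for row in image for px in row if px <= threshold)
    total = sum(len(row) for row in image)
    return stained / total

# 4x4 synthetic image: dye (values below 100) penetrating from the left edge.
image = [
    [30, 40, 200, 220],
    [35, 50, 210, 230],
    [20, 45, 60, 215],
    [25, 35, 205, 225],
]
frac = stained_fraction(image, threshold=100)  # 9 of 16 pixels are stained
```

Multiplying the fraction by the imaged area (in mm²) would give the absolute penetration area compared between the two vehicles.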

    Identification of root canals in molars by tuned-aperture computed tomography

    R. Nance
    Abstract Aim: To compare the tuned-aperture computed tomography (TACT) system of imaging to conventional D-speed film for the ability to identify root canals in extracted human molars. Methodology: Thirteen maxillary and six mandibular human molars were mounted in acrylic blocks to simulate clinical conditions by surrounding the teeth with a radiodense structure. The teeth were then imaged with conventional D-speed film using a standard paralleling technique, and with a modified orthopantomograph OP100 machine using a Schick no. 2 size CCD sensor as the image receptor. The source images were registered and TACT slices were generated using TACT Workbench™ Software. Three observers were asked to identify the number of canals in the conventional film group and the TACT image group using specific criteria. Ground truth was established by cross-sectioning the teeth at the coronal, middle, and apical thirds of the roots and directly visualizing the root canal morphology. Results: TACT imaging detected 36% of fourth canals in maxillary molars and 80% of third canals in mandibular molars. Conventional film detected 0% of fourth canals in maxillary molars and 0% of third canals in mandibular molars. The differences in canal detection between the two techniques were statistically significant (Wilcoxon matched pair sign rank test, P = 0.001). Conclusions: In this study, the TACT system of digital imaging was superior to conventional film in the detection of root canals in human molars and may be useful for the detection of root canals that would probably be missed on conventional X-ray examination. [source]

    Spectrophotometric variable-concentration kinetic experiments applied to inorganic reactions

    Giuseppe Alibrandi
    The dependence of the observed rate constant of inorganic substitution reactions on the concentration of nucleophilic reagents was obtained by single variable-parameter kinetic runs. The experiments were carried out spectrophotometrically, varying the concentration of the nucleophile inside the reaction vessel. Software and apparatus were developed for easy and rapid performance. The method gives accurate results and a saving in time by a factor of up to 100 compared to conventional methods. © 2003 Wiley Periodicals, Inc. Int J Chem Kinet 35: 497-502, 2003 [source]
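The concentration dependence described above is conventionally modelled for such substitutions as k_obs = k1 + k2·[Nu], where k1 is the solvolytic term and k2 the nucleophile-dependent term; a least-squares line through (concentration, k_obs) pairs recovers both. The rate data below are synthetic and purely illustrative.

```python
# Simple least-squares fit of k_obs = k1 + k2 * [Nu] from kinetic runs.

def linear_fit(xs, ys):
    """Ordinary least squares for y = intercept + slope * x."""
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    return intercept, slope

conc = [0.01, 0.02, 0.04, 0.08]            # [Nu], mol/L (synthetic)
kobs = [0.0030, 0.0050, 0.0090, 0.0170]    # s^-1, generated with k1=0.001, k2=0.2
k1, k2 = linear_fit(conc, kobs)
```

With the variable-concentration technique the whole (conc, kobs) curve comes from a single run rather than one run per concentration, which is where the reported time saving arises.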

    Effectiveness of interventions that assist caregivers to support people with dementia living in the community: a systematic review

    Deborah Parker BA, MSocSci
Executive summary. Objectives: The objective of this review was to assess the effectiveness of interventions that assist caregivers to provide support for people living with dementia in the community. Inclusion criteria. Types of participants: Adult caregivers who provide support for people with dementia living in the community (non-institutional care). Types of interventions: Interventions designed to support caregivers in their role, such as skills training, education to assist in caring for a person living with dementia, and support groups/programs; also formal approaches to care designed to support caregivers in their role, such as care planning, case management and specially designated members of the healthcare team, for example a dementia nurse specialist or volunteers trained in caring for someone with dementia. Types of studies: This review considered any meta-analyses, systematic reviews, randomised control trials, quasi-experimental studies, cohort studies, case control studies and observational studies without control groups that addressed the effectiveness of interventions that assist caregivers to provide support for people living with dementia in the community. Search strategy: The search sought to identify published studies from 2000 to 2005 through the use of electronic databases. Only studies in English were considered for inclusion. The initial search was conducted of the databases CINAHL, MEDLINE and PsychINFO using search strategies adapted from the Cochrane Dementia and Cognitive Improvement Group. A second, more extensive search was then conducted using the appropriate Medical Subject Headings (MeSH) and keywords for other available databases. Finally, hand searching of reference lists of articles retrieved and of core dementia, geriatric and psychogeriatric journals was undertaken. 
Assessment of quality: Methodological quality of each of the articles was assessed by two independent reviewers using an appraisal checklist developed by the Joanna Briggs Institute and based on the work of the Cochrane Collaboration and the Centre for Reviews and Dissemination. Data collection and analysis: Standardised mean differences or weighted mean differences and their 95% confidence intervals were calculated for each included study reported in the meta-analysis. Results from comparable groups of studies were pooled in statistical meta-analysis using Review Manager software from the Cochrane Collaboration. Heterogeneity between combined studies was tested using the standard chi-square test. Where statistical pooling was not appropriate or possible, the findings are summarised in narrative form. Results: A comprehensive search of relevant databases, hand searching and cross-referencing found 685 articles that were assessed for relevance to the review. Eighty-five papers appeared to meet the inclusion criteria based on title and abstract, and the full paper was retrieved. Of the 85 full papers reviewed, 40 were accepted for inclusion: three were systematic reviews, three were meta-analyses, and the remaining 34 were randomised controlled trials. For the randomised controlled trials that could be included in a meta-analysis, standardised mean differences or weighted mean differences and their 95% confidence intervals were calculated for each. Results from comparable groups of studies were pooled in statistical meta-analysis using Review Manager software, and heterogeneity between combined studies was assessed using the chi-square test. Where statistical pooling was not appropriate or possible, the findings are summarised in narrative form. The results are discussed in two main sections. 
Firstly, it was possible to assess the effectiveness of different types of caregiver interventions on the outcome categories of depression, health, subjective well-being, self-efficacy and burden. Secondly, results are reported by main outcome category. For each of these sections, meta-analysis was conducted where possible; otherwise, a narrative summary describes the findings. Effectiveness of intervention type: Four categories of intervention were included in the review: psycho-educational, support, multi-component and other. Psycho-educational: Thirteen studies used psycho-educational interventions, and all but one showed positive results across a range of outcomes. Eight studies were entered in a meta-analysis. No significant impact of psycho-educational interventions was found for the outcome categories of subjective well-being, self-efficacy or health. However, small but significant results were found for the categories of depression and burden. Support: Seven studies discussed support-only interventions and two of these showed significant results. These two studies were suitable for meta-analysis and demonstrated a small but significant improvement in caregiver burden. Multi-component: Twelve of the studies report multi-component interventions and 10 of these report significant outcomes across a broad range of outcome measures including self-efficacy, depression, subjective well-being and burden. Unfortunately, because of the heterogeneity of study designs and outcome measures, no meta-analysis was possible. Other interventions: Other interventions included the use of exercise or nutrition, which resulted in improvements in psychological distress and health benefits. Case management and a computer-aided support intervention provided mixed results. One cognitive behavioural therapy study reported a reduction in anxiety and positive impacts on patient behaviour. 
Effectiveness of interventions using specific outcome categories: In addition to analysis by type of intervention, it was possible to analyse results based on some outcome categories that were used across the studies. In particular, the impact of interventions on caregiver depression was available for meta-analysis from eight studies. This indicated that multi-component and psycho-educational interventions showed a small but significant positive effect on caregiver depression. Five studies using the outcome category of caregiver burden were entered into a meta-analysis, and findings indicated that there were no significant effects of any of the interventions. No meta-analysis was possible for the outcome categories of health, self-efficacy or subjective well-being. Implications for practice: From this review there is evidence to support the use of well-designed psycho-educational or multi-component interventions for caregivers of people with dementia who live in the community. Factors that appear to contribute positively to effective interventions are those which:

  • Provide opportunities within the intervention for the person with dementia as well as the caregiver to be involved
  • Encourage active participation in educational interventions for caregivers
  • Offer individualised programs rather than group sessions
  • Provide information on an ongoing basis, with specific information about services and coaching regarding their new role
  • Target the care recipient, particularly by reduction in behaviours

Factors which do not appear to have benefit in interventions are those which:

  • Simply refer caregivers to support groups
  • Only provide self-help materials
  • Only offer peer support [source]
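The pooling step this review describes (standardised mean differences combined in Review Manager, heterogeneity checked with a chi-square test) can be sketched as fixed-effect inverse-variance weighting with Cochran's Q as the heterogeneity statistic. The effect sizes below are invented for illustration, not taken from the review.

```python
# Fixed-effect inverse-variance meta-analysis of standardised mean differences
# (SMDs), with Cochran's Q as the chi-square heterogeneity statistic.
import math

def pool_fixed_effect(effects, variances):
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    # Q ~ chi-square with k-1 degrees of freedom under homogeneity.
    q = sum(w * (e - pooled) ** 2 for w, e in zip(weights, effects))
    ci = (pooled - 1.96 * se, pooled + 1.96 * se)
    return pooled, ci, q

smds = [-0.30, -0.15, -0.25]          # per-study SMDs (negative = less depression)
variances = [0.010, 0.020, 0.015]     # per-study sampling variances
pooled, ci, q = pool_fixed_effect(smds, variances)
```

A "small but significant" effect of the kind reported corresponds to a pooled SMD near -0.2 to -0.3 whose confidence interval excludes zero; a large Q relative to k-1 would argue for a random-effects model instead.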

    Networking lessons in delivering 'Software as a Service', Part II

    David Greschler
    In part I of this paper, we described the origins and evolution of Software as a Service (SaaS) and its value proposition to Corporate IT, Service Providers, Independent Software Vendors and End Users. SaaS is a model in which software applications are deployed, managed, updated and supported on demand, like a utility, and are served to users centrally using servers that are internal or external to the enterprise. Applications are no longer installed locally on a user's desktop PC; instead, upgrades, licensing and version control, metering, support and provisioning are all managed at the server level. In part II we examine the lessons learned in researching, building and running a SaaS service. Copyright © 2002 John Wiley & Sons, Ltd. [source]

    A novel virtually centered broad wall longitudinal slot for antenna application

    A. Anand
    Abstract A virtually centered broad-wall longitudinal slot antenna has been designed and studied using commercial electromagnetic simulation software (CST Microwave Studio). The results obtained for the S21 of this antenna were compared with measured data to assess the accuracy of the software. The results obtained for the slot were also compared with the more common offset broad-wall longitudinal slot, and different slot characteristics were studied by varying the slot parameters. © 2010 Wiley Periodicals, Inc. Int J RF and Microwave CAE, 2010. [source]

    Nonparametric population modeling of valproate pharmacokinetics in epileptic patients using routine serum monitoring data: implications for dosage

    I. B. Bondareva
Summary Therapeutic drug monitoring (TDM) of valproate (VAL) is important in the optimization of its therapy. The aim of the present work was to evaluate the ability of TDM using model-based, goal-oriented Bayesian adaptive control to help in planning, monitoring, and adjusting individualized VAL dosing regimens. USC*PACK software and routine TDM data were used to estimate population and individual pharmacokinetics of two commercially available VAL formulations in epileptic adult and pediatric patients on chronic VAL monotherapy. The population parameter values found were in agreement with values reported earlier. A statistically significant (P < 0.001) difference in median values of the absorption rate constant was found between enteric-coated and sustained-release VAL formulations. In our patients (aged 0.25-53 years), VAL clearance declined with age until adult values were reached at about age 10. Because of the large interindividual variability in PK behavior, the median population parameter values gave poor predictions of the observed VAL serum concentrations. In contrast, the Bayesian individualized models gave good predictions for all subjects in all populations. The Bayesian posterior individualized PK models were based on the population models described here, and most patients had two (a peak and a trough) measured serum concentrations. Repeated consultations and adjusted dosage regimens with some patients allowed us to evaluate any possible influence of dose-dependent VAL clearance on the precision of total VAL concentration predictions based on TDM data and the proposed population models. These nonparametric expectation maximization (NPEM) population models thus provide a useful tool for planning an initial dosage regimen of VAL to achieve desired target peak and trough serum concentration goals, coupled with TDM soon thereafter, as a peak-trough pair of serum concentrations, and Bayesian fitting to individualize the PK model for each patient. 
The nonparametric PK parameter distributions in these NPEM population models also permit their use by the new method of ,multiple model' dosage design, which allows the target goals to be achieved specifically with maximum precision. Software for both types of Bayesian adaptive control is now available to employ these population models in clinical practice. [source]
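The Bayesian individualization idea described above can be reduced to a toy example (this is not USC*PACK or NPEM): give clearance a lognormal population prior, predict the steady-state level as Css = dose rate / CL, and pick the maximum a posteriori CL given one measured level. All parameter values below are invented for illustration.

```python
# Toy MAP-Bayesian individualization of clearance from one measured level.
# Posterior = Gaussian measurement error around the model prediction plus a
# lognormal prior on clearance; minimized by brute-force grid search.
import math

def map_clearance(measured, dose_rate, prior_median, prior_cv, assay_sd):
    def neg_log_post(cl):
        pred = dose_rate / cl                                  # Css = R / CL
        like = ((measured - pred) / assay_sd) ** 2             # data misfit
        prior = (math.log(cl / prior_median) / prior_cv) ** 2  # prior penalty
        return like + prior
    grid = [prior_median * math.exp(s / 100.0) for s in range(-100, 101)]
    return min(grid, key=neg_log_post)

# Invented values: population median CL 0.5 L/h with 30% CV, dose rate
# 40 mg/h, one measured steady-state level of 70 mg/L, assay SD 5 mg/L.
cl_map = map_clearance(measured=70.0, dose_rate=40.0,
                       prior_median=0.5, prior_cv=0.3, assay_sd=5.0)
pred = 40.0 / cl_map   # individualized prediction used to adjust the dose
```

The MAP estimate lands between the prior median (0.5) and the data-only estimate (40/70 ≈ 0.57), weighted by how informative each is; the nonparametric populations in the paper replace the single prior mode with many support points, enabling the "multiple model" dosage design mentioned above.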

    Open Source Software: Private Provision of a Public Good

    Justin Pappas Johnson
    A simple model of open source software (as typified by the GNU-Linux operating system) is presented. Individual user-programmers decide whether to invest their own effort to develop a software enhancement that will become a public good if so developed. The effect of changing the population size of user-programmers is considered; finite and asymptotic results are given. Welfare results are presented. It is shown that whether development will increase when applications have a modular structure depends on whether the developer base exceeds a critical size. Potential explanations of several stylized facts are given, including why certain useful programs don't get written. [source]
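The critical-size result described above can be given a back-of-the-envelope illustration: suppose a module ships only when at least m of the N user-programmers choose to contribute, each independently with probability p. The probability of development then rises sharply once N clears a critical size. The parameters are illustrative and are not taken from the paper's model, which derives contribution decisions endogenously.

```python
# Threshold public-good provision: P(at least m of n contributors) under a
# Binomial(n, p) model, showing the critical-mass effect of population size.
from math import comb

def prob_developed(n, p, m):
    """P(at least m of n user-programmers contribute), X ~ Binomial(n, p)."""
    return sum(comb(n, k) * p ** k * (1 - p) ** (n - k) for k in range(m, n + 1))

p, m = 0.05, 3                       # invented contribution rate and threshold
curve = {n: prob_developed(n, p, m) for n in (10, 50, 200)}
```

With these numbers a base of 10 programmers almost never clears the threshold while a base of 200 almost always does, mirroring the paper's point that modularity helps only once the developer base exceeds a critical size.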

    The Human Ageing Genomic Resources: online databases and tools for biogerontologists

    AGING CELL, Issue 1 2009
    João Pedro De Magalhães
    Summary Aging is a complex, challenging phenomenon that requires multiple, interdisciplinary approaches to unravel its puzzles. To assist basic research on aging, we developed the Human Ageing Genomic Resources (HAGR). This work provides an overview of the databases and tools in HAGR and describes how the gerontology research community can employ them. Several recent changes and improvements to HAGR are also presented. The two centrepieces in HAGR are GenAge and AnAge. GenAge is a gene database featuring genes associated with aging and longevity in model organisms, a curated database of genes potentially associated with human aging, and a list of genes tested for their association with human longevity. A myriad of biological data and information is included for hundreds of genes, making GenAge a reference for research that reflects our current understanding of the genetic basis of aging. GenAge can also serve as a platform for the systems biology of aging, and tools for the visualization of protein,protein interactions are also included. AnAge is a database of aging in animals, featuring over 4000 species, primarily assembled as a resource for comparative and evolutionary studies of aging. Longevity records, developmental and reproductive traits, taxonomic information, basic metabolic characteristics, and key observations related to aging are included in AnAge. Software is also available to aid researchers in the form of Perl modules to automate numerous tasks and as an SPSS script to analyse demographic mortality data. The HAGR are available online at [source]

    Image Analysis Based Quantification of Bacterial Volume Change with High Hydrostatic Pressure

    M. Pilavtepe-Çelik
    ABSTRACT: Scanning electron microscopy (SEM) images of Staphylococcus aureus 485 and Escherichia coli O157:H7 933 were taken after pressure treatments at 200 to 400 MPa. Software developed for this purpose was used to analyze SEM images and to calculate the change in view area and volume of cells. A significant increase in average cell view area and volume for S. aureus 485 was observed in response to pressure treatment at 400 MPa. Cell view area for E. coli O157:H7 933 significantly increased at 325 MPa, the maximum pressure treatment tested against this pathogen. In contrast to S. aureus, cells of E. coli O157:H7 exhibited a significant increase in average view area and volume at 200 MPa. The pressure-induced increase in these parameters may be attributed to modifications in membrane properties, for example, denaturation of membrane-bound proteins and pressure-induced phase transition of the membrane lipid bilayer. [source]
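The view-area and volume quantification can be sketched as follows: count cell pixels in a binary SEM mask to get the projected (view) area, then, for a roughly spherical coccus such as S. aureus, estimate volume from the equivalent circular diameter. The mask, pixel size, and sphere assumption are all illustrative, not the authors' method.

```python
# Projected cell area from a binary mask, plus a volume estimate under an
# assumed spherical cell shape (reasonable for cocci, not for rods).
import math

def view_area_and_volume(mask, pixel_area_um2):
    pixels = sum(sum(row) for row in mask)
    area = pixels * pixel_area_um2                  # projected view area, um^2
    radius = math.sqrt(area / math.pi)              # equivalent-circle radius
    volume = (4.0 / 3.0) * math.pi * radius ** 3    # sphere assumption
    return area, volume

mask = [
    [0, 1, 1, 0],
    [1, 1, 1, 1],
    [1, 1, 1, 1],
    [0, 1, 1, 0],
]   # 12 cell pixels in a synthetic segmented SEM image
area, volume = view_area_and_volume(mask, pixel_area_um2=0.01)
```

For a rod such as E. coli one would instead measure length and width from the mask and use a cylinder-with-hemispherical-caps model; comparing areas and volumes before and after pressure treatment gives the reported changes.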

    Interactive magnetic resonance cholangiography (MRC) with adaptive averaging

    Martin J. Graves MSc
    Abstract Purpose: To implement and evaluate a technique for adaptively averaging 2D magnetic resonance cholangiography (MRC) images obtained using an interactive imaging system, with a view to improving image quality at reduced fields of view (FOVs). Materials and Methods: Images were obtained using an interactive implementation of a single-shot half-Fourier rapid acquisition with relaxation enhancement (RARE) technique. Software was developed for adaptively averaging images, and an evaluation was performed in a phantom and a cohort of 10 patients referred for standard MRC. Adaptively averaged and standard single-shot MRC images were evaluated with respect to their ability to demonstrate the common bile duct and main left and right intrahepatic duct branches. Results: In all patient studies there was no difference in the ability of either the adaptive technique or the standard single-shot method to demonstrate the common bile duct and the main left and right intrahepatic duct branches. However, in seven of the 10 patient studies the adaptive technique provided better visualization of the peripheral bile duct system (P = 0.035; sign test). There was no difference in the diagnostic confidence of the two techniques (P = 0.32, Wilcoxon signed-rank test). Conclusion: Adaptive averaging of MRC images obtained using an interactive imaging paradigm significantly improves visualization of peripheral intrahepatic ducts. J. Magn. Reson. Imaging 2006. © 2006 Wiley-Liss, Inc. [source]
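One way to read "adaptive averaging" is a running average in which each new single-shot image is weighted by its agreement with the current average, so that motion-corrupted shots contribute less. This is a generic sketch, not the authors' algorithm; the "images" are tiny synthetic pixel vectors and the weighting function is an assumption.

```python
# Running weighted average of single-shot frames; each frame's weight falls
# off with its mean squared difference from the current average, so outlier
# (e.g. motion-corrupted) frames are down-weighted.

def adaptive_average(frames, sharpness=1.0):
    avg = [float(v) for v in frames[0]]
    total_w = 1.0
    for frame in frames[1:]:
        mse = sum((a - f) ** 2 for a, f in zip(avg, frame)) / len(avg)
        w = 1.0 / (1.0 + sharpness * mse)     # low weight for outlier frames
        total_w += w
        # Incremental update of the weighted mean.
        avg = [a + (w / total_w) * (f - a) for a, f in zip(avg, frame)]
    return avg

good = [10.0, 20.0, 30.0]
frames = [good, [10.5, 19.5, 30.5], [40.0, 50.0, 5.0], [9.5, 20.5, 29.5]]
result = adaptive_average(frames)
```

The third frame is a gross outlier; a naive mean would be pulled far from the consistent frames, while the adaptive result stays close to them, which is the intuition behind the improved peripheral-duct visualization.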

    The Pros and Cons of Data Analysis Software for Qualitative Research

    Winsome St John
    Purpose: To explore the use of computer-based qualitative data analysis software packages. Scope: The advantages and capabilities of qualitative data analysis software are described and concerns about their effects on methods are discussed. Findings: Advantages of using qualitative data analysis software include being freed from manual and clerical tasks, saving time, being able to deal with large amounts of qualitative data, having increased flexibility, and having improved validity and auditability of qualitative research. Concerns include increasingly deterministic and rigid processes, privileging of coding, and retrieval methods; reification of data, increased pressure on researchers to focus on volume and breadth rather than on depth and meaning, time and energy spent learning to use computer packages, increased commercialism, and distraction from the real work of analysis. Conclusions: We recommend that researchers consider the capabilities of the package, their own computer literacy and knowledge of the package, or the time required to gain these skills, and the suitability of the package for their research. The intelligence and integrity that a researcher brings to the research process must also be brought to the choice and use of tools and analytical processes. Researchers should be as critical of the methodological approaches to using qualitative data analysis software as they are about the fit between research question, methods, and research design. [source]

    In vitro analysis of the cement mantle of femoral hip implants: Development and validation of a CT-scan based measurement tool

    Thierry Scheerlinck
    Abstract We developed, validated, and assessed the inter- and intraobserver reliability of a CT-scan based measurement tool to evaluate morphological characteristics of the bone-cement-stem complex of hip implants in cadaver femurs. Two different models were investigated: the stem-cavity model, using a double-tapered polished femoral stem that is removed after cement curing, and the plastic-replica model, using a stereolithographic stem replica that is left in place during CT-scanning. Software was developed to segment and analyze consecutive CT-images and identify the contours of bone, cement, and stem based on their respective gray values. Volume parameters (whole specimen, cement, stem, air contents of bone and cement), concentricity parameters (distances between the centroids of stem and cement, cement and bone, and stem and bone), contact surfaces (bone/air and cement/bone), and bone and cement mantle thickness parameters were calculated. A three-dimensional protocol was developed to evaluate the minimal mantle thickness out of the CT-plane. The average accuracy for surfaces within CT-images was 7.47 mm² (1.80%); for bone and cement mantle thickness it was 0.51 mm (9.39%); for distances between centroids it was 0.38 mm (18.5%); and for contours it was 0.27 mm (2.57%). The intra- and interobserver reliability of air content in bone and cement was suboptimal (intraclass correlation coefficient (ICC) as low as 0.54, with an average ICC of 0.85). All other variables were reliable (ICC > 0.81; average ICC: 0.96). This in vitro technique can assess characteristics of cement mantles produced by different cementing techniques, stem types, or centralizers. © 2005 Orthopaedic Research Society. Published by Elsevier Ltd. All rights reserved. [source]
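    The core of the concentricity measurement (segment each material by its gray values, then measure the distance between the centroids of two segments) can be sketched in a few lines. The threshold ranges below are illustrative placeholders; the study derived them from the respective gray values of bone, cement, and stem:

```python
import numpy as np

def centroid(mask):
    """Center of mass (row, col) of a boolean segmentation mask."""
    rows, cols = np.nonzero(mask)
    return rows.mean(), cols.mean()

def concentricity(image, range_a, range_b):
    """In-plane distance between the centroids of two gray-value
    classes of one CT slice (e.g., stem vs. cement). Threshold
    ranges are half-open intervals [low, high)."""
    mask_a = (image >= range_a[0]) & (image < range_a[1])
    mask_b = (image >= range_b[0]) & (image < range_b[1])
    (r1, c1), (r2, c2) = centroid(mask_a), centroid(mask_b)
    return np.hypot(r1 - r2, c1 - c2)
```

    Multiplying the pixel distance by the in-plane voxel size would convert the result to millimeters, matching the accuracy figures quoted in the abstract.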

    Using a Motion-Capture System to Record Dynamic Articulation for Application in CAD/CAM Software

    Oliver Röhrle PhD
    Abstract Purpose: One of the current limitations of computer software programs for the virtual articulation of the opposing teeth is the static nature of the intercuspal position. Currently, software programs cannot identify eccentric occlusal contacts during masticatory cyclic movements of the mandible. Materials and Methods: Chewing trajectories with six degrees of freedom (DOF) were recorded and imposed on a computer model of one subject's maxillary and mandibular teeth. The computer model was generated from a set of high-resolution μ-CT images. To obtain natural chewing trajectories with six DOF, an optoelectronic motion-capture system (VICON MX) was used. For this purpose, a special mandibular motion-tracking appliance was developed for this subject. Results: Mandibular movements while chewing elastic and plastic food samples were recorded and reproduced with the computer model. Examples of mandibular movements at intraoral points are presented for elastic and plastic food samples. The potential of such a kinematic computer model to analyze the dynamic nature of an occlusion was demonstrated by investigating the interaction of the second molars and the direction of the biting force during a chewing cycle. Conclusions: The article described a methodology that measured mandibular movements during mastication for one subject. This produced kinematic input to 3D computer modeling for the production of a virtual dynamic articulation that is suitable for incorporation into dental CAD/CAM software. [source]
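    Imposing a recorded six-DOF trajectory on a tooth model amounts to applying, per captured sample, a rigid transform (three translations, three rotations) to the model's vertices. A minimal sketch follows; the XYZ Euler-angle convention and the pose tuple layout are assumptions, since the abstract does not specify the parameterization:

```python
import numpy as np

def rotation_matrix(rx, ry, rz):
    """Rotation from XYZ Euler angles in radians (assumed convention)."""
    cx, sx = np.cos(rx), np.sin(rx)
    cy, sy = np.cos(ry), np.sin(ry)
    cz, sz = np.cos(rz), np.sin(rz)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def apply_pose(vertices, pose):
    """Move mandible-model vertices (N x 3) by one six-DOF sample
    (tx, ty, tz, rx, ry, rz) from a recorded chewing trajectory."""
    tx, ty, tz, rx, ry, rz = pose
    R = rotation_matrix(rx, ry, rz)
    return vertices @ R.T + np.array([tx, ty, tz])
```

    Replaying the trajectory sample by sample, and testing for surface proximity between the moved mandibular mesh and the static maxillary mesh, is what lets a model of this kind report eccentric occlusal contacts through the chewing cycle.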

    Linear Mixed Models: A Practical Guide Using Statistical Software

    R. Allan Reese
    No abstract is available for this article. [source]

    Statistical analysis of optimal culture conditions for Gluconacetobacter hansenii cellulose production

    S.A. Hutchens
    Abstract Aim: The purpose of this study was to analyse the effects of different culture parameters on Gluconacetobacter hansenii (ATCC 10821) to determine which conditions maximized cellulose production. Methods and Results: Five culture factors were investigated: carbon source, addition of ethanol, inoculation ratio, pH and temperature. JMP software (SAS, Cary, NC, USA) was used to design the experiment using a fractional factorial design. After 22 days of static culture, the cellulose produced by the bacteria was harvested, purified and dried to compare the cellulose yields. The results were analysed by fitting the data to a first-order model with two-factor interactions. Conclusions: The study confirmed that carbon source, addition of ethanol, and temperature were significant factors in cellulose production by this G. hansenii strain. While pH alone did not significantly affect average cellulose production, cellulose yields were affected by the interaction of pH with the carbon source. Culturing the bacteria on glucose at pH 6·5 produced more cellulose than at pH 5·5, while using mannitol at pH 5·5 produced more cellulose than at pH 6·5. The bacteria produced the most cellulose when cultured on mannitol, at pH 5·5, without ethanol, at 20°C. Inoculation ratio was not a significant factor, nor was it involved in any significant two-factor interaction. Significance and Impact of the Study: These findings give insight into the conditions necessary to maximize cellulose production from this G. hansenii strain. In addition, this work demonstrates how a fractional factorial design can be used to test a large number of factors with an abbreviated set of experiments. Fitting a statistical model identified the significant factors as well as the significant two-factor interactions. [source]
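    A fractional factorial design halves (or further reduces) the run count of a full two-level factorial by aliasing one factor with an interaction of the others. For the five factors here, a common choice is a 16-run 2^(5-1) design with defining relation E = ABCD; the abstract does not state which fraction JMP generated, so this is an illustrative sketch:

```python
from itertools import product

def fractional_factorial_2_5_1():
    """Generate a 16-run 2^(5-1) two-level fractional factorial design
    for five factors (A-E), coding levels as -1/+1 and aliasing the
    fifth factor via the defining relation E = ABCD."""
    runs = []
    for a, b, c, d in product((-1, 1), repeat=4):
        e = a * b * c * d          # aliased fifth factor
        runs.append((a, b, c, d, e))
    return runs

design = fractional_factorial_2_5_1()
```

    Sixteen runs instead of 32 still allow a first-order model with two-factor interactions to be fitted, because each main effect is aliased only with a high-order interaction, which is usually assumed negligible.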

    Quantification of Video-Taped Images in Microcirculation Research Using Inexpensive Imaging Software (Adobe Photoshop)

    MICROCIRCULATION, Issue 2 2000
    Joachim Brunner
    ABSTRACT Background: Study end-points in microcirculation research are usually videotaped images rather than numeric computer print-outs. Analysis of these videotaped images for the quantification of microcirculatory parameters usually requires computer-based image analysis systems. Most software programs for image analysis are custom-made, expensive, and limited in their applicability to selected parameters and study end-points. Methods and Results: We demonstrate herein that inexpensive, commercially available computer software (Adobe Photoshop), run on a Macintosh G3 computer with a built-in graphics capture board, provides versatile, easy-to-use tools for the quantification of digitized video images. Using images obtained by intravital fluorescence microscopy from the pre- and postischemic muscle microcirculation in the skinfold chamber model in hamsters, Photoshop allows simple and rapid quantification of (i) microvessel diameters, (ii) functional capillary density, and (iii) postischemic leakage of FITC-labeled high-molecular-weight dextran from postcapillary venules. We present evidence of the technical accuracy of the software tools and of a high degree of interobserver reliability. Conclusions: Inexpensive, commercially available imaging programs (i.e., Adobe Photoshop) provide versatile tools for image analysis with a wide range of potential applications in microcirculation research. [source]
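    The diameter measurement behind (i) reduces to a very simple operation: draw an intensity profile across a fluorescently labeled vessel, count the contiguous pixels above a brightness threshold, and convert with the spatial calibration. The study performed this with Photoshop's selection and measurement tools; the function below is a generic sketch of the same idea, with an assumed calibration parameter:

```python
import numpy as np

def vessel_diameter(profile, threshold, um_per_pixel):
    """Estimate vessel diameter from a 1D intensity profile drawn
    perpendicular to a vessel: length of the longest contiguous run
    of above-threshold pixels, scaled by the calibration factor."""
    above = np.asarray(profile) > threshold
    best = run = 0
    for v in above:
        run = run + 1 if v else 0   # extend or reset the current run
        best = max(best, run)
    return best * um_per_pixel
```

    Functional capillary density and dextran leakage follow the same pattern at the image level: threshold the fluorescence signal, then count perfused-capillary length or integrate extravascular intensity per area.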

    Two Bits: The Cultural Significance of Free Software by Christopher Kelty

    No abstract is available for this article. [source]


    ABSTRACT. Software design is much more important for individual-based models (IBMs) than it is for conventional models, for three reasons. First, the results of an IBM are the emergent properties of a system of interacting agents that exist only in the software; unlike analytical model results, an IBM's outcomes can be reproduced only by exactly reproducing its software implementation. Second, outcomes of an IBM are expected to be complex and novel, making software errors difficult to identify. Third, an IBM needs 'systems software' that manages populations of multiple kinds of agents, often has nonlinear and multi-threaded process control, and simulates a wide range of physical and biological processes. General software guidelines for complex models are especially important for IBMs. (1) Have code critically reviewed by several people. (2) Follow prudent release management practices, keeping careful control over the software as changes are implemented. (3) Develop multiple representations of the model and its software; diagrams and written descriptions of code aid design and understanding. (4) Use appropriate and widespread software tools, which provide numerous major benefits; coding 'from scratch' is rarely appropriate. (5) Test the software continually, following a planned, multi-level, experimental strategy. (6) Provide tools for thorough, pervasive validation and verification. (7) Pay attention to how pseudorandom numbers are generated and used. Additional guidelines for IBMs include: (a) design the model's organization before starting to write code; (b) provide the ability to observe all parts of the model from the beginning; (c) make an extensive effort to understand how the model executes: how often different pieces of code are called, and by which objects; and (d) design the software to resemble the system being modeled, which helps maintain an understanding of the software. Strategies for meeting these guidelines include planning adequate resources for software development, using software professionals to implement models, and using tools like Swarm that are designed specifically for IBMs. [source]
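    Guidelines (7) and (b) are easy to violate silently, so they are worth a concrete illustration. The skeleton below (a hypothetical minimal IBM, not from the source) uses one explicitly seeded generator rather than the global RNG, which makes every run exactly reproducible, and records full model state at every step so any part of the model can be observed:

```python
import random

class Agent:
    def __init__(self, energy):
        self.energy = energy

def run_model(n_agents, steps, seed):
    """Minimal individual-based model skeleton: n_agents agents whose
    energy performs a random walk. Returns the per-step energy of
    every agent, so the whole model is observable after the run."""
    rng = random.Random(seed)      # one seeded generator, never the global one
    agents = [Agent(energy=10.0) for _ in range(n_agents)]
    history = []
    for _ in range(steps):
        for agent in agents:
            agent.energy += rng.uniform(-1.0, 1.0)   # stochastic process
        history.append([a.energy for a in agents])   # full observability
    return history
```

    Because the seed is an explicit input, reproducing an IBM outcome reduces to reproducing the software plus one number, which is exactly the reproducibility problem the abstract's first reason describes.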

    Predicting the Number of Defects Remaining in Operational Software

    P. J. Hartman Ph.D
    ABSTRACT Software is becoming increasingly critical to the Fleet as more and more commercial off-the-shelf (COTS) programs are being introduced in operating systems and applications. Program managers need to specify, contract, and manage the development and integration of software for warfare systems, condition-based monitoring, propulsion control, parts requisitions, and shipboard administration. The intention here is to describe the state of the art in Software Reliability Engineering (SRE) and defect prediction for commercial and military programs. The information presented here is based on data from the commercial software industry and shipboard program development. The strengths and weaknesses of four failure models are compared using these cases. The Logarithmic Poisson Execution Time (LPET) model best fit the data and satisfied the fundamental principles of reliability theory. The paper presents the procedures for defining software failures, tracking defects, and making spreadsheet predictions of the defects still remaining in the software after it has been deployed. Rules of thumb for the number of defects in commercial software and the relative expense required to fix these errors are provided for perspective. [source]

    Pilot Study Examining the Utility of Microarray Data to Identify Genes Associated with Weight in Transplant Recipients

    Ann Cashion
    Purpose/Methods: Obesity, a complex, polygenic disorder and a growing epidemic in transplant recipients, is a risk factor for chronic diseases. This secondary data analysis examined whether microarray technologies and bioinformatics could find differences in gene expression profiles between liver transplant recipients with low Body Mass Index (BMI < 29; n = 5) vs. high BMI (BMI > 29; n = 7). Blood was hybridized on the Human U133 Plus 2 GeneChip (Affymetrix) and analyzed using GeneSpring software. Results: Groups were similar in age and race, but not gender. Expression levels of 852 genes differed between the low and high BMI groups (P < 0.05). The majority (562) of the changes associated with high BMI were decreases in transcript levels. Among the 852 genes associated with BMI, 263 and 14 genes were affected greater than 2-fold or 5-fold, respectively. Following functional classification using Gene Ontology (GO), we found that 19 genes (P < 0.00008) belonged to defense response and 15 genes (P < 0.00006) belonged to immune response. Conclusion: These data could point the way toward therapeutic interventions and help identify those at risk. These results demonstrate that we can (1) extract high-quality RNA from immunosuppressed patients; and (2) manage large datasets and perform statistical and functional analysis. [source]
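    The 2-fold and 5-fold counts in the abstract come from a standard fold-change filter over per-gene group means, a step the study performed in GeneSpring. A generic sketch of that filter follows; the gene names and expression values in the test are illustrative, and the symmetric fold-change convention (>1 up-regulated, <-1 down-regulated) is an assumption:

```python
from statistics import mean

def fold_change(low_vals, high_vals, eps=1e-9):
    """Ratio of mean expression in the high-BMI group to the low-BMI
    group, reported as a symmetric fold change: a ratio of 0.5 becomes
    -2.0 (2-fold down). eps guards against division by zero."""
    ratio = (mean(high_vals) + eps) / (mean(low_vals) + eps)
    return ratio if ratio >= 1 else -1.0 / ratio

def filter_by_fold(genes, threshold):
    """Keep gene names whose absolute fold change meets the threshold,
    mirroring the abstract's 2-fold and 5-fold cutoffs. `genes` maps
    name -> (low-group values, high-group values)."""
    return [name for name, (lo, hi) in genes.items()
            if abs(fold_change(lo, hi)) >= threshold]
```

    In a full analysis this filter would be combined with a per-gene significance test (the abstract's P < 0.05) before the GO enrichment step.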