Automated System (automate + system)
Selected Abstracts

An Automated System for Argument Invention in Law Using Argumentation and Heuristic Search Procedures. RATIO JURIS, Issue 4 2005. DOUGLAS WALTON.
Argumentation schemes are forms of argument representing the premise-conclusion and inference structures of common types of arguments. Schemes especially useful in law represent defeasible arguments, such as argument from expert opinion. Argument diagramming is a visualization tool used to display a chain of connected arguments. One such tool, Araucaria, available free at http://araucaria.computing.dundee.ac.uk/, helps a user display an argument on the computer screen as an inverted tree structure with the ultimate conclusion as the root of the tree. These argumentation tools are applicable to analyzing a mass of evidence in a case at trial, in a manner already known in law using heuristic methods (Schum 1994) and Wigmore diagrams (Wigmore 1931). In this paper it is shown how they can be automated and applied to the task of inventing legal arguments. One important application is proof construction in trial preparation (Palmer 2003). [source]
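The inverted-tree structure described here maps naturally onto a small recursive data type; the sketch below is illustrative only, with invented field names and a naive all-premises-stand acceptability rule rather than Araucaria's actual data model:

# Minimal sketch of an argument diagram as an inverted tree. Field names and
# the acceptability rule are illustrative assumptions, not Araucaria's design.
from dataclasses import dataclass, field
from typing import List

@dataclass
class ArgumentNode:
    statement: str                      # the proposition at this node
    scheme: str = ""                    # e.g. "argument from expert opinion"
    defeated: bool = False              # set True if a critical question succeeds
    premises: List["ArgumentNode"] = field(default_factory=list)

    def accepted(self) -> bool:
        # A leaf stands unless defeated; an inner node needs all of its
        # premises to stand (a crude stand-in for defeasible support).
        if self.defeated:
            return False
        return all(p.accepted() for p in self.premises)

conclusion = ArgumentNode(
    "The defendant was at the scene",
    scheme="argument from expert opinion",
    premises=[ArgumentNode("Expert E says the DNA matches"),
              ArgumentNode("E is an expert in forensic genetics")])
print(conclusion.accepted())  # True until some premise is marked defeated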
A comparison of diagnostic efficacies among different reagent strips and automated cell count in spontaneous bacterial peritonitis. JOURNAL OF GASTROENTEROLOGY AND HEPATOLOGY, Issue 5 2010. Rungsun Rerknimitr.
Abstract. Background: Currently, the decision to give antibiotics to a patient with suspected spontaneous bacterial peritonitis (SBP) depends mainly on the result of a manual cell count, which requires a significant waiting period. Recently, many reports on the efficacies of reagent strips and a few reports on automated cell counts have become available, but there has been no direct comparison study. Aims: This prospective study assessed the diagnostic efficacies of different reagent strips (Aution, Multistix, Combur) and automated cell count. Methods and Results: A total of 250 paracenteses were performed. Forty specimens were obtained from patients with clinical suspicion of SBP; the rest were obtained from patients not suspected of SBP. Thirty of the 250 samples (12%) were diagnosed as SBP by manual cell count. The automated system provided higher values for SBP diagnosis in all parameters (sensitivity, specificity, PPV, NPV, and accuracy: 87.5–99.1%), whereas the strip tests gave lower values in all parameters (80–98.6%). Multistix provided the lowest sensitivity (80%). The false negative rates of the Aution, Multistix, and Combur tests and the automated cell count were 10%, 20%, 10%, and 3.3%, respectively. By lowering the cut-off for SBP diagnosis with the automated system to 200 cells/mm3, there were no false negatives. Conclusions: Compared with reagent strips, automated cell count is a better screening tool for SBP diagnosis because it provides higher validity scores and a lower false negative rate. However, because discrepancies in cell count readings may occur, we suggest using a lower cut-off for SBP diagnosis with the automated system. [source]
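The validity figures quoted above (sensitivity, specificity, PPV, NPV, accuracy) all derive from a 2x2 confusion table; a minimal sketch, with invented counts loosely shaped on the study's 250 samples rather than its actual data:

# Standard screening-test metrics from true/false positive and negative counts.
def diagnostic_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
    return {
        "sensitivity": tp / (tp + fn),   # true positives among diseased
        "specificity": tn / (tn + fp),   # true negatives among healthy
        "ppv":         tp / (tp + fp),   # precision of a positive result
        "npv":         tn / (tn + fn),
        "accuracy":    (tp + tn) / (tp + fp + fn + tn),
        "false_negative_rate": fn / (tp + fn),
    }

# Example only: 30 of 250 samples SBP-positive by manual count; a strip test
# missing 3 of them (a 10% false negative rate) might look like this.
print(diagnostic_metrics(tp=27, fp=5, fn=3, tn=215))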
Automated system for simultaneous analysis of δ13C, δ18O and CO2 concentrations in small air samples. RAPID COMMUNICATIONS IN MASS SPECTROMETRY, Issue 5 2002. Miquel Ribas-Carbo.
In this paper we present an automated system for simultaneous measurement of CO2 concentration, δ13C and δ18O from small (<1 mL) air samples in a short period of time (~1 hour). This system combines continuous-flow isotope ratio mass spectrometry (CF-IRMS) and gas chromatography (GC) with an inlet system similar to conventional dual-inlet methods, permitting several measurement cycles of standard and sample air. Analogous to the dual-inlet method, the precision of this system increases with the number of replicate cycles measured. The standard error of the mean for a measurement with this system is 0.7 ppm for the CO2 concentration and 0.05‰ for δ13C and δ18O with four replicate cycles, and 0.4 ppm and 0.03‰, respectively, with nine replicate cycles. The mean offset of our measurements from NOAA/CMDL-analyzed air samples was 0.08 ppm for the CO2 concentration, 0.01‰ for δ13C and 0.00‰ for δ18O. A specific list of the parts and the operation of the system is detailed, as well as some of its micrometeorological and ecophysiological applications. Copyright © 2002 John Wiley & Sons, Ltd. [source]
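The reported gain in precision with replicate cycles matches ordinary standard-error scaling; as a quick check, assuming approximately independent cycles:

\[
\mathrm{SE}(n) = \frac{s}{\sqrt{n}}, \qquad
\frac{\mathrm{SE}(9)}{\mathrm{SE}(4)} = \sqrt{\tfrac{4}{9}} = \tfrac{2}{3},
\]

so the quoted 0.05‰ at four cycles predicts about 0.033‰ at nine (reported: 0.03‰), and 0.7 ppm predicts about 0.47 ppm (reported: 0.4 ppm).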
Representation of Pseudo Inter-reflection and Transparency by Considering Characteristics of Human Vision. COMPUTER GRAPHICS FORUM, Issue 3 2002. H. Matsuoka.
We have succeeded in developing a quick and fully automated system that can generate photo-realistic 3D CG data based on a real object. A major factor in this success comes from our findings, through psychophysical experiments, that human observers do not have an accurate idea of what should actually be reflected as inter-reflections on the surface of an object. Taking advantage of this characteristic of human vision, we propose a new inter-reflection representation technique in which inter-reflections are simulated by allowing the same quantity of reflection components as there are in the background to pass through the object. Since inter-reflection and transparency are calculated by the same algorithm, our system can capture 3D CG data from various real objects having strong inter-reflection, such as plastic and porcelain items or translucent glass and acrylic resin objects. The synthetic images from the 3D CG data generated with this pseudo inter-reflection and transparency look very natural. In addition, the 3D CG data and synthetic images are produced quickly and at a lower cost. [source]

The Impact of E-Replenishment Strategy on Make-to-Order Supply Chain Performance. DECISION SCIENCES, Issue 1 2005. E. Powell Robinson Jr.
ABSTRACT. This research investigates the impact of electronic replenishment strategy on the operational activities and performance of a two-stage make-to-order supply chain. We develop simulation-based rolling schedule procedures that link the replenishment processes of the channel members and apply them in an experimental analysis to study manual, semi-automated, and fully automated e-replenishment strategies in decentralized and coordinated decision-making supply chain structures. The average operational cost reductions for moving from a manual-based system to a fully automated system are 19.6%, 29.5%, and 12.5%, respectively, for the traditional decentralized, decentralized with information sharing, and coordinated supply chain structures. The savings are neither equally distributed among participants nor consistent across supply chain structures. As expected, for the fully coordinated system, total costs monotonically decrease with higher levels of automation. However, for the two decentralized structures, under which most firms operate today, counter-intuitive findings reveal that the unilateral application of e-procurement technology by the buyer may lower his purchasing costs but increase the seller's and the system's costs. The exact nature of the relationship is determined by the channel's operational flexibility. Broader results indicate that while the potential economic benefit of e-replenishment in a decentralized system is substantial, greater operational improvements may be possible through supply chain coordination. [source]

Automated image-based phenotypic analysis in zebrafish embryos. DEVELOPMENTAL DYNAMICS, Issue 3 2009. Andreas Vogt.
Abstract. Presently, the zebrafish is the only vertebrate model compatible with contemporary paradigms of drug discovery. Zebrafish embryos are amenable to the automation necessary for high-throughput chemical screens, and their optical transparency makes them potentially suited for image-based screening. However, the lack of tools for automated analysis of complex images presents an obstacle to using the zebrafish as a high-throughput screening model. We have developed an automated system for imaging and analyzing zebrafish embryos in multi-well plates regardless of embryo orientation and without user intervention. Images of fluorescent embryos were acquired on a high-content reader and analyzed using an artificial intelligence-based image analysis method termed Cognition Network Technology (CNT). CNT reliably detected transgenic fluorescent embryos (Tg(fli1:EGFP)y1) arrayed in 96-well plates and quantified intersegmental blood vessel development in embryos treated with small-molecule inhibitors of angiogenesis. The results demonstrate that it is feasible to adapt image-based high-content screening methodology to measure complex whole-organism phenotypes. Developmental Dynamics 238:656–663, 2009. © 2009 Wiley-Liss, Inc. [source]

Socio-Spatial Relationships in Dairy Cows. ETHOLOGY, Issue 1 2010. Lorenz Gygax.
Farm animals may serve as models for evaluating social networks in a controlled environment. We used an automated system to track, at fine temporal and spatial resolution (once per minute, ±50 cm), every individual in six herds of dairy cows (Bos taurus). We then analysed the data using social network analyses. Relationships were based on non-random attachment and avoidance relationships with respect to synchronous use and distances observed in three different functional areas (activity, feeding and lying). We found that neither synchrony nor distance between cows was strongly predictable among the three functional areas. The emerging social networks were tightly knit for attachment relationships and less dense for avoidance relationships. These networks loosened up from the feeding and lying areas to the activity area, and were less dense for relationships based on synchronicity than on median distance with respect to node degree, relative size of the largest cluster, density and diameter of the network. In addition, synchronicity was higher in dyads of dairy cows that had grown up together and shared their last dry period. This last effect disappeared with increasing herd size. Dairy herds can be characterized by one strongly clustered network including most of the herd members, with many non-random attachment and avoidance relationships. Closely synchronous dyads were composed of cows with more intense previous contact. The automatic tracking of a large number of individuals proved promising in acquiring the data necessary for tackling social network analyses. [source]
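The network summaries used here (node degree, relative size of the largest cluster, density, diameter) are standard graph statistics; a minimal sketch with networkx over an invented attachment network (the edge list is illustrative, not herd data):

# Standard social-network summaries over a toy attachment graph.
import networkx as nx

edges = [("cow1", "cow2"), ("cow2", "cow3"), ("cow1", "cow3"), ("cow4", "cow5")]
G = nx.Graph(edges)

degrees = dict(G.degree())                             # node degree
largest = max(nx.connected_components(G), key=len)
largest_fraction = len(largest) / G.number_of_nodes()  # relative size of largest cluster
density = nx.density(G)
diameter = nx.diameter(G.subgraph(largest))            # diameter within the largest cluster

print(degrees, largest_fraction, density, diameter)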
Evaluation of the Ves-Matic Cube 200, an automated system for the measurement of the erythrocyte sedimentation rate. INTERNATIONAL JOURNAL OF LABORATORY HEMATOLOGY, Issue 1p2 2010. E. PEROVIC.
Summary. The Ves-Matic Cube 200 is a fully automated analyzer that performs erythrocyte sedimentation rate (ESR) measurement using the standard ethylenediaminetetraacetic acid blood sample tube, thus markedly reducing the analytical time and avoiding the need for an extra blood sample. The aim of this study was to assess the automatic Ves-Matic Cube 200 system for the measurement of ESR in comparison with the original International Council for Standardization in Hematology reference method (Westergren). The evaluation comprised accuracy, which was established using a 95% confidence interval (CI) for the mean difference between the Ves-Matic Cube 200 and the Westergren method (mean difference: 0.47 ± 6.84 mm/h; 95% CI: −0.376 to 1.325 mm/h), within-run imprecision for samples with ESR values of 9, 42 and 95 mm/h (coefficients of variation: 9.19%, 13.88% and 5.66%, respectively), and method comparison (ρ = 0.95; Passing-Bablok regression equation: Y = −0.0435 + 1.0435X; bias: −0.5; limits of agreement: −13.9 to 12.9). Stability was estimated after 24 h of storage at 4 °C and at room temperature (mean of differences: −1.91 mm/h; 95% CI: −4.852 to 1.037 mm/h, and mean of differences: −12.48 mm/h; 95% CI: −16.580 to −8.390 mm/h, respectively). The obtained results suggest that the Ves-Matic Cube 200 automated analyzer is a reliable system for the measurement of ESR in clinical laboratories. [source]
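The agreement statistics in this abstract (mean difference with 95% CI, Bland-Altman limits of agreement) can be reproduced in a few lines; the paired readings below are invented for illustration and are not the study's data:

# Bland-Altman-style agreement summary for paired method readings.
import numpy as np
from scipy import stats

auto = np.array([10, 22, 35, 48, 60, 75, 90], dtype=float)
westergren = np.array([9, 24, 33, 50, 58, 77, 92], dtype=float)

d = auto - westergren
mean_d = d.mean()
ci = stats.t.interval(0.95, len(d) - 1, loc=mean_d, scale=stats.sem(d))
loa = (mean_d - 1.96 * d.std(ddof=1), mean_d + 1.96 * d.std(ddof=1))

print(f"mean difference {mean_d:.2f} mm/h, 95% CI {ci}, limits of agreement {loa}")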
Evaluation of NOC Measures in Home Care Nursing Practice. INTERNATIONAL JOURNAL OF NURSING TERMINOLOGIES AND CLASSIFICATION, Issue 2003. Gail M. Keenan.
PURPOSE: To evaluate the reliability, validity, usefulness, and sensitivity of 89 NOC outcomes in two Visiting Nurse Associations in Michigan. METHODS: Of a total of 190 NOC outcomes, 89 were assigned for testing. Interrater reliability and criterion validity were assessed a total of 50 times per outcome (on 50 different patients) across the study units. The total number of times the reliability and validity were assessed for each of the 89 measures studied ranged from 5 to 45. Three RN research assistants (RNRAs) oversaw and participated in data collection with the help of 15 clinicians. Convenience sampling was used to identify subjects. A roster of outcomes to be studied was maintained and matched with patient conditions whenever possible until the quota of outcomes assigned had been evaluated. Clinicians and RNRAs independently rated the outcomes and indicators applicable to the patient. NANDA diagnoses, NIC interventions, and medical diagnoses were recorded. FINDINGS: A total of 258 patients (mean age 62) enrolled; 60% were women, 23% were from minority groups, and 78% had no college degree. Thirty-six of the 89 NOC measures were designated "clinically useful." The 10 outcomes with the highest interrater reliability were Caregiver Home Care Readiness; Caregiver Stressors; Caregiving Endurance Potential; Infection Status; Mobility Level; Safety Status: Physical Injury; Self-Care: Activities of Daily Living; Self-Care: Bathing; Self-Care: Hygiene; and Wound Healing: Secondary Intention. Criterion measurement and repeated ratings provided evidence to support the validity and sensitivity of the NOC outcomes. Evidence also suggested that NOC label-level ratings could be a feasible, reliable, and valid method of evaluating nursing outcomes in actual use. For some measures, adjustments to the scales and anchors are needed to enhance reliability. For others, it may be unrealistic to score reliably in one encounter; thus scoring should be deferred until the clinician has adequate knowledge of the patient. CONCLUSIONS: Continued study and refinement that are coordinated and integrated systematically are strongly recommended. Comprehensive study in an automated system with a controlled format will increase the efficiency of future studies. [source]

Autolabo: an automated system for ligand-soaking experiments with protein crystals. JOURNAL OF APPLIED CRYSTALLOGRAPHY, Issue 4 2010. Michihiro Sugahara.
Ligand soaking of protein crystals is important for the preparation of heavy-atom derivative crystals for experimental phasing, as well as for large-scale ligand screening in pharmaceutical development. To facilitate laborious large-scale ligand screening, to reduce the risk of human contact with hazardous ligand reagents and to increase the success rate of soaking experiments, a protein crystallization robot, `Autolabo', has been developed and implemented in the high-throughput crystallization-to-structure pipeline at the RIKEN SPring-8 Center. The main functions of this robotic system are the production of protein crystals for experiments, the ligand soaking of these crystals and the observation of soaked crystals. The separate eight-channel dispensers of Autolabo eliminate the cross-contamination of reagents, which must be strictly avoided in ligand-soaking experiments. Furthermore, the automated approach reduces physical damage to crystals during experiments compared with the conventional manual approach, and thereby has the potential to yield better-quality diffraction data. Autolabo's performance as a ligand-soaking system was evaluated with a crystallization experiment on ten proteins from different sources and a heavy-atom derivatization experiment on three proteins using a versatile cryoprotectant containing heavy-atom reagents as ligands. The crystallization test confirmed reliable crystal reproduction under a single condition and the capability for crystallization with nucleants to improve crystal quality. Finally, Autolabo reproducibly derivatized the test protein crystals with sufficient diffraction quality for experimental phasing and model building, indicating the high potential of this automated approach for ligand-soaking experiments. [source]

Generation and visualization of large-scale three-dimensional reconstructions from underwater robotic surveys. JOURNAL OF FIELD ROBOTICS (FORMERLY JOURNAL OF ROBOTIC SYSTEMS), Issue 1 2010. Matthew Johnson-Roberson.
Robust, scalable simultaneous localization and mapping (SLAM) algorithms support the successful deployment of robots in real-world applications. In many cases these platforms deliver vast amounts of sensor data from large-scale, unstructured environments. These data may be difficult for end users to interpret without further processing and suitable visualization tools. We present a robust, automated system for large-scale three-dimensional (3D) reconstruction and visualization that takes stereo imagery from an autonomous underwater vehicle (AUV) and SLAM-based vehicle poses to deliver detailed 3D models of the seafloor in the form of textured polygonal meshes. Our system must cope with thousands of images, lighting conditions that create visual seams when texturing, and possible inconsistencies between stereo meshes arising from errors in calibration, triangulation, and navigation. Our approach breaks the problem down into manageable stages by first estimating local structure and then combining these estimates to recover a composite georeferenced structure using SLAM-based vehicle pose estimates. A texture-mapped surface at multiple scales is then generated and interactively presented to the user through a visualization engine. We adapt established solutions where possible, with an emphasis on quickly delivering approximate yet visually consistent reconstructions on standard computing hardware. This allows scientists on a research cruise to use our system to design follow-up deployments of the AUV and complementary instruments. To date, this system has been tested on several research cruises in Australian waters and has been used to reliably generate and visualize reconstructions for more than 60 dives covering diverse habitats and representing hundreds of linear kilometers of survey. © 2009 Wiley Periodicals, Inc. [source]
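The compositing step described, local structure carried into a georeferenced frame by SLAM pose estimates, reduces to applying rigid-body transforms; a minimal sketch with invented poses and points, not the authors' pipeline:

# Transform vehicle-frame stereo points into a common world frame using a
# SLAM-estimated pose expressed as a 4x4 homogeneous transform.
import numpy as np

def pose_matrix(yaw: float, t: np.ndarray) -> np.ndarray:
    """Build a 4x4 transform from a heading angle and a translation."""
    c, s = np.cos(yaw), np.sin(yaw)
    T = np.eye(4)
    T[:3, :3] = [[c, -s, 0], [s, c, 0], [0, 0, 1]]
    T[:3, 3] = t
    return T

def to_global(points_local: np.ndarray, T_world_vehicle: np.ndarray) -> np.ndarray:
    """Transform Nx3 vehicle-frame points into the world frame."""
    homo = np.hstack([points_local, np.ones((len(points_local), 1))])
    return (T_world_vehicle @ homo.T).T[:, :3]

pose = pose_matrix(yaw=0.3, t=np.array([120.0, 45.0, -30.0]))
local_pts = np.array([[1.0, 0.5, 2.0], [0.2, -0.1, 2.1]])
print(to_global(local_pts, pose))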
Automated Creativity: Digital Morphology and the Design Process. JOURNAL OF INTERIOR DESIGN, Issue 3 2007. Kathleen Gibson M.A.
ABSTRACT. Literature shows that traditional creative methods may reinforce repetitive and habitual behavior, resulting in ineffective environmental design solutions (Lawson, 1980; Lang, 1987; Laseau, 1989). Two case studies explored the use of an automated system called cyber-ideation (Gibson, 2000b) as a method to stimulate idea generation. This procedure employed individual and team involvement, recursive and linear exploration, and manual and digital processes. Analysis compared students' production using traditional ideation processes with that resulting from cyber-ideation. Results from this case study found that: 1) digital creation was more linear when evaluated against traditional ideation output, 2) cyber-ideation had a positive impact on team dynamics, and 3) automated output possessed greater surface delineation when compared with subjects' manual sketching. [source]

Radiosynthesis of 13N-labeled thalidomide using no-carrier-added [13N]NH3. JOURNAL OF LABELLED COMPOUNDS AND RADIOPHARMACEUTICALS, Issue 2 2010. Katsushi Kumata.
Abstract. Recent studies have revealed that thalidomide (1) has unique and broad pharmacological effects on multiple targets, although the application of 1 in therapy is still controversial. In this study, we synthesized nitrogen-13-labeled thalidomide ([13N]1) as a potential positron emission tomography (PET) probe using no-carrier-added [13N]NH3 as a labeling agent. Using an automated system, [13N]1 was prepared by reacting N-phthaloylglutamic anhydride (2) with [13N]NH3, followed by cyclization with carbonyldiimidazole, in a radiochemical yield of 56±12% (based on [13N]NH3, corrected for decay) and a specific activity of 49±24 GBq/µmol at the end of synthesis (EOS). At EOS, 570–780 MBq (n=7) of [13N]1 was obtained at a beam current of 15 µA after 15 min of proton bombardment, with a synthesis time of 14 min from the end of bombardment. Using a small-animal PET scanner, the preliminary biodistribution of [13N]1 in mice was examined. Copyright © 2010 John Wiley & Sons, Ltd. [source]
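Decay correction over the 14 min synthesis follows the standard law; a worked check, taking the accepted 13N half-life of roughly 9.97 min (a literature value, not a figure stated in the abstract):

\[
A(t) = A_0 \, 2^{-t/T_{1/2}}, \qquad T_{1/2}(^{13}\mathrm{N}) \approx 9.97\ \mathrm{min}, \qquad 2^{-14/9.97} \approx 0.38,
\]

so only about 38% of the starting 13N activity survives the synthesis, which is why yields for such short-lived nuclides are quoted corrected for decay.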
PHENOPSIS, an automated platform for reproducible phenotyping of plant responses to soil water deficit in Arabidopsis thaliana, permitted the identification of an accession with low sensitivity to soil water deficit. NEW PHYTOLOGIST, Issue 3 2006. Christine Granier.
Summary. The high-throughput phenotypic analysis of Arabidopsis thaliana collections requires methodological progress and automation. Methods to impose stable and reproducible soil water deficits are presented and were used to analyse plant responses to water stress. Several potential complications and methodological difficulties were identified, including the spatial and temporal variability of micrometeorological conditions within a growth chamber, differences in soil water depletion rates between accessions, and differences in the developmental stage of accessions at the same time after sowing. Solutions were found. Nine accessions were grown in four experiments in a rigorously controlled growth chamber equipped with an automated system to control soil water content and take pictures of individual plants. One accession, An1, was unaffected by water deficit in terms of leaf number, leaf area, root growth and transpiration rate per unit leaf area. Methods developed here will help identify quantitative trait loci and genes involved in plant tolerance to water deficit. [source]

A new approach to determine method detection limits for compound-specific isotope analysis of volatile organic compounds. RAPID COMMUNICATIONS IN MASS SPECTROMETRY, Issue 24 2006. Maik A. Jochmann.
Compound-specific isotope analysis (CSIA) has been established as a useful tool in the field of environmental science, in particular in the assessment of contaminated sites. What limits the use of gas chromatography/isotope ratio mass spectrometry (GC/IRMS) is the low sensitivity of the method compared with GC/MS analysis; however, the development of suitable extraction and enrichment techniques for important groundwater contaminants will extend the fields of application for GC/IRMS. So far, purge and trap (P&T) is the most effective known preconcentration technique for on-line CSIA, with the lowest reported method detection limits (MDLs in the low-µg/L range). With the goal of improving the sensitivity of a fully automated GC/IRMS analysis method, a commercially available P&T system was modified. The method was evaluated for ten monoaromatic compounds (benzene, toluene, para-xylene, ethylbenzene, propylbenzene, isopropylbenzene, 1,2,3-trimethylbenzene, 1,2,4-trimethylbenzene, 1,3,5-trimethylbenzene, fluorobenzene) and ten halogenated volatile organic compounds (VOCs) (dichloromethane, cis-1,2-dichloroethene, trans-1,2-dichloroethene, carbon tetrachloride, chloroform, 1,2-dichloroethane, trichloroethene, tetrachloroethene, 1,2-dibromoethane, bromoform). The influence of method parameters, including purge gas flow rates and purge times, on δ13C values of target compounds was evaluated. The P&T method showed good reproducibility, high linearity and small isotopic fractionation. MDLs were determined by consecutive calculation of the δ13C mean values. The last concentration for which the δ13C value was within this iterative interval, and for which the standard deviation was lower than ±0.5‰ for triplicate measurements, was defined as the MDL. MDLs for monoaromatic compounds between 0.07 and 0.35 µg/L are the lowest values reported so far for continuous-flow isotope ratio measurements using an automated system. MDLs for halogenated hydrocarbons were between 0.76 and 27 µg/L. The environmental applicability of the P&T-GC/IRMS method in the low-µg/L range was demonstrated in a case study on groundwater samples from a former military airfield contaminated with VOCs. Copyright © 2006 John Wiley & Sons, Ltd. [source]
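The MDL rule sketched in this abstract, walk down a dilution series and stop when the triplicate mean drifts outside the running interval or scatters by more than ±0.5‰, can be put in code; the interval construction below is one plausible reading of the iterative procedure (an assumption, not the paper's exact algorithm), and the data are invented:

# Walk a dilution series from high to low concentration; the MDL is the lowest
# concentration whose triplicate mean delta13C stays within +/-0.5 permil of
# the means accepted so far and whose SD is at most 0.5 permil.
import statistics

def mdl(series):
    """series: list of (conc_ug_per_L, [d13C triplicate]), sorted high -> low."""
    accepted_means = []
    detection_limit = None
    for conc, reps in series:
        m, sd = statistics.mean(reps), statistics.stdev(reps)
        if accepted_means:
            lo, hi = min(accepted_means) - 0.5, max(accepted_means) + 0.5
            if not (lo <= m <= hi) or sd > 0.5:
                break
        accepted_means.append(m)
        detection_limit = conc
    return detection_limit

series = [(50, [-27.1, -27.0, -27.2]), (5, [-27.2, -27.3, -27.0]),
          (0.5, [-27.4, -26.9, -27.1]), (0.1, [-26.0, -28.3, -27.9])]
print(mdl(series))  # 0.5 in this invented example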
Improving forensic tribunal decisions: the role of the clinician. BEHAVIORAL SCIENCES & THE LAW, Issue 4 2007. Shari A. McKee Ph.D.
Three empirical investigations of forensic decision-making were conducted: a study of 104 hearings by a forensic tribunal; an evaluation of which aspects of forensic patients' clinical presentation were empirical predictors of violence; and a survey of forensic clinicians to determine which factors they said they used to assess risk of violent recidivism and which they actually used. Results showed a significant correlation between actuarial risk and clinical advice to the tribunal, and a nonsignificant trend for patients higher in actuarial risk to receive more restrictive dispositions. Psychotic diagnoses and symptoms were not indicators of increased risk of violent recidivism. Clinicians endorsed some empirically valid indicators of risk, but also relied on some invalid indicators. There was also inconsistency between the factors clinicians said they used and the factors actually related to their hypothetical decision-making. An automated system is presented as an illustration of how the consistency and validity of forensic decisions could be enhanced. Copyright © 2007 John Wiley & Sons, Ltd. [source]

An Integrated Process for Mammalian Cell Perfusion Cultivation and Product Purification Using a Dynamic Filter. BIOTECHNOLOGY PROGRESS, Issue 4 2002. Leda R. Castilho.
In the present work, a dynamic filter was employed to develop an integrated perfusion/purification process. A recombinant CHO cell line producing a human anti-HIV IgG was employed in the experiments. In the first part of this work, the dynamic filter was fitted with conventional microfiltration membranes and tested as a new external cell retention device for perfusion cultivations. The filter was connected to a running perfusion bioreactor and operated for approximately 400 h at an average cell concentration of 10 million cells mL−1, whereby cell viability remained above 90% and no problems of sterility were experienced. In the second part of this work, the dynamic filter was employed to simultaneously carry out cell separation and product purification, using membrane adsorbers containing Protein A affinity ligands. An automated system was built which integrated the features of an automated perfusion bioreactor and of a liquid chromatography system. The IgG was continuously adsorbed onto the affinity membranes and was periodically recovered through elution cycles. After connection of the filter, the system was operated for approximately 300 h, whereby three elution cycles were carried out. No progressive increase in transmembrane pressure was observed, indicating no membrane fouling problems, and the IgG was recovered practically free of contaminants in a 14-fold concentrated form, indicating that the integrated, one-step perfusion/purification process developed during this work is a promising alternative for the production of biologicals derived from mammalian cells. [source]

Intravenous catheter infections associated with bacteraemia: a 2-year study in a University Hospital. CLINICAL MICROBIOLOGY AND INFECTION, Issue 5 2004. M. Paragioudaki.
Abstract. The aim of this retrospective study was to assess the incidence and aetiology of central and peripheral venous catheter (C/PVC) infections during a 2-year period (1999–2000) and to determine the susceptibility of the isolated microorganisms to various antimicrobial agents. Catheter tips were processed using the semiquantitative method, and blood cultures were performed with the BacT/Alert automated system. Antibiotic susceptibility testing was performed by disk agar diffusion, and MICs were determined by Etest, according to NCCLS standards. During the study period, samples from 1039 C/PVC infections were evaluated, yielding 384 (37.0%) positive cultures. Blood cultures were also available from 274 patients, of which 155 (56.6%) yielded the same microorganism as the catheter. No bloodstream infections were detected in 104 C/PVC-positive cases. Methicillin-resistant coagulase-negative staphylococci were the most frequent isolates, followed by Gram-negative bacteria, especially Pseudomonas aeruginosa. Resistance to glycopeptides among staphylococci and enterococci was not detected, whereas 60% of Gram-negative bacilli were resistant to β-lactams. [source]
A study on the spectral changes of reactive textile dyes and their implications for online control of dyeing processes. COLORATION TECHNOLOGY, Issue 1 2009. Jorge G Santos.
Evidence is presented that confirms the colour changes of a widely used trichromatic mixture of bifunctional reactive dyes (Levafix CA) under alkaline conditions, showing that they occur slowly and throughout the dyeing time, and not instantly after alkali addition to the dyebath. Thus, it is impossible to determine the specific absorptivity of the dyes at each moment of the dyeing process. An investigation into the relationship between the type of reactive group on the dye and the visible spectral changes over time was undertaken. Model reactive dyes were studied. The samples collected from the simulated dyebaths were monitored online using an automated system, and their absorption over the whole visible spectrum was measured. The studies of dyes that included halo-s-triazinyl groups revealed the existence of hypochromic shifts in the spectra of the dyes in the presence of an electrolyte (sodium chloride or sodium sulphate), and bathochromic and hyperchromic shifts when evaluated in the presence of alkaline agents. However, the vinylsulphonyl derivatives present a more stable spectral profile. The use of a buffer solution at pH 5 was an efficient method to stabilise the absorption profile of Levafix CA trichromatic samples. [source]
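The control difficulty follows from the Beer-Lambert law, where the specific absorptivity ε(λ) ties measured absorbance A to dye concentration c over path length ℓ:

\[
A(\lambda) = \varepsilon(\lambda)\, c\, \ell .
\]

If ε(λ) itself drifts during fixation, as the abstract reports for these dyes under alkali, then c can no longer be inferred from absorbance alone, which is precisely the obstacle to online dosing control.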
3D Image Segmentation of Aggregates from Laser Profiling. COMPUTER-AIDED CIVIL AND INFRASTRUCTURE ENGINEERING, Issue 4 2003. Hyoungkwan Kim.
Automated scanners of different designs use cameras or lasers to obtain digital images of groups of aggregate particles. To accurately determine particle size and shape parameters, each particle region in the image must be isolated and processed individually. Here, a method for segmenting a particle image acquired from laser profiling is developed using a Canny edge detector and a watershed transformation. Canny edges with rigorous and liberal threshold values are used to outline particle boundaries on a binary image and to check the validity of watersheds, respectively. To find appropriate regional minima in the watershed transformation, a varying search window method is used, where the number of neighboring pixels compared with the pixel of interest is determined from the height value of the pixel. Test results with this method are promising. When implemented in automated systems designed to rapidly assess the size and shape characteristics of stone particles, this technique can not only reduce the amount of time required for aggregate preparation, but also increase the accuracy of analysis results. [source]
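The segmentation recipe, strict Canny edges outlining boundaries plus a marker-controlled watershed to split touching particles, looks roughly like the following; scikit-image and a distance-transform marker scheme stand in for the paper's own code and its varying-search-window method:

# Marker-controlled watershed constrained by Canny edges, on a stand-in image.
import numpy as np
from scipy import ndimage as ndi
from skimage import data, filters
from skimage.feature import canny, peak_local_max
from skimage.segmentation import watershed

image = data.coins()                         # stand-in grayscale particle image
binary = image > filters.threshold_otsu(image)
edges = canny(image, sigma=2.0)              # strict edges outline boundaries

distance = ndi.distance_transform_edt(binary)
coords = peak_local_max(distance, min_distance=10, labels=binary)
mask = np.zeros(distance.shape, dtype=bool)
mask[tuple(coords.T)] = True                 # one marker per distance peak
markers, _ = ndi.label(mask)

# Edge pixels are excluded from the mask so watersheds cannot cross boundaries.
labels = watershed(-distance, markers, mask=binary & ~edges)
print(labels.max(), "particle regions found")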
Applying content management to automated provenance capture. CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 5 2008. Karen L. Schuchardt.
Abstract. Workflows and data pipelines are becoming increasingly valuable to computational and experimental sciences. These automated systems are capable of generating significantly more data within the same amount of time compared to their manual counterparts. Automatically capturing and recording data provenance and annotation as part of these workflows is critical for data management, verification, and dissemination. We have been prototyping a workflow provenance system, targeted at biological workflows, that extends our content management technologies and other open source tools. We applied this prototype to the provenance challenge to demonstrate an end-to-end system that supports dynamic provenance capture, persistent content management, and dynamic searches of both provenance and metadata. We describe our prototype, which extends the Kepler system for the execution environment, the Scientific Annotation Middleware (SAM) content management software for data services, and an existing HTTP-based query protocol. Our implementation offers several unique capabilities and, through the use of standards, is able to provide access to the provenance record with a variety of commonly available client tools. Copyright © 2007 John Wiley & Sons, Ltd. [source]

Evaluation of best system performance: Human, automated, and hybrid inspection systems. HUMAN FACTORS AND ERGONOMICS IN MANUFACTURING & SERVICE INDUSTRIES, Issue 2 2003. Xiaochun Jiang.
Recently, 100% inspection with automated systems has seen more frequent application than traditional sampling inspection with human inspectors. Nevertheless, humans still outperform machines in most attribute inspection tasks. Because neither humans nor automation alone can achieve superior inspection system performance, hybrid inspection systems in which humans work cooperatively with machines merit study. In response to this situation, this research evaluated three different inspection systems: (1) a human inspection system, (2) a computer search/human decision-making inspection system, and (3) a human/computer shared search/decision-making inspection system. Results from this study showed that the human/computer shared search/decision-making system achieved the best system performance, suggesting that both should be used in inspection tasks rather than either alone. Furthermore, this study looked at the interaction between human inspectors and computers, specifically the effect of system response bias on inspection quality performance. These results revealed that the risky system was the best in terms of accuracy measures. Although this study demonstrated how recent advances in computer technology have modified previously prescribed notions about function allocation alternatives in a hybrid inspection environment, the adaptability of humans was again demonstrated, indicating that they will continue to play a vital role in future hybrid systems. © 2003 Wiley Periodicals, Inc. Hum Factors Man 13: 137–152, 2003. [source]
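The response-bias finding invites a signal-detection reading; a hedged sketch computing discriminability d′ and criterion c from hit and false-alarm rates (the rates are invented, and the SDT framing is an interpretation, not necessarily the study's own analysis):

# Signal-detection summary of an inspection system from hit/false-alarm rates.
import scipy.stats as st

def dprime_and_criterion(hit_rate: float, fa_rate: float):
    z_h, z_f = st.norm.ppf(hit_rate), st.norm.ppf(fa_rate)
    d_prime = z_h - z_f             # discriminability
    criterion = -0.5 * (z_h + z_f)  # negative = liberal ("risky") responding
    return d_prime, criterion

# A "risky" (liberal) system: many hits, but more false alarms.
print(dprime_and_criterion(0.95, 0.20))
# A conservative system with the same discriminability, fewer false alarms.
print(dprime_and_criterion(0.80, 0.05))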
Developing competency models to promote integrated human resource practices. HUMAN RESOURCE MANAGEMENT, Issue 3 2002. Donna Rodriguez.
Today, competencies are used in many facets of human resource management, ranging from individual selection, development, and performance management to organizational strategic planning. By incorporating competencies into job analysis methodologies, the Office of Personnel Management (OPM) has developed robust competency models that can form the foundation for each of these initiatives. OPM has placed these models into automated systems to ensure access for employees, human resources professionals, and managers. Shared access to the data creates a shared frame of reference and a common language of competencies that has provided the basis for competency applications in public sector agencies. © 2002 Wiley Periodicals, Inc. [source]

Critical care nurse practitioners and clinical nurse specialists interface patterns with computer-based decision support systems. JOURNAL OF THE AMERICAN ACADEMY OF NURSE PRACTITIONERS, Issue 11 2007. Scott Weber EdD, PhD(c), APRN (Assistant Professor of Health, Community Systems, Coordinator of the Nursing Education Graduate Program).
Abstract. Purpose: The purposes of this review are to examine the types of clinical decision support systems in use and to identify patterns of how critical care advanced practice nurses (APNs) have integrated these systems into their nursing care patient management practices. The decision-making process itself is analyzed, with a focus on how automated systems attempt to capture and reflect human decisional processes in critical care nursing, including how systems actually organize and process information to create outcome estimations based on patient clinical indicators and prognosis algorithms. Characteristics of APN clinicians, and the implications of these characteristics for decision system use, based on the body of decision system user research, are introduced. Data sources: A review of the Medline, Ovid, CINAHL, and PubMed literature databases was conducted using "clinical decision support systems," "computerized clinical decision making," and "APNs"; an examination of components of several major clinical decision systems was also undertaken. Conclusions: Use patterns among APNs and other clinicians appear to vary; there is a need for original research to examine how APNs actually use these systems in their practices in critical care settings. Because APNs are increasingly responsible for admission to, and transfer from, critical care settings, more understanding is needed of how they interact with this technology and how they see automated decision systems affecting their practices. Implications for practice: APNs who practice in critical care settings vary significantly in how they use the clinical decision systems in operation in their practice settings. These APNs must have an understanding of their use patterns with these systems and should critically assess whether their patient care decision making is affected by the technology. [source]

Knowledge Warfare in the 21st Century: An Extension in Performance. NAVAL ENGINEERS JOURNAL, Issue 2 2003. Dr. Yvonne R. Masakowski.
ABSTRACT. As we move into the 21st century, we are faced with a critical need to address the ways in which knowledge is generated and used to optimize system and human performance. Today, we are inundated with a plethora of information, emails, and ever-changing software. There is a dynamic relationship among humans, computers, expert systems and intelligent agent software that shapes the way we live, conduct business and participate in war. It is imperative that we master the critical components of knowledge management that will enhance decision-making capacities and empower the warfighter. In the 21st century, knowledge management tools, intelligent agent architectures, robotics, and automated systems will facilitate the expert performance necessary to fortify net-centric warfare. One of the principal metrics of performance will be our ability to reduce uncertainty and provide the most accurate information to the decision-maker at the right time. The importance of these goals becomes clear when considered within the context of images of the World Trade Center (WTC) crumbling to the ground. Now we understand the cost of poor information in terms of life and freedom. This paper provides an introduction to the importance of knowledge management and its implications for future ship design. [source]

A Petri nets-based process planning system for wastewater treatment. ASIAN JOURNAL OF CONTROL, Issue 3 2010. Albert W. L. Yao.
Abstract. It is always challenging to simulate, debug or diagnose automated systems. The aim of this paper is to present the development of a convenient tool for reengineering the control system in a complicated industrial wastewater treatment plant. In this project, a PC-based Human-Machine Interface (HMI) in conjunction with Petri net (PN) theory is adopted to develop and simulate the operational process for wastewater treatment. The resultant tool offers many advantages in real-world automated control. It not only reduces the process reengineering time and the cost of error recovery, but also provides a human-interface panel for the process. The discrete-event control sequence of wastewater treatment can be easily modeled and evaluated before its build-up. Furthermore, this PN-based system can be used as an online diagnostic tool when the wastewater treatment process is malfunctioning. That is, the presented PN tool provides an adequate means for offline process development, simulation and performance evaluation, and quick online process diagnosis. Copyright © 2010 John Wiley and Sons Asia Pte Ltd and Chinese Automatic Control Society [source]
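The place/transition mechanics behind such PN models fit in a few lines; a minimal sketch with the standard firing rule, where the two-step fill-then-aerate net is an invented stand-in for a real treatment sequence:

# A tiny place/transition Petri net: a transition is enabled when every input
# place holds a token; firing consumes input tokens and produces output tokens.
class PetriNet:
    def __init__(self, marking):
        self.marking = dict(marking)          # tokens per place
        self.transitions = {}                 # name -> (inputs, outputs)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (inputs, outputs)

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) >= 1 for p in inputs)

    def fire(self, name):
        if not self.enabled(name):
            raise RuntimeError(f"{name} is not enabled")
        inputs, outputs = self.transitions[name]
        for p in inputs:
            self.marking[p] -= 1
        for p in outputs:
            self.marking[p] = self.marking.get(p, 0) + 1

net = PetriNet({"tank_empty": 1, "influent_ready": 1})
net.add_transition("fill", ["tank_empty", "influent_ready"], ["tank_full"])
net.add_transition("aerate", ["tank_full"], ["treated"])
net.fire("fill"); net.fire("aerate")
print(net.marking)  # {'tank_empty': 0, 'influent_ready': 0, 'tank_full': 0, 'treated': 1}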
Low-level glycopeptide resistance in methicillin-resistant Staphylococcus aureus and how to test it. CLINICAL MICROBIOLOGY AND INFECTION, Issue 2009. P. M. Hawkey.
Abstract. Vancomycin resistance in Staphylococcus aureus has emerged over the last ten years through varying mechanisms conferring variable levels of resistance to vancomycin. The most resistant strains (fortunately rare) bear the vanA gene cluster; these are generally recognisable because vancomycin MICs are usually found to be in the range 32–64 mg/L. It should be noted that some automated systems have failed to detect these isolates. The much more commonly encountered GISA and hGISA vancomycin-resistant strains of MRSA and methicillin-sensitive Staph. aureus (MSSA) exhibit lower levels of resistance, and clinical laboratories encounter difficulty in reliably defining and identifying these strains. No single completely reliable, convenient test, either phenotypic or genotypic, currently exists that can be readily applied in the clinical laboratory for the detection of hGISA/GISA. The population analysis profile (PAP) method is currently regarded as the reference method, but it is slow and tedious to perform on a large number of isolates. It enables the differentiation of hGISA and GISA from fully vancomycin-sensitive strains. In the clinical laboratory, the use of Mueller-Hinton agar with 5 mg/L teicoplanin and a 10 µL inoculum of a McFarland 0.5 suspension, incubated for 48 h, represents the most reliable and economical screening test. Further confirmation would be required using either macrodilution Etest methodology, with an MIC ≥ 8 mg/L of vancomycin and/or teicoplanin as the cut-off for hGISA, or the newer GRD (glycopeptide resistance detection) strip. [source]