Automated


Terms modified by Automated

  • automated algorithm
  • automated analysis
  • automated approach
  • automated assessment
  • automated detection
  • automated device
  • automated external defibrillator
  • automated hematology analyzer
  • automated identification
  • automated image analysis
  • automated immunomagnetic separation
  • automated measurement
  • automated method
  • automated methods
  • automated microscopy
  • automated perimetry
  • automated peritoneal dialysis
  • automated procedure
  • automated ribosomal intergenic spacer analysis
  • automated sequencing
  • automated survey
  • automated synthesis
  • automated system

  • Selected Abstracts


    Empowering Automated Trading in Multi-Agent Environments

    COMPUTATIONAL INTELLIGENCE, Issue 4 2004
    David W. Ash
    Trading in the financial markets often requires that information be available in real time to be effectively processed. Furthermore, complete information is not always available about the reliability of data, or its timeliness; nevertheless, a decision must still be made about whether to trade or not. We propose a mechanism whereby different data sources are monitored, using Semantic Web facilities, by different agents, which communicate among each other to determine the presence of good trading opportunities. When a trading opportunity presents itself, the human traders are notified to determine whether or not to execute the trade. The Semantic Web, Web Services, and URML technologies are used to enable this mechanism. The human traders are notified of the trade at the optimal time so as not to either waste their resources or lose a good trading opportunity. We have also designed a rudimentary prototype system for simulating the interaction between the intelligent agents and the human beings, and show some results through experiments on this simulation for trading of Chicago Board Options Exchange (CBOE) options. [source]
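The coordination idea in this abstract can be sketched in a few lines. This is a toy illustration, not the authors' system: the quorum rule, threshold, and all names below are assumptions.

```python
# Toy sketch (all names and the quorum rule are assumptions, not the
# authors' design): each monitoring agent reports whether it currently
# sees a trading opportunity; the human trader is alerted only when
# enough agents agree, so notifications are neither premature nor missed.
def should_notify(agent_signals, quorum=0.6):
    """agent_signals: dict mapping agent name -> bool ('sees opportunity')."""
    if not agent_signals:
        return False
    agreeing = sum(1 for sees in agent_signals.values() if sees)
    return agreeing / len(agent_signals) >= quorum
```

In use, a signal dictionary like `{"news": True, "prices": True, "filings": False}` crosses the 0.6 quorum and triggers a notification, while a lone dissenting agent does not.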


    Automated normalized FLAIR imaging in MRI-negative patients with refractory focal epilepsy

    EPILEPSIA, Issue 6 2009
    Niels K. Focke
    Summary Background: Patients with focal epilepsy that is refractory to medical treatment are often considered candidates for resective surgery. Magnetic resonance imaging (MRI) has a very important role in the presurgical work-up of these patients, but is unremarkable in about one-third of cases. These patients are often deferred from surgery or have a less positive outcome if surgery is eventually undertaken. The aim of this study was to evaluate our recently described voxel-based technique using routine T2-FLAIR (fluid-attenuated inversion-recovery) scans in MRI-negative patients and to compare the results with video-EEG (electroencephalography) telemetry (VT) findings. Methods: We identified 70 epilepsy patients with refractory focal seizures who underwent VT and had a normal routine MRI. T2-FLAIR scans were bias-corrected, and intensity- and spatially normalized (nFSI), using Statistical Parametric Mapping 5 (SPM5) as previously described. Individual scans were then compared against a set of 25 normal controls using a voxel-based method. Results: SPM5 identified 10 patients with suprathreshold clusters (14.3%). In 50% of these there was concordance between the lobe of the most significant cluster and the presumed lobe of seizure onset, as defined by VT. All cases were concordant with respect to lateralization of the putative focus. Conclusion: Using nFSI we identified focal structural cerebral abnormalities in 11.4% of patients with refractory focal seizures and normal conventional MRI that were fully or partially concordant with scalp VT. This voxel-based analysis of FLAIR scans, which are widely available, could provide a useful tool in the presurgical evaluation of epilepsy patients. Ongoing work is to compare these imaging findings with the results of intracranial EEG and histology of surgical resections. [source]
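The comparison described here (one normalized scan tested against 25 controls, voxel by voxel) is at its core a voxel-wise group statistic. A minimal sketch, assuming simple z-scores rather than SPM5's actual statistical machinery; names and the threshold are illustrative only:

```python
import numpy as np

def voxel_zmap(patient, controls, z_thresh=3.0):
    """Compare one normalized scan voxel-by-voxel against a control group.

    patient  -- 3-D array of normalized FLAIR intensities
    controls -- 4-D array of shape (n_controls, x, y, z)
    Returns the z-score map and a boolean mask of suprathreshold voxels.
    """
    mu = controls.mean(axis=0)
    sd = controls.std(axis=0, ddof=1)
    z = (patient - mu) / np.where(sd > 0, sd, np.inf)  # guard zero variance
    return z, np.abs(z) >= z_thresh
```

A real pipeline would additionally smooth the maps and apply cluster-extent correction, as SPM does; the sketch shows only the per-voxel test.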


    Cerebral Damage in Epilepsy: A Population-based Longitudinal Quantitative MRI Study

    EPILEPSIA, Issue 9 2005
    Rebecca S. N. Liu
    Summary Purpose: Whether cerebral damage results from epileptic seizures remains a contentious issue. We report on the first longitudinal community-based quantitative magnetic resonance imaging (MRI) study to investigate the effect of seizures on the hippocampus, cerebellum, and neocortex. Methods: One hundred seventy-nine patients with epilepsy (66 temporal lobe epilepsy, 51 extratemporal partial epilepsy, and 62 generalized epilepsy) and 90 control subjects underwent two MRI brain scans 3.5 years apart. Automated and manual measurement techniques identified changes in global and regional brain volumes and hippocampal T2 relaxation times. Results: Baseline hippocampal volumes were significantly reduced in patients with temporal lobe epilepsy and could be attributed to an antecedent neurologic insult. Rates of hippocampal, cerebral, and cerebellar atrophy were not syndrome specific and were similar in control and patient groups. Global and regional brain atrophy was determined primarily by age. A prior neurologic insult was associated with reduced hippocampal and cerebellar volumes and an increased rate of cerebellar atrophy. Significant atrophy of the hippocampus, neocortex, or cerebellum occurred in 17% of patients compared with 6.7% of control subjects. Patients with and without significant volume reduction were comparable in terms of seizure frequency, antiepileptic drug (AED) use, and epilepsy duration, with no identifiable risk factors for the development of atrophy. Conclusions: Overt structural cerebral damage is not an inevitable consequence of epileptic seizures. In general, brain volume reduction in epilepsy is the cumulative effect of an initial precipitating injury and age-related cerebral atrophy. Significant atrophy developed in individual patients, particularly those with temporal lobe and generalized epilepsy. Longer periods of observation may detect more subtle effects of seizures. [source]


    Automated ultrasound-assisted method for the determination of the oxidative stability of virgin olive oil

    EUROPEAN JOURNAL OF LIPID SCIENCE AND TECHNOLOGY, Issue 2 2007
    José Platero-López
    Abstract A fast and automated method is proposed for determining the oxidative stability of virgin olive oil by using ultrasound. The ultrasound microprobe (3 mm in diameter) was directly immersed into the olive oil sample contained in a test tube. The most influential variables in the oxidation process, namely pulse amplitude, duty cycle, irradiation time, and sample amount, were optimized. The oil absorbance at 270 nm was continuously monitored by oil recirculation through a 0.1-mm path length flow cell connected to a fiber optic microspectrometer. This short path length allowed the direct monitoring of absorbance without needing any sample dilution. The ultrasound energy was applied during 35 min, and the resulting increase in absorbance was continuously monitored. The difference between the final and the initial absorbance at 270 nm of a set of virgin olive oil samples was closely correlated with their oxidative stability calculated by the Rancimat method (R2 = 0.9915). The resulting equation enabled the prediction of the oxidative stability of virgin olive oil in a short period of time (35 min), by using a simple, inexpensive, automatic and easy-to-use system. [source]
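The prediction step, a linear calibration between the 270 nm absorbance increase and Rancimat stability, can be illustrated as follows. The calibration numbers below are invented for illustration and are not the paper's data:

```python
import numpy as np

# Invented calibration data (NOT the paper's measurements): increase in
# absorbance at 270 nm after 35 min of sonication vs. Rancimat oxidative
# stability in hours, for a set of virgin olive oil samples.
delta_a270  = np.array([0.05, 0.10, 0.15, 0.20, 0.25])
stability_h = np.array([80.0, 60.0, 41.0, 21.0, 2.0])

# Fit the linear calibration once; afterwards a single 35-min ultrasound
# run predicts stability without running a full Rancimat assay.
slope, intercept = np.polyfit(delta_a270, stability_h, 1)
r2 = np.corrcoef(delta_a270, stability_h)[0, 1] ** 2

def predict_stability(da270):
    """Predicted Rancimat stability (hours) from the 270 nm increase."""
    return slope * da270 + intercept
```

The slope is negative by construction: the more the oil's 270 nm absorbance rises under sonication, the lower its resistance to oxidation.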


    A Fluorous Capping Strategy for Fmoc-Based Automated and Manual Solid-Phase Peptide Synthesis

    EUROPEAN JOURNAL OF ORGANIC CHEMISTRY, Issue 4 2006
    Vittorio Montanari
    Abstract Just add water: Peptides synthesized by the use of standardized Fmoc protocols with commercial automated synthesizers can be purified from deletion products by simple centrifugation of aqueous solutions. The deletion products are capped with fluorous trivalent iodonium salts. At the end of the synthesis, the crude peptide is dissolved in water and centrifuged, and the deletion products precipitate leaving only the full length peptide in solution. Protocols for generalized use of this strategy are reported. (© Wiley-VCH Verlag GmbH & Co. KGaA, 69451 Weinheim, Germany, 2006) [source]


    Clinical application of measurement of hippocampal atrophy in degenerative dementias

    HIPPOCAMPUS, Issue 6 2009
    Josephine Barnes
    Abstract Hippocampal atrophy is a characteristic and early feature of Alzheimer's disease. Volumetry of the hippocampus using T1-weighted magnetic resonance imaging (MRI) has been used not only to assess hippocampal involvement in different neurodegenerative diseases as a potential diagnostic biomarker, but also to understand the natural history of diseases, and to track changes in volume over time. Assessing change in structure circumvents issues surrounding interindividual variability and allows assessment of disease progression. Disease-modifying effects of putative therapies are important to assess in clinical trials and are difficult using clinical scales. As a result, there is increasing use of serial MRI in trials to detect potential slowing of atrophy rates as an outcome measure. Automated and yet reliable methods of quantifying such change in the hippocampus would therefore be very valuable. Algorithms capable of measuring such changes automatically have been developed and may be applicable to predict decline to a diagnosis of dementia in the future. This article details the progress in using MRI to understand hippocampal changes in the degenerative dementias and also describes attempts to automate hippocampal segmentation in these diseases. © 2009 Wiley-Liss, Inc. [source]


    Automated seeding for the optimization of crystal quality

    JOURNAL OF APPLIED CRYSTALLOGRAPHY, Issue 4 2010
    Sahir Khurshid
    With the advent of structural genomics a variety of crystallization techniques have been automated and applied to high-throughput pipelines, yet seeding, which is the most common and successful optimization method, is still being performed predominantly manually. The aim of this study was to devise simple automated seeding techniques that can be applied in a routine manner using existing robots and not requiring special tools. Two alternative protocols for automated seeding experiments are described. One involves the delivery of microcrystals from stock to target wells using the robot dispensing tip as a seeding tool. The second harnesses an animal whisker as the seeding tool. Larger and better ordered crystals were obtained using both techniques. [source]


    Automated software-guided identification of new buspirone metabolites using capillary LC coupled to ion trap and TOF mass spectrometry

    JOURNAL OF MASS SPECTROMETRY (INCORP BIOLOGICAL MASS SPECTROMETRY), Issue 2 2006
    Anabel S. Fandiño
    Abstract The identification and structure elucidation of drug metabolites is one of the main objectives in in vitro ADME studies. Typical modern methodologies involve incubation of the drug with subcellular fractions to simulate metabolism followed by LC-MS/MS or LC-MSn analysis and chemometric approaches for the extraction of the metabolites. The objective of this work was the software-guided identification and structure elucidation of major and minor buspirone metabolites using capillary LC as a separation technique and ion trap MSn as well as electrospray ionization orthogonal acceleration time-of-flight (ESI oaTOF) mass spectrometry as detection techniques. Buspirone mainly underwent hydroxylation, dihydroxylation and N-oxidation in S9 fractions in the presence of phase I co-factors and the corresponding glucuronides were detected in the presence of phase II co-factors. The use of automated ion trap MS/MS data-dependent acquisition combined with a chemometric tool allowed the detection of five small chromatographic peaks of unexpected metabolites that co-eluted with the larger chromatographic peaks of expected metabolites. Using automatic assignment of ion trap MS/MS fragments as well as accurate mass measurements from an ESI oaTOF mass spectrometer, possible structures were postulated for these metabolites that were previously not reported in the literature. Copyright © 2006 John Wiley & Sons, Ltd. [source]
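Matching an observed parent-to-metabolite mass shift to a biotransformation by exact mass difference, as accurate-mass TOF data permit, can be sketched like this. The demo masses are generic placeholders, not buspirone's actual values, and the table covers only the transformations named in the abstract:

```python
# Exact monoisotopic mass differences for the biotransformations named in
# the abstract (deltas from standard atomic masses; the parent/metabolite
# masses in the demo below are generic, not buspirone's actual values).
BIOTRANSFORMATIONS = {
    "hydroxylation or N-oxidation (+O)": 15.9949,
    "dihydroxylation (+2O)": 31.9898,
    "glucuronidation (+C6H8O6)": 176.0321,
}

def assign_shift(parent_mass, metabolite_mass, tol=0.005):
    """Match the parent-to-metabolite mass shift to a known transformation,
    within a tolerance reflecting the instrument's mass accuracy."""
    shift = metabolite_mass - parent_mass
    for name, delta in BIOTRANSFORMATIONS.items():
        if abs(shift - delta) <= tol:
            return name
    return None
```

This is the core of how accurate-mass data narrow candidate structures: a +15.9949 Da shift is consistent with oxygen addition but not with, say, methylation (+14.0157 Da).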


    Automated reporting from gel-based proteomics experiments using the open source Proteios database application

    PROTEINS: STRUCTURE, FUNCTION AND BIOINFORMATICS, Issue 5 2007
    Fredrik Levander Dr.
    Abstract The assembly of data from the different parts of a proteomics workflow is often a major bottleneck in proteomics. Furthermore, there is an increasing demand for the publication of details about protein identifications due to the problems with false-positive and false-negative identifications. In this report, we describe how the open-source Proteios software has been expanded to automate the assembly of the different parts of a gel-based proteomics workflow. In Proteios it is possible to generate protein identification reports that contain all the information currently required by proteomics journals. It is also possible for the user to specify maximum allowed false-positive ratios, and reports are automatically generated with the corresponding score cut-offs calculated. When protein identification is conducted using multiple search engines, the score thresholds that correlate to the predetermined error rate are also explicitly calculated for proteins that appear on the result lists of more than one search engine. [source]
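The automatic score-cutoff calculation for a user-specified false-positive ratio can be illustrated with a decoy-based sketch. This is the general kind of calculation the report describes, not Proteios' actual implementation, and all names are assumptions:

```python
def score_threshold(target_scores, decoy_scores, max_fp_ratio=0.01):
    """Lowest score cutoff whose estimated false-positive ratio
    (decoy hits / target hits at or above the cutoff) stays within the
    user-specified limit.  A decoy-database sketch of the kind of cutoff
    calculation described; not Proteios' actual algorithm."""
    best = None
    for cut in sorted(set(target_scores), reverse=True):
        targets = sum(1 for s in target_scores if s >= cut)
        decoys = sum(1 for s in decoy_scores if s >= cut)
        if decoys / targets <= max_fp_ratio:
            best = cut  # keep lowering the cutoff while the ratio holds
        else:
            break
    return best
```

Walking the cutoff down from the best-scoring hit keeps the accepted list as large as possible while the estimated error rate stays under the limit, which is exactly the trade-off a reporting tool must automate.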


    VESGEN 2D: Automated, User-Interactive Software for Quantification and Mapping of Angiogenic and Lymphangiogenic Trees and Networks

    THE ANATOMICAL RECORD : ADVANCES IN INTEGRATIVE ANATOMY AND EVOLUTIONARY BIOLOGY, Issue 3 2009
    Mary B. Vickerman
    Pseudocolor view of vascular branching generations in the chorioallantoic membrane (CAM) of quail. Vascular architecture was analyzed using the automated, user-interactive software, VESsel GENeration Analysis (VESGEN). See Vickerman, et al., on page 320, in this issue. [source]


    Automated, scalable culture of human embryonic stem cells in feeder-free conditions

    BIOTECHNOLOGY & BIOENGINEERING, Issue 6 2009
    Rob J. Thomas
    Abstract Large-scale manufacture of human embryonic stem cells (hESCs) is prerequisite to their widespread use in biomedical applications. However, current hESC culture strategies are labor-intensive and employ highly variable processes, presenting challenges for scaled production and commercial development. Here we demonstrate that passaging of the hESC lines, HUES7 and NOTT1, with trypsin in feeder-free conditions, is compatible with complete automation on the CompacT SelecT, a commercially available and industrially relevant robotic platform. Pluripotency was successfully retained, as evidenced by consistent proliferation during serial passage, expression of stem cell markers (OCT4, NANOG, TRA1-81, and SSEA-4), stable karyotype, and multi-germ-layer differentiation in vitro, including to pharmacologically responsive cardiomyocytes. Automation of hESC culture will expedite cell use in clinical, scientific, and industrial applications. Biotechnol. Bioeng. 2009;102: 1636–1644. © 2008 Wiley Periodicals, Inc. [source]


    Rapid Geometric Modeling for Unstructured Construction Workspaces

    COMPUTER-AIDED CIVIL AND INFRASTRUCTURE ENGINEERING, Issue 4 2003
    Yong-Kwon Cho
    Most automated and semi-automated construction tasks require real-time information about the local workspace in the form of 3D geometric models. This article describes and demonstrates a new rapid, local area, geometric data extraction and 3D visualization method for unstructured construction workspaces that combines human perception, simple sensors, and descriptive CAD models. The rapid approach will be useful in construction to optimize automated equipment tasks and to significantly improve safety and a remote operator's spatial perception of the workspace. [source]


    A Freshwater Classification Approach for Biodiversity Conservation Planning

    CONSERVATION BIOLOGY, Issue 2 2005
    JONATHAN V. HIGGINS
    Abstract: Freshwater biodiversity is highly endangered and faces increasing threats worldwide. To be complete, regional plans that identify critical areas for conservation must capture representative components of freshwater biodiversity as well as rare and endangered species. We present a spatially hierarchical approach to classify freshwater systems to create a coarse filter to capture representative freshwater biodiversity in regional conservation plans. The classification framework has four levels that we described using abiotic factors within a zoogeographic context and mapped in a geographic information system. Methods to classify and map units are flexible and can be automated where high-quality spatial data exist, or can be manually developed where such data are not available. Products include a spatially comprehensive inventory of mapped and classified units that can be used remotely to characterize regional patterns of aquatic ecosystems. We provide examples of classification procedures in data-rich and data-poor regions from the Columbia River Basin in the Pacific Northwest of North America and the upper Paraguay River in central South America. The approach, which has been applied in North, Central, and South America, provides a relatively rapid and pragmatic way to account for representative freshwater biodiversity at scales appropriate to regional assessments. [source]


    Visualizing neurons one-by-one in vivo: Optical dissection and reconstruction of neural networks with reversible fluorescent proteins

    DEVELOPMENTAL DYNAMICS, Issue 8 2006
    Shinsuke Aramaki
    Abstract A great many axons and dendrites intermingle to fasciculate, creating synapses as well as glomeruli. During live imaging in particular, it is often impossible to distinguish between individual neurons when they are contiguous spatially and labeled in the same fluorescent color. In an attempt to solve this problem, we have taken advantage of Dronpa, a green fluorescent protein whose fluorescence can be erased with strong blue light, and reversibly highlighted with violet or ultraviolet light. We first visualized a neural network with fluorescent Dronpa using the Gal4-UAS system. During the time-lapse imaging of axonal navigation, we erased the Dronpa fluorescence entirely; re-highlighted it in a single neuron anterogradely from the soma or retrogradely from the axon; then repeated this procedure for other single neurons. After collecting images of several individual neurons, we then recombined them in multiple pseudo-colors to reconstruct the network. We have also successfully re-highlighted Dronpa using two-photon excitation microscopy to label individual cells located inside of tissues and were able to demonstrate visualization of a Mauthner neuron extending an axon. These "optical dissection" techniques have the potential to be automated in the future and may provide an effective means to identify gene function in morphogenesis and network formation at the single cell level. Developmental Dynamics 235:2192–2199, 2006. © 2006 Wiley-Liss, Inc. [source]


    An automated in situ hybridization screen in the medaka to identify unknown neural genes

    DEVELOPMENTAL DYNAMICS, Issue 3 2005
    Carole Deyts
    Abstract Despite the fact that a large body of factors that play important roles in development are known, there are still large gaps in understanding the genetic pathways that govern these processes. To find previously unknown genes that are expressed during embryonic development, we optimized and performed an automated whole-mount in situ hybridization screen on medaka embryos at the end of somitogenesis. Partial cDNA sequences were compared against public databases and identified according to similarities found to other genes and gene products. Among 321 isolated genes showing specific expression in the central nervous system in at least one of five stages of development, 55.14% represented genes whose functions are already documented (in fish or other model organisms). Additionally, 16.51% were identified as conserved unknown genes or genes with unknown function. We provide new data on eight of these genes that presented a restricted expression pattern that allowed for formulating testable hypotheses on their developmental roles, and that were homologous to mammalian molecules of unknown function. Thus, gene expression screening in medaka is an efficient tool for isolating new regulators of embryonic development, and can complement genome-sequencing projects that are producing a high number of genes without ascribed functions. Developmental Dynamics 234:698–708, 2005. © 2005 Wiley-Liss, Inc. [source]


    Development of microreactor array chip-based measurement system for massively parallel analysis of enzymatic activity

    ELECTRONICS & COMMUNICATIONS IN JAPAN, Issue 4 2009
    Yosuke Hosoi
    Abstract Microarray chip technology such as DNA chips, peptide chips, and protein chips is one of the promising approaches for achieving high-throughput screening (HTS) of biomolecule function since it has great advantages in feasibility of automated information processing due to one-to-one indexing between array position and molecular function as well as massively parallel sample analysis as a benefit of downsizing and large-scale integration. Mostly, however, the function that can be evaluated by such microarray chips is limited to affinity of target molecules. In this paper, we propose a new HTS system for enzymatic activity based on microreactor array chip technology. A prototype of the automated and massively parallel measurement system for fluorometric assay of enzymatic reactions was developed by the combination of microreactor array chips and a highly sensitive fluorescence microscope. Design strategy of microreactor array chips and an optical measurement platform for the high-throughput enzyme assay are discussed. © 2009 Wiley Periodicals, Inc. Electron Comm Jpn, 92(4): 35–41, 2009; Published online in Wiley InterScience (www.interscience.wiley.com). DOI 10.1002/ecj.10056 [source]


    Continuous intact cell detection and viability determination by CE with dual-wavelength detection

    ELECTROPHORESIS, Issue 2 2010
    Xiaomin Ren
    Abstract We introduce here a method for continuous intact cell detection and viability determination of individual trypan blue stained cells by CE with ultraviolet–visible dual-wavelength detection. To avoid cell aggregation or damage during electrophoresis, cells after staining were fixed with 4% formaldehyde and were continuously introduced into the capillary by EOF. The absorbance of a cell at 590 nm was used to determine its viability. An absorbance of two milli-absorbance units (mAU) at 590 nm was the clear cut-off point for living and dead HeLa cells in our experiments. Good viability correlation between the conventional trypan blue staining assay and our established CE method (correlation coefficient, R2=0.9623) was demonstrated by analysis of cell mixtures with varying proportions of living and dead cells. The CE method was also used to analyze the cytotoxicity of methylmercury, and the results were in good agreement with the trypan blue staining assay and 3-(4,5-dimethyl-2-thiazyl)-2,5-diphenyl-2H-tetrazolium bromide methods. Compared with the 3-(4,5-dimethyl-2-thiazyl)-2,5-diphenyl-2H-tetrazolium bromide method, our established CE method can be easily automated to report cell viability based on the state of individual cells. Tedious manual cell counting and human error due to investigator bias can be avoided by using this method. [source]
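The viability call itself reduces to a threshold on the 590 nm absorbance. A minimal sketch using the 2 mAU cut-off stated above (function and variable names are assumptions):

```python
def classify_cells(absorbances_mau, cutoff_mau=2.0):
    """Call each cell live or dead from its absorbance at 590 nm, given in
    milli-absorbance units.  Trypan blue enters only dead cells, so a peak
    at or above the cut-off (2 mAU in the paper) marks a dead cell."""
    labels = ["dead" if a >= cutoff_mau else "live" for a in absorbances_mau]
    viability_pct = 100.0 * labels.count("live") / len(labels)
    return labels, viability_pct
```

Because every detected peak is classified individually, the per-cell labels and the aggregate viability percentage come from the same pass over the electropherogram, which is what makes the method easy to automate.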


    Direct automatic determination of free and total anesthetic drugs in human plasma by use of a dual (microdialysis–microextraction by packed sorbent) sample treatment coupled at-line to NACE–MS

    ELECTROPHORESIS, Issue 10 2009
    Gabriel Morales-Cid
    Abstract This paper reports for the first time the use of microextraction by packed sorbent in combination with CE. The combined system was used to determine anesthetic drugs in human plasma. A microdialysis fiber was coupled on-line to the microextraction unit in order to distinguish between free and total concentrations of drugs. The system was automated by connecting the microextraction unit to a syringe pump and interfacing it to a computer. The ensuing method allows the determination of 10 µg/L concentrations of free drugs and 1 µg/L concentrations of total drugs from only 200 µL of sample with an RSD of less than 9%. [source]


    A fully automated 2-D LC-MS method utilizing online continuous pH and RP gradients for global proteome analysis

    ELECTROPHORESIS, Issue 23 2007
    Hu Zhou
    Abstract The conventional 2-D LC-MS/MS setup for global proteome analysis was based on online and offline salt gradients (step and continuous) using strong-cation-exchange chromatography in conjunction with RP chromatography and MS. The use of the online system with step salt elution had the possibility of resulting in peptide overlapping across fractions. The offline mode had the option to operate with continuous salt gradient to decrease peak overlap, but exhibited decreased robustness, lower reproducibility, and sample loss during the process. Due to the extensive washing requirement between the chromatography steps, online continuous gradient was not an option for salt elution. In this report, a fully automated, online, and continuous gradient (pH continuous online gradient, pCOG) 2-D LC-MS/MS system is introduced that provided excellent separation and identification power. The pH gradient-based elution provided more basic peptides than that of salt-based elution. Fraction overlap was significantly minimized by combining pH and continuous gradient elutions. This latter approach also increased sequence coverage and the concomitant confidence level in protein identification. The salt and pH elution-based 2-D LC-MS/MS approaches were compared by analyzing the mouse liver proteome. [source]


    An automated, sheathless capillary electrophoresis-mass spectrometry platform for discovery of biomarkers in human serum

    ELECTROPHORESIS, Issue 7-8 2005
    Alexander P. Sassi
    Abstract A capillary electrophoresis-mass spectrometry (CE-MS) method has been developed to perform routine, automated analysis of low-molecular-weight peptides in human serum. The method incorporates transient isotachophoresis for in-line preconcentration and a sheathless electrospray interface. To evaluate the performance of the method and demonstrate the utility of the approach, an experiment was designed in which peptides were added to sera from individuals at each of two different concentrations, artificially creating two groups of samples. The CE-MS data from the serum samples were divided into separate training and test sets. A pattern-recognition/feature-selection algorithm based on support vector machines was used to select the mass-to-charge (m/z) values from the training set data that distinguished the two groups of samples from each other. The added peptides were identified correctly as the distinguishing features, and pattern recognition based on these peptides was used to assign each sample in the independent test set to its respective group. A twofold difference in peptide concentration could be detected with statistical significance (p-value < 0.0001). The accuracy of the assignment was 95%, demonstrating the utility of this technique for the discovery of patterns of biomarkers in serum. [source]
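The pattern-recognition step pairs feature selection over m/z values with classification. As a much simpler univariate stand-in for the SVM-based selection used in the paper, here is a separability ranking by standardized mean difference; all names are assumptions:

```python
import numpy as np

def rank_mz_features(group_a, group_b):
    """Rank m/z features by how well they separate two sample groups.

    A simple univariate stand-in for the SVM-based feature selection in
    the paper: score = |difference of group means| / pooled std. dev.
    group_a, group_b -- arrays of shape (n_samples, n_features).
    Returns feature indices, most discriminating first."""
    diff = np.abs(group_a.mean(axis=0) - group_b.mean(axis=0))
    pooled = np.sqrt((group_a.var(axis=0, ddof=1)
                      + group_b.var(axis=0, ddof=1)) / 2.0)
    scores = diff / np.where(pooled > 0, pooled, np.inf)
    return np.argsort(scores)[::-1]
```

In the paper's design, the spiked-in peptides should dominate such a ranking, since their intensities differ systematically between the two groups while endogenous peptides do not.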


    Electrophoretically mediated microanalysis with partial filling technique and indirect or direct detection as a tool for inhibition studies of enzymatic reaction

    ELECTROPHORESIS, Issue 7-8 2004
    Magdaléna Telnarová
    Abstract The inhibition of the model enzyme, haloalkane dehalogenase from Sphingomonas paucimobilis, was investigated by a combination of electrophoretically mediated microanalysis with a partial filling technique, followed by indirect or direct detection. In this setup, part of the capillary is filled with a buffer suitable for the enzymatic reaction (20 mM glycine buffer, pH 8.6) whereas the rest of the capillary is filled with the background electrolyte optimal for separation of substrates and products. Two different background electrolytes and corresponding detection approaches were used to show the versatility of the developed method. The inhibition effect of 1,2-dichloroethane on the dehalogenation of the brominated substrate 1-bromobutane was studied by means of 10 mM chromate with 0.1 mM cetyltrimethylammonium bromide (pH 9.2) in combination with indirect detection, or 20 mM β-alanine with hydrochloric acid (pH 3.5) in combination with direct detection. The method was used to estimate the inhibition constant KI (0.44 mM by indirect detection and 0.63 mM by direct detection) and to determine the inhibition type. Compared to spectrophotometric and other discontinuous assays, the method is rapid, can be automated, and requires only a small amount of reagents, which is especially important in the case of enzymes and inhibitors. [source]
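Estimating an inhibition constant from activities measured at several inhibitor concentrations can be illustrated with the simplest competitive-inhibition relationship at fixed substrate concentration. This is a didactic simplification, not the exact treatment used in the paper:

```python
def estimate_ki(inhibitor_mM, relative_activity):
    """Estimate K_I assuming the simplest inhibition model
        v/v0 = 1 / (1 + [I]/K_I)   =>   (v0/v - 1) = [I] / K_I,
    so K_I is 1/slope of a line through the origin when (v0/v - 1) is
    plotted against [I].  A didactic simplification, not the paper's
    actual data treatment."""
    ys = [1.0 / a - 1.0 for a in relative_activity]
    # Least-squares slope through the origin: sum(x*y) / sum(x*x).
    slope = (sum(x * y for x, y in zip(inhibitor_mM, ys))
             / sum(x * x for x in inhibitor_mM))
    return 1.0 / slope
```

With noise-free data generated from K_I = 0.5 mM, the fit recovers 0.5 mM exactly, which is the sanity check used below.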


    A three-dimensional model of the U1 small nuclear ribonucleoprotein particle

    ENTOMOLOGICAL RESEARCH, Issue 2 2010
    Jason A. SOMARELLI
    Abstract Most of the pre-mRNAs in the eukaryotic cell are comprised of protein-coding exons and non-protein-coding introns. The introns are removed and the exons are ligated together, or spliced, by a large, macromolecular complex known as the spliceosome. This RNA-protein assembly is made up of five uridine-rich small nuclear RNAs (U1-, U2-, U4-, U5- and U6-snRNA) as well as over 300 proteins, which form small nuclear ribonucleoprotein particles (snRNPs). Initial recognition of the 5′ exon/intron splice site is mediated by the U1 snRNP, which is composed of the U1 snRNA as well as at least ten proteins. By combining structural informatics tools with the available biochemical and crystallographic data, we attempted to simulate a complete, three-dimensional U1 snRNP from the silk moth, Bombyx mori. Comparison of our model with empirically derived crystal structures and electron micrographs pinpoints both the strengths and weaknesses in the in silico determination of macromolecular complexes. One of the most striking differences between our model and experimentally generated structures is in the positioning of the U1 snRNA stem-loops. This highlights the continuing difficulties in generating reliable, complex RNA structures; however, three-dimensional modeling of individual protein subunits by threading provided models of biological significance, and the use of both automated and manual docking strategies generated a complex that closely reflects the assembly found in nature. Yet, without utilizing experimentally derived contacts to select the most likely docking scenario, ab initio docking would fall short of providing a reliable model. Our work shows that the combination of experimental data with structural informatics tools can result in generation of near-native macromolecular complexes. [source]


    Hippocampal volume assessment in temporal lobe epilepsy: How good is automated segmentation?

    EPILEPSIA, Issue 12 2009
    Heath R. Pardoe
    Summary Purpose: Quantitative measurement of hippocampal volume using structural magnetic resonance imaging (MRI) is a valuable tool for detection and lateralization of mesial temporal lobe epilepsy with hippocampal sclerosis (mTLE). We compare two automated hippocampal volume methodologies and manual hippocampal volumetry to determine which technique is most sensitive for the detection of hippocampal atrophy in mTLE. Methods: We acquired a three-dimensional (3D) volumetric sequence in 10 patients with left-lateralized mTLE and 10 age-matched controls. Hippocampal volumes were measured manually and using the software packages Freesurfer and FSL-FIRST. The sensitivities of the techniques were compared by determining the effect size for average volume reduction in patients with mTLE compared to controls. The volumes and spatial overlap of the automated and manual segmentations were also compared. Results: Significant volume reduction in affected hippocampi in mTLE compared to controls was detected by manual hippocampal volume measurement (p < 0.01, effect size 33.2%), Freesurfer (p < 0.01, effect size 20.8%), and FSL-FIRST (p < 0.01, effect size 13.6%) after correction for brain volume. Freesurfer correlated reasonably well (r = 0.74, p << 0.01) with manual segmentation and FSL-FIRST relatively poorly (r = 0.47, p << 0.01). The spatial overlap between manual and automated segmentation was reduced in affected hippocampi, suggesting that the accuracy of automated segmentation is reduced in pathologic brains. Discussion: Expert manual hippocampal volumetry is more sensitive than both automated methods for the detection of hippocampal atrophy associated with mTLE. In our study Freesurfer was the more sensitive of the two automated methods and could be used if expert manual segmentation is not available. [source]
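    The two comparison metrics used above, effect size as percent mean volume reduction and spatial overlap between segmentations, can be sketched as follows. The volumes and voxel sets are invented, and the Dice coefficient is assumed as the overlap measure (the abstract does not name the one used).

```python
def percent_reduction(controls, patients):
    """Effect size as mean volume reduction relative to controls (%)."""
    mc = sum(controls) / len(controls)
    mp = sum(patients) / len(patients)
    return 100.0 * (mc - mp) / mc

def dice(a, b):
    """Dice spatial-overlap coefficient between two voxel label sets."""
    a, b = set(a), set(b)
    return 2.0 * len(a & b) / (len(a) + len(b))

# Illustrative hippocampal volumes in mm^3, not the study's data.
ctrl = [3200, 3100, 3350, 3050]
pat = [2400, 2500, 2300, 2600]
print(round(percent_reduction(ctrl, pat), 1))   # 22.8 (% reduction)

manual = {(1, 2), (1, 3), (2, 2), (2, 3)}       # voxel coordinates
auto = {(1, 2), (1, 3), (2, 3), (3, 3)}
print(dice(manual, auto))                        # 0.75
```

    In practice the voxel sets would come from the binary label volumes produced by the manual tracer and by Freesurfer or FSL-FIRST.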


    Antioxidant capacity of rapeseed meal and rapeseed oils enriched with meal extract

    EUROPEAN JOURNAL OF LIPID SCIENCE AND TECHNOLOGY, Issue 7 2010
    Aleksandra Szydłowska-Czerniak
    Abstract Response surface methodology (RSM) was used to evaluate the quantitative effects of two independent variables, solvent polarity and temperature of the extraction process, on the antioxidant capacity (AC) and total phenolics content (TPC) of rapeseed meal extracts. The mean AC and TPC results for meal ranged between 1181 and 9974 µmol TE/100 g and between 73.8 and 814 mg sinapic acid/100 g of meal, respectively. The experimental results for AC and TPC were close to the predicted values calculated from the polynomial response surface model equations (R2 = 0.9758 and 0.9603, respectively). The effect of solvent polarity on AC and TPC in the examined extracts was about 3.6 and 2.6 times greater, respectively, than the effect of processing temperature. The predicted optimum solvent polarities of ε = 78.3 and 63.8 and temperatures of 89.4 and 74.2°C resulted in an AC of 10,014 µmol TE/100 g and a TPC of 863 mg SAE/100 g meal, respectively. The phenolic profile of rapeseed meal was determined by an HPLC method. The main phenolics in rapeseed meal were sinapine and sinapic acid. Refined rapeseed oils were fortified with an extract, rich in polyphenols, obtained from rapeseed meal. The supplemented rapeseed oil had a higher AC and TPC than the refined oil without addition of meal extracts. However, AC and TPC in the enriched oils decreased during storage. The TPC in the studied meal extracts and rapeseed oils correlated significantly and positively (p < 0.0000001) with their AC (R2 = 0.9387). Practical applications: Many bioactive compounds extracted from rapeseed meal provide health benefits and have antioxidative properties. It therefore seems worthwhile to consider the application of antioxidants extracted from rapeseed meal in the production of rapeseed oils with potent AC. Moreover, antioxidants extracted from the rapeseed meal were added to refined rapeseed oil in order to enhance its AC, which was then tested by the FRAP assay.
The FRAP method is based on the reduction of the ferric tripyridyltriazine complex (Fe3+-TPTZ) to the ferrous form (Fe2+-TPTZ); it is simple, fast, low in cost, and robust, does not require specialized equipment, and can be performed using automated, semi-automatic, or manual procedures. Therefore, the proposed FRAP method can be employed by fat-industry laboratories to assess the AC of rapeseed oils and meal. [source]
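    A FRAP reading is typically converted to Trolox equivalents (TE) via a linear calibration curve of absorbance against Trolox standards. The sketch below is hypothetical: the standard series and absorbance values are invented, and a simple ordinary-least-squares fit stands in for whatever calibration procedure the laboratory uses.

```python
def linfit(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

# Trolox standards (µmol/L) vs. absorbance at 593 nm; illustrative
# calibration points, not measurements from the paper.
trolox = [0, 100, 200, 400, 600]
a593 = [0.02, 0.11, 0.20, 0.38, 0.56]
slope, intercept = linfit(trolox, a593)

def frap_te(absorbance):
    """Convert a sample absorbance to a Trolox-equivalent concentration."""
    return (absorbance - intercept) / slope

print(round(frap_te(0.29), 1))  # mid-curve reading -> 300.0 µmol TE/L
```

    The µmol TE/100 g values quoted in the abstract would then follow from the extract dilution and sample mass, which are not given here.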


    Improved automated extraction and separation procedure for soil lipid analyses

    EUROPEAN JOURNAL OF SOIL SCIENCE, Issue 2 2004
    G. L. B. Wiesenberg
    Summary Analysis of soil lipids may contribute to an improved understanding of atmosphere to soil carbon fluxes, soil organic matter source differentiation and pollutant accumulation. Soil lipids, mostly originating from plants and microorganisms, have traditionally been analysed by non-automated extraction and separation methods, which produce several lipid fractions, operationally defined by polarity. Here we present a combination of fast, automated and reproducible techniques, adopted from organic geochemical studies, for preparative separation of individual soil lipid fractions with increasing polarity. These techniques involve commercially available instruments, including accelerated solvent extraction and a two-step automated medium-pressure liquid chromatography procedure. The method yields eight lipid fractions consisting of five fractions fully amenable to gas chromatography/mass spectrometry (GC/MS) (aliphatic hydrocarbons, aromatic hydrocarbons, ketones, alcohols, carboxylic acids), and three fractions of highly polar or high molecular weight compounds (bases, very long-chain wax esters (C40+), high polarity compounds) that were not measurable with GC/MS under standard conditions. We tested the method on five agricultural soils. Results show that (i) mass recoveries for the individual fractions are reproducible, (ii) within individual fractions compound distribution patterns are reproducible, as demonstrated for alkanes and carboxylic acids, and (iii) individual fractions represent distinct and clean compound classes, free of interfering substances detectable by GC/MS. Thus, automated separation can be a fast, effective and reproducible procedure for fractionation of complex mixtures of soil lipids into clean compound classes, directly suitable for a variety of molecular (e.g. GC/MS) and isotopic characterizations (e.g. gas chromatography coupled with isotope ratio monitoring mass spectrometry or accelerator mass spectrometry). [source]


    Optimal Control of Rigid-Link Manipulators by Indirect Methods

    GAMM - MITTEILUNGEN, Issue 1 2008
    Rainer Callies
    Abstract The present paper is a survey and research paper on the treatment of optimal control problems for rigid-link manipulators by indirect methods. Maximum Principle based approaches provide an excellent tool for calculating optimal reference trajectories for multi-link manipulators with high accuracy. Their major drawback has been the need to explicitly formulate the complicated system of adjoint differential equations and to apply the full apparatus of optimal control theory, which is necessary to convert the optimal control problem into a piecewise defined, nonlinear multi-point boundary value problem. Accurate and efficient access to first- and higher-order derivatives is crucial. The approach described in this paper makes it possible to generate all the derivative information recursively and simultaneously with the recursive formulation of the equations of motion. Nonlinear state and control constraints are treated without any simplifications by transforming them into sequences of systems of linear equations. By these means, the modeling of the complete optimal control problem and the accompanying boundary value problem is automated to a great extent. The fast numerical solution is by the advanced multiple shooting method JANUS. (© 2008 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim) [source]
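    The indirect approach reduces the optimal control problem to a boundary value problem solved by (multiple) shooting. As a toy illustration of the shooting idea only, not of manipulator dynamics or the JANUS solver, the sketch below solves u'' = -u with u(0) = 0, u(π/2) = 1 by iterating on the unknown initial slope with a secant update and RK4 integration; the exact solution is u = sin t, so the slope converges to 1.

```python
import math

def rk4(f, y, t, h):
    """One classical Runge-Kutta step for the system y' = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + h / 2, [yi + h / 2 * ki for yi, ki in zip(y, k1)])
    k3 = f(t + h / 2, [yi + h / 2 * ki for yi, ki in zip(y, k2)])
    k4 = f(t + h, [yi + h * ki for yi, ki in zip(y, k3)])
    return [yi + h / 6 * (a + 2 * b + 2 * c + d)
            for yi, a, b, c, d in zip(y, k1, k2, k3, k4)]

def shoot(slope, n=200):
    """Integrate u'' = -u with u(0) = 0, u'(0) = slope up to t = pi/2."""
    f = lambda t, y: [y[1], -y[0]]
    y, t, h = [0.0, slope], 0.0, (math.pi / 2) / n
    for _ in range(n):
        y = rk4(f, y, t, h)
        t += h
    return y[0]  # terminal value u(pi/2)

# Secant iteration on the initial slope so that u(pi/2) = 1.
s0, s1 = 0.5, 2.0
for _ in range(10):
    r0, r1 = shoot(s0) - 1.0, shoot(s1) - 1.0
    if abs(r1) < 1e-12:
        break
    s0, s1 = s1, s1 - r1 * (s1 - s0) / (r1 - r0)

print(round(s1, 6))  # converges to the exact slope 1.0
```

    Multiple shooting subdivides the interval and solves for matching conditions at the interior nodes as well, which is what makes the approach robust for the stiff, piecewise-defined problems described in the abstract.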


    Asymptomatic leukocyturia and the autonomic nervous system in women

    GERIATRICS & GERONTOLOGY INTERNATIONAL, Issue 2 2009
    Yoshimasa Igari
    Background: The present study sought to investigate the relationship between asymptomatic leukocyturia (ASL) and autonomic nervous function by power spectral analysis of the R-R intervals in women. Methods: One hundred and forty-two female outpatients aged 23–91 years were studied. We regarded ASL as present if two consecutive samples were found to have 10 or more leukocytes/high-power field at ×400 magnification in a centrifuged midstream urine sample. The R-R intervals of all subjects were measured by the wavelet transform analysis system. This system detected R-R variation data distributed in two bands: low-frequency power (LF) (0.04–0.15 Hz) and high-frequency power (HF) (0.15–0.40 Hz). The ratio of LF to HF (LF/HF) was also determined. Post-void residual urine volume was measured using an automated, compact 3-D ultrasound device. Results: The patients with ASL had diabetes mellitus more frequently than those without ASL. Residual urine volume was significantly higher in the former than in the latter, while the HF values in both a recumbent position and a standing position were significantly lower in the former than in the latter (P = 0.003 and P = 0.001, respectively). However, there were no significant differences in LF or LF/HF values in either a recumbent or a standing position between the two groups. The HF values in both a recumbent position and a standing position were independent indicators of ASL, even after adjustment for age, diabetes mellitus and residual urine volume. Conclusion: The present study reveals a relationship between ASL and impairment of the parasympathetic nervous system in women. [source]
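    The LF and HF quantities are band powers of the R-R (tachogram) spectrum. The study used a wavelet transform; the sketch below substitutes a plain discrete-Fourier periodogram on a synthetic, evenly resampled series (an assumption made for illustration), summing power over the 0.04–0.15 Hz and 0.15–0.40 Hz bands and forming LF/HF.

```python
import math

def band_power(x, fs, lo, hi):
    """Sum periodogram power over the DFT bins with lo <= f < hi."""
    n = len(x)
    total = 0.0
    for k in range(1, n // 2):          # positive frequencies only
        f = k * fs / n
        if lo <= f < hi:
            re = sum(x[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
            im = sum(x[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
            total += re * re + im * im
    return total

# Synthetic evenly resampled R-R series: a 0.125 Hz (LF-band) rhythm
# with twice the amplitude of a 0.3125 Hz (HF-band) rhythm.
fs, n = 4.0, 256
x = [1.0 * math.sin(2 * math.pi * 0.125 * t / fs)
     + 0.5 * math.sin(2 * math.pi * 0.3125 * t / fs) for t in range(n)]

lf = band_power(x, fs, 0.04, 0.15)
hf = band_power(x, fs, 0.15, 0.40)
print(round(lf / hf, 3))  # power ratio = squared amplitude ratio = 4.0
```

    Both test frequencies fall exactly on DFT bins, so there is no spectral leakage and the LF/HF ratio equals the squared amplitude ratio.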


    Contrasting soil respiration in young and old-growth ponderosa pine forests

    GLOBAL CHANGE BIOLOGY, Issue 12 2002
    J. IRVINE
    Abstract Three years of fully automated and manual measurements of soil CO2 efflux, soil moisture and temperature were used to explore the diel, seasonal and inter-annual patterns of soil efflux in an old-growth (250-year-old, O site) and recently regenerating (14-year-old, Y site) ponderosa pine forest in central Oregon. The data were used in conjunction with empirical models to determine which variables could be used to predict soil efflux in forests of contrasting ages and disturbance histories. Both stands experienced similar meteorological conditions with moderately cold wet winters and hot dry summers. Soil CO2 efflux at both sites showed large inter-annual variability that could be attributed to soil moisture availability in the deeper soil horizons (O site) and the quantity of summer rainfall (Y site). Seasonal patterns of soil CO2 efflux at the O site showed a strong positive correlation between diel mean soil CO2 efflux and soil temperature at 64 cm depth, whereas diel mean soil efflux at the Y site declined before maximum soil temperature occurred during summer drought. The use of diel mean soil temperature and soil water potential inferred from predawn foliage water potential measurements could account for 80% of the variance of diel mean soil efflux across 3 years at both sites; however, the functional shape of the soil water potential constraint was site-specific. Based on the similarity of the decomposition rates of litter and fine roots between sites, but greater productivity and amount of fine litter detritus available for decomposition at the O site, we would expect higher rates of soil CO2 efflux at the O site. However, annual rates were only higher at the O site in one of the 3 years (597 ± 45 vs. 427 ± 80 g C m−2). Seasonal patterns of soil efflux at both sites showed influences of soil water limitations that were also reflected in patterns of canopy stomatal conductance, suggesting strong linkages between above- and below-ground processes.
[source]
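    The empirical models mentioned above combine a temperature response with a site-specific soil-water-potential constraint. A common functional form, assumed here purely for illustration (the abstract does not give the paper's equations or fitted constants), is an exponential Q10 temperature term multiplied by a moisture scalar:

```python
import math

def soil_efflux(temp_c, psi_mpa, r10=2.0, q10=2.5, psi_ref=-1.5):
    """Illustrative soil CO2 efflux model (µmol CO2 m^-2 s^-1):
    an exponential Q10 temperature response times a water-potential
    scalar that falls from 1 toward 0 as the soil dries.
    All parameter values are assumptions, not fitted site constants."""
    f_temp = r10 * q10 ** ((temp_c - 10.0) / 10.0)
    f_water = math.exp(psi_mpa / abs(psi_ref))   # psi <= 0 MPa when drying
    return f_temp * f_water

# Warm, moist soil respires more than equally warm, drought-stressed soil.
print(round(soil_efflux(20.0, -0.1), 2))   # moist midsummer conditions
print(round(soil_efflux(20.0, -2.0), 2))   # same temperature, dry soil
```

    The site-specific behaviour reported in the abstract would correspond to fitting a different moisture function f_water for each stand while sharing the general model structure.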


    Application of Six Sigma Methods for Improving the Analytical Data Management Process in the Environmental Industry

    GROUND WATER MONITORING & REMEDIATION, Issue 2 2006
    Christopher M. French
    Honeywell applied the rigorous and well-documented Six Sigma quality-improvement approach to the complex, highly heterogeneous, and mission-critical process of remedial-site environmental data management to achieve major gains in data quality, environmental risk reduction, and overall process cost. The primary focus was to apply both qualitative and quantitative Six Sigma methods to improve electronic management of analytical laboratory data generated for environmental remediation and long-term monitoring programs. The process includes electronic data delivery, data QA/QC checking, data verification, data validation, database administration, regulatory agency reporting, and linkage to spatial information and real-time geographical information systems. The analysis found that automated, centralized web-based software tools delivered through the Software as a Service (SaaS) model are optimal for improving the process, reducing costs while simultaneously improving data quality and long-term data usability and preservation. A pilot project was completed that quantified cycle time and cost improvements of 50% and 65%, respectively. [source]


    Evaluation of operators' performance for automation design in the fully digital control room of nuclear power plants

    HUMAN FACTORS AND ERGONOMICS IN MANUFACTURING & SERVICE INDUSTRIES, Issue 1 2010
    Chiuhsiang Joe Lin
    Abstract Recent technical developments in computer hardware and software have meant that human,machine systems can be automated in many respects. If automation fails, however, human operators can have difficulty in recognizing the existence of a problem, identifying what has failed, and taking corrective action to remedy these out-of-the-loop (OOTL) performance problems. Several studies have suggested that taxonomies of levels of automation (LOAs) and types of automation (TOAs) can be used to solve OOTL problems. This study examined the impact of LOAs in process control automation within the context of nuclear power plants (NPPs). A simulation experiment in an NPP is performed to validate this framework using an automatic mode and a semiautomatic mode. Mental demand is found to be significantly reduced under the automatic mode; however, participants felt frustrated with this LOA. Situation awareness is found to be similar in the two modes. The results of an end-of-experiment subjective rating reveal that participants were evenly divided between the two modes with respect to generating and selecting functions. It is therefore suggested that human operators be involved in generating and selecting functions under an automatic mode. © 2009 Wiley Periodicals, Inc. [source]