Data Processing



Selected Abstracts


Data processing in metabolic fingerprinting by CE-UV: Application to urine samples from autistic children

ELECTROPHORESIS, Issue 6 2007
Ana C. Soria
Abstract Metabolic fingerprinting of biofluids such as urine can be used to detect and analyse differences between individuals. However, before pattern recognition methods can be utilised for classification, preprocessing techniques for the denoising, baseline removal, normalisation and alignment of electropherograms must be applied. Here a MEKC method using diode array detection has been used for high-resolution separation of both charged and neutral metabolites. Novel and generic algorithms have been developed for use prior to multivariate data analysis. Alignment is achieved by combining the use of reference peaks with a method that uses information from multiple wavelengths to align electropherograms to a reference signal. This metabolic fingerprinting approach by MEKC has been applied for the first time to urine samples from autistic and control children in a nontargeted and unbiased search for markers for autism. Although no biomarkers for autism could be determined using MEKC data here, the general approach presented could also be applied to the processing of other data collected by CE with UV-Vis detection. [source]
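
The alignment step lends itself to a simple illustration. The sketch below is not the authors' multi-wavelength algorithm; it only shows, for a single detection wavelength, how an electropherogram can be shifted to maximise its cross-correlation with a reference trace (the function and parameter names are hypothetical).

```python
import numpy as np

def align_to_reference(signal, reference, max_shift=200):
    """Shift `signal` so that it best overlays `reference` (single-wavelength
    sketch): the shift maximising the cross-correlation within +/- max_shift
    points is applied, and samples that wrap around are zeroed."""
    best_shift, best_score = 0, -np.inf
    for shift in range(-max_shift, max_shift + 1):
        score = float(np.dot(np.roll(signal, shift), reference))
        if score > best_score:
            best_shift, best_score = shift, score
    aligned = np.roll(signal, best_shift)
    if best_shift > 0:
        aligned[:best_shift] = 0.0          # zero the wrapped leading samples
    elif best_shift < 0:
        aligned[best_shift:] = 0.0          # zero the wrapped trailing samples
    return aligned, best_shift
```

In the published method, reference peaks and information from multiple wavelengths constrain the shift; a correlation search of this kind is only the simplest single-channel analogue.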


Data processing and reconciliation in chemical process operations.

AICHE JOURNAL, Issue 4 2001
By J. Romagnoli and M. C. Sánchez, Academic Press, San Diego; 270 pp., $79.9
No abstract is available for this article. [source]


Nuclear magnetic resonance data processing. MestRe-C: A software package for desktop computers

CONCEPTS IN MAGNETIC RESONANCE, Issue 2 2003
Abstract Magnetic Resonance Companion (MestRe-C) is a software package that offers state-of-the-art facilities for data processing, visualization, and analysis of high-resolution nuclear magnetic resonance (NMR) data, combined with a robust, user-friendly graphical interface that fully exploits the power and flexibility of the Windows platform. The program provides a variety of conversion facilities for most NMR spectrometer formats and includes all the conventional processing, displaying, and plotting capabilities of an NMR program, as well as more advanced processing techniques. A brief review of the basic concepts of NMR data processing is also included. © 2003 Wiley Periodicals, Inc. Concepts Magn Reson Part A 19A: 80–96, 2003. [source]
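
As a reminder of the conventional processing steps such packages implement (apodization, zero-filling, Fourier transformation, phase correction), here is a generic one-dimensional sketch in Python with NumPy. It is illustrative only and is not MestRe-C code; parameter names and defaults are assumptions.

```python
import numpy as np

def process_fid(fid, dwell_time, lb=1.0, zero_fill=2, phase0_deg=0.0):
    """Generic 1D NMR processing sketch: exponential line broadening (lb in Hz),
    zero-filling, FFT and zero-order phase correction."""
    n = fid.size
    t = np.arange(n) * dwell_time                        # acquisition time axis (s)
    apodized = fid * np.exp(-np.pi * lb * t)             # exponential window
    padded = np.zeros(n * zero_fill, dtype=complex)
    padded[:n] = apodized                                # zero-fill for digital resolution
    spectrum = np.fft.fftshift(np.fft.fft(padded))
    spectrum = spectrum * np.exp(1j * np.deg2rad(phase0_deg))   # zero-order phase
    freq = np.fft.fftshift(np.fft.fftfreq(padded.size, d=dwell_time))
    return freq, spectrum
```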


Novel software architecture for rapid development of magnetic resonance applications

CONCEPTS IN MAGNETIC RESONANCE, Issue 3 2002
Josef Debbins
Abstract As the pace of clinical magnetic resonance (MR) procedures grows, the need for an MR scanner software platform on which developers can rapidly prototype, validate, and produce product applications becomes paramount. A software architecture has been developed for a commercial MR scanner that employs state-of-the-art software technologies including Java, C++, DICOM, XML, and so forth. This system permits graphical (drag and drop) assembly of applications built on simple processing building blocks, including pulse sequences, a user interface, reconstruction and postprocessing, and database control. The application developer (researcher or commercial) can assemble these building blocks to create custom applications. The developer can also write source code directly to create new building blocks and add these to the collection of components, which can be distributed worldwide over the internet. The application software and its components are developed in Java, which assures platform portability across any host computer that supports a Java Virtual Machine. The downloaded executable portion of the application is executed in compiled C++ code, which assures mission-critical real-time execution during fast MR acquisition and data processing on dedicated embedded hardware that supports C or C++. This combination permits flexible and rapid MR application development across virtually any combination of computer configurations and operating systems, and yet it allows for very high performance execution on actual scanner hardware. Applications, including prescan, are inherently real-time enabled and can be aggregated and customized to form "superapplications," wherein one or more applications work with another to accomplish the clinical objective with a very high transition speed between applications. © 2002 Wiley Periodicals, Inc. Concepts in Magnetic Resonance (Magn Reson Engineering) 15: 216–237, 2002 [source]


Programming scientific and distributed workflow with Triana services

CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 10 2006
David Churches
Abstract In this paper, we discuss a real-world application scenario that uses three distinct types of workflow within the Triana problem-solving environment: serial scientific workflow for the data processing of gravitational wave signals; job submission workflows that execute Triana services on a testbed; and monitoring workflows that examine and modify the behaviour of the executing application. We briefly describe the Triana distribution mechanisms and the underlying architectures that we can support. Our middleware independent abstraction layer, called the Grid Application Prototype (GAP), enables us to advertise, discover and communicate with Web and peer-to-peer (P2P) services. We show how gravitational wave search algorithms have been implemented to distribute both the search computation and data across the European GridLab testbed, using a combination of Web services, Globus interaction and P2P infrastructures. Copyright © 2005 John Wiley & Sons, Ltd. [source]


Measurement delay associated with the Guardian® RT continuous glucose monitoring system

DIABETIC MEDICINE, Issue 1 2010
C. Wei
Abstract Aims: Using compartment modelling, we assessed the time delay between blood glucose and sensor glucose measured by the Guardian® RT continuous glucose monitoring system in young subjects with Type 1 diabetes (T1D). Methods: Twelve children and adolescents with T1D treated by continuous subcutaneous insulin infusion (male/female 7/5; age 13.1 ± 4.2 years; body mass index 21.9 ± 4.3 kg/m²; mean ± sd) were studied over 19 h in a Clinical Research Facility. Guardian® RT was calibrated every 6 h and sensor glucose measured every 5 min. Reference blood glucose was measured every 15 min using a YSI 2300 STAT Plus Analyser. A population compartment model of sensor glucose–blood glucose kinetics was adopted to estimate the time delay, the calibration scale and the calibration shift. Results: The population median of the time delay was 15.8 (interquartile range 15.2, 16.5) min, which was corroborated by correlation analysis between blood glucose and 15-min delayed sensor glucose. The delay has a relatively low intersubject variability, with 95% of individuals predicted to have delays between 10.4 and 24.3 min. Population medians (interquartile range) for the scale and shift are 0.800 (0.777, 0.823) (unitless) and 1.66 (1.47, 1.84) mmol/l, respectively. Conclusions: In young subjects with T1D, the total time delay associated with the Guardian® RT system was approximately 15 min. This is twice that expected on physiological grounds, suggesting a 5- to 10-min delay because of data processing. Delays above 25 min should rarely be observed. [source]
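
The abstract does not spell out the model equations, but a first-order compartment description of sensor kinetics with a linear calibration is one plausible generic form (an assumption for illustration, not necessarily the published population model):

$$
\tau\,\frac{dc(t)}{dt} = g_{\mathrm{B}}(t) - c(t), \qquad g_{\mathrm{S}}(t) = a\,c(t) + b
$$

where g_B is blood glucose, c the delayed (interstitial) glucose, g_S the sensor reading, τ the time delay (median ≈ 15.8 min here), a the calibration scale (≈ 0.8) and b the calibration shift (≈ 1.66 mmol/l).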


Effects of species and habitat positional errors on the performance and interpretation of species distribution models

DIVERSITY AND DISTRIBUTIONS, Issue 4 2009
Patrick E. Osborne
Abstract Aim: A key assumption in species distribution modelling is that both species and environmental data layers contain no positional errors, yet this will rarely be true. This study assesses the effect of introduced positional errors on the performance and interpretation of species distribution models. Location: Baixo Alentejo region of Portugal. Methods: Data on steppe bird occurrence were collected using a random stratified sampling design on a 1-km² pixel grid. Environmental data were sourced from satellite imagery and digital maps. Error was deliberately introduced into the species data as shifts in a random direction of 0–1, 2–3, 4–5 and 0–5 pixels. Whole habitat layers were shifted by 1 pixel to cause mis-registration, and the cumulative effect of one to three shifted layers investigated. Distribution models were built for three species using three algorithms with three replicates. Test models were compared with controls without errors. Results: Positional errors in the species data led to a drop in model performance (larger errors having larger effects, typically up to a 10% drop in area under the curve on average), although not enough for models to be rejected. Model interpretation was more severely affected, with inconsistencies in the contributing variables. Errors in the habitat layers had similar although lesser effects. Main conclusions: Models with species positional errors are hard to detect, often statistically good, ecologically plausible and useful for prediction, but interpreting them is dangerous. Mis-registered habitat layers produce smaller effects, probably because shifting entire layers does not break down the correlation structure to the same extent as random shifts in individual species observations. Spatial autocorrelation in the habitat layers may protect against species positional errors to some extent, but the relationship is complex and requires further work. The key recommendation must be that positional errors should be minimised through careful field design and data processing. [source]
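
The error-injection step is easy to reproduce in outline. The snippet below is a hedged sketch (not the authors' code) of how presence records on a 1-km grid can be displaced in a random direction by a distance drawn from a chosen pixel range.

```python
import numpy as np

rng = np.random.default_rng(42)

def add_positional_error(xy, min_shift, max_shift, pixel_size=1000.0):
    """Displace each point in a random direction by a distance drawn uniformly
    from [min_shift, max_shift] pixels; pixel_size is in metres (1-km pixels
    assumed, matching the study grid)."""
    n = xy.shape[0]
    angles = rng.uniform(0.0, 2.0 * np.pi, n)
    dist = rng.uniform(min_shift, max_shift, n) * pixel_size
    offsets = np.column_stack((np.cos(angles), np.sin(angles))) * dist[:, None]
    return xy + offsets

# e.g. the 2-3 pixel error treatment, applied to synthetic coordinates in metres
coords = rng.uniform(0.0, 50000.0, size=(100, 2))
shifted = add_positional_error(coords, 2.0, 3.0)
```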


On shear-wave triplications in a multilayered transversely isotropic medium with vertical symmetry axis

GEOPHYSICAL PROSPECTING, Issue 4 2010
Yuriy Roganov
ABSTRACT The presence of triplications (caustics) can be a serious problem in seismic data processing and analysis. The traveltime curve becomes multi-valued and the geometrical spreading correction factor tends to zero due to energy focusing. We analyse the conditions for the qSV-wave triplications in a homogeneous transversely isotropic medium with vertical symmetry axis. The proposed technique can easily be extended to the case of horizontally layered vertical symmetry axis medium. We show that the triplications of the qSV-wave in a multilayered medium imply certain algebra. We illustrate this algebra on a two-layer vertical symmetry axis model. [source]


Distance separated simultaneous sweeping for fast, clean vibroseis acquisition

GEOPHYSICAL PROSPECTING, Issue 1 2010
Jack Bouska
ABSTRACT Distance separated simultaneous sweeping (DS3) is a new vibroseis technique that produces independent records, uncontaminated by simultaneous source interference, for a range of offsets and depths that span all target zones of interest. Use of DS3 on a recent seismic survey in Oman resulted in a peak acquisition rate of 1024 records per hour. This survey employed 15 vibrators, with a distance separation of 12 km between simultaneously active sources, recorded by 8000 active channels across 22 live lines in an 18.5 km × 11 km receiver patch. Broad distribution of simultaneous sources, across an adequately sized recording patch, effectively partitions the sensors so that each trace records only one of the simultaneous sources. With proper source separation, on a scale similar to twice the maximum usable source-receiver offset, wavefield overlap occurs below the zone of interest. This yields records that are indistinguishable from non-simultaneous source data, within temporal and spatial limits. This DS3 technique may be implemented using a wide variety of acquisition geometries, optimally with spatially large recording patches that enable appropriate source separation distances. DS3 improves acquisition efficiency without data quality degradation, eliminating the requirement for special data processing or noise attenuation. [source]


Comparing state-of-the-art near-surface models of a seismic test line from Saudi Arabia

GEOPHYSICAL PROSPECTING, Issue 6 2006
Ralph Bridle
ABSTRACT We present a seismic Test Line, provided by Saudi Aramco for various research teams, to highlight a few major challenges in land data processing due to near-surface anomalies. We discuss state-of-the-art methods used to compensate for shallow distortions, including single-layer, multilayer, plus/minus, refraction and tomostatics methods. They are a starting point for the new technologies presented in other papers, all dealing with the same challenging data described here. The difficulties on the Test Line are mostly due to the assumption of vertical raypaths, inherent in classical applications of near-surface correction statics. Even the most detailed velocity/depth model presents difficulties, due to the complex raypaths. There is a need for methods which are based on more complex models and theories. [source]


CRS-stack-based seismic imaging considering top-surface topography

GEOPHYSICAL PROSPECTING, Issue 6 2006
Z. Heilmann
ABSTRACT In this case study we consider the seismic processing of a challenging land data set from the Arabian Peninsula. It suffers from rough top-surface topography, a strongly varying weathering layer, and complex near-surface geology. We aim at establishing a new seismic imaging workflow, well-suited to these specific problems of land data processing. This workflow is based on the common-reflection-surface stack for topography, a generalized high-density velocity analysis and stacking process. It is applied in a non-interactive manner and provides an entire set of physically interpretable stacking parameters that include and complement the conventional stacking velocity. The implementation introduced combines two different approaches to topography handling to minimize the computational effort: after initial values of the stacking parameters are determined for a smoothly curved floating datum using conventional elevation statics, the final stack and also the related residual static correction are applied to the original prestack data, considering the true source and receiver elevations without the assumption of nearly vertical rays. Finally, we extrapolate all results to a chosen planar reference level using the stacking parameters. This redatuming procedure removes the influence of the rough measurement surface and provides standardized input for interpretation, tomographic velocity model determination, and post-stack depth migration. The methodology of the residual static correction employed and the details of its application to this data example are discussed in a separate paper in this issue. In view of the complex near-surface conditions, the imaging workflow that is conducted, i.e. stack, residual static correction, redatuming, tomographic inversion, prestack and post-stack depth migration, leads to a significant improvement in resolution, signal-to-noise ratio and reflector continuity. [source]


Process Considerations for Trolling Borehole Flow Logs

GROUND WATER MONITORING & REMEDIATION, Issue 3 2006
Phil L. Oberlander
Horizontal hydraulic conductivity with depth is often understood only as a depth-integrated property based on pumping tests or estimated from geophysical logs and the lithology. A more explicit method exists for determining hydraulic conductivity over small vertical intervals by collecting borehole flow measurements while the well is being pumped. Borehole flow rates were collected from 15 deep monitoring wells on the Nevada Test Site and the Nevada Test and Training Range while continuously raising and lowering a high-precision impeller borehole flowmeter. Repeated logging passes at different logging speeds and pumping rates typically provided nine unique flow logs for each well. Over 60 km of borehole flow logs were collected at a 6.1-cm vertical resolution. Processing these data necessitated developing a methodology to delete anomalous values, smooth small-scale flow variations, combine multiple borehole flow logs, characterize measurement uncertainty, and determine the interval-specific lower limit to flow rate quantification. There are decision points in the data processing where judgment and ancillary analyses are needed to extract subtle hydrogeologic information. The analysis methodology indicates that processed measurements from a high-precision trolling impeller flowmeter in a screened well can confidently detect changes in borehole flow rate of approximately 0.7% of the combined trolling and borehole flow rate. An advantage of trolling the flowmeter is that the impeller is nearly always spinning as it is raised and lowered in the well and borehole flow rates can be measured at lower values than if measurements were taken while the flowmeter was held at a fixed depth. [source]
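
The processing decisions described (deleting anomalous values, smoothing small-scale variations, combining repeated passes) can be illustrated with a simple robust pipeline. The sketch below uses a MAD-based outlier flag, linear interpolation and a running median; the window size and threshold are illustrative assumptions, not the authors' published choices.

```python
import numpy as np

def clean_flow_log(flow, window=15, z_thresh=3.5):
    """Flag anomalous readings with a robust MAD criterion, replace them by
    interpolation, then smooth small-scale variations with a running median."""
    flow = np.asarray(flow, dtype=float)
    med = np.median(flow)
    mad = np.median(np.abs(flow - med)) or 1e-12
    robust_z = 0.6745 * (flow - med) / mad
    good = np.abs(robust_z) <= z_thresh
    idx = np.arange(flow.size)
    filled = np.interp(idx, idx[good], flow[good])       # replace flagged outliers
    half = window // 2
    padded = np.pad(filled, half, mode="edge")
    return np.array([np.median(padded[i:i + window]) for i in range(flow.size)])

def combine_passes(logs):
    """Average repeated, depth-aligned logging passes into one composite log."""
    return np.mean(np.vstack(logs), axis=0)
```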


xBCI: A Generic Platform for Development of an Online BCI System

IEEJ TRANSACTIONS ON ELECTRICAL AND ELECTRONIC ENGINEERING, Issue 4 2010
I Putu Susila
Abstract A generic platform for realizing an online brain–computer interface (BCI) named xBCI was developed. The platform consists of several functional modules (components), such as data acquisition, storage, mathematical operations, signal processing, network communication, data visualization, experiment control, and real-time feedback presentation. Users can easily build their own BCI systems by combining the components on a graphical user interface (GUI)-based diagram editor. They can also extend the platform by adding components as plug-ins or by creating components using a scripting language. The platform works on multiple operating systems and supports parallel (multi-threaded) data processing and data transfer to other PCs over a network via the transmission control protocol/internet protocol (TCP/IP) or user datagram protocol (UDP). A BCI system based on motor imagery and a steady-state visual evoked potential (SSVEP)-based BCI system were constructed and tested on the platform. The results show that the platform is able to process multichannel brain signals in real time. The platform provides users with an easy-to-use system development tool and reduces the time needed to develop a BCI system. Copyright © 2010 Institute of Electrical Engineers of Japan. Published by John Wiley & Sons, Inc. [source]


PRIMUS: a Windows PC-based system for small-angle scattering data analysis

JOURNAL OF APPLIED CRYSTALLOGRAPHY, Issue 5 2003
Petr V. Konarev
A program suite for one-dimensional small-angle scattering data processing running on IBM-compatible PCs under Windows 9x/NT/2000/XP is presented. The main program, PRIMUS, has a menu-driven graphical user interface calling computational modules to perform data manipulation and analysis. Experimental data in binary OTOKO format can be reduced by calling the program SAPOKO, which includes statistical analysis of time frames, averaging and scaling. Tools to generate the angular axis and detector response files from diffraction patterns of calibration samples, as well as binary to ASCII transformation programs, are available. Several types of ASCII files can be directly imported into PRIMUS, in particular, sasCIF or ILL-type files are read without modification. PRIMUS provides basic data manipulation functions (averaging, background subtraction, merging of data measured in different angular ranges, extrapolation to zero sample concentration, etc.) and computes invariants from Guinier and Porod plots. Several external modules coupled with PRIMUS via pop-up menus enable the user to evaluate the characteristic functions by indirect Fourier transformation, to perform peak analysis for partially ordered systems and to find shape approximations in terms of three-parametric geometrical bodies. For the analysis of mixtures, PRIMUS enables model-independent singular value decomposition or linear fitting if the scattering from the components is known. An interface is also provided to the general non-linear fitting program MIXTURE, which is designed for quantitative analysis of multicomponent systems represented by simple geometrical bodies, taking shape and size polydispersity as well as interparticle interference effects into account. [source]
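
Among the invariants mentioned, the Guinier analysis is the most compact to illustrate. The snippet below is a minimal sketch (not PRIMUS source code) of fitting ln I(q) against q² in the low-q region to obtain I(0) and the radius of gyration Rg.

```python
import numpy as np

def guinier_fit(q, intensity, n_points=30):
    """Fit the Guinier approximation ln I(q) ~ ln I(0) - (q*Rg)**2 / 3 over the
    first n_points of the low-q region and return (I0, Rg)."""
    q2 = np.asarray(q[:n_points], dtype=float) ** 2
    log_i = np.log(np.asarray(intensity[:n_points], dtype=float))
    slope, intercept = np.polyfit(q2, log_i, 1)
    i0 = np.exp(intercept)
    rg = np.sqrt(-3.0 * slope)        # meaningful only if the slope is negative
    return i0, rg
```

The fit is only valid while q·Rg stays below roughly 1.3, so in practice the fitting range is checked, and trimmed if necessary, after an initial estimate.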


Determination of rank by median absolute deviation (DRMAD): a simple method for determining the number of principal factors responsible for a data matrix

JOURNAL OF CHEMOMETRICS, Issue 1 2009
Edmund R. Malinowski
Abstract Median absolute deviation (MAD) is a well-established statistical method for determining outliers. This simple statistic can be used to determine the number of principal factors responsible for a data matrix by direct application to the residual standard deviation (RSD) obtained from principal component analysis (PCA). Unlike many other popular methods, the proposed method, called determination of rank by MAD (DRMAD), does not involve the use of pseudo degrees of freedom, pseudo F-tests, extensive calibration tables, time-consuming iterations, or empirical procedures. The method does not require strict adherence to normal distributions of experimental uncertainties. The computations are direct, simple to use and extremely fast, ideally suited for online data processing. The results obtained using various sets of chemical data previously reported in the chemical literature agree with the early work. Limitations of the method, determined from model data, are discussed. An algorithm, written in MATLAB format, is presented in the Appendix. Copyright © 2008 John Wiley & Sons, Ltd. [source]
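
The abstract gives the idea but not the algorithm (that is in the paper's appendix, in MATLAB). As a loose illustration of how a MAD criterion could be applied to the RSD from PCA, here is one hedged Python sketch; the RSD definition follows Malinowski's usual form and the 3.5 cutoff is a common robust-outlier convention, both assumptions rather than the published recipe.

```python
import numpy as np

def residual_std(data):
    """RSD(n) for each candidate number of factors n, from the singular values
    of an r x c data matrix (Malinowski-style definition assumed, r >= c)."""
    r, c = data.shape
    ev = np.linalg.svd(data, compute_uv=False) ** 2      # eigenvalues of D'D
    return np.array([np.sqrt(ev[n:].sum() / (r * (c - n))) for n in range(c - 1)])

def rank_by_mad(data, cutoff=3.5):
    """Count how many leading RSD values are outliers of the noise-level pool
    according to a robust MAD score; that count is the estimated rank."""
    rsd = residual_std(data)
    med = np.median(rsd)
    mad = np.median(np.abs(rsd - med)) or 1e-12
    robust_z = 0.6745 * np.abs(rsd - med) / mad
    n = 0
    while n < robust_z.size and robust_z[n] > cutoff:
        n += 1
    return n
```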


Land cover change and land degradation in parts of the southwest coast of Nigeria

AFRICAN JOURNAL OF ECOLOGY, Issue 2009
Mayowa Fasona
Abstract Frequent alteration in land cover often leads to decreased stability of ecosystems, which can also increase the vulnerability of rural communities to externalities of environmental change. This study, carried out in parts of the coast of southwestern Nigeria, utilized topographic base maps and Landsat TM imagery from two dates to assess the trend in land cover change and ecosystem degradation for the three time points 1965, 1986 and 2001. Remote sensing, geographic information systems and landscape pattern analysis were employed for data processing and analysis. The focus of the analysis was on land cover change, land degradation, and changes in landscape pattern resulting from the interplay of natural and anthropogenic drivers. The results show an increasing trend in human-induced land cover change, with concomitant severe negative impacts on ecosystems and livelihoods. About 98,000 ha (30% of the area) was seriously degraded as of 2001. About 33,000 ha (10%) was under permanent saline water inundation, with about 21 communities already dislocated. Loss of fragile ecosystems, including marshland (from 7.7% in 1965 to 1% in 2001) and mangrove (from 14.6% in 1965 to 3.1% in 2001), was intense, while over 300 ponds and small lakes important for the local fishing economy have disappeared. About eighteen communities were also dislocated by erosion in a section around the southeastern parts of the coastline. The landscape metrics generated suggested increased ecosystem perturbation and landscape fragmentation. The paper also discusses the implications of these rapid changes for ecosystem stability, food security and sustainable rural livelihoods in the area. [source]


Comparison of TCA and ICA techniques in fMRI data processing

JOURNAL OF MAGNETIC RESONANCE IMAGING, Issue 4 2004
Xia Zhao MS
Abstract Purpose: To make a quantitative comparison of temporal cluster analysis (TCA) and independent component analysis (ICA) techniques in detecting brain activation by using simulated data and in vivo event-related functional MRI (fMRI) experiments. Materials and Methods: A single-slice MRI image was replicated 150 times to simulate an fMRI time series. An event-related brain activation pattern with five different levels of intensity and Gaussian noise was superimposed on these images. The maximum contrast-to-noise ratio (CNR) of the signal change ranged from 1.0 to 2.0 in increments of 0.25. In vivo visual stimulation fMRI experiments were performed on a 1.9 T magnet. Six human volunteers participated in this study. All imaging data were analyzed using both TCA and ICA methods. Results: Both simulated and in vivo data showed that no statistically significant difference exists in the activation areas detected by the ICA and TCA techniques when the CNR of the fMRI signal is larger than 1.75. Conclusion: TCA and ICA techniques are comparable in generating functional brain maps in event-related fMRI experiments. Although ICA has richer features for exploring the spatial and temporal information of the functional images, the TCA method has advantages in its computational efficiency, repeatability, and readiness to average data from group subjects. J. Magn. Reson. Imaging 2004;19:397–402. © 2004 Wiley-Liss, Inc. [source]


Investigating the stimulus-dependent temporal dynamics of the BOLD signal using spectral methods

JOURNAL OF MAGNETIC RESONANCE IMAGING, Issue 3 2003
Karsten Müller PhD
Abstract Purpose: To compare several spectral parameters using different durations of visual hemifield stimulation in order to explore the different temporal behavior of the blood oxygenation-level dependent (BOLD) signal in various brain regions. Materials and Methods: Spectral methods were applied to three different groups of subjects with visual stimulation lasting 6, 12, and 30 seconds. Furthermore, diffusion weighting was applied in an interleaved way. The core of the data processing was the computation of the spectral density matrix using the multidimensional weighted covariance estimate. Spectral parameters of coherence and phase shift were computed. Results: The correlation between signal changes and phase shifts was dependent on the duration of the visual stimulation. The shorter the duration of visual stimulation, the stronger the correlation between percentage signal change and phase shift. Conclusion: The experiments with short and long stimuli differed mainly in the distribution of the activated voxels in the plane of percentage signal change and phase shift. It was revealed that the height of the signal change depends on the phase shift, whereas the diffusion weighting has no influence. J. Magn. Reson. Imaging 2003;17:375–382. © 2003 Wiley-Liss, Inc. [source]
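
Spectral density, coherence and phase estimation can be illustrated generically. The sketch below uses SciPy's Welch-type estimators rather than the multidimensional weighted covariance estimate used in the study, and all numbers (sampling rate, block length, noise level) are assumptions chosen for the example.

```python
import numpy as np
from scipy import signal

fs = 0.5                                    # sampling rate in Hz (TR = 2 s, assumed)
t = np.arange(0.0, 600.0, 1.0 / fs)
stimulus = (np.sin(2 * np.pi * t / 60.0) > 0).astype(float)       # 60-s block design
rng = np.random.default_rng(0)
bold = np.roll(stimulus, 3) + 0.5 * rng.standard_normal(t.size)   # delayed, noisy response

f, pxy = signal.csd(stimulus, bold, fs=fs, nperseg=128)           # cross-spectral density
f, cxy = signal.coherence(stimulus, bold, fs=fs, nperseg=128)

k = np.argmax(cxy[1:]) + 1                  # most coherent non-DC frequency bin
phase = np.angle(pxy[k])                    # phase shift (rad); delay = phase / (2*pi*f[k])
print(f"coherence = {cxy[k]:.2f}, phase = {phase:.2f} rad at f = {f[k]:.4f} Hz")
```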


Utilization of high-accuracy FTICR-MS data in protein quantitation experiments

JOURNAL OF MASS SPECTROMETRY (INCORP BIOLOGICAL MASS SPECTROMETRY), Issue 11 2009
Martin Strohalm
Abstract Human acute T-lymphoblastic leukemia cell line (CEM) treated with cisplatin, and the stable isotope labeling by amino acids in cell culture (SILAC) strategy were used to present an improved method of data processing in high-accuracy mass spectrometry (MS). By using peptide mass fingerprinting with low mass tolerance, we were able to utilize far more data retained in MS scans which would normally be missed by a standard processing method. This new way of data interpretation results in an improvement of the relevance of quantitation experiments and enabled us to search and quantify different types of posttranslational modifications. Furthermore, we used this technique to distinguish among different protein isoforms, commonly returned by Mascot search engine. Copyright © 2009 John Wiley & Sons, Ltd. [source]


Matrix effects on accurate mass measurements of low-molecular weight compounds using liquid chromatography-electrospray-quadrupole time-of-flight mass spectrometry

JOURNAL OF MASS SPECTROMETRY (INCORP BIOLOGICAL MASS SPECTROMETRY), Issue 3 2006
F. Calbiani
Abstract Liquid chromatography (LC) with high-resolution mass spectrometry (HRMS) represents a powerful technique for the identification and/or confirmation of small molecules, i.e. drugs, metabolites or contaminants, in different matrices. However, reliability of analyte identification by HRMS is being challenged by the uncertainty that affects the exact mass measurement. This parameter, characterized by accuracy and precision, is influenced by sample matrix and interferent compounds so that questions about how to develop and validate reliable LC-HRMS-based methods are being raised. Experimental approaches for studying the effects of various key factors influencing mass accuracy on low-molecular weight compounds (MW < 150 Da) when using a quadrupole-time-of-flight (QTOF) mass analyzer were described. Biogenic amines in human plasma were considered for the purpose and the effects of peak shape, ion abundance, resolution and data processing on accurate mass measurements of the analytes were evaluated. In addition, the influence of the matrix on the uncertainty associated with their identification and quantitation is discussed. A critical evaluation on the calculation of the limits of detection was carried out, considering the uncertainty associated with exact mass measurement of HRMS-based methods. The minimum concentration level of the analytes that was able to provide a statistical error lower than 5 ppm in terms of precision was 10 times higher than those calculated with S/N = 3, thus suggesting the importance of considering both components of exact mass measurement uncertainty in the evaluation of the limit of detection. Copyright © 2006 John Wiley & Sons, Ltd. [source]
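
The scale of the problem is easiest to see as a worked calculation. Mass accuracy is conventionally expressed in parts per million; the short snippet below (values chosen purely for illustration, using an assumed theoretical m/z for a protonated biogenic amine) shows how small the absolute tolerances become below m/z 150.

```python
def mass_error_ppm(measured_mz, theoretical_mz):
    """Mass accuracy in parts per million; at m/z ~150 a 5 ppm tolerance
    corresponds to only ~0.00075 Da."""
    return (measured_mz - theoretical_mz) / theoretical_mz * 1e6

# Illustrative values: protonated dopamine [M+H]+, theoretical m/z ~154.0863
print(mass_error_ppm(154.0868, 154.0863))   # ~3.2 ppm
```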


The evolution of comprehensive two-dimensional gas chromatography (GC×GC)

JOURNAL OF SEPARATION SCIENCE, JSS, Issue 5-6 2004
Tadeusz Górecki
Abstract For a technology little over a decade old, comprehensive two-dimensional gas chromatography (GC×GC) has quickly reached the status of one of the most powerful analytical tools for volatile organic compounds. At the heart of any GC×GC system is an interface, which physically connects the primary and the secondary columns and acts to preserve the separation obtained in the first dimension (first column) while allowing additional separation in the second dimension. The paper presents a review of the technology, including fundamental principles of the technique, data processing and interpretation and a timeline of inventive contributions to interface design. In addition, applications of the technique are presented, with a more detailed discussion of selected examples. [source]


Three-dimensional chemical mapping by scanning transmission X-ray spectromicroscopy

JOURNAL OF SYNCHROTRON RADIATION, Issue 5 2007
Göran A. Johansson
Three-dimensional (3-d) chemical mapping using angle-scan tomography in a scanning transmission X-ray microscope is demonstrated. Apparatus, experimental procedures and data processing are presented and the 3-d spatial resolution is evaluated. The technique is illustrated using mapping of a low-density acrylate polyelectrolyte in and outside of polystyrene microspheres dispersed in water in a 4 µm diameter microcapillary. The 3-d chemical visualization provides information about the microstructure that had not previously been observed. [source]


DATA ANALYSIS OF PENETROMETRIC FORCE/DISPLACEMENT CURVES FOR THE CHARACTERIZATION OF WHOLE APPLE FRUITS

JOURNAL OF TEXTURE STUDIES, Issue 4 2005
C. CAMPS
ABSTRACT The objective of the present study was to compare two chemometric approaches for characterizing the rheological properties of fruits from puncture test force/displacement curves. The first approach (parameter approach) computed six texture parameters from the curves, which were assumed to be representative of skin hardness, fruit deformation before skin rupture, flesh firmness and the mechanical work needed to penetrate the fruit. The second approach (whole curve approach) used the whole digitized curve (300 data points) in further data processing. Two experimental studies were compared: first, the variability of the rheological parameters of five apple cultivars; second, the rheological variability as a function of storage conditions. For both approaches, factorial discriminant analysis was applied to discriminate the fruits based on the measured rheological properties. The qualitative groups in factorial discriminant analysis were either the apple cultivar or the storage conditions (days and temperatures of storage). The tests were carried out using cross-validation procedures, making it possible to compute the number of fruits correctly identified. Thus, the percentages of correct identification were 92% and 87% for the parameter and whole curve approaches, respectively. The discrimination of storage duration was less accurate for both approaches, giving about 50% correct identifications. Comparison of the percentages of correct classification based on the whole curve and the parameter approaches showed that the six computed parameters gave a good summary of the information present in the curve. The whole curve approach showed that some additional information, not present in the six parameters, may be appropriate for a complete description of the fruit rheology. [source]
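
Factorial discriminant analysis with cross-validation is closely related to the linear discriminant analysis available in common libraries. The sketch below uses synthetic stand-in data (not the study's measurements) simply to show the shape of such a comparison between the six-parameter and whole-curve representations.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)

# Synthetic stand-ins: 100 fruits described by 6 texture parameters, or by the
# 300 digitised points of the force/displacement curve.
X_params = rng.normal(size=(100, 6))
X_curves = rng.normal(size=(100, 300))
cultivar = rng.integers(0, 5, size=100)              # five apple cultivars

for name, X in [("parameter approach", X_params), ("whole-curve approach", X_curves)]:
    lda = LinearDiscriminantAnalysis()
    acc = cross_val_score(lda, X, cultivar, cv=5)    # cross-validated identification
    print(f"{name}: {100.0 * acc.mean():.1f}% correctly classified")
```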


Automating Standard Alcohol Use Assessment Instruments Via Interactive Voice Response Technology

ALCOHOLISM, Issue 2 2002
James C. Mundt
Background: Interactive voice response (IVR) technology integrates touch-tone telephones with computer-automated data processing. IVR offers a convenient, efficient method for remote collection of self-report data. Methods: Twenty-six subjects recruited from an outpatient alcohol treatment center completed IVR and paper/pencil versions of a demographic and drinking history questionnaire, Stages of Change Readiness and Treatment Eagerness Scale, Drinker Inventory of Consequences, Obsessive-Compulsive Drinking Scale, Alcohol Dependence Scale, and two numerical rating scales of craving and desire to drink during the prior week. Administration of the instruments in both formats was repeated 1 week later. The order of administration method was counterbalanced between subjects and reversed across data collection sessions. Scale and subscale scores from both methods were correlated within sessions. Test-retest correlations were also calculated for each method. A criterion of α = 0.01 was used to control type I statistical error. Results: Intermethod correlations within each session were significant for all of the instruments administered. Test-retest correlations for both methods were also significant, except for the numerical ratings. Scores on the Alcohol Dependence Scale obtained via IVR were significantly lower than those collected by paper/pencil. Other differences between the data collection methods or across the sessions were inconsistent. The average IVR call length was 34 min and 23 sec. Paper/pencil forms required an average of 18 min and 38 sec to complete and an additional 10 min and 17 sec for data entry. Conclusions: IVR technology provides a convenient alternative to collecting self-report measures of treatment outcomes. Both paper/pencil and IVR assessments provide highly convergent data and demonstrate good test-retest reliability. Alcohol Dependence Scale score differences between methods highlight special considerations for IVR adaptation of existing paper/pencil instruments. Benefits of IVR include procedural standardization, automatic data scoring, direct electronic storage, and remote accessibility from multiple locations. [source]


Fast multidimensional localized parallel NMR spectroscopy for the analysis of samples

MAGNETIC RESONANCE IN CHEMISTRY, Issue 10 2010
Marino Vega-Vazquez
Abstract A parallel localized spectroscopy (PALSY) method is presented to speed up the acquisition of multidimensional NMR (nD) spectra. The sample is virtually divided into a discrete number of nonoverlapping slices that relax independently during consecutive scans of the experiment, affording a substantial reduction in the interscan relaxation delay and the total experiment time. PALSY was tested by acquiring three experiments (2D COSY, 2D DQF-COSY and 2D TQF-COSY) in parallel, affording a time-saving factor of 3–4. Unique advantages are that the achievable resolution in any dimension is not compromised in any way, conventional NMR data processing is used, the method is not prone to generating spectral artifacts, and, once calibrated, it can be used routinely with these and other combinations of NMR spectra. Copyright © 2010 John Wiley & Sons, Ltd. [source]


Broadband proton-decoupled proton spectra

MAGNETIC RESONANCE IN CHEMISTRY, Issue 4 2007
Andrew J. Pell
Abstract We present a new method for recording broadband proton-decoupled proton spectra with absorption mode lineshapes and substantially correct integrals; in both these respects, the new method has significant advantages over conventional J-spectroscopy. In our approach, the decoupled spectrum is simply obtained from the 45° projection of the diagonal-peak multiplets of an anti z-COSY spectrum. This method is straightforward to apply, and does not require any unusual data processing. However, there is a significant reduction in sensitivity when compared to a conventional proton spectrum. The method is demonstrated for typical medium-sized molecules, and it is also shown how such a decoupled spectrum can be used to advantage in measurements of diffusion constants (DOSY), the measurement of relaxation parameters, and the analysis of complex mixtures. Copyright © 2007 John Wiley & Sons, Ltd. [source]


Real-life applications of the MULVADO software package for processing DOSY NMR data

MAGNETIC RESONANCE IN CHEMISTRY, Issue 2 2006
R. Huo
Abstract MULVADO is a newly developed software package for DOSY NMR data processing, based on multivariate curve resolution (MCR), one of the principal multivariate methods for processing DOSY data. This paper evaluates the software package using real-life data on materials used in the printing industry: two data sets from the same ink sample but of different quality. A sample of an organic photoconductor and a toner sample are also analysed. Compared with the routine DOSY output from monoexponential fitting, one of the single-channel algorithms in the commercial Bruker software, MULVADO provides several advantages. The key advantage of MCR is that it overcomes the fluctuation problem (inconsistent diffusion coefficients obtained for the same component). The combination of non-linear regression (NLR) and MCR can yield more accurate resolution of a complex mixture. In addition, the data pre-processing techniques in MULVADO minimise the negative effects of experimental artefacts on the results. In this paper, the challenges of analysing polymer samples and other more complex samples are also discussed. Copyright © 2005 John Wiley & Sons, Ltd. [source]
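
For context, the routine monoexponential fitting that MULVADO is compared against can be sketched in a few lines: each spectral channel's decay is fitted to I(b) = I0·exp(−Db). The units and numbers below are assumptions for the example; MCR itself operates on the whole data matrix and is considerably more involved.

```python
import numpy as np
from scipy.optimize import curve_fit

def mono_exp(b, i0, d):
    """Monoexponential DOSY decay I(b) = I0 * exp(-D*b), fitted channel by
    channel in routine processing; b is the Stejskal-Tanner factor."""
    return i0 * np.exp(-d * b)

# Synthetic decay for one spectral channel (b in 1e9 s/m^2, D in 1e-9 m^2/s)
rng = np.random.default_rng(3)
b = np.linspace(0.05, 4.0, 16)
intensity = mono_exp(b, 1.0, 0.52) * (1 + 0.02 * rng.standard_normal(b.size))

(i0_fit, d_fit), _ = curve_fit(mono_exp, b, intensity, p0=[1.0, 0.3])
print(f"fitted D = {d_fit:.2f} x 1e-9 m^2/s")        # ~0.52, i.e. 5.2e-10 m^2/s
```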


Fast proton spectroscopic imaging using steady-state free precession methods

MAGNETIC RESONANCE IN MEDICINE, Issue 3 2003
Wolfgang Dreher
Abstract Various pulse sequences for fast proton spectroscopic imaging (SI) using the steady-state free precession (SSFP) condition are proposed. The sequences use either only the FID-like signal S1, only the echo-like signal S2, or both signals in separate but adjacent acquisition windows. As in SSFP imaging, S1 and S2 are separated by spoiler gradients. RF excitation is performed by slice-selective or chemical shift-selective pulses. The signals are detected in the absence of a B0 gradient. Spatial localization is achieved by phase-encoding gradients which are applied prior to and rewound after each signal acquisition. Measurements with 2D or 3D spatial resolution were performed at 4.7 T on phantoms and healthy rat brain in vivo, allowing the detection of uncoupled and J-coupled spins. The main advantages of SSFP-based SI are the short minimum total measurement time (Tmin) and the high signal-to-noise ratio per unit measurement time (SNRt). The methods are of particular interest at higher magnetic field strength B0, as TR can be reduced with increasing B0, leading to a reduced Tmin and an increased SNRt. Drawbacks consist of the limited spectral resolution, particularly at lower B0, and the dependence of the signal intensities on T1 and T2. Further improvements are discussed, including optimized data processing and signal detection under oscillating B0 gradients leading to a further reduction in Tmin. Magn Reson Med 50:453–460, 2003. © 2003 Wiley-Liss, Inc. [source]


Efficient organization of information processing

MANAGERIAL AND DECISION ECONOMICS, Issue 1 2007
Jacek Cukrowski
The paper examines the application of the concept of economic efficiency to organizational issues of collective information processing in decision making. Information processing is modeled in the framework of the dynamic parallel processing model of associative computation with an endogenous setup cost of the processors. The model is extended to include the specific features of collective information processing in the team of decision makers which may lead to an error in data analysis. In such a model, the conditions for efficient organization of information processing are defined and the architecture of the efficient structures is considered. We show that specific features of collective decision making procedures require a broader framework for judging organizational efficiency than has traditionally been adopted. In particular, and contrary to the results available in economic literature, we show that there is no unique architecture for efficient information processing structures, but a number of various efficient forms. The results indicate that technological progress resulting in faster data processing (ceteris paribus) will lead to more regular information processing structures. However, if the relative cost of the delay in data analysis increases significantly, less regular structures could be efficient. Copyright © 2007 John Wiley & Sons, Ltd. [source]


Advances in proteomics data analysis and display using an accurate mass and time tag approach

MASS SPECTROMETRY REVIEWS, Issue 3 2006
Jennifer S.D. Zimmer
Abstract Proteomics has recently demonstrated utility for increasing the understanding of cellular processes on the molecular level as a component of systems biology approaches and for identifying potential biomarkers of various disease states. The large amount of data generated by utilizing high efficiency (e.g., chromatographic) separations coupled with high mass accuracy mass spectrometry for high-throughput proteomics analyses presents challenges related to data processing, analysis, and display. This review focuses on recent advances in nanoLC-FTICR-MS-based proteomics approaches and the accompanying data processing tools that have been developed to display and interpret the large volumes of data being produced. © 2006 Wiley Periodicals, Inc., Mass Spec Rev 25:450,482, 2006 [source]