Statistical Data Analysis
Selected Abstracts

Is there a role for dynamic retinal vessel analysis in internal medicine?
ACTA OPHTHALMOLOGICA, Issue 2008
I. M. Lanzl

Purpose: Human retinal vessels and their reactions to stimuli change over a lifetime and in disease, under physiological, genetic and pathological influences. Using the Dynamic Vessel Analyzer (DVA, IMEDOS, Jena) it is possible to assess changes in retinal vessel diameters in response to vasoactive stimuli in real time and non-invasively.
Methods: Retinal arterial vessel reaction in the natural time course, and averaged over 3 consecutive monochromatic flicker stimulations (530-600 nm, 12.5 Hz, 20 s) with an 80 s observation pause between stimulations, was investigated in healthy volunteers of different age groups, obese patients, type 1 diabetes patients, systemic hypertensive patients and patients with lysosomal storage disease. Statistical data analysis of the vessel reactions was performed independently of the DVA software.
Results: There is a statistically significant difference in retinal vascular behaviour between age groups in a healthy population, and between the healthy population and each of the diseases investigated. Lysosomal storage disease, however, demonstrated an increased dilation following flicker stimulation compared with normal subjects.
Conclusion: Flicker light stimulation of the retina evokes a prompt vessel reaction in all healthy subjects. We could demonstrate an age dependence of the retinal arterial reaction in medically healthy persons, and altered reactions in hypertensive, diabetic and obese patients. The increased reaction in lysosomal storage disease may further the understanding of the factors underlying the vascular reaction to stimuli. Flicker stimulation of retinal vessels represents a method for assessing vascular endothelial function, which is important to understand in systemic disease.
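The group comparisons reported above could be sketched, in a minimal form, as a two-sample test on flicker-induced dilation. The numbers below are purely illustrative, not the study's data, and Welch's t-test is one reasonable choice among several:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical flicker-induced arterial dilations (% change from baseline),
# averaged over 3 stimulations per subject -- illustrative values only.
young = rng.normal(loc=4.0, scale=1.0, size=25)   # younger healthy group
older = rng.normal(loc=2.5, scale=1.0, size=25)   # older healthy group

# Welch's two-sample t-test for a difference in mean dilation
t_stat, p_value = stats.ttest_ind(young, older, equal_var=False)
print(f"mean young = {young.mean():.2f}%, mean older = {older.mean():.2f}%")
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```

With real data, a one-way ANOVA or a regression on age would extend this to more than two age groups.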
Modeling and analysis of disease and risk factors through learning Bayesian networks from observational data
QUALITY AND RELIABILITY ENGINEERING INTERNATIONAL, Issue 3 2008
Jing Li

Abstract: This paper focuses on identifying the relationships between a disease and its potential risk factors using Bayesian networks in an epidemiologic study, with an emphasis on integrating medical domain knowledge with statistical data analysis. An integrated approach is developed to identify the risk factors associated with patients' occupational histories and is demonstrated on real-world data. The approach includes several steps. First, raw data are preprocessed into a format acceptable to the learning algorithms for Bayesian networks; important considerations addressing the uniqueness of the data and the challenges of the learning are discussed. Second, a Bayesian network is learned from the preprocessed data set by integrating medical domain knowledge with generic learning algorithms. Third, the relationships revealed by the Bayesian network are used for risk factor analysis, including identification of groups of people who share certain common characteristics and have a relatively high probability of developing the disease, and prediction of a person's risk of developing the disease given information on his/her occupational history. Copyright © 2007 John Wiley & Sons, Ltd.

Comparison between Principal Component Analysis and Independent Component Analysis in Electroencephalograms Modelling
BIOMETRICAL JOURNAL, Issue 2 2007
C. Bugli

Abstract: Principal Component Analysis (PCA) is a classical technique in statistical data analysis, feature extraction and data reduction, aiming at explaining observed signals as a linear combination of orthogonal principal components. Independent Component Analysis (ICA) is a technique of array processing and data analysis, aiming at recovering unobserved signals or 'sources' from observed mixtures, exploiting only the assumption of mutual independence between the signals. The separation of sources by ICA has great potential in applications such as the separation of sound signals (e.g. voices mixed in simultaneous multiple recordings), in telecommunications, or in the treatment of medical signals. However, ICA is not yet often used by statisticians. In this paper, we present ICA in a statistical framework and compare the method with PCA for electroencephalogram (EEG) analysis. We show that ICA provides a more useful data representation than PCA, for instance for representing a particular characteristic of the EEG known as the event-related potential (ERP). (© 2007 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim)

Generalized Additive Modeling with Implicit Variable Selection by Likelihood-Based Boosting
BIOMETRICS, Issue 4 2006
Gerhard Tutz

Summary: The use of generalized additive models in statistical data analysis suffers from the restriction to few explanatory variables and from the problem of selecting smoothing parameters. Generalized additive model boosting circumvents these problems by stagewise fitting of weak learners. A fitting procedure is derived that works for all simple exponential family distributions, including binomial, Poisson, and normal response variables. The procedure combines the selection of variables with the determination of the appropriate amount of smoothing. Penalized regression splines and the newly introduced penalized stumps are considered as weak learners. Estimates of standard deviations and stopping criteria, which are notorious problems in iterative procedures, are based on an approximate hat matrix. The method is shown to be a strong competitor to common procedures for the fitting of generalized additive models.
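The stagewise idea behind such boosting can be illustrated with a deliberately simplified sketch: componentwise L2 boosting with plain stumps for a normal response (the paper's likelihood-based version handles general exponential families and penalized learners). Data and settings below are hypothetical:

```python
import numpy as np

def fit_stump(x, r):
    """Best single-split stump for residuals r: (split, left, right, sse)."""
    best = None
    for s in np.unique(x)[:-1]:
        mask = x <= s
        left, right = r[mask].mean(), r[~mask].mean()
        sse = ((r - np.where(mask, left, right)) ** 2).sum()
        if best is None or sse < best[3]:
            best = (s, left, right, sse)
    return best

def boost(X, y, steps=100, nu=0.1):
    """Componentwise boosting: each step fits a stump to the current
    residuals for every predictor and keeps only the best one, so
    variable selection happens implicitly."""
    f = np.full(len(y), y.mean())
    ensemble = []
    for _ in range(steps):
        r = y - f
        candidates = [(j, *fit_stump(X[:, j], r)) for j in range(X.shape[1])]
        j, s, left, right, _ = min(candidates, key=lambda c: c[-1])
        f += nu * np.where(X[:, j] <= s, left, right)
        ensemble.append((j, s, nu * left, nu * right))
    return f, ensemble

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(200, 5))   # 5 predictors, only the first matters
y = np.sin(2 * X[:, 0]) + 0.1 * rng.normal(size=200)
f, ensemble = boost(X, y)
print("variables selected by boosting:", {j for j, *_ in ensemble})
```

Early stopping (here simply a fixed number of steps) plays the role of the smoothing-parameter choice; the paper bases it on an approximate hat matrix instead.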
In particular, it performs very well in high-dimensional settings with many nuisance predictor variables.

Stimulation, Monitoring, and Analysis of Pathway Dynamics by Metabolic Profiling in the Aromatic Amino Acid Pathway
BIOTECHNOLOGY PROGRESS, Issue 6 2004
M. Oldiges

Using a concerted approach of biochemical standard preparation, analytical access via LC-MS/MS, a glucose pulse, metabolic profiling, and statistical data analysis, the metabolism dynamics in the aromatic amino acid pathway were stimulated, monitored, and analyzed in different tyrosine-auxotrophic L-phenylalanine-producing Escherichia coli strains. During the observation window from 4 s before up to 27 s after the glucose pulse, the dynamics of the first five enzymatic reactions in the aromatic amino acid pathway were observed by measuring intracellular concentrations of 3-deoxy-D-arabino-heptulosonate 7-phosphate (DAHP), 3-dehydroquinate (3-DHQ), 3-dehydroshikimate (3-DHS), shikimate 3-phosphate (S3P), and shikimate (SHI), together with the pathway precursor phosphoenolpyruvate (PEP) and P5P, the lumped pentose phosphate pool, as an alternative to the nondetectable erythrose 4-phosphate (E4P). Provided that a sufficient fortification of the carbon flux into the pathway of interest is ensured, the corresponding metabolism dynamics can be observed. On the basis of the intracellular pool measurements, standardized pool velocities were calculated, and a simple, data-driven criterion, called "pool efflux capacity" (PEC), was derived. Despite its simplified system description, the criterion identified the well-known AroB limitation in E. coli strain A (genotype Δ(pheA tyrA aroF)/pJF119EH aroF^fbr pheA^fbr amp) and also identified AroL and AroA (in strain B, genotype Δ(pheA tyrA aroF)/pJF119EH aroF^fbr pheA^fbr aroB amp) as promising metabolic engineering targets to alleviate the respective flux control in subsequent L-Phe-producing strains.
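Standardized pool velocities of the kind used for the PEC criterion could be computed along these lines. The time courses below are invented for illustration, and the finite-difference definition is a plausible stand-in, not the paper's exact formula:

```python
import numpy as np

# Hypothetical intracellular pool time courses around a glucose pulse at t = 0
# (illustrative numbers only, not the measured E. coli data).
t    = np.array([-4.0, 0.0, 3.0, 6.0, 9.0, 12.0, 15.0])   # seconds
dahp = np.array([1.0, 1.0, 1.8, 2.9, 3.6, 3.9, 4.0])      # DAHP pool, a.u.
dhq  = np.array([1.0, 1.0, 1.1, 1.5, 2.2, 2.8, 3.1])      # 3-DHQ pool, a.u.

def standardized_velocity(t, pool):
    """Finite-difference pool velocity, standardized by the pre-pulse level."""
    baseline = pool[t < 0].mean()
    return np.gradient(pool, t) / baseline

for name, pool in [("DAHP", dahp), ("3-DHQ", dhq)]:
    v = standardized_velocity(t, pool)
    print(f"{name}: peak standardized velocity = {v.max():.2f} 1/s "
          f"at t = {t[np.argmax(v)]:.0f} s")
```

In this toy example the upstream pool (DAHP) accelerates before the downstream pool (3-DHQ), which is the kind of ordering the correlation analysis in the abstract exploits.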
Furthermore, a simple correlation analysis enabled reconstruction of the metabolite sequence of the observed pathway. The results underline the necessity of extending the focus of glucose pulse experiments beyond the central metabolism to anabolic pathways.

High Throughput Screening for the Design and Optimization of Chromatographic Processes: Assessment of Model Parameter Determination from High Throughput Compatible Data
CHEMICAL ENGINEERING & TECHNOLOGY (CET), Issue 12 2008
A. Susanto

Abstract: Chromatographic processes can be optimized in various ways; the two most prominent approaches are based either on statistical data analysis or on experimentally validated simulation models. Both strategies rely heavily on experimental data, the generation of which usually imposes a significant bottleneck on rational process design. The latter approach is followed in this work, and the usability of high-throughput-compatible experiments for determining the model parameters required for in silico process optimization is assessed. The unknown parameter values are estimated from batch uptake experiments on a robotic platform and from dynamic breakthrough experiments with miniaturized chromatographic columns. The identified model is then validated with respect to process optimization by comparing model predictions with experimental data from a preparative-scale column. In this study, the strong cation exchanger Toyopearl SP-650M and lysozyme dissolved in phosphate buffer (pH 7) are used as the test system. The use of data from miniaturized and high-throughput-compatible experiments is shown to yield sufficiently accurate results and minimizes effort and cost for both parameter estimation and model validation.
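Estimating model parameters from breakthrough experiments, as described in the last abstract, is at heart a curve-fitting problem. A minimal sketch with a generic sigmoidal stand-in model and synthetic data (the paper's actual column model is far more detailed) might look like this:

```python
import numpy as np
from scipy.optimize import curve_fit

def breakthrough(t, t50, k):
    """Sigmoidal stand-in for a breakthrough curve: outlet/inlet concentration
    rising from 0 to 1 around breakthrough time t50 with steepness k."""
    return 1.0 / (1.0 + np.exp(-k * (t - t50)))

# Synthetic "measured" breakthrough data (illustrative only)
rng = np.random.default_rng(2)
t = np.linspace(0.0, 60.0, 61)                       # time, min
c = breakthrough(t, t50=30.0, k=0.4) + 0.02 * rng.normal(size=t.size)

# Estimate the model parameters from the noisy data
popt, pcov = curve_fit(breakthrough, t, c, p0=[20.0, 0.1])
t50_hat, k_hat = popt
print(f"estimated t50 = {t50_hat:.1f} min, k = {k_hat:.2f}")
```

Validation against an independent (here: preparative-scale) data set would then check whether the fitted parameters transfer, which is the step the paper assesses.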