Processing Pipeline (processing + pipeline)
Selected Abstracts

Visual modelling: from images to images
COMPUTER ANIMATION AND VIRTUAL WORLDS (PREV: JNL OF VISUALISATION & COMPUTER ANIMATION), Issue 4 2002
Marc Pollefeys

Abstract: This paper contains two parts. In the first part, an automatic processing pipeline is presented that analyses an image sequence and automatically extracts camera motion, calibration and scene geometry. The system combines state-of-the-art algorithms developed in computer vision, computer graphics and photogrammetry. The approach consists of two stages. Salient features are extracted and tracked throughout the sequence to compute the camera motion and calibration and the 3D structure of the observed features. Then a dense estimate of the surface geometry of the observed scene is computed using stereo matching. The second part of the paper discusses how this information can be used for visualization. Traditionally, a textured 3D model is constructed from the computed information and used to render new images. Alternatively, it is also possible to avoid the need for an explicit 3D model and to obtain new views directly by combining the appropriate pixels from recorded views. It is interesting to note that even when there is an ambiguity in the reconstructed geometry, correct new images can often still be generated. Copyright © 2002 John Wiley & Sons, Ltd. [source]
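As a rough, non-authoritative illustration of the two-stage approach described above (sparse feature tracking and motion recovery, followed by dense stereo matching), the sketch below uses OpenCV. The function names, parameter values, the assumption of a known calibration matrix K (the paper recovers calibration automatically), and the assumption of already rectified views in the dense stage are simplifications introduced here for clarity; they are not taken from the paper.

```python
# Minimal two-stage sketch: track salient features between two frames,
# recover relative camera motion, triangulate a sparse structure, then
# densify the geometry with stereo matching on rectified views.
import cv2
import numpy as np

def sparse_stage(img0, img1, K):
    """Stage 1: feature tracking -> camera motion -> sparse 3D structure."""
    g0 = cv2.cvtColor(img0, cv2.COLOR_BGR2GRAY)
    g1 = cv2.cvtColor(img1, cv2.COLOR_BGR2GRAY)

    # Extract salient features and track them into the next frame (KLT).
    p0 = cv2.goodFeaturesToTrack(g0, maxCorners=2000, qualityLevel=0.01, minDistance=7)
    p1, status, _ = cv2.calcOpticalFlowPyrLK(g0, g1, p0, None)
    p0, p1 = p0[status.ravel() == 1], p1[status.ravel() == 1]

    # Relative camera motion from the tracked correspondences.
    E, _ = cv2.findEssentialMat(p0, p1, K, method=cv2.RANSAC, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, p0, p1, K)

    # Triangulate the tracked features into a sparse point cloud.
    P0 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
    P1 = K @ np.hstack([R, t])
    X = cv2.triangulatePoints(P0, P1, p0.reshape(-1, 2).T, p1.reshape(-1, 2).T)
    return R, t, (X[:3] / X[3]).T  # Euclidean 3D points, one row per feature

def dense_stage(rect_left, rect_right):
    """Stage 2: dense surface geometry via stereo matching on rectified grayscale views."""
    sgbm = cv2.StereoSGBM_create(minDisparity=0, numDisparities=128, blockSize=5)
    return sgbm.compute(rect_left, rect_right).astype(np.float32) / 16.0
```

In a full pipeline this would run over the whole sequence, with bundle adjustment and self-calibration refining the per-frame results before the dense stage; the sketch only shows the skeleton of the two stages.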
Image processing pipeline for synchrotron-radiation-based tomographic microscopy
JOURNAL OF SYNCHROTRON RADIATION, Issue 4 2010
C. Hintermüller

Abstract: With synchrotron-radiation-based tomographic microscopy, three-dimensional structures down to the micrometer level can be visualized. Tomographic data sets typically consist of 1000 to 1500 projections of 1024 × 1024 to 2048 × 2048 pixels and are acquired in 5–15 min. A processing pipeline has been developed to handle this large amount of data efficiently and to reconstruct the tomographic volume within a few minutes after the end of a scan. Just a few seconds after the raw data have been acquired, a selection of reconstructed slices is accessible through a web interface for preview and for fine-tuning the reconstruction parameters. The same interface allows initiation and control of the reconstruction process on the computer cluster. By integrating all programs and tools required for tomographic reconstruction into the pipeline, the necessary user interaction is reduced to a minimum. The modularity of the pipeline allows functionality to be added for new scan protocols, such as an extended field of view, or for new physical signals, such as phase-contrast or dark-field imaging. [source]

Automated reprocessing pipeline for searching heterogeneous mass spectrometric data of the HUPO Brain Proteome Project pilot phase
PROTEINS: STRUCTURE, FUNCTION AND BIOINFORMATICS, Issue 18 2006
Christian Stephan

Abstract: The newly available techniques for sensitive proteome analysis and the resulting amount of data require a new bioinformatics focus on automatic methods for spectrum reprocessing and peptide/protein validation. Manual validation of results in such studies is neither feasible nor objective enough for quality-relevant interpretation. Tools enabling automatic quality control are therefore essential to produce reliable and comparable data in consortia as large as the Human Proteome Organization Brain Proteome Project. Standards and well-defined processing pipelines are important for these consortia. We show a way to choose the right database model: collect the data, process them against a decoy database, and arrive at a quality-controlled protein list, merged from several search engines, with a known false-positive rate. [source]
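To make the decoy-database idea in the last abstract concrete, here is a small, self-contained sketch of how a false-positive rate can be estimated from decoy hits and used as a quality gate when merging results from several search engines. The PSM record, score threshold, and merging rule are illustrative assumptions introduced here; they are not the consortium's actual pipeline.

```python
# Illustrative decoy-database quality control: estimate the false-positive
# rate of peptide-spectrum matches (PSMs) by comparing hits against the
# target database with hits against a reversed (decoy) database.
from dataclasses import dataclass

@dataclass
class PSM:                      # one peptide-spectrum match from a search engine
    peptide: str
    score: float
    is_decoy: bool              # True if the hit comes from the decoy database

def false_positive_rate(psms, score_threshold):
    """Estimate: decoy hits above threshold / target hits above threshold."""
    target = sum(1 for p in psms if p.score >= score_threshold and not p.is_decoy)
    decoy = sum(1 for p in psms if p.score >= score_threshold and p.is_decoy)
    return decoy / target if target else 0.0

def merge_protein_list(engine_results, score_threshold, max_fpr=0.05):
    """Merge confident target hits from several search engines, skipping any
    engine whose estimated false-positive rate exceeds max_fpr."""
    merged = {}
    for engine, psms in engine_results.items():
        if false_positive_rate(psms, score_threshold) > max_fpr:
            continue            # this engine's results fail the quality criterion
        for p in psms:
            if not p.is_decoy and p.score >= score_threshold:
                merged.setdefault(p.peptide, set()).add(engine)
    return merged               # peptide -> engines that confidently identified it
```

A realistic implementation would work at the protein rather than peptide level and choose the threshold per engine to hit a fixed false-positive rate, but the counting logic above is the core of the decoy approach.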