Thresholding Procedure (thresholding + procedure)

Selected Abstracts


An adaptive empirical Bayesian thresholding procedure for analysing microarray experiments with replication

JOURNAL OF THE ROYAL STATISTICAL SOCIETY: SERIES C (APPLIED STATISTICS), Issue 3 2007
Rebecca E. Walls
Summary. A typical microarray experiment attempts to ascertain which genes display differential expression in different samples. We model the data using a two-component mixture model and develop an empirical Bayesian thresholding procedure, originally introduced for thresholding wavelet coefficients, as an alternative to existing methods for determining differential expression across thousands of genes. The method is built on sound theoretical properties and is easily implemented in the R statistical package. Furthermore, we consider improvements to the standard empirical Bayesian procedure when replication is present, to increase the robustness and reliability of the method. We provide an introduction to microarrays for those who are unfamiliar with the field, and the proposed procedure is demonstrated with applications to two-channel complementary DNA microarray experiments. [source]
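The two-component mixture idea behind this abstract can be illustrated with a minimal sketch: each gene's standardized score is modeled as null with probability 1 − w and differentially expressed with probability w, and the posterior non-null probability is thresholded. This is not the authors' posterior-median wavelet-style rule; the hyperparameters `w` and `tau` are hypothetical and would be estimated empirically in practice.

```python
import numpy as np

def posterior_nonnull_prob(z, w=0.1, tau=3.0):
    """P(differentially expressed | z-score) under a two-component normal
    mixture: z ~ (1-w) * N(0, 1) + w * N(0, 1 + tau^2)."""
    def npdf(x, sd):
        return np.exp(-0.5 * (x / sd) ** 2) / (sd * np.sqrt(2.0 * np.pi))
    f0 = npdf(z, 1.0)                       # null density
    f1 = npdf(z, np.sqrt(1.0 + tau ** 2))   # alternative (inflated-variance) density
    return w * f1 / (w * f1 + (1.0 - w) * f0)

# Declare a gene differentially expressed when the posterior probability
# of the non-null component exceeds 0.5.
def is_differential(z, w=0.1, tau=3.0):
    return posterior_nonnull_prob(z, w, tau) > 0.5
```

A score near zero yields a small posterior non-null probability, while a large score yields one near 1, which is what makes a simple 0.5 cutoff a usable decision rule in this sketch.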


A novel wavelet-based thresholding method for the pre-processing of mass spectrometry data that accounts for heterogeneous noise

PROTEINS: STRUCTURE, FUNCTION AND BIOINFORMATICS, Issue 15 2008
Deukwoo Kwon Dr.
Abstract In recent years there has been increased interest in using protein mass spectrometry to discriminate diseased from healthy individuals, with the aim of discovering molecular markers for disease. A crucial step before any statistical analysis is the pre-processing of the mass spectrometry data, and statistical results are typically strongly affected by the specific pre-processing techniques used. One important pre-processing step is the removal of chemical and instrumental noise from the mass spectra, for which wavelet denoising techniques are a standard method. Existing techniques, however, do not accommodate errors that vary across the mass spectrum, but instead assume a homogeneous error structure. In this paper we propose a novel wavelet denoising approach that deals with heterogeneous errors by incorporating a variance change point detection method in the thresholding procedure. We study our method on real and simulated mass spectrometry data and show that it improves the performance of peak detection methods. [source]
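The core idea, letting the wavelet threshold track a noise level that changes along the spectrum, can be sketched with a one-level Haar transform and a per-segment MAD noise estimate. This is a simplified stand-in for the paper's approach: the segment boundaries here are fixed rather than found by variance change point detection.

```python
import numpy as np

def haar_denoise_hetero(y, n_segments=4):
    """One-level Haar soft-thresholding with a separate noise estimate per
    segment, mimicking a noise variance that changes along the m/z axis."""
    y = np.asarray(y, dtype=float)
    n = len(y) // 2 * 2                            # even length for Haar pairs
    approx = (y[0:n:2] + y[1:n:2]) / np.sqrt(2.0)  # smooth coefficients (kept)
    detail = (y[0:n:2] - y[1:n:2]) / np.sqrt(2.0)  # noise-bearing coefficients
    out = detail.copy()
    for seg in np.array_split(np.arange(len(detail)), n_segments):
        sigma = np.median(np.abs(detail[seg])) / 0.6745   # MAD noise estimate
        t = sigma * np.sqrt(2.0 * np.log(len(detail)))    # universal threshold
        out[seg] = np.sign(detail[seg]) * np.maximum(np.abs(detail[seg]) - t, 0.0)
    rec = np.empty(n)                              # inverse Haar transform
    rec[0:n:2] = (approx + out) / np.sqrt(2.0)
    rec[1:n:2] = (approx - out) / np.sqrt(2.0)
    return rec
```

With a homogeneous (global) noise estimate, a low-noise region of the spectrum would be over-thresholded and a high-noise region under-thresholded; estimating sigma per segment avoids both.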


A new high-resolution computed tomography (CT) segmentation method for trabecular bone architectural analysis

AMERICAN JOURNAL OF PHYSICAL ANTHROPOLOGY, Issue 1 2009
Heike Scherf
Abstract In the last decade, high-resolution computed tomography (CT) and microcomputed tomography (micro-CT) have been increasingly used in anthropological studies and as a complement to traditional histological techniques. This is due in large part to the ability of CT techniques to nondestructively extract three-dimensional representations of bone structures. Despite prior studies employing CT techniques, no completely reliable method of bone segmentation has been established. Accurate preprocessing of digital data is crucial for measurement accuracy, especially when subtle structures such as trabecular bone are investigated. The research presented here introduces a new, reproducible, accurate, and fully automated computerized segmentation method for high-resolution CT datasets of fossil and recent cancellous bone: the Ray Casting Algorithm (RCA). We compare this technique with commonly used methods of image thresholding (i.e., the half-maximum height protocol and the automatic, adaptive iterative thresholding procedure). While the quality of the input images is crucial for conventional image segmentation, the RCA method is robust with respect to the signal-to-noise ratio, beam hardening, ring artifacts, and blurriness. Tests with data from extant and fossil material demonstrate the superior quality of RCA compared with conventional thresholding procedures, and emphasize the need for careful consideration of optimal CT scanning parameters. Am J Phys Anthropol 2009. © 2009 Wiley-Liss, Inc. [source]
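One of the conventional baselines named above, automatic adaptive iterative thresholding, is commonly implemented as the Ridler-Calvard (isodata) rule: repeatedly set the threshold to the midpoint of the mean intensities above and below it until it stabilizes. A minimal sketch (not the RCA method, and the convergence tolerance is an illustrative choice):

```python
import numpy as np

def isodata_threshold(img, tol=0.5):
    """Ridler-Calvard iterative threshold on a grayscale intensity array."""
    img = np.asarray(img, dtype=float)
    t = img.mean()                      # initial guess: global mean intensity
    while True:
        below = img[img <= t]           # background voxels under current threshold
        above = img[img > t]            # bone voxels above current threshold
        t_new = 0.5 * (below.mean() + above.mean())
        if abs(t_new - t) < tol:        # stop once the threshold stabilizes
            return t_new
        t = t_new
```

On a cleanly bimodal histogram this converges in a few iterations; the abstract's point is that such global thresholds degrade with noise, beam hardening, and ring artifacts, which motivates a segmentation method that does not depend on a single intensity cutoff.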


An Adaptive Single-step FDR Procedure with Applications to DNA Microarray Analysis

BIOMETRICAL JOURNAL, Issue 1 2007
Vishwanath Iyer
Abstract Multiple hypothesis testing procedures have recently received considerable attention from statisticians working in DNA microarray analysis. Traditional FWER (family-wise error rate) controlling procedures are not very useful in this situation, since the experiments are exploratory by nature and researchers are more interested in controlling the rate of false positives than in controlling the probability of making a single erroneous decision. This has led to increased use of FDR (false discovery rate) controlling procedures. Genovese and Wasserman proposed a single-step FDR procedure that is an asymptotic approximation to the original Benjamini and Hochberg stepwise procedure. In this paper, we modify the Genovese-Wasserman procedure to force the FDR control closer to the level alpha in the independence setting. Assuming that the data come from a mixture of two normals, we also propose to make this procedure adaptive by first estimating the parameters using the EM algorithm and then plugging these estimates into the above modification of the Genovese-Wasserman procedure. We compare this procedure with the original Benjamini-Hochberg and the SAM thresholding procedures. The FDR control and other properties of this adaptive procedure are verified numerically. (© 2007 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim) [source]
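For reference, the original Benjamini-Hochberg step-up procedure that the single-step variant approximates can be sketched as follows: sort the p-values, find the largest k with p(k) ≤ (k/m)·alpha, and reject the k smallest.

```python
import numpy as np

def benjamini_hochberg(pvals, alpha=0.05):
    """Boolean mask of hypotheses rejected by the BH step-up rule at level alpha."""
    p = np.asarray(pvals, dtype=float)
    m = len(p)
    order = np.argsort(p)               # indices that sort the p-values
    sorted_p = p[order]
    # Step-up criterion: largest k with p_(k) <= (k/m) * alpha
    thresh = alpha * np.arange(1, m + 1) / m
    below = sorted_p <= thresh
    rejected = np.zeros(m, dtype=bool)
    if below.any():
        k = np.max(np.nonzero(below)[0])
        rejected[order[:k + 1]] = True  # reject the k+1 smallest p-values
    return rejected
```

For example, with p-values (0.01, 0.02, 0.03, 0.5) and alpha = 0.05, the per-rank cutoffs are (0.0125, 0.025, 0.0375, 0.05), so the three smallest p-values are rejected. Adaptive procedures like the one in this abstract aim to sharpen such cutoffs by estimating the mixture of null and non-null hypotheses from the data.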