Sampling Approach
Selected Abstracts

Measuring the Information Content of the Beige Book: A Mixed Data Sampling Approach
JOURNAL OF MONEY, CREDIT AND BANKING, Issue 1 2009
MICHELLE T. ARMESTO
Keywords: data sampling frequency; textual analysis; DICTION; Beige Book

Studies of the predictive ability of the Federal Reserve's Beige Book for aggregate output and employment have proven inconclusive. This might be attributed, in part, to its irregular release schedule. We use a model that allows for data sampling at mixed frequencies to analyze the predictive power of the Beige Book. We find that the Beige Book's national summary and District reports predict GDP and aggregate employment, and that most District reports provide information content for regional employment. In addition, there appears to be an asymmetry in the predictive content of the Beige Book language. [source]
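The mixed data sampling (MIDAS) framework this abstract draws on regresses a low-frequency series (e.g., quarterly GDP growth) on many lags of a higher-frequency indicator, with the lag weights tied together by a small number of parameters. The sketch below is a minimal illustration on synthetic data, assuming an exponential Almon weighting scheme; none of the names or numbers come from the paper.

```python
import numpy as np
from scipy.optimize import minimize

def exp_almon_weights(theta1, theta2, n_lags):
    """Exponential Almon lag weights, normalized to sum to one."""
    k = np.arange(1, n_lags + 1)
    w = np.exp(theta1 * k + theta2 * k**2)
    return w / w.sum()

def midas_sse(params, y, X_hf):
    """Sum of squared errors for a simple MIDAS regression.
    y: (T,) low-frequency target; X_hf: (T, n_lags) high-frequency lags."""
    beta0, beta1, theta1, theta2 = params
    w = exp_almon_weights(theta1, theta2, X_hf.shape[1])
    resid = y - beta0 - beta1 * (X_hf @ w)
    return np.sum(resid**2)

# Synthetic example: a quarterly target driven by 12 lags of a weekly indicator
rng = np.random.default_rng(0)
T, n_lags = 120, 12
X_hf = rng.normal(size=(T, n_lags))
true_w = exp_almon_weights(0.1, -0.05, n_lags)
y = 0.5 + 2.0 * (X_hf @ true_w) + 0.1 * rng.normal(size=T)

fit = minimize(midas_sse, x0=[0.0, 1.0, 0.0, -0.01], args=(y, X_hf),
               method="Nelder-Mead")
print("estimated (beta0, beta1, theta1, theta2):", fit.x)
```

Normalizing the weights to sum to one keeps the slope beta1 identified, which is the usual design choice in MIDAS specifications.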
A Sampling Approach for Evaluating Particle Loss During Continuous Field Measurement of Particulate Matter
PARTICLE & PARTICLE SYSTEMS CHARACTERIZATION, Issue 2 2005
Christopher A. Noble

A method for evaluating sample bias in field measurements is presented. Experiments were performed in the field and in the laboratory to quantify the bias as a function of particle size for the scanning mobility particle sizer and the aerodynamic particle sizer. The sources of bias and sample loss considered in this work were sampling line loss, instrumental differences, and inlet efficiencies. Measuring the bias and sample loss allows the data acquired in the field to be corrected, so as to obtain more representative samples of atmospheric concentrations. Substantial losses of fine and ultrafine particle counts were observed, with sampling line losses ranging from 10-50%, depending on particle size. Only minor line losses were observed for coarse particles (approximately 5%) because the sampling line was oriented vertically. (Please note: corrected DOI; the DOI given in print, 10.1002/ppsc.200400939, is wrong.) [source]

Measuring Social Mobility as Unpredictability
ECONOMICA, Issue 269 2001
Simon C. Parker

By associating mobility with the unpredictability of social states, new measures of social mobility may be constructed. We propose a family of three state-by-state and aggregate (scalar) predictability measures. The first set of measures is based on the transition matrix. The second uses a sampling approach and permits statistical testing of the hypothesis of perfect mobility, providing a new justification for the use of the χ² statistic. The third satisfies the demanding criterion of 'period consistency'. An empirical example demonstrates the usefulness of the new measures as a complement to existing ones in the literature. [source]
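Parker's specific measures are not reproduced here, but the two ingredients the abstract names, a transition matrix and a χ² test of perfect mobility, can be illustrated directly. A minimal sketch with made-up transition counts; the entropy-based unpredictability index is one generic choice, not necessarily the paper's.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Observed origin-destination transition counts between three social classes
# (illustrative numbers, not from the paper)
counts = np.array([
    [120,  60,  20],   # origin: low
    [ 50, 140,  60],   # origin: middle
    [ 15,  55, 130],   # origin: high
])

# Row-normalize to get the estimated transition matrix P
P = counts / counts.sum(axis=1, keepdims=True)

# State-by-state unpredictability: normalized entropy of each row.
# A row concentrated on one destination is highly predictable (low mobility);
# a uniform row is maximally unpredictable (high mobility).
row_entropy = -(P * np.log(P)).sum(axis=1) / np.log(P.shape[1])
print("per-origin unpredictability:", row_entropy.round(3))

# Under 'perfect mobility' the destination class is independent of the
# origin class, so the usual chi-square test of independence applies.
chi2, p_value, dof, expected = chi2_contingency(counts)
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p_value:.2g}")
```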
Quantitative microbial faecal source tracking with sampling guided by hydrological catchment dynamics
ENVIRONMENTAL MICROBIOLOGY, Issue 10 2008
G. H. Reischer

The impairment of water quality by faecal pollution is a global public health concern. Microbial source tracking methods help to identify faecal sources, but the few recent quantitative microbial source tracking applications have disregarded catchment hydrology and pollution dynamics. This quantitative microbial source tracking study, conducted in a large karstic spring catchment potentially influenced by humans and ruminant animals, was based on a tiered sampling approach: a 31-month water quality monitoring programme (Monitoring) covering seasonal hydrological dynamics, and an investigation of flood events (Events) as the periods of strongest pollution. The detection of a ruminant-specific and a human-specific faecal Bacteroidetes marker by quantitative real-time PCR was complemented by standard microbiological and on-line hydrological parameters. Both quantitative microbial source tracking markers were detected in spring water during Monitoring and Events, with a preponderance of the ruminant-specific marker. Multiparametric analysis of all data allowed the ruminant-specific marker to be linked to general faecal pollution indicators, especially during Events. Up to 80% of the variation in faecal indicator levels during Events could be explained by ruminant-specific marker levels, proving the dominance of ruminant faecal sources in the catchment. Furthermore, soil was ruled out as a source of quantitative microbial source tracking markers. This study demonstrates the applicability of quantitative microbial source tracking methods and highlights the prerequisite of considering hydrological catchment dynamics in source tracking study design. [source]

Large scale wildlife monitoring studies: statistical methods for design and analysis
ENVIRONMETRICS, Issue 2 2002
Kenneth H. Pollock

Techniques for estimating the absolute abundance of wildlife populations have received much attention in recent years. The statistical research has focused on intensive small-scale studies. Recently, however, wildlife biologists have sought to study populations of animals at very large scales for monitoring purposes. Population indices are widely used in these extensive monitoring programs because they are inexpensive compared with estimates of absolute abundance. A crucial underlying assumption is that the population index (C) is directly proportional to the population density (D). The proportionality constant, β, is simply the probability of 'detection' for animals in the survey. As spatial and temporal comparisons of indices are crucial, it is also necessary to assume that the probability of detection is constant over space and time. Biologists intuitively recognize this when they design rigid protocols for the studies in which the indices are collected. Unfortunately, however, in many field studies the assumption is clearly invalid. We believe that the estimation of detection probability should be built into the monitoring design through a double sampling approach. A large sample of points provides an abundance index, and a smaller sub-sample of the same points is used to estimate detection probability. There is an important need for statistical research on the design and analysis of these complex studies. Some basic concepts based on actual avian, amphibian, and fish monitoring studies are presented in this article. Copyright © 2002 John Wiley & Sons, Ltd. [source]
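The double sampling idea reduces to estimating the detection probability β on a sub-sample and using it to rescale the index: since C = βD, a corrected density estimate is D = C/β. A hedged sketch with synthetic counts follows; the delta-method standard error assumes the two stages are independent, which an actual design would need to justify.

```python
import numpy as np

rng = np.random.default_rng(1)

# Stage 1: index counts at a large sample of points (e.g., birds heard per point)
index_counts = rng.poisson(lam=4.0, size=400)
C_bar = index_counts.mean()

# Stage 2: at a small sub-sample of points, an intensive method (e.g., capture-
# recapture) yields pairs of (index count, estimated true abundance)
detected = np.array([6, 4, 7, 5, 6, 8, 3, 5])   # index counts at sub-sampled points
present  = np.array([9, 7, 10, 8, 9, 11, 6, 8]) # intensive-effort abundances
beta_hat = detected.sum() / present.sum()        # estimated detection probability

# Corrected density estimate: the index assumes C = beta * D, so D = C / beta
D_hat = C_bar / beta_hat
print(f"mean index = {C_bar:.2f}, beta_hat = {beta_hat:.2f}, "
      f"corrected density = {D_hat:.2f}")

# Delta-method variance, treating the two stages as independent
var_C = index_counts.var(ddof=1) / index_counts.size
var_beta = beta_hat * (1 - beta_hat) / present.sum()  # binomial approximation
se_D = D_hat * np.sqrt(var_C / C_bar**2 + var_beta / beta_hat**2)
print(f"approximate SE(D_hat) = {se_D:.2f}")
```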
LIKELIHOOD-BASED INFERENCE IN ISOLATION-BY-DISTANCE MODELS USING THE SPATIAL DISTRIBUTION OF LOW-FREQUENCY ALLELES
EVOLUTION, Issue 11 2009
John Novembre

Estimating dispersal distances from population genetic data provides an important alternative to logistically taxing methods for directly observing dispersal. Although methods for estimating dispersal rates between a modest number of discrete demes are well developed, methods of inference applicable to "isolation-by-distance" models are much less established. Here, we present a method for estimating ρσ², the product of population density (ρ) and the variance of the dispersal displacement distribution (σ²). The method is based on the assumption that low-frequency alleles are identical by descent. Hence, the extent of geographic clustering of such alleles, relative to their frequency in the population, provides information about ρσ². We show that a novel likelihood-based method can infer this composite parameter with a modest bias in a lattice model of isolation-by-distance. For calculating the likelihood, we use an importance sampling approach to average over the unobserved intraallelic genealogies, where the intraallelic genealogies are modeled as a pure birth process. The approach also leads to a likelihood-ratio test of isotropy of dispersal, that is, of whether dispersal distances on two axes differ. We test the performance of our methods using simulations of new mutations in a lattice model and illustrate its use with a dataset from Arabidopsis thaliana. [source]

Unified sampling approach for multipoint linkage disequilibrium mapping of qualitative and quantitative traits
GENETIC EPIDEMIOLOGY, Issue 4 2002
Fang-Chi Hsu

Rapid development in biotechnology has enhanced the opportunity to deal with multipoint gene mapping for complex diseases, and association studies using quantitative traits have recently generated much attention. Unlike the conventional hypothesis-testing approach to fine mapping, we propose a unified multipoint method to localize a gene controlling a quantitative trait. We first calculate the sample size needed to detect linkage and linkage disequilibrium (LD) for a quantitative trait, categorized by decile, under three different modes of inheritance. Our results show that sampling trios of offspring and their parents from either extremely low (EL) or extremely high (EH) probands provides greater statistical power than sampling in the intermediate range. We next propose a unified sampling approach for multipoint LD mapping, where the goal is to estimate the map position (τ) of a trait locus and to calculate a confidence interval along with its sampling uncertainty. Our method builds upon a model for an expected preferential transmission statistic at an arbitrary locus conditional on the sampling scheme, such as sampling from EL and EH probands. This approach is valid regardless of the underlying genetic model. The one major assumption of this model is that no more than one quantitative trait locus (QTL) is linked to the region being mapped. Finally, we illustrate the proposed method using family data on total serum IgE levels collected in multiplex asthmatic families from Barbados. An unobserved QTL appears to be located at τ̂ = 41.93 cM, with a 95% confidence interval of (40.84, 43.02), within the 20-cM region framed by markers D12S1052 and D12S1064 on chromosome 12. The test statistic shows strong evidence of linkage and LD (chi-square statistic = 18.39 with 2 df, P-value = 0.0001). Genet. Epidemiol. 22:298-312, 2002. © 2002 Wiley-Liss, Inc. [source]

A Case Study of Soil-Gas Sampling in Silt and Clay-Rich (Low-Permeability) Soils
GROUND WATER MONITORING & REMEDIATION, Issue 1 2009
Todd A. McAlary

Soil-gas sampling and analysis is a common tool in vapor intrusion assessments; however, sample collection becomes more difficult in fine-grained, low-permeability soils because of limitations on the flow rate that can be sustained during purging and sampling. This affects the time required to extract sufficient volume to satisfy purging and sampling requirements. The soil-gas probe tubing or pipe and the sandpack around the probe screen should generally be purged prior to sampling. After purging, additional soil gas must be extracted for chemical analysis, which may include field screening, laboratory analysis, occasional duplicate samples, or analysis by more than one analytical method (e.g., volatile organic compounds and semivolatile organic compounds). At present, most regulatory guidance documents do not distinguish between soil-gas sampling methods appropriate for high- and low-permeability soils. This paper discusses permeability influences on soil-gas sample collection and reports data from a case study involving soil-gas sampling from silt and clay-rich soils with moderate to extremely low gas permeability, in order to identify a sampling approach that yields reproducible samples with data quality appropriate for vapor intrusion investigations across a wide range of gas-permeability conditions. [source]

Efficient sampling for spatial uncertainty quantification in multibody system dynamics applications
INTERNATIONAL JOURNAL FOR NUMERICAL METHODS IN ENGINEERING, Issue 5 2009
Kyle P. Schmitt

We present two methods for efficiently sampling the response (trajectory space) of multibody systems operating under spatial uncertainty, when the latter is assumed to be representable with Gaussian processes. In this case, the dynamics (time evolution) of the multibody systems depends on spatially indexed uncertain parameters that span infinite-dimensional spaces. This places a heavy computational burden on existing methodologies, an issue addressed herein with two new conditional sampling approaches. When a single instance of the uncertainty is needed in the entire domain, we use a fast Fourier transform technique. When the initial conditions are fixed and the path distribution of the dynamical system is relatively narrow, we use an incremental sampling approach that is fast and has a small memory footprint. Both methods produce the same distributions as the widely used Cholesky-based approaches. We illustrate this convergence at a smaller computational effort and memory cost for a simple non-linear vehicle model. Copyright © 2009 John Wiley & Sons, Ltd. [source]
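For the first of the two methods (drawing a single realization of a spatially indexed Gaussian uncertainty with a fast Fourier transform), the standard tool is circulant embedding of the covariance on a regular grid. The 1-D sketch below is a generic implementation of that technique under stated assumptions; it is not the paper's code, and the covariance function and grid are illustrative.

```python
import numpy as np

def sample_gp_circulant(cov_fn, n, dx=1.0, rng=None):
    """Draw one sample path of a stationary Gaussian process on a regular
    1-D grid of n points via circulant embedding and the FFT.
    cov_fn(h) must return the covariance at lag distance h."""
    rng = np.random.default_rng() if rng is None else rng
    c = cov_fn(np.arange(n) * dx)
    # Embed the covariance in a circulant vector of length m = 2(n - 1)
    emb = np.concatenate([c, c[-2:0:-1]])
    lam = np.fft.fft(emb).real          # eigenvalues of the circulant matrix
    if np.min(lam) < -1e-10 * np.max(lam):
        raise ValueError("embedding not nonnegative definite; enlarge the grid")
    lam = np.clip(lam, 0.0, None)
    m = emb.size
    # Complex white noise shaped by the square root of the eigenvalues;
    # real and imaginary parts of the result are two independent samples.
    eps = rng.normal(size=m) + 1j * rng.normal(size=m)
    path = np.fft.fft(np.sqrt(lam / m) * eps)
    return path.real[:n]

# Example: squared-exponential covariance with length scale 5 on 256 points
x = sample_gp_circulant(lambda h: np.exp(-(h / 5.0) ** 2), n=256,
                        rng=np.random.default_rng(0))
print(x[:5])
```

The appeal over a Cholesky factorization is cost: O(m log m) per sample instead of O(n³) for the factorization, with the same Gaussian distribution (the equivalence the abstract notes).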
False Promises: The Tobacco Industry, "Low Tar" Cigarettes, and Older Smokers
JOURNAL OF AMERICAN GERIATRICS SOCIETY, Issue 9 2008
Janine K. Cataldo RN

To investigate the role of the tobacco industry in marketing to, and sustaining tobacco addiction among, older smokers and aging baby boomers, we searched electronic archives of internal tobacco company documents using a snowball sampling approach. Analysis was done using iterative and comparative review of documents, classification by themes, and a hermeneutic interpretive approach to develop a case study. Based on extensive marketing research, tobacco companies aggressively targeted older smokers and sought to prevent them from quitting. Innovative marketing approaches were used. "Low tar" cigarettes were developed in response to the health concerns of older smokers, despite industry knowledge that such products had no health advantage and did not help smokers quit. Tobacco industry activities influence the context of cessation for older smokers in several ways. Through marketing "low tar" or "light" cigarettes to older smokers "at risk" of quitting, the industry contributes to the illusion that such cigarettes are safer, although "light" cigarettes may make it harder for addicted smokers to quit. Through targeted mailings of coupons and incentives, the industry discourages older smokers from quitting. Through rhetoric aimed at convincing addicted smokers that they alone are responsible for their smoking, the industry contributes to self-blame, a documented barrier to cessation. Educating practitioners, older smokers, and families about the tobacco industry's influence may decrease the tendency to "blame the victim," thereby enhancing the likelihood of older adults receiving tobacco addiction treatment. Comprehensive tobacco control measures must include a focus on older smokers. [source]

Spatial dynamics of predation by carabid beetles on slugs
JOURNAL OF ANIMAL ECOLOGY, Issue 3 2000
David A. Bohan

Summary
1. An explicitly spatial sampling approach was employed to test the null hypothesis that predation on slugs by the carabid beetle Pterostichus melanarius (Illiger) was opportunistic.
2. The beetles and slugs were sampled across a nested series of grids of sampling points in a field of winter wheat during June and July 1997.
3. The spatial distribution of all slugs in June was found to change with the scale of the sampling grid, from random at the 0.25 m scale, through aggregation at 1 m, to random at 4 m. At the largest scale of 16 m, the slugs were significantly spatially aggregated.
4. The distribution of beetles in June was also spatially dynamic, with randomness observed at the 4 m and 8 m scales. At 16 m, significant aggregation was observed.
5. The dynamic distributions of slugs and beetles at 16 m were found not to be associated with, and thus were not determined by, soil or crop factors.
6. Comparison of the slug and beetle populations showed, however, that the distributions at 16 m were dynamically associated with each other. In June, where there were many slugs there were also many carabids, whilst in July, where there were many carabids there were few slugs.
7. Approximately 11% of the beetles sampled across the 16 m grid in June and July were found to have ingested slug protein, following intensive enzyme-linked immunosorbent assay (ELISA) testing.
8. The spatial distribution of these slug-positive beetles was significantly associated with the distribution of the larger slug classes (over 25 mg). Where there were many large slugs in June, there were many slug-positive beetles. Conversely, in July few large slugs were found where there were many slug-positive beetles.
9. Parametric analysis revealed that the changes in the large slug class at each sampling point between June and July (growth) were negatively related to the local numbers of slug-positive beetles, and that growth declined as the local numbers of beetles increased.
10. These findings suggest that predation was not opportunistic but direct and dynamic, falsifying the null hypothesis. Moreover, this predation elicited significant changes in the spatial distribution and local density of the slugs, in a manner that may be termed spatially density dependent. [source]

Evaluating the effect of clustering when monitoring the abundance of sea lice populations on farmed Atlantic salmon
JOURNAL OF FISH BIOLOGY, Issue 3 2005
C. W. Revie

Using cluster random sampling theory and empirical estimates of the intra-class correlations of sea lice Lepeophtheirus salmonis abundances, methods were derived for how best to sample Atlantic salmon Salmo salar from cages on farms. Estimates of the intra-class correlations for the abundance of the chalimus and mobile sea lice stages on Atlantic salmon in Scottish farms are given. These correlations were higher for the mobile stages than for chalimus, and they had a substantive effect in increasing the number of cages and fish to be sampled for all sea lice stages. An important finding is that it is better to have a procedure that randomly samples a large number of cages using a small number of fish from each. This finding, and the cluster random sampling approach, have relevance to the monitoring of all marine species farmed in cages or tanks. [source]
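The cages-versus-fish trade-off in this abstract is governed by the design effect DEFF = 1 + (m - 1) * ICC, where m is the number of fish sampled per cage and ICC is the intra-class correlation: the higher the ICC, the less new information each additional fish from an already sampled cage contributes. A small sketch with assumed ICC values (not the paper's estimates):

```python
import numpy as np

def effective_sample_size(n_cages, fish_per_cage, icc):
    """Effective number of independent fish under one-stage cluster sampling,
    using the design effect DEFF = 1 + (m - 1) * ICC."""
    deff = 1.0 + (fish_per_cage - 1.0) * icc
    return n_cages * fish_per_cage / deff

# Illustrative ICCs: mobile stages cluster more strongly than chalimus
icc = {"chalimus": 0.05, "mobiles": 0.20}

# Same total effort (60 fish), allocated two ways
for stage, rho in icc.items():
    many_cages = effective_sample_size(n_cages=12, fish_per_cage=5, icc=rho)
    few_cages = effective_sample_size(n_cages=3, fish_per_cage=20, icc=rho)
    print(f"{stage}: 12 cages x 5 fish -> {many_cages:.1f} effective fish; "
          f"3 cages x 20 fish -> {few_cages:.1f} effective fish")
```

Running this shows the pattern the abstract reports: for the same total number of fish, many cages with few fish each always yield more effective information, and the advantage grows with the ICC.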
Monitoring Regional Riparian Forest Cover Change Using Stratified Sampling and Multiresolution Imagery
JOURNAL OF THE AMERICAN WATER RESOURCES ASSOCIATION, Issue 2 2010
Peter R. Claggett

Claggett, Peter R., Judy A. Okay, and Stephen V. Stehman, 2010. Monitoring Regional Riparian Forest Cover Change Using Stratified Sampling and Multiresolution Imagery. Journal of the American Water Resources Association (JAWRA) 46(2):334-343. DOI: 10.1111/j.1752-1688.2010.00424.x

The Chesapeake Bay watershed encompasses 165,760 km² of land area with 464,098 km of rivers and streams. As part of the Chesapeake Bay restoration effort, state and federal partners have committed to restoring 26,000 miles (41,843 km) of riparian forest buffers. Monitoring trends in riparian forest buffers over large areas is necessary to evaluate the efficacy of these restoration efforts. A sampling approach for estimating change in riparian forest cover from 1993/1994 to 2005 was developed and implemented in Anne Arundel County, Maryland, to exemplify a method that could be applied throughout the Bay watershed. All stream reaches in the county were stratified using forest cover change derived from Landsat imagery. A stratified random sample of 219 reaches was selected, and forest cover change within the riparian buffer of each sampled reach was interpreted from high-resolution aerial photography. The estimated footprint of gross change in riparian forest cover (i.e., the sum of gross gain and gross loss) for the county was 1.83% (SE = 0.22%). Stratified sampling taking advantage of a priori knowledge of the locations of change proved to be a practical and efficient protocol for estimating riparian forest buffer change at the county scale, and the protocol would readily extend to much broader scale monitoring. [source]
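The county-level estimate and its standard error follow from the standard stratified estimator, with the Landsat-detected change classes defining the strata. A sketch with invented strata sizes and change fractions, not the study's data:

```python
import numpy as np

# Strata defined by Landsat-detected change; illustrative numbers only.
# Each stratum: total reaches N and the photo-interpreted change fraction
# for each sampled reach (sample sizes 60 and 159 here sum to 219).
rng = np.random.default_rng(2)
strata = {
    "likely-change":   dict(N=400,  y=rng.beta(2, 30, size=60)),
    "unlikely-change": dict(N=3600, y=rng.beta(1, 200, size=159)),
}

N_total = sum(s["N"] for s in strata.values())
estimate, variance = 0.0, 0.0
for s in strata.values():
    W = s["N"] / N_total                 # stratum weight
    y = s["y"]
    n = y.size
    estimate += W * y.mean()
    fpc = 1.0 - n / s["N"]               # finite population correction
    variance += W**2 * fpc * y.var(ddof=1) / n

print(f"stratified change estimate = {estimate:.4f}, "
      f"SE = {np.sqrt(variance):.4f}")
```

The efficiency gain comes from oversampling the small "likely-change" stratum, where most of the variability lives, exactly the a priori knowledge the abstract credits for the protocol's practicality.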
Evaluation of photovoltaic modules based on sampling inspection using smoothed empirical quantiles
PROGRESS IN PHOTOVOLTAICS: RESEARCH & APPLICATIONS, Issue 1 2010
Ansgar Steland

An important issue for end users and distributors of photovoltaic (PV) modules is the inspection of the power output specification of a shipment. The question is whether or not the modules satisfy the specifications given in the data sheet, namely the nominal power output under standard test conditions, relative to the power output tolerance. Since collecting control measurements of all modules is usually unrealistic, decisions have to be based on random samples. In many cases, one has access to flash data tables of final power output measurements (flash data) from the producer. We propose to rely on the statistical acceptance sampling approach as an objective decision framework, one which takes into account both the end user's and the producer's risk of a false decision. A practical solution to the problem, recently found by the authors, is discussed. The solution consists of estimates of the required optimal sample size and the associated critical value, where the estimation uses the information contained in the additional flash data. We propose and examine an improved solution which yields even more reliable estimated sampling plans, as substantiated by a Monte Carlo study. This is achieved by employing advanced statistical estimation techniques. Copyright © 2009 John Wiley & Sons, Ltd. [source]
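Steland's method is a variables-type plan built on smoothed empirical quantiles of the flash data; as a simplified stand-in, the classical attributes version below searches for the smallest plan (n, c) meeting stated producer's and consumer's risks. The risk levels and defect rates are illustrative, not from the paper.

```python
from scipy.stats import binom

def single_sampling_plan(p1, p2, alpha=0.05, beta=0.10, n_max=2000):
    """Smallest single sampling plan (n, c) by attributes such that
    P(accept | defect rate p1) >= 1 - alpha   (producer's risk)
    P(accept | defect rate p2) <= beta        (consumer's risk),
    where 'defective' means a module below its power tolerance."""
    for n in range(1, n_max + 1):
        for c in range(n + 1):
            if binom.cdf(c, n, p1) >= 1 - alpha:
                if binom.cdf(c, n, p2) <= beta:
                    return n, c
                break  # a larger c only raises the consumer's risk
    raise ValueError("no plan found; raise n_max")

# E.g., accept shipments with <= 2% out-of-spec modules, reject at >= 8%
n, c = single_sampling_plan(p1=0.02, p2=0.08)
print(f"inspect n = {n} modules, accept if at most c = {c} are out of spec")
```

The paper's contribution is, in effect, to shrink the required n by borrowing strength from the producer's flash data when estimating the plan; the attributes plan above ignores that side information.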
Landings, logbooks and observer surveys: improving the protocols for sampling commercial fisheries
FISH AND FISHERIES, Issue 2 2007
A J R Cotter

The sampling of commercial marine fisheries for management purposes often displays a key weakness in the form of poor documentation of the scientific basis of the sampling and estimation, the assumptions made, and the practical constraints. This paper systematically reviews the theoretical and practical options that can remedy this situation and recommends that decisions be archived in regularly updated 'Sampling Approach and Modifications' (SAM) documents. Defining the target population, the observable population (usually a subset of the target), and the assumed links between them is important, along with the distinction between design- and model-based sampling approaches. Fleet-targeted and stock-targeted sampling strategies are contrasted, the latter being much harder to implement. Sampling protocols aimed at estimating quantities of fish landed and discarded, length-frequency distributions, length-related variables such as age, weight and maturity, and ratio variables such as catch per unit of effort and the proportions of discards are discussed, together with the raising of estimates to fleet and/or stock levels. The ideas are summarized in the specific contexts of landings sampling, logbook schemes, and sea-going observer surveys. SAMs are commended for enhancing the scientific value of fishery sampling, and for encouraging methodological discussions among users and producers of the data. [source]

Using the local elevation method to construct optimized umbrella sampling potentials: Calculation of the relative free energies and interconversion barriers of glucopyranose ring conformers in water
JOURNAL OF COMPUTATIONAL CHEMISTRY, Issue 1 2010
Halvor S. Hansen

A method is proposed to combine the local elevation (LE) conformational searching and the umbrella sampling (US) conformational sampling approaches into a single local elevation umbrella sampling (LEUS) scheme for (explicit-solvent) molecular dynamics (MD) simulations. In this approach, an initial (relatively short) LE build-up (searching) phase is used to construct an optimized biasing potential within a subspace of conformationally relevant degrees of freedom, which is then used in a (comparatively longer) US sampling phase. This scheme dramatically enhances the sampling power of MD simulations in comparison with plain MD, taking advantage of the fact that the preoptimized biasing potential represents a reasonable approximation to the negative of the free energy surface in the considered conformational subspace. The method is applied to the calculation of the relative free energies of β-D-glucopyranose ring conformers in water (within the GROMOS 45A4 force field). Different schemes for assigning sampled conformational regions to distinct states are also compared. This approach, which bears some analogies with adaptive umbrella sampling and metadynamics (but within a very distinct implementation), is shown to be: (i) efficient (nearly all the computational effort is invested in the actual sampling phase rather than in searching and equilibration); (ii) robust (the method is only weakly sensitive to the details of the build-up protocol, even for relatively short build-up times); and (iii) versatile (a LEUS biasing potential database could easily be preoptimized for small molecules and assembled on a fragment basis for larger ones). © 2009 Wiley Periodicals, Inc. J Comput Chem 2010 [source]

Design of flexible reduced kinetic mechanisms
AICHE JOURNAL, Issue 11 2001
Avinash R. Sirdeshpande

Reduced mechanisms are often used in place of detailed chemistry because the computational burden of including all the species continuity equations in the reactor model is unreasonably high. Contemporary reduction techniques produce mechanisms that depend strongly on the nominal set of problem parameters for which the reduction is carried out. The effects of variability in these parameters on the reduced mechanism are the focus of this work. The range of validity of a reduced mechanism is determined for variations in the initial conditions. Both sampling approaches and quantitative measures of feasibility, such as the flexibility index and the convex hull formulation, are employed. The inverse problem of designing a reduced mechanism that covers a desired range of initial conditions is addressed using a multiperiod approach. The effect of the value of a user-defined tolerance parameter, which determines whether the predictions made by the reduced mechanism are acceptable, is also assessed. The analytical techniques are illustrated with examples from the literature. [source]
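A common way to map the range of validity of a reduced mechanism, as the last abstract describes, is to sample initial conditions and flag points where the reduced model's prediction leaves a user-defined tolerance band around the detailed model's. A toy sketch with hypothetical stand-in models; a real study would run full reactor simulations at each sampled point.

```python
import numpy as np

def within_tolerance(detailed, reduced, tol=0.05):
    """Accept the reduced mechanism at a sampled point if its prediction is
    within a relative tolerance of the detailed-chemistry prediction."""
    return np.abs(reduced - detailed) <= tol * np.abs(detailed)

# Hypothetical stand-ins for reactor-model evaluations (e.g., ignition delay
# as a function of initial temperature T0 and equivalence ratio phi)
def detailed_model(T0, phi):
    return 1e3 * np.exp(1500.0 / T0) / phi

def reduced_model(T0, phi):
    return 1e3 * np.exp(1480.0 / T0) / phi   # slightly mis-tuned on purpose

rng = np.random.default_rng(3)
T0 = rng.uniform(900.0, 1400.0, size=500)    # sampled initial temperatures, K
phi = rng.uniform(0.6, 1.4, size=500)        # sampled equivalence ratios

ok = within_tolerance(detailed_model(T0, phi), reduced_model(T0, phi))
print(f"reduced mechanism acceptable at {ok.mean():.1%} of sampled conditions")
# The points where 'ok' is False delimit the region of validity; a flexibility-
# index or convex-hull analysis would then characterize that region geometrically.
```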