Estimation


Kinds of Estimation

  • abundance estimation
  • accurate estimation
  • age estimation
  • approximate estimation
  • bayesian estimation
  • channel estimation
  • component estimation
  • consistent estimation
  • cost estimation
  • curve estimation
  • density estimation
  • direct estimation
  • distance estimation
  • efficient estimation
  • empirical estimation
  • energy estimation
  • error estimation
  • fault estimation
  • flux estimation
  • frequency estimation
  • function estimation
  • good estimation
  • gradient estimation
  • improved estimation
  • interval estimation
  • joint estimation
  • kernel estimation
  • kinetic parameter estimation
  • least-square estimation
  • likelihood estimation
  • mass estimation
  • maximum likelihood estimation
  • mean estimation
  • model estimation
  • model parameter estimation
  • moment estimation
  • non-parametric estimation
  • noninvasive estimation
  • nonparametric estimation
  • numerical estimation
  • online estimation
  • parameter estimation
  • parametric estimation
  • performance estimation
  • point estimation
  • population estimation
  • posteriori error estimation
  • probability estimation
  • quantitative estimation
  • rapid estimation
  • ratio estimation
  • regression estimation
  • reliable estimation
  • risk estimation
  • robust estimation
  • rough estimation
  • sample size estimation
  • semiparametric estimation
  • simultaneous estimation
  • size estimation
  • speed estimation
  • square estimation
  • state estimation
  • statistical estimation
  • structural estimation
  • threshold estimation
  • time estimation
  • trend estimation
  • unbiased estimation
  • uncertainty estimation
  • value estimation
  • variable estimation
  • variance estimation
  • visual estimation
  • weight estimation

Terms modified by Estimation

  • estimation accuracy
  • estimation algorithm
  • estimation algorithms
  • estimation approach
  • estimation bias
  • estimation equation
  • estimation error
  • estimation error covariance
  • estimation method
  • estimation methods
  • estimation model
  • estimation problem
  • estimation procedure
  • estimation process
  • estimation result
  • estimation scheme
  • estimation strategy
  • estimation task
  • estimation technique
  • estimation techniques
  • estimation tool
  • estimation uncertainty

Selected Abstracts


    LANGUAGE-RELATED DIFFERENCES IN ENVIRONMENTAL BENEFITS ESTIMATION: EVIDENCE FROM A MAIL SURVEY

    CONTEMPORARY ECONOMIC POLICY, Issue 1 2008
    XIAOLIN REN
    In contingent valuation studies, failing to accommodate populations with limited language skills might yield biased estimates. In the United States, there are many residents primarily fluent in Spanish. This study uses conditional logit models applied to data from a bilingual (English and Spanish) conjoint choice mail survey to evaluate the effects of language proficiency on estimates of the economic benefits of contaminated site cleanup. Results indicate that language does have significant effects on welfare estimates. The results suggest that mail surveys addressing environmental issues that may affect a linguistically diverse population should be designed at the outset with multiple languages in mind. (JEL Q51, J19) [source]
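The conditional logit model used in this study can be sketched numerically. The block below simulates conjoint-style choice data (one attribute, three alternatives per choice set; all numbers are illustrative assumptions, not the study's survey data) and recovers the taste coefficient by maximizing the conditional-logit likelihood over a grid.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated conjoint-style data: 2000 choice sets, 3 alternatives,
# one attribute; true taste coefficient beta = 1.0 (illustrative only).
n_sets, n_alt, beta_true = 2000, 3, 1.0
x = rng.normal(size=(n_sets, n_alt))
# Gumbel utility shocks generate exactly logit choice probabilities.
util = beta_true * x + rng.gumbel(size=(n_sets, n_alt))
choice = util.argmax(axis=1)

def loglik(beta):
    v = beta * x
    # conditional-logit choice probabilities via a stabilized softmax
    p = np.exp(v - v.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)
    return np.log(p[np.arange(n_sets), choice]).sum()

# One-parameter model: maximize the likelihood over a grid of beta values
grid = np.linspace(0.0, 2.0, 201)
beta_hat = grid[np.argmax([loglik(b) for b in grid])]
```

With this much data the grid maximizer lands close to the true coefficient; a real application would also include language-group interactions and covariates.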


    THE INTERACTION OF ANTISOCIAL PROPENSITY AND LIFE-COURSE VARYING PREDICTORS OF DELINQUENT BEHAVIOR: DIFFERENCES BY METHOD OF ESTIMATION AND IMPLICATIONS FOR THEORY

    CRIMINOLOGY, Issue 2 2007
    GRAHAM C. OUSEY
    Recent criminological research has explored the extent to which stable propensity and life-course perspectives may be integrated to provide a more comprehensive explanation of variation in individual criminal offending. One line of these integrative efforts focuses on the ways that stable individual characteristics may interact with, or modify, the effects of life-course varying social factors. Given their consistency with the long-standing view that person-environment interactions contribute to variation in human social behavior, these theoretical integration attempts have great intuitive appeal. However, a review of past criminological research suggests that conceptual and empirical complexities have, so far, somewhat dampened the development of a coherent theoretical understanding of the nature of interaction effects between stable individual antisocial propensity and time-varying social variables. In this study, we outline and empirically assess several of the sometimes conflicting hypotheses regarding the ways that antisocial propensity moderates the influence of time-varying social factors on delinquent offending. Unlike some prior studies, however, we explicitly measure the interactive effects of stable antisocial propensity and time-varying measures of selected social variables on changes in delinquent offending. In addition, drawing on recent research suggesting that the relative ubiquity of interaction effects in past studies may stem partly from the poorly suited application of linear statistical models to delinquency data, we alternatively test our interaction hypotheses using least-squares and tobit estimation frameworks. Our findings suggest that the method of estimation matters, with interaction effects appearing readily in the former but not in the latter. The implications of these findings for future conceptual and empirical work on interaction effects between stable propensity and time-varying social variables are discussed. [source]


    THE GRAVITY MODEL: AN ILLUSTRATION OF STRUCTURAL ESTIMATION AS CALIBRATION

    ECONOMIC INQUIRY, Issue 4 2008
    EDWARD J. BALISTRERI
    Dawkins, Srinivasan, and Whalley ("Calibration," Handbook of Econometrics, 2001) propose that estimation is calibration. We illustrate their point by examining a leading econometric application in the study of international and interregional trade by Anderson and van Wincoop ("Gravity with Gravitas: A Solution to the Border Puzzle," American Economic Review, 2003). We replicate the econometric process and show it to be a calibration of a general equilibrium model. Our approach offers unique insights into structural estimation, and we highlight the importance of traditional calibration considerations when one uses econometric techniques to calibrate a model for comparative policy analysis. (JEL F10, C13, C60) [source]


    SEMINONPARAMETRIC MAXIMUM LIKELIHOOD ESTIMATION OF CONDITIONAL MOMENT RESTRICTION MODELS

    INTERNATIONAL ECONOMIC REVIEW, Issue 4 2007
    Chunrong Ai
    This article studies estimation of a conditional moment restriction model with the seminonparametric maximum likelihood approach proposed by Gallant and Nychka (Econometrica 55 (March 1987), 363-90). Under some sufficient conditions, we show that the estimator of the finite-dimensional parameter is asymptotically normally distributed and attains the semiparametric efficiency bound, and that the estimator of the density function is consistent under the L2 norm. Some results on the convergence rate of the estimated density function are derived. An easy-to-compute covariance matrix for the asymptotic covariance of the parameter estimator is presented. [source]


    PAIRWISE DIFFERENCE ESTIMATION WITH NONPARAMETRIC CONTROL VARIABLES

    INTERNATIONAL ECONOMIC REVIEW, Issue 4 2007
    Andres Aradillas-Lopez
    This article extends the pairwise difference estimators for various semilinear limited dependent variable models proposed by Honoré and Powell (Identification and Inference in Econometric Models: Essays in Honor of Thomas Rothenberg. Cambridge: Cambridge University Press, 2005) to permit the regressor appearing in the nonparametric component to itself depend upon a conditional expectation that is nonparametrically estimated. This permits the estimation approach to be applied to nonlinear models with sample selectivity and/or endogeneity, in which a "control variable" for selectivity or endogeneity is nonparametrically estimated. We develop the relevant asymptotic theory for the proposed estimators and illustrate it by deriving the asymptotic distribution of the estimator for the partially linear logit model. [source]


    ESTIMATION AND HYPOTHESIS TESTING FOR NONPARAMETRIC HEDONIC HOUSE PRICE FUNCTIONS

    JOURNAL OF REGIONAL SCIENCE, Issue 3 2010
    Daniel P. McMillen
    ABSTRACT In contrast to the rigid structure of standard parametric hedonic analysis, nonparametric estimators control for misspecified spatial effects while using highly flexible functional forms. Despite these advantages, nonparametric procedures are still not used extensively for spatial data analysis due to perceived difficulties associated with estimation and hypothesis testing. We demonstrate that nonparametric estimation is feasible for large datasets with many independent variables, offering statistical tests of individual covariates and tests of model specification. We show that fixed parameterization of distance to the nearest rapid transit line is a misspecification and that pricing of access to this amenity varies across neighborhoods within Chicago. [source]


    ESTIMATION OF HEDONIC RESPONSES FROM DESCRIPTIVE SKIN SENSORY DATA BY CHI-SQUARE MINIMIZATION

    JOURNAL OF SENSORY STUDIES, Issue 1 2006
    I.F. ALMEIDA
    ABSTRACT Six topical formulations were evaluated by a trained panel according to a descriptive analysis methodology and by a group of consumers who rated the products on a hedonic scale. We present a new approach that describes the categorical appreciation of appearance, texture and skinfeel of the formulations by the consumers as a function of related sensory attributes assessed by the trained panel. For each hedonic attribute, a latent random variable depending on the sensory attributes is constructed and made discrete (in a nonlinear fashion) according to the distribution of consumer-hedonic scores in such a way as to minimize a corresponding chi-square criterion. Standard partial least squares (PLS) regression, bootstrapping and cross-validation techniques describing the overall liking of the hedonic attributes as a function of associated sensory attributes were also applied. Results from both methods were compared, and it was concluded that chi-square minimization can work as a complementary method to the PLS regression. [source]


    SENSORY SHELF-LIFE ESTIMATION OF ALFAJOR BY SURVIVAL ANALYSIS

    JOURNAL OF SENSORY STUDIES, Issue 6 2004
    ADRIANA GÁMBARO
    ABSTRACT Survival analysis methodology was used to estimate the shelf life of alfajor (a chocolate-coated, individually wrapped cake) at 20°C and 35°C by using results obtained from consumers when asked whether they would accept or reject samples with different storage times. Sensory acceptability (measured by consumers), off-flavor (measured by a trained panel) and moisture content were linearly related to time. These correlations were used to estimate values at the shelf-life times calculated for 25% and 50% rejection probability. Survival analysis provided the following shelf-life estimates: 74 days at 20°C and 33 days at 35°C for 25% rejection, and 87 days at 20°C and 39 days at 35°C for 50% rejection. An alfajor stored at 20°C having an acceptability value below 4.9 (1-9 hedonic scale) and off-flavor intensity above 5.3 (0-10 scale) would be rejected by 25% of the consumers. Chemical data were not good shelf-life predictors. [source]
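The inversion step (finding the storage time at which a chosen fraction of consumers would reject the product) can be sketched with a simplified stand-in for the paper's survival analysis: a logistic model of rejection probability versus storage time, fitted by Newton's method and inverted at 25% and 50%. The accept/reject counts below are invented for illustration.

```python
import numpy as np

# Hypothetical accept/reject data: storage days, panel size, rejections
# (illustrative numbers, not the study's data).
days = np.array([0, 20, 40, 60, 80, 100], float)
n = np.array([50, 50, 50, 50, 50, 50], float)
rej = np.array([1, 4, 10, 20, 33, 44], float)

# Fit P(reject | t) = 1 / (1 + exp(-(a + b*t))) by Newton-Raphson
X = np.column_stack([np.ones_like(days), days])
beta = np.zeros(2)
for _ in range(25):
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    W = n * p * (1 - p)                 # binomial weights
    grad = X.T @ (rej - n * p)          # score
    H = X.T @ (W[:, None] * X)          # information matrix
    beta += np.linalg.solve(H, grad)

def shelf_life(q):
    # storage time at which the rejection probability reaches q
    return (np.log(q / (1 - q)) - beta[0]) / beta[1]

t25, t50 = shelf_life(0.25), shelf_life(0.50)
```

The 25%-rejection shelf life is necessarily shorter than the 50% one when rejection grows with storage time, mirroring the 74-day versus 87-day estimates reported at 20°C.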


    GEOSTATISTICAL ESTIMATION OF HORIZONTAL HYDRAULIC CONDUCTIVITY FOR THE KIRKWOOD-COHANSEY AQUIFER

    JOURNAL OF THE AMERICAN WATER RESOURCES ASSOCIATION, Issue 1 2004
    Vikram M. Vyas
    ABSTRACT: The Kirkwood-Cohansey aquifer has been identified as a critical source for meeting existing and expected water supply needs for southern New Jersey. Several contaminated sites exist in the region; their impact on the aquifer has to be evaluated using ground water flow and transport models. Ground water modeling depends on the availability of measured hydrogeologic data (e.g. hydraulic conductivity) for parameterization of the modeling runs. However, field measurements of such critical data have inadequate spatial density, and their locations are often clustered. The goal of this study was to research, compile, and geocode existing data, then use geostatistics and advanced mapping methods to develop a map of horizontal hydraulic conductivity for the Kirkwood-Cohansey aquifer. Spatial interpolation of horizontal hydraulic conductivity measurements was performed using the Bayesian Maximum Entropy (BME) method implemented in the BMELib code library. This involved the integration of actual measurements with soft information on likely ranges of hydraulic conductivity at a given location to obtain estimate maps. The estimation error variance maps provide insight into the uncertainty associated with the estimates, and indicate areas where more information on hydraulic conductivity is required. [source]


    ARTIFICIAL NEURAL NETWORK MODELING FOR REFORESTATION DESIGN THROUGH THE DOMINANT TREES BOLE-VOLUME ESTIMATION

    NATURAL RESOURCE MODELING, Issue 4 2009
    MARIA J. DIAMANTOPOULOU
    Abstract In the management of restoration or recreational reforestations, the density of the planted trees and the site conditions can influence the growth and bole volume of the dominant tree. The ability to influence growth of these trees in a reforestation contributes greatly to the formation of large-dimension trees and thereby to the production of commercially valuable wood. The potential of two artificial neural network (ANN) architectures in modeling the dominant Pinus brutia tree bole volume in reforestation configuration at 12 years of age was investigated: (1) the multilayer perceptron architecture using a back-propagation algorithm and (2) the cascade-correlation architecture, utilizing either (a) the nonlinear Kalman's filter theory or (b) the adaptive gradient descent learning rule. The incentive for developing bole-volume equations using ANN techniques was to demonstrate an alternative new methodology in the field of reforestation design, which would enable estimation and optimization of the bole volume of dominant trees in reforestations using easily measurable site and competition factors. The use of ANNs for the estimation of dominant tree bole volume through site and competition factors can be a very useful tool in forest management practice. [source]
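The first architecture (a multilayer perceptron trained by back-propagation) can be sketched in a few lines. The network, predictors, and response below are toy assumptions standing in for the study's site and competition factors, not its actual models or data; the point is only the mechanics of forward pass, backward pass, and gradient descent.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy regression: a synthetic "bole volume" response from two predictors.
X = rng.uniform(-1, 1, size=(200, 2))
y = (0.5 * X[:, 0] + np.sin(2 * X[:, 1]))[:, None]

# One hidden layer of 8 tanh units (an arbitrary illustrative choice).
W1 = rng.normal(scale=0.5, size=(2, 8)); b1 = np.zeros((1, 8))
W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros((1, 1))

def forward(X):
    H = np.tanh(X @ W1 + b1)
    return H, H @ W2 + b2

_, out0 = forward(X)
loss0 = np.mean((out0 - y) ** 2)        # mean squared error before training

lr = 0.05
for _ in range(2000):
    H, out = forward(X)
    g_out = 2 * (out - y) / len(X)      # dLoss/dOutput
    gW2 = H.T @ g_out; gb2 = g_out.sum(0, keepdims=True)
    g_h = (g_out @ W2.T) * (1 - H**2)   # back-propagate through tanh
    gW1 = X.T @ g_h; gb1 = g_h.sum(0, keepdims=True)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

_, out1 = forward(X)
loss1 = np.mean((out1 - y) ** 2)        # mean squared error after training
```

Full-batch gradient descent steadily reduces the training error on this smooth target; the cascade-correlation variants in the abstract instead grow the hidden layer incrementally.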


    EXPONENTIAL DURATION: A MORE ACCURATE ESTIMATION OF INTEREST RATE RISK

    THE JOURNAL OF FINANCIAL RESEARCH, Issue 3 2005
    Miles Livingston
    Abstract We develop a new method to estimate the interest rate risk of an asset. This method is based on modified duration and is always more accurate than traditional estimation with modified duration. The estimates by this method are close to estimates using traditional duration plus convexity when interest rates decrease. If interest rates rise, investors will suffer larger value declines than predicted by the traditional duration-plus-convexity estimate. The new method avoids this undesirable value overestimation and provides an estimate slightly below the true value. For risk-averse investors, overestimation of value declines is more desirable and conservative. [source]
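The claimed behavior can be checked numerically. Below, "exponential duration" is taken to mean the estimate P0·exp(-D·Δy), compared against the first-order duration estimate and the duration-plus-convexity estimate for an illustrative annual-coupon bond (the bond, yields, and finite-difference sensitivities are assumptions for this sketch, not the paper's derivation).

```python
import numpy as np

def bond_price(c, y, n, face=100.0):
    # price of an annual-coupon bond with coupon rate c, yield y, n years
    t = np.arange(1, n + 1)
    return c * face * np.sum((1 + y) ** -t) + face * (1 + y) ** -n

c, y0, n = 0.06, 0.05, 10
P0 = bond_price(c, y0, n)

# Modified duration D and convexity C via small finite differences
h = 1e-4
D = -(bond_price(c, y0 + h, n) - bond_price(c, y0 - h, n)) / (2 * h * P0)
C = (bond_price(c, y0 + h, n) - 2 * P0 + bond_price(c, y0 - h, n)) / (h * h * P0)

dy = 0.01                                   # rates rise 100 basis points
true_P = bond_price(c, y0 + dy, n)
dur_est = P0 * (1 - D * dy)                 # traditional duration estimate
dur_conv_est = P0 * (1 - D * dy + 0.5 * C * dy**2)  # duration + convexity
exp_dur_est = P0 * np.exp(-D * dy)          # "exponential duration" estimate
```

For this bond the exponential-duration estimate is closer to the true price than plain duration, sits slightly below the true price when rates rise, and the duration-plus-convexity estimate overshoots it, consistent with the abstract's claims.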


    TWO-STEP EMPIRICAL LIKELIHOOD ESTIMATION UNDER STRATIFIED SAMPLING WHEN AGGREGATE INFORMATION IS AVAILABLE

    THE MANCHESTER SCHOOL, Issue 5 2006
    ESMERALDA A. RAMALHO
    Empirical likelihood is appropriate to estimate moment condition models when a random sample from the target population is available. However, many economic surveys are subject to some form of stratification, in which case direct application of empirical likelihood will produce inconsistent estimators. In this paper we propose a two-step empirical likelihood estimator to deal with stratified samples in models defined by unconditional moment restrictions in the presence of some aggregate information such as the mean and the variance of the variable of interest. A Monte Carlo simulation study reveals promising results for many versions of the two-step empirical likelihood estimator. [source]


    OPTIMAL AND ADAPTIVE SEMI-PARAMETRIC NARROWBAND AND BROADBAND AND MAXIMUM LIKELIHOOD ESTIMATION OF THE LONG-MEMORY PARAMETER FOR REAL EXCHANGE RATES

    THE MANCHESTER SCHOOL, Issue 2 2005
    SAEED HERAVI
    The nature of the time series properties of real exchange rates remains a contentious issue, primarily because of the implications for purchasing power parity. In particular, are real exchange rates best characterized as stationary and non-persistent; nonstationary but non-persistent; or nonstationary and persistent? Most assessments of this issue use the I(0)/I(1) paradigm, which only allows the first and last of these options. In contrast, in the I(d) paradigm, with d fractional, all three are possible, with the crucial parameter d determining the long-run properties of the process. This study includes estimation of d by three methods of semi-parametric estimation in the frequency domain, using both local and global (Fourier) frequency estimation, and maximum likelihood estimation of ARFIMA models in the time domain. We give a transparent assessment of the key selection parameters in each method, particularly estimation of the truncation parameters for the semi-parametric methods. Two other important developments are also included. We implement Tanaka's locally best invariant parametric tests based on maximum likelihood estimation of the long-memory parameter, and include a recent extension of the Dickey-Fuller approach to fractionally integrated series, referred to as fractional Dickey-Fuller (FD-F), which allows a much wider range of generating processes under the alternative hypothesis. With this more general approach, we find very little evidence of stationarity for 10 real exchange rates for developed countries, some very limited evidence of nonstationarity but non-persistence, and none of the FD-F tests leads to rejection of the null of a unit root. [source]


    MINERALOGY OF MEDIEVAL SLAGS FROM LEAD AND SILVER SMELTING (BOHUTÍN, PŘÍBRAM DISTRICT, CZECH REPUBLIC): TOWARDS ESTIMATION OF HISTORICAL SMELTING CONDITIONS

    ARCHAEOMETRY, Issue 6 2009
    V. ETTLER
    Slags from the medieval (14th-century) Pb/Ag smelting plant located at Bohutín, Příbram district, Czech Republic, were studied from the mineralogical and geochemical points of view. Two types of slags were distinguished: (i) quenched slags formed mainly by Pb-rich glass and unmelted residual grains of SiO2 and feldspars, and (ii) crystallized slags mainly composed of Fe-rich olivine (fayalite) and glass. The mean log viscosity of the slags calculated for 1200°C was 2.119 (viscosity in Pa s). The morphology of olivine crystals was used to estimate the cooling rates of the melt, for some slags indicating rates > 1450°C/h. The projection of the bulk composition of slags onto the SiO2-PbO-FeO ternary system was used for rough temperature estimates of slag formation, lying probably between 800 and 1200°C. [source]


    SAMPLING AND ESTIMATION IN THE PRESENCE OF CUT-OFF SAMPLING

    AUSTRALIAN & NEW ZEALAND JOURNAL OF STATISTICS, Issue 3 2010
    David Haziza
    Summary Cut-off sampling consists of deliberately excluding a set of units from possible selection in a sample, for example if the contribution of the excluded units to the total is small or if the inclusion of these units in the sample involves high costs. If the characteristics of interest of the excluded units differ from those of the rest of the population, the use of naïve estimators may result in highly biased estimates. In this paper, we discuss the use of auxiliary information to reduce the bias by means of calibration and balanced sampling. We show that the use of the available auxiliary information related to both the variable of interest and the probability of being excluded enables us to reduce the potential bias. A short numerical study supports our findings. [source]
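The bias-reduction idea can be illustrated with a small simulation: when small units are cut off from the frame, a naive expansion estimator of the total is biased, while a calibration (ratio) estimator that exploits the known population total of an auxiliary variable largely removes the bias. The population, cut-off rule, and model below are illustrative assumptions, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical population: y roughly proportional to an auxiliary x.
N = 1000
x = rng.lognormal(mean=0.0, sigma=1.0, size=N)
y = 2.0 * x * rng.normal(1.0, 0.05, size=N)
true_total = y.sum()

# Cut-off sampling: the smallest 40% of units (by x) are excluded outright.
cutoff = np.quantile(x, 0.4)
frame = np.where(x >= cutoff)[0]

# Simple random sample of n units from the reduced frame.
n = 100
s = rng.choice(frame, size=n, replace=False)

# Naive expansion estimator ignores the exclusion -> biased upward here,
# because the sampled units are systematically larger than average.
naive = N * y[s].mean()

# Calibration (ratio) estimator uses the known population x-total,
# which covers the excluded units as well.
ratio = y[s].sum() / x[s].sum() * x.sum()
```

Because y is nearly proportional to x, calibrating on the full-population x-total corrects for the excluded small units, exactly the kind of auxiliary-information repair the abstract describes.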


    ROBUST ESTIMATION OF SMALL-AREA MEANS AND QUANTILES

    AUSTRALIAN & NEW ZEALAND JOURNAL OF STATISTICS, Issue 2 2010
    Nikos Tzavidis
    Summary Small-area estimation techniques have typically relied on plug-in estimation based on models containing random area effects. More recently, regression M-quantiles have been suggested for this purpose, thus avoiding conventional Gaussian assumptions, as well as problems associated with the specification of random effects. However, the plug-in M-quantile estimator for the small-area mean can be shown to be the expected value of this mean with respect to a generally biased estimator of the small-area cumulative distribution function of the characteristic of interest. To correct this problem, we propose a general framework for robust small-area estimation, based on representing a small-area estimator as a functional of a predictor of this small-area cumulative distribution function. Key advantages of this framework are that it naturally leads to integrated estimation of small-area means and quantiles and is not restricted to M-quantile models. We also discuss mean squared error estimation for the resulting estimators, and demonstrate the advantages of our approach through model-based and design-based simulations, with the latter using economic data collected in an Australian farm survey. [source]


    ERROR VARIANCE ESTIMATION FOR THE SINGLE-INDEX MODEL

    AUSTRALIAN & NEW ZEALAND JOURNAL OF STATISTICS, Issue 2 2010
    K. B. Kulasekera
    Summary Single-index models provide one way of reducing the dimension in regression analysis. The statistical literature has focused mainly on estimating the index coefficients, the mean function, and their asymptotic properties. For accurate statistical inference it is equally important to estimate the error variance of these models. We examine two estimators of the error variance in a single-index model and compare them with a few competing estimators with respect to their corresponding asymptotic properties. Using a simulation study, we evaluate the finite-sample performance of our estimators against their competitors. [source]
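One classical route to error-variance estimation in this setting is a difference-based estimator: successive differences of responses ordered by the (estimated) index cancel the smooth mean function, leaving the error variance. The univariate sketch below uses the Rice first-difference estimator on simulated data; it is an illustration of the general idea, not the specific estimators studied in the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated data y = m(x) + error with error standard deviation 0.5,
# so the true error variance is 0.25 (illustrative setup).
n = 2000
x = np.sort(rng.uniform(0.0, 1.0, size=n))
y = np.sin(2 * np.pi * x) + rng.normal(0.0, 0.5, size=n)

# Rice difference-based estimator: for smooth m and dense design,
# E[(y_{i+1} - y_i)^2] ~ 2 * sigma^2.
sigma2_hat = np.sum(np.diff(y) ** 2) / (2 * (n - 1))
```

With a dense design the contribution of the smooth mean to the differences is negligible, so the estimator is close to the true variance 0.25 without ever estimating the mean function.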


    ESTIMATION OF THE NUMBER OF PEOPLE IN A DEMONSTRATION

    AUSTRALIAN & NEW ZEALAND JOURNAL OF STATISTICS, Issue 1 2010
    Paul S. F. Yip
    Summary It is of interest to estimate the size of a crowd in a demonstration. We propose a practical method to obtain an estimate of the size of the crowd and its standard error. This method has been implemented in practice and, compared with other counting methods, is found to be more efficient, more timely and have less scope for bias. The method described in this paper was motivated by the annual 1 July demonstrations in Hong Kong, and data from the 2006 demonstration are used as an example of the proposed method. [source]


    NONPARAMETRIC ESTIMATION OF CONDITIONAL CUMULATIVE HAZARDS FOR MISSING POPULATION MARKS

    AUSTRALIAN & NEW ZEALAND JOURNAL OF STATISTICS, Issue 1 2010
    Dipankar Bandyopadhyay
    Summary A new function for the competing risks model, the conditional cumulative hazard function, is introduced, from which the conditional distribution of failure times of individuals failing due to cause j can be studied. The standard Nelson-Aalen estimator is not appropriate in this setting, as population membership (mark) information may be missing for some individuals owing to random right-censoring. We propose the use of imputed population marks for the censored individuals through fractional risk sets. Some asymptotic properties, including uniform strong consistency, are established. We study the practical performance of this estimator through simulation studies and apply it to a real data set for illustration. [source]
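The building block being modified here is the standard Nelson-Aalen estimator of the cumulative hazard, which sums, over event times, the number of failures divided by the number at risk. A minimal sketch on a tiny hypothetical sample (the paper's contribution, fractional risk sets for missing marks, is not implemented here):

```python
import numpy as np

# Tiny hypothetical sample: follow-up time and event indicator
# (1 = failure observed, 0 = right-censored).
time  = np.array([1.0, 2.0, 2.0, 3.0, 4.0])
event = np.array([1,   1,   0,   1,   1])

def nelson_aalen(time, event):
    """Nelson-Aalen cumulative hazard evaluated at each observed event time."""
    t_evt = np.unique(time[event == 1])
    H, cum = [], 0.0
    for t in t_evt:
        at_risk = np.sum(time >= t)               # risk set just before t
        d = np.sum((time == t) & (event == 1))    # failures at t
        cum += d / at_risk
        H.append((t, cum))
    return H

H = nelson_aalen(time, event)   # [(1, 0.2), (2, 0.45), (3, 0.95), (4, 1.95)]
```

The estimate is a nondecreasing step function; with missing cause-of-failure marks, the counts d would be replaced by fractional contributions, as the abstract proposes.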


    KERNEL DENSITY ESTIMATION WITH MISSING DATA AND AUXILIARY VARIABLES

    AUSTRALIAN & NEW ZEALAND JOURNAL OF STATISTICS, Issue 3 2009
    Suzanne R. Dubnicka
    Summary In most parametric statistical analyses, knowledge of the distribution of the response variable, or of the errors, is important. As this distribution is not typically known with certainty, one might initially construct a histogram or estimate the density of the variable of interest to gain insight regarding the distribution and its characteristics. However, when the response variable is incomplete, a histogram will only provide a representation of the distribution of the observed data. In the AIDS Clinical Trial Study protocol 175, interest lies in the difference in CD4 counts from baseline to final follow-up, but CD4 counts collected at final follow-up were incomplete. A method is therefore proposed for estimating the density of an incomplete response variable when auxiliary data are available. The proposed estimator is based on the Horvitz-Thompson estimator, and the propensity scores are estimated nonparametrically. Simulation studies indicate that the proposed estimator performs well. [source]
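The weighting idea can be sketched as an inverse-propensity-weighted kernel density estimate: complete cases are reweighted Horvitz-Thompson style by the inverse of their response probabilities. For simplicity the sketch plugs in the true propensities (the paper estimates them nonparametrically from auxiliary data); the data-generating model is an illustrative assumption.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical setting: X ~ N(0,1) is the response; an auxiliary variable
# correlated with X drives missingness, so complete cases are a biased sample.
n = 2000
x = rng.normal(size=n)
aux = 0.5 * x + rng.normal(scale=1.0, size=n)   # auxiliary, always observed
p_obs = 1.0 / (1.0 + np.exp(-(0.5 + aux)))      # response propensity
seen = rng.uniform(size=n) < p_obs

w = 1.0 / p_obs[seen]                           # inverse-propensity weights
xs = x[seen]

# Weighted Gaussian kernel density estimate on a grid
h = 0.3
grid = np.linspace(-4, 4, 401)
K = np.exp(-0.5 * ((grid[:, None] - xs[None, :]) / h) ** 2)
K /= np.sqrt(2 * np.pi) * h
f_hat = (K * w[None, :]).sum(axis=1) / w.sum()
```

Normalizing by the sum of weights keeps the estimate a proper density, and the reweighting pulls the mode back toward zero even though larger values of X are over-represented among the complete cases.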


    VARIANCE ESTIMATION IN TWO-PHASE SAMPLING

    AUSTRALIAN & NEW ZEALAND JOURNAL OF STATISTICS, Issue 2 2009
    M.A. Hidiroglou
    Summary Two-phase sampling is often used for estimating a population total or mean when the cost per unit of collecting auxiliary variables, x, is much smaller than the cost per unit of measuring a characteristic of interest, y. In the first phase, a large sample s1 is drawn according to a specific sampling design p(s1), and auxiliary data x are observed for the units i ∈ s1. Given the first-phase sample s1, a second-phase sample s2 is selected from s1 according to a specified sampling design p(s2 | s1), and (y, x) is observed for the units i ∈ s2. In some cases, the population totals of some components of x may also be known. Two-phase sampling is used for stratification at the second phase or both phases and for regression estimation. Horvitz-Thompson-type variance estimators are used for variance estimation. However, the Horvitz-Thompson (Horvitz & Thompson, J. Amer. Statist. Assoc. 1952) variance estimator in uni-phase sampling is known to be highly unstable and may take negative values when the units are selected with unequal probabilities. On the other hand, the Sen-Yates-Grundy variance estimator is relatively stable and non-negative for several unequal probability sampling designs with fixed sample sizes. In this paper, we extend the Sen-Yates-Grundy (Sen, J. Ind. Soc. Agric. Statist. 1953; Yates & Grundy, J. Roy. Statist. Soc. Ser. B 1953) variance estimator to two-phase sampling, assuming a fixed first-phase sample size and a fixed second-phase sample size given the first-phase sample. We apply the new variance estimators to two-phase sampling designs with stratification at the second phase or both phases. We also develop Sen-Yates-Grundy-type variance estimators of the two-phase regression estimators that make use of the first-phase auxiliary data and known population totals of some of the auxiliary variables. [source]
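A useful single-phase sanity check on the Sen-Yates-Grundy form: under simple random sampling without replacement with fixed sample size, it reproduces the textbook variance estimator of the Horvitz-Thompson total exactly. The sketch below verifies this identity numerically on an arbitrary simulated population (the two-phase extension in the paper is not implemented).

```python
import numpy as np

rng = np.random.default_rng(4)

# SRSWOR from a hypothetical population of size N, sample size n.
N, n = 50, 10
y_pop = rng.gamma(2.0, 5.0, size=N)
s = rng.choice(N, size=n, replace=False)
y = y_pop[s]

# Inclusion probabilities for SRSWOR with fixed n.
pi = n / N
pi_ij = n * (n - 1) / (N * (N - 1))

# Sen-Yates-Grundy estimator: sum over unordered sample pairs
# (each ordered pair counted once via the 0.5 factor).
v_syg = 0.0
for i in range(n):
    for j in range(n):
        if i != j:
            v_syg += 0.5 * (pi * pi - pi_ij) / pi_ij * (y[i] / pi - y[j] / pi) ** 2

# Textbook variance estimator of the HT total under SRSWOR.
v_textbook = N**2 * (1 - n / N) * y.var(ddof=1) / n
```

The two quantities agree to floating-point precision, which is a handy check before extending the pairwise form to unequal-probability or two-phase designs.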


    ROBUST ESTIMATION IN PARAMETRIC TIME SERIES MODELS UNDER LONG- AND SHORT-RANGE-DEPENDENT STRUCTURES

    AUSTRALIAN & NEW ZEALAND JOURNAL OF STATISTICS, Issue 2 2009
    Jiti Gao
    Summary This paper studies the asymptotic behaviour of an M-estimator of regression parameters in the linear model when the design variables are either stationary short-range dependent (SRD) mixing variables or long-range dependent (LRD), and the errors are LRD. The weak consistency and the asymptotic distributions of the M-estimator are established. We present some simulated examples to illustrate the efficiency of the proposed M-estimation method. [source]
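The M-estimation machinery itself is easy to sketch in the simplest case: a Huber M-estimator of location computed by iteratively reweighted least squares, which downweights gross outliers that would wreck the sample mean. The data (i.i.d. with 10% contamination) are an illustrative assumption and do not reproduce the paper's dependent-design setting.

```python
import numpy as np

rng = np.random.default_rng(5)

def huber_weight(r, k=1.345):
    # weight 1 for small residuals, k/|r| beyond the tuning constant k
    a = np.abs(r)
    return np.where(a <= k, 1.0, k / np.maximum(a, 1e-12))

# Location estimation with 10% gross outliers (illustrative data only).
x = rng.normal(5.0, 1.0, size=200)
x[:20] = 50.0                        # contamination

mu = np.median(x)                    # robust starting value
for _ in range(100):
    w = huber_weight(x - mu)
    mu_new = np.sum(w * x) / np.sum(w)
    if abs(mu_new - mu) < 1e-10:
        break
    mu = mu_new
```

The outliers at 50 receive weights near k/45, so the M-estimate stays near the true center 5 while the sample mean is dragged to about 9.5; in a regression setting the same reweighting is applied to residuals from the linear fit.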


    ESTIMATION, PREDICTION AND INFERENCE FOR THE LASSO RANDOM EFFECTS MODEL

    AUSTRALIAN & NEW ZEALAND JOURNAL OF STATISTICS, Issue 1 2009
    Scott D. Foster
    Summary The least absolute shrinkage and selection operator (LASSO) can be formulated as a random effects model with an associated variance parameter that can be estimated with other components of variance. In this paper, estimation of the variance parameters is performed by means of an approximation to the marginal likelihood of the observed outcomes. The approximation is based on an alternative but equivalent formulation of the LASSO random effects model. Predictions can be made using point summaries of the predictive distribution of the random effects given the data with the parameters set to their estimated values. The standard LASSO method uses the mode of this distribution as the predictor. It is not the only choice, and a number of other possibilities are defined and empirically assessed in this article. The predictive mode is competitive with the predictive mean (best predictor), but no single predictor performs best in all situations. Inference for the LASSO random effects is performed using predictive probability statements, which are more appropriate under the random effects formulation than tests of hypothesis. [source]
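The "predictive mode" of the standard LASSO has a closed form in the orthonormal-design case: each coefficient is the OLS estimate passed through the soft-thresholding operator, which shrinks toward zero and sets small coefficients exactly to zero. A minimal sketch of that building block (the random effects formulation and variance estimation in the paper are not reproduced):

```python
import numpy as np

def soft_threshold(z, lam):
    """Soft-thresholding: the LASSO coefficient for an orthonormal design."""
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

# Illustrative OLS estimates and penalty level (assumed values).
ols = np.array([3.0, -0.4, 0.0, 1.6])
lasso = soft_threshold(ols, lam=1.0)    # -> [2.0, 0.0, 0.0, 0.6]
```

Coefficients smaller in magnitude than the penalty are zeroed out, which is the selection behavior the abstract's random effects formulation recasts as prediction of random effects.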


    MULTIPLE-RECORD SYSTEMS ESTIMATION USING LATENT CLASS MODELS

    AUSTRALIAN & NEW ZEALAND JOURNAL OF STATISTICS, Issue 1 2009
    Yan Wang
    Summary Capture-recapture methods (also referred to as 'multiple-record systems') have been widely used in enumerating human populations in the fields of epidemiology and public health. In this article, we introduce latent class models into multiple-record systems to account for unobserved heterogeneity in the population. Two approaches, the full and the conditional likelihood, are proposed to estimate the unknown population abundance. We also suggest rules to diagnose identifiability of the proposed latent class models. The methodologies are illustrated by two real examples: the first is to count the undercount of homelessness in the Adelaide central business district, and the second concerns the incidence of diabetes in a small Italian town. [source]


    EXACT P-VALUES FOR DISCRETE MODELS OBTAINED BY ESTIMATION AND MAXIMIZATION

    AUSTRALIAN & NEW ZEALAND JOURNAL OF STATISTICS, Issue 4 2008
    Chris J. Lloyd
    Summary In constructing exact tests from discrete data, one must deal with the possible dependence of the P-value on nuisance parameters as well as the discreteness of the sample space. A classical but heavy-handed approach is to maximize over the nuisance parameter. We prove what has previously been understood informally, namely that maximization produces the unique and smallest possible P-value subject to the ordering induced by the underlying test statistic and test validity. On the other hand, allowing for the worst case will be more attractive when the P-value is less dependent on the nuisance parameter. We investigate the extent to which estimating the nuisance parameter under the null reduces this dependence. An approach somewhere between full maximization and estimation is partial maximization, with an appropriate penalty, as introduced by Berger & Boos (1994, P values maximized over a confidence set for the nuisance parameter. J. Amer. Statist. Assoc., 89, 1012-1016). It is argued that estimation followed by maximization is an attractive, but computationally more demanding, alternative to partial maximization. We illustrate the ideas on a range of low-dimensional but important examples for which the alternative methods can be investigated completely numerically. [source]
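The two endpoints of the spectrum (estimation versus maximization) can be computed exactly for a toy two-binomial comparison: the p-value is a function of the common null success probability, which can either be plugged in at its estimate or maximized over a grid. Sample sizes, observed counts, and the test statistic below are illustrative assumptions.

```python
import numpy as np
from math import comb

# Exact one-sided p-value for H0: p1 = p2 with two small binomials.
n1, n2 = 8, 8
x1, x2 = 6, 1                       # hypothetical observed counts
t_obs = x1 / n1 - x2 / n2           # test statistic: difference in proportions

def pmf(k, n, p):
    return comb(n, k) * p**k * (1 - p) ** (n - k)

def pvalue(theta):
    # P_theta(T >= t_obs) under the null, by full enumeration of outcomes
    total = 0.0
    for a in range(n1 + 1):
        for b in range(n2 + 1):
            if a / n1 - b / n2 >= t_obs - 1e-12:
                total += pmf(a, n1, theta) * pmf(b, n2, theta)
    return total

theta_hat = (x1 + x2) / (n1 + n2)                 # estimate under the null
candidates = np.append(np.linspace(0.001, 0.999, 999), theta_hat)
p_max = max(pvalue(th) for th in candidates)      # maximized ("worst case")
p_est = pvalue(theta_hat)                         # "estimation" p-value
```

By construction the maximized p-value is at least as large as the estimated one; the paper's proposal is essentially to estimate first and then maximize in a neighborhood, trading some computation for validity.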


    MOMENT ESTIMATION IN THE CLASS OF BISEXUAL BRANCHING PROCESSES WITH POPULATION-SIZE DEPENDENT MATING

    AUSTRALIAN & NEW ZEALAND JOURNAL OF STATISTICS, Issue 1 2007
    Miguel González
    Summary This paper concerns the estimation of the offspring mean vector, the covariance matrix and the growth rate in the class of bisexual branching processes with population-size dependent mating. For the proposed estimators, some unconditional moments and some conditioned to non-extinction are determined and asymptotic properties are established. Confidence intervals are obtained and, as illustration, a simulation example is given. [source]


    ESTIMATION IN RICKER'S TWO-RELEASE METHOD: A BAYESIAN APPROACH

    AUSTRALIAN & NEW ZEALAND JOURNAL OF STATISTICS, Issue 2 2006
    Shen-Ming Lee
    Summary The Ricker's two-release method is a simplified version of the Jolly-Seber method, from Seber's Estimation of Animal Abundance (1982), used to estimate survival rate and abundance in animal populations. This method assumes there is only a single recapture sample and no immigration, emigration or recruitment. In this paper, we propose a Bayesian analysis for this method to estimate the survival rate and the capture probability, employing Markov chain Monte Carlo methods and a latent variable analysis. The performance of the proposed method is illustrated with a simulation study as well as a real data set. The results show that the proposed method provides favourable inference for the survival rate when compared with the modified maximum likelihood method. [source]


    PLUG-IN ESTIMATION OF GENERAL LEVEL SETS

    AUSTRALIAN & NEW ZEALAND JOURNAL OF STATISTICS, Issue 1 2006
    Antonio Cuevas
    Summary Given an unknown function f (e.g. a probability density, a regression function, ...) and a constant c, the problem of estimating the level set L(c) = {f ≥ c} is considered. This problem is tackled in a very general framework, which allows f to be defined on a metric space different from the usual Euclidean space. Such a degree of generality is motivated by practical considerations and, in fact, an example with astronomical data is analyzed in which the domain of f is the unit sphere. A plug-in approach is followed; that is, L(c) is estimated by Ln(c) = {fn ≥ c}, where fn is an estimator of f. Two results are obtained concerning consistency and convergence rates, with respect to the Hausdorff metric, of the boundaries ∂Ln(c) towards ∂L(c). Also, the consistency of Ln(c) to L(c) is shown, under mild conditions, with respect to the L1 distance. Special attention is paid to the particular case of spherical data. [source]
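The plug-in recipe is direct to implement in the simplest density case: estimate f by a kernel density estimator fn, then take {fn ≥ c} on a grid. The sample, bandwidth, and level below are illustrative assumptions (and the sketch is Euclidean, not the paper's general metric-space setting).

```python
import numpy as np

rng = np.random.default_rng(6)

# Plug-in estimation of L(c) = {f >= c} for a standard normal sample.
x = rng.normal(size=1000)
h = 0.3
grid = np.linspace(-4, 4, 801)

# Gaussian kernel density estimator f_n on the grid
f_n = np.exp(-0.5 * ((grid[:, None] - x[None, :]) / h) ** 2).sum(axis=1)
f_n /= len(x) * h * np.sqrt(2 * np.pi)

c = 0.1
L_hat = grid[f_n >= c]      # plug-in level-set estimate (grid points)
```

For this unimodal target the estimated level set is an interval around the mode (the true set at c = 0.1 is roughly |x| ≤ 1.66), and consistency of fn carries over to the set estimate as the abstract's results formalize.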


    NON-PARAMETRIC ESTIMATION OF DIRECTION IN SINGLE-INDEX MODELS WITH CATEGORICAL PREDICTORS

    AUSTRALIAN & NEW ZEALAND JOURNAL OF STATISTICS, Issue 2 2005
    Xiangrong Yin
    Summary This paper proposes a general dimension-reduction method targeting the partial central subspace recently introduced by Chiaromonte, Cook & Li. The dependence need not be confined to particular conditional moments, nor are restrictions placed on the predictors that are necessary for methods like partial sliced inverse regression. The paper focuses on a partially linear single-index model. However, the underlying idea is applicable more generally. Illustrative examples are presented. [source]


    MAXIMUM LIKELIHOOD ESTIMATION FOR A POISSON RATE PARAMETER WITH MISCLASSIFIED COUNTS

    AUSTRALIAN & NEW ZEALAND JOURNAL OF STATISTICS, Issue 2 2005
    James D. Stamey
    Summary This paper proposes a Poisson-based model that uses both error-free data and error-prone data subject to misclassification in the form of false-negative and false-positive counts. It derives maximum likelihood estimators (MLEs) for the Poisson rate parameter and the two misclassification parameters: the false-negative parameter and the false-positive parameter. It also derives expressions for the information matrix and the asymptotic variances of the MLEs for the rate parameter, the false-positive parameter, and the false-negative parameter. Using these expressions, the paper analyses the value of the fallible data. It studies characteristics of the new double-sampling rate estimator via a simulation experiment and applies the new MLEs and confidence intervals to a real dataset. [source]