Curve Estimation

Selected Abstracts


Estimation of Nonlinear Models with Measurement Error

ECONOMETRICA, Issue 1 2004
Susanne M. Schennach
This paper presents a solution to an important econometric problem, namely the root n consistent estimation of nonlinear models with measurement errors in the explanatory variables, when one repeated observation of each mismeasured regressor is available. While a root n consistent estimator has been derived for polynomial specifications (see Hausman, Ichimura, Newey, and Powell (1991)), such an estimator for general nonlinear specifications has so far not been available. Using the additional information provided by the repeated observation, the suggested estimator separates the measurement error from the "true" value of the regressors thanks to a useful property of the Fourier transform: The Fourier transform converts the integral equations that relate the distribution of the unobserved "true" variables to the observed variables measured with error into algebraic equations. The solution to these equations yields enough information to identify arbitrary moments of the "true," unobserved variables. The value of these moments can then be used to construct any estimator that can be written in terms of moments, including traditional linear and nonlinear least squares estimators, or general extremum estimators. The proposed estimator is shown to admit a representation in terms of an influence function, thus establishing its root n consistency and asymptotic normality. Monte Carlo evidence and an application to Engel curve estimation illustrate the usefulness of this new approach. [source]
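
A minimal numerical sketch of the Fourier-transform device described above, under illustrative assumptions (the first measurement's error independent of the true regressor, the repeated measurement's error mean zero): a ratio of two empirical characteristic-function moments identifies the derivative of the log characteristic function of the unobserved regressor, which integrates to the characteristic function itself; its derivatives at zero then yield the moments. The simulated design and all names below are invented for illustration, not the paper's estimator.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50_000

# Simulated latent regressor with two error-contaminated repeated measurements.
x = rng.gamma(2.0, 1.0, n)                # unobserved "true" values
w1 = x + rng.normal(0.0, 0.5, n)          # mismeasured regressor
w2 = x + rng.normal(0.0, 0.5, n)          # repeated observation

def log_cf_x(t, w1, w2, n_steps=200):
    """Estimate log(phi_x(t)) via the repeated-measurement identity
    d/ds log phi_x(s) = E[i*w2*exp(i*s*w1)] / E[exp(i*s*w1)],
    integrated from 0 to t with a trapezoidal rule."""
    s = np.linspace(0.0, t, n_steps)
    ratio = np.array([np.mean(1j * w2 * np.exp(1j * si * w1)) /
                      np.mean(np.exp(1j * si * w1)) for si in s])
    ds = s[1] - s[0]
    return np.sum((ratio[:-1] + ratio[1:]) / 2) * ds

# E[x] = -i * (log phi_x)'(0), approximated by a small finite difference.
h = 1e-3
m1 = (-1j * log_cf_x(h, w1, w2) / h).real
print(f"estimated E[x] = {m1:.3f}   sample mean of truth = {x.mean():.3f}")
```

Higher-order moments of the unobserved regressor follow in the same way from higher derivatives of the estimated characteristic function at zero.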


Low-cost J-R curve estimation based on CVN upper shelf energy

FATIGUE & FRACTURE OF ENGINEERING MATERIALS AND STRUCTURES, Issue 8 2001
K. Wallin
J-R curve testing is costly and difficult, and the results may sometimes be unreliable. For less demanding structures, J-R curve testing is therefore not practical. The only way to introduce tearing instability analysis for such cases is to estimate the J-R curves indirectly from some simpler test. The Charpy-V notch (CVN) test provides information about the energy needed to fracture a small specimen in half. On the upper shelf, this energy relates to ductile fracture resistance, and it is possible to correlate it to the J-R curve. Here, 112 multispecimen J-R curves from a wide variety of materials were analysed, and a simple power-law-based description of the J-R curves was correlated to the CVN upper shelf (CVNUS) energy. This new correlation corresponds essentially to a 5% lower bound and conforms well with earlier correlations, regardless of the definition of the ductile fracture toughness parameter. [source]
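
A sketch of what such an indirect estimate can look like in code: a power-law J-R curve whose amplitude is driven by the Charpy-V upper shelf energy. The functional form is the generic one named in the abstract, but all coefficient values below are invented placeholders, not Wallin's fitted correlation.

```python
# Illustrative CVN-to-J-R correlation of the kind the paper describes:
# a power-law J-R curve J = A * (da)^m whose amplitude A is driven by
# the Charpy-V upper-shelf energy. The coefficients a0, a1, m are
# made-up placeholders, NOT the paper's fitted constants.
import numpy as np

def jr_curve(delta_a_mm, cvn_us_joules, a0=0.5, a1=1.3, m=0.5):
    """Hypothetical lower-bound J-R curve (kJ/m^2) versus crack
    extension (mm), parameterized by CVN upper-shelf energy (J)."""
    A = a0 * cvn_us_joules**a1        # amplitude grows with CVN energy
    return A * delta_a_mm**m          # power-law tearing resistance

da = np.linspace(0.1, 3.0, 30)        # crack extension, mm
for cvn in (50, 100, 200):            # upper-shelf energies, J
    print(cvn, "J:", jr_curve(da, cvn)[:3].round(1), "...")
```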


Use of longitudinal data in genetic studies in the genome-wide association studies era: summary of Group 14

GENETIC EPIDEMIOLOGY, Issue S1 2009
Berit Kerner
Abstract Participants analyzed actual and simulated longitudinal data from the Framingham Heart Study for various metabolic and cardiovascular traits. The genetic information incorporated into these investigations ranged from selected single-nucleotide polymorphisms to genome-wide association arrays. Genotypes were incorporated using a broad range of methodological approaches, including conditional logistic regression, linear mixed models, generalized estimating equations, linear growth curve estimation, growth modeling, growth mixture modeling, population attributable risk fraction based on survival functions under proportional hazards models, and multivariate adaptive splines for the analysis of longitudinal data. The specific scientific questions addressed by these different approaches also varied, ranging from a more precise definition of the phenotype, bias reduction in control selection, and estimation of effect sizes and genotype-associated risk, to direct incorporation of genetic data into longitudinal modeling approaches and the exploration of population heterogeneity with regard to longitudinal trajectories. The group reached several overall conclusions: (1) The additional information provided by longitudinal data may be useful in genetic analyses. (2) The precision of the phenotype definition as well as control selection in nested designs may be improved, especially if traits demonstrate a trend over time or have strong age-of-onset effects. (3) Analyzing genetic data stratified for high-risk subgroups defined by a unique development over time could be useful for the detection of rare mutations in common multifactorial diseases. (4) Estimation of the population impact of genomic risk variants could be more precise. The challenges and computational complexity demanded by genome-wide single-nucleotide polymorphism data were also discussed. Genet. Epidemiol. 33 (Suppl. 1):S93–S98, 2009. © 2009 Wiley-Liss, Inc. [source]
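
As one concrete illustration of the growth-curve approaches listed above, the following sketch fits a linear mixed model with per-subject random intercepts and slopes and a SNP-by-time interaction to simulated longitudinal data. All variable names, effect sizes, and the design are invented; this is not the group's analysis.

```python
# Minimal sketch: a linear growth-curve (mixed) model in which a SNP
# shifts both the baseline level and the slope of a trait over time.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_subj, n_visits = 300, 4
subj = np.repeat(np.arange(n_subj), n_visits)
time = np.tile(np.arange(n_visits, dtype=float), n_subj)
snp = np.repeat(rng.binomial(2, 0.3, n_subj), n_visits)  # 0/1/2 allele count

# Trait with per-subject random intercepts/slopes plus genotype effects.
b0 = rng.normal(0, 1.0, n_subj)[subj]
b1 = rng.normal(0, 0.2, n_subj)[subj]
y = (10 + b0 + 0.3 * snp
     + (0.5 + b1 + 0.15 * snp) * time
     + rng.normal(0, 0.5, n_subj * n_visits))

df = pd.DataFrame({"y": y, "time": time, "snp": snp, "subject": subj})

# Random intercept and slope per subject; the snp:time term asks whether
# the genotype alters the trajectory rather than just the level.
model = smf.mixedlm("y ~ time * snp", df, groups=df["subject"],
                    re_formula="~time")
fit = model.fit()
print(fit.summary().tables[1])
```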


Sonographic assessment of uterine and ovarian development in normal girls aged 1 to 12 years

JOURNAL OF CLINICAL ULTRASOUND, Issue 9 2008
Maria Badouraki MD
Abstract Purpose. To provide normal references of sonographic uterine and ovarian size in girls aged 1–12 years. Method. Ninety-nine girls were enrolled in the study (mean age ± SD, 6.9 ± 2.4 years; range, 1–12 years). Pubertal status was classified according to Tanner staging, whereas height and weight were assessed with a standard stadiometer and weight scale. All subjects underwent pelvic sonographic examination for the measurement of uterine length, uterine volume, the ratio of the anteroposterior diameter at the fundus to the anteroposterior diameter at the cervix (fundal–cervical [F/C] ratio), and ovarian volume and morphology. Results. A gradual increase with age was observed in all uterine and ovarian measurements. Cubic model analysis provided the best curve estimation for uterine length, uterine volume, and ovarian volume in relation to age. Uterine length, uterine volume, ovarian volume, and F/C ratio were significantly correlated with both age and height. With respect to ovarian morphology, there was a gradual decrease in the frequency of the homogeneous and paucicystic appearances with increasing age. The macrocystic appearance was observed after the age of 6 years, and its frequency increased gradually with age. Conclusion. There is a continuous increase in the size of the internal female genitalia from early childhood until the onset of puberty. We have provided reference percentile charts of normal uterine length, uterine volume, and ovarian volume in girls aged 1–12 years. © 2008 Wiley Periodicals, Inc. J Clin Ultrasound, 2008 [source]
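
A minimal sketch of the cubic curve-estimation step reported above, run on simulated (not the study's) data: organ volume is fitted as a third-degree polynomial in age, and the fit quality is summarized with R².

```python
# Cubic model of organ volume versus age; data simulated for illustration.
import numpy as np

rng = np.random.default_rng(2)
age = rng.uniform(1, 12, 99)                       # years, n = 99 as above
# Hypothetical growth pattern: slow early growth, accelerating later.
volume = 0.5 + 0.02 * age**3 + rng.normal(0, 0.8, age.size)

coeffs = np.polyfit(age, volume, deg=3)            # cubic least-squares fit
residuals = volume - np.polyval(coeffs, age)

ss_res = np.sum(residuals**2)
ss_tot = np.sum((volume - volume.mean()) ** 2)
print("cubic coefficients:", coeffs.round(3))
print("R^2 =", round(1 - ss_res / ss_tot, 3))
```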


Peritraumatic distress, posttraumatic stress disorder symptoms, and posttraumatic growth in victims of violence

JOURNAL OF TRAUMATIC STRESS, Issue 4 2010
M. J. J. Kunst
This study explored whether peritraumatic distress and posttraumatic stress disorder (PTSD) symptoms are curvilinearly related to posttraumatic growth in victims of violence several years after victimization (Time 1; n = 678) and 6 months later (Time 2; n = 205). At both time points, curve estimation revealed linear and quadratic associations between peritraumatic distress and posttraumatic growth, and quadratic associations between PTSD symptoms and posttraumatic growth. In multivariate regressions controlling for background variables, the linear peritraumatic distress and quadratic PTSD symptom terms remained significant predictors of posttraumatic growth Time 1 scores. For Time 2, the linear peritraumatic distress term remained significant, though only prior to controlling for posttraumatic growth Time 1 scores. The results suggest that peritraumatic distress enables growth after substantial time has elapsed since victimization. [source]
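
A minimal sketch of this kind of curve estimation: comparing linear and quadratic fits of posttraumatic growth on a symptom score. The data are simulated, and the inverted-U relation is assumed purely for illustration.

```python
# Linear vs. quadratic fit of growth on symptoms (simulated data).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
symptoms = rng.uniform(0, 10, 678)
growth = 2 + 1.2 * symptoms - 0.1 * symptoms**2 + rng.normal(0, 1, 678)

X_lin = sm.add_constant(symptoms)
X_quad = sm.add_constant(np.column_stack([symptoms, symptoms**2]))

fit_lin = sm.OLS(growth, X_lin).fit()
fit_quad = sm.OLS(growth, X_quad).fit()

# A significant quadratic term with improved fit suggests a curvilinear
# (inverted-U) association rather than a purely linear one.
print("linear R^2    =", round(fit_lin.rsquared, 3))
print("quadratic R^2 =", round(fit_quad.rsquared, 3))
print("quadratic term p =", round(fit_quad.pvalues[2], 4))
```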


Adaptive tests of regression functions via multiscale generalized likelihood ratios

THE CANADIAN JOURNAL OF STATISTICS, Issue 2 2003
Chunming M. Zhang
Abstract Many applications of nonparametric tests based on curve estimation involve selecting a smoothing parameter. The author proposes an adaptive test that combines several generalized likelihood ratio tests in order to get power performance nearly equal to whichever of the component tests is best. She derives the asymptotic joint distribution of the component tests and that of the proposed test under the null hypothesis. She also develops a simple method of selecting the smoothing parameters for the proposed test and presents two approximate methods for obtaining its P-value. Finally, she evaluates the proposed test through simulations and illustrates its application to a set of real data. [source]
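
A rough sketch of the multiscale idea (not the paper's exact procedure): compute a generalized-likelihood-ratio-type statistic for a parametric null at several bandwidths, combine them by taking the maximum, and calibrate the combined statistic with a simple bootstrap under the null. The smoother, the bandwidth grid, and the calibration below are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(4)

def nw_smooth(x, y, h):
    """Nadaraya-Watson estimate of E[y|x] at the sample points."""
    w = np.exp(-0.5 * ((x[:, None] - x[None, :]) / h) ** 2)
    return (w @ y) / w.sum(axis=1)

def glr_stat(x, y, h):
    """GLR-type statistic: improvement of the smooth fit over
    H0: m(x) = const, at bandwidth h."""
    rss0 = np.sum((y - y.mean()) ** 2)
    rss1 = np.sum((y - nw_smooth(x, y, h)) ** 2)
    return 0.5 * len(y) * np.log(rss0 / rss1)

def multiscale_test(x, y, bandwidths, n_boot=200):
    """Adaptive test: max of GLR statistics over a bandwidth grid,
    with a permutation-style bootstrap under the constant-mean null."""
    t_obs = max(glr_stat(x, y, h) for h in bandwidths)
    t_null = np.empty(n_boot)
    for b in range(n_boot):
        y_b = y.mean() + rng.permutation(y - y.mean())
        t_null[b] = max(glr_stat(x, y_b, h) for h in bandwidths)
    return t_obs, np.mean(t_null >= t_obs)           # statistic, p-value

n = 150
x = rng.uniform(0, 1, n)
y = 0.4 * np.sin(4 * np.pi * x) + rng.normal(0, 0.5, n)  # H0 is false here
stat, pval = multiscale_test(x, y, bandwidths=[0.02, 0.05, 0.1, 0.2])
print(f"max GLR = {stat:.2f}, bootstrap p-value = {pval:.3f}")
```

Taking the maximum over bandwidths is what makes the test adaptive: it tracks whichever single-bandwidth test is most powerful against the unknown alternative.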


Theory & Methods: Data Sharpening for Hazard Rate Estimation

AUSTRALIAN & NEW ZEALAND JOURNAL OF STATISTICS, Issue 3 2002
Gerda Claeskens
Data sharpening is a general tool for enhancing the performance of statistical estimators by altering the data before substituting them into conventional methods. In one of the simplest forms of data sharpening, available for curve estimation, an explicit empirical transformation is used to alter the data. The attraction of this approach is diminished, however, if the formula has to be altered for each different application. For example, one could expect the formula for use in hazard rate estimation to differ from that for straight density estimation, since a hazard rate is a ratio-type functional of a density. This paper shows that, in fact, identical data transformations can be used in each case, regardless of whether the data involve censoring. This dramatically simplifies the application of data sharpening to problems involving hazard rate estimation, and makes data sharpening attractive. [source]
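
A rough sketch of the idea for uncensored data, using one simple sharpening rule (a small step up the estimated log-density gradient, in the spirit of Choi and Hall's data sharpening for density estimation) before plugging the sharpened points into a kernel hazard estimate, the density estimate divided by one minus the empirical distribution function. This illustrates the flavor of the approach rather than the paper's exact transformation.

```python
import numpy as np

rng = np.random.default_rng(5)
t = rng.weibull(1.5, 400)                 # uncensored failure times
h = 0.25                                  # bandwidth (illustrative choice)

def kde(x, data, h):
    """Gaussian kernel density estimate at points x."""
    u = (x[:, None] - data[None, :]) / h
    return np.exp(-0.5 * u**2).sum(axis=1) / (len(data) * h * np.sqrt(2 * np.pi))

def kde_grad(x, data, h):
    """Derivative of the Gaussian kernel density estimate at points x."""
    u = (x[:, None] - data[None, :]) / h
    return (-u * np.exp(-0.5 * u**2)).sum(axis=1) / (
        len(data) * h**2 * np.sqrt(2 * np.pi))

# Sharpen: move each observation a small step up the density gradient
# (hypothetical rule for illustration, not the paper's transformation).
sharp = t + 0.5 * h**2 * kde_grad(t, t, h) / kde(t, t, h)

# Kernel hazard estimate f_hat / (1 - F_hat) on a grid, with the
# sharpened data feeding the density in the numerator.
grid = np.linspace(0.1, 2.5, 25)
f_hat = kde(grid, sharp, h)
F_hat = np.array([(t <= g).mean() for g in grid])
hazard = f_hat / np.clip(1 - F_hat, 1e-3, None)
print(hazard.round(2))
```

The point the paper makes is that the same sharpening transformation can be reused here as in straight density estimation, with or without censoring, rather than rederiving a formula for the ratio-type hazard functional.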