Failure Times (failure + time)

Terms modified by Failure Times

  • failure time data
  • failure time distribution
  • failure time models

Selected Abstracts


    A Mixture Point Process for Repeated Failure Times, with an Application to a Recurrent Disease

    BIOMETRICAL JOURNAL, Issue 7 2003
    O. Pons
    Abstract We present a model that describes the distribution of recurrence times of a disease in the presence of covariate effects. After a first occurrence of the disease in an individual, the time intervals between successive episodes are assumed to be independent and to follow a mixture of two distributions according to the outcome of the previous treatment. Both sub-distributions of the model and the mixture proportion are allowed to involve covariates. Parametric inference is considered and we illustrate the methods with data on a recurrent disease and with simulations, using piecewise constant baseline hazard functions. [source]
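
    As a rough illustration of this kind of model, the sketch below simulates gap times between disease episodes as a two-component mixture with piecewise-constant hazards and a logistic, covariate-dependent mixing proportion. The change-points, rates, and coefficients are hypothetical, not taken from the paper.

    ```python
    import numpy as np
    from scipy.special import expit

    rng = np.random.default_rng(42)

    def rpexp(breaks, rates, rng):
        """One draw from a piecewise-constant-hazard (piecewise exponential) law."""
        lo = 0.0
        for hi, lam in zip(list(breaks) + [np.inf], rates):
            gap = rng.exponential(1.0 / lam)
            if lo + gap < hi:
                return lo + gap
            lo = hi

    breaks = [1.0, 3.0]                 # hazard change-points (hypothetical)
    rates_good = [0.2, 0.1, 0.05]       # gap-time hazard after a successful treatment
    rates_bad = [0.8, 0.5, 0.3]         # gap-time hazard after an unsuccessful one

    def simulate_gaps(x, n_gaps, gamma=(-0.5, 1.0)):
        """Gap times for one individual; the mixture weight is logistic in covariate x."""
        p_good = expit(gamma[0] + gamma[1] * x)
        from_good = rng.random(n_gaps) < p_good
        return [rpexp(breaks, rates_good if g else rates_bad, rng) for g in from_good]

    print(simulate_gaps(x=0.7, n_gaps=5))
    ```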


    A damage mechanics model for power-law creep and earthquake aftershock and foreshock sequences

    GEOPHYSICAL JOURNAL INTERNATIONAL, Issue 1 2000
    Ian G. Main
    It is common practice to refer to three independent stages of creep under static loading conditions in the laboratory: namely transient, steady-state, and accelerating. Here we suggest a simple damage mechanics model for the apparently trimodal behaviour of the strain and event rate dependence, by invoking two local mechanisms of positive and negative feedback applied to constitutive rules for time-dependent subcritical crack growth. In both phases, the individual constitutive rule for measured strain ε takes the form ε(t) = ε0 [1 + t/(mτ)]^m, where τ is the ratio of initial crack length to rupture velocity. For a local hardening mechanism (negative feedback), we find that transient creep dominates, with 0 < m < 1. Crack growth in this stage is stable and decelerating. For a local softening mechanism (positive feedback), m < 0, and crack growth is unstable and accelerating. In this case a quasi-static instability criterion ε → ∞ can be defined at a finite failure time, resulting in the localization of damage and the formation of a throughgoing fracture. In the hybrid model, transient creep dominates in the early stages of damage and accelerating creep in the latter stages. At intermediate times the linear superposition of the two mechanisms spontaneously produces an apparent steady-state phase of relatively constant strain rate, with a power-law rheology, as observed in laboratory creep test data. The predicted acoustic emission event rates in the transient and accelerating phases are identical to the modified Omori laws for aftershocks and foreshocks, respectively, and provide a physical meaning for the empirical constants measured. At intermediate times, the event rate tends to a relatively constant background rate. The requirement for a finite event rate at the time of the main shock can be satisfied by modifying the instability criterion to having a finite crack velocity at the dynamic failure time, dx/dt → VR, where VR is the dynamic rupture velocity. The same hybrid model can be modified to account for dynamic loading (constant stress rate) boundary conditions, and predicts the observed loading rate dependence of the breaking strength. The resulting scaling exponents imply systematically more non-linear behaviour for dynamic loading. [source]
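
    A minimal numerical sketch of the hybrid law described above: the strain is the sum of a transient term (0 < m < 1) and an accelerating term (m < 0) of the same power-law form, and the numerically differentiated strain rate shows the emergent quasi-steady plateau at intermediate times. All parameter values are illustrative, not fitted.

    ```python
    import numpy as np

    def creep_strain(t, eps0, m, tau):
        """Power-law creep form eps(t) = eps0 * (1 + t/(m*tau))**m:
        transient (stable) for 0 < m < 1, accelerating (unstable) for m < 0."""
        return eps0 * (1.0 + t / (m * tau)) ** m

    # Hypothetical parameters: a hardening term plus a softening term whose
    # strain diverges at the failure time t_f = |m2| * tau2 = 100.
    eps0, m1, tau1 = 1.0, 0.3, 1.0        # negative feedback: transient creep
    m2, tau2 = -0.5, 200.0                # positive feedback: accelerating creep

    t = np.linspace(0.0, 99.0, 1000)
    strain = creep_strain(t, eps0, m1, tau1) + creep_strain(t, eps0, m2, tau2)
    rate = np.gradient(strain, t)   # near-constant at intermediate times:
                                    # the apparent steady-state phase
    print(rate[::200])
    ```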


    Ozone cracking and flex cracking of crosslinked polymer blend compounds

    JOURNAL OF APPLIED POLYMER SCIENCE, Issue 4 2007
    M. F. Tse
    Abstract Ozone cracking and flex cracking of crosslinked elastomer blends of brominated isobutylene/para-methylstyrene copolymer (BIMSM) and unsaturated elastomers, such as polybutadiene rubber (BR) and natural rubber (NR), are studied. This saturated BIMSM elastomer, which is a terpolymer of isobutylene, para-bromomethylstyrene, and para-methylstyrene, functions as the ozone-inert phase of the blend. Ozone cracking is measured by the failure time of a tapered specimen under a fixed load in a high severity ozone oven, whereas flex cracking is ranked by the De Mattia cut growth. The ozone resistance of BIMSM/BR/NR blends is compared to that of a BR/NR blend (with or without antiozonant) at constant strain energy densities. The effects of the BIMSM content in the blend, the structural variations of BIMSM, and the network chain length between crosslinks on these two failure properties, which are important in crosslinked compounds for applications in tire sidewalls, are discussed. © 2006 Wiley Periodicals, Inc. J Appl Polym Sci 103: 2183–2196, 2007 [source]


    Modelling survival in acute severe illness: Cox versus accelerated failure time models

    JOURNAL OF EVALUATION IN CLINICAL PRACTICE, Issue 1 2008
    John L. Moran MBBS FRACP FJFICM MD
    Abstract Background: The Cox model has been the mainstay of survival analysis in the critically ill, and time-dependent covariates have infrequently been incorporated into survival analysis. Objectives: To model 28-day survival of patients with acute lung injury (ALI) and acute respiratory distress syndrome (ARDS), and compare the utility of Cox and accelerated failure time (AFT) models. Methods: Prospective cohort study of 168 adult patients enrolled at diagnosis of ALI in 21 adult ICUs in three Australian States with measurement of survival time, censored at 28 days. Model performance was assessed as goodness-of-fit [GOF, cross-products of quantiles of risk and time intervals (P > 0.1), Cox model] and explained variation ('R2', Cox and AFT). Results: Over a 2-month study period (October–November 1999), 168 patients with ALI were identified, with a mean (SD) age of 61.5 (18) years and 30% female. Peak mortality hazard occurred at days 7–8 after onset of ALI/ARDS. In the Cox model, increasing age and female gender, plus their interaction, were associated with an increased mortality hazard. Time-varying effects were established for patient severity-of-illness score (decreasing hazard over time) and multiple-organ-dysfunction score (increasing hazard over time). The Cox model was well specified (GOF, P > 0.34) and R2 = 0.546, 95% CI: 0.390–0.781. Both log-normal (R2 = 0.451, 95% CI: 0.321–0.695) and log-logistic (R2 = 0.470, 95% CI: 0.346–0.714) AFT models identified the same predictors as the Cox model, but did not demonstrate convincingly superior overall fit. Conclusions: Time dependence of predictors of survival in ALI/ARDS exists and must be appropriately modelled. The Cox model with time-varying covariates remains a flexible model in survival analysis of patients with acute severe illness. [source]
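
    For readers who want to reproduce this kind of comparison, the sketch below fits a Cox model and a log-normal AFT model to synthetic 28-day data with the lifelines library (assumed available); the covariates and effect sizes are invented for illustration and are not the study data.

    ```python
    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter, LogNormalAFTFitter

    rng = np.random.default_rng(1)
    n = 168                                   # same order of size as the study
    age = rng.normal(61.5, 18, n)
    female = rng.binomial(1, 0.30, n)
    # Synthetic survival times with an invented age effect, censored at day 28.
    raw = rng.exponential(40 * np.exp(-0.01 * (age - 60)))
    df = pd.DataFrame({
        "age": age,
        "female": female,
        "time": np.minimum(raw, 28.0),
        "event": (raw <= 28.0).astype(int),
    })

    cox = CoxPHFitter().fit(df, duration_col="time", event_col="event")
    aft = LogNormalAFTFitter().fit(df, duration_col="time", event_col="event")
    print(cox.summary[["coef", "p"]])
    print(aft.summary[["coef", "p"]])
    ```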


    Estimating the effect of treatment in a proportional hazards model in the presence of non-compliance and contamination

    JOURNAL OF THE ROYAL STATISTICAL SOCIETY: SERIES B (STATISTICAL METHODOLOGY), Issue 4 2007
    Jack Cuzick
    Summary. Methods for adjusting for non-compliance and contamination, which respect the randomization, are extended from binary outcomes to time-to-event analyses by using a proportional hazards model. A simple non-iterative method is developed when there are no covariates, which is a generalization of the Mantel–Haenszel estimator. More generally, a 'partial likelihood' is developed which accommodates covariates under the assumption that they are independent of compliance. A key feature is that the proportion of contaminators and non-compliers in the risk set is updated at each failure time. When covariates are not independent of compliance, a full likelihood is developed and explored, but this leads to a complex estimator. Estimating equations and information matrices are derived for these estimators and they are evaluated by simulation studies. [source]


    Analysis of failure time data under competing censoring mechanisms

    JOURNAL OF THE ROYAL STATISTICAL SOCIETY: SERIES B (STATISTICAL METHODOLOGY), Issue 3 2007
    Andrea Rotnitzky
    Summary. We derive estimators of the survival curve of a failure time in the presence of competing right censoring mechanisms. Our approach allows for the possibility that some or all of the competing censoring mechanisms are associated with the end point, even after adjustment for recorded prognostic factors. It also allows the degree of residual association to be possibly different for distinct censoring processes. Our methods generalize from one to several competing censoring mechanisms the methods of Scharfstein and Robins. [source]


    The entry and exit decisions of foreign banks in Hong Kong

    MANAGERIAL AND DECISION ECONOMICS, Issue 6 2008
    Man K. Leung
    This paper presents a theoretical framework for explaining the entry and exit decisions of a firm, motivated by the differential returns in its home and a host market. Within this framework, the factors underpinning the entry and exit decisions of foreign banks in Hong Kong are examined, using a duration (accelerated failure time) model. The results show that a foreign bank with international experience gained from operating in more overseas markets takes a shorter (longer) time to enter (exit) the Hong Kong market. Faster (slower) growth both in home trade with Hong Kong and in the Hong Kong banking sector itself increases the likelihood of entry (exit). Ceteris paribus, Asian banks enter at a faster rate and survive longer in the Hong Kong market. Copyright © 2008 John Wiley & Sons, Ltd. [source]


    ACCELERATED FAILURE TIME MODELS WITH NONLINEAR COVARIATE EFFECTS

    AUSTRALIAN & NEW ZEALAND JOURNAL OF STATISTICS, Issue 2 2007
    Chenlei Leng
    Summary As a flexible alternative to the Cox model, the accelerated failure time (AFT) model assumes that the event time of interest depends on the covariates through a regression function. The AFT model with non-parametric covariate effects is investigated, when variable selection is desired along with estimation. Formulated in the framework of the smoothing spline analysis of variance model, the proposed method based on the Stute estimate (Stute, 1993, Consistent estimation under random censorship when covariables are present, J. Multivariate Anal. 45, 89–103) can achieve a sparse representation of the functional decomposition, by utilizing a reproducing kernel Hilbert norm penalty. Computational algorithms and theoretical properties of the proposed method are investigated. The finite sample size performance of the proposed approach is assessed via simulation studies. The primary biliary cirrhosis data are analyzed for demonstration. [source]
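
    The Stute estimate referenced above weights ordered log failure times by the jumps of the Kaplan-Meier estimator. A minimal sketch of those weights, with a crude weighted polynomial fit standing in for the paper's penalized smoothing-spline ANOVA fit; the toy data and polynomial degree are arbitrary.

    ```python
    import numpy as np

    def stute_weights(time, event):
        """Kaplan-Meier jump (Stute) weights for censored regression. For the
        ordered obs i (1-based): w_i = d_i/(n-i+1) * prod_{j<i} ((n-j)/(n-j+1))**d_j."""
        order = np.argsort(time)
        d = np.asarray(event, float)[order]
        n = len(d)
        i = np.arange(1, n + 1)
        factors = ((n - i) / (n - i + 1.0)) ** d
        w = d / (n - i + 1.0) * np.concatenate(([1.0], np.cumprod(factors[:-1])))
        out = np.empty(n)
        out[order] = w
        return out

    # Toy censored data with a nonlinear covariate effect.
    rng = np.random.default_rng(2)
    x = rng.uniform(0, 1, 200)
    t_true = np.exp(np.sin(2 * np.pi * x) + 0.3 * rng.normal(size=200))
    c = rng.exponential(3.0, 200)
    time, event = np.minimum(t_true, c), (t_true <= c).astype(int)
    coefs = np.polyfit(x, np.log(time), deg=5, w=np.sqrt(stute_weights(time, event)))
    print(coefs.round(2))
    ```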


    Regularized Estimation for the Accelerated Failure Time Model

    BIOMETRICS, Issue 2 2009
    T. Cai
    Summary In the presence of high-dimensional predictors, it is challenging to develop reliable regression models that can be used to accurately predict future outcomes. Further complications arise when the outcome of interest is an event time, which is often not fully observed due to censoring. In this article, we develop robust prediction models for event time outcomes by regularizing the Gehan estimator for the accelerated failure time (AFT) model (Tsiatis, 1996, Annals of Statistics 18, 305–328) with a least absolute shrinkage and selection operator (LASSO) penalty. Unlike existing methods based on inverse probability weighting and the Buckley and James estimator (Buckley and James, 1979, Biometrika 66, 429–436), the proposed approach does not require additional assumptions about the censoring and always yields a solution that is convergent. Furthermore, the proposed estimator leads to a stable regression model for prediction even if the AFT model fails to hold. To facilitate the adaptive selection of the tuning parameter, we detail an efficient numerical algorithm for obtaining the entire regularization path. The proposed procedures are applied to a breast cancer dataset to derive a reliable regression model for predicting patient survival based on a set of clinical prognostic factors and gene signatures. Finite sample performances of the procedures are evaluated through a simulation study. [source]
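
    The Gehan loss is convex and piecewise linear, so a simple proximal-subgradient iteration can sketch the LASSO-penalized estimator. The learning rate, iteration count, and toy data below are arbitrary choices for illustration; the paper's path algorithm is considerably more efficient.

    ```python
    import numpy as np

    def soft(x, t):
        """Soft-thresholding (the proximal operator of the L1 norm)."""
        return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

    def gehan_lasso(X, logT, delta, lam, lr=0.05, iters=2000):
        """Proximal-subgradient sketch of the L1-penalized Gehan loss
        L(b) = n^-2 * sum_{i,j} delta_i * max(e_j - e_i, 0), e_i = logT_i - X_i @ b."""
        n, p = X.shape
        beta = np.zeros(p)
        for _ in range(iters):
            e = logT - X @ beta
            A = delta[:, None] * (e[None, :] > e[:, None])      # active pairs (i, j)
            g = (X * A.sum(1)[:, None]).sum(0) - (X * A.sum(0)[:, None]).sum(0)
            beta = soft(beta - lr * g / n**2, lr * lam)
        return beta

    # Toy AFT data: two real signals among six predictors, random censoring flags.
    rng = np.random.default_rng(9)
    X = rng.normal(size=(100, 6))
    logT = X @ np.array([1.0, -1.0, 0, 0, 0, 0]) + 0.5 * rng.gumbel(size=100)
    delta = (rng.random(100) < 0.7).astype(float)
    print(gehan_lasso(X, logT, delta, lam=0.5).round(2))
    ```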


    Modeling Longitudinal Data with Nonparametric Multiplicative Random Effects Jointly with Survival Data

    BIOMETRICS, Issue 2 2008
    Jimin Ding
    Summary In clinical studies, longitudinal biomarkers are often used to monitor disease progression and failure time. Joint modeling of longitudinal and survival data has certain advantages and has emerged as an effective way to mutually enhance information. Typically, a parametric longitudinal model is assumed to facilitate the likelihood approach. However, the choice of a proper parametric model turns out to be more elusive than models for standard longitudinal studies in which no survival endpoint occurs. In this article, we propose a nonparametric multiplicative random effects model for the longitudinal process, which has many applications and leads to a flexible yet parsimonious nonparametric random effects model. A proportional hazards model is then used to link the biomarkers and event time. We use B-splines to represent the nonparametric longitudinal process, and select the number of knots and degrees based on a version of the Akaike information criterion (AIC). Unknown model parameters are estimated through maximizing the observed joint likelihood, which is iteratively maximized by the Monte Carlo Expectation Maximization (MCEM) algorithm. Due to the simplicity of the model structure, the proposed approach has good numerical stability and compares well with the competing parametric longitudinal approaches. The new approach is illustrated with primary biliary cirrhosis (PBC) data, aiming to capture nonlinear patterns of serum bilirubin time courses and their relationship with survival time of PBC patients. [source]


    Estimation of Competing Risks with General Missing Pattern in Failure Types

    BIOMETRICS, Issue 4 2003
    Anup Dewanji
    Summary. In competing risks data, missing failure types (causes) are a very common phenomenon. In this work, we consider a general missing pattern in which, if a failure type is not observed, one observes a set of possible types containing the true type, along with the failure time. We first consider maximum likelihood estimation under a missing-at-random assumption via the expectation maximization (EM) algorithm. We then propose a Nelson–Aalen-type estimator for situations when certain information on the conditional probability of the true type given a set of possible failure types is available from the experimentalists. This is based on a least-squares type method using the relationships between hazards for different types and hazards for different combinations of missing types. We conduct a simulation study to investigate the performance of this method, which indicates that bias may be small, even for a high proportion of missing data, provided the number of observations is sufficiently large. The estimates are somewhat sensitive to misspecification of the conditional probabilities of the true types when the missing proportion is high. We also consider an example from an animal experiment to illustrate our methodology. [source]
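
    When conditional probabilities of the true type given the observed set are supplied, the Nelson–Aalen-type idea can be sketched by distributing each masked failure fractionally across its possible causes. A minimal version, assuming distinct failure times; the data and probabilities below are invented.

    ```python
    import numpy as np

    def cause_specific_na(time, event, cause_probs):
        """Nelson-Aalen-type cumulative hazards when failure types may be masked.
        cause_probs[i, j] = P(true cause = j | observed set for subject i), zero
        outside the observed set; rows for censored subjects are all zero."""
        order = np.argsort(time)
        t = np.asarray(time, float)[order]
        d = np.asarray(event, float)[order]
        P = np.asarray(cause_probs, float)[order]
        at_risk = len(t) - np.arange(len(t))
        jumps = d[:, None] * P / at_risk[:, None]
        return t, np.cumsum(jumps, axis=0)          # column j is Lambda_j(t)

    # Four subjects, two causes; the second failure has a masked type {0, 1}
    # with supplied conditional probabilities (0.3, 0.7).
    time = [2.0, 3.5, 4.1, 6.0]
    event = [1, 1, 0, 1]
    P = [[1.0, 0.0], [0.3, 0.7], [0.0, 0.0], [0.0, 1.0]]
    t, Lam = cause_specific_na(time, event, P)
    print(t, Lam, sep="\n")
    ```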


    Survival Analysis in Clinical Trials: Past Developments and Future Directions

    BIOMETRICS, Issue 4 2000
    Thomas R. Fleming
    Summary. The field of survival analysis emerged in the 20th century and experienced tremendous growth during the latter half of the century. The developments in this field that have had the most profound impact on clinical trials are the Kaplan-Meier (1958, Journal of the American Statistical Association 53, 457–481) method for estimating the survival function, the log-rank statistic (Mantel, 1966, Cancer Chemotherapy Report 50, 163–170) for comparing two survival distributions, and the Cox (1972, Journal of the Royal Statistical Society, Series B 34, 187–220) proportional hazards model for quantifying the effects of covariates on the survival time. The counting-process martingale theory pioneered by Aalen (1975, Statistical inference for a family of counting processes, Ph.D. dissertation, University of California, Berkeley) provides a unified framework for studying the small- and large-sample properties of survival analysis statistics. Significant progress has been achieved and further developments are expected in many other areas, including the accelerated failure time model, multivariate failure time data, interval-censored data, dependent censoring, dynamic treatment regimes and causal inference, joint modeling of failure time and longitudinal data, and Bayesian methods. [source]
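
    The three landmark tools named above are one-liners in modern software. A sketch on synthetic two-group data using the lifelines library (assumed available); group sizes and rates are invented.

    ```python
    import numpy as np
    import pandas as pd
    from lifelines import KaplanMeierFitter, CoxPHFitter
    from lifelines.statistics import logrank_test

    rng = np.random.default_rng(3)
    n = 100
    group = rng.binomial(1, 0.5, n)
    t_true = rng.exponential(np.where(group == 1, 12.0, 8.0))
    cens = rng.exponential(15.0, n)
    time, event = np.minimum(t_true, cens), (t_true <= cens).astype(int)

    km = KaplanMeierFitter().fit(time, event)            # survival curve
    lr = logrank_test(time[group == 1], time[group == 0],
                      event_observed_A=event[group == 1],
                      event_observed_B=event[group == 0])
    cox = CoxPHFitter().fit(pd.DataFrame({"T": time, "E": event, "g": group}),
                            duration_col="T", event_col="E")
    print(km.median_survival_time_, lr.p_value, cox.hazard_ratios_["g"])
    ```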


    Maximum likelihood estimation in semiparametric regression models with censored data

    JOURNAL OF THE ROYAL STATISTICAL SOCIETY: SERIES B (STATISTICAL METHODOLOGY), Issue 4 2007
    D. Zeng
    Summary. Semiparametric regression models play a central role in formulating the effects of covariates on potentially censored failure times and in the joint modelling of incomplete repeated measures and failure times in longitudinal studies. The presence of infinite dimensional parameters poses considerable theoretical and computational challenges in the statistical analysis of such models. We present several classes of semiparametric regression models, which extend the existing models in important directions. We construct appropriate likelihood functions involving both finite dimensional and infinite dimensional parameters. The maximum likelihood estimators are consistent and asymptotically normal with efficient variances. We develop simple and stable numerical techniques to implement the corresponding inference procedures. Extensive simulation experiments demonstrate that the inferential and computational methods proposed perform well in practical settings. Applications to three medical studies yield important new insights. We conclude that there is no reason, theoretical or numerical, not to use maximum likelihood estimation for semiparametric regression models. We discuss several areas that need further research. [source]


    Fracture prediction in tough polyethylene pipes using measured craze strength

    POLYMER ENGINEERING & SCIENCE, Issue 5 2008
    P. Davis
    In this study, an empirical model is developed that predicts the time to failure for PE pipes under combined pressure and deflection loads. The time-dependent craze strength of different PE materials is measured using the circumferentially deep-notched tensile (CDNT) test. In agreement with previous research, results indicate that bimodal materials with comonomer side-chain densities biased toward high-molecular-weight PE molecules exhibit significantly higher long-term craze strengths. A comparison of currently available PE materials with CDNT samples taken from a PE pipe that failed by slow crack growth in service clearly indicates the superior performance of new-generation materials. Using measured craze strength data from the CDNT test, a simplified model for predicting failure in buried PE pipes is developed. Extending previous research, the reference stress concept is used to calculate an equivalent craze stress for a pipe subjected to combined internal pressure and deflection loads. Good agreement is obtained between the model predictions and observed failure times in an experimental test-bed study of pipes under in-service loading conditions. POLYM. ENG. SCI., 2008. © 2008 Society of Plastics Engineers [source]


    Control Charts for Monitoring Field Failure Data

    QUALITY AND RELIABILITY ENGINEERING INTERNATIONAL, Issue 7 2006
    Robert G. Batson
    Abstract One responsibility of the reliability engineer is to monitor failure trends for fielded units to confirm that pre-production life testing results remain valid. This research suggests an approach that is computationally simple and can be used with a small number of failures per observation period. The approach is based on converting failure time data from fielded units to normal distribution data, using simple logarithmic or power transformations. Appropriate normalizing transformations for the classic life distributions (exponential, lognormal, and Weibull) are identified from the literature. Samples of size 500 field failure times are generated for seven different lifetime distributions (normal, lognormal, exponential, and four Weibulls of various shapes). Various control charts are then tested under three sampling schemes (individual, fixed, and random) and three system reliability degradations (large step, small step, and linear decrease in mean time between failures (MTBF)). The results of these tests are converted to performance measures of time to first out-of-control signal and persistence of signal after out-of-control status begins. Three of the well-known Western Electric sensitizing rules are used to recognize the assignable cause signals. Based on this testing, the X̄-chart with fixed sample size is the best overall for field failure monitoring, although the individual chart was better for the transformed exponential and another highly-skewed Weibull. As expected, the linear decrease in MTBF is the most difficult change for any of the charts to detect. Copyright © 2005 John Wiley & Sons, Ltd. [source]
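
    A sketch of the transform-then-chart idea on exponential failure times: raising exponential data to the power 1/3.6 yields a Weibull with shape 3.6, which is nearly normal, after which ordinary X̄-chart limits apply. The subgroup size, simulated scale, and constants below are illustrative, not the paper's settings.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    x = rng.exponential(100.0, 500)     # simulated field failure times
    y = x ** (1 / 3.6)                  # approx. normal (Weibull shape 3.6)

    k = 5                               # subgroup size (fixed sampling scheme)
    sub = y[: len(y) // k * k].reshape(-1, k)
    xbar = sub.mean(axis=1)
    center = xbar.mean()
    sigma = sub.std(axis=1, ddof=1).mean() / 0.9400     # c4 constant for n = 5
    ucl, lcl = center + 3 * sigma / np.sqrt(k), center - 3 * sigma / np.sqrt(k)
    flagged = np.where((xbar > ucl) | (xbar < lcl))[0]
    print(f"LCL={lcl:.3f}  CL={center:.3f}  UCL={ucl:.3f}  flagged={flagged}")
    ```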


    Estimation of the expected ROCOF of a repairable system with bootstrap confidence region

    QUALITY AND RELIABILITY ENGINEERING INTERNATIONAL, Issue 3 2001
    M. J. Phillips
    Abstract Bootstrap methods are presented for constructing confidence regions for the expected ROCOF (rate of occurrence of failures) of a repairable system. This is based on the work of Cowling et al. (Journal of the American Statistical Association 1996; 91: 1516–1524) for the intensity function of a nonhomogeneous Poisson process (NHPP). The method is applied to the failure times of a photocopier given by Baker (Technometrics 1996; 38: 256–265). Copyright © 2001 John Wiley & Sons, Ltd. [source]
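
    A rough stand-in for the approach: estimate the expected ROCOF with a Gaussian kernel and form pointwise percentile bands by resampling the event times. This is a simplification of the smoothed bootstrap of Cowling et al., with invented data and an arbitrary bandwidth.

    ```python
    import numpy as np
    from scipy.stats import norm

    def kernel_rocof(events, grid, h):
        """Gaussian-kernel estimate of the expected ROCOF on a time grid."""
        return norm.pdf((grid[:, None] - events[None, :]) / h).sum(axis=1) / h

    rng = np.random.default_rng(5)
    events = np.sort(rng.uniform(0, 100, 40))    # stand-in failure times
    grid = np.linspace(0, 100, 200)
    h = 10.0                                     # arbitrary bandwidth
    est = kernel_rocof(events, grid, h)

    boot = np.array([kernel_rocof(rng.choice(events, events.size), grid, h)
                     for _ in range(500)])
    lo, hi = np.percentile(boot, [2.5, 97.5], axis=0)   # pointwise 95% band
    print(est[::40].round(3), lo[::40].round(3), hi[::40].round(3), sep="\n")
    ```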


    Semiparametric estimation of single-index hazard functions without proportional hazards

    THE ECONOMETRICS JOURNAL, Issue 1 2006
    Tue Gørgens
    Summary. This research develops a semiparametric kernel-based estimator of hazard functions which does not assume proportional hazards. The maintained assumption is that the hazard functions depend on regressors only through a linear index. The estimator permits both discrete and continuous regressors, both discrete and continuous failure times, and can be applied to right-censored data and to multiple-risks data, in which case the hazard functions are risk-specific. The estimator is root-n consistent and asymptotically normally distributed. The estimator performs well in Monte Carlo experiments. [source]


    NONPARAMETRIC ESTIMATION OF CONDITIONAL CUMULATIVE HAZARDS FOR MISSING POPULATION MARKS

    AUSTRALIAN & NEW ZEALAND JOURNAL OF STATISTICS, Issue 1 2010
    Dipankar Bandyopadhyay
    Summary A new function for the competing risks model, the conditional cumulative hazard function, is introduced, from which the conditional distribution of failure times of individuals failing due to cause j can be studied. The standard Nelson–Aalen estimator is not appropriate in this setting, as population membership (mark) information may be missing for some individuals owing to random right-censoring. We propose the use of imputed population marks for the censored individuals through fractional risk sets. Some asymptotic properties, including uniform strong consistency, are established. We study the practical performance of this estimator through simulation studies and apply it to a real data set for illustration. [source]


    A Goodness-of-fit Test for the Marginal Cox Model for Correlated Interval-censored Failure Time Data

    BIOMETRICAL JOURNAL, Issue 6 2006
    Lianming Wang
    Abstract The marginal Cox model approach is perhaps the most commonly used method in the analysis of correlated failure time data (Cai, 1999; Cai and Prentice, 1995; Lin, 1994; Wei, Lin and Weissfeld, 1989). It assumes that the marginal distributions for the correlated failure times can be described by the Cox model and leaves the dependence structure completely unspecified. This paper discusses the assessment of the marginal Cox model for correlated interval-censored data and a goodness-of-fit test is presented for the problem. The method is applied to a set of correlated interval-censored data arising from an AIDS clinical trial. (© 2006 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim) [source]


    Parameter Estimation for Partially Complete Time and Type of Failure Data

    BIOMETRICAL JOURNAL, Issue 2 2004
    Debasis Kundu
    Abstract The theory of competing risks has been developed to assess a specific risk in the presence of other risk factors. In this paper we consider the parametric estimation of different failure modes under partially complete time and type of failure data using latent failure time and cause-specific hazard function models. Uniformly minimum variance unbiased estimators and maximum likelihood estimators are obtained when latent failure times and cause-specific hazard functions are exponentially distributed. We also consider the case when they follow Weibull distributions. One data set is used to illustrate the proposed techniques. (© 2004 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim) [source]
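
    In the fully observed special case, the exponential MLE has a closed form: each cause-specific rate is the number of failures from that cause divided by the total time on test. A minimal sketch under that assumption (the partially complete, masked-type case of the paper requires more machinery); the data are invented.

    ```python
    import numpy as np

    def exp_competing_risks_mle(time, cause, n_causes):
        """MLE of cause-specific exponential rates with fully observed causes:
        lambda_j = (# failures from cause j) / (total time on test).
        Censored observations are flagged with cause = -1."""
        time, cause = np.asarray(time, float), np.asarray(cause)
        total = time.sum()
        return np.array([(cause == j).sum() / total for j in range(n_causes)])

    lam = exp_competing_risks_mle(
        time=[2.0, 5.1, 3.3, 7.9, 0.4, 6.2],
        cause=[0, 1, 0, -1, 1, 0],
        n_causes=2,
    )
    print(lam)    # estimated cause-specific rates
    ```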


    A Bayesian Chi-Squared Goodness-of-Fit Test for Censored Data Models

    BIOMETRICS, Issue 2 2010
    Jing Cao
    Summary We propose a Bayesian chi-squared model diagnostic for analysis of data subject to censoring. The test statistic has the form of Pearson's chi-squared test statistic and is easy to calculate from standard output of Markov chain Monte Carlo algorithms. The key innovation of this diagnostic is that it is based only on observed failure times. Because it does not rely on the imputation of failure times for observations that have been censored, we show that under heavy censoring it can have higher power for detecting model departures than a comparable test based on the complete data. In a simulation study, we show that tests based on this diagnostic exhibit power comparable to, and nominal Type I error rates better than, a commonly used alternative test proposed by Akritas (1988, Journal of the American Statistical Association 83, 222–230). An important advantage of the proposed diagnostic is that it can be applied to a broad class of censored data models, including generalized linear models and other models with nonidentically distributed and nonadditive error structures. We illustrate the proposed model diagnostic for testing the adequacy of two parametric survival models for Space Shuttle main engine failures. [source]
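
    The diagnostic can be sketched as follows: for each posterior draw, transform the observed (uncensored) failure times by the model CDF, bin into equiprobable cells, and compute a Pearson statistic. The exponential model and conjugate posterior below are hypothetical stand-ins, not the paper's application.

    ```python
    import numpy as np
    from scipy import stats

    def bayes_chisq_stat(obs_times, cdf, theta, K=10):
        """Pearson-type statistic for one posterior draw, built only from the
        observed (uncensored) failure times: bin u_i = F(t_i; theta) into K
        equiprobable cells and compare counts with their expectation."""
        u = cdf(np.asarray(obs_times), *theta)
        counts, _ = np.histogram(u, bins=np.linspace(0, 1, K + 1))
        expected = len(u) / K
        return ((counts - expected) ** 2 / expected).sum()

    # Hypothetical setting: exponential failure model, conjugate gamma posterior
    # for the rate; one statistic per posterior draw.
    rng = np.random.default_rng(7)
    t_obs = rng.exponential(2.0, 80)
    rate_draws = rng.gamma(t_obs.size, 1.0 / t_obs.sum(), 200)
    rb = [bayes_chisq_stat(t_obs, stats.expon.cdf, (0.0, 1.0 / lam))
          for lam in rate_draws]
    print(np.mean(rb), np.percentile(rb, 95))
    ```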


    Joint Models for Multivariate Longitudinal and Multivariate Survival Data

    BIOMETRICS, Issue 2 2006
    Yueh-Yun Chi
    Summary Joint modeling of longitudinal and survival data is becoming increasingly essential in most cancer and AIDS clinical trials. We propose a likelihood approach to extend both longitudinal and survival components to be multidimensional. A multivariate mixed effects model is presented to explicitly capture two different sources of dependence among longitudinal measures over time as well as dependence between different variables. For the survival component of the joint model, we introduce a shared frailty, which is assumed to have a positive stable distribution, to induce correlation between failure times. The proposed marginal univariate survival model, which accommodates both zero and nonzero cure fractions for the time to event, is then applied to each marginal survival function. The proposed multivariate survival model has a proportional hazards structure for the population hazard, conditionally as well as marginally, when the baseline covariates are specified through a specific mechanism. In addition, the model is capable of dealing with survival functions with different cure rate structures. The methodology is specifically applied to the International Breast Cancer Study Group (IBCSG) trial to investigate the relationship between quality of life, disease-free survival, and overall survival. [source]


    Multivariate Survival Trees: A Maximum Likelihood Approach Based on Frailty Models

    BIOMETRICS, Issue 1 2004
    Xiaogang Su
    Summary. A method of constructing trees for correlated failure times is put forward. It adopts the backfitting idea of classification and regression trees (CART) (Breiman et al., 1984, Classification and Regression Trees). The tree method is developed based on the maximized likelihoods associated with the gamma frailty model and standard likelihood-related techniques are incorporated. The proposed method is assessed through simulations conducted under a variety of model configurations and illustrated using the chronic granulomatous disease (CGD) study data. [source]


    A Semiparametric Estimate of Treatment Effects with Censored Data

    BIOMETRICS, Issue 3 2001
    Ronghui Xu
    Summary. A semiparametric estimate of an average regression effect with right-censored failure time data has recently been proposed under the Cox-type model where the regression effect β(t) is allowed to vary with time. In this article, we derive a simple algebraic relationship between this average regression effect and a measurement of group differences in K-sample transformation models when the random error belongs to the Gρ family of Harrington and Fleming (1982, Biometrika 69, 553–566), the latter being equivalent to the conditional regression effect in a gamma frailty model. The models considered here are suitable for the attenuating hazard ratios that often arise in practice. The results reveal an interesting connection among the above three classes of models as alternatives to the proportional hazards assumption and add to our understanding of the behavior of the partial likelihood estimate under nonproportional hazards. The algebraic relationship provides a simple estimator under the transformation model. We develop a variance estimator based on the empirical influence function that is much easier to compute than the previously suggested resampling methods. When there is truncation in the right tail of the failure times, we propose a method of bias correction to improve the coverage properties of the confidence intervals. The estimate, its estimated variance, and the bias correction term can all be calculated with minor modifications to standard software for proportional hazards regression. [source]


    A Multiple Imputation Approach to Cox Regression with Interval-Censored Data

    BIOMETRICS, Issue 1 2000
    Wei Pan
    Summary. We propose a general semiparametric method based on multiple imputation for Cox regression with interval-censored data. The method consists of iterating the following two steps. First, from finite-interval-censored (but not right-censored) data, exact failure times are imputed using Wei and Tanner's poor man's or asymptotic normal data augmentation scheme based on the current estimates of the regression coefficient and the baseline survival curve. Second, a standard statistical procedure for right-censored data, such as the Cox partial likelihood method, is applied to the imputed data to update the estimates. Through simulation, we demonstrate that the resulting estimate of the regression coefficient and its associated standard error provide a promising alternative to the nonparametric maximum likelihood estimate. Our proposal is easily implemented by taking advantage of existing computer programs for right-censored data. [source]
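
    A simplified sketch of the iteration: a uniform draw within each censoring interval stands in for the data-augmentation imputation step, the Cox fit uses lifelines' CoxPHFitter (assumed available), and estimates are combined across imputations with Rubin's rules. Inspection spacing, effect size, and M are invented.

    ```python
    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter

    rng = np.random.default_rng(8)
    n = 150
    x = rng.normal(size=n)
    t_true = rng.exponential(np.exp(-0.5 * x))
    left = np.floor(t_true * 4) / 4          # inspections every 0.25 time units,
    right = left + 0.25                      # so every failure is interval-censored

    betas, variances = [], []
    for _ in range(10):                      # M = 10 imputations
        # Crude stand-in for the data-augmentation draw: uniform within interval.
        t_imp = rng.uniform(np.maximum(left, 1e-6), right)
        df = pd.DataFrame({"T": t_imp, "E": 1, "x": x})
        fit = CoxPHFitter().fit(df, duration_col="T", event_col="E")
        betas.append(fit.params_["x"])
        variances.append(fit.standard_errors_["x"] ** 2)

    # Rubin's rules: total variance = within + (1 + 1/M) * between.
    M = len(betas)
    beta_hat = np.mean(betas)
    total_var = np.mean(variances) + (1 + 1 / M) * np.var(betas, ddof=1)
    print(beta_hat, np.sqrt(total_var))      # pooled estimate and standard error
    ```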