Posttransplant Survival

Selected Abstracts


Kidney and Pancreas Transplantation in the United States, 1999–2008: The Changing Face of Living Donation

AMERICAN JOURNAL OF TRANSPLANTATION, Issue 4p2 2010
D. A. Axelrod
The waiting list for kidney transplantation continued to grow between 1999 and 2008, from 41 177 to 76 089 candidates. However, active candidates represented the minority of this increase (36 951–50 624, a 37% change), while inactive candidates increased over 500% (4226–25 465). There were 5966 living donor (LD) and 10 551 deceased donor (DD) kidney transplants performed in 2008. The total number of pancreas transplants peaked at 1484 in 2004 and has declined to 1273. Although the number of LD transplants increased by 26% from 1999 to 2008, the total number peaked in 2004 at 6647 before declining 10% by 2008. The rate of LD transplantation continues to vary significantly as a function of demographic and geographic factors, including waiting time for DD transplant. Posttransplant survival remains excellent, and there appears to be greater use of induction agents and reduced use of corticosteroids in LD recipients. Significant changes occurred in the pediatric population, with a dramatic reduction in the use of LD organs after passage of the Share 35 rule. Many strategies have been adopted to reverse the decline in LD transplant rates for all age groups, including expansion of kidney paired donation, adoption of laparoscopic donor nephrectomy and use of incompatible LDs. [source]


Outcomes of liver transplantation in patients with cirrhosis due to nonalcoholic steatohepatitis versus patients with cirrhosis due to alcoholic liver disease

LIVER TRANSPLANTATION, Issue 12 2009
Vishal Bhagat
Nonalcoholic steatohepatitis (NASH) is becoming a common cause of liver cirrhosis requiring liver transplantation (LT). Cardiovascular complications related to metabolic syndrome and NASH recurrence in the transplanted liver may affect the outcome of LT in these patients. We compared the outcomes of LT for NASH cirrhosis and alcoholic cirrhosis (ETOH) in a large transplant center. A retrospective chart review was performed for all patients who underwent LT for cryptogenic cirrhosis with the NASH phenotype (the NASH group) or ETOH (the ETOH group) at the University of Miami from January 1997 to January 2007. There was no significant difference in survival between the NASH and ETOH groups, despite a trend toward lower survival in the former (P = 0.1699). Sepsis was the leading cause of posttransplant death in both groups, and it was followed by cardiovascular causes in the NASH group (26% versus 7% in the ETOH group, P = 0.21) and malignancies in the ETOH group (29% versus 0% in the NASH group, P = 0.024). Recurrent steatohepatitis (33% versus 0%, P < 0.0001) and acute rejection (41% versus 23%, P < 0.023) were significantly more frequent in the NASH group than in the ETOH group. There was no difference in graft failure between the groups (24% in the NASH group versus 18% in the ETOH group, P = 0.3973). In conclusion, despite a numerical trend favoring the ETOH group, there were no statistically significant differences in posttransplant survival and cardiovascular mortality between the NASH and ETOH groups. Acute rejection and recurrent steatohepatitis were significantly more frequent in the NASH group but did not lead to higher rates of retransplantation. Liver Transpl 15:1814–1820, 2009. © 2009 AASLD. [source]


MELD and prediction of post–liver transplantation survival

LIVER TRANSPLANTATION, Issue 3 2006
Shahid Habib
The model for end-stage liver disease (MELD) was developed to predict short-term mortality in patients with cirrhosis. It has since become the standard tool to prioritize patients for liver transplantation. We assessed the value of pretransplant MELD in the prediction of posttransplant survival. We identified adult patients who underwent liver transplantation at our institution during 1991–2002. Among 2,009 recipients, 1,472 met the inclusion criteria. Based on pretransplant MELD scores, recipients were stratified as low risk (≤15), medium risk (16–25), and high risk (>25). The primary endpoints were patient and graft survival. Mean posttransplant follow-up was 5.5 years. One-, 5- and 10-year patient survival was 83%, 72%, and 58%, respectively, and graft survival was 76%, 65%, and 53%, respectively. In univariable analysis, patient and donor age, patient sex, MELD score, disease etiology, and retransplantation were associated with posttransplantation patient and graft survival. In multivariable analysis adjusted for year of transplantation, patient age >65 years, donor age >50 years, male sex, retransplantation, and pretransplant MELD scores >25 were associated with poor patient and graft survival. The impact of MELD score >25 was maximal during the first year posttransplant. In conclusion, older patient and donor age, male sex of recipient, retransplantation, and high pretransplant MELD score are associated with poor posttransplant outcome. Pretransplant MELD scores correlate inversely with posttransplant survival. However, better prognostic models are needed that would provide an overall assessment of transplant benefit relative to the severity of hepatic dysfunction. Liver Transpl 12:440–447, 2006. © 2006 AASLD. [source]
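The study's risk stratification reduces to simple score cutoffs. As a minimal illustrative sketch (the function name and labels are our own, not the authors' code):

```python
# Pretransplant MELD risk strata from the abstract above:
# low risk (<=15), medium risk (16-25), high risk (>25).
# Illustrative only; names are invented for this sketch.

def meld_risk_group(meld_score: int) -> str:
    """Map a pretransplant MELD score to the study's risk stratum."""
    if meld_score <= 15:
        return "low"
    elif meld_score <= 25:
        return "medium"
    return "high"

print(meld_risk_group(12))  # low
print(meld_risk_group(25))  # medium
print(meld_risk_group(31))  # high
```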


Increasing Lung Allocation Scores Predict Worsened Survival Among Lung Transplant Recipients

AMERICAN JOURNAL OF TRANSPLANTATION, Issue 4 2010
V. Liu
Implemented in 2005, the lung allocation score (LAS) aims to distribute donor organs based on overall survival benefits for all potential recipients, rather than on waiting list time accrued. While prior work has shown that patients with scores greater than 46 are at increased risk of death, it is not known whether that risk is equivalent among such patients when stratified by LAS score and diagnosis. We retrospectively evaluated 5331 adult lung transplant recipients from May 2005 to February 2009 to determine the association of LAS (groups based on scores of ≤46, 47–59, 60–79 and ≥80) and posttransplant survival. When compared with patients with LAS ≤46, only those with LAS ≥60 had an increased risk of death (LAS 60–79: hazard ratio [HR], 1.52; 95% confidence interval [CI], 1.21–1.90; LAS ≥80: HR, 2.03; CI, 1.61–2.55; p < 0.001) despite shorter median waiting list times. This risk persisted after adjusting for age, diagnosis, transplant center volume and donor characteristics. By specific diagnosis, an increased hazard was observed in patients with COPD with LAS ≥80, as well as those with IPF with LAS ≥60. [source]


Liver Transplantation in the United States, 1999–2008

AMERICAN JOURNAL OF TRANSPLANTATION, Issue 4p2 2010
P. J. Thuluvath
Changes in organ allocation policy in 2002 reduced the number of adult patients on the liver transplant waiting list, changed the characteristics of transplant recipients and increased the number of patients receiving simultaneous liver–kidney transplantation (SLK). The number of liver transplants peaked in 2006 and declined marginally in 2007 and 2008. During this period, there was an increase in donor age, the Donor Risk Index, the number of candidates receiving MELD exception scores and the number of recipients with hepatocellular carcinoma. In contrast, there was a decrease in retransplantation rates and in the number of patients receiving grafts from either a living donor or donation after cardiac death. The proportion of patients with severe obesity, diabetes and renal insufficiency increased during this period. Despite increases in donor and recipient risk factors, there was a trend towards better 1-year graft and patient survival between 1998 and 2007. Of major concern, however, were considerable regional variations in waiting time and posttransplant survival. The current status of liver transplantation in the United States between 1999 and 2008 was analyzed using SRTR data. In addition to a general summary, we have included a more detailed analysis of liver transplantation for hepatitis C, retransplantation and SLK transplantation. [source]


Heart Transplantation in the United States, 1999–2008

AMERICAN JOURNAL OF TRANSPLANTATION, Issue 4p2 2010
M. R. Johnson
This article features 1999–2008 trends in heart transplantation, as seen in data from the Organ Procurement and Transplantation Network (OPTN) and the Scientific Registry of Transplant Recipients (SRTR). Despite a 32% decline in actively listed candidates over the decade, there was a 20% increase from 2007 to 2008. There continues to be an increase in listed candidates diagnosed with congenital heart disease or retransplantation. The proportion of patients listed as Status 1A and 1B continues to increase, with a decrease in Status 2 listings. Waiting list mortality decreased from 2000 through 2007, but increased 18% from 2007 to 2008; despite the increase in waiting list death rates in 2008, waiting list mortality for Status 1A and Status 1B continues to decrease. Recipient numbers have varied by 10% over the past decade, with an increased proportion of transplants performed in infants and patients above 65 years of age. Despite the increase in Status 1A and Status 1B recipients at transplant, posttransplant survival has continued to improve. With the rise in infant candidates for transplantation and their high waiting list mortality, better means of supporting infants in need of transplant and of allocating organs to infant candidates are clearly needed. [source]


Harm and Benefits of Primary Liver Resection and Salvage Transplantation for Hepatocellular Carcinoma

AMERICAN JOURNAL OF TRANSPLANTATION, Issue 3 2010
A. Cucchetti
Primary transplantation offers longer life expectancy than hepatic resection (HR) for hepatocellular carcinoma (HCC) followed by salvage transplantation. However, livers not used for primary transplantation can be reallocated to the remaining waiting-list patients; thus, the harm caused to resected patients could be balanced, or outweighed, by the benefit obtained from reallocating the livers freed when HCC patients are first resected. A Markov model was developed to investigate this issue, based on literature data or estimates from the United Network for Organ Sharing database. The Markov model shows that primary transplantation offers longer life expectancy than HR and salvage transplantation if 5-year posttransplant survival remains higher than 60%. The balance between the harm for resected patients and the benefit for the remaining waiting list depends on (a) the proportion of HCC candidates, (b) the percentage shifted to HR and (c) the median expected time-to-transplant. Faced with a low proportion of HCC candidates, the harm caused to resected patients was higher than the benefit that could be obtained for the waiting-list population from reallocation of extra livers. An increased proportion of HCC candidates and/or an increased median time-to-transplant could lead to a benefit for waiting-list patients that outweighs this harm. [source]
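The kind of Markov cohort model used in such analyses can be sketched in a few lines. This is a toy version, not the authors' model: the states and all transition probabilities below are invented for illustration.

```python
# Toy Markov cohort sketch: a cohort moves monthly between three states
# (waiting list, posttransplant, dead); expected life-years accumulate
# from the fraction of the cohort alive each cycle. All probabilities
# are illustrative assumptions, not values from the study.
import numpy as np

# Transition matrix: rows = from-state, cols = to-state
# (waitlist, posttransplant, dead); each row sums to 1.
P = np.array([
    [0.93, 0.05, 0.02],    # waitlist: remain, get transplanted, die
    [0.00, 0.995, 0.005],  # posttransplant: remain alive or die
    [0.00, 0.00, 1.00],    # death is absorbing
])

cohort = np.array([1.0, 0.0, 0.0])  # everyone starts on the waiting list
life_months = 0.0
for _ in range(120):  # 10-year horizon, monthly cycles
    cohort = cohort @ P
    life_months += cohort[0] + cohort[1]  # proportion alive this cycle

print(round(life_months / 12, 2))  # expected life-years over the horizon
```

Comparing the accumulated life-years under alternative transition matrices (e.g. resection-first versus primary transplantation) is what lets such a model weigh harms against reallocation benefits.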


Centers for Disease Control 'High-Risk' Donors and Kidney Utilization

AMERICAN JOURNAL OF TRANSPLANTATION, Issue 2 2010
K. I. Duan
The aims of this study were to determine whether Centers for Disease Control high risk (CDCHR) status of organ donors affects kidney utilization and recipient survival. Data from the Scientific Registry of Transplant Recipients were used to examine utilization rates of 45 112 standard criteria donor (SCD) deceased donor kidneys from January 1, 2005, to February 2, 2009. Utilization rates for transplantation were compared between CDCHR and non-CDCHR kidneys, using logistic regression to control for possible confounders. Cox regression was used to determine whether CDCHR status independently affected posttransplant survival among 25 158 recipients of SCD deceased donor kidneys between January 1, 2005, and February 1, 2008. CDCHR kidneys were 8.2% (95% CI 6.9–9.5) less likely to be used for transplantation than non-CDCHR kidneys; after adjusting for other factors, CDCHR was associated with an odds ratio of utilization of 0.67 (95% CI 0.61–0.74). After a median 2 years of follow-up, recipients of CDCHR kidneys had similar posttransplant survival compared to recipients of non-CDCHR kidneys (hazard ratio 1.06, 95% CI 0.89–1.26). These findings suggest that labeling donor organs as 'high risk' may result in wastage of approximately 41 otherwise standard kidneys per year. [source]


Patient Survival After Kidney Transplantation: Relationship to Pretransplant Cardiac Troponin T Levels

AMERICAN JOURNAL OF TRANSPLANTATION, Issue 6 2009
L. T. J. Hickson
Assessing cardiovascular (CV) risk pretransplant is imprecise. We sought to determine whether cardiac troponin T (cTnT) relates to patient survival posttransplant. The study included 603 adult kidney transplant recipients. In addition to cTnT, dobutamine stress echocardiography and coronary angiography were performed in 45% and 19% of the candidates, respectively. During 28.4 ± 12.9 months of follow-up, 5.6% of patients died or had a major cardiac event. cTnT levels were elevated (>0.01 ng/ml) in 56.2% of patients. Elevated cTnT related to reduced event-free survival (hazard ratio (HR) = 1.81, CI 1.33–2.45, p < 0.0001) whether those events occurred during the first year or beyond. This relationship was statistically independent of all other variables tested, including older age, reduced left ventricular ejection fraction (EF) and delayed graft function. cTnT levels allowed better definition of risk in patients with other CV risk factors. Thus, event-free survival was excellent in older individuals, patients with diabetes, low EF and those with preexisting heart disease if their cTnT levels were normal. However, elevated cTnT together with other CV risk factor(s) identified patients with very poor survival posttransplant. Pretransplant cTnT levels are strong and independent predictors of posttransplant survival. These results suggest that cTnT is quite helpful in CV risk stratification of kidney transplant recipients. [source]


Survival Benefit-Based Deceased-Donor Liver Allocation

AMERICAN JOURNAL OF TRANSPLANTATION, Issue 4p2 2009
D. E. Schaubel
Currently, patients awaiting deceased-donor liver transplantation are prioritized by medical urgency. Specifically, wait-listed chronic liver failure patients are sequenced in decreasing order of Model for End-stage Liver Disease (MELD) score. To maximize lifetime gained through liver transplantation, posttransplant survival should be considered in prioritizing liver waiting list candidates. We evaluate a survival benefit–based system for allocating deceased-donor livers to chronic liver failure patients. Under the proposed system, at the time of offer, the transplant survival benefit score would be computed for each patient active on the waiting list. The proposed score is based on the difference in 5-year mean lifetime (with vs. without a liver transplant) and accounts for patient and donor characteristics. The rank correlation between benefit score and MELD score is 0.67. There is great overlap in the distribution of benefit scores across MELD categories, since waiting list mortality is significantly affected by several factors. Simulation results indicate that over 2000 life-years would be saved per year if benefit-based allocation were implemented. The shortage of donor livers increases the need to maximize the life-saving capacity of procured livers. Allocation of deceased-donor livers to chronic liver failure patients would be improved by prioritizing patients by transplant survival benefit. [source]
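The core of the proposed score, the difference in mean lifetime over a 5-year horizon with versus without a transplant, can be illustrated as restricted mean survival time computed from two survival curves. The sketch below is a simplified stand-in for the paper's covariate-adjusted models; the curves, names and monthly sampling are all assumptions of our own.

```python
# Minimal sketch: transplant survival benefit as the difference in area
# under two survival curves over a 5-year (60-month) horizon.
# Curves here are toy geometric-decay examples, not fitted models.

def restricted_mean_survival(monthly_survival: list[float]) -> float:
    """Trapezoidal area under a survival curve sampled monthly (in months)."""
    rmst = 0.0
    prev = 1.0  # survival probability is 1.0 at time zero
    for s in monthly_survival:
        rmst += (prev + s) / 2.0  # trapezoid over one month
        prev = s
    return rmst

def benefit_score(surv_with_tx: list[float], surv_without_tx: list[float]) -> float:
    """Extra expected months lived over the horizon if transplanted."""
    return restricted_mean_survival(surv_with_tx) - restricted_mean_survival(surv_without_tx)

# Toy curves: slow monthly decline with transplant, faster without.
with_tx = [0.99 ** t for t in range(1, 61)]
without_tx = [0.95 ** t for t in range(1, 61)]
print(round(benefit_score(with_tx, without_tx), 1))  # extra months gained
```

Ranking waiting-list candidates by such a score, rather than by urgency alone, is the allocation change the abstract evaluates.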


D-MELD, a Simple Predictor of Post Liver Transplant Mortality for Optimization of Donor/Recipient Matching

AMERICAN JOURNAL OF TRANSPLANTATION, Issue 2 2009
J. B. Halldorson
Numerous donor and recipient risk factors interact to influence the probability of survival after liver transplantation. We developed a statistic, D-MELD, the product of donor age and preoperative MELD, calculated from laboratory values. Using the UNOS STAR national transplant database, we analyzed survival for first liver transplant recipients with chronic liver failure who received organs from deceased donors after brain death. Preoperative D-MELD score effectively stratified posttransplant survival. Using a cutoff D-MELD score of 1600, we defined a subgroup of donor–recipient matches with significantly poorer short- and long-term outcomes as measured by survival and length of stay (LOS). Avoidance of D-MELD scores above 1600 improved results for subgroups of high-risk patients with donor age ≥60 and those with preoperative MELD ≥30. D-MELD ≥1600 accurately predicted worse outcome in recipients with and without hepatitis C. There is significant regional variation in average D-MELD scores at transplant; however, regions with larger numbers of high D-MELD matches do not have higher survival rates. D-MELD is a simple, highly predictive tool for estimating outcomes after liver transplantation. This statistic could assist surgeons and their patients in making organ acceptance decisions. Applying D-MELD to liver allocation could eliminate many donor/recipient matches likely to have inferior outcome. [source]
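Because D-MELD is just the product of donor age and preoperative MELD with a 1600 cutoff, it is trivial to compute at the time of an organ offer. A minimal sketch (function names are our own, not from the paper):

```python
# D-MELD as described in the abstract above:
# donor age (years) x recipient's preoperative laboratory MELD score,
# with matches at or above 1600 flagged as higher risk.
# Illustrative sketch; function names are invented here.

def d_meld(donor_age: int, recipient_meld: int) -> int:
    """Product of donor age and the recipient's preoperative MELD."""
    return donor_age * recipient_meld

def is_high_risk_match(donor_age: int, recipient_meld: int, cutoff: int = 1600) -> bool:
    """Flag donor-recipient pairings at or above the D-MELD cutoff."""
    return d_meld(donor_age, recipient_meld) >= cutoff

# Example: a 65-year-old donor offered to a recipient with MELD 28.
print(d_meld(65, 28))              # 1820
print(is_high_risk_match(65, 28))  # True
```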


Transplantation Risks and the Real World: What Does 'High Risk' Really Mean?

AMERICAN JOURNAL OF TRANSPLANTATION, Issue 1 2009
R. B. Freeman
Candidates for, and recipients of, transplants face numerous risks that receive varying degrees of attention from the media and transplant professionals. Characterizations such as 'high risk donor' are not necessarily accurate or informative unless they are discussed in context with the other risks patients face before and after transplantation. Moreover, such labels do not provide accurate information for informed consent discussions or decision making. Recent cases of donor-transmitted diseases from donors labeled as being at 'high risk' have engendered concern, new policy proposals and attempts to employ additional testing of donors. The publicity and policy reactions to these cases do not necessarily better inform transplant candidates and recipients about these risks. Using comparative risk analysis, we compare the various risks associated with waiting on the list, accepting donors with various risk characteristics, posttransplant survival and everyday risks we all face in modern life to provide some quantitative perspective on what 'high risk' really means for transplant patients. In our analysis, donor-transmitted disease risks are orders of magnitude less than other transplantation risks and similar to many everyday occupational and recreational risks people readily and willingly accept. These comparisons can be helpful for informing patients and guiding future policy development. [source]


Analysis of recent pediatric orthotopic liver transplantation outcomes indicates that allograft type is no longer a predictor of survival

LIVER TRANSPLANTATION, Issue 8 2008
Natasha S. Becker
Two strategies to increase the donor allograft pool for pediatric orthotopic liver transplantation (OLT) are deceased donor segmental liver transplantation (DDSLT) and living donor liver transplantation (LDLT). The purpose of this study is to evaluate outcomes after use of these alternative allograft types. Data on all OLT recipients between February 2002 and December 2004 less than 12 years of age were obtained from the United Network for Organ Sharing database. The impact of allograft type on posttransplant survival was assessed. The number of recipients was 1260. Of these, 52% underwent whole liver transplantation (WLT), 33% underwent DDSLT, and 15% underwent LDLT. There was no difference in retransplantation rates. Immediate posttransplant survival differed, with WLT patients having improved 30-day patient survival compared to DDSLT and LDLT patients (P = 0.004). Although unadjusted 1-year patient survival was better for WLT versus DDSLT (P = 0.01), after risk adjustment, 1-year patient survival for WLT (94%), DDSLT (91%), and LDLT (93%) was similar (P values > 0.05). Unadjusted allograft survival was better for WLT and LDLT in comparison with DDSLT (P = 0.009 and 0.018, respectively); however, after adjustment, these differences became nonsignificant (all P values > 0.05). For patients ≤2 years of age (n = 833), the adjusted 1-year patient and allograft survival was also similar (all P values > 0.05). In conclusion, in the current era of pediatric liver transplantation, WLT recipients have better immediate postoperative survival. By 1 year, adjusted patient and allograft survival is similar, regardless of the allograft type. Liver Transpl 14:1125–1132, 2008. © 2008 AASLD. [source]