Donor Age (donor + age)

Kinds of Donor Age

  • mean donor age


  • Selected Abstracts


    Pretransplant Diabetes, Not Donor Age, Predicts Long-Term Outcomes in Cardiac Transplantation

    JOURNAL OF CARDIAC SURGERY, Issue 2 2006
    Steven R. Meyer M.D.
Objectives were to review the outcomes of using cardiac donors 50 years of age and older and to identify predictors of outcome at a single institution. Methods: A retrospective analysis of all adult cardiac transplants (n = 338) performed at our institution between 1988 and 2002 was conducted. Results: Of these, 284 patients received hearts from donors <50 years old and 54 received hearts from donors ≥50 years old. Recipients of hearts from older donors had a greater frequency of pretransplant diabetes (19% vs 33%), renal failure (16% vs 30%), and dialysis (3% vs 9%). There were no differences in ICU or postoperative length of stay, days ventilated, or early rejection episodes. Recipients of older donor hearts, however, had increased perioperative mortality (7% vs 17%; p = 0.03). Multivariate analysis identified older donors (OR 2.599; p = 0.03) and donor ischemia time (OR 1.006; p = 0.002) as significant predictors of perioperative mortality. Actuarial survival at 1 (87% vs 74%), 5 (76% vs 69%), and 10 (59% vs 58%) years was similar (p = 0.08) for the two groups. Separate multivariate analyses identified pretransplant diabetes as the sole predictor of long-term survival (HR 1.659; p = 0.02) and transplant coronary disease (HR 2.486; p = 0.003). Conclusions: Despite increased perioperative mortality, hearts from donors ≥50 years old may be used with long-term outcomes similar to those of younger donor hearts. This has the potential to expand the donor pool. Pretransplant diabetes has a significant impact on long-term outcomes in cardiac transplantation and requires further investigation. [source]


    Effect of Donor Age on Long-Term Survival Following Cardiac Transplantation

    JOURNAL OF CARDIAC SURGERY, Issue 2 2006
    Veli K. Topkara M.D.
Our objective was to analyze the effect of donor age on outcomes after cardiac transplantation. Methods: We retrospectively studied 864 patients who underwent cardiac transplantation at New York Presbyterian Hospital–Columbia University between 1992 and 2002. Patients were divided into two groups: donor age <40 years (Group A, n = 600) and donor age ≥40 years (Group B, n = 264). Results: Characteristics including gender, body mass index, and cytomegalovirus (CMV) status were significantly different between the two donor age groups. Race, CMV status, toxoplasmosis status, left ventricular assist device prior to transplant, diabetes mellitus, and retransplantation were similar in both recipient groups, while age, gender, and BMI were different. Early mortality was lower in Group A (5% vs 9.5% in Group B). Multivariate analysis revealed recipient female gender (odds ratio (OR) = 1.71), retransplantation (OR = 1.63), and increased donor age (OR = 1.02) as significant predictors of poor survival in the recipient population. Actuarial survival at 1 year (86.7% vs 81%), 5 years (75% vs 65%), and 10 years (56% vs 42%) was significantly different as well, with a log-rank p = 0.002. Conclusions: These findings suggest that increased donor age is an independent predictor of long-term survival. However, the shortage of organs makes it difficult to follow strict guidelines when placing hearts; therefore, decisions need to be made on a relative basis. [source]


    Effects of Donor Age and Cell Senescence on Kidney Allograft Survival

    AMERICAN JOURNAL OF TRANSPLANTATION, Issue 1 2009
    A. Melk
The biological processes responsible for somatic cell senescence contribute to organ aging and progression of chronic diseases, and this may contribute to kidney transplant outcomes. We examined the effect of pre-existing donor aging on the performance of kidney transplants, comparing mouse kidney isografts and allografts from old versus young donors. Before transplantation, old kidneys were histologically normal, but displayed an increased expression of senescence marker p16INK4a. Old allografts at day 7 showed a more rapid emergence of epithelial changes and a further increase in the expression of p16INK4a. Similar but much milder changes occurred in old isografts. These changes were absent in young allografts at day 7, but emerged by day 21. The expression of p16INK4a remained low in young kidney allografts at day 7, but increased with severe rejection at day 21. Isografts from young donors showed no epithelial changes and no increase in p16INK4a. The measurements of the alloimmune response (infiltrate, cytology, expression of perforin, granzyme B, IFN-γ, and MHC) were not increased in old allografts. Thus, old donor kidneys display abnormal parenchymal susceptibility to transplant stresses and enhanced induction of senescence marker p16INK4a, but were not more immunogenic. These data are compatible with a key role of somatic cell senescence mechanisms in kidney transplant outcomes by contributing to donor aging, being accelerated by transplant stresses, and imposing limits on the capacity of the tissue to proliferate. [source]


    The Relationship Between Donor Age and Cadaveric Renal Allograft Survival Is Modified by the Recipient's Blood Pressure

    AMERICAN JOURNAL OF TRANSPLANTATION, Issue 3 2003
    Fernando G. Cosio
Increasing donor age correlates with reduced renal allograft survival. In this study we analyzed variables that may modify this relationship. The study included 1285 cadaveric kidney allograft recipients followed for 7.2 ± 4.5 years. By Cox regression, increasing donor age beyond 30 years was associated with significant increases in the hazard ratio for graft loss [age 31–46: hazard ratio (HR) = 1.4, p = 0.02; 46–60: HR = 1.55, p = 0.008; >60: HR = 1.68, p = 0.03]. Increasing donor age was significantly associated with: older and heavier recipients; higher creatinine and blood pressure (BP) 6 months post-transplant; and lower total cyclosporine dose during the first year. Of interest, the 6-month serum creatinine and the BP level significantly modified the relationship between age and survival. Thus, increasing donor age was significantly related to reduced graft survival only in patients with a 6-month creatinine <2 mg/dL. Furthermore, donor age related significantly to graft survival only among patients with higher BP levels 6 months post-transplant. It is concluded that increasing donor age is associated with reduced cadaveric graft survival, but that relationship is significantly modified by graft function and BP. These data suggest that poorly functioning kidneys have reduced survival irrespective of age. Furthermore, elevated BP levels may have a particularly negative effect on the survival of older grafts. [source]


    Worse recent efficacy of antiviral therapy in liver transplant recipients with recurrent hepatitis C: Impact of donor age and baseline cirrhosis

    LIVER TRANSPLANTATION, Issue 7 2009
    Marina Berenguer
We hypothesized that antiviral efficacy [sustained virologic response (SVR)] has improved in recent years in the transplant setting. Our aim was to assess whether the efficacy of pegylated interferon (PegIFN)-ribavirin (Rbv) has improved over time. One hundred seven liver transplant patients [74% men, 55.5 years old (range: 37.5–69.5), 86% genotype 1a or 1b] were treated with PegIFN-Rbv for 355 (16–623) days at 20.1 (1.7–132.6) months after transplantation. Tacrolimus was used in 61%. Sixty-seven percent had baseline F3–F4 (cirrhosis: 20.5%). Donor age was 49 (12–78) years. SVR was achieved in 39 (36.5%) patients, with worse results achieved in recent years (2001–2003: n = 27, 46.5%; 2004: n = 23, 43.5%; 2005: n = 21, 35%; 2006 to January 2007: n = 36, 24%; P = 0.043). Variables associated with SVR in the univariate analysis included donor age, baseline viremia and cirrhosis, bilirubin levels, rapid virologic response and early virologic response (EVR), premature discontinuation of PegIFN or Rbv, and accumulated Rbv dose. In the multivariate analysis, the variables in the model were EVR [odds ratio (OR): 0.08, 95% confidence interval (CI): 0.016–0.414, P = 0.002] and donor age (OR: 1.039, 95% CI: 1.008–1.071, P = 0.01). Variables that had changed over time included donor age, baseline viremia, disease severity (cirrhosis, baseline bilirubin, and leukocyte and platelet counts), interval between transplantation and therapy, and use of growth factors. In the multivariate analysis, variables independently changing were donor age (OR: 1.041, 95% CI: 1.013–1.071, P = 0.004), duration from transplantation to antiviral therapy (OR: 1.001, 95% CI: 1.000–1.001, P = 0.013), and baseline leukocyte count (OR: 1.000, 95% CI: 1.000–1.000, P = 0.034). In conclusion, the efficacy of antiviral therapy with PegIFN-Rbv has worsened over time, at least in our center. The increase in donor age and the greater proportion of patients treated at advanced stages of disease are potential causes. Liver Transpl 15:738–746, 2009. © 2009 AASLD. [source]


    Recurrent hepatitis C after retransplantation: Factors affecting graft and patient outcome

    LIVER TRANSPLANTATION, Issue 12 2005
    Michal Carmiel-Haggai
Retransplantation (re-LT) of patients with recurrent hepatitis C virus (HCV) carries significant morbidity and mortality, negatively impacting an already scarce donor allograft pool. In this study, we investigated the outcome of allografts and patients after re-LT due to recurrent HCV. Between 1989 and 2002, 47 patients were retransplanted at our institution due to HCV-related graft failure. Clinical HCV recurrence after re-LT was diagnosed when patients had acute liver enzyme elevation correlated with histological recurrence. The independent influence of these variables on survival was tested using a Cox regression model. Chi-squared tests were used to examine the influence of individual demographic and pre/perioperative variables on recurrence. Thirty-one (66%) patients died after re-LT (median 2.2 months). Donor age >60, clinical HCV recurrence, and graft failure due to cirrhosis were significant risk factors for mortality (risk ratios of 3.6, 3.3, and 2.4, respectively). Pre-LT MELD score was lower among survivors (22 ± 5 vs. 27 ± 8). Following re-LT, 38 patients had at least one biopsy due to acute liver dysfunction; 19 of them (50%) had recurrence within the first 3 months. High-dose Solu-Medrol (methylprednisolone) was correlated with early recurrence. No association was found between time of recurrence after the first LT and time of recurrence after re-LT. In conclusion, patients with cirrhosis due to recurrent HCV undergoing re-LT have an extremely high mortality rate; older allografts should be avoided in retransplanting these patients. The timing of clinical recurrence after initial liver transplantation is not predictive of the timing of recurrence after re-LT. Patients experiencing early graft failure due to accelerated forms of HCV should not be denied re-LT with the expectation that a similar disease course will occur after re-LT. (Liver Transpl 2005;11:1567–1573.) [source]


    Donor age and gestational age influence on growth factor levels in human amniotic membrane

    ACTA OPHTHALMOLOGICA, Issue 6 2010
    Maria J. López-Valladares
Acta Ophthalmol. 2010: 88: e211–e216 Abstract. Purpose: Amniotic membrane (AM) is used as a biomaterial for reconstruction in ocular surface surgery. This study investigated the influence of interdonor variations and of the processing and preservation procedures applied to the AM on growth factor and protein levels. Methods: Samples of human AM from thirteen donors were analysed. Collected donor data were age, parity and gestational age. Total protein amount was measured in extracts of intact AM that was nonpreserved, lyophilized or cryopreserved, at −80°C and in liquid nitrogen. An enzyme-linked immunosorbent assay (ELISA) was used to assay protein levels of epidermal growth factor, basic fibroblast growth factor (bFGF), hepatocyte growth factor (HGF), keratinocyte growth factor (KGF), transforming growth factor beta1 (TGF-β1) and nerve growth factor (NGF). Univariate and multivariate statistical analyses were used to study the influence of the preservation method applied and of interdonor variations on growth factor levels. Results: We detected important variations in growth factor and protein concentrations between samples from different donors. Total protein amount, bFGF, HGF, KGF and TGF-β1 showed lower levels in samples from donors with higher gestational ages and donor ages, for all groups. Conclusion: The variability in the biochemical composition of AM from different donors is considerable, and it is related to donor factors such as donor age and gestational age. As the biochemical composition of AM has a role in its therapeutic effects, these variations could affect the clinical results of amniotic membrane transplantation and must be taken into account in donor selection processes. [source]


    Thoracic organ donor characteristics associated with successful lung procurement

    CLINICAL TRANSPLANTATION, Issue 1 2001
    Doff B McElhinney
Purpose: A shortage of suitable donors is the major impediment to clinical lung transplantation. The rate of lung recovery from potential donors is lower than that for other organs. The purpose of this study was to evaluate which factors could be modified to improve the rate of cadaver lung recovery. Methods: We performed a retrospective review of records from all thoracic organ donors procured by the California Transplant Donor Network between 1 January 1995 and 31 May 1997 (251 donors) to determine which donor management factors were associated with an increased likelihood of successful lung procurement. Results: There were 88 lung donors (group L) and 163 donors from which hearts but no lungs were procured (group H). A longer time to donor network referral was associated with a reduced chance of successful lung procurement. Donor age, cause of death, and time of admission were not important factors. Most donors in this study had an acceptable A-a gradient at admission to the hospital, but lung function deteriorated in group H. Corticosteroid usage and initially clear breath sounds were independent predictors of successful procurement by multivariate analysis. Conclusions: Early contact with the donor referral network and corticosteroid use may help to improve the lung procurement rate from potential donors. [source]


    Liver stiffness identifies two different patterns of fibrosis progression in patients with hepatitis C virus recurrence after liver transplantation

    HEPATOLOGY, Issue 1 2010
    José A. Carrión
Significant liver fibrosis (F ≥ 2) and portal hypertension (hepatic venous pressure gradient [HVPG] ≥ 6 mmHg) at 1 year after liver transplantation (LT) identify patients with severe hepatitis C recurrence. We evaluated whether repeated liver stiffness measurements (LSM) following LT can discriminate between slow and rapid "fibrosers" (fibrosis stage F2-F4 at 1 year after LT). Eighty-four patients who had undergone LT and who were infected with hepatitis C virus (HCV) and 19 LT controls who were not infected with HCV underwent LSM at 3, 6, 9, and 12 months after LT. All HCV-infected patients underwent liver biopsy 12 months after LT (paired HVPG measurements in 74); 31 (37%) were rapid fibrosers. Median LSM (in kilopascals) at months 6, 9, and 12 were significantly higher in rapid fibrosers (9.9, 9.5, 12.1) than in slow fibrosers (6.9, 7.5, 6.6) (P < 0.01 at all time points). The slope of liver stiffness progression (kPa/month) in rapid fibrosers (0.42) was significantly greater than in slow fibrosers (0.05) (P < 0.001), suggesting two different speeds of liver fibrosis progression. Figures were almost identical for patients with HVPG ≥ 6 mmHg or HVPG < 6 mmHg at 1 year after LT. Multivariate analysis identified donor age, bilirubin level, and LSM as independent predictors of fibrosis progression and portal hypertension in the estimation group (n = 50), and these were validated in a second group of 34 patients. The areas under the receiver operating characteristic curve that could identify rapid fibrosers and patients with portal hypertension as early as 6 months after LT were 0.83 and 0.87, respectively, in the estimation group and 0.75 and 0.80, respectively, in the validation group. Conclusion: Early and repeated LSM following hepatitis C recurrence, in combination with clinical variables, discriminates between rapid and slow fibrosers after LT. (HEPATOLOGY 2009.) [source]


    Biliary reconstruction using non-penetrating, tissue everting clips versus conventional sewn biliary anastomosis in liver transplantation

    HPB, Issue 2 2006
    K. Tyson Thomas
Background. Biliary complications occur following approximately 25% of liver transplantations. Efforts to decrease biliary complications include methods designed to diminish tissue ischemia. Previously, we reported excellent short-term results and decreased biliary anastomosis time in a porcine liver transplant model using non-penetrating, tissue everting clips (NTEC), specifically VCS® clips. Methods. We examined the incidence of biliary anastomotic complications in a group of patients in whom orthotopic liver transplantation was performed with biliary reconstruction using NTEC and compared that group to a matched group treated with biliary reconstruction via conventional end-to-end sewn choledochocholedochostomy. Patients were matched in a 1:2 fashion by age at transplantation, disease etiology, Child-Turcotte-Pugh scores, MELD score or UNOS status (prior to 1998), cold and warm ischemia times, organ donor age, and date of transplantation. Results. Seventeen patients had clipped anastomoses and 34 comparison patients had conventional sewn anastomoses. There were no differences between groups in terms of baseline clinical or demographic data. The median time from completion of the hepatic artery anastomosis to completion of the clipped versus conventional sewn biliary anastomosis was 45 min (interquartile range = 20 min) versus 47 min (interquartile range = 23 min), respectively (p = 0.12). Patients were followed for a mean of 29 months. Biliary anastomotic complications, including leak or anastomotic stricture, were observed in 18% of the clipped group and 24% of the conventional sewn group. Conclusions. Biliary reconstruction can be performed clinically using NTEC as an alternative to conventional sewn biliary anastomoses with good results. [source]


    BK virus nephropathy: Clinical experience in a university hospital in Japan

    INTERNATIONAL JOURNAL OF UROLOGY, Issue 12 2009
    Tatsuya Takayama
Objectives: To review the medical records of patients with BK virus nephropathy (BKVN) following kidney transplantation in our institution. Methods: We screened patients for decoy cells using urine cytology, assessed serum creatinine levels, and conducted a graft biopsy, as well as assessed the presence of plasma BK virus DNA by quantitative real-time polymerase chain reaction. The treatment of BKVN was based on decreased use of immunosuppressants. Results: Overall, six male patients were studied (mean age 40.8 years, range 18–58; mean donor age 45.2 years, range 15–67). A positive urine cytology screen led to the subsequent detection of plasma BK virus DNA in the five patients with urine cytology results positive for decoy cells. In the four patients in whom plasma BK virus DNA was detected, a maximum DNA value of ≥10 000 copies/mL was observed. Time elapsed from transplantation to BKVN diagnosis ranged from 3 to 62 months. Although the two cadaver grafts were lost, the loss was not due to any effects directly associated with BKVN. The other four grafts are still functioning with a mean creatinine level of 1.8 mg/dL. Most of the patients with BKVN were regarded as being in a state of heightened immunosuppression. BK virus transition to blood was prevented in one patient. Conclusions: Early diagnosis of BKV infection with reduction of immunosuppression may potentially counter BK viremia and retard progression of BKV nephropathy. Decoy cell screening by urine cytology as well as plasma BK virus DNA screening should be considered in addition to the required graft biopsy in kidney transplant recipients, particularly in those with impaired graft function. [source]


    Complete robotic-assistance during laparoscopic living donor nephrectomies: An evaluation of 38 procedures at a single site

    INTERNATIONAL JOURNAL OF UROLOGY, Issue 11 2007
    Jacques Hubert
    Objective: To evaluate our initial experience with entirely robot-assisted laparoscopic live donor (RALD) nephrectomies. Methods: From January 2002 to April 2006, we carried out 38 RALD nephrectomies at our institution, using four ports (three for the robotic arms and one for the assistant). The collateral veins were ligated, and the renal arteries and veins clipped, after completion of ureteral and renal dissection. The kidney was removed via a suprapubic Pfannenstiel incision. A complementary running suture was carried out on the arterial stump to secure the hemostasis. Results: Mean donor age was 43 years. All nephrectomies were carried out entirely laparoscopically, without complications and with minimal blood loss. Mean surgery time was 181 min. Average warm ischemia and cold ischemia times were 5.84 min and 180 min, respectively. Average donor hospital stay was 5.5 days. None of the transplant recipients had delayed graft function. Conclusions: Robot-assisted laparoscopic live donor nephrectomy can be safely carried out. Robotics enhances the laparoscopist's skills, enables the surgeon to dissect meticulously and to prevent problematic bleeding more easily. Donor morbidity and hospitalization are reduced by the laparoscopic approach and the use of robotics allows the surgeon to work under better ergonomic conditions. [source]


    Gene expression profiles associated with aging and mortality in humans

    AGING CELL, Issue 3 2009
    Richard A. Kerber
Summary We investigated the hypothesis that gene expression profiles in cultured cell lines from adults, aged 57–97 years, contain information about the biological age and potential longevity of the donors. We studied 104 unrelated grandparents from 31 Utah CEU (Centre d'Etude du Polymorphisme Humain, Utah) families, for whom lymphoblastoid cell lines were established in the 1980s. Combining publicly available gene expression data from these cell lines and survival data from the Utah Population Database, we tested the relationship between the expression of 2151 always-expressed genes, age, and survival of the donors. Approximately 16% of the 2151 expression levels were associated with donor age: 10% decreased in expression with age, and 6% increased with age. Cell division cycle 42 (CDC42) and CORO1A exhibited strong associations both with age at draw and survival after draw (multiple comparisons-adjusted Monte Carlo P-value < 0.05). In general, gene expressions that increased with age were associated with increased mortality. Gene expressions that decreased with age were generally associated with reduced mortality. A multivariate estimate of biological age modeled from expression data was dominated by CDC42 expression and was a significant predictor of survival after blood draw. A multivariate model of survival as a function of gene expression was dominated by CORO1A expression. This model accounted for approximately 23% of the variation in survival among the CEU grandparents. Some expression levels were negligibly associated with age in this cross-sectional dataset, but strongly associated with inter-individual differences in survival. These observations may lead to new insights regarding the genetic contribution to exceptional longevity. [source]


    Cellular senescence in pretransplant renal biopsies predicts postoperative organ function

    AGING CELL, Issue 1 2009
    Liane M. McGlynn
    Summary Older and marginal donors have been used to meet the shortfall in available organs for renal transplantation. Post-transplant renal function and outcome from these donors are often poorer than chronologically younger donors. Some organs, however, function adequately for many years. We have hypothesized that such organs are biologically younger than poorer performing counterparts. We have tested this hypothesis in a cohort of pre-implantation human renal allograft biopsies (n = 75) that have been assayed by real-time polymerase chain reaction for the expression of known markers of cellular damage and biological aging, including CDKN2A, CDKN1A, SIRT2 and POT1. These have been investigated for any associations with traditional factors affecting transplant outcome (donor age, cold ischaemic time) and organ function post-transplant (serum creatinine levels). Linear regression analyses indicated a strong association for serum creatinine with pre-transplant CDKN2A levels (p = 0.001) and donor age (p = 0.004) at 6 months post-transplant. Both these markers correlated significantly with urinary protein to creatinine ratios (p = 0.002 and p = 0.005 respectively), an informative marker for subsequent graft dysfunction. POT1 expression also showed a significant association with this parameter (p = 0.05). Multiple linear regression analyses for CDKN2A and donor age accounted for 24.6% (p = 0.001) of observed variability in serum creatinine levels at 6 months and 23.7% (p = 0.001) at 1 year post-transplant. Thus, these data indicate that allograft biological age is an important novel prognostic determinant for renal transplant outcome. [source]


    The biopsied donor liver: Incorporating macrosteatosis into high-risk donor assessment

    LIVER TRANSPLANTATION, Issue 7 2010
    Austin L. Spitzer
To expand the donor liver pool, ways are sought to better define the limits of marginally transplantable organs. The Donor Risk Index (DRI) lists 7 donor characteristics, together with cold ischemia time and location of the donor, as risk factors for graft failure. We hypothesized that donor hepatic steatosis is an additional independent risk factor. We analyzed the Scientific Registry of Transplant Recipients for all adult liver transplants performed from October 1, 2003, through February 6, 2008, with grafts from deceased donors to identify donor characteristics and procurement logistics parameters predictive of decreased graft survival. A proportional hazard model of donor variables, including percent steatosis from higher-risk donors, was created with graft survival as the primary outcome. Of 21,777 transplants, 5051 donors had percent macrovesicular steatosis recorded on donor liver biopsy. Compared to the 16,726 donors with no recorded liver biopsy, the donors with biopsied livers had a higher DRI, were older and more obese, and a higher percentage died from anoxia or stroke than from head trauma. The donors whose livers were biopsied became our study group. Factors most strongly associated with graft failure at 1 year after transplantation with livers from this high-risk donor group were donor age, donor liver macrovesicular steatosis, cold ischemia time, and donation after cardiac death status. In conclusion, in a high-risk donor group, macrovesicular steatosis is an independent risk factor for graft survival, along with other factors of the DRI including donor age, donor race, donation after cardiac death status, and cold ischemia time. Liver Transpl 16:874–884, 2010. © 2010 AASLD. [source]


    Survival after orthotopic liver transplantation: The impact of antibody against hepatitis B core antigen in the donor

    LIVER TRANSPLANTATION, Issue 10 2009
    Lei Yu
Liver transplantation using grafts from donors with antibody against hepatitis B core antigen (anti-HBc) increases the recipients' risk of developing hepatitis B virus (HBV) infection post-transplantation. Our aim was to assess whether using such grafts was associated with reduced posttransplantation survival and whether this association depended on recipients' prior exposure to HBV on the basis of their pretransplantation serological patterns. Data were derived from the United Network for Organ Sharing on adult, cadaveric, first-time liver transplants performed between 1994 and 2006. Among recipients who did not have HBV infection before transplantation, those with anti-HBc-positive donors had significantly worse unadjusted posttransplantation patient survival than recipients with anti-HBc-negative donors [hazard ratio, 1.35; 95% confidence interval (CI), 1.21-1.50]. However, after adjustments for other predictors of posttransplantation survival, including donor age, donor race, and recipient underlying liver diseases, patient survival was not significantly different between the 2 groups (hazard ratio, 1.09; 95% CI, 0.97-1.24). Among recipients without antibody against hepatitis B surface antigen (anti-HBs), use of anti-HBc-positive donor grafts was associated with a trend toward worse survival (adjusted hazard ratio, 1.18; 95% CI, 0.95-1.46), whereas no such trend was observed among recipients positive for anti-HBs. In conclusion, in patients without HBV infection before transplantation, using anti-HBc-positive donors was not independently associated with worse posttransplantation survival. Matching these donors to recipients with anti-HBs pre-transplantation may be especially safe. Liver Transpl 15:1343–1350, 2009. © 2009 AASLD. [source]


    ABO-incompatible deceased donor liver transplantation in the United States: A national registry analysis,

    LIVER TRANSPLANTATION, Issue 8 2009
    Zoe A. Stewart
In the United States, ABO-incompatible liver transplantation (ILT) is limited to emergent situations when ABO-compatible liver transplantation (CLT) is unavailable. We analyzed the United Network for Organ Sharing database of ILT performed from 1990-2006 to assess ILT outcomes for infant (0-1 years; N = 156), pediatric (2-17 years; N = 170), and adult (> 17 years; N = 667) patients. Since 2000, the number of ILT has decreased annually, and there has been decreased use of blood type B donors and increased use of blood type A donors. Furthermore, ILT graft survival has improved for all age groups in recent years, beyond the improved graft survival attributable to era effect based on comparison to respective age group CLT. On matched control analysis, graft survival was significantly worse for adult ILT as compared to adult CLT. However, infant and pediatric ILTs did not have worse graft survival versus age-matched CLT. Adjusted analyses identified age-specific characteristics impacting ILT graft loss. For infants, transplant after 2000 and donor age < 9 years were associated with reduced risk of ILT graft loss. For pediatric patients, female recipient sex and donor age > 50 years were associated with increased risk of ILT graft loss. For adults, life support, repeat transplant, split grafts, and hepatocellular carcinoma were associated with increased risk of ILT graft loss. The current study identifies important trends in ILT in the United States in the modern immunosuppression era, as well as specific recipient, donor, and graft characteristics impacting ILT graft survival that could be utilized to guide ILT organ allocation in exigent circumstances. Liver Transpl 15:883–893, 2009. © 2009 AASLD. [source]


    Extended right liver grafts obtained by an ex situ split can be used safely for primary and secondary transplantation with acceptable biliary morbidity

    LIVER TRANSPLANTATION, Issue 7 2009
    Atsushi Takebe
    Split liver transplantation (SLT) is clearly beneficial for pediatric recipients. However, the increased risk of biliary complications in adult recipients of SLT in comparison with whole liver transplantation (WLT) remains controversial. The objective of this study was to investigate the incidence and clinical outcome of biliary complications in an SLT group using split extended right grafts (ERGs) after ex situ splitting in comparison with WLT in adults. The retrospectively collected data for 80 consecutive liver transplants using ERGs after ex situ splitting between 1998 and 2007 were compared with the data for 80 liver transplants using whole liver grafts in a matched-pair analysis paired by the donor age, recipient age, indications, Model for End-Stage Liver Disease score, and high-urgency status. The cold ischemic time was significantly longer in the SLT group (P = 0.006). As expected, bile leakage from the transected surface occurred only in the SLT group (15%) without any mortality or graft loss. The incidence of all other early or late biliary complications (eg, anastomotic leakage and stenosis) was not different between SLT and WLT. The 1- and 5-year patient and graft survival rates showed no statistical difference between SLT and WLT [83.2% and 82.0% versus 88.5% and 79.8% (P = 0.92) and 70.8% and 67.5% versus 83.6% and 70.0% (P = 0.16), respectively]. In conclusion, ERGs can be used safely without any increased mortality and with acceptable morbidity, and they should also be considered for retransplantation. The significantly longer cold ischemic time in the SLT group indicates the potential for improved results and should thus be considered in the design of allocation policies. Liver Transpl 15:730-737, 2009. © 2009 AASLD. [source]


    Worse recent efficacy of antiviral therapy in liver transplant recipients with recurrent hepatitis C: Impact of donor age and baseline cirrhosis

    LIVER TRANSPLANTATION, Issue 7 2009
    Marina Berenguer
    We hypothesized that antiviral efficacy [sustained virologic response (SVR)] has improved in recent years in the transplant setting. Our aim was to assess whether the efficacy of pegylated interferon (PegIFN)-ribavirin (Rbv) has improved over time. One hundred seven liver transplant patients [74% men, 55.5 years old (range: 37.5-69.5), 86% genotype 1a or 1b] were treated with PegIFN-Rbv for 355 (16-623) days at 20.1 (1.7-132.6) months after transplantation. Tacrolimus was used in 61%. Sixty-seven percent had baseline F3-F4 (cirrhosis: 20.5%). Donor age was 49 (12-78) years. SVR was achieved in 39 (36.5%) patients, with worse results achieved in recent years (2001-2003: n = 27, 46.5%; 2004: n = 23, 43.5%; 2005: n = 21, 35%; 2006 to January 2007: n = 36, 24%; P = 0.043). Variables associated with SVR in the univariate analysis included donor age, baseline viremia and cirrhosis, bilirubin levels, rapid virologic response and early virologic response (EVR), premature discontinuation of PegIFN or Rbv, and accumulated Rbv dose. In the multivariate analysis, the variables in the model were EVR [odds ratio (OR): 0.08, 95% confidence interval (CI): 0.016-0.414, P = 0.002] and donor age (OR: 1.039, 95% CI: 1.008-1.071, P = 0.01). Variables that had changed over time included donor age, baseline viremia, disease severity (cirrhosis, baseline bilirubin, and leukocyte and platelet counts), interval between transplantation and therapy, and use of growth factors. In the multivariate analysis, variables independently changing were donor age (OR: 1.041, 95% CI: 1.013-1.071, P = 0.004), duration from transplantation to antiviral therapy (OR: 1.001, 95% CI: 1.000-1.001, P = 0.013), and baseline leukocyte count (OR: 1.000, 95% CI: 1.000-1.000, P = 0.034). In conclusion, the efficacy of antiviral therapy with PegIFN-Rbv has worsened over time, at least in our center. The increase in donor age and greater proportion of patients treated at advanced stages of disease are potential causes. Liver Transpl 15:738-746, 2009. © 2009 AASLD. [source]


    Corticosteroid-free immunosuppression with daclizumab in HCV+ liver transplant recipients: 1-year interim results of the HCV-3 study

    LIVER TRANSPLANTATION, Issue 11 2007
    Goran B.G. Klintmalm
    This work is a 1-yr interim analysis of a prospective, randomized, multicenter trial evaluating the effect of corticosteroid-free immunosuppression on hepatitis C virus-positive (HCV+) liver transplant recipients following liver transplantation (LT). Patients received tacrolimus and corticosteroids (Arm 1; n = 80); tacrolimus, corticosteroids, and mycophenolate mofetil (MMF) (Arm 2; n = 79); or daclizumab induction, tacrolimus, and MMF (Arm 3; n = 153). At 1 yr, 64.1%, 63.4%, and 69.4% of patients achieved the composite primary endpoint of freedom from rejection, freedom from HCV recurrence, and freedom from treatment failure, respectively. Excellent patient and graft survival did not differ significantly among treatment arms. Freedom from HCV recurrence at 1 yr was 61.8 ± 6.2%, 60.1 ± 6.1%, and 67.0 ± 4.3% in Arms 1, 2, and 3, respectively (P = not significant). Freedom from rejection was significantly higher in Arm 3 compared to Arm 1 (93.0 ± 2.2% vs. 81.9 ± 4.4%; P = 0.011). Multivariate analysis identified acute rejection (hazard ratio = 2.692; P = 0.001) and donor age (hazard ratio = 1.015; P = 0.001) as significant risk factors for HCV recurrence. HCV recurrence was not influenced by recipient demographics, HCV genotype, or immunosuppression. In conclusion, these results suggest that a corticosteroid-free regimen of tacrolimus and MMF following daclizumab induction is safe and effective in HCV+ liver transplant recipients. Liver Transpl 13:1521-1531, 2007. © 2007 AASLD. [source]


    Liver transplantation for HCV cirrhosis: Improved survival in recent years and increased severity of recurrent disease in female recipients: Results of a long-term retrospective study

    LIVER TRANSPLANTATION, Issue 5 2007
    Luca S. Belli
    In recent years, a worsening outcome of hepatitis C virus (HCV)-positive recipients and a faster progression of recurrent disease to overt cirrhosis has been reported. Our aims were to 1) assess patient survival and development of severe recurrent disease (Ishak fibrosis score > 3) in different transplant years; and 2) model the effects of pre- and post-liver transplantation (LT) variables on the severity of recurrent disease. A multicenter retrospective analysis was conducted on 502 consecutive HCV-positive transplant recipients between January 1990 and December 2002. Protocol liver biopsies were obtained at 1, 3, 5, 7, and 10 yr post-LT in almost 90% of the patients. All 502 patients were included in the overall survival analysis, while only the 354 patients with a follow-up longer than 1 yr were considered for the analysis of predictors of disease progression. The overall Kaplan-Meier survival rates were 78.7%, 66.3%, and 58.6%, at 12, 60, and 120 months, respectively, and a trend for a better patient survival over the years emerged from all 3 centers. The cumulative probability of developing HCV-related recurrent severe fibrosis (Ishak score 4-6) in the cohort of 354 patients who survived at least 1 yr remained unchanged over the years. Multivariate analysis indicated that older donors (P = 0.0001) and female gender of recipient (P = 0.02) were the 2 major risk factors for the development of severe recurrent disease, while the adoption of antilymphocytic preparations was associated with a less aggressive course (P = 0.03). Two of these prognostic factors, donor age and recipient gender, are easily available before LT and their combination showed an important synergy, such that a female recipient not only had a much higher probability of severe recurrent disease than a male recipient but her risk increased with the increasing age of the donor, reaching almost 100% when the age of the donor was 60 or older. 
In conclusion, a trend for a better patient survival was observed in more recent years but the cumulative probability of developing severe recurrent disease remained unchanged. The combination of a female recipient receiving an older graft emerged as a strong risk factor for a severe recurrence. Liver Transpl, 2007. © 2007 AASLD. [source]
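    Several abstracts on this page quote Kaplan-Meier survival rates at fixed time points. For reference, the product-limit estimator behind those figures can be sketched in a few lines of dependency-free Python (function and variable names are ours, and the toy data below is purely illustrative):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier product-limit survival estimate.

    times  - follow-up time for each patient
    events - 1 if the event (e.g. death) occurred at that time, 0 if censored
    Returns a list of (time, survival_probability) steps at each event time.
    """
    data = sorted(zip(times, events))   # order patients by follow-up time
    n_at_risk = len(data)
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = 0
        j = i
        # group tied observations at the same time point
        while j < len(data) and data[j][0] == t:
            deaths += data[j][1]
            j += 1
        if deaths:
            # multiply by the conditional survival at this event time
            surv *= 1.0 - deaths / n_at_risk
            curve.append((t, surv))
        n_at_risk -= j - i              # both events and censorings leave the risk set
        i = j
    return curve

# Toy example: deaths at t=1, 2, 4; one patient censored at t=3.
steps = kaplan_meier([1, 2, 3, 4], [1, 1, 0, 1])
```

    Censored patients reduce the number at risk without contributing a step, which is why censoring after a long follow-up does not drag the curve down the way an event does.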


    Recipient and donor factors influence the incidence of graft-vs.-host disease in liver transplant patients

    LIVER TRANSPLANTATION, Issue 4 2007
    Edie Y. Chan
    Acute cellular graft-vs.-host disease (GVHD) following liver transplantation has an incidence of 1 to 2% and a mortality rate of 85%. Our aim was to identify a patient population at high risk for developing GVHD using a large clinical database to study both recipient and donor factors. We compared our liver transplant patients who developed GVHD to those that did not for recipient and donor factors and combinations of factors. For 2003-2004 we had 205 first-time liver transplant patients surviving >30 days. From this group, 4 (1.9%) developed GVHD. Compared to the control group, there were no significant differences in recipient age, recipient gender, donor age, donor gender, total ischemia time, donor-recipient human leukocyte antigen (HLA) mismatch, or donor-recipient age difference. Percentages of liver disease etiologies among the patients who developed GVHD were as follows: 16% (1/6) autoimmune hepatitis (AIH) (P = 0.003), 5.6% (3/54) alcoholic liver disease (ALD) (P = 0.057), and 7.1% (3/42) hepatocellular carcinoma (HCC) (P = 0.026). The incidence of GVHD in patients with glucose intolerance (either Type I or Type II diabetes mellitus [DM]) was significant (P = 0.022). Focusing on patients only with high-risk factors for GVHD during the years 2003-2005, we had 19 such patients. Four of these high-risk patients developed GVHD. Three of these 4 patients had received a donor liver with steatosis of degree ≥ mild compared to only 2 of the 15 high-risk patients who did not develop GVHD (P = 0.037). In conclusion, we have identified liver transplant patients with AIH or the combination of ALD, HCC, and glucose intolerance who receive a steatotic donor liver as being at high risk for developing GVHD. Liver Transpl 13:516-522, 2007. © 2007 AASLD. [source]


    MELD and prediction of post,liver transplantation survival

    LIVER TRANSPLANTATION, Issue 3 2006
    Shahid Habib
    The model for end-stage liver disease (MELD) was developed to predict short-term mortality in patients with cirrhosis. It has since become the standard tool to prioritize patients for liver transplantation. We assessed the value of pretransplant MELD in the prediction of posttransplant survival. We identified adult patients who underwent liver transplantation at our institution during 1991-2002. Among 2,009 recipients, 1,472 met the inclusion criteria. Based on pretransplant MELD scores, recipients were stratified as low risk (≤15), medium risk (16-25), and high risk (>25). The primary endpoints were patient and graft survival. Mean posttransplant follow-up was 5.5 years. One-, 5- and 10-year patient survival was 83%, 72%, and 58%, respectively, and graft survival was 76%, 65%, and 53%, respectively. In univariable analysis, patient and donor age, patient sex, MELD score, disease etiology, and retransplantation were associated with posttransplantation patient and graft survival. In multivariable analysis adjusted for year of transplantation, patient age >65 years, donor age >50 years, male sex, and retransplantation and pretransplant MELD scores >25 were associated with poor patient and graft survival. The impact of MELD score >25 was maximal during the first year posttransplant. In conclusion, older patient and donor age, male sex of recipient, retransplantation, and high pretransplant MELD score are associated with poor posttransplant outcome. Pretransplant MELD scores correlate inversely with posttransplant survival. However, better prognostic models are needed that would provide an overall assessment of transplant benefit relative to the severity of hepatic dysfunction. Liver Transpl 12:440-447, 2006. © 2006 AASLD. [source]
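    The MELD strata in the abstract above derive from the classic laboratory MELD formula, 3.78·ln(bilirubin) + 11.2·ln(INR) + 9.57·ln(creatinine) + 6.43. A minimal Python sketch follows; it floors inputs at 1.0 per the usual convention but omits the UNOS refinements (creatinine cap at 4.0 mg/dL and the dialysis rule), and the function names are ours:

```python
import math

def meld_score(bilirubin_mg_dl, inr, creatinine_mg_dl):
    """Classic laboratory MELD score, rounded to the nearest integer.

    Inputs below 1.0 are floored to 1.0 so each logarithm stays
    non-negative (standard convention).
    """
    bili = max(bilirubin_mg_dl, 1.0)
    inr = max(inr, 1.0)
    cr = max(creatinine_mg_dl, 1.0)
    score = (3.78 * math.log(bili)
             + 11.2 * math.log(inr)
             + 9.57 * math.log(cr)
             + 6.43)
    return round(score)

def meld_risk_stratum(score):
    """Risk strata as used in the abstract above."""
    if score <= 15:
        return "low"
    if score <= 25:
        return "medium"
    return "high"

# Example: bilirubin 3.0 mg/dL, INR 2.0, creatinine 1.5 mg/dL
s = meld_score(3.0, 2.0, 1.5)   # -> 22, i.e. the "medium" stratum
```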


    Ascites after liver transplantation: A mystery

    LIVER TRANSPLANTATION, Issue 5 2004
    Charmaine A. Stewart
    Ascites after liver transplantation, although uncommon, presents a serious clinical dilemma. The hemodynamic changes that support the development of ascites before liver transplantation are resolved after transplant; therefore, persistent ascites (PA) after liver transplantation is unexpected and poorly characterized. The aim of this study was to define the clinical factors associated with PA after liver transplantation. This was a retrospective case-control analysis of patients who underwent liver transplantation at the University of Pennsylvania. PA occurring for more than 3 months after liver transplantation was confirmed by imaging studies. PA was correlated with multiple recipient and donor variables, including etiology of liver disease, preoperative ascites, prior portosystemic shunt (PS), donor age, and cold ischemic (CI) time. There were 2 groups: group 1, cases with PA transplanted from November 1990 to July 2001, and group 2, consecutive, control subjects who underwent liver transplantation between September 1999 and December 2001. Both groups were followed to censoring, May 2002, or death. Twenty-five from group 1 had ascites after liver transplantation after a median follow-up of 2.6 years. In group 1 vs group 2 (n = 106), there was a male predominance 80% vs 61% (P = .10) with similar age 52 years; chronic hepatitis C virus (HCV) was diagnosed in 88% vs 44% (P < .0001); preoperative ascites and ascites refractory to treatment were more prevalent in group 1 (P = .0004 and P = .02, respectively), and CI was higher in group 1, (8.5 hours vs 6.3 hours, P = .002). Eight of the 25 (group 1) had portal hypertension with median portosystemic gradient 16.5 mm Hg (range, 16-24). PS was performed in 7 of 25 cases, which resulted in partial resolution of ascites. The development of PA after liver transplantation is multifactorial; HCV, refractory ascites before liver transplantation, and prolonged CI contribute to PA after liver transplantation. (Liver Transpl 2004;10:654-660.) [source]


    Significance of positive cytotoxic cross-match in adult-to-adult living donor liver transplantation using small graft volume

    LIVER TRANSPLANTATION, Issue 12 2002
    Kyung-Suk Suh MD
    A positive cross-match in cadaveric liver transplantation is relatively acceptable, but its role in living donor liver transplantation (LDLT) is less well known. The aim of this study is to examine the significance of cytotoxic cross-match in adult-to-adult LDLT using small-for-size grafts. Forty-three adult-to-adult LDLTs were performed at Seoul National University Hospital (Seoul, Korea) from January 1999 to July 2001. Subjects consisted of 27 men and 16 women with an average age of 45.4 years. Average liver graft weight was 565.3 ± 145.7 g, and average graft-recipient weight ratio (GRWR) was 0.89% ± 0.20%. HLA cross-match testing by lymphocytotoxicity and flow cytometry was performed routinely preoperatively. Factors that may influence survival, such as age; sex; blood group type A, type B, type O compatibility; cytotoxic cross-match; donor age; surgical time; cold ischemic time; and GRWR, were analyzed. Nine patients (20.9%) died in the hospital. There was a greater in-hospital mortality rate in women than men (37.5% v 11.1%; P = .049). The extra-small-graft group (0.54% ≤ GRWR < 0.8%; n = 14) showed greater in-hospital mortality rates than the small-graft group (0.8% ≤ GRWR ≤ 1.42%; n = 29; 42.9% v 10.3%; P = .022). A positive cross-match was detected in 4 women transplant recipients, and 3 of these patients belonged to the extra-small-graft group. All patients with a positive cross-match died of multiorgan failure after early postoperative acute rejection episodes. Positive cross-match was the only significant factor in multivariate analysis (P = .035). In conclusion, when lymphocytotoxic cross-match and flow cytometry are significantly positive, adult-to-adult LDLT using small-for-size grafts should not be performed. [source]


    Does tacrolimus offer virtual freedom from chronic rejection after primary liver transplantation?

    LIVER TRANSPLANTATION, Issue 7 2001
    Prognostic factors in 1,048 liver transplantations with a mean follow-up of 6 years
    Tacrolimus has proven to be a potent immunosuppressive agent in liver transplantation (LT). Its introduction has led to significantly less frequent and severe acute rejection. Little is known about the rate of chronic rejection (CR) in primary LT using tacrolimus therapy. The aim of the present study is to examine the long-term incidence of CR, risk factors, prognostic factors, and outcome after CR. The present study evaluated the development of CR in 1,048 consecutive adult primary liver allograft recipients initiated and mostly maintained on tacrolimus-based immunosuppressive therapy. They were evaluated with a mean follow-up of 77.3 ± 14.7 months (range, 50.7 to 100.1 months). To assess the impact of primary diagnosis on the rate and outcome of CR, the population was divided into 3 groups. Group I included patients with hepatitis C virus (HCV)- or hepatitis B virus (HBV)-induced cirrhosis (n = 312); group II included patients diagnosed with primary biliary cirrhosis (PBC), primary sclerosing cholangitis (PSC), or autoimmune hepatitis (AIH; n = 217); and group III included patients with all other diagnoses (n = 519). Overall, 32 of 1,048 patients (3.1%) developed CR. This represented 13 (4.1%), 12 (5.5%), and 7 patients (1.3%) in groups I, II, and III, respectively. The relative risk for developing CR was 3.2 times greater for group I and 4.3 times greater for group II compared with group III. This difference was statistically significant (P = .004). The incidence of acute rejection and total number of acute rejection episodes were significantly greater in patients who developed CR compared with those who did not (P < .0001). Similarly, the mean donor age for CR was significantly older than for patients without CR (43.0 v 36.2 years; P = .02). Thirteen of the 32 patients (40.6%) who developed CR retained their original grafts for a mean period of 54 ± 25 months after diagnosis. Seven patients (21.9%) underwent re-LT, and 12 patients (38.3%) died. 
Serum bilirubin levels and the presence of arteriopathy, arterial loss, and duct loss on liver biopsy at the time of diagnosis of CR were significantly greater among the 3 groups of patients. In addition, patient and graft survival for group I were significantly worse compared with groups II and III. We conclude that CR occurred rarely among patients maintained long term on tacrolimus-based immunosuppressive therapy. When steroid use is controlled, the incidence of acute rejection, mean donor age, HBV- and/or HCV-induced cirrhosis, or a diagnosis of PBC, PSC, or AIH were found to be predictors of CR. Greater values for serum bilirubin level, duct loss, arteriopathy, arteriolar loss, and presence of HCV or HBV were found to be poor prognostic factors for the 3 groups; greater total serum bilirubin value (P = .05) was the only factor found to be significant between patients who had graft loss versus those who recovered. [source]


    Growth curves of pediatric patients with biliary atresia following living donor liver transplantation: Factors that influence post-transplantation growth

    PEDIATRIC TRANSPLANTATION, Issue 7 2007
    Takeshi Saito
    Abstract: We evaluated the growth curves of children with BA after LDLT, and identified factors influencing growth velocity one yr after LDLT (ΔZ). The clinical data of 51 children with BA, who had an LDLT at our center from 2001 to 2005, were retrospectively reviewed. The Z scores for height and weight, and ΔZ were studied. The correlation between ΔZ and various clinical factors was evaluated statistically. Multivariate stepwise analyses were performed for ΔZ. The average height and weight Z scores at the time of LDLT were −1.34 ± 1.36 (±s.d.) and −0.78 ± 1.15, respectively. Among 30 BA recipients with stable liver function after transplant, weight returned to normal one-yr post-transplantation. However, height did not return to normal even by the third post-transplantation year. On multivariate analyses, 73% of the variance in height ΔZ could be accounted for by factors such as standardized height at the time of LDLT (proportion of variance: 38%), number of steroid pulse treatments (17%), donor age (10%), and the presence of HVS (9%). Fifty-four percentage of the variance in weight ΔZ could be accounted for by factors such as standardized weight at the time of LDLT (37%) and the total steroid dose given (17%). Height and weight status at the time of LDLT likely have the strongest impact on ΔZ. Additional factors include steroid exposure, age of the living donor, and presence of HVS, all of which should be considered to improve post-transplantation growth. [source]


    A Risk Prediction Model for Delayed Graft Function in the Current Era of Deceased Donor Renal Transplantation

    AMERICAN JOURNAL OF TRANSPLANTATION, Issue 10 2010
    W. D. Irish
    Delayed graft function (DGF) impacts short- and long-term outcomes. We present a model for predicting DGF after renal transplantation. A multivariable logistic regression analysis of 24,337 deceased donor renal transplant recipients (2003-2006) was performed. We developed a nomogram, depicting relative contribution of risk factors, and a novel web-based calculator (http://www.transplantcalculator.com/DGF) as an easily accessible tool for predicting DGF. Risk factors in the modern era were compared with their relative impact in an earlier era (1995-1998). Although the impact of many risk factors remained similar over time, weight of immunological factors attenuated, while impact of donor renal function increased by 2-fold. This may reflect advances in immunosuppression and increased utilization of kidneys from expanded criteria donors (ECDs) in the modern era. The most significant factors associated with DGF were cold ischemia time, donor creatinine, body mass index, donation after cardiac death and donor age. In addition to predicting DGF, the model predicted graft failure. A 25-50% probability of DGF was associated with a 50% increased risk of graft failure relative to a DGF risk <25%, whereas a >50% DGF risk was associated with a 2-fold increased risk of graft failure. This tool is useful for predicting DGF and long-term outcomes at the time of transplant. [source]
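    The nomogram and web calculator described above are front-ends for a fitted multivariable logistic regression, whose predicted probability is p = 1 / (1 + e^-(β0 + Σ βi·xi)). The sketch below shows the mechanics with purely hypothetical coefficients; the published model's actual betas and covariate coding are not reproduced here:

```python
import math

def predicted_probability(coefficients, intercept, covariates):
    """Predicted event probability from a fitted logistic regression:
    p = 1 / (1 + exp(-(intercept + sum(beta_i * x_i))))."""
    z = intercept + sum(coefficients[k] * v for k, v in covariates.items())
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical coefficients for illustration only -- NOT the published model.
BETAS = {"cold_ischemia_hr": 0.05, "donor_creatinine": 0.4,
         "recipient_bmi": 0.06, "dcd_donor": 0.9, "donor_age": 0.02}
INTERCEPT = -4.0

# Hypothetical recipient/donor pair.
p = predicted_probability(BETAS, INTERCEPT,
                          {"cold_ischemia_hr": 18, "donor_creatinine": 1.2,
                           "recipient_bmi": 28, "dcd_donor": 0, "donor_age": 55})
```

    A nomogram is simply a graphical reading of the linear predictor z: each covariate contributes points proportional to βi·xi, and the point total maps onto the probability scale through the same logistic function.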


    Liver Transplantation in the United States, 1999-2008

    AMERICAN JOURNAL OF TRANSPLANTATION, Issue 4p2 2010
    P. J. Thuluvath
    Changes in organ allocation policy in 2002 reduced the number of adult patients on the liver transplant waiting list, changed the characteristics of transplant recipients and increased the number of patients receiving simultaneous liver-kidney transplantation (SLK). The number of liver transplants peaked in 2006 and declined marginally in 2007 and 2008. During this period, there was an increase in donor age, the Donor Risk Index, the number of candidates receiving MELD exception scores and the number of recipients with hepatocellular carcinoma. In contrast, there was a decrease in retransplantation rates, and the number of patients receiving grafts from either a living donor or from donation after cardiac death. The proportion of patients with severe obesity, diabetes and renal insufficiency increased during this period. Despite increases in donor and recipient risk factors, there was a trend towards better 1-year graft and patient survival between 1998 and 2007. Of major concern, however, were considerable regional variations in waiting time and posttransplant survival. The current status of liver transplantation in the United States between 1999 and 2008 was analyzed using SRTR data. In addition to a general summary, we have included a more detailed analysis of liver transplantation for hepatitis C, retransplantation and SLK transplantation. [source]