Cold Ischemic Time
Selected Abstracts

Extended right liver grafts obtained by an ex situ split can be used safely for primary and secondary transplantation with acceptable biliary morbidity
LIVER TRANSPLANTATION, Issue 7 2009
Atsushi Takebe

Split liver transplantation (SLT) is clearly beneficial for pediatric recipients. However, the increased risk of biliary complications in adult recipients of SLT in comparison with whole liver transplantation (WLT) remains controversial. The objective of this study was to investigate the incidence and clinical outcome of biliary complications in an SLT group using split extended right grafts (ERGs) after ex situ splitting, in comparison with WLT, in adults. Retrospectively collected data for 80 consecutive liver transplants using ERGs after ex situ splitting between 1998 and 2007 were compared with data for 80 liver transplants using whole liver grafts in a matched-pair analysis, paired by donor age, recipient age, indication, Model for End-Stage Liver Disease (MELD) score, and high-urgency status. The cold ischemic time was significantly longer in the SLT group (P = 0.006). As expected, bile leakage from the transected surface occurred only in the SLT group (15%), without any mortality or graft loss. The incidence of all other early or late biliary complications (e.g., anastomotic leakage and stenosis) did not differ between SLT and WLT. The 1- and 5-year patient and graft survival rates showed no statistical difference between SLT and WLT [patient survival, 83.2% and 82.0% versus 88.5% and 79.8% (P = 0.92); graft survival, 70.8% and 67.5% versus 83.6% and 70.0% (P = 0.16)]. In conclusion, ERGs can be used safely without any increased mortality and with acceptable morbidity, and they should also be considered for retransplantation. The significantly longer cold ischemic time in the SLT group indicates the potential for improved results and should thus be considered in the design of allocation policies. Liver Transpl 15:730-737, 2009. © 2009 AASLD. [source]
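The 1- and 5-year patient and graft survival rates quoted above are the kind of figures usually obtained by Kaplan-Meier estimation with a log-rank comparison between groups. The abstract does not state the method used, so the following Python sketch (using the lifelines library) is purely illustrative; the column names and the toy follow-up data are hypothetical, not taken from the study.

import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

# Hypothetical follow-up data: one row per recipient, times in years.
df = pd.DataFrame({
    "group": ["SLT", "SLT", "SLT", "SLT", "WLT", "WLT", "WLT", "WLT"],
    "years": [0.8, 5.2, 6.0, 2.4, 1.5, 4.9, 7.1, 3.3],
    "died":  [1,   0,   0,   0,   1,   0,   0,   1],  # 1 = death observed, 0 = censored
})

# Kaplan-Meier estimate of survival at 1 and 5 years in each group
kmf = KaplanMeierFitter()
for name, sub in df.groupby("group"):
    kmf.fit(sub["years"], event_observed=sub["died"], label=name)
    print(name, kmf.predict([1.0, 5.0]).round(3).to_dict())

# Log-rank test comparing the two curves (analogous to the quoted P values)
slt, wlt = df[df["group"] == "SLT"], df[df["group"] == "WLT"]
res = logrank_test(slt["years"], wlt["years"],
                   event_observed_A=slt["died"], event_observed_B=wlt["died"])
print("log-rank p =", round(res.p_value, 3))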
Sirolimus-based immunosuppression following liver transplantation for hepatocellular carcinoma
LIVER TRANSPLANTATION, Issue 5 2008
Michael A. Zimmerman

Experience with sirolimus (SRL)-based immunosuppression following orthotopic liver transplantation (OLT) is rapidly accumulating. In combination with calcineurin inhibitors (CNIs), SRL may reduce the incidence of acute rejection and lower the overall required drug levels. This study sought to quantify long-term outcome following OLT in patients with cirrhosis and concomitant hepatocellular carcinoma (HCC) who were treated with an SRL-based regimen as primary therapy. From January 2000 to June 2007, 97 patients underwent OLT for end-stage liver disease and HCC at the University of Colorado Health Sciences Center. Of those, 45 patients received SRL, in addition to CNIs, as a component of their primary immunosuppression regimen post-OLT. The other 52 patients received the standard immunosuppression regimen of CNIs, mycophenolate mofetil, and corticosteroids. The 2 treatment groups were compared with respect to the following variables: age, gender, tumor stage by explant, grade, size, presence of vascular invasion, focality, Child's class, baseline creatinine, and warm and cold ischemic times. The 2 groups were comparable on all factors except cold ischemic time, which was significantly longer in the CNI-treated group. Overall survival at 1 and 5 years post-OLT for patients treated with SRL was 95.5% and 78.8%, respectively; survival in patients treated with CNIs alone at the same time intervals was 83% and 62%. Although there was no difference in the incidence of major complications, the SRL group experienced a modest improvement in renal function. Cumulatively, these data suggest a potential survival benefit with SRL-based therapy in patients undergoing OLT for end-stage liver disease and concomitant malignancy. Liver Transpl 2008. © 2008 AASLD. [source]

Significance of positive cytotoxic cross-match in adult-to-adult living donor liver transplantation using small graft volume
LIVER TRANSPLANTATION, Issue 12 2002
Kyung-Suk Suh MD

A positive cross-match in cadaveric liver transplantation is relatively acceptable, but its role in living donor liver transplantation (LDLT) is less well known. The aim of this study was to examine the significance of the cytotoxic cross-match in adult-to-adult LDLT using small-for-size grafts. Forty-three adult-to-adult LDLTs were performed at Seoul National University Hospital (Seoul, Korea) from January 1999 to July 2001. Subjects consisted of 27 men and 16 women with an average age of 45.4 years. Average liver graft weight was 565.3 ± 145.7 g, and average graft-recipient weight ratio (GRWR) was 0.89% ± 0.20%. HLA cross-match testing by lymphocytotoxicity and flow cytometry was performed routinely before transplantation. Factors that may influence survival, such as age, sex, ABO blood group compatibility, cytotoxic cross-match, donor age, surgical time, cold ischemic time, and GRWR, were analyzed. Nine patients (20.9%) died in the hospital. The in-hospital mortality rate was greater in women than in men (37.5% vs 11.1%; P = .049). The extra-small-graft group (0.54% ≤ GRWR < 0.8%; n = 14) had a greater in-hospital mortality rate than the small-graft group (0.8% ≤ GRWR ≤ 1.42%; n = 29): 42.9% vs 10.3%; P = .022. A positive cross-match was detected in 4 women transplant recipients, 3 of whom belonged to the extra-small-graft group. All patients with a positive cross-match died of multiorgan failure after early postoperative acute rejection episodes. A positive cross-match was the only significant factor in multivariate analysis (P = .035). In conclusion, when the lymphocytotoxic cross-match and flow cytometry are significantly positive, adult-to-adult LDLT using small-for-size grafts should not be performed. [source]
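The graft-recipient weight ratio used to define these size groups is simply the graft weight expressed as a percentage of the recipient's body weight. The short Python sketch below shows the calculation and the study's grouping cut-offs; the example weights are illustrative, not patient data.

def grwr_percent(graft_weight_g: float, recipient_weight_kg: float) -> float:
    """GRWR (%) = graft weight (g) / recipient body weight (g) x 100."""
    return graft_weight_g / (recipient_weight_kg * 1000.0) * 100.0

def size_group(grwr: float) -> str:
    """Classify a graft using the cut-offs reported in this series."""
    if grwr < 0.8:
        return "extra-small graft (GRWR < 0.8%)"
    elif grwr <= 1.42:
        return "small graft (0.8% <= GRWR <= 1.42%)"
    return "larger than the grafts reported in this series"

# Example: a 565 g graft in a 63 kg recipient gives a GRWR of about 0.90%.
g = grwr_percent(565.0, 63.0)
print(f"GRWR = {g:.2f}% -> {size_group(g)}")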
Orthotopic liver transplantation for children with Alagille syndrome
PEDIATRIC TRANSPLANTATION, Issue 5 2010
Ronen Arnon

Arnon R, Annunziato R, Miloh T, Suchy F, Sakworawich A, Hiroshi S, Kishore I, Kerkar N. Orthotopic liver transplantation for children with Alagille syndrome. Pediatr Transplantation 2010: 14:622-628. © 2010 John Wiley & Sons A/S.

Abstract: Alagille syndrome (AGS) is an inherited disorder involving the liver, heart, eyes, face, and skeleton. Aim: To determine the outcome of liver transplantation (LT) in children with AGS compared to those with biliary atresia (BA). Methods: Children with AGS and BA who underwent LT between 10/1987 and 5/2008 were identified from the UNOS database. Results: Of 11,467 children who received a liver transplant, 461 (4.0%) had AGS and 3,056 (26.7%) had BA. One- and five-yr patient survival was significantly lower in patients with AGS than in patients with BA (AGS: 82.9% and 78.4%; BA: 89.9% and 84%, respectively). Early death (<30 days from transplant) was significantly more frequent in AGS than in BA. One- and five-yr graft survival was significantly lower in AGS than in BA (AGS: 74.7% and 61.5%; BA: 81.6% and 70.0%, respectively). Death from graft failure, neurological complications, and cardiac complications was significantly more frequent in patients with AGS than in patients with BA. Serum creatinine at transplant, prior LT, and cold ischemic time >12 h were identified as risk factors for death. Conclusion: Children with AGS were older at the time of LT, and their one- and five-yr patient and graft survival were significantly lower than in BA. Risk factors for poor outcome in AGS after LT were identified. [source]

Fate of the Mate: The Influence of Delayed Graft Function in Renal Transplantation on the Mate Recipient
AMERICAN JOURNAL OF TRANSPLANTATION, Issue 8 2009
J. F. Johnson

Delayed graft function (DGF) in a deceased-donor renal recipient is associated with allograft dysfunction 1 year posttransplant. There is limited research on how allograft function evolves over time in the mate of a DGF recipient, that is, the recipient of the other kidney from the same donor. Using a retrospective cohort design, we studied 55 recipients from a single center. The primary outcome was the change in glomerular filtration rate (GFR) 1 year posttransplant. The secondary outcome was the GFR at baseline. We found that mates of DGF recipients had a mean change in GFR 1 year posttransplant of -11.2 mL/min, while the control group had a mean change of -0.4 mL/min. The difference in the primary outcome was significant (p = 0.025) in a multivariate analysis adjusting for cold ischemic time, panel reactive antibody level, allograft loss, human leukocyte antigen (HLA)-B mismatches, and HLA-DR mismatches. No significant difference between groups was found in baseline GFR. In conclusion, mates of DGF recipients had a significantly larger decline in allograft function 1 year posttransplant compared to controls with similar baseline renal function. We believe strategies that may preserve allograft function in these 'at-risk' recipients should be developed and tested. [source]

Increased Primary Non-Function in Transplanted Deceased-Donor Kidneys Flushed with Histidine-Tryptophan-Ketoglutarate Solution
AMERICAN JOURNAL OF TRANSPLANTATION, Issue 5 2009
R. B. Stevens

Histidine-tryptophan-ketoglutarate (HTK) solution is increasingly used to flush and preserve organ-donor kidneys, with efficacy claimed to be equivalent to that of University of Wisconsin (UW) solution. We observed and reported increased graft pancreatitis in pancreata flushed with HTK solution, which prompted this review of HTK-flushed kidney transplants. We analyzed outcomes of deceased-donor kidneys flushed with HTK and UW solutions with a minimum of 12 months of follow-up, excluding pediatric and multi-organ recipients. We evaluated patient and graft survival, rejection rates, renal function, and variables that might constitute hazards to graft survival. Two-year patient survival, rejection, renal function, and graft survival were not different, but early graft loss (<6 months) was worse in HTK-flushed kidneys (p < 0.03). A Cox analysis of donor grade, cold ischemic time, panel reactive antibodies (PRA), donor race, first vs. repeat transplant, rejection, and flush solution showed that only HTK use predicted early graft loss (p < 0.04; relative risk = 3.24), almost exclusively attributable to primary non-function (HTK, n = 5, 6.30%; UW, n = 1, 0.65%; p = 0.02). Delayed graft function and early graft loss with HTK occurred only in lower-grade kidneys, suggesting that HTK should be used with caution in marginal donors. [source]
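The "Cox analysis" above is a Cox proportional-hazards regression of time to early graft loss on flush solution and the other listed donor and recipient factors; the quoted relative risk of 3.24 corresponds to the hazard ratio for HTK. The Python sketch below (using the lifelines library) only illustrates how such a model is fitted: the column names and toy data are hypothetical, and just two of the study's covariates are included for brevity.

import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical data, one row per kidney recipient. The study's other covariates
# (donor grade, PRA, donor race, first vs. repeat transplant, rejection) would be
# added as further columns in the same way.
df = pd.DataFrame({
    "months_followed": [2, 24, 5, 24, 24, 18, 24, 3, 24, 12],
    "graft_lost":      [1,  0, 1,  0,  0,  1,  0, 1,  0,  1],   # 1 = graft loss observed
    "htk_flush":       [1,  1, 1,  0,  0,  0,  1, 1,  0,  1],   # 1 = HTK, 0 = UW
    "cold_ischemia_h": [24, 26, 18, 12, 22, 16, 15, 20, 13, 19],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="months_followed", event_col="graft_lost")
cph.print_summary()  # exp(coef) for htk_flush is the hazard ratio, i.e. the quoted "relative risk"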
Living Donor and Split-Liver Transplants in Hepatitis C Recipients: Does Liver Regeneration Increase the Risk for Recurrence?
AMERICAN JOURNAL OF TRANSPLANTATION, Issue 2 2005
Abhinav Humar

Concern exists that partial liver transplants (from either a living donor [LD] or a deceased donor [DD]) in hepatitis C virus (HCV)-positive recipients may be associated with an increased risk for recurrence. From 1999 to 2003, 51 HCV-positive recipients underwent liver transplants at our institution: 32 whole-liver (WL) transplants, 12 LD transplants, and 7 DD split transplants. Donor characteristics differed in that WL donors were older and LD livers had lower ischemic times. Recipient characteristics were similar except that mean MELD scores in LD recipients were lower (p < 0.05). With a mean follow-up of 28.3 months, 46 (90%) recipients are alive: three died from recurrent HCV liver disease and two from tumor recurrence. Based on 1-year protocol biopsies, the incidence of histologic recurrence in the three groups was as follows: WL, 81%; LD, 50%; and DD split, 86% (p = 0.06 for LD versus WL). The mean grade of inflammation on the biopsy specimens was WL, 1.31; LD, 0.33; and DD split, 1.2 (p = 0.002 for LD versus WL; p = 0.03 for LD versus DD split). The mean stage of fibrosis was WL, 0.96; LD, 0.22; and DD split, 0.60 (p = 0.07 for LD versus WL). Liver regeneration does not seem to affect hepatitis C recurrence as much, perhaps, as factors such as DD status, donor age, and cold ischemic time. [source]
Safety and risk of using pediatric donor livers in adult liver transplantation
LIVER TRANSPLANTATION, Issue 1 2001
Sukru Emre MD

Pediatric donor (PD) livers have been allocated to adult transplant recipients in certain situations despite size discrepancies. We compared data on adults (age ≥ 19 years) who underwent primary liver transplantation using livers from either PDs (age < 13 years; n = 70) or adult donors (ADs; age ≥ 19 years; n = 1,051). We also investigated the risk factors for, and the effect of, prolonged cholestasis on survival in the PD group. In an attempt to determine the minimal graft volume requirement, we divided the PD group into 2 subgroups based on the ratio of donor liver weight (DLW) to estimated recipient liver weight (ERLW) at 2 different cutoff values: less than 0.4 (n = 5) versus 0.4 or greater (n = 56), and less than 0.5 (n = 21) versus 0.5 or greater (n = 40). The incidence of hepatic artery thrombosis (HAT) was significantly greater in the PD group (12.9%) than in the AD group (3.8%; P = .0003). Multivariate analysis showed that a preoperative prothrombin time of 16 seconds or greater (relative risk, 3.206; P = .0115) and the absence of FK506 use as a primary immunosuppressant (relative risk, 4.477; P = .0078) were independent risk factors affecting 1-year graft survival in the PD group. In the PD group, transplant recipients who developed cholestasis (total bilirubin level ≥ 5 mg/dL on postoperative day 7) had longer warm ischemic times (WITs) and cold ischemic times (CITs). Transplant recipients with a DLW/ERLW less than 0.4 had a trend toward a greater incidence of HAT (40%; P < .06) and septicemia (60%) and toward decreased 1- and 5-year graft survival rates (40% and 20%; P = .08 and .07 versus DLW/ERLW of 0.4 or greater, respectively). In conclusion, the use of PD livers for adult recipients was associated with a greater risk of developing HAT. The outcome of small-for-size grafts is more likely to be adversely affected by longer WITs and CITs. The safe limit of graft volume appeared to be a DLW/ERLW of 0.4 or greater. [source]

Cold Machine Perfusion Versus Static Cold Storage of Kidneys Donated After Cardiac Death: A UK Multicenter Randomized Controlled Trial
AMERICAN JOURNAL OF TRANSPLANTATION, Issue 9 2010
C. J. E. Watson

One third of deceased-donor kidneys for transplantation in the UK are donated after cardiac death (DCD). Such kidneys have a high rate of delayed graft function (DGF) following transplantation. We conducted a multicenter, randomized controlled trial to determine whether kidney preservation using cold, pulsatile machine perfusion (MP) was superior to simple cold storage (CS) for DCD kidneys. One kidney from each DCD donor was randomly allocated to CS and the other to MP. A sequential trial design was used, with the primary endpoint being DGF, defined as the need for dialysis within the first 7 days after transplant. The trial was stopped when data were available for 45 pairs of kidneys. There was no difference in the incidence of DGF between kidneys assigned to MP and those assigned to CS (58% vs. 56%, respectively), in the context of an asystolic period of 15 min and median cold ischemic times of 13.9 h for MP and 14.3 h for CS kidneys. Renal function at 3 and 12 months was similar between groups, as were graft and patient survival. For kidneys from controlled DCD donors (with mean cold ischemic times around 14 h), MP offers no advantage over CS, which is cheaper and more straightforward. [source]
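Because the trial randomized the two kidneys from each donor to different preservation methods, the DGF outcome is paired within donors, and a paired analysis such as McNemar's test is a natural way to compare the 58% and 56% DGF rates. The published trial used a sequential design rather than this simple end-of-study test, so the Python sketch below (using statsmodels) is purely illustrative: the marginal totals match the reported rates (26/45 MP and 25/45 CS kidneys with DGF), but the split into concordant and discordant pairs is assumed.

from statsmodels.stats.contingency_tables import mcnemar

# 2x2 table over the 45 donor pairs.
# Rows: MP kidney DGF yes/no; columns: CS kidney DGF yes/no.
# The discordant-pair counts (6 and 5) are hypothetical.
table = [[20, 6],   # MP DGF yes: CS DGF yes / CS DGF no
         [5, 14]]   # MP DGF no:  CS DGF yes / CS DGF no

result = mcnemar(table, exact=True)  # exact binomial test on the 11 discordant pairs
print("McNemar p-value:", round(result.pvalue, 3))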