Graft Loss
Selected Abstracts

Influence of Cyclooxygenase-2 (COX-2) Gene Promoter Polymorphism −765 on Graft Loss After Renal Transplantation
AMERICAN JOURNAL OF TRANSPLANTATION, Issue 12 2009. C. Courivaud
A G→C polymorphism has been identified in the human cyclooxygenase-2 (COX-2) gene promoter at position −765, with the C allele leading to decreased promoter activity and low prostaglandin E2 (PGE2) production. PGE2 has strong immunomodulatory properties that could influence graft survival. We studied the association between this polymorphism and allograft failure in two independent cohorts of renal transplant recipients (RTRs) including a total of 603 patients. The functional effect of the COX-2 gene promoter polymorphism was analyzed by measuring serum levels of PGE2. Median follow-up was 8.7 and 7.9 years for the first and second cohorts, respectively. Analysis of 603 patients identified 20 CC (3.3%), 179 GC (29.7%) and 404 GG (67%) carriers. Patients with the GG genotype had significantly higher serum PGE2 concentrations than patients with the C allele. Carriers of a C allele had an independently increased risk of graft loss (hazard ratio (HR) 2.43 [95% CI 1.19–4.97], p = 0.015 for cohort 1; HR 1.72 [95% CI 0.99–3.77], p = 0.051 for cohort 2) compared to GG patients. The COX-2 gene promoter polymorphism at position −765 (G→C) is associated with a higher rate of graft loss in RTRs. Such findings may be used to influence immunosuppressive strategies and optimize patient management. [source]

Selective Retransplant After Graft Loss to Nonadherence: Success with a Second Chance
AMERICAN JOURNAL OF TRANSPLANTATION, Issue 6 2009. T. B. Dunn
Nonadherence (NA) is a difficult posttransplant problem that can lead to graft loss. A retransplant is controversial because of a fear of recurrent NA. We reviewed our center's database and identified 114 kidney recipients who lost their graft to overt NA; of this group, 35 (31%) underwent a retransplant after a thorough reevaluation. We compared this NA retransplant group to a control group of second transplant recipients who did not lose their first graft to overt NA (non-NA) (n = 552). After 8 years of follow-up, we found no significant differences between the groups in actuarial graft or patient survival rates, renal function, or the incidence of biopsy-proven chronic rejection. However, 5 of 35 (14%) NA recipients versus 10 of 552 (2%) non-NA recipients lost their retransplant to NA (p = 0.0001). Twenty of 35 (57%) of the NA group exhibited repeat NA behavior after retransplant. We conclude that prior graft loss to NA is associated with increased graft loss to NA after retransplant. However, the majority of NA retransplant recipients did well, with overall long-term outcomes similar to those of the non-NA group. With careful patient selection and aggressive intervention, prior overt NA should not be an absolute contraindication to retransplantation. [source]

Predictive Ability of Pretransplant Comorbidities to Predict Long-Term Graft Loss and Death
AMERICAN JOURNAL OF TRANSPLANTATION, Issue 3 2009. G. Machnicki
Whether to include additional comorbidities beyond diabetes in future kidney allocation schemes is controversial. We investigated the predictive ability of multiple pretransplant comorbidities for graft and patient survival. We included first-kidney transplant deceased donor recipients if Medicare was the primary payer for at least one year pretransplant.
We extracted pretransplant comorbidities from Medicare claims with the Clinical Classifications Software (CCS), Charlson and Elixhauser comorbidities, and used Cox regressions for graft loss, death with function (DWF) and death. Four models were compared: (1) Organ Procurement Transplant Network (OPTN) recipient and donor factors, (2) OPTN + CCS, (3) OPTN + Charlson and (4) OPTN + Elixhauser. Patients were censored at 9 years or loss to follow-up. Predictive performance was evaluated with the c-statistic. We examined 25,270 transplants between 1995 and 2002. For graft loss, the predictive value of all models was statistically and practically similar (Model 1: 0.61 [0.60–0.62], Model 2: 0.63 [0.62–0.64], Models 3 and 4: 0.62 [0.61–0.63]). For DWF and death, performance improved to 0.70 and was slightly better with the CCS. Pretransplant comorbidities derived from administrative claims did not identify factors not collected on OPTN that had a significant impact on graft outcome predictions. This has important implications for the revisions to the kidney allocation scheme. [source]
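The comparison above reduces to fitting a Cox proportional hazards model for graft loss under each covariate set and comparing the resulting c-statistics (concordance). The sketch below is illustrative only and is not the study's code: it assumes the Python lifelines library and uses synthetic data with hypothetical covariate names standing in for the OPTN factors and the claims-derived comorbidity count.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

# Synthetic stand-in data; a real analysis would use registry and claims records.
rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({
    "donor_age": rng.normal(40, 12, n),
    "recipient_age": rng.normal(50, 13, n),
    "diabetes": rng.integers(0, 2, n),            # hypothetical OPTN-style factor
    "ccs_count": rng.poisson(3, n),               # hypothetical claims-derived comorbidity count
    "years": rng.exponential(8, n).clip(0.1, 9),  # follow-up, censored at 9 years
    "graft_loss": rng.integers(0, 2, n),          # 1 = graft loss, 0 = censored
})

def c_statistic(covariates):
    """Fit a Cox model for graft loss and return its concordance index."""
    cph = CoxPHFitter()
    cph.fit(df[covariates + ["years", "graft_loss"]],
            duration_col="years", event_col="graft_loss")
    return cph.concordance_index_

print("OPTN factors only :", round(c_statistic(["donor_age", "recipient_age", "diabetes"]), 3))
print("OPTN + comorbidity:", round(c_statistic(["donor_age", "recipient_age", "diabetes", "ccs_count"]), 3))
```

On real registry data the two printed concordance values would correspond to the 0.61 versus 0.63 comparison reported for graft loss above.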
Mycophenolate mofetil without antibody induction in cadaver vs. living donor pediatric renal transplantation
PEDIATRIC TRANSPLANTATION, Issue 2 2003. O. Ojogho
Abstract: Mycophenolate mofetil (MMF) is a new immunosuppressive agent that blocks de novo purine synthesis in T and B lymphocytes via a potent selective inhibition of inosine monophosphate dehydrogenase. MMF has been shown to significantly reduce the incidence of acute rejection in both adult and pediatric renal transplantation. The impact of MMF on routine antibody induction therapy in pediatric renal transplantation has not been defined. Remarkably, a recent North American Pediatric Transplant Cooperative Study concluded that T-cell antibody induction therapy was deleterious for patients who received MMF. Our study examines the use of MMF in an evolving immunosuppressive strategy to avoid antibody induction in both living (LD) and cadaver (CAD) donor pediatric renal transplantation. We retrospectively analyzed the records of 43 pediatric renal transplants that received MMF-based triple therapy without antibody induction therapy between November 1996 and April 2000. We compared CAD (n = 17) with LD (n = 26). The two groups were similar demographically except that CAD had significantly younger donors than LD, 26.1 ± 13.7 vs. 36.2 ± 9.2 yr (p = 0.006). All the patients received MMF at 600 mg/m2 b.i.d. (maximum dose of 2 g/d) and prednisone with cyclosporine (86%) or tacrolimus (14%). Mean follow-up was >36 months for each group. The acute rejection rate at 6 months was 11.8% (CAD) vs. 15.4% (LD) (p = 0.999) and at 1 yr was 23.5% (CAD) vs. 26.9% (LD) (p = 0.999). Mean estimated glomerular filtration rate (ml/min/1.73 m2) at 6 months was 73.3 ± 15.3 (CAD) vs. 87.6 ± 24.2 (LD) (p = 0.068). Patient survival at 1, 2, and 3 yr was 100, 100, and 100% for CAD vs. 100, 96, and 96% for LD, respectively. Graft survival at 1, 2, and 3 yr was 100, 100, and 94% for CAD vs. 96, 88, and 71% for LD, respectively. Graft loss in CAD was because of chronic rejection (n = 2), while in LD it was because of non-compliance (n = 6), post-transplant lymphoproliferative disorder (n = 1), and sepsis (n = 1). In conclusion, MMF without antibody induction in both CAD and LD pediatric renal transplantation provides statistically similar and effective prophylaxis against acute rejection at 6 months and 1 yr post-transplant. The short-term patient and graft survival rates are excellent; however, non-compliance remains a serious challenge to long-term graft survival. Additional controlled studies are needed to define the role of MMF without antibody induction therapy in pediatric renal transplantation. [source]

Association of Race and Socioeconomic Position with Outcomes in Pediatric Heart Transplant Recipients
AMERICAN JOURNAL OF TRANSPLANTATION, Issue 9 2010. T. P. Singh
We assessed the association of socioeconomic (SE) position with graft loss in a multicenter cohort of pediatric heart transplant (HT) recipients. We extracted six SE variables from the US Census 2000 database for the neighborhood of residence of 490 children who underwent their primary HT at participating transplant centers. A composite SE score was derived for each child, and four groups (quartiles) were compared for graft loss (death or retransplant). Graft loss occurred in 152 children (122 deaths, 30 retransplants). In adjusted analysis, graft loss during the first posttransplant year had a borderline association with the highest SE quartile (HR 1.94, p = 0.05) but not with race. Among 1-year survivors, both black race (HR 1.81, p = 0.02) and the lowest SE quartile (HR 1.77, p = 0.01) predicted subsequent graft loss in adjusted analysis. Among subgroups, the lowest SE quartile was associated with graft loss in white but not in black children. Thus, we found a complex relationship between SE position and graft loss in pediatric HT recipients. The finding of increased risk in the highest SE quartile children during the first year requires further confirmation. Black children and low SE position white children are at increased risk of graft loss after the first year. [source]
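Deriving a neighborhood-level composite SE score and splitting it into quartiles, as described above, is essentially a standardize-and-combine step followed by a quartile cut. The fragment below illustrates that mechanic in pandas on synthetic data; the variable names are hypothetical and are not the six census measures or the weighting actually used in the study.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
n = 490  # number of children in the cohort described above
neighborhood = pd.DataFrame({           # hypothetical neighborhood-level measures
    "median_income": rng.normal(55_000, 15_000, n),
    "pct_college": rng.uniform(5, 60, n),
    "median_home_value": rng.normal(180_000, 60_000, n),
    "pct_owner_occupied": rng.uniform(20, 95, n),
    "pct_poverty": rng.uniform(2, 40, n),
    "pct_unemployed": rng.uniform(2, 15, n),
})

# Standardize each measure, flip the sign of the "adverse" ones so that a
# higher value always means higher socioeconomic position, then sum into one score.
z = (neighborhood - neighborhood.mean()) / neighborhood.std()
z[["pct_poverty", "pct_unemployed"]] *= -1
composite_score = z.sum(axis=1)

# Quartiles of the composite score define the four comparison groups.
quartile = pd.qcut(composite_score, 4, labels=["Q1 (lowest)", "Q2", "Q3", "Q4 (highest)"])
print(quartile.value_counts().sort_index())
```

Each child's quartile would then enter an adjusted survival model for graft loss alongside race and clinical covariates, which is how hazard ratios of the kind quoted above are obtained.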
Intramuscular hepatitis B immunoglobulin (HBIG) and nucleosides for prevention of recurrent hepatitis B following liver transplantation: comparison with other HBIG regimens
CLINICAL TRANSPLANTATION, Issue 4 2007. Robert D Anderson
Abstract: High titer hepatitis B immunoglobulin (HBIG) has significantly reduced the recurrence of hepatitis B virus (HBV) infection after liver transplantation. We compared our experience with intramuscular (IM) HBIG prophylaxis to our earlier outcomes with intravenous (IV) HBIG and other regimens. Methods: One hundred and twenty-three patients with acute or chronic hepatitis B underwent liver transplant at the Baylor Regional Transplant Center between July 1985 and July of 2005. Of these, 63 (43%) received long-term low-dose IM (n = 17) or high-dose IV (n = 46) HBIG. All patients in the IM group also received a nucleoside before and after transplant. These patients were compared with those transplanted earlier who received either no prophylaxis (n = 16) or HBIG on day zero and one only (n = 44). Results: HBV recurrence was significantly lower in patients who received long-term HBIG [9/38 (23.7%) for IV and 1/17 (5.9%) for IM] compared with patients who received no treatment (8/11; 72.7%) or only two doses of HBIG (32/40; 80.0%). Two-yr actuarial survivals were 89%, 88%, 54%, and 64%, respectively. Patients on long-term HBIG by either parenteral route survived as well as patients transplanted for other indications. Post-transplant recurrence of hepatitis B in the long-term HBIG groups was usually controlled by intensifying antiviral therapy. Conclusion: Graft loss is usually avoidable when recurrence is discovered early and aggressively treated. The IM route is preferable to IV administration due to its ease of administration and lower cost. [source]

The effect of Daclizumab in a high-risk renal transplant population
CLINICAL TRANSPLANTATION, Issue 5 2000. Herwig-Ulf Meier-Kriesche
Introduction: African-American (AA) renal transplant recipients have a higher incidence of acute rejection when compared to Caucasian renal transplant recipients. This higher rejection rate holds true even with the addition of several of the newer immunosuppressive agents (e.g. mycophenolate mofetil (MMF) and Rapamycin). Acute rejection rates among Hispanic (H) renal transplant recipients are higher in some settings, while lower or the same as in Caucasians in other settings. IL-2 receptor antibodies have been shown to decrease rejection rates when added to a regimen of cyclosporine (CsA), azathioprine and prednisone. Limited data are available on these agents in conjunction with triple CsA, MMF and prednisone therapy, particularly in higher risk group patients. We studied the effect of the addition of the IL-2 receptor antibody Daclizumab to a CsA, MMF, prednisone regimen in a group of African-American and high-risk Hispanic renal transplant recipients. Methods: This was a non-randomized, prospective study. A total of 49 renal transplant recipients (29 African-American and 20 Hispanic) were studied and followed. A simultaneous cohort of 56 (31 African-American and 25 Hispanic) renal transplant recipients receiving CsA, MMF and prednisone with no standard induction agent served as the control group. The study cohort received the same regimen with the addition of Daclizumab at 1 mg/kg for five doses over 10 wk. Multivariate analysis was performed to isolate independent factors influencing the study's results. Results: A total of 56 patients in the control group and 49 patients in the Daclizumab group received an average follow-up of 17.1 ± 6.9 and 12.7 ± 5.1 months, respectively. Acute rejection rates were lower in the Daclizumab group than in the control group: 26.4% versus 49.3% per patient-year, respectively. A total of eight recurrent rejections in 6 patients occurred in the control group and none in the Daclizumab arm. Graft loss at this follow-up was no different between the groups. Conclusion: The addition of Daclizumab to a regimen of CsA, MMF and prednisone decreases acute rejection episodes in a high-risk group of African-American and Hispanic renal transplant recipients. [source]
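A note on the units in the Daclizumab comparison: "rejection rate per patient-year" divides the number of rejection episodes by the total follow-up accumulated by each group. The toy calculation below back-computes plausible numbers from the group sizes and mean follow-up reported above; the episode counts themselves are hypothetical, chosen only to reproduce rates close to the 26.4% and 49.3% figures.

```python
def rate_per_patient_year(episodes: int, patients: int, mean_follow_up_months: float) -> float:
    """Rejection episodes divided by total follow-up in patient-years."""
    patient_years = patients * mean_follow_up_months / 12
    return episodes / patient_years

control = rate_per_patient_year(39, patients=56, mean_follow_up_months=17.1)      # hypothetical count
daclizumab = rate_per_patient_year(14, patients=49, mean_follow_up_months=12.7)   # hypothetical count

print(f"control:    {control:.1%} per patient-year")
print(f"daclizumab: {daclizumab:.1%} per patient-year")
print(f"rate ratio: {daclizumab / control:.2f}")
```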
A Phase III Study of Belatacept Versus Cyclosporine in Kidney Transplants from Extended Criteria Donors (BENEFIT-EXT Study)
AMERICAN JOURNAL OF TRANSPLANTATION, Issue 3 2010. A. Durrbach
Recipients of extended criteria donor (ECD) kidneys are at increased risk for graft dysfunction/loss, and may benefit from immunosuppression that avoids calcineurin inhibitor (CNI) nephrotoxicity. Belatacept, a selective costimulation blocker, may preserve renal function and improve long-term outcomes versus CNIs. BENEFIT-EXT (Belatacept Evaluation of Nephroprotection and Efficacy as First-line Immunosuppression Trial – EXTended criteria donors) is a 3-year, Phase III study that assessed a more (MI) or less intensive (LI) regimen of belatacept versus cyclosporine in adult ECD kidney transplant recipients. The coprimary endpoints at 12 months were composite patient/graft survival and a composite renal impairment endpoint. Patient/graft survival with belatacept was similar to cyclosporine (86% MI, 89% LI, 85% cyclosporine) at 12 months. Fewer belatacept patients reached the composite renal impairment endpoint versus cyclosporine (71% MI, 77% LI, 85% cyclosporine; p = 0.002 MI vs. cyclosporine; p = 0.06 LI vs. cyclosporine). The mean measured glomerular filtration rate was 4–7 mL/min higher on belatacept versus cyclosporine (p = 0.008 MI vs. cyclosporine; p = 0.1039 LI vs. cyclosporine), and the overall cardiovascular/metabolic profile was better on belatacept versus cyclosporine. The incidence of acute rejection was similar across groups (18% MI; 18% LI; 14% cyclosporine). Overall rates of infection and malignancy were similar between groups; however, more cases of posttransplant lymphoproliferative disorder (PTLD) occurred in the CNS on belatacept. ECD kidney transplant recipients treated with belatacept-based immunosuppression achieved similar patient/graft survival, better renal function, had an increased incidence of PTLD, and exhibited improvement in the cardiovascular/metabolic risk profile versus cyclosporine-treated patients. [source]

Anaemia after renal transplantation
EUROPEAN JOURNAL OF CLINICAL INVESTIGATION, Issue 2005. M. Lorenz
Abstract: Anaemia is a frequent complication among long-term renal transplant recipients. A prevalence of approximately 40% has been reported in several studies. If renal function declines to stage 5 kidney disease, the prevalence of anaemia in kidney transplants is even higher. A positive correlation between haemoglobin concentration and creatinine clearance has been reported, which is a function of endogenous erythropoietin production by the functioning graft. Inflammation related to a retained kidney graft may cause hypo-responsiveness to erythropoietic agents once kidney transplant recipients return to dialysis. Furthermore, the use of azathioprine, mycophenolate mofetil and sirolimus may be associated with post-transplant anaemia. Along with erythropoietin deficiency, depletion of iron stores is one of the major reasons for anaemia in the kidney transplant population. The proportion of hypochromic red blood cells appears to be a useful parameter to measure iron supply and utilization as well as to estimate mortality risks in kidney transplant recipients. While anaemia is an important cardiovascular risk factor after transplantation, our data suggest that anaemia is not associated with mortality and graft loss. Nevertheless, inadequate attention is paid so far to the management of anaemia after renal transplantation. A promising future aspect for risk reduction of cardiovascular disease includes the effect of erythropoietic agents on endothelial progenitor cells. [source]

Enhancing the outcome of free latissimus dorsi muscle flap reconstruction of scalp defects
HEAD & NECK: JOURNAL FOR THE SCIENCES & SPECIALTIES OF THE HEAD AND NECK, Issue 1 2004. Joan E. Lipa MD, FRCS(C)
Background. Reconstruction of scalp and calvarial defects after tumor ablation frequently requires prosthetic cranioplasty and cutaneous coverage. Furthermore, patients often have advanced disease and receive perioperative radiotherapy. We evaluated the complications of scalp reconstruction with a free latissimus dorsi muscle flap in this setting. Methods. The complications and the oncologic and aesthetic outcomes of six consecutive scalp reconstructions with a free latissimus dorsi muscle flap and skin graft in five patients with advanced cancer were retrospectively evaluated.
Patient, tumor, defect, reconstructive, and other treatment characteristics were reviewed, along with the reconstructive and perioperative techniques intended to improve flap survival and aesthetic outcome and reduce complications in these patients. Results. All patients (52–76 years old) had recurrent tumors (sarcoma, melanoma, or squamous cell carcinoma) and received postoperative radiotherapy. The mean scalp defect size was 367 cm2, and partial-thickness or full-thickness calvarial resection was required in all six cases. No vein grafts were needed. The mean follow-up period and disease-free survival time were 18 and 13 months, respectively. Three patients died of their disease, and two survived disease free. There were no flap failures or dehiscences. Complications consisted of donor site seroma in two patients; partial skin graft loss in one patient; and radiation burns to the flap, face, and ears in one patient. Scalp contour and aesthetic outcome were very good in all cases except for the one case with radiation burns. Conclusions. Good outcomes were achieved using a free latissimus dorsi muscle flap with a skin graft for flap reconstruction in elderly patients with advanced recurrent cancers who received perioperative radiotherapy. Several technical aspects of the reconstruction technique intended to enhance the functional and aesthetic outcome and/or reduce complications were believed to have contributed to the good results. © 2004 Wiley Periodicals, Inc. Head Neck 26: 46–53, 2004 [source]

Successful renal transplantation in the right iliac fossa 2 years after serious deep venous thrombosis in a patient with systemic lupus erythematosus
INTERNATIONAL JOURNAL OF UROLOGY, Issue 10 2005. NORIHIKO TSUCHIYA
Abstract: Deep venous thrombosis (DVT) may occur in the perioperative period and can induce serious complications such as pulmonary embolism. On the other hand, allograft renal vein thrombosis leads to a high incidence of graft loss. We experienced a case in which a serious DVT occurred prior to renal transplantation; however, a successful renal transplantation in the right iliac fossa was performed after 2 years of anticoagulant therapy. It is suggested that the external iliac vein, even after suffering from DVT, can be anastomosed to an allograft vein successfully when enough blood flow or a lower venous pressure is confirmed. However, one should be aware of the risk factors and the adequate management of thrombosis in renal transplantation because of the serious complications of DVT and the poor prognosis of allograft vein thrombosis. [source]

Immune tolerance: mechanisms and application in clinical transplantation
JOURNAL OF INTERNAL MEDICINE, Issue 3 2007. M. Sykes
Abstract. The achievement of immune tolerance, a state of specific unresponsiveness to the donor graft, has the potential to overcome the current major limitations to progress in organ transplantation, namely late graft loss, organ shortage and the toxicities of chronic nonspecific immunosuppressive therapy. Advances in our understanding of immunological processes, mechanisms of rejection and tolerance have led to encouraging developments in animal models, which are just beginning to be translated into clinical pilot studies. These advances are reviewed here and the appropriate timing for clinical trials is discussed. [source]
Prophylaxis against recurrence of hepatitis B virus after liver transplantation: a retrospective analysis spanning 20 years
LIVER INTERNATIONAL, Issue 1 2008. Nevin Yilmaz
Abstract: Liver transplantation (LT) in patients with hepatitis B virus (HBV) infection is associated with a high rate of graft loss and poor survival unless re-infection can be prevented. Human hepatitis B immune globulin (HBIG) has long been utilized to prevent re-infection. More recently, an anti-viral agent has been utilized along with HBIG. However, the regimens utilized have varied considerably among LT programmes and the optimal regimen has never been defined. We conducted a retrospective analysis of 41 patients who underwent LT for HBV at our centre since 1985 and received either HBIG with or without an anti-viral agent. The mean age of these patients was 46 years; 81% were male and 88% white. The mean and maximal follow-up were 5.9 and 15 years respectively. Eight of 15 E-antigen-positive patients who received HBIG alone developed recurrence after a mean of 17 months. In contrast, none of 10 E-antigen-negative patients who received HBIG alone and none of the 10 E-antigen-positive patients who received both HBIG and either lamivudine or adefovir developed recurrence. As long as anti-HBs (anti-HB surface) remained detectable, no absolute minimum serum level appeared to lead to recurrent HBV. We concluded that recurrence of HBV following LT can be prevented in E-antigen-positive patients with a combination of HBIG and an anti-viral agent. In contrast, recurrence can be prevented in E-antigen-negative patients with HBIG alone. Maintaining a serum anti-HBs level above a minimum arbitrary titre of 200 pg/mL did not appear to be necessary for effective HBIG prophylaxis. [source]

Cholestatic syndrome with bile duct damage and loss in renal transplant recipients with HCV infection
LIVER INTERNATIONAL, Issue 2 2001. Johanna K. Delladetsima
Background/Aims: Bile duct cells are known to be susceptible to hepatitis B and C virus, while it has been recently suggested that hepatitis B virus (HBV) and hepatitis C virus (HCV) infection may have a direct role in the pathogenesis of vanishing bile duct syndrome (VBDS) after liver transplantation. We report the development of a cholestatic syndrome associated with bile duct damage and loss in four HCV-infected renal transplant recipients. Methods: All four patients were followed up biochemically, serologically and with consecutive liver biopsies. Serum HCV RNA was quantitatively assessed and genotyping was performed. Results: Three patients were anti-HCV negative and one was anti-HCV/HBsAg positive at the time of transplantation; all received the combination of methylprednisolone, azathioprine and cyclosporine A. Two patients became anti-HCV positive 1 year and one patient 3 years post-transplantation. Elevation of the cholestatic enzymes appeared simultaneously with seroconversion, or 2–4 years later, and was related to lesions of the small-sized interlobular bile ducts. Early bile duct lesions were characterized by degenerative changes of the epithelium. Late and more severe bile duct damage was associated with bile duct loss. The progression of the cholestatic syndrome coincided with high HCV RNA serum levels, while the HCV genotype was 1a and 1b. Two patients (one with HBV co-infection) developed progressive VBDS and died of liver failure 2 and 3 years after biochemical onset.
One patient, despite developing VBDS within a 10-month period, showed marked improvement of liver function after cessation of immunosuppression because of graft loss. The fourth patient, who had mild biochemical and histological bile duct changes, almost normalized liver function tests after withdrawal of azathioprine. Conclusion: A progressive cholestatic syndrome due to bile duct damage and loss may develop in renal transplant patients with HCV infection. The occurrence of the lesions after the appearance of anti-HCV antibodies and the high HCV RNA levels are indicative of viral involvement in the pathogenesis. Withdrawal of immunosuppressive therapy may have a beneficial effect on the outcome of the disease. [source]

Seventh Day Syndrome – acute hepatocyte apoptosis associated with a unique syndrome of graft loss following liver transplantation
LIVER INTERNATIONAL, Issue 1 2001. Muhammed Ashraf Memon
Aim: The aim of this study is to describe a unique 7th day syndrome (7DS), quite different from other causes of post-transplantation allograft dysfunction, in a group of orthotopic liver transplant (OLT) patients who needed retransplantation. Methods: A retrospective analysis of 594 consecutive OLT over an 8-year period revealed that 10 patients developed allograft dysfunction approximately 7 days following an initially normal graft function. Results: The features included: (a) severe liver failure; (b) a sudden peak of extremely high liver enzymes at approximately day 7; (c) serial liver biopsy findings of central lobular hemorrhage with minimal inflammatory cell infiltrate and (d) an explant with no evidence of vascular thrombosis. The biochemical and morphometric pathological data of these patients were compared with data of patients who had early acute rejection (AR), hepatic artery thrombosis (HAT), primary non-function (PNF), severe sepsis and no dysfunction. Lastly, serial liver core biopsies and explants were tested for evidence of apoptosis, which revealed a significantly higher number of apoptotic hepatocytes in 7DS compared to all control groups. Conclusions: Seventh Day Syndrome is a distinct entity associated with early graft dysfunction characterized by a marked apoptosis of hepatocytes. Fas receptor activation or other pathways of programmed cell death may be implicated in the occurrence of 7DS. [source]

Long-term outcomes in pediatric liver transplantation
LIVER TRANSPLANTATION, Issue S2 2009. John Bucuvalas
Key Points:
1. Critical clinical outcomes for pediatric liver transplantation (LT) recipients include (1) patient and graft survival, (2) complications (immune and nonimmune) of chronic immunosuppressive medications, and (3) long-term graft function.
2. Recurrence of malignancy, sepsis, and posttransplant lymphoproliferative disorder account for more than 65% of deaths occurring more than 1 year after LT.
3. Chronic rejection, late hepatic artery thrombosis, and biliary strictures account for 70% of graft loss occurring more than 1 year after LT.
4. Late histological changes in the allograft are emerging as a common problem in patients more than 5 years after LT. The pathogenesis of these findings and the impact on graft survival remain to be determined.
Liver Transpl 15:S6–S11, 2009. © 2009 AASLD. [source]

ABO-incompatible deceased donor liver transplantation in the United States: A national registry analysis
LIVER TRANSPLANTATION, Issue 8 2009. Zoe A. Stewart
In the United States, ABO-incompatible liver transplantation (ILT) is limited to emergent situations when ABO-compatible liver transplantation (CLT) is unavailable. We analyzed the United Network for Organ Sharing database of ILT performed from 1990–2006 to assess ILT outcomes for infant (0-1 years; N = 156), pediatric (2-17 years; N = 170), and adult (> 17 years; N = 667) patients. Since 2000, the number of ILT has decreased annually, and there has been decreased use of blood type B donors and increased use of blood type A donors. Furthermore, ILT graft survival has improved for all age groups in recent years, beyond the improved graft survival attributable to era effect based on comparison to respective age group CLT. On matched control analysis, graft survival was significantly worse for adult ILT as compared to adult CLT. However, infant and pediatric ILTs did not have worse graft survival versus age-matched CLT. Adjusted analyses identified age-specific characteristics impacting ILT graft loss. For infants, transplant after 2000 and donor age < 9 years were associated with reduced risk of ILT graft loss. For pediatric patients, female recipient sex and donor age > 50 years were associated with increased risk of ILT graft loss. For adults, life support, repeat transplant, split grafts, and hepatocellular carcinoma were associated with increased risk of ILT graft loss. The current study identifies important trends in ILT in the United States in the modern immunosuppression era, as well as specific recipient, donor, and graft characteristics impacting ILT graft survival that could be utilized to guide ILT organ allocation in exigent circumstances. Liver Transpl 15:883–893, 2009. © 2009 AASLD. [source]

Extended right liver grafts obtained by an ex situ split can be used safely for primary and secondary transplantation with acceptable biliary morbidity
LIVER TRANSPLANTATION, Issue 7 2009. Atsushi Takebe
Split liver transplantation (SLT) is clearly beneficial for pediatric recipients. However, the increased risk of biliary complications in adult recipients of SLT in comparison with whole liver transplantation (WLT) remains controversial. The objective of this study was to investigate the incidence and clinical outcome of biliary complications in an SLT group using split extended right grafts (ERGs) after ex situ splitting in comparison with WLT in adults. The retrospectively collected data for 80 consecutive liver transplants using ERGs after ex situ splitting between 1998 and 2007 were compared with the data for 80 liver transplants using whole liver grafts in a matched-pair analysis paired by the donor age, recipient age, indications, Model for End-Stage Liver Disease score, and high-urgency status. The cold ischemic time was significantly longer in the SLT group (P = 0.006). As expected, bile leakage from the transected surface occurred only in the SLT group (15%) without any mortality or graft loss. The incidence of all other early or late biliary complications (eg, anastomotic leakage and stenosis) was not different between SLT and WLT. The 1- and 5-year patient and graft survival rates showed no statistical difference between SLT and WLT [83.2% and 82.0% versus 88.5% and 79.8% (P = 0.92) and 70.8% and 67.5% versus 83.6% and 70.0% (P = 0.16), respectively]. In conclusion, ERGs can be used safely without any increased mortality and with acceptable morbidity, and they should also be considered for retransplantation.
The significantly longer cold ischemic time in the SLT group indicates the potential for improved results and should thus be considered in the design of allocation policies. Liver Transpl 15:730–737, 2009. © 2009 AASLD. [source]

Steroid avoidance in liver transplantation: Meta-analysis and meta-regression of randomized trials
LIVER TRANSPLANTATION, Issue 4 2008. Dorry L. Segev
Steroid use after liver transplantation (LT) has been associated with diabetes, hypertension, hyperlipidemia, obesity, and hepatitis C (HCV) recurrence. We performed meta-analysis and meta-regression of 30 publications representing 19 randomized trials that compared steroid-free with steroid-based immunosuppression (IS). There were no differences in death, graft loss, and infection. Steroid-free recipients demonstrated a trend toward reduced hypertension [relative risk (RR) 0.84, P = 0.08], and statistically significant decreases in cholesterol (standard mean difference −0.41, P < 0.001) and cytomegalovirus (RR 0.52, P = 0.001). In studies where steroids were replaced by another IS agent, the risks of diabetes (RR 0.29, P < 0.001), rejection (RR 0.68, P = 0.03), and severe rejection (RR 0.37, P = 0.001) were markedly lower in steroid-free arms. In studies in which steroids were not replaced, rejection rates were higher in steroid-free arms (RR 1.31, P = 0.02) and the reduction of diabetes was attenuated (RR 0.74, P = 0.2). HCV recurrence was lower with steroid avoidance and, although no individual trial reached statistical significance, meta-analysis demonstrated this important effect (RR 0.90, P = 0.03). However, we emphasize the heterogeneity of trials performed to date and, as such, do not recommend basing clinical guidelines on our conclusions. We believe that a large, multicenter trial will better define the role of steroid-free regimens in LT. Liver Transpl 14:512–525, 2008. © 2008 AASLD. [source]
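The pooled relative risks quoted above come from standard meta-analysis mechanics: each trial's RR is log-transformed, weighted by the inverse of its variance (recoverable from the confidence interval), and combined. The sketch below shows a minimal fixed-effect version with made-up trial values, not the trials in this review; a random-effects model, more appropriate when heterogeneity is substantial as the authors caution, would add a between-trial variance term.

```python
import numpy as np

def pooled_rr(rr, ci_low, ci_high):
    """Fixed-effect (inverse-variance) pooling of relative risks on the log scale."""
    log_rr = np.log(rr)
    se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)  # SE recovered from the 95% CI width
    w = 1.0 / se**2
    pooled = np.sum(w * log_rr) / np.sum(w)
    pooled_se = np.sqrt(1.0 / np.sum(w))
    return np.exp([pooled, pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se])

# Made-up per-trial relative risks for a single outcome (e.g. new-onset diabetes).
rr, lo, hi = pooled_rr(np.array([0.40, 0.25, 0.30]),
                       np.array([0.20, 0.10, 0.15]),
                       np.array([0.80, 0.60, 0.60]))
print(f"pooled RR {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```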
Outcomes in hepatitis C virus–infected recipients of living donor vs. deceased donor liver transplantation
LIVER TRANSPLANTATION, Issue 1 2007. Norah A. Terrault
In this retrospective study of hepatitis C virus (HCV)–infected transplant recipients in the 9-center Adult to Adult Living Donor Liver Transplantation Cohort Study, graft and patient survival and the development of advanced fibrosis were compared among 181 living donor liver transplant (LDLT) recipients and 94 deceased donor liver transplant (DDLT) recipients. Overall 3-year graft and patient survival were 68% and 74% in LDLT, and 80% and 82% in DDLT, respectively. Graft survival, but not patient survival, was significantly lower for LDLT compared to DDLT (P = 0.04 and P = 0.20, respectively). Further analyses demonstrated lower graft and patient survival among the first 20 LDLT cases at each center (LDLT ≤20) compared to later cases (LDLT >20; P = 0.002 and P = 0.002, respectively) and DDLT recipients (P < 0.001 and P = 0.008, respectively). Graft and patient survival in LDLT >20 and DDLT were not significantly different (P = 0.66 and P = 0.74, respectively). Overall, 3-year graft survival for DDLT, LDLT >20, and LDLT ≤20 were 80%, 79% and 55%, with similar results conditional on survival to 90 days (84%, 87% and 68%, respectively). Predictors of graft loss beyond 90 days included LDLT ≤20 vs. DDLT (hazard ratio [HR] = 2.1, P = 0.04), pretransplant hepatocellular carcinoma (HCC) (HR = 2.21, P = 0.03) and model for end-stage liver disease (MELD) score at transplantation (HR = 1.24, P = 0.04). In conclusion, 3-year graft and patient survival in HCV-infected recipients of DDLT and LDLT >20 were not significantly different. Important predictors of graft loss in HCV-infected patients were limited LDLT experience, pretransplant HCC, and higher MELD at transplantation. Liver Transpl 13:122–129, 2007. © 2006 AASLD. [source]

The association of HLA-DR13 with lower graft survival rates in hepatitis B and primary sclerosing cholangitis Caucasian patients receiving a liver transplant
LIVER TRANSPLANTATION, Issue 4 2006. Yasuro Futagawa
We investigated an association of human leukocyte antigen (HLA)-DR13 with graft survival in liver transplantation among Caucasian recipients. 28,708 deceased donor liver transplants performed between January 1990 and December 2002 in the United States, as reported to the United Network for Organ Sharing registry, were utilized to compare survival rates. We analyzed Caucasian adult patients (>20 years) by Kaplan-Meier curves, log-rank tests, and Cox proportional hazard analyses. HLA-DR13-positive hepatitis B virus (HBV) and primary sclerosing cholangitis (PSC) recipients yielded significantly lower graft survival rates than those of DR13-negative patients (P = 0.002, P = 0.015, respectively). This negative association was still significant after adjusting for potential confounding factors. The Cox test demonstrated that HLA-DR13-positive groups have a significantly higher hazard ratio in PSC (1.40; P = 0.029; 95% confidence interval, 1.04-1.90) and HBV patients (1.78; P = 0.032; 95% confidence interval, 1.05-3.02). In conclusion, our data suggest that HLA-DR13 is a strong, positive predictor of increased risk for graft loss in HBV and PSC liver transplant recipients. Further study is needed to test the hypothesis that DR13-related immune responses may play a role in mediating graft loss in HBV and PSC liver transplantations. Liver Transpl 12:600–604, 2006. © 2006 AASLD. [source]
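The graft-survival comparison in the HLA-DR13 study rests on Kaplan-Meier curves compared with a log-rank test, with Cox models supplying the adjusted hazard ratios. A minimal sketch of that first step is shown below, again assuming the Python lifelines library and using synthetic follow-up times rather than the registry data.

```python
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(2)
# Synthetic graft survival times (years) and event indicators (1 = graft loss).
t_dr13_pos, e_dr13_pos = rng.exponential(9, 300), rng.integers(0, 2, 300)
t_dr13_neg, e_dr13_neg = rng.exponential(12, 700), rng.integers(0, 2, 700)

km = KaplanMeierFitter()
for label, t, e in [("HLA-DR13 positive", t_dr13_pos, e_dr13_pos),
                    ("HLA-DR13 negative", t_dr13_neg, e_dr13_neg)]:
    km.fit(t, event_observed=e, label=label)
    print(label, "estimated 5-year graft survival:", round(float(km.predict(5.0)), 2))

result = logrank_test(t_dr13_pos, t_dr13_neg,
                      event_observed_A=e_dr13_pos, event_observed_B=e_dr13_neg)
print("log-rank p-value:", round(result.p_value, 4))
```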
Hepatitis B in liver transplant recipients
LIVER TRANSPLANTATION, Issue S2 2006. Robert G. Gish
Key Concepts:
1. The use of low-dose immunosuppressive therapy along with pre- and posttransplantation nucleos(t)ide therapy and posttransplantation hepatitis B immunoglobulin (HBIG) has yielded marked improvements in survival.
2. Lamivudine (Epivir-HBV), adefovir (Hepsera), entecavir (Baraclude), tenofovir (Viread), emtricitabine (Emtriva), and the combination drugs tenofovir + emtricitabine (Truvada) and abacavir + lamivudine (Epzicom) are effective nucleos(t)ide antiviral agents that, in some cases, may help reverse liver disease sufficiently to avoid transplant.
3. In posttransplantation patients, virus suppression with some combination of HBIG and the nucleos(t)ide agents may prevent graft loss and death or the need for a second transplant.
4. In both the pre- and posttransplantation setting, the goal of hepatitis B virus management is complete virus suppression.
5. The use of low-dose intramuscular HBIG is evolving, with studies showing that dosing and cost can be reduced by 50–300% with a customized approach.
6. Elimination of HBIG from the treatment paradigm is currently under evaluation and may be possible with the use of newer medications that have no or low resistance rates.
7. Although there is growing evidence that some types of combination therapy may decrease the chance that drug resistance will develop and increase the likelihood of long-term success in preventing graft loss and death, additional research will be required to determine which combinations will work well in the long term, and which will not.
Liver Transpl 12:S54–S64, 2006. © 2006 AASLD. [source]

Preclinical experiment of auxiliary partial orthotopic liver transplantation as a curative treatment for hemophilia
LIVER TRANSPLANTATION, Issue 5 2005. Saiho Ko
The cause of hemophilia is deficiency of coagulation factor VIII production in the liver, which can be cured by liver transplantation. Because the hepatic function of hemophilia patients is quite normal except for production of factor VIII, auxiliary partial orthotopic liver transplantation (APOLT) is beneficial in that patient survival is secured by preserving the native liver even in the event of graft loss. However, it is not known whether the graft of APOLT would be enough to cure hemophilia. We evaluated the efficacy and feasibility of APOLT for hemophilia in a canine hemophilia A model that we established. A partial left liver graft was taken from the normal donor (blood factor VIII activity > 60%). The graft was transplanted to the hemophilia beagle dog (blood factor VIII activity < 5%) after resection of the left lobe, preserving the native right lobe. Changes over time in blood factor VIII activity and liver function parameters were observed after APOLT. APOLT and perioperative hemostatic management were successfully performed. The blood factor VIII activity increased to 30% after APOLT and was sustained for at least 6 weeks throughout the observation period without symptoms of bleeding. The result demonstrated sustained production of factor VIII in the hemophilia recipient after APOLT. Transplantation of approximately one third of the whole liver resulted in cure of hemophilia. In conclusion, it is suggested that APOLT would be feasible as a curative treatment of hemophilia A to improve the quality of life of the patients. (Liver Transpl 2005;11:579–584.) [source]

Effects of interferon treatment on liver histology and allograft rejection in patients with recurrent hepatitis C following liver transplantation
LIVER TRANSPLANTATION, Issue 7 2004. R. Todd Stravitz
Recurrent hepatitis C after liver transplantation remains a significant cause of graft loss and retransplantation. Although treatment of recurrent hepatitis C with interferon-based regimens has become widely accepted as safe and can lead to sustained virologic clearance of hepatitis C virus (HCV) RNA, long-term histologic improvement and the risk of precipitating graft rejection remain controversial. The present study is a retrospective analysis of the clinical and histological consequences of treating recurrent hepatitis C with interferon-based therapy in a selected group of liver transplant recipients. Twenty-three liver transplant recipients with recurrent hepatitis C and histologic evidence of progressive fibrosis completed at least 6 months of interferon, 83% of whom received pegylated interferon α-2b; only 4 tolerated ribavirin. Overall, 11 patients (48%) had undetectable HCV RNA at the end of 6 months of treatment. Of these patients, 3 remained HCV RNA–negative on maintenance interferon monotherapy for 33 months, and the other 8 (35%) completed treatment and remained HCV RNA–undetectable 24 weeks after discontinuation of interferon. Overall necroinflammatory activity in liver biopsies obtained 2 years after HCV RNA became undetectable decreased significantly (7.73 ± 2.37 vs. 5.64 ± 2.94 units before and after treatment, respectively; P = .016). However, 5 of these 11 patients had no histologic improvement in follow-up liver histology. Liver biopsies in the 12 nonresponders demonstrated disease progression.
Of the 23 patients treated with interferon, 8 (35%) had evidence of acute or chronic rejection on posttreatment liver biopsy, most of whom had no previous history of rejection (P < .01 for comparison of pretreatment and posttreatment prevalence of histologic rejection), and 2 experienced graft loss from chronic rejection, requiring retransplantation. In conclusion, interferon treatment of recurrent hepatitis C does not consistently improve histologic disease after virologic response, and it may increase the risk of allograft rejection. (Liver Transpl 2004;10:850–858.) [source]

Outcomes of acute rejection after interferon therapy in liver transplant recipients
LIVER TRANSPLANTATION, Issue 7 2004. Sammy Saab
Interferon alfa has been increasingly used against recurrent hepatitis C (HCV) disease in post-liver transplant (LT) recipients. A serious potential adverse effect is acute rejection. We reviewed our experience using interferon-based therapy (interferon or pegylated interferon with or without ribavirin) for treating recurrent HCV in LT recipients. Forty-four LT recipients were treated with interferon for recurrent HCV. Five of the 44 patients developed acute rejection during interferon-based therapy. These 5 patients started treatment 42.4 ± 33.89 months (mean ± SD) after LT. Mean (± SD) histological activity index and fibrosis scores before initiating antiviral therapy were 8.8 (± 1.92) and 2.6 (± 0.55), respectively. Patients were treated for 3.3 ± 2.28 months (mean ± SD) prior to rejection. At the time of rejection, HCV load was not detectable in 4 of the 5 recipients. All 5 patients had tolerated interferon therapy, and none had stopped therapy because of adverse effects. The rejection was successfully treated in 3 patients. In 2 of those 3 patients, cirrhosis eventually developed. In the 2 patients who did not respond to rejection treatment, immediate graft failure occurred, leading to re-LT in 1 patient and death from sepsis in the other. In conclusion, the results indicate that further studies are needed to assess the safety of interferon in LT recipients. Interferon-based therapy may lead to acute rejection and subsequent graft loss and should therefore be used with caution. Treated recipients may also develop progressive cirrhosis despite achieving a sustained virological response. (Liver Transpl 2004;10:859–867.) [source]

Role of adult living donor liver transplantation in patients with hepatitis C
LIVER TRANSPLANTATION, Issue 10C 2003. Gregory T. Everson
Key points:
1. Living donor liver transplantation (LDLT) is an option for patients with end-stage liver disease or hepatoma caused by chronic hepatitis C.
2. Reports from some, but not all, transplant centers indicate that hepatitis C may recur earlier, recurrence may be more severe, and graft loss caused by recurrent hepatitis C may be more frequent in LDLT compared with cadaveric transplantation.
3. Several unique characteristics of LDLT (versus cadaveric transplantation) may favor severe recurrence of hepatitis C. These include an increase in genetic similarity between donor and recipient, a higher degree of HLA matching, greater systemic bioavailability of immunosuppressive agent, and hepatic regeneration.
4. Hepatic regeneration may promote the acceleration and severity of recurrent hepatitis C by enhancement of hepatitis C viral uptake by hepatocytes through stimulation of the low-density lipoprotein receptor and an increase in activity of the internal ribosomal entry site. [source]
Can early liver biopsies predict long-term outcome of the graft?
LIVER TRANSPLANTATION, Issue 1 2003. Lydia M. Petrovic MD
Background: Chronic rejection (CR) in liver allografts shows a rapid onset and progressive course, leading to graft failure within the first year after transplantation. Most cases are preceded by episodes of acute cellular rejection (AR), but histological features predictive of the transition toward CR are not well documented. Method: We assessed the predictive value of centrilobular necrosis, central vein endothelialitis (CVE), central vein fibrosis, and lobular inflammation in the development of CR. One-week and one-month biopsy specimens of 12 patients with CR were compared with those of a control group consisting of 17 patients who experienced AR without developing CR. The progress of the histological changes was further evaluated in follow-up biopsy specimens of the CR group taken at 2 months and beyond 3 months after transplantation. Results: Centrilobular necrosis, CVE, central vein fibrosis, and lobular inflammation were common features in both groups at 1 week. At 1 month, the incidence declined in the control group. The CR group showed an increased incidence and persistence of these features in the follow-up specimens. The incidence and median grade of severity of CVE were significantly higher in the CR group (P = 0.04 and P < 0.001). The severity of portal and lobular inflammation was also more pronounced in the CR group (P = 0.01 and 0.069). Conversely, in the control group the incidence of the lobular features decreased and the severity of CVE declined significantly (P = 0.03). Conclusion: The shift from a predominantly portal-based process toward lobular graft damage represents the early transition of AR to CR, for which a modification of immunosuppression might be necessary to prevent graft loss. [source]

Does tacrolimus offer virtual freedom from chronic rejection after primary liver transplantation? Prognostic factors in 1,048 liver transplantations with a mean follow-up of 6 years
LIVER TRANSPLANTATION, Issue 7 2001
Tacrolimus has proven to be a potent immunosuppressive agent in liver transplantation (LT). Its introduction has led to significantly less frequent and severe acute rejection. Little is known about the rate of chronic rejection (CR) in primary LT using tacrolimus therapy. The aim of the present study is to examine the long-term incidence of CR, risk factors, prognostic factors, and outcome after CR. The present study evaluated the development of CR in 1,048 consecutive adult primary liver allograft recipients initiated and mostly maintained on tacrolimus-based immunosuppressive therapy. They were evaluated with a mean follow-up of 77.3 ± 14.7 months (range, 50.7 to 100.1 months). To assess the impact of primary diagnosis on the rate and outcome of CR, the population was divided into 3 groups. Group I included patients with hepatitis C virus (HCV)- or hepatitis B virus (HBV)-induced cirrhosis (n = 312); group II included patients diagnosed with primary biliary cirrhosis (PBC), primary sclerosing cholangitis (PSC), or autoimmune hepatitis (AIH; n = 217); and group III included patients with all other diagnoses (n = 519). Overall, 32 of 1,048 patients (3.1%) developed CR. This represented 13 (4.1%), 12 (5.5%), and 7 patients (1.3%) in groups I, II, and III, respectively. The relative risk for developing CR was 3.2 times greater for group I and 4.3 times greater for group II compared with group III.
This difference was statistically significant (P = .004). The incidence of acute rejection and the total number of acute rejection episodes were significantly greater in patients who developed CR compared with those who did not (P < .0001). Similarly, the mean donor age for CR was significantly older than for patients without CR (43.0 v 36.2 years; P = .02). Thirteen of the 32 patients (40.6%) who developed CR retained their original grafts for a mean period of 54 ± 25 months after diagnosis. Seven patients (21.9%) underwent re-LT, and 12 patients (38.3%) died. Serum bilirubin levels and the presence of arteriopathy, arterial loss, and duct loss on liver biopsy at the time of diagnosis of CR were significantly greater among the 3 groups of patients. In addition, patient and graft survival for group I were significantly worse compared with groups II and III. We conclude that CR occurred rarely among patients maintained long term on tacrolimus-based immunosuppressive therapy. When steroid use is controlled, the incidence of acute rejection, mean donor age, HBV- and/or HCV-induced cirrhosis, or a diagnosis of PBC, PSC, or AIH were found to be predictors of CR. Greater values for serum bilirubin level, duct loss, arteriopathy, arteriolar loss, and presence of HCV or HBV were found to be poor prognostic factors for the 3 groups; a greater total serum bilirubin value (P = .05) was the only factor found to be significant between patients who had graft loss versus those who recovered. [source]

Hepatic artery thrombosis after orthotopic liver transplantation: A review of nonsurgical causes
LIVER TRANSPLANTATION, Issue 2 2001. Sabrina Pastacaldi
Hepatic artery thrombosis (HAT) is one of the principal causes of morbidity and graft loss following liver transplantation. There are several risk factors for the development of HAT; technical aspects of the arterial anastomosis are important particularly for early thrombosis, but the improvement of surgical technique has lessened this problem. Apart from technical causes, other risk factors include a variety of conditions such as low donor/recipient age ratio, immunologic factors, clotting abnormalities, tobacco use, and infections. In particular, cytomegalovirus (CMV) infection of endothelial cells has been recently suggested as an infective cause of HAT, as it is known to be followed by a rapid procoagulant response. Thus, latent CMV in an allograft may become activated and promote or contribute to vascular thrombosis. This review evaluates these aspects, focusing on data relating CMV infection or viremia to HAT following liver transplantation. [source]

Outcome of the use of pediatric donor livers in adult recipients
LIVER TRANSPLANTATION, Issue 1 2001. Motohiko Yasutomi
The prolonged waiting time caused by the lack of donor livers leads to an increasing number of terminally ill patients waiting for lifesaving liver transplantation. To rescue these patients, transplant programs are accepting donor organs from the expanded donor pool, using donors of increasingly older age, as well as from the pediatric age group, often despite significant mismatch in liver size. We investigated the outcome of 102 consecutive liver transplantations using pediatric donor livers in adult recipients. One-year graft survival using donors aged 12 years or younger (group 1, n = 14) and donors aged 12 to 18 years (group 2, n = 88) was compared. In addition, risk factors for graft loss and vascular complications were analyzed.
The 1-year graft survival rate in adult transplant recipients in group 1 was 64.3% compared with 87.5% in those in group 2 (P = .015). The main cause of graft loss was arterial complications, occurring in 5 of 16 transplant recipients (31.3%). Major risk factors for graft loss and vascular complications were related to the size of the donor: age, height and weight, body surface area of donor and recipient, and warm ischemic time. We conclude that the outcome of small pediatric donor livers in adult recipients is poor, mainly because of the increased incidence of arterial complications. When a pediatric donor is used in an adult recipient, ischemic time should be kept to a minimum and anticoagulative therapy should be administered in the immediate postoperative period to avoid arterial complications. However, because small pediatric donors are the only source of lifesaving organs for the infant recipient, the use of small pediatric donor livers in adults should be avoided. [source] |