Better Graft Survival

Selected Abstracts


Do six-antigen-matched cadaver donor kidneys provide better graft survival to children compared with one-haploidentical living-related donor transplants?

PEDIATRIC TRANSPLANTATION, Issue 2 2000
A report of the North American Pediatric Renal Transplant Cooperative Study
Abstract: Since 1991, more than 50% of pediatric transplant recipients have received a living donor (LD) kidney, and ~85% of these allografts were one-haploidentical parental kidneys. Short-term (1 yr) and long-term (5 yr) graft survival of LD kidneys are 10% and 15% better, respectively, than that of cadaver donor (CD) kidneys. Because of these results, children are frequently not placed on a cadaver waiting list until the possibility of an LD is excluded, a process that may take up to 1 yr. The hypothesis for this study was that the graft outcome of a six-antigen-matched CD kidney is superior to that of a one-haploidentical LD kidney, and that children are at a disadvantage by not being placed on a CD list whilst waiting for an LD. The database of the North American Pediatric Renal Transplant Cooperative Study (NAPRTCS) for 11 yrs (1987–98) was reviewed to identify children who were recipients of a six-antigen-matched CD kidney (primary and repeat transplants) and those who were recipients of a one-haploidentical LD kidney (primary and repeat transplants). Using standard statistical methods, the morbidity, rejection episodes, post-transplant hospitalizations, renal function, long- and short-term graft survival, and half-life of primary recipients were compared in the two groups. Unlike adult patients, only 2.7% (87/3313) of CD recipients in the pediatric age range received a six-antigen-matched kidney, and the annual accrual rate over 11 yrs was never higher than 4%. Comparison of 57 primary six-antigen-matched CD kidneys (PCD) with 2472 primary LD (PLD) kidneys revealed that morbidity, rejection rates, and ratios were identical in the two groups. Renal function and subsequent hospitalizations were also identical in the two groups. Five-year graft survival of the PCD group was 90% compared with 80% for the PLD group, and the half-life of the PCD group was 25 ± 12.9 yrs compared with 19.6 ± 1.3 yrs. Our data suggest that the six-antigen-matched CD kidney may have less graft loss as a result of chronic rejection and would therefore confer a better long-term outcome. Based on these findings, we recommend that all children, whilst waiting for an LD work-up, be listed with the United Network for Organ Sharing (UNOS) registry for a CD kidney. [source]
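
As an aside on the half-life figures above, here is a minimal sketch of how a graft half-life can be derived under a pure exponential-attrition assumption. This is a toy calculation only: registry half-lives such as those reported by NAPRTCS are typically computed conditional on first-year graft survival, so the abstract's 25 and 19.6 yr values will not match these numbers.

```python
import math

def graft_half_life(surv_frac: float, years: float) -> float:
    """Half-life assuming exponential attrition S(t) = exp(-lam * t),
    so t_half = ln(2) / lam. A toy model, not the NAPRTCS method."""
    lam = -math.log(surv_frac) / years
    return math.log(2) / lam

# Illustrative inputs from the abstract: 90% (PCD) vs. 80% (PLD) at 5 yr.
print(f"PCD: {graft_half_life(0.90, 5):.1f} yr")  # ~32.9 yr
print(f"PLD: {graft_half_life(0.80, 5):.1f} yr")  # ~15.5 yr
```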


Liver Transplantation with Grafts from Controlled Donors after Cardiac Death: A 20-Year Follow-up at a Single Center

AMERICAN JOURNAL OF TRANSPLANTATION, Issue 3 2010
S. Yamamoto
The first liver transplantation (LTx) in Sweden was performed in 1984, but brain death was not accepted as a legal death criterion until 1988. Between November 1984 and May 1988, we performed 40 consecutive LTxs in 32 patients. Twenty-four grafts were from donors after cardiac death (DCD) and 16 grafts from heart-beating donors (HBD). Significantly more hepatic artery thrombosis and biliary complications occurred in the DCD group (p < 0.01 and p < 0.05, respectively). Graft and patient survival did not differ between the groups. In the total group, there was a significant difference in graft survival between first-time LTx grafts and grafts used for retransplantation. Graft survival was better in nonmalignant than in malignant patients, although this did not reach statistical significance. Multivariate analysis revealed cold ischemia time and post-LTx peak ALT to be independent predictive factors for graft survival in the DCD group. In the 11 livers surviving 20 years or more, follow-up biopsies were performed 18–20 years post-LTx (n = 10) and 6 years post-LTx (n = 1). Signs of chronic rejection were seen in three cases, with no difference between DCD and HBD. Our analysis, with a 20-year follow-up, suggests that controlled DCD liver grafts might be a feasible option to increase the donor pool. [source]
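
A minimal sketch of the kind of 2×2 group comparison behind complication p-values like those above. The abstract reports only group sizes and p-values, so the event counts in this example are invented purely for illustration.

```python
from scipy.stats import fisher_exact

# Hypothetical 2x2 table: rows = DCD vs. HBD grafts,
# columns = hepatic artery thrombosis (yes, no).
# Group totals follow the abstract (24 DCD, 16 HBD); events are assumed.
table = [[10, 14],   # DCD: 10 events out of 24 (assumed)
         [1, 15]]    # HBD: 1 event out of 16 (assumed)

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"OR = {odds_ratio:.1f}, p = {p_value:.4f}")
```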


The Success of Continued Steroid Avoidance After Kidney Transplantation in the US

AMERICAN JOURNAL OF TRANSPLANTATION, Issue 12 2009
J. D. Schold
There has been a significant increase in the use of steroid avoidance regimens as initial treatment for kidney transplant recipients. Early results on the effectiveness of this strategy have been mixed, with certain prospective trials indicating increased acute rejection but population-based studies indicating similar or better graft survival compared with steroid maintenance. We conducted a retrospective study of national registry data to evaluate risk factors for discontinuation of steroid avoidance protocols based on patient characteristics and concomitant immunosuppression. We evaluated 84 647 solitary kidney transplant recipients in the US with at least 6 months of graft survival, including 24 218 initially discharged without maintenance steroids. We used logistic models to assess risk factors for new initiation of steroids after initial steroid avoidance, and survival models to describe graft survival for patients after return to steroids. The most prominent risk factors for new initiation of steroids after deceased donor kidney transplantation included African-American race (AOR = 1.32, p < 0.01), retransplants (AOR = 1.81, p < 0.01), highly sensitized recipients (AOR = 1.29, p < 0.01), recipients with Medicaid (AOR = 1.85, p < 0.01), elevated HLA-MM (AOR = 1.26, p < 0.01) and older donor age (AOR = 1.19, p < 0.01). Concomitant medications were also significantly associated with the propensity to newly initiate steroids. Cumulatively, the study suggests that both patient characteristics and concomitant medications are strongly associated with the success of steroid avoidance immunosuppressive regimens. [source]
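
For readers who want to reproduce this style of analysis, here is a minimal sketch of a registry-style logistic model fit on simulated data. The column names, simulated effects, and cohort are assumptions for illustration, not the actual registry fields or the reported AORs.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Simulated stand-in for the registry cohort; names and effects are assumed.
rng = np.random.default_rng(0)
n = 5000
df = pd.DataFrame({
    "african_american": rng.integers(0, 2, n),
    "retransplant": rng.integers(0, 2, n),
    "medicaid": rng.integers(0, 2, n),
    "hla_mm": rng.integers(0, 7, n),          # 0-6 HLA mismatches
    "donor_age_decade": rng.uniform(1, 7, n),
})
# Outcome: 1 = steroids newly initiated after a steroid-free discharge.
lin = -2.0 + 0.28 * df["african_american"] + 0.59 * df["retransplant"]
df["steroid_start"] = (rng.random(n) < 1 / (1 + np.exp(-lin))).astype(int)

X = sm.add_constant(df[["african_american", "retransplant", "medicaid",
                        "hla_mm", "donor_age_decade"]])
fit = sm.Logit(df["steroid_start"], X).fit(disp=0)
print(np.exp(fit.params))  # odds ratios, analogous to the reported AORs
```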


The impact of induction on survival after lung transplantation: an analysis of the International Society for Heart and Lung Transplantation Registry

CLINICAL TRANSPLANTATION, Issue 5 2008
Ramsey R. Hachem
Abstract: Background: The use of induction immunosuppression after lung transplantation remains controversial. In this study, we examined the impact of induction on survival after lung transplantation. Methods: We performed a retrospective cohort study of 3970 adult lung transplant recipients reported to the ISHLT Registry. We divided the cohort into three groups based on the use of induction: none, interleukin-2 receptor antagonists (IL-2 RA), and polyclonal antithymocyte globulins (ATG). We estimated graft survival using the Kaplan-Meier method and constructed a multivariable Cox proportional hazards model to examine the impact of induction on graft survival in the context of other variables. Results: During the study period, 2249 patients received no induction, 1124 received IL-2 RA, and 597 received ATG. Four years after transplantation, recipients treated with IL-2 RA had better graft survival (64%) than those treated with ATG (60%) and those who did not receive induction (57%; log-rank p = 0.0067). This survival advantage persisted in the multivariable model for single and bilateral recipients treated with IL-2 RA compared with those who did not receive induction (RR = 0.82, p = 0.007). Similarly, bilateral recipients treated with ATG had a survival advantage over bilateral recipients who did not receive induction (RR = 0.78, p = 0.043), but single lung recipients treated with ATG did not have a survival advantage over single lung recipients who did not receive induction (RR = 1.06, p = 0.58). Conclusions: Induction with IL-2 RA for single and bilateral lung recipients and induction with ATG for bilateral recipients are associated with a survival benefit, independent of other variables that might impact survival. [source]
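
A minimal sketch of the Kaplan-Meier / log-rank / Cox workflow this abstract describes, run on simulated data with the third-party lifelines package (one possible toolchain, not the one the authors used). Group sizes follow the abstract, but all event times, hazards, and the 4-year censoring scheme are invented.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter, KaplanMeierFitter
from lifelines.statistics import multivariate_logrank_test

# Group sizes from the abstract; survival times below are simulated.
rng = np.random.default_rng(1)
groups = ["none"] * 2249 + ["il2ra"] * 1124 + ["atg"] * 597
scale = {"none": 6.0, "il2ra": 8.0, "atg": 7.0}   # assumed mean survival (yr)
df = pd.DataFrame({
    "induction": groups,
    "time": [rng.exponential(scale[g]) for g in groups],
})
df["event"] = (df["time"] < 4.0).astype(int)      # administrative censoring
df["time"] = df["time"].clip(upper=4.0)           # at 4 years

# Kaplan-Meier survival estimate per induction group.
for name, sub in df.groupby("induction"):
    kmf = KaplanMeierFitter().fit(sub["time"], sub["event"], label=name)
    print(name, f"S(4 yr) = {kmf.predict(4.0):.2f}")

# Three-way log-rank test across the induction groups.
res = multivariate_logrank_test(df["time"], df["induction"], df["event"])
print(f"log-rank p = {res.p_value:.4f}")

# Cox proportional hazards model with induction group as covariate.
cox_df = pd.get_dummies(df, columns=["induction"], drop_first=True, dtype=float)
CoxPHFitter().fit(cox_df, duration_col="time", event_col="event").print_summary()
```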


Chronic allograft nephropathy and nephrotic range proteinuria

CLINICAL TRANSPLANTATION, Issue 3 2005
Venkataraman Ramanathan
Abstract: While the association between post-transplant nephrotic range proteinuria (PTx-NP) and chronic allograft nephropathy (CAN) has been described, the factors that determine graft survival in such patients are unclear. We retrospectively identified 30 patients with biopsy-proven CAN who presented with PTx-NP between 1988 and 2002. Patients were stratified into two groups according to PTx-NP onset: <1 yr vs. >1 yr post-transplantation. Both groups were comparable with respect to the degree of renal dysfunction (serum creatinine 4.3 ± 2.5 mg/dL vs. 3.4 ± 1.5 mg/dL) and proteinuria (4.7 ± 1.6 gm/d vs. 5.8 ± 3 gm/d). After a mean follow-up of 14 months post-biopsy, 87% of patients had lost their grafts in both groups (89% vs. 83%, p = NS). Overall, patients with serum creatinine ≤2 mg/dL had better graft survival during follow-up than patients with serum creatinine >2 mg/dL (75% vs. 4%, Fisher's exact test p = 0.0038). Using the Kaplan–Meier estimate, the 5-yr graft survival rate was 100% for patients with serum creatinine ≤2 mg/dL and 40% in those with >2 mg/dL (p = 0.06). The magnitude of proteinuria beyond 3 gm/d did not influence graft survival. One-half of the patients (n = 15) received therapy with angiotensin-converting enzyme inhibitors (ACEI). Graft survival, however, was not different between the patients who received ACEI and those who did not (13% vs. 13%). PTx-NP related to CAN was associated with poor allograft survival, irrespective of the time of onset, especially when renal function was reduced at the time of biopsy. [source]