Cold Ischemia Time

Selected Abstracts


Rh blood group and liver transplantation

LIVER TRANSPLANTATION, Issue 10 2007
James D. Perkins M.D. Special Section Editor
Background Cold ischemia time and the presence of postoperative hepatic arterial thrombosis have been associated with biliary complications (BC) after liver transplantation. An ABO-incompatible blood group has also been suggested as a predisposing factor for BC. However, the influence of Rh nonidentity has not been studied previously. Materials Three hundred fifty-six liver transplants were performed from 1995 to 2000 at our hospital. BC incidence and risk factors were studied in 345 patients. Results Seventy patients (20%) presented with BC after liver transplantation. Bile leakage (24/45%) and stenotic anastomosis (21/30%) were the most frequent complications. The incidence of BC in Rh graft-host nonidentical cases (23/76, 30%) was higher than in Rh-identical grafts (47/269, 17%) (P = 0.01). BC was also more frequent in grafts with arterial thrombosis (9/25, 36% vs 60/319, 19%; P = 0.03) and in grafts with cold ischemia time longer than 430 min (26/174, 15% vs 44/171, 26%; P = 0.01). Multivariate logistic regression confirmed that Rh graft-host nonidentical blood groups [RR = 2 (1.1-3.6); P = 0.02], arterial thrombosis [RR = 2.6 (1.1-6.4); P = 0.02] and cold ischemia time longer than 430 min [RR = 1.8 (1-3.2); P = 0.02] were risk factors for BC. Conclusion Liver transplantation using Rh graft-host nonidentical blood groups leads to a greater incidence of BC. [source]
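
The multivariate logistic regression above is the workhorse of this kind of risk-factor analysis. Below is a minimal sketch in Python of how such a model yields adjusted effect estimates for binary risk factors; the data are simulated and the coefficients are illustrative assumptions, not the study's records.

```python
# A minimal sketch (not the authors' code) of a multivariable logistic
# regression testing Rh nonidentity, arterial thrombosis and cold ischemia
# time >430 min as risk factors for biliary complications (BC).
# All data below are simulated; effect sizes only loosely echo the abstract.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 345
X = np.column_stack([
    rng.integers(0, 2, n),  # rh_nonidentical: graft-host Rh nonidentity
    rng.integers(0, 2, n),  # arterial_thrombosis
    rng.integers(0, 2, n),  # cit_gt_430min: cold ischemia time > 430 min
])
# Simulate BC on the log-odds scale with roughly the reported effect sizes.
logit = -1.8 + 0.7 * X[:, 0] + 0.95 * X[:, 1] + 0.6 * X[:, 2]
y = rng.random(n) < 1 / (1 + np.exp(-logit))

model = LogisticRegression().fit(X, y)
for name, coef in zip(
        ["Rh nonidentical", "arterial thrombosis", "CIT > 430 min"],
        model.coef_[0]):
    print(f"{name}: odds ratio ~ {np.exp(coef):.2f}")
```

Exponentiating each coefficient gives that factor's multiplicative effect on the odds of BC, which is the kind of adjusted estimate, with its confidence interval, reported above.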


Graft fibrosis after pediatric liver transplantation: Ten years of follow-up

HEPATOLOGY, Issue 3 2009
Rene Scheenstra
Previously we reported the presence of portal fibrosis in 31% (n = 84) of the grafts in protocol biopsies 1 year after pediatric liver transplantation (LTx). To assess the natural history of graft fibrosis after pediatric liver transplantation, we extended the analysis of graft histology in follow-up protocol biopsy specimens obtained 5 and 10 years after transplantation. We correlated histological results with clinical parameters at the time of LTx and during follow-up, to allow identification of risk factors for the development of fibrosis. From 1 year to 5 years after LTx, the prevalence of fibrosis increased from 31% to 65% (n = 66) but remained stable thereafter (at 10 years, 69%, n = 55). At 10 years after LTx, however, the percentage of patients with severe fibrosis had increased from 10% (at 5 years) to 29%. Of the 69% of children without fibrosis at 1 year post-transplantation, 64% (n = 39) had developed some degree of fibrosis at 10 years. Fibrosis was strongly related to transplant-related factors such as prolonged cold ischemia time, young age at the time of transplantation, high donor/recipient age ratio, and the use of partial grafts (P < 0.05). Fibrosis was not significantly related to rejection, chronic hepatitis, or the nature of the immunosuppressive therapy. Conclusion: Biopsies after pediatric LTx show that most grafts developed fibrosis within 5 years. At 10 years after LTx, the graft fibrosis had progressed to severe fibrosis in at least 25% of the patients. Development of fibrosis, starting either before or after the first year post-LTx, was strongly related to transplant-related factors, indicating the importance of these factors to long-term graft prognosis. (HEPATOLOGY 2008.) [source]


The biopsied donor liver: Incorporating macrosteatosis into high-risk donor assessment

LIVER TRANSPLANTATION, Issue 7 2010
Austin L. Spitzer
To expand the donor liver pool, ways are sought to better define the limits of marginally transplantable organs. The Donor Risk Index (DRI) lists 7 donor characteristics, together with cold ischemia time and location of the donor, as risk factors for graft failure. We hypothesized that donor hepatic steatosis is an additional independent risk factor. We analyzed the Scientific Registry of Transplant Recipients for all adult liver transplants performed from October 1, 2003, through February 6, 2008, with grafts from deceased donors to identify donor characteristics and procurement logistics parameters predictive of decreased graft survival. A proportional hazard model of donor variables, including percent steatosis from higher-risk donors, was created with graft survival as the primary outcome. Of 21,777 transplants, 5051 donors had percent macrovesicular steatosis recorded on donor liver biopsy. Compared to the 16,726 donors with no recorded liver biopsy, the donors with biopsied livers had a higher DRI, were older and more obese, and a higher percentage died from anoxia or stroke than from head trauma. The donors whose livers were biopsied became our study group. Factors most strongly associated with graft failure at 1 year after transplantation with livers from this high-risk donor group were donor age, donor liver macrovesicular steatosis, cold ischemia time, and donation after cardiac death status. In conclusion, in a high-risk donor group, macrovesicular steatosis is an independent risk factor for graft survival, along with other factors of the DRI including donor age, donor race, donation after cardiac death status, and cold ischemia time. Liver Transpl 16:874-884, 2010. © 2010 AASLD. [source]
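
A proportional hazards model like the one described takes graft survival time as the outcome and reports each donor variable's effect as a hazard ratio. The sketch below, using the lifelines library on simulated data, shows the shape of such an analysis; the column names, coefficients and censoring scheme are illustrative assumptions, not SRTR fields.

```python
# A minimal sketch, on simulated data, of a proportional hazards model of
# donor variables with graft survival as the primary outcome.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 1000
df = pd.DataFrame({
    "donor_age": rng.uniform(18, 75, n),
    "macro_steatosis_pct": rng.uniform(0, 60, n),
    "cit_hours": rng.uniform(4, 14, n),
    "dcd": rng.integers(0, 2, n),  # donation after cardiac death
})
# Simulate survival times whose hazard rises with each risk factor.
hazard = 0.02 * np.exp(0.02 * (df.donor_age - 40)
                       + 0.015 * df.macro_steatosis_pct
                       + 0.05 * (df.cit_hours - 8)
                       + 0.3 * df.dcd)
df["time_years"] = rng.exponential(1 / hazard)
df["graft_failed"] = (df["time_years"] < 1).astype(int)  # failure within 1 yr
df.loc[df.graft_failed == 0, "time_years"] = 1.0         # censor at 1 year

cph = CoxPHFitter().fit(df, duration_col="time_years", event_col="graft_failed")
print(cph.summary[["coef", "exp(coef)"]])  # exp(coef) = hazard ratio
```

An exp(coef) above 1 marks a covariate that increases the hazard of graft failure, which is how the factors named in the abstract would surface in such a model.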


Does middle hepatic vein omission in a right split graft affect the outcome of liver transplantation?

LIVER TRANSPLANTATION, Issue 6 2007
A comparative study of right split livers with and without the middle hepatic vein
Preservation of the middle hepatic vein (MHV) for a right split liver transplantation (SLT) in an adult recipient is still controversial. The aim of this study was to evaluate graft and patient outcomes after liver transplantation (LT) using a right split graft, according to the type of venous drainage. From February 2000 to May 2006, 33 patients received 34 cadaveric right split liver grafts. According to the type of recipient pair (adult/adult or adult/child), the right liver graft was deprived of the MHV or not. The first group (GI, n = 15) included grafts with only the right hepatic vein (RHV) outflow; the second (GII, n = 18) included grafts with both RHV and MHV outflows. The 2 groups were similar for patient demographics, initial liver disease, and donor characteristics. In GI and GII, the graft-to-recipient weight ratio (GRWR) was 1.2 ± 0% and 1.6 ± 0.3% (P < 0.05), and cold ischemia time was 10 hours 55 minutes ± 2 hours 49 minutes and 10 hours 47 minutes ± 3 hours 32 minutes, respectively (P = not significant). Postoperative death occurred in 1 patient in each group. Vascular complications consisted of anastomotic strictures, all in GI: 2 portal vein (PV), 1 hepatic artery (HA), and 1 RHV. Biliary complications occurred in 20% and 22% of the patients in GI and GII, respectively (P = not significant). There were no differences between the groups in postoperative outcome and blood tests on days 1-15, except for significantly greater cholestasis in GI. At 1 and 3 yr, patient survival was 94% for both groups, and graft survival was 93% for GI and 90% for GII (P = not significant). In conclusion, our results suggest that adult right SLT without the MHV is safe and associated with long-term results similar to those of the right graft including the MHV, although early liver function recovered more slowly. Technical refinements in outflow drainage should be evaluated in selected cases. Liver Transpl 13:829-837, 2007. © 2007 AASLD. [source]


Graft and patient survival after adult live donor liver transplantation compared to a matched cohort who received a deceased donor transplantation

LIVER TRANSPLANTATION, Issue 10 2004
Paul J. Thuluvath
Live donor liver transplantation (LDLT) has become increasingly common in the United States and around the world. In this study, we compared the outcomes of 764 patients who received LDLT in the United States with those of a matched population that received deceased donor transplantation (DDLT), using the United Network for Organ Sharing (UNOS) database. For each LDLT recipient (n = 764), two DDLT recipients (n = 1,470), matched for age, gender, race, diagnosis, and year of transplantation, were selected from the UNOS data after excluding multiple organ transplantation or retransplantation, children, and those with incomplete data. Despite our matching, recipients of LDLT had more stable liver disease, as shown by fewer patients with UNOS status 1 or 2A, in an intensive care unit, or on life support. Creatinine and cold ischemia time were also lower in the LDLT group. Primary graft nonfunction, hyperacute rejection rates, and patient survival by Kaplan-Meier analysis were similar in both groups (2-year survival was 79.0% in LDLT vs. 80.7% in case-controls; P = .5), but graft survival was significantly lower in LDLT (2-year graft survival was 64.4% vs. 73.3%; P < .001). Cox regression analysis (after adjusting for confounding variables) showed that LDLT recipients were 60% more likely to lose their graft compared to DDLT recipients (hazard ratio [HR] 1.6; confidence interval 1.1-2.5). Among hepatitis C virus (HCV) patients, LDLT recipients showed lower graft survival when compared to those who received DDLT. In conclusion, short-term patient survival in LDLT is similar to that in the DDLT group, but graft survival is significantly lower in LDLT recipients. LDLT is a reasonable option for patients who are unlikely to receive DDLT in a timely fashion. (Liver Transpl 2004;10:1263-1268.) [source]
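
Kaplan-Meier curves with a log-rank comparison are the standard way to contrast graft survival between matched groups like these. Here is a toy reconstruction in Python with the lifelines library; the survival times are simulated to echo the reported 2-year graft survival figures and are not the UNOS data.

```python
# Toy Kaplan-Meier comparison of graft survival, LDLT vs DDLT, on
# simulated exponential survival times censored at 2 years.
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(2)

def simulate(n, two_year_survival):
    """Exponential survival times, censored at 2 years of follow-up."""
    rate = -np.log(two_year_survival) / 2.0
    t = rng.exponential(1 / rate, n)
    observed = t <= 2.0            # graft loss observed within follow-up
    return np.minimum(t, 2.0), observed

t_ldlt, e_ldlt = simulate(764, 0.644)    # ~64.4% 2-year graft survival
t_ddlt, e_ddlt = simulate(1470, 0.733)   # ~73.3% 2-year graft survival

for label, t, e in [("LDLT", t_ldlt, e_ldlt), ("DDLT", t_ddlt, e_ddlt)]:
    km = KaplanMeierFitter().fit(t, event_observed=e, label=label)
    print(label, float(km.survival_function_at_times(2.0).iloc[0]))

res = logrank_test(t_ldlt, t_ddlt,
                   event_observed_A=e_ldlt, event_observed_B=e_ddlt)
print(f"log-rank p = {res.p_value:.4f}")
```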


In situ splitting of a liver with middle hepatic vein anomaly

LIVER TRANSPLANTATION, Issue 9 2001
Alessandro Genzone MD
In situ liver splitting provides a way to expand the graft pool, minimize cold ischemia time, and improve hemostasis at the cut surface of the graft. Vascular anomalies of the liver may make the splitting procedure very difficult or even impossible to perform. The in situ splitting procedure, performed on a liver with a middle hepatic vein (MHV) anomaly, is described here. The MHV drained directly into the segment III vein within the hepatic parenchyma instead of draining into the left hepatic vein to form the common trunk. In situ splitting was performed during multiorgan procurement from a 33-year-old man who died of isolated cerebral trauma. The MHV was reconstructed on the back table to secure right graft venous drainage using an iliac vein graft. The resultant right graft, segments I and IV to VIII, and left graft, segments II and III, were transplanted successfully into an adult and a child, respectively. The 2 transplant recipients are currently alive with normal hepatic function 20 months after transplantation. [source]


First experiences of pediatric kidney transplantation in Sri Lanka

PEDIATRIC TRANSPLANTATION, Issue 4 2007
C. K. Abeysekera
Abstract: Kidney transplantation (KT) is the most effective therapeutic option for end-stage renal failure (ESRF). We present our first experiences in a developing country. All children who underwent kidney transplantation from the inception of this program in July 2004 until 30 September 2005 were studied. Their demographic data, operative and peri-operative details, graft and host survival, and drug compliance are described here. Data were collected from patient records and nursing observation records. Eleven children were transplanted during this period (median recipient age 10.75 yr, range: 8-16). The median age of the donors was 41 yr (range: 38-45); the donor was the mother in eight cases, the father in two and an uncle in one. The median (range) follow-up period following transplantation was 12.5 months (7-12). The vascular anastomotic site was the aorta and inferior vena cava in nine patients, and the mean (SD) cold ischemia time was 1.9 h (0.96). All patients received steroids, cyclosporine and MMF for immunosuppression. Hypotension, heart failure and septicemia were common medical complications. Four were treated for acute rejection. Vascular anastomotic leak, burst abdomen, intestinal obstruction, intra-abdominal leak of a suprapubic catheter and vesico-ureteric junction obstruction were the surgical complications. There were no graft losses or deaths. Despite limited resources, good outcomes are possible following renal transplantation in children in developing countries. [source]


Improved outcome of pediatric kidney transplantations in the Netherlands: Effect of the introduction of mycophenolate mofetil?

PEDIATRIC TRANSPLANTATION, Issue 1 2005
Karlien Cransberg
Abstract: Collaboration of the Dutch centers for kidney transplantation in children started in 1997 with a shared immunosuppressive protocol, aimed at improving graft survival by diminishing the incidence of acute rejections. This study compares the results of transplantations in these patients to those in a historical reference group. Ninety-six consecutive patients receiving a first kidney transplant were treated with an immunosuppressive regimen consisting of mycophenolate mofetil, cyclosporine and corticosteroids. The results were compared with those of historic controls (first transplants between 1985 and 1995, n = 207), treated with different combinations of corticosteroids, cyclosporine A and/or azathioprine. Cytomegalovirus (CMV) prophylaxis was prescribed to high-risk patients in the study group, and to only a small proportion of the reference group. Graft survival at 1 yr improved significantly: 92% in the study group vs. 73% in the reference group (p < 0.001). In the study group, 63% of patients remained rejection-free during the first year, versus 28% in the reference group (p < 0.001). After statistical adjustment for differences in baseline data, such as cold ischemia time, the proportion of LRD, preemptive transplantation, and young donors, the difference between the study and reference groups in graft survival (RR 0.33, p = 0.003) and incidence of acute rejection (RR 0.37, p < 0.001) remained statistically significant as the only factor, indicating an effect of the immunosuppressive therapy. In the first year, one case of malignancy occurred in each group. CMV disease occurred less frequently in the study group (11%) than in the reference group (26%, p = 0.02). Bronchiectasis was diagnosed as a new complication in four patients. A new consensus protocol, including the introduction of mycophenolate mofetil, considerably improved the outcome of pediatric kidney transplantation in the Netherlands, measured as a reduction in the incidence of acute rejection and improved graft survival. [source]


Donor Pretreatment with Tetrahydrobiopterin Saves Pancreatic Isografts from Ischemia Reperfusion Injury in a Mouse Model

AMERICAN JOURNAL OF TRANSPLANTATION, Issue 10 2010
M. Maglione
Depletion of the nitric oxide synthase cofactor tetrahydrobiopterin (H4B) during ischemia and reperfusion is associated with severe graft pancreatitis. Since clinically feasible approaches to preventing ischemia reperfusion injury (IRI) by H4B substitution are lacking, we investigated its therapeutic potential in a murine pancreas transplantation model using different treatment regimens. Grafts were subjected to 16 h cold ischemia time (CIT) and different treatment regimens: no treatment; 160 µM H4B added to the perfusion solution; H4B 50 mg/kg administered prior to reperfusion; and H4B 50 mg/kg administered before recovery of organs (donor pretreatment). Nontransplanted animals served as controls. Recipient survival and endocrine graft function were assessed. Graft microcirculation was analyzed 2 h after reperfusion by intravital fluorescence microscopy. Parenchymal damage was assessed by histology and nitrotyrosine immunohistochemistry, and H4B tissue levels by high pressure liquid chromatography (HPLC). Compared to nontransplanted controls, prolonged CIT resulted in significant microcirculatory deterioration. Efficacy differed according to the route and timing of administration. Only donor pretreatment with H4B resulted in almost completely abrogated IRI-related damage, showing graft microcirculation comparable to nontransplanted controls and restored intragraft H4B levels, resulting in significant reduction of parenchymal damage (p < 0.002) and improved survival and endocrine function (p = 0.0002 each). H4B donor pretreatment abrogates ischemia-induced parenchymal damage and represents a promising strategy to prevent IRI following pancreas transplantation. [source]


A Risk Prediction Model for Delayed Graft Function in the Current Era of Deceased Donor Renal Transplantation

AMERICAN JOURNAL OF TRANSPLANTATION, Issue 10 2010
W. D. Irish
Delayed graft function (DGF) impacts short- and long-term outcomes. We present a model for predicting DGF after renal transplantation. A multivariable logistic regression analysis of 24 337 deceased donor renal transplant recipients (2003-2006) was performed. We developed a nomogram, depicting the relative contribution of risk factors, and a novel web-based calculator (http://www.transplantcalculator.com/DGF) as an easily accessible tool for predicting DGF. Risk factors in the modern era were compared with their relative impact in an earlier era (1995-1998). Although the impact of many risk factors remained similar over time, the weight of immunological factors attenuated, while the impact of donor renal function increased 2-fold. This may reflect advances in immunosuppression and increased utilization of kidneys from expanded criteria donors (ECDs) in the modern era. The most significant factors associated with DGF were cold ischemia time, donor creatinine, body mass index, donation after cardiac death and donor age. In addition to predicting DGF, the model predicted graft failure. A 25-50% probability of DGF was associated with a 50% increased risk of graft failure relative to a DGF risk <25%, whereas a >50% DGF risk was associated with a 2-fold increased risk of graft failure. This tool is useful for predicting DGF and long-term outcomes at the time of transplant. [source]
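
A web calculator like the one described typically just evaluates the fitted logistic model: it combines a patient's covariates with the regression coefficients on the log-odds scale and maps the result through the logistic function. The sketch below illustrates those mechanics only; the coefficients are invented placeholders, not those of the published model at transplantcalculator.com.

```python
# Hedged sketch of turning logistic-regression coefficients into a predicted
# DGF probability. Coefficient values are invented for illustration.
import math

def dgf_probability(cit_hours, donor_creatinine, recipient_bmi,
                    dcd, donor_age):
    # Illustrative placeholder coefficients on the log-odds scale.
    z = (-3.0
         + 0.04 * cit_hours
         + 0.45 * donor_creatinine
         + 0.05 * recipient_bmi
         + 0.80 * dcd            # 1 if donation after cardiac death
         + 0.02 * donor_age)
    return 1 / (1 + math.exp(-z))  # logistic function

# Example: long CIT, elevated donor creatinine, DCD donor.
print(f"Predicted DGF risk: {dgf_probability(24, 1.8, 30, 1, 55):.0%}")
```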


Improving Distribution Efficiency of Hard-to-Place Deceased Donor Kidneys: Predicting Probability of Discard or Delay

AMERICAN JOURNAL OF TRANSPLANTATION, Issue 7 2010
A. B. Massie
We recently showed that DonorNet 2007 has reduced the efficiency of kidney distribution in the United States, particularly for those with prolonged cold ischemia time (CIT), by requiring systematic allocation of all kidneys regardless of quality. Reliable early identification of those most likely to be discarded or significantly delayed would enable assigning them to alternate, more efficient distribution strategies. Based on 39 035 adult kidneys recovered for possible transplantation between 2005 and 2008, we created a regression model that reliably (AUC 0.83) quantified the probability that a given kidney was either discarded or delayed beyond 36 h of CIT (Probability of Discard/Delay, PODD). We then analyzed two PODD cutoffs: a permissive cutoff that successfully flagged over half of those kidneys that were discarded/delayed, while only flagging 7% of kidneys that were not eventually discarded/delayed, and a more stringent cutoff that erroneously flagged only 3% but also correctly identified only 34%. Kidney transplants with high PODD were clustered in a minority of centers. Modifications of the kidney distribution system to more efficiently direct organs with high PODD to the centers that actually use them may result in reduced CIT and fewer discards. [source]
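
The two cutoffs trade sensitivity against false flags, which can be illustrated with simulated scores; the sketch below is illustrative only and is not the published PODD model.

```python
# Simulated illustration of the PODD cutoff trade-off: a permissive threshold
# flags more of the kidneys that are truly discarded or delayed at the cost
# of more false flags; a stringent one does the reverse.
import numpy as np

rng = np.random.default_rng(3)
n = 39035
discarded = rng.random(n) < 0.2
# Synthetic risk scores that are higher, on average, for discarded kidneys.
podd = np.clip(rng.normal(0.25 + 0.25 * discarded, 0.15), 0, 1)

for label, cutoff in [("permissive", 0.35), ("stringent", 0.55)]:
    flagged = podd >= cutoff
    sens = flagged[discarded].mean()   # share of discards/delays caught
    fpr = flagged[~discarded].mean()   # share of usable kidneys flagged
    print(f"{label}: catches {sens:.0%} of discards, flags {fpr:.0%} of the rest")
```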


Histidine-Tryptophan-Ketoglutarate for Pancreas Allograft Preservation: The Indiana University Experience

AMERICAN JOURNAL OF TRANSPLANTATION, Issue 5 2010
J. A. Fridell
Histidine-tryptophan-ketoglutarate solution (HTK) has been scrutinized for use in pancreas transplantation. A recent case series and a United Network for Organ Sharing database review have suggested an increased incidence of allograft pancreatitis and graft loss with HTK compared to the University of Wisconsin solution (UW). Conversely, a recent randomized, controlled study failed to show any significant difference between HTK and UW for pancreas allograft preservation. This study was a retrospective review of all pancreas transplants performed at Indiana University between 2003 and 2009 comparing preservation with HTK or UW. Data included recipient and donor demographics; 7-day, 90-day and 1-year graft survival; peak 30-day serum amylase and lipase; and HbA1c and C-peptide levels. Of the 308 pancreas transplants, 84% used HTK and 16% UW. There were more simultaneous pancreas-kidney (SPK) transplants, compared to pancreas-after-kidney and pancreas-transplant-alone procedures, in the HTK group. Donor and recipient demographics were similar. There was no significant difference in 7-day, 90-day or 1-year graft survival, 30-day peak serum amylase and lipase, HbA1c or C-peptide. No clinically significant difference between HTK and UW for pancreas allograft preservation was identified. Specifically, in the context of low-to-moderate flush volume and short cold ischemia time (≤10 h), no increased incidence of allograft pancreatitis or graft loss was observed. [source]


Liver Transplantation with Grafts from Controlled Donors after Cardiac Death: A 20-Year Follow-up at a Single Center

AMERICAN JOURNAL OF TRANSPLANTATION, Issue 3 2010
S. Yamamoto
The first liver transplantation (LTx) in Sweden was performed in 1984, but brain death as a legal death criterion was not accepted until 1988. Between November 1984 and May 1988, we performed 40 consecutive LTxs in 32 patients. Twenty-four grafts were from donors after cardiac death (DCD) and 16 grafts from heart-beating donors (HBD). Significantly more hepatic artery thrombosis and biliary complications occurred in the DCD group (p < 0.01 and p < 0.05, respectively). Graft and patient survival did not differ between the groups. In the total group, there was a significant difference in graft survival between first-time LTx grafts and grafts used for retransplantation. Graft survival was better in nonmalignant than in malignant disease, although this did not reach statistical significance. Multivariate analysis revealed cold ischemia time and post-LTx peak ALT to be independent predictive factors for graft survival in the DCD group. In the 11 livers surviving 20 years or more, follow-up biopsies were performed 18-20 years post-LTx (n = 10) and 6 years post-LTx (n = 1). Signs of chronic rejection were seen in three cases, with no difference between DCD and HBD. Our analysis with a 20-year follow-up suggests that controlled DCD liver grafts might be a feasible option to increase the donor pool. [source]


Diabetes Mellitus: A Risk Factor for Delayed Graft Function after Deceased Donor Kidney Transplantation

AMERICAN JOURNAL OF TRANSPLANTATION, Issue 2 2010
J. Parekh
Early graft function is a major determinant of long-term outcomes after renal transplantation. Recently, recipient diabetes was identified as a risk factor for poor initial graft function in living donor renal transplantation. To further explore this association, we performed a paired analysis of deceased donor renal transplants from January 1994 to December 2005. A total of 25,523 transplant pairs were analyzed via conditional logistic regression. Diabetic recipients were older (53.16 vs. 46.75 years, p < 0.01), had a lower average panel reactive antibody (12% vs. 15%, p < 0.01) and fewer prior transplants (0.07 vs. 0.12, p < 0.01). Recipient diabetes, age, male gender, African American race, elevated peak panel reactive antibody and increased cold ischemia time were independent risk factors for delayed graft function. Specifically, diabetic recipients had an increased risk of DGF on univariate analysis (odds ratio [OR] 1.32, 95% confidence interval [CI] 1.23-1.42, p < 0.01). Multivariable analysis confirmed this association, but the risk differed by recipient gender, with diabetes having a greater effect in women (OR 1.66, 95% CI 1.45-1.91, p < 0.01) compared to men (OR 1.28, 95% CI 1.15-1.43, p < 0.01). It is unknown whether the deleterious impact of recipient diabetes on graft function after renal transplantation results from perioperative hyperglycemia or the chronic sequelae of diabetes. [source]


High Weight Differences between Donor and Recipient Affect Early Kidney Graft Function: A Role for Enhanced IL-6 Signaling

AMERICAN JOURNAL OF TRANSPLANTATION, Issue 8 2009
W. Gong
The frequency of delayed function of kidney transplants varies greatly and is associated with graft quality, donor age and the duration of cold ischemia time. Furthermore, body weight differences between donor and recipient can affect primary graft function, but the underlying mechanism is poorly understood. We transplanted kidney grafts from commensurate body weight (L-WD) or reduced body weight (H-WD) donor rats into syngeneic or allogeneic recipients. Twenty-four hours posttransplantation, serum creatinine levels in H-WD recipients were significantly higher than in L-WD recipients, indicating impaired primary graft function. This was accompanied by upregulation of IL-6 transcription and increased tubular destruction in grafts from H-WD recipients. Using DNA microarray analysis, we detected decreased expression of genes associated with kidney function and an upregulation of other genes such as Cyp3a13, FosL and Trib3. A single application of IL-6 to L-WD recipients was sufficient to impair primary graft function and cause tubular damage, whereas immediate neutralization of IL-6 receptor signaling in H-WD recipients rescued primary graft function with well-preserved kidney graft architecture and a normalized gene expression profile. These findings have strong clinical implications, as anti-IL6R treatment of patients receiving grafts from lower-weight donors could be used to improve primary graft function. [source]


The Effects of DonorNet 2007 on Kidney Distribution Equity and Efficiency

AMERICAN JOURNAL OF TRANSPLANTATION, Issue 7 2009
A. B. Massie
In 2007, UNOS released DonorNet 2007® (DN07) in the hope of improving allocation equity and efficiency. We hypothesized that hard-to-place organs might be less efficiently handled through this regimented process. We analyzed associations between DN07 and center-level equity, the number of refusals per organ and cold ischemia time (CIT). A total of 8244 kidney transplants between 1/2006 and 12/2006 (pre-DN07) were compared with 6029 transplants between 5/2007 and 2/2008 (post-DN07). Distribution equity was assessed by the Gini coefficient, changes in the number of refusals and CIT by negative binomial regression, and discard rates by logistic regression. We estimated quantile-specific differences in CIT by bootstrapping. We found no significant change in center-level distribution equity after DN07. The number of refusals per organ increased by 20% (adjusted rate ratio [ARR] 1.20, 95% CI 1.12-1.28, p < 0.001) at the patient level and 11% (ARR 1.11, 95% CI 1.07-1.16, p < 0.001) at the center level. Regression models of CIT showed no global change in CIT associated with DN07, but those kidneys with the longest CIT pre-DN07 had statistically significantly longer CIT post-DN07. The discard rate also increased significantly (ARR 1.11, 95% CI 1.06-1.17, p < 0.001). DN07 has not improved equity or efficiency in allocation of deceased donor kidneys, and may be harming the allocation of hard-to-place kidneys. [source]
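
The Gini coefficient used here to measure center-level equity can be computed directly from per-center transplant counts: 0 means all centers receive equal volumes, and values near 1 mean concentration in a few centers. A minimal sketch with hypothetical counts:

```python
# Gini coefficient of center-level kidney distribution; the counts below are
# made-up center volumes, not the UNOS data analyzed in the abstract.
import numpy as np

def gini(counts):
    x = np.sort(np.asarray(counts, dtype=float))  # ascending order
    n = x.size
    # Standard formula based on the ordered values.
    return (2 * np.arange(1, n + 1) - n - 1).dot(x) / (n * x.sum())

transplants_per_center = [5, 12, 18, 40, 95, 160]  # hypothetical volumes
print(f"Gini = {gini(transplants_per_center):.2f}")
```

Comparing this statistic before and after a policy change, as the study does pre- and post-DN07, shows whether the distribution of organs across centers became more or less concentrated.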


Liver Transplantation Using Donation After Cardiac Death Donors: Long-Term Follow-Up from a Single Center

AMERICAN JOURNAL OF TRANSPLANTATION, Issue 4 2009
M. E. De Vera
There is a lack of universally accepted clinical parameters to guide the utilization of donation after cardiac death (DCD) donor livers, and it is unclear which patients would benefit most from these organs. We reviewed our experience in 141 patients who underwent liver transplantation using DCD allografts from 1993 to 2007. Patient outcomes were analyzed in comparison to a matched cohort of 282 patients who received livers from donation after brain death (DBD) donors. Patient survival was similar, but 1-, 5- and 10-year graft survival was significantly lower in DCD (69%, 56%, 44%) versus DBD (82%, 73%, 63%) subjects (p < 0.0001). Primary nonfunction and biliary complications were more common in DCD patients, accounting for 67% of early graft failures. A donor warm ischemia time >20 min, cold ischemia time >8 h and donor age >60 years were associated with poorer DCD outcomes. There was a lack of survival benefit of DCD livers utilized in patients with model for end-stage liver disease (MELD) score ≤30 or those not on organ-perfusion support, as graft survival was significantly lower compared to DBD patients. However, DCD and DBD subjects transplanted with MELD >30 or on organ-perfusion support had similar graft survival, suggesting a potentially greater benefit of DCD livers in critically ill patients. [source]


Histidine-Tryptophan-Ketoglutarate (HTK) Is Associated with Reduced Graft Survival in Deceased Donor Livers, Especially Those Donated After Cardiac Death

AMERICAN JOURNAL OF TRANSPLANTATION, Issue 2 2009
Z. A. Stewart
Single-center studies have reported that liver allograft survival is not affected by preservation in histidine-tryptophan-ketoglutarate (HTK) versus University of Wisconsin (UW) solution. We analyzed the UNOS database of liver transplants performed from July 2004 through February 2008 to determine if preservation with HTK (n = 4755) versus UW (n = 12 673) impacted graft survival. HTK preservation of allografts increased from 16.8% in 2004 to 26.9% in 2008; this was particularly striking among donation after cardiac death (DCD) allografts, rising from 20.7% in 2004 to 40.9% in 2008. After adjusting for donor, recipient and graft factors that affect graft survival, HTK preservation was associated with an increased risk of graft loss (HR 1.14, p = 0.002), especially with DCD allografts (HR 1.44, p = 0.025) and those with cold ischemia time over 8 h (HR 1.16, p = 0.009). Furthermore, HTK preservation was associated with 1.2-fold higher odds of early (<30 days) graft loss as compared to UW preservation (OR 1.20, p = 0.012), with a more pronounced effect on allografts with cold ischemia time over 8 h (OR 1.31, p = 0.007), DCD allografts (OR 1.63, p = 0.09) and donors over 70 years (OR 1.67, p = 0.081). These results suggest that the increasing use of HTK for abdominal organ preservation should be reexamined. [source]


Incidence and Severity of Acute Cellular Rejection in Recipients Undergoing Adult Living Donor or Deceased Donor Liver Transplantation

AMERICAN JOURNAL OF TRANSPLANTATION, Issue 2 2009
A. Shaked
Living donor liver transplantation (LDLT) may have better immunological outcomes compared to deceased donor liver transplantation (DDLT). The aim of this study was to analyze the incidence of acute cellular rejection (ACR) after LDLT and DDLT. Data from the adult-to-adult living donor liver transplantation (A2ALL) retrospective cohort study on 593 liver transplants done between May 1998 and March 2004 were studied (380 LDLT; 213 DDLT). Median LDLT and DDLT follow-up was 778 and 713 days, respectively. Rates of clinically treated and biopsy-proven ACR were compared. There were 174 (46%) LDLT and 80 (38%) DDLT recipients with ≥1 clinically treated episode of ACR, whereas 103 (27%) LDLT and 58 (27%) DDLT recipients had ≥1 biopsy-proven ACR episode. A higher proportion of LDLT recipients had clinically treated ACR (p = 0.052), but this difference was largely attributable to one center. There were similar proportions of biopsy-proven rejection (p = 0.97) and graft loss due to rejection (p = 0.16). Longer cold ischemia time was associated with a higher rate of ACR in both groups, despite the much shorter median cold ischemia time in LDLT. These data do not show an immunological advantage for LDLT, and therefore do not support the application of unique posttransplant immunosuppression protocols for LDLT recipients. [source]


Histidine-Tryptophan-Ketoglutarate (HTK) Is Associated with Reduced Graft Survival in Pancreas Transplantation

AMERICAN JOURNAL OF TRANSPLANTATION, Issue 1 2009
Z. A. Stewart
Prior single-center studies have reported that pancreas allograft survival is not affected by preservation in histidine-tryptophan-ketoglutarate (HTK) versus University of Wisconsin (UW) solution. To expand on these studies, we analyzed the United Network for Organ Sharing (UNOS) database of pancreas transplants from July 2004 through February 2008 to determine if preservation with HTK (N = 1081) versus UW (N = 3311) impacted graft survival. HTK preservation of pancreas allografts increased significantly in this time frame, from 15.4% in 2004 to 25.4% in 2008. After adjusting for other recipient, donor, graft and transplant center factors that impact graft survival, HTK preservation was independently associated with an increased risk of pancreas graft loss (hazard ratio [HR] 1.30, p = 0.014), especially in pancreas allografts with cold ischemia time (CIT) ≥12 h (HR 1.42, p = 0.017). This reduced survival with HTK preservation as compared to UW preservation was seen in both simultaneous pancreas-kidney (SPK) transplants and pancreas alone (PA) transplants. Furthermore, HTK preservation was also associated with 1.54-fold higher odds of early (<30 days) pancreas graft loss as compared to UW (OR 1.54, p = 0.008). These results suggest that the increasing use of HTK for abdominal organ preservation should be re-examined. [source]


Urine NGAL and IL-18 are Predictive Biomarkers for Delayed Graft Function Following Kidney Transplantation

AMERICAN JOURNAL OF TRANSPLANTATION, Issue 7 2006
C. R. Parikh
Delayed graft function (DGF) due to tubule cell injury frequently complicates deceased donor kidney transplants. We tested whether urinary neutrophil gelatinase-associated lipocalin (NGAL) and interleukin-18 (IL-18) represent early biomarkers for DGF (defined as a dialysis requirement within the first week after transplantation). Urine samples collected on day 0 from recipients of living donor kidneys (n = 23), deceased donor kidneys with prompt graft function (n = 20) and deceased donor kidneys with DGF (n = 10) were analyzed in a double-blind fashion by ELISA for NGAL and IL-18. In patients with DGF, the peak postoperative serum creatinine requiring dialysis typically occurred 2-4 days after transplant. Urine NGAL and IL-18 values were significantly different in the three groups on day 0, with maximally elevated levels noted in the DGF group (p < 0.0001). The receiver operating characteristic curve for prediction of DGF based on urine NGAL or IL-18 at day 0 showed an area under the curve of 0.9 for both biomarkers. By multivariate analysis, both urine NGAL and IL-18 on day 0 predicted the trend in serum creatinine in the posttransplant period after adjusting for the effects of age, gender, race, urine output and cold ischemia time (p < 0.01). Our results indicate that urine NGAL and IL-18 represent early, predictive biomarkers of DGF. [source]
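
The reported AUC of 0.9 summarizes how well day-0 biomarker levels separate DGF from non-DGF recipients: 0.5 is chance, 1.0 is perfect separation. A toy computation on simulated biomarker values (targeting, not reproducing, the reported figures):

```python
# Toy ROC-AUC evaluation of a day-0 urinary biomarker for DGF. Biomarker
# values are simulated; group sizes echo the study (10 DGF, 43 non-DGF).
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(4)
dgf = np.r_[np.ones(10), np.zeros(43)]   # 1 = DGF, 0 = prompt/living donor
# Simulated urine NGAL: markedly higher, on average, in the DGF group.
ngal = np.r_[rng.normal(1000, 250, 10), rng.normal(400, 200, 43)]

print(f"AUC = {roc_auc_score(dgf, ngal):.2f}")
```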


The Expanded Criteria Donor Policy: An Evaluation of Program Objectives and Indirect Ramifications

AMERICAN JOURNAL OF TRANSPLANTATION, Issue 7 2006
J. D. Schold
The expanded criteria donor (ECD) policy, formalized in 2002, defined higher-risk deceased donor kidneys recovered for transplantation. There has not been a comprehensive examination of the impact of the policy on the allocation of ECD kidneys, waiting times for transplant, center listing patterns or human leukocyte antigen (HLA) matching. We examined transplant candidates from 1998 to 2004 utilizing a national database. We constructed models to assess alterations in the recipient characteristics of ECD kidneys and trends in waiting time and cold ischemia time (CIT) associated with the policy. We also evaluated the impact of the proportion of center candidate listings for ECD kidneys on waiting times. Elderly recipients were more likely to receive ECDs following the policy (odds ratio = 1.36, p < 0.01). Policy inception was not associated with decreased CIT or pretransplant dialysis time, nor with increased HLA mismatching. Over one quarter of centers listed <20% of candidates for ECDs, while an additional quarter of centers listed >90%. Only centers with selective listing for ECDs offered reduced waiting times to ECD recipients. The ECD policy demonstrates potential to achieve certain ascribed goals; however, the full impact of the program, reaching all transplant candidates, may only be achieved once ECD listing patterns are recommended and adopted accordingly. [source]


The Broad Spectrum of Quality in Deceased Donor Kidneys

AMERICAN JOURNAL OF TRANSPLANTATION, Issue 4 2005
Jesse D. Schold
The quality of the deceased donor organ is clearly one of the most crucial factors determining graft survival and function in recipients of a kidney transplant. Considerable effort has gone into evaluating these organs, culminating in an amendment to allocation policy with the introduction of the expanded criteria donor (ECD) policy. Our study, based on first solitary adult deceased donor transplant recipients from 1996 to 2002 in the National Scientific Transplant Registry database, presents a donor kidney risk grade built from significant donor characteristics, donor-recipient matches and cold ischemia time, generated directly from their risk for graft loss. We investigated the impact of our donor risk grade on short- and long-term graft survival in a naïve cohort, as well as in subgroups of the population. The projected half-lives for overall graft survival in recipients by donor risk grade were I (10.7 years), II (10.0 years), III (7.9 years), IV (5.7 years) and V (4.5 years). This study indicates that there is great variability in the quality of deceased donor kidneys and that the assessment of risk might be enhanced by this scoring system as compared to the simple two-tiered system of the current ECD classification. [source]
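
A projected half-life is the time at which graft survival is expected to fall to 50%. Under a constant-hazard (exponential) assumption, one simple way such projections can be made (the study's own method may differ), it follows directly from an annual survival rate:

```python
# Worked example: graft half-life from a constant annual survival rate,
# assuming exponential survival S(t) = s**t. Illustrative only.
import math

def half_life(annual_survival):
    # Solve s**t = 0.5 for t.
    return math.log(0.5) / math.log(annual_survival)

# E.g., ~93.7% graft survival per year projects to roughly a 10.7-year
# half-life, in the range of grade I above.
print(f"{half_life(0.937):.1f} years")
```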


Complete robotic-assistance during laparoscopic living donor nephrectomies: An evaluation of 38 procedures at a single site

INTERNATIONAL JOURNAL OF UROLOGY, Issue 11 2007
Jacques Hubert
Objective: To evaluate our initial experience with entirely robot-assisted laparoscopic live donor (RALD) nephrectomies. Methods: From January 2002 to April 2006, we carried out 38 RALD nephrectomies at our institution, using four ports (three for the robotic arms and one for the assistant). The collateral veins were ligated, and the renal arteries and veins clipped, after completion of ureteral and renal dissection. The kidney was removed via a suprapubic Pfannenstiel incision. A complementary running suture was carried out on the arterial stump to secure the hemostasis. Results: Mean donor age was 43 years. All nephrectomies were carried out entirely laparoscopically, without complications and with minimal blood loss. Mean surgery time was 181 min. Average warm ischemia and cold ischemia times were 5.84 min and 180 min, respectively. Average donor hospital stay was 5.5 days. None of the transplant recipients had delayed graft function. Conclusions: Robot-assisted laparoscopic live donor nephrectomy can be safely carried out. Robotics enhances the laparoscopist's skills, enables the surgeon to dissect meticulously and to prevent problematic bleeding more easily. Donor morbidity and hospitalization are reduced by the laparoscopic approach and the use of robotics allows the surgeon to work under better ergonomic conditions. [source]


Living donor liver transplantation for children with liver failure and concurrent multiple organ system failure

LIVER TRANSPLANTATION, Issue 10 2001
Cara L. Mack
Liver transplantation for pediatric patients in liver failure and multiple organ system failure (MOSF) often results in poor patient survival. Progression of organ failure occurs while awaiting a cadaveric allograft. Therefore, we considered living donor liver transplantation (LDLT) in this critically ill group of children and report our initial results with comparison to a similar group who received cadaveric donation (CAD). A retrospective chart review was performed on all pediatric liver transplant recipients who met criteria for MOSF at the time of transplantation. Data collection involved pretransplantation patient profiles, as well as postoperative complications and patient survival. Eight patients in MOSF received living donor transplants and 11 patients received a cadaveric allograft. Mean wait time was 3.5 days in the LDLT group and 6.5 days in the CAD group. Pretransplantation patient profiles and postoperative complications were similar between groups. Mean cold ischemia times were 3.8 hours in the LDLT group and 7.9 hours in the CAD group (P = .0002). Thirty-day and 6-month survival rates of the LDLT group were 88% and 63% compared with 45% and 27% in the CAD group, respectively. Living donor transplant recipients in MOSF had decreased wait times to transplantation, as well as decreased cold ischemia times, compared with cadaveric transplant recipients. Patients in the LDLT group had markedly improved survival compared with the CAD group. Timely transplantation before worsening organ failure may account for these findings. [source]


Multicenter Analysis of Novel and Established Variables Associated with Successful Human Islet Isolation Outcomes

AMERICAN JOURNAL OF TRANSPLANTATION, Issue 3 2010
J. S. Kaddis
Islet transplantation is a promising therapy used to achieve glycometabolic control in a select subgroup of individuals with type 1 diabetes. However, features that characterize human islet isolation success prior to transplantation are not standardized and lack validation. We conducted a retrospective analysis of 806 isolation records from 14 pancreas-processing laboratories, considering variables from relevant studies in the last 15 years. The outcome was defined as the postpurification islet equivalent count, dichotomized into yields ≥315 000 or ≤220 000. Univariate analysis showed that donor cause of death and use of hormonal medications negatively influenced outcome. Conversely, pancreata from heavier donors and those containing elevated levels of surface fat positively influenced outcome, as did heavier pancreata and donors with normal amylase levels. Multivariable logistic regression analysis identified the positive impact on outcome of surgically intact pancreata and donors with normal liver function, and confirmed that younger donors, increased body mass index, shorter cold ischemia times, no administration of fluid/electrolyte medications, absence of organ edema, use of University of Wisconsin preservation solution and a fatty pancreas improve outcome. In conclusion, this multicenter analysis highlights the importance of carefully reviewing all donor, pancreas and processing parameters prior to isolation and transplantation. [source]


Risk Factors for and Clinical Course of Non-Anastomotic Biliary Strictures After Liver Transplantation

AMERICAN JOURNAL OF TRANSPLANTATION, Issue 7 2003
Maureen M. J. Guichelaar
Non-anastomotic biliary stricture (NAS) formation is a major complication of liver transplantation. We prospectively determined the time to development, responsiveness to treatment, and clinical outcomes following NAS formation. In addition, an extensive analysis of the association of recipient, donor, and clinical variables with NAS formation was performed. A total of 749 consecutive patients were studied in a prospective, protocol-based fashion. Seventy-two patients (9.6%) developed NAS at a mean of 23.6 ± 34.2 weeks post-transplantation. Non-anastomotic biliary stricture formation resolved in only 6% of affected patients. Although patient survival was not affected, retransplantation and graft loss rates were significantly greater in recipients who developed NAS. In contrast to previous reports, a pretransplant diagnosis of HCV was associated with a low frequency of NAS formation. The incidence of NAS was independently associated with pretransplant diagnoses of PSC and autoimmune hepatitis. Hepatic artery thrombosis and prolonged warm and cold ischemia times were also independent risk factors for NAS formation. We conclude that NAS developed in ~10% of primary liver transplant recipients. A pretransplant diagnosis of autoimmune hepatitis was identified as a novel independent risk factor for NAS formation. Development of NAS significantly attenuates graft survival, but not patient survival. [source]