Organ Sharing
Terms modified by Organ Sharing: Selected Abstracts

Survival after orthotopic liver transplantation: The impact of antibody against hepatitis B core antigen in the donor
LIVER TRANSPLANTATION, Issue 10 2009. Lei Yu.
Liver transplantation using grafts from donors with antibody against hepatitis B core antigen (anti-HBc) increases the recipients' risk of developing hepatitis B virus (HBV) infection post-transplantation. Our aim was to assess whether using such grafts was associated with reduced posttransplantation survival and whether this association depended on recipients' prior exposure to HBV on the basis of their pretransplantation serological patterns. Data were derived from the United Network for Organ Sharing on adult, cadaveric, first-time liver transplants performed between 1994 and 2006. Among recipients who did not have HBV infection before transplantation, those with anti-HBc-positive donors had significantly worse unadjusted posttransplantation patient survival than recipients with anti-HBc-negative donors [hazard ratio, 1.35; 95% confidence interval (CI), 1.21-1.50]. However, after adjustments for other predictors of posttransplantation survival, including donor age, donor race, and recipient underlying liver diseases, patient survival was not significantly different between the 2 groups (hazard ratio, 1.09; 95% CI, 0.97-1.24). Among recipients without antibody against hepatitis B surface antigen (anti-HBs), use of anti-HBc-positive donor grafts was associated with a trend toward worse survival (adjusted hazard ratio, 1.18; 95% CI, 0.95-1.46), whereas no such trend was observed among recipients positive for anti-HBs. In conclusion, in patients without HBV infection before transplantation, using anti-HBc-positive donors was not independently associated with worse posttransplantation survival. Matching these donors to recipients with anti-HBs pretransplantation may be especially safe. Liver Transpl 15:1343-1350, 2009. © 2009 AASLD.
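The abstract above reports hazard ratios with 95% confidence intervals. As an illustrative sketch (not taken from the paper), the standard error of the log hazard ratio can be recovered from a reported CI, and significance at the 5% level checked by whether the CI excludes 1.0:

```python
import math

def se_from_ci(lower, upper, z=1.96):
    """Approximate the standard error of the log hazard ratio
    from the bounds of a reported 95% confidence interval."""
    return (math.log(upper) - math.log(lower)) / (2 * z)

def ci_excludes_one(lower, upper):
    """A hazard ratio is significant at the 5% level when its
    95% CI excludes 1.0."""
    return lower > 1.0 or upper < 1.0

# Unadjusted HR from the abstract: 1.35 (1.21-1.50) -> significant
print(ci_excludes_one(1.21, 1.50))          # True
# Adjusted HR: 1.09 (0.97-1.24) -> not significant
print(ci_excludes_one(0.97, 1.24))          # False
print(round(se_from_ci(1.21, 1.50), 3))     # 0.055
```

This mirrors how the adjusted estimate in the abstract loses significance once its interval straddles 1.0.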
Predictors of clinical outcome in children undergoing orthotopic liver transplantation for acute and chronic liver disease
LIVER TRANSPLANTATION, Issue 9 2006. Chris Rhee.
The current United Network for Organ Sharing (UNOS) policy is to allocate liver grafts to pediatric patients with chronic liver disease based on the pediatric end-stage liver disease (PELD) scoring system, while children with fulminant hepatic failure may be urgently listed as Status 1a. The objective of this study was to identify pretransplant variables that influence patient and graft survival in children undergoing liver transplantation (LTx) for fulminant hepatic failure (FHF) compared to those patients transplanted for extrahepatic biliary atresia (EHBA), a chronic form of liver disease. The UNOS Liver Transplant Registry was examined for pediatric liver transplants performed for FHF and EHBA from 1987 to 2002. Variables that influenced patient and graft survival were assessed using univariate and multivariate analysis. Kaplan-Meier analysis of the FHF and EHBA groups revealed that 5-year patient and graft survival were both significantly worse (P < 0.0001) in those patients who underwent transplantation for FHF. Multivariate analysis of 29 variables subsequently revealed distinct sets of factors that influenced patient and graft survival for both FHF and EHBA. These results confirm that separate prioritizing systems for LTx are needed for children with chronic liver disease and FHF; additionally, our findings illustrate that there are unique sets of variables which predict survival following LTx for these two groups. Liver Transpl 12:1347-1356, 2006. © 2006 AASLD.

Graft and patient survival after adult live donor liver transplantation compared to a matched cohort who received a deceased donor transplantation
LIVER TRANSPLANTATION, Issue 10 2004. Paul J. Thuluvath.
Live donor liver transplantation (LDLT) has become increasingly common in the United States and around the world.
In this study, we compared the outcome of 764 patients who received LDLT in the United States with a matched population that received deceased donor transplantation (DDLT), using the United Network for Organ Sharing (UNOS) database. For each LDLT recipient (n = 764), two DDLT recipients (n = 1,470), matched for age, gender, race, diagnosis, and year of transplantation, were selected from the UNOS data after excluding multiple organ transplantation or retransplantation, children, and those with incomplete data. Despite our matching, recipients of LDLT had more stable liver disease, as shown by fewer patients with UNOS status 1 or 2A, in an intensive care unit, or on life support. Creatinine and cold ischemia time were also lower in the LDLT group. Primary graft nonfunction, hyperacute rejection rates, and patient survival by Kaplan-Meier analysis were similar in both groups (2-year survival was 79.0% in LDLT vs. 80.7% in case-controls; P = .5), but graft survival was significantly lower in LDLT (2-year graft survival was 64.4% vs. 73.3%; P < .001). Cox regression analysis (after adjusting for confounding variables) showed that LDLT recipients were 60% more likely to lose their graft compared to DDLT recipients (hazard ratio [HR] 1.6; confidence interval 1.1-2.5). Among hepatitis C virus (HCV) patients, LDLT recipients showed lower graft survival when compared to those who received DDLT. In conclusion, short-term patient survival in LDLT is similar to that in the DDLT group, but graft survival is significantly lower in LDLT recipients. LDLT is a reasonable option for patients who are unlikely to receive DDLT in a timely fashion. (Liver Transpl 2004;10:1263-1268.)

Liver transplantation for acute liver failure from drug induced liver injury in the United States
LIVER TRANSPLANTATION, Issue 8 2004. Mark W. Russo.
Studies of acute liver failure from drugs have included cases mostly attributed to acetaminophen (APAP) but have reported limited data on other drugs. We used the United Network for Organ Sharing (UNOS) liver transplant database from 1990 to 2002 to identify recipients and estimate a U.S. population-based rate of liver transplantation due to acute liver failure from drugs. Patients were identified if their diagnosis was acute hepatic necrosis from an implicated drug at the time of transplant. Liver transplantation for drug hepatotoxicity accounted for 15% of liver transplants for acute liver failure over the study period. In our cohort (n = 270), 206 (76%) recipients were female. APAP, alone or in combination with another drug, accounted for 133 (49%) cases. In the non-acetaminophen (non-APAP) group (n = 137), the most frequently implicated drugs were isoniazid, n = 24 (17.5%); propylthiouracil, n = 13 (9.5%); and phenytoin and valproate, in 10 (7.3%) cases each. One-year patient and graft survival for the entire cohort were 77% and 71%, respectively. Among Caucasians (n = 206) and African-Americans (n = 48), APAP only was implicated in 110 (53%) patients and 12 (25%) patients, respectively, and non-APAP drugs were implicated in 96 (47%) patients and 36 (75%) patients, respectively (P = .0004). Among African-Americans in the non-APAP group, 28 (78%) were women. In conclusion, four drugs were implicated in 42% of patients undergoing liver transplantation for acute liver failure due to drugs other than APAP. The increased frequency of African-American women undergoing liver transplantation for non-APAP drug-induced liver injury warrants further study. (Liver Transpl 2004;10:1018-1023.)

Retransplantation for recurrent hepatitis C: Positive aspects
LIVER TRANSPLANTATION, Issue 11 2003. Timothy M. McCashland.
Key points:
1. The prevalence of retransplantation for hepatitis C (HCV) patients is stable (around 40%).
2. Survival models to predict outcome of retransplantation do not show that HCV is an independent variable with poor outcomes.
3. Using Model for End-Stage Liver Disease (MELD) scores from the United Network for Organ Sharing (UNOS) database from 1996-2002, retransplantation for HCV had outcomes similar to other causes of retransplantation.
4. Poorer outcomes were noted for retransplantation with MELD scores greater than 25.
5. Minimal survival thresholds need to be developed for all causes of retransplantation.

A model to predict survival at one month, one year, and five years after liver transplantation based on pretransplant clinical characteristics
LIVER TRANSPLANTATION, Issue 5 2003. Paul J. Thuluvath MD.
Reliable models that can predict the outcome of liver transplantation (LT) may guide physicians in advising their patients of immediate and late survival chances and may help them optimize organ use. The objective of this study was to develop user-friendly models to predict short- and long-term mortality after LT in adults based on pre-LT recipient characteristics. The United Network for Organ Sharing (UNOS) transplant registry (n = 38,876) from 1987 to 2001 was used to develop and validate the model. Two thirds of patients were randomized to develop the model (the modeling group), and the remaining third was randomized to cross-validate it (the cross-validation group). Three separate models, using multivariate logistic regression analysis, were created and validated to predict survival at 1 month, 1 year, and 5 years. Using the total severity scores of patients in the modeling group, a predictive model was created, and the predicted probability of death as a function of total score was then compared in the cross-validation group.
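A multivariate logistic regression of this kind maps a patient's total severity score to a predicted probability of death through the logistic function. The published coefficients are not given here, so the intercept and slope below are hypothetical placeholders; the sketch only illustrates the functional form:

```python
import math

def predicted_mortality(total_score, intercept=-4.0, slope=0.5):
    """Illustrative logistic mapping from a total severity score to a
    predicted probability of death. The intercept and slope are
    hypothetical placeholders, not the model's published coefficients."""
    return 1.0 / (1.0 + math.exp(-(intercept + slope * total_score)))

# Higher total score -> higher predicted probability of death.
for score in (0, 4, 8, 12):
    print(score, round(predicted_mortality(score), 3))
```

Comparing such predicted probabilities against observed deaths, bucketed by score, is the essence of the cross-validation the abstract describes.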
The independent variables found to be significant predictors of survival were age, body mass index (BMI), UNOS status 1, etiology, creatinine, serum bilirubin (for 1-month and 1-year survival only), and race (for 5-year survival only). Actual deaths in the cross-validation group closely followed the predicted survival curve. The chi-squared goodness-of-fit test confirmed that the model could reliably predict mortality at 1 month, 1 year, and 5 years. We have developed and validated user-friendly models that can reliably predict short-term and long-term survival after LT.

Redrawing organ distribution boundaries: Results of a computer-simulated analysis for liver transplantation
LIVER TRANSPLANTATION, Issue 8 2002. Richard B. Freeman MD.
For several years, the Organ Procurement and Transplantation Network/United Network for Organ Sharing (UNOS) Liver and Intestinal Transplantation Committee has been examining the effects of changes, and proposed changes, to the liver allocation system. The Institute of Medicine recently recommended that the size of liver distribution units be increased to improve the organ distribution system. Methods to achieve this, and the potential impact of such a change on patients and transplant centers, are evaluated in this study. In hypothetical scenarios, we combined geographically contiguous organ procurement organizations (OPOs) in seven different configurations to increase the size of liver distribution units to cover populations greater than 9 million persons. Using the UNOS Liver Allocation Model (ULAM), we examined the effect of 17 different organ allocation sequences in these proposed realignments and compared them with those predicted by ULAM for the current liver distribution system, using the following primary outcome variables: number of primary liver transplantations performed, total number of deaths, and total number of life-years saved.
Every proposed new liver distribution unit plan resulted in fewer primary transplantations. Many policies increased the total number of deaths and reduced total life-years saved compared with the current system. Most of the proposed plans reduced interregional variation compared with the current plan, but no single plan consistently reduced variation for all outcome variables, and all reductions in variation were relatively small. All new liver distribution unit plans led to significant shifts in the number of transplantations performed in individual OPOs compared with the current system. The ULAM predicts that changing liver distribution units to larger geographic areas has little positive impact on the overall results of liver transplantation in the United States compared with the current plan. Enlarging liver distribution units likely will result in significant shifts of organs across current OPO boundaries, with a significant impact on the activity of many transplant centers.

Model for end-stage liver disease and Child-Turcotte-Pugh score as predictors of pretransplantation disease severity, posttransplantation outcome, and resource utilization in United Network for Organ Sharing status 2A patients
LIVER TRANSPLANTATION, Issue 3 2002. Robert S. Brown Jr MD.
The Model for End-Stage Liver Disease (MELD) has been proposed as a replacement for the Child-Turcotte-Pugh (CTP) classification to stratify patients for prioritization for orthotopic liver transplantation (OLT). Improved classification of patients with decompensated cirrhosis might allow timely OLT before the development of life-threatening complications, reducing the number of critically ill patients listed as United Network for Organ Sharing (UNOS) status 2A at the time of OLT. We compared the ability of the MELD and CTP scores to predict pre-OLT disease severity, as well as outcome and resource utilization post-OLT.
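For readers unfamiliar with the MELD score discussed here, it is computed from three laboratory values. The sketch below uses the commonly cited UNOS formulation (coefficients from memory, so treat the exact constants and bounds as an assumption to be checked against current OPTN policy):

```python
import math

def meld_score(bilirubin, inr, creatinine):
    """Compute a MELD score from serum bilirubin (mg/dL), INR and serum
    creatinine (mg/dL), per the commonly cited UNOS formulation.
    Values below 1.0 are floored at 1.0; creatinine is typically
    capped at 4.0 mg/dL. Constants are an assumption, not taken from
    the abstract."""
    b = max(bilirubin, 1.0)
    i = max(inr, 1.0)
    c = min(max(creatinine, 1.0), 4.0)
    raw = 3.78 * math.log(b) + 11.2 * math.log(i) + 9.57 * math.log(c) + 6.43
    return round(raw)

print(meld_score(1.0, 1.0, 1.0))   # 6, the scale minimum
print(meld_score(3.0, 1.5, 2.0))   # 22, a decompensated-cirrhosis range value
```

The CTP score, by contrast, combines bilirubin, albumin, INR, ascites and encephalopathy into a 5-15 point scale; the study compares the predictive ability of the two.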
Data from 42 consecutive UNOS status 2A patients undergoing OLT at a single center were used to calculate MELD and CTP scores at the time of status 2A listing. Multivariate analysis was used to determine the relationship between these scores and pre-OLT disease severity measures, survival post-OLT, and measures of resource use post-OLT. The MELD was superior to the CTP score in predicting pre-OLT requirements for mechanical ventilation and dialysis. Neither score correlated with the resource utilization parameters studied. Only two patients died within 3 months post-OLT; neither score was predictive of survival in this cohort. In summary, the MELD is superior to the CTP score in estimating pre-OLT disease severity in UNOS status 2A patients and thus may help risk-stratify status 2A or decompensated status 2B OLT candidates and optimize the timing of OLT. However, neither score correlated with resource use post-OLT in this stratum of critically ill patients.

Do six-antigen-matched cadaver donor kidneys provide better graft survival to children compared with one-haploidentical living-related donor transplants? A report of the North American Pediatric Renal Transplant Cooperative Study
PEDIATRIC TRANSPLANTATION, Issue 2 2000.
Abstract: Since 1991, more than 50% of pediatric transplant recipients have received a living donor (LD) kidney, and approximately 85% of these allografts were one-haploidentical parental kidneys. Short-term (1-yr) and long-term (5-yr) graft survival of LD kidneys are 10% and 15% better, respectively, than that of cadaver donor (CD) kidneys. Because of these results, children are frequently not placed on a cadaver waiting list until the possibility of a LD is excluded, a process that may take up to 1 yr. The hypothesis for this study was that the graft outcome of a six-antigen-matched CD kidney is superior to that of a one-haploidentical LD kidney, and that children are at a disadvantage by not being placed on a CD list whilst waiting for a LD.
The database of the North American Pediatric Renal Transplant Cooperative Study (NAPRTCS) for 11 yrs (1987-98) was reviewed to identify children who were recipients of a six-antigen-matched CD kidney (primary and repeat transplants) and those who were recipients of a one-haploidentical LD kidney (primary and repeat transplants). Using standard statistical methods, the morbidity, rejection episodes, post-transplant hospitalizations, renal function, long- and short-term graft survival, and half-life of primary recipients were compared in the two groups. Unlike adult patients, only 2.7% (87/3313) of CD recipients in the pediatric age range received a six-antigen-matched kidney, and the annual accrual rate over 11 yrs was never higher than 4%. Comparison of 57 primary six-antigen-matched CD (PCD) kidneys with 2472 primary LD (PLD) kidneys revealed that morbidity, rejection rates, and ratios were identical in the two groups. Renal function and subsequent hospitalizations were also identical in the two groups. Five-year graft survival of the PCD group was 90% compared with 80% for the PLD group, and the half-life of the PCD group was 25 ± 12.9 yrs compared with 19.6 ± 1.3 yrs. Our data suggest that the six-antigen-matched CD kidney may have less graft loss as a result of chronic rejection and would therefore confer a better long-term outcome. Based on these findings, we recommend that all children, whilst waiting for a LD work-up, be listed with the United Network for Organ Sharing (UNOS) registry for a CD kidney.

Listing for Expanded Criteria Donor Kidneys in Older Adults and Those with Predicted Benefit
AMERICAN JOURNAL OF TRANSPLANTATION, Issue 4 2010. M. E. Grams.
Certain patient groups are predicted to derive significant survival benefit from transplantation with expanded criteria donor (ECD) kidneys. An algorithm published in 2005 by Merion and colleagues characterizes this group: older adults, diabetics and registrants at centers with long waiting times.
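The Merion characterization just described (older adults, diabetics, registrants at centers with long waiting times) can be caricatured as a simple predicate. The cutoffs below are illustrative placeholders, not the published criteria:

```python
def predicted_ecd_benefit(age, diabetic, center_wait_years,
                          age_cutoff=65, wait_cutoff=3.65):
    """Hypothetical sketch of the 'ECD-benefit' group: older age,
    diabetes, or registration at a center with long waiting times.
    Both cutoffs are illustrative assumptions, not the published
    Merion algorithm."""
    return age >= age_cutoff or diabetic or center_wait_years >= wait_cutoff

print(predicted_ecd_benefit(70, False, 1.0))   # True: older adult
print(predicted_ecd_benefit(50, True, 1.0))    # True: diabetic
print(predicted_ecd_benefit(50, False, 1.0))   # False: none of the criteria
```

The study then asks what fraction of registrants satisfying such a predicate were actually listed as "ECD-willing".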
Our goal was to evaluate ECD listing practice patterns in the United States in terms of these characteristics. We reviewed 142,907 first-time deceased donor kidney registrants reported to the United Network for Organ Sharing (UNOS) between 2003 and 2008. Of registrants predicted to benefit from ECD transplantation according to the Merion algorithm ('ECD-benefit'), 49.8% were listed for ECD offers ('ECD-willing'), with proportions ranging from 0% to 100% by transplant center. In contrast, 67.6% of adults over the age of 65 years were ECD-willing, also ranging from 0% to 100% by center. In multivariate models, neither diabetes nor center waiting time was significantly associated with ECD-willingness in any subgroup. From the time of initial registration, irrespective of eventual transplantation, ECD-willingness was associated with a significant adjusted survival advantage in the ECD-benefit group (HR for death 0.88, p < 0.001) and in older adults (HR 0.89, p < 0.001), but with increased mortality in non-ECD-benefit registrants (HR 1.11, p < 0.001). In conclusion, ECD listing practices are widely varied and not consistent with published recommendations, a pattern that may disenfranchise certain transplant registrants.

A Comparative Analysis of Transarterial Downstaging for Hepatocellular Carcinoma: Chemoembolization Versus Radioembolization
AMERICAN JOURNAL OF TRANSPLANTATION, Issue 8 2009. R. J. Lewandowski.
Chemoembolization and other ablative therapies are routinely utilized in downstaging from United Network for Organ Sharing (UNOS) T3 to T2, thus potentially making patients transplant candidates under the UNOS model for end-stage liver disease (MELD) upgrade for hepatocellular carcinoma (HCC). This study was undertaken to compare the downstaging efficacy of transarterial chemoembolization (TACE) versus transarterial radioembolization.
Eighty-six patients were treated with either TACE (n = 43) or transarterial radioembolization with Yttrium-90 microspheres (TARE-Y90; n = 43). Median tumor size was similar (TACE: 5.7 cm, TARE-Y90: 5.6 cm). Partial response rates favored TARE-Y90 over TACE (61% vs. 37%). Downstaging to UNOS T2 was achieved in 31% of TACE and 58% of TARE-Y90 patients. Time to progression according to UNOS criteria was similar for both groups (18.2 months for TACE vs. 33.3 months for TARE-Y90, p = 0.098). Event-free survival was significantly greater for TARE-Y90 than for TACE (17.7 vs. 7.1 months, p = 0.0017). Overall survival favored TARE-Y90 compared to TACE (censored 35.7 vs. 18.7 months, p = 0.18; uncensored 41.6 vs. 19.2 months, p = 0.008). In conclusion, TARE-Y90 appears to outperform TACE for downstaging HCC from UNOS T3 to T2.

Moving Kidney Allocation Forward: The ASTS Perspective
AMERICAN JOURNAL OF TRANSPLANTATION, Issue 7 2009. R. B. Freeman.
In 2008, the United Network for Organ Sharing issued a request for information regarding a proposed revision to kidney allocation policy. This plan described combining dialysis time, donor characteristics and the estimated life years from transplant (LYFT) each candidate would gain into an allocation score that would rank waiting candidates. Though there were some advantages to this plan, the inclusion of LYFT raised many questions. Foremost, there was no clear agreement that LYFT should be the main criterion by which patients are ranked. Moreover, ranking waiting candidates with this metric required long-term survival models that incorporated neither patient preference nor discounting for long survival times, and whose predictive accuracy did not achieve accepted standards. The American Society of Transplant Surgeons was pleased to participate in the evaluation of the proposal.
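An allocation score of the kind the proposal described ranks candidates by a weighted combination of dialysis time and estimated LYFT. The sketch below is purely hypothetical (the weights, the functional form, and the donor-quality scaling are all assumptions, not the 2008 proposal), but shows the ranking mechanics:

```python
def allocation_score(dialysis_years, lyft_years, donor_quality=1.0,
                     w_dialysis=0.8, w_lyft=0.2):
    """Hypothetical combined kidney allocation score: dialysis time
    plus estimated life years from transplant (LYFT), scaled by a
    donor-quality factor. Weights and form are illustrative only."""
    return (w_dialysis * dialysis_years + w_lyft * lyft_years) * donor_quality

# (dialysis_years, lyft_years) for three hypothetical candidates
candidates = {"A": (6.0, 5.0), "B": (2.0, 15.0), "C": (4.0, 9.0)}
ranked = sorted(candidates,
                key=lambda k: allocation_score(*candidates[k]),
                reverse=True)
print(ranked)   # ['A', 'C', 'B']
```

Note how the chosen weights decide whether long-waiting candidates (A) or high-LYFT candidates (B) rise to the top; this trade-off between utility and waiting-time equity is exactly the "individual justice" concern raised below.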
Ultimately, the membership did not favor this proposal, because we felt that it was too complicated and that the projected slight increase in overall utility was not justified by the required compromise in individual justice. We offer alternative policy options to address some of the unmet needs and issues that were brought to light during this interesting process.

Histidine-Tryptophan-Ketoglutarate (HTK) Is Associated with Reduced Graft Survival of Deceased Donor Kidney Transplants
AMERICAN JOURNAL OF TRANSPLANTATION, Issue 5 2009. Z. A. Stewart.
Single-center studies have reported equivalent outcomes of kidney allografts recovered with histidine-tryptophan-ketoglutarate (HTK) or University of Wisconsin (UW) solution. However, these studies were likely underpowered and often unadjusted, and multicenter studies have suggested HTK preservation might increase delayed graft function (DGF) and reduce graft survival of renal allografts. To further inform clinical practice, we analyzed the United Network for Organ Sharing (UNOS) database of deceased donor kidney transplants performed from July 2004 to February 2008 to determine if HTK (n = 5728) versus UW (n = 15,898) preservation impacted DGF or death-censored graft survival. On adjusted analyses, HTK preservation had no effect on DGF (odds ratio [OR] 0.99, p = 0.7) but was associated with an increased risk of death-censored graft loss (hazard ratio [HR] 1.20, p = 0.008). The detrimental effect of HTK was a relatively late one, with a strong association between HTK and subsequent graft loss in those surviving beyond 12 months (HR 1.43, p = 0.007). Interestingly, a much stronger effect was seen in African-American recipients (HR 1.55, p = 0.024) than in Caucasian recipients (HR 1.18, p = 0.5). Given recent studies that also demonstrate that HTK preservation reduces liver and pancreas allograft survival, we suggest that the use of HTK for abdominal organ recovery should be reconsidered.
Disparities in the Utilization of Live Donor Renal Transplantation
AMERICAN JOURNAL OF TRANSPLANTATION, Issue 5 2009. J. L. Gore.
Despite universal payer coverage with Medicare, sociodemographic disparities confound the care of patients with renal failure. We sought to determine whether adults who realize access to kidney transplantation suffer inequities in the utilization of live donor renal transplantation (LDRT). We identified adults undergoing primary renal transplantation in 2004-2006 from the United Network for Organ Sharing (UNOS). We modeled receipt of live versus deceased donor renal transplant in multilevel multivariate models that examined recipient, center and UNOS region-specific covariates. Among 41,090 adult recipients identified, 39% underwent LDRT. On multivariate analysis, older recipients (OR 0.62, 95% CI 0.56-0.68 for 50-59-year-olds vs. 18-39-year-old recipients), those of African American ethnicity (OR 0.54, 95% CI 0.50-0.59 vs. whites) and those of lower socioeconomic status (OR 0.72, 95% CI 0.67-0.79 for high school-educated vs. college-educated recipients; OR 0.78, 95% CI 0.71-0.87 for lowest vs. highest income quartile) had lower odds of LDRT. These characteristics accounted for 14.2% of the variation in LDRT, more than recipient clinical variables, transplant center characteristics and UNOS region-level variation. We identified significant racial and socioeconomic disparities in the utilization of LDRT. Educational initiatives and dissemination of processes that enable increased utilization of LDRT may address these disparities.

Changes in Pediatric Renal Transplantation After Implementation of the Revised Deceased Donor Kidney Allocation Policy
AMERICAN JOURNAL OF TRANSPLANTATION, Issue 5 2009. S. Agarwal.
In October 2005, the United Network for Organ Sharing (UNOS) implemented a revised allocation policy requiring that renal allografts from young deceased donors (DDs) (<35 years old) be offered preferentially to pediatric patients (<18 years old). In this study, we compare the pre- and post-policy quarterly pediatric transplant statistics from 2000 to 2008. The mean number of pediatric renal transplants with young DDs increased after policy implementation from 62.8 to 133 per quarter (p < 0.001), reflecting a change in the proportion of all transplants from young DDs during the study period from 0.33 to 0.63 (p < 0.001). The mean number of pediatric renal transplants from old DDs (≥35 years old) decreased from 22.4 to 2.6 per quarter (p < 0.001). The proportion of all pediatric renal transplants from living donors decreased from 0.55 to 0.35 (p < 0.001). The proportion from young DDs with five or six mismatched human leukocyte antigen (HLA) loci increased from 0.16 to 0.36 (p < 0.001), while the proportion with 0 to 4 HLA mismatches increased from 0.18 to 0.27 (p < 0.001). Revision of UNOS policy has increased the number of pediatric renal transplants with allografts from young DDs, while increasing HLA-mismatched allografts and decreasing the number from living donors.

Kidney Transplantation in Previous Heart or Lung Recipients
AMERICAN JOURNAL OF TRANSPLANTATION, Issue 3 2009. B. E. Lonze.
Outcomes after heart and lung transplants have improved, and many recipients survive long enough to develop secondary renal failure, yet remain healthy enough to undergo kidney transplantation. We used national data reported to the United Network for Organ Sharing (UNOS) to evaluate outcomes of 568 kidney after heart (KAH) and 210 kidney after lung (KAL) transplants performed between 1995 and 2008. Median time to kidney transplant was 100.3 months after heart and 90.2 months after lung transplant. Renal failure was attributed to calcineurin inhibitor toxicity in most patients.
Outcomes were compared with primary kidney recipients using matched controls (MC) to account for donor, recipient and graft characteristics. Although 5-year renal graft survival was lower than in primary kidney recipients (61% KAH vs. 73.8% MC, p < 0.001; 62.6% KAL vs. 82.9% MC, p < 0.001), death-censored graft survival was comparable (84.9% KAH vs. 88.2% MC, p = 0.1; 87.6% KAL vs. 91.8% MC, p = 0.6). Furthermore, renal transplantation reduced the risk of death compared with dialysis by 43% for KAH and 54% for KAL recipients. Our findings that renal grafts function well and provide survival benefit in KAH and KAL recipients, but are limited in longevity by the general life expectancy of these recipients, might help inform clinical decision-making and allocation in this population.

Formal Policies and Special Informed Consent Are Associated with Higher Provider Utilization of CDC High-Risk Donor Organs
AMERICAN JOURNAL OF TRANSPLANTATION, Issue 3 2009. L. M. Kucirka.
A new United Network for Organ Sharing (UNOS) policy mandates special informed consent (SIC) before transplanting organs from donors classified by the Public Health Service/Centers for Disease Control (PHS/CDC) as high-risk donors (HRDs); however, concerns remain that this policy may cause suboptimal organ utilization. Currently, consent and disclosure policy is determined by individual centers or surgeons; as such, little is known about current practices. The goals of this study were to quantify consent and disclosure practices for HRDs in the United States, identify factors associated with SIC use, and analyze associations between SIC use and HRD organ utilization. We surveyed 422 transplant surgeons about their use of HRD organs and their associated consent and disclosure practices. In total, 52.7% of surgeons use SIC, but there is high variation in use within centers, between centers and by donor behavior.
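For a survey estimate like the 52.7% SIC-use figure above, a normal-approximation confidence interval conveys the sampling uncertainty from the 422 respondents. This is a standard textbook calculation, not one reported in the abstract:

```python
import math

def proportion_ci(p, n, z=1.96):
    """Normal-approximation 95% confidence interval for a sample
    proportion, e.g. the 52.7% of 422 surveyed surgeons using SIC."""
    se = math.sqrt(p * (1 - p) / n)
    return (p - z * se, p + z * se)

lo, hi = proportion_ci(0.527, 422)
print(round(lo, 3), round(hi, 3))   # 0.479 0.575
```

So the true prevalence of SIC use among surgeons is plausibly anywhere from roughly 48% to 57% under this sampling model.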
A defined HRD policy at a transplant center is strongly associated with SIC use at that center (OR = 4.68, p < 0.001 by multivariate hierarchical logistic regression). SIC use is associated with higher utilization of HRD livers (OR 3.37) and a trend toward higher utilization of HRD kidneys (OR 1.74) and pancreata (OR 1.28). We believe our findings support a formalized national policy and suggest that such a policy will not result in decreased utilization.

The Evolution and Direction of OPTN Oversight of Live Organ Donation and Transplantation in the United States
AMERICAN JOURNAL OF TRANSPLANTATION, Issue 1 2009. R. S. Brown.
For more than 20 years, the Organ Procurement and Transplantation Network (OPTN) has developed policies and bylaws relating to equitable allocation of deceased donor organs for transplantation. United Network for Organ Sharing (UNOS) operates the OPTN under contract with the Health Resources and Services Administration (HRSA) of the U.S. Department of Health and Human Services (HHS). Until recent years, the OPTN had little defined authority regarding living donor organs for transplantation except for the collection of data relating to living donor transplants. Beginning with the implementation of the OPTN Final Rule in 2000, and continuing with more recent announcements, the OPTN's role in living donation has grown. Its responsibilities now include monitoring of living donor outcomes, promoting equity in nondirected living donor transplantation and ensuring that transplant programs have expertise and established protocols to promote the safety of living donors and recipients. The purpose of this article is to describe the evolving mandates for the OPTN in living donation, as well as the network's recent activities and ongoing efforts.

Histidine-Tryptophan-Ketoglutarate (HTK) Is Associated with Reduced Graft Survival in Pancreas Transplantation
AMERICAN JOURNAL OF TRANSPLANTATION, Issue 1 2009. Z. A. Stewart.
Prior single-center studies have reported that pancreas allograft survival is not affected by preservation in histidine-tryptophan-ketoglutarate (HTK) versus University of Wisconsin (UW) solution. To expand on these studies, we analyzed the United Network for Organ Sharing (UNOS) database of pancreas transplants from July 2004 through February 2008 to determine if preservation with HTK (n = 1081) versus UW (n = 3311) impacted graft survival. HTK preservation of pancreas allografts increased significantly in this time frame, from 15.4% in 2004 to 25.4% in 2008. After adjusting for other recipient, donor, graft and transplant center factors that impact graft survival, HTK preservation was independently associated with an increased risk of pancreas graft loss (hazard ratio [HR] 1.30, p = 0.014), especially in pancreas allografts with cold ischemia time (CIT) ≥12 h (HR 1.42, p = 0.017). This reduced survival with HTK preservation compared to UW preservation was seen in both simultaneous pancreas-kidney (SPK) transplants and pancreas alone (PA) transplants. Furthermore, HTK preservation was also associated with 1.54-fold higher odds of early (<30 days) pancreas graft loss compared to UW (OR 1.54, p = 0.008). These results suggest that the increasing use of HTK for abdominal organ preservation should be re-examined.

Direction of the Organ Procurement and Transplantation Network and United Network for Organ Sharing Regarding the Oversight of Live Donor Transplantation and Solicitation for Organs
AMERICAN JOURNAL OF TRANSPLANTATION, Issue 1 2006. F. L. Delmonico.
The Organ Procurement and Transplantation Network (OPTN), operated by United Network for Organ Sharing (UNOS), has taken recent steps to address public solicitation for organ donors and its oversight of live donor transplantation. This report provides the direction of the OPTN regarding deceased donor solicitation.
The OPTN has authority under federal law to equitably allocate deceased donor organs within a single national network based upon medical criteria, not upon one's social or economic ability to utilize resources not available to all on the waiting list. The OPTN makes a distinction between solicitations for a live donor organ and solicitations for directed donation of deceased donor organs. As to live donor solicitation, the OPTN cannot regulate or restrict the ways relationships are developed in our society, nor does it seek to do so. OPTN members nevertheless have a responsibility to help protect potential recipients from hazards that can arise from public appeals for live donor organs. The OPTN's oversight and support of live donor transplantation is now detailed through improved reporting of live donor follow-up, a mechanism for facilitating anonymous live kidney donation, and information for potential live kidney donors on the UNOS Transplant Living℠ website.

Laparoscopic (vs. Open) Live Donor Nephrectomy: A UNOS Database Analysis of Early Graft Function and Survival
AMERICAN JOURNAL OF TRANSPLANTATION, Issue 10 2003. Christoph Troppmann
The impact of laparoscopic (lap) live donor nephrectomy on early graft function and survival remains controversial. We compared 2734 kidney transplants (tx) from lap donors and 2576 tx from open donors reported to the U.S. United Network for Organ Sharing from 11/1999 to 12/2000. Early function quality (>40 mL urine and/or serum creatinine [creat] decline >25% during the first 24 h post-tx) and the incidence of delayed function were similar for both groups. Significantly more lap (vs. open) txs, however, had discharge creats greater than 1.4 mg/dL (49.2% vs. 44.9%, p = 0.002) and 2.0 mg/dL (21.8% vs. 19.5%, p = 0.04). But all later creats, early and late rejection, and graft survival at 1 year (94.4%, lap tx vs. 94.1%, open tx) were similar for lap and open recipients.
Our data suggest that lap nephrectomy is associated with slower early graft function. Rejection rates and short-term graft survival, however, were similar for lap and open graft recipients. Further prospective studies with longer follow-up are necessary to assess the potential impact of the laparoscopic procurement mode on early graft function and long-term outcome.

L/I-9 Adult living donor liver transplants: Niguarda experience in Milan
CLINICAL TRANSPLANTATION, Issue 2006. A. Giacomoni
Introduction: Adult living donor liver transplants (ALDLTs) have emerged as an option in the last few years.
Materials and methods: From March 2001 through February 2006, we performed 27 ALDLTs. Liver volume and vascular and biliary anatomy were assessed by CT scan and magnetic resonance cholangiography. The graft-to-recipient weight ratio was always above 0.8. The recipients were United Network for Organ Sharing (UNOS) status 2B or 3. The transplant was carried out by grafting segments V-VIII to the recipient without the middle hepatic vein (MHV). In the recipient we never used a veno-venous bypass.
Results: With a mean follow-up of 675 days (range, 8 to 1,804 days), 23 of the 27 patients are alive. Three have undergone retransplantation: 2 as a consequence of arterial thrombosis and 1 because of small-for-size syndrome. These data correspond to overall patient and graft survival rates of 85% and 74%, respectively. Four deaths were caused by massive pulmonary bleeding due to Rendu-Osler syndrome, systemic aspergillosis, sepsis, and cardiac arrhythmia. Fourteen biliary complications (51.85%) occurred in 11 recipients (40.74%); 3 of these patients developed 2 consecutive and different biliary complications. All the donors are alive and well.
Conclusion: An expert surgical team and proper selection of both donor and recipient are mandatory. Overall results of ALDLTs are very satisfactory, even if we have to take into account a high rate of biliary complications.
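The headline figures in the Niguarda series can be reproduced from the counts the abstract reports (27 recipients, 23 alive, 4 deaths, 3 retransplants). A minimal arithmetic sketch, assuming the convention that both death and retransplantation count as graft loss (the abstract does not state its convention explicitly):

```python
# Crude (unadjusted) survival proportions for the Niguarda ALDLT series.
# Counts come from the abstract; the graft-loss convention is an assumption.
recipients = 27
alive = 23          # 4 deaths reported
retransplants = 3   # grafts lost in patients who survived

patient_survival = alive / recipients
graft_survival = (recipients - retransplants - (recipients - alive)) / recipients

print(f"patient survival: {patient_survival:.0%}")  # 85%
print(f"graft survival:   {graft_survival:.0%}")    # 74%
```

Under this convention the computed 85% and 74% match the rates reported in the abstract, which suggests these are simple crude proportions rather than actuarial estimates.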
United Network for Organ Sharing's expanded criteria donors: is stratification useful?
CLINICAL TRANSPLANTATION, Issue 3 2005. Edwina S. Baskin-Bey
The United Network for Organ Sharing (UNOS) Expanded Criteria Donor (ECD) system utilizes pre-transplant variables to identify deceased donor kidneys with an increased risk of graft loss. The aim of this study was to compare the ECD system with a quantitative approach, the deceased donor score (DDS), in predicting outcome after kidney transplantation. We retrospectively reviewed 49,111 deceased donor renal transplants from the UNOS database between 1984 and 2002. The DDS ranges from 0 to 39 points, with ≥20 points defined as marginal. Recipient outcome variables were analyzed by ANOVA or the Kaplan-Meier method. There was 90% agreement between the DDS and ECD systems as predictors of renal function and graft survival. However, the DDS identified ECD− kidneys (10.7%) with a significantly poorer outcome than expected (DDS 20-29 points, n = 5,252). Stratification of ECD+ kidneys identified the group with the poorest outcome (DDS ≥30 points). Predictability of early post-transplant events (i.e., need for hemodialysis, decline of serum creatinine, and length of hospital stay) was also improved by the DDS. The DDS predicted the outcome of deceased donor renal transplantation better than the ECD system. Knowledge obtained by stratification of deceased donor kidneys can allow for improved utilization of marginal kidneys, which is not achieved by the UNOS ECD definition alone.
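Several of the abstracts above, including the DDS study, analyze graft survival with the Kaplan-Meier method. As a reference point, here is a minimal sketch of the product-limit estimator in plain Python; the follow-up data are invented for illustration and do not come from any of the studies:

```python
from collections import Counter

def kaplan_meier(times, events):
    """Product-limit (Kaplan-Meier) survival estimate.

    times:  follow-up time for each subject
    events: 1 if the event (e.g., graft loss) occurred at that time,
            0 if the subject was censored (e.g., graft still functioning)
    Returns a list of (time, survival probability) steps at event times.
    """
    deaths = Counter(t for t, e in zip(times, events) if e)
    censored = Counter(t for t, e in zip(times, events) if not e)
    at_risk = len(times)
    survival = 1.0
    curve = []
    for t in sorted(set(times)):
        d = deaths.get(t, 0)
        if d:
            # Multiply by the conditional probability of surviving past t.
            survival *= (at_risk - d) / at_risk
            curve.append((t, survival))
        # Subjects censored at t are conventionally still at risk at t.
        at_risk -= d + censored.get(t, 0)
    return curve

# Toy follow-up times (months) for 5 hypothetical grafts:
times  = [1, 2, 2, 3, 4]
events = [1, 0, 1, 1, 0]   # losses at months 1, 2, 3; two censored
print([(t, round(s, 3)) for t, s in kaplan_meier(times, events)])
# [(1, 0.8), (2, 0.6), (3, 0.3)]
```

The published studies use this estimator on registry-scale data (and add log-rank tests or Cox models for comparisons); the sketch only shows how censored follow-up enters the survival curve.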