Donor Selection

Selected Abstracts


Small-for-size liver syndrome after auxiliary and split liver transplantation: Donor selection

LIVER TRANSPLANTATION, Issue 9 2003
Nigel Heaton
Small-for-size liver grafts can be defined by a recognizable clinical syndrome that results from the transplantation of too small a functional mass of liver for a designated recipient. A graft-to-recipient body weight ratio of less than 0.8, impaired venous outflow, and enhanced metabolic demands in patients in poor clinical condition must be considered the main factors leading to the small-for-size syndrome (SFSS) when using living and cadaveric partial grafts such as split and auxiliary liver grafts. Increased risk of graft dysfunction is currently observed with fatty infiltration of more than 30%, abnormal liver test results (especially bilirubin and gamma-glutamyl transferase), and other donor risk factors such as high inotrope administration and donor stay in the intensive care unit (>5 days). Older donors are especially vulnerable to prolonged cold ischemia and high inotrope levels, giving rise to early graft dysfunction and prolonged cholestasis. Increased metabolic demand on a functionally small-for-size graft predisposes to surgical and septic complications and poorer survival. Splitting livers into right and left lobe grafts increases the potential risk of small-for-size grafts for both recipients. Several techniques of venous outflow reconstruction/implantation have been proposed to reduce the risk of postoperative obstruction. Prevention and management of SFSS will improve in parallel with increasing experience, allowing optimal use of available organs and reducing overall morbidity and mortality. (Liver Transpl 2003;9:S26-S28.) [source]
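
The graft-to-recipient weight ratio named above is simple arithmetic and the donor thresholds are explicit, so a minimal sketch may help; the function and variable names below are illustrative, not from the article, and only the numeric cut-offs (GRWR < 0.8, steatosis > 30%, ICU stay > 5 days) come from the abstract.

```python
# Minimal sketch of the small-for-size risk thresholds quoted above.
# Names are illustrative; only the cut-off values come from the abstract.

def grwr(graft_weight_g: float, recipient_weight_kg: float) -> float:
    """Graft-to-recipient weight ratio, expressed as a percentage."""
    return graft_weight_g / (recipient_weight_kg * 1000) * 100

def sfss_risk_factors(graft_weight_g: float, recipient_weight_kg: float,
                      steatosis_pct: float, donor_icu_days: int) -> list[str]:
    """Collect the donor/graft risk factors from the abstract that apply."""
    factors = []
    if grwr(graft_weight_g, recipient_weight_kg) < 0.8:
        factors.append("GRWR < 0.8")
    if steatosis_pct > 30:
        factors.append("fatty infiltration > 30%")
    if donor_icu_days > 5:
        factors.append("donor ICU stay > 5 days")
    return factors

# Example: a 480 g partial graft for a 70 kg recipient gives GRWR ~0.69,
# below the 0.8 threshold.
print(sfss_risk_factors(480, 70, steatosis_pct=10, donor_icu_days=2))
```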


Good practice in plasma collection and fractionation

ISBT SCIENCE SERIES, Issue n1 2010
C. Schärer
The control strategy to ensure the safety of blood products combines measures focused on the quality and safety of the starting material, through careful donor selection and testing strategies at different levels, with validated manufacturing processes, including steps to inactivate or remove potential contaminating agents. An approach based on good manufacturing practice (GMP) provides a manufacturing model that allows for a documented system of incorporating quality throughout the entire manufacturing process, and describes the activities and controls needed to consistently produce products that comply with specifications and are safe for use. There is no doubt that the aim of providing safe, high-quality products to patients should be the same for all products derived from human blood, whether used as blood components for direct transfusion or as industrially manufactured products. It would be difficult to justify good practice standards for blood components and GMP standards for plasma derivatives that did not ensure equivalent levels of quality and safety. To ensure a high level of quality and safety of blood components and plasma derivatives, the implementation of double standards in blood establishments and the fractionation industry would not be effective and should be avoided. Harmonized standards and good practices for collection and fractionation, based on the principles of GMP, should be envisaged across the whole chain of manufacturing blood components and plasma derivatives. Global initiatives to further promote the implementation of harmonized GMP for collection in blood establishments, together with stringent regulatory control, are ongoing; these would further contribute to the global availability of plasma-derived medicinal products. [source]


Comparison of different methods of bacterial detection in blood components

ISBT SCIENCE SERIES, Issue 1 2009
M. Schmidt
Background: Over the last two decades, the residual risk of acquiring a transfusion-transmitted viral infection has been reduced to less than 1:1,000,000 through improvements in several techniques (e.g. donor selection, leucodepletion, and the introduction of third- and fourth-generation enzyme-linked immunosorbent assays and mini-pool nucleic acid testing (MP-NAT)). In contrast, the risk of transfusion-associated bacterial infection has remained fairly stable and is estimated to lie between 1:2000 and 1:3000. Platelets are at especially high risk of bacterial contamination because they are stored at room temperature, which provides good culture conditions for a broad range of bacterial strains. To improve the bacterial safety of blood products, different detection systems have been developed; these can be divided into culture systems such as BacT/ALERT or Pall eBDS, and rapid detection systems such as NAT systems, immunoassays and systems based on the FACS technique. Culture systems are used for routine bacterial screening of platelets in many countries, whereas rapid detection systems have so far mainly been used in experimental spiking studies. In addition, pathogen-reduction systems are currently available for platelet concentrates and plasma, and are under investigation for erythrocytes. Methods: In this review, the functional principles of the different assays are described and discussed with regard to their analytical sensitivity, analytical specificity, diagnostic sensitivity, diagnostic specificity and clinical efficiency. The detection methods were clustered into three groups: (i) detection systems currently used for routine screening of blood products; (ii) experimental detection systems ready for routine screening of blood products; and (iii) new experimental detection systems that need to be investigated in additional spiking studies and clinical trials. Results: A recent International Society of Blood Transfusion international forum reported on bacterial detection methods in 12 countries. Eight countries have implemented BacT/ALERT in blood donor screening, whereas in three countries only quality controls were done by culture methods. In one country, shelf-life was reduced to 3 days, so no bacterial screening was implemented. Screening data from culture methods can be used to investigate the prevalence of bacterial contamination in platelets; differing results between countries could be explained by different test definitions and test strategies. Nevertheless, false-negative results causing severe transfusion-related septic reactions have been reported all over the world, owing to a residual risk of sampling errors. Rapid screening systems such as NAT and FACS assays have improved over the last few years and are now ready to be implemented in routine screening. Non-specific amplification in NAT can be prevented by pre-treatment with Sau3AI, filtration of NAT reagents, or reduction of the number of polymerase chain reaction cycles. FACS systems offer easy, fully automated handling with a processing time of only 5 min, which could be an option for re-testing day-5 platelets. New screening approaches such as immunoassays, detection of bacterial adenosine triphosphate, or detection of esterase activity need to be investigated in additional studies. Conclusion: Bacterial screening of blood products, especially platelets, can be done with a broad range of technologies. The ideal system would detect one colony-forming unit per blood bag without delaying the release process. We are currently far from such an ideal screening system. Pathogen-inactivation systems are available, but a system covering all blood components cannot be expected in the next few years. Therefore, existing culture systems should be complemented by rapid systems such as NAT or FACS, especially for day-5 platelets. [source]
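
The diagnostic sensitivity and specificity on which the review compares assays are standard 2×2 contingency quantities; the sketch below shows the arithmetic with invented counts (the review itself reports no single contingency table).

```python
# Diagnostic sensitivity and specificity from a 2x2 table. The counts
# below are invented for illustration, not data from the review.

def sensitivity(tp: int, fn: int) -> float:
    """Fraction of truly contaminated units the assay flags."""
    return tp / (tp + fn)

def specificity(tn: int, fp: int) -> float:
    """Fraction of sterile units the assay correctly passes."""
    return tn / (tn + fp)

# Hypothetical screen of 10,000 platelet units, 5 of them contaminated.
tp, fn = 4, 1        # contaminated units detected / missed
tn, fp = 9975, 20    # sterile units passed / falsely flagged
print(f"sensitivity = {sensitivity(tp, fn):.2f}")   # 0.80
print(f"specificity = {specificity(tn, fp):.4f}")   # 0.9980
```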


Where will pathogen inactivation have the greatest impact?

ISBT SCIENCE SERIES, Issue 1 2007
T. Hervig
Blood safety has always been a major task in transfusion medicine. A strategy to achieve this aim should include donor education, donor selection, and testing of blood donations. Pathogen inactivation adds another level of safety. In the fractionation industry, pathogen inactivation methods are mandatory. Several countries also use pathogen-inactivated plasma, from pools or single donors. For the cellular blood components, there is still no method available for red cell concentrates, whereas methods for platelet concentrates are available in some countries and others are in the pipeline for commercialization. The efficiency of the 'old' methods of increasing blood safety, and the costs of the pathogen inactivation methods, seem to be the major obstacles to the introduction of these systems. There are also concerns about product quality and loss of volume during the inactivation process. Because pathogen inactivation matters most in countries whose blood donors carry infections that cannot otherwise be protected against, whether owing to a high incidence of infection or to a shortage of tests, cost will be a major question when pathogen inactivation is considered. Pathogen inactivation of red cell concentrates will also be a necessity. When pathogen inactivation methods are available for all blood components, they will have a great impact in protecting patients in countries where a high percentage of the population is infected by agents transmissible through blood transfusion, and in all situations in protecting against new pathogens and 'old' pathogens that become more virulent. The total risk of contracting infectious diseases through blood transfusion will probably be an important consideration when implementation of new pathogen inactivation methods is weighed. [source]


Influence of epidemiological factors on blood transfusion

ISBT SCIENCE SERIES, Issue 1 2007
S. Laperche
The prevalence, incidence and risk factors of infectious diseases observed in the general population directly influence transfusion medicine, especially donor selection; the objective is to ensure blood safety. The characterization of modes of transmission informs donor selection: knowledge of the risk factors of the main blood-borne infections has made it possible to adapt the pre-donation questionnaire in order to exclude at-risk donors. The prevalence of infections also has an impact on the blood screening strategy. For example, anti-HBc antibody (Ab) screening is currently performed only in countries where HBV prevalence is compatible with a reasonable number of donor exclusions. HTLV Ab screening is implemented in countries in which the proportion of donors originating from endemic areas could represent a risk for blood components. Measurement of incidence, which contributes to the residual risk, has led to the introduction of nucleic acid testing (NAT) for HIV, HCV and, in some cases, HBV into the viral screening strategies of many countries worldwide. The observed NAT yield differs according to the incidence of the infection and according to the country. Finally, the possibility of blood-borne transmission of new and emerging pathogens has led to the implementation of specific and non-specific measures to enhance blood safety. Conversely, although the blood donor population is a selected one, data observed in this population have also contributed to a better understanding of the epidemiology and pathogenesis of infection. Moreover, owing to recent progress in developing modelling approaches for estimating risk, we are able to anticipate a transfusion transmission threat by introducing, when necessary, specific measures intended to reduce this risk. [source]
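
The abstract does not name a particular model, but a widely used approach estimates residual risk as the donor incidence rate multiplied by the infectious window period; the sketch below uses illustrative numbers, not values from this article.

```python
# Incidence rate / window-period model for residual transfusion risk,
# one common modelling approach of the kind the abstract alludes to.
# All numbers below are illustrative assumptions, not data from the study.

def residual_risk(incidence_per_100k_py: float, window_days: float) -> float:
    """Estimated risk per donation = incidence x infectious window period."""
    incidence_per_py = incidence_per_100k_py / 100_000   # per person-year
    return incidence_per_py * (window_days / 365.25)     # per donation

# Hypothetical incidence of 1.5 per 100,000 person-years; the shorter NAT
# window illustrates why NAT yield depends on the incidence of infection.
for label, window_days in [("serology (assumed 60-day window)", 60),
                           ("NAT (assumed 7-day window)", 7)]:
    risk = residual_risk(1.5, window_days)
    print(f"{label}: about 1 in {round(1 / risk):,}")
```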


A 3-year analysis of plateletpheresis donor deferral pattern in a tertiary health care institute: Assessing the current donor selection criteria in Indian scenario

JOURNAL OF CLINICAL APHERESIS, Issue 4 2008
Rashmi Tondon
Abstract Introduction: This study reports the frequency and nature of plateletpheresis deferrals and evaluates donors with low platelet counts and hemoglobin levels, so as to assess the possibility of re-entry without compromising donor safety. Materials and methods: Three years of retrospective plateletpheresis deferral data were collected. Data from actual procedures were also reviewed to analyze the safety of performing plateletpheresis in donors with low hemoglobin and platelet values. Results: Of 1,515 donors screened, 416 (27.5%) were deferred for various reasons; 69.7% of the deferrals were due to low platelet count (55.8%) or low hemoglobin levels. Among the low-platelet-count group, 20.3% had a count between 141 and 149 × 10⁹/L and 41.8% below 120 × 10⁹/L. Of the 14% of donors deferred for low hemoglobin, 62.1% had values in the range of 11.5–12.4 g/dL, with normal mean corpuscular volume and red cell distribution width in most (86.2%) of them. Expected blood loss in each procedure varied between 20 and 30 mL, whereas RBC contamination in the product varied from 0 to 1.6 mL in 538 procedures. There were 176 donations (32.7%) with a predonation platelet count <180 × 10⁹/L. None of the 14 procedures performed on donors with a platelet count below 150 × 10⁹/L showed evidence of thrombocytopenia or donor reaction. Conclusion: Lowering the hemoglobin cut-off value for plateletpheresis from 12.5 g/dL to 11.5 g/dL has no deleterious effect on donor safety, as the blood loss is minimal. One-fifth of deferrals could be reconsidered if the hemoglobin and platelet count criteria for plateletpheresis donor selection were relaxed. J. Clin. Apheresis, 2008. © 2008 Wiley-Liss, Inc. [source]
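
The deferral criteria above reduce to two numeric cut-offs, so the proposed relaxation can be shown in a few lines; the sketch below is hypothetical, assuming the 150 × 10⁹/L platelet cut-off implied by the deferral of donors counted at 141–149 × 10⁹/L, and is not code from the study.

```python
# Hypothetical plateletpheresis eligibility check. Cut-off values follow
# the abstract (Hb 12.5 g/dL currently, 11.5 g/dL proposed; platelets in
# units of 10^9/L); function and argument names are illustrative.

def eligible_for_plateletpheresis(hb_g_dl: float, plt_e9_per_l: float,
                                  hb_cutoff: float = 12.5,
                                  plt_cutoff: float = 150.0) -> bool:
    """True if the donor meets both the hemoglobin and platelet cut-offs."""
    return hb_g_dl >= hb_cutoff and plt_e9_per_l >= plt_cutoff

hb, plt = 11.8, 160.0  # an illustrative donor

print(eligible_for_plateletpheresis(hb, plt))                  # False: Hb < 12.5
print(eligible_for_plateletpheresis(hb, plt, hb_cutoff=11.5))  # True under the
                                                               # relaxed cut-off
```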


Adult-to-adult right hepatic lobe living donor liver transplantation

ALIMENTARY PHARMACOLOGY & THERAPEUTICS, Issue 11 2002
P. H. Hayashi
Summary Spurred on by the critical shortage of cadaveric livers, adult-to-adult right hepatic lobe living donor liver transplantation has grown rapidly as a therapeutic option for selected patients. In the USA alone, the number of living donor liver transplantations has increased six-fold in the last 4 years. The therapy can be complex, bringing together a variety of disciplines, including transplantation medicine and surgery, hepatology, psychiatry and medical ethics. Moreover, living donor liver transplantation is still defining itself in the adult-to-adult application. Uniform standards, guidelines and long-term outcomes are yet to be determined. Nevertheless, initial success has been remarkable, and a basic understanding of this field is essential to any physician contemplating options for their liver failure patients. This review covers a range of topics, including recipient and donor selection and outcomes, donor risk, controversies and future issues. [source]


Recipient Outcomes for Expanded Criteria Living Kidney Donors: The Disconnect Between Current Evidence and Practice

AMERICAN JOURNAL OF TRANSPLANTATION, Issue 7 2009
Y. Iordanous
Older individuals and those with medical complexities are undergoing living donor nephrectomy more than ever before. Transplant outcomes for recipients of kidneys from these living expanded criteria donors are largely uncertain. We systematically reviewed studies from 1980 to June 2008 that described transplant outcomes for recipients of kidneys from expanded criteria living donors. Results were organized by the following criteria: older age, obesity, hypertension, reduced glomerular filtration rate (GFR), proteinuria and hematuria. Pairs of reviewers independently evaluated each citation and abstracted data on study and donor characteristics, recipient survival, graft survival, serum creatinine and GFR. Transplant outcomes for recipients of kidneys from older donors (≥60 years) were described in 31 studies. Recipients of kidneys from older donors had poorer 5-year patient and graft survival than recipients of kidneys from younger donors [meta-analysis of 12 studies, 72% vs. 80%, unadjusted relative risk (RR) of survival 0.89, 95% confidence interval (CI) 0.83–0.95]. In meta-regression, this association diminished over time (1980s RR 0.79, 95% CI 0.65–0.96 vs. 1990s RR 0.91, 95% CI 0.85–0.99). Few transplant outcomes were described for the other expanded criteria. This disconnect between donor selection practice and knowledge of recipient outcomes should give transplant decision-makers pause, and it sets an agenda for future research. [source]
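
For readers unfamiliar with the pooled estimate quoted above, the sketch below shows how an unadjusted relative risk and its large-sample 95% CI are computed for a single two-group comparison; the counts are invented (the abstract's RR of 0.89 comes from pooling 12 studies, not from one table).

```python
import math

# Illustrative unadjusted relative-risk calculation for 5-year survival,
# using made-up counts rather than data from the reviewed studies.

surv_old, n_old = 720, 1000      # survivors / recipients, older donors
surv_young, n_young = 800, 1000  # survivors / recipients, younger donors

p_old, p_young = surv_old / n_old, surv_young / n_young
rr = p_old / p_young  # 0.72 / 0.80 = 0.90, close to the pooled 0.89

# Standard large-sample 95% CI, computed on the log scale.
se_log_rr = math.sqrt(1/surv_old - 1/n_old + 1/surv_young - 1/n_young)
lo = math.exp(math.log(rr) - 1.96 * se_log_rr)
hi = math.exp(math.log(rr) + 1.96 * se_log_rr)
print(f"RR = {rr:.2f}, 95% CI {lo:.2f} to {hi:.2f}")
```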


Improving Outcomes of Liver Retransplantation: An Analysis of Trends and the Impact of Hepatitis C Infection

AMERICAN JOURNAL OF TRANSPLANTATION, Issue 2 2008
M. Ghabril
Retransplantation (RT) in hepatitis C virus (HCV)-infected patients remains controversial. Aims: To study trends in RT and evaluate the impact of HCV status in the context of a comprehensive recipient and donor risk assessment. The UNOS database between 1994 and October 2005 was used to analyze 46,982 LT and RT. Graft and patient survival, along with patient and donor characteristics, were compared for 2,283 RT performed in HCV and non-HCV patients during 1994–1997, 1998–2001 and 2002–October 2005. Overall HCV prevalence at RT increased from 36% in the initial period to 40.6% after 2002. In our study group, 1-year patient and graft survival post-RT improved over the same time intervals from 65.0% to 70.7% and from 54.87% to 65.8%, respectively. HCV was associated with decreased patient and graft survival only when the transplant-to-retransplant (LT-RT) interval (RI) exceeded 90 days. Independent predictors of mortality for RT with RI >90 days were patient age, MELD score >25, RI <1 year, warm ischemia time ≥75 min and donor age ≥60 (significant for HCV patients only). Outcomes of RT are improving, but they can be optimized by weighing recipient factors, anticipating operative factors and careful donor selection. [source]
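
The independent predictors listed above can be read as a simple checklist; the sketch below is a hypothetical illustration (field names and structure are mine, and the study reports no combined score).

```python
# Hypothetical checklist of the mortality predictors reported above for
# retransplantation with an interval >90 days. Patient age was a
# continuous predictor with no reported cut-off, so it is not flagged.

from dataclasses import dataclass

@dataclass
class RetransplantCase:
    recipient_age: int
    meld: int
    interval_days: int      # time from first transplant to RT
    warm_ischemia_min: int
    donor_age: int
    hcv_positive: bool

def risk_flags(c: RetransplantCase) -> list[str]:
    """Return the abstract's threshold-based predictors that apply."""
    flags = []
    if c.meld > 25:
        flags.append("MELD > 25")
    if 90 < c.interval_days < 365:
        flags.append("retransplant interval < 1 year")
    if c.warm_ischemia_min >= 75:
        flags.append("warm ischemia >= 75 min")
    if c.donor_age >= 60 and c.hcv_positive:
        flags.append("donor age >= 60 (significant in HCV patients only)")
    return flags

case = RetransplantCase(recipient_age=55, meld=27, interval_days=200,
                        warm_ischemia_min=80, donor_age=62, hcv_positive=True)
print(risk_flags(case))
```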