Term Survival (term + survival)

Selected Abstracts


What is the real gain after liver transplantation?

LIVER TRANSPLANTATION, Issue S2 2009
James Neuberger
Key Points:
1. For most liver allograft recipients, both the quality and length of life are greatly improved after transplantation. However, neither the quality of life nor the length of life in the survivors returns to that seen in age-matched and sex-matched normal subjects.
2. The gain in survival after transplantation can be estimated by comparing the actual outcome after transplantation with the predicted survival in the absence of transplantation.
3. The reduction in graft and patient survival, in comparison with a normal age-matched and sex-matched population, is determined by several factors: short-term survival is affected by the patient's pre-transplant condition and the quality of the graft, while for longer-term survival, recurrent disease accounts for most of the differences seen between indications. Some of the causes of premature death that are increased in the liver allograft recipient (such as infection, de novo malignancy, and cardiovascular and cerebrovascular disease) may be reduced by improved management, with more aggressive surveillance and treatment.
4. The aims of selection and allocation vary across health care systems: transparency, objectivity, equity of access, justice, mortality while awaiting transplantation, utility, and transplant benefit are all important but often competing demands. Understanding the associated increase in survival will allow a rational approach to this complex area.
Liver Transpl 15:S1-S5, 2009. © 2009 AASLD. [source]
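Point 2 above treats the gain from transplantation as the difference between observed post-transplant survival and the survival predicted without a graft. As a loose illustration of that comparison (not the author's method), the sketch below estimates the gain as the area between two survival curves, i.e. a restricted-mean-survival-time difference; all curve values are invented for the example.

```python
def survival_gain(times, s_transplant, s_no_transplant):
    """Trapezoidal area between two survival curves.

    Returns the estimated gain in life expectancy over the observed
    window, in the same time unit as `times` (here, years).
    """
    gain = 0.0
    for i in range(1, len(times)):
        dt = times[i] - times[i - 1]
        diff_prev = s_transplant[i - 1] - s_no_transplant[i - 1]
        diff_curr = s_transplant[i] - s_no_transplant[i]
        gain += 0.5 * (diff_prev + diff_curr) * dt
    return gain

# Hypothetical 5-year curves sampled yearly (illustrative numbers only):
times = [0, 1, 2, 3, 4, 5]
s_tx = [1.00, 0.85, 0.80, 0.76, 0.72, 0.68]  # observed post-transplant survival
s_no = [1.00, 0.55, 0.35, 0.22, 0.14, 0.09]  # predicted survival without transplant
print(f"Estimated gain: {survival_gain(times, s_tx, s_no):.2f} life-years over 5 years")
```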


Around the world with the model for end-stage liver disease

LIVER TRANSPLANTATION, Issue 10 2003
Richard B. Freeman Jr MD
Background: Indices for predicting survival are essential for assessing prognosis and assigning priority for liver transplantation in patients with liver cirrhosis. The model for end-stage liver disease (MELD) has been proposed as a tool to predict mortality risk in cirrhotic patients. However, this model has not been validated beyond its original setting. Aim: To evaluate the short- and medium-term survival prognosis of a European series of cirrhotic patients by means of MELD compared with the Child-Pugh score. We also assessed correlations between the MELD scoring system and the degree of impairment of liver function, as evaluated by the monoethylglycinexylidide (MEGX) test. Patients and methods: We retrospectively evaluated survival of a cohort of 129 cirrhotic patients with a follow-up period of at least one year. The Child-Pugh score was calculated and the MELD score was computed according to the original formula for each patient. All patients had undergone a MEGX test. Multivariate analysis was performed on all variables to identify the parameters independently associated with one-year and six-month survival. MELD values were correlated with both Child-Pugh scores and MEGX test results. Results: Thirty-one patients died within the first year of follow-up. Child-Pugh and MELD scores and MEGX serum levels differed significantly between patients who survived and those who died. Serum creatinine, international normalized ratio, and MEGX60 were independently associated with six-month mortality, while the same variables and the presence of ascites were associated with one-year mortality. MELD scores showed significant correlations with both MEGX values and Child-Pugh scores. Conclusions: In a European series of cirrhotic patients, the MELD score is an excellent predictor of both short- and medium-term survival and performs at least as well as the Child-Pugh score. An increase in MELD score is associated with a decrease in residual liver function. [source]
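The study computes MELD "according to the original formula" without reproducing it. For reference, a minimal sketch of the widely published original formula (Kamath et al., 2001) follows; the 1.0 floor applied to each input is the standard convention for avoiding negative logarithms, stated here as an assumption rather than a detail taken from this abstract.

```python
import math

def meld_score(creatinine_mg_dl: float, bilirubin_mg_dl: float, inr: float) -> float:
    """Original MELD formula: 9.57*ln(creatinine) + 3.78*ln(bilirubin)
    + 11.2*ln(INR) + 6.43, with each input floored at 1.0 by convention."""
    creatinine = max(creatinine_mg_dl, 1.0)
    bilirubin = max(bilirubin_mg_dl, 1.0)
    inr = max(inr, 1.0)
    return (9.57 * math.log(creatinine)
            + 3.78 * math.log(bilirubin)
            + 11.2 * math.log(inr)
            + 6.43)

# Example: creatinine 1.8 mg/dL, bilirubin 3.2 mg/dL, INR 1.6 -> ~21.7
print(round(meld_score(1.8, 3.2, 1.6), 1))
```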


The Campylobacter jejuni stringent response controls specific stress survival and virulence-associated phenotypes

MOLECULAR MICROBIOLOGY, Issue 1 2005
Erin C. Gaynor
Summary Campylobacter jejuni is a highly prevalent food-borne pathogen that causes diarrhoeal disease in humans. A natural zoonosis, it must overcome significant stresses both in vivo and during transmission, despite the absence of several traditional stress response genes. Although relatively little is understood about its mechanisms of pathogenesis, its ability to interact with and invade human intestinal epithelial cells closely correlates with virulence. A C. jejuni microarray-based screen revealed that several known virulence genes and several uncharacterized genes, including spoT, were rapidly upregulated during infection of human epithelial cells. spoT and its homologue relA have been shown in other bacteria to regulate the stringent response, an important stress response that to date had not been demonstrated for C. jejuni or any other epsilon-proteobacterium. We have found that C. jejuni mounts a stringent response that is regulated by spoT. Detailed analyses of a C. jejuni ΔspoT mutant revealed that the stringent response is required for several specific stress-, transmission- and antibiotic resistance-related phenotypes. These include stationary-phase survival, growth and survival under low-CO2/high-O2 conditions, and rifampicin resistance. A secondary suppressor strain that specifically rescues the low-CO2 growth defect of the ΔspoT mutant was also isolated. The stringent response additionally proved to be required for the virulence-related phenotypes of adherence, invasion, and intracellular survival in two human epithelial cell culture models of infection; spoT is the first C. jejuni gene shown to participate in longer-term survival in epithelial cells. Microarray analyses comparing the wild type to the ΔspoT mutant also revealed a strong correlation between the gene expression profiles and the phenotype differences observed. Together, these data demonstrate a critical role for the C. jejuni stringent response in multiple aspects of C. jejuni biology and pathogenesis and, further, may lend novel insight into unexplored features of the stringent response in other prokaryotic organisms. [source]


First percutaneous transcatheter aortic valve-in-valve implant with three year follow-up

CATHETERIZATION AND CARDIOVASCULAR INTERVENTIONS, Issue 2 2008
Carlos E. Ruiz MD, FSCAI
Abstract Objectives: This study was conducted to report the clinical, hemodynamic, and imaging outcomes of the longest survivor of the global CoreValve experience. Background: Early results of percutaneous heart valve (PHV) implantation for severe symptomatic aortic stenosis (AS) have been encouraging, with mid-term survival up to 2 years; however, longer-term durability is unknown. Although a PHV has been implanted in a degenerated surgical bioprosthesis, the feasibility of a PHV-in-PHV has not been demonstrated. Methods: A patient with severe refractory heart failure due to severe aortic regurgitation (AR) and moderate AS underwent CoreValve prosthesis implantation. The PHV was deployed too proximally into the left ventricular outflow tract, resulting in severe AR through the frame struts. Using the first PHV as a landmark, a second CoreValve was then deployed slightly distal to the first, with trivial residual paravalvular leak. Results: The second CoreValve expanded well with proper function. The transvalvular gradient was 8 mmHg. Both coronary ostia were patent. New mild-to-moderate mitral regurgitation occurred due to impingement of the anterior mitral leaflet by the first PHV. NYHA functional class improved from IV to II and has been maintained over the past 3 years. Echocardiography at 3 years showed normally functioning CoreValve-in-CoreValve prostheses, without AR or paravalvular leaks. The transvalvular gradient was 10 mmHg. Cardiac CT showed stable valve-in-valve prostheses with no migration. Conclusion: The CoreValve prosthesis has maintained proper function for up to 3 years, with no structural deterioration or migration. Treating mixed aortic valve disease with predominant AR is feasible. The concept, as well as the durability, of the first PHV-in-PHV has also been demonstrated. © 2008 Wiley-Liss, Inc. [source]


The taming of the shrew? The immunology of corneal transplantation

ACTA OPHTHALMOLOGICA, Issue 5 2009
Abstract. Corneal transplantation, first reported a century ago, is the oldest and most frequent form of solid tissue transplantation. Although keratoplasty is also considered the most successful transplant procedure, several studies indicate that the long-term survival of corneal grafts is even lower than that of transplanted parenchymatous organs. Despite the immune privilege enjoyed by the cornea and the anterior segment of the eye, immunologic graft rejection remains a major limitation to corneal transplantation. This review gives an update on corneal immunobiology and the mechanisms of corneal graft rejection, focusing on antigen presentation as well as on the molecular and cellular mediators of this particular immune response. [source]