Early Mortality



Selected Abstracts


Valve Replacement Surgery Complicated by Acute Renal Failure: Predictors of Early Mortality

JOURNAL OF CARDIAC SURGERY, Issue 2 2006
Sheldon Tobe M.D., F.R.C.P.(C)
No abstract is available for this article. [source]


Predictors of Early Mortality in Patients Age 80 and Older Receiving Implantable Defibrillators

PACING AND CLINICAL ELECTROPHYSIOLOGY, Issue 8 2010
DREW ERTEL M.D.
Background: There are no upper age restrictions for implantable defibrillators (ICDs) but their benefit may be limited in patients ≥80 years with strong competing risks of early mortality. Risk factors for early (1-year) mortality in ICD recipients ≥80 years of age have not been established. Methods: Two-center retrospective cohort study to assess predictors of one-year mortality in ICD recipients ≥80 years of age. Results: Of 2,967 ICDs implanted in the two centers from 1990–2006, 225 (7.6%) patients were ≥80 years of age and followed-up at one of the two centers. Mean age was 83.3 ± 3.1 years and follow-up time 3.3 ± 2.6 years. Median survival was 3.6 years (95% confidence interval 2.3–4.9). Multivariate predictors of 1-year mortality included ejection fraction (EF) ≤20% and the absence of beta-blocker use. Actuarial 1-year mortality of ICD recipients ≥80 with an EF ≤20% was 38.2% versus 13.1% in patients 80+ years with an EF > 20% and 10.6% for patients < 80 years with an EF ≤20% (P < 0.001 for both). There was no significant difference in the risk of appropriate ICD therapy between those patients 80+ years with EF above and below 20%. Conclusion: In general, patients ≥80 years of age who meet current indications for ICD implantation live sufficiently long to warrant device implantation based on anticipated survival alone. However, those with an EF ≤20% have a markedly elevated 1-year mortality with no observed increase in appropriate ICD therapy, thus reducing the benefit of device implantation in this population. (PACE 2010; 981–987) [source]
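The actuarial 1-year mortality figures above come from survival (Kaplan–Meier-type) estimates of follow-up data. As a purely illustrative aid, here is a minimal product-limit sketch in Python; the follow-up times, function name, and numbers are invented, not the study's data.

```python
import numpy as np

def km_survival_at(t_eval, time, event):
    """Kaplan-Meier (product-limit) survival estimate at time t_eval.

    time  : follow-up time in years for each patient
    event : 1 if the patient died at `time`, 0 if censored
    """
    time = np.asarray(time, dtype=float)
    event = np.asarray(event, dtype=int)
    surv = 1.0
    for t in np.sort(np.unique(time[event == 1])):
        if t > t_eval:
            break
        at_risk = np.sum(time >= t)          # patients still under observation at t
        deaths = np.sum((time == t) & (event == 1))
        surv *= 1.0 - deaths / at_risk       # product-limit step
    return surv

# Hypothetical follow-up data for a small group of ICD recipients
follow_up_years = [0.2, 0.5, 0.8, 1.4, 2.0, 3.1, 0.3, 4.2]
died            = [1,   1,   0,   1,   0,   1,   1,   0]

one_year_mortality = 1.0 - km_survival_at(1.0, follow_up_years, died)
print(f"Actuarial 1-year mortality: {one_year_mortality:.1%}")
```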


Surgical Ablation of Atrial Fibrillation: The Columbia Presbyterian Experience

JOURNAL OF CARDIAC SURGERY, Issue 5 2006
Veli K. Topkara M.D.
However, it is not widely applied due to its complexity, increased operative times, and the risk of bleeding. Various energy sources have been introduced to simplify the traditional "cut and sew" approach. Methods: This study involves patients undergoing surgical atrial fibrillation ablation (SAFA) at a single institution from 1999 to 2005. Type of concomitant procedures, preoperative clinical characteristics, and chronicity of AF were evaluated in overall patient population. Parameters including surgical approach, lesion pattern, and energy source used were collected intraoperatively. Clinical outcomes examined were postoperative rhythm success, stroke, early mortality, and long-term survival. Results: Three hundred thirty-nine patients were identified. Three hundred twenty-eight (96.8%) patients had associated cardiac disease and underwent concomitant procedures; 75.8% of patients had persistent AF. Energy sources used were microwave (49.8%), radiofrequency (42.2%), and laser (8.0%). In 41.9% of cases a pulmonary vein encircling lesion was the only lesion created. Combination lesion sets were performed in the remaining cases. Rhythm success rates at 3, 6, 12, and 24 months were 74.1%, 68.2%, 74.5%, and 71.1%, respectively. Patients who underwent surgical removal of left atrial appendage by means of stapling or simple excision had no early postoperative stroke. Early mortality was 4.9%. Postoperative survival rates at 1, 3, and 5 years were 89.6%, 83.1%, and 78.0%. Conclusions: Surgical ablation of atrial fibrillation is a safe and effective procedure in restoring sinus rhythm with excellent postoperative survival rates. Further advancements in the field will eventually result in minimally invasive procedures with higher success rates. [source]


Effect of Donor Age on Long-Term Survival Following Cardiac Transplantation

JOURNAL OF CARDIAC SURGERY, Issue 2 2006
Veli K. Topkara M.D.
Our objective was to analyze the effect of donor age on outcomes after cardiac transplantation. Methods: We retrospectively studied 864 patients who underwent cardiac transplantation at New York Presbyterian Hospital – Columbia University between 1992 and 2002. Patients were divided into two groups: donor age <40 years (Group A, n = 600) and donor age ≥40 years (Group B, n = 264). Results: Characteristics including gender, body mass index, and cytomegalovirus (CMV) status were significantly different between the two donor age groups. Race, CMV status, toxoplasmosis status, left ventricular assist device prior to transplant, diabetes mellitus, and retransplantation were similar in both the recipient groups, while age, gender, and BMI were different. Early mortality was lower in Group A, 5%, versus 9.5% in Group B. Multivariate analysis revealed recipient female gender (odds ratio (OR) = 1.71), retransplantation (OR = 1.63), and increased donor age (OR = 1.02) as significant predictors of poor survival in the recipient population. Actuarial survival at 1 year (86.7% vs 81%), 5 years (75% vs 65%), and 10 years (56% vs 42%) was significantly different as well, with a log-rank p = 0.002. Conclusions: These findings suggest that increased donor age is an independent predictor of poorer long-term survival. However, the shortage of organs makes it difficult to follow strict guidelines when placing hearts; therefore, decisions need to be made on a relative basis. [source]
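The odds ratios quoted above are the exponentiated coefficients of a multivariable logistic regression. The sketch below, using statsmodels on fabricated data (all variable names and values are ours, not the study's dataset), shows how such odds ratios are typically derived.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500

# Fabricated covariates for illustration only (not the study's dataset)
df = pd.DataFrame({
    "female_recipient": rng.integers(0, 2, n),
    "retransplant": rng.integers(0, 2, n),
    "donor_age": rng.integers(15, 65, n),
})
# Fabricated outcome: probability of death rises with each covariate
linpred = -2.5 + 0.5 * df["female_recipient"] + 0.5 * df["retransplant"] + 0.02 * df["donor_age"]
df["died"] = rng.binomial(1, 1.0 / (1.0 + np.exp(-linpred)))

X = sm.add_constant(df[["female_recipient", "retransplant", "donor_age"]])
fit = sm.Logit(df["died"], X).fit(disp=False)

# Exponentiated coefficients are the odds ratios reported in such analyses
print(np.exp(fit.params).round(2))
```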


Recent Trends in Early Outcome of Adult Patients after Heart Transplantation: A Single-institution Review of 251 Transplants Using Standard Donor Organs

AMERICAN JOURNAL OF TRANSPLANTATION, Issue 6 2002
Feng-Chun Tsai
Older age, prior transplantation, pulmonary hypertension, and mechanical support are commonly seen in current potential cardiac transplant recipients. Transplants in 436 consecutive adult patients from 1994 to 1999 were reviewed. There were 251 using standard donors in 243 patients (age range 18–69 years). To emphasize recipient risk, 185 patients who received a nonstandard donor were excluded from analysis. The indications for transplant were ischemic heart disease (n = 123, 47%), dilated cardiomyopathy (n = 82, 32%), and others (n = 56, 21%). One hundred and forty-nine (57%) recipients were listed as status I; 5% and 6% were supported with an intra-aortic balloon and an assist device, respectively. The 30-day survival and survival to discharge were 94.7% and 92.7%, respectively; 1-year survival was 89.1%. Causes of early death were graft failure (n = 6), infection (n = 4), stroke (n = 4), multiorgan failure (n = 3) and rejection (n = 2). Predictors of early death were balloon pump use alone (OR = 11.4, p = 0.002), pulmonary vascular resistance > 4 Wood units (OR = 5.7, p = 0.007), pretransplant creatinine > 2.0 mg/dL (OR = 6.9, p = 0.004) and female donor (OR = 8.3, p = 0.002). Recipient age and previous surgery did not affect short-term survival. Heart transplantation in the current era consistently offers excellent early and 1-year survival for well-selected recipients receiving standard donors. Early mortality tends to reflect graft failure while hospital mortality may be more indicative of recipient selection. [source]


Plasma Exchange Before Surgery for Left Ventricular Assist Device Implantation

ARTIFICIAL ORGANS, Issue 6 2008
Rajko Radovancevic
Abstract: Left ventricular assist device (LVAD) implantation in end-stage heart failure patients is frequently associated with hemorrhagic complications requiring reoperation. The preoperative coagulopathic profile includes prolonged prothrombin time (PT), partial thromboplastin time (PTT), and bleeding time; platelet dysfunction; decreased coagulation factor activity; and increased inflammatory markers. We compare outcomes in LVAD patients treated with preoperative plasma exchange with concurrent, nonrandomized control patients. We reviewed data from 68 consecutive elective patients who received LVADs at our institution. Thirty-five received LVADs after preoperative plasma exchange (replacement of one plasma volume of fresh frozen plasma), and 33 received LVADs without plasma exchange. Groups were comparable in age, sex, body weight, New York Heart Association class, intra-aortic balloon pump insertion, cardiac index, pulmonary capillary wedge pressure, creatinine, total bilirubin, hemoglobin levels, PT, international normalized ratio, PTT, and platelet count. Early mortality was lower in the plasma exchange group (0% [0/35] vs. 18% [6/33], P = 0.026), and postoperative chest tube drainage decreased by 33% (P = not significant). Blood transfusion requirements were similar. Perioperative mortality decreased in patients treated with plasma exchange before LVAD implantation. [source]


CT15 RISK STRATIFICATION MODELS FOR HEART VALVE SURGERY

ANZ JOURNAL OF SURGERY, Issue 2007
C. H. Yap
Purpose Risk stratification models may be useful in aiding surgical decision-making, preoperative informed consent, quality assurance and healthcare management. While several overseas models exist, no model has been well-validated for use in Australia. We aimed to assess the performance of two valve surgery risk stratification models in an Australian patient cohort. Method The Society of Cardiothoracic Surgeons of Great Britain and Ireland (SCTS) and Northern New England (NNE) models were applied to all patients undergoing valvular heart surgery at St Vincent's Hospital Melbourne and The Geelong Hospital between June 2001 and November 2006. Observed and predicted early mortalities were compared using the chi-square test. Model discrimination was assessed by the area under the receiver operating characteristic (ROC) curve. Model calibration was tested by applying the chi-square test to risk tertiles. Results The SCTS model (n = 1095) performed well: observed mortality was 4.84%, expected mortality 6.64% (chi-square p = 0.20). Model discrimination (area under ROC curve 0.835) and calibration were good (chi-square p = 0.9). The NNE model (n = 1015) over-predicted mortality: observed mortality was 4.83%, expected 7.54% (chi-square p < 0.02). Model discrimination (area under ROC curve 0.835) and calibration were good (chi-square p = 0.9). Conclusion Both models showed good discrimination and calibration. The NNE model over-predicted early mortality whilst the SCTS model performed well in our cohort of patients. The SCTS model may be suitable for risk stratification in Australia. [source]
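Discrimination (area under the ROC curve) and calibration (observed versus expected deaths across risk tertiles) as used in this abstract can be computed roughly as follows; this is a generic sketch with simulated data, not the SCTS or NNE model code.

```python
import numpy as np
from scipy import stats
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
n = 1000
predicted_risk = rng.uniform(0.01, 0.25, n)   # model-predicted early mortality (simulated)
died = rng.binomial(1, predicted_risk)        # simulated observed outcomes

# Discrimination: area under the ROC curve
auc = roc_auc_score(died, predicted_risk)

# Calibration: chi-square of observed vs expected deaths within risk tertiles
tertile = np.digitize(predicted_risk, np.quantile(predicted_risk, [1 / 3, 2 / 3]))
observed = np.array([died[tertile == t].sum() for t in range(3)])
expected = np.array([predicted_risk[tertile == t].sum() for t in range(3)])
chi2 = ((observed - expected) ** 2 / expected).sum()
p_value = stats.chi2.sf(chi2, df=2)

print(f"AUC = {auc:.3f}, calibration chi-square p = {p_value:.2f}")
```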


Life expectancy among people with cerebral palsy in Western Australia

DEVELOPMENTAL MEDICINE & CHILD NEUROLOGY, Issue 8 2001
E Blair PhD
This report describes trends, predictors, and causes of mortality in persons with cerebral palsy (CP) using individuals identified by the Western Australian Cerebral Palsy Register and born between 1958 and 1994. Two thousand and fourteen people were identified (1154 males, 860 females), of whom 225 had died by 1 June 1997. Using date-of-death data, crude and standardized mortality rates were estimated and predictors of mortality sought using survival analysis stratified by decade of birth, description of impairments, and demographic and perinatal variables. For those born after 1967, the cause of death profile was examined over time. Mortality exceeded 1% per annum in the first 5 years and declined until age 15 years, after which it remained steady at about 0.35% for the next 20 years. The strongest single predictor was intellectual disability, but all forms of disability contributed to decreased life expectancy. Half of those with IQ/DQ score <20 survived to adulthood, increasing to 76% with IQ/DQ score 20–34, and exceeding 92% for higher scores. Severe motor impairment primarily increased the risk of early mortality: although there were 72 persons aged 25 to 41 years with severe motor impairment in our data set, none had died after the age of 25 years. Infants born at more than 32 weeks' gestation were at significantly higher risk of mortality than very preterm infants, accounted for by their higher rates of intellectual disability. No improvements in survival of persons with CP were seen over the study period despite advances in medical care, improved community awareness, and the increasing proportion of very preterm births among people with CP. This may be the result of improved neonatal care enabling the survival of infants with increasingly severe disabilities. [source]
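The crude and standardized mortality comparisons mentioned above are commonly summarized as a standardized mortality ratio (observed deaths divided by the deaths expected from age-specific reference rates). The sketch below is illustrative only; the person-years and reference rates are invented.

```python
def standardized_mortality_ratio(observed_deaths, person_years_by_age, reference_rates):
    """SMR = observed deaths / expected deaths, where expected deaths are obtained
    by applying age-specific reference mortality rates to the cohort's person-years."""
    expected = sum(py * rate for py, rate in zip(person_years_by_age, reference_rates))
    return observed_deaths / expected

# Invented person-years in three age bands and matching reference rates (per person-year)
smr = standardized_mortality_ratio(
    observed_deaths=225,
    person_years_by_age=[20_000, 25_000, 15_000],
    reference_rates=[0.0008, 0.0004, 0.0006],
)
print(f"SMR = {smr:.1f}")
```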


Abdominal compartment syndrome: a new indication for operative intervention in severe acute pancreatitis

INTERNATIONAL JOURNAL OF CLINICAL PRACTICE, Issue 12 2005
K. Wong
Summary The current management of severe acute pancreatitis (SAP) is maximal conservative therapy within an intensive care environment. The only commonly accepted indication for operative intervention is the presence of infected pancreatic necrosis. We present a case in which a laparotomy, performed to treat abdominal compartment syndrome (ACS) arising in the setting of SAP without pancreatic necrosis, prevented early mortality, and we discuss the diagnosis and treatment of ACS as a new indication for operative intervention in SAP. [source]


High incidence of rheumatic fever and rheumatic heart disease in the republics of Central Asia

INTERNATIONAL JOURNAL OF RHEUMATIC DISEASES, Issue 2 2009
Nazgul A. OMURZAKOVA
Abstract The epidemiological situation involving rheumatic fever (RF) and rheumatic heart disease (RHD) not only remains unresolved but is also a cause of serious concern due to the rapid increase in the incidence of RF/RHD in many developing countries. After the collapse of the Soviet Union, the republics of Central Asia experienced an economic decline that directly affected the public health sector of this region. This is the main cause of the high prevalence of many infectious diseases in Central Asia, including streptococcal tonsillopharyngitis, which carries the risk of complications such as RF. The difficulty involved in early diagnosis of RF and the development of RHD among children and adolescents causes early mortality and sudden death, leading to economic damage in these countries due to the loss of the young working population. Among all the developing countries, Kyrgyzstan, which is located in the heart of Central Asia, has the highest prevalence of RF/RHD. The increase in the prevalence of RF in Central Asia can be attributed to factors such as the low standard of living and changes in the virulence of streptococci and their sensitivity to antibiotics. [source]


Perioperative Results of the Aortic Root Replacement in Strict Graft Inclusion Technique

JOURNAL OF CARDIAC SURGERY, Issue 5 2008
Niyazi Cebi M.D.
Therefore, the strict graft inclusion technique has been developed to avoid major complications. We present the early results after aortic root replacement in strict graft inclusion technique. Materials and Methods: The strict graft inclusion technique was performed in 28 patients between April 2001 and June 2006 in St-Johannes-Hospital-Dortmund, Dortmund, Germany. There were nine female and 19 male patients. The mean age was 57.78 ± 12.01 years (28 to 77 years). The indications for operation were type A aortic dissection and ascending aortic aneurysm with aortic valve lesion. Results: There was no early mortality and no postoperative rethoracotomy. The mean postoperative bleeding over mediastinal drains was 565 ± 310 mL (100–2250 mL). Excluding the patients with preoperative double antiplatelet (thrombocyte aggregation inhibitor) therapy and postoperative consumption coagulopathy, the mean postoperative bleeding over mediastinal drains was 443.04 ± 171.59 mL (100–1100 mL) in the first 24 hours and the transfusion requirement was minimal: a mean of 0.39 ± 0.64 packed red blood cells (RBC) (0–4) and a mean of 0.14 ± 0.27 packed fresh frozen plasma (FFP) (0–4); in 18 (78.26%) of these 23 patients no transfusion was necessary. For the cohort overall, the intraoperative and postoperative requirement was a mean of 1 ± 1.28 packed RBC (0–5) and a mean of 1.21 ± 1.90 packed FFP (0–12). Conclusions: The strict graft inclusion technique for aortic root replacement represents a safe and feasible method to avoid bleeding from coronary ostial anastomoses, from aortic annular suture lines, and annular leak. [source]


Experience with over 1000 Implanted Ventricular Assist Devices

JOURNAL OF CARDIAC SURGERY, Issue 3 2008
Evgenij V. Potapov M.D.
We present our experience since 1987. Subjects and Methods: Between July 1987 and December 2006, 1026 VADs were implanted in 970 patients. Most of them were men (81.9%). The indications were: cardiomyopathy (n = 708), postcardiotomy heart failure (n = 173), acute myocardial infarction (n = 36), acute graft failure (n = 45), a VAD problem (n = 6), and others (n = 2). Mean age was 46.1 (range 3 days to 78) years. In 50.5% of the patients the VAD implanted was left ventricular, in 47.9% biventricular, and in 1.5% right ventricular. There were 14 different types of VAD. A total artificial heart was implanted in 14 patients. Results: Survival analysis showed higher early mortality (p < 0.05) in the postcardiotomy group (50.9%) than in patients with preoperative profound cardiogenic shock (31.1%) and patients with preoperative end-stage heart failure without severe shock (28.9%). A total of 270 patients were successfully bridged to heart transplantation (HTx). There were no significant differences in long-term survival after HTx among patients with and without previous VAD. In 76 patients the device could be explanted after myocardial recovery. In 72 patients the aim of implantation was permanent support. During the study period 114 patients were discharged home. Currently, 54 patients are on a device. Conclusions: VAD implantation may lead to recovery from secondary organ failure. Patients should be considered for VAD implantation before profound, possibly irreversible, cardiogenic shock occurs. In patients with postcardiotomy heart failure, a more efficient algorithm should be developed to improve survival. With increased experience, more VAD patients can participate in out-patient programs. [source]


Does Aortic Root Enlargement Impair the Outcome of Patients With Small Aortic Root?

JOURNAL OF CARDIAC SURGERY, Issue 5 2006
Hasan Ardal
The aim of this study was to evaluate long-term results of the posterior root enlargement. Methods: Between 1985 and 2002, 124 patients underwent aortic valve replacement with a posterior root enlargement. The main indication was a small ratio of aortic valve orifice area to patient body surface area (indexed valve area < 0.85 cm2/m2). Fifty-four (44%) patients were male, and 70 (56%) were female, with a mean age of 39.1 ± 14.3 years. Indications for operation were severe calcified aortic valve stenosis (37.1%), severe aortic insufficiency (25.8%), or a combination (37.1%). Seventy-five (60%) patients received double-valve replacement. A pericardial patch was used in 100 patients (80.6%) and a Dacron patch was used in 24 patients. Results: Operative mortality was 6.4% (8 patients). The causes of hospital mortality were low cardiac output syndrome (LCOS) (6 patients), cerebrovascular events (1 patient) and multiple organ failure (1 patient). Multivariate analysis demonstrated concomitant coronary revascularization to be a significant (p = 0.03) predictor of early mortality. There were six (5.4%) late deaths. Cox proportional hazards regression analysis demonstrated LCOS (p = 0.013) and infective endocarditis (p = 0.003) to be significant predictors of late mortality. Atrioventricular block requiring a permanent pacemaker was observed in 4 patients (3.2%). Conclusions: Posterior aortic root enlargement techniques can be easily applied without additional risks. Long-term survival and freedom from valve-related complications are satisfactory. [source]


Early and Late Results of Partial Left Ventriculectomy: Single Center Experience and Review of the Literature

JOURNAL OF CARDIAC SURGERY, Issue 3 2003
Raimondo Ascione M.D.
Methods: From February 1996 to August 2001, 24 patients with dilated cardiomyopathy (DCM) (12 idiopathic, 12 ischemic) underwent PLV. Perioperative and follow-up data were prospectively entered into a database and analyzed. An observational analysis of the literature was carried out of all the published series of PLV reporting on ≥15 patients. Results: In our series there were 22 males, with a mean age of 65 years (range 49 to 73). There were 3 (12.5%) in-hospital deaths. Mean duration of follow-up was 26 months (range 3 to 71) with 9 late deaths (38%), 6 in the idiopathic group. The five-year actuarial survival was 74% in the ischemic group and 33% in the idiopathic group. The observational analysis of the literature included a total of 506 patients (425 males, age 50.2 ± 5.2 years). The etiology was idiopathic in 255 (50.4%) and ischemic in 89 (17.6%) patients. Baseline characteristics of the whole population included: ejection fraction 18.9 ± 3.9%, NYHA functional class 3.7 ± 0.2, and LVEDD of 7.7 ± 0.4 cm. Severe mitral regurgitation was present in 368 (72.7%) patients. There were 88 (17.4%) in-hospital deaths. Causes of death included 55 (62.5%) due to low cardiac output, 10 (11.3%) due to severe bleeding, 7 (7.95%) caused by malignant arrhythmias, 8 (9%) due to sepsis, and 5 (5.7%) as a result of stroke. Ten of the selected series (overall 386 patients) reported late outcome. There were 89 (22.9%) late deaths, of which 12 (13.5%) were not cardiac-related, 50 (56.2%) were due to recurrence of congestive heart failure (CHF), 20 (22.5%) were caused by sudden arrhythmias, 5 (5.6%) were due to infections, and 2 (2.2%) resulted from strokes. Overall, there were 248 (64.2%) survivors, of whom 179 (72.17%) were reported to be in NYHA functional class I or II. All 10 papers reported one-year survival, ranging from 50% to 85%. Seven reported a two-year survival of 45% to 72%, and 4 reported a three-year survival of 33% to 64%. Conclusions: Our results and the review of the literature seem to suggest a relatively high early mortality with satisfactory late results of PLV in patients with dilated cardiomyopathy. (J Card Surg 2003;18:190-196) [source]


Effects of dietary restriction on mortality and age-related phenotypes in the short-lived fish Nothobranchius furzeri

AGING CELL, Issue 2 2009
Eva Terzibasi
Summary The short-lived annual fish Nothobranchius furzeri shows extremely short captive life span and accelerated expression of age markers, making it an interesting model system to investigate the effects of experimental manipulations on longevity and age-related pathologies. Here, we tested the effects of dietary restriction (DR) on mortality and age-related markers in N. furzeri. DR was induced by every other day feeding and the treatment was performed both in an inbred laboratory line and a longer-lived wild-derived line. In the inbred laboratory line, DR reduced age-related risk and prolonged maximum life span. In the wild-derived line, DR induced early mortality, did not reduce general age-related risk and caused a small but significant extension of maximum life span. Analysis of age-dependent mortality revealed that DR reduced demographic rate of aging, but increased baseline mortality in the wild-derived strain. In both inbred- and wild-derived lines, DR prevented the expression of the age markers lipofuscin in the liver and Fluoro-Jade B (neurodegeneration) in the brain. DR also improved performance in a learning test based on conditioning (active avoidance in a shuttle box). Finally, DR induced a paradoxical up-regulation of glial fibrillary acidic protein in the brain. [source]


Rotterdam score predicts early mortality in Budd-Chiari syndrome, and surgical shunting prolongs transplant-free survival

ALIMENTARY PHARMACOLOGY & THERAPEUTICS, Issue 10 2009
A. J. MONTANO-LOZA
Summary Background: Budd–Chiari syndrome carries significant mortality, but factors predicting this outcome are uncertain. Aim: To determine factors associated with 3-month mortality and compare outcomes after surgical shunting or liver transplantation. Methods: From 1985 to 2008, 51 patients with Budd–Chiari syndrome were identified. Results: By logistic regression analysis, features associated with higher risk of 3-month mortality were Rotterdam class III, Clichy >6.6, model for end-stage liver disease (MELD) >20 and Child–Pugh C. Rotterdam class III had the best performance to discriminate 3-month mortality with sensitivity of 0.89 and specificity of 0.63, whereas Clichy >6.60 had sensitivity of 0.78 and specificity of 0.69; MELD >20 had sensitivity of 0.78 and specificity of 0.75 and Child–Pugh C had sensitivity of 0.67 and specificity of 0.72. Eighteen patients underwent surgical shunts and 14 received liver transplantation, with no significant difference in survival (median survival 10 ± 3 vs. 8 ± 2 years; log-rank, P = 0.9). Conclusions: Rotterdam score is the best discrimination index for 3-month mortality in Budd–Chiari syndrome and should be used preferentially to determine treatment urgency. Surgical shunts constitute an important therapeutic modality that may help save liver grafts and prolong transplantation-free survival in a selected group of patients with Budd–Chiari syndrome. [source]
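The sensitivity and specificity values used to compare Rotterdam class III, Clichy, MELD and Child–Pugh are simple 2 × 2 quantities; a small illustrative helper (with hypothetical counts, not the study's data) is shown below.

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity and specificity from a 2x2 table of a binary predictor
    (e.g. Rotterdam class III) against 3-month mortality."""
    sensitivity = tp / (tp + fn)   # deaths correctly flagged as high risk
    specificity = tn / (tn + fp)   # survivors correctly flagged as low risk
    return sensitivity, specificity

# Hypothetical counts for illustration only
sens, spec = sensitivity_specificity(tp=8, fn=1, tn=26, fp=16)
print(f"sensitivity = {sens:.2f}, specificity = {spec:.2f}")
```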


Cardiopulmonary complications leading to premature deaths in adult patients with sickle cell disease

AMERICAN JOURNAL OF HEMATOLOGY, Issue 1 2010
Courtney D. Fitzhugh
Sickle cell disease (SCD) is associated with early mortality. We sought to determine the incidence, cause, and risk factors for death in an adult population of patients with SCD. All patients aged ≥18 years seen at the Adult Sickle Cell Center at Duke University Medical Center between January 2000 and April 2005 were enrolled. Forty-three patients (21 males and 22 females) died during the study period. The median age of survival was 39 years for females (95% CI: 34–56), 40 years for males (95% CI: 34–48), and 40 years overall (95% CI: 35–48). Cardiac causes of death accounted for 25.6% (11/43 patients); pulmonary, 14.0% (six patients); other SCD related, 32.6% (14 patients); unknown, 14.0% (six patients); and others, 14.0% (six patients). Pulseless electrical activity arrest, pulmonary emboli, multiorgan failure, and stroke were the most frequent causes of death. Among the deceased patients, the most common premorbid conditions were cardiopulmonary: acute chest syndrome/pneumonia (58.1%), pulmonary hypertension (pHTN; 41.9%), systemic HTN (25.6%), congestive heart failure (25.6%), myocardial infarction (20.9%), and arrhythmias (14.0%). Tricuspid regurgitant jet velocity was significantly higher (3.1 m/sec vs. 2.6 m/sec, P < 0.001) and hemoglobin significantly lower (8.3 g/dL vs. 9.2 g/dL, P < 0.05) in deceased patients when compared with patients who lived, respectively. With improved preventive and therapeutic advances, including hydroxyurea therapy, acute complications such as infection are no longer the leading cause of death; instead, causes of death and premorbid conditions are shifting to chronic cardiopulmonary complications. Further, arrhythmia leading to premature death is under-recognized in SCD and warrants further investigation. Am. J. Hematol., 2010. © 2009 Wiley-Liss, Inc. [source]


The excess burden of stroke in hospitalized adults with sickle cell disease,

AMERICAN JOURNAL OF HEMATOLOGY, Issue 9 2009
John J. Strouse
This report compares the relative rates and risk factors associated with stroke in adults versus children with sickle cell disease (SCD) in the United States over the last decade. We identified incident strokes in patients with SCD using ICD-9 codes for acute stroke and SCD and the California Patient Discharge Databases. We estimated SCD prevalence by using the incidence of SCD at birth with adjustment for early mortality from SCD. We identified 255 acute strokes (70 primary hemorrhagic and 185 ischemic) among 69,586 hospitalizations for SCD-related complications from 1998 to 2007. The rate of stroke in children [<18 years old (310/100,000 person-years)] was similar to young adults [18–34 years old (360/100,000 person-years)], but much higher in middle-aged [35–64 years old (1,160/100,000 person-years)] and elderly adults [≥65 years old (4,700/100,000 person-years)]. Stroke was associated with hypertension in children and hypertension, diabetes mellitus, hyperlipidemia, atrial fibrillation, and renal disease in adults. Most acute strokes (75%) and in-hospital deaths from stroke (91%) occurred in adults. Our results suggest that the rate of stroke in SCD peaks in older adults and is three-fold higher than rates previously reported in African-Americans of similar age (35–64 years) without SCD. Stroke in SCD is associated with several known adult risk factors for ischemic and hemorrhagic stroke. Studies for the primary and secondary prevention of stroke in adults with SCD are urgently needed. Am. J. Hematol. 2009. © 2009 Wiley-Liss, Inc. [source]
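The age-specific stroke rates above are incidence densities: events divided by person-years at risk, scaled to 100,000. A small sketch with invented counts (not the California discharge data) shows the arithmetic.

```python
def rate_per_100k(events, persons, years_observed):
    """Incidence rate per 100,000 person-years."""
    person_years = persons * years_observed
    return events / person_years * 100_000

# Invented example: 35 strokes among 1,000 prevalent SCD patients followed for 10 years
rate = rate_per_100k(events=35, persons=1_000, years_observed=10)
ratio = rate / 120  # invented comparison rate in a reference population
print(f"{rate:.0f} per 100,000 person-years; rate ratio ~ {ratio:.1f}")
```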


Asthma management: Reinventing the wheel in sickle cell disease,

AMERICAN JOURNAL OF HEMATOLOGY, Issue 4 2009
Claudia R. Morris
Asthma is a common comorbidity in sickle cell disease (SCD) with a reported prevalence of 30–70%. The high frequency of asthma in this population cannot be attributed to genetic predisposition alone, and likely reflects, in part, the contribution of overlapping mechanisms shared between these otherwise distinct disorders. There is accumulating evidence that dysregulated arginine metabolism and, in particular, elevated arginase activity contributes to pulmonary complications in SCD. Derangements of arginine metabolism are also emerging as a newly appreciated mechanism in both asthma and pulmonary hypertension independent of SCD. Patients with SCD may potentially be at risk for an asthma-like condition triggered or worsened by hemolysis-driven release of erythrocyte arginase and low nitric oxide bioavailability, in addition to classic familial asthma. Mechanisms that contribute to asthma are complex and multifactorial, influenced by genetic polymorphisms as well as environmental and infectious triggers. Given the association of asthma with inflammation, oxidative stress and hypoxemia, factors known to contribute to a vasculopathy in SCD, and the consequences of these factors on sickle erythrocytes, comorbid asthma would likely contribute to a vicious cycle of sickling and subsequent complications of SCD. Indeed a growing body of evidence documents what should come as no surprise: Asthma in SCD is associated with acute chest syndrome, stroke, pulmonary hypertension, and early mortality, and should therefore be aggressively managed based on established National Institutes of Health Guidelines for asthma management. Barriers to appropriate asthma management in SCD are discussed as well as strategies to overcome these obstacles in order to provide optimal care. Am. J. Hematol., 2009. © 2008 Wiley-Liss, Inc. [source]


Phenotypic approaches for understanding patterns of intracemetery biological variation

AMERICAN JOURNAL OF PHYSICAL ANTHROPOLOGY, Issue S43 2006
Christopher M. Stojanowski
Abstract This paper reviews studies of phenotypic inheritance and microevolutionary processes in archaeological populations using data on cranial and dental phenotypic variation, often referred to as paleogenetics or biodistance analysis. The estimation of biological distances between populations, or among individuals within populations, is one component of bioarchaeological research on past populations. In this overview, five approaches that focus on morphological variation within cemeteries are summarized: kinship and cemetery structure analysis, postmarital residence analysis, sample aggregate phenotypic variability, temporal microchronology, and age-structured phenotypic variation. Previous research, theoretical justifications, and methods are outlined for each topic. Case studies are presented that illustrate these theoretical and methodological bases, as well as demonstrate the kinds of inferences possible using these approaches. Kinship and cemetery structure analysis seeks to identify the members of family groups within larger cemeteries or determine whether cemeteries were kin-structured. Analysis of sex-specific phenotypic variation allows estimation of postmarital residence practices, which is important for understanding other aspects of prehistoric social organization. Analysis of aggregate phenotypic variability can be used to infer site formation processes or cemetery catchment area. The study of temporal microchronologies can be used to evaluate provisional archaeological chronologies or study microevolutionary processes such as adaptive selection or changing patterns of gene flow. Finally, age-structured phenotypic variation can be reflective of selection processes within populations or it can be used as a measure of morbidity, growth arrest, and early mortality within past populations. Use of phenotypic data as a genotypic proxy is theoretically sound, even at small scales of analysis. Yrbk Phys Anthropol 49:49–88, 2006. © 2006 Wiley-Liss, Inc. [source]


2202 Kidney Transplant Recipients with 10 Years of Graft Function: What Happens Next?

AMERICAN JOURNAL OF TRANSPLANTATION, Issue 11 2008
A. J. Matas
The ultimate goal of clinical transplantation is for the recipients to achieve long-term survival, with continuing graft function, that is equivalent to that of the age-matched general population. We studied subsequent outcome in kidney transplant recipients with 10 years of graft function. In all, 2202 kidney transplant recipients survived with graft function >10 years. For 10-year survivors, the actuarial 25-year patient survival rate for primary transplant living donor (LD) recipients was 57%; graft survival, 43%. For primary transplant deceased donor (DD) recipients, the actuarial 25-year patient survival rate was 39%; graft survival, 27%. The two major causes of late graft loss were death (with graft function) and chronic allograft nephropathy (tubular atrophy and interstitial fibrosis). The two major causes of death with function were cardiovascular disease (CVD) and malignancy. For nondiabetic recipients, the mean age at death with function from CVD was 54 ± 13 years; for diabetic recipients, 53 ± 7 years. By 20 years posttransplant, morbidity was common: >40% of recipients had skin cancer (mean age for nondiabetic recipients, 53 ± 13 years; for diabetics, 49 ± 8 years), >10% had non-skin cancer (mean age for nondiabetic recipients, 53 ± 16 years; for diabetics, 46 ± 9 years), and >30% had CVD (mean age for nondiabetic recipients, 53 ± 15 years; for diabetics, 47 ± 9 years). We conclude that long-term transplant recipients have a high rate of morbidity and early mortality. As short-term results have improved, more focus is needed on long-term outcome. [source]


Cerebral gunshot wounds: a score based on three clinical parameters to predict the risk of early mortality

ANZ JOURNAL OF SURGERY, Issue 11 2009
Michael Stoffel
Abstract Background: To provide a score to predict the risk of early mortality after single craniocerebral gunshot wound (GSW) based on three clinical parameters. Methods: All patients admitted to Baragwanath Hospital, Johannesburg, South Africa, between October 2000 and May 2005 for an isolated single craniocerebral GSW were retrospectively evaluated for the documentation of (i) blood pressure (BP) on admission; (ii) inspection of the bullet entry and exit site; and (iii) initial consciousness (n = 214). Results: Conscious GSW victims had an early mortality risk of 8.3%; unconscious patients had a more than fourfold higher risk (39.2%). Patients with a systolic BP between 100 and 199 mm Hg had an 18.2% risk of mortality. Hypotension (<100 mm Hg) doubled this risk (37.7%) and severe hypertension (≥200 mm Hg) was associated with an even higher mortality rate of 57.1%. Patients without brain spilling out of the wound ('non-oozer') exhibited a mortality of 19.7%, whereas it was twice as high (43.3%) in patients with brain spill ('oozer'). By logistic regression, a prognostic index for each variant of the evaluated parameters could be established: non-oozer: 0, oozer: 1; conscious: 0, unconscious: 2; 100 ≤ RRsys < 200 mm Hg: 0, RRsys < 100 mm Hg: 1, RRsys ≥ 200 mm Hg: 2. This resulted in a score (0–5) by which the individual risk of early mortality after GSW can be anticipated. Conclusions: Three immediately obtainable clinical parameters were evaluated and a score for predicting the risk of early mortality after a single craniocerebral GSW was established. [source]
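The prognostic index described in this abstract is additive over three bedside findings. A direct transcription of the stated point assignments into a small function follows; the function and parameter names are ours, but the weights are as reported in the abstract.

```python
def gsw_early_mortality_score(brain_spill: bool, unconscious: bool, systolic_bp: float) -> int:
    """Score (0-5) for early mortality risk after a single craniocerebral gunshot wound,
    using the point assignments reported in the abstract."""
    score = 0
    score += 1 if brain_spill else 0          # 'oozer' = 1, 'non-oozer' = 0
    score += 2 if unconscious else 0          # unconscious = 2, conscious = 0
    if systolic_bp < 100:                     # hypotension
        score += 1
    elif systolic_bp >= 200:                  # severe hypertension
        score += 2
    # 100 <= systolic_bp < 200 mm Hg adds 0
    return score

print(gsw_early_mortality_score(brain_spill=True, unconscious=True, systolic_bp=85))  # -> 4
```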


Thoracoscopic talc pleurodesis for malignant pleural effusion

ANZ JOURNAL OF SURGERY, Issue 1-2 2003
David Love
Background: Malignant pleural effusion (MPE) is a common and distressing condition at the end of life for many patients with disseminated cancer. The challenge for the surgeon lies in managing this problem in order to deliver the most effective palliation with the least impact on the limited time available to these patients. Methods: Herein is reported a retrospective review of outcomes for a consecutive series of 66 MPE (61 patients) treated over a 5-year period from 1995 to 2000. A standard operative technique involving a single-lung anaesthetic and two-port thoracoscopy was employed. Outcomes were determined by contacting the referring practitioner or the patients themselves. Principal outcome measures included time to recurrence of the effusion and survival. Results: Complete follow-up was achieved for 60 MPE (55 patients; five of whom were treated for metachronous, bilateral disease). The three most common primary sites were breast, lung and mesothelial tissue. The planned procedure was not completed in two cases due to encasement of the underlying lung by tumour. Primary failure (immediate recurrence of the effusion) occurred in six cases. Delayed recurrence of the effusion occurred in a further 23 MPE, resulting in complete control in 31 cases (52%) until death. Overall median survival was 220 days and the 30-day mortality was 0. Conclusions: Complete and permanent control of a malignant effusion is difficult to achieve. Management based on thoracoscopy and talc insufflation produces satisfactory results with an acceptable morbidity and no early mortality. The ability to inspect the pleural space, break down adhesions and completely drain pockets of fluid to achieve complete lung expansion probably contributes to this. [source]


Examining the location and cause of death within 30 days of radical prostatectomy

BJU INTERNATIONAL, Issue 4 2005
Shabbir M.H. Alibhai
OBJECTIVES To better characterize the cause and location of death after radical prostatectomy (RP). Early mortality is relatively uncommon after RP, little is known about the cause of death among men who die within 30 days of RP, and the trend toward earlier discharge after surgery means that a greater proportion of early mortality after RP may occur out of hospital. PATIENTS AND METHODS Using the Ontario Cancer Registry, we identified 11 010 men (mean age 68 years) who had a RP in the province of Ontario between 1990 and 1999. We identified the occurrence and location of all deaths within 30 days of RP. The cause of death was obtained from death certificate information. Logistic regression was used to examine factors (age, comorbidity, year of surgery) associated with the location of death. RESULTS Of the 11 010 men, 53 died within 30 days of RP (0.5%); of these 53 men, 28 (53%) died in hospital. Neither age, comorbidity nor year of surgery was significantly associated with location of death (P > 0.05). Major causes of death included cardiovascular disease (38%) and pulmonary embolism (13%). More than half of the patients who died out of hospital had an unknown cause of death. CONCLUSIONS Almost half of all deaths within 30 days of RP occur out of hospital; the two most common causes of death are potentially preventable. More detailed cause-of-death information may help to identify opportunities for prevention. [source]


Improving outcomes of cord blood transplantation: HLA matching, cell dose and other graft- and transplantation-related factors

BRITISH JOURNAL OF HAEMATOLOGY, Issue 2 2009
Vanderson Rocha
Summary The use of unrelated umbilical cord blood (UCB) as an alternative source for haematopoietic stem cell transplantation (HSCT) has become widespread for patients lacking a human leucocyte antigen (HLA)-matched donor. One of the disadvantages of using UCB is the limited number of haematopoietic stem cells and, consequently, delayed engraftment and increased risk of early mortality. Many approaches have been investigated in the attempt to improve engraftment and survival. Among those, studies analysing prognostic factors related to patients, disease, donor and transplantation have been performed. Variable factors have been identified, such as factors related to donor choice (HLA, cell dose and others) and transplantation (conditioning and graft-versus-host disease prophylaxis regimens). This review will focus on the interactions between HLA, cell dose and other modifiable factors related to the UCB unit selection and transplantation that may improve outcomes after UCB transplantation. [source]


Endogenous nitric oxide synthase inhibitors in sickle cell disease: abnormal levels and correlations with pulmonary hypertension, desaturation, haemolysis, organ dysfunction and death

BRITISH JOURNAL OF HAEMATOLOGY, Issue 4 2009
Gregory J. Kato
Summary Pulmonary hypertension (PH) in patients with sickle cell disease (SCD) is linked to intravascular haemolysis, impaired nitric oxide bioavailability, renal dysfunction, and early mortality. Asymmetric dimethylarginine (ADMA), an endogenous inhibitor of nitric oxide synthases (NOS), is associated with vascular disease in other populations. We determined the plasma concentrations of several key arginine metabolites and their relationships to clinical variables in 177 patients with SCD and 29 control subjects: ADMA, symmetric dimethylarginine (SDMA), NG-monomethyl-L-arginine (L-NMMA), N-omega-hydroxy-L-arginine (NOHA), arginine and citrulline. The median ADMA was significantly higher in SCD than controls (0·94 μmol/l vs. 0·31 μmol/l, P < 0·001). Patients with homozygous SCD had a remarkably lower ratio of arginine to ADMA (50 vs. 237, P < 0·001). ADMA correlated with markers of haemolysis, low oxygen saturation and soluble adhesion molecules. PH was associated with high levels of ADMA and related metabolites. Higher ADMA level was associated with early mortality, remaining significant in a multivariate analysis. Subjects with homozygous SCD have high systemic levels of ADMA, associated with PH and early death, implicating ADMA as a functional NOS inhibitor in these patients. These defects and others converge on the nitric oxide pathway in homozygous SCD with vasculopathy. [source]


Impact of weekend admissions on quality of care and outcomes in patients with acute myeloid leukemia,

CANCER, Issue 15 2010
Nelli Bejanyan MD
Abstract BACKGROUND: Hospital services are typically reduced over the weekend, which may result in delays in treatment or in obtaining medical procedures. The authors investigated quality of care and clinical outcomes of newly diagnosed acute myeloid leukemia (AML) patients who were hospitalized on weekends versus weekdays and treated with induction chemotherapy. METHODS: This retrospective follow-up study involved 422 AML patients treated with cytarabine-based induction chemotherapy at Cleveland Clinic from 1994 to 2008. Quality outcome measures included time to triple-lumen catheter (TLC) placement, time to induction chemotherapy, length of stay (LOS), early death (within 15 days of chemotherapy), and 30-day mortality. These were tested for association with known predictors of AML survival and etiology by the methods of linear, categorical, and survival analyses. RESULTS: Twenty-three percent of all admissions (n = 422) occurred over the weekend (n = 103). Compared with younger (aged <60 years) patients, older patients had higher rates of 30-day mortality (P = .003) and early death (P = .025) and longer time to induction (P = .02), but lower complete remission (P = .001) and overall survival (OS) rates (P < .0001). In univariate analyses, time to TLC was delayed for weekend admissions (P < .01). Weekend admissions had lower early mortality (P = .04) and 30-day mortality (P = .02). In multivariate analysis, only time to TLC remained significantly longer for weekend admissions (P < .001). CONCLUSIONS: Weekend admissions significantly delayed placement of TLC without affecting other quality parameters or patient survival. This is likely because of immediate initiation of peripheral chemotherapy with cytarabine even before the placement of TLC for infusion of anthracyclines. Cancer 2010. © 2010 American Cancer Society. [source]


A Systematic Review of Gender Differences in Mortality after Coronary Artery Bypass Graft Surgery and Percutaneous Coronary Interventions

CLINICAL CARDIOLOGY, Issue 10 2007
Catherine Kim M.D., M.P.H.
Abstract Gender differences exist in outcomes, particularly early mortality, for percutaneous interventions (PCI) and coronary artery bypass graft surgery (CABG). Better understanding of this issue may identify areas for improvement for all patients undergoing revascularization. Therefore, we summarized the evidence on gender differences in PCI and CABG outcomes, particularly early mortality, and on mediators of this difference. Using the key terms "women" or "gender," "revascularization," "coronary artery bypass," "angioplasty," "stent," and "coronary intervention," we searched MEDLINE from 1985 to 2005 for all randomized controlled trials (RCTs) and registries reporting outcomes by gender. Bibliographies and the Web sites of cardiology conferences were also reviewed. The literature was examined to identify gender differences in outcomes and mediators of these differences. We identified 23 studies reporting outcomes by gender for CABG and 48 studies reporting outcomes by gender for PCI. The majority of studies noted greater in-hospital mortality in women than in men, with mortality differences resolving with longer follow-up. Early mortality differences were reduced but not consistently eliminated after adjustment for comorbidities, procedural characteristics, and body habitus. Power to detect gender differences after multivariate adjustment was limited by declining mortality rates and small sample size. Gender was an independent risk factor for complications after both CABG and PCI. Women experience greater complications and early mortality after revascularization. Future exploration of gender differences in quality of care and in the benefit from combinations of stenting, antiplatelet, and anticoagulant medications is needed in order to optimize treatment. Copyright © 2007 Wiley Periodicals, Inc. [source]