Median Time (median + time)

  • Selected Abstracts


    Gender differences in bipolar disorder type I and II

    ACTA PSYCHIATRICA SCANDINAVICA, Issue 6 2009
    K. Suominen
    Objective: We investigated gender differences in bipolar disorder (BD) type I and II in a representative cohort of secondary care psychiatric in- and out-patients. Method: In the prospective, naturalistic Jorvi Bipolar Study of 191 secondary care psychiatric in- and out-patients, 160 patients (85.1%) could be followed up for 18 months with a life chart. Results: After adjusting for confounders, no marked differences in illness-related characteristics were found. However, female patients with BD had more lifetime comorbid eating disorders (P < 0.001, OR = 5.99, 95% CI 2.12–16.93) but fewer substance use disorders (P < 0.001, OR = 0.29, 95% CI 0.16–0.56) than males. Median time to recurrence after remission was 3.1 months longer among men than women, female gender carrying a higher hazard of recurrence (P = 0.006, HR = 2.00, 95% CI 1.22–3.27). Conclusion: Men and women with type I and II BD have fairly similar illness-related clinical characteristics, but their profile of comorbid disorders may differ significantly, particularly regarding substance use and eating disorders. In medium-term follow-up, females appear to have a higher hazard of recurrence than males. [source]
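
    The adjusted odds ratios and the hazard ratio quoted above come from multivariable models. As a rough illustration of how such an estimate is obtained, the sketch below fits a logistic regression with Python's statsmodels and converts one coefficient into an adjusted odds ratio with a 95% confidence interval. The cohort is simulated and the variable names (female, age, bd_type_II, eating_disorder) are hypothetical; this is not the Jorvi study's data or analysis code.

```python
# Illustrative sketch: obtaining an adjusted odds ratio and its 95% CI from a
# logistic regression. All variables and data are simulated/hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 160
df = pd.DataFrame({
    "female": rng.integers(0, 2, n),
    "age": rng.normal(40, 12, n),
    "bd_type_II": rng.integers(0, 2, n),
})
# Simulate an outcome that is more common among women (demonstration only).
logit = -2.5 + 1.8 * df["female"] + 0.01 * (df["age"] - 40)
df["eating_disorder"] = rng.random(n) < 1 / (1 + np.exp(-logit))

X = sm.add_constant(df[["female", "age", "bd_type_II"]])
fit = sm.Logit(df["eating_disorder"].astype(int), X).fit(disp=False)

or_female = np.exp(fit.params["female"])                 # adjusted odds ratio
ci_low, ci_high = np.exp(fit.conf_int().loc["female"])   # 95% CI for the OR
print(f"Adjusted OR (female) = {or_female:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")
```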


    Randomized comparison of the SLIPA (Streamlined Liner of the Pharynx Airway) and the SS-LM (Soft Seal Laryngeal Mask) by medical students

    EMERGENCY MEDICINE AUSTRALASIA, Issue 5-6 2006
    Cindy Hein
    Abstract Objective: The aim of the study was to compare the Streamlined Liner of the Pharynx Airway (SLIPA; Hudson RCI), a new supraglottic airway device, with the Soft Seal Laryngeal Mask (SS-LM; Portex) when used by novices. Methods: Thirty-six medical students with no previous airway experience received manikin training in the use of the SLIPA and the SS-LM. Once proficient, the students inserted each device in randomized sequence, in two separate patients in the operating theatre. Only two insertion attempts per patient were allowed. Students were assessed in terms of: device preference; success or failure; success at first attempt and time to ventilation. Results: Sixty-seven per cent of the students preferred to use the SLIPA (95% confidence interval 49–81%). The SLIPA was successfully inserted (one or two attempts) in 94% of patients (34/36) and the SS-LM in 89% (32/36) (P = 0.39). First attempt success rates were 83% (30/36) and 67% (24/36) in the SLIPA and SS-LM, respectively (P = 0.10). Median time to ventilation was shorter with the SLIPA (40.6 s) than with the SS-LM (66.9 s) when it was the first device used (P = 0.004), but times were similar when inserting the second device (43.8 s vs 42.9 s) (P = 0.75). Conclusions: In the present study novice users demonstrated high success rates with both devices. The SLIPA group achieved shorter times to ventilation when it was the first device they inserted, which might prove to be of clinical significance, particularly in resuscitation attempts. Although the Laryngeal Mask has gained wide recognition for use by both novice users and as a rescue airway in failed intubation, the data presented here suggest that the SLIPA might also prove useful in these areas. [source]


    Epidemiology of unarmed threats in the emergency department

    EMERGENCY MEDICINE AUSTRALASIA, Issue 4 2005
    Jonathan C Knott
    Abstract Objective: To evaluate the precipitants, subject characteristics, nature and outcomes of unarmed threats in the ED. Methods: A 12-month prospective survey of security codes precipitated by an unarmed threat (Code Grey). Results: Data were collected on 151 subjects. The Code Grey rate was 3.2/1000 ED presentations. They were most frequent on Saturday and in the late evening/early morning. There were verbal or physical threats of violence made to staff on 104 occasions (69%, 95% confidence interval [CI] 61–76) and a perceived threat of patient self-harm on 114 occasions (76%, 95% CI 68–82). Median time to be seen by a doctor was 8 min (interquartile range [IQR]: 2–21 min) and median time from presentation to Code was 59 min (IQR: 5–222 min). Sixteen subjects (11%, 95% CI 6–17) had a history of violence, 45 (30%, 95% CI 23–38) were affected by alcohol, 25 (17%, 95% CI 11–24) had used illicit drugs and 79 (52%, 95% CI 44–60) had a significant mental illness contributing to the Code Grey. Seventy-one patients (47%, 95% CI 39–55) required psychiatric admission, 49 (79%, 95% CI 66–88) involuntarily. Conclusion: Acutely agitated subjects pose a threat to themselves and the staff caring for them. The reason for the agitation is multifactorial and the majority arrive in a behaviourally disturbed state requiring early intervention. The times most likely to result in a Code Grey coincide with least available resources: ED and hospital risk management policies must account for this. A coherent approach by ED to this population is required to optimize patient and staff outcomes. [source]


    Long-term outcomes of children treated with the ketogenic diet in the past

    EPILEPSIA, Issue 7 2010
    Amisha Patel
    Summary Purpose: The ketogenic diet has well-established short- and long-term outcomes for children with intractable epilepsy, but only for those actively receiving it. However, no information exists about its long-term effects years after it has been discontinued. Methods: Living subjects were identified who were treated at the Johns Hopkins Hospital with the ketogenic diet from November 1993 to December 2008 for ≥1 month, and had discontinued it ≥6 months prior to this study. Of 530 patients who were eligible, 254 were successfully contacted by phone or e-mail with a survey and request for laboratory studies. Results: Questionnaires were completed by 101 patients, with a median current age of 13 years (range 2–26 years). Median time since discontinuing the ketogenic diet was 6 years (range 0.8–14 years). Few (8%) still preferred to eat high fat foods. In comparison to the 52% responder rate (>50% seizure reduction) at ketogenic diet discontinuation, 79% were now similarly improved (p = 0.0001). Ninety-six percent would recommend the ketogenic diet to others, yet only 54% would have started it before trying anticonvulsants. Lipids were normal (mean total cholesterol 158 mg/dl), despite most being abnormal while on the ketogenic diet. The mean Z scores for those younger than age 18 years were −1.28 for height and −0.79 for weight. In those 18 years of age or older, the mean body mass index (BMI) was 22.2. Discussion: This is the first study to report on the long-term effects of the ketogenic diet after discontinuation. The majority of subjects are currently doing well with regard to health and seizure control. [source]


    Rituximab therapy in adult patients with relapsed or refractory immune thrombocytopenic purpura: long-term follow-up results

    EUROPEAN JOURNAL OF HAEMATOLOGY, Issue 3 2008
    Marta Medeot
    Abstract Objective: To evaluate the long-term activity and toxicity profile of rituximab in adult patients with idiopathic immune thrombocytopenic purpura (ITP). Patients and methods: Twenty-six patients with active and symptomatic relapsed or refractory ITP received weekly infusions of rituximab 375 mg/m2 for 4 wk. Median time from diagnosis to rituximab was 34.5 months. The following parameters of efficacy and toxicity were considered: complete response (CR) and partial response (PR), relapse rate, relapse-free survival (RFS), therapy-free survival (TFS), and short- and long-term toxicity. Results: CR and PR were 14/26 (54%) and 4/26 (15%), respectively. Median time of observation was 56.5 months (range 39–77). Nine of the 18 responding patients relapsed after a median of 21 months (range 8–66); 9/26 patients (35%) maintained the response, with a median follow-up of 57 months (range 39–69), and 11/26 (42%) did not require further therapy; estimated 5-yr RFS and TFS were 61% and 72%, respectively. Younger age and shorter interval from diagnosis to rituximab appeared to be indicators of better outcome. Rituximab administration was associated with two episodes of short-term toxicity, including one case of serum sickness syndrome; no infectious or other significant long-term complications were documented. Conclusion: Rituximab therapy may achieve long-lasting remission in nearly one-third of patients with relapsed or refractory ITP, with a good safety profile. [source]


    Indications of cricohyoidoepiglottopexy versus anterior frontal laryngectomy: The role of contralateral vocal fold spread

    HEAD & NECK: JOURNAL FOR THE SCIENCES & SPECIALTIES OF THE HEAD AND NECK, Issue 11 2008
    David Bakhos MD
    Abstract Background. The aim of this retrospective study was to compare the indications, postoperative outcomes, and survival of supracricoid laryngectomy with cricohyoidoepiglottopexy versus anterior frontal laryngectomy. Method. Nineteen patients who underwent cricohyoidoepiglottopexy (group I) and 23 patients who underwent reconstructive anterior frontal laryngectomy (group II) between January 1992 and December 2004 were reviewed, and their respective indications and postoperative outcomes were compared. Results. There was no difference in median time to decannulation. Median times to feeding tube removal and to first oral alimentation, and length of hospital stay, were significantly shorter in group II. Five-year survival was 85% (group I) and 95% (group II). Local tumor control was obtained in 83% in group I and in 87% in group II. Conclusion. Cricohyoidoepiglottopexy (CHEP) was used more often than anterior frontal laryngectomy when there was contralateral vocal fold spread, but was associated with longer postoperative recovery. © 2008 Wiley Periodicals, Inc. Head Neck, 2008 [source]


    Salvage laryngectomy and pharyngocutaneous fistulae after primary radiotherapy for head and neck cancer: A national survey from DAHANCA

    HEAD & NECK: JOURNAL FOR THE SCIENCES & SPECIALTIES OF THE HEAD AND NECK, Issue 9 2003
    Cai Grau MD, DMSc
    Objective. In 1998, the Danish Society for Head and Neck Oncology decided to conduct a nationwide survey at the five head and neck oncology centers with the aim of evaluating the surgical outcome of salvage laryngectomy after radiotherapy, with special emphasis on identifying factors that could contribute to the development of pharyngocutaneous fistulae. Patients. A total of 472 consecutive patients undergoing postirradiation salvage laryngectomy in the period July 1, 1987–June 30, 1997 were recorded at the five head and neck oncology centers in Denmark. Age ranged from 36 to 84 years (median 63 years); there were 405 men and 67 women. Primary tumor site was glottic larynx (n = 242), supraglottic larynx (n = 149), other larynx (n = 45), pharynx (n = 27), and other (n = 9). All patients had received prior radiotherapy. Results. Median time between radiotherapy and laryngectomy was 10 months (range, 1–348 months). A total of 89 fistulae lasting at least 2 weeks were observed, corresponding to an overall average fistula risk of 19%. The number of laryngectomies performed per year decreased linearly (from 58 to 37), whereas the annual number of fistulae increased slightly (from 7 to 11), so that the corresponding estimated fistula risk increased significantly from 12% in 1987 to 30% in 1997. Other significant risk factors for fistulae in univariate analysis included younger patient age, advanced primary T and N stage, nonglottic primary site, resection of the hyoid bone, high total radiation dose, and large radiation fields. Multiple logistic regression analysis of these parameters suggested that nonglottic tumor site, late laryngectomy period (1987–1992 vs 1993–1997), and advanced initial T stage were independent prognostic factors for fistula risk. Surgical parameters such as resection of thyroid/tongue base/trachea and radiotherapy parameters such as overall treatment time or fractions per week did not influence fistula risk. Conclusions. The risk of fistulae is especially high in patients initially treated with radiotherapy for nonglottic advanced-stage tumors. A significant decrease in the number of salvage laryngectomies performed over the 10 years was seen, while the annual number of fistulae remained almost constant. The resulting more-than-doubling of the fistula rate could thus in part be explained by less surgical routine. © 2003 Wiley Periodicals, Inc. Head Neck 25: 711–716, 2003 [source]


    HIV-related morbidity and mortality in patients starting protease inhibitors in very advanced HIV disease (CD4 count of <50 cells/µL): an analysis of 338 clinical events from a randomized clinical trial

    HIV MEDICINE, Issue 2 2002
    M Floridia
    Background AIDS-defining events occur infrequently in the presence of CD4 counts above 200 cells/µL. It is, however, uncertain for most of the AIDS-defining conditions whether this threshold can be considered equally safe in patients with a previously very low CD4 nadir. Methods We evaluated in detail all the AIDS-defining events observed during a 48-week clinical trial in 1251 nucleoside reverse transcriptase inhibitor-experienced patients who started protease inhibitors (PIs) at CD4 counts below 50 cells/µL. The type of event, immunological status at the moment of the event and time between start of PI treatment and event occurrence were analysed cumulatively and by event type; event rates were calculated. Results Concomitant data on CD4 counts were available for 338 AIDS-defining events (81% of total events). Median time between start of treatment with PI and event was 94.5 days and median absolute CD4 count at the occurrence of the event was 20 cells/µL. Only 14 events (in 12 patients) were observed above the threshold of 200 CD4 cells/µL. An analysis of the 67 deaths with concomitantly available CD4 counts (57%) showed a median CD4 count of 10 cells/µL, with only four deaths occurring in the presence of a CD4 count above 100 cells/µL. Conclusions Very few clinical AIDS-defining conditions were observed in patients who started PIs at very low CD4 counts and who, with treatment, restored their CD4 counts to above 200 cells/µL. This threshold can therefore be considered a clinically effective goal of treatment with respect to the occurrence of all AIDS-defining conditions in patients starting PIs in very advanced HIV disease. [source]


    Safety and efficacy of docetaxel, estramustine phosphate and hydrocortisone in hormone-refractory prostate cancer patients

    INTERNATIONAL JOURNAL OF UROLOGY, Issue 7 2010
    Yoshihiro Nakagami
    Objective: To assess the combination of docetaxel (DTX), estramustine phosphate (EMP) and hydrocortisone for patients with hormone-refractory prostate cancer (HRPC). Methods: A total of 63 patients with HRPC were treated with a chemotherapeutic regimen including DTX, EMP, and hydrocortisone. Clinical and pathological features were correlated with serum prostate-specific antigen (PSA) recurrence and survival rates. Incidence and degree of toxicities were also retrospectively reviewed. Results: A median of 11 courses of chemotherapy was administered per patient. PSA levels decreased by >50% in 32 (51%) patients and by >90% in 18 (29%) patients. Median time to PSA progression was 6 months (range 1 to 41 months) and median overall survival was 14 months (range 1 to 56 months). In a univariate analysis to predict overall survival, PSA, hemoglobin, alkaline phosphatase, and performance status prior to chemotherapy were significant factors. Although grade 3–4 neutropenia occurred in 87% of patients, grade 5 interstitial pneumonia in one patient, and grade 4–5 myocardial infarction in two patients, the regimen seemed to be relatively safe. Conclusions: Combination chemotherapy with DTX, EMP and hydrocortisone provides survival benefits for patients with HRPC with an acceptable toxicity profile. We need to further evaluate who might benefit most from this regimen. [source]


    Stage 2 Pressure Ulcer Healing in Nursing Homes

    JOURNAL OF AMERICAN GERIATRICS SOCIETY, Issue 7 2008
    Nancy Bergstrom PhD
    OBJECTIVES: To identify resident and wound characteristics associated with Stage 2 pressure ulcer (PrU) healing time in nursing home residents. DESIGN: Retrospective cohort study with convenience sampling. SETTING: One hundred two nursing homes participating in the National Pressure Ulcer Long-Term Care Study (NPULS) in the United States. PARTICIPANTS: Seven hundred seventy-four residents aged 21 and older with length of stay of 14 days or longer who had at least one initial Stage 2 (hereafter Stage 2) PrU. MEASUREMENTS: Data collected for each resident over a 12-week period included resident characteristics and PrU characteristics, including area when the ulcer first reached Stage 2. Data were obtained from medical records and logbooks. RESULTS: There were 1,241 initial Stage 2 PrUs on 774 residents; 563 (45.4%) healed. Median time to heal was 46 days. Initial area was significantly associated with days to heal. Using Kaplan-Meier survival analyses, median days to heal was 33 for small (≤1 cm2), 53 for medium (>1 to ≤4 cm2), and 73 for large (>4 cm2) ulcers. Using Cox proportional hazards regression models to examine the effects of multiple variables simultaneously, small and medium ulcers and ulcers on residents with agitation or with oral eating problems healed more quickly, whereas ulcers on residents who required extensive assistance with seven to eight activities of daily living (ADLs), who temporarily left the facility for the emergency department (ED) or hospital, or whose PrU was on an extremity healed more slowly. CONCLUSION: PrUs on residents with agitation or with oral eating problems were associated with faster healing time. PrUs located on extremities, on residents who went temporarily to the ED or hospital, and on residents with high ADL disabilities were associated with slower healing time. The interaction between PrU size and place of onset was also associated with healing time. For PrU onset before or after admission to the facility, smaller size was associated with faster healing time. [source]
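
    The abstract above combines Kaplan-Meier estimates of median healing time by ulcer size with Cox proportional hazards regression. The sketch below shows, under stated assumptions, how such an analysis can be run with the Python lifelines package; the data frame is simulated and the column names (size_group, agitation, days, healed) are hypothetical rather than taken from the NPULS study.

```python
# Illustrative sketch: Kaplan-Meier median time-to-heal by ulcer size group
# plus a Cox proportional hazards model, on simulated (hypothetical) data.
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

rng = np.random.default_rng(1)
n = 300
df = pd.DataFrame({
    "size_group": rng.choice(["small", "medium", "large"], n),
    "agitation": rng.integers(0, 2, n),
})
scale = df["size_group"].map({"small": 35, "medium": 55, "large": 75})
df["days"] = rng.exponential(scale)                # simulated days to healing
df["healed"] = (df["days"] <= 84).astype(int)      # censor at 12 weeks
df["days"] = df["days"].clip(upper=84)

kmf = KaplanMeierFitter()
for group, sub in df.groupby("size_group"):
    kmf.fit(sub["days"], event_observed=sub["healed"], label=group)
    print(group, "median days to heal:", kmf.median_survival_time_)

# Cox model: effect of size group and agitation on the healing rate.
cph = CoxPHFitter()
cph.fit(pd.get_dummies(df, columns=["size_group"], drop_first=True, dtype=float),
        duration_col="days", event_col="healed")
cph.print_summary()
```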


    Efficacy and Safety of Rofecoxib 12.5 mg Versus Nabumetone 1,000 mg in Patients with Osteoarthritis of the Knee: A Randomized Controlled Trial

    JOURNAL OF AMERICAN GERIATRICS SOCIETY, Issue 5 2004
    Alan J. Kivitz MD
    Objectives: To evaluate the use of starting doses of rofecoxib and nabumetone in patients with osteoarthritis (OA) of the knee. Design: A 6-week, randomized, parallel-group, double-blind, placebo-controlled study. Setting: One hundred thirteen outpatient sites in the United States. Participants: A total of 1,042 male and female patients aged 40 and older with OA of the knee (>6 months). Interventions: Rofecoxib 12.5 mg once a day (n=424), nabumetone 1,000 mg once a day (n=410), or placebo (n=208) for 6 weeks. Measurements: The primary efficacy endpoint was patient global assessment of response to therapy (PGART) over 6 weeks, which was also specifically evaluated over the first 6 days. The main safety measure was adverse events during the 6 weeks of treatment. Results: The percentage of patients with a good or excellent response to therapy as assessed using PGART at Week 6 was significantly higher with rofecoxib (55.4%) than nabumetone (47.5%; P=.018) or placebo (26.7%; P<.001 vs rofecoxib or nabumetone). Median time to first report of a good or excellent PGART response was significantly shorter in patients treated with rofecoxib (2 days) than with nabumetone (4 days, P=.002) and placebo (>5 days, P<.001) (nabumetone vs placebo; P=.007). The safety profiles of rofecoxib and nabumetone were generally similar, including gastrointestinal, hypertensive, and renal adverse events. Conclusion: Rofecoxib 12.5 mg daily demonstrated better efficacy over 6 weeks of treatment and quicker onset of OA efficacy over the first 6 days than nabumetone 1,000 mg daily. Both therapies were generally well tolerated. [source]


    Antifracture Efficacy and Reduction of Mortality in Relation to Timing of the First Dose of Zoledronic Acid After Hip Fracture

    JOURNAL OF BONE AND MINERAL RESEARCH, Issue 7 2009
    Erik Fink Eriksen
    Abstract Annual infusions of zoledronic acid (5 mg) significantly reduced the risk of vertebral, hip, and nonvertebral fractures in a study of postmenopausal women with osteoporosis, and significantly reduced clinical fractures and all-cause mortality in another study of women and men who had recently undergone surgical repair of hip fracture. In this analysis, we examined whether timing of the first infusion of zoledronic acid study drug after hip fracture repair influenced the antifracture efficacy and mortality benefit observed in the study. A total of 2127 patients (1065 on active treatment and 1062 on placebo; mean age, 75 yr; 76% women and 24% men) were administered zoledronic acid or placebo within 90 days after surgical repair of an osteoporotic hip fracture and annually thereafter, with a median follow-up time of 1.9 yr. Median time to first dose after the incident hip fracture surgery was approximately 6 wk. Posthoc analyses were performed by dividing the study population into 2-wk intervals (calculated from time of first infusion in relation to surgical repair) to examine effects on BMD, fracture, and mortality. Analysis by 2-wk intervals showed a significant total hip BMD response and a consistent reduction of overall clinical fractures and mortality in patients receiving the first dose 2 wk or later after surgical repair. Clinical fracture subgroups (vertebral, nonvertebral, and hip) were also reduced, albeit with more variation and with 95% CIs crossing 1 at most time points. We concluded that administration of zoledronic acid to patients suffering a low-trauma hip fracture 2 wk or later after surgical repair increases hip BMD, induces significant reductions in the risk of subsequent clinical vertebral, nonvertebral, and hip fractures, and reduces mortality. [source]


    Temporal Patterns of Atrial Arrhythmia Recurrences in Patients with Implantable Defibrillators: Implications for Assessing Antiarrhythmic Therapies

    JOURNAL OF CARDIOVASCULAR ELECTROPHYSIOLOGY, Issue 4 2002
    LINA A. SHEHADEH M.S.
    Temporal Patterns of Atrial Arrhythmias. Introduction: The statistical measures commonly used to assess therapies for recurrent atrial arrhythmias (such as time to first recurrence) often assume a uniformly random pattern of arrhythmic events over time. However, the true temporal pattern of atrial arrhythmia recurrences is unknown. The aim of this study was to use linear and nonlinear analyses to characterize the temporal pattern of atrial arrhythmia recurrences in patients with implantable cardioverter defibrillators. Methods and Results: The time and date of atrial tachyarrhythmias recorded in 65 patients with combined atrial and ventricular defibrillators were used to construct a probability density function (PDF) and a model of a Poisson distribution of arrhythmic events for each patient. Average patient age was 66 ± 10 years and follow-up was 7.8 ± 4.8 months. A total of 10,759 episodes of atrial tachyarrhythmias were analyzed (range 43 to 618 episodes per patient). The PDF fit a power law distribution for all 65 patients, with an average r2 = 0.89 ± 0.08. The PDF distribution differed significantly from the model Poisson distribution in 47 of 65 patients (P = 0.0002). Differences from the Poisson distribution were noted for patients both taking (30/43 patients; P < 0.015) and not taking (17/22 patients; P < 0.017) antiarrhythmic drugs. Median time between atrial arrhythmia detections for all 65 patients was 10.8 minutes. Conclusion: In implantable cardioverter defibrillator patients, the temporal pattern of frequent recurrences of atrial tachyarrhythmias usually is characterized by a power law distribution. The unique statistical properties of this type of distribution should be considered in designing outcome measures for treatment of atrial tachyarrhythmias. [source]
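
    To make the statistical comparison in this abstract concrete, the sketch below estimates the empirical PDF of inter-detection intervals on logarithmic bins, fits a power law on log-log axes, and tests the same intervals against a homogeneous Poisson model (exponential inter-event times). It is an illustration only: the interval data are simulated and the binning choices are assumptions, not the authors' method.

```python
# Illustrative sketch: power-law fit of inter-detection intervals versus a
# homogeneous Poisson model. The interval data here are simulated.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# Hypothetical inter-detection intervals (minutes) with heavy-tailed clustering.
intervals = rng.pareto(a=1.5, size=500) * 5.0 + 1.0

# Empirical PDF on logarithmic bins, then a power-law fit on log-log axes.
bins = np.logspace(np.log10(intervals.min()), np.log10(intervals.max()), 20)
density, edges = np.histogram(intervals, bins=bins, density=True)
centers = np.sqrt(edges[:-1] * edges[1:])
mask = density > 0
slope, intercept, r, _, _ = stats.linregress(np.log10(centers[mask]),
                                             np.log10(density[mask]))
print(f"power-law exponent ~ {slope:.2f}, r^2 = {r**2:.2f}")

# A Poisson process would give exponential inter-event times with the same mean;
# a Kolmogorov-Smirnov test checks whether that model is plausible.
ks_stat, p_value = stats.kstest(intervals, "expon", args=(0, intervals.mean()))
print(f"KS test vs exponential model: statistic={ks_stat:.3f}, p={p_value:.4f}")
```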


    Rituximab as an adjunct to plasma exchange in TTP: A report of 12 cases and review of the literature

    JOURNAL OF CLINICAL APHERESIS, Issue 5 2008
    Sushama Jasti
    Abstract Idiopathic thrombotic thrombocytopenic purpura (TTP) is caused by the production of autoantibodies against the von Willebrand factor-cleaving enzyme. This provides a rationale for the use of rituximab in this disease. We report a retrospective review of 12 patients treated with rituximab for TTP refractory to plasma exchange. Eleven patients were treated during the initial presentation, and one patient was treated for a recurrent relapse. Ten patients responded to treatment. Median time to response after the first dose of rituximab was 10 days (range 5–32). Of the 11 patients treated during the initial presentation, nine remain free of relapse after a median follow-up of 57+ months (range 1+ to 79+). Two patients died during initial treatment. One patient was lost to follow-up 1 month after achieving complete response. The patient treated for recurrent disease during a second relapse remained disease-free for 2 years, relapsed and was treated again with rituximab, and was in remission for 22 months. She relapsed again, was retreated, and has now been in remission for 21+ months. We conclude that rituximab is a useful addition to plasma exchange treatment in TTP, but its exact role and dosing need to be verified in prospective studies. J. Clin. Apheresis, 2008. © 2008 Wiley-Liss, Inc. [source]


    Nutritional factors associated with survival following enteral tube feeding in patients with motor neurone disease

    JOURNAL OF HUMAN NUTRITION & DIETETICS, Issue 4 2010
    A. Rio
    Abstract Background: Motor neurone disease (MND) is a progressive neurodegenerative disease leading to limb weakness, wasting and respiratory failure. Prolonged poor nutritional intake causes fatigue, weight loss and malnutrition. Consequently, disease progression requires decisions to be made regarding enteral tube feeding. The present study aimed to investigate the survival, nutritional status and complications in patients with MND treated with enteral tube feeding. Methods: A retrospective case note review was performed to identify patients diagnosed with MND who were treated with enteral tube feeding. A total of 159 consecutive cases were identified as suitable for analysis. Patients were treated with percutaneous endoscopic gastrostomy (PEG), radiologically inserted gastrostomy (RIG) or nasogastric feeding tube (NGT). Nutritional status was assessed by body mass index (BMI) and % weight loss (% WL). Serious complications arising from tube insertion and prescribed daily energy intake were both recorded. Results: Median survival from disease onset was 842 days [interquartile range (IQR) 573–1263]. Median time from disease onset to feeding tube placement was 521 days (IQR 443–1032) for PEG, 633 days (IQR 496–1039) for RIG and 427 days (IQR 77–781) for NGT (P = 0.28). Median survival from tube placement was 200 days (IQR 106–546) for PEG, 216 days (IQR 83–383) for RIG and 28 days (IQR 14–107) for NGT. The difference in survival between gastrostomy- and NGT-treated patients was significant (P ≤ 0.001). Serious complications were not associated with nutritional status, whether assessed by BMI (P = 0.347) or % WL (P = 0.489). Conclusions: Nutritional factors associated with reduced survival were weight loss, malnutrition and severe dysphagia. Serious complications were not related to nutritional status but to the method of tube insertion. There was no difference in survival between PEG- and RIG-treated patients. [source]


    Palliative management of cancer of the oesophagus , opportunities for dietetic intervention

    JOURNAL OF HUMAN NUTRITION & DIETETICS, Issue 5 2003
    A. Holdoway
    Introduction: Cancer of the oesophagus develops insidiously, and when patients present with symptoms such as dysphagia to solids/semi-solids and in some cases liquids, the disease is often advanced and patients are frequently poorly nourished and cachectic (Angorn, 1981; Larrea, 1992). In our own unit we were aware that patients were only referred to the dietitian once an oesophageal stent was inserted or radiotherapy commenced, thereby possibly missing opportunities to treat or prevent malnutrition earlier. We therefore evaluated the nutritional status and care pathways of patients diagnosed with cancer of the oesophagus in whom palliative treatment was the only option, with the aim of assessing the extent of malnutrition and identifying opportunities for earlier dietetic intervention to prevent or slow the development of malnutrition. Method: Data were collated on all patients referred to the hospital's dysphagia clinic and diagnosed with inoperable cancer of the oesophagus. Height, weight, body mass index, degree of dysphagia, period of dysphagia and percentage weight loss (data collected as standard practice in the dysphagia clinic), together with time to stent insertion/radiotherapy and survival time, were collected from the medical notes. Results: Data were available on 58 patients (33 male, 25 female), mean age 75 years (range 49–92 years). The mean length of survival was 10.2 months (range 0–24 months). At diagnosis, 47% experienced dysphagia with solids, 33% with semi-solids and 16% experienced a degree of dysphagia with liquids. The period of dysphagia ranged from 1 month to 2 years. Eighty-three per cent of patients had lost weight at diagnosis. Mean percentage weight loss per individual was 13% (range 0–45%). Thirty-five per cent had a BMI <20 kg/m2. Median time from diagnosis to radiotherapy (n = 8) was 2 months (range 1–6 months). Median time from diagnosis to placement of an oesophageal stent (n = 12) was 1 month (range 0–7 months). Discussion: These data illustrate that malnutrition remains a significant problem in this patient group. They also demonstrate that dysphagia and malnutrition, as indicated by weight loss, develop in the community before diagnosis. Opportunities for earlier dietetic intervention exist between diagnosis and the date at which other treatments commence, i.e. stent insertion. Further opportunities exist to educate community health professionals on treating and preventing malnutrition when dysphagia presents. Survival times support the need for dietetic follow-up. In our unit the results of this audit helped to improve care pathways for patients with cancer of the oesophagus. In response to the above findings, a nutritional screening tool is now completed by a nurse specialist at the first clinic attended. This has enabled appropriate and timely advice to be given on modified texture and fortification of food to optimize nutritional intake at diagnosis. [source]


    Hospitalizations of infants and young children with Down syndrome: evidence from inpatient person-records from a statewide administrative database

    JOURNAL OF INTELLECTUAL DISABILITY RESEARCH, Issue 12 2007
    S. A. So
    Abstract Background Although individuals with Down syndrome are increasingly living into the adult years, infants and young children with the syndrome continue to be at increased risk for health problems. Using linked, statewide administrative hospital discharge records of all infants with Down syndrome born over a 3-year period, this study 'follows forward' over 200 infants with Down syndrome from each individual's birth until they turn 3 years of age. By utilizing this procedure, we were able to assess the amount, reasons for, and timing of inpatient hospitalization and to investigate how congenital heart defects (CHDs) relate to hospitalization for young children with Down syndrome. Method This population-based, retrospective study used statewide administrative hospital discharge data. Subject inclusion criteria included residents of Tennessee, born between 1997 and 1999, and diagnosed with Down syndrome at birth. Inpatient records were linked to create person-record histories of hospitalization from birth to age 3. Main outcomes included the number of Non-birth Hospitalizations, length of stay, principal and other diagnosis codes to indicate reason(s) for hospitalization, and patient's age at first (non-birth) hospitalization. Procedure codes were added to determine if children with CHD were hospitalized primarily for operations on the heart. Results Of 217 births, 213 children survived birth; 54% (115) had CHDs. Almost half (49.8%) of all children were hospitalized before age 3; these 106 children were admitted 245 times. Children with CHDs were 2.31 times more likely to be hospitalized than children without CHDs. Respiratory illnesses affected 64.9% of all hospitalized children with CHD, were the principal diagnoses in 38.3% of their hospitalizations, and were the main principal diagnoses for non-CHD children. Thirty-three (of 77) hospitalized children with CHD underwent cardiac surgeries, accounting for 19.3% of all admissions. Median time to first hospitalization was 96 days (CI: 78–114) for CHD infants and 197 days (CI: 46–347) for non-CHD infants. Conclusions Children with Down syndrome are at high risk for early hospitalization. Prevention and treatment of respiratory illnesses require more attention. Down syndrome is associated with early, serious, physical health problems and substantial inpatient care use. [source]


    Efficacy and safety of secondary prophylactic vs. on-demand sucrose-formulated recombinant factor VIII treatment in adults with severe hemophilia A: results from a 13-month crossover study

    JOURNAL OF THROMBOSIS AND HAEMOSTASIS, Issue 1 2010
    P. COLLINS
    Summary. Background: Hemarthroses in severe hemophilia precipitate physical, psychosocial and financial difficulties. Objective: To compare the effects of secondary prophylaxis with on-demand sucrose-formulated recombinant factor VIII (rFVIII-FS) therapy in severe hemophilia A. Patients and methods: This open-label study included patients aged 30–45 years with factor VIII (FVIII) coagulant activity <1 IU dL−1 who were using on-demand FVIII treatment. Patients were treated with rFVIII-FS on demand for 6 months, followed by 7 months of prophylaxis (20–40 IU kg−1, three times per week, with the first month considered a run-in). The primary endpoint was the number of hemarthroses. Results: Twenty patients were enrolled (n = 19 completed); the mean age was 36.4 years, and 16 had target joints. The median (25–75%) number of joint bleeds decreased significantly with prophylaxis [0 (0–3)] vs. on-demand [15 (11–26); P < 0.001] therapy. The number of all bleeds was 0 (0–3) vs. 20.5 (14–37; P < 0.001), respectively. Median (range) total Gilbert scores improved after prophylaxis [18 (3–39)] compared with on-demand [25 (4–46)] therapy, predominantly reflecting the improved bleeding score. Median time from last prophylactic infusion to bleed was 2 days; 82.5% of bleeds occurred 2–3 days after the last infusion. Median 48-h and 72-h FVIII trough levels measured during months 10 and 13 were consistently >6 and >4 IU dL−1, respectively. Treatment was well tolerated, and no inhibitor formation was observed. Conclusion: Secondary prophylaxis with rFVIII-FS significantly reduced the frequency of hemarthroses compared with on-demand therapy in adult patients with severe hemophilia A. [source]


    Retrospective Evaluation of Partial Parenteral Nutrition in Dogs and Cats

    JOURNAL OF VETERINARY INTERNAL MEDICINE, Issue 4 2002
    Daniel L. Chan
    The purpose of this retrospective study was to evaluate the use of partial parenteral nutrition (PPN) in dogs and cats. The medical records of all dogs and cats receiving PPN between 1994 and 1999 were reviewed to determine signalment, reasons for use of PPN, duration of PPN administration, duration of hospitalization, complications, and mortality. Complications were classified as metabolic, mechanical, or septic. One hundred twenty-seven animals (80 dogs and 47 cats) were included in the study, accounting for 443 patient days of PPN. The most common underlying diseases were pancreatitis (n = 41), gastrointestinal disease (n = 33), and hepatic disease (n = 23). Median time of hospitalization before initiation of PPN was 2.8 days (range, 0.2–10.7 days). Median duration of PPN administration was 3.0 days (range, 0.3–8.8 days). Median duration of hospitalization was 7 days (range, 2–20 days). In the 127 animals receiving PPN, 72 complications occurred. These included metabolic (n = 43), mechanical (n = 25), and septic (n = 4) complications. The most common metabolic complication was hyperglycemia (n = 19), followed by lipemia (n = 17) and hyperbilirubinemia (n = 6). Most complications were mild and did not require discontinuation of PPN. Ninety-three (73.2%) of the 127 patients were discharged. All 4 animals with septic complications were discharged from the hospital. The presence, type, and number of complications did not impact the duration of hospitalization or outcome. However, animals that received supplemental enteral nutrition survived more often than those receiving PPN exclusively. Although PPN seems to be a relatively safe method of providing nutritional support, future studies are warranted to determine its efficacy. [source]


    Human and other faeces as breeding media of the trachoma vector Musca sorbens

    MEDICAL AND VETERINARY ENTOMOLOGY, Issue 3 2001
    P. M. Emerson
    Abstract. The fly Musca sorbens Wiedemann (Diptera: Muscidae) apparently transmits Chlamydia trachomatis, causing human trachoma. The literature indicates that M. sorbens breeds predominantly in isolated human faeces on the soil surface, but not in covered pit latrines. We sought to identify breeding media of M. sorbens in a rural Gambian village endemic for trachoma. Test breeding media were presented for oviposition on soil-filled buckets and monitored for adult emergence. Musca sorbens emerged from human (6/9 trials), calf (3/9), cow (3/9), dog (2/9) and goat (1/9) faeces, but not from horse faeces, composting kitchen scraps or a soil control (0/9 of each). After adjusting for mass of medium, the greatest number of flies emerged from human faeces (1426 flies/kg). Median time for emergence was 9 (interquartile range = 8–9.75) days post-oviposition. Of all flies emerging from faeces 81% were M. sorbens. Male and female flies emerging from human faeces were significantly larger than those from other media, suggesting that they would be more fecund and live longer than smaller flies from other sources. Female flies caught from children's eyes were of a similar size to those from human faeces, but significantly larger than those from other media. We consider that human faeces are the best larval medium for M. sorbens, although some breeding also occurs in animal faeces. Removal of human faeces from the environment, through the provision of basic sanitation, is likely to greatly reduce fly density, eye contact and hence trachoma transmission, but if faeces of other animals are present M. sorbens will persist. [source]


    Teaching paediatric residents about learning disorders: use of standardised case discussion versus multimedia computer tutorial

    MEDICAL EDUCATION, Issue 8 2005
    Carolyn Frazer Bridgemohan
    Background: We developed a standardised case-based educational exercise on the topic of childhood learning disorders, and a multimedia computerised adaptation of this exercise, as part of a national curriculum project based on the Bright Futures guidelines. Objective: To explore resident perceptions of the facilitated case discussion (FCD) and the computerised tutorial (CT). Design: Quasi-randomised comparison of two educational interventions. Setting: Preclinic teaching conferences at a large urban children's hospital. Participants: A total of 46 paediatric residents in years 1–3 assigned to either FCD (n = 21) or CT (n = 25). Interventions: FCD residents met in groups of 8–12 with a trained facilitator for a structured case discussion, while CT residents worked in groups of 2–3 at a computer station linked to an interactive website. Outcome Measures: Participant responses during semistructured focus group interviews. Analysis: Focus group transcripts, field notes and computer logs were analysed simultaneously using qualitative grounded theory methodology. Results: Residents experienced CT as fun, offering flexibility, greater auditory and visual appeal and more opportunities for active learning. FCD allowed greater contact with expert faculty and made the material more relevant to clinical practice. FCD participants emphasised the clinical skills gleaned and stated that the learning experience would change their future patient management. Both groups reported that case discussion was more interactive than computer learning. Median time spent on learning was slightly shorter for the CT group. All groups of learners arrived at the correct final diagnosis. Conclusions: FCD and CT stimulate different types of learning among paediatric residents. Future studies are needed to determine how to integrate these two techniques to meet the learning needs of residents in diverse settings. [source]


    Granulocyte-macrophage colony stimulating factor and immunosuppression in the treatment of pediatric acquired severe aplastic anemia

    PEDIATRIC BLOOD & CANCER, Issue 2 2005
    Michael R. Jeng MD
    Abstract Background Immunosuppressive therapy (IS) is effective in the treatment of patients with acquired severe aplastic anemia (SAA). An enhanced myeloid response and decreased infection risk may be possible with the addition of a hematopoietic cytokine. Published data on the combination of cytokines and IS in patients with SAA are limited. The addition of G-CSF to IS shortens the time to neutrophil count recovery, but may not improve overall survival. Because GM-CSF acts differently than G-CSF, its use in combination with IS may be different. Procedure A retrospective chart review was performed on patients diagnosed with SAA and treated with IS and GM-CSF at St. Jude Children's Research Hospital. Hematologic recovery, prognostic factors, and infection data were collected. Results Eighteen patients were included in this study. The median age at diagnosis was 7.2 years (range 1.8–17.0). Ten patients (56%) had a complete response, four (22%) a partial response, and four (22%) no response. Median times to erythrocyte and platelet transfusion independence were 90 days (range 18–243) and 64 days (range 18–243), and median time to discontinuation of treatment was 287 days (range 90–730). Median times to partial (ANC > 500) and full (ANC > 1,500) neutrophil recovery were 41 and 51 days, respectively. Seventeen documented discrete infections occurred in six patients over 36 patient-years. Conclusions GM-CSF, in addition to IS, may shorten time to neutrophil count recovery, may be beneficial in decreasing infection rates, and may improve platelet response in patients with SAA. However, consistent with studies utilizing G-CSF, GM-CSF probably does not affect overall response rate. To fully answer whether or not cytokine therapy is of added value to IS in pediatric patients, a multi-institutional randomized trial is needed. © 2004 Wiley-Liss, Inc. [source]


    Postmarketing drug dosage changes of 499 FDA-approved new molecular entities, 1980–1999

    PHARMACOEPIDEMIOLOGY AND DRUG SAFETY, Issue 6 2002
    James Cross MS
    Abstract Purpose Risks and benefits of marketed drugs can be improved by changing their labels to optimize dosage regimens for indicated populations. Such postmarketing label changes may reflect the quality of pre-marketing development, regulatory review, and postmarketing surveillance. We documented dosage changes of FDA-approved new molecular entities (NMEs), and investigated trends over time and across therapeutic groups, on the premise that improved drug development methods have yielded fewer postmarketing label changes over time. Methods We compiled a list of NMEs approved by FDA from 1 January 1980 to 31 December 1999 using FDA's website, a Freedom of Information Act request, and the PhRMA (Pharmaceutical Research and Manufacturers of America) database. Original labeled dosages and indicated patient populations were tracked in labels in the Physician's Desk Reference®. Time- and covariate-adjusted risks for dosage changes by 5-year epoch and therapeutic group were estimated by survival analysis. Results Of 499 NMEs, 354 (71%) were evaluable. Dosage changes in indicated populations occurred in 73 NMEs (21%). A total of 58 (79%) were safety-motivated, net dosage decreases. The percentage of NMEs with changes by therapeutic group ranged from 27.3% for neuropharmacologic drugs to 13.6% for miscellaneous drugs. Median time to change following approval fell from 6.5 years (1980–1984) to 2.0 years (1995–1999). Contrary to our premise, 1995–1999 NMEs were 3.15 times more likely to change than 1980–1984 NMEs (p = 0.008, Cox analysis). Conclusions Dosages of one in five NMEs changed, and four in five changes were safety-related reductions. The increasing frequency of changes, independent of therapeutic group, may reflect intensified postmarketing surveillance and underscores the need to improve pre-marketing optimization of dosage and indicated population. Copyright © 2002 John Wiley & Sons, Ltd. [source]


    Role of Surgical Salvage for Regional Recurrence in Laryngeal Cancer

    THE LARYNGOSCOPE, Issue 1 2007
    Woo-Jin Jeong MD
    Abstract Objectives: The aims of this study were to analyze the pattern of regional recurrence in laryngeal cancer, evaluate the role of surgical salvage, and identify factors affecting salvage outcome. Methods: Retrospective analysis was conducted on medical records from a 16-year period. Of 463 patients diagnosed with laryngeal cancer, 25 patients with regional recurrence managed with salvage neck dissection were identified and included in the study. Isolated local recurrences and all distant metastases were excluded. Results: All patients were male, with a median age of 61 years. The overall rate of regional recurrence was 5.4%. Median time to regional recurrence was 13 months. Isolated regional recurrence occurred in 76% of cases, whereas locoregional recurrence occurred in 24%. The 5-year survival rate for patients undergoing neck dissection as salvage management was 61.2%. Recurrence in the contralateral neck was clearly associated with poor prognosis. Although standard statistical significance was not reached, trends toward poorer salvage outcome were identified in patients with a history of local recurrence before regional recurrence, recurrence in a previously dissected neck, and recurrent node size of 3 cm or above. Conclusions: Our study shows that salvage neck dissection for regional recurrence in laryngeal cancer is an acceptable approach. Surgical eradication of disease should be pursued whenever possible. Prudent planning of management is mandatory in the presence of a history of local recurrence before regional recurrence, a previously dissected neck, a large recurrent node, or contralateral neck recurrence. [source]


    Disseminated tumor cells in bone marrow following definitive radiotherapy for intermediate or high-risk prostate cancer

    THE PROSTATE, Issue 15 2008
    Arne Berg
    Abstract Background The purpose of this study was to explore the prevalence of disseminated tumor cells (DTCs) in bone marrow (BM) of clinically progression-free prostate cancer (PC) patients at least 2 years after curatively intended radiotherapy (RT) with or without adjuvant hormone treatment. Methods All patients were T1–3N0M0 with intermediate or high risk of progression. Median time from RT to BM sampling was 5 years (range 2–8). A standardized immunocytochemical method applying the anticytokeratin antibodies AE1/AE3 was used for DTC detection in 130 patients. Morphological characterization of immunostained cells was performed to exclude false-positive cells. The post-treatment BM status was explored in relation to pre-treatment risk factors, treatment strategy and serum levels of testosterone and PSA at the time of BM sampling. Longitudinal changes in BM status were studied in a subgroup of 109 patients who had also donated BM prior to treatment. Results Post-treatment BM aspirates were positive for DTCs in 17% of cases, without correlation to any of the tested variables. Of the 14 patients who had DTCs in BM prior to treatment, all but one had become post-treatment negative. Of the 95 patients with pre-treatment negative BM status, 18 (19%) had become post-treatment positive. Conclusions DTCs in BM were found in 17% of clinically progression-free PC patients following RT. The detection of these cells may provide PSA-independent prognostic information that remains to be explored by prolonged follow-up. Prostate 68: 1607–1614, 2008. © 2008 Wiley-Liss, Inc. [source]


    How much does Gleason grade of follow-up biopsy differ from that of initial biopsy in untreated, Gleason score 4–7, clinically localized prostate cancer?

    THE PROSTATE, Issue 15 2007
    R. Choo
    Abstract OBJECTIVE To compare histologic grades between an initial biopsy and a follow-up biopsy in untreated, Gleason score (GS) 4–7, clinically localized prostate cancer. METHODS AND MATERIALS In a prospective single-arm cohort study, clinically localized, GS 4–7 prostate cancer was managed with active surveillance alone, provided that a pre-defined definition of disease progression was not met. One hundred five (63%) of a total of 168 eligible patients underwent a follow-up prostate biopsy during surveillance. Median time to a follow-up biopsy was 22 months (range: 7–81). Histologic grades between these two biopsies were compared to evaluate the extent of histologic grade change. RESULTS On the follow-up biopsy, GS was unchanged in 33 patients (31%), upgraded in 37 (35%), and downgraded in 34 (32%). Eleven (10%) had upgrading by 2 Gleason points or more. Eight (8%) had upgrading to GS 8 (none to GS 9 or 10); of these, six were among those with upgrading by 2 Gleason points or more. Twenty-seven (26%) had no malignancy on the follow-up biopsy. A negative follow-up biopsy was more prevalent in patients with a small volume of malignancy in the initial biopsy and a low baseline PSA. CONCLUSIONS No consistent change in histologic grade was observed on the follow-up biopsy at a median of 22 months in untreated, GS 4–7, clinically localized prostate cancer. Upgrading to GS ≥8 or by 2 Gleason points or more was relatively uncommon. Prostate 67: 1614–1620, 2007. © 2007 Wiley-Liss, Inc. [source]


    Kidney Transplantation in Previous Heart or Lung Recipients

    AMERICAN JOURNAL OF TRANSPLANTATION, Issue 3 2009
    B. E. Lonze
    Outcomes after heart and lung transplants have improved, and many recipients survive long enough to develop secondary renal failure, yet remain healthy enough to undergo kidney transplantation. We used national data reported to United Network for Organ Sharing (UNOS) to evaluate outcomes of 568 kidney after heart (KAH) and 210 kidney after lung (KAL) transplants performed between 1995 and 2008. Median time to kidney transplant was 100.3 months after heart, and 90.2 months after lung transplant. Renal failure was attributed to calcineurin inhibitor toxicity in most patients. Outcomes were compared with primary kidney recipients using matched controls (MC) to account for donor, recipient and graft characteristics. Although 5-year renal graft survival was lower than primary kidney recipients (61% KAH vs. 73.8% MC, p < 0.001; 62.6% KAL vs. 82.9% MC, p < 0.001), death-censored graft survival was comparable (84.9% KAH vs. 88.2% MC, p = 0.1; 87.6% KAL vs. 91.8% MC, p = 0.6). Furthermore, renal transplantation reduced the risk of death compared with dialysis by 43% for KAH and 54% for KAL recipients. Our findings that renal grafts function well and provide survival benefit in KAH and KAL recipients, but are limited in longevity by the general life expectancy of these recipients, might help inform clinical decision-making and allocation in this population. [source]


    A Comparison of GlideScope Video Laryngoscopy Versus Direct Laryngoscopy Intubation in the Emergency Department

    ACADEMIC EMERGENCY MEDICINE, Issue 9 2009
    Timothy F. Platts-Mills MD
    Abstract Objectives: The first-attempt success rate of intubation was compared using GlideScope video laryngoscopy and direct laryngoscopy in an emergency department (ED). Methods: A prospective observational study was conducted of adult patients undergoing intubation in the ED of a Level 1 trauma center with an emergency medicine residency program. Patients were consecutively enrolled between August 2006 and February 2008. Data collected included indication for intubation, patient characteristics, device used, initial oxygen saturation, and resident postgraduate year. The primary outcome measure was success with first attempt. Secondary outcome measures included time to successful intubation, intubation failure, and lowest oxygen saturation levels. An attempt was defined as the introduction of the laryngoscope into the mouth. Failure was defined as an esophageal intubation, changing to a different device or physician, or inability to place the endotracheal tube after three attempts. Results: A total of 280 patients were enrolled, of whom video laryngoscopy was used for the initial intubation attempt in 63 (22%) and direct laryngoscopy was used in 217 (78%). Reasons for intubation included altered mental status (64%), respiratory distress (47%), facial trauma (9%), and immobilization for imaging (9%). Overall, 233 (83%) intubations were successful on the first attempt, 26 (9%) failures occurred, and one patient received a cricothyrotomy. The first-attempt success rate was 51 of 63 (81%, 95% confidence interval [CI] = 70% to 89%) for video laryngoscopy versus 182 of 217 (84%, 95% CI = 79% to 88%) for direct laryngoscopy (p = 0.59). Median time to successful intubation was 42 seconds (range, 13 to 350 seconds) for video laryngoscopy versus 30 seconds (range, 11 to 600 seconds) for direct laryngoscopy (p < 0.01). Conclusions: Rates of successful intubation on first attempt were not significantly different between video and direct laryngoscopy. However, intubation using video laryngoscopy required significantly more time to complete. [source]
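
    For readers who want to check the kind of comparison reported above (first-attempt success of 51/63 with video laryngoscopy versus 182/217 with direct laryngoscopy), the sketch below applies Fisher's exact test to the published counts and computes Wilson confidence intervals for each proportion. This is an illustration, not the authors' analysis; the p-value may differ slightly from the test they used.

```python
# Illustrative sketch: comparing two first-attempt success proportions with
# Fisher's exact test and Wilson 95% confidence intervals.
from scipy import stats
from statsmodels.stats.proportion import proportion_confint

success_video, n_video = 51, 63        # counts as reported in the abstract
success_direct, n_direct = 182, 217

table = [[success_video, n_video - success_video],
         [success_direct, n_direct - success_direct]]
odds_ratio, p_value = stats.fisher_exact(table)
print(f"Fisher exact: OR = {odds_ratio:.2f}, p = {p_value:.2f}")

for label, k, n in [("video", success_video, n_video),
                    ("direct", success_direct, n_direct)]:
    lo, hi = proportion_confint(k, n, alpha=0.05, method="wilson")
    print(f"{label}: {k / n:.1%} (95% CI {lo:.0%} to {hi:.0%})")
```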


    Natalizumab dosage suspension: Are we helping or hurting?

    ANNALS OF NEUROLOGY, Issue 3 2010
    Timothy W. West MD
    The risk of developing progressive multifocal leukoencephalopathy increases with the duration of treatment with natalizumab. Planned dosage interruptions have been proposed as a means of decreasing cumulative risk. The clinical consequences of dosage interruption were evaluated in a single center cohort of natalizumab-treated patients. Medical records were reviewed for 84 patients identified with multiple sclerosis who received 12 or more infusions of natalizumab at an academic multiple sclerosis center. Eighty-one percent (68/84) underwent a dosage interruption, and 19% (16/84) had no interruption in natalizumab treatment. Of those with a treatment interruption, 27.9% (19/68) experienced a clinical relapse within 6 months of the suspension, whereas none of the patients with ongoing treatment experienced a flare during months 12 to 18 of treatment (p = 0.017, Fisher exact test). Survival analysis showed that Kaplan-Meier curves comparing dosage interruption to ongoing treatment diverged (p = 0.025). Median time from treatment interruption to relapse onset was 3 months. No clinical predictors associated with an increased risk of developing flares during dosage interruption were identified. Among the 19 patients who had a flare, 7 had severe flares, with a mean number of 16 Gad+ lesions on brain magnetic resonance imaging (range, 6–40). Their median Expanded Disability Status Scale at natalizumab interruption was 3.0 and increased to 6.0 during the flare (p = 0.0008). Natalizumab dosage interruption is associated with clinical flares and return of radiographic inflammatory disease activity. Some of these flares can be clinically severe, with a high number of contrast-enhanced lesions, suggesting a possible rebound of disease activity. Ann Neurol 2010;68:395–399 [source]


    Point-of-care reversal treatment in phenprocoumon-related intracerebral hemorrhage

    ANNALS OF NEUROLOGY, Issue 6 2010
    Timolaos Rizos MD
    Objective Rapid reversal of the anticoagulatory effect of vitamin K antagonists represents the primary emergency treatment for oral anticoagulant-related intracerebral hemorrhage (OAC-ICH). Predicting the amount of prothrombin complex concentrate (PCC) needed to reverse OAC in individual patients is difficult, and repeated international normalized ratio (INR) measurements in central laboratories (CLs) are time-consuming. Accuracy and effectiveness of point-of-care INR coagulometers (POCs) for INR reversal in OAC-ICH have not been evaluated. Methods In phase 1, the agreement of emergency POC and CL INR measurements was determined. In phase 2, stepwise OAC reversal was performed with PCC using a predetermined dosing schedule. Concordance of POC and CL INR measurements during reversal and time gain due to POC were determined. Results In phase 1 (n = 165), Bland-Altman analysis showed close agreement between POCs and CLs (mean INR deviation 0.04). In phase 2 (n = 26), POCs caused a median initial net time gain of 24 minutes for the start of treatment with PCC. Median time for POC-documented complete OAC reversal was 28 minutes, compared with 120 minutes for CLs. Bland-Altman analysis between POCs and CLs revealed a mean INR deviation of 0.13 during stepwise PCC administration. POCs tended to slightly overestimate the INR, especially at higher INR levels. Remarkably, POC-guided reversal led to a median reduction of 30.5% of PCC dose compared with the a priori dose calculation. Hematomas enlarged in 20% of patients. Interpretation POC INR monitoring is a fast, effective, and economic means of PCC dose-titration in OAC-ICH. Larger studies examining the clinical efficacy of this procedure are warranted. ANN NEUROL 2010;67:788,793 [source]
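
    The phase 1 agreement analysis above rests on Bland-Altman statistics: the mean difference (bias) between paired POC and CL INR values and the 95% limits of agreement. The sketch below shows a minimal version of that calculation in Python; the paired values are simulated under an assumed bias and spread and are not the study data.

```python
# Illustrative sketch: Bland-Altman agreement between paired point-of-care (POC)
# and central laboratory (CL) INR measurements. All values here are simulated.
import numpy as np

rng = np.random.default_rng(3)
cl_inr = rng.uniform(1.5, 4.5, 165)                     # hypothetical CL INRs
poc_inr = cl_inr + rng.normal(0.04, 0.15, cl_inr.size)  # POC with small assumed bias

diff = poc_inr - cl_inr
mean_pair = (poc_inr + cl_inr) / 2
bias = diff.mean()
loa_low = bias - 1.96 * diff.std(ddof=1)
loa_high = bias + 1.96 * diff.std(ddof=1)

print(f"mean POC-CL difference (bias): {bias:.2f} INR units")
print(f"95% limits of agreement: {loa_low:.2f} to {loa_high:.2f}")
# Plotting diff against mean_pair (the Bland-Altman plot) would show whether the
# deviation grows at higher INR values, as the authors observed for POC devices.
```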