Selected Abstracts


Rescuer Fatigue: Standard versus Continuous Chest-Compression Cardiopulmonary Resuscitation

ACADEMIC EMERGENCY MEDICINE, Issue 10 2006
Joseph W. Heidenreich MD
Abstract Objectives: Continuous chest-compression cardiopulmonary resuscitation (CCC-CPR) has been advocated as an alternative to standard CPR (STD-CPR). Studies have shown that CCC-CPR delivers substantially more chest compressions per minute and is easier to remember and perform than STD-CPR. One concern regarding CCC-CPR is that the rescuer may fatigue and be unable to maintain adequate compression rate or depth throughout an average emergency medical services response time. The specific aim of this study was to compare the effects of fatigue on the performance of CCC-CPR and STD-CPR on a manikin model. Methods: This was a prospective, randomized crossover study involving 53 medical students performing CCC-CPR and STD-CPR on a manikin model. Students were randomized to their initial CPR group and then performed the other type of CPR after a period of at least two days. Students were evaluated on their performance of 9 minutes of CPR for each method. The primary endpoint was the number of adequate chest compressions (at least 38 mm of compression depth) delivered per minute during each of the 9 minutes. The secondary endpoints were total compressions, compression rate, and the number of breaks taken for rest. The students' performance was evaluated on the basis of Skillreporter Resusci Anne (Laerdal, Wappingers Falls, NY) recordings. Primary and secondary endpoints were analyzed by using a generalized linear mixed model for count data. Results: In the first 2 minutes, participants delivered significantly more adequate compressions per minute with CCC-CPR than STD-CPR (47 vs. 32, p = 0.004 in the 1st minute and 39 vs. 29, p = 0.04 in the 2nd minute). For minutes 3 through 9, the differences in number of adequate compressions between groups were not significant. Evaluating the 9 minutes of CPR as a whole, there were significantly more adequate compressions in CCC-CPR vs. STD-CPR (p = 0.0003). Although the number of adequate compressions per minute declined over time in both groups, the rate of decline was significantly greater in CCC-CPR compared with STD-CPR (p = 0.0003). The mean number of total compressions delivered in the first minute was significantly greater with CCC-CPR than STD-CPR (105 per minute vs. 58 per minute, p < 0.001) and did not change over 9 minutes in either group. There were no differences in compression rates or number of breaks between groups. Conclusions: CCC-CPR resulted in more adequate compressions per minute than STD-CPR for the first 2 minutes of CPR. However, the difference diminished after 3 minutes, presumably as a result of greater rescuer fatigue with CCC-CPR. Overall, CCC-CPR resulted in more total compressions per minute than STD-CPR during the entire 9 minutes of resuscitation.
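
The primary analysis above, a generalized linear mixed model for repeated count data, can be approximated with a population-averaged Poisson model. Below is a minimal sketch in Python using statsmodels; the data frame, column names, and per-minute counts are simulated placeholders, not the study's measurements.

```python
# Sketch: population-averaged Poisson model for repeated per-minute counts,
# approximating the abstract's "generalized linear mixed model for count data".
# All data below are simulated placeholders, not the study's measurements.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
rows = []
for subject in range(53):                      # 53 students, crossover design
    for method in ("CCC", "STD"):
        base = 45 if method == "CCC" else 32   # assumed starting rates
        decline = 3.0 if method == "CCC" else 1.5
        for minute in range(1, 10):            # 9 minutes of CPR
            lam = max(base - decline * (minute - 1), 5)
            rows.append({"subject": subject, "method": method,
                         "minute": minute, "adequate": rng.poisson(lam)})
df = pd.DataFrame(rows)

# GEE with an exchangeable correlation handles repeated measures per subject;
# the method-by-minute interaction tests whether decline differs by technique.
model = smf.gee("adequate ~ C(method) * minute", groups="subject", data=df,
                family=sm.families.Poisson(),
                cov_struct=sm.cov_struct.Exchangeable())
print(model.fit().summary())
```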


Physical performance limitations among adult survivors of childhood brain tumors

CANCER, Issue 12 2010
Kirsten K. Ness PhD
Abstract BACKGROUND: Young adult survivors of childhood brain tumors (BTs) may have late effects that compromise physical performance and everyday task participation. The objective of this study was to evaluate muscle strength, fitness, physical performance, and task participation among adult survivors of childhood BTs. METHODS: In-home evaluations and interviews were conducted for 156 participants (54% men). Results on measures of muscle strength, fitness, physical performance, and participation were compared between BT survivors and members of a population-based comparison group by using chi-square statistics and 2-sample t tests. Associations between late effects and physical performance and between physical performance and participation were evaluated in regression models. RESULTS: The median age of BT survivors was 22 years (range, 18-58 years) at the time of the current evaluation, and they had survived for a median of 14.7 years (range, 6.5-45.9 years) postdiagnosis. Survivors had lower estimates of grip strength (women, 24.7 ± 9.2 kg vs 31.5 ± 5.8 kg; men, 39.0 ± 12.2 kg vs 53.0 ± 10.1 kg), knee extension strength (women, 246.6 ± 95.5 Newtons [N] vs 331.5 ± 5.8 N; men, 304.7 ± 116.4 N vs 466.6 ± 92.1 N), and peak oxygen uptake (women, 25.1 ± 8.8 mL/kg per minute vs 31.3 ± 5.1 mL/kg per minute; men, 24.6 ± 9.5 mL/kg per minute vs 33.2 ± 3.4 mL/kg per minute) than members of the population-based comparison group. Physical performance was lower among survivors and was associated with not living independently (odds ratio [OR], 5.0; 95% confidence interval [CI], 2.0-12.2) and not attending college (OR, 2.3; 95% CI, 1.2-4.4). CONCLUSIONS: Muscle strength and fitness values among BT survivors were similar to those among individuals aged ≥60 years and were associated with physical performance limitations. Physical performance limitations were associated with poor outcomes in home and school environments. The current data indicated an opportunity for interventions targeted at improving long-term physical function in this survivor population. Cancer 2010. © 2010 American Cancer Society.
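
Because the abstract reports its group comparisons as mean ± SD, a two-sample test can be reproduced from the summary statistics alone. A minimal sketch with SciPy; the group sizes are assumptions, since the abstract gives only the survivor total (156, 54% men) and not the comparison group's size.

```python
# Welch's t-test from published summary statistics (no raw data needed).
# Group sizes below are assumptions for illustration; the abstract reports
# only the survivor total (156, 54% men), not the comparison group's n.
from scipy.stats import ttest_ind_from_stats

# Grip strength, women: survivors 24.7 +/- 9.2 kg vs comparison 31.5 +/- 5.8 kg
t, p = ttest_ind_from_stats(mean1=24.7, std1=9.2, nobs1=72,   # assumed n
                            mean2=31.5, std2=5.8, nobs2=72,   # assumed n
                            equal_var=False)                   # Welch's test
print(f"t = {t:.2f}, p = {p:.2g}")
```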


A Comparison of Computerized and Pencil-and-Paper Tasks in Assessing Cognitive Function in Community-Dwelling Older People in the Newcastle 85+ Pilot Study

JOURNAL OF THE AMERICAN GERIATRICS SOCIETY, Issue 10 2007
Joanna Collerton MRCP
OBJECTIVES: To compare the acceptability and feasibility of computerized and pencil-and-paper tests of cognitive function in 85-year-old people. DESIGN: Group comparison of participants randomly allocated to pencil-and-paper (Wechsler Adult Intelligence and Memory Scales) or computerized (Cognitive Drug Research) tests of verbal memory and attention. SETTING: The Newcastle 85+ Pilot Study was the precursor to the Newcastle 85+ Study, a United Kingdom Medical Research Council/Biotechnology and Biological Sciences Research Council cohort study of health and aging in the oldest-old age group. PARTICIPANTS: Eighty-one community-dwelling individuals aged 85. MEASUREMENTS: Participant and researcher acceptability, completion rates, time taken, validity as cognitive measures, and psychometric utility. RESULTS: Participants randomized to computerized tests were less likely to rate the cognitive function tests as difficult (odds ratio (OR) = 0.16, 95% confidence interval (CI) = 0.07-0.39), stressful (OR = 0.18, 95% CI = 0.07-0.45), or unacceptable (OR = 0.18, 95% CI = 0.08-0.48) than those randomized to pencil-and-paper tests. Researchers were also less likely to rate participants as being distressed in the computer test group (OR = 0.19, 95% CI = 0.07-0.46). Pencil-and-paper tasks took participants less time to complete (mean ± standard deviation 18 ± 4 minutes vs 26 ± 4 minutes) but had fewer participants who could complete all tasks (91% vs 100%). Both types of task were equally good measures of cognitive function. CONCLUSION: Computerized and pencil-and-paper tests are both feasible and useful means of assessing cognitive function in the oldest-old age group. Computerized tests are more acceptable to participants and administrators.
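
The acceptability results are odds ratios with Wald 95% confidence intervals, which follow from a 2×2 table via log(OR) ± 1.96·SE. A minimal sketch; the cell counts are hypothetical, since the abstract reports only the resulting ORs.

```python
# Odds ratio and Wald 95% CI from a 2x2 table. Cell counts are hypothetical;
# the abstract reports only OR = 0.16 (0.07-0.39) for rating the tests
# "difficult".
import math

#                 rated difficult   not difficult
a, b = 6, 34    # computerized group (hypothetical counts)
c, d = 22, 19   # pencil-and-paper group (hypothetical counts)

or_ = (a * d) / (b * c)
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
lo = math.exp(math.log(or_) - 1.96 * se_log_or)
hi = math.exp(math.log(or_) + 1.96 * se_log_or)
print(f"OR = {or_:.2f}, 95% CI = {lo:.2f}-{hi:.2f}")
```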


Thiazide Diuretics Affect Osteocalcin Production in Human Osteoblasts at the Transcription Level Without Affecting Vitamin D3 Receptors

JOURNAL OF BONE AND MINERAL RESEARCH, Issue 5 2000
D. Lajeunesse
Abstract Besides their natriuretic and calciuretic effects, thiazide diuretics have been shown to decrease the rate of bone loss and improve bone mineral density. Clinical evidence suggests a specific role of thiazides on osteoblasts, because they reduce serum osteocalcin (OC), an osteoblast-specific protein, yet the mechanisms implicated are unknown. We therefore investigated the role of hydrochlorothiazide (HCTZ) on OC production by the human osteoblast-like cell line MG-63. HCTZ dose-dependently (1-100 μM) inhibited 1,25-dihydroxyvitamin D3 [1,25(OH)2D3]-induced OC release by these cells (maximal effect, −40% to −50%; p < 0.005 by analysis of variance [ANOVA]) as measured by ELISA. This effect of HCTZ on OC release was caused by a direct effect on OC gene expression, because Northern blot analysis revealed that OC messenger RNA (mRNA) levels were reduced in the presence of increasing doses of the diuretic (−47.2 ± 4.0%; p < 0.0001 by paired ANOVA with 100 μM HCTZ). HCTZ (100 μM) also stimulated calcium (Ca2+) uptake (8.26 ± 1.78 pmol/mg protein/15 minutes vs. 13.6 ± 0.49 pmol/mg protein/15 minutes; p < 0.05) in MG-63 cells. Reducing extracellular Ca2+ concentration with 0.5 mM EDTA or 0.5 mM ethylene glycol-bis(β-aminoethyl ether)-N,N,N′,N′-tetraacetic acid (EGTA) only partly prevented the inhibitory effect of the diuretic on OC secretion (maximal effect, −22.5 ± 6.9%), suggesting that thiazide-dependent Ca2+ influx is not sufficient to elicit the inhibition of OC secretion. Because OC production is strictly dependent on the presence of 1,25(OH)2D3 in human osteoblasts, we next evaluated the possible role of HCTZ on vitamin D3 receptors (VDR) at the mRNA and protein levels. Both Northern and Western blot analyses showed no effect of HCTZ (1-100 μM) on VDR levels. The presence of EGTA in the culture media slightly reduced the VDR mRNA levels under basal conditions, but this was not modified in the presence of increasing levels of HCTZ. The OC gene promoter is also under the control of transcription factors such as Yin Yang 1 (YY1) and cFOS. Western blot analysis revealed no changes in YY1 levels in response to HCTZ, either in the presence or in the absence of 0.5 mM EGTA in the culture media. In contrast, HCTZ induced a dose-dependent increase in cFOS levels (p < 0.002 by ANOVA), a situation prevented by incubation with EGTA. These studies indicate that HCTZ inhibits OC mRNA expression independently of an effect on VDR, YY1, or extracellular Ca2+ levels but involves changes in cFOS levels. As OC retards bone formation/mineralization, the inhibition of OC production by HCTZ could explain its preventive role against bone loss. (J Bone Miner Res 2000;15:894-901)
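
The dose-response effects above were tested by ANOVA. A minimal sketch of a one-way ANOVA across HCTZ doses with SciPy; the replicate values and group means are simulated placeholders, not the study's measurements.

```python
# One-way ANOVA across HCTZ doses, mirroring the abstract's dose-response
# test. Replicate values are simulated placeholders, not the study's data.
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(1)
# Assumed mean OC release (% of control) at each dose, for illustration only.
dose_means = {0: 100, 1: 95, 10: 80, 100: 55}
groups = [rng.normal(mean, 8, size=6)        # 6 replicate wells per dose
          for mean in dose_means.values()]

f, p = f_oneway(*groups)
print(f"F = {f:.2f}, p = {p:.2g}")
```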


Harvesting of the Radial Artery for Coronary Artery Bypass Grafting: Comparison of Ultrasonic Harmonic Scalpel Dissector with the Conventional Technique

JOURNAL OF CARDIAC SURGERY, Issue 3 2009
Hosam F. Fawzy M.D.
We started routine use of the ultrasonic dissecting scalpel in harvesting radial arteries, aiming to minimize harvesting time, improve graft quality, and reduce wound complications. Methods: Radial artery harvesting using the harmonic scalpel (HS; 43 patients) was compared with the conventional technique (hemostatic clips and scissors; 53 patients). To avoid spasm, the radial artery was not skeletonized, and papaverine was routinely used to irrigate the radial artery in all patients. Results: Compared to the conventional technique, radial artery harvesting using the HS had a significantly shorter harvesting time (25 minutes vs. 50 minutes, p < 0.001) and required a significantly smaller number of hemostatic clips (3 vs. 40, p < 0.001). In situ free blood flow was significantly higher in the HS group (80 mL/min vs. 40 mL/min, p < 0.001). There was no forearm wound infection in the HS group. There was no graft failure, reoperation for bleeding, or hand ischemia with the use of either technique. Conclusion: Harvesting the radial artery using the HS is less time consuming, requires fewer hemostatic clips, and is relatively atraumatic, yielding a good-quality graft.


Initial Clinical Experience with Cardiac Resynchronization Therapy Utilizing a Magnetic Navigation System

JOURNAL OF CARDIOVASCULAR ELECTROPHYSIOLOGY, Issue 2 2007
PETER GALLAGHER M.D.
Introduction: The placement of left ventricular (LV) leads during cardiac resynchronization therapy (CRT) involves many technical difficulties. These difficulties increase procedural times and decrease procedural success rates. Methods and Results: A total of 50 patients with severe cardiomyopathy (mean LV ejection fraction 21 ± 6%) and a wide QRS underwent CRT implantation. Magnetic navigation (Stereotaxis, Inc.) was used to position a magnet-tipped 0.014″ guidewire (Cronus™ guidewire) within the coronary sinus (CS) vasculature. LV leads were placed in a lateral CS branch, either using a standard CS delivery sheath or using a "bare-wire" approach without a CS delivery sheath. The mean total procedure time was 98.1 ± 29.1 minutes with a mean fluoroscopy time of 22.7 ± 15.1 minutes. The mean LV lead positioning time was 10.4 ± 7.6 minutes. The use of a delivery sheath was associated with longer procedure times (98 ± 32 minutes vs 80 ± 18 minutes; P = 0.029), fluoroscopy times (23 ± 15 minutes vs 13 ± 4 minutes; P = 0.0007), and LV lead positioning times (10 ± 6 minutes vs 4 ± 2 minutes; P = 0.015) when compared to a "bare-wire" approach. When compared with 52 nonmagnetic-assisted control CRT cases, magnetic navigation reduced total LV lead positioning times (10.4 ± 7.6 minutes vs 18.6 ± 18.9 minutes; P = 0.005). If more than one CS branch vessel was tested, magnetic navigation was associated with significantly shorter times for LV lead placement (16.2 ± 7.7 minutes vs 36.4 ± 23.4 minutes; P = 0.004). Conclusions: Magnetic navigation is a safe, feasible, and efficient tool for lateral LV lead placement during CRT. Magnetic navigation during CRT allows for control of the tip direction of the Cronus™ 0.014″ guidewire using either a standard CS delivery sheath or a "bare-wire" approach. Although there are some important limitations to the 0.014″ Cronus™ guidewire, magnetic navigation can decrease LV lead placement times compared with nonmagnetic-assisted control CRT cases, particularly if multiple CS branches are to be tested.
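
The key comparison, magnetic-assisted versus control LV lead positioning time (10.4 ± 7.6 minutes in 50 patients vs 18.6 ± 18.9 minutes in 52 controls), can be checked from the published summaries alone. The abstract does not name the test used; the sketch below assumes an unpaired Welch's t-test.

```python
# Re-checking the LV lead positioning comparison from the reported summaries
# (10.4 +/- 7.6 min, n = 50 vs 18.6 +/- 18.9 min, n = 52). The abstract does
# not state which test was used; Welch's unpaired t-test is assumed here.
from scipy.stats import ttest_ind_from_stats

t, p = ttest_ind_from_stats(mean1=10.4, std1=7.6,  nobs1=50,
                            mean2=18.6, std2=18.9, nobs2=52,
                            equal_var=False)
print(f"t = {t:.2f}, p = {p:.4f}")   # lands near the reported P = 0.005
```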


Comparison of Cool Tip Versus 8-mm Tip Catheter in Achieving Electrical Isolation of Pulmonary Veins for Long-Term Control of Atrial Fibrillation: A Prospective Randomized Pilot Study

JOURNAL OF CARDIOVASCULAR ELECTROPHYSIOLOGY, Issue 10 2006
SANJAY DIXIT M.D.
Objective: To compare the safety and efficacy of 8-mm versus cooled tip catheters in achieving electrical isolation (EI) of pulmonary veins (PV) for long-term control of atrial fibrillation (AF). Background: There is a paucity of studies comparing the safety and efficacy of 8-mm and cooled tip catheters in patients undergoing AF ablation. Methods and Results: This was a randomized and patient-blinded study. Subjects were followed by clinic visits (at 6 weeks and 6 months) and transtelephonic monitoring (3-week duration) done around each visit. Primary endpoints were: (1) long-term AF control (complete freedom and/or >90% reduction in AF burden on or off antiarrhythmic drugs at 6 months after a single ablation), and (2) occurrence of serious adverse events (cardiac tamponade, stroke, LA-esophageal fistula, and/or death). Eighty-two patients (age 56 ± 9 years, 60 males, paroxysmal AF = 59) were randomized (42 patients to 8-mm tip and 40 patients to cooled tip). EI of PVs was achieved in a shorter time with the 8-mm tip than with the cooled tip catheter (40 ± 23 minutes vs 50 ± 30 minutes; P < 0.05), but long-term AF control was not different between the two (32 patients [78%] vs 28 patients [70%], respectively; P = NS). One serious adverse event occurred in each group (LA-esophageal fistula and stroke, respectively), and no significant PV stenosis was observed in either. Conclusion: EI of PVs using either an 8-mm or a cooled tip catheter results in long-term AF control in the majority after a single ablation procedure, with comparable efficacy and safety.
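
The long-term AF control endpoint is a comparison of two proportions (32 of 42 vs 28 of 40). A minimal sketch of a two-proportion z-test with statsmodels, using the counts as reported.

```python
# Two-proportion z-test for long-term AF control after a single ablation:
# 32 of 42 patients (8-mm tip) vs 28 of 40 (cooled tip), as reported.
from statsmodels.stats.proportion import proportions_ztest

count = [32, 28]   # responders per arm
nobs = [42, 40]    # patients randomized per arm
z, p = proportions_ztest(count, nobs)
print(f"z = {z:.2f}, p = {p:.2f}")   # non-significant, consistent with "P = NS"
```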


Creating Continuous Linear Lesions in the Atria: A Comparison of the Multipolar Ablation Technique Versus the Conventional Drag-and-Burn

JOURNAL OF CARDIOVASCULAR ELECTROPHYSIOLOGY, Issue 8 2005
WILBER W. SU M.D.
Introduction: Catheter-based treatment of atrial fibrillation (AF) requires the isolation of the triggering foci as well as modification of the atrial substrate that sustains AF. The creation of linear lesions in the left atrium with standard radiofrequency ablative methods requires long procedural times with unpredictable results. Methods: The simultaneous delivery of phase-shifted radiofrequency energy from a multipolar catheter was compared to the conventional drag-and-burn technique for creating linear lesions in 10 dogs. Four atrial sites were targeted in each dog under intracardiac ultrasound and fluoroscopic guidance. The conventional drag-and-burn technique or the multipolar phase-shifted ablation catheter was randomly applied for 60 seconds and the results compared. Results: Creating linear lesions using the simultaneous multipolar phase-shifted ablation catheter was on average 11.0 minutes faster (33.6 minutes vs 44.6 minutes, P < 0.01) than the drag-and-burn method. The fraction of the intended lesion length actually achieved was 23% greater with phase-shifted ablation (76% vs 53%, P < 0.01), with fewer discontinuities (0.1 vs 0.8 discontinuities/line, P < 0.003). There was no significant difference in either lesion transmurality or fluoroscopy times. Conclusion: The simultaneous delivery of phase-shifted radiofrequency energy using a multipolar catheter is more effective and efficient in producing linear lesions than the traditional drag-and-burn technique. Using the multipolar ablative method to create linear lesions may be a useful technique in the treatment of patients with substrate-mediated atrial fibrillation.


The Implementation of Intranasal Fentanyl for Children in a Mixed Adult and Pediatric Emergency Department Reduces Time to Analgesic Administration

ACADEMIC EMERGENCY MEDICINE, Issue 2 2010
Anna Holdgate MBBS, FACEM
ACADEMIC EMERGENCY MEDICINE 2010; 17:1-4 © 2010 by the Society for Academic Emergency Medicine Abstract Objectives: The objective was to determine whether the introduction of intranasal (IN) fentanyl for children with acute pain would reduce the time to analgesic administration in a mixed adult and pediatric emergency department (ED). Methods: A protocol for IN fentanyl (1.5 μg/kg) for children age 1-15 years presenting with acute pain was introduced to the department. All children who received intravenous (IV) morphine in the 7 months prior to the introduction of the protocol and either IV morphine or IN fentanyl in the 7 months after the introduction of the protocol were identified from drug registers. Time to analgesic administration, time to see a doctor, and the ages of patients were compared between the periods before and after the introduction of IN fentanyl. Results: Following implementation, 81 patients received IN fentanyl and 37 received IV morphine, compared to 63 patients receiving morphine in the previous 7 months. The median time to analgesic administration for IN fentanyl was significantly shorter than for morphine (32 minutes vs. 63 minutes, p = 0.001). Children receiving fentanyl were significantly younger than those receiving morphine (median = 8.5 years vs. 12 years, p < 0.001). Conclusions: This study demonstrates that children treated with IN fentanyl received analgesic medication faster than those treated with IV morphine in a mixed ED. Younger children were more likely to receive opioid analgesia following the introduction of fentanyl.
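
Time-to-analgesia data are typically right-skewed, which is why medians are reported; a rank-based test such as the Mann-Whitney U is the usual comparison. A minimal sketch; the abstract gives only the medians, so the log-normal distributions below are assumptions for illustration.

```python
# Rank-based comparison of time-to-analgesia, as reported medians suggest
# skewed data. The abstract gives only medians (32 vs 63 minutes); the
# log-normal distributions below are assumptions for illustration.
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(2)
fentanyl = rng.lognormal(mean=np.log(32), sigma=0.6, size=81)   # n = 81
morphine = rng.lognormal(mean=np.log(63), sigma=0.6, size=100)  # 37 + 63

u, p = mannwhitneyu(fentanyl, morphine, alternative="two-sided")
print(f"U = {u:.0f}, p = {p:.2g}")
```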


A single center comparison of one versus two venous anastomoses in 564 consecutive DIEP flaps: Investigating the effect on venous congestion and flap survival

MICROSURGERY, Issue 3 2010
Morteza Enajat M.D.
Background: Venous complications have been reported as the more frequently encountered vascular complications seen in the transfer of deep inferior epigastric artery (DIEA) perforator (DIEP) flaps, and a variety of techniques have been described for augmenting the venous drainage of these flaps to minimize venous congestion. Such techniques have not been shown to be of clinical benefit on a large scale, due to the small number of cases in published series. Methods: A retrospective study of 564 consecutive DIEP flaps at a single institution was undertaken, comparing the prospective use of one venous anastomosis (273 cases) to two anastomoses (291 cases). The secondary donor vein comprised a second DIEA vena comitans in 7.9% of cases and a superficial inferior epigastric vein (SIEV) in 92.1%. Clinical outcomes were assessed, in particular rates of venous congestion. Results: The use of two venous anastomoses resulted in a significant reduction in the number of cases of venous congestion, to zero (0 vs. 7, P = 0.006). All other outcomes were similar between groups. Notably, the use of a secondary vein did not result in any significant increase in operative time (385 minutes vs. 383 minutes, P = 0.57). Conclusions: The use of a secondary vein in the drainage of a DIEP flap can significantly reduce the incidence of venous congestion, with no detriment to complication rates. Consideration of incorporating both the superficial and deep venous systems is an approach that may further improve the venous drainage of the flap. © 2009 Wiley-Liss, Inc. Microsurgery, 2010.
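
With zero congestion events in one arm, Fisher's exact test is the natural choice, and the reported P = 0.006 can be checked directly from the published counts. A minimal sketch with SciPy.

```python
# Fisher's exact test on the published venous-congestion counts:
# 7 of 273 flaps with one venous anastomosis vs 0 of 291 with two.
from scipy.stats import fisher_exact

table = [[7, 273 - 7],   # one anastomosis: congested, not congested
         [0, 291]]       # two anastomoses: congested, not congested
odds_ratio, p = fisher_exact(table)
print(f"p = {p:.3f}")    # close to the reported P = 0.006
```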


Catheter Ablation for Paroxysmal Atrial Fibrillation: A Randomized Comparison between Multielectrode Catheter and Point-by-Point Ablation

PACING AND CLINICAL ELECTROPHYSIOLOGY, Issue 9 2010
ALAN BULAVA M.D., Ph.D.
Introduction: Catheter ablation for paroxysmal atrial fibrillation is widely used for patients with drug-refractory paroxysms of arrhythmia. Recently, novel technologies have been introduced to the market that aim to simplify and shorten the procedure. Aim: To compare the clinical outcome of pulmonary vein (PV) isolation using a multipolar circular ablation catheter (PVAC group) with point-by-point PV isolation using an irrigated-tip ablation catheter and the CARTO mapping system (CARTO group; CARTO, Biosense Webster, Diamond Bar, CA, USA). Methods: Patients with documented PAF were randomized to undergo PV isolation using PVAC or CARTO. Atrial fibrillation (AF) recurrences were documented by serial 7-day Holter monitoring. Results: One hundred and two patients (mean age 58 ± 11 years, 68 men) were included in the study. The patients had comparable baseline clinical characteristics, including left atrial dimensions and left ventricular ejection fraction, in both study arms (PVAC: n = 51 and CARTO: n = 51). Total procedural and fluoroscopic times were significantly shorter in the PVAC group (107 ± 31 minutes vs 208 ± 46 minutes, P < 0.0001 and 16 ± 5 minutes vs 28 ± 8 minutes, P < 0.0001, respectively). AF recurrence was documented in 23% and 29% of patients in the PVAC and CARTO groups, respectively (P = 0.8), during a mean follow-up of 200 ± 13 days. No serious complications were noted in either study group. Conclusions: Clinical success rates of PV isolation are similar with a multipolar circular PV ablation catheter and with point-by-point ablation using a three-dimensional (3D) navigation system in patients with PAF; the multielectrode approach results in shorter procedural and fluoroscopic times with a comparable safety profile. (PACE 2010; 33:1039-1046)


Importance of Anterograde Visualization of the Coronary Venous Network by Selective Left Coronary Angiography Prior To Resynchronization

PACING AND CLINICAL ELECTROPHYSIOLOGY, Issue 1 2007
NICOLAS DELARCHE M.D.
Background: Understanding of coronary anatomy is essential to the advancement of cardiac resynchronization therapy (CRT) techniques. We determined whether the difficulties associated with catheterization of the coronary sinus (CS) and its lateral branches could be overcome by a preliminary angiographic study of the coronary venous system carried out during a pre-operative coronary angiography with examination of venous return. Methods and Results: All patients were scheduled for an exploratory angiography procedure and indicated for CRT. Group A patients were implanted with a CRT device after a right arterial angiographic procedure, while group B patients had a selective left angiogram including examination of venous return. Data analyzed in group B were: position of the CS ostium, number and distribution of lateral branches, and ability to preselect a marginal vein suitable for catheterization. Subsequent device implantation was guided by these parameters. A total of 96 and 89 patients were included in groups A and B, respectively. Implantation success rates were not different (98% and 100%, respectively), but CS catheterization time was reduced in group B (6 minutes vs 4 minutes; P < 10⁻⁶), as were the total time required to position the left ventricular lead (25 minutes vs 15 minutes; P < 10⁻⁶), fluoroscopy exposure (7 minutes vs 5 minutes; P < 10⁻⁶), and the volume of contrast medium required (45 mL vs 15 mL; P < 10⁻⁶). Conclusion: A coronary angiographic study, including examination of the coronary venous return prior to implantation of a CRT device, can simplify the device implant and allows patient-specific preselection of appropriate tools for the procedure.


Atrial Lead Placement During Atrial Fibrillation: Is Restitution of Sinus Rhythm Required for Proper Lead Function?


PACING AND CLINICAL ELECTROPHYSIOLOGY, Issue 7 2000
Unexpected atrial fibrillation (AF) during implantation of an atrial pacemaker lead is sometimes encountered. Intra-operative cardioversion may lengthen and complicate the implantation process. This study prospectively investigates the performance of atrial leads implanted during AF (group A) and compares atrial sensing and pacing properties to those of an age- and sex-matched control group in which sinus rhythm had been restored before atrial lead placement (group B). Patient groups consisted of 32 patients each. All patients received DDDR pacemakers and bipolar, steroid-eluting, active fixation atrial leads. In patients with AF at the time of implantation (group A), a minimal intracardiac fibrillatory amplitude of at least 1.0 mV was required for acceptable atrial lead placement. In patients with restored sinus rhythm (group B), a voltage threshold < 1.5 V at 0.5 ms and a minimal atrial potential amplitude > 1.5 mV were required. Patients of group A in whom spontaneous conversion to sinus rhythm did not occur within 4 weeks after implantation underwent electrical cardioversion to sinus rhythm. Pacemaker interrogations were performed 3, 6, and 12 months after implantation. In group A, implantation time was significantly shorter as compared to group B (58.7 ± 8.6 minutes vs 73.0 ± 17.3 minutes, P < 0.001). Mean atrial potential amplitude during AF was correlated with the telemetered atrial potential during sinus rhythm (r = 0.49, P < 0.001), but not with the atrial stimulation threshold. Twelve months after implantation, sensing thresholds (1.74 ± 0.52 mV vs 1.78 ± 0.69 mV, P = 0.98) and stimulation thresholds (1.09 ± 0.42 V vs 1.01 ± 0.31 V, P = 0.66) did not differ between groups A and B. However, in three patients of group A, the chronic atrial sensing threshold was ≤1 mV, requiring atrial sensitivities of at least 0.35 mV to achieve reliable atrial sensing. Atrial lead placement during AF is feasible and reduces implantation time. However, bipolar atrial leads and the option to program high atrial sensitivities are required.
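
The reported association between fibrillatory amplitude at implant and the later sinus-rhythm potential (r = 0.49) is a Pearson correlation. A minimal sketch; the paired amplitudes are simulated placeholders, not the study's measurements.

```python
# Pearson correlation between the atrial potential amplitude during AF and
# the telemetered amplitude in sinus rhythm (abstract: r = 0.49, P < 0.001).
# The paired amplitudes below are simulated placeholders.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(3)
af_amplitude = rng.normal(2.0, 0.7, size=32)   # mV during AF, n = 32 patients
sr_amplitude = 0.5 * af_amplitude + rng.normal(1.0, 0.6, size=32)

r, p = pearsonr(af_amplitude, sr_amplitude)
print(f"r = {r:.2f}, p = {p:.3g}")
```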


Racial Disparities in Emergency Department Length of Stay for Admitted Patients in the United States

ACADEMIC EMERGENCY MEDICINE, Issue 5 2009
Jesse M. Pines MD
Abstract Objectives: Recent studies have demonstrated the adverse effects of prolonged emergency department (ED) boarding times on outcomes. The authors sought to examine racial disparities across U.S. hospitals in ED length of stay (LOS) for admitted patients, which may serve as a proxy for boarding time in data sets where the actual time of admission is unavailable. Specifically, the study estimated both the within- and among-hospital effects of black versus non-black race on LOS for admitted patients. Methods: The authors studied 14,516 intensive care unit (ICU) and non-ICU admissions in 408 EDs in the National Hospital Ambulatory Medical Care Survey (NHAMCS; 2003-2005). The main outcomes were ED LOS (triage to transfer to inpatient bed) and the proportion of patients with prolonged LOS (>6 hours). The effects of black versus non-black race on LOS were decomposed to distinguish racial disparities between patients at the same hospital (within-hospital component) and between hospitals that serve higher proportions of black patients (among-hospital component). Results: In the unadjusted analyses, ED LOS was significantly longer for black patients admitted to ICU beds (367 minutes vs. 290 minutes) and non-ICU beds (397 minutes vs. 345 minutes). For admissions to ICU beds, the within-hospital estimates suggested that blacks were at higher risk for ED LOS of >6 hours (odds ratio [OR] = 1.42, 95% confidence interval [CI] = 1.01 to 2.01), while the among-hospital differences were not significant (OR = 1.08 for each 10% increase in the proportion of black patients, 95% CI = 0.96 to 1.23). By contrast, for non-ICU admissions, the within-hospital racial disparities were not significant (OR = 1.12, 95% CI = 0.94 to 1.23), but the among-hospital differences were significant (OR = 1.13, 95% CI = 1.04 to 1.22) per 10-percentage-point increase in the proportion of blacks admitted to a hospital. Conclusions: Black patients who are admitted to the hospital through the ED have longer ED LOS compared to non-blacks, indicating that racial disparities may exist across U.S. hospitals. The disparity for non-ICU patients might be accounted for by among-hospital differences, where hospitals with a higher proportion of blacks have longer waits. The disparity for ICU patients is better explained by within-hospital differences, where blacks have longer wait times than non-blacks in the same hospital. However, there may be additional unmeasured clinical or socioeconomic factors that explain these results.
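
One common way to separate within- and among-hospital effects, consistent with the decomposition described above (though the authors' exact model is not given), is to regress the outcome on the hospital-level mean of the race indicator plus each patient's deviation from that mean. A minimal sketch with simulated data; all variable names and values are placeholders for the NHAMCS records.

```python
# Within- vs among-hospital decomposition of the race effect on prolonged
# ED LOS (Mundlak-style): regress on the hospital mean of the indicator
# (among-hospital effect) and the patient's deviation from it (within-
# hospital effect). All data are simulated placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
hospitals = rng.integers(0, 408, size=14516)             # 408 EDs, 14,516 visits
p_black_hosp = rng.beta(2, 8, size=408)                  # hospital case mix
black = rng.binomial(1, p_black_hosp[hospitals])
logit = -1.5 + 0.35 * black + 1.2 * p_black_hosp[hospitals]
prolonged = rng.binomial(1, 1 / (1 + np.exp(-logit)))    # LOS > 6 hours

df = pd.DataFrame({"prolonged": prolonged, "black": black, "hosp": hospitals})
df["hosp_mean"] = df.groupby("hosp")["black"].transform("mean")
df["black_dev"] = df["black"] - df["hosp_mean"]          # within component

fit = smf.logit("prolonged ~ black_dev + hosp_mean", data=df).fit(disp=False)
print(np.exp(fit.params))  # ORs: within-hospital (black_dev), among (hosp_mean)
```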


Impact of an Audit Program and Other Factors on Door-to-balloon Times in Acute ST-elevation Myocardial Infarction Patients Destined for Primary Coronary Intervention

ACADEMIC EMERGENCY MEDICINE, Issue 4 2009
Chao-Lun Lai MD
Abstract Objectives: This before-after study investigated the association between an audit program and door-to-balloon times in patients with acute ST-elevation myocardial infarction (STEMI) and explored other factors associated with the door-to-balloon time. Methods: An audit program that collected time data for essential time intervals in acute STEMI was developed, with data feedback to both the Department of Emergency Medicine and the Department of Cardiology. The door-to-balloon times for 76 consecutive acute STEMI patients, collected from February 16, 2007, through October 31, 2007, after the implementation of the audit program, formed the intervention group. The control group was defined by 104 consecutive acute STEMI patients presenting from April 1, 2006, through February 15, 2007, before the audit was applied. A multivariate linear regression model was used for analysis of factors associated with the door-to-balloon time. Results: The geometric mean (95% CI) of the door-to-balloon time decreased from 164.9 (150.3 to 180.9) minutes to 141.9 (127.4 to 158.2) minutes (p = 0.039) in the intervention phase. The median door-to-balloon time was 147.5 minutes in the control group and 136.0 minutes in the intervention group (p = 0.09). In the multivariate regression model, the audit program was associated with a shortening of the door-to-balloon time by 35.5 minutes (160.4 minutes vs. 195.9 minutes, p = 0.004); female gender was associated with a mean delay of 58.4 minutes (208.9 minutes vs. 150.5 minutes; p = 0.001); posterolateral wall infarction was associated with a mean delay of 70.5 minutes compared to anterior wall infarction (215.4 minutes vs. 144.9 minutes; p = 0.037) and a mean delay of 69.5 minutes compared to inferior wall infarction (215.4 minutes vs. 145.9 minutes; p = 0.044). The use of a glycoprotein IIb/IIIa inhibitor was associated with a 46.1-minute mean shortening of the door-to-balloon time (155.7 minutes vs. 201.8 minutes; p < 0.001). Conclusions: The implementation of an audit program was associated with a significant reduction in door-to-balloon times among patients with acute STEMI. In addition, female gender, posterolateral infarction territory, and nonuse of a glycoprotein IIb/IIIa inhibitor were associated with longer door-to-balloon times.
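
Geometric means and the ratio-style covariate effects above follow from modeling the logarithm of door-to-balloon time and exponentiating the coefficients. A minimal sketch; the column names and data are simulated placeholders, and only two of the study's covariates are included.

```python
# Geometric-mean analysis of door-to-balloon time: model log(time) by OLS,
# then exponentiate coefficients to get multiplicative effects. Column names
# and values are simulated placeholders for the study's 180 patients.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n_pre, n_post = 104, 76                       # control and intervention sizes
df = pd.DataFrame({
    "audit": [0] * n_pre + [1] * n_post,
    "female": rng.binomial(1, 0.3, n_pre + n_post),
})
# Multiplicative effects on the time scale are additive on the log scale.
log_t = (np.log(165) - 0.15 * df["audit"] + 0.3 * df["female"]
         + rng.normal(0, 0.4, len(df)))
df["d2b"] = np.exp(log_t)

fit = smf.ols("np.log(d2b) ~ audit + female", data=df).fit()
print(np.exp(fit.params))                     # geometric-mean ratios
```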


Gender Disparity in Analgesic Treatment of Emergency Department Patients with Acute Abdominal Pain

ACADEMIC EMERGENCY MEDICINE, Issue 5 2008
Esther H. Chen MD
Abstract Objectives:, Oligoanalgesia for acute abdominal pain historically has been attributed to the provider's fear of masking serious underlying pathology. The authors assessed whether a gender disparity exists in the administration of analgesia for acute abdominal pain. Methods:, This was a prospective cohort study of consecutive nonpregnant adults with acute nontraumatic abdominal pain of less than 72 hours' duration who presented to an urban emergency department (ED) from April 5, 2004, to January 4, 2005. The main outcome measures were analgesia administration and time to analgesic treatment. Standard comparative statistics were used. Results:, Of the 981 patients enrolled (mean age ± standard deviation [SD] 41 ± 17 years; 65% female), 62% received any analgesic treatment. Men and women had similar mean pain scores, but women were less likely to receive any analgesia (60% vs. 67%, difference 7%, 95% confidence interval [CI] = 1.1% to 13.6%) and less likely to receive opiates (45% vs. 56%, difference 11%, 95% CI = 4.1% to 17.1%). These differences persisted when gender-specific diagnoses were excluded (47% vs. 56%, difference 9%, 95% CI = 2.5% to 16.2%). After controlling for age, race, triage class, and pain score, women were still 13% to 25% less likely than men to receive opioid analgesia. There was no gender difference in the receipt of nonopioid analgesia. Women waited longer to receive their analgesia (median time 65 minutes vs. 49 minutes, difference 16 minutes, 95% CI = 3.5 to 33 minutes). Conclusions:, Gender bias is a possible explanation for oligoanalgesia in women who present to the ED with acute abdominal pain. Standardized protocols for analgesic administration may ameliorate this discrepancy. [source]