Matched Cohort (matched + cohort)


Terms modified by Matched Cohort

  • matched cohort study

Selected Abstracts


    Tramadol versus Buprenorphine for the Management of Acute Heroin Withdrawal: A Retrospective Matched Cohort Controlled Study

    THE AMERICAN JOURNAL ON ADDICTIONS, Issue 2 2006
    Threlkeld MD
    Many medications have been used over the past thirty years for the treatment of opioid withdrawal, including propoxyphene, methadone, clonidine, parenteral buprenorphine, and, more recently, sublingual buprenorphine. Each has been found to have clinical strengths and limitations. Tramadol is a centrally acting synthetic analgesic with opiate activity primarily due to the binding of a metabolite to the μ receptor. Despite this μ receptor activity, tramadol appears to have low abuse potential and is a non-scheduled analgesic. The pharmacologic profile of tramadol makes it a candidate for opiate withdrawal treatment. A chart review was undertaken to retrospectively compare treatment outcomes of heroin-dependent patients when detoxified with parenteral buprenorphine (1996–1997) versus tramadol (1999–2000). Inclusion criteria for this study were heroin as drug of choice, current opioid physical dependence (i.e., withdrawal symptoms), no current abuse of oral opioid analgesics, and no alcohol or benzodiazepine withdrawal symptoms. Patient cases that met inclusion criteria were group-matched between buprenorphine and tramadol on the basis of age, sex, and amount of heroin used (bags/day). Charts were audited for patient demographics, daily heroin use at admission, withdrawal symptoms, and discharge status. In total, 129 patient charts were reviewed, and 115 met all inclusion criteria and were group-matched (45 patients in the buprenorphine group, 70 in the tramadol group). There were no differences in demographics between the two groups of patients. Fifty-six percent of the buprenorphine group and 71% of the tramadol group completed detoxification; tramadol-treated patients had significantly higher average withdrawal symptoms when compared to the buprenorphine group and a greater reduction in withdrawal symptoms over time. Finally, the number of side effects was small and did not differ between the groups. The results of this study are consistent with previous pilot reports that indicated few clinical differences between parenteral buprenorphine and oral tramadol protocols when used in the management of acute heroin withdrawal. As a consequence, tramadol shows some promise as an opioid withdrawal management medication. [source]
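
    The group-matching step described in this abstract can be illustrated with a short sketch (hypothetical column names and synthetic bands, not the authors' actual procedure): patients from the two treatment eras are binned on sex, an age band, and a heroin-use band, and counts per stratum are compared to check balance.

```python
# Minimal sketch of group-matching on age, sex, and heroin use (bags/day).
# Column names ("age", "sex", "bags_per_day", "treatment") are hypothetical.
import pandas as pd

def stratum_counts(df, treat_col="treatment"):
    """Count patients per matching stratum for each treatment group."""
    df = df.copy()
    # Discretize the continuous matching variables into coarse bands.
    df["age_band"] = pd.cut(df["age"], bins=[17, 25, 35, 45, 65], labels=False)
    df["use_band"] = pd.cut(df["bags_per_day"], bins=[0, 5, 10, 20, 100], labels=False)
    return (df.groupby(["sex", "age_band", "use_band", treat_col])
              .size()
              .unstack(treat_col, fill_value=0))

# A balanced table (similar counts in both treatment columns within each row)
# indicates the groups are comparable on the matching variables.
```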


    Increased Mortality Associated With Low Use of Clopidogrel in Patients With Heart Failure and Acute Myocardial Infarction Not Undergoing Percutaneous Coronary Intervention

    CONGESTIVE HEART FAILURE, Issue 5 2010
    Scott Harris DO
    We studied the association of clopidogrel with mortality in acute myocardial infarction (AMI) patients with heart failure (HF) not receiving percutaneous coronary intervention (PCI). Background. Use of clopidogrel after AMI is low in patients with HF, despite the fact that clopidogrel is associated with absolute mortality reduction in AMI patients. Methods. All patients hospitalized with first-time AMI (2000 through 2005) and not undergoing PCI within 30 days from discharge were identified in national registers. Patients with HF treated with clopidogrel were matched by propensity score with patients not treated with clopidogrel. Similarly, 2 groups without HF were identified. Risks of all-cause death were obtained by the Kaplan–Meier method and Cox regression analyses. Results. We identified 56,944 patients with first-time AMI. In the matched cohort with HF (n=5050) and a mean follow-up of 1.50 years (SD=1.2), 709 (28.1%) and 812 (32.2%) deaths occurred in patients receiving and not receiving clopidogrel treatment, respectively (P=.002). The corresponding numbers for patients without HF (n=6092), with a mean follow-up of 2.05 years (SD=1.3), were 285 (9.4%) and 294 (9.7%), respectively (P=.83). Patients with HF receiving clopidogrel demonstrated reduced mortality (hazard ratio, 0.86; 95% confidence interval, 0.78–0.95) compared with patients with HF not receiving clopidogrel. No difference was observed among patients without HF (hazard ratio, 0.98; 95% confidence interval, 0.83–1.16). Conclusions. Clopidogrel was associated with reduced mortality in patients with HF who do not undergo PCI after their first-time AMI, whereas this association was not apparent in patients without HF. Further studies of the benefit of clopidogrel in patients with HF and AMI are warranted. Bonde L, Sorensen R, Fosbol EL, et al. Increased mortality associated with low use of clopidogrel in patients with heart failure and acute myocardial infarction not undergoing percutaneous coronary intervention: a nationwide study. J Am Coll Cardiol. 2010;55:1300–1307. [source]
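
    The propensity-score matching described in the Methods can be sketched as follows on synthetic data (hypothetical variable names; not the registry analysis itself): a logistic model predicts clopidogrel treatment from baseline covariates, and each treated patient is greedily paired with the closest untreated patient on the estimated score.

```python
# Minimal sketch of 1:1 propensity-score matching (synthetic data).
import numpy as np
import statsmodels.api as sm

def propensity_match(X, treated):
    """X: (n, p) covariate matrix; treated: boolean array of length n.
    Returns (treated_index, matched_control_index) pairs."""
    # Propensity score = predicted probability of treatment given covariates.
    ps = sm.Logit(treated.astype(float), sm.add_constant(X)).fit(disp=0).predict()
    t_idx = np.where(treated)[0]
    c_idx = np.where(~treated)[0]
    pairs, used = [], set()
    for i in t_idx:
        # Greedy nearest-neighbour match without replacement on the score.
        for j in c_idx[np.argsort(np.abs(ps[c_idx] - ps[i]))]:
            if j not in used:
                used.add(j)
                pairs.append((i, j))
                break
    return pairs

# Usage with synthetic covariates and treatment assignment:
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))
treated = rng.random(500) < 1 / (1 + np.exp(-X[:, 0]))
print(len(propensity_match(X, treated)), "matched pairs")
```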


    The International Quotidian Hemodialysis Registry: Rationale and challenges

    HEMODIALYSIS INTERNATIONAL, Issue 2008
    Robert M. LINDSAY
    Abstract Outcomes from conventional thrice-weekly hemodialysis (CHD) are disappointing for a life-saving therapy. The results of the HEMO Study show that the recommended minimum dose (Kt/V) for adequacy is also the optimum attainable with CHD. Interest is therefore turning to alternative therapies exploring the effects of increased frequency and time of hemodialysis (HD) treatment. The National Institutes of Health have sponsored 2 randomized prospective trials comparing short hours daily in-center HD and long hours slow nightly home HD with CHD. An International Registry has also been created to capture observational data on patients receiving short hours daily in-center HD, long hours slow nightly home HD, and other alternative therapies. Participation by individual centers, other registries, and the major dialysis chains is growing, and data from nearly 3000 patients have been collected to date. Pitfalls in data collection have been identified and are being corrected. A matched cohort study (matching against patients in other registries) is planned to obtain information on the hard outcomes expected from these therapies. The Registry may become the most important source of information required by governments, providers, and the nephrological community in assessing the utility of such therapies. [source]


    Graft and patient survival after adult live donor liver transplantation compared to a matched cohort who received a deceased donor transplantation

    LIVER TRANSPLANTATION, Issue 10 2004
    Paul J. Thuluvath
    Live donor liver transplantation (LDLT) has become increasingly common in the United States and around the world. In this study, we examined the outcome of 764 patients who received LDLT in the United States and compared the results with a matched population that received deceased donor transplantation (DDLT), using the United Network for Organ Sharing (UNOS) database. For each LDLT recipient (n = 764), two DDLT recipients (n = 1,470), matched for age, gender, race, diagnosis, and year of transplantation, were selected from the UNOS data after excluding multiple organ transplantation or retransplantation, children, and those with incomplete data. Despite our matching, recipients of LDLT had more stable liver disease, as shown by fewer patients with UNOS status 1 or 2A, in an intensive care unit, or on life support. Creatinine and cold ischemia time were also lower in the LDLT group. Primary graft nonfunction, hyperacute rejection rates, and patient survival by Kaplan-Meier analysis were similar in both groups (2-year survival was 79.0% in LDLT vs. 80.7% in case-controls; P = .5), but graft survival was significantly lower in LDLT (2-year graft survival was 64.4% vs. 73.3%; P < .001). Cox regression analysis (after adjusting for confounding variables) showed that LDLT recipients were 60% more likely to lose their graft compared to DDLT recipients (hazard ratio [HR] 1.6; confidence interval 1.1-2.5). Among hepatitis C virus (HCV) patients, LDLT recipients showed lower graft survival when compared to those who received DDLT. In conclusion, short-term patient survival in LDLT is similar to that in the DDLT group, but graft survival is significantly lower in LDLT recipients. LDLT is a reasonable option for patients who are unlikely to receive DDLT in a timely fashion. (Liver Transpl 2004;10:1263–1268.) [source]
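
    The graft-survival comparison described above can be sketched with the lifelines package on synthetic data (hypothetical column names and simulated numbers, not the UNOS analysis): Kaplan-Meier curves estimate 2-year graft survival in each group, and a Cox model adjusting for a confounder yields a hazard ratio for LDLT versus DDLT.

```python
# Sketch of a graft-survival comparison; all data are simulated.
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

rng = np.random.default_rng(1)
n = 1000
df = pd.DataFrame({
    "ldlt": rng.integers(0, 2, n),           # 1 = live donor graft, 0 = deceased donor
    "creatinine": rng.normal(1.2, 0.4, n),   # example baseline confounder
})
# Simulate graft-loss times with a higher hazard for LDLT, censored at 2 years.
df["time"] = rng.exponential(scale=5 / (1 + 0.6 * df["ldlt"]))
df["event"] = (df["time"] < 2).astype(int)
df.loc[df["event"] == 0, "time"] = 2.0

# Kaplan-Meier graft survival at 2 years, by donor type.
kmf = KaplanMeierFitter()
for grp, sub in df.groupby("ldlt"):
    kmf.fit(sub["time"], event_observed=sub["event"], label=f"ldlt={grp}")
    print(f"ldlt={grp}: 2-year graft survival ~ {kmf.predict(2.0):.3f}")

# Cox model adjusting for creatinine; exp(coef) for "ldlt" is the hazard ratio.
cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
cph.print_summary()
```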


    Steroid-Free Immunosuppression Since 1999: 129 Pediatric Renal Transplants with Sustained Graft and Patient Benefits

    AMERICAN JOURNAL OF TRANSPLANTATION, Issue 6 2009
    L. Li
    Despite early promising patient and graft outcomes with steroid-free (SF) immunosuppression in pediatric kidney transplant recipients, data on long-term safety and efficacy results are lacking. We present our single-center experience with 129 consecutive pediatric kidney transplant recipients on SF immunosuppression, with a mean follow-up of 5 years. Outcomes are compared against a matched cohort of 57 concurrent recipients treated with steroid-based (SB) immunosuppression. In the SF group, 87% of kidney recipients with functioning grafts remain corticosteroid-free. Actual intent-to-treat SF (ITT-SF) and still-on-protocol SF patient survivals are 96% and 96%, respectively; actual graft survivals for the two groups are 93% and 96%, respectively; and actual death-censored graft survivals are 97% and 99%, respectively. Unprecedented catch-up growth is observed in SF recipients below 12 years of age. Continued low rates of acute rejection, posttransplant diabetes mellitus (PTDM), hypertension and hyperlipidemia are seen in SF patients, with sustained benefits for graft function. In conclusion, extended enrollment and longer experience with SF immunosuppression for renal transplantation in low-risk children confirms protocol safety, continued benefits for growth and graft function, low acute rejection rates and reduced cardiovascular morbidity. [source]


    Liver Transplantation Using Donation After Cardiac Death Donors: Long-Term Follow-Up from a Single Center

    AMERICAN JOURNAL OF TRANSPLANTATION, Issue 4 2009
    M. E. De Vera
    There is a lack of universally accepted clinical parameters to guide the utilization of donation after cardiac death (DCD) donor livers, and it is unclear which patients would benefit most from these organs. We reviewed our experience in 141 patients who underwent liver transplantation using DCD allografts from 1993 to 2007. Patient outcomes were analyzed in comparison to a matched cohort of 282 patients who received livers from donation after brain death (DBD) donors. Patient survival was similar, but 1-, 5- and 10-year graft survival was significantly lower in DCD (69%, 56%, 44%) versus DBD (82%, 73%, 63%) subjects (p < 0.0001). Primary nonfunction and biliary complications were more common in DCD patients, accounting for 67% of early graft failures. A donor warm ischemia time >20 min, cold ischemia time >8 h and donor age >60 were associated with poorer DCD outcomes. There was a lack of survival benefit in DCD livers utilized in patients with model for end-stage liver disease (MELD) ≤30 or those not on organ-perfusion support, as graft survival was significantly lower compared to DBD patients. However, DCD and DBD subjects transplanted with MELD >30 or on organ-perfusion support had similar graft survival, suggesting a potentially greater benefit of DCD livers in critically ill patients. [source]
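
    The donor-level risk factors reported above can be restated as a simple screening rule; the sketch below only encodes the thresholds named in the abstract (warm ischemia >20 min, cold ischemia >8 h, donor age >60) and is an illustrative helper, not a validated allocation tool.

```python
# Illustrative restatement of the donor risk thresholds from the abstract.
def dcd_risk_flags(warm_ischemia_min, cold_ischemia_h, donor_age_years):
    """Return the donor-level factors associated with poorer DCD graft outcomes."""
    flags = []
    if warm_ischemia_min > 20:
        flags.append("donor warm ischemia > 20 min")
    if cold_ischemia_h > 8:
        flags.append("cold ischemia > 8 h")
    if donor_age_years > 60:
        flags.append("donor age > 60 years")
    return flags

print(dcd_risk_flags(25, 6.5, 64))  # ['donor warm ischemia > 20 min', 'donor age > 60 years']
```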


    Incidence and mortality of interstitial lung disease in rheumatoid arthritis: A population-based study

    ARTHRITIS & RHEUMATISM, Issue 6 2010
    Tim Bongartz
    Objective Interstitial lung disease (ILD) has been recognized as an important comorbidity in rheumatoid arthritis (RA). We undertook the current study to assess incidence, predictors, and mortality of RA-associated ILD. Methods We examined a population-based incidence cohort of patients with RA and a matched cohort of individuals without RA. All subjects were followed up longitudinally. The lifetime risk of ILD was estimated. Cox proportional hazards models were used to compare the incidence of ILD between cohorts, to investigate predictors, and to explore the impact of ILD on survival. Results Patients with RA (n = 582) and subjects without RA (n = 603) were followed up for a mean of 16.4 and 19.3 years, respectively. The lifetime risk of developing ILD was 7.7% for RA patients and 0.9% for non-RA subjects. This difference translated into a hazard ratio (HR) of 8.96 (95% confidence interval [95% CI] 4.02–19.94). The risk of developing ILD was higher in RA patients who were older at the time of disease onset, in male patients, and in individuals with more severe RA. The risk of death for RA patients with ILD was 3 times higher than in RA patients without ILD (HR 2.86 [95% CI 1.98–4.12]). Median survival after ILD diagnosis was only 2.6 years. ILD contributed approximately 13% to the excess mortality of RA patients when compared with the general population. Conclusion Our results emphasize the increased risk of ILD in patients with RA. The devastating impact of ILD on survival provides evidence that development of better strategies for the treatment of ILD could significantly lower the excess mortality among individuals with RA. [source]
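
    The person-time comparison behind estimates like these can be sketched as follows; the event counts and follow-up years below are purely hypothetical (not the cohort's data) and only show how events per person-year translate into incidence rates and a rate ratio with a log-normal confidence interval.

```python
# Crude incidence rates and rate ratio from person-time data (hypothetical numbers).
import numpy as np

def rate_ratio(events_a, pyears_a, events_b, pyears_b):
    rate_a = events_a / pyears_a
    rate_b = events_b / pyears_b
    rr = rate_a / rate_b
    # Standard error of the log rate ratio for Poisson counts.
    se = np.sqrt(1 / events_a + 1 / events_b)
    z = 1.96  # ~97.5th percentile of the standard normal
    return rate_a, rate_b, rr, (rr * np.exp(-z * se), rr * np.exp(z * se))

# Hypothetical illustration: 45 ILD cases over 9,500 person-years with RA
# versus 6 cases over 11,600 person-years without RA.
ra, non_ra, rr, ci = rate_ratio(45, 9500, 6, 11600)
print(f"RA: {ra*100:.2f}/100 py, non-RA: {non_ra*100:.2f}/100 py, "
      f"rate ratio {rr:.1f} (95% CI {ci[0]:.1f}-{ci[1]:.1f})")
```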


    Triadic bed-sharing and infant temperature

    CHILD: CARE, HEALTH AND DEVELOPMENT, Issue 2002
    H. L. Ball
    Abstract The effects on infants of sleeping with their parents are currently the subject of much debate. One concern regarding infants who sleep in their parents' bed involves the possibility of overheating. Previous research reported a core temperature that was significantly greater, by 0.1°C, among a cohort of bed-sharing infants compared with a matched cohort of infants sleeping alone. This paper presents a preliminary analysis of the overnight rectal temperature of 12 of the 20 infants who were monitored sleeping alone and with their parents on separate nights at the University of Durham Parent-Infant Sleep Lab. No significant differences were found in all-night rectal temperature, or in temperature from 2 h after sleep onset, between bed-sharing and cot-sleeping nights. These preliminary analyses suggest a night-time difference in rectal temperature between routine bed-sharers and routine cot sleepers; however, these findings will be further explored in the full analyses for this study. [source]
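
    Because each infant contributes both a bed-sharing night and a cot night, the natural comparison is a paired one. A minimal sketch with synthetic temperatures (not the study data) is shown below.

```python
# Paired comparison of mean overnight rectal temperature (synthetic data).
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n_infants = 12
cot_night = rng.normal(36.6, 0.15, n_infants)                   # cot-sleeping night
bedshare_night = cot_night + rng.normal(0.05, 0.10, n_infants)  # bed-sharing night

t_stat, p_value = stats.ttest_rel(bedshare_night, cot_night)
print(f"mean difference {np.mean(bedshare_night - cot_night):.3f} °C, p = {p_value:.2f}")
```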


    Semi-automated risk estimation using large databases: quinolones and Clostridium difficile associated diarrhea

    PHARMACOEPIDEMIOLOGY AND DRUG SAFETY, Issue 6 2010
    Robertino M. Mera
    Abstract Purpose The availability of large databases with person-time information and appropriate statistical methods allows for relatively rapid pharmacovigilance analyses. A semi-automated method was used to investigate the effect of fluoroquinolones on the incidence of C. difficile associated diarrhea (CDAD). Methods Two US databases, an electronic medical record (EMR) and a large medical claims database for the period 2006–2007, were evaluated using a semi-automated methodology. The raw EMR and claims datasets were subject to a normalization procedure that aligns the drug exposures and conditions using ontologies: SNOMED for medications and MedDRA for conditions. A retrospective cohort design was used together with matching by means of the propensity score. The association between exposure and outcome was evaluated using a Poisson regression model after taking into account potential confounders. Results A comparison between quinolones as the target cohort and macrolides as the comparison cohort produced a total of 564,797 subjects exposed to a quinolone in the claims data and 233,090 subjects in the EMR. They were matched with replacement within six strata of the propensity score. Among the matched cohorts there were a total of 488 and 158 outcomes in the claims data and the EMR, respectively. After adjusting for risk factors, quinolones were associated with more than twice the incidence of CDAD seen with macrolides (IRR 2.75, 95% CI 2.18–3.48). Conclusions The semi-automated method was successfully applied to two observational databases and rapidly identified a potential increased risk of developing CDAD with quinolones. Copyright © 2010 John Wiley & Sons, Ltd. [source]
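
    The Poisson model described in the Methods can be sketched as follows on synthetic data (hypothetical variable names; not the claims/EMR analysis). The log person-time offset turns counts into rates, so the exponentiated coefficient on the exposure indicator is an incidence rate ratio (IRR).

```python
# Poisson regression with a person-time offset for an IRR (synthetic data).
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 20000
df = pd.DataFrame({
    "quinolone": rng.integers(0, 2, n),   # 1 = quinolone, 0 = macrolide
    "age": rng.normal(55, 15, n),         # example confounder
    "pt": rng.uniform(0.1, 1.0, n),       # person-years of follow-up
})
# Simulate CDAD counts with a higher rate in the quinolone group.
lam = df["pt"] * 0.002 * np.exp(1.0 * df["quinolone"] + 0.01 * (df["age"] - 55))
df["cdad"] = rng.poisson(lam)

X = sm.add_constant(df[["quinolone", "age"]])
model = sm.GLM(df["cdad"], X, family=sm.families.Poisson(),
               offset=np.log(df["pt"])).fit()
irr = np.exp(model.params["quinolone"])
lo, hi = np.exp(model.conf_int().loc["quinolone"])
print(f"IRR for quinolone vs macrolide: {irr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```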


    End-of-life care for older cancer patients in the Veterans Health Administration versus the private sector

    CANCER, Issue 15 2010
    Nancy L. Keating MD
    Abstract BACKGROUND: Treatment of older cancer patients at the end of life has become increasingly aggressive, despite the absence of evidence for better outcomes. We compared aggressiveness of end-of-life care of older metastatic cancer patients treated in the Veterans Health Administration (VHA) and those under fee-for-service Medicare arrangements. METHODS: Using propensity score methods, we matched 2913 male veterans who were diagnosed with stage IV lung or colorectal cancer in 2001-2002 and died before 2006 with 2913 similar men enrolled in fee-for-service Medicare living in Surveillance, Epidemiology, and End Results (SEER) areas. We assessed chemotherapy within 14 days of death, intensive care unit (ICU) admissions within 30 days of death, and >1 emergency room visit within 30 days of death. RESULTS: Among matched cohorts, men treated in the VHA were less likely than men in the private sector to receive chemotherapy within 14 days of death (4.6% vs 7.5%, P < .001), be admitted to an ICU within 30 days of death (12.5% vs 19.7%, P < .001), or have >1 emergency room visit within 30 days of death (13.1% vs 14.7%, P = .09). CONCLUSIONS: Older men with metastatic lung or colorectal cancer treated in the VHA healthcare system received less aggressive end-of-life care than similar men in fee-for-service Medicare. This may result from the absence of financial incentives for more intensive care in the VHA or because this integrated delivery system is better structured to limit potentially overly aggressive care. Additional studies are needed to assess whether men undergoing less aggressive end-of-life care also experience better outcomes. Cancer 2010. © 2010 American Cancer Society. [source]
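
    With 2913 men in each matched cohort, the chemotherapy-within-14-days comparison reduces to a two-proportion test. The sketch below reconstructs approximate counts from the percentages reported in the abstract (illustrative only, not the study's exact data).

```python
# Two-proportion z-test on approximate counts reconstructed from the abstract.
from statsmodels.stats.proportion import proportions_ztest

n_per_group = 2913
# Approximate numbers of men receiving chemotherapy within 14 days of death:
# 4.6% in the VHA cohort, 7.5% in the fee-for-service Medicare cohort.
chemo_events = [round(0.046 * n_per_group), round(0.075 * n_per_group)]
z, p = proportions_ztest(chemo_events, [n_per_group, n_per_group])
print(f"VHA {chemo_events[0]}/{n_per_group} vs Medicare {chemo_events[1]}/{n_per_group}: p = {p:.4f}")
```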