Cohort Design
Kinds of Cohort Design: Selected Abstracts

Excess use of coercive measures in psychiatry among migrants compared with native Danes. ACTA PSYCHIATRICA SCANDINAVICA, Issue 2 2010. M. Norredam. Norredam M, Garcia-Lopez A, Keiding N, Krasnik A. Excess use of coercive measures in psychiatry among migrants compared with native Danes. Objective: To investigate differences in risk of compulsory admission and other coercive measures in psychiatric emergencies among refugees and immigrants compared with that among native Danes. Method: A register-based retrospective cohort design. All refugees (n = 29 174) and immigrants (n = 33 287) who received residence permission in Denmark from 1.1.1993 to 31.12.1999 were included and matched 1:4 on age and sex with native Danes. Civil registration numbers were cross-linked to the Danish Psychiatric Central Register and the Registry of Coercive Measures in Psychiatric Treatment. Results: Refugees (RR = 1.82; 95% CI: 1.45; 2.29) and immigrants (RR = 1.14; 95% CI: 0.83; 1.56) experienced higher rates of compulsory admissions than did native Danes. This was most striking for refugee men (RR = 2.00; 95% CI: 1.53; 2.61) and immigrant women (RR = 1.73; 95% CI: 1.45; 2.60). Moreover, refugees and immigrants experienced higher frequencies of other coercive measures during hospitalisation compared with native Danes. Conclusion: Coercive measures in psychiatry are more likely to be experienced by migrants than by native Danes. [source]

Overdose deaths following previous non-fatal heroin overdose: Record linkage of ambulance attendance and death registry data. DRUG AND ALCOHOL REVIEW, Issue 4 2009. Mark A. Stoové. Abstract. Introduction and Aims. Experiencing previous non-fatal overdoses has been identified as a predictor of subsequent non-fatal overdoses; however, few studies have investigated the association between previous non-fatal overdose experiences and overdose mortality. We examined overdose mortality among injecting drug users who had previously been attended by an ambulance for a non-fatal heroin overdose. Design and Methods. Using a retrospective cohort design, we linked data on non-fatal heroin overdose cases obtained from ambulance attendance records in Melbourne, Australia over a 5-year period (2000–2005) with a national death register. Results. 4884 people who were attended by ambulance for a non-fatal heroin overdose were identified. One hundred and sixty-four overdose deaths occurred among this cohort, with an average overdose mortality rate of 1.20 per 100 person-years (95% CI, 1.03–1.40). The mortality rate decreased 10-fold after 2000, coinciding with widely reported declines in heroin availability. Being male, of older age (>35 years) and having been attended multiple times for previous non-fatal overdoses were associated with increased mortality risk. Discussion and Conclusions. As the first to show a direct association between non-fatal overdose and subsequent overdose mortality, this study has important implications for the prevention of overdose mortality. This study also shows the profound effect of macro-level heroin market dynamics on overdose mortality. [Stoové MA, Dietze PM, Jolley D. Overdose deaths following previous non-fatal heroin overdose: Record linkage of ambulance attendance and death registry data. Drug Alcohol Rev 2009;28:347–352] [source]
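The Stoové abstract above reports an overdose mortality rate of 1.20 per 100 person-years with a 95% confidence interval of 1.03–1.40. As a minimal illustration of that arithmetic (not the authors' actual analysis), the sketch below computes a crude rate from event and person-time counts together with an exact (Garwood) Poisson confidence interval; the person-time total used here is a hypothetical figure chosen only to give a rate near 1.2 per 100 person-years.

```python
from scipy.stats import chi2

def rate_per_100py(events, person_years, alpha=0.05):
    """Crude incidence rate per 100 person-years with an exact Poisson CI."""
    rate = events / person_years * 100
    # Garwood exact limits for a Poisson count, then scaled by person-time
    lower = chi2.ppf(alpha / 2, 2 * events) / 2 / person_years * 100
    upper = chi2.ppf(1 - alpha / 2, 2 * (events + 1)) / 2 / person_years * 100
    return rate, lower, upper

# 164 deaths over a hypothetical ~13,670 person-years of follow-up
print(rate_per_100py(164, 13_670))   # roughly (1.20, 1.02, 1.39)
```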
Psychopathological changes and quality of life in hepatitis C virus-infected, opioid-dependent patients during maintenance therapy. ADDICTION, Issue 4 2009. Arne Schäfer. ABSTRACT Aims To examine among maintenance patients (methadone or buprenorphine) with and without hepatitis C virus (HCV) infection (i) the frequency of psychopathological symptoms at baseline and 1-year follow-up; (ii) the association between antiviral interferon (IFN) treatment and psychopathological symptoms; and (iii) to explore whether IFN therapy has an effect on 1-year outcome of maintenance treatment. Design Naturalistic prospective longitudinal cohort design. Setting A total of 223 substitution centres in Germany. Participants A nationally representative sample of 2414 maintenance patients, namely 800 without and 1614 with HCV infection, of whom 122 received IFN therapy. Measures HCV infection status (HCV+/HCV−), IFN treatment status (IFN+/IFN−) and clinical measures: diagnostic status and severity (rated by clinician), psychopathology (BSI, Brief Symptom Inventory) and quality of life (EQ-5D, EuroQol Group questionnaire). Findings HCV+ patients revealed indications of a moderately increased psychopathological burden and poorer quality of life at baseline and follow-up compared to HCV− patients. HCV+ patients showed a marked deterioration over time only in the BSI subscale somatization (P = 0.002), and the frequency of sleep disorders almost doubled over time (12.8% at baseline; 24.1% at follow-up; P < 0.01). IFN treatment, received by 10% of HCV+ patients, did not impair efficacy or tolerability of maintenance therapy and was associated overall with neither increased psychopathological burden nor reduced quality of life. Conclusions Findings suggest no increased risk among HCV+ patients on maintenance therapy for depressive or other psychopathological syndromes. In our patient sample, IFN treatment was not associated with increased psychopathological burden, reduced quality of life or poorer tolerability and efficacy of maintenance treatment. [source]

Attendance at Narcotics Anonymous and Alcoholics Anonymous meetings, frequency of attendance and substance use outcomes after residential treatment for drug dependence: a 5-year follow-up study. ADDICTION, Issue 1 2008. Michael Gossop. ABSTRACT Aims This study investigates the relationship between frequency of attendance at Narcotics Anonymous and Alcoholics Anonymous (NA/AA) meetings and substance use outcomes after residential treatment of drug dependence. It was predicted that post-treatment NA/AA attendance would be related to improved substance use outcomes. Methods Using a longitudinal, prospective cohort design, interviews were conducted with drug-dependent clients (n = 142) at intake to residential treatment, and at 1 year, 2 years and 4–5 years follow-up. Data were collected by structured interviews. All follow-up interviews were carried out by independent professional interviewers. Findings Abstinence from opiates was increased throughout the 5-year follow-up period compared to pre-treatment levels. Clients who attended NA/AA after treatment were more likely to be abstinent from opiates at follow-up. Abstinence from stimulants increased at follow-up, but (except at 1-year follow-up) no additional benefit was found for NA/AA attendance. There was no overall change in alcohol abstinence after treatment, but clients who attended NA/AA were more likely to be abstinent from alcohol at all follow-up points.
More frequent NA/AA attenders were more likely to be abstinent from opiates and alcohol when compared both to non-attenders and to infrequent (less than weekly) attenders. Conclusions NA/AA can support and supplement residential addiction treatment as an aftercare resource. In view of the generally poor alcohol use outcomes achieved by drug-dependent patients after treatment, the improved alcohol outcomes of NA/AA attenders suggest that the effectiveness of existing treatment services may be improved by initiatives that lead to increased involvement and engagement with such groups. [source]

The National Treatment Outcome Research Study (NTORS): 4–5 year follow-up results. ADDICTION, Issue 3 2003. Michael Gossop. ABSTRACT Aims The National Treatment Outcome Research Study (NTORS) is the first prospective national study of treatment outcome among drug misusers in the United Kingdom. NTORS investigates outcomes for drug misusers treated in existing services in residential and community settings. Design, setting and participants: The study used a longitudinal, prospective cohort design. Data were collected by structured interviews at intake to treatment, 1 year, 2 years and 4–5 years. The sample comprised 418 patients from 54 agencies and four treatment modalities. Measurements: Measures were taken of illicit drug use, injecting and sharing injecting equipment, alcohol use, psychological health and crime. Findings: Rates of abstinence from illicit drugs increased after treatment among patients from both residential and community (methadone) programmes. Reductions were found for frequency of use of heroin, non-prescribed methadone, benzodiazepines, injecting and sharing of injecting equipment. For most variables, reductions were evident at 1 year, with outcomes remaining at about the 1-year level or showing further reductions. Crack cocaine and alcohol outcomes at 4–5 years were not significantly different from intake. Conclusions: Substantial reductions across a range of problem behaviours were found 4–5 years after patients were admitted to national treatment programmes delivered under day-to-day conditions. The less satisfactory outcomes for heavy drinking and use of crack cocaine suggest the need for services to be modified to tackle these problems more effectively. Despite differences between the United Kingdom and the United States in patient populations and in treatment programmes, there are many similarities between the two countries in outcomes from large-scale, multi-site studies. [source]

Effect of Ethnicity on Denial of Authorization for Emergency Department Care by Managed Care Gatekeepers. ACADEMIC EMERGENCY MEDICINE, Issue 3 2001. Robert A. Lowe MD. Abstract. Objective: After a pilot study suggested that African American patients enrolled in managed care organizations (MCOs) were more likely than whites to be denied authorization for emergency department (ED) care through gatekeeping, the authors sought to determine the association between ethnicity and denial of authorization in a second, larger study at another hospital. Methods: A retrospective cohort design was used, with adjustment for triage score, age, gender, day and time of arrival at the ED, and type of MCO. Results: African Americans were more likely to be denied authorization for ED visits by the gatekeepers representing their MCOs even after adjusting for confounders, with an odds ratio of 1.52 (95% CI = 1.18 to 1.94). Conclusions: African Americans were more likely than whites to be denied authorization for ED visits. The observational study design raises the possibility that incomplete control of confounding contributed to or accounted for the association between ethnicity and gatekeeping decisions. Nevertheless, the questions that these findings raise about equity of gatekeeping indicate a need for additional research in this area. [source]
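The Lowe abstract above reports an odds ratio for denial of authorization adjusted for triage score, age, gender, arrival time and MCO type. As a hedged sketch of how such an adjusted odds ratio is typically obtained from cohort data (not the authors' analysis; the data, variable names and effect sizes below are simulated assumptions), a logistic regression can be fitted and the coefficient for ethnicity exponentiated:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 5000
df = pd.DataFrame({
    "african_american": rng.integers(0, 2, n),
    "triage_score": rng.integers(1, 6, n),
    "age": rng.uniform(18, 90, n),
    "female": rng.integers(0, 2, n),
    "night_arrival": rng.integers(0, 2, n),
})
# Simulated outcome: denial probability depends on ethnicity and triage score
logit_p = -2.0 + 0.4 * df["african_american"] + 0.3 * df["triage_score"]
df["denied"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

model = smf.logit(
    "denied ~ african_american + triage_score + age + female + night_arrival",
    data=df,
).fit(disp=False)
print(np.exp(model.params["african_american"]))          # adjusted odds ratio
print(np.exp(model.conf_int().loc["african_american"]))  # 95% CI
```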
Progressive segmented health insurance: Colombian health reform and access to health services. HEALTH ECONOMICS, Issue 1 2007. Fernando Ruiz. Abstract Equal access for poor populations to health services is a comprehensive objective for any health reform. The Colombian health reform addressed this issue through a segmented progressive social health insurance approach. The strategy was to assure universal coverage by expanding the population covered through payroll-linked insurance and implementing a subsidized insurance program for the poorest populations, those not affiliated through formal employment. A prospective study was performed to follow up health service utilization and out-of-pocket expenses using a cohort design. It was representative of four Colombian cities (Cendex Health Services Use and Expenditure Study, 2001). A four-part econometric model was applied. The model related medical service utilization and medication with different socioeconomic, geographic, and risk-associated variables. Results showed that subsidized health insurance improves health service utilization and reduces the financial burden for the poorest, as compared to those non-insured. Other social health insurance schemes preserved high utilization with variable out-of-pocket expenditures. Family and age conditions have a significant effect on medical service utilization. Geographic variables play a significant role in hospital inpatient service utilization. Both geographic and income variables also have a significant impact on out-of-pocket expenses. Projected utilization rates and a simulation favor a dual policy of two-stage, income-segmented insurance to progress towards the universal insurance goal. Copyright © 2006 John Wiley & Sons, Ltd. [source]

Cerebro- and cardiovascular conditions in adults with schizophrenia treated with antipsychotic medications. HUMAN PSYCHOPHARMACOLOGY: CLINICAL AND EXPERIMENTAL, Issue 6 2007. Jeanette M. Jerrell. Abstract Objective To report on the relative risk of cerebro- and cardiovascular disorders associated with antipsychotic treatment among adults with schizophrenia. Method Medical and pharmacy claims data from the South Carolina Medicaid program were extracted to compare the prevalence rates for four coded cerebrovascular conditions (cerebrovascular disease; cerebrovascular accident; cerebrovascular hemorrhage; and peripheral vascular disease) and four cardiovascular conditions (myocardial infarction; ischemic heart disease; arrhythmias; and cardiomyopathy). The analysis employed a retrospective cohort design with a 3-year time period as the interval of interest. Schizophrenic adults (aged 18–54; n = 2251) prescribed one of six atypical or two conventional antipsychotic medications were identified and comprised the analysis set. Results Incidence rates for cerebrovascular disorders ranged from 0.5 to 3.6%. No significant association between antipsychotic usage and cerebrovascular disorders was noted, largely due to the low base rate. Incidence rates for overall cardiovascular conditions ranged from 6 to 20%.
The odds of developing cardiomyopathy were significantly lower for aripiprazole (OR = −3.45; p = 0.02), while the odds of developing hypertension were significantly lower for males (OR = −1.37; p = 0.009) but significantly higher for patients prescribed ziprasidone (OR = 1.91; p = 0.01) relative to conventional antipsychotics. Conclusion No significant association between antipsychotic usage and cerebro- or cardiovascular disorders was noted. Copyright © 2007 John Wiley & Sons, Ltd. [source]

Cognitive leisure activities and their role in preventing dementia: a systematic review. INTERNATIONAL JOURNAL OF EVIDENCE BASED HEALTHCARE, Issue 1 2010. Cindy Stern BHSc(Hons). Abstract Background: Dementia inflicts a tremendous burden on the healthcare system. Identifying protective factors or effective prevention strategies may lead to considerable benefits. One possible strategy mentioned in the literature relates to participation in cognitive leisure activities. Aim: To determine the effectiveness of cognitive leisure activities in preventing Alzheimer's and other dementias among older adults. Inclusion criteria: Types of participants: Adults aged at least 60 years, with or without a clinical diagnosis of dementia, who resided in the community or a care setting. Types of interventions: Cognitive leisure activities, defined as activities that required a mental response from the individual taking part in the activity (e.g. reading). Types of outcomes: The presence or absence of dementia was the outcome of interest. Types of studies: Any randomised controlled trials, other experimental studies, as well as cohort, case–control and cross-sectional studies were considered for inclusion. Search strategy: A search for published and unpublished studies in the English language was undertaken, with no publication date restriction. Methodological quality: Each study was appraised independently by two reviewers using the standard Joanna Briggs Institute instruments. Data collection and analysis: Information was extracted from studies meeting quality criteria using the standard Joanna Briggs Institute tools. Because of the heterogeneity of populations and interventions, meta-analyses were not possible and results are presented in narrative form. Results: There were no randomised controlled trials located that met inclusion criteria. Thirteen observational studies were included in the review; the majority were of cohort design. Because of the heterogeneity of the interventions, the study designs, the way in which they were grouped and the different stages of life at which they were measured, statistical pooling was not appropriate. Studies were grouped by the stage of adult life at which participation in the interventions occurred, that is, early adulthood, middle adulthood and late life. Five out of six studies showed a positive association between participating in activities and a reduced risk of developing Alzheimer's disease and other dementias when interventions were undertaken in middle adulthood, and six out of seven studies produced a positive association for late-life participation. Results indicated that some activities might be more beneficial than others; however, results should be interpreted with caution because of the subjective nature of activity inclusion.
Conclusion: Actively participating in cognitive leisure activities during mid- or late life may be beneficial in preventing the risk of Alzheimer's disease and other dementias in the elderly; however, the evidence is currently not strong enough to infer a direct causal relationship. Participating in selected cognitive leisure activities may be more favourable than others, but currently there is no strong evidence to recommend one over another. [source]

Reducing the burden of caring for Alzheimer's disease through the amelioration of 'delusions of theft' by drug therapy. INTERNATIONAL JOURNAL OF GERIATRIC PSYCHIATRY, Issue 3 2002. Kazue Shigenobu. Abstract Background Delusions of theft (delusions involving the theft of possessions) are one of the most frequent neuropsychiatric manifestations of Alzheimer's disease (AD). Objective The current study investigated the presence and extent of such delusions before and after drug treatment in a group of AD patients, and the consequent effects on the burden of care on caregivers. Method The study was an open-label cohort design. The delusions studied consisted only of those involving theft of possessions. Sixteen AD patients served as subjects in order to assess the efficacy of risperidone administration in the reduction or elimination of these delusions. The caregiver burden was evaluated using the Zarit Caregiver Burden Interview (ZBI) before the administration of risperidone and 12 weeks after administration, for cases where delusions of theft were eliminated or reduced. Results The burden of care on caregivers was significantly reduced (p < 0.001) through the elimination or reduction of delusions of theft. Conclusion Delusions of theft are considered to be a major factor in increasing the burden of care, and the treatment of these, through appropriate drug therapy, is therefore of great importance in the continuation of satisfactory care in the home. Copyright © 2002 John Wiley & Sons, Ltd. [source]

The Effects of a Variant of the Program for All-inclusive Care of the Elderly on Hospital Utilization and Outcomes. JOURNAL OF AMERICAN GERIATRICS SOCIETY, Issue 2 2006. Robert L. Kane MD. OBJECTIVES: To compare the effects of the Wisconsin Partnership Program (WPP) on hospital, emergency department (ED), and nursing home utilization with those of traditional care. DESIGN: Quasi-experimental longitudinal cohort design. SETTING: Selected counties in Wisconsin. PARTICIPANTS: WPP elderly enrollees and two matched control groups consisting of frail older people enrolled in fee-for-service insurance plans, Medicare, and Medicaid and receiving home- and community-based waiver services, one from the same geographic area as the WPP and another from a location in the state where the WPP was not offered. MEASUREMENTS: Data came from administrative records. Regression and survival analyses were adjusted for case-mix variables. RESULTS: No significant differences in hospital utilization, ED visits, preventable hospitalizations, risk of entry into nursing homes, or mortality were found. WPP enrollees had more contact with care providers than did controls. CONCLUSION: WPP did not dramatically alter the pattern of care. Part of the weak effect may be attributable to the small numbers of WPP cases per participating physician. [source]

Income-Related Differences in the Use of Evidence-Based Therapies in Older Persons with Diabetes Mellitus in For-Profit Managed Care. JOURNAL OF AMERICAN GERIATRICS SOCIETY, Issue 5 2003.
Arleen F. Brown MD. OBJECTIVES: To determine whether income influences evidence-based medication use by older persons with diabetes mellitus in managed care who have the same prescription drug benefit. DESIGN: Observational cohort design with telephone interviews and clinical examinations. SETTING: Managed care provider groups that contract with one large network-model health plan in Los Angeles County. PARTICIPANTS: A random sample of community-dwelling Medicare beneficiaries with diabetes mellitus aged 65 and older covered by the same pharmacy benefit. MEASUREMENTS: Patients reported their sociodemographic and clinical characteristics. Annual household income (≥$20,000 or <$20,000) was the primary predictor. The outcome variable was use of evidence-based therapies, determined by a review of all current medications brought to the clinical examination. The medications studied included use of any cholesterol-lowering medications, use of 3-hydroxy-3-methylglutaryl coenzyme A reductase inhibitors (statins) for cholesterol lowering, aspirin for primary and secondary prevention of cardiovascular disease, and angiotensin-converting enzyme (ACE) inhibitors in those with diabetic nephropathy. The influence of income on evidence-based medication use was adjusted for other patient characteristics. RESULTS: The cohort consisted of 301 persons with diabetes mellitus, of whom 53% had annual household income under $20,000. In unadjusted analyses, there were lower rates of use of all evidence-based therapies and lower rates of statin use for persons with annual income under $20,000 than for higher-income persons. In multivariate models, statin use was observed in 57% of higher-income versus 30% of lower-income respondents with a history of hyperlipidemia (P = .01) and 66% of higher-income versus 29% of lower-income respondents with a history of myocardial infarction (P = .03). There were no differences by income in the rates of aspirin or ACE inhibitor use. CONCLUSION: Among these Medicare managed care beneficiaries with diabetes mellitus, all of whom had the same pharmacy benefit, there were low rates of use of evidence-based therapies overall and substantially lower use of statins by poorer persons. [source]

Use of radiotherapy in the primary treatment of cancer in South Australia. JOURNAL OF MEDICAL IMAGING AND RADIATION ONCOLOGY, Issue 2 2003. Colin Luke. Summary Previous studies point to a lower use of radiotherapy by Australian cancer patients in lower socioeconomic areas and in country regions that are some distance from urban treatment centres. These were cross-sectional studies with the potential for error from changes in place of residence. We used a cohort design to avoid such error. South Australian patients diagnosed in 1990–1994 were followed until the date of censoring of 31 December 1999 using data from the State Cancer Registry. The percentage found to have had megavoltage therapy in the first 12 months following diagnosis varied by leading primary incidence site from 44% for the prostate to 40% for female breast, 38% for lung, 17% for rectum, 3% for colon and 2% for skin (melanoma). Multivariate analysis indicated that determinants of not receiving megavoltage therapy in the first 12 months were older age, female sex, residence in a country region and country of birth. Melanoma data revealed earlier stages for women than men. If this difference by sex applies to other cancers, it might explain the lower exposure of women to radiotherapy.
Fewer older patients received radiotherapy, consistent with trends observed in hospital-based cancer-registry data. The influence on this finding of differences in stage and comorbidity requires additional study. While earlier findings of a lower exposure of country residents to radiotherapy were confirmed, the difference was comparatively small in this study. Variations in exposure by socioeconomic status of residential area were not observed. [source]

The relationship between bed rest and sitting orthostatic intolerance in adults residing in chronic care facilities. JOURNAL OF NURSING AND HEALTHCARE OF CHRONIC ILLNESS: AN INTERNATIONAL INTERDISCIPLINARY JOURNAL, Issue 3 2010. Mary T Fox MSc. Fox MT, Sidani S & Brooks D (2010) Journal of Nursing and Healthcare of Chronic Illness 2, 187–196. The relationship between bed rest and sitting orthostatic intolerance in adults residing in chronic care facilities. Aim: To examine the relationship between orthostatic intolerance and bed rest as it was used by/with 65 adults residing in chronic care facilities. Background: The evidence on the relationship between bed rest and orthostatic intolerance has been obtained from aerospace studies conducted in highly controlled laboratory settings, and is regarded as having high internal validity. In those studies, prolonged and continuous bed rest, administered in a horizontal or negative-tilt body position, had a major effect on orthostatic intolerance in young adults. However, the applicability of the findings to the conditions of the real world of practice is questionable. Methods: Participants were recruited over the period April 2005 to August 2006. A naturalistic cohort design was used. The cohorts represented different, naturally occurring doses of bed rest. Comparisons were made between patients who had no bed rest (comparative dose group, n = 20), two to four days (moderate dose, n = 23) and five to seven days of bed rest (high dose, n = 22) during a one-week monitoring period. Orthostatic intolerance was measured by orthostatic vital signs and a self-report scale. Bed rest dose was measured by the total number of days spent in bed during one week. Results: Post hoc comparisons, using Bonferroni adjustments, indicated significant differences in adjusted means on self-reported orthostatic intolerance between the comparative and high (CI: −4.12, −0.85; p < 0.001), and the moderate and high (CI: 0.35, 3.56; p < 0.01) bed rest dose cohorts. No group differences were found on orthostatic vital signs. Conclusions: A moderate dose of bed rest with intermittent exposure to upright posture may protect against subjective orthostatic intolerance in patients who are unable to tolerate being out of bed every day. Future research may examine the effects of reducing bed rest days on orthostatic intolerance in individuals with high doses of five to seven days of bed rest. [source]

The Role of Youth Problem Behaviors in the Path From Child Abuse and Neglect to Prostitution: A Prospective Examination. JOURNAL OF RESEARCH ON ADOLESCENCE, Issue 1 2010. Helen W. Wilson. Behaviors beginning in childhood or adolescence may mediate the relationship between childhood maltreatment and involvement in prostitution. This paper examines five potential mediators: early sexual initiation, running away, juvenile crime, school problems, and early drug use.
Using a prospective cohort design, abused and neglected children (ages 0–11) with cases processed during 1967–1971 were matched with nonabused, nonneglected children and followed into young adulthood. Data are from in-person interviews at approximately age 29 and arrest records through 1994. Structural equation modeling tested path models. Results indicated that victims of child abuse and neglect were at increased risk for all problem behaviors except drug use. In the full model, only early sexual initiation remained significant as a mediator in the pathway from child abuse and neglect to prostitution. Findings were generally consistent for physical and sexual abuse and neglect. These findings suggest that interventions to reduce problem behaviors among maltreated children may also reduce their risk for prostitution later in life. [source]

Extreme prematurity and school outcomes. PAEDIATRIC & PERINATAL EPIDEMIOLOGY, Issue 4 2000. G.M. Buck. The purpose of this study was to assess the impact of extreme prematurity on three global measures of school outcomes. Using a matched cohort design, exposed infants comprised all surviving singleton infants of ≤28 weeks gestation born at one regional neonatal intensive care hospital between 1983 and 1986 (n = 132). Unexposed infants comprised randomly selected full-term infants (≥37 weeks gestation) frequency matched on date of birth, zip code and health insurance. All children were selected from a regional tertiary children's centre serving the western New York population. Standardised telephone interviews elicited information on grade repetition, special education placement and use of school-based services. Unconditional logistic regression was used to estimate odds ratios (OR) and corresponding 95% confidence intervals (CI) adjusted for potential confounders for children without major handicaps. Extreme prematurity was associated with a significant increase in risk of grade repetition (OR = 3.22; 95% CI = 1.63, 6.34), special education placement (OR = 3.16; 95% CI = 1.14, 8.76) and use of school-based services (OR = 4.56; 95% CI = 1.82, 11.42) in comparison with children born at term, even after controlling for age, race, maternal education, foster care placement and the matching factors. These findings suggest that survivors of extreme prematurity remain at risk of educational underachievement. [source]

Extended phase I evaluation of vincristine, irinotecan, temozolomide, and antibiotic in children with refractory solid tumors. PEDIATRIC BLOOD & CANCER, Issue 7 2010. René Y. McNall-Knapp MD. Abstract Background The combination of irinotecan, temozolomide, and vincristine is appealing because of potentially synergistic mechanisms of action and non-overlapping toxicities. This phase I study was designed to determine the toxicity and maximum tolerated dose (MTD) of escalating daily protracted doses of irinotecan given in this combination. With extended accrual, we more fully explored the toxicity of multiple courses at the MTD. Procedure Patients under 22 years with recurrent or refractory solid tumors were eligible. A course of chemotherapy was given every 28 days. Cefpodoxime was given for diarrhea prophylaxis. Vincristine (1.5 mg/m2, max 2 mg) was given intravenously (IV) on days 1 and 8. Temozolomide (100 mg/m2/day) was given orally on days 1–5. Irinotecan was given IV over 1 hr on days 1–5 and 8–12. Dose escalation was done in the standard 3 + 3 cohort design, starting at 15 mg/m2/day.
Results Twenty-five of 26 eligible patients were evaluable for toxicity and response. They received 111 courses (range 1–13, median 4). Dose-limiting toxicity (DLT; pancreatitis, transaminitis) was seen in two of three patients at dose level 2 (20 mg/m2). No patients at level 1 had DLT during the first two cycles. Thus, the MTD of irinotecan in this combination is 15 mg/m2/day × 10 doses. Hematologic toxicity was mild and not prolonged. Grade 3 diarrhea was seen in five courses. Responses included two complete and two partial responses, with 12 patients having stable disease (SD) (median 6 months). Conclusions This combination is safe and shows activity in pediatric patients with recurrent malignancy. Pediatr Blood Cancer 2010;54:909–915 © 2010 Wiley-Liss, Inc. [source]
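The McNall-Knapp abstract above escalates doses using the standard 3 + 3 cohort design. As a rough illustration of the decision rule usually meant by that phrase (a generic textbook version, not this trial's protocol, which added a two-cycle DLT window and extended accrual at the MTD), the escalation logic can be written as a small function:

```python
def three_plus_three(n_treated, n_dlt):
    """Generic 3 + 3 decision rule for the current dose level.

    n_treated: patients evaluable at this dose level (3 or 6)
    n_dlt: how many of them had a dose-limiting toxicity
    """
    if n_treated == 3:
        if n_dlt == 0:
            return "escalate to the next dose level"
        if n_dlt == 1:
            return "expand this dose level to 6 patients"
        return "stop; MTD is the previous dose level"
    if n_treated == 6:
        if n_dlt <= 1:
            return "escalate to the next dose level"
        return "stop; MTD is the previous dose level"
    raise ValueError("evaluate after cohorts of 3 or 6 patients")

# Dose level 2 in the abstract: 2 of 3 patients had DLT,
# so level 1 (15 mg/m2/day) becomes the MTD.
print(three_plus_three(3, 2))
```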
A basic study design for expedited safety signal evaluation based on electronic healthcare data. PHARMACOEPIDEMIOLOGY AND DRUG SAFETY, Issue 8 2010. Sebastian Schneeweiss MD. Abstract Active drug safety monitoring based on longitudinal electronic healthcare databases (a Sentinel System), as outlined in recent FDA-commissioned reports, consists of several interlocked processes, including signal generation, signal strengthening, and signal evaluation. Once a signal of a potential drug safety issue is generated, signal strengthening and signal evaluation have to follow in short sequence in order to quickly provide as much information about the triggering drug–event association as possible. This paper proposes a basic study design based on the incident user cohort design for expedited signal evaluation in longitudinal healthcare databases. It will not resolve all methodological issues, nor will it fit all study questions arising within the framework of a Sentinel System. It should rather be seen as guidance that will fit the majority of situations and serve as a starting point for adaptations to specific studies. Such an approach will expedite and structure the process of study development and highlight specific assumptions, which is particularly valuable in a Sentinel System where signals are by definition preliminary and evaluation of signals is time critical. Copyright © 2010 John Wiley & Sons, Ltd. [source]

Semi-automated risk estimation using large databases: quinolones and Clostridium difficile associated diarrhea. PHARMACOEPIDEMIOLOGY AND DRUG SAFETY, Issue 6 2010. Robertino M. Mera. Abstract Purpose The availability of large databases with person-time information and appropriate statistical methods allows for relatively rapid pharmacovigilance analyses. A semi-automated method was used to investigate the effect of fluoroquinolones on the incidence of C. difficile associated diarrhea (CDAD). Methods Two US databases, an electronic medical record (EMR) and a large medical claims database for the period 2006–2007, were evaluated using a semi-automated methodology. The raw EMR and claims datasets were subject to a normalization procedure that aligns the drug exposures and conditions using ontologies: SNOMED for medications and MedDRA for conditions. A retrospective cohort design was used together with matching by means of the propensity score. The association between exposure and outcome was evaluated using a Poisson regression model after taking into account potential confounders. Results A comparison between quinolones as the target cohort and macrolides as the comparison cohort produced a total of 564,797 subjects exposed to a quinolone in the claims data and 233,090 subjects in the EMR. They were matched with replacement within six strata of the propensity score. Among the matched cohorts there were a total of 488 and 158 outcomes in the claims data and the EMR, respectively. Quinolones were found to be twice as likely as macrolides to be significantly associated with CDAD after adjusting for risk factors (IRR 2.75, 95% CI 2.18–3.48). Conclusions A semi-automated method was successfully applied to two observational databases and was able to rapidly identify a potential for increased risk of developing CDAD with quinolones. Copyright © 2010 John Wiley & Sons, Ltd. [source]
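The Mera abstract combines a propensity score, strata of that score, and Poisson regression to produce an incidence rate ratio (IRR). A minimal sketch of that general workflow on simulated data is shown below; the variable names, effect sizes and simple quintile-style stratification are illustrative assumptions, and the sketch adjusts for propensity-score stratum in the model rather than matching with replacement within strata, which is a simplification of what the paper describes.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 20_000
df = pd.DataFrame({
    "quinolone": rng.integers(0, 2, n),          # vs. macrolide comparator
    "age": rng.uniform(18, 90, n),
    "hospitalized": rng.integers(0, 2, n),
    "person_years": rng.uniform(0.1, 1.0, n),
})
# Simulated outcome with a true rate ratio of roughly exp(0.9) ~ 2.5 for quinolone exposure
lam = np.exp(-4 + 0.9 * df["quinolone"] + 0.01 * df["age"]) * df["person_years"]
df["cdad"] = rng.poisson(lam)

# 1) Propensity score for quinolone exposure, cut into six strata
ps_model = smf.logit("quinolone ~ age + hospitalized", data=df).fit(disp=False)
df["ps_stratum"] = pd.qcut(ps_model.predict(df), 6, labels=False)

# 2) Poisson regression with a person-time offset, adjusting for stratum and covariates
irr_model = smf.glm(
    "cdad ~ quinolone + C(ps_stratum) + age + hospitalized",
    data=df,
    family=sm.families.Poisson(),
    offset=np.log(df["person_years"]),
).fit()
print(np.exp(irr_model.params["quinolone"]))          # incidence rate ratio
print(np.exp(irr_model.conf_int().loc["quinolone"]))  # 95% CI
```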
Trends and determinants of antiresorptive drug use for osteoporosis among elderly women. PHARMACOEPIDEMIOLOGY AND DRUG SAFETY, Issue 10 2005. Sylvie Perreault PhD. Abstract Aim It has been established that women who have had a first osteoporotic fracture are at a significantly greater risk of future fractures. Effective antiresorptive treatments (ART) are available to reduce this risk, yet little information is available on trends in ART drug use among the elderly. The objective is to estimate the rate ratio (RR) of having an ART prescription filled among elderly women and its relation to selected determinants from 1995 through 2001. Method A cohort design was used. Through random sampling, we selected 40% of the women aged 70 years and older listed in the Régie de l'assurance maladie du Québec (RAMQ) health database. The women were grouped into four cohorts (for 1995, 1996, 1998 and 2000). January 1 was established as the index date within each cohort. The dependent variable was the RR of having at least one prescription of ART drugs filled during the year following the index date among women with and without prior use. ART users were divided into two groups: bone-specific drugs (bisphosphonates, calcitonin, raloxifene) and HRT (hormone replacement therapy). The independent variable was whether or not (control) there had been an osteoporotic-related fracture. The RR of having at least one prescription of bone-specific drugs or of HRT filled during the year following the index date was determined using a Cox regression adjusted for age, chronic disease score (CDS) and prior bone mineral density (BMD) testing. Results Crude rates of BMD testing (per 500 person-years) ranged from 20.4 (1995) to 41.1 (2000) in women who had had an osteoporotic-related fracture, and from 4.4 to 15.3 in controls. The crude rate of women (per 100 person-years) who took at least one bone-specific drug during follow-up ranged from 1.9 in 1995 to 31 in 2000 among those with a prior osteoporotic-related fracture, and from 0.5 in 1995 to 11 in 2000 for controls; the corresponding figures for HRT ranged from 6.7 in 1995 to 13 in 2000, and from 8.4 in 1995 to 11 in 2000, respectively. BMD testing was the only major factor affecting the adjusted RR of having a prescription filled for bone-specific drugs (RR of 10.44; 6.91–15.79 in 1995 and RR of 3.68; 3.30–4.10 in 2000) or HRT (RR of 2.08; 1.64–2.64 in 1995 and RR of 1.44; 1.17–1.77 in 2000), particularly among women who had not had prior use. Fracture status significantly affected the RR of having at least one bone-specific drug prescription filled only among women without prior use (RR of 1.71; 1.26–2.33 in 1996 and RR of 1.77; 1.44–2.19 in 2000). Being younger did not affect the RR of having at least one prescription of bone-specific drugs filled, but it increased the RR of filling a prescription of HRT. Conclusions Significant change was seen over time in the number of BMD tests ordered and in ART use. Effective osteoporosis interventions are not optimal in the treatment of elderly women in a Canadian health-care system who have had an osteoporotic fracture, given that only approximately 25% of women who had had an osteoporotic-related fracture were users of ART. Copyright © 2005 John Wiley & Sons, Ltd. [source]

Assessing teratogenicity of antiretroviral drugs: monitoring and analysis plan of the Antiretroviral Pregnancy Registry. PHARMACOEPIDEMIOLOGY AND DRUG SAFETY, Issue 8 2004. Deborah L. Covington DrPH. Abstract This paper describes the Antiretroviral Pregnancy Registry's (APR) monitoring and analysis plan. APR is overseen by a committee of experts in obstetrics, pediatrics, teratology, infectious diseases, epidemiology and biostatistics from academia, government and the pharmaceutical industry. APR uses a prospective exposure-registration cohort design. Clinicians voluntarily register pregnant women with prenatal exposures to any antiretroviral therapy and provide fetal/neonatal outcomes. A birth defect is any birth outcome of ≥20 weeks gestation with a structural or chromosomal abnormality, as determined by a geneticist. The prevalence is calculated by dividing the number of defects by the total number of live births and is compared to the prevalence in the CDC's population-based surveillance system. Additionally, first-trimester exposures, during which organogenesis occurs, are compared with second/third-trimester exposures. Statistical inference is based on exact methods for binomial proportions. Overall, a cohort of 200 exposed newborns is required to detect a doubling of risk, with 80% power and a Type I error rate of 5%. APR uses the Rule of Three: immediate review occurs once three specific defects are reported for a specific exposure. The likelihood of finding three specific defects in a cohort of ≤600 by chance alone is less than 5% for all but the most common defects. To enhance the assurance of prompt, responsible, and appropriate action in the event of a potential signal, APR employs the strategy of 'threshold'. The threshold for action is determined by the extent of certainty about the cases, driven by statistical considerations and tempered by the specifics of the cases. Copyright © 2004 John Wiley & Sons, Ltd. [source]
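The APR abstract above quotes two concrete figures: roughly 200 exposed newborns to detect a doubling of the birth defect rate with 80% power at a 5% Type I error, and a less-than-5% chance of seeing three of the same specific defect in a cohort of up to 600 births. The sketch below illustrates that kind of arithmetic with simple binomial calculations; the assumed background prevalences (about 4% for any major defect, 1 per 1,000 for a single specific defect) are illustrative assumptions, not figures taken from the paper.

```python
import math
from scipy.stats import binom, norm

def n_for_doubling(p0, alpha=0.05, power=0.80, two_sided=False):
    """Normal-approximation sample size to detect a doubling of a background
    prevalence p0 against a fixed external reference rate."""
    p1 = 2 * p0
    z_a = norm.ppf(1 - (alpha / 2 if two_sided else alpha))
    z_b = norm.ppf(power)
    num = z_a * math.sqrt(p0 * (1 - p0)) + z_b * math.sqrt(p1 * (1 - p1))
    return math.ceil((num / (p1 - p0)) ** 2)

# With an assumed ~4% background defect prevalence, detecting a doubling
# needs a cohort on the order of 200 exposed newborns (one-sided test).
print(n_for_doubling(0.04))             # roughly 190

# Rule of Three: probability of seeing >= 3 cases of one specific defect
# (assumed background rate 1 per 1,000 births) in 600 births by chance alone.
print(1 - binom.cdf(2, 600, 1 / 1000))  # about 0.02, i.e. under 5%
```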
Environmental exposure to carcinogens causing lung cancer: epidemiological evidence from the medical literature. RESPIROLOGY, Issue 4 2003. Melissa J. Whitrow. Objective: In 2000 there were 1.1 million lung or bronchial cancer deaths worldwide, with relatively limited evidence of causation other than for smoking. We aimed to search and appraise the literature regarding evidence for a causal relationship between air pollution and lung cancer according to the 10 Bradford Hill criteria for causality. Methodology: A MEDLINE search was performed using the following key words: 'lung neoplasm', 'epidemiology', 'human', 'air pollution' and 'not molec*'. The criteria for inclusion were: cited original research that described the study population, measured environmental factors, was of case–control or cohort design, and was undertaken after 1982. Results: Fourteen papers (10 case–control, four cohort studies) fulfilled the search criteria, with sample sizes ranging from 101 cases and 89 controls to a cohort of 552 cases and 138 controls. Of the 14 papers that fulfilled the search criteria, the number of papers addressing each of the Bradford Hill criteria was as follows. Strength of association: eight studies demonstrated significant positive associations between environmental exposure and lung cancer, with a relative risk range of 1.14–5.2. One study found a negative association, with a relative risk of 0.28. Consistency: eight of 14 studies found significant positive associations and one of 14 a significant negative association. Specificity: tobacco smoking and occupational exposure were addressed in all studies (often crudely, with misclassification). Temporality: exposure prior to diagnosis was demonstrated in nine studies. Dose–response relationship: evident in three studies. Coherence, analogy: not addressed in any study. Conclusion: Evidence for causality is modest, with intermediate consistency of findings, limited dose–response evidence and crude adjustment for important potential confounders. Large studies with comprehensive risk factor quantification are required to clarify the potentially small effect of air pollution given the relatively large effects of tobacco smoking and occupational carcinogen exposure. [source]

Fate of the Mate: The Influence of Delayed Graft Function in Renal Transplantation on the Mate Recipient. AMERICAN JOURNAL OF TRANSPLANTATION, Issue 8 2009. J. F. Johnson. Delayed graft function (DGF) in a deceased-donor renal recipient is associated with allograft dysfunction 1 year posttransplant. There is limited research on the influence on allograft function over time in the mate of a DGF recipient. Using a retrospective cohort design, we studied 55 recipients from a single center. The primary outcome was the change in glomerular filtration rate (GFR) 1 year posttransplant. The secondary outcome was the GFR at baseline. We found that mates of DGF recipients had a mean change in GFR 1 year posttransplant of −11.2 mL/min, while the control group had a mean change of −0.4 mL/min. The difference in the primary outcome was significant (p = 0.025) in a multivariate analysis adjusting for cold ischemic time, panel reactive antibody level, allograft loss, human leukocyte antigen (HLA)-B mismatches and HLA-DR mismatches. No significant difference between groups was found in baseline GFR. In conclusion, mates of DGF recipients had a significantly larger decline in allograft function 1 year posttransplant compared to controls with similar renal function at baseline. We believe strategies that may preserve allograft function in these 'at-risk' recipients should be developed and tested. [source]

Is in vitro fertilisation more effective than stimulated intrauterine insemination as a first-line therapy for subfertility? A cohort analysis. AUSTRALIAN AND NEW ZEALAND JOURNAL OF OBSTETRICS AND GYNAECOLOGY, Issue 3 2010. Objective: To compare a strategy of two cycles of intrauterine insemination with controlled ovarian hyperstimulation (IUI/COH) vs one in vitro fertilisation (IVF) treatment programme (one fresh plus associated frozen embryo cycles) in couples presenting with unexplained, mild male or mild female subfertility. Methods: A retrospective cohort design was used and analysed according to intention-to-treat principles.
A total of 272 couples underwent an intended course of two cycles of IUI/COH and 176 couples underwent one IVF treatment programme. Results: The cumulative live birth rate (CLBR) per couple for the IUI/COH group was 27.6% compared to 39.2% for the IVF group (P = 0.01). The mean time to pregnancy was 69 days in the IUI/COH group compared to 44 days in the IVF group (P = 0.02). The IVF programme was costlier, with an incremental cost-effectiveness ratio for an additional live birth in the range of $39 637–$46 325. The multiple delivery rate was 13.3% in the IUI/COH group compared to 10.1% in the IVF group (P = 0.55). One set of triplets and one set of quadruplets followed IUI/COH treatment. Conclusions: One IVF treatment programme was more effective, but costlier, than an intended course of two cycles of IUI/COH. With consistently higher success rates, shorter times to pregnancy and a trend towards fewer higher-order multiple pregnancies, this study supports the view that IVF is now potentially safer and more clinically effective than IUI/COH as a first-line therapy for subfertility. [source]

Validation of the WHOQOL-BREF among women following childbirth. AUSTRALIAN AND NEW ZEALAND JOURNAL OF OBSTETRICS AND GYNAECOLOGY, Issue 2 2010. Joan Webster. Background: There is increasing interest in measuring quality of life (QOL) in clinical settings and in clinical trials. None of the commonly used QOL instruments has been validated for use postnatally. Aim: To assess the psychometric properties of the 26-item WHOQOL-BREF (short version of the World Health Organization Quality of Life assessment) among women following childbirth. Methods: Using a prospective cohort design, we recruited 320 women within the first few days of childbirth. At six weeks postpartum, participants were asked to complete the WHOQOL-BREF, the Edinburgh Postnatal Depression Scale and the Australian Unity Wellbeing Index. Validation of the WHOQOL-BREF included an analysis of internal consistency, discriminant validity, convergent validity and an examination of the domain structure. Results: In all, 221 (69.1%) women returned their six-week questionnaire. All domains of the WHOQOL-BREF met reliability standards (alpha coefficient exceeding 0.70). The questionnaire discriminated well between known groups (depressed and non-depressed women, P < 0.001) and demonstrated satisfactory correlations with the Australian Unity Wellbeing Index (r ≥ 0.45). The domain structure of the WHOQOL-BREF was also valid in this population of new mothers, with moderate-to-high correlation between individual items and the domain to which the items were originally assigned. Conclusion: The WHOQOL-BREF is a well-accepted and valid instrument in this population and may be used in postnatal clinical settings or for assessing intervention effects in research studies. [source]

Endometrial cells as a predictor of uterine cancer. AUSTRALIAN AND NEW ZEALAND JOURNAL OF OBSTETRICS AND GYNAECOLOGY, Issue 1 2007. Adrian R. Heard. Abstract Background: With the recent cervix screening national guidelines recommending against reporting of benign endometrial cells, we examined South Australian data to see what impact this would have on detecting uterine cancers. Aims: To test whether benign endometrial cells detected in cervical cytology testing confer an increased risk of uterine cancer, and to ascertain what percentage of uterine cancers would be missed in cervical screening programs if these cells are not reported.
Methods: The study was a retrospective cohort design of 1585 women with shed endometrial cells, each matched with three women without shed cells. All were linked with cancer registry data to check for uterine cancer diagnosis. Cox proportional hazards regression was used to check for any increase in cancer risk with shed endometrial cells. Using the calculated relative risks for uterine cancer diagnosis, we estimated the number of uterine cancers in South Australia associated with benign endometrial cells. Results: The presence of benign endometrial cells in a cervical cytology test increases the risk of uterine cancer sixfold. However, screening women with benign cells would involve a major increase in pathology work for only an 18% increase in uterine cancers detected. Conclusions: Until cytology systems have a higher sensitivity in detecting which benign endometrial cells are associated with uterine cancer, pathology laboratories are unlikely to be required to report these cells on tests. Inability to adjust for symptomatic status may have reduced the relevance of the results of this study. [source]

Case–Cohort Analysis with Accelerated Failure Time Model. BIOMETRICS, Issue 1 2009. Lan Kong. Summary In a case–cohort design, covariates are assembled only for a subcohort that is randomly selected from the entire cohort and for any additional cases outside the subcohort. This design is appealing for large cohort studies of rare disease, especially when the exposures of interest are expensive to ascertain for all the subjects. We propose statistical methods for analyzing case–cohort data with a semiparametric accelerated failure time model that interprets the covariate effects as accelerating or decelerating the time to failure. Asymptotic properties of the proposed estimators are developed. The finite-sample properties of the case–cohort estimator and its relative efficiency compared with the full-cohort estimator are assessed via simulation studies. A real example from a study of cardiovascular disease is provided to illustrate the estimating procedure. [source]
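The Kong abstract above rests on the case–cohort sampling scheme itself: covariates are collected only for a random subcohort plus all cases that fall outside it. A minimal sketch of constructing such a sample from full-cohort data is shown below; the 10% subcohort fraction and the column names are illustrative assumptions, and the weighting shown is a simple inverse-sampling-fraction scheme rather than the accelerated failure time estimator the paper develops.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
n = 10_000
cohort = pd.DataFrame({
    "id": np.arange(n),
    "case": rng.binomial(1, 0.02, n),  # rare outcome, ~2% of the cohort
})

# 1) Draw a random subcohort (here 10% of everyone, cases and non-cases alike)
subcohort_ids = rng.choice(cohort["id"].to_numpy(), size=n // 10, replace=False)
cohort["in_subcohort"] = cohort["id"].isin(subcohort_ids)

# 2) Expensive covariates are ascertained only for subcohort members and all cases
sample = cohort[cohort["in_subcohort"] | (cohort["case"] == 1)].copy()

# 3) A simple weighting: cases count fully, non-case subcohort members are
#    weighted up by the inverse of the subcohort sampling fraction
sampling_fraction = 0.10
sample["weight"] = np.where(sample["case"] == 1, 1.0, 1.0 / sampling_fraction)

print(len(sample), "subjects need covariate ascertainment out of", n)
```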
Waiting time for rehabilitation services for children with physical disabilities. CHILD: CARE, HEALTH AND DEVELOPMENT, Issue 5 2002. D. Ehrmann Feldman. Abstract Background Early rehabilitation may minimize disability and complications. However, children often wait a long time to gain admission to rehabilitation centres. Objectives To describe waiting times for paediatric physical and occupational therapy and to determine factors associated with these waiting times. Research Design The study was a prospective cohort design. Patients were followed from 1 January 1999 to 1 March 2000. Subjects All children with physical disabilities, aged 0–18 years, referred in 1999 from the Montreal Children's Hospital to paediatric rehabilitation centres. Measures Data on date of referral, date of first appointment at the rehabilitation centre, age, gender, diagnosis, region and language were obtained from the rehabilitation transfer database. Primary family caregivers of children who were transferred to a rehabilitation facility participated in a telephone interview regarding their perceptions of the transfer process. Results There were 172 children referred to rehabilitation facilities. The mean age of the children was 2.5 years. Average waiting time was 157.4 days (SD 57.1) for occupational therapy and 129.4 days (SD 51.6) for physical therapy. Decreased waiting time was associated with living in the city as opposed to the suburbs (hazard ratio = 1.77; 95% confidence interval = 0.92–3.41) and was inversely associated with age (hazard ratio = 0.46; 95% confidence interval = 0.34–0.62). Among the 41 primary family caregivers who participated in the survey, higher empowerment scores were associated with shorter waits for rehabilitation. Conclusion Waiting time for rehabilitation services needs to be reduced. Empowered parents appear to manoeuvre within the system to reduce waiting times for their children. [source]

Evidence of shared genetic risk factors for migraine and rolandic epilepsy. EPILEPSIA, Issue 11 2009. Tara Clarke. Summary Purpose: Evidence for a specific association between migraine and rolandic epilepsy (RE) has been conflicting. Children with migraine frequently have electroencephalographic (EEG) abnormalities, including rolandic discharges, and approximately 50% of siblings of patients with RE exhibit rolandic discharges. We assessed migraine risk in RE probands and their siblings. Methods: We used cohort and reconstructed cohort designs to assess, respectively, the relative risk of migraine in 72 children with RE and in their 88 siblings, using International Classification of Headache Disorders (ICHD-2) criteria. Incidences were compared with those in 150 age- and geographically matched nonepilepsy probands and their 188 siblings. We used a Cox proportional hazards model, using age as the time base, adjusting hazard ratios (HRs) for sex in the proband analysis, and for sex and proband migraine status in the sibling analysis. Results: Prevalence of migraine in RE probands was 15% versus 7% in nonepilepsy probands, and in siblings of RE probands prevalence was 14% versus 4% in nonepilepsy siblings. The sex-adjusted HR of migraine for an RE proband was 2.46 [95% confidence interval (CI) 1.06–5.70]. The adjusted HR of having ≥1 sibling with migraine in an RE family was 3.35 (95% CI 1.20–9.33), whereas the HR for any one sibling of an RE proband was 2.86 (95% CI 1.10–7.43). Discussion: Migraine is strongly comorbid with RE and independently clusters in the siblings of RE probands. These results suggest a shared susceptibility to migraine and RE that is not directly mediated by epileptic seizures. Susceptibility gene variants for RE may be tested as risk factors for migraine. [source]