Longitudinal Cohort Study (longitudinal + cohort_study)
Kinds of Longitudinal Cohort Study: Selected Abstracts

Antihypertensive Drugs and New-Onset Diabetes: A Retrospective Longitudinal Cohort Study (CARDIOVASCULAR THERAPEUTICS, Issue 3 2009) - Gwo-Ping Jong

Antihypertensive drugs have been linked to new-onset diabetes (NOD); however, the effect of these drugs on the development of NOD in hypertensive patients has not been well characterized. We aimed to investigate the association between antihypertensive drugs and NOD. This was a retrospective cohort study performed using data from claim forms provided to the central region branch of the Bureau of National Health Insurance in Taiwan from January 2002 to December 2007. Prescriptions for antihypertensive drugs before the index date were retrieved from a prescription database. We estimated the odds ratios (ORs) of NOD associated with antihypertensive drug use; nondiabetic subjects served as the reference group. A total of 4233 NOD cases were identified in 24,688 hypertensive patients during the study period. The risk of NOD after adjusting for sex and age was higher among users of diuretics (OR = 1.10, 95% confidence interval [CI] = 1.01-1.20), beta-blockers (BBs; OR = 1.12, 95% CI = 1.04-1.21), and calcium channel blockers (CCBs; OR = 1.10, 95% CI = 1.02-1.18) than among nonusers. Patients taking angiotensin-converting enzyme (ACE) inhibitors (OR = 0.92, 95% CI = 0.84-1.00), angiotensin receptor blockers (ARBs; OR = 0.90, 95% CI = 0.81-0.98), or alpha-blockers (OR = 0.88, 95% CI = 0.80-0.98) were at a lower risk of developing NOD than nonusers. Vasodilators were not associated with the risk of NOD. The results of this study suggest that hypertensive patients who take ACE inhibitors, ARBs, or alpha-blockers are at a lower risk of NOD. Diuretics, BBs, and CCBs were associated with a significant increase in the risk of NOD. [source]

Access to Health Care Services for the Disabled Elderly (HEALTH SERVICES RESEARCH, Issue 3p1 2006) - Donald H. Taylor Jr.

Objective. To determine whether difficulty walking and the strategies persons use to compensate for this deficit influenced downstream Medicare expenditures. Data Source. Secondary analysis of Medicare claims data (1999-2000) for age-eligible Medicare beneficiaries (N=4,997) responding to the community portion of the 1999 National Long Term Care Survey (NLTCS). Study Design. Longitudinal cohort study. Walking difficulty and compensatory strategy were measured at the 1999 NLTCS and used to predict health care use as measured in Medicare claims data from the survey date through year-end 2000. Data Extraction. Respondents to the 1999 community NLTCS with complete information on key explanatory variables (walking difficulty and compensatory strategy) were linked with Medicare claims to define outcome variables (health care use and cost). Principal Findings. Persons who reported it was very difficult to walk had more downstream home health visits (1.1/month, p<.001) but fewer outpatient physician visits (-0.16/month, p<.001) after controlling for overall disease burden. Those using a compensatory strategy for walking also had increased home health visits/month (0.55 for equipment, 1.0 for personal assistance, p<.001 for both) but did not have significantly reduced outpatient visits. Persons reporting difficulty walking had increased downstream Medicare costs ranging from $163 to $222/month (p<.001), depending upon how difficult walking was. Less than half of the persons who used equipment to adapt to walking difficulty had their difficulty fully compensated by the use of equipment. Persons using equipment that fully compensated their difficulty used around $300/month less in Medicare-financed costs compared with those with residual difficulty. Conclusions. Difficulty walking and use of compensatory strategies are correlated with the use of Medicare-financed services. The potential impact on the Medicare program is large, given how common such limitations are among the elderly. [source]

Cardiovascular Disease Is Associated with Greater Incident Dehydroepiandrosterone Sulfate Decline in the Oldest Old: The Cardiovascular Health Study All Stars Study (JOURNAL OF AMERICAN GERIATRICS SOCIETY, Issue 3 2010) - Jason L. Sanders BA

OBJECTIVES: To describe cross-sectional and longitudinal associations with dehydroepiandrosterone sulfate (DHEAS) and change in DHEAS with age. DESIGN: Longitudinal cohort study. SETTING: Pittsburgh, Pennsylvania. PARTICIPANTS: Cardiovascular Health Study All Stars study participants assessed in 2005/06 (N=989, mean age 85.2, 63.5% women, 16.5% African American). MEASUREMENTS: Health characteristics were assessed in 2005/06 according to DHEAS level, mean DHEAS and DHEAS change across age categories were tested, and linear and logistic regression was used to identify factors present in 1996/97 associated with continuous and categorical DHEAS change. RESULTS: Mean ± standard deviation DHEAS was 0.555 ± 0.414 µg/mL in 1996/97 and 0.482 ± 0.449 µg/mL in 2005/06 for women and 0.845 ± 0.520 µg/mL in 1996/97 and 0.658 ± 0.516 µg/mL in 2005/06 for men. In 2005/06, DHEAS was lower in women and subjects with cardiovascular disease (CVD) and chronic pulmonary disease and higher for African Americans and subjects with hypertension and high cholesterol. Mean DHEAS change was greater in men (-0.200 µg/mL) than in women (-0.078 µg/mL) (P<.001). Each 1-year increase in age attenuated the effect of male sex by 0.01 µg/mL (P=.009), abolishing the sex difference in DHEAS change by age 79. Presence of CVD before the study period was associated with greater absolute DHEAS change (β = -0.04 µg/mL, P=.04) and with the fourth quartile of DHEAS change versus the first to third quartiles (odds ratio=1.46, 95% confidence interval=1.03-2.05). CONCLUSION: DHEAS change continues into very old age, is not homogenous, is affected by sex, and is associated with prevalent CVD. Future studies should investigate factors that might accelerate DHEAS decline. [source]
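Several of the abstracts above report adjusted odds ratios from logistic regression (for example, the odds of falling into the fourth quartile of DHEAS decline given prevalent cardiovascular disease). As a rough illustration of that style of analysis, the sketch below fits a logistic model and exponentiates the coefficients to obtain ORs with 95% confidence intervals. The data frame and all column names are hypothetical placeholders, not the study's dataset.

```python
# Hypothetical sketch: age- and sex-adjusted odds ratio from logistic regression,
# in the spirit of the DHEAS-change analysis above. All data are simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "top_quartile": rng.binomial(1, 0.25, 500),  # 4th quartile of DHEAS decline
    "cvd":          rng.binomial(1, 0.30, 500),  # prevalent CVD at baseline
    "age":          rng.normal(75, 5, 500),
    "male":         rng.binomial(1, 0.40, 500),
})

fit = smf.logit("top_quartile ~ cvd + age + male", data=df).fit(disp=0)
ors = np.exp(fit.params)      # odds ratios
ci = np.exp(fit.conf_int())   # 95% confidence intervals on the OR scale
print(pd.concat([ors.rename("OR"),
                 ci.rename(columns={0: "2.5%", 1: "97.5%"})], axis=1))
```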
Physical Performance and Subsequent Disability and Survival in Older Adults with Malignancy: Results from the Health, Aging and Body Composition Study (JOURNAL OF AMERICAN GERIATRICS SOCIETY, Issue 1 2010) - Heidi D. Klepin MD

OBJECTIVES: To evaluate objective physical performance measures as predictors of survival and subsequent disability in older patients with cancer. DESIGN: Longitudinal cohort study. SETTING: Health, Aging and Body Composition (Health ABC) Study. PARTICIPANTS: Four hundred twenty-nine individuals diagnosed with cancer during the first 6 years of follow-up of the Health ABC Study. MEASUREMENTS: The associations between precancer measures of physical performance (20-m usual gait speed, 400-m long-distance corridor walk (LDCW), and grip strength) and overall survival and a short-term outcome of 2-year progression to disability or death were evaluated. Cox proportional hazards and logistic regression models, stratified for metastatic disease, were used for these outcomes, respectively. RESULTS: Mean age was 77.2, 36.1% were women, and 45.7% were black. Faster 20-m usual walking speed was associated with a lower risk of death in the metastatic group (hazard ratio=0.89, 95% confidence interval (CI)=0.79-0.99) and lower 2-year progression to disability or death in the nonmetastatic group (odds ratio (OR)=0.77, 95% CI=0.64-0.94). Ability to complete the 400-m LDCW was associated with lower 2-year progression to disability or death in the nonmetastatic group (OR=0.24, 95% CI=0.10-0.62). There were no associations between grip strength and disability or death. CONCLUSION: Lower extremity physical performance tests (usual gait speed and 400-m LDCW) were associated with survival and 2-year progression to disability or death. Objective physical performance measures may help inform pretreatment evaluations in older adults with cancer. [source]

Do Hierarchical Condition Category Model Scores Predict Hospitalization Risk in Newly Enrolled Medicare Advantage Participants as Well as Probability of Repeated Admission Scores? (JOURNAL OF AMERICAN GERIATRICS SOCIETY, Issue 12 2009) - David G. Mosley MHA

OBJECTIVES: To compare how well hierarchical condition categories (HCC) and probability of repeated admission (PRA) scores predict hospitalization. DESIGN: Longitudinal cohort study with 12-month follow-up. SETTING: A Medicare Advantage (MA) plan. PARTICIPANTS: Four thousand five hundred six newly enrolled beneficiaries. MEASUREMENT: HCC scores were identified from enrollment files. The PRA tool was administered by mail and telephone. Inpatient admissions were based on notifications. The Mann-Whitney test was used to compare HCC scores of PRA responders and nonresponders. The receiver operating characteristic curve provided the area under the curve (AUC) for each score. Admission risk in the top 5% of scores was evaluated using logistic regression. RESULTS: Within 60 days of enrollment, 45.1% of the 3,954 beneficiaries with HCC scores completed the PRA tool. HCC scores were lower for the 1,783 PRA respondents than the 2,171 nonrespondents (0.71 vs 0.81, P<.001). AUCs predicting hospitalization with regard to HCC and PRA were similar (0.638, 95% confidence interval (CI)=0.603-0.674; 0.654, 95% CI=0.618-0.690). Individuals identified in the top 5% of scores using both tools, using HCC alone, or using PRA alone had higher risk for hospitalization than those below the 95th percentile (odds ratio (OR)=8.5, 95% CI=3.7-19.4; OR=3.8, 95% CI=2.3-6.3; and OR=3.9, 95% CI=2.3-6.4, respectively). CONCLUSION: HCC scores provided to MA plans for risk adjustment of revenue can also be used to identify hospitalization risk. Additional studies are required to evaluate whether a hybrid approach incorporating administrative and self-reported models would further optimize risk stratification efforts. [source]
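The preceding abstract compares how well two risk scores discriminate future hospitalization via the area under the ROC curve. Below is a minimal sketch of that comparison: compute each score's AUC and bootstrap a 95% CI. The arrays are simulated stand-ins, not the study's data.

```python
# Hypothetical sketch: AUC with bootstrap 95% CI for two competing risk scores.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)
n = 2000
admitted = rng.binomial(1, 0.15, n)          # observed hospitalization (0/1)
hcc = 0.7 * admitted + rng.normal(0, 1, n)   # simulated HCC-like score
pra = 0.7 * admitted + rng.normal(0, 1, n)   # simulated PRA-like score

def auc_ci(y, score, n_boot=2000):
    aucs = []
    for _ in range(n_boot):
        idx = rng.integers(0, len(y), len(y))
        if y[idx].min() == y[idx].max():     # resample must contain both classes
            continue
        aucs.append(roc_auc_score(y[idx], score[idx]))
    return roc_auc_score(y, score), np.percentile(aucs, [2.5, 97.5])

for name, s in [("HCC", hcc), ("PRA", pra)]:
    auc, (lo, hi) = auc_ci(admitted, s)
    print(f"{name}: AUC={auc:.3f} (95% CI {lo:.3f}-{hi:.3f})")
```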
Allostatic Load and Frailty in Older Adults (JOURNAL OF AMERICAN GERIATRICS SOCIETY, Issue 9 2009) - Tara L. Gruenewald PhD

OBJECTIVES: To examine the association between allostatic load (AL), an index of multisystem physiological dysregulation, and frailty development over a 3-year follow-up in a sample of older adults. DESIGN: Longitudinal cohort study. SETTING: Community. PARTICIPANTS: High-functioning men and women aged 70 to 79 at study entry. MEASUREMENTS: Multisystem physiological dysregulation, or AL, was assessed according to 13 biomarkers of cardiovascular, endocrine, immune, and metabolic function. An AL score was computed as the total number of biomarkers for which participant values fell into high-risk biomarker quartiles. Frailty status (not frail, intermediate frail, frail) was determined according to the total number of five indicators of frailty: weight loss, exhaustion, weak grip, slow gait, and low physical activity. The association between level of AL at baseline and frailty status 3 years later was examined using ordinal logistic regression in 803 participants not frail at baseline. RESULTS: In a multivariable model adjusting for sociodemographic, health, and behavioral characteristics, each 1-unit increase in AL at baseline was associated with a 10% greater likelihood of frailty at the 3-year follow-up (cumulative adjusted odds ratio=1.10, 95% confidence interval=1.03-1.19). CONCLUSION: These findings support the hypothesis that dysregulation across multiple physiological systems is associated with greater risk of frailty. Greater levels of multisystem physiological dysregulation may serve as a warning sign of frailty development in later life. [source]

Vision-Enhancing Interventions in Nursing Home Residents and Their Short-Term Effect on Physical and Cognitive Function (JOURNAL OF AMERICAN GERIATRICS SOCIETY, Issue 2 2009) - Amanda F. Elliott PhD

OBJECTIVES: To evaluate the effect of vision-enhancing interventions (cataract surgery or refractive error correction) on physical function and cognitive status in nursing home residents. DESIGN: Longitudinal cohort study. SETTING: Seventeen nursing homes in Birmingham, Alabama. PARTICIPANTS: A total of 187 English-speaking adults aged 55 and older. INTERVENTION: Participants took part in one of two vision-enhancing interventions: cataract surgery or refractive error correction. Each group was compared against a control group (persons eligible for but who declined cataract surgery or who received delayed correction of refractive error). MEASUREMENTS: Physical function (ability to perform activities of daily living and mobility) was assessed using a series of self-report and certified nursing assistant ratings at baseline and at 2 months for the refractive error correction group and at 4 months for the cataract surgery group. The Mini-Mental State Examination was also administered. RESULTS: No significant differences existed within or between groups from baseline to follow-up on any of the measures of physical function. Mental status scores significantly declined from baseline to follow-up for the immediate (P=.05) and delayed (P<.02) refractive error correction groups and for the cataract surgery control group (P=.05). CONCLUSION: Vision-enhancing interventions did not lead to short-term improvements in physical functioning or cognitive status in this sample of elderly nursing home residents. [source]
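The allostatic load abstract above models an ordered outcome (not frail, intermediate frail, frail) with ordinal logistic regression. A minimal sketch of a proportional-odds model using statsmodels' OrderedModel follows; the data and effect sizes are invented for illustration.

```python
# Hypothetical sketch: proportional-odds (ordinal logistic) regression for a
# three-level frailty outcome, as in the allostatic load analysis above.
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(0)
n = 803
al = rng.integers(0, 14, n)                # AL score: count of high-risk biomarkers
latent = 0.1 * al + rng.logistic(size=n)   # latent frailty propensity
frailty = pd.cut(latent, [-np.inf, 1.0, 2.5, np.inf],
                 labels=["not frail", "intermediate", "frail"])

mod = OrderedModel(frailty, pd.DataFrame({"al": al}), distr="logit")
res = mod.fit(method="bfgs", disp=0)
print(np.exp(res.params["al"]))            # cumulative OR per 1-unit increase in AL
```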
Dementia and Alzheimer's Disease Incidence in Relationship to Cardiovascular Disease in the Cardiovascular Health Study Cohort (JOURNAL OF AMERICAN GERIATRICS SOCIETY, Issue 7 2005) - Anne B. Newman MD

Objectives: To determine whether coronary artery disease, peripheral arterial disease (PAD), or noninvasive markers of cardiovascular disease (CVD) predict the onset of dementia and Alzheimer's disease (AD). Design: Longitudinal cohort study. Setting: Four U.S. communities. Participants: Men and women (N=3,602) with a brain magnetic resonance imaging (MRI) scan but no dementia were followed for 5.4 years. Participants with stroke were excluded. Measurements: Neurologists and psychiatrists classified incident cases of dementia and subtype using neuropsychological tests, examination, medical records, and informant interviews. CVD was defined at the time of the MRI scan. Noninvasive tests of CVD were assessed within 1 year of the MRI. Apolipoprotein E allele status, age, race, sex, education, Mini-Mental State Examination score, and income were assessed as potential confounders. Results: The incidence of dementia was higher in those with prevalent CVD, particularly in the subgroup with PAD. The rate of AD was 34.4 per 1,000 person-years for those with a history of CVD, versus 22.2 per 1,000 person-years without a history of CVD (adjusted hazard ratio (HR)=1.3, 95% confidence interval (CI)=1.0-1.7). Rates of AD were highest in those with PAD (57.4 vs 23.7 per 1,000 person-years, adjusted HR=2.4, 95% CI=1.4-4.2). Results were similar with further exclusion of those with vascular dementia from the AD group. A gradient of increasing risk was noted with the extent of vascular disease. Conclusion: Older adults with CVD other than stroke had a higher risk of dementia and AD than did those without CVD. The risk was highest in people with PAD, suggesting that extensive peripheral atherosclerosis is a risk factor for AD. [source]

Stability and decline in gross motor function among children and youth with cerebral palsy aged 2 to 21 years (DEVELOPMENTAL MEDICINE & CHILD NEUROLOGY, Issue 4 2009) - Steven E. Hanna PhD

This paper reports the construction of gross motor development curves for children and youth with cerebral palsy (CP) in order to assess whether function is lost during adolescence. We followed children previously enrolled in a prospective longitudinal cohort study for an additional 4 years, as they entered adolescence and young adulthood. The resulting longitudinal dataset comprised 3455 observations of 657 children with CP (369 males, 288 females), assessed up to 10 times, at ages ranging from 16 months to 21 years. Motor function was assessed using the 66-item Gross Motor Function Measure (GMFM-66). Participants were classified using the Gross Motor Function Classification System (GMFCS). We assessed the loss of function in adolescence by contrasting a model of function that assumes no loss with a model that allows for a peak and subsequent decline. We found no evidence of functional decline, on average, for children in GMFCS Levels I and II. However, in Levels III, IV, and V, average GMFM-66 was estimated to peak at ages 7 years 11 months, 6 years 11 months, and 6 years 11 months respectively, before declining by 4.7, 7.8, and 6.4 GMFM-66 points, in Levels III, IV, and V respectively, as these adolescents became young adults. We show that these declines are clinically significant. [source]
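The dementia incidence abstract above reports adjusted hazard ratios, which in a cohort like this are typically estimated with a Cox proportional hazards model. A small sketch with the lifelines library follows; the columns and effect sizes are simulated, not the Cardiovascular Health Study data.

```python
# Hypothetical sketch: adjusted hazard ratio for incident dementia given
# baseline CVD, via a Cox proportional hazards model (lifelines).
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 3602
df = pd.DataFrame({
    "cvd": rng.binomial(1, 0.25, n),   # prevalent CVD at baseline
    "age": rng.normal(75, 5, n),
})
# Simulated follow-up: hazard increases with CVD and age.
baseline = rng.exponential(8, n)
df["years"] = baseline / np.exp(0.3 * df["cvd"] + 0.02 * (df["age"] - 75))
df["dementia"] = (df["years"] < 5.4).astype(int)  # event within follow-up
df.loc[df["dementia"] == 0, "years"] = 5.4        # administrative censoring

cph = CoxPHFitter().fit(df, duration_col="years", event_col="dementia")
cph.print_summary()  # exp(coef) column gives the adjusted HRs
```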
Parents who quit smoking and their adult children's smoking cessation: a 20-year follow-up study (ADDICTION, Issue 6 2009) - Jonathan B. Bricker

ABSTRACT Aims Extending our earlier findings from a longitudinal cohort study, this study examines parents' early and late smoking cessation as predictors of their young adult children's smoking cessation. Design Parents' early smoking cessation status was assessed when their children were aged 8 years; parents' late smoking cessation was assessed when their children were aged 17 years. Young adult children's smoking cessation, of at least 6 months duration, was assessed at age 28 years. Setting Forty Washington State school districts. Participants and measurements Participants were 991 at least weekly smokers at age 17 whose parents were ever regular smokers and who also reported their smoking status at age 28. Questionnaire data were gathered on parents and their children (49% female and 91% Caucasian) in a longitudinal cohort (84% retention). Findings Among children who smoked daily at age 17, parents' quitting early (i.e. by the time their children were aged 8) was associated with a 1.7 times higher odds of these children quitting by age 28 compared to those whose parents did not quit [odds ratio (OR) 1.70; 95% confidence interval (CI) 1.23-2.36]. Results were similar among children who smoked weekly at age 17 (OR 1.91; 95% CI 1.41-2.58). There was a similar, but non-significant, pattern of results among those whose parents quit late. Conclusions Supporting our earlier findings, results suggest that parents' early smoking cessation has a long-term influence on their adult children's smoking cessation. Parents who smoke should be encouraged to quit when their children are young. [source]
Adult Emergency Department Patients with Sickle Cell Pain Crisis: A Learning Collaborative Model to Improve Analgesic Management (ACADEMIC EMERGENCY MEDICINE, Issue 4 2010) - Paula Tanabe PhD

Abstract Objectives: The objectives were to report the baseline (prior to quality improvement interventions) patient and visit characteristics and analgesic management practices for each site participating in an emergency department (ED) sickle cell learning collaborative. Methods: A prospective, multisite longitudinal cohort study in the context of a learning-collaborative model was performed in three midwestern EDs. Each site formed a multidisciplinary team charged with improving analgesic management for patients with sickle cell disease (SCD). Each team developed a nurse-initiated analgesic protocol for SCD patients (implemented after a baseline data collection period of 3.5 months at one site and 10 months at the other two sites). All sites prospectively enrolled adults with an acute pain crisis and SCD. All medical records for patients meeting study criteria were reviewed. Demographic, health services, and analgesic management data were abstracted, including ED visit frequency data, ED disposition, arrival and discharge pain score, and name and route of initial analgesic administered. Ten interviews per quarter per site were conducted with patients within 14 days of their ED discharge, and subjects were queried about the highest level of pain acceptable at discharge. The primary outcome variable was the time to initial analgesic administration. Variable data were described as means and standard deviations (SDs) or medians and interquartile ranges (IQR) for nonnormal data. Results: A total of 155 patients met study criteria (median age = 32 years, IQR = 24-40 years) with a total of 701 ED visits. Eighty-six interviews were conducted. Most patients (71.6%) had between one and three visits to the ED during the study period. However, after removing Site 3 from the analysis because of the short data enrollment period (3.5 months), which influenced the mean number of visits for the entire cohort, 52% of patients had between one and three ED visits over 10 months, 21% had four to nine visits, and 27% had between 10 and 67 visits. Fifty-nine percent of patients were discharged home. The median time to initial analgesic for the cohort was 74 minutes (IQR = 48-135 minutes). Differences between choice of analgesic agent and route selected were evident between sites. For the cohort, 680 initial analgesic doses were given (morphine sulfate, 42%; hydromorphone, 46%; meperidine, 4%; morphine sulfate and ibuprofen or ketorolac, 7%) using the following routes: oral (2%), intravenous (67%), subcutaneous (3%), and intramuscular (28%). Patients reported a significantly lower targeted discharge pain score (mean ± SD = 4.19 ± 1.18) compared to the actual documented discharge pain score within 45 minutes of discharge (mean ± SD = 5.77 ± 2.45; mean difference = 1.58, 95% confidence interval = 0.723 to 2.44, n = 43). Conclusions: While half of the patients had one to three ED visits during the study period, many patients had more frequent visits. Delays to receiving an initial analgesic were common, and post-ED interviews reveal that sickle cell pain patients are discharged from the ED with higher pain scores than what they perceive as desirable. ACADEMIC EMERGENCY MEDICINE 2010; 17:399-407 [source]

Predictors of entering 24-h care for people with Alzheimer's disease: results from the LASER-AD study (INTERNATIONAL JOURNAL OF GERIATRIC PSYCHIATRY, Issue 11 2009) - Stephanie Habermann

Abstract Objectives Many studies have investigated predictors of people with dementia entering 24-h care, but this is the first to consider a comprehensive range of carer and care recipient (CR) characteristics derived from a systematic review, in a longitudinal cohort study followed up for several years. Methods We interviewed 224 people with Alzheimer's disease (AD) and their carers, recruited to be representative in terms of their severity, sex and living situation as part of the LASER-AD study, and determined whether they entered 24-h care in the subsequent 4.5 years. We tested a comprehensive range of characteristics derived from a systematic review, and used Cox proportional hazard regression to determine whether they independently predicted entering 24-h care. Results The main independent predictors of shorter time to enter 24-h care were the patient being more cognitively or functionally impaired (hazard ratio (HR) = 1.09; 95% CI = 1.06-1.12 and HR = 1.04; 95% CI = 1.03-1.05, respectively), having a paid versus a family carer (HR = 2.22; 95% CI = 1.39-3.57), the carer being less educated (HR = 1.43; 95% CI = 1.12-1.83), and the carer spending fewer hours caring (HR = 1.01; 95% CI = 1.00-1.01). Conclusion As having a family carer who spent more time caring (taking into account illness severity) delayed entry to 24-h care, future research should investigate how to enable carers to provide this. Other interventions to improve patients' impairment may not only have benefits for patients' health but also allow them to remain longer at home. This financial benefit could more than offset the treatment cost. Copyright © 2009 John Wiley & Sons, Ltd. [source]
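The sickle cell abstract above reports a paired contrast between patients' target discharge pain score and the documented score at discharge (mean difference 1.58, 95% CI 0.723 to 2.44, n = 43). The sketch below reproduces that style of calculation with a paired t-interval on simulated scores; the numbers are placeholders, not the study data.

```python
# Hypothetical sketch: paired mean difference with a 95% t-interval,
# actual vs. target discharge pain scores.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
n = 43
target = rng.normal(4.2, 1.2, n)            # highest acceptable pain at discharge
actual = target + rng.normal(1.6, 2.5, n)   # documented score near discharge

diff = actual - target
mean_diff = diff.mean()
se = diff.std(ddof=1) / np.sqrt(n)          # standard error of the mean difference
lo, hi = stats.t.interval(0.95, df=n - 1, loc=mean_diff, scale=se)
print(f"mean difference = {mean_diff:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```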
Persistent apathy in Alzheimer's disease as an independent factor of rapid functional decline: the REAL longitudinal cohort study (INTERNATIONAL JOURNAL OF GERIATRIC PSYCHIATRY, Issue 4 2009) - L. Lechowski

Abstract Objective To determine the role of persistent apathy in rapid loss of autonomy in Instrumental Activities of Daily Living (IADL) in women with Alzheimer's disease (AD), taking into account the grade of cognitive decline. Methods The study was conducted on 272 women from the French REAL cohort. At inclusion, patients had a Mini-Mental State Examination (MMSE) score between 10 and 26. A rapid functional decline was defined as a yearly drop of 4 points or more on the 14-point IADL Lawton scale. Persistent apathy was defined as a frequency score equal to 3 or 4 on the Neuropsychiatric Inventory at the three consecutive 6-monthly assessments. Results 27.6% of women had rapid functional decline in 1 year and 22.1% of them had persistent apathy. A logistic regression analysis showed that, in addition to cognitive decline, persistent apathy plays a role in rapid functional decline in 1 year. For example, for a 3-point decline in MMSE in 1 year, the probability of a rapid loss in IADL is 0.45 for women with persistent apathy compared with 0.28 for those without persistent apathy. Conclusions In this study, a rapid loss in IADL score was partly explained by persistent apathy. Copyright © 2008 John Wiley & Sons, Ltd. [source]

Reaching the population with dementia drugs: what are the challenges? (INTERNATIONAL JOURNAL OF GERIATRIC PSYCHIATRY, Issue 7 2007) - Fiona E Matthews

Abstract Background Systematic evidence became available in the late 1990s on the efficacy of cholinesterase inhibitors (CHEIs) for patients with mild to moderate Alzheimer's disease (AD), and they began to be used sporadically. Since January 2001, UK-based guidelines have indicated that one of three CHEIs could be prescribed for these patients. Since then the cost of prescription in England and Wales has risen. There has been little investigation of uptake at the population level. Objective To estimate the population uptake of CHEIs in a population-based study of dementia spanning this period. Design Using data from a 10-year follow-up and a later 12-year interview of the Medical Research Council Cognitive Function and Ageing Study (MRC CFAS), a UK population-based longitudinal cohort study of people originally aged 65 years and above, we investigated who was taking CHEIs during the period 2001-2004. We sought information from respondents taking part in the study about what medication they were taking on a regular basis. Results Only 12 of the 219 individuals who received a study diagnosis of dementia were prescribed CHEIs (5%, 95% confidence interval (CI) 3-9%) in 2001/2003, and none of the 28 individuals with a study diagnosis of dementia (0%, 95% CI 0-18%) in 2004 were prescribed CHEIs. Uptake was biased towards individuals with more education and higher social class. Conclusions These data suggest that any impact on AD progression at the population level will be negligible, as prescription of CHEIs and uptake in the age group at highest risk is so limited. There is little evidence that this has changed over time. Copyright © 2006 John Wiley & Sons, Ltd. [source]
Association Between Interleukin-6 and Lower Extremity Function After Hip Fracture - The Role of Muscle Mass and Strength (JOURNAL OF AMERICAN GERIATRICS SOCIETY, Issue 6 2008) - Ram R. Miller MDCM

OBJECTIVES: To examine whether an effect on muscle mass or strength explains the association between interleukin-6 (IL-6) and lower extremity function in the year after hip fracture. DESIGN: Analysis of data from a longitudinal cohort study. SETTING: Two Baltimore-area hospitals. PARTICIPANTS: Community-dwelling women aged 65 and older admitted to one of two hospitals in Baltimore with a new, nonpathological fracture of the proximal femur between 1992 and 1995. MEASUREMENTS: At 2, 6, and 12 months postfracture, serum IL-6, appendicular lean muscle mass (aLM), and grip strength were measured, and the Lower Extremity Gain Scale (LEGS), a summary measure of performance of nine lower extremity tasks, was calculated. Generalized estimating equations were used to model the longitudinal relationship between IL-6 tertile and LEGS. Whether muscle mass or strength explained the relationship between IL-6 and LEGS was examined by adding measures of aLM, grip strength, or both into the model. RESULTS: Subjects in the lowest IL-6 group performed better on the LEGS than those in the highest tertile by 4.51 (95% confidence interval (CI)=1.50-7.52) points at 12 months postfracture. Adjusting for aLM and grip strength, this difference was 4.28 points (95% CI=1.14-7.43) and 3.81 points (95% CI=0.63-7.00), respectively. Adjusting for both aLM and grip strength, the mean difference in LEGS score was 3.88 points (95% CI=0.63-7.13). CONCLUSION: In older women, after hip fracture, reduced muscle strength, rather than reduced muscle mass, better explains the poorer recovery of lower extremity function observed with higher levels of the inflammatory marker IL-6. [source]

Holocaust Survivors in Old Age: The Jerusalem Longitudinal Study (JOURNAL OF AMERICAN GERIATRICS SOCIETY, Issue 3 2008) - Jochanan Stessman MD

OBJECTIVES: To examine the hypothesis that Holocaust exposure during young adulthood negatively affects physical aging, causing greater morbidity, faster deterioration in health parameters, and shorter survival. DESIGN: A longitudinal cohort study of the natural history of an age-homogeneous representative sample born in 1920/21 and living in Jerusalem. SETTING: Community-based home assessments. PARTICIPANTS: Four hundred fifty-eight subjects of European origin aged 70 at baseline and 77 at follow-up. MEASUREMENTS: Comprehensive assessment of physical, functional, and psychosocial domains; biographical history of concentration camp internment (Camp), exposure to Nazi occupation during World War II (Exposure), or lack thereof (Controls); and 7-year mortality data from the National Death Registry. RESULTS: Holocaust survivors of the Camp (n=93) and Exposure (n=129) groups were more likely than Controls (n=236) to be male and less educated and have less social support (P=.01), less physical activity (P=.03), greater difficulty in basic activities of daily living (P=.009), poorer self-rated health (P=.04), and greater usage of psychiatric medication (P=.008). No other differences in health parameters or physical illnesses were found. Holocaust survivors had similar rates of deterioration in health and illness parameters over the follow-up period, and 7-year mortality rates were identical. Proportional hazard models showed that being an elderly Holocaust survivor was not predictive of greater 7-year mortality. CONCLUSION: Fifty years after their Holocaust trauma, survivors still displayed significant psychosocial and functional impairment, although no evidence was found to support the hypothesis that the delayed effects of the trauma of the Holocaust negatively influence physical health, health trajectories, or mortality. [source]
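In the IL-6 abstract above, repeated LEGS measurements at 2, 6, and 12 months are modeled with generalized estimating equations to account for within-subject correlation. A bare-bones GEE sketch with an exchangeable working correlation is below; the long-format data frame is fabricated for illustration.

```python
# Hypothetical sketch: GEE with exchangeable correlation for repeated
# lower-extremity function scores (LEGS) by IL-6 tertile.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n_subj, visits = 200, [2, 6, 12]
subj = np.repeat(np.arange(n_subj), len(visits))
month = np.tile(visits, n_subj)
tertile = np.repeat(rng.integers(1, 4, n_subj), len(visits))  # IL-6 tertile 1-3
legs = 40 - 2.0 * tertile + 0.5 * month + rng.normal(0, 5, len(subj))

df = pd.DataFrame({"subj": subj, "month": month, "tertile": tertile, "legs": legs})
model = smf.gee("legs ~ C(tertile) + month", groups="subj", data=df,
                cov_struct=sm.cov_struct.Exchangeable(),
                family=sm.families.Gaussian())
print(model.fit().summary())
```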
Rates of Acute Care Admissions for Frail Older People Living with Met Versus Unmet Activity of Daily Living Needs (JOURNAL OF AMERICAN GERIATRICS SOCIETY, Issue 2 2006) - Laura P. Sands PhD

OBJECTIVES: To determine whether older people who do not have help for their activity of daily living (ADL) disabilities are at higher risk for acute care admissions and whether entry into a program that provides for these needs decreases this risk. DESIGN: A longitudinal cohort study. SETTING: Thirteen nationwide sites for the Program of All-inclusive Care for the Elderly (PACE). PACE provides comprehensive medical and long-term care to community-living older adults. PARTICIPANTS: Two thousand nine hundred forty-three PACE enrollees with one or more ADL dependencies. MEASUREMENTS: Unmet needs were defined as the absence of paid or unpaid assistance for ADL disabilities before PACE enrollment. Hospital admissions in the 6 months before PACE enrollment and acute admissions in the first 6 weeks and the 7th through 12th weeks after enrollment were determined. RESULTS: Those who lived with unmet ADL needs before enrollment were more likely to have a hospital admission before PACE enrollment (odds ratio (OR)=1.28, 95% confidence interval (CI)=1.01-1.63) and an acute admission in the first 6 weeks after enrollment (OR=1.45, 95% CI=1.00-2.09) but not after 6 weeks of receiving PACE services (OR=0.86, 95% CI=0.53-1.40). CONCLUSION: Frail older people who live without needed help for their ADL disabilities have higher rates of admissions while they are living with unmet ADL needs, but not after their needs are met. With state governments under increasing pressure to develop fiscally feasible solutions for caring for disabled older people, it is important that they be aware of the potential health consequences of older adults living without needed ADL assistance. [source]

Agreement Between Patient and Proxy Responses of Health-Related Quality of Life After Hip Fracture (JOURNAL OF AMERICAN GERIATRICS SOCIETY, Issue 7 2005) - C. Allyson Jones PT

Objectives: To examine agreement between patient and proxy respondents on health-related quality of life (HRQL) over time during the 6-month recovery after hip fracture. Design: Prospective longitudinal cohort study. Setting: A healthcare region serving Edmonton, Alberta, and the surrounding area. Participants: Two hundred forty-five patients aged 65 and older, treated for hip fracture, with Mini-Mental State Examination scores greater than 17; 245 family caregivers participated as proxy respondents. Measurements: The primary outcome was HRQL (Health Utilities Mark 2 and Mark 3). Interviews were completed within 5 days after surgery and at 1, 3, and 6 months. Agreement was evaluated using intraclass correlation coefficients (ICCs). Results: Agreement was considered moderate to excellent for HRQL. ICC values ranged from 0.50 to 0.85 (P<.001) for physically based observable dimensions of health status and from 0.32 to 0.66 (P<.01) for less-observable dimensions. Agreement improved with time. Time and the number of days between patient and proxy interviews were significant factors in accounting for patient-proxy differences. Conclusion: Although proxy and patient responses are not interchangeable, proxy responses provide an option for assessing function and health status in patients who are unable to respond on their own behalf. [source]
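The patient-proxy abstract above quantifies agreement with intraclass correlation coefficients. Below is a small sketch computing ICCs from long-format ratings with the pingouin package; the ratings are invented, not the study's HRQL data.

```python
# Hypothetical sketch: intraclass correlation for patient vs. proxy HRQL ratings.
import numpy as np
import pandas as pd
import pingouin as pg

rng = np.random.default_rng(2)
n = 245
truth = rng.normal(0.7, 0.15, n)  # latent HRQL utility per patient
df = pd.DataFrame({
    "patient_id": np.tile(np.arange(n), 2),
    "rater": np.repeat(["patient", "proxy"], n),
    "hrql": np.concatenate([truth + rng.normal(0, 0.08, n),   # self-report
                            truth + rng.normal(0, 0.10, n)]), # proxy report
})

icc = pg.intraclass_corr(data=df, targets="patient_id",
                         raters="rater", ratings="hrql")
print(icc[["Type", "ICC", "CI95%"]])
```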
Effects of Provider Practice on Functional Independence in Older Adults (JOURNAL OF AMERICAN GERIATRICS SOCIETY, Issue 8 2004) - Elizabeth A. Phelan MD

Objectives: To examine provider determinants of new-onset disability in basic activities of daily living (ADLs) in community-dwelling elderly. Design: Observational study. Setting: King County, Washington. Participants: A random sample of 800 health maintenance organization (HMO) enrollees aged 65 and older participating in a prospective longitudinal cohort study of dementia and normal aging, and their 56 primary care providers, formed the study population. Measurements: Incident ADL disability, defined as any new onset of difficulty performing any of the basic ADLs at follow-up assessments, was examined in relation to provider characteristics and practice style using logistic regression, adjusting for case-mix, patient and provider factors associated with ADL disability, and clustering by provider. Results: Neither provider experience taking care of large numbers of elderly patients nor having a certificate of added qualifications in geriatrics was associated with patient ADL disability at 2 or 4 years of follow-up (adjusted odds ratio (AOR) for experience=1.29, 95% confidence interval (CI)=0.81-2.05; AOR for added qualifications=0.72, 95% CI=0.38-1.39; results at 4 years were analogous). A practice style embodying traditional geriatric principles of care was not associated with a reduced likelihood of ADL disability over 4 years of follow-up (AOR for prescribing no high-risk medications=0.56, 95% CI=0.16-1.94; AOR for managing geriatric syndromes=0.94, 95% CI=0.40-2.19; AOR for a team care approach=1.35, 95% CI=0.66-2.75). Conclusion: Taking care of a large number of elderly patients, obtaining a certificate of added qualifications in geriatrics, and practicing with a traditional geriatric orientation do not appear to influence the development of ADL disability in elderly, community-dwelling HMO enrollees. [source]
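The provider practice abstract above adjusts for clustering of patients within providers when fitting logistic regression. One common way to do this is with cluster-robust standard errors, sketched below; the provider IDs and covariates are simulated, and for simplicity the provider attribute is assigned per patient row.

```python
# Hypothetical sketch: logistic regression with cluster-robust standard errors
# (patients clustered within primary care providers).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n, n_providers = 800, 56
df = pd.DataFrame({
    "provider": rng.integers(0, n_providers, n),
    "geri_cert": rng.binomial(1, 0.3, n),   # provider holds geriatrics qualification
    "age": rng.normal(75, 6, n),
})
df["adl_disability"] = rng.binomial(1, 0.2, n)  # incident ADL disability at follow-up

fit = smf.logit("adl_disability ~ geri_cert + age", data=df).fit(
    disp=0, cov_type="cluster", cov_kwds={"groups": df["provider"]})
print(np.exp(fit.params))      # adjusted odds ratios
print(np.exp(fit.conf_int()))  # cluster-robust 95% CIs
```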
Lung function, insulin resistance and incidence of cardiovascular disease: a longitudinal cohort study (JOURNAL OF INTERNAL MEDICINE, Issue 5 2003) - G. Engström

Abstract. Engström G, Hedblad B, Nilsson P, Wollmer P, Berglund G, Janzon L (University of Lund, Malmö University Hospital, Malmö, Sweden). Lung function, insulin resistance and incidence of cardiovascular disease: a longitudinal cohort study. J Intern Med 2003; 253: 574-581. Objectives. To explore whether a reduced lung function is a risk factor for developing diabetes and insulin resistance (IR), and whether such a relationship contributes to the largely unexplained association between lung function and incidence of cardiovascular disease (CVD). Design. Forced vital capacity (FVC) was assessed at baseline. Incidence of diabetes and IR [according to the homeostasis model assessment (HOMA) model] was assessed in a follow-up examination after 13.9 ± 2.6 and 9.4 ± 3.6 years for men and women, respectively. After the follow-up examination, incidence of CVD (stroke, myocardial infarction or cardiovascular death) was monitored over 7 years. Setting. Population-based cohort study. Subjects. Initially nondiabetic men (n = 1436, mean age 44.6 years) and women (n = 896, mean age 49.8 years). Results. Prevalence of IR at the follow-up examination was 34, 26, 21 and 21%, respectively, for men in the first (lowest), second, third and fourth quartile of baseline FVC (P for trend <0.0001). The corresponding values for women were 30, 29, 25 and 17%, respectively (P for trend <0.001). Adjusted for potential confounders, the odds ratio (OR) for IR (per 10% increase in FVC) was 0.91 (CI: 0.84-0.99) for men and 0.89 (CI: 0.80-0.98) for women. FVC was similarly significantly associated with the incidence of diabetes (OR = 0.90, CI: 0.81-1.00), adjusted for sex and other confounders. The incidence of CVD after the follow-up examination was significantly increased only amongst subjects with low FVC who had developed IR (RR = 1.7, CI: 1.02-2.7). Conclusion. Subjects with a moderately reduced FVC have an increased risk of developing IR and diabetes. This relationship seems to contribute to the largely unexplained association between reduced lung function and incidence of CVD. [source]

Evaluation of hepatitis C antibody testing in saliva specimens collected by two different systems in comparison with HCV antibody and HCV RNA in serum (JOURNAL OF MEDICAL VIROLOGY, Issue 1 2001) - G.J.J. van Doornum

Two different ELISA assays, the Ortho HCV 3.0 ELISA (Ortho Diagnostics Systems) and the Mono-Lisa anti-HCV Plus (Sanofi Diagnostics Pasteur), were evaluated for the detection of hepatitis C virus (HCV) antibody in saliva samples. Specimens were collected from 152 individuals who participated in a longitudinal cohort study on HIV infection and who used illicit drugs. Saliva specimens were collected using two different systems: Salivette (Sarstedt) and Omni-Sal (Saliva Diagnostic Systems). Saliva specimens were tested following modified protocols by both ELISAs, and the results were compared with serum specimens that were tested according to the instructions of the manufacturer. Serum samples of 102 (67%) participants were positive by both assays, and 50 persons were negative for HCV antibody. A total of 99 of the 102 serum specimens were confirmed as positive using Ortho Riba HCV 3.0 (Ortho Diagnostics System) and Deciscan HCV (Sanofi Diagnostics Pasteur), and 3 yielded discrepant results. As no cut-off level is known for testing saliva samples by ELISA, 3 different levels were chosen: mean (M) + 1 standard deviation (SD), M + 2 SD, and M + 3 SD of the optical densities of saliva tests of the 50 HCV serum antibody negative persons. At levels of M + 1 SD and M + 2 SD, the Salivette/Mono-Lisa combination gave the greatest proportion of HCV antibody positive saliva specimens obtained from the 102 HCV serum antibody positive participants: 88% and 79%, respectively. Differences between the various collection systems and assay combinations were not statistically significant. In 76 of the 102 persons with HCV antibodies in serum, HCV RNA was detected in serum. Salivary presence of HCV RNA, however, could not be demonstrated. The results show that the assays compared are unsuitable for diagnostic use, but the sensitivities of the assays are acceptable for use in epidemiological studies. J. Med. Virol. 64:13-20, 2001. [source]
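The saliva ELISA abstract above defines positivity cutoffs as the mean plus 1, 2, or 3 SDs of optical densities from seronegative controls, then reads off sensitivity among confirmed seropositives. That computation is simple enough to sketch directly; the optical density values below are simulated, not the study's measurements.

```python
# Hypothetical sketch: ELISA cutoffs at mean + k*SD of seronegative controls,
# and the resulting sensitivity among confirmed seropositives.
import numpy as np

rng = np.random.default_rng(11)
od_neg = rng.normal(0.10, 0.03, 50)   # optical densities, HCV-seronegative persons
od_pos = rng.normal(0.60, 0.25, 102)  # optical densities, HCV-seropositive persons

m, sd = od_neg.mean(), od_neg.std(ddof=1)
for k in (1, 2, 3):
    cutoff = m + k * sd
    sensitivity = (od_pos > cutoff).mean()
    print(f"M + {k} SD: cutoff={cutoff:.3f}, sensitivity={sensitivity:.1%}")
```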
Twelve-month outcomes and predictors of very stable INR control in prevalent warfarin users (JOURNAL OF THROMBOSIS AND HAEMOSTASIS, Issue 4 2010) - D. M. Witt

Summary. Background: For patients on warfarin therapy, an international normalized ratio (INR) recall interval not exceeding 4 weeks has traditionally been recommended. For patients whose INR values are nearly always therapeutic, less frequent INR monitoring may be feasible. Objective: To identify patients with stable INRs (INR values exclusively within the INR range) and comparator patients (at least one INR outside the INR range), compare occurrences of thromboembolism, bleeding and death between groups, and identify independent predictors of stable INR control. Methods: The study was a retrospective, longitudinal cohort study using data extracted from electronic databases. Patient characteristics and risk factors were entered into multivariate logistic regression models to identify variables that independently predict stable INR status. Results: There were 533 stable and 2555 comparator patients. Bleeding and thromboembolic complications were significantly lower in stable vs. comparator patients (2.1% vs. 4.1% and 0.2% vs. 1.3%, respectively; P < 0.05). Independent predictors of stable INR control were age >70 years, male gender and the absence of heart failure. Stable patients were significantly less likely to have a target INR ≥3.0 or chronic diseases. Conclusion: A group of patients with exclusively therapeutic INR values over 12 months is identifiable. In general, these patients are older, have a target INR <3.0, and do not have heart failure and/or other chronic diseases. Our findings suggest that many patients whose INR values remain within the therapeutic range over time could be safely treated with INR recall intervals >4 weeks. [source]

Remote Sensing and Malaria Risk for Military Personnel in Africa (JOURNAL OF TRAVEL MEDICINE, Issue 4 2008) - Vanessa Machault MSc

Background Nonimmune travelers in malaria-endemic areas are exposed to transmission and may experience clinical malaria attacks during or after their travel despite using antivectorial devices or chemoprophylaxis. Environment plays an essential role in the epidemiology of this disease. Remotely sensed environmental information had not yet been tested as an indicator of malaria risk among nonimmune travelers. Methods A total of 1,189 personnel from 10 French military companies traveling for a short-duration mission (about 4 months) in sub-Saharan Africa from February 2004 to February 2006 were enrolled in a prospective longitudinal cohort study. The incidence rate of clinical malaria attacks occurring during or after the mission was analyzed according to individual characteristics, compliance with antimalaria prophylactic measures, and environmental information obtained from earth observation satellites for all the locations visited during the missions. Results Age, lack of compliance with chemoprophylaxis, and staying in areas with an average Normalized Difference Vegetation Index higher than 0.35 were risk factors for clinical malaria. Conclusions Remotely sensed environmental data can provide important planning information on the likely level of malaria risk among nonimmune travelers who could be briefly exposed to malaria transmission, and could be used to standardize for the risk of malaria transmission when evaluating the efficacy of antimalaria prophylactic measures. [source]
Patterns of breastfeeding in a UK longitudinal cohort study (MATERNAL & CHILD NUTRITION, Issue 1 2007) - David Pontin

Abstract Although exclusive breastfeeding for the first 6 months of infant life is recommended in the UK, there is little information on the extent of exclusive breastfeeding. This study has taken the 1996 and 2003 World Health Organization (WHO) definitions of breastfeeding and investigated breastfeeding rates in the first 6 months of life in infants born to mothers enrolled in a longitudinal, representative, population-based cohort study, the Avon Longitudinal Study of Parents and Children (ALSPAC). Information about breastfeeding and introduction of solids was available for 11,490 infants at 6 months of age (81% of live births). Exclusive breastfeeding declined steadily from 54.8% in the first month to 31% in the third, and fell to 9.6% in the fourth month, mainly due to the introduction of solids to the infants. In the first 2 months, complementary feeding (breastmilk and solid/semi-solid foods with any liquid including non-human milk) was used in combination, and declined from 22% in the first month to 16.8% in the second due to a switch to exclusive commercial infant formula feeding. Replacement feeding (exclusive commercial infant formula, or formula combined with any liquid or solid/semi-solid food but excluding breastmilk) increased steadily from 21.9% in the first month to 67.1% by the seventh. This obscured the change from exclusive commercial infant formula feeding to commercial infant formula feeding plus solids/semi-solids, a change which started in the third month and was complete by the fifth. Using categories in the 1996 and 2003 WHO definitions, such as complementary feeding and replacement feeding, presented difficulties for an analysis of the extent of breastfeeding in this population. [source]
The risk of lower urinary tract symptoms five years after the first delivery (NEUROUROLOGY AND URODYNAMICS, Issue 1 2002) - Lars Viktrup

Aim of the study To estimate the prevalence and 5-year incidence of lower urinary tract symptoms (LUTS) after the first delivery and to evaluate the impact of pregnancy per se and delivery per se on long-lasting symptoms. Materials and methods A longitudinal cohort study of 305 primiparae questioned a few days, 3 months, and 5 years after their delivery. The questionnaire used was tested and validated, and the questions were formulated according to the definitions of the International Continence Society (ICS). Maternal, obstetric, and neonatal data concerning every delivery and objective data concerning surgeries during the observation period were obtained from the records. From the sample of 278 women (91%) who responded 5 years after their first delivery, three subpopulations were defined: 1) women without initial LUTS before or during the first pregnancy or during the puerperal period, 2) women with onset of LUTS during the first pregnancy, and 3) women with onset of LUTS during the first puerperium. The risk of LUTS 5 years after the first delivery was examined using bivariate analyses. The obstetric variables in the bivariate tests with a significant association with long-lasting urinary incontinence were entered into a multivariate logistic regression. Results The prevalence of stress and urge incontinence 5 years after the first delivery was 30% and 15%, respectively, whereas the 5-year incidence was 19% and 11%, respectively. The prevalence of urgency, diurnal frequency, and nocturia 5 years after the first delivery was 18%, 24%, and 2%, respectively, whereas the 5-year incidence was 15%, 20%, and 0.5%, respectively. The prevalence of all LUTS except nocturia increased significantly during the 5 years of observation. The risk of long-lasting stress and urge incontinence was related to the onset and duration of the symptom after the first pregnancy and delivery in a dose-response-like manner. Vacuum extraction at the first delivery was used significantly more often in the group of women with onset of stress incontinence during the first puerperium, whereas an episiotomy at the first delivery was performed significantly more often in the group of women with onset of stress incontinence in the 5 years of observation. The prevalence of urgency and diurnal frequency 5 years after the first delivery was not increased in women with symptom onset during the first pregnancy or puerperium compared with those without such symptoms. The frequency of nocturia 5 years after the first delivery was too low for statistical analysis. Conclusion The first pregnancy and delivery may result in stress and urge incontinence 5 years later. Women with stress and urge incontinence 3 months after the first delivery have a very high risk of long-lasting symptoms. An episiotomy or a vacuum extraction at the first delivery seems to increase the risk. Subsequent childbearing or surgery appears to make no significant contribution. Long-lasting urgency, diurnal frequency, or nocturia cannot be predicted from onset during the first pregnancy or puerperium. Neurourol. Urodynam. 21:2-29, 2002. [source]

Prenatal diagnosis of congenital malformations and parental psychological distress - a prospective longitudinal cohort study (PRENATAL DIAGNOSIS, Issue 11 2006) - H. Skari

Objective To test whether postnatal psychological distress in parents of babies with congenital malformations is reduced by prenatal diagnosis. Methods A prospective observational longitudinal cohort study was conducted at two Norwegian hospitals. We included 293 parents of babies with congenital malformations (prenatal detection rate: 36.5%) referred for neonatal surgery and 249 parents of healthy babies (comparison group). Parental psychological responses were assessed on three postnatal occasions by psychometric instruments (GHQ-28, STAI-X1, and IES). Results Significantly increased psychological distress (GHQ-28) was reported by parents who received prenatal diagnosis as compared to postnatal diagnosis: acutely, 28.9 versus 24.4, P = 0.006 (comparison group: 19.6); at 6 weeks, 26.8 versus 21.5, P < 0.001 (comparison group: 17.7); and at 6 months, 22.6 versus 18.7, P = 0.015 (comparison group: 16.6). Mothers consistently reported higher levels of distress than fathers. Multiple linear regression analysis showed that prenatal diagnosis and being a mother significantly predicted severity of acute psychological distress. At 6 weeks and 6 months, mortality and associated anomalies were significant independent predictors of psychological distress. Conclusion Controlling for other covariates, we found that prenatal diagnosis of congenital malformations was a significant independent predictor of acute parental psychological distress after birth. Copyright © 2006 John Wiley & Sons, Ltd. [source]
Coffee consumption and risk of rheumatoid arthritis (ARTHRITIS & RHEUMATISM, Issue 11 2003) - Elizabeth W. Karlson

Objective Recent reports have suggested an association between consumption of coffee or decaffeinated coffee and the risk of rheumatoid arthritis (RA), although data are sparse and somewhat inconsistent. Furthermore, existing studies measured dietary exposures and potential confounders only at baseline and did not consider possible changes in diet or lifestyle over the followup period. We studied whether coffee, decaffeinated coffee, total coffee, tea, or overall caffeine consumption was associated with the risk of RA, using the Nurses' Health Study, a longitudinal cohort study of 121,701 women. Methods Information on beverage consumption was assessed with a food frequency questionnaire (FFQ) that was completed every 4 years, from baseline in 1980 through 1998. Among the 83,124 women who completed the FFQ at baseline, the diagnosis of incident RA (between 1980 and 2000) was confirmed in 480 women by a connective tissue disease screening questionnaire and medical record review for American College of Rheumatology criteria. Relationships between intake of various beverages and the risk of RA were assessed in age-adjusted models and in multivariate Cox proportional hazards models including the cumulative average intake of each beverage during the followup period, adjusted for numerous potential confounders. In addition, for direct comparisons with prior reports, multivariate analyses were repeated using only baseline beverage information. Results We did not find a significant association between decaffeinated coffee consumption of ≥4 cups/day (compared with no decaffeinated coffee consumption) and subsequent risk of incident RA, in either an adjusted multivariate model (relative risk [RR] 1.1, 95% confidence interval [95% CI] 0.5-2.2) or a multivariate model using only baseline reports of decaffeinated coffee consumption (RR 1.0, 95% CI 0.6-1.7). Similarly, there was no relationship between cumulative caffeinated coffee consumption and RA risk (RR 1.1, 95% CI 0.8-1.6 for ≥4 cups per day versus none) or between tea consumption and RA risk (RR 1.1, 95% CI 0.7-1.8 for >3 cups/day versus none). Total coffee and total caffeine consumption were also not associated with the risk of RA. Conclusion In this large, prospective study, we find little evidence of an association between coffee, decaffeinated coffee, or tea consumption and the risk of RA among women. [source]
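The coffee and RA abstract above uses Cox models with cumulative average intake updated every four years, i.e. a time-varying exposure. lifelines' CoxTimeVaryingFitter handles data in start/stop format, sketched below on fabricated records; the variable names and effect sizes are invented.

```python
# Hypothetical sketch: Cox model with a time-varying cumulative-average exposure
# (start/stop format), in the spirit of the coffee-and-RA analysis above.
import numpy as np
import pandas as pd
from lifelines import CoxTimeVaryingFitter

rng = np.random.default_rng(9)
rows = []
for pid in range(500):
    cups = []
    event_time = rng.exponential(40)             # simulated time to RA (years)
    for start in range(0, 20, 4):                # 4-year questionnaire cycles
        cups.append(rng.poisson(2))              # reported cups/day this cycle
        cum = float(np.mean(cups))               # cumulative average intake
        stop = min(start + 4, event_time)
        if stop <= start:
            break
        rows.append({"id": pid, "start": start, "stop": stop,
                     "cum_coffee": cum, "ra": int(event_time <= start + 4)})
        if event_time <= start + 4:
            break

df = pd.DataFrame(rows)
ctv = CoxTimeVaryingFitter().fit(df, id_col="id", event_col="ra",
                                 start_col="start", stop_col="stop")
ctv.print_summary()
```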
Patterns and costs of treatment for heroin dependence over 12 months: findings from the Australian Treatment Outcome Study (AUSTRALIAN AND NEW ZEALAND JOURNAL OF PUBLIC HEALTH, Issue 4 2006) - Marian Shanahan

Objective: To determine patterns and costs of treatment for heroin dependence over a 12-month period among a cohort of heroin users seeking treatment. Methods: The design was a longitudinal cohort study of heroin users seeking treatment who participated in the Australian Treatment Outcome Study (ATOS), which was conducted in Sydney, Melbourne and Adelaide, Australia. Treatment for heroin dependence, for those who were followed up at 12 months, was recorded and costed. Unit costs, obtained from secondary sources, were used to estimate the cost of treatment. This study does not include wider societal costs and only includes personal costs as they pertain to treatment. Results: A follow-up rate of 81% at 12 months was achieved, resulting in data for 596 participants. Participants spent an average of 188 days in treatment over 2.7 episodes. Sixty-nine per cent of the sample reported at least one episode of treatment following their index treatment. There was a noticeable trend for subjects who received maintenance or residential rehabilitation as their index treatment to return to the same form of treatment for subsequent episodes. In contrast, those who received detoxification as index treatment accessed a wider variety of treatment types over the follow-up period. The cost of treatment over the 12-month follow-up totalled $3,901,416, with a mean of $6,517 per person. Conclusions and Implications: This study demonstrates that individuals seeking treatment have multiple treatment episodes throughout a 12-month period, with a tendency to return to the same form of treatment. This study also demonstrates that it is feasible and affordable to provide ongoing treatment for a group of heroin users seeking treatment. [source]

Bayesian Inference for Smoking Cessation with a Latent Cure State (BIOMETRICS, Issue 3 2009) - Sheng Luo

Summary We present a Bayesian approach to modeling dynamic smoking addiction behavior processes when cure is not directly observed due to censoring. Subject-specific probabilities model the stochastic transitions among three behavioral states: smoking, transient quitting, and permanent quitting (absorbing state). A multivariate normal distribution for random effects is used to account for the potential correlation among the subject-specific transition probabilities. Inference is conducted using a Bayesian framework via Markov chain Monte Carlo simulation. This framework provides various measures of subject-specific predictions, which are useful for policy-making, intervention development, and evaluation. Simulations are used to validate our Bayesian methodology and assess its frequentist properties. Our methods are motivated by, and applied to, the Alpha-Tocopherol, Beta-Carotene Lung Cancer Prevention study, a large (29,133 individuals) longitudinal cohort study of smokers from Finland. [source]

Occupational status and social adjustment six months after hospitalization early in the course of bipolar disorder: a prospective study (BIPOLAR DISORDERS, Issue 1 2010) - Faith Dickerson

Dickerson F, Origoni A, Stallings C, Khushalani S, Dickinson D, Medoff D. Occupational status and social adjustment six months after hospitalization early in the course of bipolar disorder: a prospective study. Bipolar Disord 2010: 12: 10-20. Objectives: Bipolar disorder is often accompanied by poor functional outcomes, the determinants of which are not fully understood. We assessed patients with bipolar disorder undergoing a hospital admission early in the illness course and identified predictors of occupational status, overall social adjustment, and work adjustment six months later. Methods: This was a prospective longitudinal cohort study. During hospitalization patients were evaluated with a cognitive battery; symptoms, occupational history, and other clinical factors were also assessed. At six-month follow-up, patients' symptom remission status was assessed; they were also evaluated as to their occupational status, overall social adjustment, and work adjustment. Multivariate analyses were used to identify predictors of these outcomes. Results: Among the 52 participants, the average rating of overall social adjustment at follow-up was between mild and moderate maladjustment. While 51 had a history of working full time, only 28 (54%) worked full time at follow-up. A total of 24 (46%) had symptoms that met criteria for a full depression or mania syndrome. In multivariate analyses, full-time occupational status at follow-up was predicted by the absence of baseline substance abuse. Better overall social adjustment was predicted by better performance on cognitive tasks of processing speed and by symptom remission; the latter variable also predicted work adjustment. Conclusions: Persons with bipolar disorder have limited occupational recovery and overall social adjustment six months after a hospital admission early in the illness course. Predictors vary among outcomes; performance on tasks of processing speed and the extent of symptom remission are independently associated with functional outcomes.
[source] |
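The Biometrics abstract above fits subject-specific transition probabilities with random effects by Markov chain Monte Carlo. A heavily simplified, hypothetical sketch of that flavor of model is given below: a random-effects probability of permanent quitting, sampled with PyMC. It is a toy illustration under invented data, not the paper's three-state multivariate model.

```python
# Hypothetical toy sketch: Bayesian random-effects model for a per-subject
# probability of permanent quitting, sampled by MCMC with PyMC.
import numpy as np
import pymc as pm

rng = np.random.default_rng(13)
n_subj = 120
attempts = rng.integers(1, 6, n_subj)                 # quit attempts per subject
true_p = 1 / (1 + np.exp(-rng.normal(-1.0, 0.8, n_subj)))
quits = rng.binomial(attempts, true_p)                # attempts ending in permanent quit

with pm.Model():
    mu = pm.Normal("mu", 0.0, 1.5)                    # population log-odds of quitting
    sigma = pm.HalfNormal("sigma", 1.0)               # between-subject heterogeneity
    z = pm.Normal("z", 0.0, 1.0, shape=n_subj)        # non-centered random effects
    p = pm.Deterministic("p", pm.math.invlogit(mu + sigma * z))
    pm.Binomial("y", n=attempts, p=p, observed=quits)
    idata = pm.sample(1000, tune=1000, chains=2, random_seed=13, progressbar=False)

print(idata.posterior["mu"].mean().item())            # posterior mean log-odds
```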