Risk Increases (risk + increase)

Selected Abstracts


Methodological issues in the epidemiological study of the teratogenicity of drugs

CONGENITAL ANOMALIES, Issue 2 2005
Bengt A.J. Källén
ABSTRACT The review presented here discusses and exemplifies problems in epidemiological studies of drug teratogenesis according to methodology: case-control studies, cohort studies, or total population studies. Sources of errors and the possibility of confounding are underlined. The review stresses the caution with which conclusions have to be drawn when exposure data are retrospective or other possible bias exists. It also stresses the problem of multiple testing that is usually present in such studies. It is therefore difficult to draw any firm conclusion from single studies and still more difficult to draw conclusions on causality. As randomized studies are in most cases out of the question, one has to rely on the type of studies which can be made, but the interpretation of the results should be cautious. The ideal study, next to a randomized one, is a large prospective study with detailed exposure information and detailed and unbiased outcome data. Even so, such a study can mainly be used for identifying possible associations which have to be verified or rejected in new studies. Nearly every finding of a risk increase, if not extremely strong, should only be regarded as a tentative signal to be tested in independent studies. [source]
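
To make the multiple-testing caution concrete, the following minimal Python sketch (my illustration, not taken from the review; it assumes independent tests at a nominal 5% level) shows how quickly the chance of at least one spurious "risk increase" grows with the number of comparisons screened.

```python
# Minimal sketch (not from the review): with k independent outcome/exposure comparisons
# each tested at a nominal 5% level, the chance of at least one spurious "risk increase"
# under the null is 1 - (1 - alpha)**k.
alpha = 0.05                      # assumed per-comparison significance level
for k in (1, 5, 20, 50, 100):     # number of comparisons screened
    p_any_false_positive = 1 - (1 - alpha) ** k
    print(f"{k:>3} comparisons -> P(>=1 spurious 'signal') = {p_any_false_positive:.2f}")
```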


The Effect of Anemia on Mortality in Indigent Patients With Mild-to-Moderate Chronic Heart Failure

CONGESTIVE HEART FAILURE, Issue 2 2006
Kathy Hebert MD
Anemia has been described as an independent predictor of death in patients with chronic heart failure. Little is known, however, about the significance of anemia in heart failure patients with severely depressed socioeconomic backgrounds who receive comprehensive care in a heart failure management program. The impact of anemia on mortality was investigated in 410 indigent chronic heart failure patients, the majority of whom were in New York Heart Association functional class I-III and were treated with angiotensin-converting enzyme inhibitors or angiotensin receptor blockers and β-blockers at maximally tolerated doses. Anemia was present in 28% of patients. In an adjusted Cox analysis, anemia was strongly associated with mortality, but only in men: hazard ratio, 2.54; 95% confidence interval, 1.31-4.93; p=0.006. The investigators conclude that anemia in this population is common and that, for men, the relative risk increase associated with anemia is high. [source]


Identifying High Risk Groups and Quantifying Absolute Risk of Cancer After Kidney Transplantation: A Cohort Study of 15 183 Recipients

AMERICAN JOURNAL OF TRANSPLANTATION, Issue 9 2007
A. C. Webster
Transplant recipients have increased cancer risk, but data on risk variation across different patient groups are sparse. Rates and standardized rate ratios (SRR) of cancer (all sites, excluding nonmelanocytic skin and lip cancer) compared to the general population were calculated, using Australia and New Zealand Dialysis and Transplant Registry data. Within the transplant population, risk factors were identified (hazard ratios: HR; 95% CI) and absolute risk estimated for recipient groups. A total of 1642 (10.8%) of 15 183 recipients developed cancer. Risk was inversely related to age (SRR 15-30 in children, 2 if >65 years). Females aged 25-29 had rates equivalent to women aged 55-59 from the general population. Age trend for lymphoma, colorectal and breast risk was similar; melanoma showed less variability across ages, prostate showed no risk increase. Within the transplanted population, risk was affected by age differently for each sex (p = 0.007), elevated by prior malignancy (HR 1.40; 1.03-1.89), white race (HR 1.36; 1.12-1.89), but reduced by diabetic end-stage kidney disease (ESKD) (HR 0.67; 0.50-0.89). Cancer rates in kidney recipients are similar to nontransplanted people 20-30 years older, but absolute risk differs across patient groups. Men aged 45-54 surviving 10 years have cancer risks varying from 1 in 13 (non-white, no prior cancer, diabetic ESKD) to 1 in 5 (white, prior cancer, other ESKD). [source]
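
As a rough illustration of how hazard ratios like those above translate into group-specific absolute risks, the sketch below applies a proportional-hazards relation to an assumed baseline 10-year risk. The baseline value is hypothetical; only the hazard-ratio point estimates come from the abstract.

```python
# Hedged sketch: under a proportional-hazards assumption, S_group(t) = S_baseline(t) ** HR,
# so risk_group = 1 - (1 - risk_baseline) ** HR. The baseline 10-year risk below is an
# illustrative assumption, not a figure from the study.
BASELINE_10YR_RISK = 0.10   # hypothetical baseline cumulative cancer risk over 10 years

HR = {                       # point estimates quoted in the abstract
    "prior malignancy": 1.40,
    "white race": 1.36,
    "diabetic ESKD": 0.67,
}

def absolute_risk(baseline_risk, hazard_ratios):
    hr_product = 1.0
    for hr in hazard_ratios:
        hr_product *= hr
    return 1.0 - (1.0 - baseline_risk) ** hr_product

# Lower-risk profile: non-white, no prior cancer, diabetic ESKD (only the ESKD HR applies).
low = absolute_risk(BASELINE_10YR_RISK, [HR["diabetic ESKD"]])
# Higher-risk profile: white, prior cancer, non-diabetic ESKD (reference ESKD category).
high = absolute_risk(BASELINE_10YR_RISK, [HR["prior malignancy"], HR["white race"]])
print(f"lower-risk profile: {low:.1%}   higher-risk profile: {high:.1%}")
```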


Congenital malformations in infants born after in vitro fertilization in Sweden

BIRTH DEFECTS RESEARCH, Issue 3 2010
Bengt Källén
Abstract BACKGROUND: The risk for congenital malformations is increased in infants born after in vitro fertilization (IVF). Some specific malformations appear to be more affected than others. METHODS: The presence of congenital malformations in 15,570 infants born after IVF with an embryo transfer between April 1, 2001, and the end of 2006 was compared with all infants born in Sweden during 2001 to 2007 (n = 689,157). Risk estimates were made after adjusting for year of birth, maternal age, parity, smoking, and body mass index. The risks of specific malformations were compared with data from a previous study (1982 to March 31, 2001) of 16,280 infants born after IVF. Different IVF methods were compared with respect to malformation risk. RESULTS: Increased risks of a similar magnitude were found for most cardiovascular malformations and limb reduction defects for both study periods. For neural tube defects, cardiac septal defects, and esophageal atresia, there was still an increased risk, but it was lower during the second than during the first period. For small bowel atresia, anal atresia, and hypospadias, the risk increase observed during the first study period had disappeared during the second period. An increased risk was seen for some syndromes that have been associated with imprinting errors. No difference in malformation risk according to IVF method was apparent. CONCLUSIONS: A slightly increased risk for congenital malformations after IVF persists. A decreasing risk is seen for some specific malformations, either true or the result of multiple testing. Birth Defects Research (Part A), 2010. © 2010 Wiley-Liss, Inc. [source]


Plesiomorphic Escape Decisions in Cryptic Horned Lizards (Phrynosoma) Having Highly Derived Antipredatory Defenses

ETHOLOGY, Issue 10 2010
William E. Cooper Jr
Escape theory predicts that the probability of fleeing and flight initiation distance (predator-prey distance when escape begins) increase as predation risk increases and decrease as escape cost increases. These factors may apply even to highly cryptic species that sometimes must flee. Horned lizards (Phrynosoma) rely on crypsis because of coloration, flattened body form, and lateral fringe scales that reduce detectability. At close range they sometimes squirt blood-containing noxious substances and defend themselves with cranial spines. These antipredatory traits are highly derived, but little is known about the escape behavior of horned lizards. Of particular interest is whether their escape decisions bear the same relationships to predation risk and opportunity costs of escaping as in typical prey lacking such derived defenses. We investigated the effects of repeated attack and direction of predator turning on P. cornutum and of opportunity cost of fleeing during a social encounter in P. modestum. Flight initiation distance was greater for the second of two successive approaches and probability of fleeing decreased as distance between the turning predator and prey increased, but was greater when the predator turned toward than away from a lizard. Flight initiation distance was shorter during social encounters than when lizards were solitary. For all variables studied, risk assessment by horned lizards conforms to the predictions of escape theory and is similar to that in other prey despite their specialized defenses. Our findings show that these specialized, derived defenses coexist with a taxonomically widespread, plesiomorphic method of making escape decisions. They suggest that escape theory based on costs and benefits, as intended, applies very generally, even to highly cryptic prey that have specialized defense mechanisms. [source]


How starvation risk in Redshanks Tringa totanus results in predation mortality from Sparrowhawks Accipiter nisus

IBIS, Issue 2008
WILL CRESSWELL
Redshanks Tringa totanus that are preyed upon by Sparrowhawks Accipiter nisus at the Tyninghame Estuary, Firth of Forth, Scotland, provide an example of how the starvation-predation risk trade-off results in mortality. In this trade-off, animals cannot always optimize anti-predation behaviour because anti-predation behaviours, such as avoiding predators, are usually incompatible with foraging behaviours that might maximize intake rates. Therefore, as animals compensate for starvation risk, predation risk increases. Sparrowhawks are the main direct cause of death in Redshanks at Tyninghame. Sparrowhawk attack rate is determined by Redshank vulnerability, and vulnerability decreases as group size and distance to cover increase, and probably as spacing decreases. But reduction of predation vulnerability reduces feeding rate because areas away from cover are less food-profitable and grouping results in increased interference competition. Increased starvation risk in midwinter means Redshanks are forced to feed on highly profitable prey, Orchestia amphipods, the behaviour of which means that Redshanks are forced to feed vulnerably, in widely spaced groups, close to predator-concealing cover. Therefore, it is the constraints that limit the ability of Redshanks to feed in large, dense flocks away from cover that ultimately lead to mortality. We investigate this hypothesis further by testing the prediction that mortality can be predicted directly by cold weather and population density. We demonstrate that the overall number and proportion of Redshanks killed increase in cold months, when controlling for population size. We also demonstrate that the proportion of Redshanks killed increases when there are fewer Redshanks present, because the success rate of hunting Sparrowhawks increases, probably because effective management of predation risk through flocking is constrained by a low population size. Redshanks therefore provide an example of how mortality caused directly by predation arises from starvation risk and other constraints that prevent animals from optimizing anti-predation behaviour. [source]


Method for moderation: measuring lifetime risk of alcohol-attributable mortality as a basis for drinking guidelines

INTERNATIONAL JOURNAL OF METHODS IN PSYCHIATRIC RESEARCH, Issue 3 2008
Jürgen Rehm
Abstract The objective of this paper was to determine separately the lifetime risk of drinking alcohol for chronic disease and acute injury outcomes as a basis for setting general population drinking guidelines for Australia. Relative risk data for different levels of average consumption of alcohol were combined with age, sex, and disease-specific risks of dying from an alcohol-attributable chronic disease. For injury, combinations of the number of drinks per occasion and frequency of drinking occasions were combined to model lifetime risk of death for different drinking pattern scenarios. A lifetime risk of injury death of 1 in 100 is reached for consumption levels of about three drinks daily per week for women, and three drinks five times a week for men. For chronic disease death, lifetime risk increases by about 10% with each 10-gram (one drink) increase in daily average alcohol consumption, although risks are higher for women than men, particularly at higher average consumption levels. Lifetime risks for injury and chronic disease combine to give the overall risk of alcohol-attributable mortality. In terms of guidelines, if a lifetime risk standard of 1 in 100 is set, then the implications of the analysis presented here are that both men and women should not exceed a volume of two drinks a day for chronic disease mortality, and for occasional drinking three or four drinks seem tolerable. Copyright © 2008 John Wiley & Sons, Ltd. [source]
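
The dose-response figure quoted above lends itself to a quick back-of-envelope calculation. The sketch below assumes a 1-in-100 baseline lifetime risk (an illustrative choice, not a value from the paper) and compounds the stated roughly 10% relative increase per daily drink.

```python
# Back-of-envelope sketch of the dose-response statement above (illustrative only): a
# roughly 10% relative increase in lifetime chronic-disease risk per extra 10 g (one
# drink) of daily average consumption. The baseline lifetime risk is an assumption
# placed at the 1-in-100 guideline level, not a figure from the paper.
BASELINE_LIFETIME_RISK = 0.01       # assumed lifetime risk at very low consumption
RELATIVE_INCREASE_PER_DRINK = 0.10  # ~10% per 10 g/day, as stated in the abstract

for drinks_per_day in range(0, 6):
    risk = BASELINE_LIFETIME_RISK * (1 + RELATIVE_INCREASE_PER_DRINK) ** drinks_per_day
    print(f"{drinks_per_day} drinks/day -> approx. lifetime chronic-disease risk {risk:.4f}")
```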


Biodemographic analysis of male honey bee mortality

AGING CELL, Issue 1 2005
Olav Rueppell
Summary Biodemographic studies of insects have significantly enhanced our understanding of the biology of aging. Eusocial insects have evolved to form different groups of colony members that are specialized for particular tasks and highly dependent on each other. These different groups (castes and sexes) also differ strongly in their life expectancy but relatively little is known about their mortality dynamics. In this study we present data on the age-specific flight activity and mortality of male honey bees from two different genetic lines that are exclusively dedicated to reproduction. We show that males initiating flight at a young age experience more flight events during their lifetime. No (negative) relation between the age at flight initiation and lifespan exists, as might be predicted on the basis of the antagonistic pleiotropy theory of aging. Furthermore, we fit our data to different aging models and conclude that overall a slight deceleration of the age-dependent mortality increase at advanced ages occurs. However, mortality risk increases according to the Gompertz-Makeham model when only days with flight activity (active days) are taken into account. Our interpretation of the latter is that two mortality components act on honey bee males during flight: increasing, age-dependent deaths (possibly from wear-and-tear), and age-independent deaths (possibly due to predation). The overall mortality curve is caused by the interaction of the distribution of age at foraging initiation and the mortality function during the active (flight) lifespan. [source]
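
For readers unfamiliar with the models named above, the sketch below writes out the Gompertz and Gompertz-Makeham hazard functions with made-up parameters; the constant Makeham term plays the role of the age-independent (e.g. predation) mortality component the authors invoke.

```python
import math

# Sketch of the two mortality models compared in the abstract, with invented parameter
# values (the fitted honey bee estimates are not reproduced here).
def gompertz_hazard(age, a=0.02, b=0.15):
    """Pure Gompertz hazard: mortality rises exponentially with age."""
    return a * math.exp(b * age)

def gompertz_makeham_hazard(age, a=0.02, b=0.15, c=0.05):
    """Gompertz-Makeham hazard: the exponential age-dependent term plus a constant
    age-independent ('background') term c."""
    return gompertz_hazard(age, a, b) + c

for day in (0, 5, 10, 15, 20):   # age measured in active (flight) days
    print(f"day {day:2d}: Gompertz {gompertz_hazard(day):.3f}, "
          f"Gompertz-Makeham {gompertz_makeham_hazard(day):.3f}")
```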


Stock Options and the Corporate Demand for Insurance

JOURNAL OF RISK AND INSURANCE, Issue 2 2006
Li-Ming Han
This article shows that a corporate manager compensated in stock options makes corporate decisions to maximize stock option value. Overinvestment is a consequence if risk increases with investment. Facing the choice of hedging corporate risk with forward contracts on a stock market index fund and insuring pure risks, the manager will choose the latter. Hedging with forwards reduces weight in both tails of the corporate payoff distribution and thus reduces option value. Insuring pure risks reduces the weight in the left tail where the options are out-of-the-money and increases the weight in the right tail where the options are in-the-money; the effect is an increase in the option value. Insurance reduces the overinvestment problem but no level of insurance coverage can reduce investment to that which maximizes the shareholder value. [source]
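
The tail argument above can be illustrated with a small Monte Carlo sketch (all distributions and parameter values are assumptions, not taken from the article): compressing both tails by hedging lowers the call-option value of the manager's holdings, whereas removing only left-tail pure-risk losses raises it.

```python
import random
import statistics

# Illustrative Monte Carlo sketch of the article's argument; all numbers are assumptions.
# The manager's compensation is modelled as a call option on firm value V with strike K,
# valued as E[max(V - K, 0)].
random.seed(0)
N = 100_000
K = 100.0   # assumed option strike

def option_value(firm_values):
    return statistics.fmean([max(v - K, 0.0) for v in firm_values])

# Firm value = market-driven component, occasionally hit by a pure-risk loss.
market = [random.gauss(100.0, 20.0) for _ in range(N)]
pure_loss = [30.0 if random.random() < 0.10 else 0.0 for _ in range(N)]

unmanaged = [m - l for m, l in zip(market, pure_loss)]
# Hedging with index forwards damps market variance, shrinking BOTH tails of the payoff.
hedged = [100.0 + 0.3 * (m - 100.0) - l for m, l in zip(market, pure_loss)]
# Insuring pure risks removes the loss (left tail) while leaving the upside untouched.
insured = list(market)

for label, payoffs in [("unmanaged", unmanaged), ("hedged", hedged), ("insured", insured)]:
    print(f"{label:9s} option value: {option_value(payoffs):.2f}")
```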


Desmopressin treatment in nocturia; an analysis of risk factors for hyponatremia

NEUROUROLOGY AND URODYNAMICS, Issue 2 2006
A. Rembratt
Abstract Aims To explore the incidence, severity, time course, and risk factors of clinically significant hyponatremia in desmopressin treatment for nocturia. Methods Data from three multi-center phase III trials were pooled. Hyponatremia was categorised as borderline (134-130 mmol/L) or significant (<130 mmol/L). Risk factors were explored with logistic regression and subgroup analysis was performed to explore threshold values for contra-indication. Results In total 632 patients (344 men, 288 women) were analyzed. During dose-titration, serum sodium concentration below normal range was recorded in 95 patients (15%) and 31 patients (4.9%) experienced significant hyponatremia. The risk increased with age, lower serum sodium concentration at baseline, higher basal 24-hr urine volume per bodyweight and weight gain at time of minimum serum sodium concentration. Age was the best single predictor. Elderly patients (≥65 years of age) with a baseline serum sodium concentration below normal range were at high risk (75%). Limiting treatment in the elderly with normal basal serum sodium concentration to those below 79 years and with a 24-hr urine output below 28 ml/kg would reduce the risk from 8.1% to 3.0% at the cost of 34% fulfilling the contra-indication. Conclusions The majority of nocturia patients tolerate desmopressin treatment without clinically significant hyponatremia. However, the risk increases with increasing age and decreasing baseline serum sodium concentration. Treatment of nocturia in elderly patients with desmopressin should only be undertaken together with careful monitoring of the serum sodium concentration. Patients with a baseline serum sodium concentration below normal range should not be treated. © 2005 Wiley-Liss, Inc. [source]
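
The reported trade-off (risk falling from 8.1% to 3.0% at the cost of 34% of patients becoming contra-indicated) reflects a threshold analysis. The sketch below shows, on purely synthetic data with an assumed toy risk model, how such candidate age and urine-output cut-offs can be scanned; none of the numbers reproduce the trial data.

```python
import random

# Hedged sketch, on synthetic data, of the kind of threshold analysis the abstract
# describes: scan candidate cut-offs and report the hyponatremia risk among patients
# who would still be treated versus the fraction who would become contra-indicated.
random.seed(1)

def make_patient():
    age = random.randint(65, 90)
    urine_ml_per_kg = random.uniform(10.0, 45.0)          # basal 24-hr urine volume per bodyweight
    p = 0.02 + 0.004 * (age - 65) + 0.002 * (urine_ml_per_kg - 10.0)   # assumed toy risk model
    return age, urine_ml_per_kg, random.random() < p      # True = significant hyponatremia

patients = [make_patient() for _ in range(5000)]

for age_cut, urine_cut in [(999, 999), (79, 28), (75, 25)]:   # no restriction, then two candidate rules
    treated = [p for p in patients if p[0] < age_cut and p[1] < urine_cut]
    risk = sum(p[2] for p in treated) / len(treated)
    excluded = 1 - len(treated) / len(patients)
    print(f"treat if age<{age_cut} and urine<{urine_cut} ml/kg: "
          f"risk {risk:.1%}, contra-indicated {excluded:.0%}")
```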


Optimal foraging when predation risk increases with patch resources: an analysis of pollinators and ambush predators

OIKOS, Issue 5 2010
Emily I. Jones
Pollinators and their predators share innate and learned preferences for high quality flowers. Consequently, pollinators are more likely to encounter predators when visiting the most rewarding flowers. I present a model of how different pollinator species can maximize lifetime resource gains depending on the density and distribution of predators, as well as their vulnerability to capture by predators. For pollinator species that are difficult for predators to capture, the optimal strategy is to visit the most rewarding flowers as long as predator density is low. At higher predator densities and for pollinators that are more vulnerable to predator capture, the lifetime resource gain from the most rewarding flowers declines and the optimal strategy depends on the predator distribution. In some cases, a wide range of floral rewards provides near-maximum lifetime resource gains, which may favor generalization if searching for flowers is costly. In other cases, a low flower reward level provides the maximum lifetime resource gain and so pollinators should specialize on less rewarding flowers. Thus, the model suggests that predators can have qualitatively different top-down effects on plant reproductive success depending on the pollinator species, the density of predators, and the distribution of predators across flower reward levels. [source]
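
A stripped-down version of this kind of model is sketched below; the functional forms and parameter values are my own assumptions, chosen only to reproduce the qualitative result that hard-to-capture pollinators do best on the most rewarding flowers while vulnerable ones do best specializing on less rewarding ones.

```python
# Toy version of the trade-off modelled in the paper (a simplification, not the author's
# equations): visiting flowers of reward level r yields r per visit, but ambush predators
# concentrate on rewarding flowers, so the per-visit chance of death rises with r.
# Expected lifetime gain = reward per visit x expected number of visits before death.
def lifetime_gain(reward, predator_density=0.5, vulnerability=0.1, background_risk=0.02):
    # Assumed encounter model: predators aggregate disproportionately on rich flowers.
    p_death_per_visit = background_risk + predator_density * vulnerability * reward ** 2
    expected_visits = 1.0 / p_death_per_visit       # geometric expectation
    return reward * expected_visits

rewards = [r / 10 for r in range(1, 11)]            # candidate flower reward levels 0.1 .. 1.0
for vulnerability in (0.02, 0.9):                   # hard-to-capture vs. highly vulnerable pollinator
    best = max(rewards, key=lambda r: lifetime_gain(r, vulnerability=vulnerability))
    print(f"vulnerability {vulnerability}: optimal flower reward level {best}")
```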


Original Article: Detection of p16 promoter methylation in premature rats with chronic lung disease induced by hyperoxia

PEDIATRICS INTERNATIONAL, Issue 4 2010
Xiaohong Yue
Abstract Background: The aim of the present study was to investigate p16 promoter methylation in premature rats with chronic lung disease (CLD) induced by hyperoxia. Methods: Eighty Wistar rats were randomized into the hyperoxia group (fraction of inspired oxygen [FiO2] = 900 mL/L) or the control group (FiO2 = 210 mL/L), 40 for each group. Semi-nested methylation-specific polymerase chain reaction (sn-MSP) was applied to detect p16 promoter hypermethylation in lung tissues. Additionally, p16 mRNA and protein expression was detected by reverse transcription-polymerase chain reaction (RT-PCR), western blot, and the streptavidin-biotin complex method. Results: Extended exposure to hyperoxia led to increased methylation, and the methylation level reached a peak in the period of maximum pulmonary fibrosis in the hyperoxia group, while methylation did not occur in the control group. The methylation rates on sn-MSP and nested MSP were, respectively, 52.5% and 42.5% in the hyperoxia group. There was no statistically significant difference between the two methods. The p16 mRNA and protein expression was significantly lower in those with p16 promoter hypermethylation than in those without. Conclusion: Exposure to hyperoxia may induce p16 promoter hypermethylation in lung tissues in premature rats, and methylation risk increases as exposure extends. p16 promoter methylation induced by hyperoxia may be one of the mechanisms for low p16 mRNA and protein expression. [source]


The nuchal translucency and the fetal heart: a literature review

PRENATAL DIAGNOSIS, Issue 8 2009
S. A. Clur
Abstract In this overview the current knowledge of the relationship between an increased nuchal translucency (NT) measurement and fetal heart structure and function in chromosomally normal fetuses is reviewed. Relevant pathophysiological theories behind the increased NT are discussed. Fetuses with an increased NT have an increased risk for congenital heart disease (CHD) with no particular bias for one form of CHD over another. This risk increases with increasing NT measurement. Although the NT measurement is only a modestly effective screening tool for all CHD when used alone, it may indeed be effective in identifying specific CHD "likely to benefit" from prenatal diagnosis. The combination of an increased NT, tricuspid regurgitation and an abnormal ductus venosus (DV) Doppler flow profile, is a strong marker for CHD. A fetal echocardiogram should be performed at 20 weeks' gestation in fetuses with an NT ≥ 95th percentile but < 99th percentile. When the NT measurement is ≥ 99th percentile, or when tricuspid regurgitation and/or an abnormal DV flow pattern is found along with the increased NT, an earlier echocardiogram is indicated, followed by a repeat scan at around 20 weeks' gestation. The resultant increased demand for early fetal echocardiography and sonographers with this special expertise needs to be planned and provided for. Copyright © 2009 John Wiley & Sons, Ltd. [source]


Primary infection with the Epstein-Barr virus and risk of multiple sclerosis

ANNALS OF NEUROLOGY, Issue 6 2010
Lynn I. Levin PhD
To determine whether multiple sclerosis (MS) risk increases following primary infection with the Epstein-Barr virus (EBV), we conducted a nested case-control study including 305 individuals who developed MS and 610 matched controls selected among the >8 million active-duty military personnel whose serum has been stored in the Department of Defense Serum Repository. Time of EBV infection was determined by measuring antibody titers in serial serum samples collected before MS onset among cases, and on matched dates among controls. Ten (3.3%) cases and 32 (5.2%) controls were initially EBV negative. All of the 10 EBV-negative cases became EBV positive before MS onset; in contrast, only 35.7% (n = 10) of the 28 controls with follow-up samples seroconverted (exact p value = 0.0008). We conclude that MS risk is extremely low among individuals not infected with EBV, but it increases sharply in the same individuals following EBV infection. ANN NEUROL 2010;67:824-830 [source]


Do rheumatoid arthritis and lymphoma share risk factors?: A comparison of lymphoma and cancer risks before and after diagnosis of rheumatoid arthritis

ARTHRITIS & RHEUMATISM, Issue 5 2010
Karin Hellgren
Objective Patients with rheumatoid arthritis (RA), in particular those with the most severe disease, are at increased risk of developing malignant lymphoma. Whether this increase is entirely a consequence of the RA disease and/or its treatment or is reflective of shared susceptibility to the two diseases remains unclear. We undertook this study to assess whether patients with RA are already at increased risk of lymphoma or of other cancers before the diagnosis of RA, and if the relative risk increases with time since RA diagnosis. Methods Patients with incident RA (symptom duration <1 year) (n = 6,745) registered in the Swedish Early Arthritis Registry from 1997 through 2006 were identified. For each patient, 5 general population controls were randomly matched by sex, age, marital status, and residence (n = 33,657). For all study subjects, inclusion in the nationwide Swedish Cancer Register in 1958-2006 was determined. Relative risks (RRs) (with 95% confidence intervals [95% CIs]) of lymphoma and of cancer overall, before and after diagnosis of RA, were estimated using conditional logistic regression and Cox regression, respectively. Results Before diagnosis of RA, there was no observed increase in the risk of lymphoma (RR [odds ratio] 0.67 [95% CI 0.37-1.23]) or other cancers (RR 0.78 [95% CI 0.70-0.88]). During the first 10 years following diagnosis of RA, the overall RR (hazard ratio) of lymphoma development was 1.75 (95% CI 1.04-2.96). Conclusion These findings indicate that overall, a history of cancer, including lymphoma, does not increase the risk of subsequent RA development. Shared susceptibility to RA and lymphoma may thus be of limited importance. In contrast, increased lymphoma risks were observed within the first decade following RA diagnosis. [source]


Evidence From Data Searches and Life-Table Analyses for Gender-Related Differences in Absolute Risk of Hip Fracture After Colles' or Spine Fracture: Colles' Fracture as an Early and Sensitive Marker of Skeletal Fragility in White Men

JOURNAL OF BONE AND MINERAL RESEARCH, Issue 12 2004
Patrick Haentjens
Abstract Based on data searches and life-table analyses, we determined the long-term (remaining lifetime) and short-term (10- and 5-year) absolute risks of hip fracture after sustaining a Colles' or spine fracture and searched for potential gender-related differences. In aging men, Colles' fractures carry a higher absolute risk for hip fracture than spinal fractures in contrast to women. These findings support the concept that forearm fracture is an early and sensitive marker of male skeletal fragility. Introduction: Colles' fracture occurrence has been largely ignored in public health approaches to identify target populations at risk for hip fracture. The aim of this study was to estimate the long-term and short-term absolute risks of hip fracture after sustaining a Colles' or spine fracture and to search for potential gender-related differences in the relationship between fracture history and future fracture risk. Materials and Methods: To determine the long-term (remaining lifetime) and short-term (10- and 5-year) absolute risks of hip fracture, we applied life-table methods using U.S. age- and sex-specific hip fracture incidence rates, U.S. age-specific mortality rates for white women and men, pooled hazard ratios for mortality after Colles' and spine fracture, and pooled relative risks for hip fracture after Colles' and spine fracture, estimated from cohort studies by standard meta-analytic methods. Results: Our results indicate that the estimated remaining lifetime risks are dependent on age in both genders. In women, remaining lifetime risks increase until the age of 80 years, when they start to decline because of the competing probabilities of fracture and death. The same pattern is found in men until the age of 85 years, the increment in lifetime risk being even more pronounced. As expected, the risk of sustaining a hip fracture was found to be higher in postmenopausal women with a previous spine fracture compared with those with a history of Colles' fracture. In men, on the other hand, the prospective association between fracture history and subsequent hip fracture risk seemed to be strongest for Colles' fracture. At the age of 50, for example, the remaining lifetime risk was 13% in women with a previous Colles' fracture compared with 15% in the context of a previous spine fracture and 9% among women of the general population. In men at the age of 50 years, the corresponding risk estimates were 8%, 6%, and 3%, respectively. Similar trends were observed when calculating 5- and 10-year risks. Conclusions: In aging men, Colles' fractures carry a higher absolute risk for hip fracture than spinal fractures in contrast to women. These findings support the concept that forearm fracture is an early and sensitive marker of male skeletal fragility. The gender-related differences reported in this analysis should be taken into account when designing screening and treatment strategies for prevention of hip fracture in men. [source]
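
The life-table calculation described in the Methods can be sketched in a few lines; the version below uses invented annual hazards and relative risks purely to show how the competing risk of death caps the remaining lifetime fracture risk.

```python
# Hedged sketch of the life-table logic described in the Methods, with made-up annual
# hazards standing in for the U.S. age- and sex-specific rates and meta-analytic
# relative risks (none of the real inputs are reproduced here).
def remaining_lifetime_risk(start_age, end_age, fracture_rate, death_rate, relative_risk=1.0):
    """Cumulative probability of a first hip fracture between start_age and end_age,
    treating death as a competing risk: one can only fracture while alive and
    still fracture-free."""
    alive_unfractured = 1.0
    cumulative_risk = 0.0
    for age in range(start_age, end_age):
        h_frac = fracture_rate(age) * relative_risk    # annual hip-fracture hazard
        h_death = death_rate(age)                      # annual all-cause mortality hazard
        cumulative_risk += alive_unfractured * h_frac
        alive_unfractured *= 1.0 - h_frac - h_death    # survive the year fracture-free
    return cumulative_risk

def toy_fracture_rate(age):
    return 0.0001 * 1.09 ** (age - 50)   # assumed exponential rise with age

def toy_death_rate(age):
    return 0.005 * 1.08 ** (age - 50)    # assumed exponential rise with age

for rr in (1.0, 1.5, 2.4):   # illustrative relative risks, e.g. no prior fracture vs. prior Colles'/spine fracture
    risk = remaining_lifetime_risk(50, 100, toy_fracture_rate, toy_death_rate, rr)
    print(f"RR {rr}: remaining lifetime hip-fracture risk from age 50 = {risk:.1%}")
```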