Daily Intake


Kinds of Daily Intake

  • acceptable daily intake
  • tolerable daily intake


  • Selected Abstracts


    Safety of Polyethylene Terephthalate Food Containers Evaluated by HPLC, Migration Test, and Estimated Daily Intake

    JOURNAL OF FOOD SCIENCE, Issue 6 2008
    H.-J. Park
    ABSTRACT: A comparative high-pressure liquid chromatography (HPLC) analysis of the monomers terephthalic acid (TPA), isophthalic acid (IPA), and dimethyl terephthalate (DMT) from polyethylene terephthalate (PET) food containers was conducted. Monomer linearities and sensitivities were calibrated between established and novel HPLC analyses. Safety of PET containers was evaluated with newly established detection methods for TPA, IPA, and DMT. Migration of the 3 monomers into food simulants (water, 4% acetic acid, 20% alcohol, and n-heptane) from 56 PET containers collected from open markets was monitored. Migrated monomers were not detected above the detection limit of 0.1 ppm. The corresponding estimated daily intake was calculated to confirm the safety of these publicly available PET containers and to permit comparison to the specific migration limit of the European Union. The estimated daily intake of the 3 monomers migrating from PET was 0.0384 mg/kg each. This represented only 0.6% of the European Union's specific migration limit, confirming the safety of the examined containers. [source]
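
    The final comparison above (an estimated daily intake expressed as a percentage of the EU specific migration limit) is simple arithmetic. A minimal sketch follows; the 7.5 mg/kg limit used here is the EU value for terephthalic acid and is an assumption for illustration, since the abstract does not state which limit(s) were applied.

    ```python
    # Screening comparison sketch: estimated migration level vs. an EU specific
    # migration limit (SML). The SML value below (7.5 mg/kg) is assumed for
    # illustration; only the 0.0384 mg/kg figure comes from the abstract.

    def percent_of_sml(estimated_level_mg_per_kg, sml_mg_per_kg):
        return 100.0 * estimated_level_mg_per_kg / sml_mg_per_kg

    edi = 0.0384  # mg/kg, estimated daily intake reported in the abstract
    print(f"{percent_of_sml(edi, sml_mg_per_kg=7.5):.1f}% of the SML")
    ```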


    Cover Picture – Mol. Nutr. Food Res. 12/2008

    MOLECULAR NUTRITION & FOOD RESEARCH (FORMERLY NAHRUNG/FOOD), Issue 12 2008
    Selected topics of issue 12 are: Procyanidin effects on oesophageal adenocarcinoma cells strongly depend on flavan-3-ol degree of polymerization; Daily intake of thiamine correlates with the circulating level of endothelial progenitor cells and the endothelial function in patients with type II diabetes; Fungal siderophores function as protective agents against LDL oxidation and are promising anti-atherosclerotic metabolites in functional food; Identification of four IgE-reactive proteins in raspberry (Rubus idaeus L.) [source]


    Body composition and time course changes in regional distribution of fat and lean tissue in unselected cancer patients on palliative care – Correlations with food intake, metabolism, exercise capacity, and hormones

    CANCER, Issue 10 2005
    Marita Fouladiun M.D.
    Abstract BACKGROUND Several investigations of net changes in body composition in weight-losing cancer patients have been reported, yielding different results and employing a variety of methods based on fundamentally different technology. Most of those reports were cross-sectional, whereas to the authors' knowledge there is sparse information available on longitudinal follow-up measurements in relation to other independent methods for the assessment of metabolism and performance. METHODS For the current report, the authors evaluated time course changes in body composition (dual-energy X-ray absorptiometry) with measurements of whole body and regional distribution of fat and lean tissue in relation to food and dietary intake, host metabolism (indirect calorimetry), maximum exercise capacity (walking test), and circulating hormones in cancer patients who were receiving palliative care during 4–62 months of follow-up. The entire cohort comprised 311 patients, aged 68 ± 3 years, who were diagnosed with solid gastrointestinal tumors (84 colorectal tumors, 74 pancreatic tumors, 73 upper gastrointestinal tumors, 51 liver-biliary tumors, 3 breast tumors, 5 melanomas, and 21 other tumor types). RESULTS Decreased body weight was explained by loss of body fat, preferentially from the trunk, followed by leg tissue and arm tissue, respectively. Lean tissue (fat-free mass) was lost from arm tissue, whereas trunk and leg tissue compartments increased, all concomitant with declines in serum albumin, increased systemic inflammation (C-reactive protein, erythrocyte sedimentation rate), increased serum insulin, and elevated daily caloric intake; serum insulin-like growth factor 1 (IGF-1), resting energy expenditure, and maximum exercise capacity remained unchanged in the same patients. Serum albumin levels (P < 0.001), whole body fat (P < 0.02), and caloric intake (P < 0.001) predicted survival, whereas lean tissue mass did not. Daily intake of fat and carbohydrate was more important for predicting survival than protein intake. Survival also was predicted by serum IGF-1, insulin, leptin, and ghrelin levels (P < 0.02 to P < 0.001). Serum insulin, leptin, and ghrelin (total) levels predicted body fat (P < 0.001), whereas IGF-1 and thyroid hormone levels (T3, free T3) predicted lean tissue mass (P < 0.01). Systemic inflammation primarily explained variation in lean tissue and secondarily explained loss in body fat. Depletion of lean arm tissue was more closely related to short survival than depletion of lean leg or trunk tissue. CONCLUSIONS The current results demonstrated that body fat was lost more rapidly than lean tissue in progressive cancer cachexia, a phenomenon that was highly related to alterations in the levels of circulating classic hormones and food intake, including both caloric amount and diet composition. These results are important for the planning of efficient palliative treatment for cancer patients. Cancer 2005. © 2005 American Cancer Society. [source]


    HTLV-1 provirus load in peripheral blood lymphocytes of HTLV-1 carriers is diminished by green tea drinking

    CANCER SCIENCE, Issue 7 2004
    Junichiro Sonoda
    Human T-cell lymphotropic virus type 1 (HTLV-1) is causatively associated with adult T-cell leukemia (ATL) and HTLV-1-associated myelopathy/tropical spastic paraparesis (HAM/TSP). Since a high level of HTLV-1 provirus load in circulating lymphocytes is thought to be a risk for ATL and HAM/TSP, diminution of HTLV-1 provirus load in the circulation may prevent these intractable diseases. Our previous study (Jpn J Cancer Res 2000; 91: 34–40) demonstrated that green tea polyphenols inhibit in vitro growth of ATL cells, as well as HTLV-1-infected T-cells. The present study aimed to investigate the in vivo effect of green tea polyphenols on HTLV-1 provirus load in peripheral blood lymphocytes of HTLV-1 carriers. We recruited 83 asymptomatic HTLV-1 carriers to examine HTLV-1 provirus DNA with or without administration of capsulated green tea extract powder. Thirty-seven subjects were followed up for 5 months by measuring HTLV-1 provirus load after daily intake of 9 capsules of green tea extract powder (equivalent to 10 cups of regular green tea), and 46 subjects lived ad libitum without intake of any green tea capsules. The real-time PCR quantification of HTLV-1 DNA revealed a wide range of variation in HTLV-1 provirus load among asymptomatic HTLV-1 carriers (0.2–200.2 copies of HTLV-1 provirus per 1000 peripheral blood lymphocytes). Daily intake of the capsulated green tea for 5 months significantly diminished the HTLV-1 provirus load as compared with the controls (P = 0.031). These results suggest that green tea drinking suppresses proliferation of HTLV-1-infected lymphocytes in vivo. [source]


    Principles of risk assessment for determining the safety of chemicals: Recent assessment of residual solvents in drugs and di(2-ethylhexyl) phthalate

    CONGENITAL ANOMALIES, Issue 2 2004
    Ryuichi Hasegawa
    ABSTRACT Risk assessment of chemicals is essential for the estimation of chemical safety, and animal toxicity data are typically used in the evaluation process, which consists of hazard identification, dose–response assessment, exposure assessment, and risk characterization. Hazard identification entails the collection of all available toxicity data and assessment of toxicity endpoints based on findings for repeated dose toxicity, carcinogenicity or genotoxicity, and species-specificity. Once a review is compiled, the allowable lifetime exposure level of a chemical is estimated from a dose–response assessment based on several measures. For non-carcinogens and non-genotoxic carcinogens, the no-observed-adverse-effect-level (NOAEL) is divided by uncertainty factors (e.g. with environmental pollutants) or safety factors (e.g. with food additives) to derive a tolerable daily intake (TDI) or acceptable daily intake (ADI), respectively. These factors include interspecies and individual differences, duration of exposure, quality of data, and nature of toxicity such as carcinogenicity or neurotoxicity. For genotoxic carcinogens, low dose extrapolation is accomplished with mathematical modeling (e.g. linearized multistage model) from the point of departure to obtain exposure levels that will be associated with an excess lifetime cancer risk of a certain level. Data on levels of chemicals in food, water and air are routinely used for exposure assessment. Finally, risk characterization is performed to ensure that the established 'safe' level of exposure exceeds the estimated level of actual exposure. These principles have led to the evaluation of several existing chemicals. To establish a guideline for residual solvents in medicine, the permitted daily exposure (PDE), equivalent to the TDI, of N,N-dimethylformamide was derived on the basis of developmental toxicity (malformation), and that of N-methylpyrrolidone on the basis of developmental neurotoxicity. A TDI for di(2-ethylhexyl)phthalate was derived from assessment of testicular toxicity. [source]
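
    The central derivation in this abstract, dividing a NOAEL by the product of uncertainty or safety factors to obtain a TDI or ADI, can be written out as a small worked example. The sketch below is illustrative only; the NOAEL and factor values are hypothetical and not taken from the abstract.

    ```python
    # Illustrative sketch of the TDI/ADI derivation described above.
    # The NOAEL and factor values below are hypothetical examples.

    def tolerable_daily_intake(noael_mg_per_kg_day, interspecies=10,
                               intraspecies=10, extra=1):
        """TDI (or ADI) = NOAEL / product of uncertainty (safety) factors."""
        return noael_mg_per_kg_day / (interspecies * intraspecies * extra)

    noael = 5.0  # mg/kg bw/day, hypothetical repeated-dose NOAEL
    tdi = tolerable_daily_intake(noael)                   # default 100-fold factor
    tdi_extra = tolerable_daily_intake(noael, extra=10)   # e.g. a severe endpoint
    print(f"TDI with 100-fold factor : {tdi:.3f} mg/kg bw/day")
    print(f"TDI with 1000-fold factor: {tdi_extra:.4f} mg/kg bw/day")
    ```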


    Ethyl glucuronide in hair: a sensitive, specific marker of chronic heavy drinking

    ADDICTION, Issue 6 2009
    ABSTRACT Aims This study aims to define a cut-off concentration for ethyl glucuronide in hair to determine whether there has been a history of heavy drinking. Settings Pavia, Italy. Participants We analysed hair samples from 98 volunteers among teetotallers, social drinkers and heavy drinkers, whose ethanol daily intake (EDI) was estimated by means of a written questionnaire. Measurements Ethyl glucuronide hair concentration (HEtG) was measured by liquid chromatography-tandem mass spectrometry (lower limit of quantification: 3 pg/mg) using a fully validated method. Findings The HEtG level providing the best compromise between sensitivity (0.92) and specificity (0.96) at detecting an EDI of 60 g or higher during the last 3 months was 27 pg/mg. None of the factors examined among those known to affect ethanol metabolism and/or the diagnostic power of other markers of ethanol use or hair analyses, including age, gender, body mass index, tobacco smoking, prevalent beverage, hair colour, cosmetic treatments and hygienic habits, was found to influence marker performance significantly. However, the slight differences in HEtG performance observed for some factors (e.g. body mass index, smoking and hair treatments) require further studies on larger groups of individuals in order to assess their influence more precisely. Conclusions Our results further confirm that HEtG is a sensitive and specific marker of chronic heavy drinking. [source]
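
    The performance figures quoted above (sensitivity 0.92, specificity 0.96 at a 27 pg/mg cut-off) come from comparing measured HEtG against the questionnaire-based EDI classification. A minimal sketch of that evaluation step is shown below; the HEtG values and drinker labels are invented for illustration, and only the 27 pg/mg threshold comes from the abstract.

    ```python
    # Hypothetical illustration of evaluating a HEtG cut-off by sensitivity and
    # specificity. All data values are invented.

    def sens_spec(values_pg_mg, is_heavy_drinker, cutoff):
        tp = sum(v >= cutoff and h for v, h in zip(values_pg_mg, is_heavy_drinker))
        fn = sum(v < cutoff and h for v, h in zip(values_pg_mg, is_heavy_drinker))
        tn = sum(v < cutoff and not h for v, h in zip(values_pg_mg, is_heavy_drinker))
        fp = sum(v >= cutoff and not h for v, h in zip(values_pg_mg, is_heavy_drinker))
        return tp / (tp + fn), tn / (tn + fp)

    hetg = [2, 5, 12, 20, 41, 85, 150, 8, 35, 60]   # pg/mg, hypothetical
    heavy = [False, False, False, True, True, True, True, False, False, True]
    sens, spec = sens_spec(hetg, heavy, cutoff=27)
    print(f"cut-off 27 pg/mg: sensitivity={sens:.2f}, specificity={spec:.2f}")
    ```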


    Seasonal dynamics of the hepatotoxic microcystins in various organs of four freshwater bivalves from the large eutrophic lake Taihu of subtropical China and the risk to human consumption

    ENVIRONMENTAL TOXICOLOGY, Issue 6 2005
    Jun Chen
    Abstract So far, little is known about the distribution of hepatotoxic microcystins (MCs) in various organs of bivalves, and there is no study on MC accumulation in bivalves from Chinese waters. Distribution pattern and seasonal dynamics of MC-LR, -YR and -RR in various organs (hepatopancreas, intestine, visceral mass, gill, foot, and rest) of four edible freshwater mussels (Anodonta woodiana, Hyriopsis cumingii, Cristaria plicata, and Lamprotula leai) were studied monthly during Oct. 2003–Sep. 2004 in Lake Taihu, which has toxic cyanobacterial blooms in the summer. Qualitative and quantitative determinations of MCs in the organs were done by LC–MS and HPLC. The major share of the toxins was present in the hepatopancreas (45.5–55.4%), followed by the visceral mass with a substantial amount of gonad (27.6–35.5%), whereas gill and foot contained the least (1.8–5.1%). The maximum MC contents in the hepatopancreas, intestine, visceral mass, gill, foot, and rest were 38.48, 20.65, 1.70, 0.64, 0.58, and 0.61 µg/g DW, respectively. There were rather good positive correlations in MC contents between intestines and hepatopancreas of the four bivalves (r = 0.75–0.97, p < 0.05). There appeared to be positive correlations between the maximum MC content in the hepatopancreas and the δ13C (r = 0.919) or δ15N (r = 0.878) of the foot, indicating that the different MC contents in the hepatopancreas might be due to different food ingestion. A glutathione (GSH) conjugate of MC-LR was also detected in the foot sample of C. plicata. Among the foot samples analyzed, 54% were above the provisional WHO tolerable daily intake (TDI) level, and the mean daily intakes from the four bivalves were 8–23.5 times the TDI value when the bivalves are eaten whole, suggesting a high risk of consuming bivalves from Lake Taihu. © 2005 Wiley Periodicals, Inc. Environ Toxicol 20: 572–584, 2005. [source]
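
    The exceedance figures at the end (intakes 8–23.5 times the TDI) are a ratio of estimated daily intake to the WHO provisional TDI. A rough sketch of that comparison follows; the 0.04 µg/kg bw/day TDI is the commonly cited WHO provisional value for MC-LR, and the portion size and body weight are invented assumptions rather than the study's consumption scenario.

    ```python
    # Hypothetical back-of-envelope check of microcystin exposure against a TDI.
    # Portion size and body weight are invented; 0.58 µg/g DW is the maximum foot
    # content from the abstract; 0.04 µg/kg bw/day is the commonly cited WHO
    # provisional TDI and is treated here as an assumption.

    WHO_TDI = 0.04  # µg MC-LR eq. per kg bw per day, provisional

    def intake_vs_tdi(mc_ug_per_g_dw, daily_portion_g_dw, body_weight_kg=60):
        intake = mc_ug_per_g_dw * daily_portion_g_dw / body_weight_kg
        return intake, intake / WHO_TDI

    intake, ratio = intake_vs_tdi(mc_ug_per_g_dw=0.58, daily_portion_g_dw=100)
    print(f"estimated intake: {intake:.2f} µg/kg bw/day = {ratio:.1f} x TDI")
    ```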


    Combination of low-dose mirtazapine and ibuprofen for prophylaxis of chronic tension-type headache

    EUROPEAN JOURNAL OF NEUROLOGY, Issue 2 2007
    L. Bendtsen
    Chronic headaches are difficult to treat and represent the biggest challenge in headache centres. Mirtazapine has a prophylactic and ibuprofen an acute effect in tension-type headache. Combination therapy may increase efficacy and lower side effects. We aimed to evaluate the prophylactic effect of a combination of low-dose mirtazapine and ibuprofen in chronic tension-type headache. Ninety-three patients were included in the double-blind, placebo-controlled, parallel trial. Following a 4-week run-in period they were randomized to four groups for treatment with a combination of mirtazapine 4.5 mg and ibuprofen 400 mg, placebo, mirtazapine 4.5 mg or ibuprofen 400 mg daily for 8 weeks. Eighty-four patients completed the study. The primary efficacy parameter, change in area under the headache curve from run-in to the last 4 weeks of treatment, did not differ between combination therapy (190) and placebo (219), P = 0.85. Explanatory analyses revealed worsening of headache as early as the third week of treatment with ibuprofen alone. In conclusion, the combination of low-dose mirtazapine and ibuprofen is not effective for the treatment of chronic tension-type headache. Moreover, the study suggests that daily intake of ibuprofen worsens headache after only a few weeks in chronic tension-type headache. [source]


    Impact of inter-individual differences in drug metabolism and pharmacokinetics on safety evaluation

    FUNDAMENTAL & CLINICAL PHARMACOLOGY, Issue 6 2004
    J.L.C.M. Dorne
    Abstract Safety evaluation aims to assess the dose–response relationship to determine a dose or level of exposure to food contaminants below which no deleterious effect is measurable, that is, one 'without appreciable health risk' when consumed daily over a lifetime. These safe levels, such as the acceptable daily intake (ADI), have been derived from animal studies using surrogates for the threshold such as the no-observed-adverse-effect-level (NOAEL). The extrapolation from the NOAEL to the human safe intake uses a 100-fold uncertainty factor, defined as the product of two 10-fold factors allowing for human variability and interspecies differences. The 10-fold factor for human variability has been further subdivided into two factors of 10^0.5 (3.16) to cover toxicokinetics and toxicodynamics, and this subdivision allows for the replacement of an uncertainty factor with a chemical-specific adjustment factor (CSAF) when compound-specific data are available. Recently, an analysis of human variability in pharmacokinetics for phase I metabolism (CYP1A2, CYP2A6, CYP2C9, CYP2C19, CYP2D6, CYP2E1, CYP3A4, hydrolysis, alcohol dehydrogenase), phase II metabolism (N-acetyltransferase, glucuronidation, glycine conjugation, sulphation) and renal excretion was used to derive pathway-related uncertainty factors in subgroups of the human population (healthy adults, effects of ethnicity and age). Overall, the pathway-related uncertainty factors (99th centile) were above the toxicokinetic uncertainty factor for healthy adults exposed to xenobiotics handled by polymorphic metabolic pathways (and assuming the parent compound was the proximate toxicant), such as CYP2D6 poor metabolizers (26), CYP2C19 poor metabolizers (52) and NAT-2 slow acetylators (5.2). Neonates were the most susceptible subgroup of the population for pathways with available data [CYP1A2 and glucuronidation (12), CYP3A4 (14), glycine conjugation (28)]. Data for polymorphic pathways were not available in neonates, but uncertainty factors of up to 45 and 9 would allow for the variability observed in children for CYP2D6 and CYP2C19 metabolism, respectively. This review presents an overview of the history of uncertainty factors, the main conclusions drawn from the analysis of inter-individual differences in metabolism and pharmacokinetics, the development of pathway-related uncertainty factors and their use in chemical risk assessment. [source]
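
    The factor scheme described above can be written out explicitly; this is simply a restatement of the subdivision summarized in the abstract:

    ```latex
    \[
    \mathrm{UF}_{\text{default}}
      = \underbrace{10}_{\text{interspecies}} \times \underbrace{10}_{\text{human variability}}
      = 100,
    \qquad
    10_{\text{human}}
      = \underbrace{10^{0.5}}_{\text{toxicokinetics}} \times \underbrace{10^{0.5}}_{\text{toxicodynamics}}
      \approx 3.16 \times 3.16 .
    \]
    ```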


    The effect of herbage allowance on daily intake by Creole heifers tethered on natural Dichanthium spp. pasture

    GRASS & FORAGE SCIENCE, Issue 3 2000
    Boval
    Two experiments were carried out in Guadeloupe to estimate the organic matter intake (OMI) and digestibility (OMD) of a Dichanthium spp. sward, grazed by tethered Creole heifers [mean live weight (LW) 202 ± 2·0 kg], at three daily herbage allowances. Experiment 1 examined herbage allowances of 16, 25 and 31 kg of dry matter (DM) d⁻¹ on a fertilized sward at 21 days of regrowth whereas, in Experiment 2, lower allowances of 11, 15 and 19 kg DM d⁻¹ were examined on the same sward, which was unfertilized and grazed at 14 days of regrowth. In each experiment, the herbage was grazed with three groups of two heifers in a 3 × 3 Latin square design. Sward characteristics were described before grazing. OMI was calculated from total faecal output, and OMD was predicted from the crude protein (CP) content of the faeces. The amount of herbage defoliated by the heifers was also estimated on tillers selected at random. Organic matter intakes were on average 26 g and 19 g OM kg⁻¹ LW, and OMD values were 0·740 and 0·665 for Experiments 1 and 2, respectively, and were not affected by allowance. In Experiment 1, the herbage quality was high [0·50 of leaf and 116 g CP kg⁻¹ organic matter (OM)] for a tropical forage, whereas in Experiment 2, the quality of the herbage (0·27 of leaf and 73 g CP kg⁻¹ OM) was lower. These differences were reflected in differences in intake and digestibility in the two experiments. The experimental tropical Dichanthium spp. swards can have intake characteristics similar to those of a temperate sward. [source]
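
    The intake estimate described above (OMI calculated from total faecal output, with OMD predicted from faecal crude protein) rests on the standard indigestibility relationship OMI = faecal OM output / (1 − OMD). A minimal sketch follows; the faecal output figure is invented, and only the digestibility and live weight values echo the abstract.

    ```python
    # Standard relationship used when intake is estimated from faecal output:
    #   OMI = faecal OM output / (1 - OMD)
    # The faecal output value below is invented for illustration.

    def organic_matter_intake(faecal_om_kg_per_day, omd):
        """Organic matter intake (kg/day) from faecal OM output and digestibility."""
        return faecal_om_kg_per_day / (1.0 - omd)

    fo = 1.3        # kg faecal OM per day, hypothetical
    omd = 0.740     # digestibility reported for Experiment 1
    lw = 202        # kg live weight
    omi = organic_matter_intake(fo, omd)
    print(f"OMI = {omi:.1f} kg/day = {1000 * omi / lw:.0f} g OM per kg LW")
    ```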


    Two week induction of interferon-beta followed by pegylated interferon alpha-2b and ribavirin for chronic infection with hepatitis C

    HEPATOLOGY RESEARCH, Issue 8 2010
    Keiji Matsui
    Objectives: To elucidate the efficacy of interferon (IFN)-beta induction therapy followed by pegylated IFN alpha and ribavirin for chronic infection with hepatitis C virus (HCV). Methods: Patients chronically infected with HCV genotype 1 at high titer were enrolled. Twice-daily bolus injections of 3 million units IFN-beta were administered for 14 days. Thereafter, weekly injections of pegylated IFN alpha-2b and daily intake of ribavirin followed. Therapy duration was adjusted according to the response to the therapy. When the time to an undetectable HCV-RNA was 1, 2, 4, 8, and 12 weeks, the total duration of therapy was 12, 24, 36, 48 and 60 weeks, respectively. Patients who failed to achieve an undetectable HCV-RNA within 12 weeks discontinued therapy at week 12. Results: Among the 101 patients treated, 56 (55.4%) achieved a sustained virological response (SVR). The SVR rate for each treatment duration was 10/10 for 12 weeks, 12/14 for 24 weeks, 18/19 for 36 weeks, 15/26 for 48 weeks, 1/4 for 60 weeks and 0/28 for patients who discontinued therapy at 12 weeks. Mean time to an undetectable HCV-RNA was 35.5 ± 2.7 days. Mean therapy duration was 27.3 ± 1.4 weeks. Using a cut-off value of 21.5 fmol/L of HCV core antigen in the first week, SVR could be predicted with a sensitivity of 0.91 and a specificity of 0.78. Conclusion: IFN-beta induction therapy resulted in acceptable SVR rates despite the short therapy duration. The steep reduction of HCV by IFN-beta enables prediction of SVR in the first week of therapy. [source]
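
    The response-guided dosing rule in the Methods (time to an undetectable HCV-RNA of 1, 2, 4, 8 or 12 weeks mapping to 12, 24, 36, 48 or 60 weeks of therapy) can be expressed as a small lookup. The sketch below is illustrative; how intermediate response times were binned is an assumption, since the abstract lists only the five discrete time points.

    ```python
    # Sketch of the response-guided duration rule described in the abstract:
    # time to undetectable HCV-RNA (weeks) -> total therapy duration (weeks).
    # Binning of intermediate response times up to the next listed week is an
    # assumption made here for illustration.

    DURATION_BY_RESPONSE_WEEK = {1: 12, 2: 24, 4: 36, 8: 48, 12: 60}

    def planned_duration_weeks(weeks_to_undetectable_rna):
        """Return total therapy duration; None means discontinuation at week 12."""
        for response_week in sorted(DURATION_BY_RESPONSE_WEEK):
            if weeks_to_undetectable_rna <= response_week:
                return DURATION_BY_RESPONSE_WEEK[response_week]
        return None  # HCV-RNA still detectable at week 12

    print(planned_duration_weeks(3))   # cleared between weeks 2 and 4 -> 36
    print(planned_duration_weeks(14))  # not cleared by week 12 -> None
    ```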


    Report on the vitamin D status of adult and pediatric patients with inflammatory bowel disease and its significance for bone health and disease

    INFLAMMATORY BOWEL DISEASES, Issue 12 2006
    Helen M. Pappa MD
    Abstract Vitamin D is a hormone responsible for calcium homeostasis and essential for bone mineralization throughout the lifespan. Recent studies revealed a high prevalence of hypovitaminosis D among healthy adults and children, especially in the northern hemisphere, and a link between this condition and suboptimal bone health. Moreover, maintenance of what are today considered optimal vitamin D stores has not been achieved throughout the year with currently recommended daily intake for vitamin D. The prevalence of hypovitaminosis D is even higher among adults with inflammatory bowel disease (IBD), a situation that may be caused by malabsorption and gastrointestinal losses through an inflamed intestine, among other factors. In children with IBD, existing reports of vitamin D status are scarce. The relationship between vitamin D status and bone health, although well-established in healthy adults and children, has been controversial among adults and children with IBD, and the reasons for this have not been investigated to date. Studies in animal models of colitis and in vitro human studies support a role of vitamin D in the regulation of the immune system of the gut and the potential of vitamin D and its derivatives as therapeutic adjuncts in the treatment of IBD. This role of vitamin D has not been investigated with translational studies to date. Currently, there are no guidelines for monitoring vitamin D status, treating hypovitaminosis D, and maintaining optimal vitamin D stores in patients with IBD. These tasks may prove particularly difficult because of malabsorption and gastrointestinal losses that are associated with IBD. [source]


    A case–control study on the dietary intake of mushrooms and breast cancer risk among Korean women

    INTERNATIONAL JOURNAL OF CANCER, Issue 4 2008
    Seo Ah Hong
    Abstract To evaluate the association between dietary mushroom intake and breast cancer risk, a total of 362 women between the ages of 30 and 65 years who were histologically confirmed to have breast cancer were matched to controls by age (±2 years) and menopausal status. Mushroom intake was measured via a food frequency questionnaire that was administered by well-trained interviewers. The associations of the daily intake and the average consumption frequency of mushrooms with breast cancer risk were evaluated using matched data analysis. Both the daily intake (5th vs. 1st quintile, OR = 0.48, 95% CI = 0.30–0.78, p for trend = 0.030) and the average consumption frequency of mushrooms (4th vs. 1st quartile, OR = 0.54, 95% CI = 0.35–0.82, p for trend = 0.008) were inversely associated with breast cancer risk after adjustment for education, family history of breast cancer, regular exercise [≥22.5 MET (metabolic equivalent)-hr/week], BMI (body mass index, kg/m2), number of children and whether they were currently smoking, drinking or using multivitamin supplements. Further adjustments were made for energy-adjusted carbohydrate, soy protein, folate and vitamin E levels, which tended to attenuate these results. After stratification by menopausal status, a strong inverse association was found in postmenopausal women (OR = 0.16, 95% CI = 0.04–0.54, p for trend = 0.0058 for daily intake; OR = 0.17, 95% CI = 0.05–0.54, p for trend = 0.0037 for average frequency), but not in premenopausal women. In conclusion, the consumption of dietary mushrooms may decrease breast cancer risk in postmenopausal women. © 2007 Wiley-Liss, Inc. [source]


    Feeding ecology of wild migratory tunas revealed by archival tag records of visceral warming

    JOURNAL OF ANIMAL ECOLOGY, Issue 6 2008
    Sophie Bestley
    Summary 1. Seasonal long-distance migrations are often expected to be related to resource distribution, and foraging theory predicts that animals should spend more time in areas with relatively richer resources. Yet for highly migratory marine species, data on feeding success are difficult to obtain. We analysed the temporal feeding patterns of wild juvenile southern bluefin tuna from visceral warming patterns recorded by archival tags implanted within the body cavity. 2. Data collected during 1998–2000 totalled 6221 days, with individual time series (n = 19) varying from 141 to 496 days. These data span an annual migration circuit including a coastal summer residency within Australian waters and subsequent migration into the temperate south Indian Ocean. 3. Individual fish recommenced feeding between 5 and 38 days after tagging, and feeding events (n = 5194) were subsequently identified on 76·3 ± 5·8% of days, giving a mean estimated daily intake of 0·75 ± 0·05 kg. 4. The number of feeding events varied significantly with time of day, with the greatest number occurring around dawn (58·2 ± 8·0%). Night feeding, although rare (5·7 ± 1·3%), was linked to the full moon quarter. Southern bluefin tuna foraged in ambient water temperatures ranging from 4·9 °C to 22·9 °C and depths ranging from the surface to 672 m, with different targeting strategies evident between seasons. 5. No clear relationship was found between feeding success and time spent within an area. This was primarily due to high individual variability, with both positive and negative relationships observed at all spatial scales examined (grid ranges of 2 × 2° to 10 × 10°). Assuming feeding success is proportional to forage density, our data do not support the hypothesis that these predators concentrate their activity in areas of higher resource availability. 6. Multiple-day fasting periods were recorded by most individuals. The majority of these (87·8%) occurred during periods of apparent residency within warmer waters (sea surface temperature > 15 °C) at the northern edge of the observed migratory range. These previously undocumented nonfeeding periods may indicate alternative motivations for residency. 7. Our results demonstrate the importance of obtaining information on feeding when interpreting habitat utilization from individual animal tracks. [source]


    Resting energy expenditure and body composition of Labrador Retrievers fed high fat and low fat diets

    JOURNAL OF ANIMAL PHYSIOLOGY AND NUTRITION, Issue 5-6 2006
    S. Yoo
    Summary A high dietary fat intake may be an important environmental factor leading to obesity in some animals. The mechanism could be either an increase in caloric intake and/or a decrease in energy expenditure. To test the hypothesis that high fat diets result in decreased resting energy expenditure (REE), we measured REE using indirect calorimetry in 10 adult intact male Labrador Retrievers, eating weight-maintenance high-fat (HF, 41% energy, average daily intake: 8018 ± 1247 kJ/day, mean ± SD) and low-fat (LF, 14% energy, average daily intake: 7331 ± 771 kJ/day) diets for a 30-day period. At the end of each dietary treatment, body composition measurements were performed using dual-energy X-ray absorptiometry. The mean ± SD REE was not different between diets (4940 ± 361 vs. 4861 ± 413 kJ/day on HF and LF diets respectively). Measurements of fat-free mass (FFM) and fat mass (FM) also did not differ between diets (FFM: 26.8 ± 2.3 kg vs. 26.3 ± 2.5 kg; FM: 3.0 ± 2.3 vs. 3.1 ± 1.5 kg on HF and LF diets respectively). In summary, using a whole body calorimeter, we found no evidence of a decrease in REE or a change in body composition on a HF diet compared with a LF diet. [source]
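
    Indirect calorimetry, as used above, converts measured oxygen consumption and carbon dioxide production into an energy expenditure. The abstract does not state the exact equation applied; the sketch below uses the commonly cited abbreviated Weir equation with invented gas-exchange values, purely to illustrate the conversion.

    ```python
    # Conversion of gas exchange to resting energy expenditure using the
    # abbreviated Weir equation (an assumption; the study's own calculation
    # is not described). VO2 and VCO2 are in L/min; values are invented.

    def ree_weir_kj_per_day(vo2_l_min, vco2_l_min):
        kcal_per_day = (3.94 * vo2_l_min + 1.11 * vco2_l_min) * 1440  # min/day
        return kcal_per_day * 4.184  # kcal -> kJ

    # Hypothetical gas-exchange values for a ~27 kg dog:
    print(f"{ree_weir_kj_per_day(0.155, 0.125):.0f} kJ/day")
    ```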


    Mechanisms linking plant species richness to foraging of a large herbivore

    JOURNAL OF APPLIED ECOLOGY, Issue 4 2010
    Ling Wang
    Summary 1. There is general concern that local loss of plant diversity will adversely impact net primary productivity and other ecosystem properties. However, mechanisms linking plant diversity with other trophic levels, especially for large herbivores, are poorly understood. 2. We examine the responses of foraging sheep to changes in plant species richness in an indoor cafeteria experiment involving six plant species richness levels (1, 2, 4, 6, 8 and 11 species) and three plant functional group compositions within each level, and in a field experiment involving three plant species richness levels (1, 4–6 or >8 species). 3. Sheep preferred a diverse diet over a single diet even when palatable species were in the diet. Voluntary daily intake steadily rose with increases in plant species richness in both cafeteria and field experiments. The overall nutrient intake (i.e. daily energy and protein intakes) of sheep in the cafeteria also rose significantly with increased plant species richness until it reached a plateau at eight species. The quality of the diet selected by sheep was also significantly affected by plant species richness, but the variation of dietary quality was small and variable. 4. High nutrient acquisition by the sheep depended on selecting those palatable species with high nutrient content from the plant forage on offer together with the complementary effects of plant species richness, especially for plant functional group richness. 5. Synthesis and applications. Our experiments demonstrate an asymptotic relationship between plant species richness and voluntary intake by sheep. Increases in plant species richness from a low level led to increased daily nutrient intake, and presumably performance of the sheep. Natural grasslands are generally low in nutritional quality and so plant species richness will critically influence herbivore food intake and nutrition. The asymptotic relationship indicates that the maintenance of plant species richness in rangelands will benefit both domestic herbivore production and the conservation of biodiversity. [source]


    Comparative foraging and nutrition of horses and cattle in European wetlands

    JOURNAL OF APPLIED ECOLOGY, Issue 1 2002
    Catherine Menard
    Summary 1. Equids are generalist herbivores that co-exist with bovids of similar body size in many ecosystems. There are two major hypotheses to explain their co-existence, but few comparative data are available to test them. The first postulates that the very different functioning of their digestive tracts leads to fundamentally different patterns of use of grasses of different fibre contents. The second postulates resource partitioning through the use of different plant species. As domestic horses and cattle are used widely in Europe for the management of conservation areas, particularly in wetlands, a good knowledge of their foraging behaviour and comparative nutrition is necessary. 2. In this paper we describe resource-use by horses and cattle in complementary studies in two French wetlands. Horses used marshes intensively during the warmer seasons; both species used grasslands intensively throughout the year; cattle used forbs and shrubs much more than horses. Niche breadth was similar and overlap was high (Kulczinski's index 0·58–0·77). Horses spent much more time feeding on short grass than cattle. These results from the two sites indicate strong potential for competition. 3. Comparative daily food intake, measured in the field during this study for the first time, was 63% higher in horses (144 g DM kg W^-0.75 day^-1) than in cattle (88 g DM kg W^-0.75 day^-1). Digestibility of the cattle diets was a little higher, but daily intake of digestible dry matter (i.e. nutrient extraction) in all seasons was considerably higher in horses (78 g DM kg W^-0.75 day^-1) than in cattle (51 g DM kg W^-0.75 day^-1). When food is limiting, horses should outcompete cattle in habitats dominated by grasses because their functional response is steeper; under these circumstances cattle will require an ecological refuge for survival during winter, woodland or shrubland with abundant dicotyledons. 4. Horses are a good tool for plant management because they remove more vegetation per unit body weight than cattle, and use the most productive plant communities and plant species (especially graminoids) to a greater extent. They feed closer to the ground, and maintain a mosaic of patches of short and tall grass that contributes to structural diversity at this scale. Cattle use broadleaved plants to a greater extent than horses, and can reduce the rate of encroachment by certain woody species. [source]


    Using Biomonitoring Equivalents to interpret human biomonitoring data in a public health risk context

    JOURNAL OF APPLIED TOXICOLOGY, Issue 4 2009
    Sean M. Hays
    Abstract Increasingly sensitive analytical tools allow measurement of trace concentrations of chemicals in human biological media in persons from the general population. Such data are being generated by biomonitoring programs conducted by the US Centers for Disease Control and other researchers. However, few screening tools are available for interpretation of such data in a health risk assessment context. This review describes the concept and implementation of Biomonitoring Equivalents (BEs), estimates of the concentration of a chemical or metabolite in a biological medium that is consistent with an existing exposure guidance value such as a tolerable daily intake or reference dose. The BE approach integrates available pharmacokinetic data to convert an existing exposure guidance value into an equivalent concentration in a biological medium. Key concepts regarding the derivation and communication of BE values resulting from an expert workshop held in 2007 are summarized. BE derivations for four case study chemicals (toluene, 2,4-dichlorophenoxyacetic acid, cadmium and acrylamide) are presented, and the interpretation of biomonitoring data for these chemicals is presented using the BE values. These case studies demonstrate that a range of pharmacokinetic data and approaches can be used to derive BE values; fully developed physiologically based pharmacokinetic models, while useful, are not required. The resulting screening level evaluation can be used to classify these compounds into relative categories of low, medium and high priority for risk assessment follow-up. Future challenges related to the derivation and use of BE values as tools in risk management are discussed. Copyright © 2008 John Wiley & Sons, Ltd. [source]
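
    To make the BE concept above concrete: a BE converts an exposure guidance value (e.g. a reference dose in mg/kg bw/day) into the biomarker concentration expected if a person were exposed at exactly that level. The sketch below uses a deliberately simplified one-compartment steady-state relationship with invented parameter values; actual BE derivations draw on chemical-specific pharmacokinetic data or models, as the review describes.

    ```python
    # Deliberately simplified illustration of the Biomonitoring Equivalent idea:
    # translate an exposure guidance value into a steady-state blood concentration
    # via a one-compartment model. All parameter values are invented; real BE
    # derivations use chemical-specific pharmacokinetic data.

    def biomonitoring_equivalent_mg_per_l(guidance_value_mg_per_kg_day,
                                          body_weight_kg=70,
                                          fraction_absorbed=1.0,
                                          clearance_l_per_day=500.0):
        """Steady-state concentration consistent with dosing at the guidance value."""
        daily_dose_mg = (guidance_value_mg_per_kg_day * body_weight_kg
                         * fraction_absorbed)
        return daily_dose_mg / clearance_l_per_day  # Css = rate in / clearance

    be = biomonitoring_equivalent_mg_per_l(0.01)   # hypothetical guidance value
    print(f"BE = {be * 1000:.1f} µg/L")            # biomarker levels below this
                                                   # screen as lower priority
    ```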


    Wintertime Vitamin D Supplementation Inhibits Seasonal Variation of Calcitropic Hormones and Maintains Bone Turnover in Healthy Men

    JOURNAL OF BONE AND MINERAL RESEARCH, Issue 2 2009
    Heli T Viljakainen
    Abstract Vitamin D is suggested to have a role in the coupling of bone resorption and formation. Compared with women, men are believed to have more stable bone remodeling, and thus are considered less susceptible to the seasonal variation of calcitropic hormones. We examined whether seasonal variation exists in calcitropic hormones, bone remodeling markers, and BMD in healthy men. Furthermore, we determined which vitamin D intake is required to prevent this variation. Subjects (N = 48) were healthy white men 21–49 yr of age from the Helsinki area with a mean habitual dietary intake of vitamin D of 6.6 ± 5.1 (SD) µg/d. This was a 6-mo double-blinded vitamin D intervention study, in which subjects were allocated to three groups receiving 20 µg (800 IU), 10 µg (400 IU), or placebo. Fasting blood samples were collected six times for analyses of serum (S-)25(OH)D, iPTH, bone-specific alkaline phosphatase (BALP), and TRACP. Radial volumetric BMD (vBMD) was measured at the beginning and end of the study with pQCT. Wintertime variation was noted in S-25(OH)D, S-PTH, and S-TRACP (p < 0.001, p = 0.012, and p < 0.05, respectively) but not in S-BALP or vBMD in the placebo group. Supplementation inhibited the winter elevation of PTH (p = 0.035) and decreased the S-BALP concentration (p < 0.05), but benefited cortical BMD (p = 0.09) only slightly. Healthy men are exposed to a wintertime decrease in vitamin D status that impacts PTH concentration. Vitamin D supplementation improved vitamin D status and inhibited the winter elevation of PTH, and also decreased BALP concentration. The ratio of TRACP to BALP shows the coupling of bone remodeling in a robust way. A stable ratio was observed among those retaining a stable PTH throughout the study. A daily intake of vitamin D in the range of 17.5–20 µg (700–800 IU) seems to be required to prevent winter seasonal increases in PTH and maintain stable bone turnover in young, healthy white men. [source]


    Prey consumption rates and growth of piscivorous brown trout in a subarctic watercourse

    JOURNAL OF FISH BIOLOGY, Issue 3 2006
    H. Jensen
    Prey consumption rates of piscivorous brown trout Salmo trutta were studied in the Pasvik watercourse, which forms the border between Norway and Russia. Estimates of food consumption in the field were similar to or slightly less than maximum values from a bioenergetic model. The piscivore diet consisted mainly of vendace Coregonus albula with a smaller number of whitefish Coregonus lavaretus. Individual brown trout had an estimated mean daily intake of c. 1·5 vendace and 0·4 whitefish, and a rapid annual growth increment of 7–8 cm year⁻¹. The total population of brown trout >25 cm total length was estimated as 8445 individuals (0·6 individuals ha⁻¹), giving a mean ± s.e. annual consumption of 1,553,880 ± 405,360 vendace and 439,140 ± 287,130 whitefish for the whole watercourse. The rapid growth in summer of brown trout >25 cm indicated a high prey consumption rate. [source]


    Nutrient intake of children consuming breakfast at school clubs in London

    JOURNAL OF HUMAN NUTRITION & DIETETICS, Issue 5 2003
    S. Waddington
    Introduction: Research into the effectiveness of breakfast clubs has most commonly focused on social benefits to the child and school, such as improved attendance at school, punctuality and improved concentration levels in the classroom (UEA, 2002). Limited research has been undertaken to investigate the nutritional value of the breakfast foods on offer, or the nutritional content of foods consumed by the child. The aim of this study was to find out what children eat and drink at school breakfast clubs in London. Method: The sample population consisted of 98 children (39 boys and 59 girls) aged 5–11 years attending four primary schools in London. Data were collected about the food on offer and the pricing of different food items, demographic data about the children attending the school club, qualitative data on food preferences and a weighed food intake on two different occasions for each child. Statistical tests (ANOVA and chi-squared tests) and nutrient analysis using Comp-Eat were carried out. Results: The average nutrient content of the breakfast meal consumed was 330 kcal, 12 g protein, 11 g fat and 49 g carbohydrate. Variation was seen between schools. Generally, intakes of vitamin C, calcium and sodium were high and intakes of iron were average. ANOVA between schools showed statistically significant results for a number of nutrients: protein, fat, saturated fat, carbohydrate, sugar, calcium and sodium. Boys were consuming statistically significantly more fat, saturated fat and calcium than girls. One in five children did not have a drink at breakfast. Menu options and pricing of food items varied between the schools and were noted to influence children's food choice and consumption. Mean energy intakes equated to 18% of the estimated average requirement for boys and 20% for girls, with girls consuming more carbohydrate and sugar and boys consuming more fat and protein. Discussion: The findings suggest that careful planning of menus should be undertaken, with cereal-based options being offered daily and cooked options only occasionally, and that healthier eating messages can be incorporated effectively into school clubs when supported by a whole school approach to healthy eating. Conclusion: Food offered at school breakfast clubs can contribute substantial nutrients to a child's daily intake and therefore a varied menu, with guided food choices, should be developed incorporating healthier nutrient-rich options. This work was supported by Brooke Bond working in partnership with the BDA Community Nutrition Group. [source]


    Oral pre-cancer and the associated risk factors among industrial workers in Japan's overseas enterprises in the UK

    JOURNAL OF ORAL PATHOLOGY & MEDICINE, Issue 5 2003
    Toru Nagao
    Abstract Background: Screening at industries has been advocated as a method of early detection for cancer. This study describes the prevalence of oral pre-cancerous lesions and other mucosal diseases following oral mucosal screening, and associated risk factors among Japanese industrial workers in the UK. Methods: Oral mucosal screening was by invitation at 51 industrial locations in the UK. A self-administered questionnaire was used to record socio-behavioural factors and frequency of daily intake of fruits and vegetables. Results: Four hundred and eighty-four subjects attended for oral mucosal screening (mean age 39.9 ± 8.3 years) and their mean period of residence in the UK was 5.3 ± 4.5 years. 63.4% of those examined were male. 31.3% of males and 26.6% of females smoked daily. The gender differences were striking compared with Japan's national rates. A higher proportion of managerial staff were regular heavy (20+ per day) smokers. The intake of more than five portions per day of vegetables and/or fruits during the weekend was significantly higher in females than in males (P = 0.022). One hundred and six subjects (22%) were detected with oral mucosal lesions, including 16 leukoplakia lesions (3.3%) and three with oral lichen planus (1%). The rate of positive detections was higher in managers (7.5%). Odds ratios were estimated by socio-behavioural variables. Among subjects positive for oral leukoplakia, managers accounted for 68.8% (OR 5.26; 95% CI, 1.24–22.29). 87.5% of subjects detected with oral leukoplakia smoked daily and had done so for the past 10 years (OR 28.40; 95% CI, 5.63–143.28). Though regular alcohol drinking was a common feature among male leukoplakia cases, heavy alcohol misuse was not encountered. None reported an intake of five or more portions of fruits or vegetables. Conclusions: Japanese nationals working in managerial positions in the UK and daily regular smokers in the industries visited were found to be at a high risk of oral pre-cancer. Regular dental/oral check-ups and tobacco education programmes are encouraged for oral cancer/pre-cancer control in industrial settings. [source]


    Ethyl Glucuronide in Hair Compared With Traditional Alcohol Biomarkers – A Pilot Study of Heavy Drinkers Referred to an Alcohol Detoxification Unit

    ALCOHOLISM, Issue 5 2009
    Gudrun Høiseth
    Background: Traditional biomarkers for heavy alcohol use include serum carbohydrate-deficient transferrin (CDT), the enzymes aspartate aminotransferase (AST) and alanine aminotransferase (ALT), as well as gamma-glutamyl transferase (GGT). Measurement of the nonoxidative ethanol metabolite ethyl glucuronide (EtG) in hair has been proposed as a new marker with superior qualities. The aim of this study was to investigate the sensitivity of EtG in hair to detect heavy alcohol use compared with CDT, AST, ALT, and GGT. We also wanted to study the quantitative relation between alcohol intake and the different biomarkers. Methods: Sixteen patients with a history of heavy alcohol use over the previous 3 months were recruited directly after admission to a withdrawal clinic. They were thoroughly interviewed about their drinking pattern as well as relevant diseases and use of medicines or drugs. Serum was sampled and analyzed for %CDT, AST, ALT, and GGT. Hair samples were collected and analyzed for EtG. Results: The mean estimated daily intake (EDI) over the previous 3 months was 206 ± 136 g pure alcohol. All patients fulfilled the criteria for heavy alcohol use. The sensitivity to detect heavy alcohol use was 64% for %CDT, 67% for AST, 67% for ALT, 93% for GGT, and 94% for EtG. There was no correlation between the quantitative values of EDI and %CDT, AST, ALT, and GGT. There was a positive, statistically significant correlation between EDI and the level of EtG in hair. Conclusions: In this study, EtG in hair and GGT showed the best sensitivity to detect heavy alcohol use, and there was a positive correlation between EDI and the concentrations of EtG in hair. Before giving recommendations for clinical practice, further studies should be carried out on larger samples and in populations with a wider range of alcohol intake. [source]


    Liver Disease in Heavy Drinkers With and Without Alcohol Withdrawal Syndrome

    ALCOHOLISM, Issue 1 2004
    E. Barrio
    Abstract: Background: Withdrawal syndrome is a hallmark of alcohol dependence. The characteristics of alcohol consumption, closely related to dependence, could influence the development of alcoholic liver disease. The study aimed to investigate if patients with severe alcohol withdrawal syndrome have a peculiar profile of liver disease. Methods: The study included 256 heavy drinkers (aged 19–75 years, 70.3% males) admitted to an Internal Medicine Department. Patients admitted for complications of liver disease were not included. Severe alcohol withdrawal syndrome (seizures, disordered perceptions, or delirium) developed in 150 patients (58.6%). Alcohol consumption (daily quantity, duration, and pattern [regular or irregular]) was assessed by questionnaire. Liver biopsy was performed in all cases. Results: Patients with alcohol withdrawal syndrome showed a lower prevalence of liver cirrhosis and a higher prevalence of alcoholic hepatitis than patients without it. The negative association of alcohol withdrawal syndrome with liver cirrhosis persisted after we adjusted for sex, daily intake, duration, and pattern of alcohol consumption. Alcoholic hepatitis was independently associated with the irregular pattern of alcohol consumption, which was closely associated with severe alcohol withdrawal syndrome. Conclusions: The profile of liver injury is different in heavy drinkers who develop and who do not develop a severe alcohol withdrawal syndrome when admitted to the hospital. [source]


    Genetic Repeat Polymorphism in the Regulating Region of CYP2E1: Frequency and Relationship With Enzymatic Activity in Alcoholics

    ALCOHOLISM, Issue 6 2001
    E. Plee-Gautier
    Background: Differences in the regulatory region of the CYP2E1 gene could be responsible for the interindividual variation in the cytochrome P-450 2E1 (CYP2E1) involved in ethanol oxidation. Recently, a polymorphic repeat sequence in the human gene was described between −2178 and −1945 base pairs. Its frequency seemed to vary among different ethnic populations, and it was suspected to be related to an increased inducibility to further ethanol intake. In the study reported here, the frequency of this polymorphism was investigated in a white French population. Its relationship with the previously described Pst I/Rsa I or Dra I CYP2E1 polymorphisms, alcoholism, alcoholic liver disease, and inducibility of CYP2E1 by ethanol was examined. Methods: The polymorphic region was characterized by polymerase chain reaction in 103 controls, 148 alcoholic subjects without liver diseases, and 98 others with liver cirrhosis. By using in vivo chlorzoxazone (CHZ) metabolism, CYP2E1 phenotype was assessed in 36 non–ethanol-induced subjects (17 controls and 19 withdrawn alcoholics) and in 14 ethanol-induced subjects (10 controls after ingestion of 0.8 g/kg ethanol and four alcoholics with 100 g of daily intake). This phenotype was expressed as the 6-hydroxy CHZ/CHZ ratio. Results: The rare allele frequency was found to be 1.58% in whites (n = 349). Neither significant association with alcoholism or alcoholic liver diseases, nor relationship with the Pst I/Rsa I polymorphism, was observed. But the Dra I polymorphism was more frequent among the heterozygous subjects when compared with wild-type homozygous ones (p < 0.05). The CYP2E1 phenotype was similar in wild-type homozygotes and in heterozygotes at the constitutive level, as well as after induction with ethanol. Conclusions: Our data suggest that CYP2E1 repeat polymorphism does not seem to constitute a major factor for interindividual differences in CYP2E1 expression and susceptibility to alcohol-related disorders in whites. [source]


    Essential elements and contaminants in tissues of commercial pelagic fish from the Eastern Mediterranean Sea

    JOURNAL OF THE SCIENCE OF FOOD AND AGRICULTURE, Issue 9 2009
    Beyza Ersoy
    Abstract BACKGROUND: It is important to determine the concentrations of essential and non-essential metals in fish for human health. The essential elements and contaminants (Pb and Cd) were determined seasonally in the muscle and liver of some pelagic fish species: round herring (Etrumeus teres), chub mackerel (Scomber japonicus), golden grey mullet (Liza aurata) and Mediterranean horse mackerel (Trachurus mediterraneus) from Iskenderun Bay, Eastern Mediterranean Sea. RESULTS: Na, K, Ca and Mg were the most abundant elements in muscle and liver tissues. The Na, K, Ca and Mg concentrations in fish tissues were between 51.7 and 3426 mg kg⁻¹. Muscle accumulated the lowest levels of elements. Trace element and contaminant levels in muscle were highest in spring and summer. The Cu, Zn and Cr concentrations were highest in summer. The Ni, Mn and Fe concentrations were highest in spring. The maximum Pb concentration in the muscle and liver of the fish species was 0.39 and 0.80 mg kg⁻¹, respectively, in autumn. The maximum Cd concentration in the muscle of fish was 0.27 mg kg⁻¹ in spring and the maximum Cd concentration in the liver was 0.78 mg kg⁻¹ in summer. CONCLUSION: The Cr, Pb, Cd, Cu and Zn levels in muscle were found to be lower than the permissible limits reported by various authorities. Estimated weekly and daily intakes of Pb and Cd from consumption of fish muscle were far below the PTWI and PTDI values established by FAO/WHO. Copyright © 2009 Society of Chemical Industry [source]
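
    The screening comparison in the conclusion (estimated intake of a metal from fish consumption checked against a provisional tolerable weekly intake) is illustrated below. All numbers other than the 0.39 mg/kg maximum Pb concentration, including the PTWI used and the consumption scenario, are assumptions for illustration rather than values taken from the paper.

    ```python
    # Illustration of the intake-vs-PTWI screening step described above.
    # Portion size, body weight and the PTWI value are assumed for illustration.

    def weekly_intake_ug_per_kg_bw(metal_mg_per_kg_fish, fish_g_per_week,
                                   body_weight_kg=70):
        ug_per_kg_fish = metal_mg_per_kg_fish * 1000
        return ug_per_kg_fish * (fish_g_per_week / 1000) / body_weight_kg

    pb_intake = weekly_intake_ug_per_kg_bw(metal_mg_per_kg_fish=0.39,  # max Pb
                                           fish_g_per_week=300)        # assumed
    PTWI_PB = 25  # µg/kg bw/week, assumed for illustration
    print(f"Pb: {pb_intake:.2f} µg/kg bw/week "
          f"= {100 * pb_intake / PTWI_PB:.1f}% of PTWI")
    ```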


    Comparative assessment of soybean meal with high and low glucosinolate rapeseed–mustard cake as protein supplement on performance of growing crossbred calves

    JOURNAL OF THE SCIENCE OF FOOD AND AGRICULTURE, Issue 5 2008
    S Ravichandiran
    Abstract BACKGROUND: Feeding of high glucosinolate rapeseed–mustard cakes (RMCs) imparts adverse effects on dry matter (DM) intake, health and overall performance of animals. Recently, plant breeding efforts have resulted in many cultivars of RMCs containing low to moderate levels of glucosinolate in India. The feeding value of RMC cultivars with high and low glucosinolate was evaluated relative to commonly used soybean meal as a protein supplement in growing crossbred calves. RESULTS: Eighteen growing crossbred calves (62.9 ± 3.8 kg body weight) were randomly allocated to three dietary treatments SBM, LG and HG containing soybean meal, low glucosinolate B. napus (15 µmol glucosinolates g⁻¹) and high glucosinolate B. juncea (135 µmol glucosinolates g⁻¹), respectively. Although daily intake of total DM and wheat straw did not differ (P > 0.05) among the dietary treatments, intake (g/kg W^0.75) of the concentrate moiety decreased quadratically (P < 0.01) with increasing glucosinolate levels in diets. Nutrient digestibility and balances of N, Ca and P by calves were similar (P > 0.05) among dietary treatments. However, average daily gain (g) decreased and feed conversion ratio values increased quadratically (P < 0.05) with increasing glucosinolate levels. Serum metabolic profile and triiodothyronine remained within the normal range; however, thyroxine changed quadratically. CONCLUSION: The results suggested that while high glucosinolate RMCs may reduce the palatability and consequently growth rate in crossbred calves, SBM can be replaced completely by low glucosinolate rapeseed without compromising their performance. Copyright © 2008 Society of Chemical Industry [source]


    Approaches in the safety evaluations of veterinary antimicrobial agents in food to determine the effects on the human intestinal microflora

    JOURNAL OF VETERINARY PHARMACOLOGY & THERAPEUTICS, Issue 1 2005
    C. E. CERNIGLIA
    The administration of antimicrobial agents to livestock creates potential for antibiotic residues to enter the food supply and be consumed by humans. Therefore, as a process of food animal drug registration, national regulatory agencies and international committees evaluate data regarding the chemical, microbiologic, pharmacokinetic, pharmacodynamic, pharmacologic, toxicologic, and antimicrobial properties of veterinary drugs to assess the safety of ingested antimicrobial residues to consumers. Currently, European, Australian and United States guidelines for veterinary drug registration require a safety assessment of microbiologic hazards from consumption of antimicrobial residues taking into account the potentially adverse effects on human intestinal microflora. The main concerns addressed are selection of resistant bacteria in the gastrointestinal tract and disruption of the colonization barrier of the resident intestinal microflora. Current requirements differ among national agencies. Efforts are ongoing internationally to review and harmonize approaches and test methods and protocols for application to these microbiologic safety evaluations of antimicrobial drug residues in food. This review describes the background to current regulatory approaches used in applying in vitro and in vivo methods to set a microbiologic acceptable daily intake for residues in food derived from animals treated with an antimicrobial agent. This paper also examines the current research needs to support these evaluations. [source]


    Review article: fructose malabsorption and the bigger picture

    ALIMENTARY PHARMACOLOGY & THERAPEUTICS, Issue 4 2007
    P. R. GIBSON
    Summary Fructose is found widely in the diet as a free hexose, as part of the disaccharide sucrose, and in a polymerized form (fructans). Free fructose has limited absorption in the small intestine, with up to one half of the population unable to completely absorb a load of 25 g. Average daily intake of fructose varies from 11 to 54 g around the world. Fructans are not hydrolysed or absorbed in the small intestine. The physiological consequences of their malabsorption include increasing osmotic load, providing substrate for rapid bacterial fermentation, changing gastrointestinal motility, promoting mucosal biofilm and altering the profile of bacteria. These effects are additive with those of other short-chain poorly absorbed carbohydrates such as sorbitol. The clinical significance of these events depends upon the response of the bowel to such changes; they have a higher chance of inducing symptoms in patients with functional gut disorders than in asymptomatic subjects. Restricting dietary intake of free fructose and/or fructans may have durable symptomatic benefits in a high proportion of patients with functional gut disorders, but high-quality evidence is lacking. It is proposed that confusion over the clinical relevance of fructose malabsorption may be reduced by regarding it not as an abnormality but as a physiological process offering an opportunity to improve functional gastrointestinal symptoms by dietary change. [source]