What is the Tolerable Upper Intake Level for vitamin A?

Finally, iodine is also used in some medical applications and can be a source of potentially supraphysiologic iodine exposure. Iodine tinctures (dilute mixtures of alcohol and iodine) and Betadine are used as iodine-containing antiseptics. Iodinated contrast in radiographic studies is a source of iodine exposure that is often several thousand-fold higher than recommended daily nutritional intake amounts. In some instances of severe hyperthyroidism (increased thyroid hormone production), saturated solutions of potassium iodide may be used preoperatively in conjunction with other medical treatments for thyroid surgery. There is also a role for prophylactic potassium iodide consumption in the event of a nuclear accident, if indicated.

URL: https://www.sciencedirect.com/science/article/pii/B9780128021682000117

Current Understanding of Vitamin D Metabolism, Nutritional Status, and Role in Disease Prevention

Susan J. Whiting, ... Hassan Vatanparast, in Nutrition in the Prevention and Treatment of Disease (Fourth Edition), 2017

A The Tolerable UL for Vitamin D

The daily tolerable UL was established to discourage potentially dangerous self-medication [121]. The two main indicators of excess vitamin D are hypercalcemia, which can lead to calcification of soft tissues such as arteries (arteriosclerosis) and kidneys (nephrocalcinosis), and hypercalciuria, which reflects the presence of excess serum calcium and could be damaging on its own. Kidney stones are sometimes raised as a concern when vitamin D intakes are increased. When the incidence of kidney stones is reported from RCTs of vitamin D, calcium supplementation is usually given along with vitamin D, so it has been difficult to distinguish the cause [85]. In a prospective study of over 2000 participants whose blood 25(OH)D was measured and whose health outcomes were tracked, no statistically significant association between serum 25(OH)D and kidney stones was found; serum 25(OH)D ranged between 50 and 250 nmol/L [132].
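Serum 25(OH)D appears here in nmol/L, while many laboratories report ng/mL; the conversion factor of about 2.496 follows from the molar mass of 25(OH)D (~400.6 g/mol) and is standard, though the helper names below are illustrative. A minimal sketch:

```python
# Convert serum 25(OH)D between ng/mL and nmol/L.
# Standard factor: 1 ng/mL = 2.496 nmol/L (molar mass of 25(OH)D ~400.6 g/mol).
NG_ML_TO_NMOL_L = 2.496

def ng_ml_to_nmol_l(ng_ml: float) -> float:
    return ng_ml * NG_ML_TO_NMOL_L

def nmol_l_to_ng_ml(nmol_l: float) -> float:
    return nmol_l / NG_ML_TO_NMOL_L

# The 125-150 nmol/L ceiling discussed below is roughly 50-60 ng/mL.
print(round(nmol_l_to_ng_ml(125), 1))  # ~50.1
print(round(nmol_l_to_ng_ml(150), 1))  # ~60.1
```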

A risk assessment for vitamin D showed that, based on an analysis of all published reports of excess vitamin D ingestion up to 2007, supplemental vitamin D3 ingested at levels below 10,000 IU/day was not associated with toxicity [133]. It is possible that intakes consistently above 50,000 IU/day could be linked to side effects including hypercalcemia [4].

UL values for infants 0–6 months and 6–12 months were set at 1000 and 1500 IU, respectively (Table 43.15). A no-observed-adverse-effect level (NOAEL) of 1800 IU was used to set these ULs, as studies exist in which high doses of vitamin D produced hypercalcemia. An uncertainty factor (UF) of 1.8 was set for very young infants and a smaller UF for older infants, indicating greater tolerance to vitamin D. For adults over 18 years, the committee concluded that 25(OH)D should not be above 125–150 nmol/L and, using published data from a dosing study, determined that 4000 IU would not raise serum 25(OH)D above this threshold [4]. In that dosing study, Heaney et al. [16] gave cholecalciferol to subjects in doses up to 10,000 IU and found no adverse effects in men treated for 5 months. The IOM indicated that attaining a 25(OH)D above 125 nmol/L had no potential benefit and possible (unknown) risk; therefore, the UL was set at 4000 IU, an adjustment down from the 5000 IU that subjects ingested to reach 125 nmol/L over 5 months of dosing. For children, the committee chose to “scale down” the adult UL [4].
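The infant ULs illustrate the generic risk-assessment arithmetic of dividing a NOAEL by an uncertainty factor. A minimal sketch follows; the UF of 1.2 for older infants is inferred here from 1800/1500 and is an assumption, not a figure stated in the text.

```python
# UL = NOAEL / UF, the standard risk-assessment arithmetic described above.
# The NOAEL of 1800 IU/day comes from the text; the UF of 1.2 for older
# infants is inferred from 1800/1500 and is an assumption.
# (For reference, 1 mcg vitamin D = 40 IU, so 4000 IU/day = 100 mcg/day.)

def tolerable_upper_intake(noael_iu_per_day: float, uncertainty_factor: float) -> float:
    """Return the UL in IU/day given a NOAEL and an uncertainty factor."""
    return noael_iu_per_day / uncertainty_factor

print(tolerable_upper_intake(1800, 1.8))  # 1000.0 IU/day, infants 0-6 months
print(tolerable_upper_intake(1800, 1.2))  # 1500.0 IU/day, infants 6-12 months (UF inferred)
```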

Table 43.15. Comparison of Safety Assessments for Vitamin D Based on Healthy and At-Risk Status

Organization and criterion                      Tolerable UL (IU/day)

Healthy Population (IOM, 2011)
0–0.5 year                                      1000
0.5–1 year                                      1500
1–3 years                                       2500
4–8 years                                       3000
9–13 years                                      4000
14+ years                                       4000

At-Risk Population (Endocrine Society, 2011)
0–0.5 year                                      2000
0.5–1 year                                      2000
1–3 years                                       4000
4–8 years                                       4000
9–13 years                                      4000
14+ years                                       10,000

Sources: From Institute of Medicine, Dietary Reference Intakes for Calcium and Vitamin D, National Academy Press, Washington, DC, 2011; M.F. Holick, N.C. Binkley, H.A. Bischoff-Ferrari, C.M. Gordon, D.A. Hanley, R.P. Heaney, M.H. Murad, C.M. Weaver, Evaluation, treatment, and prevention of vitamin D deficiency: an Endocrine Society Clinical Practice Guideline, J. Clin. Endocrinol. Metab. 96 (2011) 1911–1930.

The Endocrine Society’s recommendations for UL values are higher than the IOM values. The Endocrine Society uses these higher amounts, shown in Table 43.15, because they represent intakes that would be used in treatment regimens for those at high risk of vitamin D deficiency or insufficiency [51]. Unless a person is under a physician’s care, it would be prudent to treat the IOM UL values as the highest safe intakes.

URL: https://www.sciencedirect.com/science/article/pii/B9780128029282000436

Health Benefits of Algal Polysaccharides in Human Nutrition

Ladislava Mišurcová, ... Ludmila Machů, in Advances in Food and Nutrition Research, 2012

E Possible negative effects of dietary fiber on human health

Just as an adequate intake of total dietary fiber (TDF) has not been determined, neither has a tolerable upper intake level. Potential adverse effects of high intakes of dietary fiber on mineral, vitamin, and carotenoid bioavailability have been documented (Greiner et al., 2006, 2009). Studies of the effects of dietary fiber on mineral absorption have focused on the potential for cell wall polysaccharides to act as weak cation exchangers, and therefore to bind divalent ions, as well as on the degree of their fermentability or accessibility to bacterial enzymes in the human colon (Southgate, 1987). The differing affinities of dietary fiber for minerals can be predicted from the diverse chemical structures of the plant cell wall polysaccharides encompassed by the term dietary fiber. Moreover, the extent of mineral bioavailability can depend on the presence of antinutrients in high-fiber foods, such as oxalates, tannins, and phytates, which adversely affect mineral bioavailability (Harland, 1989; Kay, 1982). A dietary fiber–mineral–oxalate complex, in which both oxalic acid and dietary fiber bind the mineral, has been proposed as one mechanism lowering mineral bioavailability.

The phytate content of a meal is considered a negative factor for mineral uptake and bioavailability because it inhibits the absorption of iron, zinc, calcium, magnesium, manganese, and copper. The formation of insoluble mineral–phytate complexes at physiological pH values is regarded as the major reason for this insufficient mineral bioavailability (Afinah et al., 2010; Kumar et al., 2010). The stability and solubility of these complexes depend on the chemical forms and amounts of phytate present in the diet, the pH, the individual cation, and the phytate-to-cation molar ratio, and finally on interactions with the other compounds present (Grases et al., 2001). pH is one of the major factors influencing the solubility of phytate complexes, although the relevant pH range is specific to individual minerals. In general, a lower pH of 4–5 increases the solubility of the Ca2+, Cd2+, Zn2+, and Cu2+ salts, whereas Mg-phytate remains soluble at acidic pH up to pH 7.5. Finally, synergistic effects among different mineral cations may reduce their bioavailability (Graf and Eaton, 1984; Greiner et al., 2006). In the presence of phytate and calcium, absorption of other minerals may be depressed because of insoluble complex formation (Kumar et al., 2010).
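Because the argument turns on the phytate-to-cation molar ratio, a worked example for zinc may help. The molar masses below are standard values, and the ~15 threshold is a common rule of thumb from the wider literature, not a claim made in this chapter; the meal composition is hypothetical.

```python
# Phytate:zinc molar ratio, a concrete instance of the "phytate to cation
# molar ratio" discussed above.
# Molar masses (standard values): phytic acid ~660 g/mol, zinc ~65.4 g/mol.
PHYTATE_G_PER_MOL = 660.0
ZINC_G_PER_MOL = 65.4

def phytate_zinc_molar_ratio(phytate_mg: float, zinc_mg: float) -> float:
    return (phytate_mg / PHYTATE_G_PER_MOL) / (zinc_mg / ZINC_G_PER_MOL)

# Hypothetical meal: 800 mg phytate and 5 mg zinc.
print(round(phytate_zinc_molar_ratio(800, 5), 1))  # ~15.9; ratios above ~15
# are commonly read as indicating substantially reduced zinc absorption.
```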

However, damaging effects of fiber on mineral bioavailability have not been confirmed in all studies. Beneficial effects of fermentable carbohydrates (e.g., inulin, resistant starch) on mineral absorption in rats have been reported (Younes et al., 2001). Moreover, the combination of inulin and resistant starch in the rat diet significantly increased cecal soluble Ca and Mg concentrations (Levrat et al., 1991).

The effects of lignification on the extent of adsorption of mutagenic heterocyclic aromatic amines to fiber in the human intestine and colon have been studied in vitro. Funk et al. (2007) reported that adsorption of hydrophobic heterocyclic aromatic amines increased with the degree of lignification of the fiber, resulting in reduced microbial fiber degradation. Nevertheless, shifts in pH and fermentation may somewhat reduce the adsorption of some hydrophobic heterocyclic aromatic amines during passage through the colon. Consistent results were reported by Ta et al. (1999), who investigated the capacity of different dietary fibers to bind pesticides, such as azinphosmethyl, chlorpropham, chlorothalonil, and permethrin, under conditions mimicking the gastrointestinal tract. Lignin exerted the most significant effect on pesticide solubility; hemicellulose and cellulose bound permethrin and azinphosmethyl to the same extent as lignin, while the soluble fiber pectin had the lowest capacity to bind pesticides. Finally, dietary supplementation with fiber could decrease the toxicity of food contaminants such as pesticides, the mechanism of this protective effect being adsorption of these compounds and their subsequent excretion in the feces.

URL: https://www.sciencedirect.com/science/article/pii/B9780123945976000033

Compositional determinants of fruit and vegetable quality and nutritional value

Ariel R. Vicente, ... Carlos H. Crisosto, in Postharvest Handling (Fourth Edition), 2022

19.2.8.11 Boron (B)

No estimated average requirements or dietary reference intakes have been set for boron, only a tolerable upper intake level of 20 mg/day for individuals aged ≥19 years (Białek, Czauderna, Krajewska, & Przybylski, 2019). In humans, boron plays important roles in the growth and maintenance of bone tissue, improvement of wound healing, and calcium metabolism (enhanced gut absorption of calcium, calcification). Boron acts together with vitamin D, calcium, and magnesium in bone metabolism. Boron has anti-inflammatory effects, influences central nervous system functions, and helps regulate the hormones testosterone, estrogen, insulin, triiodothyronine, and thyroxine. It increases the biological half-life and bioavailability of estradiol and vitamin D. Boron raises the abundance of antioxidant enzymes, such as superoxide dismutase, catalase, glutathione peroxidase, glutathione-S-transferase, and glucose-6-phosphate dehydrogenase. It also detoxifies reactive oxygen and nitrogen species and reduces lipid peroxidation and DNA damage (Białek et al., 2019).

In vascular plants, boron forms strong complexes with different molecules carrying cis-diol groups in appropriate spatial configurations and is essential for correct formation and stabilization of primary cell walls. Most authors consider the formation of borate diester cross-links between two chains of rhamnogalacturonan-II to be an essential function for growth, development, and reproduction in vascular plants (Wimmer et al., 2020). Animals lack the capacity to metabolize inorganic boron compounds (like boric acid or borates) into mono- or di-sugar–borate esters (e.g., glucose and fructose borate esters), bis-sucrose borate esters, sugar alcohol borate esters (sorbitol, mannitol), or pectic polysaccharide borate esters. In contrast, plants can convert inorganic boron into dietary sugar-borate ester complexes, which are the best chemical form for assimilation into cells. In general, plants have higher boron concentrations (from 0.1 to 0.6 mg B/100 g) than animal-based foods (from 0.01 to 0.06 mg/100 g) (Białek et al., 2019). Fresh fruits (e.g., avocado, apple, banana, and red grape), leafy vegetables, flowering heads (broccoli), dried fruits (plums, apricots, and raisins), seeds, and nuts (pecans, almonds, and hazelnuts) are primary natural dietary sources of fructoborate esters, mainly the calcium fructoborate complex, an excellent source of soluble boron with many beneficial physiological properties (Gharibzahedi & Jafari, 2017).

URL: https://www.sciencedirect.com/science/article/pii/B9780128228456000191

Copper: Physiology

J. Bertinato, in Encyclopedia of Food and Health, 2016

Reference Values for Copper

Nutrient reference values for Cu have been established by scientific bodies. Table 1 shows recommended intakes and tolerable upper intake levels (UL) for Cu by life stage group established for North America (i.e., Canada and the United States) and the European Union. Depressed Cu status induces a number of quantifiable physiologic changes. Despite this, no single indicator was deemed sufficient for deriving the estimated average requirement (EAR) for Cu as part of the North American Dietary Reference Intakes, owing to inconsistencies in results from studies measuring similar biomarkers. For adults aged ≥19 years, four indicators were used to establish the EAR: serum Cu concentration, serum ceruloplasmin concentration, erythrocyte Cu/Zn superoxide dismutase (SOD1) activity, and platelet Cu concentration. The EARs for adults range from 0.7 to 1 mg day−1 and the recommended dietary allowances (RDA) range from 0.9 to 1.3 mg day−1. The EAR and RDA are the same for males and females, and these reference values are higher for pregnancy and lactation. It was determined that there were inadequate data to separately establish EARs for infants, children, and adolescents. For infants 0–12 months, adequate intakes were established based on the mean Cu intake of infants fed mainly human milk. For children and adolescents aged 1–18 years, the EARs were derived by extrapolation from the adult EAR. The European Union established population reference intakes that are slightly higher than the North American RDA.

Table 1. Reference values for copper^a

Life stage group     North America^b (mg day−1)             European Union^c (mg day−1)
                     AI      EAR     RDA     UL             PRI     UL
0–6 months           0.2     –       –       –              –       –
6–11 months          –       –       –       –              0.3     –
7–12 months          0.22    –       –       –              –       –
1–3 years            –       0.26    0.34    1              0.4     1
4–6 years            –       –       –       –              0.6     2
4–8 years            –       0.34    0.44    3              –       –
7–10 years           –       –       –       –              0.7     3
9–13 years           –       0.54    0.7     5              –       –
11–14 years          –       –       –       –              0.8     4
14–18 years          –       0.685   0.89    8              –       –
15–17 years          –       –       –       –              1       4
19–50 years          –       0.7     0.9     10             1.1     5
51–70 years          –       0.7     0.9     10             1.1     5
> 70 years           –       0.7     0.9     10             1.1     5
14–18 years (P)      –       0.785   1       8              1.1     –
19–50 years (P)      –       0.8     1       10             1.1     –
14–18 years (L)      –       0.985   1.3     8              1.4     –
19–50 years (L)      –       1       1.3     10             1.4     –

a AI, adequate intake; EAR, estimated average requirement; L, lactation; P, pregnancy; PRI, population reference intake; RDA, recommended dietary allowance; UL, tolerable upper intake level.
b Values obtained from Food and Nutrition Board, Institute of Medicine (2001). Dietary reference intakes for vitamin A, vitamin K, arsenic, boron, chromium, copper, iodine, iron, manganese, molybdenum, nickel, silicon, vanadium, and zinc. Washington, DC: National Academy Press.
c Values obtained from the Scientific Committee on Food, European Food Safety Authority (2006). Tolerable upper intake levels for vitamins and minerals. Parma, Italy: European Commission Publications Office; and Scientific Committee for Food, Commission of the European Communities (1993). Nutrient and energy intakes for the European community. Luxembourg: Office for Official Publications of the European Communities.

There are limited data on the effects of high intakes of Cu in healthy people. Chronic high intakes of Cu result in the accumulation of Cu in the liver, which can lead to liver damage. Acute ingestion of large amounts of Cu salts in drinking water has been reported to cause gastrointestinal symptoms; however, liver damage was selected as the more relevant endpoint for establishing the UL for Cu. A no observed adverse effect level (NOAEL) of 10 mg day−1 was established for adults based on data showing no liver injury in adults consuming 10 mg of supplemental Cu daily. North America applied an uncertainty factor (UF) of 1 to the NOAEL to derive a UL of 10 mg day−1, because of the low prevalence of liver damage from Cu exposure in human populations with normal Cu homeostasis and other evidence indicating the absence of adverse effects with Cu intakes of 10–12 mg day−1 from foods. The European Union set a more conservative UL of 5 mg day−1, applying a UF of 2 to account for variability in sensitivity to Cu toxicity within the normal population. It was also concluded that the UL is not applicable during pregnancy and lactation because of limited data relating to these life stages. Given the paucity of data on Cu toxicity in healthy infants, children, and adolescents, a UL was not set for infants, and ULs for children and adolescents were derived by extrapolation from the adult UL.
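The copper UL derivation uses the same NOAEL-divided-by-uncertainty-factor arithmetic sketched earlier for vitamin D; a brief sketch with the figures from this paragraph:

```python
# UL = NOAEL / UF, applied to copper with the figures from the text.
noael_mg_per_day = 10  # no liver injury at 10 mg/day of supplemental Cu

print(noael_mg_per_day / 1)  # 10.0 mg/day: North American UL (UF = 1)
print(noael_mg_per_day / 2)  # 5.0 mg/day: European Union UL (UF = 2)
```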

URL: https://www.sciencedirect.com/science/article/pii/B9780123849472002014

Authorised EU health claims for DHA and EPA

F.W. Vas Dias, in Foods, Nutrients and Food Ingredients with Authorised EU Health Claims: Volume 2, 2015

13.7.2 Tolerable upper intake levels

In relation to high intakes, the NDA panel concluded that available data were not sufficient to establish a tolerable upper intake level for dietary LC omega-3 PUFAs (EPA, DHA and DPA individually or combined). The panel also considered that long-term supplemental intakes of EPA and DHA combined at doses up to 5 g/day, and supplemental intakes of EPA alone up to 1.8 g/day, do not raise safety concerns for the adult population. The panel noted further that observed intakes of EPA and DHA from food and food supplements in European populations are generally below these amounts.

Bleeding, lipid peroxidation, inflammation and modulation of the immune system, impaired glucose metabolism, and gastrointestinal disturbances are often cited as potentially negative effects of EPA and DHA. However, the NDA panel found that long-term supplementation with amounts of EPA and DHA combined of up to 5 g/day did not increase the risk of spontaneous bleeding episodes or bleeding complications, even in subjects at high risk of bleeding, for example, those treated with anticoagulant medication or with drugs affecting platelet function such as aspirin. Supplemental intake of doses of up to 5 g EPA + DHA/day for up to 12 weeks was shown not to affect glucose homeostasis in healthy or diabetic subjects, nor do these amounts induce changes in immune function that might raise concerns in relation to the risk of infection or inappropriate activation of inflammatory responses. When taken for up to 16 weeks at this dose level, no changes in lipid peroxidation that might otherwise have raised concerns about cardiovascular disease risk have been observed. It is important, however, to ensure the oxidative stability of these LC omega-3 PUFAs. Gastrointestinal discomfort has frequently been reported, particularly when EPA and DHA are consumed as supplements. However, in many studies the test material and the placebo oil caused the same type of disturbances, which would indicate that it may have been an oil effect. If gastrointestinal discomfort does occur, the symptoms are usually mild and, in many subjects, transient.

URL: https://www.sciencedirect.com/science/article/pii/B978178242382900013X

Vitamins and the Immune System

Didem Pekmezci, in Vitamins & Hormones, 2011

IV Vitamin E Requirements and Reference Ranges

Vitamin E has an estimated average requirement (EAR), recommended dietary allowances (RDAs), and a tolerable upper intake level (UL) (Murphy and Barr, 2007). The EAR was based on the amount of 2R-α-tocopherol intake that reversed erythrocyte hemolysis in men who were vitamin E-deficient as a result of consuming a vitamin E-deficient diet for 5 years (Food and Nutrition Board, and Institute of Medicine, 2000). RDAs are designed so that, if met at a population level, almost all individuals would meet their requirements and avoid clinical deficiency symptoms (Morrissey and Sheehy, 1999). The UL is the highest level of daily nutrient intake that is likely to pose no risk of adverse health effects for most individuals in the general population (Yates et al., 1998). The functional criterion for the vitamin E requirement in most age groups is the prevention of hydrogen peroxide (H2O2)-induced hemolysis of red blood cells (Murphy and Barr, 2007). Based on the plasma concentration of α-tocopherol needed to prevent significant hemolysis in vitro (14–16 μmol per L), the US/Canadian EAR is 12 mg/day; the RDA for vitamin E is currently set at 15 mg/day of α-tocopherol for adults (ages above 19) (Food and Nutrition Board, and Institute of Medicine, 2000), a 50% increase on the previous RDA (National Research Council, 1989). The EARs for children aged 1–3, 4–8, and 9–13 years are 5, 6, and 9 mg α-tocopherol per day, respectively (Eitenmiller and Lee, 2004), and the corresponding RDAs are 6, 7, and 11 mg α-tocopherol per day (Eitenmiller and Lee, 2004). The tolerable UL was set at 1000 mg/day for vitamin E (any form of supplemental α-tocopherol) (Traber, 2007). This was one of the few ULs set using data in rats, because sufficient and appropriate quantitative data assessing long-term adverse effects of vitamin E supplements in humans were not available (Traber, 2007). Although vitamin E seems to have very low toxicity, and habitual intake of supplements of 200–600 mg/day (compared with an average dietary intake of 8–12 mg) seems to be without untoward effect (Shrimpton, 1997), very high intakes may antagonize vitamin K and hence potentiate anticoagulant therapy. This is probably the result of inhibition of the vitamin K quinone reductase, but α-tocopheryl quinone may also compete with vitamin K hydroquinone and hence inhibit carboxylation of glutamate in target proteins (Bender, 2003). Animal studies show that vitamin E is not mutagenic, carcinogenic, or teratogenic (Abdo et al., 1986; Dysmsza and Park, 1975; Krasavage and Terhaar, 1977). Adults tolerate relatively high doses without significant toxicity; however, muscle weakness, fatigue, double vision, emotional disturbance, breast soreness, thrombophlebitis, nausea, diarrhea, and flatulence have been reported at tocopherol intakes of 1600–3000 mg/day (Anderson and Reid, 1974; Bendich and Machlin, 1998; Food and Nutrition Board, and Institute of Medicine, 2000; Machlin, 1989; Tsai et al., 1978). The vitamin E requirement for ideal immune function depends on its interactions with other antioxidant and pro-oxidant nutrients, especially polyunsaturated fatty acids (PUFAs), and on other factors that modulate the immune response; age, stress, exercise, infection, and tissue trauma all increase vitamin E requirements (Eskew et al., 1986; Meydani and Tengerdy, 1992; Nockels, 1991).
Early reports suggested that vitamin E requirements increase with the intake of PUFAs (Bender, 2003), oxidizing agents, vitamin A, carotenoids, gossypol, or trace minerals, and decrease with increasing levels of fat-soluble antioxidants, sulfur amino acids, or Se (Dove and Ewan, 1990; Franklin et al., 1998; Hidiroglou et al., 1992; McDowell et al., 1996). Neither the United Kingdom (Department of Health, 1991) nor the European Union (Scientific Committee for Food, 1993) set reference intakes for vitamin E, but both suggested that an acceptable intake was 0.4 mg of α-tocopherol equivalent per gram of dietary PUFA (Bender, 2003). Morrissey and Sheehy (1999) report that a consensus about the exact daily intake of vitamin E for optimal health protection has not yet been reached. Some authors believe that the scientific evidence is already strong enough, especially for cardiovascular disease (CVD), to recommend daily intakes of the order of 87–100 mg α-tocopherol per day or more (Horwitt, 1991; Packer, 1992; Weber et al., 1997). Gey (1995) has shown that there is an inverse relationship between plasma α-tocopherol and risk of ischemic heart disease over a range of 2.5–4.0 mmol per mol of cholesterol, and has suggested an optimum or desirable plasma concentration > 4 mmol of α-tocopherol per mol of cholesterol (> 3.4 μmol per gram of total plasma lipid), which corresponds to an intake of 17–40 mg of α-tocopherol equivalents per day. Certain population groups, notably the elderly, are at greater risk of inadequate dietary intake of vitamin E (Panemangalore and Lee, 1992; Ryan et al., 1992). Ryan et al. (1992) reported that over 40% of elderly subjects (65–98 years) had intakes of vitamin E below two-thirds of the 1989 RDA. In another study, Panemangalore and Lee (1992) found that 37% of elderly subjects (average age 73 years) consumed less than two-thirds of the RDA and 12% had low lipid-adjusted plasma tocopherol status. In several studies, data have shown that elderly humans, as well as laboratory and farm animals, consuming diets containing more than five times the RDA of vitamin E for their species had significantly increased humoral and cell-mediated immune responses and increased resistance to infectious diseases compared with nonsupplemented controls (Bendich, 1990; Bendich et al., 1986; Meydani and Blumberg, 1991; Meydani et al., 1993; Moriguchi et al., 1990; Tengerdy, 1989; Tengerdy et al., 1990). The minimum vitamin E requirement of normal animals is ~30 ppm of diet, as it is in humans (McDowell, 2000). Beharka et al. (1997a) suggested that conventional methods for determining the RDA, while adequate for arriving at the level of vitamin E required to prevent clinical deficiency symptoms, may not adequately predict the optimal level of vitamin E needed to maintain immunological health.
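The UK and EU suggestion of 0.4 mg α-tocopherol equivalent per gram of dietary PUFA is a simple proportional rule; a minimal sketch, in which the 15 g/day PUFA intake is a hypothetical example:

```python
# Vitamin E intake scaled to PUFA intake, per the 0.4 mg alpha-tocopherol
# equivalent per gram of dietary PUFA figure quoted above.
MG_ALPHA_TE_PER_G_PUFA = 0.4

def acceptable_vitamin_e_mg(pufa_g_per_day: float) -> float:
    return MG_ALPHA_TE_PER_G_PUFA * pufa_g_per_day

print(acceptable_vitamin_e_mg(15))  # 6.0 mg/day for a hypothetical 15 g/day PUFA intake
```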

URL: https://www.sciencedirect.com/science/article/pii/B9780123869609000083

Nonessential Trace Minerals

Forrest Harold Nielsen, in Molecular, Genetic, and Nutritional Aspects of Major and Trace Minerals, 2017

Safe Tolerable Upper Intake Level

In the United States and Canada, an RDA or AI has not been set for boron. However, tolerable upper intake levels (ULs) were set: 3 mg/day for children 1–3 years of age; 6 mg/day for children 4–8 years of age; 17 mg/day for adolescents 9–18 years of age; and 20 mg/day for adults (Food and Nutrition Board, Institute of Medicine, 2001). The World Health Organization (1996) first suggested that 13 mg/day would be a safe upper intake level but later increased this to 0.4 mg/kg body weight (World Health Organization, International Program on Chemical Safety, 1998), or approximately 28 mg/day for a 70-kg person. The European Food Safety Authority (2004) established a body-weight-based UL for boron that works out to approximately 10 mg/day. The absence of observed adverse effects from high boron intakes via drinking water supports the establishment of these relatively high ULs (Nielsen and Meacham, 2011). In Turkey, populations consuming high amounts of boron via drinking water, and consequently food, do not exhibit any apparent adverse effects. For example, in a population exposed to drinking water containing up to 29 mg boron/L and to boron mining and production, no adverse effects on health and fertility were found over three generations. In another study, no adverse effects were found in 66 men (mean age 39 years) residing in a high-boron area for 36 years, with a calculated mean boron excretion of 6.77 mg/L. The drinking water where these men resided had boron concentrations ranging from 2.05 to 29.00 mg/L, with a mean of 10.2 ± 4.1 mg/L.
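The WHO/IPCS figure quoted above is body-weight based, so the daily amount is a simple product; a minimal sketch (the 60-kg example is hypothetical):

```python
# Body-weight-based upper level, as in the WHO/IPCS figure quoted above:
# 0.4 mg/kg body weight -> ~28 mg/day for a 70-kg person.
def boron_upper_level_mg_per_day(mg_per_kg: float, body_weight_kg: float) -> float:
    return mg_per_kg * body_weight_kg

print(boron_upper_level_mg_per_day(0.4, 70))  # 28.0 mg/day
print(boron_upper_level_mg_per_day(0.4, 60))  # 24.0 mg/day (hypothetical 60-kg adult)
```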

URL: https://www.sciencedirect.com/science/article/pii/B9780128021682000439

The Latest Research and Development of Minerals in Human Nutrition

James F. Collins, in Advances in Food and Nutrition Research, 2021

6.3.3 Wilson's disease (WD), a genetic, copper-overload disorder

Genetic predisposition disorders, like WD, may increase the risk of copper toxicity even when copper intakes are well below the UL. WD is an autosomal recessive (genetic) disorder typified by defects in copper distribution and storage (Mak & Lam, 2008); it results from mutations in the ATP7B gene, which encodes a copper transporter, thus disrupting copper homeostasis. A recent review provides more details on this potentially devastating human disease (Mulligan & Bronstein, 2020). The prevalence of WD is ≈1:30,000 individuals worldwide (Scheinberg & Sternlieb, 1984), although some genetic studies have reported much higher prevalence rates. A recent study addressed this seeming disconnect and suggested that differences in the penetrance of potentially disease-causing genetic variants may explain the discrepancy between epidemiological and genetic prevalence studies of WD (Wallace & Dooley, 2020). In patients with WD, redox-active copper accumulates in the liver, brain, and cornea (due to impaired copper efflux), increasing the production of prooxidants and causing eventual tissue and organ damage. If the disease goes untreated, WD patients may develop hepatitis, liver fibrosis and cirrhosis, hemolytic crisis, and eventual hepatic failure. Moreover, high brain copper may cause neurological damage, and copper accumulation in the eye (i.e., so-called Kayser-Fleischer rings) may result in abnormal eye movements. Early intervention can prevent the development of some of these notable pathologies. Treatments for WD include high supplemental zinc dosing (which attenuates enteral copper absorption) and/or copper chelation therapy with trientine or penicillamine (LeWitt, 1999).

URL: https://www.sciencedirect.com/science/article/pii/S104345262100005X

Nutrient adequacy and supplementation

Martin Kohlmeier, in Nutrient Metabolism, 2003

Nutrient adequacy and supplementation

Abbreviations

AI    adequate intake
DRI   Dietary Reference Intakes
EAR   estimated average requirement
IOM   Institute of Medicine
RDA   recommended dietary allowance
UL    tolerable upper intake level

Dietary Reference Intakes

Nutrients are coming to be seen more and more like medications, where the ‘dose makes the poison’. The Dietary Reference Intakes (DRI) published by the Food and Nutrition Board now take into consideration, at least in principle, both lower and upper desirable limits. How to determine appropriate limits for individuals or groups remains an unresolved and much debated question. The Food and Nutrition Board has established a basic framework for tackling this question. Needs are considered separately for each of 22 groups defined by age and gender, as well as pregnancy and lactation status.

The Food and Nutrition Board is a subdivision of the Institute of Medicine at the National Academies of the United States and comprises panels that set guidelines for the US and Canada. Similar institutions exist in several other countries to provide guidance on optimal nutrient intake levels.

Minimal nutrient requirements

The lower limits for nutrient intakes are based on the observed consumption level of healthy populations, if data from functional investigations are lacking. This intake level is called adequate intake (AI) and is assumed to cover the needs of healthy people. Because of the current limitations of scientific evidence, the Food and Nutrition Board established AIs applicable to adults for the following nutrients: total fat, omega-6 fatty acids, omega-3 fatty acids, vitamin D, vitamin K, pantothenic acid, biotin, choline, calcium, chromium, fluoride, and manganese. Because information about the requirements of infants (under one year old) is even more limited, only AIs were set for most nutrients. The exceptions are more definite lower limits for protein, iron, and zinc intakes of 7-12-month-old infants.

Where functional information is found to be reasonably reliable, the Food and Nutrition Board sets recommended dietary allowances (RDA). This amount is thought to cover the needs of most healthy people (97–98%) in the designated group. According to this model, the RDA is determined in a three-step process. First, the intake level is sought at which the risk of inadequacy for the healthy target population (e.g., 19- to 50-year-old men) is 50%. This is called the estimated average requirement (EAR, an oxymoronic expression, since it relates to the median and not the average). The second step estimates the variance of requirements. For most nutrient requirements a normal distribution is assumed. Unless evidence to the contrary is available, the variation coefficient (standard deviation divided by the mean) is set at 10% (because this is thought to correspond to the variance of basal metabolic rate). The final step then either adds two standard deviations (usually 20%) to the EAR or determines the 97.5th percentile of requirements by a Monte Carlo simulation procedure. So far, variation coefficients of 10% have been set for thiamin, riboflavin, vitamin B6, folate, vitamin B12, phosphate, magnesium, and selenium because the actual variance was thought to be unknown. The decision to set the variation coefficient for niacin requirements at 15% was based on four separate studies, on a total of 29 adults, with an average variation coefficient of 34%. Similarly, the variation coefficient for vitamin A requirements was set at 20%, based on a single study of the vitamin A half-life in the livers of adults, which gave a 21% variation coefficient. Based on a single study of adults, which gave a variation coefficient of 40%, the variation coefficient of iodine requirements was set to 20%. In each case a judgment was made about the relative contributions of measurement error versus intrinsic interindividual variation. The variation coefficients for copper and molybdenum requirements were set at 15%. The panel commented that data supporting the EARs are limited, but provided no explanation of why they did not use a 10% value as for other nutrients. In all instances, the variation coefficients set for young adults were applied to children, pregnant and lactating women, and older people without the benefit of additional supporting evidence.
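The three-step EAR-to-RDA procedure can be made concrete. The sketch below assumes normally distributed requirements, as the text describes; the 1.0 mg/day EAR is purely illustrative, and the Monte Carlo routine mirrors the 97.5th-percentile approach mentioned above.

```python
import random

def rda_from_ear(ear: float, cv: float = 0.10) -> float:
    """Step 3a: RDA = EAR + 2 SD, with SD = CV * EAR (normality assumed)."""
    return ear * (1 + 2 * cv)

def rda_monte_carlo(ear: float, cv: float = 0.10, n: int = 100_000, seed: int = 0) -> float:
    """Step 3b: 97.5th percentile of simulated individual requirements."""
    rng = random.Random(seed)
    draws = sorted(rng.gauss(ear, cv * ear) for _ in range(n))
    return draws[int(0.975 * n)]

# Illustrative EAR of 1.0 mg/day with the default 10% variation coefficient:
print(rda_from_ear(1.0))               # 1.2 (the two-SD shortcut)
print(round(rda_monte_carlo(1.0), 3))  # ~1.196 (the 97.5th-percentile route)
```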

A significant weakness of the current recommendations relates to the extremely narrow basis of supporting data for several nutrients. In the vast majority of instances data are completely lacking for specific age and gender groups and the recommendations are based on extrapolations from other groups. Information in children and old people is particularly sparse. When levels are set on the basis of observations in a few subjects, as is the case for most of the covered nutrients, there is little opportunity to differentiate needs by genetic disposition. Only rarely is the existence of genetic diversity acknowledged. A typical and important example pertains to niacin requirements. It is likely that many people can cover their niacin requirements through endogenous synthesis from tryptophan while others need significant intakes of preformed niacin. An even better documented example is the greater than average susceptibility to folate deficiency (Ashfield-Watt et al., 2002) in people with variant (thermolabile) 5,10-methylenetetrahydrofolate reductase (MTHFR; EC1.7.99.5). The current intake guidelines take little note of such differences.
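The tryptophan point can be made concrete with the standard niacin-equivalent convention (60 mg tryptophan ≈ 1 mg niacin), which is general nutrition practice rather than a figure from this chapter; the intakes in the example are hypothetical.

```python
# Niacin equivalents (NE): preformed niacin plus endogenous synthesis from
# tryptophan, using the standard 60 mg tryptophan ~= 1 mg niacin convention.
def niacin_equivalents_mg(niacin_mg: float, tryptophan_mg: float) -> float:
    return niacin_mg + tryptophan_mg / 60

# A hypothetical day: 10 mg preformed niacin, 900 mg tryptophan.
print(niacin_equivalents_mg(10, 900))  # 25.0 mg NE
```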

So far, the functional assessment of adequate nutrient intakes has been limited to long-known properties. Reliable knowledge about more recently recognized functions has been disregarded without good reason. Metabolic and health consequences of suboptimal nutrient status are most likely to be detected when they are sought through focused observation.

A major shortcoming of the current framework is the deliberate exclusion of any long-term effects, in particular chronic degenerative disease. This ignores that in affluent societies nutrition influences the main causes of morbidity and death such as atherosclerosis, cancer, and osteoporosis. It is with respect to these chronic degenerative diseases that genetic variation of nutrient metabolism is most significant today. Polymorphisms relating to metabolism of energy, glucose, lipids, folate and iron, to name just a few, are known to be important determinants of disease risk and outcome.

How much vitamin A is safe per day?

When taken by mouth: Vitamin A is likely safe when taken in amounts less than 10,000 units (3,000 mcg) daily. Vitamin A is available in two forms: pre-formed vitamin A (retinol or retinyl esters) and provitamin A (carotenoids). The maximum daily dose relates only to pre-formed vitamin A.

Is 3000 IU of vitamin A too much?

Levels of up to 10,000 IU (3,000 mcg) have been considered safe. Beyond that, though, vitamin A can build up to cause liver damage and brain swelling; pregnant women who ingest too much run the risk of fetal damage.
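The IU-to-microgram conversion behind these figures uses the standard factor for pre-formed vitamin A (1 IU = 0.3 mcg retinol); a minimal sketch:

```python
# Pre-formed vitamin A: 1 IU = 0.3 mcg retinol (standard conversion factor).
MCG_RETINOL_PER_IU = 0.3

def vitamin_a_iu_to_mcg(iu: float) -> float:
    return iu * MCG_RETINOL_PER_IU

print(vitamin_a_iu_to_mcg(10_000))  # 3000.0 mcg, the upper amount quoted above
print(vitamin_a_iu_to_mcg(3_000))   # 900.0 mcg, well below that ceiling
```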