Tables
Normal distribution of iron-containing compounds in men and women
Iron absorption by infants fed formula or milk
Spectrum of body iron content
Prevalence of iron deficiency and iron-deficiency anemia, US
Recommended dietary allowance for iron and the proportion of ...
Maximum hemoglobin concentration and hematocrit values for anemia
Adjustment of maximum hemoglobin concentration and hematocrit ...
Cutoff values for laboratory tests for iron deficiency
Causes of iron deficiency
Summary
Iron deficiency is the most common known form of nutritional deficiency. Its prevalence is highest among young children and women of childbearing age (particularly pregnant women). In children, iron deficiency causes developmental delays and behavioral disturbances, and in pregnant women, it increases the risk for a preterm delivery and delivering a low-birthweight baby. In the past three decades, increased iron intake among infants has resulted in a decline in childhood iron-deficiency anemia in the United States. As a consequence, the use of screening tests for anemia has become a less efficient means of detecting iron deficiency in some populations. For women of childbearing age, iron deficiency has remained prevalent.
To address the changing epidemiology of iron deficiency in the United States, CDC staff
in consultation with experts developed new recommendations for use by primary health-care
providers to prevent, detect, and treat iron deficiency. These recommendations update the
1989 "CDC Criteria for Anemia in Children and Childbearing-Aged Women" (MMWR
1989;38(22):400-4) and are the first comprehensive CDC recommendations to prevent and
control iron deficiency. CDC emphasizes sound iron nutrition for infants and young
children, screening for anemia among women of childbearing age, and the importance of
low-dose iron supplementation for pregnant women.
INTRODUCTION
In the human body, iron is present in all cells and has several vital functions -- as a carrier of oxygen to the tissues from the lungs in the form of hemoglobin (Hb), as a facilitator of oxygen use and storage in the muscles as myoglobin, as a transport medium for electrons within the cells in the form of cytochromes, and as an integral part of enzyme reactions in various tissues. Too little iron can interfere with these vital functions and lead to morbidity and mortality.
In the United States, the prevalence of iron-deficiency anemia among children declined during the 1970s in association with increased iron intake during infancy (1-3). Because of this decline, the value of anemia as a predictor of iron deficiency has also declined, thus decreasing the effectiveness of routine anemia screening among children. In contrast, the rate of anemia among low-income women during pregnancy is high, and no improvement has been noted since the 1970s (4). These findings, plus increased knowledge about screening for iron status, raised questions about the necessity and effectiveness of existing U.S. programs to prevent and control iron deficiency. CDC requested the Institute of Medicine to convene an expert committee to develop recommendations for preventing, detecting, and treating iron-deficiency anemia among U.S. children and U.S. women of childbearing age. The committee met throughout 1992, and in 1993 the Institute of Medicine published the committee's recommendations (5). These guidelines are not practical for all primary health-care and public health settings, however, because they require serum ferritin testing during pregnancy (6). This testing may be appropriate in practices where women consistently visit their physician throughout pregnancy, but it is less feasible when analysis of serum ferritin concentration is unavailable or when prenatal care visits are sporadic. The CDC recommendations in this report -- including those for pregnant women -- were developed for practical use in primary health-care and public health settings.
Besides the Institute of Medicine (5,7), the American Academy of Pediatrics (8,9), the U.S. Preventive Services Task Force (10), the American College of Obstetricians and Gynecologists (9,11), the Federation of American Societies for Experimental Biology (12), and the U.S. Public Health Service (13) have all published guidelines within the past 9 years for health-care providers that address screening for and treatment of iron deficiency in the United States. Preventing and controlling iron deficiency are also addressed in Nutrition and Your Health: Dietary Guidelines for Americans (14).
The CDC recommendations differ from the guidelines published by the U.S. Preventive Services Task Force (10) in two major areas. First, the Task Force recommended screening for anemia among infants at high risk for anemia and pregnant women only. The CDC recommends periodic screening for anemia among high-risk populations of infants and preschool children, among pregnant women, and among nonpregnant women of childbearing age. Second, the Task Force stated there is insufficient evidence to recommend for or against iron supplementation during pregnancy, but the CDC recommends universal iron supplementation to meet the iron requirements of pregnancy. The CDC recommendations for iron supplementation during pregnancy are similar to the guidelines issued by the American Academy of Pediatrics and the American College of Obstetricians and Gynecologists (9).
This report is intended to provide guidance to primary health-care providers and emphasizes the etiology and epidemiology of iron deficiency, the laboratory tests used to assess iron status, and the screening for and treatment of iron deficiency at all ages. The recommendations in this report are based on the 1993 Institute of Medicine guidelines; the conclusions of an expert panel convened by CDC in April 1994; and input from public health nutrition program personnel, primary health-care providers, and experts in hematology, biochemistry, and nutrition.
National health objective 2.10 for the year 2000 is to "reduce iron deficiency to
less than 3% among children aged 1-4 and among women of childbearing age" (15). The
recommendations in this report for preventing and controlling iron deficiency are meant to
move the nation toward this objective.
BACKGROUND
Iron Metabolism
Total body iron averages approximately 3.8 g in men and 2.3 g in women, which is
equivalent to 50 mg/kg body weight for a 75-kg man (16,17) and 42 mg/kg body weight for a
55-kg woman (18), respectively. When the body has sufficient iron to meet its needs, most
iron (greater than 70%) may be classified as functional iron; the remainder is storage or
transport iron. More than 80% of functional iron in the body is found in the red blood
cell mass as Hb, and the rest is found in myoglobin and intracellular respiratory enzymes
(e.g., cytochromes) (Table 1). Iron is stored primarily as
ferritin, but some is stored as hemosiderin. Iron is transported in blood by the protein
transferrin. The total amount of iron in the body is determined by intake, loss, and
storage of this mineral (16).
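As a quick check of these figures, total body iron is simply the per-kilogram iron concentration multiplied by body weight. A minimal Python sketch, using only the values cited in the preceding paragraph:

    # Total body iron ~= iron concentration (mg/kg) x body weight (kg).
    def total_body_iron_g(mg_per_kg, weight_kg):
        """Return total body iron in grams."""
        return mg_per_kg * weight_kg / 1000.0

    print(total_body_iron_g(50, 75))  # 3.75 -> approximately 3.8 g for a 75-kg man
    print(total_body_iron_g(42, 55))  # 2.31 -> approximately 2.3 g for a 55-kg woman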
Iron Intake
Regulation of iron balance occurs mainly in the gastrointestinal tract through absorption. When the absorptive mechanism is operating normally, a person maintains functional iron and tends to establish iron stores. The capacity of the body to absorb iron from the diet depends on the amount of iron in the body, the rate of red blood cell production, the amount and kind of iron in the diet, and the presence of absorption enhancers and inhibitors in the diet.
The percentage of iron absorbed (i.e., iron bioavailability) can vary from less than 1% to greater than 50% (19). The main factor controlling iron absorption is the amount of iron stored in the body. The gastrointestinal tract increases iron absorption when the body's iron stores are low and decreases absorption when stores are sufficient. An increased rate of red blood cell production can also stimulate iron uptake severalfold (16,20).
Among adults, absorption of dietary iron averages approximately 6% for men and 13% for nonpregnant women in their childbearing years (19). The higher absorption efficiency of these women reflects primarily their lower iron stores as a result of menstruation and pregnancy. Among iron-deficient persons, iron absorption is also high (21). Absorption of iron increases during pregnancy, but the amount of the increase is not well defined (6); as iron stores increase postpartum, iron absorption decreases.
Iron bioavailability also depends on dietary composition. Heme iron, which is found only in meat, poultry, and fish, is two to three times more absorbable than non-heme iron, which is found in plant-based foods and iron-fortified foods (19,20). The bioavailability of non-heme iron is strongly affected by the kind of other foods ingested at the same meal. Enhancers of iron absorption are heme iron (in meat, poultry, and fish) and vitamin C; inhibitors of iron absorption include polyphenols (in certain vegetables), tannins (in tea), phytates (in bran), and calcium (in dairy products) (16,22). Vegetarian diets, by definition, are low in heme iron. However, iron bioavailability in a vegetarian diet can be increased by careful planning of meals to include other sources of iron and enhancers of iron absorption (14). In the diet of an infant, before the introduction of solid foods, the amount of iron absorbed depends on the amount and bioavailability of iron in breast milk or formula (8) (Table 2).
Iron Turnover and Loss
Red blood cell formation and destruction are responsible for most iron turnover in the body. For example, in adult men, approximately 95% of the iron required for the production of red blood cells is recycled from the breakdown of red blood cells and only 5% comes from dietary sources. In contrast, an infant is estimated to derive approximately 70% of red blood cell iron from the breakdown of red blood cells and 30% from the diet (23).
In adults, approximately 1 mg of iron is lost daily through feces and desquamated
mucosal and skin cells (24). Women of childbearing age require additional iron to
compensate for menstrual blood loss (an average of 0.3-0.5 mg daily during the
childbearing years) (18) and for tissue growth during pregnancy and blood loss at delivery
and postpartum (an average of 3 mg daily over 280 days' gestation) (25). In all persons, a
minute amount of iron is lost daily from physiological gastrointestinal blood loss.
Pathological gastrointestinal iron loss through gastrointestinal bleeding occurs in
infants and children sensitive to cow's milk and in adults who have peptic ulcer disease,
inflammatory bowel disease, or bowel cancer. Hookworm infections, although not common in
the United States (26), are also associated with gastrointestinal blood loss and iron
depletion (27).
Iron Stores
Iron present in the body beyond what is immediately needed for functional purposes is stored as the soluble protein complex ferritin or the insoluble protein complex hemosiderin (16,17). Ferritin and hemosiderin are present primarily in the liver, bone marrow, spleen, and skeletal muscles. Small amounts of ferritin also circulate in the plasma. In healthy persons, most iron is stored as ferritin (an estimated 70% in men and 80% in women) and smaller amounts are stored as hemosiderin (Table 1). When long-term negative iron balance occurs, iron stores are depleted before iron deficiency begins.
Men store approximately 1.0-1.4 g of body iron (17,28), women approximately 0.2-0.4 g (18,28), and children even less (23). Full-term infants of normal or high birthweight are born with high body iron (an average of 75 mg/kg body weight), to which iron stores contribute approximately 25% (23). Preterm or low-birthweight infants are born with the same ratio of total body iron to body weight, but because their body weight is low, the amount of stored iron is low too.
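The same per-kilogram reasoning applies to newborns. A minimal sketch, assuming the figures above (approximately 75 mg iron/kg body weight at birth, of which roughly 25% is stored iron); the birthweights are hypothetical examples:

    # Estimated stored iron at birth = weight x total iron per kg x stored fraction (23).
    def newborn_stored_iron_mg(weight_kg, total_mg_per_kg=75.0, stored_fraction=0.25):
        """Estimated stored iron (mg) at birth for a given birthweight."""
        return weight_kg * total_mg_per_kg * stored_fraction

    print(newborn_stored_iron_mg(3.5))  # ~66 mg for a 3.5-kg full-term infant
    print(newborn_stored_iron_mg(1.5))  # ~28 mg for a 1.5-kg preterm infant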
Manifestations of Iron Deficiency
Iron deficiency is one of the most common nutritional deficiencies worldwide (29) and has several causes (Table 1B). Iron deficiency represents a spectrum (Table 3) ranging from iron depletion, which causes no physiological impairments, to iron-deficiency anemia, which affects the functioning of several organ systems. In iron depletion, the amount of stored iron (e.g., as measured by serum ferritin concentration) is reduced but the amount of functional iron may not be affected (30,31). Persons who have iron depletion have no iron stores to mobilize if the body requires more iron. In iron-deficient erythropoiesis, stored iron is depleted and transport iron (e.g., as measured by transferrin saturation) is reduced further; the amount of iron absorbed is not sufficient to replace the amount lost or to provide the amount needed for growth and function. In this stage, the shortage of iron limits red blood cell production and results in increased erythrocyte protoporphyrin concentration. In iron-deficiency anemia, the most severe form of iron deficiency, the shortage of iron leads to underproduction of iron-containing functional compounds, including Hb. The red blood cells of persons who have iron-deficiency anemia are microcytic and hypochromic (30,31).
In infants (persons aged 0-12 months) and preschool children (persons aged 1-5 years), iron-deficiency anemia results in developmental delays and behavioral disturbances (e.g., decreased motor activity, social interaction, and attention to tasks) (32,33). These developmental delays may persist past school age (i.e., 5 years) if the iron deficiency is not fully reversed (32-34). In these studies of development and behavior, iron-deficiency anemia was defined as a Hb concentration of less than or equal to 10.0 g/dL or less than or equal to 10.5 g/dL; further study is needed to determine the effects of mild iron-deficiency anemia (for example, a Hb concentration of greater than 10.0 g/dL but less than 11.0 g/dL in children aged 1- less than 2 years) on infant and child development and behavior. Iron-deficiency anemia also contributes to lead poisoning in children by increasing the gastrointestinal tract's ability to absorb heavy metals, including lead (35). Iron-deficiency anemia is associated with conditions that may independently affect infant and child development (e.g., low birthweight, generalized undernutrition, poverty, and high blood level of lead) that need to be taken into account when interventions addressing iron-deficiency anemia are developed and evaluated (34).
In adults (persons aged greater than or equal to 18 years), iron-deficiency anemia among laborers (e.g., tea pickers, latex tappers, and cotton mill workers) in the developing world impairs work capacity; the impairment appears to be at least partially reversible with iron treatment (36,37). It is not known whether iron-deficiency anemia affects the capacity to perform less physically demanding labor that is dependent on sustained cognitive or coordinated motor function (37).
Among pregnant women, iron-deficiency anemia during the first two trimesters of pregnancy is associated with a twofold increased risk for preterm delivery and a threefold increased risk for delivering a low-birthweight baby (38). Evidence from randomized controlled trials indicates that iron supplementation decreases the incidence of iron-deficiency anemia during pregnancy (10,39-42), but trials of the effect of universal iron supplementation during pregnancy on adverse maternal and infant outcomes are inconclusive (10,43,44).
Risk for and Prevalence of Iron Deficiency in the United States
A rapid rate of growth coincident with frequently inadequate intake of dietary iron places children aged less than 24 months, particularly those aged 9-18 months, at the highest risk of any age group for iron deficiency (3). The iron stores of full-term infants can meet an infant's iron requirements until ages 4-6 months, and iron-deficiency anemia generally does not occur until approximately age 9 months. Compared with full-term infants of normal or high birthweight, preterm and low-birthweight infants are born with lower iron stores and grow faster during infancy; consequently, their iron stores are often depleted by ages 2-3 months (5,23) and they are at greater risk for iron deficiency than are full-term infants of normal or high birthweight. Data from the third National Health and Nutrition Examination Survey (NHANES III), which was conducted during 1988-1994, indicated that 9% of children aged 12-36 months in the United States had iron deficiency (on the basis of two of three abnormal values for erythrocyte protoporphyrin concentration, serum ferritin concentration, and transferrin saturation) and that 3% also had iron-deficiency anemia (Table 4). The prevalence of iron deficiency is higher among children living at or below the poverty level than among those living above the poverty level and higher among black or Mexican-American children than among white children (45).
Evidence from the Continuing Survey of Food Intakes by Individuals (CSFII), which was conducted during 1994-1996, suggests that most infants meet the recommended dietary allowance for iron through diet (Table 5; these data exclude breast-fed infants). However, the evidence also suggests that more than half of children aged 1-2 years may not be meeting the recommended dietary allowance for iron through their diet (Table 5; these data do not include iron intake from supplemental iron).
An infant's diet is a reasonable predictor of iron status in late infancy and early childhood (23,48). For example, approximately 20%-40% of infants fed only non-iron-fortified formula or whole cow's milk and 15%-25% of breast-fed infants are at risk for iron deficiency by ages 9-12 months (23,48). Infants fed mainly iron-fortified formula (greater than or equal to 1.0 mg iron/100 kcal formula) (8) are not likely to have iron deficiency at age 9 months (48). Another study has documented that intake of iron-fortified cereal protects against iron deficiency: among exclusively breast-fed infants who were fed cereal starting at age 4 months, 3% of infants who were randomized to receive iron-fortified cereal compared with 15% of infants who were randomized to receive non-iron-fortified cereal had iron-deficiency anemia at age 8 months (49). The effect of prolonged exclusive breast feeding on iron status is not well understood. One nonrandomized study with a small cohort suggested that exclusive breast feeding for greater than 7 months is protective against iron deficiency compared with breast feeding plus the introduction of non-iron-fortified foods at age less than or equal to 7 months (50); infants weaned to iron-fortified foods were not included in this study.
Early introduction (i.e., before age 1 year) of whole cow's milk and consumption of greater than 24 oz of whole cow's milk daily after the 1st year of life are risk factors for iron deficiency because this milk has little iron, may replace foods with higher iron content, and may cause occult gastrointestinal bleeding (8,48,51,52). Because goat's milk and cow's milk have similar compositions (53,54), infants fed goat's milk are likely to have the same risk for developing iron deficiency as do infants fed cow's milk. Of all milks and formulas, breast milk has the highest percentage of bioavailable iron, and breast milk and iron-fortified formulas provide sufficient iron to meet an infant's needs (55). Iron-fortified formulas are readily available, do not cost much more than non-iron-fortified formulas, and have few proven side effects except for darker stools (56,57). Controlled trials and observational studies have indicated that iron-fortified formula causes no more gastrointestinal distress than does non-iron-fortified formula (56-58), and there is little medical indication for non-iron-fortified formula (59).
After age 24 months, when the growth rate of children slows and the diet becomes more diversified, the risk for iron deficiency drops (28,45,47). In children aged greater than 36 months, dietary iron and iron status are usually adequate (45,47). For these older children, risks for iron deficiency include limited access to food (e.g., because of low family income (45) or because of migrant or refugee status), a low-iron or other specialized diet, and medical conditions that affect iron status (e.g., inflammatory or bleeding disorders) (3).
During adolescence (ages 12- less than 18 years), iron requirements (46) and hence the risk for iron deficiency increase because of rapid growth (60,61). Among boys, the risk subsides after the peak pubertal growth period. Among girls and women, however, menstruation increases the risk for iron deficiency throughout the childbearing years. An important risk factor for iron-deficiency anemia among nonpregnant women of childbearing age is heavy menstrual blood loss (greater than or equal to 80 mL/month) (18), which affects an estimated 10% of these women in the United States (17,18). Other risk factors include use of an intrauterine device (which is associated with increased menstrual blood loss), high parity, previous diagnosis of iron-deficiency anemia, and low iron intake (45,60). Use of oral contraceptives is associated with decreased risk for iron deficiency (18,62).
Data from CSFII suggest that only one fourth of adolescent girls and women of childbearing age (12-49 years) meet the recommended dietary allowance for iron through diet (Table 5). Indeed, data from the complete NHANES III indicated that 11% of nonpregnant women aged 16-49 years had iron deficiency and that 3%-5% also had iron-deficiency anemia (Table 4).
Among pregnant women, expansion of blood volume by approximately 35% and growth of the fetus, placenta, and other maternal tissues increase the demand for iron threefold in the second and third trimesters to approximately 5.0 mg iron/day (18,46). Although menstruation ceases and iron absorption increases during pregnancy, most pregnant women who do not take iron supplements to meet increased iron requirements during pregnancy cannot maintain adequate iron stores, particularly during the second and third trimesters (63). After delivery, the iron in the fetus and placenta is lost to the woman, but some of the iron in the expanded blood volume may be returned to the woman's iron stores (18).
The prevalence of anemia in low-income, pregnant women enrolled in public health programs in the United States has remained fairly stable since 1979 (4). In 1993, the prevalence of anemia among these women was 9%, 14%, and 37% in the first, second, and third trimesters, respectively (4). Comparable data for the U.S. population of all pregnant women are unavailable. The low dietary intake of iron among U.S. women of childbearing age (47), the high prevalence of iron deficiency and iron-deficiency anemia among these women (45), and the increased demand for iron during pregnancy (18,46) suggest that anemia during pregnancy may extend beyond low-income women.
Published data on iron supplement use by a representative sample of pregnant U.S. women are limited. In the 1988 National Maternal and Infant Health Survey of a nationally representative sample of U.S. women who delivered a child in that year, 83% of respondents reported that they took supplements with multiple vitamins and minerals greater than or equal to 3 days/week for 3 months after they found out they were pregnant (64). Significantly smaller percentages of black women; Eskimo, Aleut, or American Indian women; women aged less than 20 years; and women having less than a high school education reported taking these supplements. In this survey, self-reported use of supplementation was within the range (55%-95%) found in a review of studies using objective measures to estimate adherence (e.g., pill counts and serum ferritin concentration) (65). The survey results suggest that the groups of women at high risk for iron deficiency during nonpregnancy are less likely to take supplements with multiple vitamins and minerals during pregnancy. This survey did not question respondents about changes in supplement use during pregnancy or what dose of iron supplements was consumed.
In the United States, the main reasons for lack of a recommended iron supplementation regimen during pregnancy may include lack of health-care provider and patient perceptions that iron supplements improve maternal and infant outcomes (65), complicated dose schedules (5,65), and uncomfortable side effects (e.g., constipation, nausea, and vomiting) (66,67). Low-dose supplementation regimens that meet pregnancy requirements (i.e., 30 mg iron/day) (46) and reduce unwanted side effects are as effective as higher dose regimens (i.e., 60 or 120 mg iron/day) in preventing iron-deficiency anemia (66). Simplified dose schedules (e.g., 1 dose/day) may also improve compliance (65). Methods to improve compliance among pregnant women at high risk for iron deficiency require further study.
Among men (males aged greater than or equal to 18 years) and postmenopausal women in
the United States, iron-deficiency anemia is uncommon. Data from NHANES III indicated that
less than or equal to 2% of men aged greater than or equal to 20 years and 2% of women
aged greater than or equal to 50 years had iron-deficiency anemia (Table
4). Data from CSFII indicate that most men and most women aged greater than or equal
to 50 years meet the recommended dietary allowance for iron through diet (Table 5). In a study of adults having iron-deficiency anemia, 62% had
clinical evidence of gastrointestinal bleeding as a result of lesions (e.g., ulcers and
tumors) (68). In NHANES I, which was conducted during 1971-1975, about two thirds of
anemia cases among men and postmenopausal women were attributable to chronic disease or
inflammatory conditions (69). The findings of these studies suggest that, among these
populations, the primary causes of anemia are chronic disease and inflammatory conditions
and that low iron intake should not be assumed to be the cause of the anemia.
TESTS USED TO ASSESS IRON STATUS
Iron status can be assessed through several laboratory tests. Because each test assesses a different aspect of iron metabolism, results of one test may not always agree with results of other tests. Hematological tests based on characteristics of red blood cells (i.e., Hb concentration, hematocrit, mean cell volume, and red blood cell distribution width) are generally more available and less expensive than are biochemical tests. Biochemical tests (i.e., erythrocyte protoporphyrin concentration, serum ferritin concentration, and transferrin saturation), however, detect earlier changes in iron status.
Although all of these tests can be used to assess iron status, no single test is accepted for diagnosing iron deficiency (70). Detecting iron deficiency in a clinical or field setting is more complex than is generally believed.
Lack of standardization among the tests and a paucity of laboratory proficiency testing limit comparison of results between laboratories (71). Laboratory proficiency testing is currently available for measuring Hb concentration, hematocrit, red blood cell count, serum ferritin concentration, and serum iron concentration; provisional proficiency testing was added in 1997 for total iron-binding capacity in the College of American Pathologists survey and was added to the American Association of Bioanalysts survey in 1998. As of April 1998, three states (New York, Pennsylvania, and Wisconsin) had proficiency testing programs for erythrocyte protoporphyrin concentration. Regardless of whether test standardization and proficiency testing become routine, better understanding among health-care providers about the strengths and limitations of each test is necessary to improve screening for and diagnosis of iron-deficiency anemia, especially because the results from all of these tests can be affected by factors other than iron status.
Only the most common indicators of iron deficiency are described in this section. Other
indicators of iron deficiency (e.g., unbound iron-binding capacity and the concentrations
of transferrin receptor, serum transferrin, and holo-ferritin) are less often used or are
under development.
Hb Concentration and Hematocrit
Because of their low cost and the ease and speed with which they can be performed, the tests most commonly used to screen for iron deficiency are Hb concentration and hematocrit (Hct). These measures reflect the amount of functional iron in the body. The concentration of the iron-containing protein Hb in circulating red blood cells is the more direct and sensitive measure. Hct indicates the proportion of whole blood occupied by the red blood cells; it falls only after the Hb concentration falls. Because changes in Hb concentration and Hct occur only at the late stages of iron deficiency, both tests are late indicators of iron deficiency; nevertheless, these tests are essential for determining iron-deficiency anemia.
Because iron deficiency is such a common cause of childhood anemia, the terms anemia, iron deficiency, and iron-deficiency anemia are often used interchangeably (3). The only cases of anemia that can be classified as iron-deficiency anemia, however, are those with additional evidence of iron deficiency. The close association between anemia and iron deficiency holds best when the prevalence of iron deficiency is high. In the United States, the prevalence and severity of anemia have declined in recent years; hence, the proportion of anemia due to causes other than iron deficiency has increased substantially. As a consequence, the effectiveness of anemia screening for iron deficiency has decreased in the United States.
Iron deficiency may be defined as absent bone marrow iron stores (as described on bone marrow iron smears), an increase in Hb concentration of greater than 1.0 g/dL after iron treatment, or abnormal values on certain other biochemical tests (17). The recent recognition that iron deficiency seems to have general and potentially serious negative effects (32-34) has made identifying persons having iron deficiency as important as identifying persons having iron-deficiency anemia.
The case definition of anemia recommended in this report is a Hb concentration or Hct less than the 5th percentile of the distribution in a healthy reference population; the cutoffs are based on age, sex, and (among pregnant women) stage of pregnancy (45,72). This case definition for anemia was shown to correctly identify 37% of women of childbearing age and 25% of children aged 1-5 years who were iron deficient (defined as two of three positive test results {i.e., low mean cell volume, high erythrocyte protoporphyrin, or low transferrin saturation}) (sensitivity) and to correctly classify 93% of women of childbearing age and 92% of children aged 1-5 years as not having iron deficiency (specificity) (73). Lowering the Hb concentration or Hct cutoff would result in identifying fewer people who have anemia due to causes other than iron deficiency (false positives) but also in overlooking more people with iron deficiency (true positives) (74).
The distributions of Hb concentration and Hct and thus the cutoff values for anemia differ between children, men, nonpregnant women, and pregnant women and by age or weeks of gestation (Table 6). The distributions also differ by altitude, smoking status, and race.
Among pregnant women, Hb concentration and Hct decline during the first and second trimesters because of an expanding blood volume (18,39-42). Among pregnant women who do not take iron supplements, Hb concentration and Hct remain low in the third trimester, and among pregnant women who have adequate iron intake, Hb concentration and Hct gradually rise during the third trimester toward the prepregnancy levels (39,40). Because adequate data are lacking in the United States, the cutoff values for anemia are based on clinical studies of European women who had taken iron supplementation during pregnancy (39-42,72). For pregnant women, a test result greater than 3 standard deviations (SD) higher than the mean of the reference population (i.e., a Hb concentration of greater than 15.0 g/dL or a Hct of greater than 45.0%), particularly in the second trimester, likely indicates poor blood volume expansion (72). High Hb concentration or Hct has been associated with hypertension and poor pregnancy outcomes (e.g., fetal growth retardation, fetal death, preterm delivery, and low birthweight) (75-78). In one study, women who had a Hct of greater than or equal to 43% at 26-30 weeks' gestation had more than a twofold increased risk for preterm delivery and a fourfold increased risk for delivering a child having fetal growth retardation than did women who had a Hct of 33%-36% (76). Hence, a high Hb concentration or Hct in the second or third trimester of pregnancy should not be considered an indicator of desirable iron status.
Long-term residency at high altitude (greater than or equal to 3,000 ft) (79) and cigarette smoking (80) cause a generalized upward shift in Hb concentration and Hct (Table 7). The effectiveness of screening for anemia is lowered if the cutoff values are not adjusted for these factors (72,79,80). Adjustment allows the positive predictive value of anemia screening to be comparable between those who reside near sea level and those who live at high altitude and between smokers and nonsmokers (72).
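Operationally, the adjustment is an upward shift of the anemia cutoff. The Python sketch below is illustrative only: the adjustment increments are placeholders, not the authoritative values, which appear in Table 7.

    # Sketch of shifting an Hb cutoff upward for altitude and smoking (72,79,80).
    # The increments below are PLACEHOLDERS for illustration; the authoritative
    # adjustment values are those given in Table 7 of this report.
    PLACEHOLDER_ALTITUDE_ADJ_G_DL = 0.3  # increment for high-altitude residence
    PLACEHOLDER_SMOKING_ADJ_G_DL = 0.3   # increment for cigarette smoking

    def adjusted_hb_cutoff(base_cutoff_g_dl, high_altitude, smoker):
        """Shift the Hb anemia cutoff upward for altitude and smoking."""
        cutoff = base_cutoff_g_dl
        if high_altitude:
            cutoff += PLACEHOLDER_ALTITUDE_ADJ_G_DL
        if smoker:
            cutoff += PLACEHOLDER_SMOKING_ADJ_G_DL
        return cutoff

    # A nonpregnant woman (base cutoff 12.0 g/dL) who smokes and resides at
    # high altitude would be screened against a higher cutoff:
    print(adjusted_hb_cutoff(12.0, high_altitude=True, smoker=True))  # 12.6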
In the United States, the distribution of Hb concentration values is similar among whites and Asian Americans (81), and the distribution of Hct values is similar among whites and American Indians (82). The distributions are lower among blacks than whites, however, even after adjustment for income (83,84). These different distributions are not caused by a difference in iron status indicators (e.g., iron intake, serum ferritin concentration, or transferrin saturation); thus, applying the same criteria for anemia to all races results in a higher rate of false-positive cases of iron deficiency for blacks (84). For example, in the United States during 1976-1980, 28% of nonpregnant black women but only 5% of nonpregnant white women had a Hb concentration of less than 12 g/dL and, according to the anemia criteria, would be classified as iron deficient, even though other tests for iron status suggested these women were not iron deficient (84). For this reason, the Institute of Medicine recommends lowering Hb concentration and Hct cutoff values for black children aged less than 5 years by 0.4 g/dL and 1%, respectively, and for black adults by 0.8 g/dL and 2%, respectively (5). Because the reason for this disparity in distributions by race has not been determined, the recommendations in this report do not provide race-specific cutoff values for anemia. Regardless, health-care providers should be aware of the possible difference in the positive predictive value of anemia screening for iron deficiency among blacks and whites and consider using other iron status tests (e.g., serum ferritin concentration and transferrin saturation) for their black patients.
Accurate, low-cost, clinic-based instruments have been developed for measuring Hb concentration and Hct by using capillary or venous blood (85,86). Small diurnal variations are seen in Hb concentration and Hct measurements, but these variations are neither biologically nor statistically significant (87,88). A potential source of error when using capillary blood to estimate Hb concentration and Hct in screening is improper sampling technique. For example, excessive squeezing (i.e., "milking") of the finger contaminates the blood with tissue fluid, leading to false low readings (89). A low reading should be confirmed by obtaining a second capillary blood sample from the finger or by venipuncture.
Although measures of Hb concentration and Hct cannot be used to determine the cause of anemia, a diagnosis of iron-deficiency anemia can be made if Hb concentration or Hct increases after a course of therapeutic iron supplementation (23,51). Alternatively, other laboratory tests (e.g., mean cell volume, red blood cell distribution width, and serum ferritin concentration) can be used to differentiate iron-deficiency anemia from anemia due to other causes.
In the United States in recent years, the usefulness of anemia screening as an
indicator of iron deficiency has become more limited, particularly for children. Studies
using transferrin saturation (a more sensitive test for iron deficiency) have documented
that iron deficiency in most subpopulations of children has declined such that screening
by Hb concentration no longer efficiently predicts iron deficiency (3,45,51,90). Data from
NHANES II, which was conducted during 1976-1980, indicated that less than 50% of children
aged 1-5 years and women in their childbearing years who had anemia (as defined by Hb
concentration less than 5th percentile) were iron deficient (i.e., had at least two of the
following: low mean cell volume, high erythrocyte protoporphyrin concentration, or low
transferrin saturation) (70,73,83). Causes of anemia other than iron deficiency include
other nutritional deficiencies (e.g., folate or vitamin B12 deficiency), hereditary
defects in red blood cell production (e.g., thalassemia major and sickle cell disease),
recent or current infection, and chronic inflammation (91). The current pattern of
iron-deficiency anemia in the United States (28,45) indicates that selective anemia
screening of children at known risk for iron deficiency or additional measurement of
indicators of iron deficiency (e.g., erythrocyte protoporphyrin concentration and serum
ferritin concentration) to increase the positive predictive value of screening are now
suitable approaches to assessing iron deficiency among most U.S. children (3,73). The
costs and feasibility of screening using additional indicators of iron deficiency may
preclude the routine use of these indicators.
Mean Cell Volume
Mean cell volume (MCV), the average volume of red blood cells, is measured in femtoliters (10^-15 liters). This value can be calculated as the ratio of Hct to red blood cell count or measured directly using an electronic counter. MCV is highest at birth, decreases during the first 6 months of life, then gradually increases during childhood to adult levels (23,51). A low MCV is defined as a value less than the 5th percentile for age in the reference population from NHANES III (28).
Some anemias, including iron-deficiency anemia, result in microcytic red blood cells; a
low MCV thus indicates microcytic anemia (Table 8). If cases of
lead poisoning and the anemias of infection, chronic inflammatory disease, and thalassemia
minor can be excluded, a low MCV serves as a specific index for iron-deficiency anemia
(28,87,94,95).
Red Blood Cell Distribution Width
Red blood cell distribution width (RDW) is calculated by dividing the SD of red blood cell volume by MCV and multiplying by 100 to express the result as a percentage:
RDW (%) = {SD of red blood cell volume (fL)/MCV (fL)} x 100
A high RDW is generally set at greater than 14.0%, which corresponds to the 95th percentile of RDW for the reference population in NHANES III (20). The RDW value obtained depends on the instrument used (51,95).
An RDW measurement often follows an MCV test to help determine the cause of a low MCV.
For example, iron-deficiency anemia usually causes greater variation in red blood cell
size than does thalassemia minor (96). Thus, a low MCV and an RDW of greater than 14.0%
indicate iron-deficiency anemia, whereas a low MCV and an RDW of less than or equal to 14.0%
indicate thalassemia minor (51).
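The RDW formula and this decision rule can be combined in a short Python sketch. This is a minimal illustration only: the low-MCV cutoff is age specific and is passed in as a given, and the example values are hypothetical.

    # MCV/RDW rule for microcytic anemia (51,96). Assumes lead poisoning and the
    # anemias of infection and chronic inflammatory disease have been excluded.
    def rdw_percent(sd_rbc_volume_fl, mcv_fl):
        """RDW (%) = (SD of red blood cell volume / MCV) x 100."""
        return sd_rbc_volume_fl / mcv_fl * 100.0

    def classify_microcytic_anemia(mcv_fl, mcv_low_cutoff_fl, rdw_pct):
        """Low MCV + RDW > 14.0% suggests iron-deficiency anemia;
        low MCV + RDW <= 14.0% suggests thalassemia minor."""
        if mcv_fl >= mcv_low_cutoff_fl:
            return "MCV not low; rule does not apply"
        if rdw_pct > 14.0:
            return "suggests iron-deficiency anemia"
        return "suggests thalassemia minor"

    rdw = rdw_percent(sd_rbc_volume_fl=11.0, mcv_fl=72.0)  # ~15.3%
    print(classify_microcytic_anemia(72.0, 77.0, rdw))     # suggests iron-deficiency anemia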
Erythrocyte Protoporphyrin Concentration
Erythrocyte protoporphyrin is the immediate precursor of Hb. The concentration of erythrocyte protoporphyrin in blood increases when insufficient iron is available for Hb production. A concentration of greater than 30 ug/dL of whole blood or greater than 70 ug/dL of red blood cells among adults and a concentration of greater than 80 ug/dL of red blood cells among children aged 1-2 years indicates iron deficiency (28,45,91). The normal range of erythrocyte protoporphyrin concentration is higher for children aged 1-2 years than for adults, but no consensus exists on the normal range for infants (28,90). The sensitivity of free erythrocyte protoporphyrin to iron deficiency (as determined by response to iron therapy) in children and adolescents aged 6 months-17 years is 42%, and the estimated specificity is 61% (74).
Infection, inflammation, and lead poisoning as well as iron deficiency can elevate
erythrocyte protoporphyrin concentration (23,92). This measure of iron status has several
advantages and disadvantages relative to other laboratory measures. For example, the
day-to-day variation within persons for erythrocyte protoporphyrin concentration is less
than that for serum iron concentration and transferrin saturation (87). A high erythrocyte
protoporphyrin concentration is an earlier indicator of iron-deficient erythropoiesis than
is anemia, but it is not as early an indicator of low iron stores as is low serum ferritin
concentration (30). Inexpensive, clinic-based methods have been developed for measuring
erythrocyte protoporphyrin concentration, but these methods can be less reliable than
laboratory methods (92).
Serum Ferritin Concentration
Nearly all ferritin in the body is intracellular; a small amount circulates in the plasma. Under normal conditions, a direct relationship exists between serum ferritin concentration and the amount of iron stored in the body (97), such that 1 ug/L of serum ferritin concentration is equivalent to approximately 10 mg of stored iron (98). In the United States, the average serum ferritin concentration is 135 ug/L for men (28), 43 ug/L for women (28), and approximately 30 ug/L for children aged 6-24 months (23).
Serum ferritin concentration is an early indicator of the status of iron stores and is the most specific indicator available of depleted iron stores, especially when used in conjunction with other tests to assess iron status. For example, among women who test positive for anemia on the basis of Hb concentration or Hct, a serum ferritin concentration of less than or equal to 15 ug/L confirms iron deficiency and a serum ferritin concentration of greater than 15 ug/L suggests that iron deficiency is not the cause of the anemia (93). Among women of childbearing age, the sensitivity of low serum ferritin concentration (less than or equal to 15 ug/L) for iron deficiency as defined by no stainable bone marrow iron is 75%, and the specificity is 98%; when low serum ferritin concentration is set at less than 12 ug/L, the sensitivity for iron deficiency is 61% and the specificity is 100% (93). Although low serum ferritin concentration is an early indicator of low iron stores, it has been questioned whether a normal concentration measured during the first or second trimester of pregnancy can predict adequate iron status later in pregnancy (6).
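Two of these relationships lend themselves to a brief worked example: the approximate conversion from serum ferritin concentration to stored iron (98) and the ferritin cutoff used to confirm iron deficiency among anemic women (93). A minimal Python sketch, with hypothetical patient values:

    # Serum ferritin interpretations described above (93,98).
    def estimated_iron_stores_mg(serum_ferritin_ug_l):
        """Under normal conditions, 1 ug/L serum ferritin ~ 10 mg stored iron."""
        return serum_ferritin_ug_l * 10.0

    def ferritin_confirms_iron_deficiency(serum_ferritin_ug_l, anemic):
        """Among women who test positive for anemia, serum ferritin <= 15 ug/L
        confirms iron deficiency; a higher value suggests another cause."""
        return anemic and serum_ferritin_ug_l <= 15.0

    print(estimated_iron_stores_mg(43))                   # ~430 mg (average US woman)
    print(ferritin_confirms_iron_deficiency(12.0, True))  # True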
The cost of assessing serum ferritin concentration and the unavailability of
clinic-based measurement methods hamper the use of this measurement in screening for iron
deficiency. In the past, methodological problems have hindered the comparability of
measurements taken in different laboratories (87), but this problem may be reduced by
proficiency testing and standardized methods. Factors other than the level of stored iron
can result in large within-individual variation in serum ferritin concentration (99). For
example, because serum ferritin is an acute-phase reactant, chronic infection,
inflammation, or diseases that cause tissue and organ damage (e.g., hepatitis, cirrhosis,
neoplasia, or arthritis) can raise its concentration independent of iron status (97). This
elevation can mask depleted iron stores.
Transferrin Saturation
Transferrin saturation indicates the extent to which transferrin has vacant iron-binding sites (i.e., a low transferrin saturation indicates a high proportion of vacant iron-binding sites). Saturation is highest in neonates, decreases by age 4 months, and increases throughout childhood and adolescence until adulthood (23,28). Transferrin saturation is based on two laboratory measures, serum iron concentration and total iron-binding capacity (TIBC). Transferrin saturation is calculated by dividing serum iron concentration by TIBC and multiplying by 100 to express the result as a percentage:
Transferrin saturation (%) = {serum iron concentration (ug/dL)/TIBC (ug/dL)} x 100
Serum iron concentration is a measure of the total amount of iron in the serum and is often provided with results from other routine tests evaluated by automated, laboratory chemistry panels. Many factors can affect the results of this test. For example, the concentration of serum iron increases after each meal (71), infections and inflammations can decrease the concentration (69), and diurnal variation causes the concentration to rise in the morning and fall at night (100). The day-to-day variation of serum iron concentration within individuals is greater than that for Hb concentration and Hct (88,101).
TIBC is a measure of the iron-binding capacity within the serum and reflects the availability of iron-binding sites on transferrin (94). Thus, TIBC increases when serum iron concentration (and stored iron) is low and decreases when serum iron concentration (and stored iron) is high. Factors other than iron status can affect results from this test. For example, inflammation, chronic infection, malignancies, liver disease, nephrotic syndrome, and malnutrition can lower TIBC readings, and oral contraceptive use and pregnancy can raise the readings (87,102). Nevertheless, the day-to-day variation is less than that for serum iron concentration (87,101). TIBC is less sensitive to iron deficiency than is serum ferritin concentration, because changes in TIBC occur after iron stores are depleted (17,31,94).
A transferrin saturation of less than 16% among adults is often used to confirm iron deficiency (93). Among nonpregnant women of childbearing age, the sensitivity of low transferrin saturation (less than 16%) for iron deficiency as defined by no stainable bone marrow iron is 20%, and the specificity is 93% (93).
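A minimal Python sketch of this calculation and the adult confirmation cutoff; the serum iron and TIBC values are hypothetical:

    # Transferrin saturation (%) = serum iron (ug/dL) / TIBC (ug/dL) x 100.
    def transferrin_saturation_pct(serum_iron_ug_dl, tibc_ug_dl):
        """Percentage of transferrin iron-binding sites that are occupied."""
        return serum_iron_ug_dl / tibc_ug_dl * 100.0

    ts = transferrin_saturation_pct(serum_iron_ug_dl=40.0, tibc_ug_dl=450.0)
    print(round(ts, 1))  # 8.9
    print(ts < 16.0)     # True: consistent with iron deficiency in adults (93)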
The factors that affect serum iron concentration and TIBC, such as iron status, diurnal
variation (87,103), and day-to-day variation within persons (101), can affect the measured
transferrin saturation as well. The diurnal variation is larger for transferrin saturation
than it is for Hb concentration or Hct (87,103). Transferrin saturation is an indicator of
iron-deficient erythropoiesis rather than iron depletion; hence, it is less sensitive to
changes in iron stores than is serum ferritin concentration (30,31). The cost of assessing
transferrin saturation and the unavailability of simple, clinic-based methods for
measuring transferrin saturation hinder the use of this test in screening for iron
deficiency.
JUSTIFICATION FOR RECOMMENDATIONS
These recommendations are intended to guide primary health-care providers in preventing
and controlling iron deficiency in infants, preschool children, and women of childbearing
age (especially pregnant women). Both primary prevention through appropriate dietary
intake and secondary prevention through detecting and treating iron-deficiency anemia are
discussed.
Primary Prevention
Primary prevention of iron deficiency means ensuring an adequate intake of iron. A reliable source of dietary iron is essential for every infant and child's growth and development, because a rapid rate of growth and low dietary iron may predispose an infant to exhaustion of iron stores by ages 4-6 months (23). Primary prevention of iron deficiency is most important for children aged less than 2 years, because among all age groups they are at the greatest risk for iron deficiency caused by inadequate intake of iron (28,45,47,48,91). The adequacy of the iron content of an infant's diet is a major determinant of the iron status of the infant as a young child, as indicated by declines in the prevalence of iron-deficiency anemia that correspond with improvements in infant feeding practices (1-3). In infants and young children, iron deficiency may result in developmental and behavioral disturbances (33,34).
The evidence for the effectiveness of primary prevention among pregnant women is less clear. Although iron-deficiency anemia during pregnancy is associated with preterm delivery and delivering a low-birthweight baby (38), well-designed, randomized controlled trials are needed to evaluate the effectiveness of universal iron supplementation in mitigating adverse birth outcomes. Some studies have indicated that adequate iron supplementation during pregnancy reduces the prevalence of iron-deficiency anemia (6,10,39-42,66,104), but over the last few decades, the recommendation by the Council on Foods and Nutrition and other groups to supplement iron intake during pregnancy has not resulted in a reduced prevalence of anemia among low-income, pregnant women (4,9,105). Evidence on iron supplement use is limited, however, so it is not known how well the recommendation has been followed. Conclusive evidence of the benefits of universal iron supplementation for all women is lacking, but CDC advocates universal iron supplementation for pregnant women because a large proportion of women have difficulty maintaining iron stores during pregnancy and are at risk for anemia (6,18,63), iron-deficiency anemia during pregnancy is associated with adverse outcomes (38), and supplementation during pregnancy is not associated with important health risks (10,65,66).
Potential Adverse Effects of Increasing Dietary Iron Intake
Approximately 3.3 million women of childbearing age and 240,000 children aged 1-2 years have iron-deficiency anemia (45); conversely, up to one million persons in the United States may be affected by iron overload due to hemochromatosis (106,107). Hemochromatosis is a genetic condition characterized by excessive iron absorption, excess tissue iron stores, and potential tissue injury. If undetected and untreated, iron overload may eventually result in the onset of morbidity (e.g., cirrhosis, hepatomas, diabetes, cardiomyopathy, arthritis or arthropathy, or hypopituitarism with hypogonadism), usually between ages 40 and 60 years. Clinical expression of iron overload depends on the severity of the metabolic defect, the presence of sufficient quantities of absorbable iron in the diet, and physiological blood loss from the body (e.g., menstruation) (16). Transferrin saturation is the recommended screening test for hemochromatosis; a repeated high value indicates hemochromatosis (108). Preventing or treating the clinical signs of hemochromatosis involves repeated phlebotomy to remove excess iron from the body (108).
Although increases in iron intake would seem contraindicated in persons with hemochromatosis, there is no evidence that iron fortification of foods or the use of a recommended iron supplementation regimen during pregnancy is associated with increased risk for clinical disease due to hemochromatosis (16). Even when their dietary intake of iron is approximately average, persons with iron overload due to hemochromatosis will require phlebotomy to reduce their body's iron stores (108).
Secondary Prevention
Secondary prevention involves screening for, diagnosing, and treating iron deficiency. Screening tests can be for anemia or for earlier indicators of iron deficiency (e.g., erythrocyte protoporphyrin concentration or serum ferritin concentration). The cost, feasibility, and variability of measurements other than Hb concentration and Hct currently preclude their use for screening. The decision to screen an entire population or to screen only persons at known risk for iron deficiency should be based on the prevalence of iron deficiency in that population (73).
The percentage of anemic persons who are truly iron deficient (i.e., the positive
predictive value of anemia screening for iron deficiency) increases with increasing
prevalence of iron deficiency in the population (73). In the United States, children from
low-income families, children living at or below the poverty level, and black or
Mexican-American children are at higher risk for iron deficiency than are children from
middle- or high-income families, children living above the poverty level, and white
children, respectively (2,3,45). Routine screening for anemia among populations of
children at higher risk for iron deficiency is effective, because anemia is predictive of
iron deficiency. In populations having a low prevalence of anemia or a prevalence of iron
deficiency less than 10% (e.g., children from middle- or high-income families and white
children) (2,3,45), anemia is less predictive of iron deficiency (73), and selectively
screening only the persons having known risk factors for iron deficiency increases the
positive predictive value of anemia screening (3,70). Because the iron stores of a
full-term infant of normal or high birthweight can meet the body's iron requirements up to
age 6 months (23), anemia screening is of little value before age 6 months for these
infants. Anemia among pregnant women and anemia among all nonpregnant women of
childbearing age should be considered together, because childbearing increases the risk
for iron deficiency (both during and after pregnancy) (41,42), and iron deficiency before
pregnancy likely increases the risk for iron deficiency during pregnancy (109). Periodic
screening for anemia among adolescent girls and women of childbearing age is indicated for
several reasons. First, most women have dietary intake of iron below the recommended
dietary allowance (46,47). Second, heavy menstrual blood loss, which increases iron
requirements to above the recommended dietary allowance, affects an estimated 10% of women
of childbearing age (17,18). Finally, the relatively high prevalence of iron deficiency
and iron-deficiency anemia among nonpregnant women of childbearing age (45) and of anemia
among low-income, pregnant women (4) suggests that periodic screening for anemia is
indicated among adolescent girls and nonpregnant women of childbearing age during routine
medical examinations (73) and among pregnant women at the first prenatal visit. Among men
and postmenopausal women, in whom iron deficiency and iron-deficiency anemia are uncommon
(45), anemia screening is not highly predictive of iron deficiency.
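The dependence of the positive predictive value on prevalence can be made concrete by using the sensitivity (37%) and specificity (93%) reported earlier for the anemia case definition among women of childbearing age (73); the prevalences in the sketch below are illustrative:

    # PPV of anemia screening for iron deficiency as a function of prevalence.
    def ppv(sensitivity, specificity, prevalence):
        """Fraction of positive screens that are true iron deficiency."""
        true_pos = sensitivity * prevalence
        false_pos = (1.0 - specificity) * (1.0 - prevalence)
        return true_pos / (true_pos + false_pos)

    for prev in (0.03, 0.10, 0.30):
        print(f"prevalence {prev:.0%}: PPV = {ppv(0.37, 0.93, prev):.0%}")
    # prevalence 3%: PPV = 14%
    # prevalence 10%: PPV = 37%
    # prevalence 30%: PPV = 69%

At a 3% prevalence, roughly six of every seven positive anemia screens reflect a cause other than iron deficiency, which is why selective screening of higher-risk groups is recommended for low-prevalence populations.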
RECOMMENDATIONS
Infants (Persons Aged 0-12 Months) and Preschool Children (Persons Aged 1-5 Years)
Primary prevention of iron deficiency in infants and preschool children should be
achieved through diet. Information on diet and feeding is available in the Pediatric
Nutrition Handbook (8), Guide to Clinical Preventive Services (10), Nutrition and Your
Health: Dietary Guidelines for Americans (14), Breastfeeding and the Use of Human Milk
(110), and Clinician's Handbook of Preventive Services: Put Prevention into Practice
(111). For secondary prevention of iron deficiency in this age group, screening for,
diagnosing, and treating iron-deficiency anemia are recommended.
Primary Prevention
Milk and Infant Formulas
Solid Foods
Secondary Prevention
Universal Screening
Selective Screening
Screen the following children:
Diagnosis and Treatment
School-Age Children (Persons Aged 5- less than 12 Years) and Adolescent Boys (Males Aged 12- less than 18 Years)
Among school-age children and adolescent boys, only those who have a history of iron-deficiency anemia, special health-care needs, or low iron intake should be screened for anemia. Age-specific anemia criteria should be used (Table 6). Treatment for iron-deficiency anemia includes one 60-mg iron tablet each day for school-age children and two 60-mg iron tablets each day for adolescent boys and counseling about dietary intake of iron. Follow-up and laboratory evaluation are the same for school-age children and adolescent boys as they are for infants and preschool children.
Adolescent Girls (Females Aged 12- less than 18 Years) and Nonpregnant Women of Childbearing Age
Primary prevention of iron deficiency for adolescent girls and nonpregnant women of
childbearing age is through diet. Information about healthy diets, including good sources
of iron, is available in Nutrition and Your Health: Dietary Guidelines for Americans (14).
Screening for, diagnosing, and treating iron-deficiency anemia are secondary prevention
approaches. Age-specific anemia criteria should be used during screening (Table 6).
Primary Prevention
Secondary Prevention
Screening
Diagnosis and Treatment
Pregnant Women
Primary prevention of iron deficiency during pregnancy includes adequate dietary iron intake and iron supplementation. Information about healthy diets, including good sources of iron, is found in Nutrition and Your Health: Dietary Guidelines for Americans (14). More detailed information for pregnant women is found in Nutrition During Pregnancy and Lactation: An Implementation Guide (112). Secondary prevention involves screening for, diagnosing, and treating iron-deficiency anemia.
Primary Prevention
Secondary Prevention
Screening
Diagnosis and Treatment
Postpartum Women
Women at risk for anemia at 4-6 weeks postpartum should be screened for anemia by using a Hb concentration or Hct test; the anemia criteria for nonpregnant women should be used (Table 6). Risk factors include anemia that continued through the third trimester, excessive blood loss during delivery, and multiple birth. Treatment and follow-up for iron-deficiency anemia in postpartum women are the same as for nonpregnant women. If no risk factors for anemia are present, supplemental iron should be stopped at delivery.
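The postpartum screening rule above is an any-of-these check. A minimal sketch follows; the parameter and function names are hypothetical.

    # Screen at 4-6 weeks postpartum when any listed risk factor is present.
    # Parameter names are hypothetical.

    def should_screen_postpartum(anemia_through_third_trimester: bool,
                                 excessive_delivery_blood_loss: bool,
                                 multiple_birth: bool) -> bool:
        return (anemia_through_third_trimester
                or excessive_delivery_blood_loss
                or multiple_birth)

    print(should_screen_postpartum(False, True, False))  # True: screen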
Men (Males Aged >=18 Years) and Postmenopausal Women
No routine screening for iron deficiency is recommended for men or postmenopausal
women. Iron deficiency or anemia detected during routine medical examinations should be
fully evaluated for its cause. Men and postmenopausal women usually do not need iron
supplements.
CONCLUSION
In the United States, iron deficiency affects 7.8 million adolescent girls and women of
childbearing age and 700,000 children aged 1-2 years (45). Primary health-care providers
can help prevent and control iron deficiency by counseling individuals and families about
sound iron nutrition during infancy and beyond and about iron supplementation during
pregnancy, by screening persons on the basis of their risk for iron deficiency, and by
treating and following up persons with presumptive iron deficiency. Implementing these
recommendations will help reduce manifestations of iron deficiency (e.g., preterm births,
low birthweight, and delays in infant and child development) and thus improve public
health.
References
To request a copy of this document or for questions concerning this document, please
contact the person or office listed below. If requesting a document, please specify the
complete name of the document as well as the address to which you would like it mailed.
Note that if a name is listed with the address below, you may wish to contact this person
via CDC WONDER/PC e-mail.
For single-issue purchase: 800-843-6356
NATIONAL CENTER FOR CHRONIC DISEASE PREVENTION AND HEALTH PROMOTION
State/Federal Government: for free copies, write to:
CDC, MMWR MS(C-08)
Atlanta, GA 30333
TABLE 1. Normal distribution of iron-containing compounds in men (17) and women (18)
(milligrams of iron per kilogram of body weight)
====================================================
Compound                        Men          Women
----------------------------------------------------
Storage complexes
  Ferritin                        9              4
  Hemosiderin                     4              1
Transport protein
  Transferrin                    <1             <1
Functional compounds
  Hemoglobin                     31             31
  Myoglobin                       4              4
  Respiratory enzymes             2              2
Total                            50             42
====================================================
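To make the per-kilogram figures in Table 1 concrete, the short sketch below scales them by body weight. This is a rough illustration under our own assumptions; the function name and example weights are not from the guidelines.

    # Illustrative only: scale Table 1's per-kilogram totals to a whole body.
    # Function name and example weights are hypothetical.

    TOTAL_IRON_MG_PER_KG = {"men": 50.0, "women": 42.0}  # "Total" row of Table 1

    def total_body_iron_mg(sex: str, weight_kg: float) -> float:
        """Approximate total body iron (mg) for a given body weight."""
        return TOTAL_IRON_MG_PER_KG[sex] * weight_kg

    # A 70-kg man: 50 mg/kg x 70 kg = 3,500 mg (about 3.5 g of iron).
    # A 60-kg woman: 42 mg/kg x 60 kg = 2,520 mg (about 2.5 g).
    print(total_body_iron_mg("men", 70))    # 3500.0
    print(total_body_iron_mg("women", 60))  # 2520.0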
TABLE 2. Iron absorption by infants fed formula or milk (8)
===========================================================================================
Substance                 Iron content (mg/L)   Bioavailable iron (%)   Absorbed iron (mg/L)
-------------------------------------------------------------------------------------------
Nonfortified formula           1.5-4.8*                 ~10                  0.15-0.48
Iron-fortified formula+       10.0-12.8*                ~ 4                  0.40-0.51
Whole cow's milk               0.5                      ~10                  0.05
Breast milk                    0.5                      ~50                  0.25
-------------------------------------------------------------------------------------------
* Values are given for commonly marketed infant formulas.
+ Iron-fortified formula contains >=1.0 mg iron/100 kcal formula (8). Most iron-fortified
  formulas contain approximately 680 kcal/L, which is equivalent to >=6.8 mg iron/L.
===========================================================================================
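The last column of Table 2 is simply the product of the first two. A minimal sketch of that arithmetic, with a hypothetical function name, follows:

    # Absorbed iron (mg/L) = iron content (mg/L) x bioavailable fraction.
    # Example values are taken from Table 2; the function name is hypothetical.

    def absorbed_iron_mg_per_l(iron_content_mg_per_l: float,
                               bioavailable_pct: float) -> float:
        return iron_content_mg_per_l * bioavailable_pct / 100.0

    print(absorbed_iron_mg_per_l(0.5, 10))  # whole cow's milk: 0.05 mg/L
    print(absorbed_iron_mg_per_l(0.5, 50))  # breast milk:      0.25 mg/L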
TABLE 3. Spectrum of body iron content (17,30,31)
=================================================================================
Iron status                        Stored iron   Transport iron   Functional iron
---------------------------------------------------------------------------------
Iron-deficiency anemia                Low            Low              Low
Iron-deficient erythropoiesis         Low            Low              Normal
Iron depletion                        Low            Normal           Normal
Normal                                Normal         Normal           Normal
Iron overload                         High           High             Normal
=================================================================================
TABLE 4. Prevalence (%) of iron deficiency and iron-deficiency anemia, United States,
third National Health and Nutrition Examination Survey, 1988-1994 (45)
=======================================================================
Sex and age (years)     Iron deficiency   Iron-deficiency anemia
-----------------------------------------------------------------------
Both sexes
  1-2                          9                    3*
  3-5                          3                   <1
  6-11                         2                   <1
Nonpregnant females
  12-15                        9                    2*
  16-19                       11*                   3*
  20-49                       11                    5*
  50-69                        5                    2
  >=70                         7*                   2*
Males
  12-15                        1                   <1
  16-19                       <1                   <1
  20-49                       <1                   <1
  50-69                        2                    1
  >=70                         4                    2
-----------------------------------------------------------------------
* Prevalence in nonblacks is 1 percentage point lower than prevalence
  in all races.
=======================================================================
TABLE 5. 1989 Recommended dietary allowance (RDA) for iron and the proportion of
Americans having diets meeting 100% of the RDA for iron, 1994-1996
==============================================================================
                                        Proportion of Americans meeting
Sex and age (years)   RDA (mg/day)*     100% of the 1989 RDA for iron+ (%)
------------------------------------------------------------------------------
Both sexes
  <1                       6-10                    87.9&
  1-2                     10                       43.9
  3-5                     10                       61.7
Females
  6-11                    10                       60.9
  12-19                   15                       27.7
  20-29                   15                       25.9
  30-39                   15                       26.6
  40-49                   15                       22.1
  50-59                   10                       55.2
  60-69                   10                       59.3
  >=70                    10                       59.2
Males
  6-11                    10                       79.8
  12-19                   12                       83.1
  20-29                   10                       86.9
  30-39                   10                       88.9
  40-49                   10                       85.9
  50-59                   10                       83.8
  60-69                   10                       85.5
  >=70                    10                       78.5
------------------------------------------------------------------------------
* National Research Council (46). The age groups designated by the council are
  slightly different from those presented in this table.
+ Two-day average dietary intakes, from the U.S. Department of Agriculture
  Continuing Survey of Food Intakes by Individuals, 1994-1996 (47).
& Excludes breast-fed infants.
==============================================================================
TABLE 6. Maximum hemoglobin concentration and hematocrit values for anemia* (45,72)
======================================================================================
                                       Hemoglobin concentration      Hematocrit
                                              (<g/dL)                   (<%)
--------------------------------------------------------------------------------------
Children (age in years)
  1-<2+                                        11.0                     32.9
  2-<5                                         11.1                     33.0
  5-<8                                         11.5                     34.5
  8-<12                                        11.9                     35.4
Men (age in years)
  12-<15                                       12.5                     37.3
  15-<18                                       13.3                     39.7
  >=18                                         13.5                     39.9
Nonpregnant women and lactating women
(age in years)
  12-<15                                       11.8                     35.7
  15-<18                                       12.0                     35.9
  >=18                                         12.0                     35.7
Pregnant women
  Weeks' gestation
    12                                         11.0                     33.0
    16                                         10.6                     32.0
    20                                         10.5                     32.0
    24                                         10.5                     32.0
    28                                         10.7                     32.0
    32                                         11.0                     33.0
    36                                         11.4                     34.0
    40                                         11.9                     36.0
  Trimester
    First                                      11.0                     33.0
    Second                                     10.5                     32.0
    Third                                      11.0                     33.0
--------------------------------------------------------------------------------------
* Age- and sex-specific cutoff values for anemia are based on the 5th percentile from
  the third National Health and Nutrition Examination Survey (NHANES III), which
  excluded persons who had a high likelihood of iron deficiency, by using the same
  methods described by Looker et al. (45). Maximum values for anemia during pregnancy
  are based on values from pregnant women who had adequate supplementation (39-42,72).
+ Although no data are available from NHANES III to determine the maximum hemoglobin
  concentration and hematocrit values for anemia among infants, the values listed for
  children aged 1-<2 years can be used for infants aged 6-12 months.
======================================================================================
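Applying Table 6 amounts to comparing a measured value against the age- and sex-specific cutoff. A minimal sketch follows, encoding only a few rows of the table; the dictionary keys and function name are ours.

    # Anemia is indicated when the hemoglobin concentration falls below the
    # group's cutoff. Only a few Table 6 rows are encoded; keys are hypothetical.

    HB_CUTOFF_G_PER_DL = {
        "children 1-<2 years":     11.0,
        "children 2-<5 years":     11.1,
        "men >=18 years":          13.5,
        "nonpregnant women >=18":  12.0,
    }

    def is_anemic(group: str, hb_g_per_dl: float) -> bool:
        return hb_g_per_dl < HB_CUTOFF_G_PER_DL[group]

    print(is_anemic("nonpregnant women >=18", 11.5))  # True: 11.5 < 12.0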
TABLE 7. Adjustment of maximum hemoglobin concentration and hematocrit values for
anemia (72,79,80)
=======================================================================
                              Hemoglobin concentration    Hematocrit
                                      (g/dL)                 (%)
-----------------------------------------------------------------------
Altitude (feet)
  3,000-3,999                          0.2                   0.5
  4,000-4,999                          0.3                   1.0
  5,000-5,999                          0.5                   1.5
  6,000-6,999                          0.7                   2.0
  7,000-7,999                          1.0                   3.0
  8,000-8,999                          1.3                   4.0
  9,000-9,999                          1.6                   5.0
  10,000-11,000                        2.0                   6.0
Cigarette smoking
  0.5-<1.0 pack per day                0.3                   1.0
  1.0-<2.0 packs per day               0.5                   1.5
  >=2.0 packs per day                  0.7                   2.0
  All smokers                          0.3                   1.0
=======================================================================
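As we read Tables 6 and 7 together, the listed amounts are added to the Table 6 cutoff for persons who smoke or who live at altitude before the comparison is made. A sketch under that assumption follows; the function name is ours, and the "All smokers" row is used here in place of the pack-per-day tiers.

    # Shift the Table 6 hemoglobin cutoff upward per Table 7 for altitude and
    # smoking (our reading of the tables; function name is hypothetical).

    ALTITUDE_HB_ADJ = [            # (lower ft, upper ft, g/dL), from Table 7
        (3000, 3999, 0.2), (4000, 4999, 0.3), (5000, 5999, 0.5),
        (6000, 6999, 0.7), (7000, 7999, 1.0), (8000, 8999, 1.3),
        (9000, 9999, 1.6), (10000, 11000, 2.0),
    ]
    SMOKER_HB_ADJ = 0.3            # "All smokers" row of Table 7

    def adjusted_hb_cutoff(base_cutoff_g_per_dl, altitude_ft, smoker):
        adjustment = SMOKER_HB_ADJ if smoker else 0.0
        for lower, upper, value in ALTITUDE_HB_ADJ:
            if lower <= altitude_ft <= upper:
                adjustment += value
                break
        return base_cutoff_g_per_dl + adjustment

    # A nonpregnant woman (12.0 g/dL cutoff) who smokes and lives at 5,280 feet:
    print(adjusted_hb_cutoff(12.0, 5280, smoker=True))  # 12.0 + 0.5 + 0.3 = 12.8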
TABLE 8. Cutoff values for laboratory tests for iron deficiency
==================================================================================================
Test                           Cutoff value                               Reference
--------------------------------------------------------------------------------------------------
Hemoglobin concentration       See Table 6 for cutoffs for anemia         Looker et al. (45),
                                                                          CDC (72)
Hematocrit                     See Table 6 for cutoffs for anemia         Looker et al. (45),
                                                                          CDC (72)
Mean cell volume               Cutoffs for microcytic anemia at age:      Dallman et al. (28)
                                 1-2 years: <77 fL
                                 3-5 years: <79 fL
                                 6-11 years: <80 fL
                                 12-15 years: <82 fL
                                 >15 years: <85 fL
Red blood cell distribution    Cutoff for iron-deficiency anemia*:        Dallman et al. (28),
  width                          >14.0%                                   Oski (51)
Erythrocyte protoporphyrin     Cutoffs for iron deficiency:               Dallman et al. (28),
  concentration                  Adults: >30 ug/dL of whole blood or      Piomelli (92)
                                 >70 ug/dL of red blood cells
                                 Children aged 1-2 years: >80 ug/dL
                                 of red blood cells
Serum ferritin concentration   Cutoff for iron deficiency in persons      Hallberg et al. (93)
                                 aged >6 months: <=15 ug/L
Transferrin saturation         Cutoff for iron deficiency: <16%           Dallman et al. (23),
                                                                          Pilch and Senti (90)
--------------------------------------------------------------------------------------------------
* The cutoff is instrument specific and may not apply in all laboratories.
==================================================================================================
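Two of the confirmatory cutoffs in Table 8 translate directly into threshold checks. The sketch below encodes them; the function names are hypothetical, and units follow the table (ug/L, percent).

    # Cutoffs from Table 8; function names are hypothetical.

    def ferritin_indicates_deficiency(ferritin_ug_per_l: float) -> bool:
        # Persons aged >6 months: serum ferritin <=15 ug/L.
        return ferritin_ug_per_l <= 15

    def tsat_indicates_deficiency(transferrin_saturation_pct: float) -> bool:
        # Transferrin saturation <16%.
        return transferrin_saturation_pct < 16

    print(ferritin_indicates_deficiency(12))  # True
    print(tsat_indicates_deficiency(20))      # False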
EXHIBIT 1. Causes of iron deficiency
===================================================================
Increased iron requirements          Inadequate iron absorption
-------------------------------------------------------------------
Blood loss                           Diet low in bioavailable iron
  Menstruation                       Impaired absorption
  Gastrointestinal tract               Intestinal malabsorption
    Food sensitivity                   Gastric surgery
    Hookworms                          Hypochlorhydria
  Genitourinary tract
  Respiratory tract
  Blood donation
Growth
Pregnancy
===================================================================