More than 80% of women develop iron deficiency by third trimester: Study
A new study highlights the high prevalence of iron deficiency among pregnant women. Researchers call for universal screening.

During pregnancy, a woman’s iron requirements increase approximately tenfold to support fetal growth and meet her own rising needs. Meeting these demands depends on her iron stores before pregnancy and her body’s ability to absorb more iron as the pregnancy progresses.
However, these physiological changes are often not sufficient, especially for the estimated 50% of women who begin pregnancy with low iron stores.
Surprisingly, iron deficiency is not only a problem in low-resource settings. Recent studies show that 33–42% of pregnant women in high-resource countries also suffer from iron deficiency.
Effects of iron deficiency
Iron deficiency can lead to anemia, where the body struggles to produce enough hemoglobin, reducing the ability of red blood cells to carry oxygen throughout the body.

This condition is associated with a higher risk of adverse outcomes for both mother and baby, including:
- Maternal risks: postpartum hemorrhage, postpartum depression
- Infant risks: premature birth, low birth weight, long-term neurodevelopmental challenges
Screening for asymptomatic iron deficiency
Despite these risks, screening for iron deficiency during pregnancy is not a universal practice.
For example, the US Preventive Services Task Force found insufficient evidence to support routine screening for iron deficiency anemia in pregnant women.
On the other hand, the International Federation of Gynecology and Obstetrics recommends that all pregnant women, regardless of their anemia status, should be screened for iron deficiency.

In clinical settings, hemoglobin levels are often the only measure used to assess iron status. However, this is problematic: hemoglobin detects anemia, not iron deficiency, so iron deficiency may go undetected until it has already progressed to anemia.
About the study
A new study titled ‘Longitudinal assessment of iron status during pregnancy’ provides important information about how iron levels change during pregnancy. Conducted in Ireland, it involved 641 women who were monitored at 15, 20 and 33 weeks of pregnancy to assess their iron status.
The study found that while no women suffered from anemia in the first trimester, more than 80% of women became iron deficient by the third trimester.
The prevalence of iron deficiency among these women was higher than in some low-resource settings.
The study showed that nearly three-quarters of women took an iron-containing supplement, which was associated with a reduced risk of iron deficiency, particularly in the third trimester.

This highlights the potential benefits of early iron supplementation during pregnancy.
Researchers proposed that the level of ferritin, a protein that stores iron, measured at 15 weeks of pregnancy could predict iron deficiency later in pregnancy.
They recommended maintaining ferritin above a threshold of 60 micrograms per liter at 15 weeks to reduce the risk of iron deficiency by 33 weeks.
Editorial criticizes inconsistent approach to iron deficiency
In an editorial accompanying the study, authors Michael Auerbach and Helen Landy criticized the medical community’s inconsistent approach to iron deficiency screening and treatment for pregnant women.
They called this “misogyny” and urged institutions such as the American College of Obstetricians and Gynecologists to revise their guidelines to recommend screening for iron deficiency in all pregnant women regardless of anemia status.
The study authors stressed the need for more large-scale research to establish consensus on best practices for screening and treating iron deficiency in pregnancy.
Their findings underscore the importance of identifying and treating iron deficiency early to improve maternal and child health outcomes.