Background. Pregnancy-associated malaria (PAM) remains a significant health concern in sub-Saharan Africa. Cross-sectional studies report that iron may be associated with increased malaria morbidity, raising fears that current iron supplementation policies could cause harm, particularly given increasing resistance to intermittent preventive treatment in pregnancy (IPTp). It is therefore necessary to assess the relationship between maternal iron levels and malaria risk throughout pregnancy.
Methods. To investigate the association of maternal iron levels with malaria risk in the context of an IPTp clinical trial, 1005 human immunodeficiency virus-negative, pregnant Beninese women were monitored throughout their pregnancies between January 2010 and May 2011. Multilevel models with a random intercept at the individual level and a random slope for gestational age were used to analyze the factors associated with increased risk of a positive blood smear and increased Plasmodium falciparum density.
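The multilevel specification described above (repeated measures nested within women, with a random intercept per woman and a random slope for gestational age) can be sketched for the continuous density outcome with `statsmodels`. This is an illustrative sketch on simulated data, not the authors' analysis: the column names (`woman_id`, `gest_age`, `log_ferritin`, `log_density`) and the simulated effect sizes are assumptions chosen only to mirror the structure reported in the abstract.

```python
# Hedged sketch: a linear mixed model for log parasite density, analogous to
# the multilevel model in Methods, with a random intercept per woman and a
# random slope for gestational age. All variable names and simulated values
# are illustrative assumptions, not the study's actual data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_women, n_visits = 200, 4
woman_id = np.repeat(np.arange(n_women), n_visits)
gest_age = np.tile(np.linspace(10, 38, n_visits), n_women)  # weeks
log_ferritin = rng.normal(1.5, 0.3, n_women)[woman_id]      # per-woman exposure

u0 = rng.normal(0, 0.5, n_women)[woman_id]   # random intercepts
u1 = rng.normal(0, 0.02, n_women)[woman_id]  # random gestational-age slopes
log_density = (0.5 + 0.22 * log_ferritin + (-0.01 + u1) * gest_age + u0
               + rng.normal(0, 0.3, woman_id.size))

df = pd.DataFrame(dict(woman_id=woman_id, gest_age=gest_age,
                       log_ferritin=log_ferritin, log_density=log_density))

# groups= gives the random intercept per woman; re_formula adds the
# random slope for gestational age, mirroring the Methods description.
model = smf.mixedlm("log_density ~ log_ferritin + gest_age", df,
                    groups="woman_id", re_formula="~gest_age")
result = model.fit()
print(result.params["log_ferritin"])
```

In a real analysis the fixed-effects part would additionally adjust for the covariates listed in Results (pregnancy parameters, comorbidities, environmental and socioeconomic indicators, and IPTp regimen), and the binary smear outcome would use a logistic mixed model instead.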
Results. During follow-up, 29% of the women had at least 1 episode of malaria. On average, women had 0.52 positive smears (95% confidence interval [CI], 0.44–0.60). High iron levels (measured as the log10 of ferritin corrected for inflammation) were significantly associated with increased risk of a positive blood smear (adjusted odds ratio = 1.75; 95% CI, 1.46–2.11; P < .001) and with high P falciparum density (beta estimate = 0.22; 95% CI, 0.18–0.27; P < .001) during the follow-up period, after adjustment for pregnancy parameters, comorbidities, environmental and socioeconomic indicators, and IPTp regimen. Furthermore, iron-deficient women were significantly less likely to have a positive blood smear and high P falciparum density (P < .001 in both cases).
Conclusions. Iron levels were positively associated with PAM risk during pregnancy in the context of IPTp. Further interventional studies are needed to determine the benefits and risks of differently dosed iron and folate supplements in malaria-endemic regions.