Serum Ferritin Is Not Sensitive or Specific for the Diagnosis of Iron Deficiency in Patients with Normocytic Anemia (Conference Paper)


abstract

  • Introduction: Following the seminal study by Guyatt et al., serum ferritin has been widely accepted as the most accurate surrogate marker for iron deficiency, particularly when ferritin levels are < 45 μg/L. However, as an acute-phase reactant, ferritin rises in a number of conditions, including obesity, advanced age, liver disorders, and inflammation. Ferritin elevated by these concomitant clinical conditions may mask underlying iron deficiency, rendering serum ferritin an unreliable marker of iron status. The aim of this study was therefore to evaluate the sensitivity and specificity of ferritin for the diagnosis of iron deficiency in patients presenting with normocytic anemia, using response to iron replacement as the gold standard for the diagnosis of iron deficiency.
  • Methods: This study is a retrospective case review of patients referred to an academic hematology clinic from 2003 to 2015 for further evaluation of chronic normocytic anemia without abnormalities in other cell lines. Following an initial workup to exclude 1) mixed microcytic-macrocytic anemia, 2) reticulocytosis suggesting acute blood loss or hemolysis, and 3) a suboptimally low erythropoietin level, 59 patients received a therapeutic trial of oral ferrous gluconate. Intravenous iron sucrose was provided if patients could not tolerate or were refractory to oral iron therapy. All 59 patients (median age: 71 years, range: 24-93, male:female ratio 23:36) underwent a complete review of records before and after iron therapy for changes in hematological parameters and iron indices. An increase in Hb of ≥ 10.0 g/L from baseline was defined as a response to iron therapy, according to WHO criteria.
  • Results: The mean pre-treatment ferritin level of the cohort was 110 μg/L (median: 61 μg/L), higher than the generally accepted cut-off for iron deficiency. Following iron replacement therapy, the mean ferritin concentration rose to 257 μg/L, confirming adequate iron repletion. Overall, 33 patients (56%) responded to iron therapy with an increase in Hb of ≥ 10.0 g/L. Notably, 19 (58%) of these 33 responders had a pre-treatment ferritin value > 45 μg/L. Receiver operating characteristic (ROC) analysis of pre-treatment ferritin levels against response to iron therapy yielded an area under the curve (AUC) of only 0.492, indicating that ferritin performs poorly in predicting the response to iron therapy. Serum ferritin is therefore an inadequate predictor of response to iron therapy in patients presenting with normocytic anemia.
  • Conclusion: Despite the prevailing notion that low ferritin levels are diagnostic of iron deficiency, this retrospective case study demonstrates the shortcomings of using ferritin as the sole determinant of iron status. Consequently, patients with normocytic anemia and a normal or high ferritin should not be excluded from a therapeutic trial of iron therapy.
  • Disclosures: No relevant conflicts of interest to declare.
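The reported AUC of 0.492 can be understood through the rank-based (Mann-Whitney) definition of ROC AUC: the probability that a randomly chosen responder ranks higher than a randomly chosen non-responder under the predictor. An AUC near 0.5 means the predictor is no better than chance. A minimal sketch of this computation, using entirely hypothetical ferritin values rather than the study's patient data:

```python
# Rank-based ROC AUC (equivalent to the Mann-Whitney U statistic),
# the quantity the abstract uses to assess how well pre-treatment
# ferritin predicts response to iron therapy.

def roc_auc(scores_pos, scores_neg):
    """AUC = P(score of a random positive > score of a random negative),
    counting ties as 1/2."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical pre-treatment ferritin values (μg/L) -- illustrative only,
# not the study's data.
responders     = [20, 38, 55, 90, 160]   # Hb rose >= 10 g/L on iron
non_responders = [30, 47, 62, 110, 150]

# Since *low* ferritin is supposed to predict response, score each
# patient by negated ferritin so responders should rank higher.
auc = roc_auc([-f for f in responders], [-f for f in non_responders])
print(round(auc, 3))  # -> 0.56: barely better than chance (0.5)
```

With overlapping ferritin distributions like these, the AUC hovers near 0.5, which mirrors the study's finding (AUC 0.492) that ferritin alone does not discriminate responders from non-responders.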

authors

  • Ho, Justin CL
  • Stevic, Ivan
  • Chan, Anthony
  • Lau, Keith KH
  • Chan, Howard HW

publication date

  • December 3, 2015

published in