Abstract

Background and Objectives
The safety of the blood supply in a number of countries is achieved by interventions that include behaviour-based time-limited or indefinite deferrals and screening of donated units for transfusion-transmitted infections. The relatively high sensitivity of nucleic acid testing (NAT) used in blood donor screening has raised the question of whether such time-based deferrals can be eliminated in favour of individual risk assessment.

Materials and Methods
Data on the annual number of incident human immunodeficiency virus (HIV) infections associated with various behaviours, together with the performance characteristics of NAT applied to donor screening, were used to model the number of potentially infected units that might escape detection in a worst-case scenario: individual risk assessment is implemented but is ineffective as a screening tool, and donors do not otherwise self-select for lower risk.

Results
In the absence of effective individual risk-based screening or donor self-selection, the model predicts that in the United States an additional 39 (95% CI 35–43) HIV-infected units would escape detection by NAT, potentially exposing approximately 68 (95% CI 61–75) recipients to HIV infection through the administration of prepared blood components.

Conclusion
Despite some inherent uncertainty, this worst-case scenario of completely ineffective individual risk assessment, absence of donor self-selection and increased reliance on NAT for blood screening is estimated to be associated with an approximately fourfold increase in the risk of HIV exposure through transfusion in the United States.