Background
The high sensitivity of HIV screening assays often comes at the cost of false-positive results, which require retesting and confirmatory testing. This study aimed to evaluate the ability of the signal-to-cutoff (S/CO) ratio of an HIV screening assay to predict HIV infection.
Methods
A retrospective study of the HIV screening-positive population was performed at Zhongshan Hospital, Xiamen University. The association between HIV screening assay S/CO ratios and confirmed HIV infection was assessed, and receiver operating characteristic (ROC) curves were plotted to establish the optimal cutoff value for predicting HIV infection.
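To make the cutoff-selection step concrete, the following is a minimal sketch in Python of ROC analysis on screening S/CO ratios. The data are synthetic placeholders, not study data, and the use of Youden's J statistic to choose the optimal cutoff is an assumption about the method; scikit-learn's `roc_curve` handles the computation.

```python
# Minimal sketch of ROC-based S/CO cutoff selection on SYNTHETIC data.
# Assumption: the optimal cutoff maximizes Youden's J (sensitivity + specificity - 1),
# a common criterion, though the abstract does not state the selection rule used.
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(0)

# Hypothetical S/CO ratios: modest values for screen-positive but non-infected
# samples, much higher values for confirmed HIV infection.
scores = np.concatenate([
    rng.lognormal(1.0, 0.8, 1000),   # non-HIV (label = 0)
    rng.lognormal(5.0, 1.0, 100),    # confirmed HIV (label = 1)
])
labels = np.concatenate([np.zeros(1000), np.ones(100)])

fpr, tpr, thresholds = roc_curve(labels, scores)
auc = roc_auc_score(labels, scores)

youden_j = tpr - fpr                 # J = sensitivity - (1 - specificity)
best = np.argmax(youden_j)
print(f"AUC = {auc:.4f}")
print(f"Optimal S/CO cutoff = {thresholds[best]:.2f} "
      f"(sensitivity {tpr[best]:.2%}, specificity {1 - fpr[best]:.2%})")
```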
Results
Of 396,679 patients, 836 were confirmed HIV-infected, an HIV prevalence of 0.21%. The median S/CO ratio in HIV-infected patients was significantly higher than that in non-infected patients (296.9 vs. 2.41, P < 0.001), and the rate of confirmed HIV infection increased with higher screening S/CO ratios. ROC analysis of the HIV screening assay S/CO ratio yielded a sensitivity of 93.78% and a specificity of 93.12% at an optimal cutoff value of 14.09, with an area under the ROC curve of 0.9612. Further analysis of the ROC curve indicated that the S/CO ratio thresholds yielding positive predictive values of 99%, 99.5%, and 100% for HIV infection were 26.25, 285.7, and 354.5, respectively.
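The PPV thresholds above can be reproduced in principle by scanning candidate cutoffs and computing the positive predictive value at each. Below is a minimal sketch under the same synthetic-data assumption as before; the thresholds it prints reflect the fabricated data, not the study's values (26.25, 285.7, 354.5).

```python
# Sketch: find the lowest S/CO threshold at which the positive predictive value
# (PPV = confirmed infections / all samples at or above the threshold) reaches
# each target. SYNTHETIC data; real thresholds come from confirmatory testing.
import numpy as np

rng = np.random.default_rng(0)
scores = np.concatenate([
    rng.lognormal(1.0, 0.8, 1000),   # hypothetical non-infected S/CO ratios
    rng.lognormal(5.0, 1.0, 100),    # hypothetical confirmed-infected S/CO ratios
])
labels = np.concatenate([np.zeros(1000), np.ones(100)])  # 1 = HIV infection

def threshold_for_ppv(scores, labels, target_ppv):
    """Lowest threshold t such that the rule 'S/CO >= t' achieves target_ppv."""
    for t in np.sort(np.unique(scores)):
        flagged = scores >= t
        if flagged.sum() == 0:
            break
        ppv = labels[flagged].sum() / flagged.sum()
        if ppv >= target_ppv:
            return t, ppv
    return None, None

for target in (0.99, 0.995, 1.00):
    t, ppv = threshold_for_ppv(scores, labels, target)
    if t is not None:
        print(f"PPV >= {target:.1%} first reached at S/CO >= {t:.2f} (PPV {ppv:.2%})")
```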
Conclusion
Using the HIV screening assay S/CO ratio to predict HIV infection could substantially reduce the need for retesting and confirmatory testing. Incorporating the S/CO ratio into HIV testing algorithms may therefore have significant implications for medical and public health practice.