Author
Listed:
- Zhuoran Duan
(School of Integrated Circuits, Anhui University, Hefei 230601, China
Anhui Engineering Laboratory of Agro-Ecological Big Data, Hefei 230601, China)
- Chao Xu
(School of Integrated Circuits, Anhui University, Hefei 230601, China
Anhui Engineering Laboratory of Agro-Ecological Big Data, Hefei 230601, China)
- Zhengping Li
(School of Integrated Circuits, Anhui University, Hefei 230601, China
Anhui Engineering Laboratory of Agro-Ecological Big Data, Hefei 230601, China)
- Bo Feng
(School of Integrated Circuits, Anhui University, Hefei 230601, China
Anhui Engineering Laboratory of Agro-Ecological Big Data, Hefei 230601, China)
- Chao Nie
(School of Integrated Circuits, Anhui University, Hefei 230601, China
Anhui Engineering Laboratory of Agro-Ecological Big Data, Hefei 230601, China)
Abstract
Cervical cancer, the fourth most common cancer in women, poses a significant threat to women's health. Colposcopy, the most cost-effective step in cervical cancer screening, can effectively detect precancerous lesions before they progress to cancer. However, lesion areas in colposcopic images vary in size, and lesion characteristics are complex and difficult to discern, so diagnosis relies heavily on the expertise of medical professionals. To address these issues, this paper constructs a vaginal colposcopy image dataset, ACIN-3, and proposes a Fusion Multi-scale Attention Network (FMA-Net) for detecting cervical precancerous lesions. First, we propose a heterogeneous receptive field convolution module to build the backbone network; it combines convolutions of different structures to extract multi-scale features over multiple receptive fields, capturing differently sized cervical regions at different levels. Second, we propose an attention fusion module to build a branch network that integrates the multi-scale features and establishes connections along both the spatial and channel dimensions. Finally, we design a dual-threshold loss function, introducing positive and negative thresholds to reweight samples and mitigate the class imbalance in the dataset. Extensive experiments on ACIN-3 demonstrate the superior performance of our approach compared with classical and recent state-of-the-art methods: it achieves accuracies of 92.2% in grading and 94.7% in detection, with average AUCs of 0.9862 and 0.9878. Heatmaps further show that our approach accurately focuses on lesion locations.
Suggested Citation
Zhuoran Duan & Chao Xu & Zhengping Li & Bo Feng & Chao Nie, 2024.
"FMA-Net: Fusion of Multi-Scale Attention for Grading Cervical Precancerous Lesions,"
Mathematics, MDPI, vol. 12(7), pages 1-17, March.
Handle:
RePEc:gam:jmathe:v:12:y:2024:i:7:p:958-:d:1362661