Empowering Ovarian Cancer Subtype Classification with Parallel Swin Transformers and WSI Imaging
Ovarian cancer accounts for a notable proportion of cancer-related mortality among women, and the diagnostic classification of its subtypes remains difficult, with limited concordance among pathologists. Vision Transformer (ViT) models have become a predominant architecture across computer vision applications, including image classification and cancer detection. Their success stems primarily from their capacity to integrate global contextual information through self-attention during learning. However, ViTs scale poorly to high-resolution images: large images produce many tokens, and the cost of self-attention grows quadratically with the token count, imposing significant computational demands. The Swin Transformer (Swin-T) addresses this challenge through two main ideas: hierarchical feature mapping and window-based self-attention. This work presents a parallel implementation of Swin Transformers (Swin-Ts) that leverages their powerful feature extraction capabilities to classify five ovarian cancer subtypes from Whole Slide Imaging (WSI), yielding average precision, recall, and F1-score of 0.958, 0.964, and 0.96, respectively. The findings show that the proposed parallel Swin-Ts reduce misclassification errors and improve the robustness of medical image analysis. The suggested technique is promising for accurate and efficient ovarian carcinoma subtype categorization, with possible applicability to other cancers. Future research will integrate additional data sources and validate the technique in further clinical contexts.
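The complexity argument above can be made concrete with a minimal sketch (assuming NumPy; the function name `window_partition` and the 56×56×96 feature-map and 7×7 window sizes are illustrative choices mirroring typical Swin-T stage-1 settings, not values taken from this paper). Global self-attention compares every token pair, so its cost is quadratic in the number of tokens, while windowed attention only compares pairs inside each fixed-size window, making the cost linear in image area:

```python
import numpy as np

def window_partition(x, window_size):
    """Split an (H, W, C) feature map into non-overlapping
    (window_size, window_size, C) windows."""
    H, W, C = x.shape
    x = x.reshape(H // window_size, window_size,
                  W // window_size, window_size, C)
    return x.transpose(0, 2, 1, 3, 4).reshape(-1, window_size, window_size, C)

def global_attention_cost(H, W):
    # Every token attends to every token: (H*W)^2 pairs.
    n = H * W
    return n * n

def windowed_attention_cost(H, W, window_size):
    # Attention is confined to each window: cost is linear in H*W.
    tokens_per_window = window_size * window_size
    num_windows = (H // window_size) * (W // window_size)
    return num_windows * tokens_per_window ** 2

x = np.zeros((56, 56, 96))            # illustrative stage-1 feature map
windows = window_partition(x, 7)      # 7x7 windows
print(windows.shape)                  # (64, 7, 7, 96)
print(global_attention_cost(56, 56))      # 9834496 token pairs
print(windowed_attention_cost(56, 56, 7)) # 153664 token pairs
```

For this feature map, windowed attention evaluates roughly 64× fewer token pairs than global attention, which is what makes Swin-T tractable on high-resolution inputs such as WSI tiles.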