Abstract
Timely and accurate classification of breast lesions on mammograms is essential, as it enhances clinical decision-making and reduces unnecessary biopsies. This study proposes a Multi-Scale Hybrid ResNet–Transformer with Distance-Aware Learning for interpretable BI-RADS mammographic classification. The model integrates the spatial representation strength of ResNet-50 with the contextual modeling capability of lightweight multi-head self-attention layers in a unified hybrid architecture. A Distance-Aware Learning loss accounts for the ordinal nature of BI-RADS categories by penalizing predictions in proportion to their distance from the true class. Preprocessing applies CLAHE to enhance mammographic contrast, followed by balanced oversampling and controlled augmentations to address class imbalance. The trained model generalizes well across the validation and test sets, achieving a test accuracy of 0.921 and a mean test AUC of 0.987. Per-class discriminability is strong, with F1-scores above 0.92 for the clinically critical BI-RADS 4–5 categories. Feature-space visualizations and Grad-CAM based visual explanations confirm that the model attends to clinically relevant lesion regions, yielding interpretable outputs aligned with radiologists' reasoning. The proposed framework offers a clinically meaningful and efficient approach to automated BI-RADS classification and may support future computer-aided diagnostic workflows.
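To illustrate the idea of a distance-aware loss for ordinal BI-RADS categories, the sketch below uses one common formulation: standard cross-entropy plus an expected-ordinal-distance penalty. The exact loss used in the paper is not reproduced here; the function name `distance_aware_loss`, the weighting factor `alpha`, and the NumPy implementation are illustrative assumptions, not the authors' code.

```python
import numpy as np

def softmax(logits):
    """Numerically stable softmax over the last axis."""
    e = np.exp(logits - logits.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def distance_aware_loss(logits, targets, alpha=1.0):
    """Illustrative distance-aware loss for ordinal classes.

    loss = cross-entropy + alpha * E_k[ p_k * |k - y| ]

    The second term is the expected ordinal distance between the
    predicted distribution and the true class, so a confident mistake
    three BI-RADS categories away costs more than one category away.
    (Hypothetical formulation; the paper's exact loss may differ.)
    """
    p = softmax(logits)                                  # (B, C)
    n_classes = p.shape[-1]
    batch_idx = np.arange(len(targets))
    ce = -np.log(p[batch_idx, targets] + 1e-12)          # standard CE
    dist = np.abs(np.arange(n_classes)[None, :]          # |k - y|
                  - targets[:, None])
    penalty = (p * dist).sum(axis=-1)                    # expected distance
    return float((ce + alpha * penalty).mean())
```

Under plain cross-entropy, a confident prediction one class away from the truth and one three classes away incur identical losses; the distance term breaks that tie, which is the property the abstract attributes to Distance-Aware Learning.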
Data availability
The data used in this study are publicly available from the INbreast Dataset - BI-RADS Classification, accessible at https://www.kaggle.com/datasets/orvile/inbreast-dataset-bi-rads-classification/data.
Acknowledgements
The authors gratefully acknowledge the availability of the publicly accessible BI-RADS classification dataset used in this study.
Funding
Open access funding provided by Symbiosis International (Deemed University).
Author information
Contributions
MS, AM and UT contributed to the conception of the study, methodology, data analysis, writing of the original draft, and revision of the manuscript. SP, RG, and BK contributed to supervision and revision of the manuscript. All authors reviewed and approved the final manuscript.
Ethics declarations
Competing interests
The authors declare no competing interests.
Additional information
Publisher’s note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License, which permits any non-commercial use, sharing, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if you modified the licensed material. You do not have permission under this licence to share adapted material derived from this article or parts of it. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by-nc-nd/4.0/.
About this article
Cite this article
Singh, M., Mohan, A., Tripathi, U. et al. A multi-scale hybrid ResNet–transformer with distance-aware learning for interpretable BI-RADS mammographic classification. Sci Rep (2026). https://doi.org/10.1038/s41598-026-40906-8