Scientific Reports
A multi-scale hybrid ResNet–transformer with distance-aware learning for interpretable BI-RADS mammographic classification
  • Article
  • Open access
  • Published: 20 February 2026


  • Maninder Singh1,
  • Amrita Mohan2,
  • Umang Tripathi3,
  • Shashwat Pathak4,
  • Rajeev Gupta5 &
  • Basant Kumar5

Scientific Reports (2026)

We are providing an unedited version of this manuscript to give early access to its findings. Before final publication, the manuscript will undergo further editing. Please note there may be errors present which affect the content, and all legal disclaimers apply.

Subjects

  • Computational biology and bioinformatics
  • Engineering
  • Health care
  • Mathematics and computing
  • Medical research

Abstract

Timely and accurate classification of breast lesions on mammograms enhances clinical decision-making and reduces unnecessary biopsies. This study proposes a Multi-Scale Hybrid ResNet–Transformer with Distance-Aware Learning for interpretable BI-RADS mammographic classification. The model integrates the spatial representation strength of ResNet-50 with the contextual modeling capability of lightweight multi-head self-attention layers in a unified hybrid architecture. A Distance-Aware Learning loss accounts for the ordinal nature of BI-RADS categories by penalizing predictions according to their distance from the true class. Preprocessing applies CLAHE to enhance mammographic contrast, followed by balanced oversampling and controlled augmentations to address class imbalance. The trained model generalizes well across validation and test sets, achieving a test accuracy of 0.921 and a mean AUC of 0.987. Per-class discriminability is strong, with F1-scores above 0.92 for the clinically critical BI-RADS 4–5 categories. Feature-space visualizations and Grad-CAM based visual explanations confirm that the model focuses on clinically relevant lesion regions, providing interpretable outputs aligned with radiologists' reasoning. The proposed framework offers a clinically meaningful and efficient approach to automated BI-RADS classification and may support future computer-aided diagnostic workflows.
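The exact formulation of the Distance-Aware Learning loss is not given in this excerpt; a minimal sketch of the underlying idea (cross-entropy plus a penalty on probability mass placed far from the true ordinal class) might look like the following. The `alpha` weight and the additive form are assumptions for illustration, not the authors' published loss.

```python
import numpy as np

def distance_aware_loss(probs, true_idx, alpha=1.0):
    """Cross-entropy weighted by ordinal distance to the true BI-RADS class.

    probs:    predicted class probabilities, shape (n_classes,)
    true_idx: index of the true class
    alpha:    strength of the distance penalty (hypothetical parameter)
    """
    n = len(probs)
    # Ordinal distance of every class from the true class (0 for the true class).
    dist = np.abs(np.arange(n) - true_idx)
    # Standard cross-entropy term for the true class ...
    ce = -np.log(probs[true_idx] + 1e-12)
    # ... plus a penalty that grows with how far misplaced mass sits from the truth.
    penalty = alpha * np.sum(dist * probs)
    return ce + penalty

# A prediction concentrated near the true class (index 2) is penalized less
# than one that places most of its mass on a distant class.
near = np.array([0.05, 0.10, 0.60, 0.20, 0.05])
far  = np.array([0.60, 0.10, 0.05, 0.05, 0.20])
```

Under this sketch, `distance_aware_loss(near, 2)` is smaller than `distance_aware_loss(far, 2)`, reflecting the ordinal structure of BI-RADS: confusing category 3 with 4 is less costly than confusing 1 with 5.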

Data availability

The data used in this study are publicly available from the INbreast Dataset - BI-RADS Classification, accessible at https://www.kaggle.com/datasets/orvile/inbreast-dataset-bi-rads-classification/data.
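As an illustration of the CLAHE preprocessing step applied to these mammograms, the sketch below implements a simplified, global (non-tiled) contrast-limited equalization in NumPy. Real CLAHE operates on local tiles with interpolation (e.g. OpenCV's `cv2.createCLAHE`); the function name and `clip_limit` value here are illustrative assumptions, not the authors' pipeline.

```python
import numpy as np

def clip_limited_equalize(img, clip_limit=0.01, n_bins=256):
    """Simplified global variant of contrast-limited histogram equalization.

    img: 2-D uint8 image. Returns a uint8 image of the same shape with
    histogram peaks clipped and intensities remapped through the CDF.
    """
    hist, _ = np.histogram(img.ravel(), bins=n_bins, range=(0, 255))
    hist = hist.astype(float) / hist.sum()
    # Clip histogram peaks and redistribute the excess mass uniformly,
    # which bounds how much any single intensity can amplify contrast.
    excess = np.clip(hist - clip_limit, 0, None).sum()
    hist = np.minimum(hist, clip_limit) + excess / n_bins
    # Map intensities through the cumulative distribution function.
    cdf = np.cumsum(hist)
    lut = np.round(255 * cdf).astype(np.uint8)
    return lut[img.astype(np.uint8)]

# A low-contrast patch (values 90-110) is stretched to a wider range.
img = np.clip(np.linspace(90, 110, 64), 0, 255).reshape(8, 8).astype(np.uint8)
out = clip_limited_equalize(img)
```

The clip-and-redistribute step is what distinguishes contrast-limited equalization from plain histogram equalization: it prevents near-uniform background regions, which dominate mammograms, from being over-amplified into noise.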


Acknowledgements

The authors gratefully acknowledge the availability of the publicly accessible dataset for the BI-RADS Classification used in this study.

Funding

Open access funding provided by Symbiosis International (Deemed University).

Author information

Authors and Affiliations

  1. Symbiosis Centre for Medical Image Analysis, Symbiosis International (Deemed University), Pune, 412115, India

    Maninder Singh

  2. Department of Computer Science Engineering, National Institute of Technology, Patna, 800005, India

    Amrita Mohan

  3. EEI Department, Autonomy Technologies (M.Sc.), Friedrich-Alexander-Universität Erlangen-Nürnberg (FAU), 91054 Erlangen, Germany

    Umang Tripathi

  4. Atal Incubation Centre, AIC GNITS Foundation, Hyderabad, 500104, India

    Shashwat Pathak

  5. Department of Electronics and Communication Engineering, Motilal Nehru National Institute of Technology Allahabad, Prayagraj, 211004, India

    Rajeev Gupta & Basant Kumar

Contributions

MS, AM and UT contributed to the conception of the study, methodology, data analysis, and the writing and revision of the manuscript. SP, RG and BK supervised the work and revised the manuscript. All authors reviewed and approved the final manuscript.

Corresponding author

Correspondence to Maninder Singh.

Ethics declarations

Competing interests

The authors declare no competing interests.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License, which permits any non-commercial use, sharing, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if you modified the licensed material. You do not have permission under this licence to share adapted material derived from this article or parts of it. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by-nc-nd/4.0/.

Reprints and permissions

About this article


Cite this article

Singh, M., Mohan, A., Tripathi, U. et al. A multi-scale hybrid ResNet–transformer with distance-aware learning for interpretable BI-RADS mammographic classification. Sci Rep (2026). https://doi.org/10.1038/s41598-026-40906-8


  • Received: 12 October 2025

  • Accepted: 17 February 2026

  • Published: 20 February 2026

  • DOI: https://doi.org/10.1038/s41598-026-40906-8


Keywords

  • Breast cancer detection
  • BI-RADS classification
  • Hybrid ResNet–transformer
  • Distance-aware learning
  • Explainable AI
  • Multi-scale feature fusion
  • Mammography