npj Digital Medicine
  • Article
  • Open access
  • Published: 03 April 2026

User-preference alignment with uncertainty-aware interactive rectification for liver organ and tumor segmentation and analysis from CT images

  • Guangyuan Zhao1,
  • Yang Wang2,
  • Chen Gong3,
  • Zipei Wang1,
  • Guobin Huang1,
  • Xuechun Zhao1,
  • Tengfei Chao3 &
  • Bo Yang1

npj Digital Medicine (2026)

We are providing an unedited version of this manuscript to give early access to its findings. Before final publication, the manuscript will undergo further editing. Please note there may be errors present which affect the content, and all legal disclaimers apply.

Subjects

  • Cancer
  • Computational biology and bioinformatics
  • Health care
  • Mathematics and computing
  • Medical research

Abstract

Accurate liver and tumor segmentation from CT images is essential for cancer diagnosis, treatment planning, and response assessment. However, manual segmentation is labor-intensive and variable, while standard automated models lack the flexibility to adapt to diverse clinical needs or inherent image uncertainties. To bridge this gap, we introduce User-Preference Alignment with Uncertainty-Aware Interactive Rectification (UAIR), a novel framework designed for efficient and adaptive segmentation. Instead of requiring laborious pixel-level corrections, UAIR presents the clinician with a small, curated set of diverse segmentation candidates generated by quantifying model uncertainty. The user simply selects the most suitable option, allowing the framework to iteratively refine its results and align with specific clinical preferences. This selection-based approach drastically reduces the cost of human interaction. We validated UAIR on a large-scale, multi-center CT dataset, demonstrating superior accuracy (DSC 0.776) over existing manual positional prompting (DSC 0.685) with less prompting effort. UAIR provides a clinically viable solution that integrates human guidance seamlessly, enabling rapid and robust segmentation for downstream quantitative analysis.
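The selection-based loop described in the abstract — generate diverse candidate masks from model uncertainty, let the user pick one, and refine — can be illustrated with a minimal sketch. This is not the authors' implementation: the use of stochastic probability samples (e.g. MC-dropout passes), the threshold levels, and the Dice-based stand-in for the clinician's choice are all illustrative assumptions.

```python
import numpy as np

def candidate_masks(prob_samples, num_candidates=3):
    """From K stochastic probability maps (e.g. MC-dropout forward passes),
    build a small set of diverse binary mask candidates plus a pixelwise
    uncertainty map (binary predictive entropy of the ensemble mean)."""
    probs = np.asarray(prob_samples)          # shape (K, H, W)
    mean = probs.mean(axis=0)                 # ensemble mean probability
    eps = 1e-8
    entropy = -(mean * np.log(mean + eps)
                + (1 - mean) * np.log(1 - mean + eps))
    # Span the uncertain region by thresholding the mean at several levels;
    # lower thresholds yield more inclusive masks, higher ones stricter masks.
    thresholds = np.linspace(0.35, 0.65, num_candidates)
    cands = [(mean >= t).astype(np.uint8) for t in thresholds]
    return cands, entropy

def user_select(cands, reference):
    """Stand-in for the clinician's choice: pick the candidate closest to a
    reference mask by Dice score (a real system would record a click)."""
    def dice(a, b):
        inter = (a & b).sum()
        return 2 * inter / (a.sum() + b.sum() + 1e-8)
    scores = [dice(c, reference) for c in cands]
    return int(np.argmax(scores)), max(scores)
```

In an interactive setting the selected index would be fed back as a preference signal for the next refinement round, so the user never draws pixel-level corrections.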


Data availability

All imaging datasets analyzed in this study are publicly accessible. The AbdomenAtlas dataset is available at https://github.com/MrGiovanni/AbdomenAtlas. Processed or derived data supporting the findings of this study are available from the corresponding author upon reasonable request.


Acknowledgements

This work was supported by the Natural Science Foundation of Hubei Province (Grant No. 2025AFD774). The authors would also like to express their sincere gratitude to Yuchuan Jiang for his valuable contributions to the field, from which this study has greatly benefited. His previously published work provided important insights and methodological guidance for the present research.

Author information

Author notes
  1. These authors contributed equally: Guangyuan Zhao, Yang Wang, Chen Gong.

Authors and Affiliations

  1. Institute of Organ Transplantation, Tongji Hospital, Tongji Medical College, Huazhong University of Science and Technology; Key Laboratory of Organ Transplantation, Ministry of Education; NHC Key Laboratory of Organ Transplantation; Key Laboratory of Organ Transplantation, Chinese Academy of Medical Sciences; Organ Transplantation Clinical Medical Research Center of Hubei Province, Wuhan, China

    Guangyuan Zhao, Zipei Wang, Guobin Huang, Xuechun Zhao & Bo Yang

  2. Department of Hepatobiliary Surgery, Peking University People’s Hospital, Beijing, China

    Yang Wang

  3. Department of Oncology, Tongji Hospital, Tongji Medical College, Huazhong University of Science and Technology, Wuhan, China

    Chen Gong & Tengfei Chao


Contributions

G.Z., C.G., and Y.W. contributed equally to this work, having full access to all study data and assuming responsibility for the integrity and accuracy of the analyses (Validation and Formal analysis). G.Z., C.G., and G.H. conceptualized the study, designed the methodology, and participated in securing research funding (Conceptualization, Methodology, and Funding acquisition). Y.W. and X.Z. carried out data acquisition, curation, and investigation (Investigation, Data curation) and provided key resources, instruments, and technical support (Resources and Software). Z.W. drafted the initial manuscript and generated visualizations (Writing—Original Draft and Visualization). T.C. and B.Y. supervised the project, coordinated collaborations, and ensured administrative support (Supervision and Project administration). All authors contributed to reviewing and revising the manuscript critically for important intellectual content (Writing—Review & Editing) and approved the final version for submission.

Corresponding authors

Correspondence to Tengfei Chao or Bo Yang.

Ethics declarations

Competing interests

The authors declare no competing interests.

Additional information

Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License, which permits any non-commercial use, sharing, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if you modified the licensed material. You do not have permission under this licence to share adapted material derived from this article or parts of it. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by-nc-nd/4.0/.


About this article


Cite this article

Zhao, G., Wang, Y., Gong, C. et al. User-preference alignment with uncertainty-aware interactive rectification for liver organ and tumor segmentation and analysis from CT images. npj Digit. Med. (2026). https://doi.org/10.1038/s41746-026-02544-2

Download citation

  • Received: 08 November 2025

  • Accepted: 03 March 2026

  • Published: 03 April 2026

  • DOI: https://doi.org/10.1038/s41746-026-02544-2



Associated content

Collection

Emerging Applications of Machine Learning and AI for Predictive Modeling in Precision Medicine

npj Digital Medicine (npj Digit. Med.)

ISSN 2398-6352 (online)
