Abstract
Digital mental health assessments can effectively link young people to appropriate health services. However, current approaches often lack personalization, failing to recognize the complex and multidimensional needs of young people. A total of 1734 young people aged 12–25 years completed seven standardized measures (49 items) using a digital health assessment tool while receiving mental health care. A multidimensional computerized adaptive test (MCAT) was developed to predict scores on these seven standardized scales, spanning clinical symptoms, suicidality, functioning, and alcohol use. Adaptive tests were simulated under a range of stopping-criteria configurations, and ten-fold cross-validation was performed to determine the accuracy and efficiency of the multidimensional assessment. By administering a personalized subset of items to each individual, the average number of assessment items could be reduced by 69% while maintaining excellent agreement with full-length scores for suicidality (ICC = 0.96), anxiety (ICC = 0.92), and alcohol use (ICC = 0.91), and good agreement for psychological distress (ICC = 0.88), functioning (ICC = 0.86), psychosis (ICC = 0.78), and mania (ICC = 0.75). Estimated average assessment time decreased from 10.5 minutes to under 3.3 minutes (49 items reduced to a mean of 15.3 items per person, with mean absolute-agreement ICC = 0.87). This adaptive digital assessment can screen across key domains to identify mental health needs and complexity in youth mental health, supporting rapid decisions about treatment needs and care pathways.
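To make the pipeline described above concrete, the sketch below illustrates the general approach using the mirt and mirtCAT R packages cited in the references: calibrate a multidimensional graded response model, simulate adaptive administrations against observed response patterns under a standard-error stopping rule, and compare adaptive against full-length trait estimates with an absolute-agreement ICC. This is a minimal sketch, not the authors' implementation: the factor structure, item ordering, stopping thresholds, and all object names are illustrative assumptions, and the ten-fold cross-validation loop is omitted for brevity.

```r
# Minimal sketch, assuming the mirt/mirtCAT packages (cited in the references)
# and the irr package. Model specification, stopping thresholds, and item
# ordering are illustrative assumptions, not the authors' exact configuration.
library(mirt)     # multidimensional item response theory estimation
library(mirtCAT)  # adaptive test simulation
library(irr)      # intraclass correlation coefficients

# `responses`: hypothetical N x 49 data set of item responses, assumed ordered
# K10, PQ-16, ASRM, OASIS, SIDAS, AUDIT-C, WSAS (10+16+5+5+5+3+5 = 49 items).
model_spec <- "
  distress    = 1-10
  psychosis   = 11-26
  mania       = 27-31
  anxiety     = 32-36
  suicidality = 37-41
  alcohol     = 42-44
  functioning = 45-49
  COV = distress*psychosis*mania*anxiety*suicidality*alcohol*functioning
"

# Calibrate a between-item multidimensional graded response model with
# freely estimated factor covariances.
mcat_model <- mirt(responses, model = model_spec, itemtype = "graded")

# Simulate an adaptive administration for each observed response pattern,
# using D-rule item selection and an SEM-based stopping criterion
# (placeholder values).
sim <- mirtCAT(
  mo            = mcat_model,
  method        = "MAP",
  criteria      = "Drule",
  local_pattern = as.matrix(responses),
  design        = list(min_SEM = 0.4, max_items = 30)
)

# Full-length trait estimates for comparison.
theta_full <- fscores(mcat_model, method = "MAP")

# Adaptive trait estimates: one row per person, one column per dimension.
theta_cat <- t(sapply(sim, function(person) person$thetas))

# Two-way absolute-agreement ICC between adaptive and full-length estimates,
# shown here for the distress dimension.
icc(cbind(theta_full[, "distress"], theta_cat[, 1]),
    model = "twoway", type = "agreement", unit = "single")$value
```

A full reproduction along the lines of the abstract would wrap the calibration and simulation in a ten-fold cross-validation, estimating the model on nine folds and simulating adaptive administrations for the held-out fold, so that the reported ICCs reflect out-of-sample agreement.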
Data availability
The data that support the findings of this study are available from Innowell Pty Ltd, but restrictions apply to the availability of these data, which were used under licence for the current study, and so are not publicly available. Data are, however, available from the authors upon reasonable request and with the permission of Innowell Pty Ltd.
Code availability
The underlying code for this study is under formal intellectual property review by the University of Sydney and is therefore not publicly available.
References
Tutun, S. et al. An AI-based decision support system for predicting mental health disorders. Inf. Syst. Front. 25, 1261–1276 (2023).
McGorry, P. D. et al. The Lancet Psychiatry Commission on youth mental health. Lancet. Psychiatry 11, 731–774 (2024).
McMahon, B. & McInerney, D. Right care, right place, first time: how AI is improving national virtual front doors. NEJM AI 2 (2025).
Bucci, S., Schwannauer, M. & Berry, N. The digital revolution and its impact on mental health care. Psychol. Psychother. 92, 277–297 (2019).
Karcher, N. R. et al. Youth mental health screening and linkage to care. Psychiatr. Serv. 74, 727–736 (2023).
Capon, W. et al. A multidimensional approach for differentiating the clinical needs of young people presenting for primary mental health care. Compr. Psychiatry 126, 152404 (2023).
Capon, W. et al. Characterising variability in youth mental health service populations: a detailed and scalable approach using digital technology. Australas. Psychiatry 31, 295–301 (2023).
Capon, W. et al. Matching needs to services: Development of a service needs index for determining care pathways in youth mental health. Aust. N. Z. J. Psychiatry 59, 776–785 (2025).
Hickie, I. B. et al. Right care, first time: a highly personalised and measurement-based care model to manage youth mental health. Med. J. Aust. 211, S3–S46 (2019).
Mughal, S. et al. The needs of most or the most in need? How integrated and transdiagnostic youth services assess and operationalize mental health needs. Psychiatr. Serv. 76, 997–1017 (2025).
Capon, W. et al. What gets measured gets managed: an analysis of how international integrated and transdiagnostic youth mental health service models identify treatment needs. Under review (2025).
Martel, R. M. et al. YouthCHAT as a primary care E-screening tool for mental health issues among Te Tai Tokerau Youth: protocol for a co-design study. JMIR Res. Protoc. 8, e12108 (2019).
Stewart, S. L. et al. The interRAI Child and Youth Suite of Mental Health Assessment Instruments: an integrated approach to mental health service delivery. Front. Psychiatry 13, 710569 (2022).
Gorban, C. et al. Building mutually beneficial collaborations between digital navigators, mental health professionals, and clients: naturalistic observational case study. JMIR Ment. Health 11, e58068 (2024).
Thabrew, H. et al. Comparison of YouthCHAT, an electronic composite psychosocial screener, with a clinician interview assessment for young people: randomized controlled trial. J. Med. Internet Res. 21, e13911 (2019).
Chong, M. K. et al. Personalized and collaborative use of digital measurement-based care tools enhances engagement among young adults: a mixed-methods study. BMC Health Serv. Res. 25, 752 (2025).
Forbes, M. K. et al. Elemental psychopathology: distilling constituent symptoms and patterns of repetition in the diagnostic criteria of the DSM-5. Psychol. Med. 54, 886–894 (2024).
Spiller, T. R. et al. Unveiling the structure in mental disorder presentations. JAMA Psychiatry 81, 1101–1107 (2024).
Newson, J. J., Pastukh, V. & Thiagarajan, T. C. Poor separation of clinical symptom profiles by DSM-5 disorder criteria. Front. Psychiatry 12, 775762 (2021).
Martin-Key, N. A. et al. The current state and validity of digital assessment tools for psychiatry: systematic review. JMIR Ment. Health 9, e32824 (2022).
Gibbons, R. D. & deGruy, F. V. Without wasting a word: extreme improvements in efficiency and accuracy using computerized adaptive testing for mental health disorders (CAT-MH). Curr. Psychiatry Rep. 21, 67 (2019).
Giordano, A. et al. Applying multidimensional computerized adaptive testing to the MSQOL-54: a simulation study. Health Qual. Life Outcomes 21, 61 (2023).
Gibbons, R. D. et al. Computerized adaptive tests for rapid and accurate assessment of psychopathology dimensions in youth. J. Am. Acad. Child Adolesc. Psychiatry 59, 1264–1273 (2020).
Bass, M., Morris, S. & Neapolitan, R. Utilizing multidimensional computer adaptive testing to mitigate burden with patient reported outcomes. AMIA Annu. Symp. Proc. 2015, 320–328 (2015).
Sunderland, M., Batterham, P., Carragher, N., Calear, A. & Slade, T. Developing and validating a computerized adaptive test to measure broad and specific factors of internalizing in a community sample. Assessment 26, 1030–1045 (2019).
Loe, B. S., Stillwell, D. & Gibbons, C. Computerized adaptive testing provides reliable and efficient depression measurement using the CES-D scale. J. Med. Internet Res. 19, e302 (2017).
Sunderland, M. et al. Comparing scores from full length, short form, and adaptive tests of the social interaction anxiety and social phobia scales. Assessment 27, 518–532 (2020).
Kessler, R. C. et al. Short screening scales to monitor population prevalences and trends in non-specific psychological distress. Psychol. Med. 32, 959–976 (2002).
Ising, H. K. et al. The validity of the 16-item version of the Prodromal Questionnaire (PQ-16) to screen for ultra high risk of developing psychosis in the general help-seeking population. Schizophr. Bull. 38, 1288–1296 (2012).
Altman, E. G., Hedeker, D., Peterson, J. L. & Davis, J. M. The Altman Self-Rating Mania Scale. Biol. Psychiatry 42, 948–955 (1997).
Norman, S. B., Cissell, S. H., Means-Christensen, A. J. & Stein, M. B. Development and validation of an Overall Anxiety Severity and Impairment Scale (OASIS). Depress. Anxiety 23, 245–249 (2006).
van Spijker, B. A. et al. The suicidal ideation attributes scale (SIDAS): community-based validation study of a new scale for the measurement of suicidal ideation. Suicide Life Threat Behav. 44, 408–419 (2014).
Bush, K., Kivlahan, D. R., McDonell, M. B., Fihn, S. D. & Bradley, K. A. The AUDIT alcohol consumption questions (AUDIT-C): an effective brief screening test for problem drinking. Ambulatory Care Quality Improvement Project (ACQUIP). Alcohol Use Disorders Identification Test. Arch. Intern. Med. 158, 1789–1795 (1998).
Mundt, J. C., Marks, I. M., Shear, M. K. & Greist, J. H. The Work and Social Adjustment Scale: a simple measure of impairment in functioning. Br. J. Psychiatry 180, 461–464 (2002).
Zein, R. A. & Akhtar, H. Getting started with the graded response model: an introduction and tutorial in R. Int. J. Psychol. 60, e13265 (2025).
Ganga, R. N., Santa, K., Ali, M. & Smith, G. The impact of a digital referral platform to improve access to child and adolescent mental health services: a prospective observational study with real-world data. Int. J. Environ. Res. Public Health 21, 1318 (2024).
Sin, J. et al. Digital interventions for screening and treating common mental disorders or symptoms of common mental illness in adults: systematic review and meta-analysis. J. Med. Internet Res. 22, e20581 (2020).
Skinner, A., Occhipinti, J. A., Prodan, A., Song, Y. J. C. & Hickie, I. B. Bi-stability and critical transitions in mental health care systems: a model-based analysis. Int. J. Ment. Health Syst. 17, 5 (2023).
Kjell, O. N. E., Kjell, K. & Schwartz, H. A. Beyond rating scales: with targeted evaluation, large language models are poised for psychological assessment. Psychiatry Res. 333, 115667 (2024).
Ghaemi, S. N. & Pope, H. G. Jr. Lack of insight in psychotic and affective disorders: a review of empirical studies. Harv. Rev. Psychiatry 2, 22–33 (1994).
Graham, A. K. et al. Validation of the computerized adaptive test for mental health in primary care. Ann. Fam. Med. 17, 23–30 (2019).
Bassi, E. M. et al. Perceptions of mental health providers of the barriers and facilitators of using and engaging youth in digital mental-health-enabled measurement based care. Digit. Health 10, 20552076241253093 (2024).
Li, J., Gibbons, R. & Ročková, V. Deep computerized adaptive testing. Preprint at https://doi.org/10.48550/arXiv.2502.19275 (2025).
Colledani, D., Barbaranelli, C. & Anselmi, P. Fast, smart, and adaptive: using machine learning to optimize mental health assessment and monitor change over time. Sci. Rep. 15, 6492 (2025).
Bischl, B. et al. Hyperparameter optimization: foundations, algorithms, best practices, and open challenges. WIREs Data Min. Knowl. Discov. 13, e1484 (2023).
Iorfino, F. et al. A digital platform designed for youth mental health services to deliver personalized and measurement-based care. Front. Psychiatry 10, 595 (2019).
Cappelleri, J. C., Lundy, J. J. & Hays, R. D. Overview of classical test theory and item response theory for the quantitative assessment of items in developing patient-reported outcomes measures. Clin. Ther. 36, 648–662 (2014).
Reise, S. P. & Waller, N. G. Item response theory and clinical measurement. Annu. Rev. Clin. Psychol. 5, 27–48 (2009).
Cai, L., Choi, K., Hansen, M. & Harrell, L. Item Response Theory. Annu. Rev. Stat. Appl. 3, 297–321 (2016).
Mulder, J. & van der Linden, W. J. Multidimensional adaptive testing with optimal design criteria for item selection. Psychometrika 74, 273–296 (2009).
Christensen, K. B., Makransky, G. & Horton, M. Critical values for Yen's Q3: identification of local dependence in the Rasch model using residual correlations. Appl. Psychol. Meas. 41, 178–194 (2017).
Haley, S. M., Coster, W. J., Andres, P. L., Kosinski, M. & Ni, P. Score comparability of short forms and computerized adaptive testing: simulation study with the activity measure for post-acute care. Arch. Phys. Med. Rehabil. 85, 661–666 (2004).
Koo, T. K. & Li, M. Y. A guideline of selecting and reporting intraclass correlation coefficients for reliability research. J. Chiropr. Med. 15, 155–163 (2016).
Chalmers, R. P. mirt: a multidimensional item response theory package for the R environment. J. Stat. Softw. 48, 1–29 (2012).
Chalmers, R. P. Generating adaptive and non-adaptive test interfaces for multidimensional item response theory applications. J. Stat. Softw. 71, 1–38 (2016).
Acknowledgements
W.C. extends his gratitude to Samuel McLeod for guidance related to cross-validation. The funding sources of this study had no role in the design, data collection, data analysis, or reporting of this study. This work was supported by the Medical Research Future Fund Applied Artificial Intelligence in Health Care grant [MRFAI000097]. W.C. was supported by an Australian Government Research Training Program (RTP) Scholarship. J.J.C. was supported by an NHMRC Emerging Leadership Fellowship (2008197). I.B.H. was supported by an NHMRC Research Fellowship (511921). H.M.L. was supported by the Bill & Patricia Richie Foundation and by a philanthropic donor affected by mental ill-health who wishes to remain anonymous. F.I. was supported by an NHMRC EL1 Investigator Grant (GNT2018157).
Author information
Authors and Affiliations
Contributions
W.C. and F.I. conceptualized the project. W.C. conceptualized and conducted the analysis. M.V. reviewed the analysis and provided ongoing statistical guidance. All authors interpreted the results and provided insights. W.C. wrote the initial draft of the manuscript and created the figures, tables, and supplementary material. All authors (W.C., I.B.H., M.V., H.M.L., L.J.B., J.J.C., E.M.S., F.I.) contributed substantive revisions to the first draft of the manuscript.
Corresponding author
Ethics declarations
Competing interests
I.B.H. is a Professor of Psychiatry and the Co-Director of Health and Policy, Brain and Mind Centre, University of Sydney. He has led major public health and health service development in Australia, particularly focusing on early intervention for young people with depression, suicidal thoughts and behaviors, and complex mood disorders. He is active in the development, through codesign, implementation, and continuous evaluation, of new health information and personal monitoring technologies to drive highly personalized and measurement-based care. He holds a 3.2% equity share in Innowell Pty Ltd, which is focused on the digital transformation of mental health services. E.M.S. is a Principal Research Fellow at the Brain and Mind Centre, University of Sydney, and a Consultant Psychiatrist and Adjunct Clinical Professor at the School of Medicine, University of Notre Dame. She previously served as the Discipline Leader for Adult Mental Health at Notre Dame until January 2025. In addition, she is a member of Medibank's Medical and Mental Health Reference Groups. E.M.S. has also delivered educational seminars on the clinical management of depressive disorders, receiving honoraria from pharmaceutical companies including Servier, Janssen, and Eli Lilly. Moreover, she has contributed to a national advisory board for Pfizer's antidepressant Pristiq and served as the National Coordinator for an antidepressant trial sponsored by Servier. All other authors declare no competing interests.
Additional information
Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Supplementary information
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License, which permits any non-commercial use, sharing, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if you modified the licensed material. You do not have permission under this licence to share adapted material derived from this article or parts of it. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by-nc-nd/4.0/.
About this article
Cite this article
Capon, W., Hickie, I.B., Varidel, M. et al. Validating an adaptive digital assessment of youth mental health needs: a cross-sectional study. npj Digit. Med. (2026). https://doi.org/10.1038/s41746-026-02374-2
Received:
Accepted:
Published:
DOI: https://doi.org/10.1038/s41746-026-02374-2