Scientific Reports
A deep learning approach to emotionally intelligent AI for improved learning outcomes
  • Article
  • Open access
  • Published: 05 February 2026

  • Xiaoyu Wu1,2,
  • Tientien Lee1,3,
  • Umesh Kumar Lilhore4,
  • Sarita Simaiya5,
  • Roobaea Alroobaea6,
  • Abdullah M. Baqasah7,
  • Majed Alsafyani6 &
  • Lidia Gosy Tekeste8

Scientific Reports (2026)

We are providing an unedited version of this manuscript to give early access to its findings. Before final publication, the manuscript will undergo further editing. Please note there may be errors present which affect the content, and all legal disclaimers apply.

Subjects

  • Mathematics and computing
  • Psychology

Abstract

Artificial intelligence–driven educational systems have largely prioritised cognitive adaptation, often neglecting the critical role of learners’ emotional states in shaping engagement and learning outcomes. To address this limitation, this study proposes a multimodal, emotion-aware deep learning framework designed to integrate emotional intelligence into intelligent learning environments. The framework jointly analyses facial expressions, speech characteristics, and textual responses to infer learners’ emotional states and models the interdependencies among these modalities through a graph-based fusion mechanism. The proposed approach is evaluated using benchmark emotion datasets, namely AffectNet and IEMOCAP, to assess its capability to recognise emotional patterns and support adaptive feedback during learning interactions. Experimental results demonstrate that incorporating emotional awareness leads to substantial improvements in learner engagement, emotional regulation, and task persistence when compared with conventional cognition-focused systems. The framework achieves consistently high emotion recognition performance, particularly for positive and neutral affective states, and shows robust generalisation across different emotion categories. User study outcomes further suggest that learners perceive the system as more supportive and responsive due to its emotional adaptability. In addition to performance evaluation, the study discusses key ethical considerations associated with emotion-aware educational technologies, including data privacy, informed consent, and responsible deployment. Overall, the findings underscore the potential of multimodal emotional intelligence to advance the development of more empathetic, adaptive, and effective artificial intelligence-based educational systems.
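The graph-based fusion mechanism described above is not specified in code in this early-access version. As a rough illustration only — the function names, embedding dimension, number of emotion classes, and the use of scaled dot-product similarity as the adjacency are all assumptions for this sketch, not the authors' implementation — one round of message passing over a fully connected three-node modality graph (face, speech, text) might look like:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax (subtract the max before exponentiating).
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def graph_fuse(face, speech, text):
    """Fuse per-modality embeddings via attention-weighted message passing
    over a fully connected 3-node graph (one node per modality)."""
    X = np.stack([face, speech, text])          # (3, d) node features
    A = softmax(X @ X.T / np.sqrt(X.shape[1]))  # (3, 3) adjacency from scaled dot-product similarity
    H = A @ X                                   # each modality aggregates messages from all modalities
    return H.mean(axis=0)                       # pooled joint representation, shape (d,)

rng = np.random.default_rng(0)
fused = graph_fuse(rng.normal(size=8), rng.normal(size=8), rng.normal(size=8))
probs = softmax(fused[:4])  # e.g. a 4-class emotion head applied to the fused vector
```

In the paper's full framework the adjacency and modality encoders would be learned (e.g. a GNN over CNN/BERT/MFCC features, per the abbreviation list below); this sketch only shows the fusion pattern, not the trained model.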

Data availability

The dataset is available from the corresponding author upon individual request.

Abbreviations

AI:

Artificial intelligence

FER:

Facial expression recognition

NLP:

Natural language processing

CNN:

Convolutional neural network

LSTM:

Long short-term memory

ViT:

Vision transformer

GNN:

Graph neural network

SMOTE:

Synthetic minority over-sampling technique

Dlib:

Dlib machine learning toolkit (used here for face detection and landmark processing; the name is not an acronym)

IoT:

Internet of Things

EEG:

Electroencephalogram

GDPR:

General data protection regulation

IEMOCAP:

Interactive emotional dyadic motion capture

SoftMax:

Softmax (normalised exponential) function

EI:

Emotional intelligence

SER:

Speech emotion recognition

DL:

Deep learning

RNN:

Recurrent neural network

TCN:

Temporal convolutional network

BERT:

Bidirectional encoder representations from transformers

MFCC:

Mel-frequency cepstral coefficients

MTCNN:

Multi-task cascaded convolutional neural network

FC:

Fully connected

FL:

Federated learning

HCI:

Human–computer interaction

FERPA:

Family educational rights and privacy act

AffectNet:

Large-scale facial expression dataset

MF:

Multimodal fusion


Acknowledgements

The author extends their appreciation to Taif University, Saudi Arabia, for supporting this work through project number (TU-DSPP-2024-17).

Funding

This research was funded by Taif University, Taif, Saudi Arabia, project number (TU-DSPP-2024-17).

Author information

Authors and Affiliations

  1. Faculty of Social Sciences and Liberal Arts, UCSI University, No. 1, UCSI Heights, Jalan Puncak Menara Gading, Taman Connaught, 56000, Cheras, Wilayah Persekutuan Kuala Lumpur, Malaysia

    Xiaoyu Wu & Tientien Lee

  2. College of General Education, Zhangzhou College of Science and Technology, Zhangzhou, 363202, Fujian, China

    Xiaoyu Wu

  3. Faculty of Science and Mathematics, University Pendidikan Sultan Idris, 35900 Tanjong Malim, Perak, Malaysia

    Tientien Lee

  4. School of Computer Science and Engineering, Galgotias University, Greater Noida, UP, India

    Umesh Kumar Lilhore

  5. School of Computer Applications and Technology, Galgotias University, Greater Noida, UP, India

    Sarita Simaiya

  6. Department of Computer Science, College of Computers and Information Technology, Taif University, P. O. Box 11099, 21944, Taif, Saudi Arabia

    Roobaea Alroobaea & Majed Alsafyani

  7. Department of Information Technology, College of Computers and Information Technology, Taif University, 21974, Taif, Saudi Arabia

    Abdullah M. Baqasah

  8. Eritrea Institute of Technology, Mai-Nefhi College, Himbrti, Mai Nefhi, Eritrea

    Lidia Gosy Tekeste


Contributions

Umesh Kumar Lilhore and Xiaoyu Wu conceptualised the research, designed the methodology, and contributed to data analysis and result interpretation. Tientien Lee was responsible for data collection, played a key role in the experimental work, and assisted in manuscript drafting and revision. Umesh Kumar Lilhore also focused on statistical analysis and data visualisation and contributed to writing the discussion section. Sarita Simaiya supported laboratory work, experimental processes, and manuscript editing. Roobaea Alroobaea contributed to the literature review and assisted with manuscript revisions. Abdullah M. Baqasah provided technical support during data collection, validated results, and contributed to the methodology. Majed Alsafyani helped with data analysis and interpretation and provided feedback on the manuscript. Finally, Lidia Gosy Tekeste, as the corresponding author, oversaw the project, coordinated the team, and wrote the final manuscript.

Corresponding authors

Correspondence to Tientien Lee, Umesh Kumar Lilhore or Lidia Gosy Tekeste.

Ethics declarations

Competing interests

The authors declare no competing interests.

Consent for publication

All authors have reviewed and approved the final manuscript for publication.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License, which permits any non-commercial use, sharing, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if you modified the licensed material. You do not have permission under this licence to share adapted material derived from this article or parts of it. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by-nc-nd/4.0/.

Reprints and permissions

About this article


Cite this article

Wu, X., Lee, T., Lilhore, U.K. et al. A deep learning approach to emotionally intelligent AI for improved learning outcomes. Sci Rep (2026). https://doi.org/10.1038/s41598-026-37750-1

Download citation

  • Received: 18 August 2025

  • Accepted: 24 January 2026

  • Published: 05 February 2026

  • DOI: https://doi.org/10.1038/s41598-026-37750-1


Keywords

  • Deep learning
  • Emotional intelligence
  • Personalised learning
  • Multimodal data
  • AI education
  • Facial expression recognition
  • Speech sentiment analysis
