Abstract
Panoramic radiography is an essential auxiliary diagnostic tool for oral diseases. However, interpreting panoramic radiographs at scale is a difficult and time-consuming task, and these challenges are exacerbated by the need to create electronic medical records and to investigate oral diseases using collective data. We therefore develop a visualization system based on panoramic radiographs, focused on the intelligent diagnosis and statistical analysis of oral diseases. First, we provide a human-machine collaborative tool for the diagnosis of oral diseases in panoramic radiographs and the extraction of the corresponding data. The system then generates electronic medical records, including visual charts of oral health status and radiology reports. We further develop statistical correlation analyses to visually evaluate and interactively explore data from oral health surveys. Based on 521 panoramic radiographs, we perform intelligent diagnosis, generate electronic medical records, and conduct collective analysis. The available analyses cover disease-prone teeth, disease distribution per tooth position, and the association of age and sex with oral diseases. Results from a comprehensive case study show that our system improves the efficiency of disease detection and data mining. It can also fuel research in public oral health and provide robust support for oral healthcare strategies.
Introduction
Early detection and accurate treatment of oral diseases are essential for human health1. As one of the most common and important examination techniques for oral disease diagnosis, panoramic radiography uses specialized X-ray apparatus to capture panoramic radiographs2. Panoramic radiographs visually present oral structures such as crowns, roots, alveolar bone, and the temporomandibular joints, assisting dentists in diagnosing oral diseases and assessing the overall oral health situation3.
However, interpreting panoramic radiographs is a time-consuming and complex task, for reasons summarized as follows. First, dentists must spend substantial time interpreting panoramic radiographs and writing diagnosis reports, and interpretations depend heavily on the dentist’s personal knowledge, which can easily lead to the misinterpretation of some oral diseases4. Second, diagnostic conclusions typically focus on addressing the patient’s immediate concerns and are recorded as such in the electronic medical record system. Consequently, the general oral health status cannot be derived from these conclusions in the absence of tools for comprehensive analysis5.
Recently, with the development of artificial intelligence (AI), many methods have been proposed for automatic panoramic radiograph interpretation. (1) Deep learning-based tooth segmentation. Many scholars use deep learning to accurately identify and segment teeth, which simplifies clinical work and supports advanced dental programs for diagnosis and treatment6,7,8,9,10. (2) AI-based disease diagnosis and health report generation. Researchers have shown that deep learning-based systems can improve the accuracy and speed of diagnosing oral diseases (such as impacted teeth, residual roots, tooth loss, and caries), although existing AI models may still lack the accuracy needed for reliable oral disease diagnosis11. Furthermore, some scholars have proposed frameworks that not only learn to detect diseases from the initial images but also generate relevant medical reports based on these findings12,13. (3) Population-based oral health surveys. Some studies concentrate on oral health data at the population level, describing the oral health status, awareness, and behaviors of different groups and investigating their interrelations. This research both deepens our understanding of oral health in communities and offers tailored advice and suggestions to individuals14,15,16. Although these technologies have shown great advantages in specific oral clinical scenarios, there is still no tool that provides a comprehensive workflow for improving the efficiency and validity of oral disease diagnosis through the combination of computer and human intelligence.
To address the above issues, we develop an intelligent diagnosis and statistical analysis system based on panoramic radiographs. The major contributions of this paper are summarized as follows: Firstly, we propose a human-machine collaborative tool for the diagnosis and data extraction of oral diseases in panoramic radiographs. This tool greatly facilitates the segmentation, computation, and archiving of radiographic features. Secondly, we develop a visual analysis approach for correlating population data, enabling in-depth exploration and research of population-based oral health data. We hypothesize that the system will improve the accuracy and efficiency of dental disease diagnosis. Furthermore, statistical analysis of the collected data will provide valuable insights and recommendations for population-level oral health care strategies.
Methods
Flowchart
The flowchart of our system is presented in Fig. 1. First, dentists upload the examinee’s panoramic radiograph, and the examinee’s personal information is associated with it in the system. The system then autonomously segments teeth and detects dental diseases in the panoramic radiograph. In addition, we provide a rich set of user interfaces that enable dentists to revise the tooth segmentations and lesion locations for the analysis of oral diseases. Our system detects six common oral diseases: caries, pulp exposure, periradicular lesions, impacted teeth, residual roots, and missing teeth. After that, a rule-based natural language processing (NLP) method generates electronic medical records, including visual charts of oral health status and radiology reports. Finally, we conduct a statistical survey and visual analysis of disease and group characteristics within the examined group, employing multi-dimensional charts such as heat maps and bar charts to illustrate disease types, distributions, and associations. To assess the system’s effectiveness and usability, we conducted expert interviews and a quantitative questionnaire survey. The system greatly aids in the detection, prevention, and treatment of oral diseases, offering improved oral health management for the public.
Data collection
The panoramic radiographs were retrospectively selected from an image database of patients who visited the hospital between November 2023 and January 2024. The dataset was then constructed by extracting age, gender, and panoramic radiographs. Patients gave informed consent before any panoramic radiographs were taken, and their privacy was protected when the data were used for medical research; for this reason, no additional informed consent was required from these individuals. The study was approved by the Ethics Committee of the First Affiliated Hospital, Zhejiang University School of Medicine (approval no. 20230785), and was performed in accordance with the Declaration of Helsinki.
The inclusion criteria for panoramic radiographs were: permanent dentition; age ≥ 12. The exclusion criteria were: (1) deciduous dentition or retained deciduous teeth; (2) severe crowding of the teeth (greater than 8 mm per arch); (3) incomplete or blurry panoramic radiographs; (4) artifacts from spectacles, earrings, piercings, or detachable dentures on the panoramic radiographs. Panoramic images were obtained using a Planmeca Promax digital panoramic X-ray unit (Planmeca, Helsinki, Finland). The tube voltage ranged from 60 to 90 kV and the tube operating current from 1 to 10 mA. X-ray exposures were performed using the device’s default program, with a fixed magnification of 1.2× and a rotation time of 15.8 s. The resolution of the panoramic radiographs was 2943 × 1435 pixels.
The dataset encompassed 521 panoramic radiographs, with a distribution of 249 male cases (47.8%) and 272 female cases (52.2%). The age spectrum ranged from 12 to 85 years, categorized into five distinct groups: <18 years old, with 109 cases (20.9%); 18–29 years old, with 135 cases (25.9%); 30–44 years old, with 71 cases (13.6%); 45–60 years old, with 128 cases (24.6%); and > 60 years old, with 78 cases (15.0%).
System development
We develop an intelligent diagnosis and statistical analysis system based on panoramic radiographs, as shown in Fig. 2.
Intelligent diagnosis and statistical analysis system for oral diseases based on panoramic radiographs. (A) A control panel enabling users to load datasets and select a specific patient, whose basic information is presented below. (B) The panoramic view showing the labeled panoramic radiograph. (C) The electronic medical record, including the number of teeth, the oral health risk index, the occurrences of six diseases, and the radiology report. (D) The heat map showing the prevalence rate per tooth position. (E) The bar view presenting the disease distribution.
Interactive capabilities for diagnosing diseases
We used Mask R-CNN for tooth segmentation and tooth position recognition. Next, we selected the ResNet101 model as our initial model. It was pre-trained on the ImageNet dataset, which includes over 14 million labeled images. We then fine-tuned the pre-trained ResNet101 model through transfer learning using our self-annotated dataset of multi-type oral diseases. The dataset consists of 307 panoramic dental radiographs. Each tooth was meticulously annotated by multiple dental experts and categorized into seven classes: healthy, caries, pulp exposure, periradicular lesions, impacted teeth, residual roots and missing teeth. Labels were established by a team of three dentists, each with over 10 years of clinical experience and a background in oral and maxillofacial radiology. When consensus was not reached, a fourth senior radiologist provided final validation. In total, we extracted approximately 531 annotated tooth samples, with 80% used for training and 20% used for testing. During the transfer learning process, we replaced the original fully connected layer of the ResNet101 model with a new one tailored to the current task. The new fully connected layer includes seven output nodes, corresponding to the seven categories, and uses the Softmax function to generate class probabilities. We froze the initial convolutional layers of the model to preserve their pre-trained feature extraction capabilities. At the same time, we fine-tuned the later convolutional layers and the new fully connected layer to adapt the model to the specific characteristics of dental lesions. To improve the model’s robustness and generalizability, a range of data augmentation techniques was employed, including random rotation, horizontal flipping, brightness adjustment, and denoising. These methods enhanced the diversity and quality of the training samples, enabling the model to handle varied clinical scenarios. 
Additionally, we optimized the training process by performing cross-validation to identify the best hyperparameter settings. As a result, we obtained a ResNet101 model capable of intelligent diagnosis across multiple types of oral diseases.
The system enables dentists to verify and refine the results of automatic tooth segmentation and preliminary disease diagnosis, ensuring accuracy and clinical reliability. After the model segments the teeth and identifies potential diseases, dentists can interactively review the results. They can annotate lesion locations, and modify any inaccuracies in segmentation or diagnosis. By combining automated diagnosis with interactive verification, the system increases diagnostic precision while providing dentists complete authority. This ensures alignment with clinical standards and patient-specific needs.
Generation of electronic medical records
The system generates electronic medical records using visualization technology, with the annotated images as the characteristic data. The two functional modules are described below (Fig. 2C).
(1) Visualized charts.
The top line presents the tooth count. Next, using the lesion location data, we assign a weight corresponding to each disease’s severity, and a health risk chart is designed from these weights to represent the health risk index. Finally, the system counts the occurrences of the six diseases in the uploaded panoramic radiographs and visualizes this statistical information with a radar chart, whose vertices represent the extreme values of the statistical data.
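A minimal sketch of the weighted risk index computation, with hypothetical severity weights (the paper assigns weights by lesion severity but does not publish the exact values):

```python
# Hypothetical per-disease severity weights; the actual system's weights
# are not published in the paper.
SEVERITY_WEIGHTS = {
    "caries": 1.0,
    "pulp exposure": 2.0,
    "periradicular lesion": 2.5,
    "impacted tooth": 1.5,
    "residual root": 2.0,
    "missing tooth": 1.0,
}

def health_risk_index(findings):
    """Sum severity weights over all lesions found in one radiograph.

    `findings` is a list of (tooth_position, disease) pairs taken from the
    annotated panoramic radiograph.
    """
    return sum(SEVERITY_WEIGHTS[disease] for _, disease in findings)

findings = [("46", "caries"), ("36", "missing tooth"), ("18", "impacted tooth")]
print(health_risk_index(findings))  # 1.0 + 1.0 + 1.5 = 3.5
```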
(2) Radiology report.
This module primarily involves text matching and generation. We first created a comprehensive oral therapy corpus that included a variety of oral issues, available treatments, and clinical settings. Secondly, we built a rule-based text matching model. It processes disease diagnoses according to preset rules and categorizes key information based on disease type, clinical symptoms, and disease severity. Thirdly, the system integrates structured information to facilitate model matching. It queries the corpus based on the extracted structured information to obtain more detailed information about oral problems and treatments. For instance, the medical record description of a patient with caries can be integrated into the structured information of “17-caries” following the rule of “tooth position + disease”. Cosine similarity computation is employed to retrieve relevant treatment suggestions from the corpus. Additionally, we predefined templates for generating oral diagnostic health reports. These templates cover diagnostic descriptions, symptoms, treatment recommendations, and precautions for oral diseases. Ultimately, we standardized and refined the information extracted from the corpus, which produced readable treatment recommendation texts. The final confirmation of the radiology report is made by the treating dentist, who reviews and adjusts the generated content based on clinical examination and patient consultation.
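The corpus retrieval step can be illustrated with a bag-of-words cosine similarity over a toy corpus; the corpus entries and the `suggest` helper below are illustrative, not the system's actual implementation:

```python
import math
from collections import Counter

# Toy treatment corpus keyed by descriptive text; entries and wording are
# illustrative, not the system's actual corpus.
CORPUS = {
    "caries restorative filling": "Restore the carious tooth with a filling.",
    "residual root extraction": "Extract the residual root.",
    "impacted tooth surgical removal": "Surgically remove the impacted tooth.",
}

def cosine(a, b):
    """Cosine similarity between two texts as bag-of-words vectors."""
    ca, cb = Counter(a.split()), Counter(b.split())
    dot = sum(ca[t] * cb[t] for t in ca)
    na = math.sqrt(sum(v * v for v in ca.values()))
    nb = math.sqrt(sum(v * v for v in cb.values()))
    return dot / (na * nb) if na and nb else 0.0

def suggest(query):
    """Return the treatment text of the most similar corpus entry."""
    best_key = max(CORPUS, key=lambda k: cosine(query, k))
    return CORPUS[best_key]

# Structured query following the "tooth position + disease" rule, e.g.
# "17-caries" yields the disease term for matching.
print(suggest("caries"))  # → "Restore the carious tooth with a filling."
```

The retrieved text would then be slotted into a predefined report template covering diagnosis, symptoms, recommendations, and precautions.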
Statistical analysis
Our study further provides a statistical analysis module that employs multi-dimensional attribute images such as heat maps and bar charts. It helps to conduct statistical investigations and visual analyses of group characteristics and disease features (Fig. 2D and E).
(1) Prevalence rate per tooth position.
The system organizes and consolidates the gathered data, tallying the occurrences of the six oral diseases (caries, pulp exposure, periradicular lesions, impacted teeth, residual roots, missing teeth) at each tooth position. Different shades of red represent the prevalence level; deeper colors indicate a higher number of affected teeth at a given position. Additionally, a dashboard above each tooth icon displays the probability of the tooth being abnormal (number of affected teeth at the position / total number of teeth at the position) across all panoramas studied.
Dentists can hover over a tooth position to reveal the number of affected teeth in that position by interacting with the heat map. Utilizing the filtering options in the interaction box (e.g., choosing a specific oral disease or all oral diseases), dentists can refine and tally the data, obtain the filtered information, and redraw the heat map for visualization.
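The per-position prevalence behind the heat map can be sketched as follows; the records and examined-teeth counts are illustrative:

```python
from collections import Counter

# Each record is (tooth_position, disease) from one examinee's radiograph;
# the records and the examined counts below are illustrative.
records = [("46", "caries"), ("46", "missing tooth"), ("36", "caries")]

affected = Counter(pos for pos, _ in records)
# Assume positions 46, 36, and 13 were each examined in all 3 radiographs.
examined = {"46": 3, "36": 3, "13": 3}

# Prevalence per position = affected teeth at the position / total teeth
# examined at that position, as shown on the dashboard above each icon.
prevalence = {pos: affected[pos] / n for pos, n in examined.items()}
print(prevalence)  # e.g. prevalence["46"] == 2/3
```

Filtering by a single disease simply restricts `records` before counting, after which the heat map is redrawn.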
(2) Disease distribution per tooth position.
When the dentist clicks a tooth icon on the heat map, the system filters the corresponding disease data. After retrieving the age and gender distribution, it shows them on the right as a gender distribution chart and a bar chart. This interaction between the heat map and bar chart facilitates a comprehensive understanding.
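The click-to-filter interaction can be sketched as a simple linked-view query; the records and the age-group helper below are illustrative:

```python
# Illustrative linked-view filter: clicking a tooth selects its records,
# and the gender/age tallies drive the charts on the right.
records = [
    {"tooth": "46", "disease": "caries", "sex": "F", "age": 15},
    {"tooth": "46", "disease": "caries", "sex": "M", "age": 25},
    {"tooth": "36", "disease": "caries", "sex": "M", "age": 50},
]

# Age groups as used in the study's statistics.
AGE_GROUPS = [(0, 17, "<18"), (18, 29, "18-29"), (30, 44, "30-44"),
              (45, 60, "45-60"), (61, 200, ">60")]

def tooth_distribution(records, tooth):
    """Filter records for one tooth position and tally sex and age group."""
    hits = [r for r in records if r["tooth"] == tooth]
    by_sex, by_age = {}, {}
    for r in hits:
        by_sex[r["sex"]] = by_sex.get(r["sex"], 0) + 1
        group = next(lab for lo, hi, lab in AGE_GROUPS if lo <= r["age"] <= hi)
        by_age[group] = by_age.get(group, 0) + 1
    return len(hits), by_sex, by_age

total, by_sex, by_age = tooth_distribution(records, "46")
print(total, by_sex, by_age)  # 2 {'F': 1, 'M': 1} {'<18': 1, '18-29': 1}
```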
(3) Association of disease with other factors.
Bar graphs are used to illustrate the prevalence of the six oral diseases across various genders and age groups. The vertical axis represents the six oral diseases, while the horizontal axis indicates the number of affected teeth. Each bar corresponds to one oral disease. Its stacked segments represent subgroups based on gender (male or female) or age (< 18, 18–29, 30–44, 45–60, and > 60)17.
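A stacked horizontal bar chart of this kind can be sketched with matplotlib; the per-gender splits below are hypothetical, although the totals for caries (280) and missing teeth (731) follow the counts reported in the Results:

```python
import matplotlib
matplotlib.use("Agg")  # headless rendering
import matplotlib.pyplot as plt
import numpy as np

diseases = ["caries", "pulp exposure", "periradicular lesions",
            "impacted teeth", "residual roots", "missing teeth"]
# Hypothetical per-gender counts of affected teeth; only the caries and
# missing-teeth totals (280 and 731) match the study's reported numbers.
male = np.array([150, 40, 60, 90, 55, 380])
female = np.array([130, 35, 51, 85, 45, 351])

fig, ax = plt.subplots()
y = np.arange(len(diseases))
ax.barh(y, male, label="male")
ax.barh(y, female, left=male, label="female")  # stack the second segment
ax.set_yticks(y)
ax.set_yticklabels(diseases)
ax.set_xlabel("number of affected teeth")
ax.legend()
fig.savefig("disease_by_gender.png")
print(int((male + female).max()))  # longest bar: total missing teeth, 731
```

The same layout works for age groups by stacking five segments per bar instead of two.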
User study
To further assess the effectiveness of our system, we invited twelve dentists (6 males and 6 females) to use our system. The composition of the dental team includes: 4 high-level dentists with more than 10 years of clinical experience (H1, H2, H3, H4); 4 mid-level dentists with 3–10 years of clinical experience (M1, M2, M3, M4); and 4 low-level dentists with less than 3 years of clinical experience (L1, L2, L3, L4). Before teaching them how to use the system, we gave them an overview of its goals and features. Subsequently, they applied this system to everyday clinical consultations. During the process, we recorded their feedback and interactions. Additionally, we asked them to evaluate our system using a 7-point Likert scale based on the post-study questionnaire (Table 1), focusing on system function, system usability and system design.
Results
Performance score of the detection model
In this study, multiple metrics are computed to assess the performance of the detection model, including precision, recall, F1-score, as well as average precision (AP) and mean average precision (mAP). Mean average precision is calculated by averaging the AP across all classes, offering an overall measure of the model’s performance. In our study, the mAP achieved by the model was 0.76, indicating strong performance in precision and recall across all classes and the ability to accurately localize most targets (Table 2).
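The AP/mAP computation can be sketched with the standard ranked-precision definition; the per-class detections below are illustrative, not the study's data:

```python
import numpy as np

def average_precision(scores, labels):
    """AP for one class: mean precision at each true-positive rank."""
    order = np.argsort(scores)[::-1]          # sort by confidence, descending
    labels = np.asarray(labels)[order]
    tp = np.cumsum(labels)                    # true positives up to each rank
    ranks = np.arange(1, len(labels) + 1)
    precision_at_hit = (tp / ranks)[labels == 1]
    return precision_at_hit.mean() if precision_at_hit.size else 0.0

# Illustrative (confidence, is-correct) detections for two classes.
per_class = {
    "caries":        ([0.9, 0.8, 0.3], [1, 1, 0]),
    "missing teeth": ([0.95, 0.6, 0.5], [1, 0, 1]),
}
aps = {c: average_precision(s, l) for c, (s, l) in per_class.items()}
m_ap = sum(aps.values()) / len(aps)           # mAP = mean of per-class APs
print(round(m_ap, 3))  # 0.917
```

Object-detection benchmarks additionally gate each detection on an IoU threshold against the ground-truth box before it counts as correct.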
Statistical analysis of patients’ oral conditions
In this study, we conducted a statistical analysis of the test results from 521 examinees to better articulate and extract feature data from the system.
Disease-prone teeth
This heat map illustrates the prevalence rate per tooth position. Among the 521 patients surveyed, 206 mandibular first molars were found to be diseased, followed by the maxillary and mandibular third molars with 172 and 155 affected teeth, respectively. Lesions in the canines were infrequent.
Six kinds of diseases can be selected from the drop-down menu: caries, pulp exposure, periradicular lesions, impacted teeth, residual roots, and missing teeth. The total number of carious teeth was 280, with the highest prevalence found in the mandibular first molars, constituting 55 (19.6%). In the anterior teeth, the maxillary teeth had a higher caries prevalence than their mandibular counterparts. Conversely, in the posterior teeth, the mandibular teeth showed a higher caries prevalence than the maxillary teeth. The total number of missing teeth was 731, with the highest prevalence found in the mandibular first molars, constituting 108 (14.8%). There were 100 residual roots, and the mandibular first molars were the preferred location for residual roots, constituting 17 (17%). The number of teeth with endodontic periapical disease totaled 111, and the mandibular first molars were again the predominant site, accounting for 26 (23.4%) (Fig. 3).
Heat maps for all six diseases and each disease. Heat maps depicting the prevalence of six oral diseases (caries, pulp exposure, periradicular lesions, impacted teeth, residual roots, and missing teeth) across tooth positions. Each map uses a gradient color scale where deeper red shades indicate higher prevalence. These maps emphasize disease hotspots, such as the mandibular first molars for caries and residual roots, highlighting their clinical importance for preventive care.
Disease distribution per tooth position
We take caries as an example. Clicking a specific tooth icon on the heat map, such as tooth 46, reveals 28 carious teeth. The gender and age distribution is presented at the bottom: 13 cases (46%) in females and 15 (54%) in males. There were 5 carious teeth in patients under 18 years old, 17 in the 18–29 age group, 4 in the 30–44 age group, and 2 in the 45–60 age group (Fig. 4).
Association of age, sex with oral diseases
The bar graph below illustrates the association of the six oral diseases with gender and age. The number of affected teeth in each category can be found by hovering over a bar. The length of each bar represents the quantity of affected teeth (Fig. 5). Tooth loss was more prevalent in the > 60 years age group. This was partly due to physiological factors such as alveolar bone resorption and periodontal atrophy. Moreover, multiple tooth loss sometimes occurs as a result of untreated caries and periodontal disease18.
Impacted teeth were particularly prevalent among people aged 18 to 29. During this age range, the third molars are in the eruption phase, and the crown may be partially or fully covered by the gingival flap. This creates an environment where food debris and bacteria can be easily retained in the enclosed pocket, leading to the occurrence of pericoronitis. Moreover, adults in this age group often have preventive extraction of impacted teeth that need to be removed19. There were no definitive conclusions about the relationship between gender and oral diseases (Table 3).
System use and features
Figure 6 shows the feedback from the use study, inclusive of mean and standard deviation values derived from data analysis. A mean score exceeding 4 was achieved across all evaluated dimensions, indicating a strong consensus among dentists that the system performed well.
The feedback of the user study. The bar charts represent ratings (on a scale of 1–7) for each question within three categories: system function, system usability and system design. High scores across all categories indicate the system’s strong acceptance among dentists. These results validate the system’s potential for clinical application. SD: Standard deviation.
System function
Most dentists agreed that our system significantly boosted the accuracy of clinical diagnoses (mean = 4.75, SD = 1.34) while diminishing the likelihood of missed ones (mean = 4.33, SD = 1.07). It also improved their efficiency in reading panoramic radiographs and saved time on writing medical records (mean = 4.5, SD = 0.94). Additionally, the graphical representations were helpful in gaining insights into multidimensional data (mean = 5.17, SD = 1.03). H1 noted that the system currently covers six diseases and offers limited patient history; expanding its functionality to include more conditions and comprehensive patient histories would likely increase its adoption and effectiveness in clinical settings.
System usability
The majority of dentists provided positive feedback on the usability. L1 shared, “I thought the system was easy to use,” highlighting an initial positive impression. She added, “In clinical diagnosis and treatment, I would like to use this system.” This indicated a strong inclination towards incorporation of the system in practical medical settings. Emphasizing the system’s broad utility, H3 mentioned, “I will recommend the system to health examination centers.” This showed a belief in the system’s value and relevance in healthcare contexts.
System design
Most dentists agreed that the system’s various functions were well integrated, with modules clearly and logically arranged (mean = 5.08, SD = 1.05). In addition, the system’s statistical analysis phase was intuitive, particularly when selecting and comparing different datasets (mean = 4.83, SD = 1.35). M2 noted that if more patient information and statistical analysis items were added, the “disease distribution” module in the bottom left corner would require further design consideration; for example, adding interactive elements such as filters and search functions would allow users to customize the data they view.
Discussion
Machine learning and artificial intelligence have increasingly advanced the field of dental imaging, driving progress in the automated analysis and interpretation of dental X-rays. These techniques improve diagnostic accuracy, reduce decision-making time, and achieve high accuracy in predicting disease progression20. Numerous previous studies have employed AI to diagnose specific oral diseases such as periodontal disease21, caries22 and periapical lesions23. Building on these efforts, recent research has extended toward the simultaneous detection of multiple dental conditions in panoramic radiographs. For example, Lee et al.24 explored the potential of AI in detecting 17 fine-grained dental anomalies with high performance, especially in terms of specificity. Başaran et al.25 evaluated an AI model for ten different dental conditions using panoramic radiographs. Alsakar et al.26 proposed a hybrid model integrating MobileNetV2 and Swin Transformer architectures for dental disease diagnosis using dental X-ray images. Building on these developments, our system also incorporates AI models to enhance diagnostic efficiency and consistency. This supports more accurate clinical decision-making and assists dentists in formulating personalized treatment plans tailored to individual patient needs.
Data visualization, with its intuitive and accessible nature, shows growing potential in healthcare applications27,28. Visualization tools such as charts and heat maps enable rapid interpretation of large-scale data, enhancing diagnostics, patient monitoring, and resource allocation. They also support predictive analytics and preventive care by revealing hidden patterns and trends29. Kwon et al.30 developed a system called DPVis. The system effectively evaluates disease progression, visualizes disease states, and enables interactive exploration of progression patterns. In our study, the module for group oral health statistics survey and visual analysis aids dentists in exploring survey statistics, revealing oral disease prevalence trends, identifying high-risk groups, and understanding associated risk factors. This is essential for government attempts to develop successful therapies as well as for the analysis of extensive oral health surveys31.
Despite these advancements, integrated systems that combine diagnosis, report generation, and data analysis remain limited, hindering transformative changes in clinical workflows. In this study, we integrate intelligent disease diagnosis, electronic medical record generation, and statistical analysis of group data into a single system. Based on the results of the user study, the system demonstrates high efficiency, strong visualization capabilities, and good usability, with mean scores exceeding 4 across all evaluated dimensions. The study also aligns with the WHO’s global strategy and action plan on oral health 2023–203032, which focuses on developing innovative methods to collect and analyze oral health and epidemiological data using digital technologies. The WHO’s goal is to create more efficient, effective, and inclusive health information systems that integrate oral health.
Although the system shows considerable potential, some limitations should be acknowledged. First, the current system covers only six common oral diseases, partly because certain prevalent oral diseases may not be adequately visualized on panoramic radiographs. Second, our study data were obtained from patients who underwent panoramic radiography at the oral and maxillofacial department of a single hospital, rather than from a general population sample. This could introduce selection bias, as healthy individuals are underrepresented in the study population. When interpreting the findings, it should be borne in mind that the restricted sampling frame and timeframe limit the generalizability of the results.
In the future, we plan to expand data sources, such as cone beam computed tomography (CBCT) data33, intra-oral scanning data34, intraoral images35 and pathology section images36,37. Second, we plan to further optimize the model by enhancing its ability to extract specific oral disease features, aiming for more accurate automated diagnosis11,38,39; optimized AI models may surpass experts in detecting early-stage lesions that humans might miss visually. Third, we advocate incorporating Artificial Intelligence Generated Content (AIGC) to train on and learn from this diverse oral data40,41,42. By integrating various data sources, AIGC can be leveraged to generate electronic medical records and conduct population-level analyses of oral health status43. Fourth, we propose integrating panoramic radiographs from multiple hospitals, medical checkup centers, and research facilities into a consolidated dataset, including outcomes from both patients and the general population, to achieve more comprehensive and representative data44. This broader collection would not only enhance sample diversity but could also help alleviate the class imbalance problem by increasing the representation of under-sampled categories. Additional information about each examinee would include related medical histories, various auxiliary examinations, and oral hygiene habits.
Conclusion
Our study proposes an interactive diagnosis and intelligent statistical analysis system for oral diseases based on panoramic radiography. This system is an invaluable tool for dentists. It significantly improves the efficiency and effectiveness of individual oral health management. Beyond individual care, it also excels in facilitating large-scale studies and managing oral health at the population level. Moreover, the system offers substantial potential for integration with emerging technologies, such as AIGC, to further enhance its capabilities. In the future, when applied to large-scale population health surveys, it could play a pivotal role in improving oral health prevention across broader demographics. By leveraging AI-driven insights, the system promotes early detection, timely treatment, and personalized care plans. This results in better oral health promotions and more effective public health strategies. Overall, this expanded application highlights the system’s significant value in both clinical practice and population health management.
Data availability
The data presented in this study are available on request from the corresponding author. The data are not publicly available due to privacy restrictions.
References
Fatima, A. et al. Advancements in dentistry with Artificial Intelligence: current clinical applications and future perspectives. Healthc (Basel). 10, 2188 (2022).
Choi, J. W. Assessment of panoramic radiography as a national oral examination tool: review of the literature. Imaging Sci. Dent. 41, 1–6 (2011).
Kaur, R., Sandhu, R. S., Gera, A. & Kaur, T. Edge detection in digital panoramic dental radiograph using improved morphological gradient and MATLAB. 2017 International Conference on Smart Technologies for Smart Nation (SmartTechCon) 793–797 (2017). https://doi.org/10.1109/SmartTechCon.2017.8358481
Hegde, S., Gao, J., Vasa, R., Nanayakkara, S. & Cox, S. Australian dentist’s knowledge and perceptions of factors affecting radiographic interpretation. Int. Dent. J. 74, 589–596 (2024).
Schopf, T. R., Nedrebø, B., Hufthammer, K. O., Daphu, I. K. & Lærum, H. How well is the electronic health record supporting the clinical tasks of hospital physicians? A survey of physicians at three Norwegian hospitals. BMC Health Serv. Res. 19, 934 (2019).
Chandrashekar, G., AlQarni, S., Bumann, E. E. & Lee, Y. Collaborative deep learning model for tooth segmentation and identification using panoramic radiographs. Comput. Biol. Med. 148, 105829 (2022).
Sheng, C. et al. Transformer-based deep learning network for tooth segmentation on panoramic radiographs. J. Syst. Sci. Complex. 36, 257–272 (2023).
Zhong, W., Ren, X. & Zhang, H. Automatic X-ray teeth segmentation with grouped attention. Sci. Rep. 15, 64 (2025).
Ali, M. A., Fujita, D. & Kobashi, S. Teeth and prostheses detection in dental panoramic X-rays using CNN-based object detector and a priori knowledge-based algorithm. Sci. Rep. 13, 16542 (2023).
Xu, M. et al. Robust automated teeth identification from dental radiographs using deep learning. J. Dent. 136, 104607 (2023).
Zhu, J. et al. Artificial intelligence in the diagnosis of dental diseases on panoramic radiographs: a preliminary study. BMC Oral Health 23, 358 (2023).
Park, J., Oh, K., Han, K. & Lee, Y. H. Patient-centered radiology reports with generative artificial intelligence: adding value to radiology reporting. Sci. Rep. 14, 13218 (2024).
Büttner, M., Leser, U., Schneider, L. & Schwendicke, F. Natural language processing: chances and challenges in dentistry. J. Dent. 141, 104796 (2024).
Xu, T. et al. Visual analytics of multidimensional oral health surveys: data mining study. JMIR Med. Inform. 11, e46275 (2023).
Su, S., Lipsky, M. S., Licari, F. W. & Hung, M. Comparing oral health behaviours of men and women in the United States. J. Dent. 122, 104157 (2022).
Jamieson, L. et al. Oral health, social and emotional well-being, and economic costs: protocol for the second Australian National Child Oral Health Survey. JMIR Res. Protoc. 12, e52233 (2023).
Luo, L. et al. Contact settings and risk for transmission in 3410 close contacts of patients with COVID-19 in Guangzhou, China. Ann. Intern. Med. 173, 879–887 (2020).
Haworth, S. et al. Tooth loss is a complex measure of oral disease: determinants and methodological considerations. Community Dent. Oral Epidemiol. 46, 555–562 (2018).
Al-Dajani, M. et al. A cohort study of the patterns of third molar impaction in panoramic radiographs in Saudi population. Open. Dent. J. 11, 648–660 (2017).
Shakor, M. Y. & Khaleel, M. I. Recent advances in big medical image data analysis through deep learning and cloud computing. Electronics 13, 4860 (2024).
Kim, M. J., Chae, S. G., Bae, S. J. & Hwang, K. G. Unsupervised few shot learning architecture for diagnosis of periodontal disease in dental panoramic radiographs. Sci. Rep. 14, 23237 (2024).
Adnan, N. et al. Developing an AI-based application for caries index detection on intraoral photographs. Sci. Rep. 14, 26752 (2024).
Liu, J. et al. Periapical lesion detection in periapical radiographs using the latest convolutional neural network ConvNeXt and its integrated models. Sci. Rep. 14, 25429 (2024).
Lee, S., Kim, D. & Jeong, H.-G. Detecting 17 fine-grained dental anomalies from panoramic dental radiography using artificial intelligence. Sci. Rep. 12, 5172 (2022).
Başaran, M. et al. Diagnostic charting of panoramic radiography using deep-learning artificial intelligence system. Oral Radiol. 38, 363–369 (2022).
Alsakar, Y. M. et al. Multi-label dental disorder diagnosis based on MobileNetV2 and Swin transformer using bagging ensemble classifier. Sci. Rep. 14, 25193 (2024).
Rajpurkar, P., Chen, E., Banerjee, O. & Topol, E. J. AI in health and medicine. Nat. Med. 28, 31–38 (2022).
Ooge, J., Stiglic, G. & Verbert, K. Explaining artificial intelligence with visual analytics in healthcare. WIREs Data Min. Knowl. Discov. 12, e1427 (2022).
Abudiyab, N. A., Alanazi, A. T., Abudiyab, N. & Alanazi, A. T. Visualization techniques in healthcare applications: a narrative review. Cureus 14, e31355 (2022).
Kwon, B. C. et al. DPVis: visual analytics with hidden Markov models for disease progression pathways. IEEE Trans. Vis. Comput. Graph. 27, 3685–3700 (2021).
Zheng, F. M. et al. Early childhood caries and dental public health programmes in Hong Kong. Int. Dent. J. 74, 35–41 (2024).
World Health Organization. Global Strategy and Action Plan on Oral Health 2023–2030 (World Health Organization, 2024).
Takeya, A., Watanabe, K. & Haga, A. Fine structural human phantom in dentistry and instance tooth segmentation. Sci. Rep. 14, 12630 (2024).
Vinayahalingam, S. et al. Intra-oral scan segmentation using deep learning. BMC Oral Health 23, 643 (2023).
Ragodos, R. et al. Dental anomaly detection using intraoral photos via deep learning. Sci. Rep. 12, 11577 (2022).
Sukegawa, S. et al. Effectiveness of deep learning classifiers in histopathological diagnosis of oral squamous cell carcinoma by pathologists. Sci. Rep. 13, 11676 (2023).
Sukegawa, S. et al. Training high-performance deep learning classifier for diagnosis in oral cytology using diverse annotations. Sci. Rep. 14, 17591 (2024).
Kim, C., Jeong, H., Park, W. & Kim, D. Tooth-related disease detection system based on panoramic images and optimization through automation: development study. JMIR Med. Inform. 10, e38640 (2022).
Kolarkodi, S. & Alotaibi, K. Artificial intelligence in diagnosis of oral diseases: a systematic review. J. Contemp. Dent. Pract. 24, 61–68 (2023).
Mago, J. & Sharma, M. The potential usefulness of ChatGPT in oral and maxillofacial radiology. Cureus 15, e42133 (2023).
Liao, W. et al. Differentiating ChatGPT-generated and human-written medical texts: quantitative study. JMIR Med. Educ. 9, e48904 (2023).
Huang, H. et al. ChatGPT for shaping the future of dentistry: the potential of multi-modal large language model. Int. J. Oral Sci. 15, 1–13 (2023).
Huang, S. C., Pareek, A., Seyyedi, S., Banerjee, I. & Lungren, M. P. Fusion of medical imaging and electronic health records using deep learning: a systematic review and implementation guidelines. Npj Digit. Med. 3, 1–9 (2020).
Rischke, R. et al. Federated learning in dentistry: chances and challenges. J. Dent. Res. 101, 1269–1273 (2022).
Funding
This work was supported in part by the National Natural Science Foundation of China (No. 62177040 and No. 62277013), the National Statistical Science Research Project (No. 2023LZ035 and No. 2022LY099), and the Public Welfare Plan Research Project of the Zhejiang Provincial Science and Technology Department (No. LTGG23H260003).
Author information
Authors and Affiliations
Contributions
Y.H.: Conceptualization; Data curation; Investigation; Formal analysis; Writing – original draft; Writing – review and editing. T.P.: Conceptualization; Methodology; Supervision; Writing – review and editing. S.Z. and M.H.: Writing – review and editing. Z.Z. and T.X.: Methodology; Project administration; Resources; Validation; Writing – review and editing.
Corresponding authors
Ethics declarations
Competing interests
The authors declare no competing interests.
Ethics approval
The study was conducted following the Declaration of Helsinki and was approved by the Ethics Committee of the First Affiliated Hospital, Zhejiang University School of Medicine (approval no. 20230785). Panoramic radiographs were taken with the patients’ informed consent for therapeutic or diagnostic purposes, and these data could be used for medical research without compromising their privacy.
Additional information
Publisher’s note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License, which permits any non-commercial use, sharing, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if you modified the licensed material. You do not have permission under this licence to share adapted material derived from this article or parts of it. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by-nc-nd/4.0/.
About this article
Cite this article
Hong, Y., Pan, T., Zhu, S. et al. A visualization system for intelligent diagnosis and statistical analysis of oral diseases based on panoramic radiography. Sci Rep 15, 18222 (2025). https://doi.org/10.1038/s41598-025-01733-5