Abstract
In the digital age, hospital websites are essential for providing healthcare information and services. This research introduces WUAHP, an automated tool written in Python that uses BeautifulSoup for HTML parsing to extract the structural and content-based components needed for usability assessment. The tool assesses websites against five principal criteria: navigational efficiency, operational efficiency, accessibility, responsiveness & compatibility, and security, each subdivided into several sub-criteria. Each measure is scored on a scale from 0 (least desirable) to 1 (most desirable) using normalized modules, and the entropy weighting method is used to assign weights objectively according to data variability. Usability scores are then validated through user feedback and aligned with Nielsen’s heuristic usability standards. The tool was applied to fifty healthcare websites. The resulting usability scores showed significant variability, ranging from 12% (HW39) to 97% (HW9), underscoring disparities in design efficacy. WUAHP provides web developers and healthcare providers with an effective method to assess and enhance website usability, and it establishes a basis for future work on training machine learning models for automated, large-scale website assessment.
Introduction
In today’s interconnected society, individuals are increasingly turning to the internet to seek health-related information1,2. The widespread availability of mobile devices and affordable internet access has opened new avenues for healthcare organizations to engage patients through digital platforms for education, information sharing, and service promotion3,4. Nevertheless, despite the increasing reliance on digital health services, previous research assessing healthcare websites has been constrained in both reach and depth.
Much of the current research concentrates exclusively on either technical elements or certain usability features, lacking a holistic evaluation framework that encompasses several dimensions of website performance, user requirements, and accessibility standards. Furthermore, the healthcare industry has historically underutilized one of its most valuable resources: patients themselves5,6. Active participation by patients not only improves health outcomes but also reduces costs for healthcare systems7. Health Information Technology (Health IT), particularly hospital and medical websites, can significantly enhance patient-centered care by supporting shared decision-making, improving communication, facilitating access to medical information, and encouraging healthier lifestyles8,9,10,11. These websites are increasingly expected to serve as comprehensive platforms that provide essential information about services, medical staff, procedures, and health education to patients, their families, and the wider public12,13,14,15. Despite the extensive availability of health-related online content (4.5% of global internet searches are health-related), hospital and medical websites have fallen short in terms of accessibility, inclusion, and engagement. Numerous contemporary usability evaluation tools (e.g., WebSAT, Bobby, WAMMI) evaluate only discrete components and fail to incorporate critical characteristics such as accessibility, responsive design, security, and content relevancy. This disjointed methodology has constrained the practical applicability of current research in enhancing healthcare website design and user experience. Consequently, significant usability issues persist, eroding consumer trust and the credibility of online medical institutions. This study presents the WUAHP tool, an innovative and thorough evaluation system tailored for hospital and medical websites.
WUAHP integrates essential usability dimensions (Navigation Efficiency, Operational Efficiency, Accessibility, Responsiveness & Compatibility, and Security) into a cohesive instrument that facilitates a comprehensive, objective, and replicable evaluation. This tool addresses the deficiencies of previous methods and provides practical guidance for designers, developers, and administrators in the creation of user-centric, accessible, and secure digital platforms within the healthcare sector. This work holds significant value for both the scientific community and the healthcare sector. It enhances the body of knowledge on user experience and usability assessment by offering a validated paradigm for health-related web interfaces. It allows healthcare organizations to evaluate and consistently enhance their websites, thereby improving user experience, fostering health literacy, and ultimately advancing public health outcomes. As healthcare increasingly digitizes, it is imperative to guarantee the quality and usability of online platforms, not only for organizational credibility but also to enable users to make informed health decisions within an accessible and reliable digital environment.
Significance of the work
This study is relevant from two different viewpoints. (1) The present study presents an automated tool that takes parameter and sub-parameter measurements to assess the attributes of a website. The tool was developed utilizing HTML parsing with normalized modules. The implementation of the proposed tool for healthcare websites will enhance the usability of these platforms. (2) This study aims to propose parameters based on the user’s perspective. Thus, through the application of the recommended usability parameters, users can acquire the necessary information with greater efficiency and effectiveness. This study first examines the evaluation criteria used to assess website usability before proceeding with the assessment. After a comprehensive study of these criteria, a collection of relevant parameters based on the user’s experience is developed. The proposed parameters enhance the current usability standards by incorporating additional aspects, thereby facilitating the evaluation of healthcare websites. In order to assess the efficacy of the proposed tool and ascertain its merits and demerits, we selected fifty prominent healthcare websites for testing purposes.
The structure of the paper is organized as follows: Sect. 2 reviews related work in the field of healthcare website usability. Section 3 details the proposed framework. Section 4 presents the testing of the tool on various healthcare websites. Section 5 discusses the results and key findings. Section 6 compares the developed tool with existing automated usability evaluation tools. Section 7 aligns Nielsen’s Heuristic Principles with the parameters of the WUAHP tool. Section 8 explores the practical implications for healthcare website design. Section 9 outlines the study’s limitations, and Sect. 10 concludes with the main findings and future research directions.
Related work
This research aims to standardize website analysis in the healthcare industry, an area that has so far been overlooked despite its importance in other industries16. Recent technical progress has reduced the cost of medicine and raised the quality of care17. An essential component of this progress is usability studies, which enable healthcare organizations to enhance their online presence through their websites. The COVID-19 pandemic has increased public awareness of the value of technology in healthcare, making medical facilities’ web presence even more crucial to the effective sharing of health information. Websites play a crucial role in various aspects of life and are essential for organizations due to their widespread use and impact. They have been a consistent focus of research across various fields and have been extensively examined in the e-commerce literature. Previous research has highlighted the scarcity of studies that examine several parameters related to website design, implementation, and organization. A significant number and variety of factors are associated with website success, yet limited research has been conducted on the combination of these factors and services. Many current studies focus on a restricted set of quality factors or are specific to a particular web service18. Previous research has produced various models and frameworks for evaluating the quality and performance of websites, but only a limited number of works introduce an automated tool for measuring quality metrics. While the presented approaches are precise, the number of parameters or metrics that can be modelled mathematically is very limited. In their study, Michaud et al.19 examined the implementation and evaluation of an internet health website for adolescents in Switzerland, with a primary focus on health-related topics.
The study outlines the procedure for establishing the site and an initial assessment conducted through the use of two questionnaires. N.B. Teo et al.20 utilized an interactive web-based questionnaire to assess a breast cancer website and detailed their findings and results. They also described optimal solutions for hospital websites, drawing on their involvement in a website project founded on the strategic principles of requirements elicitation and requirements analysis. They discuss the elicitation approach that was used, the unique characteristics of negotiation, the domain-specific social and cultural aspects involved, the issues that arose during the development of the website, and the solutions that were found for those issues. Elizabeth Sillence et al.21 conducted a study on trustworthy health websites focusing on hypertension. The primary objective of that study was to investigate the factors that influence the reliability of medical online advice. A proposed set of guidelines outlines the development of trust in health websites while also examining the key distinctions between interpersonal interaction and web-based systems. Dohoon Kim et al.22 explained essential functional characteristics for developing and managing health information websites to enhance user satisfaction. The article aims to provide a technical perspective on the design and functionality of health information websites. In their work, Vangelis G. Alexiou et al.23 developed a web portal intended for the global medical community. This portal serves as a central platform for sharing high-quality educational resources available on the World Wide Web. The portal offers access to over 800 educational web pages and over 2100 clinical practice regulations.
Maaike Van Den Haak et al.24 conducted a study on assessing consumer health information websites, emphasizing the significance of gathering observational, user-driven data. The focus has been on the usability of these websites and the discussion of methodological limitations in current usability studies. Furthermore, an examination is conducted on the impact of user characteristics in assessing consumer health information websites. Moreover, Nicola Reavley et al.25 investigated the standard of websites that provide information on mental disorders. Moreno et al.2 introduced a qualitative and user-centric methodology for evaluating the quality of health-related websites using a 2-tuple fuzzy linguistic strategy. Qualitative research was conducted using the focus group approach to determine the quality criteria set. The measurement method produces linguistic quality assessments based on visitors’ judgments regarding quality criteria. Implementation of the linguistic judgments is achieved without any loss of information through the application of a 2-tuple linguistic weighted average operator. This methodology represents an enhancement in the quality evaluation of health websites by prioritizing user-centric approaches. In their study, Duan et al.26 introduced automated verification techniques for website maintenance. They utilized algebraic reasoning and model checking on the abstract navigational behavior of evolving web applications represented in labelled transition systems. This was done to assess the applications against the desired characteristics expressed in temporal logic calculations combined with tree automata.
Usability parameters suggested by various researchers
The implementation of a proficient website design methodology is essential not only for the development of a functional website but also for the assessment of its methods and techniques. As a result, numerous researchers have attempted to identify the parameters and sub-parameters of web usability, which are detailed in (Table 1). Health-ITUEM is evidence-based and integrates elements from established usability frameworks. This study27 aimed to investigate the effectiveness of Health-ITUEM in evaluating the usability of mHealth technology. The framework was applied to two separate data sets. Health-ITUEM offers a new framework for understanding usability issues in mHealth technology. The study illustrated the adaptability, strength, and limitations of this model. The Health-ITUEM framework improves mHealth technology evaluation and encourages efficient tool utilization. An examination of mHealth applications was undertaken to derive a comprehensive classification. A survey was developed with the help of psychologists to measure the quality of experience (QoE). The tool was evaluated with a sample of applications selected according to the classification acquired. The tool aims to assist developers in assessing the quality of their healthcare apps by identifying strengths and areas for enhancement, thus mitigating the release of substandard apps28. The study29 demonstrates the development and use of assessment criteria to differentiate quality variations among pain apps. Apps in health settings, like pain management, can be developed and marketed without regulation or guidance, raising concerns given the high user motivation to find effective interventions. Elevating public awareness of quality standards for health promotion apps is essential and is anticipated to foster innovation within the industry. The study30 included a literature review and the development of a tool with nine features categorized as security risks and safety measures.
This tool assesses security risks and safety. The study31 employed quality criteria based on the HON code to assess the quality of asthma self-management apps. The criteria comprise eight best-practice principles concerning attribution, transparency of information, and traceability. The study32 integrated a checklist for evaluating apps concerning chronic diseases based on peer-reviewed studies and checklists. The authors assessed face and construct validity. The study33 outlines the creation of an assessment tool for scoring the functionalities of apps focused on tuberculosis prevention and treatment. The Institute for Healthcare Informatics defined the tool as having seven functionality criteria and four subcategories. The study34 presents the Mobile Application Rating Scale (MARS) as a tool for categorizing and assessing the quality of mHealth apps. The tool was developed through a comprehensive review of established guidelines and evaluation tools for websites, as well as input from an expert panel. The tool assesses app quality based on four dimensions. The components are rated on a 5-point scale from “inadequate” to “excellent.” The study35 included six reviewers who assessed 20 apps using 22 measures. The Anxiety and Depression Association of America (ADAA) website and the PsyberGuide website were the primary sources for the mental health app scale. The ADAA website provided five measures on a five-point scale, whereas the PsyberGuide website offered seven measures. Participants are queried regarding the attributes of child- and adolescent-friendly websites. The author utilized a 12-point website assessment tool to analyze 13 websites designed for children and adolescents36. Program directors were tasked with developing optimal website practices.
The author suggests that web-based multimedia, including video, icons, and supporting images, has a positive impact on website success. Incorporating embedded media enhances website usability. Accessible information enables convenient data location and identification. The issues requiring attention are data retrieval, search functionality, and navigation39. The study aimed to develop the Health Content Website Evaluation Tool (HIWET) to assess online content quality and to evaluate the reliability, validity, and usefulness of HIWET. HIWET was developed through small-scale pilot testing. The psychometric properties of 20 neck pain websites were evaluated for reliability using the intra-class correlation coefficient (ICC).
Two website user interfaces were developed in the study using interaction design models (IDM) and computer science (CS)41. An evaluation study was conducted in a controlled laboratory environment to illustrate the variability in usability across different design techniques. The study found that IDM is a more effective design technique than CS for enhancing the usability of an eHealth website.
Previous techniques and methods used
Researchers have evaluated websites using various criteria and methods. Law et al.42 categorized these into automated, user judgment, mathematical, and combined approaches. Key techniques include heuristic evaluations, user questionnaires, statistical analysis, linear programming, and fuzzy logic. Evaluators are typically either experts (developers, domain users) or novice users. While automated methods can be applied pre-deployment, others require live websites. Evaluation criteria vary based on the evaluation’s purpose, including usability, security, and aesthetics. Limited studies assess websites across domains. Notable models include WebQEM43, 2QCV3Q44, WebQM45, and modular strategies by Mich46. This study addresses gaps in domain-wide evaluation tools, limited scalability, and the need for HTML-based automated usability assessment.
- Previous studies have not deeply examined the navigation aspect of healthcare website usability, despite its recognized importance by Jabar et al.47. This study introduces an automated usability evaluation tool with five key parameters (Navigational Efficiency, Operational Efficiency, Accessibility, Responsiveness & Compatibility, and Security), each with measurable HTML-based sub-parameters. The tool is designed to assess any healthcare website and enhance usability and user satisfaction.
- Many prior studies evaluated only a few websites, often country-specific, limiting the generalizability of findings. This study addresses that gap by analyzing fifty healthcare websites from various countries to uncover broader usability issues.
- Due to the diverse objectives and criteria across existing literature, usability evaluation remains inconsistent. While some models are universal, others are domain-specific, and only a few rely on HTML code for analysis. This study’s tool fills that gap by offering a scalable, automated approach to usability evaluation based on structural HTML features.
Proposed framework
Building on extensive research and practical insights, our proposed framework brings together key features, characteristics, and evaluation metrics specifically tailored for hospital and medical websites. The goal is to provide a comprehensive collection of metrics and approaches capable of representing a successful hospital or medical website. The tool is well suited to comparing the quality of hospital and medical websites, both with one another and with sites from other domains. It can also help clarify methods for enhancing these qualities and offers designers and developers a detailed guideline for implementing this group of websites.
Research methodology
The research approach used in this study is shown in (Fig. 1). A review of the relevant literature is conducted to begin the analysis of website usability issues. This includes the concept of parameters and techniques for optimizing website usability. The authors proposed five parameters to enhance user accessibility to websites. These factors were chosen because research shows that they are an efficient way to assess customer satisfaction with a website22. A systematic approach was used to design the automated tool. To develop the tool, we applied web parsing techniques. Web parsing involves the automated extraction of data from websites. Many libraries and frameworks in various programming languages can collect data from websites, but Python is widely regarded as the most popular choice for web scraping or web parsing. After completing the development of the tool, we conducted individual evaluations of fifty websites from the healthcare domain.
The suggested usability parameters for the design of the website
Building on the research approach, this study aims to investigate the critical parameters for evaluating web usability, proposing five key parameters for comprehensive website assessment. Website usability plays a critical role in determining user satisfaction, engagement, and task completion. Based on core principles of Human-Computer Interaction (HCI), the following classification organizes essential usability parameters into five key categories: Navigation Efficiency, Operational Efficiency, Accessibility, Responsiveness & Compatibility, and Security. Each parameter contributes to the overall performance and user experience of a website. All of the parameters and sub-parameters are listed in detail in (Table 2).
Methodology employed in the tool’s development
To implement the proposed usability assessment, the web parsing technique was employed in the development of the WUAHP (Website Usability Assessment using HTML Parsing for Healthcare Websites) tool. Specifically, a Python package called Beautiful Soup is utilized to achieve this objective.
Website parsing technique
Web parsing, also known as screen parsing or web harvesting, is an automated method for extracting valuable data from website HTML and storing it locally. It can be tailored to specific sites or work universally. The primary function of a web parser is to organize unstructured content into a structured format using tools like HTML parsers, DOM interpreters, and HTTP protocols. As shown in (Fig. 2), the process involves three main phases:
- Fetching: Retrieving the target web page via the HTTP protocol.
- Extraction: Isolating relevant information using regular expressions and parsing frameworks.
- Parsing: Converting the extracted data into a structured format for storage or presentation, enabling informed decisions by developers.
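As a rough illustration of these three phases, the following Python sketch separates fetching, extraction, and parsing into distinct functions. It is a minimal, hypothetical example under our own assumptions: the function names and the simple regex-based title extraction are illustrative, not the WUAHP implementation, and a static HTML string stands in for a fetched page so the example is self-contained.

```python
import re
import urllib.request


def fetch(url, timeout=10):
    """Phase 1 - Fetching: retrieve the raw HTML of the target page over HTTP."""
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        charset = resp.headers.get_content_charset() or "utf-8"
        return resp.read().decode(charset, errors="replace")


def extract_title(html):
    """Phase 2 - Extraction: isolate one piece of content with a regular expression."""
    match = re.search(r"<title[^>]*>(.*?)</title>", html, re.I | re.S)
    return match.group(1).strip() if match else ""


def parse_to_record(html, url):
    """Phase 3 - Parsing: shape the extracted pieces into a structured record."""
    return {
        "url": url,
        "title": extract_title(html),
        "anchor_count": len(re.findall(r"<a\b", html, re.I)),
    }


# In practice the HTML would come from fetch(url); a static string keeps the
# example self-contained and reproducible.
sample = ("<html><head><title> City Clinic </title></head>"
          "<body><a href='/'>Home</a><a href='/care'>Care</a></body></html>")
record = parse_to_record(sample, "https://example.org")
print(record["title"], record["anchor_count"])  # City Clinic 2
```

In a full pipeline, the structured records produced in the last phase would be stored and fed to the per-parameter scoring modules.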
Web-parsing library selection
In this work, Python was selected for parsing due to its popularity and available libraries. Key tools include Beautiful Soup for HTML parsing, Scrapy for web scraping with JavaScript support, Selenium for browser automation, regular expressions for pattern matching, and Lxml for fast XML/HTML parsing. The comparison of Python’s libraries is presented in (Table 3).
Using beautiful soup for parsing
Based on the data presented in Table 3, Beautiful Soup compares favorably with the other web-parsing options for this task. To obtain the desired outcomes in our research, we therefore used Python’s Beautiful Soup library to analyze healthcare websites. To get started, create a new Python file and open it in any web-based integrated development environment (IDE), such as Jupyter Notebook. Figure 3 demonstrates the process of parsing with Beautiful Soup.
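A minimal sketch of this workflow is shown below. The sample HTML and the specific checks (page title, link targets, images missing alt text) are illustrative assumptions rather than the tool’s actual code; Beautiful Soup’s built-in html.parser backend is used so no additional parser needs to be installed.

```python
from bs4 import BeautifulSoup  # pip install beautifulsoup4

# A tiny stand-in for a fetched hospital page (illustrative only).
sample = """
<html><head><title>City Hospital</title></head>
<body>
  <nav><a href="/departments">Departments</a> <a href="/contact">Contact</a></nav>
  <img src="logo.png" alt="City Hospital logo">
  <img src="banner.jpg">
</body></html>
"""

soup = BeautifulSoup(sample, "html.parser")

# Structural elements a usability checker might inspect:
title = soup.title.string                                   # page title
links = [a["href"] for a in soup.find_all("a", href=True)]  # navigation targets
missing_alt = [img for img in soup.find_all("img") if not img.get("alt")]

print(title)             # City Hospital
print(links)             # ['/departments', '/contact']
print(len(missing_alt))  # 1  (a flag for the accessibility parameter)
```

Each extracted element maps naturally onto one of the tool’s parameters; for example, images without alt text would lower an accessibility sub-parameter score.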
Websites chosen for usability evaluation
Following the implementation of the web parsing technique, the WUAHP tool demonstrates compatibility with a wide range of healthcare websites. Table 4 presents 50 websites for the tool’s realization. Healthcare websites are assigned codes ranging from HW1 to HW50.
Testing
The developed evaluation tool is named WUAHP (Website Usability Assessment using HTML Parsing). The tool was implemented on fifty healthcare websites to validate its functionality. Table 5 displays the assessed critical values for the healthcare websites. Table 6 presents a detailed evaluation of healthcare websites based on five critical usability factors: navigation, operational efficiency, accessibility, responsiveness, and security. Parameters are assigned values between zero and one to evaluate each website. Table 7 highlights a comprehensive review of healthcare websites, focusing on their usability strengths and weaknesses. Figure 4a–f display the computed measures for the healthcare domains as heatmaps. Computed values closer to 0 indicate poor usability, while values closer to 1 indicate excellent usability for the corresponding metric.
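To make the 0-to-1 scoring concrete, here is a hedged sketch of how one hypothetical sub-parameter, the share of anchor tags carrying a usable link target, could be normalized into that range. The metric, class name, and placeholder rules are our own illustration; the actual sub-parameter definitions used by WUAHP are those given in Table 2.

```python
from html.parser import HTMLParser


class LinkAudit(HTMLParser):
    """Count anchor tags and how many carry a usable href."""

    def __init__(self):
        super().__init__()
        self.total = 0
        self.usable = 0

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        self.total += 1
        href = dict(attrs).get("href") or ""
        # Treat empty targets and pure placeholders as unusable.
        if href and not href.startswith("#") and href != "javascript:void(0)":
            self.usable += 1


def link_score(html):
    """Score in [0, 1]: 1.0 when every anchor has a usable target."""
    audit = LinkAudit()
    audit.feed(html)
    return audit.usable / audit.total if audit.total else 0.0


page = ('<a href="/about">About</a>'
        '<a href="#">Menu</a>'
        '<a href="https://example.org">External</a>')
print(round(link_score(page), 2))  # 0.67 -> two of three anchors are usable
```

A score near 1 would feed into the Navigational Efficiency column of Table 6, while a score near 0 would flag the site for link repairs.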
Determining weights for usability parameters using the entropy weighting method
Following the establishment of the WUAHP tool’s broad applicability, this study recognizes that the five key parameters of healthcare website usability, Navigational Efficiency (A1), Operational Efficiency (A2), Accessibility (A3), Responsiveness and Compatibility (A4), and Security (A5), hold differing levels of importance; their weights were therefore assigned accordingly rather than being treated equally. We utilized the Entropy Weighting Method, an objective, data-driven technique frequently applied in multi-criteria decision-making (MCDM) processes, to determine weights based on the intrinsic information content of each parameter presented in (Table 6). The Entropy Weighting Method assesses the extent of dispersion or variability of the data within each criterion. A parameter with greater variability provides more information and is therefore deemed more significant. This approach mitigates human bias in the allocation of subjective weights, ensuring that factors with greater informational differentiation receive greater significance in the assessment process.
Steps followed in the entropy weighting method
- Data normalization: The raw values of each parameter were standardized by min-max normalization to a uniform scale ranging from 0 to 1.
- Entropy calculation: The entropy of each parameter was calculated as Ej = −k Σi pij ln(pij), where pij is the proportion of the ith value in parameter j and k = 1/ln(n), for n websites, is a constant ensuring 0 ≤ Ej ≤ 1.
- Degree of diversification: The degree of diversification was computed as dj = 1 − Ej. A larger dj reflects greater variability within the parameter and therefore a higher information value.
- Weight calculation: The weight of each parameter was obtained by normalizing its diversification value: wj = dj / Σj dj.
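The steps above can be sketched in a few lines of Python. This is an illustrative reimplementation of the standard entropy weighting procedure, not the WUAHP source code, and the small score matrix at the end is made up purely for demonstration.

```python
import math


def entropy_weights(matrix):
    """Entropy weighting: rows = websites, columns = usability parameters.

    Returns one objective weight per column; the weights sum to 1.
    """
    n = len(matrix)
    k = 1.0 / math.log(n)  # constant ensuring 0 <= E_j <= 1

    diversification = []
    for col in zip(*matrix):
        # Step 1: min-max normalization of the column.
        lo, hi = min(col), max(col)
        span = (hi - lo) or 1.0  # guard against a constant column
        norm = [(v - lo) / span for v in col]

        # Step 2: entropy E_j = -k * sum(p_ij * ln p_ij), with 0*ln(0) = 0.
        total = sum(norm) or 1.0
        p = [v / total for v in norm]
        e = -k * sum(pi * math.log(pi) for pi in p if pi > 0)

        # Step 3: degree of diversification d_j = 1 - E_j.
        diversification.append(1.0 - e)

    # Step 4: normalize the diversification values into weights.
    total_d = sum(diversification) or 1.0
    return [d / total_d for d in diversification]


# Hypothetical scores for four websites on two parameters.
scores = [[0.9, 0.4], [0.2, 0.8], [0.6, 0.6], [0.8, 0.2]]
w = entropy_weights(scores)
print([round(x, 3) for x in w])  # two non-negative weights summing to 1
```

Because the weights derive entirely from the score matrix, re-running the procedure on a different set of websites automatically adjusts the weights to that data, with no manual tuning.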
Results and discussions
Usability assessment for healthcare websites
The A1 (Navigational Efficiency) heatmap of the leading 50 healthcare websites depicted in Fig. 4a demonstrates a predominantly robust performance: eighteen sites (36%) achieve an Excellent rating (≥ 0.80), with HW14 leading at 0.93, followed by notable performers such as HW3 (0.91), HW8 (0.86), and HW38 (0.87). A further twenty-one sites (42%) are categorized as Good (0.60–0.79), indicating robust and user-friendly navigation. A smaller cohort of eight sites (16%) receives a Fair rating (0.40–0.59), such as HW11 (0.45) and HW32 (0.56), indicating moderate usability challenges. Only three outliers, HW17 (0.30), HW39 (0.20), and HW47 (0.29), fall into the Poor category (< 0.40), signifying distinct areas requiring redesign.
Significantly, both high and low performers are dispersed throughout the grid rather than concentrated, highlighting that proficiency (and deficiency) in navigation is uniformly distributed across this collection of healthcare institutions. In summary, although almost 80% of these sites provide good to excellent navigational efficiency, the few underperformers should be prioritized for specific enhancements.
Figure 4b displays a heatmap illustrating the operational efficiency (A2) scores for 50 healthcare websites, organized in a 5 × 10 matrix style. Each cell represents a specific website (HW1 to HW50) and exhibits its A2 score, which ranges from 0 to 1, accompanied by a color-coded classification based on established performance thresholds. The color system classifies the scores into four categories: Excellent (0.80–1.00) in green, Good (0.60–0.79) in yellow, Fair (0.40–0.59) in orange, and Poor (0.00–0.39) in red. This graphic depiction facilitates the rapid recognition of performance discrepancies among websites. Although several websites attain exceptional operational efficiency (e.g., HW12, HW25, HW50), a significant proportion reside in the Good and Fair categories, indicating moderate usability. Several websites, including HW42 and HW39, exhibit low scores, signifying significant problems that could obstruct seamless user engagement. The heatmap functions as a diagnostic instrument to assess and contrast the operational efficacy of healthcare websites, enabling focused usability improvements.
The heatmap shown in Fig. 4c visually represents the accessibility scores (A3) of 50 healthcare websites. A majority of the websites fall under the Excellent category, indicating strong accessibility standards. Several websites are marked as Good, suggesting they perform reasonably well but have room for improvement. A few websites, such as HW15 and HW43, fall into the Poor category, signalling a critical need for accessibility enhancements. Overall, the heatmap serves as an effective tool for evaluating and comparing the accessibility of top healthcare websites.
The heatmap shown in Fig. 4d illustrates the A4 scores, representing Responsiveness & Compatibility, for the top 50 healthcare websites.
The image clearly demonstrates that most websites are classified as Excellent, indicating robust responsiveness and device compatibility. A limited number of websites are classified as Good or Fair, with no websites falling within the Poor category. This color-coded presentation facilitates a rapid and intuitive evaluation of each website’s performance regarding A4 usability, aiding in the identification of both high achievers and those requiring enhancement.
The heatmap shown in Fig. 4e displays the A5 usability scores of the top 50 healthcare websites, arranged in a 5 × 10 grid. Each cell includes the website code and its associated A5 score, offering a distinct comparison analysis of each site’s performance in this particular usability metric. The graphic design facilitates rapid recognition of high and poor scoring websites through color intensity. A score of 1.00, indicative of superior performance, is attained by multiple websites, including HW3, HW12, HW14, HW18, HW24, among others. The minimum score is 0.50, noted in websites like HW6, HW30, and HW49, indicating a potential need for enhancements in usability for the A5 element.
In the heatmap presented in Fig. 4f, the peak final usability score is 0.97, attained by HW9, signifying an Excellent degree of usability. This indicates that the website excels in user experience and interface design according to the assessed criteria. Conversely, the minimum score of 0.12, attributed to HW39, categorizes it as Poor. This indicates substantial usability challenges, such as inadequate navigation, accessibility deficiencies, or an unwelcoming user interface. These extreme values underscore the disparity in usability quality among the examined healthcare websites.
(a) Heatmap showing A1 usability scores of top 50 healthcare websites with annotated website codes and corresponding scores. (b) Heatmap showing A2 usability scores of top 50 healthcare websites with annotated website codes and corresponding scores. (c) A3 score heatmap of top 50 healthcare websites, color-coded by accessibility performance ranging from excellent (green) to poor (red). (d) Heatmap showing A4 (responsiveness & compatibility) scores for the top 50 healthcare websites. (e) Heatmap showing A5 usability scores of top 50 healthcare websites with annotated website codes and corresponding scores. (f) Heatmap showing final usability scores of top 50 healthcare websites, categorized by performance levels (poor to excellent) with annotated website codes and scores.
Analysis of the proposed tool compared with existing automated usability tools
Table 8 compares the proposed WUAHP tool with existing website usability tools such as WAMMI, WebSAT, Bobby, Protocol Analysis, Google PageSpeed, Lighthouse, WAVE, BrowserStack, SecurityHeaders.com, and SSL Labs across five key usability metrics: Navigation Efficiency, Operational Efficiency, Accessibility, Responsiveness & Compatibility, and Security48,49,50,51,52,53,54.
- Navigation efficiency: WUAHP ensures organized menus, functional search, active links, and intuitive navigation, outperforming tools that offer only limited or basic support.
- Operational efficiency: WUAHP comprehensively covers contact information, email access, image optimization, load speed, and security headers; other tools address only some of these aspects.
- Accessibility: WUAHP provides full support for alt text, ARIA features, visual hierarchy, and color contrast, exceeding tools that offer only partial compliance checks.
- Responsiveness & compatibility: WUAHP, along with BrowserStack and Lighthouse, verifies full device and screen adaptability.
- Security: WUAHP performs thorough HTTPS and security header assessments, surpassing tools such as SSL Labs that offer partial coverage.
Overall, WUAHP delivers a comprehensive and detailed usability evaluation, outperforming existing tools that tend to focus on specific areas or provide partial coverage. This makes WUAHP a superior choice for holistic website usability assessment.
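As an illustration of the kind of structural checks such a parser-based tool can perform, the sketch below extracts a few navigation-related indicators from raw HTML with BeautifulSoup. This is a minimal sketch in the spirit of WUAHP's parsing module, not the published implementation; the feature names and the "placeholder link" heuristic are assumptions.

```python
from bs4 import BeautifulSoup

def navigation_features(html: str) -> dict:
    """Return simple indicators for navigation-related elements (illustrative)."""
    soup = BeautifulSoup(html, "html.parser")
    links = soup.find_all("a", href=True)
    active = sum(a["href"] not in ("", "#") for a in links)
    return {
        # 1 if a <nav> element or an element with role="navigation" exists
        "has_nav_menu": int(bool(soup.find("nav")
                                 or soup.find(attrs={"role": "navigation"}))),
        # 1 if a search input or a role="search" landmark exists
        "has_search": int(bool(soup.find("input", attrs={"type": "search"})
                               or soup.find(attrs={"role": "search"}))),
        "link_count": len(links),
        # Empty or '#' hrefs are treated as inactive placeholders (assumed heuristic)
        "active_link_ratio": active / len(links) if links else 0.0,
    }

sample = '<nav><a href="/about">About</a><a href="#">Menu</a></nav><input type="search">'
print(navigation_features(sample))
```

In a full pipeline, indicators like these would be normalized to the tool's 0–1 scale before weighting.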
Comparison of Nielsen’s heuristic principles with WUAHP tool parameters
The WUAHP tool incorporates Nielsen’s usability heuristics by ensuring users stay informed through fast page load times, providing intuitive navigation menus aligned with real-world expectations, and enabling user control via effective search and easy navigation. It maintains consistency with uniform menus and working links while preventing errors through broken link checks and security validations. The tool supports recognition over recall with clear headings and navigation, enhances efficiency with image optimization and responsive design, and promotes minimalist aesthetics through thoughtful color use. Additionally, accessible contact information offers reliable help and documentation. Table 9 illustrates the connection between these heuristics and website usability features.
Tool evaluation and enhancement through user-centered feedback
Each healthcare website (HW1–HW100) was assessed on five usability parameters, Navigation (A1), Operational Efficiency (A2), Accessibility (A3), Responsiveness & Compatibility (A4), and Security (A5), to compute a usability score. User feedback was also gathered, and an Adjusted Usability Score combined the objective and subjective data into a balanced metric. Table 10 presents the usability parameter scores (A1–A5), user feedback, and adjusted usability scores for the 100 healthcare websites. Appendix A contains the questionnaire used to collect user feedback.
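The paper does not publish the exact formula used to blend the system-computed score with user feedback. A minimal sketch, assuming a simple convex combination with an illustrative weight `alpha` (not the paper's value), could look like this:

```python
def adjusted_usability(system_score: float, user_feedback: float,
                       alpha: float = 0.7) -> float:
    """Blend the tool's objective score with normalized user feedback.

    `alpha` (the weight on the system score) is illustrative; the paper
    does not publish the exact blending ratio.
    """
    if not (0.0 <= system_score <= 1.0 and 0.0 <= user_feedback <= 1.0):
        raise ValueError("scores must be normalized to [0, 1]")
    return alpha * system_score + (1 - alpha) * user_feedback

# e.g. a site scoring 0.90 objectively but only 0.60 with users
print(round(adjusted_usability(0.90, 0.60), 3))  # -> 0.81
```

A convex combination keeps the adjusted score on the same 0–1 scale as the inputs, so it can be compared directly against the A1–A5 parameter scores.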
Consistency between system-based evaluation and user feedback
Most websites, such as HW12, showed strong alignment between technical scores and user satisfaction.
Discrepancies between system metrics and user perceptions
Some sites (e.g., HW57) had high technical scores but low user feedback, while others (e.g., HW53) scored low technically but were rated highly by users.
Cases of low performance across both metrics
Websites like HW100 underperformed on both scores, though exceptions like HW95 had low system scores but relatively high user feedback.
Parameter-specific observations
Accessibility (A3) and Responsiveness & Compatibility (A4) correlated more strongly with user feedback than Security (A5), suggesting that users perceive front-end qualities such as accessibility and responsiveness more directly than back-end safeguards.
Overall insights
Approximately 70% of websites exhibited agreement between system and user evaluations, validating the Adjusted Usability Score as a reliable usability benchmark.
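One way to quantify the agreement described above is a per-parameter Pearson correlation between system scores and user feedback. The sketch below uses toy data (not the study's) to show the computation; a parameter whose scores track feedback closely yields a coefficient near 1, while a nearly constant parameter does not.

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Toy illustration: per-parameter scores for five sites vs. user feedback.
scores = {
    "A3": [0.9, 0.4, 0.8, 0.3, 0.7],   # varies with feedback
    "A5": [0.9, 0.9, 0.8, 0.9, 0.8],   # nearly constant
}
feedback = [0.85, 0.45, 0.75, 0.35, 0.70]
for param, vals in scores.items():
    print(param, round(pearson(vals, feedback), 2))
```

On this toy data, A3 correlates almost perfectly with feedback while A5 does not, mirroring the qualitative observation in the text.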
Practical implications for healthcare web design
The results of this usability test offer significant value for web designers and developers, especially in the healthcare sector where accessibility, performance, and security are paramount. The parameter-specific outcomes provide a diagnostic framework that may be utilized to direct focused design and development activities.
- Navigation efficiency: Websites with inadequate ratings in this area generally display deficient information architecture, erratic menu structures, or excessive navigation depth. The findings indicate a need to apply user-centered design principles, including hierarchical navigation models, persistent navigation menus, and improved content labeling, to promote intuitive user experiences.
- Operational efficiency: Performance-related shortcomings were frequently linked to long page load times, unoptimized media assets, and interaction lags. Technical improvements such as asynchronous loading, content delivery network (CDN) integration, image compression, and script minification are advisable to improve responsiveness and reduce latency.
- Accessibility: Websites with low accessibility ratings often violated WCAG 2.1 compliance guidelines. Designers and developers should use semantic HTML, ARIA attributes, keyboard accessibility, and high-contrast themes to achieve inclusive design. Automated accessibility testing tools can be integrated into the development pipeline to ensure ongoing compliance.
- Responsiveness & compatibility: Websites that received low scores in this category showed inconsistencies in layout rendering and functionality across devices and browsers. Adopting responsive frameworks (such as Bootstrap or CSS Grid) and thorough cross-browser/device testing is crucial for achieving functional consistency across diverse user scenarios.
- Security: Websites with inadequate encryption protocols or without demonstrated data protection measures received poor scores on this criterion. Implementing HTTPS with SSL/TLS, ensuring secure session management, and clearly displaying privacy policies and terms of service are essential for bolstering user trust and safeguarding sensitive information.
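The security recommendations above can be partially automated. The sketch below scores a site's transport security and header hygiene from a URL and its response headers; the header names are real HTTP security headers, but the equal 50/50 weighting is an assumption for illustration, not the paper's scoring scheme.

```python
# Common HTTP response security headers (real header names).
EXPECTED_HEADERS = (
    "strict-transport-security",
    "content-security-policy",
    "x-content-type-options",
    "x-frame-options",
)

def security_score(url: str, response_headers: dict) -> float:
    """Score 0-1 from HTTPS usage plus presence of common security headers."""
    headers = {k.lower() for k in response_headers}
    https = 1.0 if url.startswith("https://") else 0.0
    present = sum(h in headers for h in EXPECTED_HEADERS) / len(EXPECTED_HEADERS)
    # Equal split between transport security and header hygiene (assumed).
    return 0.5 * https + 0.5 * present

print(security_score("https://example.org",
                     {"Strict-Transport-Security": "max-age=63072000",
                      "X-Frame-Options": "DENY"}))  # 0.5 + 0.25 = 0.75
```

In practice the headers dict would come from a live HTTP response (e.g. via the `requests` library); the scoring logic itself needs no network access and is easy to unit-test.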
The parameter-specific numeric scores, along with user validation and weighted importance, furnish developers with a prioritized framework for improving usability. This evaluation approach facilitates iterative design enhancement while conforming to the regulatory and ethical norms that govern digital health platforms. This study enhances the creation of healthcare websites by converting usability ratings into technical specifications, resulting in improved functionality, accessibility, and security.
Limitations
The current study employs a structured evaluation methodology augmented by parameter weighting and user validation; however, certain methodological and practical constraints merit attention.
- Due to the evolving nature of web platforms, where interface designs, functionality, and security protocols are regularly updated, the usability scores represent a snapshot at a single point in time. This temporal constraint may limit the long-term relevance of the findings.
- The study examines five principal usability dimensions: Navigational Efficiency, Operational Efficiency, Accessibility, Responsiveness & Compatibility, and Security. Supplementary factors such as content relevancy, user engagement, and personalization were excluded, potentially overlooking other elements that affect overall usability.
- The evaluation employs a combination of automated tools and manual assessment techniques. While these techniques are considered industry-standard, they may not fully capture the contextual and experiential dimensions of usability, especially under varied user conditions such as low-bandwidth environments or reliance on assistive technology.
- User validation was employed to improve the evaluation's trustworthiness; nonetheless, the subjectivity of user perception (shaped by digital literacy, prior experience, and cultural expectations) may introduce variability into the assessment results.
Conclusion and recommendations
Several studies in the literature examine website usability evaluation; nonetheless, inconsistency persists owing to domain-specific objectives and diverse evaluation aims. This discrepancy results in varied criteria and procedures across studies. Conventional usability models typically require either labor-intensive processes or excessively flexible solutions that rely heavily on the evaluator's discretion. This work presents WUAHP, an automated tool specifically developed to assess the usability of healthcare websites, thereby addressing these constraints and ensuring objective evaluation.
The tool architecture comprises three primary modules: an interface for user input and display of computed usability scores; an HTML parser, implemented in Python using BeautifulSoup, which extracts key features from website content; and normalization modules that scale extracted values between 0 and 1, enabling uniform usability representation. Usability is measured across five core parameters: Navigational Efficiency, Operational Efficiency, Accessibility, Responsiveness & Compatibility, and Security. These parameters are further divided into multiple sub-parameters for fine-grained analysis. To assign objective weights, the Entropy Weighting Method is applied, emphasizing parameter significance based on data variability.
The tool was tested on 50 healthcare websites. Results showed usability scores ranging from 12% (HW39) to 97% (HW9). These results were further validated using user feedback and mapped to Nielsen's standard heuristics, ensuring reliability. Future studies could apply the proposed approach to other domains, such as government, e-commerce, and education, to assess its cross-domain generalizability. This would help evaluate its robustness across different structural and functional web settings.
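The Entropy Weighting Method referenced above is a standard technique: for a decision matrix with n websites and m parameters, the column proportions p_ij = x_ij / Σ_i x_ij give an entropy E_j = -(1/ln n) Σ_i p_ij ln p_ij, and each weight is proportional to 1 - E_j (higher variability yields a larger weight). A self-contained sketch of the textbook computation, on toy data rather than the study's:

```python
import math

def entropy_weights(matrix):
    """Entropy Weighting Method: rows = websites, columns = parameters.

    Values are assumed pre-normalized to [0, 1]; zero entries contribute
    nothing to the entropy sum.
    """
    n, m = len(matrix), len(matrix[0])
    diversification = []
    for j in range(m):
        col = [row[j] for row in matrix]
        total = sum(col)
        e = -sum((v / total) * math.log(v / total)
                 for v in col if v > 0) / math.log(n)
        diversification.append(1 - e)  # degree of variability of column j
    s = sum(diversification)
    return [d / s for d in diversification]

# Toy matrix: 3 sites x 2 parameters. The second column varies far more
# across sites, so the method assigns it nearly all the weight.
w = entropy_weights([[0.9, 0.9], [0.8, 0.2], [0.85, 0.1]])
print([round(x, 3) for x in w])  # -> [0.003, 0.997]
```

The weights always sum to 1, so the final usability score is a weighted average of the normalized parameter scores.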
Furthermore, including machine learning methods—like unsupervised approaches for pattern recognition or supervised models for usability prediction—could facilitate ongoing development in real-time web environments and allow automated, scalable assessments.
Data availability
Data are available from the corresponding author on reasonable request.
References
Jamal, A. et al. Association of online health information–seeking behavior and self-care activities among type 2 diabetic patients in Saudi Arabia. J. Med. Internet. Res. 17 (8), e4312. https://doi.org/10.2196/jmir.4312 (2015).
Moreno, J. M., Morales del Castillo, J. M., Porcel, C. & Herrera-Viedma, E. A quality evaluation methodology for health-related websites based on a 2-tuple fuzzy linguistic approach. Soft. Comput. 14, 887–897 (2010).
Amante, D. J., Hogan, T. P., Pagoto, S. L., English, T. M. & Lapane, K. L. Access to care and use of the internet to search for health information: results from the US National health interview survey. J. Med. Internet. Res. 17 (4), e106. https://doi.org/10.2196/jmir.4126 (2015).
Ranallo, P. A., Kilbourne, A. M., Whatley, A. S. & Pincus, H. A. Behavioral health information technology: from chaos to clarity. Health Aff. 35 (6), 1106–1113. https://doi.org/10.1377/hlthaff.2016.0013 (2016).
Carman, K. L. et al. Patient and family engagement: a framework for Understanding the elements and developing interventions and policies. Health Aff. 32 (2), 223–231. https://doi.org/10.1377/hlthaff.2012.1133 (2013).
Lindblad, S. et al. Creating a culture of health: evolving healthcare systems and patient engagement. QJM: Int. J. Med. 110 (3), 125–129. https://doi.org/10.1093/qjmed/hcw188 (2017).
Rutter, D., Manley, C., Weaver, T., Crawford, M. J. & Fulop, N. Patients or partners? Case studies of user involvement in the planning and delivery of adult mental health services in London. Soc. Sci. Med. 58 (10), 1973–1984. https://doi.org/10.1016/S0277-9536(03)00401-5 (2004).
Cipriano, P. F. et al. The importance of health information technology in care coordination and transitional care. Nurs. Outlook. 61 (6), 475–489. https://doi.org/10.1016/j.outlook.2013.10.005 (2013).
Coulter, A. Patient engagement—what works? J. Ambul. Care Manag. 35 (2), 80–89. https://doi.org/10.1097/JAC.0b013e318249e0fd (2012).
Snyder, C. F. et al. The role of informatics in promoting patient-centered care. Cancer J. (Sudbury Mass). 17 (4), 211. https://doi.org/10.1097/PPO.0b013e318225ff89 (2011).
Street, R. L. Jr et al. Provider interaction with the electronic health record: the effects on patient-centered communication in medical encounters. Patient Educ. Couns. 96 (3), 315–319. https://doi.org/10.1016/j.pec.2014.05.004 (2014).
Lewis, D., Chang, B. L. & Friedman, C. P. Consumer health informatics. In Consumer Health Informatics: Informing Consumers and Improving Health Care (1–7). New York, NY: Springer New York. (2005).
Ow, D., Wetherell, D., Papa, N., Bolton, D. & Lawrentschuk, N. Patients’ perspectives of accessibility and digital delivery of factual content provided by official medical and surgical specialty society websites: a qualitative assessment. Interact. J. Med. Res. 4 (1), e3963. https://doi.org/10.2196/ijmr.3963 (2015).
Schenker, Y. & London, A. J. Risks of imbalanced information on US hospital websites. JAMA Intern. Med. 175 (3), 441–443. https://doi.org/10.1001/jamainternmed.2014.7400 (2015).
Snyder, K., Ornes, L. L. & Paulson, P. Engaging patients through your website. J. Healthc. Qual. 36 (2), 33–38. https://doi.org/10.1111/j.1945-1474.2012.00214.x (2014).
Oermann, M. H., Lowery, N. F. & Thornley, J. Evaluation of web sites on management of pain in children. Pain Manage. Nurs. 4 (3), 99–105. https://doi.org/10.1016/S1524-9042(03)00029-8 (2003).
DePasse, J. W., Chen, C. E., Sawyer, A., Jethwani, K. & Sim, I. Academic medical centers as digital health catalysts. In Healthcare 2 (3), 173–176 https://doi.org/10.1016/j.hjdsi.2014.05.006 (Elsevier, 2014).
Saad, M., Zia, A., Raza, M., Kundi, M. & Haleem, M. A comprehensive analysis of healthcare websites usability features, testing techniques and issues. IEEE Access. https://doi.org/10.1109/ACCESS.2022.3193378 (2022).
Michaud, P. A. & Colom, P. Implementation and evaluation of an internet health site for adolescents in Switzerland. J. Adolesc. Health. 33 (4), 287–290. https://doi.org/10.1016/S1054-139X(03)00181-2 (2003).
Teo, N. B., Paton, P. & Kettlewell, S. Use of an interactive web-based questionnaire to evaluate a breast cancer website. Breast 14 (2), 153–156. https://doi.org/10.1016/j.breast.2004.04.009 (2005).
Sillence, E., Briggs, P., Harris, P. & Fishwick, L. Health websites that people can trust–the case of hypertension. Interact. Comput. 19 (1), 32–42. https://doi.org/10.1016/j.intcom.2006.07.009 (2007).
Kim, D. & Chang, H. Key functional characteristics in designing and operating health information websites for user satisfaction: an application of the extended technology acceptance model. Int. J. Med. Informatics. 76 (11–12), 790–800. https://doi.org/10.1016/j.ijmedinf.2006.09.001 (2007).
Alexiou, V. G. & Falagas, M. E. e-meducation. Org: an open access medical education web portal. BMC Med. Educ. 8, 1–4 (2008).
van den Haak, M. & van Hooijdonk, C. Evaluating consumer health information websites: The importance of collecting observational, user-driven data. In 2010 IEEE International Professional Communication Conference 333–338 https://doi.org/10.1109/IPCC.2010.5530031 (IEEE, 2010).
Reavley, N. J. & Jorm, A. F. The quality of mental disorder information websites: a review. Patient Educ. Couns. 85 (2), e16–e25. https://doi.org/10.1016/j.pec.2010.10.015 (2011).
Duan, L. & Chen, J. A formal approach to website maintenance. In 10th IEEE High Assurance Systems Engineering Symposium (HASE’07) 419–420 https://doi.org/10.1109/HASE.2007.52 (IEEE, 2007).
Brown, W. III, Yen, P. Y., Rojas, M. & Schnall, R. Assessment of the health IT usability evaluation model (Health-ITUEM) for evaluating mobile health (mHealth) technology. J. Biomed. Inform. 46 (6), 1080–1087. https://doi.org/10.1016/j.jbi.2013.08.001 (2013).
Martínez-Pérez, B., de la Torre-Díez, I., Candelas-Plasencia, S. & López-Coronado, M. Development and evaluation of tools for measuring the quality of experience (QoE) in mHealth applications. J. Med. Syst. 37, 1–8 (2013).
Reynoldson, C. et al. Assessing the quality and usability of smartphone apps for pain self-management. Pain Med. 15 (6), 898–909. https://doi.org/10.1111/pme.12327 (2014).
Scott, K., Richards, D. & Adhikari, R. A review and comparative analysis of security risks and safety measures of mobile health apps. Australasian J. Inform. Syst. 19 https://doi.org/10.3127/ajis.v19i0.1210 (2015).
Huckvale, K., Car, M., Morrison, C. & Car, J. Apps for asthma self-management: a systematic assessment of content and tools. BMC Med. 10, 1–11 (2012).
Anderson, K., Burford, O. & Emmerton, L. App chronic disease checklist: protocol to evaluate mobile apps for chronic disease self-management. JMIR Res. Protocols. 5 (4), e6194 (2016).
Iribarren, S. J., Schnall, R., Stone, P. W. & Carballo-Diéguez, A. Smartphone applications to support tuberculosis prevention and treatment: review and evaluation. JMIR mHealth uHealth. 4 (2), e5022. https://doi.org/10.2196/mhealth.5022 (2016).
Stoyanov, S. R., Hides, L., Kavanagh, D. J. & Wilson, H. Development and validation of the user version of the mobile application rating scale (uMARS). JMIR mHealth uHealth. 4 (2), e5849. https://doi.org/10.2196/mhealth.5849 (2016).
Powell, A. C. et al. Interrater reliability of mHealth app rating measures: analysis of top depression and smoking cessation apps. JMIR mHealth uHealth. 4 (1), e5176. https://doi.org/10.2196/mhealth.5176 (2016).
Huang, B. Y., Hicks, T. D., Haidar, G. M., Pounds, L. L. & Davies, M. G. An evaluation of the availability, accessibility, and quality of online content of vascular surgery training program websites for residency and fellowship applicants. J. Vasc. Surg. 66 (6), 1892–1901. https://doi.org/10.1016/j.jvs.2017.08.064 (2017).
Chen, J., Cade, J. E. & Allman-Farinelli, M. The most popular smartphone apps for weight loss: a quality assessment. JMIR mHealth uHealth 3 (4), e4334. https://doi.org/10.2196/mhealth.4334 (2015).
Slattery, P., Finnegan, P. & Vidgen, R. Creating compassion: how volunteering websites encourage prosocial behaviour. Inf. Organ. 29 (1), 57–76. https://doi.org/10.1016/j.infoandorg.2019.02.001 (2019).
O’Keeffe, H. & Walls, D. W. Usability testing and experience design in citizen science: A case study. In Proceedings of the 38th ACM International Conference on Design of Communication 1–8 https://doi.org/10.1145/3380851.3416768 (2020).
Zubiena, L. et al. Development and testing of the health information website evaluation tool on neck pain websites–An analysis of reliability, validity, and utility. Patient Educ. Couns. 113, 107762. https://doi.org/10.1016/j.pec.2023.107762 (2023).
Munim, K. M. et al. Exploring the impact of design technique on usability: A case study on designing the ehealth websites using card sorting and interactive dialogue model. Eng. Rep. e12738. https://doi.org/10.1002/eng2.12738 (2023).
Law, R., Qi, S. & Buhalis, D. Progress in tourism management: A review of website evaluation in tourism research. Tour. Manag. 31 (3), 297–313. https://doi.org/10.1016/j.tourman.2009.11.007 (2010).
Olsina, L. & Rossi, G. A quantitative method for quality evaluation of web sites and applications. IEEE Multimedia. 9 (4), 20–29. https://doi.org/10.1109/MMUL.2002.1041945 (2002).
Mich, L. Evaluating website quality by addressing quality gaps: a modular process. In 2014 IEEE International Conference on Software Science, Technology and Engineering 42–49 https://doi.org/10.1109/SWSTE.2014.13 (2014).
Zhu, Y. Integrating External Data from Web Sources into a Data Warehouse for OLAP and Decision-Making (Shaker, 2004).
Mich, L., Franch, M. & Gaio, L. Evaluating and designing web site quality. IEEE MultiMedia. 10 (1), 34–43. https://doi.org/10.1109/MMUL.2003.1167920 (2003).
Jabar, M. A., Usman, U. A. & Sidi, F. Usability evaluation of universities’ websites. Int. J. Inform. Process. Manage. 5 (1), 10 (2014).
Bobby accessibility evaluation tool. http://bobby.watchfire.com/bobby/html/en/about.jsp
Hasan, L. & Abuelrub, E. Assessing the quality of web sites. Appl. Comput. Inf. 9 (1), 11–29. https://doi.org/10.1016/j.aci.2009.03.001 (2011).
Huerta, T. R., Walker, D. M. & Ford, E. W. An evaluation and ranking of children’s hospital websites in the United States. J. Med. Internet. Res. 18 (8), e5799. https://doi.org/10.2196/jmir.5799 (2016).
Lee, S. & Koubek, R. J. The effects of usability and web design attributes on user preference for e-commerce web sites. Comput. Ind. 61 (4), 329–341. https://doi.org/10.1016/j.compind.2009.12.004 (2010).
Moreno, M. A. Seeking health information online. JAMA Pediatr. 171 (5), 500–500. https://doi.org/10.1001/jamapediatrics.2016.3109 (2017).
Benbunan-Fich, R. Using protocol analysis to evaluate the usability of a commercial website. Inform. Manage. 39 (2), 151–163 (2001).
Web Metrics Testbed. http://zing.ncsl.nist.gov/WebTools/tech.html
Author information
Contributions
Amandeep Kaur developed the methodology and carried out the main research work. Jaswinder Singh supervised the research and assisted with the methodology. Satinder Kaur supervised the research and assisted with the literature collection.
Ethics declarations
Competing interests
The authors declare no competing interests.
Ethics and informed consent for data used
The authors have no relevant financial or non-financial interests to disclose. All authors certify that they have no affiliations with or involvement in any organization or entity with any financial interest or non-financial interest in the subject matter or materials discussed in this manuscript. The authors have no financial or proprietary interests in any material discussed in this article. This article does not contain any studies with human participants or animals performed by any of the authors. This research received no specific grant from any funding agency in the public, commercial, or not-for-profit sectors.
Publisher’s note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Appendix A
Website usability feedback questionnaire (MCQ)
Instructions
Please answer the following questions based on your experience using the website. Select the option that best represents your opinion for each question.
1. Navigation efficiency
1. How easy is it to locate and use the navigation menu?
(A) Very difficult. (B) Somewhat difficult. (C) Neutral. (D) Easy. (E) Very easy.
2. Does the website have a helpful and accessible search feature?
(A) No, there is no search feature. (B) Yes, but it is hard to use. (C) Yes, it is somewhat helpful. (D) Yes, it works well. (E) Yes, it works excellently.
3. Are all the links on the website functioning properly?
(A) No, many links are broken. (B) Some links are broken. (C) Most links are working. (D) All links are working. (E) All links are working perfectly.
4. Did you encounter any broken or dead links on the website?
(A) Yes, many broken links. (B) Yes, a few broken links. (C) No, no broken links. (D) Not sure.
5. How intuitive is the overall navigation on the website?
(A) Very confusing. (B) Somewhat confusing. (C) Neutral. (D) Easy to navigate. (E) Very intuitive and easy to use.
2. Operational efficiency
6. Is the contact information easily accessible on the website?
(A) No, it is hard to find. (B) Somewhat accessible. (C) Neutral. (D) Accessible but requires extra effort. (E) Yes, it’s very easy to find.
7. Can you find an email address or contact form quickly?
(A) No, I could not find it. (B) Yes, but it took some effort. (C) Yes, it was easy to find. (D) Yes, it’s prominently displayed.
8. How well are images optimized for fast loading and quality?
(A) Images are very slow to load and low quality. (B) Some images are slow to load or pixelated. (C) Images load fine but some lose quality. (D) Most images load fast and look great. (E) All images load quickly and are clear.
9. Does the website load within a reasonable time?
(A) Very slowly. (B) Slowly. (C) Neutral. (D) Quickly. (E) Very quickly.
10. How secure do you feel while browsing this website?
(A) I feel very insecure. (B) I feel somewhat insecure. (C) Neutral. (D) I feel secure. (E) I feel completely secure.
3. Accessibility
11. Do images have alternative text (alt text) for screen readers?
(A) No, there is no alt text. (B) Some images have alt text. (C) Most images have alt text. (D) Yes, all images have alt text. (E) Yes, all images have descriptive alt text.
12. Is the website accessible for users with disabilities (e.g., screen reader support)?
(A) Not accessible at all. (B) Partially accessible. (C) Neutral. (D) Mostly accessible. (E) Fully accessible.
13. Is the text structure on the website clear and easy to read (e.g., with proper headings)?
(A) Very unclear. (B) Somewhat unclear. (C) Neutral. (D) Mostly clear. (E) Very clear and well-structured.
14. Are the color schemes used on the website visually clear and inclusive?
(A) No, the colors are hard to distinguish. (B) Some colors are hard to distinguish. (C) Neutral. (D) Colors are clear and easy to distinguish. (E) Colors are excellent and visually inclusive.
4. Responsiveness & compatibility
15. Does the website display properly across different devices (mobile, tablet, desktop)?
(A) No, it is not compatible on any device. (B) Yes, but with some issues. (C) Neutral. (D) Yes, it works well on most devices. (E) Yes, it works perfectly across all devices.
16. Does the layout adjust appropriately when resizing the screen or changing device orientation?
(A) No, the layout is broken on resizing. (B) The layout adjusts poorly. (C) Neutral. (D) The layout adjusts well. (E) The layout adjusts perfectly on all devices and orientations.
5. Security
17. Does the website use HTTPS (secure browsing)?
(A) No, it does not use HTTPS. (B) Yes, but the HTTPS is not properly implemented. (C) Yes, the HTTPS is implemented well. (D) Yes, the website is fully secured with HTTPS.
18. Do you feel your personal data is protected while using this website?
(A) No, I do not feel secure. (B) I feel somewhat insecure. (C) Neutral. (D) I feel secure. (E) I feel completely protected.
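Responses to the questionnaire above must be converted to a 0–1 scale before they can feed a usability metric. A minimal sketch, assuming evenly spaced values across each question's options (the paper does not publish its exact mapping, so this scheme is an assumption):

```python
def feedback_score(responses: dict, option_counts: dict) -> float:
    """Average normalized score across answered MCQ items.

    responses: question number -> chosen letter ('A', 'B', ...).
    option_counts: question number -> number of options for that question.
    Each answer is mapped to index/(count-1), i.e. evenly spaced 0..1
    (an assumed scheme, not the paper's published one).
    """
    total = 0.0
    for q, letter in responses.items():
        idx = ord(letter.upper()) - ord("A")   # A=0, B=1, ...
        total += idx / (option_counts[q] - 1)  # scale to [0, 1]
    return total / len(responses)

# Two example answers: Q1 has five options (A-E), Q17 has four (A-D).
print(round(feedback_score({1: "D", 17: "C"}, {1: 5, 17: 4}), 3))  # -> 0.708
```

Note that questions with different option counts (e.g. Q17's four options versus Q1's five) must be scaled individually, which is why the option count is passed per question.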
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License, which permits any non-commercial use, sharing, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if you modified the licensed material. You do not have permission under this licence to share adapted material derived from this article or parts of it. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by-nc-nd/4.0/.
About this article
Cite this article
Kaur, A., Singh, J. & Kaur, S. Automated framework for comprehensive usability analysis of healthcare websites using web parsing. Sci Rep 15, 21834 (2025). https://doi.org/10.1038/s41598-025-07271-4