Table 1 DECIDE-AI reporting item checklist.
| Item no. | Theme | Recommendation | Reported on page |
|---|---|---|---|
| 1–17 | AI-specific reporting items | | |
| I–X | Generic reporting items | | |
| Title and abstract | | | |
| 1 | Title | Identify the study as an early clinical evaluation of a decision support system based on AI or machine learning, specifying the problem addressed. | |
| I | Abstract | Provide a structured summary of the study. Consider including: intended use of the AI system, type of underlying algorithm, study setting, number of patients and users included, primary and secondary outcomes, key safety endpoints, human factors evaluated, main results, conclusions. | |
| Introduction | | | |
| 2 | Intended use | a) Describe the targeted medical condition(s) and problem(s), including the current standard practice, and the intended patient population(s). | |
| | | b) Describe the intended users of the AI system, its planned integration in the care pathway, and the potential impact, including patient outcomes, it is intended to have. | |
| II | Objectives | State the study objectives. | |
| Methods | | | |
| III | Research governance | Provide a reference to any study protocol, study registration number, and ethics approval. | |
| 3 | Participants | a) Describe how patients were recruited, stating the inclusion and exclusion criteria at both patient and data level, and how the number of recruited patients was decided. | |
| | | b) Describe how users were recruited, stating the inclusion and exclusion criteria, and how the intended number of recruited users was decided. | |
| | | c) Describe steps taken to familiarise the users with the AI system, including any training received prior to the study. | |
| 4 | AI system | a) Briefly describe the AI system, specifying its version and the type of underlying algorithm used. Describe, or provide a direct reference to, the characteristics of the patient population on which the algorithm was trained and its performance in preclinical development/validation studies. | |
| | | b) Identify the data used as inputs. Describe how the data were acquired, the process needed to enter the input data, the pre-processing applied, and how missing/low-quality data were handled. | |
| | | c) Describe the AI system outputs and how they were presented to the users (an image may be useful). | |
| 5 | Implementation | a) Describe the settings in which the AI system was evaluated. | |
| | | b) Describe the clinical workflow/care pathway in which the AI system was evaluated, the timing of its use, and how the final supported decision was reached and by whom. | |
| IV | Outcomes | Specify the primary and secondary outcomes measured. | |
| 6 | Safety and errors | a) Provide a description of how significant errors/malfunctions were defined and identified. | |
| | | b) Describe how any risks to patient safety or instances of harm were identified, analysed, and minimised. | |
| 7 | Human factors | Describe the human factors tools, methods or frameworks used, the use cases considered, and the users involved. | |
| V | Analysis | Describe the statistical methods by which the primary and secondary outcomes were analysed, as well as any prespecified additional analyses, including subgroup analyses and their rationale. | |
| 8 | Ethics | Describe whether specific methodologies were utilised to fulfil an ethics-related goal (such as algorithmic fairness) and their rationale. | |
| VI | Patient involvement | State how patients were involved in any aspect of: the development of the research question, the study design, and the conduct of the study. | |
| Results | | | |
| 9 | Participants | a) Describe the baseline characteristics of the patients included in the study, and report on input data missingness. | |
| | | b) Describe the baseline characteristics of the users included in the study. | |
| 10 | Implementation | a) Report on user exposure to the AI system, the number of instances the AI system was used, and the users’ adherence to the intended implementation. | |
| | | b) Report any significant changes to the clinical workflow or care pathway caused by the AI system. | |
| VII | Main results | Report on the prespecified outcomes, including outcomes for any comparison group if applicable. | |
| VIII | Subgroup analysis | Report on the differences in the main outcomes according to the prespecified subgroups. | |
| 11 | Modifications | Report any changes made to the AI system or its hardware platform during the study. Report the timing of these modifications, the rationale for each, and any changes in outcomes observed after each of them. | |
| 12 | Human-computer agreement | Report on user agreement with the AI system. Describe any instances of and reasons for user variation from the AI system’s recommendations and, if applicable, users changing their mind based on the AI system’s recommendations. | |
| 13 | Safety and errors | a) List any significant errors/malfunctions related to: AI system recommendations, supporting software/hardware, or users. Include details of: (i) rate of occurrence, (ii) apparent causes, (iii) whether they could be corrected, and (iv) any significant potential impacts on patient care. | |
| | | b) Report on any risks to patient safety or observed instances of harm (including indirect harm) identified during the study. | |
| 14 | Human factors | a) Report on the usability evaluation, according to recognised standards or frameworks. | |
| | | b) Report on the evaluation of user learning curves. | |
| Discussion | | | |
| 15 | Support for intended use | Discuss whether the results obtained support the intended use of the AI system in clinical settings. | |
| 16 | Safety and errors | Discuss what the results indicate about the safety profile of the AI system. Discuss any observed errors/malfunctions and instances of harm, their implications for patient care, and whether/how they can be mitigated. | |
| IX | Strengths and limitations | Discuss the strengths and limitations of the study. | |
| Statements | | | |
| 17 | Data availability | Disclose if and how data and relevant code are available. | |
| X | Conflicts of interest | Disclose any relevant conflicts of interest, including the source of funding for the study, the role of funders, any other roles played by commercial companies, and personal conflicts of interest for each author. | |