Sir, on reading how ChatGPT could pass an anatomy exam,1 we tested its ability to pass the standard Overseas Registration Exam (ORE) template, the exam that overseas-qualified dentists must pass in order to register with the GDC in the UK.2 The exam consists of two parts, Part 1 being written and Part 2 being clinical. Part 1 consists of two computer-based exam papers: Paper A covers clinically applied dental science and clinically applied human disease; Paper B covers aspects of clinical dentistry, including law and ethics and health and safety.3

We tested the Part 1 ORE Sample Questions available on the official website.3 ChatGPT answered 90% of the questions correctly in Paper A, but in Paper B, which covers decision-making questions, it answered only 20% correctly. The ORE requires candidates to pass both papers in order to proceed to Part 2; thus, owing to its low score in Paper B, ChatGPT was unable to pass the sample ORE. This pilot test demonstrated that ChatGPT could answer straightforward theoretical questions with high accuracy, but could not answer analytical questions correctly. In particular, ChatGPT struggled with ethical and law-related questions.