Table 1 Graduate courses with examinations completed by GPT-4.

From: The model student: GPT-4 performance on graduate biomedical science exams

| Course | Exam type | Blinding | Access | Figures | Format | Simple | Expert | Short |
|---|---|---|---|---|---|---|---|---|
| GMS6035 Advanced Virology II: RNA Viruses | Final | Partial | ChatGPT | X | Paper |  | X |  |
| GMS6038: Bacterial Genetics and Physiology | Final | None | GPT-4 API |  | Text | X | X | X |
| GMS6473: Fund. of Phys. & Fun. Genomics | Final | Full | GPT-4 API |  | Text | X | X | X |
| PHC7007: Cancer Epidemiology | Final | Full | GPT-4 API |  | Text | X | X | X |
| GMS6231: Genomics and Bioinformatics | First-year | Full | GPT-4 API |  | Text | X | X |  |
| PCB5065: Advanced Genetics | First-year | Full | GPT-4 API | X | Text | X | X |  |
| BCH6415: Adv. Molecular and Cell Biology | First-year | Full | GPT-4 API |  | Text | X | X |  |
| GMS6221: Genetical Ethics | First-year | Full | GPT-4 API |  | Text | X | X |  |
| PHC6052: Intro. to Biostatistical Methods | First-year | Full | GPT-4 API |  | Text | X | X |  |
1. Courses included in assessing GPT-4 capability on graduate examinations in the Biomedical Sciences. Columns are: Course: name of the University of Florida (UF) graduate course; Exam type: whether the examination was a course final exam (Final) or an end-of-year exam administered to first-year PhD students in the UF Genetics and Genomics Program (First-year); Blinding: whether GPT-4 answers were graded in parallel with student examinations using blinded identifiers (Full), graded with knowledge of which exams were generated by GPT-4 (None), or a mixture of the two (Partial); Access: whether GPT-4 was queried using ChatGPT (ChatGPT) or a script accessing the OpenAI API (GPT-4 API); Figures: whether the exam questions contained graphical figures; Format: whether GPT-4 exam answers were handwritten (Paper) or digitally copied (Text) into the respective exam document; Simple/Expert/Short: whether the GPT4-Simple, GPT4-Expert, and GPT4-Short prompt patterns, respectively, were evaluated for each course's examination.
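
For courses listed with "GPT-4 API" access, questions were submitted by script rather than through the ChatGPT interface. The sketch below illustrates, under stated assumptions, what such a query might look like; it is not the authors' actual script, and the model identifier, prompt text, and function names shown here are hypothetical.

```python
# Illustrative sketch only: the authors' actual script, prompt patterns, and
# model parameters are not given in this table; everything below is assumed.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment


def answer_exam_question(question_text: str, system_prompt: str) -> str:
    """Send a single text-format exam question to a GPT-4 chat endpoint."""
    response = client.chat.completions.create(
        model="gpt-4",  # assumed model identifier
        messages=[
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": question_text},
        ],
    )
    return response.choices[0].message.content


# Hypothetical stand-in for a "Simple"-style prompt; the GPT4-Simple, GPT4-Expert,
# and GPT4-Short patterns referenced in the table are defined in the article itself.
simple_prompt = "Answer the following graduate-level exam question."
print(answer_exam_question("Define horizontal gene transfer.", simple_prompt))
```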