TY - JOUR
T1 - Assessment of ChatGPT's Performance on the ACP 2024 National Prosthodontics Resident Exam (NPRE)
AU - Ateeq Almalki, Abdulrahman
AU - Obeid Althubaitiy, Ramzi
AU - Alkhtani, Fahad
AU - Anadioti, Evanthia
AU - Wageh Mansour, Heba
N1 - Publisher Copyright:
© 2025 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
PY - 2025
Y1 - 2025
AB - Purpose: To evaluate the performance of ChatGPT on the National Prosthodontics Resident Exam (NPRE). Methods: Two separate OpenAI accounts were used for ChatGPT 3.5 and ChatGPT 4.0, each managed by an independent examiner. The dataset was sourced from the American College of Prosthodontists (ACP) 2024 National Prosthodontics Resident Exam (NPRE), which comprises 150 multiple-choice, board-style questions covering a range of prosthodontic topics. Questions were entered exactly as they appeared in the NPRE, and responses were recorded as correct or incorrect. Accuracy was assessed using a two-tailed t-test, with statistical significance set at p < 0.05. After the study was completed, the OpenAI accounts were deleted to ensure data privacy and security. Results: ChatGPT 3.5 correctly answered 84 of 150 questions, achieving a score of 56.0%, while ChatGPT 4.0 significantly outperformed it with a score of 73.7%, correctly answering 109 of 150 questions (p < 0.001). Across individual subjects, ChatGPT 4.0 consistently scored higher, with significant improvements in Basic Science (71.2% vs. 61.3%), Implant Surgery (67.5% vs. 41.2%), Diagnosis and Treatment Planning (66.6% vs. 53.4%), and Fixed Prosthodontics (86.9% vs. 62.5%). Both versions scored highest in Dental Materials, where ChatGPT 4.0 achieved 91.6% compared with 73.1% for ChatGPT 3.5. Conclusion: ChatGPT 4.0 shows promise as an educational tool for prosthodontics residents, effectively answering board-style questions. However, given the substantial amount of misinformation in ChatGPT's current prosthodontics knowledge base, residents should exercise caution and supplement AI-generated content with evidence-based information from credible sources to ensure accuracy and reliability.
KW - artificial intelligence
KW - ChatGPT
KW - prosthodontics
UR - https://www.scopus.com/pages/publications/105013800900
U2 - 10.1111/eje.70045
DO - 10.1111/eje.70045
M3 - Article
AN - SCOPUS:105013800900
SN - 1396-5883
JO - European Journal of Dental Education
JF - European Journal of Dental Education
ER -