Editorial Commentary: At Present, ChatGPT Cannot Be Relied Upon to Answer Patient Questions and Requires Physician Expertise to Interpret Answers for Patients

Eoghan T. Hurley, Bryan S. Crook, Jonathan F. Dickens

Research output: Contribution to journal › Editorial

Abstract

ChatGPT is designed to provide accurate and reliable information to the best of its abilities, based on the data input and knowledge available. Thus, ChatGPT is being studied as a patient information tool. This artificial intelligence (AI) tool has been shown to frequently provide technically correct information, but with limitations. ChatGPT gives different answers to similar questions depending on the prompts, and patients may lack the expertise in prompting ChatGPT needed to elicit the best answer. (Prompting large language models has been shown to be a skill that can be improved.) Of greater concern, ChatGPT fails to provide sources or references for its answers. At present, ChatGPT cannot be relied upon to address patient questions; in the future, ChatGPT will improve. Today, AI requires physician expertise to interpret AI answers for patients.

Original language: English
Journal: Arthroscopy - Journal of Arthroscopic and Related Surgery
DOIs
State: Accepted/In press - 2024
Externally published: Yes
