Objective: Patients widely use artificial intelligence-based chatbots, and this study aims to determine their utility and limitations in answering questions about strabismus. Answers to common questions about the management of strabismus provided by Chat Generative Pre-trained Transformer (ChatGPT)-3.5, an artificial intelligence-powered chatbot, were compared with answers from a strabismus specialist (The Specialist) in terms of appropriateness and readability.
Patients and Methods: In this descriptive, cross-sectional study, a list of questions about treatment, prognosis, postoperative care, and complications, compiled from strabismus patients and caregivers in outpatient clinics, was posed to both ChatGPT and The Specialist. ChatGPT's answers were classified as appropriate or inappropriate, with The Specialist's answers serving as the reference. The readability of all answers was assessed using the parameters of the Readable online toolkit.
Results: All answers provided by ChatGPT were classified as appropriate. The mean Flesch-Kincaid Grade Levels of the answers given by ChatGPT and The Specialist were 13.75±1.55 and 10.17±2.17, respectively (p<0.001), with higher grades indicating more complex text; the mean Flesch Reading Ease Scores, where higher scores indicate easier reading, were 23.86±9.38 and 44.54±14.66 (p=0.002). The mean reading times were 15.6±2.85 and 10.17±2.17 seconds for ChatGPT and The Specialist, respectively (p=0.003). The overall reach of the answers by ChatGPT and The Specialist was 56.87±11.67 and 81.67±12.80 (p<0.001).
Conclusion: Although ChatGPT provided appropriate answers to all of the compiled strabismus questions, the answers were complex or very difficult for an average person to read. The readability scores indicate that a college-level education would be required to understand the answers provided by ChatGPT, whereas The Specialist conveyed similar information in a more readable form. Physicians and patients should therefore consider the limitations of such platforms for ocular health-related questions.
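The Flesch Reading Ease Score and Flesch-Kincaid Grade Level used in this study follow standard published formulas based on word, sentence, and syllable counts. A minimal sketch in Python, for illustration only (the study itself used the Readable online toolkit, not this code):

```python
def flesch_reading_ease(words: int, sentences: int, syllables: int) -> float:
    # Higher scores indicate easier text; scores below ~30 are
    # generally considered very difficult (college-graduate level).
    return 206.835 - 1.015 * (words / sentences) - 84.6 * (syllables / words)

def flesch_kincaid_grade(words: int, sentences: int, syllables: int) -> float:
    # Approximates the US school grade level required to understand the text;
    # a grade of 13+ roughly corresponds to college-level reading.
    return 0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59

# Example with hypothetical counts: 100 words, 5 sentences, 150 syllables.
print(flesch_reading_ease(100, 5, 150))   # moderately difficult text
print(flesch_kincaid_grade(100, 5, 150))  # roughly a 10th-grade level
```

Both formulas depend only on average sentence length and average syllables per word, which is why long, jargon-heavy answers score as harder to read.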
| Primary Language | English |
| --- | --- |
| Subjects | Surgery (Other) |
| Journal Section | Original Research |
| Authors | |
| Publication Date | October 30, 2024 |
| Submission Date | April 20, 2024 |
| Acceptance Date | May 22, 2024 |
| Published in Issue | Year 2024 |