Introduction: Cardiotocography is an indispensable instrument for assessing fetal well-being during the antenatal period. Interpreting cardiotocography requires substantial background knowledge and experience; inexperienced clinicians and midwives in particular are prone to misreading traces and making wrong decisions. The aim of this project was to develop a new artificial intelligence-integrated interface to aid in teaching the interpretation of fetal cardiotocography.
Methods: For 118 scanned fetal cardiotocographies acquired from the Obstetrics and Gynecology clinic of Niğde Ömer Halisdemir University, Niğde Research and Training Hospital, the presence of uterine contractions, beat-to-beat variability, fetal heart rate, and periodic changes (accelerations and decelerations) were classified by two experienced obstetrics and gynecology specialists, and the traces were uploaded to the interface. A convolutional neural network (CNN) deep learning architecture was used to generate a cardiotocography classification model.
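The abstract does not specify the CNN architecture, so as a minimal sketch only, the core operations of such a classifier (convolution, ReLU, max-pooling, and a softmax output over classes) can be illustrated with NumPy; the image size, kernel, and the three-class output here are illustrative assumptions, not details from the study.

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2-D cross-correlation of a single-channel image."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def max_pool(feature_map, size=2):
    """Non-overlapping max-pooling with the given window size."""
    h, w = feature_map.shape
    h, w = h - h % size, w - w % size
    fm = feature_map[:h, :w]
    return fm.reshape(h // size, size, w // size, size).max(axis=(1, 3))

def classify(image, kernel, weights, bias):
    """Convolution -> ReLU -> pooling -> flatten -> linear layer -> softmax."""
    features = np.maximum(conv2d(image, kernel), 0.0)
    pooled = max_pool(features)
    logits = pooled.ravel() @ weights + bias
    exp = np.exp(logits - logits.max())
    return exp / exp.sum()

rng = np.random.default_rng(0)
image = rng.random((28, 28))                 # stand-in for a scanned CTG trace
kernel = rng.standard_normal((3, 3))         # one learned filter (illustrative)
weights = rng.standard_normal((13 * 13, 3))  # 3 hypothetical output classes
bias = np.zeros(3)
probs = classify(image, kernel, weights, bias)
```

In a real system these parameters would be learned from the labeled traces; the sketch only shows the forward pass that turns a trace image into class probabilities.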
Results: The developed interface consists of four sections. The first contains basic information. Educational material on the interpretation of cardiotocography was uploaded to the second section. The third section was designed for students' self-training with randomly selected, previously classified cardiotocographies. The fourth section hosts the trained deep learning-based cardiotocography classification model, allowing a test sample to be uploaded and classified automatically. The classification accuracy of the CNN module on the cardiotocography dataset was 84%.
Conclusion: An artificial intelligence-integrated fetal cardiotocography educational interface was successfully developed. We believe practical use of the interface would greatly benefit fetal cardiotocography education.
| Primary Language | English |
| --- | --- |
| Subjects | Clinical Sciences |
| Journal Section | Research Articles |
| Authors | |
| Publication Date | October 1, 2022 |
| Published in Issue | Year 2022, Volume: 2, Issue: 2 |