Research Article


Tooth Instance Segmentation on Panoramic Dental Radiographs Using U-Nets and Morphological Processing

Year 2022, Volume: 10 Issue: 1, 39 - 50, 31.01.2022
https://doi.org/10.29130/dubited.950568

Abstract

Automatic teeth segmentation in panoramic x-ray images is an important research subject in dental image analysis. In this study, we propose a post-processing stage that yields a segmentation map in which the objects in the image are separated, and we apply this technique to tooth instance segmentation with a U-Net network. The post-processing consists of grayscale morphological and filtering operations applied to the sigmoid output of the network before binarization. A Dice overlap score of 95.4±0.3% is obtained for overall teeth segmentation. The proposed post-processing stages reduce the mean tooth-counting error from 26.81% to 6.15%. To our knowledge, both the segmentation and the tooth-counting performances are the highest reported in the literature. Moreover, this is achieved using a relatively small training dataset of 105 images. Although the aim of this study is to segment tooth instances, the presented method is applicable to similar problems in other domains, such as separating cell instances.
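As a rough illustration of the pipeline the abstract describes, the sketch below applies grayscale erosion and median filtering to a synthetic sigmoid probability map before thresholding, then counts instances via connected-component labeling and computes a Dice overlap. This is a minimal sketch, not the authors' implementation: the operator choices, kernel sizes (`erosion_size`, `median_size`), and the fixed 0.5 threshold are illustrative assumptions (the paper cites Otsu's method [33], which a fixed threshold only approximates here).

```python
import numpy as np
from scipy import ndimage


def postprocess_and_count(sigmoid_map, erosion_size=3, median_size=3, threshold=0.5):
    """Sketch of grayscale post-processing applied BEFORE binarization,
    so that touching instances separate into distinct components."""
    # Grayscale erosion shrinks bright regions, widening the darker
    # gaps between adjacent teeth in the probability map.
    eroded = ndimage.grey_erosion(sigmoid_map, size=(erosion_size, erosion_size))
    # Median filtering suppresses small speckle responses.
    smoothed = ndimage.median_filter(eroded, size=median_size)
    # Binarize only after the grayscale operations.
    binary = smoothed > threshold
    # One connected component per tooth instance.
    labels, n_instances = ndimage.label(binary)
    return labels, n_instances


def dice_score(pred, truth):
    """Dice overlap between two binary masks, as reported in the abstract."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    inter = np.logical_and(pred, truth).sum()
    denom = pred.sum() + truth.sum()
    return 2.0 * inter / denom if denom else 1.0
```

The ordering is the key idea: a weak "bridge" of intermediate probability joining two teeth survives a plain threshold, but grayscale erosion pulls it below the threshold first, so the two instances emerge as separate components.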

References

  • [1]M. Glick, D. M. Williams, D. V. Kleinman, M. Vujicic, R. G. Watt, and R. J. Weyant, “A new definition for oral health developed by the FDI World Dental Federation opens the door to a universal definition of oral health,” British dental journal, vol. 221, no. 12, pp. 792–793, 2016.
  • [2]M. A. Peres, L. M. Macpherson, R. J. Weyant, B. Daly, R. Venturelli, M. R. Mathur, S. Listl, R. K. Celeste, C. C. Guarnizo-Herreño, C. Kearns et al., “Oral diseases: a global public health challenge,” The Lancet, vol. 394, no. 10194, pp. 249–260, 2019.
  • [3]N. Shah, N. Bansal, and A. Logani, “Recent advances in imaging technologies in dentistry,” World journal of radiology, vol. 6, no. 10, p. 794, 2014.
  • [4]B. Vandenberghe, R. Jacobs, and H. Bosmans, “Modern dental imaging: a review of the current technology and clinical applications in dental practice,” European radiology, vol. 20, no. 11, pp. 2637–2655, 2010.
  • [5]R. G. Birdal, E. Gumus, A. Sertbas, and I. S. Birdal, “Automated lesion detection in panoramic dental radiographs,” Oral Radiology, vol. 32, no. 2, pp. 111–118, 2016.
  • [6]E. Avuclu and F. Basciftci, “Novel approaches to determine age and gender from dental x-ray images by using multiplayer perceptron neural networks and image processing techniques,” Chaos, Solitons & Fractals, vol. 120, pp. 127 – 138, 2019.
  • [7]M. H. Bozkurt and S. Karagol, “Jaw and teeth segmentation on the panoramic x-ray images for dental human identification,” Journal of Digital Imaging, pp. 1–18, 2020.
  • [8]G. Jader, J. Fontineli, M. Ruiz, K. Abdalla, M. Pithon, and L. Oliveira, “Deep instance segmentation of teeth in panoramic x-ray images,” in 2018 31st SIBGRAPI Conference on Graphics, Patterns and Images (SIBGRAPI). IEEE, 2018, pp. 400–407.
  • [9]G. Silva, L. Oliveira, and M. Pithon, “Automatic segmenting teeth in x-ray images: Trends, a novel data set, benchmarking and future perspectives,” Expert Systems with Applications, vol. 107, pp. 15–31, 2018.
  • [10]G. Litjens, T. Kooi, B. E. Bejnordi, A. A. A. Setio, F. Ciompi, M. Ghafoorian, J. A. Van Der Laak, B. Van Ginneken, and C. I. Sánchez, “A survey on deep learning in medical image analysis,” Medical image analysis, vol. 42, pp. 60–88, 2017.
  • [11]Y. Zhao, P. Li, C. Gao, Y. Liu, Q. Chen, F. Yang, and D. Meng, “Tsasnet: Tooth segmentation on dental panoramic x-ray images by two-stage attention segmentation network,” Knowledge-Based Systems, vol. 206, p. 106338, 2020.
  • [12]S. Ren, K. He, R. Girshick, and J. Sun, “Faster r-cnn: Towards real-time object detection with region proposal networks,” in Advances in neural information processing systems, 2015, pp. 91–99.
  • [13]H. Chen, K. Zhang, P. Lyu, H. Li, L. Zhang, J. Wu, and C.-H. Lee, “A deep learning approach to automatic teeth detection and numbering based on object detection in dental periapical films,” Scientific reports, vol. 9, no. 1, pp. 1–11, 2019.
  • [14]C. Kim, D. Kim, H. Jeong, S.-J. Yoon, and S. Youm, “Automatic tooth detection and numbering using a combination of a cnn and heuristic algorithm,” Applied Sciences, vol. 10, no. 16, p. 5624, 2020.
  • [15]D. V. Tuzoff, L. N. Tuzova, M. M. Bornstein, A. S. Krasnov, M. A. Kharchenko, S. I. Nikolenko, M. M. Sveshnikov, and G. B. Bednenko, “Tooth detection and numbering in panoramic radiographs using convolutional neural networks,” Dentomaxillofacial Radiology, vol. 48, no. 4, p. 20180051, 2019.
  • [16]K. Simonyan and A. Zisserman, “Very deep convolutional networks for large-scale image recognition,” arXiv preprint arXiv:1409.1556, 2014.
  • [17]K. Zhang, J. Wu, H. Chen, and P. Lyu, “An effective teeth recognition method using label tree with cascade network structure,” Computerized Medical Imaging and Graphics, vol. 68, pp. 61–70, 2018.
  • [18]K. He, G. Gkioxari, P. Dollár, and R. Girshick, “Mask r-cnn,” in Proceedings of the IEEE international conference on computer vision, 2017, pp. 2961–2969.
  • [19]J.-H. Lee, S.-S. Han, Y. H. Kim, C. Lee, and I. Kim, “Application of a fully deep convolutional neural network to the automation of tooth segmentation on panoramic radiographs,” Oral surgery, oral medicine, oral pathology and oral radiology, vol. 129, no. 6, pp. 635–642, 2020.
  • [20]A. F. Leite, A. Van Gerven, H. Willems, T. Beznik, P. Lahoud, H. Gaêta-Araujo, M. Vranckx, and R. Jacobs, “Artificial intelligence-driven novel tool for tooth detection and segmentation on panoramic radiographs,” Clinical Oral Investigations, pp. 1–11, 2020.
  • [21]L.-C. Chen, G. Papandreou, I. Kokkinos, K. Murphy, and A. L. Yuille, “Deeplab: Semantic image segmentation with deep convolutional nets, atrous convolution, and fully connected crfs,” IEEE transactions on pattern analysis and machine intelligence, vol. 40, no. 4, pp. 834–848, 2017.
  • [22]M. Chung, M. Lee, J. Hong, S. Park, J. Lee, J. Lee, I.-H. Yang, J. Lee, and Y.-G. Shin, “Pose-aware instance segmentation framework from cone beam ct images for tooth segmentation,” Computers in Biology and Medicine, p. 103720, 2020.
  • [23]Z. Cui, C. Li, and W. Wang, “Toothnet: Automatic tooth instance segmentation and identification from cone beam ct images,” in Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 2019, pp. 6368–6377.
  • [24]F. G. Zanjani, D. A. Moin, F. Claessen, T. Cherici, S. Parinussa, A. Pourtaherian, S. Zinger et al., “Mask-mcnet: Instance segmentation in 3d point cloud of intra-oral scans,” in International Conference on Medical Image Computing and Computer-Assisted Intervention. Springer, 2019, pp. 128–136.
  • [25]O. Ronneberger, P. Fischer, and T. Brox, “U-net: Convolutional networks for biomedical image segmentation,” in International Conference on Medical image computing and computer-assisted intervention, Springer, 2015, pp. 234–241.
  • [26]O. Ronneberger, P. Fischer, and T. Brox, “Dental x-ray image segmentation using a u-shaped deep convolutional network,” in International Symposium on Biomedical Imaging (ISBI), 2015.
  • [27]T. L. Koch, M. Perslev, C. Igel, and S. S. Brandt, “Accurate segmentation of dental panoramic radiographs with u-nets,” in 2019 IEEE 16th International Symposium on Biomedical Imaging (ISBI 2019), 2019, pp. 15–19.
  • [28]N. Lu and X. Ke, “A segmentation method based on gray-scale morphological filter and watershed algorithm for touching objects image,” in Fourth International Conference on Fuzzy Systems and Knowledge Discovery (FSKD 2007), vol. 3, 2007, pp. 474–478.
  • [29]T. Falk, D. Mai, R. Bensch, Ö. Çiçek, A. Abdulkadir, Y. Marrakchi, A. Böhm, J. Deubner, Z. Jäckel, K. Seiwald et al., “U-net: deep learning for cell counting, detection, and morphometry,” Nature methods, vol. 16, no. 1, pp. 67–70, 2019.
  • [30]A. H. Abdi, S. Kasaei, and M. Mehdizadeh, “Automatic segmentation of mandible in panoramic x-ray,” Journal of Medical Imaging, vol. 2, no. 4, p. 044003, 2015.
  • [31]K. He, X. Zhang, S. Ren, and J. Sun, “Delving deep into rectifiers: Surpassing human-level performance on imagenet classification,” in Proceedings of the IEEE international conference on computer vision, 2015, pp. 1026–1034.
  • [32]D. P. Kingma and J. Ba, “Adam: A method for stochastic optimization,” in Proceedings of the 3rd international conference for learning representations (ICLR’15), 2015.
  • [33]N. Otsu, “A threshold selection method from gray-level histograms,” IEEE transactions on systems, man, and cybernetics, vol. 9, no. 1, pp. 62–66, 1979.
  • [34]F. Wilcoxon, “Individual comparisons by ranking methods,” Biometrics Bulletin, vol. 1, no. 6, pp. 80–83, 1945.
  • [35]M. Sepehrian, A. M. Deylami, and R. A. Zoroofi, “Individual teeth segmentation in cbct and msct dental images using watershed,” in 2013 20th Iranian Conference on Biomedical Engineering (ICBME), 2013, pp. 27–30.
There are 35 citations in total.

Details

Primary Language English
Subjects Engineering
Journal Section Articles
Authors

Serdar Helli 0000-0002-5894-9038

Andaç Hamamcı 0000-0003-3438-3727

Publication Date January 31, 2022
Published in Issue Year 2022 Volume: 10 Issue: 1

Cite

APA Helli, S., & Hamamcı, A. (2022). Tooth Instance Segmentation on Panoramic Dental Radiographs Using U-Nets and Morphological Processing. Duzce University Journal of Science and Technology, 10(1), 39-50. https://doi.org/10.29130/dubited.950568
AMA Helli S, Hamamcı A. Tooth Instance Segmentation on Panoramic Dental Radiographs Using U-Nets and Morphological Processing. DUBİTED. January 2022;10(1):39-50. doi:10.29130/dubited.950568
Chicago Helli, Serdar, and Andaç Hamamcı. “Tooth Instance Segmentation on Panoramic Dental Radiographs Using U-Nets and Morphological Processing”. Duzce University Journal of Science and Technology 10, no. 1 (January 2022): 39-50. https://doi.org/10.29130/dubited.950568.
EndNote Helli S, Hamamcı A (January 1, 2022) Tooth Instance Segmentation on Panoramic Dental Radiographs Using U-Nets and Morphological Processing. Duzce University Journal of Science and Technology 10 1 39–50.
IEEE S. Helli and A. Hamamcı, “Tooth Instance Segmentation on Panoramic Dental Radiographs Using U-Nets and Morphological Processing”, DUBİTED, vol. 10, no. 1, pp. 39–50, 2022, doi: 10.29130/dubited.950568.
ISNAD Helli, Serdar - Hamamcı, Andaç. “Tooth Instance Segmentation on Panoramic Dental Radiographs Using U-Nets and Morphological Processing”. Duzce University Journal of Science and Technology 10/1 (January 2022), 39-50. https://doi.org/10.29130/dubited.950568.
JAMA Helli S, Hamamcı A. Tooth Instance Segmentation on Panoramic Dental Radiographs Using U-Nets and Morphological Processing. DUBİTED. 2022;10:39–50.
MLA Helli, Serdar and Andaç Hamamcı. “Tooth Instance Segmentation on Panoramic Dental Radiographs Using U-Nets and Morphological Processing”. Duzce University Journal of Science and Technology, vol. 10, no. 1, 2022, pp. 39-50, doi:10.29130/dubited.950568.
Vancouver Helli S, Hamamcı A. Tooth Instance Segmentation on Panoramic Dental Radiographs Using U-Nets and Morphological Processing. DUBİTED. 2022;10(1):39-50.