Research Article

CNN-based Gender Prediction in Uncontrolled Environments

Year 2021, Volume: 9, Issue: 2, 890–898, 25.04.2021
https://doi.org/10.29130/dubited.763427

Abstract

With the ever-growing amount of data being produced and collected, the use of artificial intelligence technologies has become inevitable. Among these technologies, deep learning techniques achieve high performance on tasks such as classification and face analysis in image processing and computer vision. In this study, a Convolutional Neural Network (CNN), one of the deep learning algorithms, was used: a model built with this architecture was trained on facial images to predict gender. The aim of the study is to classify low-resolution images with high accuracy. In the experiments, the model achieved a 93.71% success rate on the VGGFace2 dataset and 85.52% on the Adience dataset.
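A gender classifier of this kind is typically assembled from convolution, activation, pooling, and fully connected layers. The abstract does not specify the paper's architecture, so the sketch below is only a toy NumPy forward pass over those standard building blocks for a single low-resolution (32×32) grayscale face; every layer size and parameter here is an illustrative assumption, not the authors' model.

```python
import numpy as np

# Toy CNN forward pass for binary (male/female) classification of one
# low-resolution face. Architecture and parameters are illustrative only.
rng = np.random.default_rng(0)

def conv2d(img, kernel):
    """Valid 2-D convolution (single channel, single filter)."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def max_pool(x, size=2):
    """Non-overlapping max-pooling; crops to a multiple of `size`."""
    h, w = x.shape
    h, w = h - h % size, w - w % size
    x = x[:h, :w].reshape(h // size, size, w // size, size)
    return x.max(axis=(1, 3))

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def predict(face, kernel, weights, bias):
    """Returns the two class probabilities for one grayscale face."""
    feat = np.maximum(conv2d(face, kernel), 0)   # convolution + ReLU
    feat = max_pool(feat).ravel()                # downsample + flatten
    return softmax(feat @ weights + bias)        # fully connected + softmax

# A random 32x32 "low-resolution face" and randomly initialised parameters.
face = rng.standard_normal((32, 32))
kernel = rng.standard_normal((3, 3))
feat_len = ((32 - 2) // 2) ** 2                  # 15 * 15 pooled features
weights = rng.standard_normal((feat_len, 2)) * 0.01
bias = np.zeros(2)

probs = predict(face, kernel, weights, bias)
print(probs)  # two non-negative class probabilities summing to 1
```

In a real system these parameters would be learned by backpropagation over a labelled face dataset rather than drawn at random; the sketch only shows the shape of the computation.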

References

  • [1] J. M. Gandarias, A. J. García-Cerezo and J. M. Gómez-de-Gabriel, “CNN-based methods for object recognition with high-resolution tactile sensors,” IEEE Sensors Journal, vol. 19, no. 16, pp. 6872–6882, 2019.
  • [2] W. Chan, N. Jaitly, Q. Le, O. Vinyals and N. Shazeer, “Speech recognition with attention-based recurrent neural networks,” U.S. Patent Appl. 20200118554A1, Apr. 16, 2020.
  • [3] K. Simonyan and A. Zisserman, “Very deep convolutional networks for large-scale image recognition,” in Proc. International Conference on Learning Representations (ICLR), 2015, pp. 1–14.
  • [4] B. Shrestha, Y. Kwon, D. Chung and W. Gal, “The atrous CNN method with short computation time for super-resolution,” International Journal of Computing and Digital Systems, vol. 9, no. 2, pp. 221–227, 2020.
  • [5] E. Learned-Miller, G. B. Huang, A. RoyChowdhury, H. Li and G. Hua, “Labeled faces in the wild: A survey,” in Advances in Face Detection and Facial Image Analysis, 1st ed., Cham, Switzerland: Springer, 2016, pp. 189–248.
  • [6] N. Van Balen, C. Ball and H. Wang, “Analysis of targeted mouse movements for gender classification,” EAI Endorsed Transactions on Security and Safety, vol. 4, no. 11, 2017.
  • [7] J. E. Tapia and C. A. Perez, “Gender classification from NIR images by using quadrature encoding filters of the most relevant features,” IEEE Access, vol. 7, pp. 29114–29127, 2019.
  • [8] D. Yaman, F. I. Eyiokur and H. K. Ekenel, “Multimodal age and gender classification using ear and profile face images,” in Proc. Computer Vision and Pattern Recognition Workshops (CVPRW), 2019, pp. 2414–2421.
  • [9] N. A. Nazifa, C. Y. Fook, L. C. Chin, V. Vijean and E. S. Kheng, “Gender prediction by speech analysis,” Journal of Physics: Conference Series, vol. 1372, no. 012011, 2019.
  • [10] B. Kaur, D. Singh and P. P. Roy, “Age and gender classification using brain–computer interface,” Neural Computing and Applications, vol. 31, no. 10, pp. 5887–5900, 2019.
  • [11] C. Bhagvati, “Word representations for gender classification using deep learning,” Procedia Computer Science, vol. 132, pp. 614–622, 2018.
  • [12] Z. Q. Mawlood and A. T. Sabir, “Human gait-based gender classification using neutral and non-neutral gait sequences,” Revista Innovaciencia, vol. 7, no. 1, pp. 1–13, 2019.
  • [13] J. A. Polin and O. Khan, “Gender identification from smart phone usage using machine learning algorithm,” B.Sc. Report, Department of Computer Science and Engineering, Daffodil International University, Dhaka, Bangladesh, 2019.
  • [14] M. Afifi, “11K Hands: gender recognition and biometric identification using a large dataset of hand images,” Multimedia Tools and Applications, vol. 78, no. 15, pp. 20835–20854, 2019.
  • [15] S. Bentaieb, A. Ouamri and M. Keche, “SAX2SEX: Gender classification on 3D faces using symbolic aggregate approximation,” in Proc. 6th International Conference on Image and Signal Processing and their Applications (ISPA), 2019, pp. 1–5.
  • [16] V. Muthukumar, T. Pedapati, N. Ratha, P. Sattigeri, C. Wu, B. Kingsbury, A. Kumar, S. Thomas, A. Mojsilovic and K. Varshney, “Understanding unequal gender classification accuracy from face images,” 2018, arXiv:1812.00099.
  • [17] F. Bougourzi, S. Bekhouche, M. Zighem, A. Benlamoudi, T. Ouafi and A. Taleb-Ahmed, “A comparative study on textures descriptors in facial gender classification,” presented at the 10ème Conférence sur le Génie Electrique, Bordj El Bahri, Algeria, 2017.
  • [18] O. Arriaga, M. Valdenegro-Toro and P. Plöger, “Real-time convolutional neural networks for emotion and gender classification,” in Proc. European Symposium on Artificial Neural Networks (ESANN), 2019, pp. 221–226.
  • [19] G. Levi and T. Hassner, “Age and gender classification using convolutional neural networks,” in Proc. Computer Vision and Pattern Recognition Workshops (CVPRW), 2015, pp. 34–42.
  • [20] J. Zhang, Y. Xie, Q. Wu and Y. Xia, “Medical image classification using synergic deep learning,” Medical Image Analysis, vol. 54, pp. 10–19, 2019.
  • [21] F. V. Massoli, G. Amato, F. Falchi, C. Gennaro and C. Vairo, “Improving multi-scale face recognition using VGGFace2,” in Proc. International Conference on Image Analysis and Processing (ICIAP), 2019, pp. 21–29.
  • [22] F. Juefei-Xu, E. Verma, P. Goel, A. Cherodian and M. Savvides, “DeepGender: occlusion and low resolution robust facial gender classification via progressively trained convolutional neural networks with attention,” in Proc. Computer Vision and Pattern Recognition Workshops (CVPRW), 2016, pp. 68–77.
  • [23] H. Mliki, S. Dammak and E. Fendri, “An improved multi-scale face detection using convolutional neural network,” Signal, Image and Video Processing, vol. 14, no. 7, pp. 1345–1353, 2020.
  • [24] M. Masud, G. Muhammad, H. Alhumyani, S. S. Alshamrani, O. Cheikhrouhou, S. Ibrahim and M. S. Hossain, “Deep learning-based intelligent face recognition in IoT-cloud environment,” Computer Communications, vol. 152, pp. 215–222, 2020.
  • [25] S. M. González-Lozoya, J. de la Calleja, L. Pellegrin, H. J. Escalante, M. A. Medina and A. Benitez-Ruiz, “Recognition of facial expressions based on CNN features,” Multimedia Tools and Applications, vol. 79, pp. 13987–14007, 2020.
  • [26] A. Chaudhuri, “Deep learning models for face recognition: A comparative analysis,” in Deep Biometrics, 1st ed., Cham, Switzerland: Springer, 2020, pp. 99–140.
  • [27] Y. Luo, Y. Shao, H. Chu, B. Wu, M. Huang and Y. Rao, “CNN-based blade tip vortex region detection in flow field,” in Proc. International Conference on Graphics and Image Processing (ICGIP), 2019, vol. 11373.
  • [28] Y. Wang, M. Liu, P. Zheng, H. Yang and J. Zou, “A smart surface inspection system using faster R-CNN in cloud-edge computing environment,” Advanced Engineering Informatics, vol. 43, no. 101037, 2020.
  • [29] O. A. Aghdam, B. Bozorgtabar, H. K. Ekenel, J. Thiran, “Exploring factors for improving low resolution face recognition,” in Proc. CVPR Workshops, 2019, pp. 2363–2370.
  • [30] G. Tolias, R. Sicre and H. Jégou, “Particular object retrieval with integral max-pooling of CNN activations,” in Proc. International Conference on Learning Representations (ICLR), 2016, pp. 1–12.
  • [31] X. Liang, X. Wang, Z. Lei, S. Liao and S. Li, “Soft-margin softmax for deep classification,” in Proc. NIPS, 2017, pp. 413–421.
  • [32] A. Meliboev, J. Alikhanov and W. Kim, “1D CNN based network intrusion detection with normalization on imbalanced data,” in Proc. International Conference on Artificial Intelligence in Information and Communication (ICAIIC), 2020, pp. 218–224.
  • [33] O. M. Parkhi, A. Vedaldi and A. Zisserman, “Deep face recognition,” in Proc. The British Machine Vision Conference (BMVC), 2015, pp. 1–12.
  • [34] Q. Cao, L. Shen, W. Xie, O. M. Parkhi and A. Zisserman, “VGGFace2: A dataset for recognising faces across pose and age,” in Proc. International Conference on Automatic Face & Gesture Recognition (FG), 2018, pp. 67–74.
  • [35] E. Eidinger, R. Enbar and T. Hassner, “Age and gender estimation of unfiltered faces,” IEEE Transactions on Information Forensics and Security, vol. 9, no. 12, pp. 2170–2179, 2014.
  • [36] S. C. Mynepalli, P. Hu and D. Ramanan, “Recognizing tiny faces,” in Proc. International Conference on Computer Vision Workshops (ICCVW), 2019, pp. 1121–1130.
  • [37] Z. Liao, S. Petridis and M. Pantic, “Local deep neural networks for age and gender classification,” 2017, arXiv:1703.08497.
  • [38] T. Hassner, S. Harel, E. Paz and R. Enbar, “Effective face frontalization in unconstrained images,” in Proc. Computer Vision and Pattern Recognition (CVPR), 2015, pp. 4295–4304.
  • [39] G. Levi and T. Hassner, “Emotion recognition in the wild via convolutional neural networks and mapped binary patterns,” in Proc. ACM on International Conference on Multimodal Interaction, 2015, pp. 503–510.

Turkish title: Kontrolsüz Ortamlarda CNN Tabanlı Cinsiyet Tahmini


Details

Primary Language: English
Subjects: Engineering
Section: Articles
Authors

Kazım Yıldız 0000-0001-6999-1410

Engin Güneş 0000-0003-3757-5214

Anil Bas 0000-0002-3833-6023

Publication Date: April 25, 2021
Published Issue: Year 2021, Volume: 9, Issue: 2

How to Cite

APA Yıldız, K., Güneş, E., & Bas, A. (2021). CNN-based Gender Prediction in Uncontrolled Environments. Duzce University Journal of Science and Technology, 9(2), 890-898. https://doi.org/10.29130/dubited.763427
AMA Yıldız K, Güneş E, Bas A. CNN-based Gender Prediction in Uncontrolled Environments. DÜBİTED. April 2021;9(2):890-898. doi:10.29130/dubited.763427
Chicago Yıldız, Kazım, Engin Güneş, and Anil Bas. “CNN-Based Gender Prediction in Uncontrolled Environments”. Duzce University Journal of Science and Technology 9, no. 2 (April 2021): 890-98. https://doi.org/10.29130/dubited.763427.
EndNote Yıldız K, Güneş E, Bas A (01 April 2021) CNN-based Gender Prediction in Uncontrolled Environments. Duzce University Journal of Science and Technology 9 2 890–898.
IEEE K. Yıldız, E. Güneş, and A. Bas, “CNN-based Gender Prediction in Uncontrolled Environments”, DÜBİTED, vol. 9, no. 2, pp. 890–898, 2021, doi: 10.29130/dubited.763427.
ISNAD Yıldız, Kazım et al. “CNN-Based Gender Prediction in Uncontrolled Environments”. Duzce University Journal of Science and Technology 9/2 (April 2021), 890-898. https://doi.org/10.29130/dubited.763427.
JAMA Yıldız K, Güneş E, Bas A. CNN-based Gender Prediction in Uncontrolled Environments. DÜBİTED. 2021;9:890–898.
MLA Yıldız, Kazım et al. “CNN-Based Gender Prediction in Uncontrolled Environments”. Duzce University Journal of Science and Technology, vol. 9, no. 2, 2021, pp. 890-8, doi:10.29130/dubited.763427.
Vancouver Yıldız K, Güneş E, Bas A. CNN-based Gender Prediction in Uncontrolled Environments. DÜBİTED. 2021;9(2):890-8.