Conference Paper

Gender Classification from Eye Images by Using Pretrained Convolutional Neural Networks

Volume: 14, December 31, 2021
  • Yucel Cımtay
  • Gokce Nur Yılmaz

Abstract

Automatic gender classification from face images has been a popular research topic for a decade. Feature extraction and classification methods are crucial to building a successful automatic classification system. Owing to the richness of today's face image datasets, many successful machine learning and deep learning methods have been implemented. When traditional machine learning methods are used, extracting accurate features from the datasets is critical for achieving promising classification scores. Deep learning models, by contrast, are designed to extract features automatically from the raw data, which automates feature extraction as well as classification. Deep neural networks can discover hidden and unpredictable feature sets, which can increase classification performance compared to traditional machine learning methods. Convolutional Neural Networks (CNNs), one of the most effective classes of deep models, have been adopted by many researchers for solving the gender classification problem. They can cope with the fact that facial cues vary from origin to origin, which makes accurate feature extraction harder. Several state-of-the-art pretrained CNN architectures are very successful on image classification problems. The performance of CNNs is generally higher when a large amount of input data is available. In this study, however, the success of pretrained CNN models is investigated when the data is limited. Accordingly, rather than using complete face images, only single-eye image regions including the eyebrows are used for gender classification. The performance results show that the best-performing CNN models are NASNetLarge and Xception.

Details

Primary Language

English

Subjects

Engineering

Journal Section

Conference Paper

Authors

Yucel Cımtay
Türkiye

Gokce Nur Yılmaz
Türkiye

Publication Date

December 31, 2021

Submission Date

March 10, 2021

Acceptance Date

May 31, 2021

Published in Issue

Year 2021, Volume: 14

APA
Cımtay, Y., & Yılmaz, G. N. (2021). Gender Classification from Eye Images by Using Pretrained Convolutional Neural Networks. The Eurasia Proceedings of Science Technology Engineering and Mathematics, 14, 39-44. https://doi.org/10.55549/epstem.1050171