Research Article

A method for analyzing suspect-filler similarity using convolutional neural networks

Year 2022, 129-151, 30.12.2022
https://doi.org/10.33769/aupse.1184112

Abstract

Eyewitness misidentification is one of the leading factors in wrongful convictions. This study focuses on the structure of lineups, one of the factors that contribute to misidentification, and on the use of artificial intelligence (AI) technologies to select the fillers included in a lineup. AI-based face recognition systems are used to determine how similar candidate fillers are to the suspect. Using two different face recognition models built on a Convolutional Neural Network (CNN) architecture, similarity threshold values close to human performance were calculated (VGG-Face with cosine similarity = 0.383; FaceNet with Euclidean L2 distance = 1.16). The second part of the study examines the problems that face recognition systems are likely to cause when used to select fillers. The results reveal that face recognition models alone may not suffice for filler selection, and that a more advanced structure combining face recognition models with CNN models trained to recognize other attributes associated with similarity (race, gender, age, etc.) would produce more accurate results. The last part of the study introduces a Line-up application that can analyze attributes such as facial similarity, race, gender, age, and facial expression.
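
The references below cite the open-source deepface framework (Serengil and Özpınar), which provides both face verification and facial attribute analysis, so the workflow described above can be sketched with it. The code below is an illustration only, not the authors' Line-up application: the image file names, the simple accept/reject rule, and the way the two model/metric pairs are combined are assumptions, while the thresholds are the values reported in the abstract.

```python
# Minimal sketch (not the authors' Line-up tool): file names, the candidate
# filler, and the acceptance rule below are hypothetical placeholders.
from deepface import DeepFace

# Model/metric pairs with the similarity thresholds reported in the abstract.
SETTINGS = [
    {"model_name": "VGG-Face", "distance_metric": "cosine", "threshold": 0.383},
    {"model_name": "Facenet", "distance_metric": "euclidean_l2", "threshold": 1.16},
]


def filler_similarity(suspect_img: str, filler_img: str) -> dict:
    """Compare one candidate filler against the suspect under both settings."""
    report = {}
    for s in SETTINGS:
        result = DeepFace.verify(
            img1_path=suspect_img,
            img2_path=filler_img,
            model_name=s["model_name"],
            distance_metric=s["distance_metric"],
        )
        # A filler is treated as "similar enough" when its distance to the
        # suspect falls below the threshold calibrated for that model/metric.
        report[s["model_name"]] = {
            "distance": result["distance"],
            "similar": result["distance"] <= s["threshold"],
        }
    return report


def attribute_profile(img: str) -> dict:
    """Estimate age, gender, race, and facial expression for one photograph."""
    analysis = DeepFace.analyze(img_path=img, actions=["age", "gender", "race", "emotion"])
    if isinstance(analysis, list):  # newer deepface versions return a list of faces
        analysis = analysis[0]
    return {
        "age": analysis["age"],
        "gender": analysis.get("dominant_gender", analysis.get("gender")),
        "race": analysis["dominant_race"],
        "emotion": analysis["dominant_emotion"],
    }


if __name__ == "__main__":
    # Hypothetical lineup photographs.
    print(filler_similarity("suspect.jpg", "candidate_filler.jpg"))
    print(attribute_profile("candidate_filler.jpg"))
```

In practice, a filler pool could be ranked by these distances and then filtered on the attribute profile, mirroring the combined face-recognition-plus-attribute structure the study argues for.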

Supporting Institution

None

Project Number

None

Thanks

None

References

  • Connors, E., Lundregan, T., Miller, N., McEwen, T., Convicted by Juries, Exonerated by Science: Case Studies in the Use of DNA Evidence to Establish Innocence After Trial, Office of Justice Programs, 1996.
  • Garrett, B., Convicting the Innocent: Where Criminal Prosecutions Go Wrong, Harvard University Press, 2011.
  • Innocence Project, Eyewitness Identification Reform, 2018. Retrieved February 17, 2022, https://innocenceproject.org/eyewitness-identification-reform/.
  • Saks, M. J., Koehler, J. J., The coming paradigm shift in forensic identification science, Science, 309 (5736) (2005), 892–895, https://doi.org/10.1126/science.1111565.
  • Berkowitz, S. R., Loftus, E., Misinformation in the courtroom, H. Otgaar, M. L. Howe (Eds.), Finding the Truth in the Courtroom: Dealing with Deception, Lies, and Memories, Oxford University Press, 2018, 11–20.
  • Wells, G. L., Applied eyewitness-testimony research: system variables and estimator variables, J. Pers. Soc. Psychol., 36 (12) (1978), 1546-1557, https://doi.org/10.1037/0022-3514.36.12.1546.
  • Meissner, C. A., Sporer, S. L., Schooler, J. W., Person descriptions as eyewitness evidence, Handbook of Eyewitness Psychology: Memory for People, 2007, 3-34.
  • Reisberg, D., Heuer, F., Remembering emotional events, Memory and Emotion, Oxford University Press, 2004, 3-41.
  • Semmler, C., Dunn, J., Wixted, J. T., The role of estimator variables in eyewitness identification, J. Exp. Psychol. Appl., 24 (3) (2018), 400-415, https://doi.org/10.1037/xap0000157.
  • Siegel, J. M., Loftus, E. F., Impact of anxiety and life stress upon eyewitness testimony, Bull. Psychon. Soc., (12) (1978), 479-480.
  • Yarmey, A. D., Jacob, J., Porter, A., Person recall in field settings, J. Appl. Soc. Psychol., 32 (11) (2002), 2354-2367, https://doi.org/10.1111/j.1559-1816.2002.tb01866.x.
  • Wells, G. L., Olson, E. A., Eyewitness testimony, Annu. Rev. Psychol., 54 (2003), 277-295, https://doi.org/10.1146/annurev.psych.54.101601.145028.
  • Wells, G. L., Kovera, M. B., Douglass, A. B., Brewer, N., Meissner, C. A., Wixted, J. T., Policy and procedure recommendations for the collection and preservation of eyewitness identification evidence, Law Hum. Behav., 44 (1) (2020), 3-36, https://doi.org/10.1037/lhb0000359.
  • Wells, G. L., Loftus, E., Eyewitness memory for people and events, Handbook of Psychology: Forensic Psychology, Vol. 11, John Wiley and Sons Inc, 2003, 149-160.
  • Memon, A., Higham, P. A., A review of the cognitive interview, Psychol. Crime Law, 5 (1-2) (1999), 177-196, https://doi.org/10.1080/10683169908415000.
  • Semmler, C., Brewer, N., Wells, G. L., Effects of postidentification feedback on eyewitness identification and nonidentification confidence, J. Appl. Psychol., 89 (2) (2004), 334-346, https://doi.org/10.1037/0021-9010.89.2.334.
  • Davis, D., Loftus, E. F., Internal and external sources of misinformation in the adult witness, The Handbook of Eyewitness Psychology, Vol. 1: Memory for Events, 2007, 195-237.
  • Wells, G. L., Eyewitness identification: A System Handbook, Carswell Legal Publications, 1988.
  • Fitzgerald, R., Price, H. L., Valentine, T., Eyewitness identification: Live, photo, and video lineups, Psychol. Public Policy Law, 24 (3) (2018), 307-325, http://doi.org/10.1037/law0000164.
  • Wells, G. L., Steblay, N. K., Dysart, J. E., A Test of the Simultaneous vs. Sequential Lineup Methods: An Initial Report of the AJS National Eyewitness Identification Field Studies, 2011.
  • Luus, C. A. E., Wells, G. L., Eyewitness identification and the selection of distracters for lineups, Law Hum. Behav., 15 (1991), 43-47, https://doi.org/10.1007/BF01044829.
  • Wells, G. L., Small, M., Penrod, S., Malpass, R. S., Fulero, S. M., Brimacombe, C.A.E., Eyewitness identification procedures: Recommendations for lineups and photospreads, Law Hum. Behav., 22 (6) (1998), 603-647, http://doi.org/10.1023/A:1025750605807.
  • Fitzgerald, R. J., Oriet, C., Price, H. L., Suspect filler similarity in eyewitness lineups: A literature review and a novel methodology, Law Hum. Behav., 39 (2015), 62-74, http://doi.org/10.1037/lhb0000095.
  • Fitzgerald, R. J., Price, H. L., Oriet, C., Charman, S. D., The effect of suspect-filler similarity on eyewitness identification decisions: A meta-analysis, Psychol. Public Policy Law, 19 (2) (2013), 151–164, https://doi.org/10.1037/a0030618.
  • Fitzgerald, R., Rubinova, E., Juncu, S., Eyewitness identification around the world, Methods, Measures, and Theories in Eyewitness Identification Tasks, 2021, 294-322, http://doi.org/10.4324/9781003138105-16.
  • Carlson, C. A., Jones, A. R., Whittington, J. E., Lockamyeir, R. F., Carlson, M. A., Wooten, A. R., Lineup fairness: propitious heterogeneity and the diagnostic feature-detection hypothesis, Cogn. Res. Princ. Implic., 4 (2019), 20–26. http://doi.org/10.1186/s41235-019-0172-5.
  • Yates, S. Q., Memorandum for Heads of Department Law Enforcement Components All Department Prosecutors, Department of Justice, 2017.
  • Bergold, N. A., Heaton, P., Does filler database size influence identification accuracy?, Law Hum. Behav., 42 (3) (2018), 227–243, https://doi.org/10.1037/lhb0000289.
  • National Research Council, Identifying the Culprit: Assessing Eyewitness Identification, The National Academies Press, 2014.
  • Police Executive Research Forum (PERF), Library of Congress, 2013. Retrieved December 23, 2021, https://www.loc.gov/item/lcwaN0009235/.
  • Memon, A., Havard, C., Clifford, B., Gabbert, F., Watt, M., A field evaluation of the VIPER system: A new technique for eliciting eyewitness identification evidence, Psychol. Crime Law, 17(8) (2011), 711-729, https://doi.org/10.1080/10683160903524333.
  • Tredoux, C., A direct measure of facial similarity and its relation to human similarity perceptions, J. Exp. Psychol. Appl., 8 (3) (2002), 180-193, https://doi.org/10.1037/1076-898X.8.3.180.
  • Valentine, T., A unified account of the effects of distinctiveness, inversion, and race in face recognition, Q. J. Exp. Psychol. A., 43 (1991), 161-204, https://doi.org/10.1080/14640749108400966.
  • Lee, J., Mansour, J., Penrod, S., Validity of mock-witness measures for assessing lineup fairness, Psychol. Crime Law, 28 (3) (2021), 215-245, https://doi.org/10.1080/1068316X.2021.1905811.
  • Kortli, Y., Jridi, M., Al Falou, A., Atri, M., Face recognition systems: a survey, Sensors, 20 (2) (2020), 342, https://doi.org/10.3390/s20020342.
  • Kumar, A., Kaur, A., Kumar, M., Face detection techniques: a review, Artif. Intell. Rev., 52 (2019), 927-948, https://doi.org/10.1007/s10462-018-9650-2.
  • Manjunath, B. S., Chellappa, R., von der Malsburg, C., A feature based approach to face recognition, Proceedings 1992 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, (1992), 373-378, https://doi.org/10.1109/CVPR.1992.223162.
  • Kalocsai, P., von der Malsburg, C., Horn, J., Face recognition by statistical analysis of feature detectors. Image Vis. Comput., 18 (4) (2000), 273-278, https://doi.org/10.1016/S0262-8856(99)00051-7.
  • Alzubaidi, L., Zhang, J., Humaidi, A. J., Review of deep learning: concepts, CNN architectures, challenges, applications, future directions, J. Big Data, 8 (53) (2021), https://doi.org/10.1186/s40537-021-00444-8.
  • Serengil, S. İ., Özpınar, A., LightFace: A hybrid deep face recognition framework, 2020 Innovations in Intelligent Systems and Applications Conference (ASYU), (2020), 23-27, https://doi.org/10.1109/ASYU50717.2020.9259802.
  • Yamashita, R., Nishio, M., Do, R.K.G., Convolutional neural networks: an overview and application in radiology, Insights. Imaging., 9 (2018), 611-629, https://doi.org/10.1007/s13244-018-0639-9.
  • Buolamwini, J., Gender Shades [Master Thesis, Massachusetts Institute of Technology], MIT Media Lab, 2017, https://www.media.mit.edu/publications/full-gender-shadesthesis-17/.
  • Taigman, Y., Yang, M., Ranzato, M., Wolf, L., DeepFace: Closing the gap to human-level performance in face verification, Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, (2014), 1701-1708, https://doi.org/10.1109/CVPR.2014.220.
  • Serengil, S. İ., deepface/README.md at master · serengil/deepface · GitHub, 2020. Retrieved November 13, 2021, https://github.com/serengil/deepface/blob/master/README.md.
  • Huang, G. B., Ramesh, M., Berg, T., Learned-Miller, E., Labeled Faces in the Wild: A Database for Studying Face Recognition in Unconstrained Environments, 2007.
  • Parkhi, O. M., Vedaldi, A., Zisserman, A., Deep Face Recognition, Visual Geometry Group, University of Oxford, 2015. Retrieved November 23, 2021, https://www.robots.ox.ac.uk/vgg/publications/2015/Parkhi15/parkhi15.pdf.
  • Kumar, N., Berg, A. C., Belhumeur, P. N., Nayar, S. K., Attribute and simile classifiers for face verification, Proceedings of the IEEE International Conference on Computer Vision, (2009), 365-372, https://doi.org/10.1109/ICCV.2009.5459250.
  • Schroff, F., Kalenichenko, D., Philbin, J., FaceNet: A unified embedding for face recognition and clustering, 2015 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), (2015), 815-823, https://doi.org/10.1109/CVPR.2015.7298682.
  • Serengil, S. İ., deepface/Fine-Tuning-Threshold.ipynb at master · serengil/deepface. GitHub, 2020. Retrieved November 13, 2021, https://github.com/serengil/deepface/blob/master/tests/Fine-Tuning-Threshold.ipynb.
  • Krishnapriya, K. S., Vangara, K., King, M. C., Albiero, V., Bowyer, K. W., Characterizing the variability in face recognition accuracy relative to race, (2019), arXiv:1904.07325, https://arxiv.org/abs/1904.07325.
  • Tabachnick, B. G., Fidell, L. S., Using Multivariate Statistics (6th ed.), Boston, MA: Pearson, 2013.
  • Betaface, Betaface free online demo - Face recognition, Face search, Face analysis, Betaface API. Retrieved February 7, 2022, https://www.betafaceapi.com/demo.html.
  • Buolamwini, J., Gebru, T., Gender shades: Intersectional accuracy disparities in commercial gender classification, Proceedings of the 1st Conference on Fairness, Accountability and Transparency, in Proceedings of Machine Learning Research, 81 (2018), 77-91, https://proceedings.mlr.press/v81/buolamwini18a.html.
  • Raji, I., Buolamwini, J., Actionable auditing: investigating the impact of publicly naming biased performance results of commercial AI products, Conference on Artificial Intelligence, Ethics, and Society, (2019), https://damprod.media.mit.edu/x/2019/01/24/AIES-19_paper_223.pdf.
  • Tian, J., Xie, H., Hu, S., Liu, J., Multidimensional face representation in a deep convolutional neural network reveals the mechanism underlying AI racism, Frontiers in Computational Neuroscience, 2021. Retrieved December 26, 2021, https://www.frontiersin.org/articles/10.3389/fncom.2021.620281/full.
  • Han, H., Jain, A. K., Age, gender and race estimation from unconstrained face images (Rep.). Michigan State University, 2014.
  • CEPEJ (European Commission for the Efficiency of Justice), European Ethical Charter on the use of artificial intelligence (AI) in judicial systems and their environment, Council of Europe, 2018. Retrieved November 11, 2021, https://www.coe.int/en/web/cepej/cepej-european-ethical-charter-on-the-use-of-artificial-intelligence-ai-in-judicial-systems-and-their-environment.
  • European Commission, EUR-Lex. Retrieved November 11, 2021, https://eur-lex.europa.eu/resource.html?uri=cellar:e0649735-a372-11eb-9585-01aa75ed71a1.0001.02/DOC_1&format=PDF.
  • Serengil, S. İ., Özpınar, A., HyperExtended LightFace: A facial attribute analysis framework, 2021 International Conference on Engineering and Emerging Technologies (ICEET), (2021), 1-4, https://doi.org/10.1109/ICEET53442.2021.9659697.
There are 59 citations in total.

Details

Primary Language English
Subjects Engineering
Journal Section Research Articles
Authors

Derviş Emre Aydın 0000-0001-6128-6514

Yilmaz Ar 0000-0003-2370-357X

Project Number None
Publication Date December 30, 2022
Submission Date October 4, 2022
Acceptance Date November 2, 2022
Published in Issue Year 2022

Cite

APA Aydın, D. E., & Ar, Y. (2022). A method for analyzing suspect-filler similarity using convolutional neural networks. Communications Faculty of Sciences University of Ankara Series A2-A3 Physical Sciences and Engineering, 64(2), 129-151. https://doi.org/10.33769/aupse.1184112
AMA Aydın DE, Ar Y. A method for analyzing suspect-filler similarity using convolutional neural networks. Commun.Fac.Sci.Univ.Ank.Series A2-A3: Phys.Sci. and Eng. December 2022;64(2):129-151. doi:10.33769/aupse.1184112
Chicago Aydın, Derviş Emre, and Yilmaz Ar. “A Method for Analyzing Suspect-Filler Similarity Using Convolutional Neural Networks”. Communications Faculty of Sciences University of Ankara Series A2-A3 Physical Sciences and Engineering 64, no. 2 (December 2022): 129-51. https://doi.org/10.33769/aupse.1184112.
EndNote Aydın DE, Ar Y (December 1, 2022) A method for analyzing suspect-filler similarity using convolutional neural networks. Communications Faculty of Sciences University of Ankara Series A2-A3 Physical Sciences and Engineering 64 2 129–151.
IEEE D. E. Aydın and Y. Ar, “A method for analyzing suspect-filler similarity using convolutional neural networks”, Commun.Fac.Sci.Univ.Ank.Series A2-A3: Phys.Sci. and Eng., vol. 64, no. 2, pp. 129–151, 2022, doi: 10.33769/aupse.1184112.
ISNAD Aydın, Derviş Emre - Ar, Yilmaz. “A Method for Analyzing Suspect-Filler Similarity Using Convolutional Neural Networks”. Communications Faculty of Sciences University of Ankara Series A2-A3 Physical Sciences and Engineering 64/2 (December 2022), 129-151. https://doi.org/10.33769/aupse.1184112.
JAMA Aydın DE, Ar Y. A method for analyzing suspect-filler similarity using convolutional neural networks. Commun.Fac.Sci.Univ.Ank.Series A2-A3: Phys.Sci. and Eng. 2022;64:129–151.
MLA Aydın, Derviş Emre and Yilmaz Ar. “A Method for Analyzing Suspect-Filler Similarity Using Convolutional Neural Networks”. Communications Faculty of Sciences University of Ankara Series A2-A3 Physical Sciences and Engineering, vol. 64, no. 2, 2022, pp. 129-51, doi:10.33769/aupse.1184112.
Vancouver Aydın DE, Ar Y. A method for analyzing suspect-filler similarity using convolutional neural networks. Commun.Fac.Sci.Univ.Ank.Series A2-A3: Phys.Sci. and Eng. 2022;64(2):129-51.

Communications Faculty of Sciences University of Ankara Series A2-A3 Physical Sciences and Engineering

This work is licensed under a Creative Commons Attribution 4.0 International License.