Review

THE ETHICAL DIMENSIONS OF LAW ENFORCEMENT'S USE OF FACIAL RECOGNITION TECHNOLOGY

Year 2025, Issue: Kolluk Uygulamaları ve Güvenlik Teknolojileri Özel Sayısı, 184 - 204, 27.10.2025

Abstract

The purpose of this conceptual study is to comprehensively examine the ethical issues arising from law enforcement agencies' use of Facial Recognition Technology (FRT) and Artificial Intelligence (AI)-based security applications. Without disregarding the opportunities these technological tools offer for ensuring public security and preventing crime, the study discusses their relationship with ethical concerns such as individual privacy, the risk of discrimination, the sense of pervasive surveillance, transparency, and accountability. Conducted as a general review of the literature, this work seeks to provide a holistic assessment of how technological advancements like FRT can be implemented ethically in law enforcement services, emphasizing that liberties must not be overlooked while security is ensured. In this regard, the study also puts forward guiding recommendations for managing this complex relationship in a balanced manner. It concludes that although law enforcement agencies collect individuals' facial biometric data for legitimate public purposes, such as maintaining public order and security, preventing crime, and tracking offenders, this data can nonetheless open the door to erroneous and unethical practices for a variety of reasons.

References

  • Ali, W., Tian, W., Din, S. U., Iradukunda, D. and Khan, A. A. (2021). Classical and modern face recognition approaches: a complete review. Multimedia Tools and Applications, 80, 4825-4880. https://doi.org/10.1007/s11042-020-09850-1
  • Almeida, D., Shmarko, K. and Lomas, E. (2022). The ethics of facial recognition technologies, surveillance, and accountability in an age of artificial intelligence: a comparative analysis of US, EU, and UK regulatory frameworks. AI and Ethics, 2(3), 377-387. https://doi.org/10.1007/s43681-021-00077-w
  • Barocas, S. and Selbst, A. D. (2016). Big data's disparate impact. California Law Review, 104(3), 671-732.
  • Buolamwini, J. and Gebru, T. (2018). Gender shades: Intersectional accuracy disparities in commercial gender classification. In Conference on Fairness, Accountability and Transparency (pp. 77-91). http://proceedings.mlr.press/v81/buolamwini18a/buolamwini18a.pdf (12.03.2025).
  • Burrell, J. (2016). How the machine 'thinks': understanding opacity in machine learning algorithms. Big Data and Society, 3(1), 1-12. https://doi.org/10.1177/2053951715622512
  • Coglianese, C. and Lehr, D. (2019). Transparency and algorithmic governance. Administrative Law Review, 71(1), 1-56.
  • Diakopoulos, N. (2015). Algorithmic accountability: journalistic investigation of computational power structures. Digital Journalism, 3(3), 398-415. https://doi.org/10.1080/21670811.2014.976411
  • Engstrom, D. F., Ho, D. E., Sharkey, C. M. and Cuéllar, M. F. (2020). Government by algorithm: artificial intelligence in federal administrative agencies. NYU School of Law, Public Law Research Paper (pp. 20-54).
  • Etzioni, A. and Etzioni, O. (2016). Keeping AI legal. Vanderbilt Journal of Entertainment and Technology Law, 19(1), 133-146.
  • Eubanks, V. (2018). Automating inequality: how high-tech tools profile, police, and punish the poor. New York: St. Martin's Press.
  • Foucault, M. (2000). Hapishanenin doğuşu (2nd ed.) (M. A. Kılıçbay, Trans.). İstanbul: İmge Yayınları.
  • Gates, K. A. (2011). Our biometric future: Facial recognition technology and the culture of surveillance. New York: New York University Press.
  • International Justice and Public Safety Network (IJIS) and International Association of Chiefs of Police (IACP). (2019). Facial recognition use case catalogue (LEITTF White Paper). https://www.theiacp.org/sites/default/files/2019-10/IJIS_IACP%20WP_LEITTF_Facial%20Recognition%20UseCasesRpt_20190322.pdf (15.03.2025).
  • Jain, A. K., Klare, B. and Park, U. (2011). Face recognition: Some challenges in forensics. In 2011 IEEE International Conference on Automatic Face and Gesture Recognition (pp. 726-733).
  • Joh, E. E. (2017). The undue influence of surveillance technology companies on policing. New York University Law Review, 92, 101-130.
  • Jones, C. (2021). Law enforcement use of facial recognition: bias, disparate impacts on people of color, and the need for federal legislation. North Carolina Journal of Law and Technology, 22(4), 777-816.
  • Kortli, Y., Jridi, M., Al Falou, A. and Atri, M. (2020). Face recognition systems: A survey. Sensors, 20(2), 1-36. https://doi.org/10.3390/s20020342
  • LeCun, Y., Bengio, Y. and Hinton, G. (2015). Deep learning. Nature, 521(7553), 436-444.
  • Lynch, J. (2018). Face off: Law enforcement use of face recognition technology. Electronic Frontier Foundation. https://www.eff.org/files/2020/04/20/face-offreport-2020_1.pdf (30.03.2025).
  • Macnish, K. (2012). Unblinking eyes: the ethics of automating surveillance. Ethics and Information Technology, 14(2), 151-167. https://doi.org/10.1007/s10676-012-9291-0
  • Mann, M. and Smith, M. (2017). Automated facial recognition technology: Recent developments and approaches to oversight. University of New South Wales Law Journal, 40(1), 121-145.
  • Mittelstadt, B. D., Allo, P., Taddeo, M., Wachter, S. and Floridi, L. (2016). The ethics of algorithms: mapping the debate. Big Data and Society, 3(2), 1-21. https://doi.org/10.1177/2053951716679679
  • O'Neill, J. (2019). How facial recognition makes you safer. https://ebac2018.sites.pomona.edu/classes/cs181s/2020fa/lectures/14-privacy/NYT-ONeill.pdf (23.07.2025).
  • Oxley, G., Uwazuruike, A., Lalic, M., Samuel, H. and Downs, W. (2024). Police use of live facial recognition technology. https://researchbriefings.files.parliament.uk/documents/CDP-2024-0144/CDP-2024-0144.pdf (23.07.2025).
  • Pasquale, F. (2015). The black box society: the secret algorithms that control money and information. Cambridge, Massachusetts: Harvard University Press.
  • Pastaltzidis, I., Dimitriou, N., Quezada-Tavarez, K., Aidinlis, S., Marquenie, T., Gurzawska, A. and Tzovaras, D. (2022). Data augmentation for fairness-aware machine learning: Preventing algorithmic bias in law enforcement systems. In Proceedings of the 2022 ACM Conference on Fairness, Accountability, and Transparency (pp. 2302-2314).
  • Peterson, S., Jackson, B. A., Woods, D. and Treyger, E. (2023). Finding a broadly practical approach for regulating the use of facial recognition by law enforcement. https://www.rand.org/content/dam/rand/pubs/research_reports/RRA2200/RRA2249-1/RAND_RRA2249-1.pdf (15.03.2025).
  • Richards, N. M. and King, J. H. (2014). Big data ethics. Wake Forest Law Review, 49(2), 393-432.
  • Ruhrmann, H. (2019). Facing the future: protecting human rights in policy strategies for facial recognition technology in law enforcement. https://discovery.ucl.ac.uk/id/eprint/10148887/1/Facing-the-Future_Ruhrmann_CITRIS-Policy-Lab.pdf (25.03.2025).
  • Smith, M. and Miller, S. (2022). The ethical application of biometric facial recognition technology. AI and Society, 37(1), 167-175. https://doi.org/10.1007/s00146-021-01199-9
  • Von Eschenbach, W. J. (2021). Transparency and the black box problem: why we do not trust AI. Philosophy and Technology, 34(4), 1607-1622. https://doi.org/10.1007/s13347-021-00477-0
  • Wiggers, K. (2020). Researchers show that computer vision algorithms pretrained on ImageNet exhibit multiple, distressing biases. https://venturebeat.com/business/researchers-show-that-computer-vision-algorithms-pretrained-on-imagenet-exhibit-multiple-distressing-biases/ (25.03.2025).
  • Zhao, W., Chellappa, R., Phillips, P. J. and Rosenfeld, A. (2003). Face recognition: a literature survey. ACM Computing Surveys, 35(4), 399-458.

There are 34 citations in total.

Details

Primary Language English
Subjects Public Administration, Policy and Administration (Other)
Journal Section Review
Authors

Müslüm Kayacı 0000-0002-6055-2734

Publication Date October 27, 2025
Submission Date April 25, 2025
Acceptance Date October 9, 2025
Published in Issue Year 2025 Issue: Kolluk Uygulamaları ve Güvenlik Teknolojileri Özel Sayısı

Cite

APA Kayacı, M. (2025). THE ETHICAL DIMENSIONS OF LAW ENFORCEMENT’S USE OF FACIAL RECOGNITION TECHNOLOGY. Güvenlik Bilimleri Dergisi(Kolluk Uygulamaları ve Güvenlik Teknolojileri Özel Sayısı), 184-204. https://doi.org/10.28956/gbd.1683533



This journal is licensed under the Creative Commons Attribution-NonCommercial 4.0 International License.