Research Article

Year 2025, Issue: Sosyal Bilimlerde Yapay Zeka: Kuram, Uygulama ve Gelecek Perspektifleri, 23-44, 07.12.2025
https://izlik.org/JA35RZ32ZB


References

  • Adam, A. (1995). Artificial intelligence and women’s knowledge. Women’s Studies International Forum, 18(4), 407-415.
  • Adam, A. (1998). Artificial Knowing: Gender and the Thinking Machine. Taylor & Francis Group. http://ebookcentral.proquest.com/lib/mcgill/detail.action?docID=165123
  • Adam, A., & Furnival, C. (1995). Designing intersections-designing subjectivity: Feminist theory and praxis in a sex discrimination legislation system. Information & Communications Technology Law, 4(2), 161-173.
  • Alemany, L. A., Benotti, L., González, J., Busaniche, B., & Halvorsen, A. (2024). Tool to overcome technical barriers for bias assessment in human language technologies. https://drive.google.com/file/d/1Ppdf6K8p4rYv5U-MLUql4o7fwTpoKheW/view
  • Algorithmic Bias Persists (2020, August). https://www.media.mit.edu/projects/gender-shades/overview/
  • Altunışık, R., Coşkun, R., Bayraktaroğlu, S., & Yıldırım, E. (2010). Sosyal Bilimlerde Araştırma Yöntemleri SPSS Uygulamalı (6. Baskı). Sakarya Yayıncılık.
  • Atanasoski, N. & Vora, K. (2019). Surrogate Humanity: Race, Robots, and the Politics of Technological Futures. Duke University Press.
  • Avila, R. (2021). Deploying Feminist AI. Alliance+. https://feministai.pubpub.org/pub/deploying-feminist-ai/release/1
  • Bardzell, S. (2010, April 10-15). Feminist HCI: Taking stock and outlining an agenda for design. In CHI’10: Proceedings of the SIGCHI conference on human factors in computing systems (pp. 1301-1310). Association for Computing Machinery, Atlanta Georgia USA. https://doi.org/10.1145/1753326.1753521
  • Bauchspies, W. K., & Bellacasa, M. P. D. L. (2009). Feminist Science and Technology Studies: A Patchwork of Moving Subjectivities. An Interview with Geoffrey Bowker, Sandra Harding, Anne Marie Mol, Susan Leigh Star and Banu Subramaniam. Subjectivity, 28, 334-344.
  • Baudrillard, J. (1997). Tüketim toplumu. (H. Deliçaylı, F. Keskin, Çev.). Ayrıntı Yayınları.
  • Bender, E. M., Gebru, T., McMillan-Major, A., & Shmitchell, S. (2021, March). On the dangers of stochastic parrots: Can language models be too big?🦜. In Proceedings of the 2021 ACM conference on fairness, accountability, and transparency (pp. 610-623). https://doi.org/10.1145/3442188.3445922
  • Benjamin, R. (Ed.). (2019). Captivating Technology: Race, Carceral Technoscience, and Liberatory Imagination in Everyday Life. Duke University Press.
  • Bonilla, Y., & Rosa, J. (2015). #Ferguson: Digital protest, hashtag ethnography, and the racial politics of social media in the United States. American Ethnologist, 42(1), 4-17.
  • Brimacombe, T., Kant, R., Finau, G., Tarai, J., & Titifanue, J. (2018). A new frontier in digital activism: An exploration of digital feminism in Fiji. Asia & the Pacific Policy Studies, 5(3), 508-521.
  • Buolamwini, J., & Gebru, T. (2018). Gender shades: Intersectional accuracy disparities in commercial gender classification. Proceedings of the 1st Conference on Fairness, Accountability and Transparency, 77-91.
  • Butler, J. (1988). Performative Acts and Gender Constitution: An Essay in Phenomenology and Feminist Theory. Theatre Journal, 40, 519-531.
  • Butler, J. (1994). Gender as Performance: An Interview with Judith Butler. Radical Philosophy: A Journal of Socialist and Feminist Philosophy, 67, 32-39.
  • Butler, J. (2009). Performativity, Precarity and Sexual Politics. AIBR: Revista de Antropología Iberoamericana 4: i–xiii.
  • Castells, M. (2012). Networks of outrage and hope: Social movements in the internet age. Polity Press.
  • Cifor, M., Garcia, P., Cowan, T. L., Rault, J., Sutherland, T., Chan, A. S., Rode, J., Hoffmann, A. L., Salehi, N., & Nakamura, L. (2019). Feminist Data Manifest-No. https://www.manifestno.com/
  • Design Justice Network. Design Justice Network Principles. https://designjustice.org/read-the-principles
  • ChatGPT. 2025a. Bir Kadın Nasıl Görünür? [Kullanıcı Sorusuna Cevap]. Çevrimiçi olarak erişilebilir: https://chat.openai.com/chat (1 Mart 2025’te edinildi).
  • ChatGPT. 2025b. Bir Erkek Nasıl Görünür? [Kullanıcı Sorusuna Cevap]. Çevrimiçi olarak erişilebilir: https://chat.openai.com/chat (1 Mart 2025’te edinildi).
  • ChatGPT. 2025c. Bir Kadının Rolleri Nasıl Görünür? [Kullanıcı Sorusuna Cevap]. Çevrimiçi olarak erişilebilir: https://chat.openai.com/chat (1 Mart 2025’te edinildi).
  • ChatGPT. 2025d. Bir Erkeğin Rolleri Nasıl Görünür? [Kullanıcı Sorusuna Cevap]. Çevrimiçi olarak erişilebilir: https://chat.openai.com/chat (1 Mart 2025’te edinildi).
  • Chmiliar, L. (2010). Multiple-case designs. In A. J. Mills, G. Durepos & E. Wiebe (Eds.), Encyclopedia of case study research (pp. 582-583). SAGE Publications.
  • Colenutt, M. (2021). One Is Not Born, But Becomes a Woman. The Oxford Research Centre in the Humanities. March 22. https://www.torch.ox.ac.uk/article/one-is-not-born-but-becomes-a-woman
  • Connell, R. W. (1987). Gender and power: Society, the person and sexual politics. Polity.
  • Crawford, K. (2021). Atlas of AI: Power, Politics, and the Planetary Costs of Artificial Intelligence. Yale University Press.
  • Çelik, H., & Ekşi, H. (2008). Söylem Analizi. Marmara Üniversitesi Atatürk Eğitim Fakültesi Eğitim Bilimleri Dergisi, 27(27). 99-117.
  • Davis, J. L., Williams, A., & Yang, M. W. (2021). Algorithmic reparation. Big Data & Society, 8(2). https://doi.org/10.1177/2053951721104480
  • Elliott, K. (2016). Caring masculinities: Theorizing an emerging concept. Men and Masculinities, 19(3), 240-259.
  • Eroğul, N. (2024). Ataerkil şartlanma ve şartlanmanın kadın ve erkek davranışlarına yansımaları. 17. Uluslararası Sosyal, Beşerî ve Eğitim Bilimler Kongresi, 21-22 Aralık 2024, 314-322.
  • Feldfeber, I., Quiroga, Y. B., Guevara, C., & Ciolfi Felice, M. (2022). Feminisms in Artificial Intelligence: Automation tools towards a feminist judiciary reform in Argentina and Mexico. https://drive.google.com/file/d/1VKeeSdJz6-8DoEbetQ8f1wTH1ORqvIXV/view
  • Feminist Data Manifest-No (2019). https://irwg.umich.edu/news/feminist-data-manifest-no
  • Ferrara, E. (2023). Should chatgpt be biased? challenges and risks of bias in large language models. arXiv preprint arXiv:2304.03738.
  • Gal, Y. (2020). Algorithmic bias: How AI mirrors and exacerbates societal biases. AI & Society, 1-7. https://doi.org/10.1007/s00146-020-01105-0
  • Gedik, E., Çakır, H., & Coşkun, A. (2020). Bir inşaa süreci olarak erkeklik: Yozgat örneği. Artvin Çoruh Üniversitesi Uluslararası Sosyal Bilimler Dergisi, 6(1), 84-95.
  • Gerbaudo, P. (2012). Tweets and the streets: Social media and contemporary activism. Pluto Press.
  • Ghosh, S., & Caliskan, A. (2023, August). Chatgpt perpetuates gender bias in machine translation and ignores non-gendered pronouns: Findings across bengali and five other low-resource languages. In Proceedings of the 2023 AAAI/ACM Conference on AI, Ethics, and Society (pp. 901-912).
  • Goyal, P., Soriano, A. R., Hazirbas, C., Sagun, L., & Usunier, N. (2022, June). Fairness indicators for systematic assessments of visual feature extractors. In Proceedings of the 2022 ACM Conference on Fairness, Accountability, and Transparency (pp. 70-88).
  • Gross, N. (2023). What ChatGPT tells us about gender: a cautionary tale about performativity and gender biases in AI. Social Sciences, 12(8), 435. https://doi.org/10.3390/socsci12080435
  • Guerra, J. (2022). Towards a feminist framework for AI development: From principles to practice. https://www.derechosdigitales.org/wp-content/uploads/Fair_Doc_Eng.pdf
  • Güngör, B. (2020). Söylem Yaklaşımı Üzerine Bir Kavram Çalışması ve Eleştirel Söylem Analizi. Kritik İletişim Çalışmaları Dergisi, 2(2), 1-11.
  • Güven, Z. C. (2016). Eril Tarih Söyleminde Bir Osmanlı Kadın Kamusal Alanı: Hanımlara Mahsus Gazete. İletişim Kuram ve Araştırma Dergisi, 2016(43), 1-14.
  • Hanna, A. D., Rollins, N., & Loveys, K. (2020). Recognizing the role of social context in AI with ‘critical AI for good’. arXiv preprint arXiv:2011.05196.
  • Haraway, D. (1988), Situated knowledges: the science question in feminism and the privilege of partial perspective. Feminist Studies, 14(3), 575-599.
  • IDRC (2021). Advancing research on feminist artificial intelligence to advance gender equality and inclusion. IDRC. https://www.idrc.ca/en/project/advancing-research-feminist-artificial-intelligence-advance-gender-equality-and-inclusion
  • Iyer, N. (t.y.). What does the future of Feminist AI look like? Goethe Institute. https://www.goethe.de/prj/k40/en/eth/fem.html
  • Jouët, J. (2018). Digital feminism: Questioning the renewal of activism. Journal of Research in Gender Studies, 8(1), 133-157.
  • Kabeer, N. (2012). The power of association: Reflecting on women's collective action as a force for social change. Thinkpiece #2. UK Feminista.
  • Kaplan, H. (2023). Kadın aşıklar. Nobel.
  • Kaplan, H. (2025). Toplumsal cinsiyet bağlamında kadın aşıklar ve sosyal rolleri. İçinde A.E. Gündoğdu & B. Karakaş (Ed.), Sosyal bilimler çerçevesinde kadın (ss. 193-221). NEU Press.
  • Meinders, C. (2017). From Feminist AI projects to artificial knowing. Media Design Practices Lab, 30 November. https://mediadesignpractices.net/lab/2017/11/collaborative-ai-research/
  • Meinders, C., Thompson, J., Ciston, S., & Griffiths, C. (2020). Week 3: Feminist AI (Main Thread). Critical Code Studies. http://wg20.criticalcodestudies.com/index.php?p=/discussion/87/week-3-feminist-ai-main-thread
  • Noble, S. U. (2018). Algorithms of Oppression: How Search Engines Reinforce Racism. New York University Press.
  • Özdemir, H. (2019). Toplumsal cinsiyet perspektifinde erkeklik ve kadınlık algısı: Bir alan araştirmasi. Asya Studies, 4(10), 90-107.
  • Panch, T., Mattie, H., & Atun, R. (2019). Artificial intelligence and algorithmic bias: implications for health systems. Journal of global health, 9(2), 020318.
  • Polo, E. P., & Ailodion, D. O. (2025). Tackling Racial Bias in AI Systems: Applying the Bioethical Principle of Justice and Insights from Joy Buolamwini’s “Coded Bias” and the “Algorithmic Justice League”. Bangladesh Journal of Bioethics, 16(1), 8-14.
  • Posada, J. (2022). Embedded reproduction in platform data work. Information, Communication & Society, 25(6), 816-834.
  • Purvis, T. & Hunt, A. (1993). Discourse, ideology, discourse, ideology, discourse, ideology... British Journal of Sociology, 473-499.
  • Ray, P. P. (2023). ChatGPT: A comprehensive review on background, applications, key challenges, bias, ethics, limitations and future scope. Internet of Things and Cyber-Physical Systems, 3, 121-154.
  • Rich, A. (2021). Of woman born: Motherhood as experience and institution. WW Norton & Company.
  • Richardson, B., & Gilbert, J. E. (2021). A framework for fairness: A systematic review of existing fair AI solutions. arXiv preprint arXiv:2112.05700. https://doi.org/10.48550/arXiv.2112.05700
  • Roberts, S. (2021). Your AI is a human. In T. S. Mullaney, B. Peters and M. Hicks (Ed.), Your Computer Is on Fire (pp. 51-70). MIT Press.
  • Rozado, D. (2023). The Political Biases of ChatGPT. Social Sciences, 12(3), 148. https://doi.org/10.3390/socsci12030148
  • Roozafzai, Z. S. (2024). Unveiling Power and Ideologies in the Age of Algorithms: Exploring the Intersection of Critical Discourse Analysis and Artificial Intelligence. Qeios.
  • Sakib, M. S. I. (2023). What is ChatGPT. ResearchGate.
  • Sallan Gül, S. & Kahya Nizam, Ö. (2021). Sosyal Bilimlerde İçerik ve Söylem Analizi. Pamukkale Üniversitesi Sosyal Bilimler Enstitüsü Dergisi, (42), 181-198. https://doi.org/10.30794/pausbed.803182
  • Sarrion, E. (2023). What Is ChatGPT?. In Exploring the power of ChatGPT: Applications, techniques, and implications (pp. 3-8). Apress.
  • Sinders, C. (2020). Feminist data set. https://carolinesinders.com/wp-content/uploads/2020/05/Feminist-Data-Set-Final-Draft-2020-0526.pdf
  • Titifanue, J., Tarai, J., Kant, R., & Finau, G. (2016). From social networking to activism: The role of social media in the free West Papua campaign. Pacific Studies, 39(3), 255–281.
  • Toupin, S. (2024). Shaping feminist artificial intelligence. New Media & Society, 26(1), 580-595. https://doi.org/10.1177/14614448221150776
  • Van Dijk, T. A. (2001). Multidisciplinary CDA: A plea for diversity. Methods of Critical Discourse Analysis, 1(1), 95-120.
  • Varshney, K. (2018). Introducing AI Fairness 360. IBM Research Blog. https://research.ibm.com/blog/ai-fairness-360
  • Webb, C. (2019). What is a Feminist AI? Possible Feminisms, Possible Internets. Disruption Network Lab, Berlin. https://www.youtube.com/watch?v=bBQOyvNhWJY
  • Weiss, G., & Wodak, R. (2003). Introduction: Theory, interdisciplinarity and critical discourse analysis. In Critical discourse analysis: Theory and interdisciplinarity (pp. 1-32). Palgrave Macmillan UK.
  • Wellner, G. & Rothman, T. (2020). Feminist AI: can we expect our AI systems to become feminist?, Philosophy & Technology, 33(2), 191-205.
  • West, S. M. (2020). Redistribution and rekognition: A feminist critique of algorithm Fairness. CATALYST: Feminism, Theory, Technoscience, 6(2), 124.
  • Whittaker, M. (2021). The steep cost of capture. Interactions, 28(6), 50-55.
  • Wodak, R. (2001). What CDA is about – a summary of its history, important concepts and its developments. In R. Wodak & M. Meyer (Eds.), Methods of Critical Discourse Analysis (pp. 1-13). Sage.
  • Wu, C.-L., Waisman, H. W., Ran, F., Wang, Y., & Zucker, D. J. (2020). The future of artificial intelligence in healthcare: An interdisciplinary roadmap. Current Opinion in Systems Biology, 23, 31-37. https://doi.org/10.1016/j.coisb.2020.06.005
  • Xu, C., Greer, C., Joshi, M. N., & Doshi, T. (2020). Fairness indicators demo: Scalable infrastructure for fair ML systems. Google Research.
  • Xu, C. & Doshi, T. (2019). Fairness Indicators: Scalable Infrastructure for Fair ML Systems. https://research.google/blog/fairness-indicators-scalable-infrastructure-for-fair-ml-systems/
  • Yalçın, H. (2022). Bir araştırma deseni olarak fenomenoloji. Anadolu Üniversitesi Sosyal Bilimler Dergisi, 22(Özel Sayı 2), 213-232.
  • Yıldırım, A. & Şimşek, H. (2008). Sosyal bilimlerde nitel araştırma yöntemleri (6. Baskı). Seçkin Yayıncılık.
  • Young, J. (2019). Why we need to design feminist AI. https://www.youtube.com/watch?v=E-O3LaSEcVw
  • Zhang, Y., & Zhou, L. (2020). Fairness and bias in machine learning: A critical discourse analysis. Knowledge-Based Systems, 195, 105603. https://doi.org/10.1016/j.knosys.2020.105603

A discourse analysis of ChatGPT’s gender perception


Abstract

As a trained natural language processing model, ChatGPT produces content that reflects gender inequality, having been trained under the control of male-dominated history, ideology, and discourse. Starting from this premise, the study examines ChatGPT’s perception of gender through the lens of gender bias. A qualitative method was adopted. In the data analysis, the three stages of discourse analysis were followed: interpretation, construction, and macro-construction. In the interpretation stage, gender representations were examined; in the construction stage, the norms produced by the discourses; and in the macro-construction stage, their relations to dominant ideologies and the structures that reinforce inequality. According to the findings, ChatGPT associates femininity with calmness, compassion, and domestic roles, and masculinity with seriousness, decision-making, and authority. Motherhood is idealized through compassion and self-sacrifice, whereas fatherhood is defined by trust and guidance. The predominance of the concept of “trust” in the discourse on masculinity shows that gender roles are reproduced dichotomously. At the same time, the inclusion of fathers’ caregiving aspects opens limited space for alternative representations of masculinity. The study concludes that ChatGPT draws on a data pool shaped by prevailing social norms and reproduces gendered stereotypes; it is therefore recommended that the system be restructured from a feminist perspective.
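The interpretation (coding) stage described above can be sketched computationally as a keyword tally over a model response, using the attribute clusters reported in the findings. This is a minimal, hypothetical illustration only: the lexicon, category names, and sample text below are assumptions for demonstration, not the study's actual coding scheme.

```python
from collections import Counter

# Hypothetical lexicon built from the attribute clusters reported in the
# findings (femininity: calmness/compassion/domestic roles; masculinity:
# seriousness/decision-making/authority). Illustrative only.
LEXICON = {
    "femininity": {"calm", "compassion", "caring", "domestic", "nurturing"},
    "masculinity": {"serious", "decision", "authority", "trust", "guidance"},
}

def code_response(text: str) -> Counter:
    """Tally how often each gendered attribute cluster appears in a response."""
    tokens = text.lower().split()
    counts = Counter()
    for category, keywords in LEXICON.items():
        counts[category] = sum(1 for tok in tokens if tok.strip(".,") in keywords)
    return counts

# Coding a stereotyped description similar to those the study analyzes:
sample = "She is calm and caring, devoted to domestic life."
print(code_response(sample))  # the femininity cluster dominates here
```

A real analysis would of course rest on qualitative reading rather than keyword counts; the sketch only shows how representations could be tallied before the construction and macro-construction stages relate them to norms and ideologies.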

Ethics Statement

This research does not require ethics committee approval.

Supporting Institution

There is no supporting institution.




References

  • Adam, A. (1995). Artificial intelligence and women’s knowledge. Women’s Studies International Forum, 18(4), 407-415.
  • Adam, A. (1998). Artificial Knowing: Gender and the Thinking Machine. Taylor & Francis Group. http://ebookcentral.proquest.com/lib/mcgill/detail.action?docID=165123
  • Adam, A., & Furnival, C. (1995). Designing intersections-designing subjectivity: Feminist theory and praxis in a sex discrimination legislation system. Information & Communications Technology Law, 4(2), 161-173.
  • Alemany, L. A., Benotti, L., González, J., Busaniche, B., & Halvorsen, A. (2024). Tool to overcome technical barriers for bias assessment in human language technologies. https://drive.google.com/file/d/1Ppdf6K8p4rYv5U-MLUql4o7fwTpoKheW/view
  • Algorithmic Bias Persists. (August, 2020). https://www.media.mit.edu/projects/gender-shades/overview/
  • Altunışık, R., Coşkun, R., Bayraktaroğlu, S., & Yıldırım, E. (2010). Sosyal Bilimlerde Araştırma Yöntemleri SPSS Uygulamalı (6. Baskı). Sakarya Yayıncılık.
  • Atanasoski, N. & Vora, K. (2019). Surrogate Humanity: Race, Robots, and the Politics of Technological Futures. Duke University Press.
  • Avila, R. (2021). Deploying Feminist AI. Alliance+. https://feministai.pubpub.org/pub/deploying-feminist-ai/release/1
  • Bardzell, S. (2010, April 10-15). Feminist HCI: Taking stock and outlining an agenda for design. In CHI’10: Proceedings of the SIGCHI conference on human factors in computing systems (pp. 1301-1310). Association for Computing Machinery, Atlanta Georgia USA. https://doi.org/10.1145/1753326.1753521
  • Bauchspies, W. K., & Bellacasa, M. P. D. L. (2009). Feminist Science and Technology Studies: A Patchwork of Moving Subjectivities. An Interview with Geoffrey Bowker, Sandra Harding, Anne Marie Mol, Susan Leigh Star and Banu Subramaniam. Subjectivity, 28, 334-344.
  • Baudrillard, J. (1997). Tüketim toplumu. (H. Deliçaylı, F. Keskin, Çev.). Ayrıntı Yayınları.
  • Bender, E. M., Gebru, T., McMillan-Major, A., & Shmitchell, S. (2021, March). On the dangers of stochastic parrots: Can language models be too big?🦜. In Proceedings of the 2021 ACM conference on fairness, accountability, and transparency (pp. 610-623). https://doi.org/10.1145/3442188.3445922
  • Benjamin, R. (Ed.) (2019). Captivating Technology: Race, Carceral Technoscience, and Liberatory Imagination in Everyday Life. Duke University Press.
  • Bonilla, Y., & Rosa, J. (2015). #Ferguson: Digital protest, hashtag ethnography, and the racial politics of social media in the United States. American Ethnologist, 42(1), 4-17.
  • Brimacombe, T., Kant, R., Finau, G., Tarai, J., & Titifanue, J. (2018). A new frontier in digital activism: An exploration of digital feminism in Fiji. Asia & the Pacific Policy Studies, 5(3), 508-521.
  • Buolamwini, J., & Gebru, T. (2018). Gender shades: Intersectional accuracy disparities in commercial gender classification. Proceedings of the 1st Conference on Fairness, Accountability and Transparency, 77-91.
  • Butler, J. (1988). Performative Acts and Gender Constitution: An Essay in Phenomenology and Feminist Theory. Theatre Journal, 40, 519-531.
  • Butler, J. (1994). Gender as Performance: An Interview with Judith Butler. Radical Philosophy: A Journal of Socialist and Feminist Philosophy, 67, 32-39.
  • Butler, J. (2009). Performativity, Precarity and Sexual Politics. AIBR: Revista de Antropología Iberoamericana 4: i–xiii.
  • Castells, M. (2012). Networks of outrage and hope: Social movements in the internet age. Polity Press.
  • Cifor, M., Garcia, P., Cowan, T. L., Rault, J., Sutherland, T., Chan, A. S., Rode, J., Hoffmann, A. L., Salehi, N., & Nakamura, L. (2019). Feminist Data Manifest-No. https://www.manifestno.com/
  • Design Justice Network. Design Justice Network Principles. https://designjustice.org/read-the-principles
  • ChatGPT. 2025a. Bir Kadın Nasıl Görünür? [Kullanıcı Sorusuna Cevap]. Çevrimiçi olarak erişilebilir: https://chat.openai.com/chat (1 Mart 2025’te edinildi).
  • ChatGPT. 2025b. Bir Erkek Nasıl Görünür? [Kullanıcı Sorusuna Cevap]. Çevrimiçi olarak erişilebilir: https://chat.openai.com/chat (1 Mart 2025’te edinildi).
  • ChatGPT. 2025c. Bir Kadının Rolleri Nasıl Görünür? [Kullanıcı Sorusuna Cevap]. Çevrimiçi olarak erişilebilir: https://chat.openai.com/chat (1 Mart 2025’te edinildi).
  • ChatGPT. 2025d. Bir Erkeğin Rolleri Nasıl Görünür? [Kullanıcı Sorusuna Cevap]. Çevrimiçi olarak erişilebilir: https://chat.openai.com/chat (1 Mart 2025’te edinildi).
  • Chmiliar, L. (2010). Multiple-case designs. In A. J. Mills, G. Durepos & E. Wiebe (Eds.), Encyclopedia of case study research (pp. 582-583). SAGE Publications.
  • Colenutt, M. (2021). One Is Not Born, But Becomes a Woman. The Oxford Research Centre in the Humanities. March 22. https://www.torch.ox.ac.uk/article/one-is-not-born-but-becomes-a-woman
  • Connell, R. W. (1987). Gender and power: Society, the person and sexual politics. Polity.
  • Crawford, K. (2021). Atlas of AI: Power, Politics, and the Planetary Costs of Artificial Intelligence. Yale University Press.
  • Çelik, H., & Ekşi, H. (2008). Söylem Analizi. Marmara Üniversitesi Atatürk Eğitim Fakültesi Eğitim Bilimleri Dergisi, 27(27). 99-117.
  • Davis, J. L., Williams, A., & Yang, M. W. (2021). Algorithmic reparation. Big Data & Society, 8(2). https://doi.org/10.1177/2053951721104480
  • Elliott, K. (2016). Caring masculinities: Theorizing an emerging concept. Men and Masculinities, 19(3), 240-259.
  • Eroğul, N. (2024). Ataerkil şartlanma ve şartlanmanın kadın ve erkek davranışlarına yansımaları. 17. Uluslararası Sosyal, Beşerî ve Eğitim Bilimler Kongresi, 21-22 Aralık 2024, 314-322.
  • Feldfeber, I., Quiroga, Y. B., Guevara, C., & Ciolfi Felice, M. (2022). Feminisms in Artificial Intelligence: Automation tools towards a feminist judiciary reform in Argentina and Mexico. https://drive.google.com/file/d/1VKeeSdJz6-8DoEbetQ8f1wTH1ORqvIXV/view
  • Feminist Data Manifest-No (2019). https://irwg.umich.edu/news/feminist-data-manifest-no
  • Ferrara, E. (2023). Should ChatGPT be biased? Challenges and risks of bias in large language models. arXiv preprint arXiv:2304.03738.
  • Gal, Y. (2020). Algorithmic bias: How AI mirrors and exacerbates societal biases. AI & Society, 1-7. https://doi.org/10.1007/s00146-020-01105-0
  • Gedik, E., Çakır, H., & Coşkun, A. (2020). Bir inşaa süreci olarak erkeklik: Yozgat örneği. Artvin Çoruh Üniversitesi Uluslararası Sosyal Bilimler Dergisi, 6(1), 84-95.
  • Gerbaudo, P. (2012). Tweets and the streets: Social media and contemporary activism. Pluto Press.
  • Ghosh, S., & Caliskan, A. (2023, August). ChatGPT perpetuates gender bias in machine translation and ignores non-gendered pronouns: Findings across Bengali and five other low-resource languages. In Proceedings of the 2023 AAAI/ACM Conference on AI, Ethics, and Society (pp. 901-912).
  • Goyal, P., Soriano, A. R., Hazirbas, C., Sagun, L., & Usunier, N. (2022, June). Fairness indicators for systematic assessments of visual feature extractors. In Proceedings of the 2022 ACM Conference on Fairness, Accountability, and Transparency (pp. 70-88).
  • Gross, N. (2023). What ChatGPT tells us about gender: a cautionary tale about performativity and gender biases in AI. Social Sciences, 12(8), 435. https://doi.org/10.3390/socsci12080435
  • Guerra, J. (2022). Towards a feminist framework for AI development: From principles to practice. https://www.derechosdigitales.org/wp-content/uploads/Fair_Doc_Eng.pdf
  • Güngör, B. (2020). Söylem Yaklaşımı Üzerine Bir Kavram Çalışması ve Eleştirel Söylem Analizi. Kritik İletişim Çalışmaları Dergisi, 2(2), 1-11.
  • Güven, Z. C. (2016). Eril Tarih Söyleminde Bir Osmanlı Kadın Kamusal Alanı: Hanımlara Mahsus Gazete. İletişim Kuram ve Araştırma Dergisi, 2016(43), 1-14.
  • Hanna, A. D., Rollins, N., & Loveys, K. (2020). Recognizing the role of social context in AI with ‘critical AI for good’. arXiv preprint arXiv:2011.05196.
  • Haraway, D. (1988). Situated knowledges: The science question in feminism and the privilege of partial perspective. Feminist Studies, 14(3), 575-599.
  • IDRC (2021). Advancing research on feminist artificial intelligence to advance gender equality and inclusion. IDRC. https://www.idrc.ca/en/project/advancing-research-feminist-artificial-intelligence-advance-gender-equality-and-inclusion
  • Iyer, N. (n.d.). What does the future of Feminist AI look like? Goethe Institute. https://www.goethe.de/prj/k40/en/eth/fem.html
  • Jouët, J. (2018). Digital feminism: Questioning the renewal of activism. Journal of Research in Gender Studies, 8(1), 133-157.
  • Kabeer, N. (2012). The power of association: Reflecting on women's collective action as a force for social change. Thinkpiece #2. UK Feminista.
  • Kaplan, H. (2023). Kadın aşıklar. Nobel.
  • Kaplan, H. (2025). Toplumsal cinsiyet bağlamında kadın aşıklar ve sosyal rolleri. İçinde A.E. Gündoğdu & B. Karakaş (Ed.), Sosyal bilimler çerçevesinde kadın (ss. 193-221). NEU Press.
  • Meinders, C. (2017). From Feminist AI projects to artificial knowing. Media Design Practices Lab, 30 November. https://mediadesignpractices.net/lab/2017/11/collaborative-ai-research/
  • Meinders, C., Thompson, J., Ciston, S., & Griffiths, C. (2020). Week 3: Feminist AI (Main Thread). Critical Code Studies. http://wg20.criticalcodestudies.com/index.php?p=/discussion/87/week-3-feminist-ai-main-thread
  • Noble, S. U. (2018). Algorithms of Oppression: How Search Engines Reinforce Racism. New York University Press.
  • Özdemir, H. (2019). Toplumsal cinsiyet perspektifinde erkeklik ve kadınlık algısı: Bir alan araştırması. Asya Studies, 4(10), 90-107.
  • Panch, T., Mattie, H., & Atun, R. (2019). Artificial intelligence and algorithmic bias: implications for health systems. Journal of global health, 9(2), 020318.
  • Polo, E. P., & Ailodion, D. O. (2025). Tackling racial bias in AI systems: Applying the bioethical principle of justice and insights from Joy Buolamwini’s “Coded Bias” and the “Algorithmic Justice League”. Bangladesh Journal of Bioethics, 16(1), 8-14.
  • Posada, J. (2022). Embedded reproduction in platform data work. Information, Communication & Society, 25(6), 816-834.
  • Purvis, T., & Hunt, A. (1993). Discourse, ideology, discourse, ideology, discourse, ideology... British Journal of Sociology, 44(3), 473-499.
  • Ray, P. P. (2023). ChatGPT: A comprehensive review on background, applications, key challenges, bias, ethics, limitations and future scope. Internet of Things and Cyber-Physical Systems, 3, 121-154.
  • Rich, A. (2021). Of woman born: Motherhood as experience and institution. WW Norton & Company.
  • Richardson, B., & Gilbert, J. E. (2021). A framework for fairness: A systematic review of existing fair AI solutions. arXiv preprint arXiv:2112.05700. https://doi.org/10.48550/arXiv.2112.05700
  • Roberts, S. (2021). Your AI is a human. In T. S. Mullaney, B. Peters and M. Hicks (Ed.), Your Computer Is on Fire (pp. 51-70). MIT Press.
  • Rozado, D. (2023). The political biases of ChatGPT. Social Sciences, 12(3), 148. https://doi.org/10.3390/socsci12030148
  • Roozafzai, Z. S. (2024). Unveiling Power and Ideologies in the Age of Algorithms: Exploring the Intersection of Critical Discourse Analysis and Artificial Intelligence. Qeios.
  • Sakib, M. S. I. (2023). What is ChatGPT. ResearchGate.
  • Sallan Gül, S., & Kahya Nizam, Ö. (2021). Sosyal Bilimlerde İçerik ve Söylem Analizi. Pamukkale Üniversitesi Sosyal Bilimler Enstitüsü Dergisi, (42), 181-198. https://doi.org/10.30794/pausbed.803182
  • Sarrion, E. (2023). What is ChatGPT? In Exploring the power of ChatGPT: Applications, techniques, and implications (pp. 3-8). Apress.
  • Sinders, C. (2020). Feminist data set. https://carolinesinders.com/wp-content/uploads/2020/05/Feminist-Data-Set-Final-Draft-2020-0526.pdf
  • Titifanue, J., Tarai, J., Kant, R., & Finau, G. (2016). From social networking to activism: The role of social media in the free West Papua campaign. Pacific Studies, 39(3), 255–281.
  • Toupin, S. (2024). Shaping feminist artificial intelligence. New Media & Society, 26(1), 580-595. https://doi.org/10.1177/14614448221150776
  • Van Dijk, T. A. (2001). Multidisciplinary CDA: A plea for diversity. Methods of Critical Discourse Analysis, 1(1), 95-120.
  • Varshney, K. (2018). Introducing AI Fairness 360. IBM Research Blog. https://research.ibm.com/blog/ai-fairness-360
  • Webb, C. (2019). What is a Feminist AI? Possible Feminisms, Possible Internets. Disruption Network Lab, Berlin. https://www.youtube.com/watch?v=bBQOyvNhWJY
  • Weiss, G., & Wodak, R. (2003). Introduction: Theory, interdisciplinarity and critical discourse analysis. In Critical discourse analysis: Theory and interdisciplinarity (pp. 1-32). Palgrave Macmillan UK.
  • Wellner, G., & Rothman, T. (2020). Feminist AI: Can we expect our AI systems to become feminist? Philosophy & Technology, 33(2), 191-205.
  • West, S. M. (2020). Redistribution and rekognition: A feminist critique of algorithmic fairness. CATALYST: Feminism, Theory, Technoscience, 6(2), 124.
  • Whittaker, M. (2021). The steep cost of capture. Interactions, 28(6), 50-55.
  • Wodak, R. (2001). What CDA is about: A summary of its history, important concepts and its developments. In R. Wodak & M. Meyer (Eds.), Methods of Critical Discourse Analysis (pp. 1-13). Sage.
  • Wu, C.-L., Waisman, H. W., Ran, F., Wang, Y., & Zucker, D. J. (2020). The future of artificial intelligence in healthcare: An interdisciplinary roadmap. Current Opinion in Systems Biology, 23, 31-37. https://doi.org/10.1016/j.coisb.2020.06.005
  • Xu, C., Greer, C., Joshi, M. N., & Doshi, T. (2020). Fairness indicators demo: Scalable infrastructure for fair ML systems. Google Research.
  • Xu, C. & Doshi, T. (2019). Fairness Indicators: Scalable Infrastructure for Fair ML Systems. https://research.google/blog/fairness-indicators-scalable-infrastructure-for-fair-ml-systems/
  • Yalçın, H. (2022). Bir araştırma deseni olarak fenomenoloji. Anadolu Üniversitesi Sosyal Bilimler Dergisi, 22(Özel Sayı 2), 213-232.
  • Yıldırım, A. & Şimşek, H. (2008). Sosyal bilimlerde nitel araştırma yöntemleri (6. Baskı). Seçkin Yayıncılık.
  • Young, J. (2019). Why we need to design feminist AI. https://www.youtube.com/watch?v=E-O3LaSEcVw
  • Zhang, Y., & Zhou, L. (2020). Fairness and bias in machine learning: A critical discourse analysis. Knowledge-Based Systems, 195, 105603. https://doi.org/10.1016/j.knosys.2020.105603
There are 84 references in total.

Details

Primary Language: Turkish
Subjects: Gender and Politics, Sociology of Culture
Section: Research Article
Authors

Nebile Eroğul 0000-0002-8099-5559

Submission Date: July 14, 2025
Acceptance Date: October 22, 2025
Early View Date: December 4, 2025
Publication Date: December 7, 2025
DOI: https://doi.org/10.30783/nevsosbilen.1742358
IZ: https://izlik.org/JA35RZ32ZB
Published in Issue: Year 2025, Issue: Sosyal Bilimlerde Yapay Zeka: Kuram, Uygulama ve Gelecek Perspektifleri

How to Cite

APA Eroğul, N. (2025). ChatGPT’nin toplumsal cinsiyet algısına yönelik söylem analizi. Nevşehir Hacı Bektaş Veli Üniversitesi SBE Dergisi, Sosyal Bilimlerde Yapay Zeka: Kuram, Uygulama ve Gelecek Perspektifleri, 23-44. https://doi.org/10.30783/nevsosbilen.1742358