Research Article


Who am I talking to?: Memory of Chatbots and Personalized Chat Interaction

Year 2025, Issue: 50, 339 - 356, 30.12.2025
https://doi.org/10.31123/akil.1709303

Abstract

Developed by OpenAI in 2022, ChatGPT is a chatbot based on a large language model that has evolved beyond generating human-like dialogue to become a versatile tool producing a wide range of content, from official documents to literary texts. Research indicates that users often feel as though they are communicating with a human during their interactions with artificial intelligence. In 2024, OpenAI introduced the “conversational memory” feature to ChatGPT, enabling the system to store and process past interactions with users. This innovation allows the chatbot to refer to previous conversations and generate more personalized and contextually appropriate responses; at the same time, it has fueled ongoing debates about its implications for human experience and agency. This study aims to examine the impact of ChatGPT’s conversational memory feature on user experience. Employing a qualitative approach, semi-structured in-depth interviews were conducted with 30 ChatGPT users, divided into two groups: those who use the conversational memory feature and those who do not. The findings reveal that participants who used the memory feature described their interactions as more personalized, consistent, and emotionally engaging. In contrast, participants who refrained from using the feature reported more transactional and utilitarian experiences, often citing concerns related to data privacy and user autonomy. The study highlights that chatbots are no longer perceived merely as technical tools; rather, they are seen as digital agents capable of addressing users’ emotional needs. The study also underscores the need for further research on human–artificial intelligence interaction, digital connectedness, and trust in technological systems.
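At a technical level, a "conversational memory" of the kind the abstract describes can be pictured as a store of distilled facts about the user that is injected into each new prompt. The minimal Python sketch below illustrates that idea only; the class name, storage format, and prompt layout are assumptions made for demonstration and do not represent OpenAI's actual implementation.

```python
class MemoryChatbot:
    """Toy illustration of a chatbot 'memory' layer (not OpenAI's design)."""

    def __init__(self):
        self.memories = []  # distilled facts persisted across conversations

    def remember(self, fact: str) -> None:
        # Store each distinct fact once, in the order it was learned.
        if fact not in self.memories:
            self.memories.append(fact)

    def build_prompt(self, user_message: str) -> str:
        # Prepend remembered facts so a language model could personalize
        # its reply by referring back to earlier conversations.
        facts = "\n".join(f"- {m}" for m in self.memories)
        return (
            "Known facts about the user:\n"
            f"{facts}\n\n"
            f"User: {user_message}\nAssistant:"
        )


bot = MemoryChatbot()
bot.remember("prefers concise answers")
bot.remember("is researching human-chatbot interaction")
print(bot.build_prompt("Summarize today's notes."))
```

In this toy model, disabling the feature simply means skipping the `remember` calls, which mirrors the study's two participant groups: with memory, each prompt carries accumulated personal context; without it, every exchange starts from a blank slate.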

References

  • Abd-Alrazaq, A. A., Rababeh, A., Alajlani, M., Bewick, B. M., & Househ, M. (2020). Effectiveness and Safety of Using Chatbots to Improve Mental Health: Systematic Review and Meta-Analysis. Journal of Medical Internet Research, 22(7), e16021. https://doi.org/10.2196/16021
  • Adamopoulou, E., & Moussiades, L. (2020). Chatbots: History, technology, and applications. Machine Learning with Applications, 2, 100006. https://doi.org/10.1016/j.mlwa.2020.100006
  • Aggarwal, P., & McGill, A. L. (2007). Is That Car Smiling at Me? Schema Congruity as a Basis for Evaluating Anthropomorphized Products. Journal of Consumer Research, 34(4), 468–479. https://doi.org/10.1086/518544
  • Başkale, H. (2016). Nitel araştırmalarda geçerlik, güvenirlik ve örneklem büyüklüğü. Dokuz Eylül Üniversitesi Hemşirelik Fakültesi Elektronik Dergisi, 9(1), 23–28.
  • Belda-Medina, J., & Kokošková, V. (2023). Integrating chatbots in education: insights from the Chatbot-Human Interaction Satisfaction Model (CHISM). International Journal of Educational Technology in Higher Education, 20(1), 62. https://doi.org/10.1186/s41239-023-00432-3
  • Brandtzaeg, P. B., & Følstad, A. (2017, November). Why people use chatbots. In International Conference on Internet Science (pp. 377–392). Cham: Springer International Publishing.
  • Brin, D., Sorin, V., Vaid, A., Soroush, A., Glicksberg, B. S., Charney, A. W., Nadkarni, G., & Klang, E. (2023). Comparing ChatGPT and GPT-4 performance in USMLE soft skill assessments. Scientific Reports, 13(1), 16492. https://doi.org/10.1038/s41598-023-43436-9
  • Burgoon, J. K., Bonito, J. A., Bengtsson, B., Cederberg, C., Lundeberg, M., & Allspach, L. (2000). Interactivity in human–computer interaction: a study of credibility, understanding, and influence. Computers in Human Behavior, 16(6), 553–574. https://doi.org/10.1016/S0747-5632(00)00029-7
  • Chaves, A. P., & Gerosa, M. A. (2021). How Should My Chatbot Interact? A Survey on Social Characteristics in Human–Chatbot Interaction Design. International Journal of Human–Computer Interaction, 37(8), 729–758. https://doi.org/10.1080/10447318.2020.1841438
  • Ciechanowski, L., Przegalinska, A., Magnuski, M., & Gloor, P. (2019). In the shades of the uncanny valley: An experimental study of human–chatbot interaction. Future Generation Computer Systems, 92, 539–548. https://doi.org/10.1016/j.future.2018.01.055
  • Creswell, J. W., & Poth, C. N. (2018). Qualitative inquiry and research design: Choosing among five approaches (4th ed.). SAGE.
  • Croes, E. A. J., & Antheunis, M. L. (2021). Can we be friends with Mitsuku? A longitudinal study on the process of relationship formation between humans and a social chatbot. Journal of Social and Personal Relationships, 38(1), 279–300. https://doi.org/10.1177/0265407520959463
  • Demirel, S., Bulur, N., & Çakıcı, Z. (2024). Utilizing Artificial Intelligence for Text Classification in Communication Sciences (pp. 218–235). https://doi.org/10.4018/979-8-3693-1830-0.ch013
  • Dosovitsky, G., & Bunge, E. L. (2021). Bonding With Bot: User Feedback on a Chatbot for Social Isolation. Frontiers in Digital Health, 3. https://doi.org/10.3389/fdgth.2021.735053
  • Duffy, B. R. (2003). Anthropomorphism and the social robot. Robotics and Autonomous Systems, 42(3–4), 177–190. https://doi.org/10.1016/S0921-8890(02)00374-3
  • Epley, N., Waytz, A., & Cacioppo, J. T. (2007). On seeing human: A three-factor theory of anthropomorphism. Psychological Review, 114(4), 864–886. https://doi.org/10.1037/0033-295X.114.4.864
  • Eyrek, A. (2022). Dijitalleşen Seyahat An(ı)ları: Fiziki ve Çevrim içi Seyahat Deneyiminin Hatırlama Bağlamında Karşılaştırılması. Türkiye İletişim Araştırmaları Dergisi, 40, 40–63. https://doi.org/10.17829/turcom.1050187
  • Eyssel, F., & Reich, N. (2013). Loneliness makes the heart grow fonder (of robots). In Proceedings of the 8th ACM/IEEE International Conference on Human-Robot Interaction (pp. 121–122). IEEE. https://doi.org/10.1109/HRI.2013.6483531
  • Fitzpatrick, K. K., Darcy, A., & Vierhile, M. (2017). Delivering Cognitive Behavior Therapy to Young Adults With Symptoms of Depression and Anxiety Using a Fully Automated Conversational Agent (Woebot): A Randomized Controlled Trial. JMIR Mental Health, 4(2), e19. https://doi.org/10.2196/mental.7785
  • Fulmer, R., Joerin, A., Gentile, B., Lakerink, L., & Rauws, M. (2018). Using Psychological Artificial Intelligence (Tess) to Relieve Symptoms of Depression and Anxiety: Randomized Controlled Trial. JMIR Mental Health, 5(4), e64. https://doi.org/10.2196/mental.9782
  • Gambino, A., Fox, J., & Ratan, R. (2020). Building a Stronger CASA: Extending Computers Are Social Actors Paradigm. Human-Machine Communication, 1, 71–86. https://doi.org/10.30658/hmc.1.5
  • Gunkel, D. J. (2012). Communication and artificial intelligence: Opportunities and challenges for the 21st century. Communication Research and Practice, 1(1), 1–16. https://doi.org/10.7275/R5QJ7F7R
  • Gursoy, D., Chi, O. H., Lu, L., & Nunkoo, R. (2019). Consumers accept the application of artificially intelligent (AI) devices used in service delivery. International Journal of Information Management, 49, 157–169. https://doi.org/10.1016/j.ijinfomgt.2019.03.008
  • Guthrie, S. (2013). Anthropomorphism. In A. Runehov & L. Oviedo (Eds.), Encyclopedia of Sciences (pp. 111–113). Springer. https://doi.org/10.1163/9789004249707_vsr_COM_00000204
  • He, X., Shen, X., Chen, Z., Backes, M., & Zhang, Y. (2024). MGTBench: Benchmarking Machine-Generated Text Detection. Proceedings of the 2024 on ACM SIGSAC Conference on Computer and Communications Security, 2251–2265. https://doi.org/10.1145/3658644.3670344
  • Inkster, B., Sarda, S., & Subramanian, V. (2018). An Empathy-Driven, Conversational Artificial Intelligence Agent (Wysa) for Digital Mental Well-Being: Real-World Data Evaluation Mixed-Methods Study. JMIR MHealth and UHealth, 6(11), e12106. https://doi.org/10.2196/12106
  • Kaushik, K. (2025, January 25). The AI chatbot dilemma: Are we sacrificing privacy and trust for convenience? Medium. https://medium.com/@kumarakaushik/the-ai-chatbot-dilemma-are-we-sacrificing-privacy-and-trust-for-convenience-a0f9e04b94df
  • Krämer, N. C., Lucas, G., Schmitt, L., & Gratch, J. (2018). Social snacking with a virtual agent – On the interrelation of need to belong and effects of social responsiveness when interacting with artificial entities. International Journal of Human-Computer Studies, 109, 112–121. https://doi.org/10.1016/j.ijhcs.2017.09.001
  • Lee, M., & Park, J. (2022). Do parasocial relationships and the quality of communication with AI shopping chatbots determine middle‐aged women consumers’ continuance usage intentions? Journal of Consumer Behaviour, 21(4), 842–854. https://doi.org/10.1002/cb.2043
  • Lin, H., Chi, O. H., & Gursoy, D. (2020). Antecedents of customers’ acceptance of artificially intelligent robotic device use in hospitality services. Journal of Hospitality Marketing & Management, 29(5), 530–549. https://doi.org/10.1080/19368623.2020.1685053
  • Liu, J. (2024). ChatGPT: perspectives from human–computer interaction and psychology. Frontiers in Artificial Intelligence, 7. https://doi.org/10.3389/frai.2024.1418869
  • Lofland, J., & Lofland, L. H. (1995). Analyzing social settings: A guide to qualitative observation and analysis (3rd ed.). Wadsworth.
  • Lucas, G. M., Gratch, J., King, A., & Morency, L.-P. (2014). It’s only a computer: Virtual humans increase willingness to disclose. Computers in Human Behavior, 37, 94–100. https://doi.org/10.1016/j.chb.2014.04.043
  • Łukasik, A., & Gut, A. (2025). From robots to chatbots: unveiling the dynamics of human-AI interaction. Frontiers in Psychology, 16. https://doi.org/10.3389/fpsyg.2025.1569277
  • McCarthy, J. (2007). From here to human-level AI. Artificial Intelligence, 171(18), 1174–1182. https://doi.org/10.1016/j.artint.2007.10.009
  • Mori, M. (1970). Bukimi no tani [The uncanny valley]. Energy, 7(4), 33–35.
  • Moussawi, S., Koufaris, M., & Benbunan-Fich, R. (2020). How perceptions of intelligence and anthropomorphism affect adoption of personal intelligent agents. Electronic Markets, 31(2), 343–364. https://doi.org/10.1007/s12525-020-00411-w
  • Nass, C., & Moon, Y. (2000). Machines and Mindlessness: Social Responses to Computers. Journal of Social Issues, 56(1), 81–103. https://doi.org/10.1111/0022-4537.00153
  • Nass, C., Steuer, J., Tauber, E., & Reeder, H. (1993). Anthropomorphism, agency, and ethopoeia: Computers as social actors. Proceedings of the INTERACT ’93 Conference, 111–116.
  • Neff, G., & Nagy, P. (2018). Agency in the Digital Age: Using Symbiotic Agency to Explain Human–Technology Interaction. In A Networked Self and Human Augmentics, Artificial Intelligence, Sentience (pp. 97–107). Routledge. https://doi.org/10.4324/9781315202082-8
  • OpenAI. (2024, February 13). ChatGPT için bellek ve yeni kontroller [Memory and new controls for ChatGPT]. https://openai.com/tr-TR/index/memory-and-new-controls-for-chatgpt/
  • OpenAI, Achiam, J., Adler, S., Agarwal, S., Ahmad, L., Akkaya, I., Aleman, F. L., Almeida, D., Altenschmidt, J., Altman, S., Anadkat, S., Avila, R., Babuschkin, I., Balaji, S., Balcom, V., Baltescu, P., Bao, H., Bavarian, M., Belgum, J., … Zoph, B. (2024). GPT-4 Technical Report. http://arxiv.org/abs/2303.08774
  • Prakash, A. V., & Das, S. (2020). Intelligent conversational agents in mental healthcare services: a thematic analysis of user perceptions. Pacific Asia Journal of the Association for Information Systems, 12(2).
  • Reichardt, J. (1978). Robots: Fact, fiction, and prediction. Viking Press.
  • Sarıoğlu, E. B., & Güregen, E. P. (2024). Duygusal Yalnızlığa Bir Çözüm Olarak Chatgpt: Kişilerarası İletişimin Yeni Aracı. Türkiye İletişim Araştırmaları Dergisi. https://doi.org/10.17829/turcom.1360418
  • Shi, S., Gong, Y., & Gursoy, D. (2021). Antecedents of Trust and Adoption Intention toward Artificially Intelligent Recommendation Systems in Travel Planning: A Heuristic–Systematic Model. Journal of Travel Research, 60(8), 1714–1734. https://doi.org/10.1177/0047287520966395
  • Skjuve, M., Haugstveit, I. M., Følstad, A., & Brandtzaeg, P. B. (2019). Help! Is my chatbot falling into the uncanny valley? An empirical study of user experience in human-chatbot interaction. Human Technology, 30–54. https://doi.org/10.17011/ht/urn.201902201607
  • Song, S. W., & Shin, M. (2024). Uncanny Valley Effects on Chatbot Trust, Purchase Intention, and Adoption Intention in the Context of E-Commerce: The Moderating Role of Avatar Familiarity. International Journal of Human–Computer Interaction, 40(2), 441–456. https://doi.org/10.1080/10447318.2022.2121038
  • Sullivan, Y., Nyawa, S., & Fosso Wamba, S. (2023). Combating Loneliness with Artificial Intelligence: An AI-Based Emotional Support Model. https://doi.org/10.24251/HICSS.2023.541
  • Ta, V., Griffith, C., Boatfield, C., Wang, X., Civitello, M., Bader, H., DeCero, E., & Loggarakis, A. (2020). User Experiences of Social Support from Companion Chatbots in Everyday Contexts: Thematic Analysis. Journal of Medical Internet Research, 22(3), e16235. https://doi.org/10.2196/16235
  • Troshani, I., Rao Hill, S., Sherman, C., & Arthur, D. (2021). Do We Trust in AI? Role of Anthropomorphism and Intelligence. Journal of Computer Information Systems, 61(5), 481–491. https://doi.org/10.1080/08874417.2020.1788473
  • Turing, A. M. (1950). Computing machinery and intelligence. Mind, 59(236), 433–460. https://doi.org/10.1093/mind/LIX.236.433
  • Vinkler, M. L., & Yu, P. (2020). Conversational chatbots with memory-based question and answer generation [Linköping University]. https://www.diva-portal.org/smash/get/diva2:1510358/FULLTEXT01.pdf
  • Wagner, K., & Schramm-Klein, H. (2019). Alexa, are you human? Investigating the anthropomorphism of digital voice assistants – A qualitative approach.
  • Wang, P., & Shao, J. (2022). Escaping Loneliness Through Tourist-Chatbot Interactions. In Information and Communication Technologies in Tourism 2022 (pp. 473–485). Springer International Publishing. https://doi.org/10.1007/978-3-030-94751-4_44
  • Waytz, A., Morewedge, C. K., Epley, N., Monteleone, G., Gao, J.-H., & Cacioppo, J. T. (2010). Making sense by making sentient: Effectance motivation increases anthropomorphism. Journal of Personality and Social Psychology, 99(3), 410–435. https://doi.org/10.1037/a0020240
  • Weizenbaum, J. (1966). ELIZA—a computer program for the study of natural language communication between man and machine. Communications of the ACM, 9(1), 36–45. https://courses.cs.umbc.edu/471/papers/weizenbaum.eliza.1966.pdf
  • Weizenbaum, J. (1983). ELIZA — a computer program for the study of natural language communication between man and machine. Communications of the ACM, 26(1), 23–28. https://doi.org/10.1145/357980.357991
  • Zamora, J. (2017). I’m Sorry, Dave, I’m Afraid I Can’t Do That. Proceedings of the 5th International Conference on Human Agent Interaction, 253–260. https://doi.org/10.1145/3125739.3125766
  • Zumstein, D., & Hundertmark, S. (2017). Chatbots – An interactive technology for personalized communication, transactions and services. IADIS International Journal on WWW/Internet, 15(1), 96–109.

Details

Primary Language English
Subjects Communication and Media Studies (Other)
Journal Section Research Article
Authors

Ecem Ergül 0009-0005-4603-4331

Aysun Eyrek 0000-0002-9898-6271

Esra Pelin Güregen 0000-0003-3564-7560

Submission Date May 29, 2025
Acceptance Date December 22, 2025
Publication Date December 30, 2025
Published in Issue Year 2025 Issue: 50

Cite

APA Ergül, E., Eyrek, A., & Güregen, E. P. (2025). Who am I talking to?: Memory of Chatbots and Personalized Chat Interaction. Akdeniz Üniversitesi İletişim Fakültesi Dergisi, (50), 339–356. https://doi.org/10.31123/akil.1709303


Journal of Akdeniz University Faculty of Communication is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License (CC-BY-NC).