Review Article

Combatting Misinformation and Disinformation: Concepts and Applications

Year 2023, 464–493, 30.12.2023
https://doi.org/10.47994/usbad.1257491

Abstract

It is known that, with the development of new information technologies, the mass of data containing disinformation and misinformation has been increasing. Policy makers around the world have been affected by this process in different ways. The data scandals surrounding the 2016 US election marked a breaking point, and many new concepts were introduced in the wake of the Brexit case. The spread of data containing disinformation and misinformation has given rise to a new arena of competition and tension between policy makers and service providers in the public sphere, and governments, non-governmental organizations, civil society, and private initiatives around the world have developed many different approaches and practical applications to combat disinformation and false information. This article aims to examine these new concepts and approaches, and the applications developed to combat disinformation and false information. To that end, applications that operate before exposure to fake content, applications that operate after exposure, and applications that analyze the disinformation network were analyzed through a systematic literature review. In addition, policy initiatives implemented by various countries in different regions of the world, and possible measures Turkey could take in the fight against disinformation and misinformation, were examined.

Supporting Institution

None.

Acknowledgements

Respectfully.

References

  • Abramowitz, A. I., & Webster, S. (2016). The rise of negative partisanship and the nationalization of US elections in the 21st century. Electoral Studies, 41, 12-22.
  • Abrams, S. (2016). Beyond Propaganda: Soviet Active Measures in Putin’s Russia. Connections, 15(1), 5–31. http://www.jstor.org/stable/26326426
  • Allport, G. W., & Postman, L. J. (1945). Section of psychology: The basic psychology of rumor. Transactions of the New York Academy of Sciences, 8(2 Series II), 61-81.
  • Anderson, A. A., Brossard, D., Scheufele, D. A., Xenos, M. A., & Ladwig, P. (2014). The “nasty effect:” Online incivility and risk perceptions of emerging technologies. Journal of computer-mediated communication, 19(3), 373-387.
  • Ayers M. S., Reder L. M. (1998). A theoretical review of the misinformation effect: Predictions from an activation-based memory model. Psychonomic Bulletin & Review, 5, 1–21.
  • Batson C. D. (1975). Rational processing or rationalization? Effect of disconfirming information on a stated religious belief. Journal of Personality and Social Psychology, 32, 176–184.
  • Bauman, S., Toomey, R. B., & Walker, J. L. (2013). Associations among bullying, cyberbullying, and suicide in high school students. Journal of adolescence, 36(2), 341-350.
  • Bedford D. (2010). Agnotology as a teaching tool: Learning climate science by studying misinformation. Journal of Geography, 109, 159–165.
  • Berger J. (2011). Arousal increases social transmission of information. Psychological Science, 22, 891–893.
  • Berger, J., & Milkman, K. L. (2012). What makes online content viral?. Journal of marketing research, 49(2), 192-205.
  • Berghel, H. (2017). Lies, damn lies, and fake news. Computer, 50(2), 80-85.
  • Berinsky A. (2012). Rumors, truths, and reality: A study of political misinformation. Unpublished manuscript, Massachusetts Institute of Technology, Cambridge, MA.
  • Blais A., Gidengil E., Fournier P., Nevitte N., Everitt J., Kim J. (2010). Political judgments, perceptions of facts, and partisan effects. Electoral Studies, 29, 1–12.
  • Bode, L., & Vraga, E. K. (2015). In related news, that was wrong: The correction of misinformation through related stories functionality in social media. Journal of Communication, 65(4), 619-638.
  • Booth, A., Sutton, A., Clowes, M., & Martyn-St James, M. (2021). Systematic approaches to a successful literature review.
  • Boussalis, C., & Coan, T. G. (2016). Text-mining the signals of climate change doubt. Global Environmental Change, 36, 89-100.
  • Bradshaw, S., & Howard, P. N. (2019). The global disinformation order: 2019 global inventory of organised social media manipulation.
  • Brashier, N. M., & Schacter, D. L. (2020). Aging in an era of fake news. Current directions in psychological science, 29(3), 316-323.
  • Bu, Z., Xia, Z., & Wang, J. (2013). A sock puppet detection algorithm on virtual spaces. Knowledge-Based Systems, 37, 366-377.
  • Bush J. G., Johnson H. M., Seifert C. M. (1994). The implications of corrections: Then why did you mention it? In Ram A., Eiselt K. (Eds.), Proceedings of the 16th annual conference of the cognitive science society (pp. 112–117). Hillsdale, NJ: Erlbaum.
  • Byrne S., Hart P. S. (2009). The boomerang effect: A synthesis of findings and a preliminary theoretical framework. In Beck C. S. (Ed.), Communication yearbook (Vol. 220, pp. 3–37). Hoboken, NY: Routledge.
  • Cambridge Dictionary Official Website, Information Meaning, https://dictionary.cambridge.org/tr, Accessed: 19.02.2020
  • Campan, A., Cuzzocrea, A., & Truta, T. M. (2017, December). Fighting fake news spread in online social networks: Actual trends and future research directions. In 2017 IEEE International Conference on Big Data (Big Data) (pp. 4453-4457). IEEE.
  • Carraro, L., Castelli, L., & Macchiella, C. (2011). The automatic conservative: Ideology-based attentional asymmetries in the processing of valenced information. PLoS One, 6(11), e26456.
  • Cartwright, B., Frank, R., Weir, G., & Padda, K. (2022). Detecting and responding to hostile disinformation activities on social media using machine learning and deep neural networks. Neural Computing and Applications, 34(18), 15141-15163.
  • Castelli L., Carraro L. (2011). Ideology is related to basic cognitive processes involved in attitude formation. Journal of Experimental Social Psychology, 47, 1013–1016.
  • Chen, X., Cheng, Q., & Wang, X. (2012). Institutional Knowledge at Singapore Management University. Quaternary International, 282, 122-129.
  • Cho C. H., Martens M. L., Kim H., Rodrigue M. (2011). Astroturfing global warming: It isn’t always greener on the other side of the fence. Journal of Business Ethics, 104, 571–587.
  • Ciampaglia, G. L. (2018). Fighting fake news: a role for computational social science in the fight against digital misinformation. Journal of Computational Social Science, 1(1), 147-153.
  • Cook, J., Ecker, U., & Lewandowsky, S. (2015). Misinformation and how to correct it. Emerging trends in the social and behavioral sciences: An interdisciplinary, searchable, and linkable resource, 1-17.
  • Cotter E. M. (2008). Influence of emotional content and perceived relevance on spread of urban legends: A pilot study. Psychological Reports, 102, 623–629.
  • Craker, N., & March, E. (2016). The dark side of Facebook: The Dark Tetrad, negative social potency, and trolling behaviours. Personality and Individual Differences, 102, 79-84.
  • Cunliffe-Jones, P., Diagne, A., Finlay, A., Gaye, S., Gichunge, W., Onumah, C., Pretorius, C., & Schiffrin, A. (2018). Media Literacy Teaching Around the World. In Misinformation Policy in Sub-Saharan Africa: From Laws and Regulations to Media Literacy (pp. 31–34). University of Westminster Press. http://www.jstor.org/stable/j.ctv1v3gqw5.9
  • Demestichas, K., Remoundou, K., & Adamopoulou, E. (2020). Food for thought: fighting fake news and online disinformation. IT Professional, 22(2), 28-34.
  • Ecker U. K. H., Lewandowsky S., Fenton O., & Martin K. (2012). Pre-existing attitudes and the continued influence of misinformation. Unpublished manuscript, University of Western Australia, Perth.
  • Ecker U. K. H., Lewandowsky S., Swire B., Chang D. (2011). Correcting false information in memory: Manipulating the strength of misinformation encoding and its retraction. Psychonomic Bulletin & Review, 18, 570–578.
  • Ferrara, E. (2017). Disinformation and social bot operations in the run up to the 2017 French presidential election. arXiv preprint arXiv:1707.00086.
  • Festinger, L. (1957). A theory of cognitive dissonance. Stanford University Press.
  • Fragale A. R., Heath C. (2004). Evolving informational credentials: The (mis)attribution of believable facts to credible sources. Personality and Social Psychology Bulletin, 30, 225–236.
  • Gaines B. J., Kuklinski J. H., Quirk P. J., Peyton B., Verkuilen J. (2007). Same facts, different interpretations: Partisan motivation and opinion on Iraq. Journal of Politics, 69, 957–974.
  • Gerrie M. P., Belcher L. E., & Garry M. (2006). “Mind the gap”: False memories for missing aspects of an event. Applied Cognitive Psychology, 20, 689–696.
  • Gilbert D. T. (1991). How mental systems believe. American Psychologist, 46, 107–119.
  • Gilbert D. T., Tafarodi R. W., Malone P. S. (1993). You can’t not believe everything you read. Journal of Personality and Social Psychology, 65, 221–233.
  • Grace, L., & Hone, B. (2019, May). Factitious: large scale computer game to fight fake news and improve news literacy. In Extended Abstracts of the 2019 CHI Conference on Human Factors in Computing Systems (pp. 1-8).
  • Graham, S. S. (2021). Misinformation inoculation and literacy support tweetorials on COVID-19. Journal of Business and Technical Communication, 35(1), 7-14.
  • Greenberg, J., & Jonas, E. (2003). Psychological motives and political orientation--The left, the right, and the rigid: Comment on Jost et al. (2003). Psychological Bulletin, 129(3), 376–382. https://doi.org/10.1037/0033-2909.129.3.376
  • Grinberg, N., Joseph, K., Friedland, L., Swire-Thompson, B., & Lazer, D. (2019). Fake news on Twitter during the 2016 US presidential election. Science, 363(6425), 374-378.
  • Guess, A. M., Lerner, M., Lyons, B., Montgomery, J. M., Nyhan, B., Reifler, J., & Sircar, N. (2020). A digital media literacy intervention increases discernment between mainstream and false news in the United States and India. Proceedings of the National Academy of Sciences, 117(27), 15536-15545.
  • Guess, A., Nagler, J., & Tucker, J. (2019). Less than you think: Prevalence and predictors of fake news dissemination on Facebook. Science advances, 5(1), eaau4586, 1-8.
  • Iyengar, S., & Massey, D. S. (2019). Scientific communication in a post-truth society. Proceedings of the National Academy of Sciences, 116(16), 7656-7661.
  • Kahan, D. M. (2008). Cultural cognition as a conception of the cultural theory of risk. In S. Roeser (Ed.), Handbook of Risk Theory (forthcoming); Harvard Law School Program on Risk Regulation Research Paper No. 08-20.
  • Kazemi, A., Garimella, K., Gaffney, D., & Hale, S. A. (2021). Claim matching beyond English to scale global fact-checking. arXiv preprint arXiv:2106.00853.
  • Lewandowsky, S., et al. (2020). The Debunking Handbook 2020. https://www.climatechangecommunication.org/wp-content/uploads/2020/10/DebunkingHandbook2020.pdf
  • Lewandowsky, S., Ecker, U. K., & Cook, J. (2017). Beyond misinformation: Understanding and coping with the “post-truth” era. Journal of applied research in memory and cognition, 6(4), 353-369.
  • Lewandowsky, S., Ecker, U. K., Seifert, C. M., Schwarz, N., & Cook, J. (2012). Misinformation and its correction: Continued influence and successful debiasing. Psychological science in the public interest, 13(3), 106-131.
  • Lim, S. S., & Tan, K. R. (2020). Front liners fighting fake news: Global perspectives on mobilising young people as media literacy advocates. Journal of Children and Media, 14(4), 529-535.
  • Linvill, D. L., & Warren, P. L. (2020). Troll factories: Manufacturing specialized disinformation on Twitter. Political Communication, 37(4), 447-467.
  • Lorenzo-Dus, N., Blitvich, P. G. C., & Bou-Franch, P. (2011). On-line polylogues and impoliteness: The case of postings sent in response to the Obama Reggaeton YouTube video. Journal of pragmatics, 43(10), 2578-2593.
  • Mayer J., Mussweiler T. (2011). Suspicious spirits, flexible minds: When distrust enhances creativity. Journal of Personality and Social Psychology, 101, 1262–1277.
  • McGuire, W. J., & Papageorgis, D. (1962). Effectiveness of forewarning in developing resistance to persuasion. Public Opinion Quarterly, 26(1), 24-34.
  • Nyhan, B., & Reifler, J. (2010). When corrections fail: The persistence of political misperceptions. Political Behavior, 32(2), 303-330.
  • Oliver, J. E., & Wood, T. J. (2014). Conspiracy Theories and the Paranoid Style(s) of Mass Opinion. American Journal of Political Science, 58(4), 952–966. http://www.jstor.org/stable/24363536
  • Pamment, J. (2020). The EU's role in fighting disinformation: taking back the initiative.
  • Pantazi, M., Papaioannou, K., & van Prooijen, J. W. (2022). Power to the people: The hidden link between support for direct democracy and belief in conspiracy theories. Political Psychology, 43(3), 529-548.
  • Pennycook, G., & Rand, D. G. (2019). Lazy, not biased: Susceptibility to partisan fake news is better explained by lack of reasoning than by motivated reasoning. Cognition, 188, 39-50.
  • Pfau, M., Haigh, M. M., Sims, J., & Wigley, S. (2007). The influence of corporate front-group stealth campaigns. Communication research, 34(1), 73-99.
  • Porter, E., & Wood, T. J. (2021). The global effectiveness of fact-checking: Evidence from simultaneous experiments in Argentina, Nigeria, South Africa, and the United Kingdom. Proceedings of the National Academy of Sciences, 118(37), e2104235118.
  • Robinson, O., Coleman, A., & Sardarizadeh, S. (2019). A report on anti-disinformation initiatives. Oxford University.
  • Rodríguez-Pérez, C., Paniagua-Rojano, F. J., & Magallón-Rosa, R. (2021). Debunking political disinformation through journalists’ perceptions: an analysis of Colombia’s fact-checking news practices. Media and Communication, 9(1), 264-275.
  • Roozenbeek, J., & van der Linden, S. (2019). Fake news game confers psychological resistance against online misinformation. Palgrave Communications, 5(1), 65, 1-10. https://doi.org/10.1057/s41599-019-0279-9
  • Saez-Trumper, D. (2019). Online disinformation and the role of Wikipedia. arXiv preprint arXiv:1910.12596.
  • Scales, D., Gorman, J., & Jamieson, K. H. (2021). The Covid-19 Infodemic — Applying the Epidemiologic Model to Counter Misinformation. New England Journal of Medicine, 385, 678-681.
  • Shaar, S., Martino, G. D. S., Babulkov, N., & Nakov, P. (2020). That is a known lie: Detecting previously fact-checked claims. arXiv preprint arXiv:2005.06058.
  • Soares, F. B., Recuero, R., Volcan, T., Fagundes, G., & Sodré, G. (2021). Research note: Bolsonaro’s firehose: How Covid-19 disinformation on WhatsApp was used to fight a government political crisis in Brazil. Harvard Kennedy School Misinformation Review.
  • Sun, J., & Tang, J. (2011). A survey of models and algorithms for social influence analysis. Social network data analytics, 177-214.
  • Tandoc Jr, E. C., Lim, Z. W., & Ling, R. (2018). Defining “fake news”: A typology of scholarly definitions. Digital Journalism, 6(2), 137-153.
  • Thompson, D., Baranowski, T., Buday, R., Baranowski, J., Thompson, V., Jago, R., & Griffith, M. J. (2010). Serious video games for health: How behavioral science guided the development of a serious video game. Simulation & gaming, 41(4), 587-606.
  • Traberg, C. S., Roozenbeek, J., & van der Linden, S. (2022). Psychological Inoculation against Misinformation: Current Evidence and Future Directions. The ANNALS of the American Academy of Political and Social Science, 700(1), 136–151. https://doi.org/10.1177/00027162221087936
  • Türk Dil Kurumu Official Website, Enformasyon, https://sozluk.gov.tr/
  • Van der Linden, S., Leiserowitz, A., Rosenthal, S., & Maibach, E. (2017). Inoculating the public against misinformation about climate change. Global Challenges, 1(2), 1600008.
  • Varol, O., Ferrara, E., Davis, C., Menczer, F., & Flammini, A. (2017, May). Online human-bot interactions: Detection, estimation, and characterization. In Proceedings of the international AAAI conference on web and social media (Vol. 11, No. 1, pp. 280-289).
  • Vosoughi, S., Mohsenvand, M. N., & Roy, D. (2017). Rumor gauge: Predicting the veracity of rumors on Twitter. ACM transactions on knowledge discovery from data (TKDD), 11(4), 1-36.
  • Walter, N., & Murphy, S. T. (2018). How to unring the bell: A meta-analytic approach to correction of misinformation. Communication Monographs, 85(3), 423–441. https://doi.org/10.1080/03637751.2018.1467564
  • Walter, N., & Tukachinsky, R. (2020). A meta-analytic examination of the continued influence of misinformation in the face of correction: How powerful is it, why does it happen, and how to stop it?. Communication research, 47(2), 155-177.
  • Walter, N., Cohen, J., Holbert, R. L., & Morag, Y. (2020). Fact-checking: A meta-analysis of what works and for whom. Political Communication, 37(3), 350-375.
  • Wang, W. Y. (2017). “Liar, liar pants on fire”: A new benchmark dataset for fake news detection. arXiv preprint arXiv:1705.00648.
  • Zaragoza M. S., Mitchell K. J. (1996). Repeated exposure to suggestion and the creation of false memories. Psychological Science, 7, 294–300.
  • Zellers, R., Holtzman, A., Rashkin, H., Bisk, Y., Farhadi, A., Roesner, F., & Choi, Y. (2019). Defending against neural fake news. Advances in neural information processing systems, 32.
  • Zollo, F., Novak, P. K., Del Vicario, M., Bessi, A., Mozetič, I., Scala, A., ... & Quattrociocchi, W. (2015). Emotional dynamics in the age of misinformation. PloS one, 10(9), e0138740.
  • World Economic Forum. (2018, May). The Global Risks Report 2018 13th Edition. https://www3.weforum.org/docs/WEF_GRR18_Report.pdf
  • Pew Research Center. (2016). Digital Readiness Gaps. https://www.pewresearch.org/internet/2016/09/20/digital-readiness-gaps/
  • Fishkin, R. (2022, May 15). SparkToro & Followerwonk Joint Twitter Analysis: 19.42% of Active Accounts Are Fake or Spam. https://sparktoro.com/blog/sparktoro-followerwonk-joint-twitter-analysis-19-42-of-active-accounts-are-fake-or-spam/
  • Tools to Monitor Disinformation, https://www.disinfo.eu/resources/tools-to-monitor-disinformation
  • The SDG Communicator Toolkit (2020). https://sdg-communicator.org/toolkit/
  • International Fact-Checking Network (IFCN), https://www.poynter.org/ifcn/
  • StratCom activity report - Strategic Communication Task Forces and Information Analysis Division (2022, March 24). https://www.eeas.europa.eu/eeas/2021-stratcom-activity-report-strategic-communication-task-forces-and-information-analysis_en
  • East Stratcom Task Force, https://euvsdisinfo.eu/about/#
  • Factsheet: Rapid Alert System (2019, March 19). https://www.eeas.europa.eu/node/59644_en
  • EU Election Observation Missions (2023, August 7). https://www.eeas.europa.eu/eeas/eu-election-observation-missions-1_en
  • The Digital Methods Initiative, https://wiki.digitalmethods.net/Dmi/DmiAbout
  • AoIR signs open letter in support of Twitter API open data access (2023, February 16). https://aoir.org/supporttwitterapiopendata/
  • Pope says ‘fake news’ dates to Adam and Eve. https://www.nbcnews.com/news/world/pope-says-fake-news-dates-adam-eve-says-truth-will-n840516
  • Tools That Fight Disinformation Online, https://www.rand.org/research/projects/truth-decay/fighting-disinformation/search.html
  • Drog Group Disinformation Laboratories, https://drog.group/cases
  • Biteniece, N., et al. (2017, November 16). Digital Hydra: Security Implications of False Information Online. https://stratcomcoe.org/publications/digital-hydra-security-implications-of-false-information-online/205
  • Kirby, E. J. (2016, December 5). The city getting rich from fake news. https://www.bbc.com/news/magazine-38168281
  • Adair, B. (2009, April 20). PolitiFact wins Pulitzer. https://www.politifact.com/article/2009/apr/20/politifact-wins-pulitzer/
  • Burgos, P. (2009, June 27). What 100,000 WhatsApp messages reveal about misinformation in Brazil. https://firstdraftnews.org/articles/what-100000-whatsapp-messages-reveal-about-misinformation-in-brazil/
  • Global Fact-Checking Sites; Reporters Lab Maps, https://reporterslab.org/fact-checking/
  • Game: Go Viral, https://www.goviralgame.com/books/go-viral/
  • Game: Cranky Uncle, https://crankyuncle.com/
  • Dictionary.com, Misinformation vs. Disinformation: Get Informed on the Difference, https://www.dictionary.com/e/misinformation-vs-disinformation-get-informed-on-the-difference/
  • World Health Organization; Munich Security Conference. https://www.who.int/director-general/speeches/detail/munich-security-conference

Combatting Misinformation And Disinformation; Concepts And Applications

Yıl 2023, , 464 - 493, 30.12.2023
https://doi.org/10.47994/usbad.1257491

Öz

It is known that with the development of new information technologies, the mass of data containing disinformation and misinformation is increasing. Policy makers around the world have been affected by this process in different ways. The data scandals during the 2016 election in the USA were the breaking point of this process, and many new concepts were introduced after the Brexit incident. The spread of data containing disinformation and misinformation has led to the emergence of a new area of competition and tension between policy makers and service providers in the public sphere, and many different approaches and practices have been adopted in the fight against disinformation and fake information by governments, non-governmental organizations, civil society and private initiatives around the world. The application has been developed. This article; It aims to directly examine this new concept, approach and practices developed in the fight against disinformation and fake information. For this purpose, applications that analyze the disinformation network before and after exposure to fake content were analyzed through a systematic literature review. Additionally, policy initiatives implemented by various countries in different geographies of the world and possible measures that Turkey can take in the fight against disinformation and misinformation were examined.

Kaynakça

  • Abramowitz, A. I., & Webster, S. (2016). The rise of negative partisanship and the nationalization of US elections in the 21st century. Electoral Studies, 41, 12-22.
  • Abrams, S. (2016). Beyond Propaganda: Soviet Active Measures in Putin’s Russia. Connections, 15(1), 5–31. http://www.jstor.org/stable/26326426
  • Allport, G. W., & Postman, L. J. (1945). Section of psychology: The basic psychology of rumor. Transactions of the New York Academy of Sciences, 8(2 Series II), 61-81.
  • Anderson, A. A., Brossard, D., Scheufele, D. A., Xenos, M. A., & Ladwig, P. (2014). The “nasty effect:” Online incivility and risk perceptions of emerging technologies. Journal of computer-mediated communication, 19(3), 373-387.
  • Ayers M. S., Reder L. M. (1998). A theoretical review of the misinformation effect: Predictions from an activation-based memory model. Psychonomic Bulletin & Review, 5, 1–21.
  • Batson C. D. (1975). Rational processing or rationalization? Effect of disconfirming information on a stated religious belief. Journal of Personality and Social Psychology, 32, 176–184.
  • Bauman, S., Toomey, R. B., & Walker, J. L. (2013). Associations among bullying, cyberbullying, and suicide in high school students. Journal of adolescence, 36(2), 341-350.
  • Bedford D. (2010). Agnotology as a teaching tool: Learning climate science by studying misinformation. Journal of Geography, 109, 159–165.
  • Berger J. (2011). Arousal increases social transmission of information. Psychological Science, 22, 891–893.
  • Berger, J., & Milkman, K. L. (2012). What makes online content viral?. Journal of marketing research, 49(2), 192-205.
  • Berghel, H. (2017). Lies, damn lies, and fake news. Computer, 50(2), 80-85.
  • Berinsky A. (2012). Rumors, truths, and reality: A study of political misinformation. Unpublished manuscript, Massachusetts Institute of Technology, Cambridge, MA.
  • Blais A., Gidengil E., Fournier P., Nevitte N., Everitt J., Kim J. (2010). Political judgments, perceptions of facts, and partisan effects. Electoral Studies, 29, 1–12.
  • Bode, L., & Vraga, E. K. (2015). In related news, that was wrong: The correction of misinformation through related stories functionality in social media. Journal of Communication, 65(4), 619-638.
  • Booth, A., Sutton, A., Clowes, M., & Martyn-St James, M. (2021). Systematic approaches to a successful literature review.
  • Boussalis, C., & Coan, T. G. (2016). Text-mining the signals of climate change doubt. Global Environmental Change, 36, 89-100.
  • Bradshaw, S., & Howard, P. N. (2019). The global disinformation order: 2019 global inventory of organised social media manipulation.
  • Brashier, N. M., & Schacter, D. L. (2020). Aging in an era of fake news. Current directions in psychological science, 29(3), 316-323.
  • Bu, Z., Xia, Z., & Wang, J. (2013). A sock puppet detection algorithm on virtual spaces. Knowledge-Based Systems, 37, 366-377.
  • Bush J. G., Johnson H. M., Seifert C. M. (1994). The implications of corrections: Then why did you mention it? In Ram A., Eiselt K. (Eds.), Proceedings of the 16th annual conference of the cognitive science society (pp. 112–117). Hillsdale, NJ: Erlbaum.
  • Byrne S., Hart P. S. (2009). The boomerang effect: A synthesis of findings and a preliminary theoretical framework. In Beck C. S (Ed.), Communication yearbook (Vol. 220, pp. 3–37). Hoboken, NY: Routledge. Cambridge Sözlük Resmi İnternet Sayfası, İnformation Meaning, https://dictionary.cambridge.
  • Campan, A., Cuzzocrea, A., & Truta, T. M. (2017, December). Fighting fake news spread in online social networks: Actual trends and future research directions. In 2017 IEEE International Conference on Big Data (Big Data) (pp. 4453-4457). IEEE.
  • Carraro, L., Castelli, L., & Macchiella, C. (2011). The automatic conservative: Ideology-based attentional asymmetries in the processing of valenced information. PLoS One, 6(11), e26456.
  • Cartwright, B., Frank, R., Weir, G., & Padda, K. (2022). Detecting and responding to hostile disinformation activities on social media using machine learning and deep neural networks. Neural Computing and Applications, 34(18), 15141-15163.
  • Castelli L., Carraro L. (2011). Ideology is related to basic cognitive processes involved in attitude formation. Journal of Experimental Social Psychology, 47, 1013–1016.
  • Chen, X., Cheng, Q., & Wang, X. (2012). Institutional Knowledge at Singapore Management Universit y. Quaternary International, 282, 122-129.
  • Cho C. H., Martens M. L., Kim H., Rodrigue M. (2011). Astroturfing global warming: It isn’t always greener on the other side of the fence. Journal of Business Ethics, 104, 571–587.
  • Ciampaglia, G. L. (2018). Fighting fake news: a role for computational social science in the fight against digital misinformation. Journal of Computational Social Science, 1(1), 147-153.
  • Cook, J., Ecker, U., & Lewandowsky, S. (2015). Misinformation and how to correct it. Emerging trends in the social and behavioral sciences: An interdisciplinary, searchable, and linkable resource, 1-17.
  • Cotter E. M. (2008). Influence of emotional content and perceived relevance on spread of urban legends: A pilot study. Psychological Reports, 102, 623–629.
  • Craker, N., & March, E. (2016). The dark side of Facebook: The Dark Tetrad, negative social potency, and trolling behaviours. Personality and Individual Differences, 102, 79-84.
  • Cunliffe-Jones, P., Diagne, A., Finlay, A., Gaye, S., Gichunge, W., Onumah, C., Pretorius, C., & Schiffrin, A. (2018). MEDIA LITERACY TEACHING AROUND THE WORLD. In Misinformation Policy in Sub-Saharan Africa: From Laws and Regulations to Media Literacy (pp. 31–34). University of Westminster Press. http://www.jstor.org/stable/j.ctv1v3gqw5.9
  • Demestichas, K., Remoundou, K., & Adamopoulou, E. (2020). Food for thought: fighting fake news and online disinformation. IT Professional, 22(2), 28-34.
  • Ecker U. K. H., Lewandowsky S., Fenton O., & Martin K. (2012). Pre-existing attitudes and the continued influence of misinformation. Unpublished manuscript, University of Western Australia, Perth.
  • Ecker U. K. H., Lewandowsky S., Swire B., Chang D. (2011). Correcting false information in memory: Manipulating the strength of misinformation encoding and its retraction. Psychonomic Bulletin & Review, 18, 570–578.
  • Ferrara, E. (2017). Disinformation and social bot operations in the run up to the 2017 French presidential election. arXiv preprint arXiv:1707.00086.
  • Festinger, L. (1957). A theory of cognitive dissonance. Stanford University Press.
  • Fragale A. R., Heath C. (2004). Evolving informational credentials: The (mis)attribution of believable facts to credible sources. Personality and Social Psychology Bulletin, 30, 225–236.
  • Gaines B. J., Kuklinski J. H., Quirk P. J., Peyton B., Verkuilen J. (2007). Same facts, different interpretations: Partisan motivation and opinion on Iraq. Journal of Politics, 69, 957–974.
  • Gerrie M. P., Belcher L. E., & Garry M. (2006). “Mind the gap”: False memories for missing aspects of an event. Applied Cognitive Psychology, 20, 689–696.
  • Gilbert D. T. (1991). How mental systems believe. American Psychologist, 46, 107–119.
  • Gilbert D. T., Tafarodi R. W., Malone P. S. (1993). You can’t not believe everything you read. Journal of Personality and Social Psychology, 65, 221–233.
  • Grace, L., & Hone, B. (2019, May). Factitious: large scale computer game to fight fake news and improve news literacy. In Extended Abstracts of the 2019 CHI Conference on Human Factors in Computing Systems (pp. 1-8).
  • Graham, S. S. (2021). Misinformation inoculation and literacy support tweetorials on COVID-19. Journal of Business and Technical Communication, 35(1), 7-14.
  • Greenberg, J., & Jonas, E. (2003). Psychological motives and political orientation--The left, the right, and the rigid: Comment on Jost et al. (2003). Psychological Bulletin, 129(3), 376–382. https://doi.org/10.1037/0033-2909.129.3.376
  • Grinberg, N., Joseph, K., Friedland, L., Swire-Thompson, B., & Lazer, D. (2019). Fake news on Twitter during the 2016 US presidential election. Science, 363(6425), 374-378.
  • Guess, A. M., Lerner, M., Lyons, B., Montgomery, J. M., Nyhan, B., Reifler, J., & Sircar, N. (2020). A digital media literacy intervention increases discernment between mainstream and false news in the United States and India. Proceedings of the National Academy of Sciences, 117(27), 15536-15545.
  • Guess, A., Nagler, J., & Tucker, J. (2019). Less than you think: Prevalence and predictors of fake news dissemination on Facebook. Science advances, 5(1), eaau4586, 1-8.
  • Iyengar, S., & Massey, D. S. (2019). Scientific communication in a post-truth society. Proceedings of the National Academy of Sciences, 116(16), 7656-7661.
  • Kahan, D. M. (2008). Cultural cognition as a conception of the cultural theory of risk. HANDBOOK OF RISK THEORY, S. Roeser, ed., Forthcoming, Harvard Law School Program on Risk Regulation Research Paper, (08-20).
  • Kazemi, A., Garimella, K., Gaffney, D., & Hale, S. A. (2021). Claim matching beyond English to scale global fact-checking. arXiv preprint arXiv:2106.00853.
  • Lewandowsky S. vd. (2020). The debunking handbook. https://www.climatechangecommunication.org/wp-content/uploads/2020/10/DebunkingHandbook2020.pdf
  • Lewandowsky, S., Ecker, U. K., & Cook, J. (2017). Beyond misinformation: Understanding and coping with the “post-truth” era. Journal of Applied Research in Memory and Cognition, 6(4), 353-369.
  • Lewandowsky, S., Ecker, U. K., Seifert, C. M., Schwarz, N., & Cook, J. (2012). Misinformation and its correction: Continued influence and successful debiasing. Psychological Science in the Public Interest, 13(3), 106-131.
  • Lim, S. S., & Tan, K. R. (2020). Front liners fighting fake news: Global perspectives on mobilising young people as media literacy advocates. Journal of Children and Media, 14(4), 529-535.
  • Linvill, D. L., & Warren, P. L. (2020). Troll factories: Manufacturing specialized disinformation on Twitter. Political Communication, 37(4), 447-467.
  • Lorenzo-Dus, N., Blitvich, P. G. C., & Bou-Franch, P. (2011). On-line polylogues and impoliteness: The case of postings sent in response to the Obama Reggaeton YouTube video. Journal of Pragmatics, 43(10), 2578-2593.
  • Mayer, J., & Mussweiler, T. (2011). Suspicious spirits, flexible minds: When distrust enhances creativity. Journal of Personality and Social Psychology, 101, 1262-1277.
  • McGuire, W. J., & Papageorgis, D. (1962). Effectiveness of forewarning in developing resistance to persuasion. Public Opinion Quarterly, 26(1), 24-34.
  • Nyhan, B., & Reifler, J. (2010). When corrections fail: The persistence of political misperceptions. Political Behavior, 32(2), 303-330.
  • Oliver, J. E., & Wood, T. J. (2014). Conspiracy Theories and the Paranoid Style(s) of Mass Opinion. American Journal of Political Science, 58(4), 952–966. http://www.jstor.org/stable/24363536
  • Pamment, J. (2020). The EU's role in fighting disinformation: taking back the initiative.
  • Pantazi, M., Papaioannou, K., & van Prooijen, J. W. (2022). Power to the people: The hidden link between support for direct democracy and belief in conspiracy theories. Political Psychology, 43(3), 529-548.
  • Pennycook, G., & Rand, D. G. (2019). Lazy, not biased: Susceptibility to partisan fake news is better explained by lack of reasoning than by motivated reasoning. Cognition, 188, 39-50.
  • Pfau, M., Haigh, M. M., Sims, J., & Wigley, S. (2007). The influence of corporate front-group stealth campaigns. Communication Research, 34(1), 73-99.
  • Porter, E., & Wood, T. J. (2021). The global effectiveness of fact-checking: Evidence from simultaneous experiments in Argentina, Nigeria, South Africa, and the United Kingdom. Proceedings of the National Academy of Sciences, 118(37), e2104235118.
  • Robinson, O., Coleman, A., & Sardarizadeh, S. (2019). A report on anti-disinformation initiatives. Oxford University.
  • Rodríguez-Pérez, C., Paniagua-Rojano, F. J., & Magallón-Rosa, R. (2021). Debunking political disinformation through journalists’ perceptions: an analysis of Colombia’s fact-checking news practices. Media and Communication, 9(1), 264-275.
  • Roozenbeek, J., & van der Linden, S. (2019). Fake news game confers psychological resistance against online misinformation. Palgrave Communications, 5(1), 1-10. https://doi.org/10.1057/s41599-019-0279-9
  • Saez-Trumper, D. (2019). Online disinformation and the role of Wikipedia. arXiv preprint arXiv:1910.12596.
  • Scales, D., Gorman, J., & Jamieson, K. H. (2021). The Covid-19 Infodemic — Applying the Epidemiologic Model to Counter Misinformation. New England Journal of Medicine, 385, 678-681.
  • Shaar, S., Martino, G. D. S., Babulkov, N., & Nakov, P. (2020). That is a known lie: Detecting previously fact-checked claims. arXiv preprint arXiv:2005.06058.
  • Soares, F. B., Recuero, R., Volcan, T., Fagundes, G., & Sodré, G. (2021). Research note: Bolsonaro’s firehose: How Covid-19 disinformation on WhatsApp was used to fight a government political crisis in Brazil. Harvard Kennedy School Misinformation Review.
  • Sun, J., & Tang, J. (2011). A survey of models and algorithms for social influence analysis. In Social Network Data Analytics (pp. 177-214).
  • Tandoc Jr., E. C., Lim, Z. W., & Ling, R. (2018). Defining “fake news”: A typology of scholarly definitions. Digital Journalism, 6(2), 137-153.
  • Thompson, D., Baranowski, T., Buday, R., Baranowski, J., Thompson, V., Jago, R., & Griffith, M. J. (2010). Serious video games for health: How behavioral science guided the development of a serious video game. Simulation & Gaming, 41(4), 587-606.
  • Traberg, C. S., Roozenbeek, J., & van der Linden, S. (2022). Psychological Inoculation against Misinformation: Current Evidence and Future Directions. The ANNALS of the American Academy of Political and Social Science, 700(1), 136–151. https://doi.org/10.1177/00027162221087936
  • Türk Dil Kurumu (Turkish Language Association). Enformasyon. https://sozluk.gov.tr/
  • Van der Linden, S., Leiserowitz, A., Rosenthal, S., & Maibach, E. (2017). Inoculating the public against misinformation about climate change. Global Challenges, 1(2), 1600008.
  • Varol, O., Ferrara, E., Davis, C., Menczer, F., & Flammini, A. (2017, May). Online human-bot interactions: Detection, estimation, and characterization. In Proceedings of the international AAAI conference on web and social media (Vol. 11, No. 1, pp. 280-289).
  • Vosoughi, S., Mohsenvand, M. N., & Roy, D. (2017). Rumor gauge: Predicting the veracity of rumors on Twitter. ACM Transactions on Knowledge Discovery from Data (TKDD), 11(4), 1-36.
  • Walter, N., & Murphy, S. T. (2018). How to unring the bell: A meta-analytic approach to correction of misinformation. Communication Monographs, 85(3), 423-441. https://doi.org/10.1080/03637751.2018.1467564
  • Walter, N., & Tukachinsky, R. (2020). A meta-analytic examination of the continued influence of misinformation in the face of correction: How powerful is it, why does it happen, and how to stop it? Communication Research, 47(2), 155-177.
  • Walter, N., Cohen, J., Holbert, R. L., & Morag, Y. (2020). Fact-checking: A meta-analysis of what works and for whom. Political Communication, 37(3), 350-375.
  • Wang, W. Y. (2017). " liar, liar pants on fire": A new benchmark dataset for fake news detection. arXiv preprint arXiv:1705.00648.
  • Zaragoza, M. S., & Mitchell, K. J. (1996). Repeated exposure to suggestion and the creation of false memories. Psychological Science, 7, 294-300.
  • Zellers, R., Holtzman, A., Rashkin, H., Bisk, Y., Farhadi, A., Roesner, F., & Choi, Y. (2019). Defending against neural fake news. Advances in neural information processing systems, 32.
  • Zollo, F., Novak, P. K., Del Vicario, M., Bessi, A., Mozetič, I., Scala, A., ... & Quattrociocchi, W. (2015). Emotional dynamics in the age of misinformation. PloS one, 10(9), e0138740.
  • World Economic Forum. (2018, May). The Global Risks Report 2018 13th Edition. https://www3.weforum.org/docs/WEF_GRR18_Report.pdf
  • Pew Research Center. (2016). Digital Readiness Gaps. https://www.pewresearch.org/internet/2016/09/20/digital-readiness-gaps/
  • Fishkin, R. (2022, May 15). SparkToro & Followerwonk Joint Twitter Analysis: 19.42% of Active Accounts Are Fake or Spam. https://sparktoro.com/blog/sparktoro-followerwonk-joint-twitter-analysis-19-42-of-active-accounts-are-fake-or-spam/
  • Tools to Monitor Disinformation. https://www.disinfo.eu/resources/tools-to-monitor-disinformation
  • The SDG Communicator Toolkit. (2020). https://sdg-communicator.org/toolkit/
  • International Fact-Checking Network (IFCN). https://www.poynter.org/ifcn/
  • StratCom activity report - Strategic Communication Task Forces and Information Analysis Division. (2022, March 24). https://www.eeas.europa.eu/eeas/2021-stratcom-activity-report-strategic-communication-task-forces-and-information-analysis_en
  • East Stratcom Task Force. https://euvsdisinfo.eu/about/#
  • Factsheet: Rapid Alert System. (2019, March 19). https://www.eeas.europa.eu/node/59644_en
  • EU Election Observation Missions, (2023 August 7). https://www.eeas.europa.eu/eeas/eu-election-observation-missions-1_en
  • The Digital Methods Initiative. https://wiki.digitalmethods.net/Dmi/DmiAbout
  • AoIR signs open letter in support of Twitter API open data access. (2023, February 16). https://aoir.org/supporttwitterapiopendata/
  • NBC News. https://www.nbcnews.com/news/world/pope-says-fake-news-dates-adam-eve-says-truth-will-n840516
  • Tools That Fight Disinformation Online, https://www.rand.org/research/projects/truth-decay/fighting-disinformation/search.html
  • Drog Group Disinformation Laboratories, https://drog.group/cases
  • Biteniece, N., et al. (2017, November 16). Digital Hydra: Security Implications of False Information Online. https://stratcomcoe.org/publications/digital-hydra-security-implications-of-false-information-online/205
  • Kirby, E. J. (2016, December 5). The city getting rich from fake news. https://www.bbc.com/news/magazine-38168281
  • Adair, B. (2009, April 20). PolitiFact wins Pulitzer. https://www.politifact.com/article/2009/apr/20/politifact-wins-pulitzer/
  • Burgos, P. (2019, June 27). What 100,000 WhatsApp messages reveal about misinformation in Brazil. https://firstdraftnews.org/articles/what-100000-whatsapp-messages-reveal-about-misinformation-in-brazil/
  • Global Fact-Checking Sites; Reporters' Lab Maps. https://reporterslab.org/fact-checking/
  • Game: Go Viral. https://www.goviralgame.com/books/go-viral/
  • Game: Cranky Uncle. https://crankyuncle.com/
  • Dictionary.com. Misinformation vs. disinformation: Get informed on the difference. https://www.dictionary.com/e/misinformation-vs-disinformation-get-informed-on-the-difference/
  • World Health Organization. Munich Security Conference [speech]. https://www.who.int/director-general/speeches/detail/munich-security-conference
There are 108 references in total.

Details

Primary Language: Turkish
Subjects: Political Science (Other)
Section: Review Articles
Authors

Necmettin Mutlu 0000-0001-5509-4380

Early View Date: 20 December 2023
Publication Date: 30 December 2023
Acceptance Date: 25 November 2023
Published in Issue: Year 2023

Cite

APA Mutlu, N. (2023). Yanlış Bilgi Ve Dezenformasyonla Mücadele: Kavramlar Ve Uygulamalar. Uluslararası Sosyal Bilimler Akademi Dergisi(13), 464-493. https://doi.org/10.47994/usbad.1257491


Uluslararası Sosyal Bilimler Akademi Dergisi (USBAD), İnönü Üniversitesi Eğitim Fakültesi Türkçe ve Sosyal Bilimler Eğitimi Bölümü Yerleşke / Malatya

Phone: 0533 5438933, https://dergipark.org.tr/tr/pub/usbad -- sdurukoglu@gmail.com -- usbaddergi@gmail.com