Review

Artificial Intelligence Bias and the Amplification of Inequalities in the Labor Market

Year 2024, 159-168, 13.06.2024
https://doi.org/10.26650/JECS2023-1415085

Abstract

Artificial intelligence (AI) now permeates nearly every aspect of daily life. While this AI augmentation is generally beneficial, or at worst unproblematic, some instances warrant attention. In this study, we argue that AI bias originating in training data sets can significantly amplify minor inequalities in the labor market, which later in life manifest as permanently lost opportunities and as segregation by social status and wealth. The Matthew effect underlies this phenomenon, except that the focus is not on the rich getting richer but on the poor becoming even poorer. We demonstrate how frequently changing expectations for skills, competencies, and knowledge lead AI to fail at making impartial hiring decisions. Specifically, bias in the training data sets used by AI skews the results, causing disadvantaged candidates to be overlooked while privileged ones are frequently chosen. This simple AI bias contributes to growing social inequalities by reinforcing the Matthew effect, and it does so far faster than before. We assess these threats by studying data from various labor fields, including justice, security, healthcare, human resource management, and education.
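The feedback loop the abstract describes can be made concrete with a small simulation. The sketch below is purely illustrative and not taken from the study: all group names, hire rates, and the feedback parameter are hypothetical. It shows how a "model" that merely learns historical hire rates from biased labels reproduces that bias, and how reusing its decisions as future training data widens the gap round after round, a Matthew-effect dynamic.

```python
import random

random.seed(42)

def biased_history(n=10_000, privileged_rate=0.6, disadvantaged_rate=0.4):
    """Past hiring records where equally qualified candidates were hired
    at different rates depending only on group membership."""
    records = []
    for _ in range(n):
        group = random.choice(["privileged", "disadvantaged"])
        skill = random.gauss(0.5, 0.1)          # identical skill distributions
        rate = privileged_rate if group == "privileged" else disadvantaged_rate
        hired = random.random() < rate          # the bias lives in the labels
        records.append((group, skill, hired))
    return records

def fit_group_rates(records):
    """A minimal 'model' that learns each group's historical hire rate --
    the simplest way training-data bias becomes model bias."""
    totals, hires = {}, {}
    for group, _, hired in records:
        totals[group] = totals.get(group, 0) + 1
        hires[group] = hires.get(group, 0) + int(hired)
    return {g: hires[g] / totals[g] for g in totals}

def simulate_rounds(rates, rounds=5, feedback=0.05):
    """Each round, the model's decisions become the next round's training
    data, so the advantaged group gains and the disadvantaged group loses."""
    history = [dict(rates)]
    for _ in range(rounds):
        rates = {
            "privileged": min(1.0, rates["privileged"] + feedback),
            "disadvantaged": max(0.0, rates["disadvantaged"] - feedback),
        }
        history.append(dict(rates))
    return history

rates = fit_group_rates(biased_history())
trajectory = simulate_rounds(rates)
print("initial:", trajectory[0])
print("final:  ", trajectory[-1])
```

Even though both groups draw skill from the same distribution, the learned rates differ because the labels do, and the gap then compounds, which is exactly the amplification mechanism the abstract attributes to AI-driven hiring.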

Year 2024, , 159 - 168, 13.06.2024
https://doi.org/10.26650/JECS2023-1415085

Abstract

References

  • Acemoglu, D., & Restrepo, P. (2018). Artificial intelligence, automation and work. NBER Working Paper 24196. Cambridge: National Bureau of Economic Research.
  • Aldoseri, A., Al-Khalifa, K. N., & Hamouda, A. M. (2023). Re-thinking data strategy and integration for artificial intelligence: Concepts, opportunities, and challenges. Applied Sciences, 13(12), 7082.
  • Angwin, J., Larson, J., Mattu, S., & Kirchner, L. (2016). Machine bias: There’s software used across the country to predict future criminals. And it’s biased against blacks. ProPublica.
  • Agrawal, A., Gans, J. S., & Goldfarb, A. (2019). Artificial intelligence: The ambiguous labor market impact of automating prediction. Journal of Economic Perspectives, 33(2), 31-50.
  • Aquino, Y. S. J. (2023). Making decisions: Bias in artificial intelligence and data-driven diagnostic tools. Australian Journal of General Practice, 52(7), 439-442.
  • Babic, B., Gerke, S., Evgeniou, T., & Cohen, I. G. (2021). Beware explanations from AI in health care. Science, 373(6552), 284-286.
  • Bagaric, M. (2016). Three things that a baseline study shows don’t cause indigenous over-imprisonment: Three things that might but shouldn’t and three reforms that will reduce indigenous over-imprisonment. Harvard J Racial & Ethnic Just, 103.
  • Bagaric, M., Hunter, D., & Stobbs, N. (2019). Erasing the bias against using artificial intelligence to predict future criminality: Algorithms are color blind and never tire. University of Cincinnati Law Review, 88(4), 1037-1081.
  • Bagaric, M., Svilar, J., Bull, M., Hunter, D., & Stobbs, N. (2022). The solution to the pervasive bias and discrimination in the criminal justice system: Transparent and fair artificial intelligence. American Criminal Law Review, 59(95), 95-148.
  • Baker, R. S., & Hawn, A. (2021). Algorithmic bias in education. International Journal of Artificial Intelligence in Education, 32, 1052-1092.
  • Barocas, S., & Selbst, A. D. (2016). Big data’s disparate impact. California Law Review, 104, 671-732.
  • Bates, D. W., Saria, S., Ohno-Machado, L., Shah, A., & Escobar, G. (2014). Big data in health care: Using analytics to identify and manage high-risk and high-cost patients. Health Affairs, 33(7), 1123-1131.
  • Bird, K. A., Castleman, B. L., & Song, Y. (2023). Are algorithms biased in education? Exploring racial bias in predicting community college student success. Ed Working Paper No. 23-717. Annenberg Institute: Brown University.
  • Bourdieu, P., & Passeron, J. C. (2000). Reproduction in education, society and culture. Sage Publishing.
  • Bozkurt, V., & Gursoy, D. (2023). The artificial intelligence paradox: Opportunity or threat for humanity? International Journal of Human-Computer Interaction, doi: 10.1080/10447318.2023.2297114
  • Castelluccia, C., & Le Metayer, D. (2019). Understanding algorithmic decision-making: Opportunities and challenges. European Parliament.
  • Celi, L. A., Cellini, J., Charpignon, M. L., Dee, E. C., Dernoncourt, F., Eber, R., et al. (2022). Sources of bias in artificial intelligence that perpetuate healthcare disparities: A global review. PLOS Digital Health, 1(3), e0000022.
  • Charpignon, M. L., Byers, J., Cabral, S., Celi, L. A., Fernandes, F., Gallifant, J., et al. (2023). Critical bias in critical care devices. Critical Care Clinics, 39(4), 795-813.
  • Chawla, N. V., Bowyer, K. W., Hall, L. O., & Kegelmeyer, W. P. (2002). SMOTE: Synthetic minority over-sampling technique. The Journal of Artificial Intelligence Research, 16, 321-357.
  • Coleman, J. S., Campbell, E. Q., Hobson, C. J., McPartland, J., Mood, A. M., Weinfeld, F. D., & York, R. L. (1966). Equality of educational opportunity. US Office of Education.
  • Colon-Rodriguez, C. J. (2023, 12 July). Shedding light on healthcare algorithmic and artificial intelligence bias. US Department of Health and Human Services Office of Minority Health. Retrieved from https://minorityhealth.hhs.gov/news/shedding-light-healthcare-algorithmic-and-artificial-intelligence-bias
  • Dressel, J., & Farid, H. (2018). The accuracy, fairness, and limits of predicting recidivism. Science Advances, 4, eaao5580.
  • Edwards, B., & Cheok, A. D. (2018). Why not robot teachers: Artificial intelligence for addressing teacher shortage. Applied Artificial Intelligence, 32(4), 345-360.
  • Erdi, P. (2020). Ranking: The unwritten rules of the social game we all play. Oxford University Press.
  • European Commission (2022). Mitigating diversity biases of AI in the labor market. Horizon Europe Project Report.
  • European Network Against Racism (2020). Artificial intelligence in HR: How to address racial biases and algorithmic discrimination in HR?. ENAR Publishing.
  • European Parliament (2020). Skills and jobs for future labour markets: European policies and Skills Agendas 2010-2020. EMPL in Focus.
  • Franca, T. J. F., Mamede, H. S., Barroso, J. M. P., & dos Santos, V. M. P. D. (2023). Artificial intelligence applied to potential assessment and talent identification in an organizational context. Heliyon, 9(4), e14694.
  • Hale, K. E. (2020). Using artificial intelligence to circumvent the teacher shortage in special education: A phenomenological investigation. Doctoral Dissertation, Liberty University, USA.
  • Harari, Y. N. (2017). Reboot for the AI revolution. Nature, 550(19), 324-327.
  • Holmes, W., Bialik, M., & Fadel, C. (2023). Artificial intelligence in education. In Data ethics: Building trust: How digital technologies can serve humanity (pp. 621-653). Globethics Publications.
  • IFOW (2020). Artificial intelligence in hiring: Assessing impacts on equality. Institute for the Future of Work. Retrieved from https://assets-global.website-files.com/64d5f73a7fc5e8a240310c4d/64d5f73b7fc5e8a240310ea0_5f71d338891671faa84de443_IFOW%2B-%2BAssessing%2Bimpacts%2Bon%2Bequality.pdf
  • ILO (2023). Artificial intelligence in human resource management: A challenge for the human-centred agenda?. ILO Working Paper 95.
  • Kamulegeya, L. H., Okello, M., Bwanika, J. M., et al. (2019). Using artificial intelligence on dermatology conditions in Uganda: A case for diversity in training data sets for machine learning. African Health Sciences, 23(2), 753-763.
  • King, M. (2022). Harmful biases in artificial intelligence. The Lancet Psychiatry, 9(11), E48.
  • Kizilcec, R. F., & Lee, H. (2022). Algorithmic fairness in education. In Holmes, W., Porayska-Pomsta, K. (Eds), The ethics of artificial intelligence in education (pp. 174-202). Taylor & Francis.
  • Köchling, A., & Wehner, M. C. (2020). Discriminated by an algorithm: A systematic review of discrimination and fairness by algorithmic decision-making in the context of HR recruitment and HR development. Business Research, 13, 795-848.
  • Larrazabal, A. J., Nieto, N., Peterson, V., Milone, D. H., & Ferrante, E. (2020). Gender imbalance in medical imaging datasets produces biased classifiers for computer-aided diagnosis. Proc Natl Acad Sci USA, 117(23), 12592-12594.
  • Li, W., Aste, T., Caccioli, F., & Livan, G. (2019). Early co-authorship with top scientists predicts success in academic careers. Nature Communications, 10, 5170.
  • Lum, K., & Isaac, W. (2016). To predict and serve?. Significance, 13(5), 14-19.
  • Mahmud, H. A. K. M., Islam, N., Ahmed, S. I., & Smolander, K. (2022). What influences algorithmic decision-making? A systematic literature review on algorithm aversion. Technological Forecasting and Social Change, 175, 121390.
  • Makridakis, S. (2017). The forthcoming Artificial Intelligence (AI) revolution: Its impact on society and firms. Futures, 90, 46-60.
  • Mehrabi, N., Morstatter, F., Saxena, N., Lerman, K., & Galstyan, A. (2021). A survey on bias and fairness in machine learning. ACM Computing Surveys, 54(6), 1-35.
  • Merton, R. K. (1968). The Matthew effect in science. Science, 159(3810), 56-63.
  • Mittermaier, M., Raza, M. M., & Kvedar, J. C. (2023). Bias in AI-based models for medical applications: Challenges and mitigation strategies. NPJ Digital Medicine, 6, 113.
  • Napierala, J., & Kvetan, V. (2023). Changing job skills in a changing world. In: Bertoni, E., Fontana, M., Gabrielli, L., Signorelli, S., Vespe, M. (Eds), Handbook of computational social science for policy (pp. 243-259). Springer, Cham.
  • Nazer, L. H., Zatarah, R., Waldrip, S., Ke, J. X. C., Moukheiber, M., Khanna, A. K., et al. (2023). Bias in artificial intelligence algorithms and recommendations for mitigation. PLOS Digital Health, 2(6), e0000278.
  • Ntoutsi, E., Fafalios, P., Gadiraju, U., et al. (2020). Bias in data-driven artificial intelligence systems-an introductory survey. WIREs Data Mining Knowl Discov, 10(3), e1356.
  • Obermeyer, Z., Powers, B., Vogeli, C., & Mullainathan, S. (2019). Dissecting racial bias in an algorithm used to manage the health of populations. Science, 366, 447-453.
  • Ocumpaugh, J., Baker, R., Gowda, S., Heffernan, N., & Heffernan, C. (2014). Population validity for educational data mining models: A case study in affect detection. British Journal of Educational Technology, 45(3), 487-501.
  • OECD (2023). OECD employment outlook 2023: Artificial intelligence and the labour market. OECD Publishing.
  • O’Neil, C. (2016). Weapons of math destruction: How big data increases inequality and threatens democracy. Crown Books.
  • Oracle (2019). AI in human resources: The time is now. Retrieved from https://www.oracle.com/a/ocom/docs/applications/hcm/oracle-ai-in-hr-wp.pdf
  • Özer, M., & Perc, M. (2022). Improving equality in the education system of Türkiye. İstanbul University Journal of Sociology, 42(2), 325-334.
  • Özer, M. (2023). The Matthew effect in the Turkish education system. Bartın University Journal of Faculty of Education, 12(4), 704-712.
  • Perc, M. (2014). The Matthew effect in empirical data. Journal of Royal Society Interface, 11(98), 20140378.
  • Perc, M., Özer, M., & Hojnik, J. (2019). Social and juristic challenges of artificial intelligence. Palgrave Communications, 5, 61.
  • Rajpurkar, P., Chen, E., Banerjee, O., & Topol, E. J. (2022). AI in health and medicine. Nature Medicine, 28, 31-38.
  • Renz, A., & Hilbig, R. (Eds.). (2023). Digital transformation of educational institutions accelerated by covid-19: A digital dynamic capabilities approach. Emerald Publishing.
  • Rigney, D. (2010). The Matthew effect: How advantage begets further advantage. New York: Columbia University Press.
  • Sackett, P. R., Kuncel, N. R., Arneson, J. J., Cooper, S. R., & Waters, S. D. (2009). Does socioeconomic status explain the relationship between admissions tests and post-secondary academic performance? Psychological Bulletin, 135(1), 1-22.
  • Seyyed-Kalantari, L., Zhang, H., McDermott, M. B. A., Chen, I. Y., & Ghassemi, M. (2021). Underdiagnosis bias of artificial intelligence algorithms applied to chest radiographs in under-served patient populations. Nature Med, 27, 2176-2182.
  • Smith, H. (2020). Algorithmic bias: Should students pay the price? AI & Society, 35, 1077-1078.
  • Silberg, J., & Manyika, J. (2019). Notes from the AI frontier: Tackling bias in AI (and in humans). McKinsey Global Institute.
  • Srinivas, N. (2023, February 24). The ethical debate of AI in criminal justice: Balancing efficiency and human rights. Manage Engine Insights. Retrieved from https://insights.manageengine.com/artificial-intelligence/the-ethical-debate-of-ai-in-criminal-justice-balancing-efficiency-and-human-rights//
  • Stefan, V. (2019). Artificial intelligence and its impact on young people. Seminar Report from the Council of Europe.
  • Storm, K. I. L., Reiss, L. K., Guenther, E. A., Clar-Novak, M., & Muhr, S. L. (2023). Unconscious bias in the HRM literature: Towards a critical-reflexive approach. Human Resource Management Review, 33(3), 100969.
  • Suleyman, M. (2023). The coming wave: Technology, power, and the twenty-first century’s greatest dilemma. New York: Crown.
  • Suna, H. E., Tanberkan, H., Gür, B. S., Perc, M., & Özer, M. (2020). Socioeconomic status and school type as predictors of academic achievement. Journal of Economy Culture and Society, 61, 41-64.
  • Tuffaha, M. (2023). The impact of artificial intelligence bias on human resource management functions: Systematic literature review and future research directions. European Journal of Business and Innovation Research, 11(4), 35-58.
  • Ulnicane, I., & Aden, A. (2023). Power and politics in framing bias in artificial intelligence policy. Review of Policy Research, 40, 665-687.
  • UNESCO (2023). The teachers we need for the education we want: The global imperative to reverse the teacher shortage. UNESCO & Education 2030 Factsheet.
  • Varsha, P. S. (2023). How can we manage biases in artificial intelligence systems? A systematic literature review. International Journal of Information Management Data Insights, 3, 100165.
  • Vicente, L., & Matute, H. (2023). Humans inherit artificial intelligence biases. Scientific Reports, 13, 15737.
  • Zhang, J., & Zhang, Z. M. (2023). Ethics and governance of trustworthy medical artificial intelligence. BMC Medical Informatics and Decision Making, 23, 7.
  • Zuckerman, H. A. (1977). Scientific elite: Nobel laureates in the United States. New York: Free Press.
  • Zuckerman, H. (1989). Accumulation of advantage and disadvantage: The theory and its intellectual biography. In Mongardini, C., Tabboni, S. (Eds), Robert K. Merton and contemporary sociology (pp. 153-176). New Brunswick, NJ: Transaction.

Details

Primary Language English
Subjects Sociology (Other)
Journal Section Review Articles
Authors

Mahmut Özer 0000-0001-8722-8670

Matjaz Perc 0000-0002-3087-541X

H. Eren Suna 0000-0002-6874-7472

Publication Date June 13, 2024
Submission Date January 5, 2024
Acceptance Date February 8, 2024
Published in Issue Year 2024

Cite

APA Özer, M., Perc, M., & Suna, H. E. (2024). Artificial Intelligence Bias and the Amplification of Inequalities in the Labor Market. Journal of Economy Culture and Society, (69), 159-168. https://doi.org/10.26650/JECS2023-1415085
AMA Özer M, Perc M, Suna HE. Artificial Intelligence Bias and the Amplification of Inequalities in the Labor Market. Journal of Economy Culture and Society. June 2024;(69):159-168. doi:10.26650/JECS2023-1415085
Chicago Özer, Mahmut, Matjaz Perc, and H. Eren Suna. “Artificial Intelligence Bias and the Amplification of Inequalities in the Labor Market”. Journal of Economy Culture and Society, no. 69 (June 2024): 159-68. https://doi.org/10.26650/JECS2023-1415085.
EndNote Özer M, Perc M, Suna HE (June 1, 2024) Artificial Intelligence Bias and the Amplification of Inequalities in the Labor Market. Journal of Economy Culture and Society 69 159–168.
IEEE M. Özer, M. Perc, and H. E. Suna, “Artificial Intelligence Bias and the Amplification of Inequalities in the Labor Market”, Journal of Economy Culture and Society, no. 69, pp. 159–168, June 2024, doi: 10.26650/JECS2023-1415085.
ISNAD Özer, Mahmut et al. “Artificial Intelligence Bias and the Amplification of Inequalities in the Labor Market”. Journal of Economy Culture and Society 69 (June 2024), 159-168. https://doi.org/10.26650/JECS2023-1415085.
JAMA Özer M, Perc M, Suna HE. Artificial Intelligence Bias and the Amplification of Inequalities in the Labor Market. Journal of Economy Culture and Society. 2024;(69):159–168.
MLA Özer, Mahmut et al. “Artificial Intelligence Bias and the Amplification of Inequalities in the Labor Market”. Journal of Economy Culture and Society, no. 69, 2024, pp. 159-68, doi:10.26650/JECS2023-1415085.
Vancouver Özer M, Perc M, Suna HE. Artificial Intelligence Bias and the Amplification of Inequalities in the Labor Market. Journal of Economy Culture and Society. 2024(69):159-68.