Research Article

Changes in Scholars’ Ways of Knowledge Production Shaped by Systematic Measures

Year 2023, Volume: 22, Issue: 1, 30-45, 27.01.2023
https://doi.org/10.21547/jss.1209134

Abstract

This paper considers the role played by systematic measurement in academic conduct. The introduction of an entrepreneurial culture into academia fosters a competitive atmosphere through its emphasis on personal merit, which entails quantitative performance measurement. As knowledge production is one of scholars’ core missions, this study aims to articulate how ways of knowledge production are affected by environmental factors, including measuring instruments, institutional management, funding bodies and state policy. Drawing on interviews with 41 scholars in Taiwan and their accounts of academic practice, the paper empirically analyses how knowledge production has been shaped by policy environments. The resulting changes in behaviour include the selection of research topics, the design of research agendas, and strategies for grant applications and publications, engendering subtle changes in the epistemic properties of research. From this, the paper characterizes the power relations between researchers, scientific communities and the state in the neoliberal age.

References

  • Archambault, É., & Larivière, V. (2009). History of the journal impact factor: Contingencies and consequences. Scientometrics, 79(3), 635-649.
  • Archambault, É., Vignola-Gagne, É., Côté, G., Larivière, V., & Gingras, Y. (2006). Benchmarking scientific output in the social sciences and humanities: The limits of existing databases. Scientometrics, 68(3), 329-342.
  • Back, L. (2016). Academic Diary: Or Why Higher Education Still Matters. London: Goldsmiths Press.
  • Barbash, F. (2014). Scholarly journal retracts 60 articles, smashes ‘peer review ring’. The Washington Post. Retrieved from https://www.washingtonpost.com/news/morning-mix/wp/2014/07/10/scholarly-journal-retracts-60-articles-smashes-peer-review-ring/
  • Brembs, B., Button, K., & Munafò, M. (2013). Deep impact: Unintended consequences of journal rank. Frontiers in Human Neuroscience, 7.
  • Butler, L. (2007). Assessing university research: a plea for a balanced approach. Science and Public Policy, 34(8), 565-574.
  • Cheng, S. Y., Jacob, W. J., & Yang, S.-K. (2014). Reflections from the Social Science Citation Index (SSCI) and its influence on education research in Taiwan. In The SSCI Syndrome in Higher Education (pp. 97-107). Rotterdam; Boston; Taipei: Sense Publishers.
  • Engels, T. C., Ossenblok, T. L., & Spruyt, E. H. (2012). Changing publication patterns in the social sciences and humanities, 2000–2009. Scientometrics, 93(2), 373-390.
  • Espeland, W. N., & Lom, S. E. (2015). Noticing Numbers: How Quantification Changes What We See and What We Don't. In M. Kornberger, L. Justesen, A. K. Madsen, & J. Mouritsen (Eds.), Making things valuable. Oxford: Oxford University Press.
  • Esposito, E., & Stark, D. (2019). What's Observed in a Rating? Rankings as Orientation in the Face of Uncertainty. Theory, Culture & Society, 36(4), 3-26.
  • Fanelli, D. (2009). How many scientists fabricate and falsify research? A systematic review and meta-analysis of survey data. PLoS One, 4(5), e5738. doi:10.1371/journal.pone.0005738
  • Fanelli, D. (2012). Negative results are disappearing from most disciplines and countries. Scientometrics, 90(3), 891-904.
  • Foucault, M. (1979). Discipline and punish: the birth of the prison. London: Penguin.
  • Gane, N. (2012). The governmentalities of neoliberalism: panopticism, post‐panopticism and beyond. The Sociological Review, 60(4), 611-634.
  • Glänzel, W., & Schoepflin, U. (1999). A bibliometric study of reference literature in the sciences and social sciences. Information Processing & Management, 35(1), 31-44.
  • Gläser, J., Lange, S., Laudel, G., & Schimank, U. (2010). The limits of universality: How field-specific epistemic conditions affect authority relations and their consequences. In R. Whitley, J. Gläser, & L. Engwall (Eds.), Reconfiguring knowledge production: Changing authority relationships in the sciences and their consequences for intellectual innovation (pp. 291-324). Oxford: Oxford University Press.
  • Gläser, J., & Laudel, G. (2015). A bibliometric reconstruction of research trails for qualitative investigations of scientific innovations. Historical Social Research, 40(3), 299-330.
  • Gläser, J., Laudel, G., Hinze, S., & Butler, L. (2002). Impact of evaluation-based funding on the production of scientific knowledge: What to worry about, and how to find out. Karlsruhe, Germany: Fraunhofer ISI.
  • Hallonsten, O. (2021). Stop evaluating science: A historical-sociological argument. Social Science Information, 60(1), 7-26.
  • Hammarfelt, B., & de Rijcke, S. (2014). Accountability in context: effects of research evaluation systems on publication practices, disciplinary norms, and individual working routines in the faculty of Arts at Uppsala University. Research Evaluation, rvu029.
  • Herbst, M. (2007). Financing public universities: The case of performance funding (Vol. 18). Dordrecht: Springer.
  • Hicks, D. (1999). The difficulty of achieving full coverage of international social science literature and the bibliometric consequences. Scientometrics, 44(2), 193-215. doi:10.1007/bf02457380
  • Hicks, D. (2012). Performance-based university research funding systems. Research Policy, 41(2), 251-261. doi:10.1016/j.respol.2011.09.007
  • Kelly, C. D. (2006). Replicating empirical research in behavioral ecology: How and why it should be done but rarely ever is. The Quarterly Review of Biology, 81(3), 221-236.
  • Kim, T.-H. (2008). How could a scientist become a national celebrity? Nationalism and Hwang Woo-Suk scandal. East Asian Science, Technology and Society: An International Journal, 2(1), 27-45.
  • Laine, C., & Winker, M. A. (2017). Identifying predatory or pseudo-journals. Biochemia Medica, 27(2), 285-291.
  • Larsen, P. O., & Von Ins, M. (2010). The rate of growth in scientific publication and the decline in coverage provided by Science Citation Index. Scientometrics, 84(3), 575-603.
  • Latour, B. (1987). Science in action: How to follow scientists and engineers through society. Cambridge, Mass.: Harvard University Press.
  • Latour, B. (1999). Circulating reference: Sampling the soil in the Amazon forest. In Pandora's hope: essays on the reality of science studies. Cambridge, Mass.: Harvard University Press.
  • Laudel, G., Bielick, J., & Gläser, J. (2019). ‘Ultimately the question always is: “What do I have to do to do it right?”’: Scripts as explanatory factors of career decisions. Human Relations, 72(5), 932-961.
  • Laudel, G., & Gläser, J. (2014). Beyond breakthrough research: Epistemic properties of research and their consequences for research funding. Research Policy, 43(7), 1204-1216.
  • Lawrence, P. A. (2003). The politics of publication. Nature, 422(6929), 259-261.
  • Li, Y., & Flowerdew, J. (2009). International engagement versus local commitment: Hong Kong academics in the humanities and social sciences writing for publication. Journal of English for Academic Purposes, 8(4), 279-293.
  • Liu, Z. (2003). Trends in transforming scholarly communication and their implications. Information Processing & Management, 39(6), 889-898.
  • Møller, A. P., & Jennions, M. D. (2001). Testing and adjusting for publication bias. Trends in Ecology & Evolution, 16(10), 580-586. doi:10.1016/S0169-5347(01)02235-2
  • Martin, B. R. (2003). The changing social contract for science and the evolution of the university. In A. Geuna, A. J. Salter, & W. E. Steinmueller (Eds.), Science and Innovation: Rethinking the Rationales for Funding and Governance (pp. 7-29). Massachusetts: Edward Elgar Publishing Ltd.
  • Meho, L. I., & Yang, K. (2007). Impact of data sources on citation counts and rankings of LIS faculty: Web of Science versus Scopus and Google Scholar. Journal of the American Society for Information Science and Technology, 58(13), 2105-2125. doi:10.1002/asi.20677
  • Merton, R. K. (1973). The sociology of science: Theoretical and empirical investigations. Chicago; London: University of Chicago Press.
  • Mikami, K. (2018). The case of inferred doability: An analysis of the socio-institutional background of the STAP cell Scandal. East Asian Science, Technology and Society, 12(2), 123-142.
  • Miller, P., & Rose, N. (2008). Governing the present: Administering economic, social and personal life. Cambridge: Polity.
  • Mirowski, P. (2011). Science-mart: Privatizing American science. Cambridge, Mass.: Harvard University Press.
  • Nederhof, A. J. (2006). Bibliometric monitoring of research performance in the social sciences and the humanities: A review. Scientometrics, 66(1), 81-100.
  • Piro, F. N., Aksnes, D. W., & Rørstad, K. (2013). A macro analysis of productivity differences across fields: Challenges in the measurement of scientific publishing. Journal of the American Society for Information Science and Technology, 64(2), 307-320.
  • Rafols, I., Leydesdorff, L., O’Hare, A., Nightingale, P., & Stirling, A. (2012). How journal rankings can suppress interdisciplinary research: A comparison between innovation studies and business & management. Research Policy, 41(7), 1262-1282.
  • Rijcke, S. d., Wouters, P. F., Rushforth, A. D., Franssen, T. P., & Hammarfelt, B. (2016). Evaluation practices and effects of indicator use—a literature review. Research Evaluation, 25(2), 161-169.
  • Schäfer, L. O. (2016). Performance assessment in science and academia: Effects of the RAE/REF on academic life.
  • Shin, J. C., Watanabe, S. P., Chen, R. J.-C., Ho, S. S.-H., & Lee, J.-k. (2020). Institutionalization of competition-based funding under neoliberalism in East Asia. Studies in Higher Education, 45(10), 2054-2062.
  • Shore, C. (2020). Symbiotic or Parasitic? Universities, Academic Capitalism and the Global Knowledge Economy. In E. Heffernan, F. Murphy, & J. Skinner (Eds.), Collaborations: Anthropology in a Neoliberal Age (pp. 23-44). London: Routledge.
  • Sismondo, S., & Doucet, M. (2010). Publication ethics and the ghost management of medical publication. Bioethics, 24(6), 273-283.
  • Sorokowski, P., Kulczycki, E., Sorokowska, A., & Pisanski, K. (2017). Predatory journals recruit fake editor. Nature News, 543(7646), 481.
  • Steen, R. G. (2011). Retractions in the scientific literature: is the incidence of research fraud increasing? Journal of medical ethics, 37(4), 249-253.
  • Strathern, M. (2000). Audit cultures: anthropological studies in accountability, ethics, and the academy (M. Strathern Ed.). London; New York: Routledge.
  • Weingart, P. (2005). Impact of bibliometrics upon the science system: Inadvertent consequences? Scientometrics, 62(1), 117-131.
  • Wilsdon, J., Allen, L., Belfiore, E., Campbell, P., Curry, S., Hill, S., . . . Johnson, B. (2015). The metric tide: Report of the independent review of the role of metrics in research assessment and management. doi:10.13140/RG.2.1.4929.1363
  • Wright, S. (2016). Universities in a knowledge economy or ecology? Policy, contestation and abjection. Critical Policy Studies, 10(1), 59-78.
  • Yong, E. (2012). Replication studies: Bad copy. Nature, 485(7398), 298-300. doi:10.1038/485298a

Details

Primary Language: English
Subjects: Sociology
Section: Sociology
Authors

Ming-te Peng 0000-0001-7507-2701

Publication Date: 27 January 2023
Submission Date: 23 November 2022
Acceptance Date: 11 January 2023
Published Issue: Year 2023, Volume: 22, Issue: 1

How to Cite

APA: Peng, M.-t. (2023). Changes in Scholars’ Ways of Knowledge Production Shaped by Systematic Measures. Gaziantep Üniversitesi Sosyal Bilimler Dergisi, 22(1), 30-45. https://doi.org/10.21547/jss.1209134