Research Article

Disciplinary Differences in Academics' Perceptions of Performance Measurement at Nordic Universities

Year 2020, Volume: 1 Issue: 1, 18 - 31, 30.06.2020

Abstract

As performance measurement has become increasingly common at Nordic universities, concerns have been raised that disciplinary differences make comparisons of academic performance difficult. To better understand the potential of utilising performance measures in the management of academic work, this study explores how academics perceive governance and steering based on performance measurement. Building on an established typology of the disciplines that distinguishes the hard sciences from the soft and the applied sciences from the pure, we ask how academics perceive performance measurement depending on their disciplinary affiliation. The empirical material consists of a survey sent to academics in four Nordic countries. Our results show clear differences in attitudes toward performance measurement between academics from different disciplines. Academics from the hard applied sciences are more positive about performance measurement than any other group, while academics from the soft pure sciences are the most negative. These findings are consistent with notions about the poor adaptation of metrics to publication practices within the soft sciences and a greater sensitivity to performance measurement among the applied sciences. The main contribution of the article is to provide empirical data supporting the notion that performance measures are accepted to different degrees in different disciplines.

Supporting Institution

Research Council of Norway - Riksbankens Jubileumsfond

Project Number

237782 - FSK15-1059:1

Thanks

This work was supported by the Research Council of Norway under grant number 237782 and Riksbankens Jubileumsfond grant number FSK15-1059:1.

References

  • Aagaard, K. (2015). How incentives trickle down: Local use of a national bibliometric indicator system. Science and Public Policy, 42(5), 725-737.
  • Ahola, S., Hedmo, T., Thomsen, J.-P., & Vabø, A. (2014). Organisational features of higher education: Denmark, Finland, Norway & Sweden. Oslo: NIFU. Accessed (June 1, 2020): https://www.nifu.no/en/publications/1162368
  • Aksnes, D. W., & Rip, A. (2009). Researchers’ perceptions of citations. Research Policy, 38(6), 895-905.
  • Amaral, A., Meek, L., & Larsen, I. M. (Eds.). (2003). The higher education managerial revolution? Dordrecht: Springer.
  • Becher, T. (1994). The significance of disciplinary differences. Studies in Higher Education, 19(2), 151-161.
  • Becher, T., & Trowler, P. R. (2001). Academic tribes and territories. Buckingham: Open University Press.
  • Biglan, A. (1973a). The characteristics of subject matter in different academic areas. Journal of Applied Psychology, 57(3), 195-203.
  • Biglan, A. (1973b). Relationships between subject matter characteristics and the structure and output of university departments. Journal of Applied Psychology, 57(3), 204-213.
  • Bleiklie, I. (1998). Justifying the evaluative state: New public management ideals in higher education. Journal of Public Affairs Education, 4(2), 87-100.
  • Bornmann, L. (2014). Do altmetrics point to the broader impact of research? An overview of benefits and disadvantages of altmetrics. Journal of Informetrics, 8(4), 895-903.
  • Buela-Casal, G., & Zych, I. (2012). What do the scientists think about the impact factor? Scientometrics, 92(2), 281-292.
  • Campbell, J. P., & Oblinger, D. G. (2007). Academic Analytics. EDUCAUSE Quarterly. Accessed (September 01, 2019): https://library.educause.edu/-/media/files/library/2007/10/pub6101-pdf.pdf
  • Christensen, T. (2011). University governance reforms: Potential problems of more autonomy? Higher Education, 62(4), 503-517.
  • Clark, B. R. (1998). Creating entrepreneurial universities: Organizational pathways of transformation. Bingley: Emerald.
  • Cole, S. (1983). The hierarchy of the sciences? American Journal of Sociology, 89(1), 111-139.
  • Deem, R., Hillyard, S., Reed, M., & Reed, M. (2007). Knowledge, higher education, and the new managerialism: The changing management of UK universities. Oxford: Oxford University Press.
  • Deephouse, D. L., & Suchman, M. (2008). Legitimacy in organizational institutionalism. In R. Greenwood, C. Oliver, R. Suddaby, & K. Sahlin-Andersson (Eds.), The Sage Handbook of Organizational Institutionalism (pp. 49-77). London: Sage.
  • Donovan, C. (2007). The qualitative future of research evaluation. Science and Public Policy, 34(8), 585-597.
  • El‐Khawas, E. (1996). One professoriate, or many?: Assessing aspects of differentiation among academics. Tertiary Education & Management, 2(2), 146-152.
  • Elzinga, A. (1997). The science-society contract in historical transformation: With special reference to “epistemic drift”. Social Science Information, 36(3), 411-445.
  • Elzinga, A. (2012). Features of the current science policy regime: Viewed in historical perspective. Science and Public Policy, 39(4), 416-428.
  • Espeland, W. N., & Sauder, M. (2007). Rankings and reactivity: How public measures recreate social worlds. American Journal of Sociology, 113(1), 1-40.
  • Fanelli, D., & Glänzel, W. (2013). Bibliometric evidence for a hierarchy of the sciences. PLoS One, 8(6), e66938.
  • Ferlie, E., Musselin, C., & Andresani, G. (2008). The steering of higher education systems: A public management perspective. Higher Education, 56(3), 325-348.
  • Fägerlind, L., & Strömqvist, G. (Eds.). (2004). Reforming higher education in the Nordic countries: Studies of change in Denmark, Finland, Iceland, Norway and Sweden. Paris: International Institute for Educational Planning. Accessed (June 1, 2020): http://www.iiep.unesco.org/en/reforming-higher-education-nordic-countries-studies-change-denmark-finland-iceland-norway-and-sweden
  • Gibbons, M., Limoges, C., Nowotny, H., Schwartzman, S., Scott, P., & Trow, M. (Eds.). (1994). The new production of knowledge: The dynamics of science and research in contemporary societies. London: Sage.
  • Gläser, J., & Laudel, G. (2007). The social construction of bibliometric evaluations. In R. Whitley & J. Gläser (Eds.), The changing governance of the sciences: The advent of research evaluation systems, (pp. 101-123). Dordrecht: Springer.
  • Godin, B. (2006). The linear model of innovation: The historical construction of an analytical framework. Science, Technology, & Human Values, 31(6), 639-667.
  • Gornitzka, Å., Stensaker, B., Smeby, J.-C., & De Boer, H. (2004). Contract arrangements in the Nordic countries—solving the efficiency/effectiveness dilemma? Higher Education in Europe, 29(1), 87-101.
  • Gulbrandsen, M., & Slipersaeter, S. (2007). The third mission and the entrepreneurial university model. In A. Bonaccorsi & C. Daraio (Eds.), Universities and strategic knowledge creation: Specialization and Performance in Europe (pp. 112-143). Cheltenham: Edward Elgar Publishing.
  • Hammarfelt, B., & de Rijcke, S. (2015). Accountability in context: Effects of research evaluation systems on publication practices, disciplinary norms, and individual working routines in the Faculty of Arts at Uppsala University. Research Evaluation, 24(1), 63-77.
  • Hammarfelt, B., & Rushforth, A. D. (2017). Indicators as judgment devices: An empirical study of citizen bibliometrics in research evaluation. Research Evaluation, 26(3), 169-180.
  • Hammarfelt, B., Nelhans, G., Eklund, P., & Åström, F. (2016). The heterogeneous landscape of bibliometric indicators: Evaluating models for allocating resources at Swedish universities. Research Evaluation, 25(3), 292-305.
  • Hansen, H. F., Geschwind, L., Kivistö, J., Pekkola, E., Pinheiro, R., & Pulkkinen, K. (2019). Balancing accountability and trust: University reforms in the Nordic countries. Higher Education, 78(4), 557-573.
  • Harvey, L. (2008). Rankings of higher education institutions: A critical review. Quality in Higher Education, 14(3), 187-207.
  • Hedges, L. V. (1987). How hard is hard science, how soft is soft science? The empirical cumulativeness of research. American Psychologist, 42(5), 443-455.
  • Henkel, M. (2000). Academic identities and policy change in higher education. London: Jessica Kingsley Publishers.
  • Hicks, D. (2004). The four literatures of social science. In H. F. Moed, W. Glänzel, & U. Schmoch (Eds.), Handbook of Quantitative Science and Technology Research (pp. 473-496). Dordrecht: Springer.
  • Hood, C. (1991). A public management for all seasons? Public Administration, 69(1), 3-19.
  • Kallio, K. M., & Kallio, T. J. (2014). Management-by-results and performance measurement in universities–implications for work motivation. Studies in Higher Education, 39(4), 574-589.
  • Kallio, K. M., Kallio, T. J., Tienari, J., & Hyvönen, T. (2016). Ethos at stake: Performance management and academic work in universities. Human Relations, 69(3), 685-709.
  • Kuhn, T. (1962) [Reprinted 2012]. The structure of scientific revolutions. Chicago: The University of Chicago Press.
  • Mingers, J., & Willmott, H. (2013). Taylorizing business school research: On the ‘one best way’ performative effects of journal ranking lists. Human Relations, 66(8), 1051-1073.
  • Neumann, R. (2001). Disciplinary differences and university teaching. Studies in Higher Education, 26(2), 135-146.
  • OECD. (2007). Revised field of science and technology (FOS) classification in the Frascati manual. Accessed (June 1, 2020): https://www.oecd.org/science/inno/38235147.pdf
  • Osterloh, M. (2010). Governance by numbers. Does it really work in research? Analyse & Kritik, 32(2), 267-283.
  • Paradeise, C., Reale, E., Bleiklie, I., & Ferlie, E. (Eds.). (2009). University governance: Western European comparative perspectives. Dordrecht: Springer.
  • Pinheiro, R., Geschwind, L., & Aarrevaara, T. (Eds.). (2016). Mergers in higher education: The experience from Northern Europe. Cham: Springer.
  • Pinheiro, R., Geschwind, L., Hansen, H. F., & Pulkkinen, K. (Eds.). (2019). Reforms, organizational change and performance in higher education: A comparative account from the Nordic Countries. Cham: Palgrave Macmillan.
  • Piwowar, H. (2013). Altmetrics: Value all research products. Nature, 493(7431), 159.
  • Pollitt, C., & Bouckaert, G. (2004). Public management reform: A comparative analysis. Oxford: Oxford University Press.
  • Porter, T. M. (1995). Trust in numbers: The pursuit of objectivity in science and public life. Princeton: Princeton University Press.
  • Reale, E., & Seeber, M. (2011). Organisation response to institutional pressures in Higher Education: The important role of the disciplines. Higher Education, 61(1), 1-22.
  • Slaughter, S., & Leslie, L. L. (1997). Academic capitalism: Politics, policies, and the entrepreneurial university. Baltimore: The Johns Hopkins University Press.
  • Smith, L. D., Best, L. A., Stubbs, D. A., Johnston, J., & Archibald, A. B. (2000). Scientific graphs and the hierarchy of the sciences: A Latourian survey of inscription practices. Social Studies of Science, 30(1), 73-94.
  • Spooren, P., Brockx, B., & Mortelmans, D. (2013). On the validity of student evaluation of teaching: The state of the art. Review of Educational Research, 83(4), 598-642.
  • Stichweh, R. (2009). Differentiation of scientific disciplines: Causes and consequences. In G. H. Hadorn (Ed.), Unity of knowledge in transdisciplinary research for sustainable development, Volume 1 (pp. 82-90). Oxford: EOLSS Publishers/UNESCO.
  • Stoecker, J. L. (1993). The Biglan classification revisited. Research in Higher Education, 34(4), 451-464.
  • Stokes, D. E. (1997). Pasteur's Quadrant: Basic science and technological innovation. Washington, DC: Brookings Institution Press.
  • Sugimoto, C. R., & Weingart, S. (2015). The kaleidoscope of disciplinarity. Journal of Documentation, 71(4), 775-794.
  • Söderlind, J., & Geschwind, L. (2019). Making sense of academic work: The influence of performance measurement in Swedish universities. Policy Reviews in Higher Education, 3(1), 75-93.
  • Trowler, P. (2014). Depicting and researching disciplines: Strong and moderate essentialist approaches. Studies in Higher Education, 39(10), 1720-1731.
  • van Raan, A. F. (2005). Fatal attraction: Conceptual and methodological problems in the ranking of universities by bibliometric methods. Scientometrics, 62(1), 133-143.
  • van Vught, F., & Westerheijden, D. F. (2010). Multidimensional ranking. Higher Education Management and Policy, 22(3), 1-26.
  • Whitley, R. (2007). Changing governance of the public sciences: The consequences of establishing research evaluation systems for knowledge production in different countries and scientific fields. In R. Whitley & J. Gläser (Eds.), The changing governance of the sciences: The advent of research evaluation systems (pp. 3-27). Dordrecht: Springer.
There are 65 citations in total.

Details

Primary Language English
Subjects Other Fields of Education
Journal Section Research Articles
Authors

Johan Söderlind 0000-0002-4280-9866

Lars Geschwind 0000-0003-2983-5573

Project Number 237782 - FSK15-1059:1
Publication Date June 30, 2020
Published in Issue Year 2020 Volume: 1 Issue: 1

Cite

APA Söderlind, J., & Geschwind, L. (2020). Disciplinary differences in academics' perceptions of performance measurement at Nordic universities. Higher Education Governance and Policy, 1(1), 18-31.