Research Article

MONITORING AND EVALUATION IN A CHAOTIC AND COMPLEX GOVERNMENT INTERVENTIONS’ ENVIRONMENT

Year 2020, Volume: 12 Issue: 1, 1 - 17, 19.02.2020

Abstract

This paper analyses the factors that influence Monitoring and Evaluation (M&E) of government interventions in a chaotic and complex organisational environment. The central argument is that many factors from the natural and man-made (intellectual) environment affect the evaluation of government interventions. However, despite approximately three decades of M&E studies, no known study has focused on the effects of the different factors that influence the M&E of government interventions. The objective of this paper is to critically analyse the effects of different environmental factors on the M&E of government interventions and, in doing so, to close this gap in the current literature. The research is qualitative and is based on a robust review of the existing literature on M&E, the theory of change, and chaos and complexity theory. It followed an interpretive, social constructivist paradigm, which starts from the assumption that M&E experts, scholars and practitioners construct meaning of their world and, in making sense of that world, are influenced by their historical, economic, social and cultural backgrounds. This paradigm resonates well with the research’s central objective of identifying, explaining and interpreting the environmental factors that influence M&E. The main finding is that many natural and intellectual (man-made) environmental factors affect M&E. The change caused by these factors is chaotic, complex and unpredictable, and their effects on the M&E of government interventions are inevitable because organisations are open systems. An M&E endeavour that ignores the effects of natural and intellectual (man-made) environmental factors cannot produce accurate information and valid recommendations. M&E scholars, professionals and practitioners should therefore take into account the environmental context in which M&E is done in order to produce more accurate M&E results and valid recommendations.

References

  • Auriacombe, C.J. and Ackron, J. (2015). PLG 3A Section A: Semester 1 Integrated Development Planning and LED Internal Handbook. Johannesburg: University of Johannesburg.
  • Baggio, R. and Sainaghi, R. (2011). Complex and chaotic tourism systems: towards a quantitative approach. International Journal of Contemporary Hospitality Management, 1(1), 1-19.
  • Basu, A. (2017). What is the Real Meaning of “Butterfly Effect” in 13 Reasons Why? https://www.quora.com/What-is-the-real-meaning-of-butterfly-effect-in-13-reasons-why.
  • Bergoeing, R., Loayza, N. and Piguillem, F. (2015). The whole is greater than the sum of its parts: Complementary reforms to address microeconomic distortions. The World Bank Economic Review, 30(2), 268-305.
  • Bernhardt, Y. (2018). UJ Internal Handbook (APK & Sow) PMG3A1, PMG3AA3 & PGM3A11: Programme: Public Management and Governance (Third Year). Johannesburg: University of Johannesburg.
  • Bhikhoo, A. and Louw-Potgieter, J. (2014). Case: Do managers use evaluation reports? A case study of a process evaluation for a grant-making organisation. In Cloete, F., Rabie, B. and De Coning, C. (Eds.). Evaluation Management in South Africa and Africa. Stellenbosch: Sun Press.
  • Bloom, B., Englehart, M., Furst, E., Hill, W. and Krathwohl, D. (1956). Taxonomy of Educational Objectives: The Classification of Educational Goals. Handbook I: Cognitive Domain. New York and Toronto: Longmans Green.
  • Cloete, F. (2006). Chaos and quantum complexity approaches to public management: Insights from the new sciences. Administratio Publica, 14(1), 45-83.
  • Cloete, F. (2017). Module 2 (HEQF level 9): Policy evaluation theories, models, systems and processes, University of Johannesburg, Johannesburg.
  • Cronjé, F. (2014). A Time Traveller’s Guide: Our Next Ten Years. Cape Town: Tafelberg.
  • Evan, W.E. (1993). Organization Theory: Research and Design. New York: Macmillan.
  • Ho, S.Y. (2003). Evaluating Urban Policy, Ideology, Conflict and Compromise. Surrey: Ashgate.
  • Huitt, W. (2011). Bloom et al.’s Taxonomy of the Cognitive Domain. http://www.edpsycinteractive.org/topics/cogsys/bloom.html.
  • Kayuni, H. (2010). Chaos-complexity theory and education policy: Lessons from Malawi’s community day secondary schools. Bulgarian Journal of Science and Education Policy (BJSEP), 4(1), 1-31.
  • Kusek, J. and Rist, R. (2004). Ten Steps to a Results-Based Monitoring and Evaluation System. Washington, D.C.: The World Bank.
  • Lomofsky, D. (2016). Theory of Change: A Way of Thinking, a Way of Doing and a Way of Learning. http://www.samea.org.za/index.php?module=Pagesetter&type=file&func=get&tid=17&fid=file1&pid=37.
  • Morgan, T. and McMahon, C. (2017). Constructivism and Complexity: A Philosophical Basis for Experiential Learning Models in Engineering Design Education? https://www.itas.kit.edu/downloads/veranstaltung_2017_philosophy_of_models_morgan.pdf.
  • Muthan, V.M. (2015). Using Chaos and Complexity Theory to Design Robust Leadership Architecture for South African Technology Businesses. Master’s Thesis. University of the Witwatersrand, Johannesburg.
  • Nalubega, T. and Uwizeyimana, D.E. (2019). Public sector monitoring and evaluation in the Fourth Industrial Revolution: Implications for Africa. Africa’s Public Service Delivery and Performance Review, 7(1), 1-12, a318. https://doi.org/10.4102/apsdpr.v7i1.318.
  • Oehmen, J., Thuesen, C., Ruiz, P. and Geraldi, J. (2015). Complexity Management for Projects, Programmes, and Portfolios: An Engineering Systems Perspective. http://orbit.dtu.dk/files/108586258/Complexity_Management.pdf.
  • Organisation for Economic Co-operation and Development (OECD). (2002). Evaluation and Aid Effectiveness: Glossary of Key Terms in Evaluation and Results-Based Management. Paris: OECD.
  • Owen, J.M. (2006). Program Evaluation: Forms and Approaches. (3rd ed.). New York: The Guilford Press.
  • Oxford English Dictionary. (2018). Definition of ‘Normal’ in English. https://en.oxforddictionaries.com/definition/normal.
  • Public Service Commission. (2008). Basic Concepts in Monitoring and Evaluation. http://www.psc.gov.za/documents/docs/guidelines/PSC%206%20in%20one.pdf.
  • Rabie, B. and Cloete, F. (2009). A new typology of monitoring and evaluation approaches. Administratio Publica, 17(3), 76-97.
  • Rabie, B. and Goldman, I. (2014). The context of evaluation management. In Cloete, F., Rabie, B. and De Coning, C. (Eds.). Evaluation Management in South Africa and Africa. Stellenbosch: Sun Press.
  • Rickles, D., Hawe, P. and Shiell, A. (2007). A simple guide to chaos and complexity. Journal of Epidemiology and Community Health, 61(11), 933-937.
  • Rossi, P.H., Lipsey, M.W. and Freeman, H.E. (2004). Evaluation: A Systematic Approach. (7th ed.). London: Sage Publications.
  • Salandy, A. (2018). Building Blocks for a Useful Programme Monitoring and Evaluation System. https://www.uniteforsight.org/conference/ppt-2018/Anthony%20Salandy.pdf.
  • Saunders, R. (2015). Implementation Monitoring and Process Evaluation. Singapore: Sage Publications.
  • Schneider, M. and Somers, M. (2006). Organizations as complex adaptive systems: Implications of complexity theory for leadership research. The Leadership Quarterly, 17(1), 351-365.
  • Scriven, M. (2003). Michael Scriven on the Difference Between Evaluation and Social Science Research. https://sparkaction.org/content/michael-scriven-differences-between-evaluation-and-social-science-research.
  • Thornhill, C. (2016). Quantum physics, cosmology and public administration: Compatibility in theory construction? Administratio Publica, 24(1), 45-58.
  • University of Kansas and UARK.EDU. (n.d.). Using Bloom’s Taxonomy to Write Effective Learning Objectives. https://tips.uark.edu/using-blooms-taxonomy/.
  • Uwizeyimana, D.E. (2018). Progress made towards achieving Rwanda’s Vision 2020: Key indicators’ targets. International Journal of Management Practice, 12(1), 4-46.
  • Uwizeyimana, D.E. (2019). The logframe as a monitoring and evaluation tool for government interventions in a chaotic and complex government interventions’ environment. Africa’s Public Service Delivery and Performance Review (forthcoming issue).
  • Woodrow, P. and Oatley, N. (2013). Change in Conflict, Security & Justice Programmes: Part I: What They Are, Different Types, How to Develop and Use Them. London: Department of International Development, UK Aid from the British People.
There are 36 references in total.

Details

Primary Language English
Subjects Economics
Journal Section Research Article
Authors

Dominique E. Uwizeyimana

Publication Date February 19, 2020
Published in Issue Year 2020 Volume: 12 Issue: 1

Cite

APA Uwizeyimana, D. E. (2020). MONITORING AND EVALUATION IN A CHAOTIC AND COMPLEX GOVERNMENT INTERVENTIONS’ ENVIRONMENT. International Journal of Business and Management Studies, 12(1), 1-17.
AMA Uwizeyimana DE. MONITORING AND EVALUATION IN A CHAOTIC AND COMPLEX GOVERNMENT INTERVENTIONS’ ENVIRONMENT. IJBMS. February 2020;12(1):1-17.
Chicago Uwizeyimana, Dominique E. “MONITORING AND EVALUATION IN A CHAOTIC AND COMPLEX GOVERNMENT INTERVENTIONS’ ENVIRONMENT”. International Journal of Business and Management Studies 12, no. 1 (February 2020): 1-17.
EndNote Uwizeyimana DE (February 1, 2020) MONITORING AND EVALUATION IN A CHAOTIC AND COMPLEX GOVERNMENT INTERVENTIONS’ ENVIRONMENT. International Journal of Business and Management Studies 12 1 1–17.
IEEE D. E. Uwizeyimana, “MONITORING AND EVALUATION IN A CHAOTIC AND COMPLEX GOVERNMENT INTERVENTIONS’ ENVIRONMENT”, IJBMS, vol. 12, no. 1, pp. 1–17, 2020.
ISNAD Uwizeyimana, Dominique E. “MONITORING AND EVALUATION IN A CHAOTIC AND COMPLEX GOVERNMENT INTERVENTIONS’ ENVIRONMENT”. International Journal of Business and Management Studies 12/1 (February 2020), 1-17.
JAMA Uwizeyimana DE. MONITORING AND EVALUATION IN A CHAOTIC AND COMPLEX GOVERNMENT INTERVENTIONS’ ENVIRONMENT. IJBMS. 2020;12:1–17.
MLA Uwizeyimana, Dominique E. “MONITORING AND EVALUATION IN A CHAOTIC AND COMPLEX GOVERNMENT INTERVENTIONS’ ENVIRONMENT”. International Journal of Business and Management Studies, vol. 12, no. 1, 2020, pp. 1-17.
Vancouver Uwizeyimana DE. MONITORING AND EVALUATION IN A CHAOTIC AND COMPLEX GOVERNMENT INTERVENTIONS’ ENVIRONMENT. IJBMS. 2020;12(1):1-17.