Research Article

Consequences of Ignoring a Level of Nesting on Design and Analysis of Blocked Three-level Regression Discontinuity Designs: Power and Type I Error Rates

Year 2022, Volume: 12 Issue: 1, 42 - 55, 30.06.2022
https://doi.org/10.17984/adyuebd.1068923

Abstract

Multilevel regression discontinuity designs have been increasingly used in education research to evaluate the effectiveness of policies and programs. In a three-level data structure (students nested within classrooms/teachers, which are in turn nested within schools), it is common to ignore a level of nesting, whether unwittingly during data analysis or because of resource constraints during the planning phase. This study investigates the consequences of ignoring the intermediate or the top level in blocked three-level regression discontinuity designs (BIRD3; treatment is at level 1), both during data analysis and during planning. Monte Carlo simulation results indicated that ignoring a level during analysis did not affect the accuracy of treatment effect estimates; however, it did affect their precision (standard errors, power, and Type I error rates). Ignoring the intermediate level did not cause a significant problem: power rates were slightly underestimated, whereas Type I error rates remained stable. In contrast, ignoring the top level resulted in overestimated power rates, but the accompanying severe inflation in Type I error rates rendered this strategy ineffective. As for the design phase, when the intermediate level is ignored, it is viable to use parameters from a two-level blocked regression discontinuity model (BIRD2) to plan a BIRD3 design; however, the level-2 parameters from the BIRD2 model should be substituted for the level-3 parameters in the BIRD3 design. When the top level is ignored, using parameters from the BIRD2 model to plan a BIRD3 design should be avoided.
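The mechanism behind the Type I error inflation reported above can be illustrated with a small Monte Carlo sketch. This is not the authors' simulation code: it is a simplified two-level illustration with hypothetical parameter values, in which school-level variation in the treatment effect exists (variance `tau2`) but the analysis pools all students and fits a single OLS regression that ignores the school level, so the naive standard errors understate the uncertainty in the average treatment effect.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

def rejection_rate(n_schools=20, n_students=100, icc=0.2, tau2=0.2,
                   n_reps=500, alpha=0.05):
    """Share of replications in which pooled OLS (ignoring the school
    level) rejects a treatment effect that is truly zero on average."""
    n = n_schools * n_students
    school = np.repeat(np.arange(n_schools), n_students)
    rejections = 0
    for _ in range(n_reps):
        u = rng.normal(0.0, np.sqrt(icc), n_schools)    # school intercepts
        w = rng.normal(0.0, np.sqrt(tau2), n_schools)   # school-specific treatment effects, mean zero under H0
        run = rng.normal(0.0, 1.0, n)                   # level-1 running (assignment) variable
        treat = (run >= 0.0).astype(float)              # sharp cutoff at 0
        e = rng.normal(0.0, np.sqrt(1.0 - icc), n)      # student-level residuals
        y = w[school] * treat + 0.5 * run + u[school] + e
        # Pooled OLS of y on [intercept, treatment, running variable],
        # with naive (i.i.d.) standard errors
        X = np.column_stack([np.ones(n), treat, run])
        beta = np.linalg.lstsq(X, y, rcond=None)[0]
        resid = y - X @ beta
        df = n - X.shape[1]
        sigma2 = resid @ resid / df
        se = np.sqrt(sigma2 * np.linalg.inv(X.T @ X)[1, 1])
        p_value = 2.0 * stats.t.sf(abs(beta[1] / se), df)
        rejections += p_value < alpha
    return rejections / n_reps
```

In this sketch, setting `tau2 > 0` pushes the empirical rejection rate well above the nominal 5%, whereas with `tau2 = 0` (no school-level treatment effect variation) the pooled analysis stays near the nominal level, since the individual-level treatment indicator varies within schools.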

References

  • Balu, R., Zhu, P., Doolittle, F., Schiller, E., Jenkins, J., & Gersten, R. (2015). Evaluation of response to intervention practices for elementary school reading (NCEE 2016-4000). Washington, DC: National Center for Education Evaluation and Regional Assistance, Institute of Education Sciences, U.S. Department of Education. https://files.eric.ed.gov/fulltext/ED560820.pdf
  • Bickel, R. (2007). Multilevel analysis for applied research: It's just regression! Guilford Press.
  • Bulus, M. (2022). Minimum detectable effect size computations for cluster-level regression discontinuity: Specifications beyond the linear functional form. Journal of Research on Educational Effectiveness, 15(1), 151-177. https://doi.org/10.1080/19345747.2021.1947425
  • Bulus, M., & Dong, N. (2021a). Bound constrained optimization of sample sizes subject to monetary restrictions in planning of multilevel randomized trials and regression discontinuity studies. The Journal of Experimental Education, 89(2), 379-401. https://doi.org/10.1080/00220973.2019.1636197
  • Bulus, M., & Dong, N. (2021b). cosa: Bound constrained optimal sample size allocation. R package version 2.1.0. https://CRAN.R-project.org/package=cosa
  • Bulus, M., & Dong, N. (2022). Minimum detectable effect size computations for blocked individual-level regression discontinuity: Specifications beyond the linear functional form. Manuscript in preparation.
  • Cortes, K. E., Goodman, J. S., & Nomi, T. (2015). Intensive math instruction and educational attainment: Long-run impacts of double-dose algebra. Journal of Human Resources, 50(1), 108-158. https://doi.org/10.3368/jhr.50.1.108
  • Deke, J., Dragoset, L., Bogen, K., & Gill, B. (2012). Impacts of Title I Supplemental Educational Services on student achievement. NCEE 2012-4053. National Center for Education Evaluation and Regional Assistance. https://ies.ed.gov/ncee/pubs/20124053/pdf/20124053.pdf
  • Dong, N., & Maynard, R. (2013). PowerUp!: A tool for calculating minimum detectable effect sizes and minimum required sample sizes for experimental and quasi-experimental design studies. Journal of Research on Educational Effectiveness, 6(1), 24-67. https://doi.org/10.1080/19345747.2012.673143
  • Finch, W. H., & Bolin, J. E. (2017). Multilevel modeling using Mplus. CRC Press.
  • Fox, J. (1997). Applied regression analysis, linear models, and related methods. Sage Publications.
  • Goldstein, H. (2011). Multilevel statistical models (4th ed.). John Wiley & Sons.
  • Harrington, J. R., Muñoz, J., Curs, B. R., & Ehlert, M. (2016). Examining the impact of a highly targeted state-administered merit aid program on brain drain: Evidence from a regression discontinuity analysis of Missouri's Bright Flight program. Research in Higher Education, 57(4), 423-447. https://doi.org/10.1007/s11162-015-9392-9
  • Hox, J. J. (2010). Multilevel analysis: Techniques and applications (2nd ed.). Routledge.
  • Hustedt, J. T., Jung, K., Barnett, W. S., & Williams, T. (2015). Kindergarten readiness impacts of the Arkansas Better Chance State Prekindergarten Initiative. The Elementary School Journal, 116(2), 198-216. https://doi.org/10.1086/684105
  • Jenkins, J. M., Farkas, G., Duncan, G. J., Burchinal, M., & Vandell, D. L. (2016). Head Start at ages 3 and 4 versus Head Start followed by state Pre-K: Which is more effective? Educational Evaluation and Policy Analysis, 38(1), 88-112. https://doi.org/10.3102/0162373715587965
  • Klerman, J. A., Olsho, L. E., & Bartlett, S. (2015). Regression discontinuity in prospective evaluations: The case of the FFVP evaluation. American Journal of Evaluation, 36(3), 403-416. https://doi.org/10.1177/1098214014553786
  • Konstantopoulos, S., & Shen, T. (2016). Class size effects on mathematics achievement in Cyprus: Evidence from TIMSS. Educational Research and Evaluation, 22(1-2), 86-109. https://doi.org/10.1080/13803611.2016.1193030
  • Konu, A. I., Lintonen, T. P., & Autio, V. J. (2002). Evaluation of well-being in schools–a multilevel analysis of general subjective well-being. School Effectiveness and School Improvement, 13(2), 187-200. https://doi.org/10.1076/sesi.13.2.187.3432
  • Leeds, D. M., McFarlin, I., & Daugherty, L. (2017). Does student effort respond to incentives? Evidence from a guaranteed college admissions program. Research in Higher Education, 58(3), 231-243. https://doi.org/10.1007/s11162-016-9427-x
  • Ludwig, J., & Miller, D. L. (2005). Does Head Start improve children's life chances? Evidence from a regression discontinuity design (No. w11702). National Bureau of Economic Research. https://www.nber.org/system/files/working_papers/w11702/w11702.pdf
  • Luyten, H. (2006). An empirical assessment of the absolute effect of schooling: Regression‐discontinuity applied to TIMSS‐95. Oxford Review of Education, 32(3), 397-429. https://doi.org/10.1080/03054980600776589
  • Luyten, H., Peschar, J., & Coe, R. (2008). Effects of schooling on reading performance, reading engagement, and reading activities of 15-year-olds in England. American Educational Research Journal, 45(2), 319-342. https://doi.org/10.3102/0002831207313345
  • Manatunga, A. K., Hudgens, M. G., & Chen, S. (2001). Sample size estimation in cluster randomized studies with varying cluster size. Biometrical Journal, 43(1), 75-86. https://doi.org/10.1002/1521-4036(200102)43:1<75::AID-BIMJ75>3.0.CO;2-N
  • Matsudaira, J. D. (2008). Mandatory summer school and student achievement. Journal of Econometrics, 142(2), 829-850. https://doi.org/10.1016/j.jeconom.2007.05.015
  • May, H., Sirinides, P. M., Gray, A., & Goldsworthy, H. (2016). Reading recovery: An evaluation of the four-year i3 scale-up. Philadelphia: Consortium for Policy Research in Education. https://files.eric.ed.gov/fulltext/ED593261.pdf
  • Moerbeek, M. (2004). The consequence of ignoring a level of nesting in multilevel analysis. Multivariate Behavioral Research, 39(1), 129-149. https://doi.org/10.1207/s15327906mbr3901_5
  • Muthén, B. O. (1991). Multilevel factor analysis of class and student achievement components. Journal of Educational Measurement, 28(4), 338-354. http://www.jstor.org/stable/1434897
  • Nye, B., Konstantopoulos, S., & Hedges, L. V. (2004). How large are teacher effects? Educational Evaluation and Policy Analysis, 26(3), 237-257. https://doi.org/10.3102/01623737026003237
  • Opdenakker, M. C., & Van Damme, J. (2000). The importance of identifying levels in multilevel analysis: An illustration of the effects of ignoring the top or intermediate levels in school effectiveness research. School Effectiveness and School Improvement, 11(1), 103-130. https://doi.org/10.1076/0924-3453(200003)11:1;1-A;FT103
  • Raudenbush, S., & Bryk, A. S. (1986). A hierarchical model for studying school effects. Sociology of Education, 59(1), 1-17. https://doi.org/10.2307/2112482
  • Raudenbush, S. W., & Bryk, A. S. (2002). Hierarchical linear models: Applications and data analysis methods (2nd ed.). Sage Publications.
  • Reardon, S. F., Arshan, N., Atteberry, A., & Kurlaender, M. (2010). Effects of failing a high school exit exam on course-taking, achievement, persistence, and graduation. Educational Evaluation and Policy Analysis, 32(4), 498-520. https://doi.org/10.3102/0162373710382655
  • Schochet, P. Z. (2008). Technical methods report: Statistical power for regression discontinuity designs in education evaluations (NCEE 2008-4026). National Center for Education Evaluation and Regional Assistance, Institute of Education Sciences, U.S. Department of Education. https://files.eric.ed.gov/fulltext/ED511782.pdf
  • Schochet, P. Z. (2009). Statistical power for regression discontinuity designs in education evaluations. Journal of Educational and Behavioral Statistics, 34(2), 238-266. https://doi.org/10.3102/1076998609332748
  • Singer, J. (1987). An intraclass correlation for analyzing multilevel data. Journal of Experimental Education, 55(4), 219-228. https://doi.org/10.1080/00220973.1987.10806457
  • Snijders, T. A., & Bosker, R. J. (2011). Multilevel analysis: An introduction to basic and advanced multilevel modeling (2nd ed.). Sage Publications.
  • Van den Noortgate, W., Opdenakker, M. C., & Onghena, P. (2005). The effects of ignoring a level in multilevel analysis. School Effectiveness and School Improvement, 16(3), 281-303. https://doi.org/10.1080/09243450500114850
  • Wong, V. C., Cook, T. D., Barnett, W. S., & Jung, K. (2008). An effectiveness‐based evaluation of five state pre‐kindergarten programs. Journal of Policy Analysis and Management, 27(1), 122-154. https://doi.org/10.1002/pam.20310
  • Zhu, P., Jacob, R., Bloom, H., & Xu, Z. (2011). Designing and analyzing studies that randomize schools to estimate intervention effects on student academic outcomes without classroom-level information. Educational Evaluation and Policy Analysis, 34(1), 45-68. https://doi.org/10.3102/0162373711423786
There are 40 citations in total.

Details

Primary Language English
Journal Section Research Articles
Authors

Metin Bulus 0000-0003-4348-6322

Nianbo Dong 0000-0003-0128-7106

Publication Date June 30, 2022
Acceptance Date June 4, 2022
Published in Issue Year 2022 Volume: 12 Issue: 1

Cite

APA Bulus, M., & Dong, N. (2022). Consequences of Ignoring a Level of Nesting on Design and Analysis of Blocked Three-level Regression Discontinuity Designs: Power and Type I Error Rates. Adıyaman University Journal of Educational Sciences, 12(1), 42-55. https://doi.org/10.17984/adyuebd.1068923

This work is licensed under CC BY-NC-ND 4.0.