Research Article

Item Wording Effects in Psychological Measures: Do Early Literacy Skills Matter?

Year 2021, Volume: 12, Issue: 3, 239-253, 29.09.2021
https://doi.org/10.21031/epod.944067

Abstract

Although including both positively and negatively worded items in a scale is common practice, mixing the two wording directions may threaten a scale's validity. Several studies have investigated the effects of item wording methods. The current study examined item wording effects on the responses of 4028 Turkish fifth-grade students who completed the Students Confidence in Mathematics (SCM) and Students Confidence in Science (SCS) scales. The role of early literacy-related variables (i.e., early literacy activities undertaken before primary school, student performance on reading literacy tasks upon entering primary school, and the duration of children's pre-primary school attendance) in item wording effects was also examined. The analyses were conducted using confirmatory factor analysis and the correlated trait–correlated method minus one, CFA-CTC(M-1), model derived from the correlated traits–correlated methods framework. The results indicate that significant item wording effects existed in both scales. Moreover, a significant positive effect of early literacy activities undertaken before school was found for both scales, whereas no effects were found for student performance on reading literacy tasks upon entering primary school or for the duration of children's pre-primary school attendance. Overall, the study suggests that researchers and practitioners should consider potential wording effects when including both positively and negatively worded items in scales, especially scales designed for younger students.
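The abstract describes the modeling approach only in general terms. As a concrete illustration, the sketch below shows how a CTC(M-1)-style wording-effect model for a single confidence scale could be specified in R with lavaan. The item names (pos1–pos4, neg1–neg5), the data frame scm_items, and the estimator choice are hypothetical placeholders, not the study's actual specification; the paper cites both Mplus and R, and its exact model syntax is not reproduced here.

library(lavaan)

# A CTC(M-1)-style model for one trait measured with two wording methods:
# the positively worded items serve as the reference method, and a separate
# method factor captures shared variance among the negatively worded items.
ctcm1_model <- '
  # Trait factor: confidence, indicated by all items
  confidence =~ pos1 + pos2 + pos3 + pos4 + neg1 + neg2 + neg3 + neg4 + neg5

  # Method (wording) factor for the non-reference method only
  wording    =~ neg1 + neg2 + neg3 + neg4 + neg5

  # CTC(M-1) restriction: method factor uncorrelated with the trait factor
  confidence ~~ 0*wording
'

# scm_items is a hypothetical data frame holding the nine item responses;
# Likert-type items could instead be declared ordered and fitted with WLSMV.
fit <- cfa(ctcm1_model, data = scm_items, estimator = "MLR", std.lv = TRUE)

summary(fit, fit.measures = TRUE, standardized = TRUE)
fitMeasures(fit, c("cfi", "tli", "rmsea", "srmr"))

Under these assumptions, substantial variance captured by the wording factor would indicate an item wording effect of the kind reported above, and the wording factor could then be regressed on early-literacy predictors (e.g., wording ~ activities) in an extended structural model.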

References

  • Barnette, J. (2000). Effects of stem and Likert response option reversals on survey internal consistency: If you feel the need, there is a better alternative to using those negatively-worded stems. Educational and Psychological Measurement, 60, 361-370. doi:10.1177/00131640021970592
  • Bolt, D., Wang, Y. C., Meyer, R. H., & Pier, L. (2020). An IRT mixture model for rating scale confusion associated with negatively worded items in measures of social-emotional learning. Applied Measurement in Education, 33(4), 331–348. doi:10.1080/08957347.2020.1789140
  • Bornkessel, I., & Schlesewsky, M. (2006). The extended argument dependency model: A neurocognitive approach to sentence comprehension across languages. Psychological Review, 113(4), 787. doi:10.1037/0033-295X.113.4.787
  • Boyce, L. K., Innocenti, M. S., Roggman, L. A., Norman, V. K., & Ortiz, E. (2010). Telling stories and making books: Evidence for an intervention to help parents in migrant Head Start families support their children’s language and literacy. Early Education and Development, 21(3), 343–371. doi:10.1080/10409281003631142
  • Candan, A., Küntay, A. C., Yeh, Y. C., Cheung, H., Wagner, L., & Naigles, L. R. (2012). Language and age effects in children's processing of word order. Cognitive Development, 27(3), 205-221. doi:10.1016/j.cogdev.2011.12.001
  • Caponera, E., Sestito, P., & Russo, P. M. (2016). The influence of reading literacy on mathematics and science achievement. The Journal of Educational Research, 109(2), 197-204. doi:10.1080/00220671.2014.936998
  • Castro-Schilo, L., Grimm, K. J., & Widaman, K. F. (2016). Augmenting the Correlated Trait–Correlated Method Model for Multitrait–Multimethod Data. Structural Equation Modeling: A Multidisciplinary Journal, 23(6), 798-818. doi:10.1080/10705511.2016.1214919
  • Cole, K. L., Turner, R. C., & Gitchel, W. D. (2019). A study of polytomous IRT methods and item wording directionality effects on perceived stress items. Personality and Individual Differences, 147, 63–72. doi:10.1016/j.paid.2019.03.046
  • Cronbach, L. J. (1950). Further evidence on response sets and test design. Educational and Psychological Measurement, 10, 3–31. doi:10.1177/001316445001000101
  • DiStefano, C., & Motl, R. W. (2006). Further investigating method effects associated with negatively worded items on self-report surveys. Structural Equation Modeling, 13(3), 440-464. doi:10.1207/s15328007sem1303_6
  • Dodeen, H. (2015). The effects of positively and negatively worded items on the factor structure of the UCLA loneliness scale. Journal of Psychoeducational Assessment, 33(3), 259-267. doi:10.1177/0734282914548325
  • Eid, M. (2000). A multitrait-multimethod model with minimal assumptions. Psychometrika, 65, 241-261. doi:10.1007/BF02294377
  • Fan, Y., & Lance, C. E. (2017). A reformulated correlated trait–correlated method model for multitrait–multimethod data effectively increases convergence and admissibility rates. Educational and Psychological Measurement, 77(6), 1048-1063. doi:10.1177/0013164416677144
  • Gnambs, T., & Schroeders, U. (2020). Cognitive abilities explain wording effects in the Rosenberg Self-Esteem Scale. Assessment, 27(2), 404–418. doi:10.1177/1073191117746503
  • Gustafsson, J.-E., Hansen, K. Y., & Rosén, M. (2013). Effects of home background on student achievement in reading, mathematics, and science at the fourth grade. In M. O. Martin & I. V. S. Mullis (Eds.), TIMSS and PIRLS 2011: Relationships among reading, mathematics, and science achievement at the fourth grade–Implications for early learning (pp. 181-287). Chestnut Hill, MA: TIMSS & PIRLS International Study Center, Lynch School of Education, Boston College.
  • Hu, L., & Bentler, P. M. (1999). Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives. Structural Equation Modeling, 6, 1–55. doi:10.1080/10705519909540118
  • Huang, C. (2017). Cross-informant agreement on the Child Behavior Checklist for youths: A meta-analysis. Psychological Reports, 120(6), 1096-1116. doi:10.1177/0033294117717733
  • Kam, C. C. S. (2018). Novel insights into item keying/valence effect using latent difference (LD) modeling analysis. Journal of Personality Assessment, 100(4), 389-397. doi:10.1080/00223891.2017.1369095
  • Kam, C. C. S., & Meyer, J. P. (2015). How careless responding and acquiescence response bias can influence construct dimensionality: The case of job satisfaction. Organizational Research Methods, 18(3), 512–541. doi:10.1177/1094428115571894
  • Lonigan, C. J., Burgess, S. R., & Anthony, J. L. (2000). Development of emergent literacy and early reading skills in preschool children: Evidence from a latent-variable longitudinal study. Developmental Psychology, 36(5), 596. doi:10.1037/0012-1649.36.5.596
  • Marsh, H. W. (1989). Confirmatory factor analyses of multitrait-multimethod data: Many problems and a few solutions. Applied Psychological Measurement, 13, 335-361.
  • Marsh, H. W. (1996). Positive and negative global self-esteem: A substantively meaningful distinction or artifactors? Journal of Personality and Social Psychology, 70, 810–819. doi:10.1037/0022-3514.70.4.810
  • McTigue, E. M., Schwippert, K., Uppstad, P. H., Lundetræ, K., & Solheim, O. J. (2020). Gender differences in early literacy: Boys’ response to formal instruction. Journal of Educational Psychology. Advance online publication. doi:10.1037/edu0000626
  • Melhuish, E. (2016). Longitudinal research and early years policy development in the UK. International Journal of Child Care and Education Policy, 10(1), 1-18. doi:10.1186/s40723-016-0019-1
  • Michaelides, M. P. (2019). Negative keying effects in the factor structure of TIMSS 2011 motivation scales and associations with reading achievement. Applied Measurement in Education, 32(4), 365–378. doi:10.1080/08957347.2019.1660349
  • Mullis, I. V. S., & Martin, M. O. (Eds.). (2017). TIMSS 2019 Assessment Frameworks. Retrieved from Boston College, TIMSS & PIRLS International Study Center website: http://timssandpirls.bc.edu/timss2019/frameworks/
  • Mullis, I. V. S., Martin, M. O., Foy, P., Kelly, D. L., & Fishbein, B. (2020). TIMSS 2019 International Results in Mathematics and Science. Retrieved from Boston College, TIMSS & PIRLS International Study Center website: https://timssandpirls.bc.edu/timss2019/international-results/
  • Muthén, L. K., & Muthén, B. O. (1998-2020). Mplus user’s guide (8th ed.). Los Angeles, CA: Muthén & Muthén.
  • Nunnally, J. C. (1978). Psychometric theory (2nd ed.). New York, NY: McGraw-Hill.
  • Peng, P., Barnes, M., Wang, C., Wang, W., Li, S., Swanson, H. L., ... & Tao, S. (2018). A meta-analysis on the relation between reading and working memory. Psychological Bulletin, 144(1), 48. doi:10.1037/bul0000124
  • Petscher, Y. (2010). A meta-analysis of the relationship between student attitudes towards reading and achievement in reading. Journal of Research in Reading, 33(4), 335-355. doi:10.1111/j.1467-9817.2009.01418.x
  • Podsakoff, P. M., MacKenzie, S. B., Lee, J. Y., & Podsakoff, N. P. (2003). Common method biases in behavioral research: A critical review of the literature and recommended remedies. Journal of Applied Psychology, 88(5), 879–903. doi:10.1037/0021-9010.88.5.879
  • Pullmann, H., & Allik, J. (2000). The Rosenberg Self-Esteem Scale: Its dimensionality, stability and personality correlates in Estonian. Personality and Individual Differences, 28(4), 701-715. doi:10.1016/S0191-8869(99)00132-4
  • Quilty, L. C., Oakman, J. M., & Risko, E. (2006). Correlates of the Rosenberg self-esteem scale method effects. Structural Equation Modeling, 13(1), 99-117. doi:10.1207/s15328007sem1301_5
  • R Development Core Team. (2021). R: A language and environment for statistical computing. Vienna, Austria: R Foundation for Statistical Computing.
  • Roth, F. P., Speece, D. L., & Cooper, D. H. (2002). A longitudinal analysis of the connection between oral language and early reading. The Journal of Educational Research, 95(5), 259-272. doi:10.1080/00220670209596600
  • Schmitt, D. P., & Allik, J. (2005). Simultaneous administration of the Rosenberg self-esteem scale in 53 nations: Exploring the universal and culture-specific features of global self-esteem. Journal of Personality and Social Psychology, 89, 623–642. doi:10.1037/0022-3514.89.4.623
  • Sénéchal, M., & LeFevre, J. A. (2002). Parental involvement in the development of children’s reading skill: A five-year longitudinal study. Child Development, 73(2), 445-460. doi:10.1111/1467-8624.00417
  • Spector, P. E., Van Katwyk, P. T., Brannick, M. T., & Chen, P. Y. (1997). When two factors don't reflect two constructs: How item characteristics can produce artifactual factors. Journal of Management, 23(5), 659-677.
  • Storch, S. A., & Whitehurst, G. J. (2002). Oral language and code-related precursors to reading: Evidence from a longitudinal structural model. Developmental Psychology, 38(6), 934. doi:10.1037/0012-1649.38.6.934
  • Tomás, J. M., Oliver, A., Galiana, L., Sancho, P., & Lila, M. (2013). Explaining method effects associated with negatively worded items in trait and state global and domain-specific self-esteem scales. Structural Equation Modeling: A Multidisciplinary Journal, 20(2), 299–313. doi:10.1080/10705511.2013.769394
  • Tunmer, W. E., & Hoover, W. A. (2019). The cognitive foundations of learning to read: A framework for preventing and remediating reading difficulties. Australian Journal of Learning Difficulties, 24(1), 75-93. doi:10.1080/19404158.2019.1614081
  • Wang, W., Chen, H., & Jin, K. (2015). Item response theory models for wording effects in mixed-format scales. Educational and Psychological Measurement, 75(1), 157–178. doi:10.1177/0013164414528209
  • Weems, G. H., Onwuegbuzie, A. J., & Collins, K. M. (2006). The role of reading comprehension in responses to positively and negatively worded items on rating scales. Evaluation & Research in Education, 19(1), 3–20. doi:10.1080/09500790608668322
  • Weems, G. H., Onwuegbuzie, A. J., & Lustig, D. (2003). Profiles of respondents who respond inconsistently to positively- and negatively-worded items on rating scales. Evaluation & Research in Education, 17(1), 45–60. doi:10.1080/14664200308668290
  • Yang, Y., Chen, Y. H., Lo, W. J., & Turner, J. E. (2012). Cross-cultural evaluation of item wording effects on an attitudinal scale. Journal of Psychoeducational Assessment, 30(5), 509–519. doi:10.1177/0734282911435461
There are 46 references in total.

Details

Primary Language: English
Section: Articles
Authors

Hatice Cigdem Bulut (ORCID: 0000-0003-2585-3686)

Publication Date: September 29, 2021
Acceptance Date: September 8, 2021
Published in Issue: Year 2021, Volume: 12, Issue: 3

Cite

APA Bulut, H. C. (2021). Item Wording Effects in Psychological Measures: Do Early Literacy Skills Matter? Journal of Measurement and Evaluation in Education and Psychology, 12(3), 239-253. https://doi.org/10.21031/epod.944067