Research Article

Home-Grown Automated Essay Scoring in the Literature Classroom: A Solution for Managing the Crowd?

Volume: 9 Number: 4 October 16, 2018

Abstract

Managing crowded classes in terms of classroom assessment is a difficult task due to the amount of time that must be devoted to providing feedback on student work. In this respect, the present study aimed to develop an automated essay scoring environment as a potential means of overcoming this problem. Secondarily, the study aimed to test whether automatically assigned scores would correlate with the scores given by a human rater. A quantitative research design employing a machine learning approach was adopted to meet the aims of the study. The data set used for machine learning consisted of 160 scored literary analysis essays written in an English Literature course, each analyzing a theme in a given literary work. The LightSide software was used to train the automated scoring model. First, textual features were extracted and filtered. Then, the Logistic Regression, SMO, SVO, Logistic Tree and Naïve Bayes text classification algorithms were tested using 10-fold cross-validation to identify the most accurate model. To see whether the scores given by the computer correlated with those given by the human rater, Spearman's rank-order correlation coefficient was calculated. The results showed that none of the algorithms scored the essays in the data set with sufficient accuracy, and the scores given by the computer did not correlate significantly with those given by the human rater. The findings imply that the amount of data collected in an authentic classroom environment is too small for classification algorithms to support automated essay scoring for classroom assessment.
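The evaluation pipeline the abstract describes can be sketched in Python with scikit-learn. This is a hypothetical analogue, not the study's actual setup: the study used the LightSide software, and the toy corpus, bag-of-words features, and two score bands below are illustrative assumptions standing in for the 160 human-scored essays.

```python
from scipy.stats import spearmanr
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_predict
from sklearn.pipeline import make_pipeline

# Toy stand-in corpus: 20 short "essays", each labelled with a
# human-rater score band (1 = low, 2 = high). Illustrative only.
essays = (
    [f"the story is sad and the man is sad draft {i}" for i in range(10)]
    + [f"the theme of alienation is developed through imagery draft {i}" for i in range(10)]
)
human_scores = [1] * 10 + [2] * 10

# Textual feature extraction (bag of words) feeding one of the tested
# classifier families (here, Logistic Regression).
model = make_pipeline(CountVectorizer(), LogisticRegression(max_iter=1000))

# 10-fold cross-validation: each essay is scored by a model trained
# on the remaining folds, yielding one machine score per essay.
predicted = cross_val_predict(model, essays, human_scores, cv=10)

# Spearman's rank-order correlation between machine and human scores.
rho, p_value = spearmanr(human_scores, predicted)
```

With this trivially separable toy data the machine and human scores agree closely; the study's finding was that, on its real classroom corpus, no tested algorithm reached a comparable level of agreement.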

Details

Primary Language

English

Journal Section

Research Article

Publication Date

October 16, 2018

Submission Date

January 23, 2018

Acceptance Date

October 5, 2018

Published in Issue

Year 2018 Volume: 9 Number: 4

APA
Uzun, K. (2018). Home-Grown Automated Essay Scoring in the Literature Classroom: A Solution for Managing the Crowd? Contemporary Educational Technology, 9(4), 423-436. https://doi.org/10.30935/cet.471024
