Research Article

WEB-BASED LISTENING EXAM ACCEPTANCE: COMPARING FRESHMAN AND SOPHOMORE STUDENTS AT A VOCATIONAL COLLEGE

Year 2016, Volume: 5, 121-124, 01.09.2016

Abstract

Assessing learner performance in a foreign language teaching setting has always been a challenge for instructors for practical reasons, and Web-Based Exams (WBEs) have been easing assessment procedures through their advantages in security, cost, accuracy, and time saving. Increasingly popular in recent years, WBEs are regarded as effective methods of testing and evaluation because they not only reduce the time and effort required for exam generation and scheduling but also enable more efficient recording, grading, and further analysis of the results. For this reason, a growing number of secondary and higher education institutions are adopting WBEs to evaluate their students' achievement. However, an issue as significant as employing WBEs in educational settings is learners' acceptance of them. Building on this point, this study aims to investigate the factors influencing language learners' acceptance of WBEs in terms of the following variables: Goal Expectancy, Social Influence, Facilitating Conditions, Computer Self-Efficacy, Content, Perceived Usefulness, Perceived Ease of Use, Perceived Playfulness, and Behavioral Intention. The study was conducted in the 2014–2015 academic year with the freshman and sophomore students of an 'English as a Foreign Language' course at a vocational college. The data were collected via an online questionnaire from 602 participants who had used the web-based listening comprehension exam system, and they were analyzed through descriptive and inferential statistics (independent-samples t-test). The overall results indicated that the participants were not inclined to use WBEs, although sophomore students favored WBEs slightly more than freshman students did. The inferential analysis revealed a significant difference between freshman and sophomore students on all factors except Goal Expectancy. In this light, having more experience with WBEs may create a positive orientation in students' acceptance of WBEs.

References

  • Bull, J., & McKenna, C. (2004). Blueprint for computer-assisted assessment. London: Routledge-Falmer.
  • Cigdem, H., & Oncu, S. (2015). E-assessment adaptation at a military vocational college: Student perceptions. Eurasia Journal of Mathematics, Science & Technology Education, 11(5), 971-988.
  • Cigdem, H., & Tan, S. (2014). Students' opinions on administering optional online quizzes in a two-year college Mathematics course. Journal of Computer and Education Research, 2(4), 51-73.
  • Dermo, J. (2009). E-assessment and the student learning experience: A survey of student perceptions of e-assessment. British Journal of Educational Technology, 40(2), 203-214.
  • DeSouza, E., & Fleming, M. (2003). A comparison of in-class quizzes vs. online quizzes on student exam performance. Journal of Computing in Higher Education, 14(2), 121-134.
  • Gikandi, J. W., Morrow, D., & Davis, N. E. (2011). Online formative assessment in higher education: A review of the literature. Computers & Education, 57(4), 2333-2351.
  • Grimstad, K., & Grabe, M. (2004). Are online study questions beneficial? Teaching of Psychology, 31(2), 143-146.
  • Johnson, B. C., & Kiviniemi, M. T. (2009). The effect of online chapter quizzes on exam performance in an undergraduate social psychology course. Teaching of Psychology, 36(1), 33-37.
  • Llamas-Nistal, M., Fernández-Iglesias, M. J., González-Tato, J., & Mikic-Fonte, F. A. (2013). Blended e-assessment: Migrating classical exams to the digital world. Computers & Education, 62, 72-87.
  • McCausland, W. D. (2003). Extended case study: Computer aided assessment and independent learning in macroeconomics. Bristol: University of Bristol. Retrieved from http://www.economicsnetwork.ac.uk/showcase/ext_causland.htm
  • Morris, D. (2008). Economics of scale and scope in e-learning. Teaching in Higher Education, 33(3), 331-343.
  • Nicol, D. (2007). Laying a foundation for lifelong learning: Case studies of e-assessment in large 1st-year classes. British Journal of Educational Technology, 38(4), 668-678.
  • Sorensen, E. (2013). Implementation and student perceptions of e-assessment in a Chemical Engineering module. European Journal of Engineering Education, 38(2), 172-185.
  • Terzis, V., & Economides, A. A. (2011). The acceptance and use of computer based assessment. Computers & Education, 56(4), 1032-1044.
  • Zakrzewski, S., & Bull, J. (1998). The mass implementation and evaluation of computer-based assessments. Assessment & Evaluation in Higher Education, 23(2), 141-152.
  • Zakrzewski, S., & Steven, C. (2000). A model for computer-based assessment: The Catherine Wheel principle. Assessment & Evaluation in Higher Education, 25(2), 201-215.

Details

Journal Section: Articles
Authors

Harun Cigdem

Mustafa Ozturk

Abdullah Topcu

Publication Date: September 1, 2016
Published in Issue: Year 2016, Volume: 5

Cite

APA Cigdem, H., Ozturk, M., & Topcu, A. (2016). WEB-BASED LISTENING EXAM ACCEPTANCE: COMPARING FRESHMAN AND SOPHOMORE STUDENTS AT A VOCATIONAL COLLEGE. The Eurasia Proceedings of Educational and Social Sciences, 5, 121-124.