The COVID-19 Pandemic and the Challenges of E-Assessment of Calculus Courses in Higher Education: A Case Study in Saudi Arabia

Fatima Azmi, Heba Bakr Khoshaim

Abstract


The COVID-19 pandemic has affected many aspects of our lives, including education. Due to this unexpected catastrophe, education shifted to virtual-learning and auto-grading models in most parts of the world. This study explores the validity and appropriateness of auto-graded assessment for online exams by comparing students' online exam scores when first auto-graded and then manually graded, and investigates whether the mean differences between the two sets of scores are statistically significant. The study covered two calculus courses taught by the authors during the spring semester of 2019-2020 at a private university in Saudi Arabia. The online exam was administered on the WebAssign platform, which has built-in calculus questions. The sample consisted of fifty-five students registered in those calculus courses. The quantitative data were analysed using SPSS. A paired t-test at an alpha level of 0.05 was performed on the differences between the auto-graded and manually graded mean exam scores. The analysis revealed a statistically significant difference between students' mean scores. Our findings illustrate the importance of human intelligence, its role in assessing students' achievement and understanding of mathematical concepts, and the extent to which instructors can currently rely on auto-grading. A careful manual review of the auto-graded exams revealed different types of mistakes committed by students, which fell into two categories: non-mathematical mistakes (related to the platform's design) and minor mathematical mistakes that might deserve partial credit. The study indicated a need to reform the auto-grading system and offered some suggestions to overcome its setbacks.
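The paired t-test procedure described in the abstract can be sketched as follows. This is a minimal illustration only: the scores are made-up data, not the study's actual results, and the critical value is read from a standard t-table for the assumed sample size.

```python
import math

# Hypothetical paired scores for the same students (illustrative data,
# not the study's actual exam scores).
auto = [70, 65, 80, 55, 90, 72, 60, 85]     # auto-graded scores
manual = [74, 66, 84, 60, 91, 75, 65, 88]   # manually graded scores

# Paired t-test works on the per-student score differences.
diffs = [m - a for m, a in zip(manual, auto)]
n = len(diffs)
mean_d = sum(diffs) / n
var_d = sum((d - mean_d) ** 2 for d in diffs) / (n - 1)  # sample variance
t_stat = mean_d / math.sqrt(var_d / n)

# Two-tailed critical value for alpha = 0.05 with n - 1 = 7 degrees of
# freedom, taken from a t-table: approximately 2.365.
t_crit = 2.365
significant = abs(t_stat) > t_crit
```

In practice a statistics package (the study used SPSS) reports the exact p-value rather than comparing against a tabled critical value, but the decision rule is the same: reject the null hypothesis of equal mean scores when the test statistic exceeds the critical value.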

https://doi.org/10.26803/ijlter.20.3.16


Keywords


COVID-19 pandemic; E-assessment; validity of auto-grading; higher education; mathematics






e-ISSN: 1694-2116

p-ISSN: 1694-2493