Design, Implementation, and Evaluation of Online Open-Book Exams: Insights from Medical Students’ Perspectives

Document Type: Brief Report

Authors

1 Department of Medical Education, Virtual School of Medical Education and Management, Shahid Beheshti University of Medical Sciences, Tehran, Iran

2 Department of Learning, Informatics, Management and Ethics (LIME), Research Affiliated Faculty, Karolinska Institutet, Solna, Sweden

3 Department of E-Learning in Medical Sciences, Virtual School and Center of Excellence in e-Learning, Shiraz University of Medical Sciences, Shiraz, Iran

4 Department of Medical Education, Center for Recruitment and Affairs of Academic Members and Elites, Shahrekord University of Medical Sciences, Shahrekord, Iran

5 School of Medical Education and Educational Technology, Shahid Beheshti University of Medical Sciences, Tehran, Iran

Abstract

Assessment of students is a crucial aspect of the educational process, closely tied to the quality of learning. Between April 2019 and August 2022, we designed, implemented, and evaluated online open-book examinations for 13 second-year doctoral students in medical education at Shahid Beheshti University of Medical Sciences (SBUMS), Iran, focusing on their perspectives. The exams were developed by a design team of six faculty members, who selected the courses, developed exam blueprints and question distributions, wrote realistic scenarios, and formulated questions according to open-book exam principles. Face and content validity were confirmed using feedback from 15 experts, with CVR ≥ 0.49 and CVI > 0.79, and reliability was confirmed with a Cronbach’s alpha of 0.79. The exams were administered online through the Navaid learning management system (LMS), preceded by a pilot session to reduce student anxiety. Student satisfaction was assessed with a 10-item researcher-developed questionnaire on a 5-point Likert scale, validated by five medical education professors and tested for reliability (Cronbach’s alpha = 0.78); in addition, a face-to-face critique session was held. Students generally provided positive feedback, reporting that the exams offered valuable and novel experiences, stimulated engagement, and allowed them to handle the questions and scenarios more effectively. Some concerns were raised about technical issues, such as internet interruptions, power outages, or exam equipment malfunctions, which could affect performance. Overall, the study shows that the doctoral students were satisfied with the implementation of online open-book exams, highlighting the feasibility and acceptability of this approach in medical education.
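For readers less familiar with the psychometric indices cited above (CVR, CVI, and Cronbach’s alpha), the sketch below shows how they are conventionally computed. It is illustrative only: the matrix of Likert responses and the expert counts in the example are hypothetical and are not the study’s data.

```python
# Minimal sketch of the conventional formulas for the indices reported in the abstract.
# All numbers below are hypothetical; they are not taken from the study.
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for a respondents x items score matrix."""
    k = scores.shape[1]                          # number of items
    item_vars = scores.var(axis=0, ddof=1)       # variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

def content_validity_ratio(n_essential: int, n_experts: int) -> float:
    """Lawshe's CVR = (n_e - N/2) / (N/2); the critical value is about 0.49 for N = 15 experts."""
    return (n_essential - n_experts / 2) / (n_experts / 2)

def item_cvi(n_relevant: int, n_experts: int) -> float:
    """Item-level CVI: proportion of experts rating the item relevant (e.g. 3-4 on a 4-point scale)."""
    return n_relevant / n_experts

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    likert = rng.integers(1, 6, size=(13, 10)).astype(float)  # 13 respondents x 10 Likert items (hypothetical)
    print(f"alpha = {cronbach_alpha(likert):.2f}")
    print(f"CVR   = {content_validity_ratio(n_essential=12, n_experts=15):.2f}")
    print(f"CVI   = {item_cvi(n_relevant=13, n_experts=15):.2f}")
```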

Volume 14, Issue 4 - Serial Number 55
December 2023
Pages 330-336
  • Receive Date: 23 August 2023
  • Revise Date: 05 September 2023
  • Accept Date: 19 September 2023
  • Publish Date: 01 December 2023