Published in Vol 7, No 3 (2021): Jul-Sep

Web-Based Medical Examinations During the COVID-19 Era: Reconsidering Learning as the Main Goal of Examination


Authors of this article:

Amirreza Manteghinejad 1, 2


1Cancer Prevention Research Center, Omid Hospital, Isfahan University of Medical Sciences, Isfahan, Iran

2Student Research Committee, School of Medicine, Isfahan University of Medical Sciences, Isfahan, Iran

Corresponding Author:

Amirreza Manteghinejad, MD, MHPE

Cancer Prevention Research Center

Omid Hospital

Isfahan University of Medical Sciences

Hezarjarib St

Isfahan, 81746-73461


Phone: 98 3132371103


Like other aspects of the health care system, medical education has been greatly affected by the COVID-19 pandemic. To follow the requirements of lockdown and virtual education, the performance of students has been evaluated via web-based examinations. Although this shift to web-based examinations was inevitable, other mental, educational, and technical aspects should be considered to ensure the efficiency and accuracy of this type of evaluation in this era. The easiest way to address the new challenges is to administer traditional questions via a web-based platform. However, more factors should be accounted for when designing web-based examinations during the COVID-19 era. This article presents an approach in which the opportunity created by the pandemic is used as a basis to reconsider learning as the main goal of web-based examinations. The approach suggests using open-book examinations, using questions that require high cognitive domains, using real clinical scenarios, developing more comprehensive examination blueprints, using advanced platforms for web-based questions, and providing feedback in web-based examinations to ensure that the examinees have acquired the minimum competency levels defined in the course objectives.

JMIR Med Educ 2021;7(3):e25355



Currently, we are living in the COVID-19 era [1], during which almost every aspect of life, from communications to people’s lifestyles, has changed [2,3]. Originating in China, the SARS-CoV-2 virus spread worldwide quickly, affected billions of people, and led to several infection control policies, such as wearing masks, social distancing, and lockdowns [4-6]. Iran is one of the countries that was greatly affected by this pandemic. The first cases of COVID-19 were reported on February 19, 2020, and to date, approximately 2.8 million confirmed cases and 78,000 deaths have been reported in Iran. Medical education, including undergraduate and postgraduate education, was not immune to the effects of this virus [7]. A sudden shift to e-learning and web-based courses was a direct result of COVID-19 [8]. For undergraduate students, almost all courses are presented virtually, and for postgraduate students, a blended learning approach is used to reduce the exposure of students, interns, and residents to the disease. Traditional written examinations also underwent a transformation in response to this pandemic, considering the high risk of infection in indoor testing sites [9,10]. The easiest way to face the new challenges was to administer traditional questions via a web-based platform. However, more factors should be accounted for in designing web-based examinations during the COVID-19 era. This article aims to address some of the major aspects that should be considered in this regard and to suggest a valid and reliable method for organizing medical students’ examinations in the COVID-19 era.

Resorting to web-based examinations during the COVID-19 pandemic was inevitable. It should be noted, however, that COVID-19 affects medical students, medical educators, and medical universities in ways that go beyond lockdowns and social distancing. The aspects of COVID-19 that must be taken into account during the pandemic can be categorized into mental aspects, educational aspects, and technical aspects.

Mental Aspects

Stress and burnout are among the factors that may affect medical students during this pandemic [11]. Studies conducted before the COVID-19 pandemic showed that stress levels were high among medical students [12]. The COVID-19 pandemic has now worsened the stress levels of medical students and workers throughout the entire health care community [13,14]. In our experience at Medical University of Isfahan (MUI) teaching hospitals, some of the main causes of stress during this pandemic were the shortage of personal protective equipment in the early weeks; the poor state of intern and resident on-call rooms, which increased the risk of infection; and the fear of infecting family members.

Regarding burnout, previous studies have shown a higher prevalence of burnout in medical students than in the general population [15]. Currently, during the COVID-19 pandemic, burnout syndrome has become more prevalent in health care providers and medical students [14,16].

Educational Aspects

The difference in the types of education that medical students, interns, and residents receive also increases the need to change the way examinations are conducted. The sudden shift to e-learning has created challenges in medical education [17]. These challenges include decreased numbers of educational opportunities, decreased numbers of patients (except for patients with COVID-19), increased numbers of shifts, and in some cases, the requirement to work in COVID-19 wards. As a result of the second and third waves of this pandemic in Isfahan, the primary teaching hospital of MUI became a COVID-19 center, and all operating rooms and outpatient clinics were closed, thus decreasing educational opportunities for interns and residents.

Technical Aspects

Web-based examinations also have unique problems. Studies show that web-based testing environments negatively affect student performance on examinations because of differences in student comfort and technical problems, such as internet connection disruptions and server failure [18]. This problem occurred in our first experiences of web-based examinations at MUI. Server error messages appeared during the examinations due to the lack of server resources and negatively affected the students’ performance by causing them stress or wasting examination time.

The ability to control the environment of unproctored web-based examinations is also questionable [19]; for instance, cheating is a commonly reported challenge in web-based assessments [20]. Kennedy et al [21] reported that 57% of students and 64% of faculty members believed that it is easier to cheat on web-based examinations than on traditional examinations. In another study, Jensen and Thomsen [22] showed that approximately 22% of participants in web-based examinations used search engines to find the correct answer. This was also the case in MUI examinations. Our first web-based examination results showed a significant increase in examination grades. Further investigations revealed that using messaging platforms such as WhatsApp to share the answers is one of the most common ways of cheating among students. Other common forms of web-based cheating include using electronic books and typed handouts, and searching for keywords.

In response to all these emerging problems, most solutions are limited to technical aspects and preventing students from cheating. Some educators reduced the examination time or used more difficult questions to address this problem. Using webcams and screen sharing to monitor students is another solution. This method, however, has certain limitations, such as the limits on the number of examination participants and the need for high-speed internet connections for all students [23]. Meanwhile, technology can be employed to prevent cheating. For example, PageFocus is a JavaScript code that detects when participants abandon test pages by switching to another window or browser tab [24]. However, the ubiquitous use of smartphones and messaging applications undermines this method. Sharing answers on social media platforms can thwart the strategies employed by educators, such as decreasing the duration of examinations. Generally, using a second device can neutralize strategies such as screen sharing and PageFocus.
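To illustrate how paradata-based detection of this kind works on the server side, the sketch below summarizes hypothetical defocus events (timestamp pairs marking when a student left and returned to the test page, as a client-side focus-tracking script such as PageFocus might report them) and flags attempts whose total off-page time exceeds a threshold. The function and field names are illustrative assumptions, not PageFocus's actual API.

```python
# Illustrative sketch only: server-side summary of PageFocus-style paradata.
# Each event is a (left_page_at, returned_at) timestamp pair in seconds,
# as a client-side focus-tracking script might report them.

def summarize_defocus(events, flag_after_seconds=20):
    """Summarize off-page time and decide whether to flag the attempt."""
    seconds_away = sum(returned - left for left, returned in events)
    return {
        "defocus_count": len(events),
        "seconds_away": seconds_away,
        "flagged": seconds_away > flag_after_seconds,
    }

# A student who left the page twice, for 3.5 s and then 35 s:
summary = summarize_defocus([(100.0, 103.5), (240.0, 275.0)])
```

As the text notes, a student answering on a second device never triggers such events, so this kind of logging can at best complement, not replace, assessment redesign.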

Although all these problems related to web-based examinations in the COVID-19 era seem concerning at first glance, they lead us to take a step back and reconsider the aims of an examination. One of the main goals of an examination is to improve learning; however, learning through examination can only occur when the examination requires more of students than simply answering the questions and waiting for the results.

The COVID-19 era and web-based examinations are providing an excellent opportunity to consider learning as the main goal of examination. By accepting the existing limitations in educating and assessing students and preventing cheating in this era, some measures can be taken to transform the examination sessions into learning sessions to ensure that students have achieved the minimum competency levels required for passing their courses. For this transformation, the following approach is suggested.

Use Open-Book Examinations

The reason for suggesting open-book examinations is that they may encourage students to use their books or search for the answers in evidence-based medicine databases. Moreover, a systematic review showed decreased anxiety levels among students in open-book examinations, which makes them more favorable, especially in this era [25]. Evidence-based medicine is a result of the internet revolution, and it emerged because of the rapid expansion of knowledge in the age of information; its primary goal is to educate clinicians on how to use published articles for optimizing clinical care [26]. To use evidence-based medicine, physicians must possess the ability to search for and find correct information on the internet and in medical databases [26]; therefore, open-book examinations are an excellent way to familiarize medical students with evidence-based medicine [27].

Use Questions That Require High Cognitive Domains

To encourage students to use their books and other evidence-based medicine sources during examinations, it is essential to revise the examination questions.

According to Bloom’s theory, cognition has multiple domains [28]. Open-book examinations enable examiners to ask questions that require higher cognitive domains [29], because the students cannot find the test answers easily in references or on the internet with a simple keyword search. Using questions that require high cognitive domains forces the students to read the related contents thoroughly and ensures that students answering the questions have read the selected contents at least once and have understood them.

Using clinical scenarios is a good way to develop questions for open-book examinations because these scenarios require high cognitive domain levels. Open-book examinations are, in fact, similar to clinical practice in certain aspects. In real practice, general practitioners must have the ability to use evidence-based medicine in their decision-making; therefore, they should know how to search, where to search, and what to search for. Using real clinical scenarios in web-based examinations is an excellent method to train medical students in using evidence-based medicine by searching in evidence-based medicine resources [30].

Create More Comprehensive Examination Blueprints

We use examinations to ensure that medical students have a minimum competency in the subject of the examination. To achieve this goal, especially when the education system is less than perfect, it is necessary to develop more comprehensive examination blueprints. By using a comprehensive blueprint in coordination with open-book examinations, the instructor ensures that all the students recall, review, read, and understand the course topics and subtopics. A common guideline for cognitive domains of questions is 50-40-10, which means that 50% of questions require the knowledge domain, 40% require the application domain, and 10% require problem-solving skills [31,32]. However, as mentioned above, questions that require high cognitive skills such as problem-solving must be given more weight in these circumstances. The type of examination (formative or summative) and examinee (student, intern, or resident) can also change these percentages to ensure that an examinee who passes the examination has a minimum competency according to the course objectives [33].
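To make the blueprint arithmetic concrete, the sketch below distributes a question budget across Bloom-level categories by percentage weight. The 50-40-10 split is the guideline cited above; the shifted open-book weighting is a hypothetical example of giving higher cognitive domains more weight, not a prescription.

```python
# Sketch: turning blueprint percentages into question counts.
# Assumes the weights divide the total cleanly; a real blueprint tool
# would also reconcile rounding remainders.

def allocate_questions(total, weights):
    """Distribute `total` questions across cognitive domains by percent weight."""
    return {domain: round(total * pct / 100) for domain, pct in weights.items()}

# The conventional 50-40-10 guideline, for a 60-question examination:
conventional = allocate_questions(
    60, {"knowledge": 50, "application": 40, "problem_solving": 10})

# A hypothetical open-book weighting favoring higher cognitive domains:
open_book = allocate_questions(
    60, {"knowledge": 20, "application": 40, "problem_solving": 40})
```

The same allocation can be repeated per topic of the blueprint, so that every course objective receives questions at the intended cognitive level.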

Use Advanced Platforms for Web-Based Examinations

Using advanced platforms to administer web-based examinations makes it difficult for students to cheat and encourages them to refer to their books to find the answers [29]. Showing only one question at a time on screen, sorting the questions and choices randomly for multiple-choice questions, and prohibiting students from revisiting a question can help the instructors encourage students to use references instead of cheating [34-36]. Reducing stress is an additional goal of using advanced platforms. Internet connection disruptions and server failure are major causes of stress during web-based examinations. The web-based examination platform must be capable of handling these disruptions; it should save the students’ previous answers and their remaining examination time to curb the negative effects of disruptions on the performance of the examinees.
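A minimal sketch of two of these platform features follows: deterministic per-student shuffling of questions and choices (so that a reconnecting student sees the same order), and autosaving answers together with the remaining time so that a dropped connection does not reset the attempt. It assumes an in-memory store, and none of the names belong to a specific platform.

```python
import random

# Minimal sketch, not a specific platform's API: per-student randomization
# of question and choice order, plus autosaving of partial progress.

def shuffled_exam(question_ids, choices, student_id):
    """Shuffle questions and their choices deterministically per student."""
    rng = random.Random(student_id)  # same seed -> same order after a reconnect
    order = list(question_ids)
    rng.shuffle(order)
    shuffled_choices = {q: rng.sample(choices[q], k=len(choices[q])) for q in order}
    return order, shuffled_choices

def autosave(store, student_id, answers, seconds_left):
    """Persist partial answers and remaining time after every response."""
    store[student_id] = {"answers": dict(answers), "seconds_left": seconds_left}

questions = ["q1", "q2", "q3"]
choices = {q: ["A", "B", "C", "D"] for q in questions}
first = shuffled_exam(questions, choices, student_id=42)
reconnect = shuffled_exam(questions, choices, student_id=42)  # identical order

store = {}
autosave(store, 42, {"q2": "C"}, seconds_left=1800)
```

Seeding the generator with a per-student value keeps orders different across students while making each student's view reproducible, which is what allows the saved state to be restored after a server or connection failure.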

Provide Feedback

Learning in an examination session occurs when the examination requires students to do more than merely participate, answer questions, and wait for their grades. Providing feedback is an essential component to turn an examination session into a learning session. The question-response-feedback approach is one of the easiest ways to create a learning session in web-based examinations [37]. This type of feedback is divided into three categories: indication of a correct or incorrect response, statement of the correct response, and elaborative corrective feedback that includes an explanation of the question and responses. Moreover, the feedback can be provided after each question (immediate) or at the end of the test (delayed). Previous studies show that providing feedback with examination question rationales is a better approach than simply providing the correct answer or simply indicating whether the student was correct or not [37-39]. Moreover, one study showed that delayed feedback in examinations is more beneficial than immediate feedback [37]. In conclusion, delayed elaborative feedback is suggested for web-based examinations in this era.
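The question-response-feedback flow with delayed elaborative feedback can be sketched as follows: responses are scored as they arrive, but the correct answers and rationales are withheld until submission and then released together. The question structure and field names here are illustrative assumptions.

```python
# Sketch of delayed elaborative feedback: score responses during the test,
# release correct answers and rationales only after submission.
# Question structure and field names are illustrative.

def grade_with_delayed_feedback(questions, responses):
    """Return the total score and per-question elaborative feedback."""
    score, feedback = 0, []
    for q in questions:
        correct = responses.get(q["id"]) == q["answer"]
        score += int(correct)
        feedback.append({
            "question_id": q["id"],
            "correct": correct,                # indication of correct/incorrect
            "correct_answer": q["answer"],     # statement of the correct response
            "rationale": q["rationale"],       # elaborative corrective feedback
        })
    return score, feedback                     # released together: delayed feedback

questions = [
    {"id": "q1", "answer": "B", "rationale": "Option B matches the guideline."},
    {"id": "q2", "answer": "D", "rationale": "Options A-C miss the key finding."},
]
score, feedback = grade_with_delayed_feedback(questions, {"q1": "B", "q2": "A"})
```

Each feedback record covers all three categories described above, so the post-examination review becomes the learning session the text argues for.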

There is no doubt that medical education, especially in the clinical setting, is being affected by the COVID-19 pandemic. Therefore, there is a need for new education and examination policies to adapt to this situation. Learning is one of the goals of examination that both instructors and students often neglect. The COVID-19 era is an excellent opportunity to consider learning as the main goal of examinations, to use methods that transform an examination session into a learning session, and to ensure that the students who pass an examination have a minimum level of competency.


The author thanks Shaghayegh Haghjooy Javanmard, MD, PhD, and Atefeh Vaezi, MD, for their support during the drafting and editing stages of this manuscript. No financial support or sponsorship was received for this work.

Conflicts of Interest

None declared.

  1. Shirani K, Sheikhbahaei E, Torkpour Z, Ghadiri Nejad M, Kamyab Moghadas B, Ghasemi M, et al. A narrative review of COVID-19: the new pandemic disease. Iran J Med Sci 2020 Jul;45(4):233-249 [FREE Full text] [CrossRef] [Medline]
  2. Yu M, Li Z, Yu Z, He J, Zhou J. Communication related health crisis on social media: a case of COVID-19 outbreak. Curr Issues Tour 2020 Apr 14:1-7. [CrossRef]
  3. Romero-Blanco C, Rodríguez-Almagro J, Onieva-Zafra MD, Parra-Fernández ML, Prado-Laguna MDC, Hernández-Martínez A. Physical activity and sedentary lifestyle in university students: changes during confinement due to the COVID-19 pandemic. Int J Environ Res Public Health 2020 Sep 09;17(18):6567 [FREE Full text] [CrossRef] [Medline]
  4. Pneumonia of unknown cause—China: disease outbreak news. World Health Organization. 2020 Jan 05.   URL: [accessed 2021-08-05]
  5. Steffens I. A hundred days into the coronavirus disease (COVID-19) pandemic. Eurosurveillance 2020;25(14):2000550. [CrossRef]
  6. Desai AN, Aronoff DM. Masks and coronavirus disease 2019 (COVID-19). JAMA 2020 May 26;323(20):2103. [CrossRef] [Medline]
  7. Rose S. Medical student education in the time of COVID-19. JAMA 2020 Jun 02;323(21):2131-2132. [CrossRef] [Medline]
  8. Dhawan S. Online learning: a panacea in the time of COVID-19 crisis. J Educ Technol Syst 2020 Jun 20;49(1):5-22. [CrossRef]
  9. Kohanski M, Lo L, Waring M. Review of indoor aerosol generation, transport, and control in the context of COVID-19. Int Forum Allergy Rhinol 2020 Oct;10(10):1173-1179 [FREE Full text] [CrossRef] [Medline]
  10. Walton M, Murray E, Christian MD. Mental health care for medical staff and affiliated healthcare workers during the COVID-19 pandemic. Eur Heart J Acute Cardiovasc Care 2020 Apr 28;9(3):241-247 [FREE Full text] [CrossRef] [Medline]
  11. Stacey A, D'Eon M, Madojemu G. Medical student stress and burnout: before and after COVID-19. Can Med Educ J 2020 Dec 05;11(6):e204-e205 [FREE Full text] [CrossRef] [Medline]
  12. Sharifirad G, Marjani A, Abdolrahman C, Mostafa Q, Hossein S. Stress among Isfahan medical sciences students. J Res Med Sci 2012 Apr;17(4):402-406 [FREE Full text] [Medline]
  13. Sabouhi S, Vaezi A, Sharbafchi M, Aerni A, Bentz D, Coynel D. The Iranian Corona Stress Study. OSF Preprints. Preprint posted on November 25, 2020. [CrossRef]
  14. Abdulghani HM, Sattar K, Ahmad T, Akram A. Association of COVID-19 pandemic with undergraduate medical students' perceived stress and coping. Psychol Res Behav Manag 2020 Oct;13:871-881 [FREE Full text] [CrossRef] [Medline]
  15. Dyrbye L, Shanafelt T. A narrative review on burnout experienced by medical students and residents. Med Educ 2016 Jan 23;50(1):132-149. [CrossRef] [Medline]
  16. Sharifi M, Asadi-Pooya A, Mousavi-Roknabadi R. Burnout among healthcare providers of COVID-19; a systematic review of epidemiology and recommendations. Arch Acad Emerg Med 2021;9(1):e7 [FREE Full text] [Medline]
  17. Rajab M, Gazal A, Alkattan K. Challenges to online medical education during the COVID-19 pandemic. Cureus 2020 Jul 02;12(7):e8966. [CrossRef]
  18. Fask A, Englander F, Wang Z. Do online exams facilitate cheating? An experiment designed to separate possible cheating from the effect of the online test taking environment. J Acad Ethics 2014 Mar 20;12(2):101-112. [CrossRef]
  19. Tippins NT, Beaty J, Drasgow F, Gibson WM, Pearlman K, Segall DO, et al. Unproctored internet testing in employment settings. Pers Psychol 2006 Mar;59(1):189-225. [CrossRef]
  20. Hollister K, Berenson M. Proctored versus unproctored online exams: studying the impact of exam environment on student performance. Decis Sci J Innov Educ 2009 Jan 16:271-294 [FREE Full text] [CrossRef]
  21. Kennedy K, Nowak S, Raghuraman R, Thomas J, Davis S. Academic dishonesty and distance learning: student and faculty views. Coll Stud J 2000;34(2):309.
  22. Jensen C, Thomsen JPF. Self-reported cheating in web surveys on political knowledge. Qual Quant 2013 Dec 3;48(6):3343-3354. [CrossRef]
  23. Hylton K, Levy Y, Dringus LP. Utilizing webcam-based proctoring to deter misconduct in online exams. Comput Educ 2016 Jan;92-93:53-63. [CrossRef]
  24. Diedenhofen B, Musch J. PageFocus: Using paradata to detect and prevent cheating on online achievement tests. Behav Res Methods 2017 Aug;49(4):1444-1459. [CrossRef] [Medline]
  25. Johanns B, Dinkens A, Moore J. A systematic review comparing open-book and closed-book examinations: evaluating effects on development of critical thinking skills. Nurse Educ Pract 2017 Nov;27:89-94 [FREE Full text] [CrossRef] [Medline]
  26. Djulbegovic B, Guyatt GH. Progress in evidence-based medicine: a quarter century on. Lancet 2017 Jul;390(10092):415-423. [CrossRef]
  27. Srinivasan M, Weiner M, Breitfeld PP, Brahmi F, Dickerson KL, Weiner G. Early introduction of an evidence-based medicine course to preclinical medical students. J Gen Intern Med 2002 Jan;17(1):58-65 [FREE Full text] [CrossRef] [Medline]
  28. Bloom B. Taxonomy of Educational Objectives, Handbook 1: Cognitive Domain. New York, NY: McKay; 1956:24.
  29. Harmon O, Lambrinos J, Buffolino J. Assessment design and cheating risk in online instruction. Online Journal of Distance Learning Administration 2010;13(3) [FREE Full text]
  30. Zagury-Orly I, Durning SJ. Assessing open-book examination in medical education: the time is now. Med Teach 2020 Aug 28:1-2. [CrossRef]
  31. Knecht K. Assessing cognitive skills of pharmacy students in a biomedical sciences module using a classification of multiple-choice item categories according to Bloom's taxonomy. Am J Pharm Educ 2001;65(4):324 [FREE Full text]
  32. Palmer EJ, Devitt PG. Assessment of higher order cognitive skills in undergraduate education: modified essay or multiple choice questions? Research paper. BMC Med Educ 2007 Nov 28;7:49 [FREE Full text] [CrossRef] [Medline]
  33. Olopade FE, Adaramoye OA, Raji Y, Fasola AO, Olapade-Olaopa EO. Developing a competency-based medical education curriculum for the core basic medical sciences in an African Medical School. Adv Med Educ Pract 2016;7:389-398 [FREE Full text] [CrossRef] [Medline]
  34. Vachris MA. Teaching principles of economics without “chalk and talk”: the experience of CNU online. J Econ Educ 1999 Jan;30(3):292-303. [CrossRef]
  35. Shuey S. Assessing online learning in higher education. J Instr Deliv Syst 2002;16(2):13-18.
  36. Serwatka J. Assessment in on-line CIS courses. J Comput Inf Syst 2003;44(1):16-20.
  37. Levant B, Zückert W, Paolo A. Post-exam feedback with question rationales improves re-test performance of medical students on a multiple-choice exam. Adv Health Sci Educ 2018 Jul 24;23(5):995-1003. [CrossRef]
  38. Merrill J. Levels of Questioning and Forms of Feedback: Instructional Factors in Courseware Design. Dissertation. Columbus, OH: The Ohio State University; 1985.
  39. Bangert-Drowns RL, Kulik CC, Kulik JA, Morgan M. The instructional effect of feedback in test-like events. Rev Educ Res 2016 Jun 30;61(2):213-238. [CrossRef]

MUI: Medical University of Isfahan

Edited by G Eysenbach; submitted 28.10.20; peer-reviewed by B Liu, S Watty, P Dattathreya; comments to author 27.11.20; revised version received 18.02.21; accepted 23.05.21; published 09.08.21


©Amirreza Manteghinejad. Originally published in JMIR Medical Education, 09.08.2021.

This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Medical Education, is properly cited. The complete bibliographic information, a link to the original publication, as well as this copyright and license information must be included.