Published in Vol 10 (2024)

Preprints (earlier versions) of this paper are available at https://preprints.jmir.org/preprint/57772.
Knowledge Mapping and Global Trends in the Field of the Objective Structured Clinical Examination: Bibliometric and Visual Analysis (2004-2023)

Authors of this article:

Hongjun Ba1; Lili Zhang1; Xiufang He1; Shujuan Li1

Original Paper

Department of Pediatric Cardiology, First Affiliated Hospital of Sun Yat-sen University, Guangzhou, China

Corresponding Author:

Shujuan Li, MD

Department of Pediatric Cardiology

First Affiliated Hospital of Sun Yat-sen University

58# Zhongshan Road 2

Guangzhou, 510080

China

Phone: 86 13430329103

Email: lishuj2@mail.sysu.edu.cn


Background: The Objective Structured Clinical Examination (OSCE) is a pivotal tool for assessing health care professionals and plays an integral role in medical education.

Objective: This study aims to map the bibliometric landscape of OSCE research, highlighting trends and key influencers.

Methods: A comprehensive literature search was conducted for materials related to OSCE from January 2004 to December 2023, using the Web of Science Core Collection database. Bibliometric analysis and visualization were performed with VOSviewer and CiteSpace software tools.

Results: Our analysis indicates a consistent increase in OSCE-related publications over the study period, with a notable surge after 2019 and a peak of activity in 2021. The United States emerged as the leading contributor, responsible for 30.86% (1626/5268) of total publications and amassing 44,051 citations. Coauthorship network analysis highlighted robust collaborations, particularly between the United States and the United Kingdom. Leading journals in this domain—BMC Medical Education, Medical Education, Academic Medicine, and Medical Teacher—featured the highest volume of papers, while The Lancet garnered substantial citations, reflecting its high impact factor. Prominent authors in the field include Sondra Zabar, Debra Pugh, Timothy J Wood, and Susan Humphrey-Murto, with Ronald M Harden, Brian D Hodges, and George E Miller being the most cited. The analysis of key research terms revealed a focus on “education,” “performance,” “competence,” and “skills,” indicating that these are central themes in OSCE research.

Conclusions: The study underscores a dynamic expansion in OSCE research and international collaboration, spotlighting influential countries, institutions, authors, and journals. These elements are instrumental in steering the evolution of medical education assessment practices and suggest a trajectory for future research endeavors. Future work should consider the implications of these findings for medical education and the potential areas for further investigation, particularly in underrepresented regions or emerging competencies in health care training.

JMIR Med Educ 2024;10:e57772

doi:10.2196/57772

Keywords



Objective Structured Clinical Examinations (OSCEs) have emerged as indispensable tools for assessing health care professionals, providing structured evaluations of clinical competencies, communication skills, and decision-making abilities [1,2]. Despite their widespread adoption since the 1970s, the landscape of OSCE research remains multifaceted and dynamic, reflecting ongoing innovations in medical, nursing, and allied health education [3].

While numerous studies have explored various aspects of OSCEs, gaps persist in our understanding of the overarching trends and global dynamics shaping this field. A comprehensive review of the existing literature highlights the need for a systematic approach to mapping the knowledge landscape and identifying emerging trends through bibliometric analysis [4-6]. By applying quantitative methods to scholarly publications, bibliometric analysis offers a unique opportunity to uncover hidden patterns, elucidate research trajectories, and forecast future directions in OSCE research.

Building on this rationale, our study aims to bridge these gaps by conducting a bibliometric analysis of OSCE literature from 2004 to 2023. We hypothesize that this analysis will reveal distinct patterns of publication output, collaboration networks, and thematic clusters within the OSCE research domain. Specifically, we seek to (1) identify key research themes, including but not limited to assessment methodologies, educational interventions, and technological innovations in OSCEs; (2) map the global distribution of OSCE research, highlighting geographic hotspots and areas of collaboration; and (3) explore the interconnections between different disciplines within medical education, shedding light on interdisciplinary collaborations and knowledge diffusion.

By elucidating these aspects, our study aims to provide stakeholders in medical education with valuable insights into the current state and future directions of OSCE research. Ultimately, this knowledge mapping exercise seeks to inform evidence-based decision-making, guide educational practices, and stimulate further research in the field of clinical skills assessment.


Data Acquisition and Search Strategy

The bibliographic accuracy of literature records in the Web of Science Core Collection (WoSCC) database is widely regarded as superior to that of other databases, making it a strong choice for literature analysis [7,8]. We therefore performed our search within this database, covering all relevant papers published between January 1, 2004, and December 31, 2023. The search formula “(TS=(The Objective Structured Clinical Examination)) OR TS=(OSCE)” was used. Literature screening was based on the following inclusion criteria: (1) full-text publications related to OSCEs; (2) papers and review manuscripts written in English; and (3) papers published between January 1, 2004, and December 31, 2023. The exclusion criteria were (1) topics not related to OSCEs and (2) publications in the form of conference abstracts, news briefs, and so on. Records were exported in plain text format.
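As an illustration, the screening step above can be sketched in code. The field tags (DT, LA, PY) follow the Web of Science plain-text export convention; the sample records below are hypothetical, not drawn from the study's data.

```python
# Sketch of the screening step: filter records exported from Web of Science
# by the study's inclusion criteria. DT = document type, LA = language,
# PY = publication year (WoS plain-text export field tags).

def include(record):
    doc_type = record.get("DT", "")
    language = record.get("LA", "")
    year = int(record.get("PY", 0))
    return (
        doc_type in {"Article", "Review"}   # papers and reviews only
        and language == "English"           # English-language only
        and 2004 <= year <= 2023            # study window
    )

# Illustrative records: only the first meets all three criteria
records = [
    {"DT": "Article", "LA": "English", "PY": "2019"},
    {"DT": "Meeting Abstract", "LA": "English", "PY": "2019"},  # excluded
    {"DT": "Review", "LA": "English", "PY": "2003"},            # excluded
]
screened = [r for r in records if include(r)]
```

In practice the same filters can be applied directly in the WoS interface; the sketch simply makes the criteria explicit and reproducible.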

General Data

Figure 1 shows the process of literature searching and bibliometric analysis. The results indicate that from January 1, 2004, to December 31, 2023, there were a total of 5268 publications related to the OSCE in the WoSCC database, including 1800 papers (84.96%) and 384 reviews (15.04%). The literature involved 133 countries and regions, 5291 institutions, and 24,478 authors.

Figure 1. The workflow of data collection and bibliometric analysis.

Data Analysis

To depict annual publication trends and the distribution of national contributions, we used GraphPad Prism (version 8.0.2; Dotmatics). For the bibliometric analysis and the visualization of scientific knowledge maps, the study used both CiteSpace (version 6.2.4R, 64-bit advanced edition; Chaomei Chen, Drexel University) [10] and VOSviewer (version 1.6.18; Leiden University) [9]. These tools were selected for their robustness in handling extensive bibliometric data and their ability to graphically represent complex networks.

VOSviewer, a Java-based software tool introduced by van Eck and Waltman [9] in 2010, facilitates the construction of various types of network maps, such as bibliographic coupling, cocitation, and coauthorship networks. CiteSpace, developed by Professor Chaomei Chen, provides a dynamic platform for identifying and visualizing patterns and trends in the scientific literature, enabling the exploration of knowledge domains and predictive analysis of research trajectories [10].

Our methodological approach within these applications involved setting specific parameters for network density, threshold values for the inclusion of nodes, and time-slicing techniques to analyze temporal changes. The references corresponding to the software applications were verified against our citation list to ensure accuracy [9,10].

In our study using VOSviewer and CiteSpace for bibliometric analysis, country-based collaborations were defined according to specific criteria: collaborations were determined from the first authors and corresponding authors listed in the paper bylines. This approach was chosen to capture the principal collaborative links between researchers from different countries.

Burst detection in CiteSpace uses the Kleinberg algorithm, which models the document stream with an infinite-state automaton to extract meaningful structure from streams that arrive continuously over time [11]. These analyses can reveal fast-growing topics that persist for multiple years as well as those confined to a single year.
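The two-state core of Kleinberg's model can be sketched as follows. This is a minimal illustration, not CiteSpace's implementation: it assumes yearly counts of citations to one reference (`counts`) out of all citations that year (`totals`), a burst rate ratio `s`, and a transition cost scale `gamma`, and decodes the cheapest baseline/burst state sequence by Viterbi.

```python
from math import lgamma, log

def kleinberg_two_state(counts, totals, s=2.0, gamma=1.0):
    """Label each time step 0 (baseline) or 1 (burst) by Viterbi decoding
    of Kleinberg's two-state burst automaton over batched count data."""
    n = len(counts)
    p0 = sum(counts) / sum(totals)      # baseline rate of relevant events
    p1 = min(s * p0, 0.9999)            # elevated rate in the burst state

    def emit_cost(p, r, d):
        # Negative log binomial likelihood of r relevant events out of d
        choose = lgamma(d + 1) - lgamma(r + 1) - lgamma(d - r + 1)
        return -(choose + r * log(p) + (d - r) * log(1 - p))

    trans = gamma * log(n)              # cost of entering the burst state
    cost = [emit_cost(p0, counts[0], totals[0]),
            trans + emit_cost(p1, counts[0], totals[0])]
    back = []
    for t in range(1, n):
        # Leaving the burst state is free; entering it costs `trans`
        prev0 = 0 if cost[0] <= cost[1] else 1
        prev1 = 0 if cost[0] + trans <= cost[1] else 1
        c0 = min(cost[0], cost[1])
        c1 = min(cost[0] + trans, cost[1])
        back.append((prev0, prev1))
        cost = [c0 + emit_cost(p0, counts[t], totals[t]),
                c1 + emit_cost(p1, counts[t], totals[t])]
    state = 0 if cost[0] <= cost[1] else 1
    states = [state]
    for prev0, prev1 in reversed(back):
        state = prev0 if state == 0 else prev1
        states.append(state)
    return states[::-1]

# Hypothetical example: one reference's yearly citations out of all
# citations that year; the three high-count middle years form a burst.
states = kleinberg_two_state([2, 2, 2, 2, 20, 25, 22, 2, 2, 2], [100] * 10)
```

The burst "strength" reported by tools such as CiteSpace corresponds, roughly, to the total cost saved by using the burst state over the flagged interval; the length of a red bar in a burst chart is the run of 1s in this state sequence.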

Rationale for Analysis Selection

The aforementioned techniques were chosen a priori due to their widespread use and effectiveness in bibliometric studies. They provide robust and complementary insights into productivity, impact, and collaborative patterns within the research field.


Publication Trend

Since 2004, the number of papers published annually has gradually increased (Figure 2A). We divided this span into 3 periods: from 2004 to 2010, growth was slow, with fewer than 150 papers published per year, indicating that the field had not yet captured researchers’ attention. From 2011 to 2018, the volume of publications gradually increased, indicating growing interest in the field. After 2019, the number of publications rose rapidly, peaking in 2021, which suggests that the field has received widespread attention since then.

Figure 2. Trend chart of publications in the past 20 years. (A) Annual publication count chart. (B) Line chart of national publication count. (C) Heatmap of national publication count.

Country or Region and Institution Contributions

Figure 2B and C show the annual number of publications from the top 10 countries over the past 2 decades. The top 5 countries in the field are the United States, the United Kingdom, Canada, Germany, and China. The United States accounts for 30.86% (1626/5268) of the total volume of publications, significantly surpassing other countries.

Among the top 10 countries or regions in terms of the number of published papers, the United States had a citation count of 44,051, far exceeding all other countries or regions. Its citation-per-publication ratio (27.13) ranks third among all countries or regions, which suggests a generally high quality of the published papers. The United Kingdom had the second-highest number of published papers (576 papers) and ranked second in terms of citation count (15,929 citations). The cooperation network, as shown in Figure 3A, indicates close collaboration between the United States and the United Kingdom, which are the highest producers.

A total of 5291 institutions have systematically published papers related to the OSCE. Among the top 10 institutions in terms of publication volume, 6 are from the United States, 2 are from the United Kingdom, and 2 are from Canada (Figure 3B).

Figure 3. Network graph of national and institutional collaborations. (A) Network graph of national collaborations. (B) Network graph of institutional collaborations. The bubble size represents the number of publications. WoS: Web of Science.

Journals’ Contributions

Tables 1 and 2 list the top 10 journals with the highest output and the most cocitations, respectively. BMC Medical Education, with 227 of 5268 papers (4.31% of publications in the field), published the most papers, followed by Medical Teacher (179/5268, 3.40%), Medical Education (132/5268, 2.51%), and Journal of Surgical Education (66/5268, 1.25%). Among the top 10 most productive journals, Academic Medicine has the highest impact factor (7.4), and all 10 fall within the Q1 or Q2 quartiles.

The influence of a journal can be gauged by the frequency with which it is cocited, which indicates whether the journal has made a significant impact on the scientific community. According to Table 2, the most commonly cocited journal is Medical Education with 1868 citations, followed by Academic Medicine with 1775 citations and Medical Teacher with 1597 citations. Among the top 10 journals by cocitation count, The Lancet was cited 697 times and has the highest impact factor (168.9) of these journals. All journals in the most cocited list are in the Q1 zone, with the exception of Teaching and Learning in Medicine (Q3).

Table 1. Top 10 most productive journals.
Rank | Journal | Papers (N=5268), n (%) | IFa | Quartile in category
1 | BMC Medical Education | 227 (4.31) | 3.6 | Q1
2 | Medical Teacher | 179 (3.40) | 4.7 | Q1
3 | Medical Education | 132 (2.51) | 7.1 | Q1
4 | Journal of Surgical Education | 66 (1.25) | 2.9 | Q2
5 | Academic Medicine | 64 (1.21) | 7.4 | Q1
6 | Patient Education and Counseling | 64 (1.21) | 3.5 | Q2
7 | Advances in Health Sciences Education | 60 (1.14) | 4.0 | Q1
8 | American Journal of Pharmaceutical Education | 59 (1.12) | 3.3 | Q2
9 | PLoS One | 59 (1.12) | 3.7 | Q2
10 | Nurse Education Today | 56 (1.06) | 3.9 | Q1

aIF: impact factor.

Table 2. Top 10 journals with the highest number of cocitations. Cocited journals refer to 2 or more journals that are simultaneously cited in the reference lists of other research papers.
Rank | Cited journal | Cocitations, n | IFa (2020) | Quartile in category
1 | Medical Education | 1868 | 4.7 | Q1
2 | Academic Medicine | 1775 | 7.4 | Q1
3 | Medical Teacher | 1597 | 4.7 | Q1
4 | BMC Medical Education | 941 | 3.6 | Q1
5 | JAMA—Journal of the American Medical Association | 931 | 120.7 | Q1
6 | British Medical Journal | 827 | 107.7 | Q1
7 | Advances in Health Sciences Education | 802 | 4.0 | Q1
8 | The Lancet | 697 | 168.9 | Q1
9 | New England Journal of Medicine | 694 | 158.5 | Q1
10 | Teaching and Learning in Medicine | 599 | 2.5 | Q3

aIF: impact factor.

Authors and Cocited Authors' Contributions

Table 3 lists the top 10 authors with the most published papers on the OSCE. Together, these 10 authors have published 185 papers, accounting for 3.51% of all papers (N=5268) in the field. Sondra Zabar has the highest number of publications (26), followed by Debra Pugh with 22, Timothy J Wood with 20, and Susan Humphrey-Murto with 19. Further analysis indicates that among the top 10 authors, 4 are from the United States, 3 are from Canada, 2 are from Australia, and 1 is from China. CiteSpace visualizes the network of relationships between authors (Figure 4).

Table 4 displays the top 10 most cocited authors. A total of 148 authors have been cited more than 50 times, indicating that their research has a high reputation and influence. The largest nodes correspond to the most cocited authors: Ronald M Harden with 751 citations, Brian D Hodges with 330 citations, and George E Miller with 222 citations.

Table 3. Top 10 most productive authors.
Rank | Author | Papers, n | Location
1 | Zabar, Sondra | 26 | United States
2 | Pugh, Debra | 22 | Canada
3 | Wood, Timothy J | 20 | Canada
4 | Humphrey-Murto, Susan | 19 | Canada
5 | Gillespie, Colleen | 17 | United States
6 | Shulruf, Boaz | 17 | Australia
7 | Yang, Ying-Ying | 17 | China
8 | Durning, Steven J | 16 | United States
9 | Fuller, Richard | 16 | Australia
10 | Park, Yoon Soo | 15 | United States
Table 4. Top 10 most cocited authors.
Rank | Cocited author | Citations, n
1 | Harden, Ronald M | 751
2 | Hodges, Brian D | 330
3 | Miller, George E | 222
4 | Epstein, Ronald M | 194
5 | van der Vleuten, Cees PM | 173
6 | Wass, Valerie | 172
7 | Khan, Kamran Z | 164
8 | Regehr, Glenn | 162
9 | Cook, David A | 160
10 | Downing, Steven M | 156
Figure 4. Network diagram of author collaborations. The bubble size represents the number of publications.

Analysis of Highly Cited References

Over the time span from 2004 to 2023, the cocitation network comprised 1053 nodes and 3508 links (Figure 5). According to the top 10 papers by cocitation frequency (Table 5), the most cocited reference is from the journal Advances in Medical Education and Practice (impact factor=2.0), titled “An evaluative study of Objective Structured Clinical Examination (OSCE): students and examiners perspectives” [12]. The first author of this paper is Md Anwarul Azim Majumder. The paper posits that OSCE is the gold standard and universal form for assessing medical students’ clinical competence in a comprehensive, reliable, and effective manner.

Figure 5. Network diagram of cocited references.
Table 5. Top 10 highest cited references.
Rank | Title | Journal | IFa (2021) | First author | Total citations, n
1 | An evaluative study of Objective Structured Clinical Examination (OSCE): students and examiners perspectives [12] | Advances in Medical Education and Practice | 2.0 | Majumder, Md Anwarul Azim | 38
2 | Implementing an online OSCE during the COVID-19 pandemic [13] | Journal of Dental Education | 2.3 | Kakadia, Rahen | 31
3 | Diagnostic and statistical manual of mental disorders [14] | Psychiatry Research | 11.3 | Mittal, Vijay A | 31
4 | A systematic review of the reliability of Objective Structured Clinical Examination scores [15] | Medical Education | 7.1 | Brannick, Michael T | 30
5 | Twelve tips for developing an OSCE that measures what you want [16] | Medical Teacher | 4.7 | Daniels, Vijay John | 30
6 | Is the OSCE a feasible tool to assess competencies in undergraduate medical education? [17] | Medical Teacher | 4.7 | Patricio, Madalena F | 29
7 | Techniques for measuring clinical competence: Objective Structured Clinical Examinations [18] | Medical Education | 7.1 | Newble, David | 26
8 | Assessment in medical education [19] | New England Journal of Medicine | 158.5 | Epstein, Ronald M | 26
9 | Assessing communication skills of medical students in Objective Structured Clinical Examinations (OSCE)—a systematic review of rating scales [20] | PLoS One | 3.7 | Cömert, Musa | 26
10 | Twelve tips for conducting a virtual OSCE [21] | Medical Teacher | 4.7 | Hopwood, Jenny | 26

aIF: impact factor.

Keyword Analysis

Keyword analysis offers a quick view of a field's focus and direction. Based on keyword co-occurrence in VOSviewer, the most frequent keyword is “education” (n=677 occurrences), followed by “performance” (n=536), “competence” (n=458), and “skills” (n=449; Table 6).

Table 6. Top 20 keywords co-occurrence frequencies.
Rank | Keyword | Co-occurrences, n
1 | Education | 677
2 | Performance | 536
3 | Competence | 458
4 | Skills | 449
5 | Reliability | 371
6 | Assessment | 342
7 | Students | 337
8 | Validity | 329
9 | Simulation | 284
10 | Medical education | 264
11 | Diagnosis | 228
12 | Care | 217
13 | Prevalence | 207
14 | Medical students | 197
15 | Management | 196
16 | Medical education | 171
17 | Curriculum | 168
18 | Communication | 161
19 | Impact | 156
20 | Clinical skills | 147
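As a minimal sketch of how such co-occurrence counts arise: two keywords co-occur when they appear together in one paper's keyword list, and a pair's frequency is the number of papers containing both. The papers below are illustrative, not taken from the study's data.

```python
from itertools import combinations
from collections import Counter

def cooccurrence(papers):
    """Count, for every keyword pair, the number of papers listing both."""
    pair_counts = Counter()
    for keywords in papers:
        # sort so each unordered pair has one canonical key
        for a, b in combinations(sorted(set(keywords)), 2):
            pair_counts[(a, b)] += 1
    return pair_counts

# Hypothetical keyword lists for three papers
papers = [
    ["education", "performance", "competence"],
    ["education", "skills"],
    ["education", "performance"],
]
counts = cooccurrence(papers)
```

VOSviewer builds its keyword maps from exactly this kind of pair matrix, with node size reflecting a keyword's total occurrences and edge weight the pair count.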

The Burst of Cocited References and Keywords

With CiteSpace, we identified the 50 strongest citation bursts in the field related to the OSCE [12,13,15-62]. The reference with the strongest burst (strength 15.91) is a paper published in Medical Education titled “A systematic review of the reliability of Objective Structured Clinical Examination scores” [15], whose first author is Michael T Brannick. The paper describes OSCEs as a series of simulated tasks used to assess medical practitioners’ skills in diagnosing and treating patients. Of the 50 references, 47 (94%) were published between 2004 and 2023, indicating that these papers have been frequently cited over nearly 20 years. Notably, 24 of these papers are currently at a citation peak (Figure 6A [12,13,15-62]), suggesting that research related to the OSCE will continue to receive significant attention.

Among the 768 keywords with citation bursts in the field, we focused on the 50 with the most significant surges (Figure 6B), which represent the current hotspots and likely future research directions.

Figure 6. Citation burst graph (A) and keyword burst graph (B; sorted by the beginning year of the burst). Blue bars indicate the period during which the reference was available; red bars indicate the citation burst period.

Principal Findings

This study is pioneering in its bibliometric approach to OSCE, encapsulating a comprehensive view of the dynamic research trends in this field. By analyzing the bibliometric data internationally, we have mapped out collaboration networks, identified prevailing research directions, and forecasted potential future developments in OSCE scholarship. The surge in OSCE-related publications since 2019 underscores the recognition of OSCEs as essential for evaluating health care practitioners, meeting the demands of modern medicine for more robust and comprehensive assessment methods to gauge clinical competency [22,63].

Despite this growth, the concentration of research output in countries like the United States, the United Kingdom, and Canada may reflect deeper issues of resource allocation and priority setting in medical education globally [64,65]. This suggests a need for a more nuanced discussion on the uneven geographical spread of OSCE research and its implications. The disparity in research contribution could hinder the global exchange of innovative practices and perspectives in medical education [66,67].

Furthermore, the bibliometric data point to the importance of technology in OSCEs, particularly the integration of virtual and augmented reality. However, to fully understand the implications of technological advances, a more detailed analysis is warranted. This should include how technology shapes the development of OSCEs, its impact on the validity and reliability of assessments, and the potential barriers to its widespread adoption [68-70].

The high concentration of publications in Q1 and Q2 quartile journals, especially those with a significant impact factor, attests to the intersection of OSCE research with impactful clinical education and outcomes. The association with prestigious journals underlines the extensive influence and critical importance of OSCEs across multiple medical specialties [71-73].

The prominence of a core group of scholars leading OSCE research suggests a centralization of expertise that could be diversified through broader international collaboration. Such collaboration could introduce various cultural and pedagogical perspectives into the OSCE discourse, thereby enriching both the practice and the research of OSCEs worldwide [74,75].

The keyword analysis reflects a continual focus on the foundational elements of clinical education, such as “education,” “performance,” “competence,” and “skills,” which are at the heart of the OSCE methodology. Emerging research trends suggest a shift toward the integration of innovative educational technologies and methodologies, enhancing both the OSCE process and its outcomes [76,77].

Comparison to the Literature

Our findings align with those of Lim et al [78], who identified issues with construct, content, and predictive validity in OSCEs in pharmacy education, as well as significant resource challenges. These concerns are echoed in our analysis, where similar validity issues and logistical constraints were observed. Other studies, such as those by Hodges et al [79], have highlighted persistent challenges in psychiatric OSCEs, emphasizing the need for continuous refinement and adaptation. Our study extends these discussions by mapping global trends and collaboration networks, underscoring the necessity for continuous re-evaluation and innovation in OSCE methodologies.

Implications of Findings

The challenges associated with OSCEs suggest a need for evolving assessment methods that incorporate simulations, peer assessments, and reflective practices. The resource-intensive nature of OSCEs underscores the necessity for scalable and sustainable alternatives, such as virtual simulations. Policymakers and educators should leverage global collaboration networks to share best practices and develop adaptable, technology-enhanced assessment frameworks. This approach will help address validity concerns and logistical constraints, ensuring that educational assessments remain robust and relevant in the ever-evolving landscape of health care education.

Limitations

Our bibliometric analysis has limitations that may affect our findings. We only used data from the WoSCC database, potentially excluding studies not indexed there and leading to bias toward English-language literature. This limits the scope of our analysis and overlooks valuable contributions from non-English sources.

Suggestions

To address this, future research should involve a wider range of databases and languages [80,81]. Moreover, the data quality in our study may vary, affecting the credibility of our knowledge mapping. Therefore, caution is needed when interpreting results, and complementary research methods should be considered for a more comprehensive understanding of the field. Longitudinal studies are crucial to assess the impact of OSCEs on medical performance, connecting educational assessments with clinical practice and patient care [82,83].

Moreover, understanding how OSCEs adapt to different health care systems, cultural contexts, and specializations will provide insights into their scalability and adaptability. This is particularly relevant as the health care sector grapples with rapid changes and as medical education seeks to prepare health care professionals for diverse practice environments [19,84].

Conclusions

In conclusion, this bibliometric study not only reaffirms the enduring importance and evolutionary path of OSCEs within medical education but also emphasizes the need for OSCEs to evolve in step with broader health care transformations. The data-driven insights from this analysis should inform future research directions, influence policymaking, and refine educational strategies. By doing so, OSCEs can continue to serve as a dynamic, relevant, and innovative tool in the arsenal of clinical education and evaluation methods.

Data Availability

All data generated or analyzed during this study are included in this published article.

Authors' Contributions

HB conceived and designed the ideas for the paper. HB, LZ, XH, and SL participated in all data collection and processing. HB was the major contributor in organizing records and drafting the manuscript. All authors proofread and approved the manuscript.

Conflicts of Interest

None declared.

  1. Criscione-Schreiber L. Turning Objective Structured Clinical Examinations into Reality. Rheum Dis Clin North Am. Feb 2020;46(1):21-35. [CrossRef] [Medline]
  2. Alkhateeb N, Salih AM, Shabila N, Al-Dabbagh A. Objective structured clinical examination: Challenges and opportunities from students' perspective. PLoS One. 2022;17(9):e0274055. [FREE Full text] [CrossRef] [Medline]
  3. Jünger J, Schäfer S, Roth C, Schellberg D, Friedman Ben-David M, Nikendei C. Effects of basic clinical skills training on objective structured clinical examination performance. Med Educ. Oct 2005;39(10):1015-1020. [CrossRef] [Medline]
  4. Gauthier É. Bibliometric analysis of scientific and technological research: a user's guide to the methodology. Science and Technology Redesign Project, CiteSeer. 1998. URL: https://www150.statcan.gc.ca/n1/en/catalogue/88F0006X1998008 [accessed 2024-08-26]
  5. Birch S, Lee MS, Alraek T, Kim T. Overview of Treatment Guidelines and Clinical Practical Guidelines That Recommend the Use of Acupuncture: A Bibliometric Analysis. J Altern Complement Med. Aug 2018;24(8):752-769. [CrossRef] [Medline]
  6. Wilson M, Sampson M, Barrowman N, Doja A. Bibliometric Analysis of Neurology Articles Published in General Medicine Journals. JAMA Netw Open. Apr 01, 2021;4(4):e215840. [FREE Full text] [CrossRef] [Medline]
  7. Wu H, Li Y, Tong L, Wang Y, Sun Z. Worldwide research tendency and hotspots on hip fracture: a 20-year bibliometric analysis. Arch Osteoporos. Apr 17, 2021;16(1):73. [CrossRef] [Medline]
  8. Vargas JS, Livinski AA, Karagu A, Cira MK, Maina M, Lu Y, et al. A bibliometric analysis of cancer research funders and collaborators in Kenya: 2007-2017. J Cancer Policy. Sep 2022;33:100331. [FREE Full text] [CrossRef] [Medline]
  9. van Eck NJ, Waltman L. Software survey: VOSviewer, a computer program for bibliometric mapping. Scientometrics. Aug 2010;84(2):523-538. [FREE Full text] [CrossRef] [Medline]
  10. Chen C. CiteSpace: A Practical Guide for Mapping Scientific Literature. New York, NY. Nova Science Publishers; 2016.
  11. Kleinberg J. Bursty and hierarchical structure in streams. Data Min Knowl Discov. 2003;7:373-397. [CrossRef]
  12. Majumder MAA, Kumar A, Krishnamurthy K, Ojeh N, Adams OP, Sa B. An evaluative study of Objective Structured Clinical Examination (OSCE): students and examiners perspectives. Adv Med Educ Pract. Jun 5, 2019;10:387-397. [FREE Full text] [CrossRef] [Medline]
  13. Kakadia R, Chen E, Ohyama H. Implementing an online OSCE during the COVID-19 pandemic. J Dent Educ. Jul 15, 2020;85(Suppl 1):1006-1008. [FREE Full text] [CrossRef] [Medline]
  14. Mittal VA, Walker EF. Diagnostic and statistical manual of mental disorders. Psychiatry Res. Aug 30, 2011;189(1):158-159. [FREE Full text] [CrossRef] [Medline]
  15. Brannick MT, Erol-Korkmaz HT, Prewett M. A systematic review of the reliability of Objective Structured Clinical Examination scores. Med Educ. Dec 2011;45(12):1181-1189. [CrossRef] [Medline]
  16. Daniels VJ, Pugh D. Twelve tips for developing an OSCE that measures what you want. Med Teach. Dec 2018;40(12):1208-1213. [CrossRef] [Medline]
  17. Patrício MF, Julião M, Fareleira F, Carneiro AV. Is the OSCE a feasible tool to assess competencies in undergraduate medical education? Med Teach. Jun 2013;35(6):503-514. [CrossRef] [Medline]
  18. Newble D. Techniques for measuring clinical competence: Objective Structured Clinical Examinations. Med Educ. Feb 2004;38(2):199-203. [CrossRef] [Medline]
  19. Epstein RM. Assessment in medical education. N Engl J Med. 2007;356(4):387-396. [CrossRef] [Medline]
  20. Cömert M, Zill JM, Christalle E, Dirmaier J, Härter M, Scholl I. Assessing communication skills of medical students in Objective Structured Clinical Examinations (OSCE)-a systematic review of rating scales. PLoS One. Mar 31, 2016;11(3):e0152717. [FREE Full text] [CrossRef] [Medline]
  21. Hopwood J, Myers G, Sturrock A. Twelve tips for conducting a virtual OSCE. Med Teach. Jun 2021;43(6):633-636. [CrossRef] [Medline]
  22. Harden RM. Revisiting 'assessment of clinical competence using an objective structured clinical examination (OSCE)'. Med Educ. 2016;50(4):376-379. [CrossRef] [Medline]
  23. Wass V, Van der Vleuten C, Shatzer J, Jones R. Assessment of clinical competence. Lancet. 2001;357(9260):945-949. [CrossRef] [Medline]
  24. Epstein RM, Hundert EM. Defining and assessing professional competence. JAMA. 2002;287(2):226-235. [CrossRef] [Medline]
  25. Hodges B. OSCE! Variations on a theme by Harden. Med Educ. 2003;37(12):1134-1140. [CrossRef] [Medline]
  26. Barman A. Critiques on the objective structured clinical examination. Ann Acad Med Singap. 2005;34(8):478-482. [FREE Full text] [Medline]
  27. Davis DA, Mazmanian PE, Fordis M, Van Harrison R, Thorpe KE, Perrier L. Accuracy of physician self-assessment compared with observed measures of competence: a systematic review. JAMA. 2006;296(9):1094-1102. [CrossRef] [Medline]
  28. Rushforth HE. Objective Structured Clinical Examination (OSCE): review of literature and implications for nursing education. Nurse Educ Today. 2007;27(5):481-490. [CrossRef] [Medline]
  29. Turner JL, Dankoski ME. Objective structured clinical exams: a critical review. Fam Med. 2008;40(8):574-578. [Medline]
  30. American Psychiatric Association. Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition, Text Revision (DSM-5-TR). Washington, DC. American Psychiatric Publishing; 2022.
  31. Pell G, Fuller R, Homer M, Roberts T, International Association for Medical Education. How to measure the quality of the OSCE: A review of metrics - AMEE guide no. 49. Med Teach. 2010;32(10):802-811. [FREE Full text] [CrossRef] [Medline]
  32. Brand HS, Schoonheim-Klein M. Is the OSCE more stressful? Examination anxiety and its consequences in different assessment methods in dental education. Eur J Dent Educ. 2009;13(3):147-153. [CrossRef] [Medline]
  33. Selim AA, Ramadan FH, El-Gueneidy MM, Gaafer MM. Using Objective Structured Clinical Examination (OSCE) in undergraduate psychiatric nursing education: is it reliable and valid? Nurse Educ Today. 2012;32(3):283-288. [CrossRef] [Medline]
  34. Mitchell ML, Henderson A, Groves M, Dalton M, Nulty D. The Objective Structured Clinical Examination (OSCE): optimising its value in the undergraduate nursing curriculum. Nurse Educ Today. 2009;29(4):398-404. [CrossRef] [Medline]
  35. American Psychiatric Association. Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition. Washington, DC. American Psychiatric Publishing; 2013.
  36. Griesser MJ, Beran MC, Flanigan DC, Quackenbush M, Van Hoff C, Bishop JY. Implementation of an Objective Structured Clinical Exam (OSCE) into orthopedic surgery residency training. J Surg Educ. 2012;69(2):180-189. [CrossRef] [Medline]
  37. Khan KZ, Ramachandran S, Gaunt K, Pushkar P. The Objective Structured Clinical Examination (OSCE): AMEE Guide No. 81. Part I: an historical and theoretical perspective. Med Teach. 2013;35(9):e1437-e1446. [CrossRef] [Medline]
  38. Kogan JR, Conforti L, Bernabeo E, Iobst W, Holmboe E. Opening the black box of clinical skills assessment via observation: a conceptual model. Med Educ. 2011;45(10):1048-1060. [CrossRef] [Medline]
  39. American Educational Research Association. Standards for Educational and Psychological Testing (2014 Edition). 2024. URL: https://www.aera.net/publications/books/standards-for-educational-psychological-testing-2014-edition [accessed 2024-09-26]
  40. Ilgen JS, Ma IWY, Hatala R, Cook DA. A systematic review of validity evidence for checklists versus global rating scales in simulation-based assessment. Med Educ. 2015;49(2):161-173. [CrossRef] [Medline]
  41. Shirwaikar A. Objective Structured Clinical Examination (OSCE) in pharmacy education - a trend. Pharm Pract (Granada). 2015;13(4):627-630. [FREE Full text] [CrossRef] [Medline]
  42. Harden RM. OSC Guide. 2016. URL: https://www.osc.ca/en/news-events/subscribe/osc-guide [accessed 2024-09-26]
  43. Johnston ANB, Weeks B, Shuker M, Coyne E, Niall H, Mitchell M, et al. Nursing students' perceptions of the Objective Structured Clinical Examination: an integrative review. Clin Simul Nurs. 2017;13(3):127-142. [FREE Full text] [CrossRef]
  44. Bevan J, Russell B, Marshall B. A new approach to OSCE preparation - PrOSCEs. BMC Med Educ. 2019;19(1):126. [FREE Full text] [CrossRef] [Medline]
  45. Lockyer J, Carraccio C, Chan MK, Hart D, Smee S, Touchie C, et al. ICBME Collaborators. Core principles of assessment in competency-based medical education. Med Teach. 2017;39(6):609-616. [CrossRef] [Medline]
  46. Khan R, Payne MWC, Chahine S. Peer assessment in the Objective Structured Clinical Examination: a scoping review. Med Teach. 2017;39(7):745-756. [CrossRef] [Medline]
  47. Boursicot K, Kemp S, Ong TH, Wijaya L, Goh SH, Freeman K, et al. Conducting a high-stakes OSCE in a COVID-19 environment. MedEdPublish (2016). 2020;9:54. [FREE Full text] [CrossRef] [Medline]
  48. Lara S, Foster CW, Hawks M, Montgomery M. Remote assessment of clinical skills during COVID-19: a virtual, high-stakes, summative pediatric Objective Structured Clinical Examination. Acad Pediatr. 2020;20(6):760-761. [FREE Full text] [CrossRef] [Medline]
  49. Graf J, Smolka R, Simoes E, Zipfel S, Junne F, Holderried F, et al. Communication skills of medical students during the OSCE: gender-specific differences in a longitudinal trend study. BMC Med Educ. 2017;17(1):75. [FREE Full text] [CrossRef] [Medline]
  50. Yeates P, Cope N, Hawarden A, Bradshaw H, McCray G, Homer M. Developing a video-based method to compare and adjust examiner effects in fully nested OSCEs. Med Educ. 2019;53(3):250-263. [FREE Full text] [CrossRef] [Medline]
  51. Norcini J, Anderson MB, Bollela V, Burch V, Costa MJ, Duvivier R, et al. 2018 Consensus framework for good assessment. Med Teach. 2018;40(11):1102-1109. [CrossRef] [Medline]
  52. Lewis KL, Bohnert CA, Gammon WL, Hölzer H, Lyman L, Smith C, et al. The association of standardized patient educators (ASPE) standards of best practice (SOBP). Adv Simul (Lond). 2017;2:10. [FREE Full text] [CrossRef] [Medline]
  53. Chong L, Taylor S, Haywood M, Adelstein BA, Shulruf B. The sights and insights of examiners in Objective Structured Clinical Examinations. J Educ Eval Health Prof. 2017;14:34. [FREE Full text] [CrossRef] [Medline]
  54. Shehata MH, Kumar AP, Arekat MR, Alsenbesy M, Mohammed Al Ansari A, Atwa H, et al. A toolbox for conducting an online OSCE. Clin Teach. 2021;18(3):236-242. [CrossRef] [Medline]
  55. Craig C, Kasana N, Modi A. Virtual OSCE delivery: the way of the future? Med Educ. 2020;54(12):1185-1186. [FREE Full text] [CrossRef] [Medline]
  56. Boyle JG, Colquhoun I, Noonan Z, McDowall S, Walters MR, Leach JP. Viva la VOSCE? BMC Med Educ. 2020;20(1):514. [FREE Full text] [CrossRef] [Medline]
  57. Blythe J, Patel NSA, Spiring W, Easton G, Evans D, Meskevicius-Sadler E, et al. Undertaking a high stakes virtual OSCE ("VOSCE") during Covid-19. BMC Med Educ. 2021;21(1):221. [FREE Full text] [CrossRef] [Medline]
  58. Dost S, Hossain A, Shehab M, Abdelwahed A, Al-Nusair L. Perceptions of medical students towards online teaching during the COVID-19 pandemic: a national cross-sectional survey of 2721 UK medical students. BMJ Open. 2020;10(11):e042378. [FREE Full text] [CrossRef] [Medline]
  59. Donn J, Scott JA, Binnie V, Bell A. A pilot of a virtual Objective Structured Clinical Examination in dental education. A response to COVID-19. Eur J Dent Educ. 2021;25(3):488-494. [FREE Full text] [CrossRef] [Medline]
  60. Boursicot K, Kemp S, Wilkinson T, Findyartini A, Canning C, Cilliers F, et al. Performance assessment: consensus statement and recommendations from the 2020 Ottawa conference. Med Teach. 2021;43(1):58-67. [CrossRef] [Medline]
  61. Hannan TA, Umar SY, Rob Z, Choudhury RR. Designing and running an online Objective Structured Clinical Examination (OSCE) on zoom: a peer-led example. Med Teach. 2021;43(6):651-655. [CrossRef] [Medline]
  62. Solà-Pola M, Morin-Fraile V, Fabrellas-Padrés N, Raurell-Torreda M, Guanter-Peris L, Guix-Comellas E, et al. The usefulness and acceptance of the OSCE in nursing schools. Nurse Educ Pract. 2020;43:102736. [CrossRef] [Medline]
  63. Harden RM, Gleeson FA. Assessment of clinical competence using an objective structured clinical examination (OSCE). Med Educ. 1979;13(1):41-54. [CrossRef] [Medline]
  64. Lee GB, Chiu AM. Assessment and feedback methods in competency-based medical education. Ann Allergy Asthma Immunol. 2022;128(3):256-262. [CrossRef] [Medline]
  65. Mathew MM, Thomas KA. Medical aptitude and its assessment. Natl Med J India. 2018;31(6):356-363. [FREE Full text] [CrossRef] [Medline]
  66. Zayyan M. Objective structured clinical examination: the assessment of choice. Oman Med J. 2011;26(4):219-222. [FREE Full text] [CrossRef] [Medline]
  67. Jiang Z, Ouyang J, Li L, Han Y, Xu L, Liu R, et al. Cost-effectiveness analysis in performance assessments: a case study of the objective structured clinical examination. Med Educ Online. 2022;27(1):2136559. [FREE Full text] [CrossRef] [Medline]
  68. Cook DA, Hatala R, Brydges R, Zendejas B, Szostek JH, Wang AT, et al. Technology-enhanced simulation for health professions education: a systematic review and meta-analysis. JAMA. 2011;306(9):978-988. [CrossRef] [Medline]
  69. Bajpai S, Semwal M, Bajpai R, Car J, Ho AHY. Health professions' digital education: review of learning theories in randomized controlled trials by the digital health education collaboration. J Med Internet Res. 2019;21(3):e12912. [FREE Full text] [CrossRef] [Medline]
  70. Cheng A, Lang T, Starr S, Pusic M, Cook D. Technology-enhanced simulation and pediatric education: a meta-analysis. Pediatrics. 2014;133(5):e1313-e1323. [CrossRef] [Medline]
  71. Wilkinson TJ, Wade WB, Knock LD. A blueprint to assess professionalism: results of a systematic review. Acad Med. 2009;84(5):551-558. [CrossRef] [Medline]
  72. du Preez RR, Pickworth GE, van Rooyen M. Teaching professionalism: a South African perspective. Med Teach. 2007;29(9):e284-e291. [CrossRef] [Medline]
  73. Mueller PS. Teaching and assessing professionalism in medical learners and practicing physicians. Rambam Maimonides Med J. 2015;6(2):e0011. [FREE Full text] [CrossRef] [Medline]
  74. Alinier G. A typology of educationally focused medical simulation tools. Med Teach. 2007;29(8):e243-e250. [CrossRef] [Medline]
  75. Fox-Robichaud AE, Nimmo GR. Education and simulation techniques for improving reliability of care. Curr Opin Crit Care. 2007;13(6):737-741. [CrossRef] [Medline]
  76. Eva KW, Regehr G. "I'll never play professional football" and other fallacies of self-assessment. J Contin Educ Health Prof. 2008;28(1):14-19. [CrossRef] [Medline]
  77. Colthart I, Bagnall G, Evans A, Allbutt H, Haig A, Illing J, et al. The effectiveness of self-assessment on the identification of learner needs, learner activity, and impact on clinical practice: BEME guide no. 10. Med Teach. 2008;30(2):124-145. [CrossRef] [Medline]
  78. Lim AS, Ling YL, Wilby KJ, Mak V. What's been trending with OSCEs in pharmacy education over the last 20 years? A bibliometric review and content analysis. Curr Pharm Teach Learn. 2024;16(3):212-220. [FREE Full text] [CrossRef] [Medline]
  79. Hodges BD, Hollenberg E, McNaughton N, Hanson MD, Regehr G. The psychiatry OSCE: a 20-year retrospective. Acad Psychiatry. 2014;38(1):26-34. [CrossRef] [Medline]
  80. Boulet J, Durning S. What we measure … and what we should measure in medical education. Med Educ. 2019;53(1):86-94. [CrossRef] [Medline]
  81. Lucey CR, Hauer KE, Boatright D, Fernandez A. Medical education's wicked problem: achieving equity in assessment for medical learners. Acad Med. 2020;95(12S):S98-S108. [CrossRef] [Medline]
  82. Tormey W. Education, learning and assessment: current trends and best practice for medical educators. Ir J Med Sci. 2015;184(1):1-12. [CrossRef] [Medline]
  83. Gröne O, Mielke I, Knorr M, Ehrhardt M, Bergelt C. Associations between communication OSCE performance and admission interviews in medical education. Patient Educ Couns. 2022;105(7):2270-2275. [CrossRef] [Medline]
  84. Min Simpkins AA, Koch B, Spear-Ellinwood K, St John P. A developmental assessment of clinical reasoning in preclinical medical education. Med Educ Online. 2019;24(1):1591257. [FREE Full text] [CrossRef] [Medline]


OSCE: Objective Structured Clinical Examination
WoS: Web of Science
WoSCC: Web of Science Core Collection


Edited by B Lesselroth; submitted 26.02.24; peer-reviewed by S Alkan, W Chou; comments to author 15.05.24; revised version received 17.05.24; accepted 19.08.24; published 30.09.24.

Copyright

©Hongjun Ba, Lili Zhang, Xiufang He, Shujuan Li. Originally published in JMIR Medical Education (https://mededu.jmir.org), 30.09.2024.

This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Medical Education, is properly cited. The complete bibliographic information, a link to the original publication on https://mededu.jmir.org/, as well as this copyright and license information must be included.