Published in Vol 12 (2026)

Preprints (earlier versions) of this paper are available at https://preprints.jmir.org/preprint/87980.
Evaluating Microlearning for Faculty Development in Medical Education: Mixed Methods Pilot Study

Research Letter

1Cardiovascular Education, Mayo Clinic, Rochester, MN, United States

2Cardiovascular Diseases, Mayo Clinic, Rochester, MN, United States

Corresponding Author:

Darci L Lammers, PhD

Cardiovascular Education

Mayo Clinic

200 1st St SW

Rochester, MN, 55905

United States

Phone: 1 507 284 2511

Email: lammers.darci@mayo.edu


This mixed methods pilot study evaluates the feasibility and effectiveness of microlearning for faculty development in cardiovascular education. Microlearning appears feasible and well-received for faculty development, offering a scalable, flexible approach.

JMIR Med Educ 2026;12:e87980

doi:10.2196/87980


Traditional faculty development relies on time-intensive, in-person sessions, limiting participation for clinician-educators balancing clinical and administrative duties [1,2].

Microlearning delivers brief, focused segments (≤15 min) aligned with objectives [3,4]. Grounded in cognitive science, it reduces cognitive load and enhances retention by presenting information in small units [5]. For busy clinician-educators, microlearning offers flexible, asynchronous learning that addresses time constraints and supports targeted, on-demand modules for skill application [3,4,6], and it improves engagement and aligns with digital learning preferences [6,7].

However, research has focused only on the short-term outcomes of microlearning, leaving gaps in sustained knowledge transfer and behavioral change [3,4,6]. Without accessible and time-efficient faculty development approaches, clinician-educators may continue to rely on informal or inconsistent training, potentially compromising the quality of educational assessments, learner outcomes, and the validity of continuing medical education (CME) activities.

We selected CME multiple‑choice question (MCQ) development as our intervention topic based on an internal needs analysis indicating that faculty responsible for creating CME assessment items receive little or no structured guidance in item writing. This task was identified locally as a high‑frequency responsibility in which faculty desired more support. MCQ development serves as a practical context to evaluate microlearning for clinician‑educators.

This mixed methods pilot evaluates microlearning feasibility and effectiveness in cardiovascular faculty development by assessing learning transfer, satisfaction, and 4‑month knowledge application.


We conducted a sequential explanatory study guided by Kirkpatrick’s framework [8] to assess a microlearning module for faculty development in cardiovascular education. The Mayo Clinic Institutional Review Board reviewed the study and determined it to be exempt. Participants provided informed consent and could withdraw any time. Those completing all components received US $200 remuneration. All data were deidentified. Quantitative analysis was prioritized, with qualitative interviews used to explain and contextualize test results.

Cardiology clinician‑educators responsible for CME MCQs were recruited via email. Eligibility required current responsibility for board‑style MCQ development and no formal training or related coursework within 6 months. Of the 75 identified faculty, 34 met the criteria; 8 enrolled and completed all the components.

The pretest (18 items across 4 sections; total 400 points, scored in the learning management system) preceded each corresponding microlearning segment (Multimedia Appendix 1). The module comprised 4 short videos, a quick‑reference guide, and an MCQ‑writing template, delivered asynchronously via the cardiology CME learning management system for 3 months. After a 15‑item satisfaction survey (Multimedia Appendix 2), access was discontinued. An identical posttest occurred 4 months after completion to assess retention; 2 weeks later, semistructured Teams interviews explored application and perceptions (Multimedia Appendix 3).

We compared pre/post scores using the 2‑sided Wilcoxon signed-rank test (reporting Pratt method including ties), calculated matched-pairs rank-biserial effect sizes, and summarized distributions with medians (IQRs).
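As a sketch, the analysis above can be reproduced with standard scientific Python tools. The scores below are hypothetical placeholders (the study's raw data are not publicly available), and the rank-biserial calculation shown is one common formulation from the signed ranks, not necessarily the authors' exact procedure.

```python
# Illustrative sketch of the pre/post comparison; scores are hypothetical
# placeholders, not the study data.
import numpy as np
from scipy import stats

pre = np.array([350.0, 340.0, 366.0, 389.0, 320.0, 400.0, 366.0, 375.0])
post = np.array([400.0, 362.0, 400.0, 400.0, 310.0, 400.0, 366.0, 390.0])

# Two-sided Wilcoxon signed-rank test.
# zero_method="wilcox" drops tied (zero-difference) pairs;
# zero_method="pratt" retains them in the ranking, as reported in the paper.
res_wilcox = stats.wilcoxon(pre, post, zero_method="wilcox")
res_pratt = stats.wilcoxon(pre, post, zero_method="pratt")

# Matched-pairs rank-biserial correlation from the signed ranks:
# r = (sum of positive ranks - sum of negative ranks) / total rank sum.
diff = post - pre
nz = diff[diff != 0]                   # informative (non-tied) pairs
ranks = stats.rankdata(np.abs(nz))     # ranks of |differences|
r_rb = (ranks[nz > 0].sum() - ranks[nz < 0].sum()) / ranks.sum()

print(res_wilcox.pvalue, res_pratt.pvalue, r_rb)
```

With only 8 pairs, exact P values are used and the rank-biserial correlation gives a sample-size-independent effect size alongside them.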

Survey responses were summarized descriptively and used to inform preliminary qualitative themes. We conducted a structured hybrid deductive–inductive analysis. The lead investigator completed line-by-line coding using an a priori codebook derived from the interview guide, microlearning constructs, and Kirkpatrick's model, with inductive codes added as needed. An education researcher not involved in the study independently reviewed all coded transcripts. Discrepancies were resolved through consensus, and the codebook was refined iteratively.

Quantitative and qualitative data were integrated by the lead and co-lead investigators, who jointly analyzed test scores, survey responses, and interview transcripts. Quantitative results informed qualitative coding, enabling exploration of trends and outliers. Both strands were interpreted together to provide a comprehensive understanding of the module’s impact.


Among the 8 completers, the median pretest score was 366.07 (IQR 338.93-389.28), and the median posttest score was 400.00 (IQR 361.96-400.00). Of the 8 paired scores, 5 improved, 2 were unchanged (ties), and 1 decreased; thus, informative pairs were n=6.

A Wilcoxon signed-rank test excluding ties yielded W=5.0, P=.25; the Pratt method, which retains ties in the ranking, yielded W=7.0, P=.18. The matched-pairs rank-biserial correlation was 0.52, suggesting a moderate positive effect, although the difference did not reach statistical significance.

Satisfaction survey responses showed strong agreement across all dimensions, with participants endorsing the module’s relevance, clarity, and flexibility (Table 1).

Table 1. Quantitative summary of the satisfaction survey responses (N=8).
Survey item | Strongly agree, n (%) | Agree, n (%)
Videos were engaging | 6 (75) | 2 (25)
Video length was appropriate | 8 (100) | 0 (0)
Modular format was effective | 7 (88) | 1 (12)
Prepared to write board-style review questions | 6 (75) | 2 (25)
Quick reference guides were valuable | 8 (100) | 0 (0)
Pretest helped gauge prior knowledge | 5 (63) | 3 (37)
Pretest guided learning | 5 (63) | 3 (37)
Posttest reinforced learning | 5 (63) | 3 (37)
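The n (%) columns follow directly from response counts over N=8. A minimal sketch of how such a summary can be tabulated, using a hypothetical response list for one item (the real item-level data are in Multimedia Appendix 2):

```python
# Hypothetical Likert responses for one survey item (N=8); illustrative only.
from collections import Counter

responses = ["Strongly agree"] * 6 + ["Agree"] * 2

n = len(responses)
counts = Counter(responses)
summary = {level: f"{c} ({round(100 * c / n)})" for level, c in counts.items()}
print(summary)  # → {'Strongly agree': '6 (75)', 'Agree': '2 (25)'}
```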

Qualitative analysis revealed three main themes: (1) appreciation for the concise, flexible format; (2) direct application of learned principles in educational practice; and (3) high perceived value and satisfaction, alongside barriers and suggestions for improvement (Table 2). Time constraints remained a barrier to engaging in faculty development, but microlearning was widely endorsed as effective and scalable.

Table 2. Key themes from qualitative interviews.
Theme | Subthemes | Representative findings
Value of microlearning format
  • Flexibility and time efficiency
  • Reduced cognitive load
  • Point-of-need learning
Microlearning was praised for fitting into busy schedules and enabling learning in short, focused bursts
Knowledge application and transfer
  • Direct application to CMEa
  • Immediate use of quick-reference guides and template
  • Knowledge shared with colleagues
Learned principles were immediately used in CME MCQb development and shared with colleagues. Modules will serve as an ongoing reference
Perceived value and satisfaction
  • High ratings (excellent/very good)
  • Clear, concise, relevant content
  • Multimedia quality
Participants appreciated the clarity, relevance, and production quality of the modules and resources
Barriers to faculty development
  • Time constraints
  • Administrative duties
  • Lack of protected time from clinical responsibilities
Common barriers included competing clinical/administrative demands and lack of dedicated time for learning
Suggestions for improvement
  • Familiar platforms
  • Time estimates for modules
  • Follow-up opportunities
  • Resource access
Recommendations included hosting on familiar organizational platforms, adding time estimates, and offering structured follow-up with an expert

aCME: continuing medical education.

bMCQ: multiple choice question.


We demonstrated that a microlearning intervention for cardiovascular faculty development was feasible, well-received, and associated with improved knowledge scores and perceived application of skills at 4 months, aligning with the objectives to assess learning transfer, satisfaction, and sustained knowledge application.

Findings align with prior evidence that microlearning enhances engagement, skill acquisition, and retention in health education [1,3]. The module’s brevity, relevance, and asynchronous access likely contributed to its effectiveness. The 4-month follow-up addresses a gap in prior faculty-focused studies, which often lack longer-term data [2]. Microlearning can improve learning outcomes and self-efficacy, further supporting its value in CME [9].

By integrating quantitative data from pretests and posttests and satisfaction surveys with qualitative insights from interviews, investigators contextualized statistical findings with participant perspectives. Quantitative results informed preliminary qualitative coding, enabling exploration of trends and outliers. This approach provided a more comprehensive understanding of the module’s impact and feasibility, revealing factors and barriers not apparent from a single data type.

Mixed methods revealed both measurable gains and contextual insights, including barriers such as time constraints. Participant recommendations such as hosting modules on familiar platforms and providing time estimates for each section may further improve engagement.

Limitations include the single-institution setting, small sample size, exclusion of tied pairs in the primary analysis, lack of a control group, and potential response bias. The 4-month follow-up may not capture durable behavior change. Integrating quantitative and qualitative methods enhanced understanding but may introduce interpretive complexity.

Overall, microlearning appears to be a scalable, flexible approach to faculty development, well-suited to clinical educators. Institutions should consider implementing microlearning modules with structured follow-up to reinforce learning. Future research should use larger, more diverse samples, include control groups, and extend follow-up to evaluate long-term behavioral and organizational outcomes.

Acknowledgments

The authors would like to thank Metta A Kuehntopp, MEd, and Jeffrey C Williams for their part in the creation of the intervention used in this study. We thank Patricia K Guthrie for her valuable contributions to developing content within our learning management system. No generative artificial intelligence was used in the preparation of this manuscript.

Data Availability

The datasets generated and analyzed during this study are not publicly available because participant consent for data sharing was not obtained. Data may be available from the corresponding author upon reasonable request and with appropriate institutional approvals, provided such sharing complies with participant privacy and ethical guidelines.

Funding

This research was funded by the Mayo Clinic College of Medicine and Science Office of Applied Scholarship and Education Science Endowment for Education Research Award. The study was funded for US $12,000 for a one-year period starting January 2024.

Authors' Contributions

Conceptualization: DLL (lead), MWC (supporting)

Data curation: DLL

Formal analysis: DLL

Funding acquisition: DLL

Investigation: DLL

Methodology: DLL (lead), MWC (supporting)

Project administration: DLL (lead), JBG (supporting), MWC (supporting)

Resources: DLL (lead), JBG (supporting)

Supervision: MWC

Validation: DLL

Visualization: DLL (lead), MWC (supporting)

Writing – original draft: DLL (lead), MWC (supporting)

Writing – review & editing: DLL (lead), MWC (supporting), JBG (supporting), JAL (supporting)

Conflicts of Interest

None declared.

Multimedia Appendix 1

Microlearning pretest and posttest.

DOCX File , 499 KB

Multimedia Appendix 2

Post course evaluation.

DOCX File , 31 KB

Multimedia Appendix 3

Follow-up interview guide.

DOCX File , 29 KB

  1. Dyrbye L, Bergene A, Leep HA, Billings H. Reimagining faculty development deployment: a multipronged, pragmatic approach to improve engagement. Acad Med. Sep 2022;97(9):1322-1330. [CrossRef]
  2. Cook DA, Steinert Y. Online learning for faculty development: a review of the literature. Med Teach. Nov 2013;35(11):930-937. [CrossRef] [Medline]
  3. De Gagne JC, Park HK, Hall K, Woodward A, Yamane S, Kim SS. Microlearning in health professions education: scoping review. JMIR Med Educ. Jul 23, 2019;5(2):e13997. [FREE Full text] [CrossRef] [Medline]
  4. Bowler C, Foshee C, Haggar F, Simpson D, Schroedl C, Billings H. Got 15? try faculty development on the Fly: a snippets workshop for microlearning. MedEdPORTAL. Jun 14, 2021;17:11161. [FREE Full text] [CrossRef] [Medline]
  5. Taylor A, Hung W. The effects of microlearning: a scoping review. Education Tech Research Dev. Jan 26, 2022;70(2):363-395. [CrossRef]
  6. Monib WK, Qazi A, Apong RA. Microlearning beyond boundaries: a systematic review and a novel framework for improving learning outcomes. Heliyon. Jan 30, 2025;11(2):e41413. [FREE Full text] [CrossRef] [Medline]
  7. Moore R, Hwang W. A systematic review of mobile-based microlearning in adult learner contexts. Educ Technol Soc. 2024:27-146. [CrossRef]
  8. Kirkpatrick J, Kirkpatrick W. Kirkpatrick's Four Levels of Training Evaluation. Alexandria, VA. Association for Talent Development; 2016.
  9. Zarshenas L, Mehrabi M, Karamdar L, Keshavarzi MH, Keshtkaran Z. The effect of micro-learning on learning and self-efficacy of nursing students: an interventional study. BMC Med Educ. Sep 07, 2022;22(1):664. [FREE Full text] [CrossRef] [Medline]


CME: continuing medical education
MCQ: multiple choice question


Edited by A Stone; submitted 17.Nov.2025; peer-reviewed by BS Chisholm, S Otero; comments to author 15.Dec.2025; revised version received 18.Feb.2026; accepted 19.Feb.2026; published 11.Mar.2026.

Copyright

©Darci L Lammers, Jeffrey B Geske, Jane A Linderbaum, Michael W Cullen. Originally published in JMIR Medical Education (https://mededu.jmir.org), 11.Mar.2026.

This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Medical Education, is properly cited. The complete bibliographic information, a link to the original publication on https://mededu.jmir.org/, as well as this copyright and license information must be included.