Published in Vol 11 (2025)

Preprints (earlier versions) of this paper are available at https://preprints.jmir.org/preprint/63882.
Implementation Outcomes of Reusable Learning Objects in Health Care Education Across Three Malaysian Universities: Evaluation Using the RE-AIM Framework

1Department of Primary Care Medicine, Faculty of Medicine, Universiti Malaya, Lembah Pantai, Kuala Lumpur, Malaysia

2Department of Multimedia, Faculty of Computer Science and Information Technology, Universiti Putra Malaysia, Selangor, Malaysia

3Department of Pathology, Faculty of Medicine and Health Sciences, Universiti Putra Malaysia, Selangor, Malaysia

4Department of Learning, Informatics, Management and Ethics, Karolinska Institutet, Stockholm, Sweden

5NettOp, Department of E-Learning Development, University of Stavanger, Stavanger, Norway

6School of Health Sciences, University of Nottingham, Nottingham, United Kingdom

7Centre for Population Health Research and Implementation, SingHealth Regional Health System, Singapore, Singapore

8Duke-NUS Medical School, Singapore, Singapore

9UMeHealth Unit, Faculty of Medicine, Universiti Malaya, Kuala Lumpur, Malaysia

10Dean's Office, Faculty of Medicine, Universiti Malaya, Kuala Lumpur, Malaysia

11Department of Building Surveying, Faculty of Built Environment, Universiti Malaya, Kuala Lumpur, Malaysia

12School of Biosciences, Faculty of Health & Medical Sciences, Taylor's University, Selangor, Malaysia

13School of Pharmacy, Faculty of Health & Medical Sciences, Taylor's University, Selangor, Malaysia

14Taylor's Digital, Taylor's University, Selangor, Malaysia

15Learning Innovation and Development, Centre for Future Learning, Taylor’s University, Selangor, Malaysia

16Department of Family Medicine, Faculty of Medicine and Health Sciences, Universiti Putra Malaysia, Serdang, Malaysia

Corresponding Author:

Hooi Min Lim


Background: Current evaluations of e-learning focus on learners’ knowledge gain, satisfaction, perceptions, and attitudes; few assess the implementation outcomes of e-learning resources in teaching and learning.

Objective: In this study, we used the RE-AIM (Reach, Effectiveness, Adoption, Implementation, and Maintenance) framework to systematically evaluate the implementation outcomes of reusable learning objects (RLOs) in the context of health care education.

Methods: This study is a part of the Advancing Co-creation of RLOs to Digitise Healthcare Curriculum (ACoRD) project, wherein we developed and implemented 23 RLOs across 3 Malaysian universities for medical, pharmacy, and biomedical curricula. Implementation and dissemination strategies were employed. Data were collected using a self-administered web-based questionnaire and Google Analytics.

Results: This study reports a cumulative RLO access of 7622 users from 48 countries (reach). Users rated RLOs as very helpful (1452/2071, 70.1%) or helpful (601/2071, 29%). Preassessments and postassessments showed a significant improvement in the knowledge score (21 RLOs, P<.05) and confidence level (17 RLOs, P<.05) (effectiveness). All 3 Malaysian universities adopted RLOs in the fields of professional development, primary care medicine, medicine, pediatrics, nursing, pharmacy, and biomedicine (adoption). The percentage of users who completed RLOs ranged from 5.6% (10/179) to 85% (78/92), with nonbounced users (users who viewed more than one page) ranging from 16.3% (165/1014) to 88.5% (370/418) (implementation). In the 4 months following the completion of the ACoRD project, a total of 2107 users accessed RLOs (maintenance).

Conclusions: We systematically evaluated the implementation of e-learning resources by using the RE-AIM framework, informing future strategies to integrate e-learning innovations in real-world teaching and learning practices.

JMIR Med Educ 2025;11:e63882

doi:10.2196/63882

Keywords


Introduction
The use of technology in medical and health sciences education is on the rise, with the aim of enhancing the knowledge, skills, and practice of medical students. e-Learning, the process of acquisition and use of knowledge distributed and facilitated by electronic means [1], has been shown to be effective in not only improving knowledge but also increasing students’ satisfaction in medical education [2,3]. The application of e-learning objects in medical education promotes self-directed and personalized learning, serving as a complement to conventional didactic teaching in medical schools [4]. An e-learning object refers to a collection of digital materials structured in a meaningful way, aligned with a specific learning objective, and designed as an independent, self-contained unit of instructional material [4].

Although institutions are devoting substantial resources to the development of e-learning, successful implementation of e-learning remains challenging and is often not systematically measured [5]. A systematic review showed that the evaluation of e-learning tends to focus on the assessment of learners’ knowledge, satisfaction, perceptions, and attitudes, and there is a lack of rigorous evaluation of its implementation outcomes in the real-world environment [6]. Implementation outcomes are the effects of deliberate and purposive action to implement a new intervention or practice [7]. In the context of e-learning, implementation outcomes include the acceptability, adoption, appropriateness, feasibility, fidelity, and sustainability of e-learning resources in teaching and learning [8]. When e-learning objects are released as open content, reach and discoverability become important.

To date, several models have been used to evaluate the effectiveness and implementation of e-learning resources. The Kirkpatrick model, an outcome-focused model, has been widely employed to evaluate the effectiveness of an educational program across 4 levels: reaction, learning, behavior, and results [9]; however, it does not evaluate the implementation outcomes. The Context-Input-Process-Product model [10] is designed to evaluate both the process and product to determine the success of an educational program. Nevertheless, this model primarily focuses on defining the contextual factors to improve performance [11]. The Analysis, Design, Development, Implementation, and Evaluation model [12], an instructional design model, offers a structured approach to guide the development and implementation of educational programs but falls short of providing metrics for measuring implementation. Therefore, there is a need for an e-learning implementation outcome framework that provides a comprehensive and objective evaluation of the implementation of e-learning resources.

RE-AIM (Reach, Effectiveness, Adoption, Implementation, and Maintenance) is an established implementation science framework used in the health care setting to measure how effectively an evidence-based intervention is implemented [13]. It has been used in several studies to evaluate the implementation outcomes of health care provider education programs [14,15]. In this study, we aimed to use the RE-AIM framework to evaluate the implementation outcomes of reusable learning objects (RLOs), which are bite-sized, stand-alone interactive web-based resources with multimedia that focus on a single learning objective [16]. The findings from this study would provide evidence on the feasibility of using the RE-AIM framework in the context of health care education, describe the implementation strategies used in this project, and propose recommendations to improve the evaluation of e-learning resource implementation in higher educational institutions.

Methods
Study Design

This study is a part of the Advancing Co-creation of RLOs to Digitise Healthcare Curriculum (ACoRD) project, which was funded by the European Union Erasmus+ program [17]. It is a capacity-building project involving Universiti Malaya, Universiti Putra Malaysia, Taylor’s University, the University of Nottingham, Karolinska Institutet, and the University of Stavanger. The ACoRD project aimed to introduce innovative digital pedagogy methods by developing, evaluating, and disseminating high-quality, peer-reviewed RLOs that benefit health care and biomedical science learners in Malaysia. RLOs were developed, implemented, and evaluated over 3 phases (Figure 1). The implementation strategies for RLOs were developed and applied in both the development (preimplementation) and implementation phases.

Figure 1. Development, implementation, and evaluation of reusable learning objects in the ACoRD project. ACoRD: Advancing Co-creation of Reusable Learning Objects to Digitise Healthcare Curriculum; HELM: Health e-Learning and Media; KI: Karolinska Institutet; MERLOT: Multimedia Educational Resource for Learning and Online Teaching; RE-AIM: Reach, Effectiveness, Adoption, Implementation, and Maintenance; RLO: reusable learning object; TU: Taylor’s University; UM: Universiti Malaya; UoN: University of Nottingham; UoS: University of Stavanger; UPM: Universiti Putra Malaysia.

RLO Development Phase (Preimplementation)

RLOs were developed following the ASPIRE (Aim, Storyboarding, Populating, Implementing, Release, and Evaluation) framework [18]. The development phase began by establishing a coalition between the international institutions within the project. We evaluated the process and challenges of transnational partnership and knowledge transfer in a qualitative study, providing insights to guide effective partnerships in future e-learning development [17]. This was followed by stakeholder engagement at the 3 Malaysian institutions, involving higher education institution management, medical education units, the respective departments, and curriculum coordinators. To identify the gaps and topics for RLO development, the ACoRD team reviewed the existing curricula and determined how RLOs could be integrated into the existing topics, learning activities, and teaching delivery methods. We also conducted a needs assessment using a Delphi survey to compare how educators and learners prioritized topics for RLO development [19]. After determining the topics and scope of the 23 RLOs, we engaged learners in RLO cocreation, especially in the storyboard development step. We evaluated learners’ knowledge, confidence, and satisfaction with the storyboarding session using a pre-post survey and qualitatively explored their perceptions and experiences of the cocreation process [20]. We provided RLO specification writing training to the educators, who subsequently wrote the content for the RLOs. Once the RLO specifications were finalized, a content review was conducted before moving on to technical development. A small pilot test of RLO usability was conducted with student champions before full implementation.

RLO Implementation Phase

The 23 RLOs (Table 1) were made available through the open-access ACoRD repository website [21], the Health e-Learning and Media (HELM) Open repository at the University of Nottingham [22], the Multimedia Educational Resource for Learning and Online Teaching (MERLOT) [23], and the institutional online learning portal and curriculum guidebook.

The focus was on the actual integration of RLO resources into the health sciences curricula. At Universiti Malaya, RLOs were incorporated into various block postings for medicine and an undergraduate semester course for nursing. At Universiti Putra Malaysia, RLOs were used in a professionalism module block, and at Taylor’s University, they were embedded in semester courses in pharmacy and biomedical science. Various strategies were implemented to improve the usage of RLOs, including identifying early adopters (departments and educators) to use RLOs in teaching, integrating RLOs into the existing curricula, sending reminders to educators, collecting feedback from students, establishing an effective dissemination team, monitoring RLO performance and feeding the results back to educators, and periodically re-examining the implementation strategies.

Table 1. The list of reusable learning objects developed for undergraduate students in health care professions.
University, courseRLOa titles (n=23)
Universiti Malaya, Kuala Lumpur, Malaysia
Medicine
  • RLO 1: Prescription Writing—Back to Basics
  • RLO 2: Treatment of Acute Illness
  • RLO 3: Principles of Family Medicine
  • RLO 4: Factors Affecting Nutrition in the Older Person
  • RLO 5: Growth Faltering in Children
  • RLO 6: Identify Challenging Behavior in Health Care Settings
  • RLO 7: How to Conduct a Literature Search
Universiti Putra Malaysia, Selangor, Malaysia
Medicine
  • RLO 8: Confidentiality
  • RLO 9: Breaking Bad News
  • RLO 10: Doctor-Patient Relationship
  • RLO 11: Consent
  • RLO 12: Verbal and Nonverbal Skills
  • RLO 13: Counseling Skills
  • RLO 14: Ethical Reasoning
  • RLO 15: Social Media Professionalism
Taylor’s University, Selangor, Malaysia
Pharmacy
  • RLO 16: Using Nicotine Gum for Smoking Cessation
  • RLO 17: Using Nicotine Patches for Smoking Cessation
  • RLO 18: Using Varenicline for Smoking Cessation
Biomedical science
  • RLO 19: Body Metabolism
  • RLO 20: DNA Repair
  • RLO 21: DNA Replication
  • RLO 22: Cardiac Output
  • RLO 23: Nervous Regulation of the Heart

aRLO: reusable learning object.

Evaluation Phase

This study was performed across 3 Malaysian institutions (Universiti Malaya, Universiti Putra Malaysia, and Taylor’s University). Participants in this evaluation were users who used RLOs during the implementation period from May 1, 2020, to February 24, 2022. We used a universal sampling method for data collection.

We evaluated the implementation of RLOs by using the RE-AIM framework (Figure 2). The reach domain is defined as users’ access to RLOs. We assessed reach by measuring the absolute number of RLO users, the location of access, the types of devices used, and how users found out about our RLOs.
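As an illustration of how such reach metrics can be summarized, the following is a minimal Python sketch, assuming a hypothetical aggregated Google Analytics export file (ga_reach_export.csv) with country, device_category, and users columns; the file name and column names are illustrative and not the exact fields or tooling used in this study.

```python
# Minimal sketch (not the study's actual pipeline): summarizing reach
# metrics from a hypothetical Google Analytics export. The file name and
# column names (country, device_category, users) are illustrative only.
import pandas as pd

reach = pd.read_csv("ga_reach_export.csv")  # hypothetical aggregated export

total_users = reach["users"].sum()
print(f"Total users: {total_users}")
print(f"Countries reached: {reach['country'].nunique()}")

# Percentage of users by country and by device category
by_country = reach.groupby("country")["users"].sum().sort_values(ascending=False)
print((by_country / total_users * 100).round(1).head())

by_device = reach.groupby("device_category")["users"].sum()
print((by_device / total_users * 100).round(1))
```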

Figure 2. Definitions and outcomes measures of the RE-AIM framework. RE-AIM: Reach, Effectiveness, Adoption, Implementation, and Maintenance; RLO: reusable learning object.

For the effectiveness domain, we applied the 4-level Kirkpatrick model: reaction, learning, behavior, and results [24]. This model is commonly used in medical education to evaluate the effectiveness of a training program [25]. Reaction was assessed by gauging learners’ perceived helpfulness of RLOs, and learning was measured using pre-RLO and post-RLO knowledge and confidence scores. Behavior measures whether learners apply the newly acquired knowledge from RLOs in clinical practice, while results measure the ultimate impact of training using RLOs, including impact at an organizational level. We did not measure the behavior and results levels in this study due to limitations in the study design.

Adoption refers to the number of institutions that agreed to use RLOs in teaching and learning activities. We captured information on the institutions, departments, and existing modules in which RLOs were implemented. The implementation domain refers to users’ fidelity in using RLOs for their learning. We measured the number of users who completed RLOs and the number of nonbounced users. Nonbounced users are those who engaged with the site by clicking on links, visiting multiple pages, or triggering events that indicate interaction, while bounced users are those who visited the RLO web page but viewed only a single page without interacting further before exiting.
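To make the completion and nonbounce measures concrete, below is a minimal sketch, assuming a hypothetical per-page-view log (rlo_pageviews.csv) with user_id, rlo_id, page, and completed_flag columns; this is an illustrative reconstruction, not the actual Google Analytics schema or code used in the project.

```python
# Minimal sketch: deriving the implementation measures described above
# (users who completed an RLO, and nonbounced users who viewed >1 page)
# from a hypothetical page-view log. The schema is assumed for illustration.
import pandas as pd

views = pd.read_csv("rlo_pageviews.csv")  # hypothetical: one row per page view

per_user = views.groupby(["rlo_id", "user_id"]).agg(
    pages_viewed=("page", "nunique"),
    completed=("completed_flag", "max"),  # assumed flag: 1 if the final RLO page was reached
)

summary = per_user.groupby("rlo_id").agg(
    users=("pages_viewed", "size"),
    completed=("completed", "sum"),
    nonbounced=("pages_viewed", lambda s: (s > 1).sum()),
)
summary["completion_rate_%"] = (summary["completed"] / summary["users"] * 100).round(1)
summary["nonbounced_%"] = (summary["nonbounced"] / summary["users"] * 100).round(1)
print(summary)
```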

The maintenance domain refers to the extent to which the sustained use of RLOs becomes institutionalized or part of the routine teaching and learning practice. We measured the number of users who visited the RLO web page 4 months after the project ended.

Research Instruments and Data Collection

We used 2 instruments to capture the RLO implementation outcomes: questionnaires and Google Analytics (Table 2). The questionnaires were designed to gather user feedback and usage patterns. We used 2 types of questionnaires: (1) a web-based questionnaire administered at the completion of each RLO to assess users’ perception of its usefulness (on a 4-point Likert scale: 1-very unhelpful, 2-unhelpful, 3-helpful, and 4-very helpful) and how they discovered RLOs, and (2) a confidence Likert scale and an RLO-specific knowledge questionnaire administered before and immediately after RLO use. For the knowledge score, RLOs 16, 17, and 18 (using nicotine gum, nicotine patches, and varenicline for smoking cessation) were assessed using 1 set of knowledge questions, while RLOs 20 and 21 (DNA repair and replication) were assessed using another set of knowledge questions related to the specific RLO topics. We used Google Analytics at Universiti Malaya and Universiti Putra Malaysia and Moodle monitoring at Taylor’s University to capture the patterns of access (location, number of users, and number of users who completed RLOs).
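As a simple illustration of how responses on the 4-point helpfulness item can be tallied into the proportions reported in the Results, here is a minimal sketch with made-up responses (not study data).

```python
# Minimal sketch: tallying the 4-point helpfulness item
# (1 = very unhelpful ... 4 = very helpful). Responses are illustrative only.
from collections import Counter

labels = {1: "very unhelpful", 2: "unhelpful", 3: "helpful", 4: "very helpful"}
responses = [4, 4, 3, 4, 3, 4, 4, 3, 4, 2]  # made-up responses, not study data

counts = Counter(responses)
n = len(responses)
for value, label in labels.items():
    k = counts.get(value, 0)
    print(f"{label}: {k}/{n} ({k / n * 100:.1f}%)")
```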

Table 2. RE-AIM (Reach, Effectiveness, Adoption, Implementation, and Maintenance) outcome measures and assessment methods.
Reach
  • Number of RLOa users: Google Analytics
  • Location of access: Google Analytics
  • Types of devices used: Google Analytics
  • How users found out about RLOs: questionnaire (multiple choice)
Effectiveness
  • Perceived helpfulness of RLOs in learning (Kirkpatrick level 1: reaction)b: questionnaire (4-point Likert scale)
  • Knowledge and confidence (Kirkpatrick level 2: learning): questionnaire 1 (pre- and post-RLO knowledge score) and questionnaire 2 (pre- and post-RLO confidence score)
Adoption
  • Number of institutions that adopted RLOs in teaching and learning: N/Ac
Implementation
  • Number of users who completed RLOs: Google Analytics
  • Number of nonbounced usersd: Google Analytics
Maintenance
  • Number of RLO accesses 4 months after the ACoRDe project implementation (from February 24 to June 22, 2022): Google Analytics

aRLO: reusable learning object.

bPerceived helpfulness is measured using a 4-point Likert scale (1-very unhelpful, 2-unhelpful, 3-helpful, and 4-very helpful).

cN/A: not applicable.

dNonbounced users are defined as users who viewed more than one page.

eACoRD: Advancing Co-creation of Reusable Learning Objects to Digitise Healthcare Curriculum.

Data Analysis

For the questionnaire items, categorical data were reported descriptively using proportions and percentages. The mean differences between the pre-RLO and post-RLO knowledge and confidence scores were analyzed using the Mann-Whitney U test, as the data were skewed. Statistical significance was set at P<.05. Google Analytics data were reported descriptively using proportions and percentages. All data were analyzed using SPSS (version 27; IBM Corp) and Microsoft Excel.
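For readers who want to reproduce this type of pre-post comparison outside SPSS, the following is a minimal Python sketch of the Mann-Whitney U test on pre- and post-RLO scores; the score values are illustrative, not study data.

```python
# Minimal sketch: Mann-Whitney U test comparing pre- and post-RLO scores,
# mirroring the analysis described above. Scores are illustrative only.
import numpy as np
from scipy.stats import mannwhitneyu

pre = np.array([3, 4, 2, 5, 4, 3, 6, 2, 4, 5], dtype=float)   # pre-RLO knowledge scores
post = np.array([7, 8, 6, 9, 8, 7, 9, 6, 8, 9], dtype=float)  # post-RLO knowledge scores

stat, p_value = mannwhitneyu(post, pre, alternative="two-sided")
print(f"Mean difference (post - pre): {post.mean() - pre.mean():.2f}")
print(f"Mann-Whitney U = {stat:.1f}, P = {p_value:.4f}")  # significant when P < .05
```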

Ethics Approval

Ethics approval was granted by the Universiti Malaya Research Ethics Committee (reference UM.TNC2/UMREC-997). All participants provided their written consent. All participant information has been anonymized and deidentified. There was no financial compensation provided to the participants.

Results
Overview

From the launch of the first RLO on May 1, 2020, until February 24, 2022, Google Analytics recorded a cumulative 7622 users from 48 countries (Figure 3). We have reported the results according to the RE-AIM framework.

Figure 3. Geographical distribution of the reusable learning object users from 48 countries. There was no country with users between 500 and 6000.

Reach

Among the 7622 global users, the majority (n=6817, 89.4%) were from Malaysia, while 10.6% (n=805) were from other countries. Desktop computers were the predominant devices used to access RLOs (6045/7622, 79.3%); the remaining 20.7% (1577/7622) used portable devices such as mobile phones or tablets. The number of users for each RLO varied from 92 to 1014 (Table 3). The RLOs attracting the highest number of users were RLO 7 (How to Conduct a Literature Search, n=1014), RLO 1 (Prescription Writing: Back to Basics, n=530), and RLO 17 (Using Nicotine Patches for Smoking Cessation, n=511). RLOs with fewer than 100 users were RLO 10 (Doctor-Patient Relationship) and RLO 23 (Nervous Regulation of the Heart).

Out of 2118 respondents to the questionnaire, the majority used RLOs because they were part of the course learning resources (n=1901, 89.8%), while others received a recommendation from peers, colleagues, or lecturers (n=187, 8.8%). A small proportion of the respondents (30/2118, 1.4%) found RLOs through the Health e-Learning and Media open database, open educational catalogs, or general internet searches.

Table 3. Number of users for each reusable learning object during project implementation.
RLOa | Users, n
RLO 1 | 530
RLO 2 | 366
RLO 3 | 467
RLO 4 | 310
RLO 5 | 363
RLO 6 | 184
RLO 7 | 1014
RLO 8 | 234
RLO 9 | 187
RLO 10 | 97
RLO 11 | 212
RLO 12 | 336
RLO 13 | 187
RLO 14 | 335
RLO 15 | 388
RLO 16 | 432
RLO 17 | 511
RLO 18 | 418
RLO 19 | 191
RLO 20 | 365
RLO 21 | 179
RLO 22 | 224
RLO 23 | 92

aRLO: reusable learning object.

Effectiveness

Most users rated RLOs as very helpful (1452/2071, 70.1%) or helpful (601/2071, 29%). We evaluated the preknowledge and postknowledge scores across all RLOs. The mean preknowledge score ranged from 0.33 to 9.6, while the mean postknowledge score ranged from 5.20 to 9.85. A significant improvement in the knowledge scores was reported for all RLOs after their use (P<.05) (Figure 4), except for RLO 4 and RLO 6, which had high mean baseline preknowledge scores of 6.61 and 8.80, respectively (Table S1 in Multimedia Appendix 1). Additionally, we assessed the confidence scores for 17 RLOs (excluding 6 RLOs, ie, RLOs 16-21, due to missing data) and found a significant improvement in the mean confidence scores after use for all 17 RLOs (P<.05) (Table S2 in Multimedia Appendix 1).

Figure 4. (A) Comparison of preknowledge and postknowledge scores for each reusable learning object. (B) Comparison of preconfidence and postconfidence scores for each reusable learning object. *P>.05. # indicates missing data. RLO: reusable learning object.

Adoption

All RLOs have been adopted by the 3 Malaysian universities (Universiti Malaya, Universiti Putra Malaysia, and Taylor’s University). At Universiti Malaya, the departments of primary care medicine, medicine (geriatrics), pediatrics, and nursing have incorporated RLOs into their teaching and learning curricula. Universiti Putra Malaysia has used RLOs within its professional development module, while Taylor’s University has implemented them in the pharmacy and biomedical undergraduate courses.

Implementation

The completion rates for RLOs varied widely, with percentages ranging from 5.6% (10/179) to 85% (78/92) as shown in Table 4. Only RLO 8 (Confidentiality), RLO 15 (Social Media Professionalism), and RLO 23 (Nervous Regulation of the Heart) had a completion rate exceeding 50%. The proportion of nonbounced users, defined as users who viewed more than one page, ranged from 16.3% (165/1014) to 88.5% (370/418). Notably, data on nonbounced users for RLOs 8-15 from Universiti Putra Malaysia were unavailable due to the absence of a tracking function on each RLO page in Google Analytics (because of technical errors).

Table 4. Number of users who completed reusable learning objects and number of nonbounced users during the implementation period.
RLOa | Accesses, n | Users who completed RLOs, n (%) | Nonbouncedb users, n (%)
RLO 1 | 530 | 205 (38.7) | 129 (24.3)
RLO 2 | 366 | 90 (24.6) | 97 (26.5)
RLO 3 | 467 | 116 (24.8) | 143 (30.6)
RLO 4 | 310 | 46 (14.8) | 64 (20.7)
RLO 5 | 363 | 53 (14.6) | 71 (19.6)
RLO 6 | 184 | 33 (17.9) | 49 (26.6)
RLO 7 | 1014 | 75 (7.4) | 165 (16.3)
RLO 8 | 234 | 170 (72.7) | –c
RLO 9 | 187 | 68 (36.4) | –
RLO 10 | 97 | 35 (36) | –
RLO 11 | 212 | 100 (47.2) | –
RLO 12 | 336 | 164 (48.8) | –
RLO 13 | 187 | 60 (32.1) | –
RLO 14 | 335 | 116 (34.6) | –
RLO 15 | 388 | 204 (52.6) | –
RLO 16 | 432 | 121 (28) | 352 (81.5)
RLO 17 | 511 | 113 (22.1) | 351 (68.7)
RLO 18 | 418 | 109 (26.1) | 370 (88.5)
RLO 19 | 191 | 23 (12) | 68 (35.6)
RLO 20 | 365 | 61 (16.7) | 271 (74.3)
RLO 21 | 179 | 10 (5.6) | 111 (62)
RLO 22 | 224 | 76 (33.9) | 147 (65.6)
RLO 23 | 92 | 78 (84.8) | 30 (32.6)

aRLO: reusable learning object.

bUsers who viewed more than one page.

cNot available.

Maintenance

After the ACoRD project ended on February 24, 2022, a total of 2107 users continued to access RLOs in the subsequent 4 months, ranging from 15 to 187 accesses per RLO (Figure 5).

Figure 5. Number of users who accessed reusable learning objects 4 months after the end of the project.
Discussion
Principal Findings

This study reports the successful implementation of RLOs in health care education across 3 Malaysian higher educational institutions by using the RE-AIM framework. Our findings offer a broad perspective on how the impact and outcomes of e-learning objects can be measured and evaluated systematically to inform e-learning implementation strategies.

Our study shows that RLOs were accessed by a more diverse group of learners than initially anticipated, extending beyond the Malaysian institutions where RLOs were intended to be used. Leveraging the unique characteristic of reusability in RLOs [26], we adopted an open educational resource approach in their deployment. This strategy significantly broadened the reach of RLOs, making them available to learners across different disciplines. Consequently, RLOs addressing more general subjects such as literature review, prescription writing, and smoking cessation attracted a larger user base than those focusing on discipline-specific topics.

Throughout the ACoRD project, pragmatic implementation and dissemination strategies were employed to engage a broader audience. These strategies encompassed identifying early adopters and educator champions, sending reminders to educators about incorporating RLOs in their teaching, monitoring RLO performance at regular intervals, and providing feedback to educators. A study on mobile health e-learning courses highlighted the advantages of iterative feedback among developers, early adopters, and end users, which aids in refining existing e-learning implementation strategies and developing new ones [27]. The ACoRD project featured a dedicated dissemination team that organized and executed activities to promote RLOs through conferences and engagement workshops, extending beyond our institutions. Establishing collaborative partnerships with other higher educational institutions is an important strategy to boost visibility and ensure the long-term use of e-learning resources [28].

In our study, users perceived RLOs as beneficial for their learning, with most RLOs demonstrating an improvement in users’ knowledge and confidence levels. These findings align with previous studies that have reported improvements in knowledge and understanding following RLO usage [16,29,30]. However, evaluating the effectiveness of e-learning resources remains a challenge. Most studies measured learners’ reactions and learning, such as knowledge, usefulness, confidence, satisfaction, and motivation [31]. Some studies examining the effectiveness of RLOs demonstrated behavior change in relation to prescribing behavior and hearing aid use [32,33]. Measuring higher levels of learning, such as behavior change and the impact of learning, is complex due to the time and cost involved [34]. We propose that the evaluation of RLO effectiveness should go beyond individual learning outcomes to assess their impact within the wider teaching and learning ecosystem. This could include evaluating RLOs’ effectiveness from educators’ perspectives and examining how RLOs contribute to student empowerment in self-directed learning [6].

In the adoption domain of RE-AIM, the RLOs developed by the ACoRD project were adopted by all 3 Malaysian institutions. Our study underscores the significance of planning and initiating implementation strategies during the development phase (preimplementation) through stakeholder and institutional engagement. Engaging institutional faculty, students, and stakeholders is critical for the successful adoption of e-learning [35]. The cocreation process and the involvement of stakeholders throughout the ASPIRE process facilitate a sense of ownership of the materials produced, thereby leading to use and reuse. To ensure the widespread adoption of e-learning innovations, it is crucial to engage with and consider the perspectives of diverse stakeholders, as the implementation of e-learning necessitates changes in teaching, learning, management, and infrastructure within the institution [36].

In our study, although RLOs reached a significant number of users, the proportion of users who completed RLOs (5.6%-84.8%) and the proportion of nonbounced users (16.3%-88.5%) were relatively low for most RLOs. Chen et al [37] reported a similar finding, with a low nonbounced rate of 40% for an undergraduate online course. The wide discrepancy in the completion and nonbounced rates among RLOs may be attributed to the different levels of learning across various learner groups. RLOs integrated into the curriculum as teaching and learning materials achieved a completion rate greater than 70% (RLO 8 and RLO 23), suggesting that learners were more invested in completing these RLOs. Conversely, RLO 7 attracted the highest number of users (n=1014), yet its completion rate was only 7.4% (75/1014). Our findings show that Google Analytics is useful for examining the fidelity of RLO usage, as it provides insights into learners’ behaviors. Assessing the fidelity of RLO usage is crucial for educators and institutions to gauge the effectiveness of RLOs in teaching and learning [38].

Users continued to access RLOs after the conclusion of the ACoRD project, which indicates the sustainability of RLOs. The main strategies that we identified were the integration of RLOs into the existing health care curricula and the development of RLOs through a robust cocreation methodology. Identifying the learning needs of teachers and learners prior to the development of RLOs facilitated their integration into the curriculum [19]. We also implemented the educator-as-champion strategy, which involves educators in content development and subsequently in using RLOs in their teaching. e-Learning champions among academicians in higher educational institutions are key players in fostering the integration of technology into teaching and learning [39].

Limitations

Our study has several limitations. Since each RLO was developed and launched at a different time, discrepancies in the implementation periods made data interpretation challenging. We used Google Analytics to capture the reach of our RLOs; however, we could not confirm whether each recorded access represented a unique user or a unique IP address, and some users could have accessed RLOs multiple times from different IP addresses. As this study employed a pragmatic approach to capture evaluation outcomes in real time, missing data were inevitable; however, this did not substantially impact the overall findings. Strategies were implemented pragmatically throughout the project, with practical considerations and changes made to address implementation issues; as a result, we were unable to identify and measure the effectiveness of each individual implementation strategy.

Conclusion

We employed the RE-AIM framework to systematically evaluate the implementation success of e-learning resources, identifying gaps and strategies for improvement. This study highlights the development, implementation process, and implementation outcome indicators of open-access RLOs in health care education. To enhance future e-learning implementation efforts, we recommend incorporating the RE-AIM framework into outcome evaluations to provide a more comprehensive evaluation of e-learning implementation.

Acknowledgments

We thank Irmi Zarina Ismail from Universiti Putra Malaysia for her contribution to the Advancing Co-creation of Reusable Learning Objects to Digitise Healthcare Curriculum (ACoRD) project. This study was funded by the European Union Erasmus+ Program under the ACoRD project (reference 598935-EPP-1-2018-1-UK-EPPKA2-CBHE-JP). The funders had no role in the study design, data collection and analysis, decision to publish, or preparation of the manuscript.

Data Availability

The datasets used or analyzed during this study are available from the corresponding author on reasonable request.

Conflicts of Interest

None declared.

Multimedia Appendix 1

Preknowledge and postknowledge and confidence scores for each reusable learning object.

DOCX File, 31 KB

  1. Wentling TL, Waight C, Gallaher J, La Fleur J, Wang C, Kanfer A. E-learning: a review of literature. ResearchGate. Sep 2000. URL: https://www.researchgate.net/publication/331938876_E-learning_A_review_of_literature [Accessed 2025-05-21]
  2. Wang ZY, Zhang LJ, Liu YH, et al. The effectiveness of e-learning in continuing medical education for tuberculosis health workers: a quasi-experiment from China. Infect Dis Poverty. May 18, 2021;10(1):72. [CrossRef] [Medline]
  3. Sadeghi R, Sedaghat MM, Sha Ahmadi F. Comparison of the effect of lecture and blended teaching methods on students’ learning and satisfaction. J Adv Med Educ Prof. Oct 2014;2(4):146-150. [Medline]
  4. Ruiz JG, Mintzer MJ, Leipzig RM. The impact of e-learning in medical education. Acad Med. Mar 2006;81(3):207-212. [CrossRef] [Medline]
  5. Naveed QN, Qureshi MRN, Tairan N, et al. Evaluating critical success factors in implementing e-learning system using multi-criteria decision-making. PLoS One. 2020;15(5):e0231465. [CrossRef] [Medline]
  6. Barteit S, Guzek D, Jahn A, Bärnighausen T, Jorge MM, Neuhann F. Evaluation of e-learning for medical education in low- and middle-income countries: a systematic review. Comput Educ. Feb 2020;145:103726. [CrossRef] [Medline]
  7. Proctor E, Silmere H, Raghavan R, et al. Outcomes for implementation research: conceptual distinctions, measurement challenges, and research agenda. Adm Policy Ment Health. Mar 2011;38(2):65-76. [CrossRef] [Medline]
  8. Palmerola ED. Clarification evaluation of e-learning implementation: a developmental research design. Am J Educ Technol. 2024;3(2):119-127. [CrossRef]
  9. Kirkpatrick D, Kirkpatrick JD. Evaluating training programs: the four levels. American J Eval. 2011;19(2):259-261. [CrossRef]
  10. Stufflebeam D. The CIPP model for evaluation. In: Evaluation Models. Springer Dordrecht; 2002:279-317. [CrossRef]
  11. Gandomkar R. Comparing Kirkpatrick’s original and new model with CIPP evaluation model. J Adv Med Educ Prof. Apr 2018;6(2):94-95. [Medline]
  12. Peterson C. Bringing ADDIE to life: instructional design at its best. ERIC. 2003. URL: https://eric.ed.gov/?id=EJ822355 [Accessed 2025-05-19]
  13. Holtrop JS, Estabrooks PA, Gaglio B, et al. Understanding and applying the RE-AIM framework: clarifications and resources. J Clin Transl Sci. 2021;5(1):e126. [CrossRef] [Medline]
  14. Gisondi MA, Keyes T, Zucker S, Bumgardner D. Teaching LGBTQ+ health, a web-based faculty development course: program evaluation study using the RE-AIM framework. JMIR Med Educ. Jul 21, 2023;9:e47777. [CrossRef] [Medline]
  15. Golden RE, Sanders AM, Frayne SM. RE-AIM applied to a primary care workforce training for rural providers and nurses: the Department of Veterans Affairs’ Rural Women’s Health Mini-Residency. Front Health Serv. 2023;3:1205521. [CrossRef] [Medline]
  16. Bath-Hextall F, Wharrad H, Leonardi-Bee J. Teaching tools in evidence based practice: evaluation of reusable learning objects (RLOs) for learning about meta-analysis. BMC Med Educ. May 4, 2011;11(1):18. [CrossRef] [Medline]
  17. Lim HM, Ng CJ, Wharrad H, et al. Knowledge transfer of e-learning objects: lessons learned from an intercontinental capacity building project. PLoS ONE. 2022;17(9):e0274771. [CrossRef] [Medline]
  18. Wharrad H, Windle R, Taylor M. Chapter 3: designing digital education and training for health. In: Konstantinidis ST, Bamidis PD, Zary N, editors. Digital Innovations in Healthcare Education and Training. Academic Press; 2021:31-45. [CrossRef]
  19. Lim HM, Ng CJ, Teo CH, et al. Prioritising topics for developing e-learning resources in healthcare curricula: a comparison between students and educators using a modified Delphi survey. PLoS ONE. 2021;16(6):e0253471. [CrossRef] [Medline]
  20. Lim HM, Teo CH, Hong WH, Lee YK, Lee PY, Ng CJ. Empowering students in co-creating e-learning resources through a virtual hackathon. TAPS. 2024;9(4):84-87. [CrossRef]
  21. The ACoRD Project. 2018. URL: https://acord.my [Accessed 2025-05-19]
  22. HELM Open. University of Nottingham. URL: https://www.nottingham.ac.uk/helmopen [Accessed 2025-05-19]
  23. MERLOT multimedia educational resource for learning and online teaching. MERLOT. URL: https://www.merlot.org/merlot/viewMaterial.htm?id=85072 [Accessed 2025-05-19]
  24. Kirkpatrick DL. The four levels of evaluation. In: Brown SM, Seidner CJ, editors. Evaluating Corporate Training: Models and Issues. Springer Netherlands; 1998:95-112. [CrossRef]
  25. Ragsdale JW, Berry A, Gibson JW, et al. Evaluating the effectiveness of undergraduate clinical education programs. Med Educ Online. Dec 2020;25(1):1757883. [CrossRef] [Medline]
  26. Windle RJ, Wharrad H, McCormick D, Laverty H, Taylor MG. Sharing and reuse in OER: experiences gained from open reusable learning objects in health. JIME. 2010;(1):4. [CrossRef]
  27. Philpot LM, Ahrens DJ, Eastman RJ, et al. Implementation of e-learning solutions for patients with chronic pain conditions. Digit Health. 2023;9:20552076231216404. [CrossRef] [Medline]
  28. Gallagher S, Murphy P. Implementing inter-institutional lifelong sustainability education: the UNI-ECO e-learning case study. Ir J Acad Pract. 2024;11(2). [CrossRef]
  29. Hardie P, Donnelly P, Greene E, et al. The application of reusable learning objects (RLOs) in preparation for a simulation laboratory in medication management: an evaluative study. Teaching Learning Nurs. Oct 2021;16(4):301-308. [CrossRef]
  30. Redmond C, Davies C, Cornally D, et al. Using reusable learning objects (RLOs) in wound care education: undergraduate student nurse’s evaluation of their learning gain. Nurse Educ Today. Jan 2018;60:3-10. [CrossRef] [Medline]
  31. de Leeuw R, de Soet A, van der Horst S, Walsh K, Westerman M, Scheele F. How we evaluate postgraduate medical e-learning: systematic review. JMIR Med Educ. Apr 5, 2019;5(1):e13128. [CrossRef] [Medline]
  32. Lymn JS, Bath-Hextall F, Wharrad HJ. Pharmacology education for nurse prescribing students - a lesson in reusable learning objects. BMC Nurs. Jan 23, 2008;7(1):2. [CrossRef] [Medline]
  33. Ferguson M, Brandreth M, Brassington W, Leighton P, Wharrad H. A randomized controlled trial to evaluate the benefits of a multimedia educational program for first-time hearing aid users. Ear Hear. 2016;37(2):123-136. [CrossRef] [Medline]
  34. El Nsouli D, Nelson D, Nsouli L, et al. The application of Kirkpatrick’s evaluation model in the assessment of interprofessional simulation activities involving pharmacy students: a systematic review. Am J Pharm Educ. Aug 2023;87(8):100003. [CrossRef] [Medline]
  35. Vovides Y, Chale SB, Gadhula R, et al. A systems approach to implementation of e-learning in medical education: five MEPI schools’ journeys. Acad Med. Aug 2014;89(8 Suppl):S102-S106. [CrossRef] [Medline]
  36. de Souza Rodrigues MA, Chimenti P, Nogueira ARR. An exploration of e-learning adoption in the educational ecosystem. Educ Inf Technol. Jan 2021;26(1):585-615. [CrossRef]
  37. Chen Y, Deng X, Huang Q, Luo H. Patterns and trends in online learning behaviors: evidence from Google analytics. Presented at: 2021 IEEE International Conference on Engineering, Technology & Education (TALE); Dec 5-8, 2021; Wuhan, Hubei Province, China. [CrossRef]
  38. Mudawi NA, Pervaiz M, Alabduallah BI, et al. Predictive analytics for sustainable e-learning: tracking student behaviors. Sustainability. 2023;15(20):14780. [CrossRef]
  39. Gachago D, Morkel J, Hitge L, van Zyl I, Ivala E. Developing e-learning champions: a design thinking approach. Int J Educ Technol High Educ. Dec 2017;14(1):30. [CrossRef]


ACoRD: Advancing Co-creation of Reusable Learning Objects to Digitise Healthcare Curriculum
ASPIRE: Aim, Storyboarding, Populating, Implementing, Release, and Evaluation
RE-AIM: Reach, Effectiveness, Adoption, Implementation, and Maintenance
RLO: reusable learning object


Edited by David Chartash; submitted 02.07.24; peer-reviewed by Lindsey Philpot, Nallely Mora; final revised version received 21.04.25; accepted 25.04.25; published 23.07.25.

Copyright

© Hooi Min Lim, Chin Hai Teo, Yew Kong Lee, Ping Yein Lee, Kuhan Krishnan, Zahiruddin Fitri Abu Hassan, Phelim Voon Chen Yong, Wei Hsum Yap, Renukha Sellappans, Enna Ayub, Nurhanim Hassan, Sazlina Shariff Ghazali, Nurul Amelina Nasharuddin, Puteri Shanaz Jahn Kassim, Faridah Idris, Klas Karlgren, Natalia Stathakarou, Petter Mordt, Stathis Konstantinidis, Michael Taylor, Cherry Poussa, Heather Wharrad, Chirk Jenn Ng. Originally published in JMIR Medical Education (https://mededu.jmir.org), 23.7.2025.

This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Medical Education, is properly cited. The complete bibliographic information, a link to the original publication on https://mededu.jmir.org/, as well as this copyright and license information must be included.