
Preprints (earlier versions) of this paper are available at https://preprints.jmir.org/preprint/13681.
Exploring the Cost of eLearning in Health Professions Education: Scoping Review

Original Paper

1Department of Primary Care and Public Health, Imperial College London, London, United Kingdom

2Centre for Health Technology, University of Plymouth, Plymouth, United Kingdom

3Department of Physiotherapy, Monash University, Melbourne, Australia

4Faculty of Business and Economics, Monash University, Melbourne, Australia

5Medical Education Research and Quality, School of Public Health and Preventive Medicine, Monash University, Melbourne, Australia

6BMJ Knowledge Centre, BMJ Learning, London, United Kingdom

Corresponding Author:

Edward Meinert, MA, MSc, MBA, MPA, PhD

Centre for Health Technology

University of Plymouth

6 Kirkby Place

Room 2

Plymouth, PL4 6DT

United Kingdom

Phone: 44 1752 585858

Email: edward.meinert@plymouth.ac.uk


Background: Existing research on the costs associated with the design and deployment of eLearning in health professions education is limited. The costs of these learning platforms relative to those of face-to-face learning are also not well understood. The lack of predefined costing models for capturing eLearning cost data has made it difficult to complete cost evaluations.

Objective: The key aim of this scoping review was to explore the state of evidence concerning cost capture within eLearning in health professions education. The review explores the available data to define cost calculations related to eLearning.

Methods: The scoping review was performed using a search strategy with Medical Subject Heading terms and related keywords centered on eLearning and cost calculation with a population scope of health professionals in all countries. The search was limited to articles published in English. No restriction was placed on literature publication date.

Results: In total, 7344 articles were returned from the original search of the literature. Of these, 232 were identified as relevant through keyword and abstract screening. Full-text review resulted in 168 studies being excluded. Of these, 61 studies were excluded because they were unrelated to eLearning and focused on general education. In addition, 103 studies were excluded because of a lack of detailed information regarding costs; these studies referred to cost only in terms of favorability or unfavorability, without data to support their findings. Finally, 4 studies were excluded because their limited cost data were insufficient for analysis. In total, 42 studies provided data and analysis of the impact of cost and value in health professions education. The most common data source was the total cost of training (n=29). Other sources included cost per learner, referring to the cost for individual students (n=13). The population most frequently cited was medical students (n=15), although 12 articles focused on multiple populations. A further 22 studies provided details of costing approaches for the production and delivery of eLearning. These studies offer insight into the ways eLearning has been budgeted and project-managed through implementation.

Conclusions: Although cost is a recognized factor in studies detailing eLearning design and implementation, the way cost is captured is inconsistent. Despite a perception that eLearning is more cost-effective than face-to-face instruction, there is not yet sufficient evidence to assert this conclusively. A rigorous, repeatable data capture method is needed, along with a means of leveraging existing economic evaluation methods, to test whether eLearning is cost-effective and to determine how it can be implemented with cost benefits and advantages over traditional instruction.

JMIR Med Educ 2021;7(1):e13681

doi:10.2196/13681

Keywords



Introduction

Significant investment is necessary to adapt and expand the global health care workforce to meet the medical challenges of the 21st century. The demands on the workforce range from an aging population and an emphasis on chronic disease management [1] to access to primary care, where there is a direct link to the cost of training medical personnel. Primary care depends more heavily on public sector investment than other medical specialties, and scarce resources limit the number of personnel who can be trained [2]. As one example, with the increasing cost of delivering care within the United Kingdom, the National Health Service has recognized that medical providers must take a greater role in education and training [3]. Creating production efficiencies in education and training may assist with the supply of medical personnel to support clinical skills and applied health-related skills. eLearning, defined as “an approach to teaching and learning, representing all or part of the educational model applied, that is based on the use of electronic media and devices as tools for improving access to training, communication and interaction and that facilitates the adoption of new ways of understanding and developing learning” [4], presents a possible opportunity to change and optimize training by providing a scalable means of instruction, thereby reducing delivery and implementation costs.

A potentially critical advantage of eLearning is the long-term efficiency gain of its delivery model in contrast to other forms of instruction; however, the costs to develop eLearning are significant when it is executed to a high standard [5]. To achieve better cost management of eLearning and ensure scale-up and adoption, data are required to identify the factors that influence eLearning design and production. Research on the use of eLearning in medicine suggests that the measurement of costs in studies is often inconsistent [6]. Therefore, the aim of this scoping review was to provide a broad overview of the state of evidence concerning the measurement of costs in eLearning. Understanding these costs will enable better planning in the design and production of eLearning.


Methods

Design

Scoping reviews are a form of rapid knowledge synthesis that identify the sources and evidence available to address research questions in a systematic manner. The established scoping review methodology by Levac et al [7] was chosen for this review, as the research question aims to provide a broad understanding of the literature available in this field to ultimately inform subsequent reviews or research agendas.

Identifying the Relevant Research Question

To establish a comprehensive understanding of the costs [8] associated with eLearning, we conducted a scoping review [7,9] to assess the available literature that quantifies the cost of delivering eLearning in health professions education. For the purpose of this review, cost is defined as the total costs (direct and indirect) from inception to deployment, including design, development, and delivery (or implementation). Within the analysis, we examine how these costs have been reported, with the understanding that the separate factors and sources underlying these total costs may or may not be reported. Factors influencing these costs could, for example, include the level of experience of the teams producing content. This aggregate grouping affects the way studies are compared with each other and should be taken into account when reading this review, as other study themes or classifications could affect the interpretation of results. The research question under investigation is: What is known in the literature about cost calculations related to eLearning in health professions education in regard to (a) practical cost analysis, with respect to cost per learner and comparison to face-to-face instruction; and (b) the choices in practice of costing methods and models? A secondary question is: How has the publication frequency of this field developed over time?

These questions were derived using the PICO (Population, Intervention, Comparison, Outcome) framework [10]. In this review, the population is defined as learners in health professions in all countries; this decision was made to ensure comprehensive coverage of all health professionals to best understand the state of evidence internationally. The intervention being evaluated is eLearning in health professions education (inclusive of various forms of training, including basic and advanced continuing professional development, university-level training, patient education, and other training forms provided by an equally broad group of education providers). The comparison is the evaluation of costs between eLearning, other methods of instruction such as face-to-face teaching, and alternative approaches to eLearning, as well as studies that do not make use of a comparator. The outcome was the quantification and analysis of the difference in costs between and within implementations. We defined costs from the cost calculations used in economic evaluation, including cost-consequence analysis, cost-minimization analysis, cost-effectiveness analysis, cost-utility analysis, and cost-benefit analysis [11].
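As a point of reference only (this is not a formulation given by the included studies), cost-effectiveness analysis of the kind listed above is conventionally summarized by an incremental cost-effectiveness ratio (ICER) comparing an intervention such as eLearning with a comparator such as face-to-face instruction:

$$\text{ICER} = \frac{C_{\text{eLearning}} - C_{\text{comparator}}}{E_{\text{eLearning}} - E_{\text{comparator}}}$$

where $C$ denotes total cost and $E$ a measure of educational effect; consistent reporting of both terms is what the cost capture examined in this review would need to support.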

Identifying Relevant Studies

Following consultation with an information scientist at the Imperial College London Medical School Library on literature search approaches, a search of the following databases was performed in December 2015 and repeated in December 2018: PubMed, Scopus, Education Resource Information Centre (ERIC), Web of Science, Embase, Global Health, Health Management Information Consortium (HMIC), Prospero, and OVID. In the second search, completed in December 2018, new papers were added to the original dataset but did not undergo exhaustive data charting; the data included provide a high-level summary of contents and relevance to previously categorized themes (these papers can be identified as studies from 2016 to 2018).

The search strategy included use of Medical Subject Heading terms and related keywords centered on eLearning and cost calculation with a population scope of health professionals in all countries. The search was limited to English-language studies. There was no restriction placed on literature publication date; although online technologies have changed rapidly over a short period of time, the authors felt that to provide a comprehensive overview of the literature, it would be useful to first explore research with no date restriction. The primary research questions were kept broad to ensure that there would be inclusion of all studies that recorded the costs to deliver eLearning globally. A high-level summary of the search strategy is detailed in Textbox 1; a full summary of the search strategy used per database is detailed in Multimedia Appendix 1.

Sample search terms.

Cost-related terms

  • Costs and Cost Analysis [Medical Subject Heading (MeSH) terms]
  • Cost-benefit analysis [MeSH Terms]
  • Costs and cost analysis [MeSH Terms]
  • Cost*
  • Economic*

Learning-related terms

  • Learning [MeSH Terms]
  • eLearning
  • Blended learning
  • Online learning
Textbox 1. Sample search terms.

Study Selection

Following the process used in this scoping review method, study selection centered on studies that identified cost factors and variables in eLearning for health professions education. The literature was reviewed independently by two researchers (JE and EM) to identify articles. A third researcher (CB) adjudicated disagreements when necessary. Article abstracts were first scanned for relevance to the research question, and then full articles were downloaded to verify appropriateness. The inclusion criteria comprised studies and reviews that examined eLearning in health professions education and captured data concerning design, development, and production costs. Papers that provided synthesis or editorializing of issues without data (ie, opinion pieces and commentaries) were excluded (Multimedia Appendix 2).

Charting the Data

The definition of cost in this review is centered on the hypothesized cost savings derived from a possible reduction in labor costs through scaling teaching via digital technology; cost was defined as the production and delivery costs (direct and indirect) of online learning [12]. Included studies were classified to explore different ways of comparing and analyzing the factors influencing these costs. Studies were charted into two groups: (1) studies detailing costs for eLearning implementations and (2) studies with detailed costing methods (approaches to capture costs) for eLearning but without implementation-specific data. Group 1 was further charted into two subgroups: (1) studies with a comparison to other learning types and (2) studies without a comparator. For these two subcategories, we excluded studies disclosing that the cost data provided were incomplete.
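For orientation only (this is not a formula specified by the included studies), the two charted cost quantities relate as follows when a study reports both the total production and delivery costs and the number of learners trained:

$$\text{Cost per learner} = \frac{C_{\text{direct}} + C_{\text{indirect}}}{N}$$

where $C_{\text{direct}}$ and $C_{\text{indirect}}$ are the direct and indirect production and delivery costs defined above and $N$ is the number of learners; many included studies report only one of these quantities, which is reflected as not available/applicable in the data charts.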

Collating, Summarizing, and Reporting the Results

Each study was reviewed individually to understand the implementation aspects of each reported eLearning instance. The studies were then summarized into four categories: (1) studies that detail eLearning costs without a comparator, (2) studies that detail eLearning costs with a comparator, (3) related data from two related systematic reviews, and (4) studies that detail costing approaches. The results are presented as a narrative summary of the principal aspects of each study organized via main classification themes to present evidence that can inform the development and deployment of eLearning by defining the factors that influence implementation costs and the criteria that should be used to explore cost optimization.


Results

Overview of Included Studies

In total, 7344 articles were returned from the search of the literature (Figure 1). Of these, 232 were identified as relevant through screening of keywords and abstract references to cost. Full-text review resulted in 168 studies being excluded. Of these, 61 studies were excluded because they were unrelated to eLearning and focused on general education. In addition, 103 studies were excluded because of a lack of detailed information regarding costs; these studies referred to cost only in terms of favorability or unfavorability, without data to support their findings. Finally, 4 studies were excluded because their limited cost data were insufficient for analysis. In total, 42 studies (Table 1) provided data and analysis of the impact of cost and value in health professions education. The completeness of the extracted data varied, which resulted in some datasets in the final inclusion data charts being designated as not available/applicable to reflect an inability to abstract usable information; however, these studies remained within the inclusion set because their partial data contributed to the narrative analysis. Although cost was a secondary outcome of the investigation in these studies, their cost data received greater focus than those of the studies excluded at the earlier screening stage. The most common data source was the total cost of training (n=29). Other sources included cost per learner, meaning the cost per student (n=13). The population most frequently cited was medical students (n=15), although a group of articles focused on multiple populations (n=12). A further 22 studies provided details of costing approaches for the production and delivery of eLearning. These studies offer insight into the ways that eLearning has been budgeted and project-managed through implementation.

Figure 1. PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) flow diagram of search and screening for costs of eLearning implementation.
Table 1. Studies that provide costs for eLearning implementation.^a
Reference | Year | Comparison | Study design | Subject | Cost source | HCP^b population
Allan et al [13] | 2008 | None | Case | Evidence-based medicine | Total cost | Clinicians
Bandla et al [14] | 2012 | None | Case-control | Sleep medicine | Total cost | Medical students
Berger et al [15] | 2009 | Face to face | Case-control | Patient education | Per learner | Nurses
Butler et al [16] | 2013 | None | RCT^c | Behavior change counseling | Per learner | Clinicians, nurses
Choi et al [17] | 2008 | Other learning | Case | Surgical anatomy | Total cost | Medical students
Collins et al [18] | 2018 | None | Course review | Nutrition | Total cost | AHPs^d, medical students
Downer et al [19] | 2018 | None | Case | Leadership and management in health | Total cost | AHPs, medical students, clinicians
Dumestre et al [20] | 2014 | Other learning | Systematic review | Microsurgical skill acquisition | Per learner | Clinicians, medical students
Glasbey et al [21] | 2017 | Face to face | Case | Surgical training | Total cost | Medical students
Grayson et al [22] | 2018 | None | Longitudinal | Hand hygiene | Total cost | AHPs, medical students, clinicians
Hardwick et al [23] | 2011 | None | Case | Pathology | Total cost | Clinicians
Jerin and Rea [24] | 2005 | None | Case | Emergency medicine | Per learner | AHPs
Joshi and Perin [25] | 2012 | Other learning | Case | Public health informatics | Total cost | AHPs
Kaufman [26] | 2010 | None | Case | Treatment of diabetes | Per learner | Patients (patient education used by HCP)
Knapp et al [27] | 2011 | Face to face | Case | HIV detection | Total cost | AHPs, clinicians
Kumpu et al [28] | 2016 | Face to face | Case | Global health | Total cost | AHPs, medical students, clinicians
Letterie et al [29] | 2003 | None | Literature review | Computer-assisted medical education | Total cost | AHPs, medical students, clinicians
Likic et al [30] | 2013 | None | Cohort | Rational therapeutics | Total cost | Medical students
Manring et al [31] | 2011 | None | Case | Psychotherapy | Total cost | Clinicians
McConnell et al [32] | 2009 | None | Case | Pharmacy CPD^e | Per learner | Pharmacists
McDuffie et al [33] | 2011 | None | Case | Experiential pharmacy training | Per learner | Pharmacists
Moreno-Ger et al [34] | 2010 | No intervention | Case | Practical skills simulation | Per learner | Medical students
Nickel et al [35] | 2015 | Other learning | RCT | Laparoscopic cholecystectomy | Total cost | Medical students
Nicklen et al [36] | 2016 | None | Case | Physiotherapy | Total cost | Undergraduate AHPs
Padwal et al [37] | 2017 | Other learning | RCT | Weight management | Total cost | Patients (patient education used by HCP)
Padwal et al [38] | 2013 | Other learning | RCT | Weight management (study protocol) | Total cost | Patients (patient education used by HCP)
Palmer et al [39] | 2015 | None | Case | Clinical skills | Total cost | Medical students
Pentiak et al [40] | 2013 | None | Clinical review | Surgical skills | Per learner | Clinicians
Perkins et al [41] | 2012 | Face to face | RCT | Advanced life support training | Per learner | AHPs
Reeves et al [42] | 2013 | Other learning | Literature review | Interprofessional education | Total cost | AHPs
Schopf and Flytkjær [43] | 2011 | None | Case | Interprofessional training - dermatology | Total cost | Clinicians, nurses
Shepler [44] | 2014 | None | Cohort | Advanced pharmacy practice experience | Total cost | Pharmacy students
Sivamalai et al [45] | 2011 | None | Case | Pathology | Total cost | Medical students
Spanou et al [46] | 2010 | Face to face | RCT (protocol) | Behavior change counseling | Total cost | Clinicians, nurses
Stansfeld et al [47] | 2015 | Other learning | RCT | Employee well-being | Total cost | AHPs
Stromberg et al [48] | 2012 | None | Cohort | Heart failure nursing | Total cost | Nurses
Thomas et al [49] | 2010 | None | Case | Family planning | Total cost | AHPs
de Ruijter et al [50] | 2015 | None | Case | Business engineering; surgical technician | Total cost | Medical students
Weiss et al [51] | 2011 | Other learning | Cohort | Antibiotic prescribing | Total cost | Clinicians, pharmacists
Williams et al [52] | 2009 | None | Cohort | Practice-based research networks | Per learner | Clinicians
Young et al [53] | 2017 | None | Case | Research skills | Per learner | AHPs
Zhou et al [54] | 2018 | None | Case | Resource stewardship | Per learner | Medical students, clinicians

^a These studies were all assigned the prefix "INC," indicating that this group was inclusive of both comparator and noncomparator studies (for eLearning costs); the combination of the prefix and study number can be used to provide a unique ID to refer to studies.

^b HCP: health care provider.

^c RCT: randomized controlled trial.

^d AHPs: allied health professionals.

^e CPD: continuing professional development.

Studies Describing eLearning Costs Without a Comparator

Twenty-two studies [13,16,19,22,23,26,30-34,39,40,43-45,48,50,52-55] provided analysis of implementation costs in eLearning without comparison to other learning platforms. These studies primarily reported total costs and cost per learner (Table 2). The studies suggested that eLearning should be less costly than face-to-face learning; however, without a comparator, it is not possible to substantiate these claims. Despite these deficiencies, these studies provide varying means of cost calculation across different forms of instructional design.

Table 2. Studies that detail eLearning costs without a comparator.^a
Reference | Year | Instructional design | Sample size (N) | Total cost (US $) | Cost per learner (US $) | Notes
Allan et al [13] | 2008 | Asynchronous, blended | 304 | 8209 | 24 | No blended learning cost
Butler et al [16] | 2013 | Blended | 80 | 2075 | 26 | No explicit cost methodology/technique described
Downer et al [19] | 2018 | Asynchronous | 53 | 23,000 | 394 | No explicit cost methodology/technique described
Grayson et al [22] | 2018 | Asynchronous | 1,989,713 | N/A^b | 0.04 | Provided aggregate cost per learner
Kaufman [26] | 2010 | Asynchronous | 787 | N/A | 1453 | Reported overall cost per learner
Hardwick et al [23] | 2011 | Asynchronous | N/A | N/A | N/A | Provided cost modeling approach
Likic et al [30] | 2013 | Asynchronous | 393 | 10,000 | 23 | Use of online course deemed lower cost than face-to-face problem-based learning
Manring et al [31] | 2011 | Blended | 35 | 5250 | 137 | Only costs of physical implementation
McConnell et al [32] | 2009 | Asynchronous | 8120 | 610 | 0.07 | No explicit cost methodology/technique described
McDuffie et al [33] | 2011 | Blended | 382 | N/A | 21 | No explicit cost methodology/technique described
Moreno-Ger et al [34] | 2010 | Asynchronous | 400 | 2630 | 6 | No explicit cost methodology/technique described
Palmer et al [39] | 2015 | Synchronous | 9 | 5000 | 506 | No explicit cost methodology/technique described
Pentiak et al [40] | 2013 | Asynchronous | N/A | 32,685 | N/A | Total curriculum delivery
Schopf and Flytkjær [43] | 2011 | Asynchronous | 88 | 84,229 | 858 | No explicit cost methodology/technique described
Shepler [44] | 2014 | Asynchronous | 580 | N/A | N/A | US $148 savings per intervention
Sivamalai et al [45] | 2011 | Asynchronous | 200 | 392,468 | 1782 | Cost of digital microscopy 1/3 cost of physical microscopy
Stromberg et al [48] | 2012 | Asynchronous | 183 | N/A | N/A | Total cost reduction compared over previous methods
Thomas et al [49] | 2010 | Asynchronous | 273 | 21,000 | 70 | No explicit cost methodology/technique described
de Ruijter et al [50] | 2015 | Asynchronous | 803 | 44,986 | 49 | No explicit cost methodology/technique described
Williams et al [52] | 2009 | Asynchronous | 103 | 3732 | 33 | No explicit cost methodology/technique described
Young et al [53] | 2017 | Asynchronous | 679 | N/A | 38 | Did not report total cost
Zhou et al [54] | 2018 | Asynchronous | 48 | N/A | 148 | Did not report total cost

^a These studies are given the prefix "SUM" to indicate that this group represents a summary of costs without a comparator; the prefix and number can be used to provide a unique ID to refer to studies.

^b N/A: not available/applicable.

The studies in this set addressed the review question's focus on the costs associated with eLearning in health professions education but lacked the comparison variable of the PICO framework. Although these studies suggest that implementation of eLearning could provide self-reported high value through low-cost delivery, and thus cost-effectiveness, they offer no comparative framework to justify these assertions. Among the studies that quantify eLearning costs, three groups emerged. The first included studies demonstrating that eLearning was of low cost but had no or limited evidence of self-reported educational impact [13,16]. The second group demonstrated that eLearning was of low cost and had a high self-reported educational impact [23,30-34,43-45,48-50,52-54]. A third group [19,22,26,39,40] demonstrated that eLearning was of high cost and had a high self-reported educational impact.

Allan et al [13] and Butler et al [16] present examples of low-cost eLearning delivery without demonstrated educational impact, with low cost in these studies presented from the perspective of the cost per learner. In Allan et al [13], the key research question was whether the research group could implement an evidence-based medicine curriculum for clinicians. Although quantifying costs was an aspect of the reported results, as in many of the studies included in this review, it was not a primary focus and was addressed informally, without an explicit unit cost breakdown or a listing of all the components that would affect learning production. In contrast to this use of a comprehensive program including multiple forms of learning and the establishment of a learning community, Butler et al [16] made exclusive use of blended learning in a course. They revealed that the complete training costs are not captured when creating online or blended courses in primary care. Despite comprehensively capturing the unit costs of delivery in the implementation of the study (by providing segmentation of costs across administrators, actors, trainers, clinicians, nurses, and costs per practice), their study treated eLearning as a single-group cost reflecting the time per participant to complete the eLearning; there was no accounting of the system implementation and production time required to create the eLearning. Similar to Allan et al [13], Butler et al [16] highlight cost omissions that are endemic in the studies included in this review.

A second group of studies demonstrate eLearning as having low cost and high educational impact [23,30-34,43-45,48-50,52-54]. Of this set, Likic et al [30], McConnell et al [32], McDuffie et al [33], de Ruijter et al [50], Moreno-Ger et al [34], Thomas et al [49], Williams et al [52], and Young et al [53] each represent online courses making use of asynchronous online learning at low cost per learner (below US $68/learner). The key issue among the studies in this literature cluster is that although they may provide evidence of low cost per learner, without a comparison point to comparable face-to-face delivery, there is no way to assert with any certainty that eLearning is a lower-cost option.

The final group of studies in this set [19,22,26,39,40] indicated that eLearning was of higher cost and had a high educational impact. This group shared data-recording issues similar to those of the previous set but also provides evidence of the high start-up costs associated with eLearning production.

It is challenging to draw strong inferences based on an aggregation of the studies that summarize eLearning costs because of the different methods that were used in cost calculation, the difference in subjects instructed, the rapid changes in web platforms for learning, and other factors impacting the way costs were calculated. However, it is possible to observe some trends from this grouping. For pure online courses, the studies suggest that total costs per learner are low; however, there is often acknowledgment in the studies that not all implementation costs have been captured in the cost calculations. This lack of included costs, including sunk costs, indicates that reported costs are not accurate. Although some studies identified the costs that were not captured, many did not, and these gaps are only evident to researchers who have a background and understanding of the issues involved in the delivery of eLearning. Additionally, most studies are cases of specific instances of eLearning implementation, making it difficult to gauge what the results mean in contrast to face-to-face learning, and case study methods make it hard to generalize the results. Some studies indicated high total costs, but in those instances [40], the eLearning costs were embedded in total curriculum delivery.

Studies Describing eLearning Costs With a Comparator

Seventeen studies [14,15,17,21,24,25,27,28,34-38,41,46,47,51] compared eLearning costs to those of face-to-face learning or other types of learning (Table 3). These comparative studies offered more evidence that the use of eLearning demonstrated cost efficiencies than did the studies in the previous group, which provided no comparative data.

Table 3. Studies that detail eLearning costs with a comparator.^a
Reference | Year | Instructional design | Comparison | Sample size (N) | Cost of eLearning (US $) | Cost of face-to-face learning (US $) | Notes from study
Bandla et al [14] | 2012 | Asynchronous online | Face to face | 173 | 21,752 | 21,752 | N/A^b
Berger et al [15] | 2009 | Blended | Face to face | 166 | 14 | 110 | Cost per learner
Choi et al [17] | 2008 | Asynchronous online | Other learning | 34 | N/A | N/A | Provided costs of online platforms without complete cost comparison
Glasbey et al [21] | 2017 | N/A | N/A | 570 | N/A | N/A | Online curriculum embedded; core costs not separated in study
Jerin and Rea [24] | 2005 | Asynchronous online | Asynchronous online | 93 | 53 | 352 | Cost per learner
Joshi and Perin [25] | 2012 | Asynchronous online | Other learning | 15 | 14,085 | 20,714 | Online vs face-to-face total costs
Knapp et al [27] | 2011 | Asynchronous online | Face to face | 91 | 157 | 4386 | N/A
Kumpu et al [28] | 2016 | Blended | Face to face | 282 | 431 | 1054 | N/A
Moreno-Ger et al [34] | 2010 | Asynchronous online | Face to face | 400 | 7 | 2630 | N/A
Nickel et al [35] | 2015 | Virtual reality | Other learning | 84 | 3900 | 82,500 | Virtual reality vs blended learning
Nicklen et al [36] | 2016 | Blended | Face to face | 78 | 5904 | 6856 | N/A
Padwal et al [37] | 2017 | Asynchronous online | Face to face | 651 | 11,727 | 477,000 | N/A
Padwal et al [38] | 2013 | Asynchronous online | Face to face | N/A | N/A | N/A | Protocol
Perkins et al [41] | 2012 | Blended | Face to face | 3732 | 438 | 935 | N/A
Spanou et al [46] | 2010 | Asynchronous online | Face to face | N/A | N/A | N/A | Protocol
Stansfeld et al [47] | 2015 | Asynchronous online | Face to face | 350 | N/A | N/A | Captured approach to total costs but incomplete comparison data to nononline approach
Weiss et al [51] | 2011 | Asynchronous online | Other learning | N/A | N/A | N/A | Cost reduction per inhabitant following education program

^a These studies were given the prefix "COMP" to indicate that this group was a summary of costs with a comparator; the prefix and number can be used to provide a unique ID to refer to studies.

^b N/A: not available/applicable.

The studies in this set can be divided into two groups: studies that demonstrated that eLearning was of lower cost but had no or limited evidence of self-reported educational impact, and studies that demonstrated that eLearning was of lower cost and had self-reported high educational impact [25,51].

Among the studies that demonstrated that eLearning was of lower cost but had a low educational impact, the key data issue was that although these studies suggested that eLearning was lower cost, they consistently omitted key components in the design and production of eLearning, thereby creating an incomplete profile of the total costs of delivery. Two studies in this set demonstrated that eLearning was of lower cost and had a high educational impact; although each study completed a full comparison demonstrating a reduction in costs (in some instances a dramatic reduction), the studies suffer from a lack of methodological consistency in the way they captured costs and evaluated effectiveness. As was the case in the previous set of study classifications, the continued differences in cost accounting, learning delivery platforms, and various forms of assessment make synthesis challenging.

Literature Reviews That Quantify eLearning Costs

Two review studies [20,42] analyzed training in which eLearning was used as a delivery platform. Both revealed a lack of sufficient evidence to determine whether training methods using aspects of online learning were more pedagogically effective. The reviews were also unable to provide findings that created a holistic understanding of the associated cost ingredients. Dumestre et al [20] suggested that within the field of microsurgical training, there are many available methods of implementing instruction and that cost is the determining factor in the method used by institutions. Reeves et al [42] performed a Cochrane systematic review that included 15 studies. The review showed that due to the small number of studies (N=15) and the heterogeneity of interventions and outcome measures, it is not possible to draw inferences about the key elements of interprofessional education and its effectiveness. To make such evaluation possible, cost-benefit analyses must be implemented, reviews must be separated by specific professions, and studies using qualitative methods must be used to evaluate effectiveness. Although both reviews were concerned with evaluating the effectiveness of specific education training, the way they engaged with the literature review question was limited, as both collected limited information on eLearning and gave only broad summary generalizations about cost reductions in their respective fields of focus. Costs were identified by looking at the total costs of program delivery; however, because the costs were not described as units, it is not possible to examine the extent and quality of the results. There was no accommodation for the differential timing or impact of the consequences of cost decisions. These issues are similar to the weaknesses in cost analysis of the other studies included in this review.

Studies Describing Costing Approaches

Twenty-two studies [56-77] referenced economic evaluation (analyzing cost-benefit or cost-effectiveness) or used the ingredients method [78] to calculate costs in the production of eLearning (Table 4). Reflecting on the broader set of studies in this review, it is important to note that although many studies suggest that eLearning is cost-effective, we identified only 5 completed cost-effectiveness analyses of eLearning. Regarding specific costing approaches, the ingredients method is referenced often in this set (12 times); however, the mechanisms for cost capture and the subsequent project management of learning production within this group are inconsistent despite use of the same method.

Table 4. Studies detailing costing approaches or economic evaluation.
Reference | Year | Costing approach
Brown [56] | 2014 | Cost-benefit analysis
Buntrock et al [57] | 2014 | Cost-effectiveness analysis
Pettit et al [58] | 2017 | Ingredients cost method
Carlson et al [59] | 2008 | Ingredients cost method
Carpenter [60] | 2016 | Ingredients cost method
Chambers et al [61] | 2017 | Cost-utility analysis
Chhabra et al [62] | 2013 | Cost-effectiveness analysis
Cousineau et al [63] | 2008 | Cost-effectiveness analysis
Curran et al [64] | 2006 | Ingredients cost method
Cook [65] | 2014 | Ingredients cost method
Delgaty [66] | 2013 | Ingredients cost method
Djukic et al [67] | 2015 | Ingredients cost method
Gallimore et al [68] | 2012 | Ingredients cost method
Isaacson et al [69] | 2014 | Ingredients cost method
Lonsdale et al [70] | 2016 | Cost-effectiveness analysis
Papadatou-Pastou et al [71] | 2017 | Multiple; survey of methods
Pardue [72] | 2001 | Ingredients cost method
Pickering and Joynes [73] | 2016 | Multiple; survey of methods
Rondags et al [74] | 2015 | Cost-effectiveness analysis
Sharma et al [75] | 2018 | Ingredients cost method
Tung and Chang [76] | 2008 | Perceived financial cost
Zary et al [77] | 2006 | Ingredients cost method
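To make the ingredients method referenced in Table 4 concrete, the following minimal sketch (written in Python for illustration; all ingredient names, quantities, unit prices, and the cohort size are hypothetical and not drawn from any included study) enumerates the resources required to produce and deliver an eLearning course, prices each one, and derives the total cost and cost per learner that the included studies most commonly report:

```python
# Minimal illustration of the ingredients (resource costing) method:
# list every resource ("ingredient") consumed in producing and delivering
# a course, price each one, sum, and divide by the number of learners.
# All names, quantities, and unit prices are hypothetical.

from dataclasses import dataclass


@dataclass
class Ingredient:
    name: str
    quantity: float   # eg, hours of staff time, licenses, server-months
    unit_cost: float  # US $ per unit

    @property
    def cost(self) -> float:
        return self.quantity * self.unit_cost


ingredients = [
    Ingredient("Subject-matter expert time (h)", 120, 85.0),
    Ingredient("Instructional designer time (h)", 200, 60.0),
    Ingredient("Video production (h)", 40, 150.0),
    Ingredient("Learning platform hosting (months)", 12, 250.0),
    Ingredient("Administrative support (h)", 30, 35.0),
]

total_cost = sum(item.cost for item in ingredients)  # direct + indirect costs
learners = 400                                       # hypothetical cohort size
cost_per_learner = total_cost / learners

for item in ingredients:
    print(f"{item.name}: US ${item.cost:,.2f}")
print(f"Total cost: US ${total_cost:,.2f}")
print(f"Cost per learner: US ${cost_per_learner:,.2f}")
```

Charting costs at this level of detail is what would allow the fixed production costs and variable delivery costs noted in the limitations to be separated and compared across studies.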

Discussion

Principal Findings

Our review focused on identifying literature that would define the costs associated with the delivery of eLearning in health professions education. Broadly speaking, we were able to answer this question, as we collected data documenting a trend of reported eLearning costs per learner and their generally low cost. However, we have questions about how conclusive these data are because of the inconsistency of cost data capture, the lack of standard mechanisms for collecting cost data for online learning, and the lack of primary studies that focused on cost analysis as a primary research objective. Our findings are consistent with views put forth in previous research that the understanding of the relationship of cost to eLearning is not well developed [6,79,80]. The included studies provide a cross-section of various instances of eLearning across many disciplines in health professions education. This collection of studies allowed us to gain a deeper understanding of the various ways in which eLearning is being used and the cost considerations involved when applying different platforms of education delivery. The key limitation of the included studies was the lack of a consistent methodology for cost analysis. The cost evidence provided by the included studies was therefore challenging to compare.

Strengths and Limitations

A strength of this review is its comprehensive search of the major literature databases. The search question and the associated terms provided a sufficiently broad scope to ensure coverage of any study that recorded cost and maintained relevance to the inclusion criteria. The search approach was designed in consultation with leading researchers who investigate cost in education, and the final results provide a rich background of materials with which to explore the issues associated with the research question.

There are four limitations to the process used in this literature review. First, as only English-language papers were searched, relevant foreign-language papers could have been excluded; this is in addition to the publication bias of health science papers toward positive results. Furthermore, industry literature was not explicitly included in the search strategy, further limiting the body of papers under review. Second, a meta-analysis to quantify costs is not possible because of the lack of predefined costing models for eLearning used in standard ways across studies, the significant variance in the way costs are recorded, variant experimental methods with different outcome conclusions, and the variance in implementation between different eLearning types. Third, in comparing the costs of eLearning within the included studies, each study was treated equally, whereas the costs for a team new to eLearning production will likely be higher than those of an experienced team that has produced many courses. Additionally, reported costs could cover only segments of the production process, resulting in inconsistency in reporting. Further research could explore specific aspects of design, development, and delivery to allow for more refined comparison and analysis, including quantitative cost analysis such as that of fixed versus variable costs. In addition to this cost analysis, further work could explore the relationship between learning impact and the associated effort attributed to cost. Lastly, this review was rerun in December 2018 to update the results of the original scoping of the literature, which was completed in December 2015 and analyzed in spring 2016, but detailed analysis of the new studies identified from 2016 to 2018 is not included in the narrative of this review. Although the newly included studies are incorporated into the data tables, because of time constraints, further analysis of these new studies will be completed in a separate update of this review.

Therefore, the review could be strengthened by taking further measures to either refine the research question to a narrower scope or attempt cost modeling with acknowledged deficiencies. Nevertheless, the review as completed provides a comprehensive scope of the current evidence and highlights a gap in the literature, indicating a need for a protocol that can capture costs in eLearning interventions to allow a basis for comparison within similar educational subjects or across variant curriculum implementations. Such a protocol would provide a systematic mechanism for calculating online learning costs to allow for a basis for various forms of economic evaluation. This would assist course designers in understanding the total costs of delivering eLearning and address the standardization issues that arise from the lack of a standard, as evidenced by this review.

Conclusions

Although cost is a recognized factor in studies exploring eLearning design and implementation, the way cost is captured is inconsistent and is assessed in relation to a wide variety of factors or with an alternate study-related focus. Despite a perception that eLearning is more cost-effective than face-to-face instruction, there is not yet sufficient evidence to assert this conclusively. Among the many factors to consider when implementing eLearning is the potential long-term cost-effectiveness of its delivery model in comparison to other education delivery formats. A rigorous, repeatable data capture method is needed, in addition to a means of leveraging existing economic evaluation methods, to test whether eLearning is cost-effective and how it can be implemented with cost benefits and advantages over traditional instruction. On the one hand, if eLearning is proven to be more cost-effective, this could assist in addressing the high cost of delivering health professions education. On the other hand, should evidence point the other way, having discrete data points will allow those involved in health education to identify ways to optimize costs in eLearning delivery to create cost efficiency. To evaluate and optimize cost in education delivery, there must be a rigorous standard through which to score and assess cost-effectiveness, which would enable analysis of whether investments are justified.

To gain a comprehensive understanding of the way cost impacts the deployment of eLearning in comparison to face-to-face instruction, a body of evidence that makes use of economic evaluation must be developed to allow for systematic analysis of how these results demonstrate the strengths and weaknesses of comparative cost delivery. This review has identified the limited use of economic evaluations to achieve this aim thus far. Moreover, even among studies that make use of cost summaries in their results, there is a lack of sufficient rigor to provide insight into the way in which these costs impact education delivery or to allow for comparisons to other forms of learning.

Acknowledgments

We would like to thank Abrar Alturkistani for helping in the formatting of the manuscript. We would like to thank Rebecca Jones of the Imperial College Medical Library for assistance in the development of database queries for this review. This work was supported by the European Institute of Innovation and Technology–EIT Health Knowledge and Innovation Community (KIC). Research issues were identified and prioritized by members of the public at the launch event for the WHO/Imperial College London Systematic Review of eLearning in undergraduate health professionals held at 170 Queen’s Gate, Imperial College London in January 2015, cochaired by Dr Al-Shorbaji (WHO) and Dr Josip Car (Imperial College London). During this manuscript development, Scott Reeves provided key insight and perspectives which were incorporated in the main text; without his valuable feedback this manuscript would not have been possible.

Authors' Contributions

JC conceived the study topic, and EM (under supervision of JC) devised the primary research question, scope, structure, and methods of the investigation. EM drafted and completed the primary manuscript; the text has been adapted from EM’s doctoral thesis in Clinical Medicine Research (with a concentration in Public Health) at Imperial College London. JE and CB completed peer review of papers for selection and analysis. SM, GR, DI, KW, and AM provided feedback on the draft texts. EM responded to peer review feedback. The final manuscript was approved by all authors. EM is the guarantor.

Conflicts of Interest

None declared.

Multimedia Appendix 1

Full search strategy.

DOCX File , 20 KB

Multimedia Appendix 2

Eligibility stage search exclusions.

DOCX File , 42 KB

  1. Pruitt SD, Epping-Jordan JE. Preparing the 21st century global healthcare workforce. BMJ 2005 Mar 19;330(7492):637-639 [FREE Full text] [CrossRef] [Medline]
  2. Sousa A, Scheffler RM, Nyoni J, Boerma T. A comprehensive health labour market framework for universal health coverage. Bull World Health Organ 2013 Nov 01;91(11):892-894 [FREE Full text] [CrossRef] [Medline]
  3. Plint S. Securing The Future Of The GP Workforce: Delivering The Mandate On GP Expansion. United Kingdom: Health Education England; 2014 Jul 14.   URL: https://kingsfund.blogs.com/health_management/2014/07/securing-the-future-of-the-gp-workforce-delivering-the-mandate-on-gp-expansion-.html [accessed 2021-01-15]
  4. Sangrà A, Vlachopoulos D, Cabrera N. Building an inclusive definition of e-learning: An approach to the conceptual framework. In: International Review of Research in Open and Distributed Learning. Montreal, Canada: Athabasca University Press; Apr 13, 2012:145-159.
  5. Smith J, Holder H, Edwards N, Maybin J, Parker H, Rosen R, et al. Securing the future of general practice: New models of primary care. Nuffield Trust. United Kingdom: The King's Fund; 2013.   URL: https://www.nuffieldtrust.org.uk/research/securing-the-future-of-general-practice-new-models-of-primary-care [accessed 2021-01-15]
  6. Atun R, Car J, Majeed A, Wheeler E. eLearning for undergraduate health professional education: A systematic review informing a radical transformation of health workforce development. Geneva, Switzerland: World Health Organization; 2015.
  7. Levac D, Colquhoun H, O'Brien KK. Scoping studies: advancing the methodology. Implement Sci 2010 Sep 20;5:69 [FREE Full text] [CrossRef] [Medline]
  8. Hollin IL, Robinson KA. A Scoping Review of Healthcare Costs for Patients with Cystic Fibrosis. Appl Health Econ Health Policy 2016 Apr;14(2):151-159. [CrossRef] [Medline]
  9. Arksey H, O'Malley L. Scoping studies: towards a methodological framework. Int J Soc Res Method 2005 Feb;8(1):19-32. [CrossRef]
  10. Higgins J, Thomas J, Chandler J, Cumpston M, Li T, Page M, et al. Cochrane Handbook for Systematic Reviews of Interventions.: John Wiley & Sons; 2019.   URL: http://training.cochrane.org/handbook [accessed 2019-06-15]
  11. Gray A, Clarke P, Wolstenholme J, Wordsworth S. Applied methods of cost-effectiveness analysis in health care. New York, NY: Oxford University Press; 2011.
  12. Deming DJ, Goldin C, Katz LF, Yuchtman N. Can Online Learning Bend the Higher Education Cost Curve? Am Econ Rev 2015 May 01;105(5):496-501. [CrossRef]
  13. Allan GM, Korownyk C, Tan A, Hindle H, Kung L, Manca D. Developing an integrated evidence-based medicine curriculum for family medicine residency at the University of Alberta. Acad Med 2008 Jun;83(6):581-587. [CrossRef] [Medline]
  14. Bandla H, Franco RA, Simpson D, Brennan K, McKanry J, Bragg D. Assessing learning outcomes and cost effectiveness of an online sleep curriculum for medical students. J Clin Sleep Med 2012 Aug 15;8(4):439-443. [CrossRef] [Medline]
  15. Berger J, Topp R, Davis L, Jones J, Stewart L. Comparison of Web-based and face-to-face training concerning patient education within a hospital system. J Nurses Staff Dev 2009;25(3):127-32; quiz 133. [CrossRef] [Medline]
  16. Butler CC, Simpson SA, Hood K, Cohen D, Pickles T, Spanou C, et al. Training practitioners to deliver opportunistic multiple behaviour change counselling in primary care: a cluster randomised trial. BMJ 2013 Mar 19;346:f1191 [FREE Full text] [CrossRef] [Medline]
  17. Choi AA, Tamblyn R, Stringer MD. Electronic resources for surgical anatomy. ANZ J Surg 2008 Dec;78(12):1082-1091. [CrossRef] [Medline]
  18. Collins J, Adamski MM, Twohig C, Murgia C. Opportunities for training for nutritional professionals in nutritional genomics: What is out there? Nutr Diet 2018 Apr;75(2):206-218. [CrossRef] [Medline]
  19. Foucart S, Nadeau R, de Champlain J. The release of catecholamines from the adrenal medulla and its modulation by alpha 2-adrenoceptors in the anaesthetized dog. Can J Physiol Pharmacol 1987 Apr;65(4):550-557. [CrossRef] [Medline]
  20. Dumestre D, Yeung JK, Temple-Oberle C. Evidence-based microsurgical skill-acquisition series part 1: validated microsurgical models--a systematic review. J Surg Educ 2014;71(3):329-338. [CrossRef] [Medline]
  21. Glasbey J, Sinclair P, Mohan H, Harries R, ASiT 40-4-40 Course Organisers. 40-4-40: educational and economic outcomes of a free, international surgical training event. Postgrad Med J 2017 Dec;93(1106):730-735. [CrossRef] [Medline]
  22. Grayson ML, Stewardson AJ, Russo PL, Ryan KE, Olsen KL, Havers SM, Hand Hygiene Australiathe National Hand Hygiene Initiative. Effects of the Australian National Hand Hygiene Initiative after 8 years on infection control practices, health-care worker education, and clinical outcomes: a longitudinal study. Lancet Infect Dis 2018 Nov;18(11):1269-1277. [CrossRef] [Medline]
  23. Hardwick DF, Sinard J, Silva F. Development and evolution of The Knowledge Hub for Pathology and related electronic resources. Hum Pathol 2011 Jun;42(6):795-801. [CrossRef] [Medline]
  24. Jerin JM, Rea TD. Web-based training for EMT continuing education. Prehosp Emerg Care 2005;9(3):333-337. [CrossRef] [Medline]
  25. Joshi A, Perin DMP. Gaps in the existing public health informatics training programs: a challenge to the development of a skilled global workforce. Perspect Health Inf Manag 2012;9:1-13 [FREE Full text] [Medline]
  26. Kaufman N. Internet and information technology use in treatment of diabetes. Int J Clin Pract Suppl 2010 Feb(166):41-46. [CrossRef] [Medline]
  27. Knapp H, Fletcher M, Taylor A, Chan K, Goetz MB. No clinic left behind: providing cost-effective in-services via distance learning. J Healthc Qual 2011 Sep;33(5):17-24. [CrossRef] [Medline]
  28. Kumpu M, Atkins S, Zwarenstein M, Nkonki L, ARCADE consortium. A partial economic evaluation of blended learning in teaching health research methods: a three-university collaboration in South Africa, Sweden, and Uganda. Glob Health Action 2016;9:28058 [FREE Full text] [CrossRef] [Medline]
  29. Letterie GS. Medical education as a science: the quality of evidence for computer-assisted instruction. Am J Obstet Gynecol 2003 Mar;188(3):849-853. [CrossRef] [Medline]
  30. Likic R, White C, Cinti S, Purkiss J, Fantone J, Chapman C, et al. Online learning applied to a course on rational therapeutics: an international comparison between final year students of two medical schools. Br J Clin Pharmacol 2013 Feb;75(2):373-380. [CrossRef] [Medline]
  31. Manring J, Greenberg RP, Gregory R, Gallinger L. Learning psychotherapy in the digital age. Psychotherapy (Chic) 2011 Jun;48(2):119-126. [CrossRef] [Medline]
  32. McConnell KJ, Newlon C, Dickerhofe J. A model for continuing pharmacy education. Am J Pharm Educ 2009 Aug 28;73(5):87 [FREE Full text] [CrossRef] [Medline]
  33. McDuffie CH, Duke LJ, Stevenson TL, Sheffield MC, Fetterman JW, Staton AG, et al. Consortium-based approach to an online preceptor development program. Am J Pharm Educ 2011 Sep 10;75(7):135 [FREE Full text] [CrossRef] [Medline]
  34. Moreno-Ger P, Torrente J, Bustamante J, Fernández-Galaz C, Fernández-Manjón B, Comas-Rengifo M. Application of a low-cost web-based simulation to improve students' practical skills in medical education. Int J Med Inform 2010 Jun;79(6):459-467. [CrossRef] [Medline]
  35. Nickel F, Brzoska JA, Gondan M, Rangnick HM, Chu J, Kenngott HG, et al. Virtual reality training versus blended learning of laparoscopic cholecystectomy: a randomized controlled trial with laparoscopic novices. Medicine (Baltimore) 2015 May;94(20):e764. [CrossRef] [Medline]
  36. Nicklen P, Rivers G, Ooi C, Ilic D, Reeves S, Walsh K, et al. An Approach for Calculating Student-Centered Value in Education - A Link between Quality, Efficiency, and the Learning Experience in the Health Professions. PLoS One 2016;11(9):e0162941 [FREE Full text] [CrossRef] [Medline]
  37. Padwal R, Klarenbach S, Sharma A, Fradette M, Jelinski S, Edwards A, et al. The evaluating self-management and educational support in severely obese patients awaiting multidisciplinary bariatric care (EVOLUTION) trial: principal results. BMC Med 2017 Mar 02;15(1):46 [FREE Full text] [CrossRef] [Medline]
  38. Padwal R, Sharma A, Fradette M, Jelinski S, Klarenbach S, Edwards A, et al. The evaluating self-management and educational support in severely obese patients awaiting multidisciplinary bariatric care (EVOLUTION) trial: rationale and design. BMC Health Serv Res 2013 Aug 17;13:321 [FREE Full text] [CrossRef] [Medline]
  39. Palmer RT, Biagioli FE, Mujcic J, Schneider BN, Spires L, Dodson LG. The feasibility and acceptability of administering a telemedicine objective structured clinical exam as a solution for providing equivalent education to remote and rural learners. Rural Remote Health 2015;15(4):3399 [FREE Full text] [Medline]
  40. Pentiak P, Schuch-Miller D, Streetman R, Marik K, Callahan R, Long G, et al. Barriers to adoption of the surgical resident skills curriculum of the American College of Surgeons/Association of Program Directors in Surgery. Surgery 2013 Jul;154(1):23-28. [CrossRef] [Medline]
  41. Perkins G, Kimani P, Bullock I, Clutton-Brock T, Davies R, Gale M, Electronic Advanced Life Support Collaborators. Improving the efficiency of advanced life support training: a randomized, controlled trial. Ann Intern Med 2012 Jul 03;157(1):19-28. [CrossRef] [Medline]
  42. Reeves S, Perrier L, Goldman J, Freeth D, Zwarenstein M. Interprofessional education: effects on professional practice and healthcare outcomes (update). Cochrane Database Syst Rev 2013 Mar 28(3):CD002213 [FREE Full text] [CrossRef] [Medline]
  43. Schopf T, Flytkjær V. Doctors and nurses benefit from interprofessional online education in dermatology. BMC Med Educ 2011 Oct 14;11:84 [FREE Full text] [CrossRef] [Medline]
  44. Shepler B. Cost savings associated with pharmacy student interventions during APPEs. Am J Pharm Educ 2014 May 15;78(4):71 [FREE Full text] [CrossRef] [Medline]
  45. Sivamalai S, Murthy S, Gupta T, Woolley T. Teaching pathology via online digital microscopy: positive learning outcomes for rurally based medical students. Aust J Rural Health 2011 Feb;19(1):45-51. [CrossRef] [Medline]
  46. Spanou C, Simpson S, Hood K, Edwards A, Cohen D, Rollnick S, et al. Preventing disease through opportunistic, rapid engagement by primary care teams using behaviour change counselling (PRE-EMPT): protocol for a general practice-based cluster randomised trial. BMC Fam Pract 2010 Sep 21;11:69 [FREE Full text] [CrossRef] [Medline]
  47. Stansfeld SA, Kerry S, Chandola T, Russell J, Berney L, Hounsome N, et al. Pilot study of a cluster randomised trial of a guided e-learning health promotion intervention for managers based on management standards for the improvement of employee well-being and reduction of sickness absence: GEM Study. BMJ Open 2015 Oct 26;5(10):e007981 [FREE Full text] [CrossRef] [Medline]
  48. Stromberg A. Implementing a European curriculum for clinical expertise in heart failure nursing, an educational initiative from the HFA and CCNAP. 2012 Presented at: ESC Congress; August 25-29, 2012; Munich, Germany p. 1065.
  49. Thomas A, Fried G, Johnson P, Stilwell B. Sharing best practices through online communities of practice: a case study. Hum Resour Health 2010 Nov 12;8:25 [FREE Full text] [CrossRef] [Medline]
  50. de Ruijter V, Halvax P, Dallemagne B, Swanström L, Marescaux J, Perretta S. The Business Engineering Surgical Technologies (BEST) teaching method: incubating talents for surgical innovation. Surg Endosc 2015 Jan;29(1):48-54. [CrossRef] [Medline]
  51. Weiss K, Blais R, Fortin A, Lantin S, Gaudet M. Impact of a multipronged education strategy on antibiotic prescribing in Quebec, Canada. Clin Infect Dis 2011 Sep;53(5):433-439. [CrossRef] [Medline]
  52. Williams R, McPherson L, Kong A, Skipper B, Weller N, PRIME Net clinicians. Internet-based training in a practice-based research network consortium: a report from the Primary Care Multiethnic Network (PRIME Net). J Am Board Fam Med 2009;22(4):446-452 [FREE Full text] [CrossRef] [Medline]
  53. Young G, McLaren L, Maden M. Delivering a MOOC for literature searching in health libraries: evaluation of a pilot project. Health Info Libr J 2017 Dec;34(4):312-318. [CrossRef] [Medline]
  54. Zhou L, Tait G, Sandhu S, Steiman A, Lake S. Online virtual cases to teach resource stewardship. Clin Teach 2019 Jun;16(3):220-225. [CrossRef] [Medline]
  55. Thomas G. How to do your Case Study: A Guide For Students And Researchers. United Kingdom: Sage Publications Ltd; 2011.
  56. Brown M, Bullock A. Evaluating PLATO: postgraduate teaching and learning online. Clin Teach 2014 Feb;11(1):10-14. [CrossRef] [Medline]
  57. Buntrock C, Ebert DD, Lehr D, Cuijpers P, Riper H, Smit F, et al. Evaluating the efficacy and cost-effectiveness of web-based indicated prevention of major depression: design of a randomised controlled trial. BMC Psychiatry 2014 Jan 31;14:25 [FREE Full text] [CrossRef] [Medline]
  58. Pettit RK, Kinney M, McCoy L. A descriptive, cross-sectional study of medical student preferences for vodcast design, format and pedagogical approach. BMC Med Educ 2017 May 19;17(1):89 [FREE Full text] [CrossRef] [Medline]
  59. Carlson JJ, Eisenmann JC, Pfeiffer KA, Jager KB, Sehnert ST, Yee KE, et al. (S)Partners for Heart Health: a school-based program for enhancing physical activity and nutrition to promote cardiovascular health in 5th grade students. BMC Public Health 2008 Dec 22;8:420 [FREE Full text] [CrossRef] [Medline]
  60. Carpenter SH. What deters nurses from participating in web-based graduate nursing programs?: A cross-sectional survey research study. Nurse Educ Today 2016 Jan;36:70-76. [CrossRef] [Medline]
  61. Chambers SK, Ritterband L, Thorndike F, Nielsen L, Aitken JF, Clutton S, et al. A study protocol for a randomised controlled trial of an interactive web-based intervention: CancerCope. BMJ Open 2017 Jun 23;7(6):e017279 [FREE Full text] [CrossRef] [Medline]
  62. Chhabra HS, Harvey LA, Muldoon S, Chaudhary S, Arora M, Brown DJ, et al. www.elearnSCI.org: a global educational initiative of ISCoS. Spinal Cord 2013 Mar;51(3):176-182. [CrossRef] [Medline]
  63. Cousineau TM, Green TC, Corsini E, Seibring A, Showstack MT, Applegarth L, et al. Online psychoeducational support for infertile women: a randomized controlled trial. Hum Reprod 2008 Mar;23(3):554-566 [FREE Full text] [CrossRef] [Medline]
  64. Curran VR, Fleet L, Kirby F. Factors influencing rural health care professionals' access to continuing professional education. Aust J Rural Health 2006 Apr;14(2):51-55. [CrossRef] [Medline]
  65. Cook DA. The value of online learning and MRI: finding a niche for expensive technologies. Med Teach 2014 Nov;36(11):965-972. [CrossRef] [Medline]
  66. Delgaty L. A critical examination of the time and workload involved in the design and delivery of an e-module in postgraduate clinical education. Med Teach 2013 May;35(5):e1173-e1180. [CrossRef] [Medline]
  67. Djukic M, Adams J, Fulmer T, Szyld D, Lee S, Oh S, et al. E-Learning with virtual teammates: A novel approach to interprofessional education. J Interprof Care 2015;29(5):476-482. [CrossRef] [Medline]
  68. Gallimore C, Barnett SG, Porter AL, Kopacek KJ. Evaluation of pharmacotherapy laboratory revisions implemented to reduce cost. Am J Pharm Educ 2012 May 10;76(4):67 [FREE Full text] [CrossRef] [Medline]
  69. Isaacson RS, Haynes N, Seifan A, Larsen D, Christiansen S, Berger JC, et al. Alzheimer's Prevention Education: If We Build It, Will They Come? www.AlzU.org. J Prev Alzheimers Dis 2014;1(2):91-98 [FREE Full text] [Medline]
  70. Lonsdale C, Sanders T, Cohen KE, Parker P, Noetel M, Hartwig T, et al. Scaling-up an efficacious school-based physical activity intervention: Study protocol for the 'Internet-based Professional Learning to help teachers support Activity in Youth' (iPLAY) cluster randomized controlled trial and scale-up implementation evaluation. BMC Public Health 2016 Aug 24;16(1):873 [FREE Full text] [CrossRef] [Medline]
  71. Papadatou-Pastou M, Goozee R, Payne E, Barrable A, Tzotzoli P. A review of web-based support systems for students in higher education. Int J Ment Health Syst 2017;11:59 [FREE Full text] [CrossRef] [Medline]
  72. Pardue SL. The virtual revolution: implications for academe. Poult Sci 2001 May;80(5):553-561 [FREE Full text] [CrossRef] [Medline]
  73. Pickering JD, Joynes VCT. A holistic model for evaluating the impact of individual technology-enhanced learning resources. Med Teach 2016 Dec;38(12):1242-1247. [CrossRef] [Medline]
  74. Rondags SMPA, de Wit M, van Tulder MW, Diamant M, Snoek FJ. HypoAware-a brief and partly web-based psycho-educational group intervention for adults with type 1 and insulin-treated type 2 diabetes and problematic hypoglycaemia: design of a cost-effectiveness randomised controlled trial. BMC Endocr Disord 2015 Aug 21;15:43 [FREE Full text] [CrossRef] [Medline]
  75. Sharma M, Chris A, Chan A, Knox DC, Wilton J, McEwen O, et al. Decentralizing the delivery of HIV pre-exposure prophylaxis (PrEP) through family physicians and sexual health clinic nurses: a dissemination and implementation study protocol. BMC Health Serv Res 2018 Jul 03;18(1):513 [FREE Full text] [CrossRef] [Medline]
  76. Tung F, Chang S. A new hybrid model for exploring the adoption of online nursing courses. Nurse Educ Today 2008 Apr;28(3):293-300. [CrossRef] [Medline]
  77. Zary N, Johnson G, Boberg J, Fors UGH. Development, implementation and pilot evaluation of a Web-based Virtual Patient Case Simulation environment--Web-SP. BMC Med Educ 2006 Feb 21;6:10 [FREE Full text] [CrossRef] [Medline]
  78. Levin H, McEwan P. Cost-effectiveness analysis: methods and applications. United Kingdom: Sage Publications; 2000.
  79. George PP, Papachristou N, Belisario JM, Wang W, Wark PA, Cotic Z, et al. Online eLearning for undergraduates in health professions: A systematic review of the impact on knowledge, skills, attitudes and satisfaction. J Glob Health 2014 Jun;4(1):010406. [CrossRef] [Medline]
  80. Car J, Carlstedt-Duke J, Tudor Car L, Posadzki P, Whiting P, Zary N, Digital Health Education Collaboration. Digital Education in Health Professions: The Need for Overarching Evidence Synthesis. J Med Internet Res 2019 Feb 14;21(2):e12913 [FREE Full text] [CrossRef] [Medline]


Abbreviations

PICO: Population, Intervention, Comparison, Outcome


Edited by G Eysenbach; submitted 11.02.19; peer-reviewed by D Cook, G Myreteg, R de Leeuw, M Davis; comments to author 12.03.19; revised version received 25.04.19; accepted 18.12.20; published 11.03.21

Copyright

©Edward Meinert, Jessie Eerens, Christina Banks, Stephen Maloney, George Rivers, Dragan Ilic, Kieran Walsh, Azeem Majeed, Josip Car. Originally published in JMIR Medical Education (http://mededu.jmir.org), 11.03.2021.

This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Medical Education, is properly cited. The complete bibliographic information, a link to the original publication on http://mededu.jmir.org/, as well as this copyright and license information must be included.