Preprints (earlier versions) of this paper are available at https://preprints.jmir.org/preprint/38259.
Usability Methods and Attributes Reported in Usability Studies of Mobile Apps for Health Care Education: Scoping Review


Review

1Faculty of Health and Function, Western Norway University of Applied Sciences, Bergen, Norway

2Division of Health Services, Norwegian Institute of Public Health, Oslo, Norway

3Faculty of Health Sciences, McMaster University, Hamilton, ON, Canada

*all authors contributed equally

Corresponding Author:

Susanne Grødem Johnson, MSc

Faculty of Health and Function

Western Norway University of Applied Sciences

Inndalsveien 28

Bergen, 5063

Norway

Phone: 47 92213202

Email: susanne.grodem.johnson@hvl.no


Background: Mobile devices can provide extendable learning environments in higher education and motivate students to engage in adaptive and collaborative learning. Developers must design mobile apps that are practical, effective, and easy to use, and usability testing is essential for understanding how mobile apps meet users’ needs. No previous reviews have investigated the usability of mobile apps developed for health care education.

Objective: The aim of this scoping review is to identify usability methods and attributes in usability studies of mobile apps for health care education.

Methods: A comprehensive search was carried out in 10 databases, reference lists, and gray literature. Studies were included if they dealt with health care students and the usability of mobile apps for learning. Frequencies and percentages were used to present the nominal data, together with tables and graphical illustrations: a figure of the study selection process, an illustration of the frequency of inquiry-based usability evaluation and data collection methods, and an overview of the distribution of the identified usability attributes. We followed the Arksey and O’Malley framework for scoping reviews.

Results: Our scoping review collated 88 articles involving 98 studies, mainly related to medical and nursing students. The studies were conducted in 22 countries and were published between 2008 and 2021. Field testing was the main type of usability experiment, and the usability evaluation methods were either inquiry-based or based on user testing. Inquiry methods predominated: 1-group design (46/98, 47%), control group design (12/98, 12%), randomized controlled trials (12/98, 12%), mixed methods (12/98, 12%), and qualitative methods (11/98, 11%). The only user testing method applied was the think-aloud method (5/98, 5%). A total of 17 usability attributes were identified; of these, satisfaction, usefulness, ease of use, learning performance, and learnability were reported most frequently. The most frequently used data collection method was questionnaires (83/98, 85%), but only 19% (19/98) of the studies used a psychometrically tested usability questionnaire. Other data collection methods included focus group interviews, knowledge and task performance testing, user data collected from apps, interviews, written qualitative reflections, and observations. Most of the included studies used more than one data collection method.

Conclusions: Experimental designs were the most commonly used methods for evaluating usability, and most studies used field testing. Questionnaires were frequently used for data collection, although few studies used psychometrically tested questionnaires. The usability attributes identified most often were satisfaction, usefulness, and ease of use. The results indicate that combining different usability evaluation methods, incorporating both subjective and objective usability measures, and specifying which usability attributes to test seem advantageous. The results can support the planning and conduct of future usability studies for the advancement of mobile learning apps in health care education.

International Registered Report Identifier (IRRID): RR2-10.2196/19072

JMIR Med Educ 2022;8(2):e38259

doi:10.2196/38259




Introduction

Background

Mobile devices can provide extendable learning environments and motivate students to engage in adaptive and collaborative learning [1,2]. Mobile devices offer various functions, enable convenient access, and support the ability to share information with other learners and teachers [3]. Most students own a mobile phone, which makes mobile learning easily accessible [4]. However, there are some challenges associated with mobile devices in learning situations, such as small screen sizes, connectivity problems, and multiple distractions in the environment [5].

Developers of mobile learning apps need to consider usability to ensure that apps are practical, effective, and easy to use [1] and to ascertain that mobile apps meet users’ needs [6]. According to the International Organization for Standardization, usability is defined as “the extent to which a system, product or service can be used by specified users to achieve specified goals with effectiveness, efficiency, and satisfaction in a specified context of use” [7]. Better mobile learning usability can be achieved through user-centered design and attention to context, ensuring that the technology corresponds to users’ requirements and placing the user at the center of the process [8,9]. In addition, it is necessary to be conscious of the interrelatedness between usability and pedagogical design [9].

A variety of usability evaluation methods exist to test the usability of mobile apps; Weichbroth [10] grouped them into 4 categories: inquiry, user testing, inspection, and analytical modeling. Inquiry methods gather data from users through questionnaires (quantitative data) and through interviews and focus groups (qualitative data). User testing methods include think-aloud protocols, question-asking protocols, performance measurements, log analysis, eye tracking, and remote testing. Inspection methods, in contrast, rely on experts examining the app and include heuristic evaluation, cognitive walk-throughs, perspective-based inspections, and guideline reviews. Analytical modeling methods include cognitive task analysis and task environment analysis [10]. Across these 4 categories, the most commonly used data collection methods are controlled observations and surveys, whereas eye tracking, think-aloud methods, and interviews are applied less often [10].

Usability evaluations are normally performed either in a laboratory or in the field. Previous reviews have reported that usability evaluation methods are mainly applied in a laboratory, that is, in a controlled environment [1,11]. By contrast, field testing is conducted in real-life settings. Each approach has pros and cons: field testing allows data collection within a dynamic environment, whereas a laboratory makes data collection and test conditions easier to control [1]. A variety of data collection methods are appropriate for usability studies; in laboratories, for instance, participants often perform predefined tasks while data are collected through questionnaires and observations [1]. In field testing, logging mechanisms and diaries have been used to capture user interaction with mobile apps [1].

In all, 2 systematic reviews have examined psychometrically tested usability questionnaires as a means of enhancing the usability of apps: Sousa and Dunn Lopez [12] identified 15 such questionnaires, and Sure [13] identified 13. Of these, 5 questionnaires have proven applicable to usability studies in general: the System Usability Scale (SUS), Questionnaire for User Interaction Satisfaction, After-Scenario Questionnaire, Post-Study System Usability Questionnaire, and Computer System Usability Questionnaire [12]. The SUS and the After-Scenario Questionnaire are the most widely applied [13]. The usability attributes most frequently covered by these 5 questionnaires are learnability, efficiency, and satisfaction [12].

Usability attributes are features that measure the quality of mobile apps [1]. The most commonly reported usability attributes are effectiveness, efficiency, and satisfaction [5], which are part of the usability definition [7]. In the review by Weichbroth [10], 75 different usability attributes were identified. Given the wide selection of usability attributes, choosing appropriate attributes depends on the nature of the technology and the research question in the usability study [14]. Kumar and Mohite [1] recommended that researchers present and explain which usability attributes are being tested when mobile apps are being developed.

Previous reviews have examined the usability of mobile apps in general [5,10,11,14,15]; however, only one systematic review has specifically explored the usability of mobile learning apps [1], and it did not include studies from health care education. Similarly, usability has not been widely explored in medical education apps [16]. Thus, there is a need to develop a better understanding of how the usability of mobile learning apps developed for health care education has been evaluated and conceptualized in previous studies.

Objectives

The aim of this scoping review was therefore to identify usability methods and attributes reported in usability studies of mobile apps for health care education.


Methods

Framework

We have used the framework for scoping reviews developed by Arksey and O'Malley [17] and further developed by Levac et al [18] and Khalil et al [19]. We adopted the following five stages of this framework: (1) identifying the research question, (2) identifying relevant studies, (3) selecting studies, (4) charting the data, and (5) summarizing and reporting the results [17-19]. A detailed presentation of each step can be found in the published protocol for this scoping review [20]. We followed the PRISMA-ScR (Preferred Reporting Items for Systematic Reviews and Meta-Analyses extension for Scoping Reviews) checklist for reporting scoping reviews (Multimedia Appendix 1 [21]).

Stage 1: Identifying the Research Question

The following two research questions have been formulated:

  1. Which usability methods are used to evaluate the usability of mobile apps for health care education?
  2. Which usability attributes are reported in the usability studies of mobile apps for health care education?

Stage 2: Identifying Relevant Studies

A total of 10 electronic databases covering technology, education, and health care were searched for the period January 2008 to October 2021, with an updated search in February 2022. The databases were as follows: Engineering Village, Scopus, ACM Digital Library, IEEE Xplore, Education Resource Information Center, PsycINFO, CINAHL, MEDLINE, EMBASE, and Web of Science. The search string was developed by the first author and a research librarian and then peer reviewed by another research librarian. The search terms used in Web of Science, in addition to all relevant subject headings, included: ((student* or graduate* or undergraduate* or postgraduate*) NEAR/3 nurs*). This search string was repeated for other types of students and combined using the Boolean operator OR. The search string for all types of health care students was then combined with various search terms for mobile apps and mobile learning using the Boolean operator AND. Similar search strategies were adapted for all 10 databases, as shown in Multimedia Appendix 2. In addition, a citation search in Google Scholar, screening of the reference lists of included studies, and a search for gray literature in OpenGrey were conducted.
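
To make the structure of the strategy concrete, the sketch below assembles the blocks in simplified Web of Science syntax. Only the nursing block is quoted from the text above; the second population line and the app terms are illustrative stand-ins, and the complete, database-specific strategies are in Multimedia Appendix 2.

```
Population block (one line per student type, combined with OR):
  ((student* or graduate* or undergraduate* or postgraduate*) NEAR/3 nurs*)
  OR ((student* or graduate* or undergraduate* or postgraduate*) NEAR/3 medic*)   <- illustrative
  OR ...

App block (illustrative terms for mobile apps and mobile learning):
  ("mobile app*" or "mobile learning" or "m-learning" or smartphone*)

Final set = (Population block) AND (App block)
```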

Stage 3: Selecting Studies

Two of the authors independently screened titles and abstracts using the Rayyan web-based management software [22]. Studies deemed eligible by at least one of the authors were included for full-text screening and imported into the EndNote X9 (Clarivate) reference management system [23]. Full-text eligibility was determined independently by two of the authors, and disagreements were resolved through consensus-based discussions. Research articles with different designs were included, and there were no language restrictions. As mobile apps started appearing in 2008, this year was set as the starting point for the search. The eligibility criteria are presented in Table 1.

Table 1. Study eligibility.

Criterion | Inclusion criteria | Exclusion criteria
Population | Health care and allied health care students at the undergraduate and postgraduate levels | Health care professionals or students from education, engineering, or other nonhealth sciences
Concept | Studies of usability testing or methods of usability evaluation of mobile learning apps where the purpose relates to the development of the apps | Studies relating to learner management systems, e-learning platforms, open online courses, or distance education
Context | Typical educational setting (eg, classroom teaching, clinical placement, or simulation training), including both synchronous and asynchronous teaching | Noneducational settings not involving clinical placement or learning situations (eg, hospital or community settings)

Stage 4: Charting the Data (Data Abstraction)

The extracted data included information about the study (eg, authors, year of publication, title, and country), population (eg, number of participants), concepts (usability methods, usability attributes, and usability phase), and context (educational setting). The final data extraction sheet can be found in Multimedia Appendix 3 [24-111]. One review author extracted the data from the included studies using Microsoft Excel [21], and the extraction was checked by another researcher.

Descriptions of usability attributes have not been standardized, which makes categorization challenging. Therefore, a review author used deductive analysis to interpret the usability attributes reported in the included studies. This interpretation was based on definitions of usability attributes in the previous literature, against which the results of the included studies were assessed. The analysis was reviewed and discussed by another author, and disagreements were resolved through consensus-based discussion.
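
As a minimal sketch of this deductive step, the snippet below (Python) matches reported wordings against attribute definitions. The "likes and dislikes" wording for satisfaction is taken from the Discussion of this review; the other phrase lists are hypothetical placeholders, not the coding scheme actually used.

```python
# Sketch only: phrase lists other than the "satisfaction" examples are assumed.
ATTRIBUTE_WORDINGS = {
    "satisfaction": ["likes and dislikes", "recommend use to others"],
    "ease of use": ["easy to use"],     # hypothetical wording
    "learnability": ["easy to learn"],  # hypothetical wording
}

def categorize(reported_phrase: str):
    """Return the usability attribute whose defining wordings match the phrase."""
    phrase = reported_phrase.lower()
    for attribute, wordings in ATTRIBUTE_WORDINGS.items():
        if any(w in phrase for w in wordings):
            return attribute
    return None  # unmatched phrases go to consensus-based discussion

print(categorize("Students described their likes and dislikes"))  # -> satisfaction
```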

Stage 5: Summarizing and Reporting the Results

Frequencies and percentages were used to present the nominal data, together with tables and graphical illustrations: for instance, a figure showing the study selection process, an illustration of the frequency of inquiry-based usability evaluation and data collection methods, and an overview of the distribution of the identified usability attributes.
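
As an illustration of this summary step, the sketch below (Python) tallies one nominal variable into frequencies and percentages; the records and the "design" field are hypothetical stand-ins for the extraction sheet in Multimedia Appendix 3.

```python
from collections import Counter

# Hypothetical extraction records; the real sheet is Multimedia Appendix 3.
studies = [
    {"design": "Posttest 1-group design"},
    {"design": "Randomized controlled trial"},
    {"design": "Posttest 1-group design"},
    {"design": "Mixed methods"},
]

counts = Counter(s["design"] for s in studies)
total = len(studies)
for design, n in counts.most_common():
    print(f"{design}: {n}/{total} ({100 * n / total:.0f}%)")
```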


Results

Eligible Studies

Database searches yielded 34,369 records, and 2796 records were identified using other methods. After removing duplicates, 28,702 records remained. A total of 626 reports were examined in full text, and 88 articles were included in the scoping review [24-111] (Figure 1). In all, 8 articles reported results from more than one study, presented as study A, study B, or study C in Multimedia Appendix 3; therefore, a total of 98 studies were reported across the 88 included articles.

Figure 1. PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) flowchart of study selection process.

The included studies comprised a total sample population of 7790, with participant numbers ranging from 5 to 736 per study. Most of the studies included medical students (34/88, 39%) or nursing students (25/88, 28%). Other participants were students from pharmacy (9/88, 10%), dentistry (5/88, 6%), physiotherapy (5/88, 6%), health sciences (3/88, 3%), and psychology (2/88, 2%). Further information is provided in Multimedia Appendix 3. The articles were published from 22 countries, most often the United States (22/88, 25%), Spain (9/88, 10%), the United Kingdom (8/88, 9%), Canada (7/88, 8%), and Brazil (7/88, 8%), with an increasing number of publications from 2014 onward. Table 2 provides an overview and characteristics of the included articles.

Table 2. Characteristics of included articles.
Study number | Study | Population (N) | Research design: data collection method | Usability attributes
1 | Aebersold et al [24], 2018, United States | Nursing (N=69) | Mixed methods: questionnaire; task and knowledge performance^a | Ease of use; learning performance; satisfaction; usefulness
2 | Akl et al [25], 2008, United States | Resident (N=30) | Qualitative methods: focus groups; written qualitative reflections | Satisfaction
3 | Al-Rawi et al [26], 2015, United States | Dentist (N=61) | Posttest 1-group design: questionnaire | Ease of use; frequency of use; satisfaction; usefulness
4 | Albrecht et al [27], 2013, Germany | Medicine (N=6) | Posttest 1-group design: questionnaire^b | Satisfaction
5 | Alencar Neto et al [28], 2020, Brazil | Medicine (N=132) | Posttest 1-group design: questionnaire^b | Ease of use; learnability; satisfaction; usefulness
6 | Alepis and Virvou [29], 2010, Greece | Medicine (N=110) | Mixed methods: questionnaire; interviews | Ease of use; usefulness; user-friendliness
7 | Ameri et al [30], 2020, Iran | Pharmacy (N=241) | Posttest 1-group design: questionnaire^b | Context of use; efficiency; usefulness
8 | Balajelini and Ghezeljeh [31], 2018, Iran | Nursing (N=41) | Posttest 1-group design: questionnaire | Ease of use; frequency of use; navigation; satisfaction; simplicity; usefulness
9 | Barnes et al [32], 2015, United Kingdom | Medicine (N=42) | Randomized controlled trial: questionnaire; task and knowledge performance | Ease of use; effectiveness; learning performance; satisfaction
10 | Busanello et al [33], 2015, Brazil | Dentist (N=62) | Pre-post test, nonrandomized control group design: questionnaire^b | Learnability; learning performance; satisfaction
11 | Cabero-Almenara and Roig-Vila [34], 2019, Spain | Medicine (N=50) | Pre-post test, 1-group design: questionnaire^b | Learning performance; satisfaction
12 | Choi et al [35], 2015, South Korea | Nursing (N=5) | Think-aloud methods: interviews; data from app | Context of use; ease of use; learnability; satisfaction; usefulness
13 | Choi et al [36], 2018, South Korea | Nursing (N=75) | Pre-post test, nonrandomized control group design: questionnaire | Ease of use; learning performance; satisfaction; usefulness
14 | Choo et al [37], 2019, Singapore | Psychology (N=8) | Mixed methods: questionnaire^b; written qualitative reflections | Ease of use; learning performance; satisfaction; usefulness; user-friendliness
15 | Chreiman et al [38], 2017, United States | Medicine (N=30) | Posttest 1-group design: questionnaire; data from app | Context of use; ease of use; frequency of use; usefulness
16 | Colucci et al [39], 2015, United States | Medicine (N=115) | Posttest 1-group design: questionnaire | Effectiveness; efficiency; satisfaction; usefulness
17 | Davids et al [40], 2014, South Africa | Residents (N=82) | Randomized controlled trial: questionnaire^b; data from app | Effectiveness; efficiency; learnability; navigation; satisfaction; user-friendliness
18A | Demmans Epp et al [41], 2018, Canada | Nursing (N=60) | Pre-post test, nonrandomized control group design: questionnaire; observations | Ease of use; effectiveness; learnability; learning performance; navigation; satisfaction
18B | Demmans Epp et al [41], 2018, Canada | Nursing (N=85) | Pre-post test, nonrandomized control group design: questionnaire; observations | Ease of use; effectiveness; learnability; learning performance; navigation; satisfaction
19 | Devraj et al [42], 2021, United States | Pharmacy (N=89) | Posttest 1-group design: questionnaire; data from app | Ease of use; errors; frequency of use; learning performance; navigation; operational usability; satisfaction; usefulness
20 | Díaz-Fernández et al [43], 2016, Spain | Physiotherapy (N=110) | Posttest 1-group design: questionnaire | Comprehensibility; ease of use; usefulness
21 | Docking et al [44], 2018, United Kingdom | Paramedic (N=24) | Think-aloud methods: focus groups | Context of use; learnability; satisfaction; usefulness
22 | Dodson and Baker [45], 2020, United States | Nursing (N=23) | Qualitative methods: focus groups | Ease of use; operational usability; satisfaction; usefulness; user-friendliness
23 | Duarte Filho et al [46], 2014, Brazil | Medicine (N=10) | Posttest nonrandomized control group design: questionnaire | Ease of use; efficiency; satisfaction; usefulness
24 | Duggan et al [47], 2020, Canada | Medicine (N=80) | Posttest 1-group design: questionnaire; data from app | Ease of use; frequency of use; satisfaction; usefulness
25 | Fernandez-Lao et al [48], 2016, Spain | Physiotherapy (N=49) | Randomized controlled trial: questionnaire^b; task and knowledge performance | Learning performance; satisfaction
26 | Fralick et al [49], 2017, Canada | Medicine (N=62) | Pre-post test, nonrandomized control group design: questionnaire | Ease of use; frequency of use; learning performance; usefulness
27 | Ghafari et al [50], 2020, Iran | Nursing (N=8) | Posttest 1-group design: questionnaire | Ease of use; operational usability; satisfaction; usefulness
28 | Goldberg et al [51], 2014, United States | Medicine (N=18) | Posttest 1-group design: questionnaire; task and knowledge performance | Ease of use; effectiveness
29 | Gutiérrez-Puertas et al [52], 2021, Spain | Nursing (N=184) | Randomized controlled trial: questionnaire; task and knowledge performance | Learning performance; satisfaction
30 | Herbert et al [53], 2021, United States | Nursing (N=33) | Randomized controlled trial: questionnaire; task and knowledge performance | Ease of use; learning performance; navigation; operational usability; usefulness
31 | Hsu et al [54], 2019, Taiwan | Nursing (N=16) | Qualitative methods: interviews | Context of use; operational usability; satisfaction; usefulness
32 | Huang et al [55], 2010, Taiwan | Not clear (N=28) | Posttest 1-group design: questionnaire | Ease of use; satisfaction; usefulness
33 | Hughes and Kearney [56], 2017, United States | Occupational therapy (N=19) | Qualitative methods: focus groups | Efficiency; satisfaction
34 | Ismail et al [57], 2018, Malaysia | Health science (N=124) | Pre-post test, 1-group design: questionnaire | Ease of use; learning performance; satisfaction; user-friendliness
35 | Johnson et al [58], 2021, Norway | Occupational therapy, physiotherapy, and social education (N=15) | Qualitative methods: focus groups | Context of use; ease of use; operational usability
36A | Kang and Suh [59], 2018, South Korea | Nursing (N=92) | Pre-post test, nonrandomized control group design: questionnaire; data from app | Effectiveness; frequency of use; learning performance; satisfaction
36B | Kang and Suh [59], 2018, South Korea | Nursing (N=49) | Qualitative methods: focus groups | Effectiveness; frequency of use; learning performance; satisfaction
37 | Keegan et al [60], 2016, United States | Nursing (N=116) | Posttest nonrandomized control group design: questionnaire; task and knowledge performance | Learning performance; satisfaction; usefulness
38 | Kim-Berman et al [61], 2019, United States | Dentist (N=93) | Posttest 1-group design: questionnaire; task and knowledge performance | Context of use; ease of use; effectiveness; usefulness
39 | Kojima et al [62], 2011, Japan | Physiotherapy and occupational therapy (N=41) | Pre-post test, 1-group design: questionnaire | Ease of use; learning performance; satisfaction; usefulness
40 | Koulias et al [63], 2012, Australia | Medicine (N=171) | Posttest 1-group design: questionnaire | Ease of use; operational usability; satisfaction
41 | Kow et al [64], 2016, Singapore | Medicine (N=221) | Pre-post test, 1-group design: questionnaire | Learning performance; satisfaction
42 | Kurniawan and Witjaksono [65], 2018, Indonesia | Medicine (N=30) | Posttest 1-group design: questionnaire | Satisfaction; usefulness
43A | Lefroy et al [66], 2017, United Kingdom | Medicine (N=21) | Qualitative methods: focus groups; data from app | Context of use; frequency of use; satisfaction
43B | Lefroy et al [66], 2017, United Kingdom | Medicine (N=405) | Quantitative methods: data from app | Context of use; frequency of use; satisfaction
44 | Li et al [67], 2019, Taiwan | Health care (N=70) | Pre-post test, nonrandomized control group design: questionnaire^b | Ease of use; usefulness
45 | Lin and Lin [68], 2016, Taiwan | Nursing (N=36) | Pre-post test, nonrandomized control group design: questionnaire | Cognitive load; ease of use; learnability; learning performance; usefulness
46 | Lone et al [69], 2019, Ireland | Dentist (N=59) | Randomized controlled trial: questionnaire; task and knowledge performance | Ease of use; learnability; learning performance; operational usability; satisfaction
47A | Long et al [70], 2016, United States | Nursing (N=158) | Pre-post test, 1-group design: questionnaire; data from app | Ease of use; efficiency; learnability; learning performance; satisfaction
47B | Long et al [70], 2016, United States | Health science (N=159) | Randomized controlled trial: questionnaire; data from app | Ease of use; efficiency; learnability; learning performance; satisfaction
48 | Longmuir [71], 2014, United States | Medicine (N=56) | Posttest 1-group design: questionnaire; data from app | Efficiency; learnability; operational usability; satisfaction
49 | López et al [72], 2016, Spain | Medicine (N=67) | Posttest 1-group design: questionnaire^b | Context of use; ease of use; errors; satisfaction; usefulness
50 | Lozano-Lozano et al [73], 2020, Spain | Physiotherapy (N=110) | Randomized controlled trial: questionnaire; task and knowledge performance | Learning performance; satisfaction; usefulness
51 | Lucas et al [74], 2019, Australia | Pharmacy (N=39) | Pre-post test, 1-group design: questionnaire; task and knowledge performance | Satisfaction; usefulness
52 | Mathew et al [75], 2014, Canada | Medicine (N=5) | Think-aloud methods: questionnaire^b; interviews; task and knowledge performance | Learnability; satisfaction
53 | McClure [76], 2019, United States | Nursing (N=16) | Posttest 1-group design: questionnaire^b | Learnability; satisfaction; usefulness
54 | McDonald et al [77], 2018, Canada | Medicine (N=20) | Pre-post test, 1-group design: questionnaire; data from app | Effectiveness; satisfaction
55 | McLean et al [78], 2014, Australia | Medicine (N=58) | Mixed methods: questionnaire; focus groups; interviews | Satisfaction
56 | McMullan [79], 2018, United Kingdom | Health science (N=60) | Pre-post test, 1-group design: questionnaire | Learning performance; navigation; satisfaction; usefulness; user-friendliness
57 | Mendez-Lopez et al [80], 2021, Spain | Psychology (N=67) | Pre-post test, 1-group design: questionnaire; task and knowledge performance | Cognitive load; ease of use; learning performance; satisfaction; usefulness
58 | Meruvia-Pastor et al [81], 2016, Canada | Nursing (N=10) | Pre-post test, 1-group design: questionnaire; task and knowledge performance | Ease of use; learning performance; satisfaction; usefulness
59 | Mettiäinen [82], 2015, Finland | Nursing (N=121) | Mixed methods: questionnaire; focus groups | Ease of use; usefulness
60 | Milner et al [83], 2020, United States | Medicine and nursing (N=66) | Posttest 1-group design: questionnaire | Satisfaction; usefulness
61 | Mladenovic et al [84], 2021, Serbia | Dentist (N=56) | Posttest 1-group design: questionnaire | Context of use; ease of use; satisfaction; usefulness
62 | Morris and Maynard [85], 2010, United Kingdom | Physiotherapy and nursing (N=19) | Pre-post test, 1-group design: questionnaire | Context of use; ease of use; navigation; operational usability; usefulness
63A | Nabhani et al [86], 2020, United Kingdom | Pharmacy (N=56) | Posttest 1-group design: questionnaire | Ease of use; learnability; learning performance; satisfaction; usefulness
63B | Nabhani et al [86], 2020, United Kingdom | Pharmacy (N=152) | Posttest 1-group design: questionnaire | Ease of use; learnability; learning performance; satisfaction; usefulness
63C | Nabhani et al [86], 2020, United Kingdom | Pharmacy (N=33) | Posttest 1-group design: task and knowledge performance | Ease of use; learnability; learning performance; satisfaction; usefulness
64A | Noguera et al [87], 2013, Spain | Physiotherapy (N=84) | Posttest 1-group design: questionnaire | Learning performance; satisfaction; usefulness
64B | Noguera et al [87], 2013, Spain | Physiotherapy (N=76) | Randomized controlled trial: questionnaire | Learning performance; satisfaction; usefulness
65 | O’Connell et al [88], 2016, Ireland | Medicine, nursing, and pharmacy (N=89) | Randomized controlled trial: questionnaire^b | Ease of use; learning performance; operational usability; satisfaction; simplicity
66 | Oliveira et al [89], 2019, Brazil | Medicine (N=110) | Randomized controlled trial: questionnaire; task and knowledge performance | Frequency of use; learning performance; satisfaction
67 | Orjuela et al [90], 2015, Colombia | Medicine (N=22) | Posttest 1-group design: questionnaire | Ease of use; satisfaction
68 | Page et al [91], 2016, United States | Medicine (N=356) | Mixed methods: questionnaire; interviews | Context of use; efficiency; satisfaction
69 | Paradis et al [92], 2018, Canada | Medicine and nursing (N=108) | Posttest 1-group design: questionnaire^b | Ease of use; satisfaction; usefulness
70 | Pereira et al [93], 2017, Brazil | Medicine (N=20) | Posttest 1-group design: questionnaire^b | Ease of use; learnability; satisfaction; usefulness
71 | Pereira et al [94], 2019, Brazil | Nursing (N=60) | Posttest 1-group design: questionnaire | Ease of use; operational usability; satisfaction
72A | Pinto et al [95], 2008, Brazil | Biomedical informatics (N=5) | Qualitative methods: observations; task and knowledge performance | Efficiency; errors; learnability; learning performance; operational usability; satisfaction
72B | Pinto et al [95], 2008, Brazil | Medicine (N=not clear) | Posttest nonrandomized control group design: questionnaire | Efficiency; errors; learnability; learning performance; operational usability; satisfaction
73 | Quattromani et al [96], 2018, United States | Nursing (N=181) | Randomized controlled trial: questionnaire^b | Learnability; learning performance; satisfaction; usefulness
74 | Robertson and Fowler [97], 2017, United States | Medicine (N=18) | Qualitative methods: focus groups | Satisfaction
75A | Romero et al [98], 2021, Germany | Medicine (N=22) | Think-aloud methods: questionnaire; interviews; task and knowledge performance | Effectiveness; efficiency; errors; navigation; satisfaction
75B | Romero et al [98], 2021, Germany | Medicine (N=22) | Posttest 1-group design: questionnaire^b | Learnability; satisfaction
75C | Romero et al [98], 2021, Germany | Medicine (N=736) | Posttest 1-group design: questionnaire | Frequency of use; satisfaction
76 | Salem et al [99], 2020, Australia | Pharmacy (N=33) | Posttest 1-group design: questionnaire | Operational usability; satisfaction; usefulness
77 | San Martín-Rodríguez et al [100], 2020, Spain | Nursing (N=77) | Posttest 1-group design: questionnaire; task and knowledge performance | Learning performance; operational usability; satisfaction
78 | Schnepp and Rogers [101], 2017, United States | Not clear (N=72) | Think-aloud methods: questionnaire^b; interviews; task and knowledge performance | Learnability; satisfaction
79 | Smith et al [102], 2016, United Kingdom | Medicine and nursing (N=74) | Mixed methods: questionnaire; focus groups | Navigation; operational usability; satisfaction; user-friendliness
80 | Strandell-Laine et al [103], 2019, Finland | Nursing (N=52) | Mixed methods: questionnaire^b; written qualitative responses | Learnability; operational usability; satisfaction
81 | Strayer et al [104], 2010, United States | Medicine (N=122) | Mixed methods: questionnaire; focus groups | Context of use; learnability; learning performance; satisfaction; usefulness
82 | Taylor et al [105], 2010, United Kingdom | A total of 8 different health care disciplines (N=79) | Qualitative methods: focus groups; written qualitative reflections | Context of use; learnability
83 | Toh et al [106], 2014, Singapore | Pharmacy (N=31) | Posttest 1-group design: questionnaire | Ease of use; learnability; navigation; usefulness
84 | Tsopra et al [107], 2020, France | Medicine (N=57) | Mixed methods: questionnaire; focus groups | Ease of use; operational usability; satisfaction; usefulness
85 | Wu [108], 2014, Taiwan | Nursing (N=36) | Mixed methods: questionnaire; interviews | Cognitive load; effectiveness; satisfaction; usefulness
86 | Wyatt et al [109], 2012, United States | Nursing (N=12) | Qualitative methods: focus groups | Ease of use; efficiency; errors; learnability; memorability; navigation; satisfaction
87 | Yap [110], 2017, Singapore | Pharmacy (N=123) | Posttest 1-group design: questionnaire | Comprehensibility; learning performance; memorability; navigation; satisfaction; usefulness
88 | Zhang et al [111], 2015, Singapore | Medicine (N=185) | Mixed methods: questionnaire; focus groups | Usefulness

^a Performances measured, comparing paper and app results, quiz results, and exam results.

^b Reported use of validated questionnaires.

Usability Evaluation Methods

The usability evaluation methods found were either inquiry-based or based on user testing. The following inquiry methods were used: 1-group design (46/98, 47%), control group design (12/98, 12%), randomized controlled trials (12/98, 12%), mixed methods (12/98, 12%), and qualitative methods (11/98, 11%). Several studies that applied inquiry-based methods used more than one data collection method, with questionnaires being used most often (80/98, 82%), followed by task and knowledge performance testing (17/98, 17%), focus groups (15/98, 15%), collection of user data from the app (10/98, 10%), interviews (5/98, 5%), written qualitative reflections (4/98, 4%), and observations (3/98, 3%). Additional information can be found in the data extraction sheet (Multimedia Appendix 3). Figure 2 illustrates the frequency of the inquiry-based usability evaluation methods and data collection methods.

The only user testing methods found were think-aloud methods (5/98, 5%), and 4 (80%) of these studies applied more than one data collection method. The data collection methods used included interviews (4/98, 4%), questionnaires (3/98, 3%), task and knowledge performance (3/98, 3%), focus groups (1/98, 1%), and collection of user data from the app (1/98, 1%).

A total of 19 studies used a psychometrically tested usability questionnaire, including the SUS, the Technology Acceptance Model, the Technology Satisfaction Questionnaire, and the Technology Readiness Index. The SUS [112] was the most frequently used (9/98, 9%).
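
For readers unfamiliar with the SUS: it consists of 10 five-point items, and the standard scoring converts the responses into a 0-100 score. A minimal sketch of that scoring follows; it illustrates the published scoring rule [112] rather than any analysis performed in this review.

```python
def sus_score(responses: list[int]) -> float:
    """System Usability Scale: 10 items rated 1-5; returns a 0-100 score."""
    assert len(responses) == 10 and all(1 <= r <= 5 for r in responses)
    total = sum(
        (r - 1) if i % 2 == 0 else (5 - r)  # odd-numbered items sit at even indices
        for i, r in enumerate(responses)
    )
    return total * 2.5  # scale the 0-40 raw sum to 0-100

print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # -> 100.0 (best possible)
```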

Field testing was the most frequent type of usability experiment, accounting for 72% (71/98) of the usability experiments. A total of 22 (22%) studies performed laboratory testing, and 5 (5%) studies did not indicate the type of experiment performed. Multimedia Appendix 3 provides an overview of the type of experiment conducted in each study. The usability testing of the mobile apps took place in a classroom setting (41/98, 42%), in clinical placement (29/98, 30%), during simulation training (14/98, 14%), in other settings (7/98, 7%), or in an unspecified setting (5/98, 5%).

Figure 2. Inquiry usability evaluation methods and data collection methods.

Usability Attributes

A total of 17 usability attributes were identified among the included studies. The most frequently identified attributes were satisfaction, usefulness, ease of use, learning performance, and learnability; the least frequent were errors, cognitive load, comprehensibility, memorability, and simplicity. Table 3 provides an overview of the usability attributes identified in the included studies.

Table 3. Distribution of usability attributes (n=17) and affiliated reports (N=88).
Usability attribute | Distribution, n (%) | Reports (references)
Satisfaction | 74 (84) | [24-28,31-37,39-42,44-48,50,52,54-57,59,60,62-66,69-81,83,84,86-104,107-110]
Usefulness | 51 (58) | [24,26,28-31,35-39,42-47,49,50,53-55,60-62,65,67,68,72-74,76,79-87,92,93,96,99,104,106-108,110,111]
Ease of use | 45 (51) | [24,26,28,29,31,32,35-38,41-43,45-47,49-51,53,55,57,58,61-63,67-70,72,80-82,84-86,88,90,92-94,106,107,109]
Learning performance | 33 (38) | [24,32-34,36,37,41,42,48,49,52,53,57,59,60,62,64,68-70,73,79-81,86-89,95,96,100,104,110]
Learnability | 23 (26) | [28,33,35,40,41,44,68-71,75,76,86,93,95,96,98,101,103-106,109]
Operational usability | 19 (22) | [42,45,50,53,54,58,63,69,71,85,88,90,94,95,99-101,103,107]
Context of use | 14 (16) | [30,35,38,44,54,58,61,66,72,84,85,91,104,105]
Navigation | 12 (14) | [31,40-42,53,79,85,98,102,106,109,110]
Efficiency | 11 (13) | [30,39,40,46,56,70,71,91,95,98,109]
Effectiveness | 10 (11) | [32,39-41,51,59,61,77,98,108]
Frequency of use | 10 (11) | [26,31,38,42,47,49,59,66,89,98]
User-friendliness | 7 (8) | [29,37,40,45,57,79,102]
Errors | 5 (6) | [42,72,95,98,109]
Cognitive load | 3 (3) | [68,80,108]
Comprehensibility | 2 (2) | [43,110]
Memorability | 2 (2) | [109,110]
Simplicity | 2 (2) | [31,88]

Discussion

Principal Findings

This scoping review sought to identify the usability methods and attributes reported in usability studies of mobile apps for health care education. A total of 88 articles, reporting 98 studies, were included in this review. Our findings indicate a steady increase in publications from 2014, with studies published in 22 different countries. Field testing was used more frequently than laboratory testing. Furthermore, the usability evaluation methods applied were either inquiry-based or based on user testing. Most of the inquiry-based methods were experiments that used questionnaires as a data collection method, and all of the studies with user testing methods applied think-aloud methods. Satisfaction, usefulness, ease of use, learning performance, and learnability were the most frequently identified usability attributes.

Comparison With Prior Work

Usability Evaluation Methods

The studies included in this scoping review mainly applied inquiry-based methods, primarily the collection of self-reported data through questionnaires. This is congruent with the results of Weichbroth [10], in which controlled observations and surveys were the most frequently applied methods. Asking users to respond to a usability questionnaire may provide relevant and valuable information. Among the 83 studies that used questionnaires in our review, only 19 (23%) used a psychometrically tested usability questionnaire; of these, the SUS questionnaire [112] was used most frequently. In line with the review on usability questionnaires [12], we recommend using a psychometrically tested usability questionnaire to support the advancement of usability science. As questionnaires address only certain usability attributes, mainly learnability, efficiency, and satisfaction [12], it would be helpful to also include additional methods, such as interviews or mixed methods, and to incorporate additional open-ended questions when using questionnaires.

Furthermore, applying usability evaluation methods other than inquiry methods, such as user testing and inspection methods [10], could be beneficial and lead to more objective measures of app usability. Subjective data are typically collected via self-reported questionnaires, whereas objective data can be derived from, for example, task completion rates [40]. In one of the included studies, the participants reported that the usability of the app was satisfactory on subjective measures, yet they did not actually use the app [75]. Another study reported a lack of coherence between subjective and objective data [40]. These results indicate the importance of not relying solely on subjective measures of usability. Therefore, we suggest that various usability evaluation methods, including subjective and objective usability measures, be used in future usability studies.

Our review found that most of the included studies in health care education (71/98, 72%) performed field testing, whereas previous literature suggests that usability experiments in other fields are more often conducted in a laboratory [1,113]. For instance, Kumar and Mohite [1] found that 73% of the studies included in their review of mobile learning apps used laboratory testing. Mobile apps in health care education have been developed to support students’ learning on campus and during clinical placement, in various settings and on the move. Accordingly, it is especially important to test how the apps are perceived in these specific environments [5]; hence, field testing is required. However, many usability issues can be discovered in a laboratory. Particularly in the early phases of app development, testing with several participants in a laboratory may make it more feasible to test and improve the app [8]. Laboratory testing can provide rapid feedback on usability issues, which can then be addressed before the app is tested in a real-world environment. Therefore, it may be beneficial to conduct small-scale laboratory testing before field testing.

Usability Attributes

Previous systematic reviews of mobile apps in general identified satisfaction, efficiency, and effectiveness as the most common usability attributes [5,10]. In this review, efficiency and effectiveness were explored to a limited extent, whereas satisfaction, usefulness, and ease of use were the most frequently identified usability attributes. Our results coincide with those from a previous review on the usability of mobile learning apps [1], possibly because satisfaction, usefulness, and ease of use are usability attributes of particular importance when examining mobile learning apps.

Learning performance was assessed frequently in the included studies. To ensure that apps are valuable in a given learning context, it is also relevant to test additional usability attributes, such as cognitive load [9]. However, few studies included in our review examined cognitive load [68,80,108]. Mobile apps are often used in environments with multiple distractions, which may increase cognitive load [5] and thereby affect learning performance. Testing both learning performance and users’ cognitive load may therefore improve the understanding of an app’s usability.

We found that several of the included studies did not use terminology from the usability literature to describe which usability attributes they were testing. For instance, studies that tested satisfaction often used phrases such as “likes and dislikes” and “recommend use to others” without specifying that they were testing the usability attribute satisfaction. Specifying which usability attributes are investigated is important when performing a usability study of mobile apps, as it improves transparency and enables comparison between studies. In addition, evaluating a wider range of usability attributes may help researchers broaden their perspective on an app’s usability problems and ensure quicker improvement of the app. Defining and presenting the different usability attributes in a reporting guideline could assist in deciding on and reporting relevant attributes. Such a reporting guideline would benefit researchers planning and conducting usability studies, a point also supported by the systematic review by Kumar and Mohite [1].

Future Directions

Combining different usability evaluation methods that incorporate both subjective and objective usability measures can add important perspectives when developing apps. In future studies, it would be advantageous to use psychometrically tested usability questionnaires to support the advancement of usability science. In addition, developers of mobile apps should determine which usability attributes are relevant before conducting usability studies (eg, by registering a protocol). Incorporating these perspectives into the development of a reporting guideline would benefit future usability studies.

Strengths and Limitations

First, the search strategy was designed in collaboration with a research librarian and peer reviewed by another research librarian and included 10 databases and other sources. This broad search strategy resulted in a high number of references, which may be associated with a lower level of precision. To ensure the retrieval of all potentially pertinent articles, two of the authors independently screened titles and abstracts; studies deemed eligible by one of the authors were included for full-text screening.

Second, the full-text evaluation was challenging because the term usability has multiple meanings that do not always relate to usability testing. For instance, the term was sometimes used when testing students’ experience of a commercially developed app without any connection to the app’s further development. In addition, many studies did not explicitly state that a mobile app was being investigated, which also made it difficult to decide whether they satisfied the eligibility criteria. Nevertheless, having 2 reviewers read the full-text articles independently and resolve disagreements through consensus-based discussions ensured the inclusion of relevant articles.

Conclusions

This scoping review was performed to provide an overview of the usability methods used and the attributes identified in usability studies of mobile apps in health care education. Experimental designs were commonly used to evaluate usability, and most studies used field testing. Questionnaires were frequently used for data collection, although few studies used psychometrically tested questionnaires. The usability attributes identified most often were satisfaction, usefulness, and ease of use. The results indicate that combining different usability evaluation methods, incorporating both subjective and objective usability measures, and specifying which usability attributes to test seem advantageous. The results can support the planning and conduct of future usability studies for the advancement of mobile learning apps in health care education.

Acknowledgments

The research library at Western Norway University of Applied Sciences provided valuable assistance in developing and performing the search strategy for this scoping review. Gunhild Austrheim, a research librarian, provided substantial guidance in the planning and performance of the database searches. Marianne Nesbjørg Tvedt peer reviewed the search string. Malik Beglerovic also assisted with database searches. The authors would also like to thank Ane Kjellaug Brekke Gjerland for assessing the data extraction sheet.

Authors' Contributions

SGJ, LL, DC, and NRO proposed the idea for this review. SGJ, DC, and NRO contributed to the screening of titles and abstracts, and SGJ and TP decided on eligibility based on full-text examinations. SGJ extracted data from the included studies. SGJ, TP, LL, DC, and NRO contributed to the drafts of the manuscript, and all authors approved the final version for publication.

Conflicts of Interest

None declared.

Multimedia Appendix 1

PRISMA-ScR (Preferred Reporting Items for Systematic Reviews and Meta-Analyses extension for Scoping Reviews) checklist for reporting scoping reviews.

DOCX File , 107 KB

Multimedia Appendix 2

The search strategies for the 10 databases.

DOCX File , 84 KB

Multimedia Appendix 3

Data extraction sheet.

XLSX File (Microsoft Excel File), 156 KB

  1. Kumar BA, Mohite P. Usability of mobile learning applications: a systematic literature review. J Comput Educ 2018;5(1):1-17. [CrossRef]
  2. Asarbakhsh M, Sandars J. E-learning: the essential usability perspective. Clin Teach 2013 Feb;10(1):47-50. [CrossRef] [Medline]
  3. Lall P, Rees R, Law GC, Dunleavy G, Cotič Ž, Car J. Influences on the implementation of mobile learning for medical and nursing education: qualitative systematic review by the digital health education collaboration. J Med Internet Res 2019 Feb 28;21(2):e12895 [FREE Full text] [CrossRef] [Medline]
  4. Sophonhiranrak S, Promsaka Na Sakonnak N. Limitations of mobile learning: a systematic review. In: E-Learn: World Conference on E-Learning in Corporate, Government, Healthcare, and Higher Education. 2017 Presented at: E-Learn '17; October 17, 2017; Vancouver, British Columbia, Canada p. 965-971.
  5. Harrison R, Flood D, Duce D. Usability of mobile applications: literature review and rationale for a new usability model. J Interact Sci 2013 May 7;1(1):1-16. [CrossRef]
  6. Paz F, Pow-Sang JA. A systematic mapping review of usability evaluation methods for software development process. Int J Softw Eng Appl 2016 Jan 31;10(1):165-178. [CrossRef]
  7. ISO 9241-11:2018. Ergonomics of human-system interaction — part 11: usability: definitions and concepts. Geneva, Switzerland: International Organization for Standardization; 2018.
  8. Rubin J, Chisnell D, Spool J. Handbook of Usability Testing: How to Plan, Design, and Conduct Effective Tests. 2nd edition. Hoboken, NJ, USA: Wiley; May 2008.
  9. Kukulska-Hulme A. Mobile usability in educational contexts: what have we learnt? Int Rev Res Open Dis Learn 2007 Jun 15;8(2):1-16. [CrossRef]
  10. Weichbroth P. Usability of mobile applications: a systematic literature study. IEEE Access 2020 Mar 19;8:55563-55577. [CrossRef]
  11. Nayebi F, Desharnais JM, Abran A. The state of the art of mobile application usability evaluation. In: Proceedings of the 25th IEEE Canadian Conference on Electrical and Computer Engineering. 2012 Presented at: CCECE '12; April 29-May 02, 2012; Montreal, Canada p. 1-4. [CrossRef]
  12. Sousa VE, Dunn Lopez K. Towards usable e-health. A systematic review of usability questionnaires. Appl Clin Inform 2017 May 10;8(2):470-490 [FREE Full text] [CrossRef] [Medline]
  13. Sure M. Questionnaires for Usability: A Systematic Literature Review. Linköping, Sweden: Linköping University; 2014.
  14. Zhang D, Adipat B. Challenges, methodologies, and issues in the usability testing of mobile applications. Int J Hum Comput Interact 2005 Jul;18(3):293-308. [CrossRef]
  15. Ismail NA, Ahmad F, Kamaruddin NA, Ibrahim R. A review on usability issues in mobile applications. J Mob Comput Appl 2016;3(3):47-52. [CrossRef]
  16. Sandars J. The importance of usability testing to allow e-learning to reach its potential for medical education. Educ Prim Care 2010 Jan;21(1):6-8. [CrossRef] [Medline]
  17. Arksey H, O'Malley L. Scoping studies: towards a methodological framework. Int J Soc Res Methodol 2005 Feb;8(1):19-32. [CrossRef]
  18. Levac D, Colquhoun H, O'Brien KK. Scoping studies: advancing the methodology. Implement Sci 2010 Sep 20;5:69 [FREE Full text] [CrossRef] [Medline]
  19. Khalil H, Peters M, Godfrey CM, McInerney P, Soares CB, Parker D. An evidence-based approach to scoping reviews. Worldviews Evid Based Nurs 2016 Apr;13(2):118-123. [CrossRef] [Medline]
  20. Johnson SG, Potrebny T, Larun L, Ciliska D, Olsen NR. Usability methods and attributes reported in usability studies of mobile apps for health care education: protocol for a scoping review. JMIR Res Protoc 2020 Aug 04;9(8):e19072 [FREE Full text] [CrossRef] [Medline]
  21. Tricco AC, Lillie E, Zarin W, O'Brien KK, Colquhoun H, Levac D, et al. PRISMA extension for scoping reviews (PRISMA-ScR): checklist and explanation. Ann Intern Med 2018 Oct 02;169(7):467-473 [FREE Full text] [CrossRef] [Medline]
  22. Ouzzani M, Hammady H, Fedorowicz Z, Elmagarmid A. Rayyan-a web and mobile app for systematic reviews. Syst Rev 2016 Dec 05;5(1):210 [FREE Full text] [CrossRef] [Medline]
  23. The EndNote Team. EndNote. X9.2. Clarivate. Philadelphia, PA; 2013.   URL: https://endnote.com/ [accessed 2022-02-15]
  24. Aebersold M, Voepel-Lewis T, Cherara L, Weber M, Khouri C, Levine R, et al. Interactive anatomy-augmented virtual simulation training. Clin Simul Nurs 2018 Feb;15:34-41 [FREE Full text] [CrossRef] [Medline]
  25. Akl EA, Mustafa R, Slomka T, Alawneh A, Vedavalli A, Schünemann HJ. An educational game for teaching clinical practice guidelines to Internal Medicine residents: development, feasibility and acceptability. BMC Med Educ 2008 Nov 18;8:50 [FREE Full text] [CrossRef] [Medline]
  26. Al-Rawi W, Easterling L, Edwards PC. Development of a mobile device optimized cross platform-compatible oral pathology and radiology spaced repetition system for dental education. J Dent Educ 2015 Apr;79(4):439-447. [Medline]
  27. Albrecht UV, Noll C, von Jan U. Explore and experience: mobile augmented reality for medical training. Stud Health Technol Inform 2013;192:382-386. [Medline]
  28. Alencar Neto JB, Araújo RL, Barroso Filho EM, Silva PG, Garrido RJ, Rocha PH, et al. Development and validation of a smartphone application for orthopedic residency education. Rev Bras Educ Med 2020 May;44(4):561-567. [CrossRef]
  29. Alepis E, Virvou M. Evaluation of mobile authoring and tutoring in medical issues. US China Educ Rev 2010 Jul;7(7):84-92 [FREE Full text]
  30. Ameri A, Khajouei R, Ameri A, Jahani Y. Acceptance of a mobile-based educational application (LabSafety) by pharmacy students: an application of the UTAUT2 model. Educ Inf Technol 2019 Aug 1;25(1):419-435. [CrossRef]
  31. Taziki Balajelini F, Najafi Ghezeljeh T. Prehospital trauma management: evaluation of a new designed smartphone application. J Client Centered Nurs Care 2018 Nov 30;4(4):193-202. [CrossRef]
  32. Barnes J, Duffy A, Hamnett N, McPhail J, Seaton C, Shokrollahi K, et al. The Mersey Burns App: evolving a model of validation. Emerg Med J 2015 Aug;32(8):637-641. [CrossRef] [Medline]
  33. Busanello FH, da Silveira PF, Liedke GS, Arús NA, Vizzotto MB, Silveira HE, et al. Evaluation of a digital learning object (DLO) to support the learning process in radiographic dental diagnosis. Eur J Dent Educ 2015 Nov;19(4):222-228. [CrossRef] [Medline]
  34. Cabero-Almenara J, Roig-Vila R. The motivation of technological scenarios in augmented reality (AR): results of different experiments. Appl Sci 2019 Jul 19;9(14):2907-2916. [CrossRef]
  35. Choi M, Lee HS, Park JH. Usability of academic electronic medical record application for nursing students' clinical practicum. Healthc Inform Res 2015 Jul;21(3):191-195 [FREE Full text] [CrossRef] [Medline]
  36. Choi M, Lee H, Park JH. Effects of using mobile device-based academic electronic medical records for clinical practicum by undergraduate nursing students: a quasi-experimental study. Nurse Educ Today 2018 Feb;61:112-119. [CrossRef] [Medline]
  37. Choo CC, Devakaran B, Chew PK, Zhang MW. Smartphone application in postgraduate clinical psychology training: trainees' perspectives. Int J Environ Res Public Health 2019 Oct 30;16(21):4206 [FREE Full text] [CrossRef] [Medline]
  38. Chreiman KM, Prakash PS, Martin ND, Kim PK, Mehta S, McGinnis K, et al. Staying connected: service-specific orientation can be successfully achieved using a mobile application for onboarding care providers. Trauma Surg Acute Care Open 2017;2(1):e000085 [FREE Full text] [CrossRef] [Medline]
  39. Colucci PG, Kostandy P, Shrauner WR, Arleo E, Fuortes M, Griffin AS, et al. Development and utilization of a Web-based application as a robust radiology teaching tool (radstax) for medical student anatomy teaching. Acad Radiol 2015 Feb;22(2):247-255 [FREE Full text] [CrossRef] [Medline]
  40. Davids MR, Chikte UM, Halperin ML. Effect of improving the usability of an e-learning resource: a randomized trial. Adv Physiol Educ 2014 Jun;38(2):155-160 [FREE Full text] [CrossRef] [Medline]
  41. Demmans Epp C, Horne J, Scolieri BB, Kane I, Bowser AS. PsychOut! a mobile app to support mental status assessment training. In: Proceedings of the 13th European Conference on Technology Enhanced Learning. 2018 Presented at: EC-TEL '18; September 3-5, 2018; Leeds, UK p. 216-230. [CrossRef]
  42. Devraj R, Colyott L, Cain J. Design and evaluation of a mobile serious game application to supplement instruction. Curr Pharm Teach Learn 2021 Sep;13(9):1228-1235. [CrossRef] [Medline]
  43. Díaz-Fernández Á, Jiménez-Delgado JJ, Osuna-Pérez MC, Rueda-Ruiz A, Paulano-Godino F. Development and implementation of a mobile application to improve university teaching of electrotherapy. In: Proceedings of the 2016 International Conference on Interactive Mobile Communication, Technologies and Learning. 2016 Presented at: IMCL '16; October 17-19, 2016; San Diego, CA, USA p. 33-37. [CrossRef]
  44. Docking RE, Lane M, Schofield PA. Usability testing of the iPhone app to improve pain assessment for older adults with cognitive impairment (prehospital setting): a qualitative study. Pain Med 2018 Jun 01;19(6):1121-1131. [CrossRef] [Medline]
  45. Dodson CH, Baker E. Focus group testing of a mobile app for pharmacogenetic-guided dosing. J Am Assoc Nurse Pract 2020 Feb 04;33(3):205-210. [CrossRef] [Medline]
  46. Duarte Filho NF, Gonçalves CF, Pizetta DC. Experimental analysis of the efficiency of application E-Mages in medical imaging visualization. In: Proceedings of the 9th Iberian Conference on Information Systems and Technologies. 2014 Presented at: CISTI '14; June 18-21, 2014; Barcelona, Spain p. 1-6. [CrossRef]
  47. Duggan N, Curran VR, Fairbridge NA, Deacon D, Coombs H, Stringer K, et al. Using mobile technology in assessment of entrustable professional activities in undergraduate medical education. Perspect Med Educ 2021 Dec;10(6):373-377 [FREE Full text] [CrossRef] [Medline]
  48. Fernández-Lao C, Cantarero-Villanueva I, Galiano-Castillo N, Caro-Morán E, Díaz-Rodríguez L, Arroyo-Morales M. The effectiveness of a mobile application for the development of palpation and ultrasound imaging skills to supplement the traditional learning of physiotherapy students. BMC Med Educ 2016 Oct 19;16(1):274 [FREE Full text] [CrossRef] [Medline]
  49. Fralick M, Haj R, Hirpara D, Wong K, Muller M, Matukas L, et al. Can a smartphone app improve medical trainees' knowledge of antibiotics? Int J Med Educ 2017 Nov 30;8:416-420 [FREE Full text] [CrossRef] [Medline]
  50. Ghafari S, Yazdannik A, Mohamadirizi S. Education promotion based on "mobile technology" in the Critical Care Nursing Department: four-phase intervention. J Educ Health Promot 2020;9:325 [FREE Full text] [CrossRef] [Medline]
  51. Goldberg H, Klaff J, Spjut A, Milner S. A mobile app for measuring the surface area of a burn in three dimensions: comparison to the Lund and Browder assessment. J Burn Care Res 2014;35(6):480-483. [CrossRef] [Medline]
  52. Gutiérrez-Puertas L, García-Viola A, Márquez-Hernández VV, Garrido-Molina JM, Granados-Gámez G, Aguilera-Manrique G. Guess it (SVUAL): an app designed to help nursing students acquire and retain knowledge about basic and advanced life support techniques. Nurse Educ Pract 2021 Jan;50:102961. [CrossRef] [Medline]
  53. Herbert VM, Perry RJ, LeBlanc CA, Haase KN, Corey RR, Giudice NA, et al. Developing a smartphone app with augmented reality to support virtual learning of nursing students on heart failure. Clin Simul Nurs 2021 May;54:77-85. [CrossRef]
  54. Hsu LL, Hsiang HC, Tseng YH, Huang SY, Hsieh SI. Nursing students' experiences of using a smart phone application for a physical assessment course: a qualitative study. Jpn J Nurs Sci 2019 Apr;16(2):115-124. [CrossRef] [Medline]
  55. Huang HM, Chen YL, Chen KY. Investigation of three-dimensional human anatomy applied in mobile learning. In: Proceedings of the 2010 International Computer Symposium. 2010 Presented at: ICS '10; December 16-18, 2010; Tainan, Taiwan p. 358-363. [CrossRef]
  56. Hughes JK, Kearney P. Impact of an iDevice application on student learning in an occupational therapy kinesiology course. Mhealth 2017;3:43 [FREE Full text] [CrossRef] [Medline]
  57. Ismail SN, Rangga JU, Rasdi I, Rahman UR, Samah MA. Mobile apps application to improve safety and health knowledge, attitude and practice among university students. Malays J Med Health Sci 2018;14:47-55.
  58. Johnson SG, Titlestad KB, Larun L, Ciliska D, Olsen NR. Experiences with using a mobile application for learning evidence-based practice in health and social care education: an interpretive descriptive study. PLoS One 2021;16(7):e0254272 [FREE Full text] [CrossRef] [Medline]
  59. Kang J, Suh EE. Development and evaluation of "chronic illness care smartphone apps" on nursing students' knowledge, self-efficacy, and learning experience. Comput Inform Nurs 2018 Nov;36(11):550-559. [CrossRef] [Medline]
  60. Keegan RD, Oliver MC, Stanfill TJ, Stevens KV, Brown GR, Ebinger M, et al. Use of a mobile device simulation as a preclass active learning exercise. J Nurs Educ 2016 Jan;55(1):56-59. [CrossRef] [Medline]
  61. Kim-Berman H, Karl E, Sherbel J, Sytek L, Ramaswamy V. Validity and user experience in an augmented reality virtual tooth identification test. J Dent Educ 2019 Nov;83(11):1345-1352. [CrossRef] [Medline]
  62. Kojima S, Mitani M, Ishikawa A. Development of an E-learning resource on mobile devices for kinesiology: a pilot study. J Phys Ther Sci 2011;23(4):667-672. [CrossRef]
  63. Scott KM, Kitching S, Burn D, Koulias M, Campbell D, Phelps M. "Wherever, whenever" learning in medicine: interactive mobile case-based project. In: Proceedings of the 27th Annual Conference of the Australian Society for Computers in Tertiary Education. 2010 Presented at: Ascilite '10; December 5-8, 2010; New South Wales, Australia p. 888-890.
  64. Kow AW, Ang BL, Chong CS, Tan WB, Menon KR. Innovative patient safety curriculum using iPAD game (PASSED) improved patient safety concepts in undergraduate medical students. World J Surg 2016 Nov;40(11):2571-2580. [CrossRef] [Medline]
  65. Kurniawan MH, Suharjito, Diana, Witjaksono G. Human anatomy learning systems using augmented reality on mobile application. Procedia Comput Sci 2018;135:80-88. [CrossRef]
  66. Lefroy J, Roberts N, Molyneux A, Bartlett M, Gay S, McKinley R. Utility of an app-based system to improve feedback following workplace-based assessment. Int J Med Educ 2017 May 31;8:207-216 [FREE Full text] [CrossRef] [Medline]
  67. Li YJ, Lee LH, Cheng YT, Ou YY. Design and evaluation of a healthcare management terminology mobile learning application. In: Proceedings of the 2019 IEEE International Conference on Healthcare Informatics. 2019 Presented at: ICHI '19; June 10-13, 2019; Xi'an, China p. 1-9. [CrossRef]
  68. Lin YT, Lin YC. Effects of mental process integrated nursing training using mobile device on students’ cognitive load, learning attitudes, acceptance, and achievements. Comput Human Behav 2016 Feb;55(B):1213-1221 [FREE Full text] [CrossRef]
  69. Lone M, Vagg T, Theocharopoulos A, Cryan JF, Mckenna JP, Downer EJ, et al. Development and assessment of a three-dimensional tooth morphology quiz for dental students. Anat Sci Educ 2019 May;12(3):284-299. [CrossRef] [Medline]
  70. Long JD, Gannaway P, Ford C, Doumit R, Zeeni N, Sukkarieh-Haraty O, et al. Effectiveness of a technology-based intervention to teach evidence-based practice: the EBR tool. Worldviews Evid Based Nurs 2016 Feb;13(1):59-65. [CrossRef] [Medline]
  71. Longmuir KJ. Interactive computer-assisted instruction in acid-base physiology for mobile computer platforms. Adv Physiol Educ 2014 Mar;38(1):34-41 [FREE Full text] [CrossRef] [Medline]
  72. López MM, López MM, de la Torre Díez I, Jimeno JC, López-Coronado M. A mobile decision support system for red eye diseases diagnosis: experience with medical students. J Med Syst 2016 Jun;40(6):151. [CrossRef] [Medline]
  73. Lozano-Lozano M, Galiano-Castillo N, Fernández-Lao C, Postigo-Martin P, Álvarez-Salvago F, Arroyo-Morales M, et al. The Ecofisio mobile app for assessment and diagnosis using ultrasound imaging for undergraduate health science students: multicenter randomized controlled trial. J Med Internet Res 2020 Mar 10;22(3):e16258 [FREE Full text] [CrossRef] [Medline]
  74. Lucas C, Gibson A, Shum SB. Pharmacy students' utilization of an online tool for immediate formative feedback on reflective writing tasks. Am J Pharm Educ 2019 Aug;83(6):6800 [FREE Full text] [CrossRef] [Medline]
  75. Mathew D, Archer N, McKibbon KA, Dillenburg R. Integrating clinical practice guidelines within a mobile tablet app. In: Proceedings of the 2014 International Conference on E-Commerce. 2014 Presented at: EC '14; July 17-19, 2014; Lisbon, Portugal p. 357-361.
  76. McClure KC. Usability of a mobile website focused on preoperative and intraoperative anesthetic considerations for the cardiac patient with valvular dysfunction [dissertation]. Baton Rouge, LA, USA: Franciscan Missionaries of Our Lady University; ProQuest Dissertations Publishing; Dec 2019.
  77. McDonald H, Gawad N, Raîche I, Rubens FD. Transition to residency: the successful development and implementation of a nonclinical elective in perioperative management. J Surg Educ 2018;75(3):628-638. [CrossRef] [Medline]
  78. McLean M, Brazil V, Johnson P. How we "breathed life" into problem-based learning cases using a mobile application. Med Teach 2014 Oct;36(10):849-852. [CrossRef] [Medline]
  79. McMullan M. Evaluation of a medication calculation mobile app using a cognitive load instructional design. Int J Med Inform 2018 Oct;118:72-77. [CrossRef] [Medline]
  80. Mendez-Lopez M, Juan MC, Molla R, Fidalgo C. Evaluation of an augmented reality application for learning neuroanatomy in psychology. Anat Sci Educ 2022 May;15(3):535-551. [CrossRef] [Medline]
  81. Meruvia-Pastor O, Patra P, Andres K, Twomey C, Peña-Castillo L. OMARC: an online multimedia application for training health care providers in the assessment of respiratory conditions. Int J Med Inform 2016 May;89:15-24. [CrossRef] [Medline]
  82. Mettiäinen S. Electronic assessment and feedback tool in supervision of nursing students during clinical training. Electron J E Learn 2015;13(1):42-55.
  83. Milner KA, McCloud R, Cullen J. The evidence appraisal game: an innovative strategy for teaching step 3 in evidence-based practice. Worldviews Evid Based Nurs 2020 Apr;17(2):173-175. [CrossRef] [Medline]
  84. Mladenovic R, Davidovic B, Tusek I, Trickovic-Janjic O, Mladenovic K. The effect of a mobile application for learning about traumatic dental injuries during the COVID-19 pandemic. Srp Arh Celok Lek 2021;149(3-4):202-207. [CrossRef]
  85. Morris J, Maynard V. Pilot study to test the use of a mobile device in the clinical setting to access evidence-based practice resources. Worldviews Evid Based Nurs 2010 Dec;7(4):205-213. [CrossRef] [Medline]
  86. Nabhani S, Harrap N, Ishtiaq S, Ling V, Dudzinski M, Greenhill D, et al. Development and evaluation of an educational game to support pharmacy students. Curr Pharm Teach Learn 2020 Jul;12(7):786-803. [CrossRef] [Medline]
  87. Noguera JM, Jiménez JJ, Osuna-Pérez MC. Development and evaluation of a 3D mobile application for learning manual therapy in the physiotherapy laboratory. Comput Educ 2013 Nov;69(1):96-108. [CrossRef]
  88. O'Connell E, Pegler J, Lehane E, Livingstone V, McCarthy N, Sahm LJ, et al. Near field communications technology and the potential to reduce medication errors through multidisciplinary application. Mhealth 2016;2:29 [FREE Full text] [CrossRef] [Medline]
  89. Oliveira EY, Crosewski NI, Silva AL, Ribeiro CT, de Oliveira CM, Fogaça RT, et al. Profile of educational technology use by medical students and evaluation of a new mobile application designed for the study of human physiology. J Med Syst 2019 Aug 27;43(10):313. [CrossRef] [Medline]
  90. Orjuela MA, Uribe-Quevedo A, Jaimes N, Perez-Gutierrez B. External automatic defibrillator game-based learning app. In: Proceedings of the 2015 IEEE Games Entertainment Media Conference. 2015 Presented at: GEM '15; October 14-16, 2015; Toronto, Canada p. 1-4. [CrossRef]
  91. Page CP, Reid A, Coe CL, Carlough M, Rosenbaum D, Beste J, et al. Learnings from the pilot implementation of mobile medical milestones application. J Grad Med Educ 2016 Oct;8(4):569-575 [FREE Full text] [CrossRef] [Medline]
  92. Paradis M, Stiell I, Atkinson KM, Guerinet J, Sequeira Y, Salter L, et al. Acceptability of a mobile clinical decision tool among emergency department clinicians: development and evaluation of the Ottawa rules app. JMIR Mhealth Uhealth 2018 Jun 11;6(6):e10263 [FREE Full text] [CrossRef] [Medline]
  93. Victor Soares Pereira R, Kubrusly M, Marçal E. Development, use, and evaluation of a mobile application for medical education: a case study in anesthesiology [Article in Portuguese]. RENOTE 2017 Jul 28;15(1):75104. [CrossRef]
  94. Pereira FG, Rocha D, Melo GA, Jaques RM, Formiga LM. Building and validating a digital application for the teaching of surgical instrumentation. Cogitare Enferm 2019 Mar 11;24:e58334. [CrossRef]
  95. Pinto VC, da Costa TM, Naveira MC, Sigulem D, Schor P, Pisa IT. MDFluxo: ophthalmology education with a PDA - efficacy and usability evaluation. In: Proceedings of the 1st International Conference on Health Informatics. 2008 Presented at: HEALTHINF '08; January 28-31, 2008; Madeira, Portugal p. 227-230. [CrossRef]
  96. Quattromani E, Hassler M, Rogers N, Fitzgerald J, Buchanan P. Smart pump app for infusion pump training. Clin Simul Nurs 2018 Apr 1;17:28-37. [CrossRef]
  97. Robertson AC, Fowler LC. Medical student perceptions of learner-initiated feedback using a mobile Web application. J Med Educ Curric Dev 2017;4:2382120517746384 [FREE Full text] [CrossRef] [Medline]
  98. Roa Romero Y, Tame H, Holzhausen Y, Petzold M, Wyszynski JV, Peters H, et al. Design and usability testing of an in-house developed performance feedback tool for medical students. BMC Med Educ 2021 Jun 23;21(1):354 [FREE Full text] [CrossRef] [Medline]
  99. Salem S, Cooper J, Schneider J, Croft H, Munro I. Student acceptance of using augmented reality applications for learning in pharmacy: a pilot study. Pharmacy (Basel) 2020 Jul 21;8(3):122 [FREE Full text] [CrossRef] [Medline]
  100. San Martín-Rodríguez L, Escalada-Hernández P, Soto-Ruiz N. A themed game to learn about nursing theories and models: a descriptive study. Nurse Educ Pract 2020 Nov;49:102905. [CrossRef] [Medline]
  101. Schnepp JC, Rogers CB. Evaluating the acceptability and usability of EASEL: a mobile application that supports guided reflection for experiential learning activities. J Inf Technol Educ Innov Pract 2017;16:195-214. [CrossRef]
  102. Smith N, Rapley T, Jandial S, English C, Davies B, Wyllie R, et al. Paediatric musculoskeletal matters (pmm)--collaborative development of an online evidence based interactive learning tool and information resource for education in paediatric musculoskeletal medicine. Pediatr Rheumatol Online J 2016 Jan 05;14(1):1 [FREE Full text] [CrossRef] [Medline]
  103. Strandell-Laine C, Leino-Kilpi H, Löyttyniemi E, Salminen L, Stolt M, Suomi R, et al. A process evaluation of a mobile cooperation intervention: a mixed methods study. Nurse Educ Today 2019 Sep;80:1-8. [CrossRef] [Medline]
  104. Strayer SM, Pelletier SL, Martindale JR, Rais S, Powell J, Schorling JB. A PDA-based counseling tool for improving medical student smoking cessation counseling. Fam Med 2010 May;42(5):350-357 [FREE Full text] [Medline]
  105. Taylor JD, Dearnley CA, Laxton JC, Coates CA, Treasure‐Jones T, Campbell R, et al. Developing a mobile learning solution for health and social care practice. Dist Educ 2010 Jul 30;31(2):175-192. [CrossRef]
  106. Yap KY, Toh TW, Chui WK. Development of a virtual patient record mobile app for pharmacy practice education. Arch Pharma Pract 2014;5(2):66-71. [CrossRef]
  107. Tsopra R, Courtine M, Sedki K, Eap D, Cabal M, Cohen S, et al. AntibioGame®: a serious game for teaching medical students about antibiotic use. Int J Med Inform 2020 Apr;136:104074 [FREE Full text] [CrossRef] [Medline]
  108. Wu TT. The use of a mobile assistant learning system for health education based on project-based learning. Comput Inform Nurs 2014 Oct;32(10):497-503. [CrossRef] [Medline]
  109. Wyatt TH, Li X, Indranoi C, Bell M. Developing iCare v.1.0: an academic electronic health record. Comput Inform Nurs 2012 Jun;30(6):321-329. [CrossRef] [Medline]
  110. Yap KY. Usefulness of the mobile interactive pharmacy education enhancement resource (miPEER) mobile Web-app as a learning tool for electronic health records. Int J Clin Skills 2017;11(6):1-7. [CrossRef]
  111. Zhang MW, Cheok CC, Ho RC. Global outreach of a locally-developed mobile phone app for undergraduate psychiatry education. JMIR Med Educ 2015 Jun 08;1(1):e3 [FREE Full text] [CrossRef] [Medline]
  112. Brooke J. SUS: a retrospective. J Usability Stud 2013;8(2):29-40 [FREE Full text] [CrossRef]
  113. Bastien JM. Usability testing: a review of some methodological and technical aspects of the method. Int J Med Inform 2010 Apr;79(4):e18-e23. [CrossRef] [Medline]


Abbreviations

PRISMA-ScR: Preferred Reporting Items for Systematic Reviews and Meta-Analyses extension for Scoping Reviews
SUS: System Usability Scale


Edited by T Leung; submitted 25.03.22; peer-reviewed by L Rutter, H Mehdizadeh, L Gutierrez-Puertas; comments to author 12.05.22; revised version received 02.06.22; accepted 05.06.22; published 29.06.22

Copyright

©Susanne Grødem Johnson, Thomas Potrebny, Lillebeth Larun, Donna Ciliska, Nina Rydland Olsen. Originally published in JMIR Medical Education (https://mededu.jmir.org), 29.06.2022.

This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Medical Education, is properly cited. The complete bibliographic information, a link to the original publication on https://mededu.jmir.org/, as well as this copyright and license information must be included.