Review
Abstract
Background: Mobile devices can provide extendable learning environments in higher education and motivate students to engage in adaptive and collaborative learning. Developers must design mobile apps that are practical, effective, and easy to use, and usability testing is essential for understanding how mobile apps meet users’ needs. No previous reviews have investigated the usability of mobile apps developed for health care education.
Objective: The aim of this scoping review is to identify usability methods and attributes in usability studies of mobile apps for health care education.
Methods: A comprehensive search was carried out in 10 databases, reference lists, and gray literature. Studies were included if they dealt with health care students and the usability of mobile apps for learning. Frequencies and percentages were used to present the nominal data, together with tables and graphical illustrations. Examples include a figure of the study selection process, an illustration of the frequency of inquiry-based usability evaluation and data collection methods, and an overview of the distribution of the identified usability attributes. We followed the Arksey and O’Malley framework for scoping reviews.
Results: Our scoping review collated 88 articles involving 98 studies, mainly related to medical and nursing students. The studies were conducted in 22 countries and were published between 2008 and 2021. Field testing was the main type of usability experiment used, and the usability evaluation methods were either inquiry-based or based on user testing. Inquiry methods were predominantly used: 1-group design (46/98, 47%), control group design (12/98, 12%), randomized controlled trials (12/98, 12%), mixed methods (12/98, 12%), and qualitative methods (11/98, 11%). The only user testing method applied was the think-aloud method (5/98, 5%). A total of 17 usability attributes were identified; of these, satisfaction, usefulness, ease of use, learning performance, and learnability were reported most frequently. The most frequently used data collection method was questionnaires (83/98, 85%), but only 19% (19/98) of the studies used a psychometrically tested usability questionnaire. Other data collection methods included focus group interviews, knowledge and task performance testing, user data collected from apps, interviews, written qualitative reflections, and observations. Most of the included studies used more than one data collection method.
Conclusions: Experimental designs were the most commonly used methods for evaluating usability, and most studies used field testing. Questionnaires were frequently used for data collection, although few studies used psychometrically tested questionnaires. The usability attributes identified most often were satisfaction, usefulness, and ease of use. The results indicate that combining different usability evaluation methods, incorporating both subjective and objective usability measures, and specifying which usability attributes to test seem advantageous. The results can support the planning and conduct of future usability studies for the advancement of mobile learning apps in health care education.
International Registered Report Identifier (IRRID): RR2-10.2196/19072
doi:10.2196/38259
Keywords
Introduction
Background
Mobile devices can provide extendable learning environments and motivate students to engage in adaptive and collaborative learning [ , ]. Mobile devices offer various functions, enable convenient access, and support the ability to share information with other learners and teachers [ ]. Most students own a mobile phone, which makes mobile learning easily accessible [ ]. However, there are some challenges associated with mobile devices in learning situations, such as small screen sizes, connectivity problems, and multiple distractions in the environment [ ].

Developers of mobile learning apps need to consider usability to ensure that apps are practical, effective, and easy to use [ ] and to ascertain that mobile apps meet users’ needs [ ]. According to the International Organization for Standardization, usability is defined as “the extent to which a system, product or service can be used by specified users to achieve specified goals with effectiveness, efficiency, and satisfaction in a specified context of use” [ ]. Better mobile learning usability can be achieved through user-centered design and attention to context, ensuring that the technology corresponds to the user’s requirements and putting the user at the center of the process [ , ]. In addition, it is necessary to be conscious of the interrelatedness between usability and pedagogical design [ ].

A variety of usability evaluation methods exist to test the usability of mobile apps; Weichbroth [ ] categorized them into 4 categories: inquiry, user testing, inspection, and analytical modeling. Inquiry methods gather data from users through questionnaires (quantitative data) and through interviews and focus groups (qualitative data). User testing methods include think-aloud protocols, question-asking protocols, performance measurements, log analysis, eye tracking, and remote testing. Inspection methods, in contrast, involve experts testing apps and include heuristic evaluation, cognitive walk-throughs, perspective-based inspections, and guideline reviews. Analytical modeling methods include cognitive task analysis and task environment analysis [ ]. Across these 4 categories, the most commonly used data collection methods are controlled observations and surveys, whereas eye tracking, think-aloud methods, and interviews are applied less often [ ].

Usability evaluations are normally performed in a laboratory or in the field. Previous reviews have reported that usability evaluations are mainly conducted in a laboratory, that is, in a controlled environment [ , ]. By contrast, field testing is conducted in real-life settings. There are pros and cons to the 2 approaches: field testing allows data collection within a dynamic environment, whereas in a laboratory the data collection and conditions are easier to control [ ]. A variety of data collection methods are appropriate for usability studies; in laboratories, for instance, participants often perform predefined tasks while questionnaires and observations are applied [ ]. In field testing, logging mechanisms and diaries have been used to capture user interaction with mobile apps [ ].

In all, 2 systematic reviews examined various psychometrically tested usability questionnaires as a means of enhancing the usability of apps. Sousa and Lopez [ ] identified 15 such questionnaires, and Sure [ ] identified 13. In all, 5 of the questionnaires have proven to be applicable in usability studies in general: the System Usability Scale (SUS), the Questionnaire for User Interaction Satisfaction, the After-Scenario Questionnaire, the Post-Study System Usability Questionnaire, and the Computer System Usability Questionnaire [ ]. The SUS and the After-Scenario Questionnaire are the most widely applied [ ]. The most frequently reported usability attributes of these 5 questionnaires are learnability, efficiency, and satisfaction [ ].

Usability attributes are features that measure the quality of mobile apps [ ]. The most commonly reported usability attributes are effectiveness, efficiency, and satisfaction [ ], which are part of the usability definition [ ]. In the review by Weichbroth [ ], 75 different usability attributes were identified. Given this wide selection, choosing appropriate attributes depends on the nature of the technology and the research question of the usability study [ ]. Kumar and Mohite [ ] recommended that researchers present and explain which usability attributes are being tested when mobile apps are developed.

Previous reviews have examined the usability of mobile apps in general [ , , , , ]; however, only one systematic review has specifically explored the usability of mobile learning apps [ ], and it did not include studies from health care education. Similarly, usability has not been widely explored in medical education apps [ ]. Thus, there is a need to develop a better understanding of how the usability of mobile learning apps developed for health care education has been evaluated and conceptualized in previous studies.

Objectives
The aim of this scoping review has therefore been to identify usability methods and attributes in usability studies of mobile apps for health care education.
Methods
Framework
We used the framework for scoping reviews developed by Arksey and O'Malley [ ] and further developed by Levac et al [ ] and Khalil et al [ ]. We adopted the following 5 stages of this framework: (1) identifying the research question, (2) identifying relevant studies, (3) selecting studies, (4) charting the data, and (5) summarizing and reporting the results [ - ]. A detailed presentation of each stage can be found in the published protocol for this scoping review [ ]. We followed the PRISMA-ScR (Preferred Reporting Items for Systematic Reviews and Meta-Analyses extension for Scoping Reviews) checklist for reporting scoping reviews ( [ ]).

Stage 1: Identifying the Research Question
The following two research questions have been formulated:
- Which usability methods are used to evaluate the usability of mobile apps for health care education?
- Which usability attributes are reported in the usability studies of mobile apps for health care education?
Stage 2: Identifying Relevant Studies
A total of 10 electronic databases covering technology, education, and health care were searched: Engineering Village, Scopus, ACM Digital Library, IEEE Xplore, Education Resource Information Center, PsycINFO, CINAHL, MEDLINE, EMBASE, and Web of Science. The searches covered January 2008 to October 2021, with an updated search in February 2022. The search string was developed by the first author and a research librarian and then peer reviewed by another research librarian. The search terms used in Web of Science, in addition to all relevant subject headings, included ((student* or graduate* or undergraduate* or postgraduate*) NEAR/3 nurs*). This search string was repeated for other types of students and combined with the Boolean operator OR. The search string for all types of health care students was then combined with various search terms for mobile apps and mobile learning using the Boolean operator AND. Similar search strategies were used and adapted for all 10 databases, as shown in
. In addition, we conducted a citation search in Google Scholar, screened the reference lists of the included studies, and searched for gray literature in OpenGrey.

Stage 3: Selecting Studies
Two of the authors independently screened titles and abstracts using Rayyan web-based management software [
]. Studies deemed eligible by at least one of the authors were included for full-text screening and imported into the EndNote X9 (Clarivate) reference management system [ ]. Eligibility at the full-text stage was determined independently by two of the authors, and disagreements were resolved through consensus-based discussion. Research articles with different designs were included, and there were no language restrictions. As mobile apps started appearing in 2008, this year was set as the starting point for the search. Eligibility criteria are presented in .

| | Inclusion criteria | Exclusion criteria |
Population | Health care and allied health care students at the undergraduate and postgraduate levels | Health care professionals or students from education, engineering, or other nonhealth sciences |
Concept | Studies of usability testing or methods of usability evaluation of mobile learning apps where the purpose relates to the development of the apps | Studies relating to learner management systems, e-learning platforms, open online courses, or distance education |
Context | Typical educational setting (eg, classroom teaching, clinical placement, or simulation training), including both synchronous and asynchronous teaching | Noneducational settings not involving clinical placement or learning situations (eg, hospital or community settings) |
Stage 4: Charting the Data (Data Abstraction)
The extracted data included information about the study (eg, authors, year of publication, title, and country), population (eg, number of participants), concepts (usability methods, usability attributes, and usability phase), and context (educational setting). The final data extraction sheet can be found in
[ - ]. One review author extracted the data from the included studies into Microsoft Excel [ ], and the extraction was checked by another researcher.

Descriptions of usability attributes have not been standardized, making categorization challenging. Therefore, one review author used deductive analysis to interpret the usability attributes reported in the included studies. This interpretation was based on definitions of usability attributes from previous literature, which were assessed against the results of the included studies. The analysis was reviewed and discussed by another author, and disagreements were resolved through consensus-based discussion.
Stage 5: Summarizing and Reporting the Results
Frequencies and percentages were used to present nominal data, together with tables and graphical illustrations. For instance, a figure showing the study selection process, an illustration of the frequency of inquiry-based usability evaluation and data collection methods, and an overview of the distribution of identified usability attributes were provided.
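Stage 5 reduces to simple frequency and percentage summaries of nominal data. As a minimal illustration of that computation (the category counts are taken from the Results section of this review; the code itself is illustrative and not the authors' analysis script):

```python
# Frequencies and percentages for nominal data (counts out of 98 studies,
# as reported in the Results; percentages rounded to whole numbers).
inquiry_methods = {
    "1-group design": 46,
    "control group design": 12,
    "randomized controlled trial": 12,
    "mixed methods": 12,
    "qualitative methods": 11,
}
TOTAL_STUDIES = 98

def summarize(counts, total):
    """Return 'n/total (p%)' strings for each category."""
    return {k: f"{v}/{total} ({round(v / total * 100)}%)" for k, v in counts.items()}

for method, freq in summarize(inquiry_methods, TOTAL_STUDIES).items():
    print(f"{method}: {freq}")
```

Running this reproduces the notation used throughout the Results, for example "1-group design: 46/98 (47%)".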
Results
Eligible Studies
Database searches yielded 34,369 records, and 2796 records were identified using other methods. After removing duplicates, 28,702 records remained. A total of 626 reports were examined in full text. In all, 88 articles were included in the scoping review [
- ] ( ). A total of 8 articles reported results from several studies in the same article, presented as study A, study B, or study C in . Therefore, a total of 98 studies were reported in the 88 included articles.

The included studies comprised a total sample population of 7790, with the number of participants ranging from 5 to 736 per study. Most of the studies included medical students (34/88, 39%) or nursing students (25/88, 28%). Other participants included students from the following disciplines: pharmacy (9/88, 10%), dentistry (5/88, 6%), physiotherapy (5/88, 6%), health sciences (3/88, 3%), and psychology (2/88, 2%). Further information is provided in
. The included studies were published in 22 countries, with most studies being from the United States (22/88, 25%), Spain (9/88, 10%), the United Kingdom (8/88, 9%), Canada (7/88, 8%), and Brazil (7/88, 8%), with an increasing number of publications from 2014 onward. provides an overview and characteristics of the included articles.

Study number | Study | Population (N) | Research design: data collection method | Usability attributes |
1 | Aebersold et al [ ], 2018, United States | Nursing (N=69) | Mixed methods: questionnaire; task and knowledge performance^a | Ease of use; learning performance; satisfaction; usefulness |
2 | Akl et al [ ], 2008, United States | Resident (N=30) | Qualitative methods: focus groups; written qualitative reflections | Satisfaction |
3 | Al-Rawi et al [ ], 2015, United States | Dentist (N=61) | Posttest 1-group design: questionnaire | Ease of use; frequency of use; satisfaction; usefulness |
4 | Albrecht et al [ ], 2013, Germany | Medicine (N=6) | Posttest 1-group design: questionnaire^b | Satisfaction |
5 | Alencar Neto et al [ ], 2020, Brazil | Medicine (N=132) | Posttest 1-group design: questionnaire^b | Ease of use; learnability; satisfaction; usefulness |
6 | Alepis and Virvou [ ], 2010, Greece | Medicine (N=110) | Mixed methods: questionnaire; interviews | Ease of use; usefulness; user-friendliness |
7 | Ameri et al [ ], 2020, Iran | Pharmacy (N=241) | Posttest 1-group design: questionnaire^b | Context of use; efficiency; usefulness |
8 | Balajelini and Ghezeljeh [ ], 2018, Iran | Nursing (N=41) | Posttest 1-group design: questionnaire | Ease of use; frequency of use; navigation; satisfaction; simplicity; usefulness |
9 | Barnes et al [ ], 2015, United Kingdom | Medicine (N=42) | Randomized controlled trial: questionnaire; task and knowledge performance | Ease of use; effectiveness; learning performance; satisfaction |
10 | Busanello et al [ ], 2015, Brazil | Dentist (N=62) | Pre-post test, nonrandomized control group design: questionnaire^b | Learnability; learning performance; satisfaction |
11 | Cabero-Almenara and Roig-Vila [ ], 2019, Spain | Medicine (N=50) | Pre-post test, 1-group design: questionnaire^b | Learning performance; satisfaction |
12 | Choi et al [ ], 2015, South Korea | Nursing (N=5) | Think-aloud methods: interviews; data from app | Context of use; ease of use; learnability; satisfaction; usefulness |
13 | Choi et al [ ], 2018, South Korea | Nursing (N=75) | Pre-post test, nonrandomized control group design: questionnaire | Ease of use; learning performance; satisfaction; usefulness |
14 | Choo et al [ ], 2019, Singapore | Psychology (N=8) | Mixed methods: questionnaire^b; written qualitative reflections | Ease of use; learning performance; satisfaction; usefulness; user-friendliness |
15 | Chreiman et al [ ], 2017, United States | Medicine (N=30) | Posttest 1-group design: questionnaire; data from app | Context of use; ease of use; frequency of use; usefulness |
16 | Colucci et al [ ], 2015, United States | Medicine (N=115) | Posttest 1-group design: questionnaire | Effectiveness; efficiency; satisfaction; usefulness |
17 | Davids et al [ ], 2014, South Africa | Residents (N=82) | Randomized controlled trial: questionnaire^b; data from app | Effectiveness; efficiency; learnability; navigation; satisfaction; user-friendliness |
18A | Demmans et al [ ], 2018, Canada | Nursing (N=60) | Pre-post test, nonrandomized control group design: questionnaire; observations | Ease of use; effectiveness; learnability; learning performance; navigation; satisfaction |
18B | Demmans et al [ ], 2018, Canada | Nursing (N=85) | Pre-post test, nonrandomized control group design: questionnaire; observations | Ease of use; effectiveness; learnability; learning performance; navigation; satisfaction |
19 | Devraj et al [ ], 2021, United States | Pharmacy (N=89) | Posttest 1-group design: questionnaire; data from app | Ease of use; errors; frequency of use; learning performance; navigation; operational usability; satisfaction; usefulness |
20 | Díaz-Fernández et al [ ], 2016, Spain | Physiotherapy (N=110) | Posttest 1-group design: questionnaire | Comprehensibility; ease of use; usefulness |
21 | Docking et al [ ], 2018, United Kingdom | Paramedic (N=24) | Think-aloud methods: focus groups | Context of use; learnability; satisfaction; usefulness |
22 | Dodson and Baker [ ], 2020, United States | Nursing (N=23) | Qualitative methods: focus groups | Ease of use; operational usability; satisfaction; usefulness; user-friendliness |
23 | Duarte Filho et al [ ], 2014, Brazil | Medicine (N=10) | Posttest nonrandomized control group design: questionnaire | Ease of use; efficiency; satisfaction; usefulness |
24 | Duggan et al [ ], 2020, Canada | Medicine (N=80) | Posttest 1-group design: questionnaire; data from app | Ease of use; frequency of use; satisfaction; usefulness |
25 | Fernandez-Lao et al [ ], 2016, Spain | Physiotherapy (N=49) | Randomized controlled trial: questionnaire^b; task and knowledge performance | Learning performance; satisfaction |
26 | Fralick et al [ ], 2017, Canada | Medicine (N=62) | Pre-post test, nonrandomized control group design: questionnaire | Ease of use; frequency of use; learning performance; usefulness |
27 | Ghafari et al [ ], 2020, Iran | Nursing (N=8) | Posttest 1-group design: questionnaire | Ease of use; operational usability; satisfaction; usefulness |
28 | Goldberg et al [ ], 2014, United States | Medicine (N=18) | Posttest 1-group design: questionnaire; task and knowledge performance | Ease of use; effectiveness |
29 | Gutiérrez-Puertas et al [ ], 2021, Spain | Nursing (N=184) | Randomized controlled trial: questionnaire; task and knowledge performance | Learning performance; satisfaction |
30 | Herbert et al [ ], 2021, United States | Nursing (N=33) | Randomized controlled trial: questionnaire; task and knowledge performance | Ease of use; learning performance; navigation; operational usability; usefulness |
31 | Hsu et al [ ], 2019, Taiwan | Nursing (N=16) | Qualitative methods: interviews | Context of use; operational usability; satisfaction; usefulness |
32 | Huang et al [ ], 2010, Taiwan | Not clear (N=28) | Posttest 1-group design: questionnaire | Ease of use; satisfaction; usefulness |
33 | Hughes and Kearney [ ], 2017, United States | Occupational therapy (N=19) | Qualitative methods: focus groups | Efficiency; satisfaction |
34 | Ismail et al [ ], 2018, Malaysia | Health science (N=124) | Pre-post test, 1-group design: questionnaire | Ease of use; learning performance; satisfaction; user-friendliness |
35 | Johnson et al [ ], 2021, Norway | Occupational therapy, physiotherapy, and social education (N=15) | Qualitative methods: focus groups | Context of use; ease of use; operational usability |
36A | Kang Suh [ ], 2018, South Korea | Nursing (N=92) | Pre-post test, nonrandomized control group design: questionnaire; data from app | Effectiveness; frequency of use; learning performance; satisfaction |
36B | Kang Suh [ ], 2018, South Korea | Nursing (N=49) | Qualitative methods: focus groups | Effectiveness; frequency of use; learning performance; satisfaction |
37 | Keegan et al [ ], 2016, United States | Nursing (N=116) | Posttest nonrandomized control group design: questionnaire; task and knowledge performance | Learning performance; satisfaction; usefulness |
38 | Kim-Berman et al [ ], 2019, United States | Dentist (N=93) | Posttest 1-group design: questionnaire; task and knowledge performance | Context of use; ease of use; effectiveness; usefulness |
39 | Kojima et al [ ], 2011, Japan | Physiotherapy and occupational therapy (N=41) | Pre-post test, 1-group design: questionnaire | Ease of use; learning performance; satisfaction; usefulness |
40 | Koulias et al [ ], 2012, Australia | Medicine (N=171) | Posttest 1-group design: questionnaire | Ease of use; operational usability; satisfaction |
41 | Kow et al [ ], 2016, Singapore | Medicine (N=221) | Pre-post test, 1-group design: questionnaire | Learning performance; satisfaction |
42 | Kurniawan and Witjaksono [ ], 2018, Indonesia | Medicine (N=30) | Posttest 1-group design: questionnaire | Satisfaction; usefulness |
43A | Lefroy et al [ ], 2017, United Kingdom | Medicine (N=21) | Qualitative methods: focus groups; data from app | Context of use; frequency of use; satisfaction |
43B | Lefroy et al [ ], 2017, United Kingdom | Medicine (N=405) | Quantitative methods: data from app | Context of use; frequency of use; satisfaction |
44 | Li et al [ ], 2019, Taiwan | Health care (N=70) | Pre-post test, nonrandomized control group design: questionnaire^b | Ease of use; usefulness |
45 | Lin and Lin [ ], 2016, Taiwan | Nursing (N=36) | Pre-post test, nonrandomized control group design: questionnaire | Cognitive load; ease of use; learnability; learning performance; usefulness |
46 | Lone et al [ ], 2019, Ireland | Dentist (N=59) | Randomized controlled trial: questionnaire; task and knowledge performance | Ease of use; learnability; learning performance; operational usability; satisfaction |
47A | Long et al [ ], 2016, United States | Nursing (N=158) | Pre-post test, 1-group design: questionnaire; data from app | Ease of use; efficiency; learnability; learning performance; satisfaction |
47B | Long et al [ ], 2016, United States | Health science (N=159) | Randomized controlled trial: questionnaire; data from app | Ease of use; efficiency; learnability; learning performance; satisfaction |
48 | Longmuir [ ], 2014, United States | Medicine (N=56) | Posttest 1-group design: questionnaire; data from app | Efficiency; learnability; operational usability; satisfaction |
49 | López et al [ ], 2016, Spain | Medicine (N=67) | Posttest 1-group design: questionnaire^b | Context of use; ease of use; errors; satisfaction; usefulness |
50 | Lozano-Lozano et al [ ], 2020, Spain | Physiotherapy (N=110) | Randomized controlled trial: questionnaire; task and knowledge performance | Learning performance; satisfaction; usefulness |
51 | Lucas et al [ ], 2019, Australia | Pharmacy (N=39) | Pre-post test, 1-group design: questionnaire; task and knowledge performance | Satisfaction; usefulness |
52 | Mathew et al [ ], 2014, Canada | Medicine (N=5) | Think-aloud methods: questionnaire^b; interviews; task and knowledge performance | Learnability; satisfaction |
53 | McClure [ ], 2019, United States | Nursing (N=16) | Posttest 1-group design: questionnaire^b | Learnability; satisfaction; usefulness |
54 | McDonald et al [ ], 2018, Canada | Medicine (N=20) | Pre-post test, 1-group design: questionnaire; data from app | Effectiveness; satisfaction |
55 | McLean et al [ ], 2014, Australia | Medicine (N=58) | Mixed methods: questionnaire; focus groups; interviews | Satisfaction |
56 | McMullan [ ], 2018, United Kingdom | Health science (N=60) | Pre-post test, 1-group design: questionnaire | Learning performance; navigation; satisfaction; usefulness; user-friendliness |
57 | Mendez-Lopez et al [ ], 2021, Spain | Psychology (N=67) | Pre-post test, 1-group design: questionnaire; task and knowledge performance | Cognitive load; ease of use; learning performance; satisfaction; usefulness |
58 | Meruvia-Pastor et al [ ], 2016, Canada | Nursing (N=10) | Pre-post test, 1-group design: questionnaire; task and knowledge performance | Ease of use; learning performance; satisfaction; usefulness |
59 | Mettiäinen [ ], 2015, Finland | Nursing (N=121) | Mixed methods: questionnaire; focus groups | Ease of use; usefulness |
60 | Milner et al [ ], 2020, United States | Medicine and nursing (N=66) | Posttest 1-group design: questionnaire | Satisfaction; usefulness |
61 | Mladenovic et al [ ], 2021, Serbia | Dentist (N=56) | Posttest 1-group design: questionnaire | Context of use; ease of use; satisfaction; usefulness |
62 | Morris and Maynard [ ], 2010, United Kingdom | Physiotherapy and nursing (N=19) | Pre-post test, 1-group design: questionnaire | Context of use; ease of use; navigation; operational usability; usefulness |
63A | Nabhani et al [ ], 2020, United Kingdom | Pharmacy (N=56) | Posttest 1-group design: questionnaire | Ease of use; learnability; learning performance; satisfaction; usefulness |
63B | Nabhani et al [86], 2020, United Kingdom | Pharmacy (N=152) | Posttest 1-group design: questionnaire | Ease of use; learnability; learning performance; satisfaction; usefulness |
63C | Nabhani et al [86], 2020, United Kingdom | Pharmacy (N=33) | Posttest 1-group design: task and knowledge performance | Ease of use; learnability; learning performance; satisfaction; usefulness |
64A | Noguera et al [ ], 2013, Spain | Physiotherapy (N=84) | Posttest 1-group design: questionnaire | Learning performance; satisfaction; usefulness |
64B | Noguera et al [ ], 2013, Spain | Physiotherapy (N=76) | Randomized controlled trial: questionnaire | Learning performance; satisfaction; usefulness |
65 | O’Connell et al [ ], 2016, Ireland | Medicine, nursing, and pharmacy (N=89) | Randomized controlled trial: questionnaire^b | Ease of use; learning performance; operational usability; satisfaction; simplicity |
66 | Oliveira et al [ ], 2019, Brazil | Medicine (N=110) | Randomized controlled trial: questionnaire; task and knowledge performance | Frequency of use; learning performance; satisfaction |
67 | Orjuela et al [ ], 2015, Colombia | Medicine (N=22) | Posttest 1-group design: questionnaire | Ease of use; satisfaction |
68 | Page et al [ ], 2016, United States | Medicine (N=356) | Mixed methods: questionnaire; interviews | Context of use; efficiency; satisfaction |
69 | Paradis et al [ ], 2018, Canada | Medicine and nursing (N=108) | Posttest 1-group design: questionnaire^b | Ease of use; satisfaction; usefulness |
70 | Pereira et al [ ], 2017, Brazil | Medicine (N=20) | Posttest 1-group design: questionnaire^b | Ease of use; learnability; satisfaction; usefulness |
71 | Pereira et al [ ], 2019, Brazil | Nursing (N=60) | Posttest 1-group design: questionnaire | Ease of use; operational usability; satisfaction |
72A | Pinto et al [ ], 2008, Brazil | Biomedical informatics (N=5) | Qualitative methods: observations; task and knowledge performance | Efficiency; errors; learnability; learning performance; operational usability; satisfaction |
72B | Pinto et al [ ], 2008, Brazil | Medicine (N=not clear) | Posttest nonrandomized control group design: questionnaire | Efficiency; errors; learnability; learning performance; operational usability; satisfaction |
73 | Quattromani et al [ ], 2018, United States | Nursing (N=181) | Randomized controlled trial: questionnaire^b | Learnability; learning performance; satisfaction; usefulness |
74 | Robertson and Fowler [ ], 2017, United States | Medicine (N=18) | Qualitative methods: focus groups | Satisfaction |
75A | Romero et al [ ], 2021, Germany | Medicine (N=22) | Think-aloud methods: questionnaire; interviews; task and knowledge performance | Effectiveness; efficiency; errors; navigation; satisfaction |
75B | Romero et al [ ], 2021, Germany | Medicine (N=22) | Posttest 1-group design: questionnaire^b | Learnability; satisfaction |
75C | Romero et al [ ], 2021, Germany | Medicine (N=736) | Posttest 1-group design: questionnaire | Frequency of use; satisfaction |
76 | Salem et al [ ], 2020, Australia | Pharmacy (N=33) | Posttest 1-group design: questionnaire | Operational usability; satisfaction; usefulness |
77 | San Martín-Rodríguez et al [ ], 2020, Spain | Nursing (N=77) | Posttest 1-group design: questionnaire; task and knowledge performance | Learning performance; operational usability; satisfaction |
78 | Schnepp and Rogers [ ], 2017, United States | Not clear (N=72) | Think-aloud methods: questionnaire^b; interviews; task and knowledge performance | Learnability; satisfaction |
79 | Smith et al [ ], 2016, United Kingdom | Medicine and nursing (N=74) | Mixed methods: questionnaire; focus groups | Navigation; operational usability; satisfaction; user-friendliness |
80 | Strandell-Laine et al [ ], 2019, Finland | Nursing (N=52) | Mixed methods: questionnaire^b; written qualitative responses | Learnability; operational usability; satisfaction |
81 | Strayer et al [ ], 2010, United States | Medicine (N=122) | Mixed methods: questionnaire; focus groups | Context of use; learnability; learning performance; satisfaction; usefulness |
82 | Taylor et al [ ], 2010, United Kingdom | Students from 8 different health care education programs (N=79) | Qualitative methods: focus groups; written qualitative reflections | Context of use; learnability |
83 | Toh et al [ ], 2014, Singapore | Pharmacy (N=31) | Posttest 1-group design: questionnaire | Ease of use; learnability; navigation; usefulness |
84 | Tsopra et al [ ], 2020, France | Medicine (N=57) | Mixed methods: questionnaire; focus groups | Ease of use; operational usability; satisfaction; usefulness |
85 | Wu [ ], 2014, Taiwan | Nursing (N=36) | Mixed methods: questionnaire; interviews | Cognitive load; effectiveness; satisfaction; usefulness |
86 | Wyatt et al [ ], 2012, United States | Nursing (N=12) | Qualitative methods: focus groups | Ease of use; efficiency; errors; learnability; memorability; navigation; satisfaction |
87 | Yap [ ], 2017, Singapore | Pharmacy (N=123) | Posttest 1-group design: questionnaire | Comprehensibility; learning performance; memorability; navigation; satisfaction; usefulness |
88 | Zhang et al [ ], 2015, Singapore | Medicine (N=185) | Mixed methods: questionnaire; focus groups | Usefulness |
^a Performance measured by comparing paper and app results, quiz results, and exam results.
^b Reported use of validated questionnaires.
Usability Evaluation Methods
The usability evaluation methods found were either inquiry-based or based on user testing. The following inquiry methods were used: 1-group design (46/98, 47%), control group design (12/98, 12%), randomized controlled trials (12/98, 12%), mixed methods (12/98, 12%), and qualitative methods (11/98, 11%). Several studies that applied inquiry-based methods used more than one data collection method, with questionnaires being used most often (80/98, 82%), followed by task and knowledge performance testing (17/98, 17%), focus groups (15/98, 15%), collection of user data from the app (10/98, 10%), interviews (5/98, 5%), written qualitative reflections (4/98, 4%), and observations (3/98, 3%). Additional information can be found in the data extraction sheet (
). illustrates the frequency of the inquiry-based usability evaluation methods and data collection methods. The only user testing methods found were think-aloud methods (5/98, 5%), and 4 (80%) of these studies applied more than one data collection method. The data collection methods used included interviews (4/98, 4%), questionnaires (3/98, 3%), task and knowledge performance (3/98, 3%), focus groups (1/98, 1%), and collection of user data from the app (1/98, 1%).
A total of 19 studies used a psychometrically tested usability questionnaire, including the SUS, Technology Acceptance Model, Technology Satisfaction Questionnaire, and Technology Readiness Index. SUS [
] was used most frequently (9/98, 9%). Field testing was the most frequent type of usability experiment, accounting for 72% (71/98) of the included studies. A total of 22 (22%) studies performed laboratory testing, and 5 (5%) studies did not indicate the type of experiment performed.
provides an overview of the type of experiment conducted in each study. The usability testing of the mobile apps took place in a classroom setting (41/98, 42%), in clinical placement (29/98, 30%), during simulation training (14/98, 14%), or in other settings (7/98, 7%); in the remaining studies (5/98, 5%), the setting was not specified.

Usability Attributes
A total of 17 usability attributes were identified among the included studies. The most frequently identified attributes were satisfaction, usefulness, ease of use, learning performance, and learnability. The least frequent were errors, cognitive load, comprehensibility, memorability, and simplicity.
provides an overview of the usability attributes identified in the included studies.

Usability attribute | Distribution, n (%) | Reports (references) |
Satisfaction | 74 (84) | [ | - , - , - , - , , , - , , , - , - , , , - , - ]
Usefulness | 51 (58) | [ | , , - , - , - , , , - , - , , , , - , , - , , , , , , - , , ]
Ease of use | 45 (51) | [ | , , , , , , - , - , - , - , , , , , - , - , , - , - , , , - , , , ]
Learning performance | 33 (38) | [ | , - , , , , , , , , , , , , , , - , , - , - , , , , , ]
Learnability | 23 (26) | [ | , , , , , , - , , , , , , , , , - , ]
Operational usability | 19 (22) | [ | , , , , , , , , , , , , , , - , , ]
Context of use | 14 (16) | [ | , , , , , , , , , , , , , ]
Navigation | 12 (14) | [ | , - , , , , , , , , ]
Efficiency | 11 (13) | [ | , , , , , , , , , , ]
Effectiveness | 10 (11) | [ | , - , , , , , , ]
Frequency of use | 10 (11) | [ | , , , , , , , , , ]
User-friendliness | 7 (8) | [ | , , , , , , ]
Errors | 5 (6) | [ | , , , , ]
Cognitive load | 3 (3) | [ | , , ]
Comprehensibility | 2 (2) | [ | , ]
Memorability | 2 (2) | [ | , ]
Simplicity | 2 (2) | [ | , ]
Discussion
Principal Findings
This scoping review sought to identify the usability methods and attributes reported in usability studies of mobile apps for health care education. A total of 88 articles, reporting 98 studies, were included in this review. Our findings indicate a steady increase in publications from 2014, with studies conducted in 22 different countries. Field testing was used more frequently than laboratory testing. Furthermore, the usability evaluation methods applied were either inquiry-based or based on user testing. Most of the inquiry-based methods were experiments that used questionnaires as a data collection method, and all of the studies with user testing methods applied think-aloud methods. Satisfaction, usefulness, ease of use, learning performance, and learnability were the most frequently identified usability attributes.
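The descriptive statistics summarized above follow a simple pattern: each method count is expressed as a fraction of the 98 included studies and rounded to a whole percentage. The following minimal Python sketch is illustrative only; the counts are taken from the Results above, but the code is not part of the review's own analysis:

```python
# Recompute the reported frequencies and percentages for the usability
# evaluation methods; each count is out of the 98 included studies.
counts = {
    "1-group design": 46,
    "Control group design": 12,
    "Randomized controlled trial": 12,
    "Mixed methods": 12,
    "Qualitative methods": 11,
    "Think aloud (user testing)": 5,
}
TOTAL_STUDIES = 98

def as_percentage(n, total=TOTAL_STUDIES):
    """Express a count as a whole-number percentage of the total."""
    return round(100 * n / total)

for method, n in counts.items():
    print(f"{method}: {n}/{TOTAL_STUDIES} ({as_percentage(n)}%)")
```

Running this reproduces the percentages reported in the Results (eg, 46/98 gives 47% and 11/98 gives 11%).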
Comparison With Prior Work
Usability Evaluation Methods
The studies included in this scoping review mainly applied inquiry-based methods, primarily the collection of self-reported data through questionnaires. This is congruent with the results of Weichbroth [
], in which controlled observations and surveys were the most frequently applied methods. Asking users to respond to a usability questionnaire may provide relevant and valuable information. Among the 83 studies that used questionnaires in our review, only 19 (23%) used a psychometrically tested usability questionnaire; of these, the SUS questionnaire [ ] was used most frequently. In line with the review on usability questionnaires [ ], we recommend using a psychometrically tested usability questionnaire to support the advancement of usability science. As questionnaires address only certain usability attributes, mainly learnability, efficiency, and satisfaction [ ], it would be helpful to include additional methods, such as interviews or mixed methods, and to incorporate open-ended questions when questionnaires are used.

Furthermore, applying usability evaluation methods other than inquiry methods, such as user testing methods and inspection methods [
], could be beneficial and lead to more objective measures of app usability. Typically, subjective data are collected via self-reported questionnaires, whereas objective data are based on measures such as task completion rates [ ]. For example, in one of the included studies, the participants reported that the usability of the app was satisfactory by subjective measures, but the participants did not use the app [ ]. Another study reported a lack of coherence between subjective and objective data; these results indicate the importance of not relying solely on subjective measures of usability [ ]. Therefore, we suggest that future usability studies use various usability evaluation methods, including both subjective and objective usability measures.

Our review found that most of the included studies in health care education (71/98, 72%) performed field testing, whereas previous literature suggests that usability experiments in other fields are more often conducted in a laboratory [
, ]. For instance, Kumar and Mohite [ ] found that 73% of the studies included in their review of mobile learning apps used laboratory testing. Mobile apps in health care education have been developed to support students’ learning on campus and during clinical placement, in various settings and on the move. Accordingly, it is especially important to test how the apps are perceived in specific environments [ ]; hence, field testing is required. However, many usability issues can be discovered in a laboratory. Particularly in the early phases of app development, testing an app with several participants in a laboratory may make it more feasible to test and improve the app [ ]. Usability testing in a laboratory can provide rapid feedback on usability issues, which can then be addressed before testing the app in a real-world environment. Therefore, it may be beneficial to conduct small-scale laboratory testing before field testing.

Usability Attributes
Previous systematic reviews of mobile apps in general identified satisfaction, efficiency, and effectiveness as the most common usability attributes [
, ]. In this review, efficiency and effectiveness were explored to a limited extent, whereas satisfaction, usefulness, and ease of use were the most frequently identified usability attributes. Our results coincide with those from a previous review on the usability of mobile learning apps [ ], possibly because satisfaction, usefulness, and ease of use are of particular importance when examining mobile learning apps.

Learning performance was assessed frequently in the included studies. To ensure that apps are valuable in a given learning context, it is also relevant to test additional usability attributes, such as cognitive load [
]. However, few of the studies included in our review examined cognitive load [ , , ]. Mobile apps are often used in environments with multiple distractions, which may contribute to an increased cognitive load [ ], affecting learning performance. Testing both learning performance and app users’ cognitive load may improve the understanding of an app’s usability.

We found that several of the included studies did not use terminology from the usability literature to describe which usability attributes they were testing. For instance, studies that tested satisfaction often used words such as “likes and dislikes” and “recommend use to others” and did not specify that they tested the usability attribute satisfaction. Specifying which usability attributes are investigated is important when performing a usability study of mobile apps, as this improves transparency and enables comparison between studies. In addition, evaluating a wider range of usability attributes may enable researchers to expand their perspective regarding an app’s usability problems and ensure quicker improvement of the app. Defining and presenting different usability attributes in a reporting guideline can assist in deciding on and reporting relevant usability attributes. As such, a reporting guideline would be beneficial for researchers planning and conducting usability studies, a point that is also supported by the systematic review conducted by Kumar and Mohite [
].

Future Directions
Combining different usability evaluation methods that incorporate both subjective and objective usability measures can add varied and important perspectives when developing apps. In future studies, it would be advantageous to use psychometrically tested usability questionnaires to support the advancement of usability science. In addition, developers of mobile apps should determine which usability attributes are relevant before conducting usability studies (eg, by registering a protocol). Incorporating these perspectives into the development of a reporting guideline would benefit future usability studies.
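As a concrete illustration of a psychometrically tested instrument, the SUS recommended above produces a 0-100 score from 10 items rated on a 5-point scale. The following Python sketch implements the standard SUS scoring rules; the scoring procedure is general SUS practice, not something reported by this review:

```python
def sus_score(responses):
    """Convert ten 1-5 SUS item responses into a 0-100 usability score.

    Odd-numbered items (positively worded) contribute (response - 1);
    even-numbered items (negatively worded) contribute (5 - response).
    The summed contributions are scaled by 2.5 to give a 0-100 range.
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS requires ten responses on a 1-5 scale")
    contributions = [
        (r - 1) if i % 2 == 0 else (5 - r)  # 0-based index: even index = odd-numbered item
        for i, r in enumerate(responses)
    ]
    return sum(contributions) * 2.5

# All-neutral responses (3 on every item) give the midpoint score of 50.
print(sus_score([3] * 10))  # → 50.0
```

A single SUS score is straightforward to aggregate across participants, which is one reason the instrument lends itself to the questionnaire-based designs that dominated the included studies.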
Strengths and Limitations
First, the search strategy was designed in collaboration with a research librarian, peer reviewed by a second research librarian, and covered 10 databases and other sources. This broad search strategy resulted in a high number of references, which may be associated with a lower level of precision. To ensure the retrieval of all potentially pertinent articles, two of the authors independently screened titles and abstracts, and studies deemed eligible by either author were included for full-text screening.
Second, the full-text evaluation was challenging because the term usability has multiple meanings that do not always relate to usability testing. For instance, the term was used when testing students’ experience of a commercially developed app but not in connection with the app’s further development. In addition, many studies did not explicitly state that a mobile app was being investigated, which also created a challenge when deciding whether they satisfied the eligibility criteria. Nevertheless, having 2 reviewers read the full-text articles independently and resolving disagreements through consensus-based discussion ensured the inclusion of relevant articles.
Conclusions
This scoping review was performed to provide an overview of the usability methods used and the attributes identified in usability studies of mobile apps in health care education. Experimental designs were commonly used to evaluate usability, and most studies used field testing. Questionnaires were frequently used for data collection, although few studies used psychometrically tested questionnaires. The usability attributes identified most often were satisfaction, usefulness, and ease of use. The results indicate that combining different usability evaluation methods, incorporating both subjective and objective usability measures, and specifying which usability attributes to test seem advantageous. These results can support the planning and conduct of future usability studies and the further development of learning apps in health care education.
Acknowledgments
The research library at Western Norway University of Applied Sciences provided valuable assistance in developing and performing the search strategy for this scoping review. Gunhild Austrheim, a research librarian, provided substantial guidance in the planning and performance of the database searches. Marianne Nesbjørg Tvedt peer reviewed the search string. Malik Beglerovic also assisted with database searches. The authors would also like to thank Ane Kjellaug Brekke Gjerland for assessing the data extraction sheet.
Authors' Contributions
SGJ, LL, DC, and NRO proposed the idea for this review. SGJ, DC, and NRO contributed to the screening of titles and abstracts, and SGJ and TP decided on eligibility based on full-text examinations. SGJ extracted data from the included studies. SGJ, TP, LL, DC, and NRO contributed to the drafts of the manuscript, and all authors approved the final version for publication.
Conflicts of Interest
None declared.
PRISMA-ScR (Preferred Reporting Items for Systematic Reviews and Meta-Analyses extension for Scoping Reviews) checklist for reporting scoping reviews.
DOCX File , 107 KB
The search strategies for the 10 databases.
DOCX File , 84 KB
Data extraction sheet.
XLSX File (Microsoft Excel File), 156 KB

References
- Kumar BA, Mohite P. Usability of mobile learning applications: a systematic literature review. J Comput Educ 2018;5(1):1-17. [CrossRef]
- Asarbakhsh M, Sandars J. E-learning: the essential usability perspective. Clin Teach 2013 Feb;10(1):47-50. [CrossRef] [Medline]
- Lall P, Rees R, Law GC, Dunleavy G, Cotič Ž, Car J. Influences on the implementation of mobile learning for medical and nursing education: qualitative systematic review by the digital health education collaboration. J Med Internet Res 2019 Feb 28;21(2):e12895 [FREE Full text] [CrossRef] [Medline]
- Sophonhiranrak S, Promsaka Na Sakonnak N. Limitations of mobile learning: a systematic review. In: E-Learn: World Conference on E-Learning in Corporate, Government, Healthcare, and Higher Education. 2017 Presented at: E-Learn '17; October 17, 2017; Vancouver, British Columbia, Canada p. 965-971.
- Harrison R, Flood D, Duce D. Usability of mobile applications: literature review and rationale for a new usability model. J Interact Sci 2013 May 7;1(1):1-16. [CrossRef]
- Paz F, Pow-Sang JA. A systematic mapping review of usability evaluation methods for software development process. Int J Softw Eng Appl 2016 Jan 31;10(1):165-178. [CrossRef]
- ISO 9241-11:2018. Ergonomics of human-system interaction — part 11: usability: definitions and concepts. Geneva, Switzerland: International Organization for Standardization; 2018.
- Rubin J, Chisnell D, Spool J. Handbook of Usability Testing: How to Plan, Design, and Conduct Effective Tests. 2nd edition. Hoboken, NJ, USA: Wiley; May 2008.
- Kukulska-Hulme A. Mobile usability in educational contexts: what have we learnt? Int Rev Res Open Dis Learn 2007 Jun 15;8(2):1-16. [CrossRef]
- Weichbroth P. Usability of mobile applications: a systematic literature study. IEEE Access 2020 Mar 19;8:55563-55577. [CrossRef]
- Nayebi F, Desharnais JM, Abran A. The state of the art of mobile application usability evaluation. In: Proceedings of the 25th IEEE Canadian Conference on Electrical and Computer Engineering. 2012 Presented at: CCECE '12; April 29-May 02, 2012; Montreal, Canada p. 1-4. [CrossRef]
- Sousa VE, Dunn Lopez K. Towards usable e-health. A systematic review of usability questionnaires. Appl Clin Inform 2017 May 10;8(2):470-490 [FREE Full text] [CrossRef] [Medline]
- Sure M. Questionnaires for Usability: A Systematic Literature Review. Linköping, Sweden: Linköping University; 2014.
- Zhang D, Adipat B. Challenges, methodologies, and issues in the usability testing of mobile applications. Int J Hum Comput Interact 2005 Jul;18(3):293-308. [CrossRef]
- Ismail NA, Ahmad F, Kamaruddin NA, Ibrahim R. A review on usability issues in mobile applications. J Mob Comput Appl 2016;3(3):47-52. [CrossRef]
- Sandars J. The importance of usability testing to allow e-learning to reach its potential for medical education. Educ Prim Care 2010 Jan;21(1):6-8. [CrossRef] [Medline]
- Arksey H, O'Malley L. Scoping studies: towards a methodological framework. Int J Soc Res Methodol 2005 Feb;8(1):19-32. [CrossRef]
- Levac D, Colquhoun H, O'Brien KK. Scoping studies: advancing the methodology. Implement Sci 2010 Sep 20;5:69 [FREE Full text] [CrossRef] [Medline]
- Khalil H, Peters M, Godfrey CM, McInerney P, Soares CB, Parker D. An evidence-based approach to scoping reviews. Worldviews Evid Based Nurs 2016 Apr;13(2):118-123. [CrossRef] [Medline]
- Johnson SG, Potrebny T, Larun L, Ciliska D, Olsen NR. Usability methods and attributes reported in usability studies of mobile apps for health care education: protocol for a scoping review. JMIR Res Protoc 2020 Aug 04;9(8):e19072 [FREE Full text] [CrossRef] [Medline]
- Tricco AC, Lillie E, Zarin W, O'Brien KK, Colquhoun H, Levac D, et al. PRISMA extension for scoping reviews (PRISMA-ScR): checklist and explanation. Ann Intern Med 2018 Oct 02;169(7):467-473 [FREE Full text] [CrossRef] [Medline]
- Ouzzani M, Hammady H, Fedorowicz Z, Elmagarmid A. Rayyan-a web and mobile app for systematic reviews. Syst Rev 2016 Dec 05;5(1):210 [FREE Full text] [CrossRef] [Medline]
- The EndNote Team. EndNote. X9.2. Clarivate. Philadelphia, PA; 2013. URL: https://endnote.com/ [accessed 2022-02-15]
- Aebersold M, Voepel-Lewis T, Cherara L, Weber M, Khouri C, Levine R, et al. Interactive anatomy-augmented virtual simulation training. Clin Simul Nurs 2018 Feb;15:34-41 [FREE Full text] [CrossRef] [Medline]
- Akl EA, Mustafa R, Slomka T, Alawneh A, Vedavalli A, Schünemann HJ. An educational game for teaching clinical practice guidelines to Internal Medicine residents: development, feasibility and acceptability. BMC Med Educ 2008 Nov 18;8:50 [FREE Full text] [CrossRef] [Medline]
- Al-Rawi W, Easterling L, Edwards PC. Development of a mobile device optimized cross platform-compatible oral pathology and radiology spaced repetition system for dental education. J Dent Educ 2015 Apr;79(4):439-447. [Medline]
- Albrecht UV, Noll C, von Jan U. Explore and experience: mobile augmented reality for medical training. Stud Health Technol Inform 2013;192:382-386. [Medline]
- Alencar Neto JB, Araújo RL, Barroso Filho EM, Silva PG, Garrido RJ, Rocha PH, et al. Development and validation of a smartphone application for orthopedic residency education. Rev Bras Educ Med 2020 May;44(4):561-567. [CrossRef]
- Alepis E, Virvou M. Evaluation of mobile authoring and tutoring in medical issues. US China Educ Rev 2010 Jul;7(7):84-92 [FREE Full text]
- Ameri A, Khajouei R, Ameri A, Jahani Y. Acceptance of a mobile-based educational application (LabSafety) by pharmacy students: an application of the UTAUT2 model. Educ Inf Technol 2019 Aug 1;25(1):419-435. [CrossRef]
- Taziki Balajelini F, Najafi Ghezeljeh T. Prehospital trauma management: evaluation of a new designed smartphone application. J Client Centered Nurs Care 2018 Nov 30;4(4):193-202. [CrossRef]
- Barnes J, Duffy A, Hamnett N, McPhail J, Seaton C, Shokrollahi K, et al. The Mersey Burns App: evolving a model of validation. Emerg Med J 2015 Aug;32(8):637-641. [CrossRef] [Medline]
- Busanello FH, da Silveira PF, Liedke GS, Arús NA, Vizzotto MB, Silveira HE, et al. Evaluation of a digital learning object (DLO) to support the learning process in radiographic dental diagnosis. Eur J Dent Educ 2015 Nov;19(4):222-228. [CrossRef] [Medline]
- Cabero-Almenara J, Roig-Vila R. The motivation of technological scenarios in augmented reality (AR): results of different experiments. Appl Sci 2019 Jul 19;9(14):2907-2916. [CrossRef]
- Choi M, Lee HS, Park JH. Usability of academic electronic medical record application for nursing students' clinical practicum. Healthc Inform Res 2015 Jul;21(3):191-195 [FREE Full text] [CrossRef] [Medline]
- Choi M, Lee H, Park JH. Effects of using mobile device-based academic electronic medical records for clinical practicum by undergraduate nursing students: a quasi-experimental study. Nurse Educ Today 2018 Feb;61:112-119. [CrossRef] [Medline]
- Choo CC, Devakaran B, Chew PK, Zhang MW. Smartphone application in postgraduate clinical psychology training: trainees' perspectives. Int J Environ Res Public Health 2019 Oct 30;16(21):4206 [FREE Full text] [CrossRef] [Medline]
- Chreiman KM, Prakash PS, Martin ND, Kim PK, Mehta S, McGinnis K, et al. Staying connected: service-specific orientation can be successfully achieved using a mobile application for onboarding care providers. Trauma Surg Acute Care Open 2017;2(1):e000085 [FREE Full text] [CrossRef] [Medline]
- Colucci PG, Kostandy P, Shrauner WR, Arleo E, Fuortes M, Griffin AS, et al. Development and utilization of a Web-based application as a robust radiology teaching tool (radstax) for medical student anatomy teaching. Acad Radiol 2015 Feb;22(2):247-255 [FREE Full text] [CrossRef] [Medline]
- Davids MR, Chikte UM, Halperin ML. Effect of improving the usability of an e-learning resource: a randomized trial. Adv Physiol Educ 2014 Jun;38(2):155-160 [FREE Full text] [CrossRef] [Medline]
- Demmans Epp C, Horne J, Scolieri BB, Kane I, Bowser AS. PsychOut! a mobile app to support mental status assessment training. In: Proceedings of the 13th European Conference on Technology Enhanced Learning. 2018 Presented at: EC-TEL '18; September 3-5, 2018; Leeds, UK p. 216-230. [CrossRef]
- Devraj R, Colyott L, Cain J. Design and evaluation of a mobile serious game application to supplement instruction. Curr Pharm Teach Learn 2021 Sep;13(9):1228-1235. [CrossRef] [Medline]
- Díaz-Fernández Á, Jiménez-Delgado JJ, Osuna-Pérez MC, Rueda-Ruiz A, Paulano-Godino F. Development and implementation of a mobile application to improve university teaching of electrotherapy. In: Proceedings of the 2016 International Conference on Interactive Mobile Communication, Technologies and Learning. 2016 Presented at: IMCL '16; October 17-19, 2016; San Diego, CA, USA p. 33-37. [CrossRef]
- Docking RE, Lane M, Schofield PA. Usability testing of the iPhone app to improve pain assessment for older adults with cognitive impairment (prehospital setting): a qualitative study. Pain Med 2018 Jun 01;19(6):1121-1131. [CrossRef] [Medline]
- Dodson CH, Baker E. Focus group testing of a mobile app for pharmacogenetic-guided dosing. J Am Assoc Nurse Pract 2020 Feb 04;33(3):205-210. [CrossRef] [Medline]
- Duarte Filho NF, Gonçalves CF, Pizetta DC. Experimental analysis of the efficiency of application E-Mages in medical imaging visualization. In: Proceedings of the 9th Iberian Conference on Information Systems and Technologies. 2014 Presented at: CISTI '14; June 18-21, 2014; Barcelona, Spain p. 1-6. [CrossRef]
- Duggan N, Curran VR, Fairbridge NA, Deacon D, Coombs H, Stringer K, et al. Using mobile technology in assessment of entrustable professional activities in undergraduate medical education. Perspect Med Educ 2021 Dec;10(6):373-377 [FREE Full text] [CrossRef] [Medline]
- Fernández-Lao C, Cantarero-Villanueva I, Galiano-Castillo N, Caro-Morán E, Díaz-Rodríguez L, Arroyo-Morales M. The effectiveness of a mobile application for the development of palpation and ultrasound imaging skills to supplement the traditional learning of physiotherapy students. BMC Med Educ 2016 Oct 19;16(1):274 [FREE Full text] [CrossRef] [Medline]
- Fralick M, Haj R, Hirpara D, Wong K, Muller M, Matukas L, et al. Can a smartphone app improve medical trainees' knowledge of antibiotics? Int J Med Educ 2017 Nov 30;8:416-420 [FREE Full text] [CrossRef] [Medline]
- Ghafari S, Yazdannik A, Mohamadirizi S. Education promotion based on "mobile technology" in the Critical Care Nursing Department: four-phase intervention. J Educ Health Promot 2020;9:325 [FREE Full text] [CrossRef] [Medline]
- Goldberg H, Klaff J, Spjut A, Milner S. A mobile app for measuring the surface area of a burn in three dimensions: comparison to the Lund and Browder assessment. J Burn Care Res 2014;35(6):480-483. [CrossRef] [Medline]
- Gutiérrez-Puertas L, García-Viola A, Márquez-Hernández VV, Garrido-Molina JM, Granados-Gámez G, Aguilera-Manrique G. Guess it (SVUAL): an app designed to help nursing students acquire and retain knowledge about basic and advanced life support techniques. Nurse Educ Pract 2021 Jan;50:102961. [CrossRef] [Medline]
- Herbert VM, Perry RJ, LeBlanc CA, Haase KN, Corey RR, Giudice NA, et al. Developing a smartphone app with augmented reality to support virtual learning of nursing students on heart failure. Clin Simul Nurs 2021 May;54:77-85. [CrossRef]
- Hsu LL, Hsiang HC, Tseng YH, Huang SY, Hsieh SI. Nursing students' experiences of using a smart phone application for a physical assessment course: a qualitative study. Jpn J Nurs Sci 2019 Apr;16(2):115-124. [CrossRef] [Medline]
- Huang HM, Chen YL, Chen KY. Investigation of three-dimensional human anatomy applied in mobile learning. In: Proceedings of the 2010 International Computer Symposium. 2010 Presented at: ICS '10; December 16-18, 2010; Tainan, Taiwan p. 358-363. [CrossRef]
- Hughes JK, Kearney P. Impact of an iDevice application on student learning in an occupational therapy kinesiology course. Mhealth 2017;3:43 [FREE Full text] [CrossRef] [Medline]
- Ismail SN, Rangga JU, Rasdi I, Rahman UR, Samah MA. Mobile apps application to improve safety and health knowledge, attitude and practice among university students. Malays J Med Health Sci 2018;14:47-55.
- Johnson SG, Titlestad KB, Larun L, Ciliska D, Olsen NR. Experiences with using a mobile application for learning evidence-based practice in health and social care education: an interpretive descriptive study. PLoS One 2021;16(7):e0254272 [FREE Full text] [CrossRef] [Medline]
- Kang J, Suh EE. Development and evaluation of "chronic illness care smartphone apps" on nursing students' knowledge, self-efficacy, and learning experience. Comput Inform Nurs 2018 Nov;36(11):550-559. [CrossRef] [Medline]
- Keegan RD, Oliver MC, Stanfill TJ, Stevens KV, Brown GR, Ebinger M, et al. Use of a mobile device simulation as a preclass active learning exercise. J Nurs Educ 2016 Jan;55(1):56-59. [CrossRef] [Medline]
- Kim-Berman H, Karl E, Sherbel J, Sytek L, Ramaswamy V. Validity and user experience in an augmented reality virtual tooth identification test. J Dent Educ 2019 Nov;83(11):1345-1352. [CrossRef] [Medline]
- Kojima S, Mitani M, Ishikawa A. Development of an E-learning resource on mobile devices for kinesiology: a pilot study. J Phys Ther Sci 2011;23(4):667-672. [CrossRef]
- Scott KM, Kitching S, Burn D, Koulias M, Campbell D, Phelps M. "Wherever, whenever" learning in medicine: interactive mobile case-based project. In: Proceedings of the 27th Annual conference of the Australian Society for Computers in Tertiary Education. 2010 Presented at: Ascilite '10; December 5-8, 2010; New South Wales, Australia p. 888-890.
- Kow AW, Ang BL, Chong CS, Tan WB, Menon KR. Innovative patient safety curriculum using iPAD game (PASSED) improved patient safety concepts in undergraduate medical students. World J Surg 2016 Nov;40(11):2571-2580. [CrossRef] [Medline]
- Kurniawan MH, Suharjito, Diana, Witjaksono G. Human anatomy learning systems using augmented reality on mobile application. Procedia Comput Sci 2018;135:80-88. [CrossRef]
- Lefroy J, Roberts N, Molyneux A, Bartlett M, Gay S, McKinley R. Utility of an app-based system to improve feedback following workplace-based assessment. Int J Med Educ 2017 May 31;8:207-216 [FREE Full text] [CrossRef] [Medline]
- Li YJ, Lee LH, Cheng YT, Ou YY. Design and evaluation of a healthcare management terminology mobile learning application. In: Proceedings of the 2019 IEEE International Conference on Healthcare Informatics. 2019 Presented at: ICHI '19; June 10-13, 2019; Xi'an, China p. 1-9. [CrossRef]
- Lin YT, Lin YC. Effects of mental process integrated nursing training using mobile device on students’ cognitive load, learning attitudes, acceptance, and achievements. Comput Human Behav 2016 Feb;55(B):1213-1221 [FREE Full text] [CrossRef]
- Lone M, Vagg T, Theocharopoulos A, Cryan JF, Mckenna JP, Downer EJ, et al. Development and assessment of a three-dimensional tooth morphology quiz for dental students. Anat Sci Educ 2019 May;12(3):284-299. [CrossRef] [Medline]
- Long JD, Gannaway P, Ford C, Doumit R, Zeeni N, Sukkarieh-Haraty O, et al. Effectiveness of a technology-based intervention to teach evidence-based practice: the EBR tool. Worldviews Evid Based Nurs 2016 Feb;13(1):59-65. [CrossRef] [Medline]
- Longmuir KJ. Interactive computer-assisted instruction in acid-base physiology for mobile computer platforms. Adv Physiol Educ 2014 Mar;38(1):34-41 [FREE Full text] [CrossRef] [Medline]
- López MM, López MM, de la Torre Díez I, Jimeno JC, López-Coronado M. A mobile decision support system for red eye diseases diagnosis: experience with medical students. J Med Syst 2016 Jun;40(6):151. [CrossRef] [Medline]
- Lozano-Lozano M, Galiano-Castillo N, Fernández-Lao C, Postigo-Martin P, Álvarez-Salvago F, Arroyo-Morales M, et al. The Ecofisio mobile app for assessment and diagnosis using ultrasound imaging for undergraduate health science students: multicenter randomized controlled trial. J Med Internet Res 2020 Mar 10;22(3):e16258 [FREE Full text] [CrossRef] [Medline]
- Lucas C, Gibson A, Shum SB. Pharmacy students' utilization of an online tool for immediate formative feedback on reflective writing tasks. Am J Pharm Educ 2019 Aug;83(6):6800 [FREE Full text] [CrossRef] [Medline]
- Mathew D, Archer N, McKibbon KA, Dillenburg R. Integrating clinical practice guidelines within a mobile tablet app. In: Proceedings of the 2014 International Conference on E-Commerce. 2014 Presented at: EC '14; July 17-19, 2014; Lisbon, Portugal p. 357-361.
- McClure KC. Usability of a mobile website focused on preoperative and intraoperative anesthetic considerations for the cardiac patient with valvular dysfunction. Franciscan Missionaries of Our Lady University, Baton Rouge, LA, USA: ProQuest Dissertations Publishing; Dec 2019.
- McDonald H, Gawad N, Raîche I, Rubens FD. Transition to residency: the successful development and implementation of a nonclinical elective in perioperative management. J Surg Educ 2018;75(3):628-638. [CrossRef] [Medline]
- McLean M, Brazil V, Johnson P. How we "breathed life" into problem-based learning cases using a mobile application. Med Teach 2014 Oct;36(10):849-852. [CrossRef] [Medline]
- McMullan M. Evaluation of a medication calculation mobile app using a cognitive load instructional design. Int J Med Inform 2018 Oct;118:72-77. [CrossRef] [Medline]
- Mendez-Lopez M, Juan MC, Molla R, Fidalgo C. Evaluation of an augmented reality application for learning neuroanatomy in psychology. Anat Sci Educ 2022 May;15(3):535-551. [CrossRef] [Medline]
- Meruvia-Pastor O, Patra P, Andres K, Twomey C, Peña-Castillo L. OMARC: an online multimedia application for training health care providers in the assessment of respiratory conditions. Int J Med Inform 2016 May;89:15-24. [CrossRef] [Medline]
- Mettiäinen S. Electronic assessment and feedback tool in supervision of nursing students during clinical training. Electron J E Learn 2015;13(1):42-55.
- Milner KA, McCloud R, Cullen J. The evidence appraisal game: an innovative strategy for teaching step 3 in evidence-based practice. Worldviews Evid Based Nurs 2020 Apr;17(2):173-175. [CrossRef] [Medline]
- Mladenovic R, Davidovic B, Tusek I, Trickovic-Janjic O, Mladenovic K. The effect of a mobile application for learning about traumatic dental injuries during the COVID-19 pandemic. Srp Arh Celok Lek 2021;149(3-4):202-207. [CrossRef]
- Morris J, Maynard V. Pilot study to test the use of a mobile device in the clinical setting to access evidence-based practice resources. Worldviews Evid Based Nurs 2010 Dec;7(4):205-213. [CrossRef] [Medline]
- Nabhani S, Harrap N, Ishtiaq S, Ling V, Dudzinski M, Greenhill D, et al. Development and evaluation of an educational game to support pharmacy students. Curr Pharm Teach Learn 2020 Jul;12(7):786-803. [CrossRef] [Medline]
- Noguera JM, Jiménez JJ, Osuna-Pérez MC. Development and evaluation of a 3D mobile application for learning manual therapy in the physiotherapy laboratory. Comput Educ 2013 Nov;69(1):96-108. [CrossRef]
- O'Connell E, Pegler J, Lehane E, Livingstone V, McCarthy N, Sahm LJ, et al. Near field communications technology and the potential to reduce medication errors through multidisciplinary application. Mhealth 2016;2:29 [FREE Full text] [CrossRef] [Medline]
- Oliveira EY, Crosewski NI, Silva AL, Ribeiro CT, de Oliveira CM, Fogaça RT, et al. Profile of educational technology use by medical students and evaluation of a new mobile application designed for the study of human physiology. J Med Syst 2019 Aug 27;43(10):313. [CrossRef] [Medline]
- Orjuela MA, Uribe-Quevedo A, Jaimes N, Perez-Gutierrez B. External automatic defibrillator game-based learning app. In: Proceedings of the 2015 IEEE Games Entertainment Media Conference. 2015 Presented at: GEM '15; October 14-16, 2015; Toronto, Canada p. 1-4. [CrossRef]
- Page CP, Reid A, Coe CL, Carlough M, Rosenbaum D, Beste J, et al. Learnings from the pilot implementation of mobile medical milestones application. J Grad Med Educ 2016 Oct;8(4):569-575 [FREE Full text] [CrossRef] [Medline]
- Paradis M, Stiell I, Atkinson KM, Guerinet J, Sequeira Y, Salter L, et al. Acceptability of a mobile clinical decision tool among emergency department clinicians: development and evaluation of the Ottawa rules app. JMIR Mhealth Uhealth 2018 Jun 11;6(6):e10263 [FREE Full text] [CrossRef] [Medline]
- Victor Soares Pereira R, Kubrusly M, Marçal E. [Development, use, and evaluation of a mobile application for medical education: a case study in anesthesiology]. RENOTE 2017 Jul 28;15(1):75104. [CrossRef]
- Pereira FG, Rocha D, Melo GA, Jaques RM, Formiga LM. Building and validating a digital application for the teaching of surgical instrumentation. Cogitare Enferm 2019 Mar 11;24:e58334. [CrossRef]
- Pinto VC, da Costa TM, Naveira MC, Sigulem D, Schor P, Pisa IT. MDFluxo: ophthalmology education with a PDA: efficacy and usability evaluation. In: Proceedings of the 1st International Conference on Health Informatics. 2008 Presented at: HEALTHINF '08; January 28-31, 2008; Madeira, Portugal p. 227-230. [CrossRef]
- Quattromani E, Hassler M, Rogers N, Fitzgerald J, Buchanan P. Smart pump app for infusion pump training. Clin Simul Nurs 2018 Apr 1;17:28-37. [CrossRef]
- Robertson AC, Fowler LC. Medical student perceptions of learner-initiated feedback using a mobile Web application. J Med Educ Curric Dev 2017;4:2382120517746384 [FREE Full text] [CrossRef] [Medline]
- Roa Romero Y, Tame H, Holzhausen Y, Petzold M, Wyszynski JV, Peters H, et al. Design and usability testing of an in-house developed performance feedback tool for medical students. BMC Med Educ 2021 Jun 23;21(1):354 [FREE Full text] [CrossRef] [Medline]
- Salem S, Cooper J, Schneider J, Croft H, Munro I. Student acceptance of using augmented reality applications for learning in pharmacy: a pilot study. Pharmacy (Basel) 2020 Jul 21;8(3):122 [FREE Full text] [CrossRef] [Medline]
- San Martín-Rodríguez L, Escalada-Hernández P, Soto-Ruiz N. A themed game to learn about nursing theories and models: a descriptive study. Nurse Educ Pract 2020 Nov;49:102905. [CrossRef] [Medline]
- Schnepp JC, Rogers CB. Evaluating the acceptability and usability of EASEL: a mobile application that supports guided reflection for experiential learning activities. J Inf Technol Educ Innov Pract 2017;16:195-214. [CrossRef]
- Smith N, Rapley T, Jandial S, English C, Davies B, Wyllie R, et al. Paediatric musculoskeletal matters (pmm): collaborative development of an online evidence based interactive learning tool and information resource for education in paediatric musculoskeletal medicine. Pediatr Rheumatol Online J 2016 Jan 05;14(1):1 [FREE Full text] [CrossRef] [Medline]
- Strandell-Laine C, Leino-Kilpi H, Löyttyniemi E, Salminen L, Stolt M, Suomi R, et al. A process evaluation of a mobile cooperation intervention: a mixed methods study. Nurse Educ Today 2019 Sep;80:1-8. [CrossRef] [Medline]
- Strayer SM, Pelletier SL, Martindale JR, Rais S, Powell J, Schorling JB. A PDA-based counseling tool for improving medical student smoking cessation counseling. Fam Med 2010 May;42(5):350-357 [FREE Full text] [Medline]
- Taylor JD, Dearnley CA, Laxton JC, Coates CA, Treasure-Jones T, Campbell R, et al. Developing a mobile learning solution for health and social care practice. Distance Educ 2010 Jul 30;31(2):175-192. [CrossRef]
- Yap KY, Toh TW, Chui WK. Development of a virtual patient record mobile app for pharmacy practice education. Arch Pharma Pract 2014;5(2):66-71. [CrossRef]
- Tsopra R, Courtine M, Sedki K, Eap D, Cabal M, Cohen S, et al. AntibioGame®: a serious game for teaching medical students about antibiotic use. Int J Med Inform 2020 Apr;136:104074 [FREE Full text] [CrossRef] [Medline]
- Wu TT. The use of a mobile assistant learning system for health education based on project-based learning. Comput Inform Nurs 2014 Oct;32(10):497-503. [CrossRef] [Medline]
- Wyatt TH, Li X, Indranoi C, Bell M. Developing iCare v.1.0: an academic electronic health record. Comput Inform Nurs 2012 Jun;30(6):321-329. [CrossRef] [Medline]
- Yap KY. Usefulness of the mobile interactive pharmacy education enhancement resource (miPEER) mobile Web-app as a learning tool for electronic health records. Int J Clin Skills 2017;11(6):1-7. [CrossRef]
- Zhang MW, Cheok CC, Ho RC. Global outreach of a locally-developed mobile phone app for undergraduate psychiatry education. JMIR Med Educ 2015 Jun 08;1(1):e3 [FREE Full text] [CrossRef] [Medline]
- Brooke J. SUS: a retrospective. J Usability Stud 2013;8(2):29-40 [FREE Full text] [CrossRef]
- Bastien JM. Usability testing: a review of some methodological and technical aspects of the method. Int J Med Inform 2010 Apr;79(4):e18-e23. [CrossRef] [Medline]
Abbreviations
PRISMA-ScR: Preferred Reporting Items for Systematic Reviews and Meta-Analyses extension for Scoping Reviews
SUS: System Usability Scale
Edited by T Leung; submitted 25.03.22; peer-reviewed by L Rutter, H Mehdizadeh, L Gutierrez-Puertas; comments to author 12.05.22; revised version received 02.06.22; accepted 05.06.22; published 29.06.22
Copyright © Susanne Grødem Johnson, Thomas Potrebny, Lillebeth Larun, Donna Ciliska, Nina Rydland Olsen. Originally published in JMIR Medical Education (https://mededu.jmir.org), 29.06.2022.
This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Medical Education, is properly cited. The complete bibliographic information, a link to the original publication on https://mededu.jmir.org/, as well as this copyright and license information must be included.