Original Paper
Abstract
Background: Although digital health is essential for improving health care, its adoption remains slow due to the lack of literacy in this area. Therefore, it is crucial for health professionals to acquire digital skills and for a digital competence assessment and accreditation model to be implemented to make advances in this field.
Objective: This study had two objectives: (1) to create a specific map of digital competences for health professionals and (2) to define and test a digital competence assessment and accreditation model for health professionals.
Methods: We took an iterative mixed methods approach, which included a review of the gray literature and consultation with local experts. We used the arithmetic mean and SD in descriptive statistics, P values in hypothesis testing and subgroup comparisons, the greatest lower bound in test diagnosis, and the discrimination index in study instrument analysis.
Results: The assessment model designed in accordance with the competence content defined in the map of digital competences and based on scenarios had excellent internal consistency overall (greatest lower bound=0.91). Although most study participants (110/122, 90.2%) reported an intermediate self-perceived digital competence level, we found that the vast majority would not attain a level-2 Accreditation of Competence in Information and Communication Technologies.
Conclusions: Knowing the digital competence level of health professionals based on a defined competence framework should enable such professionals to be trained and updated to meet real needs in their specific professional contexts and, consequently, take full advantage of the potential of digital technologies. These results have informed the Health Plan for Catalonia 2021-2025, thus laying the foundations for creating and offering specific training to assess and certify the digital competence of such professionals.
doi:10.2196/53462
Keywords
Introduction
Background
The recent COVID-19 pandemic has highlighted the importance and potential of digital health in optimizing the quality, efficiency, and safety of health care [
- ]. Despite this, the adoption of digital tools and technologies in this field has been slow [ , ], and their full implementation in clinical practice has yet to occur [ ]. Research has pointed to several factors as potential barriers, including technology, infrastructure, and financial resources [ , ]. However, it is the lack of digital health literacy that most commonly obstructs the implementation of digital health services [ ]. Health professionals have been identified as a key factor in the digital transformation of health care [ ]. Accordingly, they must be equipped with digital health competences, ranging from basic skills (eg, computer literacy) to more complex ones such as the ability to teach patients how to use technology and digital data sources safely and appropriately. Beyond informing patients about the availability and potential benefits of these technologies, physicians guide them on integrating these tools into their health care routines, playing a pivotal role in this process. For instance, they can guide patients on using portals for test results or mobile apps for medication adherence. However, for in-depth training on specific technologies, other professionals such as nurses or technical support staff may be better suited [ , , , ].In 2016, a survey of 200 health professionals conducted by the European Health Parliament’s Digital Skills for Health Professionals (COMPDIG-Salut) Committee found that, in most cases, health professionals felt that they lacked the appropriate skills to cope with the digital revolution in their professional practice [
, ]. Today, there is still a need for accessible, structured, and comprehensive education that will enable future health professionals to make the best use of technology and harness its full potential in terms of quality of care [ , , ]. Health professionals need to develop digital health competences to keep up with new technologies and ensure that they can provide high-quality patient care [
, , , , ]. To this end, we must first map the specific digital competences needed in health care (to provide the right kind of digital education) [ ] and then create a model for assessing and accrediting such competences. While there is a growing number of individual digital health competence frameworks and reviews that focus on specific health care professions or settings [ ], there is a lack of standardization in the definition of digital health competences themselves, including discrepancies and overlap among available frameworks and their approach to categorization [ ]. This implies a need to continually update the competences required in this field as well as the methods used to assess them [ ]. The Professional Dialogue Forum of Catalonia (northeastern Spain), one of the most advanced regions in Europe in the use of digital health technologies [
- ], highlighted "the need to improve information and communication technology (ICT) competences to advance in the use of ICT and the design of telehealth services" as one of the 17 primary current and future challenges facing health professionals [ ]. The COMPDIG-Salut project was launched in response to the identified needs, focusing on three aims: (1) defining a specific digital competence framework for health professionals, (2) creating a specific assessment and accreditation model for health professionals, and (3) designing actions to train and qualify health professionals in digital competences. Having determined the current digital competence level among Catalan health professionals [ ], it is time to work toward achieving the aims of the COMPDIG-Salut project [ , ].
Objectives
This study had two objectives: (1) to create a specific map of digital competences for health professionals and (2) to define and test a digital competence assessment and accreditation model for health professionals.
Methods
Study Design
The research presented in this paper is the result of collaboration between the TIC Salut Social Foundation (Information and Communication Technologies in Health and Social Care Foundation) and the Universitat Oberta de Catalunya in Spain. As this was an observational exploratory study focusing on the analysis of digital competences in health care, a mixed qualitative and quantitative methodology was used following an iterative approach, and questionnaires were designed for data collection.
Specific Map of Digital Competences for Health Professionals
Narrative Review
A narrative review [
, ] was conducted to explore a broad range of existing digital competence frameworks in the field of health care and identify commonalities and strengths among the specific digital competences they included. We used the Google Scholar database to perform an iterative search for relevant frameworks based on a combination of search terms or keywords and appropriate Boolean operators. Specifically, we searched for "digital competence framework" OR "digital capabilities framework" and "health professionals" within a search period spanning January 1, 2017, to October 13, 2021. The inclusion of potential frameworks of interest was based on the research team's knowledge and expertise on the topic. Only publications in English and Spanish were considered.
Inclusion and Exclusion Criteria
Deciding which frameworks to include in our review required careful and deliberate consideration to avoid bias and ensure valid results. To this end, we established explicit inclusion and exclusion criteria to select complete frameworks (eg, they needed to comprise digital competences specific to health care). Expert synthesis, discussion, and agreement among ≥2 reviewers were required to select frameworks for inclusion in the narrative review and ensure a consistent selection process.
To reduce selection bias and facilitate comparisons between frameworks, we ascertained the specific actions in thematic areas, the number of defined competences, the levels of achievement contained in them, and whether they distinguished between health professions. We also looked for similarities between them and the European Digital Competence Framework for Citizens (DigComp) [
]. From the 32 results of the initial search, 9 (28%) frameworks were selected based on our inclusion and exclusion criteria. Of these 9 frameworks, 6 (67%) were selected for full-text analysis [ - ]. In total, 2 additional reference frameworks were included because of their relevance to mapping the digital competences of health professionals in Catalonia. These were the framework of digital competences for health professionals developed by the working group of challenge 4 of the Professional Dialogue Forum [ ] and the Accreditation of Competence in Information and Communication Technologies (ACTIC) [ ], the government of Catalonia's framework for digital competence accreditation for citizens, which is currently in line with DigComp [ ]. The latter was included because our proposed framework is closely related to it ( ). The overview flowchart is shown in .
Development of the Digital Competence Framework for Health Professionals and Area Specification
To develop the Digital Competence Framework for Health Professionals, we needed to specify the digital competences required of health professionals to manage digital health effectively, critically, and responsibly. Among the various initiatives led by the European Commission to improve people’s digital literacy is DigComp, which is “an umbrella or meta-framework for current frameworks, initiatives, curricula, and certifications” [
]. Using DigComp as a reference, we mapped the 6 selected frameworks, projects, and studies to establish relationships and connections between the identified keywords and the most relevant competences. We ordered the competences by similarity to reveal thematic areas and common content ( ). Axial coding was then applied to this content distribution to define areas, competences, and indicators [
]. Coding was performed using the ATLAS.ti software (version 22; ATLAS.ti Scientific Software Development GmbH; ). These areas and competences were validated by a panel of 12 experts using a web-based questionnaire developed by the researchers. Considering the validation criteria (wording, consistency, applicability, and relevance), the experts assessed the clarity and precision of the labeling and the description of the competence areas, validated them or suggested changes, and answered open-ended questions on each item. The experts' feedback was used to refine some of the definitions (
). Indicators were then defined for each of the competences to determine which aspects should be assessed for all health professionals. Indicators are characteristics that can be observed through specific tests, using either predefined measures or other qualitative information. Finally, 21 professionals of various profiles from the working group of challenge 4 of the Professional Dialogue Forum validated the framework of competence areas and indicators through a web-based questionnaire (
).
Digital Competence Assessment and Accreditation Model for Health Professionals
Test Creation and Administration
Having developed the Digital Competence Framework for Health Professionals, the next step was to create an assessment and accreditation model. Given the variation in health professionals’ roles, we could not reduce this process to a one-size-fits-all test. So, to ensure that it would be relevant to each health professional’s activities and duties, 4 professional profiles were defined to account for most cases (
).
Profile code | Profile | Description |
P1 | Direct patient care | Professionals who spend >70% of their workday providing direct patient care or services (eg, physicians, nurses, occupational therapists, speech therapists, optometrists, opticians, dental hygienists, and pharmacists) |
P2 | Indirect patient care | Professionals who spend >70% of their workday providing health care support services (eg, physicians working in biological diagnostic and pathological anatomy services, specialist biologists, specialist physicists, specialist chemists, pharmacists, and dental prosthetists) |
P3 | Innovation, research, and teaching | Professionals who spend >70% of their workday providing innovation, research, or teaching services (eg, researchers and innovation specialists) |
P4 | Management | Professionals who spend >70% of their workday managing centers, organizations, departments, services, or teams (eg, executives and middle managers) |
The test questions in our assessment and accreditation model had to be linked to definitions of observable behaviors that could be put into practice in different professional settings within the Catalan health care environment. Observable behaviors are understood as practices or actions performed by health professionals as part of their work (eg, finding clinical information in databases, communicating and collaborating remotely with teams or patients, using information management tools, and creating content). As the point of the assessment was to determine respondents’ level of digital competence, we considered it appropriate to use assessment scenarios that would present them with challenges that they would need to overcome. Their attempts to deal with situations similar to those in the real world and provide the best possible digital response to the proposed challenges would provide greater insights into their competence level for each indicator. It would also allow respondents to put into practice other skills, such as problem-solving, critical thinking, and the analysis and responsible use of information and communications technology [
]. The test for the 4 professional profiles included a set of 2 cases (scenarios) with a total of 28 questions to be answered within 60 minutes. The test was contextualized for each of the 4 profiles: P1, P2, P3, and P4. The maximum score was 30 points (26 questions counted for a maximum of 1 point each, and 2 questions counted for a maximum of 2 points each). Wrong answers on the multiple-choice questions were scored negatively (−0.2 points). A minimum score of 70% (21/30) was required to pass the test. Participants who scored <21 points were categorized as "suggested level not achieved," and those who scored between 21 and 30 points were categorized as "suggested level achieved."
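As an illustration of this scoring rule, the following R sketch applies the stated weights and thresholds to a hypothetical answer vector; the treatment of blank answers as 0 points and the exact mix of multiple-choice items are assumptions, as the paper does not specify them.

```r
# Sketch of the stated scoring rule (hypothetical data): 26 one-point items,
# 2 two-point items, -0.2 for a wrong multiple-choice answer, pass at >=21/30.
# Blank answers are assumed to score 0, which the paper does not state explicitly.

max_points <- c(rep(1, 26), rep(2, 2))

score_test <- function(correct) {
  # correct: logical vector of length 28 (TRUE = right, FALSE = wrong, NA = blank)
  item_scores <- ifelse(is.na(correct), 0, ifelse(correct, max_points, -0.2))
  total <- sum(item_scores)
  list(score = total,
       result = if (total >= 21) "suggested level achieved" else "suggested level not achieved")
}

# Example: 20 one-point items right, 4 wrong, 2 blank, both two-point items right
answers <- c(rep(TRUE, 20), rep(FALSE, 4), NA, NA, TRUE, TRUE)
score_test(answers)   # 20 - 0.8 + 4 = 23.2 points -> suggested level achieved
```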
Implementation and Analysis of the Assessment and Accreditation Model
To analyze the proposed assessment and accreditation model for our competence framework and identify areas for improvement (eg, time allotted, number of test questions, types of test questions, professional profiles, real-life situational approach by profile, assessment scenarios, and suitability of the cases and challenges presented) and validate the proposed level test (ie, to determine whether the proposed level was appropriate), we conducted a pilot study involving a web-based test with legally recognized health professionals [
] and health care social workers employed in Catalonia who reported an intermediate or advanced self-perceived digital competence level. The web-based test consisted of 3 activities: a level test based on the exam required to obtain the ACTIC 2—intermediate level certificate [
] (activity 1); the test developed for our proposed assessment and accreditation model, as described in the previous section (activity 2); and a feedback questionnaire to understand participants' opinions on the proposed assessment and accreditation model and other aspects of the pilot test (activity 3). The estimated time to complete the 3 activities was 90 minutes. The activities were to be done consecutively and in the specified order. The ACTIC 2—intermediate level test (activity 1) was used to determine the participants' baseline digital competence level and compare their scores with the results of the proposed assessment and accreditation test. Participants were categorized into 1 of the following 3 groups based on their scores: beginner (0-9.9), basic (10-24.9), and intermediate (25-35).
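A minimal R sketch of this baseline banding is shown below; the handling of scores that fall exactly on a boundary is assumed from the stated ranges.

```r
# Band an ACTIC 2-intermediate level test score (0-35 points) into the three
# categories stated above: beginner (0-9.9), basic (10-24.9), intermediate (25-35)
actic_band <- function(score) {
  cut(score,
      breaks = c(0, 10, 25, 35),
      labels = c("beginner", "basic", "intermediate"),
      right = FALSE, include.lowest = TRUE)
}

actic_band(c(9.5, 24.3, 25, 35))   # beginner, basic, intermediate, intermediate
```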
In relation to the test developed for our proposed assessment and accreditation model (activity 2),
shows all the variables involved in the pilot study.
Descriptor | Collection method |
Name and surname | Forms (Microsoft Corp) questionnaire |
Email address | Microsoft Forms questionnaire |
Health profession | Microsoft Forms questionnaire |
Self-perceived digital competence level | Microsoft Forms questionnaire |
Experience related to digital competence training | Microsoft Forms questionnaire |
Official certification in digital competences | Microsoft Forms questionnaire |
Professional profile (P1, P2, P3, or P4) | Microsoft Forms questionnaire |
Score achieved in activity 1 | Moodle (Moodle HQ) questionnaire |
Score achieved in activity 2 | Moodle (Moodle HQ) questionnaire |
Feedback questionnaire | Moodle (Moodle HQ) questionnaire |
The feedback questionnaire (activity 3) consisted of 9 questions, 3 (33%) of which were open-ended and 1 (11%) of which asked for the participants’ self-perceived level in each of the digital competences defined for health professionals.
Data Collection
To focus the scope of the study, the Catalan Ministry of Health asked relevant professional associations to invite members whom they felt could meet the study’s inclusion criteria to volunteer as participants. Volunteer participants were recruited using a Microsoft Forms (Microsoft Corp) questionnaire (
). After they were informed of the purpose of the study, their personal and professional details were collected. If they met the inclusion criteria, they were enrolled in the study and given credentials to access Moodle (Moodle HQ) for the web-based test. Although the study was scheduled to remain open for 30 days, from March 3 to 31, 2022, it was ultimately extended to April 14, 2022, to increase the response rate. During this period, 2 emails were sent to all candidates to remind them of the study's end date (or to inform them of the extension) and the remaining activities to be completed.
After this period, the test results were reported in accordance with “Good practice in the conduct and reporting of survey research” [
] and the General Data Protection Regulation, where applicable.
Statistical Analysis
When designing the study, we calculated the minimum sample size to ensure significant results with a 10% error rate, the maximum allowed in research studies [
]. On the basis of the results of an exploratory study [ ] and the latest available report on the population of health professionals in Catalonia [ ], the minimum sample size for the study was set at 168. Descriptive statistics were performed for professionals who had correctly completed activities 1 and 2, with results presented as absolute and relative frequencies. Arithmetic means and SDs were used for comparative analysis of subgroups according to sociodemographic and professional characteristics, and P values were used for hypothesis testing.
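The sample size calculation itself is not reported in detail. As an illustration only, the R sketch below shows one common approach (estimating a proportion with a finite-population correction); the confidence level, expected proportion, and population size are assumptions, and the result does not reproduce the reported minimum of 168.

```r
# Illustrative only: the paper does not report the formula or parameters behind
# the minimum sample size of 168, so the confidence level, expected proportion,
# and population size below are assumptions and do not reproduce that figure.
min_sample_size <- function(N, margin = 0.10, conf = 0.95, p = 0.5) {
  z  <- qnorm(1 - (1 - conf) / 2)
  n0 <- z^2 * p * (1 - p) / margin^2       # infinite-population estimate
  ceiling(n0 / (1 + (n0 - 1) / N))         # finite-population correction
}

min_sample_size(N = 100000)   # ~96 with these illustrative parameters
```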
The reliability of activity 2 was analyzed by measuring the consistency of its items. In addition, the level of discrimination of the items in relation to the advanced level was evaluated. The arithmetic mean and SD were used in descriptive statistics; P values, Bonferroni-adjusted P values, and Holm-adjusted P values were used in hypothesis testing and subgroup comparisons; and the greatest lower bound (GLB) was chosen to diagnose the test given the lack of homogeneity of the scoring scale, as was done in the exploratory study [
]. The study instrument was analyzed using the participants' scores and the discrimination index (DI) [
]. The DI measures how well an item discriminates between high-scoring participants (ie, those with strong digital competences) and low-scoring participants in activity 2. DI values between 0 and 0.2 indicate that the item is not discriminating, and negative values imply an inverse relationship between the score on that item and the total score. Finally, for activity 3, the numerical variables were presented as arithmetic means and SDs, whereas the categorical variables were presented as absolute and relative frequencies.
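The R sketch below illustrates, under stated assumptions, how the GLB and the DI can be computed: the item score matrix is simulated, glb.fa() from the psych package is used as one available GLB estimator (the paper does not name the function used), and the DI contrasts the top and bottom 27% of total scorers, a common convention that the paper does not specify.

```r
# Sketch of the reliability and item-analysis steps described above, using
# simulated item scores; the actual analysis pipeline is not reproduced here.
library(psych)

# Hypothetical binary item scores for 122 participants and 28 items,
# generated from a simple one-dimensional ability model
set.seed(42)
ability    <- rnorm(122)
difficulty <- rnorm(28)
items <- sapply(difficulty, function(d) rbinom(122, 1, plogis(ability - d)))

# Greatest lower bound (GLB) of reliability; glb.fa() is one estimator
# offered by the psych package (a factor-analytic approximation)
glb_est <- glb.fa(cor(items))$glb

# Discrimination index (DI): proportion correct among the top 27% of total
# scorers minus the proportion correct among the bottom 27%, per item
discrimination_index <- function(items) {
  total <- rowSums(items, na.rm = TRUE)
  hi <- items[total >= quantile(total, 0.73), , drop = FALSE]
  lo <- items[total <= quantile(total, 0.27), , drop = FALSE]
  colMeans(hi, na.rm = TRUE) - colMeans(lo, na.rm = TRUE)
}

di <- discrimination_index(items)
sum(di >= 0.2)   # number of items that discriminate (DI >= 0.2)
sum(di < 0)      # number of items with an inverse relationship (negative DI)
```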
All responses were analyzed using the R statistical software (version 4.2.0; R Foundation for Statistical Computing). Responses to the open-ended questions (questions 5 and 7) in the feedback questionnaire (activity 3) were analyzed using quantitative content analysis to group them into limited categories [
].
Ethical Considerations
No ethics approval was required due to the type and nature of the study, as the Catalan Department of Health is responsible for formulating the general criteria for health planning and for setting the objectives and levels to be achieved in the topics included in the Health Plan for Catalonia [
]. All participants were informed about the study's purposes and that their participation was voluntary. Participants were also informed about how their data would be processed and had to give their consent before accessing the survey.
Results
Specific Map of Digital Competences for Health Professionals
shows the specific map of digital competences for health professionals validated by the 21 individuals (of different profiles) in the working group of challenge 4 of the Professional Dialogue Forum.
Implementation and Analysis of the Digital Competence Assessment and Accreditation Model for Health Professionals
Recruitment
During the study period, we recorded a total of 398 visits to the Microsoft Forms recruitment questionnaire. Of the 398 visitors, 377 (94.7%) agreed to participate and gave their consent for the TIC Salut Social Foundation to process their personal data. After excluding participants who did not meet the inclusion criteria and removing repeated registrations, the total recruited sample comprised 372 participants, who were given access to Moodle to start the 3 programmed activities. A total of 49.5% (184/372) of the participants logged into Moodle and started the designed activities. In the end, 176 professionals completed activity 1 correctly, of whom 122 (69.3%) also completed activity 2 correctly.
For the purposes of this study, we considered a result valid when the test was completed with no more than 3 consecutive unanswered questions.
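A small R sketch of this validity rule, as interpreted here (no run of more than 3 consecutive unanswered questions in a completed test), follows; the answer vectors are hypothetical.

```r
# Keep a completed attempt only if it contains no run of more than
# 3 consecutive blank (NA) answers
is_valid_attempt <- function(answers, max_consecutive_blanks = 3) {
  runs <- rle(is.na(answers))
  blank_runs <- runs$lengths[runs$values]
  length(blank_runs) == 0 || max(blank_runs) <= max_consecutive_blanks
}

is_valid_attempt(c(1, NA, NA, NA, 1, 0, 1))    # TRUE: longest blank run is 3
is_valid_attempt(c(1, NA, NA, NA, NA, 0, 1))   # FALSE: blank run of 4
```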
Sample Description
The sample consisted mainly of health professionals with a direct patient care profile (P1; 95/122, 77.9%), followed by those with a management profile (P4; 13/122, 10.7%); an innovation, research, and teaching profile (P3; 8/122, 6.6%); and an indirect patient care profile (P2; 6/122, 4.9%;
).
Descriptor | Participants, n (%) |
Profile | |||
P1—direct patient care | 95 (77.9) | ||
P2—indirect patient care | 6 (4.9) | ||
P3—innovation, research, and teaching | 8 (6.6) | ||
P4—management | 13 (10.7) | ||
Health profession | |||
Specialist biologist | 1 (0.8) | ||
Dietician or nutritionist | 1 (0.8) | ||
Pharmacist | 21 (17.2) | ||
Specialist physical chemist | 0 (0) | ||
Physical therapist | 17 (13.9) | ||
Dental hygienist | 7 (5.7) | ||
Nurse | 37 (30.3) | ||
Speech therapist | 7 (5.7) | ||
Physician | 4 (3.3) | ||
Dentist | 0 (0) | ||
Optometrist/optician | 6 (4.9) | ||
Podiatrist | 5 (4.1) | ||
Dental prosthetist | 1 (0.8) | ||
Clinical or general psychologist | 4 (3.3) | ||
Occupational therapist | 9 (7.4) | ||
Health care social worker | 2 (1.6) | ||
Self-perceived digital competence level | |||
Intermediate | 110 (90.2) | ||
Advanced | 12 (9.8) | ||
ACTICa certification | |||
No certification | 101 (82.8) | ||
ACTIC 1 | 1 (0.8) | ||
ACTIC 2 | 14 (11.5) | ||
ACTIC 3 | 6 (4.9) | ||
Experience at digital health events | |||
No experience | 107 (87.7) | ||
Speaker or trainer | 11 (9) | ||
Organizer | 2 (1.6) | ||
Both | 2 (1.6) |
aACTIC: Accreditation of Competence in Information and Communication Technologies.
Responses were received from professionals in all health professions, with the exception of specialist physical chemists and dentists. Nurses represented the largest proportion of the sample (37/122, 30.3%), followed by pharmacists (21/122, 17.2%) and physical therapists (17/122, 13.9%). According to the distribution of health care professions in Catalonia, a representative sample of nurses was obtained; however, this was not the case for physicians [
]. The vast majority of the sample (110/122, 90.2%) reported an intermediate self-perceived digital competence level, with only 9.8% (12/122) of participants considering themselves advanced. Overall, 82.8% (101/122) had no ACTIC certification, 11.5% (14/122) had an ACTIC 2 (intermediate) certification, 4.9% (6/122) had an ACTIC 3 (advanced) certification, and 0.8% (1/122) had an ACTIC 1 (basic) certification.
Activity 1: ACTIC 2—Intermediate Level Test (Baseline Level)
The mean total score for activity 1 was 24.3 (SD 4.1) points, which was significantly lower (P=.03) than the cutoff score for “Intermediate” (25 points). Of the 122 participants, 59 (48.4%) scored between 10 and 24.9 points (basic), and 63 (51.6%) scored ≥25 points (intermediate). On average, the sample took 12.2 (SD 3.8) minutes to complete the baseline level test. There were no outliers in the times recorded (
).
Descriptor | Values |
Score, mean (SD) | 24.3a (4.1) | |
Score range (points), n (%) | ||
<10 (beginner) | 0 (0) | |
10-24.9 (basic) | 59 (48.4) | |
≥25 (intermediate) | 63 (51.6) | |
Minutes taken to complete the activity, mean (SD) | 12.2 (3.8) |
aScores significantly below 25 points (P=.03).
The scores followed a normal distribution, with a mode of 25 points. Only 0.8% (1/122) of the participants obtained the maximum possible score on the level test (
). Subgroups were compared to determine whether there were any significant differences in the overall scores. It was found that participants who had experience at digital health events scored significantly higher than those who did not (P=.03).
The scores for each subgroup were also compared to determine which subgroups scored significantly below 25 points in activity 1 (ie, did not reach the intermediate level) and which scored significantly above 25 points (ie, did reach the intermediate level). Several subgroups scored significantly below 25 points, including P1 (direct patient care; P=.04), dental hygienists (P=.01), participants with an intermediate self-perceived level (P=.03), those with no ACTIC certification or ACTIC 1 (P=.02), and those with no experience at digital health events (P=.02). Some subgroups scored significantly above 25 points, particularly the group of participants who reported ACTIC 3 certification (P=.04). P2 (indirect patient care) professionals and professionals with experience at digital health events also scored significantly above 25 points before adjustment, but these differences did not remain significant after correcting for multiple testing using the Bonferroni and Holm methods (
).Subgroup | Overall score, mean (SD) | Group comparisona | Intermediate level check | ||||||||||||||||||||
Unadjusted P value | Bonferroni-adjusted P value | Holm-adjusted P value | <25 pointsb | ≥25 pointsc | |||||||||||||||||||
Unadjusted P value | Bonferroni-adjusted P value | Holm-adjusted P value | Unadjusted P value | Bonferroni-adjusted P value | Holm-adjusted P value | ||||||||||||||||||
Profile | .23 | >.99 | .52 | ||||||||||||||||||||
P1—direct patient care (n=95) | 24.1 (4.0) | .01d | .04 | .04 | .99 | >.99 | >.99 | ||||||||||||||||
P2—indirect patient care (n=6) | 27.6 (2.3) | .98 | >.99 | >.99 | .02 | .08 | .08 | ||||||||||||||||
P3—innovation, research, and teaching (n=8) | 24.7 (4.3) | .42 | >.99 | >.99 | .58 | >.99 | >.99 | ||||||||||||||||
P4—management (n=13) | 24.5 (5.2) | .37 | >.99 | >.99 | .63 | >.99 | >.99 | ||||||||||||||||
Health professione | .60 | >.99 | .60 | ||||||||||||||||||||
Nurse (n=37) | 24.39 (4.17) | .19 | >.99 | >.99 | .81 | >.99 | >.99 | ||||||||||||||||
Pharmacist (n=21) | 25.31 (3.31) | .66 | >.99 | >.99 | .33 | >.99 | >.99 | ||||||||||||||||
Physical therapist (n=17) | 24.24 (5.28) | .28 | >.99 | >.99 | .72 | >.99 | >.99 | ||||||||||||||||
Other (n=44)f | 23.83 (3.93) | .02d | .25 | .23 | .98 | >.99 | >.99 | ||||||||||||||||
Specialist biologist (n=1) | —g | — | — | — | — | — | — | ||||||||||||||||
Dietician or nutritionist (n=1) | — | — | — | — | — | — | — | ||||||||||||||||
Dental hygienist (n=7) | 20.14 (2.56) | .001h | .01 | .01 | >.99 | >.99 | >.99 |
Speech therapist (n=7) | 23.43 (4.75) | .21 | >.99 | >.99 | .79 | >.99 | >.99 | ||||||||||||||||
Physician (n=4) | 22.75 (5.74) | .25 | >.99 | >.99 | .76 | >.99 | >.99 | ||||||||||||||||
Optometrist/optician (n=6) | 24.00 (4.12) | .29 | >.99 | >.99 | .71 | >.99 | >.99 | ||||||||||||||||
Podiatrist (n=5) | 24.10 (2.38) | .22 | >.99 | >.99 | .78 | >.99 | >.99 | ||||||||||||||||
Dental prosthetist (n=1) | — | — | — | — | — | — | — | ||||||||||||||||
General health or clinical psychologist (n=4) | 23.89 (4.23) | .32 | >.99 | >.99 | .68 | >.99 | >.99 | ||||||||||||||||
Occupational therapist (n=9) | 25.28 (3.16) | .60 | >.99 | >.99 | .40 | >.99 | >.99 | ||||||||||||||||
Health care social worker (n=2) | 29.00 (0.71) | — | — | — | — | — | — | ||||||||||||||||
Self-perceived digital competence level | .05 | .26 | .20 | ||||||||||||||||||||
Intermediate (n=110) | 24.12 (4.18) | .02d | .03d | .03d | .99 | >.99 | .99 | ||||||||||||||||
Advanced (n=12) | 26.04 (2.85) | .88 | >.99 | .88 | .12 | .23 | .23 | ||||||||||||||||
ACTICi certificatione | .17 | .87 | .52 | ||||||||||||||||||||
No certification/ACTIC 1 (n=102) | 24.00 (4.14) | .008h | .02d | .02d | .99 | >.99 | .99 | ||||||||||||||||
ACTIC 2 (n=14) | 25.04 (3.74) | .51 | >.99 | >.99 | .49 | >.99 | .97 | ||||||||||||||||
ACTIC 3 (n=6) | 27.92 (2.27) | .99 | >.99 | >.99 | .01d | .04d | .04d | ||||||||||||||||
Experience at digital health eventse | .006j | .03f | .03f | ||||||||||||||||||||
No experience (n=107) | 24.03 (4.21) | .01d | .02d | .02d | .99 | >.99 | .99 | ||||||||||||||||
Some experiencek (n=15) | 26.30 (2.50) | .97 | >.99 | .97 | .03d | .06 | .06 |
a2-tailed t test for comparisons between 2 samples and 2-tailed ANOVA for comparisons among >2 samples.
b1-tailed t test with a defined overall score threshold of <25 points.
c1-tailed t test with a defined overall score threshold of ≥25 points.
dSignificantly less or significantly more than 25 points at P<.05.
eSubgroups with too small a subsample size (n<6) were excluded from the analysis.
fSignificant differences at P<.05.
gSubgroups with a subsample size of n=1 were excluded from the analysis.
hSignificantly less or significantly more than 25 points at P<.01.
iACTIC: Accreditation of Competence in Information and Communication Technologies.
jSignificant differences at P<.01.
kIncludes experience as a speaker, trainer, or organizer.
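To make the testing procedure described in these footnotes concrete, the R sketch below runs the 1-tailed, one-sample t tests against the 25-point cutoff and adjusts the resulting P values with the Bonferroni and Holm methods; the subgroup scores are simulated to loosely match the reported means and SDs and are not the study data.

```r
# Sketch of the subgroup-level checks described in the footnotes, on simulated
# data (means and SDs loosely based on the reported values, not the raw data)
set.seed(7)
scores <- list(
  P1_direct_care   = rnorm(95, mean = 24.1, sd = 4.0),
  P2_indirect_care = rnorm(6,  mean = 27.6, sd = 2.3),
  P4_management    = rnorm(13, mean = 24.5, sd = 5.2)
)

# 1-tailed, one-sample t tests against the 25-point intermediate cutoff
p_below <- sapply(scores, function(x) t.test(x, mu = 25, alternative = "less")$p.value)
p_above <- sapply(scores, function(x) t.test(x, mu = 25, alternative = "greater")$p.value)

# Multiple-comparison corrections across subgroups
round(cbind(unadjusted = p_below,
            bonferroni = p.adjust(p_below, method = "bonferroni"),
            holm       = p.adjust(p_below, method = "holm")), 3)
```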
Activity 2: Proposed Assessment and Accreditation Test for the Map of Digital Competences for Health Professionals
Overview
The mean score for activity 2 was 18.5 (SD 3.7) points, which was very significantly lower (P<.001) than the cutoff score for passing the test (21 points). Of the 122 participants, 89 (73%) scored <21 points (did not pass), and 33 (27%) scored ≥21 points (passed). On average, the sample took 34.4 (SD 11.4) minutes to complete the test. There were no outliers in the times recorded (
).
Descriptor | Values |
Score, mean (SD) | 18.5a (3.7) | |
Score range (points), n (%) | ||
<21 (did not pass) | 89 (73) | |
≥21 (passed) | 33 (27) | |
Minutes taken to complete the activity, mean (SD) | 34.4 (11.4) |
aScores significantly below 21 points (P<.001).
An internal consistency analysis of activity 2 for P1 showed excellent reliability overall (GLB=0.91). The internal consistency of activity 2 could not be determined for the remaining profiles because there were not enough participants in them (
). For activity 2, subgroup comparisons revealed significant differences between health professions and between professionals with intermediate and advanced self-perceived competence levels. However, the comparisons did not remain significant after applying the Bonferroni and Holm adjustments for multiple comparisons.
Some subgroups did not score significantly below 21 points, including P2 professionals (P=.83), nurses (P=.11), physical therapists (P=.22), speech therapists (P=.18), physicians (P>.99), podiatrists (P=.37), occupational therapists (P=.78), professionals with an advanced self-perceived competence level (P=.16), professionals with ACTIC 3 certification (P>.99), and professionals with experience at digital health events (P=.23). In some cases, the lack of a significant result was likely due to the small size of the subgroup (
).
Subgroup | Overall score, mean (SD) | Group comparisona | Intermediate level check |
Unadjusted P value | Bonferroni-adjusted P value | Holm-adjusted P value | <21 pointsb | ≥21 pointsc | |||||||||||||||||||
Unadjusted P value | Bonferroni-adjusted P value | Holm-adjusted P value | Unadjusted P value | Bonferroni-adjusted P value | Holm-adjusted P value | ||||||||||||||||||
Profile | .24 | >.99 | .35 | ||||||||||||||||||||
P1—direct patient care (n=95) | 18.60 (3.71) | <.001d | <.001d | <.001d | >.99 | >.99 | >.99 | ||||||||||||||||
P2—indirect patient care (n=6) | 20.14 (2.37) | .21 | .83 | .21 | .79 | >.99 | >.99 | ||||||||||||||||
P3—innovation, research, and teaching (n=8) | 18.51 (1.77) | .003d | .01e | .006d | >.99 | >.99 | >.99 | ||||||||||||||||
P4—management (n=13) | 16.74 (4.43) | .002d | .008d | .006d | >.99 | >.99 | >.99 | ||||||||||||||||
Health professionf | .03g | .14 | .11 | ||||||||||||||||||||
Nurse (n=37) | 19.59 (3.59) | .01e | .11 | .07 | .99 | >.99 | >.99 | ||||||||||||||||
Pharmacist (n=21) | 18.59 (3.07) | <.001d | .01e | .007d | >.99 | >.99 | >.99 | ||||||||||||||||
Physical therapist (n=17) | 19.16 (3.47) | .02e | .22 | .10 | .98 | >.99 | >.99 | ||||||||||||||||
Other (n=44)g | 17.28 (3.87) | <.001d | <.001d | <.001d | >.99 | >.99 | >.99 | ||||||||||||||||
Specialist biologist (n=1) | —h | — | — | — | — | — | — | ||||||||||||||||
Dietician or nutritionist (n=1) | — | — | — | — | — | — | — | ||||||||||||||||
Dental hygienist (n=7) | 12.47 (1.70) | <.001d | <.001d | <.001d | >.99 | >.99 | >.99 | ||||||||||||||||
Speech therapist (n=7) | 16.87 (3.92) | .02e | .18 | .10 | .98 | >.99 | >.99 | ||||||||||||||||
Physician (n=4) | 19.52 (3.77) | .24 | >.99 | .32 | .76 | >.99 | >.99 | ||||||||||||||||
Optometrist/optician (n=6) | 15.76 (1.80) | <.001d | .005d | .004d | >.99 | >.99 | >.99 | ||||||||||||||||
Podiatrist (n=5) | 15.99 (4.49) | .03f | .37 | .14 | .97 | >.99 | >.99 | ||||||||||||||||
Dental prosthetist (n=1) | — | — | — | — | — | — | — | ||||||||||||||||
General health or clinical psychologist (n=4) | 19.13 (3.11) | .16 | >.99 | .32 | .84 | >.99 | >.99 | ||||||||||||||||
Occupational therapist (n=9) | 19.51 (2.73) | .07 | .78 | .21 | .93 | >.99 | >.99 | ||||||||||||||||
Health care social worker (n=2) | 21.67 (0.23) | — | — | — | — | — | — | ||||||||||||||||
Self-perceived digital competence level | .02g | .09 | .09 | ||||||||||||||||||||
Intermediate (n=110) | 18.29 (3.78) | <.001d | <.001d | <.001d | >.99 | >.99 | >.99 | ||||||||||||||||
Advanced (n=12) | 20.1 (2.06) | .08 | .16 | .08 | .92 | >.99 | >.99 | ||||||||||||||||
ACTICi certificationf | .12 | .59 | .35 | ||||||||||||||||||||
No certification/ACTIC 1 (n=102) | 18.17 (3.82) | <.001d | <.001d | <.001d | >.99 | >.99 | >.99 | ||||||||||||||||
ACTIC 2 (n=14) | 19.31 (2.35) | .008d | .02e | .02e | .99 | >.99 | >.99 | ||||||||||||||||
ACTIC 3 (n=6) | 21.55 (1.70) | .77 | >.99 | .77 | .23 | .70 | .70 | ||||||||||||||||
Experience at digital health eventsf | .13 | .66 | .35 | ||||||||||||||||||||
No experience (n=107) | 18.28 (3.67) | <.001d | <.001d | <.001d | >.99 | >.99 | >.99 | ||||||||||||||||
Some experiencej (n=15) | 19.83 (3.56) | .11 | .23 | .11 | .89 | >.99 | >.99 |
a2-tailed t test for comparisons between 2 samples and 2-tailed ANOVA for comparisons among >2 samples.
b1-tailed t test with a defined overall score threshold of <21 points.
c1-tailed t test with a defined overall score threshold of ≥21 points.
dVery significantly less than 21 points at P<.01.
eSignificantly less than 21 points at P<.05.
fSubgroups with too small a subsample size (n<6) were excluded from the analysis.
gSignificant differences at P<.05.
hSubgroups with a subsample size of n=1 were excluded from the analysis.
iACTIC: Accreditation of Competence in Information and Communication Technologies.
jIncludes experience as a speaker, trainer, or organizer.
Item Dimension (Instrument Diagnosis)
It should be noted that we performed this analysis only for activity 2 and P1 (direct patient care) as this subgroup was large enough for this type of analysis (n>40;
).
Correlation Between Participants' Performance in Activity 1 and Activity 2
Regarding the correlation between participants’ performance in activity 1 and activity 2, there were 2 main groups: those who did not pass either activity 1 or activity 2 (49/122, 40.2%) and those who passed activity 1 but did not pass activity 2 (40/122, 32.8%) (
).
Descriptor | Participants, n (%) |
Passed activity 1 and passed activity 2 | 23 (18.9) |
Passed activity 1 and did not pass activity 2 | 40 (32.8) |
Did not pass activity 1 and passed activity 2 | 10 (8.2) |
Did not pass activity 1 and did not pass activity 2 | 49 (40.2) |
Of the 33 participants who passed activity 2, a total of 10 (30%) did not pass activity 1. Although these are isolated cases, they call into question the validity of the advanced level of activity 2 and need to be investigated further. No significant differences were found between this subgroup and the remaining professionals in the time taken to complete or in the number of blank answers in activity 1.
Activity 3: Feedback Questionnaire
A total of 94.3% (115/122) of the participants completed activity 3, although not all of them answered every question.
There were no significant differences in the ratings by profile (P1, P2, P3, and P4) or by health profession that would allow for robust conclusions to be drawn (
).
Question number and topic | Result |
Question 1—level of difficulty of activity 2, mean (SD) | 3.6 (0.7) | |
Question 2—self-perceived level in activity 2, n (%) | ||
Basic | 7 (6.1) | |
Intermediate | 49 (42.6) | |
Advanced | 59 (51.3) | |
Question 3—wording of the questions, mean (SD) | 3.8 (0.9) | |
Question 4—appropriateness of the challenges, mean (SD) | 3.2 (1.2) | |
Question 5—feedback on the challenges and scenarios | Open-ended question | |
Question 6—appropriateness of the profiles, mean (SD) | 4.2 (0.8) | |
Question 7—feedback on the profiles | Open-ended question | |
Question 8—self-perceived level in each of the digital competences defined for health professionals | Multiple-choice question | |
Question 9—general feedback on the pilot study and the initiative | Open-ended question |
When asked whether they would change any of the challenges or scenarios presented in activity 2 as part of our assessment and accreditation model (question 5), a considerable number of professionals (45/94, 48%) indicated that they did not fully identify with them (
). When the responses were analyzed by health profession, we found that nurses and physicians generally identified with the challenges more than their counterparts in other health professions did (
). While 48% (11/23) of nurses and physicians gave positive feedback in this regard, only 31% (15/48) of respondents from other professions said that they adequately identified with the challenges and scenarios. When asked whether they would change any of the profiles in our model (question 7), most participants (49/79, 62%) agreed that the profiles were accurately defined. Of the participants who did not fully agree, some questioned the workday percentage defined for categorization (eg, 70% direct patient care for P1). Others felt that there were hybrid profiles halfway between P1 and P4 or other profiles altogether or indeed that some profiles should be included in others (
). It should be noted that the participants who indicated that they did not fully agree with the profiles were either from the private sector or working in pharmaceutical companies.
Regarding the participants' self-perceived level in each of the 10 digital competences defined for health professionals in our framework (question 8), most reported a basic level in the data analysis and digital transformation competences (71/111, 64% and 64/111, 58%, respectively). The competence for which most professionals reported an advanced level was ethics and civic-mindedness (44/111, 40%;
).
Discussion
Frameworks and Competence Assessment Challenges
Digital health offers a valuable opportunity to make strides toward attaining the United Nations’ Sustainable Development Goal of universal health coverage by 2030 [
]. To that end, and given the gradual digitization of the health care sector, health professionals must have the appropriate competences regardless of their specific disciplines [ , , , , ]. Furthermore, the constant evolution of new technologies requires the continuous updating of digital health competence. This entails reviewing both the relevant competences and the methods for properly assessing them [
]. Some discrepancies and overlap still exist among available frameworks, the methods used to conceptualize such frameworks, and the competences they include [ ] and also among the health professionals at whom they are aimed [ , ]. The lack of a comprehensive framework applicable to all health professionals warrants the development of frameworks that encompass the different health professions. Among the most recent frameworks is the Health Information Technology Competencies framework. Developed using an iterative method, it is the most complete framework aimed at a broad range of health professionals and medical specialties [
]. In Spain, and more specifically in the Basque Country (northern Spain), the Ikanos project (2020) was developed using the European DigComp framework as a benchmark for the description of digital competences. Drawing on interviews with experts, the Ikanos framework focuses on profiles based initially on primary care professionals (medical and nursing staff) [ ]. The COMPDIG-Salut project was launched for the purposes of designing a specific map of digital competences for health professionals and creating a specific digital competence assessment and accreditation model to enable health professionals to obtain a specific accreditation certificate. Taking an iterative approach, which included a review of the gray literature and consultation with local experts [
], we designed a map of digital competences, consisting of 10 competences under 4 competence areas: data access, management, and analysis; communication and collaboration; digital awareness; and professional development. The second and fourth of these areas have the highest number of competences (3 each). In short, the map includes competences related to essential skills for the digital management and analysis of health and social data and information, as well as emerging technological innovation (ie, health apps, artificial intelligence, and autonomous decision support systems) [ , ]. Moreover, and given that digital health entails new methods of communication (teleconsultations and email), it is imperative for health professionals to know how to convey information in a precise, effective, and timely manner to patients, other health professionals, and any collaborating parties [ - ]. Effective communication, collaboration, and teamwork are crucial in a health setting, so health professionals must also possess the ability to work effectively as members of interdisciplinary teams [ , ]. The creation and publication of health-related digital content is another basic competence that health professionals should have to ensure high-quality health care provision and help improve the communities they serve [ , ]. Finally, emphasis is placed on the importance of health professionals complying with the ethical principles and security criteria associated with the appropriate use of digital technologies [ ] and also keeping abreast of the regulations on health and social data and information privacy, confidentiality, and protection [ , , ]. However, we should bear in mind that the evolving nature of digital health and professional practice in the health field—which often outpaces research—requires constant, flexible development of professional competences [ ]. Hence, some of the competences in the competence framework defined in the COMPDIG-Salut project may need to be adjusted to incorporate future digital trends, such as addressing challenges in the areas of cybersecurity and the use of artificial intelligence and robotics in professional practice. This implies that the methods for accurately assessing such competences may need to be updated.
Digital Competence Assessment Model
Meanwhile, given the need to assess health professionals’ digital competence levels for the purposes of narrowing the digital gap and, by so doing, maintaining health service quality [
, ], a digital competence assessment model for health professionals was developed as part of the COMPDIG-Salut project. Drawing on the competence content defined in the project, the model was designed in accordance with ACTIC's new scenario-based assessment model [ ]. In this sense, the assessment model was adapted to the 4 distinct health sector profiles: professionals providing direct patient care; professionals providing indirect patient care; professionals providing innovation, research, or teaching services; and professionals in management roles. This not only reflects the impact of the professional role on digital competence [ ] but also, with the differentiation of the 4 profiles, facilitates the design of a specific training pathway for each role ( ). While the focus was on the type of health professional profile (P1, P2, P3, or P4), the most frequent professional categories into which participants fell were nurses, pharmacists, and physiotherapists. The internal consistency of the digital competence assessment model for health professionals in P1 was excellent overall, bearing in mind that it was a proposed assessment model (GLB=0.91). We found that 54% (15/28) of the questions discriminated between health professionals with an advanced level of digital competence and those with a lower level (DI≥0.20) and that 11% (3/28) of the questions were highly discriminating (DI>0.4). However, 7% (2/28) of the items had a negative DI, so these would need to be reviewed because a negative DI implies an inverse relationship between the level and the correct answer, meaning that the wording of the question may not have been sufficiently precise. That said, as these values were close to 0, they could be regrouped with the nondiscriminating questions (DI=0-0.2). As the assessment model for the remaining profiles assessed the same competences as those in the P1 test, we expected the internal consistency and the degree of discrimination of the questions to be similar among the profiles. However, it was not possible to directly extrapolate the P1 results.
Most of the questions in the feedback questionnaire (activity 3) could not be robustly interpreted because they did not constitute a standard instrument that had been validated by another study. The values obtained in the Likert-type questions could not be robustly interpreted either because the values were not near the ends of the scale. Many health professionals felt that their roles and challenges were not adequately represented in the assessment model despite the professional profiles being well defined. This was especially the case among health professionals who made up the smallest groups (mostly nonmedical and nonnursing staff and those working in the private or pharmaceutical sectors). This result underscores the necessity of refining the assessment model to meet the diverse digital competence needs of health care professionals, emphasizing the potential requirement for more targeted measures tailored to specific groups, and acknowledging their potential cultural dependencies. Therefore, it would be necessary to either review the profile proposal for the competence assessment model or add scenarios and challenges to it for health professionals who were not reflected in the proposed situations. The sample size of some subgroups (physicians, health professionals with advanced digital competence, P2, P3, and P4) may have been too small to represent the population.
Self-Perceived Digital Competence and Certification
Although most study participants (110/122, 90.2%) reported an intermediate self-perceived digital competence level and only 9.8% (12/122) of participants considered themselves advanced, we found that the vast majority would not attain ACTIC 2 (intermediate) certification [
]. This situation was similar to the one found in an exploratory study conducted in 2021 [ ]. Consistent with the low score for the ACTIC 2 certification (activity 1), the total mean score for activity 2, which assessed the defined competence framework, was significantly lower than the defined threshold. In fact, 73% (89/122) of the sample did not pass the test. Participants with ACTIC 3 certification scored significantly above the pass threshold in activity 1 but not in activity 2. Indeed, none of the subgroups studied scored significantly above the pass threshold in activity 2. This suggests that the assessment model is a useful approach to assessing the competence framework for health professionals, and the results strengthen those obtained in the exploratory study [ ]. This also shows that health professionals still need training, that such training goes beyond that required for the main tools currently used, and that merely being exposed to digital media as consumers is insufficient in terms of acquiring the necessary digital skills. More and more faculties of medicine are incorporating digital health training into their programs of study [ , ], and there is an ever-increasing number of initiatives to meet that need [ ], such as the Digital Capability Framework developed by Health Education England [ ] or the free introduction to eHealth web-based course offered by the EU*US eHealth Work project [ ]. The designed map of digital competences could serve as the basis of the knowledge and skills for digital literacy specific to health professionals, to which other competences or areas could be added in accordance with the particularities of the various health professions and medical specialties. This map of digital competences could also inform and contribute to the development of future digital health training programs for health workers at different stages of their professional careers.
Strengths and Limitations
Our study provides a competence framework and a tool for assessing the digital competences of health professionals that is consistent with the certification available to citizens (ACTIC). It also yields results that strengthen those obtained in the previous exploratory study [
]. However, this study has several limitations. First, several weaknesses in the interpretation of findings from the review should be considered. While narrative reviews are often used in social science research for educational purposes, they may be biased and lack objectivity. That said, they can provide a unique perspective and identify knowledge gaps in the literature [ ]. Although a considerable number of reference frameworks from gray literature were included, some may have been overlooked. Moreover, while performing thematic analysis, we found that some frameworks had either vague competence categories or overlaps among categories, which engendered differences of opinion during the classification process. However, the 4 reviewers drew on their experience to develop and clearly define the competence areas and assign the corresponding competences to them, first independently and then jointly through discussion-based agreement to reduce bias and classify the information in the most appropriate way possible. Although the reviewers tried to make the classification process as transparent and replicable as possible, it should be borne in mind that there could be other ways of interpreting and classifying the information and that the categorization might differ in the future with new advances in digital health. Second, while valid results were obtained for 122 participants, the minimum sample size set for the study was not reached (N=168). The platform used for the test (Moodle), the number of activities (3), and the time allotted (estimated at 1.5 hours) were almost certainly the reasons for poor participation in the study, especially given the high workload of the professional group in question. A study assessing both perceived and demonstrated eHealth literacy through a computer simulation of health-related internet tasks also revealed that evaluating demonstrated eHealth literacy via simulations is a challenging endeavor in terms of time [ ]. When comparing the results of this study (using Moodle for the test) to those of the previous exploratory study (using Microsoft Forms for the test), it could be said that Moodle might have had an influence on a health professional’s decision not to take part in the study [ ]. Third, the most highly represented health professionals in this study were those providing direct patient care (P1). Thus, many of the conclusions drawn from the study are limited to that profile. The poor participation of physicians (4/122, 3.3%) meant that they were underrepresented if compared to the 2017 professional population data for Catalonia [ ]. As it fell to professional associations to inform their members about this study, the dissemination mechanisms they used are not precisely known. However, their impact was seemingly greater in the associations of nurses, pharmacists, and physiotherapists than in the associations of other health professions. Fourth, the absence of a detailed demographic analysis of the participant population limits our ability to provide a comprehensive understanding of sample characteristics. While demographic characterization could aid in contextualizing the results and their relationship to the simulation performance, such data were not collected in this study in accordance with the General Data Protection Regulation principle of “data minimization” [ ]. Consequently, the lack of this analysis may restrict our ability to generalize findings to broader populations and fully comprehend the influence of demographic variables on study outcomes. 
Fifth, our findings enable us to estimate the prevalence of P1 as the most common profile; however, they do not provide a basis for drawing more specific conclusions. The 2022 report on health care professionals in Catalonia lacks profession categorization [ ], and acquiring accurate and current data poses challenges attributable to factors such as professional mobility (eg, change of workplace, the coexistence of public and private sectors, and mobility between various institutions) and liberal professions. Subsequent research endeavors should prioritize the inclusion of all 4 profiles for a more comprehensive understanding. Sixth, the overrepresentation of health professionals with an intermediate self-perceived competence level compared to those considering themselves advanced also limited some of the study conclusions. We will probably need to review our definitions of the different self-perception levels because, perhaps due to the wording used, the levels may have been perceived as more demanding than intended (eg, advanced user: "I have attained the most advanced digital competences for transforming and innovating in today's digital society."). Moreover, the evaluation of digital health competences should take into account that, despite being notable, there is a moderate correlation between perceived and actual digital competences, as other studies looking at simulation scenarios in health care have shown [ , ]. Despite the considerable similarity between the latter simulation scenarios and those used in this study, previous research has focused on distinct demographic groups within the general population, such as individuals aged ≥50 years [ ], or specific cohorts, such as patients with chronic illnesses [ ], rather than health care professionals.
Conclusions
Assessing the digital competence level of health professionals based on a defined competence framework should enable such professionals to be trained and updated to meet real needs in their specific professional contexts and, consequently, take full advantage of the potential of digital technologies [
]. The information and data gathered, together with the results of an exploratory study on competence levels conducted in 2021 [ ], should be taken as the starting point for promoting relevant strategic policies and actions to ensure that the right resources and conditions are in place for good professional performance [ , - ]. In response to the need to improve the digital competence of health professionals working in Catalonia, these results have informed the Health Plan for Catalonia 2021-2025 [ ] and lay the foundations for designing and delivering specific training to first assess and then certify the digital competence of such professionals. Thus, the scenario-based assessment model presented in this paper, designed in accordance with the competence content defined in the map of digital competences, has the potential to be applied in diverse countries and languages with appropriate modifications to meet the specific needs and contexts of health care professionals. It might provide a preliminary perspective for assessing the competence levels and needs of health professionals in different health systems, although further evidence is needed to fully support this claim.

Conflicts of Interest
None declared.
Reference frameworks selected for the analysis and comparative study (DOCX File, 18 KB).
Ordering of competences by thematic area for each framework (DOCX File, 26 KB).
Axial coding of content to define areas, competences, and indicators (PNG File, 252 KB).
Web-based questionnaire 1 (PDF File, 263 KB).
Web-based questionnaire 2 (PDF File, 306 KB).
Information sheet and recruitment questionnaire (PDF File, 157 KB).
Specific map of digital competences for health professionals (DOCX File, 21 KB).
Overall distribution of scores on Activity 1 (DOCX File, 22 KB).
Overall distribution of scores in the proposed assessment and accreditation test for health professionals (DOCX File, 21 KB).
Discrimination indexes of the instrument items for profile P1 (DOCX File, 17 KB).
Question 5: “Feedback on the profile-specific challenges and scenarios” (DOCX File, 13 KB).
Question 5: “Feedback on the profile-specific challenges and scenarios by profession” (DOCX File, 14 KB).
Question 7: “Feedback on the profiles” (DOCX File, 13 KB).
Example scenario 1 (PDF File, 397 KB).

References
- Armaignac DL, Saxena A, Rubens M, Valle CA, Williams LM, Veledar E, et al. Impact of telemedicine on mortality, length of stay, and cost among patients in progressive care units: experience from a large healthcare system. Crit Care Med. May 2018;46(5):728-735. [FREE Full text] [CrossRef] [Medline]
- Nazeha N, Pavagadhi D, Kyaw BM, Car J, Jimenez G, Tudor Car L. A digitally competent health workforce: scoping review of educational frameworks. J Med Internet Res. Nov 05, 2020;22(11):e22706. [FREE Full text] [CrossRef] [Medline]
- Koehler F, Koehler K, Deckwart O, Prescher S, Wegscheider K, Kirwan B, et al. Efficacy of telemedical interventional management in patients with heart failure (TIM-HF2): a randomised, controlled, parallel-group, unmasked trial. Lancet. Sep 22, 2018;392(10152):1047-1057. [CrossRef] [Medline]
- Mian A, Khan S. Medical education during pandemics: a UK perspective. BMC Med. Apr 09, 2020;18(1):100. [FREE Full text] [CrossRef] [Medline]
- Jimenez G, Spinazze P, Matchar D, Koh Choon Huat G, van der Kleij RM, Chavannes NH, et al. Digital health competencies for primary healthcare professionals: a scoping review. Int J Med Inform. Nov 2020;143:104260. [CrossRef] [Medline]
- Schreiweis B, Pobiruchin M, Strotbaum V, Suleder J, Wiesner M, Bergh B. Barriers and facilitators to the implementation of eHealth services: systematic literature analysis. J Med Internet Res. Nov 22, 2019;21(11):e14197. [FREE Full text] [CrossRef] [Medline]
- Longhini J, Rossettini G, Palese A. Correction: digital health competencies among health care professionals: systematic review. J Med Internet Res. Nov 29, 2022;24(11):e43721. [FREE Full text] [CrossRef] [Medline]
- Saigí-Rubió F, Borges do Nascimento IJ, Robles N, Ivanovska K, Katz C, Azzopardi-Muscat N, et al. The current status of telemedicine technology use across the World Health Organization European region: an overview of systematic reviews. J Med Internet Res. Oct 27, 2022;24(10):e40877. [FREE Full text] [CrossRef] [Medline]
- Brown J, Pope N, Bosco AM, Mason J, Morgan A. Issues affecting nurses' capability to use digital technology at work: an integrative review. J Clin Nurs. Aug 2020;29(15-16):2801-2819. [CrossRef] [Medline]
- Burmann A, Tischler M, Faßbach M, Schneitler S, Meister S. The role of physicians in digitalizing health care provision: web-based survey study. JMIR Med Inform. Nov 11, 2021;9(11):e31527. [FREE Full text] [CrossRef] [Medline]
- Shiferaw KB, Tilahun BC, Endehabtu BF. Healthcare providers' digital competency: a cross-sectional survey in a low-income country setting. BMC Health Serv Res. Nov 09, 2020;20(1):1021. [FREE Full text] [CrossRef] [Medline]
- Digital skills for health professionals. European Health Parliament. URL: https://www.healthparliament.eu/digital-skills-health-professionals/ [accessed 2022-07-22]
- European year of skills 2023. European Commission. URL: https://commission.europa.eu/strategy-and-policy/priorities-2019-2024/europe-fit-digital-age/european-year-skills-2023_en [accessed 2023-07-15]
- Global strategy on digital health 2020-2025. World Health Organization. 2021. URL: https://apps.who.int/iris/handle/10665/344249 [accessed 2024-04-29]
- Brunner M, McGregor D, Keep M, Janssen A, Spallek H, Quinn D, et al. An eHealth capabilities framework for graduates and health professionals: mixed-methods study. J Med Internet Res. May 15, 2018;20(5):e10229. [FREE Full text] [CrossRef] [Medline]
- Improving digital literacy. The Royal College of Nursing & Health Education England. 2017. URL: https://www.rcn.org.uk/Professional-Development/publications/pub-006129 [accessed 2023-05-12]
- Heath S. Digital health literacy: why it’s important and how to improve it. TechTarget. URL: https://www.techtarget.com/patientengagement/feature/Digital-Health-Literacy-Why-Its-Important-and-How-to-Improve-It [accessed 2023-05-25]
- Milner RJ, Gusic ME, Thorndyke LE. Perspective: toward a competency framework for faculty. Acad Med. 2011;86(10):1204-1210. [CrossRef]
- Saigí-Rubió F, Vidal-Alaball J, Torrent-Sellens J, Jiménez-Zarco A, López Segui F, Carrasco Hernandez M, et al. Determinants of Catalan public primary care professionals' intention to use digital clinical consultations (eConsulta) in the post-COVID-19 context: optical illusion or permanent transformation? J Med Internet Res. May 31, 2021;23(6):e28944. [FREE Full text] [CrossRef] [Medline]
- Pérez Sust P, Solans O, Fajardo JC, Medina Peralta M, Rodenas P, Gabaldà J, et al. Turning the crisis into an opportunity: digital health strategies deployed during the COVID-19 outbreak. JMIR Public Health Surveill. May 04, 2020;6(2):e19106. [FREE Full text] [CrossRef] [Medline]
- Temprana-Salvador J, López-García P, Castellví Vives J, de Haro L, Ballesta E, Rojas Abusleme M, et al. DigiPatICS: digital pathology transformation of the Catalan Health Institute Network of 8 hospitals-planification, implementation, and preliminary results. Diagnostics (Basel). Mar 30, 2022;12(4):852. [FREE Full text] [CrossRef] [Medline]
- Reunió plenària I Fòrum de diàleg professional. Departament de Salut. URL: http://salutweb.gencat.cat/ca/el_departament/eixos-xiv-legislatura/forum-dialeg-professional/etapes/treballs-fase-anterior/i-forum-diagnostic/ [accessed 2022-07-27]
- Reixach E, Andrés E, Sallent Ribes J, Gea-Sánchez M, Àvila López A, Cruañas B, et al. Measuring the digital skills of Catalan health care professionals as a key step toward a strategic training plan: digital competence test validation study. J Med Internet Res. Nov 30, 2022;24(11):e38347. [FREE Full text] [CrossRef] [Medline]
- Karnoe A, Furstrand D, Christensen KB, Norgaard O, Kayser L. Assessing competencies needed to engage with digital health services: development of the eHealth literacy assessment toolkit. J Med Internet Res. May 10, 2018;20(5):e178. [FREE Full text] [CrossRef] [Medline]
- Ferrari R. Writing narrative style literature reviews. Med Write. Dec 23, 2015;24(4):230-235. [CrossRef]
- Greenhalgh T, Thorne S, Malterud K. Time to challenge the spurious hierarchy of systematic over narrative reviews? Eur J Clin Invest. Jun 16, 2018;48(6):e12931. [FREE Full text] [CrossRef] [Medline]
- The digital competence framework. European Commission. URL: https://joint-research-centre.ec.europa.eu/digcomp/digital-competence-framework_en [accessed 2022-07-26]
- Montero Delgado JA, Merino Alonso FJ, Monte Boquet E, Ávila de Tomás JF, Cepeda Díez JM. Competencias digitales clave de los profesionales sanitarios. Educación Médica. Sep 2020;21(5):338-344. [CrossRef]
- Test Ikanos de competencias digitales. Ikanos. URL: https://test.ikanos.eus/ [accessed 2022-07-26]
- Good practices on DC frameworks. TicSalut Social Foundation. URL: https://ticsalutsocial.cat/wp-content/uploads/2024/07/D1.-Good-practices-on-DC-frameworks.pdf [accessed 2023-05-26]
- Jahnel T, Pan CC, Pedros Barnils N, Muellmann S, Freye M, Dassow HH, et al. Developing and evaluating digital public health interventions using the digital public health framework DigiPHrame: a framework development study. J Med Internet Res. Sep 12, 2024;26:e54269. [FREE Full text] [CrossRef] [Medline]
- A health and care digital capabilities framework. National Health Service. 2018. URL: https://www.hee.nhs.uk/sites/default/files/documents/Digital%20Literacy%20Capability%20Framework%202018.pdf [accessed 2024-04-29]
- The Topol review. NHS Health Education England. URL: https://topol.hee.nhs.uk/ [accessed 2023-09-18]
- What is ACTIC. Accreditation of Competence in Information and Communication Technologies. URL: http://actic.gencat.cat/en/actic_informacio/actic_que_es_l_actic_/index.html [accessed 2022-07-26]
- Charmaz K. Constructing Grounded Theory: A Practical Guide through Qualitative Analysis (Introducing Qualitative Methods series). Thousand Oaks, CA. Sage Publications; 2006.
- BOE-A-2003-21340 Ley 44/2003, de 21 de noviembre, de ordenación de las profesiones sanitarias. BOE.es. URL: https://www.boe.es/buscar/act.php?id=BOE-A-2003-21340 [accessed 2022-07-26]
- Rodríguez Del Águila MM, González-Ramírez AR. Sample size calculation. Allergol Immunopathol (Madr). Sep 2014;42(5):485-492. [CrossRef] [Medline]
- Taib F, Yusoff MS. Difficulty index, discrimination index, sensitivity and specificity of long case and multiple choice questions to predict medical students’ examination performance. J Taibah Univ Med Sci. Jun 2014;9(2):110-114. [CrossRef]
- Bengtsson M. How to plan and perform a qualitative study using content analysis. NursingPlus Open. 2016;2:8-14. [CrossRef]
- Pla de salut 2021-2025. Departament de Salut, Generalitat de Catalunya. 2021. URL: https://salutweb.gencat.cat/web/.content/_departament/pla-de-salut/pla-de-salut-2021-2025/pla-salut-catalunya-2021-2025.pdf [accessed 2021-04-13]
- Situació actual i diagnòstic de necessitats de professionals de la salut. Departament de Salut. 2018. URL: https://salutweb.gencat.cat/web/.content/_departament/eixos-xiv-legislatura/1r_forum_dialeg_professional/documents/diagnostic.pdf [accessed 2024-04-29]
- Sustainable development goals. United Nations Sustainable Development. URL: https://www.un.org/sustainabledevelopment/ [accessed 2023-09-28]
- Classification of digital health interventions v 1. World Health Organization. 2018. URL: https://apps.who.int/iris/bitstream/handle/10665/260480/WHO-RHR-18.06-eng.pdf [accessed 2024-04-29]
- Hübner U, Thye J, Shaw T, Elias B, Egbert N, Saranto K, et al. Towards the TIGER international framework for recommendations of core competencies in health informatics 2.0: extending the scope and the roles. Stud Health Technol Inform. Aug 21, 2019;264:1218-1222. [CrossRef] [Medline]
- Longhini J, Rossettini G, Palese A. Digital health competencies among health care professionals: systematic review. J Med Internet Res. Aug 18, 2022;24(8):e36414. [FREE Full text] [CrossRef] [Medline]
- The Topol review: preparing the healthcare workforce to deliver the digital future. The NHS Constitution. URL: https://topol.hee.nhs.uk/ [accessed 2023-07-24]
- Griffiths FE, Armoiry X, Atherton H, Bryce C, Buckle A, Cave JA, et al. The Role of Digital Communication in Patient–Clinician Communication for NHS Providers of Specialist Clinical Services for Young People [The Long-Term Conditions Young People Networked Communication (LYNC) Study]: A Mixed-Methods Study. Southampton, UK. NIHR Journals Library; 2018.
- Norgaard O, Furstrand D, Klokker L, Karnoe A, Batterham R, Kayser L. The e-health literacy framework: a conceptual framework for characterizing e-health users and their interaction with e-health systems. Knowl Manag E-Learn. 2015;7:540. [CrossRef]
- Duffy FF, Fochtmann LJ, Clarke DE, Barber K, Hong S, Yager J, et al. Psychiatrists' comfort using computers and other electronic devices in clinical practice. Psychiatr Q. Sep 2016;87(3):571-584. [FREE Full text] [CrossRef] [Medline]
- Global diffusion of eHealth: making universal health coverage achievable: report of the third global survey on eHealth. World Health Organization. 2016. URL: https://apps.who.int/iris/handle/10665/252529 [accessed 2024-04-29]
- Tsirintani M. Fake news and disinformation in health care- challenges and technology tools. Stud Health Technol Inform. May 27, 2021;281:318-321. [CrossRef] [Medline]
- Vayena E, Haeusermann T, Adjekum A, Blasimme A. Digital health: meeting the ethical and policy challenges. Swiss Med Wkly. Dec 29, 2018;148:w14571. [FREE Full text] [CrossRef] [Medline]
- van Houwelingen CT, Ettema RG, Kort HS, Ten Cate O. Hospital nurses' self-reported confidence in their telehealth competencies. J Contin Educ Nurs. Jan 01, 2019;50(1):26-34. [FREE Full text] [CrossRef] [Medline]
- Kirchberg J, Fritzmann J, Weitz J, Bork U. eHealth literacy of German physicians in the pre-COVID-19 era: questionnaire study. JMIR Mhealth Uhealth. Oct 16, 2020;8(10):e20099. [FREE Full text] [CrossRef] [Medline]
- Potts HW. Is e-health progressing faster than e-health researchers? J Med Internet Res. Sep 29, 2006;8(3):e24. [FREE Full text] [CrossRef] [Medline]
- Konttila J, Siira H, Kyngäs H, Lahtinen M, Elo S, Kääriäinen M, et al. Healthcare professionals' competence in digitalisation: a systematic review. J Clin Nurs. Mar 2019;28(5-6):745-761. [CrossRef] [Medline]
- Aungst TD, Patel R. Integrating digital health into the curriculum-considerations on the current landscape and future developments. J Med Educ Curric Dev. Jan 20, 2020;7:2382120519901275. [FREE Full text] [CrossRef] [Medline]
- Pathipati AS, Azad TD, Jethwani K. Telemedical education: training digital natives in telemedicine. J Med Internet Res. Dec 12, 2016;18(7):e193. [FREE Full text] [CrossRef] [Medline]
- Workforce, training and education: improving the digital literacy of the workforce. National Health Service, England. 2018. URL: https://www.hee.nhs.uk/our-work/digital-literacy [accessed 2022-07-26]
- EU*US eHealth work. EHTEL. URL: https://ehtel.eu/activities/eu-funded-projects/euus-ehealth-work.html#:~:text=The%20EU*US%20eHealth%20Work,healthcare%20workforce%20across%20the%20globe [accessed 2022-07-26]
- Narrative review - an overview. ScienceDirect. URL: https://www.sciencedirect.com/topics/psychology/narrative-review [accessed 2023-07-25]
- Neter E, Brainin E. Perceived and performed eHealth literacy: survey and simulated performance test. JMIR Hum Factors. Jan 17, 2017;4(1):e2. [FREE Full text] [CrossRef] [Medline]
- Article 5: Regulation (EU) 2016/679 of the European Parliament and of the Council. European Union. URL: https://eur-lex.europa.eu/eli/reg/2016/679/oj [accessed 2024-05-13]
- van der Vaart R, Drossaert CH, de Heus M, Taal E, van de Laar MA. Measuring actual eHealth literacy among patients with rheumatic diseases: a qualitative analysis of problems encountered using Health 1.0 and Health 2.0 applications. J Med Internet Res. Feb 2013;15(2):e27. [FREE Full text] [CrossRef] [Medline]
- Jarva E, Oikarinen A, Andersson J, Tomietto M, Kääriäinen M, Mikkonen K. Healthcare professionals' digital health competence and its core factors; development and psychometric testing of two instruments. Int J Med Inform. Mar 2023;171:104995. [FREE Full text] [CrossRef] [Medline]
- Machleid F, Kaczmarczyk R, Johann D, Balčiūnas J, Atienza-Carbonell B, von Maltzahn F, et al. Perceptions of digital health education among European medical students: mixed methods survey. J Med Internet Res. Aug 14, 2020;22(8):e19827. [FREE Full text] [CrossRef] [Medline]
- Socha-Dietrich K. Empowering the health workforce to make the most of the digital revolution. Organization for Economic Cooperation and Development. 2021. URL: https://www.oecd-ilibrary.org/content/paper/37ff0eaa-en [accessed 2024-04-29]
- Keep M, Janssen A, McGregor D, Brunner M, Baysari MT, Quinn D, et al. Mapping eHealth education: review of eHealth content in health and medical degrees at a metropolitan tertiary institute in Australia. JMIR Med Educ. Aug 19, 2021;7(3):e16440. [FREE Full text] [CrossRef] [Medline]
- Echelard JF, Méthot F, Nguyen HA, Pomey MP. Medical student training in eHealth: scoping review. JMIR Med Educ. Sep 11, 2020;6(2):e20027. [FREE Full text] [CrossRef] [Medline]
Abbreviations
ACTIC: Accreditation of Competence in Information and Communication Technologies
COMPDIG-Salut: Digital Skills for Health Professionals
DI: discrimination index
DigComp: Digital Competence Framework for Citizens
GLB: greatest lower bound
Edited by F Pietrantonio, M Montagna, J López Castro, I Said-Criado; submitted 07.10.23; peer-reviewed by V Grieco, E Brainin; comments to author 21.12.23; revised version received 12.02.24; accepted 17.06.24; published 17.10.24.
Copyright©Francesc Saigí-Rubió, Teresa Romeu, Eulàlia Hernández Encuentra, Montse Guitert, Erik Andrés, Elisenda Reixach. Originally published in JMIR Medical Education (https://mededu.jmir.org), 17.10.2024.
This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Medical Education, is properly cited. The complete bibliographic information, a link to the original publication on https://mededu.jmir.org/, as well as this copyright and license information must be included.