Published in Vol 9, No 8 (2020): August

Preprints (earlier versions) of this paper are available at https://preprints.jmir.org/preprint/19072.
Usability Methods and Attributes Reported in Usability Studies of Mobile Apps for Health Care Education: Protocol for a Scoping Review


Protocol

1Department of Occupational Therapy, Faculty of Health and Function, Western Norway University of Applied Sciences, Bergen, Norway

2Centre for Evidence-Based Practice, Western Norway University of Applied Sciences, Bergen, Norway

3Division of Health Services, Norwegian Institute of Public Health, Oslo, Norway

4Faculty of Health Sciences, McMaster University, Hamilton, ON, Canada

5Department of Physiotherapy, Faculty of Health and Function, Western Norway University of Applied Sciences, Bergen, Norway

*all authors contributed equally

Corresponding Author:

Susanne Grødem Johnson, MSc

Department of Occupational Therapy

Faculty of Health and Function

Western Norway University of Applied Sciences

Inndalsveien 28

Bergen, 5063

Norway

Phone: 47 92213202

Email: susanne.grodem.johnson@hvl.no


Background: E-learning technologies, including mobile apps, are used to a large extent in health care education. Mobile apps can provide extendable learning environments and motivate students for adaptive and collaborative learning outside the classroom context. Developers should design practical, effective, and easy-to-use mobile apps. Usability testing is an important part of app development in order to understand whether apps meet the needs of users.

Objective: The aim of this study is to perform a scoping review of usability methods and attributes reported in usability studies of mobile apps for health care education.

Methods: The scoping review is guided by the methodological framework developed by Arksey and O’Malley and further developed by Levac et al and Khalil et al. We will follow five stages: (1) identifying the research question; (2) identifying relevant studies; (3) selecting studies; (4) charting the data; and (5) summarizing and reporting the results. We have developed two research questions to meet the aim of the study: (1) What usability methods are used to evaluate the usability of mobile apps for health care education? and (2) What usability attributes are reported in usability studies of mobile apps for health care education? We will conduct a comprehensive literature search across 10 databases, supplemented by a reference search and a search for grey literature. Two review authors will independently screen articles for eligibility.

Results: The initial electronic database searches were completed in March 2019. The literature search identified 14,297 unique references. Following title and abstract screening, the full texts of 369 records were obtained. The scoping review is expected to be completed in spring 2021.

Conclusions: We expect the overview of usability methods and attributes reported in usability studies of mobile apps for health care education to contribute to the knowledge base for researchers and developers. It will give an overview of the research field and provide researchers and developers with relevant and important information on the usability research area, including highlighting possible research gaps.

International Registered Report Identifier (IRRID): DERR1-10.2196/19072

JMIR Res Protoc 2020;9(8):e19072

doi:10.2196/19072




Background

There has been increasing attention to e-learning technologies, including mobile apps, in health care education. Mobile apps can provide extendable learning environments and motivate students for adaptive and collaborative learning outside the classroom context [1,2]. However, mobile apps are constrained by small screen sizes and connectivity problems, and the context of use provides distractions for the user [3]. Developers of mobile apps need to ensure that apps are practical, effective, and easy to use [1]. Usability testing is important in app development in order to understand how mobile apps meet the needs of users [4]. According to the International Organization for Standardization (ISO), usability is defined as “The extent to which a system, product, or service can be used by specified users to achieve specified goals with effectiveness, efficiency, and satisfaction in a specified context of use” [5].

Usability Methods

Usability methods currently referred to in usability studies involve laboratory experiments and field studies [1,6]. Both methods have advantages and disadvantages. Laboratory experiments take place in a usability laboratory, where the test procedure is conducted in a controlled environment. In a laboratory, researchers can record user activity for later analysis while users fulfil predefined tasks [6], and they can control other irrelevant variables [3]. It is, however, not possible to test real-world problems (eg, only brief episodes of available time during clinical placement) or problems with the internet connection. The expense of instruments and dedicated space makes laboratory experiments more costly than other methods [6]. Field studies involve the collection of real-time data from users performing tasks in the real-world environment. In field studies, data about task flows, inefficiencies, and the organizational and physical environments are collected [6]. Field studies allow for data collection within the dynamic nature of the context, which is almost impossible to simulate in a laboratory experiment [1]. However, as users move around in field studies, data collection and conditions are difficult to control [1]. It can also be challenging to collect data in a precise and timely manner [7].

Usability Attributes

Usability attributes are features used to measure the quality of mobile apps [1]. The three most common usability attributes are effectiveness, efficiency, and satisfaction [3], and all three are part of the ISO standard for usability [5]. Other attributes are learnability, memorability, errors, simplicity, comprehensibility, and learning performance [7]. Selecting appropriate usability attributes depends on the nature of the e-learning technology and the research question of the usability study [7]. It is unclear which usability attributes are most relevant to mobile apps for health care students, although Sandars [8] highlighted the following four main domains for usability testing of e-learning: the learner, technological aspects (navigation, learnability, accessibility, consistency, and visual design), instructional design aspects (interactivity, content and resources, media use, and learning strategy design), and the context.

Previous reviews on usability methods examined usability testing in general [9] or usability specifically related to mobile apps [3,6,7,10]. Only one systematic review specifically explored the usability of mobile learning apps [1], although it did not include studies from health care education. Thus, there is a need for an overview of studies reporting on usability evaluations of mobile apps related to health care education. The aim of this study is to perform a scoping review of usability methods and attributes reported in usability studies of mobile apps for health care education.


Methods

Overview

A scoping review summarizes and disseminates research findings to describe the breadth and range of research in a particular topic or field [11-13]. To address the objectives of this scoping review, we will follow the framework for scoping reviews developed by Arksey and O’Malley [11], which was further developed by Levac et al [12] and Khalil et al [13]. We will adopt the following five stages of this framework: (1) identifying the research question; (2) identifying relevant studies; (3) selecting studies; (4) charting the data; and (5) summarizing and reporting the results [11-13]. A detailed presentation of each stage is provided below. This scoping review will also follow the PRISMA-ScR checklist for reporting scoping reviews [14].

Stage 1: Identifying the Research Question

Research questions in a scoping review are broad and aim to summarize the breadth of the evidence, although they should still have a clear scope of inquiry [12]. We have developed two research questions to meet the aim of the study: (1) What usability methods are used to evaluate the usability of mobile apps for health care education? and (2) What usability attributes are reported in usability studies of mobile apps for health care education?

Stage 2: Literature Search (Identifying Relevant Studies)

The term usability is defined and used in multiple ways, which makes it hard to develop a precise search strategy for the term; a broader search may therefore be preferable [15]. Accordingly, the sensitivity of the search (finding as many relevant articles as possible) is prioritized over its specificity (ensuring that retrieved articles are relevant), as recommended in order not to miss relevant articles [16].

We will search the following 10 electronic databases covering technology, education, and health care: Engineering Village (Elsevier), Scopus (Elsevier), ACM Digital Library, IEEE Xplore, Education Resource Information Center (ERIC) (EBSCOhost), PsycINFO (Ovid), CINAHL (EBSCOhost), Medline (Ovid), Embase (Ovid), and Web of Science (Clarivate Analytics). The database searches will be updated before the final analysis. The search strategy was developed in cooperation with a research librarian at Western Norway University of Applied Sciences, and the search string was peer reviewed by another research librarian according to the Peer Review of Electronic Search Strategies (PRESS) guideline [17]. The comprehensive search strategy combines text words and subject headings (eg, MeSH terms) relating to health care students and mobile apps. The Boolean operator OR will be used to combine words of similar meaning, and the Boolean operator AND will be used to combine the different concepts. The search strategy for PsycINFO is presented in Multimedia Appendix 1. We will tailor the search strategy to the other databases and present it in our scoping review.
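
To illustrate the Boolean logic described above, the following minimal Python sketch shows how synonyms within one concept can be combined with OR and how the concept blocks are then combined with AND. The terms shown are illustrative only and do not reproduce the peer-reviewed strategy in Multimedia Appendix 1.

    # Illustrative sketch only; the actual peer-reviewed strategy is in Multimedia Appendix 1.
    # Synonyms within one concept are combined with OR; concept blocks are combined with AND.
    population_terms = ["nursing student*", "medical student*", "health care student*"]
    app_terms = ["mobile app*", "smartphone app*", "mhealth"]

    def or_block(terms):
        # Join synonyms for one concept with OR and wrap the block in parentheses.
        return "(" + " OR ".join(terms) + ")"

    search_string = " AND ".join(or_block(block) for block in [population_terms, app_terms])
    print(search_string)
    # (nursing student* OR medical student* OR health care student*) AND (mobile app* OR smartphone app* OR mhealth)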

We will browse OpenGrey for grey literature. We will perform a citation search in Google Scholar for included studies and screen their reference lists for potentially relevant studies. There will be no language restrictions. Studies published from January 2008 to the date the searches are run will be sought; this year restriction has been chosen because mobile apps did not appear until 2008 [18].

Stage 3: Data Selection (Selecting Studies)

The Rayyan online management software [19] will be used for the selection of eligible studies. Based on the inclusion criteria outlined in Textbox 1, two authors will independently screen the titles and abstracts of studies retrieved from the searches to identify eligible studies. We will include research articles of both quantitative and qualitative designs within the area of health care professional education. Commentaries, discussion papers, book editorials, and conference abstracts will be excluded. Moreover, studies relating to learner management systems, e-learning platforms, open online courses, or distance education will be excluded. A study will proceed to full-text screening if at least one reviewer decides to include it. The full texts of these potentially eligible studies will be retrieved, imported to the EndNote X9 reference management system [20], and independently assessed for eligibility by two review authors. Any eligibility disagreements will be resolved through discussion or with a third reviewer. A flow chart of the study selection process will be presented.


Inclusion criteria

Population: Studies reporting on health care and allied health care students at the undergraduate and postgraduate levels.

Concepts: Studies of usability testing or usability evaluation methods of mobile apps, where the purpose is related to development of the apps. The usability attributes include effectiveness, efficiency, satisfaction, learnability, memorability, errors, simplicity, comprehensibility, and learning performance of the learning app.

Context: Typical educational settings (eg, classroom teaching, clinical placement, and simulation training).

Textbox 1. Study eligibility.
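
As a minimal sketch of the title and abstract screening logic described in Stage 3, the Python fragment below assumes each reviewer's decisions can be exported as a simple mapping from record ID to a decision; the record IDs, field names, and export format are assumptions for illustration, not Rayyan's actual interface.

    # Sketch of the screening logic: a record proceeds to full-text screening if at
    # least one reviewer includes it; conflicting decisions are flagged for discussion.
    reviewer_a = {"rec1": "include", "rec2": "exclude", "rec3": "include"}
    reviewer_b = {"rec1": "include", "rec2": "exclude", "rec3": "exclude"}

    to_full_text = [rid for rid in reviewer_a
                    if reviewer_a[rid] == "include" or reviewer_b.get(rid) == "include"]
    conflicts = [rid for rid in reviewer_a if reviewer_a[rid] != reviewer_b.get(rid)]

    print(to_full_text)  # ['rec1', 'rec3']
    print(conflicts)     # ['rec3']
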
Stage 4: Charting the Data

A standardized, prepiloted data extraction form will be used to extract characteristics and data from the included studies. One review author will extract the data from the included studies, and a second review author will check the extracted data. A combination of Microsoft Excel software [21] and NVivo 12 [22] will be used to facilitate this process. Discrepancies will be identified and resolved through discussion or with a third author when necessary.

The process of extracting information from the included studies in a scoping review is iterative [12,13]. This means that we will extract data on predefined themes, although other relevant information may be added later in the process. Extracted information related to the purpose of the scoping review will include the following (a simple sketch of such an extraction record follows the list):

(1) Study: author(s) name(s), year of publication, title, country, publication journal, study setting, study design, research question, and research methods

(2) Population: number of participants, description of participants, and education level

(3) Concepts: usability methods, usability attributes, modes of delivery, usability phase, materials, procedures, type(s) of location(s), number of usability testing procedures, and modifications

(4) Context: educational setting
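
As a rough Python sketch, one row of the prepiloted extraction form could be represented as shown below; the field names mirror the four categories above but are assumptions and may differ from the final form.

    from dataclasses import dataclass, field
    from typing import List

    # One row of the data extraction form, mirroring the four categories listed above.
    # Field names are illustrative; the actual prepiloted form may differ.
    @dataclass
    class ExtractionRecord:
        # (1) Study
        authors: str
        year: int
        country: str
        study_design: str
        # (2) Population
        n_participants: int
        education_level: str
        # (3) Concepts
        usability_methods: List[str] = field(default_factory=list)     # eg, ["field study"]
        usability_attributes: List[str] = field(default_factory=list)  # eg, ["satisfaction"]
        # (4) Context
        educational_setting: str = ""                                   # eg, "clinical placement"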

Stage 5: Summarizing and Reporting the Results

The fifth stage of the scoping review involves summarizing and reporting the results of the included studies [11-13]. The characteristics of each study will be mapped, and a descriptive narrative account will be presented. We will perform a content analysis [23] to map the different usability methods and usability attributes used in the included studies. Tables and graphical illustrations will be used to bring together and present the usability methods and attributes.
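
As a minimal sketch of this mapping step, assuming the extracted usability methods and attributes are available as lists per included study, the frequencies could be tallied as follows; the example records are invented for illustration.

    from collections import Counter

    # Tally how often each usability method and attribute is reported across the
    # included studies; the counts feed the tables and graphical illustrations.
    records = [
        {"methods": ["laboratory experiment"], "attributes": ["effectiveness", "satisfaction"]},
        {"methods": ["field study"], "attributes": ["satisfaction", "learnability"]},
    ]

    method_counts = Counter(m for r in records for m in r["methods"])
    attribute_counts = Counter(a for r in records for a in r["attributes"])

    print(method_counts.most_common())
    print(attribute_counts.most_common())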

Ethics

This protocol for a scoping review does not require ethical approval or consent to participate. The data consist of published articles and do not include individual-level data.


Results

The electronic searches for eight of the databases were completed on March 5, 2019. The literature search identified 14,297 unique references (Figure 1). Owing to the sensitivity of the search, many of these references were irrelevant and excluded. Following title and abstract screening, full texts of 369 records were obtained. Our next step is to assess these references for eligibility.

Figure 1. Flow chart of the search results and screening process.

Discussion

Usability Studies of Mobile Apps for Health Care Education

The increasing acceptability and use of mobile apps in the health care education context can lead to improved learning outcomes. However, in order to make learning tools relevant to students, mobile apps must meet the expectations of users [4]. To our knowledge, no overview exists on usability studies of mobile apps for health care education. The results of this scoping review will provide valuable information to developers of mobile apps for health care education, as it will point to relevant usability methods and attributes. Furthermore, the review will identify areas where further research is needed.

A strength of this study is the broad search strategy. We searched 10 databases, and the search strategy was designed in collaboration with a research librarian and peer reviewed by another research librarian. The search has a time restriction from 2008, the year mobile apps first appeared [18], but no language restriction. A broad search strategy may be associated with lower precision, meaning that many retrieved articles are irrelevant and screening becomes more demanding. We did, however, experience some challenges with the initial database searches. The authors and research librarians had little experience with databases in academic areas outside health care (eg, Engineering Village and Scopus). “Usability” was not used as a term in the search strategy, as studies on usability do not necessarily refer to or use the term. Designing an effective search strategy that balances sensitivity and precision was therefore demanding; the search was difficult to narrow and yielded 14,297 unique hits. To ensure that members of the review team had a similar understanding of the inclusion and exclusion criteria, efforts were made to calibrate our screening. Methodological rigor and transparency in reporting are important to the trustworthiness of a scoping review [24]. Publishing a protocol of the scoping review supports the transparency of the methodology and will assist in the conduct of the scoping review. Following the reporting guideline for scoping reviews (PRISMA-ScR) [14] will help ensure the methodological quality of the scoping review.

Conclusion

This scoping review will advance the field of mobile app development for health care education by presenting advice on the relevant usability methods and attributes to study. It will give an overview of the field and provide researchers and developers with relevant and important information on the usability research area, including highlighting possible research gaps.

Acknowledgments

Research librarians at Western Norway University of Applied Sciences provided valuable assistance in the development of this scoping review protocol. Gunhild Austrheim, a research librarian, provided substantial guidance in the planning and performance of the database searches. Marianne Nesbjørg Tvedt peer reviewed the search string.

Conflicts of Interest

None declared.

Multimedia Appendix 1

Search string for PsycINFO.

DOCX File , 17 KB

  1. Kumar BA, Mohite P. Usability of mobile learning applications: a systematic literature review. J Comput Educ 2017 Oct 12;5(1):1-17. [CrossRef]
  2. Asarbakhsh M, Sandars J. E-learning: the essential usability perspective. Clin Teach 2013 Feb;10(1):47-50. [CrossRef] [Medline]
  3. Harrison R, Flood D, Duce D. Usability of mobile applications: literature review and rationale for a new usability model. J Interact Sci 2013;1(1):1. [CrossRef]
  4. Paz F, Pow-Sang JA. A Systematic Mapping Review of Usability Evaluation Methods for Software Development Process. IJSEIA 2016 Jan 31;10(1):165-178 [FREE Full text]
  5. ISO 9241-11:2018 Ergonomics of human-system interaction - Part 11: Usability: Definitions and concepts. ISO. 2018.   URL: https://www.iso.org/standard/63500.html [accessed 2020-06-09]
  6. Nayebi F, Desharnais J, Abran A. The state of the art of mobile application usability evaluation. 2012 Presented at: IEEE Canadian Conference on Electrical and Computer Engineering (CCECE); April 29-May 2, 2012; Canada p. 1-4. [CrossRef]
  7. Zhang D, Adipat B. Challenges, Methodologies, and Issues in the Usability Testing of Mobile Applications. International Journal of Human-Computer Interaction 2005 Jul;18(3):293-308. [CrossRef]
  8. Sandars J. The importance of usability testing to allow e-learning to reach its potential for medical education. Educ Prim Care 2010 Jan;21(1):6-8. [CrossRef] [Medline]
  9. Bastien JM. Usability testing: a review of some methodological and technical aspects of the method. Int J Med Inform 2010 Apr;79(4):e18-e23. [CrossRef] [Medline]
  10. Ismail N, Ahmad F, Kamaruddin N, Ibrahim R. A review on usability issues in mobile applications. Journal of Mobile Computing & Application (IOSR-JMCA) 2016;3(3):47-52 [FREE Full text]
  11. Arksey H, O'Malley L. Scoping studies: towards a methodological framework. International Journal of Social Research Methodology 2005 Feb;8(1):19-32. [CrossRef]
  12. Levac D, Colquhoun H, O'Brien KK. Scoping studies: advancing the methodology. Implement Sci 2010 Sep 20;5:69 [FREE Full text] [CrossRef] [Medline]
  13. Khalil H, Peters M, Godfrey CM, McInerney P, Soares CB, Parker D. An Evidence-Based Approach to Scoping Reviews. Worldviews Evid Based Nurs 2016 Apr;13(2):118-123. [CrossRef] [Medline]
  14. Tricco AC, Lillie E, Zarin W, O'Brien KK, Colquhoun H, Levac D, et al. PRISMA Extension for Scoping Reviews (PRISMA-ScR): Checklist and Explanation. Ann Intern Med 2018 Oct 02;169(7):467-473. [CrossRef] [Medline]
  15. Akers J, Aguiar-Ibáñez R, Sari A, Beynon S, Booth A, Burch J. Systematic Reviews: CRD's Guidance for Undertaking Reviews in Health Care. York, England: University of York NHS Centre for Reviews & Dissemination; 2009.
  16. Siddaway AP, Wood AM, Hedges LV. How to Do a Systematic Review: A Best Practice Guide for Conducting and Reporting Narrative Reviews, Meta-Analyses, and Meta-Syntheses. Annu Rev Psychol 2019 Jan 04;70:747-770. [CrossRef] [Medline]
  17. McGowan J, Sampson M, Salzwedel DM, Cogo E, Foerster V, Lefebvre C. PRESS Peer Review of Electronic Search Strategies: 2015 Guideline Statement. J Clin Epidemiol 2016 Jul;75:40-46 [FREE Full text] [CrossRef] [Medline]
  18. Mobile app. Wikipedia. 2020 Apr 14.   URL: https://en.wikipedia.org/wiki/Mobile_app [accessed 2020-06-09]
  19. Ouzzani M, Hammady H, Fedorowicz Z, Elmagarmid A. Rayyan-a web and mobile app for systematic reviews. Syst Rev 2016 Dec 05;5(1):210 [FREE Full text] [CrossRef] [Medline]
  20. Web of Science Group. EndNote X9. Clarivate Analytics. 2020.   URL: https://endnote.com/ [accessed 2020-06-09]
  21. Microsoft Excel. Microsoft Corporation. 2020.   URL: https://microsoft.com [accessed 2020-06-09]
  22. NVivo 12. QSR International. 2020.   URL: https://www.qsrinternational.com/nvivo-qualitative-data-analysis-software/home [accessed 2020-06-09]
  23. Peters M, Godfrey C, Khalil H, McInerney P, Parker D, Soares CB. Guidance for conducting systematic scoping reviews. Int J Evid Based Healthc 2015 Sep;13(3):141-146. [CrossRef] [Medline]
  24. Pham MT, Rajić A, Greig JD, Sargeant JM, Papadopoulos A, McEwen SA. A scoping review of scoping reviews: advancing the approach and enhancing the consistency. Res Synth Methods 2014 Dec;5(4):371-385 [FREE Full text] [CrossRef] [Medline]


Abbreviations

ISO: International Organization for Standardization


Edited by G Eysenbach; submitted 03.04.20; peer-reviewed by B Adusumilli, MA Bahrami; comments to author 19.04.20; revised version received 14.06.20; accepted 14.06.20; published 04.08.20

Copyright

©Susanne Grødem Johnson, Thomas Potrebny, Lillebeth Larun, Donna Ciliska, Nina Rydland Olsen. Originally published in JMIR Research Protocols (http://www.researchprotocols.org), 04.08.2020.

This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Research Protocols, is properly cited. The complete bibliographic information, a link to the original publication on http://www.researchprotocols.org, as well as this copyright and license information must be included.