Published in Vol 11, No 3 (2022): March

Preprints (earlier versions) of this paper are available at https://preprints.jmir.org/preprint/34894.
Dashboards in Health Care Settings: Protocol for a Scoping Review

Protocol

1Department of Internal Medicine, University of Michigan, Ann Arbor, MI, United States

2Institute for Healthcare Policy and Innovation, University of Michigan, Ann Arbor, MI, United States

3Veterans Affairs Ann Arbor Center for Clinical Management Research, Ann Arbor, MI, United States

4Department of Learning Health Sciences, University of Michigan, Ann Arbor, MI, United States

5Department of Psychiatry, University of Michigan, Ann Arbor, MI, United States

6Taubman Health Sciences Library, University of Michigan, Ann Arbor, MI, United States

7Department of Medicine, UCLA Health, Los Angeles, CA, United States

8University of Michigan Medical School, Ann Arbor, MI, United States

*these authors contributed equally

Corresponding Author:

Danielle Helminski, MPH

Department of Internal Medicine

University of Michigan

NCRC Building 14

2800 Plymouth Road

Ann Arbor, MI, 48109

United States

Phone: 1 7346153952

Email: dhelmins@umich.edu


Background: Health care organizations increasingly depend on business intelligence tools, including “dashboards,” to capture, analyze, and present data on performance metrics. Ideally, dashboards allow users to quickly visualize actionable data to inform and optimize clinical and organizational performance. In reality, dashboards are typically embedded in complex health care organizations with massive data streams and end users with distinct needs. Thus, designing effective dashboards is a challenging task and theoretical underpinnings of health care dashboards are poorly characterized; even the concept of the dashboard remains ill-defined. Researchers, informaticists, clinical managers, and health care administrators will benefit from a clearer understanding of how dashboards have been developed, implemented, and evaluated, and how the design, end user, and context influence their uptake and effectiveness.

Objective: This scoping review first aims to survey the vast published literature on “dashboards” to describe where, why, and for whom they are used in health care settings, as well as how they are developed, implemented, and evaluated. Further, we will examine how dashboard design and content are informed by the intended purpose and end users.

Methods: In July 2020, we searched MEDLINE, Embase, Web of Science, and the Cochrane Library for peer-reviewed literature using a targeted strategy developed with a research librarian and retrieved 5188 results. Following deduplication, 3306 studies were screened in duplicate for title and abstract. Any abstracts mentioning a health care dashboard were retrieved in full text and are undergoing duplicate review for eligibility. Articles will be included for data extraction and analysis if they describe the development, implementation, or evaluation of a dashboard that was successfully used in routine workflow. Articles will be excluded if they were published before 2015, the full text is unavailable, they are in a non-English language, or they describe dashboards used for public health tracking, in settings where direct patient care is not provided, or in undergraduate medical education. Any discrepancies in eligibility determination will be adjudicated by a third reviewer. We chose to focus on articles published after 2015 and those that describe dashboards that were successfully used in routine practice to identify the most recent and relevant literature to support future dashboard development in the rapidly evolving field of health care informatics.

Results: All articles have undergone dual review for title and abstract, with a total of 2019 articles mentioning use of a health care dashboard retrieved in full text for further review. We are currently reviewing all full-text articles in duplicate. We aim to publish findings by mid-2022. Findings will be reported following guidance from the PRISMA-ScR (Preferred Reporting Items for Systematic Reviews and Meta-Analyses extension for Scoping Reviews) checklist.

Conclusions: This scoping review will provide stakeholders with an overview of existing dashboard tools, highlighting the ways in which dashboards have been developed, implemented, and evaluated in different settings and for different end user groups, and identify potential research gaps. Findings will guide efforts to design and use dashboards in the health care sector more effectively.

International Registered Report Identifier (IRRID): DERR1-10.2196/34894

JMIR Res Protoc 2022;11(3):e34894

doi:10.2196/34894


Background

Effectively measuring, monitoring, and responding to metrics about health-related decisions, practices, and outcomes has become an essential business function for modern health care organizations. Nimble health care organizations employ data for all manner of daily operational decision-making, ranging from supply chain management and staff scheduling to individual treatment planning and population health management [1]. For certain key performance metrics, payers have linked reimbursement to value-based payment programs [2] and accrediting bodies have required monitoring and disclosure of performance for accreditation or certification [3], incentivizing organizations to effectively monitor and track their performance against established benchmarks [4]. With the rapid proliferation of electronic health records, there is an abundance of patient- and provider-level data to use for assessing performance [5-7]. At the same time, vast data alone are of little use without systems to derive timely and actionable insights.

Health systems have increasingly adopted business intelligence software to track performance metrics in an automated way [8]. These applications have been defined by Loewen and Roudsari [9] as “specialized tools to collect, analyze, and present organizational data to operational leaders in user-friendly format(s) to support organizational objectives.” One such tool that has seen considerable expansion in health care settings is the “dashboard,” a business intelligence tool that uses data visualization to provide actionable feedback to improve performance, adherence to evidence-based practices, workflow management, and resource utilization [10,11]. Dashboards often display performance trends, peer comparisons, benchmarks, or goals, and use visual elements such as graphs and color-coding to improve interpretability [12].

To create an effective dashboard, developers must make multiple complex decisions. End users’ information needs are highly contextual and depend on the clinical setting, professional roles, and the patient population, all of which shape the selection of appropriate data elements, visualizations, and interactivity [13-15]. For example, health care executives may prefer graphic performance trends over weeks or months, whereas clinicians working with vulnerable patient groups may require real-time, patient-level health data so they can intervene quickly if needed. Numerous techniques for developing dashboards and selecting key metrics have been described, including focus groups, iterative usability testing, and the Delphi method [16,17]. More sophisticated dashboards also incorporate forecasting and decision support, which carry their own challenges [18,19]. However, the range of strategies used to address these essential steps in dashboard development, and which are most common, remain unknown.

Developing effective dashboards tailored to the needs of the intended end user is only the first step in the health care performance improvement cycle. Developers and organizational leadership must also employ implementation strategies to promote uptake and use of the dashboard, such as the identification and involvement of “champions,” ongoing training of end users, and changes in policy that mandate or incentivize dashboard use [20]. As development and maintenance of data-rich business intelligence tools, like dashboards, can be time- and resource-intensive, it is essential that these tools both function effectively and result in measurable improvements. Iterative evaluation of dashboard performance throughout development and implementation and beyond is critical to identify user- and system-level barriers to use as well as potential errors that may only be identified after extended use.

In this scoping review, we will survey peer-reviewed literature to describe the contexts in which dashboards have been used in health care settings, as well as how they were developed, implemented, and evaluated.

Aims and Comparison With Prior Work

This scoping review will provide a narrative overview of design elements and characteristics of health care dashboards, including where they exist geographically, the intended end users, the information presented, whether and how the end user and setting impact dashboard design, and the processes used for development, implementation, and evaluation. Although previous reviews of health care dashboards have focused on identifying important design features and the effectiveness of dashboards in improving patient outcomes and clinician satisfaction [11,14,15,21,22], an updated review of how dashboard tools are used, and by whom, will provide meaningful insight into how the intended end user and setting impact the design, development, and implementation of the dashboard (ie, the relationship of form and function). This information is essential to provide insights into (1) how and why dashboards work in different settings for different users, allowing relevant stakeholders to make more informed decisions about where to implement them, and (2) how to effectively design dashboards based on their intended purpose and target audience. Given the rapidly evolving field of health informatics, this scoping review will also provide insight into the latest trends in dashboards, from initial conception and development through implementation and evaluation. Previous reviews of dashboards have included articles published only as recently as 2017 [11,14,21-23].


Study Design

The aims of this study can be best accomplished through a scoping review, which differs from a systematic review in that scoping reviews generally have a broader scope and are exploratory, not requiring critical quantitative appraisal of synthesized findings [24,25]. For this study, we will follow the framework for conducting scoping reviews developed by Arksey & O’Malley [26] and further refined by Levac et al [27]. A description of each step is provided below.

Step 1. Identifying the Research Questions

The key research questions, which were established through a process of team discussions and preliminary searches of the literature on health care dashboards, are as follows:

  1. What design features are most frequently incorporated in health care dashboards?
  2. For what purposes are dashboards developed in health care settings?
  3. Where, and by whom, are dashboards used?
  4. What processes and/or frameworks are used for development, implementation, and evaluation of dashboards?

Step 2. Identifying Relevant Studies

We searched MEDLINE, Embase, Web of Science, and the Cochrane Library databases in July 2020 for relevant articles using comprehensive search strategies for each database that were developed in collaboration with a research librarian (MLC) and are available in Multimedia Appendix 1. These databases were selected since they represent a broad sample of literature relevant to the health sciences. Search terms included a variety of keywords and medical subject headings (MeSH) related to clinical health care and information technology. Search strategies were developed around the following key terms: “dashboard,” “information technology,” “healthcare,” “electronic health record,” “electronic medical record,” “quality,” “safety,” “key performance indicators,” “decision making,” “decision support,” “benchmark,” and “informatics.” Boolean operators “AND,” “OR,” and “NOT” were used to construct each search, with “NOT” operators used to reduce the number of results related to automotive and learning analytics dashboards. No date, language, or other restrictions were imposed in the database searches. Grey literature sources were not searched.
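As an illustration of how these Boolean operators combine, the following Python sketch assembles a simplified, hypothetical query in the style described above. The specific terms and grouping here are illustrative assumptions only; the actual database-specific strategies appear in Multimedia Appendix 1.

```python
# Hypothetical, simplified illustration of the Boolean structure of the
# database searches; the real strategies appear in Multimedia Appendix 1.
dashboard_terms = ['"dashboard"', '"key performance indicators"', '"benchmark"']
setting_terms = ['"healthcare"', '"electronic health record"', '"electronic medical record"']
noise_terms = ['"automotive"', '"learning analytics"']  # excluded with NOT

query = (
    "(" + " OR ".join(dashboard_terms) + ")"
    + " AND (" + " OR ".join(setting_terms) + ")"
    + " NOT (" + " OR ".join(noise_terms) + ")"
)
print(query)
```

In practice, each database requires its own syntax (eg, MeSH expansion in MEDLINE, Emtree in Embase), which is why separate strategies were developed per database.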

Step 3. Study Selection

All articles retrieved by the search were imported into and initially reviewed using Covidence [28], a screening and data extraction tool adopted by Cochrane in 2015 as the standard platform for producing Cochrane Reviews. Titles and abstracts were each independently screened by 2 of 4 authors (DH, ADR, MLC, OJG) to identify potentially eligible studies. All articles that mentioned use of a “dashboard” in a health care setting were reviewed in full text and are currently undergoing duplicate review by 2 of 7 authors (DH, ADR, MLC, OJG, ANK, RG, AR) to determine eligibility, applying the inclusion and exclusion criteria listed in Textbox 1. We excluded articles published prior to 2015 and those describing dashboards that were not successfully used in routine workflow or were only used in a pretesting environment. We believe these exclusions are justified as rapid advancements in technology warrant a focus on newer research that is more likely to be reproducible. Additionally, limiting our analysis to dashboards that were successfully implemented or used outside of a pretesting environment provides a clearer view of existing barriers and facilitators to designing and implementing dashboards in real-world practice. Any disagreements that arise during full-text screening will be resolved through adjudication by a third author. For any studies that are reviewed in full text but not deemed eligible for inclusion in the scoping review, a reason for exclusion will be documented and provided with the results of the scoping review in a PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) flow diagram.

Eligibility criteria for full-text review.

Inclusion criteria

  • Peer-reviewed articles that describe the development, implementation, and/or evaluation of a dashboard used in a health care setting outside of a pretesting environment. Health care settings include clinics, hospitals, health systems, or any other settings where medical care is provided. Both quantitative and qualitative evaluations of dashboards will be included.

Exclusion criteria

  • Non–English language publication
  • Articles published prior to 2015
  • Articles that describe pretesting of pilot or prototype dashboards that were not successfully implemented or used outside of a testing environment
  • Articles that describe public health dashboards used for geographic tracking of disease or for comparing city- or country-level data, rather than for clinical or management-level decision-making in a health care setting where patient care is provided
  • Articles that describe dashboards used in undergraduate medical education, or in educational contexts where there is no direct association with patients, patient care, or facility management
  • Articles for which the full-text manuscript is unavailable
Textbox 1. Eligibility criteria for full-text review.

Step 4. Charting the Data

A preliminary list of data elements for charting is presented in Textbox 2. However, in accordance with recommendations from Levac et al [27], an iterative process will be used to identify additional elements for data extraction and analysis as the study progresses. Using an iterative process improves the quality of the review by allowing reviewers to gain familiarity with included studies and add or revise data extraction elements accordingly. A standardized data extraction form will be developed and reviewed by all authors. The form will be pilot tested by two authors who will independently complete data extraction for a subset of articles to ensure consistency among extractors. Once a high level of agreement is achieved between extractors, the pilot extraction form will be approved, and two authors will independently extract data from each included study. Any disagreements in data extraction will be resolved by discussion between the two authors; if the reviewers are unable to reach consensus, a third author will serve as arbiter.
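Consistency between extractors during pilot testing can be quantified with simple agreement statistics. The protocol does not specify a metric, so the following Python sketch, using hypothetical codes for one extraction element, is one plausible approach: percent agreement and Cohen's kappa for 2 extractors.

```python
from collections import Counter

def agreement_stats(codes_a, codes_b):
    """Percent agreement and Cohen's kappa for two raters' categorical codes."""
    assert len(codes_a) == len(codes_b)
    n = len(codes_a)
    observed = sum(a == b for a, b in zip(codes_a, codes_b)) / n
    # Chance agreement: product of each rater's marginal proportions per category
    freq_a, freq_b = Counter(codes_a), Counter(codes_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return observed, (observed - expected) / (1 - expected)

# Hypothetical codes for the "usability testing conducted" element
rater1 = ["yes", "yes", "no", "no", "yes"]
rater2 = ["yes", "no", "no", "no", "yes"]
obs, kappa = agreement_stats(rater1, rater2)
```

A kappa threshold (eg, >0.8) could serve as the "high level of agreement" criterion before the pilot extraction form is approved, although the protocol leaves that judgment to the team.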

Preliminary list of data extraction elements.

Article information

  • Title
  • Author
  • Publication year
  • Journal
  • Study type

Contextual factors

  • Geographic location of the described dashboard
  • Health care setting
  • Intended end user(s)

Primary purpose or goal of the dashboard

  • Reason stated for development or use of dashboard

Development

  • Software used
  • Framework(s) used to guide development or pretesting
  • Usability testing conducted
  • Involvement of users in development process

Implementation

  • Adjunct strategies used in conjunction with dashboard (such as academic detailing, audit and feedback, or financial incentivization)
  • Identification of potential barriers and facilitators to use of dashboard prior to implementation
  • Identification and involvement of champions
  • Training of stakeholders or distribution of educational materials on how to use the dashboard
  • Protocol or policy changes that mandate use of the dashboard

Evaluation

  • Type of evaluation (qualitative or quantitative)

Design features

  • Format (including delivery channel and timing)
    • Frequency of data updates
    • Use of visual elements
    • Delivery channel (eg, website, email, wall display)
  • Information content
    • Descriptions of performance summary data (including indicators, time intervals, comparators, and their performance levels)
    • Patient lists (typically patients who have actionable data, such as guideline-discordant care; “yes” or “no”)
    • Patient-level data (“yes” or “no”)
    • Recommended actions (“yes” or “no”)
    • Metrics or evaluation based on benchmarks established by accrediting bodies, health care payer organizations, or national guidelines (“yes” or “no”)
  • Functionality (“yes” or “no”)
    • Multilevel presentation of data
    • Interface customizability
    • Goal setting/action planning
    • Task performance (ie, ordering, flagging, prescribing)
Textbox 2. Preliminary list of data extraction elements.

Step 5. Collating, Summarizing, and Reporting the Results

Data extraction will be performed using Microsoft Excel (Microsoft Corp). The data elements for each dashboard identified will be displayed and coded in a spreadsheet, which will be used for analysis consisting mainly of counts. This scoping review will follow the PRISMA-ScR (Preferred Reporting Items for Systematic Reviews and Meta-Analyses extension for Scoping Reviews) checklist [29] for reporting of methods and outcomes.
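Although the protocol specifies a spreadsheet, the same counts-based analysis could equivalently be scripted. This Python sketch, using hypothetical extracted records (the real charting will use the Textbox 2 elements), tallies one charted element across included studies:

```python
from collections import Counter

# Hypothetical extracted records; field names are illustrative only
records = [
    {"setting": "hospital", "end_user": "clinician"},
    {"setting": "clinic", "end_user": "clinician"},
    {"setting": "hospital", "end_user": "administrator"},
    {"setting": "health system", "end_user": "administrator"},
]

# Count how often each health care setting appears among included dashboards
setting_counts = Counter(r["setting"] for r in records)
print(setting_counts.most_common())
```

The same pattern extends to any categorical element in Textbox 2 (end user, delivery channel, yes/no functionality flags), producing the frequency tables a narrative synthesis would report.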


In July 2020, electronic database searches were completed using the search strategies outlined in Multimedia Appendix 1, and 5188 results were retrieved and imported into Zotero reference management software (version 5.0.96.2; Corporation for Digital Scholarship) for management of records and retrieval of full-text articles prior to upload into Covidence online screening software [28]. After removal of duplicate results in Covidence [28], there were 3306 articles identified for title and abstract screening. A total of 2019 articles were retrieved for full-text review and will be reviewed in duplicate for eligibility. We aim to finish the review and draft the final report by mid-2022. Findings will be summarized in a narrative fashion, using tables and graphs to illustrate key characteristics of dashboards in health care, and will be submitted for publication along with the completed PRISMA-ScR reporting checklist.
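The screening flow reported above can be reconstructed arithmetically. This small Python sketch derives the intermediate counts that a PRISMA flow diagram would display from the numbers stated in this protocol:

```python
# Screening flow counts reported in this protocol
records_retrieved = 5188      # database search results, July 2020
after_deduplication = 3306    # screened on title and abstract
full_text_retrieved = 2019    # mention a health care dashboard

duplicates_removed = records_retrieved - after_deduplication
excluded_at_title_abstract = after_deduplication - full_text_retrieved

print(f"Duplicates removed: {duplicates_removed}")
print(f"Excluded at title/abstract screening: {excluded_at_title_abstract}")
```

That is, 1882 duplicate records were removed and 1287 articles were excluded at title and abstract screening; the final flow diagram will additionally report full-text exclusions and reasons.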


Future Planned Work

Currently, the available literature lacks standard, consensus hierarchical descriptions of the different types of health care dashboards in use and their distinct design and implementation processes [15,30]. As the use of dashboards continues to increase, it is important for stakeholders to be able to communicate effectively with the designers and users of these tools. The authors intend to use the findings of this scoping review to inform the development of a taxonomy of the various types of dashboards a health care organization may choose to employ. This taxonomy will identify the design elements relevant to each type of dashboard, building evidence about health care dashboard usability and purpose of use for stakeholders, including end users. Finally, the review will provide evidence of the extent to which rigorous practices are used in the development, implementation, and evaluation of health care dashboards, each of which ultimately contributes to a dashboard’s success.

The findings of this scoping review will additionally inform the design of a future meta-analysis and meta-synthesis of dashboard evaluations, where the heterogeneity of the studies identified in this scoping review permits.

Limitations

This scoping review methodology has several limitations. First, the search strategy does not include grey literature or conference abstracts since these are expected to provide insufficient detail for the data elements we plan to extract. This may cause some dashboards described in government and committee reports, dissertations, and conference proceedings to be overlooked. However, since the data extracted will mainly be summarized, and since we are not evaluating causal effects or performing quantitative analyses, which would be more susceptible to publication bias, we do not expect this to be a major limitation. Second, because of the inclusion criteria, our findings will be most applicable to dashboards used in settings that provide direct health care; they will be less informative about public health tracking dashboards, including those used to monitor the COVID-19 pandemic and to perform contact tracing [31,32].

Conclusion

Health information technology continues to rapidly change the way health care organizations operate, and dashboards are an increasingly common tool. It is essential that key stakeholders have a clear understanding of what dashboards are, and which features are essential to specific end users for dashboard development. This scoping review will advance the field of health informatics by providing organizational leaders, clinical staff, dashboard developers, and quality improvement researchers with a clear and concise overview of the literature in this field, and by highlighting research gaps.

Acknowledgments

This scoping review was funded by the National Institute of Diabetes and Digestive and Kidney Diseases through a K23 award (K23DK118179) to JEK, and the US Department of Veterans Affairs (1 I50 HX003251-01) Maintaining Implementation Through Dynamic Adaptations (MIDAS; QUE 20-025). The funders played no role in the study design, decision to publish, or drafting of the manuscript.

Authors' Contributions

DH contributed to the study concept and design, pretesting and refinement of search strategy, citation and database management, screening of identified articles, and drafting of manuscript. JEK contributed to study concept and design, pretesting and refinement of search strategy, and drafting of manuscript. ADR contributed to study concept and design, screening of identified articles, and critical revision of the manuscript. JS, LJD, ZLL, and PNP contributed to study concept and design and critical revision of the manuscript. MLC contributed to pretesting and refinement of search strategy, screening of identified articles, and critical revision of the manuscript. OJG contributed to study concept and design and screening of identified articles. ANK, RG, and AR screened identified articles. All authors read and approved the paper for submission.

Conflicts of Interest

None declared.

Multimedia Appendix 1

Database search strategies.

DOCX File , 14 KB

  1. Unlocking Business Intelligence in Healthcare. Tableau.   URL: https://www.tableau.com/learn/articles/business-intelligence/healthcare [accessed 2020-11-30]
  2. What Is Pay for Performance in Healthcare? NEJM Catalyst, Innovations in Care Delivery. 2018 Mar 01.   URL: https://catalyst.nejm.org/doi/abs/10.1056/CAT.18.0245 [accessed 2020-09-18]
  3. 2021 ORYX Performance Measure Reporting Requirements. The Joint Commission. 2020 Oct.   URL: https:/​/www.​jointcommission.org/​-/​media/​tjc/​documents/​measurement/​oryx/​cy2021-oryx-reporting-requirements-oct2020.​pdf [accessed 2021-03-25]
  4. Kyeremanteng K, Robidoux R, D'Egidio G, Fernando SM, Neilipovitz D. An Analysis of Pay-for-Performance Schemes and Their Potential Impacts on Health Systems and Outcomes for Patients. Crit Care Res Pract 2019;2019:8943972 [FREE Full text] [CrossRef] [Medline]
  5. Office-based Physician Electronic Health Record Adoption. Office of the National Coordinator for Health Information Technology. 2019 Jan.   URL: https://www.healthit.gov/data/quickstats/office-based-physician-electronic-health-record-adoption [accessed 2020-12-01]
  6. Non-federal Acute Care Hospital Electronic Health Record Adoption. Office of the National Coordinator for Health Information Technology. 2017 Sep.   URL: https:/​/www.​healthit.gov/​data/​quickstats/​non-federal-acute-care-hospital-electronic-health-record-adoption [accessed 2020-12-01]
  7. Atasoy H, Greenwood BN, McCullough JS. The Digitization of Patient Care: A Review of the Effects of Electronic Health Records on Health Care Quality and Utilization. Annu Rev Public Health 2019 Apr 01;40:487-500. [CrossRef] [Medline]
  8. Bonney W. Applicability of Business Intelligence in Electronic Health Record. Procedia - Social and Behavioral Sciences 2013 Feb;73:257-262. [CrossRef]
  9. Loewen L, Roudsari A. Evidence for Business Intelligence in Health Care: A Literature Review. Stud Health Technol Inform 2017;235:579-583. [CrossRef] [Medline]
  10. Rivas C, Tkacz D, Antao L, Mentzakis E, Gordon M, Anstee S, et al. Automated analysis of free-text comments and dashboard representations in patient experience surveys: a multimethod co-design study. Health Serv Deliv Res 2019 Jul;7(23):1-160. [CrossRef] [Medline]
  11. Buttigieg SC, Pace A, Rathert C. Hospital performance dashboards: a literature review. J of Health Org and Mgt 2017 May 15;31(3):385-406. [CrossRef] [Medline]
  12. Making Healthcare Safer III. A Critical Analysis of Existing and Emerging Patient Safety Practices. Agency for Healthcare Research and Quality. 2020 Mar.   URL: https:/​/www.​ahrq.gov/​sites/​default/​files/​wysiwyg/​research/​findings/​making-healthcare-safer/​mhs3/​making-healthcare-safer-III.​pdf [accessed 2020-09-03]
  13. Panicker V, Lee D, Wetmore M, Rampton J, Smith R, Moniz M, et al. Designing Tailored Displays for Clinical Practice Feedback: Developing Requirements with User Stories. Stud Health Technol Inform 2019 Aug 21;264:1308-1312 [FREE Full text] [CrossRef] [Medline]
  14. Khairat SS, Dukkipati A, Lauria HA, Bice T, Travers D, Carson SS. The Impact of Visualization Dashboards on Quality of Care and Clinician Satisfaction: Integrative Literature Review. JMIR Hum Factors 2018 May 31;5(2):e22. [CrossRef] [Medline]
  15. Vazquez-Ingelmo A, Garcia-Penalvo FJ, Theron R. Information Dashboards and Tailoring Capabilities - A Systematic Literature Review. IEEE Access 2019;7:109673-109688. [CrossRef]
  16. Laurent G, Moussa MD, Cirenei C, Tavernier B, Marcilly R, Lamer A. Development, implementation and preliminary evaluation of clinical dashboards in a department of anesthesia. J Clin Monit Comput 2020 May 16;35(3):617-626. [CrossRef] [Medline]
  17. Bunch K, Allin B, Jolly M, Hardie T, Knight M. Developing a set of consensus indicators to support maternity service quality improvement: using Core Outcome Set methodology including a Delphi process. BJOG: Int J Obstet Gy 2018 Jun 14;125(12):1612-1618. [CrossRef] [Medline]
  18. Mould DR, Upton RN, Wojciechowski J, Phan BL, Tse S, Dubinsky MC. Dashboards for Therapeutic Monoclonal Antibodies: Learning and Confirming. AAPS J 2018 Jun 14;20(4):76. [CrossRef] [Medline]
  19. Eser A, Primas C, Reinisch S, Vogelsang H, Novacek G, Mould DR, et al. Prediction of Individual Serum Infliximab Concentrations in Inflammatory Bowel Disease by a Bayesian Dashboard System. The Journal of Clinical Pharmacology 2018 Jan 30;58(6):790-802. [CrossRef] [Medline]
  20. Waltz TJ, Powell BJ, Matthieu MM, Damschroder LJ, Chinman MJ, Smith JL, et al. Use of concept mapping to characterize relationships among implementation strategies and assess their feasibility and importance: results from the Expert Recommendations for Implementing Change (ERIC) study. Implementation Sci 2015 Aug 7;10(1):1-8. [CrossRef] [Medline]
  21. Dowding D, Randell R, Gardner P, Fitzpatrick G, Dykes P, Favela J, et al. Dashboards for improving patient care: Review of the literature. International Journal of Medical Informatics 2015 Feb;84(2):87-100. [CrossRef] [Medline]
  22. Rivas C, Tkacz D, Laurence A, Mentzakis E, Gordon M, Anstee S, et al. Scoping review of clinical digital toolkit design. In: Automated analysis of free-text comments and dashboard representations in patient experience surveys: a multimethod co-design study. Southampton (UK): NIHR Journals Library; Jul 2019.
  23. Wilbanks BA, Langford PA. A review of dashboards for data analytics in nursing. Comput Inform Nurs 2014 Nov;32(11):545-549. [CrossRef] [Medline]
  24. Aromataris E, Munn Z. JBI Manual for Evidence Synthesis. Joanna Briggs Institute.   URL: https://jbi-global-wiki.refined.site/space/MANUAL [accessed 2020-08-18]
  25. Munn Z, Peters MDJ, Stern C, Tufanaru C, McArthur A, Aromataris E. Systematic review or scoping review? Guidance for authors when choosing between a systematic or scoping review approach. BMC Med Res Methodol 2018 Dec 19;18(1):143 [FREE Full text] [CrossRef] [Medline]
  26. Arksey H, O'Malley L. Scoping studies: towards a methodological framework. International Journal of Social Research Methodology 2005 Feb;8(1):19-32. [CrossRef]
  27. Levac D, Colquhoun H, O'Brien KK. Scoping studies: advancing the methodology. Implement Sci 2010;5:69 [FREE Full text] [CrossRef] [Medline]
  28. Covidence. Melbourne, Australia: Veritas Health Information   URL: https://www.covidence.org [accessed 2020-04-01]
  29. Tricco AC, Lillie E, Zarin W, O'Brien KK, Colquhoun H, Levac D, et al. PRISMA Extension for Scoping Reviews (PRISMA-ScR): Checklist and Explanation. Ann Intern Med 2018 Oct 02;169(7):467-473 [FREE Full text] [CrossRef] [Medline]
  30. Barnum T, Vaez K, Cesarone D, Yingling CT. Your Data Looks Good on a Dashboard. Health Information and Management Systems Society (HIMSS).   URL: https://www.himss.org/resources/your-data-looks-good-dashboard [accessed 2021-03-25]
  31. Berry I, Soucy JR, Tuite A, Fisman D, COVID-19 Canada Open Data Working Group. Open access epidemiologic data and an interactive dashboard to monitor the COVID-19 outbreak in Canada. CMAJ 2020 Apr 14;192(15):E420 [FREE Full text] [CrossRef] [Medline]
  32. Verhagen MD, Brazel DM, Dowd JB, Kashnitsky I, Mills MC. Forecasting spatial, socioeconomic and demographic variation in COVID-19 health care demand in England and Wales. BMC Med 2020 Jun 29;18(1):203 [FREE Full text] [CrossRef] [Medline]


MeSH: medical subject headings
PRISMA: Preferred Reporting Items for Systematic Reviews and Meta-Analyses
PRISMA-ScR: Preferred Reporting Items for Systematic Reviews and Meta-Analyses extension for Scoping Reviews


Edited by G Eysenbach; submitted 11.11.21; peer-reviewed by T Ndabu, F Velayati; comments to author 28.12.21; revised version received 11.01.22; accepted 11.01.22; published 02.03.22

Copyright

©Danielle Helminski, Jacob E Kurlander, Anjana Deep Renji, Jeremy B Sussman, Paul N Pfeiffer, Marisa L Conte, Oliver J Gadabu, Alex N Kokaly, Rebecca Goldberg, Allison Ranusch, Laura J Damschroder, Zach Landis-Lewis. Originally published in JMIR Research Protocols (https://www.researchprotocols.org), 02.03.2022.

This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Research Protocols, is properly cited. The complete bibliographic information, a link to the original publication on https://www.researchprotocols.org, as well as this copyright and license information must be included.