Published on 09.12.2021 in Vol 10, No 12 (2021): December

Preprints (earlier versions) of this paper are available at https://preprints.jmir.org/preprint/31995.
A Web-Based Service Delivery Model for Communication Training After Brain Injury: Protocol for a Mixed Methods, Prospective, Hybrid Type 2 Implementation-Effectiveness Study


Protocol

Corresponding Author:

Melissa Miao, BAppSc (Hons)

University of Technology Sydney

PO Box 123 Broadway, Ultimo

Sydney, 2007

Australia

Phone: 61 2 95147348

Email: melissa.miao@uts.edu.au


Background: Acquired brain injuries (ABIs) commonly cause cognitive-communication disorders, which can have a pervasive psychosocial impact on a person’s life. More than 135 million people worldwide currently live with ABI, and this large and growing burden is increasingly surpassing global rehabilitation service capacity. A web-based service delivery model may offer a scalable solution. The Social Brain Toolkit is an evidence-based suite of 3 web-based communication training interventions for people with ABI and their communication partners. Successful real-world delivery of web-based interventions such as the Social Brain Toolkit requires investigation of intervention implementation in addition to efficacy and effectiveness.

Objective: The aim of this study is to investigate the implementation and effectiveness of the Social Brain Toolkit as a web-based service delivery model.

Methods: This is a mixed methods, prospective, hybrid type 2 implementation-effectiveness study, theoretically underpinned by the Nonadoption, Abandonment, Scale-up, Spread, and Sustainability (NASSS) framework of digital health implementation. We will document implementation strategies preemptively deployed to support the launch of the Social Brain Toolkit interventions, as well as implementation strategies identified by end users through formative evaluation of the Social Brain Toolkit. We will prospectively observe implementation outcomes, selected on the basis of the NASSS framework, through quantitative web analytics of intervention use, qualitative and quantitative pre- and postintervention survey data from all users within a specified sample frame, and qualitative interviews with a subset of users of each intervention. Qualitative implementation data will be deductively analyzed against the NASSS framework. Quantitative implementation data will be analyzed descriptively. We will obtain effectiveness outcomes through web-based knowledge tests, custom user questionnaires, and formal clinical tools. Quantitative effectiveness outcomes will be analyzed through descriptive statistics and the Reliable Change Index, with repeated measures analysis of variance across 3 time points (pretraining, posttraining, and follow-up), to determine whether there is any significant improvement within this participant sample.

Results: Data collection commenced on July 2, 2021, and is expected to conclude on June 1, 2022, after a 6-month sample frame of analytics for each Social Brain Toolkit intervention. Data analysis will occur concurrently with data collection until mid-2022, with results expected for publication in late 2022 and early 2023.

Conclusions: End-user evaluation of the Social Brain Toolkit’s implementation can guide intervention development and implementation to reach and meet community needs in a feasible, scalable, sustainable, and acceptable manner. End user feedback will be directly incorporated and addressed wherever possible in the next version of the Social Brain Toolkit. Learnings from these findings will benefit the implementation of this and future web-based psychosocial interventions for people with ABI and other populations.

Trial Registration: Australia and New Zealand Clinical Trials Registry ACTRN12621001170819; https://anzctr.org.au/Trial/Registration/TrialReview.aspx?ACTRN=12621001170819, Australia and New Zealand Clinical Trials Registry ACTRN12621001177842; https://anzctr.org.au/Trial/Registration/TrialReview.aspx?ACTRN=12621001177842, Australia and New Zealand Clinical Trials Registry ACTRN12621001180808; https://anzctr.org.au/Trial/Registration/TrialReview.aspx?ACTRN=12621001180808

International Registered Report Identifier (IRRID): DERR1-10.2196/31995

JMIR Res Protoc 2021;10(12):e31995

doi:10.2196/31995


Background

More than 135 million people worldwide currently live with acquired brain injuries (ABIs), including traumatic brain injury and stroke [1]. ABIs commonly cause cognitive-communication disorders in which underlying problems with working memory, organization, executive function, self-regulation, or a combination of these, affect the communication skills needed for everyday exchanges such as conversations, explanations, and stories [2]. Cognitive-communication disorders can thus have a pervasive impact on a person’s social participation and relationships [3], employment [4,5], and mental health [6], while presenting concurrent health, psychosocial, and economic challenges for their families and communities [7-9].

The growing psychosocial burden of ABI increasingly surpasses the global rehabilitation service capacity to address it [1], including national-scale shortages in public speech-language pathology services to manage communication difficulty [10]. At the same time, families of people with ABI, particularly those from rural and remote areas, experience logistical and access challenges when seeking face-to-face health care, leading carers to express their need for locally accessible support [11]. The equitable and scalable delivery of communication rehabilitation for people with ABI is therefore a global health service challenge [1], demanding consideration of alternative and complementary service delivery models to meet the psychosocial needs of this population and their communities now and into the future.

In response to these challenges, an evidence-based suite of web-based interventions known as the Social Brain Toolkit [12] is currently in development. The project was cocreated with stakeholders, including people with ABI and their communication partners, clinicians, partnering organizations, and policy makers. These stakeholders have been attending, and are included in, regular steering and advisory committee meetings. They have provided feedback on early prototypes and have been involved in the planning of implementation strategies and now the formative evaluation of the Social Brain Toolkit products. The aim of the Social Brain Toolkit is to provide scalable communication training to people with ABI and their communication partners, including family members, friends, partners, paid support workers, and clinicians. The Social Brain Toolkit comprises 3 web-based interventions: (1) convers-ABI-lity, a conversation skills training program for adults with ABI and familiar communication partners such as family members, partners, and friends; (2) interact-ABI-lity, web-based communication training for unfamiliar communication partners of people with ABI, such as paid support workers and the general public; and (3) social-ABI-lity, social media training for people with ABI seeking to communicate and connect on the web. interact-ABI-lity and social-ABI-lity are self-directed web-based courses, whereas convers-ABI-lity includes self-directed web-based content between telehealth sessions with a speech-language pathologist. The need for and format of these communication training courses were identified together with stakeholders, including people with ABI and their communication partners, with the aim of improving the quality of life and psychosocial outcomes of people with ABI and their communities.

The communication skills of communication partners can have a positive or detrimental effect on the communication skills of people with ABI [13-15]. Therefore, interact-ABI-lity and convers-ABI-lity deliver an evidence-based [16] intervention known as communication partner training (CPT), which involves training communication partners to facilitate the communication [17] of the person with ABI. CPT is recommended in international guidelines as best practice management of cognitive-communication disorders after ABI [18,19], and the convers-ABI-lity intervention delivers the core therapeutic content of the existing efficacious CPT programs TBI Express [20] and TBIconneCT [21,22]. convers-ABI-lity is a conversion and streamlining of the content of these programs into both asynchronous self-directed activities and synchronous telehealth speech-language pathology sessions. interact-ABI-lity delivers CPT as an asynchronous, self-directed web-based educational intervention.

In addition, the Social Brain Toolkit contains the social-ABI-lity intervention, which provides people with ABI with training in communication through social media. This is because people with ABI who use social media for connection are likely to experience difficulties in web-based interactions that are similar to those experienced in real-world interactions [23]. The social-ABI-lity intervention in the Social Brain Toolkit is an educational intervention based on new recommendations to support and train people with ABI in the safe and effective use of social media, with a view to increasing their social participation, enabling recovery of social communication skills, and promoting a sense of self or identity after ABI [24].

Web-based access to psychosocial interventions such as the Social Brain Toolkit is promising not only for improving the scalability and accessibility of interventions but also for reducing inequities in access to psychosocial support. Even before the global shift to web-based health care during the COVID-19 pandemic [25], people with ABI frequently accessed language and cognitive training on the web, with older patients and rural residents even more digitally engaged than younger and urban users [26]. When delivered on a national scale, equivalent web-based service delivery models in mental health have demonstrated the ability to overcome entrenched health care access barriers such as socioeconomic and indigenous status, and to reach users who otherwise do not access traditional face-to-face care [27,28]. However, web-based service delivery models face numerous implementation challenges. Even clinically effective digital health interventions struggle to be sustained as long-term service delivery options [29] for reasons beyond their clinical effectiveness, including the costs and workflow changes associated with their delivery [30]. Although there is an established and varied evidence base exploring web-based psychosocial interventions [31,32], there is limited implementation science research, over and above these clinical trials, to determine how such interventions might be successfully implemented and sustained as part of routine clinical care [31]. Therefore, a specific focus on implementation, especially in early research collaboration with end users, has been recommended for future research into web-based psychosocial care [27-31].

Real-world evaluation of the implementation of the Social Brain Toolkit interventions therefore demands (1) collaborative involvement of end users, (2) a hybrid implementation-effectiveness research design [33] to expedite the incorporation of implementation learnings into intervention design, and (3) a theoretical underpinning in a digital health implementation framework that reflects the complexity of real-world implementation. Accordingly, this hybrid implementation-effectiveness study will be underpinned by an implementation theory that is both specific to digital health and based on a complexity approach: the Nonadoption, Abandonment, Scale-up, Spread, and Sustainability (NASSS) framework of eHealth implementation [34-36]. This framework will be used to support a more comprehensive identification of real-world complexities in the scale-up, spread, and sustainability of the Social Brain Toolkit.

Aims

In this study, formative evaluation of the implementation of the Social Brain Toolkit by end users in the community will be used to guide intervention development and implementation [37] to support these interventions to reach and meet community needs in a feasible, scalable, sustainable, and acceptable manner. Therefore, guided by domains of the NASSS framework [34,35], in this hybrid implementation-effectiveness study, we aim to answer the following implementation questions:

  1. Who uses these interventions and what are their characteristics (domains 1 and 4)? What implementation strategies can be used to improve intervention reach?
  2. In what geographical locations and health care and social contexts are the interventions used (domains 3 and 5-6)? What implementation strategies can be used to improve intervention reach?
  3. Do users complete the interventions as intended (domain 4)? Why or why not? What implementation strategies can be used to improve intervention adherence and fidelity?
  4. How usable is the technology for those completing the interventions (domain 2)? What changes can be made to improve usability?
  5. What barriers, facilitators, and workarounds do users experience when completing these interventions (domains 1-7)? What strategies, facilitators, and workarounds can be used to improve future implementation?
  6. How satisfied are users with the interventions (domain 3)? What changes can be made to increase user satisfaction?
  7. What is the cost of delivering each web-based intervention, and how does this compare with face-to-face delivery (domain 3)?

We seek to determine intervention effectiveness as follows:

  1. Do people who complete the interact-ABI-lity course have improvements in their self-efficacy and knowledge about communicating with people with ABI?
  2. Do people who complete the social-ABI-lity course have improvements in their self-efficacy and knowledge about communicating safely and successfully on social media?
  3. Do people who complete the convers-ABI-lity program have improved communication and quality-of-life outcomes?

Design

This study uses a prospective hybrid type 2 implementation-effectiveness design [33] and is registered on the Australia and New Zealand Clinical Trials Registry: interact-ABI-lity ACTRN12621001170819 (Universal Trial Number [UTN] U1111-1266-6628); social-ABI-lity ACTRN12621001177842 (UTN U1111-1268-4909); and convers-ABI-lity ACTRN12621001180808 (UTN U1111-1268-4849). The Social Brain Toolkit is well suited to a hybrid implementation-effectiveness design [37] because its interventions present minimal risk [16], with indirect evidence supporting effectiveness [22-24] and strong face validity supporting applicability to a web-based delivery method [21,22]. A prospective hybrid type 2 design especially reflects a collaborative ethos because it allows formative evaluation by end users to inform the refinement and improvement of both the clinical interventions and their implementation processes [37]. Preemptive implementation strategies will be devised during intervention development, including through consultation with people with ABI, communication partners, and clinicians supporting people with ABI, and people with experience implementing digital health [38], as well as reference to current implementation science literature [39]. Additional user-identified implementation strategies will be determined as part of the formative evaluation processes in this study, rather than solely a priori, in keeping with our collaborative ethos and approach. The deployment of additional user-identified strategies will enable us to identify and potentially address persisting and unanticipated implementation barriers or shortcomings of our preemptive implementation strategies.

Data Collection

To obtain these data, we will use a mixed methods design for both implementation (Multimedia Appendix 1) and effectiveness (Multimedia Appendix 2) [40-42] data collection.

Interviews

Intervention usability, user experience, and satisfaction will be formatively evaluated through interviews. The first 30 minutes of the 90-minute interview will involve a think-aloud [43] review of the user interface through screen sharing of the web-based platform. The think-aloud method is a robust and flexible research technique for testing usability, providing valuable and reliable information about users' cognitive processes while they complete a task [43]. This think-aloud task will be followed by a 60-minute qualitative interview, with interview questions developed using the NASSS framework of digital health implementation [34] to prompt discussion of multiple issues within this time frame (see Multimedia Appendices 3-7 for interview protocols). We will conduct the interviews as soon as possible after a user's completion of the course to facilitate recollection of the intervention experience. We will conduct the interviews individually or jointly with communication partners, depending on participant preference and availability. Data collection will occur entirely on the web, with interviews conducted through secure videoconferencing on Microsoft Teams (Microsoft Corporation) software [44]. Interviews will be audio and video recorded using the built-in recording functions of the videoconferencing platform, and recordings will be transcribed verbatim for analysis.

Surveys

For all 3 interventions in the Social Brain Toolkit, we will use pre- and postintervention surveys within the intervention platforms to obtain implementation outcomes, including user demographic information, and qualitative and quantitative patient-reported experience measures (Multimedia Appendix 1). We will conduct the surveys completely on the web. The surveys are based on the NASSS framework domains (see Multimedia Appendices 8-10 for survey protocols). We will measure intervention effectiveness using questionnaires that probe patient-reported outcome measures such as self-ratings of confidence in communicating with someone with ABI or using social media (Multimedia Appendix 2).

Analytics

We will collect web analytics (Multimedia Appendix 1) for each intervention over a 6-month sampling frame. This will enable user fidelity and adherence to the interventions to be examined.
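As a brief illustration of how adherence could be derived from such analytics (a sketch only; column names such as module_1 and module_2 are assumptions about the eventual analytics export rather than fields of the actual platform), completion indicators might be computed in R, the software planned for descriptive analysis:

```r
library(dplyr)

# Hypothetical analytics export: one row per registered user, with logical
# columns (module_1, module_2, ...) marking whether each module was completed.
n_modules <- sum(grepl("^module_", names(analytics_export)))

adherence <- analytics_export %>%
  mutate(
    modules_done = rowSums(across(starts_with("module_"))),
    pct_complete = 100 * modules_done / n_modules,
    completed    = modules_done == n_modules
  )

# Adherence indicators over the 6-month sampling frame
adherence %>%
  summarise(
    n_users           = n(),
    n_completers      = sum(completed),
    mean_pct_complete = mean(pct_complete)
  )
```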

Clinical Outcomes

Clinical outcomes (Multimedia Appendix 2) examining the effectiveness of convers-ABI-lity will be collected in a parallel study. As interact-ABI-lity and social-ABI-lity are educational interventions, their effectiveness will be examined by measuring knowledge through preintervention, postintervention, and follow-up multiple-choice questions.

Analysis

Qualitative

To examine effectiveness, 2 experienced speech-language pathologists will review the lists of strategies generated by users of the social-ABI-lity and interact-ABI-lity courses, and code them as appropriate or inappropriate using a consensus rating procedure (Multimedia Appendix 2). To examine implementation, the first author (MM) will conduct deductive content analysis [45] based on the NASSS framework [34] of both free-text survey responses and interview data (Multimedia Appendix 1). Coding will be managed in NVivo 12 Pro (QSR International Pty Ltd) [46] or Microsoft Excel 2016 [47] (Microsoft Corporation) software.

Quantitative

We will prospectively measure implementation outcomes in relation to the following:

  1. Preemptive strategies deployed to support the implementation of the Social Brain Toolkit at launch, devised through current implementation evidence [39] and stakeholder input from a separate study [38].
  2. Additional user-identified strategies subsequently obtained through formative evaluation of the interventions by end users.

To observe any potential influence of these implementation strategies and factors on implementation success and the time lag of impact, we will record the following:

  1. A detailed description of each implementation strategy and its rationale.
  2. A detailed timeline of each strategy’s deployment.
  3. Effectiveness and implementation outcomes over time.

We will calculate descriptive statistics of implementation measures, including user demographic characteristics, satisfaction ratings, percentage completion, and total number of users (Multimedia Appendix 1). We will tabulate descriptive statistics stratified by time and by whether users complete the interventions. Descriptive statistical analysis will be conducted using RStudio software (RStudio Inc) [48].
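For example, the planned stratified summaries could be tabulated along the following lines (a sketch under assumed column names such as registration_date, age, satisfaction, and completed; the final data dictionary may differ):

```r
library(dplyr)

# users_df is assumed to hold one row per user with demographics, a satisfaction
# rating, percentage completion, and a logical completion flag.
users_df %>%
  group_by(month = format(registration_date, "%Y-%m"), completed) %>%
  summarise(
    n_users           = n(),
    median_age        = median(age, na.rm = TRUE),
    mean_satisfaction = mean(satisfaction, na.rm = TRUE),
    mean_pct_complete = mean(pct_complete, na.rm = TRUE),
    .groups = "drop"
  )
```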

Clinical assessment data for conversation skills and quality of life (Multimedia Appendix 2) [40-42] will be analyzed using the Reliable Change Index [49] to determine whether individual participants had any clinically significant changes. Patient-reported outcome measures for interact-ABI-lity and social-ABI-lity, such as self-ratings of confidence, will be analyzed using repeated measures analysis of variance, with 3 levels (pretraining, posttraining, and follow-up) to determine whether there is any significant improvement within this participant sample. These data will also be managed using appropriate statistical software such as Microsoft Excel 2016 (Microsoft Corporation) [47].
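For reference, the Reliable Change Index follows Jacobson and Truax [49]: RCI = (x_post - x_pre) / S_diff, where S_diff = sqrt(2 * SEM^2) and SEM = SD * sqrt(1 - r), with the standard deviation (SD) and test-retest reliability (r) taken from an appropriate reference sample. A minimal R sketch of both analyses is given below; the variable names and example values are illustrative assumptions, not study data or normative values.

```r
# Reliable Change Index (Jacobson & Truax, 1991)
reliable_change <- function(pre, post, sd_ref, reliability) {
  sem    <- sd_ref * sqrt(1 - reliability)  # standard error of measurement
  s_diff <- sqrt(2 * sem^2)                 # standard error of the difference
  (post - pre) / s_diff                     # |RCI| > 1.96 indicates reliable change
}
reliable_change(pre = 45, post = 58, sd_ref = 10, reliability = 0.85)

# Repeated measures ANOVA with 3 within-subject levels (pre, post, follow-up),
# using a long-format data frame with columns id, time, and confidence.
scores$time <- factor(scores$time, levels = c("pre", "post", "followup"))
scores$id   <- factor(scores$id)
model <- aov(confidence ~ time + Error(id/time), data = scores)
summary(model)
```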

Finally, theoretically underpinned by the third domain of the NASSS framework [34] examining the value proposition of an intervention, we will use the web analytics data for each intervention to calculate web-based health care costs and equivalent face-to-face costs using a bottom-up costing approach [50]. With an initial focus on the Australian context from which the Social Brain Toolkit is developed, we will refer to nationally recognized cost guides such as the Australian Medicare Benefits Scheme [51] to obtain relevant unit costs. Costs will be calculated in Australian dollars, with equivalent conversions reported in euros and US dollars, using RStudio software (RStudio Inc) [48].
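As a simple illustration of the bottom-up approach, total delivery costs are built up from the activity volumes observed in the web analytics multiplied by unit costs; all quantities, unit costs, and exchange rates below are placeholders rather than Medicare Benefits Schedule values or study data.

```r
# Activity volumes would come from web analytics; unit costs from sources such
# as the Medicare Benefits Schedule, platform invoices, and staff time.
activity <- data.frame(
  item     = c("clinician telehealth session", "platform hosting per user"),
  quantity = c(60, 150),       # placeholder volumes over the sampling frame
  unit_web = c(90, 4),         # placeholder web-based unit costs (AUD)
  unit_f2f = c(130, 0)         # placeholder face-to-face equivalents (AUD)
)

totals_aud <- colSums(activity$quantity * activity[, c("unit_web", "unit_f2f")])

# Indicative currency conversions; published exchange rates current at the time
# of analysis would be substituted here.
rates <- c(AUD = 1, EUR = 0.64, USD = 0.73)
outer(rates, totals_aud)   # cost matrix: rows = currency, columns = delivery mode
```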

Rigor

For qualitative implementation data, a second author (EP, RR, LT, MB, or DD) will verify a random 25% of the total codes from the (1) first interview for interact-ABI-lity, (2) first interview for social-ABI-lity, (3) first clinician interview for convers-ABI-lity, and (4) first interview with a person with ABI and their communication partner for convers-ABI-lity. Any discrepancies will be resolved through research team discussion and consensus before the first author (MM) proceeds to code the remaining interviews. Qualitative effectiveness data will be managed by 2 experienced speech-language pathologists through a consensus rating procedure. For quantitative implementation and effectiveness data, a second author (EP, RR, LT, MB, or DD) will review outputs, and the first author (MM) will consult a biostatistician for support as necessary. For quantitative costing data, analysis will be conducted in consultation with a health economist and the clinical research team, with calculation methods, rationales, and references transparently reported. Overall results will be reported according to the Standards for Reporting Implementation Studies [52].
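For reproducibility of the interrater check, the random 25% of codes could be drawn with a fixed seed, for example as follows (a sketch; interview_codes and code_id are assumed names for the exported coding table and its identifier column):

```r
set.seed(2021)  # fixed seed so the verification sample can be reproduced
verify_ids <- sample(interview_codes$code_id,
                     size = ceiling(0.25 * nrow(interview_codes)))
```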

Participants

Inclusion and Exclusion Criteria for Users of interact-ABI-lity and social-ABI-lity

As social-ABI-lity and interact-ABI-lity are publicly available web-based courses, all users of the courses within the sample frame will be included in data collection and analysis of survey and analytic data, with no additional inclusion or exclusion criteria. A minimum of 5 users of the social-ABI-lity course (ie, people with ABI) and of the interact-ABI-lity course (ie, communication partners such as paid support workers, friends, and family members) will be invited to participate in further implementation interviews. This number of interviewees is consistent with the think-aloud methods described in the study design [43], which are used to refine the usability of the courses. The internationally recognized industry standard is for a minimum of 5 users to undergo a formative usability interview evaluation [53], as approximately this many users can identify up to 90% of usability issues [54], after which there are diminishing returns for the product cycle [53]. However, a maximum variation sample of these interviewees will be sought where possible.

If people with ABI complete the social-ABI-lity course with the assistance of a friend, partner, family member, or other individual, this person will also be invited, through the course user, to be interviewed together or individually, depending on individual preference and availability. Therefore, interview participants must meet the following criteria:

  1. They must have registered for, and used, at least some modules of interact-ABI-lity or social-ABI-lity, as verified by course records.
  2. They must have indicated consent at course enrollment to be contacted for further research participation opportunities related to the course.
  3. They must have provided informed written consent to participate in the interview. For users who have an ABI, capacity to consent will be determined during a video call with a qualified speech-language pathologist, according to our adapted consenting process protocol, which includes relevant questions adapted from the University of California, San Diego, Brief Assessment of Capacity to Consent instrument [55]. People with ABI who are unable to respond correctly to all 5 questions, presented using supported communication strategies as outlined in Multimedia Appendix 11 [55], will be excluded from the study.
  4. They must be aged ≥18 years.
  5. They must have adequate English proficiency to participate in the study without the aid of an interpreter, with functional reading skills in English.

There are no restrictions on any other factors (eg, gender, clinical experience, or geographical location) for interview participants who have used interact-ABI-lity and social-ABI-lity. Where possible, variation in these factors is preferred to obtain a purposive, maximum variation sample of user experiences of the interventions.

Inclusion and Exclusion Criteria for Users of convers-ABI-lity

Interviews will be conducted with all 10 people with ABI and their 10 communication partners who have completed the pilot version of convers-ABI-lity, as well as the 5 clinicians delivering the intervention. Participants with ABI must meet the following criteria:

  1. They must have had a definite moderate-severe ABI at least 6 months previously, based on the Mayo Classification System [56] (at least one of the following: loss of consciousness of 30 minutes or more, posttraumatic amnesia lasting ≥24 hours, worst Glasgow Coma Scale total score in the first 24 hours of <13, or evidence of a significant brain imaging abnormality). People with a non–traumatic brain injury (restricted specifically to the etiologies of stroke, hypoxic injury, brain tumor, poisoning, and infection) will also be eligible to participate.
  2. They must have been discharged or partially discharged from hospital and be able to spend time at home on a regular basis.
  3. They must have significant social communication skills deficits (either self-identified or identified by a usual communication partner).
  4. They must have insight into their social communication skills deficits.
  5. They must be aged 18-70 years.
  6. They must have adequate English proficiency for completing assessment tasks without the aid of an interpreter.
  7. They must have functional reading skills in English.
  8. They must have a communication partner with whom they interact regularly who is willing to participate in the research interviews and training program.

The exclusion criteria for participants with ABI are as follows:

  1. Aphasia of a severity such that it prevents any participation in conversation.
  2. Severe amnesia, which would prevent participants from providing informed consent, as evaluated using the University of California, San Diego, Brief Assessment of Capacity to Consent instrument [55].
  3. Dysarthria of a severity such that it significantly reduces intelligibility during conversation, as evaluated by the researcher.
  4. Drug or alcohol addiction, which would prevent participants from reliably participating in sessions.
  5. Active psychosis.
  6. Co-occurring degenerative neurological disorder, more than one episode of moderate-severe ABI, or premorbid intellectual disability.

Family members, friends, or paid support workers participating in the study must meet the following criteria:

  1. They must regularly interact with a person with ABI (ie, at least once a week). This person with ABI must have had the ABI at least 6 months previously.
  2. They must have known the person with ABI for at least 3 months.
  3. They must not have sustained a severe ABI themselves.
  4. They must be aged ≥18 years.

Speech-language pathologists delivering convers-ABI-lity must meet the following criteria:

  1. They must be currently employed in a clinical speech-language pathology role.
  2. At least 20% of their caseload must comprise people with ABI.

Ethics

This research has received ethical approval from the University of Technology Sydney (UTS) Health and Medical Research Ethics Committee (ETH21-6111) to conduct interviews with users of social-ABI-lity and interact-ABI-lity. The UTS Health and Medical Research Ethics Committee has also ratified (ETH21-5899) an approval by the Western Sydney Local Health District Human Research Ethics Committee (2019/ETH13510) to conduct interviews with users of convers-ABI-lity and collect demographic and web analytic data for all 3 interventions.

Users of the interact-ABI-lity and social-ABI-lity courses provide their email addresses at registration and indicate whether they consent to participate in follow-up research related to these courses. Consenting users of the courses will be invited to provide informed written consent to participate in a follow-up interview using an accessible, lay-language participant information and consent form. To ensure informed consent, the form will be adapted with visual supports and explained through video call for people with ABI, outlining the full burden and risks of research participation. Screening for capacity to consent is described in the aforementioned inclusion criteria (Multimedia Appendix 11).

Participants with ABI and their communication partners will be paid for their interviews at the 2021 hourly rate recommended by Health Consumers New South Wales [57]. Reimbursement for people with ABI and their communication partners is viewed as critical to recognizing the value of their lived experience of ABI, caring, and health care. It also aims to minimize any undue burden of research participation. Potential participants will be advised of this arrangement in the participant information form to facilitate decision-making around any potential economic burden of participation.


Research Funding

Australian National Health and Medical Research Council Postgraduate Scholarship funding was granted in November 2019, UTS Centre for Social Justice & Inclusion Social Impact funding was granted in April 2021, and icare New South Wales Quality of Life funding was received in November 2019.

Ethical Approval

Ethical approval was received from the UTS Health and Medical Research Ethics Committee (ETH21-6111) on June 29, 2021, and the Western Sydney Local Health District Human Research Ethics Committee (2019/ETH13510) on June 11, 2021, with ratification by the UTS Health and Medical Research Ethics Committee (ETH21-5899) on June 29, 2021.

Participant Enrollment

At manuscript submission on July 13, 2021, 85 participants had enrolled in interact-ABI-lity, with 85 completed entry surveys, 6 course completions, and 8 completed exit surveys. No interview data had yet been collected. Data collection for interact-ABI-lity commenced on July 2, 2021, and is expected to conclude on January 2, 2022, after a 6-month sample frame of analytics. By July 13, 2021, 1 participant had been recruited to participate in the convers-ABI-lity study, with no data yet collected. Data collection is expected to commence on July 26, 2021, and conclude on March 26, 2022. By July 13, 2021, no participant had been recruited to participate in the social-ABI-lity study. Data collection is expected to commence on December 1, 2021, and conclude on June 1, 2022, after a 6-month sample frame of analytics.

Analysis and Findings

Data analysis will occur concurrently with data collection until mid-2022. Results are expected for publication during late 2022 and early 2023.


A Dual Focus on Implementation and Effectiveness

As the global burden of ABI grows, our communities and health care system must find a scalable, feasible, acceptable, and accessible health care service delivery solution to address the psychosocial burden of this condition [1]. A concerted focus on implementation is essential to ensure the successful and sustained uptake of health interventions, without which even efficacious treatments have failed to be adopted, implemented, or sustained [29]. Implementation knowledge should include the perspectives of key stakeholders, while also leveraging existing implementation science theory and evidence, to ensure implementation success.

To this end, we have selected several theoretically informed implementation measures, in addition to measuring the effectiveness of the Social Brain Toolkit. As social-ABI-lity and interact-ABI-lity are educational interventions, their effectiveness will be determined through measures of increased knowledge as well as ecologically valid postintervention measures such as frequency of social media use and confidence communicating with people with ABI. For the CPT intervention of convers-ABI-lity, effectiveness will be determined by similarly meaningful measures of quality of life and conversation. The complexity of real-world implementation will be captured using mixed methods, including surveys and interviews exploring user satisfaction and experiences of implementation. For example, our specific survey of whether users are from rural, regional, remote, or metropolitan areas will enable the implementation experiences of these populations to be compared. Given the web-based nature of the Social Brain Toolkit, implementation measures will also include web analytics to identify any need for targeted implementation adjustments or strategies to improve intervention use. As the Social Brain Toolkit comprises technological interventions, think-aloud methods will be used to explore technological usability and provide users with a direct feedback channel to improve user interfaces.

As a prospective study of the real-world implementation of the Social Brain Toolkit, this study will not occur in a closed environment that allows controlled, randomized testing of isolated implementation strategies. Instead, with a theoretical foundation in the real-world complexity of digital health implementation [34], we will measure a comprehensive range of qualitative and quantitative effectiveness and implementation outcomes, and provide a recorded timeline of implementation strategy development and deployment for the Social Brain Toolkit.

Strengths and Limitations

The strengths of this study include a hybrid implementation-effectiveness approach, a mixed methods design with a wide range of implementation measures collected from the outset of implementation, end-user–identified implementation strategies, and a strong theoretical underpinning in a digital health implementation framework. This study is constrained by some technical limitations regarding the collection of analytics, the sample size of the initial limited release of the Social Brain Toolkit interventions, and reliance on self-report for most outcome measures. However, these limitations will be acknowledged in reporting to assist appropriate interpretation.

Conclusions

As people with ABI and their communication partners are the main intended beneficiaries of the Social Brain Toolkit, they have been and are being included from project inception to formative evaluation. Feedback provided by participants will directly inform future iterations of the interventions. Problems identified and recommendations made by users will be incorporated and addressed wherever possible in the next version of the Social Brain Toolkit. Therefore, beneficiaries will see concrete changes to interventions that directly reflect user input. These changes and their rationales will be documented and reported back to users in engaged scholarship that values and empowers stakeholder input through a direct feedback loop. The direct evaluation of the implementation of the interventions by end users in the community aims to ensure that the interventions are sufficiently feasible, acceptable, accessible, scalable, and sustainable to reach those in need of these supports in the community. The results can be used to improve the development and implementation of the Social Brain Toolkit, as well as future web-based psychosocial interventions for people with ABI and other populations.

Acknowledgments

MM would like to thank Professor Aron Shlonsky and Professor Geoffrey Curran for their advice on study design, and Dr Paula Cronin for her advice on costing. The interviewing of stakeholders as part of this study, and the publication of their input, has been funded by a 2021 University of Technology Sydney Social Impact Grant from the Centre for Social Justice & Inclusion, with further funding top-up from the University of Technology Sydney Faculty of Health. MM is supported by a National Health and Medical Research Council (Australia) Postgraduate (PhD) Scholarship (GNT1191284) and an Australian Research Training Program Scholarship. LT is supported by a National Health and Medical Research Council (Australia) Senior Research Fellowship. The development of the Social Brain Toolkit is supported by icare New South Wales.

Authors' Contributions

This study was designed by MM, EP, RR, MB, and LT. MM prepared the manuscript, and EP, RR, LT, MB, and DD critically revised the manuscript. All authors approved the final version of the manuscript for submission.

Conflicts of Interest

MM, EP, RR, MB, and LT are developers of the Social Brain Toolkit in collaboration with end users.

Multimedia Appendix 1

Overview of prospective implementation data collection.

DOCX File , 16 KB

Multimedia Appendix 2

Overview of prospective effectiveness data collection.

DOCX File , 18 KB

Multimedia Appendix 3

convers-ABI-lity interview protocol for people with acquired brain injury and their communication partner.

DOCX File , 16 KB

Multimedia Appendix 4

convers-ABI-lity interview protocol for clinicians.

DOCX File , 16 KB

Multimedia Appendix 5

interact-ABI-lity interview protocol for informal communication partners.

DOCX File , 15 KB

Multimedia Appendix 6

interact-ABI-lity interview protocol for paid communication partners and clinicians.

DOCX File , 15 KB

Multimedia Appendix 7

social-ABI-lity interview protocol for people with acquired brain injury.

DOCX File , 15 KB

Multimedia Appendix 8

Entry surveys for people with acquired brain injury using convers-ABI-lity or social-ABI-lity.

DOCX File , 15 KB

Multimedia Appendix 9

Entry surveys for communication partners of people with acquired brain injury using convers-ABI-lity or interact-ABI-lity.

DOCX File , 15 KB

Multimedia Appendix 10

Exit surveys for all users of social-ABI-lity, convers-ABI-lity, and interact-ABI-lity.

DOCX File , 16 KB

Multimedia Appendix 11

Consenting process protocol, including relevant questions, adapted from the University of California, San Diego, Brief Assessment of Capacity to Consent instrument.

PDF File (Adobe PDF File), 215 KB

  1. Cieza A, Causey K, Kamenov K, Hanson SW, Chatterji S, Vos T. Global estimates of the need for rehabilitation based on the Global Burden of Disease study 2019: a systematic analysis for the Global Burden of Disease Study 2019. Lancet 2021 Dec 19;396(10267):2006-2017 [FREE Full text] [CrossRef] [Medline]
  2. MacDonald S. Introducing the model of cognitive-communication competence: a model to guide evidence-based communication interventions after brain injury. Brain Inj 2017 Dec;31(13-14):1760-1780. [CrossRef] [Medline]
  3. Hewetson R, Cornwell P, Shum D. Social participation following right hemisphere stroke: influence of a cognitive-communication disorder. Aphasiology 2017 Apr 26;32(2):164-182. [CrossRef]
  4. Douglas JM, Bracy CA, Snow PC. Return to work and social communication ability following severe traumatic brain injury. J Speech Lang Hear Res 2016 Jun 01;59(3):511-520. [CrossRef] [Medline]
  5. Langhammer B, Sunnerhagen KS, Sällström S, Becker F, Stanghelle JK. Return to work after specialized rehabilitation-An explorative longitudinal study in a cohort of severely disabled persons with stroke in seven countries: The Sunnaas International Network stroke study. Brain Behav 2018 Aug;8(8):e01055 [FREE Full text] [CrossRef] [Medline]
  6. Mitchell AJ, Sheth B, Gill J, Yadegarfar M, Stubbs B, Yadegarfar M, et al. Prevalence and predictors of post-stroke mood disorders: a meta-analysis and meta-regression of depression, anxiety and adjustment disorder. Gen Hosp Psychiatry 2017 Jul;47:48-60. [CrossRef] [Medline]
  7. Hua A, Wells J, Haase C, Chen K, Rosen H, Miller B, et al. Evaluating patient brain and behavior pathways to caregiver health in neurodegenerative diseases. Dement Geriatr Cogn Disord 2019 Jun;47(1-2):42-54 [FREE Full text] [CrossRef] [Medline]
  8. Schofield D, Shrestha RN, Zeppel MJ, Cunich MM, Tanton R, Veerman JL, et al. Economic costs of informal care for people with chronic diseases in the community: lost income, extra welfare payments, and reduced taxes in Australia in 2015-2030. Health Soc Care Community 2019 Mar;27(2):493-501. [CrossRef] [Medline]
  9. Caro CC, Costa JD, Da Cruz DM. Burden and quality of life of family caregivers of stroke patients. Occup Ther Health Care 2018 Apr;32(2):154-171. [CrossRef] [Medline]
  10. Prevalence of different types of speech, language and communication disorders and speech pathology services in Australia. Commonwealth of Australia. 2014 Sep 2.   URL: https:/​/www.​aph.gov.au/​Parliamentary_Business/​Committees/​Senate/​Community_Affairs/​Speech_Pathology/​Report [accessed 2021-12-01]
  11. Gan C, Gargaro J, Brandys C, Gerber G, Boschen K. Family caregivers' support needs after brain injury: a synthesis of perspectives from caregivers, programs, and researchers. NeuroRehabilitation 2010;27(1):5-18. [CrossRef] [Medline]
  12. Acquired Brain Injury Communication Lab. Social Brain Toolkit. The University of Sydney. 2021.   URL: https://abi-communication-lab.sydney.edu.au/social-brain-toolkit/ [accessed 2021-12-01]
  13. Togher L, Hand L, Code C. Measuring service encounters with the traumatic brain injury population. Aphasiology 1997 Apr;11(4-5):491-504. [CrossRef]
  14. Togher L, Hand L, Code C. Analysing discourse in the traumatic brain injury population: telephone interactions with different communication partners. Brain Inj 1997 Mar;11(3):169-189. [CrossRef] [Medline]
  15. Shelton C, Shryock M. Effectiveness of communication/interaction strategies with patients who have neurological injuries in a rehabilitation setting. Brain Inj 2007 Nov;21(12):1259-1266. [CrossRef] [Medline]
  16. Wiseman-Hakes C, Ryu H, Lightfoot D, Kukreja G, Colantonio A, Matheson FI. Examining the efficacy of communication partner training for improving communication interactions and outcomes for individuals with traumatic brain injury: a systematic review. Arch Rehabil Res Clin Transl 2020 Mar;2(1):100036 [FREE Full text] [CrossRef] [Medline]
  17. Simmons-Mackie N, Raymer A, Cherney LR. Communication partner training in aphasia: an updated systematic review. Arch Phys Med Rehabil 2016 Dec;97(12):2202-21.e8. [CrossRef] [Medline]
  18. Hebert D, Lindsay MP, McIntyre A, Kirton A, Rumney PG, Bagg S, et al. Canadian stroke best practice recommendations: stroke rehabilitation practice guidelines, update 2015. Int J Stroke 2016 Jun;11(4):459-484. [CrossRef] [Medline]
  19. Togher L, Wiseman-Hakes C, Douglas J, Stergiou-Kita M, Ponsford J, Teasell R, INCOG Expert Panel. INCOG recommendations for management of cognition following traumatic brain injury, part IV: cognitive communication. J Head Trauma Rehabil 2014;29(4):353-368. [CrossRef] [Medline]
  20. Togher L, McDonald S, Tate R, Rietdijk R, Power E. The effectiveness of social communication partner training for adults with severe chronic TBI and their families using a measure of perceived communication ability. NeuroRehabilitation 2016 Mar 23;38(3):243-255. [CrossRef] [Medline]
  21. Rietdijk R, Power E, Brunner M, Togher L. A single case experimental design study on improving social communication skills after traumatic brain injury using communication partner telehealth training. Brain Inj 2019 Jan;33(1):94-104. [CrossRef] [Medline]
  22. Rietdijk R, Power E, Attard M, Heard R, Togher L. A clinical trial investigating telehealth and in-person social communication skills training for people with traumatic brain injury: participant-reported communication outcomes. J Head Trauma Rehabil 2020 Jan;35(4):241-253. [CrossRef] [Medline]
  23. Ketchum JM, Sevigny M, Hart T, O'Neil-Pirozzi TM, Sander AM, Juengst SB, et al. The association between community participation and social internet use among adults with traumatic brain injury. J Head Trauma Rehabil 2020 Aug;35(4):254-261 [FREE Full text] [CrossRef] [Medline]
  24. Brunner M, Hemsley B, Togher L, Dann S, Palmer S. Social media and people with traumatic brain injury: a metasynthesis of research informing a framework for rehabilitation clinical practice, policy, and training. Am J Speech Lang Pathol 2021 Jan 27;30(1):19-33. [CrossRef] [Medline]
  25. Bokolo AJ. Application of telemedicine and eHealth technology for clinical services in response to COVID‑19 pandemic. Health Technol 2021 Jan 14;11(2):1-8 [FREE Full text] [CrossRef] [Medline]
  26. Munsell M, De Oliveira E, Saxena S, Godlove J, Kiran S. Closing the digital divide in speech, language, and cognitive therapy: cohort study of the factors associated with technology usage for rehabilitation. J Med Internet Res 2020 Feb 07;22(2):e16286 [FREE Full text] [CrossRef] [Medline]
  27. Titov N, Hadjistavropoulos HD, Nielssen O, Mohr DC, Andersson G, Dear BF. From research to practice: ten lessons in delivering digital mental health services. J Clin Med 2019 Aug 17;8(8):1239 [FREE Full text] [CrossRef] [Medline]
  28. Titov N, Dear BF, Nielssen O, Wootton B, Kayrouz R, Karin E, et al. User characteristics and outcomes from a national digital mental health service: an observational study of registrants of the Australian MindSpot Clinic. Lancet Digit Health 2020 Nov;2(11):e582-e593 [FREE Full text] [CrossRef] [Medline]
  29. Christie HL, Martin JL, Connor J, Tange HJ, Verhey FR, de Vugt ME, et al. eHealth interventions to support caregivers of people with dementia may be proven effective, but are they implementation-ready? Internet Interv 2019 Dec;18:100260 [FREE Full text] [CrossRef] [Medline]
  30. Granja C, Janssen W, Johansen MA. Factors determining the success and failure of eHealth interventions: systematic review of the literature. J Med Internet Res 2018 May 01;20(5):e10235 [FREE Full text] [CrossRef] [Medline]
  31. Andersson G. Internet interventions: past, present and future. Internet Interv 2018 Jun;12:181-188 [FREE Full text] [CrossRef] [Medline]
  32. Shaffer KM, Tigershtrom A, Badr H, Benvengo S, Hernandez M, Ritterband LM. Dyadic psychosocial eHealth interventions: systematic scoping review. J Med Internet Res 2020 Mar 04;22(3):e15509 [FREE Full text] [CrossRef] [Medline]
  33. Curran GM, Bauer M, Mittman B, Pyne JM, Stetler C. Effectiveness-implementation hybrid designs: combining elements of clinical effectiveness and implementation research to enhance public health impact. Med Care 2012 Mar;50(3):217-226 [FREE Full text] [CrossRef] [Medline]
  34. Greenhalgh T, Wherton J, Papoutsi C, Lynch J, Hughes G, A'Court C, et al. Beyond adoption: a new framework for theorizing and evaluating nonadoption, abandonment, and challenges to the scale-up, spread, and sustainability of health and care technologies. J Med Internet Res 2017 Nov 01;19(11):e367 [FREE Full text] [CrossRef] [Medline]
  35. Greenhalgh T, Wherton J, Papoutsi C, Lynch J, Hughes G, A'Court C, et al. Analysing the role of complexity in explaining the fortunes of technology programmes: empirical application of the NASSS framework. BMC Med 2018 May 14;16(1):66 [FREE Full text] [CrossRef] [Medline]
  36. Greenhalgh T, Abimbola S. The NASSS Framework - a synthesis of multiple theories of technology implementation. Stud Health Technol Inform 2019 Jul 30;263:193-204. [CrossRef] [Medline]
  37. Bernet AC, Willens DE, Bauer MS. Effectiveness-implementation hybrid designs: implications for quality improvement science. Implementation Sci 2013 Apr;8(Suppl 1):S2. [CrossRef]
  38. Miao M, Power E, Rietdijk R, Debono D, Brunner M, Salomon A, et al. Co-producing knowledge of the implementation of complex digital health interventions for adults with acquired brain injury and their communication partners: a mixed-methods study protocol. JMIR Res Protocols 2021 (forthcoming) [FREE Full text] [CrossRef]
  39. Miao M, Power E, Rietdijk R, Brunner M, Togher L. Implementation of online psychosocial interventions for people with neurological conditions and their caregivers: a systematic review protocol. Digit Health 2021 Sep 6;7:20552076211035988 [FREE Full text] [CrossRef] [Medline]
  40. Douglas JM, O'Flaherty CA, Snow PC. Measuring perception of communicative ability: the development and evaluation of the La Trobe communication questionnaire. Aphasiology 2000 Mar;14(3):251-268. [CrossRef]
  41. Togher L, Power E, Tate R, McDonald S, Rietdijk R. Measuring the social interactions of people with traumatic brain injury and their communication partners: the adapted Kagan scales. Aphasiology 2010 Feb 03;24(6-8):914-927. [CrossRef]
  42. von Steinbüchel N, Wilson L, Gibbons H, Hawthorne G, Höfer S, Schmidt S, QOLIBRI Task Force. Quality of Life after Brain Injury (QOLIBRI): scale development and metric properties. J Neurotrauma 2010 Jul;27(7):1167-1185. [CrossRef] [Medline]
  43. Charters E. The use of think-aloud methods in qualitative research an introduction to think-aloud methods. Brock Educ J 2003 Jul 01;12(2):68-82. [CrossRef]
  44. Microsoft Corporation. Microsoft Teams (Version 1.4.00.16575, 64-bit). 2021.   URL: https://www.microsoft.com/en-au/microsoft-teams/download-app [accessed 2021-12-01]
  45. Kyngäs H, Kaakinen P. Deductive content analysis. In: The Application of Content Analysis in Nursing Science Research. Cham: Springer; 2020:23-30.
  46. QSR International Pty Ltd. NVivo 12 Pro Windows (Version 12.6).   URL: https:/​/www.​qsrinternational.com/​nvivo-qualitative-data-analysis-software/​support-services/​nvivo-downloads [accessed 2021-11-23]
  47. Microsoft Corporation. Microsoft ® Excel ® 2016 for Windows (Version 16.0.5149.1000, 32-bit). 2016.   URL: https://microsoft-excel-2016.en.softonic.com/ [accessed 2021-11-23]
  48. RStudio Inc. RStudio (Version 1.2.5042). 2020.   URL: https://rstudio.com/ [accessed 2021-11-24]
  49. Jacobson NS, Truax P. Clinical significance: a statistical approach to defining meaningful change in psychotherapy research. J Consult Clin Psychol 1991 Feb;59(1):12-19. [CrossRef] [Medline]
  50. Drummond M, Sculpher MJ, Claxton K, Stoddart GL, Torrance GW. Methods for the Economic Evaluation of Health Care Programmes. Oxford: Oxford University Press; 2015.
  51. MBS Online. Australian Government Department of Health. 2020.   URL: http://www.mbsonline.gov.au/internet/mbsonline/publishing.nsf/Content/Home [accessed 2021-11-24]
  52. Pinnock H, Barwick M, Carpenter CR, Eldridge S, Grandes G, Griffiths CJ, StaRI Group. Standards for Reporting Implementation Studies (StaRI): explanation and elaboration document. BMJ Open 2017 Apr 03;7(4):e013318 [FREE Full text] [CrossRef] [Medline]
  53. Association for the Advancement of Medical Instrumentation [AAMI]. ANSI/AAMI HE75:2009 (R2018). 2018.   URL: https://webstore.ansi.org/Standards/AAMI/ANSIAAMIHE752009R2018 [accessed 2021-11-24]
  54. Virzi RA. Refining the test phase of usability evaluation: how many subjects is enough? Hum Factors 2016 Nov 23;34(4):457-468. [CrossRef]
  55. Jeste DV, Palmer BW, Appelbaum PS, Golshan S, Glorioso D, Dunn LB, et al. A new brief instrument for assessing decisional capacity for clinical research. Arch Gen Psychiatry 2007 Aug;64(8):966-974. [CrossRef] [Medline]
  56. Malec JF, Brown AW, Leibson CL, Flaada JT, Mandrekar JN, Diehl NN, et al. The Mayo Classification System for Traumatic Brain Injury Severity. J Neurotrauma 2007 Sep;24(9):1417-1424. [CrossRef] [Medline]
  57. Remuneration and reimbursement of health consumers. Health Consumers NSW.   URL: https:/​/www.​hcnsw.org.au/​for-health-consumer-organisations/​remuneration-and-reimbursement-of-health-consumers/​ [accessed 2021-11-24]


ABI: acquired brain injury
CPT: communication partner training
NASSS: Nonadoption, Abandonment, Scale-up, Spread, and Sustainability
UTS: University of Technology Sydney


Edited by G Eysenbach; submitted 13.07.21; peer-reviewed by S Chokshi, CY Lin; comments to author 12.10.21; revised version received 18.10.21; accepted 20.10.21; published 09.12.21

Copyright

©Melissa Miao, Emma Power, Rachael Rietdijk, Melissa Brunner, Deborah Debono, Leanne Togher. Originally published in JMIR Research Protocols (https://www.researchprotocols.org), 09.12.2021.

This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Research Protocols, is properly cited. The complete bibliographic information, a link to the original publication on https://www.researchprotocols.org, as well as this copyright and license information must be included.