Implementation researchers’ perspectives on bridging the research-practice gap


by Zuneera Khurshid

In our blog earlier this year, we talked about simplifying implementation science and making it more accessible to frontline staff. In this blog, we share critical insights that the implementation team, based at the Improvement Academy, has gathered while supporting and facilitating the process of putting evidence into practice.

While several factors contribute to the widening research-to-practice gap, the way the research pipeline is structured is a key one. The traditional research pipeline focuses on demonstrating the efficacy and effectiveness of interventions before moving on to implementation and sustainability. This is a linear way of looking at how research moves into practice, and it limits the usefulness of implementation science. While this is changing, it will take some time before implementation is fully integrated into research designs. As a result, many well-researched, evidence-based programmes and interventions never make it into practice. De Geest and colleagues describe this as the ‘valley of death’, where evidence-based innovations fail to cross the wasteland between the research world and real-world settings. Implementation science will continue to play catch-up unless we change the way we think about it.

Our implementation team aims to challenge the assumption that evidence moves into practice in a linear way by arguing that practice also generates robust evidence. We encourage a ‘learning by doing’ approach that acknowledges the expertise of frontline staff, practitioners, and communities. This means that, as implementation researchers, we bring research evidence and cross-project learning to a real-world problem and work with the implementers, who bring the operational and functional knowledge, to undertake multiple cycles of implementation knowledge generation together. This helps us develop a deeper understanding of implementation, as demonstrated in the short case study below.

This approach has enabled us to build and strengthen partnerships and relationships with our implementation stakeholders, which has helped us develop a better understanding of implementation factors in our research and achieve more meaningful outcomes. Additionally, we have been developing our methods to support this approach by facilitating rapid feedback to enable improvement and adaptation, and by mapping implementation factors in real time. This way of working is accompanied by reflection on our role as implementation researchers as we move from being passive observers towards a role with more overlap across the research-practice boundary. This has added depth to our understanding of the causal pathways behind what works, why it works, and which implementation strategies can be brought to life within a setting.

Case study: Implementing community health checks in Bradford

Bradford District is among the most deprived local authorities in England and has a high rate of cardiovascular disease (CVD), and there was demand in the community for pre-emptive action to reduce this risk. In response to this need, the Improvement Academy established a network of clinicians, voluntary sector organisations, community members, local authority representatives, and researchers to implement a series of collaborative, outreach health check events aimed at preventing CVD, reducing inequalities, and improving community access to social prescribing and primary care. This was based on a ‘learning by doing’ and ‘practice to evidence’ approach, where each health check event was used as a learning opportunity to understand how implementation works in practice, with this learning informing future events.

Integrating implementation science into research studies from the start could amplify the impact of research and research teams. This involves looking at research and implementation as complementary and cyclical, with each informing the other. The following are some of the key learnings from our approach:

  • Ensuring implementation research is driven by the community or frontline staff and addresses a problem with real-world impact.
  • Integrating rapid evaluation methods with traditional research and evaluation methods.
  • Providing rapid feedback to implementers to promote adaptation and tailoring of interventions and implementation strategies.
  • Identifying and developing relationships with implementation champions in the sites.
  • Facilitating the establishment of an implementation team, and identifying key stakeholders and knowledge assets in the implementation process.
  • Using existing connections and relationships.
  • Advocating for leadership support for the implementation effort.
  • Understanding the unique local context and readiness for implementation.
  • Providing implementation training and support.
  • Using a case study approach.
