I’ve been asking experts around the world about implementation 


…and it is changing how I see implementation

by Kristian Hudson

Anyone who has worked in public health will be aware of how hard it can be to ensure interventions produce the same outcomes in practice as they did in the research trials from which they stemmed. The road between something that works and it actually benefitting society is filled with a million and one implementation potholes. It is clear to most people that no matter how effective an intervention has been shown to be, if you do not get the implementation right it is far less likely to have any positive effect.

Implementation science has grown considerably over the past 20 years to address this issue and has provided some answers to our implementation problems. There are now dozens of implementation frameworks which can help us ensure the positive outcomes of evidence-based practices are not hindered by implementation difficulties.

These frameworks provide a shared language which stakeholders can use to familiarise themselves with implementation and which can function as practical tools for planning, executing, and evaluating real-world implementation efforts. They can (1) describe and/or guide the process of translating effective interventions into practice (process frameworks), (2) analyse what influences implementation outcomes (determinant frameworks), and (3) evaluate implementation efforts (outcome frameworks).

But are these frameworks enough? In October last year I started asking experts around the world how they envisaged applying implementation frameworks to practical settings like those in the NHS. I did this via our Improvement Academy podcast – Essential Implementation (https://www.youtube.com/channel/UCZu-S0m8tgInP9p6yOcuB-A). 

I already knew that implementation frameworks provided academic knowledge that was generalisable and teachable. But I also felt intuitively that ‘doing’ implementation might require a different type of knowledge. This is what Professor Ioan Fazey at the University of York has been writing about in his work around climate change. As Ioan explained on the podcast, we’ve got very good at problem analysis (about 95% of climate change papers analyse the problem) but we are still learning how to generate practical knowledge to actually address the problem (about 2% of climate papers do this). In order to generate practical knowledge we need to go and ‘do’ implementation, and through this process we will learn ‘how to’ implement. So we need both academic implementation knowledge and practical implementation knowledge.

I interviewed Laura Damschroder, the first author of the Consolidated Framework for Implementation Research (CFIR), a very popular implementation framework used across the globe, and asked her for her thoughts on practical knowledge. She explained to me that she can diagnose a context for implementation barriers using the CFIR, but actually overcoming those barriers involves engaging and working with receptive teams in the target setting. She combines implementation science with improvement science techniques and ensures teams are able to continually optimise the intervention over time.

The Improvement Academy, which forms part of the YHARC, is home to experts in improvement science who, over the last 7 years, have generated vast amounts of practical implementation knowledge. And what do they say about the practical side of implementation? They say that relationships are vital to any implementation strategy. In fact Allison Metz, one of the most cited implementation practitioners in the world, told me on the podcast that “relationships might actually be more important than the implementation strategy you use”. Relating is a difficult concept to quantify and measure, but anyone who has been involved in the practical side of implementation knows how important it can be.

Dr Ali Cracknell, another interviewee working with the Improvement Academy, managed to get Safety Huddles, a patient safety intervention, into 136 wards across the Yorkshire area. For Ali, safety huddles would not have worked had they not been developed over time via a process of stakeholder engagement and psychological safety, in which NHS staff were able to adapt the huddle to their specific needs on the ward. Dr Elizabeth Taylor-Buck and Amanda Lane told a similar story and were able to implement a patient-reported outcome measure into 80% of Trusts. High service user and clinician input to ensure high face validity from the start, stakeholder workshops, an advisory group to help with the knowledge transfer, and dealing with the ‘churn’ after the initial testing phase were all key steps to ensuring successful implementation.

In short, using frameworks in isolation, without forming relationships with practitioners in the target setting, can result in a mismatch between framework selection and its applicability in practice, as well as less positive implementation outcomes. Calls are growing for an embedded research approach, in which researchers embed themselves in the target setting and work to support practitioners in developing the interventions they feel they need. Practice-based research networks, learning health systems, and implementation laboratories are other ways to foster collaborations between researchers, implementers, and policy-makers integrated within a healthcare system to improve patient health outcomes.

Be sure to check out The Essential Implementation Podcast for further discussions of the practical side of implementation, as well as the latest guidance on how to implement interventions in healthcare settings.

Related Blogs

Implementation researchers’ perspectives on bridging the research-practice gap

In our blog earlier this year, we talked about simplifying implementation science and making it more accessible to frontline staff. In this blog we share critical insights that the implementation team, based at the Improvement Academy, has gathered while supporting and facilitating putting evidence into practice.

We need a more realistic approach to implementation in healthcare (Part 1)

Implementation science has, from its origins, been built on a push approach. We as implementation researchers find and know the evidence-based practices that ‘need’ to be implemented. We tell health systems, hospitals, schools, communities, and clinics about these interventions. We ask them if they want to engage with us in a clinical trial or implementation trial. If they agree, funding is provided externally or internally, local managers and their teams are informed that the project is happening, researchers are funded to learn about the implementation, and the executives and policy makers who gave the green light expect findings that tell them how to scale up these interventions. Implementation begins, and most pilots are pretty successful.

We need a more realistic approach to implementation in healthcare (Part 2)

Post Covid, a lot of people, and a lot of practitioners, are traumatised. In my field of work this was most evident on ICU wards. There is also the idea of traumatised systems. Data have shown how strained these systems were even before Covid, and the moral injury that practitioners experience in service settings. Unfortunately, there is just not enough in the literature, or in the big implementation science journals and conferences, that talks about these burnouts and these traumas.