Presentation Video  |   March 01, 2014
Applied Implementation Research: Addressing the "How" of Real World Practice
Author Affiliations & Notes
  • Renée Boothroyd
    National Implementation Research Network (NIRN), University of North Carolina at Chapel Hill
  • Presented at the Implementation Science Summit: Integrating Research Into Practice in Communication Sciences and Disorders (March 2014). Hosted by the American Speech-Language-Hearing Foundation.
  • This content is licensed under Creative Commons license CC BY-NC-ND, Attribution-NonCommercial-NoDerivs. You are free to share, copy, distribute and transmit the work under the following conditions: Attribution —You must attribute the work in the manner specified by the author or licensor (but not in any way that suggests that they endorse you or your use of the work); Noncommercial —You may not use this work for commercial purposes; No Derivative Works —You may not alter or transform this work. Any of the above conditions can be waived if you get permission from the copyright holder.
CREd Library, March 2014, doi:10.1044/cred-pvid-implscid1p3

The following is a transcript of the presentation video, edited for clarity.

For our time today, the aims are bold. I want you to think of this time as an orientation to lay out some big ideas. In an effort to manage expectations, I'm not sure we will get around to sorting out that whole word salad. But you will get a sense of what some of this work looks like in different ways.
I want to spend just a couple slides going into some of the definitions to give us some anchoring and some bearing. And then characterize what we do know about this kind of work and what it takes to apply evidence in real world practice. We are always reminded it can be quite noisy, quite complex, and dynamic -- meaning it changes over time. And then we will consider an application or two, and in particular, encourage you to do the same.
Defining Implementation
Some basic, key aspects of what we might call implementation are underlined here, in terms of putting something into practice as well as fulfilling or carrying something out. Really what's implicit there is a process: we're going from something that we know. We have some sense of it, whether it's a kernel or, in some cases, maybe a more defined package, if you will, a kind of evidence-based program.
But there's no guarantee that that's going to lead to success because there's a process of going through putting the program into place and getting to those outcomes.
What we do know about research to practice in the context of this work is that the research has really expanded over several years. We do know more about some things that are working, not only in controlled, but also in some real world settings through other kinds of effectiveness trials.
As I mentioned before when it comes to putting those same kind of programs or evidence-based interventions into practice, it's not really a carte blanche kind of issue that they'll automatically lead to the same kind of outcomes.
In fact we probably have many examples of -- I wouldn't necessarily call them failure articles, but certainly those learning opportunities, where things didn't quite work out as we expected. Where there are some qualitative evaluation or other kinds of aspects that are then helping us understand where things did not go exactly according to plan, or where we maybe had failed to plan, that we really need to keep in mind as we go forward. And some of those factors that really seem to influence the extent to which these things rolled forward or didn't.
In this case, this is the same program that had varied levels of implementation, discrepancies in attaining some level of "is this being implemented as intended." Not so much in a rigid way, but certainly in a robust way, with a sense of core ingredients.
If we refocus then back on the "how" of getting to outcomes, and the science of what works, there is a science that could help us differentiate between: is it the program that's really not working in this situation, or is it an issue about how it's being implemented that we need to pay attention to as well.
It can really help us guide application of what it is that we're learning to improve implementation practice. And I'll touch on a little bit of that in terms of the work that I'm involved in in a couple of different projects.
So in terms of implementation, here are just a couple of definitions, really again just reinforcing some of the aspects of looking at uptake and application, with a real emphasis on routine practice in these kind of real world, noisy situations.
In particular, colleagues at NIRN -- Karen Blase and Dean Fixsen, who were the cofounders of NIRN five or six years ago -- have really taken this a bit further in terms of a more applied sense of what we mean by implementation. That's really a focus on those conditions and variables that impact, and that are critical for leading to, change at the practice, the organization, and the system level. So you'll hear me talk in particular more so about that kind of applied implementation.
One of the things that I wanted to spend a little bit of time on is what I might call a differentiation between what I'm referring to as implementation, and what sometimes is an implementation strategy that in and of itself is an intervention.
In a lot of situations, we talk about a particular delivery strategy like a reminder recall system, in this case maybe to work towards strengthening immunization rates in a particular project. What I'm really talking about is not limited to a kind of delivery strategy as implementation per se.
What we're doing is taking a step back and asking, "What does it take to do X?" This could be a reminder recall program, it could be a patient navigator program, it might be a particular evidence-based intervention. But we're really asking what are those implementation practices and processes that are targeted at the level of the practitioner, of the clinician, of the organization and the system. In other words, how can we create those conditions and their behavior to support full and effective use of whatever the strategy might be.
I'll elaborate more on what those processes, what we know from implementation research so far, about those implementation related processes and practices.
But there's a slight little nuance here. The implementation processes that you're putting into place are trying to reach a set of outcomes, and those implementation outcomes are specific to the practitioner, to the clinician, and to the organization. In a way, those implementation processes are a kind of independent variable, and, in the case of the clinic staff and the system, they are the dependent measures that are affected by these implementation approaches.
And then, if you will, there's a bit of a shift, because the degree to which they can put an effective intervention in place as intended becomes the intervention that's related to outcomes.
This is just a visual to describe that there's an implementation approach that has particular implementation objectives and outcomes, and we really need to be paying attention to those, whether it's before the intervention or simultaneous to it in a kind of hybrid approach, but doing so in a way that is also really staying focused on implementation.
These are just a few reminders that pull some of those big ideas from that how aspect of getting to outcomes, that implementation works within real world conditions rather than trying to control for them or remove them.
It's very concerned and focused on the users of research, and not purely on the production of knowledge.
And it definitely occurs as part of a system, and sometimes an implementation approach might be very focused on a strategy, knowing that there's noise behind it but not necessarily paying attention to the noise. In the work that we're doing we're paying attention to those complex and dynamic systems, which really suggests a multipronged, multilevel approach to implementation, and paying attention to different things.
What this slide tries to describe is that even with an intervention that's effective, we need to focus on implementation. In particular, there's a lot of research that does demonstrate that pipeline in getting to effects. There's a low percentage, even over a long period of time, that actually get to what we might call full implementation. That's typically in the absence of what might be a more active, a more comprehensive, focused approach to implementation.
In the presence of that kind of implementation team, other kinds of factors that are paying attention to different levels of implementation, we're getting a lot more effective implementation of those interventions. Not overnight, but maybe in a three to four year time period.
Some of the literature characterizes that as a "letting it happen" or "helping it happen" approach, in contrast to "making it happen" -- the effective use of implementation science and practice.
It's not to minimize or to discredit any of these other approaches because these are not mutually exclusive. There's a cascade of behaviors when you think about implementation, in terms of finding a fit, deciding to adopt, going through a whole process of making it happen.
So diffusion and dissemination and other kinds of approaches are essential, they're just not quite sufficient necessarily for the kind of uptake that we might be looking for.
A lot of what I'm going to talk about is that purposeful and proactive approach. This is just reminding us that implementation includes, but is not limited to, some of the other approaches that are involved in research to practice. And if you think about it, oftentimes some of the diffusion or dissemination is very focused on the innovation itself, while implementation is focused on that process, and it focuses on the use of that innovation in a real world kind of situation.
What Do We Know: Active Implementation Frameworks
What do we know about implementation science so far? We do know that, kind of like rain not quite reaching the ground, there are some well-designed, well-intended approaches that aren't quite getting us the uptake or impact in terms of implementation that we would like.
The best data is showing that these kinds of methods, when they're used alone, don't necessarily result in use of the innovations as intended for getting to the outcomes that matter and the outcomes that we're trying to get at. Training alone or dissemination alone or other approaches, they don't quite get us the return on investment that we're looking for.
We're kind of at a stage moving from making lists to making sense. We know enough about factors that are influencing implementation practice, especially in the last five or six years in light of a few very comprehensive reviews that have looked at implementation frameworks or models, that are pointing to a core set of themes that are important, or factors that are important to address.
I've listed just a couple of examples of what some of those things are here, in terms of paying attention to different levels of context -- you know, the whole idea of the practitioner as well as the organization and the system. And this process work happens in a phase-based approach. This isn't turn it on, turn it off. There is a set of activities and work that can be done in implementation to set you up for success.
You've heard from many of us, and I'm sure you will continue to hear the importance of regular monitoring and feedback -- feed forward and feedback -- that's critical in multiple layers of this work for ongoing attention to and improvement to implementation.
What I'm going to talk about, then, as kind of a culmination of some of this, and a representation of it, are what are called the active implementation frameworks that are part of the work we do at NIRN.
The active implementation frameworks suggest that successful implementation really starts with the what. We need some specificity about the essential ingredients -- about what is the what -- so that when it's being replicated, we know what those essential ingredients are and we have an idea of how to measure the extent to which they're present and the strength they might have.
Successful implementation really does require active use of implementation best practices. These are evidence-informed best practices in terms of some implementation work that's been done. And we might call those the drivers of implementation.
I already kind of alluded to the idea of purposeful activities that occur within stages of this work.
And a deliberate, purposeful focus on people, or the who of this work, especially in kind of systems and transformational change projects.
Implicit in there is that focus on continuing improvement or improvement cycles.
These visuals describe the five frameworks that I just alluded to. I'm just going to briefly touch on maybe a core aspect of each of them.
What: Usable Intervention Criteria
I'm going to anchor some of this into what's called this kind of formula for success that reminds us that it's not just attention on the what to get to the outcomes that we care about, there's a real deliberate focus on the how.
If you pay close attention you realize that there's a multiplying factor here. So imagine if either one of these is zero, or is put into place at a less-than-optimal level: then you're only getting a fraction of the kind of outcomes that you're looking for.
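As a toy, hypothetical illustration (the variable names and numbers below are mine, not from the talk), the multiplying factor works like this: because the factors are multiplied rather than added, a weak score on either one drags down the whole result.

```python
# Hypothetical sketch of the multiplicative "formula for success":
# effective intervention x effective implementation -> outcomes.
# Scores here are on an illustrative 0-to-1 scale.
intervention_effectiveness = 0.9    # a strong, well-evidenced intervention
implementation_effectiveness = 0.5  # a half-hearted implementation effort

# Multiplying (not adding) means the weaker factor caps the result:
relative_outcome = intervention_effectiveness * implementation_effectiveness
print(relative_outcome)  # 0.45 -- barely half the potential impact
```

The point is simply that a strong intervention cannot compensate for weak implementation, or vice versa; either factor at zero zeroes out the outcomes.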
Let me start with the what and emphasize that you'd be really surprised at the extent to which these proven interventions are not really defined in enough detail so that they are functional, operational, very specific. So that they're teachable, that they're learnable, that they're measurable, and repeatable. That's really one of the critical aspects of where implementation has to start.
In some cases, purveyors of certain evidence-based programs, or even in the case of some of the work that I'm doing with the California Department of Child Welfare, where they're developing an evidence-informed practice model, some of the real work starts with operationally defining the what.
So you can say, what are you seeing, what are people saying and doing, to be able to say, behaviorally, this is what we mean by engagement. Or this is what we might mean by outreach, or different aspects that might be essential functions.
How: Drivers, Improvement Cycles, Enabling Contexts
Now let me move on just to the how and touch briefly on some of those other frameworks that I mentioned.
This slide summarizes that those drivers are factors that are involved in implementation, they pay attention to the person on the left in terms of the competency drivers, that user or that deliverer, if you will, of the intervention or the practice or the kernel.
It pays attention to the organization in terms of the degree to which the organization can remove barriers and pay attention to feedback from users, and create data systems to inform decisions that can help have practice inform policy, and have policy then enable practice.
And it pays attention to leadership, and by that I don't mean a philosophical commitment, because we can all say, "Oh yeah, I support that." I mean taking an active role in paying attention and removing barriers, and being a voice and a message for the kind of change that is being sought. There are many more elaborations that we can talk about there.
This "how" also continues in a couple of ways with these improvement cycles in terms of some rapid can-do study act approaches. And this really also reinforces this idea, especially with implementation, and there's probably gradients to this as you think about a specific, very particular strategy, compared to maybe a more multipronged kind of practice approach that you might be starting to maybe implement in a slice of an organization.
In other words, we're not trying to apply this practice model with the California Department of Child Welfare in every county in California all at once. We're starting with a slice to work out some of the kinks, to begin to think about how something like coaching can really fit into an organization that's very compliance-oriented. Or to begin to incorporate direct observation as a way to measure delivery of the intervention in a system that hasn't really used direct observation at all, or has used it in a punitive kind of way.
This just reinforces starting small and using some of those feedback cycles to work out kinks as you then maybe consider scaling up down the road.
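The plan-do-study-act cycle behind those feedback loops can be sketched abstractly. This is a minimal, hypothetical illustration (the function names and toy numbers are mine), showing how each small round feeds what was learned back into the next plan.

```python
# Minimal sketch of a rapid plan-do-study-act (PDSA) improvement cycle:
# plan a small change, do it, study the feedback, act on what was learned.
def pdsa(plan, do, study, act, rounds=3):
    state = plan()                      # initial plan for the small test
    for _ in range(rounds):
        result = do(state)              # carry out the small-scale change
        findings = study(result)        # examine the feedback it produced
        state = act(state, findings)    # adjust before the next cycle
    return state

# Toy usage: iteratively nudge planned coaching hours toward a target of 6.
final_hours = pdsa(
    plan=lambda: 2,                            # start small: 2 hours/week
    do=lambda hours: hours,                    # deliver the planned hours
    study=lambda delivered: 6 - delivered,     # measure the gap from target
    act=lambda hours, gap: hours + gap // 2,   # close half the gap each round
)
print(final_hours)
```

The point is not the arithmetic but the loop: study and act happen every round, on a small slice, rather than once at the end of a full rollout.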
This other aspect, using information in an ongoing fashion, is just reinforcing that using data -- connecting this kind of information about implementation, about coaching, about the barriers that are getting in the way of rollout -- can inform the work in an ongoing way.
To influence not only policy...
... But also practice.
All this happens within what we need to have as an enabling context, and I think that's why we always need to remember that systems trump programs.
We can take any effective innovation, and if we put it within a system as-is, it might work for a little while, it might not work at all, but likely it's not going to work long term. And that's because these organizations and systems are really designed to, whether it's intentionally or unwittingly, achieve precisely the results that they get.
If we're not paying attention to how this particular practice model, in this case, influences how a supervisor does their work, and to how the organization might need to adjust in response, the system will keep producing the results it was designed to get.
This isn't saying that the entire system has to start from scratch, but if we start to pay attention to things that are or are not working in service to what we're trying to put into place, we might be able to change a couple of practices. The return on the investment is huge in terms of really creating a facilitating and enabling context for this kind of work.
An effective innovation needs to be supported by that organization, and this active implementation approach creates ways for the system to do that, to pay attention to it in different ways.
Who: Implementation Team(s)
One of the earlier slides really picked up on this idea of who. We've learned that this work needs a deliberate, purposeful focus on implementation, and this often takes the form of starting with a few people, a portion of whose time is repurposed to pay attention to some of this -- whether it's coaching, data systems, or linked communications with leadership groups -- so that they can intentionally be building readiness, installing these implementation practices, and really assessing and reporting on outcomes.
How/When: Implementation Stages
And finally, as mentioned in some of the other aspects of the talk so far, this work definitely unfolds in stages. Some of the frameworks name them different things, but essentially we're talking about some pre-implementation, some initial start-up, and then certainly looking towards more long-term, sustained use. The full rollout often takes multiple years; it's not overnight. But at the same time, we're paying attention to specific aspects of implementation as we go.
It's not as if we are expecting some kind of latent feedback three or four years later to let us know if we're on the right path.
Example and Final Thoughts
What this one slide tries to outline is an application of some data from, in this case, a child wellbeing project.
It was putting into place a lot of those drivers or factors related to implementation around coaching, around ongoing assessments.
And also then measuring fidelity -- again in a robust way, not a rigid way -- to really look at the percent of child wellbeing cases in a service organization that were meeting some of the definitions of those essential ingredients.
Early on, as some of the work was beginning to roll out and get traction -- and maybe in some cases not get traction at all -- we revisited some of this with a score for how we were able to measure the rollout of those factors.
You can see that it was lower as things were first getting started and being put into place, with a number more towards one, and then a lower percentage of those cases.
Over time, those implementation factors were really being addressed. In this case there was a real emphasis on training, with the 1.5, which actually went down a little bit -- and that's because coaching really started playing a role.
Implementation is really teaching us that if we don't have ongoing ways to apply and learn from this in our everyday work, then the return on investment is very low.
So we might have behavioral rehearsals in training, but we need ongoing feedback in an instructional and supportive way to be getting that kind of traction. In this case, they were able to begin to explore some associations between what implementation looked like and what fidelity looked like.
So let me just leave you with a few kind of considerations to take with you. One is that implementation is very much a deliberate and purposeful process, and by default, if it's working in those real world conditions, then it is multilevel. It involves people, it involves organizations, it involves systems.
The second is that with these implementation strategies and processes, what we're really looking for are implementation outcomes. It's that kind of step back: we're looking for changes in professionals' behaviors, in organizational behaviors, and in systems behaviors. Those then set the stage, by an intervention being fully and effectively in place, for getting to outcomes. So it goes back to that hybrid or kind of parallel process approach in the design of some of this work.
And I can't stress enough that we know enough at this point -- not to say we're not going to learn more; this work is going to keep teaching us -- but we know enough about what influences implementation to create better conditions, more learning labs if you will, to put some of these practices into place and to be studying those, rather than just going back and saying, you know what, training alone is still insufficient, it's not going to quite get us there. So let's take what we are learning, and what we have learned from implementation so far, and implement that into our practice, using those as our learning labs for advancing the research.
So in that light, these implementation frameworks are not really an end point. They're really a new beginning, a way to take steps forward in understanding implementation best practices, science, and policy. And that really suggests for us this idea of starting small and working through the rollout of these strategies to effect the kind of change that we really want.
Acknowledgements
So I leave you with a few acknowledgements of some of the work that informs this, and call your attention to that Active Implementation Hub.
Further Reading: Websites of Interest
National Implementation Research Network (NIRN). (2015). National Implementation Research Network. University of North Carolina at Chapel Hill FPG Child Development Institute.
State Implementation and Scaling-up of Evidence-based Practices Center (SISEP) & National Implementation Research Network (NIRN). (2015). Active Implementation Hub. University of North Carolina at Chapel Hill FPG Child Development Institute.