Implementation Science: Realities and Lessons Learned
Research Myths and Reality   |   November 01, 2014
Author Affiliations & Notes
  • Wenonah Campbell
    McMaster University
  • The content of this page is based on selected clips from a video interview conducted at the ASHA Convention.
CREd Library, November 2014, doi:10.1044/cred-impl-rmr-001
Why Implementation Science?

Implementation science could really inform how we, as a field, ensure that the work we're doing -- the evidence-based practices we develop -- actually gets put into use.

I think we have great examples from other fields to draw from about how it can be used. There's a statistic reported in the research: if you just develop an evidence-based practice, put it out there, and hope uptake will happen, then about 17 years later only around 14% of that research has been integrated into practice (Balas & Boren, 2000). You'll see that statistic cited in some of the talks given at the Implementation Science Symposium.

That percentage improves dramatically when you use implementation science -- when you engage with people so that the entire way you're working at change, and providing supports, is integrated into the whole system. That way, what you've developed, this wonderful thing you have, this innovation, is actually put into use.

Implementation Research Can Start Small and Grow Over Time

It often starts small. The project I'm involved in right now, Partnering for Change, is at a large-scale stage of implementation, but it started with a pilot study. So what I've learned is that yes, you can get to this stage in implementation science. If you're a new researcher or an early investigator, or a senior investigator doing something new, it can seem overwhelming at first. But realize that you build those foundations by finding someone in the community -- someone in a school system, someone in a health system -- who is eager to make change, who is thinking innovatively, who is okay with trying new things.

You start there. And often, once you have those links to the community, those people then have the connections of who you need to be talking to. They have partners, they have collaborators. Eventually, you start to bring a team together. And that team grows over time.

At this stage in our work, we now have representatives at what would be the equivalent of the state level in the US -- the provincial level in Canada -- but those relationships weren't all there at the beginning.

Interdisciplinary Collaborators Are Critical

Doing implementation science requires collaboration, and it often requires being interdisciplinary.

Look around you. It's wonderful to have the perspectives of other people. So, if you're doing work that's going to go into an education system, then you want to have people from education on your team. You want to have other health professionals on your team. You want to have people who have expertise in different kinds of research methods. There's a whole host of things.

I think implementation science is something that is best done collaboratively: by having those different perspectives, by the networking those people bring to you, and also by sharing the workload. When you do implementation science, you have to build relationships with partners in the community, and when you get to the stage where you're ready to scale up your evidence-based practice, that's going to require a number of different roles -- and the PI can't do them all.

That's another thing I've learned. I'm not the PI on the project, I'm a co-investigator. But our PI is a wonderful mentor, and she's very good at recognizing there are many roles to have, and that part of having a wonderful team around her is that we have different ways we can contribute and move the work forward.

Your Communication Plan Needs to Extend Beyond the "Front Lines"

The other lesson we learned was the importance of communication.

Which is kind of interesting because I am a speech-language pathologist by background, and communication is our mainstay. But as a research team, when we started out, we had a whole plan designed around how to communicate with all the front line people who would be impacted by our introducing this new service delivery.

We did tons of presentations, and we had all kinds of materials. But we focused on the front line folks. We hadn't really thought about the whole group of other people who work in schools and health organizations -- people who don't deliver this model directly, but who are going to see the occupational therapist in the school or have some contact with our program. In order for us to have the best and most seamless entry, we had to help them understand it, too.

Some of the early barriers we encountered were around people not understanding what the project was: why the OT was in the school, what her role was, what Partnering for Change was. Going into Year 2, we made sure that we did very widespread presentations across whole organizations, and that we met with groups -- even if they weren't the people implementing the model or touched by it directly -- so they knew why we were there and what we were doing, and had opportunities to ask questions.

We now tailor our presentations. We don't give one standard presentation. If we're talking about this intervention to speech-language pathologists, I go give the presentation. I'm an SLP, and I tailor it for them. I really put in information that's specific to them and their needs. We tailor to school psychologists. We tailor to special education resource teachers. So that's one thing we've learned, is this idea of communication, messaging, tailoring.

Reports on Barriers and "What Didn't Go Well" Have Impacts Outside of Your Research

When we thought about introducing our model, we were very aware of the fact that a school culture is going to matter. What that school culture is like is going to matter for how easy it's going to be to embed occupational therapists in there. We thought about the culture of health agencies who fund the service and how that might interface with the school. What we didn't really think about was our culture as academics, and how that influences how we interact with organizations that are out there in the community.

When we were studying the implementation process through qualitative interviews and focus groups, part of what we wanted to identify was what worked, where the successes were, what was going well. But we also wanted to identify what was not going well, what we did not do right, where the barriers were, what went wrong. As researchers we thought, "Well, that's great. That's exactly what we want to know." You learn as much from what didn't go well, or where the barriers are, as you do from the successes.

When we started to share that with our partners, we didn't realize right away that, because they're an organization where delivering this service is their job, sharing things that didn't go well can actually affect how their job performance is evaluated.

So, if problem X was identified that could be a barrier, our perspective was, "Oh, well, that's great. We learned about this. Isn't that wonderful." But from their perspective, it's like, "That could really impact that person because that's a part of that person's job."

One of the things we learned to do is to think about how we could share everything we learned, and be completely honest about what the data show -- but do it in a way that isn't hurtful to our partners.

The strategy we came up with was being solution-focused and having action-oriented objectives. We now apply it whenever we communicate, before we put anything in a formal report or document that's going to go out and be used by people in management, or even by people at higher levels of government. Because you send these things out, and they work their way up through the system.

So we talk to our partners. We meet with them, and we say, "Okay, here was the barrier. What are we going to do about it? Let's implement a solution. Let's come up with our action plan. And let's actually work through it first." Then, when we share that, we've now presented it as, "Yes, there was a barrier. We can document it. But we also have found a solution." Then it works for both of us -- and it moves our project forward. Because now we've fixed the problem.

So that was something that was completely unexpected the first time it happened. We went, "Oh. That's really important to understand, and we need to find a way to communicate that works for everyone."

Further Reading: Resources of Interest
Balas E. A. & Boren S. A. (2000). Managing clinical knowledge for health care improvement. In Bemmel, J. & McCray, A. T. (Eds.), Yearbook of Medical Informatics 2000: Patient-Centered Systems (pp. 214–216). Stuttgart, Germany: Schattauer Verlagsgesellschaft mbH.
CanChild Centre for Childhood Disability Research. Partnering for Change. CanChild: Current Studies (Available from the McMaster University Website at http://canchild.ca).