Advancing Implementation Science: A View from the National Institute of Mental Health
An overview of the need, themes, progress, and funding resources for dissemination and implementation research.
Presentation Video  |   March 01, 2014
Author Affiliations & Notes
  • David A. Chambers
    National Institute of Mental Health, NIH
  • Presented at the Implementation Science Summit: Integrating Research into Practice in Communication Sciences and Disorders (March 2014). Hosted by the American Speech-Language-Hearing Foundation.
CREd Library, March 2014, doi:10.1044/cred-pvid-implscid3p2

The following is a transcript of the presentation video, edited for clarity.

This presentation is to give you more of a broad view of dissemination and implementation research opportunities at the NIH, but I did want to start out by saying where we started thinking about this problem in NIMH.
I'll tell you a little bit about where we are in terms of some of our NIH efforts. And ideally, we are working together to try to improve this connection between research and practice.
Conceptualizing the Challenge
What this shows is that there are sixty million people in the US with any mental disorder. Eleven to seventeen million, approximately, have serious or severe mental illness.
Of those, less than half get any services whatsoever.
Of that, maybe a third get what would be considered services of sufficient quality.
Most of the action is at this bottom part: of those who receive care, how many are fully benefiting? In most cases, only about a third fully benefit from our existing treatments, even in trials. Another third partially benefit. And even with the best available care, there is a third who really don't benefit at all.
The challenge is that so much of our effort historically has been on that smallest slice. Even if you got everyone who is receiving care to get good care and to benefit from it, you're still missing at least eighty-five percent of the need in the country.
So what we recognized is that the implementation problem, at least at the population health level, overwhelms any other impact that we might be able to achieve through our other kinds of research. And so we see this as a tremendous opportunity to drive huge improvements in population health. That is why we were very excited about the ability of you and others to advance knowledge in this field. Because it shouldn't be the case that you only have a fifteen percent chance of getting quality care. It shouldn't be the case that you have less than a fifty percent chance of receiving any services. So this is really our mandate.
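The cascade above can be sketched as a quick back-of-the-envelope calculation. This is only an illustration of the arithmetic using the approximate figures cited in the talk, not exact survey estimates; in particular, "less than half get any services" is rendered here as an assumed 45 percent.

```python
# Care-cascade arithmetic, using the talk's approximate figures.
any_disorder = 60_000_000     # people in the US with any mental disorder
receives_any_services = 0.45  # assumed value for "less than half get any services"
quality_share = 1 / 3         # "maybe a third get services of sufficient quality"

in_quality_care = any_disorder * receives_any_services * quality_share
share_of_need = in_quality_care / any_disorder

print(f"Receiving quality care: about {in_quality_care:,.0f} people")
print(f"Share of total need met: about {share_of_need:.0%}")
```

Multiplying the stages out (0.45 × 1/3 ≈ 0.15) recovers the roughly fifteen percent chance of receiving quality care mentioned above, before even accounting for the fraction who fully benefit from treatment.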
Current Progress
How far have we come?
Well, in 2001, when I got to NIH, there was tremendous variability in terminology: a lot of people speaking past each other about what this challenge of dissemination was all about. There was very little awareness that these were actually research questions. The prevailing thought was: don't they just read my papers, don't they just do what I said, why isn't the world changing the way I would like it to change?
There was minimal capacity within the field to really think about this systematically and scientifically. There was certainly not a shared vision at NIH about how we could try to bridge these gaps. At the time there were few opportunities to present and publish. We heard from many of our investigators, who felt they had learned a tremendous amount from trying, and in some cases failing, to implement effective practices in care, that they couldn't even publish their most interesting findings, because sometimes those findings were negative, and who wants to publish negative findings?
The commentaries were abundant. A large number of people were saying: this is a problem, we've got to do something about it. In fact, the number of commentaries actually outpaced the number of studies in the field trying to systematically look at these efforts to implement. It really wasn't a clear part of our research agenda. It was, as I said, assumed to be this handoff point.
The first thing that we felt very clearly needed to be done, at least for our purposes, was to come up with working definitions, so that at least NIH, or the institutes involved in these efforts, could say: here is how we're identifying these challenges. It didn't mean, and it still doesn't mean, that anyone in the field has to be a slave to these particular definitions; rather, it meant that at least you'll know where we are coming from, and whether this is the kind of thing you're interested in. In addition, we encouraged people who found other definitions, or were using other terms, to define what they meant, which would be really helpful across the field. Rather than people using terms they claimed were synonyms while actually defining things in different ways, at least we could come up with some shared language to define dissemination.
We defined dissemination as distribution: the spread of information and intervention materials to specific audiences. The idea is that you are trying to transmit a message, and ideally have it nicely packaged, received, and actually used in some capacity.
We contrasted that with implementation, which we saw as a much more intensive strategy: how to integrate interventions within specific health settings, and how to change practice patterns. We did not come at this alone. Jonathan Lomas from Canada had written a wonderful paper defining diffusion, dissemination, and implementation in the context of knowledge transfer, and we adapted those definitions because we thought they did a nice job of describing the different levels of intensity with which you're trying to inform and ultimately change practice.
We usually see a model where we are testing these interventions on one end and driving directly to health outcomes on the other.
And what we were missing was all of these potential cascading outcomes in between.
And so we saw the core of implementation science as being the how. How do you identify implementation strategies that really improve the feasibility of uptake of a particular intervention: the fidelity to it, the penetration through a system, the sustainability over time, the costs associated with implementation? But more broadly, we see that this entire picture is where implementation science needs to go. It needs to ultimately result in improved health outcomes. Otherwise, why are we bothering? If we get something in place that has no impact whatsoever on people's lives, then we are really not doing what we set out to do.
And so while we saw the core of implementation science as, to some degree, the prospective test of a strategy and its resultant implementation outcomes, what we really want to see is the effect. How is it improving the efficiency, quality, and accessibility of services? And how is it ultimately driving improvements, symptomatic and functional, as well as satisfaction and other things that make people's lives better? That is what we continue to see as our core mission in terms of implementation.
So we set out first with a mental health-focused program announcement, and then worked with other institutes within the agency on a broader program announcement. The first time around, we had eight other NIH institutes, which was more than we expected, who all said: this makes sense to us. Our investigators in cardiovascular disease, cancer, substance abuse, and so on were all struggling with the same problem, the limited uptake of things we've seen again and again, through RCTs and through meta-analyses, to be effective, but that are not getting out there. This was in late 2005. Over the three years, we saw nice growth in the portfolio. We saw forty projects that ranged from smaller R03s, to the middle-sized developmental R21s, to the large R01s. We were encouraged that there was a continuum of intervention types: treatment studies, implementation studies, prevention studies, and screening studies. They covered both clinical and community settings. Most of the studies were prospective. The modal application we were seeing was: how do we get this particular intervention into this particular setting? That was a good start, and there was a variety of approaches to accomplishing it, which we also liked. We didn't want everyone using the exact same approach, such that we would learn nothing more than the effectiveness of that one approach.
The second time around, we added another four institutes; at that point twelve of the NIH institutes and centers were involved. We ultimately had forty-six projects funded. We saw an enhanced focus on sustainability, which we see as very important. We saw some nice focus on improving measurement within the field. And we saw a continuum of intervention types. We saw more clinical topics covered, so we started to see applications and awards in dental and craniofacial research, which had not been a part of our portfolio before; in complementary and alternative medicine; and, helpfully, in studies recognizing that patients were dealing with multiple problems at once, not just the single disorder that a study may be limited to.
We also saw a nice range of designs. We saw experimental designs, quasi-experimental designs, and observational designs. Not every trial that made it through review successfully looked like a simple double-blind RCT. We were seeing a nice range, and the key thing was that the reviewers were saying: we want the design to be as rigorous as possible, but as relevant to the circumstances as possible. So it wasn't one-size-fits-all; it was the nuanced question of how to match the question being asked with the right design given the circumstances.
What we saw across these 86 grants were a lot of studies looking at the effectiveness of different approaches to implementation: quality improvement approaches, and efforts to change organizational structure, climate, and culture. We saw a number that were trying to train providers and provide supervision for ongoing quality assessment and improvement.
And we saw a number of them that were focusing on how do we change policy or practice at higher and higher levels, and how do we change financing models to try to make reimbursement more of a reality in some cases where traditional funding methods have not worked very well.
We also have seen some emerging approaches which have been well-used in practice, but not necessarily as well researched. So, we're starting to see more work looking systematically at learning collaboratives which have been popularized by the Institute for Healthcare Improvement and others, but we don't have as much data on what impact they provide.
And we saw more studies using technology as a support, to help implementation take hold and succeed in a given practice.
Building a Better System
We have more to come, and this is where you come in. We really do have a lot more work to do.
We have current program announcements. Now we have 15 Institutes and Centers that are involved. You'll see the acronyms, the sort of word salad, but it covers NIH broadly: nearly every institute that focuses directly on the patient as the end user. Some of our institutes are much more basic-science driven. A lot of our institutes cut across basic, clinical, and services research, and those are the ones represented. You'll see NIDCD, mental health, substance abuse, cancer, aging. The Human Genome Research Institute has joined forces because they recognize that as genomic medicine advances, there's not as much understanding of how best to integrate it with clinical care. This is an area where research has been exploding, but has it reached the individual who could benefit from it? In many cases, no. It's still on the horizon.
The other thing we negotiated successfully for, and we felt like we were a few years ahead of schedule, is that the Center for Scientific Review said, we know there needs to be special expertise to review these applications.
When we had the first couple rounds of the program announcement, we had special emphasis panels, which were essentially a new study section every round. And they decided that this was one area where NIH needed a more standing, consistent review process. So in 2010, they created the Dissemination and Implementation Research in Health (DIRH) study section, which remains the place where, if you apply through these program announcements, your application will always go. Submissions used to be accepted only every other round, but again there was a recognition at NIH that there was enough demand, enough applications coming in, to justify accepting them each and every round.
Just a few of the themes, you can obviously look at the program announcements which again have R03s, the R21s, and then the larger R01s. But some of the themes we wanted to spotlight, because we definitely need more, are things related to improving sustainability and ongoing improvement of interventions.
We want more of an emphasis not just on thinking about the individual intervention, but on how you can take multiple interventions and make an evidence-based care system. Because we know that going one intervention at a time, within a study or even in practice, is not very efficient and probably not meeting people's needs.
We really do need -- for any of you who are on the methodological side -- we really need more development and use of new designs and new measures. There have been a couple of efforts to pull together existing measures. We've recognized there are still gaps in that. Where you're thinking about a construct that's not well measured, that's a very effective way to use the program announcement. It will help us all.
We've gotten more interested in modeling approaches and system science approaches to dissemination and implementation. Trying to look at how you estimate impact. And look at available data to look at which strategy might be better than others. That's very well encouraged.
More and more we're recognizing that if we just focus domestically, we may be missing tremendous opportunities to learn. Seeing more implementation science in the global health context is absolutely encouraged. In fact, some program announcements will say they are only open to domestic institutions or domestic investigators; this one is open much more broadly. What you need to do is say how the knowledge is going to be relevant in the US, but in our experience there is such diversity in this country that it hasn't been all that difficult to explain how a given context could give us information on what happens in the US.
As I said, there's the study section. Martha Hare, who is the scientific review officer, will routinely connect with programs to say, "What do you think? How is it going? Do you feel like there are other areas of expertise we should get on the committee?" And we've had a nice opportunity to inform that committee.
Just to leave you with this: There are a number of resources. If you're newer to this area, it's hard to say, what should I attend to?
We've tried to make as many of these resources available as possible. There's certainly access to the funded grants. We have a couple of websites, OBSSR and the National Cancer Institute actually publish lists and links to abstracts of everything that's been funded in this space. We now have Research Centers, we have CTSA cores, we have Research Networks that are devoted to dissemination and implementation.
The OBSSR-led Summer Training Institute -- all of the materials from the first three years are available online. The whole philosophy of the TIDIRH training institute is of a train-the-trainer model. We're looking for people who may not have a lot of collaborators locally to help to build that, to build capacity in your local institutions.
Over recent years there has been a premier journal called Implementation Science. People had said, "Where do I publish something about implementation?" It's pretty clear from the title that this is a pretty good option.
Then Brownson, Colditz, and Proctor had pulled together this volume a couple years ago, which does represent a lot of the key advances of the last ten years in the field. If you haven't seen this volume, at least take a look at some of the chapters from it.
The one last thing we've done over recent years is try to create a venue for people, new to the field and experienced alike, to get together with NIH and the VA. We had five large meetings that really grew, as you can see. At the sixth meeting, we had these smaller working groups. The goal is to build capacity around training, around measurement, and around research design. In the coming months, ideally, we'll have more of the dissemination and the products from those meetings out to the field to learn from. It's really leveraging a lot of wonderful people's efforts around improving measurement, around being thoughtful and innovative with research design, and around the kinds of training materials that can be available, for some of you who are new to the field, or for others who can teach this work to others.
Questions and Discussion

Question: Are other NIH Institutes joining the program in dissemination?

There are a couple of different ways we have seen institutes engage in this. As I said, 15 of them have signed on directly to the program announcements. There are others who have some level of enthusiasm, but each institute has its own way of setting research priorities.
What is nice about DIRH, which wasn't the case when it was a special emphasis panel, is because it's a standing committee, even if you submit investigator-initiated, so not through a program announcement, there is the ability to have that same review at DIRH. The key thing with any institute is to get in touch with the program contact. If people don't see the Institute on the list and they have ideas that they think are relevant, I'm very happy to work with you to find the right contact.

Question: Thinking about conceptualizing projects that would fit into this, there must be lots of projects that come through that fit into our typical study section, as well as into D&I. Do you have any thoughts or suggestions about framing projects and making sure they go to the right place?

The more general answer is that this is why it's so good to connect with program staff prior to submission. Program staff, who often have applications scattered across ten or fifteen study sections, get a really good understanding of the nuances and of which kinds of applications may be most appropriate for each.
We do see within DIRH more and more of these hybrid studies, where they are trying to answer questions about the effectiveness of interventions while informing implementation. That is the committee that's been receptive to that. But I would always start with, here are the key questions that I'm most interested in answering. Let program staff help to identify the right place. I think it can be a challenge to try and fit a particular study section. Especially if it's not quite what you want to do. I think there's enough diversity among the study sections and history of seeing different permutations with different levels of influence of say, intervention, effectiveness or efficacy with implementation, that I think the better thing is to have that individual discussion. Some expertise may be more abundant on certain study sections.
There's not a perfect answer, but the best thing to do is have that dialogue and say, here's what I'm thinking. Then in your cover letter, you absolutely have the right to specify, this is the committee that I think makes the most sense for the following reasons. That makes it easier for the referral offices to say, "Okay, that makes sense." If it doesn't of course they have the discretion to go elsewhere.
References
Brownson, R., Colditz, G., & Proctor, E. (Eds.). (2012). Dissemination and Implementation Research in Health: Translating Science to Practice. Oxford University Press.
Department of Health and Human Services. (2013). PAR-13-055 Dissemination and Implementation Research in Health (R01). Available at http://grants.nih.gov/grants/guide/pa-files/PAR-13-055.html
Kessler, R. C., Chiu, W. T., Demler, O., & Walters, E. E. (2005). Prevalence, severity, and comorbidity of 12-month DSM-IV disorders in the National Comorbidity Survey Replication. Archives of General Psychiatry, 62(6), 617–627.
Lomas, J. (1993). Diffusion, dissemination, and implementation: Who should do what? Annals of the New York Academy of Sciences, 703(1), 226–237.
Merikangas, K. R., He, J., Burstein, M., Swendsen, J., Avenevoli, S., Case, B., Georgiades, K., Heaton, L., Swanson, S., & Olfson, M. (2011). Service utilization for lifetime mental disorders in US adolescents: Results of the National Comorbidity Survey–Adolescent Supplement (NCS-A). Journal of the American Academy of Child & Adolescent Psychiatry, 50(1), 32–45.
National Institutes of Health Office of Behavioral and Social Sciences Research. (2015). Dissemination and Implementation. Available at http://obssr.od.nih.gov/scientific_areas/translation/dissemination_and_implementation/index.aspx
Proctor, E. K., Landsverk, J., Aarons, G., Chambers, D., Glisson, C., & Mittman, B. (2009). Implementation research in mental health services: An emerging science with conceptual, methodological, and training challenges. Administration and Policy in Mental Health and Mental Health Services Research, 36(1), 24–34.
Substance Abuse and Mental Health Services Administration. (2010). Results from the 2009 National Survey on Drug Use and Health: Volume I. Summary of National Findings. Available at http://archive.samhsa.gov/data/NSDUH/2k9NSDUH/2k9Results.htm
Tinkle, M., Kimball, R., Haozous, E. A., Shuster, G., & Meize-Grochowski, R. (2013). Dissemination and implementation research funded by the US National Institutes of Health, 2005–2012. Nursing Research and Practice, 2013.
Wang, P. S., Lane, M., Olfson, M., Pincus, H. A., Wells, K. B., & Kessler, R. C. (2005). Twelve-month use of mental health services in the United States: Results from the National Comorbidity Survey Replication. Archives of General Psychiatry, 62(6), 629–640.