Practical Tips for Managing a Multi-Site Collaborative Study
Advice on information and data management, leadership, and promoting fidelity.
Tools, Tips and How To  |   November 01, 2013
Author Affiliations & Notes
  • Mary Pat Moeller
    Boys Town National Research Hospital
  • The content of this page is based on selected clips from a video interview conducted at the ASHA National Office.
CREd Library, November 2013, doi:10.1044/cred-lpm-tth-001

I was fortunate, again, because Dr. Tomblin had done ten years of research that was an epidemiological study of children with specific language impairment. We were able to set our project into somewhat of a pre-existing framework. Bruce had already figured out some of that.

Information and Data Management

Some of that involves having a large shared database. We have a SharePoint resource where all the investigators communicate with one another, leave files, and organize all of our infrastructure. But it took stepping back and thinking about what everyone across all the sites needs to know, and how we are going to share our information and resources across sites.

We do something called a "data freeze" so that we don't have an investigator in North Carolina looking at a dataset that differs from one that the investigator is looking at in Iowa. Everyone gets their data up to date and entered in the database by X date when we're going to "freeze" the data and everyone's working on that dataset for a period of time, until there's the next update and data freeze.
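The freeze described above amounts to taking a dated, immutable snapshot of the live database so every site analyzes the same version. A minimal sketch of that idea, assuming a simple CSV export (the names `freeze_dataset` and the file layout are illustrative, not from the study's actual system):

```python
import shutil
from datetime import date
from pathlib import Path

def freeze_dataset(live_csv: Path, frozen_dir: Path, freeze_date: date) -> Path:
    """Copy the live dataset to a dated, read-only snapshot.

    After the freeze date, analyses point at the snapshot file,
    not the live table that sites continue to update.
    """
    frozen_dir.mkdir(parents=True, exist_ok=True)
    snapshot = frozen_dir / f"dataset_freeze_{freeze_date.isoformat()}.csv"
    shutil.copy(live_csv, snapshot)
    snapshot.chmod(0o444)  # read-only, so no site can silently edit the frozen copy
    return snapshot
```

The point of the read-only copy is exactly the problem described above: an investigator in North Carolina and one in Iowa both reference the same frozen file until the next scheduled update.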

Leadership

There's just some of those practical management issues. But beyond that, there's critical aspects of leadership.

Your leadership needs to be strong, so that everyone is being communicated with. You're not leaving any of your players out of the conversation. The leadership benefits when it's complementary, and I would say that is a major strength on our team. Dr. Tomblin and I as the leaders of this project have very complementary strengths. It took us a little while to discover that, but then we realized what strength there was in that, and so it's very easy for us to know, Bruce should take the lead on this, I should take the lead on that, because our strengths lie in different areas. Rather than seeing that as a limitation, you leverage that as a strength.

I'd have to say leadership is key when you want to get three research centers working together collaboratively. We've made important decisions. We have a set of guidelines on authorship, for example. You can really get into difficulty when you have many players in the mix on authorship, so we always have open conversations up front when we're starting a manuscript: who's going to lead this manuscript, who else are we going to involve in this manuscript, is there anyone we've left out? And we've made very clear guidelines about how you earn authorship on a manuscript. It's not just a matter of participating in data collection, for example; it needs to be part of the intellectual contribution. That's something that we did up front that has allowed us all to work in a collegial manner and make smart decisions about authorship.

Procedural Fidelity and Quality Control

You cannot assume -- although I think I went into this naively assuming everyone would do it the same way. You really can have a lot of messiness in your data if procedures are not being followed. There are a couple of strategies that we used. One was developing very good manuals to guide each of our procedures. So even a standardized test, we have specific guidelines about how the examiners are to utilize that material.

I got a late PhD. I worked in the clinic for 30 years before I went back to get a doctorate. Those clinical insights are invaluable as a researcher. That history of being in the clinic. But there are certain ways in which your clinical insights can blind your objectivity in a research enterprise. We are bringing clinicians into a research project, and they might have that same little barrier to get over that I experienced: Okay, this has got to follow the rules and the guidelines of this protocol to a T. I can't have some wiggle room because I know if I do it this way, I can get the child to give me that response.

We have a speech-language pathologist at the University of Iowa who is our quality control expert. She is very detail oriented. We actually send DVDs of administrations from every site to her, and in the beginning she'd do a lot of quality control coaching with each of those providers: "oops, the protocol needs to be this, not that." It only took some of that feedback to help the examiners realize "oh, I need to follow the protocol in this way," and they became very committed to that objective, realizing what an important quality control it was.

We also have the examiners double-score each other's protocols, all data gets double-entered. Some of it is automated so the computer checks our scoring. So we have some of these checks and balances within the infrastructure that really make us feel like it's a quality project. It's not that we don't make errors -- of course we do. Each of us can find ourselves violating protocol without even realizing it. It happens. We just try to minimize that.
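The automated cross-check mentioned above, where two people enter the same data independently and the computer flags disagreements, can be sketched as follows. This is a hypothetical illustration; the field names and the function are not from the study's actual system:

```python
def double_entry_discrepancies(entry_a: dict, entry_b: dict) -> list:
    """Compare two independent entries of the same protocol.

    Returns a list of (field, value_a, value_b) tuples for every field
    where the two entries disagree (including fields present in only one),
    so a human can resolve each mismatch against the original record.
    """
    mismatches = []
    for field in sorted(set(entry_a) | set(entry_b)):
        a, b = entry_a.get(field), entry_b.get(field)
        if a != b:
            mismatches.append((field, a, b))
    return mismatches
```

An empty result means the two entries agree and the record can be accepted; anything else goes back to the examiners for resolution, which is the "checks and balances" role this kind of automation plays.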

I can't say enough about the investment in training. We have long training sessions, a very detailed training manual, detailed coding manuals, people trained to follow the protocol, and then that after-the-fact submission of DVDs so that we're monitoring whether the protocol was followed and what the fidelity of the administration was.