The Common Metrics Reports: your go-to conversation starter for process improvement

It’s relatively simple: you can’t fix something that you don’t know is broken. In many ways, that’s the idea behind the Common Metrics Initiative: help CTSA Program hubs gather and evaluate data that illuminates not only their strengths but also their areas for improvement across a set of key translational science metrics.

One of the metrics each hub reports on is IRB Duration, which measures the time it takes a study to receive final Institutional Review Board (IRB) approval. This metric is critical because the longer a study or trial takes to get approved, the greater the barrier to advancing research and improving public health.

After receiving their 2016 Common Metrics (CM) Report, which analyzed three metrics, including IRB Duration, the University of Illinois at Chicago’s Center for Clinical and Translational Science (CCTS) realized there was work to be done on their IRB turnaround time.

“It became clear that we weren’t where we wanted to be, and the report was a springboard for having much-needed conversations with leadership and other institutional research groups about potential process changes,” said Elizabeth Hawes, director of administration at CCTS. “We’re now having regular discussions about how we can collaborate to address this frustrating barrier for our researchers.”

And for CCTS, the CM Report not only sparked conversations but also validated long-standing concerns. While they may have suspected in the past that their IRB turnaround time was lagging, they now had hard data to back up their claims and suggestions.

“It’s one thing to say, ‘We need to improve upon this,’ and it’s another thing to say, ‘Here’s where we stand in comparison to 50+ other hubs,’” said Hawes. 

Hawes also noted that the CM Report helped eliminate some of the age-old excuses used to justify performance results. Rationalizations like “that hub is a private institution” or “that hub receives more funding” no longer held up.

“If a small, state-funded institution is able to achieve quick IRB turnaround time, why can’t we?” said Hawes. “We’re excited about connecting with colleagues at other hubs to learn about best practices and effective process changes, and we’ve already begun to schedule meetings.”

Cross-hub comparisons like these are exactly what the CLIC CM team hopes the initiative and its reports will continue to inspire.

“It’s not meant to be a competition, and we know there are so many varying factors that contribute to the data, but we want it to serve as an opportunity for hubs to share insights and lessons learned,” said Ann Schwartz, CM Quality Improvement Specialist at CLIC. “We encourage hubs to reach out to us, and their CTSA Program colleagues, to help make sense of the data and develop appropriate strategies.”

As for what Hawes hopes other hubs will learn from CCTS’ experience? It’s all about what you make of it.

“The reports are not a ‘silver bullet’ nor are they meant to be. But they’re absolutely the conversation starter we needed to make strides towards action. You don’t have to change your hub’s culture or processes overnight, but make an effort to use these reports to keep communication open.”

The next iteration of the CM Reports will be available at the end of February via the CLIC website. The CLIC CM team will host two webinars after the report’s release, on March 18 and March 26, to provide an overview of the analysis process and allow time for questions. Click the linked dates above to register, and reach out to the CLIC CM team with any questions.

Interested in sharing your hub’s story with CLIC? Fill out this quick form.