What began as an effort to better define Virginia Commonwealth University's key performance indicators in research turned into a comprehensive database that shares performance data openly, improves transparency, and eases the administrative burden on principal investigators (PIs) and study teams.
VCU's C. Kenneth and Dianne Wright Center for Clinical and Translational Research (Wright Center), a Clinical & Translational Science Award (CTSA) Program hub, used the Common Metrics Initiative (CMI) as a springboard for collaboration with its Institutional Review Board (IRB) to determine primary institutional key performance indicator (KPI) definitions and management metrics. The goals were to better monitor IRB performance and to avoid the debates that can arise over IRB cycle times, budgets, contracts, and other measurements. Not long into these discussions, it became apparent that university departments and offices were not all using the same operational definitions and that data collection was siloed.
“Early in the project, it was clear that each stakeholder, from the IRB to the cancer center, measured in different ways and sometimes for very good reasons. We were speaking different languages, thus coming up with different approaches and different results to the question: How long does it take?” said Lisa R. Ballance, Executive Director of Clinical Research and Compliance, VCU. “We needed a shared and trusted approach to measuring not only IRB cycle times but also coverage analysis, budget, contracting, and big-picture metrics such as time to activation.” This need coincided with an initiative from the Wright Center Informatics Core to consolidate IRB, Sponsored Programs, and Clinical Trials data into a common data model and to provide a set of dashboards on clinical trial performance at VCU.
The Wright Center brought the stakeholders together and provided experienced technical and project management support through the Informatics Core. Conference-room debates over numbers and definitions soon gave way to a cohesive group of stakeholders with a shared goal: using metrics to answer questions from different perspectives, guide resource decisions, and drive efficiency through a shared metrics portal.
The portal allows any clinical research stakeholder to see performance metrics and toggle views that may be critical to their reporting or other evaluative criteria, such as the median cycle time between a full-board IRB submission and IRB approval, by calendar year or fiscal year. In addition, a user can click on a record and log into the IRB's system or the Sponsored Programs' system to check on details that may have affected that timeline. Collecting data on categories of delay will help stakeholders identify improvements that drive efficiency.
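To illustrate the kind of metric the portal surfaces, here is a minimal sketch of computing median full-board cycle time (submission to approval) bucketed by fiscal year. The field layout and the July 1 fiscal-year start are assumptions for the example, not details of VCU's actual portal or schema.

```python
from datetime import date
from statistics import median

def fiscal_year(d: date, start_month: int = 7) -> int:
    """Label a date with its fiscal year (assumed here to start July 1)."""
    return d.year + 1 if d.month >= start_month else d.year

def median_cycle_times(records):
    """Median days from full-board submission to IRB approval, per fiscal year.

    `records` is an iterable of (submitted, approved) date pairs; this
    shape is hypothetical, not the portal's actual data model.
    """
    by_fy = {}
    for submitted, approved in records:
        by_fy.setdefault(fiscal_year(approved), []).append((approved - submitted).days)
    return {fy: median(days) for fy, days in sorted(by_fy.items())}

records = [
    (date(2019, 8, 1), date(2019, 9, 15)),   # FY2020, 45 days
    (date(2019, 10, 1), date(2019, 11, 5)),  # FY2020, 35 days
    (date(2020, 7, 10), date(2020, 8, 24)),  # FY2021, 45 days
]
print(median_cycle_times(records))  # → {2020: 40.0, 2021: 45}
```

Bucketing by approval date and keeping the fiscal-year start a parameter lets the same function serve the calendar-year and fiscal-year toggles the portal offers.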
Having a single metric portal consolidated performance data — previously captured in different silos and with different terminology — in one location. In implementing the CMI to develop the portal, VCU also achieved greater consistency in operational definitions and the ability to monitor IRB performance transparently, and it spurred a collaborative group to openly focus on IRB improvement.
The portal also greatly reduced the burden on PIs, which was especially beneficial during the COVID-19 crisis. Working on a short turnaround, VCU prioritized its human research studies into tiers to identify which studies would continue and which needed to be put on hold or conducted via telehealth. All COVID-19 clinical studies, as well as studies with no alternative treatment or in which participants would be adversely affected by stopping, were classified as tier one, meaning they had to continue no matter what.
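The tier-one rule above can be sketched as a simple triage function. The field names are hypothetical, and assignment of the lower tiers (hold versus telehealth) is omitted because the article does not spell out those criteria.

```python
def is_tier_one(study: dict) -> bool:
    """Tier one = must continue no matter what, per VCU's prioritization:
    COVID-19 clinical studies, studies with no alternative treatment, or
    studies whose participants would be harmed by stopping.
    (Field names are illustrative, not VCU's actual schema.)"""
    return bool(
        study.get("covid19_study")
        or study.get("no_alternative_treatment")
        or study.get("harm_if_stopped")
    )

print(is_tier_one({"covid19_study": True}))        # True
print(is_tier_one({"telehealth_feasible": True}))  # False
```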
The team uses REDCap to collect investigators' justifications for keeping studies open, route the requests for chair and dean approval, and then automatically send the complete record into the IRB system. Transactions are displayed in interactive Tableau dashboards. These applications cut administrative time and helped build the IRB's trust by facilitating rapid communication and positioning the Wright Center as a problem-solver. The IRB now trusts the Wright Center to push information into its system, which was not possible before. In addition, Tableau enables easy data sharing with the health system.
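The first leg of that workflow, getting a record into REDCap programmatically, can be sketched against REDCap's standard record-import API (a form POST to a project's `/api/` endpoint). The instrument fields, token, and URL below are illustrative assumptions; VCU's actual REDCap project and its downstream push into the IRB system are not public.

```python
import json

def build_redcap_import(token: str, record: dict) -> dict:
    """Build the form payload for REDCap's standard record-import API call.

    REDCap expects token/content/format/type fields plus the records
    serialized as a JSON array in `data`. The record fields shown by the
    caller are illustrative, not VCU's actual instrument.
    """
    return {
        "token": token,
        "content": "record",
        "format": "json",
        "type": "flat",
        "data": json.dumps([record]),
    }

payload = build_redcap_import(
    "HYPOTHETICAL_TOKEN",
    {
        "record_id": "IRB-2020-0001",  # illustrative record
        "pi_justification": "No alternative treatment available",
        "chair_approval": "1",
        "dean_approval": "1",
    },
)
# An actual submission would then be sent with an HTTP client, e.g.:
#   requests.post("https://redcap.example.edu/api/", data=payload)
```

Keeping payload construction separate from the HTTP call makes the routing step easy to test without touching the live REDCap instance.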
“The CMI really allowed us to get into the discussions that we needed to have, but your data is only as good as your agreed-upon operational definitions,” Ballance added.
- The CMI process can be a catalyst for greater collaboration with the IRB and can empower the institution to work together to improve efficiency
- Consistent, agreed-upon operational definitions are vital to ensuring good data
- The process allows the IRB to recognize its own improvement needs and to make data-driven decisions to improve performance metrics
- Harmonized operational definitions and easily accessible data enabled improved teamwork in responding to COVID-19 study submissions
Share Your I2I Success Story
Your story can inspire other hubs in the consortium. The Center for Leading Innovation and Collaboration (CLIC) would like to share your Insights to Inspire (I2I) success story. If you have a story to highlight, please contact us at firstname.lastname@example.org.