Use of metrics helps IRBs improve operations with precision
Staffing vacancy, turnover reduced
IRB professionals might not have gotten into the business of protecting human subjects because of their love for mathematics and statistics. But many now are finding that tracking data and analyzing numbers helps them do their job better.
Also, IRB directors and research administrators are implementing quality improvement projects with greater focus and efficiency because of the metrics they've collected.
For instance, one research institution used metrics to improve its staff retention.
"In 2006, we had a 36% vacancy rate and 50% staff turnover rate," says Joseph O. Schmelz, PhD, CIP, FAAN, director of research regulatory programs at the University of Texas Health Science Center in San Antonio, TX.
After reviewing staffing metrics, the institution implemented changes that have resulted in a complete turnaround: "We really haven't had any vacancies or any turnover in the last two years," Schmelz says.
The office also used metrics to increase efficiency in the IRB approval process, says Dawn Lantero, PhD, research subject advocate at the University of Texas Health Science Center.
The IRB used metrics to define protocol complexity and route studies eligible for expedited review to a newly hired expedited reviewer. This produced a large decrease in submission-to-approval time for all studies: for studies requiring a full review, it dropped from 120 days in 2006 to 88 days in 2009, and for expedited studies it fell from 70 days in 2006 to 15 days in 2009.1
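Turnaround metrics like these come down to averaging the number of days between submission and approval for each review type. A minimal sketch of that calculation, using invented records and field names (the office's actual tracking system and data are not described in detail):

```python
from datetime import date
from statistics import mean

# Illustrative records only; a real office would export these from
# its protocol tracking system. Field names are assumptions.
submissions = [
    {"type": "full", "submitted": date(2009, 1, 5), "approved": date(2009, 4, 3)},
    {"type": "expedited", "submitted": date(2009, 2, 1), "approved": date(2009, 2, 16)},
    {"type": "expedited", "submitted": date(2009, 3, 1), "approved": date(2009, 3, 16)},
]

def avg_turnaround(records, review_type):
    """Average submission-to-approval time, in days, for one review type."""
    days = [(r["approved"] - r["submitted"]).days
            for r in records if r["type"] == review_type]
    return mean(days)
```

With the sample records above, `avg_turnaround(submissions, "expedited")` is 15 days and `avg_turnaround(submissions, "full")` is 88 days.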
The data also helped the institution create a new, grant-funded position of research subject advocate.
"We had to decide what would be the best activities for the research subject advocate to do, so Dr. Jenice Longfield and I came up with simple metrics and data to collect," Lantero says.
"For instance, in our grant it's important for us to be working with our affiliates, and my role is to be accessible to anyone who contacts me," she says.
Lantero uses metrics to help shape her role and schedule her time efficiently, so that she is available when people most need her and addresses the most appropriate issues rather than those better handled elsewhere.
Metrics also have proven useful for scheduling continuing reviews and maintaining a quick IRB review turnaround time, according to another IRB office director.
"We use a monthly project status report to track when expirations are happening," says Mindy Reeter, BS, CIP, director of the office of human research oversight, University of Illinois, College of Medicine at Peoria, IL.
The IRB's electronic system sends out 30-day and 60-day automatic email reminders. But the IRB office tracks these data to see whether a renewal period falls just after a board meeting.
"So we're able to help those people realize their expirations are happening out of sync with the IRB meeting calendar," Reeter says.
Before the office collected data on project expirations, they would discover these timing conflicts right before a meeting, giving them little time to adjust the schedule. Now the IRB office and researchers can avoid deadline conflicts, she adds.
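The core of this kind of tracking is comparing each protocol's expiration date against the board's meeting calendar and flagging any expiration that lands just after a meeting, when no meeting remains in time to renew. A minimal sketch under assumed dates and a hypothetical flagging window (the office's actual rules and system are not specified):

```python
from datetime import date

# Hypothetical monthly board meetings and protocol expirations;
# a real office would pull these from its IRB tracking system.
meeting_dates = [date(2024, 3, 12), date(2024, 4, 9), date(2024, 5, 14)]
protocols = {
    "HSC-101": date(2024, 4, 11),  # expires two days after the April meeting
    "HSC-202": date(2024, 6, 30),  # comfortably clear of the calendar
}

def off_sync(expiration, meetings, window_days=7):
    """Flag a protocol whose expiration falls within `window_days`
    after a board meeting, leaving no later meeting in time to renew it."""
    return any(0 <= (expiration - m).days <= window_days for m in meetings)

flagged = [pid for pid, exp in protocols.items()
           if off_sync(exp, meeting_dates)]
```

Here only "HSC-101" is flagged, so its investigator can be warned months ahead instead of days before a meeting.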
The IRB also has used metrics to better schedule staff time and deadlines. Workload is quantified precisely so the office can more accurately gauge the time required for a meaningful review.2
Metrics also can be used to show an institution how its IRB office is performing.
"It's hard to measure success, but the metrics do capture the types of activities that are ongoing in a human research subject program," Schmelz says. "Having those data presented as Dawn does for us is important for getting a sense of what happens outside of IRB committee meetings."
The University of Illinois IRB office collects data with assistance from IRBNet, an electronic solutions company, and can compare its efficiency and activities with national benchmarks.
For instance, the office found that it handles more protocols than the average member of the Association for the Accreditation of Human Research Protection Programs (AAHRPP), but has a more efficient convened and expedited review process.
"It's good to know we're working on a level comparable to AAHRPP-accredited institutions," Reeter says. "Our turnaround times are comparable."
The IRB office's average time to reach an expedited review decision is 14 days, while the average time to a decision for a convened review is 25 days.
Since the office began collecting metrics on how long the board took to reach a decision, there have been fewer complaints about delays from principal investigators, Reeter notes.
"The fact that it's being measured has meant that we work to get everything done quickly," she adds. "We don't let items sit in the queue, and our office work is improved because we want to improve our numbers."
When IRB offices collect metrics and review reports on their performance, they can maintain a continuous quality improvement environment, Reeter notes.
Without data, an IRB office's chief performance feedback might come from an audit by the Food and Drug Administration.
"You never get a good idea of how you're doing until the FDA shows up on your doorstep," Reeter says. "With these monthly reports we keep a finger on the pulse of what's happening."
Electronic data collection systems make this easier, but they're not necessary for collecting metrics.
"My method of collecting metrics was to use a spreadsheet, and anytime someone contacts me or I self-reference an issue, I write specific descriptors," Lantero says. "Our goal is to get an electronic database so that anyone in our affiliate sites can log into the database and send information to others who have access to the database."
For example, Lantero might be asked a question about whether a particular research staff member is approved to do specific study activities. Lantero can access data on education and training for particular employees and find a quick answer. An electronic database would make it even easier to find this kind of information, she adds.
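A spreadsheet log of this kind can be mined with very little tooling: each contact is a row with a date, a source, and a descriptor, and tallying the descriptors shows which issues recur. A minimal sketch with invented column names and categories (the office's actual descriptors are not listed in the article):

```python
import csv
import io
from collections import Counter

# Illustrative spreadsheet-style contact log; columns and categories
# are assumptions, not the office's actual descriptors.
log_csv = """date,source,issue
2009-01-05,affiliate,training status
2009-01-07,investigator,consent question
2009-01-12,affiliate,training status
"""

rows = list(csv.DictReader(io.StringIO(log_csv)))

# Tally how often each descriptor recurs; the counts are what would
# shape an advocate's schedule and priorities.
by_issue = Counter(r["issue"] for r in rows)
```

In this sample, "training status" appears twice, suggesting where accessible, up-to-date training records would save the most time.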
"In our system, the IRB office staff has created unique metrics for the parts of the program they're involved in," Schmelz says. "We all work together to make improvements based on the information the metrics give us."
IRB employees define the metrics, make changes, and present information to Schmelz.
"We organized the office last year and are moving toward a completely electronic system," he adds. "These metrics allow us to evaluate whether the electronic system is improving our program or giving us new problems."