Data collection guidelines use specific criteria
It’s just gotten much easier for Joint Commission on Accreditation of Healthcare Organizations surveyors to size up how your organization compares to others. "The ORYX and core measure data are part of the Joint Commission’s initiative to look at hospitals in a more uniform manner, so they can compare one hospital to another," explains Judy Homa-Lowry, RN, MS, CPHQ, president of Homa-Lowry Healthcare Consulting, based in Metamora, MI.
With ORYX, organizations were able to select their own indicators, so there may have been a tendency to choose things that were easy to measure but didn’t adequately address critical elements of the health care being provided, she says.
According to Homa-Lowry, another problem with ORYX is that a number of vendors were allowed to participate in the program, which made using the data for comparative purposes difficult because the various vendors used different severity or risk-adjustment methodologies. "If you’re not comparing apples to apples, it makes the comparison difficult."
In contrast, the data collection guidelines for core measures use very specific criteria, says Homa-Lowry, adding that core measures also address patient teaching initiatives such as smoking cessation in addition to the disease process.
Each of the core measures is linked with a clinical service group, notes Christine McGreevey, RN, MS, manager of the Joint Commission’s accreditation systems integration and accreditation operations, giving the example of heart failure and acute myocardial infarction being linked to the cardiology group.
"If there are undesirable outlier performances on any of those measures, then cardiology might be flagged as an area to which we will direct the patient tracer," she says, noting that patient tracers are determined by an aggregation of many types of data.
Smaller hospitals typically have a harder time benchmarking with comparative data because their numbers are so small, Homa-Lowry explains. "Unless you apply statistical methods to try to project some of the outcomes, it is hard to compare."
If you see very few patients with a given diagnosis, you may wonder, "How significant is one out of five patients?" she says. "Now, with the adjustments Joint Commission makes, it will be easier for smaller hospitals to take a look at how they compare to other organizations. It will also be helpful to have a document to assist with data analysis, as opposed to a larger hospital that may have an operations analysis department doing a lot of that for them."
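Homa-Lowry's point about small denominators can be made concrete with a quick calculation. The sketch below uses a Wilson score confidence interval (my choice of illustration; the article does not specify which statistical method applies) to show why a 20% rate from 5 patients is far less informative than the same 20% rate from 500 patients:

```python
import math

def wilson_interval(events, n, z=1.96):
    """95% Wilson score confidence interval for a proportion.

    Illustrates why a rate from a small denominator (e.g., 1 of 5
    patients) is hard to benchmark: the interval is very wide.
    """
    p = events / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return max(0.0, center - half), min(1.0, center + half)

# A small hospital: 1 case out of 5 (rate 20%)
small = wilson_interval(1, 5)
# A large hospital: 100 cases out of 500 (also a rate of 20%)
large = wilson_interval(100, 500)
print(small)   # roughly (0.04, 0.62) -- too wide to benchmark against peers
print(large)   # roughly (0.17, 0.24) -- a usable estimate
```

Both hospitals report the same 20% rate, but the small hospital's plausible range spans nearly the whole scale, which is why risk- and volume-adjusted comparisons matter for low-volume facilities.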
However, a key concern with the mandated core measures is, "Who is going to do all the data collection?" says Homa-Lowry, adding that some insurance companies now are mandating quality measures. "Some of the organizations have budgeted for additional staff, and others have not. It will really be a challenge."
To streamline your organization’s data collection, do the following:
• Eliminate redundant or fruitless collection.
Take a close look at all the mandated data your organization is collecting, and cut whatever is not necessary, Homa-Lowry recommends.
"Instead of continually adding on, you need to eliminate data collection that is not helpful," she says. "Take a hard look to determine whether your data collection activities are worthwhile and provide good outcomes, and try to eliminate unnecessary or repetitive data collection."
• Assign a case manager to individual physicians instead of by floor.
This fosters an ongoing relationship and better results with core measures. It also allows the quality department to share data analysis with individual physicians and work directly with the chairman of the department, Homa-Lowry says.
• Cross-train staff.
Does your organization have case managers, utilization reviewers, risk managers, and quality professionals all collecting similar data? Look for ways to cross-train staff to increase efficiency and avoid duplicating efforts, Homa-Lowry advises. "You may have people getting into the charts numerous times for something that is only a little bit different."
For example, if a case manager is looking into the records of a pneumonia patient, it would make sense for him or her to do a concurrent audit on that patient to meet the core measure requirements, or at least a portion of them, she suggests. "If they are not in compliance, they can try to get in compliance while the patient is still an inpatient. They have a better opportunity to bring that patient into compliance before they report that data externally."
Sometimes, the relationship between case management and nursing isn’t delineated clearly, adds Homa-Lowry. "Those two departments could sit down and talk about the responsibilities for data collection. The analysis could go to the quality department so when the data come in, they could look for trends and send the information to the appropriate department for correction."
That way, there is an opportunity to deal with the issue both concurrently and retrospectively, she points out.
In many organizations, core measures have been tacked onto someone’s existing responsibilities, Homa-Lowry notes. "Depending on what the volumes are, it can be a huge responsibility for one person to do all the abstracting, reporting the data, analyzing it, attending the meetings, and giving feedback to key people about what outcomes were." For that reason, core measures now are being designated as a separate position in some organizations. "It’s almost sprouted up as another specialized area, as opposed to trying to integrate it into existing processes," she says.
You are better off integrating the core measures into existing data collection processes, adds Homa-Lowry. "When you cross-train staff, you are not as dependent if an individual goes on vacation — or you may have a special quality investigation and that staff are pulled off to do a root-cause analysis or some other quality responsibility."