Many states try to tie incentives — both negative and positive — to health plans meeting certain clinical and administrative goals for their Medicaid health programs. But only one, Rhode Island, has decided to forgo the stick and use only a carrot of cash bonuses for plans that meet specific requirements.
For the last five years, the three health plans that serve the Medicaid population in that state have worked with each other and the state to collect 22 data sets and meet state goals tied to them. The result: more of those goals are met each year, and health care has improved.
And all of this has happened without the threat of financial penalty or embarrassing press reports of failures. Indeed, the financial rewards that the plans can reap — this year the total bonus pot is a little more than $1 million — come on top of what the state considers fair Medicaid reimbursement rates.
"We don’t hold back money from the capitation rate and then provide the bonus," says Tricia Leddy, administrator of the Center for Child and Family Health at the Rhode Island Department of Human Services in Cranston. "We negotiate fair rates, and this is added value for quality work."
Leddy oversees RIte Care, the Medicaid managed care program in the state. She says that payments to the plans generally reach about $700,000 per year. In 1999, one plan alone earned more than $463,000. How much a plan can earn is based on a combination of performance and the number of Medicaid enrollees in the plan. The more members the plan has, the bigger the potential payoff.
The plans — Neighborhood Health Plan of Rhode Island (NHPRI), United Health Care, and Blue Cross/Blue Shield — are required to collect 22 data sets and submit them to the state. The state then collates the data and presents them in aggregate to the plans, both together and individually. When the plans do well in aggregate, the state also holds press conferences to tout that fact, but Leddy says it does not single out any one plan for doing better than the others.
The measures are in three broad categories: administration and management, access, and clinical care. Some are items that the plans already collect for other organizations, and some are different. "We wanted to collect a mix of measures that goes across populations," Leddy explains. "We do use HEDIS [Health Plan Employer Data and Information Set] definitions when possible, but we are interested in other items, like lead screening, that HEDIS doesn’t measure."
Access is something the state emphasizes in its data, in part, says Leddy, because "if you can’t get into the system, then you can’t even get to HEDIS measures." So the plans are required to provide data on whether new members are sent plan identification cards and benefits booklets in a timely manner.
The data sets were chosen on the basis of site reviews conducted before the program's inception, which indicated areas of importance, she says. "We wanted to find items that are actual indicators of something valuable. Administrative measures can indicate administrative capacity. But you also have to be careful to pick things that are available across all plans."
The plans were involved in the process, and Leddy says that was important to winning their wholehearted support. "We also made use of our consumer advisory council, which is very vocal and involved." The consumers were the ones who wanted to put the emphasis on access to care, grievance and appeals procedures, and whether or not members have a primary care physician.
Since the program started, the same items have been measured, but Leddy says there are some pilot measures being considered. And in the future, once the goals have been attained and maintained for a period of time — perhaps five years — then some measures may be dropped and others added.
While the plans appreciate the endeavor to bring quality measures to a population that largely has been ignored when it comes to benchmarking, they do find some of the measures more valuable internally than others.
For instance, the clinical measures often differ from the clinical data being collected for the National Committee for Quality Assurance (NCQA) or other organizations, says Beth Ann Marootian, MPH, director of quality management at NHPRI, which has about 70,000 Medicaid members.
Some of the clinical measures are based on encounter data, and Marootian thinks that combining encounter data with chart reviews would provide more meaningful results. For example, immunization rates based on claims data don't show whether a child is receiving the first or second dose of the MMR vaccine, only that the child received a dose. Only chart reviews can drill down to that more meaningful level, Marootian says. Most of NHPRI's quality improvement programs relating to clinical care require chart review, but the RIte Care incentive program does not.
Licensure and accreditation data are the most important statistics NHPRI collects, Marootian says. "I consider RIte Care statistics the third level," she says. "We measure a lot already for NCQA. Some of what the state wants us to measure makes sense, but there are other HEDIS data sets that are audited by certified vendors that they could look at. We are trying to promote that, but I think the state has invested a whole lot in encounter data."
Marootian is happy to hand the information over to the state, but the usefulness of it is limited, she continues. "I don’t use it for the clinical stuff that much. Where I do see value is in the operational data. And in the end, it’s just very cool that they care about this. The orientation is just right. To provide incentives to the health plans is what they should do as stewards of the Medicaid program."
Marootian says she feels bad for states where the positive incentives are either replaced with penalties or tempered by what are called nonfinancial incentives. Usually, that means public disclosure of the data. That's great if a plan does well, but it's hardly an incentive if a plan needs to work on a particular issue.
Leddy agrees. "We don’t want to embarrass plans or single one out," she says. "That doesn’t go with our determination to make this a true partnership. If we had sanctions, it would change the flavor of our program. This way, it’s always seen as a positive."
And even in tough economic times, it seems to be a cheap way to improve quality. The annual Medicaid budget in Rhode Island is a billion dollars. "This is less than 0.1% of that, and even in very tough times, I wouldn't be the one to take it out of the budget," Leddy says.
When a plan is found lacking in an area, the state takes a collaborative approach, offering guidance on how the plan might do better next time. Often, the plans work together to find solutions. Recently, all of the plans were found lacking in providing interpreter services during office appointments. One big reason, the plans found, was that physicians had a different form to fill out for each plan. They collaborated to create a single form that physicians can use for the service no matter which plan a patient is on. That simplified matters for the doctors and made them more willing to comply with interpreter requests.
Marootian thinks the state could do more to foster that kind of collaboration. "It has been the next step of taking the data and bringing plans together to talk about best practices and key areas to work on that needs to be expanded."
It happens occasionally. Marootian says that postpartum care now is a pilot measure, in part as a result of the three plans coming together to talk about it. "We knew it was a problem in the Medicaid population," she says. "Our HEDIS data tells us that. But we, not the state, made this a priority. I think they could become more of a driver in fostering change." Part of the problem could be a continuing vacancy for a medical director at the agency. "They need the clinical leadership to make this happen," she says.
Despite that criticism, Leddy says she has been impressed with the willingness of the plans to work together on the program. And it has paid off. Plans meet more of their goals every year. "The only exception is when we have rapid growth like we did in 1999," says Leddy, noting that the number of enrollees increased from 75,000 to 105,000 during that year. "We can see how the large influx of new enrollees impacted the improvement." But considering the very high level of the goals, Leddy is happy with how it has gone.
In general, Marootian gives the state high marks for the program. "We have a very collaborative Medicaid agency, and one that has the member in mind. They are working to improve health care, not make the life of the HMO a living hell. Because of that, the Medicaid health plans are willing to participate in this program, and we have become more collaborative ourselves in our approach to this work."
[Editor’s note: A report on Medicaid incentive programs was released in March by the Center for Health Care Strategies (CHCS) in Lawrenceville, NJ. To view the report, which covers the Rhode Island program along with others in Iowa, Massachusetts, Wisconsin, and Utah, visit the CHCS web site at www.chcs.org/publications/pdf/ips/bailitperformance.pdf.]
For more information, contact: