Quality professionals have been told for years, regularly and with great enthusiasm, that they should use “big data” to radically improve quality and outcomes, but many found the effort difficult and the results disappointing. Now, however, it may finally be possible to put the wealth of data available to healthcare leaders to work for meaningful change.
The difference now is that the amount of data has greatly increased and it is much more accessible, several experts say. The widescale adoption of the electronic health record (EHR) produced a rapid accumulation of data and created an infrastructure that makes it much easier to access in useful ways, says Mark Wolff, PhD, chief health analytics strategist with SAS, a data analytics firm based in Cary, NC.
“Over the years there have been many pronouncements that not only technology, but also the amount of information available to that computational power, will finally create some dramatic paradigm shift,” Wolff says. “And that once that occurs, it will be truly transformative.”
Big data is different from the type of data routinely used in healthcare because it involves extremely large data sets analyzed to reveal patterns, trends, and associations. The idea of using big data for significant change goes back as far as 1959, when the first paper on the use of big data was published, Wolff notes. Another prominent paper with the same declaration appears every 10 or 15 years, he says.
“So here we are in 2017, effectively saying the same thing. Computational power is not a limitation now so perhaps now the data are going to drive this revolution,” Wolff says. “There are reasons now that this time we probably will see a dramatic shift in outcomes analysis, quality, and the standardization of care as a prerequisite to delivering higher-quality, lower-cost care.”
The first reason is the availability of the data in digitized form, he explains. Healthcare providers always have had a great deal of information they could use, but that data usually sat in the basement in manila folders with colored tags. Now that the data is digitized, even medical imaging can be deconstructed into data that can be analyzed. Representing imaging data as mathematical matrices is leading to potential advancements such as the ability to automate the diagnosis of lung cancer.
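To make that idea concrete: a grayscale scan is simply a two-dimensional matrix of pixel intensities, which is what makes it amenable to the same analytics as any other numeric data. The sketch below is purely illustrative; the random “scan” and the crude features stand in for a real image and a real diagnostic model.

```python
import numpy as np

# Illustrative only: a fake 512 x 512 grayscale "CT slice" as a matrix of
# pixel intensities. A real scan would be loaded from DICOM files.
scan = np.random.default_rng(1).integers(0, 256, size=(512, 512))

# Once the image is a matrix, simple numeric features fall out directly.
features = {
    "mean_intensity": scan.mean(),
    "intensity_std": scan.std(),
    "bright_fraction": (scan > 200).mean(),  # crude proxy for dense tissue
}
print(features)
```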
Combining that volume and type of data with the technological improvements for analyzing massive amounts of information creates a significant opportunity for healthcare quality improvement, Wolff says.
“We’ve never seen anything like this,” he says. “It really is quite dramatic.”
Vast amounts of healthcare data can transform population health analytics, Wolff says. Statistical sampling was developed because the technology did not exist to look at large amounts of data, he notes; the data had to be sampled to bring the volume down to a manageable size. Today, sampling is rarely necessary. It is possible to look at the data from hundreds of millions of patients all at once in an analytic platform. (For examples of how big data helped improve population health and reduce drug errors, see the stories later in this issue.)
“Without sampling, we have the power to identify small groups, outliers, unique events,” Wolff says. “Medicine is about looking for something that is different, a similarity among individuals, a genetic combination that is meaningful, behavior information. With massive data, I can go to the hospital and they can do patient matching, comparing my condition to hundreds of millions of individuals over time. You can identify the people who look most like me and then identify what was done and what the outcome was.”
That allows personalization of medicine that does not rely solely on genome sequencing or other complicated applications, he says. Some cancer researchers have even suggested that oncology is so complex that it is unethical to treat patients without computer algorithms.
“We’ve approached and surpassed the limit of human cognitive abilities to understand not only the volume of information available, but what is relevant and needs to be addressed,” Wolff says. “The complexity of the disease and the treatments, with different combinations producing different results for different cancers in different people, means that we have to use technology to deal with the information overload.”
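As a rough illustration of the patient matching Wolff describes, the sketch below scores one patient’s feature vector against an entire cohort at once, with no sampling, and returns the closest matches. The feature vectors, cohort size, and similarity measure are invented for the example, not drawn from any actual platform.

```python
import numpy as np

rng = np.random.default_rng(0)
cohort = rng.normal(size=(1_000_000, 12))  # stand-in for millions of patient records
me = rng.normal(size=12)                   # the patient to be matched

# Cosine similarity of this patient against every record in the cohort at once.
scores = (cohort @ me) / (np.linalg.norm(cohort, axis=1) * np.linalg.norm(me))

# The 100 most similar patients; their treatments and outcomes would then
# be pulled from the record to inform this patient's care.
top_matches = np.argsort(scores)[-100:][::-1]
print(top_matches[:10])
```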
Healthcare providers are looking more closely at how to use the data available to them, says Anne McGeorge, U.S. and global managing partner of healthcare with the Chicago-based consulting firm Grant Thornton. The movement to a value-based payment system puts more pressure on healthcare organizations to use available data to make good clinical decisions for population health, she says. In effect, hospital leaders are learning how to make money by keeping their patients out of the hospital rather than by bringing them in.
“A big piece of that is the strategic use of data,” she says. “By being able to capture the data at least in their electronic record systems, they feel they can more effectively manage the health of their patients. Healthcare leaders are realizing this is an opportunity and they are looking for ways to make it happen and move forward.”
McGeorge worked with one hospital that had 250,000 patients in its EHR database and knew that 50,000 of them were smokers. The hospital used the data to reach out to each individual, offering smoking cessation assistance and other information encouraging a healthy lifestyle. Many diseases and other issues can be tracked the same way, with individuals identified for intervention, she notes.
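A cohort pull like the one McGeorge describes can be as simple as filtering an EHR extract on a smoking-status field. The sketch below assumes a CSV export with hypothetical column names (patient_id, smoking_status, contact); an actual EHR would be queried through its own reporting tools.

```python
import csv

def smokers_for_outreach(ehr_extract_path: str) -> list[dict]:
    """Return an outreach list of current smokers from a hypothetical EHR extract."""
    with open(ehr_extract_path, newline="") as f:
        return [
            {"patient_id": row["patient_id"], "contact": row["contact"]}
            for row in csv.DictReader(f)
            if row["smoking_status"] == "current"
        ]

# Each entry would feed a mailing or call list offering cessation assistance.
```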
In comparison to other industries, the healthcare industry is lagging in making the most of big data, she says. The retail industry, for example, uses customer data to target individual customers with good results. She recalls one department store that identified customers buying pregnancy tests and offered those customers information on pregnancy care, diapers, and other baby items. The same rationale could be used for identifying patients who are likely to follow a pattern and respond to them at appropriate stages, she says.
“The healthcare industry has focused on providing good clinical care to their patients, and they typically achieve that with high marks,” McGeorge says. “But sometimes the organization doesn’t see the analysis of big data as a clinical function or even a core function. By moving more toward analyzing big data, hospitals can evaluate how they’re doing in treating sick patients and in keeping well patients out of the hospital.”
Hospital leaders seeking to make more use of big data should start with a deep dive into their EHR system, McGeorge says. A hospital’s enterprise resource planning (ERP) system, if it has one, also will be a good source of data.
“The ones with a robust ERP system can actually do a deep dive into the analytics covering, for example, the cost of certain procedures, even to the cost of caring for that patient from the moment he or she walks in the door to the moment the patient is discharged. Knowing the fully loaded cost of taking care of a patient is a huge step toward being able to analyze profitable departments, procedures, and pharmaceutical protocols, which can change behaviors and create more efficiencies in how some of the clinical care is delivered.”
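A fully loaded cost analysis of the kind described here boils down to rolling up cost line items by encounter and by department. The sketch below uses pandas with invented column names and figures to show the shape of that roll-up, not any actual ERP schema.

```python
import pandas as pd

# Invented ERP-style cost line items; a real extract would have thousands of rows.
line_items = pd.DataFrame({
    "encounter_id": [101, 101, 101, 102, 102],
    "department":   ["ED", "Imaging", "Pharmacy", "Surgery", "Pharmacy"],
    "cost":         [850.00, 1200.00, 96.50, 14300.00, 410.00],
})

# Fully loaded cost per patient encounter, admission through discharge.
per_encounter = line_items.groupby("encounter_id")["cost"].sum()

# The same items cut by department, to compare cost across service lines.
per_department = line_items.groupby("department")["cost"].sum()
print(per_encounter, per_department, sep="\n\n")
```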
Hospitals also should look at the capabilities of their existing systems, she says. Many hospital leaders do not fully understand what their existing technology can do, or whether a big data effort would require a significant investment in additional technology, she says. System interoperability also is important.
New technology can make use of data that researchers never could analyze in large volumes, notes Josh Bach, managing director of the enterprise improvement group at Van Conway & Partners, a management consulting firm based in Birmingham, MI. He recently was involved with a project that used IBM’s Watson system to study information from filmed interviews with Parkinson’s patients.
“Utilizing and leveraging big data allowed us to feed into Watson all of the interviews and writing samples to find facial expressions, verbal expressions, hints, or degradation in the handwriting, to earlier predict the disease,” Bach says. “The analysis also enabled us to monitor those with the potential for the disease and follow their progress for any similarity to the data from people who had Parkinson’s.”
Bach worked on a similar project with a team of researchers and clinicians who were using big data to review all oncology pre- and post-marketing trials to see if Watson could glean better treatment algorithms that would be free of the researcher bias that has plagued many studies in the past. The clear majority of cancer trials are on single drugs, but most treatment is approached with combination therapy.
Oncology, with its regimented dosing schedules and comparatively immense data tracking, is a more attractive target for cognitive computing, Bach notes. Unlike other areas of medicine, oncology is the most likely to have accurate inputs on dosing and compliance due to the life-threatening nature of the disease and high cost of the medication, he says.
The big data analysis helped overcome biases that may have affected other assessments, he says.
“You’re not doing double-blind, placebo-controlled trials on cancer patients because it’s unethical. And you have cases in which the researcher has some sort of bias or a certain hypothesis about treatment or dosing, and thus, many times a well-intentioned and well-qualified group of people will come up with a conclusion that supports their preconceived bias,” Bach says. “With the big data and the Watson computer, there is no preconceived bias. We’re able to look at all research that has been published and digest it in such a way that you’re going to get outputs suggesting certain treatment algorithms are going to be favorable, without the bias of a few key opinion leaders or research institutions that may have had a vested interest in one conclusion over another.”
Big data also can be influential in monitoring compliance and key performance indicators, says David Kaufman, JD, a partner in the Healthcare Practice Group at the law firm of Freeborn & Peters in Chicago.
“Quality through the lens of efficiency, standardization, and compliance can be enabled proactively to ensure efficiencies and make immediate improvements,” he says. “For example, quality KPIs [key performance indicators] can be launched via business rules or by a triggered alert to acquire data based on outliers or thresholds. If those specified KPIs are not met, it’s immediately reported, signaling where, when, and, in some cases, why the threshold was violated.”
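The rule-triggered alerting Kaufman describes can be sketched as a simple threshold check. The KPI names and limits below are illustrative assumptions, not drawn from any actual system; a production system would load its business rules from configuration and push alerts rather than print them.

```python
# Illustrative thresholds; a real system would load these as business rules.
KPI_THRESHOLDS = {
    "door_to_antibiotic_minutes": 60.0,
    "readmission_rate_30d": 0.15,
}

def check_kpis(observed: dict[str, float]) -> list[str]:
    """Report where and by how much any KPI threshold was violated."""
    return [
        f"KPI '{name}' violated: {observed[name]} > {limit}"
        for name, limit in KPI_THRESHOLDS.items()
        if name in observed and observed[name] > limit
    ]

print(check_kpis({"door_to_antibiotic_minutes": 75, "readmission_rate_30d": 0.12}))
```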
Author Greg Freeman, Editor Jill Drachenberg, Ebook Design Specialist Dana Spector, Nurse Reviewer Fameka Leonard, and Consulting Editor Patrice Spath report no consultant, stockholder, speaker’s bureau, research, or other financial relationships with companies having ties to this field of study.