Health insurers’ capability to gather and create data far outstrips their ability to analyze the information and put it to use. Progress is being made.
Big data is a big deal, or so the health care IT vendors say. As revenue from the sale of electronic health records falls off and complaints soar about EHRs and CMS’s meaningful use program, health IT vendors have stepped out to tout the power of big data analytics. Slicing and dicing gobs of data stored in EHRs, insurance claims, and external sources such as geographic information system (GIS) databases is supposed to transform the health care delivery system.
Here’s the problem: Big data doesn’t exist outside the capabilities built by a few sophisticated organizations. You may have heard of one: the United States, which has 84 big data programs spread over six federal agencies.
In health care, most data are locked up in siloed databases, trapped in electronic medical records that don’t talk to each other. Moreover, the quality of patient data is often questionable. Even a simple field such as a patient’s gender may not be comparable across systems, because some systems offer more choices than male and female. Critical fields like a diagnosis code can be tainted by billing and reimbursement considerations, such as upcoding.
The barriers to creating a big data environment remain despite huge efforts by HHS’s Office of the National Coordinator for Health Information Technology (ONC), the American Health Information Management Association (AHIMA), myriad IT professional organizations, and many voluntary work groups. The $30 billion in federal EHR incentive payments, which was a bonanza for IT vendors, hasn’t solved the problem.
Big data analytics will require interoperability and aggregation of data across providers, health plans, PBMs, and pharmacies, but that is not widely happening. In March, a Senate committee hearing examined the problems of interoperability and found that there is no business case for EMR vendors to make their systems interoperable—nor do providers have a reason to bear that expense.
The EMR vendors at the hearing said it was too costly for them to standardize their systems and create interfaces. Miffed by what they saw as arrogance on the part of vendors, the senators were not shy in their criticism of the meaningful use program’s failure to achieve interoperability.
“The technology for exchanging information exists, but there are all sorts of disincentives, particularly among competitors, and there are few incentives for them to do so,” says David Kibbe, MD, CEO of DirectTrust, a health care consortium that provides a standardized Internet-based data exchange network. “The bottom line is that it will take payment reform to reward providers for exchanging info and penalizing them for hoarding it.”
Pharmacy data gets some respect
Health care analytics is often limited to the data in medical and pharmacy claims. “Claims data rule the world,” says Jonathan Weiner, PhD, a codeveloper of Johns Hopkins University’s Adjusted Clinical Groups (ACG) System, a widely used population-based, case mix/risk adjustment methodology. Weiner is also a member of Managed Care’s editorial board.
Unfortunately, claims have very few truly useful data fields, so insight must be gleaned and inferred. Nevertheless, progress is being made. Pharmacy data play a central role in analyzing health care services, and health plans, PBMs, and others are working to build and use analytic tools that process them. As pharmacy becomes a larger piece of total health care expenditures, and as new high-cost specialty drugs, like the hepatitis C agents, rattle the system, predicting pharmacy costs and managing pharmacy utilization have emerged as top priorities.
Johns Hopkins gets into the game
Health plans and PBMs recognize that pharmacy data can aid in identifying high-risk patients and gaps in therapy. It can also be used in some nifty, reform-minded ways: predicting pharmacy costs in risk-based payment arrangements, supporting population health management in ACOs, and modeling medical care for specific diseases.
The ACG system at Johns Hopkins has these capabilities. Its primary inputs are medical and pharmacy data and patient demographics from claims, but it is beginning to incorporate data from other sources. The system organizes patient data into two sets of morbidity groups, one based on diagnoses and the other on pharmacy usage.
The ACG system maps more than 90,000 NDCs into a set of 60 pharmacy-based morbidity groups. These morbidity groups are then fed into a predictive model that draws on a reference database of millions of health plan enrollees. The predictive model has flexible output parameters, such as predicting medication adherence or hospitalizations. Hopkins is working to focus the ACG system on population health analytics by incorporating external data sources, such as GIS data.
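The grouping step just described is, at its core, a lookup-then-aggregate pass over pharmacy claims. Here is a minimal sketch; the NDC-to-group table, group names, and claim format are invented for illustration and bear no relation to the actual, proprietary ACG tables.

```python
# Illustrative sketch of a pharmacy-based grouper. The NDC codes, group
# names, and claim layout below are hypothetical placeholders.

NDC_TO_GROUP = {
    "00093-7146": "Diabetes",
    "00378-1800": "Hypertension",
    "61958-1501": "Hepatitis C",
}

def assign_morbidity_groups(pharmacy_claims):
    """Collapse a patient's pharmacy claims into a set of morbidity groups."""
    groups = set()
    for claim in pharmacy_claims:
        group = NDC_TO_GROUP.get(claim["ndc"])
        if group:
            groups.add(group)  # repeated fills map to the same group
    return groups

# Example: three fills reduce to two morbidity groups.
claims = [{"ndc": "00093-7146"}, {"ndc": "00378-1800"}, {"ndc": "00093-7146"}]
print(sorted(assign_morbidity_groups(claims)))  # ['Diabetes', 'Hypertension']
```

In a full system, the resulting group set, not the raw claims, becomes the input to the predictive model, which is what keeps 90,000-plus drug codes tractable.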
Data analysis with a purpose
Analysis should never be an end in itself. The acid test for big data analytics in pharmacy services or any other area of health care is how the data are used to improve quality and efficiency.
Earlier this year, Prime Therapeutics won the Pharmaceutical Benefit Management Institute’s 2015 Rx Benefit Innovation Award for an analytic tool it developed that incorporates medical and pharmacy claims. The company’s tool provides early warning and trend forecasting for emerging high-cost or high-utilization drugs.
“It serves as the front end for providing actionable clinical utilization management recommendations to our Blue Cross plans,” says David Lassen, PharmD, chief clinical officer.
Prime Therapeutics provided the Pharmaceutical Benefit Management Institute the results of a case study of the cost and utilization of sofosbuvir (Sovaldi). The tool’s predictive modeling analyzed medical claims for a commercially insured population of 12 million members, looking at the incidence of hepatitis C virus screening as an early indicator of diagnosis and possible treatment. The incidence of screening was quantified for three separate 10-month intervals coinciding with the releases of updated screening and treatment recommendations from the CDC and the U.S. Preventive Services Task Force.
Prime Therapeutics then used screening, diagnosis, and trend data to calculate the number of new members diagnosed with HCV. The results showed an increase in the estimated number of new diagnoses resulting from screening. Prime estimated that 42,284 of its 14.8 million members (about 0.3%) had hepatitis C at the end of 2013.
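The shape of that estimate can be sketched in a few lines. Only the 14.8 million membership figure comes from the case study; the per-interval screening counts and the diagnosis rate below are invented placeholders, not Prime’s actual numbers.

```python
# Illustrative sketch of a screening-to-diagnosis estimate. The screening
# counts and the 2% diagnosis rate are hypothetical; only the membership
# figure comes from the case study described above.

MEMBERS = 14_800_000

# Hypothetical members screened for HCV in three consecutive 10-month
# intervals, each following an updated CDC/USPSTF recommendation.
screened = [310_000, 420_000, 560_000]

# Screening incidence per 1,000 members, per interval.
incidence = [round(s / MEMBERS * 1000, 1) for s in screened]
print(incidence)  # rising screening rates across the three intervals

# If a hypothetical 2% of screened members are newly diagnosed:
DIAGNOSIS_RATE = 0.02
new_diagnoses = [int(s * DIAGNOSIS_RATE) for s in screened]
print(new_diagnoses)  # diagnoses rise in step with screening
```

The point of the exercise is the trend, not the absolute numbers: as screening incidence climbs after each guideline update, the pipeline of newly diagnosed, potentially treatable members climbs with it.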
The Minnesota PBM then updated its estimate of hepatitis C cases for 2015 and modeled its cost trends. The cost of hepatitis C treatment drugs shot up from $0.13 PMPM in 2013 to $4.28 PMPM, a 32-fold increase. By incorporating the approval of additional new HCV agents and other factors, Prime also modeled continuing cost increases that ranged from 20% to 50% for 2015.
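The PMPM arithmetic behind those figures is simple to check using the article’s numbers; the one assumption here is that the $4.28 PMPM figure is the baseline from which the 20%-to-50% 2015 projection grows.

```python
# Arithmetic check on the PMPM figures cited above. Assumption: the
# $4.28 PMPM figure is the baseline for the 2015 projection.

pmpm_2013 = 0.13   # hepatitis C drug cost per member per month, 2013
pmpm_later = 4.28  # PMPM after sofosbuvir's launch

print(round(pmpm_later / pmpm_2013, 1))  # ≈ 32.9, the cited 32-fold jump

# Projected PMPM under the modeled 20% to 50% continued increase for 2015.
low, high = pmpm_later * 1.20, pmpm_later * 1.50
print(f"${low:.2f} to ${high:.2f} PMPM")  # $5.14 to $6.42 PMPM
```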
Prime Therapeutics’ watch list tool is integrated with its therapy management program, says Lassen. Claims data are fed into software that finds gaps in therapy, drug interactions, overutilization, poor adherence, and other situations. Alerts target providers and patients.
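One widely used rule that gap-finding software of this kind applies is proportion of days covered (PDC), with members below a 0.80 threshold flagged as possibly nonadherent. The sketch below is a generic version of that measure with invented refill data, not Prime’s actual algorithm.

```python
# Minimal sketch of an adherence check via proportion of days covered
# (PDC). The refill data are invented; the 0.80 cutoff is a common
# industry convention, not a figure from the article.

from datetime import date

def proportion_days_covered(fills, period_start, period_end):
    """fills: list of (fill_date, days_supply) tuples.
    Counts distinct covered days in the period (overlapping fills
    are not double-counted)."""
    covered = set()
    for fill_date, days_supply in fills:
        for offset in range(days_supply):
            day = date.fromordinal(fill_date.toordinal() + offset)
            if period_start <= day <= period_end:
                covered.add(day)
    total_days = (period_end - period_start).days + 1
    return len(covered) / total_days

# Hypothetical member: three 30-day fills with a late March refill gap.
fills = [(date(2015, 1, 1), 30), (date(2015, 2, 15), 30), (date(2015, 4, 10), 30)]
pdc = proportion_days_covered(fills, date(2015, 1, 1), date(2015, 4, 30))
print(round(pdc, 2))  # 0.68
if pdc < 0.80:
    print("ALERT: possible adherence gap")  # would trigger outreach
```

In a production system the alert would route to the prescriber or the member, which is the "alerts target providers and patients" step described above.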
Active Health Management, Aetna’s disease management subsidiary, also has software with capabilities for incorporating pharmacy data and guiding pharmacy management.
The future is now-ish
Narrow, targeted data-networking efforts are producing success, and Aetna’s IT subsidiary, Medicity, is driving some of that success. Medicity provides data networking and exchange services to state health information exchanges, EHR vendors, and other clients.
Health care organizations are coming together to develop custom interfaces. Nancy Ham, Medicity’s CEO, says Medicity is working with more than 100 enterprise clients to develop interfaces that connect data systems and meld data into meaningful databases. The types of data exchange networks that Medicity is developing may provide the vehicle for health plans and PBMs to obtain the expanded data they have been seeking from providers, allowing them to manage care more effectively.