Humana uses Azure and Kafka to make healthcare less frustrating for doctors and patients


Event streaming platform creates one real-time data feed from multiple sources to modernize information architecture.

Patient data lives in multiple silos—the pharmacy record, the home healthcare visit, the specialist’s recommendations, the electronic medical record at the local hospital. One provider often doesn’t have the latest update from another, and it’s frequently up to the patient to connect the dots. 

These disconnected data points make it hard for doctors to see the full picture and for patients to get coordinated care. At best, this problem is annoying; at worst, it can be life-threatening.

Humana is taking on this problem by modernizing its data management architecture. The goal is to make sure everyone in the healthcare world is working from the same, most up-to-date information. The health insurance company is using event streaming technology to turn data silos into one merged data stream.

SEE: Big data and DevOps: No longer separate silos, and that’s a good thing (TechRepublic)

To do this, Humana turned to Confluent, a company built around Apache Kafka that offers a new way of managing enterprise data. Jay Kreps, Neha Narkhede, and Jun Rao created Apache Kafka while they were at LinkedIn. They left the networking platform to launch Confluent, which uses Kafka to create a “central nervous system” for managing corporate data. The idea was to bring the real-time data management used by tech companies like Netflix and Uber to other industries.

This means treating each action, such as a sale or a customer service interaction, as an event and using the data from that event to power real-time recommendations, decisions, or related actions.
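As a rough illustration, here is a minimal sketch in Java of what publishing one such event to a Kafka topic can look like. The topic name, event fields, and broker address are invented for this example and are not from Humana or Confluent.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class OrderEventPublisher {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");   // broker address (illustrative)
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // One business action -- a customer placing an order -- captured as an immutable event.
            String event = "{\"type\":\"ORDER_PLACED\",\"customerId\":\"42\",\"orderId\":\"1001\",\"amount\":59.99}";
            producer.send(new ProducerRecord<>("orders", "42", event));   // keyed by customer ID
        }
    }
}
```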

Dan Rosonova, head of product management for Confluent Cloud, said every company is now expected to have this level of data management sophistication, which requires a new kind of data infrastructure. Instead of manually combining information from separate sales, customer service, and fulfillment databases, companies can use Kafka to create an ongoing feed of everything that is happening in the business.

“A salesperson could set an alert to get an alert any time a customer has more than three support tickets within a week and is above some threshold of spend,” he said. 
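That alert scenario maps naturally onto stream processing. The sketch below shows one way it might be expressed with the Kafka Streams Java API, assuming hypothetical “support-tickets” and “customer-spend” topics keyed by customer ID and an illustrative spend threshold; it is not Confluent’s implementation, just an example of the pattern.

```java
import java.time.Duration;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KeyValue;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.kstream.*;

public class SupportAlertTopology {
    // Hypothetical threshold, chosen for illustration only.
    static final double SPEND_THRESHOLD = 10_000.0;

    public static void build(StreamsBuilder builder) {
        // Support-ticket events keyed by customer ID, counted over a 7-day window.
        KTable<Windowed<String>, Long> ticketCounts = builder
                .stream("support-tickets", Consumed.with(Serdes.String(), Serdes.String()))
                .groupByKey()
                .windowedBy(TimeWindows.ofSizeWithNoGrace(Duration.ofDays(7)))
                .count();

        // Latest spend per customer, maintained as a table from a changelog topic.
        KTable<String, Double> spend =
                builder.table("customer-spend", Consumed.with(Serdes.String(), Serdes.Double()));

        // Emit an alert when a customer has more than three tickets in the window
        // and total spend above the threshold.
        ticketCounts.toStream()
                .filter((windowedCustomer, count) -> count > 3)
                .map((windowedCustomer, count) -> KeyValue.pair(windowedCustomer.key(), count))
                .join(spend, (count, totalSpend) -> totalSpend,
                        Joined.with(Serdes.String(), Serdes.Long(), Serdes.Double()))
                .filter((customerId, totalSpend) -> totalSpend > SPEND_THRESHOLD)
                .to("sales-alerts", Produced.with(Serdes.String(), Serdes.Double()));
    }
}
```

A downstream consumer of the “sales-alerts” topic could then notify the salesperson however the business prefers.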

Managing events in a stream creates three new capabilities:

  1. The ability to build onramps and offramps from the main stream
  2. The ability to create custom views of the same stream of data
  3. The ability to capture events and rerun them at a later date

Rosonova said that one common application for this replay functionality is in fraud detection. When a security team develops a new model, they could rerun every transaction for the last 30 days to see if the new model catches fraudulent transactions.
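One way to implement that kind of replay with the standard Kafka Java consumer is to look up the offsets that correspond to a point in time and seek back to them. The sketch below assumes a hypothetical “transactions” topic and stubs out the model call.

```java
import java.time.Duration;
import java.time.Instant;
import java.util.*;
import java.util.stream.Collectors;
import org.apache.kafka.clients.consumer.*;
import org.apache.kafka.common.TopicPartition;
import org.apache.kafka.common.serialization.StringDeserializer;

public class TransactionReplay {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");       // illustrative
        props.put("group.id", "fraud-model-backtest");          // fresh group so live offsets are untouched
        props.put("key.deserializer", StringDeserializer.class.getName());
        props.put("value.deserializer", StringDeserializer.class.getName());
        props.put("enable.auto.commit", "false");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            // Manually assign every partition of the hypothetical "transactions" topic.
            List<TopicPartition> partitions = consumer.partitionsFor("transactions").stream()
                    .map(p -> new TopicPartition(p.topic(), p.partition()))
                    .collect(Collectors.toList());
            consumer.assign(partitions);

            // Find the offsets that correspond to 30 days ago and seek there.
            long thirtyDaysAgo = Instant.now().minus(Duration.ofDays(30)).toEpochMilli();
            Map<TopicPartition, Long> query = new HashMap<>();
            partitions.forEach(tp -> query.put(tp, thirtyDaysAgo));
            consumer.offsetsForTimes(query).forEach((tp, offsetAndTimestamp) -> {
                if (offsetAndTimestamp != null) {
                    consumer.seek(tp, offsetAndTimestamp.offset());
                }
            });

            // Re-read history and feed each transaction to the new fraud model;
            // in practice the loop would stop once it catches up to the present.
            while (true) {
                for (ConsumerRecord<String, String> record : consumer.poll(Duration.ofSeconds(1))) {
                    // newFraudModel.score(record.value());   // hypothetical model call
                }
            }
        }
    }
}
```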

SEE: Big data: How wide should your lens be? It depends on your use (TechRepublic)

Rosonova said that cloud computing is the other key element of the Confluent platform.

“Without the cloud, the complexity is really high, and normal companies don’t have all of the computer science talent to support that,” he said. “The cloud democratizes this.”

Real-time updates of health information

Although healthcare companies were not the initial customers for this kind of data management, there is a lot of potential for complex event processing, such as tracking drug interactions during clinical trials or coordinating care for patients seeing multiple specialists.

The Kafka interface also provides onramps and offramps for data from the stream to share information with customers, partners, or third parties. This capability could make interoperability a reality for the healthcare industry, which has struggled to share data across physician offices, hospitals, and insurance companies. 

“There are a million point-to-point connections in healthcare, which are like surface streets in a downtown,” Rosonova said. “The data stream is the highway.”

A cloud infrastructure also limits the amount of tech work that hospitals and health insurance companies have to do.  

“If I’m getting a feed from providers, and I have to produce a feed for pharmacies, I can focus on connecting them,” he said. “Healthcare organizations can’t do greenfield development because they have to deal with the structures they already have.”

Using event streaming at Humana

Levi Bailey, the associate vice president of cloud architecture at Humana, said that adopting an event streaming model was part of the company’s larger shift to using Azure. Bailey said that Humana is changing the way it manages information to improve interoperability in the industry.

 “We sit in the middle of all this information, and when providers share data with each other, it has to come back to us in some shape or form,” he said. “When we looked at this, we knew we needed a platform approach.”

SEE: Are you a big data laggard? Here’s how to catch up (TechRepublic)

Bailey said Kafka’s platform gives Humana the ability to scale out to meet the demands of a hyperconnected data ecosystem.

“Streaming the data out through a highly scalable platform gives us the ability to bring this data together, optimize it for each audience, and deliver it,” he said.

Bailey said that another benefit of the event streaming approach is that the Kafka platform retains the history of events, which makes it easier to understand the implications of changes.

He used the metaphor of sentence structure to explain how this element of the technology works in practice. The data asset is the noun, and the action that changes the state or status of that noun is the verb. Kafka’s event streaming approach preserves the action (the verb) that changes the status of the noun; previously, that action was lost.
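In code, that noun/verb pairing can be as simple as an event type that carries the identifier of the data asset plus the action taken on it. The sketch below uses a hypothetical prescription example; the field names are illustrative, not Humana’s schema.

```java
import java.time.Instant;

// The "noun" is the prescription; the "verb" is what happened to it.
// Each state change is appended as its own immutable event rather than
// overwriting a database row, so the history of actions is preserved.
public record PrescriptionEvent(
        String prescriptionId,   // the noun: which data asset this event is about
        String action,           // the verb: e.g. "PRESCRIBED", "FILLED", "DISCONTINUED"
        String details,          // illustrative payload (dosage change, pharmacy, etc.)
        Instant occurredAt) {    // when the action happened
}
```

Because every action is kept as its own event on the stream, downstream systems can see not just the current state of the record but how it got there.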

“Thinking about events changes your point of view on what’s important,” Bailey said. “Tracking an event allows another system to drive an experience with the downstream consumer because now they know about the upstream change.”

Bailey used pre-authorizations as an example of how improved data sharing and interoperability can save time. “It used to take 20-30 minutes to do a pre-auth, now it is down to a minute,” he said.

Bailey also said Humana hopes interoperability will make it easier for patients to get a more complete view of their data, including lab data, diagnosis codes, and data from smart watches.

Bailey described another way the Confluent platform is distributing real-time healthcare information among providers. When a person on Medicare is discharged from the hospital or a rehab facility, a home healthcare provider does an assessment of the home environment. The provider looks at medication habits, fall risks, social support, and other elements. Humana works with a vendor to complete this evaluation. 

Humana uses Confluent’s event management platform to make sure the person doing the assessment has the latest information from an individual’s doctor about medications and other treatment plans. 

The home healthcare assistant can see the information on a mobile device in real time. 

“Also, if they get updates while they are there, they can send the info back to us through the same interface,” he said. 
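A rough sketch of that two-way flow, using the plain Kafka Java clients: the app subscribes to a topic carrying plan updates and publishes the assessor’s findings back on another topic. The topic names, group ID, and payloads here are hypothetical, not Humana’s actual integration.

```java
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.kafka.common.serialization.StringSerializer;

public class HomeAssessmentBridge {
    public static void main(String[] args) {
        Properties consumerProps = new Properties();
        consumerProps.put("bootstrap.servers", "localhost:9092");   // illustrative
        consumerProps.put("group.id", "home-assessment-app");
        consumerProps.put("key.deserializer", StringDeserializer.class.getName());
        consumerProps.put("value.deserializer", StringDeserializer.class.getName());

        Properties producerProps = new Properties();
        producerProps.put("bootstrap.servers", "localhost:9092");
        producerProps.put("key.serializer", StringSerializer.class.getName());
        producerProps.put("value.serializer", StringSerializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(consumerProps);
             KafkaProducer<String, String> producer = new KafkaProducer<>(producerProps)) {

            // Inbound: the latest medication and treatment-plan updates from the member's doctor.
            consumer.subscribe(Collections.singletonList("medication-updates"));

            while (true) {
                for (ConsumerRecord<String, String> update : consumer.poll(Duration.ofSeconds(1))) {
                    // Push the update to the assessor's mobile device (display logic omitted).
                    System.out.println("Member " + update.key() + " plan update: " + update.value());
                }
                // Outbound: anything the assessor records in the home flows back on another topic, e.g.
                // publishFinding(producer, "member-123", "{\"observation\":\"loose stair rail\"}");
            }
        }
    }

    static void publishFinding(KafkaProducer<String, String> producer, String memberId, String findingJson) {
        producer.send(new ProducerRecord<>("home-assessments", memberId, findingJson));
    }
}
```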

Rosonova said the data stream at a healthcare organization might start off as a creek as the company gets started with event streaming.

“There is a lot of potential in data syndication scenarios, and big innovations will come from that space,” he said.
