The next big thing in AI, emotional intelligence, could give hospitals a competitive edge
As Amazon's Alexa makes "herself" comfortable in more and more homes, she and similar artificial intelligence technologies could soon be having an impact on hospitals.
AI-based virtual assistants are evolving quickly, and more and more effort is being put into making them emotionally intelligent – able to pick up on subtle cues in speech, inflection or gesture to assess a person's mood and feelings.
The ways that could impact wellness and healthcare are intriguing. By analyzing vocal tone, AI platforms could perhaps detect depression, or even underlying chronic conditions such as heart disease.
For example, a Tel Aviv-based startup called Beyond Verbal is working on analytics tools that could work with Alexa et al. to gain insight into behavioral and vocal patterns.
"In the not so far future, our aim is to add vocal biomarker analysis to our feature set, enabling virtual private assistants to analyze your voice for specific health conditions," said the company's CEO Yuval Mor in June.
In the nearer term, hospitals looking to realize the benefits of AI and EI need to think hard about where and how they'll deploy the technology as it continues to mature, said Anthony Chambers, director in the life sciences practice at Chicago-based consultancy West Monroe Partners.
The use cases for AI in healthcare are many and varied. Voice-enabled virtual assistants can help clinicians access notes or let surgeons see safety checklists. They can help staff handle coding and transcription chores. Smart deployments of the technology hold the potential for big gains in hospital efficiency.
"Hospitals have realized they're sitting on mounds of data," said Chambers. "The past few years, they've been starting to take the next steps with narrative science, natural language generation and other machine learning technologies to give them a competitive advantage. We're seeing our clients make a lot of progress on identifying and predicting where efficiencies could be found in the patient care journey."
Lots of hospitals are now using AI and machine learning to "predict where issues are: where they can get higher throughput, where they can see more efficiency in their care management," he said. "They can measure in real-time how they're doing, how they can gauge capacity, where is the slack in the system."
But in the years ahead it may be patients themselves who spend the most time with AI platforms – and that's where emotional intelligence begins to take on more importance.
"What gets really fascinating – we have yet to see it, but we're seeing discussions of it – is potential uses around the quality of care," said Chambers. "That remains an untapped potential where the promise of emotional intelligence, in combination with AI, could play out."
Hospitals and pharmaceutical companies are starting to explore how the platforms could help clinical trial management, for example: "We already know of one client that is doing a proof of concept to support clinical trials, at the intersection point between provider and pharma," he said.
Natural language processing tools could help with gathering data and predicting outcomes, "giving almost real-time feedback to the physician and the drug company at the same time," said Chambers.
That's especially useful given how stressful clinical trials can be for patients. Offering a less intrusive way to communicate results to both provider and drug company could be a boon.
"If we could use an interactive bot, where the patient then has a point of conversation via smartphone or something, that could be a game changer because of the challenge of clinical trials being so stressful on the population, and the expense of running the trials," he said.
Chambers said he's also seeing more and more providers starting to "dip their toes into automating the bookends of the patient journey" – intake and discharge.
"Being able to potentially monitor the intake with a human in the room, but also an Alexa-type unit listening to the conversation and also hearing the stress or anger or fear in a patient's voice, that may throw up real-time prompts that the human can then put forward," he said. "That use case has been kicked around, a way to support the intake process.
"Think about facial expressions, gestures, pace and tenor of speech," he added. "If you factor those pieces into a chatbot or robot or other interaction point with a human, that becomes an indicator or piece of data that artificial intelligence and big data algorithms could use to assess outcomes."
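To make the idea concrete, here is a minimal sketch of how vocal cues like pace and pitch variability might be blended into a single stress indicator. Everything here – the function name, the features, the baseline speaking rate and the weights – is an illustrative assumption, not the method of Beyond Verbal or any real clinical model.

```python
# Hypothetical sketch: turning simple vocal features into a stress score.
# Feature choices, thresholds and weights are illustrative assumptions only.
from statistics import mean, pstdev

def stress_indicator(pitch_hz, words_per_minute):
    """Return a 0.0-1.0 score from pitch variability and speech pace.

    pitch_hz: list of pitch samples (Hz) from a speech segment.
    words_per_minute: measured speaking rate for the same segment.
    """
    if not pitch_hz:
        return 0.0
    # Higher pitch variability relative to the mean can accompany agitation.
    variability = pstdev(pitch_hz) / mean(pitch_hz)
    # Speaking noticeably faster than a ~150 wpm baseline is another cue.
    pace = max(0.0, (words_per_minute - 150) / 150)
    # Weighted blend, clamped to [0, 1]; the weights are arbitrary here.
    return min(1.0, 0.7 * variability + 0.3 * pace)

# A calm, even-pitched speaker scores low; a fast, variable one scores higher.
calm = stress_indicator([200, 202, 199, 201], words_per_minute=140)
agitated = stress_indicator([180, 260, 200, 300], words_per_minute=210)
```

In a real deployment the features would come from an audio pipeline and the weights from a trained model, but the shape is the same: raw cues become a numeric indicator that downstream systems – an EHR flag, a real-time prompt to intake staff – can act on.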
As hospitals increasingly look to forward-leaning implementations such as these, there are some important questions CIOs and other IT professionals should be keeping in mind, said Chambers.
For those organizations looking to use AI and EI to help with customer experience or quality of care efforts, "I think the first question hospitals or clinicians are going to have to decide is how will it support that care journey," he said.
Will it displace human interaction, or just augment it? And if it does displace it, where is it going to support the patient, and how are you going to use that communication?
"Because you really are changing the paradigm, potentially, of how you're going to interact with the patient," said Chambers. "Where on the patient journey do you see a need?"
Assuming those questions are ironed out and the implementation is complete, there's another important thing to consider, he said: "What are you going to do with that data? It's patient-level data, it's real-time. Does that get then folded into an electronic health record? Do you tag it with social media? Does claims data use it? How does it get integrated?"
The issue of privacy alone "is a little daunting," said Chambers. "How do you manage a patient's emotional quotient? I don't know. These are problems we're grappling with. But hospitals and healthcare companies have an opportunity to lead in this space."