Vanderbilt creates AI and natural language processing voice assistant for its Epic EHR
Vanderbilt University Medical Center has developed a voice assistant that caregivers can use to navigate the hospital's Epic electronic health record.
The new tool processes requests using natural language processing and understanding technology rather than simple macros, officials say – noting that it could represent an important paradigm shift, letting providers interact with their EHRs in more natural and intuitive ways.
The voice assistant is named V-EVA, which stands for Vanderbilt EHR Voice Assistant. It was developed by the Vanderbilt University Medical Center Department of Biomedical Informatics and Health Information Technology Innovations.
"The idea to develop an in-house voice assistant came from the general frustration we heard from users about the difficulty navigating the EHR to find relevant information," said Yaa Kumah-Crystal, MD, eStar Core design advisor, assistant professor of biomedical informatics and assistant professor of pediatric endocrinology at Vanderbilt University Medical Center and Monroe Carell Jr. Children's Hospital at Vanderbilt.
"There is a lot of information foraging that occurs in the EHR, even though users often know the precise pieces of data they need to understand a clinical picture," she said.
Depending on how an EHR is organized, it can be a taxing process to click and scroll through multiple windows and panes to find an answer. Vanderbilt staffers wondered: Wouldn't it be nice to just ask for what you want and have the answer given back to you?
"The idea for voice user interfaces and natural communication with technology has existed for some time, initially as science fiction and now in the consumer realm," said Kumah-Crystal. "Now we have passed an appreciable threshold in machine learning and natural language understanding technology, and this is an idea whose time has come."
Vanderbilt is collaborating with Nuance Communications to leverage its artificial intelligence/natural language understanding platform to help process voice requests. The AI is critical to a well-functioning assistant because it enables Vanderbilt to build and configure the system to handle various types of input requests.
"The Nuance platform also enables us to build out our tools in a HIPAA-compliant manner, which is critical when dealing with this level of patient data and protected health information," said Kumah-Crystal. "We are also working with our EHR vendor Epic in identifying and mapping the information sources to satisfy the queries."
A positive aspect of the meaningful use program is that much of the relevant data Vanderbilt caregivers seek now exists as structured and coded information, she said. Additional innovations, such as the FHIR standard, have enabled Vanderbilt to develop the platform in a way that makes data retrieval generalizable across other platforms.
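To illustrate why FHIR makes this kind of retrieval generalizable: weight measurements come back from any FHIR server as standard Observation resources, so the same parsing logic works regardless of the underlying EHR. The sketch below is not Vanderbilt's code – the sample bundle and function names are hypothetical – but it shows the shape of the data a body-weight query (LOINC 29463-7) might return and how a client could extract the latest value.

```python
import json

# Hypothetical sample: a FHIR R4 Bundle of body-weight Observations, as a generic
# FHIR server might return for GET /Observation?code=29463-7&patient=<id>
SAMPLE_BUNDLE = json.loads("""
{
  "resourceType": "Bundle",
  "entry": [
    {"resource": {"resourceType": "Observation",
                  "effectiveDateTime": "2023-01-10",
                  "valueQuantity": {"value": 142, "unit": "lb"}}},
    {"resource": {"resourceType": "Observation",
                  "effectiveDateTime": "2023-07-12",
                  "valueQuantity": {"value": 146, "unit": "lb"}}}
  ]
}
""")

def latest_weight(bundle):
    """Return (date, value, unit) of the most recent weight Observation in the bundle."""
    observations = [entry["resource"] for entry in bundle.get("entry", [])
                    if entry["resource"]["resourceType"] == "Observation"]
    latest = max(observations, key=lambda obs: obs["effectiveDateTime"])
    quantity = latest["valueQuantity"]
    return latest["effectiveDateTime"], quantity["value"], quantity["unit"]

print(latest_weight(SAMPLE_BUNDLE))  # ('2023-07-12', 146, 'lb')
```

Because the Bundle and Observation structures are defined by the FHIR specification rather than by any one vendor, the same client code can run against other FHIR-conformant platforms.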
"Aside from the technical work of understanding, processing and retrieving the correct information, we have found that a lot of the work of creating a useful assistant is delivering the responses back in a way that not only answers the users' questions but also satisfies their informational needs," Kumah-Crystal explained.
"This has been one of the primary informatics challenges working on a voice user interface. When searching for information on a screen, the user has the opportunity to skim for information and have intermittent transactions with details by glancing back and forth at the data."
Voice is linear, however. The information is given as a direct reply that will be most impactful when the assistant is "smart enough" to understand the intent behind a user's request. This is where Vanderbilt has had to do a lot of exploration of information theory and learn how best to deliver voice replies.
The voice assistant is designed as a mobile responsive web application. The user launches it through the EHR in the context of a patient. A provider can ask a general question to the voice assistant such as, "Tell me about this patient," and receive a summary of the patient's general demographics and recent encounter information.
If the provider asks, "What was her last weight?" they will get a response about the patient's weight and relevant information about a change in the weight trend, such as, "Sally is 146 pounds today; she has gained 4 pounds since her last visit 6 months ago."
"We believe information like this is where a voice assistant can shine," said Kumah-Crystal. "It is one thing to relay back data that can be found in the EHR, but the added value in providing a layer of context to the information shared can enhance the transaction. This can also save the provider time since they no longer have to look up the previous values and perform the arithmetic themselves to understand the patient's weight trend."
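The value-add Kumah-Crystal describes – looking up the previous value and doing the arithmetic for the provider – can be sketched in a few lines. This is purely illustrative, not Vanderbilt's implementation; the function name and inputs are assumptions for the sake of the example.

```python
def weight_reply(name, current_lb, previous_lb, months_since_last):
    """Compose a spoken reply that adds trend context, as in the article's example."""
    delta = current_lb - previous_lb
    # Do the arithmetic the provider would otherwise do mentally
    if delta > 0:
        trend = f"she has gained {delta} pounds"
    elif delta < 0:
        trend = f"she has lost {-delta} pounds"
    else:
        trend = "her weight is unchanged"
    return (f"{name} is {current_lb} pounds today; {trend} "
            f"since her last visit {months_since_last} months ago.")

print(weight_reply("Sally", 146, 142, 6))
# Sally is 146 pounds today; she has gained 4 pounds since her last visit 6 months ago.
```

The key design point is that the reply delivers a conclusion (the trend) rather than raw data, which matters for a linear voice channel where the user cannot skim a screen.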
This is also where the expertise of medical informaticists on the Vanderbilt team who are also practicing providers comes into play. These subject matter experts help think through the users' informational needs and the most concise ways to deliver that information. Instead of simply answering the user's questions, the team seeks to understand the problem the user is trying to solve and aims to satisfy that.
The prototype voice assistant is being tested by a small cohort of users to assess the usability, efficiency and safety of this new workflow. Vanderbilt is performing accuracy and time-to-task analysis to see if this workflow saves time over the standard way of seeking information in the EHR. Vanderbilt also is trying to understand when and where the use of this technology would have the highest yield.
"We think the incorporation of voice assistants in the provider workflow can enhance the delivery of care," said Kumah-Crystal. "One of our testers described the platform like a helpful intern always ready with an answer. User satisfaction and the perception of ease of use will also be essential metrics. People are not very tolerant of failure when they are busy trying to get work done."