Q&A: Northeastern's Timothy Bickmore on the clinical future of relational agents


Timothy Bickmore, associate professor at Northeastern University's College of Computer and Information Science, has been working for the past decade in the area of "relational agents." He says these artificially intelligent avatars are poised for a promising future in healthcare.

[See also: At Aetna.com ask 'Ann' anything]

Bickmore describes relational agents as "computational artifacts designed to build and maintain long-term, social-emotional relationships with their users."

Sometimes called intelligent virtual assistants (IVAs), they're not unlike the "chatbots" increasingly deployed in consumer-facing commercial settings by companies such as E*Trade, AT&T and IKEA, and by health insurers such as Aetna.

[See also: Virtual reality tech projected to grow in healthcare sector]

But relational agents, equipped with natural language processing capabilities, are more explicitly meant to maintain persistent contact with their human interlocutors, and are designed and developed to "remember" past interactions with people and build on them in an ongoing relationship.

Bickmore specializes in making these avatars as expressive as possible, in order to improve the emotional verisimilitude of the human/humanoid interaction, fine-tuning "speech, gaze, gesture, intonation and other nonverbal modalities to emulate the experience of human face-to-face conversation."

Bickmore says that in the coming years, relational agents and IVAs will have an important role to play in improving health literacy, driving patient engagement and maintaining compliance with wellness programs. He spoke to Healthcare IT News about his work with the technology, and the new ways he sees it being deployed.

What is your research background? How did you become interested in relational agents?

I did my Ph.D. at the MIT Media Lab some years ago. I was working in a research group that was simulating face-to-face conversation between people as a kind of user interface. We were studying hand gestures, facial displays of emotion, body posture shifts and head nods, and how these are used to convey information in face-to-face conversation.

My dissertation applied this type of interface to health counseling – studying how doctors and nurses talk to patients, and how we can take best practices from face-to-face counseling and build them into automated systems for educating patients and conducting longitudinal health behavior change interventions.
