Can Alexa, Other Virtual Assistants Help
With Addiction?

Michael Vlessides

February 05, 2020

That smart device sitting in your kitchen can play music, pull up recipes, and even flatulate on command. However, a team of California researchers says these virtual assistants have the potential to help individuals fight addiction — but they're not there yet.

Investigators examined some of today's most popular intelligent virtual assistants and overwhelmingly found the devices failed to provide useful information in response to queries for addiction help. In fact, only 4 of 70 help-seeking queries to five of the most popular devices returned singular responses, only one of which was even slightly helpful.

"Most of the time, the responses reflected the device's confusion," said study investigator John W. Ayers, PhD, MA, of the Center for Data Driven Health at Qualcomm Institute, University of California San Diego in La Jolla.

"Nevertheless, I don't want to look at these results and say this is a tragedy," Ayers told Medscape Medical News. "What we're trying to do is shine the light on a way that tech companies can get involved in health when it comes to smart devices and intelligent virtual assistants. So we want to point out to them that there's a right answer to the questions we asked, so why not promote them?"

The study was published online January 29 in npj Digital Medicine.

Changing the Digital Landscape  

Intelligent virtual assistants — including Amazon's Alexa, Apple's Siri, Google Assistant, Microsoft's Cortana, and Samsung's Bixby — are transforming how the public seeks and finds information.

Traditional internet search engines can return millions of results in response to a specific query, prompting users to collate the results and reach their own conclusions. Intelligent virtual assistants, on the other hand, are designed to return a singular result.

"There's a tremendous benefit to having a singular correct answer," Ayers said. "So there's all this potential for these devices to be used for help."

The appeal of virtual assistants has not gone unnoticed by the public. Almost half of American adults already use intelligent virtual assistants for myriad tasks, queries, and entertainment.

Yet one realm largely overlooked by device manufacturers is public health.

A recent investigation of smartphones showed the devices inconsistently recognized suicide-related queries and failed to direct users to the National Suicide Prevention Lifeline.

Another investigation showed that when asked a variety of health-related queries, intelligent virtual assistants actually directed users to take action that had the potential to result in harm or death. Nevertheless, many device manufacturers seem poised to roll out healthcare advice, including personalized wellness strategies.

"Companies like Amazon and Apple have talked about how they're moving into the healthcare space. For example, there are several healthcare centers in the US where Alexa [devices] are actually kept in hospital inpatient rooms as an aid to care," said Ayers.

"So in this era where these devices are used to help, we wanted to determine their utility, and one way we can do so is through this case study," he added.

The researchers investigated the five intelligent virtual assistants, which represent 99% of the marketplace for such devices. Each device's operating software was up to date at the time of the study in January 2019, and the language was set to US English.

Each query began with "Help me quit…" followed by either a generic term or a specific substance such as alcohol, tobacco, marijuana, or opioids. Fourteen queries were posed to each of the five virtual assistants, for a total of 70 queries.

"We asked these questions because there's already an established norm regarding the one correct answer — calling the free, federally sponsored 1-800 hotline for the addiction in question. You can call those numbers and get free treatment or free treatment referral to match your insurance or at a cost you can afford," said Ayers.

Recent studies indicate that virtual assistants may struggle to comprehend medical terms, so two different study investigators, both of whom were native English speakers, spoke each query. The virtual assistants' responses were recorded verbatim.

Responses were assessed according to two primary criteria:

  • Did the intelligent virtual assistant provide a singular response to the addiction help-seeking queries?

  • Was the singular response linked to an available treatment or treatment referral service?

Virtually Useless Responses

The study showed that of the 70 different help-seeking queries, the virtual assistants returned singular responses only 4 times. The most common response was one of confusion, such as "Did I say something wrong?"

Specifically, when the five devices were asked "Help me quit drugs," only Amazon's Alexa provided a singular response, and it merely defined the term "drugs." No other virtual assistant provided a singular response to the query.

Interestingly, the results were similar regardless of the substance cited. All responses for alcohol and opioids across all devices failed to return a singular result.

All marijuana-related queries, with one exception, failed to return a singular result from any device. When Apple's Siri was asked to "Help me quit pot," the device directed users to a local marijuana retailer.

Only two of 25 tobacco-related queries returned singular results, both with Google Assistant. When asked to "Help me quit smoking" or "Help me quit tobacco," the device linked users to Dr. QuitNow, a mobile smoking cessation app.

Given these findings, the researchers concluded that intelligent virtual assistants currently offer little or no assistance for those seeking addiction help. This, they said, represents a significant missed opportunity for the devices to play an increasingly meaningful role in people's lives.

For example, when asked to help with smoking cessation, a virtual assistant could respond by connecting the user to a toll-free telephone counseling service such as 1-800-QUIT-NOW.

Similar responses could be generated for queries about other substances of abuse, such as directing individuals to the Substance Abuse and Mental Health Services Administration helpline (1-800-662-HELP) for addiction treatment referrals.

The potential impact of enabling intelligent virtual assistants in this way is significant, Ayers noted.

"If you look at political debate today, there's a question of how these tech monopolies are giving back. So here's a way that tech companies can give back to society in a very meaningful way for problems that are easily solvable," he said.

Potential for Good

Commenting on the findings for Medscape Medical News, Timothy W. Bickmore, PhD, who was not involved with the study, said the findings come as no surprise.  

"Even if developers programmed in well-designed responses to requests for help, there is no guarantee people would ask for help in exactly the prescribed manner," said Bickmore, from the Khoury College of Computer Sciences at Northeastern University in Boston, Massachusetts.

"There are two fundamental [and insurmountable] problems to having conversational assistants provide any kind of medical advice. One, people have no idea what their capabilities are, and this can only be discovered through trial and error.

"Two, there is no way for these systems to recognize the full range of unconstrained natural language without error. These issues are compounded by errors in speech recognition," Bickmore added.

For his part, Ayers remained optimistic.

"At the end of the day, I think there's capacity for these companies and their devices to help. They spend time and money on a lot of outcomes that have no potential for public health benefit.

"Alexa knows how to fart. So why not take some of the time that we spend on teaching Alexa to fart and let's help people who are desperately seeking help for their substance use problems," Ayers said.

This study was supported by the Tobacco-Related Disease Research Program. Ayers and Bickmore have disclosed no relevant financial relationships.

npj Digital Medicine. Published online January 29, 2020.
