Early in her residency, neurologist Heidi Moawad, MD, ordered a CT scan for a patient who came into the emergency room with a seizure. It showed a massive brain tumor.
"She was convinced that the CT scan caused the brain tumor," recalls Moawad, now a clinical assistant professor at Case Western Reserve University School of Medicine in Cleveland, Ohio. "I tried to remind her that she came in because she'd had a seizure. I told her I trusted the people senior to me."
However, the woman refused to accept the terrible prognosis until her family members convinced her.
If Moawad were to encounter such a patient today, she would be better equipped to respond to these irrational fears. Social science researchers have begun to devise approaches for dissuading patients from harmful beliefs.
With infectious diseases such as measles on the rise, due in part to the antivaccination movement, the need for approaches to engage patients about the risks of unsupported medical information has probably never been greater. Although patients' misconceptions, lack of logic, and superstitions have complicated the work of doctors since the first doctors existed, the advent of social media has taken the problem to a new dimension.
With social media, patients can more easily find misinformation, says Dominique Brossard, PhD, chair of the University of Wisconsin-Madison Department of Life Sciences Communication. They can also share that misinformation more easily.
Although search engines such as Google can proficiently identify the most popular and relevant websites on a given topic, they can't distinguish between accurate and inaccurate websites. "People will most likely not look past the first page of search results," notes Brossard. The sites on that first page may simply be optimized to rank high in searches, or they may contain outdated information.
Shoddy Science's Viral Effect
Patients could read misleading articles in newspapers and magazines before the Internet even existed, but sharing them with friends and family required photocopying and stuffing envelopes. Now, with a few clicks of a mouse, anyone can pass on an article to thousands of people in seconds.
In a 2017 survey of participants in Facebook Health Communities by WEGO Health, 87% of respondents said they shared health information through public Facebook posts, and 81% shared it through private messages.[1]
Lies may spread faster than the truth. Researchers at the Massachusetts Institute of Technology analyzed a set of about 126,000 news stories disseminated on Twitter from 2006 to 2017. They found that more people retweeted false information than true information. The researchers speculated that people may have passed along the fake news more readily because it was more novel and evoked more emotion.[2]
"We see this all the time," says Sander van der Linden, PhD, director of the Social Decision-Making Lab at the University of Cambridge in England. "Hoaxes go viral."
The wide dissemination means that some patients may receive the same false messages repetitively. In another study, Yale researchers found that the more often people receive the same message, the more likely they are to believe it, even when the message is labeled as disputed by social media fact checkers.[3]
Furthermore, when people are evaluating the reliability of health information shared online, they care more about who shared the information than they do about the original source, according to an American Press Institute study.[4]
False health information can have real health consequences. A different team of Yale researchers found, in a cohort study of 1,901,815 patients, that the use of complementary medicine was associated with refusal of conventional cancer treatment and with a twofold greater risk of death compared with patients who had never used complementary medicine.[5]
Doctors as Gatekeepers
In the face of this disinformation deluge, what can healthcare providers do to make sure their patients are acting on accurate information?
The good news is that most people still trust the medical scientists conducting health research. In 2016, 84% of Pew Research Center survey respondents said they have at least a fair amount of confidence that medical scientists will act in the best interests of the public. By comparison, only 38% said that about the news media.[6]
"Doctors are the number one most-trusted experts as communicators," adds van der Linden. "They're more persuasive on climate change than climate scientists."
However, doctors must make judicious use of that trust, he cautions. "If you make people feel stupid or suggest they've done something wrong, it can elicit biases. Especially with controversial topics like vaccines, we find that people are very defensive."
"Often patients are looking for hope, so doctors must start by acknowledging those emotions," says Brossard. "It's crucial to put concern, empathy, and listening as the first step in the conversation."
Media's Need for Expertise
The next step depends on what information the patient wants to discuss. Journalists may exaggerate the importance of a study to attract readers. They may not know enough about the scientific process to report on the data accurately. Or, they may simply leave out the caveats that accompanied the original study.
"Untrained and inexperienced health reporters are becoming more common because so many news organizations are in financial trouble and have laid off their most experienced staff," Brossard says. At the same time, the ease of publishing online allows almost anyone to lay claim to the title of journalist.
For example, a legitimate study might show a correlation between brain health and the microbiome. An article written about the study could then imply that eating yogurt cures dementia. If a patient quotes such an article, Brossard recommends responding with something such as, "A lot of studies have been conducted on this topic, but we need to be careful about articles in the media that sensationalize these studies and make them seem more important than they are. Maybe we should look at the original study."
Fighting Unproven Therapies
A different approach may be needed when a patient expresses confidence in a clearly unscientific approach; for example, trying to cure brain cancer by wearing crystals. In this case, there is no underlying study to examine. Even so, Brossard recommends tact. "As you know," a doctor could say, "that's not what I'm doing here in this office, so why don't we focus on the approach we are using?"
If an unproven therapy is preventing the patient from taking advantage of a proven therapy but time is not of the essence, Moawad notes, a doctor can suggest a trial period. For example, if the crystals don't seem to work in 6 weeks, the patient might agree to move on to chemotherapy.
If a patient insists on doing something that seems harmful, she says, the best approach may be a personal appeal. "Genuinely explain that you want the person to get better, and you are personally concerned that if they do only the thing they're talking about, they won't get better."
Doctors face a more complicated challenge when patients have the correct information about a study, but the study itself is poorly designed, preliminary, or outweighed by other research. A few patients may be interested in wading into the nuances of study design and levels of evidence, according to van der Linden, but most will be better off with concepts that are easy to remember.
Preventing Misinformation Outbreaks
Van der Linden recommends communicating to these patients using terms such as "the weight of evidence across many studies," "preliminary," "pilot study," and "exploratory research." "We tell people that 90% of doctors agree vaccines are safe, and you should be inoculating your children," he says.
Patients are often convinced that a treatment works or doesn't work because of a powerful anecdote or testimony. "We attach a lot of value to social information," says van der Linden. "It's difficult for people to understand that something they can observe is less valid than a statistic. Often it comes down to telling people what happens to a single person is not descriptive of the average."
The Cambridge lab has also found evidence that doctors can preempt some misconceptions using an approach they call "inoculation." Doctors who are aware of some of the most common misinformation can prepare their patients for it in advance. For example, doctors can counter myths about the danger of vaccination even before the patient, or patient's guardian, hears them.
The technique is most effective when patients are active participants. Instead of warning them about disinformation, the doctor could ask, "What might be some myths about vaccinations, and what would you say to debunk them?"
Van der Linden's group has even created an online game in which the user plays the part of an Internet troll, learning the tricks used to spread disinformation and thereby becoming a more savvy information consumer.
The National Institutes of Health also provides a set of criteria that people can use to evaluate health information on websites.[7]
However, persuading a patient to make use of such resources depends above all on the strength of the doctor-patient relationship, and that begins with the basics. "I think the biggest thing is to treat people with respect," Moawad says.
Medscape Psychiatry © 2019 WebMD, LLC
Any views expressed above are the author's own and do not necessarily reflect the views of WebMD or Medscape.
Cite this: A Prescription for Treating Fake Health News - Medscape - May 28, 2019.