Health misinformation confuses communities, persists in memory

By Lexie Little

[This is part of a series of briefs covering the 2019 State of the Public’s Health conference produced by graduate students at the UGA Grady College of Journalism & Mass Communication in conjunction with the Health and Medical Journalism Program.]

The adage that “communication is key” rings true for doctors, nurses, epidemiologists and others tasked with educating the public about health.

But new research suggests that simply communicating factual information may fail to counter the flood of false health messages spreading on social media and permeating the national conversation.

Misinformation, once absorbed, remains hard to forget, researchers say.

“Once people encounter false information, even if you correct that information, it will persist in memory for an extended period of time,” said Michael Cacciatore, an associate professor of public relations in the Grady College of Journalism and Mass Communication at the University of Georgia. He described a theory called the “continued influence effect,” which researchers now apply to communications studies in the fake news era.

Cacciatore and other panelists spoke about misinformation at the recent State of the Public’s Health Conference. The day-long event focused on various healthcare challenges in Georgia, like high maternal mortality rates and poor access to care in rural communities. The misinformation panel explored how and why bad information about health exists and spreads, and what can be done about it.

Misinformation campaigns on social media have received national scrutiny from health experts. For example, earlier this year, James Madara, executive vice president of the American Medical Association, wrote a letter to executives of platforms like Facebook, Twitter and YouTube urging them to take action.

But it’s also important to look beyond social media, said panelist Leslie Rodriguez Zeigler, a health communication specialist for the Centers for Disease Control and Prevention.

“Folks who are spreading misinformation are getting better and better at knowing how to spread it,” Zeigler said.

She led the CDC’s communication efforts in the wake of the recent measles outbreak in an Orthodox Jewish community in New York. Zeigler pointed to the effectiveness of a print booklet titled Parents Educating and Advocating for Children’s Health, or PEACH, which targeted local parents and discouraged them from vaccinating their children against diseases like measles, mumps and rubella.

PEACH proved successful because the booklet looked legitimate, and the authors understood the community’s culture.

It was “very deliberate,” Zeigler said on the panel moderated by Glen Nowak, director of the Center for Health and Risk Communication at Grady College. Panelists agreed that misinformation is as hard to recognize as it is to discredit, partly because it often looks like news. That polished appearance makes it harder for the public to tell the difference.

One way for health experts to convey the correct message is to include members of the community when crafting communications, said Soroya McFarlane, an assistant professor of communication studies in the Franklin College of Arts and Sciences. Cultural values shape what information is accepted or rejected.

“The more visibility and activity that the community has within your project…that tends to lead to more success,” McFarlane said.

Panelists noted that while deliberate campaigns, which seek to profit from or exploit a situation, are easier to identify, the unintentional sharing of false information remains hard to research. Computer algorithms and social media monitoring may help track the problem, but there’s a long road ahead. Terms like “misinformation” are only beginning to be understood, Cacciatore said.

“The term ‘misinformation’ was meaningless five years ago. Not that it wasn’t a word, but the public didn’t know what it was,” he said. “The public doesn’t know what it is now.”

Posted on November 15, 2019.
