As a species, we don’t seem to get any smarter about rumors and misinformation – even if our technology evolves. Take Ebola and compare its scares to those of another deadly epidemic: AIDS, 30 years ago.
Back then, I was a broadcast health and science reporter, covering HIV/AIDS before it was mainstream news (that came in 1985, after Rock Hudson died). In those pre-web-browser days, cutting-edge communications tech meant personal computer bulletin board systems, dial-up online services like CompuServe, and commercial, proprietary email systems such as MCI Mail. Nothing digital broadly connected to anything else, so bad information was spread, and combated, by mass media like TV, radio, and newspapers, and by individual communications such as telephone calls.
Now we’ve theoretically progressed by adding many more public and private communications channels, Internet-linked and fast-acting: social media, web news sites, and apps in our pockets.
I was quietly reflecting on these technological improvements recently when a tweet from a Seattle television news anchor flashed onto my screen: “#Ebola can be spread thru the air, according to U. of Minnesota researchers. CDC, WHO advised of findings.”
A click on an included link brought me to a web news story with an equally frightening headline (“Ebola Is Airborne, University Of Minnesota CIDRAP Researchers Claim”), which had largely rewritten and linked to a post on a site founded by political operative Andrew Breitbart (“MEDICAL RESEARCH ORG CIDRAP: EBOLA TRANSMITTABLE BY AIR”), which in turn referenced and linked to the actual “report” – a much-earlier guest opinion piece, not a medical finding, on the Center for Infectious Disease Research and Policy site (“COMMENTARY: Health workers need optimal respiratory protection for Ebola”).
Just three clicks to find out that it was opinion, not fact; that it was a month old, not breaking; and that the authors were not with the medical research organization. CIDRAP had simply posted their perspective.
But they were three easy clicks the original tweeter – and apparently, many who retweeted it uncritically – didn’t make.
When my blood pressure returned to normal and the steam from my ears dissipated, I ended my quiet reflection on our communications progress and turned to my personal Wayback Machine.
Fear-mongering was also common with AIDS. A major difference was that, unlike Ebola, AIDS was new and HIV’s transmission was not well understood. It was first stigmatized as the “gay cancer,” and myths about getting it from toilet seats or casual contact had to be repeatedly addressed.
Ebola is obviously frightening and a horrific way to die. But you’d think that now, with more channels of social communication technology, not fewer, and a disease that is better researched, not less, knocking down Ebola rumors would be easy.
Instead, it’s Whac-A-Mole time. Because of social media, misinformation moves fast from many more points.
Dr. Robert Wood, director of HIV/AIDS control for Public Health-Seattle & King County from 1986 until his retirement four years ago, speculates that, partly because of advances in communications tech, with Ebola “my guess is that lots more people are aware than in the early days of AIDS.”
Karen Hartfield, communicable disease epidemiology manager for Public Health-Seattle & King County, thinks social media and websites can be a double-edged sword in providing accurate information: “We are often playing catch up and responding to rumors, when we wish we could spend the time disseminating accurate messages and developing and implementing programs.”
One challenge is the forced brevity. “Social media can ‘dumb down’ information so that it is no longer accurate,” Hartfield notes. “Social media often doesn’t lend itself to complexity.”
Even with no 140-character or physical space restrictions, websites create problems – including those of major media outlets. “The comments aren’t screened for accuracy. I know people who breeze right by the actual story and read the comments instead (more entertaining, more engaging),” Hartfield observes.
In the case of Twitter, researchers at the University of Washington are already working to develop an algorithm to identify rumors, based in part on an analysis of more than ten million tweets from the Boston Marathon bombings. However, that still leaves other uncontrolled point sources of digital pollution, from Facebook to deliberately misleading stories used as clickbait.
The consequences of rumors or misinformation can be greater than just stoking fear (or, presumably, ratings). Some schools in Ohio and Texas shut down after reports that students or staff were on a plane with a nurse who had been diagnosed with the Ebola virus.
Still, Hartfield points out that social media can be a force for good as well as the opposite. “We recently used our website, Facebook page and Twitter feed to dispel myths about Ebola and get people focused on other critical public health issues like Enterovirus D68 and influenza,” she says, highlighting social media’s ability to target specific audiences such as racial, ethnic and sexual minority communities.
Speed, facts, and repetition, repetition, repetition. What held true with AIDS three decades ago, with fewer and less prevalent tech-propelled communications tools, is even more true now. Tech alone can’t change human nature or biases, but it can amplify the best (or worst) of what we are.
And if you’re still really freaked out by speculative tweets?
Hartfield advises that to reduce your risk from widespread illness and thousands of deaths every year, “Get a flu shot.”