Researchers at the University of Washington’s Center for an Informed Public are studying the science behind social media platforms like Facebook, Twitter and others. (Jeremy Zero Photo via Unsplash)

As misinformation about COVID-19 and vaccines is spiking, the University of Washington’s Center for an Informed Public (CIP) is gearing up its efforts to document, understand and combat the rampant spread of unfounded claims across social media platforms.

The center announced today that it has received $2.25 million of a $3 million grant from the National Science Foundation. The money will be used to “develop and evaluate ‘rapid response’ methods for studying and communicating about disinformation,” said Kate Starbird, a UW Human Centered Design and Engineering associate professor who will lead the project.

The new effort will start in October and includes support from Stanford University.

The CIP has worked on similar issues before. In the summer of 2020, the UW joined the Election Integrity Partnership, a multi-university team that monitored mis- and disinformation about the November election as it spread on social media and shared its findings in real time.

The CIP launched in 2019 and facilitates collaboration among professors in engineering, law, biology and other fields to examine the powerful role that Facebook, Twitter and other platforms play in communication worldwide.

Joseph Bak-Coleman, a post-doctoral researcher at the University of Washington’s Center for an Informed Public. (UW Photo)

In June, the CIP’s Joseph Bak-Coleman was the lead author of a paper that called for elevating the study of “collective behavior” — namely how we gather and share information and make decisions — to the urgent status of a “crisis discipline.” The research was a sort of call to arms to bring attention to the massive challenges represented by misinformation and communication networks. It was published in the Proceedings of the National Academy of Sciences.

“We have global problems, which will require global communication. We’re not going to fix global warming if we can’t talk to each other. So it’s great that we have these tools for allowing information to spread globally,” he said. “Unfortunately, as they’re currently constructed and used, they don’t seem to necessarily be optimized for that. They’re optimized for revenue.”

In the face of the pandemic, national elections, the climate crisis and other events, the social media platforms have been hotbeds for confusion and worse. As vaccination rates lag and the Delta variant of COVID surges, President Biden last month accused Facebook of “killing people” for failing to effectively curb the spread of untruths about vaccines.

Biden later softened his message, but there are widespread calls to more aggressively regulate the platforms as the multi-billion-dollar companies have had limited success in self-policing inaccurate content.

“It seems almost unreasonable to suggest that we can just let things go and some invisible hand will guide society towards the happiest, healthiest possible future,” said Bak-Coleman.

We caught up with Bak-Coleman for an interview about his recent publication. Answers have been edited for clarity and length.

GeekWire: Your background is in biology and you see parallels between individuals engaging in social media networks and biological systems. Can you explain?

Bak-Coleman: I work in collective behavior, originally on fish schools but across species. And one of the things we keep seeing is that when animals do these magical things, or things that seem magical — like flocks of birds deciding where to go, fish avoiding predators or locusts swarming in unison — it’s all simple local rules, and then the network structure allows [collective behavior] to emerge.

One of the more haunting examples is ants that follow pheromone trails. The rule they follow is roughly: if I smell a pheromone trail laid down by another ant, I stay on it, which causes me to lay it down as well. If they wind up going in a circle, they can get stuck there until they all starve to death. These are called ant mills, or ant death spirals.

I was learning about this at the same time as the 2016 elections, which for a lot of Americans were a wake-up call that social media is doing something. And I happened to be teaching a conservation biology class at the same time, so all of that came together.
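
To make that feedback loop concrete, here is a minimal, hypothetical simulation of the rule Bak-Coleman describes. It is not taken from his paper, and every name and parameter value in it is invented for illustration; it only shows how "stay on the trail and reinforce it" keeps a closed loop going once ants stumble onto one.

```python
# A minimal, hypothetical sketch (not from the PNAS paper) of the "ant mill"
# dynamic Bak-Coleman describes: each ant follows one simple local rule
# ("if I smell a pheromone trail, stay on it and lay more pheromone"), and
# once the trail closes into a loop, that reinforcement traps every ant on it.
# All names and parameter values here are invented for illustration.

import random

N_CELLS = 30        # cells in a closed (circular) trail
N_ANTS = 20         # ants that have wandered onto the loop
DEPOSIT = 1.0       # pheromone each ant lays on the cell it steps onto
EVAPORATION = 0.90  # fraction of pheromone that survives each time step
THRESHOLD = 0.5     # smell level below which an ant abandons the trail

pheromone = [1.0] * N_CELLS                   # the loop already smells of trail
ants = [random.randrange(N_CELLS) for _ in range(N_ANTS)]

for step in range(200):
    # Evaporation: the trail fades unless ants keep reinforcing it.
    pheromone = [p * EVAPORATION for p in pheromone]

    still_circling = []
    for pos in ants:
        if pheromone[pos] >= THRESHOLD:
            # Local rule: the trail is strong enough, so stay on it,
            # step forward along the loop, and lay more pheromone there.
            new_pos = (pos + 1) % N_CELLS
            pheromone[new_pos] += DEPOSIT
            still_circling.append(new_pos)
        # Otherwise the ant wanders off and is no longer part of the mill.
    ants = still_circling

    if step % 50 == 0:
        print(f"step {step:3d}: ants still circling = {len(ants)}")

# Each ant keeps re-smelling the pheromone it laid down on the previous step,
# so the trail under it never drops below THRESHOLD and no ant ever escapes:
# no individual rule is wrong, but the collective outcome is a death spiral.
```
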

GW: What is the path to more responsible social media?

Bak-Coleman: In the most optimistic part of my brain, I would hope that at some point the companies would realize, and regulators and the general public would have the awareness that, ‘Wow, this is a thing we can’t just monetize,’ and then we have to find sustainable business models.

That’s my most optimistic self. I don’t know that that’s going to happen with every company. Certainly Facebook is showing that’s not the direction they want to head.

So it might come down to regulators realizing that this sort of chaotic social system is just a nightmare for governance; that might be part of it. It could be the general public finding that they don’t love the idea of large companies regulating society and how we interact. Or maybe scientists finding clever ways to reveal how harm is occurring as a result of these technologies. I think all those things coming together will, hopefully, push us toward transparency. And then ideally transparency will bring more attention to what’s happening and kind of be a feedback process.

GW: How have social media companies ducked regulation?

Bak-Coleman: One of the things that fossil fuel companies did — and the same with tobacco companies and the Sackler family with opioids — is agnotology: trying to create uncertainty. That’s their goal — you create enough doubt, enough uncertainty, to avoid regulation.

And then you put little patches on it. You put a filter on a cigarette and say, now it’s safe. I think the companies are actually following the same playbook.

If you looked at that press release by Facebook [on July 17], it was filled with half-baked stats. Saying things like 85% of Facebook users express interest in vaccines and all of this, it’s almost like textbook disinformation, trying to create the impression that the company has only done good.

GW: You’re calling for “evidence-based stewardship” of communication networks. What does that mean?

Bak-Coleman: All we’re advocating for is that scientists should start thinking about how the system works and how it fails. Then we can allow the public and regulators to make informed decisions about our social systems.

There are some fundamental things that we as scientists just don’t understand yet about how you construct a healthy communication network at scale, ideally one that is also profitable for the companies. We need to figure that out.

We’re not advocating for some sort of technocracy or elite-driven social media system; far from it. We are advocating for an understanding that allows society at large to make an informed decision about how it wants to structure social media systems, ideally in a way that provides everyone with a voice and access to information.

GW: What are the broader implications if social media is no longer a prime source for misinformation?

Bak-Coleman: If we sort this problem out, then we’ll sort a lot of other things out as well. If we have a good, healthy information ecosystem, it shouldn’t be hard to see leaders elected who advocate for basic public health policy. It shouldn’t be hard to get people to take safe, effective vaccines.

And so it is a harder problem in some ways, but a lot of the reason why climate change is so hard [to respond to] is that we don’t understand collective behavior, and that’s the thing we’re trying to get at.

It is a big problem and it’s urgent, but we can make progress. And it might not be at the full scale of creating utopia, but it could be adjusting recommendation algorithms so that more people get vaccinated, or it could be avoiding radicalization and stopping genocide.

We can make really tangible progress, probably fairly easily, on big parts of this, even if the big healthy ecosystem might be a little ways off.
