Melissa Fleming and Jeremy Heimans on countering vaccine falsehoods

FIRST CAME WARNINGS of martial law in cities. Next, there were quack cures. Then came claims that the virus wasn’t real and swamped hospitals weren’t full. Today, nearly two years since covid-19 appeared and as a new variant circulates, the flood of falsehoods continues. It prevents public authorities from staunching the pandemic, notably among people reluctant to get vaccinated. But it has also become clear that there are ways to fight back.

At the start of the pandemic it was clear that intergovernmental bodies like the United Nations and the World Health Organisation would need to provide authoritative, scientific information. But it was also obvious that the traditional tools the UN uses to communicate would be no match for the chaotic mosh-pit of digital information. No megaphone, especially one optimised to broadcast in the language of technocrats, would work.

Where academics and think-tanks study the problem and fact-checking groups correct rumours and lies, at the UN we knew we needed to do something different, something more muscular. We teamed up with Purpose, a firm specialising in digital public engagement, to go on the offensive, flooding the internet with factual—but unconventionally packaged—information. In the process, we learned a bit about what works against misinformation, though we are far from winning the war. The lessons are urgent not only for tackling covid-19, but for other areas where misinformation holds back progress, from responding to climate change to lessening political, ethnic and religious tensions.

The core of a counter-misinformation strategy is to respond using the same methods as the problem itself: viral content. Generic public-service announcements and infographics are no match for the around-the-clock disinformation engines of attention-seeking digital influencers or sophisticated state-sponsored groups. Hardcore anti-vaxxers themselves are relatively small in number, but their falsehoods are enough to keep the much larger cohort of vaccine-hesitant people on the fence. And the vaccine hesitant, often with legitimate concerns, inadvertently become carriers of false information.

Most public institutions are ill-equipped to respond. To level the playing field just a bit, a system is required to create, curate or commission content of all kinds, at significant speed and scale. Then the material needs to be rapidly tested and optimised so it is likely to spread person to person. Finally it needs to be distributed on social media, in private messaging apps and through traditional media.

Over the past 15 months the UN and Purpose built a new kind of global communications infrastructure to do just this.

Countering the cacophony

The programme, called Verified, promoted science-backed information, especially to those vulnerable to misinformation. Working with distribution partners, Verified produced more than 2,000 pieces of content in 60 languages, and collaborated with organisations and influencers on thousands more, from lo-fi memes to music videos. So far, these messages have reached well over a billion people.

For example, Verified partnered with a popular Facebook page for millennials, VT, which has 25m followers, to distribute short videos of doctors dispelling covid-19 myths in a chatty, non-intimidating way. We hosted a series of live AMAs (Ask Me Anything) on Reddit—at times a cauldron of misinformation—putting experts directly in front of a cynical and doubting crowd. And we worked with MultiChoice, one of the biggest pay-TV services in Africa, to get young people to produce ads for Verified in an edgy voice the UN itself could never use.

A lesson from this is that institutions sometimes need to step back and let others do the talking. There are times when an organisation’s legitimacy needs to be centre-stage: early in the pandemic the UN’s credibility and brand recognition were critical to reach audiences and to broker vaccine-distribution deals. But in an age of declining trust in institutions, there is a skill in knowing when your brand is going to invite suspicion or mistrust, especially among those steeped in conspiracy theories.

As a result, many of the GIFs, digital stickers and memes that we produced and distributed on Giphy and Tenor (the big content libraries used on platforms like WhatsApp, Instagram and TikTok) were unbranded and more likely to be shared. We identified misinformation terms that were trending, such as “vaccines kill”, and injected pro-vaccine content into the mix so people searching with those terms saw accurate information too. In some cases we paid for the material to be more prominent on the platform—just as a company might for a marketing message, but in the service of facts. The most popular of these highly visual, intentionally unpolished messages has been viewed and shared almost 150m times.

Finding experts to speak to the public is not easy but it is essential. Polls show that doctors and scientists are the most trusted voices on covid-19, even among sceptical people. Yet few health experts are great communicators or immediately persuasive, at least to some audiences. So we began an experiment called Team Halo. In summer 2020 we reached out to scientists developing vaccines and recruited the most engaging ones for training in how to explain covid-19 vaccines through short TikTok videos. The trainer was Max Klymenko, a 26-year-old TikTok celebrity, so the content would fit the vernacular of the platform.

Team Halo is now a network of more than 100 “guides”—scientists, doctors and others—making self-shot videos in 13 countries. It looks and feels radically different from official public-health communications. And that is the point. The videos are conversational, vulnerable, funny and highly shareable (such as Dr Karan on myths about masks). The guides have produced more than 3,000 videos, which have received more than 515m views across all platforms, about half on TikTok alone. Data from Facebook and Opinium, a research firm, showed that exposure to Team Halo content increased vaccination intent by 12 percentage points.

We are not alone in doing this. In America, a group called This is Our Shot has equipped more than 30,000 doctors and healthcare workers with social-media training. The ONE campaign cleverly mobilised celebrities to hand over their social channels to healthcare workers. Together, the initiatives mark a new model of public communications to humanise experts and deploy them to spread vital messages.

In some cases, however, non-experts can be better messengers, carrying over their credibility in a specific area, like video-gaming or cooking, or among a particular demographic. Institutions will increasingly need to learn how to cultivate these types of “micro-influencers” to reach their followers. They will also need to accept that the influencers will not always stick to the script. By tightly controlling their messages, institutions all but guarantee that only those who trust mainstream media will get the word. Building and funding bottom-up networks of messengers will lead to awkward moments—a mangled fact or misleading advice—but also to vastly higher engagement with the material.

These methods, when they work, scale easily on the internet. After all, this is exactly how misinformation actors do so much damage from a small number of accounts. To be effective, organisations need to understand the psychology of online sharing. Misinformation spreads most effectively not when it is broadcast on social media, but when it is sent person-to-person in private messaging apps, bypassing the platforms’ patchy monitoring tools. Sharing is often impulsive and emotionally driven, and links are often not clicked before an item is forwarded.

We tried a number of experiments to interrupt the impulsive act of sharing by spreading the concept of “pause”. The idea comes from behavioural research showing that if the thrill of reposting a shock-story or inflammatory meme is interrupted even briefly, better decisions prevail and less bogus news gets shared. The Pause campaign creates content, runs advertisements and encourages others to make their own material asking users to stop and think before blindly sharing. Twitter and Facebook came to the same conclusion and introduced a prompt when someone tries to share an article they haven’t read.

Cynics might regard calls to pause as akin to the dubious “Just Say No” anti-drug campaign from America in the 1980s. But there is evidence that they work. A forthcoming MIT study in Britain and America found that the simple act of pausing to question the origin, credibility, relevance or accuracy of information before sharing it reduced people’s propensity to distribute misinformation by up to nine percentage points.

An infrastructure of accuracy

It might be tempting to dismiss these methods as just a race to the bottom in pursuit of the silliest meme. But this posture is exactly what has held back big institutions as they attempt to reach people about covid-19 and other issues. Our work suggests that if organisations combine rigorous science with humour, humanity and a willingness to engage in the culture of the internet, they at least stand a chance in a frenetic information environment. Experts and technocrats need to work hand-in-hand with digital communicators—an unnatural but vital pairing.

Of course the big tech platforms should be doing more too. Their systems prioritise content that triggers users rather than informing or uniting them. Though they remove many of the most egregious accounts, their actions do not match the scale of the problem. And they are utterly ineffective at limiting person-to-person sharing once falsehoods have entered the bloodstream of their networks. We can’t look to them for the complete solution (nor, conversely, expect them alone to adjudicate questions of legitimate scientific debate).

We are in an information war. Many institutions treat communicating with the public as burdensome or as an afterthought. Yet it is just as important as developing and delivering the vaccines themselves. In public health, as with other issues, losing the information war has devastating consequences. To respond, organisations need to invest substantially in building modern communications infrastructures of the kind we have developed at the UN around covid-19. They also need to bring tactical innovation, a willingness to deploy others’ voices, a higher tolerance for risk and, yes, humility.

The crux is that institutions need not remain passive and rue the infodemic: they can fight back. They might not eliminate misinformation—the problem is vast—but at least they will give the truth a fighting chance.

______________

Melissa Fleming is the United Nations’ undersecretary-general for global communications. Jeremy Heimans is the chief executive of Purpose, an organisation that develops global campaigns. He is the co-author with Henry Timms of “New Power” (Doubleday, 2018).