Marc-André Argentino is a PhD candidate in the Individualized Program (INDI). His research examines how extremist groups leverage technology to create propaganda, recruit members to ideological causes, inspire acts of violence and impact democratic institutions. He has an MA from Université Laval and a BA from Concordia. Marc-André is an associate fellow at the Global Network on Extremism & Technology and an associate researcher at the Centre d'Expertise et de Formation sur les Intégrismes Religieux, les Idéologies Politiques et la Radicalisation.
Conspiracy Pilled: The Growing Public Health Threat Posed by Conspiracy Theories and Disinformation
Over the past few months in the US and Canada, attention has focused largely on the US election and political disinformation. COVID-19 took a backseat in the news cycle; however, with a second wave upon most of the world and the delivery of the first COVID-19 vaccines, we cannot minimize the public health threat that disinformation and conspiracy theories pose. Democratic societies must strive to ensure healthy lives and promote well-being for everyone at all ages; however, disinformation, political partisanship and conspiracy theories around COVID challenge our ability to do so.
The infodemic is a real public health threat, as it has been a major challenge for medical professionals to provide meaningful advice, especially when there have been cases of patients denying the existence of COVID-19 as they die of the virus, or ER nurses refusing to take the vaccine for "political reasons". Governments, journalists, academics and experts in various fields face, at times, insurmountable challenges, as disinformation can spread across the web, into the real world, and move on to the next piece of disinformation before fact checkers and experts can even respond to the first.
Although all the major social media platforms have launched efforts to curb COVID disinformation and conspiracy theories, and attempt to direct users to trusted sources of COVID information, this has not been enough. Like the fact checkers, by the time platforms decide to act against a specific piece of disinformation, the world has moved on to the next one. The issue is not the platforms' moderation efforts; rather, the platforms, like their users, are at the mercy of algorithms that curate spaces for the rapid spread of disinformation.
The lack of transparency around the platforms' algorithms, and the fact that these algorithms are core to their business model and source of income, means that there will never be any real transparency around them, nor will their core functionality change to prevent this public health threat. However, the platforms are only a vehicle (albeit a very effective one). A key feature of the disinformation ecosystem is the role played by alternative information sites that serve as production lines for disinformation and conspiracy theories.
Snake Oil, Online
Added to this is the role played by sites or individuals that present themselves as medical professionals or organizations while selling "snake oil" and promoting dangerously false information about COVID. Our digital ecosystem has always been a social determinant of public health and has played a key role in protecting it; as the infodemic has grown, this role has been thrust into the spotlight, along with its flaws and vulnerabilities. This is nothing new, but it is a phenomenon exacerbated by the global pandemic. These sites are designed to suggest that they are legitimate, recognized, professional organizations, billing themselves as an alternative source of information and facts to the mainstream outlets their target audience already doesn't trust.
Though social media platforms are behemoths that will require much more effort from government and civil society to bring into line with the public good, these alternative websites also need to be tackled, yet they fall outside current regulation plans. These sites prey upon the fears and emotions of their target audience. A good rule of thumb: if you read or view something that makes you feel very happy or very angry, fact check it. Facts and information should inform you, not give you mood swings. If a source is playing with your emotions, you are likely being manipulated; it is trying to sell you something that will deal with that emotion.
Why are they able to promote their content on larger platforms and build their audiences from there? Ah yes -- because they pay. How much damage could be reduced if this were not possible? You should not be able to target a vulnerable audience with disinformation and conspiracies. You should not be able to make a single dollar off of a global public health crisis. Yet many grifters in the alternative health, information and conspiracy theory space are making a killing. Cutting off their capacity to purchase views or make a dollar off views should not be impossible in this crisis.
Assessing the Hydra
Now many will likely respond with "ban all the content, drop the hammer on their heads". How has that worked so far? If we look at a community like QAnon, yes, Facebook has done a decent job removing them from its platform, as has YouTube, although Twitter…not so much. Yet even though the communities and the YouTube channels of influencers are gone, QAnon content still abounds on these platforms, much of it still COVID-19 related.
What has happened was both unsurprising and inevitable: they simply migrated to alternative platforms. There is a positive here: you break the cycle of influence and remove the audiences and virality the algorithms provide. However, because these alternative platforms are echo chambers, the narratives become more toxic and more extreme. Parler is a great example of this, where some individuals in positions of power (including current and former elected officials) will use their official Twitter account to say one thing (generally normal) and, within minutes, promote toxicity and vitriol on Parler.
What is happening, though, is that the audiences of disinformation are still receiving it across multiple alternative social media platforms, video platforms and information sites. The problem still exists and is still thriving; it has just been swept under the rug.
This also poses a challenge for disinformation experts, who still need to track these narratives and inform decision makers of what is going on. We need to be informed as a community to better combat medical disinformation. At this point, isolating the segment of the population that has been targeted by disinformation is ineffective: they will still seek out this information from the sources they trust and continue to pose a public health threat to our societies.
So, what do we do? I hate to complain without suggesting solutions. We need a multisectoral approach where governments, the community of experts and civil society work together to disincentivize participation in the disinformation and conspiracy space. During a crisis or significant event, uncertain information-seekers are more likely to consume problematic information at a faster rate and in higher volume rather than wait in uncertainty; there is a search for quick emotional gratification. Better funding, promotion and support of credible and independent online platforms and news sites is a start, as is better funding and support for disinformation experts, along with access to platform data to accomplish this work.
We must also acknowledge that disinformation and conspiracy theories exist in data voids created by de-platforming, and the erosion of trust in journalists, news organizations, academics and governments. A fragmented and polarized population leads to a lack of collective trust, which destabilizes the digital ecosystem around experts and scientific institutions and leaves our societies more vulnerable to the next wave of disinformation.
We must remove the capacity for disinformers and conspiracy theorists to monetize information that poses a threat to public health. Platforms must isolate disinformation websites that pose as news outlets, create more friction around them, and limit their reach. There needs to be an effort to limit the capacity of domestic disinformation actors to coordinate the spread of information that is a public health threat, the same way platforms limit coordinated inauthentic behaviour from foreign state actors.
Greater effort needs to be put into promoting science- and fact-based information about COVID, especially when viral disinformation or conspiracy theories run counter to verified information. This won't reach those who are hardcore conspiracy theorists, but it will go a long way toward making it more difficult to "infect" more people with disinformation and conspiracy theories that are a public health threat.
Above and beyond the public health threat, these conspiracy theories and disinformation have led to both violence and property damage. We have seen government health experts need bodyguards, doctors and nurses threatened and harassed, and cell towers burned. COVID disinformation has fuelled the growth and proliferation of QAnon, and has mobilized some individuals to acts of violence. Disinformation and conspiracy theories will continue to polarize and radicalize our societies, especially as vaccines roll out. As more news of vaccine rollouts arrives, we need to prepare for a new wave of conspiracy theories and disinformation, but also prepare to educate those in this disinformation space about the importance of a vaccine in a way they will be receptive to.
The groundwork for positive narratives around the vaccine rollout needs to start now; those narratives must be tailored to this audience and delivered by sources it trusts. That will not be an easy challenge. I may paint a possible positive road map, but in reality I believe we face a long and difficult road ahead, where things will likely get worse before they get better.
Though I am always happy to be wrong.