This week at the 2022 Health Resource Discourse Initiative (HRDI) Symposium, I was delighted to spend time alongside a diverse group of academics (from both the computer science and the social science research sides), researchers, technologists, and other practitioners working to unpack some of the tough questions around health-related misinformation and disinformation.
Through our partnerships in the space, Constella has participated in events alongside the Media Ecosystems Analysis Group (MEAG) in past years. MEAG is a non-profit organization that conducts media analysis research and organizes educational activities to help organizations use communication and media more effectively in support of social change, public health, and human rights agendas. MEAG is also associated with the Media Cloud project, initially developed at the Berkman Klein Center for Internet and Society at Harvard University and incubated at the MIT Media Lab.
Here, I outline a few hard truths about the disinformation ecosystem that we examined, unpacking what we’ve learned about the area of study and our approaches to understanding the dynamically evolving phenomenon over several years of research.
"Data voids" precede many manipulation techniques.
The spaces where manipulative tactics can flourish are as fascinating as the tactics themselves. In a study on Coordinated Link Sharing Behavior (CLSB) as a strategy to hack the attention economy, Fabio Giglietto’s (University of Urbino) team analyzed coordinated shares of the same news article within a limited period by networks of Facebook pages, groups, and verified public profiles, demonstrating how this strategy represents an apparent attempt to boost content reach and game the algorithm that governs the distribution of the most popular posts. Using the methodology the team developed to identify CLSB, they detected 26 networks composed of 124 entities impacting the Italian digital conversation in the Covid-19 “informational void”. “Data voids” are often filled with manipulative content in the wake of breaking news stories or crises; advancing detection techniques like these will be critical to understanding what these spaces mean for “information disorder” and how we can address them through proactive communications and monitoring.
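The core of CLSB detection can be sketched in a simplified form: flag pairs of entities that share the same URL within a short time window, then treat connected components of the resulting co-sharing graph as candidate coordinated networks. The records, entity names, and 60-second window below are illustrative assumptions, not the study's actual data or thresholds (which are estimated from the data itself):

```python
from collections import defaultdict
from itertools import combinations

# Toy share records: (entity, url, timestamp in seconds). Purely illustrative.
shares = [
    ("page_a", "http://example.com/story1", 0),
    ("page_b", "http://example.com/story1", 30),
    ("page_c", "http://example.com/story1", 45),
    ("page_d", "http://example.com/story1", 5000),  # too late: not coordinated
    ("page_a", "http://example.com/story2", 100),
    ("page_b", "http://example.com/story2", 130),
]

WINDOW = 60  # assumed max gap (seconds) for two shares to count as coordinated

def coordinated_pairs(shares, window=WINDOW):
    """Count, per entity pair, how often they shared the same URL within `window`."""
    by_url = defaultdict(list)
    for entity, url, ts in shares:
        by_url[url].append((entity, ts))
    pairs = defaultdict(int)
    for posts in by_url.values():
        for (e1, t1), (e2, t2) in combinations(posts, 2):
            if e1 != e2 and abs(t1 - t2) <= window:
                pairs[tuple(sorted((e1, e2)))] += 1
    return dict(pairs)

def networks(pairs):
    """Connected components of the co-sharing graph = candidate CLSB networks."""
    adj = defaultdict(set)
    for a, b in pairs:
        adj[a].add(b)
        adj[b].add(a)
    seen, comps = set(), []
    for node in adj:
        if node in seen:
            continue
        stack, comp = [node], set()
        while stack:
            n = stack.pop()
            if n not in comp:
                comp.add(n)
                stack.extend(adj[n] - comp)
        seen |= comp
        comps.append(comp)
    return comps

pairs = coordinated_pairs(shares)
print(networks(pairs))  # one candidate network: page_a, page_b, page_c
```

The repetition count per pair matters in practice: a single near-simultaneous share can be coincidence, while repeated co-shares across many URLs are the signal of coordination.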
The rate of evolution of malign tactics depends on their effectiveness.
Advertising is a field in constant change because it depends on the susceptibility (and ignorance) of its targets. It continuously needs reinventing because its subjects quickly learn how it works and the modalities used to further its goals. This is a critical force in the evolution of malign influence and disinformation, as tactics must increasingly rely on more subtle and nuanced techniques that are less explicit and more deeply entrenched within the fabric of the ecosystem. Although explicitly false narratives may be easier to identify and call out when emerging from officials or media tied to nation-states, the enlistment (witting or unwitting) of cultural proxies, influencers, and media to disseminate competing narratives, or the inflammation and amplification of existing social divisions (to name a few), are highly effective at destabilizing public discourse and undermining trust in processes and institutions. Coercion, in this sense, is multidimensional and consists of a broader strategy and ecosystem of entrenched activities, always tending toward a combination of the explicit and the subtle: an ambiguity that presents a wide range of risks for individuals, companies, and public institutions.
Attention = Influence = Monetization
We now live in the attention economy, an economy in which all actors vie for attention in digital terms; that is, time spent on one piece of digital “real estate” or another. In the same way that physical stores were once intent on winning the physical presence of their customers, websites aim to achieve the same. However, the end goal is now different. The objective is no longer solely maximizing the possibility of a potential purchase. Instead, the more time users spend on a website (and the more traffic a domain can attract), the more advertisers will spend placing their ads there. Influence means potential for monetization. As discussed in an article in Fast Company mentioning the fantastic Check My Ads project, to understand why disinformation is so prevalent (and was so prevalent in the public health space during the Covid-19 pandemic), it is critical to understand the economics incentivizing malign actors and sustaining their business models. Attention, although in part driven and sustained by economic incentives, translates into influence over public opinion, sentiment, and collective decision-making.
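The economic logic above reduces to simple arithmetic: attention (pageviews) converts into ad impressions, and impressions convert into revenue at a rate set by the CPM (cost per thousand impressions). A toy sketch, with traffic and CPM figures that are assumptions chosen purely for illustration:

```python
def ad_revenue(monthly_pageviews: int, ads_per_page: int, cpm_usd: float) -> float:
    """Monthly ad revenue = (impressions / 1000) * CPM.

    CPM (cost per mille) is the price an advertiser pays per
    thousand ad impressions served.
    """
    impressions = monthly_pageviews * ads_per_page
    return impressions / 1000 * cpm_usd

# Illustrative figures only: a site doubling its traffic doubles its
# ad revenue, all else being equal -- the structural incentive to
# maximize attention by any means that works.
baseline = ad_revenue(1_000_000, 3, 2.50)  # 7500.0
doubled = ad_revenue(2_000_000, 3, 2.50)   # 15000.0
print(baseline, doubled)
```

The point is the linearity: under this model there is no revenue penalty for capturing attention with sensationalist or false content, which is precisely the structural flaw projects like Check My Ads target by pressuring advertisers to withdraw spend from disinformation sites.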
Of course, this is only one facet of the network of incentives that now govern our digital ecosystem, as journalists and media must also compete for attention to remain relevant and sustain their businesses. When sensationalist and even false narratives are more effective at garnering attention and traveling through the digital sphere, malign actors leverage this fundamental reality to produce (or work with other economically incentivized actors to produce) diverse campaigns of malign influence targeting individuals, companies, institutions, and the stability of discourse itself.
Commoditization of the information economy also gives rise to alternative markets, where various items can be found, including botnets, deepfake-production capabilities, merchandising, illicit goods, and other products tied to digital communities. In this way, we can understand how the challenge of malign influence and disinformation is an ideological one only in a superficial sense. At the core of the problem, the exploitation and inflammation of preexisting political, religious, ethnic, class, and other social divides rely on an architecture that produces distinct business models and incentive structures. Susceptibility is most pronounced in polarized societies. Still, to understand how this polarization is exploited in the digital sphere, we must understand the risks and structural flaws of the digital infrastructure that relies upon and monetizes this engagement.