Why we fall for disinformation

Opinion & Analysis

As humans evolved, we developed certain psychological mechanisms to deal with the information surrounding us. But in the 21st-century media environment, where we are exposed to a growing quantity of messages and information, some of these time-tested tools make us dangerously vulnerable to disinformation.

Today, messages of persuasion are not just on billboards and commercials, but in a host of non-traditional places like in the memes, images, and content shared online by friends and family. When viewing an Oreo commercial, we can feel relatively confident that it wants to persuade us of the cookie’s excellence.

The goals of today’s disinformation campaigns are more difficult to discern, and the content creators are harder to identify. Few viewers will have any idea of the goal or identity of the creator of a shared meme about COVID-19 vaccines. And since this content appears in less traditional locations, we are less alert to its persuasive elements.

In a recent study, we examined how, in this disorienting information environment, normal information-processing and social psychological mechanisms can be exploited by disinformation campaigns. Our report, The Psychology of (Dis)information: A Primer on Key Psychological Mechanisms, identifies four key psychological mechanisms that make people vulnerable to persuasion.

Initial information processing

Our mental processing capacity is limited; we simply cannot deeply attend to all new information we encounter. To manage this problem, our brains take mental shortcuts when incorporating new information.

For example, an Iranian-orchestrated disinformation campaign known as Endless Mayfly took advantage of this mental shortcut by creating a series of websites designed to impersonate legitimate and familiar news organisations like The Guardian and Bloomberg News. These look-alike sites were subject to less scrutiny by individual users who saw the familiar logo and assumed that the content was reliable and accurate.

Cognitive dissonance

We feel uncomfortable when confronted with two competing ideas, experiencing what psychologists call cognitive dissonance. We are motivated to reduce the dissonance by changing our attitude, ignoring or discounting the contradictory information, or increasing the importance of compatible information.

Disinformation spread by the Chinese government following the 2019 protests in Hong Kong took advantage of the human desire to avoid cognitive dissonance by offering citizens a clear and consistent narrative casting the Chinese government in a positive light and depicting Hong Kong’s protestors as terrorists.

This narrative, shared via official and unofficial media, protected viewers from feeling the dissonance that might result from trying to reconcile the tensions between the Chinese government’s position and that of the Hong Kong protestors.

Influence of group membership, beliefs, and novelty

Not all information is equally valuable to individuals. We are more likely to share information from and with people we consider members of our group, when we believe it is true, and when the information is novel or urgent. For example, the #CoronaJihad campaign leveraged the emergence of a brand-new disease, one that caused global fear and apprehension, to circulate disinformation blaming Indian Muslims for its origins and spread.

Emotion and arousal

Not all information affects us the same way. Research demonstrates that we pay more attention to information that creates intense emotions or arouses us to act. That means we are more likely to share information if we feel awe, amusement or anxiety than if we feel less-arousing emotions like sadness or contentment.

Operation Secondary Infektion, a Russian-co-ordinated campaign, sought to sow discord in adversaries such as the United Kingdom by planting fake news, forged documents and divisive content on topics likely to provoke intense emotional responses, such as terrorist threats and inflammatory political issues.

Despite their role in the spread of disinformation, these mechanisms are generally healthy and useful in our daily lives.

They allow us to filter through the onslaught of information and images we encounter regularly. They are also the same mechanisms that advertisers have been using for years to get us to buy their products.

The US government is already working on technological means of thwarting state and non-state actors spreading disinformation in and about the United States. Our work outlines two promising categories of techniques in this vein.

One is to provide preventive inoculation, such as warning people about the effects of disinformation and how to spot it. The other is to encourage deeper, analytical thinking.

These two techniques can be woven into training and awareness campaigns that would not necessarily require the co-operation of social media platforms. They could be simple, low-cost, and scalable. — Psychologytoday.com
