Perspective

Reality check: coming to grips with the “misinformation” crisis

The world is drowning in lies: misinformation, disinformation, conspiracy theories, fake news and pseudo-science. Manipulation of the truth, much of it “computational propaganda”, is an industrial-scale problem and a money-making enterprise. The casualty is trust – in governments, institutions, media, leaders, and even our fellow citizens. What are the implications? How can the world combat the new technologies and the tried-and-true rhetoric used to manipulate the facts – and all of us? 

Karen Brandon / Published on 20 December 2022
Perspective contact

Karen Brandon / karen.brandon@sei.org

This perspective is part of SEI’s “Currents 2023” project examining key global issues on the horizon. Join us for the online event on 11 January.

Various propaganda posters on display in Maastricht, Netherlands. Photo: Sinitta Leunen / Unsplash.

Manipulation of facts constitutes a thriving global industry. “Computational propaganda”, as those who study the phenomenon call it, is now said to be an “industrial-scale problem” prevalent in more than 80 countries. Each success or failure provides training data for the underlying technologies, which learn more about what does and does not work to persuade and deceive.

The casualty? Trust. Trust has eroded in democratic processes, public institutions, media and science – and in leaders of government and industry, who have given rise to cynicism by saying one thing and doing another. People are questioning the legitimacy of pillars of society. Indeed, “alternative facts” is a catchphrase of the Information Age.

Lies in disguise

Lies camouflaged as news flourish in old and new media: in some newspapers and magazines, on some television stations, and through unregulated digital platforms. No wonder. It can be very profitable to lie. Outlets gain the readers, audiences and clicks that earn money by exploiting falsehoods about practically anything and everything – even profiteering from conspiracy theories about the murder of schoolchildren.

Bots worldwide push disinformation at opportune moments, when key political events such as elections are on the calendar. Their work is proliferating to such a degree that bot activity dwarfs human activity in promoting conspiracy theories, data scientist Emilio Ferrara said in an interview with Nature. Technological tools can make it difficult to distinguish fact from fiction. People could be forgiven for being fooled by the manipulated videos that went viral showing Nancy Pelosi, Speaker of the US House of Representatives, seemingly impaired. People could be misled because they simply do not know that search engines themselves can be manipulated; in milliseconds, greenwashed search results lend a sheen of legitimacy to fossil fuel producers and enhance the bottom line of the search engines.

Contaminating all pursuits

Against this backdrop, there is evidence of more-fragmented societies, less-civil debates, intolerance of views outside our own echo chambers, and, perhaps, a decay in the critical-thinking skills needed to analyse evidence and arguments in an increasingly complex world. The atmosphere infects all walks of contemporary life – from the smallest day-to-day interactions to efforts to achieve global-level aims. As Gallup put it, “When people lose trust in leaders, their decisions are informed by suspicion and self-interest”.

Will 2023 be the time when we turn the corner? In the 2022 midterm elections in the US, candidates who were election “deniers” lost most (but not all) key races. Fact-checking works. Recent research has enhanced our understanding of how best to address misinformation. Facts still have persuasive power, across party lines and across different countries. Outlets that say they want to eliminate misinformation have many more tools at their disposal – if only they would deploy them, either on their own initiative or via regulations that require platforms to take responsibility for the content they peddle.

What is ahead in a world increasingly flooded with misinformation, disinformation, conspiracies, fake news, and pseudo-science? How can we work together if we each have our own “facts”? How can we restore collective trust at a time when innovators keep finding new ways to fool us? Have we reached “peak propaganda”? Or will this threat keep escalating?

Written by

Karen Brandon

Senior Communications Officer and Editor

Communications

SEI Oxford
