How do you counter misinformation? Critical thinking is step one : Planet Money : NPR


Misinformation

Late last year, in the days before Slovakia's parliamentary elections, two viral audio clips threatened to derail the campaign of a pro-Western, liberal party leader named Michal Šimečka. The first was a clip of Šimečka announcing he wanted to double the price of beer, which, in a nation known for its love of lagers and pilsners, is not exactly a popular policy position.

In a second clip, Šimečka can be heard telling a journalist about his intentions to commit fraud and rig the election. Talk about career suicide, especially for someone known as a champion of liberal democracy.

There was, ،wever, just one issue with these audio clips: They were completely fake.

The International Press Institute has called this episode in Slovakia the first time that AI deepfakes — fake audio clips, images, or videos generated by artificial intelligence — have played a prominent role in a national election. While it’s unclear whether these bogus audio clips were decisive in Slovakia’s electoral contest, the fact is Šimečka’s party lost the election, and a pro-Kremlin populist now leads Slovakia.

In January, a report from the World Economic Forum found that over 1,400 security experts consider misinformation and disinformation (misinformation created with the intention to mislead) the biggest global risk over the next two years — more dangerous than war, extreme weather events, inflation, and everything else that’s scary. A bevy of new books and a constant stream of articles wrestle with this issue. Now even economists are working to figure out how to fight misinformation.

In a new study, “Toward an Understanding of the Economics of Misinformation: Evidence from a Demand Side Field Experiment on Critical Thinking,” economists John A. List, Lina M. Ramírez, Julia Seither, Jaime Unda and Beatriz Vallejo conduct a real-world experiment to see whether simple, low-cost nudges can be effective in helping consumers to reject misinformation. (Side note: List is a groundbreaking empirical economist at the University of Chicago, and he’s a longtime friend of the show and this newsletter.)

While most studies have focused on the supply side of misinformation — social media platforms, nefarious suppliers of lies and hoaxes, and so on — these authors say much less attention has been paid to the demand side: increasing our capacity, as individuals, to identify and think critically about the bogus information that we may encounter in our daily lives.

A Real-Life Experiment To Fight Misinformation

The economists conducted their field experiment in the run-up to the 2022 presidential election in Colombia. Like the United States, Colombia is grappling with political polarization. Within a context of extreme tribalism, the authors suggest, truth becomes more disposable and the demand for misinformation rises. People become willing to believe and share anything in their quest for their political tribe to win.

To figure out effective ways to lower the demand for misinformation, the economists recruited over 2,000 Colombians to participate in an online experiment. These participants were randomly distributed into four different groups.

One group was shown a video demonstrating “how automatic thinking and misperceptions can affect our everyday lives.” The video shows an interaction between two people from politically antagonistic social groups who, before interacting, express negative stereotypes about the other’s group. The video shows a convincing journey of these two people overcoming their differences. Ultimately, they express regret over unthinkingly using stereotypes to dehumanize one another. The video ends by encouraging viewers to question their own biases by “slowing down” their thinking and thinking more critically.

Another group completed “a personality test that shows them their cognitive traits and how these make them prone to behavioral biases.” The basic idea is that they see their biases in action and become more self-aware and critical of them, thereby decreasing their demand for misinformation.

A third group both watched the video and took the personality test.

Finally, there was a control group, which neither watched the video nor took the personality test.
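To make the design concrete, here is a minimal, purely illustrative sketch of a four-arm randomized assignment and a per-arm outcome comparison. All names, the seed, and the simulated responses are hypothetical — this is not the authors' code, just the shape of such an experiment.

```python
import random

# The four experimental arms described in the study.
ARMS = ["control", "video", "personality_test", "video_and_test"]

def assign_arms(participant_ids, seed=0):
    """Randomly assign each participant to one of the four arms."""
    rng = random.Random(seed)
    return {pid: rng.choice(ARMS) for pid in participant_ids}

def arm_means(assignment, flagged_fake):
    """Share of participants in each arm who correctly flagged a fake headline.

    `flagged_fake` maps participant id -> bool (did they call the fake
    headline fake?).
    """
    totals = {arm: [0, 0] for arm in ARMS}  # arm -> [correct, n]
    for pid, arm in assignment.items():
        totals[arm][0] += flagged_fake[pid]
        totals[arm][1] += 1
    return {arm: correct / n for arm, (correct, n) in totals.items() if n}

# Illustrative use with ~2,000 simulated participants and made-up responses.
ids = list(range(2000))
assignment = assign_arms(ids)
responses = {pid: random.Random(pid).random() < 0.5 for pid in ids}
means = arm_means(assignment, responses)
```

Comparing each treatment arm's mean against the control arm's mean is the basic difference-in-means logic behind statements like "over 30 percent less likely to consider fake news reliable."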

To gauge whether these nudges prompted participants to be more critical of misinformation, each group was shown a series of headlines, some completely fake and some real. Some of these headlines leaned left, others leaned right, and some were politically neutral. The participants were then asked to determine whether the headlines were fake. In addition, the participants were shown two untrue tweets, one political and one not, and asked whether the tweets were truthful and whether they would report either one to social media moderators as misinformation.

What They Found

The economists find that the simple intervention of showing a short video of people from politically antagonistic backgrounds getting along inspires viewers to be more skeptical of and less susceptible to misinformation. They find that participants who watch the video are over 30 percent less likely to “consider fake news reliable.” At the same time, the video did little to encourage viewers to report fake tweets as misinformation.

Meanwhile, the researchers find that the personality test, which forces participants to confront their own biases, has little or no effect on their propensity to believe or reject fake news. It turns out being called out on our lizard-brain tribalism and other biases doesn’t necessarily improve our thinking.

In a concerning twist, the economists found that participants who both took the test and watched the video became so skeptical that they were about 31 percent less likely to view true headlines as reliable. In other words, they became so distrustful that even the truth became suspect. As has become increasingly clear, this is a danger in the new world of deepfakes: not only do they make people believe untrue things, they may also make people so disoriented that they don’t believe true things.

As for why the videos are successful in helping to fight misinformation, the researchers suggest that it’s because they encourage people to stop dehumanizing their political opponents, think more critically, and be less willing to accept bogus narratives even when those narratives bolster their political beliefs or goals. Often — in a sort of kumbaya way — centrist political leaders encourage us to recognize our commonalities as fellow countrymen and work together across partisan lines. It turns out that may also help us sharpen our thinking skills and improve our ability to recognize and reject misinformation.

Critical Thinking In The Age Of AI

Of course, this study was conducted back in 2022. Back then, misinformation, for the most part, was pretty low-tech. Misinformation may now be getting turbocharged with the rapid proliferation and advancement of artificial intelligence.

List and his colleagues are far from the first scholars to suggest that helping us become more critical thinkers is an effective way to combat misinformation. University of Cambridge psychologist Sander van der Linden has done a lot of work in the realm of what’s known as “psychological inoculation”: getting people to recognize how and why we’re susceptible to misinformation, so that we’re less likely to believe it when we encounter it. He’s the author of a new book called Foolproof: Why Misinformation Infects Our Minds and How to Build Immunity. Drawing an analogy to how vaccinations work, van der Linden advocates exposing people to misinformation and showing how it’s false as a way to help them spot and reject misinformation in the wild. He calls it “prebunking” (as in debunking something before it happens).

Of course, especially with the advent of AI deepfakes, misinformation cannot be combated on the demand side alone. Social media platforms, AI companies, and governments will all likely have to play an important role. There’s clearly a long way to go toward overcoming this problem, but we have recently seen some progress. For example, OpenAI recently began “watermarking” the AI-generated images its software produces, to help people spot pictures that aren’t real. And the federal government recently encouraged four companies to create new technologies to help people distinguish between authentic human speech and AI deepfakes.

This new world where the truth is harder to believe may be pretty scary. But, as this new study suggests, nudges and incentives to get us to slow our thinking, think more critically, and be less tribal could be an important part of the solution.


Source: https://www.npr.org/sections/money/2024/04/30/1247565565/how-do-you-counter-misinformation-critical-thinking-is-step-one