Opinion: Let’s use social distancing to slow the viral spread of disinformation on social media


Facebook has been quietly experimenting with reducing the amount of political content it puts in users’ news feeds. This decision is a tacit recognition that the way the company’s algorithms work may be a problem.

The crux of the matter is the distinction between eliciting a response and delivering the content people want. Social media algorithms – the rules their computers follow to decide what content you see – rely heavily on people’s behavior to make those decisions. In particular, they watch for the content people react to or “engage with” by liking, commenting on and sharing.

As a computer scientist who studies how large numbers of people interact using technology, I understand the logic of using the wisdom of crowds in these algorithms. I also see significant pitfalls in the way social media companies do it in practice.

From lions on the savanna to likes on Facebook

The wisdom-of-crowds concept assumes that using signals of others’ actions, opinions and preferences as a guide will lead to sound decisions. For example, collective predictions are normally more accurate than individual ones. Collective intelligence is used to predict financial markets, sports, elections and even disease outbreaks.
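As a toy illustration of why this can work, the sketch below (with invented numbers) averages many independent, unbiased guesses at a hidden value; the crowd’s average ends up much closer to the truth than a typical individual guess.

```python
import random

random.seed(0)
TRUE_VALUE = 100.0   # the hidden quantity being estimated, e.g., jellybeans in a jar
N_PEOPLE = 1000

# Each person makes an independent, unbiased but noisy guess.
guesses = [random.gauss(TRUE_VALUE, 25.0) for _ in range(N_PEOPLE)]

crowd_estimate = sum(guesses) / len(guesses)
avg_individual_error = sum(abs(g - TRUE_VALUE) for g in guesses) / len(guesses)

print(f"Crowd estimate: {crowd_estimate:.1f} (error {abs(crowd_estimate - TRUE_VALUE):.1f})")
print(f"Average individual error: {avg_individual_error:.1f}")
```

The catch, as discussed below, is that the individual guesses must be diverse and independent.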

Over millions of years of evolution, these principles have been encoded into the human brain in the form of cognitive biases with names like familiarity, mere exposure and the bandwagon effect. If everyone starts running, you should start running too; maybe someone saw a lion coming, and running could save your life. You may not know why, but it’s wiser to ask questions later.

Your brain picks up cues from the environment – including your peers – and uses simple rules to quickly translate those signals into decisions: follow the winner, follow the majority, copy your neighbor. These rules work remarkably well in typical situations because they rest on sound assumptions: for example, that people often act rationally, that it is unlikely many people will all be wrong, and that the past predicts the future.

Technology allows people to access signals from a much larger number of other people, most of them unfamiliar. Artificial intelligence applications make heavy use of these popularity or “engagement” signals, from selecting search engine results to recommending music and videos, and from suggesting friends to ranking posts on news feeds.

Not everything viral deserves to be

Our research shows that virtually all web technology platforms, such as social media and news recommendation systems, have a strong popularity bias. When applications are driven by cues like engagement rather than explicit search engine queries, popularity bias can lead to harmful unintended consequences.

Social networks such as Facebook, Instagram, Twitter, YouTube and TikTok rely heavily on AI algorithms to rank and recommend content. These algorithms take as input what you like, comment on and share – in other words, the content you engage with. The goal of the algorithms is to maximize engagement by finding out what people like and ranking it at the top of their feeds.

An introduction to the Facebook algorithm.
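To make the mechanism concrete, here is a deliberately simplified Python sketch of an engagement-driven ranker. The field names and weights are invented for illustration; no platform’s actual formula is this simple.

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    likes: int
    comments: int
    shares: int

def engagement_score(post: Post) -> float:
    # Invented weights: shares and comments signal stronger reactions than likes.
    return 1.0 * post.likes + 3.0 * post.comments + 5.0 * post.shares

def rank_feed(posts):
    # Whatever people reacted to most goes to the top of the feed.
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("Careful fact-check of a rumor", likes=40, comments=5, shares=2),
    Post("Outrage-bait conspiracy post", likes=90, comments=60, shares=45),
])
for post in feed:
    print(f"{engagement_score(post):>6.1f}  {post.text}")
```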

On the surface, that seems reasonable. If people like credible news, expert opinions and fun videos, these algorithms should identify such high-quality content. But the wisdom of crowds makes a key assumption here: that recommending what is popular will help high-quality content “bubble up.”

We tested this assumption by studying an algorithm that ranks items using a mix of quality and popularity. We found that, in general, popularity bias is more likely to lower the overall quality of content. The reason is that engagement is not a reliable indicator of quality when few people have been exposed to an item.

In these cases, engagement generates a noisy signal, and the algorithm is likely to amplify this initial noise. Once a low-quality item is popular enough, its popularity will keep growing.
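A small toy simulation can illustrate the dynamic (this is a sketch for intuition, not the model from our study): items are shown according to their current like counts, so whichever item happens to attract early likes keeps getting exposure, whether or not it is the best one.

```python
import random

random.seed(1)

# Ten items, each with a hidden "true" quality between 0 and 1.
items = [{"id": i, "quality": random.random(), "likes": 0, "views": 0}
         for i in range(10)]

for step in range(5000):
    # The feed shows the item with the most likes so far (ties broken randomly).
    items.sort(key=lambda x: (x["likes"], random.random()), reverse=True)
    shown = items[0]
    shown["views"] += 1
    # A user likes what they are shown with probability equal to its quality.
    if random.random() < shown["quality"]:
        shown["likes"] += 1

best = max(items, key=lambda x: x["quality"])
top = max(items, key=lambda x: x["likes"])
print(f"Highest-quality item: {best['id']} (quality {best['quality']:.2f}, {best['views']} views)")
print(f"Most-amplified item:  {top['id']} (quality {top['quality']:.2f}, {top['views']} views)")
```

Because exposure flows only to whatever is already ahead, the early, noisy engagement signal gets locked in before quality has a chance to show.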

Algorithms aren’t the only thing affected by engagement – it can affect people, too. Evidence shows that information is transmitted via “complex contagion,” meaning the more often someone is exposed to an idea online, the more likely they are to adopt and reshare it. When social media tells people an item is going viral, their cognitive biases kick in and translate into the irresistible urge to pay attention to it and share it.

Crowds not so wise

We recently conducted an experiment using a news literacy app called Fakey. It’s a game developed by our lab that simulates a news feed like those of Facebook and Twitter. Players see a mix of current articles from fake news, junk science, hyperpartisan and conspiratorial sources, as well as mainstream sources. They get points for sharing or liking information from trusted sources and for flagging unreliable articles for fact-checking.

We found that players are more likely to like or share, and less likely to flag, articles from unreliable sources when they can see that many other users have engaged with those articles. Exposure to the engagement metrics thus creates a vulnerability.

The wisdom of crowds fails because it is built on the false assumption that the crowd is made up of diverse, independent sources. There may be several reasons this is not the case.

First, because of people’s tendency to associate with like-minded people, their online neighborhoods are not very diverse. The ease with which social media users can unfriend those they disagree with pushes people into homogeneous communities, often referred to as echo chambers.

Second, because many people’s friends are friends of one another, they influence one another. A famous experiment showed that knowing what music your friends like affects your own stated preferences. Your social desire to conform distorts your independent judgment.

Third, popularity signals can be gamed. Over the years, search engines have developed sophisticated techniques to counter “link farms” and other schemes to manipulate search algorithms. Social media platforms, on the other hand, are just beginning to learn about their own vulnerabilities.

People aiming to manipulate the information market have created fake accounts, such as trolls and social bots, and organized fake networks. They have flooded the network to create the appearance that a conspiracy theory or a political candidate is popular, fooling both the platforms’ algorithms and people’s cognitive biases. They have even altered the structure of social networks to create illusions about majority opinions.

Reduce engagement

What to do? Technology platforms are currently on the defensive. They are becoming more aggressive during elections in taking down fake accounts and harmful misinformation. But these efforts can be akin to a game of whack-a-mole.

A different, preventive approach would be to add friction – in other words, to slow down the process of spreading information. High-frequency behaviors such as automated liking and sharing could be inhibited by CAPTCHA tests or fees. This would not only reduce the opportunities for manipulation; with less information, people would also be able to pay more attention to what they see. It would leave less room for engagement bias to affect people’s decisions.
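As a purely hypothetical sketch of what such friction could look like (the `ShareThrottle` class, thresholds and CAPTCHA trigger are invented here, not any platform’s actual mechanism), a sliding-window rule could require a CAPTCHA once an account shares faster than a plausible human pace:

```python
import time
from collections import deque

# Hypothetical friction rule (thresholds invented for illustration): if an
# account shares more than MAX_SHARES posts within WINDOW_SECONDS, require a
# CAPTCHA before the next share goes through.
MAX_SHARES = 5
WINDOW_SECONDS = 60.0

class ShareThrottle:
    def __init__(self):
        self.history = {}  # account id -> deque of recent share timestamps

    def needs_captcha(self, account_id, now=None):
        now = time.time() if now is None else now
        timestamps = self.history.setdefault(account_id, deque())
        # Drop shares that fall outside the sliding window.
        while timestamps and now - timestamps[0] > WINDOW_SECONDS:
            timestamps.popleft()
        timestamps.append(now)
        return len(timestamps) > MAX_SHARES

throttle = ShareThrottle()
for i in range(8):
    # Simulate a share every 2 seconds; the 6th share in the window trips the check.
    print(i, throttle.needs_captcha("user123", now=i * 2.0))
```

A fee-based version would simply replace the CAPTCHA with a small charge once the same threshold is crossed.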

It would also help if social media companies adjusted their algorithms to rely less on engagement to determine what content they serve you.

Filippo Menczer is Luddy Distinguished Professor of Informatics and Computer Science and director of the Observatory on Social Media at Indiana University, Bloomington.

This commentary was originally published by The Conversation – “How ‘engagement’ makes you vulnerable to manipulation and misinformation on social media.”
