Meta founder and CEO Mark Zuckerberg has announced major changes to the way the company addresses misinformation on Facebook, Instagram and Threads. Instead of relying on independent third-party factcheckers, Meta will now emulate Elon Musk’s X (formerly Twitter) by using “community notes”: crowdsourced contributions that allow users to flag content they believe is questionable.

Zuckerberg claimed that these changes promote “free expression”. But some experts worry that he is bowing to right-wing political pressure, and effectively allowing a flood of hate speech and lies to spread on Meta platforms.

Research on social media’s group dynamics shows that those experts have a point.

At first glance, Community Notes may seem democratic, reflecting the values of free speech and collective decision-making. Crowdsourced systems such as Wikipedia, Metaculus and PredictIt, although imperfect, are often successful in harnessing the wisdom of crowds: the collective judgment of many people can sometimes outperform even experts.

Research shows that diverse groups making independent judgments and guesses can be surprisingly effective at discerning the truth. However, such wise crowds rarely have to contend with social media algorithms.
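To make the statistical intuition concrete, here is a minimal Python sketch, an illustration added here rather than anything drawn from the studies cited in this article. The true value, the number of guessers and the size of individual error are all assumptions; the point is simply that averaging many independent, noisy guesses tends to land closer to the truth than most individuals do.

```python
import random

# Toy simulation of the "wisdom of crowds": many independent, noisy
# guesses about a true quantity. The specific numbers below are
# illustrative assumptions, not taken from any cited study.
TRUE_VALUE = 100.0
N_GUESSERS = 1000

random.seed(42)

# Each person guesses with substantial individual error (std dev 25).
guesses = [random.gauss(TRUE_VALUE, 25.0) for _ in range(N_GUESSERS)]

crowd_estimate = sum(guesses) / len(guesses)
crowd_error = abs(crowd_estimate - TRUE_VALUE)

# Fraction of individuals whose guess beat the crowd's average.
better_than_crowd = sum(
    1 for g in guesses if abs(g - TRUE_VALUE) < crowd_error
) / N_GUESSERS

print(f"Crowd estimate: {crowd_estimate:.1f} (error {crowd_error:.1f})")
print(f"Share of individuals more accurate than the crowd: {better_than_crowd:.1%}")
```

The catch, as the following paragraphs argue, is independence: the averaging only cancels out errors when people guess on their own. Once an algorithm feeds everyone the same skewed signals, the errors point the same way and stop cancelling.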

Many people rely on platforms like Facebook for their news, putting them at risk of being exposed to misinformation and biased sources. Relying on social media users to police information accuracy can polarize platforms and amplify extreme voices.

Two group-based tendencies, both rooted in our psychological need to divide ourselves and others into groups, are of particular concern: in-group/out-group bias and acrophily (a love of extremes).

In-group/out-group bias

Humans are biased in the way they evaluate information. People are more likely to trust and remember information from within their own group – those who share their identity – while distrusting information from a perceived out-group. This bias leads to echo chambers, where like-minded people reinforce shared beliefs regardless of accuracy.

It may seem logical to trust family, friends or coworkers rather than strangers. But in-group sources often share similar viewpoints and experiences, providing little new information. Out-group members, by contrast, are more likely to offer diverse perspectives, and that diversity is vital to the intelligence of crowds.

But too much disagreement between groups can also prevent community factchecking from happening at all. Many community notes on X, such as those related to COVID vaccines, were likely never shown publicly because users disagreed with one another. The advantage of third-party factchecking was that it provided an objective external source, rather than requiring broad consensus among users across a network.
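To illustrate why disagreement keeps notes hidden, here is a toy Python sketch of a cross-group agreement rule. This is a deliberate simplification added for illustration; X’s actual open-source scoring algorithm is considerably more sophisticated, and the two-group split and 70% threshold are assumptions. The underlying tension is the same, though: a note surfaces only if raters from opposing camps both find it helpful.

```python
# Toy model of consensus-gated factchecking. The threshold and the
# clean two-group split are illustrative assumptions, not X's real
# scoring algorithm.

def note_is_shown(ratings_group_a, ratings_group_b, threshold=0.7):
    """A note is published only if BOTH groups rate it helpful
    at or above the threshold (1 = helpful, 0 = not helpful)."""
    if not ratings_group_a or not ratings_group_b:
        return False
    helpful_a = sum(ratings_group_a) / len(ratings_group_a)
    helpful_b = sum(ratings_group_b) / len(ratings_group_b)
    return helpful_a >= threshold and helpful_b >= threshold

# A note on an uncontroversial topic: both groups agree, so it is shown.
print(note_is_shown([1, 1, 1, 0], [1, 1, 1, 1]))   # True

# A note on a polarized topic: one group rejects it, so it stays hidden
# no matter how many people on the other side rated it helpful.
print(note_is_shown([1, 1, 1, 1], [0, 0, 1, 0]))   # False
```

The second case shows the failure mode described above: on exactly the topics where factchecking matters most, polarized ratings can leave the note unpublished indefinitely.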

Worse, such systems are vulnerable to manipulation by well-organized groups with political agendas. For example, Chinese nationalists reportedly launched a campaign to edit Wikipedia entries related to China–Taiwan relations to make them more favorable to China.

Political polarization and acrophily

Politics intensifies these dynamics. In the United States, political identity increasingly dominates how people define their social groups.

Political groups are motivated to define “truth” in ways that benefit them and harm their political opponents. It’s easy to see how organized efforts to spread politically motivated lies and discredit inconvenient truths could corrupt the wisdom of the crowd in Meta’s community notes.

Social media exacerbates this problem through a phenomenon called acrophily, or a preference for extremes. Research shows that people tend to engage with posts slightly more extreme than their own views.

These increasingly extreme posts are more likely to be negative than positive. Psychologists have known for decades that bad is stronger than good: we are hardwired to pay more attention to negative experiences and information than to positive ones.

On social media, this means that negative posts – about violence, disasters and crises – attract more attention, often at the expense of more neutral or positive content.

People who express these extreme, negative views gain status within their groups, attracting more followers and increasing their influence. Over time, people come to accept these slightly more extreme negative thoughts as normal, and gradually shift their own views toward the poles.

A recent study of 2.7 million posts on Facebook and Twitter found that messages containing words like “hate”, “attack” and “destroy” were shared and liked at a higher rate than almost any other content. This shows that social media is not only promoting extreme views – it’s fostering a culture of out-group hate that undermines the collaboration and trust needed for a system like Community Notes to work.

The way forward

The combination of negativity bias, in-group/out-group bias, and acrophily supercharges one of the greatest challenges of our time: polarization. Through polarization, extreme views become normalized, reducing the possibility of shared understanding across group divisions.

The best solutions, which I examine in my upcoming book, The Collective Age, start with diversifying our information sources. First, people need to connect and cooperate with different groups to break down the barriers of distrust. Second, they should seek information from multiple credible news and information outlets, not just social media.

However, social media algorithms often work against these solutions, creating echo chambers and diverting people’s attention. For Community Notes to work, these algorithms will need to prioritize diverse, trusted sources of information.

While community notes can theoretically harness the wisdom of crowds, their success depends on overcoming these psychological weaknesses. Perhaps increasing awareness of these biases could help us design better systems, or empower users to use community notes to foster dialogue across group divides. Only then can platforms get closer to solving the problem of misinformation.

(Author: Colin M. Fisher, Associate Professor of Organizations and Innovation and author of “The Collective Age: Unlocking the Secret Power of Groups”, UCL)

(Disclosure statement: Colin M. Fisher does not work for, consult, own shares in, or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond his academic appointment.)

This article is republished from The Conversation under a Creative Commons license. Read the original article.

