Social media toxicity can’t be fixed by changing the algorithms

Can social media’s problems be solved?

MoiraM / Alamy

The polarising impact of social media isn’t just the result of bad algorithms – it is an inevitable consequence of the platforms’ core mechanics, a study with AI-generated users has found. It suggests the problem won’t be fixed unless we fundamentally reimagine the world of online communication.

Petter Törnberg at the University of Amsterdam in the Netherlands and his colleagues set up 500 AI chatbots designed to mimic a range of political beliefs in the US, based on the American National Election Studies Survey. Those bots, powered by the GPT-4o mini large language model, were then instructed to interact with one another on a simple social network the researchers had designed with no ads or algorithms.
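The study’s own code isn’t reproduced here, but as a rough illustration of the kind of agent-based setup described above, a minimal Python sketch might look like the following. Everything in it (the class names, the numbers, and the toy engagement heuristic standing in for the GPT-4o mini prompts) is an assumption made for illustration, not the authors’ implementation.

```python
# Minimal sketch of an agent-based social media simulation in the spirit of the
# study described above. NOT the authors' code: agent decisions here use a toy
# similarity heuristic where the real experiment prompts GPT-4o mini.
import random
from dataclasses import dataclass, field

@dataclass
class Agent:
    id: int
    leaning: float                      # political position on a -1..1 scale (assumed encoding)
    following: set = field(default_factory=set)

@dataclass
class Post:
    author: int
    leaning: float
    reposts: int = 0

def decide_engagement(agent, post):
    """Toy stand-in for the LLM call: engage more with ideologically closer, more partisan posts."""
    closeness = 1 - abs(agent.leaning - post.leaning) / 2
    partisanship = abs(post.leaning)
    return random.random() < 0.5 * closeness + 0.3 * partisanship

def run(n_agents=500, n_actions=10_000, seed=0):
    random.seed(seed)
    agents = [Agent(i, random.uniform(-1, 1)) for i in range(n_agents)]
    posts = []
    for _ in range(n_actions):
        agent = random.choice(agents)
        # Read a small feed of recent posts by others (no ranking algorithm, just recency).
        feed = [p for p in posts[-50:] if p.author != agent.id]
        for post in feed:
            if decide_engagement(agent, post):
                post.reposts += 1                      # repost
                agent.following.add(post.author)       # follow the author
        # Write a new post near the agent's own political position.
        posts.append(Post(agent.id, max(-1, min(1, agent.leaning + random.gauss(0, 0.1)))))
    return agents, posts

if __name__ == "__main__":
    agents, posts = run()
    top = sorted(posts, key=lambda p: p.reposts, reverse=True)[:10]
    print("mean |leaning| of 10 most-reposted posts:",
          sum(abs(p.leaning) for p in top) / len(top))
```

Even in a stripped-down loop like this, the only dynamics available are posting, reposting and following – the same basic behaviours the researchers point to.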

During five runs of the experiment, each involving 10,000 actions, the AI agents tended to follow people with whom they shared political affiliations, while those with more partisan views gained more followers and reposts. Overall attention likewise gravitated towards the most partisan posters.

In a previous study, Törnberg and his colleagues explored whether simulated social networks with different algorithms could identify routes to tamp down political polarisation – but the new research seems to contradict their earlier findings.

“We were expecting this [polarisation] to be something that’s driven by algorithms,” Törnberg says. “[We thought] that the platforms are designed for this – to produce these outcomes – because they are designed to maximise engagement and to piss you off and so on.”


Instead, they found it wasn’t the algorithms themselves that seemed to be causing the issue, which could make any attempts to weed out antagonistic user behaviour by design very difficult. “We set up the simplest platform we could imagine, and then, boom, we already have these outcomes,” he says. “That already suggests that this is stemming from something very fundamental to the fact that we have posting behaviour, reposting and following.”

To see whether those behaviours could be muted or countered, the researchers also tested six potential solutions: a purely chronological feed, giving less prominence to viral content, boosting opposing viewpoints, promoting empathetic and reasoned content, hiding follower and repost counts, and hiding profile bios.
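Several of those interventions amount to changing how the feed is ranked or rendered. The snippet below is a hypothetical illustration of three of them (a chronological feed, downranking viral content, and hiding social counts), independent of the sketch above and again not the study’s implementation.

```python
# Illustrative feed-ranking/display variants for three of the interventions
# listed above (hypothetical code, not the study's implementation).
from dataclasses import dataclass

@dataclass
class Post:
    author: int
    text: str
    timestamp: float      # seconds since simulation start
    reposts: int = 0

def chronological_feed(posts):
    """Purely chronological feed: newest first, ignoring all engagement signals."""
    return sorted(posts, key=lambda p: p.timestamp, reverse=True)

def downrank_viral_feed(posts, penalty=3600.0):
    """Less prominence for viral content: each repost pushes a post back as if it were older."""
    return sorted(posts, key=lambda p: p.timestamp - penalty * p.reposts, reverse=True)

def render_without_counts(post):
    """Hide follower and repost counts: display only the author and the text."""
    return f"@{post.author}: {post.text}"
```

Swapping in ranking functions like these is exactly the kind of parameter-level tweak the study found to have only marginal effects.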

Most of the interventions made little difference: cross-party mixing changed by no more than about 6 per cent, and the share of attention hogged by top accounts shifted by between 2 and 6 per cent. Others, such as hiding users’ biographies, actually made the problem worse. Gains in one area were countered by negative impacts elsewhere: fixes that reduced user inequality made extreme posts more popular, while alterations to soften partisanship funnelled even more attention to a small elite.

“Most social media activities are always fruit of the poisonous tree – the beginning problems of social media always lie with their foundational design, and as such can encourage the worst of human behaviour,” says Jess Maddox at the University of Georgia.

While Törnberg acknowledges the experiment is a simulation that could simplify some mechanisms, he thinks it can tell us what social platforms need to do to reduce polarisation. “We might need more fundamental interventions and need more fundamental rethinking,” he says. “It might not be enough to wiggle with algorithms and change the parameters of the platform, but [we might] need to rethink more fundamentally the structure of interaction and how these spaces structure our politics.”

Reference:

arXiv DOI: 10.48550/arXiv.2508.03385
