Instagram algorithm promotes content that is dangerous for teens

Meta is actively promoting self-harm content on Instagram by failing to remove dangerous images and encouraging users who have seen them to befriend each other.

That is the finding of a new study, which describes the platform's moderation system as "grossly inadequate", the Guardian newspaper reported.

Danish researchers created a network on the platform that included fake profiles of people as young as 13, in which they shared 85 examples of self-harm content of gradually increasing severity, including images of blood, razor blades and other material.

The study aimed to test Meta's claim that it has significantly improved its systems for removing dangerous content, which it says now use artificial intelligence (AI). The technology company claims to remove around 99% of such content before it is even reported.

But Digitalt Ansvar (Digital Accountability), an organization that promotes responsible digital development, found that not a single image was removed during the month-long experiment.

Instagram has access to technology capable of tackling the problem, but “has chosen not to implement it effectively,” the study said.

The platform’s inadequate moderation, Digitalt Ansvar warns, suggests it is not complying with EU law.

A Meta spokesperson said: "Content that promotes self-harm is against our policies and we remove it when we find it. In the first half of 2024, we removed more than 12 million pieces of content related to suicide and self-harm on Instagram, 99% of which we removed proactively."

"Earlier this year, we launched Instagram Teen Accounts, which will put teens on the strictest setting of our controls for sensitive content, so they're even less likely to be recommended to them, and in many cases we're hiding it altogether," he said.

However, the Danish study found that rather than trying to shut down the network, Instagram's algorithm actively helped to expand it: 13-year-olds became friends with all members of a "self-harm group" after connecting with just one of its members. | BGNES