
Instagram’s algorithms are promoting accounts that share child sex abuse content, researchers find


Instagram’s recommendation algorithms have been connecting and promoting accounts that share and sell child sexual abuse content, according to an investigation published Wednesday.

Meta’s photo-sharing service stands out from other social media platforms and “appears to have a particularly severe problem” with accounts peddling self-generated child sexual abuse material, or SG-CSAM, Stanford University researchers wrote in an accompanying study. Such accounts purport to be run by minors.


“Due to the widespread use of hashtags, relatively long life of seller accounts and, especially, the effective recommendation algorithm, Instagram serves as the key discovery mechanism for this specific community of buyers and sellers,” according to the study, which was cited in the investigation by The Wall Street Journal, Stanford University’s Internet Observatory Cyber Policy Center and the University of Massachusetts Amherst.

While the accounts could be found by any user searching for explicit hashtags, the researchers found that Instagram’s recommendation algorithms also promoted them “to users viewing an account in the network, allowing for account discovery without keyword searches.”

A Meta spokesperson said in a statement that the company has been taking several steps to fix the issues and that it “set up an internal task force” to investigate and address these claims.

“Child exploitation is a horrific crime,” the spokesperson said. “We work aggressively to fight it on and off our platforms, and to support law enforcement in its efforts to arrest and prosecute the criminals behind it.”

Alex Stamos, Facebook’s former chief security officer and one of the paper’s authors, said in a tweet Wednesday that the researchers focused on Instagram because its “position as the most popular platform for teenagers globally makes it a critical part of this ecosystem.” However, he added, “Twitter continues to have serious issues with child exploitation.”

Stamos, who is now director of the Stanford Internet Observatory, said the problem has persisted after Elon Musk acquired Twitter late last year.

“What we found is that Twitter’s basic scanning for known CSAM broke after Mr. Musk’s takeover and was not fixed until we notified them,” Stamos wrote.

“They then cut off our API access,” he added, referring to the software that lets researchers access Twitter data to conduct their studies.

Earlier this year, NBC News reported that multiple Twitter accounts offering or selling CSAM had remained available for months, even after Musk pledged to address problems with child exploitation on the social messaging service.

Twitter didn’t provide a comment for this story.

Watch: YouTube and Instagram would benefit most from a ban on TikTok, says Evercore's Mark Mahaney
