The logo of U.S. social network Instagram displayed on a tablet screen.
Kirill Kudryavtsev | Afp | Getty Images
Meta apologized on Thursday and said it was fixing a "mistake" that resulted in some Instagram users reporting a flood of violent and graphic content recommended on their personal "Reels" page.
"We are fixing an error that caused some users to see content in their Instagram Reels feed that should not have been recommended. We apologize for the mistake," a Meta spokesperson said in a statement shared with CNBC.
The announcement comes after a number of Instagram users took to various social media platforms to voice concerns about a recent influx of violent and "not safe for work" content in their feeds.
Some users claimed they saw such content even with Instagram's "Sensitive Content Control" set to its highest moderation level.
According to Meta's policies, the company works to protect users from disturbing imagery and removes content that is especially violent or graphic.
Prohibited content includes videos "depicting dismemberment, visible innards or charred bodies," as well as content that contains "sadistic remarks towards imagery depicting the suffering of humans and animals."
However, Meta says it does permit some graphic content if it helps users to condemn or raise awareness about important issues such as human rights abuses, armed conflicts or acts of terrorism. Such content may come with limitations, such as warning labels.
On Wednesday night in the U.S., CNBC was able to view several posts on Instagram Reels that appeared to show dead bodies, graphic injuries and violent assaults. The posts were labeled "Sensitive Content."