It’s no secret that criticism of Instagram’s algorithm recently prompted the company to shed some light on how it works. Now the algorithm is under scrutiny once again: according to researchers from Stanford University and the University of Massachusetts Amherst, it is not only associated with a “vast pedophile network” but actively promotes it by allowing users to search for explicit hashtags related to child sexual abuse material (CSAM).
The researchers also found that these hashtags led users to accounts selling pedophilic material, including videos of children harming themselves or engaging in acts of bestiality. Shockingly, some accounts even offered buyers the option to “commission specific acts” or arrange in-person meetings.
“Instagram is an onramp to places on the internet where there’s more explicit child sexual abuse. The most important platform for these networks of buyers and sellers seems to be Instagram,” said Brian Levine, director of UMass Rescue Lab.
Instagram’s response
In response to this disturbing report, Meta, Instagram’s parent company, has set up an internal task force and launched an active investigation. The company is also working to block networks that trade in CSAM and to make changes to its systems. Serious concerns remain, however: Instagram often ignored users’ attempts to report these accounts, and merely viewing an account associated with an underage seller prompted the algorithm to recommend similar accounts.
“Child exploitation is a horrific crime. We are continuously exploring ways to actively combat this behavior,” said Meta.
Meta claims that it actively seeks out and removes users involved in child exploitation, citing the takedown of 490,000 accounts for violating its child safety policies in January alone. Still, the report lands at a time when social media platforms, Instagram included, face increased scrutiny over how they police abusive content. Meta’s recent plans to expand end-to-end encryption have also worried law enforcement agencies such as the FBI and Interpol, which fear it could hinder the detection of content related to child sexual abuse.