Instagram To Blur Nude Images In Messages To Minors
Nudity protection measure. Meta confirms Instagram will begin testing a feature that blurs nude images sent to minors
Meta Platforms has confirmed an upcoming safeguard for children, as it seeks to clamp down on “sextortion and intimate image abuse.”
Meta announced in a blog post on Thursday that Instagram will begin testing nudity protection features in DMs (direct messages), which will blur images detected as containing nudity. The feature will also encourage people to think twice before sending nude images.
“Nudity protection will be turned on by default for teens under 18 globally, and we’ll show a notification to adults encouraging them to turn it on,” blogged Meta.
Ongoing problem
The issue of children using smartphones to share naked images has been ongoing for many years now, triggering concern among parents, guardians, and authorities.
Silicon UK, for example, published this advice almost a decade ago on how to protect people’s nude images online.
Meta in the blog post on Thursday wrote it is “testing new features to help protect young people from sextortion and intimate image abuse, and to make it more difficult for potential scammers and criminals to find and interact with teens.”
“We’re also testing new ways to help people spot potential sextortion scams, encourage them to report and empower them to say no to anything that makes them feel uncomfortable,” it stated.
It also pointed out that “financial sextortion is a horrific crime” that is typically carried out by predators and scammers.
Meta noted that it already defaults teenagers into stricter message settings so they can’t be messaged by anyone they’re not already connected to, and that it shows Safety Notices to teens who are already in contact with potential scam accounts.
Meta also offers a dedicated option for people to report DMs that threaten to share private images. In addition, it supported the National Center for Missing &amp; Exploited Children (NCMEC) in developing Take It Down, a platform that lets young people take back control of their intimate images and helps prevent them being shared online.
The Wall Street Journal, meanwhile, has reported that the new feature will be tested in the coming weeks, with a global rollout expected after that.
When nudity protection is enabled, users who receive nude photographs will be shown a message telling them they should not feel pressured to respond, alongside options to block and report the sender.
Meanwhile, users who try to send a nude image via DM will see a message warning them about the dangers of sharing sensitive photos. A further warning will seek to discourage users who attempt to forward a nude image they have received.
Protecting children
The issue of protecting children and teenagers from social media dangers has dogged Meta for years now.
In October last year, 33 US states filed legal action against Meta, alleging its Instagram and Facebook platforms were harming children’s mental health.
The US states alleged Meta contributes to the youth mental health crisis by knowingly and deliberately designing features on Instagram and Facebook that addict children to its platforms.
That federal suit was reportedly the result of an investigation led by a bipartisan coalition of attorneys general from California, Florida, Kentucky, Massachusetts, Nebraska, New Jersey, Tennessee, and Vermont.
Prior to that, in October 2021, the head of Instagram confirmed the platform was ‘pausing’ development of the “Instagram Kids” app, after the Wall Street Journal (WSJ) reported on leaked internal research suggesting that Instagram had a harmful effect on teenagers, particularly teen girls.
Facebook had previously said it would require Instagram users to share their date of birth, in an effort to improve child safety.