LONDON — Instagram says it’s deploying new tools to protect young people and combat sexual extortion, including a feature that will automatically blur nudity in direct messages.
The social media platform said in a blog post Thursday that it’s testing out the features as part of its campaign to fight sexual scams and other forms of “image abuse,” and to make it tougher for criminals to contact teens.
Sexual extortion, or sextortion, involves persuading a person to send explicit photos online and then threatening to make the images public unless the victim pays money or performs sexual favors. Recent high-profile cases include two Nigerian brothers who pleaded guilty to sexually extorting teen boys and young men in Michigan, including one who took his own life, and a Virginia sheriff’s deputy who sexually extorted and kidnapped a 15-year-old girl.
Instagram and other social media companies have faced growing criticism for not doing enough to protect young people. Mark Zuckerberg, the CEO of Instagram’s owner Meta Platforms, apologized to the parents of victims of such abuse during a Senate hearing earlier this year.
Meta, which is based in Menlo Park, California, also owns Facebook and WhatsApp, but the nudity blur feature won’t be added to messages sent on those platforms.
Instagram said scammers often use direct messages to ask for “intimate images.” To counter this, it will soon start testing out a nudity-protection feature for direct messages that blurs any images with nudity “and encourages people to think twice before sending nude images.”
“The feature is designed not only to protect people from seeing unwanted nudity in their DMs, but also to protect them from scammers who may send nude images to trick people into sending their own images in return,” Instagram said.
The feature will be turned on by default globally for teens under 18. Adult users will get a notification encouraging them to activate it.
Images with nudity will be blurred with a warning, giving users the option to view them. They’ll also get an option to block the sender and report the chat.
People sending direct messages with nudity will get a message reminding them to be cautious when sending “sensitive photos.” They’ll also be informed that they can unsend the photos if they change their mind, but that there’s a chance others may have already seen them.
As with many of Meta’s tools and policies around child safety, critics saw the move as a positive step, but one that does not go far enough.
“I think the tools announced can protect senders, and that is welcome. But what about recipients?” said Arturo Béjar, former engineering director at the social media giant who is known for his expertise in curbing online harassment. He said 1 in 8 teens receives an unwanted advance on Instagram every seven days, citing internal research he compiled while at Meta that he presented in November testimony before Congress. “What tools do they get? What can they do if they get an unwanted nude?”
Béjar said “things won’t meaningfully change” until there is a way for a teen to say they’ve received an unwanted advance, and there is transparency about it.
White House assistant press secretary Robyn Patterson also noted Thursday that President Joe Biden “has been outspoken about his belief that social media companies can do more to combat sexual exploitation online.”
Instagram said it’s working on technology to help identify accounts that could potentially be engaging in sexual extortion scams, “based on a range of signals that could indicate sextortion behavior.”
To stop criminals from connecting with young people, it’s also taking measures including not showing the “message” button on a teen’s profile to potential sextortion accounts, even if they already follow each other, and testing new ways to hide teens from these accounts.
In January, the FBI warned of a “huge increase” in sextortion cases targeting children — including financial sextortion, where someone threatens to release compromising images unless the victim pays. The targeted victims are primarily boys between the ages of 14 and 17, but the FBI said any child can become a victim. In the six-month period from October 2022 to March 2023, the FBI saw a more than 20% increase in reporting of financially motivated sextortion cases involving minor victims compared to the same period in the previous year.
__
AP Technology Writer Barbara Ortutay in Oakland, California, contributed to this report.