Instagram says it is deploying new tools to protect young people and combat sexual extortion, including a feature that will automatically blur nudity in direct messages.
The social media company said in a blogpost on Thursday that it was testing the features as part of its campaign to fight sexual scams and other forms of “image abuse”, and to make it tougher for criminals to contact teens.
Sexual extortion, or sextortion, involves persuading a person to send explicit photos online and then threatening to make the images public unless the victim pays money or engages in sexual favors. Recent high-profile cases include two Nigerian brothers who pleaded guilty to sexually extorting teen boys and young men in Michigan, including one who took his own life, and a Virginia sheriff’s deputy who sexually extorted and kidnapped a 15-year-old girl.
Instagram said scammers often use direct messages to ask for “intimate images”. To counter this, it will soon start testing a nudity protection feature for direct messages that blurs any images with nudity “and encourages people to think twice before sending nude images”.
“The feature is designed not only to protect people from seeing unwanted nudity in their DMs, but also to protect them from scammers who may send nude images to trick people into sending their own images in return,” Instagram said.
The feature will be turned on by default globally for teens under 18. Adult users will get a notification encouraging them to activate it. Images with nudity will be blurred behind a warning, giving users the option to view them. They will also get options to block the sender and report the chat.
People sending direct messages containing nudity will get a message reminding them to be cautious when sending “sensitive photos”. They will also be informed that they can unsend the photos if they change their mind, but that there’s a chance others may have already seen them. Meta also owns Facebook and WhatsApp, but the nudity blur feature will not be added to messages sent on those platforms.
Instagram and other social media companies have faced growing criticism for not doing enough to protect young people. Mark Zuckerberg, the CEO of Instagram’s owner Meta Platforms, apologized to the parents of victims of such abuse during a Senate hearing earlier this year. New Mexico’s attorney general has sued Meta, alleging its social networks are the world’s “single largest marketplace for paedophiles”. The suit follows a two-year Guardian investigation into Meta’s struggle to contain child sex trafficking.
Instagram said it was working on technology to help identify accounts that could potentially be engaging in sexual extortion scams, “based on a range of signals that could indicate sextortion behavior”.
To stop criminals from connecting with young people, it is also taking measures including not showing the “message” button on a teen’s profile to potential sextortion accounts, even if they already follow each other, and testing new ways to hide teens from these accounts.