Sadly, nudes are a part of smartphone life, especially on social media sites. Meta is going to do something about it.
Meta, the parent company of Instagram, announced in a blog post that a new AI-powered tool will detect nude photos sent in direct messages and blur them.
The new tool will be enabled by default for anyone under 18. However, anyone can turn it on from Instagram’s Settings page.
The platform will also warn senders to be careful about whom they send nude photos to, and present them with an option to not send the photo if they change their mind.
The best part is that the images are analyzed on-device rather than on Meta’s servers, which goes a long way toward addressing privacy concerns.
Here is what Meta said in a statement: “Nudity protection uses on-device machine learning to analyze whether an image sent in a DM on Instagram contains nudity. Because the images are analyzed on the device itself, nudity protection will work in end-to-end encrypted chats, where Meta won’t have access to these images – unless someone chooses to report them to us.”
Apart from this, Meta is also adding sextortion protection. The company already enforces rules against extortion, removing the accounts of people who engage in it and blocking them from creating new ones. Now, it is developing new technology that uses AI to catch sextortion behavior and identify offending accounts even sooner.
Messages from such accounts will be relegated to the hidden requests folder. The company will also send safety notices to users who are talking to these accounts, reminding them to report any threats.
Meta added: “We’re also testing hiding teens from these accounts in people’s follower, following, and like lists, and making it harder for them to find teen accounts in Search results.”