Instagram starts blurring nudity in messages in an effort to prevent sexual extortion and protect young people.

Instagram announced that it is rolling out new measures, including a feature that automatically blurs nudity in direct messages, to safeguard children and combat sexual extortion.

In a blog post published on Thursday, the social media platform stated that it is testing these capabilities as part of its campaign to combat sexual scams and other types of “image abuse” and to make it more difficult for criminals to reach teenagers.

Sexual extortion, or sextortion, involves persuading someone to send explicit photos online and then threatening to make the images public unless the victim pays money or performs sexual favors. Recent high-profile cases include two Nigerian brothers who pleaded guilty to sexually extorting teenage boys and young men in Michigan, one of whom took his own life, and a Virginia sheriff's deputy who sexually extorted and kidnapped a 15-year-old girl.

Instagram and other social media companies have faced growing criticism for not doing enough to protect young people. At a Senate hearing earlier this year, Mark Zuckerberg, CEO of Instagram's parent company Meta Platforms, apologized to the parents of victims of such abuse.

Although Meta also owns Facebook and WhatsApp, the nudity-blur feature will not apply to messages sent on those platforms.

Instagram said that scammers frequently ask for "intimate images" through direct messages. In response, it will soon begin testing a direct-message feature called "nudity protection," which will blur any photos containing nudity and "encourage people to think twice before sending nude images."

According to Instagram, “the feature is designed to protect people from scammers who may send nude images to trick people into sending their own images in return, in addition to protecting people from seeing unwanted nudity in their DMs.”

The feature will be turned on by default worldwide for teenagers under 18. Adult users will receive a notification encouraging them to turn it on.

Users who receive a blurred image will have the option to view it after seeing a warning. They will also be able to block the sender and report the message.

People who send direct messages containing nudity will see a warning reminding them to be cautious when sending "sensitive photos." They will also be told that they can unsend the photos if they change their mind, but that there is a chance someone else may have already seen them.

Instagram announced that, “based on a range of signals that could indicate sextortion behavior,” it is developing technologies to help identify accounts that might be involved in sexual extortion scams.

To make it harder for criminals to contact young people, Instagram is also testing ways to hide teenagers from these accounts, including not displaying the "message" button on a teen's profile to potential sextortion accounts, even if the two already follow each other.
