Meta introduces nudity protection feature for Instagram

On Thursday, Instagram's parent company, Meta, said it will test features that blur messages containing nudity, aiming to protect teens and deter potential scammers from reaching them. The move is part of Meta's broader effort to ease concerns over harmful content on its apps.

Amid mounting pressure regarding the addictive nature of its apps and concerns about their impact on mental health, Meta announced a new protection feature for Instagram's direct messages. The feature will use on-device machine learning to analyze whether images sent through the service contain nudity.

Default Protection for Users Under 18

The nudity protection feature will be turned on by default for users under 18, while adult users will be notified and encouraged to activate it. Because images are analyzed on the device itself, the feature also works in end-to-end encrypted chats, and Meta does not see the images unless they are reported.
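Meta has not published implementation details, so the sketch below is only a conceptual illustration of how an on-device check could gate how an incoming image is displayed. All names (IncomingImage, NudityClassifier, the 0.8 threshold, present_image) are hypothetical assumptions for illustration, not Meta's actual code or model.

```python
# Hypothetical sketch: an on-device check gating image display in a chat client.
# Because the check runs locally after decryption, the server never sees the
# image, which is how such a feature can coexist with end-to-end encryption.

from dataclasses import dataclass


@dataclass
class IncomingImage:
    sender_id: str
    pixels: bytes  # decrypted image bytes, available only on the recipient's device


class NudityClassifier:
    """Stand-in for an on-device ML model; returns a probability in [0, 1]."""

    def score(self, pixels: bytes) -> float:
        # A real model would run a small local classifier; this stub returns a dummy value.
        return 0.0


def present_image(image: IncomingImage, recipient_is_minor: bool,
                  protection_enabled: bool, model: NudityClassifier,
                  threshold: float = 0.8) -> str:
    """Decide whether to show an incoming image or a blurred preview."""
    # Under-18 accounts have the protection on by default; adults opt in.
    active = protection_enabled or recipient_is_minor
    if active and model.score(image.pixels) >= threshold:
        return "blurred"   # blurred preview with a warning and an option to reveal
    return "visible"


if __name__ == "__main__":
    img = IncomingImage(sender_id="user123", pixels=b"...")
    print(present_image(img, recipient_is_minor=True,
                        protection_enabled=False, model=NudityClassifier()))
```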

Addressing Sextortion Scams

Meta is also developing technology to identify accounts potentially involved in sextortion scams. Additionally, the company is testing new pop-up messages for users who may have interacted with such accounts, aiming to enhance user safety and security on the platform.

Efforts to Safeguard Teenagers

In January, Meta announced plans to hide more content from teenagers on Facebook and Instagram, particularly sensitive content related to topics like suicide, self-harm, and eating disorders. 


These measures are intended to minimize exposure to harmful content and promote a safer online environment for young users.

Facing Legal Scrutiny

Meta faces legal challenges in both the United States and Europe regarding its platforms' impact on users, particularly young people.

In the U.S., attorneys general from 33 states have sued the company for allegedly misleading the public about the dangers of its platforms. Meanwhile, the European Commission has requested information on Meta’s measures to protect children from illegal and harmful content, reflecting growing regulatory scrutiny in Europe.
