Meta To Boost ‘Sextortion’ Protections On Instagram

Meta announced Thursday that it was developing new tools to safeguard teenage users from “sextortion” schemes on its Instagram platform, which has been accused of harming young people’s mental health.

The US company said in a statement that it was testing an AI-powered “nudity protection” feature that would automatically detect and blur images containing nudity sent to minors via the app’s messaging system.

“This way, the recipient is not exposed to unwanted intimate content and can choose whether or not to see the image,” said Capucine Tuffier, Meta France’s director of child protection.

The company also said it would send messages containing advice and safety tips to anyone who sent or received such images.

Meta said in January that it would implement safeguards to protect under-18s on its platforms, following legal action by dozens of US states accusing the company of profiting “from children’s pain”.

Leaked internal research from Meta, reported by The Wall Street Journal and by whistleblower Frances Haugen, a former Facebook engineer, revealed that the company was well aware of the risks its platforms posed to young people’s mental health.

Meta stated on Thursday that its latest tools built on “our longstanding work to help protect young people from unwanted or potentially harmful contact”.

“We’re testing new features to help protect young people from sextortion and intimate image abuse, and to make it more difficult for potential scammers and criminals to find and interact with teens,” the company said.
