Soon, Instagram to 'automatically blur' inappropriate images sent via DMs

Meta said that its latest tools were building on 'longstanding work to help protect young people from unwanted or potentially harmful contact'

By AFP


Photo: Reuters

Published: Thu 11 Apr 2024, 2:15 PM

Last updated: Thu 11 Apr 2024, 11:46 PM

Meta said Thursday that it was developing new tools to protect teenage users from "sextortion" scams on its Instagram platform, which has been accused by US politicians of damaging the mental health of youngsters.

The US firm said in a statement that it was testing an AI-driven "nudity protection" tool, which would automatically detect and blur nude images sent to minors through the app's messaging system.

"This way, the recipient is not exposed to unwanted intimate content and has the choice to see the image or not," Capucine Tuffier, in charge of child protection at Meta France, told AFP.

The firm said it would also send advice and safety tips to anyone sending or receiving such messages.

Meta announced in January that it would roll out measures to protect under-18s on its platforms after dozens of US states launched legal action accusing the firm of profiting "from children's pain".

Meta said on Thursday that its latest tools were building on "our longstanding work to help protect young people from unwanted or potentially harmful contact".
