Instagram is preparing to roll out a new safety feature that blurs nude images in messages, as part of efforts to protect minors on the platform from abuse and sexually exploitative scams.
Announced by Meta on Thursday, the new feature — which both blurs images detected to contain nudity and discourages users from sending them — will be enabled by default for teenage Instagram users, as identified by the birthday information on their account. A notification will also encourage adult users to turn it on.
These efforts follow longstanding criticism that platforms like Facebook and Instagram harm their youngest users, from damaging children’s mental health and body image, to knowingly platforming abusive parents and creating…