We missed this earlier: Apple's iMessage will now warn US users under 18 before they send or view messages containing nude photos, according to a report in the Wall Street Journal. The feature ships with the iOS 15.2 update and must be switched on by parents through the Family Sharing settings, the report added.
The company, however, dropped its plan to notify parents when their children (those under 13) view or send nudes. The notification was originally supposed to be part of the tool but was dropped over concerns around the privacy of kids' communications, WSJ wrote in its report.
An Apple engineer told WSJ that while the message informing kids that their parents would be notified was clear, it was possible that kids would not understand its implications and simply ignore it. It is not clear whether the feature will be rolled out worldwide or only in the US.
The feature is a significant development as tech companies try to address child sexual predation without trampling on the rights of parents and children.
What will happen once parents enable the feature?
The iPhone’s on-device AI will detect nude images received in the Messages app or added by the child. The image will be blurred, and the child will have to actively choose to open it, according to our report in August this year. The company had said that it will not have access to the messages or the images on its servers because of its encryption. Per the report, the flow works as follows (a rough sketch follows the list):
- Users will first receive a message asking whether they want to proceed to view or add the sensitive image.
- A second message will urge them to talk to someone they trust if they feel pressured into viewing or sending it.
- Kids will be able to text an adult for help from within the Messages app itself.
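The report describes this flow in broad strokes; no public API is involved. The sketch below is only a rough illustration in Swift under those assumptions: the names (IncomingImage, classifierFlagsAsNude, handleIncoming, ChildChoice) are hypothetical stand-ins, not Apple's actual on-device model or Messages integration.

```swift
import Foundation

// Rough, illustrative sketch of the flow described above.
// All names are hypothetical; Apple's on-device model and
// Messages integration are not public APIs.

enum ChildChoice {
    case dontView            // keep the image blurred
    case viewAnyway          // explicitly choose to reveal it
    case messageTrustedAdult // ask for help inside Messages
}

struct IncomingImage {
    let data: Data
    var isBlurred = false
}

/// Stand-in for an on-device classifier; a real implementation
/// would run a machine-learning model locally, never on a server.
func classifierFlagsAsNude(_ image: IncomingImage) -> Bool {
    return true // assume the image is flagged, to walk the warning path
}

func handleIncoming(_ image: inout IncomingImage, choose: () -> ChildChoice) {
    guard classifierFlagsAsNude(image) else { return }

    // Step 1: blur the image so nothing sensitive shows by default.
    image.isBlurred = true
    print("This photo may be sensitive. Do you want to view it?")

    // Step 2: the child must actively choose what happens next.
    switch choose() {
    case .viewAnyway:
        // A second prompt urges talking to someone they trust.
        print("If you feel pressured, talk to someone you trust.")
        image.isBlurred = false
    case .messageTrustedAdult:
        print("Starting a message to a trusted adult...")
    case .dontView:
        print("The image stays blurred.")
    }
}

// Example: the child decides not to view the flagged image.
var received = IncomingImage(data: Data())
handleIncoming(&received) { .dontView }
```

The point of the sketch is simply that detection, blurring, and the choice all happen on the device, which is consistent with Apple's claim that it never sees the messages or images on its servers.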
The image blurring and warnings will be available only in Apple’s Messages app and will not cover third-party messaging apps such as WhatsApp and Signal, or social media apps like Instagram and Snapchat, WSJ revealed. The tool will be rolled out on iPhones before it comes to iPads and Macs, the business daily added.
What were the measures announced by Apple to combat Child Sexual Abuse?
Apple had announced three measures to be introduced in its operating systems that aim to limit the spread of Child Sexual Abuse Material (CSAM) and protect children from predators:
- Communication safety in Messages: The image-blurring and warning feature described above, which flags sexually explicit images sent to or received by a child’s account.
- CSAM detection in iCloud Photos: Using advanced cryptographic methods, Apple will detect whether a user’s iCloud Photos library contains high levels of CSAM content and pass this information on to law enforcement agencies (a simplified, illustrative sketch follows this list).
- Safety measures in Siri and Search: Siri and Search will intervene when users try to search for CSAM-related topics, and will also provide parents and children with expanded information if they encounter unsafe situations.
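Apple’s published design for the iCloud Photos measure relies on a perceptual hash (NeuralHash) matched through cryptographic private set intersection, none of which is reproduced here. The sketch below is a much simpler stand-in, assuming a plain set of known hashes, an exact SHA-256 match, and a hypothetical loadKnownHashDatabase and reportingThreshold, just to illustrate the general idea of flagging an account only past a match threshold.

```swift
import Foundation
import CryptoKit

// Heavily simplified stand-in for hash-based CSAM detection.
// Apple's real design uses a perceptual hash and cryptographic
// private set intersection; this sketch only shows the general
// idea of matching against a known-hash database with a threshold.

/// Hypothetical source of the known-hash database; returns an
/// empty set so the sketch stays self-contained.
func loadKnownHashDatabase() -> Set<String> {
    return []
}

let knownHashes = loadKnownHashDatabase()
let reportingThreshold = 30 // hypothetical: flag only past this many matches

/// Exact SHA-256 digest of the photo bytes (a real system would use a
/// perceptual hash that tolerates resizing and re-encoding).
func photoHash(_ photo: Data) -> String {
    SHA256.hash(data: photo).map { String(format: "%02x", $0) }.joined()
}

func matchCount(in library: [Data]) -> Int {
    library.filter { knownHashes.contains(photoHash($0)) }.count
}

func shouldFlagAccount(library: [Data]) -> Bool {
    matchCount(in: library) >= reportingThreshold
}

// Example: an empty library produces no matches and no flag.
print(shouldFlagAccount(library: [])) // false
```

The threshold is the key design choice in this kind of scheme: no single match reveals anything about an account, and only a pattern of many matches triggers any review.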
The move faced immediate resistance, as many free speech advocates argued that the technology could be exploited by authoritarian regimes as a surveillance tool to target political dissidents. The company has pressed pause on these CSAM measures while it reviews the policy and the concerns around it.
“Apple can explain at length how its technical implementation will preserve privacy and security in its proposed backdoor, but at the end of the day, even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor.” – Electronic Frontier Foundation
Also read:
- How WhatsApp Deals With Child Sexual Abuse Material Without Breaking End To End Encryption
- Instagram Announces Three New Safety Measures For Young Users, Including Limiting Advertisers’ Reach
- India Leads In Generation Of Online Child Sexual Abuse Material
