
Apple will warn kids before they send nude images but won’t inform their parents: Report

Apple had faced flak from privacy and child-safety advocates over an earlier version of the feature.

We missed this earlier: Apple's iMessage will now warn US users under 18 before they send or view messages containing nude photos, according to a report in the Wall Street Journal. The feature ships in the new iPhone update (iOS 15.2) through the Family Sharing setting and must be switched on by parents, the report added.

The company, however, dropped its plan to notify parents when children under 13 view or send nudes. The notification was supposed to be part of an earlier version of the tool but was removed over concerns about the privacy of kids' communications, WSJ wrote in its report.

An Apple engineer, speaking to WSJ, said that although the message informing kids that their parents would be notified was clearly worded, children might not understand its implications and ignore it. It is not clear whether the feature will be rolled out worldwide or only in the US.

The feature is a significant development as tech companies try to address child sexual predation without trampling on the rights of parents and children.

What will happen once parents enable the feature?

The iPhone's on-device AI will detect nude images received by its Messages app, or ones the child attempts to send, according to our report in August this year. The nude image will be blurred, and the child will have to choose to open it. The company had said it would not have access to the messages or the images on its servers because of its encryption.

  • Users will first receive a message asking whether they want to proceed to view or add the sensitive image.
  • A second message will urge them to talk to someone they trust if they feel pressured into sending or viewing the image.
  • Kids will be able to text an adult for help from within the Messages app itself.

The image blurring and warnings will be available only on Apple's Messages app and will not cover third-party messaging apps such as WhatsApp and Signal, or social media apps like Instagram and Snapchat, WSJ revealed. The tool will be rolled out on iPhones before iPads and Macs, the business daily added.
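The flow described above — on-device detection, blurring, then a pair of warnings before a minor can open the image — can be sketched roughly as follows. This is an illustrative sketch only, not Apple's implementation; all class, function, and field names are hypothetical.

```python
# Illustrative sketch of the warning flow described in the article.
# NOT Apple's code: names and structure are assumptions for clarity.
from dataclasses import dataclass, field
from typing import List

@dataclass
class MessageImage:
    flagged_sensitive: bool            # verdict from the on-device classifier
    blurred: bool = False
    warnings_shown: List[str] = field(default_factory=list)

def handle_incoming_image(image: MessageImage, user_is_minor: bool) -> MessageImage:
    """Blur a flagged image and queue the two warnings a minor must
    see before choosing to view it; adults and clean images pass through."""
    if user_is_minor and image.flagged_sensitive:
        image.blurred = True
        image.warnings_shown = [
            "Ask whether the user wants to proceed to view the image",
            "Urge the user to talk to someone they trust if they feel pressured",
        ]
    return image
```

The key design point the article reports is that everything happens on the device: the classifier verdict and the warnings never leave the phone, which is how Apple claims it avoids accessing messages on its servers.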

What measures did Apple announce to combat child sexual abuse?

Apple had announced three measures to be introduced in its operating systems which look to limit the spread of Child Sexual Abuse Material (CSAM) and protect children from predators:

  • Communication safety in Messages: The image-blurring and warning tool for children described above.
  • CSAM detection in iCloud Photos: Using advanced cryptographic methods, Apple will detect if a user's iCloud Photos library contains high levels of CSAM content and pass on this information to law enforcement agencies.
  • Safety measures in Siri and Search: Siri and Search will intervene when users try to search for CSAM-related topics and will also provide parents and children expanded information if they encounter unsafe situations.
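The iCloud Photos measure works by matching image fingerprints against a database of known CSAM hashes and flagging an account only past a threshold of matches. A minimal sketch of that matching logic, assuming a simple exact-match scheme and a hypothetical threshold (Apple's actual system uses perceptual hashing and cryptographic protocols not shown here):

```python
# Illustrative sketch of threshold-based hash matching, NOT Apple's system.
# The threshold value and function names are hypothetical assumptions.
from typing import Iterable, Set

REPORTING_THRESHOLD = 30  # hypothetical: accounts flagged only past this count

def count_matches(image_hashes: Iterable[str], known_csam_hashes: Set[str]) -> int:
    """Count how many of a library's image hashes appear in the known-CSAM set."""
    return sum(1 for h in image_hashes if h in known_csam_hashes)

def should_flag_account(image_hashes: Iterable[str],
                        known_csam_hashes: Set[str],
                        threshold: int = REPORTING_THRESHOLD) -> bool:
    """Flag only when matches reach the threshold, so isolated false
    positives on individual images do not trigger a report."""
    return count_matches(list(image_hashes), known_csam_hashes) >= threshold
```

The threshold is the "high levels" qualifier in the bullet above: a single matching image is not enough to trigger a report under this design.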

The move faced resistance immediately, as many free speech advocates believed the technology could be exploited by authoritarian regimes to snuff out political dissent and be wielded as a surveillance tool. The company has pressed pause on these CSAM measures as it reviews the policy and the concerns around it.

“Apple can explain at length how its technical implementation will preserve privacy and security in its proposed backdoor, but at the end of the day, even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor.” – Electronic Frontier Foundation






MediaNama is the premier source of information and analysis on Technology Policy in India. More about MediaNama, and contact information, here.

© 2008-2021 Mixed Bag Media Pvt. Ltd. Developed By PixelVJ

