
Big Tech launches Project Protect to fight online child sexual exploitation and abuse


The Technology Coalition — a global group of 18 technology companies including Amazon, Apple, Microsoft, Facebook, Google, PayPal, Snapchat, Adobe and GoDaddy — announced a new plan to combat online child sexual abuse on June 10. The plan, called Project Protect, includes a multi-million dollar Innovation Fund to develop cross-industry technology that can fight child sexual exploitation and abuse (CSEA) online, and a multi-stakeholder approach to dealing with CSEA, with stakeholders including governments, law enforcement, civil society, research centres, hotlines, first responders, social workers, and educators.

For this plan, the Coalition has partnered with the Global Partnership to End Violence Against Children (EVAC) and the WePROTECT Global Alliance (WPGA). The Coalition will act as a resource for the entire technology industry in the fight against child sexual exploitation and abuse (CSEA). EVAC will head the project's research arm, whose work will be conducted and analysed independently of the Coalition and its members.

As of now, the project is focusing on putting in place a structure for the project, membership models, and hiring.

The announcement of Project Protect comes at a time when there has been an increase in the volume of child sexual abuse material (CSAM) created and shared online. India accounts for the highest number of suspected online child exploitation reports received by the CyberTipline of the USA's National Center for Missing and Exploited Children (NCMEC). Facebook accounted for 94% of all reports made to NCMEC by electronic service providers. In the UK too, April 2020 saw a significant spike in the number of attempts to access known child sexual abuse imagery.

In light of Facebook’s, and especially Messenger’s, massive share in reporting CSAM to NCMEC, in February 2020, 129 signatories had urged the company to resist introducing end-to-end encryption on Facebook’s messaging platforms, and their subsequent integration. The governments of the USA, UK and Australia had written an open letter asking Facebook not to introduce end-to-end encryption, at least not without backdoors for law enforcement, citing terrorism and online child sexual exploitation as their reasons. In response, Facebook and WhatsApp had flatly refused to build backdoors, and were supported by 58 civil society organisations around the world.


End-to-end encryption, which is already the default on Facebook-owned WhatsApp, prevents anyone except the sender and the receiver from monitoring the communication in any way, thereby making all communication opaque to any automated or manual scanning for CSAM. Thus far, Facebook has primarily relied on automated monitoring of content on its social media platform and on Messenger to report and take down CSAM. Microsoft uses PhotoDNA, a technology it developed with Dartmouth College, to target CSAM. PhotoDNA creates a “hash”, a unique digital signature, of an image and then compares it against the hashes of other photos to find copies and take them down and/or report them.
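The hash-matching approach described above can be illustrated with a minimal sketch. PhotoDNA itself is proprietary and computes a perceptual hash that survives resizing and re-encoding; the simplified example below instead uses an ordinary cryptographic hash and a hypothetical database of known hashes, so it only catches exact byte-for-byte copies — but the match-against-a-known-list logic is the same.

```python
import hashlib

def image_hash(data: bytes) -> str:
    # Hash the raw image bytes. PhotoDNA computes a "robust" perceptual
    # hash instead, so near-duplicates (resized, re-compressed copies)
    # still match; this sketch matches exact copies only.
    return hashlib.sha256(data).hexdigest()

# Hypothetical database of hashes of previously identified imagery.
known_hashes = {image_hash(b"previously-flagged-image-bytes")}

def is_known_match(data: bytes) -> bool:
    # An uploaded image is flagged if its hash appears in the database;
    # the image itself never needs to be stored or viewed again.
    return image_hash(data) in known_hashes

print(is_known_match(b"previously-flagged-image-bytes"))  # True
print(is_known_match(b"harmless-photo-bytes"))            # False
```

The key design point is that platforms compare signatures rather than images: a service can detect known material without retaining or redistributing it. Under end-to-end encryption, however, the server never sees the image bytes at all, so this kind of server-side comparison cannot run.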

The issue of tracing originators and distributors of CSAM on end-to-end encrypted messaging platforms has been brought up in the WhatsApp traceability case in India as well. There, Senior Advocate Kapil Sibal, on behalf of WhatsApp, had argued that even on a matter as serious as child pornography, WhatsApp’s hands were tied because of end-to-end encryption. However, WhatsApp also uses PhotoDNA to scan unencrypted profile photos of WhatsApp users to find and take down accounts of people and groups engaged in sharing CSAM. The ad hoc Rajya Sabha committee, led by Jairam Ramesh, had recommended that law enforcement agencies be allowed to break end-to-end encryption to trace distributors of child pornography. The committee had been constituted to find out ways to prevent sexual abuse of children and prohibit access and circulation of child pornography on social media.

Written By

Send me tips at aditi@medianama.com. Email for Signal/WhatsApp.






MediaNama is the premier source of information and analysis on Technology Policy in India. More about MediaNama, and contact information, here.

© 2008-2021 Mixed Bag Media Pvt. Ltd. Developed By PixelVJ
