
How will Facebook’s new privacy-enhancing technologies impact user data in digital advertising? 

Drawn from cryptography and statistics, technologies such as multi-party computation and differential privacy come after Google announced its own alternative to third-party cookies.

Facebook is redrawing the borders of its digital advertising landscape to ensure that ad personalisation does not infringe upon the privacy of its users, according to a blog post by Facebook’s VP of product marketing for ads Graham Mudd on August 11. The post said that the company is developing privacy-enhancing technologies (PETs) which reduce Facebook’s reliance on individual third-party data to deliver personalised ads.

The notion of privacy has taken centre stage in conversations around Big Tech and its influence on the lives of people at large. Big tech companies like Facebook have faced pushback for years over their practices of gathering user data and using it to track people across the digital spectrum. Apple rolled out a prompt to iPhones late last year that lets users decide which apps can track them across other apps to deliver targeted ads. Google is rolling out something similar for Android phones by phasing out third-party cookies and replacing them with a mechanism called Federated Learning of Cohorts (FLoC) that groups clusters of people with similar interests instead of tracking them individually.

On the legislative end, the European Union is exploring a ban on targeted ads as part of a proposal called the Digital Services Act, and the Biden administration is pursuing its interest in policing the “surveillance of users” by “dominant Internet platforms.”

Privacy-enhancing technologies (PETs) touted by Facebook 

According to Facebook, PETs are “advanced techniques drawn from the fields of cryptography and statistics” that “minimize the data that is processed while preserving critical functionality like ad measurement and personalization.” There are three kinds of PETs:


Multi-party computation (MPC) 

The company said that MPC lets two or more organisations work with data together while limiting the information that either party can learn.

  • Data remains encrypted end-to-end in transit, storage, and use, making it impossible for any party to see the other’s data. 
  • Useful in reporting results of an ad campaign or training a machine-learning model where data is held by two or more parties as MPC calculates outcomes from more than one party while maintaining privacy. 
  • The framework for MPC is open source for any developer to create privacy-centric measurement products with the technology. 
  • Facebook rolled out MPC last year in a tool called Private Lift Measurement. It will be available to all advertisers by next year.
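Facebook has not published its protocol details, but the core idea of MPC can be sketched with additive secret sharing: each party splits its private number into random shares, the parties combine shares locally, and only the joint total is ever revealed. All values and function names below are illustrative, not Facebook's implementation.

```python
import random

PRIME = 2**61 - 1  # arithmetic over a large prime field

def share(value, n_parties=2):
    """Split a value into random additive shares that sum to it mod PRIME."""
    shares = [random.randrange(PRIME) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % PRIME)
    return shares

# Each organisation secret-shares its private count (e.g. ad conversions).
a_shares = share(1200)  # advertiser's private total
b_shares = share(340)   # platform's private total

# Each party adds the shares it holds; a single share reveals nothing.
partial_sums = [(a + b) % PRIME for a, b in zip(a_shares, b_shares)]

# Only the recombined result is revealed: the joint total, not the inputs.
joint_total = sum(partial_sums) % PRIME
print(joint_total)  # 1540
```

Real MPC protocols support multiplication and comparisons as well as sums, which is what makes lift measurement and model training possible on shared data.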

On-device learning

According to Facebook, on-device learning trains an algorithm to process data collected on the user’s device without sending individual data to a remote server or cloud. 

  • It will find patterns using historical data of actions performed on the device to make predictions.
  • It will devise new ways to show people relevant ads without needing to learn about specific actions users take on other apps and websites.
  • The company claims it is a feature like autocorrect or text prediction and improves over time. 

Differential privacy

It is a technique that is used on its own or applied to other privacy-enhancing technologies to protect data from being re-identified, as per Facebook.  

  • It adds carefully calibrated “noise” to prevent reverse engineering of individual data within aggregated datasets.
  • In an example provided by the company, if 118 people buy a product after clicking on an ad, the system would add or subtract a random amount from 118.
  • The company said that adding a small bit of incorrect information makes it harder to know who actually bought the product after clicking the ad. 
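Facebook's 118-person example maps directly onto the standard Laplace mechanism from the differential-privacy literature. A minimal sketch, assuming Laplace noise (the company has not said which noise distribution it uses):

```python
import random

def laplace_noise(scale):
    """Sample Laplace(0, scale) as the difference of two exponentials."""
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

def noisy_count(true_count, epsilon=1.0, sensitivity=1.0):
    """Release a count with noise calibrated to sensitivity/epsilon.

    sensitivity is how much one person can change the count (1 here);
    smaller epsilon means more noise and stronger privacy.
    """
    return round(true_count + laplace_noise(sensitivity / epsilon))

# The system reports a value near 118, randomly perturbed on each release,
# so no one can tell whether any specific person is in the count.
reported = noisy_count(118, epsilon=1.0)
```

The random perturbation is why the company describes the output as "a small bit of incorrect information": the aggregate stays accurate enough for measurement while individual membership becomes deniable.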

Key takeaways from Graham Mudd’s interview on PETs

In an interview with The Verge, Mudd revealed the following:

  • Limited access to third-party data: “Access to third-party data will become more limited over the course of the next couple of years. It is a reflection of people’s changing expectations around privacy.”
  • Merits of MPC: “MPC is just another approach to anonymous data sharing. Instead of it necessarily being aggregate, you’re using encryption and cryptography technology to ensure that the same end is met, which is that you don’t learn anything about an individual.”
  • Developing consensus with operating systems: “One of the challenges with on-device learning is that the compute resources required to do it are obviously under the control of the operating systems (Apple and Google) themselves.” 
  • Existing privacy controls for users: “We have what’s called an online behavioral advertising control. So you can turn off the use of third-party data for advertising already. You can also — through another control called Off-Facebook Activity— at the advertiser level, decide whether you want data associated with your account and used in advertising.”
  • Building tools for privacy: “There are dedicated teams and hundreds of engineers who are working on these (privacy and ads) types of technologies. Because data and personalization is at the heart of almost every one of our systems, from targeting to ad optimization to measurement, almost all our systems will be rebuilt over the next couple of years.”
  • Sentiment around privacy: “We do a lot of research into people’s sentiments and beliefs around privacy, and those help to shape the investments that we’ve made. I would say that in most of these cases, regulators, policymakers, and the media tend to be both ahead of and helping to shape consumer opinion here, as is their role.”
  • Future of digital advertising: “I think our view is that the path forward here is likely an ensemble of privacy enhancing technologies. It’s not a situation of it’s FLoC or MPC. FLoC addresses a specific use case — behavioral targeting — without revealing anything about a given individual. And the beta that we have running right now is really focused on measurement.”

Google’s bet on FLoC 

Facebook is not the only company building privacy consciousness into its ads ecosystem. Google began working on FLoC as an alternative to third-party cookies, which the Chrome browser will phase out by 2023.

In a blog post earlier this year, Google said that the Privacy Sandbox it released was to keep up with changing consumer expectations around privacy and that it was experimenting with FLoC to build on its vision. 

FLoC clusters large groups of people with similar interests. It hides individuals “in the crowd” and uses on-device processing to keep a person’s online history private on the browser.
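Chrome computed each browser's cohort ID locally (using a SimHash of browsing history) so that only a coarse group label, never the history itself, was exposed to sites. A toy sketch of that idea with a simplified hash (real FLoC clustered similar histories together, whereas this version only matches identical ones):

```python
import hashlib

def cohort_id(browsing_history, n_cohorts=1000):
    """Toy cohort assignment: hash a user's set of visited domains
    into one of n_cohorts buckets, entirely on the device."""
    canonical = ",".join(sorted(set(browsing_history)))
    digest = hashlib.sha256(canonical.encode()).hexdigest()
    return int(digest, 16) % n_cohorts

# Two users with the same interests land in the same cohort, and sites
# only ever see the shared cohort number, not the underlying history.
a = cohort_id(["news.example", "sports.example"])
b = cohort_id(["sports.example", "news.example"])
print(a == b)  # True
```

The privacy argument is that thousands of users share each cohort number, hiding any individual "in the crowd"; the EFF's criticism below is that the cohort number itself is a new signal handed to advertisers.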

However, FLoC invited tremendous scrutiny from various quarters, and several browser makers declined to adopt it. The Electronic Frontier Foundation went so far as to label it a “terrible idea” in its report.

“FLoC is part of a suite intended to bring targeted ads into a privacy-preserving future. But the core design involves sharing new information with advertisers. Unsurprisingly, this also creates new privacy risks. Users and advocates must reject FLoC and other misguided attempts to reinvent behavioral targeting. We implore Google to abandon FLoC and redirect its effort towards building a truly user-friendly Web,” the report read. 



MediaNama is the premier source of information and analysis on Technology Policy in India. More about MediaNama, and contact information, here.

© 2008-2021 Mixed Bag Media Pvt. Ltd. Developed By PixelVJ
