Differential Privacy: Where Maths and Privacy Meet

Differential privacy is a formal mathematical definition of privacy. At its core, implementing differential privacy means adding carefully calibrated ‘noise’ to data to protect the individuals in it, and it has emerged as an effective method of protecting individual privacy in the age of big data. 

What is Differential Privacy? 

As we alluded to above, differential privacy involves applying an algorithm to data that improves individual privacy outcomes while still allowing an organisation to achieve its purposes. An algorithm is only considered differentially private if its output does not meaningfully reveal whether any particular individual’s data was included in the dataset.  
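One common way to achieve this in practice is the Laplace mechanism: random noise, scaled to the query’s sensitivity and a privacy parameter epsilon, is added to a statistic before it is released. The sketch below is illustrative only (the function names are our own, not from any particular library) and shows an epsilon-differentially private count:

```python
import random

def laplace_noise(scale):
    # Laplace(0, scale) noise: an exponential magnitude with a random sign.
    magnitude = random.expovariate(1.0 / scale)
    return magnitude if random.random() < 0.5 else -magnitude

def private_count(records, epsilon=1.0):
    # A counting query has sensitivity 1 (adding or removing one person
    # changes the count by at most 1), so Laplace noise with scale
    # 1/epsilon makes the released count epsilon-differentially private.
    return len(records) + laplace_noise(1.0 / epsilon)
```

Smaller values of epsilon mean more noise and stronger privacy: the released count is close to the truth on average, but the output alone never confirms whether any one record was present.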

Read more about it on Harvard’s Privacy Tools Project page. 

Differential Privacy: Real-World Examples 

Apple, for instance, outlines that it privatises data on the user’s device – before it is sent to Apple – so that Apple’s servers never receive clear (identifiable) data from a user. Once the data has been collected, Apple drops IP addresses and other metadata to further strengthen privacy. Finally, Apple aggregates and processes the raw information before sharing the compiled statistics with the relevant teams.  

“[Local differential privacy] is a technique that enables Apple to learn about the user community without learning about individuals in the community. Differential privacy transforms the information shared with Apple before it ever leaves the user’s device such that Apple can never reproduce the true data.” 
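The classic textbook illustration of local differential privacy is randomised response, where each device adds noise to its own answer before anything leaves it. The sketch below is our own illustration of that general idea, not Apple’s actual algorithm:

```python
import random

def randomized_response(true_answer, p=0.75):
    # Each user reports their true (boolean) answer with probability p,
    # and otherwise reports a uniformly random answer. No single report
    # reveals the user's true answer with certainty.
    if random.random() < p:
        return true_answer
    return random.random() < 0.5

def estimate_proportion(reports, p=0.75):
    # The server corrects for the noise it knows was added:
    # E[reported rate] = p * true_rate + (1 - p) * 0.5
    observed = sum(reports) / len(reports)
    return (observed - (1 - p) * 0.5) / p
```

Any individual report may be a lie, so the server can never reproduce a user’s true answer; yet across many users the noise averages out, and the overall proportion can still be estimated accurately.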

Advantages and Disadvantages of Differential Privacy  

Differential privacy provides a mathematical guarantee that individual privacy will be protected once information is collected and processed. However, it is not a one-size-fits-all solution to organisational privacy challenges.  

Advantages of Differential Privacy 

Three key advantages of differential privacy (when compared to ‘traditional’ methods, such as anonymisation) are that it: 

  1. Promotes individual privacy while still allowing organisations to collect and analyse the data they need.  
  2. Increases the accuracy of aggregated data. One factor that leads to improved accuracy is that people are more likely to provide accurate and detailed information when they aren’t afraid of being exposed.  
  3. Builds trust between individuals and organisations. Now that more individuals know the value of their data – and the risk leaked data poses to them – they appreciate and trust organisations that prioritise data security.  

Disadvantages of Differential Privacy 

As we foreshadowed, differential privacy is not a flawless method of data aggregation. Some disadvantages include:  

  1. Reduced data quality – especially where the dataset is small. This is why we see higher adoption of differential privacy amongst ‘big data’ companies like Apple, Facebook, and Amazon, and among government departments.  
  2. Increased technical complexity, which drives up costs and decreases access.  
  3. Decreased utility of the data. Since you are adding ‘noise’ to the data, its quality and usefulness may decrease. If we consider the example in the NIST video above, where a person’s address was changed, insights tied to that specific geographic location may be overlooked.  
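The small-dataset problem in points 1 and 3 comes down to simple arithmetic: the noise required for a given privacy level has a fixed scale, so its relative impact shrinks as the dataset grows. A rough illustration (the figures here are hypothetical):

```python
import random

def laplace_noise(scale):
    # Laplace(0, scale) noise: an exponential magnitude with a random sign.
    magnitude = random.expovariate(1.0 / scale)
    return magnitude if random.random() < 0.5 else -magnitude

# The same privacy level (epsilon = 0.1, so noise of scale 10) distorts
# a small count far more, in relative terms, than a large one.
epsilon = 0.1
for true_count in (20, 200_000):
    noisy = true_count + laplace_noise(1.0 / epsilon)
    rel_error = abs(noisy - true_count) / true_count
    print(f"count={true_count}: relative error {rel_error:.4%}")
```

For a count of 20, noise of typical magnitude 10 can swamp the signal; for a count of 200,000, the same noise is negligible – which is why big-data organisations can absorb it far more easily.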

Further Resources: 

For more information, read:  

Big Data and De-identification: Taking a risk management approach 

De-identification as a privacy enhancing tool: How, when and why to use it 

Re-introducing the Re-identification offence bill: The dumbest privacy idea this year? 

Security Management with Privacy 108  

Our team builds information security and privacy management frameworks, policies, and processes aligned to major standards and customised to meet your individual requirements. 

We can support you at any stage in your management system lifecycle: whether design, implementation, maintenance, or review. 

To improve your security management, contact us: 


Jodie is one of Australia’s leading privacy and security experts and the Founder of Privacy 108 Consulting.