3 Tips for Conducting Better Privacy Impact Assessments
In recent guidance, the Office of the Australian Information Commissioner suggested that organisations undertake a Privacy Impact Assessment (PIA) covering generative AI tools. The guidance shares a fictional example of an employee uploading a customer’s personal data to tools like ChatGPT – against company policy – and how a PIA could have helped.
In this post, we look at the OAIC’s recent guidance and outline practical tips for conducting better privacy impact assessments.
What is a Privacy Impact Assessment?
Before we get started, we wanted to briefly share some key information about PIAs.
A Privacy Impact Assessment (or PIA) is a process that organisations can use to understand the impact of a project on personal privacy and evaluate strategies to manage, mitigate, or reduce privacy risk. PIAs are an underutilised tool and, in our opinion, organisations miss out on significant benefits if they are narrowly viewed as a legal compliance measure.
You can learn more about PIAs in the following resources:
- Triggers for Conducting a Privacy Impact Assessment
- PIAs as a Business Tool
- The AFP in Trouble Again for AI Use with no PIA
Privacy Impact Assessments & Generative AI Data Breaches: A Scenario
In the OAIC’s fictional case study, an insurance company, CarCover, had an internal policy banning the use of public AI tools for personal data. Yet, a staff member was able to upload a customer’s sensitive financial hardship application because GenAI use was permitted within the workplace for ‘routine tasks’.
The employee’s action resulted in a double failure:
- Sensitive health and financial data was shared with a public third-party platform, and
- The AI’s summary was inaccurate, causing the insurer to unfairly reject the customer’s application.
How Could a Privacy Impact Assessment Have Helped?
The scenario shows that written policies alone are not sufficient to prevent data breaches. Technical measures, along with ongoing training and communication, may also be needed for organisations to effectively reduce risk.
A Privacy Impact Assessment can help organisations understand the potential impact and consequences of using GenAI tools, as well as of projects more broadly. With this knowledge, organisations are in a better position to manage, minimise, or eliminate the relevant risks.
In this scenario, had it completed a PIA, the organisation may have opted for protections beyond an internal policy. Some examples may include:
- Providing a paid Generative AI plan to employees to gain greater control over the data. Paid models often have more privacy-friendly settings.
- Disallowing any use of GenAI tools, including adding technical measures to prevent access.
- Increasing training for the team around the risks of sharing personal information with GenAI tools.
3 Tips for Better Privacy Impact Assessments
So how can organisations create better privacy impact assessments? We’ve outlined three tips we believe can help:
Tip 1: Visualise exactly where your data goes and can go.
Organisations are getting more comfortable with data flow maps – that is, visual representations of how information moves through the organisation. These maps trace the lifecycle of data: how it is collected, where it is stored, who accesses it, and how it is eventually destroyed.
However, the OAIC’s scenario highlights the risks posed by invisible or possible-but-prohibited data flows. Privacy Impact Assessments can, and should, consider these flows. To do this, your PIA can look at:
- Possible secondary uses, and the risks they pose.
- The technology ecosystem, and whether new tools can be plugged in by teams without approval.
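As a rough illustration (the systems and data items named here are hypothetical, not from the OAIC guidance), a data flow map can be modelled as simple records, which makes it easy to surface flows that reach destinations outside an approved list – exactly the "possible-but-prohibited" flows a PIA should consider:

```python
# Illustrative sketch: each data flow is a record of what moves where.
approved_destinations = {"crm", "billing", "secure_archive"}

data_flows = [
    {"data": "customer address", "source": "web form", "destination": "crm"},
    {"data": "hardship application", "source": "email", "destination": "public_genai_tool"},
]

# Flag any flow whose destination is not on the approved list -- these are
# the invisible or prohibited flows a written policy alone will not catch.
unapproved = [f for f in data_flows if f["destination"] not in approved_destinations]
for flow in unapproved:
    print(f"Review: '{flow['data']}' flows to unapproved destination '{flow['destination']}'")
```

Even a lightweight check like this forces the conversation about where data *can* go, not just where it is supposed to go.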
Tip 2: Design a PIA framework to catch risks the APPs might miss
While Privacy Impact Assessments can serve as compliance tools, using them only that way misses some of their benefits. Using them to develop a risk framework can help you better protect your organisation. To achieve this, consider risks as they extend to the real world, not just the Australian Privacy Principles. For example:
- Context Matters: Collecting a home address might be low risk for an e-commerce delivery, but it is extremely high risk for a confidential counselling service for LGBTIQA+ teens. Your framework should weigh the sensitivity of the data against the context of its potential use.
- Likelihood vs. Consequence: Use a standard risk matrix that carefully reflects likelihood compared to consequences. What is the likelihood of a breach and what is the real-world consequence (e.g., identity theft, physical danger, humiliation)?
- Beyond a Breach: Consider risks other than hackers. What if the data is accurate but used unfairly? What if the data is inaccurate and leads to unfair decision making?
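To make the likelihood-vs-consequence point concrete, here is a minimal sketch of a standard risk matrix. The scales and thresholds are our own illustrative assumptions, not prescribed by the OAIC or the APPs; your framework should calibrate them to your context:

```python
# Hypothetical three-point scales -- tune these to your own risk framework.
LIKELIHOOD = {"rare": 1, "possible": 2, "likely": 3}
CONSEQUENCE = {"minor": 1, "moderate": 2, "severe": 3}

def risk_rating(likelihood: str, consequence: str) -> str:
    """Combine likelihood and consequence into a simple overall rating."""
    score = LIKELIHOOD[likelihood] * CONSEQUENCE[consequence]
    if score >= 6:
        return "high"
    if score >= 3:
        return "medium"
    return "low"

# Context matters: the same data item scores very differently depending
# on the real-world consequence of exposure.
print(risk_rating("possible", "minor"))   # address for an e-commerce delivery -> low
print(risk_rating("possible", "severe"))  # address for a confidential counselling service -> high
```

The point is not the arithmetic but the discipline: rating the same data in different contexts makes hidden high-risk uses visible.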
Tip 3: Train key team members on the benefits of PIAs
While your privacy professionals and champions likely know the benefits and importance of Privacy Impact Assessments, other key team members may not. This can mean opportunities to complete PIAs are missed.
Offering training to key team members can help them recognise triggers for PIAs, reducing the knowledge gap that leads to ‘too-late’ scenarios.
Here’s who to train and why:
- Software Developers & Solution Architects: They need to understand that a small technical decision (like an API call to a server overseas) can have a massive privacy impact.
- Procurement & Vendor Management: If they understand the triggers for a PIA, they can ensure privacy vetting happens before a vendor is onboarded or a contract is signed.
- Project Managers & Product Owners: Training them ensures they build “Privacy by Design” time into their projects, rather than viewing the PIA as a blocker at the finish line.
- Marketing & Digital Teams: These teams often move fast and use tools (like tracking pixels or GenAI copy generators) that may rely heavily on personal data. Plus, it’s important they understand “publicly available data” isn’t free to use without assessment.
- Frontline Managers: Staff often find “workarounds” to be more efficient (we discuss this in our piece covering Shadow AI). Training them helps them understand why certain controls exist, making them less likely to bypass them.
Privacy Impact Assessments with Privacy108
There are plenty of Privacy Impact Assessment templates available online. So, why would you choose Privacy108 for your PIA?
Our team offers a suite of services relating to Privacy Impact Assessments, from completing them for you to developing a PIA process to embedding PIAs into your business. Whatever service you choose, we streamline the process of assessing your risk.
Privacy108 approaches your PIA from a legal and IT perspective, assessing risk holistically and developing practical solutions to mitigate privacy risk. We deliver PIAs that meet your needs and build privacy compliance and risk management into your products, services and business processes.
Our approach recognises that PIAs should not be undertaken on a one-size-fits-all basis. Your Privacy108 PIA will contemplate your individual risk profile, timeline, budget, and IT infrastructure. We’re uniquely placed to oversee your privacy compliance from project initiation to completion, but we’re equally happy to provide point-in-time assessments with an implementable action plan.
For more information, reach out to the team for an obligation-free chat at hello@privacy108.com.au.
If you’d like insights like this in your inbox each month, subscribe to our newsletter: