About The Bunnings Privacy Decision: CCTV and Facial Recognition Technology Are A Potent Mix
Combining CCTV with facial recognition technology (FRT) is a relatively new practice that had slipped under the regulatory radar in Australia until it was raised by Choice magazine over two years ago. The Choice articles prompted an OAIC investigation, with the regulator’s decision on Bunnings’ use issued earlier this week (November 2024) to much coverage, including a post from the OAIC itself. In discussing its decision, the Privacy Commissioner refers to FRT and the surveillance it enables as “one of the most ethically challenging new technologies in recent years.”
The OAIC’s Bunnings decision is a strong indication of the regulator’s view on a range of important privacy issues, including consent, the provision of APP 5 notices of collection, and when a collection of information is ‘necessary’ as required by APP 3.
It also perhaps flags a more proactive approach by the regulator and a possible change in the Australian privacy regulatory environment.
This blog post will look at one aspect of the decision in more detail – whether consent was required for the collection.
What is Facial Recognition Technology (FRT)?
FRT is the process by which an individual can be identified or verified from a digital image.
A digital image of an individual’s face is made (e.g. via stills taken from CCTV footage) and distinct features are mapped into a biometric template. The biometric template is then compared against one or more pre-extracted biometric templates for the purpose of facial verification or identification. This whole process can take less than 1 second.
An individual does not need to be identified (by name) from the specific information being handled to be ‘identifiable’ in a facial identification system. An individual can be identified if their facial image is distinguishable from others in a database. Because everyone’s face is unique, a facial image is usually considered personal information: a picture of someone’s face, provided it is of reasonable quality, can be linked to an individual.
There are two main uses of FRT:
- facial verification refers to ‘one-to-one’ matching. It involves determining whether a face matches a single biometric template. This is what is used to unlock your device or as part of an access control system.
- facial identification refers to ‘one-to-many’ matching. It involves determining whether a face matches any biometric template in a database. This use is relevant to scanning crowds or large groups to identify a particular person. The FRT in the Bunnings case was used for facial identification (a minimal sketch of both modes follows below).
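To make the distinction concrete, here is a minimal sketch of the two modes, assuming (as is common in FRT systems) that biometric templates are numeric feature vectors compared by a distance measure. The function names, threshold value and use of Euclidean distance are illustrative assumptions, not details of the Bunnings system.

```python
# Minimal sketch of the two FRT modes. Biometric templates are modelled as
# NumPy vectors and compared by Euclidean distance; the threshold is
# illustrative, not a value from any real system.
import numpy as np

THRESHOLD = 0.6  # illustrative matching threshold


def verify(probe: np.ndarray, enrolled: np.ndarray) -> bool:
    """One-to-one: does the probe template match one specific template?"""
    return float(np.linalg.norm(probe - enrolled)) < THRESHOLD


def identify(probe: np.ndarray, database: dict[str, np.ndarray]) -> str | None:
    """One-to-many: return the closest enrolled identity within the
    threshold, or None if no one in the database is close enough."""
    best_id, best_dist = None, THRESHOLD
    for person_id, template in database.items():
        dist = float(np.linalg.norm(probe - template))
        if dist < best_dist:
            best_id, best_dist = person_id, dist
    return best_id
```

The structural difference matters for the privacy analysis: verification compares against a single template (typically one the individual enrolled themselves), while identification scans an entire database, so it necessarily processes the data of everyone scanned, matched or not.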
Biometric templates, and biometric information that is used for biometric verification or identification, are sensitive information under the Privacy Act. Sensitive information is generally afforded a higher level of privacy protection. In particular, consent is needed to collect biometric information unless one of the exceptions in the Privacy Act applies.
Background – Uses of FRT and CCTV
The Bunnings case has implications for every user of CCTV and FRT.
CCTV combined with FRT is seen as a valuable tool for law enforcement. Law enforcement agencies are one of the target sectors for Clearview AI’s FRT system, which was constructed by scraping images from ‘public’ sites (according to Clearview). Clearview AI’s collection of facial images has been considered by regulators in Canada, the UK, France, and Australia, all of whom have issued penalties or notices to the US company to stem the use of its facial recognition database. See our blog post on Clearview here.
In Australia, the AFP has been in trouble with the regulator for its use of Clearview AI (see here), though it is not clear that this has deterred the agency, with reports of continued use earlier this year (see here).
CCTV and FRT have been used in other retail contexts. In the USA, the FTC banned the Rite Aid pharmacy chain from using CCTV and FRT after its system falsely identified shoppers as shoplifters.
Retailers in Australia have been reported to use the crime intelligence platform Auror which supports the use of CCTV and FRT by providing a database of ‘suspects.’ Auror reportedly shares information about suspects with supermarkets and police, alerting users when people enter their stores and using analytics to “prevent crime before it happens”. The company is reported to work closely with Australian police forces, which have access to the data collected on millions of Australians.
According to a case study on the Auror site, Auror has partnered with Woolworths NZ to roll out its Retail Crime Intelligence Platform: ‘making it fast and easy for stores to report detailed incidents, observe patterns of repeat offenders, and spot Organised Retail Crime (ORC) groups. Woolworths found they were able to capture three times as much data as their previous incident reporting system.’
Crime intelligence software like Auror (which is developed and offered by a for-profit organisation) also supports supermarkets in aggregating their data. It can gather more insights and create profiles on people, which can then be shared with law enforcement. For instance, when you enter a supermarket carpark, the automatic licence-plate recognition technology that opens the boom gate also serves as a car-tracking network, and that information can be shared with police. (More information here.)
Privacy advocates refer to the way that these programs ‘normalise’ surveillance and create additional privacy and security risks for the everyday consumer:
“They’re both punitive surveillance and also for profit. They wouldn’t be implementing them if it didn’t make good business sense for them to do it. It’s not just about policing people’s behaviour. It’s also about being able to gather more and more information.”
What are some of the concerns?
The particular privacy issues considered by the OAIC in the Bunnings decision (and the APPs found to have been breached) included:
- Collection of sensitive information without consent (and where no exception to the consent requirement applied), contrary to APP 3;
- Failure to provide notice of collection, contrary to APP 5.1;
- Failure to implement a privacy management system: failing to take such steps as were reasonable in the circumstances to implement practices, procedures and systems relating to its functions or activities to ensure that it complied with the APPs, as required by APP 1.2(a); and
- Incomplete privacy notice: failing to include in its privacy policies information about the kinds of personal information that it collected and held, and how it collected and held that personal information, as required by APP 1.4(a) and APP 1.4(b).
This post is going to focus on the question of collection of sensitive information without consent. In this context, Bunnings argued:
- There was no collection because the process involved a transient use of data;
- If there was a collection, then consent was not required because an exception to the requirement applied.
Was there a collection of sensitive information?
The process used to do the facial matching is described in Appendix 1 below.
Bunnings argued that this process did not involve the ‘collection’ by the FRT system of the personal information of non-matched individuals, because the activity lacked the necessary purposive character of ‘collection’ under the Privacy Act. Non-matched individuals were the majority of individuals whose data was collected. Bunnings did not dispute the collection of information regarding matched individuals but relied on exceptions to the requirement for consent, which is discussed further below.
Bunnings argued that it was a deliberate aspect of the design of the FRT system that the information of non-matched individuals (namely the facial image and associated vector set) was automatically deleted. It claimed that the transient processing of an image which occurred as part of the matching process took on average 4.17 milliseconds, and there was no retention in any record which could be re-accessed or retrieved. Accordingly, there was no collection because the processing lacked the purposive requirement of ‘collection.’
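To illustrate the shape of that argument (and not Bunnings’ actual implementation), the sketch below shows purely in-memory matching: the probe data for a non-matched person exists only as a local variable and is never written to disk or any persistent store. The names and threshold are assumptions carried over from the earlier sketch.

```python
# Hypothetical sketch of 'transient' processing: nothing about a non-matched
# individual is written to disk or a database; the probe vector goes out of
# scope when the function returns and is reclaimed by the garbage collector.
import numpy as np

THRESHOLD = 0.6  # illustrative only


def match_transiently(probe: np.ndarray,
                      database: dict[str, np.ndarray]) -> str | None:
    """Compare one probe template against enrolled templates entirely in memory."""
    for person_id, enrolled in database.items():
        if float(np.linalg.norm(probe - enrolled)) < THRESHOLD:
            return person_id  # matched: an alert would be raised downstream
    return None  # non-matched: no retrievable record of the probe persists
```

As discussed below, the Commissioner rejected the premise that this kind of momentary, automated handling falls short of a ‘collection’.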
In considering the use of FRT in Bunnings stores, the Commissioner noted:
- Hundreds of thousands of images were probably processed during the 22 months the FRT was in use;
- Images of non-matched people were held for approximately 4.17 milliseconds;
- People entering Bunnings were enrolled in the system without their knowledge or consent;
- For matched individuals, the consequences of the collection included the prospect of being [Redacted] and potentially being subjected to different and adverse treatment, regardless of their behaviour;
- Matched individuals who were the subject of a ‘false positive’ match were likely treated in the same manner as enrolled individuals, with those consequences aggravated by the fact that they had done nothing to warrant suspicion;
- At the time of implementation, FRT was a relatively novel technology in a retail setting, and involved ways of collecting personal information that differed from what individuals entering the stores were otherwise accustomed to or might have expected.
What is collection?
The OAIC’s guidance provides that ‘collection’ applies broadly, and includes gathering, acquiring or obtaining personal information from any source and by any means, including from biometric technology such as voice or facial recognition.
The Privacy Act provides that an entity ‘collects’ personal information only if the entity collects the personal information for inclusion in a record or generally available publication. A ‘record’ includes a document or an electronic or other device.
Considering the 4-step matching process performed by the FRT system, the Commissioner was satisfied that Bunnings ‘collected’ personal information in order to execute that process.
The Commissioner was also satisfied that the personal information was collected for inclusion in a record. While the match was being processed, the data (the CCTV footage data, still images and vector sets) was stored in the supplier’s server. Data was also saved to Bunnings servers.
In relation to the transient nature of dealing with non-matched individuals’ data, the Commissioner found:
Notwithstanding the respondent’s contentions, it does not matter for the purposes of establishing ‘collection’ that the information of non-matched individuals was held momentarily before being deleted, or that the matching process was conducted automatically and without human intervention.
Collection of sensitive information without consent
Bunnings also contended that, if there was a collection of sensitive information, no consent was required because it could rely on the ‘permitted general situations’ exception under section 16A. The two relevant ‘permitted general situations’ were:
- Unlawful activity or misconduct situation:
- the entity has reason to suspect that unlawful activity, or misconduct of a serious nature, that relates to the entity’s functions or activities has been, is being or may be engaged in, and
- the entity reasonably believes that the collection, use or disclosure is necessary in order for the entity to take appropriate action in relation to the matter [emphasis added].
Examples of where this permitted general situation might apply are the collection of sensitive information by:
- an APP entity that is investigating fraudulent conduct by a professional adviser or a client in relation to the entity’s functions or activities
- an agency that is investigating a suspected serious breach by a staff member of the Australian Public Service Code of Conduct.
- Serious threat situation:
- it is unreasonable or impracticable to obtain the individual’s consent to the collection, use or disclosure, and
- the entity reasonably believes that the collection, use or disclosure is necessary to lessen or prevent a serious threat to the life, health or safety of any individual, or to public health or safety [emphasis added].
Examples of where this permitted general situation might apply are:
- collecting health information about an individual who is seriously injured, requires treatment and, due to their injuries, cannot give informed consent, on the basis that it is impracticable to obtain the individual’s consent
- collecting sensitive information about a parent that is required to provide assistance to a child who may be at risk of physical or sexual abuse by the parent, on the basis that it would be unreasonable to obtain the parent’s consent.
Both of these permitted general situations require a reasonable belief that the collection of personal information was necessary for the relevant purpose (i.e. to take appropriate action in relation to suspected unlawful activity or misconduct, or to lessen or prevent a serious threat to life, health or safety).
Necessity of collecting via FRT
A key issue in the Bunnings decision was whether the collection of biometric information was necessary for the particular purpose. Bunnings contended that ‘necessary’ in this context means ‘reasonably appropriate and adapted.’
In the absence of any judicial authority or guidance on the interpretation of ‘necessary’ in the context of the permitted general situations, the Commissioner considered the following factors relevant in determining whether the use of FRT is necessary:
- Suitability: The suitability of the FRT system in addressing the relevant activity or conduct
- Alternatives: The alternatives available to address the relevant activity or conduct
- Proportionality: Whether the use of the FRT system is proportionate to the outcome achieved. An organisation will need to balance the privacy impacts of collecting and holding sensitive information against the benefits of the use of the FRT system.
Unlawful activity – Suitability
Although not strictly required, the Commissioner went on to consider the suitability of the FRT system, and in particular its effectiveness in addressing unlawful activity, which she regarded as relevant to the reasonableness of Bunnings’ belief that the collection of personal information via the FRT system was necessary.
Bunnings produced evidence of the range of alternative tools and techniques it had reviewed and implemented to take appropriate action in respect of unlawful activity in its stores. The Commissioner referred to these measures but was not convinced that they supported the reasonableness of implementing the FRT system:
While the FRT system was perhaps the most efficient and cost effective of those tools and techniques in certain situations and its operation provided a sense of comfort to the respondent’s staff, it was also the most privacy intrusive option available to the respondent because, notwithstanding the functionality which enabled the prompt deletion of data relating to non-matched individuals, it impacted a much broader cohort than any of the available alternatives. In my view, the fact that the FRT system was an additional and complementary tool available to the respondent was not of itself sufficient in the circumstances to induce a reasonable belief that the collection of personal information via the FRT system was necessary.
Unlawful activity – Alternatives
The Commissioner looked at whether the desired outcomes could have been achieved by alternative means. Possible alternatives to the use of FRT include:
- the presence of in-store security,
- staff training,
- quality CCTV coverage,
- the issuing of ‘prohibition notices’,
- close engagement with law enforcement agencies,
- and the use of [Redacted] technology.
Unlawful activity – Proportionality
The Commissioner also looked at the proportionality of the use of FRT, as part of assessing the reasonableness of Bunnings’ belief as to the necessity of its use.
The FRT system involved capturing and processing the facial images of every individual who entered a relevant store during the relevant period, regardless of their age, appearance, demeanour or intentions.
Of the thousands of individuals who entered a relevant store on each day of the relevant period and whose facial images were collected by the FRT system, the Database contained 448 enrolled individuals at its peak and returned 909 matches (although those figures would have increased over time and if the FRT system had been implemented more broadly).
The Commissioner found that, of the significant volume of personal information collected via the FRT system, it was useful on a relatively small number of occasions and in respect of a relatively small number of individuals. She noted that many of the events that occurred in Bunnings’ stores did not involve enrolled individuals or conduct that met the threshold of an unlawful activity. She said:
I am not satisfied that the respondent could have reasonably believed that the collection of personal information via the FRT system was necessary in circumstances where it involved the wholesale and indiscriminate collection of personal information that also constituted sensitive information in order to take appropriate action in respect of actual or suspected unlawful activity by a relatively small number of individuals and in a limited set of circumstances. This is particularly so because individuals were not adequately notified that the FRT system was in operation and therefore had no control over how their personal information was handled. To find otherwise would arguably undermine the balance sought to be achieved by the objects of the Privacy Act in respect of the rights of individuals and the interests of entities.
Serious threat to life – Suitability
The Commissioner noted that the FRT could identify only known offenders and was not suited to addressing situations where people entered a store brandishing a weapon, which reduced its suitability for preventing serious threats to life.
Serious threat to life – Alternatives
The Commissioner considered that the same alternatives discussed in relation to preventing unlawful activity were also relevant to preventing threats to life.
Serious threat to life – Proportionality
It was the Commissioner’s view that the impact on the privacy of individuals outweighed the benefits that were or could be realised by the use of the FRT system in respect of lessening or preventing serious threat situations.
Outcome for Bunnings
The declarations made by the Commissioner were limited to requiring that Bunnings:
- not repeat or continue the collection;
- issue a public statement covering the decision within 30 days;
- retain all information collected for a period of 12 months and then delete it; and
- provide written confirmation to the OAIC when the information has been deleted.
There was no award of damages but, by requiring that Bunnings retain the data for 12 months, the OAIC is giving affected individuals the opportunity to seek compensation for any damage suffered.
What do you need to think about?
The Bunnings decision and recent guidance issued by the OAIC provide valuable insights into what should be considered when deploying FRT.
The key principles to consider before using FRT for facial identification include:
- Necessity and proportionality (APP 3) – personal information for use in FRT must only be collected when it is necessary and proportionate in the circumstances and where the purpose cannot be reasonably achieved by less privacy intrusive means.
- Consent and transparency (APP 3 and 5) – individuals need to be proactively provided with sufficient notice and information to allow them to provide meaningful consent to the collection of their information.
- Accuracy, bias and discrimination (APP 10) – organisations need to ensure that the biometric information used in FRT is accurate and steps need to be taken to address any risk of bias (or discrimination).
- Governance and ongoing assurance (APP 1) – organisations who decide to use FRT need to have clear governance arrangements in place, including privacy risk management practices and policies which are effectively implemented, and ensure that they are regularly reviewed.
More specifically, factors to consider include:
- What is the primary purpose of collecting the information?
- How will the biometric information be used, stored and secured in undertaking a function or activity?
- Can you undertake the function or activity without collecting the biometric information?
- Can the purpose be achieved by less intrusive means? Have you considered other alternative means?
- Have you identified and assessed the benefits and privacy risks? Do the benefits to be achieved clearly outweigh the privacy risks, and why?
- Is there a clear public interest in using FRT? Examples may include to lessen or prevent a serious threat to public health or safety.
- Would an individual reasonably expect FRT to be used in the circumstances? Will the use of FRT lead to unjustified adverse effects, such as unjust discrimination?
To achieve all of the above, organisations should adopt a privacy-by-design approach, which should ensure that the key principles above are all implemented to support the appropriate use of sensitive information when using FRT.
If you are thinking about using FRT, alternatives for monitoring safety and security concerns may include:
- Quality CCTV coverage
- The deployment of security guards, including covert security guards
- Training employees in dealing with safety and security issues, and
- Close engagement with law enforcement
If FRT is deployed, you should regularly consider whether the benefits of using FRT have been realised and whether the use of the technology is still needed, including whether any anticipated privacy risks have arisen.
And remember, the use of FRT is very much on the Commissioner’s radar and her position has been made clear. Take care when implementing …
Resources
Commissioner initiated investigation into Bunnings Group Limited (Privacy) [2024] AICmr
Bunnings determination factsheet (PDF, 471 KB)
Facial recognition technology: A guide to assessing the privacy risks
Facial Recognition, Technology and Privacy Fact Sheet
Appendix 1 – FRT matching process used
The FRT system, via Closed Circuit Television (CCTV) that operated at the entry points of relevant stores, captured the facial image of every person who entered a relevant store during the relevant period, regardless of their age or other characteristics. This included customers, staff, visitors and contractors. The individuals’ facial images on the live CCTV footage were analysed by the FRT system to create a ‘real-time facial image’. The respondent used the facial image of each individual, through the application of an algorithm, to create ‘searchable data’ in relation to that individual’s facial image.
The FRT system involved an initial sequence of four operations performed on a computer server:
- Step 1: Video decoding – each frame of the CCTV video was separated into still images.
- Step 2: Facial recognition processing – a Gabor filter was applied to each still image to determine whether it contained any images of human faces.
- Step 3: Facial feature calculation processing – where a human face was identified from a still image, vector points of the facial features were extracted to create a vector set.
- Step 4: Comparison processing – each vector set was compared against vector sets previously extracted from the faces of individuals that the respondent had enrolled in a database (the Database) by calculating the relative differences between the location of the vector points in each vector set.
Where Step 4 resulted in a match, an alert was generated. In cases where a match was not generated in respect of an individual’s facial image (non-matched individuals) and there was no alert, the respondent submitted that the facial image was automatically deleted, without human intervention, within an average of 4.17 milliseconds. After the matching process had occurred and the images of non-matched individuals were deleted, the respondent claimed that it was unable to retrieve or re-access the information.
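As a rough illustration only, the four-step sequence might be sketched as the following pipeline. Every helper function is a hypothetical stub (defined here so the sketch runs); the real system applied a Gabor filter at Step 2 and compared the locations of vector points at Step 4, both of which are heavily simplified below.

```python
# Schematic sketch of the four-step matching sequence described above.
# All helpers are hypothetical stubs; real video decoding, face detection
# and feature extraction are far more involved.
import numpy as np

THRESHOLD = 0.6  # illustrative only


def decode_frames(video):
    """Step 1 stub: treat the video as an iterable of frames."""
    return video


def detect_faces(frame):
    """Step 2 stub: assume each frame is a dict carrying pre-cropped face
    data (the real system applied a Gabor filter to locate faces)."""
    return frame.get("faces", [])


def extract_vector_set(face):
    """Step 3 stub: assume faces arrive as numeric feature vectors."""
    return np.asarray(face, dtype=float)


def run_matching(video, database: dict[str, np.ndarray]) -> None:
    for frame in decode_frames(video):                    # Step 1
        for face in detect_faces(frame):                  # Step 2
            vector_set = extract_vector_set(face)         # Step 3
            for person_id, enrolled in database.items():  # Step 4
                if float(np.linalg.norm(vector_set - enrolled)) < THRESHOLD:
                    print(f"alert: possible match for {person_id}")
                    break
            else:
                # Non-match: no alert is generated and the facial data
                # simply goes out of scope; per the decision, such data
                # was deleted within an average of 4.17 milliseconds.
                pass
```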
The FRT system stored in the Database the facial images and associated searchable data of a cohort of individuals who the respondent deemed to pose a risk to its operations. The individuals whose facial images were uploaded into the Database (enrolled individuals) were those who the respondent considered to fall into one or more of the following categories:
- individuals who had engaged in, or were reasonably suspected of engaging in, ‘actual or threatened violence’ to the respondent’s staff or other members of the public;
- individuals who had committed, or were reasonably suspected of committing, ‘Organised Retail Crime’. That is, theft or fraudulent activity that had occurred multiple times, in multiple stores, and by two or more individuals, with the intent of converting proceeds to financial gain;
- individuals who had demonstrated ‘violent, threatening or other inappropriate behaviour’, resulting in the respondent’s staff issuing them with a prohibition notice that ‘formally notified [them] that they were prohibited from entering a Bunnings store’ (prohibited person);
- individuals who had engaged in ‘serious cases of theft’; or
- individuals who had committed criminal conduct….