Privacy Commissioner’s determination on Bunnings’ use of facial recognition technology.

Author: Ray Hong

17/1/25 | Read time: 2 min

The Office of the Australian Information Commissioner (OAIC) recently decided that Bunnings breached Australian privacy laws by using facial recognition technology (FRT) to identify people causing problems in its stores.[1]

While the OAIC provided important guidance on several privacy principles and concepts,[2] the focus of its determination was whether privacy laws permitted the use of FRT in such circumstances.

We consider this aspect of the case in further detail, including its implications for businesses that use (or plan to use) FRT in the same way.

1.     What happened

The case addressed situations where an individual had previously created trouble at a Bunnings store, sometimes amounting to a crime eg theft or assault.

Bunnings used FRT to create an internal record of such individuals and identify them when they revisited Bunnings stores. Bunnings staff were then able to take pre-emptive action in case the individual created trouble or misbehaved again.

2.     Tricky moral question

The central assumption was this: an individual who previously created trouble or misbehaved was likely to repeat their behaviour. In other words, sensitive personal information (facial images) was collected and used to prevent something that may or may not take place, based on past conduct.

Do our privacy laws permit this (or, depending on your perspective of what laws are, should they be interpreted to permit the use of personal information for such situations)?

3.     What do current privacy laws say?

Privacy laws essentially provide 2 routes to collecting and using personal information: with individual consent, or without it, via an applicable legal exception.

In this context, the only viable route was through an applicable legal exception (because it is unrealistic to presume that a misbehaving individual would consent to having their personal information collected and used to reidentify them for future misbehaviour).

Bunnings tried to justify its collection and use of personal information under 2 legal exceptions – that it reasonably believed the personal information was necessary to either prevent a serious threat to its staff and customers or act against unlawful activity or misconduct.

4.     No, you may not

The OAIC disagreed.

The following points make it apparent that, in the OAIC’s view, an individual’s privacy should not be easily displaced so that a business can take pre-emptive security action based on its assumptions about past conduct:

  • Legal exceptions displace privacy in a very ‘specific and confined way’[3] ie the OAIC will interpret them narrowly and strictly.
  • The OAIC appeared to frown on the use of FRT on individuals who were merely suspected of having engaged in potentially unlawful conduct but had not been proven to have done so.
  • It also indirectly alluded to the lack of a rational and consistent approach toward ‘enrolling’ and delisting individuals to be tracked by FRT.[4]
  • FRT was not viewed as a suitable system for addressing unlawful activity or serious threats – it did not work on individuals who had disguised their faces, and it had no deterrent effect.
  • FRT was only 1 of several tools enabling Bunnings to take appropriate action or prevent a serious threat (not an essential tool).
  • FRT was a blunt and privacy-intrusive option, as it indiscriminately collected facial images to ‘match’ against the individuals it was looking for.
  • The benefits derived from FRT use were relatively small (applying only to a small number of individuals and a limited set of circumstances) – the intrusiveness of the technology into the privacy of individuals was disproportionate.
  • Overall, Bunnings did not meet the necessary (narrow and strict) standard that the collection and use of sensitive personal information through FRT was reasonably necessary or essential to act against unlawful activity or prevent a serious threat.

5.     Where does the case leave us?

It throws into serious doubt whether businesses can continue to use FRT for similar pre-emptive purposes, especially in respect of individuals who have not been proven to have engaged in unlawful conduct.

Based on the above points, to stand a (probably remote) chance of falling within either of the 2 exceptions argued, an FRT system would need to:

  • only be used on individuals who have been proven to have engaged in unlawful conduct or serious threats;
  • have a rational basis (policy) for enrolling or delisting individuals;
  • have a deterrent effect;
  • be the essential tool that enables the business to act;
  • only target the individuals in question and not indiscriminately collect facial images;
  • operate in settings that do not limit the effectiveness of the FRT (eg where individuals do not mask their faces).

This is in addition to all the other requirements, such as the need for effective notification under the Australian Privacy Principles.

The question for businesses is whether it is operationally feasible to deploy an FRT system that meets all these requirements, and whether it still makes commercial sense (and at what cost) to do so.

Perhaps not – but was that the outcome the OAIC was aiming for?

6.     Post-script – inconsistent outcomes in different settings?

The Bunnings case addressed the use of FRT in a retail setting. All eyes will now be on the latest use of FRT, albeit in a different setting – at the Australian Open (AO).

Notably, AO’s ticket conditions of sale and entry state that FRT may be used for pre-emptive security actions to ‘identify and deny entry to, or eject, persons who have been removed or denied entry to the AO’ if the organiser ‘reasonably believe it is in the best interests of the safety, security or integrity of the AO to do so’. At face value, this does not appear to meet the Privacy Commissioner’s requirement for a ‘rational basis’.[5]

It is difficult to reconcile the use of FRT at the AO in that manner with the Privacy Commissioner’s determination in the Bunnings case, and questions will undoubtedly arise in that regard.

From an enforcement perspective, Australians will be watching closely to see whether the Privacy Commissioner takes a consistent approach in exercising its investigative powers, or provides guidance on why FRT use for pre-emptive action is permitted at a sporting event but not in a retail setting.[6]

 

[1] Commissioner initiated investigation into Bunnings Group Limited (Privacy) [2024] AICmr 230 (29 October 2024) (Bunnings case)
[2] On ‘collection’, ‘consent’, ‘notification’, that facial images were sensitive biometric personal information, maintenance of personal information and privacy policy requirements under the Australian Privacy Principles
[3] [89] of the Bunnings case
[4] This was addressed in the Commissioner’s consideration of whether Bunnings had taken reasonable steps to maintain the personal information under APP1.2
[5] 17d) of the AO Ticket Conditions of Sale and Entry, Australian Open 2025
[6] The Privacy Commissioner appears to have made a brief comment on this in the Bunnings case at [134]: ‘I do not think that the widespread use of FRT to collect the sensitive information of people at the point of entry to retail stores can be compared to the use of FRT in an airport or sport stadium because those facilities have a different purpose and risk profile.’