Bunnings facial recognition technology legal development

  • By THE BRIEF EDITORIAL
  • Feb 8

Between January 2019 and November 2021, Bunnings Group Ltd operated a facial recognition technology system across multiple hardware store locations in New South Wales and Victoria. The system captured and analysed the faces of customers entering stores, compared them against an internal database of individuals believed to pose a risk, and deleted data when no match was identified.


Background


In late 2024, the Office of the Australian Information Commissioner (OAIC), through the Privacy Commissioner, determined that Bunnings had breached the Privacy Act 1988 (Cth) by collecting sensitive personal information without adequate notice or consent and failing to update its privacy policies to reflect the use of the technology.


Under the Privacy Act 1988, collection of biometric information such as facial images falls within the definition of sensitive information, which triggers enhanced privacy obligations, including transparency, notification, and accountability under the Australian Privacy Principles (APPs).


Bunnings subsequently challenged the Privacy Commissioner’s determination before the Administrative Review Tribunal (ART), seeking review and substitution of the original findings.


Legal issues before the tribunal


The principal legal issues in Bunnings v OAIC involved the application of the Privacy Act and related principles to the company’s use of facial recognition technology:

  1. Whether Bunnings contravened the Privacy Act by collecting sensitive biometric information without consent and failing to take reasonable steps to notify individuals about the collection and use of their personal information.

  2. Whether an exemption or permissible basis existed under the Privacy Act permitting Bunnings to collect and use such information for a specific public safety purpose - namely, the prevention of retail crime and protection of staff and customers.

  3. Whether the privacy impact was proportionate to the stated purpose of the technology, considering the nature of the threat and the mechanisms employed to limit data retention and misuse.


The original 2024 determination by the Privacy Commissioner found breaches of APP obligations, including failure to provide adequate notification and lack of transparency in data handling practices.


Administrative Review Tribunal findings


In February 2026, the Administrative Review Tribunal delivered a ruling in favour of Bunnings on key aspects of the case, substantially altering the legal landscape for facial recognition in the retail context.


Tribunal’s Core Holding

The ART found that Bunnings was reasonably entitled to use facial recognition technology to combat serious and repeat retail crime and protect staff and customers from violence, abuse, and intimidation within its stores.

This replaced the Privacy Commissioner’s earlier conclusion that Bunnings had no lawful basis to collect the biometric data under the Privacy Act.


Proportionate Privacy Impact

In its assessment, the tribunal acknowledged that Bunnings had not fully complied with procedural privacy requirements — notably its failure to properly notify customers that the technology was in operation and to update privacy policies in a timely manner.


Despite these shortcomings, the ART concluded that the privacy intrusion was minimal and proportionate to the legitimate safety and crime-prevention objectives in evidence, which included frequent incidents of theft and threats to staff reported during the trial period.

Unmatched biometric data was deleted almost immediately after capture, and only images matching those of known or banned individuals were retained for the narrow crime-combatting purpose.


Conditions and Safeguards

The tribunal emphasised that lawful use of facial recognition requires ongoing compliance with privacy safeguards, including:

  • Robust notification procedures and signage alerting store visitors to the use of biometric technology.

  • Clear privacy policies reflecting the nature and purposes of data collection.

  • Risk assessments and privacy impact analyses tailored to the specific use of the technology.

  • Effective data handling protocols to minimise misuse and protect individual privacy.


These conditions mirror the Australian Privacy Principles’ focus on transparency and accountability, underscoring that compliance remains central even where an exemption or permissible basis is found.


Post-ruling developments


Following the ART decision, Bunnings announced plans to roll out facial recognition across its store network over the next 18 months, indicating intent to deploy the technology operationally in accordance with the tribunal’s findings and procedural requirements.

The OAIC, while acknowledging the tribunal’s ruling, has publicly stated that it is carefully considering the decision and its implications and may appeal the matter further.


Context within Australian privacy and emerging technology law


The Bunnings case sits within a broader regulatory and legal context in which Australian privacy law is increasingly tested by emerging technologies, including AI and biometric systems. The ART decision provides a reference point for balancing privacy protections under the Privacy Act with public safety and crime-prevention objectives - issues also evident in related privacy determinations against other retailers for similar technology use.


In contrast to indiscriminate surveillance, the tribunal's articulated framework focuses on specific and limited purposes, proportionality of data capture, and the existence of measurable threats justifying the intrusion - all core principles of modern privacy jurisprudence.


Professional significance


The ART’s decision in Bunnings v OAIC represents a significant interpretative development in Australian privacy law as it applies to commercial deployment of biometric technologies:

  • It clarifies that facial recognition and associated AI systems are not per se prohibited under the Privacy Act where a legitimate, narrow purpose exists and privacy protections are robust.

  • It confirms that failure to comply with notification and transparency obligations remains a discrete breach but may not be dispositive where proportionality and public safety are demonstrated.

  • It reinforces the importance of structured privacy impact assessments and governance frameworks in commercial technology adoption - especially where sensitive personal information is involved.


The case will be monitored for potential appeal and may influence future enforcement and legislative reform as regulators, courts, and policymakers engage with the intersection of AI, privacy rights, and commercial security technologies.
