- cross-posted to:
- privacy
- [email protected]
Australia’s Privacy Commissioner Carly Kind determined in 2024 that Bunnings breached privacy laws by scanning hundreds of thousands of customers’ faces without their proper consent.
A review of that decision by the Administrative Review Tribunal of Australia has now found the opposite.
The retailer did not break the law by scanning customers’ identities, but should improve its privacy policy and notify customers of its use of AI-based facial recognition technology, the ruling said.
Pretty typical stuff by this point. The privacy-invading company wins, and a pissweak government makes a few privacy “recommendations” but stops short of enforcing anything.
Fuck. Cunts. Shit’s going to be everywhere tomorrow. (I rarely swear this much, but this.)
Exactly. Kmart ran a similar trial a while back. This ruling will just open the floodgates for every company to roll out AI-powered facial recognition cameras everywhere.
Well, looks like it’s medical face mask, big sunnies, and blank clothing covering my arms and legs, just to go get some nails.
Make sure you don’t drive either. They scan number plates too.
Ok, so put fake plates on before I go as well.
Driving with fake plates is absolutely illegal, for very good reasons.
Only if you get pulled over.
Wear clothing with realistic faces printed on it to confuse the FR. A baseball cap with infrared LEDs in the brim too
The way Bunnings used facial recognition was about as private as it could be … mostly. It is possible to run all of the facial recognition locally, but instead they run it on a central Bunnings-controlled server in Sydney. They only scan for the faces of known offenders, based on previous entanglements.
My concern isn’t surveillance, but mass surveillance. Because this is exclusive to Bunnings it doesn’t quite reach mass surveillance, but because the processing is centralized and Bunnings is so big, it edges dangerously close.
Still, this isn’t Flock, nor Palantir, nor Google.
They only scan for faces of known offenders based on previous entanglements.
How does the system know who’s an offender and who isn’t without scanning their face first?
They look at CCTV, take a violentcustomer.jpeg, and add it to the database.
EDIT: Oh, I see what you’re asking. You misread my quote; what I said was “they only scan for”, not “they only scan”.
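To make the “only scan for known offenders” distinction concrete: a watchlist system scans every face it sees, but only *acts* when the face matches an embedding already in the offender database. Here’s a minimal sketch of that matching step, using random vectors as stand-in embeddings (a real system would derive them from a face-embedding model; all names and the threshold are illustrative, not Bunnings’ actual implementation):

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_watchlist(probe, watchlist, threshold=0.8):
    """Return the ID of the closest watchlist entry above threshold, else None.

    probe: embedding of a face seen on camera.
    watchlist: dict mapping person ID -> stored embedding.
    """
    best_id, best_score = None, threshold
    for person_id, emb in watchlist.items():
        score = cosine_similarity(probe, emb)
        if score >= best_score:
            best_id, best_score = person_id, score
    return best_id

# Stand-in embeddings; a real system would compute these from CCTV stills.
rng = np.random.default_rng(0)
offender = rng.normal(size=128)
watchlist = {"offender-42": offender}

# A noisy re-observation of the same face matches...
assert match_watchlist(offender + rng.normal(scale=0.05, size=128),
                       watchlist) == "offender-42"
# ...while an unrelated face returns no match, so nothing is flagged.
assert match_watchlist(rng.normal(size=128), watchlist) is None
```

The point of the sketch: every passing face still gets embedded and compared (that’s the scanning), but non-matching embeddings produce no hit and can be discarded, which is why it’s “scan for” offenders rather than building a record of everyone.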