Australia’s Privacy Commissioner Carly Kind determined in 2024 that Bunnings breached privacy laws by scanning hundreds of thousands of customers’ faces without their proper consent.

A review of that decision by the Administrative Review Tribunal of Australia has now found the opposite.

The retailer did not break the law by scanning customers’ faces, but should improve its privacy policy and notify customers of its use of AI-based facial recognition technology, the ruling said.

Pretty typical stuff by this point. The privacy-invading company wins, and the pissweak government makes a few privacy “recommendations” but stops short of enforcing anything.

    • freedickpics@lemmy.ml (OP) · 14 hours ago

      Exactly. Kmart did a similar trial a while back. This ruling will just open the floodgates for every company to roll out AI-powered facial recognition cameras everywhere.

  • dumbass@piefed.social · 14 hours ago

    Well, looks like it’s medical face mask, big sunnies and blank clothing covering my arms and legs, just to go get some nails.

  • FoundFootFootage78@lemmy.ml · 13 hours ago

    The way Bunnings used facial recognition was as private as it could be … mostly. It is possible to run all of the facial recognition locally, but instead they run it on a central Bunnings-controlled server in Sydney. They only scan for faces of known offenders based on previous entanglements.

    My concern isn’t surveillance, but mass surveillance. Because this is exclusive to Bunnings, it doesn’t quite reach mass surveillance; but because the processing is centralized and Bunnings is so big, it edges dangerously close.

    Still, this isn’t Flock, nor Palantir, nor Google.

    • ageedizzle@piefed.ca · 13 hours ago

      > They only scan for faces of known offenders based on previous entanglements.

      How does the system know who’s an offender and who isn’t without scanning their face first?

      • FoundFootFootage78@lemmy.ml · 13 hours ago

        They look at CCTV, take a violentcustomer.jpeg, and add it to the database.

        EDIT: Oh, I see what you’re asking. You misread my quote: what I said was “they only scan for”, not “they only scan”.
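
        The distinction being drawn in this subthread — every face gets scanned, but only faces matching the offender watchlist are acted on and everything else is discarded — can be sketched in a few lines. This is a hypothetical illustration, not Bunnings’ actual system: the embedding vectors, the `violentcustomer.jpeg` watchlist entry, and the similarity threshold are all made up for the example.

        ```python
        # Hypothetical watchlist-matching sketch. A real system would produce
        # face embeddings with a neural network; here we just use toy vectors.
        import math

        def cosine_similarity(a, b):
            dot = sum(x * y for x, y in zip(a, b))
            norm_a = math.sqrt(sum(x * x for x in a))
            norm_b = math.sqrt(sum(x * x for x in b))
            return dot / (norm_a * norm_b)

        # Watchlist: embeddings of known offenders, e.g. cropped from CCTV.
        WATCHLIST = {
            "violentcustomer.jpeg": [0.9, 0.1, 0.3],
        }

        THRESHOLD = 0.95  # assumed match cutoff, made up for this sketch

        def check_visitor(embedding):
            """Return the matched offender id, or None.

            Every visitor's face is scanned (embedded), but a non-matching
            embedding is simply discarded -- no record of the visitor is kept.
            """
            for offender_id, reference in WATCHLIST.items():
                if cosine_similarity(embedding, reference) >= THRESHOLD:
                    return offender_id
            return None

        print(check_visitor([0.9, 0.1, 0.3]))  # matches the watchlist entry
        print(check_visitor([0.1, 0.9, 0.0]))  # no match, embedding discarded
        ```

        So “they only scan for faces of known offenders” means everyone passes through `check_visitor`, but only watchlist hits produce any stored result.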