• UnspecificGravity@piefed.social · 2 days ago

        Right. If you built a robot and it killed someone, you would go to jail. Do you think anyone at Waymo is going to jail if their car kills someone?

        • SaveTheTuaHawk@lemmy.ca · 2 days ago

          That’s like saying the CEO of GM should go to jail for every asshole who runs over someone with his Silverado because it has a 16 ft front blind spot.

          Waymo is liable in civil court. The NHTSA should investigate, but they got DOGEd.

          • UnspecificGravity@piefed.social · 2 days ago

            Thank you. That is the exact point I am making.

            At some point criminal liability just vanishes when enough money is involved. If I sell a robot that does a crime, I go to jail. If another guy and I build a robot that does crimes, then we are both going to jail. But at some point no one goes to jail, because laws just sorta don’t apply to certain entities.

            If a person does a crime on BEHALF of those entities, they are going to jail. But if the entity itself does a crime, well suddenly there aren’t criminal penalties anymore.

            Which is why we shouldn’t let those entities do things like operate autonomous vehicles on the street.

    • greygore@lemmy.world · 1 day ago

      Assuming that the facts in the article are true (I’m more inclined to believe Waymo than Tesla)… it hit the child at 6 mph after “hard braking” from 17 mph when the child popped out unseen from behind a tall SUV. That doesn’t seem terribly reckless to me?

      I’m all for a conversation about responsibility when dealing with autonomous systems, as well as one on regulating self-driving cars before they’re exposed to the general public (much less small children), but this doesn’t seem like the egregious case to suddenly come to some kind of global or national reckoning.

      • UnspecificGravity@piefed.social · 1 day ago

        The event described is one with cars tightly packed on both sides and double-parked, with kids actively loading and unloading. 17 mph would feel awful fast to a human driver in that situation.

  • rafoix@lemmy.zip · 2 days ago

    Why wasn’t the child wearing Waymo approved NFC clothing for the vehicle to properly detect? I hope Waymo sues that irresponsible child’s family.

    /s

  • Noxy@pawb.social · 1 day ago

    The young pedestrian “suddenly entered the roadway from behind a tall SUV”

    Yet another reason why the proliferation of SUVs is fucking bad. Total tangent, I know, but still.

    • Echo Dot@feddit.uk · 1 day ago

      Oh and they always have tinted windows as well so you can’t look through the windscreen to see if there’s anyone there. It’s bad enough when you’re driving behind them and they suddenly slam on the brakes and you can’t see in front of them so you have no idea what’s going on. There could be a kaiju attack in progress for all you know.

  • kibiz0r@midwest.social · 1 day ago

    World 1: 40,000 people die a year from traffic accidents, but the driver is always a person. The courts may or may not find them liable, but the victims’ families at least have a name and face, and an explanation of events and motivations that they can rely on to find peace.

    World 2: 4,000 people die a year from traffic accidents, but the driver is always an opaque ML model. Nobody is named, let alone found liable. And even if you had a name, there is nobody who can explain the reasoning of the model to you.

    I am not sure World 2 is unquestionably preferable.

    I think we’re missing something about justice when we only look at how many people die and not also how the survivors are able to make sense (or not) of their loss.

  • Echo Dot@feddit.uk · 1 day ago

    The young pedestrian “suddenly entered the roadway from behind a tall SUV, moving directly into our vehicle’s path”

    This is why autonomous vehicles aren’t ready for use yet. A human driver would know that being near a school at that particular time means you should be extra cautious and looking out for things like that. If a large vehicle is blocking your view of the road, you don’t automatically assume that it’s safe; you assume the opposite and go slowly.

  • the_riviera_kid@lemmy.world · 2 days ago

    They made a statement not too long ago that we need to get used to the idea of driverless cars hitting pedestrians, and I commented that they know it’s inevitable and are trying to cover their asses in advance. And here we are: they ran over a kid.

    • Ech@lemmy.ca · 2 days ago

      How many children get hit and killed by human drivers every year? Why are you only worried now?

      • eleijeep@piefed.social · 2 days ago

        Humans who drive dangerously or while intoxicated are held responsible. Even humans who are driving normally and sober can be held responsible if the law decides that they were at fault. Fatal accidents are investigated by people whose job is specifically to investigate vehicle accidents.

        Who is being held responsible when an automated car kills someone?

        • Ech@lemmy.ca · 1 day ago

          Even humans who are driving normally and sober can be held responsible if the law decides that they were at fault. Fatal accidents are investigated by people whose job is specifically to investigate vehicle accidents.

          That’s a good question, but it’s important to keep in mind that a lot of the public judgement here is based on a single headline, not a thorough investigation. The world is messy, and even the best technology will get into accidents for any number of reasons, including children dashing into the road.

          In the case where the technology in the car was found to be at fault, I agree that there needs to be a responsible party, but it’s not exactly straightforward. Punishing the driver seems counterintuitive, but may be useful in getting owners to be diligent in selecting safer vehicles. And punishing the company is complicated - if the punishment is just a fine, it will simply be rolled up into their expected expenses. If it’s too damaging, it will chill development of automated driving entirely. I’m not really sure what the best solution is.

  • altphoto@lemmy.today · 1 day ago

    In racist America the question must be asked… The car was white; what color was the kid? /s (which means this is a sarcastic, cynical comment)

  • idriss@lemmy.ml · 1 day ago

    Go check the HN comments; the delusion is through the roof, as always. They flipped the story into “a human could have done worse, so this is a good outcome.” That’s like saying: I punched you for no reason, but somebody else would have done worse, so I’m good.