In the piece, titled "Can You Fool a Self Driving Car?", Rober found that a Tesla on Autopilot was fooled by a Wile E. Coyote-style wall painted to look like the road ahead of it, with the electric vehicle plowing right through it instead of stopping.

The footage was damning enough, with slow-motion clips showing the car not only crashing through the styrofoam wall but also striking a child-sized mannequin. The Tesla was also fooled by simulated rain and fog.

  • FuglyDuck@lemmy.world · 23 hours ago

    Thanks for that.

    The thing is, though, the NHTSA generally doesn’t make a determination on criminal or civil liability. They’ll make a report about what happened, keep it to the facts, and let the courts sort out who’s at fault. They might not even actually investigate a crash unless it comes to that. The rule just says “when your car crashes, you need to tell us about it,” and they kinda assume companies comply.

    Which Tesla doesn’t want to do, and is one of the reasons Musk/DOGE is going after the agency.

    • NotMyOldRedditName@lemmy.world · 22 hours ago (edited)

      I knew they wouldn’t necessarily investigate it, that’s always at their discretion, but I had no idea there was no actual bite to the rule if a company didn’t comply. That’s stupid.

      • AA5B@lemmy.world · 8 hours ago

        Generally, things like that are meant more to identify a pattern. It may not be useful to an individual, but it’s very useful for determining a recall or supporting a class action.