In the piece, titled "Can You Fool a Self Driving Car?", Rober found that a Tesla on Autopilot was fooled by a Wile E. Coyote-style wall painted to look like the road ahead, with the electric vehicle plowing right through it instead of stopping.

The footage was damning enough, with slow-motion clips showing the car crashing through not only the styrofoam wall but also a mannequin of a child. The Tesla was also fooled by simulated rain and fog.

  • NotMyOldRedditName@lemmy.world · 23 hours ago

    I knew they wouldn't necessarily investigate it, since that's always at their discretion, but I had no idea the rule had no actual bite if they didn't comply. That's stupid.

    • AA5B@lemmy.world · 9 hours ago

      Generally, things like that are meant more to identify a pattern. It may not be useful to an individual, but it's very useful for justifying a recall or supporting a class action.