• GreenKnight23@lemmy.world · 13 hours ago

    it’s funny because I can run a Coral TPU on 4 GB that can identify obstacles in live streams.

    I’m a fucking genius for figuring it out. make me the CTO of Micron and I will share my knowledge.

  • humanspiral@lemmy.ca · 18 hours ago

    This is not credible - it reads like self-promoting stock pump-and-dump PR. Vision AI models are smaller than text models; they need fast GPUs, but less memory. Narrow-purpose AI/neural-network models need less memory because memory is mostly about storing facts rather than logic/reasoning capability. Current LLM breakthroughs in benchmark score per GB are coming more from smaller models than from the largest frontier models. 32 GB is a reasonable ceiling for the memory requirement, and robots can swap in task-specific AI models as well.
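    The memory claim above is easy to sanity-check with back-of-envelope arithmetic: a model’s weight footprint is roughly parameter count times bytes per parameter. A minimal sketch (illustrative numbers, not figures from the article):

```python
# Back-of-envelope model-memory arithmetic (illustrative numbers only):
# weight footprint ≈ parameter count × bytes per parameter.

def model_memory_gb(params_billions: float, bits_per_param: int) -> float:
    """Approximate weight memory in GB (ignores activations and KV cache)."""
    return params_billions * 1e9 * bits_per_param / 8 / 1e9

# A 7B-parameter model at 8-bit quantization needs ~7 GB for weights;
# at 4-bit it needs ~3.5 GB -- well under a 32 GB ceiling.
print(model_memory_gb(7, 8))  # 7.0
print(model_memory_gb(7, 4))  # 3.5
```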

  • sleet01@lemmy.ca · 23 hours ago

    Ah, yes, the ol’ “X needs 300 of whatever it is I sell” gambit.

  • Blackmist@feddit.uk · 23 hours ago

    Nah, it just needs a team of Indian guys to step in whenever the collision alarms go off.

  • IphtashuFitz@lemmy.world · 22 hours ago

    Didn’t Musk promise, like a decade ago, that Tesla self-driving would run fine on their “hardware v2” computer, then a few years later that it would require v3, and then v4, before he finally stopped making such promises?

    • Voroxpete@sh.itjust.works · 20 hours ago

      Micron makes RAM. I don’t think we should give any more credence to their claims than we do to Elon’s. Their goal here is to pump their share price, nothing more.

    • Omgpwnies@lemmy.world · 2 days ago

      With the current level of tech in a car, you’re already likely pushing 300GB in total. There are dozens of high-compute ECUs doing all sorts of things, running some *nix OS and using anywhere from a couple GB to well… way more.

      To reach full driverless capability, those will need to become more powerful, the software will require more memory, and the number of compute modules will likely increase as well for sensors and other stuff.

      300GB IMO is probably a conservative estimate.

      • GamingChairModel@lemmy.world · 22 hours ago

        With the current level of tech in a car, you’re already likely pushing 300GB in total.

        The actual article (and the call it is reporting on, with statements from the CEO) says that 16GB is the average in new cars today. No need to make stuff up.

      • YiddishMcSquidish@lemmy.today · 1 day ago

        I’m not trying to sound angry at you, but I’m told I come off that way, so please let me start with an apology in advance.

        We have the ESP32 in very common circulation. We have seen what is required to keep a thing fucking airborne, and it is so far beyond what I thought was possible twenty years ago. And they did it with <1 GB.

        • Omgpwnies@lemmy.world · 1 day ago

          No worries, nothing grouchy sounding there :)

          My statement comes from working in R&D in the automotive industry on these modules… an ESP32 does not come close to the amount of computing resources needed to move and process the absolute boatload of information required to make decisions for autonomous driving.

          Flying around doesn’t need the same level of object detection, path-finding, decision-making and so on that a vehicle capable of killing anyone in or around it needs. And on top of that, it has to do all of this at highway speeds, without ever making a mistake - because of the killing-everyone-in-or-around-it part.

          Further, it needs to deal with all the random stuff all those people are doing around it, all the time… again, without ever making a mistake.

          So it needs to see something, identify whether it’s something to be concerned about, figure out if that thing might be doing something that needs to be addressed, make a plan, then execute it… in a few milliseconds, with a virtually unlimited number of potential obstacles, while obeying traffic laws, and still get the occupant to their destination.

          Without killing anyone.

          And that’s just the ADAS subsystem.
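          The loop described above can be sketched roughly like this (a hypothetical structure for illustration - the function names and the 50 ms budget are my assumptions, not real ADAS internals):

```python
# Rough sketch of the see -> identify -> assess -> plan -> execute cycle
# described above, with a hard per-cycle latency budget. All names and the
# 50 ms figure are assumptions for illustration, not real ADAS internals.
import time

BUDGET_S = 0.050  # assumed cycle budget; at 100 km/h a car moves ~1.4 m in 50 ms

def adas_cycle(frame, detect, assess, plan, act) -> bool:
    """Run one perception/decision cycle; return True if it met the budget."""
    start = time.monotonic()
    obstacles = detect(frame)    # "see something"
    threats = assess(obstacles)  # "is it something to be concerned about?"
    maneuver = plan(threats)     # "make a plan"
    act(maneuver)                # "execute it"
    return time.monotonic() - start <= BUDGET_S

# Trivial stand-ins just to show the call shape:
ok = adas_cycle(frame=None,
                detect=lambda f: [],
                assess=lambda o: [],
                plan=lambda t: None,
                act=lambda m: None)
```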

          • YiddishMcSquidish@lemmy.today · 1 day ago

            Yeah, I guess I didn’t take into account that being airborne is easier when it comes to encountering randomly moving objects.

            But 300 gigs is still a bad number on at least two different levels.

  • BaraCoded@literature.cafe · 2 days ago (edited)

    Each will also need a portable nuclear reactor and a swimming pool filled with the blood of innocents and ice cubes made out of children’s tears, for cooling purposes.