• ☆ Yσɠƚԋσʂ ☆@lemmygrad.mlOP
    3 days ago

    That’s what I’m thinking too. There’s no reason why you couldn’t make a chip like this for a full-blown DeepSeek model, and then when new models come out you just print new chips for them. The really nice part is that their approach doesn’t need DRAM either, because the state of each transistor acts as memory; it just needs a bit of SRAM, which we don’t have a shortage of.
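    To get a feel for why baking the weights into the transistors removes the DRAM requirement, here’s a back-of-envelope sketch. All the numbers are hypothetical (a 7B-parameter model, 8-bit precision, a 4096-wide hidden dimension, 2048-token context), not figures from the chip in the article:

    ```python
    # Rough memory arithmetic (hypothetical numbers): if the weights are
    # physically fixed in the chip, the only RAM needed at inference time
    # is a working buffer for activations, which is tiny by comparison.

    def weight_bytes(params: float, bits: int) -> float:
        """Memory the weights would need if stored conventionally."""
        return params * bits / 8

    def activation_bytes(hidden_dim: int, seq_len: int, bits: int) -> float:
        """Rough working buffer for one layer's activations."""
        return hidden_dim * seq_len * bits / 8

    # Hypothetical 7B-parameter model at 8-bit precision:
    weights = weight_bytes(7e9, 8)          # ~7 GB: normally lives in DRAM
    acts = activation_bytes(4096, 2048, 8)  # ~8 MB: plausibly fits in SRAM

    print(f"weights: {weights / 1e9:.1f} GB, activations: {acts / 1e6:.1f} MB")
    # → weights: 7.0 GB, activations: 8.4 MB
    ```

    The gap is roughly three orders of magnitude, which is why moving the weights into the silicon itself leaves only an SRAM-scale working set.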

    I’m fully convinced that the whole AI-as-a-service business model is going to be very short-lived. Ultimately, nobody really likes their data going out to some company, or having to pay subscription fees to use the models. If we start getting these kinds of specialized chips, they’re going to be a game changer.

    • CriticalResist8@lemmygrad.ml
      1 day ago

      I could, however, totally see an economy where the chips themselves, while cheap to produce, cost a premium based on the model and number of parameters.

      Because the tech is certainly impressive and they have a proof of concept. I don’t know how scalable this is for them (or others), but it clearly works and shows immediate advantages. If it could integrate with existing consumer hardware, say a PCIe card you plug the chip into and swap out when you want to change the model, anybody could easily have this at home.

      But with capitalism we’d probably have to settle for DRM’d chips that self-destruct after X many tokens generated lol.