• ☆ Yσɠƚԋσʂ ☆@lemmygrad.mlOP
    2 days ago

    Sure, but it’s still expensive for most people to get frontier-model performance locally. I expect that within a few years models will be optimized enough that even ones running on modest hardware can do everything a current frontier model does. That’s going to be the big game changer, because at that point there won’t be much demand for models as a service.