• SillySausage@lemmynsfw.com · 3 hours ago

      I successfully ran a local Llama model with llama.cpp and an old AMD GPU. I’m not sure why you think there’s no other option.
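
      For anyone wanting to reproduce this, here’s a rough sketch using the llama-cpp-python bindings (the model path is a placeholder, and it assumes a llama.cpp build with a GPU backend such as HIP/ROCm or Vulkan; otherwise it silently runs on the CPU):

      ```python
      # Sketch: loading a GGUF model with llama-cpp-python and offloading
      # layers to the GPU. The model path below is a placeholder; offload
      # only hits the card if llama.cpp was built with a GPU backend.
      from llama_cpp import Llama

      llm = Llama(
          model_path="./models/llama-2-7b.Q4_K_M.gguf",  # placeholder
          n_gpu_layers=-1,  # offload every layer that fits
          n_ctx=2048,
      )

      out = llm("Q: Name the planets in the solar system. A:", max_tokens=64)
      print(out["choices"][0]["text"])
      ```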

  • GraveyardOrbit@lemmy.zip · 2 hours ago

        AMD had approximately one consumer GPU with ROCm support, so unless your framework supports OpenCL, or you want to fuck around with unsupported ROCm drivers, you’re out of luck. They’ve completely failed to meet the market.
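
        If you want to see what your stack actually exposes through OpenCL before giving up, a quick sketch with pyopencl (assuming some OpenCL ICD for the card is installed, e.g. Mesa’s Rusticl):

        ```python
        # Sketch: listing the OpenCL platforms/devices the driver stack
        # exposes, to check whether the card is usable without ROCm.
        import pyopencl as cl

        for platform in cl.get_platforms():
            print(platform.name)
            for device in platform.get_devices():
                print("   ", device.name)
        ```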

    • piccolo@sh.itjust.works · 1 hour ago

          I mean… my 6700 XT doesn’t have official ROCm support, but the ROCm driver works perfectly fine with it. The difference is AMD hasn’t put the effort into testing ROCm on their consumer cards, so they can’t claim to support them.
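
          The usual workaround is overriding the GFX target so ROCm treats the card like a supported one; a sketch below (the 10.3.0 value matches gfx1030-class RDNA2 cards like the 6700 XT, treat it as an assumption for anything else):

          ```python
          # Sketch: set the override before the ROCm runtime initializes,
          # i.e. before importing any ROCm-backed library.
          import os
          os.environ["HSA_OVERRIDE_GFX_VERSION"] = "10.3.0"  # RDNA2 assumption

          import torch  # ROCm builds of PyTorch report through torch.cuda

          print(torch.cuda.is_available())      # True if the override took
          print(torch.cuda.get_device_name(0))  # e.g. "AMD Radeon RX 6700 XT"
          ```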