I’m connecting to llama.cpp on my laptop from my phone via Tailscale, but when my laptop sleeps I can no longer reach it from my phone.

What are y’all using for this? Thanks!

  • BurnedDonutHole@ani.social · 7 days ago

    Use https://anythingllm.com/ on your mobile with its own local models? You’ll be limited by your phone’s hardware, but it will do in a pinch when you can’t reach or use your laptop. It’s open source and everything runs locally.

    • venusaur@lemmy.worldOP · 6 days ago

      Thanks! I’ll try it out. I’m on an old phone and resistant to switching to a bigger one.