I’m connecting to llama.cpp on my laptop from my phone via Tailscale, but when my laptop sleeps I can’t access it anymore on my phone.
What are y’all using for this? Thanks!
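For context, this is roughly what the phone side is doing (a minimal sketch, not my exact setup; the Tailscale MagicDNS hostname, port, and helper name are placeholders). It hits llama-server’s OpenAI-compatible endpoint over the tailnet and works fine until the laptop suspends and drops off Tailscale:

```python
# Minimal sketch of phone -> Tailscale -> llama.cpp (llama-server).
# Placeholders: "laptop.tailnet.ts.net" and port 8080 stand in for my real setup.
import requests

LAPTOP = "http://laptop.tailnet.ts.net:8080"  # laptop's Tailscale address

def ask(prompt: str) -> str:
    """Send one chat turn to llama-server's OpenAI-compatible endpoint."""
    resp = requests.post(
        f"{LAPTOP}/v1/chat/completions",
        json={
            "messages": [{"role": "user", "content": prompt}],
            "max_tokens": 256,
        },
        timeout=30,
    )
    resp.raise_for_status()  # this is what fails once the laptop goes to sleep
    return resp.json()["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(ask("Hello from my phone"))
```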
Use https://anythingllm.com/ on your mobile with its own local models? You’ll be limited by your phone’s hardware, but it will do in a pinch when you can’t reach or use your laptop. It’s open source and everything works locally.
Thanks! I’ll try it out. I’m on an old phone and resistant to switching to a bigger one.