Hi, I currently have a spare GeForce GTX 1060 lying around collecting dust. I'm planning to use it with Ollama [https://ollama.com/] for self-hosting my own AI model, or maybe even for AI training. The problem is, none of my home lab devices have a PCIe slot compatible with the GPU. My current setup includes:

- Beelink MINI S12 (Intel Alder Lake N100)
- Raspberry Pi 5
- Le Potato AML-S905X-CC
- Pi Picos

I'd like to hear recommendations or experiences with external GPU docks that I could use to connect the GPU to my home lab setup. Thanks.
What @[email protected] said, but the adapters arent cheap. You’re going to end up spending more than the 1060 is worth.
A used desktop you can slap it in and turn on as needed might make sense? Doubly so if you can find one with an RTX 3060, which would open up 32B models with TabbyAPI instead of Ollama. Some people configure them to wake on LAN and boot an LLM server, as sketched below.
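For the wake-on-LAN part, here's a rough Python sketch of what that could look like: send a standard magic packet, wait for the box to boot, then hit Ollama's /api/generate endpoint. The MAC address, IP, and model name are placeholders for your own setup; 11434 is Ollama's default port.

```python
import json
import socket
import time
import urllib.request

# Placeholder values -- swap in your desktop's MAC address and the
# address your Ollama server listens on (11434 is Ollama's default).
MAC = "aa:bb:cc:dd:ee:ff"
OLLAMA_URL = "http://192.168.1.50:11434"

def wake(mac: str) -> None:
    """Send a Wake-on-LAN magic packet: 6 x 0xFF followed by the
    target MAC repeated 16 times, broadcast over UDP port 9."""
    payload = bytes.fromhex("FF" * 6 + mac.replace(":", "") * 16)
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        s.sendto(payload, ("255.255.255.255", 9))

def ask(prompt: str, model: str = "llama3") -> str:
    """Query Ollama's /api/generate endpoint once the box is up."""
    req = urllib.request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=json.dumps({"model": model, "prompt": prompt,
                         "stream": False}).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    wake(MAC)
    time.sleep(60)  # crude fixed wait; polling the port is nicer
    print(ask("Say hello from the home lab."))
```

You could run something like this from the N100 box or the Pi 5, since sending the packet and making the HTTP call costs basically nothing; the GPU machine only draws power while it's actually serving.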