Hi, I currently have a spare GeForce 1060 lying around collecting dust.
I'm planning to use it with Ollama to self-host my own AI model, or maybe even for some light AI training.
Problem is, none of my home lab devices have a PCIe slot or other compatible connection for the GPU. My current setup includes:
- Beelink MINI S12 Intel Alder Lake N100
- Raspberry Pi 5
- Le Potato AML-S905X-CC
- Pi Picos
I'd like to hear recommendations or experiences with external GPU docks I can use to connect this GPU to my home lab setup. Thanks!
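For context, once the card is connected, the Ollama side is simple. Here's a minimal sketch of querying a self-hosted instance over its HTTP API, assuming Ollama is running on its default port (11434); the host IP and the llama3.2 model tag are placeholders for whatever you end up using:

```python
import requests

# Placeholder host: whichever lab machine the 1060 ends up attached to.
OLLAMA_HOST = "http://192.168.1.50:11434"

def generate(prompt: str, model: str = "llama3.2") -> str:
    """Send a prompt to Ollama's /api/generate endpoint and return the reply."""
    resp = requests.post(
        f"{OLLAMA_HOST}/api/generate",
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["response"]

if __name__ == "__main__":
    print(generate("Why is the sky blue?"))
```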
You won't get much use out of a 1060 for AI. Maybe some toy models to learn on, but even then you might run into limitations with its older CUDA compute capability (it's a Pascal-era card).
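If you want to check what the card actually reports, here's a minimal sketch, assuming a PyTorch build with CUDA support is installed:

```python
import torch

# Reports (major, minor) compute capability; a GTX 1060 (Pascal) shows (6, 1).
# Some newer toolkits and framework builds drop support for Pascal-era capabilities.
if torch.cuda.is_available():
    name = torch.cuda.get_device_name(0)
    major, minor = torch.cuda.get_device_capability(0)
    print(f"{name}: compute capability {major}.{minor}")
else:
    print("No CUDA device visible")
```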
It would be cheaper and better to just build a small PC with parts from a few generations back. That card is only worth $50 for a reason.