I was looking around yesterday, came across this, and got some ideas.
About three years ago I even removed the cheap GPU; since the machine runs as a network server, I can connect to it over SSH, so there's no need for a monitor at all.
I run headless whenever possible.
It barely works, and I honestly couldn't imagine what to use it for; after all, ChatGPT is already great and convenient enough for me.
If you are into self-hosting, take a look at KaraKeep. It's basically Pocket, but open source and bring-your-own-AI. I use it 100 times a day.
Check out LM Studio if you haven't; it's so much better than Ollama.
I don't use local AI much; I mostly use cloud providers since I need the speed, so local models are mostly for fun. But I'm working on a project whose data I don't want in the cloud, and once I can prove it's profitable, I'll be getting some RTX 6000 Pros and Epyc CPUs to run Kimi K2 locally.
RE: Adding external GPU to my AI box