Running local LLMs on a homelab with Ollama

20 March 2026 · 3 mins · AI & Automation

Tags: Ollama, LLM, GPU-Passthrough, AI, Homelab, Automation