AI Infrastructure
Local AI models, inference engines, and AI services running on home hardware. Covers tools like Ollama and LM Studio — what models they support, hardware requirements, GPU vs CPU inference, and integration with Home Assistant automations.
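As a concrete example of local inference, Ollama exposes an HTTP API on `localhost:11434` by default. The sketch below shows one way to query it from Python using only the standard library; the model name `llama3` and the prompt are placeholders, and running it requires an Ollama daemon with that model already pulled.

```python
import json
import urllib.request

# Ollama's default local endpoint (assumes a stock install)
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(model: str, prompt: str) -> dict:
    # stream=False asks for a single JSON response instead of a
    # line-delimited stream of partial tokens
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model: str, prompt: str) -> str:
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_payload(model, prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Example call (needs a running Ollama daemon and a pulled model):
# generate("llama3", "Summarize today's sensor readings in one sentence.")
```

The same pattern is how Home Assistant style automations typically talk to a local model: a small HTTP call to the inference server on the LAN, with no cloud dependency.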



