Infrastructure Category: AI Infrastructure

Local AI models, inference engines, and AI services running on home hardware. Covers tools like Ollama and LM Studio — what models they support, hardware requirements, GPU vs CPU inference, and integration with Home Assistant automations.
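For integrations like Home Assistant automations, tools such as Ollama expose a local REST API (by default on port 11434, with a `/api/generate` endpoint). The sketch below shows how an automation might query a local model; the model name `llama3.2` and the prompt are illustrative examples, and the code assumes a running Ollama instance on the same host.

```python
import json
import urllib.request

# Ollama's default local endpoint (assumes Ollama is running on this host)
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_generate_payload(model: str, prompt: str) -> dict:
    """Build a non-streaming generate request body for Ollama's REST API."""
    return {"model": model, "prompt": prompt, "stream": False}


def ask(model: str, prompt: str) -> str:
    """POST a prompt to the local Ollama instance and return the response text."""
    data = json.dumps(build_generate_payload(model, prompt)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


# Example usage (requires `ollama pull llama3.2` beforehand):
#   summary = ask("llama3.2", "Summarize today's sensor events in one sentence.")
```

Because everything runs on local hardware, an automation calling `ask()` keeps sensor data off third-party services; latency and throughput then depend on whether inference runs on GPU or CPU.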

  • AI Servers

    AI servers provide the local compute layer for voice, vision, summarization, and orchestration workloads that should not live only in the cloud.
