Infrastructure Category: AI Infrastructure

Local AI models, inference engines, and AI services running on home hardware. Covers tools like Ollama and LM Studio — what models they support, hardware requirements, GPU vs CPU inference, and integration with Home Assistant automations.
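As a concrete illustration of the local inference these tools expose, Ollama serves an HTTP API on localhost (port 11434 by default), which is also how Home Assistant automations typically reach it. The sketch below builds a request body for Ollama's `/api/generate` endpoint and only performs the network call when run directly; the model name and prompt are illustrative assumptions, and a model must already have been pulled (e.g. `ollama pull llama3`).

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint


def build_generate_payload(model: str, prompt: str) -> dict:
    """Build a minimal request body for Ollama's /api/generate endpoint.

    stream=False requests a single JSON response instead of a token stream.
    """
    return {"model": model, "prompt": prompt, "stream": False}


def generate(model: str, prompt: str) -> str:
    """Send a prompt to a locally running Ollama server and return its reply."""
    body = json.dumps(build_generate_payload(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


if __name__ == "__main__":
    # Assumed model name; requires `ollama pull llama3` beforehand.
    print(generate("llama3", "Summarize today's sensor readings in one sentence."))
```

The same request shape works from a Home Assistant `rest_command` or shell script, which is what makes a local GPU box a drop-in replacement for cloud LLM calls in automations.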

2 infrastructure items:
  • AI Servers

    AI servers provide the local compute layer for voice, vision, summarization, and orchestration workloads that should not live only in the cloud.

  • Homelab Server

    A homelab server is the always-on foundation for Home Assistant, containers, storage workflows, and the surrounding services a serious automation stack depends on.