2 repositories
local-llm-server
Public

A containerized, offline-capable LLM API powered by Ollama. Automatically pulls models and serves them via a REST API. Perfect for homelab, personal AI assistants, and portable deployments.

Topics: infrastructure
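The description points to a containerized Ollama deployment with persisted models. A minimal sketch of what such a setup might look like, using the stock `ollama/ollama` image — the service name, volume name, and port mapping here are assumptions, since the repository's actual configuration is not shown:

```yaml
# Hypothetical docker-compose.yml (not the repo's actual file)
services:
  ollama:
    image: ollama/ollama              # official Ollama server image
    ports:
      - "11434:11434"                 # Ollama's default REST API port
    volumes:
      - ollama-models:/root/.ollama   # persist pulled models for offline use
    restart: unless-stopped

volumes:
  ollama-models:
```

Once the container is up, models can be pulled and queried over Ollama's REST API, e.g. `curl http://localhost:11434/api/generate -d '{"model": "llama3", "prompt": "hello"}'` (model name chosen for illustration).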