
Self-hosting the ML service (Enterprise)

Why

Transcripts never leave your network. The full product runs against your own LLM endpoint, so no conversation data is sent to us or to a third-party provider.

How

Set OLLAMA_BASE_URL on the ML service to the URL of your inference cluster. We support Ollama, vLLM, and any OpenAI-compatible endpoint.
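A minimal sketch of the configuration, assuming the ML service reads its environment at startup; the hostname and port below are placeholders, not a real endpoint:

```shell
# Point the ML service at your own inference cluster.
# "http://llm.internal:11434" is a hypothetical in-network endpoint;
# substitute the base URL of your Ollama, vLLM, or OpenAI-compatible server.
export OLLAMA_BASE_URL="http://llm.internal:11434"

# Confirm the value the service will pick up.
echo "ML service will use: $OLLAMA_BASE_URL"
```

If you run the service in a container, pass the same variable via your orchestrator (for example, an `environment:` entry in a compose file or a `-e` flag to `docker run`) rather than exporting it on the host.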