Enterprise

SemiLayer Enterprise is a self-hosted deployment — full control over your infrastructure, data, and configuration. No billing, no usage quotas, no external dependencies required.

What's included

  • Unlimited everything — no rate limits, no quota checks, no tier enforcement
  • BYO embedding provider — OpenAI, Ollama (local/air-gapped), Cohere, or any compatible endpoint
  • BYO auth — any OIDC provider (Okta, Azure AD, your own IdP)
  • BYO KMS — bring your own key management service, or use the built-in local provider
  • Private bridge support — register internal data source adapters without touching the public registry
  • Air-gapped deployments — supported with Ollama for embeddings
  • Docker images — deploy anywhere: your own servers, Kubernetes, ECS, Fly, Railway, or bare metal

Getting Access

Enterprise documentation, Docker images, and deployment guides are available to licensed customers.

To request access, reach out to the SemiLayer team. We'll provision your license, share Docker image access, and walk through your deployment architecture.

We'll respond within one business day.

What to Expect

Once licensed, you'll receive:

  1. Docker image pull credentials
  2. Access to the full enterprise deployment guide
  3. Example docker-compose.yml for single-node dev/staging
  4. Reference deployment configs for Kubernetes, ECS, and other platforms
  5. A private Slack channel for deployment support

Self-Hosting Overview

Enterprise deployments run two services:

service — The API. Handles all client requests (search, query, streaming, auth).

worker — The ingest worker. Reads from your sources, embeds, and indexes.

Both services are configured entirely via environment variables. No config files to manage.

Key environment variables:

# Data store connection
DATABASE_URL=postgresql://...

# Auth — generic OIDC
AUTH_PROVIDER=oidc
AUTH_ISSUER=https://your-idp.example.com
AUTH_CLIENT_ID=...
AUTH_CLIENT_SECRET=...

# Embeddings — provider-agnostic
EMBEDDING_PROVIDER=openai    # openai | ollama | cohere
EMBEDDING_API_KEY=...        # not needed for ollama

# Deployment mode
DEPLOYMENT_MODE=enterprise   # disables billing, quotas, Stripe
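
As a sketch of how the two services fit together on a single node, a docker-compose.yml might look like the following. The image names (semilayer/service, semilayer/worker), the API port, and the Postgres wiring are illustrative assumptions only; licensed customers receive the actual image references and a complete example with the deployment guide.

```yaml
# Illustrative sketch — image names, port, and credentials are placeholders.
services:
  db:
    image: postgres:16
    environment:
      POSTGRES_PASSWORD: change-me
    volumes:
      - pgdata:/var/lib/postgresql/data

  service:                             # the API
    image: semilayer/service:latest    # placeholder image name
    depends_on: [db]
    ports:
      - "8080:8080"                    # assumed port
    environment: &common-env
      DATABASE_URL: postgresql://postgres:change-me@db:5432/postgres
      AUTH_PROVIDER: oidc
      AUTH_ISSUER: https://your-idp.example.com
      EMBEDDING_PROVIDER: openai
      DEPLOYMENT_MODE: enterprise

  worker:                              # the ingest worker
    image: semilayer/worker:latest     # placeholder image name
    depends_on: [db]
    environment: *common-env           # same env as the API

volumes:
  pgdata:
```

The &common-env anchor keeps the two services' environment in sync. In production you would typically inject secrets such as AUTH_CLIENT_SECRET and EMBEDDING_API_KEY from your orchestrator's secret store rather than writing them into the compose file.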

Air-Gapped Deployments

For environments without internet access, use Ollama as the embedding provider:

EMBEDDING_PROVIDER=ollama
EMBEDDING_BASE_URL=http://ollama:11434
EMBEDDING_MODEL=nomic-embed-text

Ollama runs locally and requires no external API calls. See the enterprise deployment guide for sizing recommendations.
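
Since ollama pull itself needs network access, the model has to be staged before the environment is sealed. One common pattern (paths here are illustrative, not from the enterprise guide) is to pull the model on a connected machine and ship Ollama's model directory alongside your images:

```shell
# On a machine WITH internet access: fetch the model once.
ollama pull nomic-embed-text

# Ollama stores models under ~/.ollama by default. Copy that directory
# into the air-gapped environment, then mount it into the container
# (it maps to /root/.ollama inside the official ollama/ollama image):
docker run -d --name ollama \
  -v /opt/ollama-models:/root/.ollama \
  -p 11434:11434 \
  ollama/ollama
```

With the model pre-staged, the container serves embeddings on port 11434 without ever reaching the internet.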


Already a customer and looking for the deployment docs? Log in to your Console and navigate to Organization → Enterprise for direct access to the private documentation.