Context7 On-Premise lets you run the full Context7 stack inside your own infrastructure. Your code, documentation, and embeddings never leave your environment.

What’s Included

  • Full Context7 parsing and indexing pipeline
  • Local vector storage (no external vector DB required)
  • Built-in MCP server — works with any MCP-compatible AI client
  • Web UI for managing indexed libraries and configuration
  • REST API compatible with the public Context7 API
  • Private GitHub and GitLab repository ingestion
On-Premise Architecture

Setup

Step 1: Request a trial

Go to context7.com/plans and click On-Premise Trial. Fill out the request form — no credit card required. You’ll receive a 30-day full-featured license key via email once approved.
Step 2: Authenticate with the registry

Use your license key to get a registry token and log in:
LICENSE_KEY="<your-license-key>"

TOKEN=$(curl -s -H "Authorization: Bearer $LICENSE_KEY" \
  https://context7.com/api/v1/license/registry-token | jq -r '.token')

echo "$TOKEN" | docker login ghcr.io -u x-access-token --password-stdin
Docker stores these credentials locally — docker compose will use them automatically when pulling the image. You can also pull manually:
docker pull ghcr.io/context7/enterprise:latest
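If the license key is wrong or the token request fails, `jq -r` prints the literal string `null` instead of a token, and `docker login` then fails with a confusing error. A small guard catches this earlier (a sketch; `check_token` is a hypothetical helper, not part of the Context7 tooling):

```shell
# check_token: succeeds only for a non-empty value that is not jq's "null"
# marker (jq -r prints "null" when the .token key is missing from the response).
check_token() {
  [ -n "$1" ] && [ "$1" != "null" ]
}

# Usage after the curl command above:
#   check_token "$TOKEN" || { echo "No registry token; check your license key" >&2; exit 1; }
```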
Step 3: Configure and start

Create a docker-compose.yml:
services:
  context7:
    image: ghcr.io/context7/enterprise:latest
    container_name: context7
    restart: unless-stopped
    ports:
      - "3000:3000"
    volumes:
      - context7-data:/data
    environment:
      - LICENSE_KEY=${LICENSE_KEY}

volumes:
  context7-data:
    driver: local
The context7-data volume is critical. It stores your SQLite database (configuration, credentials, indexed libraries) and all vector embeddings. Without a persistent volume, all data is lost when the container restarts or is recreated. Never run without a volume mount in production.
Create a .env file in the same directory:
LICENSE_KEY=ctx7sk-...
Start the service:
docker compose up -d
Step 4: Complete the setup wizard

Open http://localhost:3000 in your browser. On first launch, the setup wizard guides you through configuring:
  1. AI Provider — Choose OpenAI, Anthropic, Gemini, or a custom OpenAI-compatible endpoint. Enter your API key and model name.
  2. Embedding Provider — Use the same provider as your LLM, or configure a separate one for embeddings.
  3. Git Tokens — Add a GitHub and/or GitLab token for the platforms you use.
All configuration is stored locally in the embedded database and can be updated later from the Settings page.
Step 5: Ingest your first repository

From the dashboard, click Add Repository and enter a GitHub or GitLab URL. Once ingestion completes, your private docs are ready to query. You can also add libraries via the REST API:
curl -X POST http://localhost:3000/api/parse \
  -H "Content-Type: application/json" \
  -d '{"url": "https://github.com/your-org/your-repo"}'

Connecting Your AI Client

Point your MCP client at your deployment URL. Replace https://context7.internal.yourcompany.com with your actual host.

Claude Code

claude mcp add --scope user --transport http context7 https://context7.internal.yourcompany.com/mcp

Cursor

Add to ~/.cursor/mcp.json:
{
  "mcpServers": {
    "context7": {
      "url": "https://context7.internal.yourcompany.com/mcp"
    }
  }
}

Opencode

Add to your opencode.json:
{
  "mcp": {
    "context7": {
      "type": "remote",
      "url": "https://context7.internal.yourcompany.com/mcp",
      "enabled": true
    }
  }
}
For other clients, see All Clients.

Configuration

Environment Variables

These are set in your docker-compose.yml or .env file before starting the container.
Variable      Required  Description
LICENSE_KEY   Yes       License key issued by Upstash
PORT          No        HTTP port (default: 3000)
DATA_DIR      No        Data directory inside the container (default: /data)
AI provider keys, model settings, and git tokens are not set via environment variables. They are configured through the setup wizard and can be updated anytime from the Settings page in the web UI.
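For example, to serve on a different HTTP port, set PORT and adjust the port mapping to match (a sketch based on the variables above; 8080 is an arbitrary choice):

```yaml
services:
  context7:
    image: ghcr.io/context7/enterprise:latest
    ports:
      - "8080:8080"          # host:container -- the container side must match PORT
    volumes:
      - context7-data:/data
    environment:
      - LICENSE_KEY=${LICENSE_KEY}
      - PORT=8080            # container listens here instead of the default 3000
```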

AI Provider Settings

Configured via the Settings page in the web UI.
Setting       Description
LLM Provider  openai, anthropic, gemini, or custom
LLM API Key   API key for your chosen provider
LLM Model     Model name (e.g. gpt-4o, claude-sonnet-4-5, gemini-2.5-flash)
LLM Base URL  Custom OpenAI-compatible endpoint (for local models or proxies)

Examples

Provider: custom
Base URL: https://openrouter.ai/api/v1
Model: openai/gpt-4o
API Key: sk-or-v1-...
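A local model served through Ollama's OpenAI-compatible endpoint would look similar (a sketch: the model name is illustrative, and the API key is a placeholder since Ollama does not check it):

Provider: custom
Base URL: http://localhost:11434/v1
Model: llama3.1
API Key: ollama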

Embedding Settings

By default, Context7 uses the same provider as your LLM for generating embeddings. You can configure a separate embedding provider if needed.
Setting             Description
Embedding Provider  openai or gemini
Embedding API Key   Separate API key for embeddings (falls back to the LLM API key)
Embedding Model     Embedding model name (e.g. text-embedding-3-small)
Embedding Base URL  Custom embedding endpoint

Git Access Tokens

Configured via the Settings page in the web UI.
Setting       Description
GitHub Token  GitHub Personal Access Token — required for GitHub repositories
GitLab Token  GitLab token — required for GitLab repositories
You only need tokens for the platforms you use. If you only parse GitLab repos, you don’t need a GitHub token, and vice versa. Create tokens with repo scope (GitHub) or read_repository scope (GitLab) for private repository access.

Access Control

Admin credentials are set during first login (default: admin / admin). Change these immediately after setup via Settings > Change Credentials. The Settings page lets you control which operations are available without authentication.
Permission                      Default  Description
Allow anonymous parse           Off      Allow unauthenticated users to trigger parsing
Allow anonymous refresh         Off      Allow unauthenticated users to refresh libraries
Allow anonymous delete          Off      Allow unauthenticated users to delete libraries
Allow anonymous support bundle  Off      Allow unauthenticated support bundle downloads
When a permission is off, the operation requires admin login. The MCP endpoint and search API are always publicly accessible.

Web UI

Open your deployment URL in a browser to access the dashboard. From here you can:
  • Add and remove libraries
  • Trigger re-indexing
  • Monitor parsing status and logs
  • Update AI provider settings, git tokens, and permissions
  • Test MCP connectivity
  • Change admin credentials

Operations

Updating the Image

If your registry login has expired, re-authenticate first:
LICENSE_KEY="<your-license-key>"

TOKEN=$(curl -s -H "Authorization: Bearer $LICENSE_KEY" \
  https://context7.com/api/v1/license/registry-token | jq -r '.token')

echo "$TOKEN" | docker login ghcr.io -u x-access-token --password-stdin
Then pull the latest image and restart the container:
docker compose pull
docker compose up -d
Data persists in the named Docker volume across updates.

Health Check

curl http://localhost:3000/api/health
Example response:
{
  "status": "healthy",
  "version": "1.0.0",
  "setup": "complete",
  "license": "configured",
  "licenseInfo": {
    "valid": true,
    "teamSize": 10,
    "expiresAt": "2026-06-01T00:00:00.000Z"
  },
  "repos_parsed": 5,
  "uptime": 3600,
  "connectivity": {
    "llm": "configured",
    "llm_provider": "openai",
    "embedding": "configured",
    "embedding_provider": "openai",
    "github": "configured",
    "gitlab": "not configured"
  }
}
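Since the response is plain JSON, routine monitoring can watch the licenseInfo block with jq (a sketch; a sample response is inlined here, whereas in practice you would pipe `curl -s http://localhost:3000/api/health` straight into jq):

```shell
# Extract license validity and expiry from a health response.
# Sample JSON inlined for illustration; replace the assignment with:
#   HEALTH=$(curl -s http://localhost:3000/api/health)
HEALTH='{"status":"healthy","licenseInfo":{"valid":true,"expiresAt":"2026-06-01T00:00:00.000Z"}}'

VALID=$(printf '%s' "$HEALTH" | jq -r '.licenseInfo.valid')
EXPIRES=$(printf '%s' "$HEALTH" | jq -r '.licenseInfo.expiresAt')
echo "license valid=$VALID, expires $EXPIRES"
```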

Support

For license issues, upgrade requests, or deployment questions, contact context7@upstash.com.