Context-Simplo runs in Docker. Choose an embedding provider below.

Prerequisites:
- Docker Desktop (Windows/macOS) or Docker Engine (Linux)
- 2GB RAM minimum, 4GB recommended
- 500MB disk space for image + data
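Before continuing, it can help to confirm Docker is actually on your PATH. A minimal sketch in POSIX shell (the `require_cmd` helper is ours, purely for illustration — not part of Context-Simplo):

```bash
# require_cmd: succeed (exit 0) if the named command exists on PATH.
require_cmd() {
  command -v "$1" >/dev/null 2>&1
}

if require_cmd docker; then
  echo "docker found: $(docker --version)"
else
  echo "docker not found; install Docker Desktop or Docker Engine" >&2
fi
```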
Using Ollama (local embeddings):

Step 1: Install Ollama and pull the model

```bash
ollama pull nomic-embed-text
```

Step 2: Pull the Context-Simplo image

```bash
docker pull ohopson/context-simplo:latest
```

Step 3: Start the container
macOS / Windows

```bash
docker run -d \
  --name context-simplo \
  -p 3001:3001 \
  -v "$HOME":/host:ro \
  -v context-simplo-data:/data \
  -e MOUNT_ROOT=/host \
  -e INITIAL_WORKSPACE=/host \
  -e LLM_PROVIDER=ollama \
  -e LLM_BASE_URL=http://host.docker.internal:11434 \
  -e LLM_EMBEDDING_MODEL=nomic-embed-text \
  ohopson/context-simplo:latest
```

Linux
```bash
docker run -d \
  --name context-simplo \
  --add-host=host.docker.internal:host-gateway \
  -p 3001:3001 \
  -v "$HOME":/host:ro \
  -v context-simplo-data:/data \
  -e MOUNT_ROOT=/host \
  -e INITIAL_WORKSPACE=/host \
  -e LLM_PROVIDER=ollama \
  -e LLM_BASE_URL=http://host.docker.internal:11434 \
  -e LLM_EMBEDDING_MODEL=nomic-embed-text \
  ohopson/context-simplo:latest
```

Windows (Command Prompt)
```cmd
docker run -d ^
  --name context-simplo ^
  -p 3001:3001 ^
  -v %USERPROFILE%:/host:ro ^
  -v context-simplo-data:/data ^
  -e MOUNT_ROOT=/host ^
  -e INITIAL_WORKSPACE=/host ^
  -e LLM_PROVIDER=ollama ^
  -e LLM_BASE_URL=http://host.docker.internal:11434 ^
  -e LLM_EMBEDDING_MODEL=nomic-embed-text ^
  ohopson/context-simplo:latest
```

Windows (PowerShell)
```powershell
docker run -d `
  --name context-simplo `
  -p 3001:3001 `
  -v "$env:USERPROFILE":/host:ro `
  -v context-simplo-data:/data `
  -e MOUNT_ROOT=/host `
  -e INITIAL_WORKSPACE=/host `
  -e LLM_PROVIDER=ollama `
  -e LLM_BASE_URL=http://host.docker.internal:11434 `
  -e LLM_EMBEDDING_MODEL=nomic-embed-text `
  ohopson/context-simplo:latest
```

Using OpenAI instead:

Step 1: Pull the image
```bash
docker pull ohopson/context-simplo:latest
```

Step 2: Start the container with your API key
macOS / Linux

```bash
docker run -d \
  --name context-simplo \
  -p 3001:3001 \
  -v "$HOME":/host:ro \
  -v context-simplo-data:/data \
  -e MOUNT_ROOT=/host \
  -e INITIAL_WORKSPACE=/host \
  -e LLM_PROVIDER=openai \
  -e LLM_API_KEY=sk-your-api-key-here \
  -e LLM_BASE_URL=https://api.openai.com/v1 \
  -e LLM_EMBEDDING_MODEL=text-embedding-3-small \
  ohopson/context-simplo:latest
```

Windows (Command Prompt)
```cmd
docker run -d ^
  --name context-simplo ^
  -p 3001:3001 ^
  -v %USERPROFILE%:/host:ro ^
  -v context-simplo-data:/data ^
  -e MOUNT_ROOT=/host ^
  -e INITIAL_WORKSPACE=/host ^
  -e LLM_PROVIDER=openai ^
  -e LLM_API_KEY=sk-your-api-key-here ^
  -e LLM_BASE_URL=https://api.openai.com/v1 ^
  -e LLM_EMBEDDING_MODEL=text-embedding-3-small ^
  ohopson/context-simplo:latest
```

Windows (PowerShell)
```powershell
docker run -d `
  --name context-simplo `
  -p 3001:3001 `
  -v "$env:USERPROFILE":/host:ro `
  -v context-simplo-data:/data `
  -e MOUNT_ROOT=/host `
  -e INITIAL_WORKSPACE=/host `
  -e LLM_PROVIDER=openai `
  -e LLM_API_KEY=sk-your-api-key-here `
  -e LLM_BASE_URL=https://api.openai.com/v1 `
  -e LLM_EMBEDDING_MODEL=text-embedding-3-small `
  ohopson/context-simplo:latest
```

You can also skip embeddings entirely. Semantic search won't work, but all structural tools (call hierarchies, impact analysis, dead code detection, etc.) still work.
```bash
docker run -d \
  --name context-simplo \
  -p 3001:3001 \
  -v "$HOME":/host:ro \
  -v context-simplo-data:/data \
  -e MOUNT_ROOT=/host \
  -e INITIAL_WORKSPACE=/host \
  -e LLM_PROVIDER=none \
  ohopson/context-simplo:latest
```

Step 1: Open the dashboard
```bash
open http://localhost:3001
```

(`open` is the macOS command; on Windows or Linux, just visit http://localhost:3001 in your browser.)

Step 2: Configure your IDE
Follow the MCP Setup Guide to connect your IDE (Cursor, VS Code, Claude Desktop, or Claude Code) to Context-Simplo.
Step 3: Index your first repository
From the dashboard:
- Click "Repositories" in the sidebar
- Click "Browse" to navigate to your project directory
- Select the project folder
- Click "Index Repository"
Or use the MCP tool from your IDE:
```
Tool: index_repository
Args: { "path": "/host/projects/my-app" }
```
Note: Use container paths (/host/...) when indexing via MCP tools.
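Because $HOME is mounted at /host, a host path such as ~/projects/my-app becomes /host/projects/my-app inside the container. A small sketch of the translation (the `to_container_path` helper is hypothetical, shown only to make the mapping concrete):

```bash
# to_container_path: map a host path under $HOME to its /host equivalent.
to_container_path() {
  case "$1" in
    "$HOME")   printf '/host\n' ;;
    "$HOME"/*) printf '/host/%s\n' "${1#"$HOME"/}" ;;
    *) echo "error: $1 is not under \$HOME" >&2; return 1 ;;
  esac
}

to_container_path "$HOME/projects/my-app"   # prints /host/projects/my-app
```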
Check container status:

```bash
docker ps | grep context-simplo
```

Check logs:

```bash
docker logs context-simplo
```

Test the MCP connection from your IDE:

```
Tool: list_repositories
```

The expected response shows your indexed repositories with file counts and node counts.
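You can also confirm the server answers from the host. This is a sketch under assumptions: it uses the default 3001 port mapping, and it treats any successful HTTP response at the root URL as "up" (the docs don't define a dedicated health endpoint):

```bash
# Succeeds if something answers with a non-error HTTP status on localhost:3001.
curl -sf -o /dev/null http://localhost:3001 && echo "Context-Simplo is reachable"
```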
Check logs:

```bash
docker logs context-simplo
```

Common issues:
- Port 3001 already in use: stop the other service or change the host port with `-p 3002:3001`
- Volume mount failed: ensure Docker has permission to access your home directory
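To see what is already bound to the port before remapping, something like this works on macOS and most Linux distros (assumes `lsof` is installed; `ss -ltnp` is an alternative on Linux):

```bash
# Show the process currently listening on port 3001.
lsof -nP -i :3001
# Then either stop that process, or remap the container with -p 3002:3001
# (the dashboard would then be at http://localhost:3002).
```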
Verify host.docker.internal resolves:
```bash
docker exec context-simplo ping -c 1 host.docker.internal
```

Linux users: ensure you used --add-host=host.docker.internal:host-gateway in the docker run command.
Verify Ollama is running:
```bash
curl http://localhost:11434/api/tags
```

Check if the container is running:

```bash
docker ps | grep context-simplo
```

Try restarting:

```bash
docker restart context-simplo
```

Make sure you're using container paths (/host/...), not host paths, when indexing.
Check the Browse tab in the dashboard to see the container filesystem.
Increase Docker memory allocation:
- Docker Desktop: Settings → Resources → Memory (increase to 4GB+)
For large repositories, indexing may take time on first run. Subsequent updates are incremental and fast (<500ms per file).
Stop the container:

```bash
docker stop context-simplo
```

Start the container:

```bash
docker start context-simplo
```

Remove the container:

```bash
docker stop context-simplo
docker rm context-simplo
```

Update to the latest version:

```bash
docker stop context-simplo
docker rm context-simplo
docker pull ohopson/context-simplo:latest
# Then run the docker run command again
```

View logs:

```bash
docker logs -f context-simplo
```

- MCP Setup Guide - Connect your IDE
- Configuration Reference - Environment variables and settings
- MCP Tools Reference - Available tools and usage
- Architecture Overview - How it works
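The update steps above can be combined into a single script. A sketch, assuming the container was started with one of the `docker run` commands in this guide (substitute your own variant in the last step):

```bash
#!/bin/sh
set -e
docker stop context-simplo
docker rm context-simplo
docker pull ohopson/context-simplo:latest
# Re-run your original docker run command here, e.g. the Ollama variant above.
```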