ZeroClaw supports custom API endpoints for both OpenAI-compatible and Anthropic-compatible providers.
For services that implement the OpenAI API format:

```toml
default_provider = "custom:https://your-api.com"
api_key = "your-api-key"
default_model = "your-model-name"
```

For services that implement the Anthropic API format:

```toml
default_provider = "anthropic-custom:https://your-api.com"
api_key = "your-api-key"
default_model = "your-model-name"
```

Edit `~/.zeroclaw/config.toml`:
```toml
api_key = "your-api-key"
default_provider = "anthropic-custom:https://api.example.com"
default_model = "claude-sonnet-4-6"
```

For `custom:` and `anthropic-custom:` providers, use the generic key environment variables:
```bash
export API_KEY="your-api-key"
# or: export ZEROCLAW_API_KEY="your-api-key"
zeroclaw agent
```

Verify your custom endpoint:
```bash
# Interactive mode
zeroclaw agent

# Single message test
zeroclaw agent -m "test message"
```

- Verify the API key is correct
- Check the endpoint URL format (must include `http://` or `https://`)
- Ensure the endpoint is accessible from your network
- Confirm model name matches provider's available models
- Check provider documentation for exact model identifiers
- Ensure endpoint and model family match. Some custom gateways only expose a subset of models.
- Verify available models from the same endpoint and key you configured:
```bash
curl -sS https://your-api.com/models \
  -H "Authorization: Bearer $API_KEY"
```

- If the gateway does not implement `/models`, send a minimal chat request and inspect the model error text the provider returns.
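The minimal-chat-request probe can be sketched in Python. The endpoint path, key, and model name below are placeholders, and the payload assumes the OpenAI chat format; Anthropic-format gateways use a different path and an `x-api-key` header instead.

```python
import json
import urllib.error
import urllib.request

# Placeholder values -- substitute your gateway's endpoint, key, and model.
ENDPOINT = "https://your-api.com/chat/completions"
API_KEY = "your-api-key"

# Minimal OpenAI-format chat request: one short user message, one output token.
payload = {
    "model": "your-model-name",
    "messages": [{"role": "user", "content": "ping"}],
    "max_tokens": 1,
}

def probe(endpoint: str = ENDPOINT, api_key: str = API_KEY) -> str:
    """Send the request; return the response body, or the error body on failure."""
    req = urllib.request.Request(
        endpoint,
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
    try:
        with urllib.request.urlopen(req) as resp:
            return resp.read().decode()
    except urllib.error.HTTPError as err:
        # A wrong model name usually yields a 4xx whose body lists valid models.
        return err.read().decode()
```

If the model name is wrong, the returned error text typically names the identifiers the gateway actually accepts.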
- Test endpoint accessibility:

```bash
curl -I https://your-api.com
```

- Verify firewall/proxy settings
- Check the provider's status page
A local model server:

```toml
default_provider = "custom:http://localhost:8080"
default_model = "local-model"
```

A corporate proxy:

```toml
default_provider = "anthropic-custom:https://llm-proxy.corp.example.com"
api_key = "internal-token"
```

A cloud provider gateway:

```toml
default_provider = "custom:https://gateway.cloud-provider.com/v1"
api_key = "gateway-api-key"
default_model = "gpt-4"
```