We need to add a comprehensive setup guide covering how to configure Strix with the most popular LLM providers.

The doc should include:

* OpenAI (GPT-5.1, codex, etc.)
* Anthropic (Claude Sonnet / Opus)
* Google (Gemini)
* Azure OpenAI
* DeepSeek / Groq
* OpenRouter
* Ollama
* LMStudio
* etc.

Each provider section should include:

* Required API keys / environment variables
* Example `litellm` config
* Recommended model(s) for best Strix performance
* Common pitfalls + troubleshooting
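As a rough sketch of what each provider section could open with, here is a possible env-var snippet. The `STRIX_LLM` / `LLM_API_KEY` variable names are assumptions and should be confirmed against the current Strix code; the `provider/model` strings follow litellm's standard naming convention:

```shell
# Sketch of a per-provider config snippet. STRIX_LLM and LLM_API_KEY are
# assumed variable names -- verify against the Strix codebase before documenting.

# --- Hosted provider example (Anthropic) ---
# litellm model strings use the "provider/model-name" form
export STRIX_LLM="anthropic/claude-sonnet-4-20250514"
export LLM_API_KEY="sk-ant-..."          # litellm's native var is ANTHROPIC_API_KEY

# --- Local provider example (Ollama, no API key required) ---
export STRIX_LLM="ollama/llama3"
export OLLAMA_API_BASE="http://localhost:11434"   # Ollama's default local endpoint
```

A snippet in this shape per provider would let users copy-paste their way to a working config, with the troubleshooting bullets covering the usual failure modes (wrong env var name, missing base URL for local servers, model string typos).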