
feat: add LiteLLM as LLM provider #2565

Open
RheagalFire wants to merge 2 commits into OpenMind:main from RheagalFire:feat/add-litellm-provider

Conversation

@RheagalFire RheagalFire commented May 8, 2026

Overview

Adds LiteLLM as a new LLM plugin, enabling access to 100+ providers (OpenAI, Anthropic, Google, Azure, Bedrock, Ollama, etc.) via a single unified SDK. Follows the exact same plugin pattern as OpenAILLM,
DeepSeekLLM, GeminiLLM, etc.

Type of change

  • Bug fix
  • New feature
  • Documentation improvement
  • Bounty issue submission
  • Other:

Changes

  • src/llm/plugins/litellm.py - new LiteLLM plugin class with:
    • LiteLLMConfig extending LLMConfig
    • litellm.acompletion() for async completion
    • drop_params=True for cross-provider kwargs compatibility
    • Full function/tool calling support (same as OpenAILLM)
    • @AvatarLLMState.trigger_thinking() and @LLMHistoryManager.update_history() decorators
  • pyproject.toml - added litellm>=1.60.0,<2.0.0 as optional dependency under [project.optional-dependencies]
  • tests/llm/test_litellm.py - 12 unit tests (all passing)
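The changes above can be sketched as follows. This is a hypothetical illustration of the plugin shape the PR describes, not its actual code: the `LLMConfig` base fields, the `extra_kwargs` field, and the injectable `acompletion` parameter are assumptions added for testability.

```python
import asyncio
from dataclasses import dataclass, field
from typing import Any, Dict, Optional


@dataclass
class LLMConfig:
    # Stand-in for OM1's real LLMConfig base class (assumed fields).
    model: str = ""
    api_key: Optional[str] = None


@dataclass
class LiteLLMConfig(LLMConfig):
    api_base: Optional[str] = None
    extra_kwargs: Dict[str, Any] = field(default_factory=dict)


class LiteLLM:
    def __init__(self, config: LiteLLMConfig):
        self.config = config

    async def ask(self, prompt: str, acompletion=None) -> str:
        if acompletion is None:
            # Lazy import keeps litellm an optional dependency;
            # a stub can be injected here for unit tests.
            import litellm
            acompletion = litellm.acompletion
        response = await acompletion(
            model=self.config.model,
            messages=[{"role": "user", "content": prompt}],
            drop_params=True,  # drop kwargs the target provider rejects
            api_key=self.config.api_key,
            api_base=self.config.api_base,
            **self.config.extra_kwargs,
        )
        return response.choices[0].message.content
```

Injecting the completion callable keeps the sketch testable without network access or the `litellm` package installed.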

Unit tests (12/12 passing):

```
$ PYTHONPATH=src python -m pytest tests/llm/test_litellm.py -v
test_plugin_file_exists PASSED
test_has_litellm_config_class PASSED
test_has_litellm_class PASSED
test_litellm_class_inherits_llm PASSED
test_has_ask_method PASSED
test_ask_is_async PASSED
test_uses_drop_params_true PASSED
test_uses_litellm_acompletion PASSED
test_lazy_imports_litellm PASSED
test_acompletion_called_with_drop_params PASSED
test_acompletion_forwards_api_key PASSED
test_acompletion_handles_tool_calls PASSED
12 passed in 0.02s
```
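Tests like `test_acompletion_called_with_drop_params` can be written without any network access. A sketch of the pattern (illustrative only, not the PR's actual test file) using `unittest.mock.AsyncMock`:

```python
import asyncio
from types import SimpleNamespace
from unittest.mock import AsyncMock

# Fake response mirroring the OpenAI-style shape litellm returns.
fake_response = SimpleNamespace(
    choices=[SimpleNamespace(message=SimpleNamespace(content="4", tool_calls=None))]
)
acompletion = AsyncMock(return_value=fake_response)


async def ask(prompt: str) -> str:
    response = await acompletion(
        model="anthropic/claude-sonnet-4-20250514",
        messages=[{"role": "user", "content": prompt}],
        drop_params=True,
    )
    return response.choices[0].message.content


result = asyncio.run(ask("What is 2+2?"))

# Assert the call happened once and drop_params was forwarded.
acompletion.assert_awaited_once()
assert acompletion.await_args.kwargs["drop_params"] is True
```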

Live E2E test (Azure Foundry, Claude):

```python
import asyncio, os, litellm

async def main():
    response = await litellm.acompletion(
        model="anthropic/claude-sonnet-4-6",
        messages=[{"role": "user", "content": "What is 2+2? Reply with just the number."}],
        drop_params=True,
        api_key=os.environ["ANTHROPIC_FOUNDRY_API_KEY"],
        api_base=os.environ["ANTHROPIC_FOUNDRY_BASE_URL"],
    )
    print(response.choices[0].message.content)

asyncio.run(main())
# Output: "4"
```

Example usage:

```yaml
# In your OM1 agent config YAML:
llm:
  class_name: LiteLLM
  config:
    model: "anthropic/claude-sonnet-4-20250514"
    api_key: "${ANTHROPIC_API_KEY}"
```

```python
import os

from llm.plugins.litellm import LiteLLM, LiteLLMConfig

config = LiteLLMConfig(
    model="anthropic/claude-sonnet-4-20250514",
    api_key=os.environ["ANTHROPIC_API_KEY"],
)
llm = LiteLLM(config=config)
response = await llm.ask("What should the robot do next?")
```

See https://docs.litellm.ai/docs/providers for 100+ supported model strings.

Checklist

  • Code complies with style guidelines
  • Self-review completed
  • Documentation updated (if applicable)
  • Local testing completed
  • Tests added/updated (if applicable)

Impact

  • Additive only; existing plugins (OpenAI, DeepSeek, Gemini, Ollama, etc.) are untouched
  • litellm added as optional dependency (pip install .[litellm])
  • drop_params=True silently drops provider-unsupported kwargs
  • Auto-discovered by OM1's find_module_with_class() since the class inherits from LLM
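The `drop_params=True` behavior mentioned above can be pictured as a kwargs filter. This is a conceptual illustration only: litellm's real implementation consults per-provider parameter lists, and the `SUPPORTED` set here is an invented stand-in.

```python
# Hypothetical stand-in for one provider's supported parameters.
SUPPORTED = {"model", "messages", "temperature", "max_tokens"}


def drop_unsupported(kwargs: dict) -> dict:
    # Silently discard any kwarg the target provider does not accept,
    # so the same call site works across providers.
    return {k: v for k, v in kwargs.items() if k in SUPPORTED}


call = {
    "model": "anthropic/claude-sonnet-4-20250514",
    "messages": [],
    "logprobs": True,       # OpenAI-specific; not in SUPPORTED
    "temperature": 0.2,
}
filtered = drop_unsupported(call)
# "logprobs" is dropped; the remaining kwargs pass through unchanged.
```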

Additional Information

  • No proxy server needed, uses litellm Python SDK directly
  • LiteLLM model strings use the provider/model format (e.g. anthropic/claude-sonnet-4-20250514, azure/gpt-4o, bedrock/anthropic.claude-3-haiku)
  • Provider API keys are read from standard env vars automatically (ANTHROPIC_API_KEY, OPENAI_API_KEY, etc.)
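The `provider/model` convention splits at the first slash, with bare model names (e.g. `gpt-4o`) defaulting to OpenAI. The helper below is hypothetical, written only to illustrate the format; litellm has its own resolution logic.

```python
def split_model_string(model: str) -> tuple:
    """Split a LiteLLM 'provider/model' string (illustrative only)."""
    provider, _, name = model.partition("/")
    # No slash means a bare model name, which litellm routes to OpenAI.
    return (provider, name) if name else ("openai", provider)


print(split_model_string("anthropic/claude-sonnet-4-20250514"))
print(split_model_string("bedrock/anthropic.claude-3-haiku"))
print(split_model_string("gpt-4o"))
```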

@RheagalFire RheagalFire requested review from a team as code owners May 8, 2026 22:08
@github-actions github-actions bot added the robotics, python, tests, dependencies, and config labels May 8, 2026
@RheagalFire RheagalFire (Author) commented

cc @openminddev

