Important
This library is in development mode and the API is subject to change.
This library provides a unified way to translate requests and responses between different Large Language Model (LLM) providers. It allows developers to integrate with one interface (e.g., OpenAI’s API) while actually calling another provider (e.g., Anthropic, Mistral, etc.), without rewriting application logic.
Each LLM provider exposes its own request and response format:
- OpenAI expects `messages` arrays with `role`/`content` parts.
- Anthropic expects a different structure, with `system` and `user`/`assistant` segments.
- Mistral and others have yet more variations.
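To make the divergence concrete, here is a sketch of roughly equivalent chat requests for two providers (model names and field layouts follow the public OpenAI and Anthropic APIs, but treat the details as illustrative):

```python
# OpenAI-style request: the system prompt is just another message.
openai_request = {
    "model": "gpt-4o",
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello"},
    ],
}

# Anthropic-style request: the system prompt is a top-level field,
# max_tokens is required, and messages hold only user/assistant turns.
anthropic_request = {
    "model": "claude-3-5-sonnet-20241022",
    "max_tokens": 1024,
    "system": "You are a helpful assistant.",
    "messages": [
        {"role": "user", "content": "Hello"},
    ],
}
```

Both payloads express the same conversation, yet neither can be sent to the other provider unchanged.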
If you want to connect a tool that only understands one interface (e.g., OpenAI) to a different backend (e.g., Anthropic), you would normally need to write a direct conversion for every pair.
This leads to a Cartesian product problem:
- For N providers, you’d need N × (N-1) conversions.
Instead of converting directly between every provider pair, this library introduces a standard intermediate format.
The process works in two steps:
- Normalize → Convert a provider-specific request into the intermediate format.
- Transform → Convert the intermediate format into the target provider’s request.
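The two-step request flow can be sketched as follows. All names here (`IntermediateRequest`, `normalize_openai`, `transform_to_anthropic`) are hypothetical, not this library's actual API; they only illustrate the normalize-then-transform shape:

```python
from dataclasses import dataclass


@dataclass
class IntermediateMessage:
    role: str
    content: str


@dataclass
class IntermediateRequest:
    """Provider-neutral representation of a chat request."""
    model: str
    messages: list


def normalize_openai(request: dict) -> IntermediateRequest:
    """Step 1: OpenAI-style request -> intermediate format."""
    return IntermediateRequest(
        model=request["model"],
        messages=[IntermediateMessage(m["role"], m["content"])
                  for m in request["messages"]],
    )


def transform_to_anthropic(req: IntermediateRequest) -> dict:
    """Step 2: intermediate format -> Anthropic-style request."""
    # Anthropic keeps the system prompt outside the messages array.
    system = " ".join(m.content for m in req.messages if m.role == "system")
    return {
        "model": req.model,
        "system": system or None,
        "messages": [{"role": m.role, "content": m.content}
                     for m in req.messages if m.role != "system"],
    }
```

Chaining the two functions converts an OpenAI-shaped request into an Anthropic-shaped one without either side knowing about the other.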
The same applies to responses:
- Normalize → Convert a provider-specific response into the intermediate format.
- Transform → Convert the intermediate format into the desired interface (e.g., OpenAI-style).
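The response direction mirrors the request direction. A minimal sketch, again with hypothetical function names and a deliberately tiny intermediate shape (real responses carry usage, metadata, and error details too):

```python
def normalize_anthropic_response(resp: dict) -> dict:
    """Anthropic-style response -> intermediate {'text', 'stop'}."""
    return {
        "text": resp["content"][0]["text"],
        "stop": resp.get("stop_reason"),
    }


def to_openai_style(inter: dict) -> dict:
    """Intermediate format -> OpenAI-style chat completion response."""
    return {
        "choices": [{
            "message": {"role": "assistant", "content": inter["text"]},
            "finish_reason": inter["stop"],
        }],
    }
```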
With this approach, the number of conversions drops from N × (N-1) to just 2 × N (one in and one out per provider).
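The savings are easy to check numerically: direct pairwise conversion grows quadratically, while the hub approach grows linearly.

```python
def direct_conversions(n: int) -> int:
    """Pairwise converters needed without an intermediate format."""
    return n * (n - 1)


def hub_conversions(n: int) -> int:
    """Converters needed with an intermediate format (one in, one out)."""
    return 2 * n


# With 5 providers: 20 pairwise converters vs. 10 hub converters.
for n in (3, 5, 10):
    print(n, direct_conversions(n), hub_conversions(n))
```

Note the break-even point: at N = 3 both approaches need 6 converters; the hub only wins for N ≥ 4, which is exactly the regime a multi-provider library targets.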
📖 Complete Documentation - Comprehensive guides and API reference
- Usage Guide - Step-by-step instructions and examples
- Unified interface across providers.
- Reduced complexity via the intermediate format.
- Supports request & response conversion (including errors, metadata).
- Streaming-compatible for real-time use cases.
- Extensible: add new providers by defining just two mappings (to/from the intermediate format).
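The "two mappings per provider" idea can be sketched as a small registry. This is an illustration of the pattern, not this library's registration API:

```python
# provider name -> fn(provider_request) -> intermediate dict
NORMALIZERS = {}
# provider name -> fn(intermediate dict) -> provider_request
TRANSFORMERS = {}


def register(name, normalize, transform):
    """Adding a provider means supplying exactly these two functions."""
    NORMALIZERS[name] = normalize
    TRANSFORMERS[name] = transform


def convert(request, source: str, target: str):
    """Route any source request to any target via the intermediate format."""
    return TRANSFORMERS[target](NORMALIZERS[source](request))
```

With this shape, supporting a new provider never requires touching the converters of existing ones, which is what keeps the total work at 2 × N.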