# Synapse.AI

Synapse.AI is a bridge between `altar_ai` and Synapse, providing SDK-backed LLM providers and workflow actions for AI-powered multi-agent systems.
## Features

Unlike HTTP-based providers, Synapse.AI leverages the full power of native SDKs through `altar_ai`:

- **Full SDK features**: caching, streaming, batch processing, auth management
- **Automatic fallback**: composite adapter chains across Gemini, Claude, and Codex
- **Unified interface**: a single API across all LLM providers
- **Type safety**: structured responses and error handling
- **Telemetry integration**: unified metrics with FlowStone and other `altar_ai` consumers
## Installation

Add `synapse_ai` to your dependencies in `mix.exs`:

```elixir
def deps do
  [
    {:synapse_ai, path: "../synapse_ai"},
    # Or from Hex (when published)
    # {:synapse_ai, "~> 0.1.0"}
  ]
end
```

## Configuration

Configure SDK-backed providers in your Synapse `ReqLLM` profiles:
```elixir
# config/config.exs
config :synapse, Synapse.ReqLLM,
  profiles: %{
    # Gemini with full SDK support
    gemini_sdk: [
      provider_module: Synapse.AI.Providers.GeminiSDK,
      model: "gemini-pro"
    ],
    # Claude with extended thinking and caching
    claude_sdk: [
      provider_module: Synapse.AI.Providers.ClaudeSDK,
      model: "claude-opus-4-5-20251101"
    ],
    # OpenAI Codex for code generation
    codex_sdk: [
      provider_module: Synapse.AI.Providers.CodexSDK,
      model: "gpt-4o"
    ],
    # Composite with automatic fallback
    composite: [
      provider_module: Synapse.AI.Providers.CompositeSDK,
      fallback_order: [:gemini, :claude, :codex]
    ]
  }
```

## Telemetry Setup

Forward `altar_ai` telemetry on application startup:
```elixir
# lib/my_app/application.ex
defmodule MyApp.Application do
  use Application

  def start(_type, _args) do
    # Forward altar_ai telemetry to synapse.ai namespace
    Synapse.AI.setup_telemetry()
    # ... rest of your application setup
  end
end
```

## Usage

Compose AI steps into Synapse workflows:

```elixir
alias Synapse.Workflow.{Spec, Step}

Spec.new(
  name: :content_moderation,
  steps: [
    Step.new(
      id: :classify_content,
      action: Synapse.AI.Actions.Classify,
      params: %{
        text: "This is a test message",
        labels: ["safe", "spam", "toxic"],
        adapter: :gemini
      }
    ),
    Step.new(
      id: :generate_response,
      action: Synapse.AI.Actions.Generate,
      params: fn env ->
        case env.classify_content.label do
          "safe" -> %{prompt: "Generate a friendly greeting"}
          "spam" -> %{prompt: "Generate a spam warning"}
          "toxic" -> %{prompt: "Generate a moderation notice"}
        end
      end
    )
  ]
)
```

## Providers

Synapse.AI providers implement `Synapse.LLMProvider` but delegate to `altar_ai` adapters:
- `Synapse.AI.Providers.GeminiSDK`: full `gemini_ex` SDK with caching and streaming
- `Synapse.AI.Providers.ClaudeSDK`: full `claude_agent_sdk` with extended thinking
- `Synapse.AI.Providers.CodexSDK`: full OpenAI SDK for code generation
- `Synapse.AI.Providers.CompositeSDK`: automatic fallback across all providers
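A custom provider can follow the same delegation pattern. The sketch below is illustrative only: the `generate/2` callback name and the `Altar.AI.generate/3` call are assumptions, not part of this README's documented API.

```elixir
defmodule MyApp.Providers.MyCustomSDK do
  # Hypothetical sketch: the callback name and the Altar.AI adapter call
  # are assumed, not taken from the actual Synapse.LLMProvider behaviour.
  @behaviour Synapse.LLMProvider

  @impl true
  def generate(prompt, opts) do
    # Delegate to an altar_ai adapter instead of making raw HTTP calls,
    # inheriting the SDK's caching, retries, and auth handling
    Altar.AI.generate(:gemini, prompt, opts)
  end
end
```

The point of the pattern is that the provider stays a thin shim: all transport concerns live in the `altar_ai` adapter it wraps.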
## Actions

Jido Actions for AI operations in Synapse workflows.

### Generate

Text generation with any adapter:
```elixir
Step.new(
  id: :summarize,
  action: Synapse.AI.Actions.Generate,
  params: %{
    prompt: "Summarize the following article: #{article_text}",
    adapter: :claude,
    opts: [max_tokens: 500]
  }
)
```

### Classify

Multi-label classification:
```elixir
Step.new(
  id: :classify_sentiment,
  action: Synapse.AI.Actions.Classify,
  params: %{
    text: user_feedback,
    labels: ["positive", "negative", "neutral"],
    adapter: :gemini
  }
)
```

### Embed

Vector embedding generation:
```elixir
Step.new(
  id: :embed_documents,
  action: Synapse.AI.Actions.Embed,
  params: %{
    texts: documents,
    adapter: :composite
  }
)
```

## Signal Handlers

Pre-built handlers for signal-driven AI operations.
Automatically route signals based on classification:
```elixir
Synapse.SignalRouter.register_handler(
  :incoming_messages,
  &Synapse.AI.SignalHandlers.classify_and_route/2,
  labels: ["urgent", "normal", "spam"],
  text_path: [:data, :message, :body],
  route_prefix: "classified_"
)
```

Add vector embeddings to signals:
```elixir
Synapse.SignalRouter.register_handler(
  :documents,
  &Synapse.AI.SignalHandlers.enrich_with_embeddings/2,
  text_path: [:data, :content],
  adapter: :gemini
)
```

Automatically summarize text content:
```elixir
Synapse.SignalRouter.register_handler(
  :articles,
  &Synapse.AI.SignalHandlers.generate_summary/2,
  text_path: [:data, :article],
  max_length: 100
)
```

## Telemetry

Synapse.AI bridges `altar_ai` telemetry events into the `[:synapse, :ai]` namespace:
```elixir
:telemetry.attach(
  "my-handler",
  [:synapse, :ai, :generate, :stop],
  fn _event, measurements, metadata, _config ->
    IO.puts("Generated #{metadata.model} in #{measurements.duration}ms")
  end,
  nil
)
```

Emitted events:

- `[:synapse, :ai, :generate, :start | :stop | :exception]`
- `[:synapse, :ai, :classify, :start | :stop | :exception]`
- `[:synapse, :ai, :embed, :start | :stop | :exception]`
- `[:synapse, :ai, :batch_embed, :start | :stop | :exception]`
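To observe all of these with a single handler, the standard `telemetry` package provides `:telemetry.attach_many/4`, which takes a list of event names. The handler id and log format below are illustrative:

```elixir
# Build the full cross-product of operations and stages listed above
events =
  for op <- [:generate, :classify, :embed, :batch_embed],
      stage <- [:start, :stop, :exception] do
    [:synapse, :ai, op, stage]
  end

:telemetry.attach_many(
  "synapse-ai-logger", # illustrative handler id
  events,
  fn event, measurements, _metadata, _config ->
    # One handler sees every Synapse.AI operation
    IO.inspect({event, measurements}, label: "synapse.ai telemetry")
  end,
  nil
)
```

Note that raw `:telemetry` durations are reported in native time units; convert with `System.convert_time_unit/3` before logging milliseconds.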
## Architecture

```
┌─────────────────────────────────────────────────────────┐
│                   Synapse Workflows                     │
│  ┌──────────────┐  ┌──────────────┐  ┌──────────────┐   │
│  │   Generate   │  │   Classify   │  │    Embed     │   │
│  │    Action    │  │    Action    │  │    Action    │   │
│  └──────┬───────┘  └──────┬───────┘  └──────┬───────┘   │
└─────────┼─────────────────┼─────────────────┼───────────┘
          │                 │                 │
          └─────────────────┼─────────────────┘
                            │
                   ┌────────▼─────────┐
                   │    Synapse.AI    │
                   │    Providers     │
                   └────────┬─────────┘
                            │
                   ┌────────▼─────────┐
                   │     Altar.AI     │
                   │     Adapters     │
                   └────────┬─────────┘
                            │
         ┌──────────────────┼──────────────────┐
         │                  │                  │
   ┌─────▼─────┐      ┌─────▼─────┐      ┌─────▼─────┐
   │  Gemini   │      │  Claude   │      │   Codex   │
   │    SDK    │      │    SDK    │      │    SDK    │
   └───────────┘      └───────────┘      └───────────┘
```
## SDK vs. HTTP Providers

| Feature | HTTP Providers | Synapse.AI (SDK) |
|---|---|---|
| Streaming | Limited | Full support |
| Caching | None | Provider-native |
| Auth Management | Manual | SDK-handled |
| Rate Limiting | Manual | SDK-handled |
| Retry Logic | Manual | SDK-native |
| Fallback Chains | Manual | Automatic |
| Type Safety | Partial | Full |
| Error Handling | Custom | Unified |
## Advanced Usage

Route work to the best model per task:

```elixir
Spec.new(
  name: :multi_model_pipeline,
  steps: [
    # Fast classification with Gemini
    Step.new(
      id: :classify,
      action: Synapse.AI.Actions.Classify,
      params: %{text: input, labels: ["code", "text"], adapter: :gemini}
    ),
    # Code generation with Codex if needed
    Step.new(
      id: :generate_code,
      action: Synapse.AI.Actions.Generate,
      when: fn env -> env.classify.label == "code" end,
      params: %{prompt: "Generate code for: #{input}", adapter: :codex}
    ),
    # Text refinement with Claude otherwise
    Step.new(
      id: :refine_text,
      action: Synapse.AI.Actions.Generate,
      when: fn env -> env.classify.label == "text" end,
      params: %{prompt: "Refine: #{input}", adapter: :claude}
    )
  ]
)
```

For resilience, use the composite adapter for automatic failover:

```elixir
Step.new(
  id: :resilient_generation,
  action: Synapse.AI.Actions.Generate,
  params: %{
    prompt: "Critical task: #{task}",
    adapter: :composite # Will try Gemini -> Claude -> Codex
  }
)
```

## Testing

Run the test suite:
```shell
mix test
```

All tests use async execution and mock adapters for fast, reliable testing.
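In your own suite, the same mock-adapter approach keeps tests async and offline. The sketch below is hypothetical: the `:mock` adapter, the `run/2` signature, and the result shape are assumptions about the (unpublished) action API, not documented behavior.

```elixir
defmodule MyApp.GenerateTest do
  use ExUnit.Case, async: true

  test "generate returns text without calling a real SDK" do
    # Hypothetical sketch: adapter name, run/2 signature, and result
    # shape are assumed; adapt to the actual Synapse.AI action API.
    assert {:ok, result} =
             Synapse.AI.Actions.Generate.run(
               %{prompt: "hello", adapter: :mock},
               %{}
             )

    assert is_binary(result.text)
  end
end
```

Because the mock adapter never touches the network, `async: true` is safe and the suite stays deterministic.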
## Contributing

Contributions welcome! Please:

- Fork the repository
- Create a feature branch
- Add tests for new functionality
- Ensure `mix test` and `mix format` pass
- Submit a pull request
## License

MIT License - see LICENSE for details.
## Links

- GitHub: https://github.com/nshkrdotcom/synapse_ai
- Related Projects: