
Synapse integration for altar_ai - SDK-backed LLM providers for multi-agent workflows with automatic fallback, signal handlers, and workflow actions


Synapse.AI

SDK-backed LLM providers for Synapse multi-agent workflows



Overview

Synapse.AI is a bridge between altar_ai and Synapse, providing SDK-backed LLM providers and workflow actions for AI-powered multi-agent systems.

Why Synapse.AI?

Unlike HTTP-based providers, Synapse.AI leverages the full power of native SDKs through altar_ai:

  • Full SDK Features: Caching, streaming, batch processing, auth management
  • Automatic Fallback: Composite adapter chains across Gemini, Claude, and Codex
  • Unified Interface: Single API across all LLM providers
  • Type Safety: Structured responses and error handling
  • Telemetry Integration: Unified metrics with FlowStone and other altar_ai consumers

Installation

Add synapse_ai to your dependencies in mix.exs:

def deps do
  [
    {:synapse_ai, path: "../synapse_ai"},
    # Or from Hex (when published)
    # {:synapse_ai, "~> 0.1.0"}
  ]
end

Quick Start

1. Configure Providers

Configure SDK-backed providers in your Synapse ReqLLM profiles:

# config/config.exs
config :synapse, Synapse.ReqLLM,
  profiles: %{
    # Gemini with full SDK support
    gemini_sdk: [
      provider_module: Synapse.AI.Providers.GeminiSDK,
      model: "gemini-pro"
    ],

    # Claude with extended thinking and caching
    claude_sdk: [
      provider_module: Synapse.AI.Providers.ClaudeSDK,
      model: "claude-opus-4-5-20251101"
    ],

    # OpenAI Codex for code generation
    codex_sdk: [
      provider_module: Synapse.AI.Providers.CodexSDK,
      model: "gpt-4o"
    ],

    # Composite with automatic fallback
    composite: [
      provider_module: Synapse.AI.Providers.CompositeSDK,
      fallback_order: [:gemini, :claude, :codex]
    ]
  }

2. Enable Telemetry

# lib/my_app/application.ex
defmodule MyApp.Application do
  use Application

  @impl true
  def start(_type, _args) do
    # Forward altar_ai telemetry to the synapse.ai namespace
    Synapse.AI.setup_telemetry()

    children = [
      # ... rest of your application setup
    ]

    Supervisor.start_link(children, strategy: :one_for_one, name: MyApp.Supervisor)
  end
end

3. Use in Workflows

alias Synapse.Workflow.{Spec, Step}

Spec.new(
  name: :content_moderation,
  steps: [
    Step.new(
      id: :classify_content,
      action: Synapse.AI.Actions.Classify,
      params: %{
        text: "This is a test message",
        labels: ["safe", "spam", "toxic"],
        adapter: :gemini
      }
    ),
    Step.new(
      id: :generate_response,
      action: Synapse.AI.Actions.Generate,
      params: fn env ->
        case env.classify_content.label do
          "safe" -> %{prompt: "Generate a friendly greeting"}
          "spam" -> %{prompt: "Generate a spam warning"}
          "toxic" -> %{prompt: "Generate a moderation notice"}
        end
      end
    )
  ]
)

Core Components

SDK-Backed Providers

These provider modules implement the Synapse.LLMProvider behaviour while delegating the actual work to altar_ai adapters:

  • Synapse.AI.Providers.GeminiSDK: Full gemini_ex SDK with caching and streaming
  • Synapse.AI.Providers.ClaudeSDK: Full claude_agent_sdk with extended thinking
  • Synapse.AI.Providers.CodexSDK: Full OpenAI SDK for code generation
  • Synapse.AI.Providers.CompositeSDK: Automatic fallback across all providers
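
A custom SDK-backed provider can follow the same pattern. The sketch below is illustrative only: it assumes Synapse.LLMProvider defines a generate/2 callback and that Altar.AI exposes a generate/3 entry point, neither of which is confirmed API; the adapter name :my_adapter is likewise hypothetical.

```elixir
defmodule MyApp.Providers.MySDK do
  # Hypothetical sketch: the callback name and the Altar.AI call below are
  # assumptions about the two libraries' APIs, not confirmed signatures.
  @behaviour Synapse.LLMProvider

  @impl true
  def generate(prompt, opts) do
    # Delegate to an altar_ai adapter instead of issuing raw HTTP requests,
    # so caching, auth, and retries come from the underlying SDK.
    Altar.AI.generate(:my_adapter, prompt, opts)
  end
end
```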

Workflow Actions

Jido Actions for AI operations in Synapse workflows:

Generate

Text generation with any adapter:

Step.new(
  id: :summarize,
  action: Synapse.AI.Actions.Generate,
  params: %{
    prompt: "Summarize the following article: #{article_text}",
    adapter: :claude,
    opts: [max_tokens: 500]
  }
)

Classify

Multi-label classification:

Step.new(
  id: :classify_sentiment,
  action: Synapse.AI.Actions.Classify,
  params: %{
    text: user_feedback,
    labels: ["positive", "negative", "neutral"],
    adapter: :gemini
  }
)

Embed

Vector embedding generation:

Step.new(
  id: :embed_documents,
  action: Synapse.AI.Actions.Embed,
  params: %{
    texts: documents,
    adapter: :composite
  }
)

Signal Handlers

Pre-built handlers for signal-driven AI operations:

Classify and Route

Automatically route signals based on classification:

Synapse.SignalRouter.register_handler(
  :incoming_messages,
  &Synapse.AI.SignalHandlers.classify_and_route/2,
  labels: ["urgent", "normal", "spam"],
  text_path: [:data, :message, :body],
  route_prefix: "classified_"
)

Enrich with Embeddings

Add vector embeddings to signals:

Synapse.SignalRouter.register_handler(
  :documents,
  &Synapse.AI.SignalHandlers.enrich_with_embeddings/2,
  text_path: [:data, :content],
  adapter: :gemini
)

Generate Summary

Automatically summarize text content:

Synapse.SignalRouter.register_handler(
  :articles,
  &Synapse.AI.SignalHandlers.generate_summary/2,
  text_path: [:data, :article],
  max_length: 100
)

Telemetry

Synapse.AI bridges altar_ai telemetry events into the [:synapse, :ai] event namespace:

:telemetry.attach(
  "my-handler",
  [:synapse, :ai, :generate, :stop],
  fn _event, measurements, metadata, _config ->
    # Telemetry durations arrive in native time units; convert before printing.
    ms = System.convert_time_unit(measurements.duration, :native, :millisecond)
    IO.puts("Generated with #{metadata.model} in #{ms}ms")
  end,
  nil
)

Available Events

  • [:synapse, :ai, :generate, :start | :stop | :exception]
  • [:synapse, :ai, :classify, :start | :stop | :exception]
  • [:synapse, :ai, :embed, :start | :stop | :exception]
  • [:synapse, :ai, :batch_embed, :start | :stop | :exception]
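
To observe every operation with a single handler, the standard :telemetry.attach_many/4 function from the Erlang telemetry library can be used with the event names listed above. The handler id "synapse-ai-metrics" is arbitrary, and the duration conversion assumes the conventional native time units:

```elixir
# Attach one handler to every Synapse.AI :stop event at once.
events =
  for op <- [:generate, :classify, :embed, :batch_embed] do
    [:synapse, :ai, op, :stop]
  end

:telemetry.attach_many(
  "synapse-ai-metrics",
  events,
  fn [_, _, op, _], measurements, _metadata, _config ->
    ms = System.convert_time_unit(measurements.duration, :native, :millisecond)
    IO.puts("#{op} completed in #{ms}ms")
  end,
  nil
)
```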

Architecture

┌─────────────────────────────────────────────────────────┐
│                    Synapse Workflows                    │
│  ┌──────────────┐  ┌──────────────┐  ┌──────────────┐ │
│  │   Generate   │  │   Classify   │  │    Embed     │ │
│  │    Action    │  │    Action    │  │   Action     │ │
│  └──────┬───────┘  └──────┬───────┘  └──────┬───────┘ │
└─────────┼──────────────────┼──────────────────┼─────────┘
          │                  │                  │
          └──────────────────┼──────────────────┘
                             │
                    ┌────────▼─────────┐
                    │   Synapse.AI     │
                    │   Providers      │
                    └────────┬─────────┘
                             │
                    ┌────────▼─────────┐
                    │    Altar.AI      │
                    │    Adapters      │
                    └────────┬─────────┘
                             │
          ┌──────────────────┼──────────────────┐
          │                  │                  │
    ┌─────▼─────┐     ┌─────▼─────┐     ┌─────▼─────┐
    │  Gemini   │     │  Claude   │     │   Codex   │
    │    SDK    │     │    SDK    │     │    SDK    │
    └───────────┘     └───────────┘     └───────────┘

Benefits Over HTTP-Based Providers

| Feature         | HTTP Providers | Synapse.AI (SDK) |
|-----------------|----------------|------------------|
| Streaming       | Limited        | Full support     |
| Caching         | None           | Provider-native  |
| Auth Management | Manual         | SDK-handled      |
| Rate Limiting   | Manual         | SDK-handled      |
| Retry Logic     | Manual         | SDK-native       |
| Fallback Chains | Manual         | Automatic        |
| Type Safety     | Partial        | Full             |
| Error Handling  | Custom         | Unified          |

Examples

Multi-Model Pipeline

Spec.new(
  name: :multi_model_pipeline,
  steps: [
    # Fast classification with Gemini
    Step.new(
      id: :classify,
      action: Synapse.AI.Actions.Classify,
      params: %{text: input, labels: ["code", "text"], adapter: :gemini}
    ),

    # Code generation with Codex if needed
    Step.new(
      id: :generate_code,
      action: Synapse.AI.Actions.Generate,
      when: fn env -> env.classify.label == "code" end,
      params: %{prompt: "Generate code for: #{input}", adapter: :codex}
    ),

    # Text refinement with Claude otherwise
    Step.new(
      id: :refine_text,
      action: Synapse.AI.Actions.Generate,
      when: fn env -> env.classify.label == "text" end,
      params: %{prompt: "Refine: #{input}", adapter: :claude}
    )
  ]
)

Automatic Fallback

# Use composite adapter for automatic failover
Step.new(
  id: :resilient_generation,
  action: Synapse.AI.Actions.Generate,
  params: %{
    prompt: "Critical task: #{task}",
    adapter: :composite  # Will try Gemini -> Claude -> Codex
  }
)

Testing

Run the test suite:

mix test

All tests use async execution and mock adapters for fast, reliable testing.
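
A test against a mock adapter might look like the following sketch. It is a guess at the shape of the API: the run/2 return value follows the common Jido action convention of {:ok, result}, and the :mock adapter name and result.text field are assumptions, not confirmed parts of Synapse.AI.

```elixir
defmodule MyApp.GenerateTest do
  use ExUnit.Case, async: true

  # Hypothetical: assumes the action exposes run/2 per the Jido convention
  # and that a :mock adapter is configured for the test environment.
  test "generate returns text from the mock adapter" do
    params = %{prompt: "Hello", adapter: :mock}

    assert {:ok, result} = Synapse.AI.Actions.Generate.run(params, %{})
    assert is_binary(result.text)
  end
end
```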

Contributing

Contributions welcome! Please:

  1. Fork the repository
  2. Create a feature branch
  3. Add tests for new functionality
  4. Ensure mix test and mix format pass
  5. Submit a pull request

License

MIT License - see LICENSE for details.

Built with Elixir | Powered by altar_ai | Integrated with Synapse
