fix(providers): propagate abort signal to all LLM SDK calls#3325

Merged
waleedlatif1 merged 3 commits into staging from fix/cancel-model
Feb 24, 2026
Conversation

@waleedlatif1
Collaborator

Summary

  • Add abortSignal to ProviderRequest interface
  • Wire ctx.abortSignal from agent handler into provider requests
  • Pass signal to all SDK call sites across 13 providers (Anthropic, Gemini, OpenAI, Bedrock, DeepSeek, Groq, xAI, Mistral, Cerebras, vLLM, Ollama, OpenRouter, Azure OpenAI)
  • In-flight LLM requests now cancel immediately when an execution times out or the user disconnects
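The wiring described above can be sketched in TypeScript. This is a hypothetical minimal version: only the optional `abortSignal` field on `ProviderRequest` comes from this PR; every other name (`callProvider`, `demo`, the message shape) is invented for illustration.

```typescript
// Hypothetical sketch of the change described above. Only the optional
// `abortSignal` field on ProviderRequest comes from the PR; every other
// name here is invented for illustration.
interface ProviderRequest {
  model: string;
  messages: { role: string; content: string }[];
  abortSignal?: AbortSignal; // new: lets callers cancel in-flight SDK calls
}

// A provider forwards the signal to its SDK call; simulated here with a
// timer so the abort path can be exercised without a real LLM endpoint.
function callProvider(req: ProviderRequest): Promise<string> {
  return new Promise((resolve, reject) => {
    const timer = setTimeout(() => resolve("done"), 10_000);
    req.abortSignal?.addEventListener("abort", () => {
      clearTimeout(timer);
      reject(new Error("aborted"));
    });
  });
}

// Agent-handler side: wire the execution context's signal into the request,
// then abort on timeout or client disconnect.
async function demo(): Promise<string> {
  const controller = new AbortController();
  const pending = callProvider({
    model: "example-model",
    messages: [{ role: "user", content: "hi" }],
    abortSignal: controller.signal,
  });
  controller.abort(); // e.g. execution timeout or user disconnect
  return pending.catch((err: Error) => err.message);
}
```

Because the field is optional, providers that ignore `abortSignal` keep working unchanged, which matches the PR's claim that the parameter is purely additive.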

Type of Change

  • Bug fix

Testing

Tested manually: verified that TypeScript compiles cleanly and that existing provider tests pass.

Checklist

  • Code follows project style guidelines
  • Self-reviewed my changes
  • Tests added/updated and passing
  • No new warnings introduced
  • I confirm that I have read and agree to the terms outlined in the Contributor License Agreement (CLA)

@vercel

vercel bot commented Feb 24, 2026

The latest updates on your projects. Learn more about Vercel for GitHub.

1 Skipped Deployment
Project  Deployment  Actions  Updated (UTC)
docs     Skipped     Skipped  Feb 24, 2026 10:45pm


@greptile-apps
Contributor

greptile-apps bot commented Feb 24, 2026

Greptile Summary

This PR implements proper abort signal propagation across all LLM provider integrations, enabling immediate cancellation of in-flight API requests when executions time out or users disconnect.

Key changes:

  • Added optional abortSignal field to ProviderRequest interface
  • Wired ctx.abortSignal from agent handler into all provider requests
  • Propagated signal to SDK calls across 13 providers (Anthropic, Gemini, OpenAI, Bedrock, DeepSeek, Groq, xAI, Mistral, Cerebras, vLLM, Ollama, OpenRouter, Azure OpenAI)
  • Covered both streaming and non-streaming code paths, as well as tool loop iterations

Implementation is consistent and thorough:

  • Each provider follows the appropriate pattern for its SDK (options object with signal field for OpenAI-compatible SDKs, abortSignal field for AWS SDK, direct config field for Gemini)
  • All SDK call sites have been updated, including initial calls, tool loop iterations, and final streaming responses
  • Changes are minimal and focused; no refactoring or over-engineering
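The three per-SDK shapes listed above can be made concrete. The option objects below are illustrative; only the abort-related field names (`signal` for OpenAI-compatible SDKs, `abortSignal` for the AWS SDK and for the Gemini config) are taken from the description above, and the actual SDK calls are shown only as comments since they require real clients.

```typescript
// Illustrative option shapes only; the surrounding SDK calls are shown as
// comments because they require real clients. The abort-related field names
// follow the per-SDK patterns described above.
const controller = new AbortController();
const signal = controller.signal;

// OpenAI-compatible SDKs (OpenAI, Groq, DeepSeek, xAI, ...): the signal goes
// in the per-request options object, e.g.
//   client.chat.completions.create(body, { signal })
const openAiStyleOptions = { signal };

// AWS SDK v3 (Bedrock): client.send takes an options argument, e.g.
//   client.send(new InvokeModelCommand(input), { abortSignal: signal })
const awsStyleOptions = { abortSignal: signal };

// Gemini: the signal rides inside the request's config object, e.g.
//   generateContent({ model, contents, config: { abortSignal: signal } })
const geminiStyleRequest = { config: { abortSignal: signal } };

// A single controller.abort() cancels whichever in-flight call holds the
// signal, regardless of which SDK shape carried it.
controller.abort();
console.log(geminiStyleRequest.config.abortSignal.aborted); // true
```

Because all three shapes share one `AbortSignal`, the agent handler only needs to hold one `AbortController` per execution to cancel any provider's in-flight request.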

Confidence Score: 5/5

  • This PR is safe to merge with minimal risk
  • The implementation is straightforward, consistent across all providers, and follows each SDK's documented pattern for abort signal handling. The changes are purely additive (optional parameter), maintain backward compatibility, and the author confirmed TypeScript compiles cleanly with existing tests passing.
  • No files require special attention

Important Files Changed

apps/sim/providers/types.ts: Added optional abortSignal field to ProviderRequest interface
apps/sim/executor/handlers/agent/agent-handler.ts: Wired ctx.abortSignal to provider request to enable cancellation
apps/sim/providers/anthropic/core.ts: Propagated abort signal to all Anthropic SDK calls (streaming and non-streaming)
apps/sim/providers/openai/core.ts: Added abort signal to all fetch calls for OpenAI Responses API
apps/sim/providers/gemini/core.ts: Added abort signal to Gemini configuration object
apps/sim/providers/bedrock/index.ts: Passed abort signal to all Bedrock client.send() calls
apps/sim/providers/groq/index.ts: Added abort signal to all Groq SDK calls (streaming and non-streaming)
apps/sim/providers/deepseek/index.ts: Propagated abort signal to all DeepSeek SDK calls
apps/sim/providers/xai/index.ts: Added abort signal to all xAI SDK calls (streaming and non-streaming)
apps/sim/providers/mistral/index.ts: Propagated abort signal to all Mistral SDK calls
apps/sim/providers/cerebras/index.ts: Added abort signal to all Cerebras SDK calls
apps/sim/providers/ollama/index.ts: Propagated abort signal to all Ollama SDK calls
apps/sim/providers/vllm/index.ts: Added abort signal to all vLLM SDK calls
apps/sim/providers/openrouter/index.ts: Propagated abort signal to all OpenRouter SDK calls
apps/sim/providers/azure-openai/index.ts: Added abort signal to all Azure OpenAI SDK calls

Last reviewed commit: a98e20f

Contributor

@greptile-apps greptile-apps bot left a comment


15 files reviewed, no comments



@cursor cursor bot left a comment


Cursor Bugbot has reviewed your changes and found 2 potential issues.

Bugbot Autofix is OFF. To automatically fix reported issues with Cloud Agents, enable Autofix in the Cursor dashboard.

@waleedlatif1
Collaborator Author

@cursor review


@cursor cursor bot left a comment


Cursor Bugbot has reviewed your changes and found 1 potential issue.

Bugbot Autofix is OFF. To automatically fix reported issues with Cloud Agents, enable Autofix in the Cursor dashboard.

@waleedlatif1
Collaborator Author

@cursor review


@cursor cursor bot left a comment


✅ Bugbot reviewed your changes and found no new issues!

Comment @cursor review or bugbot run to trigger another review on this PR

@waleedlatif1 waleedlatif1 merged commit 0574427 into staging Feb 24, 2026
12 checks passed
@waleedlatif1 waleedlatif1 deleted the fix/cancel-model branch February 24, 2026 22:59