fix(opencode): default temperature for openai-compatible config models #25919

Open
adavila0703 wants to merge 2 commits into anomalyco:dev from adavila0703:fix/opencode-openai-compatible-temperature

Conversation

@adavila0703

Issue for this PR

Closes #25755

Type of change

  • [x] Bug fix
  • [ ] New feature
  • [ ] Refactor / code improvement
  • [ ] Documentation

What does this PR do?

Custom models defined in opencode.json that use @ai-sdk/openai-compatible did not have temperature set in their model config. We defaulted
capabilities.temperature to false, and session/llm.ts only sends temperature when that flag is true, so the agent/frontmatter temperature
never reached /v1/chat/completions.
If the resolved SDK is @ai-sdk/openai-compatible and neither the config model nor an inherited models.dev model sets temperature, we now
default the capability to true. Models can still opt out by setting temperature: false on the model in config.
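The defaulting rule described above can be sketched as follows. This is a minimal illustration, not the actual opencode source: the `ModelConfig` shape and the function name `resolveTemperatureCapability` are hypothetical, and the real resolution logic lives in the provider/session code paths mentioned in this PR.

```typescript
// Hypothetical sketch of the capability-defaulting rule from this PR.
// Names (ModelConfig, resolveTemperatureCapability) are illustrative only.
interface ModelConfig {
  npm?: string; // SDK package the provider resolves to, e.g. "@ai-sdk/openai-compatible"
  capabilities?: { temperature?: boolean };
}

function resolveTemperatureCapability(
  configModel: ModelConfig,
  inheritedModel?: ModelConfig, // e.g. a matching models.dev entry
): boolean {
  // An explicit setting (config model first, then inherited model) always wins.
  const explicit =
    configModel.capabilities?.temperature ??
    inheritedModel?.capabilities?.temperature;
  if (explicit !== undefined) return explicit;
  // New behavior: openai-compatible models default to supporting temperature,
  // instead of the previous blanket default of false.
  return configModel.npm === "@ai-sdk/openai-compatible";
}
```

With this rule, a config model that sets temperature: false still opts out, while an openai-compatible model with nothing set now gets the capability, so session/llm.ts forwards the temperature to /v1/chat/completions.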

How did you verify your code works?

  • bun test test/provider/provider.test.ts (extended “custom provider with npm package” to assert capabilities.temperature)
  • bun test test/session/llm.test.ts (sanity for openai-compatible request path)
  • bun typecheck from packages/opencode

Screenshots / recordings

N/A (non-UI)

Checklist

  • I have tested my changes locally
  • I have not included unrelated changes in this PR

adavila0703 and others added 2 commits May 5, 2026 16:10
Config-defined @ai-sdk/openai-compatible models left capabilities.temperature
false when unset, so LLM never sent temperature despite agent config.

Fixes anomalyco#25755

Co-authored-by: Cursor <cursoragent@cursor.com>


Development

Successfully merging this pull request may close these issues.

temperature not sent in request body for custom OpenAI-compatible provider
