
Microsoft.Extensions.AI.OpenAI: TextReasoningContent is read from reasoning_content but not round-tripped back in Chat Completions history #7525

@soul-soft

Describe the bug

Microsoft.Extensions.AI.OpenAI correctly reads OpenAI-compatible reasoning_content from Chat Completions responses and surfaces it as TextReasoningContent, but it does not write that
content back into subsequent assistant history messages when sending the next request.

This breaks compatibility with providers such as DeepSeek reasoning models that require the previous assistant message's reasoning_content to be passed back on the next turn.

In practice:

  • response JSON contains choices[0].message.reasoning_content
  • OpenAIChatClient converts it to TextReasoningContent
  • the message is stored in history as a generic Microsoft.Extensions.AI.ChatMessage
  • on the next request, OpenAIChatClient reconstructs a new AssistantChatMessage
  • that reconstruction path does not serialize TextReasoningContent back to reasoning_content

As a result, the next request sends a normal assistant message without reasoning_content, and DeepSeek rejects it.
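In wire terms, the gap looks roughly like this (field names are the ones the adapter reads; the surrounding JSON is abbreviated and the values are illustrative):

```jsonc
// Turn 1 response: the adapter reads reasoning_content and surfaces it
// as TextReasoningContent.
{
  "choices": [{
    "message": {
      "role": "assistant",
      "reasoning_content": "First, work through the problem step by step...",
      "content": "The answer is 408."
    }
  }]
}

// Turn 2 request, assistant history message as currently reconstructed:
// reasoning_content is gone.
{ "role": "assistant", "content": "The answer is 408." }
```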

Expected behavior

If a previous assistant message contains TextReasoningContent, the Chat Completions adapter should round-trip it back into the next outgoing assistant message, ideally as
reasoning_content for OpenAI-compatible providers.

Actual behavior

TextReasoningContent is visible in ChatMessage.Contents, but it is dropped when history is converted back into outgoing Chat Completions messages.

Why this appears to be the cause

The response path already supports reading reasoning_content:

  • OpenAIChatClient.FromOpenAIChatCompletion(...) extracts $.choices[0].message.reasoning_content
  • streaming also extracts $.choices[0].delta.reasoning_content

However, the request path appears not to support writing it back:

  • ToOpenAIChatMessages(...) reconstructs assistant messages when RawRepresentation is not already an OpenAI.Chat.ChatMessage
  • in that reconstruction logic, TextReasoningContent is not handled specially
  • ToChatMessageContentPart(...) also does not map TextReasoningContent

So the implementation is currently ingress-only for reasoning_content, not round-trip.

Repro

Minimal repro pattern:

  1. Use OpenAI.Chat.ChatClient(...).AsIChatClient()
  2. Call a reasoning-capable OpenAI-compatible Chat Completions endpoint that returns reasoning_content
  3. Store the returned ChatMessage in history
  4. Send a second turn using that history
  5. The outgoing assistant history message no longer contains reasoning_content

With DeepSeek reasoning models, the second request fails with an error equivalent to:

The reasoning_content in the thinking mode must be passed back to the API.
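The steps above can be sketched in C# (endpoint, model name, and API key are placeholders; this follows the Microsoft.Extensions.AI surface but is an illustrative sketch, not a verified program):

```csharp
using System.ClientModel;
using Microsoft.Extensions.AI;
using OpenAI;
using OpenAI.Chat;

// Point the OpenAI client at an OpenAI-compatible reasoning endpoint
// (placeholder endpoint/model/key).
IChatClient client = new ChatClient(
        model: "deepseek-reasoner",
        credential: new ApiKeyCredential("<api-key>"),
        options: new OpenAIClientOptions { Endpoint = new Uri("https://api.deepseek.com/v1") })
    .AsIChatClient();

List<Microsoft.Extensions.AI.ChatMessage> history =
    [new(ChatRole.User, "What is 17 * 24?")];

// Turn 1: the response surfaces reasoning_content as TextReasoningContent.
ChatResponse first = await client.GetResponseAsync(history);
history.AddRange(first.Messages);

// Turn 2: the assistant history message is reconstructed without
// reasoning_content, and the provider rejects the request.
history.Add(new(ChatRole.User, "Now divide that by 2."));
ChatResponse second = await client.GetResponseAsync(history); // fails on DeepSeek
```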

Notes

This is different from the OpenAI Responses path, which already seems to have explicit round-trip handling for reasoning items.

This issue is specifically about the Chat Completions adapter.

Possible fix

When reconstructing assistant history messages in the Chat Completions path, map TextReasoningContent back into the outgoing AssistantChatMessage, likely via Patch / raw JSON so
reasoning_content is preserved for OpenAI-compatible providers.
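One possible shape of that mapping, sketched as a helper the reconstruction path could call. `SetRawJsonProperty` is a stand-in for whatever Patch/raw-JSON hook the OpenAI SDK exposes; it is not a real API name:

```csharp
using Microsoft.Extensions.AI;
using OpenAI.Chat;

// Hypothetical sketch of the missing mapping in ToOpenAIChatMessages(...):
// fold TextReasoningContent back into the outgoing assistant message.
static AssistantChatMessage ToAssistantMessage(Microsoft.Extensions.AI.ChatMessage message)
{
    string text = string.Concat(
        message.Contents.OfType<TextContent>().Select(c => c.Text));
    string reasoning = string.Concat(
        message.Contents.OfType<TextReasoningContent>().Select(c => c.Text));

    var assistant = new AssistantChatMessage(text);
    if (reasoning.Length > 0)
    {
        // Preserve the provider-specific field so OpenAI-compatible backends
        // (e.g. DeepSeek's thinking mode) receive it on the next turn.
        assistant.SetRawJsonProperty("reasoning_content", reasoning); // hypothetical
    }
    return assistant;
}
```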

Related context

The behavior seems related to the earlier change that added support for surfacing OpenAI-compatible reasoning_content as TextReasoningContent. That change appears to cover reading only, not writing the content back in subsequent history turns.

Metadata


    Labels

    area-ai (Microsoft.Extensions.AI libraries), untriaged
