
fix: address issues #313, #314, #330, #341, #363 #380

Open
jherr wants to merge 2 commits into main from claude/fix-multiple-issues-LKtog

Conversation


@jherr jherr commented Mar 14, 2026

https://claude.ai/code/session_01GsXhpJTjCnyZifLHTJj9gT

🎯 Changes

✅ Checklist

  • I have followed the steps in the Contributing guide.
  • I have tested this code locally with pnpm run test:pr.

🚀 Release Impact

  • This change affects published code, and I have generated a changeset.
  • This change is docs/CI/dev-only (no release).

Summary by CodeRabbit

  • New Features

    • Added an "Installation" page to Getting Started.
  • Improvements

    • Support for multimodal tool results and outputs across the platform.
    • Better token-usage reporting for image generation results.
    • Improved adapter handling and normalization for model options.
    • More consistent event/devtools handling for reliable result emission.

- #313: Add installation.md docs page with framework-specific sections
  and anchors to fix 404 links from the frameworks page
- #314: Add snake_case to camelCase conversion for OpenRouter modelOptions
  so options like `tool_choice` are not silently discarded by the SDK
- #330: Parse usageMetadata from Gemini image adapter responses instead
  of hardcoding usage as undefined
- #341: Cache a shared EventTarget on the server via
  globalThis.__TANSTACK_EVENT_TARGET__ so emit() and on() operate on
  the same target in Node/Bun/Workers environments
- #363: Preserve array tool results (multimodal content parts) instead
  of always stringifying, enabling image/multimodal tool responses
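The #314 fix above normalizes snake_case option keys to camelCase before they reach the SDK. A minimal sketch of that idea, assuming illustrative helper names (this is not the adapter's actual code):

```ts
// Convert a single snake_case key to camelCase, e.g. "tool_choice" -> "toolChoice".
function snakeToCamel(key: string): string {
  return key.replace(/_([a-z])/g, (_match, ch: string) => ch.toUpperCase())
}

// Return a copy of modelOptions with every key camelCased so options like
// `tool_choice` are not silently discarded by a camelCase-only SDK surface.
function normalizeModelOptions(
  options: Record<string, unknown>,
): Record<string, unknown> {
  const normalized: Record<string, unknown> = {}
  for (const [key, value] of Object.entries(options)) {
    normalized[snakeToCamel(key)] = value
  }
  return normalized
}
```

Keys that are already camelCase pass through unchanged, so the conversion is safe to apply unconditionally.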

https://claude.ai/code/session_01GsXhpJTjCnyZifLHTJj9gT

changeset-bot bot commented Mar 14, 2026

⚠️ No Changeset found

Latest commit: 4c4c030

Merging this PR will not cause a version bump for any packages. If these changes should not result in a new version, you're good to go. If these changes should result in a version bump, you need to add a changeset.

This PR includes no changesets

When changesets are added to this PR, you'll see the packages that this PR includes changesets for and the associated semver types



coderabbitai bot commented Mar 14, 2026

No actionable comments were generated in the recent review. 🎉

ℹ️ Recent review info
⚙️ Run configuration

Configuration used: defaults

Review profile: CHILL

Plan: Pro

Run ID: 569dcf50-d84b-4697-87ff-c3840cfcc1f0

📥 Commits

Reviewing files that changed from the base of the PR and between 7d7878f and 4c4c030.

📒 Files selected for processing (1)
  • packages/typescript/ai-devtools/src/store/ai-context.tsx

📝 Walkthrough

Walkthrough

This PR expands tool-result support to multimodal content (string or array) across types, processors, and adapters, adds an installation guide in docs, and ensures a shared EventTarget is initialized in non-browser runtimes for event clients.

Changes

- **Documentation** — `docs/config.json`, `docs/getting-started/installation.md`, `docs/getting-started/quick-start.md`: Adds an Installation doc, updates nav order, and inserts the new entry into the Getting Started children.
- **Core Type Definitions** — `packages/typescript/ai-client/src/types.ts`, `packages/typescript/ai/src/types.ts`: Widened tool result types: `ToolResultPart.content` and `ToolCallEndEvent.result` changed from `string` to `string | Array<ContentPart>`.
- **Tool Result Processing** — `packages/typescript/ai/src/activities/chat/index.ts`, `.../messages.ts`, `.../stream/message-updaters.ts`, `.../stream/processor.ts`, `.../tools/tool-calls.ts`: Preserve array (multimodal) outputs directly; keep string outputs as-is; JSON.stringify other types as fallback. Signatures updated where applicable.
- **Event Infrastructure** — `packages/typescript/ai-event-client/src/index.ts`, `packages/typescript/ai-event-client/src/devtools-middleware.ts`: Adds a server-environment guard to create/ensure a shared EventTarget on globalThis for non-browser runtimes; devtools middleware conditionally stringifies non-string tool results before emitting.
- **Adapters** — `packages/typescript/ai-openai/src/adapters/text.ts`, `packages/typescript/ai-openrouter/src/adapters/text.ts`, `packages/typescript/ai-gemini/src/adapters/image.ts`: OpenAI adapter accepts array/string tool outputs; OpenRouter normalizes snake_case modelOptions to camelCase; Gemini image adapter populates usage from response metadata.
- **Devtools / UI Store** — `packages/typescript/ai-devtools/src/store/ai-context.tsx`: Tool-result parts with array content are normalized to JSON strings when stored for the devtools UI.
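The shared-EventTarget guard described under Event Infrastructure can be sketched as below. The global key `__TANSTACK_EVENT_TARGET__` comes from the PR description; the function name and surrounding details are assumptions, not the package's actual code:

```ts
// Resolve one process-wide EventTarget so emit() and on() operate on the
// same target even if this module is evaluated more than once
// (Node/Bun/Workers can load duplicate module instances).
function getSharedEventTarget(): EventTarget {
  const g = globalThis as typeof globalThis & {
    __TANSTACK_EVENT_TARGET__?: EventTarget
  }
  // Create the target once, then reuse the cached instance on every call.
  const target = (g.__TANSTACK_EVENT_TARGET__ ??= new EventTarget())
  return target
}
```

Because both emitter and listener go through the same lookup, a listener registered via one module instance still observes events dispatched from another.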

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~20 minutes

Poem

🐰 I hopped in code where arrays play,
Strings and images join the fray,
Docs to show the rabbit trail,
EventTargets steady every scale,
Multimodal cheer — hop, hooray!

🚥 Pre-merge checks | ✅ 3
✅ Passed checks (3 passed)
- **Title check** — ✅ Passed: The title directly references the GitHub issues being addressed and summarizes the main objective of this multi-issue fix PR.
- **Description check** — ✅ Passed: The description covers all required template sections with detailed issue summaries, completed checklist items, and changeset generation confirmation.
- **Docstring Coverage** — ✅ Passed: Docstring coverage is 100.00% which is sufficient. The required threshold is 80.00%.



nx-cloud bot commented Mar 14, 2026

View your CI Pipeline Execution ↗ for commit 4c4c030

| Command | Status | Duration | Result |
| --- | --- | --- | --- |
| nx affected --targets=test:sherif,test:knip,tes... | ✅ Succeeded | 1m 8s | View ↗ |
| nx run-many --targets=build --exclude=examples/** | ✅ Succeeded | 13s | View ↗ |

☁️ Nx Cloud last updated this comment at 2026-03-15 04:13:05 UTC


pkg-pr-new bot commented Mar 14, 2026

Open in StackBlitz

@tanstack/ai

npm i https://pkg.pr.new/@tanstack/ai@380

@tanstack/ai-anthropic

npm i https://pkg.pr.new/@tanstack/ai-anthropic@380

@tanstack/ai-client

npm i https://pkg.pr.new/@tanstack/ai-client@380

@tanstack/ai-devtools-core

npm i https://pkg.pr.new/@tanstack/ai-devtools-core@380

@tanstack/ai-elevenlabs

npm i https://pkg.pr.new/@tanstack/ai-elevenlabs@380

@tanstack/ai-event-client

npm i https://pkg.pr.new/@tanstack/ai-event-client@380

@tanstack/ai-fal

npm i https://pkg.pr.new/@tanstack/ai-fal@380

@tanstack/ai-gemini

npm i https://pkg.pr.new/@tanstack/ai-gemini@380

@tanstack/ai-grok

npm i https://pkg.pr.new/@tanstack/ai-grok@380

@tanstack/ai-groq

npm i https://pkg.pr.new/@tanstack/ai-groq@380

@tanstack/ai-ollama

npm i https://pkg.pr.new/@tanstack/ai-ollama@380

@tanstack/ai-openai

npm i https://pkg.pr.new/@tanstack/ai-openai@380

@tanstack/ai-openrouter

npm i https://pkg.pr.new/@tanstack/ai-openrouter@380

@tanstack/ai-preact

npm i https://pkg.pr.new/@tanstack/ai-preact@380

@tanstack/ai-react

npm i https://pkg.pr.new/@tanstack/ai-react@380

@tanstack/ai-react-ui

npm i https://pkg.pr.new/@tanstack/ai-react-ui@380

@tanstack/ai-solid

npm i https://pkg.pr.new/@tanstack/ai-solid@380

@tanstack/ai-solid-ui

npm i https://pkg.pr.new/@tanstack/ai-solid-ui@380

@tanstack/ai-svelte

npm i https://pkg.pr.new/@tanstack/ai-svelte@380

@tanstack/ai-vue

npm i https://pkg.pr.new/@tanstack/ai-vue@380

@tanstack/ai-vue-ui

npm i https://pkg.pr.new/@tanstack/ai-vue-ui@380

@tanstack/preact-ai-devtools

npm i https://pkg.pr.new/@tanstack/preact-ai-devtools@380

@tanstack/react-ai-devtools

npm i https://pkg.pr.new/@tanstack/react-ai-devtools@380

@tanstack/solid-ai-devtools

npm i https://pkg.pr.new/@tanstack/solid-ai-devtools@380

commit: 4c4c030


@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 5

🧹 Nitpick comments (4)
docs/getting-started/installation.md (1)

78-88: Vue and Svelte sections lack usage examples and API doc links.

The React, Solid, and Preact sections include code examples and links to API documentation, but Vue and Svelte only have install commands. Consider adding equivalent content for consistency, or clarifying if these integrations are still in development.

📝 Suggested additions

````diff
 ## Vue

 ```bash
 npm install @tanstack/ai-vue
 ```

+The Vue integration provides composables for managing chat state. See the @tanstack/ai-vue API docs for full details.

 ## Svelte

 ```bash
 npm install @tanstack/ai-svelte
 ```

+The Svelte integration provides stores for managing chat state. See the @tanstack/ai-svelte API docs for full details.
````
🤖 Prompt for AI Agents

Verify each finding against the current code and only fix it if needed.

In @docs/getting-started/installation.md around lines 78 - 88, Add usage
examples and links to the Vue and Svelte API documentation in the
installation.md file to match the existing React, Solid, and Preact sections.
For Vue, mention that it provides composables for managing chat state and link
to the @tanstack/ai-vue API docs; for Svelte, note it offers stores for chat
state management and link to the @tanstack/ai-svelte API docs. Place these
additions immediately below the respective installation commands.



packages/typescript/ai/src/types.ts (1)

873-874: Consider updating `RealtimeToolResultPart` to support multimodal content for consistency with other tool result types.

The `ToolResultPart` and `ToolCallEndEvent` types both support `string | Array<ContentPart>` for tool results, enabling multimodal responses. However, `RealtimeToolResultPart` only accepts `string`. While this likely reflects the simpler requirements of the realtime audio-first protocol, aligning the type signature would improve consistency across the codebase. If realtime tool execution scenarios do require multimodal content (e.g., tool returning images), update `RealtimeToolResultPart.content` to `string | Array<ContentPart>` to match.

🤖 Prompt for AI Agents

Verify each finding against the current code and only fix it if needed.

In @packages/typescript/ai/src/types.ts around lines 873 - 874, Update the
RealtimeToolResultPart type so its content property accepts multimodal data like
the other tool result types: change RealtimeToolResultPart.content from string
to `string | Array<ContentPart>` (matching ToolResultPart and ToolCallEndEvent)
and ensure any related usages of RealtimeToolResultPart (e.g.,
serialization/consumers) handle an `Array<ContentPart>` alongside string.


packages/typescript/ai/src/activities/chat/stream/message-updaters.ts (1)

100-100: Tighten tool-result content typing to avoid `any`.

Use the canonical `ToolResultPart['content']` type here so future content-shape changes stay synchronized and type-safe.

♻️ Suggested refactor

```diff
 export function updateToolResultPart(
   messages: Array<UIMessage>,
   messageId: string,
   toolCallId: string,
-  content: string | Array<any>,
+  content: ToolResultPart['content'],
   state: ToolResultState,
   error?: string,
 ): Array<UIMessage> {
```
Based on learnings: Maintain type safety through multimodal content support (image, audio, video, document) with model capability awareness.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@packages/typescript/ai/src/activities/chat/stream/message-updaters.ts` at
line 100, The property currently typed as "content: string | Array<any>" is too
loose; replace it with the canonical ToolResultPart['content'] type to keep
multimodal tool-result shapes synchronized and type-safe. Update the declaration
that currently uses "content: string | Array<any>" in message-updaters.ts to
reference ToolResultPart['content'], and ensure any imports or type aliases
include ToolResultPart so the compiler enforces the correct shape for
images/audio/video/documents as models evolve.
packages/typescript/ai-gemini/src/adapters/image.ts (1)

205-224: Remove as any and properly type forward-compatibility for future SDK versions.

The usageMetadata property does not exist on GenerateImagesResponse in @google/genai v1.43.0 (the version used). The defensive code at line 206 uses as any to handle a future SDK update, but this approach weakens type safety. Instead of casting to any, use a proper type guard with a narrowing check—either remove this forward-compatibility code if it's not immediately needed, or implement it with a type-safe approach:

```ts
const usageMeta =
  'usageMetadata' in response
    ? (response as GenerateImagesResponse & {
        usageMetadata?: {
          promptTokenCount?: number
          candidatesTokenCount?: number
          totalTokenCount?: number
        }
      }).usageMetadata
    : undefined
```

This preserves type safety without as any and prepares for the property if it's added to the SDK in the future.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@packages/typescript/ai-gemini/src/adapters/image.ts` around lines 205 - 224,
The current code casts response to any to access a non-existent usageMetadata,
weakening type safety; replace that cast by using a type guard: check
"'usageMetadata' in response" and if present narrow response to
GenerateImagesResponse & { usageMetadata?: { promptTokenCount?: number;
candidatesTokenCount?: number; totalTokenCount?: number } } and read its
usageMetadata into the usageMeta variable, otherwise set usageMeta to undefined;
update the usage construction that references usageMeta
(inputTokens/outputTokens/totalTokens) to remain unchanged.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@docs/getting-started/installation.md`:
- Around line 100-127: The Adapters installation list is missing the Fal adapter
referenced in the docs navigation; update the "Adapters" section in
installation.md to include the Fal adapter by adding an entry for the Fal
package (npm install `@tanstack/ai-fal`) alongside the other providers and ensure
the text still points readers to the Adapters section for provider-specific
setup; keep naming consistent with existing entries (e.g., "Fal") so it matches
the navigation config.

In `@packages/typescript/ai-client/src/types.ts`:
- Around line 146-147: The ToolResultPart.content property currently uses a
loose Array<any>; update it to use the specific ContentPart type for consistency
and stronger typing by changing the union from string | Array<any> to string |
Array<ContentPart> (modify the ToolResultPart.content declaration to reference
ContentPart which is already imported).

In `@packages/typescript/ai-event-client/src/devtools-middleware.ts`:
- Around line 198-201: The current serialization for the result field uses
JSON.stringify(chunk.result ?? ''), which turns null/undefined into the
two-character string '" "' instead of an empty string; update the logic around
the result key (the expression using typeof chunk.result === 'string' ? ... :
...) to short-circuit null/undefined explicitly so that when chunk.result is
null or undefined you return '' (empty string), otherwise call JSON.stringify on
the non-string value; reference the existing chunk.result check and the result
property so the branch becomes: if it's a string return it, else if chunk.result
== null return '' else return JSON.stringify(chunk.result).

In `@packages/typescript/ai-openai/src/adapters/text.ts`:
- Around line 709-716: The function_call_output.output assignment incorrectly
allows arrays for message.content which the OpenAI Responses API does not
accept; update the logic in the adapter where function_call_output.output is
built (referencing message.content and function_call_output.output) to always
pass a string: if message.content is already a string use it, otherwise
JSON.stringify(message.content) so arrays and objects become valid string
payloads; ensure any downstream consumers parse the JSON string if they expect
structured multimodal data.

In `@packages/typescript/ai/src/activities/chat/messages.ts`:
- Around line 251-253: The code currently JSON.stringify's non-array outputs
which turns string tool outputs into quoted strings; change the content
assignment for part.output so strings are preserved: replace the ternary that
sets content using Array.isArray(part.output) ? part.output :
JSON.stringify(part.output) with logic that returns part.output when it's an
array, returns part.output as-is when typeof part.output === 'string', and only
calls JSON.stringify for other types (e.g., objects), referencing the same
part.output expression in messages.ts.

---

Nitpick comments:
In `@docs/getting-started/installation.md`:
- Around line 78-88: Add usage examples and links to the Vue and Svelte API
documentation in the installation.md file to match the existing React, Solid,
and Preact sections. For Vue, mention that it provides composables for managing
chat state and link to the `@tanstack/ai-vue` API docs; for Svelte, note it offers
stores for chat state management and link to the `@tanstack/ai-svelte` API docs.
Place these additions immediately below the respective installation commands.

In `@packages/typescript/ai-gemini/src/adapters/image.ts`:
- Around line 205-224: The current code casts response to any to access a
non-existent usageMetadata, weakening type safety; replace that cast by using a
type guard: check "'usageMetadata' in response" and if present narrow response
to GenerateImagesResponse & { usageMetadata?: { promptTokenCount?: number;
candidatesTokenCount?: number; totalTokenCount?: number } } and read its
usageMetadata into the usageMeta variable, otherwise set usageMeta to undefined;
update the usage construction that references usageMeta
(inputTokens/outputTokens/totalTokens) to remain unchanged.

In `@packages/typescript/ai/src/activities/chat/stream/message-updaters.ts`:
- Line 100: The property currently typed as "content: string | Array<any>" is
too loose; replace it with the canonical ToolResultPart['content'] type to keep
multimodal tool-result shapes synchronized and type-safe. Update the declaration
that currently uses "content: string | Array<any>" in message-updaters.ts to
reference ToolResultPart['content'], and ensure any imports or type aliases
include ToolResultPart so the compiler enforces the correct shape for
images/audio/video/documents as models evolve.

In `@packages/typescript/ai/src/types.ts`:
- Around line 873-874: Update the RealtimeToolResultPart type so its content
property accepts multimodal data like the other tool result types: change
RealtimeToolResultPart.content from string to string | Array<ContentPart>
(matching ToolResultPart and ToolCallEndEvent) and ensure any related usages of
RealtimeToolResultPart (e.g., serialization/consumers) handle an
Array<ContentPart> alongside string.

ℹ️ Review info
⚙️ Run configuration

Configuration used: defaults

Review profile: CHILL

Plan: Pro

Run ID: e1795af3-dc4d-489d-b6ec-8d72d5fb5fcf

📥 Commits

Reviewing files that changed from the base of the PR and between 0e21282 and 7d7878f.

📒 Files selected for processing (15)
  • docs/config.json
  • docs/getting-started/installation.md
  • docs/getting-started/quick-start.md
  • packages/typescript/ai-client/src/types.ts
  • packages/typescript/ai-event-client/src/devtools-middleware.ts
  • packages/typescript/ai-event-client/src/index.ts
  • packages/typescript/ai-gemini/src/adapters/image.ts
  • packages/typescript/ai-openai/src/adapters/text.ts
  • packages/typescript/ai-openrouter/src/adapters/text.ts
  • packages/typescript/ai/src/activities/chat/index.ts
  • packages/typescript/ai/src/activities/chat/messages.ts
  • packages/typescript/ai/src/activities/chat/stream/message-updaters.ts
  • packages/typescript/ai/src/activities/chat/stream/processor.ts
  • packages/typescript/ai/src/activities/chat/tools/tool-calls.ts
  • packages/typescript/ai/src/types.ts

Comment on lines +100 to +127
## Adapters

You also need an adapter for your LLM provider. Install one (or more) of the following:

```bash
# OpenRouter (recommended — 300+ models with one API key)
npm install @tanstack/ai-openrouter

# OpenAI
npm install @tanstack/ai-openai

# Anthropic
npm install @tanstack/ai-anthropic

# Google Gemini
npm install @tanstack/ai-gemini

# Ollama (local models)
npm install @tanstack/ai-ollama

# Groq
npm install @tanstack/ai-groq

# Grok (xAI)
npm install @tanstack/ai-grok
```

See the [Adapters section](../adapters/openai) for provider-specific setup guides.

⚠️ Potential issue | 🟡 Minor

Missing Fal adapter from the adapters list.

The Fal adapter is listed in the navigation config (docs/config.json line 176-178) under Adapters, but is not included in the installation instructions here.

📝 Suggested addition
```diff
 # Grok (xAI)
 npm install @tanstack/ai-grok
+
+# Fal (image/video generation)
+npm install @tanstack/ai-fal
```

🤖 Prompt for AI Agents

Verify each finding against the current code and only fix it if needed.

In @docs/getting-started/installation.md around lines 100 - 127, The Adapters
installation list is missing the Fal adapter referenced in the docs navigation;
update the "Adapters" section in installation.md to include the Fal adapter by
adding an entry for the Fal package (npm install @tanstack/ai-fal) alongside the
other providers and ensure the text still points readers to the Adapters section
for provider-specific setup; keep naming consistent with existing entries (e.g.,
"Fal") so it matches the navigation config.



Comment on lines +146 to +147
```ts
/** Tool result content. String for text results, or an array for multimodal results. */
content: string | Array<any>
```

⚠️ Potential issue | 🟡 Minor

🧩 Analysis chain

🏁 Scripts executed:

```bash
# Search for direct string method calls on ToolResultPart.content without type guards
rg -n --type=ts 'content\.(trim|split|substring|slice|replace|match|toLowerCase|toUpperCase)' packages/typescript/ai-client/

# Read the types.ts file around lines 146-147 to see the actual definition
sed -n '140,155p' packages/typescript/ai-client/src/types.ts
grep -n -A 3 "ToolResultPart" packages/typescript/ai/src/types.ts | head -20

# Search for other potential unsafe content usage patterns more broadly
rg -n '\.content' packages/typescript/ai-client/src/ -A 2 -B 2 --type=ts | head -100

# Get the complete ToolResultPart definition from ai/src/types.ts
sed -n '298,310p' packages/typescript/ai/src/types.ts

# Search for the ContentPart type definition to understand the type hierarchy
rg -n "type ContentPart|interface ContentPart" packages/typescript/ai/src/types.ts -A 5

# Check the context around chat-client.ts:352 to see the full usage
sed -n '345,360p' packages/typescript/ai-client/src/chat-client.ts

# Check if ContentPart is properly exported/imported in ai-client
rg -n "ContentPart|import.*types" packages/typescript/ai-client/src/types.ts | head -20
```


Use Array<ContentPart> for type consistency with the main ai package.

The ToolResultPart.content type in ai-client uses Array<any>, while packages/typescript/ai/src/types.ts defines it as string | Array<ContentPart>. Since ContentPart is already imported in this file, use the properly-typed alternative to maintain consistency and provide better type safety across the codebase.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@packages/typescript/ai-client/src/types.ts` around lines 146 - 147, The
ToolResultPart.content property currently uses a loose Array<any>; update it to
use the specific ContentPart type for consistency and stronger typing by
changing the union from string | Array<any> to string | Array<ContentPart>
(modify the ToolResultPart.content declaration to reference ContentPart which is
already imported).

Comment on lines +198 to +201
```ts
result:
  typeof chunk.result === 'string'
    ? chunk.result
    : JSON.stringify(chunk.result ?? ''),
```

⚠️ Potential issue | 🟡 Minor

Bug: Undefined results produce '""' instead of empty string.

When chunk.result is undefined or null, this evaluates to:

```ts
JSON.stringify('' /* from ?? '' */) // produces '""' (a 2-character string)
```

This changes the behavior from producing '' (falsy, empty) to '""' (truthy, parseable as empty string). Downstream code checking truthiness or parsing will behave differently:

  • if (chunk.result) will now be true for '""'
  • JSON.parse('""') yields "" instead of failing

Consider preserving the original empty string behavior for undefined/null:

🐛 Proposed fix

```diff
           result:
             typeof chunk.result === 'string'
               ? chunk.result
-              : JSON.stringify(chunk.result ?? ''),
+              : chunk.result != null
+                ? JSON.stringify(chunk.result)
+                : '',
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@packages/typescript/ai-event-client/src/devtools-middleware.ts` around lines
198 - 201, The current serialization for the result field uses
JSON.stringify(chunk.result ?? ''), which turns null/undefined into the
two-character string '" "' instead of an empty string; update the logic around
the result key (the expression using typeof chunk.result === 'string' ? ... :
...) to short-circuit null/undefined explicitly so that when chunk.result is
null or undefined you return '' (empty string), otherwise call JSON.stringify on
the non-string value; reference the existing chunk.result check and the result
property so the branch becomes: if it's a string return it, else if chunk.result
== null return '' else return JSON.stringify(chunk.result).
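A minimal, self-contained reproduction of the difference this comment describes (the helper names `buggy`/`fixed` are illustrative, not the middleware's code):

```ts
// Current behavior: null/undefined falls through to JSON.stringify('').
const buggy = (result: unknown): string =>
  typeof result === 'string' ? result : JSON.stringify(result ?? '')

// Proposed behavior: short-circuit null/undefined to an empty string.
const fixed = (result: unknown): string =>
  typeof result === 'string'
    ? result
    : result != null
      ? JSON.stringify(result)
      : ''

console.log(buggy(undefined)) // '""' — truthy, two characters
console.log(fixed(undefined)) // ''   — falsy, empty
```

For actual string and object results the two variants agree; they diverge only on null/undefined, which is exactly the case downstream truthiness checks depend on.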

Comment on lines +709 to +716
```diff
           // Support multimodal tool outputs (OpenAI Responses API accepts
           // string or array of content parts for function_call_output).
           output:
-            typeof message.content === 'string'
+            typeof message.content === 'string' ||
+            Array.isArray(message.content)
               ? message.content
               : JSON.stringify(message.content),
         })
       } as any)
```

⚠️ Potential issue | 🔴 Critical

🧩 Analysis chain

🌐 Web query:

Does OpenAI Responses API function_call_output support array content multimodal?

💡 Result:

No. In the Responses API, a function_call_output item’s payload is the output string (free text—often a JSON-encoded string). It does not support the “array of content parts” multimodal content: [...] structure used by message items. [1], [2]

If you need multimodal data downstream, return structured JSON in output (e.g., with file/image URLs or IDs) and then include those as proper multimodal inputs via a subsequent message item with content parts. [1], [2]

Sources: [1], [2]


The multimodal tool output logic is incorrect and will fail at runtime.

The OpenAI Responses API function_call_output.output field only accepts strings, not arrays. The current code at lines 712–716 passes message.content directly if it's an array, which violates the API contract:

```ts
output:
  typeof message.content === 'string' ||
  Array.isArray(message.content)
    ? message.content // ❌ Arrays not supported by output field
    : JSON.stringify(message.content),
```

Fix: Always stringify the output, including arrays:

```ts
output: typeof message.content === 'string'
  ? message.content
  : JSON.stringify(message.content),
```

If multimodal data is needed, encode it as structured JSON (e.g., with URLs or references) in the string and let downstream code parse it.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@packages/typescript/ai-openai/src/adapters/text.ts` around lines 709 - 716,
The function_call_output.output assignment incorrectly allows arrays for
message.content which the OpenAI Responses API does not accept; update the logic
in the adapter where function_call_output.output is built (referencing
message.content and function_call_output.output) to always pass a string: if
message.content is already a string use it, otherwise
JSON.stringify(message.content) so arrays and objects become valid string
payloads; ensure any downstream consumers parse the JSON string if they expect
structured multimodal data.

Comment on lines +251 to +253
```ts
content: Array.isArray(part.output)
  ? part.output
  : JSON.stringify(part.output),
```

⚠️ Potential issue | 🟠 Major

Preserve string tool outputs here as well (not only arrays).

This path still JSON.stringifys string outputs, so client tool results become quoted strings while server tool results stay raw strings. That mismatch can change model context interpretation.

🐛 Proposed fix

```diff
         role: 'tool',
-        content: Array.isArray(part.output)
-          ? part.output
-          : JSON.stringify(part.output),
+        content:
+          typeof part.output === 'string' || Array.isArray(part.output)
+            ? part.output
+            : JSON.stringify(part.output),
         toolCallId: part.id,
       })
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@packages/typescript/ai/src/activities/chat/messages.ts` around lines 251 -
253, The code currently JSON.stringify's non-array outputs which turns string
tool outputs into quoted strings; change the content assignment for part.output
so strings are preserved: replace the ternary that sets content using
Array.isArray(part.output) ? part.output : JSON.stringify(part.output) with
logic that returns part.output when it's an array, returns part.output as-is
when typeof part.output === 'string', and only calls JSON.stringify for other
types (e.g., objects), referencing the same part.output expression in
messages.ts.
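The preservation rule this comment suggests can be factored into a small helper. This is a sketch; the helper name and the `ContentPart` shape are illustrative, not the repository's actual definitions:

```ts
// Minimal stand-in for the library's ContentPart type (illustrative only).
type ContentPart = { type: string; [key: string]: unknown }

// Arrays (multimodal parts) and strings pass through unchanged; every other
// value (objects, numbers, booleans, null) is JSON-stringified as a fallback.
function normalizeToolOutput(output: unknown): string | Array<ContentPart> {
  if (Array.isArray(output)) return output as Array<ContentPart>
  if (typeof output === 'string') return output
  return JSON.stringify(output)
}
```

Routing both client and server tool results through one helper like this keeps the string/array/fallback behavior consistent across the call sites the review flagged.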

ToolResultPart.content was changed to `string | Array<ContentPart>` in
7d7878f but the devtools consumer still expected `string`. Stringify
array content so the MessagePart type is satisfied.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>