Description
TanStack AI version
@tanstack/ai@0.8.0, @tanstack/ai-anthropic@0.6.2
Framework/Library version
@anthropic-ai/sdk ^0.71.2, Node.js
Describe the bug and the steps to reproduce it
The anthropicText adapter joins system prompts into a plain string before passing them to the Anthropic API:
```ts
// src/adapters/text.ts, line 289
system: options.systemPrompts?.join('\n'),
```

This prevents using Anthropic's prompt caching on system prompts. The Anthropic SDK accepts `system` as either a `string` or `TextBlockParam[]`, and only the array form supports `cache_control`:
```ts
// String form (current) — no caching possible
system: "You are a helpful assistant..."

// Array form (needed) — supports cache_control
system: [
  {
    type: "text",
    text: "You are a helpful assistant...",
    cache_control: { type: "ephemeral" },
  },
]
```

The adapter already supports `cache_control` on message content parts via the metadata system (`AnthropicTextMetadata`), so this is just a gap in the system prompt path.
Impact: In agentic loops using chat(), the system prompt is resent on every iteration. Without caching, a large system prompt is billed at full input price on every call. With
caching, repeat reads cost 90% less ($0.30/MTok vs $3/MTok). For a 22-iteration agent run with a 24K-token system prompt, this wastes ~$1.50-2.00 per run.
Proposed fix: Convert system prompts to TextBlockParam[] and expose a way to pass cache_control metadata on them — either via a new option on systemPrompts entries or by
accepting structured objects alongside plain strings.
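To make the shape concrete, here is a rough sketch of the conversion under the "structured objects alongside plain strings" option. The `SystemPromptEntry` union and the `cacheControl` field name are hypothetical; only `cache_control: { type: "ephemeral" }` comes from the Anthropic SDK:

```typescript
// Hypothetical entry shape: plain strings stay supported, structured
// entries can opt into caching. Names here are illustration only.
type SystemPromptEntry =
  | string
  | { text: string; cacheControl?: { type: 'ephemeral' } };

// Minimal stand-in for the SDK's TextBlockParam for this sketch.
type TextBlockParamLike = {
  type: 'text';
  text: string;
  cache_control?: { type: 'ephemeral' };
};

function toSystemBlocks(
  prompts?: SystemPromptEntry[],
): TextBlockParamLike[] | undefined {
  if (!prompts || prompts.length === 0) return undefined;
  return prompts.map((p) => {
    if (typeof p === 'string') return { type: 'text' as const, text: p };
    const block: TextBlockParamLike = { type: 'text', text: p.text };
    if (p.cacheControl) block.cache_control = p.cacheControl;
    return block;
  });
}

// Example: only the large entry carries a cache_control marker.
const system = toSystemBlocks([
  'You are a helpful assistant...',
  { text: 'large tool documentation...', cacheControl: { type: 'ephemeral' } },
]);
```

The adapter would then pass the resulting array straight through as `system`, which keeps current string-only callers byte-for-byte compatible while letting callers mark individual prompts cacheable.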
I'm happy to open a PR for this if the approach sounds right.
Your Minimal, Reproducible Example - (Sandbox Highly Recommended)
n/a
Screenshots or Videos (Optional)
No response
Do you intend to try to help solve this bug with your own PR?
Yes, I think I know how to fix it and will discuss it in the comments of this issue
Terms & Code of Conduct
- I agree to follow this project's Code of Conduct
- I understand that if my bug cannot be reliably reproduced in a debuggable environment, it will probably not be fixed and this issue may even be closed.