Replies: 2 comments
-
Hi
-
Hi! At the moment, streaming tool calls with Chat Completions isn't fully supported for "custom" tool types in the official API or SDK. The spec and current SDK implementations only explicitly list support for `type="function"` tool calls. In short:
If you need streaming of tool calls, or fine-grained streaming of custom tool invocations, one option is to use the Responses API instead — it has richer streaming events and better tooling support for advanced tool interactions, including custom tools. The Responses API is the newer recommended interface for complex agent-style streaming.

Here's a minimal streaming snippet showing what is supported with function tools:

```python
from openai import OpenAI

client = OpenAI()
stream = client.chat.completions.create(
    model="gpt-5",
    messages=[{"role": "user", "content": "Hello"}],
    tools=[
        {"type": "function", "function": {"name": "my_tool"}}
    ],
    stream=True,
)
for chunk in stream:
    # text and tool-call deltas arrive chunk by chunk
    print(chunk.choices[0].delta)
```
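As a side note on consuming such a stream: function-call arguments arrive as string fragments on `delta.tool_calls`, keyed by `index`, and have to be stitched together client-side before you can parse them. Here's a minimal sketch of that accumulation using plain dicts shaped like Chat Completions tool-call deltas — the example deltas below are made-up illustrations, not real API output:

```python
import json

def accumulate_tool_calls(deltas):
    """Merge streamed tool-call deltas (dicts shaped like the entries of
    choices[0].delta.tool_calls) into complete tool calls."""
    calls = {}  # index -> {"id": ..., "name": ..., "arguments": str}
    for delta in deltas:
        for tc in delta.get("tool_calls", []):
            entry = calls.setdefault(
                tc["index"], {"id": None, "name": None, "arguments": ""}
            )
            if tc.get("id"):
                entry["id"] = tc["id"]
            fn = tc.get("function", {})
            if fn.get("name"):
                entry["name"] = fn["name"]
            # argument JSON arrives in string fragments; concatenate them
            entry["arguments"] += fn.get("arguments", "")
    return [calls[i] for i in sorted(calls)]

# Hypothetical stream of deltas for a single function call:
deltas = [
    {"tool_calls": [{"index": 0, "id": "call_1",
                     "function": {"name": "my_tool", "arguments": ""}}]},
    {"tool_calls": [{"index": 0, "function": {"arguments": '{"city": '}}]},
    {"tool_calls": [{"index": 0, "function": {"arguments": '"Oslo"}'}}]},
]

calls = accumulate_tool_calls(deltas)
args = json.loads(calls[0]["arguments"])  # -> {"city": "Oslo"}
```

A custom-tool delta (if and when the chunk schema supports it) would presumably carry a plain-text `input` fragment instead of JSON `arguments`, which is part of why the current `type="function"`-only typing bites downstream code.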
-
Context:
In the documentation here, an example is given for passing `"custom"` tools to models using the Chat Completions API. However, in the chat completion chunk object described here, `choices.delta.tool_calls` only seems to support `type="function"`, with no mention of `"custom"`. The same is reflected in the `openai` Python SDK, currently leading to, among other things, typing issues in downstream custom code.

Question:
So does this mean that streaming from Chat Completions while passing `"custom"` tools isn't supported? Or perhaps is the API definition incomplete? Thanks in advance.