Observation
The chat() function and underlying ChatRequest model in ollama-python do not expose a tool_choice parameter, even though several upstream endpoints already handle it:
OpenAI-compat endpoint in the ollama/ollama server (openai/responses.go): ToolChoice is parsed off the request, with TODO comments noting partial support. (Exercised in the sketch after this list.)
Anthropic-compat endpoint (anthropic/trace.go + anthropic/anthropic.go): ToolChoice is wired through.
Native /api/chat: tool_choice is not a request field today; tracked upstream at ollama/ollama#11171 ("Add support for tool_choice set to any to require a model to use a tool").
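For context, this is how the OpenAI-compat path is typically exercised today, by pointing the openai SDK at a local Ollama server. The base URL, model name, and tool schema below are illustrative assumptions, and tool_choice handling on this path is only partial per the upstream TODOs:

```python
from openai import OpenAI

# Assumes a local Ollama server exposing the OpenAI-compatible API at /v1.
client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")

resp = client.chat.completions.create(
    model="llama3.1:8b",
    messages=[{"role": "user", "content": "What's the weather in Berlin?"}],
    tools=[
        {
            "type": "function",
            "function": {
                "name": "get_weather",
                "description": "Get the current weather for a city",
                "parameters": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            },
        }
    ],
    tool_choice="auto",  # accepted by the compat layer; no equivalent on native /api/chat
)
print(resp.choices[0].message.tool_calls)
```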
Empirical behavior today

```python
import ollama

# Test 1: tool_choice as kwarg → hard rejection
ollama.chat(
    model="llama3.1:8b",
    messages=[{"role": "user", "content": "Hello"}],
    tool_choice="auto",
)
# TypeError: Client.chat() got an unexpected keyword argument 'tool_choice'

# Test 2: tool_choice via options dict → silently accepted but not wired
# (options is passed through as model parameters; tool_choice has no effect there)
ollama.chat(
    model="llama3.1:8b",
    messages=[{"role": "user", "content": "Hello"}],
    options={"tool_choice": "auto"},
)
# OK, no error, but tool_choice is not interpreted by the server
```
ChatRequest.model_json_schema() confirms there is no tool_choice field. Current fields: model, stream, options, format, keep_alive, messages, tools, think, logprobs, top_logprobs.
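For anyone reproducing the check, a minimal snippet; it assumes ChatRequest is imported from the private ollama._types module, so adjust the import if your installed version re-exports it elsewhere:

```python
from ollama._types import ChatRequest  # private module; adjust per installed version

schema = ChatRequest.model_json_schema()
print(sorted(schema["properties"]))
# Expect the fields listed above; note the absence of 'tool_choice'.
assert "tool_choice" not in schema["properties"]
```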
Why it matters
Practitioners porting code from OpenAI's or Anthropic's SDK to ollama-python expect a tool_choice parameter (to force tool use, restrict to a specific tool, or disable tools). Today the only available control is omitting tools entirely, which is coarse.
Concrete suggestion
When the native /api/chat server endpoint adds tool_choice (per ollama/ollama#11171), expose the same parameter on ollama.chat() and ChatRequest:
```python
# Standard cross-vendor values
tool_choice: Literal["auto", "required", "none"] | dict | None = None
```
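As a rough sketch of the client-side shape (the field name, accepted values, and model class below are assumptions that mirror the cross-vendor convention; the final surface depends on what ollama/ollama#11171 lands):

```python
from typing import Literal, Mapping, Optional, Union

from pydantic import BaseModel

# Hypothetical union mirroring OpenAI/Anthropic: a string mode or a per-tool selector.
ToolChoice = Union[Literal["auto", "required", "none"], Mapping[str, object]]


class ChatRequestSketch(BaseModel):
    # Existing ChatRequest fields (model, messages, tools, ...) elided for brevity.
    tool_choice: Optional[ToolChoice] = None  # would be forwarded verbatim to /api/chat


# Intended call-site usage once exposed on ollama.chat():
#   ollama.chat(model=..., messages=..., tools=[...], tool_choice="required")
```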
Until then, this issue serves as a tracking item for the Python client mirror so it lands in the same release window as the server-side feature.
Workarounds
For "must use a tool" semantics, append a directive to the system prompt and use temperature=0.0 (verified to produce reliable tool calls on Llama 3.1 8B).
For "must not use a tool", omit the tools argument entirely.
Happy to send a PR once the server-side direction is settled.