
Add tool_choice parameter to chat() (mirror server-side ollama#11171) #663

@albertodiazdurana

Description


Observation

The chat() function and underlying ChatRequest model in ollama-python do not expose a tool_choice parameter, even though several upstream endpoints already handle it:

  • OpenAI-compat endpoint in ollama/ollama server (openai/responses.go): ToolChoice is parsed from the request, with TODO comments noting partial support.
  • Anthropic-compat endpoint (anthropic/trace.go + anthropic/anthropic.go): ToolChoice is wired through.
  • Native /api/chat: tool_choice is not a request field today; support is tracked upstream in ollama#11171 ("Add support for tool_choice set to any to require a model to use a tool").

Empirical behavior today

import ollama

# Test 1: tool_choice as kwarg → hard rejection
ollama.chat(
    model="llama3.1:8b",
    messages=[{"role": "user", "content": "Hello"}],
    tool_choice="auto",
)
# TypeError: Client.chat() got an unexpected keyword argument 'tool_choice'

# Test 2: tool_choice via options dict → silently accepted but not wired
# (options is passed through as model parameters; tool_choice has no effect there)
ollama.chat(
    model="llama3.1:8b",
    messages=[{"role": "user", "content": "Hello"}],
    options={"tool_choice": "auto"},
)
# OK, no error, but tool_choice is not interpreted by the server

ChatRequest.model_json_schema() confirms there is no tool_choice field. Current fields: model, stream, options, format, keep_alive, messages, tools, think, logprobs, top_logprobs.
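
For reference, a quick way to reproduce that check (the ollama._types import path is an assumption based on current releases):

from ollama._types import ChatRequest

# Inspect the generated request schema; no 'tool_choice' key appears.
schema = ChatRequest.model_json_schema()
print(sorted(schema["properties"].keys()))
# ['format', 'keep_alive', 'logprobs', 'messages', 'model', 'options',
#  'stream', 'think', 'tools', 'top_logprobs']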

Why it matters

Practitioners porting code from OpenAI's or Anthropic's SDK to ollama-python expect a tool_choice parameter (to force tool use, restrict to a specific tool, or disable tools). Today the only available control is omitting tools entirely, which is coarse.

Concrete suggestion

When the native /api/chat server endpoint adds tool_choice (per ollama/ollama#11171), expose the same parameter on ollama.chat() and ChatRequest:

# Standard cross-vendor values
tool_choice: Literal["auto", "required", "none"] | dict | None = None

Until then, this issue serves as a tracking item for the Python client mirror so it lands in the same release window as the server-side feature.
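
For concreteness, the call site could then look like this. This is a sketch only; the exact signature depends on the server-side design in ollama#11171, and the weather tool schema is purely illustrative:

import ollama

# Hypothetical usage once the parameter lands: "required" would force a tool call,
# while a dict could restrict the model to a single named tool.
response = ollama.chat(
    model="llama3.1:8b",
    messages=[{"role": "user", "content": "What is the weather in Berlin?"}],
    tools=[{
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the weather for a city",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }],
    tool_choice="required",  # proposed parameter, rejected by current releases
)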

Workarounds

  • For "must use a tool" semantics, append a directive to the system prompt and use temperature=0.0 (verified to produce reliable tool calls on Llama 3.1 8B).
  • For "must not use a tool", omit the tools argument entirely.

Happy to send a PR once the server-side direction is settled.
