SSE comment lines from OpenRouter cause "unexpected end of JSON input" stream error #2349
Describe the bug
When using an OpenAI-compatible custom provider (e.g. OpenRouter), the agent fails with:
all models failed: error receiving from stream: unexpected end of JSON input
Root cause: OpenRouter sends SSE comment lines during initial processing:
: OPENROUTER PROCESSING
: OPENROUTER PROCESSING
data: {"id":"...","object":"chat.completion.chunk",...}
SSE comment lines (lines starting with :) are valid per the SSE spec and must be ignored by clients. Docker agent's stream parser does not skip them and attempts to parse them as JSON, causing the error.
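The error string itself is what Go's encoding/json produces when asked to decode an empty or truncated payload, which is consistent with a comment line reaching the JSON decoder with no data after it. A minimal standalone illustration (not the SDK's actual code path, just a demonstration of where the message comes from):

package main

import (
	"encoding/json"
	"fmt"
)

func main() {
	// Decoding an empty payload produces the exact error string from the issue.
	var chunk map[string]any
	err := json.Unmarshal([]byte(""), &chunk)
	fmt.Println(err) // unexpected end of JSON input
}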
Version affected
docker agent version v1.42.0
Commit: 8a128f53f9b3a418ad9937d72e37a132854e8e18
How To Reproduce
- Add an OpenAI-compatible custom provider to the config:

  providers:
    openrouter:
      api_type: openai_responses
      base_url: https://openrouter.ai/api/v1
      token_key: OPENROUTER_API_KEY
- Configure a model with provider: openrouter pointing to OpenRouter (base_url: https://openrouter.ai/api/v1)
- Run any agent — fails immediately on first response
Expectation
Comment lines are silently skipped per the SSE spec.
OS and Terminal type
Linux/ghostty
Additional context
Agent-suggested fix:
Root cause confirmed: The bug is in the OpenAI Go SDK's ssestream package, which docker-agent uses for stream parsing. It doesn't skip SSE comment lines (per the spec, lines starting with : must be ignored). When OpenRouter sends : OPENROUTER PROCESSING, the SDK tries to JSON-parse it and fails.
Where to fix in docker-agent: Add a new SSECommentFilterMiddleware in pkg/model/provider/oaistream/middleware.go, then apply it in client.go. The existing ErrorBodyMiddleware in the same file shows the exact pattern to follow. The non-gateway path (where custom providers like OpenRouter land) currently doesn't apply any middleware at all — it just does:
httpClient := httpclient.NewHTTPClient(ctx)
clientOptions = append(clientOptions, option.WithHTTPClient(httpClient))
client := openai.NewClient(clientOptions...)
The fix — add to pkg/model/provider/oaistream/middleware.go:
// SSECommentFilterMiddleware returns a middleware that strips SSE comment lines
// (lines starting with ':') from streaming responses. Some providers (e.g. OpenRouter)
// send comments such as ": OPENROUTER PROCESSING" which are valid per the SSE spec
// but cause "unexpected end of JSON input" in the OpenAI SDK's SSE parser.
func SSECommentFilterMiddleware() option.Middleware {
	return func(req *http.Request, next option.MiddlewareNext) (*http.Response, error) {
		resp, err := next(req)
		if err != nil || resp == nil || resp.StatusCode >= 400 {
			return resp, err
		}
		// Only rewrite actual SSE bodies; leave plain JSON responses untouched.
		if ct := resp.Header.Get("Content-Type"); !strings.HasPrefix(ct, "text/event-stream") {
			return resp, nil
		}
		resp.Body = newSSECommentFilter(resp.Body)
		resp.ContentLength = -1 // length is no longer known after filtering
		return resp, nil
	}
}
// sseCommentFilter is an io.ReadCloser that drops SSE comment lines from the
// wrapped body before they reach the SDK's parser.
type sseCommentFilter struct {
	r    *bufio.Reader
	orig io.ReadCloser
	buf  []byte // unread remainder of the last non-comment line
}

func newSSECommentFilter(rc io.ReadCloser) *sseCommentFilter {
	return &sseCommentFilter{r: bufio.NewReader(rc), orig: rc}
}

func (f *sseCommentFilter) Read(p []byte) (int, error) {
	for len(f.buf) == 0 {
		line, err := f.r.ReadBytes('\n')
		if len(line) > 0 && !bytes.HasPrefix(line, []byte(":")) {
			f.buf = line
		}
		if err != nil {
			if len(f.buf) == 0 {
				return 0, err
			}
			break // return buffered data first; the error surfaces on the next Read
		}
	}
	n := copy(p, f.buf)
	f.buf = f.buf[n:]
	return n, nil
}

func (f *sseCommentFilter) Close() error { return f.orig.Close() }

Then in client.go, apply it in the non-gateway path alongside the HTTP client setup:
clientOptions = append(clientOptions, option.WithHTTPClient(httpClient))
clientOptions = append(clientOptions, option.WithMiddleware(oaistream.SSECommentFilterMiddleware())) // add this
client := openai.NewClient(clientOptions...)
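To sanity-check the filter outside the middleware stack, here is a standalone sketch that duplicates the sseCommentFilter type from above and runs it over an OpenRouter-style stream; only the non-comment data line should survive:

package main

import (
	"bufio"
	"bytes"
	"fmt"
	"io"
	"strings"
)

type sseCommentFilter struct {
	r    *bufio.Reader
	orig io.ReadCloser
	buf  []byte
}

func newSSECommentFilter(rc io.ReadCloser) *sseCommentFilter {
	return &sseCommentFilter{r: bufio.NewReader(rc), orig: rc}
}

func (f *sseCommentFilter) Read(p []byte) (int, error) {
	for len(f.buf) == 0 {
		line, err := f.r.ReadBytes('\n')
		if len(line) > 0 && !bytes.HasPrefix(line, []byte(":")) {
			f.buf = line
		}
		if err != nil {
			if len(f.buf) == 0 {
				return 0, err
			}
			break
		}
	}
	n := copy(p, f.buf)
	f.buf = f.buf[n:]
	return n, nil
}

func (f *sseCommentFilter) Close() error { return f.orig.Close() }

func main() {
	// Two comment lines followed by a real chunk, as seen from OpenRouter.
	in := ": OPENROUTER PROCESSING\n: OPENROUTER PROCESSING\ndata: {\"id\":\"x\"}\n"
	f := newSSECommentFilter(io.NopCloser(strings.NewReader(in)))
	out, _ := io.ReadAll(f)
	fmt.Printf("%q\n", out) // "data: {\"id\":\"x\"}\n"
}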