fix(openrouter): streaming reasoning_details fragmentation causes multi-turn BadRequestResponseError #36400

@X-iZhang

Checked other resources

  • This is a bug, not a usage question.
  • I added a clear and descriptive title that summarizes this issue.
  • I used the GitHub search to find a similar question and didn't find it.
  • I am sure that this is a bug in LangChain rather than my code.
  • The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).
  • This is not related to the langchain-community package.
  • I posted a self-contained, minimal, reproducible example. A maintainer can copy it and run it AS IS.

Package (Required)

  • langchain
  • langchain-openai
  • langchain-anthropic
  • langchain-classic
  • langchain-core
  • langchain-model-profiles
  • langchain-tests
  • langchain-text-splitters
  • langchain-chroma
  • langchain-deepseek
  • langchain-exa
  • langchain-fireworks
  • langchain-groq
  • langchain-huggingface
  • langchain-mistralai
  • langchain-nomic
  • langchain-ollama
  • langchain-openrouter
  • langchain-perplexity
  • langchain-qdrant
  • langchain-xai
  • Other / not sure / general

Related Issues / PRs

No response

Reproduction Steps / Example Code (Python)

from langchain_openrouter import ChatOpenRouter
from langchain_core.messages import AIMessage, HumanMessage

model = ChatOpenRouter(
    model="anthropic/claude-sonnet-4-6",
    api_key="your-key-here",
    reasoning={"effort": "high", "summary": "auto"},
)

# Turn 1 — streaming
msgs = [HumanMessage(content="What is 2+2? Think step by step.")]
chunks = list(model.stream(msgs))
merged = chunks[0]
for c in chunks[1:]:
    merged = merged + c

# Inspect: reasoning_details is fragmented (e.g. 6 entries instead of 1)
details = merged.additional_kwargs.get("reasoning_details", [])
print(f"reasoning_details count: {len(details)}")  # Expected: 1, Actual: 6

# Turn 2 — fails
ai_msg = AIMessage(
    content=merged.content,
    additional_kwargs=merged.additional_kwargs,
    response_metadata=merged.response_metadata,
)
msgs.append(ai_msg)
msgs.append(HumanMessage(content="Now what is 3+3?"))

# This raises BadRequestResponseError
chunks2 = list(model.stream(msgs))

Error Message and Stack Trace (if applicable)

BadRequestResponseError: Provider returned error

Description

During streaming, AIMessageChunk.__add__ merges additional_kwargs by concatenating list values, so each streamed delta's partial reasoning_details entry is appended as a separate element, fragmenting what should be a single entry into many. When _convert_message_to_dict() serializes the conversation history back to the OpenRouter API on the next turn, these fragmented entries are passed through as-is, and the API rejects the malformed payload with BadRequestResponseError.
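The mechanism can be illustrated without any network call. This is a simplified stand-in for the chunk-addition merge rule (the real logic lives in langchain_core; the entry keys "type", "index", and "text" are illustrative, chosen to mirror the shape implied by the "same type + same index" merge rule below):

```python
# Simplified model of how chunk addition concatenates list-valued
# additional_kwargs instead of merging partial entries.
def merge_kwargs(left: dict, right: dict) -> dict:
    merged = dict(left)
    for key, value in right.items():
        if key in merged and isinstance(merged[key], list):
            merged[key] = merged[key] + value  # list concat -> fragmentation
        else:
            merged[key] = value
    return merged

# Three streamed deltas, each carrying a fragment of ONE reasoning entry:
deltas = [
    {"reasoning_details": [{"type": "reasoning.text", "index": 0, "text": "2+2"}]},
    {"reasoning_details": [{"type": "reasoning.text", "index": 0, "text": " equals"}]},
    {"reasoning_details": [{"type": "reasoning.text", "index": 0, "text": " 4."}]},
]

acc = deltas[0]
for delta in deltas[1:]:
    acc = merge_kwargs(acc, delta)

print(len(acc["reasoning_details"]))  # 3 fragments instead of 1 entry
```

Each delta's fragment survives as its own list element, which is exactly the fragmented shape later serialized back to the API.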

Non-streaming (invoke()) is unaffected — it returns complete entries in a single response.

The fix should merge fragmented reasoning_details entries (same type + same index) back into single entries before serialization. A PR with the fix and tests is ready.
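For illustration, the merge step could look roughly like this. It is a sketch, not the actual PR code, and it assumes entries carry "type"/"index" keys and that string-valued fields (e.g. "text") should be concatenated in arrival order:

```python
def coalesce_reasoning_details(details: list[dict]) -> list[dict]:
    """Merge fragments that share (type, index) back into single entries."""
    grouped: dict[tuple, dict] = {}  # insertion order preserved
    for entry in details:
        key = (entry.get("type"), entry.get("index"))
        if key not in grouped:
            grouped[key] = dict(entry)
            continue
        base = grouped[key]
        for field, value in entry.items():
            if field in ("type", "index"):
                continue  # grouping keys, never concatenated
            if isinstance(value, str) and isinstance(base.get(field), str):
                base[field] = base[field] + value  # join text fragments
            elif field not in base:
                base[field] = value
    return list(grouped.values())

fragments = [
    {"type": "reasoning.text", "index": 0, "text": "2+2"},
    {"type": "reasoning.text", "index": 0, "text": " equals 4."},
]
print(coalesce_reasoning_details(fragments))
# [{'type': 'reasoning.text', 'index': 0, 'text': '2+2 equals 4.'}]
```

Running this before serialization would hand the API one complete entry per (type, index) pair instead of the fragmented list.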

System Info

System Information

OS: Darwin
OS Version: Darwin Kernel Version 25.3.0
Python Version: 3.11.14

Package Information

langchain_core: 1.2.23
langchain: 1.2.13
langchain_openrouter: 0.2.1
langchain_openai: 1.1.11

Other Dependencies

openrouter: 0.8.0
pydantic: 2.12.5

Metadata

Labels

  • bug: Related to a bug, vulnerability, unexpected error with an existing feature
  • external
  • openrouter: `langchain-openrouter` package issues & PRs
