
bug: add_messages(format="langchain-openai") strips message IDs and additional_kwargs #7272

@Witaly3

Description

Checked other resources

  • This is a bug, not a usage question.
  • I added a clear and descriptive title that summarizes this issue.
  • I used the GitHub search to find a similar question and didn't find it.
  • I am sure that this is a bug in LangGraph rather than my code.
  • The bug is not resolved by updating to the latest stable version of LangGraph (or the specific integration package).
  • This is not related to the langchain-community package.
  • I posted a self-contained, minimal, reproducible example. A maintainer can copy it and run it AS IS.

Related Issues / PRs

#7273

Reproduction Steps / Example Code (Python)

from langchain_core.messages import HumanMessage, AIMessage
from langgraph.graph.message import add_messages

# 1. Message IDs are lost
left = [HumanMessage(content="hello", id="msg-1")]
right = [AIMessage(content="world", id="msg-2")]

result = add_messages(left, right, format="langchain-openai")

print(result[0].id)  # None — expected "msg-1"
print(result[1].id)  # None — expected "msg-2"

# 2. Custom additional_kwargs are lost
msg = AIMessage(
    id="msg-1",
    content="hello",
    additional_kwargs={"widgets": [{"type": "carousel"}]},
)
result = add_messages([], [msg], format="langchain-openai")
print(result[0].additional_kwargs)  # {} — expected {"widgets": [{"type": "carousel"}]}

Error Message and Stack Trace (if applicable)

No error raised. Message IDs are silently set to None and custom additional_kwargs are silently dropped.

Description

add_messages(format="langchain-openai") loses message IDs and custom additional_kwargs during the OpenAI format round-trip.

_format_messages() in langgraph/graph/message.py calls convert_to_openai_messages() without include_id=True. The round-trip BaseMessage → OpenAI dict → BaseMessage strips all IDs to
None. Additionally, any custom additional_kwargs (e.g. UI widgets, metadata) are dropped because convert_to_openai_messages only outputs standard OpenAI fields.
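The loss can be illustrated without langchain installed. Below is a toy model of the round-trip, with plain dicts standing in for BaseMessage objects and the converters reduced to the behavior described above (the real functions live in langchain_core.messages):

```python
# Toy model of the lossy round-trip; plain dicts stand in for
# BaseMessage objects, field names mirror the real ones.

def to_openai_dict(msg):
    # Like convert_to_openai_messages() without include_id=True:
    # only standard OpenAI fields survive.
    return {"role": msg["role"], "content": msg["content"]}

def from_openai_dict(d):
    # The reverse conversion has no id or custom kwargs to restore.
    return {"role": d["role"], "content": d["content"],
            "id": None, "additional_kwargs": {}}

msg = {"role": "assistant", "content": "hello",
       "id": "msg-1",
       "additional_kwargs": {"widgets": [{"type": "carousel"}]}}

round_tripped = from_openai_dict(to_openai_dict(msg))
print(round_tripped["id"])                 # None — id stripped
print(round_tripped["additional_kwargs"])  # {} — custom kwargs dropped
```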

Impact:

  1. Every subsequent reducer call assigns new random UUIDs to all existing messages, breaking ID-based deduplication and update logic.
  2. Custom data stored in additional_kwargs is silently lost after each reducer invocation.

Proposed fix: pass include_id=True to convert_to_openai_messages() so IDs survive the round-trip, and snapshot each message's additional_kwargs by message ID before conversion so that any fields dropped by the conversion can be restored afterwards.
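A minimal sketch of that fix, again with plain dicts standing in for BaseMessage objects. The lossy_convert() helper is hypothetical: it simulates the OpenAI round-trip with IDs preserved (i.e. as if include_id=True were passed), and the snapshot/restore step recovers the dropped additional_kwargs:

```python
# Sketch of the snapshot-and-restore fix, using plain-dict stand-ins.
# lossy_convert() is a hypothetical stand-in for the OpenAI round-trip
# with ids preserved (as if include_id=True were passed).

def lossy_convert(msgs):
    # Keeps role/content/id but drops custom additional_kwargs,
    # mirroring the round-trip described above.
    return [{"role": m["role"], "content": m["content"],
             "id": m.get("id"), "additional_kwargs": {}} for m in msgs]

def format_preserving(msgs):
    # 1. Snapshot additional_kwargs by message id before conversion.
    saved = {m["id"]: m["additional_kwargs"] for m in msgs if m.get("id")}
    converted = lossy_convert(msgs)
    # 2. Restore any custom fields the round-trip dropped.
    for m in converted:
        if m["id"] in saved:
            m["additional_kwargs"] = {**saved[m["id"]],
                                      **m["additional_kwargs"]}
    return converted

msgs = [{"role": "assistant", "content": "hi", "id": "msg-1",
         "additional_kwargs": {"widgets": [{"type": "carousel"}]}}]
fixed = format_preserving(msgs)
print(fixed[0]["id"])                 # msg-1
print(fixed[0]["additional_kwargs"])  # {'widgets': [{'type': 'carousel'}]}
```

Messages without an ID cannot be matched this way, which is why preserving IDs through the conversion (include_id=True) is the first half of the fix.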

System Info

System Information

OS: Darwin
OS Version: Darwin Kernel Version 25.3.0: Wed Jan 28 20:56:42 PST 2026; root:xnu-12377.91.3~2/RELEASE_ARM64_T8142
Python Version: 3.14.2 (main, Dec 9 2025, 19:29:30) [Clang 21.1.4 ]

Package Information

langchain_core: 1.2.20
langsmith: 0.6.4
langgraph_sdk: 0.3.12

Optional packages not installed

deepagents
deepagents-cli

Other Dependencies

httpx: 0.28.1
jsonpatch: 1.33
orjson: 3.11.6
packaging: 25.0
pydantic: 2.12.5
pytest: 9.0.2
pyyaml: 6.0.3
requests: 2.32.5
requests-toolbelt: 1.0.0
tenacity: 9.1.2
typing-extensions: 4.15.0
uuid-utils: 0.13.0
zstandard: 0.25.0

Labels

bug (Something isn't working), external
