Fix stream_options UserWarning in LLM client initialization #4621

@beastoin

Description

Current Behavior

Every time the API container starts, LangChain's ChatOpenAI emits the same UserWarning repeatedly:

WARNING! stream_options is not default parameter.
    stream_options was transferred to model_kwargs.
    Please confirm that stream_options is what you intended.

The warning fires once for each of the seven streaming clients initialized in utils/llm/clients.py, and repeats on every worker-process startup because routers/conversations.py imports that module.
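For reference, a minimal standalone reproduction (a sketch, not code from this repo; assumes langchain-openai is installed, and uses a placeholder API key so no request is made):

import warnings
from langchain_openai import ChatOpenAI

# stream_options is not a declared field on ChatOpenAI, so LangChain moves
# it into model_kwargs and emits the UserWarning quoted above.
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    ChatOpenAI(
        model='gpt-4.1-mini',
        api_key='sk-test',  # placeholder; construction only, no API call
        streaming=True,
        stream_options={"include_usage": True},
    )
print([str(w.message) for w in caught])  # includes the stream_options warning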

Expected Behavior

Clean startup logs with no warnings.

Affected Areas

File                           Lines                        Description
backend/utils/llm/clients.py   19, 26, 34, 42, 52, 62, 72   stream_options={"include_usage": True} passed to ChatOpenAI

Solution

Move stream_options into model_kwargs explicitly to suppress the warning:

# Before: triggers the UserWarning at import time
llm_mini_stream = ChatOpenAI(
    model='gpt-4.1-mini',
    streaming=True,
    stream_options={"include_usage": True},
    callbacks=[_usage_callback],
)

# After: stream_options passed explicitly via model_kwargs; no warning
llm_mini_stream = ChatOpenAI(
    model='gpt-4.1-mini',
    streaming=True,
    model_kwargs={"stream_options": {"include_usage": True}},
    callbacks=[_usage_callback],
)
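As a quick sanity check (again a sketch with a placeholder key, not repo code), the model_kwargs form constructs without raising any UserWarning:

import warnings
from langchain_openai import ChatOpenAI

with warnings.catch_warnings():
    warnings.simplefilter("error", UserWarning)  # any UserWarning -> exception
    ChatOpenAI(
        model='gpt-4.1-mini',
        api_key='sk-test',  # placeholder; construction only, no API call
        streaming=True,
        model_kwargs={"stream_options": {"include_usage": True}},
    )
print("constructed cleanly; no stream_options warning")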

Files to Modify

  • backend/utils/llm/clients.py

Impact

None. LangChain already transfers stream_options into model_kwargs internally; the change just makes that transfer explicit and silences the warning.
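To illustrate that claim, an equivalence sketch (same assumptions as above): both constructions end up with identical model_kwargs, so the request payload sent to OpenAI is unchanged.

import warnings
from langchain_openai import ChatOpenAI

with warnings.catch_warnings():
    warnings.simplefilter("ignore")  # silence the warning from the old form
    implicit = ChatOpenAI(
        model='gpt-4.1-mini',
        api_key='sk-test',  # placeholder; construction only, no API call
        streaming=True,
        stream_options={"include_usage": True},
    )
explicit = ChatOpenAI(
    model='gpt-4.1-mini',
    api_key='sk-test',
    streaming=True,
    model_kwargs={"stream_options": {"include_usage": True}},
)

# LangChain moved stream_options here itself in the implicit case.
assert implicit.model_kwargs == explicit.model_kwargs
print(implicit.model_kwargs)  # {'stream_options': {'include_usage': True}}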


by AI for @beastoin

Labels

docs-tooling (Layer: Documentation, examples, dev tools), help-wanted (Lane: Open to contributors), p3 (Priority: Backlog, score <14)
