Labels: docs-tooling (Layer: Documentation, examples, dev tools), help-wanted (Lane: Open to contributors), p3 (Priority: Backlog, score <14)
Description
Current Behavior
Every time the API container starts, LangChain's ChatOpenAI emits repeated UserWarning logs:
```
WARNING! stream_options is not default parameter.
stream_options was transferred to model_kwargs.
Please confirm that stream_options is what you intended.
```
The warning fires once for each of the 7 streaming clients initialized in utils/llm/clients.py, and repeats on every worker process startup via the routers/conversations.py import chain.
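For quick triage, here is a minimal standalone reproduction sketch. It is not part of the repo; the script and the placeholder api_key are assumptions, and no API call is made:

```python
# Reproduction sketch (assumes langchain_openai is installed).
# Constructing the client is enough to trigger the transfer warning;
# the placeholder key just lets the client be built without credentials.
import warnings

from langchain_openai import ChatOpenAI

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    ChatOpenAI(
        model="gpt-4.1-mini",
        streaming=True,
        stream_options={"include_usage": True},
        api_key="sk-placeholder",
    )

# Expect one UserWarning per client constructed this way (7 in clients.py).
for w in caught:
    print(w.category.__name__, w.message)
```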
Expected Behavior
Clean startup logs with no warnings.
Affected Areas
| File | Lines | Description |
|---|---|---|
| backend/utils/llm/clients.py | 19, 26, 34, 42, 52, 62, 72 | `stream_options={"include_usage": True}` passed to `ChatOpenAI` |
Solution
Move stream_options into model_kwargs explicitly to suppress the warning:
```python
# Before
llm_mini_stream = ChatOpenAI(
    model='gpt-4.1-mini',
    streaming=True,
    stream_options={"include_usage": True},
    callbacks=[_usage_callback],
)

# After
llm_mini_stream = ChatOpenAI(
    model='gpt-4.1-mini',
    streaming=True,
    model_kwargs={"stream_options": {"include_usage": True}},
    callbacks=[_usage_callback],
)
```
Files to Modify
backend/utils/llm/clients.py
Impact
None — stream_options is already being transferred to model_kwargs internally. This just makes it explicit and silences the warning.
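To confirm the no-op claim, a hedged sanity check (assumes ChatOpenAI exposes its model_kwargs mapping as an attribute; the placeholder key is again only to allow construction):

```python
# Equivalence check (sketch): per the warning text, a top-level
# stream_options is moved into model_kwargs automatically, so both
# spellings should produce the same client configuration.
from langchain_openai import ChatOpenAI

implicit = ChatOpenAI(
    model="gpt-4.1-mini",
    streaming=True,
    stream_options={"include_usage": True},  # emits the UserWarning
    api_key="sk-placeholder",
)
explicit = ChatOpenAI(
    model="gpt-4.1-mini",
    streaming=True,
    model_kwargs={"stream_options": {"include_usage": True}},  # silent
    api_key="sk-placeholder",
)

assert implicit.model_kwargs == explicit.model_kwargs
print("model_kwargs identical:", explicit.model_kwargs)
```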
by AI for @beastoin