BUG: cannot resume conversation with different settings for enable_encrypted_reasoning #238

@jpshackelford

Description

Stack trace:

     ___                    _   _                 _
    /  _ \ _ __   ___ _ __ | | | | __ _ _ __   __| |___
    | | | | '_ \ / _ \ '_ \| |_| |/ _` | '_ \ / _` / __|
    | |_| | |_) |  __/ | | |  _  | (_| | | | | (_| \__ \
    \___ /| .__/ \___|_| |_|_| |_|\__,_|_| |_|\__,_|___/
          |_|
    

Resumed conversation 6081e6fd-136e-46da-8929-c2a3e36e9046

Version: 1.6.0
Confirmation mode: AlwaysConfirm

Let's start building!
What do you want to build? Type /help for help

> where were we?
Initializing agent...
Error: The LLM provided is different from the one in persisted state.
Diff: enable_encrypted_reasoning: True -> False
model: 'litellm_proxy/prod/claude-sonnet-4-5-20250929' -> 'litellm_proxy/claude-sonnet-4-5-20250929'
Traceback (most recent call last):
  File "/Users/jpshack/.local/share/uv/tools/openhands/lib/python3.12/site-packages/openhands_cli/simple_main.py", line 203, in main
    run_cli_entry(
  File "/Users/jpshack/.local/share/uv/tools/openhands/lib/python3.12/site-packages/openhands_cli/agent_chat.py", line 235, in run_cli_entry
    conversation = setup_conversation(
                   ^^^^^^^^^^^^^^^^^^^
  File "/Users/jpshack/.local/share/uv/tools/openhands/lib/python3.12/site-packages/openhands_cli/setup.py", line 122, in setup_conversation
    conversation: BaseConversation = Conversation(
                                     ^^^^^^^^^^^^^
  File "/Users/jpshack/.local/share/uv/tools/openhands/lib/python3.12/site-packages/openhands/sdk/conversation/conversation.py", line 122, in __new__
    return LocalConversation(
           ^^^^^^^^^^^^^^^^^^
  File "/Users/jpshack/.local/share/uv/tools/openhands/lib/python3.12/site-packages/openhands/sdk/conversation/impl/local_conversation.py", line 115, in __init__
    self._state = ConversationState.create(
                  ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/jpshack/.local/share/uv/tools/openhands/lib/python3.12/site-packages/openhands/sdk/conversation/state.py", line 189, in create
    resolved = agent.resolve_diff_from_deserialized(state.agent)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/jpshack/.local/share/uv/tools/openhands/lib/python3.12/site-packages/openhands/sdk/agent/base.py", line 289, in resolve_diff_from_deserialized
    new_llm = self.llm.resolve_diff_from_deserialized(persisted.llm)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/jpshack/.local/share/uv/tools/openhands/lib/python3.12/site-packages/openhands/sdk/llm/llm.py", line 1135, in resolve_diff_from_deserialized
    raise ValueError(
ValueError: The LLM provided is different from the one in persisted state.
Diff: enable_encrypted_reasoning: True -> False
model: 'litellm_proxy/prod/claude-sonnet-4-5-20250929' -> 'litellm_proxy/claude-sonnet-4-5-20250929'
Traceback (most recent call last):
  File "/Users/jpshack/.local/share/uv/tools/openhands/bin/openhands", line 10, in <module>
    sys.exit(main())
             ^^^^^^
  File "/Users/jpshack/.local/share/uv/tools/openhands/lib/python3.12/site-packages/openhands_cli/simple_main.py", line 203, in main
    run_cli_entry(
  File "/Users/jpshack/.local/share/uv/tools/openhands/lib/python3.12/site-packages/openhands_cli/agent_chat.py", line 235, in run_cli_entry
    conversation = setup_conversation(
                   ^^^^^^^^^^^^^^^^^^^
  File "/Users/jpshack/.local/share/uv/tools/openhands/lib/python3.12/site-packages/openhands_cli/setup.py", line 122, in setup_conversation
    conversation: BaseConversation = Conversation(
                                     ^^^^^^^^^^^^^
  File "/Users/jpshack/.local/share/uv/tools/openhands/lib/python3.12/site-packages/openhands/sdk/conversation/conversation.py", line 122, in __new__
    return LocalConversation(
           ^^^^^^^^^^^^^^^^^^
  File "/Users/jpshack/.local/share/uv/tools/openhands/lib/python3.12/site-packages/openhands/sdk/conversation/impl/local_conversation.py", line 115, in __init__
    self._state = ConversationState.create(
                  ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/jpshack/.local/share/uv/tools/openhands/lib/python3.12/site-packages/openhands/sdk/conversation/state.py", line 189, in create
    resolved = agent.resolve_diff_from_deserialized(state.agent)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/jpshack/.local/share/uv/tools/openhands/lib/python3.12/site-packages/openhands/sdk/agent/base.py", line 289, in resolve_diff_from_deserialized
    new_llm = self.llm.resolve_diff_from_deserialized(persisted.llm)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/jpshack/.local/share/uv/tools/openhands/lib/python3.12/site-packages/openhands/sdk/llm/llm.py", line 1135, in resolve_diff_from_deserialized
    raise ValueError(
ValueError: The LLM provided is different from the one in persisted state.
Diff: enable_encrypted_reasoning: True -> False
model: 'litellm_proxy/prod/claude-sonnet-4-5-20250929' -> 'litellm_proxy/claude-sonnet-4-5-20250929'
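What seems to happen: on resume, `ConversationState.create` calls `resolve_diff_from_deserialized`, which compares the current LLM config field-by-field against the persisted one and raises on any mismatch, so flipping `enable_encrypted_reasoning` (or pointing at a different model path on the proxy) between sessions makes the conversation unresumable. Below is a minimal hypothetical sketch of that strict-diff pattern; it is not the SDK's actual code, and `LLMConfig`/`resolve_diff` are made-up names for illustration only.

```python
# Hypothetical sketch of a strict field-diff check on resume (NOT the
# actual OpenHands SDK implementation): every field of the runtime LLM
# config must equal the persisted one, otherwise resuming fails.
from dataclasses import dataclass, fields


@dataclass(frozen=True)
class LLMConfig:
    model: str
    enable_encrypted_reasoning: bool


def resolve_diff(persisted: LLMConfig, current: LLMConfig) -> LLMConfig:
    # Collect every field whose value changed between sessions.
    diffs = {
        f.name: (getattr(persisted, f.name), getattr(current, f.name))
        for f in fields(LLMConfig)
        if getattr(persisted, f.name) != getattr(current, f.name)
    }
    if diffs:
        detail = "\n".join(
            f"{name}: {old!r} -> {new!r}" for name, (old, new) in diffs.items()
        )
        raise ValueError(
            "The LLM provided is different from the one in persisted state.\n"
            "Diff: " + detail
        )
    return current


# Reproduces the shape of the failure in the report: both the
# encrypted-reasoning flag and the proxy model path differ.
persisted = LLMConfig("litellm_proxy/prod/claude-sonnet-4-5-20250929", True)
current = LLMConfig("litellm_proxy/claude-sonnet-4-5-20250929", False)

try:
    resolve_diff(persisted, current)
except ValueError as e:
    print(e)
```

Under this reading, the fix would presumably be to treat settings like `enable_encrypted_reasoning` as reconcilable (adopt the new value) rather than as a hard mismatch, but that is a guess about intent, not a statement about the SDK's design.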


    Labels

    agent-sdk (Requires interacting with agent sdk APIs or fixing issues upstream in the agent sdk), bug (Something isn't working)
