Commits
83 commits
3590c43
docs: LLM profiles design + example profile
openhands-agent Oct 18, 2025
9b1e3db
llm: add profile_id field to LLM (profile filename identifier)\n\nCo-…
openhands-agent Oct 18, 2025
21efefe
feat(llm): add ProfileManager and eagerly register profiles at conver…
openhands-agent Oct 18, 2025
46ca1b7
chore: stop tracking local runtime and worktree files; add to .gitignore
openhands-agent Oct 18, 2025
5efdaee
chore: only ignore bead databases
enyst Oct 18, 2025
9cbf67f
test: cover llm profile manager
enyst Oct 18, 2025
dfab517
Update .gitignore
enyst Oct 18, 2025
441eb25
Improve LLM profile manager persistence
enyst Oct 18, 2025
e7cd039
Add example for managing LLM profiles
enyst Oct 18, 2025
269610a
Document plan for profile references
enyst Oct 18, 2025
d0ab952
Integrate profile-aware persistence
enyst Oct 19, 2025
f74d050
Simplify profile registration logging
enyst Oct 19, 2025
df308fb
Normalize inline_mode naming
enyst Oct 19, 2025
4d293db
Simplify profile_id sync in ProfileManager
enyst Oct 19, 2025
7d1a525
Rename profile sync helper
enyst Oct 19, 2025
ec45ed5
LLMRegistry handles profile management
enyst Oct 19, 2025
1566df4
docs: clarify LLMRegistry profile guidance
enyst Oct 19, 2025
8f8b5b9
refactor: rename profile persistence helpers
enyst Oct 19, 2025
a3efa6e
refactor: split profile transform helpers
enyst Oct 19, 2025
17617aa
style: use f-strings in LLMRegistry logging
enyst Oct 19, 2025
9134aa1
Update openhands/sdk/llm/llm_registry.py
enyst Oct 19, 2025
36ab580
chore: stop tracking scripts/worktree.sh
enyst Oct 19, 2025
cea6a0d
Merge upstream main into agent-sdk-18-profile-manager
enyst Oct 21, 2025
12eec55
fix: remove runtime llm switching
enyst Oct 21, 2025
03b4600
style: use f-string for registry logging
enyst Oct 21, 2025
acf67e3
docs: expand LLM profile example
enyst Oct 21, 2025
218728e
Refine LLM profile persistence
enyst Oct 21, 2025
75e8ecd
Update LLM profile docs for usage_id semantics
enyst Oct 22, 2025
8511524
Merge remote-tracking branch 'upstream/main' into agent-sdk-18-profil…
enyst Oct 23, 2025
1f3adab
Merge branch 'main' into agent-sdk-18-profile-manager
enyst Oct 24, 2025
96ba8e9
Merge branch 'main' into agent-sdk-18-profile-manager
enyst Oct 25, 2025
142faee
fix LLM mutation for profiles to respect immutability; add docstring;…
enyst Oct 25, 2025
82138dd
refactor: keep LLM profile expansion at persistence layer
enyst Oct 25, 2025
b6511a9
Merge branch 'main' of github.com:All-Hands-AI/agent-sdk into agent-s…
enyst Oct 25, 2025
f5404b6
fix: restore LLM profile validation behavior
enyst Oct 26, 2025
85bc698
Merge branch 'main' into agent-sdk-18-profile-manager
enyst Oct 26, 2025
ba4bd50
harden profile handling
enyst Oct 26, 2025
99a422c
Merge branch 'main' into agent-sdk-18-profile-manager
enyst Nov 6, 2025
5c52fa5
Merge branch 'main' into agent-sdk-18-profile-manager
enyst Nov 20, 2025
b69db09
Merge branch 'main' into agent-sdk-18-profile-manager
enyst Nov 28, 2025
5dc94c1
update to current state
enyst Nov 28, 2025
69d3a7d
remove deprecated from llm
enyst Nov 28, 2025
61f5b77
ruff
enyst Nov 28, 2025
2381da7
restore gitignore
enyst Nov 28, 2025
b2f80d3
Delete .openhands/microagents/vscode.md
enyst Nov 28, 2025
8a95dac
Merge branch 'main' into agent-sdk-18-profile-manager
enyst Dec 1, 2025
0aa1164
Merge branch 'main' into agent-sdk-18-profile-manager
enyst Dec 12, 2025
744f171
fix(llm): tolerate legacy profile fields
enyst Dec 12, 2025
24d59bd
fix(llm): keep profile loading strict
enyst Dec 12, 2025
a4d6cd4
fix(llm): reduce profile side effects
enyst Dec 12, 2025
075c9b2
test(utils): stabilize discriminated union suite
enyst Dec 12, 2025
ab3a265
single source of truth for persistence behavior
enyst Dec 13, 2025
82549cc
Merge branch 'main' of github.com:OpenHands/software-agent-sdk into a…
enyst Dec 13, 2025
f400d7d
Update openhands-sdk/openhands/sdk/persistence/__init__.py
enyst Dec 14, 2025
a112ddc
feat(llm): save API keys in LLM profiles by default and set 0600 perm…
enyst Dec 15, 2025
60bfbb2
Merge branch 'main' of github.com:OpenHands/software-agent-sdk into a…
enyst Dec 16, 2025
0d01065
Merge branch 'agent-sdk-18-profile-manager' of github.com:OpenHands/s…
enyst Dec 16, 2025
bc94774
Merge branch 'main' into agent-sdk-18-profile-manager
enyst Dec 16, 2025
ad07b05
Merge branch 'main' into agent-sdk-18-profile-manager
enyst Dec 17, 2025
2464633
Delete docs/llm_profiles.md
enyst Dec 18, 2025
db10002
Update openhands-sdk/openhands/sdk/llm/llm.py
enyst Dec 18, 2025
ce31e79
Merge branch 'main' into agent-sdk-18-profile-manager
enyst Dec 18, 2025
1fe3929
Merge branch 'main' into agent-sdk-18-profile-manager
enyst Dec 19, 2025
8625ff2
Merge remote-tracking branch 'upstream/main' into agent-sdk-18-profil…
enyst Dec 29, 2025
95d94c3
Merge remote-tracking branch 'upstream/main' into agent-sdk-18-profil…
enyst Dec 29, 2025
67ab2c0
ci: detect nested examples in docs check
enyst Dec 29, 2025
fab1d57
ci: fix nested examples regex
enyst Dec 29, 2025
926fb90
ci(docs): clarify example skip rationale
enyst Dec 30, 2025
5676592
Merge branch 'main' into agent-sdk-18-profile-manager
enyst Jan 1, 2026
b2ea371
Merge main into agent-sdk-18-profile-manager
enyst Jan 6, 2026
2aa320d
Merge branch 'main' into agent-sdk-18-profile-manager
enyst Jan 7, 2026
b5a01ad
fix(llm): reject unknown fields when loading profiles
enyst Jan 7, 2026
90257c5
Revert "fix(llm): reject unknown fields when loading profiles"
enyst Jan 7, 2026
7a83b34
refactor(persistence): default to LLM profiles, drop inline env toggle
enyst Jan 7, 2026
9530155
Update .gitignore
enyst Jan 7, 2026
69e259b
chore(examples): make llm profiles example last
enyst Jan 8, 2026
c6f5db7
Update examples/01_standalone_sdk/34_llm_profiles.py
enyst Jan 8, 2026
9ecab27
Merge branch 'main' into agent-sdk-18-profile-manager
xingyaoww Jan 8, 2026
cd3ab89
Merge branch 'main' into agent-sdk-18-profile-manager
xingyaoww Jan 8, 2026
9859f21
chore(examples): inline llm profiles script body
enyst Jan 8, 2026
23cb159
feat(llm): default profile persistence and drop inline key
enyst Jan 8, 2026
dcc83f5
Merge branch 'main' into agent-sdk-18-profile-manager
enyst Jan 8, 2026
6807c99
test: fix workflow model resolver + session api key env
enyst Jan 27, 2026
12 changes: 12 additions & 0 deletions .gitignore
@@ -203,3 +203,15 @@ cache
/workspace/
openapi.json
.client/
# Ignore local runtime and developer files
.beads/*.db
*.db
.worktrees/
*.code-workspace
log.txt
previous.md
AGENTS.md
CLAUDE.md
docs/agent-sdk.workspace.code-workspace
docs/llm-model-info-and-caps.md
docs/llm-refactor.md
42 changes: 42 additions & 0 deletions docs/llm_profiles.md
@@ -0,0 +1,42 @@
LLM Profiles (design)

Overview

This document records the design decision for "LLM profiles" (named LLM configuration files) and how they map to the existing LLM model and persistence in the SDK.

Key decisions

- Reuse the existing LLM Pydantic model schema. A profile file is simply the JSON dump of an LLM instance (the same shape produced by LLM.model_dump(exclude_none=True) and consumed by LLM.load_from_json).
- Storage location: ~/.openhands/llm-profiles/<profile_name>.json. The profile_name is the filename (no extension) used to refer to the profile.
- Do not change ConversationState or Agent serialization format for now. Profiles are a convenience for creating LLM instances and registering them in the runtime LLMRegistry.
- Secrets: do NOT store plaintext API keys in profile files by default. Prefer storing the environment variable name in LLM.api_key (resolved via LLM.load_from_env) or keeping the API key in the runtime SecretsManager. The ProfileManager.save_profile API will expose an include_secrets flag, defaulting to False.
- LLM.service_id semantics: keep current behavior (a small set of runtime "usage" identifiers such as 'agent', 'condenser', 'title-gen', etc.). Do not use service_id as the profile name. We will evaluate a rename (service_id -> usage_id) in a separate task (see agent-sdk-23).
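The decisions above can be sketched with plain JSON and pathlib — a minimal illustration of "a profile is a JSON dump of an LLM, named by its filename stem", without importing the SDK. The field values here are illustrative, not a pinned schema:

```python
import json
import tempfile
from pathlib import Path

# Hypothetical profile dict mirroring LLM.model_dump(exclude_none=True);
# the field names follow this document, not a pinned SDK version.
profile = {
    "model": "gpt-4o-mini",
    "temperature": 0.0,
    "service_id": "agent",
}

# Stand-in for ~/.openhands/llm-profiles/
profiles_dir = Path(tempfile.mkdtemp()) / "llm-profiles"
profiles_dir.mkdir(parents=True)

# The profile name is the filename with no extension
path = profiles_dir / "fast-cheap.json"
path.write_text(json.dumps(profile, indent=2), encoding="utf-8")

# Loading the file back yields the same dict the LLM model would validate
loaded = json.loads(path.read_text(encoding="utf-8"))
```

Note that no api_key appears in the file; per the secrets decision, keys stay in the environment or SecretsManager.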

ProfileManager API (summary)

- list_profiles() -> list[str]
- load_profile(name: str) -> LLM
- save_profile(name: str, llm: LLM, include_secrets: bool = False) -> Path
- register_all(registry: LLMRegistry) -> None
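As a rough sketch of how this API hangs together, here is a minimal stand-in class (not the SDK implementation) operating on plain dicts instead of LLM instances; the secret field names are taken from this document:

```python
import json
import tempfile
from pathlib import Path


class ProfileManagerSketch:
    """Minimal stand-in mirroring the summarized ProfileManager API."""

    SECRET_FIELDS = ("api_key", "aws_access_key_id", "aws_secret_access_key")

    def __init__(self, base_dir):
        self.base_dir = Path(base_dir)
        self.base_dir.mkdir(parents=True, exist_ok=True)

    def list_profiles(self) -> list[str]:
        # Profile names are the filename stems, sorted for stable output
        return sorted(p.stem for p in self.base_dir.glob("*.json"))

    def save_profile(self, name, llm_data, include_secrets=False) -> Path:
        data = dict(llm_data)
        data["profile_id"] = name
        if not include_secrets:
            # Default behavior: never persist plaintext secrets
            for field in self.SECRET_FIELDS:
                data.pop(field, None)
        path = self.base_dir / f"{name}.json"
        path.write_text(json.dumps(data, indent=2), encoding="utf-8")
        return path

    def load_profile(self, name) -> dict:
        path = self.base_dir / f"{name}.json"
        return json.loads(path.read_text(encoding="utf-8"))


manager = ProfileManagerSketch(Path(tempfile.mkdtemp()) / "llm-profiles")
saved_path = manager.save_profile(
    "alpha", {"model": "gpt-4o-mini", "api_key": "sk-test"}
)
alpha = manager.load_profile("alpha")
```

The real class returns validated LLM instances and registers them via register_all; this sketch only shows the file layout and the secret-stripping default.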

Implementation notes

- Use LLM.load_from_json(path) for loading and llm.model_dump(exclude_none=True) for saving.
- Default directory: os.path.expanduser('~/.openhands/llm-profiles/')
- When loading, do not inject secrets. The runtime should reconcile secrets via ConversationState/Agent resolve_diff_from_deserialized or via SecretsManager.
- When saving, respect include_secrets flag; if False, ensure secret fields (api_key, aws_* keys) are omitted or masked.

CLI

- Use a single flag: --llm <profile_name> to select a profile for the agent LLM.
- Also support an environment fallback: OPENHANDS_LLM_PROFILE.
- Provide commands: `openhands llm list`, `openhands llm show <profile_name>` (redacts secrets).

Migration

- Provide a migration helper script that extracts inline LLMs from ~/.openhands/agent_settings.json and conversation base_state.json into ~/.openhands/llm-profiles/<name>.json and updates the references. Migration is a manual, opt-in step for the user.
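A migration helper along these lines might look like the following sketch. Both the settings schema and the `llm_profile` reference field are assumptions for illustration; the real agent_settings.json layout may differ:

```python
import json
import tempfile
from pathlib import Path


def extract_inline_llm(settings_path: Path, profiles_dir: Path, name: str) -> Path:
    """Move an inline "llm" block out of a settings file into a profile file.

    Hypothetical helper: replaces the inline config with a profile
    reference and returns the path of the new profile file.
    """
    settings = json.loads(settings_path.read_text(encoding="utf-8"))
    inline = settings.pop("llm")  # the inline LLM config to extract
    inline["profile_id"] = name
    profiles_dir.mkdir(parents=True, exist_ok=True)
    profile_path = profiles_dir / f"{name}.json"
    profile_path.write_text(json.dumps(inline, indent=2), encoding="utf-8")
    settings["llm_profile"] = name  # hypothetical profile reference field
    settings_path.write_text(json.dumps(settings, indent=2), encoding="utf-8")
    return profile_path


# Demo on a throwaway settings file
root = Path(tempfile.mkdtemp())
settings_file = root / "agent_settings.json"
settings_file.write_text(
    json.dumps({"llm": {"model": "gpt-4o-mini", "service_id": "agent"}}),
    encoding="utf-8",
)
migrated = extract_inline_llm(settings_file, root / "llm-profiles", "default")
```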

Notes on service_id rename

- There is an ongoing discussion about renaming `LLM.service_id` to a clearer name (e.g., `usage_id` or `token_tracking_id`) because `service_id` is overloaded. We will not rename immediately; agent-sdk-23 will investigate the migration and impact.

11 changes: 11 additions & 0 deletions examples/llm-profiles/example.json
@@ -0,0 +1,11 @@
{
"model": "gpt-4o-mini",
"base_url": "https://api.openai.com/v1",
"api_key": null,
"temperature": 0.0,
"max_output_tokens": 1024,
"service_id": "agent",
"metadata": {
"profile_description": "Example profile for local testing (no api_key stored)."
}
}
14 changes: 14 additions & 0 deletions openhands/sdk/conversation/impl/local_conversation.py
@@ -110,6 +110,20 @@ def _default_callback(e):
        for llm in list(self.agent.get_all_llms()):
            self.llm_registry.add(llm)

        # Eagerly discover and register LLM profiles from disk so they are
        # available through the registry (profiles are stored under
        # ~/.openhands/llm-profiles/*.json). This keeps behavior backward
        # compatible while making named profiles discoverable to the runtime.
        try:
            from openhands.sdk.llm.profile_manager import ProfileManager

            ProfileManager().register_all(self.llm_registry)
        except Exception:
            # Do not fail conversation initialization if profile loading has problems
            logger.debug(
                "No LLM profiles registered or failed to load profiles", exc_info=True
            )

        # Initialize secrets if provided
        if secrets:
            # Convert dict[str, str] to dict[str, SecretValue]
4 changes: 4 additions & 0 deletions openhands/sdk/llm/llm.py
@@ -208,6 +208,10 @@ class LLM(BaseModel, RetryMixin, NonNativeToolCallingMixin):
            "Safety settings for models that support them (like Mistral AI and Gemini)"
        ),
    )
    profile_id: str | None = Field(
        default=None,
        description="Optional profile id (filename under ~/.openhands/llm-profiles).",
    )
    service_id: str = Field(
        default="default",
        description="Unique identifier for LLM. Typically used by LLM registry.",
79 changes: 79 additions & 0 deletions openhands/sdk/llm/profile_manager.py
@@ -0,0 +1,79 @@
from __future__ import annotations

import json
import logging
from pathlib import Path

from openhands.sdk.llm.llm import LLM
from openhands.sdk.llm.llm_registry import LLMRegistry


logger = logging.getLogger(__name__)


class ProfileManager:
    """Manage LLM profile files on disk.

    Profiles are stored as JSON files using the existing LLM schema, typically
    at ~/.openhands/llm-profiles/<profile_name>.json.
    """

    def __init__(self, base_dir: str | Path | None = None):
        if base_dir is None:
            self.base_dir = Path.home() / ".openhands" / "llm-profiles"
        else:
            self.base_dir = Path(base_dir).expanduser()
        self.base_dir.mkdir(parents=True, exist_ok=True)

    def list_profiles(self) -> list[str]:
        return sorted(p.stem for p in self.base_dir.glob("*.json"))

    def get_profile_path(self, name: str) -> Path:
        return self.base_dir / f"{name}.json"

    def load_profile(self, name: str) -> LLM:
        path = self.get_profile_path(name)
        if not path.exists():
            raise FileNotFoundError(f"Profile not found: {name} -> {path}")
        # Use LLM.load_from_json to leverage pydantic validation
        llm = LLM.load_from_json(str(path))
        # Ensure profile_id matches the filename the profile was loaded from;
        # LLM instances are immutable, so produce an updated copy.
        if llm.profile_id is None:
            llm = llm.model_copy(update={"profile_id": name})
        return llm

    def save_profile(self, name: str, llm: LLM, include_secrets: bool = False) -> Path:
        path = self.get_profile_path(name)
        # Dump the model to a dict and ensure profile_id is set
        data = llm.model_dump(exclude_none=True)
        data["profile_id"] = name
        # Remove secret fields unless explicitly requested
        if not include_secrets:
            for secret_field in (
                "api_key",
                "aws_access_key_id",
                "aws_secret_access_key",
            ):
                data.pop(secret_field, None)
        with open(path, "w", encoding="utf-8") as f:
            json.dump(data, f, indent=2, ensure_ascii=False)
        logger.info(f"Saved profile {name} -> {path}")
        return path

    def register_all(self, registry: LLMRegistry) -> None:
        # Load and attempt to register every profile; skip invalid files and
        # duplicates rather than failing the whole batch.
        for name in self.list_profiles():
            try:
                llm = self.load_profile(name)
            except Exception as e:
                logger.warning(f"Failed to load profile {name}: {e}")
                continue
            try:
                registry.add(llm)
            except Exception as e:
                logger.info(f"Skipping profile {name}: registry.add failed: {e}")
70 changes: 70 additions & 0 deletions tests/sdk/llm/test_profile_manager.py
@@ -0,0 +1,70 @@
import json

from pydantic import SecretStr

from openhands.sdk.llm.llm import LLM
from openhands.sdk.llm.llm_registry import LLMRegistry
from openhands.sdk.llm.profile_manager import ProfileManager


def test_list_profiles_returns_sorted_names(tmp_path):
    manager = ProfileManager(base_dir=tmp_path)
    (tmp_path / "b.json").write_text("{}", encoding="utf-8")
    (tmp_path / "a.json").write_text("{}", encoding="utf-8")

    assert manager.list_profiles() == ["a", "b"]


def test_save_profile_excludes_secret_fields(tmp_path):
    manager = ProfileManager(base_dir=tmp_path)
    llm = LLM(
        model="gpt-4o-mini",
        service_id="service",
        api_key=SecretStr("secret"),
        aws_access_key_id=SecretStr("id"),
        aws_secret_access_key=SecretStr("value"),
    )

    path = manager.save_profile("sample", llm)
    data = json.loads(path.read_text(encoding="utf-8"))

    assert data["profile_id"] == "sample"
    assert data["service_id"] == "service"
    assert "api_key" not in data
    assert "aws_access_key_id" not in data
    assert "aws_secret_access_key" not in data


def test_load_profile_assigns_profile_id_when_missing(tmp_path):
    manager = ProfileManager(base_dir=tmp_path)
    profile_path = tmp_path / "foo.json"
    profile_path.write_text(
        json.dumps({"model": "gpt-4o-mini", "service_id": "svc"}),
        encoding="utf-8",
    )

    llm = manager.load_profile("foo")

    assert llm.profile_id == "foo"
    assert llm.service_id == "svc"


def test_register_all_skips_invalid_and_duplicate_profiles(tmp_path):
    manager = ProfileManager(base_dir=tmp_path)
    registry = LLMRegistry()

    llm = LLM(model="gpt-4o-mini", service_id="shared")
    manager.save_profile("alpha", llm)

    duplicate_data = llm.model_dump(exclude_none=True)
    duplicate_data["profile_id"] = "beta"
    (tmp_path / "beta.json").write_text(
        json.dumps(duplicate_data),
        encoding="utf-8",
    )

    (tmp_path / "gamma.json").write_text("{", encoding="utf-8")

    manager.register_all(registry)

    assert registry.list_services() == ["shared"]