
How to support local models #1

@yangweijie

Description


As the title says: is there support for local models served by ollama or LM Studio? claude-code-proxy can get claude-code running against a local model.
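For context, both ollama and LM Studio expose OpenAI-compatible HTTP endpoints on the local machine (ollama on port 11434 and LM Studio on port 1234 by default), which is the kind of backend a translation layer such as claude-code-proxy forwards requests to. Below is a minimal sketch, assuming ollama is running locally with a model already pulled; the model name and endpoint URL are assumptions for illustration, not settings taken from this project:

```python
# Minimal sketch: send an OpenAI-format chat request to a local ollama server.
# Assumptions: ollama is running with its default OpenAI-compatible endpoint
# at http://localhost:11434/v1, and the "llama3" model has been pulled.
import requests

OLLAMA_OPENAI_URL = "http://localhost:11434/v1/chat/completions"  # default ollama port (assumption)

payload = {
    "model": "llama3",  # any model already pulled into ollama (assumption)
    "messages": [
        {"role": "user", "content": "Say hello from a local model."}
    ],
}

resp = requests.post(OLLAMA_OPENAI_URL, json=payload, timeout=60)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```

A proxy-based setup would sit between claude-code and an endpoint like this, translating Anthropic-style requests into the OpenAI format the local runtime accepts.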
