Commit 08b85cc

Merge pull request #24001 from doringeman/dmr-anthropic-api

model-runner: add Anthropic API documentation

2 parents d7b942c + 5d3a379

File tree: 1 file changed (+61 −3 lines)
content/manuals/ai/model-runner/api-reference.md

Lines changed: 61 additions & 3 deletions
@@ -1,13 +1,13 @@
 ---
 title: DMR REST API
-description: Reference documentation for the Docker Model Runner REST API endpoints, including OpenAI and Ollama compatibility.
+description: Reference documentation for the Docker Model Runner REST API endpoints, including OpenAI, Anthropic, and Ollama compatibility.
 weight: 30
-keywords: Docker, ai, model runner, rest api, openai, ollama, endpoints, documentation, cline, continue, cursor
+keywords: Docker, ai, model runner, rest api, openai, anthropic, ollama, endpoints, documentation, cline, continue, cursor
 ---

 Once Model Runner is enabled, new API endpoints are available. You can use
 these endpoints to interact with a model programmatically. Docker Model Runner
-provides compatibility with both OpenAI and Ollama API formats.
+provides compatibility with OpenAI, Anthropic, and Ollama API formats.

 ## Determine the base URL

@@ -54,6 +54,7 @@ When configuring third-party tools that expect OpenAI-compatible APIs, use these
 | Tool type | Base URL format |
 |-----------|-----------------|
 | OpenAI SDK / clients | `http://localhost:12434/engines/v1` |
+| Anthropic SDK / clients | `http://localhost:12434` |
 | Ollama-compatible clients | `http://localhost:12434` |

 See [IDE and tool integrations](ide-integrations.md) for specific configuration examples.
@@ -65,6 +66,7 @@ Docker Model Runner supports multiple API formats:
 | API | Description | Use case |
 |-----|-------------|----------|
 | [OpenAI API](#openai-compatible-api) | OpenAI-compatible chat completions, embeddings | Most AI frameworks and tools |
+| [Anthropic API](#anthropic-compatible-api) | Anthropic-compatible messages endpoint | Tools built for Claude |
 | [Ollama API](#ollama-compatible-api) | Ollama-compatible endpoints | Tools built for Ollama |
 | [DMR API](#dmr-native-endpoints) | Native Docker Model Runner endpoints | Model management |
@@ -132,6 +134,62 @@ Be aware of these differences when using DMR's OpenAI-compatible API:
 | Logprobs | Supported. |
 | Token counting | Uses the model's native token encoder, which may differ from OpenAI's. |
+
+## Anthropic-compatible API
+
+DMR provides [Anthropic Messages API](https://platform.claude.com/docs/en/api/messages) compatibility for tools and frameworks built for Claude.
+
+### Endpoints
+
+| Endpoint | Method | Description |
+|----------|--------|-------------|
+| `/anthropic/v1/messages` | POST | [Create a message](https://platform.claude.com/docs/en/api/messages/create) |
+| `/anthropic/v1/messages/count_tokens` | POST | [Count tokens](https://docs.anthropic.com/en/api/messages-count-tokens) |
+
+### Supported parameters
+
+The following Anthropic API parameters are supported:
+
+| Parameter | Type | Description |
+|-----------|------|-------------|
+| `model` | string | Required. The model identifier. |
+| `messages` | array | Required. The conversation messages. |
+| `max_tokens` | integer | Maximum tokens to generate. |
+| `temperature` | float | Sampling temperature (0.0-1.0). |
+| `top_p` | float | Nucleus sampling parameter. |
+| `top_k` | integer | Top-k sampling parameter. |
+| `stream` | Boolean | Enable streaming responses. |
+| `stop_sequences` | array | Custom stop sequences. |
+| `system` | string | System prompt. |
+
+### Example: Chat with Anthropic API
+
+```bash
+curl http://localhost:12434/v1/messages \
+  -H "Content-Type: application/json" \
+  -d '{
+    "model": "ai/smollm2",
+    "max_tokens": 1024,
+    "messages": [
+      {"role": "user", "content": "Hello!"}
+    ]
+  }'
+```
+
+### Example: Streaming response
+
+```bash
+curl http://localhost:12434/v1/messages \
+  -H "Content-Type: application/json" \
+  -d '{
+    "model": "ai/smollm2",
+    "max_tokens": 1024,
+    "stream": true,
+    "messages": [
+      {"role": "user", "content": "Count from 1 to 10"}
+    ]
+  }'
+```
+
 ## Ollama-compatible API

 DMR also provides Ollama-compatible endpoints for tools and frameworks built for Ollama.
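The added section drives the messages endpoint with curl; the same request can be assembled programmatically. A minimal stdlib-only Python sketch, assuming the model name `ai/smollm2` from the docs' examples and the `/anthropic/v1/messages` path from the endpoint table (the curl examples use `/v1/messages`, so the exact prefix may depend on your setup):

```python
import json
from urllib.request import Request

# Base URL for Anthropic-style clients, per the docs' base-URL table.
BASE_URL = "http://localhost:12434"

def build_messages_request(prompt: str, stream: bool = False) -> Request:
    """Assemble a POST request for DMR's Anthropic-compatible messages endpoint.

    Uses only parameters the docs list as supported: model, messages,
    max_tokens, temperature, stream, system.
    """
    payload = {
        "model": "ai/smollm2",          # required: model identifier
        "max_tokens": 1024,             # maximum tokens to generate
        "temperature": 0.7,             # sampling temperature (0.0-1.0)
        "stream": stream,               # enable streaming responses
        "system": "You are a concise assistant.",
        "messages": [{"role": "user", "content": prompt}],
    }
    return Request(
        BASE_URL + "/anthropic/v1/messages",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_messages_request("Hello!")
```

Sending the request requires a running Model Runner instance, e.g. `urllib.request.urlopen(req)`.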

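With `"stream": true`, the response arrives incrementally rather than as one JSON body. A sketch of consuming such a stream, assuming DMR mirrors Anthropic's server-sent-events framing (`event: ...` / `data: {...}` line pairs); the sample events below are illustrative, not captured output:

```python
import json

def iter_sse_data(lines):
    """Yield parsed JSON payloads from the 'data:' lines of an SSE stream."""
    for line in lines:
        line = line.strip()
        if line.startswith("data:"):
            yield json.loads(line[len("data:"):].strip())

# Canned sample in Anthropic's streaming event shape (illustrative only).
sample = [
    'event: content_block_delta',
    'data: {"type": "content_block_delta", "delta": {"type": "text_delta", "text": "1, 2"}}',
    'event: message_stop',
    'data: {"type": "message_stop"}',
]
events = list(iter_sse_data(sample))
```

In practice the lines would come from the HTTP response body of the streaming request rather than a canned list.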
0 commit comments