 ---
 title: Build and run agentic AI applications with Docker
 linktitle: Agentic AI applications
-keywords: AI, Docker, Model Runner, MCP Toolkit, Docker Offload, AI agents, application development
+keywords: AI, Docker, Model Runner, MCP Toolkit, AI agents, application development
 summary: |
-  Learn how to create AI agent applications using Docker Model Runner, MCP Toolkit, and Docker Offload.
+  Learn how to create AI agent applications using Docker Model Runner and MCP Toolkit.
 params:
   tags: [AI]
   time: 30 minutes
 ---
 
+> [!TIP]
+>
+> This guide uses the familiar Docker Compose workflow to orchestrate agentic AI
+> applications. For a smoother development experience, check out [Docker
+> cagent](../manuals/ai/cagent/_index.md), a purpose-built agent runtime that
+> simplifies running and managing AI agents.
+
 ## Introduction
 
 Agentic applications are transforming how software gets built. These apps don't
@@ -31,8 +38,8 @@ architecture. It's a new kind of stack, built from three core components:
   capabilities via the Model Context Protocol (MCP).
 
 Docker makes this AI-powered stack simpler, faster, and more secure by unifying
-models, tool gateways, and cloud infrastructure into a developer-friendly
-workflow that uses Docker Compose.
+models and tool gateways into a developer-friendly workflow that uses Docker
+Compose.
 
 ![A diagram of the agentic stack](./images/agentic-ai-diagram.webp)
 
@@ -46,26 +53,23 @@ shows how Docker ties them all together with the following tools:
   and securely run external tools, like APIs and databases, using the Model
   Context Protocol (MCP).
 - [Docker MCP Gateway](../manuals/ai/mcp-catalog-and-toolkit/mcp-gateway.md) lets you orchestrate and manage MCP servers.
-- [Docker Offload](/offload/) provides a powerful, GPU-accelerated
-  environment to run your AI applications with the same Compose-based
-  workflow you use locally.
 - [Docker Compose](/manuals/ai/compose/models-and-compose.md) is the tool that ties it all
   together, letting you define and run multi-container applications with a
   single file.
 
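+To make that concrete, here's a minimal, illustrative sketch of Compose's model
+support. The service and model names are placeholders for this example, not
+taken from the sample app:
+
+```yaml
+# Sketch only: declare a model next to your services in compose.yaml.
+models:
+  llm:
+    model: ai/gemma3    # a model tag from Docker Hub's ai/ namespace
+
+services:
+  app:
+    build: .
+    models:
+      - llm             # short syntax: Compose injects LLM_URL and LLM_MODEL
+```
+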
-For this guide, you'll start by running the app in Docker Offload, using the
-same Compose workflow you're already familiar with. Then, if your machine
-hardware supports it, you'll run the same app locally using the same workflow.
-Finally, you'll dig into the Compose file, Dockerfile, and app to see how it all
-works together.
+For this guide, you'll run the app with the Compose workflow you're already
+familiar with, then dig into the Compose file, Dockerfile, and app to see how
+it all works together.
 
 ## Prerequisites
 
 To follow this guide, you need to:
 
 - [Install Docker Desktop 4.43 or later](../get-started/get-docker.md)
 - [Enable Docker Model Runner](/manuals/ai/model-runner.md#enable-dmr-in-docker-desktop)
-- [Join Docker Offload Beta](/offload/quickstart/)
+- Meet at least the following hardware specifications:
+  - VRAM: 3.5 GB
+  - Storage: 2.31 GB
 
 ## Step 1: Clone the sample application
 
@@ -77,59 +81,9 @@ $ git clone https://github.com/docker/compose-for-agents.git
 $ cd compose-for-agents/adk/
 ```
 
-## Step 2: Run the application with Docker Offload
-
-You'll start by running the application in Docker Offload, which provides a
-managed environment for running AI workloads. This is ideal if you want to
-leverage cloud resources or if your local machine doesn't meet the hardware
-requirements to run the model locally. Docker Offload includes support for
-GPU-accelerated instances, making it ideal for compute-intensive workloads like
-AI model inference.
-
-To run the application with Docker Offload, follow these steps:
-
-1. Sign in to the Docker Desktop Dashboard.
-2. In a terminal, start Docker Offload by running the following command:
-
-   ```console
-   $ docker offload start
-   ```
-
-   When prompted, choose the account you want to use for Docker Offload and
-   select **Yes** when prompted **Do you need GPU support?**.
-
-3. In the `adk/` directory of the cloned repository, run the following command
-   in a terminal to build and run the application:
-
-   ```console
-   $ docker compose up
-   ```
-
-   The first time you run this command, Docker pulls the model from Docker Hub,
-   which may take some time.
-
-   The application is now running with Docker Offload. Note that the Compose
-   workflow is the same when using Docker Offload as it is locally. You define
-   your application in a `compose.yaml` file, and then use `docker compose up`
-   to build and run it.
-
-4. Visit [http://localhost:8080](http://localhost:8080). Enter a correct or
-   incorrect fact in the prompt and hit enter. An agent searches DuckDuckGo to
-   verify it and another agent revises the output.
-
-   ![Screenshot of the application](./images/agentic-ai-app.png)
+## Step 2: Run the application locally
 
-5. Press ctrl-c in the terminal to stop the application when you're done.
-
-6. Run the following command to stop Docker Offload:
-
-   ```console
-   $ docker offload stop
-   ```
-
-## Step 3: Optional. Run the application locally
-
-If your machine meets the necessary hardware requirements, you can run the
+Your machine must meet the necessary hardware requirements to run the
 entire application stack locally using Docker Compose. This lets you test the
 application end-to-end, including the model and MCP gateway, without needing to
 run in the cloud. This particular example uses the [Gemma 3 4B
@@ -159,9 +113,11 @@ To run the application locally, follow these steps:
    incorrect fact in the prompt and hit enter. An agent searches DuckDuckGo to
    verify it and another agent revises the output.
 
+   ![Screenshot of the application](./images/agentic-ai-app.png)
+
 3. Press ctrl-c in the terminal to stop the application when you're done.
 
-## Step 4: Review the application environment
+## Step 3: Review the application environment
 
 You can find the `compose.yaml` file in the `adk/` directory. Open it in a text
 editor to see how the services are defined.
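+
+If you don't have the file open, the following sketch shows the rough shape of
+such a Compose file. It's illustrative only: the build context, image name, and
+model tag are assumptions for this example, not the file verbatim.
+
+```yaml
+# Illustrative sketch of the structure, not the exact file contents.
+services:
+  adk:
+    build: .                            # the ADK web app, served on port 8080
+    ports:
+      - "8080:8080"
+    models:
+      gemma3:
+        endpoint_var: MODEL_RUNNER_URL  # long syntax: pick the names of the
+        model_var: MODEL_RUNNER_MODEL   # injected environment variables
+
+  mcp-gateway:
+    image: docker/mcp-gateway           # assumed image for the MCP Gateway
+
+models:
+  gemma3:
+    model: ai/gemma3                    # pulled from Docker Hub on first run
+```
+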
@@ -316,7 +272,7 @@ Together, these variables let the same ADK web server code seamlessly target eit
 - Hosted OpenAI: if you supply `OPENAI_API_KEY` (and optionally `OPENAI_MODEL_NAME`)
 - Model Runner: by remapping `MODEL_RUNNER_URL` and `MODEL_RUNNER_MODEL` into the OpenAI client’s expected variables
 
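+As a rough sketch of that dual wiring in the service definition (the
+`${VAR:-}` passthroughs are illustrative, and the remapping itself happens in
+the app's entrypoint rather than in Compose):
+
+```yaml
+services:
+  adk:
+    environment:
+      # Hosted OpenAI path: pass these through only if set on the host.
+      - OPENAI_API_KEY=${OPENAI_API_KEY:-}
+      - OPENAI_MODEL_NAME=${OPENAI_MODEL_NAME:-}
+    models:
+      gemma3:
+        endpoint_var: MODEL_RUNNER_URL  # Model Runner's OpenAI-compatible URL
+        model_var: MODEL_RUNNER_MODEL   # the local model to fall back to
+```
+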
-## Step 5: Review the application
+## Step 4: Review the application
 
 The `adk` web application is an agent implementation that connects to the MCP
 gateway and a model through environment variables and API calls. It uses the
@@ -375,7 +331,7 @@ combine local model inference with external tool integrations in a structured,
 modular way.
 
 You also saw how Docker simplifies this process by providing a suite of tools
-that support local and cloud-based agentic AI development:
+that support agentic AI development:
 
 - [Docker Model Runner](../manuals/ai/model-runner/_index.md): Run and serve
   open-source models locally via OpenAI-compatible APIs.
@@ -386,9 +342,7 @@ that support local and cloud-based agentic AI development:
   MCP servers to connect agents to external tools and services.
 - [Docker Compose](/manuals/ai/compose/models-and-compose.md): Define and run
   multi-container agentic AI applications with a single file, using the same
-  workflow locally and in the cloud.
-- [Docker Offload](/offload/): Run GPU-intensive AI workloads in a secure, managed
-  cloud environment using the same Docker Compose workflow you use locally.
+  workflow.
 
 With these tools, you can develop and test agentic AI applications efficiently,
-locally or in the cloud, using the same consistent workflow throughout.
+using the same consistent workflow throughout.