Building AI workflows: from local experiments to serving users
Oleg Šelajev, Docker

Agentic applications need three things: models, tools, and code.

[Architecture diagram: user input or an event reaches the Agent Application, which calls a Large Language Model and tools (GitHub tool, Notion tool, SQL tool backed by an internal DB) and returns a response]

The Docker Model Runner
Run models next to your other containerized services, using the tools you're already using.

compose.yaml:

models:
  gemma3:
    model: ai/gemma3:4B-F16

services:
  app:
    models:
      gemma3:
        endpoint_var: OPENAI_BASE_URL
        model_var: OPENAI_MODEL
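Docker Model Runner exposes an OpenAI-compatible API, so the app only needs the two environment variables Compose injects. A minimal Python sketch of the app side, using only the standard library (the helper names and the example model value are my own):

```python
import json
import os
import urllib.request


def build_chat_request(prompt: str, base_url: str, model: str):
    """Build an OpenAI-compatible chat completion request.

    Compose fills the environment variables declared under
    endpoint_var/model_var with the model's endpoint and name.
    """
    url = base_url.rstrip("/") + "/chat/completions"
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return url, payload


def chat(prompt: str) -> str:
    # OPENAI_BASE_URL and OPENAI_MODEL come from the service's
    # `models:` section in the compose.yaml above.
    url, payload = build_chat_request(
        prompt,
        os.environ["OPENAI_BASE_URL"],
        os.environ["OPENAI_MODEL"],
    )
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]


if __name__ == "__main__" and os.environ.get("OPENAI_BASE_URL"):
    print(chat("Say hello in one word."))
```

Because the endpoint and model name come from the environment, the same code runs unchanged whether the model lives in Model Runner locally or behind a hosted OpenAI-compatible service.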

The MCP Catalog
Run MCP servers as containers, with no need to worry about language runtimes or local installs.

The MCP Gateway
Run containerized MCP servers safely and securely, directly in your application stack.

compose.yaml:

services:
  mcp-gateway:
    image: docker/mcp-gateway:latest
    use_api_socket: true
    command:
      - --transport=sse
      - --servers=duckduckgo
      - --tools=search,fetch_content
  app:
    …
    environment:
      MCP_ENDPOINT: http://mcp-gateway:8811/sse

[Architecture diagram: the same flow as before, but the Agent Application now reaches the GitHub, Notion, and SQL tools through the MCP Gateway]

Compose for agents
Build and run AI agents using Docker Compose.
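Putting the pieces together, one compose.yaml can describe the agent, its model, and its tools as a single stack. A sketch combining the earlier fragments (the `agent` service name and its image are placeholders):

```yaml
models:
  gemma3:
    model: ai/gemma3:4B-F16

services:
  agent:
    image: my-agent:latest   # placeholder for your agent application
    models:
      gemma3:
        endpoint_var: OPENAI_BASE_URL   # model endpoint injected here
        model_var: OPENAI_MODEL         # model name injected here
    environment:
      MCP_ENDPOINT: http://mcp-gateway:8811/sse

  mcp-gateway:
    image: docker/mcp-gateway:latest
    use_api_socket: true
    command:
      - --transport=sse
      - --servers=duckduckgo
      - --tools=search,fetch_content
```

`docker compose up` then brings up the model, the gateway, and the agent together, with all wiring expressed as configuration rather than code.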

Cloud Run and Docker Compose
Deploy your compose.yaml directly to Cloud Run: cloud.google.com/blog/products/serverless/cloud-run-and-docker-collaboration