How Docker built an AI assistant that serves 13 million developers
Docker has joined the growing list of developer tool companies adding AI assistants to their products, but with a twist that sets it apart. Gordon, Docker's new AI agent, demonstrates what happens when you combine deep product integration with specialized AI knowledge: an assistant that's more than just another chatbot.

Docker CEO announcing the kapa.ai integration
What makes Docker's approach particularly compelling for enterprise leaders is its speed to market: Docker went from concept to production-ready AI features in weeks rather than months, while maintaining the accuracy and reliability its 13+ million developers expect.
THE SITUATION
Helping millions of developers navigate complex tooling
Docker powers over 13 million monthly active developers who rely on its platform for containerization and deployment. As Docker's ecosystem expanded with new features, CLI options, and best practices, developers faced increasing complexity in:
Writing optimized Dockerfiles that follow best practices
Debugging container build and runtime errors
Finding the right documentation among 1,000+ pages
Avoiding outdated or incorrect information from web searches
The Docker team recognized that traditional documentation and generic AI tools weren't enough. They needed an AI solution that understood Docker's specific context and could provide accurate, up-to-date guidance.
————
THE SOLUTION
Docker implemented two AI solutions powered by Kapa.ai
A. Documentation AI assistant
First, Docker launched a documentation assistant powered by kapa.ai to provide instant, accurate answers to Docker-related questions directly within its documentation pages. This serves Docker's 13 million monthly documentation visitors, who often struggle to find specific information across the vast documentation site.
The docs AI search can answer questions in the user's preferred language, making Docker's technical knowledge accessible to a global developer base.

B. Gordon - The in-product AI agent
Building on this foundation, Docker created Gordon, an AI agent that lives directly inside Docker Desktop and the Docker CLI. This strategic placement means:
No context switching: Developers get help right where they're working
Access to local context: Gordon can analyze Dockerfiles and error messages in the developer's working directory (see the example command after this list)
UI-integrated suggestions: When containers fail, Gordon appears with debugging help
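For example, a session with Gordon in the terminal might look like the following. The `docker ai` subcommand and the sample questions are shown here as an illustration of the workflow described above, not as documented syntax or captured output.

    # Ask Gordon a question directly from the Docker CLI (illustrative invocation;
    # the `docker ai` subcommand is an assumption here - check Docker's docs for exact syntax)
    docker ai "Why does my container exit immediately after it starts?"

    # Because Gordon runs where the developer works, it can read local context
    # such as the Dockerfile in the current directory before answering.
    docker ai "How can I make the image built from this Dockerfile smaller?"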

How Gordon Works with Kapa.ai
When a developer asks Gordon a question in Docker Desktop or the CLI, Gordon uses an agentic pipeline (sketched in code below) that can:
Understand the user's intent and determine which tools to use
Gather context from local files (Dockerfiles, error logs, configurations)
Make tool calls to various services:
Kapa.ai API - for Docker product questions, best practices, and documentation queries
Other tools - for local file analysis, error parsing, and action execution
Generate and validate responses before presenting them to the user
Suggest or perform actions based on the analysis

Simplified diagram from Docker's official post
When users ask Docker-specific product questions (like "how do I optimize my Dockerfile?" or "what's the best way to run MongoDB?"), Gordon calls kapa.ai's API, which has indexed all of Docker's documentation, API references, tutorials, and best practices. This partnership lets Docker provide accurate, up-to-date answers to product questions while the Docker team focuses on building Gordon's other capabilities, such as local context analysis, error debugging, and action recommendations.
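A minimal sketch of what such a pipeline could look like is shown below. The function names, the intent routing, and the ask_kapa placeholder are assumptions made for illustration only; they do not represent Docker's actual implementation or kapa.ai's published API.

    # Hypothetical sketch of the agentic loop described above. Function names,
    # routing logic, and the kapa.ai call are illustrative placeholders only.
    from dataclasses import dataclass


    @dataclass
    class ToolResult:
        source: str
        content: str


    def classify_intent(question: str) -> str:
        """Step 1: rough intent routing between product questions and local debugging."""
        local_markers = ("error", "failed", "my build", "this dockerfile")
        q = question.lower()
        return "local_debug" if any(m in q for m in local_markers) else "product_question"


    def gather_local_context() -> str:
        """Step 2: read local context such as a Dockerfile in the working directory."""
        try:
            with open("Dockerfile") as f:
                return f.read()
        except FileNotFoundError:
            return ""


    def ask_kapa(question: str) -> ToolResult:
        """Step 3a: placeholder for a call to kapa.ai's knowledge API (endpoint not shown)."""
        return ToolResult("kapa.ai", f"[documentation-grounded answer to: {question}]")


    def run_local_tools(question: str, context: str) -> ToolResult:
        """Step 3b: placeholder for local tools (error parsing, file analysis, ...)."""
        note = "Dockerfile found" if context else "no Dockerfile in this directory"
        return ToolResult("local-analysis", f"[{note}; analysis for: {question}]")


    def answer(question: str) -> str:
        """Steps 4-5: assemble and validate a response before presenting it."""
        context = gather_local_context()
        if classify_intent(question) == "product_question":
            result = ask_kapa(question)
        else:
            result = run_local_tools(question, context)
        return f"({result.source}) {result.content}"


    if __name__ == "__main__":
        print(answer("What's the best way to run MongoDB?"))
        print(answer("My build failed with a permission error - what's wrong?"))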
————
THE BUSINESS IMPACT
The implementation has shown impressive results across multiple metrics:
Documentation AI Impact
1,000+ daily AI-resolved queries = 1,000+ support hours saved monthly
Multi-language support enabling global market expansion
24/7 availability without additional headcount
Gordon's Early Success
Gordon is already adept at answering general Docker-related questions, debugging container build or runtime errors, remediating policy deviations flagged by Docker Scout, optimizing Docker-related files and configurations, and telling users how to run specific containers.
Key capabilities developers are using:
Dockerfile optimization: Gordon identifies best-practice violations and suggests improvements (see the illustrative example after this list)
Error diagnosis: Paste an error message, get specific solutions
Container recommendations: Ask "how do I run MongoDB?" and get working configurations
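As an illustration of the Dockerfile-optimization use case, the snippet below contrasts a cache-unfriendly Node.js Dockerfile with a version that follows common best practices: a pinned, slimmer base image and a dependency layer that caches independently of source changes. This is a generic example, not output captured from Gordon.

    # Before: unpinned base image and a copy order that defeats layer caching
    FROM node
    COPY . .
    RUN npm install
    CMD ["node", "server.js"]

    # After: pinned slim base image, dependencies installed as a separately cached layer
    FROM node:20-slim
    WORKDIR /app
    COPY package*.json ./
    RUN npm ci --omit=dev
    COPY . .
    CMD ["node", "server.js"]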
Time-To-Market Advantage
By partnering with kapa.ai for the AI knowledge layer, Docker achieved in weeks what would have taken 6-12 months to build internally. This meant:
First-mover advantage in AI-powered container tooling
Engineering resources remained focused on core platform innovation
Avoided hiring additional ML engineers for in-house development
————
THE FUTURE
Looking Forward: The Future of In-Product AI
Docker's implementation showcases several best practices for adding AI to developer tools:
Meet developers where they work - Embedding AI in the product eliminates friction
Leverage existing knowledge - Using documentation as the AI's knowledge base ensures accuracy
Focus on specific use cases - Gordon optimizes for Docker tasks, not general questions
Build vs. buy the right components - Docker built the integration but partnered for the AI infrastructure
As Docker's CEO Scott Johnston highlighted when announcing the documentation AI, the goal is simple: help developers quickly find answers to their Docker questions. By combining documentation AI with an in-product agent, Docker has created a comprehensive AI strategy that serves developers at every stage of their journey.
The success of Docker's AI implementation - from processing thousands of daily documentation queries to helping debug complex container issues - demonstrates that thoughtful AI integration can measurably improve the developer experience. As more developer tools follow this path, we're likely to see AI assistants become as essential as syntax highlighting and autocomplete.

Name: Docker
Website: docker.com
Location: San Francisco, California
Industry: Developer Platform
Deployment Method: Docs Widget, API