At its annual global developer conference, DockerCon, Docker announced two new generative AI tools: Docker AI and the GenAI Stack. Both are aimed at streamlining AI integration for developers. Of the two, Docker AI is drawing the most attention: it is Docker's first AI-powered tool, and it draws on the collective wisdom of the millions of engineers in Docker's open-source community.
The aim of Docker AI is to “meet developers where they are” by helping them harness the full potential of AI and ML in their applications. It does this by offering context-specific suggestions that boost productivity while building applications, for example, recommending best practices for using Docker during development.
Another highlight is the launch of the GenAI Stack in collaboration with Neo4j, Ollama, and LangChain. The GenAI Stack lets developers get started with generative AI applications in a matter of minutes, offering preconfigured management tools and Large Language Models (LLMs) to speed up application development. Neo4j serves as the stack's default database, which can improve model accuracy and help uncover patterns in data.
The GenAI Stack comes packaged with preconfigured, ready-to-code, secure LLMs from Ollama and the LangChain framework, eliminating the need to track down and configure these technologies from separate sources. The stack is currently in early access and can be reached from the Docker Desktop Learning Center or on GitHub.
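As a rough sketch of what getting started looks like, the GenAI Stack is distributed as a Docker Compose project on GitHub; the commands below assume a working Docker Desktop installation and follow the repository's documented quick-start (environment-variable names and defaults may differ by release, so check the repo's README and `env.example`):

```shell
# Fetch the GenAI Stack repository (Docker's official repo on GitHub).
git clone https://github.com/docker/genai-stack
cd genai-stack

# The stack reads its configuration (model choice, Neo4j credentials,
# optional API keys) from a .env file; the repo ships a template for it.
cp env.example .env

# Bring up the preconfigured services (Ollama-served LLM, Neo4j
# database, and the sample LangChain applications) in one step.
docker compose up
```

Because the services are preconfigured in the Compose file, this single `docker compose up` is what replaces manually installing and wiring together Ollama, Neo4j, and LangChain.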
Docker's recent announcements signal its intention to further democratize generative AI development. With tools like Docker AI and the GenAI Stack, developers can now build modern, scalable applications with AI models backed by Docker's tooling.
By making AI a core part of the DevOps workflow, these Docker AI tools are positioned to play a pivotal role in the future of software development and deployment.