Docker AI

Are Docker AI and the GenAI stack the future of AI development?

Charlotte Goetz

The relevance of Artificial Intelligence (AI) in today's software development can hardly be overstated. AI technologies have fundamentally changed the way applications are built and run. A few days ago, at its annual global developer conference DockerCon in Los Angeles, Docker Inc. together with its partners announced what they see as a landmark AI initiative. It includes the GenAI stack and Docker's first ever AI-powered product: Docker AI.

Docker AI draws on the knowledge of Docker developers to automatically generate best practices and select up-to-date, secure images for applications. It is based on CaaS (Containers as a Service), a service offered by Docker. The applications in the GenAI stack give developers the opportunity to integrate preconfigured AI components into their projects securely and with little effort. One example of a source for pre-trained AI models is the Konfuzio Marketplace, a central hub for innovative AI solutions and tools.

The Docker AI package

In addition to Docker AI and the GenAI stack, Docker's AI package includes, among other things:

  • preconfigured Large Language Models (LLMs),
  • vector and graph databases,
  • the LangChain framework and
  • supporting tools and code templates

for proven generative AI methods. Docker AI offers users contextual guidance when configuring Docker systems. Docker promises that, with the help of both innovations, developers can get started with generative AI applications in just a few minutes. But before we dive into the details, here is a brief overview of Docker, Docker containers, the relevant partners, and DockerCon.

Terms around Docker AI explained in a few sentences

What is Docker?

Docker Inc. provides, with its product Docker, a platform that allows applications to run in so-called containers. A container is a standardized unit of software that bundles an application's entire code together with all of its dependencies. This allows the application to run quickly and reliably in different computing environments. There have been some recent changes to Docker, including the integration of Kubernetes clusters and changes to the licensing of the desktop version. Because of these developments, more and more companies are considering switching to alternatives to Docker Desktop.
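How little the surrounding environment matters to a containerized application can be illustrated with the Docker SDK for Python; the image and command below are purely illustrative assumptions, a minimal sketch rather than a recommended setup:

```python
# Minimal sketch using the Docker SDK for Python ("pip install docker").
# Assumes a local Docker daemon is running; image and command are illustrative.
import docker

client = docker.from_env()  # connect to the local Docker daemon

# Run a throwaway container: same image, same behavior on any host with Docker.
output = client.containers.run(
    "python:3.11-slim",          # the container bundles the runtime the app needs
    ["python", "-c", "print('hello from a container')"],
    remove=True,                 # clean up the container after it exits
)
print(output.decode())
```

The same call behaves identically on any machine with a running Docker daemon, which is exactly the portability promise of containers.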

Docker container

Docker container technology was first introduced in 2013 as the "open-source Docker Engine". This technology was based on existing concepts in containerization, in particular the basic building blocks known as "cgroups" and "namespaces" in the Linux environment. Docker technology focuses on meeting the needs of developers and system administrators by isolating application dependencies from the underlying infrastructure.

Neo4j

Neo4j is established in the field of graph databases and analytics, enabling the efficient identification of hidden relationships and patterns in big data. With its graph stack, Neo4j provides features such as native vector search, data science, advanced analytics, and security controls for enterprises to solve business problems in various areas such as fraud detection, customer 360, knowledge graphs, and more.

LangChain

LangChain is an open-source framework and toolkit for developers that assists in building reasoning applications. It takes context into account and is based on Large Language Models (LLMs).

DockerCon

The event brings together the entire Docker developer community, including stakeholders and partners, to share knowledge and work together on advancing cloud-native development. DockerCon is essentially a highly focused learning opportunity where developers are exposed to new opportunities and potential within the Docker ecosystem.

The Docker AI

In its own words, Docker AI "picks up developers where they are" and increases the productivity of their existing skills and workflows.

"Code generation AIs increase developer productivity when writing source code, and that's fantastic"

Scott Johnston, CEO of Docker

Components of Docker AI

In addition to the source code, applications supported by Docker AI consist of:

  • web servers,
  • language runtimes,
  • databases,
  • message queues and
  • many other technologies.

According to current information, Docker AI helps developers quickly and securely define and troubleshoot all aspects of an app while iterating in their "inner loop," Docker said.

Code generation tools such as GitHub Copilot and Tabnine have reportedly helped speed up development work as much as tenfold. However, these tools cover only a small portion, about 10 to 15 percent, of the total development work. The majority, 85 to 90 percent, consists of tasks such as runtimes, front-end development and more, which are defined by Dockerfiles, Docker Compose files and Docker images.

Docker AI provides AI support for Docker system configuration

The new Docker product is thus a service that uses AI to help users configure their Docker systems, for example by creating Dockerfiles or fixing configuration issues. Similar to GitHub Copilot in the programming space, Docker AI offers contextual best-practice suggestions.

In addition, Docker AI includes automated, contextual guidance for developers when performing tasks such as:

  • editing Dockerfiles or Docker Compose files,
  • debugging the "Docker Build" or
  • performing local tests

It automatically generates best practices and selects secure application images. The AI draws on the knowledge of countless Docker development projects and, while still under development, is currently available as part of a Docker AI Early Access Program. Participation in the program is granted based on project thresholds. Docker Inc. has not yet added the cost of Docker AI to its official pricing at this time (October 12, 2023).
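To make the "inner loop" more tangible, the following sketch uses the Docker SDK for Python to build an image locally and surface any build errors, the kind of step where Docker AI's guidance is supposed to help. The build context path and image tag are assumptions chosen for illustration:

```python
# Sketch of an inner-loop step: build an image and surface errors locally.
# Uses the Docker SDK for Python; the build context path and tag are assumptions.
import docker
from docker.errors import APIError, BuildError

client = docker.from_env()

try:
    image, build_logs = client.images.build(path=".", tag="my-app:dev")
    for chunk in build_logs:
        if "stream" in chunk:
            print(chunk["stream"], end="")   # same output "docker build" prints
    print(f"Built {image.tags}")
except BuildError as err:
    # This is the point where contextual guidance (e.g. a Dockerfile fix) would help.
    print(f"Build failed: {err}")
except APIError as err:
    print(f"Docker daemon error: {err}")
```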

The GenAI stack

The GenAI stack is a collaborative effort between Docker, Neo4j, LangChain, and Ollama, and is part of a comprehensive suite of new AI and machine learning capabilities. The mission behind it: to provide developers with a quick and easy way to create AI applications.

The GenAI stack is available in the Docker Desktop Learning Center and in the associated repository. It aims to support common generative AI use cases using trusted open-source resources on Docker Hub. The components of the GenAI stack are carefully selected.

Preconfigured open-source LLMs as powerful AI models

One of the key components of the GenAI stack is the set of preconfigured Large Language Models (LLMs). These include models such as

  • Llama 2,
  • Code Llama,
  • Mistral as well as
  • private models like GPT-3.5 and GPT-4

from OpenAI. Providing these powerful models out of the box gives developers access to advanced AI capabilities from the start, without having to perform tedious configuration.

Help from Ollama simplifies the local deployment of LLMs

Ollama helps developers run open-source LLMs on local systems. This ensures full control over the AI models as well as efficient integration into their own AI projects.
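What this looks like in practice can be sketched against Ollama's local HTTP API (by default on port 11434); the model name and prompt below are illustrative assumptions:

```python
# Sketch: query a locally running Ollama server over its HTTP API.
# Assumes Ollama is running on the default port and the model has been pulled
# (e.g. "ollama pull llama2"); model name and prompt are illustrative.
import requests

response = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama2",
        "prompt": "Explain in one sentence what a Docker container is.",
        "stream": False,          # return a single JSON object instead of a stream
    },
    timeout=120,
)
response.raise_for_status()
print(response.json()["response"])
```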

Neo4j as a database with the power of graphs and vectors

Neo4j serves as the default database in the GenAI stack and enables both graph queries and native vector search. This database is capable of identifying both explicit and implicit patterns and relationships in the data, which is critical for increasing the speed and accuracy of AI and machine learning models. In addition, Neo4j acts as a long-term memory for these models, increasing their performance over time.
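A rough sketch with the official Neo4j Python driver shows both sides, a classic graph query and a call to the native vector index introduced in recent Neo4j 5.x releases; connection details, node labels, the index name and the query embedding are assumptions:

```python
# Sketch using the official Neo4j Python driver ("pip install neo4j").
# Connection details, labels, index name and the embedding are assumptions.
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

with driver.session() as session:
    # Classic graph query: explicit relationships between nodes.
    result = session.run(
        "MATCH (a:Article)-[:MENTIONS]->(t:Topic) RETURN t.name AS topic LIMIT 5"
    )
    print([record["topic"] for record in result])

    # Native vector search (Neo4j 5.x): nearest neighbours of a query embedding.
    query_embedding = [0.1] * 384  # placeholder vector matching the index dimension
    result = session.run(
        "CALL db.index.vector.queryNodes($index, $k, $embedding) "
        "YIELD node, score RETURN node.text AS text, score",
        index="chunk_embeddings", k=3, embedding=query_embedding,
    )
    for record in result:
        print(record["score"], record["text"])

driver.close()
```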

Neo4j knowledge graphs form the foundation for accurate GenAI predictions

Using Neo4j knowledge graphs as the basis for Large Language Models allows for more accurate predictions and results in the field of generative AI. These knowledge graphs serve as a rich knowledge base that the models can access to produce better and more contextual results.

LangChain orchestration for linking LLMs, applications and databases

LangChain plays a key role in the orchestration of Large Language Models, applications and databases. This component creates integration and communication between the different parts of the GenAI stack. In particular, it supports the development of context-aware reasoning applications based on Large Language Models.
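As an illustration of what this orchestration can look like, the following sketch wires an Ollama-served LLM, Ollama embeddings and a Neo4j vector index into a simple retrieval chain. It assumes a LangChain release from around late 2023 that ships the Ollama, OllamaEmbeddings and Neo4jVector integrations; class paths, parameters and the index name may differ between versions:

```python
# Sketch of GenAI-stack-style orchestration with LangChain.
# Assumes a LangChain version that ships these integrations (class paths and
# parameters vary between releases); connection details and index name are
# illustrative.
from langchain.llms import Ollama
from langchain.embeddings import OllamaEmbeddings
from langchain.vectorstores import Neo4jVector
from langchain.chains import RetrievalQA

llm = Ollama(model="llama2")                       # local LLM served by Ollama
embeddings = OllamaEmbeddings(model="llama2")      # embeddings from the same model

# Connect to an existing vector index in Neo4j (the stack's default database).
store = Neo4jVector.from_existing_index(
    embedding=embeddings,
    url="bolt://localhost:7687",
    username="neo4j",
    password="password",
    index_name="chunk_embeddings",
)

# The chain ties LLM, application logic and database together.
qa = RetrievalQA.from_chain_type(llm=llm, retriever=store.as_retriever())
print(qa.run("What does the GenAI stack consist of?"))
```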

Supporting tools, templates and best practices for GenAI

In addition to the main components, the GenAI stack offers a wide range of supporting tools, code templates, instructions and best practices for developers. These resources help developers realize the full potential of the GenAI stack and achieve optimal results when developing AI applications.

The GenAI stack is presented as a strong answer to the challenges AI developers have faced so far. Now, according to Docker, they benefit from a user-friendly configuration that offers them a wide range of functions. These include easy data loading and vector index creation. These functionalities allow them to work seamlessly with data and embed questions and answers in the indexes.
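What "easy data loading and vector index creation" can look like under the hood is sketched below with the Neo4j Python driver: a vector index is created and a text chunk is stored together with its embedding. The procedure call targets recent Neo4j 5.x releases; index name, dimension and the placeholder embedding are assumptions:

```python
# Sketch: create a vector index and load one embedded text chunk into Neo4j.
# Targets recent Neo4j 5.x releases; names, dimension and embedding are assumptions.
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

with driver.session() as session:
    # Create a vector index over the "embedding" property of Chunk nodes.
    session.run(
        "CALL db.index.vector.createNodeIndex("
        "'chunk_embeddings', 'Chunk', 'embedding', 384, 'cosine')"
    )
    # Store a text chunk together with its (here: placeholder) embedding.
    session.run(
        "CREATE (c:Chunk {text: $text, embedding: $embedding})",
        text="Docker AI was announced at DockerCon 2023.",
        embedding=[0.1] * 384,
    )

driver.close()
```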

Additionally, the platform enables advanced querying and enrichment of application results through data summarization and flexible knowledge graphs. Developers are able to create diverse response formats, which range from lists to GitHub issues and PDFs to poems.

Particularly interesting is the ability to compare the results achieved, whether between standalone Large Language Models (LLMs), LLMs with vector integration, or LLMs that use both vector and knowledge graph integration. This gives developers a wide range of choice and flexibility in developing their AI applications.

Statements about Docker AI and the GenAI stack


Opinions from Docker stakeholders and partners were collected as part of a press release for DockerCon on October 5, 2023. In sum, these experts assess the new developments extremely positively and see the future of AI development as bright.

We would like to emphasize that these are expressly the opinions of sympathetic parties and not an objective assessment by neutral industry experts.

James Governor, principal analyst and co-founder of RedMonk, highlights the need for a consistent experience across the tool landscape to attract mainstream developers to AI development. Emil Eifrem, co-founder and CEO of Neo4j, is excited about the opportunities opening up for millions of developers. Harrison Chase, co-founder and CEO of LangChain, talks about the bridge between the magical user experience of GenAI and the work that needs to be done. Jeffrey Morgan, founder of Ollama, expresses excitement about working with the Docker community to develop the next generation of AI applications.

"Research from IDC shows that generative AI tools contribute to developer satisfaction by improving productivity, increasing speed, and spending more time completing higher-value tasks."

Katie Norton, senior research analyst for DevOps and DevSecOps at IDC

In the specific context of Docker AI, Katie Norton follows up with another statement, "Docker AI guidance will not only help achieve these benefits, but also prepare developers for success across the application stack. By leveraging the collective knowledge of the Docker developer community, developers can be confident that Docker AI's insights are based on best practices and recommend the most secure and up-to-date images."

Conclusion and outlook

The announcement of Docker AI and the GenAI stack at DockerCon 2023 is currently generating a lot of interest in the developer community, as both are expected to simplify and accelerate AI development with Docker containers. In summary, Docker Inc. and its partners emphasize that Docker AI will help developers configure Docker systems and apply best practices, while the GenAI stack provides a diverse set of tools and resources for AI development.

However, there is little to no customer feedback on these new products and services so far, as they are still in an early access program.

It is important to note that the positive opinions of Docker stakeholders and partners presented in the press release are subjective; further objective evaluations and real-world experience will have to show whether Docker AI and the GenAI stack are indeed the future of AI development.

It remains to be seen how these new tools will impact the development landscape, especially given the fact that previous code generation tools represent only a small portion of development work.

What is your opinion on Docker AI?

Write us a message. We look forward to the professional exchange.
