GitLab Duo Self-Hosted

  • Tier: Premium, Ultimate
  • Add-on: GitLab Duo Enterprise
  • Offering: GitLab Self-Managed

To maintain full control over your data privacy, security, and the deployment of large language models (LLMs) in your own infrastructure, use GitLab Duo Self-Hosted.

By deploying GitLab Duo Self-Hosted, you can manage the entire lifecycle of requests made to LLM backends for GitLab Duo features, ensuring that all requests stay in your enterprise network, and avoiding external dependencies.

For a click-through demo, see the GitLab Duo Self-Hosted product tour.

For an overview, see GitLab Duo Self-Hosted: AI in your private environment.

Why use GitLab Duo Self-Hosted

With GitLab Duo Self-Hosted, you can:

  • Choose any GitLab-supported LLM.
  • Retain full control over data by keeping all request/response logs in your domain, ensuring complete privacy and security with no external API calls.
  • Isolate the GitLab instance, AI gateway, and models in your own environment.
  • Select specific GitLab Duo features tailored to your users.
  • Eliminate reliance on the shared GitLab AI gateway.

This setup ensures enterprise-level privacy and flexibility, allowing seamless integration of your LLMs with GitLab Duo features.

Supported GitLab Duo features

The following tables list the GitLab Duo features and whether they are available with GitLab Duo Self-Hosted.

To use these features with GitLab Duo Self-Hosted, you must have the Duo Enterprise add-on. This requirement applies even if the features are available with Duo Core or Duo Pro when GitLab hosts the models and connects to them through the cloud-based AI gateway.

Code Suggestions

Feature          | Available on GitLab Duo Self-Hosted | GitLab version
Code Suggestions | Yes                                 | GitLab 17.9 and later

Chat

Feature               | Available on GitLab Duo Self-Hosted | GitLab version
General               | Yes                                 | GitLab 17.9 and later
Explain Code          | Yes                                 | GitLab 17.9 and later
Write Test            | Yes                                 | GitLab 17.9 and later
Refactor Code         | Yes                                 | GitLab 17.9 and later
Fix Code              | Yes                                 | GitLab 17.9 and later
Troubleshoot Job      | Yes                                 | GitLab 17.10 and later
Explain Vulnerability | Yes                                 | GitLab 18.1 and later

For more examples of questions you can ask, see Ask about GitLab.

GitLab Duo in merge requests

Feature                     | Available on GitLab Duo Self-Hosted | GitLab version
Generate Commit Message     | Yes                                 | GitLab 18.1 and later
Summarize New Merge Request | Yes                                 | GitLab 18.1 and later
Code Review                 | No                                  | Not applicable
Code Review Summary         | Yes                                 | GitLab 18.1 and later

GitLab Duo in issues

Feature                      | Available on GitLab Duo Self-Hosted | GitLab version
Issue Description Generation | No                                  | Not applicable
Issue Discussion Summary     | Yes                                 | GitLab 18.1 and later

Other features

Feature                  | Available on GitLab Duo Self-Hosted | GitLab version
GitLab Duo for the CLI   | Yes                                 | GitLab 18.1 and later
GitLab Duo Workflow      | No                                  | GitLab 17.4 and later
Vulnerability Resolution | Yes                                 | GitLab 18.1 and later
AI Impact Dashboard      | Yes                                 | GitLab 17.9 and later

Prerequisites

Before setting up the GitLab Duo Self-Hosted infrastructure, you must have:

Decide on your configuration type

GitLab Self-Managed customers can implement AI-native features using either of the following options:

  • A self-hosted AI gateway and LLMs
  • The GitLab.com AI gateway with default GitLab external vendor LLMs

The differences between these options are:

Feature                     | Self-hosted AI gateway                          | GitLab.com AI gateway
Infrastructure requirements | Requires hosting your own AI gateway and models | No additional infrastructure needed
Model options               | Choose from supported models                    | Uses the default GitLab external vendor LLMs
Network requirements        | Can operate in fully isolated networks          | Requires internet connectivity
Responsibilities            | You set up and maintain your own infrastructure | GitLab does the setup and maintenance

Self-hosted AI gateway and LLMs

In a fully self-hosted configuration, you deploy your own AI gateway and use any supported LLMs in your infrastructure, without relying on external public services. This gives you full control over your data and security.

While this configuration is fully self-hosted and you can use models like Mistral that are hosted on your own infrastructure, you can still use cloud-based LLM services like AWS Bedrock or Azure OpenAI as your model backend.
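For illustration, the following minimal Python sketch shows one way to confirm that a cloud-based model backend such as AWS Bedrock is reachable from your network before you connect it to the AI gateway. The region and model ID are placeholders, and the sketch assumes AWS credentials are already configured; adapt it to a model enabled in your account.

```python
# Minimal connectivity check against an AWS Bedrock model backend.
# Assumes boto3 is installed and AWS credentials are configured.
# The region and model ID below are placeholders for your deployment.
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

response = bedrock.converse(
    modelId="mistral.mistral-large-2402-v1:0",  # example model ID; use one enabled in your account
    messages=[{"role": "user", "content": [{"text": "Reply with OK if you can read this."}]}],
)

# Print the model's reply to confirm the backend responds.
print(response["output"]["message"]["content"][0]["text"])
```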

You can use GitLab Duo Self-Hosted if you have an offline environment with physical barriers or security policies that prevent or limit internet access, and you need comprehensive control over your LLMs.

For licensing, you must have a GitLab Premium or Ultimate subscription, and GitLab Duo Enterprise. Offline Enterprise licenses are available for those customers with fully isolated offline environments. To get access to your purchased subscription, request a license through the Customers Portal.

For more information, see:

GitLab.com AI gateway with default GitLab external vendor LLMs

If you do not meet the use case criteria for GitLab Duo Self-Hosted, you can use the GitLab.com AI gateway with default GitLab external vendor LLMs.

The GitLab.com AI gateway is the default Enterprise offering and is not self-hosted. In this configuration, you connect your instance to the GitLab-hosted AI gateway, which integrates with external vendor LLM providers, including:

These LLMs communicate through the GitLab Cloud Connector, offering a ready-to-use AI solution without the need for on-premise infrastructure.

For licensing, you must have a GitLab Premium or Ultimate subscription, and GitLab Duo Enterprise. To get access to your purchased subscription, request a license through the Customers Portal.

For more information, see the GitLab.com AI gateway configuration diagram.

To set up this infrastructure, see how to configure GitLab Duo on a GitLab Self-Managed instance.

Set up a GitLab Duo Self-Hosted infrastructure

To set up a fully isolated GitLab Duo Self-Hosted infrastructure:

  1. Install a large language model (LLM) serving infrastructure

    • GitLab supports various platforms for serving and hosting your LLMs, such as vLLM, AWS Bedrock, and Azure OpenAI. For details on each platform’s features and help choosing the most suitable option for deploying your models, see the supported LLM platforms documentation. A connectivity check against an OpenAI-compatible model server is sketched after this list.

    • GitLab provides a comprehensive matrix of supported models, along with their features and hardware requirements. To select the models that best align with your infrastructure needs, see the supported models and hardware requirements documentation.

  2. Install the GitLab AI gateway

    Install the AI gateway so that your GitLab instance can route GitLab Duo requests to your self-hosted models. A gateway health check is sketched after this list.

  3. Configure GitLab Duo features

    See the Configure GitLab Duo features documentation for instructions on how to customize your environment to meet your operational needs.

  4. Enable logging

    For configuration details on enabling logging in your environment, and for guidance on using logs to track and manage your system’s performance, see the logging documentation.
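As referenced in step 1, the following sketch shows one way to confirm that a self-hosted model server responds before you connect it to the AI gateway. It assumes a vLLM server exposing its OpenAI-compatible API; the base URL, API key, and model name are placeholders to adapt for your deployment.

```python
# Sketch: query a self-hosted vLLM server through its OpenAI-compatible API.
# The base URL, API key, and model name are placeholders for your deployment.
from openai import OpenAI

client = OpenAI(
    base_url="http://vllm.internal.example.com:8000/v1",  # placeholder vLLM endpoint
    api_key="not-needed-for-local-vllm",                  # vLLM accepts any key unless you configure one
)

completion = client.chat.completions.create(
    model="mistralai/Mistral-7B-Instruct-v0.3",  # example model name served by vLLM
    messages=[{"role": "user", "content": "Reply with OK if you can read this."}],
)

# A non-empty reply confirms the model server is serving requests.
print(completion.choices[0].message.content)
```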
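Similarly, for step 2, a quick way to confirm that the AI gateway is reachable from your GitLab instance’s network is to probe its health endpoint. The gateway URL below is a placeholder, and the /monitoring/healthz path is an assumption to verify against the AI gateway documentation for your version.

```python
# Sketch: probe the self-hosted AI gateway's health endpoint.
# The gateway URL is a placeholder; the /monitoring/healthz path is an
# assumption to check against the AI gateway documentation for your version.
import requests

AI_GATEWAY_URL = "http://ai-gateway.internal.example.com:5052"  # placeholder

response = requests.get(f"{AI_GATEWAY_URL}/monitoring/healthz", timeout=10)
response.raise_for_status()
print(f"AI gateway responded: HTTP {response.status_code}")
```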