Dagger.io: Programmable CI/CD Pipelines That Replace YAML
Your CI/CD pipeline is probably a pile of YAML files that nobody on the team fully understands. One developer wrote them eight months ago, they mostly work, and everyone is afraid to touch them. When something breaks, you spend 45 minutes pushing commits to see if your syntax fix actually lands, because there's no way to run the pipeline locally. Sound familiar?
Dagger fixes this. Created by Solomon Hykes (the co-founder of Docker), Dagger replaces YAML-based CI/CD configurations with real, type-safe code in Python, Go, or TypeScript. Your pipelines become actual software -- testable locally, debuggable in your IDE, and portable across any CI provider. In 2026, with nearly 1,500 community modules on the Daggerverse and integrations with every major CI platform, Dagger has moved from "interesting experiment" to "serious contender."
This guide walks you through setting up Dagger, writing your first pipeline functions, integrating with your existing CI, and understanding the caching system that makes Dagger pipelines run 2-10x faster than their YAML equivalents.
What You'll Need
- Docker or a compatible container runtime -- Dagger runs everything in containers under the hood
- The Dagger CLI -- installed via Homebrew, curl, or your package manager of choice
- One of the supported languages -- Python 3.10+, Go 1.21+, or Node.js 18+ (for TypeScript)
- An existing project to pipeline -- Dagger shines on real codebases, not hello-world demos
- 15-30 minutes -- enough to go from installation to your first working pipeline
What Dagger Actually Is (And Isn't)
Dagger is not a CI platform. It doesn't replace GitHub Actions, GitLab CI, or Jenkins -- it runs inside them. Think of it as an abstraction layer that sits between your code and whichever CI system triggers your builds.
Here's the mental model:
```
┌─────────────────────────────────────────────────────┐
│                  Your CI Provider                   │
│          (GitHub Actions, GitLab, Jenkins)          │
├─────────────────────────────────────────────────────┤
│                    Dagger Engine                    │
│             (Container-based execution)             │
├──────────┬──────────────┬───────────────────────────┤
│  Python  │      Go      │        TypeScript         │
│   SDK    │     SDK      │            SDK            │
├──────────┴──────────────┴───────────────────────────┤
│                 Your Pipeline Logic                 │
│     (Build, test, lint, deploy -- as real code)     │
└─────────────────────────────────────────────────────┘
```
The Dagger Engine is a specialized container runtime. When you call a Dagger Function, it spins up containers, mounts your source code, runs your commands, and caches the results. Every operation is containerized, which means your pipeline behaves identically whether you're running it on your laptop or in a CI runner on the other side of the planet.
The Key Concepts
- Dagger Functions -- the fundamental unit of work. Each function takes typed inputs and produces typed outputs. They're just methods in your chosen language.
- Dagger Modules -- collections of related functions, packaged for reuse. Published modules live on the Daggerverse.
- The dag client -- a pre-initialized API client available inside every function, giving you access to core types like Container, Directory, File, and Service.
Installation and First Pipeline
Installing the CLI
macOS (Homebrew):

```shell
brew install dagger/tap/dagger
```

Linux / macOS (curl) -- note the -E flag, which tells sudo to preserve the BIN_DIR environment variable:

```shell
curl -fsSL https://dl.dagger.io/dagger/install.sh | BIN_DIR=/usr/local/bin sudo -E sh
```

Windows (winget):

```shell
winget install dagger
```

Verify the installation:

```shell
dagger version
```
Initializing Your First Module
Navigate to your project root and initialize a Dagger module. Pick whichever language you're most comfortable with:
Python:

```shell
dagger init --sdk=python --name=my-pipeline
```

Go:

```shell
dagger init --sdk=go --name=my-pipeline
```

TypeScript:

```shell
dagger init --sdk=typescript --name=my-pipeline
```
This creates a dagger.json configuration file and scaffolds a dagger/ directory with sample code. The structure varies by language:
| Language | Entry Point | Config Files |
|---|---|---|
| Python | dagger/src/main.py | pyproject.toml, uv.lock |
| Go | dagger/main.go | go.mod, go.sum |
| TypeScript | dagger/src/index.ts | package.json, tsconfig.json |
Running Your First Function
The scaffolded module includes a sample function. Run it immediately:
```shell
dagger call container-echo --string-arg="hello from dagger"
```
That single command spins up a container, executes the function, and returns the result. No YAML. No commit-and-push feedback loop. Just a function call.
Writing Real Pipelines: Python, Go, and TypeScript
Let's build a practical pipeline that lints, tests, and builds a containerized application. I'll show all three languages so you can pick the one that fits your stack.
Python SDK
```python
import dagger
from dagger import dag, function, object_type


@object_type
class MyPipeline:
    @function
    async def test(self, source: dagger.Directory) -> str:
        """Run tests against the project source."""
        return await (
            dag.container()
            .from_("python:3.12-slim")
            .with_directory("/app", source)
            .with_workdir("/app")
            .with_exec(["pip", "install", "-r", "requirements.txt"])
            .with_exec(["pytest", "--tb=short", "-q"])
            .stdout()
        )

    @function
    async def lint(self, source: dagger.Directory) -> str:
        """Run linting on the project source."""
        return await (
            dag.container()
            .from_("python:3.12-slim")
            .with_directory("/app", source)
            .with_workdir("/app")
            .with_exec(["pip", "install", "ruff"])
            .with_exec(["ruff", "check", "."])
            .stdout()
        )

    @function
    async def build(self, source: dagger.Directory) -> dagger.Container:
        """Build a production container image."""
        return (
            dag.container()
            .from_("python:3.12-slim")
            .with_directory("/app", source)
            .with_workdir("/app")
            .with_exec(["pip", "install", "-r", "requirements.txt"])
            .with_entrypoint(["python", "app.py"])
        )
```
Go SDK
```go
package main

import (
	"context"

	"dagger/my-pipeline/internal/dagger"
)

type MyPipeline struct{}

// Test runs the project test suite.
func (m *MyPipeline) Test(ctx context.Context, source *dagger.Directory) (string, error) {
	return dag.Container().
		From("golang:1.23-alpine").
		WithDirectory("/app", source).
		WithWorkdir("/app").
		WithExec([]string{"go", "mod", "download"}).
		WithExec([]string{"go", "test", "./..."}).
		Stdout(ctx)
}

// Lint runs the project linter.
func (m *MyPipeline) Lint(ctx context.Context, source *dagger.Directory) (string, error) {
	return dag.Container().
		From("golangci/golangci-lint:latest").
		WithDirectory("/app", source).
		WithWorkdir("/app").
		WithExec([]string{"golangci-lint", "run"}).
		Stdout(ctx)
}

// Build produces a production container image.
func (m *MyPipeline) Build(source *dagger.Directory) *dagger.Container {
	return dag.Container().
		From("golang:1.23-alpine").
		WithDirectory("/app", source).
		WithExec([]string{"go", "build", "-o", "/app/server", "."}).
		WithEntrypoint([]string{"/app/server"})
}
```
TypeScript SDK
```typescript
import { dag, Container, Directory, object, func } from "@dagger.io/dagger";

@object()
class MyPipeline {
  @func()
  async test(source: Directory): Promise<string> {
    return dag
      .container()
      .from("node:20-slim")
      .withDirectory("/app", source)
      .withWorkdir("/app")
      .withExec(["npm", "ci"])
      .withExec(["npm", "test"])
      .stdout();
  }

  @func()
  async lint(source: Directory): Promise<string> {
    return dag
      .container()
      .from("node:20-slim")
      .withDirectory("/app", source)
      .withWorkdir("/app")
      .withExec(["npm", "ci"])
      .withExec(["npx", "eslint", "."])
      .stdout();
  }

  @func()
  async build(source: Directory): Promise<Container> {
    return dag
      .container()
      .from("node:20-slim")
      .withDirectory("/app", source)
      .withWorkdir("/app")
      .withExec(["npm", "ci", "--production"])
      .withEntrypoint(["node", "dist/index.js"]);
  }
}
```
Running These Pipelines
The magic is that you call these from the CLI the same way regardless of language:
```shell
# Run tests with your local source code
dagger call test --source=.

# Run the linter
dagger call lint --source=.

# Build the container and export it
dagger call build --source=. export --path=./build.tar

# Chain them: test, then build
dagger call test --source=. && dagger call build --source=.
```
Every call runs in containers. Every call works identically on your laptop and in CI. That's the entire point.
Integrating Dagger With Your Existing CI
Dagger doesn't ask you to abandon your CI platform. You keep GitHub Actions (or GitLab CI, or Jenkins, or whatever you use) as the trigger, but delegate the actual pipeline logic to Dagger. This means your CI YAML shrinks to a handful of lines.
GitHub Actions Integration
Here's what a typical GitHub Actions workflow looks like before Dagger:
```yaml
# .github/workflows/ci.yml -- BEFORE (the YAML you're used to)
name: CI
on: [push]
jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.12"
      - run: pip install -r requirements.txt
      - run: pytest
  lint:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.12"
      - run: pip install ruff
      - run: ruff check .
  build:
    runs-on: ubuntu-latest
    needs: [test, lint]
    steps:
      - uses: actions/checkout@v4
      - uses: docker/build-push-action@v5
        with:
          push: true
          tags: myapp:latest
```
And here's the after -- the same workflow with Dagger:
```yaml
# .github/workflows/ci.yml -- AFTER (Dagger handles the logic)
name: CI
on: [push]
jobs:
  pipeline:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Test
        uses: dagger/dagger-for-github@v8
        with:
          version: "latest"
          verb: call
          args: test --source=.
      - name: Lint
        uses: dagger/dagger-for-github@v8
        with:
          version: "latest"
          verb: call
          args: lint --source=.
      - name: Build
        uses: dagger/dagger-for-github@v8
        with:
          version: "latest"
          verb: call
          args: build --source=.
```
The CI YAML becomes a thin wrapper. All the actual logic lives in your Dagger module, written in a real programming language, testable on your local machine.
GitLab CI Integration
```yaml
# .gitlab-ci.yml
stages:
  - ci

dagger:
  stage: ci
  image: docker:latest
  services:
    - docker:dind
  before_script:
    - apk add --no-cache curl
    - curl -fsSL https://dl.dagger.io/dagger/install.sh | BIN_DIR=/usr/local/bin sh
  script:
    - dagger call test --source=.
    - dagger call lint --source=.
    - dagger call build --source=.
```
Jenkins Integration
```groovy
// Jenkinsfile
pipeline {
    agent any
    stages {
        stage('Dagger Pipeline') {
            steps {
                sh 'curl -fsSL https://dl.dagger.io/dagger/install.sh | BIN_DIR=/usr/local/bin sh'
                sh 'dagger call test --source=.'
                sh 'dagger call lint --source=.'
                sh 'dagger call build --source=.'
            }
        }
    }
}
```
Notice how the Jenkins, GitLab, and GitHub Actions configs all call the same dagger call commands. Your pipeline logic is defined once, in code, and works everywhere. That's vendor lock-in, eliminated.
Caching, Performance, and the Daggerverse
How Dagger Caching Works
Dagger's performance story is its best feature after portability. The engine uses two caching mechanisms:
Layer Cache (automatic): Every container operation produces layers, just like Docker. Dagger caches these layers based on a content-addressed key that includes the operation, its arguments, and the input state. If you run pip install -r requirements.txt and the requirements file hasn't changed, Dagger skips the install entirely on subsequent runs.
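To make the content-addressed idea concrete, here's a toy sketch in plain Python -- an illustration of the concept only, not Dagger's actual key format or implementation (the function name cache_key and the digest strings are invented for this example):

```python
import hashlib
import json


def cache_key(op: str, args: list[str], input_digest: str) -> str:
    """Toy content-addressed cache key: hash of the operation, its
    arguments, and a digest of the input state (e.g. mounted files)."""
    payload = json.dumps({"op": op, "args": args, "input": input_digest})
    return hashlib.sha256(payload.encode()).hexdigest()


# Same operation + same inputs -> same key -> the cached layer is reused
k1 = cache_key("exec", ["pip", "install", "-r", "requirements.txt"], "sha256:aaa")
k2 = cache_key("exec", ["pip", "install", "-r", "requirements.txt"], "sha256:aaa")

# requirements.txt edited -> different input digest -> cache miss, step reruns
k3 = cache_key("exec", ["pip", "install", "-r", "requirements.txt"], "sha256:bbb")
```

The practical consequence: anything upstream of a step (base image, mounted directories, prior commands) is part of that step's identity, so ordering your pipeline so that rarely-changing inputs come first maximizes cache hits.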
Cache Volumes (explicit): For things like dependency caches (node_modules, .cache/pip, Go module cache), you can mount persistent cache volumes:
```python
@function
async def test(self, source: dagger.Directory) -> str:
    pip_cache = dag.cache_volume("pip-cache")
    return await (
        dag.container()
        .from_("python:3.12-slim")
        .with_mounted_cache("/root/.cache/pip", pip_cache)
        .with_directory("/app", source)
        .with_workdir("/app")
        .with_exec(["pip", "install", "-r", "requirements.txt"])
        .with_exec(["pytest", "-q"])
        .stdout()
    )
```
The result? Teams report 2x to 10x faster pipelines after switching from YAML-based CI to Dagger. OpenMeter published a case study showing a 5x speedup on their CI pipeline after Daggerizing it. Most of those gains come from intelligent caching and automatic parallelization of independent operations.
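The parallelization half of that story follows from Dagger Functions being ordinary async code. Here's the fan-out pattern in plain asyncio, with stub coroutines standing in for the containerized lint and test steps (the lint/test/ci names here are illustrative stand-ins, not SDK APIs):

```python
import asyncio


async def lint() -> str:
    # Stand-in for the containerized lint step
    await asyncio.sleep(0.05)
    return "lint ok"


async def test() -> str:
    # Stand-in for the containerized test step
    await asyncio.sleep(0.05)
    return "tests ok"


async def ci() -> list[str]:
    # Independent steps run concurrently instead of back-to-back;
    # Dagger applies the same idea automatically to independent
    # branches of the pipeline DAG.
    return list(await asyncio.gather(lint(), test()))


results = asyncio.run(ci())
```

Inside a real Dagger module you'd await several container pipelines with asyncio.gather the same way, and the engine schedules them in parallel.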
The Daggerverse: Community Modules
The Daggerverse (daggerverse.dev) hosts nearly 1,500 community-contributed modules. Instead of writing everything from scratch, you can pull in pre-built modules:
```shell
# Use a community Golang module
dagger call -m github.com/kpenfound/dagger-modules/golang test --source=.

# Use a community Docker module for multi-platform builds
dagger call -m github.com/purpleclay/daggerverse/docker build --source=.
```
You can also declare dependencies in your dagger.json:
```json
{
  "name": "my-pipeline",
  "sdk": "python",
  "dependencies": [
    {
      "name": "golang",
      "source": "github.com/kpenfound/dagger-modules/golang"
    }
  ]
}
```
These modules are language-agnostic. A Go module can be consumed by a Python pipeline. Dagger handles the cross-language function calls through its API layer.
Dagger vs. the Alternatives
Here's how Dagger stacks up against the CI/CD tools you're probably already using:
| Feature | Dagger | GitHub Actions | Jenkins | GitLab CI |
|---|---|---|---|---|
| Pipeline language | Python, Go, TS (+ 5 more) | YAML | Groovy/YAML | YAML |
| Run locally | ✅ Identical to CI | ❌ Requires act (limited) | ⚠️ Possible but painful | ❌ No native support |
| Vendor lock-in | ✅ None -- runs anywhere | ❌ GitHub-only | ⚠️ Self-hosted but complex | ❌ GitLab-only |
| Built-in caching | ✅ Automatic + explicit | ⚠️ Manual cache actions | ❌ Plugin-dependent | ⚠️ Manual configuration |
| Debugging experience | ✅ IDE + local breakpoints | ❌ Commit-and-push | ⚠️ Replay with limitations | ❌ Commit-and-push |
| Learning curve | ⚠️ New concepts to learn | ✅ Low (YAML is simple) | ❌ Steep (Groovy, plugins) | ✅ Low (YAML is simple) |
| Community ecosystem | ⚠️ ~1,500 modules (growing) | ✅ 20,000+ marketplace actions | ✅ 1,800+ plugins | ⚠️ Smaller catalog |
| Cost | ✅ Free (engine is OSS) | ⚠️ Free tier + paid minutes | ✅ Free (self-hosted cost) | ⚠️ Free tier + paid minutes |
When to Choose Dagger
Use Dagger when:
- You're tired of debugging YAML by pushing commits to CI
- Your team uses multiple CI providers and wants portable pipelines
- Pipeline logic is complex enough to benefit from real programming constructs (loops, conditionals, error handling, types)
- You want to test pipeline changes locally before they hit CI
- You need consistent behavior between local development and CI
Stick with pure YAML when:
- Your pipelines are simple (checkout, install, test, deploy) and rarely change
- Your entire team is comfortable with the existing YAML configuration
- You're on a single CI platform with no plans to migrate
Dagger Cloud Pricing
The Dagger engine is fully open source. Dagger Cloud adds observability, trace visualization, and team features:
| Plan | Price | What You Get |
|---|---|---|
| Individual | Free | Pipeline observability, trace explorer |
| Team | $50/month (up to 10 users) | Shared observability, module sharing |
| Enterprise | Custom | Dedicated support, SLAs, SSO |
You don't need Dagger Cloud to use Dagger. The open-source engine handles everything discussed in this article. Cloud is a nice-to-have for teams that want centralized pipeline monitoring.
Troubleshooting Common Issues
"Cannot connect to the Docker daemon"
Dagger needs a container runtime. Make sure Docker Desktop, OrbStack, or Podman is running. On Linux, check that your user is in the docker group: sudo usermod -aG docker $USER, then log out and back in.
"Module not found" when calling functions
Make sure you're in the directory containing dagger.json, or pass --mod to specify the module path. Also verify your module was initialized correctly with dagger init.
"Slow first run, fast subsequent runs"
This is expected. The first run pulls base images and installs dependencies. Subsequent runs hit the layer cache and complete much faster. If you want to warm the cache in CI, run dagger call once in a setup step.
"Type errors in my Dagger Functions"
Dagger's type system is strict. Make sure your function parameters use the correct Dagger types (dagger.Directory, dagger.Container, etc.) instead of raw strings. The SDK generates these types -- check the generated code in your module's SDK directory.
"Cache not persisting between CI runs"
By default, Dagger's cache lives on the runner's local disk. For ephemeral CI runners (like GitHub Actions), the cache is lost after each job. Use Dagger Cloud or mount an external cache volume to persist across runs. Alternatively, export and import cache artifacts as CI cache entries.
What's Next
- Start with dagger init in an existing project. Pick whichever SDK matches your stack and convert one pipeline job to a Dagger Function. You'll feel the difference immediately.
- Browse the Daggerverse at daggerverse.dev for pre-built modules. Don't reinvent the wheel for common tasks like Docker builds, Helm deployments, or language-specific test runners.
- Integrate incrementally. Keep your existing CI platform. Replace one YAML job at a time with a dagger call. There's no need for a big-bang migration.
- Combine with AI-powered CI. Dagger's programmable pipelines pair well with AI-driven automation. Check out our guide on GitHub Actions + AI for ideas on intelligent pipeline triggers and auto-remediation.
- Read the official docs at docs.dagger.io for advanced topics: custom types, services, secrets management, and multi-platform builds.
For a broader look at how AI is changing the developer toolkit, read AI Coding Agents Compared and The Rise of the AI Engineer.