Turborepo remote cache server that just works

A tiny web server written in Rust, providing distributed remote caching for Turborepo. Deploy as a GitHub Action or Docker container with S3-compatible storage.

Tested with popular storage providers:

Amazon S3
Cloudflare R2
MinIO

Folks from the following companies are using it:

N26
Cursor
BBC
LEGO
Amplitude

Why choose Turbo Cache Server?

Built with performance and simplicity in mind. Never rebuild the same artifacts twice.

Built with Rust

Tiny, fast web server written in Rust for maximum performance and reliability

S3-Compatible Storage

Works with Amazon S3, Cloudflare R2, MinIO, and any S3-compatible storage provider

API-Compliant

Fully compatible with Turborepo's remote caching API, making it a drop-in replacement
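
Because it implements the same API, any Turborepo client can be pointed at the server through the standard remote-cache environment variables. A minimal sketch, assuming a Turbo Cache Server instance is already running locally on port 8585; the team and token values are placeholders that mirror the GitHub Action example below:

# Point the turbo CLI at your own cache server
export TURBO_API="http://127.0.0.1:8585"
export TURBO_TEAM="my-team"
export TURBO_TOKEN="turbo-token"
turbo run build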

Two ways to deploy

Choose the deployment method that fits your workflow. Both options provide the same performance and features.

GitHub Action

Perfect for GitHub Actions workflows. Starts automatically in the background during your CI runs.

Key benefits:

  • Zero configuration required
  • Automatic startup and teardown
  • Integrates seamlessly with existing workflows

Docker Container

Universal deployment option. Works with GitLab CI, Jenkins, or any CI system that supports Docker.

Key benefits:

  • Works with any CI provider
  • Self-hosted deployment options
  • Full control over infrastructure

Quick setup examples

Get started in minutes with these copy-paste examples for your preferred deployment method.

GitHub Action Setup

env:
  TURBO_API: "http://127.0.0.1:8585"
  TURBO_TEAM: "NAME_OF_YOUR_REPO_HERE"
  TURBO_TOKEN: "turbo-token"

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4

      - name: Turborepo Cache Server
        uses: brunojppb/turbo-cache-server@1.0.24
        env:
          PORT: "8585"
          S3_BUCKET_NAME: your-bucket-name-here
          S3_REGION: "eu-central-1"
          S3_ACCESS_KEY: ${{ secrets.S3_ACCESS_KEY }}
          S3_SECRET_KEY: ${{ secrets.S3_SECRET_KEY }}

          # Optional: If not using AWS, provide an endpoint like "https://minio" for your instance.
          S3_ENDPOINT: ${{ secrets.S3_ENDPOINT }}
          # Optional: If your S3-compatible store does not support
          # virtual-hosted requests like https://bucket.hostname.domain/,
          # set "S3_USE_PATH_STYLE" to "true" so the S3 client makes
          # path-style requests like https://hostname.domain/bucket instead.
          # Defaults to "false"
          S3_USE_PATH_STYLE: "false"
          # Max payload size for each cache object sent by Turborepo
          # Defaults to 100 MB
          # Requests larger than that will get "HTTP 413: Entity Too Large" errors
          MAX_PAYLOAD_SIZE_IN_MB: "100"

      - name: Run tasks
        run: turbo run test build typecheck

Docker Setup

Works with GitLab CI, Jenkins, etc.

docker run \
  -e S3_ACCESS_KEY=KEY \
  -e S3_SECRET_KEY=SECRET \
  -e S3_BUCKET_NAME=my_cache_bucket \
  -e S3_ENDPOINT=https://s3_endpoint_here \
  -e S3_REGION=eu \
  -p "8000:8000" \
  ghcr.io/brunojppb/turbo-cache-server
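
For CI systems other than GitHub Actions, the container can run alongside the job itself. Below is a minimal GitLab CI sketch, assuming a runner with Docker services support (service-level variables require a reasonably recent GitLab version) and the default port 8000 from the command above; the bucket, region, endpoint, team, and token values are placeholders to adapt to your setup.

build:
  image: node:20
  services:
    # Run the cache server as a job service, reachable via the "turbo-cache" alias
    - name: ghcr.io/brunojppb/turbo-cache-server
      alias: turbo-cache
      variables:
        S3_ACCESS_KEY: $S3_ACCESS_KEY
        S3_SECRET_KEY: $S3_SECRET_KEY
        S3_BUCKET_NAME: my_cache_bucket
        S3_REGION: eu
        S3_ENDPOINT: https://s3_endpoint_here
  variables:
    TURBO_API: "http://turbo-cache:8000"
    TURBO_TEAM: "my-team"
    TURBO_TOKEN: "turbo-token"
  script:
    - turbo run test build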