Below is the complete list of sections. Detailed expansion follows after.
- What Is a Monorepo?
- What Is Turborepo?
- History & Evolution (Jared Palmer → Vercel Acquisition → Rust Rewrite)
- Problems Turborepo Solves
- Turborepo vs. Other Monorepo Tools (Nx, Lerna, Rush, Bazel, Moon)
- Workspaces (The Building Block)
- The Dependency Graph (Topological Ordering)
- Tasks & the Task Pipeline (`turbo.json`)
- Hashing & Content-Aware Caching
- Local Caching (`.turbo` directory)
- Remote Caching (Vercel Remote Cache & Custom)
- Incremental Builds
- Recommended Directory Structure
- Package Manager Support (npm, yarn, pnpm, bun)
- Initializing a Turborepo (`create-turbo` & manual)
- Root `package.json` Configuration
- Root `turbo.json` Configuration
- Workspace `package.json` Conventions
- Workspace-Level `turbo.json` Overrides
- Internal Packages vs. Applications
- `turbo.json` Schema — Full Reference
- `pipeline`/`tasks` Configuration (v1 vs. v2 syntax)
- `dependsOn` — Task Dependencies & Topological Specifiers (`^`)
- `outputs` — Declaring Cacheable Artifacts
- `inputs` — Narrowing Hash Inputs
- `cache` — Enabling/Disabling Per Task
- `persistent` — Long-Running Tasks (Dev Servers)
- `env` & `globalEnv` — Environment Variable Handling
- `passThroughEnv` & `globalPassThroughEnv`
- `outputLogs` — Log Verbosity Control
- `interactive` — Interactive Task Configuration
- `turbo run <task>` — Running Tasks
- Filtering (`--filter`) — Scope, Directory, Git Diff
- `--dry-run` — Inspecting What Would Execute
- `--graph` — Visualizing the Task Graph
- `--concurrency` — Controlling Parallelism
- `--continue` — Running Past Failures
- `--force` — Bypassing Cache
- `--output-logs` — Controlling Output
- `turbo prune` — Sparse Monorepo Deploys
- `turbo gen` — Code Generation (Generators)
- `turbo login` / `turbo link` — Remote Cache Auth
- `turbo ls` & `turbo inspect`
- How Hashing Works (Files, Env Vars, Dependencies, Lockfile)
- Cache Hit vs. Cache Miss — Lifecycle
- Artifact Caching (File outputs, Logs)
- Local Cache Configuration & Garbage Collection
- Remote Caching Architecture
- Self-Hosted Remote Cache (Ducktail, turborepo-remote-cache, etc.)
- Cache Signing & Security
- Debugging Cache Misses (`--verbosity`, `--dry-run=json`)
- Internal Dependencies (`workspace:*` Protocol)
- External Dependency Hoisting & Isolation
- Shared Configuration Packages (ESLint, TypeScript, Prettier)
- Versioning Strategies Inside the Monorepo
- Managing Conflicting Dependency Versions
- Shared `tsconfig` Packages
- Project References vs. Workspace-Level Compilation
- Composite Builds & `declaration` Outputs
- Path Aliases & Module Resolution
- Type-Checking as a Turbo Task
- Build Task Orchestration
- Linting with Shared ESLint Configs
- Unit Testing (Jest / Vitest) in a Monorepo
- End-to-End Testing Strategies
- Code Formatting (Prettier) as a Task
- Running Dev Servers in Parallel (`persistent` tasks)
- Watch Mode & Rebuilds
- Hot Module Replacement Across Packages
- Using `turbo` with Docker
- `turbo prune` for Optimized Docker Layers
- CI/CD Integration (GitHub Actions, GitLab CI, CircleCI, etc.)
- Caching in CI — Remote Cache Best Practices
- Branch-Based & PR-Based Cache Scoping
- Polyglot Monorepos (Non-JS Packages)
- Publishing Packages from a Turborepo (Changesets)
- Feature Flags & Environment-Specific Builds
- Micro-Frontends with Turborepo
- Monorepo with Multiple Frameworks (Next.js, Remix, Vite, etc.)
- Dynamic Workspace Inclusion/Exclusion
- Custom Turbo Daemon & Background Processes
- Codemods & Migration Automation (`@turbo/codemod`)
- Benchmarks & Real-World Performance Gains
- Multitasking — How Turbo Parallelizes
- The Turbo Daemon (Persistent File-Watching Process)
- Scaling to Hundreds of Packages
- Profiling & Debugging Slow Pipelines
- Vercel Deployment Integration
- Next.js Monorepo Patterns
- Storybook in a Turborepo
- Prisma / Database Packages
- Shared UI Component Libraries (Design Systems)
- API & Full-Stack App Patterns
- "Cache Miss" When You Expect a Hit
- Phantom Dependencies
- Circular Dependencies
- Platform/OS-Specific Cache Invalidation
- `node_modules` Resolution Issues
- Out-of-Memory & Large Monorepo Issues
- Migrating from Lerna
- Migrating from Nx
- Migrating from Polyrepo to Monorepo
- Incremental Adoption (Adding Turbo to Existing Monorepo)
- Turbo v1 → v2 Migration
- Naming Conventions
- Ownership & CODEOWNERS
- Monorepo Boundary Enforcement
- Documentation Strategies
- When NOT to Use a Monorepo / Turborepo
A monorepo (monolithic repository) is a single version-controlled repository containing multiple distinct projects, applications, and/or libraries. Unlike a "monolith" (one tightly coupled application), a monorepo maintains separate, independently deployable/publishable units that happen to live together. Key properties include:
- Unified version control — one repo, one commit history.
- Shared tooling — one linting config, one CI pipeline, one set of standards.
- Atomic commits — changes spanning multiple packages land in a single commit.
- Code sharing — trivial to share utilities, types, and components.
- Dependency deduplication — shared `node_modules`, reduced install size.
Notable companies using monorepos: Google (billions of lines), Meta, Microsoft, Vercel, Shopify.
Turborepo is a high-performance build system and task orchestrator for JavaScript/TypeScript monorepos. It is NOT a package manager, bundler, or framework — it is a task runner that sits on top of your existing toolchain and makes it dramatically faster.
Core value proposition:
- Never do the same work twice — content-aware hashing + local/remote caching.
- Maximum parallelism — understands the dependency graph and runs independent tasks concurrently.
- Zero-config convention — works with your existing `package.json` scripts.
- Incremental by design — only rebuilds what changed.
Turborepo is written in Rust (originally Go, rewritten starting in 2023) for maximum performance and ships as a single turbo binary.
| Year | Event |
|---|---|
| 2021 | Jared Palmer creates Turborepo as an open-source project. |
| Dec 2021 | Vercel acquires Turborepo. Jared joins Vercel. |
| 2022 | Rapid adoption. Remote caching integrated into Vercel platform. |
| 2023 | Rust rewrite begins (replacing Go internals). Turbo Daemon introduced. |
| 2024 | Turborepo v2 released. Pipeline config renamed to tasks. Major DX improvements. |
| 2025 | Continued Rust migration, performance improvements, enhanced filtering. |
| Problem | How Turbo Solves It |
|---|---|
| Slow builds across many packages | Caching + parallelism |
| Redundant CI work | Remote cache shares artifacts across machines/branches |
| Complex task orchestration | Declarative dependsOn with topological awareness |
| "Works on my machine" inconsistencies | Deterministic hashing includes env vars, lockfile, etc. |
| Deploying only what changed | --filter + turbo prune |
| Dev server coordination | persistent tasks + parallel execution |
| Feature | Turborepo | Nx | Lerna (v6+) | Rush | Bazel |
|---|---|---|---|---|---|
| Language | Rust | TypeScript | TypeScript | TypeScript | Java/Starlark |
| Learning curve | Low | Medium–High | Low | High | Very High |
| Caching | ✅ Local + Remote | ✅ Local + Nx Cloud | ✅ (via Nx) | ✅ | ✅ |
| Task orchestration | ✅ | ✅ | ✅ | ✅ | ✅ |
| Code generation | Basic (`turbo gen`) | Extensive | ❌ | ❌ | ❌ |
| Plugins | ❌ Minimal | ✅ Rich ecosystem | ❌ | ❌ | ✅ |
| Framework-agnostic | ✅ | Mostly (Angular roots) | ✅ | ✅ | ✅ |
| Config complexity | Very Low (`turbo.json`) | Medium (`project.json`) | Low | High | Very High |
| Zero-config | ✅ | Partial | ✅ | ❌ | ❌ |
| Polyglot | Limited | Limited | ❌ | ❌ | ✅ Full |
| Computation caching | ✅ | ✅ | ✅ | ✅ | ✅ |
| Distributed execution | ❌ | ✅ (Nx Agents) | ❌ | ❌ | ✅ |
Summary: Turborepo wins on simplicity and speed of adoption. Nx wins on feature richness and enterprise tooling. Bazel wins on polyglot and scale. Turborepo is the best choice when you want powerful caching/orchestration with minimal configuration overhead in a JS/TS ecosystem.
Workspaces are the atomic units of a monorepo. Each workspace is a directory with its own package.json. They can be:
- Applications (`apps/web`, `apps/api`) — deployable artifacts.
- Packages (`packages/ui`, `packages/utils`) — shared libraries consumed by apps or other packages.
Workspaces are declared via the package manager:
```json
// Root package.json (npm/yarn)
{
  "workspaces": ["apps/*", "packages/*"]
}
```

```yaml
# pnpm-workspace.yaml
packages:
  - "apps/*"
  - "packages/*"
```

Turborepo auto-discovers all workspaces. It does not maintain its own workspace registry.
Turborepo constructs a DAG (Directed Acyclic Graph) from workspace-level dependencies and devDependencies:
```
apps/web → packages/ui → packages/utils
apps/api → packages/utils
```
This graph determines:
- Build order — `packages/utils` must build before `packages/ui`, which must build before `apps/web`.
- Parallelism opportunities — `packages/utils` and unrelated packages can build simultaneously.
- Cache invalidation scope — changes to `packages/utils` invalidate the cache for all dependents.
Turbo performs topological sorting to guarantee correct ordering.
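The ordering described above can be sketched with Kahn's algorithm. This is an illustrative model of topological sorting, not Turborepo's actual implementation; the workspace names come from the example graph.

```typescript
// Illustrative sketch: topologically order workspaces so every
// dependency builds before its dependents (Kahn's algorithm).
type Graph = Record<string, string[]>; // workspace -> its dependencies

function topoOrder(graph: Graph): string[] {
  const indegree = new Map<string, number>();
  const dependents = new Map<string, string[]>();
  for (const node of Object.keys(graph)) {
    indegree.set(node, graph[node].length);
    for (const dep of graph[node]) {
      dependents.set(dep, [...(dependents.get(dep) ?? []), node]);
    }
  }
  // Start with workspaces that depend on nothing
  const queue = [...indegree].filter(([, d]) => d === 0).map(([n]) => n);
  const order: string[] = [];
  while (queue.length) {
    const node = queue.shift()!;
    order.push(node);
    for (const dependent of dependents.get(node) ?? []) {
      const d = indegree.get(dependent)! - 1;
      indegree.set(dependent, d);
      if (d === 0) queue.push(dependent);
    }
  }
  if (order.length !== Object.keys(graph).length) {
    throw new Error("Cycle detected - not a DAG");
  }
  return order;
}

const order = topoOrder({
  "apps/web": ["packages/ui"],
  "apps/api": ["packages/utils"],
  "packages/ui": ["packages/utils"],
  "packages/utils": [],
});
// packages/utils sorts before packages/ui, which sorts before apps/web
```

A cycle (two workspaces depending on each other) makes a valid ordering impossible, which is why Turborepo requires the graph to be acyclic.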
A "task" in Turborepo is a script defined in a workspace's package.json:
```json
{
  "scripts": {
    "build": "tsc",
    "test": "vitest",
    "lint": "eslint ."
  }
}
```

The task pipeline (defined in `turbo.json`) tells Turbo how tasks relate:
```json
{
  "$schema": "https://turbo.build/schema.json",
  "tasks": {
    "build": {
      "dependsOn": ["^build"],
      "outputs": ["dist/**"]
    },
    "test": {
      "dependsOn": ["build"]
    },
    "lint": {},
    "dev": {
      "persistent": true,
      "cache": false
    }
  }
}
```

Key insight: Tasks are NOT workspaces. When you run `turbo run build`, Turbo runs the `build` script in every workspace that has one, respecting the dependency graph and pipeline configuration.
For every task execution, Turbo computes a deterministic hash based on:
- Source files in the workspace (contents, not timestamps)
- Internal dependency hashes (transitive)
- Lockfile fragments (resolved external dependency versions)
- Environment variables declared in `env`/`globalEnv`
- Task configuration in `turbo.json`
- Arguments passed to the task
- Global files (if configured via `globalDependencies`)
If the hash matches a previous run → CACHE HIT → outputs are restored instantly. If not → CACHE MISS → task executes normally.
This is content-aware, not timestamp-based. Reverting a file to a previous state will produce a cache hit from that previous run.
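A minimal sketch of what content-aware hashing means in practice (illustrative only; this is not Turborepo's real algorithm or field set): hash sorted file paths and contents plus declared env values, so identical inputs always yield identical hashes regardless of timestamps.

```typescript
// Sketch of a deterministic, content-based task hash.
import { createHash } from "node:crypto";

function taskHash(
  files: Record<string, string>,           // path -> file contents
  env: Record<string, string | undefined>, // declared env vars only
  depHashes: string[]                      // hashes of internal dependencies
): string {
  const h = createHash("sha256");
  // Sort keys so iteration order never changes the result
  for (const path of Object.keys(files).sort()) {
    h.update(path).update("\0").update(files[path]).update("\0");
  }
  for (const key of Object.keys(env).sort()) {
    h.update(key).update("=").update(env[key] ?? "").update("\0");
  }
  for (const dep of [...depHashes].sort()) h.update(dep);
  return h.digest("hex");
}

const a = taskHash({ "src/index.ts": "export const x = 1;" }, { NODE_ENV: "production" }, []);
const b = taskHash({ "src/index.ts": "export const x = 1;" }, { NODE_ENV: "production" }, []);
const c = taskHash({ "src/index.ts": "export const x = 2;" }, { NODE_ENV: "production" }, []);
// a === b (identical inputs); a !== c (file content changed)
```

Because only content feeds the hash, reverting a file restores the old hash, which is exactly why reverts produce cache hits.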
By default, cached artifacts are stored in `node_modules/.cache/turbo` (Turborepo v1) or `.turbo/cache` (v2). For every cached task, Turbo stores:
- Output files (as declared in `outputs`)
- Terminal logs (stdout/stderr)
- Hash metadata
On a cache hit, outputs are replayed from the local cache and logs are replayed to the terminal (with a FULL TURBO indicator).
Remote caching allows sharing the cache across machines: CI runners, teammates' laptops, and different branches.
Vercel Remote Cache (built-in):
```bash
turbo login      # Authenticate with Vercel
turbo link       # Link repo to a Vercel project
turbo run build  # Now caches are synced to the cloud
```

How it works:
- Before executing a task, Turbo checks the remote cache for the hash.
- If found → downloads artifacts → CACHE HIT (no execution).
- If not found → executes → uploads artifacts to remote cache.
This means a CI build on main can populate the cache, and a developer pulling the branch gets instant builds.
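The check, execute, and upload flow above can be sketched as follows. The transport is abstracted into an injectable store so the logic is visible; in the real system those calls are HTTP requests against the remote cache API. Everything here is illustrative.

```typescript
// Sketch of the remote-cache lookup flow: hit -> restore, miss -> run + upload.
type Store = {
  get(hash: string): Promise<Buffer | null>;
  put(hash: string, artifact: Buffer): Promise<void>;
};

async function runWithRemoteCache(
  hash: string,
  execute: () => Promise<Buffer>, // runs the task, returns its artifact
  remote: Store
): Promise<{ cacheHit: boolean; artifact: Buffer }> {
  const cached = await remote.get(hash);
  if (cached) return { cacheHit: true, artifact: cached }; // no execution
  const artifact = await execute();  // cache miss: actually run the task
  await remote.put(hash, artifact);  // share the result with other machines
  return { cacheHit: false, artifact };
}

// In-memory stand-in for the remote cache server
const blobs = new Map<string, Buffer>();
const remote: Store = {
  async get(h) { return blobs.get(h) ?? null; },
  async put(h, a) { blobs.set(h, a); },
};

let executions = 0;
const build = async () => { executions++; return Buffer.from("dist"); };

const first = await runWithRemoteCache("abc123", build, remote);
const second = await runWithRemoteCache("abc123", build, remote);
// first executes and uploads; second is served from the cache
```

This is why a CI run can "pre-warm" every developer's build: the first machine pays the execution cost, all later ones only download.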
Turborepo provides incrementality at two levels:
- Task-level — only re-runs tasks whose inputs changed.
- Workspace-level — only affected workspaces are processed when using `--filter`.
Combined with caching, this means:
- First run: all tasks execute (cold cache).
- Subsequent runs: only changed tasks execute; everything else is restored from cache.
- In CI: if remote cache is warm, even first runs on a new machine are instant.
```
my-monorepo/
├── apps/
│   ├── web/                    # Next.js frontend
│   │   ├── src/
│   │   ├── package.json
│   │   └── tsconfig.json
│   ├── api/                    # Express/Fastify backend
│   │   ├── src/
│   │   ├── package.json
│   │   └── tsconfig.json
│   └── mobile/                 # React Native app
│       ├── src/
│       └── package.json
├── packages/
│   ├── ui/                     # Shared React component library
│   │   ├── src/
│   │   ├── package.json
│   │   └── tsconfig.json
│   ├── utils/                  # Shared utility functions
│   │   ├── src/
│   │   ├── package.json
│   │   └── tsconfig.json
│   ├── database/               # Prisma schema + client
│   │   ├── prisma/
│   │   ├── package.json
│   │   └── tsconfig.json
│   ├── eslint-config/          # Shared ESLint configuration
│   │   ├── index.js
│   │   └── package.json
│   └── typescript-config/      # Shared tsconfig files
│       ├── base.json
│       ├── nextjs.json
│       ├── react-library.json
│       └── package.json
├── turbo.json
├── package.json
├── pnpm-workspace.yaml         # (if using pnpm)
├── .gitignore
├── .npmrc
└── README.md
```
Turborepo supports all major Node package managers:
| Manager | Workspace Declaration | Lockfile | Notes |
|---|---|---|---|
| npm (v7+) | `package.json` → `workspaces` | `package-lock.json` | Native workspaces |
| Yarn (v1 Classic) | `package.json` → `workspaces` | `yarn.lock` | Widely used |
| Yarn (v2+ Berry) | `package.json` → `workspaces` | `yarn.lock` | PnP support varies |
| pnpm | `pnpm-workspace.yaml` | `pnpm-lock.yaml` | Recommended — strictest, fastest, most disk-efficient |
| Bun | `package.json` → `workspaces` | `bun.lockb` | Emerging support |
Turborepo auto-detects the package manager from the lockfile. You can also specify it via "packageManager" in root package.json:
```json
{
  "packageManager": "pnpm@9.1.0"
}
```

Option A: Scaffold with `create-turbo`
```bash
npx create-turbo@latest my-monorepo
# or
pnpm dlx create-turbo@latest my-monorepo
```

This scaffolds a fully working monorepo with example apps and packages.
Option B: Add to existing monorepo
```bash
# Install turbo
pnpm add -D turbo -w   # -w = root workspace

# Create turbo.json
cat > turbo.json << 'EOF'
{
  "$schema": "https://turbo.build/schema.json",
  "tasks": {
    "build": {
      "dependsOn": ["^build"],
      "outputs": ["dist/**", ".next/**", "!.next/cache/**"]
    },
    "test": {
      "dependsOn": ["build"]
    },
    "lint": {},
    "dev": {
      "cache": false,
      "persistent": true
    }
  }
}
EOF
```

Option C: Global install
```bash
npm install -g turbo
# Now `turbo` is available everywhere
```

```json
{
  "name": "my-monorepo",
  "private": true,
  "packageManager": "pnpm@9.1.0",
  "workspaces": ["apps/*", "packages/*"],
  "scripts": {
    "build": "turbo run build",
    "dev": "turbo run dev",
    "lint": "turbo run lint",
    "test": "turbo run test",
    "format": "prettier --write \"**/*.{ts,tsx,md}\""
  },
  "devDependencies": {
    "turbo": "^2.0.0",
    "prettier": "^3.0.0"
  }
}
```

Key points:
- Root should be `"private": true` (never published).
- Root scripts delegate to `turbo run`.
- Shared dev tools (prettier, turbo) live in root `devDependencies`.
This is the heart of Turborepo. Detailed breakdown in Part IV.
```json
{
  "name": "@myorg/ui",
  "version": "0.0.0",
  "private": true,
  "main": "./dist/index.js",
  "types": "./dist/index.d.ts",
  "exports": {
    ".": {
      "types": "./dist/index.d.ts",
      "import": "./dist/index.mjs",
      "require": "./dist/index.js"
    }
  },
  "scripts": {
    "build": "tsup src/index.ts --format cjs,esm --dts",
    "dev": "tsup src/index.ts --format cjs,esm --dts --watch",
    "lint": "eslint src/",
    "test": "vitest run"
  },
  "dependencies": {
    "react": "^18.0.0"
  },
  "devDependencies": {
    "@myorg/eslint-config": "workspace:*",
    "@myorg/typescript-config": "workspace:*",
    "tsup": "^8.0.0",
    "typescript": "^5.0.0"
  }
}
```

The `workspace:*` protocol tells the package manager to resolve this dependency from the local monorepo, not npm.
Individual workspaces can override the root turbo.json for their specific needs:
```json
// apps/web/turbo.json
{
  "$schema": "https://turbo.build/schema.json",
  "extends": ["//"],
  "tasks": {
    "build": {
      "outputs": [".next/**", "!.next/cache/**"],
      "env": ["NEXT_PUBLIC_API_URL", "DATABASE_URL"]
    }
  }
}
```

- `"extends": ["//"]` means "inherit from the root `turbo.json`".
- Only override what's different for this workspace.
| Aspect | Internal Package | Application |
|---|---|---|
| Purpose | Shared library | Deployable artifact |
| Published to npm? | Usually no (`"private": true`) | No |
| Consumed by | Other packages + apps | End users |
| `main`/`exports` | Yes (entry points) | Usually no |
| Build output | `dist/` | `.next/`, `build/`, etc. |
| Examples | `@myorg/ui`, `@myorg/utils` | `apps/web`, `apps/api` |
"Just-in-time" Packages (an alternative pattern): Skip building internal packages entirely — let the consuming app's bundler transpile them:
```json
// packages/ui/package.json
{
  "name": "@myorg/ui",
  "main": "./src/index.ts",   // Point directly to source!
  "types": "./src/index.ts"
}
```

Then configure Next.js (or your bundler) to transpile the package:
```js
// apps/web/next.config.js
module.exports = {
  transpilePackages: ["@myorg/ui"]
};
```

This eliminates the need to build internal packages at all, dramatically simplifying the pipeline.
```json
{
  "$schema": "https://turbo.build/schema.json",

  // Global configuration
  "globalDependencies": ["tsconfig.json", ".env"],
  "globalEnv": ["CI", "NODE_ENV"],
  "globalPassThroughEnv": ["GITHUB_TOKEN"],

  // UI mode
  "ui": "tui", // "tui" | "stream"

  // Daemon
  "daemon": true,

  // Remote cache configuration
  "remoteCache": {
    "enabled": true,
    "signature": false,
    "preflight": false,
    "timeout": 60
  },

  // Task definitions
  "tasks": {
    "build": { /* ... */ },
    "test": { /* ... */ },
    "lint": { /* ... */ },
    "dev": { /* ... */ }
  }
}
```

v1 syntax (deprecated):
```json
{ "pipeline": { "build": { ... } } }
```

v2 syntax (current):

```json
{ "tasks": { "build": { ... } } }
```

Migrate with: `npx @turbo/codemod migrate`
This is the most important configuration field. It defines what must complete before a task can start.
```json
{
  "tasks": {
    "build": {
      "dependsOn": ["^build"]
    },
    "test": {
      "dependsOn": ["build"]
    },
    "deploy": {
      "dependsOn": ["build", "test", "lint"]
    }
  }
}
```

Three types of dependencies:
| Syntax | Meaning | Example |
|---|---|---|
| `"^build"` | Topological — run `build` in all workspace dependencies first | Package `web` depends on `ui`; `ui#build` runs before `web#build` |
| `"build"` | Same-workspace — run `build` in the same workspace first | `test` waits for `build` in the same package |
| `"ui#build"` | Specific workspace — run `build` in workspace `ui` first | Explicit cross-workspace dependency |
Visual example of `"dependsOn": ["^build"]`:

```
packages/utils#build ──→ packages/ui#build ──→ apps/web#build
                     └──→ apps/api#build
```
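A sketch of how a `^build` specifier could expand into concrete task-level edges (a hypothetical helper for illustration, not Turborepo internals): for each workspace, `^build` means "the `build` task of every direct dependency workspace".

```typescript
// Illustrative expansion of "^task" into workspace#task edges.
type Deps = Record<string, string[]>; // workspace -> direct dependencies

function expandCaret(task: string, workspaceDeps: Deps): Record<string, string[]> {
  const edges: Record<string, string[]> = {};
  for (const [ws, deps] of Object.entries(workspaceDeps)) {
    // "^build" for workspace ws = build tasks of ws's direct dependencies
    edges[`${ws}#${task}`] = deps.map((d) => `${d}#${task}`);
  }
  return edges;
}

const edges = expandCaret("build", {
  "apps/web": ["packages/ui"],
  "packages/ui": ["packages/utils"],
  "packages/utils": [],
});
// edges["apps/web#build"] -> ["packages/ui#build"]
// edges["packages/utils#build"] -> [] (nothing to wait for)
```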
```json
{
  "build": {
    "outputs": [
      "dist/**",
      ".next/**",
      "!.next/cache/**",
      "build/**",
      "coverage/**"
    ]
  }
}
```

- Globs relative to the workspace root.
- These files are saved to cache and restored on cache hit.
- `!` prefix for negation (exclude from caching).
- If `outputs` is empty (`[]`), only logs are cached (useful for `lint`, `test`).
By default, Turbo hashes all files in a workspace (excluding gitignored files). You can narrow this:
```json
{
  "lint": {
    "inputs": [
      "src/**/*.ts",
      "src/**/*.tsx",
      ".eslintrc.*",
      "eslint.config.*"
    ]
  },
  "test": {
    "inputs": [
      "src/**",
      "test/**",
      "vitest.config.*"
    ]
  }
}
```

Benefit: changing a `README.md` won't invalidate the `lint` cache if `README.md` isn't in `inputs`.
`$TURBO_DEFAULT$` — a special token meaning "all default inputs":

```json
{
  "test": {
    "inputs": ["$TURBO_DEFAULT$", "test-fixtures/**"]
  }
}
```

```json
{
  "dev": {
    "cache": false   // Never cache dev server runs
  },
  "deploy": {
    "cache": false   // Side-effect tasks shouldn't be cached
  },
  "build": {
    "cache": true    // Default — cache this task
  }
}
```

Disable caching for:
- Dev servers
- Deployment scripts
- Database migrations
- Any task with side effects
```json
{
  "dev": {
    "persistent": true,
    "cache": false
  }
}
```

Persistent tasks:
- Are expected to never exit (dev servers, watch modes).
- Cannot be depended on by other tasks (since they never complete).
- Keep the terminal alive.
- Turbo runs them last and in parallel.
Turborepo is strict about environment variables to ensure cache correctness.
```json
{
  "globalEnv": ["CI", "NODE_ENV"],
  "tasks": {
    "build": {
      "env": ["DATABASE_URL", "NEXT_PUBLIC_*"],
      "dependsOn": ["^build"],
      "outputs": ["dist/**"]
    }
  }
}
```

- `env` — workspace-task-level: included in the task's hash. Wildcards supported.
- `globalEnv` — affects the hash for ALL tasks across ALL workspaces.
- If an env var is NOT listed, it is excluded from the hash and not available (in strict mode) to the task.
Framework auto-detection: Turbo automatically includes framework-prefixed env vars:
- `NEXT_PUBLIC_*` for Next.js
- `VITE_*` for Vite
- `REACT_APP_*` for CRA
- `NUXT_*` for Nuxt
- `PUBLIC_*` for SvelteKit
```json
{
  "globalPassThroughEnv": ["AWS_SECRET_KEY", "GITHUB_TOKEN"],
  "tasks": {
    "deploy": {
      "passThroughEnv": ["DEPLOY_TOKEN"],
      "cache": false
    }
  }
}
```

Pass-through env vars:
- Are available to the task at runtime.
- Are NOT included in the hash (won't affect caching).
- Use for secrets that change per environment but shouldn't invalidate cache.
```json
{
  "build": {
    "outputLogs": "new-only"
  }
}
```

| Value | Behavior |
|---|---|
| `"full"` | Show all logs (default) |
| `"hash-only"` | Only show the task hash |
| `"new-only"` | Show logs only for cache misses; suppress replayed logs |
| `"errors-only"` | Only show stderr/error output |
| `"none"` | Suppress all output |
For tasks that need terminal interaction (prompts, stdin):
```json
{
  "tasks": {
    "login": {
      "interactive": true,
      "cache": false,
      "persistent": true
    }
  }
}
```

Interactive tasks disable output buffering and connect stdin.
```bash
# Run build in all workspaces
turbo run build

# Run multiple tasks
turbo run build test lint

# Equivalent shorthand (turbo v2)
turbo build test lint
```

Turbo orchestrates tasks based on the dependency graph defined in `turbo.json`.
The --filter flag is one of Turborepo's most powerful features:
```bash
# By workspace name
turbo run build --filter=@myorg/web
turbo run build --filter=web

# By directory
turbo run build --filter=./apps/web

# By workspace + its dependencies
turbo run build --filter=web...

# By workspace + its dependents
turbo run build --filter=...ui

# By workspace + dependencies + dependents
turbo run build --filter=...ui...

# By git diff (changed since main)
turbo run build --filter=[main...HEAD]

# Combine: changed packages and their dependents
turbo run build --filter=...[main...HEAD]

# Exclude a workspace
turbo run build --filter=!@myorg/docs

# Multiple filters (union)
turbo run build --filter=web --filter=api

# By directory glob
turbo run build --filter="./packages/*"
```

Filter syntax cheat sheet:
| Syntax | Meaning |
|---|---|
| `name` | Exact workspace match |
| `name...` | Workspace + all its dependencies (downstream) |
| `...name` | Workspace + all its dependents (upstream) |
| `...name...` | Dependencies + workspace + dependents |
| `./path` | By filesystem path |
| `[git-range]` | Changed workspaces in git commit range |
| `{dir}` | Directories within a workspace |
| `!name` | Exclude |
```bash
# Text output
turbo run build --dry-run

# JSON output (for scripting)
turbo run build --dry-run=json
```

Shows which tasks would run, their hashes, dependencies, and cache status — without executing anything. Invaluable for debugging.
```bash
# Open interactive graph in browser
turbo run build --graph

# Output to file
turbo run build --graph=graph.svg
turbo run build --graph=graph.png
turbo run build --graph=graph.pdf
turbo run build --graph=graph.json
turbo run build --graph=graph.dot
```

Generates a visual representation of the task dependency graph. Extremely useful for understanding execution order and parallelism.
```bash
# Number of concurrent tasks
turbo run build --concurrency=4

# Percentage of CPU cores
turbo run build --concurrency=50%

# Default: 10
```

```bash
turbo run build test --continue
```

By default, Turbo stops on the first failure. `--continue` runs as many tasks as possible, reporting all failures at the end. Useful in CI for getting complete error reports.

```bash
turbo run build --force
```

Ignores the existing cache and re-runs all tasks. Useful for debugging cache issues or ensuring a clean build.
```bash
turbo run build --output-logs=new-only
turbo run build --output-logs=errors-only
```

CLI override for the `outputLogs` config; same options as the table above.
```bash
turbo prune @myorg/web --out-dir=./out
```

Creates a sparse/pruned copy of the monorepo containing only:
- The target workspace
- All its internal dependencies (transitive)
- A pruned lockfile
- Relevant workspace configurations
Output structure:

```
out/
├── apps/
│   └── web/              # The target app
├── packages/
│   ├── ui/               # Direct dependency
│   └── utils/            # Transitive dependency
├── package.json          # Pruned root
├── pnpm-lock.yaml        # Pruned lockfile (only relevant deps)
└── pnpm-workspace.yaml
```
Primary use case: Docker builds — dramatically reduces Docker context size and layer invalidation:
```dockerfile
FROM node:20-alpine AS builder
RUN npm install -g turbo pnpm
WORKDIR /app
COPY . .
RUN turbo prune @myorg/web --docker

# Thin install layer
FROM node:20-alpine AS installer
WORKDIR /app
COPY --from=builder /app/out/json/ .
RUN pnpm install --frozen-lockfile

# Build layer
COPY --from=builder /app/out/full/ .
RUN pnpm turbo run build --filter=@myorg/web

# Production
FROM node:20-alpine AS runner
WORKDIR /app
COPY --from=installer /app/apps/web/.next ./.next
COPY --from=installer /app/apps/web/package.json ./
CMD ["pnpm", "start"]
```

The `--docker` flag splits output into `json/` (package.json files for the install layer) and `full/` (source code for the build layer) to optimize Docker layer caching.
```bash
# Run a generator
turbo gen workspace   # Generate a new workspace from templates
turbo gen run         # Run custom generators
```

Define custom generators in `turbo/generators/config.ts`:

```typescript
import type { PlopTypes } from "@turbo/gen";

export default function generator(plop: PlopTypes.NodePlopAPI): void {
  plop.setGenerator("react-component", {
    description: "Create a new React component",
    prompts: [
      {
        type: "input",
        name: "name",
        message: "Component name?",
      },
    ],
    actions: [
      {
        type: "add",
        path: "packages/ui/src/{{pascalCase name}}.tsx",
        templateFile: "templates/component.hbs",
      },
    ],
  });
}
```

```bash
# Authenticate with Vercel
turbo login

# Link to a Vercel project (for remote caching)
turbo link

# Unlink
turbo unlink

# Logout
turbo logout
```

```bash
# List all workspaces
turbo ls

# Show detailed info about a workspace
turbo inspect @myorg/web
```

Turbo generates a hash for each `workspace#task` combination:
```
Hash = f(
  source files,         # Content of all files in the workspace
  internal dep hashes,  # Hashes of dependency workspaces (transitive)
  lockfile entries,     # Resolved versions of external dependencies
  task config,          # turbo.json config for this task
  env variables,        # Values of declared env vars
  global dependencies,  # Files listed in globalDependencies
  global env vars,      # Values of globalEnv vars
  turbo.json hash,      # The config file itself
  arguments             # CLI args passed to the task
)
```
Important: Hashing is content-based, not timestamp-based. A `git checkout` of a previous commit will produce cache hits from that commit's builds.
Cache Miss:
1. Compute hash
2. Check local cache → not found
3. Check remote cache → not found
4. Execute task
5. Capture outputs + logs
6. Store in local cache (keyed by hash)
7. Upload to remote cache (if enabled)
Cache Hit (local):
1. Compute hash
2. Check local cache → FOUND
3. Restore outputs to workspace
4. Replay logs to terminal
5. Print "FULL TURBO" indicator
Cache Hit (remote):
1. Compute hash
2. Check local cache → not found
3. Check remote cache → FOUND
4. Download artifacts
5. Store in local cache
6. Restore outputs to workspace
7. Replay logs
Cached artifacts per task include:
- File outputs — everything matching the `outputs` globs, stored as a compressed tarball.
- Logs — stdout and stderr captured during execution.
- Metadata — hash, timing, status information.
```json
// turbo.json
{
  "cacheDir": ".turbo/cache",   // Custom cache directory
  "daemon": true                // File-watching daemon for faster hashing
}
```

Local cache can grow large. Clean it with:
```bash
# Remove cache
rm -rf node_modules/.cache/turbo
# or
rm -rf .turbo
```

```
┌─────────────┐       ┌──────────────┐       ┌─────────────────┐
│ Developer A │──────▶│ Remote Cache │◀──────│    CI Runner    │
│  (laptop)   │       │   (Vercel)   │       │ (GitHub Action) │
└─────────────┘       └──────────────┘       └─────────────────┘
       ▲                      ▲                       │
       │                      │                       │
       │               ┌──────┴─────┐                 │
       └───────────────│ Developer B│─────────────────┘
                       │  (laptop)  │
                       └────────────┘
```
Artifacts are stored as content-addressable blobs keyed by task hash. The cache is team-scoped (via Vercel project/team).
Turbo's remote cache uses a simple HTTP API. Open-source servers:
- turborepo-remote-cache — Node.js server supporting S3, GCS, Azure Blob, local filesystem.
- turbo-remote-cache-rs — Rust implementation.
- Custom implementation following the Turbo Remote Cache API spec.
Configuration:
```bash
# Set custom API endpoint
turbo run build --api="https://my-cache-server.com" --token="my-token" --team="my-team"
```

Or in `.turbo/config.json`:
```json
{
  "apiurl": "https://my-cache-server.com",
  "teamid": "team_xxx",
  "token": "xxx"
}
```

```json
// turbo.json
{
  "remoteCache": {
    "signature": true
  }
}
```

When enabled, Turbo signs cached artifacts with an HMAC key set via the `TURBO_REMOTE_CACHE_SIGNATURE_KEY` env var. This prevents cache-poisoning attacks.
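The principle can be sketched with Node's crypto module: the uploader signs the artifact with a shared secret, and consumers verify the signature before restoring. The key name mirrors `TURBO_REMOTE_CACHE_SIGNATURE_KEY`, but the exact payload Turborepo signs may differ; treat this as an illustration of HMAC verification only.

```typescript
// Illustrative HMAC signing/verification for cached artifacts.
import { createHmac, timingSafeEqual } from "node:crypto";

function signArtifact(artifact: Buffer, key: string): string {
  return createHmac("sha256", key).update(artifact).digest("hex");
}

function verifyArtifact(artifact: Buffer, signature: string, key: string): boolean {
  const expected = Buffer.from(signArtifact(artifact, key), "hex");
  const actual = Buffer.from(signature, "hex");
  // Constant-time comparison to avoid timing side channels
  return expected.length === actual.length && timingSafeEqual(expected, actual);
}

const key = "team-secret"; // would come from TURBO_REMOTE_CACHE_SIGNATURE_KEY
const artifact = Buffer.from("compiled output");
const sig = signArtifact(artifact, key);
const tampered = Buffer.from("malicious output");
// verifyArtifact(artifact, sig, key) succeeds; the tampered artifact fails
```

A tampered artifact (or a signature produced with the wrong key) fails verification, so a compromised cache server cannot inject arbitrary build outputs undetected.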
```bash
# Step 1: Dry run to see hashes
turbo run build --dry-run=json

# Step 2: Compare hashes between runs
# Look at the "hash" and "hashOfExternalDependencies" fields

# Step 3: Verbose mode
turbo run build --verbosity=2

# Step 4: Check what files are included
turbo run build --summarize
# Generates .turbo/runs/<run-id>.json with full details
```

Common causes:
- Unlisted env var changing between runs
- Timestamp in generated files
- Non-deterministic build output
- Different node_modules resolution
- OS-specific lockfile differences
```json
// apps/web/package.json
{
  "dependencies": {
    "@myorg/ui": "workspace:*",
    "@myorg/utils": "workspace:^1.0.0"
  }
}
```

| Protocol | Meaning | Published form |
|---|---|---|
| `workspace:*` | Any version from the workspace | Replaced with the actual version on publish |
| `workspace:^` | Compatible version | `^1.2.3` |
| `workspace:~` | Patch-compatible | `~1.2.3` |
These are resolved by the package manager at install time to symlinks/hard links to the local workspace directory.
pnpm (recommended):
- Strict isolation by default — packages can only import what they declare.
- Hoisting controlled via `.npmrc`:

```ini
# .npmrc
shamefully-hoist=false            # default, strict
public-hoist-pattern[]=*eslint*
public-hoist-pattern[]=*prettier*
```
npm/yarn:
- Hoist by default — all deps accessible from any workspace (phantom-dependency risk).
- `nohoist` available in Yarn Classic.
ESLint config package:
```json
// packages/eslint-config/package.json
{
  "name": "@myorg/eslint-config",
  "version": "0.0.0",
  "private": true,
  "files": ["index.js", "next.js", "react.js"],
  "dependencies": {
    "@typescript-eslint/eslint-plugin": "^7.0.0",
    "@typescript-eslint/parser": "^7.0.0",
    "eslint-config-prettier": "^9.0.0"
  }
}
```

```js
// packages/eslint-config/index.js
module.exports = {
  parser: "@typescript-eslint/parser",
  extends: ["eslint:recommended", "prettier"],
  // ...
};
```

Consumed by workspaces:

```json
// apps/web/.eslintrc.json
{
  "extends": ["@myorg/eslint-config/next"]
}
```

TypeScript config package:
```json
// packages/typescript-config/package.json
{
  "name": "@myorg/typescript-config",
  "version": "0.0.0",
  "private": true,
  "files": ["base.json", "nextjs.json", "react-library.json"]
}
```

```json
// packages/typescript-config/base.json
{
  "compilerOptions": {
    "strict": true,
    "target": "ES2020",
    "module": "ESNext",
    "moduleResolution": "bundler",
    "declaration": true,
    "declarationMap": true,
    "sourceMap": true,
    "esModuleInterop": true,
    "forceConsistentCasingInFileNames": true,
    "skipLibCheck": true
  }
}
```

Consumed:
```json
// apps/web/tsconfig.json
{
  "extends": "@myorg/typescript-config/nextjs.json",
  "compilerOptions": {
    "outDir": "dist"
  },
  "include": ["src/**/*"],
  "exclude": ["node_modules"]
}
```

| Strategy | Description | When to use |
|---|---|---|
| Fixed/Locked | All packages share one version | Small teams, tightly coupled |
| Independent | Each package versioned separately | Library publishing, large teams |
| `0.0.0` everywhere | No versioning (internal only) | Private monorepos that don't publish |
For publishing, use Changesets:
```bash
pnpm add -D @changesets/cli -w
pnpm changeset init
pnpm changeset           # Create a changeset
pnpm changeset version   # Apply version bumps
pnpm changeset publish   # Publish to npm
```

Problem: `apps/web` needs React 18, `apps/legacy` needs React 17.
Solutions:
- pnpm overrides: Force a single version repo-wide.
- pnpm catalog (v9+): Centralized dependency version management.
- Accept multiple versions: pnpm's strict isolation handles this naturally.
- Syncpack: Tool to enforce version consistency.
```json
// Root package.json
{
  "pnpm": {
    "overrides": {
      "react": "^18.0.0"
    }
  }
}
```

(Covered in Section 54. Pattern: `packages/typescript-config`.)
Option A: Independent compilation (recommended for Turborepo)
Each package compiles independently with its own tsconfig.json. Turbo orchestrates the build order via dependsOn: ["^build"].
Option B: TypeScript Project References
// tsconfig.json (root)
{
"references": [
{ "path": "packages/utils" },
{ "path": "packages/ui" },
{ "path": "apps/web" }
]
}

With tsc --build. This leverages TypeScript's own incremental/composite build system. Can coexist with Turbo but adds complexity.
Recommendation: Use Turbo's task orchestration instead of TS project references. Simpler and more cacheable.
If using project references:
// packages/utils/tsconfig.json
{
"compilerOptions": {
"composite": true,
"declaration": true,
"declarationMap": true,
"outDir": "dist"
}
}

// apps/web/tsconfig.json
{
"compilerOptions": {
"paths": {
"@/*": ["./src/*"],
"@myorg/ui": ["../../packages/ui/src"],
"@myorg/utils": ["../../packages/utils/src"]
}
}
}

For the JIT (Just-in-Time) pattern, point paths to source files so your IDE resolves types without building.
// turbo.json
{
"tasks": {
"typecheck": {
"dependsOn": ["^build"],
"outputs": []
}
}
}

// packages/ui/package.json
{
"scripts": {
"typecheck": "tsc --noEmit"
}
}

Separating type-checking from building allows caching each independently. A lint change shouldn't invalidate the typecheck cache.
Typical build pipeline:
{
"tasks": {
"build": {
"dependsOn": ["^build"],
"outputs": ["dist/**", ".next/**", "!.next/cache/**"],
"env": ["NODE_ENV"]
}
}
}

Execution flow for turbo run build:
packages/typescript-config (no build script → skipped)
packages/eslint-config (no build script → skipped)
packages/utils#build ─┐
├── parallel
packages/database#build ─┘
│
packages/ui#build (depends on utils)
│
┌────┴────┐
│ │
apps/web apps/api ── parallel
#build #build
// turbo.json
{
"tasks": {
"lint": {
"dependsOn": ["^build"],
"inputs": [
"src/**/*.ts",
"src/**/*.tsx",
".eslintrc.*",
"eslint.config.*",
"../../packages/eslint-config/**"
],
"outputs": []
}
}
}

Note: dependsOn: ["^build"] may be needed if ESLint plugins resolve types from built packages.
{
"tasks": {
"test": {
"dependsOn": ["build"],
"inputs": [
"src/**",
"test/**",
"__tests__/**",
"vitest.config.*",
"jest.config.*"
],
"outputs": ["coverage/**"],
"env": ["CI"]
}
}
}

{
"tasks": {
"test:e2e": {
"dependsOn": ["build"],
"outputs": [
"playwright-report/**",
"test-results/**"
],
"cache": false,
"env": ["E2E_BASE_URL", "CI"]
}
}
}

E2E tests are often not cached because they depend on external services, browser state, etc.
# Run prettier from root (not through turbo — it's faster)
prettier --write "**/*.{ts,tsx,json,md}"

Formatting is typically a root-level task, not per-workspace, because Prettier is fast enough to run on the entire repo and doesn't have workspace-specific config.
If you do want it per-workspace:
{
"tasks": {
"format": {
"outputs": [],
"cache": false
}
}
}

{
"tasks": {
"dev": {
"dependsOn": ["^build"],
"persistent": true,
"cache": false
}
}
}

# Start all dev servers
turbo run dev
# Start specific ones
turbo run dev --filter=web --filter=api

Turbo runs all dev servers simultaneously, each in its own process, with combined output in the terminal.
The new TUI (Terminal UI) mode ("ui": "tui") provides a split-pane interactive terminal for each persistent task.
{
"tasks": {
"dev": {
"persistent": true,
"cache": false
}
}
}

For packages, use the tool's built-in watch mode:
// packages/ui/package.json
{
"scripts": {
"dev": "tsup src/index.ts --format cjs,esm --dts --watch"
}
}

turbo watch (v2): Automatically re-runs tasks when source files change:
turbo watch build lint test

When using the JIT pattern:
- App dev server (Next.js, Vite) watches source files.
- Package changes are picked up immediately (since they point to source).
- HMR fires in the app.
When building packages:
- The package dev script runs tsup --watch.
- Output goes to dist/.
- The app dev server detects the dist/ change → HMR.
Naive approach (slow, large images):
COPY . .
RUN npm install
RUN npx turbo run build --filter=web

Optimized approach with turbo prune:
# Stage 1: Prune
FROM node:20-alpine AS pruner
RUN npm install -g turbo
WORKDIR /app
COPY . .
RUN turbo prune @myorg/web --docker
# Stage 2: Install
FROM node:20-alpine AS installer
WORKDIR /app
COPY --from=pruner /app/out/json/ .
COPY --from=pruner /app/out/pnpm-lock.yaml ./pnpm-lock.yaml
RUN corepack enable && pnpm install --frozen-lockfile
# Stage 3: Build
COPY --from=pruner /app/out/full/ .
RUN corepack enable && pnpm turbo run build --filter=@myorg/web
# Stage 4: Run
FROM node:20-alpine AS runner
WORKDIR /app
COPY --from=installer /app/apps/web/.next/standalone ./
COPY --from=installer /app/apps/web/.next/static ./.next/static
COPY --from=installer /app/apps/web/public ./public
CMD ["node", "server.js"]

The --docker flag creates two directories:
- out/json/ — only package.json files and the lockfile. Used for the install layer. Changes to source code don't invalidate this Docker layer.
- out/full/ — full source code. Used for the build layer.
This separation means pnpm install (expensive) is only re-run when dependencies change, not on every code change.
GitHub Actions example:
name: CI
on: [push, pull_request]
jobs:
build:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: pnpm/action-setup@v4
with:
version: 9
- uses: actions/setup-node@v4
with:
node-version: 20
cache: "pnpm"
- run: pnpm install --frozen-lockfile
- run: pnpm turbo run build test lint
env:
TURBO_TOKEN: ${{ secrets.TURBO_TOKEN }}
TURBO_TEAM: ${{ vars.TURBO_TEAM }}

Key optimizations:
- Remote caching — share cache across CI runs.
- Filter by change — only build/test affected packages:
- run: pnpm turbo run build test --filter="...[origin/main...HEAD]"
- Cache node_modules — use actions/cache or pnpm's built-in caching.
- Enable remote caching for all CI jobs.
- Use read-only cache in PRs to prevent cache pollution:
TURBO_REMOTE_CACHE_READ_ONLY=true turbo run build
- Populate cache from main — CI on main pushes to the cache; PRs read from it.
- Scope by team — all team members share one cache.
- Sign artifacts in sensitive environments.
Turbo's cache is global (not branch-scoped) — any hash match from any branch is a valid hit. This is correct because hashing is content-based.
If you need isolation:
- A different TURBO_TEAM per environment
- A self-hosted cache with custom scoping logic
Turborepo is JavaScript-centric but can orchestrate non-JS tasks:
{
"tasks": {
"build": {
"dependsOn": ["^build"],
"outputs": ["target/**"]
}
}
}

// packages/rust-lib/package.json
{
"name": "@myorg/rust-lib",
"scripts": {
"build": "cargo build --release"
}
}As long as there's a package.json with scripts, Turbo can orchestrate it. The workspace doesn't have to be JavaScript.
pnpm add -D @changesets/cli -w
pnpm changeset initWorkflow:
# 1. Developer creates a changeset
pnpm changeset
# Prompts: which packages changed? major/minor/patch? description?
# 2. CI or release process versions packages
pnpm changeset version
# 3. Publish
pnpm turbo run build --filter="...[HEAD^]"
pnpm changeset publish

CI automation with GitHub Actions:
- uses: changesets/action@v1
with:
publish: pnpm changeset publish
version: pnpm changeset version
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
NPM_TOKEN: ${{ secrets.NPM_TOKEN }}

// turbo.json
{
"tasks": {
"build": {
"env": [
"FEATURE_*",
"NEXT_PUBLIC_FEATURE_*"
]
}
}
}

Different env var values produce different hashes → different cache entries. So building with FEATURE_NEW_UI=true caches separately from FEATURE_NEW_UI=false.
apps/
├── shell/ # Host/shell application (Module Federation)
├── mfe-auth/ # Auth micro-frontend
├── mfe-dashboard/ # Dashboard micro-frontend
└── mfe-settings/ # Settings micro-frontend
packages/
├── shared-types/ # Shared TypeScript interfaces
└── shared-utils/ # Shared utilities
Each MFE builds independently. Turbo caches each one. Module Federation or import maps handle runtime composition.
apps/
├── marketing/ # Next.js (SSR/SSG)
├── dashboard/ # Vite + React (SPA)
├── docs/ # Astro (static)
├── api/ # Fastify
└── admin/ # Remix
packages/
├── ui/ # Framework-agnostic components
└── api-client/ # Generated API client (OpenAPI)
Each app has its own framework setup. Shared packages are framework-agnostic. Turbo orchestrates everything uniformly.
# pnpm-workspace.yaml
packages:
- "apps/*"
- "packages/*"
- "!packages/deprecated-*" # Exclude deprecated packages

The daemon is a background process that watches the filesystem for changes. Benefits:
- Faster hashing — incremental file watching instead of walking the entire tree.
- Persistent state — keeps the workspace graph in memory.
// turbo.json
{
"daemon": true // Default in v2
}

Manage the daemon:
turbo daemon status
turbo daemon stop
turbo daemon restart

# Migrate turbo.json from v1 to v2
npx @turbo/codemod migrate
# Specific transforms
npx @turbo/codemod rename-pipeline # pipeline → tasks
npx @turbo/codemod add-package-manager
npx @turbo/codemod set-default-outputs

Turborepo claims and demonstrates:
- ~65-85% faster CI builds with remote caching
- Millisecond replays on full cache hits
- Linear scaling with workspace count when cache is warm
Real-world example:
| Scenario | Without Turbo | With Turbo (cold) | With Turbo (warm) |
|---|---|---|---|
| 20-package build | 4 min | 2 min | 8 sec |
| 50-package CI | 12 min | 5 min | 30 sec |
Turbo's scheduler:
- Builds the task graph.
- Identifies tasks with no pending dependencies.
- Executes them in parallel (up to the --concurrency limit).
- As tasks complete, new tasks become unblocked.
- Repeats until all tasks are done.
This is optimal — it achieves maximum parallelism given the dependency constraints.
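The loop above can be sketched as a wave-based variant of Kahn's algorithm. This is an illustrative model, not Turbo's implementation; the task names mirror the example graph in this guide:

```javascript
// Wave-based topological scheduling: each wave is a set of tasks that can
// run in parallel because all their dependencies have already completed.
function scheduleWaves(deps) {
  const pending = new Map(Object.entries(deps).map(([t, d]) => [t, new Set(d)]));
  const waves = [];
  while (pending.size > 0) {
    // Everything with no unmet dependencies is runnable now
    const ready = [...pending.keys()].filter((t) => pending.get(t).size === 0);
    if (ready.length === 0) throw new Error("cycle detected");
    waves.push(ready.sort());
    for (const t of ready) pending.delete(t);
    for (const d of pending.values()) ready.forEach((t) => d.delete(t));
  }
  return waves;
}

const waves = scheduleWaves({
  utils: [], database: [],
  ui: ["utils"],
  web: ["ui", "database"], api: ["ui", "database"],
});
console.log(waves); // → [["database","utils"], ["ui"], ["api","web"]]
```

Each wave maps directly to a parallel batch in Turbo's execution timeline.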
Performance impact:
| Operation | Without Daemon | With Daemon |
|---|---|---|
| Hash computation (large repo) | ~2-5s | ~50-200ms |
| Workspace discovery | ~500ms | ~10ms |
| File change detection | Walk entire tree | Instant (inotify/FSEvents) |
Recommendations for large monorepos:
- Always use remote caching.
- Use pnpm — fastest installs, strictest isolation.
- Narrow inputs — reduce hash computation.
- Use the daemon — faster file watching.
- Filter in CI — only build affected packages.
- Split CI jobs — parallelize across runners.
- Use turbo prune for deployment builds.
# Generate a profile
turbo run build --profile=profile.json
# View in Chrome DevTools (chrome://tracing)
# or https://ui.perfetto.dev

The profile shows:
- Task execution timeline
- Cache hit/miss per task
- Wait times (blocked on dependencies)
- Parallelism utilization
Vercel has native Turborepo support:
- Automatic remote caching (no config needed)
- Automatic workspace detection
- Build command: cd ../.. && npx turbo run build --filter=web
- Root directory setting: apps/web
In Vercel project settings:
Root Directory: apps/web
Build Command: cd ../.. && pnpm turbo run build --filter=@myorg/web
Output Directory: .next
Install Command: pnpm install
// apps/web/next.config.js
const path = require("path"); // needed for outputFileTracingRoot

/** @type {import('next').NextConfig} */
module.exports = {
transpilePackages: ["@myorg/ui", "@myorg/utils"],
output: "standalone", // For Docker
experimental: {
outputFileTracingRoot: path.join(__dirname, "../../"),
},
};

// packages/ui/package.json
{
"scripts": {
"storybook": "storybook dev -p 6006",
"build-storybook": "storybook build"
}
}

// turbo.json
{
"tasks": {
"storybook": {
"persistent": true,
"cache": false
},
"build-storybook": {
"dependsOn": ["^build"],
"outputs": ["storybook-static/**"]
}
}
}

packages/database/
├── prisma/
│ └── schema.prisma
├── src/
│ └── index.ts # Re-export PrismaClient
├── package.json
└── tsconfig.json
// packages/database/package.json
{
"name": "@myorg/database",
"scripts": {
"build": "tsup src/index.ts --format cjs,esm --dts",
"db:generate": "prisma generate",
"db:push": "prisma db push",
"db:migrate": "prisma migrate dev",
"db:studio": "prisma studio"
}
}

// turbo.json
{
"tasks": {
"build": {
"dependsOn": ["^build", "db:generate"]
},
"db:generate": {
"cache": false
},
"db:push": {
"cache": false
}
}
}

packages/ui/
├── src/
│ ├── components/
│ │ ├── Button.tsx
│ │ ├── Card.tsx
│ │ ├── Input.tsx
│ │ └── index.ts
│ ├── hooks/
│ ├── styles/
│ └── index.ts
├── package.json
└── tsconfig.json
Export pattern:
{
"name": "@myorg/ui",
"exports": {
".": "./src/index.ts",
"./button": "./src/components/Button.tsx",
"./card": "./src/components/Card.tsx",
"./styles.css": "./src/styles/globals.css"
}
}

apps/
├── web/ # Next.js frontend
├── api/ # Express/Fastify API
packages/
├── api-client/ # Type-safe API client (generated from OpenAPI or tRPC)
├── validators/ # Zod schemas shared between frontend & backend
├── types/ # Shared TypeScript interfaces
└── database/ # Prisma
With tRPC, you get end-to-end type safety:
packages/trpc/ # tRPC router definitions
apps/web/ # tRPC client (type-safe)
apps/api/ # tRPC server
Diagnosis:
turbo run build --dry-run=json --filter=web

Compare the hash and inputs between runs.
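To pinpoint which tasks re-hashed between two runs, you can diff the dry-run JSON. A hedged sketch, assuming a { tasks: [{ taskId, hash }] } shape; adjust field names if your turbo version's output differs, and note the sample data here is invented:

```javascript
// Diff two `turbo run build --dry-run=json` outputs by task hash.
function changedTasks(runA, runB) {
  const prev = new Map(runA.tasks.map((t) => [t.taskId, t.hash]));
  return runB.tasks
    .filter((t) => prev.get(t.taskId) !== t.hash)
    .map((t) => t.taskId);
}

// Invented sample data standing in for two saved dry-run files
const before = { tasks: [{ taskId: "web#build", hash: "aaa" }, { taskId: "ui#build", hash: "bbb" }] };
const after = { tasks: [{ taskId: "web#build", hash: "ccc" }, { taskId: "ui#build", hash: "bbb" }] };
console.log(changedTasks(before, after)); // → [ 'web#build' ]
```

In practice, save two runs to files and load them with JSON.parse before diffing.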
Common causes:
- Unlisted env var changing between runs → add it to env or globalEnv.
- Timestamp in output → make builds deterministic.
- Different node_modules → ensure the lockfile is committed and identical.
- Generated file differing → add it to .gitignore or an inputs exclusion.
- OS differences → the lockfile may differ between macOS/Linux.
- turbo.json changed → expected; the config is part of the hash.
A phantom dependency is a package used in code but not declared in package.json. It works due to hoisting but breaks in strict environments.
// packages/ui/src/index.ts
import lodash from 'lodash'; // Not in ui's package.json!
// Works because npm/yarn hoists it from another workspace

Fix: Use pnpm (strict by default) or a linting rule to catch undeclared imports.
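One concrete linting option is eslint-plugin-import's no-extraneous-dependencies rule, which flags imports that aren't declared in the nearest package.json. A sketch, assuming the plugin is installed:

```json
// .eslintrc fragment (per workspace)
{
  "plugins": ["import"],
  "rules": {
    "import/no-extraneous-dependencies": "error"
  }
}
```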
packages/a → packages/b → packages/a ← CYCLE!
Turbo will error on circular workspace dependencies (the DAG cannot be topologically sorted).
Fix:
- Extract shared code into a third package.
- Use dependency injection.
- Restructure to remove the cycle.
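To locate the offending path in a large graph, a small DFS over a name-to-dependencies map can print the cycle. An illustrative sketch, not a Turbo API:

```javascript
// Depth-first search that returns the first dependency cycle found, as a path.
function findCycle(deps) {
  const state = new Map(); // node → "visiting" | "done"
  const stack = [];
  function visit(node) {
    if (state.get(node) === "done") return null;
    if (state.get(node) === "visiting") {
      // Back-edge found: the cycle is the stack from `node` onward
      return stack.slice(stack.indexOf(node)).concat(node);
    }
    state.set(node, "visiting");
    stack.push(node);
    for (const dep of deps[node] ?? []) {
      const cycle = visit(dep);
      if (cycle) return cycle;
    }
    stack.pop();
    state.set(node, "done");
    return null;
  }
  for (const node of Object.keys(deps)) {
    const cycle = visit(node);
    if (cycle) return cycle;
  }
  return null;
}

console.log(findCycle({ a: ["b"], b: ["a"] })); // → [ 'a', 'b', 'a' ]
console.log(findCycle({ a: ["b"], b: [] })); // → null
```

Feed it the internal workspace:* edges from your package.json files to see exactly which packages form the loop.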
Problem: macOS developer pushes cache, Linux CI runner can't use it because native modules (e.g., esbuild, swc) differ.
Solution: Turbo includes platform info in the hash for tasks that depend on native binaries. The lockfile's platform-specific resolution also affects the hash.
Symptoms: Module not found errors despite packages being installed.
Fixes:
- Ensure the workspace:* protocol is used for internal deps.
- Run pnpm install from the root.
- Check .npmrc hoisting settings.
- Verify exports/main fields in package.json.
- Clear node_modules and reinstall: pnpm install --force.
# Increase Node memory
NODE_OPTIONS="--max-old-space-size=8192" turbo run build
# Reduce concurrency to limit memory
turbo run build --concurrency=2
# Use turbo prune for deployment builds
turbo prune @myorg/web --docker

# 1. Install turbo
pnpm add -D turbo -w
# 2. Create turbo.json matching your lerna.json tasks
# 3. Replace lerna run with turbo run in scripts/CI
# 4. Remove lerna (optional — keep for versioning/publishing)
# Keep lerna for publishing, use turbo for building:
{
"scripts": {
"build": "turbo run build", // Turbo
"publish": "lerna publish" // Lerna
}
}

Key differences to address:
- project.json → package.json scripts + turbo.json
- nx.json targets → turbo.json tasks
- Nx generators → turbo gen (simpler)
- Nx plugins → manual setup (Turbo has fewer plugins)
- nx affected → turbo run --filter=...[main...HEAD]
- Create the monorepo structure with apps/ and packages/.
- Move repos into the structure (preserving git history with git subtree, or start fresh).
- Deduplicate dependencies — merge lockfiles, unify versions.
- Extract shared code into packages.
- Add workspace:* references.
- Configure turbo.json.
- Update CI/CD pipelines.
Turbo can be added to an existing monorepo without changing anything:
# 1. Install
pnpm add -D turbo -w
# 2. Add minimal turbo.json
echo '{ "tasks": { "build": { "outputs": ["dist/**"] } } }' > turbo.json
# 3. Replace `pnpm -r run build` with `turbo run build`
# That's it! Immediate caching + parallelism benefits.

npx @turbo/codemod migrate

Key changes:
- pipeline → tasks
- Improved automatic environment variable detection
- New TUI ("ui": "tui")
- Daemon on by default
- turbo watch command
- Improved filtering
- Workspace turbo.json requires "extends": ["//"]
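Under the v2 convention, a workspace-level override might look like this (a sketch, assuming an app named web with Next.js outputs):

```json
// apps/web/turbo.json
{
  "extends": ["//"],
  "tasks": {
    "build": {
      "outputs": [".next/**", "!.next/cache/**"]
    }
  }
}
```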
| Item | Convention | Example |
|---|---|---|
| Workspace names | Scoped, lowercase, kebab-case | @myorg/ui-components |
| App directories | Short, descriptive | apps/web, apps/api |
| Package directories | Feature-based | packages/auth, packages/ui |
| Config packages | Suffixed with -config | @myorg/eslint-config |
| Scripts | Consistent across workspaces | build, test, lint, dev |
# .github/CODEOWNERS
/apps/web/ @frontend-team
/apps/api/ @backend-team
/packages/ui/ @design-system-team
/packages/database/ @backend-team
/turbo.json @platform-team
/.github/ @platform-team
Tools to enforce architectural boundaries:
- eslint-plugin-import — restrict imports between packages.
- Sherif — monorepo-specific linting (dependency consistency, etc.).
- depcheck — find unused dependencies.
- syncpack — enforce consistent dependency versions.
- publint — validate package.json for publishing.
Custom ESLint rule example:
// Don't allow apps to import from other apps
"import/no-restricted-paths": [{
zones: [{
target: "./apps/web",
from: "./apps/api"
}]
}]

docs/
├── architecture.md # Monorepo architecture overview
├── adding-a-package.md # How to create new packages
├── dependency-policy.md # Rules about dependencies
└── ci-cd.md # CI/CD pipeline documentation
packages/ui/
└── README.md # Package-specific docs
Additionally, use turbo gen to codify patterns — new package creation becomes a guided, automated process.
❌ Don't use a monorepo when:
- Teams have completely separate tech stacks with no shared code.
- Strict repository-level access control is required (per-project permissions).
- The projects are maintained by entirely different organizations.
- Git performance is critical and the repo would exceed millions of files.
❌ Don't use Turborepo specifically when:
- You need distributed task execution across multiple machines (use Nx or Bazel).
- You need deep polyglot support (Go, Rust, Java alongside JS — use Bazel).
- You need fine-grained project graph plugins (use Nx).
- Your monorepo is < 2 packages with no shared code (overkill).
# Setup
npx create-turbo@latest # Create new monorepo
pnpm add -D turbo -w # Add to existing
# Running
turbo run build # Build everything
turbo run build test lint # Multiple tasks
turbo run build --filter=web # Single workspace
turbo run build --filter=web... # Workspace + deps
turbo run build --filter=...[main...HEAD] # Changed since main
# Cache
turbo run build --force # Skip cache
turbo run build --dry-run # Preview without running
turbo run build --summarize # Generate run summary
# Remote cache
turbo login && turbo link # Enable remote caching
# Docker
turbo prune @myorg/web --docker # Pruned output for Docker
# Debugging
turbo run build --graph # Visualize task graph
turbo run build --dry-run=json # Inspect hashes
turbo run build --verbosity=2 # Verbose output
# Maintenance
turbo daemon status # Check daemon
npx @turbo/codemod migrate # Migrate to latest

This guide covers Turborepo comprehensively, from first principles through advanced production patterns. Each section builds on the previous, creating a complete mental model for building, scaling, and maintaining a Turborepo-based monorepo.