From 4838df315931bb883f704ec3e1abe2685f296cdf Mon Sep 17 00:00:00 2001
From: HsiangNianian
Date: Sat, 22 Apr 2023 19:52:26 +0800
Subject: 😀
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

---
 docs/pages/_app.tsx | 47 ++
 docs/pages/_document.tsx | 54 ++
 docs/pages/_error.js | 17 +
 docs/pages/_meta.json | 39 +-
 docs/pages/about.mdx | 3 -
 docs/pages/advanced.mdx | 3 -
 docs/pages/advanced/satori.mdx | 3 -
 docs/pages/another.mdx | 31 -
 docs/pages/api/binaries/version.ts | 113 ++++
 docs/pages/api/og.tsx | 167 +++++
 docs/pages/api/signup.tsx | 33 +
 docs/pages/api/user/[id].tsx | 36 ++
 docs/pages/blog.mdx | 44 ++
 docs/pages/blog/_meta.json | 26 +
 docs/pages/blog/joining-vercel.mdx | 23 +
 docs/pages/blog/saml-sso-now-available.mdx | 52 ++
 docs/pages/blog/turbo-0-4-0.mdx | 211 ++++++
 docs/pages/blog/turbo-1-1-0.mdx | 141 ++++
 docs/pages/blog/turbo-1-2-0.mdx | 132 ++++
 docs/pages/blog/turbo-1-3-0.mdx | 202 ++++++
 docs/pages/blog/turbo-1-4-0.mdx | 139 ++++
 docs/pages/blog/turbo-1-5-0.mdx | 142 ++++
 docs/pages/blog/turbo-1-6-0.mdx | 223 +++++++
 docs/pages/blog/turbo-1-7-0.mdx | 150 +++++
 docs/pages/blog/turbo-1-8-0.mdx | 119 ++++
 docs/pages/blog/turbo-1-9-0.mdx | 121 ++++
 docs/pages/blog/turbopack-benchmarks.mdx | 275 ++++++++
 ...ight-not-need-typescript-project-references.mdx | 95 +++
 docs/pages/confirm.mdx | 8 +
 docs/pages/index.mdx | 14 +-
 docs/pages/pack/_meta.json | 15 +
 docs/pages/pack/docs/_meta.json | 10 +
 docs/pages/pack/docs/advanced/profiling.mdx | 74 +++
 docs/pages/pack/docs/benchmarks.mdx | 174 +++++
 .../pack/docs/comparisons/BenchmarksCallout.tsx | 14 +
 docs/pages/pack/docs/comparisons/_meta.json | 4 +
 docs/pages/pack/docs/comparisons/vite.mdx | 67 ++
 docs/pages/pack/docs/comparisons/webpack.mdx | 66 ++
 docs/pages/pack/docs/core-concepts.mdx | 64 ++
 docs/pages/pack/docs/features.mdx | 22 +
 docs/pages/pack/docs/features/_meta.json | 11 +
 docs/pages/pack/docs/features/css.mdx | 80 +++
 .../pack/docs/features/customizing-turbopack.mdx | 112 ++++
 docs/pages/pack/docs/features/dev-server.mdx | 17 +
 .../pack/docs/features/environment-variables.mdx | 29 +
 docs/pages/pack/docs/features/frameworks.mdx | 45 ++
 docs/pages/pack/docs/features/imports.mdx | 47 ++
 docs/pages/pack/docs/features/javascript.mdx | 39 ++
 docs/pages/pack/docs/features/static-assets.mdx | 47 ++
 docs/pages/pack/docs/features/typescript.mdx | 40 ++
 docs/pages/pack/docs/index.mdx | 61 ++
 docs/pages/pack/docs/migrating-from-webpack.mdx | 34 +
 docs/pages/pack/docs/roadmap.mdx | 33 +
 docs/pages/pack/docs/why-turbopack.mdx | 62 ++
 docs/pages/pack/index.mdx | 8 +
 docs/pages/privacy.mdx | 249 +++
 docs/pages/repo/_meta.json | 15 +
 docs/pages/repo/docs/_meta.json | 20 +
 docs/pages/repo/docs/acknowledgements.mdx | 45 ++
 docs/pages/repo/docs/ci.mdx | 13 +
 docs/pages/repo/docs/ci/_meta.json | 6 +
 docs/pages/repo/docs/ci/circleci.mdx | 145 +++++
 docs/pages/repo/docs/ci/github-actions.mdx | 231 +++++++
 docs/pages/repo/docs/ci/gitlabci.mdx | 136 ++++
 docs/pages/repo/docs/ci/travisci.mdx | 122 ++++
 docs/pages/repo/docs/core-concepts/_meta.json | 8 +
 docs/pages/repo/docs/core-concepts/caching.mdx | 460 +++++++++++++
 docs/pages/repo/docs/core-concepts/monorepos.mdx | 23 +
 .../repo/docs/core-concepts/monorepos/_meta.json | 5 +
 .../monorepos/configuring-workspaces.mdx | 267 ++++++++
 .../docs/core-concepts/monorepos/filtering.mdx | 208 ++++++
 .../docs/core-concepts/monorepos/running-tasks.mdx | 315 +++++++++
 .../core-concepts/monorepos/skipping-tasks.mdx | 66 ++
 .../repo/docs/core-concepts/remote-caching.mdx | 112 ++++
 docs/pages/repo/docs/core-concepts/scopes.mdx | 50 ++
 docs/pages/repo/docs/faq.mdx | 51 ++
 docs/pages/repo/docs/getting-started/_meta.json | 6 +
 .../repo/docs/getting-started/add-to-project.mdx | 147 +++++
 .../pages/repo/docs/getting-started/create-new.mdx | 527 +++++++++++++++
 .../docs/getting-started/existing-monorepo.mdx | 231 +++++++
 .../repo/docs/getting-started/from-example.mdx | 49 ++
 docs/pages/repo/docs/handbook.mdx | 25 +
 docs/pages/repo/docs/handbook/_meta.json | 15 +
 .../pages/repo/docs/handbook/building-your-app.mdx | 76 +++
 .../repo/docs/handbook/deploying-with-docker.mdx | 184 ++++++
 docs/pages/repo/docs/handbook/dev.mdx | 205 ++++++
 .../repo/docs/handbook/environment-variables.mdx | 54 ++
 docs/pages/repo/docs/handbook/linting.mdx | 50 ++
 docs/pages/repo/docs/handbook/linting/_meta.json | 4 +
 docs/pages/repo/docs/handbook/linting/eslint.mdx | 124 ++++
 .../repo/docs/handbook/linting/typescript.mdx | 129 ++++
 .../repo/docs/handbook/migrating-to-a-monorepo.mdx | 77 +++
 .../repo/docs/handbook/package-installation.mdx | 157 +++++
 .../repo/docs/handbook/publishing-packages.mdx | 15 +
 .../docs/handbook/publishing-packages/_meta.json | 4 +
 .../docs/handbook/publishing-packages/bundling.mdx | 130 ++++
 .../versioning-and-publishing.mdx | 57 ++
 docs/pages/repo/docs/handbook/sharing-code.mdx | 71 ++
 .../repo/docs/handbook/sharing-code/_meta.json | 3 +
 .../handbook/sharing-code/internal-packages.mdx | 237 +++++++
 docs/pages/repo/docs/handbook/testing.mdx | 109 ++++
 docs/pages/repo/docs/handbook/tools/_meta.json | 4 +
 docs/pages/repo/docs/handbook/tools/prisma.mdx | 207 ++++++
 docs/pages/repo/docs/handbook/tools/storybook.mdx | 223 +++++++
 docs/pages/repo/docs/handbook/troubleshooting.mdx | 31 +
 .../repo/docs/handbook/what-is-a-monorepo.mdx | 85 +++
 docs/pages/repo/docs/handbook/workspaces.mdx | 168 +++++
 docs/pages/repo/docs/index.mdx | 73 +++
 docs/pages/repo/docs/installing.mdx | 85 +++
 docs/pages/repo/docs/reference/_meta.json | 5 +
 docs/pages/repo/docs/reference/codemods.mdx | 278 ++++++++
 .../repo/docs/reference/command-line-reference.mdx | 713 +++++++++++++++++++++
 docs/pages/repo/docs/reference/configuration.mdx | 405 ++++++++++++
 docs/pages/repo/docs/troubleshooting.mdx | 172 +++++
 docs/pages/repo/docs/upgrading-to-v1.mdx | 78 +++
 docs/pages/repo/index.mdx | 7 +
 docs/pages/showcase.mdx | 10 +
 docs/pages/terms.mdx | 90 +++
 118 files
changed, 11646 insertions(+), 59 deletions(-) create mode 100644 docs/pages/_app.tsx create mode 100644 docs/pages/_document.tsx create mode 100644 docs/pages/_error.js delete mode 100644 docs/pages/about.mdx delete mode 100644 docs/pages/advanced.mdx delete mode 100644 docs/pages/advanced/satori.mdx delete mode 100644 docs/pages/another.mdx create mode 100644 docs/pages/api/binaries/version.ts create mode 100644 docs/pages/api/og.tsx create mode 100644 docs/pages/api/signup.tsx create mode 100644 docs/pages/api/user/[id].tsx create mode 100644 docs/pages/blog.mdx create mode 100644 docs/pages/blog/_meta.json create mode 100644 docs/pages/blog/joining-vercel.mdx create mode 100644 docs/pages/blog/saml-sso-now-available.mdx create mode 100644 docs/pages/blog/turbo-0-4-0.mdx create mode 100644 docs/pages/blog/turbo-1-1-0.mdx create mode 100644 docs/pages/blog/turbo-1-2-0.mdx create mode 100644 docs/pages/blog/turbo-1-3-0.mdx create mode 100644 docs/pages/blog/turbo-1-4-0.mdx create mode 100644 docs/pages/blog/turbo-1-5-0.mdx create mode 100644 docs/pages/blog/turbo-1-6-0.mdx create mode 100644 docs/pages/blog/turbo-1-7-0.mdx create mode 100644 docs/pages/blog/turbo-1-8-0.mdx create mode 100644 docs/pages/blog/turbo-1-9-0.mdx create mode 100644 docs/pages/blog/turbopack-benchmarks.mdx create mode 100644 docs/pages/blog/you-might-not-need-typescript-project-references.mdx create mode 100644 docs/pages/confirm.mdx create mode 100644 docs/pages/pack/_meta.json create mode 100644 docs/pages/pack/docs/_meta.json create mode 100644 docs/pages/pack/docs/advanced/profiling.mdx create mode 100644 docs/pages/pack/docs/benchmarks.mdx create mode 100644 docs/pages/pack/docs/comparisons/BenchmarksCallout.tsx create mode 100644 docs/pages/pack/docs/comparisons/_meta.json create mode 100644 docs/pages/pack/docs/comparisons/vite.mdx create mode 100644 docs/pages/pack/docs/comparisons/webpack.mdx create mode 100644 docs/pages/pack/docs/core-concepts.mdx create mode 100644 
docs/pages/pack/docs/features.mdx create mode 100644 docs/pages/pack/docs/features/_meta.json create mode 100644 docs/pages/pack/docs/features/css.mdx create mode 100644 docs/pages/pack/docs/features/customizing-turbopack.mdx create mode 100644 docs/pages/pack/docs/features/dev-server.mdx create mode 100644 docs/pages/pack/docs/features/environment-variables.mdx create mode 100644 docs/pages/pack/docs/features/frameworks.mdx create mode 100644 docs/pages/pack/docs/features/imports.mdx create mode 100644 docs/pages/pack/docs/features/javascript.mdx create mode 100644 docs/pages/pack/docs/features/static-assets.mdx create mode 100644 docs/pages/pack/docs/features/typescript.mdx create mode 100644 docs/pages/pack/docs/index.mdx create mode 100644 docs/pages/pack/docs/migrating-from-webpack.mdx create mode 100644 docs/pages/pack/docs/roadmap.mdx create mode 100644 docs/pages/pack/docs/why-turbopack.mdx create mode 100644 docs/pages/pack/index.mdx create mode 100644 docs/pages/privacy.mdx create mode 100644 docs/pages/repo/_meta.json create mode 100644 docs/pages/repo/docs/_meta.json create mode 100644 docs/pages/repo/docs/acknowledgements.mdx create mode 100644 docs/pages/repo/docs/ci.mdx create mode 100644 docs/pages/repo/docs/ci/_meta.json create mode 100644 docs/pages/repo/docs/ci/circleci.mdx create mode 100644 docs/pages/repo/docs/ci/github-actions.mdx create mode 100644 docs/pages/repo/docs/ci/gitlabci.mdx create mode 100644 docs/pages/repo/docs/ci/travisci.mdx create mode 100644 docs/pages/repo/docs/core-concepts/_meta.json create mode 100644 docs/pages/repo/docs/core-concepts/caching.mdx create mode 100644 docs/pages/repo/docs/core-concepts/monorepos.mdx create mode 100644 docs/pages/repo/docs/core-concepts/monorepos/_meta.json create mode 100644 docs/pages/repo/docs/core-concepts/monorepos/configuring-workspaces.mdx create mode 100644 docs/pages/repo/docs/core-concepts/monorepos/filtering.mdx create mode 100644 
docs/pages/repo/docs/core-concepts/monorepos/running-tasks.mdx create mode 100644 docs/pages/repo/docs/core-concepts/monorepos/skipping-tasks.mdx create mode 100644 docs/pages/repo/docs/core-concepts/remote-caching.mdx create mode 100644 docs/pages/repo/docs/core-concepts/scopes.mdx create mode 100644 docs/pages/repo/docs/faq.mdx create mode 100644 docs/pages/repo/docs/getting-started/_meta.json create mode 100644 docs/pages/repo/docs/getting-started/add-to-project.mdx create mode 100644 docs/pages/repo/docs/getting-started/create-new.mdx create mode 100644 docs/pages/repo/docs/getting-started/existing-monorepo.mdx create mode 100644 docs/pages/repo/docs/getting-started/from-example.mdx create mode 100644 docs/pages/repo/docs/handbook.mdx create mode 100644 docs/pages/repo/docs/handbook/_meta.json create mode 100644 docs/pages/repo/docs/handbook/building-your-app.mdx create mode 100644 docs/pages/repo/docs/handbook/deploying-with-docker.mdx create mode 100644 docs/pages/repo/docs/handbook/dev.mdx create mode 100644 docs/pages/repo/docs/handbook/environment-variables.mdx create mode 100644 docs/pages/repo/docs/handbook/linting.mdx create mode 100644 docs/pages/repo/docs/handbook/linting/_meta.json create mode 100644 docs/pages/repo/docs/handbook/linting/eslint.mdx create mode 100644 docs/pages/repo/docs/handbook/linting/typescript.mdx create mode 100644 docs/pages/repo/docs/handbook/migrating-to-a-monorepo.mdx create mode 100644 docs/pages/repo/docs/handbook/package-installation.mdx create mode 100644 docs/pages/repo/docs/handbook/publishing-packages.mdx create mode 100644 docs/pages/repo/docs/handbook/publishing-packages/_meta.json create mode 100644 docs/pages/repo/docs/handbook/publishing-packages/bundling.mdx create mode 100644 docs/pages/repo/docs/handbook/publishing-packages/versioning-and-publishing.mdx create mode 100644 docs/pages/repo/docs/handbook/sharing-code.mdx create mode 100644 docs/pages/repo/docs/handbook/sharing-code/_meta.json create mode 100644 
docs/pages/repo/docs/handbook/sharing-code/internal-packages.mdx create mode 100644 docs/pages/repo/docs/handbook/testing.mdx create mode 100644 docs/pages/repo/docs/handbook/tools/_meta.json create mode 100644 docs/pages/repo/docs/handbook/tools/prisma.mdx create mode 100644 docs/pages/repo/docs/handbook/tools/storybook.mdx create mode 100644 docs/pages/repo/docs/handbook/troubleshooting.mdx create mode 100644 docs/pages/repo/docs/handbook/what-is-a-monorepo.mdx create mode 100644 docs/pages/repo/docs/handbook/workspaces.mdx create mode 100644 docs/pages/repo/docs/index.mdx create mode 100644 docs/pages/repo/docs/installing.mdx create mode 100644 docs/pages/repo/docs/reference/_meta.json create mode 100644 docs/pages/repo/docs/reference/codemods.mdx create mode 100644 docs/pages/repo/docs/reference/command-line-reference.mdx create mode 100644 docs/pages/repo/docs/reference/configuration.mdx create mode 100644 docs/pages/repo/docs/troubleshooting.mdx create mode 100644 docs/pages/repo/docs/upgrading-to-v1.mdx create mode 100644 docs/pages/repo/index.mdx create mode 100644 docs/pages/showcase.mdx create mode 100644 docs/pages/terms.mdx (limited to 'docs/pages') diff --git a/docs/pages/_app.tsx b/docs/pages/_app.tsx new file mode 100644 index 0000000..fc61810 --- /dev/null +++ b/docs/pages/_app.tsx @@ -0,0 +1,47 @@ +import "../styles.css"; +import "../custom.css"; + +import { SSRProvider } from "@react-aria/ssr"; +import type { AppProps } from "next/app"; +import type { ReactNode } from "react"; +import { Analytics } from "@vercel/analytics/react"; + +type NextraAppProps = AppProps & { + Component: AppProps["Component"] & { + getLayout: (page: ReactNode) => ReactNode; + }; +}; + +// Shim requestIdleCallback in Safari +if (typeof window !== "undefined" && !("requestIdleCallback" in window)) { + window.requestIdleCallback = (fn) => setTimeout(fn, 1); + window.cancelIdleCallback = (e) => clearTimeout(e); +} + +export default function Nextra({ Component, pageProps }: 
NextraAppProps) { + return ( + + <> + {/** + * Globally defined svg linear gradient, for use in icons + */} + + + + + + + + + + + + + ); +} diff --git a/docs/pages/_document.tsx b/docs/pages/_document.tsx new file mode 100644 index 0000000..54b5e72 --- /dev/null +++ b/docs/pages/_document.tsx @@ -0,0 +1,54 @@ +import Document, { Html, Head, Main, NextScript } from "next/document"; +import type { DocumentInitialProps, DocumentContext } from "next/document"; + +class MyDocument extends Document { + static async getInitialProps( + ctx: DocumentContext + ): Promise { + const initialProps = await Document.getInitialProps(ctx); + + return initialProps; + } + + render() { + return ( + + + + + + + {/* eslint-disable-next-line @next/next/google-font-preconnect */} + + {/* eslint-disable-next-line @next/next/google-font-preconnect */} + + + +
+ + + + ); + } +} + +export default MyDocument; diff --git a/docs/pages/_error.js b/docs/pages/_error.js new file mode 100644 index 0000000..129bf1a --- /dev/null +++ b/docs/pages/_error.js @@ -0,0 +1,17 @@ +import * as Sentry from "@sentry/nextjs"; +import NextErrorComponent from "next/error"; + +const CustomErrorComponent = (props) => { + return ; +}; + +CustomErrorComponent.getInitialProps = async (contextData) => { + // In case this is running in a serverless function, await this in order to give Sentry + // time to send the error before the lambda exits + await Sentry.captureUnderscoreErrorException(contextData); + + // This will contain the status code of the response + return NextErrorComponent.getInitialProps(contextData); +}; + +export default CustomErrorComponent; diff --git a/docs/pages/_meta.json b/docs/pages/_meta.json index 164927d..09675ab 100644 --- a/docs/pages/_meta.json +++ b/docs/pages/_meta.json @@ -1,15 +1,34 @@ { - "index": "Introduction", - "another": "Another Page", - "advanced": "Advanced (A Folder)", - "about": { - "title": "About", + "*": { "type": "page" }, - "contact": { - "title": "Contact β†—", - "type": "page", - "href": "https://twitter.com/shuding_", - "newWindow": true + "index": { + "display": "hidden", + "theme": { + "layout": "raw", + "sidebar": false, + "toc": true + } + }, + "repo": "Repo", + "pack": "Pack", + "blog": { + "title": "Blog", + "theme": { + "typesetting": "article" + } + }, + "showcase": "Showcase", + "confirm": { + "title": "Confirm", + "display": "hidden" + }, + "terms": { + "title": "Terms of Service", + "display": "hidden" + }, + "privacy": { + "title": "Privacy Policy", + "display": "hidden" } } diff --git a/docs/pages/about.mdx b/docs/pages/about.mdx deleted file mode 100644 index ec6faed..0000000 --- a/docs/pages/about.mdx +++ /dev/null @@ -1,3 +0,0 @@ -# About - -This is the about page! This page is shown on the navbar. 
diff --git a/docs/pages/advanced.mdx b/docs/pages/advanced.mdx deleted file mode 100644 index a1a5148..0000000 --- a/docs/pages/advanced.mdx +++ /dev/null @@ -1,3 +0,0 @@ -# Advanced - -This is the index page for the Advanced folder! diff --git a/docs/pages/advanced/satori.mdx b/docs/pages/advanced/satori.mdx deleted file mode 100644 index 46eb19f..0000000 --- a/docs/pages/advanced/satori.mdx +++ /dev/null @@ -1,3 +0,0 @@ -# Satori - -Satori (ζ‚Ÿγ‚Š) is a Japanese Buddhist term for awakening, "comprehension; understanding". diff --git a/docs/pages/another.mdx b/docs/pages/another.mdx deleted file mode 100644 index 7fb6a02..0000000 --- a/docs/pages/another.mdx +++ /dev/null @@ -1,31 +0,0 @@ -# Another Page - -```js filename="demo.js" {3} copy -let a = 1; - -console.log(a); -``` - -## Component - -import { useState } from 'react' - -{/* Import CSS modules */} -import styles from '../components/counters.module.css' - -export const Counter = () => { - const [count, setCount] = useState(0); - return ( -
- -
- ); -}; - - - -## External Component - -import Counters from '../components/counters' - - diff --git a/docs/pages/api/binaries/version.ts b/docs/pages/api/binaries/version.ts new file mode 100644 index 0000000..339a34d --- /dev/null +++ b/docs/pages/api/binaries/version.ts @@ -0,0 +1,113 @@ +import type { NextRequest } from "next/server"; + +const REGISTRY = "https://registry.npmjs.org"; +const DEFAULT_TAG = "latest"; +const SUPPORTED_PACKAGES = ["turbo"]; +const SUPPORTED_METHODS = ["GET"]; +const [DEFAULT_NAME] = SUPPORTED_PACKAGES; + +async function fetchDistTags({ name }: { name: string }) { + const result = await fetch(`${REGISTRY}/${name}`); + const json = await result.json(); + return json["dist-tags"]; +} + +function errorResponse({ + status, + message, +}: { + status: 400 | 404 | 500; + message: string; +}) { + return new Response( + JSON.stringify({ + error: message, + }), + { + status, + } + ); +} + +/* +This API is called via the turbo rust binary to check for version updates. 
+ +Response Schema (200): +{ + "type": "object", + "properties": { + "name": { + "type": "string", + }, + "version": { + "type": "string", + }, + "tag": { + "type": "string", + } + } +} + +Errors (400 | 404 | 500): +{ + "type": "object", + "properties": { + "error": { + "type": "string", + } + } +} + +*/ +export default async function handler(req: NextRequest) { + if (!SUPPORTED_METHODS.includes(req.method)) { + return errorResponse({ + status: 404, + message: `unsupported method - ${req.method}`, + }); + } + + try { + const { searchParams } = new URL(req.url); + const name = searchParams.get("name") || DEFAULT_NAME; + const tag = searchParams.get("tag") || DEFAULT_TAG; + + if (!SUPPORTED_PACKAGES.includes(name)) { + return errorResponse({ + status: 400, + message: `unsupported package - ${name}`, + }); + } + + const versions = await fetchDistTags({ name }); + if (!versions || !versions[tag]) { + return errorResponse({ + status: 404, + message: `unsupported tag - ${tag}`, + }); + } + + return new Response( + JSON.stringify({ + name, + version: versions[tag], + tag, + }), + { + status: 200, + headers: { + "content-type": "application/json", + // cache for 15 minutes, and allow stale responses for 5 minutes + "cache-control": "public, s-maxage=900, stale-while-revalidate=300", + }, + } + ); + } catch (e) { + console.error(e); + return errorResponse({ status: 500, message: e.message }); + } +} + +export const config = { + runtime: "experimental-edge", +}; diff --git a/docs/pages/api/og.tsx b/docs/pages/api/og.tsx new file mode 100644 index 0000000..196ca72 --- /dev/null +++ b/docs/pages/api/og.tsx @@ -0,0 +1,167 @@ +import React, { createElement } from "react"; +import { ImageResponse } from "@vercel/og"; + +import PackLogo from "../../components/logos/og/PackLogo"; +import RepoLogo from "../../components/logos/og/RepoLogo"; +import TurboLogo from "../../components/logos/og/TurboLogo"; +import VercelLogo from "../../components/logos/og/VercelLogo"; + +import type { 
NextApiRequest } from "next/index"; + +function _arrayBufferToBase64(buffer) { + var binary = ""; + var bytes = new Uint8Array(buffer); + var len = bytes.byteLength; + for (var i = 0; i < len; i++) { + binary += String.fromCharCode(bytes[i]); + } + return btoa(binary); +} + +async function loadAssets(): Promise< + [ + { name: string; data: ArrayBuffer; weight: 400 | 700; style: "normal" }[], + string + ] +> { + const [inter, spaceMono, bg] = await Promise.all([ + fetch( + String(new URL("../../assets/inter-v12-latin-700.ttf", import.meta.url)) + ).then((res) => res.arrayBuffer()), + fetch( + String( + new URL( + "../../assets/space-mono-v12-latin-regular.ttf", + import.meta.url + ) + ) + ).then((res) => res.arrayBuffer()), + fetch(String(new URL("../../assets/bg.jpeg", import.meta.url))).then( + (res) => res.arrayBuffer() + ), + ]); + return [ + [ + { + name: "Inter", + data: inter, + weight: 700 as const, + style: "normal" as const, + }, + { + name: "Space Mono", + data: spaceMono, + weight: 400 as const, + style: "normal" as const, + }, + ], + _arrayBufferToBase64(bg), + ]; +} + +export const config = { + runtime: "experimental-edge", +}; + +export default async function openGraphImage( + req: NextApiRequest +): Promise { + try { + const [fonts, bg] = await loadAssets(); + const { searchParams } = new URL(req.url); + + const type = searchParams.get("type"); + + // ?title= + const hasTitle = searchParams.has("title"); + const title = hasTitle + ? searchParams.get("title")?.slice(0, 100) + : type === "pack" + ? "The successor to Webpack" + : type === "repo" + ? 
"The build system that makes ship happen" + : ""; + + return new ImageResponse(createElement(OGImage, { title, type, bg }), { + width: 1200, + height: 630, + fonts, + }); + } catch (e: unknown) { + return new Response(undefined, { + status: 302, + headers: { + Location: "https://turbo.build/og-image.png", + }, + }); + } +} + +export function OGImage({ + title, + type, + bg, +}: { + title: string; + type: string; + bg: string; +}): JSX.Element { + return ( + <div + style={{ + display: "flex", + flexDirection: "column", + alignItems: "center", + justifyContent: "center", + width: "100%", + height: "100%", + fontFamily: "Inter", + fontWeight: 700, + fontSize: 60, + backgroundImage: `url(data:image/jpeg;base64,${bg})`, + backgroundSize: "1200px 630px", + color: "#fff", + }} + > + {/* eslint-disable-next-line @next/next/no-img-element, jsx-a11y/alt-text */} + <div style={{ display: "flex", height: 97 * 1.1, alignItems: "center" }}> + {type === "pack" ? ( + <PackLogo height={103 * 1.1} width={697 * 1.1} /> + ) : type === "repo" ? ( + <RepoLogo height={83 * 1.1} width={616 * 1.1} /> + ) : ( + <TurboLogo height={97 * 1.1} width={459 * 1.1} /> + )} + </div> + {title ? 
( + <div + style={{ + fontFamily: "Space Mono", + fontSize: 36, + letterSpacing: -1.5, + padding: "15px 20px 30px", + textAlign: "center", + backgroundImage: "linear-gradient(to bottom, #fff, #aaa)", + backgroundClip: "text", + color: "transparent", + }} + > + {title} + </div> + ) : null} + <div + style={{ + fontFamily: "Space Mono", + fontSize: 18, + marginTop: 80, + display: "flex", + color: "#fff", + alignItems: "center", + }} + > + <div style={{ marginRight: 12 }}>by</div> + <VercelLogo fill="white" height={30} /> + </div> + </div> + ); +} diff --git a/docs/pages/api/signup.tsx b/docs/pages/api/signup.tsx new file mode 100644 index 0000000..0a082ee --- /dev/null +++ b/docs/pages/api/signup.tsx @@ -0,0 +1,33 @@ +import { NextApiRequest, NextApiResponse } from "next"; +import { withSentry } from "@sentry/nextjs"; + +const CAMPAIGN_ID = process.env.TURBOREPO_SFDC_CAMPAIGN_ID; +const TRAY_URL = process.env.TRAY_URL; + +async function handler(req: NextApiRequest, res: NextApiResponse) { + if (req.method === "POST") { + const user = { + email: req.body.email, + campaign_id: CAMPAIGN_ID, + }; + + try { + const trayRes = await fetch(TRAY_URL, { + method: "POST", + headers: { + "Content-Type": "application/json", + Accept: "application/json", + }, + body: JSON.stringify({ user: user }), + }); + + return res.status(201).json(user); + } catch (error) { + return res.status(500).json(error); + } + } else { + return res.status(404).send(null); + } +} + +export default withSentry(handler); diff --git a/docs/pages/api/user/[id].tsx b/docs/pages/api/user/[id].tsx new file mode 100644 index 0000000..091d716 --- /dev/null +++ b/docs/pages/api/user/[id].tsx @@ -0,0 +1,36 @@ +import { NextApiRequest, NextApiResponse } from "next"; +import { withSentry } from "@sentry/nextjs"; +import { + getSubscriber, + Subscriber, + updateSubscriber, +} from "../../../lib/ConvertKitApi"; + +async function handler(req: NextApiRequest, res: NextApiResponse) { + try { + if (req.method === "PUT") { + 
const subscriber = await updateSubscriber( + req.query.id as string, + { + first_name: req.body.first_name, + email_address: req.body.email_address, + fields: req.body.fields, + } as Subscriber + ); + res.setHeader("Content-Type", "application/json"); + res.statusCode = 204; + res.json(subscriber); + } else { + const subscriber = await getSubscriber(req.query.id as string); + res.setHeader("Content-Type", "application/json"); + res.statusCode = 200; + res.json(subscriber); + } + } catch (error) { + console.log(error); + res.statusCode = 500; + res.json({ okay: false }); + } +} + +export default withSentry(handler); diff --git a/docs/pages/blog.mdx b/docs/pages/blog.mdx new file mode 100644 index 0000000..dddad2e --- /dev/null +++ b/docs/pages/blog.mdx @@ -0,0 +1,44 @@ +--- +description: The latest updates and releases from the Turbo team at Vercel. +type: posts +--- + +import { getPagesUnderRoute } from "nextra/context"; +import Link from "next/link"; + +export function BlogHeader() { + return ( + <div className="max-w-screen-lg mx-auto pt-4 pb-8 mb-16 border-b border-gray-400 border-opacity-20"> + <h1> + <span className="font-bold leading-tight lg:text-5xl">Blog</span> + </h1> + <p className="text-center text-gray-500 dark:text-gray-400 font-space-grotesk"> + The latest updates and releases from the Turbo team at Vercel. + </p> + </div> + ); +} + +export function BlogIndex() { + return getPagesUnderRoute("/blog").map((page) => { + return ( + <div key={page.route} className="mb-10"> + <Link href={page.route} style={{ color: "inherit", textDecoration: "none" }} className="block font-semibold mt-8 text-2xl"> + {page.meta?.title || page.frontMatter?.title || page.name} + </Link> + <p className="opacity-80" style={{ marginTop: ".5rem" }}> + {page.frontMatter?.description}{" "} + <span className="inline-block"> + <Link href={page.route}>{"Read more β†’"}</Link> + </span> + </p> + {page.frontMatter?.date ? 
( + <p className="opacity-50 text-sm">{page.frontMatter.date}</p> + ) : null} + </div> + ); + }); +} + +<BlogHeader /> +<BlogIndex /> diff --git a/docs/pages/blog/_meta.json b/docs/pages/blog/_meta.json new file mode 100644 index 0000000..3de89ce --- /dev/null +++ b/docs/pages/blog/_meta.json @@ -0,0 +1,26 @@ +{ + "*": { + "//": "Hide all the links from the sidebar or navbar, and disable ToC and Sidebar for these pages.", + "display": "hidden", + "theme": { + "toc": false, + "sidebar": false, + "pagination": false, + "typesetting": "article" + } + }, + "turbo-1-9-0": "Turborepo 1.9", + "turbo-1-8-0": "Turborepo 1.8", + "turbo-1-7-0": "Turborepo 1.7", + "turbopack-benchmarks": "Turbopack Performance Benchmarks", + "turbo-1-6-0": "Turborepo 1.6", + "turbo-1-5-0": "Turborepo 1.5", + "turbo-1-4-0": "Turborepo 1.4", + "turbo-1-3-0": "Turborepo 1.3", + "turbo-1-2-0": "Turborepo 1.2", + "turbo-1-1-0": "Turborepo 1.1", + "joining-vercel": "Joining Vercel", + "saml-sso-now-available": "SAML SSO now available", + "you-might-not-need-typescript-project-references": "You might not need TypeScript project references", + "turbo-0-4-0": "Turborepo 0.4.0" +} diff --git a/docs/pages/blog/joining-vercel.mdx b/docs/pages/blog/joining-vercel.mdx new file mode 100644 index 0000000..f84ec49 --- /dev/null +++ b/docs/pages/blog/joining-vercel.mdx @@ -0,0 +1,23 @@ +--- +title: Turborepo is joining Vercel +date: 2021/12/09 +description: Turborepo is joining Vercel to make the web even faster. +tag: web development +ogImage: /images/blog/joining-vercel/twitter-card.png +--- + +# Turborepo is joining Vercel + +import { Authors } from "../../components/Authors"; + +<Authors authors={["jaredpalmer"]} /> + +Turborepo has been acquired by Vercel and the Turborepo CLI is now open-source! Also, Turborepo now provides zero-config remote caching throughΒ [Vercel](https://vercel.com/?utm_source=turbo.build&utm_medium=referral&utm_campaign=docs-link)! 
+
+beta.turborepo.com and its remote caching service will be shut down on January 15th, 2022, and older versions of the `turbo` CLI will not be installable. Learn more about how to upgrade your CLI and migrate to Vercel [here](/repo/docs/upgrading-to-v1).
+
+This is a milestone moment for the project and for all of you who have supported and adopted Turborepo. With Vercel's infrastructure and team backing, I'll expand on the capabilities of Turborepo and build out a team focused on improving their world-class build system. I can't wait to bring you along for this [next chapter](https://vercel.com/blog/vercel-acquires-turborepo).
+
+Join me this [Friday, December 10 at 4:00 p.m. ET | 9:00 p.m. GMT for a livestream Q&A](https://www.youtube.com/watch?v=YX5yoApjI3M) with Vercel's Head of Developer Relations, Lee Robinson. We'll go over what's in store for Turborepo and Vercel as we work toward improving the developer experience together.
+
+This is just the beginning. We're about to embark on a world of even faster builds, even more flexibility, and even better workflows. Thanks for joining us on this amazing journey.
diff --git a/docs/pages/blog/saml-sso-now-available.mdx b/docs/pages/blog/saml-sso-now-available.mdx
new file mode 100644
index 0000000..0014cfc
--- /dev/null
+++ b/docs/pages/blog/saml-sso-now-available.mdx
@@ -0,0 +1,52 @@
+---
+title: SAML SSO is now available
+date: 2021/08/03
+description: SAML Single Sign-on (SSO) is now available to Enterprise customers thanks to our friends over at WorkOS.
+tag: web development
+ogImage: /images/blog/saml-sso-now-available/twitter-card.png
+---
+
+# SAML SSO is now available
+
+import { Authors } from "../../components/Authors";
+
+<Authors authors={["jaredpalmer"]} />
+
+Thanks to our friends over at [WorkOS](https://workos.com/), in addition to GitHub, GitLab, and Passwordless auth, Turborepo now supports Single Sign-on (SSO) from the following SAML providers for enterprise customers:
+
+- AD FS SAML
+- Auth0 SAML
+- Azure AD SAML
+- CyberArk SAML
+- Generic SAML
+- G Suite OAuth
+- G Suite SAML
+- JumpCloud SAML
+- Microsoft OAuth
+- Okta SAML
+- OneLogin SAML
+- OpenID Connect
+- PingFederate SAML
+- PingOne SAML
+- Shibboleth
+- VMWare SAML
+
+We also support SCIM (a.k.a. "Directory Sync") via:
+
+- Azure AD SCIM
+- BambooHR
+- G Suite
+- Gusto
+- Hibob
+- Okta SCIM v1.1
+- Okta SCIM v2.0
+- Rippling
+- SCIM v1.1
+- SCIM v2.0
+- Workday
+
+If your team is interested in activating SAML SSO, please contact us at [turborepo@vercel.com](mailto:turborepo@vercel.com).
+
+## What's next?
+
+We take security extremely seriously. While SSO is certainly a must in 2021, we are also currently undergoing a third-party SOC 2 Type 2 examination. Last but not least, we are adding biometric/U2F access controls and audit logs this fall. Stay tuned for updates.
diff --git a/docs/pages/blog/turbo-0-4-0.mdx b/docs/pages/blog/turbo-0-4-0.mdx
new file mode 100644
index 0000000..c4b92ba
--- /dev/null
+++ b/docs/pages/blog/turbo-0-4-0.mdx
@@ -0,0 +1,211 @@
+---
+title: Turborepo 0.4.0
+date: 2021/04/02
+description: Turborepo 0.4.0 introduces 10x faster hashing, pruned workspaces with sparse installs, a new pipeline configuration API, and improved cache control.
+tag: web development
+ogImage: /images/blog/joining-vercel/twitter-card.png
+---
+
+# Turborepo 0.4.0
+
+import { Authors } from "../../components/Authors";
+
+<Authors authors={["jaredpalmer"]} />
+
+I'm excited to announce the release of Turborepo v0.4.0!
+
+- **10x faster**: `turbo` has been rewritten from the ground up in Go to make it even more blazing fast
+- **Smarter hashing**: Improved hashing algorithm now considers resolved dependencies instead of just the contents of the entire root lockfile
+- **Partial lockfiles / sparse installs**: Generate a pruned subset of your root lockfile and monorepo that includes only the necessary packages needed for a given target
+- **Fine-grained scheduling**: Improved task orchestration and options via `pipeline` configuration
+- **Better cache control**: You can now specify cache outputs on a per-task basis
+
+## Rewritten in Go
+
+Although I initially prototyped `turbo` in TypeScript, it became clear that certain items on the roadmap would require better performance. After around a month or so of work, I'm excited to finally release the Go version of the `turbo` CLI. Not only does it boot in milliseconds, but the new Go implementation is somewhere between 10x and 100x faster at hashing than the Node.js implementation. With this new foundation (and some features you're about to read about), Turborepo can now scale to intergalactic-sized projects while remaining blazing fast, all thanks to Go's awesome concurrency controls.
+
+## Better Hashing
+
+Not only is hashing faster in v0.4.0, but it is also _a lot_ smarter.
+
+The major change is that `turbo` no longer includes the hash of the contents of the root lockfile in its hasher (the algorithm responsible for determining if a given task exists in the cache or needs to be executed). Instead, `turbo` now hashes the set of the resolved versions of a package's `dependencies` and `devDependencies` based on the root lockfile.
+
+The old behavior would explode the cache whenever the root lockfile changed in any way. With this new behavior, changing the lockfile will only bust the cache for those packages impacted by the added/changed/removed dependencies.
While this sounds complicated, all it means is that when you install/remove/update dependencies from npm, only those packages that are actually impacted by the changes will need to be rebuilt.
+
+## Experimental: Pruned Workspaces
+
+One of our biggest customer pain points/requests has been improving Docker build times when working with large Yarn Workspaces (or really any workspace implementation). The core issue is that workspaces' best feature--reducing your monorepo to a single lockfile--is also its worst when it comes to Docker layer caching.
+
+To help articulate the problem and how `turbo` now solves it, let's look at an example.
+
+Say we have a monorepo with Yarn workspaces that includes a set of packages called `frontend`, `admin`, `ui`, and `backend`. Let's also assume that `frontend` and `admin` are Next.js applications that both depend on the same internal React component library package `ui`. Finally, let's say that `backend` contains an Express TypeScript REST API that doesn't share much code with any other part of our monorepo.
+
+Here's what the Dockerfile for the `frontend` Next.js app might look like:
+
+```docker {7}
+FROM node:alpine AS base
+RUN apk update
+WORKDIR /app
+
+# Add lockfile and package.jsons
+FROM base AS builder
+COPY *.json yarn.lock ./
+COPY packages/ui/*.json ./packages/ui/
+COPY packages/frontend/*.json ./packages/frontend/
+RUN yarn install
+
+# Copy source files
+COPY packages/ui/ ./packages/ui/
+COPY packages/frontend/ ./packages/frontend/
+
+# Build
+RUN yarn --cwd=packages/ui/ build
+RUN yarn --cwd=packages/frontend/ build
+
+# Start the Frontend Next.js application
+EXPOSE 3000
+CMD ["yarn", "--cwd", "packages/frontend", "start"]
+```
+
+While this works, there are some things that could be a lot better:
+
+- You manually `COPY` in the internal packages and files needed to build the target app and need to remember which need to be built first.
+- You `COPY` the root `yarn.lock` lockfile into the correct position very early in the Dockerfile, but this lockfile is the lockfile for the _entire_ monorepo.
+
+This last issue is especially painful as your monorepo gets larger and larger because any change to this lockfile triggers a nearly full rebuild regardless of whether or not the app is actually impacted by the new/changed dependencies.
+
+...until now.
+
+With the all-new `turbo prune` command, you can now fix this nightmare by deterministically generating a sparse/partial monorepo with a pruned lockfile for a target package--without installing your `node_modules`.
+
+Let's look at how to use `turbo prune` inside of Docker.
+
+```docker {11,16}
+FROM node:alpine AS base
+RUN apk update && apk add git
+
+## Globally install `turbo`
+RUN npm i -g turbo
+
+# Prune the workspace for the `frontend` app
+FROM base AS pruner
+WORKDIR /app
+COPY . .
+RUN turbo prune --scope=frontend --docker
+
+# Add pruned lockfile and package.json's of the pruned subworkspace
+FROM base AS installer
+WORKDIR /app
+COPY --from=pruner /app/out/json/ .
+COPY --from=pruner /app/out/yarn.lock ./yarn.lock
+# Install only the deps needed to build the target
+RUN yarn install
+
+# Copy source code of pruned subworkspace and build
+FROM base AS builder
+WORKDIR /app
+COPY --from=pruner /app/.git ./.git
+COPY --from=pruner /app/out/full/ .
+COPY --from=installer /app/ .
+RUN turbo run build --scope=frontend
+
+# Start the app
+FROM builder AS runner
+EXPOSE 3000
+CMD ["yarn", "--cwd", "packages/frontend", "start"]
+```
+
+So what exactly is the output of `turbo prune`?
A folder called `out` with the following inside of it:
+
+- A folder `json` with the pruned workspace's package.jsons
+- A folder `full` with the pruned workspace's full source code, but only including the internal packages that are needed to build the target
+- A _new_ pruned lockfile that only contains the pruned subset of the original root lockfile with the dependencies that are actually used by the packages in the pruned workspace.
+
+Thanks to the above, Docker can now be set up to only rebuild each application when there is a real reason to do so. So `frontend` will only rebuild when its source or dependencies (either internal or from npm) have actually changed. The same goes for `admin` and `backend`. Changes to `ui`, either to its source code or dependencies, will trigger rebuilds of both `frontend` and `admin`, but _not_ `backend`.
+
+While this example seems trivial, just imagine if each app takes up to 20 minutes to build and deploy. These savings really start to add up quickly, especially on large teams.
+
+## Pipelines
+
+To give you even more control over your Turborepo, we've added `pipeline` to `turbo`'s configuration. This new field lets you specify how the npm scripts in your monorepo relate to each other, as well as some additional per-task options. `turbo` then uses this information to optimally schedule your tasks in your monorepo, collapsing waterfalls that would otherwise exist.
+
+Here's how it works:
+
+```json
+// <root>/package.json
+{
+  "turbo": {
+    "pipeline": {
+      "build": {
+        // This `^` tells `turbo` that this pipeline target relies on a topological target being completed.
+        // In English, this reads as: "this package's `build` command depends on its dependencies' or
+        // devDependencies' `build` command being completed"
+        "dependsOn": ["^build"]
+      },
+      "test": {
+        // `dependsOn` without `^` can be used to express the relationships between tasks at the package level.
+        // In English, this reads as: "this package's `test` command depends on its `lint` and `build` commands first being completed"
+        "dependsOn": ["lint", "build"]
+      },
+      "lint": {},
+      "dev": {}
+    }
+  }
+}
+```
+
+The above config would then be interpreted by `turbo` to optimally schedule execution.
+
+What does that actually mean? In the past (as with Lerna and Nx), `turbo` could only run tasks in topological order. With the addition of pipelines, `turbo` now constructs a topological "action" graph in addition to the actual dependency graph, which it uses to determine the order in which tasks should be executed with maximum concurrency. The end result is that you no longer waste idle CPU time waiting around for stuff to finish (i.e. no more waterfalls).
+
+![Turborepo scheduler](/images/blog/turbo-0-4-0/turbo-vs-lerna-execution.png)
+
+## Improved Cache Control
+
+Thanks to `pipeline`, we now have a great place to open up `turbo`'s cache behavior on a per-task basis.
+
+Building on the example from above, you can now set cache output conventions across your entire monorepo like so:
+
+```json
+// <root>/package.json
+{
+  "turbo": {
+    "pipeline": {
+      "build": {
+        // Cache anything in dist or .next directories emitted by a `build` command
+        "outputs": ["dist/**", ".next/**", "!.next/cache/**"],
+        "dependsOn": ["^build"]
+      },
+      "test": {
+        // Cache the test coverage report
+        "outputs": ["coverage/**"],
+        "dependsOn": ["lint", "build"]
+      },
+      "dev": {
+        // Never cache the `dev` command
+        "cache": false
+      },
+      "lint": {}
+    }
+  }
+}
+```
+
+_Note: Right now, `pipeline` exists at the project level, but in later releases these settings will be overridable on a per-package basis._
+
+## What's Next?
+
+I know this was a lot, but there's even more to come. Here's what's up next on the Turborepo roadmap:
+
+- A landing page!
+
+- [Remote caching w/ `@turborepo/server`](https://twitter.com/jaredpalmer/status/1359627800840462341)
+- Build scans, telemetry, metrics, and dependency and task graph visualization
+- [Desktop Console UI](https://twitter.com/jaredpalmer/status/1360315387372572672)
+- Intelligent `watch` mode
+- Official build rules for TypeScript, React, Jest, Node.js, Docker, Kubernetes, and more
+
+## Credits
+
+- [Iheanyi Ekechukwu](https://twitter.com/kwuchu) for guiding me through the Go ecosystem
+- [Miguel Oller](https://twitter.com/ollermi) and the team from [Makeswift](https://www.makeswift.com/) for iterating on the new `prune` command
diff --git a/docs/pages/blog/turbo-1-1-0.mdx b/docs/pages/blog/turbo-1-1-0.mdx
new file mode 100644
index 0000000..d111a09
--- /dev/null
+++ b/docs/pages/blog/turbo-1-1-0.mdx
@@ -0,0 +1,141 @@
+---
+title: Turborepo 1.1
+date: 2022/01/31
+description: Turborepo 1.1 introduces automatic migrations, turbo.json configuration, environment variable dependencies, partial Yarn v2 support, and more!
+tag: web development +ogImage: /images/blog/turbo-1-1-0/twitter-card.png +--- + +# Turborepo 1.1 + +import { Authors } from "../../components/Authors"; + +<Authors authors={["jaredpalmer", "becca__z", "gaspargarcia", "gregsoltis"]} /> + +Since releasing Turborepo v1.0 in mid-December, we've seen incredible adoption: + +- 5.5k+ GitHub Stars +- 70k+ weekly npm downloads +- 65+ OSS contributors +- In production at [Vercel](https://github.com/vercel/next.js), [AWS](https://github.com/aws-amplify/amplify-ui), [PayPal](https://twitter.com/jaredpalmer/status/1485617973477978121), [Twilio](https://github.com/twilio-labs/function-templates), [Contentful](https://github.com/contentful/forma-36), [Miro](https://github.com/miroapp/app-examples), [Framer](https://github.com/framer/motion), [Discord.js](https://github.com/discordjs/discord.js), [Rocket.chat](https://github.com/RocketChat/fuselage), [Astro.build](https://github.com/withastro/astro) +- 585+ members of the [Turborepo Community Discord](https://turbo.build/discord) + +![Weekly npm downloads of `turbo`](/images/blog/turbo-1-1-0/turborepo-weekly-npm-downloads.png) + +We're further improving build performance and caching with Turborepo v1.1, featuring: + +- [**Automatic Migrations:**](#automatic-migrations) Official idempotent migration scripts to assist with upgrading. +- [**`turbo.json` Support:**](#turbojson-support) Turborepo configuration now lives in its own file. +- [**Faster Package Manager Detection:**](#faster-package-manager-detection) Turborepo now respects the `packageManager` key in the root `package.json`. +- [**Environment Variable Dependencies:**](#environment-variable-dependencies) Define how environment variables impact global and task-specific caching. +- [**Partial Support for Yarn v2+:**](#partial-yarn-v2v3-support) Support for yarn v2+ with `nodeLinker: "node-modules"`. + +Update today by running `npm install turbo@latest`. 
After running `turbo`, you'll see instructions about how to use `@turbo/codemod` to run automatic migrations for `v1.1`.
+
+## Automatic Migrations
+
+Turborepo now provides idempotent code transformations and automatic migration scripts (a.k.a. "codemods") to help upgrade your Turborepo codebase when a feature is deprecated or will be deprecated in the future.
+
+Codemods are transformations that run on your codebase programmatically. This saves you time by applying a large number of changes to your code automatically, without having to manually go through and edit every file.
+
+### Usage
+
+```bash
+npx @turbo/codemod <transform> <path>
+```
+
+- `transform` - the name of the transform; [see available transforms in the docs](/repo/docs/reference/codemods#turborepo-1x).
+- `path` - files or directory to transform.
+- `--dry` - Do a dry run; no code will be edited.
+- `--print` - Prints the changed output for comparison.
+
+For more information about specific transforms, check out the [new Codemods documentation](/repo/docs/reference/codemods#turborepo-1x).
+
+## `turbo.json` Support
+
+Turborepo configuration is now defined in a `turbo.json` file in the root of your monorepo. This is an improvement over having a `turbo` key in `package.json` for those who want to quickly jump straight to their Turborepo configuration in their code editors.
+
+To automatically migrate from your current configuration in `package.json`, check out a new branch, navigate to the root of your monorepo, and run the following codemod:
+
+```bash
+npx @turbo/codemod create-turbo-config .
+```
+
+For more information on this transformation, [check out the documentation](/repo/docs/reference/codemods#create-turbo-config).
+
+## Faster Package Manager Detection
+
+Turborepo now supports the recently established `packageManager` field in `package.json` for faster package manager detection. Previously, `turbo` would check for specific files to infer this information.
To automatically set this field, check out a new branch, navigate to the root of your monorepo and run: + +```bash +npx @turbo/codemod add-package-manager . +``` + +For more information on this transformation, [check out the documentation](/repo/docs/reference/codemods#add-package-manager). + +## Environment Variable Dependencies + +When you use `turbo` with tools that inline environment variables at build time (e.g. Next.js or Create React App), it is important you tell `turbo` about it to avoid shipping a cached artifact with the wrong environment variables. + +You can now control `turbo`'s [cache fingerprinting (a.k.a. hashing)](/repo/docs/core-concepts/caching#hashing) behavior based on the values of both environment variables and the contents of files: + +- Including environment variables in a `dependsOn` in your `pipeline` definition prefixed by a `$` will impact the cache fingerprint on a per-task or per-package-task basis. +- Including environment variables in `globalDependencies` list prefixed by a `$` will impact the cache fingerprint of _all_ tasks. +- Including files or globs of files in `globalDependencies` will impact the cache fingerprint of _all_ tasks. +- The value of any environment variable that includes `THASH` in its name will impact the cache fingerprint of _all_ tasks. 
+
+```jsonc
+{
+  "pipeline": {
+    "build": {
+      "dependsOn": [
+        "^build",
+        // env vars will impact hashes of all "build" tasks
+        "$SOME_ENV_VAR"
+      ],
+      "outputs": ["dist/**"]
+    },
+    "web#build": {
+      // override settings for the "build" task for the "web" app
+      "dependsOn": [
+        "^build",
+        // env vars that will impact the hash of the "build" task for only the "web" app
+        "$STRIPE_SECRET_KEY",
+        "$NEXT_PUBLIC_STRIPE_PUBLIC_KEY",
+        "$NEXT_PUBLIC_ANALYTICS_ID"
+      ],
+      "outputs": [".next/**", "!.next/cache/**"]
+    },
+    "docs#build": {
+      // override settings for the "build" task for the "docs" app
+      "dependsOn": [
+        "^build",
+        // env vars that will impact the hash of the "build" task for only the "docs" app
+        "$STRIPE_SECRET_KEY",
+        "$NEXT_PUBLIC_STRIPE_PUBLIC_KEY",
+        "$NEXT_PUBLIC_ANALYTICS_ID"
+      ],
+      "outputs": [".next/**", "!.next/cache/**"]
+    }
+  },
+  "globalDependencies": [
+    "$GITHUB_TOKEN", // env var that will impact the hashes of all tasks
+    "tsconfig.json", // file contents will impact the hashes of all tasks
+    ".env.*" // glob file contents will impact the hashes of all tasks
+  ]
+}
+```
+
+Note: In most monorepos, you don't often use environment variables in shared packages, but mostly in applications. Thus, to get higher cache hit rates, you should only include environment variables in the app-specific tasks where they are used/inlined.
+
+For more information, read the [caching and hashing documentation](/repo/docs/core-concepts/caching).
+
+## Partial Yarn v2/v3 support
+
+In addition to Yarn v1, npm, and pnpm package managers, Turborepo now supports Yarn v2+ with [`nodeLinker: "node-modules"` set in `.yarnrc.yml`](https://yarnpkg.com/configuration/yarnrc#nodeLinker). This key tells Yarn v2+ to mimic Yarn v1's `node_modules` installation behavior. Yarn v2+ Plug'n'Play (a.k.a. "PnP") is not currently supported.
+
+## What's next?
+
+[Since our launch](/blog/joining-vercel), Turborepo has focused on seamless incremental adoption/migration and speeding up CI/CD.
We are committed to both of those values, but now we'll also be focusing on improving Turborepo's day-to-day ergonomics for local development and observability. We're really excited about this next chapter and will be sharing more details soon. + +## We're hiring! + +The Turborepo team at [Vercel](https://vercel.com/) is hiring! We're specifically looking for full time [Senior Full Stack Software Engineers](https://vercel.com/careers) and [Senior DevOps/Infrastructure Engineers](https://vercel.com/careers) to help us make Turborepo even better. diff --git a/docs/pages/blog/turbo-1-2-0.mdx b/docs/pages/blog/turbo-1-2-0.mdx new file mode 100644 index 0000000..1b49f6f --- /dev/null +++ b/docs/pages/blog/turbo-1-2-0.mdx @@ -0,0 +1,132 @@ +--- +title: Turborepo 1.2 +date: 2022/04/08 +description: Turborepo 1.2 introduces improved task filtering, artifact signing and integrity, human-readable and JSON dry runs, and more! +tag: web development +ogImage: /images/blog/turbo-1-2-0/twitter-card.png +--- + +# Turborepo 1.2 + +import { Authors } from "../../components/Authors"; +import Date from "../../components/blog/Date"; + +<Date> + Friday, April 8th, 2022 +</Date> + +<Authors authors={["jaredpalmer", "becca__z", "gaspargarcia", "gregsoltis"]} /> + +Since releasing Turborepo v1.1 in late January, we've seen incredible adoption and community growth: + +- **6.5k+** [GitHub Stars](https://github.com/vercel/turbo) +- **140k+** weekly npm downloads (doubling since our [last blog post for v1.1](/blog/turbo-1-1-0)) +- **95+** OSS contributors +- **900+** members of the [Turborepo Community Discord](https://turbo.build/discord) +- **1.6 years** of Time Saved through Remote Caching on Vercel, saving more than 2.5 months every week + +We've further improved ergonomics, observability, and security with Turborepo v1.2 featuring: + +- [**New Task Filtering API**](#new-task-filtering-api): `--filter` adds more powerful task filtering capabilities to `turbo run` +- [**Human-readable and 
JSON dry runs**](#debug-and-automate-with---dry-run): `--dry-run` flag can print out information about a `turbo run` without executing any tasks, in both human and JSON-parse friendly formats +- [**Improved Internal Scheduler and Graph**](#improved-internal-scheduler-and-graph): We refactored `turbo` 's internal scheduler and graph to be more ergonomic and predictable +- [**Enhanced Remote Cache Security**](#cache-outputs-integrity-and-signature-verification): Cryptographically sign remote cache artifacts with your own secret key + +Update today by running `npm install turbo@latest`. After running `turbo run` for the first time, you'll see instructions about how to use `@turbo/codemod` to run automatic migrations for `v1.2`. + +## New Task Filtering API + +We are excited to release one of our most requested features: the ability to expressively filter tasks through a `--filter` flag. The `--filter` flag is the much more powerful successor to the current combination of `--scope`, `--include-dependencies`, `--since`, and `--no-deps` flags. + +With `--filter` you can tell `turbo` to restrict executing commands to a subset of matched packages in your monorepo based on name, folder, or even if it has changed since a git commit ref. 
+
+Take a look at some examples of what you can accomplish with the new `--filter` command:
+
+- `--filter=<package_name>` - match by exact package name or glob pattern
+- `--filter=...<package_name>` - match by package name/glob and include all dependent packages of matches
+- `--filter=...^<package_name>` - match by package name/glob and include all dependent packages of matches, but exclude the matches themselves
+- `--filter=<package_name>...` - match by package name/glob and include all the matched packages' dependencies
+- `--filter=<package_name>^...` - match by package name/glob and include all the matched packages' dependencies, but exclude the matches themselves
+- `--filter={./path/to/package}` - match by path or filesystem glob pattern
+- `--filter=[origin/main]` - match by packages changed since a git commit ref
+
+You can use multiple filters together to get even more granular filtering, as well as combine each part of the above patterns `{}`, `[]`, `^`, and `...` to express more complex behavior.
+
+For example, if you had an app located in the `./apps/web` directory with local packages used as dependencies, and a Turborepo pipeline where `test` depends on `^build` topologically, running:
+
+```sh
+turbo run test --filter={./apps/web}[HEAD^1]^...
+```
+
+would tell `turbo` to ensure dependencies are built and to run the `test` script in all of the local dependencies of the app located in `./apps/web`, not including that app itself, if the app has changed since HEAD^1.
+
+For more details and examples, refer to the new [filtering documentation](/repo/docs/core-concepts/monorepos/filtering).
+
+## Debug and Automate with `--dry-run`
+
+You can now see the impact of `turbo run` without actually executing any commands by appending either `--dry-run` or `--dry-run=json` to any `turbo run` command. This will result in either human-readable or JSON output.
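+
+As a quick sketch of how the two variants fit together (the `web` package name here is hypothetical):
+
+```sh
+# Print a human-readable summary of what would run, without executing any tasks
+turbo run build --filter=web... --dry-run
+
+# Emit the same information as JSON, e.g. to feed CI/CD automations
+turbo run build --filter=web... --dry-run=json > dry-run.json
+```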
+
+Dry runs are incredibly useful for two situations:
+
+- Debugging and testing run options
+- Using `turbo` filtering and task graphs for building automations
+
+import { Bleed } from "nextra-theme-docs";
+
+<Bleed>
+  <div className="lg:rounded-xl overflow-hidden">
+    ![Turborepo Dry Run](/images/blog/turbo-1-2-0/turbo-dry-run.png)
+  </div>
+</Bleed>
+
+We hope that this will improve visibility into what `turbo` is doing, speed up debugging, and make it easier to leverage `turbo` in dynamic CI/CD systems.
+
+## Improved Internal Scheduler and Graph
+
+When using `turbo run`, every `package.json` task is added to an internal graph to map dependencies based on the inferred relationships defined in your Turborepo `pipeline`. This task graph allows Turborepo to efficiently schedule incremental concurrent task running and cache task outputs for later use.
+
+We have made major improvements to the internal task scheduler and resulting graph structure, yielding better performance and a better developer experience. For example, in many cases, you will no longer need to use `--include-dependencies`. Instead, after specifying your task entry points, the new and improved graph will automatically handle this graph resolution on your behalf.
+
+## Cache Outputs Integrity and Signature Verification
+
+You can now configure Turborepo to sign remote cache outputs using HMAC-SHA256 with a secret key before uploading them to the Remote Cache. When Turborepo downloads signed cache artifacts, it will now verify the artifact's integrity and authenticity. Any artifact that fails to verify will be ignored, discarded, and treated as a cache miss by Turborepo.
+
+To enable this feature, set the `remoteCache` options in your `turbo.json` config file to include `signature: true`. Then specify your secret key by declaring the `TURBO_REMOTE_CACHE_SIGNATURE_KEY` environment variable.
+
+```jsonc
+{
+  "$schema": "https://turbo.build/schema.json",
+  "remoteCache": {
+    // Indicates if signature verification is enabled.
+    "signature": true
+  }
+}
+```
+
+## Other bug fixes and improvements
+
+- `--sso-team` flag now enables teams with SAML tokens to log in through `turbo login` with correct team permissions
+- `--log-output` flag allows you to control what logs are printed to the terminal, and when, letting you focus only on what's new
+- `FORCE_COLOR` environment variable is now supported
+- `TURBO_FORCE=true` environment variable will now force execution
+- `--remote-only` and `TURBO_REMOTE_ONLY=true` will tell `turbo` to only use Remote Caching
+- We now show `>>> FULL TURBO` when there's at least one task attempted
+- Yarn v2+ with Plug'n'Play (PnP linker) is supported for the `turbo run` command, but `turbo prune` is still not fully supported
+- Fixed a regression with Chrome tracing when `--profile` is specified
+- You can now set concurrency as a percentage of CPUs with `--concurrency=50%`
+
+## We're hiring!
+
+The Turborepo team at [Vercel](https://vercel.com/) is hiring! We're up to five core team members already this year and are looking to hire even more. We're specifically looking for full-time [Senior Build Systems Engineers](https://vercel.com/careers).
+
+## What's next?
+
+Along with seamless incremental adoption/migration and speeding up CI/CD, we've been focusing on improving Turborepo's day-to-day ergonomics, security, and observability. The new `--filter` flag, signed artifacts, and dry runs are important steps toward those goals.
+
+Next up, we'll be focusing on an enhanced local development experience, codebase automations, and overall CLI performance.
+
+## Thank you, contributors
+
+Turborepo is the result of the combined work of over 95 individual developers and our core team.
+
+This release was brought to you by the contributions of: @gsoltis09, @jaredpalmer, @gaspar09, @shuding, @rajatkulkarni95, @VanTanev, @Kikobeats, @tknickman, @thebanjomatic, @chelkyl, @elado, @finn-orsini, @becca, @weyert, @ekosz
diff --git a/docs/pages/blog/turbo-1-3-0.mdx b/docs/pages/blog/turbo-1-3-0.mdx
new file mode 100644
index 0000000..f84b57e
--- /dev/null
+++ b/docs/pages/blog/turbo-1-3-0.mdx
@@ -0,0 +1,202 @@
+---
+title: Turborepo 1.3
+date: 2022/06/23
+description: Turborepo 1.3 introduces restricted hash inputs, root script running and caching, new CI/CD Recipes, and more!
+tag: web development
+ogImage: /images/blog/turbo-1-3-0/twitter-card.png
+---
+
+# Turborepo 1.3
+
+import { Authors } from "../../components/Authors";
+import Callout from "../../components/Callout";
+import Date from "../../components/blog/Date";
+
+<Date>
+  Thursday, June 23rd, 2022
+</Date>
+
+<Authors
+  authors={[
+    "gregsoltis",
+    "nathanhammond",
+    "tomknickman",
+    "jaredpalmer",
+    "gaspargarcia",
+    "becca__z",
+  ]}
+/>
+
+With Turborepo 1.3 we are bringing improved caching and flexibility, which includes:
+
+- [**Restricted hash inputs:**](#pipeline-inputs) Specify the files in a package folder that impact caching with `inputs`.
+- [**Root script running and caching:**](#run-and-cache-scripts-from-the-root-of-your-monorepo) Run and cache `package.json` scripts from the root of the monorepo.
+- [**New CI/CD Recipes:**](#new-cicd-recipes) We added recipes for using Turborepo with popular CI providers.
+
+Update today by running `npm install turbo@latest`.
+
+## Pipeline `inputs`
+
+In addition to [environment variables, dependencies, and pipeline configurations,](/repo/docs/core-concepts/caching#hashing) `turbo` will consider all non-gitignored files in a package's folder when calculating each `package.json` script's hash fingerprint (the key that `turbo` uses to index its cache and to determine if a script needs to be re-executed).
**With Turborepo 1.3+, you can now specify globs of `inputs` in your `turbo.json` `pipeline` to control which files are relevant for a particular script for caching.** This means that you can now express the following in `turbo.json` + +- Ignore changes to all markdown files in a package or app's folder. +- Don't bother rebuilding an app if only its test files have changed. +- Only re-run tests if either source files or test files have been changed in a package or folder. +- and more. + +Let's walk through a concrete example: imagine we have a monorepo with a Next.js application for a documentation website in `./apps/docs-site`, some packages, and some markdown files in the root of the monorepo in a `./docs` folder. + +```sh filename="Example monorepo" +. +β”œβ”€β”€ docs/ +β”‚ β”œβ”€β”€ api-reference.md +β”‚ β”œβ”€β”€ getting-started.md +β”‚ └── intro.md +β”œβ”€β”€ apps/ +β”‚ β”œβ”€β”€ docs-site/ +β”‚ β”‚ β”œβ”€β”€ components/ +β”‚ β”‚ β”œβ”€β”€ pages/ +β”‚ β”‚ β”‚ └── [slug].js +β”‚ β”‚ β”œβ”€β”€ README.md +β”‚ β”‚ └── package.json +β”‚ └── web-site/ +β”‚ β”œβ”€β”€ pages/ +β”‚ β”œβ”€β”€ README.md +β”‚ └── package.json +β”œβ”€β”€ packages/ +β”‚ β”œβ”€β”€ configs/ +β”‚ └── ui/ +β”œβ”€β”€ package.json +└── turbo.json +``` + +Let's assume that the Next.js `docs-site` renders the markdown files from the `./docs` folder. We can now set up the `build` script in the app's `package.json` to use `inputs` in `turbo.json` to better specify exactly which files are relevant (and which should impact caching) as follows: + +```jsonc filename="./turbo.json" +{ + "$schema": "https://turbo.build/schema.json", + "pipeline": { + // ... 
omitted for brevity
+    "build": {
+      "dependsOn": ["^build"],
+      "outputs": [".next/**", "!.next/cache/**", "dist/**"]
+    },
+    "docs#build": {
+      "dependsOn": ["^build"],
+      "outputs": [".next/**", "!.next/cache/**"],
+      // Define the set of relevant globs which impact caching of docs site
+      // builds
+      "inputs": [
+        "../../docs/**/*.md",
+        "pages/**",
+        "components/**",
+        "package.json"
+      ]
+    }
+  }
+}
+```
+
+Note: Like `outputs`, `inputs` are defined relative to the related `package.json`, but they can be outside of a given folder (e.g. `../../docs/**`).
+
+## Run and cache scripts from the root of your monorepo
+
+As of 1.3, **`turbo` can now run and cache scripts from the `package.json` file at the root of the monorepo**, which will help significantly when migrating to Turborepo.
+
+To set this up, specify a root script in your `pipeline` configuration in your `turbo.json` using the form `"//#<script>": {...}`. The `//` tells `turbo` that the script is relative to the root of the monorepo and not each workspace package.
+
+There are two important things to note about root scripts and execution scope:
+
+- If you already have `"build": {...}` in your `pipeline`, but want to include the `build` script defined in the monorepo's root `package.json` file when running `turbo run build`, you may opt the root into the execution's scope by also including `"//#build": {...}` in your configuration as well.
+- Conversely, you _do not_ need to define a generic `"my-script": {...}` entry if all you need is `"//#my-script": {...}`.
+
+A sample pipeline that defines the root script `check-examples` and opts the root into `test` might look like:
+
+```json filename="./package.json"
+{
+  "name": "my-turborepo",
+  "private": true,
+  "scripts": {
+    "test": "echo 'test!'",
+    "check-examples": "./check-examples.sh"
+  },
+  "devDependencies": {
+    "turbo": "latest"
+  }
+}
+```
+
+```jsonc filename="./turbo.json" highlight="20"
+{
+  "$schema": "https://turbo.build/schema.json",
+  "pipeline": {
+    "build": {
+      "dependsOn": ["^build"]
+    },
+    "test": {
+      "dependsOn": ["^build"],
+      "outputs": []
+    },
+    // This will cause the "test" script from all workspace package.json's
+    // AND the root package.json to be included when "turbo run test" is run
+    "//#test": {
+      "dependsOn": [],
+      "outputs": []
+    },
+    // This will cause the "check-examples" script in the root package.json
+    // to be run when "turbo run check-examples" is run. Since a general
+    // "check-examples" script is not defined in the pipeline, only the root
+    // package.json's "check-examples" script will be included
+    // when "turbo run check-examples" is run
+    "//#check-examples": {
+      "dependsOn": [],
+      "outputs": [],
+      "inputs": [
+        "examples/**/*.ts",
+        "examples/**/*.tsx",
+        "examples/**/*.json",
+        "examples/**/*.js",
+        "examples/**/*.yaml",
+        "cli/**/*.ts",
+        "./scripts/run-example.sh"
+      ]
+    }
+  }
+}
+```
+
+Note: We suggest specifying `inputs` whenever declaring a root task in your `pipeline` to improve caching.
+
+## New CI/CD Recipes
+
+We added recipes for using Turborepo and Remote Caching with:
+
+- [CircleCI](/repo/docs/ci/circleci)
+- [GitHub Actions](/repo/docs/ci/github-actions)
+- [GitLab CI](/repo/docs/ci/gitlabci)
+- [Travis CI](/repo/docs/ci/travisci)
+
+If there are other recipes you would like to see here, please let us know by opening up a [GitHub Discussion](https://github.com/vercel/turbo/discussions/categories/ideas).
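+
+The common thread across these recipes is that the CI job exposes Remote Cache credentials as environment variables before invoking `turbo`. A minimal sketch (the team slug and secret name below are placeholders specific to your setup):
+
+```sh
+# TURBO_TOKEN and TURBO_TEAM let turbo authenticate against the Remote Cache in CI
+export TURBO_TOKEN="$CI_SECRET_TURBO_TOKEN"
+export TURBO_TEAM="my-team"
+npx turbo run build test
+```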
+ +## Other Bug Fixes and Improvements + +- Improved git operations and hashing +- Better cycle detection in dependency graph analysis +- Added support for Windows ARM 64-bit architectures +- Improved Remote Cache error logging +- Added Storybook to the Design System example + +## Community + +Since releasing [Turborepo v1.2 in early April](/blog/turbo-1-2-0), we've seen incredible adoption and community growth: + +- [8.1k+ GitHub Stars](https://github.com/vercel/turbo) +- 275k+ weekly NPM downloads (up ~2x) +- 1,200+ members of the [Turborepo Community Discord](https://turbo.build/discord) +- 5.8 years of compute time saved through Remote Caching on Vercel (up ~5x), saving +7 months per week now + +Turborepo is the result of the combined work of over 136 contributors including our core team. + +This release was brought to you by the contributions of: @gsoltis, @nathanhammond, @tknickman, @jaredpalmer, @zvictor, @ObliviousHarmony, @O4epegb, @rafaeltab, @mcmontseny, @bertspaan, @Jastor11, and @enBonnet + +Thank you for your continued support, feedback, and collaboration with us to make Turborepo your build tool of choice. diff --git a/docs/pages/blog/turbo-1-4-0.mdx b/docs/pages/blog/turbo-1-4-0.mdx new file mode 100644 index 0000000..b599def --- /dev/null +++ b/docs/pages/blog/turbo-1-4-0.mdx @@ -0,0 +1,139 @@ +--- +title: Turborepo 1.4 +date: 2022/08/09 +description: Turborepo 1.4 has new examples, automatically includes environment variables, and more! 
+tag: web development +ogImage: /images/blog/turbo-1-4-0/twitter-card.png +--- + +# Turborepo 1.4 + +import { Authors } from "../../components/Authors"; +import Callout from "../../components/Callout"; +import Date from "../../components/blog/Date"; + +<Date> + Tuesday, August 9th, 2022 +</Date> + +<Authors + authors={[ + "gregsoltis", + "nathanhammond", + "tomknickman", + "anthonyshew", + "jaredpalmer", + "mehulkar", + ]} +/> + +Turborepo 1.4 brings: + +- [**Automatic environment variable inclusion:**](#automatic-environment-variable-inclusion) We'll automatically infer popular framework environment variables for you. No need to declare them yourself in `turbo.json`. +- [**`eslint-config-turbo`:**](#eslint-config-turbo) Enhanced feedback with a new ESLint plugin. +- [**New framework and library examples:**](#new-framework-and-library-examples) New starters and examples requested by the community. + +Update today by running `npm install turbo@latest`. + +## Automatic environment variable inclusion + +To help ensure correct caching across environments Turborepo will now automatically infer and include public environment variables when calculating cache keys for apps built with Astro, Create React App, Gatsby, Next.js, Nuxt, SvelteKit, Vite, Vue, and more. You can safely remove framework-specific public environment variables from `turbo.json` if you manually declared them. + +```diff filename="turbo.json" +{ + "pipeline": { + "build": { + "dependsOn": [ + "^build" +- // Include build time public inlined environment variables that +- // are different in development and production, so that +- // Turborepo does not use the same cached build +- // across environments +- "$NEXT_PUBLIC_EXAMPLE_ENV_VAR" + ] + } + } +} +``` + +Note that this automatic detection and inclusion only works if Turborepo successfully infers the framework your apps are built with. 
Additionally, the environment variables will only be included in the cache key for tasks in workspaces where that framework is used. In other words, environment variables inferred for Next.js apps will only be included in the cache key for workspaces detected as Next.js apps. Tasks in other workspaces in the monorepo will not be impacted.
+
+For example, consider a monorepo with three workspaces: a Next.js project, a Create React App project, and a TypeScript package. Each has a `build` script, and both apps depend on the TypeScript package. Let's say that this Turborepo has a standard `turbo.json` pipeline that builds them all in order:
+
+```jsonc filename="turbo.json"
+{
+  "pipeline": {
+    "build": {
+      "dependsOn": ["^build"]
+    }
+  }
+}
+```
+
+As of 1.4, when you run `turbo run build`, Turborepo will not consider any build time environment variables relevant when building the TypeScript package. However, when building the Next.js app, Turborepo will infer that environment variables starting with `NEXT_PUBLIC_` could alter the output of the `.next` folder and should thus be included when calculating the hash. Similarly, when calculating the hash of the Create React App's `build` script, all build time environment variables starting with `REACT_APP_` will be included.
+
+This improvement in hash specificity by framework is a significant step toward optimal, safe, and correct caching.
+
+## `eslint-config-turbo`
+
+We've also created a new ESLint config for further in-editor assistance to help ensure your Turborepo cache can be correctly shared across every environment. While our new hashing algorithm should cover most situations with most frameworks, this ESLint config will provide in-editor feedback for teams using other build time inlined environment variables that are not framework-prefixed but impact build outputs (i.e. caching), and teams using their own in-house frameworks that we cannot detect automatically.
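For instance, the plugin is aimed at code like the following, where a build-time environment variable is read inline (the variable name `MY_API_URL` is hypothetical, chosen precisely because it carries no framework prefix):

```javascript
// Hypothetical app code: MY_API_URL is an illustrative name. Because it is
// not framework-prefixed and (in this sketch) not declared in turbo.json,
// eslint-plugin-turbo's no-undeclared-env-vars rule would flag this line.
const apiUrl = process.env.MY_API_URL || "http://localhost:3000";

console.log(`API base: ${apiUrl}`);
```

Declaring the variable in `turbo.json` (or removing the usage) resolves the warning and keeps your cache keys accurate.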
+
+To get started, extend from `eslint-config-turbo` in your root [`eslintrc`](https://eslint.org/docs/latest/user-guide/configuring/configuration-files#configuration-file-formats) file:
+
+```jsonc
+{
+  // Automatically flag env vars missing from turbo.json
+  "extends": ["next/core-web-vitals", "turbo"]
+}
+```
+
+If you prefer more control over the rules, you can install and configure the `eslint-plugin-turbo` _plugin_ directly by first adding it to `plugins` and then configuring the desired rules:
+
+```jsonc
+{
+  "extends": ["next/core-web-vitals"],
+  "plugins": ["turbo"],
+  "rules": {
+    // Automatically flag env vars missing from turbo.json
+    "turbo/no-undeclared-env-vars": "error"
+  }
+}
+```
+
+The plugin will warn you if you are using non-framework-related environment variables in your code that have not been declared in your `turbo.json`.
+
+As of 1.4.x, we now include `eslint-config-turbo` in all of our examples and in new projects generated via `npx create-turbo`.
+
+Learn more about ESLint configs and plugins in the [ESLint docs](https://eslint.org/docs/latest/).
+ +## New framework and library examples + +Based on your feedback and suggestions, we've created new examples to integrate Turborepo into your workflow: + +- [Svelte](https://github.com/vercel/turbo/tree/main/examples/with-svelte) +- [Docker](https://github.com/vercel/turbo/tree/main/examples/with-docker) +- [Create React App](https://github.com/vercel/turbo/tree/main/examples/with-create-react-app) +- [React Native](https://github.com/vercel/turbo/tree/main/examples/with-react-native-web) +- [Prisma](https://github.com/vercel/turbo/tree/main/examples/with-prisma) +- [Tailwind](https://github.com/vercel/turbo/tree/main/examples/with-tailwind) +- … [and more!](https://github.com/vercel/turbo/tree/main/examples) + +## Other bug fixes and improvements + +- Allow both sides of git comparison ([#1442](https://github.com/vercel/turbo/pull/1442)) +- Properly rebuild packages that share a name prefix ([#1538](https://github.com/vercel/turbo/pull/1538)) +- Cache files with the correct file permissions ([#1429](https://github.com/vercel/turbo/pull/1429)) + +## Community + +Since releasing [Turborepo v1.3 in June](https://turbo.build/blog/turbo-1-3-0), we've seen incredible adoption and community growth: + +- [8.65k+ GitHub Stars](https://github.com/vercel/turbo) +- 365k weekly NPM downloads, up 2x since late April +- 10 years of compute time saved through Remote Caching on Vercel, saving 10 months per week + +Turborepo is the result of the combined work of all of our contributors including our core team. + +This release was brought to you by the contributions of: @B2o5T, @chitchu, @elis, @gsoltis, @harshcut, @jaredpalmer, @kocisov, @nathanhammond, @neolivz, @NuroDev, @oneezy, @samouri, @shayc, @StevenMatchett, @tknickman, @trevorr, @zsoldosp, and more! + +Thank you for your continued support, feedback, and collaboration to make Turborepo your build tool of choice. 
diff --git a/docs/pages/blog/turbo-1-5-0.mdx b/docs/pages/blog/turbo-1-5-0.mdx new file mode 100644 index 0000000..ed33779 --- /dev/null +++ b/docs/pages/blog/turbo-1-5-0.mdx @@ -0,0 +1,142 @@ +--- +title: Turborepo 1.5 +date: 2022/09/19 +description: Turborepo 1.5 brings the Monorepo Handbook, drops the 'run' command, improves pruning, and much more! +tag: web development +ogImage: /images/blog/turbo-1-5-0/twitter-card.png +--- + +# Turborepo 1.5 + +import { Authors } from "../../components/Authors"; +import Callout from "../../components/Callout"; +import Date from "../../components/blog/Date"; + +<Date> + Monday, September 19th, 2022 +</Date> + +<Authors + authors={[ + "mattpocock", + "gregsoltis", + "nathanhammond", + "tomknickman", + "anthonyshew", + "jaredpalmer", + "mehulkar", + "chrisolszewski", + ]} +/> + +Turborepo 1.5 is a **huge leap forward for our documentation and DX**, as well as bringing big improvements to `turbo prune`: + +- [**The Monorepo Handbook**](#the-monorepo-handbook): We've built the missing manual for your monorepo - a guide on workspaces, code sharing, integrating common tools and much more. +- [**Drop the `run`**](#drop-the-run): `turbo run <task>` can now be shortened to `turbo <task>` +- [**`turbo prune` now supports pnpm and yarn 2+**](#prune-now-supported-on-pnpm-and-yarn-2): Pruning your monorepo is now supported in `pnpm` and `yarn@berry`. +- [**Improved environment variables in `turbo.json`**](#environment-variables-in-turbojson): Environment variables are now first-class citizens in your Turborepo pipeline configuration. +- [**Changes to `package.json` hashing**](#changes-to-packagejson-hashing): We've improved how we hash `package.json` when running tasks. + +Update today by running `npm install turbo@latest`. + +## The Monorepo Handbook + +Setting up a monorepo for the first time often means navigating a lot of new concepts. 
You'll need to understand workspaces, package installation, sharing code, and dependency management - and a lot more.
+
+This often meant that folks who wanted to set up a monorepo from scratch had to piece information together from different documentation sites. First `pnpm`, then `tsup`, then back to `changesets`, then back to Turborepo for dessert.
+
+We want to fill this gap with [the Monorepo Handbook](/repo/docs/handbook). We've built guides on how to integrate all the tools you'll need to make ship happen with your monorepo, including guides on:
+
+- [Installing Packages](/repo/docs/handbook/package-installation)
+- [Linting](/repo/docs/handbook/linting)
+- [Development Tasks](/repo/docs/handbook/dev)
+- [Building Apps](/repo/docs/handbook/building-your-app)
+- [Publishing Packages](/repo/docs/handbook/publishing-packages)
+
+## Drop the `run`
+
+<iframe
+  className="max-w-xl h-72 sm:h-96 w-full mt-6"
+  src="https://www.youtube.com/embed/PEgk2v6KntY"
+  title="YouTube video player"
+  frameBorder="0"
+  allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture"
+></iframe>
+
+You can now run tasks with the Turborepo CLI using `turbo <task>`.
+
+```diff
+- turbo run build
++ turbo build
+
+
+- turbo run lint build test
++ turbo lint build test
+```
+
+If your task name conflicts with a built-in `turbo` subcommand, we'll run our subcommand instead. That means you shouldn't name your tasks things like `prune`, `run`, or `login`, since those are built-in subcommands.
+
+`turbo run <task>` will continue to work, and there are no plans to deprecate it.
+
+## Prune now supported on pnpm and yarn 2+
+
+We're delighted to announce that [`turbo prune`](/repo/docs/reference/command-line-reference#turbo-prune---scopetarget) is now supported in pnpm, yarn, and yarn 2+.
+
+You can use `turbo prune` to create a pruned subset of your monorepo with a dedicated lockfile, containing the correct dependencies needed for a given target application and its dependencies. This is especially useful for using Turborepo efficiently within a Docker image.
+
+As part of the new handbook, we've also added a section on using `turbo prune` to [build Docker images](/repo/docs/handbook/deploying-with-docker).
+
+Check out our previous [blog on prune](/blog/turbo-0-4-0#experimental-pruned-workspaces) to learn more.
+
+## Environment variables in `turbo.json`
+
+We've introduced two new keys to `turbo.json` - `env` and `globalEnv`. These allow environment variables to be configured _separately_ from tasks:
+
+```diff filename="turbo.json"
+{
+  "globalDependencies": [
+-    "$DATABASE_URL"
+  ],
++  "globalEnv": [
++    "DATABASE_URL"
++  ],
+  "pipeline": {
+    "build": {
+      "dependsOn": [
+-        "$BUILD_ENV"
+      ],
++      "env": [
++        "BUILD_ENV"
++      ]
+    }
+  }
+}
+```
+
+`globalEnv` and `env` allow you to specify a list of environment variables _without_ `$` prefixes. This makes the configuration file significantly easier to read. Read more [in our updated docs](/repo/docs/core-concepts/caching#altering-caching-based-on-environment-variables).
+
+To help migrate from the previous syntax, we've prepared a codemod. You can run `npx @turbo/codemod migrate-env-var-dependencies`.
+
+This work builds on the [automatic env variable detection](/blog/turbo-1-4-0) we added in 1.4.0.
+
+## Changes to `package.json` hashing
+
+The `package.json` file in each workspace is now _always_ considered an input for tasks in that workspace. This means that if you change the _definition_ for a task in `package.json`, we want to invalidate any caches from the previous definition.
+
+This also applies to the `package.json` in the root. Changes to the root `package.json` will invalidate tasks in the root workspace.
+
+This helps make Turborepo's cache a bit smarter, and less likely to trip up when task definitions change.
+
+## Community
+
+Since releasing [Turborepo v1.4 in August](/blog/turbo-1-4-0), we've seen incredible adoption and community growth:
+
+- [9.5k+ GitHub Stars](https://github.com/vercel/turbo)
+- [440k weekly NPM downloads](https://www.npmjs.com/package/turbo)
+- 15 years of compute time saved through [Remote Caching on Vercel](https://vercel.com/docs/concepts/monorepos/remote-caching), saving over 1 year per week, up 2x since July
+
+Turborepo is the result of the combined work of all of our contributors including our core team.
+
+This release was brought to you by the contributions of: @7flash, @afady, @alexander-young, @atilafassina, @bguedes-moz, @bobaaaaa, @brunojppb, @chris-olszewski, @DoctorJohn, @erj826, @futantan, @gsoltis, @HosseinAgha, @ivov, @jaredpalmer, @joelhooks, @knownasnaffy, @laurentlucian, @leerob, @MarceloAlves, @mattpocock, @mauricekleine, @mehulkar, @Misikir, @nareshbhatia, @nathanhammond, @pakaponk, @PhentomPT, @renovate, @ruisaraiva19, @samuelhorn, @shemayas, @shuding, @t-i-0414, @theurgi, @tknickman, @yanmao-cc, and more!
+
+Thank you for your continued support, feedback, and collaboration to make Turborepo your build tool of choice.
diff --git a/docs/pages/blog/turbo-1-6-0.mdx b/docs/pages/blog/turbo-1-6-0.mdx
new file mode 100644
index 0000000..295f312
--- /dev/null
+++ b/docs/pages/blog/turbo-1-6-0.mdx
@@ -0,0 +1,223 @@
+---
+title: Turborepo 1.6
+date: 2022/10/21
+description: Turborepo 1.6 lets you use Turborepo in non-monorepos, adds prune support for npm, and improves cache performance.
+tag: "web development" +ogImage: /images/blog/turbo-1-6-0/twitter-card.png +--- + +import { Tabs, Tab } from '../../components/Tabs' + +# Turborepo 1.6 + +import { Authors } from '../../components/Authors' +import Callout from '../../components/Callout' +import Date from "../../components/blog/Date"; + +<Date> + Friday, October 21st, 2022 +</Date> + +<Authors authors={[ + 'mattpocock', + 'gregsoltis', + 'nathanhammond', + 'tomknickman', + 'anthonyshew', + 'jaredpalmer', + 'mehulkar', + 'chrisolszewski' +]} /> + +Turborepo 1.6 changes the game for Turborepo - you can now use it in **any project**. + +- [**Turborepo in non-monorepos**](#any-codebase-can-use-turborepo): Seeing slow builds on your project? You can now use Turborepo to speed up builds in any codebase with a `package.json`. +- [**`turbo prune` now supports npm**](#prune-now-supported-on-npm): Pruning your monorepo is now supported in monorepos using `npm`, completing support for all major workspace managers. +- [**Faster caching**](#performance-improvements-in-the-cache): We've improved the way we handle local file writes, meaning a big speed-up of Turborepo's cache. + +Update today by running `npm install turbo@latest`. + +## Any codebase can use Turborepo + +Turborepo helps speed up tasks in your codebase. Until now, we'd built Turborepo specifically for monorepos - codebases which contain multiple applications and packages. + +Turborepo is fantastic in monorepos because they have so many tasks to handle. Each package and app needs to be built, linted, and tested. + +But we got to thinking: lots of codebases that _aren't_ monorepos run plenty of tasks. Most CI/CD processes do a lot of duplicated work that would benefit from a [cache](/repo/docs/core-concepts/caching). + +So we're excited to announce that **any codebase can now use Turborepo**. 
+ +Try it out now by [starting from the example](https://github.com/vercel/turbo/tree/main/examples/non-monorepo), or by adding Turborepo to an existing project: + +### Add Turborepo to your project + +1. **Install `turbo`:** + +<Tabs items={['npm', 'yarn', 'pnpm']} storageKey="selected-pkg-manager"> + <Tab> + ```bash + npm install turbo --save-dev + ``` + </Tab> + <Tab> + ```bash + yarn add turbo --dev + ``` + </Tab> + <Tab> + ```bash + pnpm install turbo --save-dev + ``` + </Tab> +</Tabs> + +2. **Add a `turbo.json` file at the base of your new repository:** + +<Tabs items={['Next.js', 'Vite']} storageKey="selected-framework"> + <Tab> +```json filename="turbo.json" +{ + "pipeline": { + "build": { + "outputs": [".next/**", "!.next/cache/**"] + }, + "lint": { + "outputs": [] + } + } +} +``` + </Tab> + <Tab> +```json filename="turbo.json" +{ + "pipeline": { + "build": { + "outputs": ["dist/**"] + }, + "lint": { + "outputs": [] + } + } +} +``` + +Some Vite starters ship with a `package.json` that looks like this: + +```json filename="package.json" +{ + "scripts": { + "build": "tsc && vite build" + } +} +``` + +We recommend splitting these into a `lint` and `build` script. + +```json filename="package.json" +{ + "scripts": { + "build": "vite build", + "lint": "tsc" + } +} +``` + +This enables `turbo` to schedule them separately. + + </Tab> +</Tabs> + +3. **Try running `build` and `lint` with `turbo`:** + +```bash +turbo build lint +``` + +Congratulations - **you just ran your first build with `turbo`**. You can try: + +- Running through the full [Quickstart](/repo/docs). +- Check out our updated [Core Concepts docs](/repo/docs/core-concepts/caching) to understand what makes Turborepo special. + +## When should I use Turborepo? + +Turborepo being available for non-monorepos opens up a lot of new use cases. But when is it at its best? + +### When scripts depend on each other + +You should use `turbo` to run your `package.json` scripts. 
If you've got multiple scripts which all rely on each other, you can express them as Turborepo tasks: + +```json filename="turbo.json" +{ + "pipeline": { + "build": { + "outputs": ["dist/**"] + }, + "lint": { + // 'build' should be run before 'lint' + "dependsOn": ["build"] + }, + "test": { + // 'build' should be run before 'test' + "dependsOn": ["build"] + } + } +} +``` + +Then, you can run: + +```sh +turbo run lint test +``` + +Because you've said that `build` should be run before `lint` and `test`, it'll _automatically run `build` for you_ when you run `lint` or `test`. + +Not only that, but it'll figure out the optimal schedule for you. Head to our core concepts doc on [optimizing for speed](/repo/docs/core-concepts/monorepos/running-tasks#most-tools-dont-optimize-for-speed). + +### When you want to run tasks in parallel + +Imagine you're running a [Next.js](https://nextjs.org/) app, and also running the [Tailwind CLI](https://tailwindcss.com/docs/installation). You might have two scripts - `dev` and `dev:css`: + +```json filename="package.json" +{ + "scripts": { + "dev": "next", + "dev:css": "tailwindcss -i ./src/input.css -o ./dist/output.css --watch" + } +} +``` + +Without anything being added to your `turbo.json`, you can run: + +```sh +turbo run dev dev:css +``` + +Just like tools like [`concurrently`](https://www.npmjs.com/package/concurrently), Turborepo will automatically run the two scripts in parallel. + +This is extremely useful for dev mode, but can also be used to speed up tasks on CI - imagine you have multiple scripts to run: + +```sh +turbo run lint unit:test e2e:test integration:test +``` + +Turborepo will figure out the fastest possible way to run all your tasks in parallel. + +## Prune now supported on npm + +Over the last several releases, we've been adding support for [`turbo prune`](/repo/docs/reference/command-line-reference#turbo-prune---scopetarget) on different workspace managers. 
This has been a challenge - `turbo prune` creates a subset of your monorepo, including pruning the dependencies in your lockfile. This means we've had to implement logic for each workspace manager separately.
+
+We're delighted to announce that `turbo prune` now works for `npm`, completing support for all major package managers. This means that if your monorepo uses `npm`, `yarn`, `yarn 2+`, or `pnpm`, you'll be able to deploy to Docker with ease.
+
+Check out our previous [blog on `turbo prune`](/blog/turbo-0-4-0#experimental-pruned-workspaces) to learn more.
+
+## Performance improvements in the cache
+
+Before 1.6, Turborepo's local cache was a recursive copy of files on the system to another place on disk. This was _slow_. It meant that for every file that we needed to cache, we'd need to perform six system calls: open, read, and close on the source file; open, write, and close on the destination file.
+
+In 1.6, we've cut that nearly in half. Now, when creating a cache, we create a single `.tar` file (_one_ open), we write to it in 1MB chunks (_batched_ writes), and then close it (_one_ close). The halving of system calls _also_ happens on the way back out of the cache.
+
+And we didn't stop there. Over the past month we've invested significantly in our build toolchain to enable CGO, which unlocks usage of best-in-class libraries written in C. This enabled us to adopt [Zstandard](https://facebook.github.io/zstd/)'s `libzstd` for compression, which gets us an algorithmic 3x performance improvement for compression.
+
+After all of these changes we're regularly seeing performance improvements of more than 2x on local cache creation and more than 3x on remote cache creation. This gets even better the bigger your repository is, or the slower your device is (looking at you, CI). This means we've been able to deliver performance wins precisely to those who needed it the most.
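The single-archive idea can be sketched with ordinary shell tools. This is only an illustration: the paths and the hash-like filename are made up, and gzip stands in for the zstd compression `turbo` actually uses.

```bash
# Hypothetical layout: task outputs live in demo/dist, caches in .turbo-cache
mkdir -p demo/dist .turbo-cache restored
echo "console.log('bundled')" > demo/dist/app.js

# Saving: one archive means one open/close with batched writes, instead of
# an open/read/close plus open/write/close for every cached file
tar -czf .turbo-cache/78a1b2c3.tar.gz -C demo dist

# Restoring a cache hit is a single extraction
tar -xzf .turbo-cache/78a1b2c3.tar.gz -C restored
ls restored/dist
```

Compressing on the way in is also where the `libzstd` win applies: the same archiving step produces a smaller payload at higher throughput.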
diff --git a/docs/pages/blog/turbo-1-7-0.mdx b/docs/pages/blog/turbo-1-7-0.mdx
new file mode 100644
index 0000000..e0ffecf
--- /dev/null
+++ b/docs/pages/blog/turbo-1-7-0.mdx
@@ -0,0 +1,150 @@
+---
+title: Turborepo 1.7
+date: 2023/01/11
+description: Turborepo 1.7 focuses on improving developer experience by bringing more clarity to your tasks.
+tag: "web development"
+ogImage: /images/blog/turbo-1-7-0/twitter-card.png
+---
+
+import { Tabs, Tab } from '../../components/Tabs'
+
+# Turborepo 1.7
+
+import { Authors } from '../../components/Authors'
+import Badge from '../../components/Badge'
+import Date from "../../components/blog/Date";
+
+<Date>
+  Wednesday, January 11th, 2023
+</Date>
+
+<Authors authors={[
+  'gregsoltis',
+  'nathanhammond',
+  'tomknickman',
+  'anthonyshew',
+  'jaredpalmer',
+  'mehulkar',
+  'chrisolszewski',
+  'nicholasyang'
+]} />
+
+Turborepo 1.7 focuses on improving developer experience by bringing more clarity to your tasks:
+
+- [**Improved support for long running tasks**](#schedule-your-long-running-tasks-with-confidence): Use `persistent: true` to mark non-terminating tasks so that `turbo` can alert you if you have dependencies on them.
+- [**Better clarity for outputs**](#declare-your-outputs-for-improved-clarity): You'll now always need to declare your task outputs, improving transparency into what your tasks will cache.
+- [**Globally installable**](#global-turbo): Install once, use everywhere. Turborepo can now be installed globally and run from any directory, not just from your repo root.
+- [**“Error only” output mode**](#errors-only-output-mode-for-quieter-logs): Quiet your output logs to only show when a task fails.
+
+Update today by running `npm install turbo@latest`, or by [installing globally](#global-turbo) <Badge>NEW</Badge> and running the [`set-default-outputs`](/repo/docs/reference/codemods#set-default-outputs) codemod.
+ +## Schedule your long-running tasks with confidence + +To avoid misconfigurations that could result in tasks that never run, you can now tell Turborepo about tasks that won't exit on their own (like `dev` scripts) with a `persistent: true` [configuration option](/repo/docs/reference/configuration#persistent). When this config is set on a task, Turborepo will ensure no other task can depend on this task. This is useful for `dev` tasks or test runners with `--watch` flags. + +```diff +{ + "pipeline": { + "dev": { ++ "persistent": true + } + } +} +``` + +Previously, if `Task B` depended on a persistent `Task A`, `Task B` would never execute, because `Task A` never exited. By declaring `Task A` as `persistent`, Turborepo will prevent this error scenario from happening. + +Before this release, we had been recommending the use of `turbo run <task> --parallel` for persistent tasks. With `--parallel`, `turbo` would ignore your dependency graph and execute all your tasks at once. + +While `--parallel` did provide a helpful escape hatch, it meant that users had to tell Turborepo **_how_** to run their tasks rather than declaring **_what_** a task is. + +Rather than throwing away your entire topological dependency graph, it's much more precise for Turborepo to keep your dependency graph while guaranteeing that you don't depend on a process that won't exit with `persistent: true`. + +## Global `turbo` + +You can now run your Turborepo tasks from anywhere in your project once you've installed `turbo` globally. To do so, use: + +<Tabs items={['npm', 'yarn', 'pnpm']} storageKey="selected-pkg-manager"> + <Tab> + ```bash + npm install turbo --global + ``` + </Tab> + <Tab> + ```bash + yarn global add turbo + ``` + </Tab> + <Tab> + ```bash + pnpm install turbo --global + ``` + </Tab> +</Tabs> + +`turbo` will now work in any project. To find your local `turbo` version, `turbo` will walk through a few steps, always looking upward from your current directory: + +1. 
Find the nearest `turbo.json`.
+2. If one is not found, find the first `package.json` with a `workspaces` property.
+3. If one is not found, find the first `package.json`.
+
+Your globally installed version of `turbo` will only be used when a locally installed version of `turbo` does not exist or cannot be found.
+
+![turbo-inference](/images/blog/turbo-1-7-0/turbo-inference.png)
+
+`turbo --version` and `turbo bin` will show you the version and binary location, respectively, of the copy of `turbo` that will execute your tasks.
+Additionally, running with `-vv` or `--verbosity=2` will always show if your local or global `turbo` is being used.
+
+```bash
+turbo --version --verbosity=2
+2023-01-11T10:49:04.042-0500 [DEBUG] turborepo_lib::shim: No local turbo binary found at: /Users/knickman/Developer/vercel/my-awesome-monorepo/node_modules/.bin/turbo
+2023-01-11T10:49:04.042-0500 [DEBUG] turborepo_lib::shim: Running command as global turbo
+1.7.0
+```
+
+## Declare your `outputs` for improved clarity
+
+Previously, if you did not specify an `outputs` key for a task, Turborepo would automatically attempt to cache all files in the `dist/` and `build/` directories.
+
+This worked well for `build` tasks of specific frameworks, but this implicit behavior did not scale well as it applied to _all_ tasks. We've found that, across the many developers, teams, projects, and codebases using Turborepo, the assumption to automatically cache `dist/` and `build/` directories was causing problems for users.
+
+In version 1.7, this behavior is removed and you will now need to explicitly tell Turborepo what to cache.
+
+```diff
+{
+  "pipeline": {
+    "build": {
++     "outputs": ["dist/**", "build/**"]
+    }
+  }
+}
+```
+
+If you were relying on the default cache output in Turborepo versions below 1.7, you can get the same behavior by running the [`@turbo/codemod set-default-outputs`](/repo/docs/reference/codemods#set-default-outputs) codemod:
+
+```bash
+npx @turbo/codemod set-default-outputs
+```
+
+Also note that you will no longer need to specify `outputs: []` because not caching anything is now the default behavior. The codemod will also remove this configuration from your tasks.
+
+## “Errors only” output mode for quieter logs
+
+To bring visibility to errors, community member [@dobesv](https://github.com/dobesv) contributed [a solution to only show errors instead of all logs from a task run](https://github.com/vercel/turbo/pull/2588). While debugging a pipeline, `--output-logs=errors-only` can be used to keep your signal-to-noise ratio high so you can focus on ensuring successful runs for your pipelines.
+This can be used as a [configuration option](/repo/docs/reference/configuration#outputmode) or as a [CLI flag](/repo/docs/reference/command-line-reference#--output-logs).
+
+```bash
+turbo build --output-logs=errors-only
+```
+
+## Community
+
+Since releasing [Turborepo v1.6](/blog/turbo-1-6-0) and merging with [Turbopack](https://turbo.build/pack), we've seen incredible adoption and community growth:
+
+- [18.7k+ GitHub Stars](https://github.com/vercel/turbo)
+- [750k weekly NPM downloads](https://www.npmjs.com/package/turbo)
+- 30 years of compute time saved through [Remote Caching on Vercel](https://vercel.com/docs/concepts/monorepos/remote-caching)
+
+Turborepo is the result of the combined work of all of our contributors including our core team.
+
+Thank you for your continued support, feedback, and collaboration to make Turborepo your build tool of choice.
diff --git a/docs/pages/blog/turbo-1-8-0.mdx b/docs/pages/blog/turbo-1-8-0.mdx
new file mode 100644
index 0000000..d533fc1
--- /dev/null
+++ b/docs/pages/blog/turbo-1-8-0.mdx
@@ -0,0 +1,119 @@
+---
+title: Turborepo 1.8
+date: 2023/02/22
+description: Turborepo 1.8 brings better flexibility and more control to your codebase by improving turbo's understanding of your workspaces.
+tag: "web development"
+ogImage: /images/blog/turbo-1-8-0/twitter-card.png
+---
+
+import { Tabs, Tab } from '../../components/Tabs'
+
+# Turborepo 1.8
+
+import { Authors } from '../../components/Authors'
+import Badge from '../../components/Badge'
+import Date from "../../components/blog/Date";
+
+<Date>
+  Wednesday, February 22nd, 2023
+</Date>
+
+<Authors authors={[
+  'gregsoltis',
+  'nathanhammond',
+  'tomknickman',
+  'anthonyshew',
+  'jaredpalmer',
+  'mehulkar',
+  'chrisolszewski',
+  'nicholasyang',
+  'alexanderlyon'
+]} />
+
+Turborepo 1.8 brings better flexibility and more control to your codebase by improving `turbo`'s understanding of your workspaces.
+
+- [**Workspace Configurations**](#workspace-configurations): You can now add a `turbo.json` configuration file in a workspace to override the root configuration in your repository.
+- [**Automatic Workspace Scoping**](#automatic-workspace-scoping): Global `turbo` now automatically infers your current workspace so that it only runs that workspace's tasks.
+- [**Easier Migrations**](#easier-migrations): Automatically migrate to new versions of `turbo` with `npx @turbo/codemod migrate`.
+
+Update today by running `npx @turbo/codemod migrate`.
+
+## Workspace Configurations
+
+In workspace directories, you can now add a `turbo.json` to:
+
+- add tasks specific to that workspace
+- override configuration for tasks
+
+This will enable teams to scale ownership of the projects in their monorepos by moving away from global configuration to fine-grained control over tasks in workspaces.
+
+For example, imagine your monorepo has a Next.js app and a SvelteKit app, and you want to use Turborepo to cache outputs of the `build` task. The Next.js `build` script creates a `.next` directory, whereas SvelteKit creates a `.svelte-kit` directory. Instead of adding both build directories in your root `outputs`, you can define the `outputs` key in the workspace instead:
+
+```jsonc filename="turbo.json"
+{
+  "pipeline": {
+    "build": {
+      "dependsOn": ["^codegen"]
+      // no need to define outputs here!
+    }
+  }
+}
+```
+
+```jsonc filename="apps/my-nextjs-app/turbo.json"
+{
+  "extends": ["//"],
+  "pipeline": {
+    "build": {
+      // dependsOn is inherited from root
+      "outputs": [".next/**", "!.next/cache/**"]
+    }
+  }
+}
+```
+
+```jsonc filename="apps/my-sveltekit-app/turbo.json"
+{
+  "extends": ["//"],
+  "pipeline": {
+    "build": {
+      // dependsOn is inherited from root
+      "outputs": [".svelte-kit/**"]
+    }
+  }
+}
+```
+
+The `extends` key in Workspace Configurations enables workspace owners to use the best of the root `turbo.json` and customize the parts that make their app different (the `"//"` sigil will look familiar if you're used to [defining tasks to run from your root](/repo/docs/core-concepts/monorepos/running-tasks#running-tasks-from-the-root)).
+
+Keys that **are** declared replace the corresponding keys from the root, overriding what is defined in your root configuration. Keys that **are not** declared are inherited from the root config.
+
+In the example above, `outputs` is customized for both apps, while `dependsOn` is configured by the root `turbo.json` and remains `"^codegen"`.
+
+[Learn more in the docs](/repo/docs/core-concepts/monorepos/configuring-workspaces).
+
+## Automatic Workspace Scoping
+
+In [Turborepo v1.7](/blog/turbo-1-7-0), `turbo` became globally installable, giving you the power to run your tasks from anywhere in your codebase.
However, `turbo` would still run tasks from the repository root, including tasks in workspaces that you may not have intended to run.
+
+With 1.8, `turbo` will automatically detect the workspace you are in and generate [the `--filter` syntax](/repo/docs/reference/command-line-reference#--filter) to scope your task to that workspace.
+
+As an example, if your current directory is `apps/admin` and you use the `turbo build` command, `turbo` will run `turbo build --filter=admin` under the hood, focusing on the workspace that you are working on.
+
+## Easier Migrations
+
+Manually running individual codemods in the correct order is no longer required when upgrading Turborepo versions. `@turbo/codemod` now provides a simple `migrate` command which both upgrades your repo to the specified version (`latest` by default) of `turbo`, _and_ runs any codemods required.
+
+Try it out now with `npx @turbo/codemod migrate`.
+
+## Community
+
+Since releasing [Turborepo v1.7](/blog/turbo-1-7-0) we've seen incredible adoption and community growth:
+
+- [19.6k+ GitHub Stars](https://github.com/vercel/turbo)
+- [987k weekly NPM downloads](https://www.npmjs.com/package/turbo)
+- 42 years of compute time saved through [Remote Caching on Vercel](https://vercel.com/docs/concepts/monorepos/remote-caching)
+
+Turborepo is the result of the combined work of all of its contributors, including our core team.
+
+Thank you for your continued support, feedback, and collaboration to make Turborepo your build tool of choice.
diff --git a/docs/pages/blog/turbo-1-9-0.mdx b/docs/pages/blog/turbo-1-9-0.mdx
new file mode 100644
index 0000000..b370cd8
--- /dev/null
+++ b/docs/pages/blog/turbo-1-9-0.mdx
@@ -0,0 +1,121 @@
+---
+title: Turborepo 1.9
+date: 2023/04/11
+description: Turborepo 1.9 focuses on improving observability for your task runs to better understand your caching behavior.
+tag: "web development"
+ogImage: /images/blog/turbo-1-9-0/twitter-card.png
+---
+
+# Turborepo 1.9
+
+import { Authors } from '../../components/Authors'
+import Badge from '../../components/Badge'
+import Date from "../../components/blog/Date";
+
+<Date>
+  Tuesday, April 11th, 2023
+</Date>
+
+<Authors authors={[
+  'gregsoltis',
+  'nathanhammond',
+  'tomknickman',
+  'anthonyshew',
+  'jaredpalmer',
+  'mehulkar',
+  'chrisolszewski',
+  'nicholasyang',
+  'alexanderlyon'
+]} />
+
+Turborepo 1.9 focuses on improving observability for your task runs to better understand your caching behavior:
+
+- [**Run Summaries**](#view-and-compare-task-runs): Use the `--summarize` flag to generate a summary of your task to compare against previous runs.
+- [**Easier Starters**](#bring-your-own-starter): Use the `--example` flag with `npx create-turbo` to start from official Turborepo examples or custom repositories.
+- [**Strict Environments** <Badge>Experimental</Badge>](#strict-environments-experimental): Try enabling strict mode to restrict the environment variables your tasks have access to.
+
+Update today by running `npx @turbo/codemod migrate`.
+
+## View and compare task runs
+
+You can now produce a JSON summary of your task run using the `--summarize` flag:
+
+```bash
+turbo build --summarize
+```
+
+When this flag is enabled, Turborepo will generate a summary in `.turbo/runs/` that contains all the information necessary to understand how `turbo` interpreted your task's configuration and code.
+
+```bash
+Tasks: 3 successful, 3 total
+Cached: 0 cached, 3 total
+Time: 1.707s
+Summary: /Users/acme/projects/acme/.turbo/runs/2Nn3X6nWDhP9ag8BnmivWRxHpHC.json
+```
+
+You can then compare summaries using your favorite JSON diffing tool to understand why you got a cache hit or a cache miss.
+
+See the [docs](/repo/docs/reference/command-line-reference#--summarize) to learn more.
+
+## Bring your own starter
+
+`create-turbo` now supports starting a new project from any of the official [Turborepo examples](https://github.com/vercel/turbo/tree/main/examples). Get started with an example using a single command:
+
+```bash
+npx create-turbo@latest -e kitchen-sink
+```
+
+In your terminal UI, choose your preferred package manager and `create-turbo` will automatically convert the chosen example to your package manager of choice.
+
+Additionally, you can use `create-turbo` with custom repository sources, allowing you to re-use your own custom starter or another starter from around the community:
+
+```bash
+npx create-turbo -e https://github.com/your-org/turbo-starter
+```
+
+## Strict Environments <Badge>Experimental</Badge>
+
+You can now use the `--experimental-env-mode=strict` flag to restrict the environment variables your tasks have access to. Your tasks will only be aware of the variables you explicitly state, creating a safer caching environment.
+
+In `strict` mode, Turborepo will pass environment variables declared in:
+
+- `globalEnv` and `experimentalGlobalPassThroughEnv` to all tasks
+- `env` and `experimentalPassThroughEnv` to the tasks that declare them
+
+```jsonc
+{
+  // Available to all tasks
+  "experimentalGlobalPassThroughEnv": ["GLOBAL_VAR_1"],
+
+  // Available to all tasks and invalidates caches
+  "globalEnv": ["GLOBAL_VAR_2"],
+
+  "pipeline": {
+    "build": {
+      // Only available to `build` tasks
+      "experimentalPassThroughEnv": ["VAR_1"],
+
+      // Available to `build` task and invalidates caches
+      "env": ["VAR_2"]
+    }
+  }
+}
+```
+
+In `strict` mode, this configuration will only expose four environment variables to your `build` tasks, helping you catch missing variables earlier in the development process.
+
+`--experimental-env-mode` also supports `loose` and `infer`.
+
+Learn more in the [docs](/repo/docs/reference/command-line-reference#--experimental-env-mode).
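Conceptually, strict mode amounts to filtering the parent environment down to an allowlist before a task runs. A rough model of that behavior (an illustration of the concept, not Turborepo's actual implementation):

```typescript
// Rough model of strict mode: only allow-listed variables reach a task.
// This is an illustration of the concept, not Turborepo's real code.
function strictEnv(
  parentEnv: Record<string, string | undefined>,
  allowlist: string[],
): Record<string, string> {
  const taskEnv: Record<string, string> = {};
  for (const name of allowlist) {
    const value = parentEnv[name];
    if (value !== undefined) taskEnv[name] = value;
  }
  return taskEnv;
}
```

For the `build` task in the configuration above, the allowlist would be the union of `globalEnv`, `experimentalGlobalPassThroughEnv`, `env`, and `experimentalPassThroughEnv`: four variables in total, with everything else invisible to the task.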
+ +## Community + +Since releasing [Turborepo v1.8](/blog/turbo-1-8-0) we've seen incredible adoption and community growth: + +- [20.5k+ GitHub Stars](https://github.com/vercel/turbo) +- [1.1M weekly NPM downloads](https://www.npmjs.com/package/turbo) +- 64 years of compute time saved through [Remote Caching on Vercel](https://vercel.com/docs/concepts/monorepos/remote-caching) + +Turborepo is the result of the combined work of all of its contributors, including our core team. + +Thank you for your continued support, feedback, and collaboration to make Turborepo your build tool of choice. diff --git a/docs/pages/blog/turbopack-benchmarks.mdx b/docs/pages/blog/turbopack-benchmarks.mdx new file mode 100644 index 0000000..e6a0818 --- /dev/null +++ b/docs/pages/blog/turbopack-benchmarks.mdx @@ -0,0 +1,275 @@ +--- +title: "Turbopack Performance Benchmarks" +date: 2022/10/31 +description: "Benchmarking Turbopack performance against Vite and webpack." +tag: "web development" +ogImage: "/images/blog/turbopack-benchmarks/twitter-card.png" +--- + +import { DEFAULT_BARS, HMR_BARS } from '../../components/pages/pack-home/PackBenchmarks' +import { DocsBenchmarksGraph } from '../../components/pages/pack-home/DocsBenchmarksGraph'; +import { Tabs, Tab } from '../../components/Tabs' +import { ThemedImageFigure } from '../../components/image/ThemedImageFigure' +import { Authors } from '../../components/Authors' +import Callout from '../../components/Callout' +import Date from "../../components/blog/Date"; + +# Turbopack Performance Benchmarks + +<Date update={<>Thursday, December 22nd, 2022</>}> + Monday, October 31st, 2022 +</Date> + +<Authors authors={[ 'tobiaskoppers', 'alexkirsz' ]} /> + +<p className="text-gray-500 mt-6 uppercase text-sm tracking-wider">Summary</p> + +- We are thankful for the work of the entire OSS ecosystem and the incredible interest and reception from the [Turbopack release](https://vercel.com/blog/turbopack). 
We look forward to continuing our collaboration with and integration into the broader Web ecosystem of tooling and frameworks.
+- In this article, you will find our methodology and documentation supporting the benchmarks that show **Turbopack is [much faster](#bench) than existing non-incremental approaches.**
+- **Turbopack** and [**Next.js 13.0.1**](https://github.com/vercel/next.js/releases/tag/v13.0.1) are out, addressing a regression that snuck in prior to public release and after the initial benchmarks were taken. We also fixed an incorrect rounding bug on our website (`0.01s` → `15ms`). We appreciate [Evan You](https://twitter.com/youyuxi)'s work that helped us identify and [correct this](https://github.com/vercel/turbo/pull/2516).
+- We are excited to continue to evolve the incremental build architecture of Turbopack. We believe that there are still significant performance wins on the table.
+
+<hr className="mt-8 w-full border-gray-400 authors border-opacity-20"/>
+
+At [Next.js Conf](https://nextjs.org), [we announced](https://www.youtube.com/watch?v=NiknNI_0J48) our latest open-source project: Turbopack, an incremental bundler and build system optimized for JavaScript and TypeScript, written in Rust.
+
+The project began as an exploration to improve webpack's performance and create ways for it to more easily integrate with tooling moving forward. In doing so, the team realized that a greater effort was necessary. While we saw opportunities for better performance, the premise of a new architecture that could scale to the largest projects in the world was inspiring.
+
+In this post, we will explore why Turbopack is so fast, how its incremental engine works, and benchmark it against existing approaches.
+
+## Why is Turbopack _blazing_ fast?
+
+Turbopack's speed comes from its incremental computation engine.
Similar to trends we have seen in frontend state libraries, computational work is split into reactive functions that enable Turbopack to apply updates to an existing compilation without going through a full graph recomputation and bundling lifecycle.
+
+This does not work like traditional caching where you look up a result from a cache before an operation and then decide whether or not to use it. That would be too slow.
+
+Instead, Turbopack skips work altogether for cached results and only recomputes affected parts of its internal dependency graph of functions. This makes updates independent of the size of the whole compilation, and eliminates the usual overhead of traditional caching.
+
+## Benchmarking Turbopack, webpack, and Vite
+
+We created a test generator that makes an application with a variable number of modules to benchmark cold startup and file updating tasks. This generated app includes entries for these tools:
+
+- Next.js 11
+- Next.js 12
+- Next.js 13 with Turbopack
+- Vite
+
+As the current state of the art, we are including [Vite](https://vitejs.dev/) along with webpack-based [Next.js](https://nextjs.org) solutions. All of these toolchains point to the same generated component tree, assembling a [Sierpiński triangle](https://en.wikipedia.org/wiki/Sierpi%C5%84ski_triangle) in the browser, where every triangle is a separate module.
+
+<ThemedImageFigure
+  borderRadius={false}
+  dark={{
+    source: '/images/blog/turbopack-benchmarks/triangle-dark.png',
+    height: 600,
+    width: 1200
+  }}
+  light={{
+    source: '/images/blog/turbopack-benchmarks/triangle-light.png',
+    height: 600,
+    width: 1200
+  }}
+  captionSpacing={-12}
+  caption="This image is a screenshot of the test application we run our benchmarks on. It depicts a Sierpiński triangle where each single triangle is its own component, separated in its own file. In this example, there are 3,000 triangles being rendered to the screen."
+/>
+
+### Cold startup time
+
+This test measures how fast a local development server starts up on an application of various sizes. We measure this as the time from startup (without cache) until the app is rendered in the browser. We do not wait for the app to be interactive or hydrated in the browser for this dataset.
+
+Based on feedback and collaboration with the Vite team, we used the [SWC plugin](https://github.com/vitejs/vite-plugin-react-swc) with Vite in place of the [default Babel plugin](https://github.com/vitejs/vite-plugin-react) for improved performance in this benchmark.
+
+<DocsBenchmarksGraph category="cold" bars={DEFAULT_BARS} />
+
+<ThemedImageFigure
+  borderRadius={true}
+  dark={{
+    source: '/images/blog/turbopack-benchmarks/bench_startup_dark.svg',
+    height: 720,
+    width: 1960
+  }}
+  light={{
+    source: '/images/blog/turbopack-benchmarks/bench_startup_light.svg',
+    height: 720,
+    width: 1960
+  }}
+  captionSpacing={24}
+  caption="Startup time by module count. Benchmark data generated from 16” MacBook Pro 2021, M1 Max, 32GB RAM, macOS 13.0.1 (22A400)."
+/>
+
+#### Data
+
+To run this benchmark yourself, clone [`vercel/turbo`](https://github.com/vercel/turbo) and then use this command from the root:
+
+```sh
+TURBOPACK_BENCH_COUNTS=1000,5000,10000,30000 TURBOPACK_BENCH_BUNDLERS=all cargo bench -p next-dev "startup/(Turbopack SSR|Next.js 12 SSR|Next.js 11 SSR|Vite SWC CSR)."
+```
+
+Here are the numbers we were able to produce on a 16” MacBook Pro 2021, M1 Max, 32GB RAM, macOS 13.0.1 (22A400):
+
+```sh
+bench_startup/Next.js 11 SSR/1000 modules      9.2±0.04s
+bench_startup/Next.js 11 SSR/5000 modules     32.9±0.67s
+bench_startup/Next.js 11 SSR/10000 modules    71.8±2.57s
+bench_startup/Next.js 11 SSR/30000 modules   237.6±6.43s
+bench_startup/Next.js 12 SSR/1000 modules      3.6±0.02s
+bench_startup/Next.js 12 SSR/5000 modules     12.1±0.32s
+bench_startup/Next.js 12 SSR/10000 modules    23.3±0.32s
+bench_startup/Next.js 12 SSR/30000 modules    89.1±0.21s
+bench_startup/Turbopack SSR/1000 modules     1381.9±5.62ms
+bench_startup/Turbopack SSR/5000 modules       4.0±0.04s
+bench_startup/Turbopack SSR/10000 modules      7.3±0.07s
+bench_startup/Turbopack SSR/30000 modules     22.0±0.32s
+bench_startup/Vite SWC CSR/1000 modules        4.2±0.02s
+bench_startup/Vite SWC CSR/5000 modules       16.6±0.08s
+bench_startup/Vite SWC CSR/10000 modules      32.3±0.12s
+bench_startup/Vite SWC CSR/30000 modules      97.7±1.53s
+```
+
+### File updates (HMR)
+
+We also measure how quickly the development server works from when an update is applied to a source file to when the corresponding change is re-rendered in the browser.
+
+For Hot Module Reloading (HMR) benchmarks, we first start the dev server on a fresh installation with the test application. We wait for the HMR server to boot up by running updates until one succeeds. We then run ten changes to warm up the tooling. This step is important as it prevents discrepancies that can arise with cold processes.
+
+Once our tooling is warmed up, we run a series of updates to a list of modules within the test application. Modules are sampled randomly with a distribution that ensures we test a uniform number of modules per module depth. The depth of a module is its distance from the entry module in the dependency graph.
For instance, if the entry module A imports module B, which imports modules C and D, the depth of the entry module A will be 0, that of module B will be 1, and that of modules C and D will be 2. Modules A and B will have an equal probability of being sampled, but modules C and D will only have half the probability of being sampled.
+
+We report the linear regression slope of the data points as the target metric. This is an estimate of the average time it takes for the tooling to apply an update to the application.
+
+<DocsBenchmarksGraph category="file_change" bars={HMR_BARS} />
+
+<ThemedImageFigure
+  borderRadius={true}
+  dark={{
+    source: '/images/blog/turbopack-benchmarks/bench_hmr_to_commit_dark.svg',
+    height: 720,
+    width: 1960
+  }}
+  light={{
+    source: '/images/blog/turbopack-benchmarks/bench_hmr_to_commit_light.svg',
+    height: 720,
+    width: 1960
+  }}
+  captionSpacing={24}
+  caption="Turbopack, Next.js (webpack), and Vite HMR by module count. Benchmark data generated from 16” MacBook Pro 2021, M1 Max, 32GB RAM, macOS 13.0.1 (22A400)."
+/>
+
+<a id="bench"/>
+
+<ThemedImageFigure
+  borderRadius={true}
+  dark={{
+    source: '/images/blog/turbopack-benchmarks/bench_hmr_to_commit_turbopack_vite_dark.svg',
+    height: 720,
+    width: 1960
+  }}
+  light={{
+    source: '/images/blog/turbopack-benchmarks/bench_hmr_to_commit_turbopack_vite_light.svg',
+    height: 720,
+    width: 1960
+  }}
+  captionSpacing={24}
+  caption="Turbopack and Vite HMR by module count. Benchmark data generated from 16” MacBook Pro 2021, M1 Max, 32GB RAM, macOS 13.0.1 (22A400)."
+/>
+
+The takeaway: Turbopack performance is a function of **the size of an update**, not the size of an application.
+
+For more info, visit the comparison docs for [Vite](/pack/docs/comparisons/vite) and [webpack](/pack/docs/comparisons/webpack).
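The linear regression slope metric described earlier can be reproduced with an ordinary least-squares fit. A minimal sketch of the idea (this models the metric, not the actual benchmark harness):

```typescript
// Ordinary least-squares slope for (x, y) samples. With x as the number
// of updates applied and y as cumulative elapsed time, the slope
// estimates the average time per update, independent of fixed overhead.
function regressionSlope(samples: [number, number][]): number {
  const n = samples.length;
  const meanX = samples.reduce((sum, [x]) => sum + x, 0) / n;
  const meanY = samples.reduce((sum, [, y]) => sum + y, 0) / n;
  let covariance = 0;
  let varianceX = 0;
  for (const [x, y] of samples) {
    covariance += (x - meanX) * (y - meanY);
    varianceX += (x - meanX) ** 2;
  }
  return covariance / varianceX;
}
```

Because the fit discards the intercept, one-time costs such as server warmup do not contaminate the per-update estimate the way a plain mean would.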
+
+#### Data
+
+To run this benchmark yourself, clone [`vercel/turbo`](https://github.com/vercel/turbo) and then use this command from the root:
+
+```sh
+TURBOPACK_BENCH_COUNTS=1000,5000,10000,30000 TURBOPACK_BENCH_BUNDLERS=all cargo bench -p next-dev "hmr_to_commit/(Turbopack SSR|Next.js 12 SSR|Next.js 11 SSR|Vite SWC CSR)"
+```
+
+Here are the numbers we were able to produce on a 16” MacBook Pro 2021, M1 Max, 32GB RAM, macOS 13.0.1 (22A400):
+
+```sh
+bench_hmr_to_commit/Next.js 11 SSR/1000 modules    211.6±1.14ms
+bench_hmr_to_commit/Next.js 11 SSR/5000 modules    866.0±34.44ms
+bench_hmr_to_commit/Next.js 11 SSR/10000 modules     2.4±0.13s
+bench_hmr_to_commit/Next.js 11 SSR/30000 modules     9.5±3.12s
+bench_hmr_to_commit/Next.js 12 SSR/1000 modules    146.2±2.17ms
+bench_hmr_to_commit/Next.js 12 SSR/5000 modules    494.7±25.13ms
+bench_hmr_to_commit/Next.js 12 SSR/10000 modules  1151.9±280.68ms
+bench_hmr_to_commit/Next.js 12 SSR/30000 modules     6.4±2.29s
+bench_hmr_to_commit/Turbopack SSR/1000 modules      18.9±2.92ms
+bench_hmr_to_commit/Turbopack SSR/5000 modules      23.8±0.31ms
+bench_hmr_to_commit/Turbopack SSR/10000 modules     23.0±0.35ms
+bench_hmr_to_commit/Turbopack SSR/30000 modules     22.5±0.88ms
+bench_hmr_to_commit/Vite SWC CSR/1000 modules      104.8±1.52ms
+bench_hmr_to_commit/Vite SWC CSR/5000 modules      109.6±3.94ms
+bench_hmr_to_commit/Vite SWC CSR/10000 modules     113.0±1.20ms
+bench_hmr_to_commit/Vite SWC CSR/30000 modules     133.3±23.65ms
+```
+
+As a reminder, Vite is using the official [SWC plugin](https://github.com/vitejs/vite-plugin-react-swc) for these benchmarks, which is not the default configuration.
+
+Visit the [Turbopack benchmark documentation](/pack/docs/benchmarks) to run the benchmarks yourself. If you have questions about the benchmark, please open an [issue on GitHub](https://github.com/vercel/turbo/issues).
+
+## The future of the open-source Web
+
+Our team has taken the lessons from 10 years of webpack, combined with the innovations in incremental computation from [Turborepo](/repo) and Google's Bazel, and created an architecture ready to support the coming decades of computing.
+
+Our goal is to create a system of open source tooling that helps to build the future of the Web—powered by Turbopack. We are creating a reusable piece of architecture that will make both development and warm production builds faster for everyone.
+
+For Turbopack's alpha, we are including it in Next.js 13. But, in time, [we hope that Turbopack will power other frameworks and builders](https://twitter.com/youyuxi/status/1585040276447690752?s=20&t=YV0ASkHl5twCWQvJF5jpwg) as a seamless, low-level, incremental engine to build great developer experiences with.
+
+We look forward to being a part of the community bringing developers better tooling so that they can continue to deliver better experiences to end users. If you would like to learn more about Turbopack benchmarks, visit [turbo.build](https://turbo.build/). To try out Turbopack in Next.js 13, visit [nextjs.org](https://nextjs.org/docs/advanced-features/turbopack).
+
+---
+
+## Update (2022/12/22)
+
+When we first released Turbopack, we made some claims about its performance relative to previous Next.js versions (11 and 12), and relative to Vite. These numbers were computed with our benchmark suite, which was publicly available on the [turbo repository](https://github.com/vercel/turbo), but we hadn't written up much about them, nor had we provided clear instructions on how to run them.
+
+After collaborating with Vite's core contributor [Evan You](https://github.com/yyx990803), we released this blog post explaining our methodology and we updated our website to provide instructions on how to run the benchmarks.
+
+Based on the outcome of our collaboration with Vite, here are some clarifications we have made to the benchmarks above on our testing methodology:
+
+### Raw HMR vs. React Refresh
+
+In the numbers we initially published, we were measuring the time between a file change and the update being run in the browser, but not the time it takes for React Refresh to re-render the update (`hmr_to_eval`).
+
+We had another benchmark which included React Refresh (`hmr_to_commit`) which we elected not to use because we thought it mostly accounted for React overhead—an additional 30ms. However, this assumption turned out to be wrong, and the issue was [within Next.js' update handling code](https://github.com/vercel/next.js/pull/42350).
+
+On the other hand, Vite's `hmr_to_eval` and `hmr_to_commit` numbers were much closer together (no 30ms difference), and [this brought up suspicion that our benchmark methodology was flawed and that we weren't measuring the right thing](https://github.com/yyx990803/vite-vs-next-turbo-hmr/discussions/8).
+
+This blog post has been updated to include React Refresh numbers.
+
+### Root vs. Leaf
+
+[Evan You's benchmark application](https://github.com/yyx990803/vite-vs-next-turbo-hmr) is composed of a single, very large file with a thousand imports, and a thousand very small files. The shape of our benchmark is different – it represents a tree of files, with each file importing 0 or 3 other files, trying to mimic an average application. Evan helped us find a regression when editing large, root files in his benchmark. The cause for this was [quickly identified and fixed by Tobias](https://github.com/vercel/turbo/commit/a11422fdf6b1b3cde9072d90aab6d9eebfacb591) and was released in Next 13.0.1.
+
+We have adapted our HMR benchmarks to sample modules uniformly at all depths, and we have updated this blog post and our documentation to include more details about this process.
+
+### SWC vs.
Babel
+
+When we initially released the benchmarks, we were using the official Vite React plugin, which uses Babel under the hood. Turbopack itself uses SWC, [which is much faster than Babel](https://swc.rs/blog/perf-swc-vs-babel). Evan You suggested that for a more accurate comparison, we should change Vite's benchmark to use the SWC plugin instead of the default Babel experience.
+
+While SWC does improve Vite's startup performance significantly, it only shows a small difference in HMR updates (\<10%). It should be noted that the React Refresh implementations between the plugins are different, hence this might not be measuring SWC's effect but some other implementation detail.
+
+We have updated our benchmarks to run Vite with the official SWC plugin.
+
+### File Watcher Differences
+
+Every OS provides its own APIs for watching files. On macOS, Turbopack uses [FSEvents](https://developer.apple.com/documentation/coreservices/file_system_events), which has been shown to have [~12ms latency](https://twitter.com/devongovett/status/1586599130494746625) for reporting updates. We have considered using [kqueue](https://www.freebsd.org/cgi/man.cgi?query=kqueue&sektion=2) instead, which has much lower latency. However, since it is not a drop-in replacement and brings its own set of drawbacks, this is still in the exploratory stage and not a priority.
+
+Expect to see different numbers on Linux, where [inotify](https://man7.org/linux/man-pages/man7/inotify.7.html) is the standard monitoring mechanism.
+
+### Improving our Methodology
+
+Our benchmarks originally only sampled 1 HMR update from 10 different instances of each bundler running in isolation, for a total of 10 sampled updates, and reported the mean time. Before sampling each update, bundler instances were warmed up by running 5 updates.
+
+Sampling so few updates meant high variance from one measurement to the next, and this was particularly significant in Vite's HMR benchmark, where single updates could take anywhere between 80 and 300ms.
+
+We have since refactored our benchmarking architecture to measure a variable number of updates per bundler instance, which results in anywhere between 400 and 4000 sampled updates per bundler. We have also increased the number of warmup updates to 10.
+
+Finally, instead of measuring the mean of all samples, we are measuring the slope of the linear regression line.
+
+This change has improved the soundness of our results. It also brings Vite's numbers down to an **almost constant 100ms at any application size**, while we were previously measuring 200+ms updates for larger applications of over 10,000 modules.
+
+### Conclusion
+
+We are now measuring Turbopack to be consistently 5x faster than Vite for HMR, over all application sizes, after collaborating with the Vite team on the benchmarks. We have updated all of our benchmarks to reflect our new methodology.
diff --git a/docs/pages/blog/you-might-not-need-typescript-project-references.mdx b/docs/pages/blog/you-might-not-need-typescript-project-references.mdx
new file mode 100644
index 0000000..5874d98
--- /dev/null
+++ b/docs/pages/blog/you-might-not-need-typescript-project-references.mdx
@@ -0,0 +1,95 @@
+---
+title: You might not need TypeScript project references
+date: 2021/04/23
+description: As it turns out, you might not even need project references or an interim TypeScript build step with a pattern I am about to show you, which I dub "internal packages."
+tag: web development
+ogImage: /images/blog/you-might-not-need-typescript-project-references/twitter-card.png
+---
+
+# You might not need TypeScript project references
+
+import { Authors } from "../../components/Authors";
+import Callout from "../../components/Callout";
+
+<Authors authors={['jaredpalmer']} />
+
+If you've worked in a larger TypeScript codebase or monorepo, you are likely familiar with [project references](https://www.typescriptlang.org/docs/handbook/project-references.html). They are indeed fairly powerful.
+
+When you reference a project in your `tsconfig.json`, new things happen:
+
+- Importing modules from a referenced project will instead load its output declaration file (`.d.ts`)
+- If the referenced project produces an `outFile`, the output `.d.ts` file's declarations will be visible in this project
+- Running build mode (`tsc -b`) will automatically build the referenced project if it hasn't been built but is needed
+- By separating into multiple projects, you can greatly improve the speed of typechecking and compiling, reduce memory usage when using an editor, and improve enforcement of the logical groupings of your program.
+
+Sounds awesome! Right?! Well...maybe. Once you add references to your project, you now need to continuously update them whenever you add or remove packages. That kinda blows.
+
+Well...what if you didn't need to?
+
+## "Internal" TypeScript Packages
+
+As it turns out, you might not even need references or an interim TypeScript build step with a pattern I am about to show you, which I dub "internal packages."
+
+**An "internal package" is a TypeScript package without a `tsconfig.json` with _both_ its `types` and `main` fields in its `package.json` pointing to the package's untranspiled entry point (e.g. `./src/index.tsx`).**
+
+```json
+{
+  "name": "@sample/my-internal-package",
+  "main": "./src/index.ts",
+  "types": "./src/index.ts", // yes, this works!
+  "dependencies": {
+    ...
+ }, + "devDependencies": { + ... + } +} +``` + +As it turns out, the TypeScript Language Server (in VSCode) and Type Checker can treat both a raw `.ts` or `.tsx` file as its own valid type declaration. This last sentence is obvious once you read it twice. What isn't so obvious, though, is that you can point the `types` field directly to raw source code. + +Once you do this, this package can then be used _without_ project references or a TypeScript build step (either via `tsc` or `esbuild` etc) as long as you adhere to 2 rules: + +- **The consuming application of an internal package must transpile and typecheck it.** +- **You should never publish an internal package to npm.** + +As far as I can tell, this internal package pattern works with all yarn/npm/pnpm workspace implementations regardless of whether you are using Turborepo or some other tool. I have personally tested this pattern with several different meta frameworks (see below), but I'm sure that it works with others as well. + +### Next.js + +Next.js 13 can automatically transpile and bundle dependencies from local packages (like monorepos) or from external dependencies (`node_modules`). + +```js +/** @type {import('next').NextConfig} */ +const nextConfig = { + transpilePackages: ['@acme/ui', 'lodash-es'], +}; + +module.exports = nextConfig; +``` + +<Callout type="info">As of Next.js 13.1, you no longer need the `next-transpile-modules` package. For more information, visit the Next.js [Built-in module transpilation](https://nextjs.org/blog/next-13-1#built-in-module-transpilation-stable) blog post.</Callout> + +### Vite + +Internal packages just work. No extra config is needed. 
+
+### React Native
+
+If you use [Expo](https://expo.dev/) with the [`expo-yarn-workspaces`](https://github.com/expo/expo/tree/master/packages/expo-yarn-workspaces) or [`@turborepo/adapter-expo`](https://www.npmjs.com/package/@turborepo/adapter-expo) package, you can use internal packages as long as you are targeting iOS or Android. When you run Expo for these platforms, all of `node_modules` are automatically transpiled with [Metro](https://facebook.github.io/metro/). However, if you are targeting Expo for web, internal packages will not work because `node_modules` are oddly not transpiled for web.
+
+_I reached out to the Expo team about this inconsistency. They are aware of it. It's a legacy wart, I'm told._
+
+## The beauty of this pattern
+
+This pattern rocks because it saves you from extra needless or duplicative build steps. It also gives you all the editor benefits of project references, but without any configuration.
+
+## Caveats
+
+When you use an internal package, it's kind of like telling the consuming application that you have another source directory—which has pros and cons. As your consuming application(s) grow, adding more internal packages is identical to adding more source code to that consuming application. Thus, when you add more source code, there is more code to transpile/bundle/typecheck...so this can result in slower builds of the consuming application (as there is just more work to do) but potentially faster (and less complicated) overall build time. When/if overall build time begins to suffer, you might decide to convert your larger internal packages back into "regular" packages with `.d.ts` files and with normal TypeScript build steps.
+
+As previously mentioned, this pattern actually has very little to do with Turborepo. It's just super duper awesome and I think you should be aware of it. As we are actively working on preset package build rules (i.e.
"builders") for Turborepo, we'll be using the internal package pattern to skip build steps. + +## Speaking of long build times... + +Shameless plug here. If you are reading this post and you're struggling with slow build and test times, I'd love to show you how Turborepo can help. I guarantee that Turborepo will cut your monorepo's build time by 50% or more. [You can request a live demo right here.](https://vercel.com/contact/sales?utm_source=turbo.build&utm_medium=referral&utm_campaign=blog-project-references) diff --git a/docs/pages/confirm.mdx b/docs/pages/confirm.mdx new file mode 100644 index 0000000..6dd2033 --- /dev/null +++ b/docs/pages/confirm.mdx @@ -0,0 +1,8 @@ +--- +headeronly: true +searchable: false +--- + +import Confirm from "../components/pages/confirm"; + +<Confirm /> diff --git a/docs/pages/index.mdx b/docs/pages/index.mdx index 058465c..b5b08bd 100644 --- a/docs/pages/index.mdx +++ b/docs/pages/index.mdx @@ -1,11 +1,7 @@ -# Introduction - -Welcome to Nextra! This is a basic docs template. You can use it as a starting point for your own project :) +--- +description: "Turbo is an incremental bundler and build system optimized for JavaScript and TypeScript, written in Rust." +--- -## What is Nextra? +import LandingPage from "../components/pages/landing"; -A **simple**, **powerful** and **flexible** site generation framework with everything you love from Next.js. - -## Documentation - -The documentation is available at [https://nextra.site](https://nextra.site).
+<LandingPage /> diff --git a/docs/pages/pack/_meta.json b/docs/pages/pack/_meta.json new file mode 100644 index 0000000..4941080 --- /dev/null +++ b/docs/pages/pack/_meta.json @@ -0,0 +1,15 @@ +{ + "index": { + "type": "page", + "display": "hidden", + "theme": { + "layout": "raw", + "sidebar": false, + "toc": true + } + }, + "docs": { + "title": "Docs", + "display": "children" + } +} diff --git a/docs/pages/pack/docs/_meta.json b/docs/pages/pack/docs/_meta.json new file mode 100644 index 0000000..ddd6017 --- /dev/null +++ b/docs/pages/pack/docs/_meta.json @@ -0,0 +1,10 @@ +{ + "index": "Quickstart", + "why-turbopack": "Why Turbopack?", + "core-concepts": "Core Concepts", + "roadmap": "Roadmap", + "features": "Features", + "comparisons": "Comparisons", + "benchmarks": "Benchmarks", + "migrating-from-webpack": "Migrating from Webpack" +} diff --git a/docs/pages/pack/docs/advanced/profiling.mdx b/docs/pages/pack/docs/advanced/profiling.mdx new file mode 100644 index 0000000..7c50d18 --- /dev/null +++ b/docs/pages/pack/docs/advanced/profiling.mdx @@ -0,0 +1,74 @@ +--- +title: Profiling +description: Learn how to profile Turbopack +--- + +import { ThemedImageFigure } from '../../../../components/image/ThemedImageFigure'; + +# Profiling Turbopack + +## On macOS + +### Install [`cargo-instruments`] + +```sh +cargo install cargo-instruments +``` + +Make sure you have all the [prerequisites](https://github.com/cmyr/cargo-instruments#pre-requisites) for running cargo-instruments. + +### Run the profiler + +By default, the next-dev CLI will keep watching for changes to your application and never exit until you manually interrupt it. However, [`cargo-instruments`] waits for your program to exit before building and opening the trace file. For this purpose, we've added a `profile` feature to `next-dev` which exits the program if no updates are detected within a given time frame and there are no pending tasks. 
+ +To profile `next-dev`, run the following command: + +```sh +cargo instruments -t time --bin next-dev --release --features profile [-- [...args]] +``` + +You can also run [other templates](https://github.com/cmyr/cargo-instruments#templates) than the time profiler. + +Once the program exits, the profiler will open the trace file in Instruments. Refer to the [learning resources](https://github.com/cmyr/cargo-instruments#resources) to learn how to use Instruments. + +<ThemedImageFigure + borderRadius={true} + dark={{ + source: '/images/docs/pack/instruments-dark.png', + height: 662, + width: 968 + }} + light={{ + source: '/images/docs/pack/instruments-light.png', + height: 706, + width: 1012 + }} + captionSpacing={24} + caption="An example trace from the time profiler." +/> + +## Linux + +### Memory usage + +```sh +# Install `heaptrack` and `heaptrack_gui` +sudo apt install heaptrack heaptrack_gui + +# Compile with debug info but without the alternative allocator: +CARGO_PROFILE_RELEASE_DEBUG=1 cargo build --bin next-dev --release --no-default-features --features cli + +# Run the binary with heaptrack (it will be much slower than usual) +heaptrack target/release/next-dev [...] + +# Stop it anytime + +# Open the GUI and open the heaptrack.next-dev.XXX.gz file +heaptrack_gui +``` + +## On other platforms + +We currently don't have a guide for profiling Turbopack on other platforms. 
+ +[`cargo-instruments`]: https://github.com/cmyr/cargo-instruments diff --git a/docs/pages/pack/docs/benchmarks.mdx b/docs/pages/pack/docs/benchmarks.mdx new file mode 100644 index 0000000..41788a9 --- /dev/null +++ b/docs/pages/pack/docs/benchmarks.mdx @@ -0,0 +1,174 @@ +import { DEFAULT_BARS } from '../../../components/pages/pack-home/PackBenchmarks' +import { DocsBenchmarksGraph } from '../../../components/pages/pack-home/DocsBenchmarksGraph'; +import Callout from '../../../components/Callout'; +import { ThemedImageFigure } from '../../../components/image/ThemedImageFigure'; +import { HMR_BARS } from '../../../components/pages/pack-home/PackBenchmarks' + +# Benchmarks + +We created a test generator that makes an application with a variable amount of modules to benchmark cold startup and file updating tasks. This generated app includes entries for these tools: + +- Next.js 11 +- Next.js 12 +- Next.js 13 with Turbopack +- Vite + +As the current state of the art, we're including [Vite](https://vitejs.dev/) along with webpack-based [Next.js](https://nextjs.org) solutions. All of these toolchains point to the same generated component tree, assembling a [SierpiΕ„ski triangle](https://en.wikipedia.org/wiki/Sierpi%C5%84ski_triangle) in the browser, where every triangle is a separate module. + +<ThemedImageFigure + borderRadius={false} + dark={{ + source: '/images/blog/turbopack-benchmarks/triangle-dark.png', + height: 600, + width: 1200 + }} + light={{ + source: '/images/blog/turbopack-benchmarks/triangle-light.png', + height: 600, + width: 1200 + }} + captionSpacing={-12} + caption="This image is a screenshot of the test application we run our benchmarks on. It depicts a SierpiΕ„ski triangle where each single triangle is its own component, separated in its own file. In this example, there are 3,000 triangles being rendered to the screen." 
+/> + +## Metrics + +Let's break down exactly what each of these metrics mean, and how they'll impact your day-to-day developer experience. + +<Callout type="info"> + Curious about how these benchmarks are implemented, or want to run them yourself? Check out [our benchmark suite documentation in the Turbo monorepo](https://github.com/vercel/turbo/blob/main/crates/next-dev/benches/README.md). +</Callout> + +### Cold startup time + +This test measures how fast a local development server starts up on an application of various sizes. We measure this as the time from startup (without cache) until the app is rendered in the browser. We do not wait for the app to be interactive or hydrated in the browser for this dataset. + +<DocsBenchmarksGraph category="cold" bars={DEFAULT_BARS} /> + +<ThemedImageFigure + borderRadius={true} + dark={{ + source: '/images/blog/turbopack-benchmarks/bench_startup_dark.svg', + height: 720, + width: 1960 + }} + light={{ + source: '/images/blog/turbopack-benchmarks/bench_startup_light.svg', + height: 720, + width: 1960 + }} + captionSpacing={24} + caption="Startup time by module count. Benchmark data generated from 16” MacBook Pro 2021, M1 Max, 32GB RAM, macOS 13.0.1 (22A400)." +/> + +#### Data + +To run this benchmark yourself, clone [`vercel/turbo`](https://github.com/vercel/turbo) and then use this command from the root: + +```sh +TURBOPACK_BENCH_COUNTS=1000,5000,10000,30000 TURBOPACK_BENCH_BUNDLERS=all cargo bench -p next-dev "startup/(Turbopack SSR|Next.js 12 SSR|Next.js 11 SSR|Vite SWC CSR)." 
+``` + +Here are the numbers we were able to produce on a 16” MacBook Pro 2021, M1 Max, 32GB RAM, macOS 13.0.1 (22A400): + +```sh +bench_startup/Next.js 11 SSR/1000 modules 9.2Β±0.04s +bench_startup/Next.js 11 SSR/5000 modules 32.9Β±0.67s +bench_startup/Next.js 11 SSR/10000 modules 71.8Β±2.57s +bench_startup/Next.js 11 SSR/30000 modules 237.6Β±6.43s +bench_startup/Next.js 12 SSR/1000 modules 3.6Β±0.02s +bench_startup/Next.js 12 SSR/5000 modules 12.1Β±0.32s +bench_startup/Next.js 12 SSR/10000 modules 23.3Β±0.32s +bench_startup/Next.js 12 SSR/30000 modules 89.1Β±0.21s +bench_startup/Turbopack SSR/1000 modules 1381.9Β±5.62ms +bench_startup/Turbopack SSR/5000 modules 4.0Β±0.04s +bench_startup/Turbopack SSR/10000 modules 7.3Β±0.07s +bench_startup/Turbopack SSR/30000 modules 22.0Β±0.32s +bench_startup/Vite SWC CSR/1000 modules 4.2Β±0.02s +bench_startup/Vite SWC CSR/5000 modules 16.6Β±0.08s +bench_startup/Vite SWC CSR/10000 modules 32.3Β±0.12s +bench_startup/Vite SWC CSR/30000 modules 97.7Β±1.53s +``` + +### File updates (HMR) + +We also measure how quickly the development server works from when an update is applied to a source file to when the corresponding change is re-rendered in the browser. + +For Hot Module Reloading (HMR) benchmarks, we first start the dev server on a fresh installation with the test application. We wait for the HMR server to boot up by running updates until one succeeds. We then run ten changes to warm up the tooling. This step is important as it prevents discrepancies that can arise with cold processes. + +Once our tooling is warmed up, we run a series of updates to a list of modules within the test application. Modules are sampled randomly with a distribution that ensures we test a uniform number of modules per module depth. The depth of a module is its distance from the entry module in the dependency graph. 
For instance, if the entry module A imports module B, which imports modules C and D, the depth of the entry module A will be 0, that of module B will be 1, and that of modules C and D will be 2. Modules A and B will have an equal probability of being sampled, but modules C and D will only have half the probability of being sampled. + +We report the linear regression slope of the data points as the target metric. This is an estimate of the average time it takes for the tooling to apply an update to the application. + +<DocsBenchmarksGraph category="file_change" bars={HMR_BARS} /> + +<ThemedImageFigure + borderRadius={true} + dark={{ + source: '/images/blog/turbopack-benchmarks/bench_hmr_to_commit_dark.svg', + height: 720, + width: 1960 + }} + light={{ + source: '/images/blog/turbopack-benchmarks/bench_hmr_to_commit_light.svg', + height: 720, + width: 1960 + }} + captionSpacing={24} + caption="Turbopack, Next.js (webpack), and Vite HMR by module count. Benchmark data generated from 16” MacBook Pro 2021, M1 Max, 32GB RAM, macOS 13.0.1 (22A400)." +/> + +<a id="bench"/> + +<ThemedImageFigure + borderRadius={true} + dark={{ + source: '/images/blog/turbopack-benchmarks/bench_hmr_to_commit_turbopack_vite_dark.svg', + height: 720, + width: 1960 + }} + light={{ + source: '/images/blog/turbopack-benchmarks/bench_hmr_to_commit_turbopack_vite_light.svg', + height: 720, + width: 1960 + }} + captionSpacing={24} + caption="Turbopack and Vite HMR by module count. Benchmark data generated from 16” MacBook Pro 2021, M1 Max, 32GB RAM, macOS 13.0.1 (22A400)." +/> + +The takeaway from these benchmarks: Turbopack performance is a function of **the size of an update**, not the size of an application. + +We're excited by Turbopack's performance, but we also expect to make further speed improvements for both small and large applications in the future. For more info, visit the comparison docs for [Vite](/pack/docs/comparisons/vite) and [webpack](/pack/docs/comparisons/webpack). 
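The reported statistic can be reproduced with an ordinary least-squares fit over (update index, update time) samples. This is an illustrative sketch of the metric, not the benchmark harness's actual code:

```js
// Least-squares slope over [x, y] data points, where x is the update index
// and y is the measured update time in ms. The slope estimates the average
// time per update, which is the benchmark's target metric.
function slope(points) {
  const n = points.length;
  const meanX = points.reduce((sum, [x]) => sum + x, 0) / n;
  const meanY = points.reduce((sum, [, y]) => sum + y, 0) / n;
  const num = points.reduce((sum, [x, y]) => sum + (x - meanX) * (y - meanY), 0);
  const den = points.reduce((sum, [x]) => sum + (x - meanX) ** 2, 0);
  return num / den;
}

// Perfectly linear data: each update takes 25ms more than a 100ms baseline.
slope([[0, 100], [1, 125], [2, 150], [3, 175]]); // β†’ 25
```

Fitting a slope rather than averaging raw timings makes the metric robust to a constant per-run overhead.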
+ +#### Data + +To run this benchmark yourself, clone [`vercel/turbo`](https://github.com/vercel/turbo) and then use this command from the root: + +``` +TURBOPACK_BENCH_COUNTS=1000,5000,10000,30000 TURBOPACK_BENCH_BUNDLERS=all cargo bench -p next-dev "hmr_to_commit/(Turbopack SSR|Next.js 12 SSR|Next.js 11 SSR|Vite SWC CSR)" +``` + +Here are the numbers we were able to produce on a 16” MacBook Pro 2021, M1 Max, 32GB RAM, macOS 13.0.1 (22A400): + +```sh +bench_hmr_to_commit/Next.js 11 SSR/1000 modules 211.6Β±1.14ms +bench_hmr_to_commit/Next.js 11 SSR/5000 modules 866.0Β±34.44ms +bench_hmr_to_commit/Next.js 11 SSR/10000 modules 2.4Β±0.13s +bench_hmr_to_commit/Next.js 11 SSR/30000 modules 9.5Β±3.12s +bench_hmr_to_commit/Next.js 12 SSR/1000 modules 146.2Β±2.17ms +bench_hmr_to_commit/Next.js 12 SSR/5000 modules 494.7Β±25.13ms +bench_hmr_to_commit/Next.js 12 SSR/10000 modules 1151.9Β±280.68ms +bench_hmr_to_commit/Next.js 12 SSR/30000 modules 6.4Β±2.29s +bench_hmr_to_commit/Turbopack SSR/1000 modules 18.9Β±2.92ms +bench_hmr_to_commit/Turbopack SSR/5000 modules 23.8Β±0.31ms +bench_hmr_to_commit/Turbopack SSR/10000 modules 23.0Β±0.35ms +bench_hmr_to_commit/Turbopack SSR/30000 modules 22.5Β±0.88ms +bench_hmr_to_commit/Vite SWC CSR/1000 modules 104.8Β±1.52ms +bench_hmr_to_commit/Vite SWC CSR/5000 modules 109.6Β±3.94ms +bench_hmr_to_commit/Vite SWC CSR/10000 modules 113.0Β±1.20ms +bench_hmr_to_commit/Vite SWC CSR/30000 modules 133.3Β±23.65ms +``` + +Note that Vite is using the official [SWC plugin](https://github.com/vitejs/vite-plugin-react-swc) for these benchmarks, which is not the default configuration. + +If you have questions about the benchmark, please open an [issue on GitHub](https://github.com/vercel/turbo/issues). 
diff --git a/docs/pages/pack/docs/comparisons/BenchmarksCallout.tsx b/docs/pages/pack/docs/comparisons/BenchmarksCallout.tsx new file mode 100644 index 0000000..98e33e4 --- /dev/null +++ b/docs/pages/pack/docs/comparisons/BenchmarksCallout.tsx @@ -0,0 +1,14 @@ +import Link from "next/link"; +import Callout from "../../../../components/Callout"; + +export default function BenchmarksCallout() { + return ( + <Callout type="info"> + Want to know more about Turbopack's benchmarking process and + philosophy?{" "} + <Link href="../benchmarks" className="nx-underline"> + Learn more about Turbopack's benchmarking suite. + </Link> + </Callout> + ); +} diff --git a/docs/pages/pack/docs/comparisons/_meta.json b/docs/pages/pack/docs/comparisons/_meta.json new file mode 100644 index 0000000..58ddd9f --- /dev/null +++ b/docs/pages/pack/docs/comparisons/_meta.json @@ -0,0 +1,4 @@ +{ + "vite": "Vite", + "webpack": "Webpack" +} diff --git a/docs/pages/pack/docs/comparisons/vite.mdx b/docs/pages/pack/docs/comparisons/vite.mdx new file mode 100644 index 0000000..5425796 --- /dev/null +++ b/docs/pages/pack/docs/comparisons/vite.mdx @@ -0,0 +1,67 @@ +--- +title: Turbopack vs. Vite +description: Compare Turbopack vs. Vite +--- + +import { DocsBenchmarksGraph } from '../../../../components/pages/pack-home/DocsBenchmarksGraph'; +import { DocsBenchmarkStat } from '../../../../components/pages/pack-home/DocsBenchmarkStat'; +import BenchmarksCallout from './BenchmarksCallout'; +import Callout from '../../../../components/Callout' + +# Comparing Turbopack and Vite + +[Vite](https://vitejs.dev/) is an incredibly fast (non-)bundler that the web development community is extremely excited about - and so are we. Vite has raised the bar for web development and shown us what is possible for the future of the Web. If we were going to build a bundler, it had to perform at least as well as the (already impressive) Vite to validate our efforts. We're proud to say that we achieved that.
+ +<BenchmarksCallout /> + +## Speed + +Turbopack can outperform Vite on several key metrics. + +### Dev server startup time + +Vite is a non-bundler, which means it doesn't bundle your code at all. It sends each module directly to the browser. This means the browser does the hard work of handling dependencies between modules. + +On the surface, this seems like an unfair fight. Turbopack _bundles_ your application, meaning that a lot more work needs doing _before_ sending the code to the browser. + +But it turns out that Turbopack can handle this _faster_ than the browser can. By pre-bundling, we can save a lot of time over Vite's Native ESM system. You can learn more about this in our [Why Turbopack](/pack/docs/why-turbopack#bundling-vs-native-esm) section. + +This means that Turbopack's dev server starts up much faster than Vite's. On a 1,000 module application, Vite takes <DocsBenchmarkStat stat="vite-cold-1000" /> to start up. Turbopack starts up in <DocsBenchmarkStat stat="turbopack-cold-1000" /> - **<DocsBenchmarkStat stat="turbopack-cold-vs-vite" /> faster**. + +In large applications, this differential stays consistent. In a 30,000 module application, Turbopack starts up <DocsBenchmarkStat stat="turbopack-cold-vs-vite-30000" /> faster than Vite. + +Note that Vite is using the official [SWC plugin](https://github.com/vitejs/vite-plugin-react-swc) for these benchmarks, which is not the default configuration. + +<DocsBenchmarksGraph category="cold" bars={[ + { + label: 'Turbopack', + turbo: true, + key: 'next13' + }, + { + label: 'Vite', + key: 'vite', + swc: true + } +]} /> + +### Code updates + +Vite is extremely fast in development because of its speedy Fast Refresh capabilities. When you update a file, Vite uses its Native ESM system to send the updated module to the browser - and performs a little bit of magic to integrate that into the existing module graph.
+ +In Turbopack, we discovered that for Fast Refresh, we don't really need to do bundling work at all. We can send updates in a similar style to Vite. In fact - a little bit more efficiently: Turbopack sends changed modules directly through the WebSocket without doing any bundling at all. + +In a 1,000 module application, Turbopack can react to file changes **<DocsBenchmarkStat stat="turbopack-update-vs-vite" /> faster** than Vite. + +<DocsBenchmarksGraph category="file_change" bars={[ + { + label: 'Turbopack', + turbo: true, + key: 'next13' + }, + { + label: 'Vite', + key: 'vite', + swc: true + } +]} /> diff --git a/docs/pages/pack/docs/comparisons/webpack.mdx b/docs/pages/pack/docs/comparisons/webpack.mdx new file mode 100644 index 0000000..a8b821a --- /dev/null +++ b/docs/pages/pack/docs/comparisons/webpack.mdx @@ -0,0 +1,66 @@ +--- +title: Turbopack vs. webpack +description: Compare Turbopack vs. webpack +--- + +import { DocsBenchmarksGraph } from '../../../../components/pages/pack-home/DocsBenchmarksGraph'; +import { DocsBenchmarkStat } from '../../../../components/pages/pack-home/DocsBenchmarkStat'; +import BenchmarksCallout from './BenchmarksCallout'; + +# Comparing Turbopack and webpack + +webpack has been downloaded over 3 billion times, making it today's most common JavaScript bundler. However, we found that we'd hit the limits of what it could do with its JavaScript-based architecture. + +We've built Turbopack as the successor of webpack: much **faster**, but just as **flexible and extensible**. + +<BenchmarksCallout /> + +## Speed + +Turbopack's incremental architecture outstrips webpack's speed on several key metrics. + +### Dev server startup time + +The main problem we found with webpack was development server startup time. If you end up importing a lot of modules in a certain page and open that page in your browser, the initial compile will take a few seconds. 
If you change routes in your development environment, you have to wait for a similar compile again for your new page. + +We designed Turbopack to be as lazy as possible, only doing work when it's requested. In a dev server, this means on incoming requests we do **exactly the work the user asked for**. No more unnecessary bundling of on demand loaded code before the user needs it. You can learn more in our [core concepts docs](/pack/docs/core-concepts#compiling-by-request). + +This means that Turbopack's dev server starts up much faster than webpack. Next.js 12, which uses webpack under the hood, can start up a build server on a 1,000 module application in <DocsBenchmarkStat stat="next12-cold-1000" />. Turbopack starts up in <DocsBenchmarkStat stat="turbopack-cold-1000" /> - **<DocsBenchmarkStat stat="turbopack-cold-vs-next12" /> faster**. + +<DocsBenchmarksGraph category="cold" bars={[ + { + label: 'Turbopack', + turbo: true, + key: 'next13' + }, + { + label: 'webpack', + key: 'next12', + } +]} /> + +### Code updates + +As we continued to optimize webpack, we found a performance ceiling on how much faster we could make Fast Refresh. With around 2,000 components, the best number we could produce was 500ms. This mark was a tremendous feat in Next.js 12. Previously, that process would have taken around 10 seconds. + +With Turbopack, we achieved the goal we were aiming for: Fast Refresh performance that stays near-constant, no matter your application size. Instead of scaling with your application size, it scales based on the size of the _changes made_. + +In a 1,000 module application, Turbopack can react to file changes **<DocsBenchmarkStat stat="turbopack-update-vs-next12" /> faster** than webpack. In a 30,000 module application, this is **<DocsBenchmarkStat stat="turbopack-update-vs-next12-30000" /> faster**. 
+ +<DocsBenchmarksGraph category="file_change" bars={[ + { + label: 'Turbopack', + turbo: true, + key: 'next13' + }, + { + label: 'webpack', + key: 'next12', + } +]} /> + +## Extensibility + +webpack has an extraordinary collection of [plugins](https://webpack.js.org/plugins/) to customize its behavior. Composing plugins lets you create custom toolchains which can support a huge variety of bundler features. + +In its alpha state, Turbopack cannot currently be configured with plugins. In the future, we plan to make Turbopack just as extensible as webpack - though likely with an altered API. diff --git a/docs/pages/pack/docs/core-concepts.mdx b/docs/pages/pack/docs/core-concepts.mdx new file mode 100644 index 0000000..40e2dd7 --- /dev/null +++ b/docs/pages/pack/docs/core-concepts.mdx @@ -0,0 +1,64 @@ +--- +title: Core Concepts +description: Learn about the innovative architecture that powers Turbopack's speed improvements. +--- + +# Core Concepts + +Let’s dive deep into the internals of Turbopack to figure out why it’s so fast. + +## The Turbo engine + +Turbopack is so fast because it’s built on a reusable Rust library that enables incremental computation, known as the Turbo engine. Here’s how it works: + +### Function-level caching + +In a Turbo engine-powered program, you can mark certain functions as β€˜to be remembered’. When these functions are called, the Turbo engine will remember **what they were called with**, and **what they returned**. It’ll then save it in an in-memory cache. + +Here’s a simplified example of what this might look like in a bundler: + +![](/images/docs/pack/turbo-engine-first-run.png) + +We start by calling `readFile` on two files, `api.ts` and `sdk.ts`. We then `bundle` those files, `concat` them together, and end up with the `fullBundle` at the end. The results of all of those function calls get saved in the cache for later. + +Let’s imagine that we’re running on a dev server. You save the `sdk.ts` file on your machine.
Turbopack receives the file system event, and knows it needs to recompute `readFile("sdk.ts")`: + +![](/images/docs/pack/turbo-engine-second-run.png) + +Since the result of `sdk.ts` has changed, we need to `bundle` it again, which then needs to be concatenated again. + +Crucially, `api.ts` hasn’t changed. We read its result from the cache and pass that to `concat` instead. So we save time by not reading it and re-bundling it again. + +Now imagine this in a real bundler, with thousands of files to read and transformations to execute. The mental model is the same. You can save enormous amounts of work by remembering the result of function calls and not re-doing work that’s been done before. + +### The cache + +The Turbo engine currently stores its cache in memory. This means the cache will last as long as the process running it - which works well for a dev server. When you run `next dev --turbo` in Next v13, you’ll start a cache with the Turbo engine. When you cancel your dev server, the cache gets cleared. + +In the future, we’re planning to persist this cache - either to the filesystem, or to a remote cache like Turborepo’s. This will mean that Turbopack could remember work done _across runs and machines._ + +### How does it help? + +This approach makes Turbopack extremely fast at computing incremental updates to your apps. This optimizes Turbopack for handling updates in development, meaning your dev server will always respond snappily to changes. + +In the future, a persistent cache will open the door to much faster production builds. By remembering work done _across runs_, new production builds could only rebuild changed files - potentially leading to enormous time savings. + +## Compiling by Request + +The Turbo engine helps provide extremely fast _updates_ on your dev server, but there’s another important metric to consider - startup time. The faster your dev server can start running, the faster you can get to work. 
+ +There are two ways to make a process faster - work faster, or do less work. For starting up a dev server, the way to do less work is to compile _only the code that’s needed_ to get started. + +### Page-level compilation + +Versions of Next.js from 2-3 years ago used to compile the _entire application_ before showing your dev server. In Next.js 11, we began compiling _only the code on the page you requested._ + +That’s better, but it’s not perfect. When you navigate to `/users`, we’ll bundle all the client and server modules, dynamic-imported modules, and referenced CSS and images. That means if a large part of your page is hidden from view, or hidden behind tabs, we’ll still compile it anyway. + +### Request-level compilation + +Turbopack is smart enough to compile _only the code you request_. That means if a browser requests HTML, we compile only the HTML - not anything that is referenced by the HTML. + +If a browser wants some CSS, we’ll compile only that - without compiling referenced images. Got a big charting library behind `next/dynamic`? Turbopack doesn’t compile it until the tab showing the chart is shown. Turbopack even knows _to not compile source maps unless your Chrome DevTools are open_. + +If we were to use native ESM, we’d get similar behavior - except that Native ESM produces a _lot_ of requests to the server, as discussed in our [Why Turbopack](/pack/docs/why-turbopack) section. With request-level compilation, we get to both reduce the number of requests _and_ use native speed to compile them. As you can see in our [benchmarks](/pack/docs/benchmarks), this provides significant performance improvements.
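The function-level caching described earlier on this page can be sketched as argument-keyed memoization. This is a loose illustration, not the Turbo engine's implementation - in particular, the real engine also tracks the dependency graph between calls, so it knows to re-run `bundle` and `concat` after a `readFile` result changes:

```js
// A minimal sketch of "remember what a function was called with, and what it
// returned", keyed by the JSON-serialized argument list.
function memo(fn) {
  const cache = new Map();
  const wrapped = (...args) => {
    const key = JSON.stringify(args);
    if (!cache.has(key)) cache.set(key, fn(...args));
    return cache.get(key);
  };
  // A file-system event would call this to drop just the changed entry.
  wrapped.invalidate = (...args) => cache.delete(JSON.stringify(args));
  return wrapped;
}

let reads = 0;
const readFile = memo((path) => {
  reads += 1; // count how many times we actually do the work
  return `contents of ${path}`;
});

readFile('api.ts');
readFile('sdk.ts');
readFile('api.ts'); // cache hit - no extra read
// reads === 2

readFile.invalidate('sdk.ts'); // "sdk.ts was saved on disk"
readFile('sdk.ts'); // recomputed
// reads === 3
```

The payoff comes from composing such functions: when only `readFile("sdk.ts")` is invalidated, every cached call whose inputs are unchanged - like `readFile("api.ts")` - is served from memory instead of being redone.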
diff --git a/docs/pages/pack/docs/features.mdx b/docs/pages/pack/docs/features.mdx new file mode 100644 index 0000000..984bd34 --- /dev/null +++ b/docs/pages/pack/docs/features.mdx @@ -0,0 +1,22 @@ +--- +title: Features +description: Learn about Turbopack's supported features +--- + +import { TurbopackFeatures } from '../../../components/TurbopackFeatures' + +# Features + +The practice of building web applications is enormously diverse. In CSS alone, you have SCSS, Less, CSS Modules, PostCSS, and hundreds of other libraries. Frameworks like React, Vue and Svelte require custom setups. + +When building a bundler, we needed to consider which features would be: + +- **Built-in**: they work out of the box, no config required +- **Available via plugins**: usually installed from a registry and configured +- **Unavailable**: not available at all + +**Turbopack is in alpha**, so very few of these decisions are set in stone. In its current state, **Turbopack cannot yet be configured** - so plugins are not available yet. + +Let's discuss which features are available out-of-the-box, in Turbopack's default configuration. We'll also touch on features which will be configurable via plugins. 
+ +<TurbopackFeatures /> diff --git a/docs/pages/pack/docs/features/_meta.json b/docs/pages/pack/docs/features/_meta.json new file mode 100644 index 0000000..c63b5ee --- /dev/null +++ b/docs/pages/pack/docs/features/_meta.json @@ -0,0 +1,11 @@ +{ + "javascript": "JavaScript", + "typescript": "TypeScript", + "frameworks": "Frameworks", + "css": "CSS", + "dev-server": "Dev Server", + "static-assets": "Static Assets", + "imports": "Imports", + "environment-variables": "Environment Variables", + "customizing-turbopack": "Customizing Turbopack" +} diff --git a/docs/pages/pack/docs/features/css.mdx b/docs/pages/pack/docs/features/css.mdx new file mode 100644 index 0000000..6e2c2fd --- /dev/null +++ b/docs/pages/pack/docs/features/css.mdx @@ -0,0 +1,80 @@ +import Callout from '../../../../components/Callout'; + +# CSS + +CSS bundling is handled by SWC, using a Rust crate called `swc_css`. We haven't yet documented `swc_css` separately, but it's integrated into Turbopack and supports several CSS features: + +## Global CSS + +Importing CSS into global scope is supported **out-of-the-box** in Turbopack. + +```ts +import './globals.css'; +``` + +## CSS Modules + +Turbopack handles CSS Modules out-of-the-box. Any file with a `.module.css` extension will be considered a CSS module, and you can import it into a JavaScript or TypeScript file: + +```tsx Component.tsx +import cssExports from './phone.module.css' +``` + +This follows the same rules set out by [Next.js](https://nextjs.org/docs/basic-features/built-in-css-support#adding-component-level-css) - letting you easily distinguish between global and scoped CSS. + +## `postcss-nested` + +Turbopack handles [`postcss-nested`](https://www.npmjs.com/package/postcss-nested) syntax out-of-the-box. 
This useful library lets you nest CSS declarations inside each other: + +```css phone.css +.phone { + &_title { + width: 500px; + @media (max-width: 500px) { + width: auto; + } + body.is_dark & { + color: white; + } + } + img { + display: block; + } +} +``` + +## `@import` syntax + +Using the CSS `@import` syntax to import other CSS files works **out-of-the-box**. This gives you the ability to combine several CSS files together into a single module: + +```css filename="globals.css" +@import './modal.css'; +@import './dark.css'; +``` + +## PostCSS + +PostCSS gives you the ability to use plugins to enhance your CSS toolchain. It's been an invaluable tool for integrating libraries like Tailwind and `autoprefixer` into applications. + +The most common pattern is adding a `postcss.config.js` file to the root of your application, where you can import and configure your plugins. + +When Turbopack finds a `postcss.config.js` file, it will automatically process your CSS files with PostCSS in a Node.js worker pool. + +```js filename="postcss.config.js" +module.exports = { + plugins: { + tailwindcss: {}, + autoprefixer: {} + } +}; +``` + +## SCSS and LESS + +`.scss` and `.less` files let you utilize SCSS and LESS - languages which enhance CSS in various ways. These languages **don't currently work** out-of-the-box with Turbopack. + +These are likely to be available via plugins in the future. + +## Tailwind CSS + +Tailwind CSS can be used via PostCSS plugins. You can use the [official Tailwind Next.js guide](https://tailwindcss.com/docs/guides/nextjs) to get started. 
diff --git a/docs/pages/pack/docs/features/customizing-turbopack.mdx b/docs/pages/pack/docs/features/customizing-turbopack.mdx new file mode 100644 index 0000000..2ddd6a1 --- /dev/null +++ b/docs/pages/pack/docs/features/customizing-turbopack.mdx @@ -0,0 +1,112 @@ +--- +title: Customizing Turbopack +description: Learn about how to customize Turbopack to your needs +--- + +import Callout from "../../../../components/Callout"; + +# Customizing Turbopack + +Turbopack can be customized to transform different files and change how modules are resolved. It supports a subset of webpack's loader API and offers similar configuration for aliasing module resolution. + +## webpack loaders for Next.js + +<Callout type="info"> + Turbopack for Next.js does not require loaders or loader configuration for built-in functionality, just as they aren't required for Next.js. Turbopack has built-in support for CSS and compiling modern JavaScript, so there's no need for `css-loader`, `postcss-loader`, or `babel-loader` if you're just using `@babel/preset-env`. +</Callout> + +If you need loader support beyond what's built in, many webpack loaders already work with Turbopack. There are currently some limitations: + +- Only a core subset of the webpack loader API is implemented. This is enough for some popular loaders, and we'll expand our support for this API in the future. +- Only loaders that return JavaScript code are supported. Loaders that transform files like stylesheets or images are not currently supported. +- Options passed to webpack loaders must be plain JavaScript primitives, objects, and arrays. For example, it's not possible to pass `require()`d plugin modules as option values. + +As of Next 13.2, configuring webpack loaders is possible for Next.js apps through an experimental option in `next.config.js`.
`turbo.loaders` can be set as a mapping of file extensions to a list of package names or `{loader, options}` pairs: + +```js filename="next.config.js" +module.exports = { + experimental: { + turbo: { + loaders: { + // Option format + '.md': [ + { + loader: '@mdx-js/loader', + options: { + format: 'md', + }, + }, + ], + // Option-less format + '.mdx': ['@mdx-js/loader'], + }, + }, + }, +} +``` + +If you need to pass something like the result of importing an external package as a loader option, it's possible to wrap the webpack loader with your own, specifying options there. **This is an interim solution and should not be necessary in the future.** This loader wraps `@mdx-js/loader` and configures the `rehypePrism` rehype plugin: + +```js filename="my-mdx-loader.js" +const mdxLoader = require('@mdx-js/loader'); +const rehypePrism = require('@mapbox/rehype-prism'); + +module.exports = function (code) { + const prevGetOptions = this.getOptions.bind(this); + this.getOptions = function getOptions(...args) { + return { + ...prevGetOptions(...args), + rehypePlugins: [rehypePrism] + } + } + + mdxLoader.call(this, code); +} +``` + +Then, configure Next.js to load the wrapper loader: + +```js filename="next.config.js" +module.exports = { + experimental: { + turbo: { + loaders: { + '.mdx': ['./my-mdx-loader'], + }, + }, + }, +} +``` + +### Supported loaders + +The following loaders have been tested to work with Turbopack's webpack loader implementation: + +- [`babel-loader`](https://www.npmjs.com/package/babel-loader) +- [`@mdx-js/loader`](https://www.npmjs.com/package/@mdx-js/loader) +- [`@svgr/webpack`](https://www.npmjs.com/package/@svgr/webpack) +- [`svg-inline-loader`](https://www.npmjs.com/package/svg-inline-loader) +- [`yaml-loader`](https://www.npmjs.com/package/yaml-loader) +- [`string-replace-loader`](https://www.npmjs.com/package/string-replace-loader) +- [`raw-loader`](https://www.npmjs.com/package/raw-loader) + +## Resolve aliases + +Turbopack can be configured to 
modify module resolution through aliases, similar to webpack's [`resolve.alias`](https://webpack.js.org/configuration/resolve/#resolvealias) configuration: + +```js filename="next.config.js" +module.exports = { + experimental: { + turbo: { + resolveAlias: { + underscore: 'lodash', + mocha: { browser: 'mocha/browser-entry.js' }, + }, + }, + }, +} +``` + +This aliases imports of the `underscore` package to the `lodash` package. In other words, `import underscore from 'underscore'` will load the `lodash` module instead of `underscore`. + +Turbopack also supports conditional aliasing through this field, similar to Node.js's [conditional exports](https://nodejs.org/docs/latest-v18.x/api/packages.html#conditional-exports). At the moment only the `browser` condition is supported. In the case above, imports of the `mocha` module will be aliased to `mocha/browser-entry.js` when Turbopack targets browser environments. diff --git a/docs/pages/pack/docs/features/dev-server.mdx b/docs/pages/pack/docs/features/dev-server.mdx new file mode 100644 index 0000000..ef8e01a --- /dev/null +++ b/docs/pages/pack/docs/features/dev-server.mdx @@ -0,0 +1,17 @@ +# Dev Server + +Turbopack is optimized to give you an extremely fast development server. We considered these features indispensable. + +## HMR + +Hot Module Replacement (HMR) gives your dev server the ability to push file updates to the browser without triggering a full refresh. This works for most static assets (including JavaScript files) enabling a smooth and fast developer experience. + +Turbopack supports Hot Module Replacement out of the box. Because of our [incremental architecture](/pack/docs/core-concepts), we are hyper-optimized for delivering fast updates. + +{/* Maybe link to benchmarks here? */} + +## Fast Refresh + +Fast Refresh builds on top of HMR by providing a framework-level integration to preserve _state_ across updates. 
Changes to a `<Count />` component, for instance, would preserve the component's internal `count` across changes. + +Turbopack supports Fast Refresh out of the box for React. Support for other frameworks will be [added over time](/pack/docs/features/frameworks). diff --git a/docs/pages/pack/docs/features/environment-variables.mdx b/docs/pages/pack/docs/features/environment-variables.mdx new file mode 100644 index 0000000..4e6daec --- /dev/null +++ b/docs/pages/pack/docs/features/environment-variables.mdx @@ -0,0 +1,29 @@ +# Environment variables + +## `.env` files + +Turbopack will parse and inject `.env` files out of the box. + +``` +NEXT_PUBLIC_DEPLOYMENT_NAME="my-site" +DATABASE_URL="postgres://" +``` + +This includes all the variations these files come in: + +``` +.env +.env.local +.env.development +.env.production.local +``` + +### Live reloading + +Not only that, but Turbopack will live reload when these values change. Restarting your dev server just to inject a new environment variable can be extremely annoying - Turbopack does it for you. + +## `process.env` + +Environment variables will be injected into `process.env`. For instance, `DATABASE_URL` will be on `process.env.DATABASE_URL`. + +This follows the tradition of Node.js, webpack 5 and Next.js 12, which each use `process.env` for variable injection. diff --git a/docs/pages/pack/docs/features/frameworks.mdx b/docs/pages/pack/docs/features/frameworks.mdx new file mode 100644 index 0000000..59f6998 --- /dev/null +++ b/docs/pages/pack/docs/features/frameworks.mdx @@ -0,0 +1,45 @@ +# Frameworks + +Turbopack plans to offer first-class support for multiple frameworks. No matter whether you're using Svelte, React, Vue.js, or another framework, we want to provide a great experience on Turbopack. + +## React + +### JSX/TSX + +`.jsx` and `.tsx` files are supported out of the box with Turbopack. 
We use [SWC](https://swc.rs/) to compile your JavaScript and TypeScript code, which results in extremely fast compilation. + +Similar to Next.js, Turbopack doesn't require you to import React in order to use JSX: + +```diff filename="src/index.tsx" +- import React from 'react'; + +const Component = () => { + return <div /> +} +``` + +### React Server Components + +React Server Components let you declare certain components as 'server' components, allowing you to run backend code inside an `async` function. Next.js 13 brings [first-class support for them](https://beta.nextjs.org/docs/rendering/server-and-client-components). + +React Server Components impose unusual constraints on your bundler. The mix of client and server code means you need to ensure that server code does not get compiled to the client, and vice versa. + +Turbopack has been built from the ground up to solve these problems - it works with React Server Components out of the box. + +## Next + +Turbopack's Alpha version has been focused on providing a great experience for Next.js's dev server. We're using this as our initial goal to show what Turbopack can do. In the future, we want Turbopack to act as a low-level engine for other frameworks. + +This means that Turbopack plans to support everything in Next.js. + +### `next/dynamic` + +[`next/dynamic`](https://nextjs.org/docs/advanced-features/dynamic-import) is not yet supported - however, we plan to support it out of the box soon. + +## Vue and Svelte + +[VueJS](https://vuejs.org/) and [Svelte](https://svelte.dev/) are tremendously popular frameworks which deliver a world-class developer experience. + +Since Turbopack is in alpha, we're focusing our support on Next.js's dev server. That means that right now, Vue and Svelte don't work out of the box. + +In future versions, we'll be supporting Vue and Svelte via plugins. 
diff --git a/docs/pages/pack/docs/features/imports.mdx b/docs/pages/pack/docs/features/imports.mdx new file mode 100644 index 0000000..5dee4ce --- /dev/null +++ b/docs/pages/pack/docs/features/imports.mdx @@ -0,0 +1,47 @@ +# Imports + +Turbopack supports CJS and ESM imports out of the box, and offers partial support for AMD. + +Turbopack bundles your application, so imports won't be resolved to native browser ESM. You can learn why in our [bundling vs Native ESM](/pack/docs/why-turbopack#bundling-vs-native-esm) section. + +## CommonJS + +Turbopack supports the `require` syntax out-of-the-box: + +```ts +const { add } = require('./math'); + +add(1, 2); +``` + +We also support dynamic `require()` syntax, for when you want to import a dynamically named asset: + +```ts +const imgName = getRandomImgName(); + +const img = require(`./images/${imgName}.png`); +``` + +## ESM + +Importing via the `import` syntax is also supported out-of-the-box. This includes static assets and `import type`: + +```ts +import img from './img.png'; + +import type { User } from '../server/types'; + +import { z } from 'zod'; +``` + +## Dynamic Imports + +Turbopack supports dynamic imports via `import()`: + +```ts +const getFeatureFlags = () => { + return import('./featureFlags').then(mod => { + return mod.featureFlags; + }) +} +``` diff --git a/docs/pages/pack/docs/features/javascript.mdx b/docs/pages/pack/docs/features/javascript.mdx new file mode 100644 index 0000000..8b7a294 --- /dev/null +++ b/docs/pages/pack/docs/features/javascript.mdx @@ -0,0 +1,39 @@ +import Callout from "../../../../components/Callout"; + +# JavaScript + +## ECMAScript Support + +Turbopack uses [SWC](https://swc.rs/) to bundle JavaScript and TypeScript files. So, we match SWC's support for ECMAScript versions - anything that SWC supports, Turbopack will support. + +This means that by default **we support all syntax in ESNext**.
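As a quick illustration (a made-up snippet, not from SWC's docs), modern syntax such as private class fields, optional chaining, and nullish coalescing compiles with no extra configuration:

```js
// Illustrative ESNext syntax that compiles without extra configuration.
class Counter {
  #count = 0; // private class field

  increment() {
    return ++this.#count;
  }
}

const config = { server: {} };
// Optional chaining avoids a TypeError; nullish coalescing provides a default.
const port = config?.server?.port ?? 3000;

const counter = new Counter();
console.log(counter.increment(), port); // 1 3000
```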
+ +## Browserslist + +[Browserslist](https://github.com/browserslist/browserslist) has become an industry standard for defining which browsers you plan to target. To make use of it, you can add a `browserslist` field to your `package.json`: + +```json filename="package.json" +{ + "browserslist": [ + "last 1 version", + "> 1%", + "not dead" + ] +} +``` + +Turbopack supports Browserslist **out-of-the-box**. We pass the information we find in your `package.json` to SWC, which handles [`browserslist` support](https://swc.rs/docs/configuration/supported-browsers) for us. + +This means you can feel comfortable using Turbopack to target legacy browsers, or deciding to only ship code to modern browsers. + +<Callout type="info"> + Turbopack is available in alpha preview with a dev server, which uses a pre-set minimal browserslist to minimize transformation during development. In a future release, Turbopack will build apps for production targeting your defined browserslist. +</Callout> + +## Babel + +[Babel](https://babel.dev/) allows you to add custom transformations to your code to provide custom syntax, including support for early language proposals. + +Babel plugins are currently **not supported** on Turbopack. In our default configuration, we don't use Babel to compile JavaScript or TypeScript code. + +In the future, Babel support will be provided via plugins. diff --git a/docs/pages/pack/docs/features/static-assets.mdx b/docs/pages/pack/docs/features/static-assets.mdx new file mode 100644 index 0000000..26ebf7a --- /dev/null +++ b/docs/pages/pack/docs/features/static-assets.mdx @@ -0,0 +1,47 @@ +# Static Assets + +Part of bundling for the web is handling all the asset types the web supports - images, videos, JSON, fonts, and much more. Turbopack offers familiar tools for these so you can immediately get productive. 
+ +## Import static assets + +Importing static assets works out of the box with Turbopack: + +```ts +import img from './img.png' +import video from './video.mp4' +import audio from './audio.wav' +``` + +### Next.js + +In webpack and some other frameworks, importing an image returns a string containing that image's URL. + +```ts +import img from './img.png'; + +console.log(img); // /assets/static/1uahwd98h123.png +``` + +In Next.js, importing an image actually returns an object, containing various metadata about the image. This is so it can be fed into [Next.js's Image component](https://nextjs.org/docs/basic-features/image-optimization#local-images). + +The behavior of extracting an object of metadata from the image is **not yet supported**. For now, imported images will resolve to strings. + +## Public directory + +The `/public` directory lets you place assets that you want to be available on the root URL of the website. For instance, `public/favicon.png` will be available at `https://example.com/favicon.png`. + +In Turbopack, the `/public` directory is supported out of the box. + +## JSON + +Most frameworks allow you to import JSON directly into your application: + +```ts +import fixtures from './fixtures.json'; +``` + +This is supported out of the box with Turbopack, as is performing a named import on that JSON: + +```ts +import { users, posts } from './fixtures.json'; +``` diff --git a/docs/pages/pack/docs/features/typescript.mdx b/docs/pages/pack/docs/features/typescript.mdx new file mode 100644 index 0000000..a643486 --- /dev/null +++ b/docs/pages/pack/docs/features/typescript.mdx @@ -0,0 +1,40 @@ +# TypeScript + +Turbopack supports [TypeScript](https://www.typescriptlang.org/) out of the box. This means you can import `.ts` files with Turbopack. We support all of TypeScript's feature set. + +Thanks to our JSX support, you can import `.tsx` files too.
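As a small, hypothetical example of that: the type-only constructs below are erased at compile time, while the runtime code is bundled as-is:

```ts
// Illustrative: SWC strips the interface and the annotations; only the
// runtime code (the function and the call) reaches the bundle.
interface User {
  id: number;
  name: string;
}

function greet(user: User): string {
  return `Hello, ${user.name}!`;
}

console.log(greet({ id: 1, name: 'Ada' })); // Hello, Ada!
```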
+ +## Resolving `paths` and `baseUrl` + +In TypeScript, you can use the [`paths`](https://www.typescriptlang.org/tsconfig#paths) property of `tsconfig.json` to import files from custom paths. + +```json filename="tsconfig.json" +{ + "compilerOptions": { + "baseUrl": "src", + "paths": { + "app/*": ["app/*"], + "config/*": ["app/_config/*"], + "shared/*": ["app/_shared/*"] + } + } +} +``` + +This would let you import directly from `app/*` without needing to do a relative import: + +```diff filename="src/app/some/deep/file/in/your/app.ts" +- import { add } from '../../../../../math'; ++ import { add } from 'app/math'; + +add(); +``` + +Turbopack reads the `paths` and `baseUrl` in `tsconfig.json` in order to resolve these paths, just like Next.js does. + +This means you only need to configure your absolute paths in one place. + +## Type Checking + +Turbopack does not perform type checks on your application. We use [SWC](https://swc.rs/) to compile TypeScript code, which also does not perform type checks. + +This means that in order to run your type checks, you'll need a sidecar process running `tsc --watch`. Or, you can rely on your IDE's TypeScript integration. diff --git a/docs/pages/pack/docs/index.mdx b/docs/pages/pack/docs/index.mdx new file mode 100644 index 0000000..3cbc782 --- /dev/null +++ b/docs/pages/pack/docs/index.mdx @@ -0,0 +1,61 @@ +--- +title: Turbopack Quickstart +description: Start Building Web Applications with Turbopack +--- + +import { TurbopackQuickstartArea } from "../../../components/TurbopackQuickstart"; +import Callout from "../../../components/Callout"; +import { Tabs, Tab } from "../../../components/Tabs"; +import FullTurboCTA from "../../../components/FullTurboCTA"; + +# Getting Started with Turbopack + +Turbopack is an incremental bundler optimized for JavaScript and TypeScript, written in Rust by the creators of webpack and [Next.js](https://nextjs.org/) at [Vercel](https://vercel.com/).
+ +The secret to Turbopack's performance is twofold: highly optimized machine code and a low-level incremental computation engine that enables caching down to the level of individual functions. Once Turbopack performs a task it never does it again. + +Our team has taken the lessons from 10 years of webpack, combined with the innovations in incremental computation from [Turborepo](/repo) and Google's Bazel, and created an architecture ready to support the coming decades of computing. + +<Callout type="error"> + Turbopack is currently in <strong>alpha</strong>. It is not yet ready for production use. We appreciate your support and feedback as we work to make it ready for everyone. +</Callout> + +## Quickstart + +As of today, Turbopack can be used in Next.js v13. In the future we will be releasing a standalone CLI, plugin API, and support for other frameworks such as Svelte and Vue. For now, please follow these instructions to get started: + +1. Create a Next.js v13 project with Turbopack: + +```bash +npx create-next-app --example with-turbopack +``` + +2. Start the Next.js development server (with Turbopack): + + <Tabs items={["npm", "yarn", "pnpm"]} storageKey="selected-pkg-manager"> + <Tab> + ```bash + npm run dev + ``` + </Tab> + <Tab> + ```bash + yarn dev + ``` + </Tab> + <Tab> + ```bash + pnpm dev + ``` + </Tab> + </Tabs> + +The Next.js v13 development server is now powered by Turbopack! Startup and updates should both be near-instant. The larger the application, the larger the improvement will be. + +## Next Steps + +Want to learn more about Turbopack? Here's a deep dive on what we think makes it special. 
+ +<TurbopackQuickstartArea /> + +<FullTurboCTA /> diff --git a/docs/pages/pack/docs/migrating-from-webpack.mdx b/docs/pages/pack/docs/migrating-from-webpack.mdx new file mode 100644 index 0000000..1788b18 --- /dev/null +++ b/docs/pages/pack/docs/migrating-from-webpack.mdx @@ -0,0 +1,34 @@ +--- +title: Migrate from webpack to Turbopack +description: Learn about how to migrate from webpack to its Rust-powered successor, Turbopack. +--- + +import Callout from "../../../components/Callout"; + +# Migrating from webpack to Turbopack + +<Callout type="info"> + Turbopack now implements basic webpack loader support and configuration familiar to webpack users. Visit [Customizing Turbopack](features/customizing-turbopack) for how to configure Turbopack to use webpack loaders. +</Callout> + +We're planning Turbopack as the successor to webpack. In the future, we plan to give Turbopack all the tools needed to support your webpack app. + +## webpack loaders and resolve aliases + +For apps running Next.js 13.2 or later, Turbopack supports configuration familiar to webpack users, including support for webpack loaders and customizing resolution rules. Visit [Customizing Turbopack](features/customizing-turbopack) for how to configure Turbopack with these options. Note that using webpack-based Next.js plugins as-is from `next.config.js` is **not yet possible**. + +## FAQ + +### Will it be compatible with webpack's API? + +webpack has a huge API. It's extremely flexible and extensible, which is a big reason why it's so popular. + +We're planning on making Turbopack very flexible and extensible, but we're **not planning 1:1 compatibility with webpack**. This lets us make choices which improve on webpack's API, and lets us optimize for speed and efficiency. + +### Will we be able to use webpack plugins? + +webpack plugins are a crucial part of webpack's ecosystem. They let you customize your toolchain, giving you low-level tools to maximize your productivity. 
+ +Unlike loaders, webpack plugins can be tightly integrated with webpack's internals. + +Since we're not offering 1:1 API compatibility for plugins, most won't work out of the box with Turbopack. However, we're working on porting several of the most popular webpack plugins to Turbopack. diff --git a/docs/pages/pack/docs/roadmap.mdx b/docs/pages/pack/docs/roadmap.mdx new file mode 100644 index 0000000..9bd7c97 --- /dev/null +++ b/docs/pages/pack/docs/roadmap.mdx @@ -0,0 +1,33 @@ +# Roadmap + +We've got big plans for Turbopack. Here's what we're aiming for in the future: + +## Next.js + +Right now, Turbopack is being used as an opt-in feature in Next.js's dev server. This is helping to create an extremely fast experience in local development that scales to big projects. + +Next, we want to use Turbopack to power production builds with Next.js. We think that this will result in a big boost in performance, especially when integrated with remote caching. + +## Svelte + +We're planning to build a first-class integration with Svelte to let Turbopack power the next generation of SvelteKit applications. + +## Other Frameworks + +We are in active discussions with other frameworks to bring Turbopack to their users. We're excited to see what we can build together! + +## Remote Caching and Replication + +Turbopack is built from the ground up to take advantage of [caching](/pack/docs/core-concepts#the-turbo-engine). Currently, this cache is stored in-memory only. This lets us optimize for our current use case - making the Next.js dev server fast. + +In the future, we plan to persist this cache to the file system, to speed up Turbopack between runs. This will work similarly to [Turborepo's cache](/repo/docs/core-concepts/caching) - but at a much more granular level. Turborepo can currently only cache the results of entire builds. Turbopack, however, can cache the results of individual functions within those builds - saving much more time over subsequent runs. 
+ +Once persisting to the file system is working, we can build the next logical step: persisting to a remote cache. With Turborepo, we've already built [remote caching](/repo/docs/core-concepts/remote-caching) on Vercel. In the future, you'll be able to _share_ Turbopack's hyper-granular cache across your whole team, using the Vercel Remote Cache. + +## Migration for webpack users + +To learn more about our future plans for webpack integration, check out our [Migrating from webpack](/pack/docs/migrating-from-webpack) page. + +## Fusion with Turborepo + +We are currently migrating/rewriting Turborepo in Rust. In the future, Turborepo and Turbopack will merge into a single toolchain--Turbo--that can be used as either a bundler or a build system or both. diff --git a/docs/pages/pack/docs/why-turbopack.mdx b/docs/pages/pack/docs/why-turbopack.mdx new file mode 100644 index 0000000..18cec98 --- /dev/null +++ b/docs/pages/pack/docs/why-turbopack.mdx @@ -0,0 +1,62 @@ +--- +title: Why Turbopack? +description: Learn why we think Turbopack is the future of bundling for the web. +--- + +# Why Turbopack? + +When we set out to create Turbopack, we wanted to solve a problem. We had been working on speed improvements for Next.js. We migrated away from several JS-based tools. Babel, gone. Terser, gone. Our next target was another JS-based tool, webpack. + +Replacing it became our goal. But with what? + +A new generation of native-speed bundlers were emerging, like esbuild and swc. But after assessing the bundlers on the market, we decided to build our own. Why? + +## Bundling vs Native ESM + +Frameworks like Vite use a technique where they don’t bundle application source code in development mode. Instead, they rely on the browser’s native ES Modules system. This approach results in incredibly responsive updates since they only have to transform a single file. + +However, Vite can hit scaling issues with large applications made up of many modules. 
A flood of cascading network requests in the browser can lead to a relatively slow startup time. For the browser, it's faster if it can receive the code it needs in as few network requests as possible - even on a local server. + +That's why we decided that, like webpack, we wanted Turbopack to bundle the code in the development server. Turbopack can do it much faster, especially for larger applications, because it is written in Rust and skips optimization work that is only necessary for production. + +## Incremental Computation + +There are two ways to make a process faster: do less work or do work in parallel. We knew if we wanted to make the fastest bundler possible, we'd need to pull hard on both levers. + +We decided to create a reusable Turbo build engine for distributed and incremental behavior. The Turbo engine works like a scheduler for function calls, allowing calls to functions to be parallelized across all available cores. + +The Turbo engine also caches the result of all the functions it schedules, meaning it never needs to do the same work twice. Put simply, **it does the minimum work at maximum speed**. + +### Vite and esbuild + +Other tools take a different attitude to 'doing less work'. Vite minimizes work done by using Native ESM in development mode. We decided not to take this approach for the reasons listed above. + +Under the hood, Vite uses esbuild for many tasks. esbuild is a bundler - a superbly fast one. It doesn't force you to use native ESM. But we decided not to adopt esbuild for a few reasons. + +esbuild's code is hyper-optimized for one task - bundling fast. It doesn't have HMR, which we don't want to lose from our dev server. + +esbuild is an extremely fast bundler, but it doesn't do much caching. This means you _will_ end up doing the same work again and again, even if that work is at the speed of native.
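The difference can be pictured as a memoization layer over each unit of work - this is only an illustrative sketch, not the Turbo engine's actual API:

```ts
// Illustrative sketch of function-level caching: repeating a call with the
// same input reuses the cached result instead of redoing the work.
const cache = new Map<string, string>();
let unitsOfWork = 0;

function cachedTransform(source: string): string {
  const hit = cache.get(source);
  if (hit !== undefined) return hit; // cache hit: no work done

  unitsOfWork += 1; // stand-in for expensive compilation work
  const result = source.toUpperCase();
  cache.set(source, result);
  return result;
}

cachedTransform('const a = 1;');
cachedTransform('const a = 1;'); // second call is a cache hit
console.log(unitsOfWork); // 1
```

The real engine applies this idea to every function it schedules, which is what lets it skip repeated work across the whole build graph.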
+ +Evan Wallace refers to esbuild as a [proof-of-concept for the next generation of bundlers](https://news.ycombinator.com/item?id=22336334). We think he's right. We feel that a Rust-powered bundler _with_ incremental computation could perform better at a larger scale than esbuild. + +## Lazy bundling + +Early versions of Next.js tried to bundle the _entire_ web app in development mode. We quickly realized that this 'eager' approach was less than optimal. Modern versions of Next.js bundle only the pages requested by the dev server. For instance, if you go to `localhost:3000`, it'll bundle only `pages/index.jsx`, and the modules it imports. + +This more 'lazy' approach (only bundling assets when absolutely necessary) is key for a fast dev server. Native ESM handles this without much magic - you request a module, which requests other modules. However, we wanted to build a bundler, for the reasons explained above. + +esbuild doesn't have a concept of 'lazy' bundling - it's all-or-nothing, unless you specifically target only certain entry points. + +Turbopack's development mode builds a minimal graph of your app's imports and exports based on received requests and only bundles the minimal code necessary. Learn more in the [core concepts docs](/pack/docs/core-concepts#compiling-by-request). + +This strategy makes Turbopack extremely fast when first starting up the dev server. We compute only the code necessary to render the page, then ship it to the browser in a single chunk. At large scale, this ends up being significantly faster than Native ESM. + +## Summary + +We wanted to: + +- Build a bundler. Bundlers outperform Native ESM when working on large applications. +- Use incremental computation. The Turbo engine brings this into the core of Turbopack's architecture - maximizing speed and minimizing work done. +- Optimize our dev server's startup time. For that, we build a lazy asset graph to compute only the assets requested.
+ +That's why we chose to build Turbopack. diff --git a/docs/pages/pack/index.mdx b/docs/pages/pack/index.mdx new file mode 100644 index 0000000..f1f71b1 --- /dev/null +++ b/docs/pages/pack/index.mdx @@ -0,0 +1,8 @@ +--- +overrideTitle: "Turbopack - The successor to webpack" +description: "Turbopack is an incremental bundler optimized for JavaScript and TypeScript, written in Rust." +--- + +import TurbopackHome from "../../components/pages/pack-home"; + +<TurbopackHome /> diff --git a/docs/pages/privacy.mdx b/docs/pages/privacy.mdx new file mode 100644 index 0000000..778aa2d --- /dev/null +++ b/docs/pages/privacy.mdx @@ -0,0 +1,249 @@ +--- +description: See our privacy policy and how it relates to you. +hidden: true +headeronly: true +container: true +searchable: false +--- + +## Privacy Policy + +Welcome to https://turborepo.org (the "Site"), hosted by Vercel Inc. ("Vercel", "we", "us" and/or "our"). Vercel provides information on this Site regarding its Turborepo software that is a high-performance build system for JavaScript/TypeScript monorepos. Turborepo's tooling provides developers with incremental builds, intelligent remote caching, and optimized task scheduling. The Turborepo software is available pursuant to an open source license (such information that we provide and the Turborepo software are referred to as the "Services"). In order to provide our Site and Services (collectively, "Platform"), we collect personal data from our customers ("Customers"). We also collect the personal data of our Customer's end users ("End Users") when our Customers use our Services.
Throughout this Privacy Policy, "personal information" or "personal data" refers to any information that is unique to an individual, such as name, address, email address, phone number, IP address and other information that can reasonably identify an individual. This Privacy Policy, along with our Cookie Policy, covers how we collect, handle and disclose personal data on our Platform, including any affiliated websites, software or platforms owned or operated by Vercel on which this Privacy Policy appears. Even though we store End Users' personal data, it does not cover how or why our Customers may collect, handle and disclose their End Users' personal data when they visit or use their websites or platforms. + +If you are located in the European Economic Area ("EEA") or the United Kingdom ("UK"), this entire Privacy Policy applies to you. However, please see the Section titled "Additional Information for Users in the EEA and the UK", which will provide more detailed information about which rights you have regarding the processing of your personal data. We recommend that you read this Privacy Policy and our Cookie Policy carefully, as both provide important information about your personal information and your rights under the law. If you have any questions, comments, or concerns regarding this Privacy Policy, our Cookie Policy and/or our data practices, or would like to exercise your rights, do not hesitate to contact us at privacy@vercel.com or see our information below.
+ +## Who We Are and How To Contact Us + +Vercel is a Delaware corporation with the following contact information: +Vercel Inc., 340 S Lemon Ave #4133, Walnut, CA 91789, privacy@vercel.com +For users in the EEA and the UK, note that we may collect your personal data as a 'data controller' when we determine the means and purpose of processing, such as when we process the personal data of our Site Visitors, Event Attendees and/or Customers, or as a 'data processor' when we collect and process personal data on behalf of our Customers who use our Services or Customers that use our analytics tools. Similarly, for users in California, we currently act as a 'service provider' to those 'businesses' that are subject to the California Consumer Privacy Act when we collect and process personal data or information on behalf of our Customers who use our Services or Customers that use our analytics tools. This is further explained below. + +## To Whom Does This Policy Apply + +Note at the outset that this Privacy Policy does not cover our Customers' websites. Each Customer is responsible for posting its own terms, conditions, and privacy policies, and ensuring compliance with all applicable laws and regulations. This Privacy Policy applies to: + +**Customers:** as noted above, this includes any individual who registers individually or on behalf of an entity or organization in order to use the Services. + +**Site Visitors:** visitors to our Site, including those who may also opt in to receive commercial communications from Vercel. This does not include any visitors to our Customers' individual websites. + +**Event Attendees:** Vercel hosts and/or sponsors annual conferences and is an active participant at many other conferences, summits, and additional events (collectively, "Events"), and collects personal information from individuals when they attend (or register to attend) Events. + +**End Users:** Vercel may process End User data on behalf of its Customers.
While our Customers are responsible, as data controllers or businesses, for how and why they collect and process their End User personal data, this Privacy Policy also applies to any End User personal information that we process, as a data processor or service provider, in order to provide Services to our Customers. + +## Children's Privacy + +Vercel does not knowingly collect information from children under the age of 16. If you are under the age of 16, please do not submit any personal data through the Platform. We encourage parents and legal guardians to monitor their children's Internet usage and to help enforce our Privacy Policy by instructing their children never to provide personal data on our Platform without their permission. If you have reason to believe that a child under the age of 16 has provided personal data to Vercel through the Platform, please contact privacy@vercel.com and we will endeavor to delete that information from our databases. + +## Changes To This Privacy Policy + +This Privacy Policy was last updated on the date indicated below, but we suggest that you review it from time to time, as our Platform and our business may change. As a result, at times it may be necessary for Vercel to make changes to this Privacy Policy (and/or our Cookie Policy). Vercel reserves the right to update or modify this Privacy Policy and/or our Cookie Policy at any time and from time to time without prior notice. However, if we make changes that we believe will materially impact this Privacy Policy, the Cookie Policy or your rights, we will promptly notify you of those changes. Your continued use of the Platform after any changes or revisions to this Privacy Policy shall indicate your agreement with the terms of such revised Privacy Policy. + +## What Information Do We Collect + +What personal information we collect and process depends on how and why you use our Platform.
Generally, we process personal information that we receive: +[Directly](https://vercel.com/legal/privacy-policy#direct-collection) from you when you provide it to us, such as in connection with our Services or Event registrations. +[Indirectly](https://vercel.com/legal/privacy-policy#indirect-collection), through automated technologies such as cookies, or from third parties. +[On behalf of our Customers](https://vercel.com/legal/privacy-policy#customers), when we process the personal data of their End Users as part of our Services. This is all explained in more detail below. + +## Information We Collect Directly From You + +You can generally visit our Site without having to submit any personal information. If you request more information, or sign up for our Services, we will collect personal information as follows. + +**Event Attendee Information.** We may collect personal information such as name, address, phone number and email address when participants register for or attend a sponsored Event or other events at which Vercel (and/or its affiliates or representatives) may be present. What information is collected depends on the Event, but any personal information that Vercel receives in connection with an Event is processed in accordance with this Privacy Policy. + +**Optional Information.** We may also ask you to submit personal information if you choose to use interactive features of the Platform, including participation in surveys, promotions, requesting customer support, or otherwise communicating with us. + +## Information We Collect Indirectly + +### Device and Usage Information + +When you download, use or interact with the Site, even if you do not have an account, we, or authorized third parties engaged by us, may automatically collect information about your use of the Site via your device, some of which is considered personal information.
"Device and Usage Information" that we collect consists of: + +**Information About your Device:** information about the devices and software you use to access the Site β€” primarily the internet browser or mobile device that you use, the website or source that linked or referred you to the Site, your IP address or device ID (or other persistent identifier that uniquely identifies your computer or mobile device on the Internet), the operating system of your computer or mobile device, device screen size, and other similar technical information. + +**Usage Information:** information about your interactions with the Platform, including access dates and times, hardware and software information, device event information, log data, crash data, and cookie data. This information allows us to understand the screens that you view, how you've used the Platform (which may include administrative and support communications with us), and other actions on the Platform. We, or authorized third parties, automatically collect log data when you access and use the Platform, even if you have not created an account or logged in. We use this information to administer and improve the Services, analyze trends, track users' use of the Platform, and gather broad demographic information for aggregate use. Note that Device and Usage Information is also collected from our community boards. Based on Device and Usage Information, we are also able to determine general location information ("Location Information"). + +**Location Information:** we use a third-party database to map the general location to which each IP address corresponds ("Location Information"). The Location Information that is mapped from an IP address is limited to country and city – we do not identify precise location. We use Location Information in order to more effectively provide our Services to our Customers.
In addition, as part of our Services, we also share Location Information (based on End Users' IP addresses) with our Customers, so that they may also more effectively provide their products and services to their End Users. + +### Cookies and Similar Technologies + +Vercel uses cookies and similar technologies. Through our use of cookies (and similar technologies), we or authorized third parties collect some Device and Usage Information. Some cookies are necessary to make the Site and our content available to you, while others are meant to improve the Site and enable us to analyze and measure audience and traffic. For more information on our use of cookies and the data that they collect, see our Cookie Policy. + +### Information from Third Parties + +In some instances, we process personal information from third parties, which consists of data from our partners, such as information from third parties with whom we host or sponsor Events. In all such cases, individuals are informed of our processing activities and/or able to choose certain preferences at the time of collection. From time to time, we may combine information we collect as described above with personal information we obtain from third parties. For example, we may combine information entered through a Vercel sales submission with information we receive from a third-party sales intelligence platform to enhance our ability to market our Services to Customers or potential Customers. We also may collect information from third party websites where the specific data is made publicly available. + +## Information We Process on Behalf of Our Customers + +As noted above, we provide other analytics tools for our Customers. These Customers may collect personal information from their End Users in connection with the products or services that they offer to End Users. Because we provide other analytics tools, we process End Users' information when they use our Customers' websites, web applications, and APIs.
This information may also include, but is not limited to, IP addresses, system configuration information, and other information about traffic to and from Customers' websites (collectively, the "Log Data"), as well as Location Information derived from IP addresses. We do not, however, collect or process End Users' IP addresses through the use of our analytics Services. All of this information is stored on our Platform as part of our Services, but Customers are responsible for the content transmitted across our network (e.g., images, written content, graphics, etc.), as well as any personal information they collect. Customers are also solely responsible for notifying their End Users of their personal information collection, use, and disclosure. With respect to Log Data, we collect and use Log Data to operate, maintain, and improve our Services in performance of our obligations under our Customer agreements. For example, Log Data can help us to detect new threats, identify malicious third parties, and provide more robust security protection for our Customers. It also helps us with improving our Platform, as explained below. + +## Analytics + +With the Device and Usage Information collected by our third-party analytics services, such as Google Analytics, as well as Log Data, we generate and process aggregated information, such as statistical or demographic data. Aggregated Information may be derived from personal data, but may not be considered personal data if it does not directly or indirectly reveal your identity. For example, we may track the total number of visitors to our Platform, or review Log Data, or track the number of visitors to each page of our Site. We aggregate this data to calculate the percentage of users accessing a specific feature of the Platform and analyze this data for trends and statistics. We do not combine or connect Aggregated Information with your personal data, so it will not identify you. Please see our Cookie Policy for more information.
+ +## Interactive and Public Spaces + +The Platform may provide links to Third Party Sites (as defined in the Links to Third Party Sites section below) that offer publicly accessible blogs, community forums, comments sections, discussion forums, or other interactive features ("Interactive and Public Spaces"). If you choose to participate in any of these Interactive and Public Spaces, please be aware that any information that you post may be read, collected, and used by others who access it according to the terms of such Third Party Sites. + +## How Do We Use Personal Information That We Collect + +We use your personal information for a number of different reasons, as further explained below. In addition, **for users located in the EEA and the UK**, we must have a valid legal basis in order to process your personal data. The main legal bases under the European Union's General Data Protection Regulation (GDPR) that justify our collection and use of your personal information are: +**Performance of a contract** β€” When your personal information is necessary to enter into or perform our contract with you. +**Consent** β€” When you have consented to our use of your personal information via a consent form (online or offline). +**Legitimate interests** β€” When we use your personal information to achieve a legitimate interest and our reasons for using it outweigh any prejudice to your data protection rights. +**Legal obligation** β€” When we must use your personal information to comply with our legal obligations. +**Legal claims** β€” When your personal information is necessary for us to defend, prosecute or make a claim. Below are the general purposes and corresponding legal bases (in brackets) for which we may use your personal information: + +- Providing you access to and use of the Platform [depending on the context, performance of a contract, legitimate interests, and in some cases, legal claims].
+ +- Processing and completing transactions, and sending you related information, including important notices [depending on the context, performance of a contract or legitimate interests]. +- Responding to your queries and requests, or otherwise communicating directly with you [depending on the context, performance of a contract, legitimate interests, and in some cases, legal claims]. +- Improving the content and general administration of the Platform, including system maintenance and upgrades, enabling new features and enhancing both Site Visitor and Customer experience [legitimate interests]. +- Detecting fraud, illegal activities or security breaches [legitimate interests]. +- Providing our Services to our Customers [depending on the context, performance of a contract or legitimate interests]. +- Hosting or sponsoring Events [depending on the context, performance of a contract or legitimate interests]. +- Ensuring compliance with applicable laws [compliance with a legal obligation]. +- Conducting statistical analyses and analytics by monitoring and analyzing trends, usage, and activities in connection with the Platform [consent where required (e.g. 3rd-party cookies), or legitimate interests]. +- Increasing the number of customers who use our Platform through marketing and advertising [consent where required, or legitimate interests]. +- Sending commercial communications, in line with your communication preferences, about products and services, features, newsletters, offers, promotions, and events [consent and in some cases, depending on location, with existing customers, legitimate interests]. +- Providing information to regulatory bodies when legally required, and only as outlined below in this Privacy Policy [legal obligation, legal claims, legitimate interests]. + +## When We Share Information + +We only disclose your personal information as described below.
+ +### Third-Party Service Providers + +Vercel discloses users' information to our third party agents, contractors, or service providers who are hired to perform services on our behalf. These companies perform functions that help us provide the Platform, and in some cases collect information directly. Below is an illustrative list of functions for which we may use third-party service providers: + +- Hosting and content delivery network services +- Analytics services +- CRM providers +- Lead generation partners +- Marketing and social media partners +- Customer support services +- Functionality and debugging services +- Professional service providers, such as auditors, lawyers, consultants, accountants and insurers + +### Business Transfers + +As we continue to grow, we may purchase websites, applications, subsidiaries, other businesses or business units. Alternatively, we may sell businesses or business units, merge with other entities and/or sell assets or stock, in some cases as part of a reorganization or liquidation in bankruptcy. As part of these transactions, we may transfer your personal information to a successor entity upon a merger, consolidation or other corporate reorganization in which Vercel participates, or to a purchaser or acquirer of all or a portion of Vercel's assets, bankruptcy included. + +### Vercel Customers + +When we act on behalf of our Customers (as a data processor or service provider), we may provide End Users' personal information to our Customers in order to comply with their requests, End Users' requests and/or regulator requests, among others. Occasionally, we will provide our Customers with aggregated information that does not identify End Users directly, in order to provide information about usage, demographics (such as location) or other general information.
+ +### Event Sponsors + +If you are an Event Attendee that has registered for an Event with Vercel, such as the Next.js conference, we will provide your personal information to the Event sponsors. Such Event sponsors may use your personal information for their own direct marketing purposes and otherwise in accordance with the terms of the Event sponsors' own privacy policies. + +### Anonymized Information + +We share aggregated, automatically-collected or otherwise non-personal information with third parties for various purposes, including (i) compliance with reporting obligations; (ii) for business or marketing purposes; (iii) to assist us and other parties in understanding our users' interests, habits and usage patterns for certain programs, content, services, marketing and/or functionality available through the Platform. We do not share personal information about you in this case. + +### Legal Obligations and Security + +In addition, Vercel will preserve or disclose your personal information in limited circumstances (other than as set forth in this Privacy Policy), including: (i) with your consent; (ii) when we have a good faith belief it is required by law, such as pursuant to a subpoena, warrant or other judicial or administrative order (as further explained below); (iii) to protect the safety of any person and to protect the safety or security of our Platform or to prevent spam, abuse, or other malicious activity of actors with respect to the Platform; or (iv) to protect our rights or property or the rights or property of those who use the Platform. If we are required to disclose personal information by law, such as pursuant to a subpoena, warrant or other judicial or administrative order, our policy is to respond to requests that are properly issued by law enforcement within the United States or via mutual legal assistance mechanism (such as a treaty).
Note that if we receive information that provides us with a good faith belief that there is an exigent emergency involving the danger of death or serious physical injury to a person, we may provide information to law enforcement trying to prevent or mitigate the danger (if we have it), to be determined on a case-by-case basis. + +### Links to Third Party Sites + +This Privacy Policy applies only to the Platform. The Platform may contain links to other websites not operated or controlled by Vercel ("Third Party Sites"). The policies and procedures we describe here do not apply to Third Party Sites, and links to such Third Party Sites on the Platform do not imply that Vercel endorses or has reviewed the Third Party Sites. We suggest contacting those sites directly for information on their privacy policies. + +## "Do Not Track" + +Vercel does not respond to Do Not Track ("DNT") browser signals. For more information on DNT settings generally, please visit https://allaboutdnt.com. + +## How to Change Your Communication Preferences + +To keep your information accurate, current, and complete, please contact us as specified below. We will take reasonable steps to update any information in our possession that you have previously submitted via the Platform. Note that you may also manage your communications preferences and the receipt of any commercial communication by clicking the "unsubscribe" link included at the bottom of all emails from Vercel. You may also adjust your preferences through your account settings if you have a Vercel account, or send an email to privacy@vercel.com. + +## Access and Accuracy + +Vercel uses reasonable efforts to keep your personal information accurate.
We will provide you with online access to your personal information so that you may review, update or correct personal information that we hold. Note that in order to protect your privacy and security, we will also take reasonable steps to verify your identity before granting you access or enabling you to make corrections. To access your personal information, please visit the relevant account management sections of our Platform. If you cannot access your personal information on the Platform, please send an email to privacy@vercel.com. + +## How Long Do We Keep Your Personal Information? + +### General Retention Periods + +We use the following criteria to determine our retention periods: + +- The amount, nature and sensitivity of your information; +- The reasons for which we collect and process the personal data; +- The length of time we have an ongoing relationship with you and provide you with access to our Platform; and +- Applicable legal requirements. + +We retain personal information for as long as needed to provide our Services. We regularly delete other information that is less essential to the provision of our Services in order to minimize our storage of data. We also will retain personal information that we've collected from you where we have an ongoing legitimate business need to do so (for example, to comply with applicable legal, tax or accounting requirements). Additionally, we cannot delete information when it is needed for the establishment, exercise or defense of legal claims (also known as a "litigation hold").
In this case, the information must be retained as long as needed for exercising respective potential legal claims. When we no longer have an ongoing legitimate business need to process your personal information, we will either delete or anonymize it or, if this is not possible (for example, because your personal information has been stored in backup archives), we will securely store your personal information and isolate it from any further processing until deletion is possible. For any questions about data retention, please contact privacy@vercel.com. + +### Anonymization + +In some instances, we may choose to anonymize your personal data instead of deleting it, for example for statistical use. When we choose to anonymize, we make sure that there is no way that the personal data can be linked back to you or any specific user. + +## Data Security And Integrity + +We take steps that are reasonably necessary to securely provide our Platform. We have put in place reasonably appropriate security measures designed to prevent your personal data from being accidentally lost, used or accessed in an unauthorized way, altered or disclosed. We limit access to personal data only to those employees, agents, contractors and the third parties who have a business need-to-know. We also have procedures in place to deal with any suspected data security breach. If required, we will notify you and any applicable regulator of a suspected data security breach. We also require those parties to whom we transfer your personal information to provide acceptable standards of security. Notwithstanding, no Internet or email transmission is ever fully secure or error free. In particular, email sent to or from the Services may not be secure. Therefore, take special care in deciding what information you send to us via email. For any questions about the security of your information, please contact privacy@vercel.com.
+ +## Your California Privacy Rights + +If you are a California resident, California Civil Code Section 1798.83 permits you to request in writing a list of the third parties to which we have disclosed certain categories of personal information during the preceding year for the third parties' direct marketing purposes. To make such a request, please contact us at privacy@vercel.com. + +## Notice to Nevada Consumers + +We do not sell your personal information within the scope of, and according to the defined meaning of, a "sale" under NRS 603A. + +## International Transfers + +Vercel is a United States Delaware corporation, with storage of your information primarily in the United States and the EEA. To facilitate our global operations, we may process personal information from around the world, including from other countries in which Vercel has operations, in order to provide the Platform. **If you are accessing or using our Platform or otherwise providing personal information to us, you are agreeing and consenting to the processing of your personal information in the United States and other jurisdictions in which we operate**. If you are a Customer, you are responsible for informing your End Users of how and where their personal data will be processed at the time of collection. With respect to personal data transferred from the EEA or the UK, Vercel relies on other mechanisms, including the Standard Contractual Clauses and consent, to transfer personal data that we receive from the EEA or the UK to Vercel in the United States. + +**Security.** We maintain security measures to protect personal data as described in the section above entitled "Data Security And Integrity". + +**Access.** Individuals in the EEA, the UK and Switzerland have certain rights to access, correct, amend, or delete personal data.
Please see the section below entitled "Additional Information for Users in the EEA and the UK" for more information on the rights of users in the EEA and the UK (and, to the extent applicable, users in Switzerland). Note that when we process End Users' personal data as a data processor on behalf of our Customers, we will process such requests pursuant to our contract with our Customer and our Customers' instructions. + +**Recourse, Enforcement, Liability.** With respect to personal data collected from individuals in the EEA and the UK (while Privacy Shield was in place) or Switzerland, Vercel commits to resolve complaints about our processing of your personal data as set forth in the principles under Privacy Shield. Individuals in the EEA and the UK and Switzerland with inquiries or complaints should first contact Vercel by email at privacy@vercel.com or via postal mail at 340 S Lemon Ave #4133, Walnut, CA 91789, with the subject "Attention: Privacy Shield". We have further committed to refer unresolved Privacy Shield complaints to an alternative dispute resolution provider. If you have an unresolved privacy or data use concern that we have not addressed satisfactorily, please contact our U.S.-based third-party dispute resolution provider (free of charge) at https://feedback-form.truste.com/watchdog/request. For additional information, please visit https://www.privacyshield.gov/article?id=ANNEX-I-introduction. We are subject to the investigatory and enforcement powers of the Federal Trade Commission with respect to personal data received or transferred pursuant to the Frameworks. + +### Additional Information for Users in the EEA and the UK.
+ +If the GDPR applies to you because you are in the EEA or the UK, you have certain rights in relation to your personal data: + +The right to be informed: our obligation to inform you that we process your personal data (and that's what we're doing in this Privacy Policy); + +The right of access: your right to request a copy of the personal data we hold about you (also known as a 'data subject access request'); + +The right of rectification: your right to request that we correct personal data about you if it is incomplete or inaccurate (though we generally recommend first making any changes in your Account Settings); + +The right to erasure (also known as the 'right to be forgotten'): under certain circumstances, you may ask us to delete the personal data we have about you (unless it remains necessary for us to continue processing your personal data for a legitimate business need or to comply with a legal obligation as permitted under the GDPR, in which case we will inform you); + +The right to restrict processing: your right, under certain circumstances, to ask us to suspend our processing of your personal data; + +The right to data portability: your right to ask us for a copy of your personal data in a common format (for example, a .csv file); + +The right to object: your right to object to us processing your personal data (for example, if you object to us processing your data for direct marketing); and + +Rights in relation to automated decision-making and profiling: our obligation to be transparent about any profiling we do, or any automated decision-making. These rights are subject to certain rules around when you can exercise them. + +### Customers, Site Visitors and Event Attendees in the EEA or the UK.
+ +If you are located in the EEA or the UK and you are a Customer, Site Visitor or Event Attendee, and wish to exercise any of the rights set out above, you may contact us at privacy@vercel.com using the term "DSR" as your email subject line. You will not have to pay a fee to access your personal data (or to exercise any of the other rights) unless your request is clearly unfounded, repetitive or excessive. Alternatively, we may refuse to comply with your request under those circumstances. If we cannot reasonably verify your identity, we will not be able to comply with your request(s). We may need to request specific information from you to help us confirm your identity. This is a security measure to ensure that personal data is not disclosed to any person who has no right to receive it. Note that this is especially true when you engage a third party to assist you in exercising your rights. We will respond to all legitimate requests within one month. Occasionally it may take us longer than a month if your request is particularly complex or you have made a number of requests. In this case, we will notify you and keep you updated as required by law. In addition, we will always balance your rights against those of other data subjects in connection with any requests, and in some cases this may require us to redact our responses or deny a request. + +If you no longer wish to receive our marketing/promotional information, we remind you that you may withdraw your consent to direct marketing at any time directly from the unsubscribe link included in each electronic marketing message we send to you.
If you do so, we will promptly update our databases, and will take all reasonable steps to meet your request at the earliest possible opportunity, but we may continue to contact you to the extent necessary for the purposes of providing our Services. Finally, you have the right to make a complaint at any time to the supervisory authority for data protection issues in your country of residence. We would, however, appreciate the chance to address your concerns before you approach the supervisory authority, so please contact us directly first. + +### End Users in the EEA or the UK. + +Vercel has no direct relationship with End Users. Our Customers are solely responsible for ensuring compliance with all applicable laws and regulations with respect to their End Users, and this includes handling all data subject requests. We rely on our Customers to comply with the underlying legal requirements and respond directly to End Users when End Users wish to exercise the rights set forth above. However, if an End User sends a request to Vercel to access, correct, update, or delete his/her information, or no longer wishes to be contacted by a Customer that uses our Services, we will direct that End User to contact the Customer's website(s) with which he/she interacted directly, and cooperate with our Customers as required by applicable law in order to ensure that our Customers satisfy their End Users' requests. + +## European Representative + +Pursuant to Article 27 of the General Data Protection Regulation (GDPR), Vercel has appointed European Data Protection Office (EDPO) as its GDPR representative in the EU. You can contact EDPO regarding matters pertaining to the GDPR by: +Using EDPO's online request form at https://edpo.com/gdpr-data-request/; or +Writing to EDPO at Avenue Huart Hamoir 71, 1030 Brussels, Belgium. + +## UK Representative + +Pursuant to the UK GDPR, Vercel has appointed EDPO UK Ltd as its UK GDPR representative in the UK.
You can contact EDPO UK regarding matters pertaining to the UK GDPR: +Using EDPO's UK online request form at https://edpo.com/gdpr-data-request/; or +Writing to EDPO UK at 8 Northumberland Avenue, London WC2N 5BY, United Kingdom. + +## Contact Us + +If you have any questions or suggestions regarding this Privacy Policy, please contact us at privacy@vercel.com. + +Last update: December 8th, 2021 diff --git a/docs/pages/repo/_meta.json b/docs/pages/repo/_meta.json new file mode 100644 index 0000000..4941080 --- /dev/null +++ b/docs/pages/repo/_meta.json @@ -0,0 +1,15 @@ +{ + "index": { + "type": "page", + "display": "hidden", + "theme": { + "layout": "raw", + "sidebar": false, + "toc": true + } + }, + "docs": { + "title": "Docs", + "display": "children" + } +} diff --git a/docs/pages/repo/docs/_meta.json b/docs/pages/repo/docs/_meta.json new file mode 100644 index 0000000..750a466 --- /dev/null +++ b/docs/pages/repo/docs/_meta.json @@ -0,0 +1,20 @@ +{ + "index": { + "title": "Quickstart" + }, + "installing": "Installing Turborepo", + "getting-started": "Getting Started", + "core-concepts": "Core Concepts", + "reference": "API Reference", + "ci": "CI Recipes", + "troubleshooting": "Troubleshooting", + "handbook": "Monorepo Handbook", + "changelog": { + "title": "Changelog", + "href": "https://github.com/vercel/turbo/releases", + "newWindow": true + }, + "upgrading-to-v1": "Upgrading to v1", + "acknowledgements": "Acknowledgements", + "faq": "FAQ" +} diff --git a/docs/pages/repo/docs/acknowledgements.mdx b/docs/pages/repo/docs/acknowledgements.mdx new file mode 100644 index 0000000..50a5c46 --- /dev/null +++ b/docs/pages/repo/docs/acknowledgements.mdx @@ -0,0 +1,45 @@ +--- +title: Acknowledgements and Prior Art +description: Thank you to all these developers, build systems, and monorepo tools for their support and assistance.
+--- + +# Acknowledgements + +Turborepo was originally created by [Jared Palmer](https://twitter.com/jaredpalmer) as a closed-source enterprise software offering. In late 2021, [Vercel acquired Turborepo](https://vercel.com/blog/vercel-acquires-turborepo) and open sourced the codebase. + +Today, Turborepo has a dedicated full-time team working on it, as well as a growing list of [open source contributors](https://github.com/vercel/turbo/graphs/contributors). + +## Inspiration / Prior Art + +At [Vercel](https://vercel.com/), we believe deeply in the open source movement and in the power of open collaboration. To that end, it's important to provide meaningful attribution to the projects and people that inspire(d) us and our work. + +We'd like to make a special shoutout to other build systems, monorepo tools, and prior art: + +- Bazel - https://bazel.build +- Buck - https://buck.build +- Please - https://please.build +- Pants - https://www.pantsbuild.org +- Scoot - https://github.com/twitter/scoot +- TSDX - https://tsdx.io +- Lerna - https://lerna.js.org +- Lage - https://microsoft.github.io/lage +- Backfill - https://github.com/microsoft/backfill +- Bolt - https://github.com/boltpkg/bolt +- Rush - https://rushjs.io +- Preconstruct - https://preconstruct.tools +- Nx - https://nx.dev +- Yarn - https://yarnpkg.com +- npm - https://www.npmjs.com +- pnpm - https://pnpm.js.org + +Throughout the documentation, wherever applicable, we also provide inline callouts and links to the projects and people that have inspired us. + +## Additional Thanks + +Additionally, we're grateful to: + +- [Rick Button](https://twitter.com/rickbutton) for donating the `turbo` package name on npm +- [Iheanyi Ekechukwu](https://twitter.com/kwuchu) for helping Jared pick up Golang during the Pandemic!
+- [Kenneth Chau](https://twitter.com/kenneth_chau) for Lage's Scope and Pipeline API and docs
+- [Miguel Oller](https://mobile.twitter.com/ollermi) and [MakeSwift.com](https://www.makeswift.com/) for piloting Turborepo
+- [Eric Koslow](https://twitter.com/ekosz1), [Jack Hanford](https://twitter.com/jackhanford), and [Lattice.com](https://lattice.com/) for piloting Turborepo
diff --git a/docs/pages/repo/docs/ci.mdx b/docs/pages/repo/docs/ci.mdx
new file mode 100644
index 0000000..779a891
--- /dev/null
+++ b/docs/pages/repo/docs/ci.mdx
@@ -0,0 +1,13 @@
+---
+title: Continuous Integration
+description: Recipes for using Turborepo with GitHub Actions, CircleCI, and other continuous integration providers.
+---
+
+# Continuous Integration
+
+Turborepo speeds up not only your builds, but also your CI pipeline. Below are a few ways to use Turborepo with various Continuous Integration providers.
+
+- [CircleCI](/repo/docs/ci/circleci)
+- [GitHub Actions](/repo/docs/ci/github-actions)
+- [GitLab CI](/repo/docs/ci/gitlabci)
+- [Travis CI](/repo/docs/ci/travisci)
diff --git a/docs/pages/repo/docs/ci/_meta.json b/docs/pages/repo/docs/ci/_meta.json
new file mode 100644
index 0000000..b9ed68d
--- /dev/null
+++ b/docs/pages/repo/docs/ci/_meta.json
@@ -0,0 +1,6 @@
+{
+  "circleci": "CircleCI",
+  "github-actions": "GitHub Actions",
+  "gitlabci": "GitLab CI",
+  "travisci": "Travis CI"
+}
diff --git a/docs/pages/repo/docs/ci/circleci.mdx b/docs/pages/repo/docs/ci/circleci.mdx
new file mode 100644
index 0000000..c102dab
--- /dev/null
+++ b/docs/pages/repo/docs/ci/circleci.mdx
@@ -0,0 +1,145 @@
+---
+title: Using Turborepo with CircleCI
+description: How to use CircleCI with Turborepo to optimize your CI workflow
+---
+
+import { Tabs, Tab } from "../../../../components/Tabs";
+
+# Using Turborepo with CircleCI
+
+The following example shows how to use Turborepo with [CircleCI](https://circleci.com/).
+
+For a given root `package.json`:
+
+```json
+{
+  "name": "my-turborepo",
+  "scripts": {
+    "build": "turbo run build",
+    "test": "turbo run test"
+  },
+  "devDependencies": {
+    "turbo": "1.2.5"
+  }
+}
+```
+
+And a `turbo.json`:
+
+```json
+{
+  "$schema": "https://turbo.build/schema.json",
+  "pipeline": {
+    "build": {
+      "outputs": [".next/**", "!.next/cache/**"],
+      "dependsOn": ["^build"]
+    },
+    "test": {
+      "dependsOn": ["^build"]
+    }
+  }
+}
```
+```
+
+Create a file called `.circleci/config.yml` in your repository with the following contents:
+
+<Tabs items={['npm', 'yarn', 'pnpm']} storageKey="selected-pkg-manager">
+  <Tab>
+    ```yaml
+    version: 2.1
+    orbs:
+      node: circleci/node@5.0.2
+    workflows:
+      test:
+        jobs:
+          - test
+    jobs:
+      test:
+        docker:
+          - image: cimg/node:lts
+        steps:
+          - checkout
+          - node/install-packages
+          - run:
+              command: npm run build
+          - run:
+              command: npm run test
+    ```
+  </Tab>
+  <Tab>
+    ```yaml
+    version: 2.1
+    orbs:
+      node: circleci/node@5.0.2
+    workflows:
+      test:
+        jobs:
+          - test
+    jobs:
+      test:
+        docker:
+          - image: cimg/node:lts
+        steps:
+          - checkout
+          - node/install-packages:
+              pkg-manager: yarn
+          - run:
+              command: yarn build
+          - run:
+              command: yarn test
+    ```
+  </Tab>
+  <Tab>
+    ```yaml
+    version: 2.1
+    orbs:
+      node: circleci/node@5.0.2
+    workflows:
+      test:
+        jobs:
+          - test
+    jobs:
+      test:
+        docker:
+          - image: cimg/node:lts
+        steps:
+          - checkout
+          - node/install-packages
+          - run:
+              command: npm i -g pnpm
+          - run:
+              command: pnpm build
+          - run:
+              command: pnpm test
+    ```
+  </Tab>
+</Tabs>
+
+## Remote Caching
+
+To use Remote Caching with CircleCI, add the following environment variables to your CircleCI project
+to make them available to your `turbo` commands.
+
+- `TURBO_TOKEN` - The Bearer token to access the Remote Cache
+- `TURBO_TEAM` - The account to which the monorepo belongs
+
+To use Vercel Remote Caching, you can get the value of these variables in a few steps:
+
+1.
Create a Scoped Access Token to your account in the [Vercel Dashboard](https://vercel.com/account/tokens) + +![Vercel Access Tokens](/images/docs/vercel-create-token.png) + +Copy the value to a safe place. You'll need it in a moment. + +2. Go to your CircleCI project settings and click on the **Environment Variables** tab. Create a new secret called `TURBO_TOKEN` and enter the value of your Scoped Access Token. + +![CircleCI Environment Variables](/images/docs/circleci-environment-variables.png) +![CircleCI Create Environment Variables](/images/docs/circleci-create-environment-variables.png) + +3. Make a second secret called `TURBO_TEAM` and enter the value of your team's Vercel URL _without_ the `vercel.com/`. Your Team URL can be found inside your team's general project settings from the dashboard. + + If you're using a Hobby Plan, you can use your username. Your username can be found in your [Vercel Personal Account Settings](https://vercel.com/account) + +![Vercel Account Slug](/images/docs/vercel-slug.png) + +4. CircleCI automatically loads environment variables stored in project settings into the CI environment. No modifications are necessary for the CI file. diff --git a/docs/pages/repo/docs/ci/github-actions.mdx b/docs/pages/repo/docs/ci/github-actions.mdx new file mode 100644 index 0000000..3b69a94 --- /dev/null +++ b/docs/pages/repo/docs/ci/github-actions.mdx @@ -0,0 +1,231 @@ +--- +title: Using Turborepo with GitHub Actions +description: How to use GitHub Actions with Turborepo to optimize your CI workflow +--- + +import { Tabs, Tab } from "../../../../components/Tabs"; + +# Using Turborepo with GitHub Actions + +The following example shows how to use Turborepo with [GitHub Actions](https://github.com/features/actions). 
+
+For a given root `package.json`:
+
+```json
+{
+  "name": "my-turborepo",
+  "scripts": {
+    "build": "turbo run build",
+    "test": "turbo run test"
+  },
+  "devDependencies": {
+    "turbo": "1.2.5"
+  }
+}
+```
+
+And a `turbo.json`:
+
+```json
+{
+  "$schema": "https://turbo.build/schema.json",
+  "pipeline": {
+    "build": {
+      "outputs": [".next/**", "!.next/cache/**"],
+      "dependsOn": ["^build"]
+    },
+    "test": {
+      "dependsOn": ["^build"]
+    }
+  }
+}
+```
+
+Create a file called `.github/workflows/ci.yml` in your repository with the following contents:
+
+<Tabs items={['npm', 'yarn', 'pnpm']} storageKey="selected-pkg-manager">
+  <Tab>
+    ```yaml
+    name: CI
+
+    on:
+      push:
+        branches: ["main"]
+      pull_request:
+        types: [opened, synchronize]
+
+    jobs:
+      build:
+        name: Build and Test
+        timeout-minutes: 15
+        runs-on: ubuntu-latest
+        # To use Remote Caching, uncomment the next lines and follow the steps below.
+        # env:
+        #  TURBO_TOKEN: ${{ secrets.TURBO_TOKEN }}
+        #  TURBO_TEAM: ${{ secrets.TURBO_TEAM }}
+        #  TURBO_REMOTE_ONLY: true
+
+        steps:
+          - name: Check out code
+            uses: actions/checkout@v3
+            with:
+              fetch-depth: 2
+
+          - name: Setup Node.js environment
+            uses: actions/setup-node@v3
+            with:
+              node-version: 16
+              cache: 'npm'
+
+          - name: Install dependencies
+            run: npm install
+
+          - name: Build
+            run: npm run build
+
+          - name: Test
+            run: npm run test
+    ```
+
+  </Tab>
+  <Tab>
+    ```yaml
+    name: CI
+
+    on:
+      push:
+        branches: ["main"]
+      pull_request:
+        types: [opened, synchronize]
+
+    jobs:
+      build:
+        name: Build and Test
+        timeout-minutes: 15
+        runs-on: ubuntu-latest
+        # To use Remote Caching, uncomment the next lines and follow the steps below.
+ # env: + # TURBO_TOKEN: ${{ secrets.TURBO_TOKEN }} + # TURBO_TEAM: ${{ secrets.TURBO_TEAM }} + + steps: + - name: Check out code + uses: actions/checkout@v3 + with: + fetch-depth: 2 + + - name: Setup Node.js environment + uses: actions/setup-node@v3 + with: + node-version: 16 + cache: 'yarn' + + - name: Install dependencies + run: yarn + + - name: Build + run: yarn build + + - name: Test + run: yarn test + ``` + + </Tab> + <Tab> + ```yaml + name: CI + + on: + push: + branches: ["main"] + pull_request: + types: [opened, synchronize] + + jobs: + build: + name: Build and Test + timeout-minutes: 15 + runs-on: ubuntu-latest + # To use Remote Caching, uncomment the next lines and follow the steps below. + # env: + # TURBO_TOKEN: ${{ secrets.TURBO_TOKEN }} + # TURBO_TEAM: ${{ secrets.TURBO_TEAM }} + + steps: + - name: Check out code + uses: actions/checkout@v3 + with: + fetch-depth: 2 + + - uses: pnpm/action-setup@v2.0.1 + with: + version: 6.32.2 + + - name: Setup Node.js environment + uses: actions/setup-node@v3 + with: + node-version: 16 + cache: 'pnpm' + + - name: Install dependencies + run: pnpm install + + - name: Build + run: pnpm build + + - name: Test + run: pnpm test + ``` + + </Tab> +</Tabs> + +## Remote Caching + +To use Remote Caching with GitHub Actions, add the following environment variables to your GitHub Actions workflow +to make them available to your `turbo` commands. + +- `TURBO_TOKEN` - The Bearer token to access the Remote Cache +- `TURBO_TEAM` - The account to which the monorepo belongs + +To use Vercel Remote Caching, you can get the value of these variables in a few steps: + +1. Create a Scoped Access Token to your account in the [Vercel Dashboard](https://vercel.com/account/tokens) + +![Vercel Access Tokens](/images/docs/vercel-create-token.png) + +Copy the value to a safe place. You'll need it in a moment. + +2. Go to your GitHub repository settings and click on the **Secrets** and then **Actions** tab. 
Create a new secret called `TURBO_TOKEN` and enter the value of your Scoped Access Token. + +![GitHub Secrets](/images/docs/github-actions-secrets.png) +![GitHub Secrets Create](/images/docs/github-actions-create-secret.png) + +3. Make a second secret called `TURBO_TEAM` and enter the value of your team's Vercel URL _without_ the `vercel.com/`. Your Team URL can be found inside your team's general project settings from the dashboard. + + If you're using a Hobby Plan, you can use your username. Your username can be found in your [Vercel Personal Account Settings](https://vercel.com/account) + +![Vercel Account Slug](/images/docs/vercel-slug.png) + +4. At the top of your GitHub Actions workflow, provide the following environment variables to jobs that use `turbo`: + +```yaml highlight="6-8" +# ... + +jobs: + build: + name: Build and Test + timeout-minutes: 15 + runs-on: ubuntu-latest + # To use Turborepo Remote Caching, set the following environment variables for the job. + env: + TURBO_TOKEN: ${{ secrets.TURBO_TOKEN }} + TURBO_TEAM: ${{ secrets.TURBO_TEAM }} + + steps: + - name: Check out code + uses: actions/checkout@v3 + with: + fetch-depth: 2 + # ... +``` diff --git a/docs/pages/repo/docs/ci/gitlabci.mdx b/docs/pages/repo/docs/ci/gitlabci.mdx new file mode 100644 index 0000000..f8321d0 --- /dev/null +++ b/docs/pages/repo/docs/ci/gitlabci.mdx @@ -0,0 +1,136 @@ +--- +title: Using Turborepo with GitLab CI +description: How to use GitLab CI with Turborepo to optimize your CI workflow +--- + +import { Tabs, Tab } from "../../../../components/Tabs"; + +# Using Turborepo with GitLab CI + +The following example shows how to use Turborepo with [GitLab CI](https://docs.gitlab.com/ee/ci/). 
+
+For a given root `package.json`:
+
+```json
+{
+  "name": "my-turborepo",
+  "scripts": {
+    "build": "turbo run build",
+    "test": "turbo run test"
+  },
+  "devDependencies": {
+    "turbo": "1.2.5"
+  }
+}
+```
+
+And a `turbo.json`:
+
+```json
+{
+  "$schema": "https://turbo.build/schema.json",
+  "pipeline": {
+    "build": {
+      "outputs": [".svelte-kit/**"],
+      "dependsOn": ["^build"]
+    },
+    "test": {
+      "dependsOn": ["^build"]
+    }
+  }
+}
+```
+
+Create a file called `.gitlab-ci.yml` in your repository with the following contents:
+
+<Tabs items={['npm', 'yarn', 'pnpm']} storageKey="selected-pkg-manager">
+  <Tab>
+    ```yaml
+    image: node:latest
+    # To use Remote Caching, uncomment the next lines and follow the steps below.
+    # variables:
+    #   TURBO_TOKEN: $TURBO_TOKEN
+    #   TURBO_TEAM: $TURBO_TEAM
+    stages:
+      - build
+    build:
+      stage: build
+      script:
+        - npm install
+        - npm run build
+        - npm run test
+    ```
+  </Tab>
+  <Tab>
+    ```yaml
+    image: node:latest
+    # To use Remote Caching, uncomment the next lines and follow the steps below.
+    # variables:
+    #   TURBO_TOKEN: $TURBO_TOKEN
+    #   TURBO_TEAM: $TURBO_TEAM
+    stages:
+      - build
+    build:
+      stage: build
+      script:
+        - yarn install
+        - yarn build
+        - yarn test
+      cache:
+        paths:
+          - node_modules/
+          - .yarn
+    ```
+  </Tab>
+  <Tab>
+    ```yaml
+    image: node:latest
+    # To use Remote Caching, uncomment the next lines and follow the steps below.
+    # variables:
+    #   TURBO_TOKEN: $TURBO_TOKEN
+    #   TURBO_TEAM: $TURBO_TEAM
+    stages:
+      - build
+    build:
+      stage: build
+      before_script:
+        - curl -f https://get.pnpm.io/v6.16.js | node - add --global pnpm@6.32.2
+        - pnpm config set store-dir .pnpm-store
+      script:
+        - pnpm install
+        - pnpm build
+        - pnpm test
+      cache:
+        key: "$CI_COMMIT_REF_SLUG"
+        paths:
+          - .pnpm-store
+    ```
+    > For more information, see the [pnpm documentation on GitLab CI integration](https://pnpm.io/continuous-integration#gitlab).
+  </Tab>
+</Tabs>
+
+## Remote Caching
+
+To use Remote Caching with GitLab CI, add the following environment variables to your GitLab CI project.
+
+- `TURBO_TOKEN` - The Bearer token to access the Remote Cache
+- `TURBO_TEAM` - The account to which the monorepo belongs
+
+To use Vercel Remote Caching, you can get the value of these variables in a few steps:
+
+1. Create a Scoped Access Token to your account in the [Vercel Dashboard](https://vercel.com/account/tokens)
+
+![Vercel Access Tokens](/images/docs/vercel-create-token.png)
+
+Copy the value to a safe place. You'll need it in a moment.
+
+2. Go to your GitLab repository settings and click on the **Settings** and then **CI/CD** tab. Create a new variable called `TURBO_TOKEN` and enter the value of your Scoped Access Token.
+
+![GitLab CI Variables](/images/docs/gitlab-ci-variables.png)
+![GitLab CI Create Variable](/images/docs/gitlab-ci-create-variable.png)
+
+3. Make a second secret called `TURBO_TEAM` and enter the value of your team's Vercel URL _without_ the `vercel.com/`. Your Team URL can be found inside your team's general project settings from the dashboard.
+
+   If you're using a Hobby Plan, you can use your username.
Your username can be found in your [Vercel Personal Account Settings](https://vercel.com/account)
+
+![Vercel Account Slug](/images/docs/vercel-slug.png)
diff --git a/docs/pages/repo/docs/ci/travisci.mdx b/docs/pages/repo/docs/ci/travisci.mdx
new file mode 100644
index 0000000..29a09e5
--- /dev/null
+++ b/docs/pages/repo/docs/ci/travisci.mdx
@@ -0,0 +1,122 @@
+---
+title: Using Turborepo with Travis CI
+description: How to use Travis CI with Turborepo to optimize your CI workflow
+---
+
+import { Tabs, Tab } from "../../../../components/Tabs";
+
+# Using Turborepo with Travis CI
+
+The following example shows how to use Turborepo with [Travis CI](https://www.travis-ci.com/).
+
+For a given root `package.json`:
+
+```json
+{
+  "name": "my-turborepo",
+  "scripts": {
+    "build": "turbo run build",
+    "test": "turbo run test"
+  },
+  "devDependencies": {
+    "turbo": "1.2.5"
+  }
+}
+```
+
+And a `turbo.json`:
+
+```json
+{
+  "$schema": "https://turbo.build/schema.json",
+  "pipeline": {
+    "build": {
+      "outputs": [".svelte-kit/**"],
+      "dependsOn": ["^build"]
+    },
+    "test": {
+      "dependsOn": ["^build"]
+    }
+  }
+}
+```
+
+Create a file called `.travis.yml` in your repository with the following contents:
+
+<Tabs items={['npm', 'yarn', 'pnpm']} storageKey="selected-pkg-manager">
+  <Tab>
+    ```yaml
+    language: node_js
+    node_js:
+      - lts/*
+    install:
+      - npm install
+    script:
+      - npm run build
+      - npm run test
+    ```
+  </Tab>
+  <Tab>
+    Travis CI detects the use of Yarn by the presence of `yarn.lock`. It will automatically ensure it is installed.
+    ```yaml
+    language: node_js
+    node_js:
+      - lts/*
+    install:
+      - yarn
+    script:
+      - yarn build
+      - yarn test
+    ```
+  </Tab>
+  <Tab>
+    ```yaml
+    language: node_js
+    node_js:
+      - lts/*
+    cache:
+      npm: false
+      directories:
+        - "~/.pnpm-store"
+    before_install:
+      - curl -f https://get.pnpm.io/v6.16.js | node - add --global pnpm@6.32.2
+      - pnpm config set store-dir ~/.pnpm-store
+    install:
+      - pnpm install
+    script:
+      - pnpm build
+      - pnpm test
+    ```
+    > For more information, see the [pnpm documentation on Travis CI integration](https://pnpm.io/continuous-integration#travis).
+  </Tab>
+</Tabs>
+
+## Remote Caching
+
+To use Remote Caching with Travis CI, add the following environment variables to your Travis CI project.
+
+- `TURBO_TOKEN` - The Bearer token to access the Remote Cache
+- `TURBO_TEAM` - The account to which the monorepo belongs
+
+To use Vercel Remote Caching, you can get the value of these variables in a few steps:
+
+1. Create a Scoped Access Token to your account in the [Vercel Dashboard](https://vercel.com/account/tokens)
+
+![Vercel Access Tokens](/images/docs/vercel-create-token.png)
+
+Copy the value to a safe place. You'll need it in a moment.
+
+2. Go to your Travis repository settings and scroll down to the _Environment Variables_ section. Create a new variable called `TURBO_TOKEN` and enter the value of your Scoped Access Token.
+
+![Travis CI Variables](/images/docs/travis-ci-environment-variables.png)
+
+3. Make a second secret called `TURBO_TEAM` and enter the value of your team's Vercel URL _without_ the `vercel.com/`. Your Team URL can be found inside your team's general project settings from the dashboard.
+
+   If you're using a Hobby Plan, you can use your username. Your username can be found in your [Vercel Personal Account Settings](https://vercel.com/account)
+
+![Vercel Account Slug](/images/docs/vercel-slug.png)
+
+4.
Travis CI automatically loads environment variables stored in project settings into the CI environment. No modifications are necessary for the CI file.
diff --git a/docs/pages/repo/docs/core-concepts/_meta.json b/docs/pages/repo/docs/core-concepts/_meta.json
new file mode 100644
index 0000000..92e93f2
--- /dev/null
+++ b/docs/pages/repo/docs/core-concepts/_meta.json
@@ -0,0 +1,8 @@
+{
+  "caching": "Caching Tasks",
+  "remote-caching": "Remote Caching",
+  "monorepos": "Monorepos",
+  "scopes": {
+    "display": "hidden"
+  }
+}
diff --git a/docs/pages/repo/docs/core-concepts/caching.mdx b/docs/pages/repo/docs/core-concepts/caching.mdx
new file mode 100644
index 0000000..b815977
--- /dev/null
+++ b/docs/pages/repo/docs/core-concepts/caching.mdx
@@ -0,0 +1,460 @@
+---
+title: Caching
+description: Caching allows Turborepo to skip work that's already been done, for the fastest builds.
+---
+
+import Callout from "../../../../components/Callout";
+
+# Caching Tasks
+
+Every JavaScript or TypeScript codebase will need to run `package.json` scripts, like `build`, `test` and `lint`. In Turborepo, we call these **tasks**.
+
+Turborepo can cache the results and logs of your tasks - leading to enormous speedups for slow tasks.
+
+## Missing the cache
+
+Each task in your codebase has **inputs** and **outputs**.
+
+- A `build` task might take source files as inputs, and produce bundled files as well as logs to `stderr` and `stdout` as outputs.
+- A `lint` or `test` task might take source files as inputs, and produce only logs to `stdout` and `stderr` as outputs.
+
+Let's say you run a `build` task with Turborepo using `turbo run build`:
+
+![](/images/docs/cache-miss.png)
+
+1. Turborepo will **evaluate the inputs to your task** (by default, all non-git-ignored files in the workspace folder) and **turn them into a hash** (e.g. `78awdk123`).
+
+2. **Check the local filesystem cache** for a folder named with the hash (e.g. `./node_modules/.cache/turbo/78awdk123`).
+
+3.
If Turborepo doesn't find any matching artifacts for the calculated hash, Turborepo will then **execute the task**.
+
+4. Once the task is over, Turborepo **saves all the outputs** (including files and logs) into its cache under the hash.
+
+<Callout type="info">
+  Turborepo takes a lot of information into account when creating the hash -
+  source files, environment variables, and even the source files of dependent
+  workspaces. Learn more [below](/repo/docs/core-concepts/caching#hashing).
+</Callout>
+
+## Hitting the cache
+
+Let's say that you run the task again without changing any of its inputs:
+
+![](/images/docs/cache-hit.png)
+
+1. The **hash will be the same** because **the inputs haven't changed** (e.g. `78awdk123`)
+
+2. Turborepo will find the folder in its cache with the calculated hash (e.g. `./node_modules/.cache/turbo/78awdk123`)
+
+3. **Instead of running the task**, Turborepo will **replay the output** - printing the saved logs to `stdout` and restoring the saved output files to their respective positions in the filesystem.
+
+Restoring files and logs from the cache happens near-instantaneously. This can take your build times from minutes or hours down to seconds or milliseconds. Although specific results will vary depending on the shape and granularity of your codebase's dependency graph, most teams find that they can cut their overall monthly build time by around 40-85% with Turborepo's caching.
+
+## Configuring Cache Outputs
+
+Using [`pipeline`](/repo/docs/reference/configuration#pipeline), you can configure cache conventions across your Turborepo.
+
+To override the default cache output behavior, pass an array of globs to a [`pipeline.<task>.outputs`](/repo/docs/reference/configuration#outputs) array. Any file that satisfies the glob patterns for a task will be treated as an artifact.
+
+```jsonc
+{
+  "$schema": "https://turbo.build/schema.json",
+  "pipeline": {
+    "build": {
+      "outputs": [".next/**", "!.next/cache/**"],
+      "dependsOn": ["^build"]
+    },
+    "test": {
+      "dependsOn": ["build"]
+    }
+  }
+}
+```
+
+If your task does not emit any files (e.g. unit tests with Jest), you can omit `outputs`. Even without any file outputs, Turborepo automatically records and caches the logs of every task. If no inputs change (i.e. if there is a cache hit), subsequent runs will replay these logs.
+
+When you run `turbo run build test`, Turborepo will execute your build and test scripts,
+and cache their outputs in `./node_modules/.cache/turbo`.
+
+<Callout type="info">
+  Pro Tip for caching ESLint: You can get a cacheable pretty terminal output
+  (even for non-errors) by setting the `TIMING=1` environment variable before running `eslint`. Learn
+  more in the [ESLint
+  docs](https://eslint.org/docs/latest/developer-guide/working-with-rules#per-rule-performance).
+</Callout>
+
+## Configuring Cache Inputs
+
+A workspace is considered to have been updated when any of the files in that workspace have changed.
+However, for some tasks, we only want to rerun that task when relevant files have changed.
+Specifying `inputs` lets us define which files are relevant for a particular task. For example, the
+`test` configuration below declares that the `test` task only needs to execute if a `.tsx` or `.ts` file
+in the `src/` and `test/` subdirectories has changed since the last execution.
+
+```jsonc
+{
+  "$schema": "https://turbo.build/schema.json",
+  "pipeline": {
+    // ... omitted for brevity
+
+    "test": {
+      // A workspace's `test` task depends on that workspace's
+      // own `build` task being completed first.
+ "inputs": ["src/**/*.tsx", "src/**/*.ts", "test/**/*.ts"] + } + } +} +``` + +## Turn off caching + +Sometimes you really don't want to write the cache output (e.g. when you're using [`next dev`](https://nextjs.org/docs/api-reference/cli#development) or `react-scripts start` for live reloading). To disable cache writes, append `--no-cache` to any command: + +```shell +# Run `dev` npm script in all workspaces in parallel, +# but don't cache the output +turbo run dev --no-cache +``` + +Note that `--no-cache` disables cache writes but does not disable cache reads. If you want to disable cache reads, use the `--force` flag. + +You can also disable caching on specific tasks by setting the [`pipeline.<task>.cache`](/repo/docs/reference/configuration#cache) configuration to `false`: + +```json +{ + "$schema": "https://turbo.build/schema.json", + "pipeline": { + "dev": { + "cache": false, + "persistent": true + } + } +} +``` + +## Alter Caching Based on File Changes + +For some tasks, you may not want a cache miss if an irrelevant file has changed. For instance, updating `README.md` +might not need to trigger a cache miss for the `test` task. You can use `inputs` to restrict the set +of files `turbo` considers for a particular task. In this case, only consider `.ts` and `.tsx` files relevant for +determining a cache hit on the `test` task: + +```jsonc +{ + "$schema": "https://turbo.build/schema.json", + "pipeline": { + // ...other tasks + "test": { + "dependsOn": ["build"], + "inputs": ["src/**/*.tsx", "src/**/*.ts", "test/**/*.ts"] + } + } +} +``` + +<Callout type="info"> + `package.json` is *always* considered an input for tasks in the workspace it + lives in. This is because the *definition* of the task itself lives in + `package.json` in the `scripts` key. If you change that, any cached output is + considered invalid. +</Callout> + +If you want _all_ tasks to depend on certain files, you can declare this dependency in the +`globalDependencies` array. 
+ +```diff +{ + "$schema": "https://turbo.build/schema.json", ++ "globalDependencies": [".env"], + "pipeline": { + // ...other tasks + "test": { + "dependsOn": ["build"], + "inputs": ["src/**/*.tsx", "src/**/*.ts", "test/**/*.ts"] + } + } +} +``` + +<Callout type="info"> + `turbo.json` is *always* considered a global dependency. If you modify + `turbo.json`, all caches are invalidated. +</Callout> + +## Altering Caching Based on Environment Variables + +When you use `turbo` with tools that inline environment variables at build time +(e.g. [Next.js][1] or [Create React App][2]), it is important to tell `turbo` about it. +Otherwise, you could ship a cached build with the wrong environment variables! + +You can control `turbo`'s caching behavior based on +the values of environment variables: + +- Including environment variables in the `env` key in your `pipeline` definition will impact the cache fingerprint on a per-task or per-workspace-task basis. +- The value of any environment variable that includes `THASH` in its name will impact the cache fingerprint of _all_ tasks. + +```jsonc +{ + "$schema": "https://turbo.build/schema.json", + "pipeline": { + "build": { + "dependsOn": ["^build"], + // env vars will impact hashes of all "build" tasks + "env": ["SOME_ENV_VAR"], + "outputs": ["dist/**"] + }, + + // override settings for the "build" task for the "web" app + "web#build": { + "dependsOn": ["^build"], + "env": [ + // env vars that will impact the hash of "build" task for only "web" app + "STRIPE_SECRET_KEY", + "NEXT_PUBLIC_STRIPE_PUBLIC_KEY", + "NEXT_PUBLIC_ANALYTICS_ID" + ], + "outputs": [".next/**", "!.next/cache/**"] + } + } +} +``` + +<Callout type="info"> + Declaring environment variables in the `dependsOn` config with a `$` prefix is + deprecated. 
+</Callout> + +To alter the cache for _all_ tasks, you can declare environment variables in the +`globalEnv` array: + +```diff +{ + "$schema": "https://turbo.build/schema.json", + "pipeline": { + "build": { + "dependsOn": ["^build"], + // env vars will impact hashes of all "build" tasks + "env": ["SOME_ENV_VAR"], + "outputs": ["dist/**"] + }, + + // override settings for the "build" task for the "web" app + "web#build": { + "dependsOn": ["^build"], + "env": [ + // env vars that will impact the hash of "build" task for only "web" app + "STRIPE_SECRET_KEY", + "NEXT_PUBLIC_STRIPE_PUBLIC_KEY", + "NEXT_PUBLIC_ANALYTICS_ID" + ], + "outputs": [".next/**", "!.next/cache/**"], + }, + }, ++ "globalEnv": [ ++ "GITHUB_TOKEN" // env var that will impact the hashes of all tasks, ++ ] +} +``` + +### Automatic environment variable inclusion + +To help ensure correct caching across environments, Turborepo automatically infers and includes public environment variables when calculating cache keys for apps built with detected frameworks. You can safely omit framework-specific public environment variables from `turbo.json`: + +```diff filename="turbo.json" +{ + "pipeline": { + "build": { + "env": [ +- "NEXT_PUBLIC_EXAMPLE_ENV_VAR" + ] + } + } +} +``` + +Note that this automatic detection and inclusion only works if Turborepo successfully infers the framework your apps are built with. The supported frameworks and the environment variables that Turborepo will detect and include in the cache keys: + +- Astro: `PUBLIC_*` +- Blitz: `NEXT_PUBLIC_*` +- Create React App: `REACT_APP_*` +- Gatsby: `GATSBY_*` +- Next.js: `NEXT_PUBLIC_*` +- Nuxt.js: `NUXT_ENV_*` +- RedwoodJS: `REDWOOD_ENV_*` +- Sanity Studio: `SANITY_STUDIO_*` +- Solid: `VITE_*` +- SvelteKit: `VITE_*` +- Vite: `VITE_*` +- Vue: `VUE_APP_*` + +<Callout type="info"> + There are some exceptions to the list above. 
For various reasons, CI systems (including Vercel)
+  set environment variables that start with these prefixes even though they aren't part of your build
+  output. These can change unpredictably (even on every build!), invalidating Turborepo's
+  cache. To work around this, Turborepo uses a `TURBO_CI_VENDOR_ENV_KEY` variable to
+  _exclude_ environment variables from Turborepo's inference.
+
+For example, Vercel sets the `NEXT_PUBLIC_VERCEL_GIT_COMMIT_SHA` variable. This value changes on every build,
+so Vercel _also_ sets `TURBO_CI_VENDOR_ENV_KEY="NEXT_PUBLIC_VERCEL_"` to exclude these variables.
+
+Luckily, you only need to be aware of this on other build systems; you don't need to worry
+about these edge cases when using Turborepo on Vercel.
+
+</Callout>
+
+#### A note on monorepos
+
+The environment variables will only be included in the cache key for tasks in workspaces where that framework is used. In other words, environment variables inferred for Next.js apps will only be included in the cache key for workspaces detected as Next.js apps. Tasks in other workspaces in the monorepo will not be impacted.
+
+For example, consider a monorepo with three workspaces: a Next.js project, a Create React App project, and a TypeScript package. Each has a `build` script, and both apps depend on the TypeScript project. Let's say that this Turborepo has a standard `turbo.json` pipeline that builds them all in order:
+
+```jsonc filename="turbo.json"
+{
+  "pipeline": {
+    "build": {
+      "outputs": [".next/**", "!.next/cache/**", "dist/**"],
+      "dependsOn": ["^build"]
+    }
+  }
+}
+```
+
+As of 1.4, when you run `turbo run build`, Turborepo will not consider any build time environment variables relevant when building the TypeScript package. However, when building the Next.js app, Turborepo will infer that environment variables starting with `NEXT_PUBLIC_` could alter the output of the `.next` folder and should thus be included when calculating the hash.
Similarly, when calculating the hash of the Create React App's `build` script, all build time environment variables starting with `REACT_APP_` will be included.
+
+### `eslint-config-turbo`
+
+To further assist in detecting unseen dependencies creeping into your builds, and to help ensure that your Turborepo cache is correctly shared across every environment, use the [`eslint-config-turbo`](https://www.npmjs.com/package/eslint-config-turbo) package. While automatic environment variable inclusion should cover most situations with most frameworks, this ESLint config will provide just-in-time feedback for teams using other build time inlined environment variables. This will also help support teams using in-house frameworks that we cannot detect automatically.
+
+To get started, extend from `eslint-config-turbo` in your root [`eslintrc`](https://eslint.org/docs/latest/user-guide/configuring/configuration-files#configuration-file-formats) file:
+
+```jsonc
+{
+  // Automatically flag env vars missing from turbo.json
+  "extends": ["turbo"]
+}
+```
+
+For more control over the rules, you can install and configure the [`eslint-plugin-turbo`](https://www.npmjs.com/package/eslint-plugin-turbo) _plugin_ directly by first adding it to plugins and then configuring the desired rules:
+
+```jsonc
+{
+  "plugins": ["turbo"],
+  "rules": {
+    // Automatically flag env vars missing from turbo.json
+    "turbo/no-undeclared-env-vars": "error"
+  }
+}
+```
+
+The plugin will warn you if you are using non-framework-related environment variables in your code that have not been declared in your `turbo.json`.
+
+### Invisible Environment Variables
+
+Since Turborepo runs _before_ your tasks, it is possible for your tasks to create or mutate environment
+variables after `turbo` has already calculated the hash for a particular task.
For example, consider this `package.json`:
+
+```json
+{
+  "scripts": {
+    "build": "NEXT_PUBLIC_GA_ID=UA-00000000-0 next build"
+  }
+}
+```
+
+`turbo`, having calculated a task hash prior to executing the `build` script, will be unable to discover
+the `NEXT_PUBLIC_GA_ID` environment variable, and thus unable to partition the cache based
+on its value. Be careful to ensure that all of your environment variables are configured prior to
+invoking `turbo`! Environment variables that `turbo` cannot infer can be declared explicitly in the
+`env` key of a task's `pipeline` configuration so that they are included in the task's hash:
+
+```jsonc
+{
+  "$schema": "https://turborepo.org/schema.json",
+  "pipeline": {
+    "build": {
+      "dependsOn": ["^build"],
+      "env": ["SOME_ENV_VAR"],
+      "outputs": [".next/**", "!.next/cache/**"],
+    },
+  }
+}
+```
+
+### Using dotenv
+
+Frameworks commonly use [dotenv][3] to automatically load environment variables when the dev server
+starts or when creating a build. This makes it hard for Turborepo to understand the environment of your task by default:
+
+- dotenv stores environment variables in a _file_ rather than in the environment
+- this file is loaded _after_ Turborepo has already started execution of the task
+- the file is often git-ignored, so Turborepo's default hashing of version-controlled files won't detect changes to it
+
+To ensure you end up with the correct caching behavior for your task, add these `.env` files to the `globalDependencies` key:
+
+```diff
+{
+  "$schema": "https://turborepo.org/schema.json",
++ "globalDependencies": ["**/.env.*local"],
+  "pipeline": {
+    "build": {
+      "dependsOn": ["^build"],
+      "env": ["SOME_ENV_VAR"],
+      "outputs": [".next/**", "!.next/cache/**"],
+    },
+  }
+}
+```
+
+Alternatively, you can add these `.env` files to the `inputs` key of specific tasks.
+
+## Force overwrite cache
+
+Conversely, if you want to disable reading the cache and force `turbo` to re-execute a previously cached task, add the `--force` flag:
+
+```shell
+# Run `build` npm script in all workspaces,
+# ignoring cache hits.
+turbo run build --force
+```
+
+Note that `--force` disables cache reads but does not disable cache writes. If you want to disable cache writes, use the `--no-cache` flag.
+
+## Logs
+
+Not only does `turbo` cache the output of your tasks, it also records the terminal output (i.e. combined `stdout` and `stderr`) to `<package>/.turbo/run-<command>.log`. When `turbo` encounters a cached task, it will replay the output as if it happened again, but instantly, with the package name slightly dimmed.
+
+## Hashing
+
+By now, you're probably wondering how `turbo` decides what constitutes a cache hit vs. miss for a given task. Good question!
+
+First, `turbo` constructs a hash of the current global state of the codebase:
+
+- The contents of any files that satisfy the glob patterns, and the values of any environment variables, listed in [`globalDependencies`](/repo/docs/reference/configuration#globalDependencies)
+- The sorted list of environment variable key-value pairs that include `THASH` _anywhere_ in their names (e.g. `STRIPE_PUBLIC_THASH_SECRET_KEY`, but not `STRIPE_PUBLIC_KEY`)
+
+Then it adds more factors relevant to a given workspace's task:
+
+- The hashed contents of all version-controlled files in the workspace folder (or of the files matching the `inputs` globs, if present)
+- The hashes of all internal dependencies
+- The `outputs` option specified in the [`pipeline`](/repo/docs/reference/configuration#pipeline)
+- The set of resolved versions of all installed `dependencies`, `devDependencies`, and `optionalDependencies` specified in a workspace's `package.json` from the root lockfile
+- The workspace task's name
+- The sorted list of environment variable key-value pairs that correspond to the environment variable names listed in the applicable [`pipeline.<task-or-package-task>.dependsOn`](/repo/docs/reference/configuration#dependson) list
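+
+Several of these factors map directly onto `turbo.json` keys. As a sketch (the globs and variable names here are illustrative, not defaults):
+
+```jsonc filename="turbo.json"
+{
+  "$schema": "https://turborepo.org/schema.json",
+  // global hash inputs: file globs and environment variables
+  "globalDependencies": ["tsconfig.json", "$CI"],
+  "pipeline": {
+    "build": {
+      // workspace-level hash inputs for the `build` task
+      "inputs": ["src/**"],
+      "outputs": ["dist/**"],
+      "dependsOn": ["^build"]
+    }
+  }
+}
+```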
+
+Once `turbo` encounters a given workspace's task in its execution, it checks the cache (both locally and remotely) for a matching hash. If it's a match, it skips executing that task, moves or downloads the cached output into place, and replays the previously recorded logs instantly. If there isn't anything in the cache (either locally or remotely) that matches the calculated hash, `turbo` will execute the task locally and then cache the specified `outputs` using the hash as an index.
+
+The hash of a given task is injected at execution time as an environment variable `TURBO_HASH`. This value can be useful for stamping build outputs or tagging Docker images.
+
+<Callout>
+  As of `turbo` v0.6.10, `turbo`'s hashing algorithm when using `npm` or `pnpm`
+  differs slightly from the above. When using either of these package managers,
+  `turbo` will include the hashed contents of the lockfile in its hash algorithm
+  for each workspace's task. It will _not_ parse and resolve the full set of
+  dependencies the way the current `yarn` implementation does.
+</Callout>
+
+[1]: https://nextjs.org/docs/basic-features/environment-variables#exposing-environment-variables-to-the-browser
+[2]: https://create-react-app.dev/docs/adding-custom-environment-variables/
+[3]: https://github.com/motdotla/dotenv
diff --git a/docs/pages/repo/docs/core-concepts/monorepos.mdx b/docs/pages/repo/docs/core-concepts/monorepos.mdx
new file mode 100644
index 0000000..d245469
--- /dev/null
+++ b/docs/pages/repo/docs/core-concepts/monorepos.mdx
@@ -0,0 +1,23 @@
+# Turborepo in Monorepos
+
+## The problem
+
+![](/images/docs/why-turborepo-problem.png)
+
+Monorepos have many advantages - but **they struggle to scale**. Each workspace has its own test suite, its own linting and its own build process. A single monorepo might have **hundreds of tasks to execute**.
+
+## The solution
+
+![](/images/docs/why-turborepo-solution.png)
+
+**Turborepo solves your monorepo's scaling problem**.
Our remote cache stores the result of all your tasks, meaning that **your CI never needs to do the same work twice**.
+
+Task scheduling can be difficult in a monorepo. Imagine `yarn build` needs to run before `yarn test`, across all your workspaces. Turborepo **can schedule your tasks for maximum speed**, across all available cores.
+
+Turborepo can be **adopted incrementally**. It uses the `package.json` scripts you've already written, the dependencies you've already declared, and a single `turbo.json` file. You can **use it with any package manager**, like `npm`, `yarn` or `pnpm`. You can add it to any monorepo in just a few minutes.
+
+## What Turborepo is not
+
+Turborepo **doesn't handle [package installation](/repo/docs/handbook/package-installation)**. Tools like `npm`, `pnpm` or `yarn` already do that brilliantly. But they run tasks inefficiently, meaning slow CI builds.
+
+We recommend that **Turborepo runs your tasks**, and your favorite package manager installs your packages.
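+
+Getting started can be as small as a single file. A minimal `turbo.json` sketch (the task names are assumptions; reuse the script names already in your `package.json` files):
+
+```jsonc filename="turbo.json"
+{
+  "$schema": "https://turbo.build/schema.json",
+  "pipeline": {
+    "build": {
+      "dependsOn": ["^build"],
+      "outputs": ["dist/**"]
+    },
+    "lint": {},
+    "test": {}
+  }
+}
+```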
diff --git a/docs/pages/repo/docs/core-concepts/monorepos/_meta.json b/docs/pages/repo/docs/core-concepts/monorepos/_meta.json
new file mode 100644
index 0000000..d66bf5d
--- /dev/null
+++ b/docs/pages/repo/docs/core-concepts/monorepos/_meta.json
@@ -0,0 +1,5 @@
+{
+  "running-tasks": "Running Tasks",
+  "filtering": "Filtering Workspaces",
+  "skipping-tasks": "Skipping Tasks in CI"
+}
diff --git a/docs/pages/repo/docs/core-concepts/monorepos/configuring-workspaces.mdx b/docs/pages/repo/docs/core-concepts/monorepos/configuring-workspaces.mdx
new file mode 100644
index 0000000..f205e64
--- /dev/null
+++ b/docs/pages/repo/docs/core-concepts/monorepos/configuring-workspaces.mdx
@@ -0,0 +1,267 @@
+---
+title: Configuring Workspaces
+description: Turborepo lets you configure each workspace
+---
+
+import Callout from "../../../../../components/Callout";
+
+# Configuring Workspaces
+
+Most monorepos can declare a `turbo.json` in the root directory with a uniform
+[pipeline][2] that applies to all workspaces. Sometimes, a monorepo can contain
+workspaces that need to configure their tasks differently. To accommodate this,
+starting in version 1.8, Turborepo enables you to extend the root configuration
+with a `turbo.json` in any workspace. This flexibility enables a more diverse
+set of apps and packages to co-exist, and allows workspace owners to maintain
+specialized tasks and configuration without affecting other apps and packages of
+the monorepo.
+
+## How it Works
+
+To override the configuration for any task defined in the root `turbo.json`, add
+a `turbo.json` file in any workspace of your monorepo with a top-level `extends`
+key:
+
+```jsonc filename="apps/my-app/turbo.json"
+{
+  "extends": ["//"],
+  "pipeline": {
+    "build": {
+      // custom configuration for the build task in this workspace
+    },
+    // new tasks only available in this workspace
+    "special-task": {},
+  }
+}
+```
+
+<Callout>
+  For now, the only valid value for the `extends` key is `["//"]`.
+`//` is a special name used to identify the root directory of the monorepo. +</Callout> + +Configuration in a workspace can override any of [the configurations for a +pipeline task][2]. If you don't include a key, the configuration is inherited +from the extended `turbo.json`. + +## Examples + +To illustrate, let's look at some use cases. + +### Different Frameworks + +Let's say your monorepo has multiple [Next.js][5] apps, and one [SvelteKit][6] +app. Both frameworks create their build output with a `build` script in their +respective `package.json`s. You _could_ configure Turborepo to run these tasks +with a single `turbo.json` at the root like this: + +```jsonc filename="turbo.json" +{ + "pipeline": { + "build": { + "outputs": [".next/**", "!.next/cache/**", ".svelte-kit/**"], + } + } +} +``` + +Notice that both `.next/**` and `.svelte-kit/**` need to be specified as +[`outputs`][7], even though Next.js apps do not generate a `.svelte-kit` directory, and +vice versa. With Workspace Configurations, you can instead add custom +configuration in the SvelteKit workspace in `apps/my-svelte-kit-app/turbo.json`: + +```jsonc filename="apps/my-svelte-kit-app/turbo.json" +{ + "extends": ["//"], + "pipeline": { + "build": { + "outputs": [".svelte-kit/**"] + } + } +} +``` + +and remove the config from the root configuration: + +```diff filename="turbo.json" +{ + "pipeline": { + "build": { +- "outputs": [".next/**", "!.next/cache/**", ".svelte-kit/**"] ++ "outputs": [".next/**", "!.next/cache/**"] + } + } +} +``` + +This not only makes each configuration easier to read, it puts the configuration +closer to where it is used. + +### Specialized Tasks + +In another example, say that the `build` task in one workspace `dependsOn` a +`compile` task. You could universally declare it as `dependsOn: ["compile"]`. 
+This means that your root `turbo.json` has to have an empty `compile` task
+entry:
+
+```json filename="turbo.json"
+{
+  "pipeline": {
+    "build": {
+      "dependsOn": ["compile"]
+    },
+    "compile": {}
+  }
+}
+```
+
+With Workspace Configurations, you can move that `compile` task into
+`apps/my-app/turbo.json`,
+
+```json filename="apps/my-app/turbo.json"
+{
+  "extends": ["//"],
+  "pipeline": {
+    "build": {
+      "dependsOn": ["compile"]
+    },
+    "compile": {}
+  }
+}
+```
+
+and remove it from the root:
+
+```diff filename="turbo.json"
+{
+  "pipeline": {
++    "build": {}
+-    "build": {
+-      "dependsOn": ["compile"]
+-    },
+-    "compile": {}
+  }
+}
+```
+
+Now, the owners of `my-app` can have full ownership over their `build` task,
+but continue to inherit any other tasks defined at the root.
+
+## Comparison to Workspace-specific tasks
+
+At first glance, Workspace Configurations may sound a lot like the
+[`workspace#task` syntax][3] in the root `turbo.json`. The features are
+similar, but have one significant difference: when you declare a Workspace-specific
+task in the root `turbo.json`, it _completely_ overwrites the baseline task
+configuration. With a Workspace Configuration, the task configuration is merged
+instead.
+
+Consider the example of the monorepo with multiple Next.js apps and a SvelteKit
+app again. Without a Workspace-specific task, you might configure your root
+`turbo.json` like this:
+
+```jsonc filename="turbo.json"
+{
+  "pipeline": {
+    "build": {
+      "outputMode": "hash-only",
+      "inputs": ["src/**"],
+      "outputs": [".next/**", "!.next/cache/**"],
+    },
+    "my-sveltekit-app#build": {
+      "outputMode": "hash-only", // must duplicate this
+      "inputs": ["src/**"], // must duplicate this
+      "outputs": [".svelte-kit/**"]
+    }
+  }
+}
+```
+
+In this example, `my-sveltekit-app#build` completely overwrites `build` for the
+SvelteKit app, so `outputMode` and `inputs` also need to be duplicated.
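+
+For comparison, a Workspace Configuration for the same (hypothetical) app only needs the key that actually differs:
+
+```jsonc filename="apps/my-sveltekit-app/turbo.json"
+{
+  "extends": ["//"],
+  "pipeline": {
+    "build": {
+      // "outputMode" and "inputs" are inherited from the root
+      "outputs": [".svelte-kit/**"]
+    }
+  }
+}
+```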
+
+With Workspace Configurations, `outputMode` and `inputs` are inherited, so
+you don't need to duplicate them. You only need to override `outputs` in the
+`my-sveltekit-app` config.
+
+<Callout type="info">
+  Although there are no plans to remove Workspace-specific task configurations,
+  we expect that Workspace Configurations can be used for most use cases instead.
+</Callout>
+
+## Limitations
+
+Although the general idea is the same as the root `turbo.json`, Workspace
+Configurations come with a set of guardrails that can prevent workspaces from creating
+confusing situations. These guardrails are listed here to make it clear that
+they are intentional, rather than accidental:
+
+- Workspace Configurations cannot use [the `workspace#task` syntax][3] as pipeline entries
+
+  The `workspace` is inferred based on the _location_ of the config, and it is
+  not possible to change configuration for another workspace. For example, in a
+  Workspace Configuration for 'my-nextjs-app':
+
+  ```jsonc filename="apps/my-nextjs-app/turbo.json"
+  {
+    "pipeline": {
+      "my-nextjs-app#build": {
+        // ❌ This is not allowed. Even though it's
+        // referencing the correct workspace, "my-nextjs-app"
+        // is inferred, and we don't need to specify it again.
+        // This syntax also has different behavior, so we do not want to allow it.
+        // (see "Comparison to Workspace-specific tasks" section)
+      },
+      "my-sveltekit-app#build": {
+        // ❌ Changing configuration for the "my-sveltekit-app" workspace
+        // from a Workspace Configuration in "my-nextjs-app" is not allowed.
+      },
+      "build": {
+        // βœ… just use the task name!
+      },
+    }
+  }
+  ```
+
+  Note that the `build` task can still depend on a Workspace-specific task:
+
+  ```jsonc filename="apps/my-nextjs-app/turbo.json"
+  {
+    "pipeline": {
+      "build": {
+        // βœ… It's still ok to have workspace#task in dependsOn!
+        "dependsOn": ["some-pkg#compile"]
+      },
+    }
+  }
+  ```
+
+- Workspace Configurations cannot override anything outside the `pipeline` key.
+
+  For example, it is not possible to override `globalEnv` or
+  `globalDependencies`. We expect that monorepo owners should control this
+  absolutely, and if this config is not _truly_ global, it should not be
+  configured that way.
+
+- The root `turbo.json` cannot use the `extends` key.
+
+  To avoid creating circular dependencies on workspaces, the root `turbo.json`
+  cannot extend from anything.
+
+If you have a use case for any of these, please [file an issue][4]!
+
+## Troubleshooting
+
+In large monorepos, it can sometimes be difficult to understand how Turborepo is
+interpreting your configuration. To help, we've added a `resolvedTaskDefinition`
+to the Dry Run output. If you run `turbo run build --dry-run`, for example, the
+output will include the combination of all `turbo.json` configurations that were
+considered before running the `build` task.
+
+[1]: /repo/docs/core-concepts/monorepos/running-tasks#running-tasks-from-the-root
+[2]: /repo/docs/reference/configuration#pipeline
+[3]: /repo/docs/core-concepts/monorepos/running-tasks#specific-workspace-tasks
+[4]: https://github.com/vercel/turbo/issues/new/choose
+[5]: https://nextjs.org
+[6]: https://kit.svelte.dev/
+[7]: /repo/docs/reference/configuration#outputs
diff --git a/docs/pages/repo/docs/core-concepts/monorepos/filtering.mdx b/docs/pages/repo/docs/core-concepts/monorepos/filtering.mdx
new file mode 100644
index 0000000..f88d9c8
--- /dev/null
+++ b/docs/pages/repo/docs/core-concepts/monorepos/filtering.mdx
@@ -0,0 +1,208 @@
+---
+title: Filtering Workspaces
+description: Only build the workspaces you care about.
+---
+
+import Callout from "../../../../../components/Callout";
+import HeartIcon from "@heroicons/react/solid/HeartIcon";
+
+# Filtering Workspaces
+
+A monorepo can contain hundreds or thousands of workspaces. By default, running `turbo run test` from the root of the repository will execute the `test` task in **all available workspaces**.
+
+![Without using a filter, test will be run across all packages](/images/docs/no-filter.png)
+
+Turborepo supports a `--filter` flag that lets you **select the workspaces you'd like to execute your task in**.
+
+![With a filter on shared, only the shared package runs test](/images/docs/with-filter.png)
+
+You can use it to:
+
+- Filter by [workspace name](#filter-by-workspace-name)
+- Filter by [workspace directory](#filter-by-directory)
+- Include [dependents](#include-dependents-of-matched-workspaces) and [dependencies](#include-dependencies-of-matched-workspaces) of matched workspaces
+- Execute tasks from the [workspace root](#the-workspace-root)
+- Filter by [changes in git history](#filter-by-changed-workspaces)
+- [Exclude workspaces](#excluding-workspaces) from selection
+
+Turborepo will run each task against each matched workspace, ensuring that any tasks it depends on are run first, according to the `pipeline` specification in [`turbo.json`](/repo/docs/reference/configuration#pipeline).
+
+## Filter Syntax
+
+### Multiple filters
+
+You can specify more than one filter by passing multiple `--filter` flags to the command:
+
+```sh
+turbo build --filter=my-pkg --filter=my-app
+```
+
+### Filter by workspace name
+
+When you want to run a script in only one workspace, you can use a single filter: `--filter=my-pkg`.
+
+```sh
+# Build 'my-pkg', letting `turbo` infer task dependencies
+# from the pipeline defined in turbo.json
+turbo run build --filter=my-pkg
+
+# Build '@acme/bar', letting `turbo` infer task dependencies
+# from the pipeline defined in turbo.json
+turbo run build --filter=@acme/bar
+```
+
+If you want to run tasks inside several workspaces with similar names, you can use glob syntax: `--filter=*my-pkg*`.
+
+```sh
+# Build all workspaces that start with 'admin-', letting turbo infer task
+# dependencies from the pipeline defined in turbo.json
+turbo run build --filter=admin-*
+```
+
+#### Scopes
+
+Some monorepos prefix their workspace names with a scope, such as `@acme/ui` and `@acme/app`. As long as the scope (`@acme`) is unique across the codebase, you may omit it from filters.
+
+```diff
+- turbo run build --filter=@acme/ui
++ turbo run build --filter=ui
+```
+
+### Include dependents of matched workspaces
+
+Sometimes, you'll want to ensure that your shared package isn't affecting any downstream dependents. For that, you can use `--filter=...my-lib`.
+
+If `my-app` depends on `my-lib`, `...my-lib` will select `my-app` and `my-lib`.
+
+Including a `^` (`...^my-lib`) will select all of `my-lib`'s dependents, but not `my-lib` itself.
+
+```sh
+# Test 'my-lib' and everything that depends on 'my-lib'
+turbo run test --filter=...my-lib
+
+# Test everything that depends on 'my-lib', but not 'my-lib' itself
+turbo run test --filter=...^my-lib
+```
+
+### Include dependencies of matched workspaces
+
+Sometimes, you'll want to make sure that `build` is run in all of the dependencies of the lib you're targeting. For that, you can use `--filter=my-app...`.
+
+If `my-app` depends on `my-lib`, `my-app...` will select `my-app` and `my-lib`.
+
+Including a `^` (`my-app^...`) will select all of `my-app`'s dependencies, but not `my-app` itself.
+
+```sh
+# Build 'my-app' and its dependencies
+turbo run build --filter=my-app...
+
+# Build 'my-app's dependencies, but not 'my-app' itself
+turbo run build --filter=my-app^...
+```
+
+### Filter by directory
+
+Useful when you want to target a specific directory, not a workspace name.
It supports:
+
+- Exact matches: `--filter=./apps/docs`
+- Globs: `--filter='./apps/*'`
+
+```sh
+# Build all of the workspaces in the 'apps' directory
+turbo run build --filter='./apps/*'
+```
+
+#### Combining with other syntaxes
+
+When combining directory filters with other syntaxes, enclose the directory in `{}`. For example:
+
+```sh
+# Build all of the workspaces in the 'libs' directory,
+# and all the workspaces that depend on them
+turbo run build --filter=...{./libs/*}
+```
+
+### Filter by changed workspaces
+
+You can run tasks on any workspaces which have changed since a certain commit. These need to be wrapped in `[]`.
+
+For example, `--filter=[HEAD^1]` will select all workspaces that have changed in the most recent commit:
+
+```sh
+# Test everything that changed in the last commit
+turbo run test --filter=[HEAD^1]
+```
+
+#### Check a range of commits
+
+If you need to check a specific range of commits, rather than comparing to `HEAD`, you can set both ends of the comparison via `[<from commit>...<to commit>]`.
+
+```sh
+# Test each workspace that changed between 'main' and 'my-feature'
+turbo run test --filter=[main...my-feature]
+```
+
+#### Ignoring changed files
+
+You can use [`--ignore`](/repo/docs/reference/command-line-reference#--ignore) to specify changed files to be ignored in the calculation of which workspaces have changed.
+
+#### Combining with other syntaxes
+
+You can additionally prepend the commit reference with `...` to match other workspaces' dependencies
+against the changed workspaces. For instance, to select `foo` if any of `foo`'s dependencies have changed in the last commit,
+you can pass `--filter=foo...[HEAD^1]`.
+
+```sh
+# Build everything that depends on changes in branch 'my-feature'
+turbo run build --filter=...[origin/my-feature]
+
+# Build '@foo/bar' if it or any of its dependencies
+# changed in the last commit
+turbo run build --filter=@foo/bar...[HEAD^1]
+```
+
+You can even combine the `[]` and `{}` syntaxes:
+
+```sh
+# Test each workspace in the '@scope' scope that
+# is in the 'packages' directory, if it has
+# changed in the last commit
+turbo run test --filter=@scope/*{./packages/*}[HEAD^1]
+```
+
+### The workspace root
+
+The monorepo's root can be selected using the token `//`.
+
+```sh
+# Run the format script from the root "package.json" file:
+turbo run format --filter=//
+```
+
+### Excluding workspaces
+
+Prepend `!` to the filter. Matched workspaces from the entire filter will be excluded from the set of targets.
+For example, match everything except `@foo/bar`: `--filter=!@foo/bar`. Note that you may need to escape `!` as appropriate for your shell (e.g. `\!`).
+
+```sh
+# Build everything except '@foo/bar'
+turbo run build --filter=!@foo/bar
+# Build all of the workspaces in the 'apps' directory, except the 'admin' workspace
+turbo run build --filter=./apps/* --filter=!admin
+```
+
+### Via global `turbo`
+
+If you are using a globally installed version of `turbo`, running from within a workspace automatically
+filters to that workspace's directory. That means running `turbo run test --filter={./packages/shared}` from the root of the repository is equivalent to
+running `cd packages/shared && turbo run test`.
+
+Running with an explicitly named workspace will always work from anywhere in the repository: `turbo run test --filter=shared`.
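+
+These pieces compose. For example (the workspace and branch names here are hypothetical):
+
+```sh
+# Build everything under './apps' that changed since 'main',
+# plus anything that depends on those workspaces,
+# but leave out the 'admin' workspace
+turbo run build --filter=...{./apps/*}[main...HEAD] --filter=!admin
+```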
+
+<Callout
+  type="idea"
+  icon={<HeartIcon className="mt-1 h-5 w-5 text-gray-400" aria-hidden="true" />}
+>
+  Turborepo's Filter API design and docs were inspired by
+  [pnpm](https://pnpm.io/filtering).
+</Callout>
diff --git a/docs/pages/repo/docs/core-concepts/monorepos/running-tasks.mdx b/docs/pages/repo/docs/core-concepts/monorepos/running-tasks.mdx
new file mode 100644
index 0000000..6225b41
--- /dev/null
+++ b/docs/pages/repo/docs/core-concepts/monorepos/running-tasks.mdx
@@ -0,0 +1,315 @@
+---
+title: Running Tasks
+description: Turborepo helps you specify task dependencies declaratively.
+---
+
+import Callout from "../../../../../components/Callout";
+import HeartIcon from "@heroicons/react/solid/HeartIcon";
+
+# Running Tasks in a Monorepo
+
+Every monorepo has two main building blocks: **workspaces** and **tasks**. Let's imagine you have a monorepo containing **three workspaces**, each with **three tasks**:
+
+![](/images/docs/your-monorepo-excalidraw.png)
+
+Here, both `apps/web` and `apps/docs` use code from `packages/shared`. In fact, when they're built (via `build`) **they need `packages/shared` to be built _first_**.
+
+## Most tools don't optimize for speed
+
+Let's imagine we want to run all our tasks across all our workspaces. In a tool like `yarn`, you might run a script like this:
+
+```
+yarn workspaces run lint
+yarn workspaces run test
+yarn workspaces run build
+```
+
+This would mean the tasks run like this:
+
+![](/images/docs/yarn-workspaces-excalidraw.png)
+
+As you can see, `lint` gets run in all the workspaces. Then, `build` gets run - with `shared` going first. Finally, `test` gets run.
+
+**This is the slowest possible way to run these tasks**. Each task needs to wait for the previous one to finish before it can start. To improve on this, we'll need a tool that can multitask.
+
+## Turborepo can multitask
+
+Turborepo can schedule our tasks for maximum speed by understanding the dependencies between our tasks.
+
+First, we declare our tasks inside `turbo.json`:
+
+```jsonc
+{
+  "$schema": "https://turbo.build/schema.json",
+  "pipeline": {
+    "build": {
+      "outputs": [".next/**", "!.next/cache/**", ".svelte-kit/**"],
+      // ^build means build must be run in dependencies
+      // before it can be run in this workspace
+      "dependsOn": ["^build"]
+    },
+    "test": {},
+    "lint": {}
+  }
+}
+```
+
+Next, we can replace our `yarn workspaces` script with this:
+
+```diff
+- yarn workspaces run lint
+- yarn workspaces run test
+- yarn workspaces run build
++ turbo run lint test build
+```
+
+When we run it, Turborepo will **multitask as many tasks as possible over all available CPUs**, meaning our tasks run like this:
+
+![](/images/docs/turborepo-excalidraw.png)
+
+Both `lint` and `test` run immediately, because they have no `dependsOn` specified in `turbo.json`.
+
+The `build` task inside `shared` completes first, then `web` and `docs` build afterwards.
+
+## Defining a pipeline
+
+The `pipeline` configuration declares which tasks depend on each other in your monorepo. Here's a kitchen sink example:
+
+```jsonc filename="turbo.json"
+{
+  "$schema": "https://turbo.build/schema.json",
+  "pipeline": {
+    "build": {
+      // A workspace's `build` task depends on that workspace's
+      // topological dependencies' and devDependencies'
+      // `build` tasks being completed first. The `^` symbol
+      // indicates an upstream dependency.
+      "dependsOn": ["^build"],
+      "outputs": [".next/**", "!.next/cache/**", ".svelte-kit/**"]
+    },
+    "test": {
+      // A workspace's `test` task depends on that workspace's
+      // own `build` task being completed first.
+      "dependsOn": ["build"],
+      // A workspace's `test` task should only be rerun when
+      // either a `.tsx` or `.ts` file has changed.
+      "inputs": ["src/**/*.tsx", "src/**/*.ts", "test/**/*.ts", "test/**/*.tsx"]
+    },
+    // A workspace's `lint` task has no dependencies and
+    // can be run whenever.
+    "lint": {},
+    "deploy": {
+      // A workspace's `deploy` task depends on the `build`,
+      // `test`, and `lint` tasks of the same workspace
+      // being completed.
+      "dependsOn": ["build", "test", "lint"]
+    }
+  }
+}
+```
+
+Let's walk through some common patterns you'll want to get to know before diving into `turbo.json`.
+
+### Dependencies between tasks
+
+#### In the same workspace
+
+There might be tasks which need to run _before_ other tasks. For instance, `build` might need to be run before `deploy`.
+
+If both tasks are in the same workspace, you can specify the relationship like this:
+
+```jsonc filename="turbo.json"
+{
+  "$schema": "https://turbo.build/schema.json",
+  "pipeline": {
+    "build": {
+      "dependsOn": ["^build"],
+      "outputs": [".next/**", "!.next/cache/**", ".svelte-kit/**"]
+    },
+    "deploy": {
+      // A workspace's `deploy` task depends on the `build`
+      // task of the same workspace being completed.
+      "dependsOn": ["build"]
+    }
+  }
+}
+```
+
+This means that whenever `turbo run deploy` is run, `build` will also be run inside the same workspace.
+
+#### In a different workspace
+
+A common pattern in monorepos is to declare that a workspace's `build` task should only run once the `build` tasks of all _the workspaces it depends on_ are complete.
+
+The `^` symbol explicitly declares that the task has a dependency on a task in a workspace it depends on.
+
+```jsonc filename="turbo.json"
+{
+  "$schema": "https://turbo.build/schema.json",
+  "pipeline": {
+    "build": {
+      // "A workspace's `build` command depends on its dependencies'
+      // and devDependencies' `build` commands being completed first"
+      "dependsOn": ["^build"],
+      "outputs": [".next/**", "!.next/cache/**", ".svelte-kit/**"]
+    }
+  }
+}
+```
+
+#### No dependencies
+
+An empty dependency list (`dependsOn` is either undefined or `[]`) means that nothing needs to run before this task! After all, it has NO dependencies.
+ +```jsonc filename="turbo.json" +{ + "$schema": "https://turbo.build/schema.json", + "pipeline": { + // A workspace's `lint` command has no dependencies and can be run + // whenever. + "lint": {} + } +} +``` + +#### Specific workspace-tasks + +Sometimes, you may want to create a workspace-task dependency on another workspace-task. This can be especially helpful for repos migrating from `lerna` or `rush`, where tasks are run in separate phases by default. Sometimes these configurations make assumptions that cannot be expressed in a simple `pipeline` configuration, as seen above. Or you may just want to express sequences of tasks between applications or microservices when using `turbo` in CI/CD. + +For these cases, you can express these relationships in your `pipeline` configuration using the `<workspace>#<task>` syntax. +The example below describes the `deploy` script of a `frontend` application that depends on the `deploy` and `health-check` scripts of `backend`, as well as the `test` script of a `ui` workspace: + +```jsonc filename="turbo.json" +{ + "$schema": "https://turbo.build/schema.json", + "pipeline": { + // Standard configuration + "build": { + "dependsOn": ["^build"], + "outputs": [".next/**", "!.next/cache/**", ".svelte-kit/**"] + }, + "test": { + "dependsOn": ["^build"] + }, + "deploy": { + "dependsOn": ["test", "build"] + }, + + // Explicit workspace-task to workspace-task dependency + "frontend#deploy": { + "dependsOn": ["ui#test", "backend#deploy", "backend#health-check"] + } + } +} +``` + +This explicit configuration for `frontend#deploy` may seem to conflict with the `test` and `deploy` task configurations, but it does not. Since `test` and `deploy` do not have dependencies on other workspaces (e.g. `^<task>`), they can execute any time after their workspace's `build` and `test` scripts have finished. + +<Callout> + Notes: + +1. 
Although this `<workspace>#<task>` syntax is a useful escape hatch, we generally recommend using it for deployment orchestration tasks such as health checks, rather than build-time dependencies, so that Turborepo can optimize these tasks more efficiently +1. Package-tasks do not inherit cache configuration. You must redeclare + [`outputs`](/repo/docs/reference/configuration#outputs) at the moment. +1. `<workspace>` must match the `name` key in the workspace's `package.json` or the task will be ignored. + +</Callout> + +### Running tasks from the root + +`turbo` can run tasks that exist in the `package.json` file at the root of the monorepo. +These must be explicitly added to the pipeline configuration using the key syntax `"//#<task>"`. This is +true even for tasks that already have their own entry. For example, if your pipeline declares a `"build"` task, +and you want to include the `build` script defined in the monorepo's root `package.json` file with +`turbo run build`, you must opt the root into it by declaring `"//#build": {...}` in your configuration. +Conversely, you _do not_ need to define a generic `"my-task": {...}` entry if all you need is `"//#my-task": {...}`. + +A sample pipeline that defines the root task `format` and opts the root into `test` might look like: + +```jsonc filename="turbo.json" +{ + "$schema": "https://turbo.build/schema.json", + "pipeline": { + "build": { + "dependsOn": ["^build"], + "outputs": [".next/**", "!.next/cache/**", ".svelte-kit/**"] + }, + "test": { + "dependsOn": ["^build"], + }, + // This will cause the "test" script to be included when + // "turbo run test" is run + "//#test": { + "dependsOn": [], + }, + // This will cause the "format" script in the root package.json + // to be run when "turbo run format" is run. Since the general + // "format" task is not defined, only the root's "format" script + // will be run. 
+    "//#format": {
+      "dependsOn": [],
+      "outputs": ["dist/**/*"],
+      "inputs": ["version.txt"]
+    }
+  }
+}
+```
+
+**A note on recursion**: Scripts defined in the monorepo's root `package.json` often call `turbo` themselves.
+For example, the `build` script might be `turbo run build`. In this situation, including `//#build` in
+`turbo run build` will cause infinite recursion. It is for this reason that tasks run from the monorepo's root must
+be explicitly opted in by including `//#<task>` in the pipeline configuration. `turbo` includes
+some best-effort checking to produce an error in recursive situations, but it is up to you to only
+opt in those tasks which don't themselves trigger a `turbo` run that would recurse.
+
+### Dependencies outside of a task
+
+When your task has topological dependencies that are outside of that given task, you'll still want to enjoy the parallelism of Turborepo and ensure that your caching behavior correctly reflects your code changes.
+
+To demonstrate how to do this, let's say you have a set of workspaces to do a little bit of math: `add`, `subtract`, and `multiply`. `subtract` is implemented by calling `add` with a negative number, and `multiply` works by calling `add` in a loop. So, `add` is a dependency of both `subtract` and `multiply`.
+
+You've written tests in all three of these workspaces and it's time to run them. There are two requirements here:
+
+1. All tests run in parallel to keep things speedy
+2. A change in a dependency should result in a cache miss
+
+To accomplish this, we can set up a pipeline like so:
+
+```jsonc filename="turbo.json"
+{
+  "$schema": "https://turbo.build/schema.json",
+  "pipeline": {
+    "topo": {
+      "dependsOn": ["^topo"]
+    },
+    "test": {
+      "dependsOn": ["^topo"]
+    }
+  }
+}
+```
+
+![](/images/docs/task-graph-with-placeholder-task.png)
+
+In this pipeline, we create an intermediary placeholder `topo` task.
Since we don't have a `topo` command in our workspaces, the pipeline will go straight to running `test` scripts in parallel, meeting our first requirement. The second requirement will also be taken care of, falling back on Turborepo's default behavior of creating hashes for a workspace task and its dependencies as a tree.
+
+<Callout
+  type="idea"
+  icon={<HeartIcon className="h-5 w-5 mt-1 text-gray-400" aria-hidden="true" />}
+>
+  Turborepo's Pipeline API design and this page of documentation were inspired by
+  [Microsoft's Lage
+  project](https://microsoft.github.io/lage/docs/Tutorial/pipeline#defining-a-pipeline).
+  Shoutout to [Kenneth Chau](https://twitter.com/kenneth_chau) for the idea of
+  fanning out tasks in such a concise and elegant way.
+</Callout>
+
+### Tips
+
+#### Tasks that are in the `pipeline` but not in SOME `package.json`
+
+Sometimes tasks declared in the `pipeline` are not present in all workspaces' `package.json` files. `turbo` will gracefully ignore those. No problem!
+
+#### `pipeline` tasks are the only ones that `turbo` knows about
+
+`turbo` will only account for tasks declared in the `pipeline` configuration. If a task isn't listed there, `turbo` will not know how to run it.
diff --git a/docs/pages/repo/docs/core-concepts/monorepos/skipping-tasks.mdx b/docs/pages/repo/docs/core-concepts/monorepos/skipping-tasks.mdx
new file mode 100644
index 0000000..5096d21
--- /dev/null
+++ b/docs/pages/repo/docs/core-concepts/monorepos/skipping-tasks.mdx
@@ -0,0 +1,66 @@
+---
+title: Skipping tasks
+description: Never do the same work twice.
+---
+
+import Callout from '../../../../../components/Callout'
+
+# Skipping Tasks
+
+[Build caching](/repo/docs/core-concepts/caching) can dramatically speed up your tasks - but you can do even better by using `npx turbo-ignore`. If your workspace is unaffected by your code changes, you can skip executing a task altogether.
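In a CI script, this usually ends up as an early exit-code check before the expensive steps. Here is a minimal sketch of that gate; `turbo_ignore` is a stand-in stub for the real `npx turbo-ignore web --task=test` call (which we can't run here), since the gate only needs its exit code:

```shell
# Sketch of a CI gate around turbo-ignore.
# `turbo_ignore` is a stand-in for `npx turbo-ignore web --task=test`,
# which exits 0 when the task can safely be skipped and 1 when relevant
# changes were found.
turbo_ignore() { return 1; }   # pretend this commit touched `web`

if turbo_ignore; then
  echo "no relevant changes - skipping tests"
else
  echo "changes detected - running tests"
  # npx turbo run test --filter=web
fi
```

In a real pipeline you would replace the stub with the actual `npx turbo-ignore` invocation and put the task run (or the rest of the job) in the `else` branch.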
+
+Let's say you want to skip the unit tests for your `web` workspace when there aren't any changes to your `web` application (or its package dependencies). If you are already using [Remote Caching](https://turbo.build/repo/docs/core-concepts/remote-caching), you will probably get a cache hit, but you would still spend time provisioning the CI container, installing `npm` dependencies, and doing other setup work that can take a while.
+
+Ideally, we would do a quick check to see if any of that work needs to happen in the first place.
+
+After we've checked out the repo, but **before** any other work, we can take a few seconds to check whether anything affecting our `web` tests has changed since the parent commit.
+
+```bash
+npx turbo-ignore web --task=test
+```
+
+This command will:
+
+1. Filter for the `web` workspace.
+2. Generate the dry-run output for your `test` task compared to your parent commit.
+3. Parse the output to determine which packages have changed.
+4. Exit with a `1` code if changes are detected. Otherwise, exit with a `0`.
+
+While you may have been able to hit a `>>> FULL TURBO` cache for this task, we just saved the time of all the other setup steps required to run your CI.
+
+## Using `turbo-ignore`
+
+To skip unaffected work, first ensure that your git history is available on the machine. Then, run `npx turbo-ignore`.
+
+`turbo-ignore` uses a combination of the `--filter` and `--dry=json` flags to find changes from the parent commit to the current commit and identify affected packages. By default, `turbo-ignore` finds the difference for the **build task in the current working directory**, but you can [customize this behavior with flags](#customizing-behavior).
+
+Here's an example of the command that will be built and run:
+
+```bash
+npx turbo run build --filter=@example/web...3c8387ffd98b751305fe3f0284befdd00cbd4610 --dry=json
+```
+
+Note that a dry run does not _execute_ the build task.
Instead, it checks your packages to see if your code changes will affect your build (or other task) in only a few seconds. + +If `turbo-ignore` finds that the task can be skipped, it will exit the process with a `0` code. If changes have been found, the process will exit with `1`. + +<Callout type="info">On Vercel, the previously deployed SHA will be used instead of the parent commit.</Callout> + +## Customizing behavior + +To specify a workspace, you can add it to your command like: + +``` +npx turbo-ignore web +``` + +where `web` is your workspace's name. + +If you'd like to change more of the default behavior, there are a few flags available: + +- `--task`: Specifies the task for the command that `turbo-ignore` will invoke. Defaults to `build`. +- `--fallback`: Specify a ref/head to compare against. Defaults to `HEAD^`. + +## Using `turbo-ignore` on Vercel + +To use `npx turbo-ignore` on Vercel, you can use the [Ignored Build Step](https://vercel.com/docs/concepts/projects/overview#ignored-build-step) feature. Vercel will automatically infer the correct arguments to successfully run `turbo-ignore`. diff --git a/docs/pages/repo/docs/core-concepts/remote-caching.mdx b/docs/pages/repo/docs/core-concepts/remote-caching.mdx new file mode 100644 index 0000000..6f7c542 --- /dev/null +++ b/docs/pages/repo/docs/core-concepts/remote-caching.mdx @@ -0,0 +1,112 @@ +--- +title: Remote Caching +description: Share cache artifacts across machines for even faster builds. +--- + +import Callout from "../../../../components/Callout"; +import { Tabs, Tab } from "../../../../components/Tabs"; + +# Remote Caching + +Turborepo's [task cache](/repo/docs/core-concepts/caching) can save a lot of time by never doing the same work twice. + +But there's an issue - the **cache is local to your machine**. 
When you're working with a CI, this can result in a lot of duplicated work: + +![](/images/docs/local-caching.png) + +Since Turborepo only caches to the local filesystem by default, the same task (`turbo run build`) must be **re-executed on each machine** (by you, by your teammates, by your CI, by your PaaS, etc.) even when all of the task inputs are identical β€” which **wastes time and resources**. + +## A single, shared cache + +What if you could share a single Turborepo cache across your entire team (and even your CI)? + +![](/images/docs/remote-caching.png) + +By working with providers like [Vercel](#vercel), Turborepo can securely communicate with a remote cache - a cloud server that stores the results of your tasks. + +This can save enormous amounts of time by **preventing duplicated work across your entire organization**. + +<Callout> + Remote Caching is a powerful feature of Turborepo, but with great power comes + great responsibility. Make sure you are caching correctly first and double + check handling of environment variables. Please also remember Turborepo treats + logs as artifacts, so be aware of what you are printing to the console. +</Callout> + +## Vercel + +### For Local Development + +If you want to link your local turborepo to your Remote Cache, first authenticate the Turborepo CLI with your Vercel account: + +```sh +turbo login +``` + +<Callout> + If your Remote Cache is configured to use single-sign-on you will need to run + `npx turbo login --sso-team=TEAMNAME` in order to get a cache token with the + correct privileges. +</Callout> + +Next, link your Turborepo to your remote cache: + +```sh +turbo link +``` + +Once enabled, make some changes to a workspace you are currently caching and run tasks against it with `turbo run`. +Your cache artifacts will now be stored locally _and_ in your Remote Cache. 
+ +To verify, delete your local Turborepo cache with: + +<Tabs items={["unix", "win"]} storageKey="selected-os"> + <Tab> + ```sh + rm -rf ./node_modules/.cache/turbo + ``` + </Tab> + <Tab> + ```sh + rd /s /q "./node_modules/.cache/turbo" + ``` + </Tab> +</Tabs> + +Then run the same build again. If things are working properly, `turbo` should not execute tasks locally, but rather download both the logs and artifacts from your Remote Cache and replay them back to you. + +### Remote Caching on Vercel Builds + +If you are building and hosting your apps on Vercel, Remote Caching will be automatically set up on your behalf once you use `turbo`. You need to update your [build settings](https://vercel.com/docs/concepts/deployments/configure-a-build) to build with `turbo`. + +Please refer to the [Vercel documentation](https://vercel.com/docs/concepts/git/monorepos#turborepo?utm_source=turbo.build&utm_medium=referral&utm_campaign=docs-link) for instructions. + +### Artifact Integrity and Authenticity Verification + +You can enable Turborepo to sign artifacts with a secret key before uploading them to the Remote Cache. Turborepo uses `HMAC-SHA256` signatures on artifacts using a secret key you provide. +Turborepo will verify the remote cache artifacts' integrity and authenticity when they're downloaded. +Any artifacts that fail to verify will be ignored and treated as a cache miss by Turborepo. + +To enable this feature, set the `remoteCache` options on your `turbo.json` config to include `signature: true`. Then specify your secret key by declaring the `TURBO_REMOTE_CACHE_SIGNATURE_KEY` environment variable. + +```jsonc +{ + "$schema": "https://turbo.build/schema.json", + "remoteCache": { + // Indicates if signature verification is enabled. + "signature": true + } +} +``` + +## Custom Remote Caches + +You can self-host your own Remote Cache or use other remote caching service providers as long as they comply with Turborepo's Remote Caching Server API. 
+
+You can set the remote caching domain by specifying the `--api` and `--token` flags, where `--api` is the hostname and `--token` is a bearer token.
+
+```sh
+turbo run build --api="https://my-server.example.com" --token="xxxxxxxxxxxxxxxxx"
+```
+
+You can see the required endpoints and requests [here](https://github.com/vercel/turbo/blob/main/cli/internal/client/client.go).
diff --git a/docs/pages/repo/docs/core-concepts/scopes.mdx b/docs/pages/repo/docs/core-concepts/scopes.mdx
new file mode 100644
index 0000000..4757c13
--- /dev/null
+++ b/docs/pages/repo/docs/core-concepts/scopes.mdx
@@ -0,0 +1,50 @@
+---
+title: Scoped tasks
+description: Even faster Turborepo builds with scoped tasks.
+searchable: false
+---
+
+import Callout from "../../../../components/Callout";
+import HeartIcon from "@heroicons/react/solid/HeartIcon";
+
+<Callout type="error">
+  `--scope` is deprecated in `1.2.x`. Please use
+  [`--filter`](/repo/docs/core-concepts/monorepos/filtering) instead.
+</Callout>
+
+# Scoped Tasks
+
+Scoping task execution can speed things up, especially if your repository contains distinct clusters of workspaces that are unrelated to each other. Turborepo's `scope` option restricts task execution to the workspaces whose names match the `scope` argument. It's useful to think of `scope` as an "entry point" into your monorepo's workspace/task graph. The matcher is applied to workspace names, not workspace paths.
+
+<Callout type="info">
+  It is important to note that dependents and dependencies refer to the
+  workspace and task.
+</Callout>
+
+## Scoped tasks with all their dependents
+
+By default, a scoped run also covers all affected dependents, since workspaces that change affect their downstream consumers. In this case, pass along the `scope` to build all the dependents as well.
+
+<Callout type="info">
+  You can use the wildcard character `*`.
This is particularly helpful when
+  workspaces are named by group or by scope.
+</Callout>
+
+```sh
+turbo run build --scope=*build-tools*
+```
+
+## Scoped tasks with no dependents & their dependencies
+
+Sometimes we want to run the tasks needed to satisfy the `build` script of all the workspaces that have the `build-tools` string in their names. Think of this as running tasks up to and including the workspaces matched by the scope. Add a `--no-deps` flag to run up to a workspace task.
+
+```sh
+turbo run build --scope=*build-tools* --no-deps --include-dependencies
+```
+
+<Callout type="idea" icon={<HeartIcon className="h-5 w-5 text-gray-400" aria-hidden="true" />}>
+Turborepo's Scoped Tasks API design and docs were inspired by [Microsoft's Lage project](https://microsoft.github.io/lage/docs/Guide/scopes#scoped-builds-with-all-its-dependents) and its `--scope` flag, which was in turn inspired by [Lerna's](https://github.com/lerna/lerna/tree/main/commands/run#readme).
+
+We are working toward a new, more expressive task filtering/scoping syntax. [Read the RFC here.](https://github.com/vercel/turbo/discussions/105)
+
+</Callout>
diff --git a/docs/pages/repo/docs/faq.mdx b/docs/pages/repo/docs/faq.mdx
new file mode 100644
index 0000000..5d4c1c5
--- /dev/null
+++ b/docs/pages/repo/docs/faq.mdx
@@ -0,0 +1,51 @@
+---
+title: Frequently Asked Questions
+description: Frequently asked questions about Turborepo.
+---
+
+import Callout from '../../../components/Callout'
+
+# Frequently Asked Questions
+
+## Should I install Turborepo globally?
+
+You have two options when working with Turborepo:
+
+1. Install it globally, via `npm install --global turbo`
+2. Install a local version in your project
+
+We recommend installing the `turbo` CLI globally. This gives you a smooth, ergonomic experience for running tasks.
+
+## Do I have to use Remote Caching to use Turborepo?
+
+No. [Remote Caching](/repo/docs/core-concepts/remote-caching) is optional.
However, you'll find it very useful to speed up development on a team, speed up builds inside of Docker, and also save space on your own machine.
+
+## Does Turborepo / Remote Caching store my source code?
+
+No. Turborepo does not store source code. Without [Remote Caching](/repo/docs/core-concepts/remote-caching), no code ever leaves your machine - it will only cache artifacts to local disk.
+
+With Turborepo's Remote Caching, you are responsible for configuring cache behavior and should only set up Turborepo to cache compiled artifacts. Please be aware that Turborepo treats all logs as artifacts and so these _will_ be stored along with other cache artifacts.
+
+## Do I have to use Vercel to use Turborepo?
+
+No. Turborepo is an open-source project and is not tied to any specific hosting provider or Remote Cache provider. The default Remote Cache provider is Vercel, should you opt in to enable it. However, you can use any other provider you like if they support the same API. Several open-source community Remote Caches are compatible with Turborepo.
+
+## Can I use Turborepo with a Remote Cache provider other than Vercel?
+
+Yes. As long as the [Remote Cache](/repo/docs/core-concepts/remote-caching) provider you choose supports the same API, you can use Turborepo with it.
+
+## Does Turborepo collect any personally identifiable information?
+
+Due to the nature of Turborepo's functionality, no personal information is gathered when the open source binary is run locally. All cached artifacts are stored on your machine by default. Further, no login information or contact details are collected by the `turbo` CLI, so Turborepo will never have access to any personally identifiable information. For any data privacy questions and concerns, please refer to [Turborepo's Privacy Policy](/privacy).
+
+## Does Turborepo collect any personally identifiable information when using Remote Caching?
+
+When [Remote Caching](/repo/docs/core-concepts/remote-caching) is enabled, by default Turborepo will utilize your Vercel account to cache artifacts in the cloud. Thus, for any data privacy questions and concerns, please refer to [Turborepo's Privacy Policy](/privacy) and [Vercel's Privacy Policy](https://vercel.com/legal/privacy-policy). If you use a different Remote Cache provider, please refer to the provider's privacy policy.
+
+## How can I retain Fast Refresh in my Turborepo when using multiple Next.js applications?
+
+[Fast Refresh](https://nextjs.org/docs/basic-features/fast-refresh) gives you instantaneous feedback on edits made to your React components in Next.js applications.
+
+If your Turborepo has multiple Next.js applications, you can use `transpilePackages` inside `next.config.js` to ensure that imports across workspaces work with Fast Refresh when changes are made. Turborepo will effectively watch for any edits and rebuild when you save. You can get started from [this example](https://github.com/vercel/turbo/tree/main/examples/basic), which is set up to handle Fast Refresh.
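A minimal sketch of such a `next.config.js` might look like the following; the workspace name `ui` is an assumption taken from the basic example, so adjust it to your own shared package names:

```javascript
// next.config.js - minimal sketch; assumes a shared workspace named "ui".
/** @type {import('next').NextConfig} */
const nextConfig = {
  reactStrictMode: true,
  // Compile the local `ui` workspace as part of this app's build so that
  // edits to it are picked up by Fast Refresh.
  transpilePackages: ["ui"],
};

module.exports = nextConfig;
```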
+ +<Callout type="info">If you are using a Next.js version below 13, you will want to use [`next-transpile-modules`](https://www.npmjs.com/package/next-transpile-modules) for the same Fast Refresh behavior.</Callout> diff --git a/docs/pages/repo/docs/getting-started/_meta.json b/docs/pages/repo/docs/getting-started/_meta.json new file mode 100644 index 0000000..2ed5b29 --- /dev/null +++ b/docs/pages/repo/docs/getting-started/_meta.json @@ -0,0 +1,6 @@ +{ + "add-to-project": "Add to Existing Project", + "from-example": "Start from an Example", + "create-new": "Create a New Monorepo", + "existing-monorepo": "Add to Existing Monorepo" +} diff --git a/docs/pages/repo/docs/getting-started/add-to-project.mdx b/docs/pages/repo/docs/getting-started/add-to-project.mdx new file mode 100644 index 0000000..5923a47 --- /dev/null +++ b/docs/pages/repo/docs/getting-started/add-to-project.mdx @@ -0,0 +1,147 @@ +import { Tabs, Tab } from '../../../../components/Tabs' + +# Add Turborepo to your existing project + +Turborepo can be used in **any project** to speed up the execution of scripts in your `package.json`. + +After you install `turbo`, you'll be able to run all your `package.json` tasks from `turbo` instead of your package manager. + +By configuring your `turbo.json` correctly, you'll notice how [caching](/repo/docs/core-concepts/caching) helps your tasks run a lot faster. + +## Quickstart + +0. **If you don't have one already, create a new application:** + +<Tabs items={['Next.js', 'Vite']} storageKey="selected-framework"> + <Tab> +```bash +npx create-next-app@latest +``` + </Tab> + <Tab> +```bash +npm create vite@latest +``` + </Tab> +</Tabs> + +1. 
**Install `turbo` globally:** + +<Tabs items={['npm', 'yarn', 'pnpm']} storageKey="selected-pkg-manager"> + <Tab> + ```bash + npm install turbo --global + ``` + </Tab> + <Tab> + ```bash + yarn global add turbo + ``` + </Tab> + <Tab> + ```bash + pnpm add turbo --global + ``` + </Tab> +</Tabs> + +For more details about installation, see [Installing Turborepo](../installing) + +2. **Add a `turbo.json` file at the base of your new repository:** + +For more information on configuring your `turbo.json`, see the [Configuration Options](/repo/docs/reference/configuration) documentation. + +<Tabs items={['Next.js', 'Vite']} storageKey="selected-framework"> + <Tab> +```json filename="turbo.json" +{ + "$schema": "https://turbo.build/schema.json", + "pipeline": { + "build": { + "outputs": [".next/**", "!.next/cache/**"] + }, + "lint": {} + } +} +``` + </Tab> + <Tab> +```json filename="turbo.json" +{ + "$schema": "https://turbo.build/schema.json", + "pipeline": { + "build": { + "outputs": ["dist/**"] + }, + "lint": {} + } +} +``` + +Some Vite starters ship with a `package.json` that looks like this: + +```json filename="package.json" +{ + "scripts": { + "build": "tsc && vite build" + } +} +``` + +We recommend splitting these into a `lint` and `build` script. + +```json filename="package.json" +{ + "scripts": { + "build": "vite build", + "lint": "tsc" + } +} +``` + +This means that Turbo can schedule them separately. + + </Tab> +</Tabs> + +3. **Edit `.gitignore`** + +Add `.turbo` to your `.gitignore` file. The CLI uses these folders for logs and certain task outputs. + +```diff ++ .turbo +``` + +4. **Try running `build` and `lint` with `turbo`:** + +```bash +turbo build lint +``` + +This runs `build` and `lint` at the same time. + +5. 
**Without making any changes to the code, try running `build` and `lint` again:** + +```bash +turbo build lint +``` + +You should see terminal output like this: + +``` + Tasks: 2 successful, 2 total +Cached: 2 cached, 2 total + Time: 185ms >>> FULL TURBO +``` + +Congratulations - **you just completed a build and lint in under 200ms**. + +To learn how this is possible, check out our [core concepts docs](/repo/docs/core-concepts/caching). + +6. **Try running `dev` with `turbo`:** + +```bash +turbo dev +``` + +You'll notice that your `dev` script starts up. You can use `turbo` to run any script in your `package.json`. diff --git a/docs/pages/repo/docs/getting-started/create-new.mdx b/docs/pages/repo/docs/getting-started/create-new.mdx new file mode 100644 index 0000000..fa6f061 --- /dev/null +++ b/docs/pages/repo/docs/getting-started/create-new.mdx @@ -0,0 +1,527 @@ +--- +title: Getting Started with Turborepo +description: Create your first monorepo or add Turborepo to an existing project. +--- + +import Callout from "../../../../components/Callout"; +import { Tabs, Tab } from "../../../../components/Tabs"; + +# Creating a new monorepo + +<Callout type="info"> + This guide uses a global installation of `turbo`. Follow the [installation guide](../installing) + to get this setup. Alternatively, you can use your package manager to run a locally installed `turbo` + in the commands below. +</Callout> + +## Quickstart + +To create a new monorepo, use our [`create-turbo`](https://www.npmjs.com/package/create-turbo) npm package: + +<Tabs items={['npm', 'yarn', 'pnpm']} storageKey="selected-pkg-manager"> + <Tab> + ```sh + npx create-turbo@latest + ``` + </Tab> + <Tab> + ```sh + yarn dlx create-turbo@latest + ``` + </Tab> + <Tab> + ```sh + pnpm dlx create-turbo@latest + ``` + </Tab> +</Tabs> + +You can also clone a Turborepo starter repository to get a head start on your monorepo. 
To see Turborepo examples and starters, see the [Turborepo examples directory on GitHub](https://github.com/vercel/turbo/tree/main/examples). + +## Full tutorial + +This tutorial will walk you through setting up a basic example. By the end, you'll feel confident with using `turbo`, and know all the basic functionality. + +<Callout type="info"> + +During this tutorial, some lines of code are omitted from the code samples. For instance, when showing a `package.json` we won't show _all_ of the keys - only the ones that matter. + +</Callout> + +### 1. Running `create-turbo` + +First, run: + +<Tabs items={['npm', 'yarn', 'pnpm']} storageKey="selected-pkg-manager"> + <Tab> + ```sh + npx create-turbo@latest + ``` + </Tab> + <Tab> + ```sh + yarn dlx create-turbo@latest + ``` + </Tab> + <Tab> + ```sh + pnpm dlx create-turbo@latest + ``` + </Tab> +</Tabs> + +This installs the [`create-turbo`](https://www.npmjs.com/package/create-turbo) CLI, and runs it. You'll be asked several questions: + +#### Where would you like to create your turborepo? + +Choose anywhere you like. The default is `./my-turborepo`. + +#### Which package manager do you want to use? + +Turborepo doesn't handle installing packages, so you'll need to choose either: + +- [npm](https://www.npmjs.com//) +- [pnpm](https://pnpm.io/) +- [yarn](https://yarnpkg.com/) + +If you're not sure, we recommend choosing `pnpm`. If you don't have it installed, cancel `create-turbo` (via `ctrl-C`) and take a look at the [installation instructions](https://pnpm.io/installation). + +#### Installation + +Once you've picked a package manager, `create-turbo` will create a bunch of new files inside the folder name you picked. It'll also install all the dependencies that come with the `basic` example by default. + +### 2. Exploring your new repo + +You might have noticed something in the terminal. `create-turbo` gave you a description of all of the things it was adding. 
+ +``` +>>> Creating a new turborepo with the following: + + - apps/web: Next.js with TypeScript + - apps/docs: Next.js with TypeScript + - packages/ui: Shared React component library + - packages/eslint-config-custom: Shared configuration (ESLint) + - packages/tsconfig: Shared TypeScript `tsconfig.json` +``` + +Each of these is a _workspace_ - a folder containing a `package.json`. Each workspace can declare its own dependencies, run its own scripts, and export code for other workspaces to use. + +Open the root folder - `./my-turborepo` - in your favourite code editor. + +#### Understanding `packages/ui` + +First, open `./packages/ui/package.json`. You'll notice that the package's name is `"name": "ui"` - right at the top of the file. + +Next, open `./apps/web/package.json`. You'll notice that this package's name is `"name": "web"`. But also - take a look in its dependencies. + +You'll see that `"web"` depends on a package called `"ui"`. If you're using `pnpm`, you'll see it's declared like this: + +<Tabs items={['npm', 'yarn', 'pnpm']} storageKey="selected-pkg-manager"> + <Tab> + ```json filename="apps/web/package.json" + { + "dependencies": { + "ui": "*" + } + } + ``` + </Tab> + <Tab> + ```json filename="apps/web/package.json" + { + "dependencies": { + "ui": "*" + } + } + ``` + </Tab> + <Tab> + ```json filename="apps/web/package.json" + { + "dependencies": { + "ui": "workspace:*" + } + } + ``` + </Tab> +</Tabs> + +This means that our **web app depends on our local `ui` package**. + +If you look inside `apps/docs/package.json`, you'll see the same thing. Both `web` and `docs` depend on `ui` - a shared component library. + +This pattern of sharing code across applications is extremely common in monorepos - and means that multiple apps can share a single design system. + +#### Understanding imports and exports + +Take a look inside `./apps/docs/pages/index.tsx`. 
Both `docs` and `web` are [Next.js](https://nextjs.org/) applications, and they both use the `ui` library in a similar way:
+
+```tsx filename="apps/docs/pages/index.tsx"
+import { Button } from "ui";
+//       ^^^^^^         ^^
+
+export default function Docs() {
+  return (
+    <div>
+      <h1>Docs</h1>
+      <Button />
+    </div>
+  );
+}
+```
+
+They're importing `Button` directly from a dependency called `ui`! How does that work? Where is `Button` coming from?
+
+Open `packages/ui/package.json`. You'll notice these two attributes:
+
+```json filename="packages/ui/package.json"
+{
+  "main": "./index.tsx",
+  "types": "./index.tsx"
+}
+```
+
+When workspaces import from `ui`, `main` tells them where to access the code they're importing. `types` tells them where the TypeScript types are located.
+
+So, let's look inside `packages/ui/index.tsx`:
+
+```tsx filename="packages/ui/index.tsx"
+import * as React from "react";
+export * from "./Button";
+```
+
+Everything inside this file can be used by workspaces that depend on `ui`.
+
+`index.tsx` is exporting everything from a file called `./Button`, so let's go there:
+
+```tsx filename="packages/ui/Button.tsx"
+import * as React from "react";
+
+export const Button = () => {
+  return <button>Boop</button>;
+};
+```
+
+We've found our button! Any changes we make in this file will be shared across `web` and `docs`. Pretty cool!
+
+<Callout type="idea">
+
+Try experimenting with exporting a different function from this file. Perhaps `add(a, b)` for adding two numbers together.
+
+This can then be imported by `web` and `docs`.
+
+</Callout>
+
+#### Understanding `tsconfig`
+
+We have two more workspaces to look at, `tsconfig` and `eslint-config-custom`. Each of these allows for shared configuration across the monorepo.
Let's look in `tsconfig`: + +```json filename="packages/tsconfig/package.json" +{ + "name": "tsconfig", + "files": ["base.json", "nextjs.json", "react-library.json"] +} +``` + +Here, we specify three files to be exported, inside `files`. Packages which depend on `tsconfig` can then import them directly. + +For instance, `packages/ui` depends on `tsconfig`: + +<Tabs items={['npm', 'yarn', 'pnpm']} storageKey="selected-pkg-manager"> + <Tab> + ```json filename="packages/ui/package.json" + { + "devDependencies": { + "tsconfig": "*" + } + } + ``` + </Tab> + <Tab> + ```json filename="packages/ui/package.json" + { + "devDependencies": { + "tsconfig": "*" + } + } + ``` + </Tab> + <Tab> + ```json filename="packages/ui/package.json" + { + "devDependencies": { + "tsconfig": "workspace:*" + } + } + ``` + </Tab> +</Tabs> + +And inside its `tsconfig.json` file, it imports it using `extends`: + +```json filename="packages/ui/tsconfig.json" +{ + "extends": "tsconfig/react-library.json" +} +``` + +This pattern allows for a monorepo to share a single `tsconfig.json` across all its workspaces, reducing code duplication. + +#### Understanding `eslint-config-custom` + +Our final workspace is `eslint-config-custom`. + +You'll notice that this is named slightly differently to the other workspaces. It's not as concise as `ui` or `tsconfig`. Let's take a look inside `.eslintrc.js` in the root of the monorepo to figure out why. + +```ts filename=".eslintrc.js" +module.exports = { + // This tells ESLint to load the config from the workspace `eslint-config-custom` + extends: ["custom"], +}; +``` + +[ESLint](https://eslint.org/) resolves configuration files by looking for workspaces with the name `eslint-config-*`. This lets us write `extends: ['custom']` and have ESLint find our local workspace. + +But why is this in the root of the monorepo? + +The way ESLint finds its configuration file is by looking at the closest `.eslintrc.js`. 
If it can't find one in the current directory, it'll look in the directory above until it finds one. + +So that means that if we're working on code inside `packages/ui` (which doesn't have a `.eslintrc.js`) it'll refer to the _root_ instead. + +Apps that _do_ have an `.eslintrc.js` can refer to `custom` in the same way. For instance, in `docs`: + +```ts filename="apps/docs/.eslintrc.js" +module.exports = { + root: true, + extends: ["custom"], +}; +``` + +Just like `tsconfig`, `eslint-config-custom` lets us share ESLint configs across our entire monorepo, keeping things consistent no matter what project you're working on. + +#### Summary + +It's important to understand the dependencies between these workspaces. Let's map them out: + +- `web` - depends on `ui`, `tsconfig` and `eslint-config-custom` +- `docs` - depends on `ui`, `tsconfig` and `eslint-config-custom` +- `ui` - depends on `tsconfig` and `eslint-config-custom` +- `tsconfig` - no dependencies +- `eslint-config-custom` - no dependencies + +Note that **the Turborepo CLI is not responsible for managing these dependencies**. All of the things above are handled by the package manager you chose (`npm`, `pnpm` or `yarn`). + +### 3. Understanding `turbo.json` + +We now understand our repository and its dependencies. How does Turborepo help? + +Turborepo helps by making running tasks simpler and _much_ more efficient. + +Let's take a look inside `turbo.json`, at the root: + +```json filename="turbo.json" +{ + "pipeline": { + "build": { + "dependsOn": ["^build"], + "outputs": ["dist/**", ".next/**", "!.next/cache/**"] + }, + "lint": {}, + "dev": { + "cache": false + } + } +} +``` + +What we're seeing here is that we've _registered_ three tasks with `turbo`: `lint`, `dev` and `build`. +Every task registered inside `turbo.json` can be run with `turbo run <task>` (or `turbo <task>` for short). 
<Callout type="error">
  Before we move on, let's try running a task called `hello` that _isn't_ registered in `turbo.json`:

```bash
turbo hello
```

You'll see an error in the terminal. Something resembling:

```
Could not find the following tasks in project: hello
```

That's worth remembering - **in order for `turbo` to run a task, it must be in `turbo.json`**.

</Callout>

Let's investigate the scripts we already have in place.

### 4. Linting with Turborepo

Try running our `lint` script:

```sh
turbo lint
```

You'll notice several things happen in the terminal.

1. Several scripts will be run at the same time, each prefixed with either `docs:lint`, `ui:lint` or `web:lint`.
2. They'll each succeed, and you'll see `3 successful` in the terminal.
3. You'll also see `0 cached, 3 total`. We'll cover what this means later.

The `lint` scripts that run come from each workspace's `package.json`. Each workspace can optionally specify its own `lint` script:

```json filename="apps/web/package.json"
{
  "scripts": {
    "lint": "next lint"
  }
}
```

```json filename="apps/docs/package.json"
{
  "scripts": {
    "lint": "next lint"
  }
}
```

```json filename="packages/ui/package.json"
{
  "scripts": {
    "lint": "eslint *.ts*"
  }
}
```

When we run `turbo lint`, Turborepo looks at each `lint` script in each workspace and runs it. For more details, see our [pipelines](/repo/docs/core-concepts/monorepos/running-tasks#defining-a-pipeline) docs.

#### Using the cache

Let's run our `lint` script one more time. You'll notice a few new things appear in the terminal:

1. `cache hit, replaying output` appears for `docs:lint`, `web:lint` and `ui:lint`.
2. You'll see `3 cached, 3 total`.
3. The total runtime should be under `100ms`, and `>>> FULL TURBO` appears.

Something interesting just happened. Turborepo realised that **our code hadn't changed since the last time we ran the lint script**.
+ +It had saved the logs from the previous run, so it just replayed them. + +Let's try changing some code to see what happens. Make a change to a file inside `apps/docs`: + +```diff filename="apps/docs/pages/index.tsx" +import { Button } from "ui"; + +export default function Docs() { + return ( + <div> +- <h1>Docs</h1> ++ <h1>My great docs</h1> + <Button /> + </div> + ); +} +``` + +Now, run the `lint` script again. You'll notice that: + +1. `docs:lint` has a comment saying `cache miss, executing`. This means that `docs` is running its linting. +2. `2 cached, 3 total` appears at the bottom. + +This means that **the results of our previous tasks were still cached**. Only the `lint` script inside `docs` actually ran - again, speeding things up. To learn more, check out our [caching docs](/repo/docs/core-concepts/caching). + +### 5. Building with Turborepo + +Let's try running our `build` script: + +```bash +turbo build +``` + +You'll see similar outputs to when we ran our lint script. Only `apps/docs` and `apps/web` specify a `build` script in their `package.json`, so only those are run. + +Take a look inside `build` in `turbo.json`. There's some interesting config there. + +```json filename="turbo.json" +{ + "pipeline": { + "build": { + "outputs": [".next/**", "!.next/cache/**"] + } + } +} +``` + +You'll notice that some `outputs` have been specified. Declaring outputs will mean that when `turbo` finishes running your task, it'll save the output you specify in its cache. + +Both `apps/docs` and `apps/web` are Next.js apps, and they output builds to the `./.next` folder. + +Let's try something. Delete the `apps/docs/.next` build folder. + +Run the `build` script again. You'll notice: + +1. We hit `FULL TURBO` - the builds complete in under `100ms`. +2. The `.next` folder re-appears! + +Turborepo cached the result of our previous build. When we ran the `build` command again, it restored the entire `.next/**` folder from the cache. 
To learn more, check out our docs on [cache outputs](/repo/docs/core-concepts/caching#configuring-cache-outputs). + +### 6. Running dev scripts + +Let's now try running `dev`. + +```bash +turbo dev +``` + +You'll notice some information in the terminal: + +1. Only two scripts will execute - `docs:dev` and `web:dev`. These are the only two workspaces which specify `dev`. +2. Both `dev` scripts are run simultaneously, starting your Next.js apps on ports `3000` and `3001`. +3. In the terminal, you'll see `cache bypass, force executing`. + +Try quitting out of the script, and re-running it. You'll notice we don't go `FULL TURBO`. Why is that? + +Take a look at `turbo.json`: + +```json filename="turbo.json" +{ + "pipeline": { + "dev": { + "cache": false, + "persistent": true + } + } +} +``` + +Inside `dev`, we've specified `"cache": false`. This means we're telling Turborepo _not_ to cache the +results of the `dev` script. `dev` runs a persistent dev server and produces no outputs, so there +is nothing to cache. Learn more about it in our docs on [turning off caching](/repo/docs/core-concepts/caching#turn-off-caching). +Additionally, we set `"persistent": true`, to let turbo know that this is a long-running dev server, +so that turbo can ensure that no other tasks depend on it. You can read more in the [docs for the +`persistent` option](/repo/docs/reference/configuration#persistent). + +#### Running `dev` on only one workspace at a time + +By default, `turbo dev` will run `dev` on all workspaces at once. But sometimes, we might only want to choose one workspace. + +To handle this, we can add a `--filter` flag to our command. + +```bash +turbo dev --filter docs +``` + +You'll notice that it now only runs `docs:dev`. Learn more about [filtering workspaces](/repo/docs/core-concepts/monorepos/filtering) from our docs. + +### Summary + +Well done! You've learned all about your new monorepo, and how Turborepo makes handling your tasks easier. 
#### Next steps

- Need to add more tasks? Learn more about using [pipelines](/repo/docs/core-concepts/monorepos/running-tasks#defining-a-pipeline).
- Want to speed up your CI? Set up [remote caching](/repo/docs/core-concepts/remote-caching).
- Want some inspiration? Take a look at our directory of [examples](https://github.com/vercel/turbo/tree/main/examples).

diff --git a/docs/pages/repo/docs/getting-started/existing-monorepo.mdx b/docs/pages/repo/docs/getting-started/existing-monorepo.mdx
new file mode 100644
index 0000000..f36fd0c
--- /dev/null
+++ b/docs/pages/repo/docs/getting-started/existing-monorepo.mdx
@@ -0,0 +1,231 @@
---
title: Getting Started with Turborepo
description: Create your first monorepo or add Turborepo to an existing project.
---

import Callout from "../../../../components/Callout";
import { Tabs, Tab } from "../../../../components/Tabs";

# Add Turborepo to your existing monorepo

## Configure workspaces

`turbo` is built on top of Workspaces, a way of managing multiple packages from within a single monorepo codebase. Turborepo is compatible with the workspace implementations from all package managers. For more information on managing your Turborepo workspaces, see the [Workspaces](/repo/docs/handbook/workspaces) documentation.

You can configure workspaces any way you want, but a common folder structure is to keep applications in the `/apps` folder and packages in the `/packages` folder. The configuration of these folders is different for each package manager.
<Tabs items={['npm', 'yarn', 'pnpm']} storageKey="selected-pkg-manager">
  <Tab>
Specify your `workspaces` in your monorepo's root `package.json` file:

```json filename="package.json"
{
  "workspaces": ["packages/*", "apps/*"]
}
```

  </Tab>
  <Tab>
Specify your `workspaces` in your monorepo's root `package.json` file:

```json filename="package.json"
{
  "workspaces": ["packages/*", "apps/*"]
}
```

  </Tab>
  <Tab>
Specify your `packages` in `pnpm-workspace.yaml`:

```yaml filename="pnpm-workspace.yaml"
packages:
  - "packages/*"
  - "apps/*"
```

  </Tab>
</Tabs>

After configuring your workspaces, re-run your package manager's `install` command.

<Callout>
  Note: Nested workspaces are not supported. As package names are required to be
  unique, moving each package to be a child of the monorepo's root package
  should meet your needs.
</Callout>

## Install `turbo`

Install `turbo` globally.

<Tabs items={['npm', 'yarn', 'pnpm']} storageKey="selected-pkg-manager">
  <Tab>
    ```bash
    npm install turbo --global
    ```
  </Tab>
  <Tab>
    ```bash
    yarn global add turbo
    ```
  </Tab>
  <Tab>
    ```bash
    pnpm add turbo --global
    ```
  </Tab>
</Tabs>

For more details about installation, see [Installing Turborepo](../installing).

## Create `turbo.json`

In the root of your monorepo, create a file named `turbo.json`. This will hold the configuration for Turborepo.

```json filename="turbo.json"
{
  "$schema": "https://turbo.build/schema.json"
}
```

## Create a `pipeline`

To define your monorepo's task dependency graph, use the [`pipeline`](/repo/docs/core-concepts/monorepos/running-tasks) key in the `turbo.json` configuration file at the root of the monorepo. `turbo` interprets this configuration to optimally schedule, execute, and cache the outputs of each of the `package.json` scripts defined in your workspaces.
+ +Each key in the [`pipeline`](/repo/docs/core-concepts/monorepos/running-tasks) object is the name of a `package.json` script that can be executed by [`turbo run`](/repo/docs/reference/command-line-reference#turbo-run-task). You can specify its dependencies with the [`dependsOn`](/repo/docs/reference/configuration#dependson) key inside it as well as some other options related to [caching](/repo/docs/core-concepts/caching). For more information on configuring your pipeline, see the [`Pipelines`](/repo/docs/core-concepts/monorepos/running-tasks) documentation. + +Workspaces that do not have the specified script defined in their `package.json`'s list of `scripts` will be ignored by `turbo`. + +```jsonc filename="turbo.json" +{ + "$schema": "https://turbo.build/schema.json", + "pipeline": { + "build": { + // A package's `build` script depends on that package's + // dependencies and devDependencies + // `build` tasks being completed first + // (the `^` symbol signifies `upstream`). + "dependsOn": ["^build"], + // note: output globs are relative to each package's `package.json` + // (and not the monorepo root) + "outputs": [".next/**", "!.next/cache/**"] + }, + "test": { + // A package's `test` script depends on that package's + // own `build` script being completed first. + "dependsOn": ["build"], + // A package's `test` script should only be rerun when + // either a `.tsx` or `.ts` file has changed in `src` or `test` folders. + "inputs": ["src/**/*.tsx", "src/**/*.ts", "test/**/*.ts", "test/**/*.tsx"] + }, + // A package's `lint` script has no dependencies and + // can be run whenever. It also has no filesystem outputs. + "lint": {}, + "deploy": { + // A package's `deploy` script depends on the `build`, + // `test`, and `lint` scripts of the same package + // being completed. It also has no filesystem outputs. + "dependsOn": ["build", "test", "lint"] + } + } +} +``` + +The rough execution order for a given package is based on the `dependsOn` keys: + +1. 
`build` runs once its upstream dependencies have run their `build` commands
2. `test` runs once its _own_ `build` command is finished; it has no filesystem outputs, just logs
3. `lint` runs in an arbitrary order, as it has no upstream dependencies
4. `deploy` runs once its _own_ `build`, `test`, and `lint` commands have finished

After execution, the full pipeline can be run:

```sh
npx turbo run deploy
```

`turbo` will then schedule the execution of each task to make optimal use of the machine's resources.

## Edit `.gitignore`

Add `.turbo` to your `.gitignore` file. The CLI uses these folders for logs and certain task outputs.

```diff
+ .turbo
```

Make sure that your task artifacts, the files and folders you want cached, are also included in your `.gitignore`:

```diff
+ build/**
+ dist/**
+ .next/**
```

Re-run your package manager's `install` command to check your configuration.

## Build your monorepo

```bash
turbo build
```

Depending on your monorepo setup, some artifacts might already be caching properly. In the next sections, we'll show how `turbo` works, how `scope` works, and how to get caching working after that.

## Configure Remote Caching

A major key πŸ”‘ to Turborepo's speed is that it is both lazy and efficient: it does the least amount of work possible, and it tries to never redo work that's already been done before.

At the moment, Turborepo caches your tasks on your local filesystem (i.e. "single-player mode," if you will). However, what if there was a way to take advantage of the computational work done by your teammates or your CI (i.e. "co-op multiplayer mode")? What if there was a way to teleport and share a single cache across machines? Almost like a "Dropbox" for your Turborepo cache.

> Remote Caching has entered the chat.

Turborepo can use a technique known as Remote Caching to share cache artifacts across machines for an additional speed boost.
<Callout>
  Remote Caching is a powerful feature of Turborepo, but with great power comes
  great responsibility. Make sure you are caching correctly first, and double-check
  your handling of environment variables. Please also remember that Turborepo treats
  logs as artifacts, so be aware of what you are printing to the console.
</Callout>

### Using Remote Caching for local development

Turborepo uses [Vercel](https://vercel.com/?utm_source=turbo.build&utm_medium=referral&utm_campaign=docs-link) as its default remote caching provider. If you want to link your local turborepo to your Remote Cache, you can authenticate the Turborepo CLI with your Vercel account:

```sh
turbo login
```

Then, link your turborepo to your remote cache:

```sh
turbo link
```

Once linked, make some changes to a package or application you are currently caching, and run tasks against it with `turbo run`. Your cache artifacts will now be stored locally _and_ in your Remote Cache.

To verify that this worked, delete your local Turborepo cache:

```sh
rm -rf ./node_modules/.cache/turbo
```

Run the same build again. If things are working properly, `turbo` should not execute tasks locally; instead, it will download both the logs and artifacts from your Remote Cache and replay them back to you.
<Callout>
  **Note: When connecting to an SSO-enabled Vercel team, you must provide your
  team slug as an argument to `turbo login`.**
</Callout>

```sh
turbo login --sso-team=<team-slug>
```

## Next Steps

You're now up and running with Turborepo, but there are still a few things to do:

- [Understand how Turborepo caching works](/repo/docs/core-concepts/caching)
- [Correctly handle environment variables](/repo/docs/core-concepts/caching#altering-caching-based-on-environment-variables)
- [Learn to orchestrate task running with pipelines](/repo/docs/core-concepts/monorepos/running-tasks)
- [Efficiently filter package tasks](/repo/docs/core-concepts/monorepos/filtering)
- [Configure Turborepo with your CI provider](/repo/docs/ci)

diff --git a/docs/pages/repo/docs/getting-started/from-example.mdx b/docs/pages/repo/docs/getting-started/from-example.mdx
new file mode 100644
index 0000000..ba1e5ce
--- /dev/null
+++ b/docs/pages/repo/docs/getting-started/from-example.mdx
@@ -0,0 +1,49 @@
---
title: Examples
description: Start from an example.
---

import { readdirSync, lstatSync, readFileSync } from 'fs'
import path from 'path'
import { ExamplesArea } from "../../../../components/ExamplesArea";

export const getStaticProps = ({ params }) => {
  // path to examples directory at the monorepo root.
  const examplesDirectory = path.join(__dirname, '../../../../../../../examples')
  const examples = [];
  readdirSync(examplesDirectory).forEach(file => {
    if (lstatSync(path.join(examplesDirectory, file)).isDirectory()) {
      try {
        examples.push({
          slug: file,
          ...JSON.parse(readFileSync(path.join(examplesDirectory, file, "meta.json")).toString())
        });
      } catch (err) {
        console.log(`No meta.json found for ${file}, excluding from docs`);
      }
    }
  });
  // throw an error if no examples are found
  if (examples.length === 0) {
    throw new Error(
      `No examples found in ${examplesDirectory}!
Make sure you have updated the path if moving this file.` + ) + } + return { + props: { + ssg: { + examples + } + }, + revalidate: 60 * 60 * 24 + } +} + +# Turborepo Examples + +Clone a Turborepo starter repository to get a head start on your monorepo. + +<ExamplesArea filter='all' /> + +For even more examples and starters, see the [Turborepo examples directory on GitHub](https://github.com/vercel/turbo/tree/main/examples). diff --git a/docs/pages/repo/docs/handbook.mdx b/docs/pages/repo/docs/handbook.mdx new file mode 100644 index 0000000..2111b68 --- /dev/null +++ b/docs/pages/repo/docs/handbook.mdx @@ -0,0 +1,25 @@ +--- +title: Monorepo Handbook +description: The missing manual on how to set up and use your monorepo. +--- + +import { + FundamentalsArea, + TasksArea, +} from "../../../components/MonorepoHandbook"; + +# Monorepo Handbook + +Now we've covered the core concepts, it's time to get practical. This handbook covers **everything you need to know to set up and use your monorepo**. + +## Fundamentals + +Learn about the **fundamental building blocks of monorepos** - workspaces, packages and dependencies. + +<FundamentalsArea /> + +## Tasks + +Configure common tasks in your monorepo, like **linting, testing, and building** your apps and packages. 
+ +<TasksArea /> diff --git a/docs/pages/repo/docs/handbook/_meta.json b/docs/pages/repo/docs/handbook/_meta.json new file mode 100644 index 0000000..38d0327 --- /dev/null +++ b/docs/pages/repo/docs/handbook/_meta.json @@ -0,0 +1,15 @@ +{ + "what-is-a-monorepo": "What is a Monorepo?", + "package-installation": "Package Installation", + "workspaces": "Workspaces", + "migrating-to-a-monorepo": "Migrating to a Monorepo", + "dev": "Development Tasks", + "building-your-app": "Building Your App", + "deploying-with-docker": "Deploying with Docker", + "environment-variables": "Environment Variables", + "sharing-code": "Sharing Code", + "linting": "Linting", + "testing": "Testing", + "publishing-packages": "Publishing Packages", + "troubleshooting": "Troubleshooting" +} diff --git a/docs/pages/repo/docs/handbook/building-your-app.mdx b/docs/pages/repo/docs/handbook/building-your-app.mdx new file mode 100644 index 0000000..e2e09de --- /dev/null +++ b/docs/pages/repo/docs/handbook/building-your-app.mdx @@ -0,0 +1,76 @@ +import { Tabs, Tab } from "../../../../components/Tabs"; +import Callout from "../../../../components/Callout"; + +# Building your App + +Unless your monorepo is only used for [publishing packages to npm](/repo/docs/handbook/publishing-packages), it will likely contain at least one application. Coordinating your app's builds with Turborepo can lead to some extraordinary gains in speed. + +## Setting up the build + +Turborepo works by keeping your workspace tasks where they belong - in each workspace's `package.json`. 
Let's imagine you have a monorepo that looks like this: + +``` +β”œβ”€β”€ apps +β”‚ └── web +β”‚ └── package.json +β”œβ”€β”€ package.json +└── turbo.json +``` + +Your `apps/web/package.json` should have a `build` script inside: + +<Tabs items={['Next.js', 'Vite']} storageKey="selected-framework"> + <Tab> +```json filename="apps/web/package.json" +{ + "scripts": { + "build": "next build" + } +} +``` + </Tab> + <Tab> +```json filename="apps/web/package.json" +{ + "scripts": { + "build": "vite build" + } +} +``` + </Tab> +</Tabs> + +Inside `turbo.json`, you can add `build` to the pipeline. + +<Tabs items={['Next.js', 'Vite']} storageKey="selected-framework"> + <Tab> +```json filename="turbo.json" +{ + "pipeline": { + "build": { + "outputs": [".next/**", "!.next/cache/**"] + } + } +} +``` + </Tab> + <Tab> +```json filename="turbo.json" +{ + "pipeline": { + "build": { + "outputs": ["dist/**"] + } + } +} +``` + </Tab> +</Tabs> + +<Callout type="info"> + +We configure the `outputs` so that we can enable [caching](/repo/docs/core-concepts/caching) - an extremely powerful feature of Turborepo that can skip tasks that have been done before. + +</Callout> + +This means that running `turbo build` from root will build all of the apps in the repository. Thanks to Turborepo's task cache, you can end up with extremely fast build times. diff --git a/docs/pages/repo/docs/handbook/deploying-with-docker.mdx b/docs/pages/repo/docs/handbook/deploying-with-docker.mdx new file mode 100644 index 0000000..fafb041 --- /dev/null +++ b/docs/pages/repo/docs/handbook/deploying-with-docker.mdx @@ -0,0 +1,184 @@ +import Callout from "../../../../components/Callout"; + +# Deploying with Docker + +Building a [Docker](https://www.docker.com/) image is a common way to deploy all sorts of applications. However, doing so from a monorepo has several challenges. + +## The problem + +**TL;DR:** In a monorepo, unrelated changes can make Docker do unnecessary work when deploying your app. 
Let's imagine you have a monorepo that looks like this:

```txt
β”œβ”€β”€ apps
β”‚   β”œβ”€β”€ docs
β”‚   β”‚   β”œβ”€β”€ server.js
β”‚   β”‚   └── package.json
β”‚   └── web
β”‚       └── package.json
β”œβ”€β”€ package.json
└── package-lock.json
```

You want to deploy `apps/docs` using Docker, so you create a Dockerfile:

```docker filename="Dockerfile"
FROM node:16

WORKDIR /usr/src/app

# Copy root package.json and lockfile
COPY package.json ./
COPY package-lock.json ./

# Copy the docs package.json
COPY apps/docs/package.json ./apps/docs/package.json

RUN npm install

# Copy app source
COPY . .

EXPOSE 8080

CMD [ "node", "apps/docs/server.js" ]
```

This copies the root `package.json` and the root lockfile into the Docker image. Then, it installs dependencies, copies the app source, and starts the app.

You should also create a `.dockerignore` file to prevent `node_modules` from being copied in with the app's source.

```txt filename=".dockerignore"
node_modules
npm-debug.log
```

### The lockfile changes too often

Docker is pretty smart about how it deploys your apps. Just like Turbo, it tries to do as [little work as possible](https://bitjudo.com/blog/2014/03/13/building-efficient-dockerfiles-node-dot-js/).

In our Dockerfile's case, Docker will only re-run `npm install` if the files copied into the image are _different_ from the previous run. If they aren't, it'll reuse the `node_modules` layer it cached before.

This means that whenever `package.json`, `apps/docs/package.json` or `package-lock.json` changes, the image build will re-run `npm install`.

This sounds great - until we realise something. The `package-lock.json` is _global_ for the monorepo. That means that **if we install a new package inside `apps/web`, we'll cause `apps/docs` to redeploy**.

In a large monorepo, this can result in a huge amount of lost time, as any change to a monorepo's lockfile cascades into tens or hundreds of deploys.
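To make this concrete, here's a sketch of the sequence (the commands are illustrative, assuming the Dockerfile above and npm workspaces):

```bash
# First build: the `RUN npm install` layer gets cached
docker build -t docs-app .

# Add a dependency to an *unrelated* workspace;
# this rewrites the root package-lock.json
npm install lodash --workspace=apps/web

# Rebuild: the changed lockfile invalidates the install layer,
# so `npm install` runs again even though apps/docs never changed
docker build -t docs-app .
```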
+ +## The solution + +The solution is to prune the inputs to the Dockerfile to only what is strictly necessary. Turborepo provides a simple solution - `turbo prune`. + +```bash +turbo prune --scope="docs" --docker +``` + +Running this command creates a **pruned version of your monorepo** inside an `./out` directory. It only includes workspaces which `docs` depends on. + +Crucially, it also **prunes the lockfile** so that only the relevant `node_modules` will be downloaded. + +### The `--docker` flag + +By default, `turbo prune` puts all relevant files inside `./out`. But to optimize caching with Docker, we ideally want to copy the files over in two stages. + +First, we want to copy over only what we need to install the packages. When running `--docker`, you'll find this inside `./out/json`. + +```txt +out +β”œβ”€β”€ json +β”‚ β”œβ”€β”€ apps +β”‚ β”‚ └── docs +β”‚ β”‚ └── package.json +β”‚ └── package.json +β”œβ”€β”€ full +β”‚ β”œβ”€β”€ apps +β”‚ β”‚ └── docs +β”‚ β”‚ β”œβ”€β”€ server.js +β”‚ β”‚ └── package.json +β”‚ β”œβ”€β”€ package.json +β”‚ └── turbo.json +└── package-lock.json +``` + +Afterwards, you can copy the files in `./out/full` to add the source files. + +Splitting up **dependencies** and **source files** in this way lets us **only run `npm install` when dependencies change** - giving us a much larger speedup. + +<Callout type="info"> + Without `--docker`, all pruned files are placed inside `./out`. +</Callout> + +### Example + +Our detailed [`with-docker` example](https://github.com/vercel/turbo/tree/main/examples/with-docker) goes into depth on how to utilise `prune` to its full potential. Here's the Dockerfile, copied over for convenience. + +```docker +FROM node:alpine AS builder +RUN apk add --no-cache libc6-compat +RUN apk update +# Set working directory +WORKDIR /app +RUN yarn global add turbo +COPY . . 
RUN turbo prune --scope=web --docker

# Add lockfile and package.json's of isolated subworkspace
FROM node:alpine AS installer
RUN apk add --no-cache libc6-compat
RUN apk update
WORKDIR /app

# First install the dependencies (as they change less often)
COPY .gitignore .gitignore
COPY --from=builder /app/out/json/ .
COPY --from=builder /app/out/yarn.lock ./yarn.lock
RUN yarn install

# Build the project
COPY --from=builder /app/out/full/ .
RUN yarn turbo run build --filter=web...

FROM node:alpine AS runner
WORKDIR /app

# Don't run production as root
RUN addgroup --system --gid 1001 nodejs
RUN adduser --system --uid 1001 nextjs
USER nextjs

COPY --from=installer /app/apps/web/next.config.js .
COPY --from=installer /app/apps/web/package.json .

# Automatically leverage output traces to reduce image size
# https://nextjs.org/docs/advanced-features/output-file-tracing
COPY --from=installer --chown=nextjs:nodejs /app/apps/web/.next/standalone ./
COPY --from=installer --chown=nextjs:nodejs /app/apps/web/.next/static ./apps/web/.next/static
COPY --from=installer --chown=nextjs:nodejs /app/apps/web/public ./apps/web/public

CMD node apps/web/server.js
```

## Remote caching

To take advantage of remote caches during Docker builds, you will need to make sure your build container has credentials to access your Remote Cache.

There are many ways to handle secrets in a Docker image. We will use a simple strategy here: multi-stage builds, passing secrets as build arguments that are kept out of the final image.

Assuming you are using a Dockerfile similar to the one above, we bring in some environment variables from build arguments right before `turbo build`:

```docker
ARG TURBO_TEAM
ENV TURBO_TEAM=$TURBO_TEAM

ARG TURBO_TOKEN
ENV TURBO_TOKEN=$TURBO_TOKEN

RUN yarn turbo run build --filter=web...
```

`turbo` will now be able to hit your remote cache.
To see a Turborepo cache hit for a non-cached Docker build image, run a command like this one from your project root:

```sh
docker build -f apps/web/Dockerfile . --build-arg TURBO_TEAM="your-team-name" --build-arg TURBO_TOKEN="your-token" --no-cache
```

diff --git a/docs/pages/repo/docs/handbook/dev.mdx b/docs/pages/repo/docs/handbook/dev.mdx
new file mode 100644
index 0000000..ea0d184
--- /dev/null
+++ b/docs/pages/repo/docs/handbook/dev.mdx
@@ -0,0 +1,205 @@
import { Tabs, Tab } from '../../../../components/Tabs'
import Callout from '../../../../components/Callout'

# Development tasks in a Monorepo

The vast majority of development workflows look like this:

1. Open a repository
2. Run a `dev` task while you develop
3. At the end of the day, shut down the `dev` task and close the repository

`dev` will likely be the most frequently run task in your repository, so getting it right is important.

## Types of `dev` tasks

`dev` tasks come in many shapes and sizes:

1. Running a local development server for a web app
1. Running [`nodemon`](https://www.npmjs.com/package/nodemon) to re-run a backend process every time code changes
1. Running tests in `--watch` mode

## Setup with Turborepo

You should specify your `dev` task like this in your `turbo.json`:

```json filename="turbo.json"
{
  "pipeline": {
    "dev": {
      "cache": false,
      "persistent": true
    }
  }
}
```

Since `dev` tasks don't produce outputs, `outputs` is empty. `dev` tasks are also unique in that you rarely want to [cache](/repo/docs/core-concepts/caching) them, so we set `cache` to `false`. We also set `persistent` to `true`, because `dev` tasks are long-running, and we want to ensure that they don't block any other task from executing.
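Any other long-running, watch-style task follows the same shape. As a sketch, a hypothetical `test:watch` script (one of the task types listed above) would be configured identically:

```json filename="turbo.json"
{
  "pipeline": {
    "test:watch": {
      "cache": false,
      "persistent": true
    }
  }
}
```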
### Setting up `package.json`

You should also provide a `dev` task in your root `package.json`:

```json filename="package.json"
{
  "scripts": {
    "dev": "turbo run dev"
  }
}
```

This enables developers to run the task directly from their normal task runner.

## Running tasks _before_ `dev`

In some workflows, you'll want to run tasks _before_ you run your `dev` task. For instance, generating code or running a `db:migrate` task.

In these cases, use [`dependsOn`](/repo/docs/core-concepts/monorepos/running-tasks#in-the-same-workspace) to say that any `codegen` or `db:migrate` tasks should be run _before_ `dev` is run.

```json filename="turbo.json"
{
  "pipeline": {
    "dev": {
      "dependsOn": ["codegen", "db:migrate"],
      "cache": false
    },
    "codegen": {
      "outputs": ["./codegen-outputs/**"]
    },
    "db:migrate": {
      "cache": false
    }
  }
}
```

Then, in your app's `package.json`:

```jsonc filename="apps/web/package.json"
{
  "scripts": {
    // For example, starting the Next.js dev server
    "dev": "next",
    // For example, running a custom code generation task
    "codegen": "node ./my-codegen-script.js",
    // For example, using Prisma
    "db:migrate": "prisma db push"
  }
}
```

This means that users of your `dev` task _don't need to worry about codegen or migrating their database_ - it gets handled for them before their development server even starts.

## Running `dev` only in certain workspaces

Let's assume you want to run the `dev` task in the `docs` workspace, located at `<root>/apps/docs`. `turbo` can infer the workspace from your directory, so if you run:

```bash
cd <root>/apps/docs
turbo run dev
```

`turbo` will automatically pick up that you're in the `docs` workspace and run the `dev` task.

To run the same task from any other location in the repository, use [`--filter` syntax](/repo/docs/core-concepts/monorepos/filtering).
+For example: + +```bash +turbo run dev --filter docs +``` + +## Using environment variables + +While developing, you'll often need to use environment variables. These let you customize the behavior of your program - for instance, pointing to a different `DATABASE_URL` in development and production. + +We recommend using a library called [`dotenv-cli`](https://www.npmjs.com/package/dotenv-cli) to solve this problem. + +<Callout type="info"> + We want every dev to have a great experience using Turbo. The approach documented below does **not** live up to those standards. + +We're working on a first-class solution to this problem - but while you wait, here's the next-best solution. + +</Callout> + +### Tutorial + +1. Install `dotenv-cli` in your [root workspace](/repo/docs/handbook/what-is-a-monorepo#the-root-workspace): + +<Tabs items={['npm', 'yarn', 'pnpm']} storageKey="selected-pkg-manager"> + <Tab> + ```bash + # Installs dotenv-cli in the root workspace + npm add dotenv-cli + ``` + </Tab> + <Tab> + ```bash + # Installs dotenv-cli in the root workspace + yarn add dotenv-cli --ignore-workspace-root-check + ``` + </Tab> + <Tab> + ```bash + # Installs dotenv-cli in the root workspace + pnpm add dotenv-cli --ignore-workspace-root-check + ``` + </Tab> +</Tabs> + +2. Add a `.env` file to your root workspace: + +```diff + β”œβ”€β”€ apps/ + β”œβ”€β”€ packages/ ++ β”œβ”€β”€ .env + β”œβ”€β”€ package.json + └── turbo.json +``` + +Add any environment variables you need to inject: + +```txt filename=".env" +DATABASE_URL=my-database-url +``` + +3. Inside your root `package.json`, add a `dev` script. Prefix it with `dotenv` and the `--` argument separator: + +```json +{ + "scripts": { + "dev": "dotenv -- turbo run dev" + } +} +``` + +This will extract the environment variables from `.env` before running `turbo run dev`. + +4. 
Now, you can run your dev script: + +<Tabs items={['npm', 'yarn', 'pnpm']} storageKey="selected-pkg-manager"> + <Tab> + ```bash + npm run dev + ``` + </Tab> + <Tab> + ```bash + yarn dev + ``` + </Tab> + <Tab> + ```bash + pnpm dev + ``` + </Tab> +</Tabs> + +And your environment variables will be populated! In Node.js, these are available on `process.env.DATABASE_URL`. + +<Callout> + You should also [add your environment variables](/repo/docs/core-concepts/caching#altering-caching-based-on-environment-variables) to your `turbo.json` if you're using them to build your app. +</Callout> diff --git a/docs/pages/repo/docs/handbook/environment-variables.mdx b/docs/pages/repo/docs/handbook/environment-variables.mdx new file mode 100644 index 0000000..6bb3d53 --- /dev/null +++ b/docs/pages/repo/docs/handbook/environment-variables.mdx @@ -0,0 +1,54 @@ +# Using environment variables + +We currently recommend using `dotenv-cli` as the simplest way to bring your environment variables into your development tasks. + +We're actively looking forward to improving the developer experience and ergonomics for your environment variables in a future release of Turborepo. + +## With local `turbo` + +1. Place all of your variables into the root of your monorepo as a `.env` file. + +2. Install `dotenv-cli` into the root of your repository. + +```json +{ + "devDependencies": { ++ "dotenv-cli": "latest" + } +} +``` + +3. Adjust your scripts to inject the environment variables into the `turbo` command. + +```json +{ + "scripts": { + "dev": "dotenv -- turbo dev" + } +} +``` + +## With global `turbo` + +If you're using `turbo` globally, you'll also need to install `dotenv-cli` globally so you can put `dotenv --` in front of the `turbo` command in your terminal. + +## With workspace scripts + +You may prefer to make your workspaces responsible for loading environment variables. This approach is more flexible, if you don't mind the extra configuration overhead in your `package.json` scripts. 
+
+To use this strategy:
+
+1. Place all of your variables into the root of your monorepo as a `.env` file.
+
+2. Install `dotenv-cli` in the workspace.
+
+```json
+{
+  "scripts": {
++   "dev": "dotenv -e ../../.env start-server"
+  },
+  "devDependencies": {
++   "dotenv-cli": "latest"
+  }
+}
+```
diff --git a/docs/pages/repo/docs/handbook/linting.mdx b/docs/pages/repo/docs/handbook/linting.mdx
new file mode 100644
index 0000000..13300f9
--- /dev/null
+++ b/docs/pages/repo/docs/handbook/linting.mdx
@@ -0,0 +1,50 @@
+# Linting in a monorepo
+
+Linting in a monorepo can be tricky. Most of your workspaces will likely contain code that needs to be linted - so working out the most efficient way to lint them is tough.
+
+In this guide, we'll propose a method that plays to Turborepo's strengths:
+
+- Running lint tasks _inside_ the workspaces, not from root
+- Sharing as much config as possible between workspaces
+
+## Running tasks
+
+We recommend specifying a single `lint` task inside your `turbo.json`.
+
+```json filename="turbo.json"
+{
+  "pipeline": {
+    "lint": {}
+  }
+}
+```
+
+Then, inside **each workspace that needs to be linted**, add a `lint` script. We'll use TypeScript as an example:
+
+```json filename="packages/*/package.json"
+{
+  "scripts": {
+    "lint": "tsc"
+  }
+}
+```
+
+This pattern has two benefits:
+
+- [**Parallelization**](/repo/docs/core-concepts/monorepos/running-tasks): the lint tasks will be run concurrently, speeding them up
+- [**Caching**](/repo/docs/core-concepts/caching): `lint` tasks will only be re-run on workspaces that have changed
+
+This means you can lint your entire repo using one command:
+
+```bash
+turbo run lint
+```
+
+## Sharing config files
+
+Sharing configuration across a monorepo helps keep the development experience consistent. Most linters will have a system for sharing config, or extending config across different files.
+ +So far, we've built guides for sharing config in: + +- [TypeScript](/repo/docs/handbook/linting/typescript) +- [ESLint](/repo/docs/handbook/linting/eslint) diff --git a/docs/pages/repo/docs/handbook/linting/_meta.json b/docs/pages/repo/docs/handbook/linting/_meta.json new file mode 100644 index 0000000..cd627de --- /dev/null +++ b/docs/pages/repo/docs/handbook/linting/_meta.json @@ -0,0 +1,4 @@ +{ + "typescript": "TypeScript", + "eslint": "ESLint" +} diff --git a/docs/pages/repo/docs/handbook/linting/eslint.mdx b/docs/pages/repo/docs/handbook/linting/eslint.mdx new file mode 100644 index 0000000..38b36b6 --- /dev/null +++ b/docs/pages/repo/docs/handbook/linting/eslint.mdx @@ -0,0 +1,124 @@ +import { Tabs, Tab } from "../../../../../components/Tabs"; + +# ESLint in a monorepo + +## Sharing config + +Sharing an ESLint config across workspaces can be a boon to productivity by making all your workspaces more consistent. + +Let's imagine a monorepo like this: + +``` +apps +β”œβ”€ docs +β”‚ β”œβ”€ package.json +β”‚ └─ .eslintrc.js +└─ web + β”œβ”€ package.json + └─ .eslintrc.js +packages +└─ eslint-config-custom + β”œβ”€ index.js + └─ package.json +``` + +We've got a package called `eslint-config-custom`, and two applications, each with their own `.eslintrc.js`. + +### Our `eslint-config-custom` package + +Our `eslint-config-custom` file contains only a single file, `index.js`. It looks like this. + +```js filename="packages/eslint-config-custom/index.js" +module.exports = { + extends: ["next", "turbo", "prettier"], + rules: { + "@next/next/no-html-link-for-pages": "off", + }, +}; +``` + +It's a typical ESLint config, nothing fancy. 
+ +The `package.json` looks like this: + +```json filename="packages/eslint-config-custom/package.json" +{ + "name": "eslint-config-custom", + "main": "index.js", + "version": "1.0.0", + "dependencies": { + "eslint": "latest", + "eslint-config-next": "latest", + "eslint-config-prettier": "latest", + "eslint-plugin-react": "latest", + "eslint-config-turbo": "latest" + } +} +``` + +Two things are notable here. First, the `main` field points to `index.js`. This allows files to easily [import this config](/repo/docs/handbook/sharing-code#anatomy-of-a-package). + +Secondly, the ESLint dependencies are all listed here. This is useful - it means we don't need to re-specify the dependencies inside the apps which import `eslint-config-custom`. + +### How to use the `eslint-config-custom` package + +In our `web` app, we first need to add `eslint-config-custom` as a dependency. + +<Tabs items={['npm', 'yarn', 'pnpm']} storageKey="selected-pkg-manager"> + <Tab> +```jsonc filename="apps/web/package.json" +{ + "dependencies": { + "eslint-config-custom": "*" + } +} +``` + </Tab> + <Tab> +```jsonc filename="apps/web/package.json" +{ + "dependencies": { + "eslint-config-custom": "*" + } +} +``` + </Tab> + <Tab> +```jsonc filename="apps/web/package.json" +{ + "dependencies": { + "eslint-config-custom": "workspace:*" + } +} +``` + </Tab> +</Tabs> + +We can then import the config like this: + +```js filename="apps/web/.eslintrc.js" +module.exports = { + root: true, + extends: ["custom"], +}; +``` + +By adding `custom` to our `extends` array, we're telling ESLint to look for a package called `eslint-config-custom` - and it finds our workspace. + +### Summary + +This setup ships by default when you [create a new monorepo](/repo/docs/getting-started/create-new) with `npx create-turbo@latest`. You can also look at [our basic example](https://github.com/vercel/turbo/tree/main/examples/basic) to see a working version. 
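To recap how the lookup above worked: when ESLint sees a bare name like `custom` in `extends`, it prepends `eslint-config-` before resolving the package. A toy sketch of that naming convention (illustrative only - ESLint's real resolver also handles scoped packages, plugins and file paths):

```javascript
// How a bare `extends` entry maps to a package name.
// Names that already carry the prefix, or that are file paths,
// are left alone.
function toConfigPackage(name) {
  if (
    name.startsWith("eslint-config-") ||
    name.startsWith(".") ||
    name.startsWith("/")
  ) {
    return name;
  }
  return `eslint-config-${name}`;
}

console.log(toConfigPackage("custom")); // eslint-config-custom
console.log(toConfigPackage("./local-config.js")); // ./local-config.js
```

This is why naming the workspace `eslint-config-custom` (rather than just `custom`) matters: the package name must match what ESLint expands the `extends` entry to.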
+ +## Setting up a `lint` task + +We recommend following the setup in the [`basics`](/repo/docs/handbook/linting#running-tasks) section, with one alteration. + +Each `package.json` script should look like this: + +```json filename="packages/*/package.json" +{ + "scripts": { + "lint": "eslint" + } +} +``` diff --git a/docs/pages/repo/docs/handbook/linting/typescript.mdx b/docs/pages/repo/docs/handbook/linting/typescript.mdx new file mode 100644 index 0000000..f340f4e --- /dev/null +++ b/docs/pages/repo/docs/handbook/linting/typescript.mdx @@ -0,0 +1,129 @@ +import { Tabs, Tab } from "../../../../../components/Tabs"; + +# TypeScript in a monorepo + +You can use TypeScript in a monorepo in one of two ways - as a linter, or as a build tool. + +In this section, we'll discuss TypeScript's role as a linter. This is when you prevent TypeScript emitting files (with [`noEmit`](https://www.typescriptlang.org/tsconfig/noEmit.html)) and instead use it _only_ to check your source code's types. + +## Sharing `tsconfig.json` + +We can share TypeScript config files across our repository with a clever solution. We can put our _base_ `tsconfig.json` files in a single workspace, and `extend` them from the `tsconfig.json` files in our apps. + +Let's imagine a workspace like this: + +``` +apps +β”œβ”€ docs +β”‚ β”œβ”€ package.json +β”‚ β”œβ”€ tsconfig.json +β”œβ”€ web +β”‚ β”œβ”€ package.json +β”‚ β”œβ”€ tsconfig.json +packages +β”œβ”€ tsconfig +β”‚ β”œβ”€ base.json +β”‚ β”œβ”€ nextjs.json +β”‚ β”œβ”€ package.json +β”‚ β”œβ”€ react-library.json +``` + +### Our `tsconfig` package + +Inside `packages/tsconfig`, we have a few `json` files which represent different ways you might want to configure TypeScript. 
They each look like this: + +```json filename="packages/tsconfig/base.json" +{ + "$schema": "https://json.schemastore.org/tsconfig", + "display": "Default", + "compilerOptions": { + "composite": false, + "declaration": true, + "declarationMap": true, + "esModuleInterop": true, + "forceConsistentCasingInFileNames": true, + "inlineSources": false, + "isolatedModules": true, + "moduleResolution": "node", + "noUnusedLocals": false, + "noUnusedParameters": false, + "preserveWatchOutput": true, + "skipLibCheck": true, + "strict": true + }, + "exclude": ["node_modules"] +} +``` + +Inside `package.json`, we simply name our package: + +```json filename="packages/tsconfig/package.json" +{ + "name": "tsconfig" +} +``` + +The other `json` files in the repository can be accessed via a simple import: + +```ts +import baseJson from "tsconfig/base.json"; +import nextjsJson from "tsconfig/nextjs.json"; +import reactLibraryJson from "tsconfig/react-library.json"; +``` + +This lets us export different config settings for different types of projects. + +### How to use the `tsconfig` package + +Each app/package which uses our shared `tsconfig` must first specify it as a dependency: + +<Tabs items={['npm', 'yarn', 'pnpm']} storageKey="selected-pkg-manager"> + <Tab> +```jsonc filename="apps/web/package.json" +{ + "dependencies": { + "tsconfig": "*" + } +} +``` + </Tab> + <Tab> +```jsonc filename="apps/web/package.json" +{ + "dependencies": { + "tsconfig": "*" + } +} +``` + </Tab> + <Tab> +```jsonc filename="apps/web/package.json" +{ + "dependencies": { + "tsconfig": "workspace:*" + } +} +``` + </Tab> +</Tabs> + +Then, they can **extend it inside their own `tsconfig.json`**: + +```json filename="apps/web/tsconfig.json" +{ + // We extend it from here! 
+ "extends": "tsconfig/nextjs.json", + + // You can specify your own include/exclude + "include": ["next-env.d.ts", "**/*.ts", "**/*.tsx"], + "exclude": ["node_modules"] +} +``` + +### Summary + +This setup ships by default when you [create a new monorepo](/repo/docs/getting-started/create-new) with `npx create-turbo@latest`. You can also look at [our basic example](https://github.com/vercel/turbo/tree/main/examples/basic) to see a working version. + +## Running tasks + +We recommend following the setup in the [`basics`](/repo/docs/handbook/linting#running-tasks) section. diff --git a/docs/pages/repo/docs/handbook/migrating-to-a-monorepo.mdx b/docs/pages/repo/docs/handbook/migrating-to-a-monorepo.mdx new file mode 100644 index 0000000..912a1be --- /dev/null +++ b/docs/pages/repo/docs/handbook/migrating-to-a-monorepo.mdx @@ -0,0 +1,77 @@ +import Callout from "../../../../components/Callout"; + +# Migrating to a Monorepo + +Migrating from a multi-repo setup to a monorepo setup can bring enormous benefits for productivity, especially if: + +- You're finding it hard to share code between applications +- You want a unified approach on how your team builds code + +It can be daunting to move to a monorepo. But with careful planning, it can go pretty smoothly. + +## Folder structure + +Let's imagine your multi-repo setup looks like this: + +``` +web (repo 1) +β”œβ”€ package.json + +docs (repo 2) +β”œβ”€ package.json + +app (repo 3) +β”œβ”€ package.json +``` + +You've got three repositories, `web`, `docs` and `app`. They don't have any shared dependencies, but you've noticed _lots_ of duplicated code between them. 
+ +The best way to organise them in a monorepo would be like so: + +``` +my-monorepo +β”œβ”€ apps +β”‚ β”œβ”€ app +β”‚ β”‚ └─ package.json +β”‚ β”œβ”€ docs +β”‚ β”‚ └─ package.json +β”‚ └─ web +β”‚ └─ package.json +└─ package.json +``` + +To start sharing code, you could use the [internal package](/repo/docs/handbook/sharing-code/internal-packages) pattern, resulting in a new `packages` folder: + +``` +my-monorepo +β”œβ”€ apps +β”‚ β”œβ”€ app +β”‚ β”‚ └─ package.json +β”‚ β”œβ”€ docs +β”‚ β”‚ └─ package.json +β”‚ └─ web +β”‚ └─ package.json +β”œβ”€ packages +β”‚ └─ shared +β”‚ └─ package.json +└─ package.json +``` + +<Callout type="idea"> + If you're planning to move to a monorepo, try sketching out the exact folder + structure you're aiming for. +</Callout> + +## Setting up workspaces + +Once your apps are in the right folder structure, you'll need to set up workspaces and install your dependencies. Our sections on [setting up workspaces](/repo/docs/handbook/workspaces) should help. + +## Handling tasks + +Now your workspaces are set up, you'll need to figure out how you're going to run your tasks in your new monorepo. We've got sections on: + +1. How to [configure tasks](/repo/docs/core-concepts/monorepos/running-tasks) with Turborepo. +1. How to set up your [development tasks](/repo/docs/handbook/dev) +1. How to set up [linting](/repo/docs/handbook/linting) +1. How to [build your application](/repo/docs/handbook/building-your-app) +1. 
How to set up [testing](/repo/docs/handbook/testing) diff --git a/docs/pages/repo/docs/handbook/package-installation.mdx b/docs/pages/repo/docs/handbook/package-installation.mdx new file mode 100644 index 0000000..cb67afc --- /dev/null +++ b/docs/pages/repo/docs/handbook/package-installation.mdx @@ -0,0 +1,157 @@ +import Callout from "../../../../components/Callout"; +import HeartIcon from "@heroicons/react/solid/HeartIcon"; +import { Tabs, Tab } from "../../../../components/Tabs"; + +# Package Installation + +A package manager (like `npm`) handles two things for you: [managing workspaces](/repo/docs/handbook/workspaces) and installing packages. + +Turborepo is compatible with four package managers: + +- [npm](https://docs.npmjs.com/cli/v8/using-npm/workspaces/#description) +- [pnpm](https://pnpm.io/workspaces) +- [Yarn 1](https://classic.yarnpkg.com/lang/en/docs/workspaces/) +- Yarn >=2 (docs coming soon) + +You should use whichever you feel most comfortable with - but **if you're a monorepo beginner, we recommend npm**. + +If you're **comfortable with monorepos, we recommend pnpm**. It's faster and offers some useful CLI options like `--filter`. + +## Installing packages + +When you first clone or create your monorepo, you'll need to: + +1. Make sure you're in the root directory of your monorepo +2. Run the install command: + +<Tabs items={["npm", "yarn", "pnpm"]} storageKey="selected-pkg-manager"> + <Tab> + + ```bash + npm install + ``` + + </Tab> + <Tab> + + ```bash + yarn install + ``` + + </Tab> + <Tab> + + ```bash + pnpm install + ``` + + </Tab> +</Tabs> + +You'll now see `node_modules` folders appear in the root of your repository, and in each workspace. 
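For that single root install to pick up every workspace, the root `package.json` needs to declare where workspaces live (this applies to npm and Yarn; pnpm reads `pnpm-workspace.yaml` instead). A minimal sketch - the glob paths are assumptions based on the common `apps`/`packages` layout:

```json filename="package.json"
{
  "name": "my-monorepo",
  "private": true,
  "workspaces": ["apps/*", "packages/*"]
}
```

With this in place, one install from root links all workspace dependencies together.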
+ +## Adding/removing/upgrading packages + +You can add, remove and upgrade packages from within your monorepo using your package manager's built-in commands: + +<Tabs items={['npm', 'yarn', 'pnpm']} storageKey="selected-pkg-manager"> + <Tab> + + **Install a package in a workspace** + ```bash + npm install <package> --workspace=<workspace> + ``` + + Example: + ```bash + npm install react --workspace=web + ``` + + **Remove a package from a workspace** + ```bash + npm uninstall <package> --workspace=<workspace> + ``` + + Example: + ```bash + npm uninstall react --workspace=web + ``` + + **Upgrade a package in a workspace** + ```bash + npm update <package> --workspace=<workspace> + ``` + + Example: + ```bash + npm update react --workspace=web + ``` + + </Tab> + <Tab> + + **Install a package in a workspace** + ```bash + yarn workspace <workspace> add <package> + ``` + + Example: + ```bash + yarn workspace web add react + ``` + + **Remove a package from a workspace** + ```bash + yarn workspace <workspace> remove <package> + ``` + + Example: + ```bash + yarn workspace web remove react + ``` + + **Upgrade a package in a workspace** + ```bash + yarn workspace <workspace> upgrade <package> + ``` + + Example: + ```bash + yarn workspace web upgrade react + ``` + + </Tab> + <Tab> + + **Install a package in a workspace** + ```bash + pnpm add <package> --filter <workspace> + ``` + + Example: + ```bash + pnpm add react --filter web + ``` + + **Remove a package from a workspace** + ```bash + pnpm uninstall <package> --filter <workspace> + ``` + + Example: + ```bash + pnpm uninstall react --filter web + ``` + + **Upgrade a package in a workspace** + ```bash + pnpm update <package> --filter <workspace> + ``` + + Example: + ```bash + pnpm update react --filter web + ``` + + </Tab> +</Tabs> diff --git a/docs/pages/repo/docs/handbook/publishing-packages.mdx b/docs/pages/repo/docs/handbook/publishing-packages.mdx new file mode 100644 index 0000000..bb6e183 --- /dev/null +++ 
b/docs/pages/repo/docs/handbook/publishing-packages.mdx @@ -0,0 +1,15 @@ +import Callout from "../../../../components/Callout"; + +# Publishing Packages + +Publishing a package to `npm` from a monorepo can be an extremely satisfying and smooth experience, with the right tools. + +You should follow this setup if you want to publish some of your monorepo's workspaces to `npm` as packages. If you don't need to publish to `npm`, you should use an [internal package](/repo/docs/handbook/sharing-code/internal-packages) instead. They're much easier to set up and use. + +## Tools + +You'll need to set up a few tools to get it working: + +First, you'll need a [bundler](/repo/docs/handbook/publishing-packages/bundling) to turn your code into [`CommonJS`](https://en.wikipedia.org/wiki/CommonJS), the most commonly used format on `npm`. You'll also need to set up a dev script so that you can work on the workspace in local development. + +Finally, you'll need a tool for [publishing & versioning](/repo/docs/handbook/publishing-packages/versioning-and-publishing). This will handle both bumping your monorepo's package versions _and_ publishing to npm. 
diff --git a/docs/pages/repo/docs/handbook/publishing-packages/_meta.json b/docs/pages/repo/docs/handbook/publishing-packages/_meta.json
new file mode 100644
index 0000000..0c946ab
--- /dev/null
+++ b/docs/pages/repo/docs/handbook/publishing-packages/_meta.json
@@ -0,0 +1,4 @@
+{
+  "bundling": "Bundling",
+  "versioning-and-publishing": "Versioning and Publishing"
+}
diff --git a/docs/pages/repo/docs/handbook/publishing-packages/bundling.mdx b/docs/pages/repo/docs/handbook/publishing-packages/bundling.mdx
new file mode 100644
index 0000000..df3d5c7
--- /dev/null
+++ b/docs/pages/repo/docs/handbook/publishing-packages/bundling.mdx
@@ -0,0 +1,130 @@
+import Callout from "../../../../../components/Callout";
+import { Tabs, Tab } from "../../../../../components/Tabs";
+
+# Bundling packages in a Monorepo
+
+Unlike [internal](/repo/docs/handbook/sharing-code/internal-packages) packages, external packages can be deployed to npm _and_ used locally. In this guide, we'll be using a bundler to bundle a package to [`CommonJS`](https://en.wikipedia.org/wiki/CommonJS), the most commonly used format on npm.
+
+## Setting up a build script
+
+Let's start with a package created using our [internal packages](/repo/docs/handbook/sharing-code/internal-packages) tutorial.
+
+There, we created a `math-helpers` package which contained a few helper functions for adding and subtracting. We've decided that this package is good enough for npm, so we're going to bundle it.
+
+At the end of that tutorial, we had a package set up under `/packages`, which looked like this:
+
+```
+β”œβ”€β”€ apps
+β”‚   └── web
+β”‚       └── package.json
+β”œβ”€β”€ packages
+β”‚   └── math-helpers
+β”‚       β”œβ”€β”€ src
+β”‚       β”‚   └── index.ts
+β”‚       β”œβ”€β”€ tsconfig.json
+β”‚       └── package.json
+β”œβ”€β”€ package.json
+└── turbo.json
+```
+
+We're going to add a `build` script to `math-helpers`, using a bundler. If you're unsure which one to choose, we recommend [`tsup`](https://tsup.egoist.dev/).
+
+<Tabs items={['tsup']} storageKey="selected-bundler">
+  <Tab>
+
+First, install `tsup` inside `packages/math-helpers` using your [package manager](/repo/docs/handbook/package-installation). Then, add a `build` script:
+
+```json filename="packages/math-helpers/package.json"
+{
+  "scripts": {
+    "build": "tsup src/index.ts --format cjs --dts"
+  }
+}
+```
+
+`tsup` outputs files to the `dist` directory by default, so you should:
+
+1. Add `dist` to your `.gitignore` files to make sure they aren't committed by `git`.
+2. Add `dist` to the outputs of `build` in your `turbo.json`.
+
+```json filename="turbo.json"
+{
+  "pipeline": {
+    "build": {
+      "outputs": ["dist/**"]
+    }
+  }
+}
+```
+
+That way, when `tsup` is run the outputs can be [cached](/repo/docs/core-concepts/caching) by Turborepo.
+
+Finally, we should change `main` to point at `./dist/index.js` inside `package.json`. `types` can point at `./dist/index.d.ts`:
+
+```json filename="packages/math-helpers/package.json"
+{
+  "main": "./dist/index.js",
+  "types": "./dist/index.d.ts"
+}
+```
+
+<Callout>
+
+If you run into errors when using `main` and `types`, take a look at the [tsup docs](https://tsup.egoist.dev/#bundle-formats).
+
+Bundling is a complicated topic, and we don't have space here to cover everything!
+
+</Callout>
+
+  </Tab>
+</Tabs>
+
+### Building our package before our app
+
+Before we can run `turbo run build`, there's one thing we need to consider. We've just added a [task dependency](/repo/docs/core-concepts/monorepos/running-tasks) into our monorepo. The `build` of `packages/math-helpers` needs to run **before** the `build` of `apps/web`.
+
+Fortunately, we can use `dependsOn` to easily configure this.
+
+```json filename="turbo.json"
+{
+  "pipeline": {
+    "build": {
+      "dependsOn": [
+        // Run builds in workspaces I depend on first
+        "^build"
+      ]
+    }
+  }
+}
```
+
+Now, we can run `turbo run build`, and it'll automatically build our packages _before_ it builds our app.
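The `^build` entry orders tasks by walking the workspace dependency graph - effectively a topological sort. A toy illustration of the idea (not Turborepo's actual implementation; the graph shape and workspace names are assumptions):

```javascript
// Each key is a workspace; its array lists the workspaces it
// depends on. Dependencies are visited (and so built) first.
function buildOrder(graph) {
  const order = [];
  const visited = new Set();

  function visit(pkg) {
    if (visited.has(pkg)) return;
    visited.add(pkg);
    for (const dep of graph[pkg] ?? []) visit(dep);
    order.push(pkg);
  }

  Object.keys(graph).forEach(visit);
  return order;
}

// `web` depends on `math-helpers`, so `math-helpers` builds first.
console.log(buildOrder({ web: ["math-helpers"], "math-helpers": [] }));
// β†’ [ 'math-helpers', 'web' ]
```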
+
+## Setting up a dev script
+
+There's a small issue with our setup. We are building our package just fine, but it's not working great in dev. Changes that we make to our `math-helpers` package aren't being reflected in our app.
+
+That's because we don't have a `dev` script to rebuild our packages while we're working. We can add one easily:
+
+<Tabs items={['tsup']} storageKey="selected-bundler">
+  <Tab>
+
+```json filename="packages/math-helpers/package.json"
+{
+  "scripts": {
+    "build": "tsup src/index.ts --format cjs --dts",
+    "dev": "npm run build -- --watch"
+  }
+}
+```
+
+This passes the `--watch` flag to `tsup`, meaning it will watch for file changes.
+
+  </Tab>
+</Tabs>
+
+If we've already set up [dev scripts](/repo/docs/handbook/dev) in our `turbo.json`, running `turbo run dev` will run our `packages/math-helpers` dev task in parallel with our `apps/web` dev task.
+
+## Summary
+
+Our package is now in a spot where we can consider deploying to npm. In our [versioning and publishing](/repo/docs/handbook/publishing-packages/versioning-and-publishing) section, we'll do just that.
diff --git a/docs/pages/repo/docs/handbook/publishing-packages/versioning-and-publishing.mdx b/docs/pages/repo/docs/handbook/publishing-packages/versioning-and-publishing.mdx
new file mode 100644
index 0000000..ae913d2
--- /dev/null
+++ b/docs/pages/repo/docs/handbook/publishing-packages/versioning-and-publishing.mdx
@@ -0,0 +1,57 @@
+import Callout from "../../../../../components/Callout";
+
+# Versioning and Publishing Packages in a Monorepo
+
+Manually versioning and publishing packages in a monorepo can be extremely tiresome. Luckily, there's a tool that makes things easy - the [Changesets](https://github.com/changesets/changesets) CLI.
+
+We recommend Changesets because it's intuitive to use, and - just like Turborepo - fits with the monorepo tools you're already used to.
+
+Some alternatives are:
+
+- [intuit/auto](https://github.com/intuit/auto) - Generate releases based on semantic version labels on pull requests
+- [microsoft/beachball](https://github.com/microsoft/beachball) - The Sunniest Semantic Version Bumper
+
+## Understanding Changesets
+
+We recommend taking a look at the Changesets docs. Here's our recommended reading order:
+
+1. [Why use changesets?](https://github.com/changesets/changesets/blob/main/docs/intro-to-using-changesets.md) - an intro that takes you through the fundamentals.
+1. [Installation instructions](https://github.com/changesets/changesets/blob/main/packages/cli/README.md)
+1. If you're using GitHub, consider using the [Changeset GitHub bot](https://github.com/apps/changeset-bot) - a bot to nudge you to add changesets to PRs.
+1. You should also consider adding the [Changesets GitHub action](https://github.com/changesets/action) - a tool which makes publishing extremely easy.
+
+## Using Changesets with Turborepo
+
+Once you've started using Changesets, you'll gain access to three useful commands:
+
+```bash
+# Add a new changeset
+changeset
+
+# Create new versions of packages
+changeset version
+
+# Publish all changed packages to npm
+changeset publish
+```
+
+Linking your publishing flow into Turborepo can make organising your deploy a lot simpler and faster.
+
+Our recommendation is to add a `publish-packages` script into your root `package.json`:
+
+```json filename="package.json"
+{
+  "scripts": {
+    // Include build, lint, test - all the things you need to run
+    // before publishing
+    "publish-packages": "turbo run build lint test && changeset version && changeset publish"
+  }
+}
+```
+
+<Callout>
+  We recommend `publish-packages` so that it doesn't conflict with npm's
+  built-in `publish` script.
+</Callout>
+
+This means that when you run `publish-packages`, your monorepo gets built, linted, tested and published - and you benefit from all of Turborepo's speedups.
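For reference, running `changeset` writes a small markdown file into a `.changeset/` folder, which `changeset version` later consumes to bump versions and write changelogs. A sketch of what one looks like - the package name and message here are made up, and the real filename is auto-generated:

```
---
"math-helpers": patch
---

Fix an edge case in subtract()
```

Committing these files alongside your code changes is what lets the publish step know which packages to release.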
diff --git a/docs/pages/repo/docs/handbook/sharing-code.mdx b/docs/pages/repo/docs/handbook/sharing-code.mdx
new file mode 100644
index 0000000..9cdf783
--- /dev/null
+++ b/docs/pages/repo/docs/handbook/sharing-code.mdx
@@ -0,0 +1,71 @@
+import Callout from "../../../../components/Callout";
+
+# Sharing Code in a monorepo
+
+Monorepos let you share code across applications without friction. To do that, you'll be building **packages** to share code between your apps.
+
+## What is a package?
+
+The word 'package' has a double meaning when it comes to monorepos. It can refer to either of these:
+
+1. A set of files you download from a registry into your `node_modules`, via a package manager like `npm`.
+2. A workspace containing code that can be shared between applications - by convention usually inside `/packages`.
+
+This dual meaning can be very confusing to folks new to the monorepo scene. You're likely very familiar with [package installation](/repo/docs/handbook/package-installation), but not so familiar with [workspaces](/repo/docs/handbook/workspaces).
+
+The fact is that they're very similar. A package is just a piece of shared code. Except that _installed packages_ live in your `node_modules`, and _local packages_ live in a workspace - likely in your `/packages` folder.
+
+## Anatomy of a package
+
+Each package contains a `package.json`. You're likely familiar with using these to manage dependencies and scripts in your applications.
+
+However, you may not have noticed the `main` and `name` fields before:
+
+```jsonc filename="packages/my-lib/package.json"
+{
+  // The name of your package
+  "name": "my-lib",
+
+  // When this package is used, this file is actually
+  // the thing that gets imported
+  "main": "./index.js"
+}
+```
+
+Both of these fields are important for deciding **how this package behaves when it's imported**. For instance, if `index.js` has some exports:
+
+```js filename="packages/my-lib/index.js"
+export const myFunc = () => {
+  console.log("Hello!");
+};
+```
+
+And we import this file into one of our apps:
+
+```ts filename="apps/web/pages/index.jsx"
+import { myFunc } from "my-lib";
+
+myFunc(); // Hello!
+```
+
+Then we'll be able to use the code from the `my-lib` folder inside our applications.
+
+To summarize, **each package must have a `name` and a `main`** declared inside its `package.json`.
+
+<Callout type="info">
+
+Package resolution in `package.json` is a very complicated topic, and we can't do justice to it here. Other fields in your `package.json` may take precedence over `main` depending on how the package is being imported.
+
+Check the [npm docs](https://docs.npmjs.com/cli/v8/configuring-npm/package-json/#main) for a guide.
+
+For our purposes, using `main` will be good enough.
+
+</Callout>
+
+## Next steps
+
+We're going to introduce two styles of packages - **internal** packages and **external** packages:
+
+[**Internal** packages](/repo/docs/handbook/sharing-code/internal-packages) are intended to only be used inside the monorepo where they're housed. They are relatively simple to set up, and if your project is closed source they will be the most useful to you.
+
+[**External** packages](/repo/docs/handbook/publishing-packages) are bundled and sent to a package registry. This is useful for design systems, shared utility libraries or any open source work. However, they introduce more complexity around bundling, versioning and publishing.
diff --git a/docs/pages/repo/docs/handbook/sharing-code/_meta.json b/docs/pages/repo/docs/handbook/sharing-code/_meta.json new file mode 100644 index 0000000..b1305a2 --- /dev/null +++ b/docs/pages/repo/docs/handbook/sharing-code/_meta.json @@ -0,0 +1,3 @@ +{ + "internal-packages": "Internal Packages" +} diff --git a/docs/pages/repo/docs/handbook/sharing-code/internal-packages.mdx b/docs/pages/repo/docs/handbook/sharing-code/internal-packages.mdx new file mode 100644 index 0000000..29efd56 --- /dev/null +++ b/docs/pages/repo/docs/handbook/sharing-code/internal-packages.mdx @@ -0,0 +1,237 @@ +import Callout from "../../../../../components/Callout"; + +import { Tabs, Tab } from "../../../../../components/Tabs"; + +# Internal Packages + +Internal packages are [packages](/repo/docs/handbook/sharing-code) which are only intended to be used _inside_ your monorepo. They're extremely useful for sharing code between apps in closed-source monorepos. + +Internal packages are quick to create, and can be turned into [external packages](/repo/docs/handbook/publishing-packages) if you end up wanting to publish them to `npm`. + +## What makes a package _internal?_ + +External packages **run their files through a bundler** before putting them on a package registry. This means they need a _lot_ of tooling to handle. + +- **Bundlers**: to build the package +- **Versioning**: to help with versioning and releases +- **Publishing**: to publish the package + +If you want to use those files locally, you'll also need: + +- **Dev scripts**: for bundling the package locally when files change + +Because internal packages are not published, we can skip _all_ of these steps. Instead of bundling our package ourselves, we're going to make the **app which imports the package bundle it for us**. + +This sounds complex, but it's extremely easy to set up. + +## Our first internal package + +We're going to create a shared `math-helpers` package inside our monorepo. + +### 1. 
Create your monorepo + +If you don't have an existing monorepo, create one [using our guide](/repo/docs/getting-started/create-new). + +### 2. Create a new package + +Inside `/packages`, create a new folder called `math-helpers`. + +```bash +mkdir packages/math-helpers +``` + +Create a `package.json`: + +```json filename="packages/math-helpers/package.json" +{ + "name": "math-helpers", + "dependencies": { + // Use whatever version of TypeScript you're using + "typescript": "latest" + } +} +``` + +Create a `src` folder, and add a TypeScript file at `packages/math-helpers/src/index.ts`. + +```ts filename="packages/math-helpers/src/index.ts" +export const add = (a: number, b: number) => { + return a + b; +}; + +export const subtract = (a: number, b: number) => { + return a - b; +}; +``` + +You'll also need to add a `tsconfig.json` at `packages/math-helpers/tsconfig.json`: + +```json filename="packages/math-helpers/tsconfig.json" +{ + "compilerOptions": { + "esModuleInterop": true, + "forceConsistentCasingInFileNames": true, + "isolatedModules": true, + "moduleResolution": "node", + "preserveWatchOutput": true, + "skipLibCheck": true, + "noEmit": true, + "strict": true + }, + "exclude": ["node_modules"] +} +``` + +Great! We've now got all the files we need for our internal package. + +### 3. Import the package + +We're now going to import the package and see what happens. 
Go into one of your apps, and add `math-helpers` into the dependencies of its package.json: + +<Tabs items={['npm', 'yarn', 'pnpm']} storageKey="selected-pkg-manager"> + <Tab> +```jsonc filename="apps/web/package.json" +{ + "dependencies": { + "math-helpers": "*" + } +} +``` + </Tab> + <Tab> +```jsonc filename="apps/web/package.json" +{ + "dependencies": { + "math-helpers": "*" + } +} +``` + </Tab> + <Tab> +```jsonc filename="apps/web/package.json" +{ + "dependencies": { + "math-helpers": "workspace:*" + } +} +``` + </Tab> +</Tabs> + +[Install all packages from root](/repo/docs/handbook/package-installation) to ensure that dependency works. + +Now add an import from `math-helpers` into one of your app's source files: + +```diff ++ import { add } from "math-helpers"; + ++ add(1, 2); +``` + +You'll likely see an error! + +``` +Cannot find module 'math-helpers' or its corresponding type declarations. +``` + +That's because we've missed a step. We haven't told our `math-helpers/package.json` what the entry point to our package is. + +### 4. Fix `main` and `types` + +Go back to `packages/math-helpers/package.json` and add two fields, `main` and `types`: + +```json filename="packages/math-helpers/package.json" +{ + "name": "math-helpers", + "main": "src/index.ts", + "types": "src/index.ts", + "dependencies": { + "typescript": "latest" + } +} +``` + +Now, anything that imports our `math-helpers` module will be pointed directly towards the `src/index.ts` file - _that's_ the file that they will import. + +Go back to `apps/web/pages/index.tsx`. The error should be gone! + +### 5. Try running the app + +Now, try running that app's dev script. 
In the default Turborepo setup, this will be as easy as: + +```bash +turbo dev +``` + +<Tabs items={['Next.js', 'Vite']} storageKey="selected-framework"> + <Tab> + +When it starts running, you'll likely see an error in your web browser: + +``` +../../packages/math-helpers/src/index.ts +Module parse failed: Unexpected token (1:21) +You may need an appropriate loader to handle this file type, +currently no loaders are configured to process this file. +See https://webpack.js.org/concepts#loaders + +> export const add = (a: number, b: number) => { +| return a + b; +| }; +``` + +This is what happens when you try to import an unbundled file into a Next.js app. + +The fix is simple - we need to tell Next.js to bundle the files from certain packages it imports. + + </Tab> + <Tab> + Because `vite` transpiles modules by default, there's no more setup needed! Skip to step 7. + </Tab> +</Tabs> + +### 6. Configuring your app + +<Tabs items={['Next.js', 'Vite']} storageKey="selected-framework"> + <Tab> + + We can do that using `transpilePackages` in `next.config.js` (requires v13.1+): + + ```js filename="apps/web/next.config.js" + /** @type {import('next').NextConfig} */ + const nextConfig = { + transpilePackages: ['math-helpers'], + }; + + module.exports = nextConfig; + ``` + + Restart your dev script, and go to the browser. + + **The error has disappeared!** + + </Tab> + <Tab> + No configuration is needed! + </Tab> +</Tabs> + +### 7. Summary + +We are now able to add any amount of code into our `math-helpers` package, and use it in any app in our monorepo. We don't even need to build our package - it just works. + +This pattern is extremely powerful for creating pieces of code that can be easily shared between teams. + +### Quick Reference + +#### Quick reference - creating a new internal package + +1. Create a new folder in `packages/<folder>` +1. Add a `package.json`, with `main` and `types` pointing at `src/index.ts` (or `src/index.tsx`) +1.
Add `src/index.tsx`, with at least one named export +1. [Install your packages](/repo/docs/handbook/package-installation) from root + +#### Quick reference - importing an internal package + +1. Ensure that you're [importing it correctly](/repo/docs/handbook/sharing-code/internal-packages#3-import-the-package) +2. Ensure that you've [configured your app to transpile it](/repo/docs/handbook/sharing-code/internal-packages#6-configuring-your-app) diff --git a/docs/pages/repo/docs/handbook/testing.mdx b/docs/pages/repo/docs/handbook/testing.mdx new file mode 100644 index 0000000..a63ca72 --- /dev/null +++ b/docs/pages/repo/docs/handbook/testing.mdx @@ -0,0 +1,109 @@ +import { Tabs, Tab } from "../../../../components/Tabs"; +import Callout from "../../../../components/Callout"; + +# Testing in a Monorepo + +Along with linting and building, testing is a crucial part of a production-ready monorepo. Whether you're using end-to-end tests or a unit test suite, integrating them with Turborepo will lead to enormous speed-ups. + +## Working with test runners + +Let's say we have a monorepo that looks like this: + +``` +β”œβ”€β”€ apps +β”‚ └── web +β”‚ └── package.json +└── packages + └── shared + └── package.json +``` + +Both `apps/web` and `packages/shared` have their own test suite. Their `package.json` files look like this: + +<Tabs items={["Jest", "Vitest"]} storageKey="selected-test-runner"> + <Tab> +```json filename="apps/web/package.json" +{ + "scripts": { + "test": "jest" + } +} +``` + </Tab> + <Tab> +```json filename="apps/web/package.json" +{ + "scripts": { + "test": "vitest run" + } +} +``` + </Tab> +</Tabs> + +Inside the root `turbo.json`, we recommend setting up a `test` task in your [pipeline](/repo/docs/core-concepts/monorepos/running-tasks): + +```json filename="turbo.json" +{ + "pipeline": { + "test": {} + } +} +``` + +Now, you can run `turbo test` and have Turborepo test the entire repository. 
+ +Because of Turborepo's [caching](/repo/docs/core-concepts/caching), this also means that only workspaces that have changed files will be tested - resulting in a lot of time saved. + +## Running tests in watch mode + +When you run your test suite normally, it completes and outputs to `stdout`. This means you can [cache it](/repo/docs/core-concepts/caching) with Turborepo. + +But when you run your tests in watch mode, the process never exits. This makes a watch task more like a [development task](/repo/docs/handbook/dev). + +Because of this difference, we recommend specifying **two separate Turborepo tasks**: one for running your tests, and one for running them in watch mode. + +Here's an example: + +<Tabs items={["Jest", "Vitest"]} storageKey="selected-test-runner"> + <Tab> +```json filename="apps/web/package.json" +{ + "scripts": { + "test": "jest", + "test:watch": "jest --watch" + } +} +``` + </Tab> + <Tab> +```json filename="apps/web/package.json" +{ + "scripts": { + "test": "vitest run", + "test:watch": "vitest" + } +} +``` + </Tab> +</Tabs> + +```json filename="turbo.json" +{ + "pipeline": { + "test": {}, + "test:watch": { + "cache": false + } + } +} +``` + +```json filename="package.json" +{ + "scripts": { + "test": "turbo run test", + "test:watch": "turbo run test:watch" + } +} +``` diff --git a/docs/pages/repo/docs/handbook/tools/_meta.json b/docs/pages/repo/docs/handbook/tools/_meta.json new file mode 100644 index 0000000..2a0e55f --- /dev/null +++ b/docs/pages/repo/docs/handbook/tools/_meta.json @@ -0,0 +1,4 @@ +{ + "prisma": "Prisma", + "storybook": "Storybook" +} diff --git a/docs/pages/repo/docs/handbook/tools/prisma.mdx b/docs/pages/repo/docs/handbook/tools/prisma.mdx new file mode 100644 index 0000000..11e589d --- /dev/null +++ b/docs/pages/repo/docs/handbook/tools/prisma.mdx @@ -0,0 +1,207 @@ +import { Tabs, Tab } from '../../../../../components/Tabs' +import Callout from '../../../../../components/Callout' + +# Using Prisma with Turborepo + 
+[Prisma](https://www.prisma.io/) is an extremely popular ORM with automated migrations, type safety and integrated tooling. Using it with Turborepo can cut the time you spend generating code, and help ensure your generated Prisma code is always up-to-date. + +## Guide + +This guide shows you how to: + +1. Set up Prisma in a monorepo +2. Handle migration and code generation scripts +3. Cache those scripts with Turborepo +4. Ensure that they're always run whenever `dev` or `build` is run + +If you've already got Prisma set up and connected to your database, you can skip to [step 4](#4-setting-up-the-scripts). + +### 1. Create your monorepo + +If you don't have an existing project, use our [quickstart](/repo/docs/getting-started/create-new) to create a new monorepo. + +### 2. Add a new `database` package + +Create a new folder called `database` inside `packages` with a `package.json` inside: + +```json filename="packages/database/package.json" +{ + "name": "database", + "dependencies": { + "@prisma/client": "latest" + }, + "devDependencies": { + // Replace "latest" with the latest version + "prisma": "latest" + } +} +``` + +If you're using `pnpm`, you should add a file at the root called `.npmrc`: + +```txt filename=".npmrc" +public-hoist-pattern[]=*prisma* +``` + +Run your package manager's install step to install the new dependencies. + +### 3. Run `prisma init` + +`cd` into `packages/database`: + +```bash +cd packages/database +``` + +Run `npx prisma init`. + +This should create several files inside `packages/database`: + +``` +prisma/schema.prisma +.gitignore +.env +``` + +- `schema.prisma` is where your [Prisma schema](https://www.prisma.io/docs/concepts/components/prisma-schema) lives. Here, you'll be able to modify the shape of your database. +- `.gitignore` specifies files for git to ignore +- `.env` lets you manually specify your `DATABASE_URL` for Prisma.
+ +At this point, you should refer to the Prisma docs for [connecting your database to Prisma](https://www.prisma.io/docs/getting-started/setup-prisma/start-from-scratch/relational-databases/connect-your-database-typescript-postgres). + +Once you've got a database connected, you can move on. + +### 4. Setting up the scripts + +Let's add some scripts to the `package.json` inside `packages/database`: + +```json filename="packages/database/package.json" +{ + "scripts": { + "db:generate": "prisma generate", + "db:push": "prisma db push --skip-generate" + } +} +``` + +Let's also add these scripts to `turbo.json` in the root: + +```json filename="turbo.json" +{ + "pipeline": { + "db:generate": { + "cache": false + }, + "db:push": { + "cache": false + } + } +} +``` + +We can now run `turbo db:push db:generate` from the root of our repository to automatically migrate our database and generate our type-safe Prisma client. + +<Callout type="info"> + We use the `--skip-generate` flag on `db:push` to ensure it doesn't automatically run `prisma generate` after migrating the database. This ends up being faster when using Turborepo because we automatically parallelize the tasks. +</Callout> + +### 5. Exporting your client + +Next, we need to export `@prisma/client` so we can use it in our applications. Let's add a new file to `packages/database`: + +```ts filename="packages/database/index.ts" +export * from '@prisma/client'; +``` + +Following the [internal packages pattern](/repo/docs/handbook/sharing-code/internal-packages), we'll also need to point `main` and `types` at `index.ts` inside `packages/database/package.json`. + +```json filename="packages/database/package.json" +{ + "main": "./index.ts", + "types": "./index.ts" +} +``` + +#### Importing `database` + +Let's now import our database package into one of our apps to test it out. Let's say you have an app at `apps/web`.
Add the dependency to `apps/web/package.json`: + +<Tabs items={["npm", "yarn", "pnpm"]} storageKey="selected-pkg-manager"> + <Tab> + ```json filename="apps/web/package.json" + { + "dependencies": { + "database": "*" + } + } + ``` + </Tab> + <Tab> + ```json filename="apps/web/package.json" + { + "dependencies": { + "database": "*" + } + } + ``` + </Tab> + <Tab> + ```json filename="apps/web/package.json" + { + "dependencies": { + "database": "workspace:*" + } + } + ``` + </Tab> +</Tabs> + +Run your package manager's install command. + +You can now import `PrismaClient` from `database` anywhere in your app: + +```ts +import { PrismaClient } from 'database' + +const client = new PrismaClient(); +``` + +<Callout> + You may also need to do some configuration inside your application to allow it to run an internal package. Check out our [internal packages docs](/repo/docs/handbook/sharing-code/internal-packages#6-configuring-your-app) for more info. +</Callout> + +### 6. Figuring out the scripts + +We're now in a pretty good position. We have a reusable `database` module that we can import into any of our applications. We've got a `turbo db:push` script we can use to push our changes to the database. + +However, our `db:generate` scripts aren't optimized yet. They provide crucial code to our `dev` and `build` tasks. If a new developer runs `dev` on an application without running `db:generate` first, they'll get errors. + +So, let's make sure that `db:generate` is always run _before_ the user runs `dev`: + +```json filename="turbo.json" +{ + "pipeline": { + "dev": { + "dependsOn": ["^db:generate"], + "cache": false + }, + "build": { + "dependsOn": ["^db:generate"], + "outputs": ["your-outputs-here"] + }, + "db:generate": { + "cache": false + } + } +} +``` + +Check out the section on [running tasks](/repo/docs/core-concepts/monorepos/running-tasks) to learn more about the `^db:generate` syntax. + +### 7. 
Caching the results of `prisma generate` + +`prisma generate` outputs files to the filesystem, usually inside `node_modules`. In theory, it should be possible to cache the output of `prisma generate` with Turborepo to save a few seconds. + +However, Prisma behaves differently with different package managers. This can lead to unpredictable results, which might lead to broken deployments in some situations. Instead of documenting the intricacies of each approach, we recommend _not_ caching the results of `prisma generate`. Since `prisma generate` usually only takes 5-6 seconds, and tends not to take longer with larger `schema` files, this seems like a fine trade-off. + +You may wish to experiment with this yourself. If you find a solution that you feel works, feel free to [add an issue](https://github.com/vercel/turbo/issues/new/choose) and we can update this section. diff --git a/docs/pages/repo/docs/handbook/tools/storybook.mdx b/docs/pages/repo/docs/handbook/tools/storybook.mdx new file mode 100644 index 0000000..008b8ba --- /dev/null +++ b/docs/pages/repo/docs/handbook/tools/storybook.mdx @@ -0,0 +1,223 @@ +import { Tabs, Tab } from '../../../../../components/Tabs' +import Callout from '../../../../../components/Callout' + +# Using Storybook with Turborepo + +[Storybook](https://storybook.js.org/) is a popular way to build UI components in an isolated environment. By putting Storybook into your Turborepo, you can easily develop your design system right alongside your applications. If you'd rather use a template, this guide is walking through how to build [this Storybook/Turborepo template](https://vercel.com/templates/react/turborepo-design-system) on Vercel. + +## Guide + +This guide shows you how to: + +1. Set up Storybook in a monorepo +2. Create your first story +3. Ensure Storybook works with the rest of your tasks + +### 1. 
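If you do experiment with caching `prisma generate`, a possible starting point is to declare the generated client as a task output so Turborepo can restore it from cache. This is only a sketch - the output path shown is an assumption, since it varies by package manager and by any custom `output` configured in your Prisma schema:

```json filename="turbo.json"
{
  "pipeline": {
    "db:generate": {
      "outputs": ["node_modules/.prisma/client/**"]
    }
  }
}
```

Note that you'd also remove `"cache": false` from the task while testing, and you should verify that a cache hit actually restores a working client before relying on this in CI.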
Create your monorepo + +If you don't have an existing project, use our [quickstart](/repo/docs/getting-started/create-new) to create a new monorepo. + +```shell +npx create-turbo@latest +``` + +### 2. Add a new `workshop` app + +Storybook needs a builder to use so we will create a Vite app. + +<Tabs items={["npm", "yarn", "pnpm"]} storageKey="selected-pkg-manager"> + <Tab> + ```shell + cd apps + npm create vite + ``` + </Tab> + <Tab> + ```shell + cd apps + yarn create vite + ``` + </Tab> + <Tab> + ```shell + cd apps + pnpm create vite + ``` + </Tab> +</Tabs> + +Follow the prompts to create an app named "workshop" as a React, TypeScript app. + +Next, we need to scaffold Storybook: + +<Tabs items={["npm", "yarn", "pnpm"]} storageKey="selected-pkg-manager"> + <Tab> + ```shell + cd workshop + npx sb init --skip-install + npm install --save-dev @storybook/cli # Manually install deps and CLI + ``` + + <Callout type="info">You may be prompted to enable the `--legacy-peer-deps` flag. This flag is required for Storybook to work in a monorepo.</Callout> + + </Tab> + <Tab> + ```shell + cd workshop + npx storybook init + ``` + </Tab> + <Tab> + <p className="nx-mt-6 nx-leading-7">If you're using `pnpm`, you'll need to add an `.npmrc` at the root of your monorepo:</p> + ```txt filename=".npmrc" + auto-install-peers=true + legacy-peer-deps=true + node-linker=hoisted + ``` + Then, we scaffold Storybook and install its dependencies manually: + + ```shell + cd workshop + pnpx sb init --skip-install + pnpm install --save-dev @storybook/cli # Manually install deps and CLI + ``` + + </Tab> +</Tabs> + +### 3. Set up a story for your Button component + +The Storybook scaffold creates some stories and React components in the `/src/stories` directory. To create a story for the button from your `ui` package, we will replace the import in `Button.stories.tsx` with our own. + +1. Update the Button in your `ui` package to match the story's specifications. 
+ +```tsx filename="packages/ui/Button.tsx" +interface Props { + primary?: boolean; + size?: "small" | "large"; + label?: string; +} + +export const Button = ({ + primary = false, + label = "Boop", + size = "small", +}: Props) => { + return ( + <button + style={{ + backgroundColor: primary ? "red" : "blue", + fontSize: size === "large" ? "24px" : "14px", + }} + > + {label} + </button> + ); +}; +``` + +2. Add your `ui` package to the `workshop` app: + +<Tabs items={["npm", "yarn", "pnpm"]} storageKey="selected-pkg-manager"> + <Tab> +```json filename="apps/workshop/package.json" +{ + // ... + "dependencies": { + "ui": "*", + // ... + } +} +``` + +And `npm install` one more time to make sure that your `ui` package is installed in the `workshop` app. + + </Tab> + <Tab> +```json filename="apps/workshop/package.json" +{ + // ... + "dependencies": { + "ui": "*", + // ... + } +} +``` + +And `yarn install` to make sure that your `ui` package is installed in the `workshop` app. + + </Tab> + <Tab> +```json filename="apps/workshop/package.json" +{ + // ... + "dependencies": { + "ui": "workspace:*", + // ... + } +} +``` +And `pnpm install` one more time to make sure that your `ui` package is installed in the `workshop` app. + </Tab> +</Tabs> + +3. Replace the `Button` import in the `Button.stories.tsx` so that it comes from your `ui` package: + +```tsx filename="apps/workshop/src/stories/Button.stories.tsx" +import { Button } from 'ui' +``` + +### 4. Align tasks + +The last thing that we need to do is make sure that Storybook is lined up with the rest of your tasks: + +```json filename="apps/workshop/package.json" +{ + // ... + "scripts": { + "dev": "start-storybook -p 6006", + "build": "build-storybook" + } +} +``` + +To ensure build caching, you'll first need to add `storybook-static` to your `.gitignore`.
Then, add `storybook-static` to the outputs of your `turbo.json` build task: + +```json filename="turbo.json" +{ + "pipeline": { + "build": { + "outputs": [ + "dist/**", ++ "storybook-static/**" + ] + } + } +} +``` + +Your `dev` and `build` tasks will now include Storybook, allowing you to develop your Storybook alongside your applications and enjoy cached builds with the rest of your applications. + +## Deploy on Vercel + +Let's deploy your Storybook project. + +In the "Build and Development Settings" on the General tab of your project settings, change your "Output Directory" to `storybook-static`. + +Additionally, at the time of this writing, Storybook cannot be run on Node 18, the Vercel default. In the `package.json` of your `workshop` app, add an `engines` field to make sure that this project runs on Node 16: + +```json filename="apps/workshop/package.json" +{ + // ... + "engines": { + "node": "16" + } +} +``` diff --git a/docs/pages/repo/docs/handbook/troubleshooting.mdx b/docs/pages/repo/docs/handbook/troubleshooting.mdx new file mode 100644 index 0000000..fd88afd --- /dev/null +++ b/docs/pages/repo/docs/handbook/troubleshooting.mdx @@ -0,0 +1,31 @@ +# Troubleshooting + +## Handling mismatched package versions + +As your monorepo grows, you may end up with different versions of packages in different workspaces. + +For instance, `app` may use `react@18.0.0`, but `web` may use `react@17.0.0`. This is especially true when you've just [migrated from a multi-repo setup](/repo/docs/handbook/migrating-to-a-monorepo). + +Mismatched dependencies across workspaces can mean that code runs unexpectedly. React, for instance, will error if there is more than one version installed. + +#### `@manypkg/cli` + +Our recommended method for handling this problem is with [`@manypkg/cli`](https://www.npmjs.com/package/@manypkg/cli) - a CLI which can ensure that your dependencies match across your workspaces. + +Here's a quick example.
In the `package.json` at your repository root, add a `postinstall` script. + +```json filename="package.json" +{ + "scripts": { + // This will check your dependencies match + // after each installation + "postinstall": "manypkg check" + }, + "dependencies": { + // Make sure you install @manypkg/cli + "@manypkg/cli": "latest" + } +} +``` + +You can also run `manypkg fix` to automatically fix mismatched versions throughout your repository. diff --git a/docs/pages/repo/docs/handbook/what-is-a-monorepo.mdx b/docs/pages/repo/docs/handbook/what-is-a-monorepo.mdx new file mode 100644 index 0000000..0a3b916 --- /dev/null +++ b/docs/pages/repo/docs/handbook/what-is-a-monorepo.mdx @@ -0,0 +1,85 @@ +import { Tabs, Tab } from "../../../../components/Tabs"; + +# What is a Monorepo? + +A monorepo is a collection of many different apps and packages in a single codebase. + +The alternative setup is called **a polyrepo** - multiple codebases which are published and versioned separately. + +## Sharing code + +### In a polyrepo + +In a polyrepo setup, the process for sharing code between applications is relatively lengthy. + +Imagine that you have three separate repositories - `app`, `docs`, and `shared-utils`. Both `app` and `docs` depend on `shared-utils`, which is published as a package on npm. + +Let's say a bug in `shared-utils` is causing a critical issue in both `app` and `docs`. You'll need to: + +1. Make a commit in `shared-utils` fixing the error +2. Run a `publish` task inside `shared-utils` to publish it to npm +3. Make a commit in `app` bumping the version of the `shared-utils` dependency +4. Make a commit in `docs` bumping the version of the `shared-utils` dependency +5. `app` and `docs` are now ready to be deployed. + +The more apps you have that depend on `shared-utils`, the longer this process takes. It can be extremely arduous. + +### In a monorepo + +In a monorepo setup, `shared-utils` would be _in the same codebase_ as `app` and `docs`. This makes the process very simple: + +1.
Make a commit in `shared-utils` fixing the error +2. `app` and `docs` are now ready to be deployed. + +No versioning is required, because `app` and `docs` don't depend on the version of `shared-utils` in npm - they depend on the **version that's in the codebase**. + +This makes it possible to create single commits which fix bugs in multiple apps and packages at once. This can be an enormous gain in speed for teams. + +## How do monorepos work? + +The main building block of the monorepo is the [workspace](/repo/docs/handbook/workspaces). Each application and package you build will be in its own workspace, with its own `package.json`. As you'll learn from our guide, workspaces can **depend on each other**, meaning your `docs` workspace can depend on `shared-utils`: + +<Tabs items={['npm', 'yarn', 'pnpm']} storageKey="selected-pkg-manager"> + <Tab> + +```json filename="apps/docs/package.json" +{ + "dependencies": { + "shared-utils": "*" + } +} +``` + + </Tab> + <Tab> +```json filename="apps/docs/package.json" +{ + "dependencies": { + "shared-utils": "*" + } +} +``` + + </Tab> + + <Tab> +```json filename="apps/docs/package.json" +{ + "dependencies": { + "shared-utils": "workspace:*" + } +} +``` + + </Tab> +</Tabs> + +Workspaces are managed by the same CLI which [installs your dependencies](/repo/docs/handbook/package-installation). + +### The root workspace + +You'll also have a root workspace - a `package.json` in the root folder of your codebase. This is a useful place for: + +1. Specifying dependencies which are present across your entire monorepo +1. Adding tasks that operate on the _whole_ monorepo, not just individual workspaces +1. 
Adding documentation on how to use the monorepo diff --git a/docs/pages/repo/docs/handbook/workspaces.mdx b/docs/pages/repo/docs/handbook/workspaces.mdx new file mode 100644 index 0000000..903021f --- /dev/null +++ b/docs/pages/repo/docs/handbook/workspaces.mdx @@ -0,0 +1,168 @@ +import Callout from "../../../../components/Callout"; +import HeartIcon from "@heroicons/react/solid/HeartIcon"; +import { Tabs, Tab } from "../../../../components/Tabs"; + +# Workspaces + +Workspaces are the building blocks of your monorepo. Each app and package you add to your monorepo will be **inside its own workspace**. + +Workspaces are managed by your [package manager](/repo/docs/handbook/package-installation), so make sure you've set that up first. + +## Configuring workspaces + +To use workspaces, you must first declare their file system locations to your package manager. + +A common convention we recommend is having top-level `apps/` and `packages/` directories. This isn't a requirement - just a suggested directory structure. + +The `apps` folder should contain workspaces for launchable apps, such as a [Next.js](https://nextjs.org/) or [Svelte](https://svelte.dev/) app. + +The `packages` folder should contain workspaces for packages used by either an app or another package. + +<Tabs items={['npm', 'yarn', 'pnpm']} storageKey="selected-pkg-manager"> + <Tab> + + Add the folders you want to configure as workspaces to the `workspaces` field in your root `package.json` file. This field contains a list of workspace folders in the form of globs: + + ```json + { + "name": "my-monorepo", + "version": "1.0.0", + "workspaces": [ + "docs", + "apps/*", + "packages/*" + ] + } + ``` + + </Tab> + <Tab> + + Add the folders you want to configure as workspaces to the `workspaces` field in your root `package.json` file. 
This field contains a list of workspace folders in the form of globs: + + ```json + { + "name": "my-monorepo", + "version": "1.0.0", + "workspaces": [ + "docs", + "apps/*", + "packages/*" + ] + } + ``` + + </Tab> + + <Tab> + + Add the folders you want to configure as workspaces to the `pnpm-workspace.yaml` file that exists in your root directory. This file contains a list of workspace folders in the form of globs: + + ```yaml + packages: + - "docs" + - "apps/*" + - "packages/*" + ``` + + </Tab> +</Tabs> + +``` +my-monorepo +β”œβ”€ docs +β”œβ”€ apps +β”‚ β”œβ”€ api +β”‚ └─ mobile +β”œβ”€ packages +β”‚ β”œβ”€ tsconfig +β”‚ └─ shared-utils +└─ sdk +``` + +In the example above, all directories inside `my-monorepo/apps/` and `my-monorepo/packages/` are workspaces, and the `my-monorepo/docs` directory itself is also a workspace. `my-monorepo/sdk/` is _not_ a workspace, as it is not included in the workspace configuration. + +## Naming workspaces + +Each workspace has a unique name, which is specified in its `package.json`: + +```json filename="packages/shared-utils/package.json" +{ + "name": "shared-utils" +} +``` + +This name is used to: + +1. Specify which workspace a [package should be installed to](/repo/docs/handbook/package-installation) +1. Use this workspace in other workspaces +1. [Publish packages](/repo/docs/handbook/publishing-packages): it'll be published on npm under the `name` you specify + +You can use an npm organization or user scope to avoid collisions with existing packages on npm. For instance, you could use `@mycompany/shared-utils`. + +## Workspaces which depend on each other + +To use a workspace inside another workspace, you'll need to specify it as a dependency, using its name. 
+ +For instance, if we want `apps/docs` to import `packages/shared-utils`, we'd need to add `shared-utils` as a dependency inside `apps/docs/package.json`: + +<Tabs items={['npm', 'yarn', 'pnpm']} storageKey="selected-pkg-manager"> + <Tab> + +```json filename="apps/docs/package.json" +{ + "dependencies": { + "shared-utils": "*" + } +} +``` + + </Tab> + <Tab> +```json filename="apps/docs/package.json" +{ + "dependencies": { + "shared-utils": "*" + } +} +``` + + </Tab> + + <Tab> +```json filename="apps/docs/package.json" +{ + "dependencies": { + "shared-utils": "workspace:*" + } +} +``` + + </Tab> +</Tabs> + +<Callout type="info"> + The `*` allows us to reference the _latest_ version of the dependency. It + saves us from needing to bump the versions of our dependency if the versions + of our packages change. +</Callout> + +Just like a normal package, we'd need to run `install` from root afterwards. Once installed, we can use the workspace as if it were any other package from `node_modules`. See our [section on sharing code](/repo/docs/handbook/sharing-code) for more information. + +## Managing workspaces + +In a monorepo, when you run an `install` command from root, a few things happen: + +1. The workspace dependencies you have installed are checked +1. Any workspaces are [symlinked](https://en.wikipedia.org/wiki/Symbolic_link) into `node_modules`, meaning that you can import them like normal packages +1. Other packages are downloaded and installed into `node_modules` + +This means that whenever you add/remove workspaces, or change their locations on the filesystem, you'll need to re-run your `install` command from root to set up your workspaces again. + +<Callout> + You **don't need to re-install every time your source code changes** inside a + package - only when you change the locations (or configuration) of your + workspaces in some way. 
+</Callout> + +If you run into issues, you may have to delete each `node_modules` folder in your repository and re-run `install` to correct it. diff --git a/docs/pages/repo/docs/index.mdx b/docs/pages/repo/docs/index.mdx new file mode 100644 index 0000000..04f6fcf --- /dev/null +++ b/docs/pages/repo/docs/index.mdx @@ -0,0 +1,73 @@ +--- +title: Turborepo Quickstart +description: Create your first monorepo or add Turborepo to an existing project. +--- + +import { readdirSync, lstatSync, readFileSync } from 'fs'; +import path from 'path'; +import { QuickStartArea, LearnMoreArea, MonoreposArea } from "../../../components/QuickStart"; +import { ExamplesArea } from "../../../components/ExamplesArea"; +import FullTurboCTA from "../../../components/FullTurboCTA"; + +export const getStaticProps = ({ params }) => { + // path to examples directory at the monorepo root. + const examplesDirectory = path.join(__dirname, '../../../../../examples') + const examples = []; + readdirSync(examplesDirectory).forEach(file => { + if (lstatSync(path.join(examplesDirectory, file)).isDirectory()) { + try { + examples.push({ + slug: file, + ...JSON.parse(readFileSync(path.join(examplesDirectory, file, "meta.json")).toString()) + } + ); + } catch (err) { + console.log(`No meta.json found for ${file}, excluding from docs`); + } + } + }); + // throw an error if no examples are found + if (examples.length === 0) { + throw new Error( + `No examples found in ${examplesDirectory}! Make sure you have updated the path if moving this file.` + ) + } + return { + props: { + ssg: { + examples + } + }, + revalidate: 60 * 60 * 24 + } +} + +# Turborepo Quickstart + +Turborepo is an intelligent **build system optimized for JavaScript and TypeScript codebases**. + +Your codebase's tasks - like `lint`, `build` and `test` - **don't run as fast as they could**. Turborepo uses [caching](/repo/docs/core-concepts/caching) to turbocharge your local setup and speed up your CI. 
+ +Turborepo is designed to be **incrementally adopted**, so you can add it to most codebases in a few minutes. + +<QuickStartArea /> + +## Features + +Turborepo leverages advanced build system techniques to speed up development, **both on your local machine and your CI/CD**. + +<LearnMoreArea /> + +## Monorepos + +Turborepo works out-of-the-box with monorepo tools like `npm`, `pnpm` and `yarn`. If you've ever felt that your monorepo slowed you down, it might be time for Turborepo. + +<MonoreposArea /> + +## Examples + +You can also clone a Turborepo starter repository to get a head start on your monorepo. For even more examples and starters, see the [Turborepo examples directory on GitHub](https://github.com/vercel/turbo/tree/main/examples). + +<ExamplesArea filter="featured" /> + +<FullTurboCTA /> diff --git a/docs/pages/repo/docs/installing.mdx b/docs/pages/repo/docs/installing.mdx new file mode 100644 index 0000000..cc1e9c2 --- /dev/null +++ b/docs/pages/repo/docs/installing.mdx @@ -0,0 +1,85 @@ +--- +title: Installing Turborepo +description: Learn how to install Turborepo for use with your repository +--- + +import Callout from "../../../components/Callout"; +import { Tabs, Tab } from '../../../components/Tabs' + +# Install Turborepo + +`turbo` works with [yarn](https://classic.yarnpkg.com/lang/en/), [npm](https://www.npmjs.com/), and [pnpm](https://pnpm.io/) on the following operating systems: + +- macOS darwin 64-bit (Intel), ARM 64-bit (Apple Silicon) +- Linux 64-bit, ARM 64-bit +- Windows 64-bit, ARM 64-bit + +<Callout> + Note: Linux builds of `turbo` link against `glibc`. For Alpine Docker environments, you will need to ensure libc6-compat is installed as well, via `RUN apk add --no-cache libc6-compat` +</Callout> + +## Install Globally + +A global install of `turbo` can be used in any project, and enables automatic workspace +selection based on the directory where you run `turbo`. 
+ +<Tabs items={['npm', 'yarn', 'pnpm']} storageKey="selected-pkg-manager"> + <Tab> + ```bash + npm install turbo --global + ``` + </Tab> + <Tab> + ```bash + yarn global add turbo + ``` + </Tab> + <Tab> + ```bash + pnpm install turbo --global + ``` + </Tab> +</Tabs> + +Once you have a globally installed copy of `turbo`, you will be able to run directly from workspace +directories. + +```bash +cd <repository root>/apps/docs +turbo build +``` + +is equivalent to the [filtering syntax](../docs/core-concepts/monorepos/filtering): + +```bash +cd <repository root> +turbo build --filter=docs +``` + +## Install Per Repository + +You may wish to pin the version of Turborepo used within a repository, especially [if you are +collaborating with other developers](../docs/core-concepts/remote-caching). In that case, add `turbo` as a dev dependency at the root +of the repository: + +<Tabs items={['npm', 'yarn', 'pnpm']} storageKey="selected-pkg-manager"> + <Tab> + ```bash + npm install turbo --dev + ``` + </Tab> + <Tab> + ```bash + yarn add turbo --dev --ignore-workspace-root-check + ``` + </Tab> + <Tab> + ```bash + pnpm add turbo --save-dev --ignore-workspace-root-check + ``` + </Tab> +</Tabs> + +You can continue to use your global install of `turbo`, which will defer to your local version +if it exists. This allows you to get the best of both worlds: easy scoping to the directory you're working +in while maintaining a pinned version among your entire team. 
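+
+Either way, pinning shows up as an ordinary dev dependency. As a sketch (the version shown is illustrative, not a recommendation), the root `package.json` would contain something like:
+
+```json
+{
+  "devDependencies": {
+    "turbo": "1.9.3"
+  }
+}
+```
+
+With an entry like this in place, a globally installed `turbo` run anywhere inside the repository should detect and defer to the pinned local version.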
diff --git a/docs/pages/repo/docs/reference/_meta.json b/docs/pages/repo/docs/reference/_meta.json
new file mode 100644
index 0000000..9ed1477
--- /dev/null
+++ b/docs/pages/repo/docs/reference/_meta.json
@@ -0,0 +1,5 @@
+{
+  "configuration": "Configuration Options",
+  "command-line-reference": "CLI Usage",
+  "codemods": "Codemods"
+}
diff --git a/docs/pages/repo/docs/reference/codemods.mdx b/docs/pages/repo/docs/reference/codemods.mdx
new file mode 100644
index 0000000..ac4cbee
--- /dev/null
+++ b/docs/pages/repo/docs/reference/codemods.mdx
@@ -0,0 +1,278 @@
+---
+title: Turborepo Codemods
+description: To make upgrading easier, Turborepo includes codemods and migration scripts.
+---
+
+import Callout from '../../../../components/Callout'
+
+# Turborepo Codemods
+
+Turborepo provides codemod transformations and automatic migration scripts to help upgrade your Turborepo codebase when a feature is deprecated.
+
+Codemods are transformations that run on your codebase programmatically. This allows a large number of changes to be applied without having to manually go through every file.
+
+## Usage
+
+```sh
+npx @turbo/codemod <transform> <path>
+```
+
+- `transform` - name of the transform; see available transforms below
+- `path` - files or directory to transform
+- `--dry` - do a dry run; no code will be edited
+- `--print` - prints the changed output for comparison
+
+## Turborepo 1.x
+
+1. [add-package-manager](#add-package-manager)
+2. [create-turbo-config](#create-turbo-config)
+3. [migrate-env-var-dependencies](#migrate-env-var-dependencies)
+4. [set-default-outputs](#set-default-outputs)
+
+### `add-package-manager`
+
+<Callout type="info">
+  Introduced in v1.1.0
+</Callout>
+
+Transforms the root `package.json` so that the `packageManager` key is set to the detected package manager (`yarn`, `npm`, `pnpm`) and version (e.g. `yarn@1.22.17`).
This key is now [supported by Node.js](https://nodejs.org/dist/latest-v17.x/docs/api/packages.html#packagemanager) and is used by Turborepo for faster package manager detection (vs. inferring from just the filesystem alone). + +For example, for Yarn v1: + +```json +// Before +{ + "name": "turborepo-basic", + "version": "0.0.0", + "private": true, + "workspaces": ["apps/*", "packages/*"] + // ... +} +``` + +```diff +{ + "name": "turborepo-basic", + "version": "0.0.0", + "private": true, ++ "packageManager": "yarn@1.22.17", + "workspaces": [ + "apps/*", + "packages/*" + ] +} +``` + +#### Usage + +Go to your project: + +```sh +cd path-to-your-turborepo/ +``` + +Run the codemod: + +```sh +npx @turbo/codemod add-package-manager +``` + +### `create-turbo-config` + +<Callout type="info"> + Introduced in v1.1.0 +</Callout> + +Creates the `turbo.json` file at the root of your project based on the `"turbo"` key in `package.json`. +The `"turbo"` key is subsequently deleted from `package.json`. + +For example: + +```json +// Before, package.json +{ + "name": "Monorepo root", + "private": true, + "turbo": { + "pipeline": { + ... + } + }, + ... +} +``` + +```diff +// After, package.json +{ + "name": "Monorepo root", + "private": true, +- "turbo": { +- "pipeline": { +- ... +- } +- }, + ... +} + +// After, turbo.json ++{ ++ "$schema": "https://turbo.build/schema.json", ++ "pipeline": { ++ ... ++ } ++} +``` + +#### Usage + +Go to your project: + +```sh +cd path-to-your-turborepo/ +``` + +Run the codemod: + +```sh +npx @turbo/codemod create-turbo-config +``` + +### `migrate-env-var-dependencies` + +<Callout type="info"> + Introduced in v1.5.0 +</Callout> + +Migrates all environment variable dependencies in `turbo.json` from `dependsOn` and `globalDependencies` to `env` and `globalEnv` respectively. 
+ +For example: + +```json +// Before, turbo.json +{ + "$schema": "https://turbo.build/schema.json", + "globalDependencies": [".env", "$CI_ENV"], + "pipeline": { + "build": { + "dependsOn": ["^build", "$API_BASE"], + "outputs": [".next/**", "!.next/cache/**"] + }, + "lint": {}, + "dev": { + "cache": false, + "persistent": true + } + } +} +``` + +```diff +// After, turbo.json +{ + "$schema": "https://turbo.build/schema.json", +- "globalDependencies": [".env", "$CI_ENV"], ++ "globalDependencies": [".env"], ++ "globalEnv": ["CI_ENV"], + "pipeline": { + "build": { +- "dependsOn": ["^build", "$API_BASE"], ++ "dependsOn": ["^build"], ++ "env": ["API_BASE"], + "outputs": [".next/**", "!.next/cache/**"], + }, + "lint": {}, + "dev": { + "cache": false, + "persistent": true + } + } +} +``` + +#### Usage + +Go to your project: + +```sh +cd path-to-your-turborepo/ +``` + +Run the codemod: + +```sh +npx @turbo/codemod migrate-env-var-dependencies +``` + +### `set-default-outputs` + +<Callout type="info"> + Introduced in v1.7.0 +</Callout> + +Migrates `turbo.json` outputs to include the previously inferred `dist/` and `build/`. 
+ +For example: + +```json +// Before, turbo.json +{ + "$schema": "https://turbo.build/schema.json", + "globalDependencies": [".env"], + "globalEnv": ["CI_ENV"], + "pipeline": { + "build": { + "dependsOn": ["^build"], + "env": ["API_BASE"], + "outputs": [".next/**", "!.next/cache/**"] + }, + "lint": { + "outputs": [] + }, + "dev": { + "cache": false, + "persistent": true + } + } +} +``` + +```diff +// After, turbo.json +{ + "$schema": "https://turbo.build/schema.json", + "globalDependencies": [".env"], + "globalEnv": ["CI_ENV"], + "pipeline": { + "build": { + "dependsOn": ["^build"], + "env": ["API_BASE"], + "outputs": [".next/**", "!.next/cache/**"] + }, +- "lint": { +- "outputs": [] +- }, ++ "lint": {}, + "dev": { + "cache": false, + "persistent": true, ++ "outputs": ["dist/**", "build/**"] + } + } +} + +``` + +#### Usage + +Go to your project: + +```sh +cd path-to-your-turborepo/ +``` + +Run the codemod: + +```sh +npx @turbo/codemod set-default-outputs +``` diff --git a/docs/pages/repo/docs/reference/command-line-reference.mdx b/docs/pages/repo/docs/reference/command-line-reference.mdx new file mode 100644 index 0000000..b1fdddf --- /dev/null +++ b/docs/pages/repo/docs/reference/command-line-reference.mdx @@ -0,0 +1,713 @@ +--- +title: CLI Reference +description: Turborepo is a high-performance build system for JavaScript and TypeScript codebases. +--- + +import Callout from "../../../../components/Callout"; +import OuputModeTable from "../../../../components/output-mode-table.mdx"; + +# Command-Line Reference + +After [installing the `turbo`](/repo/docs/getting-started/add-to-project) package (or cloning a starter), you can start using Turborepo's command line interface (CLI) `turbo` to do all kinds of awesomeness in your monorepo. + +## Option Syntax + +Options can be passed to `turbo` in different ways. 
Options that require a value can be passed with an equals sign: + +```sh +--opt=<value> +--opt="<value with a space>" +``` + +They can also be passed with a space between: + +```sh +--opt value +--opt "value with a space" +``` + +Boolean options can be enabled as follows: + +```sh +# To pass true +--opt + +# To pass false +--opt=false +``` + +## Global Arguments + +The following flags apply to all commands. + +#### `--color` + +Forces the use of color even when the output stream is not considered to be a TTY terminal. +This can be used to enable `turbo`'s color output for CI runners such as Github Actions which +have support for rendering color in their log output. + +```sh +turbo run build --color +``` + +Alternatively, you can also enable color using the `FORCE_COLOR` environment variable (borrowed +from the [supports-color nodejs package](https://www.npmjs.com/package/supports-color)). Note that +this may also enable additional colored output from the actual tasks themselves if +they use `supports-color` to determine whether or not to output with colored output. + +```sh +declare -x FORCE_COLOR=1 +turbo run build +``` + +#### `--no-color` + +Suppresses the use of color in the output when running `turbo` in an interactive / TTY session. + +```sh +turbo run build --no-color +``` + +Alternatively, you can also suppress color using the `FORCE_COLOR` environment variable (borrowed +from the [supports-color nodejs package](https://www.npmjs.com/package/supports-color)). + +```sh +declare -x FORCE_COLOR=0 +turbo run build +``` + +#### `--no-update-notifier` + +Disables the update notification. This notification will be automatically disabled when running in CI environments, but can also be disabled manually via this flag. 
+ +```sh +turbo run build --no-update-notifier +``` + +Alternatively, you can also disable the update notification by using either the `TURBO_NO_UPDATE_NOTIFIER` environment variable, or the `NO_UPDATE_NOTIFIER` environment variable (borrowed from the [update-notifier nodejs package](https://github.com/yeoman/update-notifier)). + +```sh +declare -x TURBO_NO_UPDATE_NOTIFIER=1 +turbo run build +``` + +## `turbo run <task>` + +Run npm scripts across all workspaces in specified scope. Tasks must be specified in your `pipeline` configuration. + +`turbo run <task1> <task2> [options] [-- <args passed to task1 and task2>]` + +`turbo` can run multiple tasks, and any arguments following `--` will be passed through +to the tasks to be executed. Note that these additional arguments will _not_ be passed to +any additional tasks that are run due to dependencies from the [pipeline](/repo/docs/reference/configuration#pipeline) configuration. + +### Options + +#### `--cache-dir` + +`type: string` + +Defaults to `./node_modules/.cache/turbo`. Specify local filesystem cache directory. Be sure to add this folder to your `.gitignore` if you change it from the default. + +```sh +turbo run build --cache-dir="./my-cache" +``` + +#### `--concurrency` + +`type: number | string` + +Defaults to `10`. Set/limit the max concurrency of task execution. This must be an integer greater than or equal to `1` or a percentage value like `50%`. Use `1` to force serial (i.e. one task at a time) execution. Use `100%` to use all available logical processors. This option is ignored if the [`--parallel`](#--parallel) flag is also passed. + +```sh +turbo run build --concurrency=50% +turbo run test --concurrency=1 +``` + +#### `--continue` + +Defaults to `false`. This flag tells `turbo` whether or not to continue with execution in the presence of an error (i.e. non-zero exit code from a task). 
+By default, specifying the `--parallel` flag will automatically set `--continue` to `true` unless explicitly set to `false`. +When `--continue` is `true`, `turbo` will exit with the highest exit code value encountered during execution. + +```sh +turbo run build --continue +``` + +#### `--cwd` + +Set the working directory of the command. + +```sh +turbo run build --cwd=./somewhere/else +``` + +#### `--deps` + +<Callout type="error"> + `--deps` is deprecated in `1.2.x`. Please use + [`--filter`](/repo/docs/core-concepts/monorepos/filtering#include-dependents-of-matched-workspaces) + instead. +</Callout> + +Defaults to `true`. Include dependent workspace consumers in the execution. + +```sh +turbo run build --deps +turbo run build --no-deps +``` + +**Example** + +Let's say you have workspaces A, B, C, and D where A depends on B and C depends on D. You run `turbo run build` for the first time and everything is built and cached. Then, you change a line of code in B. With the `--deps` flag on, running `turbo run build` will execute `build` in B and then A, but not in C and D because they are not impacted by the change. If you were to run `turbo run build --no-deps` instead, turbo will only run `build` in B. + +#### `--dry / --dry-run` + +Instead of executing tasks, display details about the affected workspaces and tasks that would be run. +Specify `--dry=json` to get the output in JSON format. 
+
+Task details include:
+
+- `task`: The name of the task to be executed
+- `package`: The workspace in which to run the task
+- `hash`: The hash of the task, used for caching
+- `directory`: The directory where the task will be run
+- `command`: The actual command used to run the task
+- `outputs`: Location of outputs from the task that will be cached
+- `logFile`: Location of the log file for the task run
+- `dependencies`: Tasks that must run before this task
+- `dependents`: Tasks that must run after this task
+
+#### `--experimental-env-mode`
+
+`type: string`
+
+<Callout type="info">
+  **Warning**: This is an experimental flag, so its name and behavior can change.
+</Callout>
+
+Controls which environment variables are available to your tasks.
+
+| option | description                                                |
+| ------ | ---------------------------------------------------------- |
+| infer  | infers strict mode or loose mode based on allowlist config |
+| loose  | allows all environment variables                           |
+| strict | only allows declared variables                             |
+
+`PATH`, `SHELL`, and `SYSTEMROOT` are always available to all tasks.
+
+##### `infer`
+
+In infer mode, Turborepo will look for [`experimentalPassThroughEnv`][1] in
+each task config, and [`experimentalGlobalPassThroughEnv`][2] in the root of the
+`turbo.json` config. If either is defined, `strict` mode is inferred. If
+neither is defined, then `loose` mode is inferred.
+
+In this mode, the value of `experimentalGlobalPassThroughEnv` and the flag's
+value ("infer") will only be incorporated into the global hash if
+`experimentalGlobalPassThroughEnv` is set.
+
+##### `loose`
+
+In loose mode, all environment variables are made available to the task.
+The value of `experimentalGlobalPassThroughEnv` and the flag's value ("loose")
+itself will be incorporated into the global hash.
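+
+To make the allowlist config concrete, here is a hedged sketch of a root `turbo.json` using both keys (the variable names are purely illustrative); with configuration like this present, `infer` resolves to strict mode:
+
+```jsonc
+{
+  "$schema": "https://turbo.build/schema.json",
+  // passed through to every task
+  "experimentalGlobalPassThroughEnv": ["AWS_REGION"],
+  "pipeline": {
+    "build": {
+      // passed through to `build` tasks only
+      "experimentalPassThroughEnv": ["NPM_TOKEN"]
+    }
+  }
+}
+```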
+ +##### `strict` + +In strict mode, only environment variables specified in the following keys are +available to the task: + +- `env` and `experimentalPassThroughEnv` in each task configuration +- `globalEnv` and `experimentalGlobalPassThroughEnv` in the root of your config + +The value of `experimentalGlobalPassThroughEnv` and the flag's value ("strict") +itself will be incorporated into the global hash. + +If strict mode is specified or inferred, _all_ tasks are run in strict mode, +regardless of their configuration. + +#### `--filter` + +`type: string[]` + +Specify combinations of workspaces, directories, and git commits to act as entrypoints +for execution. + +Multiple filters can be combined to select distinct sets of targets. Additionally, filters +can also exclude targets. A target that matches any filter and is not explicitly excluded will +be in the final entrypoint selection. + +For more detailed information about the `--filter` flag and filtering, refer to the [dedicated page in our documentation](/repo/docs/core-concepts/monorepos/filtering). + +```sh +turbo run build --filter=my-pkg +turbo run test --filter=...^@scope/my-lib +turbo run build --filter=./apps/* --filter=!./apps/admin +``` + +#### `--graph` + +This command will generate an svg, png, jpg, pdf, json, html, or [other supported output formats](https://graphviz.org/doc/info/output.html) of the current task graph. +The output file format defaults to jpg, but can be controlled by specifying the filename's extension. + +If Graphviz is not installed, or no filename is provided, this command prints the dot graph to `stdout`. 
+
+```sh
+turbo run build --graph
+turbo run build test lint --graph=my-graph.svg
+turbo run build test lint --graph=my-json-graph.json
+turbo run build test lint --graph=my-graph.pdf
+turbo run build test lint --graph=my-graph.png
+turbo run build test lint --graph=my-graph.html
+turbo run build test lint --graph=my-graph.mermaid
+```
+
+<Callout type="info">
+  **Known Bug**: All possible pipeline task nodes will be added to the graph at
+  the moment, even if that pipeline task does not actually exist in a given
+  workspace. This has no impact on execution, but it means that 1) the terminal
+  output may overstate the number of workspaces in which a task is running and
+  2) your dot viz graph may contain additional nodes that represent tasks that
+  do not exist.
+</Callout>
+
+#### `--force`
+
+Ignore existing cached artifacts and forcibly re-execute all tasks (overwriting artifacts that overlap).
+
+```sh
+turbo run build --force
+```
+
+The same behavior can also be set via the `TURBO_FORCE=true` environment variable.
+
+#### `--global-deps`
+
+Specify globs of global filesystem dependencies to be hashed. Useful for `.env` and files in the root directory that impact multiple packages/apps.
+Can be specified multiple times.
+
+```sh
+turbo run build --global-deps=".env.*" --global-deps=".eslintrc" --global-deps="jest.config.js"
+```
+
+You can also specify these in your `turbo` configuration via the `globalDependencies` key.
+
+#### `--ignore`
+
+`type: string[]`
+
+Ignore **files or directories** from impacting scope. Uses glob patterns under the hood.
+
+```sh
+turbo run build --ignore="apps/**/*"
+turbo run build --ignore="packages/**/*"
+turbo run build --ignore="packages/**/*" --ignore="\!/packages/not-this-one/**/*"
+```
+
+##### How multiple patterns work
+
+Positive patterns (e.g. `foo` or `*`) add to the results, while negative patterns (e.g. `!foo`) subtract from the results.
+
+Therefore a lone negation (e.g.
`['!foo']`) will never match anything – use `['*', '!foo']` instead. + +##### Globbing patterns + +Just a quick overview. + +- `*` matches any number of characters, but not `/` +- `?` matches a single character, but not `/` +- `**` matches any number of characters, including `/`, as long as it's the only thing in a path part +- `{}` allows for a comma-separated list of "or" expressions +- `!` at the beginning of a pattern will negate the match + +#### `--include-dependencies` + +<Callout type="error"> + `--include-dependencies` is deprecated in `1.2.x`. Please use + [`--filter`](/repo/docs/core-concepts/monorepos/filtering#include-dependencies-of-matched-workspaces) + instead. +</Callout> + +Default `false`. When `true`, `turbo` will add any workspaces that the workspaces in the current execution _depend_ on (i.e. those declared in `dependencies` or `devDependencies`). + +This is useful when using `--filter` in CI as it guarantees that every dependency needed for the execution is actually executed. + +#### `--no-cache` + +Default `false`. Do not cache results of the task. This is useful for watch commands like `next dev` or `react-scripts start`. + +```shell +turbo run build --no-cache +turbo run dev --no-cache +``` + +#### `--no-daemon` + +Default `false`. `turbo` can run a standalone process in some cases to precalculate values used for determining what work needs to be done. +This standalone process (daemon) is an optimization, and not required for proper functioning of `turbo`. +Passing `--no-daemon` instructs `turbo` to avoid using or creating the standalone process. + +#### `--output-logs` + +`type: string` + +Set type of output logging. Defaults to "outputMode" for the task in `turbo.json`. + +<OuputModeTable /> + +**Example** + +```shell +turbo run build --output-logs=full +turbo run build --output-logs=new-only +turbo run build --output-logs=errors-only +turbo run build --output-logs=none +``` + +#### `--only` + +Default `false`. 
Restricts execution to include specified tasks only. This is very similar to how `lerna` and `pnpm` run tasks by default.
+
+Given this pipeline in `turbo.json`:
+
+```json
+{
+  "$schema": "https://turbo.build/schema.json",
+  "pipeline": {
+    "build": {
+      "dependsOn": ["^build"]
+    },
+    "test": {
+      "dependsOn": ["^build"]
+    }
+  }
+}
+```
+
+```shell
+turbo run test --only
+```
+
+Will execute _only_ the `test` tasks in each workspace. It will not run `build`.
+
+#### `--parallel`
+
+Default `false`. Run commands in parallel across workspaces and ignore the task dependency graph.
+
+<Callout type="info">
+  The `--parallel` flag is typically used for "dev" or `--watch` mode tasks that don't exit.
+  Starting in `turbo@1.7`, we recommend configuring these tasks using the
+  [`persistent`](/repo/docs/reference/configuration#persistent) config instead.
+</Callout>
+
+```sh
+turbo run lint --parallel --no-cache
+turbo run dev --parallel --no-cache
+```
+
+#### `--remote-only`
+
+Default `false`. Ignore the local filesystem cache for all tasks. Only allow reading and caching artifacts using the remote cache.
+
+```shell
+turbo run build --remote-only
+```
+
+The same behavior can also be set via the `TURBO_REMOTE_ONLY=true` environment variable.
+
+#### `--scope`
+
+<Callout type="error">
+  `--scope` is deprecated in `1.2.x`. Please use
+  [`--filter`](/repo/docs/core-concepts/monorepos/filtering#filter-by-package) instead.
+</Callout>
+
+`type: string[]`
+
+Specify/filter workspaces to act as entry points for execution. Globs against the `package.json` `name` field (not the file system).
+
+```sh
+turbo run lint --scope="@example/**"
+turbo run dev --scope="@example/a" --scope="@example/b" --no-cache --no-deps
+```
+
+#### `--serial`
+
+<Callout type="error">
+  `--serial` is deprecated in `0.5.3`. Please use
+  [`--concurrency=1`](#--concurrency) instead.
+</Callout>
+
+Executes all tasks serially (i.e. one at a time).
+
+```sh
+turbo run build --serial
+```
+
+#### `--since`
+
+<Callout type="error">
+  `--since` is deprecated in `1.2.x`. Please use
+  [`--filter`](/repo/docs/core-concepts/monorepos/filtering#filter-by-changed-workspaces)
+  instead.
+</Callout>
+
+Filter execution based on which workspaces have changed since a merge-base.
+
+```sh
+turbo run build --since=origin/main
+```
+
+<Callout type="info">
+  **Important**: This uses the `git diff ${target_branch}...` mechanism to
+  identify which workspaces have changed. There is an assumption that all the
+  input files for a workspace exist inside their respective workspace folders.
+</Callout>
+
+#### `--summarize`
+
+Generates a JSON file in `.turbo/runs` containing metadata about the run, including the affected workspaces,
+the executed tasks (including their timings and hashes), the inputs expanded into the cache key based on your
+configuration, and all the files included in the cached artifact. This flag can be helpful for determining, among other
+things:
+
+- How turbo interpreted your glob syntax for `inputs` and `outputs`
+- What inputs changed between two task runs to produce a cache hit or miss
+- How task timings changed over time
+
+#### `--token`
+
+A bearer token for remote caching. Useful for running in non-interactive shells (e.g. CI/CD) in combination with the `--team` flag.
+
+```sh
+turbo run build --team=my-team --token=xxxxxxxxxxxxxxxxx
+```
+
+You can also set the value of the current token by setting an environment variable named `TURBO_TOKEN`. The flag will take precedence over the environment variable if both are present.
+
+If you are using Remote Caching on Vercel and building your project on Vercel, this environment variable and flag are unnecessary because they are automatically set for you. If you are using Remote Caching on Vercel but building in another CI provider like CircleCI or GitHub Actions, you can use a Vercel Personal Access Token as your `--token` or `TURBO_TOKEN`.
If you are using a custom Remote Cache, this value will be used to send an HTTP Bearer token with requests to your custom Remote Cache. + +#### `--team` + +The slug of the remote cache team. Useful for running in non-interactive shells in combination with `--token` and `--team` flags. + +```sh +turbo run build --team=my-team +turbo run build --team=my-team --token=xxxxxxxxxxxxxxxxx +``` + +You can also set the value of the current team by setting an environment variable named `TURBO_TEAM`. The flag will take precedence over the environment variable if both are present. + +#### `--preflight` + +Only applicable when remote artifact caching is configured. Enables sending a preflight request before every cache artifact and analytics request. The follow-up upload and download will follow redirects. + +```sh +turbo run build --preflight +``` + +The same behavior can also be set via the `TURBO_PREFLIGHT=true` environment variable. + +#### `--trace` + +`type: string` + +To view CPU trace, outputs the trace to the given file, use `go tool trace [file]`. + +<Callout> + **Important**: The trace viewer doesn't work under Windows Subsystem for + Linux. +</Callout> + +```sh +turbo run build --trace="<trace-file-name>" +``` + +#### `--heap` + +`type: string` + +To view heap trace, outputs the trace to the given file, use `go tool pprof [file]` and type `top`. You can also drop it into [speedscope](https://www.speedscope.app/) and use the `left heavy` or `sandwich` view modes. + +```sh +turbo run build --heap="<heap-file-name>" +``` + +#### `--cpuprofile` + +`type: string` + +To view CPU profile, outputs the profile to the given file, drop the file into [speedscope](https://www.speedscope.app/). + +<Callout> + **Important**: The CPU profiler doesn't work under Windows Subsystem for + Linux. The profiler has to be built for native Windows and run using the + command prompt instead. 
+</Callout>
+
+```sh
+turbo run build --cpuprofile="<cpu-profile-file-name>"
+```
+
+#### `--verbosity`
+
+To specify the log level, use `--verbosity=<num>` or `-v, -vv, -vvv`.
+
+- `Info `: `--verbosity=1`, or `-v`
+- `Debug`: `--verbosity=2`, or `-vv`
+- `Trace`: `--verbosity=3`, or `-vvv`
+
+```sh
+turbo run build -v
+turbo run build --verbosity=2
+turbo run build -vvv
+```
+
+## `turbo prune --scope=<target>`
+
+Generate a sparse/partial monorepo with a pruned lockfile for a target workspace.
+
+This command will generate a folder called `out` with the following inside of it:
+
+- The full source code of all internal workspaces that are needed to build the target
+- A new pruned lockfile that contains only the subset of the original root lockfile with the dependencies that are actually used by the workspaces in the pruned workspace
+- A copy of the root `package.json`
+
+```
+.                            # Folder containing the full source code for all workspaces needed to build the target
+β”œβ”€β”€ package.json             # The root `package.json`
+β”œβ”€β”€ packages
+β”‚   β”œβ”€β”€ ui
+β”‚   β”‚   β”œβ”€β”€ package.json
+β”‚   β”‚   β”œβ”€β”€ src
+β”‚   β”‚   β”‚   └── index.tsx
+β”‚   β”‚   └── tsconfig.json
+β”‚   β”œβ”€β”€ shared
+β”‚   β”‚   β”œβ”€β”€ package.json
+β”‚   β”‚   β”œβ”€β”€ src
+β”‚   β”‚   β”‚   β”œβ”€β”€ __tests__
+β”‚   β”‚   β”‚   β”‚   β”œβ”€β”€ sum.test.ts
+β”‚   β”‚   β”‚   β”‚   └── tsconfig.json
+β”‚   β”‚   β”‚   β”œβ”€β”€ index.ts
+β”‚   β”‚   β”‚   └── sum.ts
+β”‚   β”‚   └── tsconfig.json
+β”‚   └── frontend
+β”‚       β”œβ”€β”€ next-env.d.ts
+β”‚       β”œβ”€β”€ next.config.js
+β”‚       β”œβ”€β”€ package.json
+β”‚       β”œβ”€β”€ src
+β”‚       β”‚   └── pages
+β”‚       β”‚       └── index.tsx
+β”‚       └── tsconfig.json
+└── yarn.lock                # The pruned lockfile for all targets in the subworkspace
+```
+
+### Options
+
+#### `--docker`
+
+`type: boolean`
+
+Defaults to `false`. Passing this flag will alter the output folder of the pruned workspace to make it easier to use with [Docker best practices / layer caching](https://docs.docker.com/develop/develop-images/dockerfile_best-practices/).
+
+With the `--docker` flag, the `prune` command will generate a folder called `out` with the following inside of it:
+
+- A folder `json` with the pruned workspace's `package.json`s
+- A folder `full` with the pruned workspace's full source code, but only including the internal packages that are needed to build the target
+- A new pruned lockfile that contains only the subset of the original root lockfile with the dependencies that are actually used by the packages in the pruned workspace
+
+```
+.
+β”œβ”€β”€ full                     # Folder containing the full source code for all packages needed to build the target
+β”‚   β”œβ”€β”€ package.json
+β”‚   └── packages
+β”‚       β”œβ”€β”€ ui
+β”‚       β”‚   β”œβ”€β”€ package.json
+β”‚       β”‚   β”œβ”€β”€ src
+β”‚       β”‚   β”‚   └── index.tsx
+β”‚       β”‚   └── tsconfig.json
+β”‚       β”œβ”€β”€ shared
+β”‚       β”‚   β”œβ”€β”€ package.json
+β”‚       β”‚   β”œβ”€β”€ src
+β”‚       β”‚   β”‚   β”œβ”€β”€ __tests__
+β”‚       β”‚   β”‚   β”‚   β”œβ”€β”€ sum.test.ts
+β”‚       β”‚   β”‚   β”‚   └── tsconfig.json
+β”‚       β”‚   β”‚   β”œβ”€β”€ index.ts
+β”‚       β”‚   β”‚   └── sum.ts
+β”‚       β”‚   └── tsconfig.json
+β”‚       └── frontend
+β”‚           β”œβ”€β”€ next-env.d.ts
+β”‚           β”œβ”€β”€ next.config.js
+β”‚           β”œβ”€β”€ package.json
+β”‚           β”œβ”€β”€ src
+β”‚           β”‚   └── pages
+β”‚           β”‚       └── index.tsx
+β”‚           └── tsconfig.json
+β”œβ”€β”€ json                     # Folder containing just the package.jsons for all targets in the subworkspace
+β”‚   β”œβ”€β”€ package.json
+β”‚   └── packages
+β”‚       β”œβ”€β”€ ui
+β”‚       β”‚   └── package.json
+β”‚       β”œβ”€β”€ shared
+β”‚       β”‚   └── package.json
+β”‚       └── frontend
+β”‚           └── package.json
+└── yarn.lock                # The pruned lockfile for all targets in the subworkspace
+```
+
+## `turbo login`
+
+Connect your machine to your Remote Cache provider. The default provider is [Vercel](https://vercel.com/).
+
+### Options
+
+#### `--url`
+
+`type: string`
+
+Defaults to `https://vercel.com/`.
+
+#### `--api`
+
+`type: string`
+
+Defaults to `https://vercel.com/api`.
+
+#### `--sso-team`
+
+`type: string`
+
+Connect to an SSO-enabled Vercel team by providing your team slug.
+ +```sh +turbo login --sso-team=<team-slug> +``` + +## `turbo logout` + +Logs you out of your Vercel account. + +## `turbo link` + +Link the current directory to a Remote Cache scope. The selected owner (either a user or an organization) will be able to share [cache artifacts](/repo/docs/core-concepts/caching) through [Remote Caching](/repo/docs/core-concepts/remote-caching). +You should run this command from the root of your monorepo. + +### Options + +#### `--api` + +`type: string` + +Defaults to `https://api.vercel.com` + +## `turbo unlink` + +Unlink the current directory from the Remote Cache. + +## `turbo bin` + +Get the path to the `turbo` binary. + +[1]: /repo/docs/reference/configuration#experimentalPassThroughEnv +[2]: /repo/docs/reference/configuration#experimentalGlobalPassThroughEnv diff --git a/docs/pages/repo/docs/reference/configuration.mdx b/docs/pages/repo/docs/reference/configuration.mdx new file mode 100644 index 0000000..ef52d46 --- /dev/null +++ b/docs/pages/repo/docs/reference/configuration.mdx @@ -0,0 +1,405 @@ +--- +title: Configuration +description: Learn how to configure Turborepo through `turbo.json`. +--- + +import Callout from "../../../../components/Callout"; +import OutputModeTable from "../../../../components/output-mode-table.mdx"; +import Link from 'next/link' + +# Configuration Options (`turbo.json`) + +You can configure the behavior of `turbo` by adding a `turbo.json` file in your monorepo's root directory. + +## `globalDependencies` + +`type: string[]` + +A list of file globs for global hash dependencies. The contents of these files will be included in the global hashing algorithm and affect the hashes of all tasks. +This is useful for busting the cache based on `.env` files (not in Git) or any root-level file that impacts workspace tasks but is not represented in the traditional dependency graph (e.g. a root `tsconfig.json`, `jest.config.js`, `.eslintrc`, etc.).
+ +<Callout type="info"> +These must be relative paths from the location of `turbo.json`, and they should be valid for any machine where + this configuration might be used. For instance, it is not a good idea to reference files in one user's home directory. +</Callout> + +**Example** + +```jsonc +{ + "$schema": "https://turbo.build/schema.json", + "pipeline": { + // ... omitted for brevity + }, + + "globalDependencies": [ + ".env", // contents will impact hashes of all tasks + "tsconfig.json" // contents will impact hashes of all tasks + ] +} +``` + +## `globalEnv` + +`type: string[]` + +A list of environment variables for implicit global hash dependencies. The contents of these environment variables will be included in the global hashing algorithm and affect the hashes of all tasks. + +**Example** + +```jsonc +{ + "$schema": "https://turbo.build/schema.json", + "pipeline": { + // ... omitted for brevity + }, + + "globalEnv": ["GITHUB_TOKEN"] // value will impact the hashes of all tasks +} +``` + +## `extends` + +`type: string[]` + +The `extends` key is only valid in Workspace Configurations. It will be +ignored in the root `turbo.json`. Read [the docs to learn more][1]. + +## `pipeline` + +An object representing the task dependency graph of your project. `turbo` interprets these conventions to properly schedule, execute, and cache the outputs of tasks in your project. + +Each key in the `pipeline` object is the name of a task that can be executed by `turbo run`. If `turbo` finds a workspace with a `package.json` `scripts` object with a matching key, it will apply the pipeline task configuration to that npm script during execution. This allows you to use `pipeline` to set conventions across your entire Turborepo. 
+ +```jsonc +{ + "$schema": "https://turbo.build/schema.json", + "pipeline": { + "build": { + "dependsOn": ["^build"] + }, + "test": { + "outputs": ["coverage/**"], + "dependsOn": ["build"], + "inputs": ["src/**/*.tsx", "src/**/*.ts", "test/**/*.ts"], + "outputMode": "full" + }, + "dev": { + "cache": false, + "persistent": true + } + } +} +``` + +### `dependsOn` + +`type: string[]` + +The list of tasks this task depends on. + +Prefixing an item in `dependsOn` with a `^` tells `turbo` that this pipeline task depends on the workspace's topological dependencies completing the `^`-prefixed task first (e.g. "a workspace's `build` task should only run once all of its `dependencies` and `devDependencies` have completed their own `build` commands"). + +Items in `dependsOn` without the `^` prefix express relationships between tasks at the workspace level (e.g. "a workspace's `test` and `lint` commands depend on `build` being completed first"). + +<Callout type="info"> + As of version 1.5, using `$` to declare environment variables in the `dependsOn` config is + deprecated. <Link href="#env">Use the `env` key instead.</Link> +</Callout> + +**Example** + +```jsonc +{ + "$schema": "https://turbo.build/schema.json", + "pipeline": { + "build": { + // "A workspace's `build` command depends on its dependencies' + // or devDependencies' `build` command being completed first" + "outputs": [".next/**", "!.next/cache/**", "dist/**"], + "dependsOn": ["^build"] + }, + "test": { + // "A workspace's `test` command depends on its own `lint` and + // `build` commands first being completed" + "dependsOn": ["lint", "build"] + }, + "deploy": { + // "A workspace's `deploy` command depends on its own `build` + // and `test` commands first being completed" + "dependsOn": ["build", "test"] + }, + // A workspace's `lint` command has no dependencies + "lint": {} + } +} +``` + +### `env` + +`type: string[]` + +The list of environment variables a task depends on.
+ +**Example** + +```jsonc +{ + "$schema": "https://turbo.build/schema.json", + "pipeline": { + "build": { + "dependsOn": ["^build"], + "env": ["SOMETHING_ELSE"], // value will impact the hashes of all build tasks + "outputs": ["dist/**", ".next/**", "!.next/cache/**"] + }, + "web#build": { + "dependsOn": ["^build"], + "env": ["STRIPE_SECRET_KEY"], // value will impact hash of only web's build task + "outputs": [".next/**", "!.next/cache/**"] + } + }, + "globalEnv": [ + "GITHUB_TOKEN" // value will impact the hashes of all tasks + ] +} +``` + +<Callout type="info"> + When Turborepo detects a common frontend framework in a workspace, it will + automatically depend on environment variables that are going to be inlined in + your build. For example, if the `web` workspace contains a Next.js project, + you do not need to specify any environment variables that [start with + `NEXT_PUBLIC_`](https://nextjs.org/docs/basic-features/environment-variables#exposing-environment-variables-to-the-browser) + in the `env` config. Turborepo already knows that the build output will + change when the value of these environment variables change, so it will depend + on them automatically. See more in the [docs on + caching](/repo/docs/core-concepts/caching#automatic-environment-variable-inclusion). +</Callout> + +### `outputs` + +`type: string[]` + +The set of glob patterns of a task's cacheable filesystem outputs. + +Note: `turbo` automatically logs `stderr`/`stdout` to `.turbo/run-<task>.log`. This file is _always_ +treated as a cacheable artifact and never needs to be specified. + +Omitting this key or passing an empty array can be used to tell `turbo` that a task is a side-effect +and thus doesn't emit any filesystem artifacts (e.g. a linter), but you still want to cache its +logs (and treat them like an artifact). + +<Callout type="info"> + `outputs` globs must be specified as relative paths rooted at the workspace directory.
+</Callout> + +**Example** + +```jsonc +{ + "$schema": "https://turbo.build/schema.json", + "pipeline": { + "build": { + // "Cache all files emitted to workspace's dist/** or .next + // directories by a `build` task" + "outputs": ["dist/**", ".next/**", "!.next/cache/**"], + "dependsOn": ["^build"] + }, + "test": { + // "Don't cache any artifacts of `test` tasks (aside from + // logs)" + "dependsOn": ["build"] + }, + "test:ci": { + // "Cache the coverage report of a `test:ci` command" + "outputs": ["coverage/**"], + "dependsOn": ["build"] + }, + "dev": { + // Never cache anything (including logs) emitted by a + // `dev` task + "cache": false, + "persistent": true + } + } +} +``` + +### `cache` + +`type: boolean` + +Defaults to `true`. Whether or not to cache the task [`outputs`](#outputs). Setting `cache` to `false` is useful for long-running daemon, "watch", or development-mode tasks you don't want to cache. + +**Example** + +```jsonc +{ + "$schema": "https://turbo.build/schema.json", + "pipeline": { + "build": { + "outputs": [".svelte-kit/**", "dist/**"], + "dependsOn": ["^build"] + }, + "test": { + "dependsOn": ["build"] + }, + "dev": { + "cache": false, + "persistent": true + } + } +} +``` + +### `inputs` + +`type: string[]` + +Defaults to `[]`. Tells `turbo` the set of files to consider when determining if a workspace has changed for a particular task. +Setting this to a list of globs will cause the task to only be rerun when files matching those globs have +changed. This can be helpful if you want to, for example, skip running tests unless a source file changed. + +Specifying `[]` will cause the task to be rerun when any file in the workspace changes. + +<Callout type="info"> + `inputs` globs must be specified as relative paths rooted at the workspace directory. +</Callout> + +**Example** + +```jsonc +{ + "$schema": "https://turbo.build/schema.json", + "pipeline": { + // ...
omitted for brevity + + "test": { + // A workspace's `test` task depends on that workspace's + // own `build` task being completed first. + "dependsOn": ["build"], + "outputs": [".next/**", "!.next/cache/**"], + // A workspace's `test` task should only be rerun when + // either a `.tsx` or `.ts` file has changed. + "inputs": ["src/**/*.tsx", "src/**/*.ts", "test/**/*.ts"] + } + } +} +``` + +<Callout type="info"> + Note: `turbo.json` is *always* considered an input. If you modify + `turbo.json`, all caches are invalidated. +</Callout> + +### `outputMode` + +`type: "full" | "hash-only" | "new-only" | "errors-only" | "none"` + +Sets the type of output logging. + +<OutputModeTable /> + +**Example** + +```jsonc +{ + "$schema": "https://turbo.build/schema.json", + "pipeline": { + "build": { + "dependsOn": ["^build"], + "outputs": [".svelte-kit/**", "dist/**"], + "outputMode": "new-only" + }, + "test": { + "dependsOn": ["build"] + } + } +} +``` + +### `persistent` + +`type: boolean` + +Label a task as `persistent` if it is a long-running process, such as a dev server or `--watch` mode. +Turbo will prevent other tasks from depending on persistent tasks. Without setting this +config, if any other task depends on `dev`, it will never run, because `dev` never exits. With this +option, `turbo` can warn you about an invalid configuration. + +**Example** + +```jsonc +{ + "$schema": "https://turbo.build/schema.json", + "pipeline": { + "dev": { + "persistent": true + } + } +} +``` + +[1]: /repo/docs/core-concepts/monorepos/configuring-workspaces + +## Experimental + +### `experimentalGlobalPassThroughEnv` + +This goes at the root of your configuration. + +`type: string[]` + +An allowlist of environment variables that should be made available to all tasks +but should not contribute to the task's cache key. Using this key opts all tasks +into strict environment variable mode. + +Changing this list will contribute to the global cache key, but the value of each +variable will not.
+ +**Example** + +`AWS_SECRET_KEY` and `GITHUB_TOKEN` are available to all tasks in `strict` [env mode][r-cli-env-mode]. + +```jsonc +{ + "$schema": "https://turbo.build/schema.json", + "experimentalGlobalPassThroughEnv": ["AWS_SECRET_KEY", "GITHUB_TOKEN"], + "pipeline": { + // ...task definitions... + } +} +``` + +### `experimentalPassThroughEnv` + +`type: string[]` + +This config goes inside each task definition in the [`pipeline`][r-config-pipeline]. + +An allowlist of environment variables that should be made available to this task +but should not contribute to the task's cache key. Using this key opts this task +into strict environment variable mode. + +Changing this list will contribute to the task's cache key, but the value of each +variable will not. + +**Example** + +`AWS_SECRET_KEY` and `GITHUB_TOKEN` are available to the `build` task, but not to the `lint` task +in `strict` [env mode][r-cli-env-mode]. + +```jsonc +{ + "$schema": "https://turbo.build/schema.json", + "pipeline": { + "build": { + "experimentalPassThroughEnv": ["AWS_SECRET_KEY", "GITHUB_TOKEN"] + }, + "lint": {} + } +} +``` + +[r-config-pipeline]: #pipeline +[r-cli-env-mode]: /repo/docs/reference/command-line-reference#--experimental-env-mode diff --git a/docs/pages/repo/docs/troubleshooting.mdx b/docs/pages/repo/docs/troubleshooting.mdx new file mode 100644 index 0000000..dfa3256 --- /dev/null +++ b/docs/pages/repo/docs/troubleshooting.mdx @@ -0,0 +1,172 @@ +--- +title: Troubleshooting Runs +description: This guide aims to help you debug issues with your Turborepo builds and configuration. +--- + +# Troubleshooting Runs + +As with most tools, it can be frustrating to understand why Turborepo +is not working the way you expect. This page covers some tools to debug when +using the `turbo` CLI and some common problems you may encounter. + +## Enable Verbose Logs + +Logs are the best debugging tool we have as developers. You can turn up the log +level with the [`--verbosity`][1] flag.
Combined with [building from +source][3], this can be a powerful and flexible way to see what's going on under +the hood. + +## Check the Run Summary + +The [`--summarize`][r-summarize] flag generates and saves metadata about your `turbo run` +as a JSON file in `.turbo/runs`. You can use it to compare subsequent runs and inspect +the contents of the cached artifact and the inputs to the hash for a task. + +## Check your Configuration + +### Task Configuration + +You can [get started][7] with Turborepo with minimal configuration -- that's one +of the things people love about Turborepo! But when you omit configuration, +Turborepo internally falls back to smart defaults. Additionally, when using +[Workspace Configurations][d-config-workspaces] in a monorepo, it can be +confusing to understand how Turborepo interpreted your `turbo.json`. You can use +the `--dry` or `--dry=json` flags to get a "resolved" task configuration for any task. +For example: + +```bash +turbo run build --dry=json +``` + +Look for a `resolvedTaskConfiguration` key in the output. + +### User Config + +When you link your repository to Vercel, Turborepo stores configuration in two places: + +- your Vercel team information is stored in `.turbo/config.json`. You can + inspect this file to see what else might be in there! +- an authentication token is stored in + `~/Library/Application\ Support/turborepo/config.json`. + +## Inspect the Cache + +When Turborepo runs a task that has configured `outputs`, it caches those +outputs, along with the logs from that task, in `node_modules/.cache/turbo/`. +These artifacts are compressed with `tar`, but you can uncompress and see what's +in them. + +## Build from Source + +One of the advantages of JavaScript codebases is that you can open up +`node_modules/` and edit the code you're running inline. This is not possible +with `turbo`, because the runnable code is a compiled binary and you cannot edit +it inline.
But because the codebase is Open Source, you can always get +the source code, modify it, and build it locally. The bulk of this +documentation is available in the [Contributing Guide][4], but you can use those +directions even if you aren't planning to make a contribution. + +1. Clone the git repo from [`vercel/turbo`][source] +1. `cd cli` +1. Make any changes (for example, add more logging) +1. Run `make` +1. From _your_ project, use `/:path/:to/:turbo/target/debug/turbo` instead of the global + `turbo` or the version of `turbo` installed in your project. + +## Common Pitfalls + +### The `.turbo` directory + +One of the [core concepts][2] behind Turborepo is that when a declared input +changes, the cached outputs for that task are invalidated. As part of running any task, +Turborepo creates the following directories: + +- A `.turbo` directory at the root of your repo +- A `.turbo` directory in each workspace if your project is a monorepo (e.g. `apps/my-app/.turbo/`) +- A `turbo` directory in `node_modules/.cache` + +Because the first two directories are not git-ignored by default, you may see an +issue where you run the same task twice and get a cache miss, even though you +didn't change anything, because the generated `.turbo` directories are getting included as +task _inputs_, invalidating the cache. To avoid this problem, add `.turbo` to your +`.gitignore` file. Alternatively, you can also limit your [`inputs` configuration][r-inputs-config] +so that `.turbo` is not included in the cache inputs. + +## Common Questions + +### I'm not seeing any cache hits + +In general, when you run `turbo run` twice in a row, you should expect a +cache hit on the second run. If this isn't happening, run both builds again with the [`--summarize` +flag][r-summarize] and compare the generated Run Summaries to each other. In most cases, the +comparison should show why the second run did not get a cache hit.
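+ +A minimal way to do that comparison, sketched here with hypothetical summary file names, looks like: + +```sh +# Run the build twice, saving a Run Summary for each run +turbo run build --summarize +turbo run build --summarize + +# Each run writes a JSON summary into .turbo/runs/ -- find the two newest +ls -t .turbo/runs/*.json | head -2 + +# ...then diff them (substitute the real file names from the listing above) +diff .turbo/runs/<first-summary>.json .turbo/runs/<second-summary>.json +``` + +Differences in the hash inputs between the two summaries usually point directly at the file or environment variable that invalidated the cache.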
+ +You can also ask: + +- Is any source code being generated during the build that isn't checked into git? + + This would change the fingerprint Turborepo uses to store build outputs. + +- Are cache [outputs][d-config-outputs] correctly specified in your Turborepo [pipeline][d-def-pipeline]? + + Pipeline settings are not inherited or merged, so they need to be + re-specified in [workspace-specific tasks][d-workspace-tasks] (e.g. `web#build` does + **not** inherit pipeline settings from `build`). + +- [Are relevant inlined environment variables accounted for?][12] + + [Enable verbose mode][5] to see which environment variables are included in the hashes. + +### I'm seeing cache hits, but my build is broken + +- Are [cache outputs properly specified][d-config-outputs] in your Turborepo [pipeline][d-def-pipeline]? + + Pipeline settings are not inherited or merged, so they need to be + re-specified in [workspace-specific tasks][d-workspace-tasks] (e.g. `web#build` does + **not** inherit pipeline settings from `build`). + +### My build is caching the wrong environment variables + +- [Are relevant inlined environment variables accounted for?][12] + + [Enable verbose mode][5] to see which environment variables are included in the hashes. + +## Common Monorepo Questions + +### My dependency isn't being built correctly + +- Are you properly bundling and transpiling the dependency before building the application? + + For example, tools like `tsc`, `tsup`, `esbuild`, `babel`, and `swc` + will convert newer JavaScript features back to β€œpure” JavaScript. + + If you are using Next.js, you might be using `transpilePackages`. Ensure you + add the name of the dependency inside `next.config.js` ([example][17]). + +- Have you listed `files` in the dependency's `package.json` to point to the correct files? + +### My types are not being found + +- Did you specify `types` or `typings` inside the dependency's `package.json` to + point to the `.d.ts` file?
+ +- Have you altered or set custom `tsconfig.json` `paths`? + - Do they have the correct folder structure for your application? + - Are they properly configured for the meta framework, bundler, or transpilation tool? + +[1]: /repo/docs/reference/command-line-reference#verbosity +[2]: /repo/docs/core-concepts/caching +[3]: #build-from-source +[4]: https://github.com/vercel/turbo/blob/main/CONTRIBUTING.md +[5]: #enable-verbose-logs +[7]: /repo/docs/getting-started +[9]: /repo/docs/reference/command-line-reference#turbo-link +[12]: /repo/docs/core-concepts/caching#altering-caching-based-on-environment-variables +[17]: https://github.com/vercel/turbo/blob/main/examples/basic/apps/docs/next.config.js#L1 +[d-workspace-tasks]: /repo/docs/core-concepts/monorepos/running-tasks#specific-workspace-tasks +[d-config-workspaces]: /repo/docs/core-concepts/monorepos/configuring-workspaces +[d-config-outputs]: /repo/docs/core-concepts/caching#configuring-cache-outputs +[d-def-pipeline]: /repo/docs/core-concepts/monorepos/running-tasks#defining-a-pipeline +[source]: https://github.com/vercel/turbo +[r-inputs-config]: /repo/docs/reference/configuration#inputs +[r-summarize]: /repo/docs/reference/command-line-reference#--summarize diff --git a/docs/pages/repo/docs/upgrading-to-v1.mdx b/docs/pages/repo/docs/upgrading-to-v1.mdx new file mode 100644 index 0000000..3ed2de3 --- /dev/null +++ b/docs/pages/repo/docs/upgrading-to-v1.mdx @@ -0,0 +1,78 @@ +--- +title: Upgrading to v1 +description: Learn how to upgrade to Turborepo v1, now open-source and part of Vercel. +--- + +# Upgrading to Turborepo v1.x + +Turborepo has been acquired by Vercel! With this announcement, **Vercel is open sourcing the `turbo` CLI and offering Remote Caching for free on all accounts during the transition period.** + +Existing Turborepo customers should upgrade their `turbo` CLI to v1.x as soon as possible and migrate to Vercel (instructions below). 
Versions of the `turbo` CLI prior to 1.x will no longer be maintained. New account creation on beta.turborepo.com has been disabled. The beta.turborepo.com dashboard and remote caching service will be shut down on January 15th, 2022, and older versions will not be installable. + +**All existing Remote Cache artifacts will also be deleted at this time**. + +Below is a step-by-step migration guide for existing Turborepo users. If you get stuck, please reach out in the community [Discord](https://turbo.build/discord) or file an issue on [GitHub](https://github.com/vercel/turbo). Thank you again for your continued support as we begin this awesome new chapter of Turborepo together. + +--- + +## 1. Cleaning up + +For good hygiene, ensure you log out of `turbo` to remove old credentials: + +```sh +yarn turbo logout +``` + +If it exists, also delete the `.turbo` directory from the root of your monorepo: + +```sh +rm -rf .turbo +``` + +## 2. Install the latest release of `turbo` + +Install the latest version of `turbo`: + +```sh +yarn add turbo --save-dev --ignore-workspace-root-check +``` + +## 3. Set up Remote Caching + +As mentioned, Turborepo now provides zero-config Remote Caching through [Vercel](https://vercel.com/?utm_source=turbo.build&utm_medium=referral&utm_campaign=docs-link). Remote Caching is free for all Vercel plans during this transition period. Each Vercel account has a shared Remote Cache. This cache is shared across all environments (Development, Preview, and Production). + +**Important**: turborepo.com allowed multiple caches (i.e. projects) per team (denoted through the `--project` flag). With v1.x caching on Vercel, each Vercel account (user or team) has a single shared Remote Cache. If you were actively using multiple turborepo.com projects for your team, please let us know in [Discord](https://turbo.build/discord). + +Please note that we are not migrating cache artifacts to Vercel.
We apologize for the slower builds during your migration as you rehydrate your remote cache on Vercel or custom cache infra. + +## 4. Local Development + +If you were using Remote Caching for local development, upgrading will take a minute or two. To get started, log in to the Vercel CLI: + +```sh +npx turbo login +``` + +Now we can set up Remote Caching through Vercel by running: + +```sh +npx turbo link +``` + +Follow the prompts and select the Vercel account (user or team) you wish to connect to. + +### On Vercel + +- If you already used Turborepo and Vercel together, remove `TURBO_TOKEN`, `TURBO_TEAM`, and `TURBO_PROJECT` environment variables from all projects. These are now automatically set on your behalf by Vercel. + +- Remove the usage of `--team`, `--token`, and `--project` CLI flags in your Vercel project settings and/or `package.json` scripts. + +### On other CI/CD + +- Replace your turborepo.com personal access token with a new [Vercel personal access token](https://vercel.com/account/tokens) and update the `TURBO_TOKEN` environment variable or equivalent usage of the `--token` CLI flag. +- Remove the `TURBO_PROJECT` environment variable and remove all usage of the `--project` CLI flag. This has been deprecated. +- Update the value of the `TURBO_TEAM` environment variable and `--team` CLI flag to be your Vercel account slug (i.e. `https://vercel.com/<slug>`). + +### Getting Help + +If you are having difficulty upgrading, please file an issue on [GitHub](https://github.com/vercel/turbo). If you are having difficulty with your remote caching on Vercel, please reach out in [Discord](https://turbo.build/discord). diff --git a/docs/pages/repo/index.mdx b/docs/pages/repo/index.mdx new file mode 100644 index 0000000..fa22190 --- /dev/null +++ b/docs/pages/repo/index.mdx @@ -0,0 +1,7 @@ +--- +description: Turborepo is a high-performance build system for JavaScript and TypeScript codebases.
+--- + +import TurborepoHome from "../../components/pages/repo-home"; + +<TurborepoHome /> diff --git a/docs/pages/showcase.mdx b/docs/pages/showcase.mdx new file mode 100644 index 0000000..5cef553 --- /dev/null +++ b/docs/pages/showcase.mdx @@ -0,0 +1,10 @@ +--- +headeronly: true +searchable: false +layout: full +description: "Turborepo by Vercel is one of the fastest-growing build systems in the frontend ecosystem. It's trusted by thousands of developers in production, including teams at Vercel, AWS, Netflix, Microsoft, Disney, and more." +--- + +import Showcase from "../components/pages/showcase"; + +<Showcase /> diff --git a/docs/pages/terms.mdx b/docs/pages/terms.mdx new file mode 100644 index 0000000..89eb14b --- /dev/null +++ b/docs/pages/terms.mdx @@ -0,0 +1,90 @@ +--- +description: See our terms of our service and how they relate to you. +hidden: true +headeronly: true +container: true +searchable: false +--- + +# Turborepo.org Internet Web Site Terms of Service + +THE TERMS AND CONDITIONS SET FORTH BELOW (THE "TERMS") GOVERN YOUR USE OF THIS SITE. THESE TERMS ARE A LEGAL CONTRACT BETWEEN YOU AND VERCEL INC. ("Vercel", "WE" OR "US") AND GOVERN YOUR ACCESS TO, AND USE OF, THE TURBOREPO.ORG WEBSITE (THE "SITE"). IF YOU DO NOT AGREE WITH ANY OF THESE TERMS, DO NOT ACCESS OR OTHERWISE USE THIS SITE OR ANY INFORMATION CONTAINED ON THIS SITE. YOUR USE OF THIS SITE SHALL BE DEEMED TO BE YOUR AGREEMENT TO ABIDE BY EACH OF THE TERMS SET FORTH BELOW. VERCEL MAY MAKE CHANGES TO THE CONTENT OFFERED ON THIS SITE AT ANY TIME. VERCEL CAN CHANGE THESE TERMS AT ANY TIME. IF VERCEL MAKES ANY CHANGES, VERCEL WILL POST AN UPDATED TERMS OF USE ON THIS SITE AND MAY SEND REGISTERED USERS AN EMAIL NOTICE OF THE CHANGES. IF ANY MODIFICATION IS UNACCEPTABLE TO YOU, YOU SHALL CEASE USING THIS SITE. IF YOU DO NOT CEASE USING THIS SITE, YOU WILL BE DEEMED TO HAVE ACCEPTED THE CHANGE.
+ +TURBOREPO SOFTWARE IS SEPARATELY MADE AVAILABLE PURSUANT TO AN OPEN SOURCE LICENSE AND YOU MAY FIND LINKS TO THE TURBOREPO SOFTWARE AND ITS ASSOCIATED LICENSES ON THIS SITE. + +## General Use Restrictions + +Vercel provides various content through this Site. Certain information, documents, documentation, products and services provided on and through this Site, including content, logos, graphics and images (together, the "Materials") are made available to you by Vercel and are the copyrighted and/or trademarked work of Vercel or Vercel's licensors. + +Vercel grants you a limited, personal, non-exclusive and non-transferable license to use and to display and to make one copy of the Materials solely for your personal or internal business use. Except for the license set forth in the preceding sentence above, you acknowledge and agree that you have no right to modify, edit, copy, reproduce, create derivative works of, reverse engineer, alter, enhance or in any way exploit any of the Materials in any manner. This limited license terminates automatically, without notice to you, if you breach any of these Terms. Except as stated herein, you acknowledge that you have no right, title or interest in or to this Site, or any Materials. + +## Privacy Policy + +Your use of this Site is governed by our Privacy Policy, which is available at https://turborepo.org/privacy (the "Privacy Policy"). + +## Third Party Content + +Certain information and other content on the Site may be the material of third party licensors and suppliers to Vercel ("Third Party Content"). The Third Party Content is, in each case, the copyrighted and/or trademarked work of the creator/licensor. You agree to only display the Third Party Content on your computer solely for your personal or internal business use. 
You acknowledge and agree that you have no right to download, cache, reproduce, modify, display (except as set forth in this paragraph), edit, alter or enhance any of the Third Party Content in any manner unless you have permission from the owner of the Third Party Content. Vercel DISCLAIMS ALL EXPRESS, IMPLIED AND STATUTORY WARRANTIES AND CONDITIONS WITH REGARD TO THIRD PARTY CONTENT, INCLUDING, BUT NOT LIMITED TO, ALL IMPLIED WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NON-INFRINGEMENT. + +## Links to Other Vercel Sites and Third Party Sites + +This Site may be linked to other Vercel sites where you may find other information regarding Vercel and its services. +This Site also links to other sites that are not Vercel sites. Vercel is providing these links to you only as a convenience, and Vercel is not responsible for such linked sites, including, without limitation, the content or links displayed on such sites. + +## Unauthorized Activities + +Unauthorized use of any Materials or Third Party Content contained on this Site may violate certain laws and regulations. You agree to indemnify and hold Vercel and its officers, directors, employees, affiliates, agents, licensors, and business partners harmless from and against any and all costs, damages, liabilities, and expenses (including attorneys' fees and costs of defense) Vercel or any other indemnified party suffers in relation to, arising from, or for the purpose of avoiding, any claim or demand from a third party that your use of this Site violates any applicable law or regulation, or the copyrights, trademark rights or other rights of any third party. + +## Proprietary Rights + +Vercel and Turborepo are trademarks of Vercel. Other trademarks, names and logos on this Site are the property of their respective owners. 
+ +Unless otherwise specified in these Terms, all information and screens appearing on this Site, including documents, services, site design, text, graphics, logos, images and icons, as well as the arrangement thereof, are the sole property of Vercel Inc., Copyright Β© 2021-2022 Vercel Inc. All rights not expressly granted herein are reserved. Except as otherwise required or limited by applicable law, any reproduction, distribution, modification, retransmission, or publication of any copyrighted material is strictly prohibited without the express written consent of the copyright owner or license. + +## Disclaimer of Warranties + +Your use of this Site is at your own risk. Neither the Materials nor the Third Party Content have been verified or authenticated in whole or in part by Vercel, and they may include inaccuracies or typographical or other errors. Vercel does not warrant the accuracy or timeliness of the Materials or the Third Party Content contained on this Site. Vercel has no liability for any errors or omissions in the Materials and/or the Third Party Content, whether provided by Vercel or our licensors. + +VERCEL, FOR ITSELF AND ITS LICENSORS, MAKES NO EXPRESS, IMPLIED OR STATUTORY REPRESENTATIONS, WARRANTIES, OR GUARANTEES IN CONNECTION WITH THIS SITE, ANY MATERIALS, OR THIRD PARTY CONTENT, RELATING TO THE QUALITY, SUITABILITY, TRUTH, ACCURACY OR COMPLETENESS OF ANY INFORMATION OR MATERIAL CONTAINED OR PRESENTED ON THIS SITE AND/OR THE RESULTS OBTAINED FROM THE USE OF THIS SITE, INCLUDING WITHOUT LIMITATION THE MATERIALS AND THE THIRD PARTY CONTENT. TO THE MAXIMUM EXTENT PERMITTED BY APPLICABLE LAW, THIS SITE, THE SERVICES, MATERIALS, THIRD PARTY CONTENT, AND ANY INFORMATION OR MATERIAL CONTAINED OR PRESENTED ON THIS SITE IS PROVIDED TO YOU ON AN "AS IS," "AS AVAILABLE" AND "WHERE-IS" BASIS WITH NO WARRANTY OF IMPLIED WARRANTY OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE, OR NON-INFRINGEMENT OF THIRD PARTY RIGHTS. 
VERCEL DOES NOT PROVIDE ANY WARRANTIES AGAINST VIRUSES, SPYWARE OR MALWARE THAT MAY BE INSTALLED ON YOUR COMPUTER. VERCEL HAS SOLE DISCRETION WHETHER TO ACCEPT OR REJECT YOU AS AN EXPERT OR A CLIENT. + +## Limitation of Liability + +VERCEL SHALL NOT BE LIABLE TO YOU FOR ANY DAMAGES RESULTING FROM YOUR DISPLAYING, COPYING, OR DOWNLOADING ANY MATERIAL OR THIRD PARTY CONTENT TO OR FROM THIS SITE. IN NO EVENT SHALL VERCEL BE LIABLE TO YOU FOR ANY INDIRECT, EXTRAORDINARY, EXEMPLARY, PUNITIVE, SPECIAL, INCIDENTAL, OR CONSEQUENTIAL DAMAGES (INCLUDING LOSS OF DATA, REVENUE, PROFITS, USE OR OTHER ECONOMIC ADVANTAGE) HOWEVER ARISING, EVEN IF VERCEL KNOWS THERE IS A POSSIBILITY OF SUCH DAMAGE OR FOR DAMAGES IN THE AGGREGATE EXCEEDING THE AMOUNT OF ONE HUNDRED DOLLARS. + +## Local Laws; Export Control + +Vercel controls and operates this Site from its headquarters in the United States of America and the Materials and/or Third Party Content may not be appropriate or available for use in other locations. If you use this Site outside the United States of America, you are responsible for following applicable local laws. + +## General + +If you violate these Terms, Vercel may terminate and/or suspend your access to this Site without notice. The proprietary rights, disclaimer of warranties, representations made by you, indemnities, limitations of liability and general provisions shall survive any termination of these Terms. + +Vercel may change these Terms from time to time by providing thirty (30) days prior notice, either by posting a notice at the Site or by sending you notice. You can review the most current version of these Terms at any time at this Site. The revised Terms will become effective thirty (30) days after Vercel posts or sends you notice of such changes, and if you use the Site after that date, your use will constitute acceptance of the revised Terms. If any change to these Terms is not acceptable to you, your only remedy is to stop using the Site.
If any provision of these Terms is found to be unenforceable or invalid, that provision will be limited or eliminated to the minimum extent necessary so that these Terms will otherwise remain in full force and effect and enforceable. You may not assign, transfer or sublicense these Terms without the prior written consent of Vercel, but Vercel may assign or transfer these Terms, in whole or in part, without restriction. Any attempted assignment or transfer of these Terms by the parties in contravention of the foregoing shall be null and void. These Terms are the complete and exclusive statement of the mutual understanding of the parties and supersede and cancel all previous written and oral agreements, communications and other understandings relating to the subject matter of these Terms, and all amendments must be in writing signed by both parties, except as otherwise provided herein. Vercel's failure to exercise or enforce any right or provision of these Terms shall not be a waiver of that right. No agency, partnership, joint venture, or employment is created as a result of these Terms and neither party has any authority of any kind to bind the other party in any respect whatsoever. In any action or proceeding to enforce rights under these Terms, the prevailing party will be entitled to recover costs and attorneys' fees. Except as stated above, all notices under these Terms will be in writing and will be deemed to have been duly given when received, if personally delivered; when receipt is electronically confirmed, if transmitted by email; the day after it is sent, if sent for next day delivery by recognized overnight delivery service; and upon receipt, if sent by certified or registered mail, return receipt requested.
+ +Any delays in or failure of performance of Vercel shall not constitute a default hereunder or give rise to any claims for damages if, to the extent that, and for such period that, such delays or failures of performance are caused by any events beyond the reasonable control of Vercel including, without limitation, any of the following specific occurrences: acts of God or the public enemy, acts of terrorism, pandemics, epidemics, labor strikes, expropriation or confiscation of facilities, compliance with any unanticipated duly promulgated governmental order, acts of war, rebellion or sabotage or damage resulting therefrom, fires, floods, explosion, or riots. + +## Governing Law; Disputes; Arbitration + +**Law.** These Terms shall be governed by the laws of the State of California without regard to its conflict of laws provisions, and the parties agree to the exclusive jurisdiction of the state and federal courts in San Francisco County. + +**Binding Contract.** You acknowledge that these Terms are a contract between you and Vercel, even though they are electronic and are not physically signed by you and Vercel, and they govern your use of the Site. + +**Arbitration.** Certain portions of this Section are deemed to be a "written agreement to arbitrate" pursuant to the Federal Arbitration Act ("FAA"). You and Vercel expressly agree and intend that this Section satisfies the "writing" requirement of the Federal Arbitration Act. This Section can only be amended by mutual agreement.
For purposes of this Section, "Claims" means collectively, and without limitation, any and all claims, injuries, demands, liabilities, disputes, causes of action (including statutory, contract, negligence, or other tort theories), proceedings, obligations, debts, liens, fines, charges, penalties, contracts, promises, costs, expenses (including attorneys' fees, whether incurred pre-litigation, pre-trial, at trial, on appeal, or otherwise), damages of any kind whatsoever (including consequential, compensatory, or punitive damages), or losses (whether known, unknown, asserted, non-asserted, fixed, conditional, or contingent) that arise from or relate to (i) the Site, including any and all contents, materials and software related thereto, and/or (ii) your use of the Site. + +**Informal Dispute Resolution.** If any Claim arises out of or relates to the Site or these Terms, other than as may be provided herein, then you and Vercel agree to send notice to the other providing a reasonable description of the Claim, along with a proposed resolution of it. Vercel's notice to you will be sent based on the most recent contact information that you provided Vercel. If no such information exists or if such information is not current, Vercel has no obligation under this Section. For a period of sixty (60) days from the date of receipt of notice from the other party, you and Vercel will engage in a dialog to attempt to resolve the Claim, though nothing will require either you or Vercel to resolve the Claim on terms with respect to which you and Vercel, in each party's sole discretion, are not comfortable.
+ +**Applicable Law.** If you and Vercel cannot resolve a Claim within sixty (60) days of the receipt of the notice, then you agree that any such Claim and all other disputes arising out of or relating to the interpretation, applicability, enforceability or formation of these Terms, including, but not limited to, any claim that all or any part of these Terms are void or voidable, or whether a claim is subject to arbitration relating to your use of the Site, will be resolved by binding arbitration, rather than in court. The FAA, not state law, shall govern the arbitrability of such disputes, including the class action waiver below. However, you and Vercel agree that California state law or United States federal law shall apply to, and govern, as appropriate, any and all Claims or disputes arising between you and Vercel regarding these Terms and the Site, whether arising or stated in contract, statute, common law, or any other legal theory, without regard to choice of law principles. There is no judge or jury in arbitration, and court review of an arbitration award is limited. However, an arbitrator must follow the terms of these Terms as a court would. THIS SECTION, INCLUDING THE PROVISIONS ON BINDING ARBITRATION AND CLASS ACTION WAIVER, SHALL SURVIVE ANY TERMINATION OF YOUR ACCOUNT OR THE SERVICE. + +**Arbitration Request.** To begin an arbitration proceeding, you must send a letter requesting arbitration and describing your claim to Vercel at 340 S Lemon Ave #4133, Walnut CA, 91789. The arbitration will be conducted by JAMS in accordance with the JAMS Streamlined Arbitration Procedure Rules in effect at the time the arbitration is initiated, excluding any rules or procedures governing or permitting class actions. Payment of all filing, administration and arbitrator fees will be governed by JAMS's rules. The arbitration shall take place in San Francisco, California or at such other venue (and pursuant to such procedures) as is mutually agreed upon.
You can obtain JAMS procedures, rules, and fee information as follows: JAMS: 800.352.5267 and https://www.jamsadr.com. + +**Fees.** You and Vercel will pay the administrative and arbitrator's fees and other costs in accordance with the applicable arbitration rules; but if applicable arbitration rules or laws require Vercel to pay a greater portion or all of such fees and costs in order for this Section to be enforceable, then Vercel will have the right to elect to pay the fees and costs and proceed to arbitration. Arbitration rules may permit Vercel to recover attorneys' fees. Vercel will not seek to recover attorneys' fees and costs in arbitration unless the arbitrator determines the claims are frivolous. + +**Class Action Waiver.** YOU AND VERCEL EACH AGREE THAT ANY DISPUTE RESOLUTION PROCEEDING WILL BE CONDUCTED ONLY ON AN INDIVIDUAL BASIS AND NOT IN A CLASS, CONSOLIDATED OR REPRESENTATIVE ACTION. You and Vercel each agree that such proceeding shall take place solely by means of judicial reference pursuant to California Code of Civil Procedure section 638. + +**Exceptions.** Notwithstanding the agreement to resolve all disputes through arbitration, you or Vercel may bring suit in court to enjoin infringement or other misuse of intellectual property rights (including patents, copyrights, trademarks, trade secrets, and moral rights, but not including privacy rights). You or Vercel may also seek relief in small claims court for Claims within the scope of that court's jurisdiction. In the event that the arbitration provisions above are found not to apply to you or to a particular Claim, either as a result of your decision to opt-out of the arbitration provisions or as a result of a decision by the arbitrator or a court order, you agree that the venue for any such Claim or dispute is exclusively that of a state or federal court located in San Francisco County, California.
You and Vercel agree to submit to the personal jurisdiction of the courts located within San Francisco County, California for the purpose of litigating all such Claims or any other disputes arising out of or relating to the interpretation, applicability, enforceability or formation of these Terms or your use of the Site in the event that the arbitration provisions are found not to apply. In such a case, should Vercel prevail in litigation against you to enforce its rights under these Terms, Vercel shall be entitled to its costs, expenses, and reasonable attorneys' fees (whether incurred at or in preparation for trial, appeal or otherwise) incurred in resolving or settling the dispute, in addition to all other damages or awards to which Vercel may be entitled. + +**Restrictions.** TO THE FULLEST EXTENT PERMITTED BY APPLICABLE LAW, IF YOU OR VERCEL WANT TO ASSERT A DISPUTE AGAINST THE OTHER, THEN YOU OR VERCEL MUST COMMENCE IT (BY DELIVERY OF WRITTEN NOTICE AS SET FORTH HEREIN) WITHIN ONE (1) YEAR AFTER THE DISPUTE ARISES OR IT WILL BE FOREVER BARRED. "Commencing" means, as applicable: (i) by delivery of written notice as set forth herein; (ii) filing for arbitration with JAMS as set forth herein; or (iii) filing an action in state or federal court. This provision will not apply to any legal action taken by Vercel to seek an injunction or other equitable relief in connection with any losses (or potential losses) relating to the Site, intellectual property rights of Vercel, and/or Vercel's provision of the Site. + +**Opt-Out.** You have the right to opt-out and not be bound by the arbitration and class action waiver provisions set forth above by sending written notice of your decision to opt-out to: privacy@vercel.com with the subject line "COMPANY ARBITRATION AND CLASS ACTION WAIVER OPT-OUT". 
The notice must be sent within thirty (30) days of your first use of the Site; otherwise, you shall be bound to arbitrate any disputes in accordance with the provisions of these Terms providing for binding arbitration. If you opt-out of these arbitration provisions, Vercel also will not be bound by them. + +Date last updated: December 8th, 2021