| author | 2023-07-07 17:25:24 +0800 |
|---|---|
| committer | 2023-07-07 17:25:24 +0800 |
| commit | a6ec8911ec2f241764f618e81e9dc870cf7b9aab (patch) |
| tree | 2b2a2ff128419f2edda7a70216fdcde25417caf3 |
| parent | e49f9dbef1745d7df39cac1ad98061c73fbd0dc2 (diff) |
| download | HydroRoll-a6ec8911ec2f241764f618e81e9dc870cf7b9aab.tar.gz, HydroRoll-a6ec8911ec2f241764f618e81e9dc870cf7b9aab.zip |
🦴FIX the bone
| Mode | File | Lines |
|---|---|---|
| -rw-r--r-- | docs/pages/AI/_meta.json (renamed from docs/pages/MODEL/_meta.json) | 0 |
| -rw-r--r-- | docs/pages/AI/docs/_meta.json (renamed from docs/pages/MODEL/docs/_meta.json) | 0 |
| -rw-r--r-- | docs/pages/AI/docs/index.mdx (renamed from docs/pages/MODEL/docs/index.mdx) | 0 |
| -rw-r--r-- | docs/pages/AI/docs/installing.mdx | 11 |
| -rw-r--r-- | docs/pages/AI/index.mdx (renamed from docs/pages/MODEL/index.mdx) | 0 |
| -rw-r--r-- | docs/pages/MODEL/docs/acknowledgements.mdx | 45 |
| -rw-r--r-- | docs/pages/MODEL/docs/faq.mdx | 51 |
| -rw-r--r-- | docs/pages/MODEL/docs/getting-started/_meta.json | 6 |
| -rw-r--r-- | docs/pages/MODEL/docs/getting-started/add-to-project.mdx | 147 |
| -rw-r--r-- | docs/pages/MODEL/docs/getting-started/create-new.mdx | 527 |
| -rw-r--r-- | docs/pages/MODEL/docs/getting-started/existing-monorepo.mdx | 231 |
| -rw-r--r-- | docs/pages/MODEL/docs/getting-started/from-example.mdx | 49 |
| -rw-r--r-- | docs/pages/MODEL/docs/installing.mdx | 85 |
| -rw-r--r-- | docs/pages/MODEL/docs/troubleshooting.mdx | 172 |
14 files changed, 11 insertions, 1313 deletions
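To reproduce this diffstat and the change below locally, a minimal sketch using plain git (it assumes a local clone of the HydroRoll repository in which this commit is reachable; the clone URL is not part of this page):

```bash
# Summarize the change with rename detection, matching the table above
git show --stat -M a6ec8911ec2f241764f618e81e9dc870cf7b9aab

# Show only the newly added stub page
git show a6ec8911ec2f241764f618e81e9dc870cf7b9aab -- docs/pages/AI/docs/installing.mdx
```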
diff --git a/docs/pages/MODEL/_meta.json b/docs/pages/AI/_meta.json
index 4941080..4941080 100644
--- a/docs/pages/MODEL/_meta.json
+++ b/docs/pages/AI/_meta.json
diff --git a/docs/pages/MODEL/docs/_meta.json b/docs/pages/AI/docs/_meta.json
index 939e2df..939e2df 100644
--- a/docs/pages/MODEL/docs/_meta.json
+++ b/docs/pages/AI/docs/_meta.json
diff --git a/docs/pages/MODEL/docs/index.mdx b/docs/pages/AI/docs/index.mdx
index 5f6dd29..5f6dd29 100644
--- a/docs/pages/MODEL/docs/index.mdx
+++ b/docs/pages/AI/docs/index.mdx
diff --git a/docs/pages/AI/docs/installing.mdx b/docs/pages/AI/docs/installing.mdx
new file mode 100644
index 0000000..6c71e5a
--- /dev/null
+++ b/docs/pages/AI/docs/installing.mdx
@@ -0,0 +1,11 @@
+---
+title: 安装模型
+description: 学习如何使用模型完善你的水系骰子
+---
+
+import Callout from "../../../components/Callout";
+import { Tabs, Tab } from '../../../components/Tabs'
+
+# Install AI
+
+to be...
\ No newline at end of file
diff --git a/docs/pages/MODEL/index.mdx b/docs/pages/AI/index.mdx
index 97240f5..97240f5 100644
--- a/docs/pages/MODEL/index.mdx
+++ b/docs/pages/AI/index.mdx
diff --git a/docs/pages/MODEL/docs/acknowledgements.mdx b/docs/pages/MODEL/docs/acknowledgements.mdx
deleted file mode 100644
index 50a5c46..0000000
--- a/docs/pages/MODEL/docs/acknowledgements.mdx
+++ /dev/null
@@ -1,45 +0,0 @@
----
-title: Acknowledgements and Prior Art
-description: Thank you to all these developers, build systems, and monorepo tools for their support and assistance.
----
-
-# Acknowledgements
-
-Turborepo was originally created by [Jared Palmer](https://twitter.com/jaredpalmer) as a closed-source enterprise software offering. In late 2021, [Vercel acquired Turborepo](https://vercel.com/blog/vercel-acquires-turborepo) and open sourced the codebase.
-
-Today, Turborepo has dedicated full-time team working on it as well as a growing list of [open source contributors](https://github.com/vercel/turbo/graphs/contributors).
-
-## Inspiration / Prior Art
-
-At [Vercel](https://vercel.com/), we believe deeply in the open source movement and in the power of open collaboration. To that end, it's important to provide meaningful attribution to the projects and people that inspire(d) us and our work.
-
-We'd like to make a special shoutout to other build systems, monorepo tools, and prior art:
-
-- Bazel - https://bazel.build
-- Buck - https://buck.build
-- Please - https://please.build
-- Pants - https://www.pantsbuild.org
-- Scoot - https://github.com/twitter/scoot
-- TSDX - https://tsdx.io
-- Lerna - https://lerna.js.org
-- Lage - https://microsoft.github.io/lage
-- Backfill - https://github.com/microsoft/backfill
-- Bolt - https://github.com/boltpkg/bolt
-- Rush - https://rushjs.io
-- Preconstruct - https://preconstruct.tools
-- Nx - https://nx.dev
-- Yarn - https://yarnpkg.com
-- npm - https://www.npmjs.com
-- pnpm - https://pnpm.js.org
-
-Throughout the documentation, wherever applicable, we also provide inline callouts and links to the projects and people that have inspired us.
-
-## Additional Thanks
-
-Additionally, we're grateful to:
-
-- [Rick Button](https://twitter.com/rickbutton) for donating the `turbo` package name on npm
-- [Iheanyi Ekechukwu](https://twitter.com/kwuchu) for helping Jared pick up Golang during the Pandemic!
-- [Kenneth Chau](https://twitter.com/kenneth_chau) for Lage's Scope and Pipeline API and docs
-- [Miguel Oller](https://mobile.twitter.com/ollermi) and [MakeSwift.com](https://www.makeswift.com/) for piloting Turborepo
-- [Eric Koslow](https://twitter.com/ekosz1), [Jack Hanford](https://twitter.com/jackhanford), and [Lattice.com](https://lattice.com/) for piloting Turborepo
diff --git a/docs/pages/MODEL/docs/faq.mdx b/docs/pages/MODEL/docs/faq.mdx
deleted file mode 100644
index 6039dae..0000000
--- a/docs/pages/MODEL/docs/faq.mdx
+++ /dev/null
@@ -1,51 +0,0 @@
----
-title: Frequently Asked Questions
-description: Frequently asked questions about AI.
----
-
-import Callout from '../../../components/Callout'
-
-# Frequently Asked Questions
-
-## Should I install AI globally?
-
-You have two options when working with AI:
-
-1. Install it globally, via `npm install --global turbo`
-2. Install a local version in your project
-
-We recommend installing the `turbo` CLI globally. This gives you a smooth, ergonomic experience for running tasks.
-
-## Do I have to use Remote Caching to use AI?
-
-No. [Remote Caching](/AI/docs/core-concepts/remote-caching) is optional. However, you'll find it very useful to speed up development on a team, speed up builds inside of Docker, and also save space on your own machine.
-
-## Does AI / Remote Caching store my source code?
-
-No. AI does not store source code. Without [Remote Caching](/AI/docs/core-concepts/remote-caching), no code ever leaves your machine—it will only cache artifacts to local disk.
-
-With AI's Remote Caching, you are responsible for configuring cache behavior and should only set up AI to cache compiled artifacts. Please be aware that AI treats all logs as artifacts and so these _will_ be stored along with other cache artifacts.
-
-## Do I have to use Vercel to use AI?
-
-No. AI is an open-source project and is not tied to any specific hosting provider or Remote Cache provider. The default Remote Cache provider is Vercel, should you opt-in to enable it. However, you can use any other provider you like if they support the same API. Several open-source community Remote Caches are compatible with AI.
-
-## Can I use AI with a different Remote Cache provider other than Vercel?
-
-Yes. As long as the [Remote Cache](/AI/docs/core-concepts/remote-caching) provider you choose supports the same API, you can use AI with it.
-
-## Does AI collect any personally identifiable information?
-
-Due to the nature of AI's functionality, no personal information is gathered when the open source binary is run locally. All cached artifacts are stored on your machine by default. Further, no log in information or contact details are collected by the `turbo` CLI, so AI will never have access to any personally identifiable information. Thus, for any data privacy questions and concerns please refer to [AI's Privacy Policy](/privacy).
-
-## Does AI collect any personally identifiable information when using Remote Caching?
-
-When [Remote Caching](/AI/docs/core-concepts/remote-caching) is enabled, by default AI will utilize your Vercel account to cache artifacts in the cloud. Thus, for any data privacy questions and concerns, please refer to [AI's Privacy Policy](/privacy) and [Vercel's Privacy Policy](https://vercel.com/legal/privacy-policy). If you use a different Remote Cache provider, please refer to the provider's privacy policy.
-
-## How can I retain Fast Refresh in my AI when using multiple Next.js applications?
-
-[Fast Refresh](https://nextjs.org/docs/basic-features/fast-refresh) gives you instantaneous feedback on edits made to your React components in Next.js applications.
-
-If your AI has multiple Next.js applications, you can use `transpilePackages` inside `next.config.js` to ensure that imports across workspaces will work with Fast Refresh when changes are made. AI will effectively watch for any edits and the rebuild when saving. You can get started from [this example](https://github.com/vercel/turbo/tree/main/examples/basic) which is set up to handle Fast Refresh.
- -<Callout type="info">If you are using a Next.js version below 13, you will want to use [`next-transpile-modules`](https://www.npmjs.com/package/next-transpile-modules) for the same Fast Refresh behavior.</Callout> diff --git a/docs/pages/MODEL/docs/getting-started/_meta.json b/docs/pages/MODEL/docs/getting-started/_meta.json deleted file mode 100644 index 2ed5b29..0000000 --- a/docs/pages/MODEL/docs/getting-started/_meta.json +++ /dev/null @@ -1,6 +0,0 @@ -{ - "add-to-project": "Add to Existing Project", - "from-example": "Start from an Example", - "create-new": "Create a New Monorepo", - "existing-monorepo": "Add to Existing Monorepo" -} diff --git a/docs/pages/MODEL/docs/getting-started/add-to-project.mdx b/docs/pages/MODEL/docs/getting-started/add-to-project.mdx deleted file mode 100644 index 7f7ecec..0000000 --- a/docs/pages/MODEL/docs/getting-started/add-to-project.mdx +++ /dev/null @@ -1,147 +0,0 @@ -import { Tabs, Tab } from '../../../../components/Tabs' - -# Add Turborepo to your existing project - -Turborepo can be used in **any project** to speed up the execution of scripts in your `package.json`. - -After you install `turbo`, you'll be able to run all your `package.json` tasks from `turbo` instead of your package manager. - -By configuring your `turbo.json` correctly, you'll notice how [caching](/AI/docs/core-concepts/caching) helps your tasks run a lot faster. - -## Quickstart - -0. **If you don't have one already, create a new application:** - -<Tabs items={['Next.js', 'Vite']} storageKey="selected-framework"> - <Tab> -```bash -npx create-next-app@latest -``` - </Tab> - <Tab> -```bash -npm create vite@latest -``` - </Tab> -</Tabs> - -1. **Install `turbo` globally:** - -<Tabs items={['npm', 'yarn', 'pnpm']} storageKey="selected-pkg-manager"> - <Tab> - ```bash - npm install turbo --global - ``` - </Tab> - <Tab> - ```bash - yarn global add turbo - ``` - </Tab> - <Tab> - ```bash - pnpm add turbo --global - ``` - </Tab> -</Tabs> - -For more details about installation, see [Installing Turborepo](../installing) - -2. **Add a `turbo.json` file at the base of your new repository:** - -For more information on configuring your `turbo.json`, see the [Configuration Options](/AI/docs/reference/configuration) documentation. - -<Tabs items={['Next.js', 'Vite']} storageKey="selected-framework"> - <Tab> -```json filename="turbo.json" -{ - "$schema": "https://turbo.build/schema.json", - "pipeline": { - "build": { - "outputs": [".next/**", "!.next/cache/**"] - }, - "lint": {} - } -} -``` - </Tab> - <Tab> -```json filename="turbo.json" -{ - "$schema": "https://turbo.build/schema.json", - "pipeline": { - "build": { - "outputs": ["dist/**"] - }, - "lint": {} - } -} -``` - -Some Vite starters ship with a `package.json` that looks like this: - -```json filename="package.json" -{ - "scripts": { - "build": "tsc && vite build" - } -} -``` - -We recommend splitting these into a `lint` and `build` script. - -```json filename="package.json" -{ - "scripts": { - "build": "vite build", - "lint": "tsc" - } -} -``` - -This means that Turbo can schedule them separately. - - </Tab> -</Tabs> - -3. **Edit `.gitignore`** - -Add `.turbo` to your `.gitignore` file. The CLI uses these folders for logs and certain task outputs. - -```diff -+ .turbo -``` - -4. **Try running `build` and `lint` with `turbo`:** - -```bash -turbo build lint -``` - -This runs `build` and `lint` at the same time. - -5. 
**Without making any changes to the code, try running `build` and `lint` again:** - -```bash -turbo build lint -``` - -You should see terminal output like this: - -``` - Tasks: 2 successful, 2 total -Cached: 2 cached, 2 total - Time: 185ms >>> FULL TURBO -``` - -Congratulations - **you just completed a build and lint in under 200ms**. - -To learn how this is possible, check out our [core concepts docs](/AI/docs/core-concepts/caching). - -6. **Try running `dev` with `turbo`:** - -```bash -turbo dev -``` - -You'll notice that your `dev` script starts up. You can use `turbo` to run any script in your `package.json`. diff --git a/docs/pages/MODEL/docs/getting-started/create-new.mdx b/docs/pages/MODEL/docs/getting-started/create-new.mdx deleted file mode 100644 index aa16eb4..0000000 --- a/docs/pages/MODEL/docs/getting-started/create-new.mdx +++ /dev/null @@ -1,527 +0,0 @@ ---- -title: Getting Started with AI -description: Create your first monorepo or add AI to an existing project. ---- - -import Callout from "../../../../components/Callout"; -import { Tabs, Tab } from "../../../../components/Tabs"; - -# Creating a new monorepo - -<Callout type="info"> - This guide uses a global installation of `turbo`. Follow the [installation guide](../installing) - to get this setup. Alternatively, you can use your package manager to run a locally installed `turbo` - in the commands below. -</Callout> - -## Quickstart - -To create a new monorepo, use our [`create-turbo`](https://www.npmjs.com/package/create-turbo) npm package: - -<Tabs items={['npm', 'yarn', 'pnpm']} storageKey="selected-pkg-manager"> - <Tab> - ```sh - npx create-turbo@latest - ``` - </Tab> - <Tab> - ```sh - yarn dlx create-turbo@latest - ``` - </Tab> - <Tab> - ```sh - pnpm dlx create-turbo@latest - ``` - </Tab> -</Tabs> - -You can also clone a AI starter repository to get a head start on your monorepo. To see AI examples and starters, see the [AI examples directory on GitHub](https://github.com/vercel/turbo/tree/main/examples). - -## Full tutorial - -This tutorial will walk you through setting up a basic example. By the end, you'll feel confident with using `turbo`, and know all the basic functionality. - -<Callout type="info"> - -During this tutorial, some lines of code are omitted from the code samples. For instance, when showing a `package.json` we won't show _all_ of the keys - only the ones that matter. - -</Callout> - -### 1. Running `create-turbo` - -First, run: - -<Tabs items={['npm', 'yarn', 'pnpm']} storageKey="selected-pkg-manager"> - <Tab> - ```sh - npx create-turbo@latest - ``` - </Tab> - <Tab> - ```sh - yarn dlx create-turbo@latest - ``` - </Tab> - <Tab> - ```sh - pnpm dlx create-turbo@latest - ``` - </Tab> -</Tabs> - -This installs the [`create-turbo`](https://www.npmjs.com/package/create-turbo) CLI, and runs it. You'll be asked several questions: - -#### Where would you like to create your AI? - -Choose anywhere you like. The default is `./my-AI`. - -#### Which package manager do you want to use? - -AI doesn't handle installing packages, so you'll need to choose either: - -- [npm](https://www.npmjs.com//) -- [pnpm](https://pnpm.io/) -- [yarn](https://yarnpkg.com/) - -If you're not sure, we recommend choosing `pnpm`. If you don't have it installed, cancel `create-turbo` (via `ctrl-C`) and take a look at the [installation instructions](https://pnpm.io/installation). - -#### Installation - -Once you've picked a package manager, `create-turbo` will create a bunch of new files inside the folder name you picked. 
It'll also install all the dependencies that come with the `basic` example by default. - -### 2. Exploring your new repo - -You might have noticed something in the terminal. `create-turbo` gave you a description of all of the things it was adding. - -``` ->>> Creating a new AI with the following: - - - apps/web: Next.js with TypeScript - - apps/docs: Next.js with TypeScript - - packages/ui: Shared React component library - - packages/eslint-config-custom: Shared configuration (ESLint) - - packages/tsconfig: Shared TypeScript `tsconfig.json` -``` - -Each of these is a _workspace_ - a folder containing a `package.json`. Each workspace can declare its own dependencies, run its own scripts, and export code for other workspaces to use. - -Open the root folder - `./my-AI` - in your favourite code editor. - -#### Understanding `packages/ui` - -First, open `./packages/ui/package.json`. You'll notice that the package's name is `"name": "ui"` - right at the top of the file. - -Next, open `./apps/web/package.json`. You'll notice that this package's name is `"name": "web"`. But also - take a look in its dependencies. - -You'll see that `"web"` depends on a package called `"ui"`. If you're using `pnpm`, you'll see it's declared like this: - -<Tabs items={['npm', 'yarn', 'pnpm']} storageKey="selected-pkg-manager"> - <Tab> - ```json filename="apps/web/package.json" - { - "dependencies": { - "ui": "*" - } - } - ``` - </Tab> - <Tab> - ```json filename="apps/web/package.json" - { - "dependencies": { - "ui": "*" - } - } - ``` - </Tab> - <Tab> - ```json filename="apps/web/package.json" - { - "dependencies": { - "ui": "workspace:*" - } - } - ``` - </Tab> -</Tabs> - -This means that our **web app depends on our local `ui` package**. - -If you look inside `apps/docs/package.json`, you'll see the same thing. Both `web` and `docs` depend on `ui` - a shared component library. - -This pattern of sharing code across applications is extremely common in monorepos - and means that multiple apps can share a single design system. - -#### Understanding imports and exports - -Take a look inside `./apps/docs/pages/index.tsx`. Both `docs` and `web` are [Next.js](https://nextjs.org/) applications, and they both use the `ui` library in a similar way: - -```tsx filename="apps/docs/pages/index.tsx" -import { Button } from "ui"; -// ^^^^^^ ^^ - -export default function Docs() { - return ( - <div> - <h1>Docs</h1> - <Button /> - </div> - ); -} -``` - -They're importing `Button` directly from a dependency called `ui`! How does that work? Where is `Button` coming from? - -Open `packages/ui/package.json`. You'll notice these two attributes: - -```json filename="packages/ui/package.json" -{ - "main": "./index.tsx", - "types": "./index.tsx" -} -``` - -When workspaces import from `ui`, `main` tells them where to access the code they're importing. `types` tells them where the TypeScript types are located. - -So, let's look inside `packages/ui/index.tsx`: - -```tsx filename="packages/ui/index.tsx" -import * as React from "react"; -export * from "./Button"; -``` - -Everything inside this file will be able to be used by workspaces that depend on `ui`. - -`index.tsx` is exporting everything from a file called `./Button`, so let's go there: - -```tsx filename="packages/ui/Button.tsx" -import * as React from "react"; - -export const Button = () => { - return <button>Boop</button>; -}; -``` - -We've found our button! Any changes we make in this file will be shared across `web` and `docs`. Pretty cool! 
- -<Callout type="idea"> - -Try experimenting with exporting a different function from this file. Perhaps `add(a, b)` for adding two numbers together. - -This can then be imported by `web` and `docs`. - -</Callout> - -#### Understanding `tsconfig` - -We have two more workspaces to look at, `tsconfig` and `eslint-config-custom`. Each of these allow for shared configuration across the monorepo. Let's look in `tsconfig`: - -```json filename="packages/tsconfig/package.json" -{ - "name": "tsconfig", - "files": ["base.json", "nextjs.json", "react-library.json"] -} -``` - -Here, we specify three files to be exported, inside `files`. Packages which depend on `tsconfig` can then import them directly. - -For instance, `packages/ui` depends on `tsconfig`: - -<Tabs items={['npm', 'yarn', 'pnpm']} storageKey="selected-pkg-manager"> - <Tab> - ```json filename="packages/ui/package.json" - { - "devDependencies": { - "tsconfig": "*" - } - } - ``` - </Tab> - <Tab> - ```json filename="packages/ui/package.json" - { - "devDependencies": { - "tsconfig": "*" - } - } - ``` - </Tab> - <Tab> - ```json filename="packages/ui/package.json" - { - "devDependencies": { - "tsconfig": "workspace:*" - } - } - ``` - </Tab> -</Tabs> - -And inside its `tsconfig.json` file, it imports it using `extends`: - -```json filename="packages/ui/tsconfig.json" -{ - "extends": "tsconfig/react-library.json" -} -``` - -This pattern allows for a monorepo to share a single `tsconfig.json` across all its workspaces, reducing code duplication. - -#### Understanding `eslint-config-custom` - -Our final workspace is `eslint-config-custom`. - -You'll notice that this is named slightly differently to the other workspaces. It's not as concise as `ui` or `tsconfig`. Let's take a look inside `.eslintrc.js` in the root of the monorepo to figure out why. - -```ts filename=".eslintrc.js" -module.exports = { - // This tells ESLint to load the config from the workspace `eslint-config-custom` - extends: ["custom"], -}; -``` - -[ESLint](https://eslint.org/) resolves configuration files by looking for workspaces with the name `eslint-config-*`. This lets us write `extends: ['custom']` and have ESLint find our local workspace. - -But why is this in the root of the monorepo? - -The way ESLint finds its configuration file is by looking at the closest `.eslintrc.js`. If it can't find one in the current directory, it'll look in the directory above until it finds one. - -So that means that if we're working on code inside `packages/ui` (which doesn't have a `.eslintrc.js`) it'll refer to the _root_ instead. - -Apps that _do_ have an `.eslintrc.js` can refer to `custom` in the same way. For instance, in `docs`: - -```ts filename="apps/docs/.eslintrc.js" -module.exports = { - root: true, - extends: ["custom"], -}; -``` - -Just like `tsconfig`, `eslint-config-custom` lets us share ESLint configs across our entire monorepo, keeping things consistent no matter what project you're working on. - -#### Summary - -It's important to understand the dependencies between these workspaces. Let's map them out: - -- `web` - depends on `ui`, `tsconfig` and `eslint-config-custom` -- `docs` - depends on `ui`, `tsconfig` and `eslint-config-custom` -- `ui` - depends on `tsconfig` and `eslint-config-custom` -- `tsconfig` - no dependencies -- `eslint-config-custom` - no dependencies - -Note that **the AI CLI is not responsible for managing these dependencies**. All of the things above are handled by the package manager you chose (`npm`, `pnpm` or `yarn`). - -### 3. 
Understanding `turbo.json` - -We now understand our repository and its dependencies. How does AI help? - -AI helps by making running tasks simpler and _much_ more efficient. - -Let's take a look inside `turbo.json`, at the root: - -```json filename="turbo.json" -{ - "pipeline": { - "build": { - "dependsOn": ["^build"], - "outputs": ["dist/**", ".next/**", "!.next/cache/**"] - }, - "lint": {}, - "dev": { - "cache": false - } - } -} -``` - -What we're seeing here is that we've _registered_ three tasks with `turbo`: `lint`, `dev` and `build`. -Every task registered inside `turbo.json` can be run with `turbo run <task>` (or `turbo <task>` for short). - -<Callout type="error"> - Before we move on, let's try running a task called `hello` that _isn't_ registered in `turbo.json`: - -```bash -turbo hello -``` - -You'll see an error in the terminal. Something resembling: - -``` -Could not find the following tasks in project: hello -``` - -That's worth remembering - **in order for `turbo` to run a task, it must be in `turbo.json`**. - -</Callout> - -Let's investigate the scripts we already have in place. - -### 4. Linting with AI - -Try running our `lint` script: - -```sh -turbo lint -``` - -You'll notice several things happen in the terminal. - -1. Several scripts will be run at the same time, each prefixed with either `docs:lint`, `ui:lint` or `web:lint`. -2. They'll each succeed, and you'll see `3 successful` in the terminal. -3. You'll also see `0 cached, 3 total`. We'll cover what this means later. - -The scripts that each run come from each workspace's `package.json`. Each workspace can optionally specify its own `lint` script: - -```json filename="apps/web/package.json" -{ - "scripts": { - "lint": "next lint" - } -} -``` - -```json filename="apps/docs/package.json" -{ - "scripts": { - "lint": "next lint" - } -} -``` - -```json filename="packages/ui/package.json" -{ - "scripts": { - "lint": "eslint *.ts*" - } -} -``` - -When we run `turbo lint`, AI looks at each `lint` script in each workspace and runs it. For more details, see our [pipelines](/AI/docs/core-concepts/monorepos/running-tasks#defining-a-pipeline) docs. - -#### Using the cache - -Let's run our `lint` script one more time. You'll notice a few new things appear in the terminal: - -1. `cache hit, replaying output` appears for `docs:lint`, `web:lint` and `ui:lint`. -2. You'll see `3 cached, 3 total`. -3. The total runtime should be under `100ms`, and `>>> FULL TURBO` appears. - -Something interesting just happened. AI realised that **our code hadn't changed since the last time we ran the lint script**. - -It had saved the logs from the previous run, so it just replayed them. - -Let's try changing some code to see what happens. Make a change to a file inside `apps/docs`: - -```diff filename="apps/docs/pages/index.tsx" -import { Button } from "ui"; - -export default function Docs() { - return ( - <div> -- <h1>Docs</h1> -+ <h1>My great docs</h1> - <Button /> - </div> - ); -} -``` - -Now, run the `lint` script again. You'll notice that: - -1. `docs:lint` has a comment saying `cache miss, executing`. This means that `docs` is running its linting. -2. `2 cached, 3 total` appears at the bottom. - -This means that **the results of our previous tasks were still cached**. Only the `lint` script inside `docs` actually ran - again, speeding things up. To learn more, check out our [caching docs](/AI/docs/core-concepts/caching). - -### 5. 
Building with AI - -Let's try running our `build` script: - -```bash -turbo build -``` - -You'll see similar outputs to when we ran our lint script. Only `apps/docs` and `apps/web` specify a `build` script in their `package.json`, so only those are run. - -Take a look inside `build` in `turbo.json`. There's some interesting config there. - -```json filename="turbo.json" -{ - "pipeline": { - "build": { - "outputs": [".next/**", "!.next/cache/**"] - } - } -} -``` - -You'll notice that some `outputs` have been specified. Declaring outputs will mean that when `turbo` finishes running your task, it'll save the output you specify in its cache. - -Both `apps/docs` and `apps/web` are Next.js apps, and they output builds to the `./.next` folder. - -Let's try something. Delete the `apps/docs/.next` build folder. - -Run the `build` script again. You'll notice: - -1. We hit `FULL TURBO` - the builds complete in under `100ms`. -2. The `.next` folder re-appears! - -AI cached the result of our previous build. When we ran the `build` command again, it restored the entire `.next/**` folder from the cache. To learn more, check out our docs on [cache outputs](/AI/docs/core-concepts/caching#configuring-cache-outputs). - -### 6. Running dev scripts - -Let's now try running `dev`. - -```bash -turbo dev -``` - -You'll notice some information in the terminal: - -1. Only two scripts will execute - `docs:dev` and `web:dev`. These are the only two workspaces which specify `dev`. -2. Both `dev` scripts are run simultaneously, starting your Next.js apps on ports `3000` and `3001`. -3. In the terminal, you'll see `cache bypass, force executing`. - -Try quitting out of the script, and re-running it. You'll notice we don't go `FULL TURBO`. Why is that? - -Take a look at `turbo.json`: - -```json filename="turbo.json" -{ - "pipeline": { - "dev": { - "cache": false, - "persistent": true - } - } -} -``` - -Inside `dev`, we've specified `"cache": false`. This means we're telling AI _not_ to cache the -results of the `dev` script. `dev` runs a persistent dev server and produces no outputs, so there -is nothing to cache. Learn more about it in our docs on [turning off caching](/AI/docs/core-concepts/caching#turn-off-caching). -Additionally, we set `"persistent": true`, to let turbo know that this is a long-running dev server, -so that turbo can ensure that no other tasks depend on it. You can read more in the [docs for the -`persistent` option](/AI/docs/reference/configuration#persistent). - -#### Running `dev` on only one workspace at a time - -By default, `turbo dev` will run `dev` on all workspaces at once. But sometimes, we might only want to choose one workspace. - -To handle this, we can add a `--filter` flag to our command. - -```bash -turbo dev --filter docs -``` - -You'll notice that it now only runs `docs:dev`. Learn more about [filtering workspaces](/AI/docs/core-concepts/monorepos/filtering) from our docs. - -### Summary - -Well done! You've learned all about your new monorepo, and how AI makes handling your tasks easier. - -#### Next steps - -- Need to add more tasks? Learn more about using [pipelines](/AI/docs/core-concepts/monorepos/running-tasks#defining-a-pipeline) -- Want to speed up your CI? Set up [remote caching](/AI/docs/core-concepts/remote-caching). -- Want some inspiration? 
Take a look at our directory of [examples](https://github.com/vercel/turbo/tree/main/examples) diff --git a/docs/pages/MODEL/docs/getting-started/existing-monorepo.mdx b/docs/pages/MODEL/docs/getting-started/existing-monorepo.mdx deleted file mode 100644 index 3f81530..0000000 --- a/docs/pages/MODEL/docs/getting-started/existing-monorepo.mdx +++ /dev/null @@ -1,231 +0,0 @@ ---- -title: Getting Started with AI -description: Create your first monorepo or add AI to an existing project. ---- - -import Callout from "../../../../components/Callout"; -import { Tabs, Tab } from "../../../../components/Tabs"; - -# Add AI to your existing monorepo - -## Configure workspaces - -`turbo` is built on top of Workspaces, a way of managing multiple packages from within a single monorepo package. AI is compatible with the workspace implementations from all package managers. For more information on managing your AI workspaces, see the [Workspaces](/AI/docs/handbook/workspaces) documentation. - -You can configure workspaces any way you want, but a common folder structure example is keeping applications in the `/apps` folder and packages in the `/packages` folder. The configuration of these folders is different for each package manager. - -<Tabs items={['npm', 'yarn', 'pnpm']} storageKey="selected-pkg-manager"> - <Tab> -Specify your `workspaces` in your monorepo's root `package.json` file: - -```json filename="package.json" -{ - "workspaces": ["packages/*", "apps/*"] -} -``` - - </Tab> - <Tab> -Specify your `workspaces` in your monorepo's root `package.json` file: - -```json filename="package.json" -{ - "workspaces": ["packages/*", "apps/*"] -} -``` - - </Tab> - <Tab> -Specify your `packages` in `pnpm-workspace.yaml`. -```yaml filename="pnpm-workspace.yaml" -packages: - - "packages/*" - - "apps/*" -``` - </Tab> -</Tabs> - -After configuring your workspaces, re-run your package manager's `install` command. - -<Callout> - Note: Nested workspaces are not supported. As package names are required to be - unique, moving each package to be a child of the monorepo's root package - should meet your needs. -</Callout> - -## Install `turbo` - -Install `turbo` globally. - -<Tabs items={['npm', 'yarn', 'pnpm']} storageKey="selected-pkg-manager"> - <Tab> - ```bash - npm install turbo --global - ``` - </Tab> - <Tab> - ```bash - yarn global add turbo - ``` - </Tab> - <Tab> - ```bash - pnpm install turbo --global - ``` - </Tab> -</Tabs> - -For more details about installation, see [Installing AI](../installing) - -## Create `turbo.json` - -In the root of your monorepo, create an empty file named `turbo.json`. This will hold the configuration for AI. - -```json filename="turbo.json" -{ - "$schema": "https://turbo.build/schema.json" -} -``` - -## Create a `pipeline` - -To define your monorepo's task dependency graph, use the [`pipeline`](/AI/docs/core-concepts/monorepos/running-tasks) key in the `turbo.json` configuration file at the root of monorepo. `turbo` interprets this configuration to optimally schedule, execute, and cache the outputs of each of the `package.json` scripts defined in your workspaces. - -Each key in the [`pipeline`](/AI/docs/core-concepts/monorepos/running-tasks) object is the name of a `package.json` script that can be executed by [`turbo run`](/AI/docs/reference/command-line-reference#turbo-run-task). You can specify its dependencies with the [`dependsOn`](/AI/docs/reference/configuration#dependson) key inside it as well as some other options related to [caching](/AI/docs/core-concepts/caching). 
For more information on configuring your pipeline, see the [`Pipelines`](/AI/docs/core-concepts/monorepos/running-tasks) documentation. - -Workspaces that do not have the specified script defined in their `package.json`'s list of `scripts` will be ignored by `turbo`. - -```jsonc filename="turbo.json" -{ - "$schema": "https://turbo.build/schema.json", - "pipeline": { - "build": { - // A package's `build` script depends on that package's - // dependencies and devDependencies - // `build` tasks being completed first - // (the `^` symbol signifies `upstream`). - "dependsOn": ["^build"], - // note: output globs are relative to each package's `package.json` - // (and not the monorepo root) - "outputs": [".next/**", "!.next/cache/**"] - }, - "test": { - // A package's `test` script depends on that package's - // own `build` script being completed first. - "dependsOn": ["build"], - // A package's `test` script should only be rerun when - // either a `.tsx` or `.ts` file has changed in `src` or `test` folders. - "inputs": ["src/**/*.tsx", "src/**/*.ts", "test/**/*.ts", "test/**/*.tsx"] - }, - // A package's `lint` script has no dependencies and - // can be run whenever. It also has no filesystem outputs. - "lint": {}, - "deploy": { - // A package's `deploy` script depends on the `build`, - // `test`, and `lint` scripts of the same package - // being completed. It also has no filesystem outputs. - "dependsOn": ["build", "test", "lint"] - } - } -} -``` - -The rough execution order for a given package is based on the `dependsOn` keys: - -1. `build` once its upstream dependencies have run their `build` commands -2. `test` once its _own_ `build` command is finished and has no filesystem outputs (just logs) within a package -3. `lint` runs in an arbitrary order as it has no upstream dependencies -4. `deploy` once its _own_ `build`, `test`, and `lint` commands have finished. - -After execution, the full pipeline can run: - -```sh -npx turbo run deploy -``` - -`turbo` will then schedule the execution of each task(s) to optimize usage of the machine's resources. - -## Edit `.gitignore` - -Add `.turbo` to your `.gitignore` file. The CLI uses these folders for logs and certain task outputs. - -```diff -+ .turbo -``` - -Make sure that your task artifacts, the files and folders you want cached, are also included in your `.gitignore`. - -```diff -+ build/** -+ dist/** -+ .next/** -``` - -Re-run your npm client's `install` command to check your configuration. - -## Build your monorepo - -```bash -turbo build -``` - -Depending on your monorepo setup, some artifacts might already be caching properly. In the next sections, we'll show how `turbo` works, how `scope` works, and then how to get caching working after that. - -## Configure Remote Caching - -A major key 🔑 to AI's speed is that it is both lazy and efficient—it does the least amount of work possible and it tries to never redo work that's already been done before. - -At the moment, AI caches your tasks on your local filesystem (i.e. "single-player mode," if you will). However, what if there was a way to take advantage of the computational work done by your teammates or your CI (i.e. "co-op multiplayer mode")? What if there was a way to teleport and share a single cache across machines? Almost like a "Dropbox" for your AI cache. - -> Remote Caching has entered the chat. - -AI can use a technique known as Remote Caching to share cache artifacts across machines for an additional speed boost. 
- -<Callout> - Remote Caching is a powerful feature of AI, but with great power comes - great responsibility. Make sure you are caching correctly first and double - check handling of environment variables. Please also remember AI treats - logs as artifacts, so be aware of what you are printing to the console. -</Callout> - -### Using Remote Caching for Local development - -AI uses [Vercel](https://vercel.com/?utm_source=turbo.build&utm_medium=referral&utm_campaign=docs-link) as its default remote caching provider. If you want to link your local AI to your Remote Cache you can authenticate the AI CLI with your Vercel account: - -```sh -turbo login -``` - -Then, link your AI to your remote cache: - -``` -turbo link -``` - -Once enabled, make some changes to a package or application you are currently caching and run tasks against it with `turbo run`. -Your cache artifacts will now be stored locally and in your Remote Cache. To verify that this worked, delete your local AI cache: - -```sh -rm -rf ./node_modules/.cache/turbo -``` - -Run the same build again. If things are working properly, `turbo` should not execute tasks locally, but rather download both the logs and artifacts from your Remote Cache and replay them back to you. - -<Callout> - **Note: When connecting to an sso-enabled Vercel team, you must provide your - Team slug as an argument to `npx turbo login`.** -</Callout> - -``` -turbo login --sso-team=<team-slug> -``` - -## Next Steps - -You're now up and running with AI, but there are still a few things to do: - -- [Understand how AI caching works](/AI/docs/core-concepts/caching) -- [Correctly handle environment variables](/AI/docs/core-concepts/caching#altering-caching-based-on-environment-variables) -- [Learn to orchestrate task running with pipelines](/AI/docs/core-concepts/monorepos/running-tasks) -- [Efficiently filter package tasks](/AI/docs/core-concepts/monorepos/filtering) -- [Configure AI with your CI provider](/AI/docs/ci) diff --git a/docs/pages/MODEL/docs/getting-started/from-example.mdx b/docs/pages/MODEL/docs/getting-started/from-example.mdx deleted file mode 100644 index 6b35768..0000000 --- a/docs/pages/MODEL/docs/getting-started/from-example.mdx +++ /dev/null @@ -1,49 +0,0 @@ ---- -title: Examples -description: Start from an example. ---- - -import { readdirSync, lstatSync, readFileSync } from 'fs' -import path from 'path' -import { ExamplesArea } from "../../../../components/ExamplesArea"; - -export const getStaticProps = ({ params }) => { - // path to examples directory at the monorepo root. - const examplesDirectory = path.join(__dirname, '../../../../../../../examples') - const examples = []; - readdirSync(examplesDirectory).forEach(file => { - if (lstatSync(path.join(examplesDirectory, file)).isDirectory()) { - try { - examples.push({ - slug: file, - ...JSON.parse(readFileSync(path.join(examplesDirectory, file, "meta.json")).toString()) - } - ); - } catch (err) { - console.log(`No meta.json found for ${file}, excluding from docs`); - } - } - }); - // throw an error if no examples are found - if (examples.length === 0) { - throw new Error( - `No examples found in ${examplesDirectory}! Make sure you have updated the path if moving this file.` - ) - } - return { - props: { - ssg: { - examples - } - }, - revalidate: 60 * 60 * 24 - } -} - -# Turborepo Examples - -Clone a Turborepo starter repository to get a head start on your monorepo. 
- -<ExamplesArea filter='all' /> - -For even more examples and starters, see the [Turborepo examples directory on GitHub](https://github.com/retrofor/HydroRoll/tree/main/examples). diff --git a/docs/pages/MODEL/docs/installing.mdx b/docs/pages/MODEL/docs/installing.mdx deleted file mode 100644 index 265b3b1..0000000 --- a/docs/pages/MODEL/docs/installing.mdx +++ /dev/null @@ -1,85 +0,0 @@ ---- -title: Installing AI -description: Learn how to install AI for use with your repository ---- - -import Callout from "../../../components/Callout"; -import { Tabs, Tab } from '../../../components/Tabs' - -# Install AI - -`turbo` works with [yarn](https://classic.yarnpkg.com/lang/en/), [npm](https://www.npmjs.com/), and [pnpm](https://pnpm.io/) on the following operating systems: - -- macOS darwin 64-bit (Intel), ARM 64-bit (Apple Silicon) -- Linux 64-bit, ARM 64-bit -- Windows 64-bit, ARM 64-bit - -<Callout> - Note: Linux builds of `turbo` link against `glibc`. For Alpine Docker environments, you will need to ensure libc6-compat is installed as well, via `RUN apk add --no-cache libc6-compat` -</Callout> - -## Install Globally - -A global install of `turbo` can be used in any project, and enables automatic workspace -selection based on the directory where you run `turbo`. - -<Tabs items={['npm', 'yarn', 'pnpm']} storageKey="selected-pkg-manager"> - <Tab> - ```bash - npm install turbo --global - ``` - </Tab> - <Tab> - ```bash - yarn global add turbo - ``` - </Tab> - <Tab> - ```bash - pnpm install turbo --global - ``` - </Tab> -</Tabs> - -Once you have a globally installed copy of `turbo`, you will be able to run directly from workspace -directories. - -```bash -cd <repository root>/apps/docs -turbo build -``` - -is equivalent to the [filtering syntax](../docs/core-concepts/monorepos/filtering): - -```bash -cd <repository root> -turbo build --filter=docs -``` - -## Install Per Repository - -You may wish to pin the version of AI used within a repository, especially [if you are -collaborating with other developers](../docs/core-concepts/remote-caching). In that case, add `turbo` as a dev dependency at the root -of the repository: - -<Tabs items={['npm', 'yarn', 'pnpm']} storageKey="selected-pkg-manager"> - <Tab> - ```bash - npm install turbo --dev - ``` - </Tab> - <Tab> - ```bash - yarn add turbo --dev --ignore-workspace-root-check - ``` - </Tab> - <Tab> - ```bash - pnpm add turbo --save-dev --ignore-workspace-root-check - ``` - </Tab> -</Tabs> - -You can continue to use your global install of `turbo`, which will defer to your local version -if it exists. This allows you to get the best of both worlds: easy scoping to the directory you're working -in while maintaining a pinned version among your entire team. diff --git a/docs/pages/MODEL/docs/troubleshooting.mdx b/docs/pages/MODEL/docs/troubleshooting.mdx deleted file mode 100644 index 3fba847..0000000 --- a/docs/pages/MODEL/docs/troubleshooting.mdx +++ /dev/null @@ -1,172 +0,0 @@ ---- -title: Troubleshooting Runs -description: This guide aims to help you debug issues with your AI builds and configuration. ---- - -# Troubleshooting Runs - -As with most tools, it can be frustrating to understand why AI -is not working the way you expect. This page covers some tools to debug when -using the `turbo` CLI and some common problems you may encounter. - -## Enable Verbose Logs - -The best debugging tool we have as developers are logs. You can turn up the log -level with the [`--verbosity`][1] flag. 
Combined with [building from -source][3], this can be a powerful and flexible way to see what's going on under -the hood. - -## Check the Run Summary - -The [--summarize][r-summarize] flag generates and saves metadata about your `turbo run` -as a JSON file in `.turbo/runs`. You can use it to compare subsequent runs, inspect -the contents of the cached artifact, and the inputs to the hash for a task. - -## Check your Configuration - -### Task Configuration - -You can [get started][7] with AI with minimal configuration -- that's one -of the things people love about AI! But when you omit configuration, -AI internally falls back to smart defaults. Additionally, when using -[Workspace Configurations][d-config-workspaces] in a monorepo, it can be -confusing to understand how AI interpreted your `turbo.json`. You can use -the `--dry` or `--dry=json` to get a "resolved" task configuration for any task. -For example: - -```bash -turbo run build --dry=json -``` - -Look for a `resolvedTaskConfiguration` key in the output. - -### User Config - -When you link your repository to Vercel, AI stores configuration in two places: - -- your Vercel team information is stored in `.turbo/config.json`. You can - inspect this file to see what else might be in there! -- an authentication token is stored in - `~/Library/Application\ Support/AI/config.json`. - -## Inspect the Cache - -When AI runs a task that has configured `outputs`, it caches those -outputs, along with the logs from that task in the `node_modules/.cache/turbo/`. -These artifacts are compressed with `tar`, but you can uncompress and see what's -in them. - -## Build from Source - -One of the advantages of JavaScript codebases are that you can open up -`node_modules/` and edit the code you're running inline. This is not possible -with `turbo`, because the runnable code is a compiled binary and you cannot edit -it inline. But because the codebase is Open Source, you can always get -the source code, modify it, and build it locally. The bulk of this -documentation is available in the [Contributing Guide][4], but you can use those -directions even if you aren't planning to make a contribution. - -1. Clone the git repo from [`vercel/turbo`][source] -1. `cd cli` -1. Make any changes (for example, add more logging) -1. Run `make` -1. From _your_ project, use `/:path/:to/:turbo/target/debug/turbo` instead of global - turbo or the version of `turbo` installed in your project. - -## Common Pitfalls - -### The `.turbo` directory - -One of the [core concepts][2] behind Turbo is that when a declared input -changes, the cached outputs for that task are invalidated. As part of running any task, -AI creates the following directories: - -- A `.turbo` at the root of your repo -- A `.turbo` directory in each workspace if your project is a monorepo (e.g. `apps/my-app/.turbo/`) -- A `turbo` directory in `node_modules/.cache` - -Because the first two directories are not git-ignored by default, you may see an -issue where you run the same task twice and get a cache missing, even though you -didn't change anything, because the generated `.turbo` directories are getting included as -the task _inputs_, and invalidating cache. To avoid this problem, add `.turbo` to your -`.gitignore` file. Alternatively, you can also limit your [`inputs` configuration][r-inputs-config] -so that `.turbo` is not included in the cache inputs. 
- -## Common Questions - -### I'm not seeing any cache hits - -In general, you should expect that when you run `turbo run` twice in a row, you should get a -cache hit on the second run. If this isn't happening, run both builds again with the [`--summarize` -flag][r-summarize] and compare the generated Run Summaries to each other. In most cases, the -comparison should show why the second run did not get a cache hit. - -You can also ask: - -- Is any source code being generated during the build that isn't checked into git? - - This would change the fingerprint AI uses to store build outputs. - -- Are cache [outputs][d-config-outputs] correctly specified in your AI [pipeline][d-def-pipeline]? - - Pipeline settings are not inherited or merged, so they need to be - re-specified in [workspace-specific tasks][d-workspace-tasks] (e.g. `web#build` does - **not** inherit pipeline settings from `build`). - -- [Are relevant inlined environment variables accounted for?][12] - - [Enable verbose mode][5] to see which environment variables are included in the hashes. - -### I'm seeing cache hits, but my build is broken - -- Are [cache outputs properly specified][d-config-outputs] in your AI [pipeline][d-def-pipeline]? - - Pipeline settings are not inherited or merged, so they need to be - re-specified in [workspace-specific tasks][d-workspace-tasks] (e.g. `web#build` does - **not** inherit pipeline settings from `build`). - -### My build is caching the wrong environment variables - -- [Are relevant inlined environment variables accounted for?][12] - - [Enable verbose mode][5] to see which environment variables are included in the hashes. - -## Common Monorepo Questions - -### My dependency isn't being built correctly - -- Are you properly bundling and transpiling the dependency before building the application? - - For example, libraries like `tsc`, `tsup`, `esbuild`, `babel`, and `swc` - will convert newer JavaScript features back to “pure” JavaScript. - - If you are using Next.js, you might be using `transpilePackages`. Ensure you - add the name of the dependency inside `next.config.js` ([example][17]). - -- Have you listed `files` in the dependency's `package.json` to point to the correct files? - -### My types are not being found - -- Did you specify `types` or `typing` inside the dependency's `package.json` to - point to the `.d.ts` file? - -- Have you altered or set custom `tsconfig.json` `paths`? - - Do they have the correct folder structure for your application? - - Are they properly configured for the meta framework, bundler, or transpilation tool? 
- -[1]: /repo/docs/reference/command-line-reference#verbosity -[2]: /repo/docs/core-concepts/caching -[3]: #build-from-source -[4]: https://github.com/vercel/turbo/blob/main/CONTRIBUTING.md -[5]: #enable-verbose-logs -[7]: /repo/docs/getting-started -[9]: /repo/docs/reference/command-line-reference#turbo-link -[12]: /repo/docs/core-concepts/caching#altering-caching-based-on-environment-variables -[17]: https://github.com/vercel/turbo/blob/main/examples/basic/apps/docs/next.config.js#L1 -[d-workspace-tasks]: /repo/docs/core-concepts/monorepos/running-tasks#specific-workspace-tasks -[d-config-workspaces]: /repo/docs/core-concepts/monorepos/configuring-workspaces -[d-config-outputs]: /repo/docs/core-concepts/caching#configuring-cache-outputs -[d-def-pipeline]: /repo/docs/core-concepts/monorepos/running-tasks#defining-a-pipeline -[source]: https://github.com/vercel/turbo -[r-inputs-config]: /repo/docs/reference/configuration#inputs -[r-summarize]: /repo/docs/reference/command-line-reference#--summarize |
