* ci: deterministic check for studio/frontend dep removals
Adds a CI gate that catches the common foot-gun: a dep dropped from
studio/frontend/package.json that something in src/ still imports.
scripts/check_frontend_dep_removal.py
Diffs package.json against a git base ref, collects every package
no longer declared, and for each one:
1. Greps the entire repo for any usage pattern (static / dynamic /
side-effect imports, require, CSS @import, HTML script/link
src, new URL(), triple-slash references, template literals,
bare quoted strings in JS-like files).
2. Resolves whether the package would still install by BFS'ing
the dep graph in the new lockfile starting from the new
package.json's declared deps (so a stale lockfile does not
give false OK-via-transitive results).
3. Distinguishes top-level node_modules/<name> from nested copies
under other packages. Bare src/ imports only resolve to the
top-level path.
4. Pip-installed playwright references are filtered, so removing
the npm playwright (CI uses the pip one) is reported correctly.
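The reachability walk in step 2 amounts to a BFS over the lockfile's `packages` map. A minimal sketch, assuming an npm v2/v3 lockfile shape; names here are illustrative, not the script's actual API:

```python
from collections import deque

def reachable_packages(lock_packages, root_deps):
    """BFS the lockfile dependency graph starting from the declared deps.

    lock_packages maps "node_modules/<name>" keys to entry metadata (as
    in an npm v2/v3 lockfile "packages" map); root_deps is the set of
    names declared in the new package.json.
    """
    seen = set()
    queue = deque(root_deps)
    while queue:
        name = queue.popleft()
        if name in seen:
            continue
        seen.add(name)
        entry = lock_packages.get(f"node_modules/{name}", {})
        for field in ("dependencies", "peerDependencies"):
            queue.extend(entry.get(field, {}))
    return seen
```

A removed package that is absent from the reachable set would no longer install, so a grep hit on it is a genuine break rather than OK-via-transitive.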
Additional hygiene checks (warnings, fail with --strict):
- lockfile <root> dep map matches package.json (catches drift).
- @types/X is not orphaned when X is no longer declared.
- No src/ import points at a package not declared in any field.
tests/studio/test_frontend_dep_removal.py
24 deterministic cases. Each patches a copy of the head
package.json, runs the script, and asserts on the exit status
and the reported FAIL list. Covers:
- Genuinely-breaking removals: next-themes, @xyflow/react,
@huggingface/hub, dexie, motion, canvas-confetti, recharts,
node-forge, mammoth, unpdf.
- Safe-via-transitive removals: katex, clsx, react,
@radix-ui/react-slot, zustand, tailwind-merge, remark-gfm,
date-fns, js-yaml, @tauri-apps/api.
- Mixed multi-removal failing on the unsafe entries only.
- Non-existent / not-in-base names (no-op).
- Move from deps to devDeps (not a removal).
.github/workflows/studio-frontend-ci.yml
Runs the checker on pull_request events against
origin/${{ github.base_ref }}, plus the edge-case suite.
* scripts: harden frontend dep removal check + adversarial suite
classify() now catches sneaky shapes that an earlier line-only scan
would miss:
- multi-line `import { a, b } from "pkg"` and the same shape for
`export { ... } from "pkg"` / `export * from "pkg"` /
`export type ... from "pkg"`.
- JSDoc `@import("pkg")` references.
- Word-boundary fix so `foo` no longer matches `foobar` (subpath gate:
after the package name we require closing quote or `/`).
- Negative-lookbehind on `(?<!@)\bimport\b` so CSS `@import "X"` is
classified as css_import, not side_effect_import.
find_usage() now feeds an 8-line window (4 above / 4 below the grep
hit) into classify() so multi-line import statements are picked up
even though the initial grep is line-based.
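In rough terms, the window-based classification works like this (a heavily simplified sketch over a joined window of lines; the real classifier distinguishes many more kinds):

```python
import re

def classify_window(window_text, pkg):
    """Classify one grep hit given a few lines of surrounding context.

    The (?<!@) guard keeps CSS `@import "x"` out of the JS import
    kinds, and the quote/slash boundary after the package name stops
    `foo` from matching `foobar` while still allowing `foo/subpath`.
    """
    boundary = r'["\']' + re.escape(pkg) + r'(?:/|["\'])'
    if re.search(r'@import\s+' + boundary, window_text):
        return "css_import"
    if re.search(r'(?<!@)\bimport\b[^;]*?from\s*' + boundary,
                 window_text, re.DOTALL):
        return "static_import"  # also matches multi-line named imports
    if re.search(r'(?<!@)\bimport\s*' + boundary, window_text):
        return "side_effect_import"
    return None
```

Because the whole window is searched at once, a `{ a,\n  b }` import list split across lines still lands in static_import.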
tests/studio/test_frontend_dep_removal.py now exercises three suites:
- 24 edge cases: subprocess-driven, full-pipeline.
- 28 classify() unit cases: direct function call against hand-crafted
snippets. Covers static / side-effect / dynamic / require /
css_import / html_script / html_link / re_export (4 variants) /
template_literal / new_url / tsc_triple_slash / jsdoc_import /
string_literal, plus false-positive guards (substring collision,
plain-text comments, URL path tails, Python files, markdown).
- 12 adversarial cases: write synthetic files under
studio/frontend/src/__dep_check_adversarial__/, run the full
script, then clean up. Confirms multi-line imports, re-exports,
JSDoc @import, new URL, dynamic imports all FAIL when the
underlying package is removed.
Current total: 64 / 64 cases pass.
* [pre-commit.ci] auto fixes from pre-commit.com hooks
for more information, see https://pre-commit.ci
* scripts: detect bin references in package.json scripts
Catches the last common false negative: removing a package whose
bin is only referenced through `package.json` scripts (e.g. dropping
typescript while `"build": "tsc -b && vite build"` calls tsc).
Cross-checked the patterns Vercel/Next.js, Vite, and TanStack use
in their own manifests; the bin/scripts pairing is the one
consumer-side pattern dep checkers commonly miss.
How it works:
- Build a bin-to-package map from each lockfile entry's `bin`
field. The map is global so a stale lockfile still resolves
bins from packages about to be pruned.
- Tokenize each script value, splitting on `&&`, `||`, `;`, `|`.
Strip env-var assignments and `npx / pnpx / yarn / pnpm / bunx`
prefixes, plus `./node_modules/.bin/` and `node_modules/.bin/`
path prefixes. Look up the leading token in the bin map.
- Hits are reported as `script_bin` and feed the same reachability
gate as source imports. A bin still installed transitively
(e.g. vite via @vitejs/plugin-react peer) is OK-via-transitive;
an orphaned bin is FAIL.
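A minimal sketch of the tokenization step (illustrative only; two-word runners such as `pnpm exec` and `yarn dlx` are omitted here for brevity):

```python
import re

WRAPPERS = {"npx", "pnpx", "bunx"}

def leading_bins(script):
    """Extract the leading executable of each command in an npm script.

    Split on shell separators, drop env-var assignments, runner
    prefixes, and node_modules/.bin path prefixes, then keep the first
    remaining token of each command for the bin-map lookup.
    """
    bins = []
    for cmd in re.split(r"&&|\|\||;|\|", script):
        for tok in cmd.split():
            if "=" in tok and re.match(r"^[A-Za-z_][A-Za-z0-9_]*=", tok):
                continue  # FOO=bar env assignment
            tok = re.sub(r"^(\./)?node_modules/\.bin/", "", tok)
            if tok in WRAPPERS or tok.startswith("-"):
                continue  # runner prefix or flag: keep peeling
            bins.append(tok)
            break
    return bins
```

Each extracted name is then looked up in the bin-to-package map built from the lockfile entries.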
Test additions:
- 5 new edge cases: removing vite, typescript, eslint, @biomejs/biome,
and (@biomejs/biome + @vitejs/plugin-react) together. Correctly
flags @biomejs/biome and the combo as FAIL while vite / typescript
/ eslint are kept by peers.
- 8 new classify() unit cases: TypeScript ambient `declare module`,
namespace imports, combined default+named, default-as-named,
re-export default (4 forms), `.then()` dynamic imports without
await, and TypeScript `import()` in type position.
Current total: 29 edge + 36 classify-unit + 12 adversarial = 77 / 77.
* [pre-commit.ci] auto fixes from pre-commit.com hooks
for more information, see https://pre-commit.ci
* scripts: detect package.json field references to packages
A survey of package.json patterns in 10+ popular repos (React,
Vue/Svelte/Astro/Next.js, Vite, Storybook, TanStack/Query, Tailwind,
ESLint, TypeScript, Prettier, SvelteKit) showed that several config
fields in package.json itself can reference packages by string. The
checker previously filtered all of package.json out of the
string_literal fallback, so removing a package that is only
referenced from one of these fields was a false negative.
Now covered (new pkg_json_field kind):
- overrides / resolutions / pnpm.overrides keys
- pnpm.patchedDependencies keys
- peerDependenciesMeta keys
- prettier: "@my/prettier-config" string
- eslintConfig.extends (string or array)
- stylelint.extends / stylelint.plugins
- babel.presets / babel.plugins
- jest.preset / jest.setupFiles / jest.transform
- commitlint.extends
- renovate.extends
- remarkConfig.plugins
- any other tool config field whose strings/keys equal the pkg
name or `pkg/subpath`
False-positive guards (do not flag string values inside):
- browserslist (browser queries)
- keywords (free-form strings)
- engines / engineStrict / packageManager / volta (version pins)
- files / directories / publishConfig (paths)
- workspaces (paths/globs)
- main / module / browser / types / typings / exports / imports /
bin / man (author-side fields)
- scripts (already handled separately via scripts_bin_refs)
- name / version / description / author / repository / homepage etc.
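The field walk can be sketched as a recursive scan that skips the guarded fields and reports any matching key or string value (names here are illustrative, not the script's actual API):

```python
SKIP_FIELDS = {"browserslist", "keywords", "engines", "scripts",
               "files", "workspaces", "name", "version", "description"}

def pkg_json_field_refs(obj, pkg, path=()):
    """Find tool-config fields in package.json that reference a package.

    Walk the manifest recursively, skip author-side / free-form fields,
    and report the dotted path of any key or string value equal to the
    package name or a pkg/subpath of it.
    """
    hits = []
    if path and path[0] in SKIP_FIELDS:
        return hits
    if isinstance(obj, dict):
        for key, val in obj.items():
            if key == pkg or key.startswith(pkg + "/"):
                hits.append(".".join(path + (key,)))
            hits += pkg_json_field_refs(val, pkg, path + (key,))
    elif isinstance(obj, list):
        for item in obj:
            hits += pkg_json_field_refs(item, pkg, path)
    elif isinstance(obj, str):
        if obj == pkg or obj.startswith(pkg + "/"):
            hits.append(".".join(path))
    return hits
```

This catches both key-style references (overrides, patchedDependencies) and value-style references (prettier, extends arrays) with one traversal.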
Test additions: new PkgFieldCase suite with 19 cases covering each
tool config field, subpath references, and the 5 false-positive
guards. Combined with the existing 29 edge / 36 classify / 12
adversarial cases, the suite is 96 / 96.
* scripts: enumerate dead deps in studio/frontend
Adds an opt-in dead-dep enumeration to the existing safety check.
Iterates every package declared in studio/frontend/package.json
(all four dep fields combined) and reports each as one of:
- used: at least one detected reference -- in src/, a config file,
  package.json scripts (bin), a package.json tool-config field
  (overrides / prettier / eslintConfig / stylelint / babel / jest /
  commitlint / renovate / etc.), or tsconfig.compilerOptions.types
- unused: no detected reference anywhere
- type_pkg_kept: @types/X where X is still declared (or X = node,
  always implicit)
- type_pkg_orphan: @types/X where X is no longer declared --
  candidate for removal alongside X
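The bucketing itself is simple; a sketch under the assumption that usage detection has already produced the set of referenced names (the scoped `@types/foo__bar` case is handled separately and omitted here):

```python
def classify_dep(name, declared, referenced):
    """Bucket one declared dependency for the --enumerate-dead report.

    declared is the set of names across all four dep fields;
    referenced is the subset with at least one detected reference.
    """
    if name.startswith("@types/"):
        runtime = name[len("@types/"):]
        if runtime == "node" or runtime in declared:
            return "type_pkg_kept"   # types for a still-declared pkg
        return "type_pkg_orphan"     # candidate for removal
    return "used" if name in referenced else "unused"
```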
Wiring:
- New CLI flag `--enumerate-dead` (off by default).
- CI workflow now passes `--enumerate-dead` so the report shows on
every PR run; the report is informational unless `--strict` is
also set.
- With `--strict`, unused / type_pkg_orphan entries fail the run.
Tests:
- 5 new EnumCase scenarios:
E01 fake dep with no usage -> reported unused
E02 fake dep imported by a synthetic src file -> reported used
E03 fake dep referenced only in overrides -> reported used
E04 @types/X paired with X (also imported) -> kept
E05 @types/X without X -> orphan
Running the new flag against the current main reproduces exactly the
11 deps PR #5477 removed, validating the heuristic end to end.
Current total: 29 edge + 36 classify + 12 adversarial + 19 pkg-json
field + 5 enumeration = 101 / 101.
* [pre-commit.ci] auto fixes from pre-commit.com hooks
for more information, see https://pre-commit.ci
* ci: fetch base ref before running dep removal safety check
actions/checkout uses fetch-depth: 1 by default, so when the
dependency removal check ran `git show origin/main:.../package.json`
the ref wasn't available locally and the script exited 2 with
"could not read base package.json at origin/main:...".
Fetch the single base commit before invoking the check so the
git-show lookup resolves. --depth=1 keeps the extra fetch cheap.
* ci: address bot review on PR 5478
Five issues flagged across gemini and codex:
* --base-lock argparse arg was defined and advertised in the
docstring, but main() always read args.head_lock in both branches
-- the flag did nothing. Dropped the dead arg and the misleading
docstring line; the lockfile-reachability analysis only needs the
head lockfile.
* lock_resolvable() was defined but never called. Removed.
* read_pkg_file() did not specify an encoding for read_text().
Added encoding="utf-8" for cross-platform stability.
* read_pkg_file() returned {} when the path did not exist, so a
bad --head-lock value silently bypassed the reachability checks
(false PASS for removals that resolve through npm script bins).
main() now exits 2 with a clear message when the head lockfile
is missing, matching the existing behavior for the head pkg.
* studio-frontend-ci.yml pull_request paths filter only matched
studio/frontend/** and the workflow file, so PRs that modified
the checker script or its test could skip this job. Added both
files to the trigger.
* ci: address 10x reviewer findings on dep removal safety check
Eight P1s and three P2s surfaced across 10 codex reviewers; this
commit addresses all of them.
P1s:
1. Workflow refspec. `git fetch --depth=1 origin <base_ref>` may only
create FETCH_HEAD in shallow PR checkouts; the checker then dies
with `fatal: invalid object name 'origin/main'`. Use the explicit
refspec `<base>:refs/remotes/origin/<base>` so origin/<base> is
reliably created.
2. `_deps_of()` was counting optional peer dependencies as reachable.
npm only installs an optional peer when another package declares
the same dep, so for "is this removed package still in the tree"
they cannot keep it alive on their own. Skip entries marked
`optional: true` in `peerDependenciesMeta`.
3. JS-syntactic classifiers (static_import, side_effect_import,
dynamic_import, require, re_export, jsdoc_import, template_literal,
tsc_triple_slash, new_url) now gate on file extension. Previously
only the final string-literal fallback was gated, so a JS-shaped
string inside a Python fixture or a Markdown code fence triggered
a false FAIL. Added U37-U40 covering .py / .md / .sh / .yml.
4. HTML `<script src=>` and `<link href=>` patterns now respect a
package-name boundary so `/node_modules/foo-extra/...` is not
treated as a usage of `foo`. Added U41-U43.
5. New `find_command_usage()` detects CLI invocations in .sh / .yml
/ .yaml / .ps1 / .bat / Dockerfile* (npx pkg, bunx pkg, pnpm exec
pkg, yarn dlx pkg, or a bare pkg --flag). Also covers scoped CLI
packages exposed by their unscoped tail (@biomejs/biome -> biome).
6. `build_bin_to_pkg(head_lock)` was losing the bin -> package map
for packages the PR correctly removed from the lockfile, so
`scripts.biome:check` no longer flagged when @biomejs/biome was
being dropped. Now also read the base lockfile (via `git show` or
the new `--base-lock` override) and layer its bin map on top for
any package in the removed set.
7. `--strict` now runs hygiene checks (lockfile sync, @types
orphans, undeclared imports, dead-deps) on the no-removal path
too. Previously the early return at "[OK] no dependencies removed"
skipped them, so `--strict` silently passed on a tree with
uncommitted lockfile drift or unused deps.
8. Removed `@types/X` packages are now matched against the runtime
target name `X`: `/// <reference types="X" />`, tsconfig
compilerOptions.types entries, AND runtime `import "X"` shapes.
Handles the npm scope encoding (`@types/foo__bar` -> `@foo/bar`).
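Two of the fixes above can be sketched in simplified form (helper names are illustrative; field shapes follow the npm lockfile format): the optional-peer rule from P1 #2 and the @types scope decoding from P1 #8.

```python
def deps_of(entry):
    """Names one lockfile entry can keep alive on its own (P1 #2).

    Regular deps and required peers count; a peer marked optional in
    peerDependenciesMeta cannot keep a removed package installed by
    itself, so it is skipped.
    """
    deps = set(entry.get("dependencies", {}))
    meta = entry.get("peerDependenciesMeta", {})
    for peer in entry.get("peerDependencies", {}):
        if not meta.get(peer, {}).get("optional", False):
            deps.add(peer)
    return deps

def types_runtime_target(types_pkg):
    """Runtime package a DefinitelyTyped package types (P1 #8).

    npm encodes scoped packages under @types with a double underscore,
    so @types/foo__bar provides types for @foo/bar.
    """
    tail = types_pkg[len("@types/"):]
    if "__" in tail:
        scope, _, name = tail.partition("__")
        return f"@{scope}/{name}"
    return tail
```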
P2s:
9. CSS `url(...)` now accepts both quoted and unquoted forms (added
U44-U45). The previous regex required `/{pkg}/` after a slash,
missing bare-package urls like `url(katex/fonts/x.woff2)`.
10. `find_imports_without_decl()` now covers all static-import
shapes: `import "pkg"`, `import Foo from "pkg"`,
`import { Foo } from "pkg"`, `import type { Foo } from "pkg"`,
`await import("pkg")`, `require("pkg")`.
11. (Same as #8.) Removed `@types/X` is also linked to runtime
imports of `X`, not just type-only references.
Test suite expanded from 101 to 110 cases; all pass. Real-world
enumerate-dead still flags the same 11 unused packages on
studio/dep-removal-safety-check (matches PR 5477's removal set).
* [pre-commit.ci] auto fixes from pre-commit.com hooks
for more information, see https://pre-commit.ci
* ci: address 4x Opus reviewer findings on dep removal check
Three blockers from the parallel Opus review batch:
1. scripts_bin_refs ignored every script that began with a wrapper.
The original "first non-env token wins" heuristic credited
cross-env / dotenv / dotenvx / env-cmd as the bin, so a script like
`cross-env CI=1 biome check` left @biomejs/biome looking unused.
Rewrote into _next_real_bin(), which peels env prefixes, the
leading package-manager runner (npx / pnpx / bunx / pnpm exec /
yarn dlx), and the known wrapper bins (with --/-flag-arg handling)
before returning the real CLI. shlex tokenization preserves quoted
env values like `FOO="a b"`.
2. enumerate_dep_usage skipped find_command_usage. The non-enumerate
path already credited deps used only from CI / Dockerfile / shell
scripts, but `--enumerate-dead` did not, so packages referenced
only from a workflow were silently listed as dead. Added the same
call (gated against @types/* to avoid the unscoped-tail false
positive).
3. classify multi-line window was ±4 lines. Prettier formats long
named-import lists one identifier per line, so a 20-import block
pushed the `import` keyword out of the window and the dep dropped
to the string-literal fallback (or worse, was missed entirely).
Widened to ±25 -- still bounded enough to keep false-positives
negligible, wide enough for the realistic Prettier ceiling.
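The peeling loop from item 1 can be sketched roughly as follows (simplified: it assumes every wrapper flag takes exactly one argument except a bare `--`, which the real implementation handles more carefully):

```python
import shlex

ENV_WRAPPERS = {"cross-env", "dotenv", "dotenvx", "env-cmd"}
RUNNERS = {"npx", "pnpx", "bunx"}

def next_real_bin(command):
    """Peel env assignments, runners, and wrapper bins off one command.

    shlex keeps quoted values like FOO="a b" as one token; wrappers
    and their flag arguments are skipped until the first real CLI
    name, or None if the command is all prefixes.
    """
    toks = shlex.split(command)
    i = 0
    while i < len(toks):
        tok = toks[i]
        if tok == "--":
            i += 1          # end-of-flags marker
        elif tok.startswith("-"):
            i += 2          # wrapper flag plus its argument
        elif "=" in tok:
            i += 1          # FOO=bar env assignment
        elif tok in ENV_WRAPPERS or tok in RUNNERS:
            i += 1          # wrapper/runner bin: the real CLI follows
        else:
            return tok
    return None
```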
Tests: added 10 _next_real_bin unit cases + 4 scripts_bin_refs
end-to-end cases (W01-W10 + I01-I04) and a 22-identifier multi-line
import adversarial case (A13). Full suite: 125/125.
* [pre-commit.ci] auto fixes from pre-commit.com hooks
for more information, see https://pre-commit.ci
---------
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Unsloth Studio lets you run and train models locally.
Features • Quickstart • Notebooks • Documentation
⚡ Get started
macOS, Linux, WSL:
curl -fsSL https://unsloth.ai/install.sh | sh
Windows:
irm https://unsloth.ai/install.ps1 | iex
⭐ Features
Unsloth Studio (Beta) lets you run and train text, audio, embedding, vision models on Windows, Linux and macOS.
Inference
- Search + download + run models including GGUF, LoRA adapters, safetensors
- Export models: Save or export models to GGUF, 16-bit safetensors and other formats.
- Tool calling: Support for self-healing tool calling and web search
- Code execution: lets LLMs test code in Claude artifacts and sandbox environments
- API inference endpoint: Deploy and run local LLMs in Claude Code, Codex tools with Unsloth
- Auto set inference settings and customize chat templates.
- We work directly with teams behind gpt-oss, Qwen3, Llama 4, Mistral, Gemma 1-3, and Phi-4, where we’ve fixed bugs that improve model accuracy.
- Upload images, audio, PDFs, code, DOCX and more file types to chat with.
Training
- Train and RL 500+ models up to 2x faster with up to 70% less VRAM, with no accuracy loss.
- Custom Triton and mathematical kernels. See some collabs we did with PyTorch and Hugging Face.
- Data Recipes: Auto-create datasets from PDF, CSV, DOCX etc. Edit data in a visual-node workflow.
- Reinforcement Learning (RL): The most efficient RL library, using 80% less VRAM for GRPO, FP8 etc.
- Supports full fine-tuning, RL, pretraining, and 4-bit, 16-bit, and FP8 training.
- Observability: Monitor training live, track loss and GPU usage and customize graphs.
- Multi-GPU training is supported, with major improvements coming soon.
📥 Install
Unsloth can be used in two ways: through Unsloth Studio, the web UI, or through Unsloth Core, the code-based version. Each has different requirements.
Unsloth Studio (web UI)
Unsloth Studio (Beta) works on Windows, Linux, WSL and macOS.
- CPU: Supported for Chat and Data Recipes currently
- NVIDIA: Training works on RTX 30/40/50, Blackwell, DGX Spark, Station and more
- macOS: Currently supports chat and Data Recipes. MLX training is coming very soon
- AMD: Chat + Data works. Train with Unsloth Core. Studio support is coming soon.
- Coming soon: Training support for Apple MLX, AMD, and Intel.
- Multi-GPU: Available now, with a major upgrade on the way
macOS, Linux, WSL:
curl -fsSL https://unsloth.ai/install.sh | sh
Windows:
irm https://unsloth.ai/install.ps1 | iex
Launch
unsloth studio -p 8888
For cloud VMs or LAN access, add -H 0.0.0.0 to bind on all interfaces.
Update
To update, use the same install commands as above. Or run (does not work on Windows):
unsloth studio update
Docker
Use our unsloth/unsloth Docker image. Run:
docker run -d -e JUPYTER_PASSWORD="mypassword" \
-p 8888:8888 -p 8000:8000 -p 2222:22 \
-v $(pwd)/work:/workspace/work \
--gpus all \
unsloth/unsloth
Developer, Nightly, Uninstall
For developer, nightly, and uninstall instructions, see advanced installation.
Unsloth Core (code-based)
Linux, WSL:
curl -LsSf https://astral.sh/uv/install.sh | sh
uv venv unsloth_env --python 3.13
source unsloth_env/bin/activate
uv pip install unsloth --torch-backend=auto
Windows:
winget install -e --id Python.Python.3.13
winget install --id=astral-sh.uv -e
uv venv unsloth_env --python 3.13
.\unsloth_env\Scripts\activate
uv pip install unsloth --torch-backend=auto
For Windows, pip install unsloth works only if you have PyTorch installed. Read our Windows Guide.
You can use the same Docker image as Unsloth Studio.
AMD, Intel:
For RTX 50x, B200, 6000 GPUs: uv pip install unsloth --torch-backend=auto. Read our guides for: Blackwell and DGX Spark.
To install Unsloth on AMD and Intel GPUs, follow our AMD Guide and Intel Guide.
📒 Free Notebooks
Train for free with our notebooks. You can use our new free Unsloth Studio notebook to run and train models in a web UI. Read our guide. Add a dataset, run, then deploy your trained model.
| Model | Free Notebooks | Performance | Memory use |
|---|---|---|---|
| Gemma 4 (E2B) | ▶️ Start for free | 1.5x faster | 50% less |
| Qwen3.5 (4B) | ▶️ Start for free | 1.5x faster | 60% less |
| gpt-oss (20B) | ▶️ Start for free | 2x faster | 70% less |
| Qwen3.5 GSPO | ▶️ Start for free | 2x faster | 70% less |
| gpt-oss (20B): GRPO | ▶️ Start for free | 2x faster | 80% less |
| Qwen3: Advanced GRPO | ▶️ Start for free | 2x faster | 70% less |
| embeddinggemma (300M) | ▶️ Start for free | 2x faster | 20% less |
| Mistral Ministral 3 (3B) | ▶️ Start for free | 1.5x faster | 60% less |
| Llama 3.1 (8B) Alpaca | ▶️ Start for free | 2x faster | 70% less |
| Llama 3.2 Conversational | ▶️ Start for free | 2x faster | 70% less |
| Orpheus-TTS (3B) | ▶️ Start for free | 1.5x faster | 50% less |
- See all our notebooks for: Kaggle, GRPO, TTS, embedding & Vision
- See all our models and all our notebooks
- See detailed documentation for Unsloth here
🦥 Unsloth News
- API inference endpoint: Deploy and run local LLMs in Claude Code, Codex tools. Guide
- Qwen3.6: Qwen3.6-35B-A3B can now be trained and run in Unsloth Studio. Blog
- Gemma 4: Run and train Google’s new models directly in Unsloth. Blog
- Introducing Unsloth Studio: our new web UI for running and training LLMs. Blog
- Qwen3.5 - 0.8B, 2B, 4B, 9B, 27B, 35-A3B, 112B-A10B are now supported. Guide + notebooks
- Train MoE LLMs 12x faster with 35% less VRAM - DeepSeek, GLM, Qwen and gpt-oss. Blog
- Embedding models: Unsloth now supports ~1.8-3.3x faster embedding fine-tuning. Blog • Notebooks
- New 7x longer context RL vs. all other setups, via our new batching algorithms. Blog
- New RoPE & MLP Triton Kernels & Padding Free + Packing: 3x faster training & 30% less VRAM. Blog
- 500K Context: Training a 20B model with >500K context is now possible on an 80GB GPU. Blog
- FP8 & Vision RL: You can now do FP8 & VLM GRPO on consumer GPUs. FP8 Blog • Vision RL
📥 Advanced Installation
The below advanced instructions are for Unsloth Studio. For Unsloth Core advanced installation, view our docs.
Developer installs: macOS, Linux, WSL:
git clone https://github.com/unslothai/unsloth
cd unsloth
./install.sh --local
unsloth studio -p 8888
Then to update:
unsloth studio update
Developer installs: Windows PowerShell:
git clone https://github.com/unslothai/unsloth.git
cd unsloth
Set-ExecutionPolicy -Scope Process -ExecutionPolicy Bypass
.\install.ps1 --local
unsloth studio -p 8888
Then to update:
unsloth studio update
Nightly: macOS, Linux, WSL:
git clone https://github.com/unslothai/unsloth
cd unsloth
git checkout nightly
./install.sh --local
unsloth studio -p 8888
Then to launch every time:
unsloth studio -p 8888
Nightly: Windows:
Run in Windows PowerShell:
git clone https://github.com/unslothai/unsloth.git
cd unsloth
git checkout nightly
Set-ExecutionPolicy -Scope Process -ExecutionPolicy Bypass
.\install.ps1 --local
unsloth studio -p 8888
Then to launch every time:
unsloth studio -p 8888
Uninstall
You can uninstall Unsloth Studio by deleting its install folder, usually located under $HOME/.unsloth/studio on macOS/Linux/WSL and %USERPROFILE%\.unsloth\studio on Windows. These commands delete everything, including your history and cache:
- macOS, WSL, Linux: rm -rf ~/.unsloth/studio
- Windows (PowerShell): Remove-Item -Recurse -Force "$HOME\.unsloth\studio"
For more info, see our docs.
Deleting model files
You can delete old model files either from the bin icon in model search or by removing the relevant cached model folder from the default Hugging Face cache directory. By default, HF uses:
- macOS, Linux, WSL: ~/.cache/huggingface/hub/
- Windows: %USERPROFILE%\.cache\huggingface\hub\
💚 Community and Links
| Type | Links |
|---|---|
| Join Discord server | |
| Join Reddit community | |
| 📚 Documentation & Wiki | Read Our Docs |
| Follow us on X | |
| 🔮 Our Models | Unsloth Catalog |
| ✍️ Blog | Read our Blogs |
Citation
You can cite the Unsloth repo as follows:
@software{unsloth,
author = {Daniel Han and Michael Han and Unsloth team},
title = {Unsloth},
url = {https://github.com/unslothai/unsloth},
year = {2023}
}
If you trained a model with 🦥Unsloth, you can use this cool sticker!
License
Unsloth uses a dual-licensing model of Apache 2.0 and AGPL-3.0. The core Unsloth package remains licensed under Apache 2.0, while certain optional components, such as the Unsloth Studio UI, are licensed under the open-source license AGPL-3.0.
This structure helps support ongoing Unsloth development while keeping the project open source and enabling the broader ecosystem to continue growing.
Thank You to
- The llama.cpp library that lets users run and save models with Unsloth
- The Hugging Face team and their libraries: transformers and TRL
- The PyTorch and Torch AO teams for their contributions
- NVIDIA for their NeMo DataDesigner library and their contributions
- And of course for every single person who has contributed or has used Unsloth!