mirror of
https://github.com/unslothai/unsloth.git
synced 2026-04-26 10:31:03 +00:00
* Studio: add github_repo seed reader and GitHub Support Bot recipe
Adds a first-party Data Designer seed reader that scrapes GitHub issues,
pull requests, and commits from one or more repositories via the GraphQL
API, and a learning recipe (GitHub Support Bot) that turns those rows into
synthetic support Q&A pairs for fine-tuning.
Backend (new plugin studio/backend/plugins/data-designer-github-repo-seed):
* GitHubRepoSeedSource config: repos, token (falls back to GH_TOKEN /
GITHUB_TOKEN env var), item_types (issues / pulls / commits),
per-resource limit (0 means all), max_comments_per_item.
* Rate-limit-aware GraphQL client (GitHubClient + RepoScraper) shared
across repos; flattens each item into a uniform row with columns
item_type, repo, number, title, body, state, author, created_at,
closed_at, url, labels, comments.
* Registered via the data_designer.plugins entry point.
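The flattening step can be sketched roughly as follows. This is a minimal illustration with hypothetical GraphQL field names mirroring a typical issue node, not the plugin's actual code; the real scraper handles PRs, commits, and edge cases too.

```python
def flatten_item(item_type: str, repo: str, node: dict) -> dict:
    """Flatten one GraphQL node into the uniform row shape."""
    return {
        "item_type": item_type,  # "issues" / "pulls" / "commits"
        "repo": repo,
        "number": node.get("number"),
        "title": node.get("title", ""),
        "body": node.get("body", ""),
        "state": node.get("state"),
        "author": (node.get("author") or {}).get("login"),
        "created_at": node.get("createdAt"),
        "closed_at": node.get("closedAt"),
        "url": node.get("url"),
        "labels": [l["name"] for l in node.get("labels", {}).get("nodes", [])],
        "comments": [c["body"] for c in node.get("comments", {}).get("nodes", [])],
    }
```

Every item type lands in the same column set, so downstream blocks can template against one schema regardless of whether a row came from an issue, a PR, or a commit.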
Frontend:
* New seed_github block variant so the seed node card shows
"GitHub repositories" instead of the generic "Document file"
placeholder, with its own icon and inline summary (repo count +
item-type list).
* Rewritten seed dialog github_repo form: repos textarea pre-filled with
unslothai/unsloth + unslothai/unsloth-zoo, password input for the GH
token, items-per-repo number with an "All" toggle, and the noisier
options (item types, max comments, include comments) tucked under an
Advanced collapsible.
* Local model auto-load on Run: if a recipe uses an is_local provider
and the inference server is not already serving that model, the
executions hook calls /api/inference/load first. Removes the "open
/chat to load a model" prerequisite that users kept tripping on.
* Honor the recipe's run.rows value in the Run dialog (previously the
store reset to 5 regardless of what the template shipped).
Recipe (studio/frontend/src/features/data-recipes/learning-recipes/
github-support-bot.json):
* Defaults to the Local Model provider + unsloth/gemma-4-E2B-it-GGUF.
* Scrapes unslothai/unsloth and unslothai/unsloth-zoo, issues and pulls,
up to 100 items per resource.
* Two LLM blocks: normalized_question (llm-text) rewrites each thread
into a clean support question, support_answer (llm-structured)
produces JSON with answer / diagnosis_questions / cites / confidence.
* Run defaults to 10 rows for a quick smoke test.
Verified end-to-end on a running Studio: card renders, source-data
dialog is pre-populated, All toggle disables the limit input, the
recipe executes and produces rows against a loaded local GGUF.
* [pre-commit.ci] auto fixes from pre-commit.com hooks
for more information, see https://pre-commit.ci
* fix: improve GitHub recipe support
* [pre-commit.ci] auto fixes from pre-commit.com hooks
for more information, see https://pre-commit.ci
* Studio: speed up GitHub scraper and harden the support-bot recipe
Addresses a perf issue found while demoing the github_repo seed reader:
the scraper was too slow at scale. The PR GraphQL query pulls deeply nested
fields (reviewThreads, reviews, commits, timelineItems, etc.), so the
page size was pinned at 3 to stay under GitHub's node-count ceiling; 100
PRs meant 34 serial round trips. Added lighter query variants
(PRS_PAGE_QUERY_LIGHT, ISSUES_PAGE_QUERY_LIGHT) that drop the fields the
Studio flatten layer does not use (it only reads title, body, state,
author, labels, comments). With the light query PR pages can safely go
to 25 per page and issues to 50. The plugin scraper now passes
light=True to RepoScraper so Studio always uses the fast path; the heavy
query remains available for other callers.
Recipe defaults are now demo-ready with production knobs called out:
- max_parallel_requests: 1 and max_tokens: 800 so small local models
stay stable when running the support_answer structured column.
- support_answer prompt trimmed to 80-200 words so gemma-4-E2B GGUF can
actually comply with the schema. The canonical 150-300 word codex
prompt is still documented in the node3 markdown note for
production upgrades.
* Studio: rename GitHub recipe to 'GitHub Scraper' and add Easy mode
Changes the recipe framing from a single-purpose 'Support Bot' pipeline
to a general-purpose scraper that produces {user_request,
grounded_response} training pairs. Aligns with the canonical
github_data_gatherer dataset (11 enrichment tasks mirrored in pr_requests_20
/ issue_requests_20 on the input side and explain_pr / issue_fix_plan /
issue_solution on the output side).
Recipe JSON changes:
- columns[0] renamed normalized_question -> user_request, prompt now
inverts a GitHub thread into a realistic user ask instead of
normalising it.
- columns[1] renamed support_answer -> coauthor_response, emits
{response, followups, cites, task, confidence} and branches on
issue vs PR thread type.
- Notes rewritten to document the 11-task catalog and the canonical
production prompt to paste in for a full dataset backfill.
Frontend: Easy mode for github_repo recipes. The drag-and-drop canvas is
hidden behind an 'Advanced' tab; Easy mode is the default for any recipe
whose seed_source_type is github_repo. The Easy form reuses the existing
GithubRepoSeedForm (promoted to exported), adds a rows input bound to
previewRows, a model field bound to the model_config, and a single Run
button that calls runPreview() directly (no modal). Non-github recipes
see the same Editor / Runs tabs as before.
View mode persists per-recipe-id in localStorage under
recipe-studio:view-mode:<recipeId>.
* Studio: auto-detect server GH_TOKEN and widen Easy-mode detection
The GitHub seed form now fetches /api/data-recipe/seed/github/env-token
on mount and, when the server exposes a GH_TOKEN / GITHUB_TOKEN env var
and the token field is blank, shows a small 'Using server env var' badge
and swaps the placeholder text. The token value itself is never returned
to the UI.
Widens Easy-mode detection in recipe-studio-page.tsx so that recipes
saved before ui.seed_source_type was persisted also get the Easy tab:
falls back to recipe.seed_config.source.seed_type, which is always
present for github_repo seeds.
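The fallback order can be sketched like this. The actual detection lives in TypeScript in recipe-studio-page.tsx; this is a Python sketch of the same logic with hypothetical dict shapes standing in for the recipe object.

```python
def is_github_recipe(recipe: dict) -> bool:
    """Prefer the persisted ui.seed_source_type; fall back to the seed
    config's seed_type, which is always present for github_repo seeds."""
    seed_type = (
        recipe.get("ui", {}).get("seed_source_type")
        or recipe.get("seed_config", {}).get("source", {}).get("seed_type")
    )
    return seed_type == "github_repo"
```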
* fix: polish GitHub recipe UI
* Studio: default llama-server --threads to -1 (auto)
Previously we passed --threads only when the caller set an explicit
value, which meant llama-server fell back to its internal default.
That default has varied across llama.cpp builds (some versions use
hardware concurrency including hyperthreads, which hurts throughput on
CPU-heavy inference). Always passing --threads -1 pins the behaviour
to llama.cpp's auto-detect (physical cores).
Caller-supplied n_threads still wins when non-None.
* [pre-commit.ci] auto fixes from pre-commit.com hooks
for more information, see https://pre-commit.ci
* Studio: auto-switch Easy mode to Runs pane on run start
Easy mode had no progress island or canvas overlay, so after clicking Run
the only visible state was the button label flipping to "Running..." while
the screen otherwise stayed identical. This reads as stuck even though the
job is progressing.
Wire an onExecutionStart callback from recipe-studio-page.tsx through to
useRecipeExecutions so that when a run is kicked off from easy mode, the
page flips to the executions view where the Runs sidebar, progress bar,
rate/ETA panel, and live log are rendered. Advanced/editor mode keeps its
existing behavior and stays on the canvas (it already has the floating
ExecutionProgressIsland).
* fix: clean up GitHub scraper layout
* Studio: forward llm-structured output_format as llama-server response_format
Local GGUF runs of llm-structured columns used to generate the full
max_tokens budget before the prompt-level "return JSON in a ```json
fence" instruction got parsed. Small models (e.g. gemma-4-E2B-it)
routinely broke format, so each row took ~65s and frequently failed
with "No parsable JSON structure within ```json markdown fence".
For any local-provider model_config referenced by an llm-structured
column, clone the model_config and inject response_format into the
clone's inference_parameters. Uses llama.cpp server's flat shape
(tools/server/README.md):
{"type": "json_schema", "schema": <output_format>}
This is not the OpenAI-nested form: data_designer's OpenAI adapter forwards
response_format verbatim via facade._COMPLETION_REQUEST_FIELDS, and
llama-server's documented schema path expects the flat variant.
The clone is per (model_alias, column) so:
- llm-text / llm-judge columns that share the same alias keep
free-form sampling.
- Each structured column gets its own schema, so columns with
different output_formats don't collide.
Effect on gemma-4-E2B-it demos: every row parses cleanly, and the
model terminates immediately after the closing brace instead of
running to max_tokens. Net wall-clock is usually faster even though
grammar-constrained sampling is slightly slower per token.
* [pre-commit.ci] auto fixes from pre-commit.com hooks
for more information, see https://pre-commit.ci
* Studio: flip Easy to Runs pane before validation scrape, not after
Previously onExecutionStart fired inside runExecution, which runs AFTER
validateRecipe() -- and validation re-invokes the seed reader. For the
github_repo reader that is a full GraphQL scrape, so the user sat on a
"Running..." button with an otherwise unchanged Easy form for 10-15s
before anything moved.
Call onExecutionStart at the top of runWithValidation, right after we
have a payload to send. The view flips immediately; ensureLocalModelLoaded
+ validateRecipe now run against the Runs pane instead of a frozen Easy
form. runExecution still calls onExecutionStart downstream, but the
callback is idempotent (the page's easy -> executions guard skips the
second call), so no behaviour change for runs that pass validation.

If validation fails the toast + runErrors path still fires; the Easy
form's error banner still reads runErrors when the user switches back.
* Studio: unify data-recipe workflow auth on sk-unsloth-* keys
The previous commit (a61b4cc9) assumed storage.create_api_key(..., internal=True)
and storage.revoke_internal_api_key(key_id) existed, but those helpers were
only in the working tree, never committed. Recipe runs in local-model mode
were therefore crashing with 500 when _inject_local_providers tried to mint
a workflow key. This commit ships the missing pieces.
auth/storage.py:
- api_keys schema gains is_internal INTEGER DEFAULT 0 (with a guarded
ALTER TABLE migration so existing auth.db files upgrade in place).
- create_api_key takes an internal=False kwarg; internal keys are flagged
so they can be hidden from user-facing listings.
- list_api_keys takes include_internal=False so UIs never see workflow keys.
- New revoke_internal_api_key(key_id): id-only revoke for keys minted by
non-user subjects (the JobManager does not know a username).
core/data_recipe/jobs/manager.py:
- JobManager.start accepts internal_api_key_id and stores it on Job so
lifecycle handlers can revoke eagerly.
- _handle_event revokes on EVENT_JOB_COMPLETED / _ERROR / _CANCELLED.
- _pump_loop subprocess-died fallback also retires the key so a crashed
worker cannot leak a live sk-unsloth-* beyond its TTL.
- Revocation is best-effort (swallow exceptions) -- the 24h TTL is the
safety net if storage hiccups.
core/data_recipe/jobs/types.py:
- Job dataclass gains internal_api_key_id: int | None = None.
Replaces the bespoke 24h JWT path that jobs.py used to mint for local
providers. One mint/revoke/verify surface for every API key the server
issues, and revocation is now eager (seconds, not 24h) instead of TTL-only.
* Studio: plug workflow-key leak on unexpected create_job errors
Review follow-up on the sk-unsloth-* workflow-key lifecycle in
create_job. Previously the revoke handlers wrapped mgr.start(...) but
only caught RuntimeError and ValueError, and get_job_manager() sat
outside the try block entirely. Any other exception type (TypeError
from a mismatched kwarg, OSError from the queue write, etc.) would
bubble up to FastAPI and leave the minted key live until its 24h TTL.
Fix: one try block covers both get_job_manager() and mgr.start(), with
a trailing except Exception that revokes and re-raises. The
RuntimeError -> 409 and ValueError -> 400 paths are unchanged so
specific client-facing status codes still surface. Revocation is still
best-effort (_revoke_internal_api_key_safe swallows errors) because we
never want revoke failures to mask the original crash.
Severity is low -- the key can't bootstrap longer access and the 24h
TTL bounds the window -- but the reviewer's point stands: eager revoke
on every failure path is the right invariant.
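The invariant can be sketched as one guarded call site (hypothetical names; `get_job_manager`, `mgr.start`, and the revoke helper stand in for the real route code):

```python
def start_job_with_key(get_job_manager, start_kwargs: dict,
                       internal_api_key_id: int, revoke_safe) -> object:
    """Start a job; eagerly revoke the minted workflow key on ANY failure."""
    try:
        mgr = get_job_manager()           # inside the try: this can raise too
        return mgr.start(**start_kwargs)
    except Exception:
        revoke_safe(internal_api_key_id)  # best-effort; must not mask the error
        raise                             # re-raise so FastAPI maps the status
```

Because the `except Exception` re-raises unchanged, the existing RuntimeError -> 409 and ValueError -> 400 mappings still see the original exception type.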
* Studio: nest response_format under extra_body so pydantic accepts it
The previous commit dropped response_format at the top level of a cloned
model_config's inference_parameters, which BuilderConfig rejected with:
ValidationError: Extra inputs are not permitted [type=extra_forbidden]
data_designer.model_configs.1.inference_parameters.response_format
data_designer's BaseInferenceParams is a pydantic model with extra=forbid
and only a fixed set of fields (temperature, top_p, max_tokens,
max_parallel_requests, timeout, extra_body). The pass-through path for
anything the schema doesn't know about is `extra_body`, which the
OpenAI SDK spreads into the chat-completions request body at the top
level -- which is exactly where llama-server reads response_format from.
Inject under extra_body (merging with any existing extra_body contents)
so the clone validates. llama-server still receives
{"type": "json_schema", "schema": <output_format>} at the top level of
the request body, which is the flat shape llama.cpp's server expects.
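The merge can be sketched with plain dicts standing in for data_designer's pydantic models (names here are illustrative, not the actual route code):

```python
def inject_response_format(inference_parameters: dict, output_format: dict) -> dict:
    """Clone the params and nest llama-server's flat response_format
    under extra_body, merging with any existing extra_body contents."""
    clone = dict(inference_parameters)
    extra_body = dict(clone.get("extra_body") or {})  # merge, don't clobber
    extra_body["response_format"] = {"type": "json_schema", "schema": output_format}
    clone["extra_body"] = extra_body
    return clone
```

The OpenAI SDK spreads `extra_body` into the top level of the request body, so llama-server still sees the flat `{"type": "json_schema", "schema": ...}` shape.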
* [pre-commit.ci] auto fixes from pre-commit.com hooks
for more information, see https://pre-commit.ci
* Studio: forward response_format to llama-server and fence-wrap the reply
Two-part fix for the llm-structured data-recipe path:
(1) The /v1/chat/completions proxy was dropping response_format. The
route's passthrough branch only triggered on tools / tool messages, so
requests carrying a JSON schema fell into the non-passthrough GGUF path
which calls generate_chat_completion (no response_format kwarg). The
schema never reached llama-server, so guided decoding was a no-op and
the model emitted free-form text that happened to parse a fraction of
the time. Widen the passthrough trigger and teach _build_passthrough_payload
to forward response_format so llama-server's GBNF grammar actually runs.
Guided decoding does not require supports_tools, so split the condition:
a request is now passthrough-routed if it carries tools/tool messages
(existing behavior) OR carries response_format (new). The vision guard,
streaming fork, and tools-choice defaulting are unchanged.
(2) data_designer's llm-structured parser looks for a ```json ... ```
markdown fence and discards anything else. Guided decoding emits only
the JSON object (the GBNF grammar has no fence tokens), so a
100%-valid schema-constrained run still ended up 0 ok / N failed with
"No parsable JSON structure within ```json markdown fence". In
_openai_passthrough_non_streaming, wrap each choice's content in the
expected fence when the caller asked for guided decoding. Already-fenced
content is left alone so other clients that prefer raw JSON are not
affected; the wrap is scoped to requests that carried response_format.
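The wrap step can be sketched as below. Assumption: already-fenced content is detected with a simple substring check, which is a simplification of whatever the route actually does.

```python
def fence_json(content: str) -> str:
    """Wrap guided-decoding output in the ```json fence data_designer
    expects; leave already-fenced content alone."""
    if "```json" in content:
        return content  # other clients' pre-fenced output is untouched
    return f"```json\n{content}\n```"
```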
Net effect on the GitHub Support Bot recipe on a local GGUF: schema
actually binds during sampling, content arrives wrapped in the fence
data_designer expects, and generation terminates immediately after the
closing brace instead of running out to max_tokens.
* Studio: Easy mode runs a full run, capped at the user's row count
Easy mode used to call runPreview, which produces a test run: no
artifact persisted, reduced progress tracking, and framed in the Runs
pane as "Test run". The whole point of the form is to let a user kick
off a real dataset build with one click, so wire it to runFull instead
and bind the Rows input to fullRows (not previewRows).
runFull requires a non-empty fullRunName. The Easy form has no run-name
input, so seed a default on mount whenever Easy is active and
fullRunName is still empty. Uses `<recipe name> <iso-timestamp>` so
each Easy run gets a stable-ish default that still sorts chronologically
in the Runs pane. User can override it from the Advanced run dialog
before clicking Run.
Rename GithubScraperEasyView's rows props from previewRows/setPreviewRows
to rows/setRows so the view stays agnostic to which hook state the page
chooses to bind. Loading indicator now follows fullLoading.
* Studio: clamp GitHub scrape page size and memoize the materialization
Two wins for the "before Generating fires" gap on small previews:
(1) scrape_{issues,prs,commits} hardcoded per_page (50 / 25 / 100) and
only checked the trial limit AFTER the page was written, so a 1-row
Easy run still asked GitHub for a full 50-issue + 25-PR page, wrote
them all to JSONL, and then stopped because total_new already exceeded
the trial cap. Cap per_page at min(page_cap, trial_limit) so
github_limit=1 actually asks for first:1.
(2) GitHubRepoSeedReader.get_dataset_uri used to scrape fresh on every
invocation. data_designer calls the seed reader multiple times per
recipe job (validation, preview, per-column sampling), so a 2-repo
Easy preview ran the full GraphQL scrape three times back-to-back,
burning ~15s of dead air before any LLM generation began.
Added a module-level in-process cache keyed on
(repos, item_types, limit, include_comments, max_comments_per_item,
sha256(token)[:16]) that stores the JSONL path of the first
materialization. Subsequent calls with the same signature return the
cached path, guarded by a staleness check that drops the entry if the
file was tmp-cleaned. Raw token values never land in the key.
Net effect on a 1-row Easy run, 2 repos, limit=1: 2 GraphQL round
trips instead of ~12, and the first-to-Generating gap collapses from
~15s to roughly 2-3s.
* Studio: make Easy mode Rows input editable instead of snapping to 1
The Rows to generate input used type="number" with value bound directly
to the rows state and an onChange that coerced any non-positive parse
result back to 1. The moment the user pressed backspace to clear the
field, the parent re-rendered with value=1 and the caret jumped, making
it impossible to change the value without arrowing the browser's +/-
spinner.
Switch to a text input with inputMode="numeric" and pattern="[0-9]*"
(so mobile still shows a numeric keyboard, and the browser drops the
spinner buttons the user did not want). Add a local rowsText buffer so
the field can hold transient empty / partial digit strings while
editing without fighting the parent state; the canonical rows value
only advances when the buffer parses to a valid integer in [1, 10000],
and onBlur clamps back to 1 or 10000 if the user left it out of range.
No behavior change for valid numeric edits - the downstream runFull()
still sees a clean positive integer.
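The buffer/commit/clamp logic can be sketched as pure functions. The actual code is TypeScript component state; this Python sketch only shows the value rules.

```python
def commit_rows(buffer: str, current: int) -> int:
    """Advance the canonical rows value only when the buffer parses to a
    valid integer in [1, 10000]; otherwise keep the old value so
    transient empty / partial input doesn't fight the caret."""
    if buffer.isdigit():
        value = int(buffer)
        if 1 <= value <= 10_000:
            return value
    return current

def clamp_on_blur(buffer: str) -> int:
    # onBlur: clamp anything out of range back into [1, 10000].
    value = int(buffer) if buffer.isdigit() else 1
    return max(1, min(value, 10_000))
```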
* Studio: expand dataset cells horizontally by column on click
Click a long cell to expand that whole column. Click again to collapse.
Replaces the prior row-level vertical expansion which made it hard to
compare cells across columns. State is scoped per execution and per
column; the row itself is no longer a click target.
* Studio: force expanded dataset column to grow wide enough to read
* Studio: disable thinking for local recipe inference and plumb the kwarg
Reasoning-capable models (gemma-3n, qwen3.5, etc.) emit a
<think>...</think> preamble ahead of the answer by default, which
roughly doubles the generated token count per row on a local GGUF
and pushes the actual answer past data_designer's json-fence regex
on llm-structured columns. Recipes want the terse answer, not the
scratchpad.
Two halves of the fix:
(1) routes/data_recipe/jobs.py: when _inject_local_providers walks
the recipe's model_configs to point them at the local endpoint, also
stash chat_template_kwargs={"enable_thinking": false} under each
config's inference_parameters.extra_body. OpenAI SDK spreads
extra_body into the top-level request body, so llama-server and the
Studio /v1/chat/completions route both see it.
(2) routes/inference.py: the chat-completions route previously
dropped chat_template_kwargs on the floor because the whitelist
body builder only forwarded known fields.
- At the top of openai_chat_completions, lift
chat_template_kwargs.enable_thinking from payload.model_extra
onto the typed payload.enable_thinking field when the caller
did not set the latter, so the non-passthrough GGUF path's
generate_chat_completion(...) call honors the override.
- Teach _build_passthrough_payload to forward a
chat_template_kwargs dict, and have _build_openai_passthrough_body
derive that dict from payload.enable_thinking so
response_format requests (structured columns) also land at
llama-server with the reasoning preamble suppressed.
Net effect on a 10-row support-bot run with gemma-4-E2B-it-GGUF:
responses arrive without <think> tags, wall-clock per call drops
roughly in half, and structured columns stop leaking reasoning
tokens through the GBNF-constrained output.
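Half (1) can be sketched with dicts standing in for the route's model_config objects (illustrative names, not the actual jobs.py code):

```python
def disable_thinking(model_config: dict) -> dict:
    """Stash chat_template_kwargs={"enable_thinking": False} under
    inference_parameters.extra_body, merging with existing contents.
    The OpenAI SDK spreads extra_body into the top-level request body,
    which is where llama-server reads chat_template_kwargs."""
    clone = dict(model_config)
    params = dict(clone.get("inference_parameters") or {})
    extra_body = dict(params.get("extra_body") or {})
    kwargs = dict(extra_body.get("chat_template_kwargs") or {})
    kwargs["enable_thinking"] = False
    extra_body["chat_template_kwargs"] = kwargs
    params["extra_body"] = extra_body
    clone["inference_parameters"] = params
    return clone
```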
* Studio: update GitHub Support Bot learning recipe with maintainer layout
Replace the template with the hand-laid-out export from the maintainer
so note nodes ship with real x/y positions (scattered around the
graph instead of all stacked at x=480) and the edges / canvas pan look
correct on first load. Also picks up the maintainer's prompt tweaks and
output schema names (coauthor_response / user_request / followups / task /
cites / confidence).
Diff is mostly ui.nodes positions and prompt bodies; runtime shape is
unchanged (seed_config / columns still target model_1 against the Local
Model provider).
* Studio: auto-size dataset sample columns; wide text gets a wide column
Drop the per-column click-to-expand toggle and the 180-char truncation.
Every column now renders its full value. Columns with long text get a
min-w of 48rem so the text is readable without wrapping into a tall
block; narrow-content columns get a 12rem min-w. The table wrapper
already has overflow-x-auto, so wide-column totals cause a horizontal
scrollbar instead of cramming everything into the viewport.
* [pre-commit.ci] auto fixes from pre-commit.com hooks
for more information, see https://pre-commit.ci
* fix GitHub scrape progress
* [pre-commit.ci] auto fixes from pre-commit.com hooks
for more information, see https://pre-commit.ci
* add resetApiBase export for test setup
* Studio: rename github-support-bot output columns to User / Assistant
Previously emitted user_request and coauthor_response, which did not
match the canonical User / Assistant chat-pair shape that downstream
SFT consumers expect. Renamed the columns in the recipe JSON (columns,
UI node ids, edges, notes, prompt Jinja refs) and the matching copy in
the learning-recipes index, data-recipes-page, and easy view.
---------
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Co-authored-by: wasimysaid <wasimysdev@gmail.com>
1187 lines
97 KiB
TOML
[build-system]
requires = ["setuptools==80.9.0", "setuptools-scm==9.2.0"]
build-backend = "setuptools.build_meta"

[project]
name = "unsloth"
dynamic = ["version"]
description = "2-5X faster training, reinforcement learning & finetuning"
readme = "README.md"
requires-python = ">=3.9,<3.15"
license = "Apache-2.0"
keywords = ["ai", "llm", "reinforcement learning", "machine learning", "artificial intelligence", "pytorch"]
authors = [
    {email = "info@unsloth.ai"},
    {name = "Unsloth AI team"},
]
maintainers = [
    {name = "Daniel Han", email = "daniel@unsloth.ai"},
    {name = "Michael Han", email = "info@unsloth.ai"},
]
classifiers = [
    "Programming Language :: Python",
    "Environment :: GPU",
    "Environment :: GPU :: NVIDIA CUDA",
    "Topic :: Scientific/Engineering :: Artificial Intelligence",
]
dependencies = [
    "typer",
    "pydantic",
    "pyyaml",
    "nest-asyncio",
]

[project.scripts]
unsloth = "unsloth_cli:app"

[tool.setuptools.dynamic]
version = {attr = "unsloth.models._utils.__version__"}

[tool.setuptools]
include-package-data = true

[tool.setuptools.package-data]
studio = [
    "*.sh",
    "*.ps1",
    "*.bat",
    "frontend/dist/**/*",
    "frontend/*.json",
    "frontend/*.ts",
    "frontend/*.js",
    "frontend/*.html",
    "frontend/*.yaml",
    "frontend/.git*",
    "backend/requirements/**/*",
    "backend/plugins/**/*",
    "backend/core/data_recipe/oxc-validator/*.json",
    "backend/core/data_recipe/oxc-validator/*.mjs",
]

[tool.setuptools.packages.find]
include = ["unsloth*", "unsloth_cli*", "studio", "studio.backend*"]
exclude = ["images*", "tests*", "*.node_modules", "*.node_modules.*"]

[project.optional-dependencies]
triton = [
    "triton>=3.0.0 ; ('linux' in sys_platform)",
    "triton-windows ; (sys_platform == 'win32') and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
]

huggingfacenotorch = [
    "wheel>=0.42.0",
    "packaging",
    "numpy",
    "tqdm",
    "psutil",
    "tyro",
    "protobuf",
    "sentencepiece>=0.2.0",
    "datasets>=3.4.1,!=4.0.*,!=4.1.0,<4.4.0",
    "accelerate>=0.34.1",
    "peft>=0.18.0,!=0.11.0",
    "huggingface_hub>=0.34.0",
    "hf_transfer",
    "diffusers",
    "transformers>=4.51.3,!=4.52.0,!=4.52.1,!=4.52.2,!=4.52.3,!=4.53.0,!=4.54.0,!=4.55.0,!=4.55.1,!=4.57.0,!=4.57.4,!=4.57.5,!=5.0.0,!=5.1.0,<=5.5.0",
    "trl>=0.18.2,!=0.19.0,<=0.24.0",
    "sentence-transformers",
]
huggingface = [
    "unsloth[huggingfacenotorch]",
    "unsloth_zoo>=2026.4.8",
    "torchvision",
    "unsloth[triton]",
]
windows = [
    "unsloth[huggingface]",
    "bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0 ; (sys_platform == 'win32')",
    "xformers>=0.0.22.post7 ; (sys_platform == 'win32')",
]
base = [
    "unsloth[huggingface]",
]
cu118only = [
    "xformers @ https://download.pytorch.org/whl/cu118/xformers-0.0.22.post7%2Bcu118-cp39-cp39-manylinux2014_x86_64.whl ; python_version=='3.9' and ('linux' in sys_platform)",
    "xformers @ https://download.pytorch.org/whl/cu118/xformers-0.0.22.post7%2Bcu118-cp310-cp310-manylinux2014_x86_64.whl ; python_version=='3.10' and ('linux' in sys_platform)",
    "xformers @ https://download.pytorch.org/whl/cu118/xformers-0.0.22.post7%2Bcu118-cp311-cp311-manylinux2014_x86_64.whl ; python_version=='3.11' and ('linux' in sys_platform)",
]
cu121only = [
    "xformers @ https://download.pytorch.org/whl/cu121/xformers-0.0.22.post7-cp39-cp39-manylinux2014_x86_64.whl ; python_version=='3.9' and ('linux' in sys_platform)",
    "xformers @ https://download.pytorch.org/whl/cu121/xformers-0.0.22.post7-cp310-cp310-manylinux2014_x86_64.whl ; python_version=='3.10' and ('linux' in sys_platform)",
    "xformers @ https://download.pytorch.org/whl/cu121/xformers-0.0.22.post7-cp311-cp311-manylinux2014_x86_64.whl ; python_version=='3.11' and ('linux' in sys_platform)",
]
cu118onlytorch211 = [
    "xformers @ https://download.pytorch.org/whl/cu118/xformers-0.0.23%2Bcu118-cp39-cp39-manylinux2014_x86_64.whl ; python_version=='3.9' and ('linux' in sys_platform)",
    "xformers @ https://download.pytorch.org/whl/cu118/xformers-0.0.23%2Bcu118-cp310-cp310-manylinux2014_x86_64.whl ; python_version=='3.10' and ('linux' in sys_platform)",
    "xformers @ https://download.pytorch.org/whl/cu118/xformers-0.0.23%2Bcu118-cp311-cp311-manylinux2014_x86_64.whl ; python_version=='3.11' and ('linux' in sys_platform)",
]
cu121onlytorch211 = [
    "xformers @ https://download.pytorch.org/whl/cu121/xformers-0.0.23-cp39-cp39-manylinux2014_x86_64.whl ; python_version=='3.9' and ('linux' in sys_platform)",
    "xformers @ https://download.pytorch.org/whl/cu121/xformers-0.0.23-cp310-cp310-manylinux2014_x86_64.whl ; python_version=='3.10' and ('linux' in sys_platform)",
    "xformers @ https://download.pytorch.org/whl/cu121/xformers-0.0.23-cp311-cp311-manylinux2014_x86_64.whl ; python_version=='3.11' and ('linux' in sys_platform)",
]
cu118onlytorch212 = [
    "xformers @ https://download.pytorch.org/whl/cu118/xformers-0.0.23.post1%2Bcu118-cp39-cp39-manylinux2014_x86_64.whl ; python_version=='3.9' and ('linux' in sys_platform)",
    "xformers @ https://download.pytorch.org/whl/cu118/xformers-0.0.23.post1%2Bcu118-cp310-cp310-manylinux2014_x86_64.whl ; python_version=='3.10' and ('linux' in sys_platform)",
    "xformers @ https://download.pytorch.org/whl/cu118/xformers-0.0.23.post1%2Bcu118-cp311-cp311-manylinux2014_x86_64.whl ; python_version=='3.11' and ('linux' in sys_platform)",
]
cu121onlytorch212 = [
    "xformers @ https://download.pytorch.org/whl/cu121/xformers-0.0.23.post1-cp39-cp39-manylinux2014_x86_64.whl ; python_version=='3.9' and ('linux' in sys_platform)",
    "xformers @ https://download.pytorch.org/whl/cu121/xformers-0.0.23.post1-cp310-cp310-manylinux2014_x86_64.whl ; python_version=='3.10' and ('linux' in sys_platform)",
    "xformers @ https://download.pytorch.org/whl/cu121/xformers-0.0.23.post1-cp311-cp311-manylinux2014_x86_64.whl ; python_version=='3.11' and ('linux' in sys_platform)",
]
cu118onlytorch220 = [
    "xformers @ https://download.pytorch.org/whl/cu118/xformers-0.0.24%2Bcu118-cp39-cp39-manylinux2014_x86_64.whl ; python_version=='3.9' and ('linux' in sys_platform)",
    "xformers @ https://download.pytorch.org/whl/cu118/xformers-0.0.24%2Bcu118-cp310-cp310-manylinux2014_x86_64.whl ; python_version=='3.10' and ('linux' in sys_platform)",
    "xformers @ https://download.pytorch.org/whl/cu118/xformers-0.0.24%2Bcu118-cp311-cp311-manylinux2014_x86_64.whl ; python_version=='3.11' and ('linux' in sys_platform)",
]
cu121onlytorch220 = [
    "xformers @ https://download.pytorch.org/whl/cu121/xformers-0.0.24-cp39-cp39-manylinux2014_x86_64.whl ; python_version=='3.9' and ('linux' in sys_platform)",
    "xformers @ https://download.pytorch.org/whl/cu121/xformers-0.0.24-cp310-cp310-manylinux2014_x86_64.whl ; python_version=='3.10' and ('linux' in sys_platform)",
    "xformers @ https://download.pytorch.org/whl/cu121/xformers-0.0.24-cp311-cp311-manylinux2014_x86_64.whl ; python_version=='3.11' and ('linux' in sys_platform)",
]
cu118onlytorch230 = [
    "xformers @ https://download.pytorch.org/whl/cu118/xformers-0.0.27%2Bcu118-cp39-cp39-manylinux2014_x86_64.whl ; python_version=='3.9' and ('linux' in sys_platform)",
    "xformers @ https://download.pytorch.org/whl/cu118/xformers-0.0.27%2Bcu118-cp310-cp310-manylinux2014_x86_64.whl ; python_version=='3.10' and ('linux' in sys_platform)",
    "xformers @ https://download.pytorch.org/whl/cu118/xformers-0.0.27%2Bcu118-cp311-cp311-manylinux2014_x86_64.whl ; python_version=='3.11' and ('linux' in sys_platform)",
    "xformers @ https://download.pytorch.org/whl/cu118/xformers-0.0.27%2Bcu118-cp312-cp312-manylinux2014_x86_64.whl ; python_version=='3.12' and ('linux' in sys_platform)",
]
cu121onlytorch230 = [
    "xformers @ https://download.pytorch.org/whl/cu121/xformers-0.0.27-cp39-cp39-manylinux2014_x86_64.whl ; python_version=='3.9' and ('linux' in sys_platform)",
    "xformers @ https://download.pytorch.org/whl/cu121/xformers-0.0.27-cp310-cp310-manylinux2014_x86_64.whl ; python_version=='3.10' and ('linux' in sys_platform)",
    "xformers @ https://download.pytorch.org/whl/cu121/xformers-0.0.27-cp311-cp311-manylinux2014_x86_64.whl ; python_version=='3.11' and ('linux' in sys_platform)",
    "xformers @ https://download.pytorch.org/whl/cu121/xformers-0.0.27-cp312-cp312-manylinux2014_x86_64.whl ; python_version=='3.12' and ('linux' in sys_platform)",
]
cu118onlytorch240 = [
    "xformers @ https://download.pytorch.org/whl/cu118/xformers-0.0.27.post2%2Bcu118-cp39-cp39-manylinux2014_x86_64.whl ; python_version=='3.9' and ('linux' in sys_platform)",
    "xformers @ https://download.pytorch.org/whl/cu118/xformers-0.0.27.post2%2Bcu118-cp310-cp310-manylinux2014_x86_64.whl ; python_version=='3.10' and ('linux' in sys_platform)",
    "xformers @ https://download.pytorch.org/whl/cu118/xformers-0.0.27.post2%2Bcu118-cp311-cp311-manylinux2014_x86_64.whl ; python_version=='3.11' and ('linux' in sys_platform)",
    "xformers @ https://download.pytorch.org/whl/cu118/xformers-0.0.27.post2%2Bcu118-cp312-cp312-manylinux2014_x86_64.whl ; python_version=='3.12' and ('linux' in sys_platform)",
]
cu121onlytorch240 = [
    "xformers @ https://download.pytorch.org/whl/cu121/xformers-0.0.28.post1-cp39-cp39-manylinux_2_28_x86_64.whl ; python_version=='3.9' and ('linux' in sys_platform)",
    "xformers @ https://download.pytorch.org/whl/cu121/xformers-0.0.28.post1-cp310-cp310-manylinux_2_28_x86_64.whl ; python_version=='3.10' and ('linux' in sys_platform)",
    "xformers @ https://download.pytorch.org/whl/cu121/xformers-0.0.28.post1-cp311-cp311-manylinux_2_28_x86_64.whl ; python_version=='3.11' and ('linux' in sys_platform)",
    "xformers @ https://download.pytorch.org/whl/cu121/xformers-0.0.28.post1-cp312-cp312-manylinux_2_28_x86_64.whl ; python_version=='3.12' and ('linux' in sys_platform)",
]
cu124onlytorch240 = [
    "xformers @ https://download.pytorch.org/whl/cu124/xformers-0.0.28.post1-cp39-cp39-manylinux_2_28_x86_64.whl ; python_version=='3.9' and ('linux' in sys_platform)",
    "xformers @ https://download.pytorch.org/whl/cu124/xformers-0.0.28.post1-cp310-cp310-manylinux_2_28_x86_64.whl ; python_version=='3.10' and ('linux' in sys_platform)",
    "xformers @ https://download.pytorch.org/whl/cu124/xformers-0.0.28.post1-cp311-cp311-manylinux_2_28_x86_64.whl ; python_version=='3.11' and ('linux' in sys_platform)",
    "xformers @ https://download.pytorch.org/whl/cu124/xformers-0.0.28.post1-cp312-cp312-manylinux_2_28_x86_64.whl ; python_version=='3.12' and ('linux' in sys_platform)",
    "xformers @ https://download.pytorch.org/whl/cu124/xformers-0.0.28.post1-cp39-cp39-win_amd64.whl ; python_version=='3.9' and (sys_platform == 'win32')",
|
|
"xformers @ https://download.pytorch.org/whl/cu124/xformers-0.0.28.post1-cp310-cp310-win_amd64.whl ; python_version=='3.10' and (sys_platform == 'win32')",
|
|
"xformers @ https://download.pytorch.org/whl/cu124/xformers-0.0.28.post1-cp311-cp311-win_amd64.whl ; python_version=='3.11' and (sys_platform == 'win32')",
|
|
"xformers @ https://download.pytorch.org/whl/cu124/xformers-0.0.28.post1-cp312-cp312-win_amd64.whl ; python_version=='3.12' and (sys_platform == 'win32')",
|
|
]
|
|
cu118onlytorch250 = [
|
|
"xformers @ https://download.pytorch.org/whl/cu118/xformers-0.0.28.post2-cp39-cp39-manylinux_2_28_x86_64.whl ; python_version=='3.9' and ('linux' in sys_platform)",
|
|
"xformers @ https://download.pytorch.org/whl/cu118/xformers-0.0.28.post2-cp310-cp310-manylinux_2_28_x86_64.whl ; python_version=='3.10' and ('linux' in sys_platform)",
|
|
"xformers @ https://download.pytorch.org/whl/cu118/xformers-0.0.28.post2-cp311-cp311-manylinux_2_28_x86_64.whl ; python_version=='3.11' and ('linux' in sys_platform)",
|
|
"xformers @ https://download.pytorch.org/whl/cu118/xformers-0.0.28.post2-cp312-cp312-manylinux_2_28_x86_64.whl ; python_version=='3.12' and ('linux' in sys_platform)",
|
|
]
|
|
cu121onlytorch250 = [
|
|
"xformers @ https://download.pytorch.org/whl/cu121/xformers-0.0.28.post2-cp39-cp39-manylinux_2_28_x86_64.whl ; python_version=='3.9' and ('linux' in sys_platform)",
|
|
"xformers @ https://download.pytorch.org/whl/cu121/xformers-0.0.28.post2-cp310-cp310-manylinux_2_28_x86_64.whl ; python_version=='3.10' and ('linux' in sys_platform)",
|
|
"xformers @ https://download.pytorch.org/whl/cu121/xformers-0.0.28.post2-cp311-cp311-manylinux_2_28_x86_64.whl ; python_version=='3.11' and ('linux' in sys_platform)",
|
|
"xformers @ https://download.pytorch.org/whl/cu121/xformers-0.0.28.post2-cp312-cp312-manylinux_2_28_x86_64.whl ; python_version=='3.12' and ('linux' in sys_platform)",
|
|
]
|
|
cu124onlytorch250 = [
|
|
"xformers @ https://download.pytorch.org/whl/cu124/xformers-0.0.28.post2-cp39-cp39-manylinux_2_28_x86_64.whl ; python_version=='3.9' and ('linux' in sys_platform)",
|
|
"xformers @ https://download.pytorch.org/whl/cu124/xformers-0.0.28.post2-cp310-cp310-manylinux_2_28_x86_64.whl ; python_version=='3.10' and ('linux' in sys_platform)",
|
|
"xformers @ https://download.pytorch.org/whl/cu124/xformers-0.0.28.post2-cp311-cp311-manylinux_2_28_x86_64.whl ; python_version=='3.11' and ('linux' in sys_platform)",
|
|
"xformers @ https://download.pytorch.org/whl/cu124/xformers-0.0.28.post2-cp312-cp312-manylinux_2_28_x86_64.whl ; python_version=='3.12' and ('linux' in sys_platform)",
|
|
"xformers @ https://download.pytorch.org/whl/cu124/xformers-0.0.28.post2-cp39-cp39-win_amd64.whl ; python_version=='3.9' and (sys_platform == 'win32')",
|
|
"xformers @ https://download.pytorch.org/whl/cu124/xformers-0.0.28.post2-cp310-cp310-win_amd64.whl ; python_version=='3.10' and (sys_platform == 'win32')",
|
|
"xformers @ https://download.pytorch.org/whl/cu124/xformers-0.0.28.post2-cp311-cp311-win_amd64.whl ; python_version=='3.11' and (sys_platform == 'win32')",
|
|
"xformers @ https://download.pytorch.org/whl/cu124/xformers-0.0.28.post2-cp312-cp312-win_amd64.whl ; python_version=='3.12' and (sys_platform == 'win32')",
|
|
]
|
|
cu118onlytorch251 = [
|
|
"xformers @ https://download.pytorch.org/whl/cu118/xformers-0.0.29.post1-cp39-cp39-manylinux_2_28_x86_64.whl ; python_version=='3.9' and ('linux' in sys_platform)",
|
|
"xformers @ https://download.pytorch.org/whl/cu118/xformers-0.0.29.post1-cp310-cp310-manylinux_2_28_x86_64.whl ; python_version=='3.10' and ('linux' in sys_platform)",
|
|
"xformers @ https://download.pytorch.org/whl/cu118/xformers-0.0.29.post1-cp311-cp311-manylinux_2_28_x86_64.whl ; python_version=='3.11' and ('linux' in sys_platform)",
|
|
"xformers @ https://download.pytorch.org/whl/cu118/xformers-0.0.29.post1-cp312-cp312-manylinux_2_28_x86_64.whl ; python_version=='3.12' and ('linux' in sys_platform)",
|
|
]
|
|
cu121onlytorch251 = [
|
|
"xformers @ https://download.pytorch.org/whl/cu121/xformers-0.0.29.post1-cp39-cp39-manylinux_2_28_x86_64.whl ; python_version=='3.9' and ('linux' in sys_platform)",
|
|
"xformers @ https://download.pytorch.org/whl/cu121/xformers-0.0.29.post1-cp310-cp310-manylinux_2_28_x86_64.whl ; python_version=='3.10' and ('linux' in sys_platform)",
|
|
"xformers @ https://download.pytorch.org/whl/cu121/xformers-0.0.29.post1-cp311-cp311-manylinux_2_28_x86_64.whl ; python_version=='3.11' and ('linux' in sys_platform)",
|
|
"xformers @ https://download.pytorch.org/whl/cu121/xformers-0.0.29.post1-cp312-cp312-manylinux_2_28_x86_64.whl ; python_version=='3.12' and ('linux' in sys_platform)",
|
|
]
|
|
cu124onlytorch251 = [
|
|
"xformers @ https://download.pytorch.org/whl/cu124/xformers-0.0.29.post1-cp39-cp39-manylinux_2_28_x86_64.whl ; python_version=='3.9' and ('linux' in sys_platform)",
|
|
"xformers @ https://download.pytorch.org/whl/cu124/xformers-0.0.29.post1-cp310-cp310-manylinux_2_28_x86_64.whl ; python_version=='3.10' and ('linux' in sys_platform)",
|
|
"xformers @ https://download.pytorch.org/whl/cu124/xformers-0.0.29.post1-cp311-cp311-manylinux_2_28_x86_64.whl ; python_version=='3.11' and ('linux' in sys_platform)",
|
|
"xformers @ https://download.pytorch.org/whl/cu124/xformers-0.0.29.post1-cp312-cp312-manylinux_2_28_x86_64.whl ; python_version=='3.12' and ('linux' in sys_platform)",
|
|
"xformers @ https://download.pytorch.org/whl/cu124/xformers-0.0.29.post1-cp39-cp39-win_amd64.whl ; python_version=='3.9' and (sys_platform == 'win32')",
|
|
"xformers @ https://download.pytorch.org/whl/cu124/xformers-0.0.29.post1-cp310-cp310-win_amd64.whl ; python_version=='3.10' and (sys_platform == 'win32')",
|
|
"xformers @ https://download.pytorch.org/whl/cu124/xformers-0.0.29.post1-cp311-cp311-win_amd64.whl ; python_version=='3.11' and (sys_platform == 'win32')",
|
|
"xformers @ https://download.pytorch.org/whl/cu124/xformers-0.0.29.post1-cp312-cp312-win_amd64.whl ; python_version=='3.12' and (sys_platform == 'win32')",
|
|
]
|
|
cu118onlytorch260 = [
|
|
"xformers @ https://download.pytorch.org/whl/cu118/xformers-0.0.29.post3-cp39-cp39-manylinux_2_28_x86_64.whl ; python_version=='3.9' and ('linux' in sys_platform)",
|
|
"xformers @ https://download.pytorch.org/whl/cu118/xformers-0.0.29.post3-cp310-cp310-manylinux_2_28_x86_64.whl ; python_version=='3.10' and ('linux' in sys_platform)",
|
|
"xformers @ https://download.pytorch.org/whl/cu118/xformers-0.0.29.post3-cp311-cp311-manylinux_2_28_x86_64.whl ; python_version=='3.11' and ('linux' in sys_platform)",
|
|
"xformers @ https://download.pytorch.org/whl/cu118/xformers-0.0.29.post3-cp312-cp312-manylinux_2_28_x86_64.whl ; python_version=='3.12' and ('linux' in sys_platform)",
|
|
]
|
|
cu124onlytorch260 = [
|
|
"xformers @ https://download.pytorch.org/whl/cu124/xformers-0.0.29.post3-cp39-cp39-manylinux_2_28_x86_64.whl ; python_version=='3.9' and ('linux' in sys_platform)",
|
|
"xformers @ https://download.pytorch.org/whl/cu124/xformers-0.0.29.post3-cp310-cp310-manylinux_2_28_x86_64.whl ; python_version=='3.10' and ('linux' in sys_platform)",
|
|
"xformers @ https://download.pytorch.org/whl/cu124/xformers-0.0.29.post3-cp311-cp311-manylinux_2_28_x86_64.whl ; python_version=='3.11' and ('linux' in sys_platform)",
|
|
"xformers @ https://download.pytorch.org/whl/cu124/xformers-0.0.29.post3-cp312-cp312-manylinux_2_28_x86_64.whl ; python_version=='3.12' and ('linux' in sys_platform)",
|
|
"xformers @ https://download.pytorch.org/whl/cu124/xformers-0.0.29.post3-cp39-cp39-win_amd64.whl ; python_version=='3.9' and (sys_platform == 'win32')",
|
|
"xformers @ https://download.pytorch.org/whl/cu124/xformers-0.0.29.post3-cp310-cp310-win_amd64.whl ; python_version=='3.10' and (sys_platform == 'win32')",
|
|
"xformers @ https://download.pytorch.org/whl/cu124/xformers-0.0.29.post3-cp311-cp311-win_amd64.whl ; python_version=='3.11' and (sys_platform == 'win32')",
|
|
"xformers @ https://download.pytorch.org/whl/cu124/xformers-0.0.29.post3-cp312-cp312-win_amd64.whl ; python_version=='3.12' and (sys_platform == 'win32')",
|
|
]
|
|
cu126onlytorch260 = [
|
|
"xformers @ https://download.pytorch.org/whl/cu126/xformers-0.0.29.post3-cp39-cp39-manylinux_2_28_x86_64.whl ; python_version=='3.9' and ('linux' in sys_platform)",
|
|
"xformers @ https://download.pytorch.org/whl/cu126/xformers-0.0.29.post3-cp310-cp310-manylinux_2_28_x86_64.whl ; python_version=='3.10' and ('linux' in sys_platform)",
|
|
"xformers @ https://download.pytorch.org/whl/cu126/xformers-0.0.29.post3-cp311-cp311-manylinux_2_28_x86_64.whl ; python_version=='3.11' and ('linux' in sys_platform)",
|
|
"xformers @ https://download.pytorch.org/whl/cu126/xformers-0.0.29.post3-cp312-cp312-manylinux_2_28_x86_64.whl ; python_version=='3.12' and ('linux' in sys_platform)",
|
|
"xformers @ https://download.pytorch.org/whl/cu126/xformers-0.0.29.post3-cp39-cp39-win_amd64.whl ; python_version=='3.9' and (sys_platform == 'win32')",
|
|
"xformers @ https://download.pytorch.org/whl/cu126/xformers-0.0.29.post3-cp310-cp310-win_amd64.whl ; python_version=='3.10' and (sys_platform == 'win32')",
|
|
"xformers @ https://download.pytorch.org/whl/cu126/xformers-0.0.29.post3-cp311-cp311-win_amd64.whl ; python_version=='3.11' and (sys_platform == 'win32')",
|
|
"xformers @ https://download.pytorch.org/whl/cu126/xformers-0.0.29.post3-cp312-cp312-win_amd64.whl ; python_version=='3.12' and (sys_platform == 'win32')",
|
|
]
|
|
cu118onlytorch270 = [
|
|
"xformers @ https://download.pytorch.org/whl/cu118/xformers-0.0.30-cp39-cp39-manylinux_2_28_x86_64.whl ; python_version=='3.9' and ('linux' in sys_platform)",
|
|
"xformers @ https://download.pytorch.org/whl/cu118/xformers-0.0.30-cp310-cp310-manylinux_2_28_x86_64.whl ; python_version=='3.10' and ('linux' in sys_platform)",
|
|
"xformers @ https://download.pytorch.org/whl/cu118/xformers-0.0.30-cp311-cp311-manylinux_2_28_x86_64.whl ; python_version=='3.11' and ('linux' in sys_platform)",
|
|
"xformers @ https://download.pytorch.org/whl/cu118/xformers-0.0.30-cp312-cp312-manylinux_2_28_x86_64.whl ; python_version=='3.12' and ('linux' in sys_platform)",
|
|
"xformers @ https://download.pytorch.org/whl/cu118/xformers-0.0.30-cp39-cp39-win_amd64.whl ; python_version=='3.9' and (sys_platform == 'win32')",
|
|
"xformers @ https://download.pytorch.org/whl/cu118/xformers-0.0.30-cp310-cp310-win_amd64.whl ; python_version=='3.10' and (sys_platform == 'win32')",
|
|
"xformers @ https://download.pytorch.org/whl/cu118/xformers-0.0.30-cp311-cp311-win_amd64.whl ; python_version=='3.11' and (sys_platform == 'win32')",
|
|
"xformers @ https://download.pytorch.org/whl/cu118/xformers-0.0.30-cp312-cp312-win_amd64.whl ; python_version=='3.12' and (sys_platform == 'win32')",
|
|
]
|
|
cu126onlytorch270 = [
|
|
"xformers @ https://download.pytorch.org/whl/cu126/xformers-0.0.30-cp39-cp39-manylinux_2_28_x86_64.whl ; python_version=='3.9' and ('linux' in sys_platform)",
|
|
"xformers @ https://download.pytorch.org/whl/cu126/xformers-0.0.30-cp310-cp310-manylinux_2_28_x86_64.whl ; python_version=='3.10' and ('linux' in sys_platform)",
|
|
"xformers @ https://download.pytorch.org/whl/cu126/xformers-0.0.30-cp311-cp311-manylinux_2_28_x86_64.whl ; python_version=='3.11' and ('linux' in sys_platform)",
|
|
"xformers @ https://download.pytorch.org/whl/cu126/xformers-0.0.30-cp312-cp312-manylinux_2_28_x86_64.whl ; python_version=='3.12' and ('linux' in sys_platform)",
|
|
"xformers @ https://download.pytorch.org/whl/cu126/xformers-0.0.30-cp39-cp39-win_amd64.whl ; python_version=='3.9' and (sys_platform == 'win32')",
|
|
"xformers @ https://download.pytorch.org/whl/cu126/xformers-0.0.30-cp310-cp310-win_amd64.whl ; python_version=='3.10' and (sys_platform == 'win32')",
|
|
"xformers @ https://download.pytorch.org/whl/cu126/xformers-0.0.30-cp311-cp311-win_amd64.whl ; python_version=='3.11' and (sys_platform == 'win32')",
|
|
"xformers @ https://download.pytorch.org/whl/cu126/xformers-0.0.30-cp312-cp312-win_amd64.whl ; python_version=='3.12' and (sys_platform == 'win32')",
|
|
]
|
|
cu128onlytorch270 = [
|
|
"xformers @ https://download.pytorch.org/whl/cu128/xformers-0.0.30-cp39-cp39-manylinux_2_28_x86_64.whl ; python_version=='3.9' and ('linux' in sys_platform)",
|
|
"xformers @ https://download.pytorch.org/whl/cu128/xformers-0.0.30-cp310-cp310-manylinux_2_28_x86_64.whl ; python_version=='3.10' and ('linux' in sys_platform)",
|
|
"xformers @ https://download.pytorch.org/whl/cu128/xformers-0.0.30-cp311-cp311-manylinux_2_28_x86_64.whl ; python_version=='3.11' and ('linux' in sys_platform)",
|
|
"xformers @ https://download.pytorch.org/whl/cu128/xformers-0.0.30-cp312-cp312-manylinux_2_28_x86_64.whl ; python_version=='3.12' and ('linux' in sys_platform)",
|
|
"xformers @ https://download.pytorch.org/whl/cu128/xformers-0.0.30-cp39-cp39-win_amd64.whl ; python_version=='3.9' and (sys_platform == 'win32')",
|
|
"xformers @ https://download.pytorch.org/whl/cu128/xformers-0.0.30-cp310-cp310-win_amd64.whl ; python_version=='3.10' and (sys_platform == 'win32')",
|
|
"xformers @ https://download.pytorch.org/whl/cu128/xformers-0.0.30-cp311-cp311-win_amd64.whl ; python_version=='3.11' and (sys_platform == 'win32')",
|
|
"xformers @ https://download.pytorch.org/whl/cu128/xformers-0.0.30-cp312-cp312-win_amd64.whl ; python_version=='3.12' and (sys_platform == 'win32')",
|
|
]
|
|
cu118onlytorch271 = [
|
|
"xformers @ https://download.pytorch.org/whl/cu118/xformers-0.0.31.post1-cp39-abi3-manylinux_2_28_x86_64.whl ; ('linux' in sys_platform)",
|
|
"xformers @ https://download.pytorch.org/whl/cu118/xformers-0.0.31.post1-cp39-abi3-win_amd64.whl ; (sys_platform == 'win32')",
|
|
]
|
|
cu126onlytorch271 = [
|
|
"xformers @ https://download.pytorch.org/whl/cu126/xformers-0.0.31.post1-cp39-abi3-manylinux_2_28_x86_64.whl ; ('linux' in sys_platform)",
|
|
"xformers @ https://download.pytorch.org/whl/cu126/xformers-0.0.31.post1-cp39-abi3-win_amd64.whl ; (sys_platform == 'win32')",
|
|
]
|
|
cu128onlytorch271 = [
|
|
"xformers @ https://download.pytorch.org/whl/cu128/xformers-0.0.31.post1-cp39-abi3-manylinux_2_28_x86_64.whl ; ('linux' in sys_platform)",
|
|
"xformers @ https://download.pytorch.org/whl/cu128/xformers-0.0.31.post1-cp39-abi3-win_amd64.whl ; (sys_platform == 'win32')",
|
|
]
|
|
cu118onlytorch280 = [
    "xformers @ https://download.pytorch.org/whl/cu126/xformers-0.0.32.post2-cp39-abi3-manylinux_2_28_x86_64.whl ; ('linux' in sys_platform)",
    "xformers @ https://download.pytorch.org/whl/cu126/xformers-0.0.32.post2-cp39-abi3-win_amd64.whl ; (sys_platform == 'win32')",
]
cu126onlytorch280 = [
    "xformers @ https://download.pytorch.org/whl/cu126/xformers-0.0.32.post2-cp39-abi3-manylinux_2_28_x86_64.whl ; ('linux' in sys_platform)",
    "xformers @ https://download.pytorch.org/whl/cu126/xformers-0.0.32.post2-cp39-abi3-win_amd64.whl ; (sys_platform == 'win32')",
]
cu128onlytorch280 = [
    "xformers @ https://download.pytorch.org/whl/cu128/xformers-0.0.32.post2-cp39-abi3-manylinux_2_28_x86_64.whl ; ('linux' in sys_platform)",
    "xformers @ https://download.pytorch.org/whl/cu128/xformers-0.0.32.post2-cp39-abi3-win_amd64.whl ; (sys_platform == 'win32')",
]
|
|
cu130onlytorch280 = [
|
|
]
|
|
cu126onlytorch290 = [
|
|
"xformers @ https://download.pytorch.org/whl/cu126/xformers-0.0.33.post1-cp39-abi3-manylinux_2_28_x86_64.whl ; ('linux' in sys_platform)",
|
|
"xformers @ https://download.pytorch.org/whl/cu126/xformers-0.0.33.post1-cp39-abi3-win_amd64.whl ; (sys_platform == 'win32')",
|
|
]
|
|
cu128onlytorch290 = [
|
|
"xformers @ https://download.pytorch.org/whl/cu128/xformers-0.0.33.post1-cp39-abi3-manylinux_2_28_x86_64.whl ; ('linux' in sys_platform)",
|
|
"xformers @ https://download.pytorch.org/whl/cu128/xformers-0.0.33.post1-cp39-abi3-win_amd64.whl ; (sys_platform == 'win32')",
|
|
]
|
|
cu130onlytorch290 = [
|
|
"xformers @ https://download.pytorch.org/whl/cu130/xformers-0.0.33.post1-cp39-abi3-manylinux_2_28_x86_64.whl ; ('linux' in sys_platform)",
|
|
"xformers @ https://download.pytorch.org/whl/cu130/xformers-0.0.33.post1-cp39-abi3-win_amd64.whl ; (sys_platform == 'win32')",
|
|
]
|
|
cu126onlytorch291 = [
|
|
"xformers @ https://download.pytorch.org/whl/cu126/xformers-0.0.33.post2-cp39-abi3-manylinux_2_28_x86_64.whl ; ('linux' in sys_platform)",
|
|
"xformers @ https://download.pytorch.org/whl/cu126/xformers-0.0.33.post2-cp39-abi3-win_amd64.whl ; (sys_platform == 'win32')",
|
|
]
|
|
cu128onlytorch291 = [
|
|
"xformers @ https://download.pytorch.org/whl/cu128/xformers-0.0.33.post2-cp39-abi3-manylinux_2_28_x86_64.whl ; ('linux' in sys_platform)",
|
|
"xformers @ https://download.pytorch.org/whl/cu128/xformers-0.0.33.post2-cp39-abi3-win_amd64.whl ; (sys_platform == 'win32')",
|
|
]
|
|
cu130onlytorch291 = [
|
|
"xformers @ https://download.pytorch.org/whl/cu130/xformers-0.0.33.post2-cp39-abi3-manylinux_2_28_x86_64.whl ; ('linux' in sys_platform)",
|
|
"xformers @ https://download.pytorch.org/whl/cu130/xformers-0.0.33.post2-cp39-abi3-win_amd64.whl ; (sys_platform == 'win32')",
|
|
]
|
|
cu126onlytorch2100 = [
|
|
"xformers @ https://download.pytorch.org/whl/cu126/xformers-0.0.34-cp39-abi3-manylinux_2_28_x86_64.whl ; ('linux' in sys_platform)",
|
|
"xformers @ https://download.pytorch.org/whl/cu126/xformers-0.0.34-cp39-abi3-win_amd64.whl ; (sys_platform == 'win32')",
|
|
]
|
|
cu128onlytorch2100 = [
|
|
"xformers @ https://download.pytorch.org/whl/cu128/xformers-0.0.34-cp39-abi3-manylinux_2_28_x86_64.whl ; ('linux' in sys_platform)",
|
|
"xformers @ https://download.pytorch.org/whl/cu128/xformers-0.0.34-cp39-abi3-win_amd64.whl ; (sys_platform == 'win32')",
|
|
]
|
|
cu130onlytorch2100 = [
|
|
"xformers @ https://download.pytorch.org/whl/cu130/xformers-0.0.34-cp39-abi3-manylinux_2_28_x86_64.whl ; ('linux' in sys_platform)",
|
|
"xformers @ https://download.pytorch.org/whl/cu130/xformers-0.0.34-cp39-abi3-win_amd64.whl ; (sys_platform == 'win32')",
|
|
]
|
|
cu118 = [
|
|
"unsloth[huggingface]",
|
|
"bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
|
|
"unsloth[cu118only]",
|
|
]
|
|
cu121 = [
|
|
"unsloth[huggingface]",
|
|
"bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
|
|
"unsloth[cu121only]",
|
|
]
|
|
cu118-torch211 = [
|
|
"unsloth[huggingface]",
|
|
"bitsandbytes==0.45.5",
|
|
"unsloth[cu118onlytorch211]",
|
|
]
|
|
cu121-torch211 = [
|
|
"unsloth[huggingface]",
|
|
"bitsandbytes==0.45.5",
|
|
"unsloth[cu121onlytorch211]",
|
|
]
|
|
cu118-torch212 = [
|
|
"unsloth[huggingface]",
|
|
"bitsandbytes==0.45.5",
|
|
"unsloth[cu118onlytorch212]",
|
|
]
|
|
cu121-torch212 = [
|
|
"unsloth[huggingface]",
|
|
"bitsandbytes==0.45.5",
|
|
"unsloth[cu121onlytorch212]",
|
|
]
|
|
cu118-torch220 = [
|
|
"unsloth[huggingface]",
|
|
"bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
|
|
"unsloth[cu118onlytorch220]",
|
|
]
|
|
cu121-torch220 = [
|
|
"unsloth[huggingface]",
|
|
"bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
|
|
"unsloth[cu121onlytorch220]",
|
|
]
|
|
cu118-torch230 = [
|
|
"unsloth[huggingface]",
|
|
"bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
|
|
"unsloth[cu118onlytorch230]",
|
|
]
|
|
cu121-torch230 = [
|
|
"unsloth[huggingface]",
|
|
"bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
|
|
"unsloth[cu121onlytorch230]",
|
|
]
|
|
cu118-torch240 = [
|
|
"unsloth[huggingface]",
|
|
"bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
|
|
"unsloth[cu118onlytorch240]",
|
|
]
|
|
cu121-torch240 = [
|
|
"unsloth[huggingface]",
|
|
"bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
|
|
"unsloth[cu121onlytorch240]",
|
|
]
|
|
cu124-torch240 = [
|
|
"unsloth[huggingface]",
|
|
"bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
|
|
"unsloth[cu124onlytorch240]",
|
|
]
|
|
cu118-torch250 = [
|
|
"unsloth[huggingface]",
|
|
"bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
|
|
"unsloth[cu118onlytorch250]",
|
|
]
|
|
cu121-torch250 = [
|
|
"unsloth[huggingface]",
|
|
"bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
|
|
"unsloth[cu121onlytorch250]",
|
|
]
|
|
cu124-torch250 = [
|
|
"unsloth[huggingface]",
|
|
"bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
|
|
"unsloth[cu124onlytorch250]",
|
|
]
|
|
cu118-torch251 = [
|
|
"unsloth[huggingface]",
|
|
"bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
|
|
"unsloth[cu118onlytorch251]",
|
|
]
|
|
cu121-torch251 = [
|
|
"unsloth[huggingface]",
|
|
"bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
|
|
"unsloth[cu121onlytorch251]",
|
|
]
|
|
cu124-torch251 = [
|
|
"unsloth[huggingface]",
|
|
"bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
|
|
"unsloth[cu124onlytorch251]",
|
|
]
|
|
cu118-torch260 = [
|
|
"unsloth[huggingface]",
|
|
"bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
|
|
"unsloth[cu118onlytorch260]",
|
|
]
|
|
cu124-torch260 = [
|
|
"unsloth[huggingface]",
|
|
"bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
|
|
"unsloth[cu124onlytorch260]",
|
|
]
|
|
cu126-torch260 = [
|
|
"unsloth[huggingface]",
|
|
"bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
|
|
"unsloth[cu126onlytorch260]",
|
|
]
|
|
cu118-torch270 = [
|
|
"unsloth[huggingface]",
|
|
"bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
|
|
"unsloth[cu118onlytorch270]",
|
|
]
|
|
cu126-torch270 = [
|
|
"unsloth[huggingface]",
|
|
"bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
|
|
"unsloth[cu126onlytorch270]",
|
|
]
|
|
cu128-torch270 = [
|
|
"unsloth[huggingface]",
|
|
"bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
|
|
"unsloth[cu128onlytorch270]",
|
|
]
|
|
cu118-torch271 = [
|
|
"unsloth[huggingface]",
|
|
"bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
|
|
"unsloth[cu118onlytorch271]",
|
|
]
|
|
cu126-torch271 = [
|
|
"unsloth[huggingface]",
|
|
"bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
|
|
"unsloth[cu126onlytorch271]",
|
|
]
|
|
cu128-torch271 = [
|
|
"unsloth[huggingface]",
|
|
"bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
|
|
"unsloth[cu128onlytorch271]",
|
|
]
|
|
cu118-torch280 = [
|
|
"unsloth[huggingface]",
|
|
"bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
|
|
"unsloth[cu118onlytorch280]",
|
|
]
|
|
cu126-torch280 = [
|
|
"unsloth[huggingface]",
|
|
"bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
|
|
"unsloth[cu126onlytorch280]",
|
|
]
|
|
cu128-torch280 = [
|
|
"unsloth[huggingface]",
|
|
"bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
|
|
"unsloth[cu128onlytorch280]",
|
|
]
|
|
cu130-torch280 = [
|
|
"unsloth[huggingface]",
|
|
"bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
|
|
"unsloth[cu130onlytorch280]",
|
|
]
|
|
cu126-torch290 = [
|
|
"unsloth[huggingface]",
|
|
"bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
|
|
"unsloth[cu126onlytorch290]",
|
|
]
|
|
cu128-torch290 = [
|
|
"unsloth[huggingface]",
|
|
"bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
|
|
"unsloth[cu128onlytorch290]",
|
|
]
|
|
cu130-torch290 = [
|
|
"unsloth[huggingface]",
|
|
"bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
|
|
"unsloth[cu130onlytorch290]",
|
|
]
|
|
cu126-torch291 = [
|
|
"unsloth[huggingface]",
|
|
"bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
|
|
"unsloth[cu126onlytorch291]",
|
|
]
|
|
cu128-torch291 = [
|
|
"unsloth[huggingface]",
|
|
"bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
|
|
"unsloth[cu128onlytorch291]",
|
|
]
|
|
cu130-torch291 = [
|
|
"unsloth[huggingface]",
|
|
"bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
|
|
"unsloth[cu130onlytorch291]",
|
|
]
|
|
cu126-torch2100 = [
|
|
"unsloth[huggingface]",
|
|
"bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
|
|
"unsloth[cu126onlytorch2100]",
|
|
]
|
|
cu128-torch2100 = [
|
|
"unsloth[huggingface]",
|
|
"bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
|
|
"unsloth[cu128onlytorch2100]",
|
|
]
|
|
cu130-torch2100 = [
|
|
"unsloth[huggingface]",
|
|
"bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
|
|
"unsloth[cu130onlytorch2100]",
|
|
]
|
|
kaggle = [
|
|
"unsloth[huggingface]",
|
|
]
|
|
kaggle-new = [
|
|
"unsloth[huggingface]",
|
|
"bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
|
|
]
|
|
conda = [
|
|
"unsloth[huggingface]",
|
|
]
|
|
colab-torch211 = [
|
|
"unsloth[huggingface]",
|
|
"bitsandbytes==0.45.5",
|
|
"unsloth[cu121onlytorch211]",
|
|
]
|
|
colab-ampere-torch211 = [
|
|
"unsloth[huggingface]",
|
|
"bitsandbytes==0.45.5",
|
|
"unsloth[cu121onlytorch211]",
|
|
"packaging",
|
|
"ninja",
|
|
"flash-attn>=2.6.3 ; ('linux' in sys_platform)",
|
|
]
|
|
colab-torch220 = [
|
|
"unsloth[huggingface]",
|
|
"bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
|
|
"unsloth[cu121onlytorch220]",
|
|
]
|
|
colab-ampere-torch220 = [
|
|
"unsloth[huggingface]",
|
|
"bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
|
|
"unsloth[cu121onlytorch220]",
|
|
"packaging",
|
|
"ninja",
|
|
"flash-attn>=2.6.3 ; ('linux' in sys_platform)",
|
|
]
|
|
colab-new = [
|
|
"unsloth_zoo>=2026.4.8",
|
|
"packaging",
|
|
"tyro",
|
|
"transformers>=4.51.3,!=4.52.0,!=4.52.1,!=4.52.2,!=4.52.3,!=4.53.0,!=4.54.0,!=4.55.0,!=4.55.1,!=4.57.0,!=4.57.4,!=4.57.5,!=5.0.0,!=5.1.0,<=5.5.0",
|
|
"datasets>=3.4.1,!=4.0.*,!=4.1.0,<4.4.0",
|
|
"sentencepiece>=0.2.0",
|
|
"tqdm",
|
|
"psutil",
|
|
"wheel>=0.42.0",
|
|
"numpy",
|
|
"protobuf",
|
|
"huggingface_hub>=0.34.0",
|
|
"hf_transfer",
|
|
"bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
|
|
"unsloth[triton]",
|
|
"sentence-transformers",
|
|
]
|
|
colab-no-deps = [
|
|
"accelerate>=0.34.1",
|
|
"trl>=0.18.2,!=0.19.0,<=0.24.0",
|
|
"peft>=0.18.0",
|
|
"xformers ; ('linux' in sys_platform or sys_platform == 'win32') and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
|
|
"bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
|
|
"protobuf",
|
|
]
|
|
colab = [
|
|
"unsloth[cu121]",
|
|
]
|
|
flashattention = [
|
|
"packaging ; ('linux' in sys_platform)",
|
|
"ninja ; ('linux' in sys_platform)",
|
|
"flash-attn>=2.6.3 ; ('linux' in sys_platform)",
|
|
]
|
|
colab-ampere = [
|
|
"unsloth[colab-ampere-torch220]",
|
|
"unsloth[flashattention]",
|
|
]
|
|
cu118-ampere = [
|
|
"unsloth[huggingface]",
|
|
"bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
|
|
"unsloth[cu118only]",
|
|
"unsloth[flashattention]",
|
|
]
|
|
cu121-ampere = [
|
|
"unsloth[huggingface]",
|
|
"bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
|
|
"unsloth[cu121only]",
|
|
"unsloth[flashattention]",
|
|
]
|
|
cu118-ampere-torch211 = [
|
|
"unsloth[huggingface]",
|
|
"bitsandbytes==0.45.5",
|
|
"unsloth[cu118onlytorch211]",
|
|
"unsloth[flashattention]",
|
|
]
|
|
cu121-ampere-torch211 = [
|
|
"unsloth[huggingface]",
|
|
"bitsandbytes==0.45.5",
|
|
"unsloth[cu121onlytorch211]",
|
|
"unsloth[flashattention]",
|
|
]
|
|
cu118-ampere-torch220 = [
|
|
"unsloth[huggingface]",
|
|
"bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
|
|
"unsloth[cu118onlytorch220]",
|
|
"unsloth[flashattention]",
|
|
]
|
|
cu121-ampere-torch220 = [
|
|
"unsloth[huggingface]",
|
|
"bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
|
|
"unsloth[cu121onlytorch220]",
|
|
"unsloth[flashattention]",
|
|
]
|
|
cu118-ampere-torch230 = [
|
|
"unsloth[huggingface]",
|
|
"bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
|
|
"unsloth[cu118onlytorch230]",
|
|
"unsloth[flashattention]",
|
|
]
|
|
cu121-ampere-torch230 = [
|
|
"unsloth[huggingface]",
|
|
"bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
|
|
"unsloth[cu121onlytorch230]",
|
|
"unsloth[flashattention]",
|
|
]
|
|
cu118-ampere-torch240 = [
|
|
"unsloth[huggingface]",
|
|
"bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
|
|
"unsloth[cu118onlytorch240]",
|
|
"unsloth[flashattention]",
|
|
]
|
|
cu121-ampere-torch240 = [
|
|
"unsloth[huggingface]",
|
|
"bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
|
|
"unsloth[cu121onlytorch240]",
|
|
"unsloth[flashattention]",
|
|
]
|
|
cu124-ampere-torch240 = [
|
|
"unsloth[huggingface]",
|
|
"bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
|
|
"unsloth[cu124onlytorch240]",
|
|
"unsloth[flashattention]",
|
|
]
|
|
cu118-ampere-torch250 = [
|
|
"unsloth[huggingface]",
|
|
"bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
|
|
"unsloth[cu118onlytorch250]",
|
|
"unsloth[flashattention]",
|
|
]
|
|
cu121-ampere-torch250 = [
|
|
"unsloth[huggingface]",
|
|
"bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
|
|
"unsloth[cu121onlytorch250]",
|
|
"unsloth[flashattention]",
|
|
]
|
|
cu124-ampere-torch250 = [
|
|
"unsloth[huggingface]",
|
|
"bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
|
|
"unsloth[cu124onlytorch250]",
|
|
"unsloth[flashattention]",
|
|
]
|
|
cu118-ampere-torch251 = [
|
|
"unsloth[huggingface]",
|
|
"bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
|
|
"unsloth[cu118onlytorch251]",
|
|
"unsloth[flashattention]",
|
|
]
|
|
cu121-ampere-torch251 = [
|
|
"unsloth[huggingface]",
|
|
"bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
|
|
"unsloth[cu121onlytorch251]",
|
|
"unsloth[flashattention]",
|
|
]
|
|
cu124-ampere-torch251 = [
|
|
"unsloth[huggingface]",
|
|
"bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
|
|
"unsloth[cu124onlytorch251]",
|
|
"unsloth[flashattention]",
|
|
]
|
|
cu118-ampere-torch260 = [
|
|
"unsloth[huggingface]",
|
|
"bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
|
|
"unsloth[cu118onlytorch260]",
|
|
"unsloth[flashattention]",
|
|
]
|
|
cu124-ampere-torch260 = [
|
|
"unsloth[huggingface]",
|
|
"bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
|
|
"unsloth[cu124onlytorch260]",
|
|
"unsloth[flashattention]",
|
|
]
|
|
cu126-ampere-torch260 = [
|
|
"unsloth[huggingface]",
|
|
"bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
|
|
"unsloth[cu126onlytorch260]",
|
|
"unsloth[flashattention]",
|
|
]
|
|
cu118-ampere-torch270 = [
|
|
"unsloth[huggingface]",
|
|
"bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
|
|
"unsloth[cu118onlytorch270]",
|
|
"unsloth[flashattention]",
|
|
]
|
|
cu126-ampere-torch270 = [
|
|
"unsloth[huggingface]",
|
|
"bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
|
|
"unsloth[cu126onlytorch270]",
|
|
"unsloth[flashattention]",
|
|
]
|
|
cu128-ampere-torch270 = [
|
|
"unsloth[huggingface]",
|
|
"bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
|
|
"unsloth[cu128onlytorch270]",
|
|
"unsloth[flashattention]",
|
|
]
|
|
cu118-ampere-torch271 = [
    "unsloth[huggingface]",
    "bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
    "unsloth[cu118onlytorch271]",
    "unsloth[flashattention]",
]

cu126-ampere-torch271 = [
    "unsloth[huggingface]",
    "bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
    "unsloth[cu126onlytorch271]",
    "unsloth[flashattention]",
]

cu128-ampere-torch271 = [
    "unsloth[huggingface]",
    "bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
    "unsloth[cu128onlytorch271]",
    "unsloth[flashattention]",
]

cu118-ampere-torch280 = [
    "unsloth[huggingface]",
    "bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
    "unsloth[cu118onlytorch280]",
    "unsloth[flashattention]",
]

cu126-ampere-torch280 = [
    "unsloth[huggingface]",
    "bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
    "unsloth[cu126onlytorch280]",
    "unsloth[flashattention]",
]

cu128-ampere-torch280 = [
    "unsloth[huggingface]",
    "bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
    "unsloth[cu128onlytorch280]",
    "unsloth[flashattention]",
]

cu130-ampere-torch280 = [
    "unsloth[huggingface]",
    "bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
    "unsloth[cu130onlytorch280]",
    "unsloth[flashattention]",
]

cu126-ampere-torch290 = [
    "unsloth[huggingface]",
    "bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
    "unsloth[cu126onlytorch290]",
]

cu128-ampere-torch290 = [
    "unsloth[huggingface]",
    "bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
    "unsloth[cu128onlytorch290]",
]

cu130-ampere-torch290 = [
    "unsloth[huggingface]",
    "bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
    "unsloth[cu130onlytorch290]",
]

cu126-ampere-torch291 = [
    "unsloth[huggingface]",
    "bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
    "unsloth[cu126onlytorch291]",
]

cu128-ampere-torch291 = [
    "unsloth[huggingface]",
    "bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
    "unsloth[cu128onlytorch291]",
]

cu130-ampere-torch291 = [
    "unsloth[huggingface]",
    "bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
    "unsloth[cu130onlytorch291]",
]

cu126-ampere-torch2100 = [
    "unsloth[huggingface]",
    "bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
    "unsloth[cu126onlytorch2100]",
]

cu128-ampere-torch2100 = [
    "unsloth[huggingface]",
    "bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
    "unsloth[cu128onlytorch2100]",
]

cu130-ampere-torch2100 = [
    "unsloth[huggingface]",
    "bitsandbytes>=0.45.5,!=0.46.0,!=0.48.0",
    "unsloth[cu130onlytorch2100]",
]

flashattentiontorch260abiFALSEcu12x = [
    "flash-attn @ https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.4.post1/flash_attn-2.7.4.post1+cu12torch2.6cxx11abiFALSE-cp39-cp39-linux_x86_64.whl ; ('linux' in sys_platform) and python_version == '3.9'",
    "flash-attn @ https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.4.post1/flash_attn-2.7.4.post1+cu12torch2.6cxx11abiFALSE-cp310-cp310-linux_x86_64.whl ; ('linux' in sys_platform) and python_version == '3.10'",
    "flash-attn @ https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.4.post1/flash_attn-2.7.4.post1+cu12torch2.6cxx11abiFALSE-cp311-cp311-linux_x86_64.whl ; ('linux' in sys_platform) and python_version == '3.11'",
    "flash-attn @ https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.4.post1/flash_attn-2.7.4.post1+cu12torch2.6cxx11abiFALSE-cp312-cp312-linux_x86_64.whl ; ('linux' in sys_platform) and python_version == '3.12'",
    "flash-attn @ https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.4.post1/flash_attn-2.7.4.post1+cu12torch2.6cxx11abiFALSE-cp313-cp313-linux_x86_64.whl ; ('linux' in sys_platform) and python_version == '3.13'",
]

flashattentiontorch260abiTRUEcu12x = [
    "flash-attn @ https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.4.post1/flash_attn-2.7.4.post1+cu12torch2.6cxx11abiTRUE-cp39-cp39-linux_x86_64.whl ; ('linux' in sys_platform) and python_version == '3.9'",
    "flash-attn @ https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.4.post1/flash_attn-2.7.4.post1+cu12torch2.6cxx11abiTRUE-cp310-cp310-linux_x86_64.whl ; ('linux' in sys_platform) and python_version == '3.10'",
    "flash-attn @ https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.4.post1/flash_attn-2.7.4.post1+cu12torch2.6cxx11abiTRUE-cp311-cp311-linux_x86_64.whl ; ('linux' in sys_platform) and python_version == '3.11'",
    "flash-attn @ https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.4.post1/flash_attn-2.7.4.post1+cu12torch2.6cxx11abiTRUE-cp312-cp312-linux_x86_64.whl ; ('linux' in sys_platform) and python_version == '3.12'",
    "flash-attn @ https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.4.post1/flash_attn-2.7.4.post1+cu12torch2.6cxx11abiTRUE-cp313-cp313-linux_x86_64.whl ; ('linux' in sys_platform) and python_version == '3.13'",
]

flashattentiontorch250abiFALSEcu12x = [
    "flash-attn @ https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.4.post1/flash_attn-2.7.4.post1+cu12torch2.5cxx11abiFALSE-cp39-cp39-linux_x86_64.whl ; ('linux' in sys_platform) and python_version == '3.9'",
    "flash-attn @ https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.4.post1/flash_attn-2.7.4.post1+cu12torch2.5cxx11abiFALSE-cp310-cp310-linux_x86_64.whl ; ('linux' in sys_platform) and python_version == '3.10'",
    "flash-attn @ https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.4.post1/flash_attn-2.7.4.post1+cu12torch2.5cxx11abiFALSE-cp311-cp311-linux_x86_64.whl ; ('linux' in sys_platform) and python_version == '3.11'",
    "flash-attn @ https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.4.post1/flash_attn-2.7.4.post1+cu12torch2.5cxx11abiFALSE-cp312-cp312-linux_x86_64.whl ; ('linux' in sys_platform) and python_version == '3.12'",
    "flash-attn @ https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.4.post1/flash_attn-2.7.4.post1+cu12torch2.5cxx11abiFALSE-cp313-cp313-linux_x86_64.whl ; ('linux' in sys_platform) and python_version == '3.13'",
]

flashattentiontorch250abiTRUEcu12x = [
    "flash-attn @ https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.4.post1/flash_attn-2.7.4.post1+cu12torch2.5cxx11abiTRUE-cp39-cp39-linux_x86_64.whl ; ('linux' in sys_platform) and python_version == '3.9'",
    "flash-attn @ https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.4.post1/flash_attn-2.7.4.post1+cu12torch2.5cxx11abiTRUE-cp310-cp310-linux_x86_64.whl ; ('linux' in sys_platform) and python_version == '3.10'",
    "flash-attn @ https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.4.post1/flash_attn-2.7.4.post1+cu12torch2.5cxx11abiTRUE-cp311-cp311-linux_x86_64.whl ; ('linux' in sys_platform) and python_version == '3.11'",
    "flash-attn @ https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.4.post1/flash_attn-2.7.4.post1+cu12torch2.5cxx11abiTRUE-cp312-cp312-linux_x86_64.whl ; ('linux' in sys_platform) and python_version == '3.12'",
    "flash-attn @ https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.4.post1/flash_attn-2.7.4.post1+cu12torch2.5cxx11abiTRUE-cp313-cp313-linux_x86_64.whl ; ('linux' in sys_platform) and python_version == '3.13'",
]

flashattentiontorch240abiFALSEcu12x = [
    "flash-attn @ https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.4.post1/flash_attn-2.7.4.post1+cu12torch2.4cxx11abiFALSE-cp39-cp39-linux_x86_64.whl ; ('linux' in sys_platform) and python_version == '3.9'",
    "flash-attn @ https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.4.post1/flash_attn-2.7.4.post1+cu12torch2.4cxx11abiFALSE-cp310-cp310-linux_x86_64.whl ; ('linux' in sys_platform) and python_version == '3.10'",
    "flash-attn @ https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.4.post1/flash_attn-2.7.4.post1+cu12torch2.4cxx11abiFALSE-cp311-cp311-linux_x86_64.whl ; ('linux' in sys_platform) and python_version == '3.11'",
    "flash-attn @ https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.4.post1/flash_attn-2.7.4.post1+cu12torch2.4cxx11abiFALSE-cp312-cp312-linux_x86_64.whl ; ('linux' in sys_platform) and python_version == '3.12'",
    "flash-attn @ https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.4.post1/flash_attn-2.7.4.post1+cu12torch2.4cxx11abiFALSE-cp313-cp313-linux_x86_64.whl ; ('linux' in sys_platform) and python_version == '3.13'",
]

flashattentiontorch240abiTRUEcu12x = [
    "flash-attn @ https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.4.post1/flash_attn-2.7.4.post1+cu12torch2.4cxx11abiTRUE-cp39-cp39-linux_x86_64.whl ; ('linux' in sys_platform) and python_version == '3.9'",
    "flash-attn @ https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.4.post1/flash_attn-2.7.4.post1+cu12torch2.4cxx11abiTRUE-cp310-cp310-linux_x86_64.whl ; ('linux' in sys_platform) and python_version == '3.10'",
    "flash-attn @ https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.4.post1/flash_attn-2.7.4.post1+cu12torch2.4cxx11abiTRUE-cp311-cp311-linux_x86_64.whl ; ('linux' in sys_platform) and python_version == '3.11'",
    "flash-attn @ https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.4.post1/flash_attn-2.7.4.post1+cu12torch2.4cxx11abiTRUE-cp312-cp312-linux_x86_64.whl ; ('linux' in sys_platform) and python_version == '3.12'",
    "flash-attn @ https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.4.post1/flash_attn-2.7.4.post1+cu12torch2.4cxx11abiTRUE-cp313-cp313-linux_x86_64.whl ; ('linux' in sys_platform) and python_version == '3.13'",
]

intelgputorch260 = [
    "unsloth_zoo[intelgpu]",
    "unsloth[huggingfacenotorch]",

    "pytorch_triton_xpu @ https://download.pytorch.org/whl/pytorch_triton_xpu-3.2.0-cp39-cp39-linux_x86_64.whl#sha256=147607f190a7d7aa24ba454def5977fbbfec792fdae18e4ed278cfec29b69271 ; ('linux' in sys_platform) and python_version == '3.9' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
    "pytorch_triton_xpu @ https://download.pytorch.org/whl/pytorch_triton_xpu-3.2.0-cp310-cp310-linux_x86_64.whl#sha256=23aa423fa1542afc34f67eb3ba8ef20060f6d1b3a4697eaeab22b11c92b30f2b ; ('linux' in sys_platform) and python_version == '3.10' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
    "pytorch_triton_xpu @ https://download.pytorch.org/whl/pytorch_triton_xpu-3.2.0-cp311-cp311-linux_x86_64.whl#sha256=bcfa995229bbfd9ffd8d6c8d9f6428d393e876fa6e23ee3c20e3c0d73ca75ca5 ; ('linux' in sys_platform) and python_version == '3.11' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
    "pytorch_triton_xpu @ https://download.pytorch.org/whl/pytorch_triton_xpu-3.2.0-cp312-cp312-linux_x86_64.whl#sha256=bd340903d03470708df3442438acb8b7e08087ab9e61fbe349b2872bf9257ab0 ; ('linux' in sys_platform) and python_version == '3.12' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
    "pytorch_triton_xpu @ https://download.pytorch.org/whl/pytorch_triton_xpu-3.2.0-cp313-cp313-linux_x86_64.whl#sha256=814dccc8a07159e6eca74bed70091bc8fea2d9dd87b0d91845f9f38cde62f01c ; ('linux' in sys_platform) and python_version == '3.13' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",

    "bitsandbytes @ https://github.com/bitsandbytes-foundation/bitsandbytes/releases/download/continuous-release_main/bitsandbytes-1.33.7.preview-py3-none-manylinux_2_24_x86_64.whl ; ('linux' in sys_platform) and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
    "bitsandbytes @ https://github.com/bitsandbytes-foundation/bitsandbytes/releases/download/continuous-release_main/bitsandbytes-1.33.7.preview-py3-none-win_amd64.whl ; (sys_platform == 'win32') and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",

    "torch @ https://download.pytorch.org/whl/xpu/torch-2.6.0%2Bxpu-cp39-cp39-linux_x86_64.whl#sha256=6a8adf6dc4c089406e8b3a7e58ab57a463bddf9b07130d2576e76eced43e92af ; ('linux' in sys_platform) and python_version == '3.9' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
    "torch @ https://download.pytorch.org/whl/xpu/torch-2.6.0%2Bxpu-cp310-cp310-linux_x86_64.whl#sha256=ff4561cbf07c83bbccaa0f6e9bb0e6dcf721bacd53c9c43c4eb0e7331b4792f9 ; ('linux' in sys_platform) and python_version == '3.10' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
    "torch @ https://download.pytorch.org/whl/xpu/torch-2.6.0%2Bxpu-cp311-cp311-linux_x86_64.whl#sha256=12005f66b810ddd3ab93f86c4522bcfdd412cbd27fc9d189b661ff7509bc5e8a ; ('linux' in sys_platform) and python_version == '3.11' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
    "torch @ https://download.pytorch.org/whl/xpu/torch-2.6.0%2Bxpu-cp312-cp312-linux_x86_64.whl#sha256=c4c5c67625cdacf35765c2b94e61fe166e3c3f4a14521b1212a59ad1b3eb0f2e ; ('linux' in sys_platform) and python_version == '3.12' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
    "torch @ https://download.pytorch.org/whl/xpu/torch-2.6.0%2Bxpu-cp313-cp313-linux_x86_64.whl#sha256=e6864f7a60a5ecc43d5d38f59a16e5dd132384f73dfd3a697f74944026038f7b ; ('linux' in sys_platform) and python_version == '3.13' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
]

intel-gpu-torch260 = [
    "unsloth[intelgputorch260]"
]

intelgputorch270 = [
    "unsloth_zoo[intelgpu]",
    "unsloth[huggingfacenotorch]",

    "pytorch_triton_xpu @ https://download.pytorch.org/whl/pytorch_triton_xpu-3.3.0-cp39-cp39-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl#sha256=749a7098492c6a27b356c97149a4a62973b953eae60bc1b6259260974f344913 ; ('linux' in sys_platform) and python_version == '3.9' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
    "pytorch_triton_xpu @ https://download.pytorch.org/whl/pytorch_triton_xpu-3.3.0-cp310-cp310-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl#sha256=44362e80abd752471a08341093321955b066daa2cfb4810e73b8e3b240850f93 ; ('linux' in sys_platform) and python_version == '3.10' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
    "pytorch_triton_xpu @ https://download.pytorch.org/whl/pytorch_triton_xpu-3.3.0-cp311-cp311-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl#sha256=faa6b8c945a837a080f641bc8ccc77a98fa66980dcd7e62e715fd853737343fd ; ('linux' in sys_platform) and python_version == '3.11' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
    "pytorch_triton_xpu @ https://download.pytorch.org/whl/pytorch_triton_xpu-3.3.0-cp312-cp312-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl#sha256=40f6fb65b345dc9a61813abe7ac9a585f2c9808f414d140cc2a5f11f53ee063c ; ('linux' in sys_platform) and python_version == '3.12' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
    "pytorch_triton_xpu @ https://download.pytorch.org/whl/pytorch_triton_xpu-3.3.0-cp313-cp313-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl#sha256=b22b4c02ec71b4bfc862ae3cdfd2871dc0b05d2b1802f5db2196e0f897d581e9 ; ('linux' in sys_platform) and python_version == '3.13' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
    "pytorch_triton_xpu @ https://download.pytorch.org/whl/pytorch_triton_xpu-3.3.0-cp39-cp39-win_amd64.whl#sha256=d4b738d7fa5100c1bd766f91614962828a4810eb57b4df92cd5214a83505a752 ; sys_platform == 'win32' and python_version == '3.9' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
    "pytorch_triton_xpu @ https://download.pytorch.org/whl/pytorch_triton_xpu-3.3.0-cp310-cp310-win_amd64.whl#sha256=143fe8a64d807bcdb7d81bbc062816add325570aa160448454ab6ded4a0a17a1 ; sys_platform == 'win32' and python_version == '3.10' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
    "pytorch_triton_xpu @ https://download.pytorch.org/whl/pytorch_triton_xpu-3.3.0-cp311-cp311-win_amd64.whl#sha256=a8025459ff325d6e3532eb5cf72519db1b178155e7d60aff6c56beb5968fc758 ; sys_platform == 'win32' and python_version == '3.11' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
    "pytorch_triton_xpu @ https://download.pytorch.org/whl/pytorch_triton_xpu-3.3.0-cp312-cp312-win_amd64.whl#sha256=0dd07e6d5b872e42e48f5ee140e609d4554ca3cc509d5bf509ac232267cf358e ; sys_platform == 'win32' and python_version == '3.12' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
    "pytorch_triton_xpu @ https://download.pytorch.org/whl/pytorch_triton_xpu-3.3.0-cp313-cp313-win_amd64.whl#sha256=a936a18182d8e065a9933afc9a3ebbffadd38604969f87c493831214539fc027 ; sys_platform == 'win32' and python_version == '3.13' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",

    "bitsandbytes @ https://github.com/bitsandbytes-foundation/bitsandbytes/releases/download/continuous-release_main/bitsandbytes-1.33.7.preview-py3-none-manylinux_2_24_x86_64.whl ; ('linux' in sys_platform) and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
    "bitsandbytes @ https://github.com/bitsandbytes-foundation/bitsandbytes/releases/download/continuous-release_main/bitsandbytes-1.33.7.preview-py3-none-win_amd64.whl ; (sys_platform == 'win32') and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",

    "torch @ https://download.pytorch.org/whl/xpu/torch-2.7.0%2Bxpu-cp39-cp39-linux_x86_64.whl#sha256=f8ee75e50fcbb37ed5b498299ca2264da99ab278a93fae2358e921e4a6e28273 ; ('linux' in sys_platform) and python_version == '3.9' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
    "torch @ https://download.pytorch.org/whl/xpu/torch-2.7.0%2Bxpu-cp310-cp310-linux_x86_64.whl#sha256=d6fdc342961d98fdcd9d03dfd491a3208bb5f7fbb435841f8f72ce9fdcd2d026 ; ('linux' in sys_platform) and python_version == '3.10' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
    "torch @ https://download.pytorch.org/whl/xpu/torch-2.7.0%2Bxpu-cp311-cp311-linux_x86_64.whl#sha256=74d07f9357df5cf2bf223ad3c84de16346bfaa0504f988fdd5590d3e177e5e86 ; ('linux' in sys_platform) and python_version == '3.11' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
    "torch @ https://download.pytorch.org/whl/xpu/torch-2.7.0%2Bxpu-cp312-cp312-linux_x86_64.whl#sha256=c806d44aa2ca5d225629f6fbc6c994d5deaac2d2cde449195bc8e3522ddd219a ; ('linux' in sys_platform) and python_version == '3.12' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
    "torch @ https://download.pytorch.org/whl/xpu/torch-2.7.0%2Bxpu-cp313-cp313-linux_x86_64.whl#sha256=25d8277b7f01d42e2e014ccbab57a2692b6ec4eff8dcf894eda1b297407cf97a ; ('linux' in sys_platform) and python_version == '3.13' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
    "torch @ https://download.pytorch.org/whl/xpu/torch-2.7.0%2Bxpu-cp39-cp39-win_amd64.whl#sha256=046e85125266ae69c1a0d083e6c092f947ab4b6b41532c16bafe40dbced845df ; sys_platform == 'win32' and python_version == '3.9' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
    "torch @ https://download.pytorch.org/whl/xpu/torch-2.7.0%2Bxpu-cp310-cp310-win_amd64.whl#sha256=9ebaeffb82b0b3e39b6030927d3ebe0eb62a0e9045a3b2d7b0a9e7b15222c0db ; sys_platform == 'win32' and python_version == '3.10' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
    "torch @ https://download.pytorch.org/whl/xpu/torch-2.7.0%2Bxpu-cp311-cp311-win_amd64.whl#sha256=356ba66cee127e7e2c942880bd50e03768306a4ea08d358a0f29c6eebfc4bc81 ; sys_platform == 'win32' and python_version == '3.11' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
    "torch @ https://download.pytorch.org/whl/xpu/torch-2.7.0%2Bxpu-cp312-cp312-win_amd64.whl#sha256=94739e665d9b4d5cd7af5f517cb6103f6f9fb421c095184609653a24524040f5 ; sys_platform == 'win32' and python_version == '3.12' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
    "torch @ https://download.pytorch.org/whl/xpu/torch-2.7.0%2Bxpu-cp313-cp313-win_amd64.whl#sha256=31df3cb674918e89bc8c532baa331dc84f4430e1f9c0ec379232db44cba78355 ; sys_platform == 'win32' and python_version == '3.13' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
]

intel-gpu-torch270 = [
    "unsloth[intelgputorch270]"
]

intelgputorch280 = [
    "unsloth_zoo[intelgpu]",
    "unsloth[huggingfacenotorch]",

    "pytorch_triton_xpu @ https://download.pytorch.org/whl/pytorch_triton_xpu-3.4.0-cp39-cp39-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl#sha256=ac4d8e33986b1c3c5e48151640539272b2187e83016985853111b46fb82c3c94 ; 'linux' in sys_platform and python_version == '3.9' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
    "pytorch_triton_xpu @ https://download.pytorch.org/whl/pytorch_triton_xpu-3.4.0-cp310-cp310-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl#sha256=999fef4c1f711092b9d3086525920545df490de476ecebe899ffc777019ae17f ; 'linux' in sys_platform and python_version == '3.10' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
    "pytorch_triton_xpu @ https://download.pytorch.org/whl/pytorch_triton_xpu-3.4.0-cp311-cp311-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl#sha256=57b09c8c492985ff6a27cd3a22b08e8f7b96b407bd8030967b6efbb9f63b80cf ; 'linux' in sys_platform and python_version == '3.11' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
    "pytorch_triton_xpu @ https://download.pytorch.org/whl/pytorch_triton_xpu-3.4.0-cp312-cp312-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl#sha256=df4bb3282bac9a3b90231700077110d8680b338416de03c2b7c6133c9b602649 ; 'linux' in sys_platform and python_version == '3.12' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
    "pytorch_triton_xpu @ https://download.pytorch.org/whl/pytorch_triton_xpu-3.4.0-cp313-cp313-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl#sha256=60da63c99ca827bdcb0df28e0298bf7d066dc607454c6d6176783cb4e79d838b ; 'linux' in sys_platform and python_version == '3.13' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
    "pytorch_triton_xpu @ https://download.pytorch.org/whl/pytorch_triton_xpu-3.4.0-cp39-cp39-win_amd64.whl#sha256=64aea8de349f3e2e0ebf4c24b011a8122531fdffda5776edaef45829cc241cf8 ; sys_platform == 'win32' and python_version == '3.9' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
    "pytorch_triton_xpu @ https://download.pytorch.org/whl/pytorch_triton_xpu-3.4.0-cp310-cp310-win_amd64.whl#sha256=ae573d255b257fdbed319a3440dc9d0a721e31160ab7f6eba1b2226e6a409a1d ; sys_platform == 'win32' and python_version == '3.10' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
    "pytorch_triton_xpu @ https://download.pytorch.org/whl/pytorch_triton_xpu-3.4.0-cp311-cp311-win_amd64.whl#sha256=8e0ea4558e5776d8ddab0264310be9b26aee5641bcac0da023537556d4317b86 ; sys_platform == 'win32' and python_version == '3.11' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
    "pytorch_triton_xpu @ https://download.pytorch.org/whl/pytorch_triton_xpu-3.4.0-cp312-cp312-win_amd64.whl#sha256=4090dde07a4fffc34aaf855701a9db28e9fccb57b368ade520f1a0f8e811c878 ; sys_platform == 'win32' and python_version == '3.12' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
    "pytorch_triton_xpu @ https://download.pytorch.org/whl/pytorch_triton_xpu-3.4.0-cp313-cp313-win_amd64.whl#sha256=a33d0888f3c8df028a2d028842715837d0049524d6c06b9bb11869890a13601a ; sys_platform == 'win32' and python_version == '3.13' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",

    "torch @ https://download.pytorch.org/whl/xpu/torch-2.8.0%2Bxpu-cp39-cp39-linux_x86_64.whl ; 'linux' in sys_platform and python_version == '3.9' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
    "torch @ https://download.pytorch.org/whl/xpu/torch-2.8.0%2Bxpu-cp310-cp310-linux_x86_64.whl ; 'linux' in sys_platform and python_version == '3.10' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
    "torch @ https://download.pytorch.org/whl/xpu/torch-2.8.0%2Bxpu-cp311-cp311-linux_x86_64.whl ; 'linux' in sys_platform and python_version == '3.11' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
    "torch @ https://download.pytorch.org/whl/xpu/torch-2.8.0%2Bxpu-cp312-cp312-linux_x86_64.whl ; 'linux' in sys_platform and python_version == '3.12' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
    "torch @ https://download.pytorch.org/whl/xpu/torch-2.8.0%2Bxpu-cp313-cp313-linux_x86_64.whl ; 'linux' in sys_platform and python_version == '3.13' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
    "torch @ https://download.pytorch.org/whl/xpu/torch-2.8.0%2Bxpu-cp39-cp39-win_amd64.whl#sha256=f2f401276892428e4875cf1d8717c5cbab704b16fc594ccf23795e7b16549a99 ; sys_platform == 'win32' and python_version == '3.9' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
    "torch @ https://download.pytorch.org/whl/xpu/torch-2.8.0%2Bxpu-cp310-cp310-win_amd64.whl#sha256=125c60cd59d51b39581a7e9afcd4679bc3a6b8c1f9440b1bb502a23fdd60571e ; sys_platform == 'win32' and python_version == '3.10' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
    "torch @ https://download.pytorch.org/whl/xpu/torch-2.8.0%2Bxpu-cp311-cp311-win_amd64.whl#sha256=47f1a57258cd460e80b38b2ed6744e31587ab77a96b4215bf59546cb4bab5cc0 ; sys_platform == 'win32' and python_version == '3.11' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
    "torch @ https://download.pytorch.org/whl/xpu/torch-2.8.0%2Bxpu-cp312-cp312-win_amd64.whl#sha256=0937d8943c145a83d9bafc6f80ef28971167817f9eda26066d33f72caf8a6646 ; sys_platform == 'win32' and python_version == '3.12' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
    "torch @ https://download.pytorch.org/whl/xpu/torch-2.8.0%2Bxpu-cp313-cp313-win_amd64.whl#sha256=e034aab1d71760dc80a731531be43673ffe15e99033b82d24e40d2e6d41bd8bf ; sys_platform == 'win32' and python_version == '3.13' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",

    "bitsandbytes @ https://github.com/bitsandbytes-foundation/bitsandbytes/releases/download/continuous-release_main/bitsandbytes-1.33.7.preview-py3-none-manylinux_2_24_x86_64.whl ; ('linux' in sys_platform) and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
    "bitsandbytes @ https://github.com/bitsandbytes-foundation/bitsandbytes/releases/download/continuous-release_main/bitsandbytes-1.33.7.preview-py3-none-win_amd64.whl ; (sys_platform == 'win32') and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",

    "torchvision @ https://download.pytorch.org/whl/xpu/torchvision-0.23.0%2Bxpu-cp39-cp39-manylinux_2_28_x86_64.whl#sha256=6e981c192045fc249c008441179ff237bb00174d818b875b0475730b63f0eaca ; 'linux' in sys_platform and python_version == '3.9' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
    "torchvision @ https://download.pytorch.org/whl/xpu/torchvision-0.23.0%2Bxpu-cp310-cp310-manylinux_2_28_x86_64.whl#sha256=e5ba4805969277175ebfd59cc717093528cc6e3ada89ac2725fc7a3c1fee6169 ; 'linux' in sys_platform and python_version == '3.10' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
    "torchvision @ https://download.pytorch.org/whl/xpu/torchvision-0.23.0%2Bxpu-cp311-cp311-manylinux_2_28_x86_64.whl#sha256=74c39c144104416bc4c5ad8c26ab0c169dc5cc6be58059e01bc3665dd0ef676f ; 'linux' in sys_platform and python_version == '3.11' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
    "torchvision @ https://download.pytorch.org/whl/xpu/torchvision-0.23.0%2Bxpu-cp312-cp312-manylinux_2_28_x86_64.whl#sha256=0acec355b80c3899841184084f365df336c508602812e34a44007b8b60d53af4 ; 'linux' in sys_platform and python_version == '3.12' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
    "torchvision @ https://download.pytorch.org/whl/xpu/torchvision-0.23.0%2Bxpu-cp313-cp313-manylinux_2_28_x86_64.whl#sha256=e2109ae773dad27b98ca17681044b4f876563c37f2382b75de3a371399edcff8 ; 'linux' in sys_platform and python_version == '3.13' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
    "torchvision @ https://download.pytorch.org/whl/xpu/torchvision-0.23.0%2Bxpu-cp39-cp39-win_amd64.whl#sha256=5f7904e7048d414379bc8c1167260f1e84204f105db2d0a2f9c89e87ce1cf205 ; sys_platform == 'win32' and python_version == '3.9' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
    "torchvision @ https://download.pytorch.org/whl/xpu/torchvision-0.23.0%2Bxpu-cp310-cp310-win_amd64.whl#sha256=005fca5e658ca8e37adb63c1a021c84f5e56dfa6cf0d601d89cfe40b9473f79f ; sys_platform == 'win32' and python_version == '3.10' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
    "torchvision @ https://download.pytorch.org/whl/xpu/torchvision-0.23.0%2Bxpu-cp311-cp311-win_amd64.whl#sha256=c6d030f5361461550c0ff1339b5bca8585fc1e84fda2e64b6184e65a581e4f98 ; sys_platform == 'win32' and python_version == '3.11' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
    "torchvision @ https://download.pytorch.org/whl/xpu/torchvision-0.23.0%2Bxpu-cp312-cp312-win_amd64.whl#sha256=91aafd61864cdce27461cbec13ddbf28c1bc6494265a1e4b80131c64a3b7d18f ; sys_platform == 'win32' and python_version == '3.12' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
    "torchvision @ https://download.pytorch.org/whl/xpu/torchvision-0.23.0%2Bxpu-cp313-cp313-win_amd64.whl#sha256=71dc4a6421742ed1e7f585b04a100ad53615c341fbccfbc255aefb38ea9091da ; sys_platform == 'win32' and python_version == '3.13' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
]

intel-gpu-torch280 = [
    "unsloth[intelgputorch280]"
]

intelgputorch290 = [
|
|
"unsloth_zoo[intelgpu]",
|
|
"unsloth[huggingfacenotorch]",
|
|
|
|
"pytorch_triton_xpu @ https://download.pytorch.org/whl/pytorch_triton_xpu-3.5.0-cp310-cp310-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl#sha256=c169a1de14c19673b17c751290d467fa282fc90fa5da4314b2e5cdab1f553146 ; platform_system == 'Linux' and python_version == '3.10' and platform_machine == 'x86_64'",
|
|
"pytorch_triton_xpu @ https://download.pytorch.org/whl/pytorch_triton_xpu-3.5.0-cp311-cp311-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl#sha256=013d9dd5d6479bd22983161f462e61c8dbe1d82e6730624a7a8d5945507eaa61 ; platform_system == 'Linux' and python_version == '3.11' and platform_machine == 'x86_64'",
|
|
"pytorch_triton_xpu @ https://download.pytorch.org/whl/pytorch_triton_xpu-3.5.0-cp312-cp312-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl#sha256=afc8cabfbf7ed51fd278d1e0f88d6afc157b0201bad4b99d681e4d542f9e66d4 ; platform_system == 'Linux' and python_version == '3.12' and platform_machine == 'x86_64'",
|
|
"pytorch_triton_xpu @ https://download.pytorch.org/whl/pytorch_triton_xpu-3.5.0-cp313-cp313-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl#sha256=0d24c1716088f2764d0d24c64227732195b6a42706c3c5fc89eeb4904bfa0818 ; platform_system == 'Linux' and python_version == '3.13' and platform_machine == 'x86_64'",
|
|
"pytorch_triton_xpu @ https://download.pytorch.org/whl/pytorch_triton_xpu-3.5.0-cp310-cp310-win_amd64.whl#sha256=c83ab007311d9cfb6e809ee5a4587d99a9eef4be720b90da4f1aaa68b45139a0 ; sys_platform == 'win32' and python_version == '3.10' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
|
|
"pytorch_triton_xpu @ https://download.pytorch.org/whl/pytorch_triton_xpu-3.5.0-cp311-cp311-win_amd64.whl#sha256=debf75348da8e8c7166b4d4a9b91d1508bb8d6581e339f79f7604b2e6746bacd ; sys_platform == 'win32' and python_version == '3.11' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
|
|
"pytorch_triton_xpu @ https://download.pytorch.org/whl/pytorch_triton_xpu-3.5.0-cp312-cp312-win_amd64.whl#sha256=97337a47425f1963a723475bd61037460e84ba01db4f87a1d662c3718ff6c47e ; sys_platform == 'win32' and python_version == '3.12' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
|
|
"pytorch_triton_xpu @ https://download.pytorch.org/whl/pytorch_triton_xpu-3.5.0-cp313-cp313-win_amd64.whl#sha256=2caf8138695f6abb023ecd02031a2611ba1bf8fff2f19802567cb2fadefe9e87 ; sys_platform == 'win32' and python_version == '3.13' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
"torch @ https://download.pytorch.org/whl/xpu/torch-2.9.0%2Bxpu-cp310-cp310-linux_x86_64.whl#sha256=5afbe860ce991825a36b75706a523601087e414b77598ef0d9d3d565741c277d ; platform_system == 'Linux' and python_version == '3.10' and platform_machine == 'x86_64'",
"torch @ https://download.pytorch.org/whl/xpu/torch-2.9.0%2Bxpu-cp311-cp311-linux_x86_64.whl#sha256=607fe419c32d6e8e0556f745742e7cff1d0babce51f54be890e0c1422359c442 ; platform_system == 'Linux' and python_version == '3.11' and platform_machine == 'x86_64'",
"torch @ https://download.pytorch.org/whl/xpu/torch-2.9.0%2Bxpu-cp312-cp312-linux_x86_64.whl#sha256=376bae584d89980b8e59934d248c38d5fa3b7d4687a4df1a19f4bc1d23dcc8c1 ; platform_system == 'Linux' and python_version == '3.12' and platform_machine == 'x86_64'",
"torch @ https://download.pytorch.org/whl/xpu/torch-2.9.0%2Bxpu-cp313-cp313-linux_x86_64.whl#sha256=98d6a06dd7fb185874367b18bd609f05f16fdce4142a5980ca94461949965cd2 ; platform_system == 'Linux' and python_version == '3.13' and platform_machine == 'x86_64'",
"torch @ https://download.pytorch.org/whl/xpu/torch-2.9.0%2Bxpu-cp310-cp310-win_amd64.whl#sha256=47cc68f631f65bd9c84924d052cd04dec7531023caa85e80345e9c94611c887d ; sys_platform == 'win32' and python_version == '3.10' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
"torch @ https://download.pytorch.org/whl/xpu/torch-2.9.0%2Bxpu-cp311-cp311-win_amd64.whl#sha256=d56c44ab4818aba57e5c7b628f422d014e0d507427170a771c5be85e308b0bc6 ; sys_platform == 'win32' and python_version == '3.11' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
"torch @ https://download.pytorch.org/whl/xpu/torch-2.9.0%2Bxpu-cp312-cp312-win_amd64.whl#sha256=18cad93aaff76a01ce73aef6935ece7cfc03344b905592ec731446c44d44592b ; sys_platform == 'win32' and python_version == '3.12' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
"torch @ https://download.pytorch.org/whl/xpu/torch-2.9.0%2Bxpu-cp313-cp313-win_amd64.whl#sha256=579929cdc10a76800ead41289cac191ea36d1b16f5f501d3fc25607d4375cd83 ; sys_platform == 'win32' and python_version == '3.13' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
"bitsandbytes @ https://github.com/bitsandbytes-foundation/bitsandbytes/releases/download/continuous-release_main/bitsandbytes-1.33.7.preview-py3-none-manylinux_2_24_x86_64.whl ; ('linux' in sys_platform) and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
"bitsandbytes @ https://github.com/bitsandbytes-foundation/bitsandbytes/releases/download/continuous-release_main/bitsandbytes-1.33.7.preview-py3-none-win_amd64.whl ; (sys_platform == 'win32') and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
"torchvision @ https://download.pytorch.org/whl/xpu/torchvision-0.24.0%2Bxpu-cp310-cp310-manylinux_2_28_x86_64.whl#sha256=cbfae2b79b7549fd368c2462fc8e94f8f26cc450782ee72138e908077c09a519 ; platform_system == 'Linux' and python_version == '3.10' and platform_machine == 'x86_64'",
"torchvision @ https://download.pytorch.org/whl/xpu/torchvision-0.24.0%2Bxpu-cp311-cp311-manylinux_2_28_x86_64.whl#sha256=044fa36ef4b6b43edcd490b75c853fa4b3eb033c2bded29f8fbcf27734713c67 ; platform_system == 'Linux' and python_version == '3.11' and platform_machine == 'x86_64'",
"torchvision @ https://download.pytorch.org/whl/xpu/torchvision-0.24.0%2Bxpu-cp312-cp312-manylinux_2_28_x86_64.whl#sha256=4b91e4bec1d740a6211f02578a79888550b73f3a4e1383035f8f6d72f587212c ; platform_system == 'Linux' and python_version == '3.12' and platform_machine == 'x86_64'",
"torchvision @ https://download.pytorch.org/whl/xpu/torchvision-0.24.0%2Bxpu-cp313-cp313-manylinux_2_28_x86_64.whl#sha256=88239e73ca37254bec84f29cd5887e10ff712de7edbbda3fbb3609cd6190d99e ; platform_system == 'Linux' and python_version == '3.13' and platform_machine == 'x86_64'",
"torchvision @ https://download.pytorch.org/whl/xpu/torchvision-0.24.0%2Bxpu-cp310-cp310-win_amd64.whl#sha256=19c7da8ca767d593e13a88a12bb08d06e34a673f6f26c2f9c191d60e81c02953 ; sys_platform == 'win32' and python_version == '3.10' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
"torchvision @ https://download.pytorch.org/whl/xpu/torchvision-0.24.0%2Bxpu-cp311-cp311-win_amd64.whl#sha256=9bb0d1421c544ac8e2eca5b47daacaf54706dc9139c003aa5e77ee5f355c5931 ; sys_platform == 'win32' and python_version == '3.11' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
"torchvision @ https://download.pytorch.org/whl/xpu/torchvision-0.24.0%2Bxpu-cp312-cp312-win_amd64.whl#sha256=6a5194bc736089606342d48a3f6822829b167617e9495d91d753dd1bd46fda18 ; sys_platform == 'win32' and python_version == '3.12' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
"torchvision @ https://download.pytorch.org/whl/xpu/torchvision-0.24.0%2Bxpu-cp313-cp313-win_amd64.whl#sha256=da47a3ce2bb7f0301a31124668b5908f9b9e92d6241443de15a310ef9632fd83 ; sys_platform == 'win32' and python_version == '3.13' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
]
intel-gpu-torch290 = [
"unsloth[intelgputorch290]"
]
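# Usage sketch (an assumption for illustration, not part of the file: pip
# resolves the hyphenated alias above to the pinned Intel XPU wheels, which
# are direct URLs, so no extra index URL should be needed):
#   pip install "unsloth[intel-gpu-torch290]"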
intelgputorch210 = [
"unsloth_zoo[intelgpu]",
"unsloth[huggingfacenotorch]",
"pytorch_triton_xpu @ https://download.pytorch.org/whl/pytorch_triton_xpu-3.5.0-cp310-cp310-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl#sha256=c169a1de14c19673b17c751290d467fa282fc90fa5da4314b2e5cdab1f553146 ; platform_system == 'Linux' and python_version == '3.10' and platform_machine == 'x86_64'",
"pytorch_triton_xpu @ https://download.pytorch.org/whl/pytorch_triton_xpu-3.5.0-cp311-cp311-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl#sha256=013d9dd5d6479bd22983161f462e61c8dbe1d82e6730624a7a8d5945507eaa61 ; platform_system == 'Linux' and python_version == '3.11' and platform_machine == 'x86_64'",
"pytorch_triton_xpu @ https://download.pytorch.org/whl/pytorch_triton_xpu-3.5.0-cp312-cp312-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl#sha256=afc8cabfbf7ed51fd278d1e0f88d6afc157b0201bad4b99d681e4d542f9e66d4 ; platform_system == 'Linux' and python_version == '3.12' and platform_machine == 'x86_64'",
"pytorch_triton_xpu @ https://download.pytorch.org/whl/pytorch_triton_xpu-3.5.0-cp313-cp313-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl#sha256=0d24c1716088f2764d0d24c64227732195b6a42706c3c5fc89eeb4904bfa0818 ; platform_system == 'Linux' and python_version == '3.13' and platform_machine == 'x86_64'",
"pytorch_triton_xpu @ https://download.pytorch.org/whl/pytorch_triton_xpu-3.5.0-cp310-cp310-win_amd64.whl#sha256=c83ab007311d9cfb6e809ee5a4587d99a9eef4be720b90da4f1aaa68b45139a0 ; sys_platform == 'win32' and python_version == '3.10' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
"pytorch_triton_xpu @ https://download.pytorch.org/whl/pytorch_triton_xpu-3.5.0-cp311-cp311-win_amd64.whl#sha256=debf75348da8e8c7166b4d4a9b91d1508bb8d6581e339f79f7604b2e6746bacd ; sys_platform == 'win32' and python_version == '3.11' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
"pytorch_triton_xpu @ https://download.pytorch.org/whl/pytorch_triton_xpu-3.5.0-cp312-cp312-win_amd64.whl#sha256=97337a47425f1963a723475bd61037460e84ba01db4f87a1d662c3718ff6c47e ; sys_platform == 'win32' and python_version == '3.12' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
"pytorch_triton_xpu @ https://download.pytorch.org/whl/pytorch_triton_xpu-3.5.0-cp313-cp313-win_amd64.whl#sha256=2caf8138695f6abb023ecd02031a2611ba1bf8fff2f19802567cb2fadefe9e87 ; sys_platform == 'win32' and python_version == '3.13' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
"torch @ https://download.pytorch.org/whl/xpu/torch-2.10.0%2Bxpu-cp310-cp310-linux_x86_64.whl#sha256=abb1d1ec1ac672bac0ff35420c965f2df0c636ef9d94e2a830e34578489d0a57 ; platform_system == 'Linux' and python_version == '3.10' and platform_machine == 'x86_64'",
"torch @ https://download.pytorch.org/whl/xpu/torch-2.10.0%2Bxpu-cp311-cp311-linux_x86_64.whl#sha256=71ad2f82da0f41eaec159f39fc85854e27c2391efa91b373e550648a6f4aaad3 ; platform_system == 'Linux' and python_version == '3.11' and platform_machine == 'x86_64'",
"torch @ https://download.pytorch.org/whl/xpu/torch-2.10.0%2Bxpu-cp312-cp312-linux_x86_64.whl#sha256=b473571d478912f92881cc13f15fa18f8463fb0fb8a068c96ed47a7d45a4da0a ; platform_system == 'Linux' and python_version == '3.12' and platform_machine == 'x86_64'",
"torch @ https://download.pytorch.org/whl/xpu/torch-2.10.0%2Bxpu-cp313-cp313-linux_x86_64.whl#sha256=3bc64a746ff25a93de140902c60c9e819d7413f5cea1e88d80999c27a5901e9c ; platform_system == 'Linux' and python_version == '3.13' and platform_machine == 'x86_64'",
"torch @ https://download.pytorch.org/whl/xpu/torch-2.10.0%2Bxpu-cp310-cp310-win_amd64.whl#sha256=ce50691ab3fb6301d9b7bb8b3834cf5fa7152a2b5f91fd24c5efdc601a25b780 ; sys_platform == 'win32' and python_version == '3.10' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
"torch @ https://download.pytorch.org/whl/xpu/torch-2.10.0%2Bxpu-cp311-cp311-win_amd64.whl#sha256=cb9d37f21cb9fb7df67d62863f021c3144e8d8832b9ea8e8523ac308bc620ea1 ; sys_platform == 'win32' and python_version == '3.11' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
"torch @ https://download.pytorch.org/whl/xpu/torch-2.10.0%2Bxpu-cp312-cp312-win_amd64.whl#sha256=3ad605be4728b6d3a28a44d07dd794b1a9e45551b0057815bf25eb2a6d6a56a7 ; sys_platform == 'win32' and python_version == '3.12' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
"torch @ https://download.pytorch.org/whl/xpu/torch-2.10.0%2Bxpu-cp313-cp313-win_amd64.whl#sha256=2b4b56dd6c792aef82006904fa888692e3782e4ae5da27526801bad4898f05a5 ; sys_platform == 'win32' and python_version == '3.13' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
"bitsandbytes @ https://github.com/bitsandbytes-foundation/bitsandbytes/releases/download/continuous-release_main/bitsandbytes-1.33.7.preview-py3-none-manylinux_2_24_x86_64.whl ; ('linux' in sys_platform) and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
"bitsandbytes @ https://github.com/bitsandbytes-foundation/bitsandbytes/releases/download/continuous-release_main/bitsandbytes-1.33.7.preview-py3-none-win_amd64.whl ; (sys_platform == 'win32') and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
"torchvision @ https://download.pytorch.org/whl/xpu/torchvision-0.25.0%2Bxpu-cp310-cp310-manylinux_2_28_x86_64.whl#sha256=7e1e7b170fcf7161c8499b67156c5a05462243626dc0974010791a0bab4378d3 ; platform_system == 'Linux' and python_version == '3.10' and platform_machine == 'x86_64'",
"torchvision @ https://download.pytorch.org/whl/xpu/torchvision-0.25.0%2Bxpu-cp311-cp311-manylinux_2_28_x86_64.whl#sha256=bd6add201bd7628af70437292e1447abb368e0b5f4ff9abd334ae435efd44792 ; platform_system == 'Linux' and python_version == '3.11' and platform_machine == 'x86_64'",
"torchvision @ https://download.pytorch.org/whl/xpu/torchvision-0.25.0%2Bxpu-cp312-cp312-manylinux_2_28_x86_64.whl#sha256=6ad2543496bc29e59d3dd614a94d09aa9870318aedb66045344fffddfedd2cf8 ; platform_system == 'Linux' and python_version == '3.12' and platform_machine == 'x86_64'",
"torchvision @ https://download.pytorch.org/whl/xpu/torchvision-0.25.0%2Bxpu-cp313-cp313-manylinux_2_28_x86_64.whl#sha256=80269f37865fcd8b57f20e4786efae2200bfa2b2727926c3c7acc82f0e7d3548 ; platform_system == 'Linux' and python_version == '3.13' and platform_machine == 'x86_64'",
"torchvision @ https://download.pytorch.org/whl/xpu/torchvision-0.25.0%2Bxpu-cp310-cp310-win_amd64.whl#sha256=6b9485ba85dcba4d196d6134d9c3332fb228fb2556416bf0450a64e8a472fcba ; sys_platform == 'win32' and python_version == '3.10' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
"torchvision @ https://download.pytorch.org/whl/xpu/torchvision-0.25.0%2Bxpu-cp311-cp311-win_amd64.whl#sha256=36cbaedf10f6412af5c89afd9aeea474e6a56a0050348ada8fabe1ecaf6b879e ; sys_platform == 'win32' and python_version == '3.11' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
"torchvision @ https://download.pytorch.org/whl/xpu/torchvision-0.25.0%2Bxpu-cp312-cp312-win_amd64.whl#sha256=738357d97468d75fe3d510ac37e65130f2787f81d9bbc1518898f7396dc3403f ; sys_platform == 'win32' and python_version == '3.12' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
"torchvision @ https://download.pytorch.org/whl/xpu/torchvision-0.25.0%2Bxpu-cp313-cp313-win_amd64.whl#sha256=1c4b44b36a557f7381e3076fb8843366742238648441d607c8d049c6da0f8886 ; sys_platform == 'win32' and python_version == '3.13' and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
]
intel-gpu-torch210 = [
"unsloth[intelgputorch210]"
]
intel = [
"unsloth[intelgputorch280]",
]
amd = [
"unsloth[huggingfacenotorch]",
"bitsandbytes>=0.49.1 ; ('linux' in sys_platform) and (platform_machine == 'AMD64' or platform_machine == 'x86_64' or platform_machine == 'aarch64')",
"bitsandbytes>=0.49.1 ; (sys_platform == 'win32') and (platform_machine == 'AMD64' or platform_machine == 'x86_64')",
]
rocm702-torch280 = [
"unsloth[amd]",
"triton @ https://repo.radeon.com/rocm/manylinux/rocm-rel-7.0.2/triton-3.4.0%2Brocm7.0.2.gitf9e5bf54-cp311-cp311-linux_x86_64.whl ; platform_system == 'Linux' and python_version == '3.11' and platform_machine == 'x86_64'",
"triton @ https://repo.radeon.com/rocm/manylinux/rocm-rel-7.0.2/triton-3.4.0%2Brocm7.0.2.gitf9e5bf54-cp312-cp312-linux_x86_64.whl ; platform_system == 'Linux' and python_version == '3.12' and platform_machine == 'x86_64'",
"triton @ https://repo.radeon.com/rocm/manylinux/rocm-rel-7.0.2/triton-3.4.0%2Brocm7.0.2.gitf9e5bf54-cp313-cp313-linux_x86_64.whl ; platform_system == 'Linux' and python_version == '3.13' and platform_machine == 'x86_64'",
"torch @ https://repo.radeon.com/rocm/manylinux/rocm-rel-7.0.2/torch-2.8.0%2Brocm7.0.2.lw.git245bf6ed-cp311-cp311-linux_x86_64.whl ; platform_system == 'Linux' and python_version == '3.11' and platform_machine == 'x86_64'",
"torch @ https://repo.radeon.com/rocm/manylinux/rocm-rel-7.0.2/torch-2.8.0%2Brocm7.0.2.lw.git245bf6ed-cp312-cp312-linux_x86_64.whl ; platform_system == 'Linux' and python_version == '3.12' and platform_machine == 'x86_64'",
"torch @ https://repo.radeon.com/rocm/manylinux/rocm-rel-7.0.2/torch-2.8.0%2Brocm7.0.2.lw.git245bf6ed-cp313-cp313-linux_x86_64.whl ; platform_system == 'Linux' and python_version == '3.13' and platform_machine == 'x86_64'",
"torchvision @ https://repo.radeon.com/rocm/manylinux/rocm-rel-7.0.2/torchvision-0.23.0%2Brocm7.0.2.git824e8c87-cp311-cp311-linux_x86_64.whl ; platform_system == 'Linux' and python_version == '3.11' and platform_machine == 'x86_64'",
"torchvision @ https://repo.radeon.com/rocm/manylinux/rocm-rel-7.0.2/torchvision-0.23.0%2Brocm7.0.2.git824e8c87-cp312-cp312-linux_x86_64.whl ; platform_system == 'Linux' and python_version == '3.12' and platform_machine == 'x86_64'",
"torchvision @ https://repo.radeon.com/rocm/manylinux/rocm-rel-7.0.2/torchvision-0.23.0%2Brocm7.0.2.git824e8c87-cp313-cp313-linux_x86_64.whl ; platform_system == 'Linux' and python_version == '3.13' and platform_machine == 'x86_64'",
]
rocm72-torch291 = [
"unsloth[amd]",
"triton @ https://repo.radeon.com/rocm/manylinux/rocm-rel-7.2/triton-3.5.1%2Brocm7.2.0.gita272dfa8-cp310-cp310-linux_x86_64.whl ; platform_system == 'Linux' and python_version == '3.10' and platform_machine == 'x86_64'",
"triton @ https://repo.radeon.com/rocm/manylinux/rocm-rel-7.2/triton-3.5.1%2Brocm7.2.0.gita272dfa8-cp311-cp311-linux_x86_64.whl ; platform_system == 'Linux' and python_version == '3.11' and platform_machine == 'x86_64'",
"triton @ https://repo.radeon.com/rocm/manylinux/rocm-rel-7.2/triton-3.5.1%2Brocm7.2.0.gita272dfa8-cp312-cp312-linux_x86_64.whl ; platform_system == 'Linux' and python_version == '3.12' and platform_machine == 'x86_64'",
"triton @ https://repo.radeon.com/rocm/manylinux/rocm-rel-7.2/triton-3.5.1%2Brocm7.2.0.gita272dfa8-cp313-cp313-linux_x86_64.whl ; platform_system == 'Linux' and python_version == '3.13' and platform_machine == 'x86_64'",
"torch @ https://repo.radeon.com/rocm/manylinux/rocm-rel-7.2/torch-2.9.1%2Brocm7.2.0.lw.git7e1940d4-cp310-cp310-linux_x86_64.whl ; platform_system == 'Linux' and python_version == '3.10' and platform_machine == 'x86_64'",
"torch @ https://repo.radeon.com/rocm/manylinux/rocm-rel-7.2/torch-2.9.1%2Brocm7.2.0.lw.git7e1940d4-cp311-cp311-linux_x86_64.whl ; platform_system == 'Linux' and python_version == '3.11' and platform_machine == 'x86_64'",
"torch @ https://repo.radeon.com/rocm/manylinux/rocm-rel-7.2/torch-2.9.1%2Brocm7.2.0.lw.git7e1940d4-cp312-cp312-linux_x86_64.whl ; platform_system == 'Linux' and python_version == '3.12' and platform_machine == 'x86_64'",
"torch @ https://repo.radeon.com/rocm/manylinux/rocm-rel-7.2/torch-2.9.1%2Brocm7.2.0.lw.git7e1940d4-cp313-cp313-linux_x86_64.whl ; platform_system == 'Linux' and python_version == '3.13' and platform_machine == 'x86_64'",
"torch @ https://repo.radeon.com/rocm/windows/rocm-rel-7.2/torch-2.9.1%2Brocmsdk20260116-cp312-cp312-win_amd64.whl ; sys_platform == 'win32' and python_version == '3.12'",
"torchvision @ https://repo.radeon.com/rocm/manylinux/rocm-rel-7.2/torchvision-0.24.0%2Brocm7.2.0.gitb919bd0c-cp310-cp310-linux_x86_64.whl ; platform_system == 'Linux' and python_version == '3.10' and platform_machine == 'x86_64'",
"torchvision @ https://repo.radeon.com/rocm/manylinux/rocm-rel-7.2/torchvision-0.24.0%2Brocm7.2.0.gitb919bd0c-cp311-cp311-linux_x86_64.whl ; platform_system == 'Linux' and python_version == '3.11' and platform_machine == 'x86_64'",
"torchvision @ https://repo.radeon.com/rocm/manylinux/rocm-rel-7.2/torchvision-0.24.0%2Brocm7.2.0.gitb919bd0c-cp312-cp312-linux_x86_64.whl ; platform_system == 'Linux' and python_version == '3.12' and platform_machine == 'x86_64'",
"torchvision @ https://repo.radeon.com/rocm/manylinux/rocm-rel-7.2/torchvision-0.24.0%2Brocm7.2.0.gitb919bd0c-cp313-cp313-linux_x86_64.whl ; platform_system == 'Linux' and python_version == '3.13' and platform_machine == 'x86_64'",
"torchvision @ https://repo.radeon.com/rocm/windows/rocm-rel-7.2/torchvision-0.24.1%2Brocmsdk20260116-cp312-cp312-win_amd64.whl ; sys_platform == 'win32' and python_version == '3.12'",
]
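# Usage sketch (an assumption for illustration, not part of the file: this
# extra assumes a matching ROCm 7.2 runtime on the host; the triton / torch /
# torchvision wheels above are fetched directly from repo.radeon.com):
#   pip install "unsloth[rocm72-torch291]"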
rocm711-torch291 = [
"unsloth[amd]",
"triton @ https://repo.radeon.com/rocm/manylinux/rocm-rel-7.1.1/triton-3.5.1%2Brocm7.1.1.gita272dfa8-cp310-cp310-linux_x86_64.whl ; platform_system == 'Linux' and python_version == '3.10' and platform_machine == 'x86_64'",
"triton @ https://repo.radeon.com/rocm/manylinux/rocm-rel-7.1.1/triton-3.5.1%2Brocm7.1.1.gita272dfa8-cp311-cp311-linux_x86_64.whl ; platform_system == 'Linux' and python_version == '3.11' and platform_machine == 'x86_64'",
"triton @ https://repo.radeon.com/rocm/manylinux/rocm-rel-7.1.1/triton-3.5.1%2Brocm7.1.1.gita272dfa8-cp312-cp312-linux_x86_64.whl ; platform_system == 'Linux' and python_version == '3.12' and platform_machine == 'x86_64'",
"triton @ https://repo.radeon.com/rocm/manylinux/rocm-rel-7.1.1/triton-3.5.1%2Brocm7.1.1.gita272dfa8-cp313-cp313-linux_x86_64.whl ; platform_system == 'Linux' and python_version == '3.13' and platform_machine == 'x86_64'",
"torch @ https://repo.radeon.com/rocm/manylinux/rocm-rel-7.1.1/torch-2.9.1%2Brocm7.1.1.lw.git351ff442-cp310-cp310-linux_x86_64.whl ; platform_system == 'Linux' and python_version == '3.10' and platform_machine == 'x86_64'",
"torch @ https://repo.radeon.com/rocm/manylinux/rocm-rel-7.1.1/torch-2.9.1%2Brocm7.1.1.lw.git351ff442-cp311-cp311-linux_x86_64.whl ; platform_system == 'Linux' and python_version == '3.11' and platform_machine == 'x86_64'",
"torch @ https://repo.radeon.com/rocm/manylinux/rocm-rel-7.1.1/torch-2.9.1%2Brocm7.1.1.lw.git351ff442-cp312-cp312-linux_x86_64.whl ; platform_system == 'Linux' and python_version == '3.12' and platform_machine == 'x86_64'",
"torch @ https://repo.radeon.com/rocm/manylinux/rocm-rel-7.1.1/torch-2.9.1%2Brocm7.1.1.lw.git351ff442-cp313-cp313-linux_x86_64.whl ; platform_system == 'Linux' and python_version == '3.13' and platform_machine == 'x86_64'",
"torchvision @ https://repo.radeon.com/rocm/manylinux/rocm-rel-7.1.1/torchvision-0.24.0%2Brocm7.1.1.gitb919bd0c-cp310-cp310-linux_x86_64.whl ; platform_system == 'Linux' and python_version == '3.10' and platform_machine == 'x86_64'",
"torchvision @ https://repo.radeon.com/rocm/manylinux/rocm-rel-7.1.1/torchvision-0.24.0%2Brocm7.1.1.gitb919bd0c-cp311-cp311-linux_x86_64.whl ; platform_system == 'Linux' and python_version == '3.11' and platform_machine == 'x86_64'",
"torchvision @ https://repo.radeon.com/rocm/manylinux/rocm-rel-7.1.1/torchvision-0.24.0%2Brocm7.1.1.gitb919bd0c-cp312-cp312-linux_x86_64.whl ; platform_system == 'Linux' and python_version == '3.12' and platform_machine == 'x86_64'",
"torchvision @ https://repo.radeon.com/rocm/manylinux/rocm-rel-7.1.1/torchvision-0.24.0%2Brocm7.1.1.gitb919bd0c-cp313-cp313-linux_x86_64.whl ; platform_system == 'Linux' and python_version == '3.13' and platform_machine == 'x86_64'",
]
rocm72-torch2100 = [
"unsloth[amd]",
"triton @ https://repo.radeon.com/rocm/manylinux/rocm-rel-7.2/triton-3.6.0%2Brocm7.2.0.gitba5c1517-cp310-cp310-linux_x86_64.whl ; platform_system == 'Linux' and python_version == '3.10' and platform_machine == 'x86_64'",
"triton @ https://repo.radeon.com/rocm/manylinux/rocm-rel-7.2/triton-3.6.0%2Brocm7.2.0.gitba5c1517-cp311-cp311-linux_x86_64.whl ; platform_system == 'Linux' and python_version == '3.11' and platform_machine == 'x86_64'",
"triton @ https://repo.radeon.com/rocm/manylinux/rocm-rel-7.2/triton-3.6.0%2Brocm7.2.0.gitba5c1517-cp312-cp312-linux_x86_64.whl ; platform_system == 'Linux' and python_version == '3.12' and platform_machine == 'x86_64'",
"triton @ https://repo.radeon.com/rocm/manylinux/rocm-rel-7.2/triton-3.6.0%2Brocm7.2.0.gitba5c1517-cp313-cp313-linux_x86_64.whl ; platform_system == 'Linux' and python_version == '3.13' and platform_machine == 'x86_64'",
"torch @ https://repo.radeon.com/rocm/manylinux/rocm-rel-7.2/torch-2.10.0%2Brocm7.2.0.lw.gitb6ee5fde-cp310-cp310-linux_x86_64.whl ; platform_system == 'Linux' and python_version == '3.10' and platform_machine == 'x86_64'",
"torch @ https://repo.radeon.com/rocm/manylinux/rocm-rel-7.2/torch-2.10.0%2Brocm7.2.0.lw.gitb6ee5fde-cp311-cp311-linux_x86_64.whl ; platform_system == 'Linux' and python_version == '3.11' and platform_machine == 'x86_64'",
"torch @ https://repo.radeon.com/rocm/manylinux/rocm-rel-7.2/torch-2.10.0%2Brocm7.2.0.lw.gitb6ee5fde-cp312-cp312-linux_x86_64.whl ; platform_system == 'Linux' and python_version == '3.12' and platform_machine == 'x86_64'",
"torch @ https://repo.radeon.com/rocm/manylinux/rocm-rel-7.2/torch-2.10.0%2Brocm7.2.0.lw.gitb6ee5fde-cp313-cp313-linux_x86_64.whl ; platform_system == 'Linux' and python_version == '3.13' and platform_machine == 'x86_64'",
"torchvision @ https://repo.radeon.com/rocm/manylinux/rocm-rel-7.2/torchvision-0.25.0%2Brocm7.2.0.git82df5f59-cp310-cp310-linux_x86_64.whl ; platform_system == 'Linux' and python_version == '3.10' and platform_machine == 'x86_64'",
"torchvision @ https://repo.radeon.com/rocm/manylinux/rocm-rel-7.2/torchvision-0.25.0%2Brocm7.2.0.git82df5f59-cp311-cp311-linux_x86_64.whl ; platform_system == 'Linux' and python_version == '3.11' and platform_machine == 'x86_64'",
"torchvision @ https://repo.radeon.com/rocm/manylinux/rocm-rel-7.2/torchvision-0.25.0%2Brocm7.2.0.git82df5f59-cp312-cp312-linux_x86_64.whl ; platform_system == 'Linux' and python_version == '3.12' and platform_machine == 'x86_64'",
"torchvision @ https://repo.radeon.com/rocm/manylinux/rocm-rel-7.2/torchvision-0.25.0%2Brocm7.2.0.git82df5f59-cp313-cp313-linux_x86_64.whl ; platform_system == 'Linux' and python_version == '3.13' and platform_machine == 'x86_64'",
]
rocm711-torch2100 = [
"unsloth[amd]",
"triton @ https://repo.radeon.com/rocm/manylinux/rocm-rel-7.1.1/triton-3.6.0%2Brocm7.1.1.gitba5c1517-cp310-cp310-linux_x86_64.whl ; platform_system == 'Linux' and python_version == '3.10' and platform_machine == 'x86_64'",
"triton @ https://repo.radeon.com/rocm/manylinux/rocm-rel-7.1.1/triton-3.6.0%2Brocm7.1.1.gitba5c1517-cp311-cp311-linux_x86_64.whl ; platform_system == 'Linux' and python_version == '3.11' and platform_machine == 'x86_64'",
"triton @ https://repo.radeon.com/rocm/manylinux/rocm-rel-7.1.1/triton-3.6.0%2Brocm7.1.1.gitba5c1517-cp312-cp312-linux_x86_64.whl ; platform_system == 'Linux' and python_version == '3.12' and platform_machine == 'x86_64'",
"triton @ https://repo.radeon.com/rocm/manylinux/rocm-rel-7.1.1/triton-3.6.0%2Brocm7.1.1.gitba5c1517-cp313-cp313-linux_x86_64.whl ; platform_system == 'Linux' and python_version == '3.13' and platform_machine == 'x86_64'",
"torch @ https://repo.radeon.com/rocm/manylinux/rocm-rel-7.1.1/torch-2.10.0%2Brocm7.1.1.lw.gitd9556b05-cp310-cp310-linux_x86_64.whl ; platform_system == 'Linux' and python_version == '3.10' and platform_machine == 'x86_64'",
"torch @ https://repo.radeon.com/rocm/manylinux/rocm-rel-7.1.1/torch-2.10.0%2Brocm7.1.1.lw.gitd9556b05-cp311-cp311-linux_x86_64.whl ; platform_system == 'Linux' and python_version == '3.11' and platform_machine == 'x86_64'",
"torch @ https://repo.radeon.com/rocm/manylinux/rocm-rel-7.1.1/torch-2.10.0%2Brocm7.1.1.lw.gitd9556b05-cp312-cp312-linux_x86_64.whl ; platform_system == 'Linux' and python_version == '3.12' and platform_machine == 'x86_64'",
"torch @ https://repo.radeon.com/rocm/manylinux/rocm-rel-7.1.1/torch-2.10.0%2Brocm7.1.1.lw.gitd9556b05-cp313-cp313-linux_x86_64.whl ; platform_system == 'Linux' and python_version == '3.13' and platform_machine == 'x86_64'",
"torchvision @ https://repo.radeon.com/rocm/manylinux/rocm-rel-7.1.1/torchvision-0.25.0%2Brocm7.1.1.git82df5f59-cp310-cp310-linux_x86_64.whl ; platform_system == 'Linux' and python_version == '3.10' and platform_machine == 'x86_64'",
"torchvision @ https://repo.radeon.com/rocm/manylinux/rocm-rel-7.1.1/torchvision-0.25.0%2Brocm7.1.1.git82df5f59-cp311-cp311-linux_x86_64.whl ; platform_system == 'Linux' and python_version == '3.11' and platform_machine == 'x86_64'",
"torchvision @ https://repo.radeon.com/rocm/manylinux/rocm-rel-7.1.1/torchvision-0.25.0%2Brocm7.1.1.git82df5f59-cp312-cp312-linux_x86_64.whl ; platform_system == 'Linux' and python_version == '3.12' and platform_machine == 'x86_64'",
"torchvision @ https://repo.radeon.com/rocm/manylinux/rocm-rel-7.1.1/torchvision-0.25.0%2Brocm7.1.1.git82df5f59-cp313-cp313-linux_x86_64.whl ; platform_system == 'Linux' and python_version == '3.13' and platform_machine == 'x86_64'",
]
[project.urls]
homepage = "https://unsloth.ai"
documentation = "https://unsloth.ai/docs"
repository = "https://github.com/unslothai/unsloth"
[tool.ruff]
target-version = "py311"
force-exclude = true
extend-exclude = [
"*chat_templates.py",
"*ollama_template_mappers.py",
"*_auto_install.py",
"*mapper.py",
]
[tool.ruff.lint]
select = ["E9", "F63", "F7", "F82"]
ignore = [
"E402",
"E722",
"F403",
"F405",
"F811",
"F821",
"F841",
"F401",
"E731",
"E741",
"F601",
"E712",
]
[tool.ruff.format]