Mirror of https://github.com/kvcache-ai/ktransformers.git (synced 2026-04-28 03:39:48 +00:00)
[docs]: add amd blis backend usage guide (#1669)
Some checks are pending:
- Book-CI / test (push) — Waiting to run
- Book-CI / test-1 (push) — Waiting to run
- Book-CI / test-2 (push) — Waiting to run
- Deploy / deploy (macos-latest) (push) — Waiting to run
- Deploy / deploy (ubuntu-latest) (push) — Waiting to run
- Deploy / deploy (windows-latest) (push) — Waiting to run
This commit is contained in:
parent 1ca3a2662e
commit 4850424345
1 changed file with 4 additions and 0 deletions
```diff
@@ -86,6 +86,10 @@ The install script will:
 If you have an AMX-capable CPU but plan to use the LLAMAFILE backend, do NOT use the default auto-detection build.
 Use "manual mode" with `CPUINFER_CPU_INSTRUCT` set to `AVX512` or `AVX2` instead of `NATIVE` to avoid compilation issues (see below).
+
+⚠️ **Important for BLIS AMD backend users:**
+For the installation guide, see this [issue](https://github.com/kvcache-ai/ktransformers/issues/1601).
+
 ### Manual Configuration (Advanced)

 If you need specific build options (e.g., for LLAMAFILE backend, compatibility, or binary distribution):
```
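The context lines above describe pinning the instruction set via `CPUINFER_CPU_INSTRUCT` instead of relying on `NATIVE` auto-detection. A minimal sketch of what that "manual mode" shell setup could look like — the variable name comes from the diff, but the install command shown in the comment is an assumption, not the project's documented invocation:

```shell
# Sketch: pin the CPU instruction set for a LLAMAFILE-backend build,
# instead of letting NATIVE auto-detection select AMX on AMX-capable CPUs.
# CPUINFER_CPU_INSTRUCT is named in the diff; use AVX2 on CPUs without AVX-512.
export CPUINFER_CPU_INSTRUCT=AVX512
echo "Building with CPUINFER_CPU_INSTRUCT=${CPUINFER_CPU_INSTRUCT}"
# Hypothetical install step (consult the repo's install docs for the real one):
# CPUINFER_CPU_INSTRUCT=AVX512 pip install . --no-build-isolation
```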