kvcache-ai-ktransformers/doc/en
Jianwei Dong 15c624dcae
Fix/sglang kt detection (#1875)
* [feat]: simplify sglang installation with submodule, auto-sync CI, and version alignment

- Add kvcache-ai/sglang as git submodule at third_party/sglang (branch = main)
- Add top-level install.sh for one-click source installation (sglang + kt-kernel)
- Add sglang-kt as hard dependency in kt-kernel/pyproject.toml
- Add CI workflow to auto-sync sglang submodule daily and create PR
- Add CI workflow to build and publish sglang-kt to PyPI
- Integrate sglang-kt build into release-pypi.yml (version.py bump publishes both packages)
- Align sglang-kt version with ktransformers via SGLANG_KT_VERSION env var injection
- Update Dockerfile to use submodule and inject aligned version
- Update all 13 doc files, CLI hints, and i18n strings to reference new install methods
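The version-alignment mechanism listed above (injecting SGLANG_KT_VERSION so sglang-kt is published with the same version string as ktransformers) can be sketched roughly like this; a minimal sketch, where the function name and the dev-placeholder fallback are assumptions, not the repo's actual code:

```python
import os

def resolve_sglang_kt_version(default: str = "0.0.0.dev0") -> str:
    """Hypothetical sketch: CI exports SGLANG_KT_VERSION so the sglang-kt
    build picks up the same version string as the ktransformers release;
    local builds without the variable fall back to a dev placeholder."""
    return os.environ.get("SGLANG_KT_VERSION", default)
```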

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* [build]: bump version to 0.5.2

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* [build]: rename PyPI package from kt-kernel to ktransformers

Users can now `pip install ktransformers` to get everything
(sglang-kt is auto-installed as a dependency).

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* Revert "[build]: rename PyPI package from kt-kernel to ktransformers"

This reverts commit e0cbbf6364.

* [build]: add ktransformers meta-package for PyPI

`pip install ktransformers` now works as a single install command.
It pulls kt-kernel (which in turn pulls sglang-kt).

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* [fix]: show sglang-kt package version in kt version command

- Prioritize the sglang-kt package version (aligned with ktransformers)
  over sglang's internal __version__
- Update display name from "sglang" to "sglang-kt"

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* [fix]: improve sglang-kt detection in kt doctor and kt version

Recognize sglang-kt package name as proof of kvcache-ai fork installation.
Previously both commands fell through to "PyPI (not recommended)" for
non-editable local source installs. Now version.py reuses the centralized
check_sglang_installation() logic.
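A centralized check of this kind could look roughly like the following; a minimal sketch assuming importlib.metadata-based detection, with the helper name pick_installed being hypothetical:

```python
from importlib import metadata

def pick_installed(candidates):
    """Hypothetical sketch of fork detection: the presence of the
    sglang-kt distribution is taken as proof of the kvcache-ai fork,
    so it is checked before any fallback distribution names."""
    for name in candidates:
        try:
            return name, metadata.version(name)
        except metadata.PackageNotFoundError:
            continue
    return None, None

# kt doctor / kt version would consult sglang-kt first:
fork, fork_version = pick_installed(["sglang-kt", "sglang"])
```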

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* [build]: bump version to 0.5.2.post1

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

---------

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 16:54:48 +08:00
api/server Necessary tips for Node.js related issues 2025-02-19 16:37:18 +08:00
kt-kernel Fix/sglang kt detection (#1875) 2026-03-04 16:54:48 +08:00
operators Initial commit 2024-07-27 16:06:58 +08:00
SFT [docs]: refine README for dpo updates (#1740) 2025-12-24 11:20:08 +08:00
AMX.md Update AMX.md 2025-04-29 11:12:51 +08:00
balance-serve.md add flashinfer to cuda device 2025-05-15 07:03:45 +00:00
benchmark.md release v0.2.3 2025-03-05 20:21:04 +08:00
deepseek-v2-injection.md * Reorganize documentation/README 2025-02-14 19:58:26 +00:00
DeepseekR1_V3_tutorial.md fix typo (#1452) 2025-11-10 16:08:04 +08:00
Docker.md 📝 fix typo ktransformer->ktransformers 2025-03-17 17:54:00 +08:00
Docker_xpu.md docs: add Dockerfile.xpu and GPU driver setup instructions 2025-05-28 13:55:35 +08:00
FAQ.md [doc]: update web doc and kt-kernel doc (#1609) 2025-11-13 20:44:13 +08:00
fp8_kernel.md Update fp8 doc; Update install.md broken link 2025-02-26 15:43:08 +00:00
install.md Merge pull request #1307 from kvcache-ai/hyc 2025-05-17 15:25:33 +08:00
Kimi-K2-Thinking.md Fix/sglang kt detection (#1875) 2026-03-04 16:54:48 +08:00
Kimi-K2.5.md Fix/sglang kt detection (#1875) 2026-03-04 16:54:48 +08:00
Kimi-K2.md Update GGUF format link in Kimi-K2 documentation 2025-09-05 20:19:37 +08:00
Kllama_tutorial_DeepSeekV2Lite.ipynb upload hands-on tutorial with KTransformers-FT, especially in customize your KT-FT+LLaMA-Factory (#1597) 2025-11-11 20:54:41 +08:00
KTransformers Full Introduction for Motivation and Practice.pdf [docs]: Add Full introduction of KT (#1636) 2025-11-29 15:46:55 +08:00
KTransformers-FT_PPT_share.pdf upload hands-on tutorial with KTransformers-FT, especially in customize your KT-FT+LLaMA-Factory (#1597) 2025-11-11 20:54:41 +08:00
llama4.md add flashinfer to cuda device 2025-05-15 07:03:45 +00:00
long_context_introduction.md docs: update long_context_introduction.md 2024-08-30 03:34:39 +09:00
long_context_tutorial.md update readme 2024-08-29 12:04:56 +08:00
makefile_usage.md : rm sensitive info in config.yaml, add readme of makefile. support old model_path config 2024-11-04 14:02:19 +08:00
MiniMax-M2.5.md Fix/sglang kt detection (#1875) 2026-03-04 16:54:48 +08:00
multi-gpu-tutorial.md * Reorganize documentation/README 2025-02-14 19:58:26 +00:00
prefix_cache.md update kvc disk path config. 2025-06-30 15:09:35 +00:00
Qwen3-Next.md fix bug 2025-09-16 13:21:58 +00:00
Qwen3.5.md Fix/sglang kt detection (#1875) 2026-03-04 16:54:48 +08:00
ROCm.md Update readme; Format code; Add example yaml. 2025-03-14 14:25:52 -04:00
SFT_Installation_Guide_KimiK2.5.md Fix/sglang kt detection (#1875) 2026-03-04 16:54:48 +08:00
SFT_Installation_Guide_KimiK2.md Update SFT Installation Guide for KimiK2 2025-11-06 17:34:21 +08:00
SmallThinker_and_Glm4moe.md update smallthinker and glm4 readme 2025-07-31 03:14:49 +00:00
V3-success.md 📝 fix some debug output and update doc 2025-02-13 17:25:12 +08:00
xpu.md docs: add Dockerfile.xpu and GPU driver setup instructions 2025-05-28 13:55:35 +08:00