kvcache-ai-ktransformers/kt-kernel/python/sft
mrhaoxx 58d7eabb9b
feat(sft): support transformers v5 fused expert format
Fused experts (e.g. Qwen3MoeExperts) store weights as 3D Parameters
(gate_up_proj [E,2I,H], down_proj [E,H,I]) instead of per-expert
nn.Linear modules. PEFT cannot attach LoRA adapters to these, so we
create KT-managed LoRA buffers with Kaiming init, wrap them in
nn.Parameter for the optimizer, and pre-assign .grad so the C++
backward can write gradients directly.
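A minimal sketch of the buffer-creation idea described above. Shapes follow the commit message (gate_up_proj is [E,2I,H]); the helper name and signature are assumptions mirroring _create_fused_expert_lora_buffers, not the actual implementation.

```python
import math

import torch
import torch.nn as nn


def create_fused_expert_lora_buffers(num_experts: int, hidden: int,
                                     inter: int, rank: int,
                                     dtype=torch.float32):
    """Hypothetical sketch: per-expert LoRA factors for a fused 3D weight.

    For gate_up_proj [E, 2I, H], each expert's projection is factored as
    B @ A with A: [E, rank, H] (Kaiming init) and B: [E, 2I, rank] (zeros,
    so the adapter starts as a no-op).
    """
    lora_A = nn.Parameter(torch.empty(num_experts, rank, hidden, dtype=dtype))
    lora_B = nn.Parameter(torch.zeros(num_experts, 2 * inter, rank, dtype=dtype))
    nn.init.kaiming_uniform_(lora_A, a=math.sqrt(5))
    # Pre-assign .grad so an external (e.g. C++) backward pass can
    # accumulate gradients into these buffers directly.
    lora_A.grad = torch.zeros_like(lora_A)
    lora_B.grad = torch.zeros_like(lora_B)
    return lora_A, lora_B
```

Zero-initializing lora_B keeps the model's initial forward output unchanged, the standard LoRA convention.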

- arch.py: add detect_fused_experts() to identify the fused layout
- weights.py: extract weights from the fused format, clear them after load
- wrapper.py: detect the fused layout at wrap time; store _fused_experts/_lora_rank
- lora.py: _create_fused_expert_lora_buffers, save/load fused LoRA,
  get_kt_lora_params now collects fused params, deduplicate wrapper lookup
- layer.py: handle the v5 TopKRouter tuple output, remove dead code
- autograd.py: rename API to sync_forward_sft/submit_forward_sft
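A sketch of what the fused-layout detection in arch.py might look like, given the format described above: v5 fused expert modules carry 3D weight Parameters directly instead of per-expert nn.Linear children. Attribute names (gate_up_proj, down_proj) come from the commit message; the function body is an assumption.

```python
import torch
import torch.nn as nn


def detect_fused_experts(module: nn.Module) -> bool:
    """Hypothetical check for the transformers v5 fused expert layout.

    Fused modules (e.g. Qwen3MoeExperts) expose 3D nn.Parameter weights
    (gate_up_proj [E, 2I, H], down_proj [E, H, I]) rather than a list of
    per-expert nn.Linear submodules.
    """
    gate_up = getattr(module, "gate_up_proj", None)
    down = getattr(module, "down_proj", None)
    return (
        isinstance(gate_up, nn.Parameter) and gate_up.dim() == 3
        and isinstance(down, nn.Parameter) and down.dim() == 3
    )
```

Checking for 3D Parameters (rather than a transformers version number) keeps the v4 per-expert nn.Linear path working unchanged.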

Verified: under v5, loss and expert-LoRA values match the v4 baseline; v4 backward compatibility is preserved.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-20 13:21:29 +08:00
__init__.py feat(sft): AMX MoE SFT backend with LoRA support 2026-04-08 23:11:00 +08:00
amx.py feat(sft): AMX MoE SFT backend with LoRA support 2026-04-08 23:11:00 +08:00
arch.py feat(sft): support transformers v5 fused expert format 2026-04-20 13:21:29 +08:00
autograd.py feat(sft): support transformers v5 fused expert format 2026-04-20 13:21:29 +08:00
base.py feat(sft): AMX MoE SFT backend with LoRA support 2026-04-08 23:11:00 +08:00
config.py refactor(sft): share_backward_bb default True, share_cache_pool auto-derived 2026-04-09 20:10:38 +08:00
dist_utils.py refactor(sft): unify KTConfig field names with kt_ prefix, add share_cache_pool, remove dead code 2026-04-09 14:17:50 +08:00
layer.py feat(sft): support transformers v5 fused expert format 2026-04-20 13:21:29 +08:00
lora.py feat(sft): support transformers v5 fused expert format 2026-04-20 13:21:29 +08:00
weights.py feat(sft): support transformers v5 fused expert format 2026-04-20 13:21:29 +08:00
wrapper.py feat(sft): support transformers v5 fused expert format 2026-04-20 13:21:29 +08:00