Fix the broken link

This commit is contained in:
TangJingqi 2024-08-16 10:59:34 +08:00
parent e81fc482ff
commit 7199699d78

```diff
@@ -165,7 +165,7 @@ Through these two rules, we place all previously unmatched layers (and their sub
 ## Muti-GPU
 If you have multiple GPUs, you can set the device for each module to different GPUs.
-DeepseekV2-Chat got 60 layers, if we got 2 GPUs, we can allocate 30 layers to each GPU. Complete multi GPU rule examples [here](ktransformers/optimize/optimize_rules).
+DeepseekV2-Chat got 60 layers, if we got 2 GPUs, we can allocate 30 layers to each GPU. Complete multi GPU rule examples [here](https://github.com/kvcache-ai/ktransformers/blob/main/ktransformers/optimize/optimize_rules/DeepSeek-V2-Chat-multi-gpu.yaml).
 <p align="center">
```
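The linked rule file splits DeepseekV2-Chat's 60 layers across two GPUs by regex-matching layer names and pinning each half to a device. A minimal sketch of that pattern (the exact classes and kwargs in the real `DeepSeek-V2-Chat-multi-gpu.yaml` may differ; the regexes and device names below are illustrative):

```yaml
# Sketch of a ktransformers optimize rule for two GPUs:
# layers 0-29 on cuda:0, layers 30-59 on cuda:1.
- match:
    name: "^model\\.layers\\.([0-9]|[1-2][0-9])\\."   # layers 0..29
  replace:
    class: "default"
    kwargs:
      generate_device: "cuda:0"
      prefill_device: "cuda:0"
- match:
    name: "^model\\.layers\\.([3-5][0-9])\\."          # layers 30..59
  replace:
    class: "default"
    kwargs:
      generate_device: "cuda:1"
      prefill_device: "cuda:1"
```

Each `match.name` is a regex over module names, so the layer ranges are encoded in the pattern; consult the linked YAML for the authoritative rule set.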