# llama.cpp/examples/speculative
Demonstration of speculative decoding and tree-based speculative decoding techniques
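Speculative decoding pairs a small draft model with the large target model: the draft model cheaply proposes a short run of tokens, and the target model verifies them, accepting or rejecting each proposal so the final output still follows the target model's distribution. The sketch below is a minimal, self-contained C++ illustration of that draft-then-verify loop; it is not the code in `speculative.cpp`, and the toy `draft_dist` / `target_dist` functions are hypothetical stand-ins for real draft- and target-model evaluations.

```cpp
#include <algorithm>
#include <cstdio>
#include <random>
#include <vector>

using Token = int;

// Hypothetical toy "models": fixed distributions over a 3-token vocabulary.
// In the real example these would be evaluations of the draft and target
// GGUF models conditioned on the current context.
static std::vector<float> draft_dist (const std::vector<Token> & /*ctx*/) { return {0.70f, 0.20f, 0.10f}; }
static std::vector<float> target_dist(const std::vector<Token> & /*ctx*/) { return {0.55f, 0.35f, 0.10f}; }

static Token sample(const std::vector<float> & p, std::mt19937 & rng) {
    std::discrete_distribution<int> d(p.begin(), p.end());
    return d(rng);
}

int main() {
    std::mt19937 rng(0);
    std::uniform_real_distribution<float> uni(0.0f, 1.0f);

    std::vector<Token> ctx = {0}; // "prompt" tokens
    const int n_draft = 4;        // tokens proposed per speculative step

    // 1) The cheap draft model proposes n_draft tokens autoregressively.
    std::vector<Token> proposal;
    std::vector<Token> dctx = ctx;
    for (int i = 0; i < n_draft; ++i) {
        const Token t = sample(draft_dist(dctx), rng);
        proposal.push_back(t);
        dctx.push_back(t);
    }

    // 2) The expensive target model verifies the proposal. Each drafted token
    //    is accepted with probability min(1, p_target / p_draft); on the first
    //    rejection we stop and sample that position from the target model
    //    instead (simplified here: full rejection sampling would draw from the
    //    normalized residual max(0, p_target - p_draft)).
    for (const Token t : proposal) {
        const float p_t = target_dist(ctx)[t];
        const float p_d = draft_dist (ctx)[t];
        if (uni(rng) <= std::min(1.0f, p_t / p_d)) {
            ctx.push_back(t);                             // accepted draft token
        } else {
            ctx.push_back(sample(target_dist(ctx), rng)); // fall back to target
            break;
        }
    }

    for (const Token t : ctx) {
        std::printf("%d ", t);
    }
    std::printf("\n");
    return 0;
}
```

Tree-based speculative decoding generalizes the single drafted sequence above to a small tree of candidate continuations, which the target model verifies in one batched pass so that more drafted tokens can be accepted per target evaluation.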
More info: