Mirror of https://github.com/LostRuins/koboldcpp.git, synced 2025-09-05 22:59:05 +00:00
Last commit message:

* add common_json w/ support for truncated json healing
* add common_chat_msg_diff
* partial common_chat_parse
* refactor parser w/ optionals
* server: wire chat diffs in stream mode
* fix trigger of thinking models (must happen after thoughts are closed)
* fix functionary v3.2 raw python!
* rename: common_chat_syntax (now contains format)
* rm common_regex.at_start
* don't return empty <think></think>
* accommodate yet another deepseek r1 distill fantasy syntax (`<|tool▁calls|>`)
* fix QwQ 32B tool call parsing after thoughts (hermes2)
* better logs for grammar triggers
* consume spaces after parse_json_tool_calls
* fix required tool calls w/ thinking models that have pre-opened thinking tags
* fix thinking model's initial trigger + test qwq's template
* run most test_tool_call tests in stream + non-stream modes
* make functionary v3.2 parsing more strict (differentiate first match from others)
* send final diff from server, to close off raw python arguments
* support partial content streaming in Generic mode
* tool-call: allow content prelude before hermes2 tool calls (for Qwen2.5)
* Update function-calling.md
* Update tool_bench.py
* chat-parser: remove input from exception (llm output may contain PII)

Co-authored-by: ochafik <ochafik@google.com>
Co-authored-by: Olivier Chafik <ochafik@users.noreply.github.com>
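The first bullet, "common_json w/ support for truncated json healing", is about making a partially streamed JSON payload (for example, a tool call that is still being generated) parseable before it is complete. The sketch below is only a minimal illustration of that idea, not the repository's `common_json` code: it tracks open strings, objects, and arrays in a truncated fragment and appends the closing tokens needed to yield valid JSON.

```cpp
// Hypothetical sketch of "truncated json healing" for partial streaming.
// Not the actual common_json implementation in this repository.
#include <iostream>
#include <string>
#include <vector>

static std::string heal_truncated_json(const std::string &partial) {
    std::string healed = partial;
    std::vector<char> closers;   // pending '}' / ']' in closing order
    bool in_string = false;

    for (size_t i = 0; i < healed.size(); i++) {
        char c = healed[i];
        if (in_string) {
            if (c == '\\') { i++; }             // skip escaped character
            else if (c == '"') { in_string = false; }
        } else if (c == '"') { in_string = true; }
        else if (c == '{')   { closers.push_back('}'); }
        else if (c == '[')   { closers.push_back(']'); }
        else if (c == '}' || c == ']') { if (!closers.empty()) closers.pop_back(); }
    }

    if (in_string) healed += '"';               // close an unterminated string

    // Patch a dangling key/value separator or trailing comma.
    size_t end = healed.find_last_not_of(" \t\r\n");
    if (end != std::string::npos) {
        if (healed[end] == ':')      healed += " null";
        else if (healed[end] == ',') healed.erase(end);
    }

    while (!closers.empty()) {                  // close open objects/arrays
        healed += closers.back();
        closers.pop_back();
    }
    return healed;
}

int main() {
    // A tool call cut off mid-stream becomes parseable JSON:
    std::cout << heal_truncated_json(R"({"name": "get_weather", "arguments": {"location": "Par)") << "\n";
    std::cout << heal_truncated_json(R"({"ids": [1, 2,)") << "\n";
}
```

Running this prints healed versions of the two cut-off fragments, e.g. `{"ids": [1, 2]}` for the second input; the real parser additionally has to track how much of the healed value is trustworthy when emitting incremental chat diffs.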
Directory contents:
templates
.editorconfig
ggml-vocab-aquila.gguf
ggml-vocab-baichuan.gguf
ggml-vocab-bert-bge.gguf
ggml-vocab-bert-bge.gguf.inp
ggml-vocab-bert-bge.gguf.out
ggml-vocab-chameleon.gguf.inp
ggml-vocab-chameleon.gguf.out
ggml-vocab-command-r.gguf
ggml-vocab-command-r.gguf.inp
ggml-vocab-command-r.gguf.out
ggml-vocab-deepseek-coder.gguf
ggml-vocab-deepseek-coder.gguf.inp
ggml-vocab-deepseek-coder.gguf.out
ggml-vocab-deepseek-llm.gguf
ggml-vocab-deepseek-llm.gguf.inp
ggml-vocab-deepseek-llm.gguf.out
ggml-vocab-deepseek-r1-qwen.gguf.inp
ggml-vocab-deepseek-r1-qwen.gguf.out
ggml-vocab-falcon.gguf
ggml-vocab-falcon.gguf.inp
ggml-vocab-falcon.gguf.out
ggml-vocab-gpt-2.gguf
ggml-vocab-gpt-2.gguf.inp
ggml-vocab-gpt-2.gguf.out
ggml-vocab-gpt-4o.gguf.inp
ggml-vocab-gpt-4o.gguf.out
ggml-vocab-gpt-neox.gguf
ggml-vocab-llama-bpe.gguf
ggml-vocab-llama-bpe.gguf.inp
ggml-vocab-llama-bpe.gguf.out
ggml-vocab-llama-spm.gguf
ggml-vocab-llama-spm.gguf.inp
ggml-vocab-llama-spm.gguf.out
ggml-vocab-llama4.gguf.inp
ggml-vocab-llama4.gguf.out
ggml-vocab-mpt.gguf
ggml-vocab-mpt.gguf.inp
ggml-vocab-mpt.gguf.out
ggml-vocab-phi-3.gguf
ggml-vocab-phi-3.gguf.inp
ggml-vocab-phi-3.gguf.out
ggml-vocab-pixtral.gguf.inp
ggml-vocab-pixtral.gguf.out
ggml-vocab-qwen2.gguf
ggml-vocab-qwen2.gguf.inp
ggml-vocab-qwen2.gguf.out
ggml-vocab-refact.gguf
ggml-vocab-refact.gguf.inp
ggml-vocab-refact.gguf.out
ggml-vocab-roberta-bpe.gguf.inp
ggml-vocab-roberta-bpe.gguf.out
ggml-vocab-starcoder.gguf
ggml-vocab-starcoder.gguf.inp
ggml-vocab-starcoder.gguf.out