Mirror of https://github.com/LostRuins/koboldcpp.git (synced 2026-05-07 09:02:04 +00:00)
* PoC: add chat template heuristics

  The fallback chat template adapter, Vicuna, is not ideal in some cases (e.g. a test against a sub-portion of the BBC news classification task on Kaggle gave 82% accuracy with Vicuna versus 88% with the official ChatML format for a q4_k_m Qwen 2.5 3B-Instruct gguf). This PR adds a simple proof-of-concept heuristic that inspects the model's chat template and upgrades the adapter when it can.
* gemma 2 heuristic
* Phi 4, Llama 3.x heuristics
* better qwen vs generic heuristic
* cleanup
* mistral (generic) heuristic
* fix sys msg for mistral
* phi 3.5
* mistral v3
* cohere (aya expanse 32b based)
* only derive from chat template if AutoGuess
* add notes about alpaca fallbacks
* added AutoGuess.json dummy
* add mistral v7
* switch to using a json list with search strings
Adapter files in this directory:

* Alpaca.json
* AutoGuess.json
* ChatML.json
* Command-R.json
* Gemma-2.json
* Llama-2-Chat.json
* Llama-3.json
* Metharme.json
* Mistral-V1.json
* Mistral-V2-V3.json
* Mistral-V3-Tekken-V7.json
* Phi-3.json
* Vicuna.json
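The "json list with search strings" approach the commit describes could be sketched roughly as follows. This is a hypothetical illustration, not the actual AutoGuess.json contents or koboldcpp code: the rule list below (which marker strings map to which adapter file) is an assumption based on the well-known markers of each template family, and `guess_adapter` is an invented helper name.

```python
# Hypothetical sketch of a search-string heuristic: each rule lists
# substrings that, if all present in the model's embedded chat template,
# select the corresponding adapter file. The markers shown are assumed
# examples, not the real AutoGuess.json entries.
AUTOGUESS_RULES = [
    {"search": ["<|im_start|>"], "name": "ChatML.json"},
    {"search": ["<start_of_turn>"], "name": "Gemma-2.json"},
    {"search": ["<|start_header_id|>"], "name": "Llama-3.json"},
    {"search": ["[INST]", "[/INST]"], "name": "Mistral-V1.json"},
]

def guess_adapter(chat_template: str):
    """Return the first adapter whose search strings all occur in the template,
    or None so the caller can fall back to the default (e.g. Alpaca/Vicuna)."""
    for rule in AUTOGUESS_RULES:
        if all(s in chat_template for s in rule["search"]):
            return rule["name"]
    return None
```

Keeping the rules in a JSON file (as the final commit bullet does) rather than hard-coding them means new template families can be supported without a code change; rule order matters, since the first match wins.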