# llama-eval

Simple evaluation tool for llama.cpp with support for multiple datasets.

For a full description, usage examples, and sample results, see:

## Quick start

```bash
# Single server
python3 llama-eval.py \
  --server http://localhost:8033 \
  --model my-model \
  --dataset gsm8k --n_cases 100 \
  --grader-type regex --threads 32

# Multiple servers (comma-separated URLs and thread counts)
python3 llama-eval.py \
  --server http://server1:8033,http://server2:8033 \
  --server-name server1,server2 \
  --threads 16,16 \
  --dataset aime2025 --n_cases 240 \
  --grader-type regex
```
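
The `--grader-type regex` option suggests that answers are graded by pattern matching rather than by a judge model. A minimal sketch of that idea is below; `grade_regex` is a hypothetical helper written for illustration and is not the tool's actual implementation:

```python
import re

def grade_regex(output: str, expected: str) -> bool:
    """Hypothetical regex grader sketch: pull the last number out of the
    model's output and compare it to the expected answer string."""
    # Strip thousands separators, then find all integer/decimal tokens.
    matches = re.findall(r"-?\d+(?:\.\d+)?", output.replace(",", ""))
    if not matches:
        return False
    # Grade against the final number in the response.
    return matches[-1] == expected

print(grade_regex("Adding them up, the answer is 42.", "42"))  # True
print(grade_regex("I am not sure.", "7"))                      # False
```

Datasets like GSM8K and AIME have short numeric final answers, which is why a simple extract-and-compare grader can work without an LLM judge.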