koboldcpp/tools/server/tests
Latest commit: a3a5897d93 by Concedo, 2026-03-30 21:41:19 +08:00
Merge branch 'upstream' into concedo_experimental

# Conflicts:
#	.devops/intel.Dockerfile
#	.github/workflows/python-type-check.yml
#	embd_res/templates/Qwen3.5-4B.jinja
#	examples/model-conversion/scripts/causal/compare-logits.py
#	examples/model-conversion/scripts/utils/check-nmse.py
#	examples/model-conversion/scripts/utils/compare_tokens.py
#	examples/model-conversion/scripts/utils/semantic_check.py
#	examples/sycl/build.sh
#	examples/sycl/run-llama2.sh
#	ggml/src/ggml-hexagon/htp/flash-attn-ops.c
#	ggml/src/ggml-hexagon/htp/hex-dma.h
#	ggml/src/ggml-hexagon/htp/rope-ops.c
#	scripts/gen-unicode-data.py
#	tests/test-chat.cpp
Name              Last commit message                                            Last commit date
unit              common : add standard Hugging Face cache support (#20775)      2026-03-24 07:30:33 +01:00
.gitignore        llama : move end-user examples to tools directory (#13249)     2025-05-02 20:27:13 +02:00
conftest.py       server : add Anthropic Messages API support (#17570)           2025-11-28 12:57:04 +01:00
pytest.ini        llama : move end-user examples to tools directory (#13249)     2025-05-02 20:27:13 +02:00
requirements.txt  server: /v1/responses (partial) (#18486)                       2026-01-21 17:47:23 +01:00
tests.sh          scripts : make the shell scripts cross-platform (#14341)       2025-06-30 10:17:18 +02:00
utils.py          ci : bump ty to 0.0.26 (#21156)                                2026-03-30 09:29:15 +02:00