koboldcpp/tools/server/tests

Latest commit 0d43bdc46d by Concedo (2026-01-17 00:41:28 +08:00): Merge branch 'upstream' into concedo_experimental

# Conflicts:
#	examples/batched/batched.cpp
#	ggml/src/ggml-opencl/CMakeLists.txt
#	ggml/src/ggml-opencl/ggml-opencl.cpp
#	src/llama-context.cpp
#	tools/cli/README.md
#	tools/completion/README.md
#	tools/server/README.md
Name              Last commit message                                                     Last commit date
unit              server: improve slots scheduling for n_cmpl (#18789)                    2026-01-15 17:10:28 +01:00
.gitignore        llama : move end-user examples to tools directory (#13249)              2025-05-02 20:27:13 +02:00
conftest.py       server : add Anthropic Messages API support (#17570)                    2025-11-28 12:57:04 +01:00
pytest.ini        llama : move end-user examples to tools directory (#13249)              2025-05-02 20:27:13 +02:00
requirements.txt  requirements : update transformers/torch for Embedding Gemma (#15828)   2025-09-09 06:06:52 +02:00
tests.sh          scripts : make the shell scripts cross-platform (#14341)                2025-06-30 10:17:18 +02:00
utils.py          server: add auto-sleep after N seconds of idle (#18228)                 2025-12-21 02:24:42 +01:00