koboldcpp/tools/server
Latest commit: bdff33e0de by Concedo, 2025-07-17 00:28:37 +08:00

Merge branch 'upstream' into concedo_experimental

Conflicts:
- .github/workflows/build.yml
- README.md
- ci/run.sh
- docs/build.md
- examples/CMakeLists.txt
- examples/parallel/parallel.cpp
- ggml/CMakeLists.txt
- ggml/src/CMakeLists.txt
- scripts/server-bench.py
- src/llama-kv-cache-unified.cpp
- tests/test-backend-ops.cpp
- tools/batched-bench/batched-bench.cpp
- tools/server/README.md
| Name | Last commit | Last updated |
|------|-------------|--------------|
| bench | Merge branch 'upstream' into concedo_experimental | 2025-05-03 12:15:36 +08:00 |
| public | server : fix appearance of the chats list context menu for Safari (#14322) | 2025-06-29 19:29:57 +02:00 |
| public_legacy | llama : move end-user examples to tools directory (#13249) | 2025-05-02 20:27:13 +02:00 |
| public_simplechat | Merge branch 'upstream' into concedo_experimental | 2025-05-03 12:15:36 +08:00 |
| tests | Merge branch 'upstream' into concedo_experimental | 2025-07-07 17:46:58 +08:00 |
| themes | Merge branch 'upstream' into concedo_experimental | 2025-05-03 12:15:36 +08:00 |
| webui | server : fix appearance of the chats list context menu for Safari (#14322) | 2025-06-29 19:29:57 +02:00 |
| chat-llama2.sh | scripts : make the shell scripts cross-platform (#14341) | 2025-06-30 10:17:18 +02:00 |
| chat.mjs | llama : move end-user examples to tools directory (#13249) | 2025-05-02 20:27:13 +02:00 |
| chat.sh | scripts : make the shell scripts cross-platform (#14341) | 2025-06-30 10:17:18 +02:00 |
| server.cpp | server : pre-calculate EOG logit biases (#14721) | 2025-07-16 14:04:12 +03:00 |
| utils.hpp | scripts : benchmark for HTTP server throughput (#14668) | 2025-07-14 13:14:30 +02:00 |