koboldcpp/tools/server

Latest commit: fdcb281a3a by LostRuins Concedo (2025-11-08 10:34:17 +08:00)
Merge commit '2f966b8ed8' into concedo_experimental

# Conflicts:
#	.github/workflows/release.yml
#	docs/docker.md
#	ggml/src/CMakeLists.txt
#	ggml/src/ggml-cpu/CMakeLists.txt
#	tests/test-backend-ops.cpp
#	tests/test-thread-safety.cpp
#	tools/batched-bench/batched-bench.cpp
#	tools/mtmd/clip.cpp
Name | Last commit | Date
bench | Merge branch 'upstream' into concedo_experimental | 2025-08-23 11:35:28 +08:00
public | webui: auto-refresh /props on inference start to resync model metadata (#16784) | 2025-11-01 19:49:51 +01:00
public_legacy | grammar : support array references in json schema (#16792) | 2025-10-28 09:37:52 +01:00
public_simplechat | Merge branch 'upstream' into concedo_experimental | 2025-05-03 12:15:36 +08:00
tests | Merge commit '2f966b8ed8' into concedo_experimental | 2025-11-08 10:34:17 +08:00
themes | Merge branch 'upstream' into concedo_experimental | 2025-05-03 12:15:36 +08:00
webui | webui: auto-refresh /props on inference start to resync model metadata (#16784) | 2025-11-01 19:49:51 +01:00
chat-llama2.sh | scripts : make the shell scripts cross-platform (#14341) | 2025-06-30 10:17:18 +02:00
chat.mjs | llama : move end-user examples to tools directory (#13249) | 2025-05-02 20:27:13 +02:00
chat.sh | scripts : make the shell scripts cross-platform (#14341) | 2025-06-30 10:17:18 +02:00
server.cpp | clip : use FA (#16837) | 2025-11-02 21:21:48 +01:00
utils.hpp | server : support unified cache across slots (#16736) | 2025-11-02 18:14:04 +02:00