koboldcpp/tools/server
Latest commit: 8b8396c30c by Concedo — Merge branch 'upstream' into concedo_experimental (2025-08-23 11:35:28 +08:00)

Merge conflicts resolved in:
- README.md
- docs/build-s390x.md
- examples/llama.vim
- ggml/src/ggml-cann/aclnn_ops.cpp
- ggml/src/ggml-cann/common.h
- scripts/compare-llama-bench.py
- src/CMakeLists.txt
- tests/test-backend-ops.cpp
- tools/llama-bench/README.md
- tools/llama-bench/llama-bench.cpp
- tools/server/README.md
| Name               | Last commit message                                                                    | Date                      |
|--------------------|----------------------------------------------------------------------------------------|---------------------------|
| bench/             | Merge branch 'upstream' into concedo_experimental                                      | 2025-08-23 11:35:28 +08:00 |
| public/            | server : fix webui (#15462)                                                            | 2025-08-21 08:19:22 +03:00 |
| public_legacy/     | llama : move end-user examples to tools directory (#13249)                             | 2025-05-02 20:27:13 +02:00 |
| public_simplechat/ | Merge branch 'upstream' into concedo_experimental                                      | 2025-05-03 12:15:36 +08:00 |
| tests/             | Merge branch 'upstream' into concedo_experimental                                      | 2025-08-23 11:35:28 +08:00 |
| themes/            | Merge branch 'upstream' into concedo_experimental                                      | 2025-05-03 12:15:36 +08:00 |
| webui/             | server : fix webui (#15462)                                                            | 2025-08-21 08:19:22 +03:00 |
| chat-llama2.sh     | scripts : make the shell scripts cross-platform (#14341)                               | 2025-06-30 10:17:18 +02:00 |
| chat.mjs           | llama : move end-user examples to tools directory (#13249)                             | 2025-05-02 20:27:13 +02:00 |
| chat.sh            | scripts : make the shell scripts cross-platform (#14341)                               | 2025-06-30 10:17:18 +02:00 |
| server.cpp         | server : Support multimodal completion and embeddings prompts in JSON format (#15108)  | 2025-08-22 10:10:14 +02:00 |
| utils.hpp          | server : Support multimodal completion and embeddings prompts in JSON format (#15108)  | 2025-08-22 10:10:14 +02:00 |