koboldcpp/examples/server
Latest commit: Concedo ea9bd61e47 Merge commit '64eda5deb9' into concedo_experimental (2025-04-12 08:31:22 +08:00)
# Conflicts:
#	.devops/cuda.Dockerfile
#	.devops/intel.Dockerfile
#	.devops/llama-cli-cann.Dockerfile
#	.devops/musa.Dockerfile
#	.devops/rocm.Dockerfile
#	.devops/vulkan.Dockerfile
#	.github/workflows/build.yml
#	.github/workflows/docker.yml
#	README.md
#	docs/backend/SYCL.md
#	examples/llava/clip.cpp
#	examples/server_embd.py
#	ggml/src/ggml-cann/acl_tensor.cpp
#	ggml/src/ggml-cann/aclnn_ops.cpp
#	ggml/src/ggml-cann/aclnn_ops.h
#	ggml/src/ggml-cann/ggml-cann.cpp
#	src/CMakeLists.txt
#	tests/test-chat-template.cpp
| Name | Last commit | Date |
|------|-------------|------|
| bench | Merge branch 'upstream' into concedo_experimental | 2025-01-03 11:56:20 +08:00 |
| public | server : webui : Improve Chat Input with Auto-Sizing Textarea (#12785) | 2025-04-08 11:14:59 +02:00 |
| public_legacy | tool-call: fix Qwen 2.5 Coder support, add micro benchmarks, support trigger patterns for lazy grammars (#12034) | 2025-03-05 13:05:13 +00:00 |
| public_simplechat | rewritten checkpoint 1 - before coopmat | 2024-12-13 16:55:23 +08:00 |
| tests | Merge commit '64eda5deb9' into concedo_experimental | 2025-04-12 08:31:22 +08:00 |
| themes | Merge branch 'upstream' into concedo_experimental | 2024-12-19 11:57:43 +08:00 |
| webui | server : webui : Improve Chat Input with Auto-Sizing Textarea (#12785) | 2025-04-08 11:14:59 +02:00 |
| chat-llama2.sh | chmod : make scripts executable (#2675) | 2023-08-23 17:29:09 +03:00 |
| chat.mjs | server : revamp chat UI with vuejs and daisyui (#10175) | 2024-11-07 17:31:10 -04:00 |
| chat.sh | server : fix context shift (#5195) | 2024-01-30 20:17:30 +02:00 |
| httplib.h | server : Support listening on a unix socket (#12613) | 2025-03-27 23:41:04 +01:00 |
| server.cpp | server : fix thread.join() on exit (#12831) | 2025-04-08 18:37:06 +02:00 |
| utils.hpp | ci: detach common from the library (#12827) | 2025-04-09 10:11:11 +02:00 |