koboldcpp/examples/server

Latest commit: b154bd3671 (Concedo) Merge branch 'upstream' into concedo_experimental, 2025-01-10 17:57:38 +08:00
Conflicts resolved: README.md, docs/build.md, docs/development/HOWTO-add-model.md, tests/test-backend-ops.cpp, tests/test-chat-template.cpp
| Name | Last commit | Date |
|------|-------------|------|
| bench | Merge branch 'upstream' into concedo_experimental | 2025-01-03 11:56:20 +08:00 |
| public | server : add tooltips to settings and themes btn (#11154) | 2025-01-09 11:28:29 +01:00 |
| public_legacy | sampling : refactor + optimize penalties sampler (#10803) | 2024-12-16 12:31:14 +02:00 |
| public_simplechat | rewritten checkpoint 1 - before coopmat | 2024-12-13 16:55:23 +08:00 |
| tests | Merge commit '017cc5f446' into concedo_experimental | 2025-01-08 23:15:21 +08:00 |
| themes | Merge branch 'upstream' into concedo_experimental | 2024-12-19 11:57:43 +08:00 |
| webui | server : add tooltips to settings and themes btn (#11154) | 2025-01-09 11:28:29 +01:00 |
| chat-llama2.sh | chmod : make scripts executable (#2675) | 2023-08-23 17:29:09 +03:00 |
| chat.mjs | server : revamp chat UI with vuejs and daisyui (#10175) | 2024-11-07 17:31:10 -04:00 |
| chat.sh | server : fix context shift (#5195) | 2024-01-30 20:17:30 +02:00 |
| httplib.h | Server: version bump for httplib and json (#6169) | 2024-03-20 13:30:36 +01:00 |
| server.cpp | server : fix extra BOS in infill endpoint (#11106) | 2025-01-06 15:36:08 +02:00 |
| utils.hpp | llama : use LLAMA_TOKEN_NULL (#11062) | 2025-01-06 10:52:15 +02:00 |