koboldcpp/examples/server

Latest commit: 159c47f0e6 by Concedo, 2025-02-24 11:55:14 +08:00
Merge commit '335eb04a91' into concedo_experimental

# Conflicts:
#	.github/workflows/build.yml
#	CONTRIBUTING.md
#	Makefile
#	docs/build.md
#	examples/llama.swiftui/llama.swiftui/UI/ContentView.swift
#	examples/run/run.cpp
#	ggml/CMakeLists.txt
#	ggml/src/ggml-cpu/CMakeLists.txt
#	ggml/src/ggml-cuda/CMakeLists.txt
#	ggml/src/ggml-musa/CMakeLists.txt

Name               Last commit message                                                      Last commit date
bench              Merge branch 'upstream' into concedo_experimental                        2025-01-03 11:56:20 +08:00
public             server (webui): Fix Premature Submission During IME Conversion (#11971)  2025-02-20 19:43:22 +01:00
public_legacy      sampling : refactor + optimize penalties sampler (#10803)                2024-12-16 12:31:14 +02:00
public_simplechat  rewritten checkpoint 1 - before coopmat                                  2024-12-13 16:55:23 +08:00
tests              Merge branch 'upstream' into concedo_experimental                        2025-02-20 23:17:20 +08:00
themes             Merge branch 'upstream' into concedo_experimental                        2024-12-19 11:57:43 +08:00
webui              server (webui): Fix Premature Submission During IME Conversion (#11971)  2025-02-20 19:43:22 +01:00
chat-llama2.sh     chmod : make scripts executable (#2675)                                  2023-08-23 17:29:09 +03:00
chat.mjs           server : revamp chat UI with vuejs and daisyui (#10175)                  2024-11-07 17:31:10 -04:00
chat.sh            server : fix context shift (#5195)                                       2024-01-30 20:17:30 +02:00
httplib.h          server : bump httplib to 0.19.0 (#11908)                                 2025-02-16 17:11:22 +00:00
server.cpp         speculative : update default params (#11954)                             2025-02-19 13:29:42 +02:00
utils.hpp          server : disable Nagle's algorithm (#12020)                              2025-02-22 11:46:31 +01:00