koboldcpp/include
Latest commit: f13498df13 by Concedo, 2025-02-01 17:14:59 +08:00
Merge branch 'upstream' into concedo_experimental

# Conflicts:
#	.devops/tools.sh
#	.devops/vulkan.Dockerfile
#	.github/workflows/build.yml
#	.github/workflows/docker.yml
#	.github/workflows/server.yml
#	Makefile
#	README.md
#	cmake/llama-config.cmake.in
#	common/CMakeLists.txt
#	examples/gbnf-validator/gbnf-validator.cpp
#	examples/run/run.cpp
#	examples/server/README.md
#	examples/server/tests/README.md
#	ggml/src/CMakeLists.txt
#	ggml/src/ggml-hip/CMakeLists.txt
#	scripts/sync-ggml.last
#	tests/CMakeLists.txt
#	tests/test-backend-ops.cpp
#	tests/test-chat-template.cpp
#	tests/test-grammar-integration.cpp
Name                 Last updated                 Last commit
CL                   2023-04-21 00:35:54 +08:00   wip dont use
vulkan               2024-12-13 17:04:19 +08:00   merge checkpoint 2 - functional merge without q4_0_4_4 (need regen shaders)
cblas.h              2023-04-21 00:35:54 +08:00   wip dont use
clblast.h            2024-02-21 14:35:38 +08:00   Revert "clblast up ver"
clblast_c.h          2024-02-21 14:35:38 +08:00   Revert "clblast up ver"
clblast_half.h       2023-05-25 10:18:12 +08:00   upgraded clblast
clblast_netlib_c.h   2023-05-16 12:33:24 +08:00   Not working, don't use. testing a merge
llama-cpp.h          2025-01-12 11:32:42 +02:00   llama : add llama_vocab, functions -> methods, naming (#11110)
llama.h              2025-02-01 17:14:59 +08:00   Merge branch 'upstream' into concedo_experimental
openblas_config.h    2023-04-21 00:35:54 +08:00   wip dont use