commit 02f92f6ecc
Author: Concedo
Date:   2024-06-30 10:59:42 +08:00

    Merge branch 'upstream' into concedo_experimental

    # Conflicts:
    #	.devops/full-cuda.Dockerfile
    #	.devops/full-rocm.Dockerfile
    #	.devops/llama-cli-cuda.Dockerfile
    #	.devops/llama-cli-rocm.Dockerfile
    #	.devops/llama-cli-vulkan.Dockerfile
    #	.devops/llama-cpp-cuda.srpm.spec
    #	.devops/llama-server-cuda.Dockerfile
    #	.devops/llama-server-rocm.Dockerfile
    #	.devops/llama-server-vulkan.Dockerfile
    #	.github/workflows/build.yml
    #	.github/workflows/docker.yml
    #	CMakeLists.txt
    #	Makefile
    #	README.md
    #	examples/llama.android/llama/src/main/cpp/CMakeLists.txt
    #	flake.lock
    #	ggml/CMakeLists.txt
    #	ggml/src/CMakeLists.txt
    #	grammars/README.md
    #	scripts/sync-ggml-am.sh
    #	scripts/sync-ggml.last
    #	tests/test-chat-template.cpp
    #	tests/test-grammar-integration.cpp
    #	tests/test-json-schema-to-grammar.cpp

commit 9c10486204
Author: Concedo
Date:   2024-06-29 12:14:38 +08:00

    merge the file structure refactor, testing

commit f675b20a3b
Author: kustaaya
Date:   2024-06-27 10:58:54 +02:00

    Added support for Viking pre-tokenizer (#8135)

    Co-authored-by: kustaaya <kustaaya@protonmail.com>

commit f3f65429c4
Author: Georgi Gerganov
Date:   2024-06-26 18:33:02 +03:00

    llama : reorganize source code + improve CMake (#8006)

    * scripts : update sync [no ci]
    * files : relocate [no ci]
    * ci : disable kompute build [no ci]
    * cmake : fixes [no ci]
    * server : fix mingw build
      ggml-ci
    * cmake : minor [no ci]
    * cmake : link math library [no ci]
    * cmake : build normal ggml library (not object library) [no ci]
    * cmake : fix kompute build
      ggml-ci
    * make,cmake : fix LLAMA_CUDA + replace GGML_CDEF_PRIVATE
      ggml-ci
    * move public backend headers to the public include directory (#8122)
      * move public backend headers to the public include directory
      * nix test
      * spm : fix metal header
      ---------
      Co-authored-by: Georgi Gerganov <ggerganov@gmail.com>
    * scripts : fix sync paths [no ci]
    * scripts : sync ggml-blas.h [no ci]

    ---------

    Co-authored-by: slaren <slarengh@gmail.com>

commit 02893d3d7d
Author: Concedo
Date:   2024-02-21 14:35:38 +08:00

    Revert "clblast up ver"

    This reverts commit edb3dc362a.

commit edb3dc362a
Author: Concedo
Date:   2024-02-11 17:01:09 +08:00

    clblast up ver

commit 2a4a7241e6
Author: Concedo
Date:   2024-01-25 23:01:44 +08:00

    Merge branch 'vulkan_test' into concedo_experimental

    # Conflicts:
    #	CMakeLists.txt
    #	Makefile
    #	llama.cpp

commit 72f99f0545
Author: Concedo
Date:   2024-01-25 18:29:45 +08:00

    changes required to get vulkan working on windows

commit d2da155661
Author: Concedo
Date:   2023-05-25 10:18:12 +08:00

    upgraded clblast

commit e4e6994353
Author: Concedo
Date:   2023-05-16 12:33:24 +08:00

    Not working, don't use. testing a merge

commit 4fa3dfe8bc
Author: Concedo
Date:   2023-04-22 10:57:38 +08:00

    just doesn't work properly on windows. will leave it as a manual flag for others

commit f555db44ec
Author: Concedo
Date:   2023-04-21 23:24:09 +08:00

    adding the libraries for cublas first. but i cannot get the kernel to work yet

commit 07bb31b034
Author: Concedo
Date:   2023-04-21 00:35:54 +08:00

    wip dont use