Concedo | 6da5a63852 | fix for uploaded wav files being incomplete due to fragmentation when converting to b64 | 2024-10-20 17:47:19 +08:00

bebopkim | 7dac9982f9 | Metal: remove ggml_backend_metal_log_set_callback due to backend logging mechanism unification d6fe7ab (#1144) | 2024-10-06 14:54:33 +08:00

Concedo | 2d57f80af9 | Fix compilation on macos | 2024-10-03 15:10:30 +08:00

Concedo | e44ddf26ef | Merge branch 'upstream' into concedo_experimental | 2024-09-13 16:17:24 +08:00
# Conflicts:
# .github/workflows/build.yml
# .github/workflows/server.yml
# CMakeLists.txt
# Makefile
# examples/embedding/embedding.cpp
# examples/imatrix/imatrix.cpp
# examples/llama-bench/llama-bench.cpp
# examples/llava/MobileVLM-README.md
# examples/parallel/parallel.cpp
# examples/perplexity/perplexity.cpp
# examples/quantize/CMakeLists.txt
# examples/server/README.md
# examples/speculative/speculative.cpp
# tests/test-backend-ops.cpp

Concedo | 12fd16bfd4 | Merge commit 'df270ef745' into concedo_experimental | 2024-09-09 17:10:08 +08:00
# Conflicts:
# Makefile
# common/CMakeLists.txt
# common/common.h
# common/sampling.cpp
# common/sampling.h
# examples/infill/infill.cpp
# examples/llama-bench/llama-bench.cpp
# examples/quantize-stats/quantize-stats.cpp
# examples/server/server.cpp
# include/llama.h
# src/llama-sampling.cpp
# src/llama-sampling.h
# src/llama.cpp
# tests/test-grammar-integration.cpp
# tests/test-grammar-parser.cpp
# tests/test-json-schema-to-grammar.cpp
# tests/test-llama-grammar.cpp
# tests/test-sampling.cpp

Concedo | d220495dd4 | Merge branch 'upstream' into concedo_experimental | 2024-08-30 10:37:39 +08:00
# Conflicts:
# .devops/full-cuda.Dockerfile
# .devops/llama-cli-cuda.Dockerfile
# .devops/llama-server-cuda.Dockerfile
# .devops/llama-server-intel.Dockerfile
# .devops/llama-server-rocm.Dockerfile
# .devops/llama-server-vulkan.Dockerfile
# .devops/llama-server.Dockerfile
# .github/workflows/docker.yml
# docs/docker.md
# examples/llama-bench/llama-bench.cpp
# flake.lock
# ggml/include/ggml.h
# ggml/src/CMakeLists.txt
# scripts/sync-ggml.last
# src/llama.cpp
# tests/test-backend-ops.cpp
# tests/test-grad0.cpp
# tests/test-rope.cpp

Concedo | b2c1ff7a13 | Merge branch 'upstream' into concedo_experimental | 2024-08-27 17:46:40 +08:00
# Conflicts:
# .ecrc
# CMakePresets.json
# ci/run.sh
# docs/backend/SYCL.md
# ggml/src/CMakeLists.txt
# src/llama.cpp
# tests/test-backend-ops.cpp
# tests/test-sampling.cpp

Concedo | 2f7168779d | Merge branch 'concedo_experimental' of https://github.com/LostRuins/koboldcpp into concedo_experimental | 2024-06-06 20:26:57 +08:00

Concedo | 1ad56e9b6b | if quiet mode just show transcription event without text | 2024-06-06 20:26:47 +08:00

Lexi | 1c5e05e477 | whisper: fix printf format string (#894) | 2024-06-06 19:50:59 +08:00
This format string uses %d to print uint32_t and size_t, which is not
guaranteed to work. Instead, use PRIu32 for uint32_t, and %zu for size_t.

Concedo | 813cf829b5 | allow selecting multigpu on vulkan | 2024-06-06 18:36:56 +08:00

Concedo | b0a7d1aba6 | fixed makefile (+1 squashed commits) | 2024-06-02 15:21:48 +08:00
Squashed commits:
[ef6ddaf5] try fix makefile

Concedo | a65e0800ab | update docs, added gui for whisper | 2024-06-01 02:01:49 +08:00

Concedo | 961c789c91 | wav file resampling | 2024-05-30 13:41:58 +08:00

Concedo | f24aef8792 | initial whisper integration | 2024-05-29 23:13:11 +08:00