| File | Last commit | Date |
|---|---|---|
| bench | Merge branch 'upstream' into concedo_experimental | 2026-03-22 23:39:13 +08:00 |
| tests | Merge branch 'upstream' into concedo_experimental | 2026-05-14 19:04:04 +08:00 |
| chat-llama2.sh | scripts : make the shell scripts cross-platform (#14341) | 2025-06-30 10:17:18 +02:00 |
| chat.mjs | | |
| chat.sh | scripts : make the shell scripts cross-platform (#14341) | 2025-06-30 10:17:18 +02:00 |
| README-dev.md | ui: Restructure repo to use tools/ui folder and ui / UI / llama-ui / LLAMA_UI naming (#23064) | 2026-05-16 02:02:40 +02:00 |
| server-chat.cpp | Support for Codex CLI by skipping unsupported Responses tools (#23041) | 2026-05-15 09:03:24 +02:00 |
| server-chat.h | server: (router) Forward form-data to model server (Fixes #22044) (#22118) | 2026-04-27 23:55:00 +02:00 |
| server-common.cpp | server, webui: accept continue_final_message flag for vLLM API compat (#23012) | 2026-05-13 20:47:58 +02:00 |
| server-common.h | logs : reduce (#23021) | 2026-05-14 13:05:52 +03:00 |
| server-context.cpp | ui: Restructure repo to use tools/ui folder and ui / UI / llama-ui / LLAMA_UI naming (#23064) | 2026-05-16 02:02:40 +02:00 |
| server-context.h | ui: Restructure repo to use tools/ui folder and ui / UI / llama-ui / LLAMA_UI naming (#23064) | 2026-05-16 02:02:40 +02:00 |
| server-cors-proxy.h | server: (router) Forward form-data to model server (Fixes #22044) (#22118) | 2026-04-27 23:55:00 +02:00 |
| server-http.cpp | ui: Restructure repo to use tools/ui folder and ui / UI / llama-ui / LLAMA_UI naming (#23064) | 2026-05-16 02:02:40 +02:00 |
| server-http.h | server: support Vertex AI compatible API (#22545) | 2026-05-08 15:23:04 +02:00 |
| server-models.cpp | ui: Restructure repo to use tools/ui folder and ui / UI / llama-ui / LLAMA_UI naming (#23064) | 2026-05-16 02:02:40 +02:00 |
| server-models.h | ui: Restructure repo to use tools/ui folder and ui / UI / llama-ui / LLAMA_UI naming (#23064) | 2026-05-16 02:02:40 +02:00 |
| server-queue.cpp | server : print warning when HTTP timeout exceeded (#22907) | 2026-05-10 22:00:18 +03:00 |
| server-queue.h | server: allow router to report child instances sleep status (#20849) | 2026-03-22 18:33:52 +01:00 |
| server-task.cpp | logs : reduce (#23021) | 2026-05-14 13:05:52 +03:00 |
| server-task.h | spec : parallel drafting support (#22838) | 2026-05-11 19:09:43 +03:00 |
| server-tools.cpp | server : validate --tools CLI argument against known tool names (#22538) | 2026-05-05 06:35:27 +03:00 |
| server-tools.h | server: add built-in tools backend support (#20898) | 2026-03-27 10:07:11 +01:00 |
| server.cpp | ui: Restructure repo to use tools/ui folder and ui / UI / llama-ui / LLAMA_UI naming (#23064) | 2026-05-16 02:02:40 +02:00 |