Mirror of https://github.com/carlrobertoh/ProxyAI.git, synced 2026-05-08 18:31:19 +00:00.
* Add support for some extended llama.cpp parameters (top_k, top_p, min_p, and repeat_penalty)

  Added 'top_k', 'top_p', 'min_p', and 'repeat_penalty' fields to the llama.cpp request configuration. The default values for these fields match llama.cpp's own defaults, so leaving them untouched does not affect the model's response.

* Bump llm-client

---------

Co-authored-by: Carl-Robert Linnupuu <carlrobertoh@gmail.com>
| Path |
|---|
| src/main/kotlin |
| build.gradle.kts |
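
The commit above adds top_k, top_p, min_p, and repeat_penalty fields to the llama.cpp request configuration. As a rough illustration only, the sketch below shows one way such a parameter holder could look in Kotlin; the class name, the helper method, and the default values (typical llama.cpp defaults, which can vary between versions) are assumptions, not the repository's actual llm-client API.

```kotlin
/**
 * Hypothetical holder for the extended llama.cpp sampling parameters named in the commit.
 * The defaults below are typical llama.cpp defaults (they can differ between versions),
 * so leaving them untouched should not change the model's response.
 */
data class LlamaSamplingParams(
    val topK: Int = 40,              // consider only the 40 most likely tokens
    val topP: Double = 0.95,         // nucleus sampling: keep tokens covering 95% of probability mass
    val minP: Double = 0.05,         // drop tokens below 5% of the top token's probability
    val repeatPenalty: Double = 1.1  // mildly penalize recently generated tokens
) {
    /** Render the fields under the JSON keys llama.cpp expects. */
    fun toRequestFields(): Map<String, Any> = mapOf(
        "top_k" to topK,
        "top_p" to topP,
        "min_p" to minP,
        "repeat_penalty" to repeatPenalty,
    )
}

fun main() {
    // With no overrides, the request carries llama.cpp's own defaults.
    println(LlamaSamplingParams().toRequestFields())
    // Overriding a single knob, e.g. a stronger repetition penalty:
    println(LlamaSamplingParams(repeatPenalty = 1.3).toRequestFields())
}
```

In the actual plugin these values would be attached to the completion request sent to the llama.cpp server via llm-client; the map-building helper shown here is only a stand-in for that serialization step.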