mirror of https://github.com/LostRuins/koboldcpp.git
synced 2025-09-10 17:14:36 +00:00
updated lite
commit 116d5fe58e
parent c9c098dab2
3 changed files with 1196 additions and 337 deletions
README.md (12 changes)
@@ -1,6 +1,6 @@
 # koboldcpp
 
-KoboldCpp is an easy-to-use AI text-generation software for GGML and GGUF models, inspired by the original KoboldAI. It's a single self-contained distributable from Concedo, that builds off llama.cpp, and adds a versatile **KoboldAI API endpoint**, additional format support, Stable Diffusion image generation, speech-to-text, backward compatibility, as well as a fancy UI with persistent stories, editing tools, save formats, memory, world info, author's note, characters, scenarios and everything KoboldAI and KoboldAI Lite have to offer.
+KoboldCpp is an easy-to-use AI text-generation software for GGML and GGUF models, inspired by the original **KoboldAI**. It's a single self-contained distributable from Concedo, that builds off llama.cpp, and adds a versatile **KoboldAI API endpoint**, additional format support, Stable Diffusion image generation, speech-to-text, backward compatibility, as well as a fancy UI with persistent stories, editing tools, save formats, memory, world info, author's note, characters, scenarios and everything KoboldAI and KoboldAI Lite have to offer.
 
 (preview image)
 (preview image)
@@ -114,10 +114,16 @@ when you can't use the precompiled binary directly, we provide an automated buil
 ## AMD Users
 - Please check out https://github.com/YellowRoseCx/koboldcpp-rocm
 
-## Questions and Help
+## Questions and Help Wiki
 - **First, please check out [The KoboldCpp FAQ and Knowledgebase](https://github.com/LostRuins/koboldcpp/wiki) which may already have answers to your questions! Also please search through past issues and discussions.**
 - If you cannot find an answer, open an issue on this github, or find us on the [KoboldAI Discord](https://koboldai.org/discord).
 
+## KoboldCpp and KoboldAI API Documentation
+- [Documentation for KoboldAI and KoboldCpp endpoints can be found here](https://lite.koboldai.net/koboldcpp_api)
+
+## KoboldCpp Public Demo
+- [A public KoboldCpp demo can be found at our Huggingface Space. Please do not abuse it.](https://koboldai-koboldcpp-tiefighter.hf.space/)
+
 ## Considerations
 - For Windows: No installation, single file executable, (It Just Works)
 - Since v1.0.6, requires libopenblas, the prebuilt windows binaries are included in this repo. If not found, it will fall back to a mode without BLAS.
@@ -131,7 +137,7 @@ when you can't use the precompiled binary directly, we provide an automated buil
 ## License
 - The original GGML library and llama.cpp by ggerganov are licensed under the MIT License
 - However, KoboldAI Lite is licensed under the AGPL v3.0 License
-- The other files are also under the AGPL v3.0 License unless otherwise stated
+- KoboldCpp code and other files are also under the AGPL v3.0 License unless otherwise stated
 
 ## Notes
 - If you wish, after building the koboldcpp libraries with `make`, you can rebuild the exe yourself with pyinstaller by using `make_pyinstaller.bat`
klite.embd (1517 changes)
File diff suppressed because one or more lines are too long
koboldcpp.py (4 changes)

@@ -730,7 +730,9 @@ def whisper_generate(genparams):
     return outstr
 
 def utfprint(str):
-    maxlen = 25000
+    maxlen = 32000
+    if args.debugmode >= 1:
+        maxlen = 64000
     strlength = len(str)
     if strlength > maxlen: #limit max output len
         str = str[:maxlen] + f"... (+{strlength-maxlen} chars)"
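The hunk above raises `utfprint`'s output cap from 25000 to 32000 characters, and to 64000 when debug mode is enabled. Below is a minimal runnable sketch of just the truncation path shown in the diff; the local `debugmode` parameter and the `utfprint_sketch` name are assumptions made for self-containment, standing in for the global `args.debugmode` that KoboldCpp populates from its command line.

```python
# Sketch of the new truncation behavior only; `debugmode` here is a
# hypothetical stand-in for KoboldCpp's global `args.debugmode`.
def utfprint_sketch(text: str, debugmode: int = 0) -> None:
    maxlen = 32000          # raised from 25000 in this commit
    if debugmode >= 1:
        maxlen = 64000      # permit roughly double the output while debugging
    strlength = len(text)
    if strlength > maxlen:  # limit max output length
        text = text[:maxlen] + f"... (+{strlength - maxlen} chars)"
    print(text)

# Example: a 70000-character string prints as its first 32000 characters
# followed by the suffix "... (+38000 chars)".
utfprint_sketch("x" * 70000)
```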
|
Loading…
Add table
Add a link
Reference in a new issue