llama.swiftui

Local inference of llama.cpp on an iPhone. So far it has only been tested with the StarCoder 1B model, but it can most likely handle 7B models as well.
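To run a custom model, the app needs a path to a model file on device. A minimal sketch of how such a view might present a file picker with SwiftUI's `fileImporter` and hand the chosen path to the app's llama.cpp wrapper (the `loadModel(at:)` call site is hypothetical, standing in for whatever binding the app exposes):

```swift
import SwiftUI
import UniformTypeIdentifiers

struct ModelPickerView: View {
    @State private var showPicker = false
    @State private var modelPath: String?

    var body: some View {
        VStack {
            Button("Load Model") { showPicker = true }
            if let path = modelPath {
                Text("Selected: \(path)")
            }
        }
        .fileImporter(isPresented: $showPicker,
                      allowedContentTypes: [UTType.data]) { result in
            switch result {
            case .success(let url):
                // Files picked from outside the app sandbox require
                // security-scoped access before they can be read.
                guard url.startAccessingSecurityScopedResource() else { return }
                defer { url.stopAccessingSecurityScopedResource() }
                modelPath = url.path
                // Hypothetical call into the app's llama.cpp binding:
                // try? LlamaContext.loadModel(at: url.path)
            case .failure(let error):
                print("File picker failed: \(error)")
            }
        }
    }
}
```

`fileImporter` is available on iOS 14+; restricting `allowedContentTypes` to a GGUF-specific `UTType` would be possible if one is declared in the app's Info.plist, but `UTType.data` accepts any file.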

https://github.com/bachittle/llama.cpp/assets/39804642/e290827a-4edb-4093-9642-2a5e399ec545