mirror of
https://github.com/block/goose.git
synced 2026-04-26 10:40:45 +00:00
docs: add blog post about Mesh LLM provider option (#8655)
Signed-off-by: Michael Neale <michael.neale@gmail.com> Co-authored-by: Angie Jones <jones.angie@gmail.com>
This commit is contained in:
parent
5205540311
commit
a789bc16fb
1 changed file with 29 additions and 0 deletions
29
documentation/blog/2026-04-20-mesh-llm/index.md
Normal file
@@ -0,0 +1,29 @@
---
title: "Mesh LLM in goose: routing across models"
description: "Mesh LLM is now available as a provider setting in goose."
authors:
    - mic
---

Quick note: [Mesh LLM](https://github.com/Mesh-LLM/mesh-llm/) is now available in goose as a provider option for accessing and sharing (open) LLMs with friends and family.

It uses the same llama.cpp infrastructure as local mode to run models, with a twist.

<!--truncate-->

## What is Mesh LLM?

Mesh LLM is an associated project we're trying out that lets people connect their compute capacity (which may be just a laptop) peer-to-peer, so they can access models they may not otherwise be able to self-host.

There is a demo public "mesh" that at any given time has some capacity in it, but you can also create your own private networks and pool compute together. The mesh tries to work out the best places to run models (downloading them as needed) and can even split the compute in various ways.

This is a pretty early-stage project, so we'd love any feedback on it.

Check out [the project docs](https://docs.anarchai.org/) and the [live public mesh](https://meshllm.cloud/dashboard).

<head>
  <meta property="og:title" content="Mesh LLM in goose: routing across models" />
  <meta property="og:type" content="article" />
  <meta property="og:url" content="https://goose-docs.ai/blog/2026/04/20/mesh-llm" />
  <meta property="og:description" content="Mesh LLM is now available as a provider setting in goose." />
</head>