From a8c74c04deff2fdc63c81655b8d0dc218a762aff Mon Sep 17 00:00:00 2001 From: Mike <71440932+Vect0rM@users.noreply.github.com> Date: Wed, 29 Apr 2026 07:10:41 +0300 Subject: [PATCH] docs: add Atomic Chat provider section (#23069) --- .../web/src/content/docs/ar/providers.mdx | 38 +++++++++++++++++++ .../web/src/content/docs/bs/providers.mdx | 38 +++++++++++++++++++ .../web/src/content/docs/da/providers.mdx | 38 +++++++++++++++++++ .../web/src/content/docs/de/providers.mdx | 38 +++++++++++++++++++ .../web/src/content/docs/es/providers.mdx | 38 +++++++++++++++++++ .../web/src/content/docs/fr/providers.mdx | 38 +++++++++++++++++++ .../web/src/content/docs/it/providers.mdx | 38 +++++++++++++++++++ .../web/src/content/docs/ja/providers.mdx | 38 +++++++++++++++++++ .../web/src/content/docs/ko/providers.mdx | 38 +++++++++++++++++++ .../web/src/content/docs/nb/providers.mdx | 38 +++++++++++++++++++ .../web/src/content/docs/pl/providers.mdx | 38 +++++++++++++++++++ packages/web/src/content/docs/providers.mdx | 38 +++++++++++++++++++ .../web/src/content/docs/pt-br/providers.mdx | 38 +++++++++++++++++++ .../web/src/content/docs/ru/providers.mdx | 38 +++++++++++++++++++ .../web/src/content/docs/th/providers.mdx | 38 +++++++++++++++++++ .../web/src/content/docs/tr/providers.mdx | 38 +++++++++++++++++++ .../web/src/content/docs/zh-cn/providers.mdx | 38 +++++++++++++++++++ .../web/src/content/docs/zh-tw/providers.mdx | 38 +++++++++++++++++++ 18 files changed, 684 insertions(+) diff --git a/packages/web/src/content/docs/ar/providers.mdx b/packages/web/src/content/docs/ar/providers.mdx index a43357d107..07a19b8ad2 100644 --- a/packages/web/src/content/docs/ar/providers.mdx +++ b/packages/web/src/content/docs/ar/providers.mdx @@ -319,6 +319,44 @@ OpenCode Go هي خطة اشتراك منخفضة التكلفة توفّر وص --- +### Atomic Chat + +يمكنك تكوين opencode لاستخدام النماذج المحلية عبر [Atomic Chat](https://atomic.chat)، وهو تطبيق سطح مكتب يقوم بتشغيل LLMs المحلية خلف خادم API متوافق مع OpenAI (نقطة 
النهاية الافتراضية `http://127.0.0.1:1337/v1`). + +```json title="opencode.json" "atomic-chat" {5, 6, 8, 10-14} +{ + "$schema": "https://opencode.ai/config.json", + "provider": { + "atomic-chat": { + "npm": "@ai-sdk/openai-compatible", + "name": "Atomic Chat (local)", + "options": { + "baseURL": "http://127.0.0.1:1337/v1" + }, + "models": { + "": { + "name": "" + } + } + } + } +} +``` + +في هذا المثال: + +- `atomic-chat` هو معرف الموفر المخصص. يمكن أن يكون أي سلسلة نصية. +- `npm` يحدد الحزمة المستخدمة لهذا الموفر. هنا يُستخدم `@ai-sdk/openai-compatible` لأي واجهة برمجة تطبيقات متوافقة مع OpenAI. +- `name` هو الاسم المعروض للموفر في الواجهة. +- `options.baseURL` هو نقطة نهاية الخادم المحلي. غيّر المضيف والمنفذ لتتطابق مع إعدادات Atomic Chat الخاصة بك. +- `models` هو خريطة لمعرفات النماذج إلى أسمائها المعروضة. يجب أن يتطابق كل معرف مع `id` الذي يرجعه `GET /v1/models` — قم بتشغيل `curl http://127.0.0.1:1337/v1/models` لإدراج المعرفات المحملة حاليًا في Atomic Chat. + +:::tip +إذا لم تعمل استدعاءات الأدوات بشكل جيد، فاختر نموذجًا محملًا بدعم قوي لاستدعاء الأدوات (على سبيل المثال، متغير Qwen-Coder أو DeepSeek-Coder). +::: + +--- + ### Azure OpenAI :::note diff --git a/packages/web/src/content/docs/bs/providers.mdx b/packages/web/src/content/docs/bs/providers.mdx index 306653f75a..4087db8cde 100644 --- a/packages/web/src/content/docs/bs/providers.mdx +++ b/packages/web/src/content/docs/bs/providers.mdx @@ -324,6 +324,44 @@ Ili ako već imate API ključ, možete odabrati **Ručno unesite API ključ** i --- +### Atomic Chat + +Možete konfigurirati opencode za korištenje lokalnih modela preko [Atomic Chata](https://atomic.chat) — desktop aplikacije koja pokreće lokalne LLM-ove iza OpenAI-kompatibilnog API servera (zadana krajnja tačka `http://127.0.0.1:1337/v1`). 
+ +```json title="opencode.json" "atomic-chat" {5, 6, 8, 10-14} +{ + "$schema": "https://opencode.ai/config.json", + "provider": { + "atomic-chat": { + "npm": "@ai-sdk/openai-compatible", + "name": "Atomic Chat (local)", + "options": { + "baseURL": "http://127.0.0.1:1337/v1" + }, + "models": { + "": { + "name": "" + } + } + } + } +} +``` + +U ovom primjeru: + +- `atomic-chat` je prilagođeni ID provajdera. Može biti bilo koji niz. +- `npm` specificira paket koji se koristi za ovog provajdera. Ovdje se koristi `@ai-sdk/openai-compatible` za svaki OpenAI-kompatibilni API. +- `name` je prikazano ime provajdera u interfejsu. +- `options.baseURL` je krajnja tačka lokalnog servera. Promijenite host i port da odgovaraju vašoj Atomic Chat konfiguraciji. +- `models` je mapa ID-ova modela u njihova prikazana imena. Svaki ID mora odgovarati `id` vrijednosti koju vraća `GET /v1/models` — pokrenite `curl http://127.0.0.1:1337/v1/models` da vidite ID-ove trenutno učitane u Atomic Chat. + +:::tip +Ako pozivi alata ne rade dobro, odaberite učitani model sa jakom podrškom za tool calling (na primjer, Qwen-Coder ili DeepSeek-Coder varijantu). +::: + +--- + ### Azure OpenAI :::note diff --git a/packages/web/src/content/docs/da/providers.mdx b/packages/web/src/content/docs/da/providers.mdx index 0bc9baa5fa..8817d23192 100644 --- a/packages/web/src/content/docs/da/providers.mdx +++ b/packages/web/src/content/docs/da/providers.mdx @@ -315,6 +315,44 @@ Eller hvis du allerede har en API-nøgle, kan du vælge **Manually enter API Key --- +### Atomic Chat + +Du kan konfigurere opencode til at bruge lokale modeller via [Atomic Chat](https://atomic.chat) — en desktopapplikation, der kører lokale LLM'er bag en OpenAI-kompatibel API-server (standard-endpoint `http://127.0.0.1:1337/v1`). 
+ +```json title="opencode.json" "atomic-chat" {5, 6, 8, 10-14} +{ + "$schema": "https://opencode.ai/config.json", + "provider": { + "atomic-chat": { + "npm": "@ai-sdk/openai-compatible", + "name": "Atomic Chat (local)", + "options": { + "baseURL": "http://127.0.0.1:1337/v1" + }, + "models": { + "": { + "name": "" + } + } + } + } +} +``` + +I dette eksempel: + +- `atomic-chat` er det brugerdefinerede udbyder-ID. Det kan være en vilkårlig streng. +- `npm` specificerer pakken, der skal bruges for denne udbyder. Her bruges `@ai-sdk/openai-compatible` til enhver OpenAI-kompatibel API. +- `name` er det viste navn på udbyderen i grænsefladen. +- `options.baseURL` er endpoint'et for den lokale server. Ændr vært og port for at matche din Atomic Chat-opsætning. +- `models` er en afbildning af model-ID'er til deres viste navne. Hvert ID skal matche det `id`, der returneres af `GET /v1/models` — kør `curl http://127.0.0.1:1337/v1/models` for at liste ID'erne, der i øjeblikket er indlæst i Atomic Chat. + +:::tip +Hvis værktøjskald ikke fungerer godt, så vælg en indlæst model med god tool calling-understøttelse (for eksempel en Qwen-Coder- eller DeepSeek-Coder-variant). +::: + +--- + ### Azure OpenAI :::note diff --git a/packages/web/src/content/docs/de/providers.mdx b/packages/web/src/content/docs/de/providers.mdx index 41a15fa0b9..87f78c9d22 100644 --- a/packages/web/src/content/docs/de/providers.mdx +++ b/packages/web/src/content/docs/de/providers.mdx @@ -321,6 +321,44 @@ Wenn Sie bereits über einen API-Schlüssel verfügen, können Sie **API-Schlüs --- +### Atomic Chat + +Sie können opencode so konfigurieren, dass es lokale Modelle über [Atomic Chat](https://atomic.chat) verwendet — eine Desktop-Anwendung, die lokale LLMs hinter einem OpenAI-kompatiblen API-Server bereitstellt (Standard-Endpunkt `http://127.0.0.1:1337/v1`). 
+ +```json title="opencode.json" "atomic-chat" {5, 6, 8, 10-14} +{ + "$schema": "https://opencode.ai/config.json", + "provider": { + "atomic-chat": { + "npm": "@ai-sdk/openai-compatible", + "name": "Atomic Chat (local)", + "options": { + "baseURL": "http://127.0.0.1:1337/v1" + }, + "models": { + "": { + "name": "" + } + } + } + } +} +``` + +In diesem Beispiel: + +- `atomic-chat` ist die benutzerdefinierte Provider-ID. Dies kann eine beliebige Zeichenkette sein. +- `npm` gibt das für diesen Provider zu verwendende Paket an. Hier wird `@ai-sdk/openai-compatible` für jede OpenAI-kompatible API verwendet. +- `name` ist der in der UI angezeigte Name des Providers. +- `options.baseURL` ist der Endpunkt des lokalen Servers. Passen Sie Host und Port an Ihre Atomic-Chat-Konfiguration an. +- `models` ist eine Zuordnung von Modell-IDs zu ihren Anzeigenamen. Jede ID muss dem `id`-Wert entsprechen, den `GET /v1/models` zurückgibt — führen Sie `curl http://127.0.0.1:1337/v1/models` aus, um die aktuell in Atomic Chat geladenen IDs aufzulisten. + +:::tip +Wenn Tool-Aufrufe nicht zuverlässig funktionieren, wählen Sie ein geladenes Modell mit starker Tool-Calling-Unterstützung (zum Beispiel eine Qwen-Coder- oder DeepSeek-Coder-Variante). +::: + +--- + ### Azure OpenAI :::note diff --git a/packages/web/src/content/docs/es/providers.mdx b/packages/web/src/content/docs/es/providers.mdx index f325a0cd4c..b44ce9ee99 100644 --- a/packages/web/src/content/docs/es/providers.mdx +++ b/packages/web/src/content/docs/es/providers.mdx @@ -322,6 +322,44 @@ O si ya tienes una clave API, puedes seleccionar **Ingresar manualmente la clave --- +### Atomic Chat + +Puedes configurar opencode para usar modelos locales mediante [Atomic Chat](https://atomic.chat), una aplicación de escritorio que ejecuta LLMs locales detrás de un servidor API compatible con OpenAI (endpoint por defecto `http://127.0.0.1:1337/v1`). 
+ +```json title="opencode.json" "atomic-chat" {5, 6, 8, 10-14} +{ + "$schema": "https://opencode.ai/config.json", + "provider": { + "atomic-chat": { + "npm": "@ai-sdk/openai-compatible", + "name": "Atomic Chat (local)", + "options": { + "baseURL": "http://127.0.0.1:1337/v1" + }, + "models": { + "": { + "name": "" + } + } + } + } +} +``` + +En este ejemplo: + +- `atomic-chat` es el identificador personalizado del proveedor. Puede ser cualquier cadena que quieras. +- `npm` especifica el paquete que se usará para este proveedor. Aquí se usa `@ai-sdk/openai-compatible` para cualquier API compatible con OpenAI. +- `name` es el nombre que se muestra para el proveedor en la interfaz. +- `options.baseURL` es el endpoint del servidor local. Cambia el host y el puerto según tu configuración de Atomic Chat. +- `models` es un mapa de IDs de modelos a sus nombres de pantalla. Cada ID debe coincidir con el `id` devuelto por `GET /v1/models` — ejecuta `curl http://127.0.0.1:1337/v1/models` para listar los IDs cargados actualmente en Atomic Chat. + +:::tip +Si las llamadas a herramientas no funcionan bien, elige un modelo cargado con buen soporte para tool calling (por ejemplo, una variante Qwen-Coder o DeepSeek-Coder). +::: + +--- + ### Azure OpenAI :::note diff --git a/packages/web/src/content/docs/fr/providers.mdx b/packages/web/src/content/docs/fr/providers.mdx index 2896df9592..6a902ab02f 100644 --- a/packages/web/src/content/docs/fr/providers.mdx +++ b/packages/web/src/content/docs/fr/providers.mdx @@ -325,6 +325,44 @@ Ou si vous disposez déjà d'une clé API, vous pouvez sélectionner **Entrer ma --- +### Atomic Chat + +Vous pouvez configurer opencode pour utiliser des modèles locaux via [Atomic Chat](https://atomic.chat), une application de bureau qui exécute des LLM locaux derrière un serveur API compatible OpenAI (point de terminaison par défaut `http://127.0.0.1:1337/v1`). 
+ +```json title="opencode.json" "atomic-chat" {5, 6, 8, 10-14} +{ + "$schema": "https://opencode.ai/config.json", + "provider": { + "atomic-chat": { + "npm": "@ai-sdk/openai-compatible", + "name": "Atomic Chat (local)", + "options": { + "baseURL": "http://127.0.0.1:1337/v1" + }, + "models": { + "": { + "name": "" + } + } + } + } +} +``` + +Dans cet exemple : + +- `atomic-chat` est l'identifiant personnalisé du fournisseur. Il peut s'agir de n'importe quelle chaîne. +- `npm` spécifie le paquet à utiliser pour ce fournisseur. Ici, `@ai-sdk/openai-compatible` est utilisé pour toute API compatible OpenAI. +- `name` est le nom du fournisseur affiché dans l'interface. +- `options.baseURL` est le point de terminaison du serveur local. Modifiez l'hôte et le port selon votre configuration Atomic Chat. +- `models` est une carte d'ID de modèles vers leurs noms d'affichage. Chaque ID doit correspondre à l'`id` renvoyé par `GET /v1/models` — exécutez `curl http://127.0.0.1:1337/v1/models` pour lister les ID actuellement chargés dans Atomic Chat. + +:::tip +Si les appels d'outils ne fonctionnent pas bien, choisissez un modèle chargé avec un bon support du tool calling (par exemple, une variante Qwen-Coder ou DeepSeek-Coder). +::: + +--- + ### Azure OpenAI :::note diff --git a/packages/web/src/content/docs/it/providers.mdx b/packages/web/src/content/docs/it/providers.mdx index 26bf19fd6f..96da8c4df1 100644 --- a/packages/web/src/content/docs/it/providers.mdx +++ b/packages/web/src/content/docs/it/providers.mdx @@ -299,6 +299,44 @@ Oppure se hai già una chiave API, puoi selezionare **Manually enter API Key** e --- +### Atomic Chat + +Puoi configurare opencode per utilizzare modelli locali tramite [Atomic Chat](https://atomic.chat), un'applicazione desktop che esegue LLM locali dietro un server API compatibile OpenAI (endpoint predefinito `http://127.0.0.1:1337/v1`). 
+ +```json title="opencode.json" "atomic-chat" {5, 6, 8, 10-14} +{ + "$schema": "https://opencode.ai/config.json", + "provider": { + "atomic-chat": { + "npm": "@ai-sdk/openai-compatible", + "name": "Atomic Chat (local)", + "options": { + "baseURL": "http://127.0.0.1:1337/v1" + }, + "models": { + "": { + "name": "" + } + } + } + } +} +``` + +In questo esempio: + +- `atomic-chat` è l'ID personalizzato del provider. Può essere qualsiasi stringa. +- `npm` specifica il pacchetto da utilizzare per questo provider. Qui viene usato `@ai-sdk/openai-compatible` per qualsiasi API compatibile OpenAI. +- `name` è il nome visualizzato del provider nell'interfaccia. +- `options.baseURL` è l'endpoint del server locale. Modifica host e porta in base alla tua configurazione Atomic Chat. +- `models` è una mappa di ID modello ai rispettivi nomi visualizzati. Ogni ID deve corrispondere all'`id` restituito da `GET /v1/models` — esegui `curl http://127.0.0.1:1337/v1/models` per elencare gli ID attualmente caricati in Atomic Chat. + +:::tip +Se le chiamate agli strumenti non funzionano bene, scegli un modello caricato con buon supporto per il tool calling (ad esempio, una variante Qwen-Coder o DeepSeek-Coder). 
+::: + +--- + ### Azure OpenAI :::note diff --git a/packages/web/src/content/docs/ja/providers.mdx b/packages/web/src/content/docs/ja/providers.mdx index 9fbb301ba5..8017d0882e 100644 --- a/packages/web/src/content/docs/ja/providers.mdx +++ b/packages/web/src/content/docs/ja/providers.mdx @@ -329,6 +329,44 @@ Pro/Max サブスクリプションをお持ちでない場合は、[**API キ --- +### Atomic Chat + +opencode は、OpenAI 互換の API サーバーの背後でローカル LLM を実行するデスクトップアプリケーション [Atomic Chat](https://atomic.chat) 経由でローカルモデルを使うように設定できます(デフォルトのエンドポイントは `http://127.0.0.1:1337/v1`)。 + +```json title="opencode.json" "atomic-chat" {5, 6, 8, 10-14} +{ + "$schema": "https://opencode.ai/config.json", + "provider": { + "atomic-chat": { + "npm": "@ai-sdk/openai-compatible", + "name": "Atomic Chat (local)", + "options": { + "baseURL": "http://127.0.0.1:1337/v1" + }, + "models": { + "": { + "name": "" + } + } + } + } +} +``` + +この例では: + +- `atomic-chat` はカスタムプロバイダー ID です。任意の文字列を指定できます。 +- `npm` はこのプロバイダーに使用するパッケージを指定します。ここでは、任意の OpenAI 互換 API に対して `@ai-sdk/openai-compatible` を使用しています。 +- `name` は UI に表示されるプロバイダー名です。 +- `options.baseURL` はローカルサーバーのエンドポイントです。Atomic Chat のセットアップに合わせてホストとポートを変更してください。 +- `models` はモデル ID と表示名のマップです。各 ID は `GET /v1/models` が返す `id` と一致する必要があります。Atomic Chat に現在ロードされている ID の一覧は `curl http://127.0.0.1:1337/v1/models` を実行して確認できます。 + +:::tip +ツール呼び出しがうまく動作しない場合は、ツール呼び出しに強いロード済みモデル(例えば Qwen-Coder や DeepSeek-Coder のバリアント)を選択してください。 +::: + +--- + ### Azure OpenAI :::note diff --git a/packages/web/src/content/docs/ko/providers.mdx b/packages/web/src/content/docs/ko/providers.mdx index ad4aaae7e6..6ca3afccc3 100644 --- a/packages/web/src/content/docs/ko/providers.mdx +++ b/packages/web/src/content/docs/ko/providers.mdx @@ -325,6 +325,44 @@ Pro/Max 구독이 없는 경우 **Create an API Key**를 선택할 수 있습니 --- +### Atomic Chat + +OpenAI 호환 API 서버 뒤에서 로컬 LLM을 실행하는 데스크톱 애플리케이션인 [Atomic Chat](https://atomic.chat)을 통해 로컬 모델을 사용하도록 opencode를 구성할 수 있습니다(기본 엔드포인트 `http://127.0.0.1:1337/v1`). 
+ +```json title="opencode.json" "atomic-chat" {5, 6, 8, 10-14} +{ + "$schema": "https://opencode.ai/config.json", + "provider": { + "atomic-chat": { + "npm": "@ai-sdk/openai-compatible", + "name": "Atomic Chat (local)", + "options": { + "baseURL": "http://127.0.0.1:1337/v1" + }, + "models": { + "": { + "name": "" + } + } + } + } +} +``` + +이 예시에서: + +- `atomic-chat`은 사용자 지정 공급자 ID입니다. 원하는 어떤 문자열이든 사용할 수 있습니다. +- `npm`은 이 공급자에 사용할 패키지를 지정합니다. 여기서는 모든 OpenAI 호환 API에 `@ai-sdk/openai-compatible`이 사용됩니다. +- `name`은 UI에 표시되는 공급자의 이름입니다. +- `options.baseURL`은 로컬 서버의 엔드포인트입니다. Atomic Chat 설정에 맞게 호스트와 포트를 변경하세요. +- `models`는 모델 ID와 해당 표시 이름의 맵입니다. 각 ID는 `GET /v1/models`가 반환하는 `id`와 일치해야 하며, 현재 Atomic Chat에 로드된 ID 목록을 확인하려면 `curl http://127.0.0.1:1337/v1/models`를 실행하세요. + +:::tip +도구 호출이 잘 작동하지 않는 경우, 도구 호출을 잘 지원하는 로드된 모델(예: Qwen-Coder 또는 DeepSeek-Coder 변형)을 선택하세요. +::: + +--- + ### Azure OpenAI :::note diff --git a/packages/web/src/content/docs/nb/providers.mdx b/packages/web/src/content/docs/nb/providers.mdx index ea6f1cc1ab..1fe8812e67 100644 --- a/packages/web/src/content/docs/nb/providers.mdx +++ b/packages/web/src/content/docs/nb/providers.mdx @@ -323,6 +323,44 @@ Eller hvis du allerede har en API-nøkkel, kan du velge **Angi API-nøkkel manue --- +### Atomic Chat + +Du kan konfigurere opencode til å bruke lokale modeller via [Atomic Chat](https://atomic.chat) — et skrivebordsprogram som kjører lokale LLM-er bak en OpenAI-kompatibel API-server (standard endepunkt `http://127.0.0.1:1337/v1`). + +```json title="opencode.json" "atomic-chat" {5, 6, 8, 10-14} +{ + "$schema": "https://opencode.ai/config.json", + "provider": { + "atomic-chat": { + "npm": "@ai-sdk/openai-compatible", + "name": "Atomic Chat (local)", + "options": { + "baseURL": "http://127.0.0.1:1337/v1" + }, + "models": { + "": { + "name": "" + } + } + } + } +} +``` + +I dette eksempelet: + +- `atomic-chat` er den egendefinerte leverandør-ID-en. Det kan være en hvilken som helst streng. 
+- `npm` spesifiserer pakken som skal brukes for denne leverandøren. Her brukes `@ai-sdk/openai-compatible` for enhver OpenAI-kompatibel API. +- `name` er visningsnavnet for leverandøren i grensesnittet. +- `options.baseURL` er endepunktet for den lokale serveren. Endre vert og port for å matche Atomic Chat-oppsettet ditt. +- `models` er en kartlegging av modell-IDer til visningsnavnene deres. Hver ID må samsvare med `id` som returneres av `GET /v1/models` — kjør `curl http://127.0.0.1:1337/v1/models` for å liste ID-ene som er lastet inn i Atomic Chat nå. + +:::tip +Hvis verktøykall ikke fungerer godt, velg en lastet modell med god tool calling-støtte (for eksempel en Qwen-Coder- eller DeepSeek-Coder-variant). +::: + +--- + ### Azure OpenAI :::note diff --git a/packages/web/src/content/docs/pl/providers.mdx b/packages/web/src/content/docs/pl/providers.mdx index e2b4c70e6f..deadd07d6a 100644 --- a/packages/web/src/content/docs/pl/providers.mdx +++ b/packages/web/src/content/docs/pl/providers.mdx @@ -321,6 +321,44 @@ Lub jeśli masz już klucz API, możesz wybrać **Wprowadź klucz API ręcznie** --- +### Atomic Chat + +Możesz skonfigurować opencode do korzystania z modeli lokalnych przez [Atomic Chat](https://atomic.chat) — aplikację desktopową uruchamiającą lokalne LLM-y za serwerem API zgodnym z OpenAI (domyślny endpoint `http://127.0.0.1:1337/v1`). + +```json title="opencode.json" "atomic-chat" {5, 6, 8, 10-14} +{ + "$schema": "https://opencode.ai/config.json", + "provider": { + "atomic-chat": { + "npm": "@ai-sdk/openai-compatible", + "name": "Atomic Chat (local)", + "options": { + "baseURL": "http://127.0.0.1:1337/v1" + }, + "models": { + "": { + "name": "" + } + } + } + } +} +``` + +W tym przykładzie: + +- `atomic-chat` to niestandardowy identyfikator dostawcy. Może to być dowolny ciąg znaków. +- `npm` określa pakiet używany dla tego dostawcy. Tutaj `@ai-sdk/openai-compatible` jest używany dla każdego API zgodnego z OpenAI. 
+- `name` to nazwa wyświetlana dostawcy w interfejsie. +- `options.baseURL` to endpoint serwera lokalnego. Zmień host i port, aby pasowały do twojej konfiguracji Atomic Chat. +- `models` to mapa identyfikatorów modeli na ich nazwy wyświetlane. Każdy identyfikator musi odpowiadać `id` zwracanemu przez `GET /v1/models` — uruchom `curl http://127.0.0.1:1337/v1/models`, aby wyświetlić identyfikatory aktualnie załadowane w Atomic Chat. + +:::tip +Jeśli wywołania narzędzi nie działają dobrze, wybierz załadowany model z dobrym wsparciem tool calling (na przykład wariant Qwen-Coder lub DeepSeek-Coder). +::: + +--- + ### Azure OpenAI :::note diff --git a/packages/web/src/content/docs/providers.mdx b/packages/web/src/content/docs/providers.mdx index 752f6c054f..8576ec3562 100644 --- a/packages/web/src/content/docs/providers.mdx +++ b/packages/web/src/content/docs/providers.mdx @@ -334,6 +334,44 @@ the following subscriptions in OpenCode with zero setup: --- +### Atomic Chat + +You can configure opencode to use local models through [Atomic Chat](https://atomic.chat), a desktop application that runs local LLMs behind an OpenAI-compatible API server (default endpoint `http://127.0.0.1:1337/v1`). + +```json title="opencode.json" "atomic-chat" {5, 6, 8, 10-14} +{ + "$schema": "https://opencode.ai/config.json", + "provider": { + "atomic-chat": { + "npm": "@ai-sdk/openai-compatible", + "name": "Atomic Chat (local)", + "options": { + "baseURL": "http://127.0.0.1:1337/v1" + }, + "models": { + "": { + "name": "" + } + } + } + } +} +``` + +In this example: + +- `atomic-chat` is the custom provider ID. This can be any string you want. +- `npm` specifies the package to use for this provider. Here, `@ai-sdk/openai-compatible` is used for any OpenAI-compatible API. +- `name` is the display name for the provider in the UI. +- `options.baseURL` is the endpoint for the local server. Change the host and port to match your Atomic Chat setup. 
- `models` is a map of model IDs to their display names. Each ID must match the `id` returned by `GET /v1/models` — run `curl http://127.0.0.1:1337/v1/models` to list the IDs currently loaded in Atomic Chat. + +:::tip +If tool calls aren't working well, pick a loaded model with strong tool-calling support (for example, a Qwen-Coder or DeepSeek-Coder variant). +::: + +--- + ### Azure OpenAI :::note diff --git a/packages/web/src/content/docs/pt-br/providers.mdx b/packages/web/src/content/docs/pt-br/providers.mdx index 58305a2737..50f841cf36 100644 --- a/packages/web/src/content/docs/pt-br/providers.mdx +++ b/packages/web/src/content/docs/pt-br/providers.mdx @@ -325,6 +325,44 @@ Ou, se você já tiver uma chave da API, pode selecionar **Inserir manualmente a --- +### Atomic Chat + +Você pode configurar o opencode para usar modelos locais através do [Atomic Chat](https://atomic.chat), um aplicativo de desktop que executa LLMs locais por trás de um servidor API compatível com OpenAI (endpoint padrão `http://127.0.0.1:1337/v1`). + +```json title="opencode.json" "atomic-chat" {5, 6, 8, 10-14} +{ + "$schema": "https://opencode.ai/config.json", + "provider": { + "atomic-chat": { + "npm": "@ai-sdk/openai-compatible", + "name": "Atomic Chat (local)", + "options": { + "baseURL": "http://127.0.0.1:1337/v1" + }, + "models": { + "": { + "name": "" + } + } + } + } +} +``` + +Neste exemplo: + +- `atomic-chat` é o ID personalizado do provedor. Pode ser qualquer string que você quiser. +- `npm` especifica o pacote a ser usado para este provedor. Aqui, `@ai-sdk/openai-compatible` é usado para qualquer API compatível com OpenAI. +- `name` é o nome de exibição do provedor na interface. +- `options.baseURL` é o endpoint do servidor local. Altere host e porta para corresponder à sua configuração do Atomic Chat. +- `models` é um mapa de IDs de modelos para seus nomes de exibição.
Cada ID deve corresponder ao `id` retornado por `GET /v1/models` — execute `curl http://127.0.0.1:1337/v1/models` para listar os IDs atualmente carregados no Atomic Chat. + +:::tip +Se as chamadas de ferramentas não estiverem funcionando bem, escolha um modelo carregado com bom suporte a tool calling (por exemplo, uma variante Qwen-Coder ou DeepSeek-Coder). +::: + +--- + ### Azure OpenAI :::note diff --git a/packages/web/src/content/docs/ru/providers.mdx b/packages/web/src/content/docs/ru/providers.mdx index 144ce2ac5a..f5868ceaa0 100644 --- a/packages/web/src/content/docs/ru/providers.mdx +++ b/packages/web/src/content/docs/ru/providers.mdx @@ -321,6 +321,44 @@ OpenCode Go — это недорогой план подписки, обесп --- +### Atomic Chat + +Вы можете настроить opencode для работы с локальными моделями через [Atomic Chat](https://atomic.chat) — десктопное приложение, которое запускает локальные LLM за OpenAI-совместимым API-сервером (конечная точка по умолчанию `http://127.0.0.1:1337/v1`). + +```json title="opencode.json" "atomic-chat" {5, 6, 8, 10-14} +{ + "$schema": "https://opencode.ai/config.json", + "provider": { + "atomic-chat": { + "npm": "@ai-sdk/openai-compatible", + "name": "Atomic Chat (local)", + "options": { + "baseURL": "http://127.0.0.1:1337/v1" + }, + "models": { + "": { + "name": "" + } + } + } + } +} +``` + +В этом примере: + +- `atomic-chat` — пользовательский идентификатор провайдера. Это может быть любая строка. +- `npm` указывает пакет, используемый для этого провайдера. Здесь используется `@ai-sdk/openai-compatible` для любых OpenAI-совместимых API. +- `name` — отображаемое имя провайдера в интерфейсе. +- `options.baseURL` — конечная точка локального сервера. Измените хост и порт в соответствии с вашей конфигурацией Atomic Chat. +- `models` — карта идентификаторов моделей и их отображаемых имён. 
Каждый ID должен совпадать со значением `id`, которое возвращает `GET /v1/models` — выполните `curl http://127.0.0.1:1337/v1/models`, чтобы увидеть ID моделей, загруженных в Atomic Chat. + +:::tip +Если вызовы инструментов работают нестабильно, выберите загруженную модель с хорошей поддержкой tool calling (например, вариант из семейств Qwen-Coder или DeepSeek-Coder). +::: + +--- + ### Azure OpenAI :::note diff --git a/packages/web/src/content/docs/th/providers.mdx b/packages/web/src/content/docs/th/providers.mdx index c4fdb5ce34..818f39213c 100644 --- a/packages/web/src/content/docs/th/providers.mdx +++ b/packages/web/src/content/docs/th/providers.mdx @@ -321,6 +321,44 @@ OpenCode Go คือแผนการสมัครสมาชิกรา --- +### Atomic Chat + +คุณสามารถกำหนดค่า opencode ให้ใช้โมเดลท้องถิ่นผ่าน [Atomic Chat](https://atomic.chat) ซึ่งเป็นแอปพลิเคชันเดสก์ท็อปที่เรียกใช้ LLM ในเครื่องภายใต้เซิร์ฟเวอร์ API ที่เข้ากันได้กับ OpenAI (ปลายทางเริ่มต้น `http://127.0.0.1:1337/v1`) + +```json title="opencode.json" "atomic-chat" {5, 6, 8, 10-14} +{ + "$schema": "https://opencode.ai/config.json", + "provider": { + "atomic-chat": { + "npm": "@ai-sdk/openai-compatible", + "name": "Atomic Chat (local)", + "options": { + "baseURL": "http://127.0.0.1:1337/v1" + }, + "models": { + "": { + "name": "" + } + } + } + } +} +``` + +ในตัวอย่างนี้: + +- `atomic-chat` คือรหัสผู้ให้บริการที่กำหนดเอง สามารถเป็นสตริงใดก็ได้ที่คุณต้องการ +- `npm` ระบุแพ็กเกจที่จะใช้สำหรับผู้ให้บริการนี้ ที่นี่ใช้ `@ai-sdk/openai-compatible` สำหรับ API ใดๆ ที่เข้ากันได้กับ OpenAI +- `name` คือชื่อแสดงของผู้ให้บริการในอินเทอร์เฟซ +- `options.baseURL` คือปลายทางของเซิร์ฟเวอร์ท้องถิ่น เปลี่ยนโฮสต์และพอร์ตให้ตรงกับการตั้งค่า Atomic Chat ของคุณ +- `models` คือแผนที่ระหว่างรหัสโมเดลกับชื่อแสดง แต่ละรหัสต้องตรงกับ `id` ที่ส่งคืนโดย `GET /v1/models` — รัน `curl http://127.0.0.1:1337/v1/models` เพื่อแสดงรายการรหัสที่โหลดอยู่ใน Atomic Chat + +:::tip +หากการเรียกเครื่องมือทำงานได้ไม่ดี ให้เลือกโมเดลที่โหลดแล้วซึ่งรองรับ tool calling ได้ดี 
(ตัวอย่างเช่น รุ่น Qwen-Coder หรือ DeepSeek-Coder) +::: + +--- + ### Azure OpenAI :::note diff --git a/packages/web/src/content/docs/tr/providers.mdx b/packages/web/src/content/docs/tr/providers.mdx index cf073e3095..527c20e15e 100644 --- a/packages/web/src/content/docs/tr/providers.mdx +++ b/packages/web/src/content/docs/tr/providers.mdx @@ -323,6 +323,44 @@ Veya zaten bir API anahtarınız varsa **API Anahtarını Manuel Olarak Girin** --- +### Atomic Chat + +opencode'u, yerel LLM'leri OpenAI uyumlu bir API sunucusunun arkasında çalıştıran bir masaüstü uygulaması olan [Atomic Chat](https://atomic.chat) aracılığıyla yerel modelleri kullanacak şekilde yapılandırabilirsiniz (varsayılan uç nokta `http://127.0.0.1:1337/v1`). + +```json title="opencode.json" "atomic-chat" {5, 6, 8, 10-14} +{ + "$schema": "https://opencode.ai/config.json", + "provider": { + "atomic-chat": { + "npm": "@ai-sdk/openai-compatible", + "name": "Atomic Chat (local)", + "options": { + "baseURL": "http://127.0.0.1:1337/v1" + }, + "models": { + "": { + "name": "" + } + } + } + } +} +``` + +Bu örnekte: + +- `atomic-chat` özel sağlayıcı kimliğidir. İstediğiniz herhangi bir dize olabilir. +- `npm` bu sağlayıcı için kullanılacak paketi belirtir. Burada, herhangi bir OpenAI uyumlu API için `@ai-sdk/openai-compatible` kullanılır. +- `name` sağlayıcının arayüzde görüntülenen adıdır. +- `options.baseURL` yerel sunucunun uç noktasıdır. Host ve portu Atomic Chat kurulumunuzla eşleşecek şekilde değiştirin. +- `models` model kimliklerini görüntüleme adlarına eşleyen bir haritadır. Her ID, `GET /v1/models` tarafından döndürülen `id` ile eşleşmelidir — Atomic Chat'te yüklü kimlikleri listelemek için `curl http://127.0.0.1:1337/v1/models` çalıştırın. + +:::tip +Araç çağrıları iyi çalışmıyorsa, tool calling desteği güçlü olan yüklü bir model seçin (örneğin, bir Qwen-Coder veya DeepSeek-Coder varyantı). 
+::: + +--- + ### Azure OpenAI :::note diff --git a/packages/web/src/content/docs/zh-cn/providers.mdx b/packages/web/src/content/docs/zh-cn/providers.mdx index e22ba3ad80..80dfe1e93d 100644 --- a/packages/web/src/content/docs/zh-cn/providers.mdx +++ b/packages/web/src/content/docs/zh-cn/providers.mdx @@ -295,6 +295,44 @@ OpenCode Zen 是由 OpenCode 团队提供的模型列表,这些模型已经过 --- +### Atomic Chat + +你可以通过 [Atomic Chat](https://atomic.chat) 配置 opencode 以使用本地模型。Atomic Chat 是一款桌面应用程序,它在 OpenAI 兼容的 API 服务器后面运行本地 LLM(默认端点 `http://127.0.0.1:1337/v1`)。 + +```json title="opencode.json" "atomic-chat" {5, 6, 8, 10-14} +{ + "$schema": "https://opencode.ai/config.json", + "provider": { + "atomic-chat": { + "npm": "@ai-sdk/openai-compatible", + "name": "Atomic Chat (local)", + "options": { + "baseURL": "http://127.0.0.1:1337/v1" + }, + "models": { + "": { + "name": "" + } + } + } + } +} +``` + +在此示例中: + +- `atomic-chat` 是自定义的提供商 ID。可以是任何你想要的字符串。 +- `npm` 指定此提供商使用的包。这里使用 `@ai-sdk/openai-compatible` 来连接任何 OpenAI 兼容的 API。 +- `name` 是提供商在界面中显示的名称。 +- `options.baseURL` 是本地服务器的端点。根据你的 Atomic Chat 设置修改主机和端口。 +- `models` 是模型 ID 到其显示名称的映射。每个 ID 必须与 `GET /v1/models` 返回的 `id` 匹配——运行 `curl http://127.0.0.1:1337/v1/models` 可列出 Atomic Chat 当前已加载的 ID。 + +:::tip +如果工具调用工作不佳,请选择一个对 tool calling 支持较好的已加载模型(例如 Qwen-Coder 或 DeepSeek-Coder 的变体)。 +::: + +--- + ### Azure OpenAI :::note diff --git a/packages/web/src/content/docs/zh-tw/providers.mdx b/packages/web/src/content/docs/zh-tw/providers.mdx index 8af6f01d4a..c874170959 100644 --- a/packages/web/src/content/docs/zh-tw/providers.mdx +++ b/packages/web/src/content/docs/zh-tw/providers.mdx @@ -316,6 +316,44 @@ OpenCode Go 是一個低成本的訂閱計畫,提供對 OpenCode 團隊提供 --- +### Atomic Chat + +你可以透過 [Atomic Chat](https://atomic.chat) 設定 opencode 以使用本地模型。Atomic Chat 是一款桌面應用程式,它在 OpenAI 相容的 API 伺服器後方執行本地 LLM(預設端點 `http://127.0.0.1:1337/v1`)。 + +```json title="opencode.json" "atomic-chat" {5, 6, 8, 10-14} +{ + "$schema": "https://opencode.ai/config.json", + "provider": { + 
"atomic-chat": { + "npm": "@ai-sdk/openai-compatible", + "name": "Atomic Chat (local)", + "options": { + "baseURL": "http://127.0.0.1:1337/v1" + }, + "models": { + "": { + "name": "" + } + } + } + } +} +``` + +在此範例中: + +- `atomic-chat` 是自訂的提供者 ID。可以是任何你想要的字串。 +- `npm` 指定此提供者所使用的套件。這裡使用 `@ai-sdk/openai-compatible` 來連接任何 OpenAI 相容的 API。 +- `name` 是提供者在介面中顯示的名稱。 +- `options.baseURL` 是本地伺服器的端點。請根據你的 Atomic Chat 設定修改主機與連接埠。 +- `models` 是模型 ID 到其顯示名稱的對應表。每個 ID 必須與 `GET /v1/models` 所回傳的 `id` 相符——執行 `curl http://127.0.0.1:1337/v1/models` 可列出 Atomic Chat 目前載入的 ID。 + +:::tip +如果工具呼叫運作不佳,請選擇一個對 tool calling 支援較好的已載入模型(例如 Qwen-Coder 或 DeepSeek-Coder 的變體)。 +::: + +--- + ### Azure OpenAI :::note