telegram-bot

Allows アイリ to talk to you and many other users on Telegram.

Getting started

Clone & install dependencies:

git clone git@github.com:moeru-ai/airi.git
pnpm i
pnpm run build:packages

Start an Ollama instance for the embedding model:

ollama serve
ollama pull nomic-embed-text
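
Optionally, you can confirm the model is available by asking Ollama for a single embedding through its OpenAI-compatible API (the default port 11434 and the response shape are assumptions based on Ollama's documented behavior, not something this repo configures):

```shell
# Request one embedding via Ollama's OpenAI-compatible endpoint
# (assumes the default Ollama port 11434). For nomic-embed-text the
# reply contains an "embedding" array of 768 floats.
curl -s --max-time 5 http://localhost:11434/v1/embeddings \
  -H 'Content-Type: application/json' \
  -d '{"model":"nomic-embed-text","input":"hello"}' \
  || echo 'Ollama is not reachable on localhost:11434'
```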

Create a .env.local file:

cd services/telegram-bot
cp .env .env.local

Fill in the following credentials and configuration values:

DATABASE_URL=postgres://postgres:123456@localhost:5432/postgres
TELEGRAM_BOT_TOKEN=''

LLM_API_BASE_URL=''
LLM_API_KEY=''
LLM_MODEL=''
LLM_RESPONSE_LANGUAGE=''

LLM_VISION_API_BASE_URL=''
LLM_VISION_API_KEY=''
LLM_VISION_MODEL=''

EMBEDDING_API_BASE_URL=''
EMBEDDING_API_KEY=''
EMBEDDING_MODEL=''
EMBEDDING_DIMENSION=''

For example:

DATABASE_URL=postgres://postgres:123456@localhost:5433/postgres
TELEGRAM_BOT_TOKEN='<Bot ID>:<Token>' # get one from @BotFather

LLM_API_BASE_URL='https://openrouter.ai/api/v1/' # if you use OpenRouter too
LLM_API_KEY='sk-or-v1-<token>'
LLM_MODEL='deepseek/deepseek-chat-v3-0324:free'
LLM_RESPONSE_LANGUAGE='English'

LLM_VISION_API_BASE_URL='https://openrouter.ai/api/v1/'
LLM_VISION_API_KEY='sk-or-v1-<token>'
LLM_VISION_MODEL='openai/gpt-4o' # as long as the model supports image input

EMBEDDING_API_BASE_URL='http://localhost:11434/v1/' # ollama
EMBEDDING_API_KEY=''
EMBEDDING_MODEL='nomic-embed-text' # embedding model
EMBEDDING_DIMENSION='768' # required; 768 is the output dimension of nomic-embed-text
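
Before starting the bot, a quick pre-flight check can catch an empty variable early. The helper below is purely illustrative (`check_env` is not part of this repo, and the variable list is a sample, not the complete set):

```shell
# Hypothetical pre-flight check (not part of the repo): report any
# required variable that is unset or empty before launching the bot.
check_env() {
  missing=0
  for name in "$@"; do
    eval "value=\${$name}"
    if [ -z "$value" ]; then
      echo "missing: $name"
      missing=1
    fi
  done
  return "$missing"
}

# Example: with only DATABASE_URL set, the other names are reported.
DATABASE_URL='postgres://postgres:123456@localhost:5432/postgres'
check_env DATABASE_URL TELEGRAM_BOT_TOKEN EMBEDDING_MODEL || true
```

To run it against your real configuration, load the file first (`set -a; . ./.env.local; set +a` exports everything it defines), then call `check_env` with the full variable list.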

Start both the database and the bot:

docker compose up -d
pnpm run -F @proj-airi/telegram-bot start
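
If the bot starts but never replies, the token itself can be checked against the Telegram Bot API's getMe method (getMe is part of Telegram's public Bot API; the token below is the same placeholder as in the example above):

```shell
# getMe returns {"ok":true,...} with the bot's username for a valid token.
TELEGRAM_BOT_TOKEN="${TELEGRAM_BOT_TOKEN:-<Bot ID>:<Token>}"  # placeholder
api_url="https://api.telegram.org/bot${TELEGRAM_BOT_TOKEN}/getMe"
curl -s --max-time 5 "$api_url" || echo 'api.telegram.org not reachable'
```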