Version 1 (#160)

- New front-end
- Launch Chat API
- Manage Sources
- Enable re-embedding of all content
- Sources can now be added without a notebook
- Improved settings
- Enable model selector on all chats
- Background processing for a better experience
- Dark mode
- Improved Notes

Improved Docs:
- Remove all Streamlit references from documentation
- Update deployment guides with React frontend setup
- Fix Docker environment variables format (SURREAL_URL, SURREAL_PASSWORD)
- Update docker image tag from :latest to :v1-latest
- Change navigation references (Settings → Models to just Models)
- Update development setup to include frontend npm commands
- Add MIGRATION.md guide for users upgrading from Streamlit
- Update quick-start guide with correct environment variables
- Add port 5055 documentation for API access
- Update project structure to reflect frontend/ directory
- Remove outdated source-chat documentation files
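
The notes above mention the new `SURREAL_URL`/`SURREAL_PASSWORD` variable format and API access on port 5055. As a rough sketch of what an upgraded `docker.env` might look like (values are illustrative, and any variable name other than `SURREAL_URL` and `SURREAL_PASSWORD` is an assumption, not taken from this release):

```bash
# docker.env — illustrative sketch only, not the authoritative format
SURREAL_URL=ws://surrealdb:8000/rpc   # assumed URL shape; point at your SurrealDB host
SURREAL_PASSWORD=change_me            # named in the release notes above
OPENAI_API_KEY=your_openai_key_here   # provider key, matching the compose examples
```

Once the container is up, the REST API itself is served on port 5055.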
Luis Novo 2025-10-18 12:46:22 -03:00 committed by GitHub
parent 124d7d110c
commit b7e656a319
GPG key ID: B5690EEEBB952194
319 changed files with 46747 additions and 7408 deletions


@@ -7,9 +7,10 @@ This guide covers everything you need to deploy Open Notebook using Docker, from
 ## 📋 What You'll Get
 Open Notebook is a powerful AI-powered research and note-taking tool that:
+- Modern Next.js/React interface for a smooth user experience
 - Helps you organize research across multiple notebooks
 - Lets you chat with your documents using AI
-- Supports 15+ AI providers (OpenAI, Anthropic, Google, Ollama, and more)
+- Supports 16+ AI providers (OpenAI, Anthropic, Google, Ollama, and more)
 - Creates AI-generated podcasts from your content
 - Works with PDFs, web links, videos, audio files, and more
@@ -69,9 +70,10 @@ OpenAI provides everything you need to get started:
 ```yaml
 services:
   open_notebook:
-    image: lfnovo/open_notebook:latest-single
+    image: lfnovo/open_notebook:v1-latest-single
     ports:
-      - "8502:8502"
+      - "8502:8502" # Frontend
+      - "5055:5055" # API
     environment:
       - OPENAI_API_KEY=your_openai_key_here
     volumes:
@@ -102,7 +104,8 @@ OpenAI provides everything you need to get started:
 ```
 5. **Access the application**:
-   - Open your browser to: http://localhost:8502
+   - **Next.js UI**: http://localhost:8502 - Modern, responsive interface
+   - **API Documentation**: http://localhost:5055/docs - Full REST API access
    - You should see the Open Notebook interface!
 ### Step 4: Configure Your Models
@@ -112,9 +115,9 @@ Before creating your first notebook, configure your AI models:
 1. Click **"⚙️ Settings"** in the sidebar
 2. Click **"🤖 Models"** tab
 3. Configure these recommended models:
-   - **Language Model**: `gpt-4o-mini` (cost-effective)
+   - **Language Model**: `gpt-5-mini` (cost-effective)
    - **Embedding Model**: `text-embedding-3-small` (required for search)
-   - **Text-to-Speech**: `tts-1` (for podcast generation)
+   - **Text-to-Speech**: `gpt-4o-mini-tts` (for podcast generation)
    - **Speech-to-Text**: `whisper-1` (for audio transcription)
 4. Click **"Save"** after configuring all models
@@ -136,17 +139,17 @@ For production deployments or development, use the multi-container setup:
 ```yaml
 services:
   surrealdb:
-    image: surrealdb/surrealdb:latest
+    image: surrealdb/surrealdb:v1-latest
     ports:
       - "8000:8000"
     command: start --log trace --user root --pass root memory
     restart: always
   open_notebook:
-    image: lfnovo/open_notebook:latest
+    image: lfnovo/open_notebook:v1-latest
     ports:
-      - "8502:8502"
-      - "5055:5055"
+      - "8502:8502" # Next.js Frontend
+      - "5055:5055" # REST API
     env_file:
       - ./docker.env
     volumes:
@@ -198,7 +201,7 @@ OpenRouter gives you access to virtually every AI model through a single API:
 ```bash
 docker compose restart
 ```
-4. **Configure models** in Settings → Models
+4. **Configure models** in Models
 **Recommended OpenRouter models**:
 - `anthropic/claude-3-haiku` - Fast and cost-effective
@@ -229,7 +232,7 @@ Run AI models locally for complete privacy:
 ```
 Replace `192.168.1.100` with your actual IP.
-6. **Restart and configure** models in Settings → Models
+6. **Restart and configure** models in Models
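
The Ollama step above typically reduces to a single line in `docker.env`; a sketch, assuming the commonly used `OLLAMA_API_BASE` variable and Ollama's default port 11434 (neither is stated in this diff):

```bash
# docker.env — hypothetical variable name; replace 192.168.1.100 with your actual IP
OLLAMA_API_BASE=http://192.168.1.100:11434
```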
### Other Providers
@@ -273,7 +276,7 @@ This protects both the web interface and API endpoints.
 ```yaml
 services:
   surrealdb:
-    image: surrealdb/surrealdb:latest
+    image: surrealdb/surrealdb:v1-latest
     ports:
       - "127.0.0.1:8000:8000" # Bind to localhost only
     command: start --log warn --user root --pass root file:///mydata/database.db
@@ -287,7 +290,7 @@ services:
           cpus: "0.5"
   open_notebook:
-    image: lfnovo/open_notebook:latest
+    image: lfnovo/open_notebook:v1-latest
     ports:
       - "127.0.0.1:8502:8502"
       - "127.0.0.1:5055:5055"
@@ -464,7 +467,7 @@ ENABLE_ANALYTICS=false
 version: '3.8'
 services:
   surrealdb:
-    image: surrealdb/surrealdb:latest
+    image: surrealdb/surrealdb:v1-latest
     ports:
       - "8000:8000"
     command: start --log warn --user root --pass root file:///mydata/database.db
@@ -478,10 +481,10 @@ services:
       retries: 3
   open_notebook:
-    image: lfnovo/open_notebook:latest
+    image: lfnovo/open_notebook:v1-latest
     ports:
-      - "8502:8502"
-      - "5055:5055"
+      - "8502:8502" # Next.js Frontend
+      - "5055:5055" # REST API
     env_file:
       - ./docker.env
     volumes:
@@ -491,7 +494,7 @@ services:
         condition: service_healthy
     restart: always
     healthcheck:
-      test: ["CMD", "curl", "-f", "http://localhost:8502/health"]
+      test: ["CMD", "curl", "-f", "http://localhost:5055/health"]
       interval: 30s
       timeout: 10s
       retries: 3