API Configuration
Configure AI provider credentials through the Settings UI. No file editing required.
Credential System: Open Notebook uses encrypted credentials stored in the database. Each credential connects to a provider and allows you to discover, register, and test models.
Overview
Open Notebook manages AI provider access through a credential-based system:
- You create a credential for each provider (API key + settings)
- Credentials are encrypted and stored in the database
- You test connections to verify credentials work
- You discover and register models from each credential
- Models are linked to credentials for direct configuration
Encryption Setup
Before storing credentials, you must configure an encryption key.
Setting the Encryption Key
Add OPEN_NOTEBOOK_ENCRYPTION_KEY to your docker-compose.yml:
```yaml
environment:
  - OPEN_NOTEBOOK_ENCRYPTION_KEY=my-secret-passphrase
```
Any string works as a key — it will be securely derived via SHA-256 internally.
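The derivation works because Fernet expects a 32-byte key in URL-safe base64, and a SHA-256 digest is exactly 32 bytes. A minimal standard-library sketch of that idea (the function name is illustrative, not Open Notebook's actual API):

```python
import base64
import hashlib

def derive_fernet_key(passphrase: str) -> bytes:
    """Derive a Fernet-compatible key from an arbitrary passphrase.

    Fernet requires a 32-byte key encoded as URL-safe base64 (44 characters),
    so hashing the passphrase with SHA-256 and base64-encoding the digest
    turns any string into a valid key.
    """
    digest = hashlib.sha256(passphrase.encode("utf-8")).digest()  # 32 bytes
    return base64.urlsafe_b64encode(digest)  # 44-character URL-safe key

key = derive_fernet_key("my-secret-passphrase")
print(len(key))  # always 44, regardless of the passphrase
```

This is why the same passphrase must be supplied on every start: a different string derives a different key, and previously stored credentials can no longer be decrypted.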
Warning: If you change or lose the encryption key, all stored credentials become unreadable. Back up your encryption key securely and separately from your database backups.
Docker Secrets Support
Both password and encryption key support Docker secrets:
```yaml
# docker-compose.yml
services:
  open_notebook:
    environment:
      - OPEN_NOTEBOOK_PASSWORD_FILE=/run/secrets/app_password
      - OPEN_NOTEBOOK_ENCRYPTION_KEY_FILE=/run/secrets/encryption_key
    secrets:
      - app_password
      - encryption_key

secrets:
  app_password:
    file: ./secrets/password.txt
  encryption_key:
    file: ./secrets/encryption_key.txt
```
Encryption Details
API keys stored in the database are encrypted using Fernet (AES-128-CBC + HMAC-SHA256).
| Configuration | Behavior |
|---|---|
| Encryption key set | Keys encrypted with your key |
| No encryption key set | Storing credentials is disabled |
Accessing Credential Configuration
- Click Settings in the navigation bar
- Select API Keys tab
- You'll see existing credentials and an Add Credential button
Navigation: Settings → API Keys
Supported Providers
Cloud Providers
| Provider | Required Fields | Optional Fields |
|---|---|---|
| OpenAI | API Key | — |
| Anthropic | API Key | — |
| Google Gemini | API Key | — |
| Groq | API Key | — |
| Mistral | API Key | — |
| DeepSeek | API Key | — |
| xAI | API Key | — |
| OpenRouter | API Key | — |
| Voyage AI | API Key | — |
| ElevenLabs | API Key | — |
Local/Self-Hosted
| Provider | Required Fields | Notes |
|---|---|---|
| Ollama | Base URL | Typically http://localhost:11434 or http://ollama:11434 |
Enterprise
| Provider | Required Fields | Optional Fields |
|---|---|---|
| Azure OpenAI | API Key, Endpoint, API Version | Service-specific endpoints (LLM, Embedding, STT, TTS) |
| OpenAI-Compatible | Base URL | API Key, Service-specific configs |
| Vertex AI | Project ID, Location, Credentials Path | — |
Creating a Credential
Step 1: Add Credential
- Go to Settings → API Keys
- Click Add Credential
- Select your provider
- Give it a descriptive name (e.g., "My OpenAI Key", "Work Anthropic")
- Fill in the required fields (API key, base URL, etc.)
- Click Save
Step 2: Test Connection
- On your new credential card, click Test Connection
- Wait for the result:
| Result | Meaning |
|---|---|
| Success | Key is valid, provider accessible |
| Invalid API key | Check key format and value |
| Connection failed | Check URL, network, firewall |
Step 3: Discover Models
- Click Discover Models on the credential card
- The system queries the provider for available models
- Review the discovered models
Step 4: Register Models
- Select the models you want to use
- Click Register Models
- The models are now available throughout Open Notebook
Multi-Credential Support
Each provider can have multiple credentials. This is useful when:
- You have different API keys for different projects
- You want to test with different endpoints
- Multiple team members need separate credentials
Creating Multiple Credentials
- Click Add Credential again
- Select the same provider
- Fill in different credentials
- Each credential can discover and register its own models
How Models Link to Credentials
When you register models from a credential, those models are linked to that specific credential. This means:
- Each model knows which API key to use
- You can have models from different credentials for the same provider
- Deleting a credential removes its linked models
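In data-model terms, each registered model carries a reference back to its credential, which is what makes per-model key lookup and cascade deletion work. A simplified sketch (class and field names are illustrative, not the app's actual schema):

```python
from dataclasses import dataclass

@dataclass
class Credential:
    id: str
    provider: str
    encrypted_api_key: bytes  # stored encrypted; decrypted only when used

@dataclass
class Model:
    name: str
    credential_id: str  # each model knows which credential (and key) to use

credentials = {"cred-1": Credential("cred-1", "openai", b"<ciphertext>")}
models = [Model("gpt-4o", "cred-1")]

def delete_credential(cred_id: str) -> None:
    """Deleting a credential also removes its linked models."""
    credentials.pop(cred_id, None)
    models[:] = [m for m in models if m.credential_id != cred_id]

delete_credential("cred-1")
print(models)  # the linked model is gone along with the credential
```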
Testing Connections
Click Test Connection to verify your credential:
| Result | Meaning |
|---|---|
| Success | Key is valid, provider accessible |
| Invalid API key | Check key format and value |
| Connection failed | Check URL, network, firewall |
| Model not available | Key valid but model access restricted |
The test uses inexpensive models (e.g., gpt-3.5-turbo, claude-3-haiku) to minimize cost.
Configuring Specific Providers
Simple Providers (API Key Only)
For OpenAI, Anthropic, Google, Groq, Mistral, DeepSeek, xAI, OpenRouter:
- Add credential with your API key
- Test connection
- Discover and register models
Ollama (URL-Based)
- Add credential with provider Ollama
- Enter the base URL (e.g., http://ollama:11434)
- Test connection
- Discover and register models
Ollama allows localhost and private IPs since it runs locally.
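Base-URL validation treats local providers specially: private and loopback addresses are fine for Ollama, but link-local addresses (used by cloud metadata services) should stay blocked, including IPv4-mapped IPv6 forms like ::ffff:169.254.x.x. A sketch of such a check (illustrative, not the app's exact code):

```python
import ipaddress

def is_allowed_ollama_host(host: str) -> bool:
    """Allow localhost/private IPs for Ollama, but block link-local
    (cloud-metadata) addresses, including IPv4-mapped IPv6 forms."""
    try:
        ip = ipaddress.ip_address(host)
    except ValueError:
        # A hostname, not an IP: may only resolve inside the deployment
        # environment (e.g. Docker service names), so let it through.
        return True
    if isinstance(ip, ipaddress.IPv6Address) and ip.ipv4_mapped:
        ip = ip.ipv4_mapped  # unwrap ::ffff:a.b.c.d to its IPv4 form
    return not ip.is_link_local  # blocks 169.254.x.x metadata addresses

print(is_allowed_ollama_host("127.0.0.1"))               # loopback allowed
print(is_allowed_ollama_host("::ffff:169.254.169.254"))  # mapped form blocked
```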
Azure OpenAI
Azure requires multiple fields:
| Field | Example | Required |
|---|---|---|
| API Key | abc123... | Yes |
| Endpoint | https://myresource.openai.azure.com | Yes |
| API Version | 2024-02-15-preview | Yes |
| LLM Endpoint | https://myresource-llm.openai.azure.com | No |
| Embedding Endpoint | https://myresource-embed.openai.azure.com | No |
Service-specific endpoints override the main endpoint for that service type.
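The override rule is a simple fallback lookup per service type; roughly (a sketch with illustrative field names):

```python
def resolve_azure_endpoint(config: dict, service: str) -> str:
    """Return the endpoint for a service ('llm', 'embedding', 'stt', 'tts'),
    preferring a service-specific endpoint over the main one."""
    return config.get(f"{service}_endpoint") or config["endpoint"]

config = {
    "endpoint": "https://myresource.openai.azure.com",
    "llm_endpoint": "https://myresource-llm.openai.azure.com",
}
print(resolve_azure_endpoint(config, "llm"))        # service-specific wins
print(resolve_azure_endpoint(config, "embedding"))  # falls back to main
```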
OpenAI-Compatible
For custom OpenAI-compatible servers (LM Studio, vLLM, etc.):
- Add credential with provider OpenAI-Compatible
- Enter the base URL
- Enter API key (if required)
- Optionally configure per-service URLs
Supports separate configurations for:
- LLM (language models)
- Embedding
- STT (speech-to-text)
- TTS (text-to-speech)
Vertex AI
Google Cloud's enterprise AI platform:
| Field | Example |
|---|---|
| Project ID | my-gcp-project |
| Location | us-central1 |
| Credentials Path | /path/to/service-account.json |
Migrating from Environment Variables
If you have existing API keys in environment variables (from a previous version):
- Open Settings → API Keys
- A banner appears: "Environment variables detected"
- Click Migrate to Database
- Keys are copied to the database (encrypted)
- Original environment variables remain unchanged
Migration Behavior
| Scenario | Action |
|---|---|
| Key in env only | Migrated to database |
| Key in database only | No change |
| Key in both | Database version kept (skipped) |
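The table above amounts to a "database wins" merge: only env-var keys for providers without an existing credential are migrated. Roughly (a sketch with illustrative names):

```python
def plan_migration(env_keys: dict[str, str], db_providers: set[str]) -> dict[str, str]:
    """Return only the env-var keys whose provider has no credential yet;
    providers already in the database are skipped and their stored key kept."""
    return {
        provider: key
        for provider, key in env_keys.items()
        if provider not in db_providers
    }

env_keys = {"openai": "<env key>", "anthropic": "<env key>"}
already_in_db = {"anthropic"}
print(plan_migration(env_keys, already_in_db))  # only the openai key migrates
```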
After Migration
- Database credentials are used for all operations
- You can remove the API key environment variables from your docker-compose.yml
- Keep OPEN_NOTEBOOK_ENCRYPTION_KEY; it's still required to decrypt stored credentials
Migration Banner Visibility
The migration banner only appears when:
- You have API key environment variables configured
- Those providers don't already have credentials in the database
If all env providers are already migrated, the banner won't show.
Migrating from ProviderConfig (v1.1 → v1.2)
If you're upgrading from an older version that used the ProviderConfig system:
- The migration happens automatically on first startup
- Your existing configurations are converted to credentials
- Check Settings → API Keys to verify the migration succeeded
- If you see issues, check the API logs for migration messages
Key Storage Security
Encryption
API keys stored in the database are encrypted using Fernet (AES-128-CBC + HMAC-SHA256).
| Configuration | Behavior |
|---|---|
| Encryption key set | Keys encrypted with your key |
| No encryption key set | Storing API keys in database is disabled |
Default Credentials
| Setting | Default Value | Production Recommendation |
|---|---|---|
| Password | open-notebook-change-me | Set OPEN_NOTEBOOK_PASSWORD |
| Encryption Key | None (must be set) | Set OPEN_NOTEBOOK_ENCRYPTION_KEY to any secret string |
For production deployments, always set custom credentials.
Deleting Credentials
- Click the Delete button on the credential card
- Confirm deletion
- Credential and all its linked models are removed from the database
Troubleshooting
Credential Not Saving
| Symptom | Cause | Solution |
|---|---|---|
| Save button disabled | Empty or invalid input | Enter a valid key |
| Error on save | Encryption key not set | Set OPEN_NOTEBOOK_ENCRYPTION_KEY in docker-compose.yml |
| Error on save | Database connection issue | Check database status |
Test Connection Fails
| Error | Cause | Solution |
|---|---|---|
| Invalid API key | Wrong key or format | Verify key from provider dashboard |
| Connection refused | Wrong URL | Check base URL format |
| Timeout | Network issue | Check firewall, proxy settings |
| 403 Forbidden | IP restriction | Whitelist your server IP |
Migration Issues
| Problem | Solution |
|---|---|
| No migration banner | No env vars detected, or already migrated |
| Partial migration | Check error list, fix and retry |
| Keys not working after migration | Clear browser cache, restart services |
Provider Shows "Not Configured"
- Check if a credential exists for this provider (Settings → API Keys)
- Test the credential connection
- Verify key format matches provider requirements
- Re-discover and register models if needed
Provider-Specific Notes
OpenAI
- Keys start with sk-proj- (project keys) or sk- (legacy)
- Requires billing enabled on account
Anthropic
- Keys start with sk-ant-
- Check account has API access enabled
Google Gemini
- Keys start with AIzaSy
- Free tier has rate limits
Ollama
- No API key required
- Default URL: http://localhost:11434 (local) or http://ollama:11434 (Docker)
- Ensure Ollama server is running
Azure OpenAI
- Endpoint format: https://{resource-name}.openai.azure.com
- API version format: YYYY-MM-DD or YYYY-MM-DD-preview
- Deployment names are configured when registering models via the credential's Discover Models dialog
Related
- AI Providers — Provider setup instructions and recommendations
- Security — Password and encryption configuration
- Environment Reference — All configuration options