API Configuration
Configure AI provider credentials through the Settings UI. No file editing required.
Credential System: Open Notebook uses encrypted credentials stored in the database. Each credential connects to a provider and allows you to discover, register, and test models.
Overview
Open Notebook manages AI provider access through a credential-based system:
- You create a credential for each provider (API key + settings)
- Credentials are encrypted and stored in the database
- You test connections to verify credentials work
- You discover and register models from each credential
- Models are linked to credentials for direct configuration
Encryption Setup
Before storing credentials, you must configure an encryption key.
Setting the Encryption Key
Add OPEN_NOTEBOOK_ENCRYPTION_KEY to your docker-compose.yml:
environment:
- OPEN_NOTEBOOK_ENCRYPTION_KEY=my-secret-passphrase
Any string works as a key — it will be securely derived via SHA-256 internally.
Warning
: If you change or lose the encryption key, all stored credentials become unreadable. Back up your encryption key securely and separately from your database backups.
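The claim that "any string works" can be illustrated with a short sketch of SHA-256-based key derivation. This is illustrative only: the function name is hypothetical and Open Notebook's actual derivation code may differ in detail, but the shape is standard — hash the passphrase, then urlsafe-base64-encode the digest into the 44-byte key format Fernet expects.

```python
import base64
import hashlib

def derive_fernet_key(passphrase: str) -> bytes:
    """Hypothetical sketch: turn an arbitrary passphrase into a
    Fernet-compatible key. SHA-256 yields a 32-byte digest, and
    urlsafe base64 encodes it to the 44-byte form Fernet requires."""
    digest = hashlib.sha256(passphrase.encode("utf-8")).digest()
    return base64.urlsafe_b64encode(digest)
```

Because the derivation is deterministic, the same passphrase always yields the same key — which is also why losing the passphrase makes stored credentials unrecoverable.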
Docker Secrets Support
Both password and encryption key support Docker secrets:
# docker-compose.yml
services:
open_notebook:
environment:
- OPEN_NOTEBOOK_PASSWORD_FILE=/run/secrets/app_password
- OPEN_NOTEBOOK_ENCRYPTION_KEY_FILE=/run/secrets/encryption_key
secrets:
- app_password
- encryption_key
secrets:
app_password:
file: ./secrets/password.txt
encryption_key:
file: ./secrets/encryption_key.txt
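Any string works as a secret, but a high-entropy random value is good practice. A small stdlib sketch that generates the two secret files the compose example above mounts (paths match that example; run it from the project root):

```python
import secrets
from pathlib import Path

def make_secret() -> str:
    # 32 random bytes, urlsafe-base64 encoded: ample entropy for a passphrase
    return secrets.token_urlsafe(32)

secrets_dir = Path("secrets")
secrets_dir.mkdir(exist_ok=True)
(secrets_dir / "password.txt").write_text(make_secret())
(secrets_dir / "encryption_key.txt").write_text(make_secret())
```

Remember to keep a separate backup of `encryption_key.txt` — see the warning above about losing the encryption key.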
Encryption Details
API keys stored in the database are encrypted using Fernet (AES-128-CBC + HMAC-SHA256).
| Configuration | Behavior |
|---|---|
| Encryption key set | Keys encrypted with your key |
| No encryption key set | Storing credentials is disabled |
Accessing Credential Configuration
- Click Settings in the navigation bar
- Select API Keys tab
- You'll see existing credentials and an Add Credential button
Navigation: Settings → API Keys
Supported Providers
Cloud Providers
| Provider | Required Fields | Optional Fields |
|---|---|---|
| OpenAI | API Key | — |
| Anthropic | API Key | — |
| Google Gemini | API Key | — |
| Groq | API Key | — |
| Mistral | API Key | — |
| DeepSeek | API Key | — |
| xAI | API Key | — |
| OpenRouter | API Key | — |
| Voyage AI | API Key | — |
| ElevenLabs | API Key | — |
Local/Self-Hosted
| Provider | Required Fields | Notes |
|---|---|---|
| Ollama | Base URL | Typically http://localhost:11434 or http://ollama:11434 |
Enterprise
| Provider | Required Fields | Optional Fields |
|---|---|---|
| Azure OpenAI | API Key, URL Base (Azure endpoint) | Service-specific endpoints (LLM, Embedding, STT, TTS) |
| OpenAI-Compatible | Base URL | API Key, Service-specific configs |
| Vertex AI | Project ID, Location, Credentials Path | — |
Creating a Credential
Step 1: Add Credential
- Go to Settings → API Keys
- Click Add Credential
- Select your provider
- Give it a descriptive name (e.g., "My OpenAI Key", "Work Anthropic")
- Fill in the required fields (API key, base URL, etc.)
- Click Save
Step 2: Test Connection
- On your new credential card, click Test Connection
- Wait for the result:
| Result | Meaning |
|---|---|
| Success | Key is valid, provider accessible |
| Invalid API key | Check key format and value |
| Connection failed | Check URL, network, firewall |
Step 3: Discover Models
- Click Discover Models on the credential card
- The system queries the provider for available models
- Review the discovered models
Step 4: Register Models
- Select the models you want to use
- Click Register Models
- The models are now available throughout Open Notebook
Multi-Credential Support
Each provider can have multiple credentials. This is useful when:
- You have different API keys for different projects
- You want to test with different endpoints
- Multiple team members need separate credentials
Creating Multiple Credentials
- Click Add Credential again
- Select the same provider
- Fill in different credentials
- Each credential can discover and register its own models
How Models Link to Credentials
When you register models from a credential, those models are linked to that specific credential. This means:
- Each model knows which API key to use
- You can have models from different credentials for the same provider
- Deleting a credential removes its linked models
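The linkage described above can be sketched with a toy data model. The field names here are illustrative, not Open Notebook's actual schema — the point is that each model carries a reference to its credential, so deleting a credential cascades to its models:

```python
from dataclasses import dataclass

@dataclass
class Credential:
    id: str
    provider: str

@dataclass
class Model:
    name: str
    credential_id: str  # each model knows which credential (API key) to use

def delete_credential(cred_id, credentials, models):
    """Removing a credential also removes every model linked to it."""
    return ([c for c in credentials if c.id != cred_id],
            [m for m in models if m.credential_id != cred_id])
```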
Testing Connections
Click Test Connection to verify your credential:
| Result | Meaning |
|---|---|
| Success | Key is valid, provider accessible |
| Invalid API key | Check key format and value |
| Connection failed | Check URL, network, firewall |
| Model not available | Key valid but model access restricted |
Test uses inexpensive models (e.g., gpt-3.5-turbo, claude-3-haiku) to minimize cost.
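A generic pattern for a cheap connection check (not Open Notebook's actual tester) is to list models via `GET /v1/models`, which OpenAI-compatible endpoints expose: a 401 maps to "Invalid API key", a connection error to "Connection failed". A minimal sketch that builds such a request:

```python
from urllib.request import Request

def build_models_request(base_url: str, api_key: str) -> Request:
    """Sketch: prepare a GET /v1/models request against an
    OpenAI-compatible endpoint to verify key and connectivity."""
    url = base_url.rstrip("/") + "/v1/models"
    return Request(url, headers={"Authorization": f"Bearer {api_key}"})
```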
Configuring Specific Providers
Simple Providers (API Key Only)
For OpenAI, Anthropic, Google, Groq, Mistral, DeepSeek, xAI, OpenRouter:
- Add credential with your API key
- Test connection
- Discover and register models
Ollama (URL-Based)
- Add credential with provider Ollama
- Enter the base URL (e.g., http://ollama:11434)
- Test connection
- Discover and register models
Ollama allows localhost and private IPs since it runs locally.
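The localhost/private-IP allowance can be sketched with stdlib checks (an illustrative helper, not Open Notebook's actual validation; note that a plain Docker hostname like `ollama` resolves at runtime and is not classified here):

```python
import ipaddress
from urllib.parse import urlparse

def is_local_or_private(url: str) -> bool:
    """Return True when the URL points at localhost or a private/loopback IP."""
    host = urlparse(url).hostname
    if host == "localhost":
        return True
    try:
        ip = ipaddress.ip_address(host)
        return ip.is_private or ip.is_loopback
    except ValueError:
        # non-IP hostnames (e.g. "ollama") can't be classified without DNS
        return False
```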
Azure OpenAI
- Add credential with provider Azure OpenAI
- Enter your API key
- Enter your Azure endpoint in the URL Base field (e.g., https://myresource.openai.azure.com)
- Test connection
- Discover and register models
The URL Base field is automatically mapped to the Azure endpoint. The API version defaults to 2024-10-21 if not set via environment variable.
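The base_url-to-endpoint mapping can be sketched as follows. Function and field names are illustrative, not Open Notebook's actual code — the point is that Azure clients expect an `endpoint` setting, so the stored URL Base value is carried across, with the documented API-version default:

```python
def build_azure_config(credential: dict) -> dict:
    """Illustrative: map a stored credential's base_url to the
    `endpoint` field Azure OpenAI clients expect."""
    return {
        "api_key": credential["api_key"],
        "endpoint": credential["base_url"],
        # defaults to 2024-10-21 unless overridden
        "api_version": credential.get("api_version", "2024-10-21"),
    }
```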
OpenAI-Compatible
For custom OpenAI-compatible servers (LM Studio, vLLM, etc.):
- Add credential with provider OpenAI-Compatible
- Enter the base URL
- Enter API key (if required)
- Optionally configure per-service URLs
Supports separate configurations for:
- LLM (language models)
- Embedding
- STT (speech-to-text)
- TTS (text-to-speech)
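The per-service configuration can be sketched as a fallback lookup. Field names here (`llm_base_url`, `tts_base_url`, etc.) are hypothetical — the idea is that a per-service override wins, otherwise the shared base URL applies:

```python
def resolve_service_url(credential: dict, service: str) -> str:
    """Illustrative: a per-service URL (e.g. "tts_base_url") takes
    precedence over the credential's shared base_url."""
    return credential.get(f"{service}_base_url") or credential["base_url"]
```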
Vertex AI
Google Cloud's enterprise AI platform:
| Field | Example |
|---|---|
| Project ID | my-gcp-project |
| Location | us-central1 |
| Credentials Path | /path/to/service-account.json |
Migrating from Environment Variables
If you have existing API keys in environment variables (from a previous version):
- Open Settings → API Keys
- A banner appears: "Environment variables detected"
- Click Migrate to Database
- Keys are copied to the database (encrypted)
- Original environment variables remain unchanged
Migration Behavior
| Scenario | Action |
|---|---|
| Key in env only | Migrated to database |
| Key in database only | No change |
| Key in both | Database version kept (skipped) |
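The table above amounts to a simple "database wins" rule, sketched here with hypothetical names:

```python
def plan_migration(env_keys: dict, db_providers: set) -> dict:
    """Illustrative: env keys are migrated only for providers not
    already in the database; existing database entries are kept."""
    return {
        provider: ("skip" if provider in db_providers else "migrate")
        for provider in env_keys
    }
```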
After Migration
- Database credentials are used for all operations
- You can remove the API key environment variables from your docker-compose.yml
- Keep OPEN_NOTEBOOK_ENCRYPTION_KEY — it's still required
Migration Banner Visibility
The migration banner only appears when:
- You have environment variables configured
- Those providers are not already in the database

If all env-configured providers have already been migrated, the banner won't show.
Migrating from ProviderConfig (v1.1 → v1.2)
If you're upgrading from an older version that used the ProviderConfig system:
- The migration happens automatically on first startup
- Your existing configurations are converted to credentials
- Check Settings → API Keys to verify the migration succeeded
- If you see issues, check the API logs for migration messages
Key Storage Security
Default Credentials
| Setting | Default Value | Production Recommendation |
|---|---|---|
| Password | open-notebook-change-me | Set OPEN_NOTEBOOK_PASSWORD |
| Encryption Key | None (must be set) | Set OPEN_NOTEBOOK_ENCRYPTION_KEY to any secret string |
For production deployments, always set custom credentials.
Deleting Credentials
- Click the Delete button on the credential card
- Confirm deletion
- Credential and all its linked models are removed from the database
Troubleshooting
Credential Not Saving
| Symptom | Cause | Solution |
|---|---|---|
| Save button disabled | Empty or invalid input | Enter a valid key |
| Error on save | Encryption key not set | Set OPEN_NOTEBOOK_ENCRYPTION_KEY in docker-compose.yml |
| Error on save | Database connection issue | Check database status |
Test Connection Fails
| Error | Cause | Solution |
|---|---|---|
| Invalid API key | Wrong key or format | Verify key from provider dashboard |
| Connection refused | Wrong URL | Check base URL format |
| Timeout | Network issue | Check firewall, proxy settings |
| 403 Forbidden | IP restriction | Whitelist your server IP |
Migration Issues
| Problem | Solution |
|---|---|
| No migration banner | No env vars detected, or already migrated |
| Partial migration | Check error list, fix and retry |
| Keys not working after migration | Clear browser cache, restart services |
Provider Shows "Not Configured"
- Check if a credential exists for this provider (Settings → API Keys)
- Test the credential connection
- Verify key format matches provider requirements
- Re-discover and register models if needed
Provider-Specific Notes
OpenAI
- Keys start with
- Keys start with
sk-proj-(project keys) orsk-(legacy) - Requires billing enabled on account
Anthropic
- Keys start with
- Keys start with
sk-ant- - Check account has API access enabled
Google Gemini
- Keys start with
- Keys start with
AIzaSy - Free tier has rate limits
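The key prefixes noted above allow a quick client-side sanity check before saving a credential. A sketch (prefixes taken from this page; a matching prefix does not guarantee the key is valid, and provider names here are illustrative):

```python
KEY_PREFIXES = {
    "openai": ("sk-proj-", "sk-"),
    "anthropic": ("sk-ant-",),
    "gemini": ("AIzaSy",),
}

def looks_like_valid_key(provider: str, key: str) -> bool:
    """Cheap format check only; Test Connection is the real validation."""
    prefixes = KEY_PREFIXES.get(provider)
    return bool(prefixes) and key.startswith(prefixes)
```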
Ollama
- No API key required
- Default URL:
http://localhost:11434(local) orhttp://ollama:11434(Docker) - Ensure Ollama server is running
Azure OpenAI
- Enter your Azure endpoint in the URL Base field (format: https://{resource-name}.openai.azure.com)
- API version defaults to 2024-10-21; override via the AZURE_OPENAI_API_VERSION environment variable if needed
- Deployment names are configured separately when registering models via the credential's Discover Models dialog
Related
- AI Providers — Provider setup instructions and recommendations
- Security — Password and encryption configuration
- Environment Reference — All configuration options