feat: add persistent tiktoken cache to reduce re-downloads (#171)

Configure tiktoken to cache tokenizer encodings in ./data/tiktoken-cache
instead of the system temp directory. This prevents re-downloading
encoding files on every container restart and improves startup time.
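The fix hinges on ordering: tiktoken reads TIKTOKEN_CACHE_DIR from the environment when it loads an encoding, so the variable must be exported before tiktoken is imported or used. A stdlib-only sketch of that ordering (the `tiktoken-cache-demo` path is illustrative, not from the repository):

```python
import os
import tempfile

# The cache directory must be in the environment before any consumer
# (here, hypothetically tiktoken's loader) reads it at import time.
cache_dir = os.path.join(tempfile.gettempdir(), "tiktoken-cache-demo")
os.makedirs(cache_dir, exist_ok=True)
os.environ["TIKTOKEN_CACHE_DIR"] = cache_dir

# From here on, any encoding download lands in cache_dir instead of a
# fresh temp location, so it survives process restarts.
print(os.environ["TIKTOKEN_CACHE_DIR"])
```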

Changes:
- Add TIKTOKEN_CACHE_DIR configuration in config.py
- Set TIKTOKEN_CACHE_DIR environment variable in token_utils.py
- Bump version to 1.0.7
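The config.py side of the change might look like the following minimal sketch; the `DATA_FOLDER` variable and its default are assumptions for illustration, not the repository's verbatim code:

```python
import os

# Hypothetical config.py fragment: resolve the cache directory under
# the data folder (names assumed, not taken from the actual repo).
DATA_FOLDER = os.environ.get("DATA_FOLDER", "./data")
TIKTOKEN_CACHE_DIR = os.path.join(DATA_FOLDER, "tiktoken-cache")

# Create the directory up front so tiktoken can write encodings into it.
os.makedirs(TIKTOKEN_CACHE_DIR, exist_ok=True)
```

Creating the directory eagerly avoids a first-use failure if tiktoken does not create missing cache paths itself.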
Author: Luis Novo, 2025-10-19 14:50:52 -03:00 (committed via GitHub)
Parent: dd79d7a511
Commit: aa593c60bd
GPG key ID: B5690EEEBB952194
4 changed files with 23 additions and 12 deletions


@@ -3,6 +3,13 @@ Token utilities for Open Notebook.
 Handles token counting and cost calculations for language models.
 """
+import os
+
+from open_notebook.config import TIKTOKEN_CACHE_DIR
+
+# Set tiktoken cache directory before importing tiktoken to ensure
+# tokenizer encodings are cached persistently in the data folder
+os.environ["TIKTOKEN_CACHE_DIR"] = TIKTOKEN_CACHE_DIR
+
 def token_count(input_string: str) -> int:
     """