chore: export docs for custom gpt

This commit is contained in:
LUIS NOVO 2025-10-18 20:26:11 -03:00
parent fc4d73c9e8
commit 7059493143
5 changed files with 177 additions and 1 deletions

.gitignore vendored (+1)

@@ -128,6 +128,7 @@ claude-logs/
docs/custom_gpt
doc_exports/
specs/
.claude

Makefile (+7, -1)

@@ -1,6 +1,6 @@
 .PHONY: run frontend check ruff database lint api start-all stop-all status clean-cache worker worker-start worker-stop worker-restart
 .PHONY: docker-buildx-prepare docker-buildx-clean docker-buildx-reset
-.PHONY: docker-push docker-push-latest docker-release tag
+.PHONY: docker-push docker-push-latest docker-release tag export-docs

 # Get version from pyproject.toml
 VERSION := $(shell grep -m1 version pyproject.toml | cut -d'"' -f2)
@@ -178,6 +178,12 @@ status:
 	@echo "Next.js Frontend:"
 	@pgrep -f "next dev" >/dev/null && echo " ✅ Running" || echo " ❌ Not running"

+# === Documentation Export ===
+export-docs:
+	@echo "📚 Exporting documentation..."
+	@uv run python scripts/export_docs.py
+	@echo "✅ Documentation export complete!"
+
 # === Cleanup ===
 clean-cache:
 	@echo "🧹 Cleaning cache directories..."

frontend/.gitignore vendored (+2)

@@ -39,3 +39,5 @@ yarn-error.log*
 # typescript
 *.tsbuildinfo
 next-env.d.ts
+
+doc_exports/

scripts/README.md (new file, +75)

@@ -0,0 +1,75 @@
# Scripts Documentation

## export_docs.py

Consolidates markdown documentation files for use with ChatGPT or other platforms with file upload limits.

### What It Does

- Scans all subdirectories in the `docs/` folder
- For each subdirectory, combines all `.md` files (excluding `index.md` files)
- Creates one consolidated markdown file per subdirectory
- Saves all exported files to `doc_exports/` in the project root

### Usage

```bash
# Using Makefile (recommended)
make export-docs

# Or run directly with uv
uv run python scripts/export_docs.py

# Or run with standard Python
python scripts/export_docs.py
```

### Output

The script creates a `doc_exports/` directory with consolidated files such as:

- `getting-started.md` - All getting-started documentation
- `user-guide.md` - All user guide content
- `features.md` - All feature documentation
- `development.md` - All development documentation
- etc.

Each exported file includes:

- A main header with the folder name
- Section headers for each source file
- Source file attribution
- The complete content from each markdown file
- Visual separators between sections

### Example Output Structure

```markdown
# Getting Started

This document consolidates all content from the getting-started documentation folder.

---

## Installation

*Source: installation.md*

[Full content of installation.md]

---

## Quick Start

*Source: quick-start.md*

[Full content of quick-start.md]

---
```

### Notes

- The `doc_exports/` directory is gitignored and safe to regenerate at any time
- Index files (`index.md`) are automatically excluded
- Files are sorted alphabetically for consistent output
- The script processes subdirectories only (files in the root `docs/` folder are ignored)
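The exported titles and section headers are derived directly from folder and file names. The mapping the script applies (hyphens to spaces, then title case) can be sketched as:

```python
def folder_title(name: str) -> str:
    """Mirror the header derivation in scripts/export_docs.py:
    hyphens become spaces, then each word is title-cased."""
    return name.replace("-", " ").title()

print(folder_title("getting-started"))  # → Getting Started
print(folder_title("user-guide"))       # → User Guide
```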

scripts/export_docs.py (new file, +92)

@@ -0,0 +1,92 @@
#!/usr/bin/env python3
"""
Export documentation by consolidating markdown files from each docs folder.

This script:
1. Scans all subdirectories in the docs/ folder
2. For each subdirectory, concatenates all .md files (except index.md)
3. Saves the consolidated content to doc_exports/{folder_name}.md
"""
import logging
from pathlib import Path
from typing import List

# Configure logging
logging.basicConfig(level=logging.INFO, format="%(levelname)s: %(message)s")
logger = logging.getLogger(__name__)


def get_markdown_files(folder: Path) -> List[Path]:
    """Get all markdown files in a folder, excluding index.md files."""
    md_files = [f for f in folder.glob("*.md") if f.name.lower() != "index.md"]
    return sorted(md_files)  # Sort for consistent ordering


def consolidate_folder(folder: Path, output_dir: Path) -> None:
    """Consolidate all markdown files from a folder into a single file."""
    md_files = get_markdown_files(folder)
    if not md_files:
        logger.info(f"  Skipping {folder.name} - no markdown files found")
        return

    output_file = output_dir / f"{folder.name}.md"
    with output_file.open("w", encoding="utf-8") as outf:
        # Write header
        outf.write(f"# {folder.name.replace('-', ' ').title()}\n\n")
        outf.write(f"This document consolidates all content from the {folder.name} documentation folder.\n\n")
        outf.write("---\n\n")

        # Process each markdown file
        for md_file in md_files:
            logger.info(f"  Adding {md_file.name}")
            # Add section header with filename
            outf.write(f"## {md_file.stem.replace('-', ' ').title()}\n\n")
            outf.write(f"*Source: {md_file.name}*\n\n")
            # Add file content
            content = md_file.read_text(encoding="utf-8")
            outf.write(content)
            outf.write("\n\n---\n\n")

    logger.info(f"  ✓ Created {output_file.name} ({len(md_files)} files)")


def main():
    """Main function to export documentation."""
    # Define paths
    docs_dir = Path("docs")
    output_dir = Path("doc_exports")

    # Validate docs directory exists
    if not docs_dir.exists():
        logger.error(f"Documentation directory '{docs_dir}' not found")
        return

    # Create output directory
    output_dir.mkdir(exist_ok=True)
    logger.info(f"Output directory: {output_dir.absolute()}")

    # Get all subdirectories in docs/
    subdirs = [d for d in docs_dir.iterdir() if d.is_dir() and not d.name.startswith(".")]
    if not subdirs:
        logger.warning("No subdirectories found in docs/")
        return

    logger.info(f"Found {len(subdirs)} documentation folders\n")

    # Process each subdirectory
    for subdir in sorted(subdirs):
        logger.info(f"Processing {subdir.name}...")
        consolidate_folder(subdir, output_dir)

    logger.info("\n✓ Documentation export complete!")
    logger.info(f"Exported files are in: {output_dir.absolute()}")


if __name__ == "__main__":
    main()
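As a quick sanity check of the filtering and ordering rules in the script above, the following standalone sketch (the filenames are hypothetical) reproduces the `get_markdown_files` behavior against a temporary folder: `index.md` and non-markdown files are dropped, and the rest come back in alphabetical order.

```python
import tempfile
from pathlib import Path

def get_markdown_files(folder: Path) -> list[Path]:
    # Same rule as scripts/export_docs.py: keep *.md, skip index.md, sort
    return sorted(f for f in folder.glob("*.md") if f.name.lower() != "index.md")

with tempfile.TemporaryDirectory() as tmp:
    folder = Path(tmp)
    # Hypothetical docs folder contents
    for name in ["quick-start.md", "index.md", "installation.md", "notes.txt"]:
        (folder / name).write_text("stub", encoding="utf-8")
    names = [f.name for f in get_markdown_files(folder)]
    print(names)  # → ['installation.md', 'quick-start.md']
```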