WFGY/ProblemMap/GlobalFixMap/Enterprise_Knowledge_Gov/retention_policy.md
2025-09-05 10:47:50 +08:00


Retention Policy — Enterprise Knowledge Governance

🧭 Quick Return to Map

You are in a sub-page of Enterprise_Knowledge_Gov.
To reorient, go back here:

Think of this page as a desk within a ward.
If you need the full triage and all prescriptions, return to the Emergency Room lobby.

Guardrails and fix patterns for enterprise knowledge retention. Use this page when AI systems over-retain, delete too early, or mix expired data with active knowledge.


When to use this page

  • AI responses reference documents that should have been deleted per policy.
  • Retained snippets do not respect jurisdictional time limits (e.g., a 3-year retention window required by policy under GDPR's storage-limitation principle).
  • Knowledge base or embeddings store does not purge revisions.
  • RAG answers mix archived with active content.

Core acceptance targets

  • ΔS(question, expired_snippet) ≥ 0.70 → expired content must not surface.
  • All snippets carry {expiry_date, retention_scope, audit_hash} fields.
  • Coverage ≥ 0.70 within active retention window only.
  • λ remains convergent across three paraphrases and two seeds.
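The two measurable targets above can be probed with a few lines of code. A minimal sketch, assuming ΔS is computed as 1 − cosine similarity between embedding vectors (the usual WFGY convention) and snippets are plain dicts; `delta_s` and `passes_retention_gate` are hypothetical helper names, not part of any shipped API.

```python
import math

def delta_s(vec_a, vec_b):
    """Semantic stress as 1 - cosine similarity (assumed WFGY DeltaS convention)."""
    dot = sum(a * b for a, b in zip(vec_a, vec_b))
    norm = math.sqrt(sum(a * a for a in vec_a)) * math.sqrt(sum(b * b for b in vec_b))
    return 1.0 - dot / norm

def passes_retention_gate(snippet):
    """Metadata contract: every snippet must carry all three retention fields."""
    required = {"expiry_date", "retention_scope", "audit_hash"}
    return required.issubset(snippet)

# An expired snippet should sit far from the question: DeltaS >= 0.70.
question_vec = [1.0, 0.0, 0.0]
expired_vec = [0.1, 0.9, 0.4]
assert delta_s(question_vec, expired_vec) >= 0.70
assert passes_retention_gate({"expiry_date": "2025-12-31",
                              "retention_scope": "eu-3y",
                              "audit_hash": "sha256:...",
                              "text": "..."})
```

In production the vectors would come from your embedding model; the gate check itself is model-independent.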

Typical retention problems → exact fix

| Symptom | Likely cause | Open this |
|---|---|---|
| Expired docs still retrieved | Store never purged embeddings | vectorstore-fragmentation.md |
| Wrong answer mixes expired + active | Snippets missing `expiry_date` field | data-contracts.md |
| AI cites “archived only” docs as live | Retrieval trace missing retention scope | retrieval-traceability.md |

Fix in 60 seconds

  1. Check ΔS to expired content: run probe with expired snippets, expect ΔS ≥ 0.70.
  2. Schema enforcement: require expiry_date and retention_scope in every snippet.
  3. Index purge: remove expired embeddings before next RAG run.
  4. Audit λ: if λ flips when expired vs active co-exist, clamp with BBAM and enforce contracts.
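Step 3 above (index purge) can be sketched as a pre-run filter. This is a minimal sketch, not the shipped purge routine: it assumes snippets are dicts with an ISO `expiry_date` string, `purge_expired` is a hypothetical helper, and snippets with a missing or malformed date are treated as expired (fail closed) so they can never leak into retrieval.

```python
from datetime import date, datetime

def purge_expired(snippets, today=None):
    """Split a knowledge store into active vs expired before the next RAG run."""
    today = today or date.today()
    active, expired = [], []
    for s in snippets:
        raw = s.get("expiry_date")
        try:
            keep = datetime.strptime(raw, "%Y-%m-%d").date() >= today
        except (TypeError, ValueError):
            keep = False  # missing or malformed date -> fail closed
        (active if keep else expired).append(s)
    return active, expired

store = [
    {"snippet_id": "KB-5532", "expiry_date": "2025-12-31", "text": "..."},
    {"snippet_id": "KB-1001", "expiry_date": "2023-06-30", "text": "..."},
    {"snippet_id": "KB-9999", "text": "..."},  # no expiry_date: fail closed
]
active, expired = purge_expired(store, today=date(2025, 9, 5))
# active keeps only KB-5532; KB-1001 and KB-9999 go to the expired bucket
```

Run this split before every index rebuild, and delete the embeddings of everything in the expired bucket rather than merely hiding them.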

Copy-paste schema (JSON)

```json
{
  "snippet_id": "KB-5532",
  "expiry_date": "2025-12-31",
  "retention_scope": "eu-3y",
  "audit_hash": "sha256:...",
  "text": "..."
}
```
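A contract check for this schema might look like the following. The field patterns are illustrative assumptions inferred from the sample values (snippet IDs shaped like `KB-5532`, scopes like `eu-3y`), not a normative spec, and `validate_snippet` is a hypothetical helper.

```python
import re

# Assumed field shapes, derived from the sample record above.
FIELD_PATTERNS = {
    "snippet_id": re.compile(r"^KB-\d+$"),
    "expiry_date": re.compile(r"^\d{4}-\d{2}-\d{2}$"),
    "retention_scope": re.compile(r"^[a-z]{2}-\d+[ym]$"),
    "audit_hash": re.compile(r"^sha256:\S+$"),
    "text": re.compile(r"(?s)."),  # any non-empty body
}

def validate_snippet(snippet):
    """Return the list of violated fields; an empty list means the snippet is accepted."""
    errors = []
    for field, pattern in FIELD_PATTERNS.items():
        value = snippet.get(field)
        if not isinstance(value, str) or not pattern.match(value):
            errors.append(field)
    return errors
```

Rejecting snippets at ingest time is cheaper than filtering them at query time, so wire this check into the indexing pipeline rather than the retriever.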

Escalate when

  • Expired content continues to surface after purge.
  • ΔS < 0.70 against expired content → embeddings contamination.
  • Audit requires full deletion trace and cannot be reproduced.
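For the last escalation case, one way to make a deletion trace reproducible is to recompute the `audit_hash` from the snippet body and compare it against the logged value. A minimal sketch, assuming the hash is SHA-256 over the UTF-8 snippet text; both function names are hypothetical.

```python
import hashlib

def audit_hash(text):
    """Recompute a snippet's audit hash (assumed: SHA-256 over UTF-8 text)."""
    return "sha256:" + hashlib.sha256(text.encode("utf-8")).hexdigest()

def deletion_trace_reproducible(trace_entry, snippet_text):
    """A trace entry is reproducible when its logged hash matches the recomputed one."""
    return trace_entry["audit_hash"] == audit_hash(snippet_text)
```

If the logged hash cannot be reproduced from any retained copy of the body, treat the trace as broken and escalate.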

Use retrieval-playbook.md for deep purge testing and eval_rag_precision_recall.md to validate coverage.


🔗 Quick-Start Downloads

| Tool | Link | 3-Step Setup |
|---|---|---|
| WFGY 1.0 | PDF Engine Paper | 1 Download · 2 Upload to your LLM · 3 Ask “Answer using WFGY + ” |
| TXT OS (plain-text OS) | TXTOS.txt | 1 Download · 2 Paste into any LLM chat · 3 Type “hello world” — OS boots instantly |

🧭 Explore More

| Module | Description | Link |
|---|---|---|
| WFGY Core | WFGY 2.0 engine is live: full symbolic reasoning architecture and math stack | View → |
| Problem Map 1.0 | Initial 16-mode diagnostic and symbolic fix framework | View → |
| Problem Map 2.0 | RAG-focused failure tree, modular fixes, and pipelines | View → |
| Semantic Clinic Index | Expanded failure catalog: prompt injection, memory bugs, logic drift | View → |
| Semantic Blueprint | Layer-based symbolic reasoning & semantic modulations | View → |
| Benchmark vs GPT-5 | Stress test GPT-5 with full WFGY reasoning suite | View → |
| 🧙‍♂️ Starter Village 🏡 | New here? Lost in symbols? Let the wizard guide you through | Start → |

👑 Early Stargazers: See the Hall of Fame — Engineers, hackers, and open source builders who supported WFGY from day one.

WFGY Engine 2.0 is already unlocked. Star the repo to help others discover it and unlock more on the Unlock Board.

