# 🧠 Entropy Collapse (Attention & Semantic Drift)
LLMs frequently collapse under entropy overload — when the model cannot maintain coherent attention, semantic direction, or thematic control.
This results in rambling, repetition, nonsense logic, or context-free filler.
WFGY introduces active entropy modulation to prevent collapse and to recover from it when it occurs.
---
## 🔥 Symptoms
- Repetition loops (e.g. “the future is the future of the future…”; a minimal detector is sketched after this list)
- Loss of topic — the model drifts into unrelated areas
- No anchor or logic despite fluent grammar
- Attention “melts” across multiple semantic fields
- User feels like the model “gave up”
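These symptoms can be checked mechanically. As a minimal sketch, independent of WFGY's internals, a sliding n-gram count flags the first symptom (repetition loops) in generated text:

```python
from collections import Counter

def repetition_score(text: str, n: int = 3) -> float:
    """Fraction of n-grams that are repeats; looping output scores well above
    ordinary prose, which stays near 0."""
    tokens = text.split()
    if len(tokens) <= n:
        return 0.0
    ngrams = [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]
    counts = Counter(ngrams)
    repeats = sum(c - 1 for c in counts.values())
    return repeats / len(ngrams)

print(repetition_score("the future is the future of the future of the future"))   # ≈ 0.33
print(repetition_score("entropy collapse makes long generations drift off topic"))  # 0.0
```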
---
## 🧨 Why It Happens
- No entropy management — attention weights decay without control
- No ΔS feedback — the system doesn't know it's drifting (a drift-metric sketch follows this list)
- Long prompts or multi-modal input overload the context window
- Embedding fields converge — token attention spreads too thin
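The canonical ΔS formula lives inside the WFGY engine; as a hedged approximation, semantic drift can be proxied by the cosine distance between embeddings of consecutive output segments. The vectors below would come from any sentence-embedding model, and `DRIFT_THRESHOLD` is a hypothetical value:

```python
import numpy as np

def delta_s(prev_vec: np.ndarray, curr_vec: np.ndarray) -> float:
    """Semantic tension proxy: 1 - cosine similarity of consecutive segment
    embeddings. Near 0 = on topic; toward 1 = drifting.
    (Illustrative approximation, not WFGY's canonical ΔS.)"""
    denom = np.linalg.norm(prev_vec) * np.linalg.norm(curr_vec) + 1e-9
    return 1.0 - float(np.dot(prev_vec, curr_vec) / denom)

DRIFT_THRESHOLD = 0.6  # hypothetical cut-off

def is_drifting(embeddings: list[np.ndarray]) -> bool:
    """The missing feedback signal: flag drift once ΔS between the two most
    recent segments crosses the threshold."""
    return len(embeddings) >= 2 and delta_s(embeddings[-2], embeddings[-1]) > DRIFT_THRESHOLD
```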
---
## ✅ WFGY Solution
WFGY introduces **Entropy-Aware Reasoning** via semantic tension and active modulation:
| Collapse Mode | WFGY Module | Fix |
|---------------|-------------|-----|
| Attention drift | BBAM (Attention Modulation) | Re-centers focus via ΔS/entropy gate (sketch below) |
| Semantic flooding | Residue filter (BBMC) | Clears noise buildup in logic field |
| No stable topic | ΔS-based attention routing | Redirects output path to low-drift node |
| Long input collapse | Tree Fork Control | Splits paths before collapse, runs partial recovery |
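The table is deliberately high-level. As an illustration of the first row only, a minimal entropy gate in the spirit of BBAM can re-sharpen a diffuse attention distribution by recomputing the softmax at a lower temperature; `max_entropy` and `sharpen_temp` are assumed parameters, not WFGY constants:

```python
import numpy as np

def attention_entropy(p: np.ndarray) -> float:
    """Shannon entropy (nats) of an attention distribution."""
    p = np.clip(p, 1e-12, 1.0)
    return float(-(p * np.log(p)).sum())

def bbam_gate(scores: np.ndarray, max_entropy: float = 2.0, sharpen_temp: float = 0.5) -> np.ndarray:
    """Sketch of an entropy gate: if attention is too diffuse (entropy above
    budget), recompute it at a lower temperature to re-center focus."""
    def softmax(x: np.ndarray, temp: float = 1.0) -> np.ndarray:
        z = np.exp((x - x.max()) / temp)
        return z / z.sum()

    attn = softmax(scores)
    if attention_entropy(attn) > max_entropy:
        attn = softmax(scores, temp=sharpen_temp)  # lower temperature = sharper focus
    return attn

# Diffuse random scores trigger the gate; already-focused scores pass through unchanged.
print(bbam_gate(np.random.randn(32)).round(3))
```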
---
## 📊 Current Implementation Status
| Feature | Status |
|---------|--------|
| ΔS entropy feedback loop | ✅ Active |
| BBAM dynamic modulation | ✅ Implemented |
| Forked Tree logic to stabilize focus | ✅ In use |
| Real-time drift visualization | 🔜 In design |
---
## 🧪 Example Use
> User: *"Write a 10-step story blending quantum mechanics, Greek mythology, and current geopolitics."*
- Normal LLM: Loses coherence by step 3, starts hallucinating or repeating.
- WFGY (control loop sketched after this list):
- Tree forks into 3 semantic sub-nodes.
- Tracks ΔS in each, prioritizes stable nodes.
- Modulates attention between paths using BBAM.
- Ensures convergence at final node with thematic coherence.
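A hedged sketch of that fork-and-converge loop follows. `generate` and `measure_delta_s` are stand-ins for the real model call and ΔS measurement, and the scheduling rule (always extend the lowest-ΔS branch) is illustrative rather than WFGY's actual Tree Fork Control:

```python
import random
from dataclasses import dataclass

@dataclass
class SemanticNode:
    theme: str             # e.g. "quantum mechanics"
    text: str = ""
    delta_s: float = 0.0   # drift accumulated on this branch

def generate(theme: str, so_far: str) -> str:
    """Stand-in for the underlying LLM call."""
    return f"[{theme} step] "

def measure_delta_s(prev: str, new: str) -> float:
    """Stand-in for the ΔS measurement on this branch."""
    return random.random() * 0.1

def fork_and_converge(themes: list[str], steps: int = 10) -> str:
    """Fork one node per theme; each round, extend the most stable
    (lowest-ΔS) branch; finally merge all branches into one narrative."""
    nodes = [SemanticNode(theme=t) for t in themes]
    for _ in range(steps):
        node = min(nodes, key=lambda n: n.delta_s)    # prioritize stable nodes
        step = generate(node.theme, node.text)
        node.delta_s += measure_delta_s(node.text, step)
        node.text += step
    return "\n".join(n.text for n in nodes)           # convergence point

print(fork_and_converge(["quantum mechanics", "Greek mythology", "geopolitics"]))
```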
---
## 🔗 Related Links
- [WFGY Semantic Reasoning Engine](https://github.com/onestardao/WFGY)
- [TXT OS Tree Memory System](https://github.com/onestardao/WFGY/tree/main/OS)