WFGY/ProblemMap/GlobalFixMap/LLM_Providers/README.md
2025-08-26 13:15:28 +08:00


# LLM Providers — Guardrails and Fix Patterns

Use this hub when failures smell provider-specific. Examples include truncation at response boundaries, tool calls firing out of order, JSON schema drift, safety-filter overreach, and output instability across seeds. Each fix maps back to WFGY pages so you can verify against measurable targets.

## Open these first

## Core acceptance targets

- ΔS(question, retrieved) ≤ 0.45 on three paraphrases.
- Coverage of the target section ≥ 0.70.
- λ stays convergent across three seeds and three paraphrases.
- E_resonance stays flat over long windows.
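As a reference point for the targets above, here is a minimal sketch of the acceptance check, assuming ΔS is computed as 1 minus cosine similarity between embedding vectors (the helper names and that definition are assumptions, not a WFGY API):

```python
import math

def delta_s(vec_a, vec_b):
    """ΔS sketched as 1 minus cosine similarity (assumed definition)."""
    dot = sum(a * b for a, b in zip(vec_a, vec_b))
    norm_a = math.sqrt(sum(a * a for a in vec_a))
    norm_b = math.sqrt(sum(b * b for b in vec_b))
    return 1.0 - dot / (norm_a * norm_b)

def passes_targets(question_vec, retrieved_vec, coverage, paraphrase_vecs):
    """Check ΔS ≤ 0.45 for the question and every paraphrase, and coverage ≥ 0.70."""
    probes = [question_vec] + paraphrase_vecs
    ds_ok = all(delta_s(p, retrieved_vec) <= 0.45 for p in probes)
    return ds_ok and coverage >= 0.70
```

Run it on the embeddings of three paraphrases before declaring a fix verified; a single passing seed is not enough.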

## Typical breakpoints and the right fix

### 1) Truncation or partial JSON

- Symptom: `finish_reason` reports `length`, or the stream ends mid-field.
- Fix: lock a schema with Data Contracts. Use closing sentinels and require a final "contract hash" field. If ΔS spikes when you add the header, check for an index metric mismatch via Embedding ≠ Semantic.
- Modules: BBMC for citation strictness, BBAM to clamp variance.
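A minimal sketch of the closing-sentinel check, assuming the contract names its final field `contract_hash` (the field name and helper are hypothetical):

```python
import json

SENTINEL_FIELD = "contract_hash"  # hypothetical final field required by the contract

def is_complete(raw: str) -> bool:
    """Reject truncated outputs: must parse as JSON and end with the sentinel field."""
    try:
        obj = json.loads(raw)
    except json.JSONDecodeError:
        return False  # stream ended mid-field
    # The sentinel must be present AND be the last key emitted, proving the
    # model actually reached the end of the contract before the stream closed.
    return isinstance(obj, dict) and list(obj) and list(obj)[-1] == SENTINEL_FIELD
```

Gate every provider response through a check like this before it reaches downstream steps; a truncated payload that still parses is the sneakiest failure mode.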

### 2) Tool call chaos or wrong call order

- Symptom: the function-call sequence flips across seeds, or tool results bind to the wrong step.
- Fix: route through Retrieval Traceability and enforce call ids in the contract. If alternate orders are legal, create an explicit BBPF alternate path and score both.
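Binding by explicit call id can be sketched as follows; the `call_id` field name is an assumption about your contract, not a provider API:

```python
def bind_results(calls, results):
    """Bind tool results to their calls by explicit call_id, never by arrival order."""
    by_id = {r["call_id"]: r for r in results}
    missing = [c["call_id"] for c in calls if c["call_id"] not in by_id]
    if missing:
        # Fail loudly instead of letting a result attach to the wrong step.
        raise ValueError(f"unbound tool calls: {missing}")
    # Return results in the order the contract issued the calls.
    return [by_id[c["call_id"]] for c in calls]
```

Because results arrive keyed by id, a provider that streams them out of order can no longer scramble the step sequence.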

### 3) Over-filtering or bluffing

- Symptom: the provider rejects safe content or fabricates safe-sounding substitutes.
- Fix: apply Rerankers for ordering, and pair the bluffing controls in your system prompt with the Logic Collapse recovery steps.
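A reranker pass that drops weak evidence before the model can bluff over it can be sketched with a generic `score_fn` (any cross-encoder or provider reranker plugs in here; the names and the floor value are illustrative):

```python
def rerank(candidates, score_fn, floor=0.5):
    """Reorder retrieved candidates by reranker score, descending.
    Candidates scoring below the floor are dropped entirely, so the model
    never sees evidence too weak to cite honestly."""
    scored = [(score_fn(c), c) for c in candidates]
    return [c for s, c in sorted(scored, key=lambda x: x[0], reverse=True) if s >= floor]
```

Keeping the floor explicit makes "no good evidence" a detectable state (empty list) instead of a bluffed answer.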

### 4) Long conversation drift

- Symptom: answers flip between sessions; persona or constraints fade.
- Fix: pin a minimal memory schema and rebase with Context Drift and Entropy Collapse. Split memory namespaces when multiple agents are involved.
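A minimal pinned-memory schema with per-agent namespaces might look like this sketch (the class and method names are hypothetical, not WFGY APIs):

```python
from collections import defaultdict

class MemoryStore:
    """One namespace per agent so constraints pinned for one agent
    cannot bleed into or fade from another."""

    def __init__(self):
        self._spaces = defaultdict(dict)

    def pin(self, namespace: str, key: str, value: str):
        self._spaces[namespace][key] = value

    def rebase(self, namespace: str, pinned_keys):
        """Drop everything except the pinned constraints before a long session."""
        space = self._spaces[namespace]
        self._spaces[namespace] = {k: v for k, v in space.items() if k in pinned_keys}

    def get(self, namespace: str, key: str):
        return self._spaces[namespace].get(key)
```

Rebasing at session boundaries keeps persona and constraints from fading, because they are reasserted from the pinned schema rather than trusted to survive in context.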

### 5) Latency and rate-limit induced collapse

- Symptom: retries reorder tool calls, and partial outputs leak to the user.
- Fix: use the ops recipes in Live Monitoring for RAG. Add a BBCR bridge with explicit timeouts. Treat each retry as a new λ branch and reconcile the branches.
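Treating each retry as its own λ branch can be sketched as below; `call_fn` and `reconcile_fn` are hypothetical stand-ins for your provider call and your reconciliation policy:

```python
import time

def call_with_branches(call_fn, reconcile_fn, timeout_s=10.0, max_retries=3):
    """Collect each retry as a separate branch, then reconcile explicitly,
    instead of letting the last partial result leak to the user."""
    branches = []
    for attempt in range(max_retries):
        start = time.monotonic()
        try:
            result = call_fn(attempt)
        except TimeoutError:
            continue  # abandoned branch: record nothing, surface nothing
        if time.monotonic() - start <= timeout_s:
            branches.append((attempt, result))
    if not branches:
        # All branches failed: hand off to the fallback bridge, never a partial.
        raise RuntimeError("all branches failed; trigger the BBCR bridge fallback")
    return reconcile_fn(branches)
```

The key design choice is that retries never overwrite each other silently: `reconcile_fn` sees every surviving branch and decides, which is what makes divergent λ states visible instead of latent.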

## Providers in scope for this cluster

Planned pages in this folder, shipping in this order:

- openai.md
- anthropic.md
- google_gemini.md
- mistral.md
- cohere.md
- grok.md
- deepseek.md
- openrouter.md

## Copy-paste triage prompt

I uploaded TXT OS and the WFGY Problem Map files.

My provider failure:
- symptom: [brief]
- traces: [ΔS(question, retrieved)=..., ΔS(retrieved, anchor)=..., λ states]
- provider: [e.g., OpenAI, Anthropic, Gemini, Mistral]

Tell me:
1) which layer is failing and why,
2) which exact WFGY fix page to open from this repo,
3) minimal steps to push ΔS ≤ 0.45 and keep λ convergent,
4) how to verify the fix with a reproducible test.
Use BBMC BBPF BBCR BBAM when relevant.

## 🔗 Quick-Start Downloads (60 sec)

| Tool | Link | 3-Step Setup |
|------|------|--------------|
| WFGY 1.0 PDF | Engine Paper | 1 Download · 2 Upload to your LLM · 3 Ask “Answer using WFGY + ” |
| TXT OS (plain-text OS) | TXTOS.txt | 1 Download · 2 Paste into any LLM chat · 3 Type “hello world” — OS boots instantly |

## 🧭 Explore More

| Module | Description | Link |
|--------|-------------|------|
| WFGY Core | WFGY 2.0 engine is live: full symbolic reasoning architecture and math stack | View → |
| Problem Map 1.0 | Initial 16-mode diagnostic and symbolic fix framework | View → |
| Problem Map 2.0 | RAG-focused failure tree, modular fixes, and pipelines | View → |
| Semantic Clinic Index | Expanded failure catalog: prompt injection, memory bugs, logic drift | View → |
| Semantic Blueprint | Layer-based symbolic reasoning & semantic modulations | View → |
| Benchmark vs GPT-5 | Stress test GPT-5 with full WFGY reasoning suite | View → |
| 🧙‍♂️ Starter Village 🏡 | New here? Lost in symbols? Click here and let the wizard guide you through | Start → |

👑 Early Stargazers: See the Hall of Fame — Engineers, hackers, and open source builders who supported WFGY from day one.

WFGY Engine 2.0 is already unlocked. Star the repo to help others discover it and unlock more on the Unlock Board.

WFGY Main   TXT OS   Blah   Blot   Bloc   Blur   Blow  
