WFGY/ProblemMap/GrandmaClinic/README.md
2025-09-10 22:16:48 +08:00




Grandma Clinic — 16 AI Bugs Your Grandma Can Understand

Most AI fixes today happen after the model already failed: you see a wrong answer, then patch it with regex, rerankers, tools, or embeddings. That feels like firefighting — every new bug needs another patch.

A semantic firewall does the opposite: it checks the state before generation. If the signal looks unstable (semantic drift, broken logic, missing schema), you stop, reset, or redirect. Only a stable state is allowed to speak.

That is why each bug, once mapped, stays fixed. It is less like chasing errors, more like installing a smoke alarm in the kitchen: you prevent the fire before it starts.



The 16 Bugs (Grandma Clinic Edition)


No.1 Hallucination & Chunk Drift

Grandma story: You asked for basil. The waiter keeps bringing cilantro. Looks similar, tastes wrong.

Real scene: Retrieval pulls high-scoring but irrelevant chunks.

Fix tip: normalize embeddings, check metric choice (cosine vs L2). Doc: hallucination.md
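The fix tip above can be sketched in a few lines. This is a toy illustration (the vectors and names are invented): once embeddings are unit-normalized, cosine similarity and L2 distance rank chunks identically, so a metric mismatch can no longer flip results.

```python
import math

def normalize(v):
    # Scale a vector to unit length so cosine and L2 rankings agree.
    norm = math.sqrt(sum(x * x for x in v))
    return [x / norm for x in v]

def cosine(a, b):
    return sum(x * y for x, y in zip(a, b))  # vectors assumed normalized

def l2(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

query = normalize([0.9, 0.1])
basil = normalize([0.8, 0.2])      # hypothetical "right" chunk
cilantro = normalize([0.1, 0.9])   # high-scoring but wrong chunk

# After normalization both metrics prefer the same chunk.
assert cosine(query, basil) > cosine(query, cilantro)
assert l2(query, basil) < l2(query, cilantro)
```

In a real store you would normalize at ingest time and pick one metric for the whole index, not per query.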


No.2 Interpretation Collapse

Grandma story: You said “wash before cut”. He heard “cut before wash”. Same vegetables, wrong order.

Real scene: Logic misreads a correct chunk and flips the sequence.

Fix tip: insert mid-step checkpoints, reset on drift. Doc: retrieval-collapse.md
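A minimal sketch of mid-step checkpoints (the kitchen pipeline and `check` rule are invented for illustration): each step's output is validated before it becomes the new state, and a step that produces an incoherent order is simply discarded.

```python
def run_with_checkpoints(steps, check):
    """Run steps in order; if a mid-step checkpoint fails,
    discard that step's output instead of continuing on drift."""
    state, history = {}, []
    for step in steps:
        candidate = step(dict(state))  # work on a copy
        if check(candidate):           # checkpoint: still coherent?
            state = candidate
            history.append(step.__name__)
        # else: reset — keep the last good state
    return state, history

# Hypothetical kitchen pipeline where order matters (wash before cut).
def wash(s):
    s["washed"] = True
    return s

def cut(s):
    s["cut"] = True
    return s

def check(s):
    # Illegal to cut before washing.
    return not (s.get("cut") and not s.get("washed"))

state, history = run_with_checkpoints([cut, wash, cut], check)
# The premature `cut` is rejected; only wash -> cut survives.
```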


No.3 Long Reasoning Chains

Grandma story: Sent him for 3 groceries, he came back with 5 bags of snacks. Goal forgotten.

Real scene: Multi-step reasoning drifts off target.

Fix tip: segment, measure λ diversity, converge later. Doc: context-drift.md


No.4 Bluffing / Overconfidence

Grandma story: He swears this is the best restaurant, but won't give the address. You can't trust that.

Real scene: Confident answers without citations.

Fix tip: citation-first policy. Doc: bluffing.md
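One way to enforce a citation-first policy is a gate in front of the answer. A minimal sketch, assuming a `[S1]`-style citation format and a hypothetical source store: anything without a citation, or citing a source not in the store, is refused.

```python
import re

KNOWN_SOURCES = {"S1": "menu.pdf", "S2": "reviews.html"}  # hypothetical store

def citation_first(answer, sources):
    """Refuse any answer that cites nothing, or cites a source
    that is not actually in the store."""
    cited = set(re.findall(r"\[(S\d+)\]", answer))
    if not cited:
        return "refused: no citations"
    unknown = cited - sources.keys()
    if unknown:
        return "refused: unknown sources " + ", ".join(sorted(unknown))
    return answer
```

The address has to come with the recommendation, or the recommendation does not ship.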


No.5 Semantic ≠ Embedding

Grandma story: White pepper vs black pepper. Same “pepper” word, different flavor.

Real scene: Cosine similarity high, meaning misaligned.

Fix tip: normalize vectors, re-weight hybrid retrievers. Doc: embedding-vs-semantic.md
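Re-weighting a hybrid retriever can be as simple as a weighted blend. A toy sketch with invented scores: the embeddings rate both peppers as near-identical, but the sparse keyword signal separates them, so lowering `alpha` lets the exact word win.

```python
def hybrid_score(dense, sparse, alpha=0.5):
    """Blend dense (embedding) and sparse (keyword) scores.
    Lower alpha when embeddings conflate near-synonyms."""
    return alpha * dense + (1 - alpha) * sparse

# Hypothetical scores for the query "white pepper":
# embeddings say both peppers match; keyword match tells them apart.
docs = {
    "white pepper": hybrid_score(dense=0.92, sparse=0.90, alpha=0.3),
    "black pepper": hybrid_score(dense=0.91, sparse=0.10, alpha=0.3),
}
best = max(docs, key=docs.get)
```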


No.6 Logic Collapse & Recovery

Grandma story: Walking the alleys, you keep hitting the same wall. Just step back and try another lane.

Real scene: Dead-end loops, repeating failed steps.

Fix tip: enforce reset after N steps, block illegal paths. Doc: logic-collapse.md
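A minimal sketch of both ideas at once (the alley map is invented): dead ends are remembered so they are never retried, and a step budget forces a clean give-up instead of an infinite loop.

```python
def explore(start, neighbors, is_goal, max_steps=10):
    """Walk the alleys, but block dead ends from being retried and
    reset to the start instead of looping on the same wall."""
    visited, path, node, steps = {start}, [start], start, 0
    while not is_goal(node):
        steps += 1
        if steps > max_steps:
            return None                  # give up cleanly, no infinite loop
        options = [n for n in neighbors(node) if n not in visited]
        if not options:
            node, path = start, [start]  # step back, try another lane
            continue
        node = options[0]
        visited.add(node)                # this path will not be retried
        path.append(node)
    return path

# Hypothetical alley map: "wall" is a dead end, "lane" leads out.
alleys = {"start": ["wall", "lane"], "wall": [], "lane": ["exit"], "exit": []}
route = explore("start", alleys.__getitem__, lambda n: n == "exit")
```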


No.7 Memory Breaks Across Sessions

Grandma story: Yesterday you said add salt. Today you ask again if salt is needed. Annoying.

Real scene: Session memory lost or overwritten.

Fix tip: pin state keys, replay essentials each turn. Doc: memory-coherence.md
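A minimal sketch of pinned state keys (the key names are invented): the essential facts are replayed at the top of every prompt, so they survive even when the rest of the session state is churned.

```python
PINNED_KEYS = {"dietary", "salt_added"}  # facts that must survive every turn

def build_prompt(turn_input, session_state):
    """Replay the pinned facts at the top of every prompt,
    so the model is never asked to remember them on its own."""
    pinned = [f"{k}: {session_state[k]}"
              for k in sorted(PINNED_KEYS & session_state.keys())]
    return "\n".join(["[pinned facts]", *pinned, "[user]", turn_input])

state = {"dietary": "low sodium", "salt_added": "yes, yesterday", "scratch": "tmp"}
prompt = build_prompt("Do we need salt?", state)
```

Scratch keys stay out of the prompt; only the pinned essentials ride along.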


No.8 Debugging = Black Box

Grandma story: Soup tastes bad, but no one wrote down which spice was added. Impossible to know.

Real scene: Missing trace IDs, no visibility into retrieval path.

Fix tip: add chunk IDs and store IDs in logs. Doc: retrieval-traceability.md
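A minimal sketch of retrieval traceability (chunk shape and store names are invented): every retrieval writes one structured log line with a trace id, the chunk ids it used, and the store they came from.

```python
import json
import uuid

def retrieve_with_trace(query, chunks, log):
    """Tag every retrieval with a trace id plus the chunk and store ids
    it used, so a bad answer can be traced back to its ingredients."""
    trace_id = str(uuid.uuid4())
    hits = [c for c in chunks if query.lower() in c["text"].lower()]
    log.append(json.dumps({
        "trace_id": trace_id,
        "query": query,
        "chunk_ids": [c["id"] for c in hits],
        "store_ids": sorted({c["store"] for c in hits}),
    }))
    return trace_id, hits

log = []
chunks = [
    {"id": "c1", "store": "s-main", "text": "add basil at the end"},
    {"id": "c2", "store": "s-main", "text": "cilantro garnish"},
]
trace_id, hits = retrieve_with_trace("basil", chunks, log)
```

Return the trace id with the answer, and "which spice was added" becomes a log lookup.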


No.9 Entropy Collapse

Grandma story: Stew simmered too long, everything turned into mushy porridge.

Real scene: Attention melts in long contexts.

Fix tip: split into sections, re-anchor mid-window. Doc: entropy-collapse.md
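One simple form of re-anchoring, sketched with invented parameters: split the long text into windows and restate the original instruction (the anchor) at the start of each one, so the goal is repeated mid-context instead of melting away.

```python
def split_and_anchor(text, anchor, max_len=80):
    """Split a long context into windows and prepend the original
    instruction (the anchor) to each one."""
    windows, current = [], []
    for word in text.split():
        current.append(word)
        if len(" ".join(current)) >= max_len:
            windows.append(anchor + " | " + " ".join(current))
            current = []
    if current:
        windows.append(anchor + " | " + " ".join(current))
    return windows
```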


No.10 Creative Freeze

Grandma story: It's edible but dull. Like microwaved food, no surprise.

Real scene: Outputs too literal, no variation.

Fix tip: force diversity first, then select best. Doc: creative-freeze.md
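The diverge-then-select pattern in miniature (generator and scorer are toys): sample several distinct candidates first, then pick the best, instead of committing to the first literal answer.

```python
import random

def diverse_then_best(generate, score, n=5, seed=0):
    """Generate several distinct candidates first, then select,
    instead of committing to the first output."""
    rng = random.Random(seed)
    candidates = {generate(rng) for _ in range(n)}  # set forces variety
    return max(candidates, key=score)

# Hypothetical generator and toy scorer (longer = more elaborate).
ideas = ["plain rice", "herb rice", "saffron rice"]
pick = diverse_then_best(lambda rng: rng.choice(ideas), score=len)
```

In practice the scorer would be a judge model or a task metric, not string length.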


No.11 Symbolic Collapse

Grandma story: Family recipe reduced to plain words. Every cook interprets differently, flavor gone.

Real scene: Equations, tables, code flattened into prose.

Fix tip: preserve the symbol channel, don't paraphrase. Doc: symbolic-collapse.md


No.12 Philosophical Recursion

Grandma story: He keeps asking “Who am I?” while the food gets cold.

Real scene: Self-reference loops, paradox recursion.

Fix tip: define base facts before meta reflection. Doc: philosophical-recursion.md


No.13 Multi-Agent Chaos

Grandma story: Two cooks both add salt, soup turns inedible. They grabbed the same spoon.

Real scene: Agents overwrite each other's memory.

Fix tip: role partitioning, serialized writes. Doc: Multi-Agent_Problems.md
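Both fixes fit in one small class, sketched here with invented agent names: each agent gets its own namespace (role partitioning), and a lock serializes writes so two cooks can never grab the same spoon at once.

```python
import threading

class PartitionedMemory:
    """Each agent writes only to its own namespace, and writes are
    serialized with a lock."""
    def __init__(self):
        self._lock = threading.Lock()
        self._store = {}

    def write(self, agent, key, value):
        with self._lock:  # serialize all writes
            self._store.setdefault(agent, {})[key] = value

    def read(self, agent, key):
        return self._store.get(agent, {}).get(key)

mem = PartitionedMemory()
mem.write("cook_a", "salt", "1 tsp")
mem.write("cook_b", "salt", "none")  # does not clobber cook_a's entry
```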


No.14 Bootstrap Ordering

Grandma story: You dropped veggies into a cold pan. No sizzle, all soggy.

Real scene: Query runs before index is built.

Fix tip: readiness probes, ingest before queries. Doc: bootstrap-ordering.md
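A minimal readiness-probe sketch (the `Index` class is a stand-in for a real vector store): queries are gated until the probe reports ready, and startup is refused if it never does.

```python
def wait_until_ready(probe, attempts=5):
    """Poll a readiness probe before accepting queries; refuse to
    start if the index never comes up (no veggies in a cold pan)."""
    for _ in range(attempts):
        if probe():
            return True
    return False

# Hypothetical index that only becomes ready on the third probe.
class Index:
    def __init__(self):
        self.ticks = 0

    def probe(self):
        self.ticks += 1
        return self.ticks >= 3

idx = Index()
ready = wait_until_ready(idx.probe)  # queries are gated on this flag
```

A real probe would sleep between attempts and check the actual ingest watermark, not a tick counter.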


No.15 Deployment Deadlock

Grandma story: Two people block the doorway, each waiting for the other to move. Nobody passes.

Real scene: Circular waits in infra migration.

Fix tip: timeout, staged rollout, or read-only fallback. Doc: deployment-deadlock.md
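The timeout-plus-fallback idea in miniature (step functions are invented): a migration step gets a bounded number of attempts, and on timeout the system drops to a read-only fallback instead of holding the doorway forever.

```python
def migrate(step, deadline_steps=3, fallback="read-only"):
    """Try a migration step a bounded number of times; on timeout,
    drop to a read-only fallback instead of waiting forever."""
    for attempt in range(1, deadline_steps + 1):
        if step(attempt) == "done":
            return "migrated"
    return fallback  # break the circular wait

# Hypothetical steps: one is blocked every time, one succeeds eventually.
blocked = migrate(lambda attempt: "waiting")
unblocked = migrate(lambda attempt: "done" if attempt == 2 else "waiting")
```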


No.16 Pre-Deploy Collapse

Grandma story: First pot burned. Gas off, pot dirty, spices missing. Of course it failed.

Real scene: First API call hits missing env var, index not ready, version skew.

Fix tip: pre-flight checklist: ENV + index + schema. Doc: predeploy-collapse.md
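The checklist above can be one function that runs before the first call. A minimal sketch (env var names and schema versions are invented): it collects every failure at once, instead of burning the first pot on the first missing item.

```python
import os

def preflight(required_env, index_ready, schema_version, expected_schema):
    """Run the whole checklist before the first call; report every
    failure at once instead of failing one item at a time."""
    failures = []
    for var in required_env:
        if var not in os.environ:
            failures.append(f"missing env: {var}")
    if not index_ready:
        failures.append("index not ready")
    if schema_version != expected_schema:
        failures.append(f"schema skew: {schema_version} != {expected_schema}")
    return failures

os.environ["API_KEY"] = "dummy"  # simulate one configured variable
failures = preflight(["API_KEY", "VECTOR_DB_URL"], index_ready=False,
                     schema_version="v1", expected_schema="v2")
```

Serve traffic only when `failures` is empty; otherwise print the whole list.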


How to Actually Fix

  1. Open Doctor WFGY
  2. Paste your bug or screenshot.
  3. Ask: “Which Problem Map number am I hitting? Show me the Grandma version fix.”

The doctor will:

  • Map your case to the right number
  • Explain in grandma style
  • Give the technical reference and minimal repair

Why this MVP matters

These 16 bugs are not random. They are structural, reproducible, and inevitable once you scale an AI pipeline. With the semantic firewall, you fix once, and the bug never comes back. That is what makes WFGY different: the map is not just a list — it is an X-ray, with both the grandma explanation and the technical suture.

