mirror of
https://github.com/onestardao/WFGY.git
synced 2026-04-28 11:40:07 +00:00
# BitsAndBytes (bnb): Guardrails and Fix Patterns

<details>
<summary><strong>🧭 Quick Return to Map</strong></summary>

<br>

> You are in a sub-page of **LocalDeploy_Inference**.
> To reorient, go back here:
>
> - [**LocalDeploy_Inference** — on-prem deployment and model inference](./README.md)
> - [**WFGY Global Fix Map** — main Emergency Room, 300+ structured fixes](../README.md)
> - [**WFGY Problem Map 1.0** — 16 reproducible failure modes](../../README.md)
>
> Think of this page as a desk within a ward.
> If you need the full triage and all prescriptions, return to the Emergency Room lobby.

</details>

BitsAndBytes provides 8-bit and 4-bit optimizers and quantized linear layers for large language models.
It enables training and inference under constrained VRAM, but introduces specific stability and semantic risks.
This page maps common bnb issues to structural fixes in the WFGY Problem Map with measurable acceptance gates.

---

## Open these first

- Visual map and recovery: [RAG Architecture & Recovery](https://github.com/onestardao/WFGY/blob/main/ProblemMap/rag-architecture-and-recovery.md)
- End-to-end retrieval knobs: [Retrieval Playbook](https://github.com/onestardao/WFGY/blob/main/ProblemMap/retrieval-playbook.md)
- Embedding vs meaning: [Embedding ≠ Semantic](https://github.com/onestardao/WFGY/blob/main/ProblemMap/embedding-vs-semantic.md)
- Chunk schema: [Chunking Checklist](https://github.com/onestardao/WFGY/blob/main/ProblemMap/chunking-checklist.md)
- Collapse and entropy: [Logic Collapse](https://github.com/onestardao/WFGY/blob/main/ProblemMap/logic-collapse.md), [Entropy Collapse](https://github.com/onestardao/WFGY/blob/main/ProblemMap/entropy-collapse.md)
- Ordering and boot issues: [Bootstrap Ordering](https://github.com/onestardao/WFGY/blob/main/ProblemMap/bootstrap-ordering.md), [Pre-deploy Collapse](https://github.com/onestardao/WFGY/blob/main/ProblemMap/predeploy-collapse.md)

---

## Core acceptance

- ΔS drift between FP16 and bnb ≤ 0.12
- Coverage ≥ 0.70 for the target section
- λ convergent across 3 paraphrases and 2 seeds
- Optimizer variance < 5% vs FP16 baseline after 1k steps

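The first gate can be scripted. As a minimal sketch, assuming you can pull one output embedding per QA pair from each run and approximating ΔS as 1 − cosine similarity (both are assumptions here, not the official WFGY definition):

```python
import math

def delta_s(vec_fp16, vec_bnb):
    # Assumed proxy: 1 - cosine similarity between paired output embeddings
    dot = sum(a * b for a, b in zip(vec_fp16, vec_bnb))
    norm_a = math.sqrt(sum(a * a for a in vec_fp16))
    norm_b = math.sqrt(sum(b * b for b in vec_bnb))
    return 1.0 - dot / (norm_a * norm_b)

# Toy stand-ins for real embeddings of the 10 QA pairs
pairs = [([1.0, 0.0, 1.0], [0.9, 0.1, 1.0]),
         ([0.5, 0.5, 0.0], [0.5, 0.4, 0.1])]
avg_drift = sum(delta_s(a, b) for a, b in pairs) / len(pairs)
assert avg_drift <= 0.12, f"ΔS gate failed: {avg_drift:.3f}"
```

Run the same loop once per quantization config; a config that fails the 0.12 gate here is a candidate for the compute-dtype fixes in the table below.
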
---
## Typical BitsAndBytes breakpoints → exact fix

| Symptom | Likely cause | Open this |
|---|---|---|
| Model loads but output ΔS drifts > 0.20 | Incorrect `bnb_4bit_compute_dtype` or `bnb_4bit_quant_type` | [Embedding ≠ Semantic](https://github.com/onestardao/WFGY/blob/main/ProblemMap/embedding-vs-semantic.md), [Retrieval Traceability](https://github.com/onestardao/WFGY/blob/main/ProblemMap/retrieval-traceability.md) |
| Optimizer unstable, NaNs appear | Adam8bit variance clamp missing | [Logic Collapse](https://github.com/onestardao/WFGY/blob/main/ProblemMap/logic-collapse.md), [Entropy Collapse](https://github.com/onestardao/WFGY/blob/main/ProblemMap/entropy-collapse.md) |
| GPU memory savings not visible | Linear modules not replaced, or `prepare_model_for_kbit_training` skipped | [Bootstrap Ordering](https://github.com/onestardao/WFGY/blob/main/ProblemMap/bootstrap-ordering.md) |
| Synthesis diverges at long steps | Quantization noise accumulates | [Entropy Collapse](https://github.com/onestardao/WFGY/blob/main/ProblemMap/entropy-collapse.md), [Rerankers](https://github.com/onestardao/WFGY/blob/main/ProblemMap/rerankers.md) |
| JSON outputs break format | Schema loose, minor errors amplified | [Data Contracts](https://github.com/onestardao/WFGY/blob/main/ProblemMap/data-contracts.md) |

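The "quantization noise accumulates" row is easy to reproduce in miniature. The sketch below uses a coarse uniform rounding grid as a stand-in for 4-bit quantization (an assumption, not real nf4 behavior), re-quantizes a running value at every step, and tracks how far it drifts from the exact trajectory:

```python
import random

def quantize(x, step=0.05):
    # Coarse rounding grid as a stand-in for 4-bit quantization
    return round(x / step) * step

random.seed(0)
exact, noisy = 0.0, 0.0
drift_at = {}
for t in range(1, 201):
    delta = random.uniform(-1.0, 1.0)
    exact += delta                    # full-precision trajectory
    noisy = quantize(noisy + delta)   # rounding error injected every step
    if t in (10, 100, 200):
        drift_at[t] = abs(noisy - exact)

# Each step adds at most step/2 of error, so drift is bounded but grows with chain length
print(drift_at)
```

This is the mechanism the long-step entropy checks are guarding against: per-step error is tiny, but it compounds over long synthesis chains.
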
---
## Fix in 60 seconds

1) **Measure ΔS**
   Compare the FP16 baseline with the bnb quantized run on 10 QA pairs.
   Acceptable drift ≤ 0.12.

2) **Probe λ_observe**
   Vary retrieval k. If λ flips divergent, lock the schema order and apply BBAM.

3) **Apply the module**
   - Retrieval drift → BBMC + Retrieval Traceability
   - Optimizer instability → switch to Adam8bit with variance clamp
   - Long-chain collapse → BBPF + rerankers

4) **Verify**
   Coverage ≥ 0.70, λ convergent, entropy stable on 3 paraphrases.

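The coverage gate in step 4 can also be scripted. As a sketch, coverage is approximated here as token-set overlap between the answer and the target section; this token definition is an assumption, not the official WFGY metric:

```python
def coverage(answer, target_section):
    # Fraction of target-section tokens that also appear in the answer
    target = set(target_section.lower().split())
    answer_tokens = set(answer.lower().split())
    return len(target & answer_tokens) / len(target)

target = "nf4 double quant keeps drift low"
answers = [
    "nf4 with double quant keeps drift low",
    "drift stays low because nf4 double quant keeps error small",
    "using nf4 double quant keeps semantic drift low",
]
scores = [coverage(a, target) for a in answers]
assert all(s >= 0.70 for s in scores), scores
```

Run it across the 3 paraphrases; any paraphrase falling under 0.70 means the quantized model is losing the section, not just rewording it.
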
---
## Copy-paste recipes
### A) Load 4-bit quantized model with bnb

```python
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_use_double_quant=True,     # quantize the quantization constants too
    bnb_4bit_quant_type="nf4",          # NormalFloat4, the usual choice for LLM weights
    bnb_4bit_compute_dtype="bfloat16",  # compute dtype is the main ΔS drift knob
)

model_id = "your-model"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",
)
```

### B) Enable 8-bit optimizer

```python
import bitsandbytes as bnb

# Adam8bit lives in bitsandbytes.optim, not transformers
optimizer = bnb.optim.Adam8bit(model.parameters(), lr=2e-5)

# Clip gradient norms before each step if drift or NaNs appear
```

---

## Ops checklist

* Always run an FP16 vs bnb ΔS/λ regression before production
* Verify VRAM usage with `torch.cuda.memory_allocated()`
* Track entropy growth vs sequence length
* Clamp gradient norms at the optimizer if instability appears

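For the last item, a real bnb run would call `torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm)` just before `optimizer.step()`. The flat-list sketch below only shows the math behind that clamp:

```python
import math

def clip_grad_norm(grads, max_norm=1.0):
    # Scale all gradients so the global L2 norm does not exceed max_norm
    total_norm = math.sqrt(sum(g * g for g in grads))
    if total_norm > max_norm:
        scale = max_norm / total_norm
        grads = [g * scale for g in grads]
    return grads, total_norm

# A 3-4-5 example: pre-clip norm is 5.0, so every gradient is scaled by 0.2
clipped, pre_norm = clip_grad_norm([3.0, 4.0], max_norm=1.0)
```

Logging `pre_norm` per step also gives you the instability signal itself: a sudden jump in pre-clip norms usually precedes the NaNs.
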
---
### 🔗 Quick-Start Downloads (60 sec)

| Tool | Link | 3-Step Setup |
|---|---|---|
| **WFGY 1.0 PDF** | [Engine Paper](https://github.com/onestardao/WFGY/blob/main/I_am_not_lizardman/WFGY_All_Principles_Return_to_One_v1.0_PSBigBig_Public.pdf) | 1️⃣ Download · 2️⃣ Upload to your LLM · 3️⃣ Ask “Answer using WFGY + \<your question>” |
| **TXT OS (plain-text OS)** | [TXTOS.txt](https://github.com/onestardao/WFGY/blob/main/OS/TXTOS.txt) | 1️⃣ Download · 2️⃣ Paste into any LLM chat · 3️⃣ Type “hello world” — OS boots instantly |

---
### 🧭 Explore More

| Module | Description | Link |
|---|---|---|
| WFGY Core | WFGY 2.0 engine is live: full symbolic reasoning architecture and math stack | [View →](https://github.com/onestardao/WFGY/tree/main/core/README.md) |
| Problem Map 1.0 | Initial 16-mode diagnostic and symbolic fix framework | [View →](https://github.com/onestardao/WFGY/tree/main/ProblemMap/README.md) |
| Problem Map 2.0 | RAG-focused failure tree, modular fixes, and pipelines | [View →](https://github.com/onestardao/WFGY/blob/main/ProblemMap/rag-architecture-and-recovery.md) |
| Semantic Clinic Index | Expanded failure catalog: prompt injection, memory bugs, logic drift | [View →](https://github.com/onestardao/WFGY/blob/main/ProblemMap/SemanticClinicIndex.md) |
| Semantic Blueprint | Layer-based symbolic reasoning & semantic modulations | [View →](https://github.com/onestardao/WFGY/tree/main/SemanticBlueprint/README.md) |
| Benchmark vs GPT-5 | Stress test GPT-5 with full WFGY reasoning suite | [View →](https://github.com/onestardao/WFGY/tree/main/benchmarks/benchmark-vs-gpt5/README.md) |
| 🧙♂️ Starter Village 🏡 | New here? Lost in symbols? Click here and let the wizard guide you through | [Start →](https://github.com/onestardao/WFGY/blob/main/StarterVillage/README.md) |

---
> 👑 **Early Stargazers: [See the Hall of Fame](https://github.com/onestardao/WFGY/tree/main/stargazers)** <img src="https://img.shields.io/github/stars/onestardao/WFGY?style=social" alt="GitHub stars"> ⭐ [WFGY Engine 2.0](https://github.com/onestardao/WFGY/blob/main/core/README.md) is already unlocked. ⭐ Star the repo to help others discover it and unlock more on the [Unlock Board](https://github.com/onestardao/WFGY/blob/main/STAR_UNLOCKS.md).

<div align="center">

[](https://github.com/onestardao/WFGY)

[](https://github.com/onestardao/WFGY/tree/main/OS)

[](https://github.com/onestardao/WFGY/tree/main/OS/BlahBlahBlah)

[](https://github.com/onestardao/WFGY/tree/main/OS/BlotBlotBlot)

[](https://github.com/onestardao/WFGY/tree/main/OS/BlocBlocBloc)

[](https://github.com/onestardao/WFGY/tree/main/OS/BlurBlurBlur)

[](https://github.com/onestardao/WFGY/tree/main/OS/BlowBlowBlow)

</div>