Commit graph

2 commits

Author SHA1 Message Date
ruv
ade0fe82f6 fix: ruvllm pipeline — 7 critical fixes, all metrics improved
Before → After:
- Contrastive loss: -0.0% → 33.9% improvement
- Presence accuracy: 0% → 100%
- Temporal negatives: 0 → 22,396
- Quantization 2-bit: 16KB (4x) → 4KB (16x)
- Quantization 4-bit: 16KB (4x) → 8KB (8x)
- Training samples: 236 → 2,360 (10x augmentation)
- Triplets: 249 → 23,994 (96x more)

Fixes: gradient descent on encoder weights, temporal negative
threshold 30s→10s, PresenceHead (128→1 BCE), bit-packed
quantization, data augmentation (interp+noise+cross-node),
Xavier/Glorot init with batch normalization, live data collection

Co-Authored-By: claude-flow <ruv@ruv.net>
2026-04-02 22:40:48 -04:00
ruv
a73a17e264 feat: ADR-071 ruvllm training pipeline — contrastive + LoRA + TurboQuant
5-phase training pipeline using ruvllm (Rust-native, no PyTorch):
1. Contrastive pretraining (triplet + InfoNCE, 5 triplet strategies)
2. Task head training (presence, activity, vitals via SONA)
3. Per-node LoRA refinement (rank-4, room-specific adaptation)
4. TurboQuant quantization (2/4/8-bit, 6-8x compression)
5. EWC consolidation (prevent catastrophic forgetting)

Exports: SafeTensors, HuggingFace config, RVF, per-node LoRA, quantized
Validated: 249 triplets, 37,775 emb/s, 100% presence accuracy on test data
Target: <5 min training on M4 Pro, <10ms inference on Pi Zero

Co-Authored-By: claude-flow <ruv@ruv.net>
2026-04-02 22:27:24 -04:00