feat: add ADR-042 CHCI protocol, 24 new edge modules, README restructure

- ADR-042: Coherent Human Channel Imaging (non-CSI sensing protocol)
  with DDD domain model (6 bounded contexts)
- 24 new WASM edge modules: medical (5), retail (5), security (5),
  building (5), industrial (5), exotic (8)
- README: plain-language rewrites, moved detail sections below TOC,
  added edge module links to use case tables, firmware release docs
- User guide: firmware release table, edge intelligence documentation
- .gitignore: added rules for wasm, esp32 temp files, NVS binaries
- WASM edge crate: cargo config, integration tests, module registry

Co-Authored-By: claude-flow <ruv@ruv.net>
ruv 2026-03-03 11:35:57 -05:00
parent d63d4d95d1
commit e94c7056f2
76 changed files with 27260 additions and 324 deletions

docs/edge-modules/README.md (new file, 147 lines)
# Edge Intelligence Modules — WiFi-DensePose
> 65 WASM modules that run directly on an ESP32 sensor. No internet needed, no cloud fees, instant response. Each module is a tiny file (5-30 KB) that reads WiFi signal data and makes decisions locally in under 10 ms.
## Quick Start
```bash
# Build all modules for ESP32
cd rust-port/wifi-densepose-rs/crates/wifi-densepose-wasm-edge
cargo build --target wasm32-unknown-unknown --release
# Run all 632 tests
cargo test --features std
# Upload a module to your ESP32
python scripts/wasm_upload.py --port COM7 --module target/wasm32-unknown-unknown/release/module_name.wasm
```
## Module Categories
| Category | Modules | Tests | Documentation |
|----------|---------|-------|---------------|
| **Core** | 7 | 81 | [core.md](core.md) |
| **Medical & Health** | 5 | 38 | [medical.md](medical.md) |
| **Security & Safety** | 6 | 42 | [security.md](security.md) |
| **Smart Building** | 5 | 38 | [building.md](building.md) |
| **Retail & Hospitality** | 5 | 38 | [retail.md](retail.md) |
| **Industrial** | 5 | 38 | [industrial.md](industrial.md) |
| **Exotic & Research** | 10 | ~60 | [exotic.md](exotic.md) |
| **Signal Intelligence** | 6 | 54 | [signal-intelligence.md](signal-intelligence.md) |
| **Adaptive Learning** | 4 | 42 | [adaptive-learning.md](adaptive-learning.md) |
| **Spatial & Temporal** | 6 | 56 | [spatial-temporal.md](spatial-temporal.md) |
| **AI Security** | 2 | 20 | [ai-security.md](ai-security.md) |
| **Quantum & Autonomous** | 4 | 30 | [autonomous.md](autonomous.md) |
| **Total** | **65** | **632** | |
## How It Works
1. **WiFi signals bounce off people and objects** in a room, creating a unique pattern
2. **The ESP32 chip reads these patterns** as Channel State Information (CSI) — 52 numbers that describe how each WiFi channel changed
3. **WASM modules analyze the patterns** to detect specific things: someone fell, a room is occupied, breathing rate changed
4. **Events are emitted locally** — no cloud round-trip, response time under 10 ms
## Architecture
```
WiFi Router ──── radio waves ────→ ESP32-S3 Sensor
                                         │
                                  ┌──────▼───────┐
                                  │   Tier 0-2   │  C firmware: phase unwrap,
                                  │  DSP Engine  │  stats, top-K selection
                                  └──────┬───────┘
                                         │ CSI frame (52 subcarriers)
                                  ┌──────▼───────┐
                                  │    WASM3     │  Tiny interpreter
                                  │   Runtime    │  (60 KB overhead)
                                  └──────┬───────┘
                          ┌──────────────┼──────────────┐
                          ▼              ▼              ▼
                    ┌──────────┐   ┌──────────┐   ┌──────────┐
                    │ Module A │   │ Module B │   │ Module C │
                    │ (5-30KB) │   │ (5-30KB) │   │ (5-30KB) │
                    └────┬─────┘   └────┬─────┘   └────┬─────┘
                         │              │              │
                         └──────────────┼──────────────┘
                                        ▼
                                 Events + Alerts
                          (UDP to aggregator or local)
```
## Host API
Every module talks to the ESP32 through 12 functions:
| Function | Returns | Description |
|----------|---------|-------------|
| `csi_get_phase(i)` | `f32` | WiFi signal phase angle for subcarrier `i` |
| `csi_get_amplitude(i)` | `f32` | Signal strength for subcarrier `i` |
| `csi_get_variance(i)` | `f32` | How much subcarrier `i` fluctuates |
| `csi_get_bpm_breathing()` | `f32` | Breathing rate (BPM) |
| `csi_get_bpm_heartrate()` | `f32` | Heart rate (BPM) |
| `csi_get_presence()` | `i32` | Is anyone there? (0/1) |
| `csi_get_motion_energy()` | `f32` | Overall movement level |
| `csi_get_n_persons()` | `i32` | Estimated number of people |
| `csi_get_timestamp()` | `i32` | Current timestamp (ms) |
| `csi_emit_event(id, val)` | — | Send a detection result to the host |
| `csi_log(ptr, len)` | — | Log a message to serial console |
| `csi_get_phase_history(buf, max)` | `i32` | Past phase values for trend analysis |
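Modules bind these functions as WASM imports; for off-device unit testing the host surface can be mocked. A minimal sketch under the assumption of a trait-based shim (the trait, `MockHost`, and the event ID used here are illustrative, not part of the crate):

```rust
/// Sketch (assumption, not the crate's actual glue): a few of the 12 host
/// functions modeled as a trait so module logic can be unit-tested off-device.
/// On the ESP32 these would map to the imported `csi_*` functions.
pub trait CsiHost {
    fn phase(&self, i: usize) -> f32;            // csi_get_phase
    fn presence(&self) -> i32;                   // csi_get_presence
    fn emit_event(&mut self, id: i32, val: f32); // csi_emit_event
}

pub struct MockHost {
    pub events: Vec<(i32, f32)>,
}

impl CsiHost for MockHost {
    fn phase(&self, i: usize) -> f32 { 0.1 * i as f32 } // canned phase data
    fn presence(&self) -> i32 { 1 }                     // someone is present
    fn emit_event(&mut self, id: i32, val: f32) { self.events.push((id, val)); }
}

/// Toy module tick: emit a hypothetical core event (ID 0) when presence is seen.
pub fn tick(host: &mut impl CsiHost) {
    if host.presence() == 1 {
        let v = host.phase(3);
        host.emit_event(0, v);
    }
}
```

The trait indirection costs nothing on-device if the concrete host is monomorphized, and it lets `cargo test --features std` exercise module logic with canned CSI data.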
## Event ID Registry
| Range | Category | Example Events |
|-------|----------|---------------|
| 0-99 | Core | Gesture detected, coherence score, anomaly |
| 100-199 | Medical | Apnea, bradycardia, tachycardia, seizure |
| 200-299 | Security | Intrusion, perimeter breach, loitering, panic |
| 300-399 | Smart Building | Zone occupied, HVAC, lighting, elevator, meeting |
| 400-499 | Retail | Queue length, dwell zone, customer flow, turnover |
| 500-599 | Industrial | Proximity warning, confined space, vibration |
| 600-699 | Exotic | Sleep stage, emotion, gesture language, rain |
| 700-729 | Signal Intelligence | Attention, coherence gate, compression, recovery |
| 730-759 | Adaptive Learning | Gesture learned, attractor, adaptation, EWC |
| 760-789 | Spatial Reasoning | Influence, HNSW match, spike tracking |
| 790-819 | Temporal Analysis | Pattern, LTL violation, GOAP goal |
| 820-849 | AI Security | Replay attack, injection, jamming, behavior |
| 850-879 | Quantum-Inspired | Entanglement, decoherence, hypothesis |
| 880-899 | Autonomous | Inference, rule fired, mesh reconfigure |
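An aggregator can route events mechanically from these ranges. A small helper sketch (the category strings are shorthand labels, not identifiers from the crate; the ranges are from the table above):

```rust
/// Map an event ID to its category per the registry table.
fn event_category(id: i32) -> &'static str {
    match id {
        0..=99 => "core",
        100..=199 => "medical",
        200..=299 => "security",
        300..=399 => "building",
        400..=499 => "retail",
        500..=599 => "industrial",
        600..=699 => "exotic",
        700..=729 => "signal-intelligence",
        730..=759 => "adaptive-learning",
        760..=789 => "spatial",
        790..=819 => "temporal",
        820..=849 => "ai-security",
        850..=879 => "quantum",
        880..=899 => "autonomous",
        _ => "unknown",
    }
}
```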
## Module Development
### Adding a New Module
1. Create `src/your_module.rs` following the pattern:
```rust
#![cfg_attr(not(feature = "std"), no_std)]

#[cfg(not(feature = "std"))]
use libm::fabsf;

pub struct YourModule { /* fixed-size fields only */ }

impl YourModule {
    pub const fn new() -> Self { /* ... */ }
    pub fn process_frame(&mut self, /* inputs */) -> &[(i32, f32)] { /* ... */ }
}
```
2. Add `pub mod your_module;` to `lib.rs`
3. Add event constants to `event_types` in `lib.rs`
4. Add tests with `#[cfg(test)] mod tests { ... }`
5. Run `cargo test --features std`
### Constraints
- **No heap allocation**: Use fixed-size arrays, not `Vec` or `String`
- **No `std`**: Use `libm` for math functions
- **Budget tiers**: L (<2ms), S (<5ms), H (<10ms) per frame
- **Binary size**: Each module should be 5-30 KB as WASM
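The no-heap constraint in practice: event output goes through a fixed-capacity buffer rather than a `Vec`. An illustrative sketch (the names and the capacity of 8 are assumptions, not the crate's actual types):

```rust
/// Fixed-capacity event buffer a module could return from process_frame.
/// No heap: storage is a compile-time-sized array, excess pushes are dropped.
pub struct EventBuf {
    events: [(i32, f32); 8],
    len: usize,
}

impl EventBuf {
    pub const fn new() -> Self {
        Self { events: [(0, 0.0); 8], len: 0 }
    }
    pub fn push(&mut self, id: i32, val: f32) {
        if self.len < self.events.len() {
            self.events[self.len] = (id, val);
            self.len += 1;
        }
    }
    pub fn as_slice(&self) -> &[(i32, f32)] {
        &self.events[..self.len]
    }
}
```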
## References
- [ADR-039](../adr/ADR-039-esp32-edge-intelligence.md) — Edge processing tiers
- [ADR-040](../adr/ADR-040-wasm-programmable-sensing.md) — WASM runtime design
- [ADR-041](../adr/ADR-041-wasm-module-collection.md) — Full module specification
- [Source code](../../rust-port/wifi-densepose-rs/crates/wifi-densepose-wasm-edge/src/)

# Adaptive Learning Modules -- WiFi-DensePose Edge Intelligence
> On-device machine learning that runs without cloud connectivity. The ESP32 chip teaches itself what "normal" looks like for each environment and adapts over time. No training data needed -- it learns from what it sees.
## Overview
| Module | File | What It Does | Event IDs | Budget |
|--------|------|-------------|-----------|--------|
| DTW Gesture Learn | `lrn_dtw_gesture_learn.rs` | Teaches custom gestures via 3 rehearsals | 730-733 | H (<10ms) |
| Anomaly Attractor | `lrn_anomaly_attractor.rs` | Models room dynamics as a chaotic attractor | 735-738 | S (<5ms) |
| Meta Adapt | `lrn_meta_adapt.rs` | Self-tunes 8 detection thresholds via hill climbing | 740-743 | S (<5ms) |
| EWC Lifelong | `lrn_ewc_lifelong.rs` | Learns new environments without forgetting old ones | 745-748 | L (<2ms) |
## How the Learning Modules Work Together
```
Raw CSI data (from signal intelligence pipeline)
|
v
+-------------------------+ +--------------------------+
| Anomaly Attractor | | DTW Gesture Learn |
| Learn what "normal" | | Users teach custom |
| looks like, detect | | gestures by performing |
| deviations from it | | them 3 times |
+-------------------------+ +--------------------------+
| |
v v
+-------------------------+ +--------------------------+
| EWC Lifelong | | Meta Adapt |
| Learn new rooms/layouts | | Auto-tune thresholds |
| without forgetting | | based on TP/FP feedback |
| old ones | | |
+-------------------------+ +--------------------------+
| |
v v
Persistent on-device knowledge Optimized detection parameters
(survives power cycles via NVS) (fewer false alarms over time)
```
- **Anomaly Attractor** learns the room's "normal" signal dynamics and alerts when something unexpected happens.
- **DTW Gesture Learn** lets users define custom gestures without any programming.
- **EWC Lifelong** ensures the device can move to a new room and learn it without losing knowledge of previous rooms.
- **Meta Adapt** continuously improves detection accuracy by tuning thresholds based on real-world feedback.
---
## Modules
### DTW Gesture Learning (`lrn_dtw_gesture_learn.rs`)
**What it does**: You teach the device custom gestures by performing them 3 times. It remembers up to 16 different gestures. When it recognizes a gesture you taught it, it fires an event with the gesture ID.
**Algorithm**: Dynamic Time Warping (DTW) with 3-rehearsal enrollment protocol.
DTW measures the similarity between two temporal sequences that may vary in speed. Unlike simple correlation, DTW can match a gesture performed slowly against one performed quickly. The Sakoe-Chiba band (width=8) constrains the warping path to prevent pathological matches.
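A compact sketch of banded DTW as described (illustrative: this off-device version uses a `Vec` cost matrix for clarity, whereas the on-device module would use fixed-size arrays):

```rust
/// DTW distance with a Sakoe-Chiba band: cells more than `band` steps
/// off the diagonal are never filled, preventing pathological warps.
fn dtw_banded(a: &[f32], b: &[f32], band: usize) -> f32 {
    let (n, m) = (a.len(), b.len());
    let inf = f32::INFINITY;
    let mut d = vec![vec![inf; m + 1]; n + 1];
    d[0][0] = 0.0;
    for i in 1..=n {
        let lo = i.saturating_sub(band).max(1);
        let hi = (i + band).min(m);
        for j in lo..=hi {
            let cost = (a[i - 1] - b[j - 1]).abs();
            // extend the cheapest of the three admissible predecessor paths
            d[i][j] = cost + d[i - 1][j].min(d[i][j - 1]).min(d[i - 1][j - 1]);
        }
    }
    d[n][m]
}
```

Identical sequences score 0; a gesture stretched by an extra sample still aligns cheaply through the off-diagonal path, which is exactly why DTW beats plain correlation for variable-speed gestures.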
#### Learning Protocol
```
State Machine:

  Idle ──(60 frames stillness)──> WaitingStill
   ^                                  |
   |                          (motion detected)
   |                                  v
   |           Recording ──(stillness)──> Captured
   |                                         |
   |                                 (save rehearsal)
   |                                         |
   |               < 3 rehearsals? ── yes ──> WaitingStill
   |                       |
   |                 >= 3 rehearsals
   |                       |
   |             (check DTW similarity)
   |                       |
   +── (all 3 similar?) ──> commit template
   +── (too different?) ──> discard & reset
```
#### Public API
```rust
pub struct GestureLearner { /* ... */ }

impl GestureLearner {
    pub const fn new() -> Self;
    pub fn process_frame(&mut self, phases: &[f32], motion_energy: f32) -> &[(i32, f32)];
    pub fn template_count(&self) -> usize; // Number of stored gesture templates (0-16)
}
```
#### Events
| ID | Name | Value | Meaning |
|----|------|-------|---------|
| 730 | `GESTURE_LEARNED` | Gesture ID (100+) | A new gesture template was successfully committed |
| 731 | `GESTURE_MATCHED` | Gesture ID | A stored gesture was recognized in the current signal |
| 732 | `MATCH_DISTANCE` | DTW distance | How closely the input matched the template (lower = better) |
| 733 | `TEMPLATE_COUNT` | Count (0-16) | Total number of stored templates |
#### Configuration
| Constant | Value | Purpose |
|----------|-------|---------|
| `TEMPLATE_LEN` | 64 | Maximum samples per gesture template |
| `MAX_TEMPLATES` | 16 | Maximum stored gestures |
| `REHEARSALS_REQUIRED` | 3 | Times you must perform a gesture to teach it |
| `STILLNESS_THRESHOLD` | 0.05 | Motion energy below this = stillness |
| `STILLNESS_FRAMES` | 60 | Frames of stillness to enter learning mode (~3s at 20Hz) |
| `LEARN_DTW_THRESHOLD` | 3.0 | Max DTW distance between rehearsals to accept as same gesture |
| `RECOGNIZE_DTW_THRESHOLD` | 2.5 | Max DTW distance for recognition match |
| `MATCH_COOLDOWN` | 40 | Frames between consecutive matches (~2s at 20Hz) |
| `BAND_WIDTH` | 8 | Sakoe-Chiba band width for DTW |
#### Tutorial: Teaching Your ESP32 a Custom Gesture
**Step 1: Enter training mode.**
Stand still for 3 seconds (60 frames at 20 Hz). The device detects sustained stillness and enters `WaitingStill` mode. There is no LED indicator in the base firmware, but you can add one by listening for the state transition.
**Step 2: Perform the gesture.**
Move your hand through the WiFi field. The device records the phase-delta trajectory. The recording captures up to 64 samples (3.2 seconds at 20 Hz). Keep the gesture under 3 seconds.
**Step 3: Return to stillness.**
Stop moving. The device captures the recording as "rehearsal 1 of 3."
**Step 4: Repeat 2 more times.**
The device stays in learning mode. Perform the same gesture two more times, returning to stillness after each.
**Step 5: Automatic validation.**
After the 3rd rehearsal, the device computes pairwise DTW distances between all 3 recordings. If all 3 are mutually similar (DTW distance < 3.0), it averages them into a template and assigns gesture ID 100 (the first custom gesture). Subsequent gestures get IDs 101, 102, etc.
**Step 6: Recognition.**
Once a template is stored, the device continuously matches the incoming phase-delta stream against all stored templates. When a match is found (DTW distance < 2.5), it emits `GESTURE_MATCHED` with the gesture ID and enters a 2-second cooldown to prevent double-firing.
**Tips for reliable gesture recognition:**
- Perform gestures in the same general area of the room
- Make gestures distinct (a wave is easier to distinguish from a circle than from a slower wave)
- Avoid ambient motion during training (other people walking, fans)
- Shorter gestures (0.5-1.5 seconds) tend to be more reliable than long ones
---
### Anomaly Attractor (`lrn_anomaly_attractor.rs`)
**What it does**: Models the room's WiFi signal as a dynamical system and classifies its behavior. An empty room produces a "point attractor" (stable signal). A room with HVAC produces a "limit cycle" (periodic). A room with people produces a "strange attractor" (complex but bounded). When the signal leaves the learned attractor basin, something unusual is happening.
**Algorithm**: 4D dynamical system analysis with Lyapunov exponent estimation.
The state vector is: `(mean_phase, mean_amplitude, variance, motion_energy)`
The Lyapunov exponent quantifies trajectory divergence:
```
lambda = (1/N) * sum(log(|delta_n+1| / |delta_n|))
```
- lambda < -0.01: **Point attractor** (stable, empty room)
- -0.01 <= lambda < 0.01: **Limit cycle** (periodic, machinery/HVAC)
- lambda >= 0.01: **Strange attractor** (chaotic, occupied room)
After 200 frames of learning (~10 seconds), the attractor type is classified and the basin radius is established. Subsequent departures beyond 3x the basin radius trigger anomaly alerts.
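The estimator follows directly from the formula above. An illustrative sketch (`deltas` holds successive trajectory separations |delta_n|; names are mine, not the module's):

```rust
/// Average log-ratio of consecutive separations: positive = divergence
/// (chaos), near zero = periodic, negative = convergence to a point.
fn lyapunov_estimate(deltas: &[f32]) -> f32 {
    let ratios: Vec<f32> = deltas
        .windows(2)
        .filter(|w| w[0].abs() > 1e-9) // skip degenerate separations
        .map(|w| (w[1].abs() / w[0].abs()).ln())
        .collect();
    if ratios.is_empty() {
        0.0
    } else {
        ratios.iter().sum::<f32>() / ratios.len() as f32
    }
}
```

Halving separations give lambda = ln(0.5) ≈ -0.69 (point attractor); doubling ones give +0.69 (strange attractor), matching the thresholds above.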
#### Public API
```rust
pub struct AttractorDetector { /* ... */ }

impl AttractorDetector {
    pub const fn new() -> Self;
    pub fn process_frame(&mut self, phases: &[f32], amplitudes: &[f32], motion_energy: f32)
        -> &[(i32, f32)];
    pub fn lyapunov_exponent(&self) -> f32;
    pub fn attractor_type(&self) -> AttractorType; // Unknown/PointAttractor/LimitCycle/StrangeAttractor
    pub fn is_initialized(&self) -> bool; // True after 200 learning frames
}

pub enum AttractorType { Unknown, PointAttractor, LimitCycle, StrangeAttractor }
```
#### Events
| ID | Name | Value | Meaning |
|----|------|-------|---------|
| 735 | `ATTRACTOR_TYPE` | 1/2/3 | Point(1), LimitCycle(2), Strange(3) -- emitted when classification changes |
| 736 | `LYAPUNOV_EXPONENT` | Lambda | Current Lyapunov exponent estimate |
| 737 | `BASIN_DEPARTURE` | Distance ratio | Trajectory left the attractor basin (value = distance / radius) |
| 738 | `LEARNING_COMPLETE` | 1.0 | Initial 200-frame learning phase finished |
#### Configuration
| Constant | Value | Purpose |
|----------|-------|---------|
| `TRAJ_LEN` | 128 | Trajectory buffer length (circular) |
| `STATE_DIM` | 4 | State vector dimensionality |
| `MIN_FRAMES_FOR_CLASSIFICATION` | 200 | Learning phase length (~10s at 20Hz) |
| `LYAPUNOV_STABLE_UPPER` | -0.01 | Lambda below this = point attractor |
| `LYAPUNOV_PERIODIC_UPPER` | 0.01 | Lambda below this = limit cycle |
| `BASIN_DEPARTURE_MULT` | 3.0 | Departure threshold (3x learned radius) |
| `CENTER_ALPHA` | 0.01 | EMA alpha for attractor center tracking |
| `DEPARTURE_COOLDOWN` | 100 | Frames between departure alerts (~5s at 20Hz) |
#### Tutorial: Understanding Attractor Types
**Point Attractor (lambda < -0.01)**
The signal converges to a fixed point. This means the environment is completely static -- no people, no machinery, no airflow. The WiFi signal is deterministic and unchanging. Any disturbance will trigger a basin departure.
**Limit Cycle (lambda near 0)**
The signal follows a periodic orbit. This typically indicates mechanical systems: HVAC cycling, fans, elevator machinery. The period usually matches the equipment's duty cycle. Human activity on top of a limit cycle will push the Lyapunov exponent positive.
**Strange Attractor (lambda > 0.01)**
The signal is bounded but aperiodic -- classical chaos. This is the signature of human activity: walking, gesturing, breathing all create complex but bounded signal dynamics. The more people, the higher the Lyapunov exponent tends to be.
**Basin Departure**
A basin departure means the current signal state is more than 3x the learned radius away from the attractor center. This can indicate:
- Someone new entered the room
- A door or window opened
- Equipment turned on/off
- Environmental change (rain, temperature)
---
### Meta Adapt (`lrn_meta_adapt.rs`)
**What it does**: Automatically tunes 8 detection thresholds to reduce false alarms and improve detection accuracy. Uses real-world feedback (true positives and false positives) to drive a simple hill-climbing optimizer.
**Algorithm**: Iterative parameter perturbation with safety rollback.
The optimizer maintains 8 parameters, each with bounds and step sizes:
| Index | Parameter | Default | Range | Step |
|-------|-----------|---------|-------|------|
| 0 | Presence threshold | 0.05 | 0.01-0.50 | 0.01 |
| 1 | Motion threshold | 0.10 | 0.02-1.00 | 0.02 |
| 2 | Coherence threshold | 0.70 | 0.30-0.99 | 0.02 |
| 3 | Gesture DTW threshold | 2.50 | 0.50-5.00 | 0.20 |
| 4 | Anomaly energy ratio | 50.0 | 10.0-200.0 | 5.0 |
| 5 | Zone occupancy threshold | 0.02 | 0.005-0.10 | 0.005 |
| 6 | Vital apnea seconds | 20.0 | 10.0-60.0 | 2.0 |
| 7 | Intrusion sensitivity | 0.30 | 0.05-0.90 | 0.03 |
The optimization loop (runs on timer, not per-frame):
1. Measure baseline performance score: `score = TP_rate - 2 * FP_rate`
2. Perturb one parameter by its step size (alternating +/- direction)
3. Wait for `EVAL_WINDOW` (10) timer ticks
4. Measure new performance score
5. If improved, keep the change. If not, revert.
6. After 3 consecutive failures, safety rollback to the last known-good snapshot.
7. Sweep through all 8 parameters, then increment the meta-level counter.
The 2x penalty on false positives reflects the real-world cost: a false alarm (waking someone up at 3 AM because the system thought it detected motion) is worse than occasionally missing a true event.
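The keep-or-revert mechanic can be sketched as a one-parameter toy (illustrative only: the `score` closure stands in for the measured `TP_rate - 2 * FP_rate`, and the alternating-direction trick mirrors the loop above):

```rust
/// Toy hill climb on a single bounded parameter. Perturb in the current
/// direction; keep the change if the score improved, otherwise flip direction.
fn hill_climb(mut x: f32, step: f32, lo: f32, hi: f32,
              score: impl Fn(f32) -> f32, iters: usize) -> f32 {
    let mut dir = 1.0_f32;
    for _ in 0..iters {
        let cand = (x + dir * step).clamp(lo, hi);
        if score(cand) > score(x) {
            x = cand;      // keep the perturbation
        } else {
            dir = -dir;    // revert direction for the next attempt
        }
    }
    x
}
```

Started at the presence-threshold default of 0.05 against a score peaked at 0.2, the climb converges to the peak and then oscillates within one step of it.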
#### Public API
```rust
pub struct MetaAdapter { /* ... */ }

impl MetaAdapter {
    pub const fn new() -> Self;
    pub fn report_true_positive(&mut self); // Confirmed correct detection
    pub fn report_false_positive(&mut self); // Detection that should not have fired
    pub fn report_event(&mut self); // Generic event for normalization
    pub fn get_param(&self, idx: usize) -> f32; // Current value of parameter idx
    pub fn on_timer(&mut self) -> &[(i32, f32)]; // Drive optimization loop (call at 1 Hz)
    pub fn iteration_count(&self) -> u32;
    pub fn success_count(&self) -> u32;
    pub fn meta_level(&self) -> u16; // Number of complete sweeps
    pub fn consecutive_failures(&self) -> u8;
}
```
#### Events
| ID | Name | Value | Meaning |
|----|------|-------|---------|
| 740 | `PARAM_ADJUSTED` | param_idx + value/1000 | A parameter was successfully tuned |
| 741 | `ADAPTATION_SCORE` | Score [-2, 1] | Performance score after successful adaptation |
| 742 | `ROLLBACK_TRIGGERED` | Meta level | Safety rollback: 3 consecutive failures, reverting all params |
| 743 | `META_LEVEL` | Level | Number of complete optimization sweeps completed |
#### Configuration
| Constant | Value | Purpose |
|----------|-------|---------|
| `NUM_PARAMS` | 8 | Number of tunable parameters |
| `MAX_CONSECUTIVE_FAILURES` | 3 | Failures before safety rollback |
| `EVAL_WINDOW` | 10 | Timer ticks per evaluation phase |
| `DEFAULT_STEP_FRAC` | 0.05 | Step size as fraction of range |
#### Tutorial: Providing Feedback to Meta Adapt
The meta adapter needs feedback to know whether its changes helped. In a typical deployment:
1. **True positives**: When an event (presence detection, gesture match) is confirmed correct by another sensor or user acknowledgment, call `report_true_positive()`.
2. **False positives**: When an event fires but nothing actually happened (e.g., presence detected in an empty room), call `report_false_positive()`.
3. **Generic events**: Call `report_event()` for all events, regardless of correctness, to normalize the score.
In autonomous operation without human feedback, you can use cross-validation between modules: if both the coherence gate and the anomaly attractor agree that something happened, treat it as a true positive. If only one fires, it might be a false positive.
---
### EWC Lifelong (`lrn_ewc_lifelong.rs`)
**What it does**: Learns to classify which zone a person is in (up to 4 zones) using WiFi signal features. Critically, when moved to a new environment, it learns the new layout without forgetting previously learned ones. This is the "lifelong learning" property enabled by Elastic Weight Consolidation.
**Algorithm**: EWC (Kirkpatrick et al., 2017) on an 8-input, 4-output linear classifier.
The classifier has 32 learnable parameters (8 inputs x 4 outputs). Training uses gradient descent with an EWC penalty term:
```
L_total = L_current + (lambda/2) * sum_i(F_i * (theta_i - theta_i*)^2)
```
- `L_current` = MSE between predicted zone and one-hot target
- `F_i` = Fisher Information diagonal (how important each parameter is for previous tasks)
- `theta_i*` = parameter values at the end of the previous task
- `lambda` = 1000 (strong regularization to prevent forgetting)
Gradients are estimated via finite differences (perturb each parameter by epsilon=0.01, measure loss change). Only 4 parameters are updated per frame (round-robin) to stay within the 2ms budget.
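The penalty term maps directly to code. A slice-based sketch (illustrative; the module holds these as fixed `[f32; 32]` arrays):

```rust
/// EWC penalty: (lambda/2) * sum_i F_i * (theta_i - theta_i*)^2.
/// Large Fisher values pin the corresponding parameters near their
/// previous-task values; unimportant parameters stay free to move.
fn ewc_penalty(theta: &[f32], theta_star: &[f32], fisher: &[f32], lambda: f32) -> f32 {
    let s: f32 = theta
        .iter()
        .zip(theta_star)
        .zip(fisher)
        .map(|((t, ts), f)| f * (t - ts) * (t - ts))
        .sum();
    0.5 * lambda * s
}
```

With lambda = 1000, even a 0.1 drift on a fully-important parameter (F = 1) costs 5.0 in loss, which dominates the MSE term and explains why old-task knowledge survives.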
#### Task Boundary Detection
A "task" corresponds to a stable environment (room layout). Task boundaries are detected automatically:
1. Track consecutive frames where loss < 0.1
2. After 100 consecutive stable frames, commit the task:
- Snapshot parameters as `theta_star`
- Update Fisher diagonal from accumulated gradient squares
- Reset stability counter
Up to 32 tasks can be learned before the Fisher memory saturates.
#### Public API
```rust
pub struct EwcLifelong { /* ... */ }

impl EwcLifelong {
    pub const fn new() -> Self;
    pub fn process_frame(&mut self, features: &[f32], target_zone: i32) -> &[(i32, f32)];
    pub fn predict(&self, features: &[f32]) -> u8; // Inference only (zone 0-3)
    pub fn parameters(&self) -> &[f32; 32]; // Current model weights
    pub fn fisher_diagonal(&self) -> &[f32; 32]; // Parameter importance
    pub fn task_count(&self) -> u8; // Completed tasks
    pub fn last_loss(&self) -> f32; // Last total loss
    pub fn last_penalty(&self) -> f32; // Last EWC penalty
    pub fn frame_count(&self) -> u32;
    pub fn has_prior_task(&self) -> bool;
    pub fn reset(&mut self);
}
```
Note: `target_zone = -1` means inference only (no gradient update).
#### Events
| ID | Name | Value | Meaning |
|----|------|-------|---------|
| 745 | `KNOWLEDGE_RETAINED` | Penalty | EWC penalty magnitude (lower = less forgetting, emitted every 20 frames) |
| 746 | `NEW_TASK_LEARNED` | Task count | A new task was committed (environment successfully learned) |
| 747 | `FISHER_UPDATE` | Mean Fisher | Average Fisher information across all parameters |
| 748 | `FORGETTING_RISK` | Ratio | Ratio of EWC penalty to current loss (high = risk of forgetting) |
#### Configuration
| Constant | Value | Purpose |
|----------|-------|---------|
| `N_PARAMS` | 32 | Total learnable parameters (8x4) |
| `N_INPUT` | 8 | Input features (subcarrier group means) |
| `N_OUTPUT` | 4 | Output zones |
| `LAMBDA` | 1000.0 | EWC regularization strength |
| `EPSILON` | 0.01 | Finite-difference perturbation size |
| `PARAMS_PER_FRAME` | 4 | Round-robin gradient updates per frame |
| `LEARNING_RATE` | 0.001 | Gradient descent step size |
| `STABLE_FRAMES_THRESHOLD` | 100 | Consecutive stable frames to trigger task boundary |
| `STABLE_LOSS_THRESHOLD` | 0.1 | Loss below this = "stable" frame |
| `FISHER_ALPHA` | 0.01 | EMA alpha for Fisher diagonal updates |
| `MAX_TASKS` | 32 | Maximum tasks before Fisher saturates |
#### Tutorial: How Lifelong Learning Works on a Microcontroller
**The Problem**: Traditional neural networks suffer from "catastrophic forgetting." If you train a network on Room A and then train it on Room B, it forgets everything about Room A. This is a fundamental limitation, not a bug.
**The EWC Solution**: Before learning Room B, the system measures which parameters were important for Room A (via the Fisher Information diagonal). Then, while learning Room B, it adds a penalty that prevents important-for-Room-A parameters from changing too much. The result: the network learns Room B while retaining Room A knowledge.
**On the ESP32**: The classifier is intentionally tiny (32 parameters) to keep computation within 2ms per frame. Despite its simplicity, a linear classifier over 8 subcarrier group features can reliably distinguish 4 spatial zones. The Fisher diagonal only requires 32 floats (128 bytes) per task. With 32 tasks maximum, total Fisher memory is ~4 KB.
**Monitoring forgetting risk**: The `FORGETTING_RISK` event (ID 748) reports the ratio of EWC penalty to current loss. If this ratio exceeds 1.0, the EWC constraint is dominating the learning signal, meaning the system is struggling to learn the new task without forgetting old ones. This can happen when:
- The new environment is very different from all previous ones
- The 32-parameter model capacity is exhausted
- The Fisher diagonal has saturated from too many tasks
---
## How Learning Works on a Microcontroller
ESP32-S3 constraints that shape the design of all adaptive learning modules:
### No GPU
All computation is done on the CPU (Xtensa LX7 dual-core at 240 MHz) via the WASM3 interpreter. This means:
- No matrix multiplication hardware
- No parallel SIMD operations
- Every floating-point operation counts
### Fixed Memory
WASM3 allocates a fixed linear memory region. There is no heap, no `malloc`, no dynamic allocation:
- All arrays are fixed-size and stack-allocated
- Maximum data structure sizes are compile-time constants
- Buffer overflows are impossible (Rust's bounds checking + fixed arrays)
### EWC for Preventing Forgetting
Without EWC, moving the device to a new room would erase everything learned about the previous room. EWC adds ~32 floats of overhead per task (the Fisher diagonal snapshot), which is negligible on the ESP32.
### Round-Robin Gradient Estimation
Computing gradients for all 32 parameters every frame would take too long. Instead, the EWC module uses round-robin scheduling: 4 parameters per frame, cycling through all 32 in 8 frames. At 20 Hz, a full gradient pass takes 0.4 seconds -- fast enough for the slow dynamics of room occupancy.
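The schedule itself is trivial to sketch (illustrative, assuming 4 contiguous parameters per frame over 32 total):

```rust
/// Round-robin parameter schedule: which 4 of the 32 parameters get a
/// finite-difference gradient update on this frame. A full pass takes
/// 8 frames (0.4 s at 20 Hz).
fn params_for_frame(frame: u32) -> [usize; 4] {
    let base = (frame as usize * 4) % 32;
    [base, base + 1, base + 2, base + 3]
}
```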
### Task Boundary Detection
The system automatically detects when it has "converged" on a new environment (100 consecutive stable frames = 5 seconds of consistent low loss). No manual intervention needed. The user just places the device in a new room, and the learning happens automatically.
### Energy Budget
| Module | Budget | Per-Frame Operations | Memory |
|--------|--------|---------------------|--------|
| DTW Gesture Learn | H (<10ms) | DTW: 64x64=4096 mults per template, up to 16 templates | ~18 KB (templates + rehearsals) |
| Anomaly Attractor | S (<5ms) | 4D distance + log for Lyapunov + EMA | ~2.5 KB (128 trajectory points) |
| Meta Adapt | S (<5ms) | Score computation + perturbation (timer only, not per-frame) | ~256 bytes |
| EWC Lifelong | L (<2ms) | 4 finite-difference evals + gradient step | ~512 bytes (params + Fisher + theta_star) |
Total static memory for all 4 learning modules: approximately 21 KB.

# AI Security Modules -- WiFi-DensePose Edge Intelligence
> Tamper detection and behavioral anomaly profiling that protect the sensing system from manipulation. These modules detect replay attacks, signal injection, jamming, and unusual behavior patterns -- all running on-device with no cloud dependency.
## Overview
| Module | File | What It Does | Event IDs | Budget |
|--------|------|--------------|-----------|--------|
| Signal Shield | `ais_prompt_shield.rs` | Detects replay, injection, and jamming attacks on CSI data | 820-823 | S (<5 ms) |
| Behavioral Profiler | `ais_behavioral_profiler.rs` | Learns normal behavior and detects anomalous deviations | 825-828 | S (<5 ms) |
---
## Signal Shield (`ais_prompt_shield.rs`)
**What it does**: Detects three types of attack on the WiFi sensing system:
1. **Replay attacks**: An adversary records legitimate CSI frames and plays them back to fool the sensor into seeing a "normal" scene while actually present in the room.
2. **Signal injection**: An adversary transmits a strong WiFi signal to overpower the legitimate CSI, creating amplitude spikes across many subcarriers.
3. **Jamming**: An adversary floods the WiFi channel with noise, degrading the signal-to-noise ratio below usable levels.
**How it works**:
- **Replay detection**: Each frame's features (mean phase, mean amplitude, amplitude variance) are quantized and hashed using FNV-1a. The hash is stored in a 64-entry ring buffer. If a new frame's hash matches any recent hash, it flags a replay.
- **Injection detection**: If more than 25% of subcarriers show a >10x amplitude jump from the previous frame, it flags injection.
- **Jamming detection**: The module calibrates a baseline SNR (signal / sqrt(variance)) over the first 100 frames. If the current SNR drops below 10% of baseline for 5+ consecutive frames, it flags jamming.
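The injection rule can be sketched as a pure function (illustrative; the module's actual per-frame state handling, calibration, and cooldown are more involved):

```rust
/// Fraction of subcarriers whose amplitude jumped >10x since the previous
/// frame. Values above 0.25 (INJECTION_FRAC) flag an injection attack.
fn injection_fraction(prev: &[f32], cur: &[f32]) -> f32 {
    let spikes = prev
        .iter()
        .zip(cur.iter())
        .filter(|(p, c)| **p > 1e-6 && **c / **p > 10.0)
        .count();
    spikes as f32 / prev.len().max(1) as f32
}
```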
#### Public API
```rust
use wifi_densepose_wasm_edge::ais_prompt_shield::PromptShield;
let mut shield = PromptShield::new(); // const fn, zero-alloc
let events = shield.process_frame(&phases, &amplitudes); // per-frame analysis
let calibrated = shield.is_calibrated(); // true after 100 frames
let frames = shield.frame_count(); // total frames processed
```
#### Events
| Event ID | Constant | Value | Frequency |
|----------|----------|-------|-----------|
| 820 | `EVENT_REPLAY_ATTACK` | 1.0 (detected) | On detection (cooldown: 40 frames) |
| 821 | `EVENT_INJECTION_DETECTED` | Fraction of subcarriers with spikes [0.25, 1.0] | On detection (cooldown: 40 frames) |
| 822 | `EVENT_JAMMING_DETECTED` | SNR drop in dB (10 * log10(baseline/current)) | On detection (cooldown: 40 frames) |
| 823 | `EVENT_SIGNAL_INTEGRITY` | Composite integrity score [0.0, 1.0] | Every 20 frames |
#### Configuration Constants
| Constant | Value | Purpose |
|----------|-------|---------|
| `MAX_SC` | 32 | Maximum subcarriers processed |
| `HASH_RING` | 64 | Size of replay detection hash ring buffer |
| `INJECTION_FACTOR` | 10.0 | Amplitude jump threshold (10x previous) |
| `INJECTION_FRAC` | 0.25 | Minimum fraction of subcarriers with spikes |
| `JAMMING_SNR_FRAC` | 0.10 | SNR must drop below 10% of baseline |
| `JAMMING_CONSEC` | 5 | Consecutive low-SNR frames required |
| `BASELINE_FRAMES` | 100 | Calibration period length |
| `COOLDOWN` | 40 | Frames between repeated alerts (2 seconds at 20 Hz) |
#### Signal Integrity Score
The composite score (event 823) is emitted every 20 frames and ranges from 0.0 (compromised) to 1.0 (clean):
| Factor | Score Reduction | Condition |
|--------|-----------------|-----------|
| Replay detected | -0.4 | Frame hash matches ring buffer |
| Injection detected | up to -0.3 | Proportional to injection fraction |
| SNR degradation | up to -0.3 | Proportional to SNR drop below baseline |
#### FNV-1a Hash Details
The hash function quantizes three frame statistics to integer precision before hashing:
```
hash = FNV_OFFSET (2166136261)
for each of [mean_phase*100, mean_amp*100, amp_variance*100]:
for each byte in value.to_le_bytes():
hash ^= byte
hash = hash.wrapping_mul(FNV_PRIME) // FNV_PRIME = 16777619
```
This means two frames must have nearly identical statistical profiles (within 1% quantization) to trigger a replay alert.
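A runnable sketch of the quantize-and-hash step, following the pseudocode above (the function name is mine; the constants are the standard 32-bit FNV-1a parameters):

```rust
/// FNV-1a over three frame statistics quantized to 1% precision.
/// Frames within the same quantization cell hash identically, which is
/// what makes replayed frames collide with the stored ring-buffer entry.
fn frame_hash(mean_phase: f32, mean_amp: f32, amp_var: f32) -> u32 {
    const FNV_OFFSET: u32 = 2166136261;
    const FNV_PRIME: u32 = 16777619;
    let mut hash = FNV_OFFSET;
    for v in [mean_phase, mean_amp, amp_var] {
        let q = (v * 100.0) as i32; // quantize to integer precision
        for byte in q.to_le_bytes() {
            hash ^= byte as u32;
            hash = hash.wrapping_mul(FNV_PRIME);
        }
    }
    hash
}
```

Two readings of the same scene that differ by less than the 1% quantum produce the same hash; a genuinely different scene does not.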
#### Example: Detecting a Replay Attack
```
Calibration (frames 1-100):
Normal CSI with varying phases -> baseline SNR established
No alerts emitted during calibration
Frame 150: Normal operation
phases = [0.31, 0.28, ...], amps = [1.02, 0.98, ...]
hash = 0xA7F3B21C -> stored in ring buffer
No alerts
Frame 200: Attacker replays frame 150 exactly
phases = [0.31, 0.28, ...], amps = [1.02, 0.98, ...]
hash = 0xA7F3B21C -> MATCH found in ring buffer!
-> EVENT_REPLAY_ATTACK = 1.0
-> EVENT_SIGNAL_INTEGRITY = 0.6 (reduced by 0.4)
```
#### Example: Detecting Signal Injection
```
Frame 300: Normal amplitudes
amps = [1.0, 1.1, 0.9, 1.0, ...]
Frame 301: Adversary injects strong signal
amps = [15.0, 12.0, 14.0, 13.0, ...] (>10x jump on all subcarriers)
injection_fraction = 1.0 (100% of subcarriers spiked)
-> EVENT_INJECTION_DETECTED = 1.0
-> EVENT_SIGNAL_INTEGRITY = 0.4
   (per the score-reduction table, an injection fraction of 1.0 applies the full -0.3 penalty)
```
---
## Behavioral Profiler (`ais_behavioral_profiler.rs`)
**What it does**: Learns what "normal" behavior looks like over time, then detects anomalous deviations. It builds a 6-dimensional behavioral profile using online statistics (Welford's algorithm) and flags when new observations deviate significantly from the learned baseline.
**How it works**: Every 200 frames, the module computes a 6D feature vector from the observation window. During the learning phase (first 1000 frames), it trains Welford accumulators for each dimension. After maturity, it computes per-dimension Z-scores and a combined RMS Z-score. If the combined score exceeds 3.0, an anomaly is reported.
#### The 6 Behavioral Dimensions
| # | Dimension | Description | Typical Range |
|---|-----------|-------------|---------------|
| 0 | Presence Rate | Fraction of frames with presence | [0, 1] |
| 1 | Average Motion | Mean motion energy in window | [0, ~5] |
| 2 | Average Persons | Mean person count | [0, ~4] |
| 3 | Activity Variance | Variance of motion energy | [0, ~10] |
| 4 | Transition Rate | Presence state changes per frame | [0, 0.5] |
| 5 | Dwell Time | Average consecutive presence run length | [0, 200] |
#### Public API
```rust
use wifi_densepose_wasm_edge::ais_behavioral_profiler::BehavioralProfiler;
let mut bp = BehavioralProfiler::new(); // const fn
let events = bp.process_frame(present, motion, n_persons); // per-frame
let mature = bp.is_mature(); // true after learning
let anomalies = bp.total_anomalies(); // cumulative count
let mean = bp.dim_mean(0); // mean of dimension 0
let var = bp.dim_variance(1); // variance of dim 1
```
#### Events
| Event ID | Constant | Value | Frequency |
|----------|----------|-------|-----------|
| 825 | `EVENT_BEHAVIOR_ANOMALY` | Combined Z-score (RMS, > 3.0) | On detection (cooldown: 100 frames) |
| 826 | `EVENT_PROFILE_DEVIATION` | Index of most deviant dimension (0-5) | Paired with anomaly |
| 827 | `EVENT_NOVEL_PATTERN` | Count of dimensions with Z > 2.0 | When 3+ dimensions deviate |
| 828 | `EVENT_PROFILE_MATURITY` | Days since sensor start | On maturity + periodically |
#### Configuration Constants
| Constant | Value | Purpose |
|----------|-------|---------|
| `N_DIM` | 6 | Behavioral dimensions |
| `LEARNING_FRAMES` | 1000 | Frames before profiler matures |
| `ANOMALY_Z` | 3.0 | Combined Z-score threshold for anomaly |
| `NOVEL_Z` | 2.0 | Per-dimension Z-score threshold for novelty |
| `NOVEL_MIN` | 3 | Minimum deviating dimensions for NOVEL_PATTERN |
| `OBS_WIN` | 200 | Observation window size (frames) |
| `COOLDOWN` | 100 | Frames between repeated anomaly alerts |
| `MATURITY_INTERVAL` | 72000 | Frames between maturity reports (1 hour at 20 Hz) |
#### Welford's Online Algorithm
Each dimension maintains running statistics without storing all past values:
```
On each new observation x:
count += 1
delta = x - mean
mean += delta / count
m2 += delta * (x - mean)
Variance = m2 / count
Z-score = |x - mean| / sqrt(variance)
```
This is numerically stable and requires only 12 bytes per dimension (count + mean + m2).
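A minimal sketch of one such accumulator, following the update rule above (the struct and method names are illustrative, not the module's actual types):

```rust
/// One Welford accumulator: 12 bytes (count + mean + m2), no history.
#[derive(Clone, Copy, Default)]
struct Welford {
    count: u32,
    mean: f32,
    m2: f32,
}

impl Welford {
    fn update(&mut self, x: f32) {
        self.count += 1;
        let delta = x - self.mean;
        self.mean += delta / self.count as f32;
        self.m2 += delta * (x - self.mean); // note: uses the *updated* mean
    }

    fn variance(&self) -> f32 {
        if self.count == 0 { 0.0 } else { self.m2 / self.count as f32 }
    }

    fn z_score(&self, x: f32) -> f32 {
        let sd = self.variance().sqrt();
        if sd <= 1e-6 { 0.0 } else { (x - self.mean).abs() / sd }
    }
}

fn main() {
    let mut w = Welford::default();
    for x in [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0] {
        w.update(x);
    }
    // Population mean 5.0, variance 4.0 for this dataset.
    assert!((w.mean - 5.0).abs() < 1e-5);
    assert!((w.variance() - 4.0).abs() < 1e-4);
    // An observation 3 standard deviations out scores Z = 3.
    assert!((w.z_score(11.0) - 3.0).abs() < 1e-3);
}
```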
#### Example: Detecting an Intruder's Behavioral Signature
```
Learning phase (day 1-2):
Normal pattern: 1 person, present 8am-10pm, moderate motion
Profile matures -> EVENT_PROFILE_MATURITY = 0.58 (days)
Day 3, 3am:
Observation window: presence=1, high motion, 1 person
  Z-scores: presence_rate=2.8, motion=4.9, persons=0.3,
            variance=4.2, transition=1.8, dwell=1.9
  Combined Z = sqrt(mean(z^2)) = 3.07 > 3.0
  -> EVENT_BEHAVIOR_ANOMALY = 3.07
-> EVENT_PROFILE_DEVIATION = 1 (motion dimension most deviant)
-> EVENT_NOVEL_PATTERN = 3 (3 dimensions above Z=2.0)
```
---
## Threat Model
### Attacks These Modules Detect
| Attack | Detection Module | Method | False Positive Rate |
|--------|-----------------|--------|---------------------|
| CSI frame replay | Signal Shield | FNV-1a hash ring matching | Low (1% quantization) |
| Signal injection (e.g., rogue AP) | Signal Shield | >25% subcarriers with >10x amplitude spike | Very low |
| Broadband jamming | Signal Shield | SNR drop below 10% of baseline for 5+ frames | Very low |
| Narrowband jamming | Signal Shield (partial) | May not trigger if < 25% of subcarriers are affected | Medium |
| Behavioral anomaly (intruder at unusual time) | Behavioral Profiler | Combined Z-score > 3.0 across 6 dimensions | Low after maturation |
| Gradual environmental change | Behavioral Profiler | Welford stats adapt, may flag if change is abrupt | Very low |
### Attacks These Modules Cannot Detect
| Attack | Why Not | Recommended Mitigation |
|--------|---------|----------------------|
| Sophisticated replay with slight phase variation | FNV-1a uses 1% quantization; small perturbations change the hash | Add temporal correlation checks (consecutive frame deltas) |
| Man-in-the-middle on the WiFi channel | Modules analyze CSI content, not channel authentication | Use WPA3 encryption + MAC filtering |
| Physical obstruction (blocking line-of-sight) | Looks like a person leaving, not an attack | Cross-reference with PIR sensors |
| Slow amplitude drift (gradual injection) | Below the 10x threshold per frame | Add longer-term amplitude trend monitoring |
| Firmware tampering | Modules run in WASM sandbox, cannot detect host compromise | Secure boot + signed firmware (ADR-032) |
### Deployment Recommendations
1. **Always run both modules together**: Signal Shield catches active attacks, Behavioral Profiler catches passive anomalies.
2. **Allow full calibration**: Signal Shield needs 100 frames (5 seconds) for SNR baseline. Behavioral Profiler needs 1000 frames (~50 seconds) for reliable Z-scores.
3. **Combine with Temporal Logic Guard** (`tmp_temporal_logic_guard.rs`): Its safety invariants catch impossible state combinations (e.g., "fall alert when room is empty") that indicate sensor manipulation.
4. **Connect to the Self-Healing Mesh** (`aut_self_healing_mesh.rs`): If a node in the mesh is being jammed, the mesh can automatically reconfigure around the compromised node.
---
## Memory Layout
| Module | State Size (approx) | Static Event Buffer |
|--------|---------------------|---------------------|
| Signal Shield | ~420 bytes (64 hashes + 32 prev_amps + calibration) | 4 entries |
| Behavioral Profiler | ~2.4 KB (200-entry observation window + 6 Welford stats) | 4 entries |
Both modules use fixed-size arrays and static event buffers. No heap allocation. Fully no_std compliant.
# Quantum-Inspired & Autonomous Modules -- WiFi-DensePose Edge Intelligence
> Advanced algorithms inspired by quantum computing, neuroscience, and AI planning. These modules let the ESP32 make autonomous decisions, heal its own mesh network, interpret high-level scene semantics, and explore room states using quantum-inspired search.
## Quantum-Inspired
| Module | File | What It Does | Event IDs | Budget |
|--------|------|--------------|-----------|--------|
| Quantum Coherence | `qnt_quantum_coherence.rs` | Maps CSI phases onto a Bloch sphere to detect sudden environmental changes | 850-852 | H (<10 ms) |
| Interference Search | `qnt_interference_search.rs` | Grover-inspired multi-hypothesis room state classifier | 855-857 | H (<10 ms) |
---
### Quantum Coherence (`qnt_quantum_coherence.rs`)
**What it does**: Maps each subcarrier's phase onto a point on the quantum Bloch sphere and computes an aggregate coherence metric from the mean Bloch vector magnitude. When all subcarrier phases are aligned, the system is "coherent" (like a quantum pure state). When phases scatter randomly, it is "decoherent" (like a maximally mixed state). Sudden decoherence -- a rapid entropy spike -- indicates an environmental disturbance such as a door opening, a person entering, or furniture being moved.
**Algorithm**: Each subcarrier phase is mapped to a 3D Bloch vector:
- theta = |phase| (polar angle)
- phi = sign(phase) * pi/2 (azimuthal angle)
Since phi is always +/- pi/2, cos(phi) = 0 and sin(phi) = +/- 1. This eliminates 2 trig calls per subcarrier (saving 64+ cosf/sinf calls per frame for 32 subcarriers). The x-component of the mean Bloch vector is always zero.
Von Neumann entropy: S = -p*log(p) - (1-p)*log(1-p) where p = (1 + |bloch|) / 2. S=0 when perfectly coherent (|bloch|=1), S=ln(2) when maximally mixed (|bloch|=0). EMA smoothing with alpha=0.15.
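A hedged sketch of the mapping and entropy calculation described above (the real module's API and smoothing wiring may differ; the function name is illustrative):

```rust
/// Map phases to the mean Bloch vector and compute binary Von Neumann
/// entropy. With phi = +/- pi/2: x = 0, y = sin(theta)*sign, z = cos(theta).
fn bloch_and_entropy(phases: &[f32]) -> ([f32; 3], f32) {
    let n = phases.len() as f32;
    let (mut y, mut z) = (0.0f32, 0.0f32);
    for &p in phases {
        let theta = p.abs(); // polar angle
        let sign = if p >= 0.0 { 1.0 } else { -1.0 }; // encodes phi
        y += theta.sin() * sign;
        z += theta.cos();
    }
    let bloch = [0.0, y / n, z / n];
    let mag = (bloch[1] * bloch[1] + bloch[2] * bloch[2]).sqrt().min(1.0);
    let p = (1.0 + mag) / 2.0;
    // S = 0 when perfectly coherent (|bloch| = 1), ln(2) when mixed.
    let entropy = if p >= 1.0 || p <= 0.0 {
        0.0
    } else {
        -p * p.ln() - (1.0 - p) * (1.0 - p).ln()
    };
    (bloch, entropy)
}

fn main() {
    // Aligned phases -> near-pure state, entropy near zero.
    let (_b, s_stable) = bloch_and_entropy(&[0.1; 8]);
    assert!(s_stable < 0.05);
    // Scattered phases -> decoherence, entropy rises sharply.
    let (_b2, s_scattered) =
        bloch_and_entropy(&[-2.1, 0.8, 1.5, -0.3, 2.9, -1.7, 0.4, -2.6]);
    assert!(s_scattered > s_stable + 0.3);
}
```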
#### Public API
```rust
use wifi_densepose_wasm_edge::qnt_quantum_coherence::QuantumCoherenceMonitor;
let mut mon = QuantumCoherenceMonitor::new(); // const fn
let events = mon.process_frame(&phases); // per-frame
let coh = mon.coherence(); // [0, 1], 1=pure state
let ent = mon.entropy(); // [0, ln(2)]
let norm_ent = mon.normalized_entropy(); // [0, 1]
let bloch = mon.bloch_vector(); // [f32; 3]
let frames = mon.frame_count(); // total frames
```
#### Events
| Event ID | Constant | Value | Frequency |
|----------|----------|-------|-----------|
| 850 | `EVENT_ENTANGLEMENT_ENTROPY` | EMA-smoothed Von Neumann entropy [0, ln(2)] | Every 10 frames |
| 851 | `EVENT_DECOHERENCE_EVENT` | Entropy jump magnitude (> 0.3) | On detection |
| 852 | `EVENT_BLOCH_DRIFT` | Euclidean distance between consecutive Bloch vectors | Every 5 frames |
#### Configuration Constants
| Constant | Value | Purpose |
|----------|-------|---------|
| `MAX_SC` | 32 | Maximum subcarriers |
| `ALPHA` | 0.15 | EMA smoothing factor |
| `DECOHERENCE_THRESHOLD` | 0.3 | Entropy jump threshold |
| `ENTROPY_EMIT_INTERVAL` | 10 | Frames between entropy reports |
| `DRIFT_EMIT_INTERVAL` | 5 | Frames between drift reports |
| `LN2` | 0.693147 | Maximum binary entropy |
#### Example: Door Opening Detection via Decoherence
```
Frames 1-50: Empty room, phases stable at ~0.1 rad
Bloch vector: (0, 0.10, 0.99) -> coherence = 0.995
Entropy ~ 0.005 (near zero, pure state)
Frame 51: Door opens, multipath changes suddenly
Phases scatter: [-2.1, 0.8, 1.5, -0.3, ...]
Bloch vector: (0, 0.12, 0.34) -> coherence = 0.36
Entropy jumps to 0.61
-> EVENT_DECOHERENCE_EVENT = 0.605 (jump magnitude)
-> EVENT_BLOCH_DRIFT = 0.65 (large Bloch vector displacement)
Frames 52-100: New stable multipath
Phases settle at new values
Entropy gradually decays via EMA
No more decoherence events
```
#### Bloch Sphere Intuition
Think of each subcarrier as a compass needle. When the room is stable, all needles point roughly the same direction (high coherence, low entropy). When something changes the WiFi multipath -- a person enters, a door opens, furniture moves -- the needles scatter in different directions (low coherence, high entropy). The Bloch sphere formalism quantifies this in a way that is mathematically precise and computationally cheap.
---
### Interference Search (`qnt_interference_search.rs`)
**What it does**: Maintains 16 amplitude-weighted hypotheses for the current room state (empty, person in zone A/B/C/D, two persons, exercising, sleeping, etc.) and uses a Grover-inspired oracle+diffusion process to converge on the most likely state.
**Algorithm**: Inspired by Grover's quantum search algorithm, adapted for classical computation:
1. **Oracle**: CSI evidence (presence, motion, person count) multiplies hypothesis amplitudes by boost (1.3) or dampen (0.7) factors depending on consistency.
2. **Grover diffusion**: Reflects all amplitudes about their mean (a_i = 2*mean - a_i), concentrating probability mass on oracle-boosted hypotheses. Negative amplitudes are clamped to zero (classical approximation).
3. **Normalization**: Amplitudes are renormalized so sum-of-squares = 1.0 (probability conservation).
After enough iterations, the winner emerges with probability > 0.5 (convergence threshold).
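A toy sketch of one iteration, following the three steps above. The boost/dampen constants match the documented values; the 4-hypothesis vector and function name are illustrative, and the module's exact oracle sign conventions are not specified here, so this only demonstrates the mechanics and the probability-conservation invariant:

```rust
const ORACLE_BOOST: f32 = 1.3;
const ORACLE_DAMPEN: f32 = 0.7;

fn grover_iteration(amps: &mut [f32], supported: &[bool]) {
    // 1. Oracle: scale each amplitude by evidence consistency.
    for (a, &ok) in amps.iter_mut().zip(supported) {
        *a *= if ok { ORACLE_BOOST } else { ORACLE_DAMPEN };
    }
    // 2. Diffusion: reflect about the mean; clamp negatives to zero
    //    (classical approximation of inversion-about-average).
    let mean = amps.iter().sum::<f32>() / amps.len() as f32;
    for a in amps.iter_mut() {
        *a = (2.0 * mean - *a).max(0.0);
    }
    // 3. Renormalize so sum of squared amplitudes = 1.
    let norm = amps.iter().map(|a| a * a).sum::<f32>().sqrt();
    if norm > 0.0 {
        for a in amps.iter_mut() {
            *a /= norm;
        }
    }
}

fn main() {
    // Four hypotheses, uniform start: amplitude 0.5 each (4 * 0.5^2 = 1).
    let mut amps = [0.5f32; 4];
    grover_iteration(&mut amps, &[false, true, false, false]);
    // Probability mass is conserved and amplitudes stay non-negative.
    let total: f32 = amps.iter().map(|a| a * a).sum();
    assert!((total - 1.0).abs() < 1e-4);
    assert!(amps.iter().all(|&a| a >= 0.0));
}
```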
#### The 16 Hypotheses
| Index | Hypothesis | Oracle Evidence |
|-------|-----------|----------------|
| 0 | Empty | presence=0 |
| 1-4 | Person in Zone A/B/C/D | presence=1, 1 person |
| 5 | Two Persons | n_persons=2 |
| 6 | Three Persons | n_persons>=3 |
| 7 | Moving Left | high motion, moving state |
| 8 | Moving Right | high motion, moving state |
| 9 | Sitting | low motion, present |
| 10 | Standing | low motion, present |
| 11 | Falling | high motion (transient) |
| 12 | Exercising | high motion, present |
| 13 | Sleeping | low motion, present |
| 14 | Cooking | moderate motion + moving |
| 15 | Working | low motion, present |
#### Public API
```rust
use wifi_densepose_wasm_edge::qnt_interference_search::{InterferenceSearch, Hypothesis};
let mut search = InterferenceSearch::new(); // const fn, uniform amplitudes
let events = search.process_frame(presence, motion_energy, n_persons);
let winner = search.winner(); // Hypothesis enum
let prob = search.winner_probability(); // [0, 1]
let converged = search.is_converged(); // prob > 0.5
let amp = search.amplitude(Hypothesis::Sleeping); // raw amplitude
let p = search.probability(Hypothesis::Exercising); // amplitude^2
let iters = search.iterations(); // total iterations
search.reset(); // back to uniform
```
#### Events
| Event ID | Constant | Value | Frequency |
|----------|----------|-------|-----------|
| 855 | `EVENT_HYPOTHESIS_WINNER` | Winning hypothesis index (0-15) | Every 10 frames or on change |
| 856 | `EVENT_HYPOTHESIS_AMPLITUDE` | Winning hypothesis probability | Every 20 frames |
| 857 | `EVENT_SEARCH_ITERATIONS` | Total Grover iterations | Every 50 frames |
#### Configuration Constants
| Constant | Value | Purpose |
|----------|-------|---------|
| `N_HYPO` | 16 | Number of room-state hypotheses |
| `CONVERGENCE_PROB` | 0.5 | Threshold for declaring convergence |
| `ORACLE_BOOST` | 1.3 | Amplitude multiplier for supported hypotheses |
| `ORACLE_DAMPEN` | 0.7 | Amplitude multiplier for contradicted hypotheses |
| `MOTION_HIGH_THRESH` | 0.5 | Motion energy threshold for "high motion" |
| `MOTION_LOW_THRESH` | 0.15 | Motion energy threshold for "low motion" |
#### Example: Room State Classification
```
Initial state: All 16 hypotheses at probability 1/16 = 0.0625
Frames 1-30: presence=0, motion=0, n_persons=0
Oracle boosts Empty (index 0), dampens all others
Diffusion concentrates probability mass on Empty
After 30 iterations: P(Empty) = 0.72, P(others) < 0.03
-> EVENT_HYPOTHESIS_WINNER = 0 (Empty)
Frames 31-60: presence=1, motion=0.8, n_persons=1
Oracle boosts Exercising, MovingLeft, MovingRight
Oracle dampens Empty, Sitting, Sleeping
After 30 more iterations: P(Exercising) = 0.45
-> EVENT_HYPOTHESIS_WINNER = 12 (Exercising)
Winner changed -> event emitted immediately
Frames 61-90: presence=1, motion=0.05, n_persons=1
Oracle boosts Sitting, Sleeping, Working, Standing
Oracle dampens Exercising, MovingLeft, MovingRight
-> Convergence shifts to static hypotheses
```
---
## Autonomous Systems
| Module | File | What It Does | Event IDs | Budget |
|--------|------|--------------|-----------|--------|
| Psycho-Symbolic | `aut_psycho_symbolic.rs` | Context-aware inference using forward-chaining symbolic rules | 880-883 | H (<10 ms) |
| Self-Healing Mesh | `aut_self_healing_mesh.rs` | Monitors mesh node health and auto-reconfigures via min-cut analysis | 885-888 | S (<5 ms) |
---
### Psycho-Symbolic Inference (`aut_psycho_symbolic.rs`)
**What it does**: Interprets raw CSI-derived features into high-level semantic conclusions using a knowledge base of 16 forward-chaining rules. Given presence, motion energy, breathing rate, heart rate, person count, coherence, and time of day, it determines conclusions like "person resting", "possible intruder", "medical distress", or "social activity".
**Algorithm**: Forward-chaining rule evaluation. Each rule has 4 condition slots (feature_id, comparison_op, threshold). A rule fires when all non-disabled conditions match. Confidence propagation: the final confidence is the rule's base confidence multiplied by per-condition match-quality scores (how far above/below threshold the feature is, clamped to [0.5, 1.0]). Contradiction detection resolves mutually exclusive conclusions by keeping the higher-confidence one.
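The confidence propagation can be sketched as follows. The [0.5, 1.0] clamp matches the text; the margin-based quality formula itself is an assumption, as the module's exact match-quality function is not documented here:

```rust
/// How decisively a feature clears its threshold, clamped to [0.5, 1.0].
/// The relative-margin formula is an illustrative assumption.
fn match_quality(feature: f32, threshold: f32, greater_than: bool) -> f32 {
    let margin = if greater_than {
        (feature - threshold) / threshold.max(1e-6)
    } else {
        (threshold - feature) / threshold.max(1e-6)
    };
    (0.5 + margin).clamp(0.5, 1.0)
}

/// Final confidence = base confidence scaled by every condition's quality.
fn rule_confidence(base: f32, qualities: &[f32]) -> f32 {
    qualities.iter().fold(base, |c, &q| c * q)
}

fn main() {
    // R4 (Exercise): motion >= 150 met barely vs. decisively.
    let weak = match_quality(151.0, 150.0, true);
    let strong = match_quality(400.0, 150.0, true);
    assert!(weak < strong);
    // Base confidence 0.80 is scaled down by a borderline match,
    // but never below half of base.
    let c = rule_confidence(0.80, &[weak]);
    assert!(c > 0.4 && c <= 0.80);
}
```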
#### The 16 Rules
| Rule | Conclusion | Conditions | Base Confidence |
|------|-----------|------------|----------------|
| R0 | Possible Intruder | Presence + high motion (>=200) + night | 0.80 |
| R1 | Person Resting | Presence + low motion (<30) + breathing 10-22 BPM | 0.90 |
| R2 | Pet or Environment | No presence + motion (>=15) | 0.60 |
| R3 | Social Activity | Multi-person (>=2) + high motion (>=100) | 0.70 |
| R4 | Exercise | 1 person + high motion (>=150) + elevated HR (>=100) | 0.80 |
| R5 | Possible Fall | Presence + sudden stillness (motion<10, prev_motion>=150) | 0.70 |
| R6 | Interference | Low coherence (<0.4) + presence | 0.50 |
| R7 | Sleeping | Presence + very low motion (<5) + night + breathing (>=8) | 0.90 |
| R8 | Cooking Activity | Presence + moderate motion (40-120) + evening | 0.60 |
| R9 | Leaving Home | No presence + previous motion (>=50) + morning | 0.65 |
| R10 | Arriving Home | Presence + motion (>=60) + low prev_motion (<15) + evening | 0.70 |
| R11 | Child Playing | Multi-person (>=2) + very high motion (>=250) + daytime | 0.60 |
| R12 | Working at Desk | 1 person + low motion (<20) + good coherence (>=0.6) + morning | 0.75 |
| R13 | Medical Distress | Presence + very high HR (>=130) + low motion (<15) | 0.85 |
| R14 | Room Empty (Stable) | No presence + no motion (<5) + good coherence (>=0.6) | 0.95 |
| R15 | Crowd Gathering | Many persons (>=4) + high motion (>=120) | 0.70 |
#### Contradiction Pairs
These conclusions are mutually exclusive. When both fire, only the one with higher confidence survives:
| Conclusion A | Conclusion B |
|--------|--------|
| Sleeping | Exercise |
| Sleeping | Social Activity |
| Room Empty (Stable) | Possible Intruder |
| Person Resting | Exercise |
#### Input Features
| Index | Feature | Source | Range |
|-------|---------|--------|-------|
| 0 | Presence | Tier 2 DSP | 0 (absent) or 1 (present) |
| 1 | Motion Energy | Tier 2 DSP | 0 to ~1000 |
| 2 | Breathing BPM | Tier 2 vitals | 0-60 |
| 3 | Heart Rate BPM | Tier 2 vitals | 0-200 |
| 4 | Person Count | Tier 2 occupancy | 0-8 |
| 5 | Coherence | QuantumCoherenceMonitor or upstream | 0-1 |
| 6 | Time Bucket | Host clock | 0=morning, 1=afternoon, 2=evening, 3=night |
| 7 | Previous Motion | Internal (auto-tracked) | 0 to ~1000 |
#### Public API
```rust
use wifi_densepose_wasm_edge::aut_psycho_symbolic::PsychoSymbolicEngine;
let mut engine = PsychoSymbolicEngine::new(); // const fn
engine.set_coherence(0.8); // from upstream module
let events = engine.process_frame(
presence, motion, breathing, heartrate, n_persons, time_bucket
);
let rules = engine.fired_rules(); // u16 bitmap
let count = engine.fired_count(); // number of rules that fired
let prev = engine.prev_conclusion(); // last winning conclusion ID
let contras = engine.contradiction_count(); // total contradictions
engine.reset(); // clear state
```
#### Events
| Event ID | Constant | Value | Frequency |
|----------|----------|-------|-----------|
| 880 | `EVENT_INFERENCE_RESULT` | Conclusion ID (1-16) | When any rule fires |
| 881 | `EVENT_INFERENCE_CONFIDENCE` | Confidence [0, 1] of the winning conclusion | Paired with result |
| 882 | `EVENT_RULE_FIRED` | Rule index (0-15) | For each rule that fired |
| 883 | `EVENT_CONTRADICTION` | Encoded pair: conclusion_a * 100 + conclusion_b | On contradiction |
#### Example: Fall Detection Sequence
```
Frame 1: Person walking briskly
Features: presence=1, motion=200, breathing=20, HR=90, persons=1, time=1
R4 (Exercise) fires: confidence = 0.80 * 0.75 = 0.60
-> EVENT_INFERENCE_RESULT = 5 (Exercise)
-> EVENT_INFERENCE_CONFIDENCE = 0.60
Frame 2: Sudden stillness (prev_motion=200, current motion=3)
R5 (Possible Fall) fires: confidence = 0.70 * 0.85 = 0.595
R1 (Person Resting) also fires: confidence = 0.90 * 0.50 = 0.45
No contradiction between these two
-> EVENT_RULE_FIRED = 5 (Fall rule)
-> EVENT_RULE_FIRED = 1 (Resting rule)
-> EVENT_INFERENCE_RESULT = 6 (Possible Fall, highest confidence)
-> EVENT_INFERENCE_CONFIDENCE = 0.595
```
---
### Self-Healing Mesh (`aut_self_healing_mesh.rs`)
**What it does**: Monitors the health of an 8-node sensor mesh and automatically detects when the network topology becomes fragile. Uses the Stoer-Wagner minimum graph cut algorithm to find the weakest link in the mesh. When the min-cut value drops below a threshold, it identifies the degraded node and triggers a reconfiguration event.
**Algorithm**: Stoer-Wagner min-cut on a weighted graph of up to 8 nodes. Edge weights are the minimum quality score of the two endpoints (min(q_i, q_j)). Quality scores are EMA-smoothed (alpha=0.15) per-node CSI coherence values. O(n^3) complexity, which is only 512 operations for n=8. State machine transitions between healthy and healing modes.
#### Public API
```rust
use wifi_densepose_wasm_edge::aut_self_healing_mesh::SelfHealingMesh;
let mut mesh = SelfHealingMesh::new(); // const fn
mesh.update_node_quality(0, coherence); // update single node
let events = mesh.process_frame(&node_qualities); // process all nodes
let q = mesh.node_quality(2); // EMA quality for node 2
let n = mesh.active_nodes(); // count
let mc = mesh.prev_mincut(); // last min-cut value
let healing = mesh.is_healing(); // fragile state?
let weak = mesh.weakest_node(); // node ID or 0xFF
mesh.reset(); // clear state
```
#### Events
| Event ID | Constant | Value | Frequency |
|----------|----------|-------|-----------|
| 885 | `EVENT_NODE_DEGRADED` | Index of the degraded node (0-7) | When min-cut < 0.3 |
| 886 | `EVENT_MESH_RECONFIGURE` | Min-cut value (measure of fragility) | Paired with degraded |
| 887 | `EVENT_COVERAGE_SCORE` | Mean quality across all active nodes [0, 1] | Every frame |
| 888 | `EVENT_HEALING_COMPLETE` | Min-cut value (now healthy) | When min-cut recovers >= 0.6 |
#### Configuration Constants
| Constant | Value | Purpose |
|----------|-------|---------|
| `MAX_NODES` | 8 | Maximum mesh nodes |
| `QUALITY_ALPHA` | 0.15 | EMA smoothing for node quality |
| `MINCUT_FRAGILE` | 0.3 | Below this, mesh is considered fragile |
| `MINCUT_HEALTHY` | 0.6 | Above this, healing is considered complete |
#### State Machine
```
              mincut < 0.3
  [Healthy] --------------------> [Healing]
      ^                               |
      |         mincut >= 0.6         |
      +-------------------------------+
```
#### Stoer-Wagner Min-Cut Details
The algorithm finds the minimum weight of edges that, if removed, would disconnect the graph into two components. For an 8-node mesh:
1. Start with the full weighted adjacency matrix
2. For each phase (n-1 phases total):
- Grow a set A by repeatedly adding the node with the highest total edge weight to A
- The last two nodes added (prev, last) define a "cut of the phase" = weight to last
- Track the global minimum cut across all phases
- Merge the last two nodes (combine their edge weights)
3. Return (global_min_cut, node_on_lighter_side)
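The phases above can be sketched directly for a small mesh. This version returns only the min-cut value (the real module also tracks the lighter-side node); edge weights follow the documented min(q_i, q_j) rule, while the chain topology in `main` and all names are illustrative assumptions:

```rust
/// Stoer-Wagner global minimum cut over a symmetric weight matrix.
fn stoer_wagner(mut w: Vec<Vec<f32>>) -> f32 {
    let mut active: Vec<usize> = (0..w.len()).collect();
    let mut best = f32::INFINITY;
    while active.len() > 1 {
        let m = active.len();
        let mut in_a = vec![false; m];
        let mut conn = vec![0.0f32; m]; // connection weight to the set A
        let (mut prev, mut last) = (0, 0);
        for _ in 0..m {
            // Grow A by the most tightly connected remaining node.
            let mut sel = usize::MAX;
            for i in 0..m {
                if !in_a[i] && (sel == usize::MAX || conn[i] > conn[sel]) {
                    sel = i;
                }
            }
            in_a[sel] = true;
            prev = last;
            last = sel;
            for i in 0..m {
                if !in_a[i] {
                    conn[i] += w[active[sel]][active[i]];
                }
            }
        }
        // "Cut of the phase" = total weight from the last-added node.
        best = best.min(conn[last]);
        // Merge `last` into `prev`, combining edge weights.
        let (p, l) = (active[prev], active[last]);
        for i in 0..w.len() {
            if i != p && i != l {
                w[p][i] += w[l][i];
                w[i][p] = w[p][i];
            }
        }
        active.remove(last);
    }
    best
}

/// Chain topology 0-1-2-...: edge weight = min of endpoint qualities.
fn chain_adjacency(q: &[f32]) -> Vec<Vec<f32>> {
    let n = q.len();
    let mut w = vec![vec![0.0f32; n]; n];
    for i in 0..n - 1 {
        let wt = q[i].min(q[i + 1]);
        w[i][i + 1] = wt;
        w[i + 1][i] = wt;
    }
    w
}

fn main() {
    // Healthy chain: weakest link is min(0.85, 0.85, 0.88) = 0.85.
    let healthy = chain_adjacency(&[0.9, 0.85, 0.88, 0.92]);
    assert!((stoer_wagner(healthy) - 0.85).abs() < 1e-6);
    // Node 1 degraded: min-cut collapses to its 0.2 edges -> fragile.
    let degraded = chain_adjacency(&[0.9, 0.2, 0.88, 0.92]);
    assert!((stoer_wagner(degraded) - 0.2).abs() < 1e-6);
}
```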
#### Example: Node Failure and Recovery
```
Frame 1: All 4 nodes healthy
qualities = [0.9, 0.85, 0.88, 0.92]
Coverage = 0.89
Min-cut = 0.85 (well above 0.6)
-> EVENT_COVERAGE_SCORE = 0.89
Frame 50: Node 1 starts degrading
qualities = [0.9, 0.20, 0.88, 0.92]
EMA-smoothed quality[1] drops gradually
Min-cut drops to 0.20 (edge weights use min(q_i, q_j))
Min-cut < 0.3 -> FRAGILE!
-> EVENT_NODE_DEGRADED = 1
-> EVENT_MESH_RECONFIGURE = 0.20
-> Mesh enters healing mode
Host firmware can now:
- Increase node 1's transmit power
- Route traffic around node 1
- Wake up a backup node
- Alert the operator
Frame 100: Node 1 recovers (antenna repositioned)
qualities = [0.9, 0.85, 0.88, 0.92]
Min-cut climbs back to 0.85
Min-cut >= 0.6 -> HEALTHY!
-> EVENT_HEALING_COMPLETE = 0.85
```
---
## How Quantum-Inspired Algorithms Help WiFi Sensing
These modules use quantum computing metaphors -- not because the ESP32 is a quantum computer, but because the mathematical frameworks from quantum mechanics map naturally onto CSI signal analysis:
**Bloch Sphere / Coherence**: WiFi subcarrier phases behave like quantum phases. When multipath is stable, all phases align (pure state). When the environment changes, phases randomize (mixed state). The Von Neumann entropy quantifies this exactly, providing a single scalar "change detector" that is more robust than tracking individual subcarrier phases.
**Grover's Algorithm / Hypothesis Search**: The oracle+diffusion loop is a principled way to combine evidence from multiple noisy sensors. Instead of hard-coding "if motion > 0.5 then exercising", the Grover-inspired search lets multiple hypotheses compete. Evidence gradually amplifies the correct hypothesis while suppressing incorrect ones. This is more robust to noisy CSI data than a single threshold.
**Why not just use classical statistics?** You could. But the quantum-inspired formulations have three practical advantages on embedded hardware:
1. **Fixed memory**: The Bloch vector is always 3 floats. The hypothesis array is always 16 floats. No dynamic allocation needed.
2. **Graceful degradation**: If CSI data is noisy, the Grover search does not crash or give a wrong answer immediately -- it just converges more slowly.
3. **Composability**: The coherence score from the Bloch sphere module feeds directly into the Temporal Logic Guard (rule 3: "no vital signs when coherence < 0.3") and the Psycho-Symbolic engine (feature 5: coherence). This creates a pipeline where quantum-inspired metrics inform classical reasoning.
---
## Memory Layout
| Module | State Size (approx) | Static Event Buffer |
|--------|---------------------|---------------------|
| Quantum Coherence | ~40 bytes (3D Bloch vector + 2 entropy floats + counter) | 3 entries |
| Interference Search | ~80 bytes (16 amplitudes + counters) | 3 entries |
| Psycho-Symbolic | ~24 bytes (bitmap + counters + prev_motion) | 8 entries |
| Self-Healing Mesh | ~360 bytes (8x8 adjacency + 8 qualities + state) | 6 entries |
All modules use fixed-size arrays and static event buffers. No heap allocation. Fully no_std compliant for WASM3 deployment on ESP32-S3.
---
## Cross-Module Integration
These modules are designed to work together in a pipeline:
```
         CSI Frame (Tier 2 DSP)
                  |
                  v
       [Quantum Coherence] --coherence--> [Psycho-Symbolic Engine]
                  |                                  |
                  v                                  v
       [Interference Search]               [Inference Result]
                  |                                  |
                  v                                  v
      [Room State Hypothesis]               [GOAP Planner]
                                                     |
                                                     v
                                       [Module Activate/Deactivate]
                                                     |
                                                     v
                                           [Self-Healing Mesh]
                                                     |
                                                     v
                                         [Reconfiguration Events]
```
The Quantum Coherence monitor feeds its coherence score to:
- **Psycho-Symbolic Engine**: As feature 5 (coherence), enabling rules R3 (interference) and R6 (low coherence)
- **Temporal Logic Guard**: Rule 3 checks "no vital signs when coherence < 0.3"
- **Self-Healing Mesh**: Node quality can be derived from coherence
The GOAP Planner uses inference results to decide which modules to activate (e.g., activate vitals monitoring when a person is present, enter low-power mode when the room is empty).
# Smart Building Modules -- WiFi-DensePose Edge Intelligence
> Make any building smarter using WiFi signals you already have. Know which rooms are occupied, control HVAC and lighting automatically, count elevator passengers, track meeting room usage, and audit energy waste -- all without cameras or badges.
## Overview
| Module | File | What It Does | Event IDs | Frame Budget |
|--------|------|--------------|-----------|--------------|
| HVAC Presence | `bld_hvac_presence.rs` | Presence detection tuned for HVAC energy management | 310-312 | ~0.5 us/frame |
| Lighting Zones | `bld_lighting_zones.rs` | Per-zone lighting control (On/Dim/Off) based on spatial occupancy | 320-322 | ~1 us/frame |
| Elevator Count | `bld_elevator_count.rs` | Occupant counting in elevator cabins (1-12 persons) | 330-333 | ~1.5 us/frame |
| Meeting Room | `bld_meeting_room.rs` | Meeting lifecycle tracking with utilization metrics | 340-343 | ~0.3 us/frame |
| Energy Audit | `bld_energy_audit.rs` | 24x7 hourly occupancy histograms for scheduling optimization | 350-352 | ~0.2 us/frame |
All modules target the ESP32-S3 running WASM3 (ADR-040 Tier 3). They receive pre-processed CSI signals from Tier 2 DSP and emit structured events via `csi_emit_event()`.
---
## Modules
### HVAC Presence Control (`bld_hvac_presence.rs`)
**What it does**: Tells your HVAC system whether a room is occupied, with intentionally asymmetric timing -- fast arrival detection (10 seconds) so cooling/heating starts quickly, and slow departure timeout (5 minutes) to avoid premature shutoff when someone briefly steps out. Also classifies whether the occupant is sedentary (desk work, reading) or active (walking, exercising).
**How it works**: A four-state machine processes presence scores and motion energy each frame:
```
Vacant --> ArrivalPending --> Occupied --> DeparturePending --> Vacant
(10s debounce) (5 min timeout)
```
Motion energy is smoothed with an exponential moving average (alpha=0.1) and classified against a threshold of 0.3 to distinguish sedentary from active behavior.
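The smoothing step can be sketched as a one-line EMA; the constants match the documented values and the function name is illustrative. The point of alpha=0.1 is that a single noisy frame cannot flip the sedentary/active classification:

```rust
const MOTION_ALPHA: f32 = 0.1;
const ACTIVITY_THRESHOLD: f32 = 0.3;

/// Exponential moving average update for motion energy.
fn update_ema(ema: f32, motion: f32) -> f32 {
    MOTION_ALPHA * motion + (1.0 - MOTION_ALPHA) * ema
}

fn main() {
    let mut ema = 0.05; // sedentary desk work
    // A single motion spike shifts the EMA only gradually:
    // 0.1 * 2.0 + 0.9 * 0.05 = 0.245, still below the 0.3 threshold.
    ema = update_ema(ema, 2.0);
    assert!(ema < ACTIVITY_THRESHOLD);
    // Sustained activity does cross the threshold.
    for _ in 0..5 {
        ema = update_ema(ema, 2.0);
    }
    assert!(ema > ACTIVITY_THRESHOLD);
}
```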
#### State Machine
| State | Entry Condition | Exit Condition |
|-------|----------------|----------------|
| `Vacant` | No presence detected | Presence score > 0.5 |
| `ArrivalPending` | Presence detected, debounce counting | 200 consecutive frames with presence -> Occupied; any absence -> Vacant |
| `Occupied` | Arrival debounce completed | First frame without presence -> DeparturePending |
| `DeparturePending` | Presence lost | 6000 frames without presence -> Vacant; any presence -> Occupied |
#### Events
| Event ID | Name | Value | When Emitted |
|----------|------|-------|--------------|
| 310 | `HVAC_OCCUPIED` | 1.0 (occupied) or 0.0 (vacant) | Every 20 frames |
| 311 | `ACTIVITY_LEVEL` | motion EMA in [0.0, 0.99] when sedentary, or 1.0 when active | Every 20 frames |
| 312 | `DEPARTURE_COUNTDOWN` | 0.0-1.0 (fraction of timeout remaining) | Every 20 frames during DeparturePending |
#### API
```rust
use wifi_densepose_wasm_edge::bld_hvac_presence::HvacPresenceDetector;
let mut det = HvacPresenceDetector::new();
// Per-frame processing
let events = det.process_frame(presence_score, motion_energy);
// events: &[(event_type: i32, value: f32)]
// Queries
det.state() // -> HvacState (Vacant|ArrivalPending|Occupied|DeparturePending)
det.is_occupied() // -> bool (true during Occupied or DeparturePending)
det.activity() // -> ActivityLevel (Sedentary|Active)
det.motion_ema() // -> f32 (smoothed motion energy)
```
#### Configuration Constants
| Constant | Value | Description |
|----------|-------|-------------|
| `ARRIVAL_DEBOUNCE` | 200 frames (10s) | Frames of continuous presence before confirming occupancy |
| `DEPARTURE_TIMEOUT` | 6000 frames (5 min) | Frames of continuous absence before declaring vacant |
| `ACTIVITY_THRESHOLD` | 0.3 | Motion EMA above this = Active |
| `MOTION_ALPHA` | 0.1 | EMA smoothing factor for motion energy |
| `PRESENCE_THRESHOLD` | 0.5 | Minimum presence score to consider someone present |
| `EMIT_INTERVAL` | 20 frames (1s) | Event emission interval |
#### Example: BACnet Integration
```python
# Python host reading events from ESP32 UDP packet
if event_id == 310: # HVAC_OCCUPIED
bacnet_write(device_id, "Occupancy", int(value)) # 1=occupied, 0=vacant
elif event_id == 311: # ACTIVITY_LEVEL
if value >= 1.0:
bacnet_write(device_id, "CoolingSetpoint", 72) # Active: cooler
else:
bacnet_write(device_id, "CoolingSetpoint", 76) # Sedentary: warmer
elif event_id == 312: # DEPARTURE_COUNTDOWN
if value < 0.2: # Less than 1 minute remaining
bacnet_write(device_id, "FanMode", "low") # Start reducing
```
---
### Lighting Zone Control (`bld_lighting_zones.rs`)
**What it does**: Manages up to 4 independent lighting zones, automatically transitioning each zone between On (occupied and active), Dim (occupied but sedentary for over 10 minutes), and Off (vacant for over 30 seconds). Uses per-zone variance analysis to determine which areas of the room have people.
**How it works**: Subcarriers are divided into groups (one per zone). Each group's amplitude variance is computed and compared against a calibrated baseline. Variance deviation above threshold indicates occupancy in that zone. A calibration phase (200 frames = 10 seconds) establishes the baseline with an empty room.
```
Off --> On (occupancy + activity detected)
On --> Dim (occupied but sedentary for 10 min)
On --> Dim (vacancy detected, grace period)
Dim --> Off (vacant for 30 seconds)
Dim --> On (activity resumes)
```
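As a rough sketch of the per-zone variance test (the grouping and threshold mirror the constants below; `zone_scores` and the sample data are illustrative, not the module's API):

```python
def zone_scores(amplitudes, baselines):
    """Score each zone by the deviation of its amplitude-group
    variance from the calibrated empty-room baseline."""
    n_zones = len(baselines)
    group = len(amplitudes) // n_zones
    scores = []
    for z in range(n_zones):
        seg = amplitudes[z * group:(z + 1) * group]
        mean = sum(seg) / len(seg)
        var = sum((a - mean) ** 2 for a in seg) / len(seg)
        # Relative deviation from the baseline variance for this zone.
        scores.append(abs(var - baselines[z]) / max(baselines[z], 1e-9))
    return scores

baselines = [0.005] * 4                    # learned during 200-frame calibration
quiet = [1.0, 1.1, 0.9, 1.0] * 4           # empty room: matches baseline
busy = quiet[:]
busy[4:8] = [1.0, 1.5, 0.5, 1.0]           # person moving in zone 1
occupied = [s > 0.03 for s in zone_scores(busy, baselines)]  # OCCUPANCY_THRESHOLD
```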
#### Events
| Event ID | Name | Value | When Emitted |
|----------|------|-------|--------------|
| 320 | `LIGHT_ON` | zone_id (0-3) | On state transition |
| 321 | `LIGHT_DIM` | zone_id (0-3) | Dim state transition |
| 322 | `LIGHT_OFF` | zone_id (0-3) | Off state transition |
Periodic summaries encode `zone_id + confidence` in the value field (integer part = zone, fractional part = occupancy score).
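Decoding that packed value on the host side is a one-liner (the helper name is illustrative):

```python
def decode_zone_summary(value):
    """Integer part = zone id (0-3), fractional part = occupancy score."""
    zone = int(value)
    return zone, value - zone

zone, confidence = decode_zone_summary(2.73)  # zone 2, score ~0.73
```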
#### API
```rust
use wifi_densepose_wasm_edge::bld_lighting_zones::LightingZoneController;
let mut ctrl = LightingZoneController::new();
// Per-frame: pass subcarrier amplitudes and overall motion energy
let events = ctrl.process_frame(&amplitudes, motion_energy);
// Queries
ctrl.zone_state(zone_id) // -> LightState (Off|Dim|On)
ctrl.n_zones() // -> usize (number of active zones, 1-4)
ctrl.is_calibrated() // -> bool
```
#### Configuration Constants
| Constant | Value | Description |
|----------|-------|-------------|
| `MAX_ZONES` | 4 | Maximum lighting zones |
| `OCCUPANCY_THRESHOLD` | 0.03 | Variance deviation ratio for occupancy |
| `ACTIVE_THRESHOLD` | 0.25 | Motion energy for active classification |
| `DIM_TIMEOUT` | 12000 frames (10 min) | Sedentary frames before dimming |
| `OFF_TIMEOUT` | 600 frames (30s) | Vacant frames before turning off |
| `BASELINE_FRAMES` | 200 frames (10s) | Calibration duration |
#### Example: DALI/KNX Lighting
```python
# Map zone events to DALI addresses
DALI_ADDR = {0: 1, 1: 2, 2: 3, 3: 4}
if event_id == 320: # LIGHT_ON
zone = int(value)
dali_send(DALI_ADDR[zone], level=254) # Full brightness
elif event_id == 321: # LIGHT_DIM
zone = int(value)
dali_send(DALI_ADDR[zone], level=80) # 30% brightness
elif event_id == 322: # LIGHT_OFF
zone = int(value)
dali_send(DALI_ADDR[zone], level=0) # Off
```
---
### Elevator Occupancy Counting (`bld_elevator_count.rs`)
**What it does**: Counts the number of people in an elevator cabin (0-12), detects door open/close events, and emits overload warnings when the count exceeds a configurable threshold. Uses the confined-space multipath characteristics of an elevator to correlate amplitude variance with body count.
**How it works**: In a small reflective metal box like an elevator, each additional person adds significant multipath scattering. The module calibrates on the empty cabin, then maps the ratio of current variance to baseline variance onto a person count. Frame-to-frame amplitude deltas detect sudden geometry changes (door open/close). The count estimate fuses the module's own variance-based estimate (40% weight) with the host's person-count hint (60% weight) when one is available.
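The fusion step can be sketched as follows (the variance-ratio-to-count mapping is a placeholder; only the 40/60 weighting comes from the description above):

```python
def fused_count(variance_ratio, host_hint=None, max_occupants=12):
    """Fuse the module's own variance-based estimate (40%) with the
    host's person-count hint (60%) when one is available."""
    # Placeholder mapping: each unit of variance above baseline ~ one person.
    own = min(max(variance_ratio - 1.0, 0.0), float(max_occupants))
    est = own if host_hint is None else 0.4 * own + 0.6 * host_hint
    return min(int(round(est)), max_occupants)

fused_count(4.0)               # variance alone -> 3
fused_count(4.0, host_hint=5)  # 0.4*3 + 0.6*5 = 4.2 -> 4
```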
#### Events
| Event ID | Name | Value | When Emitted |
|----------|------|-------|--------------|
| 330 | `ELEVATOR_COUNT` | Person count (0-12) | Every 10 frames |
| 331 | `DOOR_OPEN` | Current count at time of opening | On door open detection |
| 332 | `DOOR_CLOSE` | Current count at time of closing | On door close detection |
| 333 | `OVERLOAD_WARNING` | Current count | When count >= overload threshold |
#### API
```rust
use wifi_densepose_wasm_edge::bld_elevator_count::ElevatorCounter;
let mut ec = ElevatorCounter::new();
// Per-frame: amplitudes, phases, motion energy, host person count hint
let events = ec.process_frame(&amplitudes, &phases, motion_energy, host_n_persons);
// Queries
ec.occupant_count() // -> u8 (0-12)
ec.door_state() // -> DoorState (Open|Closed)
ec.is_calibrated() // -> bool
// Configuration
ec.set_overload_threshold(8); // Set custom overload limit
```
#### Configuration Constants
| Constant | Value | Description |
|----------|-------|-------------|
| `MAX_OCCUPANTS` | 12 | Maximum tracked occupants |
| `DEFAULT_OVERLOAD` | 10 | Default overload warning threshold |
| `DOOR_VARIANCE_RATIO` | 4.0 | Delta magnitude for door detection |
| `DOOR_DEBOUNCE` | 3 frames | Debounce for door events |
| `DOOR_COOLDOWN` | 40 frames (2s) | Cooldown after door event |
| `BASELINE_FRAMES` | 200 frames (10s) | Calibration with empty cabin |
---
### Meeting Room Tracker (`bld_meeting_room.rs`)
**What it does**: Tracks the full lifecycle of meeting room usage -- from someone entering, to confirming a genuine multi-person meeting, to detecting when the meeting ends and the room is available again. Distinguishes actual meetings (2+ people for more than 3 seconds) from a single person briefly using the room. Tracks peak headcount and calculates room utilization rate.
**How it works**: A four-state machine processes presence and person count:
```
Empty --> PreMeeting --> Active --> PostMeeting --> Empty
(someone (2+ people (everyone left,
entered) confirmed) 2 min cooldown)
```
The PreMeeting state has a 3-minute timeout: if a second person never arrives in that window, the room is not promoted to Active and no meeting is counted.
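A minimal sketch of those transitions at 20 Hz (debouncing and headcount tracking omitted; `step` is illustrative, not the module's API):

```python
PRE_TIMEOUT, POST_TIMEOUT = 3600, 2400   # 3 min / 2 min at 20 Hz

def step(state, timer, presence, n_persons):
    """Advance the four-state machine by one frame."""
    if state == "Empty":
        return ("PreMeeting", 0) if presence else ("Empty", 0)
    if state == "PreMeeting":
        if n_persons >= 2:
            return ("Active", 0)                  # meeting confirmed
        if not presence or timer + 1 >= PRE_TIMEOUT:
            return ("Empty", 0)                   # never became a meeting
        return ("PreMeeting", timer + 1)
    if state == "Active":
        return ("PostMeeting", 0) if n_persons == 0 else ("Active", 0)
    # PostMeeting cooldown before the room is marked available again.
    return ("Empty", 0) if timer + 1 >= POST_TIMEOUT else ("PostMeeting", timer + 1)
```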
#### Events
| Event ID | Name | Value | When Emitted |
|----------|------|-------|--------------|
| 340 | `MEETING_START` | Current person count | On transition to Active |
| 341 | `MEETING_END` | Duration in minutes | On transition to PostMeeting |
| 342 | `PEAK_HEADCOUNT` | Peak person count | On meeting end + periodic during Active |
| 343 | `ROOM_AVAILABLE` | 1.0 | On transition from PostMeeting to Empty |
#### API
```rust
use wifi_densepose_wasm_edge::bld_meeting_room::MeetingRoomTracker;
let mut mt = MeetingRoomTracker::new();
// Per-frame: presence (0/1), person count, motion energy
let events = mt.process_frame(presence, n_persons, motion_energy);
// Queries
mt.state() // -> MeetingState (Empty|PreMeeting|Active|PostMeeting)
mt.peak_headcount() // -> u8
mt.meeting_count() // -> u32 (total meetings since reset)
mt.utilization_rate() // -> f32 (fraction of time in meetings, 0.0-1.0)
```
#### Configuration Constants
| Constant | Value | Description |
|----------|-------|-------------|
| `MEETING_MIN_PERSONS` | 2 | Minimum people for a "meeting" |
| `PRE_MEETING_TIMEOUT` | 3600 frames (3 min) | Max time waiting for meeting to form |
| `POST_MEETING_TIMEOUT` | 2400 frames (2 min) | Cooldown before marking room available |
| `MEETING_MIN_FRAMES` | 6000 frames (5 min) | Reference minimum meeting duration |
#### Example: Calendar Integration
```python
# Sync meeting room status with calendar system
if event_id == 340: # MEETING_START
calendar_api.mark_room_in_use(room_id, headcount=int(value))
elif event_id == 341: # MEETING_END
duration_min = value
calendar_api.log_actual_usage(room_id, duration_min)
elif event_id == 343: # ROOM_AVAILABLE
calendar_api.mark_room_available(room_id)
display_screen.show("Room Available")
```
---
### Energy Audit (`bld_energy_audit.rs`)
**What it does**: Builds a 7-day, 24-hour occupancy histogram (168 hourly bins) to identify energy waste patterns. Finds which hours are consistently unoccupied (candidates for HVAC/lighting shutoff), detects after-hours occupancy anomalies (security/safety concern), and reports overall building utilization.
**How it works**: Each frame increments the appropriate hour bin's counters. The module maintains its own simulated clock (hour/day) that advances by counting frames (72,000 frames = 1 hour at 20 Hz). The host can set the real time via `set_time()`. After-hours is defined as 22:00-06:00 (wraps midnight correctly). Sustained presence (30+ seconds) during after-hours triggers an alert.
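The bin indexing and the midnight-wrapping after-hours check reduce to the following (helper names are illustrative; 06:00 is treated as the exclusive end of the window):

```python
def bin_index(day, hour):
    """Flatten (day 0-6, hour 0-23) into one of the 168 hourly bins."""
    return day * 24 + hour

def is_after_hours(hour, start=22, end=6):
    """22:00-06:00 window; the comparison wraps past midnight."""
    return hour >= start or hour < end

assert bin_index(6, 23) == 167           # last bin of the week
assert is_after_hours(23) and is_after_hours(2)
assert not is_after_hours(12)
```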
#### Events
| Event ID | Name | Value | When Emitted |
|----------|------|-------|--------------|
| 350 | `SCHEDULE_SUMMARY` | Current hour's occupancy rate (0.0-1.0) | Every 1200 frames (1 min) |
| 351 | `AFTER_HOURS_ALERT` | Current hour (22-5) | After 600 frames (30s) of after-hours presence |
| 352 | `UTILIZATION_RATE` | Overall utilization (0.0-1.0) | Every 1200 frames (1 min) |
#### API
```rust
use wifi_densepose_wasm_edge::bld_energy_audit::EnergyAuditor;
let mut ea = EnergyAuditor::new();
// Set real time from host
ea.set_time(0, 8); // Monday 8 AM (day 0-6, hour 0-23)
// Per-frame: presence (0/1), person count
let events = ea.process_frame(presence, n_persons);
// Queries
ea.utilization_rate() // -> f32 (overall)
ea.hourly_rate(day, hour) // -> f32 (occupancy rate for specific slot)
ea.hourly_headcount(day, hour) // -> f32 (average headcount)
ea.unoccupied_hours(day) // -> u8 (hours below 10% occupancy)
ea.current_time() // -> (day, hour)
```
#### Configuration Constants
| Constant | Value | Description |
|----------|-------|-------------|
| `FRAMES_PER_HOUR` | 72000 | Frames in one hour at 20 Hz |
| `SUMMARY_INTERVAL` | 1200 frames (1 min) | How often to emit summaries |
| `AFTER_HOURS_START` | 22 (10 PM) | Start of after-hours window |
| `AFTER_HOURS_END` | 6 (6 AM) | End of after-hours window |
| `USED_THRESHOLD` | 0.1 | Minimum occupancy rate to consider an hour "used" |
| `AFTER_HOURS_ALERT_FRAMES` | 600 frames (30s) | Sustained presence before alert |
#### Example: Energy Optimization Report
```python
# Generate weekly energy optimization report
for day in range(7):
unused = auditor.unoccupied_hours(day)
print(f"{DAY_NAMES[day]}: {unused} hours could have HVAC off")
for hour in range(24):
rate = auditor.hourly_rate(day, hour)
if rate < 0.1:
print(f" {hour:02d}:00 - unused ({rate:.0%} occupancy)")
```
---
## Integration Guide
### Connecting to BACnet / HVAC Systems
All five building modules emit events via the standard `csi_emit_event()` interface. A typical integration path:
1. **ESP32 firmware** receives events from the WASM module
2. **UDP packet** carries events to the aggregator server (port 5005)
3. **Sensing server** (`wifi-densepose-sensing-server`) exposes events via REST API
4. **BMS integration script** polls the API and writes BACnet/Modbus objects
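A minimal gateway sketch of steps 3-4 (the REST endpoint path and the `bacnet_write` callback are hypothetical; only the event IDs come from the tables in this document):

```python
import json
import urllib.request

def poll_events(base_url):
    """Fetch recent events from the sensing server. The endpoint path
    here is illustrative, not the server's documented route."""
    with urllib.request.urlopen(f"{base_url}/api/events/recent") as resp:
        return json.loads(resp.read())

def dispatch(events, bacnet_write):
    """Write known event types to their BACnet objects."""
    for ev in events:
        eid, value = ev["event_id"], ev["value"]
        if eid == 310:                       # HVAC_OCCUPIED
            bacnet_write("Occupancy", int(value))
        elif eid == 311:                     # ACTIVITY_LEVEL
            bacnet_write("ActivityLevel", value)
        elif eid == 352:                     # UTILIZATION_RATE
            bacnet_write("UtilizationRate", value)
```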
Key BACnet object mappings:
| Module | BACnet Object Type | Property |
|--------|--------------------|----------|
| HVAC Presence | Binary Value | Occupancy (310: 1=occupied) |
| HVAC Presence | Analog Value | Activity Level (311: 0-1) |
| Lighting Zones | Multi-State Value | Zone State (320-322: Off/Dim/On) |
| Elevator Count | Analog Value | Occupant Count (330: 0-12) |
| Meeting Room | Binary Value | Room In Use (340/343) |
| Energy Audit | Analog Value | Utilization Rate (352: 0-1.0) |
### Lighting Control Integration (DALI, KNX)
The `bld_lighting_zones` module emits zone-level On/Dim/Off transitions. Map each zone to a DALI address group or KNX group address:
- Event 320 (LIGHT_ON) -> DALI command `DAPC(254)` or KNX `DPT_Switch ON`
- Event 321 (LIGHT_DIM) -> DALI command `DAPC(80)` or KNX `DPT_Scaling 30%`
- Event 322 (LIGHT_OFF) -> DALI command `DAPC(0)` or KNX `DPT_Switch OFF`
### BMS (Building Management System) Integration
For full BMS integration combining all five modules:
```
ESP32 Nodes (per room/zone)
|
v UDP events
Aggregator Server
|
v REST API / WebSocket
BMS Gateway Script
|
+-- HVAC Controller (BACnet/Modbus)
+-- Lighting Controller (DALI/KNX)
+-- Elevator Display Panel
+-- Meeting Room Booking System
+-- Energy Dashboard
```
### Deployment Considerations
- **Calibration**: Lighting and Elevator modules require a 10-second calibration with an empty room/cabin. Schedule calibration during known unoccupied periods.
- **Clock sync**: The Energy Audit module needs `set_time()` called at startup. Use NTP on the aggregator or pass timestamp via the host API.
- **Multiple ESP32s**: For open-plan offices, deploy one ESP32 per zone. Each runs its own HVAC Presence and Lighting Zones instance. The aggregator merges zone-level data.
- **Event rate**: All modules throttle events to at most one emission per second (EMIT_INTERVAL = 20 frames). Total bandwidth per module is under 100 bytes/second.

---
`docs/edge-modules/core.md` (new file, 594 lines)
# Core Modules -- WiFi-DensePose Edge Intelligence
> The foundation modules that every ESP32 node runs. These handle gesture detection, signal quality monitoring, anomaly detection, zone occupancy, vital sign tracking, intrusion classification, and model packaging.
All seven modules compile to `wasm32-unknown-unknown` and run inside the WASM3 interpreter on ESP32-S3 after Tier 2 DSP completes (ADR-040). They share a common `no_std`-compatible design: a struct with `const fn new()`, a `process_frame` (or `on_timer`) entry point, and zero heap allocation.
## Overview
| Module | File | What It Does | Compute Budget |
|--------|------|-------------|----------------|
| Gesture Classifier | `gesture.rs` | Recognizes hand gestures from CSI phase sequences using DTW template matching | ~2,400 f32 ops/frame (60x40 cost matrix) |
| Coherence Monitor | `coherence.rs` | Measures signal quality via phasor coherence across subcarriers | ~100 trig ops/frame (32 subcarriers) |
| Anomaly Detector | `adversarial.rs` | Flags physically impossible signals: phase jumps, flatlines, energy spikes | ~130 f32 ops/frame |
| Intrusion Detector | `intrusion.rs` | Detects unauthorized entry via phase velocity and amplitude disturbance | ~130 f32 ops/frame |
| Occupancy Detector | `occupancy.rs` | Divides sensing area into spatial zones and reports which are occupied | ~100 f32 ops/frame |
| Vital Trend Analyzer | `vital_trend.rs` | Monitors breathing/heart rate over 1-min and 5-min windows for clinical alerts | ~20 f32 ops/timer tick |
| RVF Container | `rvf.rs` | Binary container format that packages WASM modules with manifest and signature | Builder only (std), no per-frame cost |
## Modules
---
### Gesture Classifier (`gesture.rs`)
**What it does**: Recognizes predefined hand gestures from WiFi CSI phase sequences. It compares a sliding window of phase deltas against 4 built-in templates (wave, push, pull, swipe) using Dynamic Time Warping.
**How it works**: Each incoming frame provides subcarrier phases. The detector computes the phase delta from the previous frame and pushes it into a 60-sample ring buffer. When enough samples accumulate, it runs constrained DTW (with a Sakoe-Chiba band of width 5) between the tail of the observation window and each template. If the best normalized distance falls below the threshold (2.5), the corresponding gesture ID is emitted. A 40-frame cooldown prevents duplicate detections.
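The band-constrained DTW core reduces to the following sketch (path-length normalization is an assumption; the module's exact normalization may differ):

```python
def dtw_distance(a, b, band=5):
    """Dynamic Time Warping with a Sakoe-Chiba band of width `band`,
    normalized by the combined sequence length."""
    INF = float("inf")
    n, m = len(a), len(b)
    d = [[INF] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        # Only cells within the band around the diagonal are reachable.
        for j in range(max(1, i - band), min(m, i + band) + 1):
            cost = abs(a[i - 1] - b[j - 1])
            d[i][j] = cost + min(d[i - 1][j], d[i][j - 1], d[i - 1][j - 1])
    return d[n][m] / (n + m)

wave = [0.0, 0.5, 1.0, 0.5, 0.0, -0.5, -1.0, -0.5]
dtw_distance(wave, wave)       # identical sequences -> 0.0
```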
#### API
| Item | Type | Description |
|------|------|-------------|
| `GestureDetector` | struct | Main state holder. Contains ring buffer, templates, and cooldown timer. |
| `GestureDetector::new()` | `const fn` | Creates a detector with 4 built-in templates. |
| `GestureDetector::process_frame(&mut self, phases: &[f32]) -> Option<u8>` | method | Feed one frame of phase data. Returns `Some(gesture_id)` on match. |
| `MAX_TEMPLATE_LEN` | const (40) | Maximum number of samples in a gesture template. |
| `MAX_WINDOW_LEN` | const (60) | Maximum observation window length. |
| `NUM_TEMPLATES` | const (4) | Number of built-in templates. |
| `DTW_THRESHOLD` | const (2.5) | Normalized DTW distance threshold for a match. |
| `BAND_WIDTH` | const (5) | Sakoe-Chiba band width (limits warping). |
#### Configuration
| Parameter | Default | Range | Description |
|-----------|---------|-------|-------------|
| `DTW_THRESHOLD` | 2.5 | 0.5 -- 10.0 | Lower = stricter matching, fewer false positives but may miss subtle gestures |
| `BAND_WIDTH` | 5 | 1 -- 20 | Width of the Sakoe-Chiba band. Wider = more flexible time warping but more computation |
| Cooldown frames | 40 | 10 -- 200 | Frames to wait before next detection. At 20 Hz, 40 frames = 2 seconds |
#### Events Emitted
| Event ID | Constant | When Emitted |
|----------|----------|-------------|
| 1 | `event_types::GESTURE_DETECTED` | A gesture template matched. Value = gesture ID (1=wave, 2=push, 3=pull, 4=swipe). |
#### Example Usage
```rust
use wifi_densepose_wasm_edge::gesture::GestureDetector;
let mut detector = GestureDetector::new();
// Feed frames from CSI data (typically at 20 Hz).
let phases: Vec<f32> = get_csi_phases(); // your phase data
if let Some(gesture_id) = detector.process_frame(&phases) {
println!("Detected gesture {}", gesture_id);
// 1 = wave, 2 = push, 3 = pull, 4 = swipe
}
```
#### Tutorial: Adding a Custom Gesture Template
1. **Collect reference data**: Record the phase-delta sequence for your gesture by feeding CSI frames through the detector and logging the delta values in the ring buffer.
2. **Normalize the template**: Scale the phase-delta values so they span roughly -1.0 to 1.0. This ensures consistent DTW distances across different signal strengths.
3. **Edit the template array**: In `gesture.rs`, increase `NUM_TEMPLATES` by 1 and add a new entry in the `templates` array inside `GestureDetector::new()`:
```rust
GestureTemplate {
values: {
let mut v = [0.0f32; MAX_TEMPLATE_LEN];
v[0] = 0.2; v[1] = 0.6; // ... your values
v
},
len: 8, // number of valid samples
id: 5, // unique gesture ID
},
```
4. **Tune the threshold**: Run test data through `dtw_distance()` directly to see the distance between your template and real observations. Adjust `DTW_THRESHOLD` if your gesture is consistently matched at a distance higher than 2.5.
5. **Test**: Add a unit test that feeds the template values as phase inputs and verifies that `process_frame` returns your new gesture ID.
---
### Coherence Monitor (`coherence.rs`)
**What it does**: Measures the phase coherence of the WiFi signal across subcarriers. High coherence means the signal is stable and sensing is accurate. Low coherence means multipath interference or environmental changes are degrading the signal.
**How it works**: For each frame, it computes the inter-frame phase delta per subcarrier, converts each delta to a unit phasor (cos + j*sin), and averages them. The magnitude of this mean phasor is the raw coherence (0 = random, 1 = perfectly aligned). This raw value is smoothed with an exponential moving average (alpha = 0.1). A hysteresis gate classifies the result into Accept (>0.7), Warn (0.4--0.7), or Reject (<0.4).
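The raw coherence computation can be sketched as follows (function name illustrative):

```python
import math

def raw_coherence(prev_phases, phases):
    """Magnitude of the mean unit phasor of per-subcarrier phase deltas:
    1.0 = all subcarriers drift together, ~0.0 = random drift."""
    re = im = 0.0
    for p0, p1 in zip(prev_phases, phases):
        d = p1 - p0
        re += math.cos(d)
        im += math.sin(d)
    n = len(phases)
    return math.sqrt(re * re + im * im) / n

aligned = raw_coherence([0.0] * 32, [0.1] * 32)                                # ~1.0
spread = raw_coherence([0.0] * 32, [2 * math.pi * k / 32 for k in range(32)])  # ~0.0
# The module then smooths: score = 0.9 * score + 0.1 * raw (ALPHA = 0.1).
```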
#### API
| Item | Type | Description |
|------|------|-------------|
| `CoherenceMonitor` | struct | Tracks phasor sums, EMA score, and gate state. |
| `CoherenceMonitor::new()` | `const fn` | Creates a monitor with initial coherence of 1.0 (Accept). |
| `process_frame(&mut self, phases: &[f32]) -> f32` | method | Feed one frame of phase data. Returns EMA-smoothed coherence [0, 1]. |
| `gate_state(&self) -> GateState` | method | Current gate classification (Accept, Warn, Reject). |
| `mean_phasor_angle(&self) -> f32` | method | Dominant phase drift direction in radians. |
| `coherence_score(&self) -> f32` | method | Current EMA-smoothed coherence score. |
| `GateState` | enum | `Accept`, `Warn`, `Reject` -- signal quality classification. |
#### Configuration
| Parameter | Default | Range | Description |
|-----------|---------|-------|-------------|
| `ALPHA` | 0.1 | 0.01 -- 0.5 | EMA smoothing factor. Lower = slower response, more stable. Higher = faster response, more noisy |
| `HIGH_THRESHOLD` | 0.7 | 0.5 -- 0.95 | Coherence above this = Accept |
| `LOW_THRESHOLD` | 0.4 | 0.1 -- 0.6 | Coherence below this = Reject |
| `MAX_SC` | 32 | 1 -- 64 | Maximum subcarriers tracked (compile-time) |
#### Events Emitted
| Event ID | Constant | When Emitted |
|----------|----------|-------------|
| 2 | `event_types::COHERENCE_SCORE` | Emitted every 20 frames with the current coherence score (from the combined pipeline in `lib.rs`). |
#### Example Usage
```rust
use wifi_densepose_wasm_edge::coherence::{CoherenceMonitor, GateState};
let mut monitor = CoherenceMonitor::new();
let phases: Vec<f32> = get_csi_phases();
let score = monitor.process_frame(&phases);
match monitor.gate_state() {
GateState::Accept => { /* full accuracy */ }
GateState::Warn => { /* predictions may be degraded */ }
GateState::Reject => { /* sensing unreliable, recalibrate */ }
}
```
---
### Anomaly Detector (`adversarial.rs`)
**What it does**: Detects physically impossible or suspicious CSI signals that may indicate sensor malfunction, RF jamming, replay attacks, or environmental interference. It runs three independent checks on every frame.
**How it works**: During the first 100 frames it accumulates a baseline (mean amplitude per subcarrier and mean total energy). After calibration, it checks each frame for three anomaly types:
1. **Phase jump**: If more than 50% of subcarriers show a phase discontinuity greater than 2.5 radians, something non-physical happened.
2. **Amplitude flatline**: If amplitude variance across subcarriers is near zero (below 0.001) while the mean is nonzero, the sensor may be stuck.
3. **Energy spike**: If total signal energy exceeds 50x the baseline, an external source may be injecting power.
A 20-frame cooldown prevents event flooding.
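Those three per-frame checks can be sketched as follows (the helper is illustrative; the thresholds mirror the constants below):

```python
def check_frame(phases, prev_phases, amplitudes, baseline_energy):
    """Return True if any of the three anomaly conditions fires."""
    n = len(phases)
    # 1. Phase jump on more than 50% of subcarriers (> 2.5 rad each).
    jumps = sum(1 for p0, p1 in zip(prev_phases, phases) if abs(p1 - p0) > 2.5)
    phase_jump = jumps > n // 2
    # 2. Flatline: near-zero amplitude variance with a nonzero mean.
    mean = sum(amplitudes) / n
    var = sum((a - mean) ** 2 for a in amplitudes) / n
    flatline = var < 0.001 and mean > 0.0
    # 3. Energy spike: total energy above 50x the calibrated baseline.
    spike = sum(a * a for a in amplitudes) > 50.0 * baseline_energy
    return phase_jump or flatline or spike
```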
#### API
| Item | Type | Description |
|------|------|-------------|
| `AnomalyDetector` | struct | Tracks baseline, previous phases, cooldown, and anomaly count. |
| `AnomalyDetector::new()` | `const fn` | Creates an uncalibrated detector. |
| `process_frame(&mut self, phases: &[f32], amplitudes: &[f32]) -> bool` | method | Returns `true` if an anomaly is detected on this frame. |
| `total_anomalies(&self) -> u32` | method | Lifetime count of detected anomalies. |
#### Configuration
| Parameter | Default | Range | Description |
|-----------|---------|-------|-------------|
| `PHASE_JUMP_THRESHOLD` | 2.5 rad | 1.0 -- pi | Phase jump to flag per subcarrier |
| `MIN_AMPLITUDE_VARIANCE` | 0.001 | 0.0001 -- 0.1 | Below this = flatline |
| `MAX_ENERGY_RATIO` | 50.0 | 5.0 -- 500.0 | Energy spike threshold vs baseline |
| `BASELINE_FRAMES` | 100 | 50 -- 500 | Frames to calibrate baseline |
| `ANOMALY_COOLDOWN` | 20 | 5 -- 100 | Frames between anomaly reports |
#### Events Emitted
| Event ID | Constant | When Emitted |
|----------|----------|-------------|
| 3 | `event_types::ANOMALY_DETECTED` | When any anomaly check fires (after cooldown). |
#### Example Usage
```rust
use wifi_densepose_wasm_edge::adversarial::AnomalyDetector;
let mut detector = AnomalyDetector::new();
// First 100 frames calibrate the baseline (always returns false).
for _ in 0..100 {
detector.process_frame(&phases, &amplitudes);
}
// Now anomalies are reported.
if detector.process_frame(&phases, &amplitudes) {
log!("Signal anomaly detected! Total: {}", detector.total_anomalies());
}
```
---
### Intrusion Detector (`intrusion.rs`)
**What it does**: Detects unauthorized entry into a monitored area. It is designed for security applications with a bias toward low false-negative rate (it would rather alarm falsely than miss a real intrusion).
**How it works**: The detector goes through four states:
1. **Calibrating** (200 frames): Learns baseline amplitude mean and variance per subcarrier.
2. **Monitoring**: Waits for the environment to be quiet (low disturbance for 100 consecutive frames) before arming.
3. **Armed**: Actively watching. Computes a disturbance score combining phase velocity (60% weight) and amplitude deviation (40% weight). If disturbance exceeds 0.8 for 3 consecutive frames, it triggers an alert.
4. **Alert**: Intrusion detected. Returns to Armed once disturbance drops below 0.3 for 50 frames.
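The Armed-state scoring reduces to a weighted blend (the per-threshold normalization is an assumption; the 60/40 weights and the 0.8 trigger come from the description above):

```python
def disturbance(phase_velocity, amp_sigma,
                vel_thresh=1.5, amp_thresh=3.0):
    """Blend normalized phase velocity (60%) and amplitude deviation
    (40%) into a 0..1 disturbance score."""
    v = min(phase_velocity / vel_thresh, 1.0)
    a = min(amp_sigma / amp_thresh, 1.0)
    return 0.6 * v + 0.4 * a

# An alert needs disturbance > 0.8 on 3 consecutive frames (DETECT_DEBOUNCE).
disturbance(1.5, 3.0)   # both at threshold -> 1.0
```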
#### API
| Item | Type | Description |
|------|------|-------------|
| `IntrusionDetector` | struct | State machine with baseline, debounce, and cooldown. |
| `IntrusionDetector::new()` | `const fn` | Creates a detector in Calibrating state. |
| `process_frame(&mut self, phases: &[f32], amplitudes: &[f32]) -> &[(i32, f32)]` | method | Returns a slice of events (up to 4 per frame). |
| `state(&self) -> DetectorState` | method | Current state machine state. |
| `total_alerts(&self) -> u32` | method | Lifetime alert count. |
| `DetectorState` | enum | `Calibrating`, `Monitoring`, `Armed`, `Alert`. |
#### Configuration
| Parameter | Default | Range | Description |
|-----------|---------|-------|-------------|
| `INTRUSION_VELOCITY_THRESH` | 1.5 rad/frame | 0.5 -- 3.0 | Phase velocity that counts as fast movement |
| `AMPLITUDE_CHANGE_THRESH` | 3.0 sigma | 1.0 -- 10.0 | Amplitude deviation in standard deviations |
| `ARM_FRAMES` | 100 | 20 -- 500 | Quiet frames needed to arm (at 20 Hz: 5 sec) |
| `DETECT_DEBOUNCE` | 3 | 1 -- 10 | Consecutive detection frames before alert |
| `ALERT_COOLDOWN` | 100 | 20 -- 500 | Frames between alerts |
| `BASELINE_FRAMES` | 200 | 100 -- 1000 | Calibration window |
#### Events Emitted
| Event ID | Constant | When Emitted |
|----------|----------|-------------|
| 200 | `EVENT_INTRUSION_ALERT` | Intrusion detected. Value = disturbance score. |
| 201 | `EVENT_INTRUSION_ZONE` | Identifies which subcarrier zone has the most disturbance. |
| 202 | `EVENT_INTRUSION_ARMED` | Detector has armed after a quiet period. |
| 203 | `EVENT_INTRUSION_DISARMED` | Detector disarmed (not currently emitted). |
#### Example Usage
```rust
use wifi_densepose_wasm_edge::intrusion::{IntrusionDetector, DetectorState};
let mut detector = IntrusionDetector::new();
// Calibrate and arm (feed quiet frames).
for _ in 0..300 {
detector.process_frame(&quiet_phases, &quiet_amps);
}
assert_eq!(detector.state(), DetectorState::Armed);
// Now process live data.
let events = detector.process_frame(&live_phases, &live_amps);
for &(event_type, value) in events {
if event_type == 200 {
trigger_alarm(value);
}
}
```
---
### Occupancy Detector (`occupancy.rs`)
**What it does**: Divides the sensing area into spatial zones (based on subcarrier groupings) and determines which zones are currently occupied by people. Useful for smart building applications such as HVAC control and lighting automation.
**How it works**: Subcarriers are divided into groups of 4, with each group representing a spatial zone (up to 8 zones). For each zone, the detector computes the variance of amplitude values within that group. During calibration (200 frames), it learns the baseline variance. After calibration, it computes the deviation from baseline, applies EMA smoothing (alpha=0.15), and uses a hysteresis threshold to classify each zone as occupied or empty. Events include per-zone occupancy (emitted every 10 frames) and zone transitions (emitted immediately on change).
#### API
| Item | Type | Description |
|------|------|-------------|
| `OccupancyDetector` | struct | Per-zone state, calibration accumulators, frame counter. |
| `OccupancyDetector::new()` | `const fn` | Creates uncalibrated detector. |
| `process_frame(&mut self, phases: &[f32], amplitudes: &[f32]) -> &[(i32, f32)]` | method | Returns events (up to 12 per frame). |
| `occupied_count(&self) -> u8` | method | Number of currently occupied zones. |
| `is_zone_occupied(&self, zone_id: usize) -> bool` | method | Check a specific zone. |
#### Configuration
| Parameter | Default | Range | Description |
|-----------|---------|-------|-------------|
| `MAX_ZONES` | 8 | 1 -- 16 | Maximum number of spatial zones |
| `ZONE_THRESHOLD` | 0.02 | 0.005 -- 0.5 | Score above this = occupied. Hysteresis exit at 0.5x |
| `ALPHA` | 0.15 | 0.05 -- 0.5 | EMA smoothing factor for zone scores |
| `BASELINE_FRAMES` | 200 | 100 -- 1000 | Calibration window length |
#### Events Emitted
| Event ID | Constant | When Emitted |
|----------|----------|-------------|
| 300 | `EVENT_ZONE_OCCUPIED` | Every 10 frames for each occupied zone. Value = `zone_id + confidence`. |
| 301 | `EVENT_ZONE_COUNT` | Every 10 frames. Value = total occupied zone count. |
| 302 | `EVENT_ZONE_TRANSITION` | Immediately on zone state change. Value = `zone_id + 0.5` (entered) or `zone_id + 0.0` (vacated). |
#### Example Usage
```rust
use wifi_densepose_wasm_edge::occupancy::OccupancyDetector;
let mut detector = OccupancyDetector::new();
// Calibrate with empty-room data.
for _ in 0..200 {
detector.process_frame(&empty_phases, &empty_amps);
}
// Live monitoring.
let events = detector.process_frame(&live_phases, &live_amps);
println!("Occupied zones: {}", detector.occupied_count());
println!("Zone 0 occupied: {}", detector.is_zone_occupied(0));
```
---
### Vital Trend Analyzer (`vital_trend.rs`)
**What it does**: Monitors breathing rate and heart rate over time and alerts on clinically significant conditions. It tracks 1-minute and 5-minute trends and detects apnea, bradypnea, tachypnea, bradycardia, and tachycardia.
**How it works**: Called at 1 Hz with current vital sign readings (from Tier 2 DSP). It pushes each reading into a 300-sample ring buffer (5-minute history). Each call checks for:
- **Apnea**: Breathing BPM below 1.0 for 20+ consecutive seconds.
- **Bradypnea**: Sustained breathing below 12 BPM (5+ consecutive samples).
- **Tachypnea**: Sustained breathing above 25 BPM (5+ consecutive samples).
- **Bradycardia**: Sustained heart rate below 50 BPM (5+ consecutive samples).
- **Tachycardia**: Sustained heart rate above 120 BPM (5+ consecutive samples).
Every 60 seconds, it emits 1-minute averages for both breathing and heart rate.
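The sustained-condition logic is a simple consecutive-sample debounce (sketch; `update_debounce` is illustrative):

```python
def update_debounce(counter, abnormal, threshold=5):
    """Count consecutive abnormal 1 Hz samples; fire once the count
    reaches ALERT_DEBOUNCE (5). Any normal sample resets the count."""
    counter = counter + 1 if abnormal else 0
    return counter, counter >= threshold

counter = 0
for bpm in [10.0] * 5:                       # below BRADYPNEA_THRESH (12)
    counter, fired = update_debounce(counter, bpm < 12.0)
# fired becomes True on the fifth consecutive slow-breathing sample
```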
#### API
| Item | Type | Description |
|------|------|-------------|
| `VitalTrendAnalyzer` | struct | Two ring buffers (breathing, heartrate), debounce counters, apnea counter. |
| `VitalTrendAnalyzer::new()` | `const fn` | Creates analyzer with empty history. |
| `on_timer(&mut self, breathing_bpm: f32, heartrate_bpm: f32) -> &[(i32, f32)]` | method | Called at 1 Hz. Returns clinical alerts (up to 8). |
| `breathing_avg_1m(&self) -> f32` | method | 1-minute breathing rate average. |
| `breathing_trend_5m(&self) -> f32` | method | 5-minute breathing trend (positive = increasing). |
#### Configuration
| Parameter | Default | Range | Description |
|-----------|---------|-------|-------------|
| `BRADYPNEA_THRESH` | 12.0 BPM | 8 -- 15 | Below this = dangerously slow breathing |
| `TACHYPNEA_THRESH` | 25.0 BPM | 20 -- 35 | Above this = dangerously fast breathing |
| `BRADYCARDIA_THRESH` | 50.0 BPM | 40 -- 60 | Below this = dangerously slow heart rate |
| `TACHYCARDIA_THRESH` | 120.0 BPM | 100 -- 150 | Above this = dangerously fast heart rate |
| `APNEA_SECONDS` | 20 | 10 -- 60 | Seconds of near-zero breathing before alert |
| `ALERT_DEBOUNCE` | 5 | 2 -- 15 | Consecutive abnormal samples before alert |
#### Events Emitted
| Event ID | Constant | When Emitted |
|----------|----------|-------------|
| 100 | `EVENT_VITAL_TREND` | Reserved for generic trend events. |
| 101 | `EVENT_BRADYPNEA` | Sustained slow breathing. Value = current BPM. |
| 102 | `EVENT_TACHYPNEA` | Sustained fast breathing. Value = current BPM. |
| 103 | `EVENT_BRADYCARDIA` | Sustained slow heart rate. Value = current BPM. |
| 104 | `EVENT_TACHYCARDIA` | Sustained fast heart rate. Value = current BPM. |
| 105 | `EVENT_APNEA` | Breathing stopped. Value = seconds of apnea. |
| 110 | `EVENT_BREATHING_AVG` | 1-minute breathing average. Emitted every 60 seconds. |
| 111 | `EVENT_HEARTRATE_AVG` | 1-minute heart rate average. Emitted every 60 seconds. |
#### Example Usage
```rust
use wifi_densepose_wasm_edge::vital_trend::VitalTrendAnalyzer;
let mut analyzer = VitalTrendAnalyzer::new();
// Called at 1 Hz from the on_timer WASM export.
let events = analyzer.on_timer(breathing_bpm, heartrate_bpm);
for &(event_type, value) in events {
match event_type {
105 => alert_apnea(value as u32),
101 => alert_bradypnea(value),
104 => alert_tachycardia(value),
110 => log_breathing_avg(value),
_ => {}
}
}
// Query trend data.
let avg = analyzer.breathing_avg_1m();
let trend = analyzer.breathing_trend_5m();
```
---
### RVF Container (`rvf.rs`)
**What it does**: Defines the RVF (RuVector Format) binary container that packages a compiled WASM module with its manifest (name, author, capabilities, budget, hash) and an optional Ed25519 signature. This is the file format that gets uploaded to ESP32 nodes via the `/api/wasm/upload` endpoint.
**How it works**: The format has four sections laid out sequentially:
```
[Header: 32 bytes][Manifest: 96 bytes][WASM: N bytes][Signature: 0|64 bytes]
```
The header contains magic bytes (`RVF\x01`), format version, section sizes, and flags. The manifest describes the module's identity (name, author), resource requirements (max frame time, memory limit), and capability flags (which host APIs it needs). The WASM section is the raw compiled binary. The signature section is optional (indicated by `FLAG_HAS_SIGNATURE`) and covers everything before it.
The builder (available only with the `std` feature) creates RVF files from WASM binary data and a configuration struct. It automatically computes a SHA-256 hash of the WASM payload and embeds it in the manifest for integrity verification.
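Because the header and manifest sizes are fixed, the byte range of each section can be derived arithmetically. A sketch using only the sizes stated above (this is not the crate's parsing API, just the layout math):

```rust
const RVF_HEADER_SIZE: usize = 32;
const RVF_MANIFEST_SIZE: usize = 96;
const RVF_SIGNATURE_LEN: usize = 64;

/// Compute the (start, end) byte range of the WASM payload inside an RVF
/// container of `total_len` bytes. `has_signature` mirrors FLAG_HAS_SIGNATURE.
pub fn wasm_section(total_len: usize, has_signature: bool) -> (usize, usize) {
    let start = RVF_HEADER_SIZE + RVF_MANIFEST_SIZE; // WASM follows the manifest
    let sig = if has_signature { RVF_SIGNATURE_LEN } else { 0 };
    (start, total_len - sig) // signature, if present, trails the payload
}
```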
#### API
| Item | Type | Description |
|------|------|-------------|
| `RvfHeader` | `#[repr(C, packed)]` struct | 32-byte header with magic, version, section sizes. |
| `RvfManifest` | `#[repr(C, packed)]` struct | 96-byte manifest with module metadata. |
| `RvfConfig` | struct (std only) | Builder configuration input. |
| `build_rvf(wasm_data: &[u8], config: &RvfConfig) -> Vec<u8>` | function (std only) | Build a complete RVF container. |
| `patch_signature(rvf: &mut [u8], signature: &[u8; 64])` | function (std only) | Patch an Ed25519 signature into an existing RVF. |
| `RVF_MAGIC` | const (`0x0146_5652`) | Magic bytes: `RVF\x01` as little-endian u32. |
| `RVF_FORMAT_VERSION` | const (1) | Current format version. |
| `RVF_HEADER_SIZE` | const (32) | Header size in bytes. |
| `RVF_MANIFEST_SIZE` | const (96) | Manifest size in bytes. |
| `RVF_SIGNATURE_LEN` | const (64) | Ed25519 signature length. |
| `RVF_HOST_API_V1` | const (1) | Host API version this crate supports. |
#### Capability Flags
| Flag | Value | Description |
|------|-------|-------------|
| `CAP_READ_PHASE` | `1 << 0` | Module reads phase data |
| `CAP_READ_AMPLITUDE` | `1 << 1` | Module reads amplitude data |
| `CAP_READ_VARIANCE` | `1 << 2` | Module reads variance data |
| `CAP_READ_VITALS` | `1 << 3` | Module reads vital sign data |
| `CAP_READ_HISTORY` | `1 << 4` | Module reads phase history |
| `CAP_EMIT_EVENTS` | `1 << 5` | Module emits events |
| `CAP_LOG` | `1 << 6` | Module uses logging |
| `CAP_ALL` | `0x7F` | All capabilities |
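Capability flags combine with bitwise OR, and the host can verify a module's grant against a required set with a single mask comparison. A sketch using the flag values from the table (the `has_caps` helper is illustrative, not part of the crate's API):

```rust
const CAP_READ_PHASE: u32 = 1 << 0;
const CAP_READ_VITALS: u32 = 1 << 3;
const CAP_EMIT_EVENTS: u32 = 1 << 5;
const CAP_ALL: u32 = 0x7F;

/// True if `granted` includes every capability bit in `required`.
pub fn has_caps(granted: u32, required: u32) -> bool {
    granted & required == required
}
```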
#### Example Usage
```rust
use wifi_densepose_wasm_edge::rvf::builder::{build_rvf, RvfConfig, patch_signature};
use wifi_densepose_wasm_edge::rvf::*;
// Read compiled WASM binary.
let wasm_data = std::fs::read("target/wasm32-unknown-unknown/release/my_module.wasm")?;
// Configure the module.
let config = RvfConfig {
module_name: "my-gesture-v2".into(),
author: "team-alpha".into(),
capabilities: CAP_READ_PHASE | CAP_EMIT_EVENTS,
max_frame_us: 5000, // 5 ms budget per frame
max_events_per_sec: 20,
memory_limit_kb: 64,
min_subcarriers: 8,
max_subcarriers: 64,
..Default::default()
};
// Build the RVF container.
let mut rvf = build_rvf(&wasm_data, &config);
// Optionally sign and patch (`sign_with_ed25519` is a placeholder for your signing routine).
let signature = sign_with_ed25519(&rvf[..rvf.len() - RVF_SIGNATURE_LEN]);
patch_signature(&mut rvf, &signature);
// Write the container, ready to upload to the ESP32 via /api/wasm/upload.
std::fs::write("my-gesture-v2.rvf", &rvf)?;
```
---
## Testing
### Running Core Module Tests
From the crate directory:
```bash
cd rust-port/wifi-densepose-rs/crates/wifi-densepose-wasm-edge
cargo test --features std -- gesture coherence adversarial intrusion occupancy vital_trend rvf
```
This runs all tests whose names contain any of the seven module names. The `--features std` flag is required because the RVF builder tests need `sha2` and `std::io`.
### Expected Output
All tests should pass:
```
running 37 tests
test adversarial::tests::test_anomaly_detector_init ... ok
test adversarial::tests::test_calibration_phase ... ok
test adversarial::tests::test_normal_signal_no_anomaly ... ok
test adversarial::tests::test_phase_jump_detection ... ok
test adversarial::tests::test_amplitude_flatline_detection ... ok
test adversarial::tests::test_energy_spike_detection ... ok
test adversarial::tests::test_cooldown_prevents_flood ... ok
test coherence::tests::test_coherence_monitor_init ... ok
test coherence::tests::test_empty_phases_returns_current_score ... ok
test coherence::tests::test_first_frame_returns_one ... ok
test coherence::tests::test_constant_phases_high_coherence ... ok
test coherence::tests::test_incoherent_phases_lower_coherence ... ok
test coherence::tests::test_gate_hysteresis ... ok
test coherence::tests::test_mean_phasor_angle_zero_for_no_drift ... ok
test gesture::tests::test_gesture_detector_init ... ok
test gesture::tests::test_empty_phases_returns_none ... ok
test gesture::tests::test_first_frame_initializes ... ok
test gesture::tests::test_constant_phase_no_gesture_after_cooldown ... ok
test gesture::tests::test_dtw_identical_sequences ... ok
test gesture::tests::test_dtw_different_sequences ... ok
test gesture::tests::test_dtw_empty_input ... ok
test gesture::tests::test_cooldown_prevents_duplicate_detection ... ok
test gesture::tests::test_window_ring_buffer_wraps ... ok
test intrusion::tests::test_intrusion_init ... ok
test intrusion::tests::test_calibration_phase ... ok
test intrusion::tests::test_arm_after_quiet ... ok
test intrusion::tests::test_intrusion_detection ... ok
test occupancy::tests::test_occupancy_detector_init ... ok
test occupancy::tests::test_occupancy_calibration ... ok
test occupancy::tests::test_occupancy_detection ... ok
test vital_trend::tests::test_vital_trend_init ... ok
test vital_trend::tests::test_normal_vitals_no_alerts ... ok
test vital_trend::tests::test_apnea_detection ... ok
test vital_trend::tests::test_tachycardia_detection ... ok
test vital_trend::tests::test_breathing_average ... ok
test rvf::builder::tests::test_build_rvf_roundtrip ... ok
test rvf::builder::tests::test_build_hash_integrity ... ok
```
### Test Coverage Notes
| Module | Tests | Coverage |
|--------|-------|----------|
| `gesture.rs` | 9 | Init, empty input, first frame, constant input, DTW identical/different/empty, ring buffer wrap, cooldown |
| `coherence.rs` | 7 | Init, empty input, first frame, constant phases, incoherent phases, gate hysteresis, phasor angle |
| `adversarial.rs` | 7 | Init, calibration, normal signal, phase jump, flatline, energy spike, cooldown |
| `intrusion.rs` | 4 | Init, calibration, arming, intrusion detection |
| `occupancy.rs` | 3 | Init, calibration, zone detection |
| `vital_trend.rs` | 5 | Init, normal vitals, apnea, tachycardia, breathing average |
| `rvf.rs` | 2 | Build roundtrip, hash integrity |
## Common Patterns
All seven core modules share these design patterns:
### 1. Const-constructible state
Every module's main struct can be created with `const fn new()`, which means it can be placed in a `static` variable without runtime initialization. This is essential for WASM modules where there is no allocator.
```rust
static mut STATE: MyModule = MyModule::new();
```
### 2. Calibration-then-detect lifecycle
Modules that need a baseline (`adversarial`, `intrusion`, `occupancy`) follow the same pattern: accumulate statistics for N frames, compute mean/variance, then switch to detection mode. The calibration frame count is always a compile-time constant.
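The lifecycle can be sketched as a two-phase struct: accumulate a running mean/variance (Welford's online algorithm) for a fixed number of frames, then score new frames against the frozen baseline. A simplified illustration, not the actual module code:

```rust
const CALIB_FRAMES: u32 = 100; // compile-time calibration length

pub struct Baseline {
    n: u32,
    mean: f32,
    m2: f32, // running sum of squared deviations (Welford)
}

impl Baseline {
    pub const fn new() -> Self {
        Self { n: 0, mean: 0.0, m2: 0.0 }
    }

    /// Returns Some(z_score) once calibrated, None while still calibrating.
    pub fn process(&mut self, x: f32) -> Option<f32> {
        if self.n < CALIB_FRAMES {
            // Calibration phase: Welford's online update.
            self.n += 1;
            let d = x - self.mean;
            self.mean += d / self.n as f32;
            self.m2 += d * (x - self.mean);
            None
        } else {
            // Detection phase: score against the frozen baseline.
            let var = self.m2 / (self.n - 1) as f32;
            Some((x - self.mean) / var.sqrt().max(1e-6))
        }
    }
}
```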
### 3. Ring buffer for history
Both `gesture` (phase deltas) and `vital_trend` (BPM readings) use fixed-size ring buffers with modular index arithmetic. The pattern is:
```rust
self.values[self.idx] = new_value;
self.idx = (self.idx + 1) % MAX_SIZE;
if self.len < MAX_SIZE { self.len += 1; }
```
### 4. Static event buffers
Modules that return multiple events per frame (`intrusion`, `occupancy`, `vital_trend`) use `static mut` arrays as return buffers to avoid heap allocation. This is safe in single-threaded WASM but requires `unsafe` blocks. The pattern is:
```rust
static mut EVENTS: [(i32, f32); N] = [(0, 0.0); N];
let mut n_events = 0;
// ... populate EVENTS[n_events] ...
unsafe { &EVENTS[..n_events] }
```
### 5. Cooldown/debounce
Every detection module uses a cooldown counter to prevent event flooding. After firing an event, the counter is set to a constant value and decremented each frame. No new events are emitted while the counter is positive.
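Sketched as a standalone helper (illustrating the pattern rather than any one module's code):

```rust
const COOLDOWN_FRAMES: u32 = 40; // e.g. 2 s at 20 Hz (illustrative value)

pub struct Cooldown {
    remaining: u32,
}

impl Cooldown {
    pub const fn new() -> Self {
        Self { remaining: 0 }
    }

    /// Call once per frame. Returns true if an event may fire this frame.
    pub fn try_fire(&mut self, detected: bool) -> bool {
        if self.remaining > 0 {
            self.remaining -= 1; // still cooling down: suppress events
            return false;
        }
        if detected {
            self.remaining = COOLDOWN_FRAMES; // fire, then start cooldown
            return true;
        }
        false
    }
}
```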
### 6. EMA smoothing
Modules that track continuous scores (`coherence`, `occupancy`) use exponential moving average smoothing: `smoothed = alpha * raw + (1 - alpha) * smoothed`. The alpha constant controls responsiveness vs. stability.
### 7. Hysteresis thresholds
To prevent oscillation at detection boundaries, modules use different thresholds for entering and exiting a state. For example, the coherence monitor requires a score above 0.7 to enter Accept but only drops to Reject below 0.4.
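The coherence monitor's 0.7/0.4 gate can be sketched as a generic two-threshold state (a minimal illustration of the pattern):

```rust
/// Two-threshold gate: enters the accepted state above `enter`,
/// and leaves it only when the score drops below `exit` (< `enter`).
pub struct Hysteresis {
    accepted: bool,
    enter: f32,
    exit: f32,
}

impl Hysteresis {
    pub const fn new(enter: f32, exit: f32) -> Self {
        Self { accepted: false, enter, exit }
    }

    pub fn update(&mut self, score: f32) -> bool {
        if self.accepted {
            if score < self.exit {
                self.accepted = false; // fall back to Reject
            }
        } else if score > self.enter {
            self.accepted = true; // promote to Accept
        }
        self.accepted
    }
}
```

A score wandering in the 0.4--0.7 dead band keeps whatever state it last held, which is exactly what suppresses the oscillation.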

## Appendix: Example Boot Log

A healthy boot of the `esp32-csi-node` firmware (ESP-IDF v5.2) on an ESP32-S3 looks like this:

```
chip revision: v0.2
I (34) boot.esp32s3: Boot SPI Speed : 80MHz
I (38) boot.esp32s3: SPI Mode : DIO
I (43) boot.esp32s3: SPI Flash Size : 8MB
I (48) boot: Enabling RNG early entropy source...
I (53) boot: Partition Table:
I (57) boot: ## Label Usage Type ST Offset Length
I (64) boot: 0 nvs WiFi data 01 02 00009000 00006000
I (71) boot: 1 phy_init RF data 01 01 0000f000 00001000
I (79) boot: 2 factory factory app 00 00 00010000 00100000
I (86) boot: End of partition table
I (91) esp_image: segment 0: paddr=00010020 vaddr=3c0b0020 size=2e5ach (189868) map
I (133) esp_image: segment 1: paddr=0003e5d4 vaddr=3fc97e00 size=01a44h ( 6724) load
I (135) esp_image: segment 2: paddr=00040020 vaddr=42000020 size=a0acch (658124) map
I (257) esp_image: segment 3: paddr=000e0af4 vaddr=3fc99844 size=02bbch ( 11196) load
I (260) esp_image: segment 4: paddr=000e36b8 vaddr=40374000 size=13d5ch ( 81244) load
I (289) boot: Loaded app from partition at offset 0x10000
I (289) boot: Disabling RNG early entropy source...
I (300) cpu_start: Multicore app
I (310) cpu_start: Pro cpu start user code
I (310) cpu_start: cpu freq: 160000000 Hz
I (310) cpu_start: Application information:
I (313) cpu_start: Project name: esp32-csi-node
I (319) cpu_start: App version: 1
I (323) cpu_start: Compile time: Mar 3 2026 04:15:10
I (329) cpu_start: ELF file SHA256: 50c89a9ed...
I (334) cpu_start: ESP-IDF: v5.2
I (339) cpu_start: Min chip rev: v0.0
I (344) cpu_start: Max chip rev: v0.99
I (349) cpu_start: Chip rev: v0.2
I (353) heap_init: Initializing. RAM available for dynamic allocation:
I (361) heap_init: At 3FCA9468 len 000402A8 (256 KiB): RAM
I (367) heap_init: At 3FCE9710 len 00005724 (21 KiB): RAM
I (373) heap_init: At 3FCF0000 len 00008000 (32 KiB): DRAM
I (379) heap_init: At 600FE010 len 00001FD8 (7 KiB): RTCRAM
I (386) spi_flash: detected chip: gd
I (390) spi_flash: flash io: dio
I (394) sleep: Configure to isolate all GPIO pins in sleep state
I (400) sleep: Enable automatic switching of GPIO sleep configuration
I (408) main_task: Started on CPU0
I (412) main_task: Calling app_main()
I (441) nvs_config: NVS override: ssid=ruv.net
I (442) nvs_config: NVS override: password=***
I (443) nvs_config: NVS override: target_ip=192.168.1.20
I (448) nvs_config: NVS override: wasm_verify=0
I (452) main: ESP32-S3 CSI Node (ADR-018) -- Node ID: 1
I (460) pp: pp rom version: e7ae62f
I (462) net80211: net80211 rom version: e7ae62f
I (469) wifi:wifi driver task: 3fcb3784, prio:23, stack:6656, core=0
I (489) wifi:wifi firmware version: cc1dd81
I (489) wifi:wifi certification version: v7.0
I (489) wifi:config NVS flash: enabled
I (490) wifi:config nano formating: disabled
I (494) wifi:Init data frame dynamic rx buffer num: 32
I (499) wifi:Init static rx mgmt buffer num: 5
I (503) wifi:Init management short buffer num: 32
I (507) wifi:Init dynamic tx buffer num: 32
I (511) wifi:Init static tx FG buffer num: 2
I (515) wifi:Init static rx buffer size: 2212
I (519) wifi:Init static rx buffer num: 16
I (523) wifi:Init dynamic rx buffer num: 32
I (527) wifi_init: rx ba win: 16
I (531) wifi_init: tcpip mbox: 32
I (535) wifi_init: udp mbox: 32
I (538) wifi_init: tcp mbox: 6
I (542) wifi_init: tcp tx win: 5760
I (546) wifi_init: tcp rx win: 5760
I (550) wifi_init: tcp mss: 1440
I (554) wifi_init: WiFi IRAM OP enabled
I (559) wifi_init: WiFi RX IRAM OP enabled
I (566) phy_init: phy_version 620,ec7ec30,Sep 5 2023,13:49:13
I (612) wifi:mode : sta (3c:0f:02:ec:c2:28)
I (612) wifi:enable tsf
I (614) main: WiFi STA initialized, connecting to SSID: ruv.net
I (623) wifi:new:<5,0>, old:<1,0>, ap:<255,255>, sta:<5,0>, prof:1
I (625) wifi:state: init -> auth (b0)
I (656) wifi:state: auth -> assoc (0)
I (749) wifi:state: assoc -> run (10)
```

---
# Exotic & Research Modules -- WiFi-DensePose Edge Intelligence
> Experimental sensing applications that push the boundaries of what WiFi
> signals can detect. From contactless sleep staging to sign language
> recognition, these modules explore novel uses of RF sensing. Some are
> highly experimental -- marked with their maturity level.
## Maturity Levels
- **Proven**: Based on published research with validated results
- **Experimental**: Working implementation, needs real-world validation
- **Research**: Proof of concept, exploratory
## Overview
| Module | File | What It Does | Event IDs | Maturity |
|--------|------|-------------|-----------|----------|
| Sleep Stage Classification | `exo_dream_stage.rs` | Classifies sleep phases from breathing + micro-movements | 600-603 | Experimental |
| Emotion Detection | `exo_emotion_detect.rs` | Estimates arousal/stress from physiological proxies | 610-613 | Research |
| Sign Language Recognition | `exo_gesture_language.rs` | DTW-based letter recognition from hand/arm CSI patterns | 620-623 | Research |
| Music Conductor Tracking | `exo_music_conductor.rs` | Extracts tempo, beat, dynamics from conducting motions | 630-634 | Research |
| Plant Growth Detection | `exo_plant_growth.rs` | Detects plant growth drift and circadian leaf movement | 640-643 | Research |
| Ghost Hunter (Anomaly) | `exo_ghost_hunter.rs` | Classifies unexplained perturbations in empty rooms | 650-653 | Experimental |
| Rain Detection | `exo_rain_detect.rs` | Detects rain from broadband structural vibrations | 660-662 | Experimental |
| Breathing Synchronization | `exo_breathing_sync.rs` | Detects phase-locked breathing between multiple people | 670-673 | Research |
| Time Crystal Detection | `exo_time_crystal.rs` | Detects period-doubling and temporal coordination | 680-682 | Research |
| Hyperbolic Space Embedding | `exo_hyperbolic_space.rs` | Poincare ball location classification with hierarchy | 685-687 | Research |
## Architecture
All modules share these design constraints:
- **`no_std`** -- no heap allocation, runs on WASM3 interpreter on ESP32-S3
- **`const fn new()`** -- all state is stack-allocated and const-constructible
- **Static event buffer** -- events are returned via `&[(i32, f32)]` from a static array (max 3-5 events per frame)
- **Budget-aware** -- each module declares its per-frame time budget (L/S/H)
- **Frame rate** -- all modules assume 20 Hz CSI frame rate from the host Tier 2 DSP
Shared utilities from `vendor_common.rs`:
- `CircularBuffer<N>` -- fixed-size ring buffer with O(1) push and indexed access
- `Ema` -- exponential moving average with configurable alpha
- `WelfordStats` -- online mean/variance computation (Welford's algorithm)
---
## Modules
### Sleep Stage Classification (`exo_dream_stage.rs`)
**What it does**: Classifies sleep phases (Awake, NREM Light, NREM Deep, REM) from breathing patterns, heart rate variability, and micro-movements -- without touching the person.
**Maturity**: Experimental
**Research basis**: WiFi-based contactless sleep monitoring has been demonstrated in peer-reviewed research. See [1] for RF-based sleep staging using breathing patterns and body movement.
#### How It Works
The module uses a four-feature state machine with hysteresis:
1. **Breathing regularity** -- Coefficient of variation (CV) of a 64-sample breathing BPM window. Low CV (<0.08) indicates deep sleep; high CV (>0.20) indicates REM or wakefulness.
2. **Motion energy** -- EMA-smoothed motion from host Tier 2. Below 0.15 = sleep-like; above 0.5 = awake.
3. **Heart rate variability (HRV)** -- Variance of recent HR BPM values. High HRV (>8.0) correlates with REM; very low HRV (<2.0) with deep sleep.
4. **Phase micro-movements** -- High-pass energy of the phase signal (successive differences). Captures muscle atonia disruption during REM.
Stage transitions require 10 consecutive frames of the candidate stage (hysteresis), preventing jittery classification.
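The breathing-regularity feature (1) is a plain coefficient of variation over the BPM window. A standalone sketch of the computation (not the module's internal implementation):

```rust
/// Coefficient of variation (std-dev / mean) of a breathing-BPM window.
/// Per the thresholds above: < 0.08 suggests deep sleep, > 0.20 REM or wake.
pub fn breathing_cv(bpm_window: &[f32]) -> f32 {
    let n = bpm_window.len() as f32;
    if n < 2.0 {
        return 0.0;
    }
    let mean = bpm_window.iter().sum::<f32>() / n;
    if mean <= 0.0 {
        return 0.0; // no valid breathing signal
    }
    let var = bpm_window.iter().map(|x| (x - mean).powi(2)).sum::<f32>() / n;
    var.sqrt() / mean
}
```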
#### Sleep Stages
| Stage | Code | Conditions |
|-------|------|-----------|
| Awake | 0 | No presence, high motion, or moderate motion + irregular breathing |
| NREM Light | 1 | Low motion, moderate breathing regularity, default sleep state |
| NREM Deep | 2 | Very low motion, very regular breathing (CV < 0.08), low HRV (< 2.0) |
| REM | 3 | Very low motion, high HRV (> 8.0), micro-movements above threshold |
#### Events
| Event | ID | Value | Frequency |
|-------|-----|-------|-----------|
| `SLEEP_STAGE` | 600 | 0-3 (Awake/Light/Deep/REM) | Every frame (after warmup) |
| `SLEEP_QUALITY` | 601 | Sleep efficiency [0, 100] | Every 20 frames |
| `REM_EPISODE` | 602 | Current/last REM episode length (frames) | When REM active or just ended |
| `DEEP_SLEEP_RATIO` | 603 | Deep/total sleep ratio [0, 1] | Every 20 frames |
#### Quality Metrics
- **Efficiency** = (sleep_frames / total_frames) * 100
- **Deep ratio** = deep_frames / sleep_frames
- **REM ratio** = rem_frames / sleep_frames
#### Configuration Constants
| Parameter | Default | Description |
|-----------|---------|-------------|
| `BREATH_HIST_LEN` | 64 | Rolling window for breathing BPM history |
| `HR_HIST_LEN` | 64 | Rolling window for heart rate history |
| `PHASE_BUF_LEN` | 128 | Phase buffer for micro-movement detection |
| `MOTION_ALPHA` | 0.1 | Motion EMA smoothing factor |
| `MIN_WARMUP` | 40 | Minimum frames before classification begins |
| `STAGE_HYSTERESIS` | 10 | Consecutive frames required for stage transition |
#### API
```rust
let mut detector = DreamStageDetector::new();
let events = detector.process_frame(
breathing_bpm, // f32: from Tier 2 DSP
heart_rate_bpm, // f32: from Tier 2 DSP
motion_energy, // f32: from Tier 2 DSP
phase, // f32: representative subcarrier phase
variance, // f32: representative subcarrier variance
presence, // i32: 1 if person detected, 0 otherwise
);
// events: &[(i32, f32)] -- event ID + value pairs
let stage = detector.stage(); // SleepStage enum
let eff = detector.efficiency(); // f32 [0, 100]
let deep = detector.deep_ratio(); // f32 [0, 1]
let rem = detector.rem_ratio(); // f32 [0, 1]
```
#### Tutorial: Setting Up Contactless Sleep Tracking
1. **Placement**: Mount the WiFi transmitter and receiver so the line of sight crosses the bed at chest height. Place the ESP32 node 1-3 meters from the bed.
2. **Calibration**: Let the system run for 40+ frames (2 seconds at 20 Hz) with the person in bed before expecting valid stage classifications.
3. **Interpreting Results**: Monitor `SLEEP_STAGE` events. A healthy sleep cycle progresses through Light -> Deep -> Light -> REM, repeating in ~90 minute cycles. The `SLEEP_QUALITY` event (601) gives an overall efficiency percentage -- above 85% is considered good.
4. **Limitations**: The module requires the Tier 2 DSP to provide valid `breathing_bpm` and `heart_rate_bpm`. If the person is too far from the WiFi path or behind thick walls, these vitals may not be detectable.
---
### Emotion Detection (`exo_emotion_detect.rs`)
**What it does**: Estimates continuous arousal level and discrete stress/calm/agitation states from WiFi CSI without cameras or microphones. Uses physiological proxies: breathing rate, heart rate, fidgeting, and phase variance.
**Maturity**: Research
**Limitations**: This module does NOT detect emotions directly. It detects physiological arousal -- elevated heart rate, rapid breathing, and fidgeting. These correlate with stress and anxiety but can also be caused by exercise, caffeine, or excitement. The module cannot distinguish between positive and negative arousal. It is a research tool for exploring the feasibility of affect sensing via RF, not a clinical instrument.
#### How It Works
The arousal level is a weighted sum of four normalized features:
| Feature | Weight | Source | Score = 0 | Score = 1 |
|---------|--------|--------|-----------|-----------|
| Breathing rate | 0.30 | Host Tier 2 | 6-10 BPM (calm) | >= 20 BPM (stressed) |
| Heart rate | 0.20 | Host Tier 2 | <= 70 BPM (baseline) | 100+ BPM (elevated) |
| Fidget energy | 0.30 | Motion successive diffs | No fidgeting | Continuous fidgeting |
| Phase variance | 0.20 | Subcarrier variance | Stable signal | Sharp body movements |
The stress index uses different weights (0.4/0.3/0.2/0.1) emphasizing breathing and heart rate over fidgeting.
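The weighted sum can be sketched directly from the table. The linear ramps between each feature's 0-score and 1-score anchors are an illustrative assumption (the module's exact normalization curves may differ), and fidget/phase-variance are treated as already normalized to [0, 1]:

```rust
/// Linearly map `x` from [lo, hi] to [0, 1], clamped.
fn norm(x: f32, lo: f32, hi: f32) -> f32 {
    ((x - lo) / (hi - lo)).clamp(0.0, 1.0)
}

/// Arousal as the weighted sum from the feature table above.
pub fn arousal(breathing_bpm: f32, heart_bpm: f32, fidget: f32, phase_var: f32) -> f32 {
    0.30 * norm(breathing_bpm, 10.0, 20.0)   // calm <= 10 BPM, stressed >= 20
        + 0.20 * norm(heart_bpm, 70.0, 100.0) // baseline <= 70, elevated >= 100
        + 0.30 * fidget.clamp(0.0, 1.0)
        + 0.20 * phase_var.clamp(0.0, 1.0)
}
```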
#### Events
| Event | ID | Value | Frequency |
|-------|-----|-------|-----------|
| `AROUSAL_LEVEL` | 610 | Continuous arousal [0, 1] | Every frame |
| `STRESS_INDEX` | 611 | Stress index [0, 1] | Every frame |
| `CALM_DETECTED` | 612 | 1.0 when calm state detected | When conditions met |
| `AGITATION_DETECTED` | 613 | 1.0 when agitation detected | When conditions met |
#### Discrete State Detection
- **Calm**: arousal < 0.25 AND motion < 0.08 AND breathing 6-10 BPM AND breath CV < 0.08
- **Agitation**: arousal > 0.75 AND (motion > 0.6 OR fidget > 0.15 OR breath CV > 0.25)
#### API
```rust
let mut detector = EmotionDetector::new();
let events = detector.process_frame(
breathing_bpm, // f32
heart_rate_bpm, // f32
motion_energy, // f32
phase, // f32 (unused in current implementation)
variance, // f32
);
let arousal = detector.arousal(); // f32 [0, 1]
let stress = detector.stress_index(); // f32 [0, 1]
let calm = detector.is_calm(); // bool
let agitated = detector.is_agitated(); // bool
```
---
### Sign Language Recognition (`exo_gesture_language.rs`)
**What it does**: Classifies hand/arm movements into sign language letter groups using WiFi CSI phase and amplitude patterns. Uses DTW (Dynamic Time Warping) template matching on compact 6D feature sequences.
**Maturity**: Research
**Limitations**: Full 26-letter ASL alphabet recognition via WiFi is extremely challenging. This module provides a proof-of-concept framework. Real-world accuracy depends heavily on: (a) template quality and diversity, (b) environmental stability, (c) person-to-person variation. Expect proof-of-concept accuracy, not production ASL translation.
#### How It Works
1. **Feature extraction**: Per frame, compute 6 features: mean phase, phase spread, mean amplitude, amplitude spread, motion energy, variance. These are accumulated in a gesture window (max 32 frames).
2. **Gesture segmentation**: Active gestures are bounded by pauses (low motion for 15+ frames). When a pause is detected, the accumulated gesture window is matched against templates.
3. **DTW matching**: Each template is a reference feature sequence. Multivariate DTW with Sakoe-Chiba band (width=4) computes the alignment distance. The best match below threshold (0.5) is accepted.
4. **Word boundaries**: Extended pauses (15+ low-motion frames) emit word boundary events.
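Steps 2-3 reduce to band-limited DTW over 6D feature frames. A simplified sketch (squared Euclidean frame distance, heap-allocated cost matrix for clarity; the on-device version is `no_std` and differs in detail):

```rust
const BAND: usize = 4; // Sakoe-Chiba band half-width

fn frame_dist(a: &[f32; 6], b: &[f32; 6]) -> f32 {
    a.iter().zip(b).map(|(x, y)| (x - y) * (x - y)).sum()
}

/// Band-limited DTW distance between two 6D feature sequences,
/// normalized by combined length. Infinite if either input is empty.
pub fn dtw(a: &[[f32; 6]], b: &[[f32; 6]]) -> f32 {
    let (n, m) = (a.len(), b.len());
    if n == 0 || m == 0 {
        return f32::INFINITY;
    }
    let mut cost = vec![vec![f32::INFINITY; m + 1]; n + 1];
    cost[0][0] = 0.0;
    for i in 1..=n {
        // Restrict j to the Sakoe-Chiba band around the diagonal.
        let lo = i.saturating_sub(BAND).max(1);
        let hi = (i + BAND).min(m);
        for j in lo..=hi {
            let step = cost[i - 1][j - 1]
                .min(cost[i - 1][j])
                .min(cost[i][j - 1]);
            cost[i][j] = frame_dist(&a[i - 1], &b[j - 1]) + step;
        }
    }
    cost[n][m] / (n + m) as f32
}
```

The band keeps the alignment near the diagonal, which both bounds the cost (O(n * BAND) cells) and rejects pathological warps.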
#### Events
| Event | ID | Value | Frequency |
|-------|-----|-------|-----------|
| `LETTER_RECOGNIZED` | 620 | Letter index (0=A, ..., 25=Z) | On match after pause |
| `LETTER_CONFIDENCE` | 621 | Inverse DTW distance [0, 1] | With recognized letter |
| `WORD_BOUNDARY` | 622 | 1.0 | After extended pause |
| `GESTURE_REJECTED` | 623 | 1.0 | When gesture does not match |
#### API
```rust
let mut detector = GestureLanguageDetector::new();
// Load templates (required before recognition works)
detector.load_synthetic_templates(); // 26 ramp-pattern templates for testing
// OR load custom templates:
detector.set_template(0, &features_for_letter_a); // 0 = 'A'
let events = detector.process_frame(
&phases, // &[f32]: per-subcarrier phase
&amplitudes, // &[f32]: per-subcarrier amplitude
variance, // f32
motion_energy, // f32
presence, // i32
);
```
---
### Music Conductor Tracking (`exo_music_conductor.rs`)
**What it does**: Extracts musical conducting parameters from WiFi CSI motion signatures: tempo (BPM), beat position (1-4 in 4/4 time), dynamic level (MIDI velocity 0-127), and special gestures (cutoff and fermata).
**Maturity**: Research
**Research basis**: Gesture tracking via WiFi CSI has been demonstrated for coarse arm movements. Conductor tracking extends this to periodic rhythmic motion analysis.
#### How It Works
1. **Tempo detection**: Autocorrelation of a 128-point motion energy buffer at lags 4-64. The dominant peak determines the period, converted to BPM: `BPM = 60 * 20 / lag` (at 20 Hz frame rate). Valid range: 30-240 BPM.
2. **Beat position**: A modular frame counter relative to the detected period maps to beats 1-4 in 4/4 time.
3. **Dynamic level**: Motion energy relative to the EMA-smoothed peak, scaled to MIDI velocity [0, 127].
4. **Cutoff detection**: Sharp drop in motion energy (ratio < 0.2 of recent peak) with high preceding motion.
5. **Fermata detection**: Sustained low motion (< 0.05) for 10+ consecutive frames.
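Step 1's tempo extraction can be sketched as a mean-removed autocorrelation scan over lags 4-64, with the documented `BPM = 60 * 20 / lag` conversion (peak picking details are illustrative assumptions):

```rust
const FRAME_HZ: f32 = 20.0;

/// Estimate tempo (BPM) from a motion-energy buffer via autocorrelation
/// over lags 4..=64. Returns None if no usable peak in the 30-240 BPM range.
pub fn tempo_bpm(motion: &[f32]) -> Option<f32> {
    let n = motion.len();
    if n < 16 {
        return None;
    }
    let mean = motion.iter().sum::<f32>() / n as f32;
    let mut best_lag = 0;
    let mut best_score = 0.0f32;
    for lag in 4..=64.min(n - 1) {
        // Unnormalized autocorrelation at this lag, mean-removed.
        let mut score = 0.0;
        for i in lag..n {
            score += (motion[i] - mean) * (motion[i - lag] - mean);
        }
        if score > best_score {
            best_score = score;
            best_lag = lag;
        }
    }
    if best_lag == 0 {
        return None;
    }
    let bpm = 60.0 * FRAME_HZ / best_lag as f32; // lag in frames -> BPM
    (30.0..=240.0).contains(&bpm).then_some(bpm)
}
```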
#### Events
| Event | ID | Value | Frequency |
|-------|-----|-------|-----------|
| `CONDUCTOR_BPM` | 630 | Detected tempo in BPM | After tempo lock |
| `BEAT_POSITION` | 631 | Beat number (1-4) | After tempo lock |
| `DYNAMIC_LEVEL` | 632 | MIDI velocity [0, 127] | Every frame |
| `GESTURE_CUTOFF` | 633 | 1.0 | On cutoff gesture |
| `GESTURE_FERMATA` | 634 | 1.0 | During fermata hold |
#### API
```rust
let mut detector = MusicConductorDetector::new();
let events = detector.process_frame(
phase, // f32 (unused)
amplitude, // f32 (unused)
motion_energy, // f32: from Tier 2 DSP
variance, // f32 (unused)
);
let bpm = detector.tempo_bpm(); // f32
let fermata = detector.is_fermata(); // bool
let cutoff = detector.is_cutoff(); // bool
```
---
### Plant Growth Detection (`exo_plant_growth.rs`)
**What it does**: Detects plant growth and leaf movement from micro-CSI changes over hours/days. Plants cause extremely slow, monotonic drift in CSI amplitude (growth) and diurnal phase oscillations (circadian leaf movement -- nyctinasty).
**Maturity**: Research
**Requirements**: Room must be empty (`presence == 0`) to isolate plant-scale perturbations from human motion. This module is designed for long-running monitoring (hours to days).
#### How It Works
- **Growth rate**: Tracks the slow drift of amplitude baseline via a very slow EWMA (alpha=0.0001, half-life ~175 seconds). Plant growth produces continuous ~0.01 dB/hour amplitude decrease as new leaf area intercepts RF energy.
- **Circadian phase**: Tracks peak-to-trough oscillation in phase EWMA over a rolling window. Nyctinastic leaf movement (folding at night) produces ~24-hour oscillations.
- **Wilting detection**: Short-term amplitude rises above baseline (less absorption) combined with reduced phase variance.
- **Watering event**: Abrupt amplitude drop (more water = more RF absorption) followed by recovery.
#### Events
| Event | ID | Value | Frequency |
|-------|-----|-------|-----------|
| `GROWTH_RATE` | 640 | Amplitude drift rate (scaled) | Every 100 empty-room frames |
| `CIRCADIAN_PHASE` | 641 | Oscillation magnitude [0, 1] | When oscillation detected |
| `WILT_DETECTED` | 642 | 1.0 | When wilting signature seen |
| `WATERING_EVENT` | 643 | 1.0 | When watering signature seen |
#### API
```rust
let mut detector = PlantGrowthDetector::new();
let events = detector.process_frame(
&amplitudes, // &[f32]: per-subcarrier amplitudes (up to 32)
&phases, // &[f32]: per-subcarrier phases (up to 32)
&variance, // &[f32]: per-subcarrier variance (up to 32)
presence, // i32: 0 = empty room (required for detection)
);
let calibrated = detector.is_calibrated(); // true after MIN_EMPTY_FRAMES
let empty = detector.empty_frames(); // frames of empty-room data
```
---
### Ghost Hunter -- Environmental Anomaly Detector (`exo_ghost_hunter.rs`)
**What it does**: Monitors CSI when no humans are detected for any perturbation above the noise floor. When the room should be empty but CSI changes are detected, something unexplained is happening. Classifies anomalies by their temporal signature.
**Maturity**: Experimental
**Practical applications**: Despite the playful name, this module has serious uses: detecting HVAC compressor cycling, pest/animal movement, structural settling, gas leaks (which alter dielectric properties), hidden intruders who evade the primary presence detector, and electromagnetic interference.
#### Anomaly Classification
| Class | Code | Signature | Typical Sources |
|-------|------|-----------|----------------|
| Impulsive | 1 | < 5 frames, sharp transient | Object falling, thermal cracking |
| Periodic | 2 | Recurring, detectable autocorrelation peak | HVAC, appliances, pest movement |
| Drift | 3 | 30+ frames same-sign amplitude delta | Temperature change, humidity, gas leak |
| Random | 4 | Stochastic, no pattern | EMI, co-channel WiFi interference |
#### Hidden Presence Detection
A sub-detector looks for breathing signatures in the phase signal: periodic oscillation at 0.2-2.0 Hz via autocorrelation at lags 5-15 (at 20 Hz frame rate). This can detect a motionless person who evades the main presence detector.
#### Events
| Event | ID | Value | Frequency |
|-------|-----|-------|-----------|
| `ANOMALY_DETECTED` | 650 | Energy level [0, 1] | When anomaly active |
| `ANOMALY_CLASS` | 651 | 1-4 (see table above) | With anomaly detection |
| `HIDDEN_PRESENCE` | 652 | Confidence [0, 1] | When breathing signature found |
| `ENVIRONMENTAL_DRIFT` | 653 | Drift magnitude | When sustained drift detected |
#### API
```rust
let mut detector = GhostHunterDetector::new();
let events = detector.process_frame(
&phases, // &[f32]
&amplitudes, // &[f32]
&variance, // &[f32]
presence, // i32: must be 0 for detection
motion_energy, // f32
);
let class = detector.anomaly_class(); // AnomalyClass enum
let hidden = detector.hidden_presence_confidence(); // f32 [0, 1]
let energy = detector.anomaly_energy(); // f32
```
---
### Rain Detection (`exo_rain_detect.rs`)
**What it does**: Detects rain from broadband CSI phase variance perturbations caused by raindrop impacts on building surfaces. Classifies intensity as light, moderate, or heavy.
**Maturity**: Experimental
**Research basis**: Raindrops impacting surfaces produce broadband impulse vibrations that propagate through building structure and modulate CSI phase. These are distinguishable from human motion by their broadband nature (all subcarrier groups affected equally), stochastic timing, and small amplitude.
#### How It Works
1. **Requires empty room** (`presence == 0`) to avoid confounding with human motion.
2. **Broadband criterion**: Compute per-group variance ratio (short-term / baseline). If >= 75% of groups (6/8) have elevated variance (ratio > 2.5x), the signal is broadband -- consistent with rain.
3. **Hysteresis state machine**: Onset requires 10 consecutive broadband frames; cessation requires 20 consecutive quiet frames.
4. **Intensity classification**: Based on smoothed excess energy above baseline.
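The broadband criterion in step 2 is a simple count over the 8 subcarrier groups. A standalone sketch using the thresholds stated above:

```rust
const GROUPS: usize = 8;
const RATIO_THRESH: f32 = 2.5; // short-term variance must exceed 2.5x baseline
const MIN_ELEVATED: usize = 6; // 75% of 8 groups

/// Broadband test: rain-like if >= 6 of 8 subcarrier groups show
/// short-term variance above 2.5x their calibrated baseline.
pub fn is_broadband(short_term: &[f32; GROUPS], baseline: &[f32; GROUPS]) -> bool {
    let elevated = short_term
        .iter()
        .zip(baseline)
        .filter(|(s, b)| **s > RATIO_THRESH * **b)
        .count();
    elevated >= MIN_ELEVATED
}
```

Human motion typically elevates only a few groups (those crossing the body's Fresnel zone), so it fails this test even at high energy.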
#### Events
| Event | ID | Value | Frequency |
|-------|-----|-------|-----------|
| `RAIN_ONSET` | 660 | 1.0 | On rain start |
| `RAIN_INTENSITY` | 661 | 1=light, 2=moderate, 3=heavy | While raining |
| `RAIN_CESSATION` | 662 | 1.0 | On rain stop |
#### Intensity Thresholds
| Level | Code | Energy Range |
|-------|------|-------------|
| None | 0 | (not raining) |
| Light | 1 | energy < 0.3 |
| Moderate | 2 | 0.3 <= energy < 0.7 |
| Heavy | 3 | energy >= 0.7 |
#### API
```rust
let mut detector = RainDetector::new();
let events = detector.process_frame(
&phases, // &[f32]
&variance, // &[f32]
&amplitudes, // &[f32]
presence, // i32: must be 0
);
let raining = detector.is_raining(); // bool
let intensity = detector.intensity(); // RainIntensity enum
let energy = detector.energy(); // f32 [0, 1]
```
---
### Breathing Synchronization (`exo_breathing_sync.rs`)
**What it does**: Detects when multiple people's breathing patterns synchronize. Extracts per-person breathing components via subcarrier group decomposition and computes pairwise normalized cross-correlation.
**Maturity**: Research
**Research basis**: Breathing synchronization (interpersonal physiological synchrony) is a known phenomenon in couples, parent-infant pairs, and close social groups. This module attempts to detect it contactlessly via WiFi CSI.
#### How It Works
1. **Per-person decomposition**: With N persons, the 8 subcarrier groups are divided among persons (e.g., 2 persons = 4 groups each). Each person's phase signal is bandpass-filtered to the breathing band using dual EWMA (DC removal + low-pass).
2. **Pairwise correlation**: For each pair, compute normalized zero-lag cross-correlation over a 64-sample buffer: `rho = sum(x_i * x_j) / sqrt(sum(x_i^2) * sum(x_j^2))`
3. **Synchronization state machine**: High correlation (|rho| > 0.6) for 20+ consecutive frames declares synchronization. Low correlation for 15+ frames declares sync lost.
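The pairwise correlation in step 2 is a few lines of arithmetic. A standalone sketch (`zero_lag_corr` is an illustrative helper, not the module's API):

```rust
/// Normalized zero-lag cross-correlation:
/// rho = sum(x_i * y_i) / sqrt(sum(x_i^2) * sum(y_i^2))
fn zero_lag_corr(x: &[f32], y: &[f32]) -> f32 {
    let dot: f32 = x.iter().zip(y).map(|(a, b)| a * b).sum();
    let ex: f32 = x.iter().map(|a| a * a).sum();
    let ey: f32 = y.iter().map(|b| b * b).sum();
    let denom = (ex * ey).sqrt();
    if denom < 1e-9 { 0.0 } else { dot / denom }
}

fn main() {
    // Two in-phase "breathing" signals over a 64-sample buffer.
    let x: Vec<f32> = (0..64).map(|i| (i as f32 * 0.2).sin()).collect();
    let y: Vec<f32> = x.iter().map(|v| 0.5 * v).collect(); // scaled copy
    assert!(zero_lag_corr(&x, &y) > 0.99); // amplitude-invariant

    // Anti-phase signals correlate negatively.
    let z: Vec<f32> = x.iter().map(|v| -v).collect();
    assert!(zero_lag_corr(&x, &z) < -0.99);
}
```

Because the metric is normalized, it is insensitive to per-person signal amplitude; the |rho| > 0.6 threshold then compares phase alignment only.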
#### Events
| Event | ID | Value | Frequency |
|-------|-----|-------|-----------|
| `SYNC_DETECTED` | 670 | 1.0 | On sync onset |
| `SYNC_PAIR_COUNT` | 671 | Number of synced pairs | On count change |
| `GROUP_COHERENCE` | 672 | Average coherence [0, 1] | Every 10 frames |
| `SYNC_LOST` | 673 | 1.0 | On sync loss |
#### Constraints
- Maximum 4 persons (6 pairwise comparisons)
- Requires >= 8 subcarriers and >= 2 persons
- 64-frame warmup before analysis begins
#### API
```rust
let mut detector = BreathingSyncDetector::new();
let events = detector.process_frame(
&phases, // &[f32]: per-subcarrier phases
&variance, // &[f32]: per-subcarrier variance
breathing_bpm, // f32: host aggregate (unused internally)
n_persons, // i32: number of persons detected
);
let synced = detector.is_synced(); // bool
let coherence = detector.group_coherence(); // f32 [0, 1]
let persons = detector.active_persons(); // usize
```
---
### Time Crystal Detection (`exo_time_crystal.rs`)
**What it does**: Detects temporal symmetry breaking patterns -- specifically period doubling -- in motion energy. A "time crystal" in this context means the system oscillates at a sub-harmonic of the driving frequency. The module also counts independent non-harmonic periodic components as a "coordination index" for multi-person temporal coordination.
**Maturity**: Research
**Background**: In condensed matter physics, discrete time crystals exhibit period doubling under periodic driving. This module applies the same mathematical criterion (autocorrelation peak at lag L AND lag 2L) to human motion patterns. Two people walking at different cadences produce independent periodic peaks at non-harmonic ratios.
#### How It Works
1. **Autocorrelation**: 256-point motion energy buffer, autocorrelation at lags 1-128. Pre-linearized for performance (eliminates modulus ops in inner loop).
2. **Period doubling**: Search for peaks where a strong autocorrelation at lag L is accompanied by a strong peak at lag 2L (+/- 2 frame tolerance).
3. **Coordination index**: Count peaks whose lags are not integer multiples of any other peak's lag (within 5% tolerance). These represent independent periodic motions.
4. **Stability tracking**: Crystal detection is tracked over 200-frame windows. The stability score is the fraction of frames where the crystal was detected, EMA-smoothed.
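The period-doubling test in step 2 can be sketched as follows. Helper names are illustrative; only the lag-L/lag-2L criterion and the +/- 2 frame tolerance come from the documented algorithm. (A purely periodic signal also passes this simple check, so the real module combines it with peak search and stability tracking.)

```rust
/// Normalized autocorrelation of `x` at the given lag.
fn autocorr(x: &[f32], lag: usize) -> f32 {
    let n = x.len();
    if lag >= n { return 0.0; }
    let num: f32 = (0..n - lag).map(|i| x[i] * x[i + lag]).sum();
    let den: f32 = x.iter().map(|v| v * v).sum();
    if den < 1e-9 { 0.0 } else { num / den }
}

/// Strong peak at lag L accompanied by a strong peak at 2L (+/- 2 frames).
fn has_period_doubling(x: &[f32], lag: usize, thresh: f32) -> bool {
    let peak_l = autocorr(x, lag);
    let peak_2l = (lag * 2 - 2..=lag * 2 + 2)
        .map(|l| autocorr(x, l))
        .fold(f32::MIN, f32::max);
    peak_l > thresh && peak_2l > thresh
}

fn main() {
    // Period-16 oscillation whose amplitude alternates every cycle:
    // the overall pattern repeats every 32 frames (period doubling).
    let x: Vec<f32> = (0..256)
        .map(|i| {
            let base = (i as f32 * std::f32::consts::TAU / 16.0).sin();
            let amp = if (i / 16) % 2 == 0 { 1.0 } else { 0.5 };
            base * amp
        })
        .collect();
    assert!(has_period_doubling(&x, 16, 0.3));
}
```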
#### Events
| Event | ID | Value | Frequency |
|-------|-----|-------|-----------|
| `CRYSTAL_DETECTED` | 680 | Period multiplier (2 = doubling) | When detected |
| `CRYSTAL_STABILITY` | 681 | Stability score [0, 1] | Every frame |
| `COORDINATION_INDEX` | 682 | Non-harmonic peak count | When > 0 |
#### API
```rust
let mut detector = TimeCrystalDetector::new();
let events = detector.process_frame(motion_energy);
let detected = detector.is_detected(); // bool
let multiplier = detector.multiplier(); // u8 (0 or 2)
let stability = detector.stability(); // f32 [0, 1]
let coordination = detector.coordination_index(); // u8
```
---
### Hyperbolic Space Embedding (`exo_hyperbolic_space.rs`)
**What it does**: Embeds CSI fingerprints into a 2D Poincare disk to exploit the natural hierarchy of indoor spaces (rooms contain zones). Hyperbolic geometry provides exponentially more representational capacity near the boundary, ideal for tree-structured location taxonomies.
**Maturity**: Research
**Research basis**: Hyperbolic embeddings have been shown to outperform Euclidean embeddings for hierarchical data (Nickel & Kiela, 2017). This module applies the concept to indoor localization.
#### How It Works
1. **Feature extraction**: 8D vector from mean amplitude across 8 subcarrier groups.
2. **Linear projection**: 2x8 matrix maps features to 2D Poincare disk coordinates.
3. **Normalization**: If the projected point exceeds the disk boundary, scale to radius 0.95.
4. **Nearest reference**: Compute Poincare distance to 16 reference points and find the closest.
5. **Hierarchy level**: Points near the center (radius < 0.5) are room-level; points near the boundary are zone-level.
#### Poincare Distance
```
d(x, y) = acosh(1 + 2 * ||x-y||^2 / ((1 - ||x||^2) * (1 - ||y||^2)))
```
This metric respects the hyperbolic geometry: distances near the boundary grow exponentially.
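The metric can be written directly from the formula above. A standalone sketch using `std`'s `acosh` (the `no_std` module itself uses `libm`'s `acoshf`):

```rust
/// Poincare distance between two points inside the unit disk:
/// d(x, y) = acosh(1 + 2*||x-y||^2 / ((1 - ||x||^2) * (1 - ||y||^2)))
fn poincare_dist(x: [f32; 2], y: [f32; 2]) -> f32 {
    let d2 = (x[0] - y[0]).powi(2) + (x[1] - y[1]).powi(2);
    let nx = 1.0 - (x[0] * x[0] + x[1] * x[1]);
    let ny = 1.0 - (y[0] * y[0] + y[1] * y[1]);
    (1.0 + 2.0 * d2 / (nx * ny)).acosh()
}

fn main() {
    // Near the center the metric is close to Euclidean...
    let near = poincare_dist([0.0, 0.0], [0.1, 0.0]);
    // ...but the same Euclidean step near the boundary is much longer,
    // which is what gives zone-level points their extra capacity.
    let far = poincare_dist([0.85, 0.0], [0.95, 0.0]);
    assert!(far > 5.0 * near);

    // Identical points are at distance zero.
    assert!(poincare_dist([0.3, 0.4], [0.3, 0.4]).abs() < 1e-6);
}
```

The radius-0.95 normalization in step 3 keeps both `(1 - ||x||^2)` factors bounded away from zero, so the distance never blows up.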
#### Default Reference Layout
| Index | Label | Radius | Description |
|-------|-------|--------|-------------|
| 0-3 | Rooms | 0.3 | Bathroom, Kitchen, Living room, Bedroom |
| 4-6 | Zone 0a-c | 0.7 | Bathroom sub-zones |
| 7-9 | Zone 1a-c | 0.7 | Kitchen sub-zones |
| 10-12 | Zone 2a-c | 0.7 | Living room sub-zones |
| 13-15 | Zone 3a-c | 0.7 | Bedroom sub-zones |
#### Events
| Event | ID | Value | Frequency |
|-------|-----|-------|-----------|
| `HIERARCHY_LEVEL` | 685 | 0 = room, 1 = zone | Every frame |
| `HYPERBOLIC_RADIUS` | 686 | Disk radius [0, 1) | Every frame |
| `LOCATION_LABEL` | 687 | Nearest reference (0-15) | Every frame |
#### API
```rust
let mut embedder = HyperbolicEmbedder::new();
let events = embedder.process_frame(&amplitudes);
let label = embedder.label(); // u8 (0-15)
let pos = embedder.position(); // &[f32; 2]
// Custom calibration:
embedder.set_reference(0, [0.2, 0.1]);
embedder.set_projection_row(0, [0.05, 0.03, 0.02, 0.01, -0.01, -0.02, -0.03, -0.04]);
```
---
## Event ID Registry (600-699)
| Range | Module | Events |
|-------|--------|--------|
| 600-603 | Dream Stage | SLEEP_STAGE, SLEEP_QUALITY, REM_EPISODE, DEEP_SLEEP_RATIO |
| 610-613 | Emotion Detect | AROUSAL_LEVEL, STRESS_INDEX, CALM_DETECTED, AGITATION_DETECTED |
| 620-623 | Gesture Language | LETTER_RECOGNIZED, LETTER_CONFIDENCE, WORD_BOUNDARY, GESTURE_REJECTED |
| 630-634 | Music Conductor | CONDUCTOR_BPM, BEAT_POSITION, DYNAMIC_LEVEL, GESTURE_CUTOFF, GESTURE_FERMATA |
| 640-643 | Plant Growth | GROWTH_RATE, CIRCADIAN_PHASE, WILT_DETECTED, WATERING_EVENT |
| 650-653 | Ghost Hunter | ANOMALY_DETECTED, ANOMALY_CLASS, HIDDEN_PRESENCE, ENVIRONMENTAL_DRIFT |
| 660-662 | Rain Detect | RAIN_ONSET, RAIN_INTENSITY, RAIN_CESSATION |
| 670-673 | Breathing Sync | SYNC_DETECTED, SYNC_PAIR_COUNT, GROUP_COHERENCE, SYNC_LOST |
| 680-682 | Time Crystal | CRYSTAL_DETECTED, CRYSTAL_STABILITY, COORDINATION_INDEX |
| 685-687 | Hyperbolic Space | HIERARCHY_LEVEL, HYPERBOLIC_RADIUS, LOCATION_LABEL |
## Code Quality Notes
All 10 modules have been reviewed for:
- **Edge cases**: Division by zero is guarded everywhere (explicit checks before division, EPSILON constants). Negative variance from floating-point rounding is clamped to zero. Empty buffers return safe defaults.
- **NaN protection**: All computations use `libm` functions (`sqrtf`, `acoshf`, `sinf`) which are well-defined for valid inputs. Inputs are validated before reaching math functions.
- **Buffer safety**: All `CircularBuffer` accesses use the `get(i)` method which returns 0.0 for out-of-bounds indices. Fixed-size arrays prevent overflow.
- **Range clamping**: All outputs that represent ratios or probabilities are clamped to [0, 1]. MIDI velocity is clamped to [0, 127]. Poincare disk coordinates are normalized to radius < 1.
- **Test coverage**: Each module has 7-10 tests covering: construction, warmup period, happy path detection, edge cases (no presence, insufficient data), range validation, and reset.
## Research References
1. Liu, J., et al. "Monitoring Vital Signs and Postures During Sleep Using WiFi Signals." IEEE Internet of Things Journal, 2018. -- WiFi-based sleep monitoring using CSI breathing patterns.
2. Zhao, M., et al. "Through-Wall Human Pose Estimation Using Radio Signals." CVPR 2018. -- RF-based pose estimation foundations.
3. Wang, H., et al. "RT-Fall: A Real-Time and Contactless Fall Detection System with Commodity WiFi Devices." IEEE Transactions on Mobile Computing, 2017. -- WiFi CSI for human activity recognition.
4. Li, H., et al. "WiFinger: Talk to Your Smart Devices with Finger Gesture." UbiComp 2016. -- WiFi-based gesture recognition using CSI.
5. Ma, Y., et al. "SignFi: Sign Language Recognition Using WiFi." ACM IMWUT, 2018. -- WiFi CSI for sign language.
6. Nickel, M. & Kiela, D. "Poincare Embeddings for Learning Hierarchical Representations." NeurIPS 2017. -- Hyperbolic embedding foundations.
7. Wang, W., et al. "Understanding and Modeling of WiFi Signal Based Human Activity Recognition." MobiCom 2015. -- CSI-based activity recognition.
8. Adib, F., et al. "Smart Homes that Monitor Breathing and Heart Rate." CHI 2015. -- Contactless vital sign monitoring via RF signals.
## Contributing New Research Modules
### Adding a New Exotic Module
1. **Choose an event ID range**: Use the next available range in the 600-699 block. Check `lib.rs` event_types for allocated IDs.
2. **Create the source file**: Name it `exo_<name>.rs` in `src/`. Follow the existing pattern:
- Module-level doc comment with algorithm description, events, and budget
- `const fn new()` constructor
- `process_frame()` returning `&[(i32, f32)]` via static buffer
- Public accessor methods for key state
- `reset()` method
3. **Register in `lib.rs`**: Add `pub mod exo_<name>;` in the Category 6 section.
4. **Register event constants**: Add entries to `event_types` in `lib.rs`.
5. **Update this document**: Add the module to the overview table and write its section.
6. **Testing requirements**:
- At minimum: `test_const_new`, `test_warmup_no_events`, one happy-path detection test, `test_reset`
- Test edge cases: empty input, extreme values, insufficient data
- Verify all output values are in their documented ranges
- Run: `cargo test --features std -- exo_` (from within the wasm-edge crate directory)
### Design Constraints
- **`no_std`**: No heap allocation. Use `CircularBuffer`, `Ema`, `WelfordStats` from `vendor_common`.
- **Stack budget**: Keep total struct size reasonable. The ESP32-S3 WASM3 stack is limited.
- **Time budget**: Stay within your declared budget (L < 2ms, S < 5ms, H < 10ms at 20 Hz).
- **Static events**: Use a `static mut EVENTS` array for zero-allocation event returns.
- **Input validation**: Always check array lengths, handle missing data gracefully.
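The pattern described above, condensed into a compilable skeleton. All names (`ExoExampleDetector`, `EVENT_EXAMPLE`, event ID 690) are placeholders, and this sketch uses an instance event buffer rather than the `static mut EVENTS` array real modules use, to keep it self-contained:

```rust
const EVENT_EXAMPLE: i32 = 690; // hypothetical next free ID in 600-699
const WARMUP_FRAMES: u32 = 64;

pub struct ExoExampleDetector {
    frames: u32,
    events: [(i32, f32); 4], // fixed-size, zero-allocation event buffer
}

impl ExoExampleDetector {
    pub const fn new() -> Self {
        Self { frames: 0, events: [(0, 0.0); 4] }
    }

    /// Process one frame; returns emitted events as (id, value) pairs.
    pub fn process_frame(&mut self, motion_energy: f32) -> &[(i32, f32)] {
        self.frames += 1;
        if self.frames < WARMUP_FRAMES {
            return &self.events[..0]; // no events during warmup
        }
        let mut n = 0;
        if motion_energy > 0.5 {
            self.events[n] = (EVENT_EXAMPLE, motion_energy);
            n += 1;
        }
        &self.events[..n]
    }

    pub fn reset(&mut self) {
        self.frames = 0;
    }
}

fn main() {
    let mut det = ExoExampleDetector::new();
    // Warmup: no events, matching test_warmup_no_events.
    assert!(det.process_frame(0.9).is_empty());
    // After warmup, high motion emits the event (happy path).
    for _ in 0..WARMUP_FRAMES { det.process_frame(0.0); }
    let events = det.process_frame(0.9);
    assert_eq!(events.len(), 1);
    assert_eq!(events[0], (EVENT_EXAMPLE, 0.9));
}
```

The `main` above doubles as the minimum test set (construction, warmup, happy path); in the crate these would live in a `#[cfg(test)]` module run via `cargo test --features std`.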
# Industrial & Specialized Modules -- WiFi-DensePose Edge Intelligence
> Worker safety and compliance monitoring using WiFi CSI signals. Works through
> dust, smoke, shelving, and walls where cameras fail. Designed for warehouses,
> factories, clean rooms, farms, and construction sites.
**ADR-041 Category 5 | Event IDs 500--599 | Crate `wifi-densepose-wasm-edge`**
## Safety Warning
These modules are **supplementary monitoring tools**. They do NOT replace:
- Certified safety systems (SIL-rated controllers, safety PLCs)
- Gas detectors, O2 monitors, or LEL sensors
- OSHA-required personal protective equipment
- Physical barriers, guardrails, or interlocks
- Trained safety attendants or rescue teams
Always deploy alongside certified primary safety systems. WiFi CSI sensing is
susceptible to environmental changes (new metal objects, humidity, temperature)
that can cause false negatives. Calibrate regularly and validate against ground
truth.
---
## Overview
| Module | File | What It Does | Event IDs | Budget |
|---|---|---|---|---|
| Forklift Proximity | `ind_forklift_proximity.rs` | Warns when pedestrians are near moving forklifts/AGVs | 500--502 | S (<5 ms) |
| Confined Space | `ind_confined_space.rs` | Monitors worker vitals in tanks, manholes, vessels | 510--514 | L (<2 ms) |
| Clean Room | `ind_clean_room.rs` | Personnel count and turbulent motion for ISO 14644 | 520--523 | L (<2 ms) |
| Livestock Monitor | `ind_livestock_monitor.rs` | Animal health monitoring in pens, barns, enclosures | 530--533 | L (<2 ms) |
| Structural Vibration | `ind_structural_vibration.rs` | Seismic, resonance, and structural drift detection | 540--543 | H (<10 ms) |
---
## Modules
### Forklift Proximity Warning (`ind_forklift_proximity.rs`)
**What it does**: Warns when a person is too close to a moving forklift, AGV,
or mobile robot, even around blind corners and through shelving racks.
**How it works**: The module separates forklift signatures from human
signatures using three CSI features:
1. **Amplitude ratio**: Large metal bodies (forklifts) produce 2--5x amplitude
increases across all subcarriers relative to an empty-warehouse baseline.
2. **Low-frequency phase dominance**: Forklifts move slowly (<0.3 Hz phase
modulation) compared to walking humans (0.5--2 Hz). The module computes
the ratio of low-frequency energy to total phase energy.
3. **Motor vibration**: Electric forklift motors produce elevated, uniform
variance across subcarriers (>0.08 threshold).
When all three conditions are met for 4 consecutive frames (debounced), the
module declares a vehicle present. If a human signature (host-reported
presence + motion energy >0.15) co-occurs, a proximity warning is emitted
with a distance category derived from amplitude ratio.
#### API
```rust
pub struct ForkliftProximityDetector { /* ... */ }
impl ForkliftProximityDetector {
/// Create a new detector. Requires 100-frame calibration (~5 s at 20 Hz).
pub const fn new() -> Self;
/// Process one CSI frame. Returns events as (event_id, value) pairs.
pub fn process_frame(
&mut self,
phases: &[f32], // per-subcarrier phase values
amplitudes: &[f32], // per-subcarrier amplitude values
variance: &[f32], // per-subcarrier variance values
motion_energy: f32, // host-reported motion energy
presence: i32, // host-reported presence flag (0/1)
n_persons: i32, // host-reported person count
) -> &[(i32, f32)];
/// Whether a vehicle is currently detected.
pub fn is_vehicle_present(&self) -> bool;
/// Current amplitude ratio (proxy for vehicle proximity).
pub fn amplitude_ratio(&self) -> f32;
}
```
#### Events Emitted
| Event ID | Constant | Value | Meaning |
|---|---|---|---|
| 500 | `EVENT_PROXIMITY_WARNING` | Distance category: 0.0 = critical, 1.0 = warning, 2.0 = caution | Person dangerously close to vehicle |
| 501 | `EVENT_VEHICLE_DETECTED` | Amplitude ratio (float) | Forklift/AGV entered sensor zone |
| 502 | `EVENT_HUMAN_NEAR_VEHICLE` | Motion energy (float) | Human detected in vehicle zone (fires once on transition) |
#### State Machine
```
+-----------+
| |
+-------->| No Vehicle|<---------+
| | | |
| +-----+-----+ |
| | |
| amp_ratio > 2.5 AND |
| low_freq_dominant AND | debounce drops
| vibration > 0.08 | below threshold
| (4 frames debounce) |
| | |
| +-----v-----+ |
| | |----------+
+---------| Vehicle |
| Present |
+-----+-----+
|
human present | (presence + motion > 0.15)
+ debounce |
+-----v-----+
| Proximity |----> EVENT 500 (cooldown 40 frames)
| Warning |----> EVENT 502 (once on transition)
+-----------+
```
#### Configuration
| Parameter | Default | Range | Safety Implication |
|---|---|---|---|
| `FORKLIFT_AMP_RATIO` | 2.5 | 1.5--5.0 | Lower = more sensitive, more false positives |
| `HUMAN_MOTION_THRESH` | 0.15 | 0.05--0.5 | Lower = catches slow-moving workers |
| `VEHICLE_DEBOUNCE` | 4 frames | 2--10 | Higher = fewer false alarms, slower response |
| `PROXIMITY_DEBOUNCE` | 2 frames | 1--5 | Higher = fewer false alarms, slower response |
| `ALERT_COOLDOWN` | 40 frames (2 s) | 10--200 | Lower = more frequent warnings |
| `DIST_CRITICAL` | amp ratio > 4.0 | -- | Very close proximity |
| `DIST_WARNING` | amp ratio > 3.0 | -- | Close proximity |
#### Example Usage
```rust
use wifi_densepose_wasm_edge::ind_forklift_proximity::ForkliftProximityDetector;
let mut detector = ForkliftProximityDetector::new();
// Calibration phase: feed 100 frames of empty warehouse
for _ in 0..100 {
detector.process_frame(&phases, &amps, &variance, 0.0, 0, 0);
}
// Normal operation
let events = detector.process_frame(&phases, &amps, &variance, 0.5, 1, 1);
for &(event_id, value) in events {
match event_id {
500 => {
let category = match value as i32 {
0 => "CRITICAL -- stop forklift immediately",
1 => "WARNING -- reduce speed",
_ => "CAUTION -- be alert",
};
trigger_alarm(category);
}
        501 => println!("Vehicle detected, amplitude ratio: {}", value),
        502 => println!("Human entered vehicle zone"),
_ => {}
}
}
```
#### Tutorial: Setting Up Warehouse Proximity Alerts
1. **Sensor placement**: Mount one ESP32 WiFi sensor per aisle, at shelf
height (1.5--2 m). Each sensor covers approximately one aisle width
(3--4 m) and 10--15 m of aisle length.
2. **Calibration**: Power on during a quiet period (no forklifts, no
workers). The module auto-calibrates over the first 100 frames (5 s
at 20 Hz). The baseline amplitude represents the empty aisle.
3. **Threshold tuning**: If false alarms occur due to hand trucks or
pallet jacks, increase `FORKLIFT_AMP_RATIO` from 2.5 to 3.0. If
forklifts are missed, decrease to 2.0.
4. **Integration**: Connect `EVENT_PROXIMITY_WARNING` (500) to a warning
light (amber for caution/warning, red for critical) and audible alarm.
Connect to the facility SCADA system for logging.
5. **Validation**: Walk through the aisle while a forklift operates.
Verify all three distance categories trigger at appropriate ranges.
---
### Confined Space Monitor (`ind_confined_space.rs`)
**What it does**: Monitors workers inside tanks, manholes, vessels, or any
enclosed space. Confirms they are breathing and alerts if they stop moving
or breathing.
**Compliance**: Designed to support OSHA 29 CFR 1910.146 confined space
entry requirements. The module provides continuous proof-of-life monitoring
to supplement (not replace) the required safety attendant.
**How it works**: Uses debounced presence detection to track entry/exit
transitions. While a worker is inside, the module continuously monitors
two vital indicators:
1. **Breathing**: Host-reported breathing BPM must stay above 4.0 BPM.
If breathing is not detected for 300 frames (15 seconds at 20 Hz),
an extraction alert is emitted.
2. **Motion**: Host-reported motion energy must stay above 0.02. If no
motion is detected for 1200 frames (60 seconds), an immobility alert
is emitted.
The module transitions between `Empty`, `Present`, `BreathingCeased`, and
`Immobile` states. When breathing or motion resumes, the state recovers
back to `Present`.
#### API
```rust
pub enum WorkerState {
Empty, // No worker in the space
Present, // Worker present, vitals normal
BreathingCeased, // No breathing detected (danger)
Immobile, // No motion detected (danger)
}
pub struct ConfinedSpaceMonitor { /* ... */ }
impl ConfinedSpaceMonitor {
pub const fn new() -> Self;
/// Process one frame.
pub fn process_frame(
&mut self,
presence: i32, // host-reported presence (0/1)
breathing_bpm: f32, // host-reported breathing rate
motion_energy: f32, // host-reported motion energy
variance: f32, // mean CSI variance
) -> &[(i32, f32)];
/// Current worker state.
pub fn state(&self) -> WorkerState;
/// Whether a worker is inside the space.
pub fn is_worker_inside(&self) -> bool;
/// Seconds since last confirmed breathing.
pub fn seconds_since_breathing(&self) -> f32;
/// Seconds since last detected motion.
pub fn seconds_since_motion(&self) -> f32;
}
```
#### Events Emitted
| Event ID | Constant | Value | Meaning |
|---|---|---|---|
| 510 | `EVENT_WORKER_ENTRY` | 1.0 | Worker entered the confined space |
| 511 | `EVENT_WORKER_EXIT` | 1.0 | Worker exited the confined space |
| 512 | `EVENT_BREATHING_OK` | BPM (float) | Periodic breathing confirmation (~every 5 s) |
| 513 | `EVENT_EXTRACTION_ALERT` | Seconds since last breath | No breathing for >15 s -- initiate rescue |
| 514 | `EVENT_IMMOBILE_ALERT` | Seconds without motion | No motion for >60 s -- check on worker |
#### State Machine
```
+---------+
| Empty |<----------+
+----+----+ |
| |
presence | | absence (10 frames)
(10 frames) | |
v |
+---------+ |
+------>| Present |-----------+
| +----+----+
| | |
| breathing | no | no motion
| resumes | breathing| (1200 frames)
| | (300 |
| | frames) |
| +----v------+ |
+-------|Breathing | |
| | Ceased | |
| +-----------+ |
| |
| +-----------+ |
+-------| Immobile |<--+
+-----------+
motion resumes -> Present
```
#### Configuration
| Parameter | Default | Range | Safety Implication |
|---|---|---|---|
| `BREATHING_CEASE_FRAMES` | 300 (15 s) | 100--600 | Lower = faster alert, more false positives |
| `IMMOBILE_FRAMES` | 1200 (60 s) | 400--3600 | Lower = catches slower collapses |
| `MIN_BREATHING_BPM` | 4.0 | 2.0--8.0 | Lower = more tolerant of slow breathing |
| `MIN_MOTION_ENERGY` | 0.02 | 0.005--0.1 | Lower = catches subtle movements |
| `ENTRY_EXIT_DEBOUNCE` | 10 frames | 5--30 | Higher = fewer false entry/exits |
| `MIN_PRESENCE_VAR` | 0.005 | 0.001--0.05 | Noise rejection for empty space |
#### Example Usage
```rust
use wifi_densepose_wasm_edge::ind_confined_space::{
ConfinedSpaceMonitor, WorkerState,
EVENT_EXTRACTION_ALERT, EVENT_IMMOBILE_ALERT,
};
let mut monitor = ConfinedSpaceMonitor::new();
// Process each CSI frame
let events = monitor.process_frame(presence, breathing_bpm, motion_energy, variance);
for &(event_id, value) in events {
match event_id {
513 => { // EXTRACTION_ALERT
activate_rescue_alarm();
notify_safety_attendant(value); // seconds since last breath
}
514 => { // IMMOBILE_ALERT
notify_safety_attendant(value); // seconds without motion
}
_ => {}
}
}
// Query state for dashboard display
match monitor.state() {
WorkerState::Empty => display_green("Space empty"),
WorkerState::Present => display_green("Worker OK"),
WorkerState::BreathingCeased => display_red("NO BREATHING"),
WorkerState::Immobile => display_amber("Worker immobile"),
}
```
---
### Clean Room Monitor (`ind_clean_room.rs`)
**What it does**: Tracks personnel count and movement patterns in cleanrooms
to enforce ISO 14644 occupancy limits and detect turbulent motion that could
disturb laminar airflow.
**How it works**: Uses the host-reported person count with debounced
violation detection. Turbulent motion (rapid movement with energy >0.6) is
flagged because it disrupts the laminar airflow that keeps particulate counts
low. The module maintains a running compliance percentage for audit reporting.
#### API
```rust
pub struct CleanRoomMonitor { /* ... */ }
impl CleanRoomMonitor {
/// Create with default max occupancy of 4.
pub const fn new() -> Self;
/// Create with custom maximum occupancy.
pub const fn with_max_occupancy(max: u8) -> Self;
/// Process one frame.
pub fn process_frame(
&mut self,
n_persons: i32, // host-reported person count
presence: i32, // host-reported presence (0/1)
motion_energy: f32, // host-reported motion energy
) -> &[(i32, f32)];
/// Current occupancy count.
pub fn current_count(&self) -> u8;
/// Maximum allowed occupancy.
pub fn max_occupancy(&self) -> u8;
/// Whether currently in violation.
pub fn is_in_violation(&self) -> bool;
/// Compliance percentage (0--100).
pub fn compliance_percent(&self) -> f32;
/// Total number of violation events.
pub fn total_violations(&self) -> u32;
}
```
#### Events Emitted
| Event ID | Constant | Value | Meaning |
|---|---|---|---|
| 520 | `EVENT_OCCUPANCY_COUNT` | Person count (float) | Occupancy changed |
| 521 | `EVENT_OCCUPANCY_VIOLATION` | Current count (float) | Count exceeds max allowed |
| 522 | `EVENT_TURBULENT_MOTION` | Motion energy (float) | Rapid movement detected (airflow risk) |
| 523 | `EVENT_COMPLIANCE_REPORT` | Compliance % (0--100) | Periodic compliance summary (~30 s) |
#### State Machine
```
+------------------+
| Monitoring |
| (count <= max) |
+--------+---------+
| count > max
| (10 frames debounce)
+--------v---------+
| Violation |----> EVENT 521 (cooldown 200 frames)
| (count > max) |
+--------+---------+
| count <= max
|
+--------v---------+
| Monitoring |
+------------------+
Parallel:
motion_energy > 0.6 (3 frames) ----> EVENT 522 (cooldown 100 frames)
Every 600 frames (~30 s) ----------> EVENT 523 (compliance %)
```
#### Configuration
| Parameter | Default | Range | Safety Implication |
|---|---|---|---|
| `DEFAULT_MAX_OCCUPANCY` | 4 | 1--255 | Per ISO 14644 room class |
| `TURBULENT_MOTION_THRESH` | 0.6 | 0.3--0.9 | Lower = stricter movement control |
| `VIOLATION_DEBOUNCE` | 10 frames | 3--20 | Higher = tolerates brief over-counts |
| `VIOLATION_COOLDOWN` | 200 frames (10 s) | 40--600 | Alert repeat interval |
| `COMPLIANCE_REPORT_INTERVAL` | 600 frames (30 s) | 200--6000 | Audit report frequency |
#### Example Usage
```rust
use wifi_densepose_wasm_edge::ind_clean_room::{
CleanRoomMonitor, EVENT_OCCUPANCY_VIOLATION, EVENT_COMPLIANCE_REPORT,
};
// ISO Class 5 cleanroom: max 3 personnel
let mut monitor = CleanRoomMonitor::with_max_occupancy(3);
let events = monitor.process_frame(n_persons, presence, motion_energy);
for &(event_id, value) in events {
match event_id {
521 => alert_cleanroom_supervisor(value as u8),
522 => alert_turbulent_motion(),
523 => log_compliance_audit(value),
_ => {}
}
}
// Dashboard
println!("Occupancy: {}/{}", monitor.current_count(), monitor.max_occupancy());
println!("Compliance: {:.1}%", monitor.compliance_percent());
```
---
### Livestock Monitor (`ind_livestock_monitor.rs`)
**What it does**: Monitors animal presence and health in pens, barns, and
enclosures. Detects abnormal stillness (possible illness), labored breathing,
and escape events.
**How it works**: Tracks presence with debounced entry/exit detection.
Monitors breathing rate against species-specific normal ranges. Detects
prolonged stillness (>5 minutes) as a sign of illness, and sudden absence
after confirmed presence as an escape event.
Species-specific breathing ranges:
| Species | Normal BPM | Labored: below | Labored: above |
|---|---|---|---|
| Cattle | 12--30 | 8.4 (0.7x min) | 39.0 (1.3x max) |
| Sheep | 12--20 | 8.4 (0.7x min) | 26.0 (1.3x max) |
| Poultry | 15--30 | 10.5 (0.7x min) | 39.0 (1.3x max) |
| Custom | configurable | 0.7x min | 1.3x max |
#### API
```rust
pub enum Species {
Cattle,
Sheep,
Poultry,
Custom { min_bpm: f32, max_bpm: f32 },
}
pub struct LivestockMonitor { /* ... */ }
impl LivestockMonitor {
/// Create with default species (Cattle).
pub const fn new() -> Self;
/// Create with a specific species.
pub const fn with_species(species: Species) -> Self;
/// Process one frame.
pub fn process_frame(
&mut self,
presence: i32, // host-reported presence (0/1)
breathing_bpm: f32, // host-reported breathing rate
motion_energy: f32, // host-reported motion energy
variance: f32, // mean CSI variance (unused, reserved)
) -> &[(i32, f32)];
/// Whether an animal is currently detected.
pub fn is_animal_present(&self) -> bool;
/// Configured species.
pub fn species(&self) -> Species;
/// Minutes of stillness.
pub fn stillness_minutes(&self) -> f32;
/// Last observed breathing BPM.
pub fn last_breathing_bpm(&self) -> f32;
}
```
#### Events Emitted
| Event ID | Constant | Value | Meaning |
|---|---|---|---|
| 530 | `EVENT_ANIMAL_PRESENT` | BPM (float) | Periodic presence report (~10 s) |
| 531 | `EVENT_ABNORMAL_STILLNESS` | Minutes still (float) | No motion for >5 minutes |
| 532 | `EVENT_LABORED_BREATHING` | BPM (float) | Breathing outside normal range |
| 533 | `EVENT_ESCAPE_ALERT` | Minutes present before escape (float) | Animal suddenly absent after confirmed presence |
#### State Machine
```
+---------+
| Empty |<---------+
+----+----+ |
| |
presence | absence >= 20 frames
(10 frames) | (after >= 200 frames presence
v | -> EVENT 533 escape alert)
+---------+ |
| Present |----------+
+----+----+
|
no motion (6000 frames = 5 min) -> EVENT 531 (once)
breathing outside range (20 frames) -> EVENT 532 (repeating)
```
#### Configuration
| Parameter | Default | Range | Safety Implication |
|---|---|---|---|
| `STILLNESS_FRAMES` | 6000 (5 min) | 1200--12000 | Lower = earlier illness detection |
| `MIN_PRESENCE_FOR_ESCAPE` | 200 (10 s) | 60--600 | Minimum presence before escape counts |
| `ESCAPE_ABSENCE_FRAMES` | 20 (1 s) | 10--100 | Brief absences tolerated |
| `LABORED_DEBOUNCE` | 20 frames (1 s) | 5--60 | Lower = faster breathing alerts |
| `MIN_MOTION_ACTIVE` | 0.03 | 0.01--0.1 | Sensitivity to subtle movement |
#### Example Usage
```rust
use wifi_densepose_wasm_edge::ind_livestock_monitor::{
LivestockMonitor, Species, EVENT_ESCAPE_ALERT, EVENT_LABORED_BREATHING,
};
// Dairy barn: monitor cows
let mut monitor = LivestockMonitor::with_species(Species::Cattle);
let events = monitor.process_frame(presence, breathing_bpm, motion_energy, variance);
for &(event_id, value) in events {
match event_id {
532 => alert_veterinarian(value), // labored breathing BPM
533 => alert_farm_security(value), // escape: minutes present before loss
531 => log_health_concern(value), // minutes of stillness
_ => {}
}
}
```
---
### Structural Vibration Monitor (`ind_structural_vibration.rs`)
**What it does**: Detects building vibration, seismic activity, and structural
stress using CSI phase stability. Only operates when the monitored space is
unoccupied (human movement masks structural signals).
**How it works**: When no humans are present, WiFi CSI phase is highly stable
(noise floor ~0.02 rad). The module detects three types of structural events:
1. **Seismic**: Broadband energy increase (>60% of subcarriers affected,
RMS >0.15 rad). Indicates earthquake, heavy vehicle pass-by, or
construction activity.
2. **Mechanical resonance**: Narrowband peaks detected via autocorrelation
of the mean-phase time series. A peak-to-mean ratio >3.0 with RMS above
2x noise floor indicates periodic mechanical vibration (HVAC, pumps,
rotating equipment).
3. **Structural drift**: Slow monotonic phase change across >50% of
subcarriers for >30 seconds. Indicates material stress, foundation
settlement, or thermal expansion.
#### API
```rust
pub struct StructuralVibrationMonitor { /* ... */ }
impl StructuralVibrationMonitor {
/// Create a new monitor. Requires 100-frame calibration when empty.
pub const fn new() -> Self;
/// Process one CSI frame.
pub fn process_frame(
&mut self,
phases: &[f32], // per-subcarrier phase values
amplitudes: &[f32], // per-subcarrier amplitude values
variance: &[f32], // per-subcarrier variance values
presence: i32, // 0 = empty (analyze), 1 = occupied (skip)
) -> &[(i32, f32)];
/// Current RMS vibration level.
pub fn rms_vibration(&self) -> f32;
/// Whether baseline has been established.
pub fn is_calibrated(&self) -> bool;
}
```
#### Events Emitted
| Event ID | Constant | Value | Meaning |
|---|---|---|---|
| 540 | `EVENT_SEISMIC_DETECTED` | RMS vibration level (rad) | Broadband seismic activity |
| 541 | `EVENT_MECHANICAL_RESONANCE` | Dominant frequency (Hz) | Narrowband mechanical vibration |
| 542 | `EVENT_STRUCTURAL_DRIFT` | Drift rate (rad/s) | Slow structural deformation |
| 543 | `EVENT_VIBRATION_SPECTRUM` | RMS level (rad) | Periodic spectrum report (~5 s) |
#### State Machine
```
+--------------+
| Calibrating | (100 frames, presence=0 required)
+------+-------+
|
+------v-------+
| Idle | (presence=1: skip analysis, reset drift)
| (Occupied) |
+------+-------+
| presence=0
+------v-------+
| Analyzing |
+------+-------+
|
+-----> RMS > 0.15 + broadband -------> EVENT 540 (seismic)
+-----> autocorr peak ratio > 3.0 ----> EVENT 541 (resonance)
+-----> monotonic drift > 30 s -------> EVENT 542 (drift)
+-----> every 100 frames -------------> EVENT 543 (spectrum)
```
#### Configuration
| Parameter | Default | Range | Safety Implication |
|---|---|---|---|
| `SEISMIC_THRESH` | 0.15 rad RMS | 0.05--0.5 | Lower = more sensitive to tremors |
| `RESONANCE_PEAK_RATIO` | 3.0 | 2.0--5.0 | Lower = detects weaker resonances |
| `DRIFT_RATE_THRESH` | 0.0005 rad/frame | 0.0001--0.005 | Lower = detects slower drift |
| `DRIFT_MIN_FRAMES` | 600 (30 s) | 200--2400 | Minimum drift duration before alert |
| `SEISMIC_DEBOUNCE` | 4 frames | 2--10 | Higher = fewer false seismic alerts |
| `SEISMIC_COOLDOWN` | 200 frames (10 s) | 40--600 | Alert repeat interval |
#### Example Usage
```rust
use wifi_densepose_wasm_edge::ind_structural_vibration::{
StructuralVibrationMonitor, EVENT_SEISMIC_DETECTED, EVENT_STRUCTURAL_DRIFT,
};
let mut monitor = StructuralVibrationMonitor::new();
// Calibrate during unoccupied period
for _ in 0..100 {
monitor.process_frame(&phases, &amps, &variance, 0);
}
assert!(monitor.is_calibrated());
// Normal operation
let events = monitor.process_frame(&phases, &amps, &variance, presence);
for &(event_id, value) in events {
match event_id {
540 => {
trigger_building_alarm();
log_seismic_event(value); // RMS vibration level
}
542 => {
notify_structural_engineer(value); // drift rate rad/s
}
_ => {}
}
}
```
---
## OSHA Compliance Notes
### Forklift Proximity (OSHA 29 CFR 1910.178)
- **Standard**: Powered Industrial Trucks -- operator must warn others.
- **Module supports**: Automated proximity detection supplements horn/light
warnings. Does NOT replace operator training, seat belts, or speed limits.
- **Additional equipment required**: Physical barriers, floor markings,
traffic mirrors, operator training program.
### Confined Space (OSHA 29 CFR 1910.146)
- **Standard**: Permit-Required Confined Spaces.
- **Module supports**: Continuous proof-of-life monitoring (breathing and
motion confirmation). Assists the required safety attendant.
- **Additional equipment required**:
- Atmospheric monitoring (O2, H2S, CO, LEL) -- the WiFi module cannot
detect gas hazards.
- Communication system between entrant and attendant.
- Rescue equipment (retrieval system, harness, tripod).
- Entry permit documenting hazards and controls.
- **Audit trail**: `EVENT_BREATHING_OK` (512) provides timestamped
proof-of-life records for compliance documentation.
### Clean Room (ISO 14644)
- **Standard**: Cleanrooms and associated controlled environments.
- **Module supports**: Real-time occupancy enforcement and turbulent motion
detection for particulate control.
- **Additional equipment required**: Particle counters, differential pressure
monitors, HEPA/ULPA filtration systems.
- **Documentation**: `EVENT_COMPLIANCE_REPORT` (523) provides periodic
compliance percentages for audit records.
### Livestock (no direct OSHA standard; see USDA Animal Welfare Act)
- **Module supports**: Automated health monitoring reduces manual inspection
burden. Escape detection supports perimeter security.
- **Additional equipment required**: Veterinary monitoring systems, proper
fencing, temperature/humidity sensors.
### Structural Vibration (OSHA 29 CFR 1926 Subpart P, Excavations)
- **Standard**: Structural stability requirements for construction.
- **Module supports**: Continuous vibration monitoring during unoccupied
periods. Seismic detection provides early warning.
- **Additional equipment required**: Certified structural inspection,
accelerometers for critical structures, tilt sensors.
---
## Deployment Guide
### Sensor Placement for Warehouse Coverage
```
+---+---+---+---+---+
| S | | | | S | S = WiFi sensor (ESP32)
+---+ Aisle 1 +---+ Mounted at shelf height (1.5-2 m)
| | | | One sensor per aisle intersection
+---+ Aisle 2 +---+
| S | | S | Coverage: ~15 m range per sensor
+---+---+---+---+---+ For proximity: sensor every 10 m along aisle
```
- Mount sensors at shelf height (1.5--2 m) for best human/forklift separation.
- Place at aisle intersections for blind-corner coverage.
- Each sensor covers approximately 10--15 m of aisle length.
- For critical zones (loading docks, charging areas), use overlapping sensors.
### Multi-Sensor Setup for Confined Spaces
```
Ground Level
+-----------+
| Sensor A | <-- Entry point monitoring
+-----+-----+
|
| Manhole / Hatch
|
+-----v-----+
| Sensor B | <-- Inside space (if possible)
+-----------+
```
- Sensor A at the entry point detects worker entry/exit.
- Sensor B inside the confined space (if safely mountable) provides
breathing and motion monitoring.
- If only one sensor is available, mount at the entry facing into the space.
- WiFi signals penetrate metal walls poorly -- use multiple sensors for
large vessels.
### Integration with Safety PLCs
Connect ESP32 event output to safety PLCs via:
1. **UDP**: The sensing server receives ESP32 CSI data and emits events
via REST API. Poll `/api/v1/events` for real-time alerts.
2. **Modbus TCP**: Use a gateway to convert UDP events to Modbus registers
for direct PLC integration.
3. **GPIO**: For hard-wired safety circuits, connect ESP32 GPIO outputs
to PLC safety inputs. Configure the ESP32 firmware to assert GPIO on
specific event IDs.
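For the GPIO option, the firmware-side mapping from event IDs to safety outputs can be a simple pure function. The pin assignments below are hypothetical examples, not shipped defaults; wire them to match your PLC safety inputs:

```rust
/// Map emitted edge events to a GPIO output bitmask for hard-wired
/// PLC safety circuits. Pin numbers are illustrative assumptions.
fn gpio_mask_for_events(events: &[(i32, f32)]) -> u32 {
    let mut mask = 0u32;
    for &(event_id, _value) in events {
        match event_id {
            500 => mask |= 1 << 0, // PROXIMITY_WARNING -> pin 0 (assumed)
            513 => mask |= 1 << 1, // EXTRACTION_ALERT  -> pin 1 (assumed)
            540 => mask |= 1 << 2, // SEISMIC_DETECTED  -> pin 2 (assumed)
            _ => {}
        }
    }
    mask
}
```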
### Calibration Checklist
1. Ensure the monitored space is in its normal empty state.
2. Power on the sensor and wait for calibration to complete:
- Forklift Proximity: 100 frames (5 seconds)
- Structural Vibration: 100 frames (5 seconds)
- Confined Space: No calibration needed (uses host presence)
- Clean Room: No calibration needed (uses host person count)
- Livestock: No calibration needed (uses host presence)
3. Validate by walking through the space and confirming presence detection.
4. For forklift proximity, drive a forklift through and verify vehicle
detection and proximity warnings at appropriate distances.
5. Document calibration date, sensor position, and firmware version.
---
## Event ID Registry (Category 5)
| Range | Module | Events |
|---|---|---|
| 500--502 | Forklift Proximity | `PROXIMITY_WARNING`, `VEHICLE_DETECTED`, `HUMAN_NEAR_VEHICLE` |
| 510--514 | Confined Space | `WORKER_ENTRY`, `WORKER_EXIT`, `BREATHING_OK`, `EXTRACTION_ALERT`, `IMMOBILE_ALERT` |
| 520--523 | Clean Room | `OCCUPANCY_COUNT`, `OCCUPANCY_VIOLATION`, `TURBULENT_MOTION`, `COMPLIANCE_REPORT` |
| 530--533 | Livestock Monitor | `ANIMAL_PRESENT`, `ABNORMAL_STILLNESS`, `LABORED_BREATHING`, `ESCAPE_ALERT` |
| 540--543 | Structural Vibration | `SEISMIC_DETECTED`, `MECHANICAL_RESONANCE`, `STRUCTURAL_DRIFT`, `VIBRATION_SPECTRUM` |
Total: 20 event types across 5 modules.

# Medical & Health Modules -- WiFi-DensePose Edge Intelligence
> Contactless health monitoring using WiFi signals. No wearables, no cameras -- just an ESP32 sensor reading WiFi reflections off a person's body to detect breathing problems, heart rhythm issues, walking difficulties, and seizures.
## Important Disclaimer
These modules are **research tools, not FDA-approved medical devices**. They should supplement -- not replace -- professional medical monitoring. WiFi CSI-derived vital signs are inherently noisier than clinical instruments (ECG, pulse oximetry, respiratory belts). False positives and false negatives will occur. Always validate findings against clinical-grade equipment before acting on alerts.
## Overview
| Module | File | What It Does | Event IDs | CPU Budget |
|--------|------|-------------|-----------|--------|
| Sleep Apnea Detection | `med_sleep_apnea.rs` | Detects apnea episodes when breathing ceases for >10s; tracks AHI score | 100-102 | L (< 2 ms) |
| Cardiac Arrhythmia | `med_cardiac_arrhythmia.rs` | Detects tachycardia, bradycardia, missed beats, HRV anomalies | 110-113 | S (< 5 ms) |
| Respiratory Distress | `med_respiratory_distress.rs` | Detects tachypnea, labored breathing, Cheyne-Stokes, composite distress score | 120-123 | H (< 10 ms) |
| Gait Analysis | `med_gait_analysis.rs` | Extracts step cadence, asymmetry, shuffling, festination, fall-risk score | 130-134 | H (< 10 ms) |
| Seizure Detection | `med_seizure_detect.rs` | Detects tonic-clonic seizures with phase discrimination (fall vs tremor) | 140-143 | S (< 5 ms) |
All modules:
- Compile to `no_std` for WASM (ESP32 WASM3 runtime)
- Use `const fn new()` for zero-cost initialization
- Return events via `&[(i32, f32)]` slices (no heap allocation)
- Include NaN and division-by-zero protections
- Implement cooldown timers to prevent event flooding
---
## Modules
### Sleep Apnea Detection (`med_sleep_apnea.rs`)
**What it does**: Monitors breathing rate from the host CSI pipeline and detects when breathing drops below 4 BPM for more than 10 consecutive seconds, indicating an apnea episode. It tracks all episodes and computes the Apnea-Hypopnea Index (AHI) -- the number of apnea events per hour of monitored sleep time. AHI is the standard clinical metric for sleep apnea severity.
**Clinical basis**: Obstructive and central sleep apnea are defined by cessation of airflow for 10 seconds or more. The module uses a breathing rate threshold of 4 BPM (essentially near-zero breathing) with a 10-second onset delay to confirm cessation is sustained. AHI severity classification: < 5 normal, 5-15 mild, 15-30 moderate, > 30 severe.
**How it works**:
1. Each second, checks if breathing BPM is below 4.0
2. Increments a consecutive-low-breath counter
3. After 10 consecutive seconds, declares apnea onset (backdated to when breathing first dropped)
4. When breathing resumes above 4 BPM, records the episode with its duration
5. Every 5 minutes, computes AHI = (total episodes) / (monitoring hours)
6. Only monitors when presence is detected; if subject leaves during apnea, the episode is ended
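The AHI math in steps 5-6 is straightforward. A minimal sketch of the calculation and the severity bands quoted above:

```rust
/// AHI = apnea episodes per hour of monitored sleep time.
fn ahi(episodes: u32, monitoring_seconds: u32) -> f32 {
    if monitoring_seconds == 0 {
        return 0.0;
    }
    episodes as f32 / (monitoring_seconds as f32 / 3600.0)
}

/// Clinical severity bands: < 5 normal, 5-15 mild, 15-30 moderate, > 30 severe.
fn ahi_severity(ahi: f32) -> &'static str {
    if ahi < 5.0 {
        "normal"
    } else if ahi < 15.0 {
        "mild"
    } else if ahi < 30.0 {
        "moderate"
    } else {
        "severe"
    }
}
```

For example, 8 episodes over 4 hours of monitored sleep gives AHI = 2.0, i.e. normal.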
#### API
| Item | Type | Description |
|------|------|-------------|
| `SleepApneaDetector` | struct | Main detector state |
| `SleepApneaDetector::new()` | `const fn` | Create detector with zeroed state |
| `process_frame(breathing_bpm, presence, variance)` | method | Process one frame at ~1 Hz; returns event slice |
| `ahi()` | method | Current AHI value |
| `episode_count()` | method | Total recorded apnea episodes |
| `monitoring_seconds()` | method | Total seconds with presence active |
| `in_apnea()` | method | Whether currently in an apnea episode |
| `APNEA_BPM_THRESH` | const | 4.0 BPM -- below this counts as apnea |
| `APNEA_ONSET_SECS` | const | 10 seconds -- minimum duration to declare apnea |
| `AHI_REPORT_INTERVAL` | const | 300 seconds (5 min) -- how often AHI is recalculated |
| `MAX_EPISODES` | const | 256 -- maximum episodes stored per session |
#### Events Emitted
| Event ID | Constant | Value | Clinical Meaning |
|----------|----------|-------|-----------------|
| 100 | `EVENT_APNEA_START` | Current breathing BPM | Breathing has ceased or dropped below 4 BPM for >10 seconds |
| 101 | `EVENT_APNEA_END` | Duration in seconds | Breathing has resumed after an apnea episode |
| 102 | `EVENT_AHI_UPDATE` | AHI score (events/hour) | Periodic severity metric; >5 = mild, >15 = moderate, >30 = severe |
#### State Machine
```
presence lost
[Monitoring] -----> [Not Monitoring] (no events, counter paused)
| |
| bpm < 4.0 | presence regained
v v
[Low Breath Counter] [Monitoring]
|
| count >= 10s
v
[In Apnea] ---------> [Episode End] (bpm >= 4.0 or presence lost)
| |
| v
| [Record Episode, emit APNEA_END]
|
+-- emit APNEA_START (once)
```
#### Configuration
| Parameter | Default | Clinical Range | Description |
|-----------|---------|----------------|-------------|
| `APNEA_BPM_THRESH` | 4.0 | 0-6 BPM | Breathing rate below which apnea is suspected |
| `APNEA_ONSET_SECS` | 10 | 10-20 s | Seconds of low breathing before apnea is declared |
| `AHI_REPORT_INTERVAL` | 300 | 60-3600 s | How often AHI is recalculated and emitted |
| `MAX_EPISODES` | 256 | -- | Fixed buffer size for episode history |
| `PRESENCE_ACTIVE` | 1 | -- | Minimum presence flag value for monitoring |
#### Example Usage
```rust
use wifi_densepose_wasm_edge::med_sleep_apnea::*;
let mut detector = SleepApneaDetector::new();
// Normal breathing -- no events
let events = detector.process_frame(14.0, 1, 0.1);
assert!(events.is_empty());
// Simulate apnea: feed low BPM for 15 seconds
for _ in 0..15 {
let events = detector.process_frame(1.0, 1, 0.1);
for &(event_id, value) in events {
match event_id {
EVENT_APNEA_START => println!("Apnea detected! BPM: {}", value),
_ => {}
}
}
}
assert!(detector.in_apnea());
// Resume normal breathing
let events = detector.process_frame(14.0, 1, 0.1);
for &(event_id, value) in events {
match event_id {
EVENT_APNEA_END => println!("Apnea ended after {} seconds", value),
_ => {}
}
}
println!("Episodes: {}", detector.episode_count());
println!("AHI: {:.1}", detector.ahi());
```
#### Tutorial: Setting Up Bedroom Sleep Monitoring
1. **ESP32 placement**: Mount the ESP32-S3 on the wall or ceiling 1-2 meters from the bed, at chest height. The sensor should have line-of-sight to the sleeping area. Avoid placing near metal objects or moving fans that create CSI interference.
2. **WiFi router**: Ensure a stable WiFi AP is within range. The ESP32 monitors the CSI (Channel State Information) of WiFi signals reflected off the person's body. The AP should be on the opposite side of the bed from the sensor for best body reflection capture.
3. **Firmware configuration**: Flash the ESP32 firmware with Tier 2 edge processing enabled (provides breathing BPM). The sleep apnea WASM module runs as a Tier 3 algorithm on top of the Tier 2 vitals output.
4. **Threshold tuning**: The default 4 BPM threshold is conservative (near-complete cessation). For a more sensitive detector, raise the threshold to 6-8 BPM, but expect more false positives from shallow breathing. The 10-second onset delay matches clinical apnea definitions.
5. **Reading AHI results**: AHI is emitted every 5 minutes. After a full night (7-8 hours), the final AHI value represents the overnight severity. Compare against clinical thresholds: < 5 (normal), 5-15 (mild), 15-30 (moderate), > 30 (severe).
6. **Limitations**: WiFi-based breathing detection works best when the subject is relatively still (sleeping). Tossing and turning may cause momentary breathing detection loss, which could either mask or falsely trigger apnea events. A single-night study should always be confirmed with clinical polysomnography.
---
### Cardiac Arrhythmia Detection (`med_cardiac_arrhythmia.rs`)
**What it does**: Monitors heart rate from the host CSI pipeline and detects four types of cardiac rhythm abnormalities: tachycardia (sustained fast heart rate), bradycardia (sustained slow heart rate), missed beats (sudden HR drops), and HRV anomalies (heart rate variability outside normal bounds).
**Clinical basis**: Tachycardia is defined as HR > 100 BPM sustained for 10+ seconds. Bradycardia is HR < 50 BPM sustained for 10+ seconds (the 50 BPM threshold is used instead of the typical 60 BPM to account for CSI measurement noise and to avoid false positives in athletes with naturally low resting HR). Missed beats are detected as a >30% drop from the running average. HRV is assessed via RMSSD (root mean square of successive differences) with a widened normal band (10-120 ms equivalent) to account for the coarser CSI-derived HR measurement compared to ECG.
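The "~17" scale factor quoted above presumably follows from RR(ms) = 60000 / HR(BPM): differentiating gives a successive-difference conversion of roughly 60000 / HR² ms per BPM, which is ~16.7 ms at a resting HR of 60. A sketch of that conversion (illustrative, not the module's exact code):

```rust
/// Approximate ms-equivalent of a successive HR difference in BPM,
/// linearized around the current heart rate: d(RR_ms) = 60000 / HR^2 per BPM.
fn bpm_diff_to_ms(hr_bpm: f32, diff_bpm: f32) -> f32 {
    diff_bpm * 60_000.0 / (hr_bpm * hr_bpm)
}
```

At higher heart rates the same BPM jitter corresponds to a smaller ms change, e.g. only ~4.2 ms per BPM at 120 BPM.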
**How it works**:
1. Maintains an exponential moving average (EMA) of heart rate with alpha=0.1
2. Tracks consecutive seconds above 100 BPM (tachycardia) or below 50 BPM (bradycardia)
3. After 10 consecutive seconds in an abnormal range, emits the corresponding alert
4. Computes fractional drop from EMA to detect missed beats
5. Maintains a 30-second ring buffer of successive HR differences for RMSSD calculation
6. RMSSD is converted from BPM units to approximate ms-equivalent (scale factor ~17)
7. All alerts have a 30-second cooldown to prevent event flooding
8. Invalid readings (< 1 BPM or NaN) are silently ignored to prevent contamination
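Steps 3-4 can be sketched as follows (simplified; the shipped detector also applies the 30-second cooldown and input validation around these checks):

```rust
const EMA_ALPHA: f32 = 0.1;
const MISSED_BEAT_DROP: f32 = 0.30;

/// One EMA update (step 3). Seeds the average on the first valid reading.
fn ema_update(ema: f32, hr: f32) -> f32 {
    if ema == 0.0 {
        hr
    } else {
        ema + EMA_ALPHA * (hr - ema)
    }
}

/// Missed-beat check (step 4): fractional drop from the running average.
fn is_missed_beat(ema: f32, hr: f32) -> bool {
    ema > 0.0 && (ema - hr) / ema > MISSED_BEAT_DROP
}
```

A reading of 45 BPM against a 72 BPM running average is a 37.5% drop, so it would flag a missed beat; 60 BPM (a 16.7% drop) would not.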
#### API
| Item | Type | Description |
|------|------|-------------|
| `CardiacArrhythmiaDetector` | struct | Main detector state |
| `CardiacArrhythmiaDetector::new()` | `const fn` | Create detector with zeroed state |
| `process_frame(hr_bpm, phase)` | method | Process one frame at ~1 Hz; returns event slice |
| `hr_ema()` | method | Current EMA heart rate |
| `frame_count()` | method | Total frames processed |
| `TACHY_THRESH` | const | 100.0 BPM |
| `BRADY_THRESH` | const | 50.0 BPM |
| `SUSTAINED_SECS` | const | 10 seconds |
| `MISSED_BEAT_DROP` | const | 0.30 (30% drop from EMA) |
| `HRV_WINDOW` | const | 30 seconds |
| `RMSSD_LOW` / `RMSSD_HIGH` | const | 10.0 / 120.0 ms (widened for CSI) |
| `COOLDOWN_SECS` | const | 30 seconds |
#### Events Emitted
| Event ID | Constant | Value | Clinical Meaning |
|----------|----------|-------|-----------------|
| 110 | `EVENT_TACHYCARDIA` | Current HR in BPM | Heart rate sustained above 100 BPM for 10+ seconds |
| 111 | `EVENT_BRADYCARDIA` | Current HR in BPM | Heart rate sustained below 50 BPM for 10+ seconds |
| 112 | `EVENT_MISSED_BEAT` | Current HR in BPM | Sudden HR drop >30% from running average |
| 113 | `EVENT_HRV_ANOMALY` | RMSSD value (ms) | Heart rate variability outside 10-120 ms normal range |
#### State Machine
The cardiac module does not have a formal state machine -- it uses independent detectors with cooldown timers:
```
For each frame:
1. Tick cooldowns (4 independent timers)
2. Reject invalid inputs (< 1 BPM or NaN)
3. Update EMA (alpha = 0.1)
4. Update RR-diff ring buffer
5. Check tachycardia (HR > 100 for 10+ consecutive seconds)
6. Check bradycardia (HR < 50 for 10+ consecutive seconds)
7. Check missed beat (>30% drop from EMA)
8. Check HRV anomaly (RMSSD outside 10-120 ms, requires full 30s window)
9. Each check respects its own 30-second cooldown
```
#### Configuration
| Parameter | Default | Clinical Range | Description |
|-----------|---------|----------------|-------------|
| `TACHY_THRESH` | 100.0 | 90-120 BPM | HR threshold for tachycardia |
| `BRADY_THRESH` | 50.0 | 40-60 BPM | HR threshold for bradycardia |
| `SUSTAINED_SECS` | 10 | 5-30 s | Consecutive seconds required for alert |
| `MISSED_BEAT_DROP` | 0.30 | 0.20-0.40 | Fractional HR drop to flag missed beat |
| `RMSSD_LOW` | 10.0 | 5-20 ms | Minimum normal RMSSD |
| `RMSSD_HIGH` | 120.0 | 80-150 ms | Maximum normal RMSSD |
| `EMA_ALPHA` | 0.1 | 0.05-0.2 | EMA smoothing coefficient |
| `COOLDOWN_SECS` | 30 | 10-60 s | Minimum time between repeated alerts |
#### Example Usage
```rust
use wifi_densepose_wasm_edge::med_cardiac_arrhythmia::*;
let mut detector = CardiacArrhythmiaDetector::new();
// Steady normal heart rate -- no arrhythmia alerts (a perfectly constant
// HR yields near-zero RMSSD, which may trip the low-HRV check, hence the allowance)
for _ in 0..60 {
let events = detector.process_frame(72.0, 0.0);
assert!(events.is_empty() || events.iter().all(|&(t, _)| t == EVENT_HRV_ANOMALY));
}
// Sustained tachycardia
for _ in 0..15 {
let events = detector.process_frame(120.0, 0.0);
for &(event_id, value) in events {
if event_id == EVENT_TACHYCARDIA {
println!("Tachycardia alert! HR: {} BPM", value);
}
}
}
```
---
### Respiratory Distress Detection (`med_respiratory_distress.rs`)
**What it does**: Detects four types of respiratory abnormalities from the host CSI pipeline: tachypnea (fast breathing), labored breathing (high amplitude variance), Cheyne-Stokes respiration (a crescendo-decrescendo breathing pattern), and a composite respiratory distress severity score from 0-100.
**Clinical basis**: Tachypnea is defined clinically as > 20 BPM in adults. This module uses a threshold of 25 BPM (more conservative) to reduce false positives from the inherently noisier CSI-derived breathing rate. Labored breathing is detected as a 3x increase in amplitude variance relative to a learned baseline. Cheyne-Stokes respiration is a pathological breathing pattern with 30-90 second periodicity, commonly associated with heart failure and neurological conditions. The module detects it via autocorrelation of the breathing amplitude envelope.
**How it works**:
1. Maintains a 120-second ring buffer of breathing BPM for autocorrelation analysis
2. Maintains a 60-second ring buffer of amplitude variance
3. Learns a baseline variance over the first 60 seconds (Welford online mean)
4. Checks for tachypnea: breathing rate > 25 BPM sustained for 8+ seconds
5. Checks for labored breathing: current variance > 3x baseline variance
6. Checks for Cheyne-Stokes: significant autocorrelation peak in 30-90s lag range
7. Computes composite distress score (0-100) every 30 seconds based on: rate deviation from normal (16 BPM center), variance ratio, tachypnea flag, and recent Cheyne-Stokes detection
8. NaN inputs are excluded from ring buffers to prevent contamination
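Steps 3, 5, and 8 can be sketched with a Welford-style running mean (illustrative only; the real module also windows the variance in its 60-second ring buffer):

```rust
/// Welford-style online mean used for the baseline variance (step 3).
struct RunningMean {
    n: u32,
    mean: f32,
}

impl RunningMean {
    const fn new() -> Self {
        Self { n: 0, mean: 0.0 }
    }
    fn push(&mut self, x: f32) {
        if x.is_nan() {
            return; // step 8: NaN inputs are excluded
        }
        self.n += 1;
        self.mean += (x - self.mean) / self.n as f32;
    }
}

/// Labored-breathing check (step 5): current variance vs 3x learned baseline.
fn is_labored(current_var: f32, baseline: &RunningMean) -> bool {
    baseline.n > 0 && baseline.mean > 0.0 && current_var > 3.0 * baseline.mean
}
```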
#### API
| Item | Type | Description |
|------|------|-------------|
| `RespiratoryDistressDetector` | struct | Main detector state |
| `RespiratoryDistressDetector::new()` | `const fn` | Create detector with zeroed state |
| `process_frame(breathing_bpm, phase, variance)` | method | Process one frame at ~1 Hz; returns event slice |
| `last_distress_score()` | method | Most recent composite score (0-100) |
| `frame_count()` | method | Total frames processed |
| `TACHYPNEA_THRESH` | const | 25.0 BPM (conservative; clinical is 20 BPM) |
| `SUSTAINED_SECS` | const | 8 seconds |
| `LABORED_VAR_RATIO` | const | 3.0x baseline |
| `CS_LAG_MIN` / `CS_LAG_MAX` | const | 30 / 90 seconds (Cheyne-Stokes period range) |
| `CS_PEAK_THRESH` | const | 0.35 (normalized autocorrelation) |
| `BASELINE_SECS` | const | 60 seconds (learning period) |
| `COOLDOWN_SECS` | const | 20 seconds |
#### Events Emitted
| Event ID | Constant | Value | Clinical Meaning |
|----------|----------|-------|-----------------|
| 120 | `EVENT_TACHYPNEA` | Current breathing BPM | Breathing rate sustained above 25 BPM for 8+ seconds |
| 121 | `EVENT_LABORED_BREATHING` | Variance ratio | Breathing effort > 3x baseline; possible respiratory distress |
| 122 | `EVENT_CHEYNE_STOKES` | Period in seconds | Crescendo-decrescendo breathing pattern; associated with heart failure |
| 123 | `EVENT_RESP_DISTRESS_LEVEL` | Score 0-100 | Composite severity: 0-20 normal, 20-50 mild, 50-80 moderate, 80-100 severe |
#### State Machine
The respiratory distress module uses independent detector tracks with cooldowns rather than a single state machine:
```
For each frame:
1. Tick cooldowns (3 independent timers)
2. Skip NaN inputs for ring buffer updates
3. Update breathing BPM ring buffer (120s) and variance ring buffer (60s)
4. Learn baseline variance during first 60 seconds (Welford)
5. Tachypnea check: BPM > 25 for 8+ consecutive seconds
6. Labored breathing: current variance mean > 3x baseline (after baseline period)
7. Cheyne-Stokes: autocorrelation peak > 0.35 in 30-90s lag range (needs full 120s buffer)
8. Composite distress score emitted every 30 seconds
```
#### Configuration
| Parameter | Default | Clinical Range | Description |
|-----------|---------|----------------|-------------|
| `TACHYPNEA_THRESH` | 25.0 | 20-30 BPM | Breathing rate for tachypnea alert |
| `SUSTAINED_SECS` | 8 | 5-15 s | Debounce period for tachypnea |
| `LABORED_VAR_RATIO` | 3.0 | 2.0-5.0 | Variance ratio above baseline |
| `AC_WINDOW` | 120 | 90-180 s | Autocorrelation buffer for Cheyne-Stokes |
| `CS_PEAK_THRESH` | 0.35 | 0.25-0.50 | Autocorrelation peak threshold |
| `CS_LAG_MIN` / `CS_LAG_MAX` | 30 / 90 | 20-120 s | Cheyne-Stokes period search range |
| `BASELINE_SECS` | 60 | 30-120 s | Duration to learn baseline variance |
| `DISTRESS_REPORT_INTERVAL` | 30 | 10-60 s | How often composite score is emitted |
| `COOLDOWN_SECS` | 20 | 10-60 s | Minimum time between repeated alerts |
#### Example Usage
```rust
use wifi_densepose_wasm_edge::med_respiratory_distress::*;
let mut detector = RespiratoryDistressDetector::new();
// Build baseline with normal breathing (60 seconds)
for _ in 0..60 {
detector.process_frame(16.0, 0.0, 0.5);
}
// Simulate respiratory distress: high rate + high variance
for _ in 0..30 {
let events = detector.process_frame(30.0, 0.0, 3.0);
for &(event_id, value) in events {
match event_id {
EVENT_TACHYPNEA => println!("Tachypnea! Rate: {} BPM", value),
EVENT_LABORED_BREATHING => println!("Labored breathing! Variance ratio: {:.1}x", value),
EVENT_RESP_DISTRESS_LEVEL => println!("Distress score: {:.0}/100", value),
_ => {}
}
}
}
```
#### Tutorial: Setting Up ICU/Ward Monitoring
1. **Placement**: Mount the ESP32 at the foot of the bed or on the ceiling directly above the patient. The sensor needs clear WiFi signal reflection from the patient's torso.
2. **Baseline learning**: The module automatically learns a 60-second baseline variance when first activated. Ensure the patient is breathing normally during this calibration period. If the patient is already in distress at module start, the baseline will be skewed and labored-breathing detection will be unreliable.
3. **Cheyne-Stokes detection**: Requires at least 120 seconds of data to begin autocorrelation analysis. The 30-90 second periodicity search range covers the clinically documented Cheyne-Stokes cycle range. In practice, detection typically becomes reliable after 3-4 minutes of monitoring.
4. **Distress score interpretation**: The composite score (0-100) combines four factors: rate deviation from normal, variance ratio, tachypnea presence, and Cheyne-Stokes detection. A score above 50 warrants clinical attention. Above 80 suggests acute distress.
---
### Gait Analysis (`med_gait_analysis.rs`)
**What it does**: Extracts gait parameters from CSI phase variance periodicity to assess mobility and fall risk. Detects step cadence, gait asymmetry (limping), stride variability, shuffling gait patterns (associated with Parkinson's disease), festination (involuntary acceleration), and computes a composite fall-risk score from 0-100.
**Clinical basis**: Normal walking cadence is 80-120 steps/min for healthy adults. Shuffling gait (>140 steps/min with low energy) is characteristic of Parkinson's disease and other neurological conditions. Festination (involuntary cadence acceleration) is a Parkinsonian feature. Gait asymmetry (left/right step interval ratio deviating from 1.0 by >15%) indicates limping or musculoskeletal issues. High stride variability (coefficient of variation) is a strong predictor of fall risk in elderly patients.
**How it works**:
1. Maintains a 60-second ring buffer of phase variance and motion energy
2. Detects steps as local maxima in the phase variance signal (peak-to-trough ratio > 1.5)
3. Records step intervals in a 64-entry buffer
4. Every 10 seconds, computes: cadence (60 / mean step interval), asymmetry (odd/even step interval ratio), variability (coefficient of variation)
5. Tracks cadence history over 6 reporting periods for festination detection
6. Shuffling is flagged when cadence > 140 and motion energy is low
7. Festination is detected as cadence accelerating by > 1.5 steps/min/sec
8. Fall-risk score (0-100) is a weighted composite of: abnormal cadence (25%), asymmetry (25%), variability (25%), low energy (15%), festination (10%)
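The per-window metrics in steps 4 can be sketched from a list of step intervals (simplified; the real analyzer works directly from its ring buffers and step-peak detector):

```rust
/// Gait metrics from recorded step intervals in seconds.
/// Returns (cadence in steps/min, asymmetry ratio, coefficient of variation).
fn gait_metrics(intervals: &[f32]) -> (f32, f32, f32) {
    let n = intervals.len();
    if n < 4 {
        return (0.0, 1.0, 0.0); // too few steps for a reliable window
    }
    let mean = intervals.iter().sum::<f32>() / n as f32;
    let cadence = 60.0 / mean;
    // Asymmetry: ratio of odd-indexed to even-indexed intervals (1.0 = symmetric)
    let (mut odd, mut even, mut no, mut ne) = (0.0f32, 0.0f32, 0, 0);
    for (i, &iv) in intervals.iter().enumerate() {
        if i % 2 == 0 {
            even += iv;
            ne += 1;
        } else {
            odd += iv;
            no += 1;
        }
    }
    let asym = (odd / no as f32) / (even / ne as f32);
    // Coefficient of variation as the stride-variability proxy
    let var = intervals.iter().map(|&x| (x - mean) * (x - mean)).sum::<f32>() / n as f32;
    let cv = var.sqrt() / mean;
    (cadence, asym, cv)
}
```

Uniform 0.6 s steps give a 100 steps/min cadence with asymmetry 1.0; alternating 0.5 s / 0.7 s steps keep the same cadence but push asymmetry to 1.4, past the 15% limp threshold.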
#### API
| Item | Type | Description |
|------|------|-------------|
| `GaitAnalyzer` | struct | Main analyzer state |
| `GaitAnalyzer::new()` | `const fn` | Create analyzer with zeroed state |
| `process_frame(phase, amplitude, variance, motion_energy)` | method | Process one frame at ~1 Hz; returns event slice |
| `last_cadence()` | method | Most recent cadence (steps/min) |
| `last_asymmetry()` | method | Most recent asymmetry ratio (1.0 = symmetric) |
| `last_fall_risk()` | method | Most recent fall-risk score (0-100) |
| `frame_count()` | method | Total frames processed |
| `NORMAL_CADENCE_LOW` / `HIGH` | const | 80.0 / 120.0 steps/min |
| `SHUFFLE_CADENCE_HIGH` | const | 140.0 steps/min |
| `ASYMMETRY_THRESH` | const | 0.15 (15% deviation from 1.0) |
| `FESTINATION_ACCEL` | const | 1.5 steps/min/sec |
| `REPORT_INTERVAL` | const | 10 seconds |
| `COOLDOWN_SECS` | const | 15 seconds |
#### Events Emitted
| Event ID | Constant | Value | Clinical Meaning |
|----------|----------|-------|-----------------|
| 130 | `EVENT_STEP_CADENCE` | Steps/min | Detected walking cadence; <80 or >120 is abnormal |
| 131 | `EVENT_GAIT_ASYMMETRY` | Ratio (1.0=symmetric) | Step interval asymmetry; >1.15 or <0.85 indicates limping |
| 132 | `EVENT_FALL_RISK_SCORE` | Score 0-100 | Composite: 0-25 low, 25-50 moderate, 50-75 high, 75-100 critical |
| 133 | `EVENT_SHUFFLING_DETECTED` | Cadence (steps/min) | High-frequency, low-amplitude gait; Parkinson's indicator |
| 134 | `EVENT_FESTINATION` | Cadence (steps/min) | Involuntary cadence acceleration; Parkinsonian feature |
#### State Machine
The gait analyzer operates on a periodic reporting cycle:
```
Continuous (every frame):
- Push variance and energy into ring buffers
- Detect step peaks (local max in variance > 1.5x neighbors)
- Record step intervals
Every REPORT_INTERVAL (10s), if >= 4 steps detected:
1. Compute cadence, asymmetry, variability
2. Emit EVENT_STEP_CADENCE
3. If asymmetry > threshold: emit EVENT_GAIT_ASYMMETRY
4. If cadence > 140 and energy < 0.3: emit EVENT_SHUFFLING_DETECTED
5. If cadence accelerating > 1.5/s over 3 periods: emit EVENT_FESTINATION
6. Compute and emit EVENT_FALL_RISK_SCORE
7. Reset step buffer for next window
```
#### Configuration
| Parameter | Default | Clinical Range | Description |
|-----------|---------|----------------|-------------|
| `GAIT_WINDOW` | 60 | 30-120 s | Ring buffer size for phase variance |
| `STEP_PEAK_RATIO` | 1.5 | 1.2-2.0 | Min peak-to-trough ratio for step detection |
| `NORMAL_CADENCE_LOW` | 80.0 | 70-90 steps/min | Lower bound of normal cadence |
| `NORMAL_CADENCE_HIGH` | 120.0 | 110-130 steps/min | Upper bound of normal cadence |
| `SHUFFLE_CADENCE_HIGH` | 140.0 | 120-160 steps/min | Cadence threshold for shuffling |
| `SHUFFLE_ENERGY_LOW` | 0.3 | 0.1-0.5 | Energy ceiling for shuffling detection |
| `FESTINATION_ACCEL` | 1.5 | 1.0-3.0 steps/min/s | Cadence acceleration threshold |
| `ASYMMETRY_THRESH` | 0.15 | 0.10-0.25 | Asymmetry ratio deviation from 1.0 |
| `REPORT_INTERVAL` | 10 | 5-30 s | Gait analysis reporting period |
| `MIN_MOTION_ENERGY` | 0.1 | 0.05-0.3 | Minimum energy for step detection |
| `COOLDOWN_SECS` | 15 | 10-30 s | Cooldown for shuffling/festination alerts |
#### Example Usage
```rust
use wifi_densepose_wasm_edge::med_gait_analysis::*;
let mut analyzer = GaitAnalyzer::new();
// Simulate walking with alternating high/low variance (steps)
for i in 0..30 {
let variance = if i % 2 == 0 { 5.0 } else { 0.5 };
let events = analyzer.process_frame(0.0, 1.0, variance, 1.0);
for &(event_id, value) in events {
match event_id {
EVENT_STEP_CADENCE => println!("Cadence: {:.0} steps/min", value),
EVENT_FALL_RISK_SCORE => println!("Fall risk: {:.0}/100", value),
EVENT_GAIT_ASYMMETRY => println!("Asymmetry: {:.2}", value),
_ => {}
}
}
}
```
#### Tutorial: Setting Up Hallway Gait Monitoring
1. **Placement**: Mount the ESP32 in a hallway or corridor at waist height on the wall. The walking path should be 3-5 meters long within the sensor's field of view. Position the WiFi AP at the opposite end of the hallway for optimal body reflection.
2. **Calibration**: The step detector relies on periodic peaks in phase variance. The `STEP_PEAK_RATIO` of 1.5 works well for most flooring surfaces. On carpet (which dampens impact signals), consider lowering to 1.2. On hard floors with shoes, 1.5-2.0 is appropriate.
3. **Clinical context**: The fall-risk score is most useful for longitudinal monitoring. A single reading provides a snapshot, but tracking trends over days/weeks reveals progressive mobility decline. A rising fall-risk score (e.g., from 20 to 40 over a month) warrants clinical assessment even if individual readings are below the "high risk" threshold.
4. **Limitations**: At a 1 Hz timer rate, the module cannot detect cadences above ~60 steps/min via direct peak counting. For higher cadences, step detection relies on the host's higher-rate CSI processing to pre-compute variance peaks. Shuffling detection at >140 steps/min therefore requires the host to provide step-level variance data at rates above 1 Hz.
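The clinical-context guidance above (flag a rising fall-risk trend rather than reacting to a single reading) can be sketched host-side. This is a hypothetical helper, not part of the module; the `rise_thresh` value and the first-vs-last comparison are illustrative assumptions.

```python
def fall_risk_trend(scores_by_day, rise_thresh=15.0):
    """Flag a sustained rise in daily mean fall-risk scores.

    scores_by_day: daily mean EVENT_FALL_RISK_SCORE values, oldest first.
    A simple first-vs-last comparison; a real deployment would likely
    fit a slope or use a moving average to reject noisy days.
    """
    if len(scores_by_day) < 2:
        return False
    return scores_by_day[-1] - scores_by_day[0] >= rise_thresh
```

With the example from the text (a score rising from 20 to 40 over a month), this flags the trend even though both endpoints sit below a typical "high risk" cutoff.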
---
### Seizure Detection (`med_seizure_detect.rs`)
**What it does**: Detects tonic-clonic (grand mal) seizures by identifying sustained high-energy rhythmic motion in the 3-8 Hz band. Discriminates seizures from falls (single impulse followed by stillness) and tremor (lower amplitude, higher regularity). Tracks seizure phases: tonic (sustained muscle rigidity), clonic (rhythmic jerking), and post-ictal (sudden cessation of movement).
**Clinical basis**: Tonic-clonic seizures have a characteristic progression: (1) tonic phase with sustained muscle rigidity causing high motion energy with low variance, lasting 10-20 seconds; (2) clonic phase with rhythmic jerking at 3-8 Hz, lasting 30-60 seconds; (3) post-ictal phase with sudden cessation of movement and deep unresponsiveness. Falls produce a brief (<10 frame) high-energy spike followed by stillness. Tremors have lower amplitude than seizure-grade jerking.
**How it works**:
1. Operates at ~20 Hz frame rate (higher than other modules) for rhythm detection
2. Maintains 100-frame ring buffers for motion energy and amplitude
3. State machine progresses: Monitoring -> PossibleOnset -> Tonic/Clonic -> PostIctal -> Cooldown
4. Onset requires 10+ consecutive frames of high motion energy (>2.0 normalized)
5. Fall discrimination: if high energy lasts < 10 frames then drops, it is classified as a fall and ignored
6. Tonic phase: high energy with low variance (< 0.5)
7. Clonic phase: detected via autocorrelation of amplitude buffer for 2-7 frame period (3-8 Hz at 20 Hz sampling)
8. Post-ictal: motion drops below 0.2 for 40+ consecutive frames
9. After an episode, 200-frame cooldown prevents re-triggering
10. Presence must be active; loss of presence resets the state machine
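The clonic rhythm check in step 7 is, at its core, a lag-limited autocorrelation over the amplitude buffer. A Python sketch of the same math (the on-device Rust operates on fixed-size ring buffers and its exact normalization may differ):

```python
def detect_clonic_rhythm(amplitudes, min_lag=2, max_lag=7, thresh=0.30):
    """Return (detected, best_lag) for a periodic amplitude buffer.

    Lags 2-7 frames correspond to roughly 3-8 Hz at 20 Hz sampling.
    Normalized autocorrelation: 1.0 means a perfect repeat at that lag.
    """
    n = len(amplitudes)
    mean = sum(amplitudes) / n
    centered = [a - mean for a in amplitudes]
    var = sum(c * c for c in centered)
    if var == 0.0:
        return False, 0  # flat signal: no rhythm
    best_lag, best_corr = 0, 0.0
    for lag in range(min_lag, max_lag + 1):
        corr = sum(centered[i] * centered[i - lag] for i in range(lag, n)) / var
        if corr > best_corr:
            best_lag, best_corr = lag, corr
    return best_corr >= thresh, best_lag
```

A jerking pattern repeating every 4 frames (5 Hz at 20 Hz sampling) produces a strong peak at lag 4, while a flat or random buffer stays below the 0.30 threshold.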
#### API
| Item | Type | Description |
|------|------|-------------|
| `SeizureDetector` | struct | Main detector state |
| `SeizureDetector::new()` | `const fn` | Create detector with zeroed state |
| `process_frame(phase, amplitude, motion_energy, presence)` | method | Process at ~20 Hz; returns event slice |
| `phase()` | method | Current `SeizurePhase` enum value |
| `seizure_count()` | method | Total seizure episodes detected |
| `frame_count()` | method | Total frames processed |
| `SeizurePhase` | enum | Monitoring, PossibleOnset, Tonic, Clonic, PostIctal, Cooldown |
| `HIGH_ENERGY_THRESH` | const | 2.0 (normalized) |
| `TONIC_MIN_FRAMES` | const | 20 frames (1 second at 20 Hz) |
| `CLONIC_PERIOD_MIN` / `MAX` | const | 2 / 7 frames (3-8 Hz at 20 Hz) |
| `POST_ICTAL_MIN_FRAMES` | const | 40 frames (2 seconds at 20 Hz) |
| `COOLDOWN_FRAMES` | const | 200 frames (10 seconds at 20 Hz) |
#### Events Emitted
| Event ID | Constant | Value | Clinical Meaning |
|----------|----------|-------|-----------------|
| 140 | `EVENT_SEIZURE_ONSET` | Motion energy | Seizure activity detected; immediate clinical attention needed |
| 141 | `EVENT_SEIZURE_TONIC` | Duration in frames | Tonic phase identified; sustained rigidity |
| 142 | `EVENT_SEIZURE_CLONIC` | Period in frames | Clonic phase identified; rhythmic jerking with detected periodicity |
| 143 | `EVENT_POST_ICTAL` | 1.0 | Post-ictal phase; movement has ceased after seizure |
#### State Machine
```
presence lost (from any active state)
+-----------------------------------------+
v |
[Monitoring] --> [PossibleOnset] --> [Tonic] --> [Clonic] --> [PostIctal] --> [Cooldown]
^ | | | | |
| | | +------> [PostIctal] -----+ |
| | | (direct if energy drops) |
| | +--------> [Clonic] |
| | (skip tonic) |
| | |
| +-- timeout (200 frames) --> [Monitoring] |
| +-- fall (<10 frames) -----> [Monitoring] |
| |
+------ cooldown expires (200 frames) ------------------------------------+
```
Transitions:
- **Monitoring -> PossibleOnset**: 10+ frames of motion energy > 2.0
- **PossibleOnset -> Tonic**: Low energy variance + high energy (muscle rigidity pattern)
- **PossibleOnset -> Clonic**: Rhythmic autocorrelation peak + amplitude above tremor floor
- **PossibleOnset -> Monitoring**: Energy drop within 10 frames (fall) or timeout at 200 frames
- **Tonic -> Clonic**: Energy variance increases and rhythm is detected
- **Tonic -> PostIctal**: Motion energy drops below 0.2 for 40+ frames
- **Clonic -> PostIctal**: Motion energy drops below 0.2 for 40+ frames
- **PostIctal -> Cooldown**: After 40 frames in post-ictal
- **Cooldown -> Monitoring**: After 200 frames (10 seconds)
#### Configuration
| Parameter | Default | Clinical Range | Description |
|-----------|---------|----------------|-------------|
| `ENERGY_WINDOW` / `PHASE_WINDOW` | 100 | 60-200 frames | Ring buffer sizes for analysis |
| `HIGH_ENERGY_THRESH` | 2.0 | 1.5-3.0 | Motion energy threshold for onset |
| `TONIC_ENERGY_THRESH` | 1.5 | 1.0-2.0 | Energy threshold during tonic phase |
| `TONIC_VAR_CEIL` | 0.5 | 0.3-1.0 | Max energy variance for tonic classification |
| `TONIC_MIN_FRAMES` | 20 | 10-40 frames | Min frames to confirm tonic phase |
| `CLONIC_PERIOD_MIN` / `MAX` | 2 / 7 | 2-10 frames | Period range for 3-8 Hz rhythm |
| `CLONIC_AUTOCORR_THRESH` | 0.30 | 0.20-0.50 | Autocorrelation threshold for rhythm |
| `CLONIC_MIN_FRAMES` | 30 | 20-60 frames | Min frames to confirm clonic phase |
| `POST_ICTAL_ENERGY_THRESH` | 0.2 | 0.1-0.5 | Energy threshold for cessation |
| `POST_ICTAL_MIN_FRAMES` | 40 | 20-80 frames | Min frames of low energy |
| `FALL_MAX_DURATION` | 10 | 5-20 frames | Max high-energy duration classified as fall |
| `TREMOR_AMPLITUDE_FLOOR` | 0.8 | 0.5-1.5 | Min amplitude to distinguish from tremor |
| `COOLDOWN_FRAMES` | 200 | 100-400 frames | Cooldown after episode completes |
| `ONSET_MIN_FRAMES` | 10 | 5-20 frames | Min high-energy frames before onset |
#### Example Usage
```rust
use wifi_densepose_wasm_edge::med_seizure_detect::*;
let mut detector = SeizureDetector::new();
// Normal motion -- no seizure
for _ in 0..200 {
let events = detector.process_frame(0.0, 0.5, 0.3, 1);
assert!(events.is_empty());
}
assert_eq!(detector.phase(), SeizurePhase::Monitoring);
// Tonic phase: sustained high energy, low variance
for _ in 0..50 {
let events = detector.process_frame(0.0, 2.0, 3.0, 1);
for &(event_id, value) in events {
match event_id {
EVENT_SEIZURE_ONSET => println!("SEIZURE ONSET! Energy: {}", value),
EVENT_SEIZURE_TONIC => println!("Tonic phase: {} frames", value),
_ => {}
}
}
}
// Post-ictal: sudden cessation
for _ in 0..100 {
let events = detector.process_frame(0.0, 0.05, 0.05, 1);
for &(event_id, _) in events {
if event_id == EVENT_POST_ICTAL {
println!("Post-ictal phase detected -- patient needs immediate assessment");
}
}
}
```
#### Tutorial: Setting Up Seizure Monitoring
1. **Placement**: Mount the ESP32 on the ceiling directly above the bed or monitoring area. Seizure detection requires the highest sensitivity to body motion, so minimize distance to the patient. Ensure no other people or moving objects are in the sensor's field of view (pets, curtains, fans).
2. **Frame rate**: Unlike other medical modules that operate at 1 Hz, the seizure detector expects ~20 Hz frame input for accurate rhythm detection in the 3-8 Hz band. Ensure the host firmware is configured for high-rate CSI processing when this module is loaded.
3. **Sensitivity tuning**: The `HIGH_ENERGY_THRESH` of 2.0 and `ONSET_MIN_FRAMES` of 10 balance sensitivity against false positives. In a quiet bedroom environment, these defaults work well. In noisier environments (shared ward, nearby equipment vibration), consider raising `HIGH_ENERGY_THRESH` to 2.5-3.0.
4. **Fall vs seizure discrimination**: The module automatically distinguishes falls (brief energy spike < 10 frames) from seizures (sustained energy). If the patient is known to be a fall risk, consider running the gait analysis module in parallel for complementary monitoring.
5. **Response protocol**: When `EVENT_SEIZURE_ONSET` fires, immediately notify clinical staff. The `EVENT_POST_ICTAL` event indicates the active seizure has ended and the patient is entering post-ictal state -- they need assessment but are no longer in the convulsive phase.
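A minimal host-side dispatcher for this response protocol might look like the following. The event IDs come from the table above; `notify_staff` and `log_episode` are placeholder callbacks for whatever alerting stack you use.

```python
# Event IDs as documented for the seizure detection module.
EVENT_SEIZURE_ONSET = 140
EVENT_SEIZURE_TONIC = 141
EVENT_SEIZURE_CLONIC = 142
EVENT_POST_ICTAL = 143

def handle_seizure_event(event_id, value, notify_staff, log_episode):
    """Route a (event_id, value) pair per the response protocol above."""
    if event_id == EVENT_SEIZURE_ONSET:
        # Highest urgency: active seizure, immediate response required.
        notify_staff(f"SEIZURE ONSET (energy {value:.1f}) -- respond now")
    elif event_id in (EVENT_SEIZURE_TONIC, EVENT_SEIZURE_CLONIC):
        # Phase details (duration / period in frames) go to the record.
        log_episode(event_id, value)
    elif event_id == EVENT_POST_ICTAL:
        notify_staff("Post-ictal: convulsion ended, assessment needed")
```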
---
## Testing
All medical modules include comprehensive unit tests covering initialization, normal operation, clinical scenario detection, edge cases, and cooldown behavior.
```bash
cd rust-port/wifi-densepose-rs/crates/wifi-densepose-wasm-edge
cargo test --features std -- med_
```
Expected output: **38 tests passed, 0 failed**.
### Test Coverage by Module
| Module | Tests | Scenarios Covered |
|--------|-------|-------------------|
| Sleep Apnea | 7 | Init, normal breathing, apnea onset/end, no monitoring without presence, AHI update, multiple episodes, presence-loss during apnea |
| Cardiac Arrhythmia | 7 | Init, normal HR, tachycardia, bradycardia, missed beat, HRV anomaly (low variability), cooldown flood prevention, EMA convergence |
| Respiratory Distress | 6 | Init, normal breathing, tachypnea, labored breathing, distress score emission, Cheyne-Stokes detection, distress score range |
| Gait Analysis | 7 | Init, no events without steps, cadence extraction, fall-risk score range, asymmetry detection, shuffling detection, variability (uniform + varied) |
| Seizure Detection | 7 | Init, normal motion, fall discrimination, seizure onset with sustained energy, post-ictal detection, no detection without presence, energy variance, cooldown after episode |
---
## Clinical Thresholds Reference
| Condition | Normal Range | Module Threshold | Clinical Standard | Notes |
|-----------|-------------|------------------|-------------------|-------|
| Breathing rate | 12-20 BPM | -- | -- | Normal adult at rest |
| Bradypnea | < 12 BPM | Not directly detected | < 12 BPM | Gap: covered implicitly by distress score |
| Tachypnea | > 20 BPM | > 25 BPM | > 20 BPM | Conservative threshold for CSI noise tolerance |
| Apnea | 0 BPM | < 4 BPM for > 10s | Cessation > 10s | 4 BPM threshold accounts for CSI noise floor |
| Bradycardia | < 60 BPM | < 50 BPM | < 60 BPM | Lower threshold avoids false positives in athletes |
| Tachycardia | > 100 BPM | > 100 BPM | > 100 BPM | Matches clinical standard |
| Heart rate (normal) | 60-100 BPM | -- | 60-100 BPM | -- |
| AHI (mild apnea) | -- | > 5 events/hr | > 5 events/hr | Matches clinical standard |
| AHI (moderate) | -- | > 15 events/hr | > 15 events/hr | Matches clinical standard |
| AHI (severe) | -- | > 30 events/hr | > 30 events/hr | Matches clinical standard |
| RMSSD (normal HRV) | 20-80 ms | 10-120 ms | 19-75 ms | Widened band for CSI-derived HR |
| Gait cadence (normal) | 80-120 steps/min | 80-120 steps/min | 90-120 steps/min | Slightly wider range |
| Gait asymmetry | 1.0 ratio | > 0.15 deviation | > 0.10 deviation | Slightly higher threshold for CSI |
| Cheyne-Stokes period | 30-90 s | 30-90 s lag search | 30-100 s | Matches clinical range |
| Seizure clonic frequency | 3-8 Hz | 3-8 Hz (period 2-7 frames at 20 Hz) | 3-8 Hz | Matches clinical standard |
### Threshold Rationale
Several thresholds differ from strict clinical standards. This is intentional:
- **WiFi CSI is not ECG/pulse oximetry.** The signal-to-noise ratio is lower, so thresholds are widened to reduce false positives while maintaining clinical relevance.
- **Conservative thresholds favor specificity over sensitivity.** A missed alert is preferable to alert fatigue in a non-clinical-grade system.
- **All thresholds are compile-time constants.** To adjust for a specific deployment, modify the constants at the top of each module file and recompile.
---
## Safety Considerations
1. **Not a substitute for medical devices.** These modules are research/assistive tools. They have not been validated through clinical trials and are not FDA/CE cleared. Never rely on them as the sole source of patient monitoring.
2. **False positive rates.** WiFi CSI is affected by environmental factors: moving objects (fans, pets, curtains), multipath changes (opening doors, people walking nearby), and electromagnetic interference. Expect false positive rates of 5-15% in typical home environments and 1-5% in controlled clinical settings.
3. **False negative rates.** The conservative thresholds mean some borderline conditions may not trigger alerts. Specifically:
- Bradypnea (breathing that slows from the normal 12-20 BPM range into the 4-12 BPM band) is not directly flagged -- only sub-4 BPM apnea is detected
- Mild tachycardia (100-120 BPM) is detected, but the 10-second sustained requirement means brief episodes are missed
- Low-amplitude seizures without strong motor components may not exceed the energy threshold
4. **Environmental factors affecting accuracy:**
- **Multi-person environments**: All modules assume a single subject. Multiple people in the sensor's field of view will corrupt readings.
- **Distance**: CSI sensitivity drops with distance. Place sensor within 2 meters of the subject.
- **Obstructions**: Thick walls, metal furniture, and large water bodies (aquariums) between sensor and subject degrade performance.
- **WiFi congestion**: Heavy WiFi traffic on the same channel increases noise in CSI measurements.
5. **Power and connectivity**: The ESP32 must maintain continuous WiFi connectivity for CSI monitoring. Power loss or WiFi disconnection will silently stop all monitoring. Consider UPS power and redundant AP placement for critical applications.
6. **Data privacy**: These modules process health-related data. Ensure compliance with HIPAA, GDPR, or local health data regulations when deploying in clinical or home care settings. CSI data and emitted events should be encrypted in transit and at rest.

docs/edge-modules/retail.md Normal file
# Retail & Hospitality Modules -- WiFi-DensePose Edge Intelligence
> Understand customer behavior without cameras or consent forms. Count queues, map foot traffic, track table turnover, measure shelf engagement -- all from WiFi signals that are already there.
## Overview
| Module | File | What It Does | Event IDs | Frame Budget |
|--------|------|--------------|-----------|--------------|
| Queue Length | `ret_queue_length.rs` | Estimates queue length and wait time using Little's Law | 400-403 | ~0.5 us/frame |
| Dwell Heatmap | `ret_dwell_heatmap.rs` | Tracks dwell time per spatial zone (3x3 grid) | 410-413 | ~1 us/frame |
| Customer Flow | `ret_customer_flow.rs` | Directional foot traffic counting (ingress/egress) | 420-423 | ~1.5 us/frame |
| Table Turnover | `ret_table_turnover.rs` | Restaurant table lifecycle tracking with turnover rate | 430-433 | ~0.3 us/frame |
| Shelf Engagement | `ret_shelf_engagement.rs` | Detects and classifies customer shelf interaction | 440-443 | ~1 us/frame |
All modules target the ESP32-S3 running WASM3 (ADR-040 Tier 3). They receive pre-processed CSI signals from Tier 2 DSP and emit structured events via `csi_emit_event()`.
---
## Modules
### Queue Length Estimation (`ret_queue_length.rs`)
**What it does**: Estimates the number of people waiting in a queue, computes arrival and service rates, estimates wait time using Little's Law (L = lambda x W), and fires alerts when the queue exceeds a configurable threshold.
**How it works**: The module tracks person count changes frame-to-frame to detect arrivals (count increased or new presence with variance spike) and departures (count decreased or presence edge with low motion). Over 30-second windows, it computes arrival rate (lambda) and service rate (mu) in persons-per-minute. The queue length is smoothed via EMA on the raw person count. Wait time is estimated as `queue_length / (arrival_rate / 60)`.
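The Little's Law arithmetic is straightforward; a sketch with the units used in the text (rates in persons/minute, wait in seconds):

```python
def estimate_wait_seconds(queue_length, arrival_rate_per_min):
    """Little's Law: L = lambda * W, so W = L / lambda.

    arrival_rate_per_min is persons/minute; dividing by 60 gives
    persons/second, which matches queue_length / (arrival_rate / 60)
    in the text and yields a wait in seconds.
    """
    per_second = arrival_rate_per_min / 60.0
    if per_second <= 0.0:
        return 0.0  # no arrivals observed: no meaningful estimate
    return queue_length / per_second
```

For example, 5 people in line with 2 arrivals/minute gives an estimated wait of 150 seconds.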
#### Events
| Event ID | Name | Value | When Emitted |
|----------|------|-------|--------------|
| 400 | `QUEUE_LENGTH` | Estimated queue length (0-20) | Every 20 frames (1s) |
| 401 | `WAIT_TIME_ESTIMATE` | Estimated wait in seconds | Every 600 frames (30s window) |
| 402 | `SERVICE_RATE` | Service rate (persons/min, smoothed) | Every 600 frames (30s window) |
| 403 | `QUEUE_ALERT` | Current queue length | When queue >= 5 (once, resets below 4) |
#### API
```rust
use wifi_densepose_wasm_edge::ret_queue_length::QueueLengthEstimator;
let mut q = QueueLengthEstimator::new();
// Per-frame: presence (0/1), person count, variance, motion energy
let events = q.process_frame(presence, n_persons, variance, motion_energy);
// Queries
q.queue_length() // -> u8 (0-20, smoothed)
q.arrival_rate() // -> f32 (persons/minute, EMA-smoothed)
q.service_rate() // -> f32 (persons/minute, EMA-smoothed)
```
#### Configuration Constants
| Constant | Value | Description |
|----------|-------|-------------|
| `REPORT_INTERVAL` | 20 frames (1s) | Queue length report interval |
| `SERVICE_WINDOW_FRAMES` | 600 frames (30s) | Window for rate computation |
| `QUEUE_EMA_ALPHA` | 0.1 | EMA smoothing for queue length |
| `RATE_EMA_ALPHA` | 0.05 | EMA smoothing for arrival/service rates |
| `JOIN_VARIANCE_THRESH` | 0.05 | Variance spike threshold for join detection |
| `DEPART_MOTION_THRESH` | 0.02 | Motion threshold for departure detection |
| `QUEUE_ALERT_THRESH` | 5.0 | Queue length that triggers alert |
| `MAX_QUEUE` | 20 | Maximum tracked queue length |
#### Example: Retail Queue Management
```python
# React to queue events
if event_id == 400: # QUEUE_LENGTH
queue_len = int(value)
dashboard.update_queue(register_id, queue_len)
elif event_id == 401: # WAIT_TIME_ESTIMATE
wait_seconds = value
signage.show(f"Estimated wait: {int(wait_seconds / 60)} min")
elif event_id == 403: # QUEUE_ALERT
staff_pager.send(f"Register {register_id}: {int(value)} in queue")
```
---
### Dwell Heatmap (`ret_dwell_heatmap.rs`)
**What it does**: Divides the sensing area into a 3x3 grid (9 zones) and tracks how long customers spend in each zone. Identifies "hot zones" (highest dwell time) and "cold zones" (lowest dwell time). Emits session summaries when the space empties, enabling store layout optimization.
**How it works**: Subcarriers are divided into 9 groups, one per zone. Each zone's variance is smoothed via EMA and compared against a threshold. When variance exceeds the threshold and presence is detected, dwell time accumulates at 0.05 seconds per frame. Sessions start when someone enters and end after 100 frames (5 seconds) of empty space.
#### Events
| Event ID | Name | Value Encoding | When Emitted |
|----------|------|----------------|--------------|
| 410 | `DWELL_ZONE_UPDATE` | `zone_id * 1000 + dwell_seconds` | Every 600 frames (30s) per occupied zone |
| 411 | `HOT_ZONE` | `zone_id + dwell_seconds/1000` | Every 600 frames (30s) |
| 412 | `COLD_ZONE` | `zone_id + dwell_seconds/1000` | Every 600 frames (30s) |
| 413 | `SESSION_SUMMARY` | Session duration in seconds | When space empties after occupancy |
**Value decoding for DWELL_ZONE_UPDATE**: The zone ID is encoded in the thousands place. For example, `value = 2015.5` means zone 2 with 15.5 seconds of dwell time.
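A host-side decode of the packed value might look like this (a hypothetical helper that simply mirrors the encoding described above):

```python
def decode_dwell_zone(value):
    """Split a packed DWELL_ZONE_UPDATE value into (zone_id, seconds).

    The zone ID occupies the thousands place: zone * 1000 + seconds.
    """
    zone_id = int(value) // 1000
    dwell_seconds = value - zone_id * 1000
    return zone_id, dwell_seconds
```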
#### API
```rust
use wifi_densepose_wasm_edge::ret_dwell_heatmap::DwellHeatmapTracker;
let mut t = DwellHeatmapTracker::new();
// Per-frame: presence (0/1), per-subcarrier variances, motion energy, person count
let events = t.process_frame(presence, &variances, motion_energy, n_persons);
// Queries
t.zone_dwell(zone_id) // -> f32 (seconds in current session)
t.zone_total_dwell(zone_id) // -> f32 (seconds across all sessions)
t.is_zone_occupied(zone_id) // -> bool
t.is_session_active() // -> bool
```
#### Configuration Constants
| Constant | Value | Description |
|----------|-------|-------------|
| `NUM_ZONES` | 9 | Spatial zones (3x3 grid) |
| `REPORT_INTERVAL` | 600 frames (30s) | Heatmap update interval |
| `ZONE_OCCUPIED_THRESH` | 0.015 | Variance threshold for zone occupancy |
| `ZONE_EMA_ALPHA` | 0.12 | EMA smoothing for zone variance |
| `EMPTY_FRAMES_FOR_SUMMARY` | 100 frames (5s) | Vacancy duration before session end |
| `MAX_EVENTS` | 12 | Maximum events per frame |
#### Zone Layout
The 3x3 grid maps to the physical space:
```
+-------+-------+-------+
| Z0 | Z1 | Z2 |
| | | |
+-------+-------+-------+
| Z3 | Z4 | Z5 |
| | | |
+-------+-------+-------+
| Z6 | Z7 | Z8 |
| | | |
+-------+-------+-------+
Near Mid Far
```
Subcarriers are divided evenly: with 27 subcarriers, each zone gets 3 subcarriers. Lower-index subcarriers correspond to nearer Fresnel zones.
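The even split described above can be written as follows. This is a sketch of the stated mapping; how the real module handles subcarrier counts that do not divide evenly by 9 is an assumption here (remainders are folded into the last zone).

```python
def zone_for_subcarrier(sc_index, n_subcarriers=27, n_zones=9):
    """Map a subcarrier index to its zone (lower index = nearer zone)."""
    per_zone = n_subcarriers // n_zones  # 3 subcarriers per zone for 27
    return min(sc_index // per_zone, n_zones - 1)
```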
---
### Customer Flow Counting (`ret_customer_flow.rs`)
**What it does**: Counts people entering and exiting through a doorway or passage using directional phase gradient analysis. Maintains cumulative ingress/egress counts and reports net occupancy (in - out, clamped to zero). Emits hourly traffic summaries.
**How it works**: Subcarriers are split into two groups: low-index (near entrance) and high-index (far side). A person walking through the sensing area causes an asymmetric phase velocity pattern -- the near-side group's phase changes before the far-side group for ingress, and vice versa for egress. The directional gradient (low_gradient - high_gradient) is smoothed via EMA and thresholded. Combined with motion energy and amplitude spike detection, this discriminates genuine crossings from noise.
```
Ingress: positive smoothed gradient (low-side phase leads)
Egress: negative smoothed gradient (high-side phase leads)
```
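The raw (pre-EMA) directional gradient can be sketched as below. This illustrates only the low-minus-high construction; the module additionally applies EMA smoothing, motion-energy gating, and amplitude spike detection before counting a crossing.

```python
def directional_gradient(phases, prev_phases):
    """Low-side minus high-side mean phase velocity for one frame.

    Positive result: near-entrance (low-index) subcarriers lead,
    suggesting ingress; negative suggests egress.
    """
    n = len(phases)
    half = n // 2
    deltas = [p - q for p, q in zip(phases, prev_phases)]
    low = sum(deltas[:half]) / half
    high = sum(deltas[half:]) / (n - half)
    return low - high
```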
#### Events
| Event ID | Name | Value | When Emitted |
|----------|------|-------|--------------|
| 420 | `INGRESS` | Cumulative ingress count | On each detected entry |
| 421 | `EGRESS` | Cumulative egress count | On each detected exit |
| 422 | `NET_OCCUPANCY` | Current net occupancy (>= 0) | On crossing + every 100 frames |
| 423 | `HOURLY_TRAFFIC` | `ingress * 1000 + egress` | Every 72000 frames (1 hour) |
**Decoding HOURLY_TRAFFIC**: `ingress = int(value / 1000)`, `egress = int(value % 1000)`.
#### API
```rust
use wifi_densepose_wasm_edge::ret_customer_flow::CustomerFlowTracker;
let mut cf = CustomerFlowTracker::new();
// Per-frame: per-subcarrier phases, amplitudes, variance, motion energy
let events = cf.process_frame(&phases, &amplitudes, variance, motion_energy);
// Queries
cf.net_occupancy() // -> i32 (ingress - egress, clamped to 0)
cf.total_ingress() // -> u32 (cumulative entries)
cf.total_egress() // -> u32 (cumulative exits)
cf.current_gradient() // -> f32 (smoothed directional gradient)
```
#### Configuration Constants
| Constant | Value | Description |
|----------|-------|-------------|
| `PHASE_GRADIENT_THRESH` | 0.15 | Minimum gradient magnitude for crossing |
| `MOTION_THRESH` | 0.03 | Minimum motion energy for valid crossing |
| `AMPLITUDE_SPIKE_THRESH` | 1.5 | Amplitude change scale factor |
| `CROSSING_DEBOUNCE` | 10 frames (0.5s) | Debounce between crossing events |
| `GRADIENT_EMA_ALPHA` | 0.2 | EMA smoothing for gradient |
| `OCCUPANCY_REPORT_INTERVAL` | 100 frames (5s) | Net occupancy report interval |
#### Example: Store Occupancy Display
```python
# Real-time occupancy counter at store entrance
if event_id == 422: # NET_OCCUPANCY
occupancy = int(value)
display.show(f"Currently in store: {occupancy}")
if occupancy >= max_capacity:
door_signal.set("WAIT")
else:
door_signal.set("ENTER")
elif event_id == 423: # HOURLY_TRAFFIC
ingress = int(value / 1000)
egress = int(value % 1000)
analytics.log_hourly(hour, ingress, egress)
```
---
### Table Turnover Tracking (`ret_table_turnover.rs`)
**What it does**: Tracks the full lifecycle of a restaurant table -- from guests sitting down, through eating, to departing and cleanup. Measures seating duration and computes a rolling turnover rate (turnovers per hour). Designed for one ESP32 node per table or table group.
**How it works**: A five-state machine processes presence, motion energy, and person count:
```
Empty --> Eating --> Departing --> Cooldown --> Empty
| (2s (motion (30s |
| debounce) increase) cleanup) |
| |
+----------------------------------------------+
(brief absence: stays in Eating)
```
The `Seating` state exists in the enum for completeness but transitions are handled directly (Empty -> Eating after debounce). The `Departing` state detects when guests show increased motion and reduced person count. Vacancy requires 5 seconds of confirmed absence to avoid false triggers from brief bathroom breaks.
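The presence-driven core of this lifecycle can be sketched as a small state machine. Frame counts mirror the documented constants; the real Rust module also folds in motion energy and person count (e.g. for the `Departing` transition), which this simplified sketch omits.

```python
SEATED_DEBOUNCE = 40    # 2 s of continuous presence
VACATED_DEBOUNCE = 100  # 5 s of confirmed absence
COOLDOWN = 600          # 30 s cleanup window

class TableSketch:
    def __init__(self):
        self.state = "Empty"
        self.counter = 0

    def step(self, presence):
        """Advance one frame; presence is 0 or 1."""
        if self.state == "Empty":
            self.counter = self.counter + 1 if presence else 0
            if self.counter >= SEATED_DEBOUNCE:
                self.state, self.counter = "Eating", 0
        elif self.state == "Eating":
            # Brief absences (bathroom breaks) reset before debounce hits.
            self.counter = 0 if presence else self.counter + 1
            if self.counter >= VACATED_DEBOUNCE:
                self.state, self.counter = "Cooldown", 0
        elif self.state == "Cooldown":
            self.counter += 1
            if presence:
                self.state, self.counter = "Eating", 0  # fast re-seat
            elif self.counter >= COOLDOWN:
                self.state, self.counter = "Empty", 0
        return self.state
```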
#### Events
| Event ID | Name | Value | When Emitted |
|----------|------|-------|--------------|
| 430 | `TABLE_SEATED` | Person count at seating | After 40-frame debounce |
| 431 | `TABLE_VACATED` | Seating duration in seconds | After 100-frame absence debounce |
| 432 | `TABLE_AVAILABLE` | 1.0 | After 30-second cleanup cooldown |
| 433 | `TURNOVER_RATE` | Turnovers per hour (rolling) | Every 6000 frames (5 min) |
#### API
```rust
use wifi_densepose_wasm_edge::ret_table_turnover::TableTurnoverTracker;
let mut tt = TableTurnoverTracker::new();
// Per-frame: presence (0/1), motion energy, person count
let events = tt.process_frame(presence, motion_energy, n_persons);
// Queries
tt.state() // -> TableState (Empty|Seating|Eating|Departing|Cooldown)
tt.total_turnovers() // -> u32 (cumulative turnovers)
tt.session_duration_s() // -> f32 (current session length in seconds)
tt.turnover_rate() // -> f32 (turnovers/hour, rolling window)
```
#### State Machine
| State | Entry Condition | Exit Condition |
|-------|----------------|----------------|
| `Empty` | Table is free | 40 frames (2s) of continuous presence |
| `Eating` | Guests confirmed seated | 100 frames (5s) of absence -> Cooldown; high motion + fewer people -> Departing |
| `Departing` | High motion with dropping count | 100 frames absence -> Cooldown; motion settles -> back to Eating |
| `Cooldown` | Table vacated, cleanup period | 600 frames (30s) -> Empty; presence during cooldown -> Eating (fast re-seat) |
#### Configuration Constants
| Constant | Value | Description |
|----------|-------|-------------|
| `SEATED_DEBOUNCE_FRAMES` | 40 frames (2s) | Confirmation before marking seated |
| `VACATED_DEBOUNCE_FRAMES` | 100 frames (5s) | Absence confirmation before vacating |
| `AVAILABLE_COOLDOWN_FRAMES` | 600 frames (30s) | Cleanup time before marking available |
| `EATING_MOTION_THRESH` | 0.1 | Motion below this = settled/eating |
| `ACTIVE_MOTION_THRESH` | 0.3 | Motion above this = arriving/departing |
| `TURNOVER_REPORT_INTERVAL` | 6000 frames (5 min) | Rate report interval |
| `MAX_TURNOVERS` | 50 | Rolling window buffer for rate |
#### Example: Restaurant Operations Dashboard
```python
# Restaurant table management
if event_id == 430: # TABLE_SEATED
party_size = int(value)
kitchen.notify(f"Table {table_id}: {party_size} guests seated")
pos.start_timer(table_id)
elif event_id == 431: # TABLE_VACATED
duration_s = value
analytics.log_seating(table_id, duration_s, peak_persons)
staff.alert(f"Table {table_id}: needs bussing ({duration_s/60:.0f} min use)")
elif event_id == 432: # TABLE_AVAILABLE
hostess_display.mark_available(table_id)
elif event_id == 433: # TURNOVER_RATE
rate = value
manager_dashboard.update(table_id, turnovers_per_hour=rate)
```
---
### Shelf Engagement Detection (`ret_shelf_engagement.rs`)
**What it does**: Detects when a customer stops in front of a shelf and classifies their engagement level: Browse (under 5 seconds), Consider (5-30 seconds), or Deep Engagement (over 30 seconds). Also detects reaching gestures (hand/arm movement toward the shelf). Uses the principle that a person standing still but interacting with products produces high-frequency phase perturbations with low translational motion.
**How it works**: The key insight is distinguishing two types of CSI phase changes:
- **Translational motion** (walking): Large uniform phase shifts across all subcarriers
- **Localized interaction** (reaching, examining): High spatial variance in frame-to-frame phase differences
The module computes the standard deviation of per-subcarrier phase differences. High std-dev with low overall motion indicates shelf interaction. A reach gesture produces a burst of high-frequency perturbation exceeding a higher threshold.
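The standard deviation of per-subcarrier phase differences can be sketched as below; a uniform shift (walking) yields near-zero spread, while localized interaction spreads the differences.

```python
def phase_perturbation(phases, prev_phases):
    """Std-dev of per-subcarrier frame-to-frame phase differences.

    A high value with low overall motion energy suggests localized
    interaction (reaching, examining) rather than walking, per the
    translational-vs-localized distinction above.
    """
    diffs = [p - q for p, q in zip(phases, prev_phases)]
    mean = sum(diffs) / len(diffs)
    var = sum((d - mean) ** 2 for d in diffs) / len(diffs)
    return var ** 0.5
```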
#### Engagement Classification
| Level | Duration | Description | Event ID |
|-------|----------|-------------|----------|
| None | -- | No engagement (absent or walking) | -- |
| Browse | < 5s | Brief glance, passing interest | 440 |
| Consider | 5-30s | Examining, reading label, comparing | 441 |
| Deep Engage | > 30s | Extended interaction, decision-making | 442 |
The `REACH_DETECTED` event (443) fires independently whenever a sudden high-frequency phase burst is detected while the customer is standing still.
#### Events
| Event ID | Name | Value | When Emitted |
|----------|------|-------|--------------|
| 440 | `SHELF_BROWSE` | Engagement duration in seconds | On classification (with cooldown) |
| 441 | `SHELF_CONSIDER` | Engagement duration in seconds | On level upgrade |
| 442 | `SHELF_ENGAGE` | Engagement duration in seconds | On level upgrade |
| 443 | `REACH_DETECTED` | Phase perturbation magnitude | Per reach burst |
#### API
```rust
use wifi_densepose_wasm_edge::ret_shelf_engagement::ShelfEngagementDetector;
let mut se = ShelfEngagementDetector::new();
// Per-frame: presence (0/1), motion energy, variance, per-subcarrier phases
let events = se.process_frame(presence, motion_energy, variance, &phases);
// Queries
se.engagement_level() // -> EngagementLevel (None|Browse|Consider|DeepEngage)
se.engagement_duration_s() // -> f32 (seconds)
se.total_browse_events() // -> u32
se.total_consider_events() // -> u32
se.total_engage_events() // -> u32
se.total_reach_events() // -> u32
```
#### Configuration Constants
| Constant | Value | Description |
|----------|-------|-------------|
| `BROWSE_THRESH_S` | 5.0s (100 frames) | Duration above which Browse upgrades to Consider |
| `CONSIDER_THRESH_S` | 30.0s (600 frames) | Duration above which Consider upgrades to Deep Engage |
| `STILL_MOTION_THRESH` | 0.08 | Motion below this = standing still |
| `PHASE_PERTURBATION_THRESH` | 0.04 | Phase variance for interaction |
| `REACH_BURST_THRESH` | 0.15 | Phase burst for reach detection |
| `STILL_DEBOUNCE` | 10 frames (0.5s) | Stillness confirmation before counting |
| `ENGAGEMENT_COOLDOWN` | 60 frames (3s) | Cooldown between engagement events |
#### Example: Planogram Analytics
```python
# Shelf performance analytics
shelf_stats = defaultdict(lambda: {"browse": 0, "consider": 0, "engage": 0, "reaches": 0})
if event_id == 440: # SHELF_BROWSE
shelf_stats[shelf_id]["browse"] += 1
elif event_id == 441: # SHELF_CONSIDER
shelf_stats[shelf_id]["consider"] += 1
elif event_id == 442: # SHELF_ENGAGE
shelf_stats[shelf_id]["engage"] += 1
duration_s = value
if duration_s > 60:
analytics.flag_decision_difficulty(shelf_id)
elif event_id == 443: # REACH_DETECTED
shelf_stats[shelf_id]["reaches"] += 1
# Conversion funnel: Browse -> Consider -> Engage
# Low consider-to-engage ratio = poor shelf placement or pricing
```
---
## Use Cases
### Retail Store Layout Optimization
Deploy ESP32 nodes at key locations:
- **Entrance**: Customer Flow module counts foot traffic and peak hours
- **Checkout lanes**: Queue Length module monitors wait times, triggers "open register" alerts
- **Aisles**: Dwell Heatmap identifies high-traffic zones for premium product placement
- **Endcaps/displays**: Shelf Engagement measures which displays convert attention to interaction
```
Entrance
(CustomerFlow)
|
+--------------+--------------+
| | |
Aisle 1 Aisle 2 Aisle 3
(DwellHeatmap) (DwellHeatmap) (DwellHeatmap)
| | |
[Shelf A] [Shelf B] [Shelf C]
(ShelfEngage) (ShelfEngage) (ShelfEngage)
| | |
+--------------+--------------+
|
Checkout Area
(QueueLength x3)
```
### Restaurant Operations
Deploy per-table ESP32 nodes plus entrance/exit nodes:
- **Entrance**: Customer Flow tracks customer arrivals
- **Each table**: Table Turnover monitors seating lifecycle
- **Host stand**: Queue Length estimates wait time for walk-ins
- **Kitchen view**: Dwell Heatmap identifies server traffic patterns
Key metrics:
- Average seating duration per table
- Turnovers per hour (efficiency)
- Peak vs. off-peak utilization
- Wait time vs. party size correlation
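The first two metrics can be sketched host-side in a few lines (assuming seating records have already been derived from Table Turnover events; the record shape and function name here are illustrative, not part of the module API):

```python
# Hypothetical sketch: compute restaurant key metrics from seating records.
# Each record is (table_id, seated_at_s, vacated_at_s) in seconds.
def restaurant_metrics(seatings, window_hours):
    durations = [end - start for _, start, end in seatings]
    avg_duration_s = sum(durations) / len(durations) if durations else 0.0
    turnovers_per_hour = len(seatings) / window_hours if window_hours else 0.0
    return {"avg_seating_s": avg_duration_s,
            "turnovers_per_hour": turnovers_per_hour}

stats = restaurant_metrics(
    [("t1", 0, 2700), ("t1", 3000, 5400), ("t2", 600, 4200)],
    window_hours=2.0,
)  # avg_seating_s = 2900.0, turnovers_per_hour = 1.5
```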
### Shopping Mall Analytics
Multi-floor, multi-zone deployment:
- **Mall entrances** (4-8 nodes): Customer Flow for total foot traffic + directionality
- **Food court**: Table Turnover + Queue Length per restaurant
- **Anchor store entrances**: Customer Flow per store
- **Common areas**: Dwell Heatmap for seating area utilization
- **Kiosks/pop-ups**: Shelf Engagement for promotional display effectiveness
### Event Venue Management
- **Gates**: Customer Flow for entry/exit counting, capacity monitoring
- **Concession stands**: Queue Length with staff dispatch alerts
- **Seating sections**: Dwell Heatmap for section utilization
- **Merchandise areas**: Shelf Engagement for product interest
---
## Integration Architecture
```
ESP32 Nodes (per zone)
|
v UDP events (port 5005)
Sensing Server (wifi-densepose-sensing-server)
|
v REST API + WebSocket
+---+---+---+---+
| | | | |
v v v v v
POS Dashboard Staff Analytics
Pager Backend
```
### Event Packet Format
Each event is a `(event_type: i32, value: f32)` pair. Multiple events per frame are packed into a single UDP packet. The sensing server deserializes and exposes them via:
- `GET /api/v1/sensing/latest` -- latest raw events
- `GET /api/v1/sensing/events?type=400-403` -- filtered by event type
- WebSocket `/ws/events` -- real-time stream
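A host-side deserializer for this packet layout can be sketched as follows (byte order is assumed little-endian here; confirm against the firmware's actual packing before relying on it):

```python
import struct

# Sketch: split one UDP payload into (event_type: i32, value: f32) pairs.
# Each event occupies 8 bytes; assumes little-endian packing.
def parse_event_packet(payload: bytes):
    assert len(payload) % 8 == 0, "each event is 8 bytes (i32 + f32)"
    return [struct.unpack_from("<if", payload, off)
            for off in range(0, len(payload), 8)]

pkt = struct.pack("<if", 210, 3.5) + struct.pack("<if", 211, 2.0)
events = parse_event_packet(pkt)  # [(210, 3.5), (211, 2.0)]
```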
### Privacy Considerations
These modules process WiFi CSI data (channel amplitude and phase), not video or personally identifiable information. No MAC addresses, device identifiers, or individual tracking data leaves the ESP32. All output is aggregate metrics: counts, durations, zone labels. This makes WiFi sensing suitable for jurisdictions with strict privacy requirements (GDPR, CCPA) where camera-based analytics would require consent forms or impact assessments.
# Security & Safety Modules -- WiFi-DensePose Edge Intelligence
> Perimeter monitoring and threat detection using WiFi Channel State Information (CSI).
> Works through walls, in complete darkness, without visible cameras.
> Each module runs on an $8 ESP32-S3 chip at 20 Hz frame rate.
> All modules are `no_std`-compatible and compile to WASM for hot-loading via ADR-040 Tier 3.
## Overview
| Module | File | What It Does | Event IDs | Budget |
|--------|------|--------------|-----------|--------|
| Intrusion Detection | `intrusion.rs` | Phase/amplitude anomaly intrusion alarm with arm/disarm | 200-203 | S (<5 ms) |
| Perimeter Breach | `sec_perimeter_breach.rs` | Multi-zone perimeter crossing with approach/departure | 210-213 | S (<5 ms) |
| Weapon Detection | `sec_weapon_detect.rs` | Concealed metallic object detection via RF reflectivity ratio | 220-222 | S (<5 ms) |
| Tailgating Detection | `sec_tailgating.rs` | Double-peak motion envelope for unauthorized following | 230-232 | L (<2 ms) |
| Loitering Detection | `sec_loitering.rs` | Prolonged stationary presence with 4-state machine | 240-242 | L (<2 ms) |
| Panic Motion | `sec_panic_motion.rs` | Erratic motion, struggle, and fleeing patterns | 250-252 | S (<5 ms) |
Budget key: **S** = Standard (<5 ms per frame), **L** = Light (<2 ms per frame).
## Shared Design Patterns
All security modules follow these conventions:
- **`const fn new()`**: Zero-allocation constructor, no heap, suitable for `static mut` on ESP32.
- **`process_frame(...) -> &[(i32, f32)]`**: Returns event tuples `(event_id, value)` via a static buffer (safe in single-threaded WASM).
- **Calibration phase**: First N frames (typically 100-200 at 20 Hz = 5-10 seconds) learn ambient baseline. No events during calibration.
- **Debounce**: Consecutive-frame counters prevent single-frame noise from triggering alerts.
- **Cooldown**: After emitting an event, a cooldown window suppresses duplicate emissions (40-100 frames = 2-5 seconds).
- **Hysteresis**: Debounce counters use `saturating_sub(1)` for gradual decay rather than hard reset, reducing flap on borderline signals.
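The debounce-plus-hysteresis pattern can be sketched in a few lines (a Python analogue of the Rust counters; the threshold value is illustrative):

```python
# The counter climbs on disturbed frames, decays by one (never below zero)
# on quiet frames, and the alert flag holds only once the threshold is hit.
def run_debounce(frames, threshold=3):
    count, fired = 0, []
    for disturbed in frames:
        count = count + 1 if disturbed else max(count - 1, 0)  # saturating_sub(1)
        fired.append(count >= threshold)
    return fired

# A single quiet frame mid-run only dents the counter instead of resetting
# it, so a borderline signal still confirms quickly.
flags = run_debounce([True, True, False, True, True])
```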
---
## Modules
### Intrusion Detection (`intrusion.rs`)
**What it does**: Monitors a previously-empty space and triggers an alarm when someone enters. Works like a traditional motion alarm -- the environment must settle before the system arms itself.
**How it works**: During calibration (200 frames), the detector learns per-subcarrier amplitude mean and variance. After calibration, it waits for the environment to be quiet (100 consecutive frames with low disturbance) before arming. Once armed, it computes a composite disturbance score from phase velocity (sudden phase jumps between frames) and amplitude deviation (amplitude departing from baseline by more than 3 sigma). If the disturbance exceeds 0.8 for 3+ consecutive frames, an alert fires.
#### State Machine
```
Calibrating --> Monitoring --> Armed --> Alert
                                 ^         |
                                 |  (quiet for 50 frames)
                                 +---------+
```
- **Calibrating**: Accumulates baseline amplitude statistics for 200 frames.
- **Monitoring**: Waits for 100 consecutive quiet frames before arming.
- **Armed**: Active detection. Triggers alert on 3+ consecutive high-disturbance frames.
- **Alert**: Active alert. Returns to Armed after 50 consecutive quiet frames. 100-frame cooldown prevents re-triggering.
#### API
| Item | Type | Description |
|------|------|-------------|
| `IntrusionDetector::new()` | `const fn` | Create detector in Calibrating state |
| `process_frame(phases, amplitudes)` | `fn` | Process one CSI frame, returns events |
| `state()` | `fn -> DetectorState` | Current state (Calibrating/Monitoring/Armed/Alert) |
| `total_alerts()` | `fn -> u32` | Cumulative alert count |
#### Events Emitted
| Event ID | Constant | When Emitted |
|----------|----------|--------------|
| 200 | `EVENT_INTRUSION_ALERT` | Intrusion detected (disturbance score as value) |
| 201 | `EVENT_INTRUSION_ZONE` | Zone index of highest disturbance |
| 202 | `EVENT_INTRUSION_ARMED` | System transitioned to Armed state |
| 203 | `EVENT_INTRUSION_DISARMED` | System disarmed (currently unused -- reserved) |
#### Configuration
| Parameter | Default | Range | Description |
|-----------|---------|-------|-------------|
| `INTRUSION_VELOCITY_THRESH` | 1.5 | 0.5-3.0 | Phase velocity threshold (rad/frame) |
| `AMPLITUDE_CHANGE_THRESH` | 3.0 | 2.0-5.0 | Sigma multiplier for amplitude deviation |
| `ARM_FRAMES` | 100 | 40-200 | Quiet frames required before arming (5s at 20 Hz) |
| `DETECT_DEBOUNCE` | 3 | 2-10 | Consecutive disturbed frames before alert |
| `ALERT_COOLDOWN` | 100 | 20-200 | Frames between re-alerts (5s at 20 Hz) |
| `BASELINE_FRAMES` | 200 | 100-500 | Calibration frames (10s at 20 Hz) |
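The composite disturbance score described above can be sketched as follows (illustrative arithmetic, not the module's exact code; the equal weighting of the two components is an assumption):

```python
# Sketch: combine per-frame phase velocity with the fraction of subcarriers
# whose amplitude departs from baseline by more than sigma_thresh sigmas.
def disturbance_score(phases, prev_phases, amps, base_mean, base_std,
                      vel_thresh=1.5, sigma_thresh=3.0):
    n = len(phases)
    # Mean absolute phase change per subcarrier (rad/frame)
    vel = sum(abs(p - q) for p, q in zip(phases, prev_phases)) / n
    # Fraction of subcarriers deviating beyond sigma_thresh * baseline sigma
    dev = sum(1 for a, m, s in zip(amps, base_mean, base_std)
              if s > 0 and abs(a - m) > sigma_thresh * s) / n
    return 0.5 * min(vel / vel_thresh, 1.0) + 0.5 * dev

quiet = disturbance_score([0.0]*4, [0.0]*4, [1.0]*4, [1.0]*4, [0.1]*4)  # -> 0.0
busy = disturbance_score([2.0]*4, [0.0]*4, [2.0]*4, [1.0]*4, [0.1]*4)   # -> 1.0
```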
---
### Perimeter Breach Detection (`sec_perimeter_breach.rs`)
**What it does**: Divides the monitored area into 4 zones (mapped to subcarrier groups) and detects movement crossing zone boundaries. Classifies motion direction as approaching or departing using energy gradient trends.
**How it works**: Subcarriers are split into 4 equal groups, each representing a spatial zone. Per-zone metrics are computed every frame:
1. **Phase gradient**: Mean absolute phase difference between current and previous frame within the zone's subcarrier range.
2. **Variance ratio**: Current zone variance divided by calibrated baseline variance.
A breach is flagged when phase gradient exceeds 0.6 rad/subcarrier AND variance ratio exceeds 2.5x baseline. Direction is determined by linear regression slope over an 8-frame energy history buffer -- positive slope = approaching, negative = departing.
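The direction test reduces to a least-squares slope over the energy history; a sketch of the idea (not the module's exact fixed-point code):

```python
# Least-squares slope over the 8-frame energy history: positive slope is
# read as approaching, negative as departing.
def energy_trend(history):
    n = len(history)
    mean_x = (n - 1) / 2.0
    mean_y = sum(history) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(history))
    den = sum((x - mean_x) ** 2 for x in range(n))
    return num / den if den else 0.0

slope = energy_trend([0.5, 0.8, 1.1, 1.4, 1.7, 2.0, 2.3, 2.6])  # positive -> approaching
```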
#### State Machine
There is no explicit state machine enum. Instead, per-zone counters track:
- `disturb_run`: Consecutive breach frames (resets to 0 when zone is quiet).
- `approach_run` / `departure_run`: Consecutive frames with positive/negative energy trend (debounced to 3 frames).
- Four independent cooldown timers for breach, approach, departure, and transition events.
No stuck states possible: all counters either reset on quiet input or are bounded by `saturating_add`.
#### API
| Item | Type | Description |
|------|------|-------------|
| `PerimeterBreachDetector::new()` | `const fn` | Create uncalibrated detector |
| `process_frame(phases, amplitudes, variance, motion_energy)` | `fn` | Process one frame, returns up to 4 events |
| `is_calibrated()` | `fn -> bool` | Whether baseline calibration is complete |
| `frame_count()` | `fn -> u32` | Total frames processed |
#### Events Emitted
| Event ID | Constant | When Emitted |
|----------|----------|--------------|
| 210 | `EVENT_PERIMETER_BREACH` | Significant disturbance in any zone (value = energy score) |
| 211 | `EVENT_APPROACH_DETECTED` | Energy trend rising in a breached zone (value = zone index) |
| 212 | `EVENT_DEPARTURE_DETECTED` | Energy trend falling in a zone (value = zone index) |
| 213 | `EVENT_ZONE_TRANSITION` | Movement shifted from one zone to another (value = `from*10 + to`) |
#### Configuration
| Parameter | Default | Range | Description |
|-----------|---------|-------|-------------|
| `BASELINE_FRAMES` | 100 | 60-200 | Calibration frames (5s at 20 Hz) |
| `BREACH_GRADIENT_THRESH` | 0.6 | 0.3-1.5 | Phase gradient for breach (rad/subcarrier) |
| `VARIANCE_RATIO_THRESH` | 2.5 | 1.5-5.0 | Variance ratio above baseline for disturbance |
| `DIRECTION_DEBOUNCE` | 3 | 2-8 | Consecutive trend frames for direction confirmation |
| `COOLDOWN` | 40 | 20-100 | Frames between events of same type (2s at 20 Hz) |
| `HISTORY_LEN` | 8 | 4-16 | Energy history buffer for trend estimation |
| `MAX_ZONES` | 4 | 2-4 | Number of perimeter zones |
#### Example Usage
```rust
use wifi_densepose_wasm_edge::sec_perimeter_breach::*;
let mut detector = PerimeterBreachDetector::new();
// Feed CSI frames (phases, amplitudes, variance arrays, motion energy scalar)
let events = detector.process_frame(&phases, &amplitudes, &variance, motion_energy);
for &(event_id, value) in events {
match event_id {
EVENT_PERIMETER_BREACH => {
// value = energy score (higher = more severe)
log!("Breach detected, energy={:.2}", value);
}
EVENT_APPROACH_DETECTED => {
// value = zone index (0-3)
log!("Approach in zone {}", value as u32);
}
EVENT_ZONE_TRANSITION => {
// value encodes from*10 + to
let from = (value as u32) / 10;
let to = (value as u32) % 10;
log!("Movement from zone {} to zone {}", from, to);
}
_ => {}
}
}
```
#### Tutorial: Setting Up a 4-Zone Perimeter System
1. **Sensor placement**: Mount the ESP32-S3 at the center of the monitored boundary (e.g., warehouse entrance, property line). The WiFi AP should be on the opposite side so the sensing link crosses all 4 zones.
2. **Zone mapping**: Subcarriers are divided equally among 4 zones. With 32 subcarriers:
- Zone 0: subcarriers 0-7 (nearest to the ESP32)
- Zone 1: subcarriers 8-15
- Zone 2: subcarriers 16-23
- Zone 3: subcarriers 24-31 (nearest to the AP)
3. **Calibration**: Power on the system with no one in the monitored area. Wait 5 seconds (100 frames) for calibration to complete. `is_calibrated()` returns `true`.
4. **Alert integration**: Forward events to your security system:
- `EVENT_PERIMETER_BREACH` (210) -> Trigger alarm siren / camera recording
- `EVENT_APPROACH_DETECTED` (211) -> Pre-alert: someone approaching
- `EVENT_ZONE_TRANSITION` (213) -> Track movement direction through zones
5. **Tuning**: If false alarms occur in windy or high-traffic environments, increase `BREACH_GRADIENT_THRESH` and `VARIANCE_RATIO_THRESH`. If detections are missed, decrease them.
---
### Concealed Metallic Object Detection (`sec_weapon_detect.rs`)
**What it does**: Detects concealed metallic objects (knives, firearms, tools) carried by a person walking through the sensing area. Metal has significantly higher RF reflectivity than human tissue, producing a characteristic amplitude-variance-to-phase-variance ratio.
**How it works**: During calibration (100 frames in an empty room), the detector computes baseline amplitude and phase variance per subcarrier using online variance accumulation. After calibration, running Welford statistics track amplitude and phase variance in real-time. The ratio of running amplitude variance to running phase variance is computed across all subcarriers. Metal produces a high ratio (amplitude swings wildly from specular reflection while phase varies less than diffuse tissue).
Two thresholds are applied:
- **Metal anomaly** (ratio > 4.0, debounce 4 frames): General metallic object detection.
- **Weapon alert** (ratio > 8.0, debounce 6 frames): High-reflectivity alert for larger metal masses.
Detection requires `presence >= 1` and `motion_energy >= 0.5` to avoid false positives on environmental noise.
**Important**: This module is research-grade and experimental. It requires per-environment calibration and should not be used as a sole security measure.
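The running-variance ratio at the heart of the detector can be sketched as a Python analogue (for illustration only; the `eps` guard against division by zero is an assumption):

```python
class Welford:
    """Online mean/variance (Welford's algorithm), O(1) memory."""
    def __init__(self):
        self.n, self.mean, self.m2 = 0, 0.0, 0.0
    def update(self, x):
        self.n += 1
        d = x - self.mean
        self.mean += d / self.n
        self.m2 += d * (x - self.mean)
    def variance(self):
        return self.m2 / self.n if self.n else 0.0

# Metal signature: amplitude variance large relative to phase variance.
def amp_phase_ratio(amp_stats, phase_stats, eps=1e-6):
    return amp_stats.variance() / max(phase_stats.variance(), eps)

amp, ph = Welford(), Welford()
for a, p in zip([0.0, 10.0, 0.0, 10.0], [0.0, 0.1, 0.0, 0.1]):
    amp.update(a)
    ph.update(p)
ratio = amp_phase_ratio(amp, ph)  # large ratio -> metallic signature
```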
#### API
| Item | Type | Description |
|------|------|-------------|
| `WeaponDetector::new()` | `const fn` | Create uncalibrated detector |
| `process_frame(phases, amplitudes, variance, motion_energy, presence)` | `fn` | Process one frame, returns up to 3 events |
| `is_calibrated()` | `fn -> bool` | Whether baseline calibration is complete |
| `frame_count()` | `fn -> u32` | Total frames processed |
#### Events Emitted
| Event ID | Constant | When Emitted |
|----------|----------|--------------|
| 220 | `EVENT_METAL_ANOMALY` | Metallic object signature detected (value = amp/phase ratio) |
| 221 | `EVENT_WEAPON_ALERT` | High-reflectivity metal signature (value = amp/phase ratio) |
| 222 | `EVENT_CALIBRATION_NEEDED` | Baseline drift exceeds threshold (value = max drift ratio) |
#### Configuration
| Parameter | Default | Range | Description |
|-----------|---------|-------|-------------|
| `BASELINE_FRAMES` | 100 | 60-200 | Calibration frames (empty room, 5s at 20 Hz) |
| `METAL_RATIO_THRESH` | 4.0 | 2.0-8.0 | Amp/phase variance ratio for metal detection |
| `WEAPON_RATIO_THRESH` | 8.0 | 5.0-15.0 | Ratio for weapon-grade alert |
| `MIN_MOTION_ENERGY` | 0.5 | 0.2-2.0 | Minimum motion to consider detection valid |
| `METAL_DEBOUNCE` | 4 | 2-10 | Consecutive frames for metal anomaly |
| `WEAPON_DEBOUNCE` | 6 | 3-12 | Consecutive frames for weapon alert |
| `COOLDOWN` | 60 | 20-120 | Frames between events (3s at 20 Hz) |
| `RECALIB_DRIFT_THRESH` | 3.0 | 2.0-5.0 | Drift ratio triggering recalibration alert |
#### Example Usage
```rust
use wifi_densepose_wasm_edge::sec_weapon_detect::*;
let mut detector = WeaponDetector::new();
// Calibrate in empty room (100 frames)
for _ in 0..100 {
detector.process_frame(&phases, &amplitudes, &variance, 0.0, 0);
}
assert!(detector.is_calibrated());
// Normal operation: person walks through
let events = detector.process_frame(&phases, &amplitudes, &variance, motion_energy, presence);
for &(event_id, value) in events {
match event_id {
EVENT_METAL_ANOMALY => {
log!("Metal detected, ratio={:.1}", value);
}
EVENT_WEAPON_ALERT => {
log!("WEAPON ALERT, ratio={:.1}", value);
// Trigger security response
}
EVENT_CALIBRATION_NEEDED => {
log!("Environment changed, recalibration recommended");
}
_ => {}
}
}
```
---
### Tailgating Detection (`sec_tailgating.rs`)
**What it does**: Detects tailgating at doorways -- two or more people passing through in rapid succession. A single authorized passage produces one smooth energy peak; a tailgater following closely produces a second peak within a configurable window (default 3 seconds).
**How it works**: The detector uses temporal clustering of motion energy peaks through a 3-state machine:
1. **Idle**: Waiting for motion energy to exceed the adaptive threshold.
2. **InPeak**: Tracking an active peak. Records peak maximum energy and duration. Peak ends when energy drops below 30% of peak maximum. Noise spikes (peaks shorter than 3 frames) are discarded.
3. **Watching**: Peak ended, monitoring for another peak within the tailgate window (60 frames = 3s). If another peak arrives, it transitions back to InPeak. When the window expires, it evaluates: 1 peak = single passage, 2+ peaks = tailgating.
The threshold adapts to ambient noise via exponential moving average of variance.
#### State Machine
```
Idle ----[energy > threshold]----> InPeak
|
[energy < 30% of peak max]
|
[peak too short] v
Idle <------------------------- InPeak end
|
[peak valid (>= 3 frames)]
v
Watching
/ \
[new peak starts] / \ [window expires]
v v
InPeak Evaluate
/ \
[1 peak] [2+ peaks]
| |
SINGLE_PASSAGE TAILGATE_DETECTED
| + MULTI_PASSAGE
v v
Idle Idle
```
#### API
| Item | Type | Description |
|------|------|-------------|
| `TailgateDetector::new()` | `const fn` | Create detector |
| `process_frame(motion_energy, presence, n_persons, variance)` | `fn` | Process one frame, returns up to 3 events |
| `frame_count()` | `fn -> u32` | Total frames processed |
| `tailgate_count()` | `fn -> u32` | Total tailgating events detected |
| `single_passages()` | `fn -> u32` | Total single passages recorded |
#### Events Emitted
| Event ID | Constant | When Emitted |
|----------|----------|--------------|
| 230 | `EVENT_TAILGATE_DETECTED` | Two or more peaks within window (value = peak count) |
| 231 | `EVENT_SINGLE_PASSAGE` | Single peak followed by quiet window (value = peak energy) |
| 232 | `EVENT_MULTI_PASSAGE` | Three or more peaks within window (value = peak count) |
#### Configuration
| Parameter | Default | Range | Description |
|-----------|---------|-------|-------------|
| `ENERGY_PEAK_THRESH` | 2.0 | 1.0-5.0 | Motion energy threshold for peak start |
| `ENERGY_VALLEY_FRAC` | 0.3 | 0.1-0.5 | Fraction of peak max to end peak |
| `TAILGATE_WINDOW` | 60 | 20-120 | Max inter-peak gap for tailgating (3s at 20 Hz) |
| `MIN_PEAK_ENERGY` | 1.5 | 0.5-3.0 | Minimum peak energy for valid passage |
| `COOLDOWN` | 100 | 40-200 | Frames between events (5s at 20 Hz) |
| `MIN_PEAK_FRAMES` | 3 | 2-10 | Minimum peak duration to filter noise spikes |
| `MAX_PEAKS` | 8 | 4-16 | Maximum peaks tracked in one window |
#### Example Usage
```rust
use wifi_densepose_wasm_edge::sec_tailgating::*;
let mut detector = TailgateDetector::new();
// Process frames from host
let events = detector.process_frame(motion_energy, presence, n_persons, variance_mean);
for &(event_id, value) in events {
match event_id {
EVENT_TAILGATE_DETECTED => {
log!("TAILGATE: {} people in rapid succession", value as u32);
// Lock door / alert security
}
EVENT_SINGLE_PASSAGE => {
log!("Normal passage, energy={:.2}", value);
}
EVENT_MULTI_PASSAGE => {
log!("Multi-passage: {} people", value as u32);
}
_ => {}
}
}
```
---
### Loitering Detection (`sec_loitering.rs`)
**What it does**: Detects prolonged stationary presence in a monitored area. Distinguishes between a person passing through (normal) and someone standing still for an extended time (loitering). Default dwell threshold is 5 minutes.
**How it works**: Uses a 4-state machine that tracks presence duration and motion level. Only stationary frames (motion energy below 0.5) count toward the dwell threshold -- a person actively walking through does not accumulate loitering time. The exit cooldown (30 seconds) prevents false "loitering ended" events from brief signal dropouts or occlusions.
#### State Machine
```
Absent --[presence + no post_end cooldown]--> Entering
|
[60 frames with presence]
|
[absence before 60] v
Absent <------------------------------ Entering confirmed
|
v
Present
/ \
[6000 stationary / \ [absent > 300
frames] / \ frames]
v v
Loitering Absent
/ \
[presence continues] [absent >= 600 frames]
| |
LOITERING_ONGOING LOITERING_END
(every 600 frames) |
| v
v Absent
Loitering (post_end_cd = 200)
```
#### API
| Item | Type | Description |
|------|------|-------------|
| `LoiteringDetector::new()` | `const fn` | Create detector in Absent state |
| `process_frame(presence, motion_energy)` | `fn` | Process one frame, returns up to 2 events |
| `state()` | `fn -> LoiterState` | Current state (Absent/Entering/Present/Loitering) |
| `frame_count()` | `fn -> u32` | Total frames processed |
| `loiter_count()` | `fn -> u32` | Total loitering events |
| `dwell_frames()` | `fn -> u32` | Current accumulated stationary dwell frames |
#### Events Emitted
| Event ID | Constant | When Emitted |
|----------|----------|--------------|
| 240 | `EVENT_LOITERING_START` | Dwell threshold exceeded (value = dwell time in seconds) |
| 241 | `EVENT_LOITERING_ONGOING` | Periodic report while loitering (value = total dwell seconds) |
| 242 | `EVENT_LOITERING_END` | Loiterer departed after exit cooldown (value = total dwell seconds) |
#### Configuration
| Parameter | Default | Range | Description |
|-----------|---------|-------|-------------|
| `ENTER_CONFIRM_FRAMES` | 60 | 20-120 | Presence confirmation (3s at 20 Hz) |
| `DWELL_THRESHOLD` | 6000 | 1200-12000 | Stationary frames for loitering (5 min at 20 Hz) |
| `EXIT_COOLDOWN` | 600 | 200-1200 | Absent frames before ending loitering (30s at 20 Hz) |
| `STATIONARY_MOTION_THRESH` | 0.5 | 0.2-1.5 | Motion energy below which person is stationary |
| `ONGOING_REPORT_INTERVAL` | 600 | 200-1200 | Frames between ongoing reports (30s at 20 Hz) |
| `POST_END_COOLDOWN` | 200 | 100-600 | Cooldown after end before re-detection (10s at 20 Hz) |
#### Example Usage
```rust
use wifi_densepose_wasm_edge::sec_loitering::*;
let mut detector = LoiteringDetector::new();
let events = detector.process_frame(presence, motion_energy);
for &(event_id, value) in events {
match event_id {
EVENT_LOITERING_START => {
log!("Loitering started after {:.0}s", value);
// Alert security
}
EVENT_LOITERING_ONGOING => {
log!("Still loitering, total {:.0}s", value);
}
EVENT_LOITERING_END => {
log!("Loiterer departed after {:.0}s total", value);
}
_ => {}
}
}
// Check state programmatically
if detector.state() == LoiterState::Loitering {
// Continuous monitoring actions
}
```
---
### Panic/Erratic Motion Detection (`sec_panic_motion.rs`)
**What it does**: Detects three categories of distress-related motion:
1. **Panic**: Erratic, high-jerk motion with rapid random direction changes (e.g., someone flailing, being attacked).
2. **Struggle**: Elevated jerk with moderate energy and some direction changes (e.g., physical altercation, trying to break free).
3. **Fleeing**: Sustained high energy with low entropy -- running in one direction.
**How it works**: Maintains a 100-frame (5-second) circular buffer of motion energy and variance values. Computes window-level statistics each frame:
- **Mean jerk**: Average absolute rate-of-change of motion energy across the window. High jerk = erratic, unpredictable motion.
- **Entropy proxy**: Fraction of frames with direction reversals (energy transitions from increasing to decreasing or vice versa). High entropy = chaotic motion.
- **High jerk fraction**: Fraction of individual frame-to-frame jerks exceeding `JERK_THRESH`. Ensures the high mean is not from a single spike.
Detection logic:
- **Panic** = `mean_jerk > 2.0` AND `entropy > 0.35` AND `high_jerk_frac > 0.3`
- **Struggle** = `mean_jerk > 1.5` AND `energy in [1.0, 5.0)` AND `entropy > 0.175` AND not panic
- **Fleeing** = `mean_energy > 5.0` AND `mean_jerk > 0.05` AND `entropy < 0.25` AND not panic
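The three window statistics can be sketched as follows (illustrative Python; the real module computes them incrementally over its circular buffer):

```python
# Sketch of the window statistics feeding the panic/struggle/fleeing rules.
def window_stats(energies, jerk_thresh=2.0):
    jerks = [abs(b - a) for a, b in zip(energies, energies[1:])]
    mean_jerk = sum(jerks) / len(jerks)
    # Entropy proxy: fraction of steps where the energy trend reverses sign
    diffs = [b - a for a, b in zip(energies, energies[1:])]
    reversals = sum(1 for d0, d1 in zip(diffs, diffs[1:]) if d0 * d1 < 0)
    entropy = reversals / len(diffs)
    # Fraction of individual jerks above threshold (rules out a lone spike)
    high_frac = sum(1 for j in jerks if j > jerk_thresh) / len(jerks)
    return mean_jerk, entropy, high_frac

# Erratic flailing: large jerks and a reversal at almost every step,
# so all three panic conditions hold.
mean_jerk, entropy, high_frac = window_stats(
    [0.0, 3.0, 0.0, 3.0, 0.0, 3.0, 0.0, 3.0, 0.0])
```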
#### API
| Item | Type | Description |
|------|------|-------------|
| `PanicMotionDetector::new()` | `const fn` | Create detector |
| `process_frame(motion_energy, variance_mean, phase_mean, presence)` | `fn` | Process one frame, returns up to 3 events |
| `frame_count()` | `fn -> u32` | Total frames processed |
| `panic_count()` | `fn -> u32` | Total panic events detected |
#### Events Emitted
| Event ID | Constant | When Emitted |
|----------|----------|--------------|
| 250 | `EVENT_PANIC_DETECTED` | Erratic high-jerk + high-entropy motion (value = severity 0-10) |
| 251 | `EVENT_STRUGGLE_PATTERN` | Elevated jerk at moderate energy (value = mean jerk) |
| 252 | `EVENT_FLEEING_DETECTED` | Sustained high-energy directional motion (value = mean energy) |
#### Configuration
| Parameter | Default | Range | Description |
|-----------|---------|-------|-------------|
| `WINDOW` | 100 | 40-200 | Analysis window size (5s at 20 Hz) |
| `JERK_THRESH` | 2.0 | 1.0-4.0 | Per-frame jerk threshold for panic |
| `ENTROPY_THRESH` | 0.35 | 0.2-0.6 | Direction reversal rate threshold |
| `MIN_MOTION` | 1.0 | 0.3-2.0 | Minimum motion energy (ignore idle) |
| `TRIGGER_FRAC` | 0.3 | 0.2-0.5 | Fraction of window frames exceeding thresholds |
| `COOLDOWN` | 100 | 40-200 | Frames between events (5s at 20 Hz) |
| `FLEE_ENERGY_THRESH` | 5.0 | 3.0-10.0 | Minimum energy for fleeing detection |
| `FLEE_JERK_THRESH` | 0.05 | 0.01-0.5 | Minimum jerk for fleeing (above noise floor) |
| `FLEE_MAX_ENTROPY` | 0.25 | 0.1-0.4 | Maximum entropy for fleeing (directional motion) |
| `STRUGGLE_JERK_THRESH` | 1.5 | 0.8-3.0 | Minimum mean jerk for struggle pattern |
#### Example Usage
```rust
use wifi_densepose_wasm_edge::sec_panic_motion::*;
let mut detector = PanicMotionDetector::new();
let events = detector.process_frame(motion_energy, variance_mean, phase_mean, presence);
for &(event_id, value) in events {
match event_id {
EVENT_PANIC_DETECTED => {
log!("PANIC: severity={:.1}", value);
// Immediate security dispatch
}
EVENT_STRUGGLE_PATTERN => {
log!("Struggle detected, jerk={:.2}", value);
// Investigate
}
EVENT_FLEEING_DETECTED => {
log!("Person fleeing, energy={:.1}", value);
// Track direction via perimeter module
}
_ => {}
}
}
```
---
## Event ID Registry (Security Range 200-299)
| Range | Module | Events |
|-------|--------|--------|
| 200-203 | `intrusion.rs` | INTRUSION_ALERT, INTRUSION_ZONE, INTRUSION_ARMED, INTRUSION_DISARMED |
| 210-213 | `sec_perimeter_breach.rs` | PERIMETER_BREACH, APPROACH_DETECTED, DEPARTURE_DETECTED, ZONE_TRANSITION |
| 220-222 | `sec_weapon_detect.rs` | METAL_ANOMALY, WEAPON_ALERT, CALIBRATION_NEEDED |
| 230-232 | `sec_tailgating.rs` | TAILGATE_DETECTED, SINGLE_PASSAGE, MULTI_PASSAGE |
| 240-242 | `sec_loitering.rs` | LOITERING_START, LOITERING_ONGOING, LOITERING_END |
| 250-252 | `sec_panic_motion.rs` | PANIC_DETECTED, STRUGGLE_PATTERN, FLEEING_DETECTED |
| 253-299 | | Reserved for future security modules |
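A host-side helper for pulling just the security range out of a mixed event stream (illustrative; mirrors the server's `type=` query filtering):

```python
# Keep only events whose ID falls in the security range 200-299.
def security_events(events, lo=200, hi=299):
    return [(eid, val) for eid, val in events if lo <= eid <= hi]

mixed = [(100, 1.0), (210, 3.2), (440, 2.0), (250, 7.5)]
alerts = security_events(mixed)  # [(210, 3.2), (250, 7.5)]
```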
---
## Testing
```bash
# Run all security module tests (requires std feature)
cd rust-port/wifi-densepose-rs/crates/wifi-densepose-wasm-edge
cargo test --features std -- sec_ intrusion  # two name filters: sec_* plus intrusion tests
```
### Test Coverage Summary
| Module | Tests | Coverage Notes |
|--------|-------|----------------|
| `intrusion.rs` | 4 | Init, calibration, arming, intrusion detection |
| `sec_perimeter_breach.rs` | 6 | Init, calibration, breach, zone transition, approach, quiet signal |
| `sec_weapon_detect.rs` | 6 | Init, calibration, no presence, metal anomaly, normal person, drift recalib |
| `sec_tailgating.rs` | 7 | Init, single passage, tailgate, wide spacing, noise spike, multi-passage, low energy |
| `sec_loitering.rs` | 7 | Init, entering, cancel, loitering start/ongoing/end, brief absence, moving person |
| `sec_panic_motion.rs` | 7 | Init, window fill, calm motion, panic, no presence, fleeing, struggle, low motion |
---
## Deployment Considerations
### Coverage Area per Sensor
Each ESP32-S3 with a WiFi AP link covers a single sensing path. The coverage area depends on:
- **Distance**: 1-10 meters between ESP32 and AP (optimal: 3-5 meters for indoor).
- **Width**: First Fresnel zone width -- approximately 0.5-1.5 meters at 5 GHz.
- **Through-wall**: WiFi CSI penetrates drywall and wood but attenuates through concrete/metal. Signal quality degrades beyond one wall.
### Multi-Sensor Coordination
For larger areas, deploy multiple ESP32 sensors in a mesh:
- Each sensor runs its own WASM module instance independently.
- The aggregator server (`wifi-densepose-sensing-server`) collects events from all sensors.
- Cross-sensor correlation (e.g., tracking a person across zones) is done server-side, not on-device.
- Use `EVENT_ZONE_TRANSITION` (213) from perimeter breach to correlate movement across adjacent sensors.
### False Alarm Reduction
1. **Calibration**: Always calibrate in the intended operating conditions (time of day, HVAC state, door positions).
2. **Threshold tuning**: Start with defaults, increase thresholds if false alarms occur, decrease if detections are missed.
3. **Debounce tuning**: Increase debounce counters in high-noise environments (near HVAC vents, open windows).
4. **Multi-module correlation**: Require 2+ modules to agree before triggering high-severity responses. For example: perimeter breach + panic motion = confirmed threat; perimeter breach alone = investigation.
5. **Time-of-day filtering**: Server-side logic can suppress certain events during business hours (e.g., single passages are normal during the day).
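Rule 4 can be sketched server-side as follows (event IDs 210 and 250 come from the registry above; the 5-second window and function name are illustrative choices):

```python
# Escalate only when perimeter breach (210) and panic motion (250) agree
# within a short window; a lone breach is merely flagged for review.
def severity(timed_events, window_s=5.0):
    breaches = [t for t, eid in timed_events if eid == 210]
    panics = [t for t, eid in timed_events if eid == 250]
    if any(abs(tb - tp) <= window_s for tb in breaches for tp in panics):
        return "confirmed-threat"
    return "investigate" if breaches else "normal"
```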
### Integration with Existing Security Systems
- **Event forwarding**: Events are emitted via `csi_emit_event()` to the host firmware, which packs them into UDP packets sent to the aggregator.
- **REST API**: The sensing server exposes events at `/api/v1/sensing/events` for integration with SIEM, VMS, or access control systems.
- **Webhook support**: Configure the server to POST event payloads to external endpoints.
- **MQTT**: For IoT integration, events can be published to MQTT topics (one per event type or per sensor).
### Resource Usage on ESP32-S3
| Resource | Budget | Notes |
|----------|--------|-------|
| RAM | ~2-4 KB per module | Static buffers, no heap allocation |
| CPU | <5 ms per frame (S budget) | Well within 50 ms frame budget at 20 Hz |
| Flash | ~3-8 KB WASM per module | Compiled with `opt-level = "s"` and LTO |
| Total (6 modules) | ~15-25 KB RAM, ~30 KB Flash | Fits in 925 KB firmware with headroom |
# Signal Intelligence Modules -- WiFi-DensePose Edge Intelligence
> Real-time WiFi signal analysis and enhancement running directly on the ESP32 chip. These modules clean, compress, and extract features from raw WiFi channel data so that higher-level modules (health, security, etc.) get better input.
## Overview
| Module | File | What It Does | Event IDs | Budget |
|--------|------|-------------|-----------|--------|
| Flash Attention | `sig_flash_attention.rs` | Focuses processing on the most informative subcarrier groups | 700-702 | S (<5 ms) |
| Coherence Gate | `sig_coherence_gate.rs` | Filters out noisy/corrupted CSI frames using phase coherence | 710-712 | L (<2 ms) |
| Temporal Compress | `sig_temporal_compress.rs` | Stores CSI history in 3-tier compressed circular buffer | 705-707 | S (<5 ms) |
| Sparse Recovery | `sig_sparse_recovery.rs` | Recovers dropped subcarriers using ISTA sparse optimization | 715-717 | H (<10 ms) |
| Min-Cut Person Match | `sig_mincut_person_match.rs` | Maintains stable person IDs across frames using bipartite matching | 720-722 | H (<10 ms) |
| Optimal Transport | `sig_optimal_transport.rs` | Detects subtle motion via sliced Wasserstein distance | 725-727 | S (<5 ms) |
## How Signal Processing Fits In
The signal intelligence modules form a processing pipeline between raw CSI data and application-level modules:
```
Raw CSI from WiFi chipset (Tier 0-2 firmware DSP)
|
v
+---------------------+ +---------------------+
| Coherence Gate | --> | Sparse Recovery |
| Reject noisy frames, | | Fill in dropped |
| gate quality levels | | subcarriers via ISTA |
+---------------------+ +---------------------+
| |
v v
+---------------------+ +---------------------+
| Flash Attention | | Temporal Compress |
| Focus on informative | | Store CSI history |
| subcarrier groups | | at 3 quality tiers |
+---------------------+ +---------------------+
| |
v v
+---------------------+ +---------------------+
| Min-Cut Person Match | | Optimal Transport |
| Track person IDs | | Detect subtle motion |
| across frames | | via distribution |
+---------------------+ +---------------------+
| |
v v
Application modules: Health, Security, Smart Building, etc.
```
The **Coherence Gate** acts as a quality filter at the top of the pipeline. Frames that pass the gate feed into the **Sparse Recovery** module (if subcarrier dropout is detected) and then into downstream analysis. **Flash Attention** identifies which spatial regions carry the most signal, while **Temporal Compress** maintains an efficient rolling history. **Min-Cut Person Match** and **Optimal Transport** extract higher-level features (person identity and motion) that application modules consume.
## Shared Utilities (`vendor_common.rs`)
All signal intelligence modules share these utilities from `vendor_common.rs`:
| Utility | Purpose |
|---------|---------|
| `CircularBuffer<N>` | Fixed-size ring buffer for phase history, stack-allocated |
| `Ema` | Exponential moving average with configurable alpha |
| `WelfordStats` | Online mean/variance/stddev in O(1) memory |
| `dot_product`, `l2_norm`, `cosine_similarity` | Fixed-size vector math |
| `dtw_distance`, `dtw_distance_banded` | Dynamic Time Warping for gesture/pattern matching |
| `FixedPriorityQueue<CAP>` | Top-K selection without heap allocation |
---
## Modules
### Flash Attention (`sig_flash_attention.rs`)
**What it does**: Focuses processing on the WiFi channels that carry the most useful information -- ignores noise. Divides 32 subcarriers into 8 groups and computes attention weights showing where signal activity is concentrated.
**Algorithm**: Tiled attention (Q*K/sqrt(d)) over 8 subcarrier groups with softmax normalization and Shannon entropy tracking.
1. Compute group means: Q = current phase per group, K = previous phase per group, V = amplitude per group
2. Score each group: `score[g] = Q[g] * K[g] / sqrt(8)`
3. Softmax normalization (numerically stable: subtract max before exp)
4. Track entropy H = -sum(p * ln(p)) via EMA smoothing
Low entropy means activity is focused in one spatial zone (a Fresnel region); high entropy means activity is spread uniformly.
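The scoring and entropy steps above can be sketched in a few lines of plain Rust. This is a host-side illustration (std, heap-free), not the `FlashAttention` struct itself; the Q/K assignment and constants follow the text.

```rust
// Numerically stable softmax over the 8 group scores: subtract the max before exp.
fn softmax8(scores: &[f32; 8]) -> [f32; 8] {
    let max = scores.iter().cloned().fold(f32::MIN, f32::max);
    let mut w = [0.0f32; 8];
    let mut sum = 0.0;
    for g in 0..8 {
        w[g] = (scores[g] - max).exp();
        sum += w[g];
    }
    for g in 0..8 {
        w[g] /= sum;
    }
    w
}

// score[g] = Q[g] * K[g] / sqrt(8), then softmax -> attention weights.
fn attention_weights(q: &[f32; 8], k: &[f32; 8]) -> [f32; 8] {
    let mut scores = [0.0f32; 8];
    for g in 0..8 {
        scores[g] = q[g] * k[g] / (8.0f32).sqrt();
    }
    softmax8(&scores)
}

// Shannon entropy H = -sum(p * ln p): 0 = fully focused, ln(8) = uniform.
fn entropy(w: &[f32; 8]) -> f32 {
    -w.iter().filter(|&&p| p > 0.0).map(|&p| p * p.ln()).sum::<f32>()
}

fn main() {
    // Equal activity in every group -> uniform weights, maximum entropy ln(8).
    let uniform = attention_weights(&[0.1; 8], &[0.1; 8]);
    assert!((entropy(&uniform) - (8.0f32).ln()).abs() < 1e-4);

    // One strongly active group -> attention collapses onto it, entropy drops.
    let mut q = [0.1f32; 8];
    let mut k = [0.1f32; 8];
    q[3] = 5.0;
    k[3] = 5.0;
    let focused = attention_weights(&q, &k);
    assert!(entropy(&focused) < entropy(&uniform));
    let peak = focused
        .iter()
        .enumerate()
        .max_by(|a, b| a.1.partial_cmp(b.1).unwrap())
        .unwrap()
        .0;
    assert_eq!(peak, 3);
}
```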
#### Public API
```rust
pub struct FlashAttention { /* ... */ }
impl FlashAttention {
pub const fn new() -> Self;
pub fn process_frame(&mut self, phases: &[f32], amplitudes: &[f32]) -> &[(i32, f32)];
pub fn weights(&self) -> &[f32; 8]; // Current attention weights per group
pub fn entropy(&self) -> f32; // EMA-smoothed entropy [0, ln(8)]
pub fn peak_group(&self) -> usize; // Group index with highest weight
pub fn centroid(&self) -> f32; // Weighted centroid position [0, 7]
pub fn frame_count(&self) -> u32;
pub fn reset(&mut self);
}
```
#### Events
| ID | Name | Value | Meaning |
|----|------|-------|---------|
| 700 | `ATTENTION_PEAK_SC` | Group index (0-7) | Which subcarrier group has the strongest attention weight |
| 701 | `ATTENTION_SPREAD` | Entropy (0 to ~2.08) | How spread out the attention is (low = focused, high = uniform) |
| 702 | `SPATIAL_FOCUS_ZONE` | Centroid (0.0-7.0) | Weighted center of attention across groups |
#### Configuration
| Constant | Value | Purpose |
|----------|-------|---------|
| `N_GROUPS` | 8 | Number of subcarrier groups (tiles) |
| `MAX_SC` | 32 | Maximum subcarriers processed |
| `ENTROPY_ALPHA` | 0.15 | EMA smoothing factor for entropy |
#### Tutorial: Understanding Attention Weights
The 8 attention weights sum to 1.0. When a person stands in a particular area of the room, the WiFi signal changes most in the subcarrier group(s) whose Fresnel zones intersect that area.
- **All weights near 0.125 (= 1/8)**: Uniform attention. No localized activity -- either an empty room or whole-body motion affecting all subcarriers equally.
- **One weight near 1.0, others near 0.0**: Highly focused. Activity concentrated in one spatial zone. The `peak_group` index tells you which zone.
- **Two adjacent groups elevated**: Activity at the boundary between two spatial zones, or a person moving between them.
- **Entropy below 1.0**: Strong spatial focus. Good for zone-level localization.
- **Entropy above 1.8**: Nearly uniform. Hard to localize activity.
The `centroid` value (0.0 to 7.0) gives a weighted average position. Tracking centroid over time reveals motion direction across the room.
---
### Coherence Gate (`sig_coherence_gate.rs`)
**What it does**: Decides whether each incoming CSI frame is trustworthy enough to use for sensing, or should be discarded. Uses the statistical consistency of phase changes across subcarriers to measure signal quality.
**Algorithm**: Per-subcarrier phase deltas form unit phasors (cos + i*sin). The magnitude of the mean phasor is the coherence score [0,1]. Welford online statistics track mean/variance for Z-score computation. A hysteresis state machine prevents rapid oscillation between states.
State transitions:
- Accept -> PredictOnly: 5 consecutive frames below LOW_THRESHOLD (0.40)
- PredictOnly -> Reject: single frame below threshold
- Reject/PredictOnly -> Accept: 10 consecutive frames above HIGH_THRESHOLD (0.75)
- Any -> Recalibrate: running variance exceeds 4x the initial snapshot
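The core coherence computation can be sketched as follows. This is a minimal host-side illustration of the phasor math only; the Welford statistics and hysteresis state machine described above are omitted.

```rust
// Coherence of per-subcarrier phase deltas: each delta becomes a unit
// phasor (cos, sin); the magnitude of the mean phasor is the score in [0, 1].
fn coherence(prev: &[f32], curr: &[f32]) -> f32 {
    let n = prev.len().min(curr.len());
    let (mut re, mut im) = (0.0f32, 0.0f32);
    for i in 0..n {
        let d = curr[i] - prev[i];
        re += d.cos();
        im += d.sin();
    }
    let (re, im) = (re / n as f32, im / n as f32);
    (re * re + im * im).sqrt()
}

fn main() {
    let pi = std::f32::consts::PI;
    let prev = [0.0f32; 8];

    // All subcarriers rotate by the same amount -> phasors align -> coherence ~1.
    let shifted = [0.3f32; 8];
    assert!((coherence(&prev, &shifted) - 1.0).abs() < 1e-5);

    // Deltas alternating between 0 and pi cancel out -> coherence near 0.
    let noisy = [0.0, pi, 0.0, pi, 0.0, pi, 0.0, pi];
    assert!(coherence(&prev, &noisy) < 0.1);
}
```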
#### Public API
```rust
pub struct CoherenceGate { /* ... */ }
impl CoherenceGate {
pub const fn new() -> Self;
pub fn process_frame(&mut self, phases: &[f32]) -> &[(i32, f32)];
pub fn gate(&self) -> GateDecision; // Accept/PredictOnly/Reject/Recalibrate
pub fn coherence(&self) -> f32; // Last coherence score [0, 1]
pub fn zscore(&self) -> f32; // Z-score of last coherence
pub fn variance(&self) -> f32; // Running variance of coherence
pub fn frame_count(&self) -> u32;
pub fn reset(&mut self);
}
pub enum GateDecision { Accept, PredictOnly, Reject, Recalibrate }
```
#### Events
| ID | Name | Value | Meaning |
|----|------|-------|---------|
| 710 | `GATE_DECISION` | 2/1/0/-1 | Accept(2), PredictOnly(1), Reject(0), Recalibrate(-1) |
| 711 | `COHERENCE_SCORE` | [0.0, 1.0] | Phase phasor coherence magnitude |
| 712 | `RECALIBRATE_NEEDED` | Variance | Environment has changed significantly -- retrain baseline |
#### Configuration
| Constant | Value | Purpose |
|----------|-------|---------|
| `HIGH_THRESHOLD` | 0.75 | Coherence above this = good quality |
| `LOW_THRESHOLD` | 0.40 | Coherence below this = poor quality |
| `DEGRADE_COUNT` | 5 | Consecutive bad frames before degrading |
| `RECOVER_COUNT` | 10 | Consecutive good frames before recovering |
| `VARIANCE_DRIFT_MULT` | 4.0 | Variance multiplier triggering recalibrate |
#### Tutorial: Using the Coherence Gate
The coherence gate protects downstream modules from processing garbage data. In practice:
1. **Accept** (value=2): Frame is clean. Use it for all sensing tasks (vitals, presence, gestures).
2. **PredictOnly** (value=1): Frame quality is marginal. Use cached predictions from previous frames; do not update models.
3. **Reject** (value=0): Frame is too noisy. Skip entirely. Do not feed to any learning module.
4. **Recalibrate** (value=-1): The environment has changed fundamentally (furniture moved, new AP, door opened). Reset baselines and re-learn.
Common causes of low coherence:
- Microwave oven running (2.4 GHz interference)
- Multiple people walking in different directions (phase cancellation)
- Hardware glitch (intermittent antenna contact)
---
### Temporal Compress (`sig_temporal_compress.rs`)
**What it does**: Maintains a rolling history of up to 512 CSI snapshots in compressed form. Recent data is stored at high precision; older data is progressively compressed to save memory while retaining long-term trends.
**Algorithm**: Three-tier quantization with automatic demotion at age boundaries.
| Tier | Age Range | Bits | Quantization Levels | Max Error |
|------|-----------|------|---------------------|-----------|
| Hot | 0-63 (newest) | 8-bit | 256 | <0.5% |
| Warm | 64-255 | 5-bit | 32 | <3% |
| Cold | 256-511 | 3-bit | 8 | <15% |
At 20 Hz, the buffer stores approximately:
- Hot: 3.2 seconds of high-fidelity data
- Warm: 9.6 seconds of medium-fidelity data
- Cold: 12.8 seconds of low-fidelity data
- Total: ~25.6 seconds, or longer at lower frame rates
Each snapshot stores 8 phase + 8 amplitude values (group means), plus a scale factor and tier tag.
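The tiered quantization can be illustrated with a uniform quantizer over a symmetric range. This sketch is not the module's exact packing format; it only demonstrates why 256/32/8 levels yield the error bounds in the tier table.

```rust
// Quantize a value into `levels` uniform steps over [-scale, scale].
fn quantize(x: f32, scale: f32, levels: u32) -> u32 {
    let norm = ((x / scale).clamp(-1.0, 1.0) + 1.0) / 2.0; // map to [0, 1]
    (norm * (levels - 1) as f32).round() as u32
}

fn dequantize(q: u32, scale: f32, levels: u32) -> f32 {
    (q as f32 / (levels - 1) as f32) * 2.0 * scale - scale
}

// Worst-case round-trip error over a sweep of the range, relative to full scale.
fn max_relative_error(levels: u32, scale: f32) -> f32 {
    let mut worst = 0.0f32;
    let mut x = -scale;
    while x <= scale {
        let r = dequantize(quantize(x, scale, levels), scale, levels);
        worst = worst.max((r - x).abs());
        x += scale / 500.0;
    }
    worst / (2.0 * scale)
}

fn main() {
    // Hot (8-bit, 256 levels), Warm (5-bit, 32), Cold (3-bit, 8):
    // errors stay inside the <0.5% / <3% / <15% budgets from the tier table.
    assert!(max_relative_error(256, 1.0) < 0.005);
    assert!(max_relative_error(32, 1.0) < 0.03);
    assert!(max_relative_error(8, 1.0) < 0.15);
}
```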
#### Public API
```rust
pub struct TemporalCompressor { /* ... */ }
impl TemporalCompressor {
pub const fn new() -> Self;
pub fn push_frame(&mut self, phases: &[f32], amps: &[f32], ts_ms: u32) -> &[(i32, f32)];
pub fn on_timer(&mut self) -> &[(i32, f32)];
pub fn get_snapshot(&self, age: usize) -> Option<[f32; 16]>; // Decompressed 8 phase + 8 amp
pub fn compression_ratio(&self) -> f32;
pub fn frame_rate(&self) -> f32;
pub fn total_written(&self) -> u32;
pub fn occupied(&self) -> usize;
}
```
#### Events
| ID | Name | Value | Meaning |
|----|------|-------|---------|
| 705 | `COMPRESSION_RATIO` | Ratio (>1.0) | Raw bytes / compressed bytes |
| 706 | `TIER_TRANSITION` | Tier (1 or 2) | A snapshot was demoted to Warm(1) or Cold(2) |
| 707 | `HISTORY_DEPTH_HOURS` | Hours | How much wall-clock time the buffer covers |
#### Configuration
| Constant | Value | Purpose |
|----------|-------|---------|
| `CAP` | 512 | Total snapshot capacity |
| `HOT_END` | 64 | First N snapshots at 8-bit precision |
| `WARM_END` | 256 | Snapshots 64-255 at 5-bit precision |
| `RATE_ALPHA` | 0.05 | EMA alpha for frame rate estimation |
---
### Sparse Recovery (`sig_sparse_recovery.rs`)
**What it does**: When WiFi hardware drops some subcarrier measurements (nulls/zeros due to deep fades, firmware glitches, or multipath nulls), this module reconstructs the missing values using mathematical optimization.
**Algorithm**: Iterative Shrinkage-Thresholding Algorithm (ISTA) -- an L1-minimizing sparse recovery method.
```
x_{k+1} = soft_threshold(x_k + step * A^T * (b - A*x_k), lambda)
```
where:
- `A` is a tridiagonal correlation model (diagonal + immediate neighbors, 96 f32s instead of full 32x32=1024)
- `b` is the observed (non-null) subcarrier values
- `soft_threshold(x, t) = sign(x) * max(|x| - t, 0)` promotes sparsity
- Maximum 10 iterations per frame
The correlation model is learned online from valid frames using EMA-blended products.
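A host-side sketch of the ISTA update above, using a dense 8x8 stand-in for the tridiagonal model (the module stores only the three bands). The `tridiag` helper and the chosen step/lambda values are illustrative, not the crate's API.

```rust
const N: usize = 8;

// Dense stand-in for the tridiagonal correlation model A (diag + neighbors).
fn tridiag(diag: f32, off: f32) -> [[f32; N]; N] {
    let mut a = [[0.0f32; N]; N];
    for i in 0..N {
        a[i][i] = diag;
        if i + 1 < N {
            a[i][i + 1] = off;
            a[i + 1][i] = off;
        }
    }
    a
}

fn matvec(a: &[[f32; N]; N], x: &[f32; N]) -> [f32; N] {
    let mut y = [0.0f32; N];
    for i in 0..N {
        for j in 0..N {
            y[i] += a[i][j] * x[j];
        }
    }
    y
}

// sign(x) * max(|x| - t, 0): the L1 proximal step that promotes sparsity.
fn soft_threshold(x: f32, t: f32) -> f32 {
    if x.abs() <= t { 0.0 } else { x - t * x.signum() }
}

// x_{k+1} = soft_threshold(x_k + step * A^T (b - A x_k), lambda)
fn ista(a: &[[f32; N]; N], b: &[f32; N], step: f32, lambda: f32, iters: usize) -> [f32; N] {
    let mut x = [0.0f32; N];
    for _ in 0..iters {
        let ax = matvec(a, &x);
        let mut next = [0.0f32; N];
        for i in 0..N {
            let mut grad = 0.0;
            for j in 0..N {
                grad += a[j][i] * (b[j] - ax[j]); // A^T (b - Ax)
            }
            next[i] = soft_threshold(x[i] + step * grad, lambda);
        }
        x = next;
    }
    x
}

fn main() {
    // Ground truth: a sparse profile with two active subcarriers.
    let a = tridiag(1.0, 0.25);
    let mut x_true = [0.0f32; N];
    x_true[2] = 1.0;
    x_true[5] = -0.8;
    let b = matvec(&a, &x_true);

    let x = ista(&a, &b, 0.5, 0.01, 100);
    let r = matvec(&a, &x);
    let residual: f32 = b.iter().zip(&r).map(|(u, v)| (u - v) * (u - v)).sum::<f32>().sqrt();
    assert!(residual < 0.1);
    assert!((x[2] - 1.0).abs() < 0.15);
    assert!((x[5] + 0.8).abs() < 0.15);
}
```

The on-device version replaces the dense matrix with the 96-value banded model and caps iterations at 10 per frame; more iterations are used here only so the toy example converges tightly.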
#### Public API
```rust
pub struct SparseRecovery { /* ... */ }
impl SparseRecovery {
pub const fn new() -> Self;
pub fn process_frame(&mut self, amplitudes: &mut [f32]) -> &[(i32, f32)];
pub fn dropout_rate(&self) -> f32; // Fraction of null subcarriers
pub fn last_residual_norm(&self) -> f32; // L2 residual from last recovery
pub fn last_recovered_count(&self) -> u32; // How many subcarriers were recovered
pub fn is_initialized(&self) -> bool; // Whether correlation model is ready
}
```
Note: `process_frame` modifies `amplitudes` in place -- null subcarriers are overwritten with recovered values.
#### Events
| ID | Name | Value | Meaning |
|----|------|-------|---------|
| 715 | `RECOVERY_COMPLETE` | Count | Number of subcarriers recovered |
| 716 | `RECOVERY_ERROR` | L2 norm | Residual error of the recovery |
| 717 | `DROPOUT_RATE` | Fraction [0,1] | Fraction of null subcarriers (emitted every 20 frames) |
#### Configuration
| Constant | Value | Purpose |
|----------|-------|---------|
| `NULL_THRESHOLD` | 0.001 | Amplitude below this = dropped out |
| `MIN_DROPOUT_RATE` | 0.10 | Minimum dropout fraction to trigger recovery |
| `MAX_ITERATIONS` | 10 | ISTA iteration cap per frame |
| `STEP_SIZE` | 0.05 | Gradient descent learning rate |
| `LAMBDA` | 0.01 | L1 sparsity penalty weight |
| `CORR_ALPHA` | 0.05 | EMA alpha for correlation model updates |
#### Tutorial: When Recovery Kicks In
1. The module needs at least 10 fully valid frames to initialize the correlation model (`is_initialized() == true`).
2. Recovery only triggers when dropout exceeds 10% (e.g., 4+ of 32 subcarriers are null).
3. Below 10%, the nulls are too sparse to warrant recovery overhead.
4. The tridiagonal correlation model exploits the fact that adjacent WiFi subcarriers are highly correlated. A null at subcarrier 15 can be estimated from subcarriers 14 and 16.
5. Monitor `RECOVERY_ERROR` -- a rising residual suggests the correlation model is stale and the environment has changed.
---
### Min-Cut Person Match (`sig_mincut_person_match.rs`)
**What it does**: Maintains stable identity labels for up to 4 people in the sensing area. When people move around, their WiFi signatures change position -- this module tracks which signature belongs to which person across consecutive frames.
**Algorithm**: Inspired by `ruvector-mincut` (DynamicPersonMatcher). Each frame:
1. **Feature extraction**: For each detected person, extract the top-8 subcarrier variances (sorted descending) from their spatial region. This produces an 8D signature vector.
2. **Cost matrix**: Compute L2 distances between all current features and all stored signatures.
3. **Greedy assignment**: Pick the minimum-cost (detection, slot) pair, mark both as used, repeat. A simplified stand-in for the Hungarian algorithm; greedy is not guaranteed optimal in general, but with at most 4 persons it is cheap and rarely differs from the optimal assignment.
4. **Signature update**: Blend new features into stored signatures via EMA (alpha=0.15).
5. **Timeout**: Release slots after 100 frames of absence.
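Step 3 (greedy assignment) can be sketched as follows. The function name and the 4x4 fixed sizes mirror the text; it is an illustration, not the `PersonMatcher` internals.

```rust
// Greedy minimum-cost assignment: repeatedly take the cheapest unused
// (detection, slot) pair until nothing under `max_dist` remains.
// Returns assigned[detection] = slot index, or usize::MAX if unmatched.
fn greedy_assign(cost: &[[f32; 4]], n_det: usize, n_slots: usize, max_dist: f32) -> [usize; 4] {
    let mut assigned = [usize::MAX; 4];
    let mut det_used = [false; 4];
    let mut slot_used = [false; 4];
    loop {
        let (mut best, mut bd, mut bs) = (f32::INFINITY, 0, 0);
        for d in 0..n_det {
            for s in 0..n_slots {
                if !det_used[d] && !slot_used[s] && cost[d][s] < best {
                    best = cost[d][s];
                    bd = d;
                    bs = s;
                }
            }
        }
        if best > max_dist || best == f32::INFINITY {
            break; // remaining pairs are too far apart to be the same person
        }
        assigned[bd] = bs;
        det_used[bd] = true;
        slot_used[bs] = true;
    }
    assigned
}

fn main() {
    // Two detections, two stored signatures: detection 0 clearly matches
    // slot 1, and detection 1 matches slot 0 (identities crossed paths).
    let cost = [
        [3.0, 0.2, 0.0, 0.0],
        [0.4, 2.5, 0.0, 0.0],
        [0.0; 4],
        [0.0; 4],
    ];
    let a = greedy_assign(&cost, 2, 2, 5.0);
    assert_eq!(a[0], 1);
    assert_eq!(a[1], 0);
}
```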
#### Public API
```rust
pub struct PersonMatcher { /* ... */ }
impl PersonMatcher {
pub const fn new() -> Self;
pub fn process_frame(&mut self, amplitudes: &[f32], variances: &[f32], n_persons: usize) -> &[(i32, f32)];
pub fn active_persons(&self) -> u8;
pub fn total_swaps(&self) -> u32;
pub fn is_person_stable(&self, slot: usize) -> bool;
pub fn person_signature(&self, slot: usize) -> Option<&[f32; 8]>;
}
```
#### Events
| ID | Name | Value | Meaning |
|----|------|-------|---------|
| 720 | `PERSON_ID_ASSIGNED` | person_id + confidence*0.01 | Which slot was assigned (integer part) and match confidence (fractional part) |
| 721 | `PERSON_ID_SWAP` | prev*16 + curr | An identity swap was detected (prev and curr slot indices encoded) |
| 722 | `MATCH_CONFIDENCE` | [0.0, 1.0] | Average matching confidence across all detected persons (emitted every 10 frames) |
#### Configuration
| Constant | Value | Purpose |
|----------|-------|---------|
| `MAX_PERSONS` | 4 | Maximum simultaneous person tracks |
| `FEAT_DIM` | 8 | Signature vector dimension |
| `SIG_ALPHA` | 0.15 | EMA blending factor for signature updates |
| `MAX_MATCH_DISTANCE` | 5.0 | L2 distance threshold for valid match |
| `STABLE_FRAMES` | 10 | Frames before a track is considered stable |
| `ABSENT_TIMEOUT` | 100 | Frames of absence before slot release (~5s at 20Hz) |
---
### Optimal Transport (`sig_optimal_transport.rs`)
**What it does**: Detects subtle motion that traditional variance-based detectors miss. Computes how much the overall shape of the WiFi signal distribution changes between frames, even when the total power stays constant.
**Algorithm**: Sliced Wasserstein distance -- a computationally efficient approximation to the full Wasserstein (earth mover's) distance.
1. Generate 4 fixed random projection directions (deterministic LCG PRNG, const-computed at compile time)
2. Project both current and previous amplitude vectors onto each direction
3. Sort the projected values (Shell sort with Ciura gaps, O(n^1.3))
4. Compute 1D Wasserstein-1 distance between sorted projections (just mean absolute difference)
5. Average across all 4 projections
6. Smooth via EMA and compare against thresholds
**Subtle motion detection**: When the Wasserstein distance is elevated (distribution shape changed) but the variance is stable (total power unchanged), something moved without creating obvious disturbance -- e.g., slow hand motion, breathing, or a door slowly closing.
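Steps 2-5 can be sketched as below. One reading of "project onto each direction" is assumed here: each subcarrier amplitude is weighted by the matching direction component, giving per-subcarrier projected values to sort. The fixed example directions stand in for the module's LCG-generated ones.

```rust
// 1D Wasserstein-1 between equal-size samples: sort both, mean |difference|.
fn wasserstein_1d(a: &mut [f32; 8], b: &mut [f32; 8]) -> f32 {
    a.sort_by(|x, y| x.partial_cmp(y).unwrap());
    b.sort_by(|x, y| x.partial_cmp(y).unwrap());
    a.iter().zip(b.iter()).map(|(x, y)| (x - y).abs()).sum::<f32>() / 8.0
}

// Average the 1D distances over the projection directions.
fn sliced_wasserstein(curr: &[f32; 8], prev: &[f32; 8], dirs: &[[f32; 8]]) -> f32 {
    let mut total = 0.0;
    for d in dirs {
        let mut pa = [0.0f32; 8];
        let mut pb = [0.0f32; 8];
        for i in 0..8 {
            pa[i] = curr[i] * d[i];
            pb[i] = prev[i] * d[i];
        }
        total += wasserstein_1d(&mut pa, &mut pb);
    }
    total / dirs.len() as f32
}

fn main() {
    let dirs = [[1.0f32; 8], [1.0, -1.0, 1.0, -1.0, 1.0, -1.0, 1.0, -1.0]];
    let base = [0.5f32, 0.6, 0.5, 0.4, 0.5, 0.6, 0.5, 0.4];

    // Identical frames -> zero distance.
    assert!(sliced_wasserstein(&base, &base, &dirs) < 1e-6);

    // Energy shifts toward one end of the band: both frames sum to 4.0
    // (same total power), yet the distribution shape clearly changed.
    let shifted = [0.8f32, 0.9, 0.8, 0.7, 0.2, 0.3, 0.2, 0.1];
    assert!(sliced_wasserstein(&shifted, &base, &dirs) > 0.1);
}
```

The second assertion is exactly the "subtle motion" case: total power is unchanged, so a variance detector stays quiet, but the Wasserstein distance responds.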
#### Public API
```rust
pub struct OptimalTransportDetector { /* ... */ }
impl OptimalTransportDetector {
pub const fn new() -> Self;
pub fn process_frame(&mut self, amplitudes: &[f32]) -> &[(i32, f32)];
pub fn distance(&self) -> f32; // EMA-smoothed Wasserstein distance
pub fn variance_smoothed(&self) -> f32; // EMA-smoothed variance
pub fn frame_count(&self) -> u32;
}
```
#### Events
| ID | Name | Value | Meaning |
|----|------|-------|---------|
| 725 | `WASSERSTEIN_DISTANCE` | Distance | Smoothed sliced Wasserstein distance (emitted every 5 frames) |
| 726 | `DISTRIBUTION_SHIFT` | Distance | Large distribution change detected (debounced, 3 consecutive frames > 0.25) |
| 727 | `SUBTLE_MOTION` | Distance | Motion detected despite stable variance (5 consecutive frames with distance > 0.10 and variance change < 15%) |
#### Configuration
| Constant | Value | Purpose |
|----------|-------|---------|
| `N_PROJ` | 4 | Number of random projection directions |
| `ALPHA` | 0.15 | EMA alpha for distance smoothing |
| `VAR_ALPHA` | 0.1 | EMA alpha for variance smoothing |
| `WASS_SHIFT` | 0.25 | Wasserstein threshold for distribution shift event |
| `WASS_SUBTLE` | 0.10 | Wasserstein threshold for subtle motion |
| `VAR_STABLE` | 0.15 | Maximum relative variance change for "stable" classification |
| `SHIFT_DEB` | 3 | Debounce count for distribution shift |
| `SUBTLE_DEB` | 5 | Debounce count for subtle motion |
#### Tutorial: Interpreting Wasserstein Distance
The Wasserstein distance measures the "cost" of transforming one distribution into another. Unlike variance-based metrics that only measure spread, it captures changes in shape, location, and mode structure.
**Typical values:**
- 0.00-0.05: No motion. Static environment.
- 0.05-0.15: Breathing, subtle body sway, environmental drift.
- 0.15-0.30: Walking, arm movement, normal activity.
- 0.30+: Large motion, multiple people moving, or sudden environmental change.
**Why "subtle motion" matters**: A person sitting still and slowly raising their hand creates almost no change in total signal variance, but the Wasserstein distance increases because the spatial distribution of signal strength shifts. This is critical for:
- Fall detection (pre-fall sway)
- Gesture recognition (micro-movements)
- Intruder detection (someone trying to move stealthily)
---
## Performance Budget
| Module | Budget Tier | Typical Latency | Stack Memory | Key Bottleneck |
|--------|-------------|-----------------|--------------|----------------|
| Flash Attention | S (<5ms) | ~0.5ms | ~512 bytes | Softmax exp() over 8 groups |
| Coherence Gate | L (<2ms) | ~0.3ms | ~320 bytes | sin/cos per subcarrier |
| Temporal Compress | S (<5ms) | ~0.8ms | ~12 KB | 512 snapshots * 24 bytes |
| Sparse Recovery | H (<10ms) | ~3ms | ~768 bytes | 10 ISTA iterations * 32 subcarriers |
| Min-Cut Person Match | H (<10ms) | ~1.5ms | ~640 bytes | 4x4 cost matrix + feature extraction |
| Optimal Transport | S (<5ms) | ~1.5ms | ~1 KB | 8 Shell sorts (4 projections * 2 distributions) |
All latencies are estimated for ESP32-S3 running WASM3 interpreter at 240 MHz. Actual performance varies with subcarrier count and frame complexity.
## Memory Layout
All modules use fixed-size stack/static allocations. No heap, no `alloc`, no `Vec`. This is required for `no_std` WASM deployment on the ESP32-S3.
Total static memory for all 6 signal modules: approximately 15 KB, well within the ESP32-S3's available WASM linear memory.
# Spatial & Temporal Intelligence -- WiFi-DensePose Edge Intelligence
> Location awareness, activity patterns, and autonomous decision-making running on the ESP32 chip. These modules figure out where people are, learn daily routines, verify safety rules, and let the device plan its own actions.
## Spatial Reasoning
| Module | File | What It Does | Event IDs | Budget |
|--------|------|--------------|-----------|--------|
| PageRank Influence | `spt_pagerank_influence.rs` | Finds the dominant person in multi-person scenes using cross-correlation PageRank | 760-762 | S (<5 ms) |
| Micro-HNSW | `spt_micro_hnsw.rs` | On-device approximate nearest-neighbor search for CSI fingerprint matching | 765-768 | S (<5 ms) |
| Spiking Tracker | `spt_spiking_tracker.rs` | Bio-inspired person tracking using LIF neurons with STDP learning | 770-773 | M (<8 ms) |
---
### PageRank Influence (`spt_pagerank_influence.rs`)
**What it does**: Figures out which person in a multi-person scene has the strongest WiFi signal influence, using the same math Google uses to rank web pages. Up to 4 persons are modelled as graph nodes; edge weights come from the normalized cross-correlation of their subcarrier phase groups (8 subcarriers per person).
**Algorithm**: 4x4 weighted adjacency graph built from abs(dot-product) / (norm_a * norm_b) cross-correlation. Standard PageRank power iteration with damping factor 0.85, 10 iterations, column-normalized transition matrix. Ranks are normalized to sum to 1.0 after each iteration.
#### Public API
```rust
use wifi_densepose_wasm_edge::spt_pagerank_influence::PageRankInfluence;
let mut pr = PageRankInfluence::new(); // const fn, zero-alloc
let events = pr.process_frame(&phases, 2); // phases: &[f32], n_persons: usize
let score = pr.rank(0); // PageRank score for person 0
let dom = pr.dominant_person(); // index of dominant person
```
#### Events
| Event ID | Constant | Value | Frequency |
|----------|----------|-------|-----------|
| 760 | `EVENT_DOMINANT_PERSON` | Person index (0-3) | Every frame |
| 761 | `EVENT_INFLUENCE_SCORE` | PageRank score of dominant person [0, 1] | Every frame |
| 762 | `EVENT_INFLUENCE_CHANGE` | Encoded person_id + signed delta (fractional) | When rank shifts > 0.05 |
#### Configuration Constants
| Constant | Value | Purpose |
|----------|-------|---------|
| `MAX_PERSONS` | 4 | Maximum tracked persons |
| `SC_PER_PERSON` | 8 | Subcarriers assigned per person group |
| `DAMPING` | 0.85 | PageRank damping factor (standard) |
| `PR_ITERS` | 10 | Power-iteration rounds |
| `CHANGE_THRESHOLD` | 0.05 | Minimum rank change to emit change event |
#### Example: Detecting the Dominant Speaker in a Room
When multiple people are present, the person moving the most creates the strongest CSI disturbance. PageRank identifies which person's signal "influences" the others most strongly.
```
Frame 1: Person 0 speaking (active), Person 1 seated
-> EVENT_DOMINANT_PERSON = 0, EVENT_INFLUENCE_SCORE = 0.62
Frame 50: Person 1 stands and walks
-> EVENT_DOMINANT_PERSON = 1, EVENT_INFLUENCE_SCORE = 0.58
-> EVENT_INFLUENCE_CHANGE (person 1 rank increased by 0.08)
```
#### How It Works (Step by Step)
1. Host reports `n_persons` and provides up to 32 subcarrier phases
2. Module groups subcarriers: person 0 gets phases[0..8], person 1 gets phases[8..16], etc.
3. Cross-correlation is computed between every pair of person groups (abs cosine similarity)
4. A 4x4 adjacency matrix is built (no self-loops)
5. PageRank power iteration runs 10 times with damping=0.85
6. The person with the highest rank is reported as the dominant person
7. If any person's rank changed by more than 0.05 since last frame, a change event fires
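Steps 4-6 can be sketched with a plain power iteration. This is an illustrative host-side version with a hand-picked adjacency matrix; the real module builds the matrix from phase cross-correlations each frame.

```rust
const N: usize = 4;

// PageRank power iteration with damping over a column-normalized
// adjacency matrix, normalizing ranks to sum to 1.0 each round.
fn pagerank(adj: &[[f32; N]; N], damping: f32, iters: usize) -> [f32; N] {
    // Column-normalize so each column is a probability distribution.
    let mut t = [[0.0f32; N]; N];
    for j in 0..N {
        let col_sum: f32 = (0..N).map(|i| adj[i][j]).sum();
        for i in 0..N {
            t[i][j] = if col_sum > 0.0 { adj[i][j] / col_sum } else { 1.0 / N as f32 };
        }
    }
    let mut rank = [1.0 / N as f32; N];
    for _ in 0..iters {
        let mut next = [(1.0 - damping) / N as f32; N];
        for i in 0..N {
            for j in 0..N {
                next[i] += damping * t[i][j] * rank[j];
            }
        }
        let s: f32 = next.iter().sum();
        for r in next.iter_mut() {
            *r /= s;
        }
        rank = next;
    }
    rank
}

fn main() {
    // Person 0 is strongly correlated with everyone; others are weakly linked.
    let adj = [
        [0.0, 0.9, 0.8, 0.9],
        [0.9, 0.0, 0.1, 0.1],
        [0.8, 0.1, 0.0, 0.1],
        [0.9, 0.1, 0.1, 0.0],
    ];
    let rank = pagerank(&adj, 0.85, 10);
    let dominant = rank
        .iter()
        .enumerate()
        .max_by(|a, b| a.1.partial_cmp(b.1).unwrap())
        .unwrap()
        .0;
    assert_eq!(dominant, 0);
    assert!((rank.iter().sum::<f32>() - 1.0).abs() < 1e-4);
}
```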
---
### Micro-HNSW (`spt_micro_hnsw.rs`)
**What it does**: Stores up to 64 reference CSI fingerprint vectors (8 dimensions each) in a single-layer navigable small-world graph, enabling fast approximate nearest-neighbor lookup. When the sensor sees a new CSI pattern, it finds the most similar stored reference and returns its classification label.
**Algorithm**: HNSW (Hierarchical Navigable Small World) simplified to a single layer for embedded use. 64 nodes, 4 neighbors per node, beam search width 4, maximum 8 hops. L2 (Euclidean) distance. Bidirectional edges with worst-neighbor replacement pruning when a node is full.
#### Public API
```rust
use wifi_densepose_wasm_edge::spt_micro_hnsw::MicroHnsw;
let mut hnsw = MicroHnsw::new(); // const fn, zero-alloc
let idx = hnsw.insert(&features_8d, label); // Option<usize>
let (nearest_id, distance) = hnsw.search(&query_8d); // (usize, f32)
let events = hnsw.process_frame(&features); // per-frame query
let label = hnsw.last_label(); // u8 or 255=unknown
let dist = hnsw.last_match_distance(); // f32
let n = hnsw.size(); // number of stored vectors
```
#### Events
| Event ID | Constant | Value | Frequency |
|----------|----------|-------|-----------|
| 765 | `EVENT_NEAREST_MATCH_ID` | Index of nearest stored vector | Every frame |
| 766 | `EVENT_MATCH_DISTANCE` | L2 distance to nearest match | Every frame |
| 767 | `EVENT_CLASSIFICATION` | Label of nearest match (255 if too far) | Every frame |
| 768 | `EVENT_LIBRARY_SIZE` | Number of stored reference vectors | Every frame |
#### Configuration Constants
| Constant | Value | Purpose |
|----------|-------|---------|
| `MAX_VECTORS` | 64 | Maximum stored reference fingerprints |
| `DIM` | 8 | Dimensions per feature vector |
| `MAX_NEIGHBORS` | 4 | Edges per node in the graph |
| `BEAM_WIDTH` | 4 | Search beam width (quality vs speed) |
| `MAX_HOPS` | 8 | Maximum graph traversal depth |
| `MATCH_THRESHOLD` | 2.0 | Distance above which classification returns "unknown" |
#### Example: Room Location Fingerprinting
Pre-load reference CSI fingerprints for known locations, then classify new readings in real-time.
```
Setup:
hnsw.insert(&kitchen_fingerprint, 1); // label 1 = kitchen
hnsw.insert(&bedroom_fingerprint, 2); // label 2 = bedroom
hnsw.insert(&bathroom_fingerprint, 3); // label 3 = bathroom
Runtime:
Frame arrives with features = [0.32, 0.15, ...]
-> EVENT_NEAREST_MATCH_ID = 1 (kitchen reference)
-> EVENT_MATCH_DISTANCE = 0.45
-> EVENT_CLASSIFICATION = 1 (kitchen)
-> EVENT_LIBRARY_SIZE = 3
```
#### How It Works (Step by Step)
1. **Insert**: New vector is added at position `n_vectors`. The module scans all existing nodes (N<=64, so linear scan is fine) to find the 4 nearest neighbors. Bidirectional edges are added; if a node already has 4 neighbors, the worst (farthest) is replaced if the new connection is shorter.
2. **Search**: Starting from the entry point, a beam search (width 4) explores neighbor nodes for up to 8 hops. Each hop expands unvisited neighbors of the current beam and inserts closer ones. Search terminates when no hop improves the beam.
3. **Classify**: If the nearest match distance is below `MATCH_THRESHOLD` (2.0), its label is returned. Otherwise, 255 (unknown).
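The classification step can be sketched without the graph: at N <= 64 a linear scan returns the same answer the beam search approximates, so this host-side illustration swaps the traversal for a full scan and keeps the threshold semantics from step 3. The fingerprint values are made up.

```rust
const DIM: usize = 8;

fn l2(a: &[f32; DIM], b: &[f32; DIM]) -> f32 {
    a.iter().zip(b).map(|(x, y)| (x - y) * (x - y)).sum::<f32>().sqrt()
}

// Find the nearest stored (vector, label) reference; return 255 ("unknown")
// when the nearest match is farther than the distance threshold.
fn classify(refs: &[([f32; DIM], u8)], query: &[f32; DIM], threshold: f32) -> (usize, f32, u8) {
    let (mut best_i, mut best_d) = (0usize, f32::INFINITY);
    for (i, (v, _)) in refs.iter().enumerate() {
        let d = l2(v, query);
        if d < best_d {
            best_d = d;
            best_i = i;
        }
    }
    let label = if best_d <= threshold { refs[best_i].1 } else { 255 };
    (best_i, best_d, label)
}

fn main() {
    let kitchen = ([0.3f32, 0.1, 0.8, 0.2, 0.5, 0.1, 0.4, 0.6], 1u8);
    let bedroom = ([0.9f32, 0.7, 0.1, 0.8, 0.2, 0.9, 0.1, 0.3], 2u8);
    let refs = [kitchen, bedroom];

    // Query close to the kitchen fingerprint -> label 1.
    let query = [0.32f32, 0.12, 0.78, 0.22, 0.52, 0.12, 0.42, 0.58];
    let (id, dist, label) = classify(&refs, &query, 2.0);
    assert_eq!((id, label), (0, 1));
    assert!(dist < 0.1);

    // Query far from every stored reference -> "unknown" (255).
    let far = [10.0f32; DIM];
    assert_eq!(classify(&refs, &far, 2.0).2, 255);
}
```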
---
### Spiking Tracker (`spt_spiking_tracker.rs`)
**What it does**: Tracks a person's location across 4 spatial zones using a biologically inspired spiking neural network. 32 Leaky Integrate-and-Fire (LIF) neurons (one per subcarrier) feed into 4 output neurons (one per zone). The zone with the highest spike rate indicates the person's location. Zone transitions measure velocity.
**Algorithm**: LIF neuron model with membrane leak factor 0.95, threshold 1.0, reset to 0.0. STDP (Spike-Timing-Dependent Plasticity) learning: potentiation LR=0.01 when pre+post fire within 1 frame, depression LR=0.005 when only pre fires. Weights clamped to [0, 2]. EMA smoothing on zone spike rates (alpha=0.1).
#### Public API
```rust
use wifi_densepose_wasm_edge::spt_spiking_tracker::SpikingTracker;
let mut st = SpikingTracker::new(); // const fn
let events = st.process_frame(&phases, &prev_phases); // returns events
let zone = st.current_zone(); // i8, -1 if lost
let rate = st.zone_spike_rate(0); // f32 for zone 0
let vel = st.velocity(); // EMA velocity
let tracking = st.is_tracking(); // bool
```
#### Events
| Event ID | Constant | Value | Frequency |
|----------|----------|-------|-----------|
| 770 | `EVENT_TRACK_UPDATE` | Zone ID (0-3) | When tracked |
| 771 | `EVENT_TRACK_VELOCITY` | Zone transitions/frame (EMA) | When tracked |
| 772 | `EVENT_SPIKE_RATE` | Mean spike rate across zones [0, 1] | Every frame |
| 773 | `EVENT_TRACK_LOST` | Last known zone ID | When track lost |
#### Configuration Constants
| Constant | Value | Purpose |
|----------|-------|---------|
| `N_INPUT` | 32 | Input neurons (one per subcarrier) |
| `N_OUTPUT` | 4 | Output neurons (one per zone) |
| `THRESHOLD` | 1.0 | LIF firing threshold |
| `LEAK` | 0.95 | Membrane decay per frame |
| `STDP_LR_PLUS` | 0.01 | Potentiation learning rate |
| `STDP_LR_MINUS` | 0.005 | Depression learning rate |
| `W_MIN` / `W_MAX` | 0.0 / 2.0 | Weight bounds |
| `MIN_SPIKE_RATE` | 0.05 | Minimum rate to consider zone active |
#### Example: Tracking Movement Between Zones
```
Frames 1-30: Strong phase changes in subcarriers 0-7 (zone 0)
-> EVENT_TRACK_UPDATE = 0, EVENT_SPIKE_RATE = 0.15
Frames 31-60: Activity shifts to subcarriers 16-23 (zone 2)
-> EVENT_TRACK_UPDATE = 2, EVENT_TRACK_VELOCITY = 0.033
STDP strengthens zone 2 connections, weakens zone 0
Frames 61-90: No activity
-> Spike rates decay via EMA
-> EVENT_TRACK_LOST = 2 (last known zone)
```
#### How It Works (Step by Step)
1. Phase deltas (|current - previous|) inject current into LIF neurons
2. Each neuron leaks (membrane *= 0.95), then adds current
3. If membrane >= threshold (1.0), the neuron fires and resets to 0
4. Input spikes propagate to output zones via weighted connections
5. Output neurons fire when cumulative input exceeds threshold
6. STDP adjusts weights: correlated pre+post firing strengthens connections, uncorrelated pre firing weakens them (sparse iteration skips silent neurons for 70-90% savings)
7. Zone spike rates are EMA-smoothed; the zone with the highest rate above `MIN_SPIKE_RATE` is reported as the tracked location
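Steps 2-3 are the LIF membrane update, which can be shown in isolation. This sketch uses the leak and threshold constants from the table; the STDP weight updates and zone wiring are omitted.

```rust
// One Leaky Integrate-and-Fire neuron: the membrane leaks each frame,
// accumulates input current, and fires (then resets) at the threshold.
struct Lif {
    membrane: f32,
}

impl Lif {
    const LEAK: f32 = 0.95;
    const THRESHOLD: f32 = 1.0;

    fn step(&mut self, current: f32) -> bool {
        self.membrane = self.membrane * Self::LEAK + current;
        if self.membrane >= Self::THRESHOLD {
            self.membrane = 0.0; // reset after firing
            true
        } else {
            false
        }
    }
}

fn main() {
    // Weak input never fires: the leak bounds the membrane at c / (1 - leak),
    // here 0.04 / 0.05 = 0.8, below the 1.0 threshold.
    let mut weak = Lif { membrane: 0.0 };
    for _ in 0..100 {
        assert!(!weak.step(0.04));
    }

    // Strong sustained input crosses threshold within a few frames.
    let mut strong = Lif { membrane: 0.0 };
    let fired = (0..10).any(|_| strong.step(0.4));
    assert!(fired);
}
```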
---
## Temporal Analysis
| Module | File | What It Does | Event IDs | Budget |
|--------|------|--------------|-----------|--------|
| Pattern Sequence | `tmp_pattern_sequence.rs` | Learns daily activity routines and detects deviations | 790-793 | S (<5 ms) |
| Temporal Logic Guard | `tmp_temporal_logic_guard.rs` | Verifies 8 LTL safety invariants on every frame | 795-797 | S (<5 ms) |
| GOAP Autonomy | `tmp_goap_autonomy.rs` | Autonomous module management via A* goal-oriented planning | 800-803 | S (<5 ms) |
---
### Pattern Sequence (`tmp_pattern_sequence.rs`)
**What it does**: Learns daily activity routines and alerts when something changes. Each minute is discretized into a motion symbol (Empty, Still, LowMotion, HighMotion, MultiPerson), stored in a 24-hour circular buffer (1440 entries). An hourly LCS (Longest Common Subsequence) comparison between today and yesterday yields a routine confidence score. If grandma usually goes to the kitchen by 8am but has not moved, it notices.
**Algorithm**: Two-row dynamic programming LCS with O(n) memory (60-entry comparison window). Majority-vote symbol selection from per-frame accumulation. Two-day history buffer with day rollover.
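The two-row LCS at the heart of the comparison can be sketched directly; a heap-backed `Vec` is used here for brevity, whereas the on-device module uses fixed 61-entry rows. The symbol sequences are invented for illustration.

```rust
// Longest Common Subsequence length via two-row dynamic programming:
// O(len(a) * len(b)) time, O(len(b)) memory.
fn lcs_len(a: &[u8], b: &[u8]) -> usize {
    let mut prev = vec![0usize; b.len() + 1];
    let mut curr = vec![0usize; b.len() + 1];
    for &x in a {
        for (j, &y) in b.iter().enumerate() {
            curr[j + 1] = if x == y { prev[j] + 1 } else { prev[j + 1].max(curr[j]) };
        }
        std::mem::swap(&mut prev, &mut curr);
    }
    prev[b.len()]
}

fn main() {
    // Symbols: 0=Empty, 1=Still, 2=LowMotion, 3=HighMotion.
    let yesterday = [1, 1, 3, 3, 2, 2, 2, 0];
    let today_ok = [1, 1, 3, 2, 2, 2, 0, 0]; // same routine, slightly shifted
    let today_bad = [1, 1, 1, 1, 1, 1, 1, 1]; // no morning activity at all

    // A long common subsequence -> high routine confidence.
    assert!(lcs_len(&yesterday, &today_ok) >= 6);
    // Almost no overlap -> deviation events fire.
    assert!(lcs_len(&yesterday, &today_bad) <= 2);
}
```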
#### Public API
```rust
use wifi_densepose_wasm_edge::tmp_pattern_sequence::PatternSequenceAnalyzer;
let mut psa = PatternSequenceAnalyzer::new(); // const fn
psa.on_frame(presence, motion, n_persons); // called per CSI frame (~20 Hz)
let events = psa.on_timer(); // called at ~1 Hz
let conf = psa.routine_confidence(); // [0, 1]
let n = psa.pattern_count(); // stored patterns
let min = psa.current_minute(); // 0-1439
let day = psa.day_offset(); // days since start
```
#### Events
| Event ID | Constant | Value | Frequency |
|----------|----------|-------|-----------|
| 790 | `EVENT_PATTERN_DETECTED` | LCS length of detected pattern | Hourly |
| 791 | `EVENT_PATTERN_CONFIDENCE` | Routine confidence [0, 1] | Hourly |
| 792 | `EVENT_ROUTINE_DEVIATION` | Minute index where deviation occurred | Per minute (when deviating) |
| 793 | `EVENT_PREDICTION_NEXT` | Predicted next-minute symbol (from yesterday) | Per minute |
#### Configuration Constants
| Constant | Value | Purpose |
|----------|-------|---------|
| `DAY_LEN` | 1440 | Minutes per day |
| `MAX_PATTERNS` | 32 | Maximum stored pattern templates |
| `PATTERN_LEN` | 16 | Maximum symbols per pattern |
| `LCS_WINDOW` | 60 | Comparison window (1 hour) |
| `THRESH_STILL` / `THRESH_LOW` / `THRESH_HIGH` | 0.05 / 0.3 / 0.7 | Motion discretization thresholds |
#### Symbols
| Symbol | Value | Condition |
|--------|-------|-----------|
| Empty | 0 | No presence |
| Still | 1 | Present, motion < 0.05 |
| LowMotion | 2 | Present, 0.05 <= motion <= 0.7 |
| HighMotion | 3 | Present, motion > 0.7 |
| MultiPerson | 4 | More than 1 person present |
#### Example: Elderly Care Routine Monitoring
```
Day 1: Learning phase
07:00 - Still (person in bed)
07:30 - HighMotion (getting ready)
08:00 - LowMotion (breakfast)
-> Patterns stored in history buffer
Day 2: Comparison active
07:00 - Still (normal)
07:30 - Still (DEVIATION! Expected HighMotion)
-> EVENT_ROUTINE_DEVIATION = 450 (minute 7:30)
-> EVENT_PREDICTION_NEXT = 3 (HighMotion expected)
08:30 - Still (still no activity)
-> Caregiver notified via DEVIATION events
```
---
### Temporal Logic Guard (`tmp_temporal_logic_guard.rs`)
**What it does**: Encodes 8 safety rules as Linear Temporal Logic (LTL) state machines. G-rules ("globally") are violated on any single frame. F-rules ("eventually") have deadlines. Every frame, the guard checks all rules and emits violations with counterexample frame indices.
**Algorithm**: State machine per rule (Satisfied/Pending/Violated). G-rules use immediate boolean checks. F-rules use deadline counters (frame-based). Counterexample tracking records the frame index when violation first occurs.
#### The 8 Safety Rules
| Rule | Type | Description | Violation Condition |
|------|------|-------------|---------------------|
| R0 | G | No fall alert when room is empty | `presence==0 AND fall_alert` |
| R1 | G | No intrusion alert when nobody present | `intrusion_alert AND presence==0` |
| R2 | G | No person ID active when nobody detected | `n_persons==0 AND person_id_active` |
| R3 | G | No vital signs when coherence is too low | `coherence<0.3 AND vital_signs_active` |
| R4 | F | Continuous motion must stop within 300s | Motion > 0.1 for 6000 consecutive frames |
| R5 | F | Fast breathing must trigger alert within 5s | Breathing > 40 BPM for 100 consecutive frames |
| R6 | G | Heart rate must not exceed 150 BPM | `heartrate_bpm > 150` |
| R7 | G-F | After seizure, no normal gait within 60s | Normal gait reported < 1200 frames after seizure |
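The two rule families above reduce to small per-frame state machines. A minimal sketch, assuming the deadline is counted in consecutive frames and resets when the condition clears (reset behaviour is an assumption for illustration):

```rust
#[derive(Clone, Copy, PartialEq, Debug)]
enum RuleState {
    Satisfied,
    Pending,
    Violated,
}

/// F-rule ("eventually") checker: the condition must clear within
/// `deadline` consecutive frames, as with R4/R5 above.
struct FRule {
    deadline: u32,
    count: u32,
    state: RuleState,
}

impl FRule {
    fn new(deadline: u32) -> Self {
        Self { deadline, count: 0, state: RuleState::Satisfied }
    }

    fn on_frame(&mut self, condition_active: bool) -> RuleState {
        if condition_active {
            self.count += 1;
            self.state = if self.count >= self.deadline {
                RuleState::Violated
            } else {
                RuleState::Pending
            };
        } else {
            // Condition cleared before the deadline: rule satisfied again.
            self.count = 0;
            self.state = RuleState::Satisfied;
        }
        self.state
    }
}

/// G-rule ("globally"): a single offending frame is a violation (R0-R3, R6).
fn g_rule(violating: bool) -> RuleState {
    if violating { RuleState::Violated } else { RuleState::Satisfied }
}
```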
#### Public API
```rust
use wifi_densepose_wasm_edge::tmp_temporal_logic_guard::{TemporalLogicGuard, FrameInput};
let mut guard = TemporalLogicGuard::new(); // const fn
let events = guard.on_frame(&input);   // per-frame check; `input` is a FrameInput built by the host
let satisfied = guard.satisfied_count(); // how many rules OK
let state = guard.rule_state(4); // Satisfied/Pending/Violated
let vio = guard.violation_count(0); // total violations for rule 0
let frame = guard.last_violation_frame(3); // frame index of last violation
```
#### Events
| Event ID | Constant | Value | Frequency |
|----------|----------|-------|-----------|
| 795 | `EVENT_LTL_VIOLATION` | Rule index (0-7) | On violation |
| 796 | `EVENT_LTL_SATISFACTION` | Count of currently satisfied rules | Every 200 frames |
| 797 | `EVENT_COUNTEREXAMPLE` | Frame index when violation occurred | Paired with violation |
---
### GOAP Autonomy (`tmp_goap_autonomy.rs`)
**What it does**: Lets the ESP32 autonomously decide which sensing modules to activate or deactivate based on the current situation. Uses Goal-Oriented Action Planning (GOAP) with A* search over an 8-bit boolean world state to find the cheapest action sequence that achieves the highest-priority unsatisfied goal.
**Algorithm**: A* search over 8-bit world state. 6 prioritized goals, 8 actions with preconditions and effects encoded as bitmasks. Maximum plan depth 4, open set capacity 32. Replans every 60 seconds.
#### World State Properties
| Bit | Property | Meaning |
|-----|----------|---------|
| 0 | `has_presence` | Room occupancy detected |
| 1 | `has_motion` | Motion energy above threshold |
| 2 | `is_night` | Nighttime period |
| 3 | `multi_person` | More than 1 person present |
| 4 | `low_coherence` | Signal quality is degraded |
| 5 | `high_threat` | Threat score above threshold |
| 6 | `has_vitals` | Vital sign monitoring active |
| 7 | `is_learning` | Pattern learning active |
#### Goals (Priority Order)
| # | Goal | Priority | Condition |
|---|------|----------|-----------|
| 0 | Monitor Health | 0.9 | Achieve `has_vitals = true` |
| 1 | Secure Space | 0.8 | Achieve `has_presence = true` |
| 2 | Count People | 0.7 | Achieve `multi_person = false` |
| 3 | Learn Patterns | 0.5 | Achieve `is_learning = true` |
| 4 | Save Energy | 0.3 | Achieve `is_learning = false` |
| 5 | Self Test | 0.1 | Achieve `low_coherence = false` |
#### Actions
| # | Action | Precondition | Effect | Cost |
|---|--------|-------------|--------|------|
| 0 | Activate Vitals | Presence required | Sets `has_vitals` | 2 |
| 1 | Activate Intrusion | None | Sets `has_presence` | 1 |
| 2 | Activate Occupancy | Presence required | Clears `multi_person` | 2 |
| 3 | Activate Gesture Learn | Low coherence must be false | Sets `is_learning` | 3 |
| 4 | Deactivate Heavy | None | Clears `is_learning` + `has_vitals` | 1 |
| 5 | Run Coherence Check | None | Clears `low_coherence` | 2 |
| 6 | Enter Low Power | None | Clears `is_learning` + `has_motion` | 1 |
| 7 | Run Self Test | None | Clears `low_coherence` + `high_threat` | 3 |
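The action encoding above (precondition, set/clear effects, cost) maps naturally onto bitmasks over the 8-bit world state, and the planner searches for the cheapest sequence reaching the goal. A simplified sketch with a heap-backed open set and uniform-cost search; the on-device planner uses a fixed-capacity open set (32 nodes), and the exact search details here are assumptions:

```rust
/// One GOAP action: applicable when (ws & pre_mask) == pre_val; applying
/// it sets the `set` bits and clears the `clear` bits.
#[derive(Clone)]
struct Action {
    pre_mask: u8,
    pre_val: u8,
    set: u8,
    clear: u8,
    cost: u32,
}

fn applicable(a: &Action, ws: u8) -> bool {
    ws & a.pre_mask == a.pre_val
}

fn apply(a: &Action, ws: u8) -> u8 {
    (ws | a.set) & !a.clear
}

/// Uniform-cost search over the 8-bit world state with plan depth capped
/// at 4 (matching the maximum plan depth above). Returns the action index
/// sequence and its total cost, or None if no plan exists within depth 4.
fn plan(ws: u8, goal_mask: u8, goal_val: u8, actions: &[Action]) -> Option<(Vec<usize>, u32)> {
    let mut open: Vec<(u32, u8, Vec<usize>)> = vec![(0, ws, Vec::new())];
    while !open.is_empty() {
        // Pop the cheapest open node (uniform cost: first goal pop is optimal).
        let i = open
            .iter()
            .enumerate()
            .min_by_key(|(_, n)| n.0)
            .map(|(i, _)| i)
            .unwrap();
        let (cost, state, path) = open.swap_remove(i);
        if state & goal_mask == goal_val {
            return Some((path, cost));
        }
        if path.len() >= 4 {
            continue; // depth cap reached
        }
        for (idx, a) in actions.iter().enumerate() {
            if applicable(a, state) {
                let mut p = path.clone();
                p.push(idx);
                open.push((cost + a.cost, apply(a, state), p));
            }
        }
    }
    None
}
```

With bit 0 as `has_presence` and bit 6 as `has_vitals`, "Activate Intrusion" (no precondition, sets bit 0, cost 1) followed by "Activate Vitals" (requires bit 0, sets bit 6, cost 2) reproduces the chained plans the planner builds on device.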
#### Public API
```rust
use wifi_densepose_wasm_edge::tmp_goap_autonomy::GoapPlanner;
let mut planner = GoapPlanner::new(); // const fn
planner.update_world(presence, motion, n_persons,
coherence, threat, has_vitals, is_night);
let events = planner.on_timer(); // called at ~1 Hz
let ws = planner.world_state(); // u8 bitmask
let goal = planner.current_goal(); // goal index or 0xFF
let len = planner.plan_len(); // steps in current plan
planner.set_goal_priority(0, 0.95); // dynamically adjust
```
#### Events
| Event ID | Constant | Value | Frequency |
|----------|----------|-------|-----------|
| 800 | `EVENT_GOAL_SELECTED` | Goal index (0-5) | On replan |
| 801 | `EVENT_MODULE_ACTIVATED` | Action index that activated a module | On plan step |
| 802 | `EVENT_MODULE_DEACTIVATED` | Action index that deactivated a module | On plan step |
| 803 | `EVENT_PLAN_COST` | Total cost of the planned action sequence | On replan |
#### Example: Autonomous Night-Mode Transition
```
18:00 - World state: presence=1, motion=0, night=0, vitals=1
Goal 0 (Monitor Health) satisfied, Goal 1 (Secure Space) satisfied
-> Goal 2 selected (Count People, prio 0.7)
22:00 - World state: presence=0, motion=0, night=1
-> Goal 1 selected (Secure Space, prio 0.8)
-> Plan: [Action 1: Activate Intrusion] (cost=1)
-> EVENT_GOAL_SELECTED = 1
-> EVENT_MODULE_ACTIVATED = 1 (intrusion detection)
-> EVENT_PLAN_COST = 1
03:00 - No presence, low coherence detected
-> Goal 5 selected (Self Test, prio 0.1)
-> Plan: [Action 5: Run Coherence Check] (cost=2)
```
---
## Memory Layout Summary
All modules use fixed-size arrays and static event buffers. No heap allocation.
| Module | State Size (approx) | Static Event Buffer |
|--------|---------------------|---------------------|
| PageRank Influence | ~192 bytes (4x4 adj + 2x4 rank + meta) | 8 entries |
| Micro-HNSW | ~3.5 KB (64 nodes x 48 bytes + meta) | 4 entries |
| Spiking Tracker | ~1.1 KB (32x4 weights + membranes + rates) | 4 entries |
| Pattern Sequence | ~3.2 KB (2x1440 history + 32 patterns + LCS rows) | 4 entries |
| Temporal Logic Guard | ~120 bytes (8 rules + counters) | 12 entries |
| GOAP Autonomy | ~1.6 KB (32 open-set nodes + goals + plan) | 4 entries |
## Integration with Host Firmware
These modules receive data from the ESP32 Tier 2 DSP pipeline via the WASM3 host API:
```
ESP32 Firmware (C) WASM3 Runtime WASM Module (Rust)
| | |
CSI frame arrives | |
Tier 2 DSP runs | |
|--- csi_get_phase() ---->|--- host_get_phase() --->|
|--- csi_get_presence() ->|--- host_get_presence()->|
| | process_frame() |
|<-- csi_emit_event() ----|<-- host_emit_event() ---|
| | |
Forward to aggregator | |
```
Modules can be hot-loaded via OTA (ADR-040) without reflashing the firmware.
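From the module's point of view, the host boundary in the diagram is just a set of getter and emit callbacks. A hypothetical mock of that boundary, useful for testing module logic off-device; the function names echo the diagram (`host_get_*`, `host_emit_event`), but the real signatures, and the event id used here, are assumptions (on device these are `extern "C"` imports resolved by the WASM3 runtime):

```rust
/// Host services a module can call, mirroring the diagram above.
trait Host {
    fn get_presence(&self) -> bool;
    fn get_phase(&self, subcarrier: usize) -> f32;
    fn emit_event(&mut self, id: u16, value: i32);
}

/// Off-device mock host that records emitted events for inspection.
struct MockHost {
    events: Vec<(u16, i32)>,
}

impl Host for MockHost {
    fn get_presence(&self) -> bool {
        true
    }
    fn get_phase(&self, _subcarrier: usize) -> f32 {
        0.0
    }
    fn emit_event(&mut self, id: u16, value: i32) {
        self.events.push((id, value));
    }
}

/// Trivial per-frame handler standing in for process_frame() in the
/// sequence diagram: read host state, emit an event. Event id 999 is
/// hypothetical, chosen to avoid colliding with the real id ranges.
fn process_frame(host: &mut impl Host) {
    if host.get_presence() {
        host.emit_event(999, 1);
    }
}
```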