fix: WebSocket race condition, data source indicators, auto-start pose detection (#96)

* feat: RVF training pipeline & UI integration (ADR-036)

Implement full model training, management, and inference pipeline:

Backend (Rust):
- recording.rs: CSI recording API (start/stop/list/download/delete)
- model_manager.rs: RVF model loading, LoRA profile switching, model library
- training_api.rs: Training API with WebSocket progress streaming, simulated
  training mode with realistic loss curves, auto-RVF export on completion
- main.rs: Wire new modules, recording hooks in all CSI paths, data dirs

UI (new components):
- ModelPanel.js: Dark-mode model library with load/unload, LoRA dropdown
- TrainingPanel.js: Recording controls, training config, live Canvas charts
- model.service.js: Model REST API client with events
- training.service.js: Training + recording API client with WebSocket progress

UI (enhancements):
- LiveDemoTab: Model selector, LoRA profile switcher, A/B split view toggle,
  training quick-panel with 60s recording shortcut
- SettingsPanel: Full dark mode conversion (issue #92), model configuration
  (device, threads, auto-load), training configuration (epochs, LR, patience)
- PoseDetectionCanvas: 10-frame pose trail with ghost keypoints and motion
  trajectory lines, cyan trail toggle button
- pose.service.js: Model-inference confidence thresholds

UI (plumbing):
- index.html: Training tab (8th tab)
- app.js: Panel initialization and tab routing
- style.css: ~250 lines of training/model panel dark-mode styles

191 Rust tests pass, 0 failures. Closes #92.

Refs: ADR-036, #93

Co-Authored-By: claude-flow <ruv@ruv.net>

* fix: real RuVector training pipeline + UI service fixes

Training pipeline (training_api.rs):
- Replace simulated training with real signal-based training loop
- Load actual CSI data from .csi.jsonl recordings or live frame history
- Extract 180 features per frame: subcarrier amplitudes, temporal variance,
  Goertzel frequency analysis (9 bands), motion gradients, global stats
- Train calibrated linear CSI-to-pose mapping via mini-batch gradient descent
  with L2 regularization (ridge regression), Xavier init, cosine LR decay
- Self-supervised: teacher targets from derive_pose_from_sensing() heuristics
- Real validation metrics: MSE and PCK@0.2 on 80/20 train/val split
- Export trained .rvf with real weights, feature normalization stats, witness
- Add infer_pose_from_model() for live inference from trained model
- 16 new tests covering features, training, inference, serialization
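The Goertzel band features listed above can be sketched as a single-bin spectral probe over a subcarrier amplitude history. A minimal, dependency-free sketch — band frequency, history length, and tick rate here are illustrative, not the values hard-coded in training_api.rs:

```rust
// Single-bin Goertzel power: energy of `samples` at `target_hz`.
fn goertzel_power(samples: &[f64], sample_rate: f64, target_hz: f64) -> f64 {
    let n = samples.len() as f64;
    let k = (0.5 + n * target_hz / sample_rate).floor();
    let w = 2.0 * std::f64::consts::PI * k / n;
    let coeff = 2.0 * w.cos();
    let (mut s1, mut s2) = (0.0_f64, 0.0_f64);
    for &x in samples {
        let s = x + coeff * s1 - s2;
        s2 = s1;
        s1 = s;
    }
    // Squared magnitude of the probed frequency bin.
    s1 * s1 + s2 * s2 - coeff * s1 * s2
}

fn main() {
    let fs = 10.0; // 10 Hz CSI tick rate (illustrative)
    let sig: Vec<f64> = (0..100)
        .map(|i| (2.0 * std::f64::consts::PI * 1.0 * i as f64 / fs).sin())
        .collect();
    let p1 = goertzel_power(&sig, fs, 1.0); // breathing-range band
    let p3 = goertzel_power(&sig, fs, 3.0); // faster-motion band
    assert!(p1 > 10.0 * p3); // energy concentrates in the 1 Hz bin
    println!("1 Hz power = {p1:.1}, 3 Hz power = {p3:.1}");
}
```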

UI fixes:
- Fix double-URL bug in model.service.js and training.service.js
  (buildApiUrl was called twice — once in service, once in apiService)
- Fix route paths to match Rust backend (/api/v1/train/*, /api/v1/recording/*)
- Fix request body formats (session_name, nested config object)
- Fix top-level await in LiveDemoTab.js blocking module graph
- Dynamic imports for ModelPanel/TrainingPanel in app.js
- Center nav tabs with flex-wrap for 8-tab layout

Co-Authored-By: claude-flow <ruv@ruv.net>

* fix: WebSocket onOpen race condition, data source indicators, auto-start pose detection

- Fix WebSocket onOpen race condition in websocket.service.js where
  setupEventHandlers replaced onopen after socket was already open,
  preventing pose service from receiving connection signal
- Add 4-state data source indicator (LIVE/SIMULATED/RECONNECTING/OFFLINE)
  across Dashboard, Sensing, and Live Demo tabs via sensing.service.js
- Add hot-plug ESP32 auto-detection in sensing server (auto mode runs
  both UDP listener and simulation, switches on ESP32_TIMEOUT)
- Auto-start pose detection when backend is reachable
- Hide duplicate PoseDetectionCanvas controls when enableControls=false
- Add standalone Demo button in LiveDemoTab for offline animated demo
- Add data source banner and status styling

Co-Authored-By: claude-flow <ruv@ruv.net>
Committed by rUv via GitHub, 2026-03-02 13:47:49 -05:00
Commit: 113011e704 (parent c193cd4299)
GPG key ID: B5690EEEBB952194 (no known key found for this signature in database)
20 changed files with 6124 additions and 83 deletions


@ -0,0 +1,228 @@
# ADR-036: RVF Model Training Pipeline & UI Integration
## Status
Proposed
## Date
2026-03-02
## Context
The wifi-densepose system currently operates in **signal-derived** mode — `derive_pose_from_sensing()` maps aggregate CSI features (motion power, breathing rate, variance) to keypoint positions using deterministic math. This gives whole-body presence and gross motion but cannot track individual limbs.
The infrastructure for **model inference** mode exists but is disconnected:
1. **RVF container format** (`rvf_container.rs`, 1,102 lines) — a 64-byte-aligned binary format supporting model weights (`SEG_VEC`), metadata (`SEG_MANIFEST`), quantization (`SEG_QUANT`), LoRA profiles (`SEG_LORA`), contrastive embeddings (`SEG_EMBED`), and witness audit trails (`SEG_WITNESS`). Builder and reader are fully implemented with CRC32 integrity checks.
2. **Training crate** (`wifi-densepose-train`) — AdamW optimizer, PCK@0.2/OKS metrics, LR scheduling with warmup, early stopping, CSV logging, and checkpoint export. Supports `CsiDataset` trait with planned MM-Fi (114→56 subcarrier interpolation) and Wi-Pose (30→56 zero-pad) loaders per ADR-015.
3. **NN inference crate** (`wifi-densepose-nn`) — ONNX Runtime backend with CPU/GPU support, dynamic tensor shapes, thread-safe `OnnxBackend` wrapper, model info inspection, and warmup.
4. **Sensing server CLI** (`--model <path>`, `--train`, `--pretrain`, `--embed`) — flags exist for model loading, training mode, and embedding extraction, but the end-to-end path from raw CSI → trained `.rvf` → live inference is not wired together.
5. **UI gaps** — No model management, training progress visualization, LoRA profile switching, or embedding inspection. The Settings panel lacks model configuration. The Live Demo has no way to load a trained model or compare signal-derived vs model-inference output side-by-side.
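The PCK@0.2 metric named above can be sketched as follows: a predicted keypoint counts as correct when it lands within 0.2 × torso diameter of the ground truth. This is a sketch of the standard definition, not code from the training crate; the COCO-17 indexing (5 = left shoulder, 12 = right hip) is an assumption.

```rust
/// Fraction of visible keypoints within 0.2 × torso diameter of ground truth.
fn pck_at_02(pred: &[[f64; 2]; 17], gt: &[[f64; 2]; 17], vis: &[bool; 17]) -> f64 {
    let dist = |a: [f64; 2], b: [f64; 2]| ((a[0] - b[0]).powi(2) + (a[1] - b[1]).powi(2)).sqrt();
    // Torso diameter from left shoulder (5) to right hip (12) — assumed COCO-17 indices.
    let thresh = 0.2 * dist(gt[5], gt[12]).max(1e-6);
    let (mut hit, mut total) = (0usize, 0usize);
    for i in 0..17 {
        if vis[i] {
            total += 1;
            if dist(pred[i], gt[i]) <= thresh { hit += 1; }
        }
    }
    if total == 0 { 0.0 } else { hit as f64 / total as f64 }
}

fn main() {
    let mut gt = [[0.0f64; 2]; 17];
    gt[5] = [0.3, 0.3];  // left shoulder
    gt[12] = [0.3, 0.8]; // right hip → torso 0.5, threshold 0.1
    let mut pred = gt;
    pred[0] = [0.5, 0.0]; // nose badly off → one miss
    let pck = pck_at_02(&pred, &gt, &[true; 17]);
    assert!((pck - 16.0 / 17.0).abs() < 1e-9);
    println!("PCK@0.2 = {pck:.3}");
}
```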
### What users need
- A way to **collect labeled CSI data** from their own environment (self-supervised or teacher-student from camera).
- A way to **train an .rvf model** from collected data without leaving the UI.
- A way to **load and switch models** in the live demo, seeing the quality improvement.
- Visibility into **training progress** (loss curves, validation PCK, early stopping).
- **Environment adaptation** via LoRA profiles (office → home → warehouse) without full retraining.
## Decision
### Phase 1: Data Collection & Self-Supervised Pretraining
#### 1.1 CSI Recording API
Add REST endpoints to the sensing server:
```
POST /api/v1/recording/start { duration_secs, label?, session_name }
POST /api/v1/recording/stop
GET /api/v1/recording/list
GET /api/v1/recording/download/:id
DELETE /api/v1/recording/:id
```
- Records raw CSI frames + extracted features to `.csi.jsonl` files.
- Optional camera-based label overlay via teacher model (Detectron2/MediaPipe on client).
- Each recording session tagged with environment metadata (room dimensions, node positions, AP count).
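A `.csi.jsonl` session file holds one JSON object per frame. The field names below are illustrative, not necessarily those written by recording.rs, and the string is hand-built here to stay dependency-free — the real code would serialize with serde_json:

```rust
/// Build one JSON Lines record for a CSI frame (illustrative field names).
fn csi_jsonl_line(ts_ms: i64, amps: &[f64], rssi: f64, noise: f64, variance: f64) -> String {
    let amps_json: Vec<String> = amps.iter().map(|a| format!("{a}")).collect();
    format!(
        "{{\"ts_ms\":{ts_ms},\"amplitudes\":[{}],\"rssi_dbm\":{rssi},\"noise_floor_dbm\":{noise},\"features\":{{\"variance\":{variance}}}}}",
        amps_json.join(",")
    )
}

fn main() {
    // One line per frame, appended to the active session file.
    let line = csi_jsonl_line(1_700_000_000_000, &[12.5, 13.0], -48.0, -90.0, 0.42);
    assert!(line.starts_with('{') && line.ends_with('}'));
    assert!(line.contains("\"rssi_dbm\":-48"));
    println!("{line}");
}
```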
#### 1.2 Contrastive Pretraining (ADR-024 Phase 1)
- Self-supervised NT-Xent loss learns a 128-dim CSI embedding without pose labels.
- Positive pairs: adjacent frames from same person; negatives: different sessions/rooms.
- VICReg regularization prevents embedding collapse.
- Output: `.rvf` container with `SEG_EMBED` + `SEG_VEC` segments.
- Training triggered via `POST /api/v1/train/pretrain { dataset_ids[], epochs, lr }`.
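The NT-Xent loss above, for one anchor/positive pair among L2-normalized embeddings, is −log(exp(sim(i,j)/τ) / Σ_{k≠i} exp(sim(i,k)/τ)). A minimal sketch under that definition — batch layout and temperature are illustrative:

```rust
fn dot(a: &[f64], b: &[f64]) -> f64 {
    a.iter().zip(b).map(|(x, y)| x * y).sum()
}

/// NT-Xent for anchor `i` with positive `j` over unit-norm embeddings.
fn nt_xent(emb: &[Vec<f64>], i: usize, j: usize, tau: f64) -> f64 {
    let pos = (dot(&emb[i], &emb[j]) / tau).exp();
    let denom: f64 = emb
        .iter()
        .enumerate()
        .filter(|(k, _)| *k != i)
        .map(|(_, e)| (dot(&emb[i], e) / tau).exp())
        .sum();
    -(pos / denom).ln()
}

fn main() {
    // Anchor, positive (adjacent frame, same person), two negatives
    // (other sessions/rooms). 2-dim stand-ins for the 128-dim embedding.
    let emb = vec![
        vec![1.0, 0.0],
        vec![0.98, 0.198997], // unit-norm, close to anchor
        vec![0.0, 1.0],
        vec![-1.0, 0.0],
    ];
    let loss = nt_xent(&emb, 0, 1, 0.5);
    assert!(loss > 0.0 && loss < 1.0); // positive dominates the denominator
    println!("NT-Xent loss = {loss:.3}");
}
```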
### Phase 2: Supervised Training Pipeline
#### 2.1 Dataset Integration
- **MM-Fi loader**: Parse HDF5 files, 114→56 subcarrier interpolation via `ruvector-solver` sparse least-squares.
- **Wi-Pose loader**: Parse .mat files, 30→56 zero-padding with Hann window smoothing.
- **Self-collected**: `.csi.jsonl` from Phase 1 recording + camera-generated labels.
- All datasets implement `CsiDataset` trait and produce `(amplitude[B,T*links,56], phase[B,T*links,56], keypoints[B,17,2], visibility[B,17])`.
#### 2.2 Training API
```
POST /api/v1/train/start {
dataset_ids: string[],
config: {
epochs: 100,
batch_size: 32,
learning_rate: 3e-4,
weight_decay: 1e-4,
early_stopping_patience: 15,
warmup_epochs: 5,
pretrained_rvf?: string, // Base model for fine-tuning
lora_profile?: string, // Environment-specific LoRA
}
}
POST /api/v1/train/stop
GET /api/v1/train/status // { epoch, train_loss, val_pck, val_oks, lr, eta_secs }
WS /ws/train/progress // Real-time streaming of training metrics
```
#### 2.3 RVF Export
On training completion:
- Best checkpoint exported as `.rvf` with `SEG_VEC` (weights), `SEG_MANIFEST` (metadata), `SEG_WITNESS` (training hash + final metrics), and optional `SEG_QUANT` (INT8 quantization).
- Stored in `data/models/` directory, indexed by model ID.
- `GET /api/v1/models` lists available models; `POST /api/v1/models/load { model_id }` hot-loads into inference.
### Phase 3: LoRA Environment Adaptation
#### 3.1 LoRA Fine-Tuning
- Given a base `.rvf` model, fine-tune only LoRA adapter weights (rank 4-16) on environment-specific recordings.
- 5-10 minutes of labeled data from new environment suffices.
- New LoRA profile appended to existing `.rvf` via `SEG_LORA` segment.
- `POST /api/v1/train/lora { base_model_id, dataset_ids[], profile_name, rank: 8, epochs: 20 }`.
#### 3.2 Profile Switching
- `POST /api/v1/models/lora/activate { model_id, profile_name }` — hot-swap LoRA weights without reloading base model.
- UI dropdown lists available profiles per loaded model.
### Phase 4: UI Integration
#### 4.1 Model Management Panel (new: `ui/components/ModelPanel.js`)
- **Model Library**: List loaded and available `.rvf` models with metadata (version, dataset, PCK score, size, created date).
- **Model Inspector**: Show RVF segment breakdown — weight count, quantization type, LoRA profiles, embedding config, witness hash.
- **Load/Unload**: One-click model loading with progress bar.
- **Compare**: Side-by-side signal-derived vs model-inference toggle in Live Demo.
#### 4.2 Training Dashboard (new: `ui/components/TrainingPanel.js`)
- **Recording Controls**: Start/stop CSI recording, session list with duration and frame counts.
- **Training Progress**: Real-time loss curve (train loss, val loss) and metric charts (PCK@0.2, OKS) via WebSocket streaming.
- **Epoch Table**: Scrollable table of per-epoch metrics with best-epoch highlighting.
- **Early Stopping Indicator**: Visual countdown of patience remaining.
- **Export Button**: Download trained `.rvf` from browser.
#### 4.3 Live Demo Enhancements
- **Model Selector**: Dropdown in toolbar to switch between signal-derived and loaded `.rvf` models.
- **LoRA Profile Selector**: Sub-dropdown showing environment profiles for the active model.
- **Confidence Heatmap Overlay**: Per-keypoint confidence visualization when model is loaded (toggle in render mode dropdown).
- **Pose Trail**: Ghosted keypoint history showing last N frames of motion trajectory.
- **A/B Split View**: Left half signal-derived, right half model-inference for quality comparison.
#### 4.4 Settings Panel Extensions
- **Model section**: Default model path, auto-load on startup, GPU/CPU toggle, inference threads.
- **Training section**: Default hyperparameters, checkpoint directory, auto-export on completion.
- **Recording section**: Default recording directory, max duration, auto-label with camera.
#### 4.5 Dark Mode
All new panels follow the dark mode established in ADR-035 (`#0d1117` backgrounds, `#e0e0e0` text, translucent dark panels with colored accents).
### Phase 5: Inference Pipeline Wiring
#### 5.1 Model-Inference Pose Path
When a `.rvf` model is loaded:
1. CSI frame arrives (UDP or simulated).
2. Extract amplitude + phase tensors from subcarrier data.
3. Feed through ONNX session: `input[1, T*links, 56]` → `output[1, 17, 4]` (x, y, z, conf).
4. Apply Kalman smoothing from `pose_tracker.rs`.
5. Broadcast via WebSocket with `pose_source: "model_inference"`.
6. UI Estimation Mode badge switches from green "SIGNAL-DERIVED" to blue "MODEL INFERENCE".
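Step 3's output handling can be sketched as decoding the flat `[1, 17, 4]` tensor into 17 keypoints of (x, y, z, confidence). The `Keypoint` struct is illustrative, not the type used by `pose_tracker.rs`:

```rust
#[derive(Debug, PartialEq)]
struct Keypoint {
    x: f32,
    y: f32,
    z: f32,
    conf: f32,
}

/// Decode a flattened [1, 17, 4] ONNX output into keypoints.
fn decode_pose(output: &[f32]) -> Vec<Keypoint> {
    assert_eq!(output.len(), 17 * 4, "expected a [1, 17, 4] tensor");
    output
        .chunks_exact(4)
        .map(|c| Keypoint { x: c[0], y: c[1], z: c[2], conf: c[3] })
        .collect()
}

fn main() {
    let mut flat = vec![0.0f32; 68];
    flat[0..4].copy_from_slice(&[0.5, 0.25, 0.0, 0.9]); // keypoint 0 (nose)
    let kps = decode_pose(&flat);
    assert_eq!(kps.len(), 17);
    assert_eq!(kps[0], Keypoint { x: 0.5, y: 0.25, z: 0.0, conf: 0.9 });
    println!("nose = {:?}", kps[0]);
}
```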
#### 5.2 Progressive Loading (ADR-031 Layer A/B/C)
- **Layer A** (instant): Signal-derived pose starts immediately.
- **Layer B** (5-10s): Contrastive embeddings loaded, HNSW index warm.
- **Layer C** (30-60s): Full pose model loaded, inference active.
- Transitions seamlessly; UI badge updates automatically.
## Consequences
### Positive
- Users can train a model on **their own environment** without external tools or Python dependencies.
- LoRA profiles mean a single base model adapts to multiple rooms in minutes, not hours.
- Training progress is visible in real-time — no black-box waiting.
- A/B comparison lets users see the quality jump from signal-derived to model-inference.
- RVF container bundles everything (weights, metadata, LoRA, witness) in one portable file.
- Self-supervised pretraining requires no labels — just leave ESP32s running.
- Progressive loading means the UI is never "loading..." — signal-derived kicks in immediately.
### Negative
- Training requires significant compute: GPU recommended for supervised training (CPU possible but 10-50x slower).
- MM-Fi and Wi-Pose datasets must be downloaded separately (10-50 GB each) — cannot be bundled.
- LoRA rank must be tuned per environment; too low loses expressiveness, too high overfits.
- ONNX Runtime adds ~50 MB to the binary size when GPU support is enabled.
- Real-time inference at 10 FPS requires ~10ms per frame — tight budget on CPU.
- Teacher-student labeling (camera → pose labels → CSI training) requires camera access, which may conflict with the privacy-first premise.
### Mitigations
- Provide pre-trained base `.rvf` model downloadable from releases (trained on MM-Fi + Wi-Pose).
- INT8 quantization (`SEG_QUANT`) reduces model size 4x and speeds inference ~2x on CPU.
- Camera-based labeling is **optional** — self-supervised pretraining works without camera.
- Training API validates VRAM availability before starting GPU training; falls back to CPU with warning.
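The INT8 mitigation above can be sketched as symmetric per-tensor quantization: q = round(w / scale) with scale = max|w| / 127, giving 4× smaller weights and a dequantization error bounded by scale/2. The scheme is illustrative; the actual `SEG_QUANT` layout lives in `rvf_container.rs`:

```rust
/// Symmetric per-tensor INT8 quantization: q = round(w / scale).
fn quantize_int8(w: &[f32]) -> (Vec<i8>, f32) {
    let max_abs = w.iter().fold(0.0f32, |m, &x| m.max(x.abs()));
    let scale = if max_abs == 0.0 { 1.0 } else { max_abs / 127.0 };
    let q = w.iter().map(|&x| (x / scale).round() as i8).collect();
    (q, scale)
}

fn dequantize(q: &[i8], scale: f32) -> Vec<f32> {
    q.iter().map(|&x| x as f32 * scale).collect()
}

fn main() {
    let w = vec![0.5f32, -1.27, 0.01, 1.27];
    let (q, scale) = quantize_int8(&w);
    let back = dequantize(&q, scale);
    // Round-trip error is at most half a quantization step.
    for (orig, deq) in w.iter().zip(&back) {
        assert!((orig - deq).abs() <= scale / 2.0 + 1e-6);
    }
    println!("scale = {scale}, q = {q:?}");
}
```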
## Implementation Order
| Phase | Effort | Dependencies | Priority |
|-------|--------|-------------|----------|
| 1.1 CSI Recording API | 2-3 days | sensing server | High |
| 1.2 Contrastive Pretraining | 3-5 days | ADR-024, recording API | High |
| 2.1 Dataset Integration | 3-5 days | ADR-015, CsiDataset trait | High |
| 2.2 Training API | 2-3 days | training crate, dataset loaders | High |
| 2.3 RVF Export | 1-2 days | RvfBuilder | Medium |
| 3.1 LoRA Fine-Tuning | 3-5 days | base trained model | Medium |
| 3.2 Profile Switching | 1 day | LoRA in RVF | Medium |
| 4.1 Model Panel UI | 2-3 days | models API | High |
| 4.2 Training Dashboard UI | 3-4 days | training API + WS | High |
| 4.3 Live Demo Enhancements | 2-3 days | model loading | Medium |
| 4.4 Settings Extensions | 1 day | model/training APIs | Low |
| 4.5 Dark Mode | 0.5 days | new panels | Low |
| 5.1 Inference Wiring | 3-5 days | ONNX backend, pose tracker | High |
| 5.2 Progressive Loading | 2-3 days | ADR-031 | Medium |
**Total estimate: 4-6 weeks** (phases can overlap; 1+2 parallel with 4).
## Files to Create/Modify
### New Files
- `ui/components/ModelPanel.js` — Model library, inspector, load/unload controls
- `ui/components/TrainingPanel.js` — Recording controls, training progress, metric charts
- `rust-port/.../sensing-server/src/recording.rs` — CSI recording API handlers
- `rust-port/.../sensing-server/src/training_api.rs` — Training API handlers + WS progress stream
- `rust-port/.../sensing-server/src/model_manager.rs` — Model loading, hot-swap, LoRA activation
- `data/models/` — Default model storage directory
### Modified Files
- `rust-port/.../sensing-server/src/main.rs` — Wire recording, training, and model APIs
- `rust-port/.../train/src/trainer.rs` — Add WebSocket progress callback, LoRA training mode
- `rust-port/.../train/src/dataset.rs` — MM-Fi and Wi-Pose dataset loaders
- `rust-port/.../nn/src/onnx.rs` — LoRA weight injection, INT8 quantization support
- `ui/components/LiveDemoTab.js` — Model selector, LoRA dropdown, A/B split view
- `ui/components/SettingsPanel.js` — Model and training configuration sections
- `ui/components/PoseDetectionCanvas.js` — Pose trail rendering, confidence heatmap overlay
- `ui/services/pose.service.js` — Model-inference keypoint processing
- `ui/index.html` — Add Training tab
- `ui/style.css` — Styles for new panels
## References
- ADR-015: MM-Fi + Wi-Pose training datasets
- ADR-016: RuVector training pipeline integration
- ADR-024: Project AETHER — contrastive CSI embedding model
- ADR-029: RuvSense multistatic sensing mode
- ADR-031: RuView sensing-first RF mode (progressive loading)
- ADR-035: Live sensing UI accuracy & data source transparency
- Issue: https://github.com/ruvnet/wifi-densepose/issues/92
- RVF format: `crates/wifi-densepose-sensing-server/src/rvf_container.rs`
- Training crate: `crates/wifi-densepose-train/src/trainer.rs`
- NN inference: `crates/wifi-densepose-nn/src/onnx.rs`


@ -11,6 +11,9 @@
mod rvf_container;
mod rvf_pipeline;
mod vital_signs;
mod recording;
mod model_manager;
mod training_api;
// Training pipeline modules (exposed via lib.rs)
use wifi_densepose_sensing_server::{graph_transformer, trainer, dataset, embedding};
@ -272,6 +275,9 @@ struct AppStateInner {
frame_history: VecDeque<Vec<f64>>,
tick: u64,
source: String,
/// Timestamp of the last ESP32 UDP frame received.
/// Used by the hybrid auto-detect task to switch between esp32 and simulation.
last_esp32_frame: Option<std::time::Instant>,
tx: broadcast::Sender<String>,
total_detections: u64,
start_time: std::time::Instant,
@ -289,6 +295,14 @@ struct AppStateInner {
active_sona_profile: Option<String>,
/// Whether a trained model is loaded.
model_loaded: bool,
/// CSI frame recording state (ADR-036).
recording_state: recording::RecordingState,
/// Currently loaded model via model_manager API (ADR-036).
loaded_model: Option<model_manager::LoadedModelState>,
/// Training pipeline state (ADR-036).
training_state: training_api::TrainingState,
/// Broadcast channel for training progress WebSocket (ADR-036).
training_progress_tx: tokio::sync::broadcast::Sender<String>,
}
/// Number of frames retained in `frame_history` for temporal analysis.
@ -889,6 +903,17 @@ async fn windows_wifi_task(state: SharedState, tick_ms: u64) {
s.latest_vitals = vitals.clone();
let feat_variance = features.variance;
// ADR-036: Capture data for recording before values are moved.
let rec_amps = multi_ap_frame.amplitudes.clone();
let rec_rssi = first_rssi;
let rec_features = serde_json::json!({
"variance": feat_variance,
"motion_band_power": features.motion_band_power,
"breathing_band_power": features.breathing_band_power,
"spectral_power": features.spectral_power,
});
let update = SensingUpdate {
msg_type: "sensing_update".to_string(),
timestamp: chrono::Utc::now().timestamp_millis() as f64 / 1000.0,
@ -921,7 +946,14 @@ async fn windows_wifi_task(state: SharedState, tick_ms: u64) {
if let Ok(json) = serde_json::to_string(&update) {
let _ = s.tx.send(json);
}
s.latest_update = Some(update);
drop(s);
// ADR-036: Record frame if recording is active.
recording::maybe_record_frame(
&state, &rec_amps, rec_rssi, -90.0, &rec_features,
).await;
debug!(
"Multi-BSSID tick #{tick}: {obs_count} BSSIDs, quality={:.2}, verdict={:?}",
@ -998,6 +1030,16 @@ async fn windows_wifi_fallback_tick(state: &SharedState, seq: u32) {
s.latest_vitals = vitals.clone();
let feat_variance = features.variance;
// ADR-036: Capture data for recording before values are moved.
let rec_amps = vec![signal_pct];
let rec_features = serde_json::json!({
"variance": feat_variance,
"motion_band_power": features.motion_band_power,
"breathing_band_power": features.breathing_band_power,
"spectral_power": features.spectral_power,
});
let update = SensingUpdate {
msg_type: "sensing_update".to_string(),
timestamp: chrono::Utc::now().timestamp_millis() as f64 / 1000.0,
@ -1030,7 +1072,14 @@ async fn windows_wifi_fallback_tick(state: &SharedState, seq: u32) {
if let Ok(json) = serde_json::to_string(&update) {
let _ = s.tx.send(json);
}
s.latest_update = Some(update);
drop(s);
// ADR-036: Record frame if recording is active.
recording::maybe_record_frame(
state, &rec_amps, rssi_dbm, -90.0, &rec_features,
).await;
}
/// Probe if Windows WiFi is connected
@ -1766,6 +1815,7 @@ async fn udp_receiver_task(state: SharedState, udp_port: u16) {
let mut s = state.write().await;
s.source = "esp32".to_string();
s.last_esp32_frame = Some(std::time::Instant::now());
// Append current amplitudes to history before extracting features so
// that temporal analysis includes the most recent frame.
@ -1829,7 +1879,25 @@ async fn udp_receiver_task(state: SharedState, udp_port: u16) {
if let Ok(json) = serde_json::to_string(&update) {
let _ = s.tx.send(json);
}
// Capture data for recording before storing.
let rec_amps = frame.amplitudes.iter().take(56).cloned().collect::<Vec<_>>();
let rec_rssi = features.mean_rssi;
let rec_features = serde_json::json!({
"variance": features.variance,
"motion_band_power": features.motion_band_power,
"breathing_band_power": features.breathing_band_power,
"spectral_power": features.spectral_power,
});
s.latest_update = Some(update);
drop(s);
// ADR-036: Record frame if recording is active.
recording::maybe_record_frame(
&state, &rec_amps, rec_rssi,
frame.noise_floor as f64, &rec_features,
).await;
}
}
Err(e) => {
@ -1842,6 +1910,9 @@ async fn udp_receiver_task(state: SharedState, udp_port: u16) {
// ── Simulated data task ──────────────────────────────────────────────────────
/// Duration without ESP32 frames before falling back to simulation.
const ESP32_TIMEOUT: Duration = Duration::from_secs(3);
async fn simulated_data_task(state: SharedState, tick_ms: u64) {
let mut interval = tokio::time::interval(Duration::from_millis(tick_ms));
info!("Simulated data source active (tick={}ms)", tick_ms);
@ -1849,7 +1920,23 @@ async fn simulated_data_task(state: SharedState, tick_ms: u64) {
loop {
interval.tick().await;
// If ESP32 sent a frame recently, skip simulation — real data is flowing.
{
let s = state.read().await;
if let Some(last) = s.last_esp32_frame {
if last.elapsed() < ESP32_TIMEOUT {
continue; // ESP32 is active, don't emit simulated frames
}
}
}
let mut s = state.write().await;
// If we just transitioned from esp32 → simulated, log once.
if s.source == "esp32" {
info!("ESP32 silent for {}s — switching to simulation", ESP32_TIMEOUT.as_secs());
}
s.source = "simulated".to_string();
s.tick += 1;
let tick = s.tick;
@ -1928,7 +2015,24 @@ async fn simulated_data_task(state: SharedState, tick_ms: u64) {
if let Ok(json) = serde_json::to_string(&update) {
let _ = s.tx.send(json);
}
// Capture data for recording before storing.
let rec_amps = frame.amplitudes.clone();
let rec_rssi = features.mean_rssi;
let rec_features = serde_json::json!({
"variance": features.variance,
"motion_band_power": features.motion_band_power,
"breathing_band_power": features.breathing_band_power,
"spectral_power": features.spectral_power,
});
s.latest_update = Some(update);
drop(s);
// ADR-036: Record frame if recording is active.
recording::maybe_record_frame(
&state, &rec_amps, rec_rssi, -90.0, &rec_features,
).await;
}
}
@ -2396,6 +2500,7 @@ async fn main() {
info!(" Source: {}", args.source);
// Auto-detect data source
let is_auto_mode = args.source == "auto";
let source = match args.source.as_str() {
"auto" => {
info!("Auto-detecting data source...");
@ -2406,7 +2511,7 @@ async fn main() {
info!(" Windows WiFi detected");
"wifi"
} else {
info!(" No hardware detected, using simulation");
info!(" No hardware detected, starting with simulation (hot-plug enabled)");
"simulate"
}
}
@ -2488,12 +2593,14 @@ async fn main() {
}
let (tx, _) = broadcast::channel::<String>(256);
let (training_progress_tx, _) = broadcast::channel::<String>(512);
let state: SharedState = Arc::new(RwLock::new(AppStateInner {
latest_update: None,
rssi_history: VecDeque::new(),
frame_history: VecDeque::new(),
tick: 0,
source: source.into(),
last_esp32_frame: if source == "esp32" { Some(std::time::Instant::now()) } else { None },
tx,
total_detections: 0,
start_time: std::time::Instant::now(),
@ -2504,19 +2611,39 @@ async fn main() {
progressive_loader,
active_sona_profile: None,
model_loaded,
recording_state: recording::RecordingState::default(),
loaded_model: None,
training_state: training_api::TrainingState::default(),
training_progress_tx,
}));
// Start background tasks based on source
match source {
"esp32" => {
tokio::spawn(udp_receiver_task(state.clone(), args.udp_port));
tokio::spawn(broadcast_tick_task(state.clone(), args.tick_ms));
// Ensure data directories exist (ADR-036).
for dir in &[recording::RECORDINGS_DIR, model_manager::MODELS_DIR] {
if let Err(e) = std::fs::create_dir_all(dir) {
warn!("Failed to create directory {dir}: {e}");
}
"wifi" => {
tokio::spawn(windows_wifi_task(state.clone(), args.tick_ms));
}
_ => {
tokio::spawn(simulated_data_task(state.clone(), args.tick_ms));
}
// Start background tasks based on source.
// In auto mode we always start BOTH the UDP listener (for ESP32 hot-plug)
// and the simulation task (which self-pauses when ESP32 packets arrive).
if is_auto_mode {
info!("Auto mode: UDP listener + simulation fallback both active (hot-plug enabled)");
tokio::spawn(udp_receiver_task(state.clone(), args.udp_port));
tokio::spawn(simulated_data_task(state.clone(), args.tick_ms));
tokio::spawn(broadcast_tick_task(state.clone(), args.tick_ms));
} else {
match source {
"esp32" => {
tokio::spawn(udp_receiver_task(state.clone(), args.udp_port));
tokio::spawn(broadcast_tick_task(state.clone(), args.tick_ms));
}
"wifi" => {
tokio::spawn(windows_wifi_task(state.clone(), args.tick_ms));
}
_ => {
tokio::spawn(simulated_data_task(state.clone(), args.tick_ms));
}
}
}
@ -2571,6 +2698,10 @@ async fn main() {
.route("/api/v1/stream/pose", get(ws_pose_handler))
// Sensing WebSocket on the HTTP port so the UI can reach it without a second port
.route("/ws/sensing", get(ws_sensing_handler))
// ADR-036: Recording, model management, and training APIs
.merge(recording::routes())
.merge(model_manager::routes())
.merge(training_api::routes())
// Static UI files
.nest_service("/ui", ServeDir::new(&ui_path))
.layer(SetResponseHeaderLayer::overriding(


@ -0,0 +1,482 @@
//! Model loading and lifecycle management API.
//!
//! Provides REST endpoints for listing, loading, and unloading `.rvf` models.
//! Models are stored in `data/models/` and inspected using `RvfReader`.
//!
//! Endpoints:
//! - `GET /api/v1/models` — list all available models
//! - `GET /api/v1/models/:id` — detailed info for a specific model
//! - `POST /api/v1/models/load` — load a model for inference
//! - `POST /api/v1/models/unload` — unload the active model
//! - `GET /api/v1/models/active` — get active model info
//! - `POST /api/v1/models/lora/activate` — activate a LoRA profile
//! - `GET /api/v1/models/lora/profiles` — list LoRA profiles for active model
use std::path::PathBuf;
use std::sync::Arc;
use std::time::Instant;
use axum::{
extract::{Path as AxumPath, State},
response::Json,
routing::{get, post},
Router,
};
use serde::{Deserialize, Serialize};
use tokio::sync::RwLock;
use tracing::{error, info};
use crate::rvf_container::RvfReader;
// ── Models data directory ────────────────────────────────────────────────────
/// Base directory for RVF model files.
pub const MODELS_DIR: &str = "data/models";
// ── Types ────────────────────────────────────────────────────────────────────
/// Summary information for a model discovered on disk.
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct ModelInfo {
pub id: String,
pub filename: String,
pub version: String,
pub description: String,
pub size_bytes: u64,
pub created_at: String,
pub pck_score: Option<f64>,
pub has_quantization: bool,
pub lora_profiles: Vec<String>,
pub segment_count: usize,
}
/// Information about the currently loaded model, including runtime stats.
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct ActiveModelInfo {
pub model_id: String,
pub filename: String,
pub version: String,
pub description: String,
pub avg_inference_ms: f64,
pub frames_processed: u64,
pub pose_source: String,
pub lora_profiles: Vec<String>,
pub active_lora_profile: Option<String>,
}
/// Runtime state for the loaded model.
///
/// Stored inside `AppStateInner` and read by the inference path.
pub struct LoadedModelState {
/// Model identifier (derived from filename).
pub model_id: String,
/// Original filename.
pub filename: String,
/// Version string from the RVF manifest.
pub version: String,
/// Description from the RVF manifest.
pub description: String,
/// LoRA profiles available in this model.
pub lora_profiles: Vec<String>,
/// Currently active LoRA profile (if any).
pub active_lora_profile: Option<String>,
/// Model weights (f32 parameters).
pub weights: Vec<f32>,
/// Number of frames processed since load.
pub frames_processed: u64,
/// Cumulative inference time for avg calculation.
pub total_inference_ms: f64,
/// When the model was loaded.
pub loaded_at: Instant,
}
/// Request body for `POST /api/v1/models/load`.
#[derive(Debug, Deserialize)]
pub struct LoadModelRequest {
pub model_id: String,
}
/// Request body for `POST /api/v1/models/lora/activate`.
#[derive(Debug, Deserialize)]
pub struct ActivateLoraRequest {
pub model_id: String,
pub profile_name: String,
}
/// Shared application state type.
pub type AppState = Arc<RwLock<super::AppStateInner>>;
// ── Internal helpers ─────────────────────────────────────────────────────────
/// Scan the models directory and build `ModelInfo` for each `.rvf` file.
async fn scan_models() -> Vec<ModelInfo> {
let dir = PathBuf::from(MODELS_DIR);
let mut models = Vec::new();
let mut entries = match tokio::fs::read_dir(&dir).await {
Ok(e) => e,
Err(_) => return models,
};
while let Ok(Some(entry)) = entries.next_entry().await {
let path = entry.path();
if path.extension().and_then(|e| e.to_str()) != Some("rvf") {
continue;
}
let filename = path
.file_name()
.unwrap_or_default()
.to_string_lossy()
.to_string();
let id = filename.trim_end_matches(".rvf").to_string();
let size_bytes = tokio::fs::metadata(&path)
.await
.map(|m| m.len())
.unwrap_or(0);
// Read the RVF to extract manifest info.
// This is a blocking I/O operation so we use spawn_blocking.
let path_clone = path.clone();
let info = tokio::task::spawn_blocking(move || {
RvfReader::from_file(&path_clone).ok()
})
.await
.unwrap_or(None);
let (version, description, pck_score, has_quant, lora_profiles, segment_count, created_at) =
if let Some(reader) = &info {
let manifest = reader.manifest().unwrap_or_default();
let metadata = reader.metadata().unwrap_or_default();
let version = manifest
.get("version")
.and_then(|v| v.as_str())
.unwrap_or("unknown")
.to_string();
let description = manifest
.get("description")
.and_then(|v| v.as_str())
.unwrap_or("")
.to_string();
let created_at = manifest
.get("created_at")
.and_then(|v| v.as_str())
.unwrap_or("")
.to_string();
let pck = metadata
.get("training")
.and_then(|t| t.get("best_pck"))
.and_then(|v| v.as_f64());
let has_quant = reader.quant_info().is_some();
let lora = reader.lora_profiles();
let seg_count = reader.segment_count();
(version, description, pck, has_quant, lora, seg_count, created_at)
} else {
(
"unknown".to_string(),
String::new(),
None,
false,
Vec::new(),
0,
String::new(),
)
};
models.push(ModelInfo {
id,
filename,
version,
description,
size_bytes,
created_at,
pck_score,
has_quantization: has_quant,
lora_profiles,
segment_count,
});
}
models.sort_by(|a, b| a.id.cmp(&b.id));
models
}
/// Load a model from disk by ID and return its `LoadedModelState`.
fn load_model_from_disk(model_id: &str) -> Result<LoadedModelState, String> {
let file_path = PathBuf::from(MODELS_DIR).join(format!("{model_id}.rvf"));
let reader = RvfReader::from_file(&file_path)?;
let manifest = reader.manifest().unwrap_or_default();
let version = manifest
.get("version")
.and_then(|v| v.as_str())
.unwrap_or("unknown")
.to_string();
let description = manifest
.get("description")
.and_then(|v| v.as_str())
.unwrap_or("")
.to_string();
let filename = format!("{model_id}.rvf");
let lora_profiles = reader.lora_profiles();
let weights = reader.weights().unwrap_or_default();
Ok(LoadedModelState {
model_id: model_id.to_string(),
filename,
version,
description,
lora_profiles,
active_lora_profile: None,
weights,
frames_processed: 0,
total_inference_ms: 0.0,
loaded_at: Instant::now(),
})
}
// ── Axum handlers ────────────────────────────────────────────────────────────
async fn list_models(State(_state): State<AppState>) -> Json<serde_json::Value> {
let models = scan_models().await;
Json(serde_json::json!({
"models": models,
"count": models.len(),
}))
}
async fn get_model(
State(_state): State<AppState>,
AxumPath(id): AxumPath<String>,
) -> Json<serde_json::Value> {
let models = scan_models().await;
match models.into_iter().find(|m| m.id == id) {
Some(model) => Json(serde_json::to_value(&model).unwrap_or_default()),
None => Json(serde_json::json!({
"status": "error",
"message": format!("Model '{id}' not found"),
})),
}
}
async fn load_model(
State(state): State<AppState>,
Json(body): Json<LoadModelRequest>,
) -> Json<serde_json::Value> {
let model_id = body.model_id.clone();
// Perform blocking file I/O on spawn_blocking.
let load_result = tokio::task::spawn_blocking(move || load_model_from_disk(&model_id))
.await
.map_err(|e| format!("spawn_blocking panicked: {e}"));
let loaded = match load_result {
Ok(Ok(loaded)) => loaded,
Ok(Err(e)) => {
error!("Failed to load model '{}': {e}", body.model_id);
return Json(serde_json::json!({
"status": "error",
"message": format!("Failed to load model: {e}"),
}));
}
Err(e) => {
error!("Internal error loading model: {e}");
return Json(serde_json::json!({
"status": "error",
"message": format!("Internal error: {e}"),
}));
}
};
let model_id = loaded.model_id.clone();
let weight_count = loaded.weights.len();
{
let mut s = state.write().await;
s.loaded_model = Some(loaded);
s.model_loaded = true;
}
info!("Model loaded: {model_id} ({weight_count} params)");
Json(serde_json::json!({
"status": "loaded",
"model_id": model_id,
"weight_count": weight_count,
}))
}
async fn unload_model(State(state): State<AppState>) -> Json<serde_json::Value> {
let mut s = state.write().await;
if s.loaded_model.is_none() {
return Json(serde_json::json!({
"status": "error",
"message": "No model is currently loaded.",
}));
}
let model_id = s
.loaded_model
.as_ref()
.map(|m| m.model_id.clone())
.unwrap_or_default();
s.loaded_model = None;
s.model_loaded = false;
info!("Model unloaded: {model_id}");
Json(serde_json::json!({
"status": "unloaded",
"model_id": model_id,
}))
}
async fn active_model(State(state): State<AppState>) -> Json<serde_json::Value> {
let s = state.read().await;
match &s.loaded_model {
Some(model) => {
let avg_ms = if model.frames_processed > 0 {
model.total_inference_ms / model.frames_processed as f64
} else {
0.0
};
let info = ActiveModelInfo {
model_id: model.model_id.clone(),
filename: model.filename.clone(),
version: model.version.clone(),
description: model.description.clone(),
avg_inference_ms: avg_ms,
frames_processed: model.frames_processed,
pose_source: "model_inference".to_string(),
lora_profiles: model.lora_profiles.clone(),
active_lora_profile: model.active_lora_profile.clone(),
};
Json(serde_json::to_value(&info).unwrap_or_default())
}
None => Json(serde_json::json!({
"status": "no_model",
"message": "No model is currently loaded.",
})),
}
}
async fn activate_lora(
State(state): State<AppState>,
Json(body): Json<ActivateLoraRequest>,
) -> Json<serde_json::Value> {
let mut s = state.write().await;
let model = match s.loaded_model.as_mut() {
Some(m) => m,
None => {
return Json(serde_json::json!({
"status": "error",
"message": "No model is loaded. Load a model first.",
}));
}
};
if model.model_id != body.model_id {
return Json(serde_json::json!({
"status": "error",
"message": format!(
"Model '{}' is not loaded. Active model: '{}'",
body.model_id, model.model_id
),
}));
}
if !model.lora_profiles.contains(&body.profile_name) {
return Json(serde_json::json!({
"status": "error",
"message": format!(
"LoRA profile '{}' not found. Available: {:?}",
body.profile_name, model.lora_profiles
),
}));
}
model.active_lora_profile = Some(body.profile_name.clone());
info!(
"LoRA profile activated: {} on model {}",
body.profile_name, body.model_id
);
Json(serde_json::json!({
"status": "activated",
"model_id": body.model_id,
"profile_name": body.profile_name,
}))
}
async fn list_lora_profiles(State(state): State<AppState>) -> Json<serde_json::Value> {
let s = state.read().await;
match &s.loaded_model {
Some(model) => Json(serde_json::json!({
"model_id": model.model_id,
"profiles": model.lora_profiles,
"active": model.active_lora_profile,
})),
None => Json(serde_json::json!({
"profiles": serde_json::Value::Array(vec![]),
"message": "No model is loaded.",
})),
}
}
// ── Router factory ───────────────────────────────────────────────────────────
/// Build the model management sub-router.
///
/// All routes are prefixed with `/api/v1/models`.
pub fn routes() -> Router<AppState> {
Router::new()
.route("/api/v1/models", get(list_models))
.route("/api/v1/models/active", get(active_model))
.route("/api/v1/models/load", post(load_model))
.route("/api/v1/models/unload", post(unload_model))
.route("/api/v1/models/lora/activate", post(activate_lora))
.route("/api/v1/models/lora/profiles", get(list_lora_profiles))
.route("/api/v1/models/{id}", get(get_model))
}
#[cfg(test)]
mod tests {
use super::*;
#[test]
fn model_info_serializes() {
let info = ModelInfo {
id: "test-model".to_string(),
filename: "test-model.rvf".to_string(),
version: "1.0.0".to_string(),
description: "A test model".to_string(),
size_bytes: 1024,
created_at: "2024-01-01T00:00:00Z".to_string(),
pck_score: Some(0.85),
has_quantization: false,
lora_profiles: vec!["default".to_string()],
segment_count: 5,
};
let json = serde_json::to_string(&info).unwrap();
assert!(json.contains("test-model"));
assert!(json.contains("0.85"));
}
#[test]
fn active_model_info_serializes() {
let info = ActiveModelInfo {
model_id: "demo".to_string(),
filename: "demo.rvf".to_string(),
version: "0.1.0".to_string(),
description: String::new(),
avg_inference_ms: 2.5,
frames_processed: 100,
pose_source: "model_inference".to_string(),
lora_profiles: vec![],
active_lora_profile: None,
};
let json = serde_json::to_string(&info).unwrap();
assert!(json.contains("model_inference"));
}
}
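On the client side, `model.service.js` consumes the JSON shapes produced by the `load_model` handler above. A minimal sketch of interpreting that response (the helper name is illustrative, not taken from the service):

```javascript
// Interpret the JSON body returned by POST /api/v1/models/load.
// Success responses carry status "loaded" with model_id and weight_count;
// failures carry status "error" with a human-readable message.
function interpretLoadResponse(body) {
  if (body.status === 'loaded') {
    return { ok: true, modelId: body.model_id, weightCount: body.weight_count };
  }
  return { ok: false, message: body.message || 'Unknown error' };
}

const ok = interpretLoadResponse({ status: 'loaded', model_id: 'demo', weight_count: 42 });
const err = interpretLoadResponse({ status: 'error', message: 'Failed to load model: not found' });
```

Because the handlers always return HTTP 200 with a `status` field, clients must branch on `status` rather than on the response code.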

recording.rs

@@ -0,0 +1,486 @@
//! CSI frame recording API.
//!
//! Provides REST endpoints for recording CSI frames to `.csi.jsonl` files.
//! When recording is active, each processed CSI frame is appended as a JSON
//! line to the current session file stored under `data/recordings/`.
//!
//! Endpoints:
//! - `POST /api/v1/recording/start` — start a new recording session
//! - `POST /api/v1/recording/stop` — stop the active recording
//! - `GET /api/v1/recording/list` — list all recording sessions
//! - `GET /api/v1/recording/download/{id}` — download a recording file
//! - `DELETE /api/v1/recording/{id}` — delete a recording
use std::path::{Path, PathBuf};
use std::sync::Arc;
use std::time::Instant;
use axum::{
extract::{Path as AxumPath, State},
response::{IntoResponse, Json},
routing::{delete, get, post},
Router,
};
use serde::{Deserialize, Serialize};
use tokio::sync::RwLock;
use tracing::{error, info, warn};
// ── Recording data directory ─────────────────────────────────────────────────
/// Base directory for recording files.
pub const RECORDINGS_DIR: &str = "data/recordings";
// ── Types ────────────────────────────────────────────────────────────────────
/// Request body for `POST /api/v1/recording/start`.
#[derive(Debug, Deserialize)]
pub struct StartRecordingRequest {
pub session_name: String,
pub label: Option<String>,
pub duration_secs: Option<u64>,
}
/// Metadata for a completed or active recording session.
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct RecordingSession {
pub id: String,
pub name: String,
pub label: Option<String>,
pub started_at: String,
pub ended_at: Option<String>,
pub frame_count: u64,
pub file_size_bytes: u64,
pub file_path: String,
}
/// A single recorded CSI frame line (JSONL format).
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct RecordedFrame {
pub timestamp: f64,
pub subcarriers: Vec<f64>,
pub rssi: f64,
pub noise_floor: f64,
pub features: serde_json::Value,
}
/// Runtime state for the active recording session.
///
/// Stored inside `AppStateInner` and checked on each CSI frame tick.
pub struct RecordingState {
/// Whether a recording is currently active.
pub active: bool,
/// Session ID of the active recording.
pub session_id: String,
/// Session display name.
pub session_name: String,
/// Optional label / activity tag.
pub label: Option<String>,
/// Path to the JSONL file being written.
pub file_path: PathBuf,
/// Number of frames written so far.
pub frame_count: u64,
/// When the recording started.
pub start_time: Instant,
/// ISO-8601 start timestamp for metadata.
pub started_at: String,
/// Optional auto-stop duration.
pub duration_secs: Option<u64>,
}
impl Default for RecordingState {
fn default() -> Self {
Self {
active: false,
session_id: String::new(),
session_name: String::new(),
label: None,
file_path: PathBuf::new(),
frame_count: 0,
start_time: Instant::now(),
started_at: String::new(),
duration_secs: None,
}
}
}
/// Shared application state type used across all handlers.
pub type AppState = Arc<RwLock<super::AppStateInner>>;
// ── Public helpers (called from the CSI processing loop in main.rs) ──────────
/// Append a single frame to the active recording file.
///
/// This is designed to be called from the main CSI processing tick.
/// If recording is not active, it returns immediately.
pub async fn maybe_record_frame(
state: &AppState,
subcarriers: &[f64],
rssi: f64,
noise_floor: f64,
features: &serde_json::Value,
) {
let file_path;
let auto_stop;
{
let s = state.read().await;
let rec = &s.recording_state;
if !rec.active {
return;
}
file_path = rec.file_path.clone();
auto_stop = rec
.duration_secs
.map(|d| rec.start_time.elapsed().as_secs() >= d)
.unwrap_or(false);
}
if auto_stop {
// Duration exceeded; stop recording.
stop_recording_inner(state).await;
return;
}
let frame = RecordedFrame {
timestamp: chrono::Utc::now().timestamp_millis() as f64 / 1000.0,
subcarriers: subcarriers.to_vec(),
rssi,
noise_floor,
features: features.clone(),
};
let line = match serde_json::to_string(&frame) {
Ok(l) => l,
Err(e) => {
warn!("Failed to serialize recording frame: {e}");
return;
}
};
// Append line to file (async).
if let Err(e) = append_line(&file_path, &line).await {
warn!("Failed to write recording frame: {e}");
return;
}
// Increment frame counter.
{
let mut s = state.write().await;
s.recording_state.frame_count += 1;
}
}
async fn append_line(path: &Path, line: &str) -> std::io::Result<()> {
use tokio::io::AsyncWriteExt;
let mut file = tokio::fs::OpenOptions::new()
.create(true)
.append(true)
.open(path)
.await?;
file.write_all(line.as_bytes()).await?;
file.write_all(b"\n").await?;
Ok(())
}
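Each line appended by `append_line` is one self-contained JSON object, so consumers can parse a `.csi.jsonl` recording line by line. A quick round-trip check of that format (field names mirror `RecordedFrame`):

```javascript
// Serialize a frame the way the recorder does (one JSON object per line),
// then parse the resulting file contents back line by line.
const frame = {
  timestamp: 1700000000.0,
  subcarriers: [1.0, 2.0, 3.0],
  rssi: -45.0,
  noise_floor: -90.0,
  features: { motion: 0.5 },
};
const fileContents = JSON.stringify(frame) + '\n' + JSON.stringify(frame) + '\n';

// Trailing newline yields an empty final chunk, so filter blanks before parsing.
const frames = fileContents
  .split('\n')
  .filter((line) => line.length > 0)
  .map((line) => JSON.parse(line));
```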
// ── Internal helpers ─────────────────────────────────────────────────────────
/// Stop the active recording and write session metadata.
async fn stop_recording_inner(state: &AppState) {
let mut s = state.write().await;
if !s.recording_state.active {
return;
}
s.recording_state.active = false;
let ended_at = chrono::Utc::now().to_rfc3339();
let session = RecordingSession {
id: s.recording_state.session_id.clone(),
name: s.recording_state.session_name.clone(),
label: s.recording_state.label.clone(),
started_at: s.recording_state.started_at.clone(),
ended_at: Some(ended_at),
frame_count: s.recording_state.frame_count,
file_size_bytes: tokio::fs::metadata(&s.recording_state.file_path)
.await
.map(|m| m.len())
.unwrap_or(0),
file_path: s.recording_state.file_path.to_string_lossy().to_string(),
};
// Write a companion .meta.json alongside the JSONL file.
let meta_path = s.recording_state.file_path.with_extension("meta.json");
if let Ok(json) = serde_json::to_string_pretty(&session) {
if let Err(e) = tokio::fs::write(&meta_path, json).await {
warn!("Failed to write recording metadata: {e}");
}
}
info!(
"Recording stopped: {} ({} frames)",
session.id, session.frame_count
);
}
/// Scan the recordings directory and return all sessions with metadata.
async fn list_sessions() -> Vec<RecordingSession> {
let dir = PathBuf::from(RECORDINGS_DIR);
let mut sessions = Vec::new();
let mut entries = match tokio::fs::read_dir(&dir).await {
Ok(e) => e,
Err(_) => return sessions,
};
while let Ok(Some(entry)) = entries.next_entry().await {
let path = entry.path();
if path.extension().and_then(|e| e.to_str()) == Some("json")
&& path.to_string_lossy().contains(".meta.")
{
if let Ok(data) = tokio::fs::read_to_string(&path).await {
if let Ok(session) = serde_json::from_str::<RecordingSession>(&data) {
sessions.push(session);
}
}
}
}
// Sort by started_at descending (newest first).
sessions.sort_by(|a, b| b.started_at.cmp(&a.started_at));
sessions
}
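The descending sort above compares `started_at` as plain strings. That is sound because fixed-width RFC 3339 UTC timestamps (as produced by `to_rfc3339()` in a consistent format) order lexicographically the same way they order chronologically:

```javascript
const sessions = [
  { id: 'a', started_at: '2024-01-01T12:00:00+00:00' },
  { id: 'b', started_at: '2024-03-15T08:30:00+00:00' },
  { id: 'c', started_at: '2023-12-31T23:59:59+00:00' },
];

// Same comparison as the Rust code: b.cmp(&a) on the raw strings (newest first).
sessions.sort((x, y) => (y.started_at > x.started_at) - (y.started_at < x.started_at));
```

Note this only holds while every timestamp uses the same offset and field widths; mixing `Z` and `+00:00` suffixes would still sort correctly by prefix, but mixing offsets would not.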
// ── Axum handlers ────────────────────────────────────────────────────────────
async fn start_recording(
State(state): State<AppState>,
Json(body): Json<StartRecordingRequest>,
) -> Json<serde_json::Value> {
// Ensure recordings directory exists.
if let Err(e) = tokio::fs::create_dir_all(RECORDINGS_DIR).await {
error!("Failed to create recordings directory: {e}");
return Json(serde_json::json!({
"status": "error",
"message": format!("Cannot create recordings directory: {e}"),
}));
}
let mut s = state.write().await;
if s.recording_state.active {
return Json(serde_json::json!({
"status": "error",
"message": "A recording is already active. Stop it first.",
"active_session": s.recording_state.session_id,
}));
}
let session_id = format!(
"{}-{}",
body.session_name.replace(' ', "_"),
chrono::Utc::now().format("%Y%m%d_%H%M%S")
);
let file_name = format!("{session_id}.csi.jsonl");
let file_path = PathBuf::from(RECORDINGS_DIR).join(&file_name);
let started_at = chrono::Utc::now().to_rfc3339();
s.recording_state = RecordingState {
active: true,
session_id: session_id.clone(),
session_name: body.session_name.clone(),
label: body.label.clone(),
file_path: file_path.clone(),
frame_count: 0,
start_time: Instant::now(),
started_at: started_at.clone(),
duration_secs: body.duration_secs,
};
info!(
"Recording started: {session_id} (label={:?}, duration={:?}s)",
body.label, body.duration_secs
);
Json(serde_json::json!({
"status": "recording",
"session_id": session_id,
"session_name": body.session_name,
"label": body.label,
"started_at": started_at,
"file_path": file_path.to_string_lossy(),
"duration_secs": body.duration_secs,
}))
}
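The handler above only replaces spaces when deriving the session ID from `session_name`, so callers should keep names filesystem-safe. A stricter client-side slug helper (the exact character set here is an assumption, not something the server enforces):

```javascript
// Keep only letters, digits, '-' and '_'; collapse any other run of
// characters (spaces, slashes, dots, punctuation) into a single '_'.
function toSessionSlug(name) {
  return name.replace(/[^A-Za-z0-9_-]+/g, '_');
}
```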
async fn stop_recording(State(state): State<AppState>) -> Json<serde_json::Value> {
{
let s = state.read().await;
if !s.recording_state.active {
return Json(serde_json::json!({
"status": "error",
"message": "No active recording to stop.",
}));
}
}
stop_recording_inner(&state).await;
let s = state.read().await;
Json(serde_json::json!({
"status": "stopped",
"session_id": s.recording_state.session_id,
"frame_count": s.recording_state.frame_count,
}))
}
async fn list_recordings(
State(_state): State<AppState>,
) -> Json<serde_json::Value> {
let sessions = list_sessions().await;
Json(serde_json::json!({
"recordings": sessions,
"count": sessions.len(),
}))
}
async fn download_recording(
State(_state): State<AppState>,
AxumPath(id): AxumPath<String>,
) -> impl IntoResponse {
let dir = PathBuf::from(RECORDINGS_DIR);
// Find the JSONL file matching the ID.
let file_path = dir.join(format!("{id}.csi.jsonl"));
if !file_path.exists() {
return (
axum::http::StatusCode::NOT_FOUND,
Json(serde_json::json!({
"status": "error",
"message": format!("Recording '{id}' not found"),
})),
)
.into_response();
}
match tokio::fs::read(&file_path).await {
Ok(data) => {
let headers = [
(
axum::http::header::CONTENT_TYPE,
"application/x-ndjson".to_string(),
),
(
axum::http::header::CONTENT_DISPOSITION,
format!("attachment; filename=\"{id}.csi.jsonl\""),
),
];
(headers, data).into_response()
}
Err(e) => (
axum::http::StatusCode::INTERNAL_SERVER_ERROR,
Json(serde_json::json!({
"status": "error",
"message": format!("Failed to read recording: {e}"),
})),
)
.into_response(),
}
}
async fn delete_recording(
State(_state): State<AppState>,
AxumPath(id): AxumPath<String>,
) -> Json<serde_json::Value> {
let dir = PathBuf::from(RECORDINGS_DIR);
let jsonl_path = dir.join(format!("{id}.csi.jsonl"));
let meta_path = dir.join(format!("{id}.csi.meta.json"));
if !jsonl_path.exists() && !meta_path.exists() {
return Json(serde_json::json!({
"status": "error",
"message": format!("Recording '{id}' not found"),
}));
}
let mut deleted = Vec::new();
if jsonl_path.exists() {
if let Err(e) = tokio::fs::remove_file(&jsonl_path).await {
warn!("Failed to delete {}: {e}", jsonl_path.display());
} else {
deleted.push(jsonl_path.to_string_lossy().to_string());
}
}
if meta_path.exists() {
if let Err(e) = tokio::fs::remove_file(&meta_path).await {
warn!("Failed to delete {}: {e}", meta_path.display());
} else {
deleted.push(meta_path.to_string_lossy().to_string());
}
}
Json(serde_json::json!({
"status": "deleted",
"id": id,
"deleted_files": deleted,
}))
}
// ── Router factory ───────────────────────────────────────────────────────────
/// Build the recording sub-router.
///
/// Mount this at the top level; all routes are prefixed with `/api/v1/recording`.
pub fn routes() -> Router<AppState> {
Router::new()
.route("/api/v1/recording/start", post(start_recording))
.route("/api/v1/recording/stop", post(stop_recording))
.route("/api/v1/recording/list", get(list_recordings))
.route(
"/api/v1/recording/download/{id}",
get(download_recording),
)
.route("/api/v1/recording/{id}", delete(delete_recording))
}
#[cfg(test)]
mod tests {
use super::*;
#[test]
fn default_recording_state_is_inactive() {
let rs = RecordingState::default();
assert!(!rs.active);
assert_eq!(rs.frame_count, 0);
}
#[test]
fn recorded_frame_serializes_to_json() {
let frame = RecordedFrame {
timestamp: 1700000000.0,
subcarriers: vec![1.0, 2.0, 3.0],
rssi: -45.0,
noise_floor: -90.0,
features: serde_json::json!({"motion": 0.5}),
};
let json = serde_json::to_string(&frame).unwrap();
assert!(json.contains("\"timestamp\""));
assert!(json.contains("\"subcarriers\""));
}
#[test]
fn recording_session_deserializes() {
let json = r#"{
"id": "test-20240101_120000",
"name": "test",
"label": "walking",
"started_at": "2024-01-01T12:00:00Z",
"ended_at": "2024-01-01T12:05:00Z",
"frame_count": 3000,
"file_size_bytes": 1500000,
"file_path": "data/recordings/test-20240101_120000.csi.jsonl"
}"#;
let session: RecordingSession = serde_json::from_str(json).unwrap();
assert_eq!(session.id, "test-20240101_120000");
assert_eq!(session.frame_count, 3000);
assert_eq!(session.label, Some("walking".to_string()));
}
}

app.js

@@ -130,6 +130,9 @@ class WiFiDensePoseApp {
this.components.sensing = new SensingTab(sensingContainer);
}
// Training tab - lazy load to avoid breaking other tabs if import fails
this.initTrainingTab();
// Architecture tab - static content, no component needed
// Performance tab - static content, no component needed
@@ -137,6 +140,28 @@ class WiFiDensePoseApp {
// Applications tab - static content, no component needed
}
// Lazy-load Training tab panels (dynamic import so failures don't break other tabs)
async initTrainingTab() {
try {
const [{ default: TrainingPanel }, { default: ModelPanel }] = await Promise.all([
import('./components/TrainingPanel.js'),
import('./components/ModelPanel.js')
]);
const trainingContainer = document.getElementById('training-panel-container');
if (trainingContainer) {
this.components.trainingPanel = new TrainingPanel(trainingContainer);
}
const modelContainer = document.getElementById('model-panel-container');
if (modelContainer) {
this.components.modelPanel = new ModelPanel(modelContainer);
}
} catch (error) {
console.error('Failed to load Training tab components:', error);
}
}
// Handle tab changes
handleTabChange(newTab, oldTab) {
console.log(`Tab changed from ${oldTab} to ${newTab}`);
@@ -168,6 +193,16 @@ class WiFiDensePoseApp {
});
}
break;
case 'training':
// Refresh panels when training tab becomes visible
if (this.components.trainingPanel && typeof this.components.trainingPanel.refresh === 'function') {
this.components.trainingPanel.refresh();
}
if (this.components.modelPanel && typeof this.components.modelPanel.refresh === 'function') {
this.components.modelPanel.refresh();
}
break;
}
}

DashboardTab.js

@@ -2,6 +2,7 @@
import { healthService } from '../services/health.service.js';
import { poseService } from '../services/pose.service.js';
import { sensingService } from '../services/sensing.service.js';
export class DashboardTab {
constructor(containerElement) {
@@ -63,6 +64,17 @@ export class DashboardTab {
this.updateHealthStatus(health);
});
// Subscribe to sensing service state changes for data source indicator
this._sensingUnsub = sensingService.onStateChange(() => {
this.updateDataSourceIndicator();
});
// Also update on data — catches source changes mid-stream
this._sensingDataUnsub = sensingService.onData(() => {
this.updateDataSourceIndicator();
});
// Initial update
this.updateDataSourceIndicator();
// Start periodic stats updates
this.statsInterval = setInterval(() => {
this.updateLiveStats();
@@ -72,6 +84,25 @@ export class DashboardTab {
healthService.startHealthMonitoring(30000);
}
// Update the data source indicator on the dashboard
updateDataSourceIndicator() {
const el = this.container.querySelector('#dashboard-datasource');
if (!el) return;
const ds = sensingService.dataSource;
const statusText = el.querySelector('.status-text');
const statusMsg = el.querySelector('.status-message');
const config = {
'live': { text: 'ESP32', status: 'healthy', msg: 'Real hardware connected' },
'server-simulated': { text: 'SIMULATED', status: 'warning', msg: 'Server running without hardware' },
'reconnecting': { text: 'RECONNECTING', status: 'degraded', msg: 'Attempting to connect...' },
'simulated': { text: 'OFFLINE', status: 'unhealthy', msg: 'Server unreachable, local fallback' },
};
const cfg = config[ds] || config['reconnecting'];
el.className = `component-status status-${cfg.status}`;
if (statusText) statusText.textContent = cfg.text;
if (statusMsg) statusMsg.textContent = cfg.msg;
}
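The mapping above degrades any unknown state to the `reconnecting` badge. Pulled out as a pure function (an illustrative extraction, not how the component is actually factored), that fallback is easy to unit-test:

```javascript
const SOURCE_CONFIG = {
  'live': { text: 'ESP32', status: 'healthy' },
  'server-simulated': { text: 'SIMULATED', status: 'warning' },
  'reconnecting': { text: 'RECONNECTING', status: 'degraded' },
  'simulated': { text: 'OFFLINE', status: 'unhealthy' },
};

// Unknown or undefined data sources fall back to the reconnecting badge.
function sourceBadge(dataSource) {
  return SOURCE_CONFIG[dataSource] || SOURCE_CONFIG['reconnecting'];
}
```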
// Update API info display
updateApiInfo(info) {
// Update version
@@ -394,11 +425,13 @@ export class DashboardTab {
if (this.healthSubscription) {
this.healthSubscription();
}
if (this._sensingUnsub) this._sensingUnsub();
if (this._sensingDataUnsub) this._sensingDataUnsub();
if (this.statsInterval) {
clearInterval(this.statsInterval);
}
healthService.stopHealthMonitoring();
}
}

LiveDemoTab.js

@@ -4,6 +4,11 @@ import { PoseDetectionCanvas } from './PoseDetectionCanvas.js';
import { poseService } from '../services/pose.service.js';
import { streamService } from '../services/stream.service.js';
import { wsService } from '../services/websocket.service.js';
import { sensingService } from '../services/sensing.service.js';
// Optional services - loaded lazily in init() to avoid blocking module graph
let modelService = null;
let trainingService = null;
export class LiveDemoTab {
constructor(containerElement) {
@@ -32,6 +37,27 @@ export class LiveDemoTab {
connectionAttempts: 0
};
// Model control state
this.modelState = {
models: [],
activeModelId: null,
activeModelInfo: null,
loraProfiles: [],
selectedLoraProfile: null,
loading: false
};
// Training state
this.trainingState = {
status: 'idle', // 'idle' | 'training' | 'recording'
epoch: 0,
totalEpochs: 0,
showTrainingPanel: false
};
// A/B split view state
this.splitViewActive = false;
this.subscriptions = [];
this.logger = this.createLogger();
@@ -58,7 +84,17 @@ export class LiveDemoTab {
async init() {
try {
this.logger.info('Initializing LiveDemoTab component');
// Load optional services (non-blocking)
try {
const mod = await import('../services/model.service.js');
modelService = mod.modelService;
} catch (e) { /* model features disabled */ }
try {
const mod = await import('../services/training.service.js');
trainingService = mod.trainingService;
} catch (e) { /* training features disabled */ }
// Create enhanced DOM structure
this.createEnhancedStructure();
@@ -71,9 +107,31 @@ export class LiveDemoTab {
// Set up monitoring and health checks
this.setupMonitoring();
// Fetch available models on init
this.fetchModels();
// Set up model/training event listeners
this.setupServiceListeners();
// Initialize state
this.updateUI();
// Auto-start pose detection when a backend is reachable.
// Check after a brief delay (sensing WS may still be connecting).
this._autoStartOnce = false;
const tryAutoStart = () => {
if (this._autoStartOnce || this.state.isActive) return;
const ds = sensingService.dataSource;
if (ds === 'live' || ds === 'server-simulated') {
this._autoStartOnce = true;
this.logger.info('Auto-starting pose detection (data source: ' + ds + ')');
this.startDemo();
}
};
setTimeout(tryAutoStart, 2000);
// Also listen for sensing state changes in case server connects later
this._autoStartUnsub = sensingService.onStateChange(tryAutoStart);
this.logger.info('LiveDemoTab component initialized successfully');
} catch (error) {
this.logger.error('Failed to initialize LiveDemoTab', { error: error.message });
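The guard in `tryAutoStart` reduces to a small predicate: start at most once, never while already active, and only when the backend is actually producing data. A pure extraction of that rule (for illustration; the component keeps this state on `this`):

```javascript
// Returns true only when detection has not started yet and the backend
// is reachable ('live' hardware or 'server-simulated' frames).
function shouldAutoStart(dataSource, alreadyStarted, isActive) {
  if (alreadyStarted || isActive) return false;
  return dataSource === 'live' || dataSource === 'server-simulated';
}
```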
@@ -88,6 +146,11 @@ export class LiveDemoTab {
// Create enhanced structure if it doesn't exist
const enhancedHTML = `
<div class="live-demo-enhanced">
<!-- Data source banner: prominent indicator for live vs simulated data -->
<div id="demo-source-banner" class="demo-source-banner demo-source-unknown" role="status" aria-live="polite">
Detecting data source...
</div>
<div class="demo-header">
<div class="demo-title">
<h2>Live Human Pose Detection</h2>
@@ -99,6 +162,7 @@ export class LiveDemoTab {
<div class="demo-controls">
<button class="btn btn--primary" id="start-enhanced-demo">Start Detection</button>
<button class="btn btn--secondary" id="stop-enhanced-demo" disabled>Stop Detection</button>
<button class="btn btn--accent" id="run-offline-demo">Demo</button>
<button class="btn btn--primary" id="toggle-debug">Debug Mode</button>
<select class="zone-select" id="zone-selector">
<option value="zone_1">Zone 1</option>
@@ -148,6 +212,49 @@ export class LiveDemoTab {
</div>
</div>
<div class="model-control-panel" id="model-control-panel">
<h4>Model Control</h4>
<div class="setting-row-ld">
<label class="ld-label">Model:</label>
<select class="ld-select" id="model-selector">
<option value="">Signal-Derived (no model)</option>
</select>
</div>
<div class="model-info-row" id="model-active-info" style="display: none;">
<span class="ld-label" id="model-active-name"></span>
<span class="model-pck-badge" id="model-active-pck"></span>
</div>
<div class="setting-row-ld" id="lora-profile-row" style="display: none;">
<label class="ld-label">LoRA Profile:</label>
<select class="ld-select" id="lora-profile-selector">
<option value="">None</option>
</select>
</div>
<div class="model-actions">
<button class="btn-ld btn-ld-accent" id="load-model-btn">Load Model</button>
<button class="btn-ld btn-ld-muted" id="unload-model-btn" disabled>Unload</button>
</div>
<div class="model-status-text" id="model-status-text">No model loaded</div>
</div>
<div class="split-view-panel">
<div class="setting-row-ld">
<label class="ld-label">Compare: Signal vs Model</label>
<button class="btn-ld btn-ld-toggle" id="split-view-toggle" disabled>Off</button>
</div>
</div>
<div class="training-quick-panel" id="training-quick-panel">
<h4>Training</h4>
<div class="training-status-row">
<span class="training-status-badge" id="training-status-badge">Idle</span>
</div>
<div class="training-actions">
<button class="btn-ld btn-ld-accent" id="open-training-panel-btn">Open Training Panel</button>
<button class="btn-ld btn-ld-muted" id="quick-record-btn">Record 60s</button>
</div>
</div>
<div class="setup-guide-panel">
<h4>Setup Guide</h4>
<div class="setup-levels">
@@ -606,6 +713,270 @@ export class LiveDemoTab {
border-radius: 3px;
font-size: 10px;
}
/* Model Control Panel */
.model-control-panel,
.split-view-panel,
.training-quick-panel {
background: rgba(17, 24, 39, 0.9);
border: 1px solid rgba(56, 68, 89, 0.6);
border-radius: 12px;
padding: 16px;
}
.model-control-panel h4,
.training-quick-panel h4 {
margin: 0 0 12px 0;
color: #e0e0e0;
font-size: 14px;
font-weight: 600;
}
.setting-row-ld {
display: flex;
justify-content: space-between;
align-items: center;
margin-bottom: 10px;
gap: 8px;
}
.ld-label {
color: #8899aa;
font-size: 11px;
flex-shrink: 0;
}
.ld-select {
flex: 1;
padding: 6px 10px;
border: 1px solid rgba(56, 68, 89, 0.6);
border-radius: 6px;
background: rgba(15, 20, 35, 0.8);
color: #b0b8c8;
font-size: 12px;
cursor: pointer;
min-width: 0;
}
.ld-select:focus {
outline: none;
border-color: #667eea;
box-shadow: 0 0 0 2px rgba(102, 126, 234, 0.15);
}
.ld-select option {
background: #1a2234;
color: #c8d0dc;
}
.model-info-row {
display: flex;
justify-content: space-between;
align-items: center;
margin-bottom: 10px;
padding: 6px 8px;
background: rgba(30, 40, 60, 0.6);
border-radius: 6px;
}
.model-pck-badge {
font-size: 11px;
font-weight: 600;
padding: 2px 8px;
border-radius: 8px;
background: rgba(102, 126, 234, 0.15);
color: #8ea4f0;
}
.model-actions,
.training-actions {
display: flex;
gap: 8px;
margin-top: 10px;
}
.btn-ld {
flex: 1;
padding: 7px 12px;
border: 1px solid rgba(255, 255, 255, 0.1);
border-radius: 8px;
font-size: 12px;
font-weight: 500;
cursor: pointer;
transition: all 0.2s ease;
text-align: center;
}
.btn-ld:disabled {
opacity: 0.4;
cursor: not-allowed;
}
.btn-ld-accent {
background: rgba(102, 126, 234, 0.15);
color: #8ea4f0;
border-color: rgba(102, 126, 234, 0.3);
}
.btn-ld-accent:hover:not(:disabled) {
background: rgba(102, 126, 234, 0.25);
border-color: rgba(102, 126, 234, 0.5);
}
.btn-ld-muted {
background: rgba(30, 40, 60, 0.8);
color: #8899aa;
border-color: rgba(255, 255, 255, 0.08);
}
.btn-ld-muted:hover:not(:disabled) {
background: rgba(40, 50, 70, 0.9);
color: #b0b8c8;
}
.btn-ld-toggle {
min-width: 44px;
flex: 0;
padding: 4px 10px;
background: rgba(30, 40, 60, 0.8);
color: #8899aa;
border-color: rgba(255, 255, 255, 0.08);
border-radius: 12px;
font-size: 11px;
}
.btn-ld-toggle.active {
background: rgba(0, 212, 255, 0.15);
color: #00d4ff;
border-color: rgba(0, 212, 255, 0.4);
}
.model-status-text {
margin-top: 8px;
font-size: 11px;
color: #6b7a8d;
}
.training-status-row {
margin-bottom: 8px;
}
.training-status-badge {
display: inline-block;
padding: 3px 10px;
border-radius: 10px;
font-size: 11px;
font-weight: 600;
text-transform: uppercase;
letter-spacing: 0.4px;
background: rgba(108, 117, 125, 0.15);
color: #8899aa;
border: 1px solid rgba(108, 117, 125, 0.3);
}
.training-status-badge.training {
background: rgba(251, 191, 36, 0.12);
color: #fbbf24;
border-color: rgba(251, 191, 36, 0.3);
}
.training-status-badge.recording {
background: rgba(239, 68, 68, 0.12);
color: #ef4444;
border-color: rgba(239, 68, 68, 0.3);
animation: pulse 1.5s ease-in-out infinite;
}
/* A/B Split View Overlay */
.split-view-divider {
position: absolute;
top: 0;
bottom: 0;
left: 50%;
width: 2px;
background: repeating-linear-gradient(
to bottom,
rgba(255, 255, 255, 0.4) 0px,
rgba(255, 255, 255, 0.4) 6px,
transparent 6px,
transparent 12px
);
z-index: 15;
pointer-events: none;
}
.split-view-label {
position: absolute;
top: 8px;
z-index: 16;
font-size: 10px;
font-weight: 600;
text-transform: uppercase;
letter-spacing: 0.5px;
padding: 3px 8px;
border-radius: 4px;
pointer-events: none;
}
.split-view-label.left {
left: 8px;
background: rgba(0, 204, 136, 0.2);
color: #00cc88;
}
.split-view-label.right {
right: 8px;
background: rgba(102, 126, 234, 0.2);
color: #8ea4f0;
}
/* Training modal overlay */
.training-panel-overlay {
position: fixed;
top: 0;
left: 0;
right: 0;
bottom: 0;
background: rgba(0, 0, 0, 0.7);
display: flex;
align-items: center;
justify-content: center;
z-index: 1000;
}
.training-panel-modal {
background: #0d1117;
border: 1px solid rgba(56, 68, 89, 0.6);
border-radius: 12px;
padding: 24px;
min-width: 400px;
max-width: 600px;
max-height: 80vh;
overflow-y: auto;
color: #e0e0e0;
}
.training-panel-modal h3 {
margin: 0 0 16px 0;
font-size: 18px;
color: #e0e0e0;
}
.training-panel-modal .close-btn {
float: right;
background: rgba(30, 40, 60, 0.8);
border: 1px solid rgba(255, 255, 255, 0.1);
color: #8899aa;
border-radius: 6px;
padding: 4px 10px;
cursor: pointer;
font-size: 12px;
}
.training-panel-modal .close-btn:hover {
background: rgba(50, 60, 80, 0.9);
color: #c8d0dc;
}
`;
if (!document.querySelector('#live-demo-enhanced-styles')) {
@@ -664,6 +1035,16 @@ export class LiveDemoTab {
stopBtn.addEventListener('click', () => this.stopDemo());
}
// Offline demo button — runs client-side animated demo (no server needed)
const offlineDemoBtn = this.container.querySelector('#run-offline-demo');
if (offlineDemoBtn) {
offlineDemoBtn.addEventListener('click', () => {
if (this.components.poseCanvas) {
this.components.poseCanvas.toggleDemo();
}
});
}
if (debugBtn) {
debugBtn.addEventListener('click', () => this.toggleDebugMode());
}
@@ -690,6 +1071,9 @@ export class LiveDemoTab {
exportLogsBtn.addEventListener('click', () => this.exportLogs());
}
// Model, training, and split-view controls
this.setupModelTrainingControls();
this.logger.debug('Enhanced controls set up');
}
@@ -706,6 +1090,23 @@ export class LiveDemoTab {
this.updateMetricsDisplay();
}, 1000);
// Subscribe to sensing service for data-source changes
this._sensingStateUnsub = sensingService.onStateChange(() => {
this.updateSourceBanner();
this.updateStatusIndicator();
});
// Throttle data-based banner updates (frames arrive at 10Hz)
let lastBannerUpdate = 0;
this._sensingDataUnsub = sensingService.onData(() => {
const now = Date.now();
if (now - lastBannerUpdate > 2000) {
lastBannerUpdate = now;
this.updateSourceBanner();
}
});
// Initial banner update
this.updateSourceBanner();
this.logger.debug('Monitoring set up');
}
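The 2-second gate in the `onData` handler above is a plain timestamp throttle. Factored out as a standalone sketch (the `throttle` helper and injectable clock are illustrative, not part of the codebase, which inlines this logic):

```javascript
// Timestamp throttle: fn runs at most once per windowMs.
// `now` is injectable so the window can be tested without real time.
function throttle(fn, windowMs, now = Date.now) {
  let last = 0;
  return (...args) => {
    const t = now();
    if (t - last > windowMs) {
      last = t;
      fn(...args);
    }
  };
}
```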
@@ -901,17 +1302,40 @@ export class LiveDemoTab {
}
getStatusClass() {
if (!this.state.isActive) {
return this.state.connectionState === 'error' ? 'error' : '';
}
const ds = sensingService.dataSource;
if (ds === 'live') return 'active';
if (ds === 'server-simulated') return 'sim';
return 'connecting';
}
getStatusText() {
if (!this.state.isActive) {
return this.state.connectionState === 'error' ? 'Error' : 'Ready';
}
const ds = sensingService.dataSource;
if (ds === 'live') return 'Active \u2014 ESP32 Live';
if (ds === 'server-simulated') return 'Active \u2014 Simulated Data';
if (ds === 'simulated') return 'Active \u2014 Offline Simulation';
return 'Connecting...';
}
/** Update the prominent data-source banner at the top of Live Demo. */
updateSourceBanner() {
const banner = this.container.querySelector('#demo-source-banner');
if (!banner) return;
const ds = sensingService.dataSource;
const config = {
'live': { text: 'LIVE \u2014 ESP32 Hardware Connected', cls: 'demo-source-live' },
'server-simulated': { text: 'SIMULATED DATA \u2014 No Hardware Detected', cls: 'demo-source-sim' },
'reconnecting': { text: 'RECONNECTING TO SERVER...', cls: 'demo-source-reconnecting' },
'simulated': { text: 'OFFLINE \u2014 Server Unreachable, Local Sim', cls: 'demo-source-offline' },
};
const cfg = config[ds] || config['reconnecting'];
banner.textContent = cfg.text;
banner.className = 'demo-source-banner ' + cfg.cls;
}
updateControls() {
@@ -942,8 +1366,20 @@
};
if (elements.connectionStatus) {
const ds = sensingService.dataSource;
const dsLabels = {
'live': 'Connected \u2014 ESP32',
'server-simulated': 'Connected \u2014 Simulated',
'reconnecting': 'Reconnecting...',
'simulated': 'Offline \u2014 Simulated',
};
const label = dsLabels[ds] || this.state.connectionState;
elements.connectionStatus.textContent = label;
const cls = ds === 'live' ? 'good'
: ds === 'server-simulated' ? 'sim'
: ds === 'simulated' ? 'bad'
: this.getHealthClass(this.state.connectionState);
elements.connectionStatus.className = `health-${cls}`;
}
if (elements.frameCount) {
@@ -1061,6 +1497,356 @@ export class LiveDemoTab {
}
}
// --- Model Control Methods ---
async fetchModels() {
if (!modelService) return;
try {
const data = await modelService.listModels();
this.modelState.models = data?.models || [];
this.populateModelSelector();
// Check if a model is already active
const active = await modelService.getActiveModel();
if (active && active.model_id) {
this.modelState.activeModelId = active.model_id;
this.modelState.activeModelInfo = active;
this.updateModelUI();
}
} catch (error) {
this.logger.warn('Could not fetch models', { error: error.message });
}
}
populateModelSelector() {
const selector = this.container.querySelector('#model-selector');
if (!selector) return;
// Reset to the default "Signal-Derived" option before repopulating
selector.innerHTML = '<option value="">Signal-Derived (no model)</option>';
this.modelState.models.forEach(model => {
const opt = document.createElement('option');
opt.value = model.id || model.model_id || model.name;
opt.textContent = model.name || model.id || 'Unknown Model';
selector.appendChild(opt);
});
if (this.modelState.activeModelId) {
selector.value = this.modelState.activeModelId;
}
}
async handleLoadModel() {
if (!modelService) return;
const selector = this.container.querySelector('#model-selector');
const modelId = selector?.value;
if (!modelId) {
this.setModelStatus('Select a model first');
return;
}
try {
this.modelState.loading = true;
this.setModelStatus('Loading...');
const loadBtn = this.container.querySelector('#load-model-btn');
if (loadBtn) loadBtn.disabled = true;
await modelService.loadModel(modelId);
this.modelState.activeModelId = modelId;
// Try to fetch full info
try {
const info = await modelService.getModel(modelId);
this.modelState.activeModelInfo = info;
} catch (e) {
this.modelState.activeModelInfo = { model_id: modelId };
}
// Fetch LoRA profiles
try {
const profiles = await modelService.getLoraProfiles();
this.modelState.loraProfiles = profiles || [];
} catch (e) {
this.modelState.loraProfiles = [];
}
this.modelState.loading = false;
this.updateModelUI();
this.updateSplitViewAvailability();
// Update pose source badge to model inference
this.setState({ poseSource: 'model_inference' });
} catch (error) {
this.modelState.loading = false;
this.setModelStatus(`Error: ${error.message}`);
const loadBtn = this.container.querySelector('#load-model-btn');
if (loadBtn) loadBtn.disabled = false;
this.logger.error('Failed to load model', { error: error.message });
}
}
async handleUnloadModel() {
if (!modelService) return;
try {
await modelService.unloadModel();
this.modelState.activeModelId = null;
this.modelState.activeModelInfo = null;
this.modelState.loraProfiles = [];
this.modelState.selectedLoraProfile = null;
this.updateModelUI();
this.updateSplitViewAvailability();
this.disableSplitView();
this.setState({ poseSource: 'signal_derived' });
} catch (error) {
this.setModelStatus(`Error: ${error.message}`);
this.logger.error('Failed to unload model', { error: error.message });
}
}
async handleLoraProfileChange(profileName) {
if (!modelService || !this.modelState.activeModelId) return;
if (!profileName) return;
try {
await modelService.activateLoraProfile(this.modelState.activeModelId, profileName);
this.modelState.selectedLoraProfile = profileName;
this.setModelStatus(`LoRA: ${profileName} active`);
} catch (error) {
this.setModelStatus(`LoRA error: ${error.message}`);
}
}
updateModelUI() {
const loadBtn = this.container.querySelector('#load-model-btn');
const unloadBtn = this.container.querySelector('#unload-model-btn');
const infoRow = this.container.querySelector('#model-active-info');
const nameEl = this.container.querySelector('#model-active-name');
const pckEl = this.container.querySelector('#model-active-pck');
const loraRow = this.container.querySelector('#lora-profile-row');
const loraSel = this.container.querySelector('#lora-profile-selector');
const isLoaded = !!this.modelState.activeModelId;
if (loadBtn) loadBtn.disabled = isLoaded;
if (unloadBtn) unloadBtn.disabled = !isLoaded;
if (infoRow) {
infoRow.style.display = isLoaded ? 'flex' : 'none';
}
if (isLoaded && this.modelState.activeModelInfo) {
const info = this.modelState.activeModelInfo;
const name = info.name || info.model_id || this.modelState.activeModelId;
const version = info.version ? ` v${info.version}` : '';
const pck = info.pck_score != null ? info.pck_score.toFixed(2) : '--';
if (nameEl) nameEl.textContent = `${name}${version}`;
if (pckEl) pckEl.textContent = `PCK: ${pck}`;
this.setModelStatus(`Model: ${name} (PCK: ${pck})`);
} else if (!isLoaded) {
this.setModelStatus('No model loaded');
}
// LoRA profiles
if (loraRow && loraSel) {
if (isLoaded && this.modelState.loraProfiles.length > 0) {
loraRow.style.display = 'flex';
loraSel.innerHTML = '<option value="">None</option>';
this.modelState.loraProfiles.forEach(profile => {
const opt = document.createElement('option');
opt.value = profile.name || profile;
opt.textContent = profile.name || profile;
loraSel.appendChild(opt);
});
} else {
loraRow.style.display = 'none';
}
}
}
setModelStatus(text) {
const el = this.container.querySelector('#model-status-text');
if (el) el.textContent = text;
}
// --- A/B Split View Methods ---
updateSplitViewAvailability() {
const toggle = this.container.querySelector('#split-view-toggle');
if (toggle) {
toggle.disabled = !this.modelState.activeModelId;
}
}
toggleSplitView() {
if (!this.modelState.activeModelId) return;
this.splitViewActive = !this.splitViewActive;
const toggle = this.container.querySelector('#split-view-toggle');
if (toggle) {
toggle.textContent = this.splitViewActive ? 'On' : 'Off';
toggle.classList.toggle('active', this.splitViewActive);
}
this.updateSplitViewOverlay();
}
disableSplitView() {
this.splitViewActive = false;
const toggle = this.container.querySelector('#split-view-toggle');
if (toggle) {
toggle.textContent = 'Off';
toggle.classList.remove('active');
}
this.updateSplitViewOverlay();
}
updateSplitViewOverlay() {
const mainContainer = this.container.querySelector('.pose-detection-container');
if (!mainContainer) return;
// Remove existing overlays
mainContainer.querySelectorAll('.split-view-divider, .split-view-label').forEach(el => el.remove());
if (this.splitViewActive) {
const divider = document.createElement('div');
divider.className = 'split-view-divider';
mainContainer.appendChild(divider);
const leftLabel = document.createElement('div');
leftLabel.className = 'split-view-label left';
leftLabel.textContent = 'Signal-Derived';
mainContainer.appendChild(leftLabel);
const rightLabel = document.createElement('div');
rightLabel.className = 'split-view-label right';
rightLabel.textContent = 'Model Inference';
mainContainer.appendChild(rightLabel);
}
}
// --- Training Quick-Panel Methods ---
updateTrainingStatus() {
const badge = this.container.querySelector('#training-status-badge');
if (!badge) return;
const state = this.trainingState.status;
badge.classList.remove('training', 'recording');
if (state === 'training') {
badge.classList.add('training');
badge.textContent = `Training epoch ${this.trainingState.epoch}/${this.trainingState.totalEpochs}`;
} else if (state === 'recording') {
badge.classList.add('recording');
badge.textContent = 'Recording...';
} else {
badge.textContent = 'Idle';
}
}
async handleQuickRecord() {
if (!trainingService) {
this.logger.warn('Training service not available');
return;
}
try {
await trainingService.startRecording({ session_name: `quick_${Date.now()}`, duration_secs: 60 });
this.trainingState.status = 'recording';
this.updateTrainingStatus();
// Auto-reset after ~65 seconds
setTimeout(() => {
if (this.trainingState.status === 'recording') {
this.trainingState.status = 'idle';
this.updateTrainingStatus();
}
}, 65000);
} catch (error) {
this.logger.error('Quick record failed', { error: error.message });
}
}
showTrainingPanel() {
// Create a simple modal overlay for the training panel
const existing = document.querySelector('.training-panel-overlay');
if (existing) existing.remove();
const overlay = document.createElement('div');
overlay.className = 'training-panel-overlay';
overlay.innerHTML = `
<div class="training-panel-modal">
<button class="close-btn" id="close-training-modal">Close</button>
<h3>Training Panel</h3>
<p style="color: #8899aa; font-size: 13px; margin-bottom: 16px;">
Configure and start model training from here. Connect to the backend training API to manage epochs, datasets, and checkpoints.
</p>
<div style="display: flex; flex-direction: column; gap: 10px;">
<div class="setting-row-ld">
<label class="ld-label" style="flex: 1;">Status:</label>
<span style="color: #c8d0dc; font-size: 12px;">${this.trainingState.status}</span>
</div>
<div class="setting-row-ld">
<label class="ld-label" style="flex: 1;">Training service:</label>
<span style="color: ${trainingService ? '#00cc88' : '#ef4444'}; font-size: 12px;">${trainingService ? 'Connected' : 'Not available'}</span>
</div>
</div>
</div>
`;
document.body.appendChild(overlay);
// Close handler
overlay.querySelector('#close-training-modal').addEventListener('click', () => overlay.remove());
overlay.addEventListener('click', (e) => {
if (e.target === overlay) overlay.remove();
});
}
// --- Service Event Listeners ---
setupServiceListeners() {
if (modelService) {
const unsub1 = modelService.on('model-loaded', (data) => {
this.logger.info('Model loaded event', data);
});
const unsub2 = modelService.on('model-unloaded', () => {
this.modelState.activeModelId = null;
this.modelState.activeModelInfo = null;
this.updateModelUI();
this.disableSplitView();
});
this.subscriptions.push(unsub1, unsub2);
}
if (trainingService) {
const unsub3 = trainingService.on('progress', (data) => {
if (data && data.epoch != null) {
this.trainingState.epoch = data.epoch;
this.trainingState.totalEpochs = data.total_epochs || data.totalEpochs || this.trainingState.totalEpochs;
this.trainingState.status = 'training';
this.updateTrainingStatus();
}
});
const unsub4 = trainingService.on('training-stopped', () => {
this.trainingState.status = 'idle';
this.updateTrainingStatus();
});
this.subscriptions.push(unsub3, unsub4);
}
}
// --- Enhanced Controls Setup ---
setupModelTrainingControls() {
// Model control buttons
const loadBtn = this.container.querySelector('#load-model-btn');
const unloadBtn = this.container.querySelector('#unload-model-btn');
const loraSel = this.container.querySelector('#lora-profile-selector');
const splitToggle = this.container.querySelector('#split-view-toggle');
const openTrainingBtn = this.container.querySelector('#open-training-panel-btn');
const quickRecordBtn = this.container.querySelector('#quick-record-btn');
if (loadBtn) loadBtn.addEventListener('click', () => this.handleLoadModel());
if (unloadBtn) unloadBtn.addEventListener('click', () => this.handleUnloadModel());
if (loraSel) loraSel.addEventListener('change', (e) => this.handleLoraProfileChange(e.target.value));
if (splitToggle) splitToggle.addEventListener('click', () => this.toggleSplitView());
if (openTrainingBtn) openTrainingBtn.addEventListener('click', () => this.showTrainingPanel());
if (quickRecordBtn) quickRecordBtn.addEventListener('click', () => this.handleQuickRecord());
}
// Clean up
dispose() {
try {
@@ -1088,6 +1874,9 @@ export class LiveDemoTab {
// Unsubscribe from services
this.subscriptions.forEach(unsubscribe => unsubscribe());
this.subscriptions = [];
if (this._sensingStateUnsub) this._sensingStateUnsub();
if (this._sensingDataUnsub) this._sensingDataUnsub();
if (this._autoStartUnsub) this._autoStartUnsub();
this.logger.info('LiveDemoTab component disposed successfully');
} catch (error) {

ui/components/ModelPanel.js

@@ -0,0 +1,230 @@
// ModelPanel Component for WiFi-DensePose UI
// Dark-mode panel for model management: listing, loading, LoRA profiles.
import { modelService } from '../services/model.service.js';
const MP_STYLES = `
.mp-panel{background:rgba(17,24,39,.9);border:1px solid rgba(56,68,89,.6);border-radius:8px;font-family:-apple-system,BlinkMacSystemFont,'Segoe UI',Roboto,sans-serif;color:#e0e0e0;overflow:hidden}
.mp-header{display:flex;align-items:center;justify-content:space-between;padding:14px 16px;background:rgba(13,17,23,.95);border-bottom:1px solid rgba(56,68,89,.6)}
.mp-title{font-size:14px;font-weight:600;color:#e0e0e0}
.mp-badge{background:rgba(102,126,234,.2);color:#8ea4f0;font-size:11px;font-weight:600;padding:2px 8px;border-radius:10px;border:1px solid rgba(102,126,234,.3)}
.mp-error{background:rgba(220,53,69,.15);color:#f5a0a8;border:1px solid rgba(220,53,69,.3);border-radius:4px;padding:8px 12px;margin:10px 12px 0;font-size:12px}
.mp-active-card{margin:12px;padding:12px;background:rgba(13,17,23,.8);border:1px solid rgba(56,68,89,.6);border-left:3px solid #28a745;border-radius:6px}
.mp-active-name{font-size:14px;font-weight:600;color:#c8d0dc;margin-bottom:6px}
.mp-active-meta{display:flex;gap:6px;flex-wrap:wrap;margin-bottom:8px}
.mp-active-stats{font-size:12px;color:#8899aa;margin-bottom:10px}
.mp-stat-label{color:#8899aa}.mp-stat-value{color:#c8d0dc;font-weight:500}.mp-stat-sep{color:rgba(56,68,89,.8);margin:0 6px}
.mp-lora-row{display:flex;align-items:center;gap:8px;margin-bottom:10px}
.mp-lora-label{font-size:12px;color:#8899aa}
.mp-lora-select{flex:1;padding:4px 8px;background:rgba(30,40,60,.8);border:1px solid rgba(56,68,89,.6);border-radius:4px;color:#c8d0dc;font-size:12px}
.mp-list-section{padding:0 12px 12px}
.mp-section-title{font-size:11px;font-weight:600;text-transform:uppercase;letter-spacing:.5px;color:#8899aa;padding:10px 0 8px}
.mp-model-card{padding:10px;margin-bottom:8px;background:rgba(13,17,23,.6);border:1px solid rgba(56,68,89,.4);border-radius:6px;transition:border-color .2s}
.mp-model-card:hover{border-color:rgba(102,126,234,.4)}
.mp-card-name{font-size:13px;font-weight:500;color:#c8d0dc;margin-bottom:4px}
.mp-card-meta{display:flex;gap:6px;flex-wrap:wrap;margin-bottom:8px}
.mp-meta-tag{background:rgba(30,40,60,.8);color:#8899aa;font-size:10px;padding:2px 6px;border-radius:3px;border:1px solid rgba(56,68,89,.4)}
.mp-card-actions{display:flex;gap:6px}
.mp-empty{color:#6b7a8d;font-size:12px;padding:16px 0;text-align:center;line-height:1.5}
.mp-footer{padding:10px 12px;border-top:1px solid rgba(56,68,89,.4);display:flex;justify-content:flex-end}
.mp-btn{padding:5px 12px;border-radius:4px;font-size:12px;font-weight:500;cursor:pointer;border:1px solid transparent;transition:all .15s}
.mp-btn:disabled{opacity:.5;cursor:not-allowed}
.mp-btn-success{background:rgba(40,167,69,.2);color:#51cf66;border-color:rgba(40,167,69,.3)}
.mp-btn-success:hover:not(:disabled){background:rgba(40,167,69,.35)}
.mp-btn-danger{background:rgba(220,53,69,.2);color:#ff6b6b;border-color:rgba(220,53,69,.3)}
.mp-btn-danger:hover:not(:disabled){background:rgba(220,53,69,.35)}
.mp-btn-secondary{background:rgba(30,40,60,.8);color:#b0b8c8;border-color:rgba(56,68,89,.6)}
.mp-btn-secondary:hover:not(:disabled){background:rgba(40,50,75,.9)}
.mp-btn-muted{background:transparent;color:#6b7a8d;border-color:rgba(56,68,89,.4);font-size:11px;padding:4px 8px}
.mp-btn-muted:hover:not(:disabled){color:#ff6b6b;border-color:rgba(220,53,69,.3)}
`;
export default class ModelPanel {
constructor(container) {
this.container = typeof container === 'string'
? document.getElementById(container) : container;
if (!this.container) throw new Error('ModelPanel: container element not found');
this.state = { models: [], activeModel: null, loraProfiles: [], loading: false, error: null };
this.unsubs = [];
this._injectStyles();
this.render();
this.refresh();
this.unsubs.push(
modelService.on('model-loaded', () => this.refresh()),
modelService.on('model-unloaded', () => this.refresh()),
modelService.on('lora-activated', () => this.refresh())
);
}
// --- Data ---
async refresh() {
this._set({ loading: true, error: null });
try {
const [listRes, active] = await Promise.all([
modelService.listModels().catch(() => ({ models: [] })),
modelService.getActiveModel().catch(() => null)
]);
let lora = [];
if (active) lora = await modelService.getLoraProfiles().catch(() => []);
this._set({ models: listRes?.models ?? [], activeModel: active, loraProfiles: lora, loading: false });
} catch (e) { this._set({ loading: false, error: e.message }); }
}
// --- Actions ---
async _load(id) {
this._set({ loading: true, error: null });
try { await modelService.loadModel(id); await this.refresh(); }
catch (e) { this._set({ loading: false, error: `Load failed: ${e.message}` }); }
}
async _unload() {
this._set({ loading: true, error: null });
try { await modelService.unloadModel(); await this.refresh(); }
catch (e) { this._set({ loading: false, error: `Unload failed: ${e.message}` }); }
}
async _delete(id) {
this._set({ loading: true, error: null });
try { await modelService.deleteModel(id); await this.refresh(); }
catch (e) { this._set({ loading: false, error: `Delete failed: ${e.message}` }); }
}
async _loraChange(modelId, profile) {
if (!profile) return;
this._set({ loading: true, error: null });
try { await modelService.activateLoraProfile(modelId, profile); await this.refresh(); }
catch (e) { this._set({ loading: false, error: `LoRA failed: ${e.message}` }); }
}
_set(p) { Object.assign(this.state, p); this.render(); }
// --- Render ---
render() {
const el = this.container;
el.innerHTML = '';
const panel = this._el('div', 'mp-panel');
// Header
const hdr = this._el('div', 'mp-header');
hdr.appendChild(this._el('span', 'mp-title', 'Model Library'));
hdr.appendChild(this._el('span', 'mp-badge', String(this.state.models.length)));
panel.appendChild(hdr);
if (this.state.error) panel.appendChild(this._el('div', 'mp-error', this.state.error));
// Active model
if (this.state.activeModel) panel.appendChild(this._renderActive());
// List
const ls = this._el('div', 'mp-list-section');
ls.appendChild(this._el('div', 'mp-section-title', 'Available Models'));
const models = this.state.models.filter(
m => !(this.state.activeModel && this.state.activeModel.model_id === m.id)
);
if (models.length === 0 && !this.state.loading) {
ls.appendChild(this._el('div', 'mp-empty', 'No .rvf models found. Train a model or place .rvf files in data/models/'));
} else {
models.forEach(m => ls.appendChild(this._renderCard(m)));
}
panel.appendChild(ls);
// Footer
const ft = this._el('div', 'mp-footer');
const rb = this._btn('Refresh', 'mp-btn mp-btn-secondary', () => this.refresh());
rb.disabled = this.state.loading;
ft.appendChild(rb);
panel.appendChild(ft);
el.appendChild(panel);
}
_renderActive() {
const am = this.state.activeModel;
const card = this._el('div', 'mp-active-card');
card.appendChild(this._el('div', 'mp-active-name', am.model_id || 'Active Model'));
const full = this.state.models.find(m => m.id === am.model_id);
if (full) {
const meta = this._el('div', 'mp-active-meta');
if (full.version) meta.appendChild(this._tag('v' + full.version));
if (full.pck_score != null) meta.appendChild(this._tag('PCK ' + (full.pck_score * 100).toFixed(1) + '%'));
card.appendChild(meta);
}
if (am.avg_inference_ms != null) {
const st = this._el('div', 'mp-active-stats');
st.innerHTML = `<span class="mp-stat-label">Inference:</span> <span class="mp-stat-value">${am.avg_inference_ms.toFixed(1)} ms</span><span class="mp-stat-sep">|</span><span class="mp-stat-label">Frames:</span> <span class="mp-stat-value">${am.frames_processed ?? 0}</span>`;
card.appendChild(st);
}
if (this.state.loraProfiles.length > 0) {
const row = this._el('div', 'mp-lora-row');
row.appendChild(this._el('span', 'mp-lora-label', 'LoRA Profile:'));
const sel = document.createElement('select');
sel.className = 'mp-lora-select';
const def = document.createElement('option');
def.value = ''; def.textContent = '-- none --'; sel.appendChild(def);
this.state.loraProfiles.forEach(p => {
const o = document.createElement('option');
o.value = p; o.textContent = p; sel.appendChild(o);
});
sel.addEventListener('change', () => this._loraChange(am.model_id, sel.value));
row.appendChild(sel);
card.appendChild(row);
}
const ub = this._btn('Unload', 'mp-btn mp-btn-danger', () => this._unload());
ub.disabled = this.state.loading;
card.appendChild(ub);
return card;
}
_renderCard(model) {
const card = this._el('div', 'mp-model-card');
card.appendChild(this._el('div', 'mp-card-name', model.filename || model.id));
const meta = this._el('div', 'mp-card-meta');
if (model.version) meta.appendChild(this._tag('v' + model.version));
if (model.size_bytes != null) meta.appendChild(this._tag(this._fmtB(model.size_bytes)));
if (model.pck_score != null) meta.appendChild(this._tag('PCK ' + (model.pck_score * 100).toFixed(1) + '%'));
if (model.lora_profiles && model.lora_profiles.length > 0) meta.appendChild(this._tag(model.lora_profiles.length + ' LoRA'));
card.appendChild(meta);
const acts = this._el('div', 'mp-card-actions');
const lb = this._btn('Load', 'mp-btn mp-btn-success', () => this._load(model.id));
lb.disabled = this.state.loading;
const db = this._btn('Delete', 'mp-btn mp-btn-muted', () => this._delete(model.id));
db.disabled = this.state.loading;
acts.appendChild(lb); acts.appendChild(db);
card.appendChild(acts);
return card;
}
// --- Helpers ---
_el(tag, cls, txt) { const e = document.createElement(tag); if (cls) e.className = cls; if (txt != null) e.textContent = txt; return e; }
_btn(txt, cls, fn) { const b = document.createElement('button'); b.className = cls; b.textContent = txt; b.addEventListener('click', fn); return b; }
_tag(txt) { return this._el('span', 'mp-meta-tag', txt); }
_fmtB(b) { return b < 1024 ? b + ' B' : b < 1048576 ? (b / 1024).toFixed(1) + ' KB' : (b / 1048576).toFixed(1) + ' MB'; }
_injectStyles() {
if (document.getElementById('model-panel-styles')) return;
const s = document.createElement('style');
s.id = 'model-panel-styles';
s.textContent = MP_STYLES;
document.head.appendChild(s);
}
destroy() {
this.unsubs.forEach(fn => fn());
this.unsubs = [];
if (this.container) this.container.innerHTML = '';
}
dispose() {
this.destroy();
}
}
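ModelPanel assumes `modelService.on(event, handler)` returns an unsubscribe function, which it collects in `this.unsubs` and invokes in `destroy()`. A minimal emitter honoring that contract might look like the sketch below (illustrative only; the actual model.service.js implementation may differ):

```javascript
// Tiny event emitter whose on() returns an unsubscribe function,
// matching the contract ModelPanel relies on in destroy().
class Emitter {
  constructor() { this.handlers = new Map(); }
  on(event, fn) {
    if (!this.handlers.has(event)) this.handlers.set(event, new Set());
    this.handlers.get(event).add(fn);
    // Returning the remover lets callers store it and call it later.
    return () => this.handlers.get(event).delete(fn);
  }
  emit(event, data) {
    (this.handlers.get(event) || []).forEach(fn => fn(data));
  }
}
```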


@@ -45,7 +45,12 @@ export class PoseDetectionCanvas {
// Initialize settings panel
this.settingsPanel = null;
// Pose trail state
this.poseTrail = [];
this.showTrail = false;
this.maxTrailLength = 10;
// Initialize component
this.initializeComponent();
}
@@ -88,7 +93,7 @@ export class PoseDetectionCanvas {
<span class="status-text" id="status-text-${this.containerId}">Disconnected</span>
</div>
</div>
<div class="pose-canvas-controls" id="controls-${this.containerId}" ${!this.config.enableControls ? 'style="display:none"' : ''}>
<button class="btn btn-start" id="start-btn-${this.containerId}">&#9654; Start</button>
<button class="btn btn-stop" id="stop-btn-${this.containerId}" disabled>&#9632; Stop</button>
<button class="btn btn-reconnect" id="reconnect-btn-${this.containerId}" disabled>&#8635; Reconnect</button>
@@ -99,6 +104,7 @@ export class PoseDetectionCanvas {
<option value="heatmap">Heatmap</option>
<option value="dense">Dense</option>
</select>
<button class="btn btn-trail" id="trail-btn-${this.containerId}">&#9676; Trail</button>
<button class="btn btn-settings" id="settings-btn-${this.containerId}">&#9881; Settings</button>
</div>
</div>
@@ -285,6 +291,25 @@ export class PoseDetectionCanvas {
border-color: rgba(100, 116, 139, 0.5);
}
.btn-trail {
background: rgba(0, 212, 255, 0.1);
color: #5ec4d4;
border-color: rgba(0, 212, 255, 0.25);
}
.btn-trail:hover:not(:disabled) {
background: rgba(0, 212, 255, 0.2);
border-color: rgba(0, 212, 255, 0.45);
box-shadow: 0 4px 12px rgba(0, 212, 255, 0.15);
}
.btn-trail.active {
background: rgba(0, 212, 255, 0.2);
color: #00d4ff;
border-color: rgba(0, 212, 255, 0.5);
box-shadow: 0 0 8px rgba(0, 212, 255, 0.2);
}
.mode-select {
padding: 8px 12px;
border: 1px solid rgba(255, 255, 255, 0.1);
@@ -416,6 +441,10 @@ export class PoseDetectionCanvas {
const demoBtn = document.getElementById(`demo-btn-${this.containerId}`);
demoBtn.addEventListener('click', () => this.toggleDemo());
// Trail toggle button
const trailBtn = document.getElementById(`trail-btn-${this.containerId}`);
trailBtn.addEventListener('click', () => this.toggleTrail());
// Settings button
const settingsBtn = document.getElementById(`settings-btn-${this.containerId}`);
settingsBtn.addEventListener('click', () => this.showSettings());
@@ -445,6 +474,7 @@ export class PoseDetectionCanvas {
case 'pose_update':
this.state.lastPoseData = update.data;
this.state.frameCount++;
this.updateTrail(update.data);
this.renderPoseData(update.data);
this.updateStats();
this.notifyCallback('onPoseUpdate', update.data);
@@ -487,14 +517,40 @@ export class PoseDetectionCanvas {
return;
}
try {
// Render trail before the current frame if enabled
if (this.showTrail && this.poseTrail.length > 1) {
// The renderer.render() clears the canvas, so we render trail
// by hooking into the renderer's canvas context after clear.
// We override the render flow: clear, trail, then current.
this.renderer.clearCanvas();
this.renderTrail(this.renderer.ctx);
// Now render current frame without clearing again
this.renderCurrentFrameNoClean(poseData);
} else {
this.renderer.render(poseData, {
frameCount: this.state.frameCount,
connectionState: this.state.connectionState
});
}
} catch (error) {
this.logger.error('Render error', { error: error.message });
this.showError(`Render error: ${error.message}`);
}
}
renderCurrentFrameNoClean(poseData) {
// Call the renderer's render logic without clearing the canvas.
// We temporarily stub clearCanvas, render, then restore.
const origClear = this.renderer.clearCanvas.bind(this.renderer);
this.renderer.clearCanvas = () => {}; // no-op
try {
this.renderer.render(poseData, {
frameCount: this.state.frameCount,
connectionState: this.state.connectionState
});
} catch (error) {
this.logger.error('Render error', { error: error.message });
this.showError(`Render error: ${error.message}`);
} finally {
this.renderer.clearCanvas = origClear;
}
}
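The stub-and-restore trick in `renderCurrentFrameNoClean` generalizes to a small helper. A hypothetical `withMethodStubbed` (not part of the codebase) makes the restore-on-throw guarantee explicit:

```javascript
// Temporarily replace obj[name] with a no-op while body() runs,
// restoring the original method even if body() throws.
function withMethodStubbed(obj, name, body) {
  const orig = obj[name];
  obj[name] = () => {};
  try {
    return body();
  } finally {
    obj[name] = orig; // always restored, success or failure
  }
}
```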
@@ -650,6 +706,104 @@ export class PoseDetectionCanvas {
}
}
// --- Pose Trail Methods ---
toggleTrail() {
this.showTrail = !this.showTrail;
const trailBtn = document.getElementById(`trail-btn-${this.containerId}`);
if (trailBtn) {
trailBtn.classList.toggle('active', this.showTrail);
trailBtn.textContent = this.showTrail ? '\u25CB Trail On' : '\u25CB Trail';
}
if (!this.showTrail) {
this.poseTrail = [];
}
this.logger.info('Trail toggled', { showTrail: this.showTrail });
}
updateTrail(poseData) {
if (!this.showTrail) return;
if (!poseData || !poseData.persons || poseData.persons.length === 0) return;
// Deep clone the keypoints from all persons for this frame
const frameKeypoints = poseData.persons.map(person => {
if (!person.keypoints) return null;
return person.keypoints.map(kp => ({
x: kp.x,
y: kp.y,
confidence: kp.confidence
}));
}).filter(Boolean);
if (frameKeypoints.length > 0) {
this.poseTrail.push(frameKeypoints);
if (this.poseTrail.length > this.maxTrailLength) {
this.poseTrail.shift();
}
}
}
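The push/shift pair in `updateTrail` implements a fixed-capacity FIFO: once the trail holds `maxTrailLength` frames, appending a new one drops the oldest. The same logic as a standalone sketch (`pushBounded` is an illustrative name, not an existing helper):

```javascript
// Append item to buffer, evicting the oldest entry once
// the buffer exceeds maxLength (fixed-capacity FIFO).
function pushBounded(buffer, item, maxLength) {
  buffer.push(item);
  if (buffer.length > maxLength) buffer.shift();
  return buffer;
}
```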
renderTrail(ctx) {
if (!this.poseTrail || this.poseTrail.length < 2) return;
const totalFrames = this.poseTrail.length;
// Keypoint color palette (same as renderer's body part colors)
const kpColors = [
'#ff0000', '#ff4500', '#ffa500', '#ffff00', '#adff2f',
'#00ff00', '#00ff7f', '#00ffff', '#0080ff', '#0000ff',
'#4000ff', '#8000ff', '#ff00ff', '#ff0080', '#ff0040',
'#ff8080', '#ffb380'
];
// Render ghosted keypoints and trajectory lines for each frame in the trail
// (skip the last frame since it's the current one rendered by the normal pipeline)
for (let frameIdx = 0; frameIdx < totalFrames - 1; frameIdx++) {
const alpha = 0.1 + (frameIdx / totalFrames) * 0.7;
const framePersons = this.poseTrail[frameIdx];
const nextFramePersons = this.poseTrail[frameIdx + 1];
framePersons.forEach((personKeypoints, personIdx) => {
if (!personKeypoints) return;
personKeypoints.forEach((kp, kpIdx) => {
if (kp.confidence <= 0.1) return;
const x = this.renderer.scaleX(kp.x);
const y = this.renderer.scaleY(kp.y);
const color = kpColors[kpIdx % kpColors.length];
// Draw ghosted keypoint dot
ctx.globalAlpha = alpha * 0.6;
ctx.fillStyle = color;
ctx.beginPath();
ctx.arc(x, y, 2.5, 0, Math.PI * 2);
ctx.fill();
// Draw trajectory line to same keypoint in next frame
if (nextFramePersons && nextFramePersons[personIdx]) {
const nextKp = nextFramePersons[personIdx][kpIdx];
if (nextKp && nextKp.confidence > 0.1) {
const nx = this.renderer.scaleX(nextKp.x);
const ny = this.renderer.scaleY(nextKp.y);
ctx.globalAlpha = alpha * 0.4;
ctx.strokeStyle = color;
ctx.lineWidth = 1;
ctx.beginPath();
ctx.moveTo(x, y);
ctx.lineTo(nx, ny);
ctx.stroke();
}
}
});
});
}
// Reset alpha
ctx.globalAlpha = 1.0;
}
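renderTrail fades older frames with a linear opacity ramp before multiplying by the per-element factors (0.6 for dots, 0.4 for lines). The ramp itself, isolated as a pure function using the same constants as the loop above:

```javascript
// Linear opacity ramp for ghosted trail frames: frameIdx 0 (oldest) maps to
// exactly 0.1; later frames approach 0.8 as frameIdx nears totalFrames.
function trailAlpha(frameIdx, totalFrames) {
  return 0.1 + (frameIdx / totalFrames) * 0.7;
}
```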
// Toggle demo mode
toggleDemo() {
if (this.demoState && this.demoState.isRunning) {


@@ -216,9 +216,10 @@ export class SensingTab {
// Map the service's dataSource to banner text and CSS modifier class.
const dataSource = sensingService.dataSource;
const bannerConfig = {
live: { text: 'LIVE - ESP32', cls: 'sensing-source-live' },
reconnecting: { text: 'RECONNECTING...', cls: 'sensing-source-reconnecting' },
simulated: { text: 'SIMULATED DATA', cls: 'sensing-source-simulated' },
'live': { text: 'LIVE \u2014 ESP32 HARDWARE', cls: 'sensing-source-live' },
'server-simulated': { text: 'SIMULATED \u2014 NO HARDWARE', cls: 'sensing-source-server-sim' },
'reconnecting': { text: 'RECONNECTING...', cls: 'sensing-source-reconnecting' },
'simulated': { text: 'OFFLINE \u2014 CLIENT SIMULATION', cls: 'sensing-source-simulated' },
};
const cfg = bannerConfig[dataSource] || bannerConfig.reconnecting;
banner.textContent = cfg.text;
@@ -256,7 +257,8 @@ export class SensingTab {
// Details
this._setText('valDomFreq', (f.dominant_freq_hz || 0).toFixed(3) + ' Hz');
this._setText('valChangePoints', String(f.change_points || 0));
this._setText('valSampleRate', data.source === 'simulated' ? 'sim' : 'live');
const srcLabel = (data.source === 'simulated' || data.source === 'simulate') ? 'sim' : data.source || 'live';
this._setText('valSampleRate', srcLabel);
// Sparkline
this._drawSparkline();


@@ -55,7 +55,23 @@ export class SettingsPanel {
// Advanced settings
heartbeatInterval: 30000,
maxReconnectAttempts: 10,
enableSmoothing: true
enableSmoothing: true,
// Model settings
defaultModelPath: 'data/models/',
autoLoadModel: false,
inferenceDevice: 'CPU',
inferenceThreads: 4,
progressiveLoading: true,
// Training settings
defaultEpochs: 100,
defaultBatchSize: 32,
defaultLearningRate: 0.0003,
earlyStoppingPatience: 15,
checkpointDirectory: 'data/models/',
autoExportOnCompletion: true,
recordingDirectory: 'data/recordings/'
};
this.callbacks = {
@@ -245,6 +261,67 @@ export class SettingsPanel {
</div>
</div>
<!-- Model Settings -->
<div class="settings-section">
<h4>Model Configuration</h4>
<div class="setting-row">
<label for="default-model-path-${this.containerId}">Default Model Path:</label>
<input type="text" id="default-model-path-${this.containerId}" class="setting-input setting-input-wide" placeholder="data/models/">
</div>
<div class="setting-row">
<label for="auto-load-model-${this.containerId}">Auto-load Model on Startup:</label>
<input type="checkbox" id="auto-load-model-${this.containerId}" class="setting-checkbox">
</div>
<div class="setting-row">
<label for="inference-device-${this.containerId}">Inference Device:</label>
<select id="inference-device-${this.containerId}" class="setting-select">
<option value="CPU">CPU</option>
<option value="GPU">GPU</option>
</select>
</div>
<div class="setting-row">
<label for="inference-threads-${this.containerId}">Inference Threads:</label>
<input type="number" id="inference-threads-${this.containerId}" class="setting-input" min="1" max="16">
</div>
<div class="setting-row">
<label for="progressive-loading-${this.containerId}">Progressive Loading:</label>
<input type="checkbox" id="progressive-loading-${this.containerId}" class="setting-checkbox">
</div>
</div>
<!-- Training Settings -->
<div class="settings-section">
<h4>Training Configuration</h4>
<div class="setting-row">
<label for="default-epochs-${this.containerId}">Default Epochs:</label>
<input type="number" id="default-epochs-${this.containerId}" class="setting-input" min="1" max="10000">
</div>
<div class="setting-row">
<label for="default-batch-size-${this.containerId}">Default Batch Size:</label>
<input type="number" id="default-batch-size-${this.containerId}" class="setting-input" min="1" max="512">
</div>
<div class="setting-row">
<label for="default-learning-rate-${this.containerId}">Default Learning Rate:</label>
<input type="number" id="default-learning-rate-${this.containerId}" class="setting-input" min="0.000001" max="1" step="0.0001">
</div>
<div class="setting-row">
<label for="early-stopping-patience-${this.containerId}">Early Stopping Patience:</label>
<input type="number" id="early-stopping-patience-${this.containerId}" class="setting-input" min="1" max="100">
</div>
<div class="setting-row">
<label for="checkpoint-directory-${this.containerId}">Checkpoint Directory:</label>
<input type="text" id="checkpoint-directory-${this.containerId}" class="setting-input setting-input-wide" placeholder="data/models/">
</div>
<div class="setting-row">
<label for="auto-export-on-completion-${this.containerId}">Auto-export on Completion:</label>
<input type="checkbox" id="auto-export-on-completion-${this.containerId}" class="setting-checkbox">
</div>
<div class="setting-row">
<label for="recording-directory-${this.containerId}">Recording Directory:</label>
<input type="text" id="recording-directory-${this.containerId}" class="setting-input setting-input-wide" placeholder="data/recordings/">
</div>
</div>
<div class="settings-toggle">
<button class="btn btn-sm" id="toggle-advanced-${this.containerId}">Show Advanced</button>
</div>
@@ -267,11 +344,12 @@ export class SettingsPanel {
const style = document.createElement('style');
style.textContent = `
.settings-panel {
background: #fff;
border: 1px solid #ddd;
background: #0d1117;
border: 1px solid rgba(56, 68, 89, 0.6);
border-radius: 8px;
font-family: Arial, sans-serif;
font-family: -apple-system, BlinkMacSystemFont, 'Segoe UI', Roboto, sans-serif;
overflow: hidden;
color: #e0e0e0;
}
.settings-header {
@@ -279,13 +357,13 @@ export class SettingsPanel {
justify-content: space-between;
align-items: center;
padding: 15px 20px;
background: #f8f9fa;
border-bottom: 1px solid #ddd;
background: rgba(15, 20, 35, 0.95);
border-bottom: 1px solid rgba(56, 68, 89, 0.6);
}
.settings-header h3 {
margin: 0;
color: #333;
color: #e0e0e0;
font-size: 16px;
font-weight: 600;
}
@@ -297,26 +375,43 @@ export class SettingsPanel {
.settings-content {
padding: 20px;
max-height: 400px;
max-height: 500px;
overflow-y: auto;
}
.settings-content::-webkit-scrollbar {
width: 6px;
}
.settings-content::-webkit-scrollbar-track {
background: rgba(15, 20, 35, 0.5);
}
.settings-content::-webkit-scrollbar-thumb {
background: rgba(56, 68, 89, 0.8);
border-radius: 3px;
}
.settings-content::-webkit-scrollbar-thumb:hover {
background: rgba(80, 96, 120, 0.9);
}
.settings-section {
margin-bottom: 25px;
padding-bottom: 20px;
border-bottom: 1px solid #eee;
padding: 16px;
background: rgba(17, 24, 39, 0.9);
border: 1px solid rgba(56, 68, 89, 0.4);
border-radius: 8px;
}
.settings-section:last-child {
border-bottom: none;
margin-bottom: 0;
padding-bottom: 0;
}
.settings-section h4 {
margin: 0 0 15px 0;
color: #555;
font-size: 14px;
color: #8899aa;
font-size: 12px;
font-weight: 600;
text-transform: uppercase;
letter-spacing: 0.5px;
@@ -332,7 +427,7 @@ export class SettingsPanel {
.setting-row label {
flex: 1;
color: #666;
color: #8899aa;
font-size: 13px;
font-weight: 500;
}
@@ -340,9 +435,26 @@ export class SettingsPanel {
.setting-input, .setting-select {
flex: 0 0 120px;
padding: 6px 8px;
border: 1px solid #ddd;
border: 1px solid rgba(56, 68, 89, 0.6);
border-radius: 4px;
font-size: 13px;
background: rgba(15, 20, 35, 0.8);
color: #e0e0e0;
}
.setting-input:focus, .setting-select:focus {
outline: none;
border-color: #667eea;
box-shadow: 0 0 0 2px rgba(102, 126, 234, 0.15);
}
.setting-input-wide {
flex: 0 0 160px;
}
.setting-select option {
background: #1a2234;
color: #c8d0dc;
}
.setting-range {
@@ -353,41 +465,45 @@ export class SettingsPanel {
.setting-value {
flex: 0 0 40px;
font-size: 12px;
color: #666;
color: #b0b8c8;
text-align: center;
background: #f8f9fa;
background: rgba(15, 20, 35, 0.8);
padding: 2px 6px;
border-radius: 3px;
border: 1px solid #ddd;
border: 1px solid rgba(56, 68, 89, 0.6);
}
.setting-checkbox {
flex: 0 0 auto;
width: 18px;
height: 18px;
accent-color: #667eea;
}
.setting-color {
flex: 0 0 50px;
height: 30px;
border: 1px solid #ddd;
border: 1px solid rgba(56, 68, 89, 0.6);
border-radius: 4px;
cursor: pointer;
background: rgba(15, 20, 35, 0.8);
}
.btn {
padding: 6px 12px;
border: 1px solid #ddd;
border: 1px solid rgba(56, 68, 89, 0.6);
border-radius: 4px;
background: #fff;
background: rgba(30, 40, 60, 0.8);
color: #b0b8c8;
cursor: pointer;
font-size: 12px;
transition: all 0.2s;
}
.btn:hover {
background: #f8f9fa;
border-color: #adb5bd;
background: rgba(40, 55, 80, 0.9);
border-color: rgba(80, 96, 120, 0.8);
color: #e0e0e0;
}
.btn-sm {
@@ -398,32 +514,32 @@ export class SettingsPanel {
.settings-toggle {
text-align: center;
padding-top: 15px;
border-top: 1px solid #eee;
border-top: 1px solid rgba(56, 68, 89, 0.4);
}
.settings-footer {
padding: 10px 20px;
background: #f8f9fa;
border-top: 1px solid #ddd;
background: rgba(15, 20, 35, 0.95);
border-top: 1px solid rgba(56, 68, 89, 0.6);
text-align: center;
}
.settings-status {
font-size: 12px;
color: #666;
color: #6b7a8d;
}
.advanced-section {
background: #f9f9f9;
background: rgba(20, 28, 45, 0.9);
margin: 0 -20px 25px -20px;
padding: 20px;
border: none;
border-top: 1px solid #ddd;
border-bottom: 1px solid #ddd;
border-top: 1px solid rgba(56, 68, 89, 0.4);
border-bottom: 1px solid rgba(56, 68, 89, 0.4);
}
.advanced-section h4 {
color: #dc3545;
color: #ef4444;
}
`;
@@ -492,7 +608,9 @@ export class SettingsPanel {
const checkboxes = [
'auto-reconnect', 'show-keypoints', 'show-skeleton', 'show-bounding-box',
'show-confidence', 'show-zones', 'show-debug-info', 'enable-validation',
'enable-performance-tracking', 'enable-debug-logging', 'enable-smoothing'
'enable-performance-tracking', 'enable-debug-logging', 'enable-smoothing',
'auto-load-model', 'progressive-loading',
'auto-export-on-completion'
];
checkboxes.forEach(id => {
@@ -503,12 +621,14 @@ export class SettingsPanel {
});
});
// Number inputs
// Number inputs (integers)
const numberInputs = [
'connection-timeout', 'max-persons', 'max-fps',
'heartbeat-interval', 'max-reconnect-attempts'
'connection-timeout', 'max-persons', 'max-fps',
'heartbeat-interval', 'max-reconnect-attempts',
'inference-threads', 'default-epochs', 'default-batch-size',
'early-stopping-patience'
];
numberInputs.forEach(id => {
const input = document.getElementById(`${id}-${this.containerId}`);
input?.addEventListener('change', (e) => {
@@ -517,6 +637,32 @@ export class SettingsPanel {
});
});
// Float number inputs
const floatInputs = ['default-learning-rate'];
floatInputs.forEach(id => {
const input = document.getElementById(`${id}-${this.containerId}`);
input?.addEventListener('change', (e) => {
const settingKey = this.camelCase(id);
this.updateSetting(settingKey, parseFloat(e.target.value));
});
});
// Text inputs
const textInputs = ['default-model-path', 'checkpoint-directory', 'recording-directory'];
textInputs.forEach(id => {
const input = document.getElementById(`${id}-${this.containerId}`);
input?.addEventListener('change', (e) => {
const settingKey = this.camelCase(id);
this.updateSetting(settingKey, e.target.value);
});
});
// Inference device select
const inferenceDeviceSelect = document.getElementById(`inference-device-${this.containerId}`);
inferenceDeviceSelect?.addEventListener('change', (e) => {
this.updateSetting('inferenceDevice', e.target.value);
});
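The three listener groups above all derive the settings key from the element id via `this.camelCase(id)`, a helper not shown in this diff. A plausible minimal sketch of what it must do (kebab-case id to camelCase key) — an assumption, not the actual implementation:

```javascript
// Assumed shape of the camelCase helper referenced above: convert a
// kebab-case element id into the corresponding camelCase settings key.
function camelCase(id) {
  return id.replace(/-([a-z])/g, (_, c) => c.toUpperCase());
}
// e.g. camelCase('default-learning-rate') → 'defaultLearningRate'
```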
// Color inputs
const colorInputs = ['skeleton-color', 'keypoint-color', 'bounding-box-color'];
colorInputs.forEach(id => {
@@ -696,7 +842,19 @@ export class SettingsPanel {
enableDebugLogging: false,
heartbeatInterval: 30000,
maxReconnectAttempts: 10,
enableSmoothing: true
enableSmoothing: true,
defaultModelPath: 'data/models/',
autoLoadModel: false,
inferenceDevice: 'CPU',
inferenceThreads: 4,
progressiveLoading: true,
defaultEpochs: 100,
defaultBatchSize: 32,
defaultLearningRate: 0.0003,
earlyStoppingPatience: 15,
checkpointDirectory: 'data/models/',
autoExportOnCompletion: true,
recordingDirectory: 'data/recordings/'
};
}


@@ -0,0 +1,419 @@
// TrainingPanel Component for WiFi-DensePose UI
// Dark-mode panel for training management, CSI recordings, and progress charts.
import { trainingService } from '../services/training.service.js';
const TP_STYLES = `
.tp-panel{background:rgba(17,24,39,.9);border:1px solid rgba(56,68,89,.6);border-radius:8px;font-family:-apple-system,BlinkMacSystemFont,'Segoe UI',Roboto,sans-serif;color:#e0e0e0;overflow:hidden}
.tp-header{display:flex;align-items:center;justify-content:space-between;padding:14px 16px;background:rgba(13,17,23,.95);border-bottom:1px solid rgba(56,68,89,.6)}
.tp-title{font-size:14px;font-weight:600;color:#e0e0e0}
.tp-badge{font-size:11px;font-weight:600;padding:2px 8px;border-radius:10px}
.tp-badge-idle{background:rgba(108,117,125,.2);color:#8899aa;border:1px solid rgba(108,117,125,.3)}
.tp-badge-active{background:rgba(40,167,69,.2);color:#51cf66;border:1px solid rgba(40,167,69,.3);animation:tp-pulse 1.5s ease-in-out infinite}
.tp-badge-done{background:rgba(102,126,234,.2);color:#8ea4f0;border:1px solid rgba(102,126,234,.3)}
@keyframes tp-pulse{0%,100%{opacity:1}50%{opacity:.6}}
.tp-error{background:rgba(220,53,69,.15);color:#f5a0a8;border:1px solid rgba(220,53,69,.3);border-radius:4px;padding:8px 12px;margin:10px 12px 0;font-size:12px}
.tp-section{padding:12px;border-bottom:1px solid rgba(56,68,89,.3)}
.tp-section:last-child{border-bottom:none}
.tp-section-title{font-size:11px;font-weight:600;text-transform:uppercase;letter-spacing:.5px;color:#8899aa;margin-bottom:8px}
.tp-empty{color:#6b7a8d;font-size:12px;padding:12px 0;text-align:center}
.tp-rec-row{display:flex;align-items:center;justify-content:space-between;padding:6px 8px;margin-bottom:4px;background:rgba(13,17,23,.6);border:1px solid rgba(56,68,89,.3);border-radius:4px}
.tp-rec-info{display:flex;flex-direction:column;gap:2px}
.tp-rec-name{font-size:12px;color:#c8d0dc;font-weight:500}
.tp-rec-meta{font-size:10px;color:#6b7a8d}
.tp-rec-actions{margin-top:8px}
.tp-config-header{display:flex;align-items:center;justify-content:space-between;margin-bottom:6px}
.tp-config-form{display:flex;flex-direction:column;gap:6px}
.tp-label{font-size:12px;color:#8899aa;display:block;margin-bottom:2px}
.tp-input-row{display:flex;justify-content:space-between;align-items:center;gap:8px}
.tp-input-row .tp-label{flex:1;margin-bottom:0}
.tp-input{width:110px;padding:4px 8px;background:rgba(30,40,60,.8);border:1px solid rgba(56,68,89,.6);border-radius:4px;color:#c8d0dc;font-size:12px}
.tp-input:focus{outline:none;border-color:#667eea}
.tp-ds-container{display:flex;flex-direction:column;gap:4px;margin-bottom:4px;max-height:100px;overflow-y:auto}
.tp-ds-item{display:flex;align-items:center;gap:6px;font-size:12px;color:#c8d0dc;cursor:pointer}
.tp-ds-item input{width:14px;height:14px}
.tp-train-actions{display:flex;gap:6px;margin-top:10px}
.tp-progress-bar{height:6px;background:rgba(30,40,60,.8);border-radius:3px;overflow:hidden;margin-bottom:4px}
.tp-progress-fill{height:100%;background:linear-gradient(90deg,#667eea,#764ba2);border-radius:3px;transition:width .3s}
.tp-progress-label{font-size:11px;color:#8899aa;text-align:center;margin-bottom:10px}
.tp-chart-row{display:flex;gap:8px;margin-bottom:10px;flex-wrap:wrap}
.tp-chart-row canvas{border:1px solid rgba(56,68,89,.4);border-radius:4px;flex:1;min-width:120px}
.tp-metrics-grid{display:grid;grid-template-columns:1fr 1fr;gap:6px}
.tp-metric-cell{background:rgba(13,17,23,.6);border:1px solid rgba(56,68,89,.3);border-radius:4px;padding:6px 8px}
.tp-metric-label{font-size:10px;color:#6b7a8d;text-transform:uppercase;letter-spacing:.3px}
.tp-metric-value{font-size:13px;color:#c8d0dc;font-weight:500;margin-top:2px}
.tp-btn{padding:5px 12px;border-radius:4px;font-size:12px;font-weight:500;cursor:pointer;border:1px solid transparent;transition:all .15s}
.tp-btn:disabled{opacity:.5;cursor:not-allowed}
.tp-btn-success{background:rgba(40,167,69,.2);color:#51cf66;border-color:rgba(40,167,69,.3)}
.tp-btn-success:hover:not(:disabled){background:rgba(40,167,69,.35)}
.tp-btn-danger{background:rgba(220,53,69,.2);color:#ff6b6b;border-color:rgba(220,53,69,.3)}
.tp-btn-danger:hover:not(:disabled){background:rgba(220,53,69,.35)}
.tp-btn-secondary{background:rgba(30,40,60,.8);color:#b0b8c8;border-color:rgba(56,68,89,.6)}
.tp-btn-secondary:hover:not(:disabled){background:rgba(40,50,75,.9)}
.tp-btn-rec{background:rgba(220,53,69,.15);color:#ff6b6b;border-color:rgba(220,53,69,.3)}
.tp-btn-rec:hover:not(:disabled){background:rgba(220,53,69,.3)}
.tp-btn-muted{background:transparent;color:#6b7a8d;border-color:rgba(56,68,89,.4);font-size:11px;padding:3px 8px}
.tp-btn-muted:hover:not(:disabled){color:#b0b8c8;border-color:rgba(56,68,89,.8)}
`;
export default class TrainingPanel {
constructor(container) {
this.container = typeof container === 'string'
? document.getElementById(container) : container;
if (!this.container) throw new Error('TrainingPanel: container element not found');
this.state = {
recordings: [], trainingStatus: null, isRecording: false,
configOpen: true, loading: false, error: null
};
this.config = {
epochs: 100, batch_size: 32, learning_rate: 3e-4, patience: 15,
selectedRecordings: [], base_model: '', lora_profile_name: ''
};
this.progressData = { losses: [], pcks: [] };
this.unsubscribers = [];
this._injectStyles();
this.render();
this.refresh();
this._bindEvents();
}
_bindEvents() {
this.unsubscribers.push(
trainingService.on('progress', (d) => this._onProgress(d)),
trainingService.on('training-started', () => this.refresh()),
trainingService.on('training-stopped', () => {
trainingService.disconnectProgressStream();
this.refresh();
})
);
}
_onProgress(data) {
if (data.train_loss != null) this.progressData.losses.push(data.train_loss);
if (data.val_pck != null) this.progressData.pcks.push(data.val_pck);
this._set({ trainingStatus: { ...this.state.trainingStatus, ...data } });
}
// --- Data ---
async refresh() {
this._set({ loading: true, error: null });
try {
const [recordings, status] = await Promise.all([
trainingService.listRecordings().catch(() => []),
trainingService.getTrainingStatus().catch(() => null)
]);
// Keep progressData when training is inactive: _launchTraining resets it
// before each run, and clearing it here would hide the completed-run summary
// (_renderComplete checks progressData.losses.length after training stops).
this._set({ recordings, trainingStatus: status, loading: false });
} catch (e) { this._set({ loading: false, error: e.message }); }
}
// --- Actions ---
async _startRec() {
this._set({ loading: true, error: null });
try {
await trainingService.startRecording({ session_name: `rec_${Date.now()}`, label: 'pose' });
this._set({ isRecording: true, loading: false });
await this.refresh();
} catch (e) { this._set({ loading: false, error: `Recording failed: ${e.message}` }); }
}
async _stopRec() {
this._set({ loading: true, error: null });
try {
await trainingService.stopRecording();
this._set({ isRecording: false, loading: false });
await this.refresh();
} catch (e) { this._set({ loading: false, error: `Stop recording failed: ${e.message}` }); }
}
async _delRec(id) {
this._set({ loading: true, error: null });
try {
await trainingService.deleteRecording(id);
this.config.selectedRecordings = this.config.selectedRecordings.filter(r => r !== id);
await this.refresh();
} catch (e) { this._set({ loading: false, error: `Delete failed: ${e.message}` }); }
}
async _launchTraining(method, extraCfg = {}) {
this._set({ loading: true, error: null });
this.progressData = { losses: [], pcks: [] };
try {
trainingService.connectProgressStream();
const payload = {
dataset_ids: this.config.selectedRecordings,
config: {
epochs: this.config.epochs,
batch_size: this.config.batch_size,
learning_rate: this.config.learning_rate,
...extraCfg
}
};
await trainingService[method](payload);
await this.refresh();
} catch (e) { this._set({ loading: false, error: `Training failed: ${e.message}` }); }
}
async _stopTraining() {
this._set({ loading: true, error: null });
try { await trainingService.stopTraining(); await this.refresh(); }
catch (e) { this._set({ loading: false, error: `Stop failed: ${e.message}` }); }
}
_set(p) { Object.assign(this.state, p); this.render(); }
// --- Render ---
render() {
const el = this.container;
el.innerHTML = '';
const panel = this._el('div', 'tp-panel');
panel.appendChild(this._renderHeader());
if (this.state.error) panel.appendChild(this._el('div', 'tp-error', this.state.error));
panel.appendChild(this._renderRecordings());
const ts = this.state.trainingStatus;
const active = ts && ts.active;
if (active) panel.appendChild(this._renderProgress());
else if (ts && !ts.active && this.progressData.losses.length > 0) panel.appendChild(this._renderComplete());
else panel.appendChild(this._renderConfig());
el.appendChild(panel);
if (active) requestAnimationFrame(() => this._drawCharts());
}
_renderHeader() {
const h = this._el('div', 'tp-header');
h.appendChild(this._el('span', 'tp-title', 'Training'));
const ts = this.state.trainingStatus;
let cls = 'tp-badge tp-badge-idle', txt = 'Idle';
if (ts && ts.active) { cls = 'tp-badge tp-badge-active'; txt = 'Training'; }
else if (ts && !ts.active && this.progressData.losses.length > 0) { cls = 'tp-badge tp-badge-done'; txt = 'Completed'; }
h.appendChild(this._el('span', cls, txt));
return h;
}
_renderRecordings() {
const s = this._el('div', 'tp-section');
s.appendChild(this._el('div', 'tp-section-title', 'CSI Recordings'));
if (this.state.recordings.length === 0 && !this.state.loading) {
s.appendChild(this._el('div', 'tp-empty', 'Start recording CSI data to train a model'));
} else {
this.state.recordings.forEach(rec => {
const row = this._el('div', 'tp-rec-row');
const info = this._el('div', 'tp-rec-info');
info.appendChild(this._el('span', 'tp-rec-name', rec.name || rec.id));
const parts = [];
if (rec.frame_count != null) parts.push(rec.frame_count + ' frames');
if (rec.file_size_bytes != null) parts.push(this._fmtB(rec.file_size_bytes));
if (rec.started_at && rec.ended_at) parts.push(Math.round((new Date(rec.ended_at) - new Date(rec.started_at)) / 1000) + 's');
info.appendChild(this._el('span', 'tp-rec-meta', parts.join(' / ')));
row.appendChild(info);
const del = this._btn('Delete', 'tp-btn tp-btn-muted', () => this._delRec(rec.id));
del.disabled = this.state.loading;
row.appendChild(del);
s.appendChild(row);
});
}
const acts = this._el('div', 'tp-rec-actions');
if (this.state.isRecording) {
const b = this._btn('Stop Recording', 'tp-btn tp-btn-danger', () => this._stopRec());
b.disabled = this.state.loading; acts.appendChild(b);
} else {
const b = this._btn('Start Recording', 'tp-btn tp-btn-rec', () => this._startRec());
b.disabled = this.state.loading; acts.appendChild(b);
}
s.appendChild(acts);
return s;
}
_renderConfig() {
const s = this._el('div', 'tp-section');
const hdr = this._el('div', 'tp-config-header');
hdr.appendChild(this._el('span', 'tp-section-title', 'Training Configuration'));
hdr.appendChild(this._btn(this.state.configOpen ? 'Collapse' : 'Expand', 'tp-btn tp-btn-muted',
() => { this.state.configOpen = !this.state.configOpen; this.render(); }));
s.appendChild(hdr);
if (!this.state.configOpen) return s;
const form = this._el('div', 'tp-config-form');
if (this.state.recordings.length > 0) {
form.appendChild(this._el('label', 'tp-label', 'Datasets'));
const dc = this._el('div', 'tp-ds-container');
this.state.recordings.forEach(rec => {
const lb = this._el('label', 'tp-ds-item');
const cb = document.createElement('input');
cb.type = 'checkbox';
cb.checked = this.config.selectedRecordings.includes(rec.id);
cb.addEventListener('change', () => {
if (cb.checked) { if (!this.config.selectedRecordings.includes(rec.id)) this.config.selectedRecordings.push(rec.id); }
else { this.config.selectedRecordings = this.config.selectedRecordings.filter(r => r !== rec.id); }
});
lb.appendChild(cb);
lb.appendChild(this._el('span', null, rec.name || rec.id));
dc.appendChild(lb);
});
form.appendChild(dc);
}
const ir = (l, t, v, fn) => {
const r = this._el('div', 'tp-input-row');
r.appendChild(this._el('label', 'tp-label', l));
const inp = document.createElement('input');
inp.type = t; inp.className = 'tp-input'; inp.value = v;
inp.addEventListener('change', () => fn(inp.value));
r.appendChild(inp); return r;
};
form.appendChild(ir('Epochs', 'number', this.config.epochs, v => { this.config.epochs = parseInt(v) || 100; }));
form.appendChild(ir('Batch Size', 'number', this.config.batch_size, v => { this.config.batch_size = parseInt(v) || 32; }));
form.appendChild(ir('Learning Rate', 'text', this.config.learning_rate, v => { this.config.learning_rate = parseFloat(v) || 3e-4; }));
form.appendChild(ir('Early Stop Patience', 'number', this.config.patience, v => { this.config.patience = parseInt(v) || 15; }));
form.appendChild(ir('Base Model (opt.)', 'text', this.config.base_model, v => { this.config.base_model = v; }));
form.appendChild(ir('LoRA Profile (opt.)', 'text', this.config.lora_profile_name, v => { this.config.lora_profile_name = v; }));
s.appendChild(form);
const acts = this._el('div', 'tp-train-actions');
const btns = [
this._btn('Start Training', 'tp-btn tp-btn-success', () => this._launchTraining('startTraining', { patience: this.config.patience, base_model: this.config.base_model || undefined })),
this._btn('Pretrain', 'tp-btn tp-btn-secondary', () => this._launchTraining('startPretraining')),
this._btn('LoRA', 'tp-btn tp-btn-secondary', () => this._launchTraining('startLoraTraining', { base_model: this.config.base_model || undefined, profile_name: this.config.lora_profile_name || 'default' }))
];
btns.forEach(b => { b.disabled = this.state.loading; acts.appendChild(b); });
s.appendChild(acts);
return s;
}
_renderProgress() {
const ts = this.state.trainingStatus || {};
const s = this._el('div', 'tp-section');
s.appendChild(this._el('div', 'tp-section-title', 'Training Progress'));
const pct = ts.total_epochs ? Math.round((ts.epoch / ts.total_epochs) * 100) : 0;
const bar = this._el('div', 'tp-progress-bar');
const fill = this._el('div', 'tp-progress-fill');
fill.style.width = pct + '%';
bar.appendChild(fill); s.appendChild(bar);
s.appendChild(this._el('div', 'tp-progress-label', `Epoch ${ts.epoch ?? 0} / ${ts.total_epochs ?? '?'} (${pct}%)`));
const cr = this._el('div', 'tp-chart-row');
const lc = document.createElement('canvas'); lc.id = 'tp-loss-chart'; lc.width = 260; lc.height = 140;
const pc = document.createElement('canvas'); pc.id = 'tp-pck-chart'; pc.width = 260; pc.height = 140;
cr.appendChild(lc); cr.appendChild(pc); s.appendChild(cr);
const g = this._el('div', 'tp-metrics-grid');
const mc = (l, v) => { const c = this._el('div', 'tp-metric-cell'); c.appendChild(this._el('div', 'tp-metric-label', l)); c.appendChild(this._el('div', 'tp-metric-value', v)); return c; };
g.appendChild(mc('Loss', ts.train_loss != null ? ts.train_loss.toFixed(4) : '--'));
g.appendChild(mc('PCK', ts.val_pck != null ? (ts.val_pck * 100).toFixed(1) + '%' : '--'));
g.appendChild(mc('OKS', ts.val_oks != null ? ts.val_oks.toFixed(3) : '--'));
g.appendChild(mc('LR', ts.lr != null ? ts.lr.toExponential(1) : '--'));
g.appendChild(mc('Best PCK', ts.best_pck != null ? (ts.best_pck * 100).toFixed(1) + '% (e' + (ts.best_epoch ?? '?') + ')' : '--'));
g.appendChild(mc('Patience', ts.patience_remaining != null ? String(ts.patience_remaining) : '--'));
g.appendChild(mc('ETA', ts.eta_secs != null ? this._fmtEta(ts.eta_secs) : '--'));
g.appendChild(mc('Phase', ts.phase || '--'));
s.appendChild(g);
const stop = this._btn('Stop Training', 'tp-btn tp-btn-danger', () => this._stopTraining());
stop.disabled = this.state.loading; stop.style.marginTop = '10px'; s.appendChild(stop);
return s;
}
_renderComplete() {
const ts = this.state.trainingStatus || {};
const s = this._el('div', 'tp-section');
s.appendChild(this._el('div', 'tp-section-title', 'Training Complete'));
const g = this._el('div', 'tp-metrics-grid');
const mc = (l, v) => { const c = this._el('div', 'tp-metric-cell'); c.appendChild(this._el('div', 'tp-metric-label', l)); c.appendChild(this._el('div', 'tp-metric-value', v)); return c; };
const losses = this.progressData.losses;
g.appendChild(mc('Final Loss', losses.length > 0 ? losses[losses.length - 1].toFixed(4) : '--'));
g.appendChild(mc('Best PCK', ts.best_pck != null ? (ts.best_pck * 100).toFixed(1) + '%' : '--'));
g.appendChild(mc('Best Epoch', ts.best_epoch != null ? String(ts.best_epoch) : '--'));
g.appendChild(mc('Total Epochs', String(losses.length)));
s.appendChild(g);
const acts = this._el('div', 'tp-train-actions');
acts.appendChild(this._btn('New Training', 'tp-btn tp-btn-secondary', () => {
this.progressData = { losses: [], pcks: [] }; this._set({ trainingStatus: null });
}));
s.appendChild(acts);
return s;
}
// --- Chart drawing ---
_drawCharts() {
this._drawChart('tp-loss-chart', this.progressData.losses, { color: '#ff6b6b', label: 'Loss', yMin: 0, yMax: null });
this._drawChart('tp-pck-chart', this.progressData.pcks, { color: '#51cf66', label: 'PCK', yMin: 0, yMax: 1 });
}
_drawChart(id, data, opts) {
const cv = document.getElementById(id);
if (!cv) return;
const ctx = cv.getContext('2d'), w = cv.width, h = cv.height;
const p = { t: 20, r: 10, b: 24, l: 44 };
ctx.fillStyle = '#0d1117'; ctx.fillRect(0, 0, w, h);
ctx.fillStyle = '#8899aa'; ctx.font = '11px -apple-system,sans-serif'; ctx.fillText(opts.label, p.l, 14);
if (!data.length) { ctx.fillStyle = '#6b7a8d'; ctx.fillText('No data', w / 2 - 20, h / 2); return; }
const pw = w - p.l - p.r, ph = h - p.t - p.b;
let yMin = opts.yMin ?? Math.min(...data), yMax = opts.yMax ?? Math.max(...data);
if (yMax === yMin) yMax = yMin + 1;
ctx.strokeStyle = 'rgba(255,255,255,.08)'; ctx.lineWidth = 1;
for (let i = 0; i <= 4; i++) {
const y = p.t + (ph / 4) * i;
ctx.beginPath(); ctx.moveTo(p.l, y); ctx.lineTo(w - p.r, y); ctx.stroke();
const v = yMax - ((yMax - yMin) / 4) * i;
ctx.fillStyle = '#6b7a8d'; ctx.font = '9px sans-serif'; ctx.fillText(v.toFixed(v >= 1 ? 2 : 3), 2, y + 3);
}
const xl = Math.min(data.length, 5);
for (let i = 0; i < xl; i++) {
const idx = Math.round((data.length - 1) * (i / (xl - 1 || 1)));
ctx.fillStyle = '#6b7a8d'; ctx.fillText(String(idx + 1), p.l + (pw * idx) / (data.length - 1 || 1) - 4, h - 4);
}
ctx.strokeStyle = opts.color; ctx.lineWidth = 1.5; ctx.beginPath();
data.forEach((v, i) => {
const x = p.l + (pw * i) / (data.length - 1 || 1);
const y = p.t + ph - ((v - yMin) / (yMax - yMin)) * ph;
i === 0 ? ctx.moveTo(x, y) : ctx.lineTo(x, y);
});
ctx.stroke();
if (data.length > 0) {
const ly = p.t + ph - ((data[data.length - 1] - yMin) / (yMax - yMin)) * ph;
ctx.fillStyle = opts.color; ctx.beginPath(); ctx.arc(p.l + pw, ly, 3, 0, Math.PI * 2); ctx.fill();
}
}
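_drawChart maps each sample to canvas space with a linear transform: x spreads sample indices across the plot width, and y is inverted because canvas y grows downward. The same mapping, isolated as a pure function with the padding defaults used above (the function name is illustrative):

```javascript
// Data→pixel mapping as used in _drawChart: p is the padding object
// ({t, r, b, l}), w/h the canvas size, yMin/yMax the value range.
function toPixel(i, v, data, yMin, yMax, p, w, h) {
  const pw = w - p.l - p.r, ph = h - p.t - p.b;           // plot area
  const x = p.l + (pw * i) / (data.length - 1 || 1);      // index → x
  const y = p.t + ph - ((v - yMin) / (yMax - yMin)) * ph; // value → y (inverted)
  return { x, y };
}

const pad = { t: 20, r: 10, b: 24, l: 44 };               // defaults from above
const last = toPixel(1, 1, [0, 1], 0, 1, pad, 260, 140);
```
For a 260x140 canvas this gives a 206x96 plot area, so the last sample of a two-point series at the range maximum lands at the top-right corner of the plot.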
// --- Helpers ---
_el(tag, cls, txt) {
const e = document.createElement(tag);
if (cls) e.className = cls;
if (txt != null) e.textContent = txt;
return e;
}
_btn(txt, cls, fn) {
const b = document.createElement('button');
b.className = cls; b.textContent = txt;
b.addEventListener('click', fn); return b;
}
_fmtB(b) { return b < 1024 ? b + ' B' : b < 1048576 ? (b / 1024).toFixed(1) + ' KB' : (b / 1048576).toFixed(1) + ' MB'; }
_fmtEta(s) { return s < 60 ? Math.round(s) + 's' : s < 3600 ? Math.round(s / 60) + 'm' : (s / 3600).toFixed(1) + 'h'; }
_injectStyles() {
if (document.getElementById('training-panel-styles')) return;
const s = document.createElement('style');
s.id = 'training-panel-styles';
s.textContent = TP_STYLES;
document.head.appendChild(s);
}
destroy() {
this.unsubscribers.forEach(fn => fn());
this.unsubscribers = [];
trainingService.disconnectProgressStream();
if (this.container) this.container.innerHTML = '';
}
dispose() {
this.destroy();
}
}
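The value-to-pixel mapping inside `_drawChart` can be checked in isolation. The sketch below mirrors the panel's formula as a pure function (`chartY` and the `geom` names are illustrative, not part of the panel code):

```javascript
// Mirrors the y-mapping used by _drawChart: yMin lands on the bottom edge
// of the plot area, yMax on the top edge.
function chartY(v, { yMin, yMax, top, plotH }) {
  if (yMax === yMin) yMax = yMin + 1; // same degenerate-range guard as the panel
  return top + plotH - ((v - yMin) / (yMax - yMin)) * plotH;
}

const geom = { yMin: 0, yMax: 1, top: 20, plotH: 100 };
console.log(chartY(0, geom));   // 120 (bottom edge)
console.log(chartY(1, geom));   // 20  (top edge)
console.log(chartY(0.5, geom)); // 70  (midpoint)
```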


@@ -28,6 +28,7 @@
<button class="nav-tab" data-tab="performance">Performance</button>
<button class="nav-tab" data-tab="applications">Applications</button>
<button class="nav-tab" data-tab="sensing">Sensing</button>
<button class="nav-tab" data-tab="training">Training</button>
</nav>
<!-- Dashboard Tab -->
@@ -67,6 +68,11 @@
<span class="status-text">-</span>
<span class="status-message"></span>
</div>
<div class="component-status" data-component="datasource" id="dashboard-datasource">
<span class="component-name">Data Source</span>
<span class="status-text">-</span>
<span class="status-message"></span>
</div>
</div>
</div>
@@ -482,6 +488,18 @@
<!-- Sensing Tab -->
<section id="sensing" class="tab-content"></section>
<!-- Training Tab -->
<section id="training" class="tab-content">
<div class="tab-header">
<h2>Model Training</h2>
<p>Record CSI data, train pose estimation models, and manage .rvf files</p>
</div>
<div id="training-container" style="display: flex; gap: 20px; flex-wrap: wrap;">
<div id="training-panel-container" style="flex: 1; min-width: 400px;"></div>
<div id="model-panel-container" style="flex: 1; min-width: 350px; max-width: 450px;"></div>
</div>
</section>
</div>
<!-- Error Toast -->


@@ -0,0 +1,152 @@
// Model Service for WiFi-DensePose UI
// Manages model loading, listing, LoRA profiles, and lifecycle events.
import { apiService } from './api.service.js';
export class ModelService {
constructor() {
this.activeModel = null;
this.listeners = {};
this.logger = this.createLogger();
}
createLogger() {
return {
debug: (...args) => console.debug('[MODEL-DEBUG]', new Date().toISOString(), ...args),
info: (...args) => console.info('[MODEL-INFO]', new Date().toISOString(), ...args),
warn: (...args) => console.warn('[MODEL-WARN]', new Date().toISOString(), ...args),
error: (...args) => console.error('[MODEL-ERROR]', new Date().toISOString(), ...args)
};
}
// --- Event emitter helpers ---
on(event, callback) {
if (!this.listeners[event]) {
this.listeners[event] = [];
}
this.listeners[event].push(callback);
return () => this.off(event, callback);
}
off(event, callback) {
if (!this.listeners[event]) return;
this.listeners[event] = this.listeners[event].filter(cb => cb !== callback);
}
emit(event, data) {
if (!this.listeners[event]) return;
this.listeners[event].forEach(cb => {
try { cb(data); } catch (err) { this.logger.error('Listener error', { event, err }); }
});
}
// --- API methods ---
async listModels() {
try {
const data = await apiService.get('/api/v1/models');
this.logger.info('Listed models', { count: data?.models?.length ?? 0 });
return data;
} catch (error) {
this.logger.error('Failed to list models', { error: error.message });
throw error;
}
}
async getModel(id) {
try {
const data = await apiService.get(`/api/v1/models/${encodeURIComponent(id)}`);
return data;
} catch (error) {
this.logger.error('Failed to get model', { id, error: error.message });
throw error;
}
}
async loadModel(modelId) {
try {
this.logger.info('Loading model', { modelId });
const data = await apiService.post('/api/v1/models/load', { model_id: modelId });
this.activeModel = { model_id: modelId };
this.emit('model-loaded', { model_id: modelId });
return data;
} catch (error) {
this.logger.error('Failed to load model', { modelId, error: error.message });
throw error;
}
}
async unloadModel() {
try {
this.logger.info('Unloading model');
const data = await apiService.post('/api/v1/models/unload', {});
this.activeModel = null;
this.emit('model-unloaded', {});
return data;
} catch (error) {
this.logger.error('Failed to unload model', { error: error.message });
throw error;
}
}
async getActiveModel() {
try {
const data = await apiService.get('/api/v1/models/active');
this.activeModel = data || null;
return this.activeModel;
} catch (error) {
if (error.status === 404) {
this.activeModel = null;
return null;
}
this.logger.error('Failed to get active model', { error: error.message });
throw error;
}
}
async activateLoraProfile(modelId, profileName) {
try {
this.logger.info('Activating LoRA profile', { modelId, profileName });
const data = await apiService.post(
'/api/v1/models/lora/activate',
{ model_id: modelId, profile_name: profileName }
);
this.emit('lora-activated', { model_id: modelId, profile: profileName });
return data;
} catch (error) {
this.logger.error('Failed to activate LoRA', { modelId, profileName, error: error.message });
throw error;
}
}
async getLoraProfiles() {
try {
const data = await apiService.get('/api/v1/models/lora/profiles');
return data?.profiles ?? [];
} catch (error) {
this.logger.error('Failed to get LoRA profiles', { error: error.message });
throw error;
}
}
async deleteModel(id) {
try {
this.logger.info('Deleting model', { id });
const data = await apiService.delete(`/api/v1/models/${encodeURIComponent(id)}`);
return data;
} catch (error) {
this.logger.error('Failed to delete model', { id, error: error.message });
throw error;
}
}
dispose() {
this.listeners = {};
this.activeModel = null;
this.logger.info('ModelService disposed');
}
}
// Create singleton instance
export const modelService = new ModelService();


@@ -21,13 +21,17 @@ export class PoseService {
};
this.validationErrors = [];
this.logger = this.createLogger();
// Model inference mode tracking
this.modelActive = false;
// Configuration
this.config = {
enableValidation: true,
enablePerformanceTracking: true,
maxValidationErrors: 10,
confidenceThreshold: 0.3,
confidenceThresholdModelInference: 0.15,
maxPersons: 10,
timeoutMs: 5000
};
@@ -127,9 +131,14 @@ export class PoseService {
throw new Error(`Invalid stream options: ${validationResult.errors.join(', ')}`);
}
// Use a lower confidence threshold when model inference is active
const defaultThreshold = this.modelActive
? this.config.confidenceThresholdModelInference
: this.config.confidenceThreshold;
const params = {
zone_ids: options.zoneIds?.join(','),
min_confidence: options.minConfidence || this.config.confidenceThreshold,
min_confidence: options.minConfidence ?? defaultThreshold,
max_fps: options.maxFps || 30,
token: options.token || apiService.authToken
};
@@ -494,9 +503,18 @@ export class PoseService {
};
}
// Extract persons from zone data
const persons = zoneData.pose.persons || [];
console.log('👥 Extracted persons:', persons);
// Determine the pose source for this message
const poseSource = originalMessage.pose_source || zoneData.pose_source || null;
// Choose confidence threshold based on pose source
const threshold = (poseSource === 'model_inference' || this.modelActive)
? this.config.confidenceThresholdModelInference
: this.config.confidenceThreshold;
// Extract persons from zone data, applying source-aware filtering
const rawPersons = zoneData.pose.persons || [];
const persons = rawPersons.filter(p => p.confidence === undefined || p.confidence >= threshold);
console.log('Extracted persons:', persons.length, '/', rawPersons.length, '(threshold:', threshold, ')');
// Create zone summary
const zoneSummary = {};
@@ -511,7 +529,7 @@ export class PoseService {
persons: persons,
zone_summary: zoneSummary,
processing_time_ms: zoneData.metadata?.processing_time_ms || 0,
pose_source: originalMessage.pose_source || zoneData.pose_source || null,
pose_source: poseSource,
metadata: {
mock_data: false,
source: 'websocket',
@@ -653,6 +671,14 @@ export class PoseService {
this.logger.info('Configuration updated', { config: this.config });
}
// Enable or disable model inference mode.
// When active, confidence thresholds are lowered because model inference
// produces more reliable detections than raw signal-derived heuristics.
setModelMode(active) {
this.modelActive = !!active;
this.logger.info('Model mode updated', { modelActive: this.modelActive });
}
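Since `ModelService` emits `model-loaded` / `model-unloaded` events, the app shell can keep `setModelMode` in sync automatically. The sketch below shows that wiring with minimal stand-ins for the two singletons so it runs standalone (the `MiniEmitter` class and the wiring location are illustrative assumptions, not code from the PR):

```javascript
// Stand-ins mirroring the on()/emit() contract of ModelService and the
// setModelMode() flag on PoseService.
class MiniEmitter {
  constructor() { this.listeners = {}; }
  on(event, cb) {
    (this.listeners[event] ??= []).push(cb);
    return () => { this.listeners[event] = this.listeners[event].filter(f => f !== cb); };
  }
  emit(event, data) { (this.listeners[event] ?? []).forEach(cb => cb(data)); }
}

const modelService = new MiniEmitter();
const poseService = { modelActive: false, setModelMode(a) { this.modelActive = !!a; } };

// Wire model lifecycle to the pose threshold mode, as app startup code might.
modelService.on('model-loaded', () => poseService.setModelMode(true));
modelService.on('model-unloaded', () => poseService.setModelMode(false));

modelService.emit('model-loaded', { model_id: 'demo.rvf' });
console.log(poseService.modelActive);  // true
modelService.emit('model-unloaded', {});
console.log(poseService.modelActive); // false
```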
// Health check
async healthCheck() {
try {


@@ -32,8 +32,14 @@ class SensingService {
this._simTimer = null;
// Connection state: disconnected | connecting | connected | reconnecting | simulated
this._state = 'disconnected';
// Data-source label exposed to the UI: "live" | "reconnecting" | "simulated"
// Data-source label exposed to the UI:
// "live" — real ESP32 hardware connected
// "server-simulated" — server is running but using synthetic data (no hardware)
// "reconnecting" — WebSocket disconnected, retrying
// "simulated" — client-side fallback simulation (server unreachable)
this._dataSource = 'reconnecting';
// The raw source string from the server (e.g. "esp32", "simulated", "simulate")
this._serverSource = null;
this._lastMessage = null;
// Ring buffer of recent RSSI values for sparkline
@@ -113,7 +119,9 @@ class SensingService {
this._reconnectAttempt = 0;
this._stopSimulation();
this._setState('connected');
this._setDataSource('live');
// Don't assume "live" yet: fetch server status to learn the actual data
// source now; per-frame source fields keep it updated afterwards.
this._detectServerSource();
};
this._ws.onmessage = (evt) => {
@@ -256,11 +264,61 @@ class SensingService {
};
}
// ---- Server source detection -------------------------------------------
/**
* Fetch `/api/v1/status` to find out if the server is using real
* hardware or simulation. Called once on WebSocket open.
*/
async _detectServerSource() {
try {
const resp = await fetch('/api/v1/status');
if (resp.ok) {
const json = await resp.json();
this._applyServerSource(json.source);
} else {
// Can't reach status endpoint — assume live until first frame tells us
this._setDataSource('live');
}
} catch {
this._setDataSource('live');
}
}
/**
* Map a raw server source string to the UI data-source label.
*/
_applyServerSource(rawSource) {
this._serverSource = rawSource;
if (rawSource === 'esp32' || rawSource === 'wifi' || rawSource === 'live') {
this._setDataSource('live');
} else if (rawSource === 'simulated' || rawSource === 'simulate') {
this._setDataSource('server-simulated');
} else {
// Unknown source — show as server-simulated to be safe
this._setDataSource('server-simulated');
}
}
/** @return {string|null} Raw server source (e.g. "esp32", "simulated") */
get serverSource() {
return this._serverSource;
}
// ---- Data handling -----------------------------------------------------
_handleData(data) {
this._lastMessage = data;
// Track the server's source field from each frame so the UI
// can react if the server switches between esp32 ↔ simulated at runtime.
if (data.source && this._state === 'connected') {
const raw = data.source;
if (raw !== this._serverSource) {
this._applyServerSource(raw);
}
}
// Update RSSI history for sparkline
if (data.features && data.features.mean_rssi != null) {
this._rssiHistory.push(data.features.mean_rssi);
@@ -292,7 +350,7 @@ class SensingService {
/**
* Update the dataSource label and notify state listeners so the UI can
* react without needing a separate subscription.
* @param {'live'|'reconnecting'|'simulated'} source
* @param {'live'|'server-simulated'|'reconnecting'|'simulated'} source
*/
_setDataSource(source) {
if (source === this._dataSource) return;


@@ -0,0 +1,211 @@
// Training Service for WiFi-DensePose UI
// Manages training lifecycle, progress streaming, and CSI recordings.
import { buildWsUrl } from '../config/api.config.js';
import { apiService } from './api.service.js';
export class TrainingService {
constructor() {
this.progressSocket = null;
this.listeners = {};
this.logger = this.createLogger();
}
createLogger() {
return {
debug: (...args) => console.debug('[TRAIN-DEBUG]', new Date().toISOString(), ...args),
info: (...args) => console.info('[TRAIN-INFO]', new Date().toISOString(), ...args),
warn: (...args) => console.warn('[TRAIN-WARN]', new Date().toISOString(), ...args),
error: (...args) => console.error('[TRAIN-ERROR]', new Date().toISOString(), ...args)
};
}
// --- Event emitter helpers ---
on(event, callback) {
if (!this.listeners[event]) {
this.listeners[event] = [];
}
this.listeners[event].push(callback);
return () => this.off(event, callback);
}
off(event, callback) {
if (!this.listeners[event]) return;
this.listeners[event] = this.listeners[event].filter(cb => cb !== callback);
}
emit(event, data) {
if (!this.listeners[event]) return;
this.listeners[event].forEach(cb => {
try { cb(data); } catch (err) { this.logger.error('Listener error', { event, err }); }
});
}
// --- Training API methods ---
async startTraining(config) {
try {
this.logger.info('Starting training', { config });
const data = await apiService.post('/api/v1/train/start', config);
this.emit('training-started', data);
return data;
} catch (error) {
this.logger.error('Failed to start training', { error: error.message });
throw error;
}
}
async stopTraining() {
try {
this.logger.info('Stopping training');
const data = await apiService.post('/api/v1/train/stop', {});
this.emit('training-stopped', data);
return data;
} catch (error) {
this.logger.error('Failed to stop training', { error: error.message });
throw error;
}
}
async getTrainingStatus() {
try {
const data = await apiService.get('/api/v1/train/status');
return data;
} catch (error) {
this.logger.error('Failed to get training status', { error: error.message });
throw error;
}
}
async startPretraining(config) {
try {
this.logger.info('Starting pretraining', { config });
const data = await apiService.post('/api/v1/train/pretrain', config);
this.emit('training-started', data);
return data;
} catch (error) {
this.logger.error('Failed to start pretraining', { error: error.message });
throw error;
}
}
async startLoraTraining(config) {
try {
this.logger.info('Starting LoRA training', { config });
const data = await apiService.post('/api/v1/train/lora', config);
this.emit('training-started', data);
return data;
} catch (error) {
this.logger.error('Failed to start LoRA training', { error: error.message });
throw error;
}
}
// --- Recording API methods ---
async listRecordings() {
try {
const data = await apiService.get('/api/v1/recording/list');
return data?.recordings ?? [];
} catch (error) {
this.logger.error('Failed to list recordings', { error: error.message });
throw error;
}
}
async startRecording(config) {
try {
this.logger.info('Starting recording', { config });
const data = await apiService.post('/api/v1/recording/start', config);
this.emit('recording-started', data);
return data;
} catch (error) {
this.logger.error('Failed to start recording', { error: error.message });
throw error;
}
}
async stopRecording() {
try {
this.logger.info('Stopping recording');
const data = await apiService.post('/api/v1/recording/stop', {});
this.emit('recording-stopped', data);
return data;
} catch (error) {
this.logger.error('Failed to stop recording', { error: error.message });
throw error;
}
}
async deleteRecording(id) {
try {
this.logger.info('Deleting recording', { id });
const data = await apiService.delete(
`/api/v1/recording/${encodeURIComponent(id)}`
);
return data;
} catch (error) {
this.logger.error('Failed to delete recording', { id, error: error.message });
throw error;
}
}
// --- WebSocket progress stream ---
connectProgressStream() {
if (this.progressSocket) {
this.logger.warn('Progress stream already connected');
return this.progressSocket;
}
const url = buildWsUrl('/ws/train/progress');
this.logger.info('Connecting progress stream', { url });
const ws = new WebSocket(url);
ws.onopen = () => {
this.logger.info('Progress stream connected');
this.emit('progress-connected', {});
};
ws.onmessage = (event) => {
try {
const data = JSON.parse(event.data);
this.emit('progress', data);
} catch (err) {
this.logger.warn('Failed to parse progress message', { error: err.message });
}
};
ws.onerror = (error) => {
this.logger.error('Progress stream error', { error });
this.emit('progress-error', { error });
};
ws.onclose = () => {
this.logger.info('Progress stream disconnected');
this.progressSocket = null;
this.emit('progress-disconnected', {});
};
this.progressSocket = ws;
return ws;
}
disconnectProgressStream() {
if (this.progressSocket) {
this.progressSocket.close();
this.progressSocket = null;
}
}
dispose() {
this.disconnectProgressStream();
this.listeners = {};
this.logger.info('TrainingService disposed');
}
}
// Create singleton instance
export const trainingService = new TrainingService();
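A panel consumes training progress through the emitter contract above: `on('progress', cb)` returns an unsubscribe function, and each WebSocket frame arrives as one `progress` event. The sketch below stubs `emit()` in place of the `/ws/train/progress` socket so it runs without a server (the stub object and message fields like `epoch`/`loss`/`pck` are illustrative):

```javascript
// Minimal stand-in for TrainingService's on()/emit() contract.
const listeners = {};
const trainingService = {
  on(event, cb) {
    (listeners[event] ??= []).push(cb);
    return () => { listeners[event] = listeners[event].filter(f => f !== cb); };
  },
  emit(event, data) { (listeners[event] ?? []).forEach(cb => cb(data)); },
};

const losses = [];
const unsubscribe = trainingService.on('progress', (msg) => {
  if (typeof msg.loss === 'number') losses.push(msg.loss);
});

// Simulates two progress frames arriving, then tear-down.
trainingService.emit('progress', { epoch: 1, loss: 0.92, pck: 0.41 });
trainingService.emit('progress', { epoch: 2, loss: 0.74, pck: 0.55 });
unsubscribe();
trainingService.emit('progress', { epoch: 3, loss: 0.61 }); // ignored after unsubscribe

console.log(losses); // [0.92, 0.74]
```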


@@ -83,9 +83,24 @@ export class WebSocketService {
const ws = await this.createWebSocketWithTimeout(url);
connectionData.ws = ws;
// Set up event handlers
// Set up event handlers (replaces onopen/onmessage/etc.)
this.setupEventHandlers(url, ws, handlers);
// The WebSocket is already open at this point (createWebSocketWithTimeout
// resolved on the original onopen). setupEventHandlers replaced onopen, so
// the new handler never fires. Manually trigger the connected path now.
if (ws.readyState === WebSocket.OPEN) {
connectionData.status = 'connected';
connectionData.lastActivity = Date.now();
this.reconnectAttempts.set(url, 0);
this.notifyConnectionState(url, 'connected');
if (handlers.onOpen) {
try { handlers.onOpen(new Event('open')); } catch (e) {
this.logger.error('Error in onOpen handler', { url, error: e.message });
}
}
}
// Start heartbeat
this.startHeartbeat(url);
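The race being fixed is that the socket is already OPEN when `setupEventHandlers` assigns a new `onopen`, so the replacement handler can never fire. A minimal synchronous reproduction with a fake socket (the `FakeSocket` class is a stand-in for the browser `WebSocket`, whose `OPEN` constant is 1):

```javascript
class FakeSocket {
  static OPEN = 1;
  constructor() { this.readyState = 0; this.onopen = null; }
  open() { this.readyState = FakeSocket.OPEN; if (this.onopen) this.onopen(); }
}

const events = [];
const ws = new FakeSocket();
ws.onopen = () => events.push('open (resolver)'); // what createWebSocketWithTimeout awaits
ws.open();                                        // socket opens while awaiting that promise

// Later, handler setup replaces onopen; the socket is already open, so this never fires.
ws.onopen = () => events.push('open (handler)');

// The fix: check readyState and run the connected path manually, exactly once.
if (ws.readyState === FakeSocket.OPEN) events.push('open (manual)');

console.log(events); // ['open (resolver)', 'open (manual)']
```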


@@ -355,6 +355,21 @@ pre code {
background: var(--color-secondary-active);
}
.btn--accent {
background: rgba(139, 92, 246, 0.2);
color: #a78bfa;
border-color: rgba(139, 92, 246, 0.3);
}
.btn--accent:hover {
background: rgba(139, 92, 246, 0.3);
border-color: rgba(139, 92, 246, 0.5);
}
.btn--accent:active {
background: rgba(139, 92, 246, 0.15);
}
.btn--outline {
background: transparent;
border: 1px solid var(--color-border);
@@ -683,7 +698,9 @@ body {
/* Navigation tabs */
.nav-tabs {
display: flex;
overflow-x: auto;
justify-content: center;
flex-wrap: wrap;
gap: 2px;
border-bottom: 1px solid var(--color-border);
margin-bottom: var(--space-24);
scrollbar-width: none;
@@ -695,11 +712,11 @@ body {
}
.nav-tab {
padding: var(--space-12) var(--space-20);
padding: var(--space-12) var(--space-16);
background: none;
border: none;
color: var(--color-text-secondary);
font-size: var(--font-size-md);
font-size: var(--font-size-sm);
font-weight: var(--font-weight-medium);
cursor: pointer;
transition: all var(--duration-normal) var(--ease-standard);
@@ -1033,9 +1050,87 @@ body {
}
.demo-status {
display: flex;
align-items: center;
gap: 8px;
margin-left: auto;
}
/* Status indicator dot */
.status-indicator {
display: inline-block;
width: 10px;
height: 10px;
border-radius: 50%;
background: #555;
}
.status-indicator.active {
background: #00cc88;
box-shadow: 0 0 6px #00cc88;
}
.status-indicator.sim {
background: #ffa500;
box-shadow: 0 0 6px #ffa500;
animation: pulse 1.5s infinite;
}
.status-indicator.connecting {
background: #f0ad4e;
animation: pulse 1s infinite;
}
.status-indicator.error {
background: #ff3c3c;
}
/* Live Demo data-source banner */
.demo-source-banner {
display: block;
width: 100%;
padding: 10px 16px;
margin-bottom: 12px;
border-radius: 6px;
text-align: center;
font-size: 13px;
font-weight: 700;
letter-spacing: 0.06em;
text-transform: uppercase;
box-sizing: border-box;
}
.demo-source-live {
background: rgba(0, 204, 136, 0.15);
border: 1px solid #00cc88;
color: #00cc88;
}
.demo-source-sim {
background: rgba(255, 165, 0, 0.15);
border: 1px solid #ffa500;
color: #ffa500;
}
.demo-source-reconnecting {
background: rgba(255, 180, 0, 0.12);
border: 1px solid #f0ad4e;
color: #f0ad4e;
animation: pulse 1.5s infinite;
}
.demo-source-offline {
background: rgba(255, 60, 60, 0.12);
border: 1px solid #ff3c3c;
color: #ff3c3c;
}
.demo-source-unknown {
background: rgba(128, 128, 128, 0.12);
border: 1px solid #888;
color: #888;
}
.demo-grid {
display: grid;
grid-template-columns: 1fr 1fr;
@@ -1388,6 +1483,15 @@ canvas {
background: rgba(var(--color-warning-rgb), 0.05);
}
.component-status.status-warning {
border-color: #ffa500;
background: rgba(255, 165, 0, 0.08);
}
.component-status.status-warning .status-text {
color: #ffa500;
}
.component-status.status-unhealthy {
border-color: var(--color-error);
background: rgba(var(--color-error-rgb), 0.05);
@@ -1806,12 +1910,24 @@ canvas {
animation: pulse 1.5s infinite;
}
.sensing-source-server-sim {
background: rgba(255, 165, 0, 0.15);
border: 1px solid #ffa500;
color: #ffa500;
}
.sensing-source-simulated {
background: rgba(255, 60, 60, 0.12);
border: 1px solid var(--color-error);
color: var(--color-error);
}
/* Health indicator for server-simulated data */
.health-sim {
color: #ffa500;
font-weight: 600;
}
/* Big RSSI value */
.sensing-big-value {
font-size: var(--font-size-3xl);
@@ -1956,3 +2072,355 @@ canvas {
font-family: var(--font-family-mono);
font-weight: var(--font-weight-medium);
}
/* ===== Training Tab Styles ===== */
#training .tab-header {
margin-bottom: 20px;
}
#training .tab-header h2 {
color: var(--color-text);
margin: 0 0 8px 0;
}
#training .tab-header p {
color: var(--color-text-secondary);
margin: 0;
font-size: var(--font-size-sm);
}
/* Training Panel */
.training-panel {
background: var(--color-surface);
border: 1px solid var(--color-card-border);
border-radius: var(--radius-lg);
padding: var(--space-16);
}
.training-panel-header {
display: flex;
justify-content: space-between;
align-items: center;
margin-bottom: var(--space-16);
padding-bottom: var(--space-12);
border-bottom: 1px solid var(--color-card-border-inner);
}
.training-panel-header h3 {
color: var(--color-text);
margin: 0;
font-size: var(--font-size-base);
}
.training-status-badge {
padding: var(--space-2) 10px;
border-radius: var(--radius-full);
font-size: var(--font-size-xs);
font-weight: var(--font-weight-semibold);
text-transform: uppercase;
}
.training-status-idle {
background: var(--color-secondary);
color: var(--color-text-secondary);
border: 1px solid var(--color-border);
}
.training-status-active {
background: rgba(var(--color-error-rgb), 0.15);
color: var(--color-error);
border: 1px solid rgba(var(--color-error-rgb), var(--status-border-opacity));
animation: pulse-training 2s infinite;
}
.training-status-completed {
background: rgba(var(--color-success-rgb), 0.15);
color: var(--color-success);
border: 1px solid rgba(var(--color-success-rgb), var(--status-border-opacity));
}
@keyframes pulse-training {
0%, 100% { opacity: 1; }
50% { opacity: 0.6; }
}
/* Recording list */
.recording-item {
display: flex;
justify-content: space-between;
align-items: center;
padding: 10px var(--space-12);
background: var(--color-secondary);
border: 1px solid var(--color-card-border-inner);
border-radius: var(--radius-base);
margin-bottom: var(--space-8);
}
.recording-item-info {
flex: 1;
}
.recording-item-name {
color: var(--color-text);
font-size: var(--font-size-sm);
font-weight: var(--font-weight-medium);
}
.recording-item-meta {
color: var(--color-text-secondary);
font-size: var(--font-size-xs);
margin-top: var(--space-2);
}
/* Model cards */
.model-card {
padding: var(--space-12);
background: var(--color-secondary);
border: 1px solid var(--color-card-border-inner);
border-radius: var(--radius-base);
margin-bottom: var(--space-8);
transition: border-color 0.2s;
}
.model-card:hover {
border-color: var(--color-border);
}
.model-card-active {
border-left: 3px solid var(--color-success);
}
.model-card-name {
color: var(--color-text);
font-size: var(--font-size-sm);
font-weight: var(--font-weight-semibold);
}
.model-card-meta {
color: var(--color-text-secondary);
font-size: var(--font-size-xs);
margin-top: var(--space-4);
}
.model-card-stats {
display: flex;
gap: var(--space-12);
margin-top: var(--space-8);
}
.model-card-stat {
font-size: var(--font-size-xs);
}
.model-card-stat-label {
color: var(--color-text-secondary);
}
.model-card-stat-value {
color: var(--color-text);
font-weight: var(--font-weight-semibold);
}
/* Training chart */
.training-chart-container {
background: var(--color-secondary);
border: 1px solid var(--color-card-border-inner);
border-radius: var(--radius-base);
padding: var(--space-12);
margin: var(--space-12) 0;
}
.training-chart-label {
color: var(--color-text-secondary);
font-size: var(--font-size-xs);
text-transform: uppercase;
letter-spacing: 0.05em;
margin-bottom: var(--space-8);
}
/* Training config form */
.training-config-form {
display: grid;
grid-template-columns: 1fr 1fr;
gap: var(--space-12);
}
.training-form-group {
display: flex;
flex-direction: column;
gap: var(--space-4);
}
.training-form-label {
color: var(--color-text-secondary);
font-size: var(--font-size-xs);
text-transform: uppercase;
letter-spacing: 0.05em;
}
.training-form-input {
background: var(--color-background);
border: 1px solid var(--color-border);
border-radius: var(--radius-base);
color: var(--color-text);
padding: var(--space-8) 10px;
font-size: var(--font-size-sm);
font-family: inherit;
}
.training-form-input:focus {
outline: none;
border-color: var(--color-primary);
box-shadow: var(--focus-ring);
}
.training-form-select {
background: var(--color-background);
border: 1px solid var(--color-border);
border-radius: var(--radius-base);
color: var(--color-text);
padding: var(--space-8) 10px;
font-size: var(--font-size-sm);
}
/* Training buttons */
.training-btn {
padding: var(--space-8) var(--space-16);
border-radius: var(--radius-base);
border: 1px solid transparent;
font-size: var(--font-size-xs);
font-weight: var(--font-weight-semibold);
cursor: pointer;
transition: all 0.2s;
}
.training-btn-primary {
background: rgba(var(--color-success-rgb), 0.15);
color: var(--color-success);
border-color: rgba(var(--color-success-rgb), var(--status-border-opacity));
}
.training-btn-primary:hover {
background: rgba(var(--color-success-rgb), 0.25);
}
.training-btn-danger {
background: rgba(var(--color-error-rgb), 0.15);
color: var(--color-error);
border-color: rgba(var(--color-error-rgb), var(--status-border-opacity));
}
.training-btn-danger:hover {
background: rgba(var(--color-error-rgb), 0.25);
}
.training-btn-secondary {
background: rgba(var(--color-primary-rgb), 0.15);
color: var(--color-primary);
border-color: rgba(var(--color-primary-rgb), var(--status-border-opacity));
}
.training-btn-secondary:hover {
background: rgba(var(--color-primary-rgb), 0.25);
}
.training-btn-muted {
background: var(--color-secondary);
color: var(--color-text-secondary);
border-color: var(--color-border);
}
.training-btn-muted:hover {
background: var(--color-secondary-hover);
}
/* Progress bar */
.training-progress-bar {
width: 100%;
height: 6px;
background: var(--color-secondary);
border-radius: var(--radius-full);
overflow: hidden;
margin: var(--space-8) 0;
}
.training-progress-fill {
height: 100%;
background: linear-gradient(90deg, var(--color-primary), var(--color-success));
border-radius: var(--radius-full);
transition: width 0.3s ease;
}
/* Metrics grid */
.training-metrics-grid {
display: grid;
grid-template-columns: repeat(3, 1fr);
gap: var(--space-8);
margin: var(--space-12) 0;
}
.training-metric {
text-align: center;
padding: var(--space-8);
background: var(--color-secondary);
border-radius: var(--radius-base);
}
.training-metric-value {
color: var(--color-text);
font-size: var(--font-size-2xl);
font-weight: var(--font-weight-bold);
font-family: var(--font-family-mono);
}
.training-metric-label {
color: var(--color-text-secondary);
font-size: var(--font-size-xs);
text-transform: uppercase;
letter-spacing: 0.05em;
margin-top: var(--space-2);
}
/* Collapsible section */
.training-collapsible-header {
display: flex;
justify-content: space-between;
align-items: center;
padding: 10px 0;
cursor: pointer;
color: var(--color-text);
font-size: var(--font-size-sm);
font-weight: var(--font-weight-semibold);
border-bottom: 1px solid var(--color-card-border-inner);
}
.training-collapsible-header:hover {
color: var(--color-primary);
}
.training-collapsible-content {
padding: var(--space-12) 0;
}
/* Pose trail toggle in toolbar */
.pose-trail-btn {
padding: var(--space-6) 14px;
border-radius: var(--radius-base);
font-size: var(--font-size-xs);
font-weight: var(--font-weight-semibold);
cursor: pointer;
transition: all 0.2s;
background: rgba(var(--color-primary-rgb), 0.1);
color: var(--color-primary);
border: 1px solid rgba(var(--color-primary-rgb), 0.3);
}
.pose-trail-btn.active {
background: rgba(var(--color-primary-rgb), 0.25);
border-color: rgba(var(--color-primary-rgb), 0.6);
}
.pose-trail-btn:hover {
background: rgba(var(--color-primary-rgb), 0.2);
}