rUv f49c722764
chore(repo): rename rust-port/wifi-densepose-rs → v2/ (flatten to one level) (#427)
The Rust port lived two directories deep (rust-port/wifi-densepose-rs/)
without any sibling under rust-port/ that warranted the extra level.
Move the whole workspace up to v2/ to match v1/ (Python) at the same
depth and shorten every cd / build command across the repo.

git mv preserves history for all tracked files. 60 files updated for
path references (CI workflows, ADRs, docs, scripts, READMEs, internal
.claude-flow state). Two manual fixes for relative-cd paths in
CLAUDE.md and ADR-043 that became wrong after the depth change
(cd ../.. → cd ..).

Validated:
- cargo check --workspace --no-default-features → clean (after target/
  nuke; the gitignored target/ was carried by the OS rename and had
  hard-coded old paths in build scripts)
- cargo test --workspace --no-default-features → 1,539 passed, 0 failed,
  8 ignored (same totals as pre-rename)
- ESP32-S3 on COM7 → still streaming live CSI (cb #40300, RSSI -64 dBm)

After-merge follow-up: contributors should `rm -rf v2/target` once and
let cargo regenerate from the new path.
2026-04-25 21:28:13 -04:00

wifi-densepose-nn


Multi-backend neural network inference for WiFi-based DensePose estimation.

Overview

wifi-densepose-nn provides the inference engine that maps processed WiFi CSI features to DensePose body surface predictions. It supports three backends -- ONNX Runtime (default), PyTorch via tch-rs, and Candle -- so models can run on CPU, on CUDA GPUs, or through TensorRT, depending on the deployment target.

The crate implements two key neural components:

  • DensePose Head -- Predicts 24 body part segmentation masks and per-part UV coordinate regression.
  • Modality Translator -- Translates CSI feature embeddings into visual feature space, bridging the domain gap between WiFi signals and image-based pose estimation.
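A crude way to picture the Modality Translator is as a learned projection from the CSI embedding space into the visual feature space. Below is a minimal std-only sketch, with hypothetical dimensions and a single linear layer standing in for the real network (none of these names come from the crate):

```rust
// Illustrative only: a modality translator reduced to one linear
// projection. CSI_DIM and VISUAL_DIM are made-up toy dimensions.
const CSI_DIM: usize = 4;
const VISUAL_DIM: usize = 3;

/// Project a CSI feature embedding into the visual feature space.
fn translate(weights: &[[f32; CSI_DIM]; VISUAL_DIM], csi: &[f32; CSI_DIM]) -> [f32; VISUAL_DIM] {
    let mut out = [0.0; VISUAL_DIM];
    for (o, row) in out.iter_mut().zip(weights) {
        // Dot product of one weight row with the CSI embedding.
        *o = row.iter().zip(csi).map(|(w, x)| w * x).sum();
    }
    out
}

fn main() {
    // Identity-like weights that pass the first three channels through.
    let w = [
        [1.0, 0.0, 0.0, 0.0],
        [0.0, 1.0, 0.0, 0.0],
        [0.0, 0.0, 1.0, 0.0],
    ];
    let csi = [0.5, -1.0, 2.0, 7.0];
    assert_eq!(translate(&w, &csi), [0.5, -1.0, 2.0]);
}
```

The real translator is trained end-to-end; the point here is only the shape contract: CSI-dimensional input in, visual-dimensional output out.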

Features

  • ONNX Runtime backend (default) -- Load and run .onnx models with CPU or GPU execution providers.
  • PyTorch backend (tch-backend) -- Native PyTorch inference via libtorch FFI.
  • Candle backend (candle-backend) -- Pure-Rust inference with candle-core and candle-nn.
  • CUDA acceleration (cuda) -- GPU execution for supported backends.
  • TensorRT optimization (tensorrt) -- INT8/FP16 optimized inference via ONNX Runtime.
  • Batched inference -- Process multiple CSI frames in a single forward pass.
  • Model caching -- Memory-mapped model weights via memmap2.
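Batched inference amounts to packing several CSI frames into one leading batch dimension before a single forward pass. A std-only sketch of that packing step (the function name and flat-buffer layout are illustrative, not the crate's API):

```rust
// Hypothetical batching helper: pack N CSI frames (each F floats)
// into one contiguous [N, F] buffer for a single forward pass.
fn pack_batch(frames: &[Vec<f32>]) -> (Vec<f32>, [usize; 2]) {
    let n = frames.len();
    let f = frames.first().map_or(0, |fr| fr.len());
    let mut buf = Vec::with_capacity(n * f);
    for fr in frames {
        assert_eq!(fr.len(), f, "all frames must share one feature length");
        buf.extend_from_slice(fr);
    }
    (buf, [n, f])
}

fn main() {
    let frames = vec![vec![1.0, 2.0], vec![3.0, 4.0]];
    let (buf, shape) = pack_batch(&frames);
    assert_eq!(shape, [2, 2]);
    assert_eq!(buf, vec![1.0, 2.0, 3.0, 4.0]);
}
```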

Feature flags

Flag            Default  Description
onnx            yes      ONNX Runtime backend
tch-backend     no       PyTorch (tch-rs) backend
candle-backend  no       Candle pure-Rust backend
cuda            no       CUDA GPU acceleration
tensorrt        no       TensorRT via ONNX Runtime
all-backends    no       Enable onnx + tch + candle together
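The flags above map to Cargo features chosen at build time. A sketch of typical dependency lines (the version number is purely illustrative):

```toml
[dependencies]
# Default features: ONNX Runtime backend only.
wifi-densepose-nn = "0.1"  # version illustrative

# Pure-Rust Candle backend, with ONNX Runtime disabled:
# wifi-densepose-nn = { version = "0.1", default-features = false, features = ["candle-backend"] }

# All three backends plus CUDA acceleration:
# wifi-densepose-nn = { version = "0.1", features = ["all-backends", "cuda"] }
```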

Quick Start

use wifi_densepose_nn::{InferenceEngine, DensePoseConfig, OnnxBackend};

// Create inference engine with ONNX backend
let config = DensePoseConfig::default();
let backend = OnnxBackend::from_file("model.onnx")?;
let engine = InferenceEngine::new(backend, config)?;

// Run inference on a CSI feature tensor
let input = ndarray::Array4::<f32>::zeros((1, 256, 64, 64));
let output = engine.infer(&input)?;

println!("Body parts: {}", output.body_parts.shape()[1]); // 24
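Downstream code typically reduces the 24 part channels to a per-pixel label map with an argmax over the part axis. A std-only sketch with tiny made-up dimensions (the real output is a 4-D tensor, and this helper is not part of the crate):

```rust
// Hypothetical post-processing: per-pixel argmax over part channels.
// `scores` is laid out as [parts][pixels]; real tensors are [N, 24, H, W].
fn argmax_parts(scores: &[Vec<f32>]) -> Vec<usize> {
    let pixels = scores[0].len();
    (0..pixels)
        .map(|p| {
            // Index of the highest-scoring part channel at pixel p.
            (0..scores.len())
                .max_by(|&a, &b| scores[a][p].partial_cmp(&scores[b][p]).unwrap())
                .unwrap()
        })
        .collect()
}

fn main() {
    // 3 toy "parts", 2 pixels.
    let scores = vec![
        vec![0.1, 0.9],
        vec![0.7, 0.2],
        vec![0.2, 0.1],
    ];
    assert_eq!(argmax_parts(&scores), vec![1, 0]);
}
```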

Architecture

wifi-densepose-nn/src/
  lib.rs          -- Re-exports, constants (NUM_BODY_PARTS=24), prelude
  densepose.rs    -- DensePoseHead, DensePoseConfig, DensePoseOutput
  inference.rs    -- Backend trait, InferenceEngine, InferenceOptions
  onnx.rs         -- OnnxBackend, OnnxSession (feature-gated)
  tensor.rs       -- Tensor, TensorShape utilities
  translator.rs   -- ModalityTranslator (CSI -> visual space)
  error.rs        -- NnError, NnResult
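The split between inference.rs and the per-backend modules suggests a trait-object design around the Backend trait the listing names. A minimal std-only sketch of that shape (every name here other than Backend, InferenceEngine, and NnError is illustrative, and the method signatures are assumptions):

```rust
// Hypothetical sketch of the backend abstraction in inference.rs.
#[derive(Debug)]
struct NnError(String);

trait Backend {
    /// Run one forward pass on a flat f32 buffer with the given shape.
    fn forward(&self, input: &[f32], shape: &[usize]) -> Result<Vec<f32>, NnError>;
}

/// Stand-in backend that echoes its input; useful for wiring tests.
struct NullBackend;

impl Backend for NullBackend {
    fn forward(&self, input: &[f32], shape: &[usize]) -> Result<Vec<f32>, NnError> {
        if shape.iter().product::<usize>() != input.len() {
            return Err(NnError("shape/buffer mismatch".into()));
        }
        Ok(input.to_vec())
    }
}

/// Engine holds a boxed backend so ONNX, tch, and Candle stay interchangeable.
struct InferenceEngine {
    backend: Box<dyn Backend>,
}

impl InferenceEngine {
    fn infer(&self, input: &[f32], shape: &[usize]) -> Result<Vec<f32>, NnError> {
        self.backend.forward(input, shape)
    }
}

fn main() {
    let engine = InferenceEngine { backend: Box::new(NullBackend) };
    let out = engine.infer(&[1.0, 2.0], &[1, 2]).unwrap();
    assert_eq!(out, vec![1.0, 2.0]);
}
```

Boxing the backend keeps the engine oblivious to which feature flags were compiled in; only construction sites need the cfg gates.
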

Related crates

Crate                  Role
wifi-densepose-core    Foundation types and the NeuralInference trait
wifi-densepose-signal  Produces the CSI features consumed by inference
wifi-densepose-train   Trains the models this crate loads
ort                    ONNX Runtime Rust bindings
tch                    PyTorch Rust bindings
candle-core            Hugging Face pure-Rust ML framework

License

MIT OR Apache-2.0